Author Hallnor, Erik George
Title Design and applications of an indirection-based cache structure
ISBN 0496981536
Description 95 p
附註 Source: Dissertation Abstracts International, Volume: 66-02, Section: B, page: 1073
Chair: Steven K. Reinhardt
Thesis (Ph.D.)--University of Michigan, 2005
The gap between CPU and main memory speeds has long been a performance bottleneck. Gains in processor performance continue to outpace improvements in memory latency and bandwidth, widening the CPU/memory gap. The gap has grown so large that it has been proposed to suspend execution of a program when an off-chip access occurs, much as is done when a disk access occurs.
This memory gap has largely been addressed through the use of caches: smaller, faster memories that reduce accesses to main memory by keeping data closer to the processor. The early generations of caches were a few kilobytes in size and were off-chip. Later generations moved the caches on-chip and increased their size. With current technology trends we are seeing multiple levels of on-chip caches, with the last level being many megabytes in size.
These increasing memory latencies motivate a more aggressive approach to reducing the number of off-chip accesses. The central thesis of this dissertation is that increasing the flexibility of available on-chip storage can lead to significant improvements in memory-system efficiency and performance. This work proposes and evaluates an indirection-based cache structure that removes the limitations imposed by traditional caches on where data is placed and how it is accessed. This new freedom enables a number of new techniques; this dissertation focuses on three.
First, managing cache contents more efficiently reduces unnecessary off-chip accesses by up to 84% on system-level memory traces while maintaining performance equivalent to traditional caches for less memory-intensive programs. Second, allocating different amounts of storage for different data blocks based on their information content (compressibility) increases the effective capacity of the cache, improving performance by an average of 20% with a peak increase of 292%. Finally, using a single physical copy for data that appears in multiple memory locations increases cache capacity by 33% and reduces the cost of copy operations, increasing network bandwidth by up to 5%.
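The core idea behind the techniques above can be illustrated with a minimal Python sketch of indirection: the tag store maps a block address to a pointer into a separate, decoupled data store, so a block's tag no longer dictates where its data physically lives. This is an illustrative model only, not the dissertation's hardware design; the class and method names (`IndirectCache`, `share`, etc.) are hypothetical.

```python
# Minimal sketch of an indirection-based cache (illustrative, not the
# dissertation's actual design): tags hold pointers into a decoupled
# data store, so data placement is fully flexible.

class IndirectCache:
    def __init__(self, num_data_blocks):
        self.tags = {}                        # address -> index into data store
        self.data = [None] * num_data_blocks  # decoupled data store
        self.free = list(range(num_data_blocks))

    def insert(self, addr, block):
        if addr in self.tags:                 # address already cached: update in place
            self.data[self.tags[addr]] = block
            return True
        if not self.free:                     # no eviction policy in this sketch
            return False
        idx = self.free.pop()                 # data may land in any free frame
        self.tags[addr] = idx
        self.data[idx] = block
        return True

    def lookup(self, addr):
        idx = self.tags.get(addr)             # one extra indirection per access
        return None if idx is None else self.data[idx]

    def share(self, new_addr, existing_addr):
        # Deduplication: two addresses point at one physical copy,
        # consuming no additional data-store space.
        if existing_addr not in self.tags:
            return False
        self.tags[new_addr] = self.tags[existing_addr]
        return True
```

The `share` method mirrors the third technique in the abstract: because lookup goes through a pointer, multiple addresses can alias a single physical block, and the same decoupling is what permits variable-size (compressed) allocation in the real design.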
School code: 0127
DDC
Host Item Dissertation Abstracts International 66-02B
Subject Engineering, Electronics and Electrical
Computer Science
0544
0984
Alt Author University of Michigan