Base-Delta-Immediate Compression: Practical Data Compression For On-Chip Caches

PACT 2012

Cited by 476
Abstract
Cache compression is a promising technique to increase on-chip cache capacity and to decrease on-chip and off-chip bandwidth usage. Unfortunately, directly applying well-known compression algorithms (usually implemented in software) leads to high hardware complexity and unacceptable decompression/compression latencies, which in turn can negatively affect performance. Hence, there is a need for a simple yet efficient compression technique that can effectively compress common in-cache data patterns, and has minimal effect on cache access latency.

In this paper, we introduce a new compression algorithm called Base-Delta-Immediate (BΔI) compression, a practical technique for compressing data in on-chip caches. The key idea is that, for many cache lines, the values within the cache line have a low dynamic range, i.e., the differences between values stored within the cache line are small. As a result, a cache line can be represented using a base value and an array of differences whose combined size is much smaller than the original cache line (we call this the base+delta encoding). Moreover, many cache lines intersperse such base+delta values with small values; our BΔI technique efficiently incorporates such immediate values into its encoding.

Compared to prior cache compression approaches, our studies show that BΔI strikes a sweet spot in the tradeoff between compression ratio, decompression/compression latencies, and hardware complexity. Our results show that BΔI compression improves performance for both single-core (8.1% improvement) and multi-core workloads (9.5%/11.2% improvement for two/four cores). For many applications, BΔI provides the performance benefit of doubling the cache size of the baseline system, effectively increasing average cache capacity by 1.53X.
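The core base+delta idea described above can be illustrated with a short software sketch. This is not the paper's hardware design: the line size, word size, delta width, and function names below are illustrative assumptions, and the full BΔI scheme additionally tries several base/delta width combinations and an implicit zero base for immediate values, which this sketch omits.

```c
/* Minimal sketch of base+delta encoding (illustrative, not the BDI hardware):
 * try to represent a 32-byte line of 8-byte words as one 8-byte base plus
 * one signed 1-byte delta per word. */
#include <stdint.h>
#include <stdbool.h>
#include <stdio.h>

#define WORDS_PER_LINE 4  /* 32-byte line of 8-byte words (assumed sizes) */

/* Returns true if every word's difference from the base fits in one byte. */
bool compress_base8_delta1(const uint64_t line[WORDS_PER_LINE],
                           uint64_t *base, int8_t deltas[WORDS_PER_LINE])
{
    *base = line[0];                       /* first word serves as the base */
    for (int i = 0; i < WORDS_PER_LINE; i++) {
        int64_t d = (int64_t)(line[i] - *base);
        if (d < INT8_MIN || d > INT8_MAX)  /* delta too large for 1 byte */
            return false;
        deltas[i] = (int8_t)d;
    }
    return true;                           /* 32 bytes -> 8 + 4 = 12 bytes */
}

void decompress_base8_delta1(uint64_t base, const int8_t deltas[WORDS_PER_LINE],
                             uint64_t line[WORDS_PER_LINE])
{
    for (int i = 0; i < WORDS_PER_LINE; i++)
        line[i] = base + (uint64_t)(int64_t)deltas[i];  /* simple parallel adds */
}

int main(void)
{
    /* Example of low dynamic range: pointers into the same memory region. */
    uint64_t line[WORDS_PER_LINE] = {0xC04000, 0xC04010, 0xC04020, 0xC04008};
    uint64_t base, out[WORDS_PER_LINE];
    int8_t deltas[WORDS_PER_LINE];

    if (compress_base8_delta1(line, &base, deltas)) {
        decompress_base8_delta1(base, deltas, out);
        printf("compressible: base=%#llx, round-trip ok=%d\n",
               (unsigned long long)base, out[3] == line[3]);
    }
    return 0;
}
```

Decompression here is only a vector of additions to the stored base, which is why the paper can argue for low decompression latency relative to general-purpose software compressors.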
Keywords
Cache compression, Caching, Memory