A Case for a Value-Aware Cache

Bibliographic Details
Published in: IEEE Computer Architecture Letters, 2014-01, Vol. 13 (1), p. 1-4
Main Authors: Arelakis, Angelos; Stenstrom, Per
Format: Article
Language:English
Description
Summary: Replication of values causes poor utilization of on-chip cache memory resources. This paper addresses the question: How much cache resources can be theoretically and practically saved if value replication is eliminated? We introduce the concept of value-aware caches and show that a sixteen times smaller value-aware cache can yield the same miss rate as a conventional cache. We then make a case for a value-aware cache design using Huffman-based compression. Since the value set is rather stable across the execution of an application, one can afford to reconstruct the coding tree in software. The decompression latency is kept short by our proposed novel pipelined Huffman decoder that uses canonical codewords. While the (loose) upper-bound compression factor is 5.2×, we show that, by eliminating cache-block alignment restrictions, it is possible to achieve a compression factor of 3.4× for practical designs.
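
As a rough illustration of the canonical-codeword idea behind the paper's pipelined Huffman decoder: in a canonical code, codewords of the same length are consecutive integers, so a decoder can test one candidate length per step with a single compare instead of walking a code tree. The C sketch below is illustrative only; the symbol set, code lengths, and table contents are invented here and are not the paper's value encoding or hardware design.

/* Minimal software sketch of canonical-Huffman decoding (illustrative only). */
#include <stdint.h>
#include <stdio.h>

#define MAX_LEN 4                       /* illustrative maximum codeword length */

/* Per-length tables for an invented 6-symbol canonical code:
 *   A=00, B=01, C=100, D=101, E=1100, F=1101
 * first_code[l]  = numerically smallest codeword of length l
 * count[l]       = number of codewords of length l
 * first_index[l] = position in symbols[] of that first codeword's symbol    */
static const uint32_t first_code[MAX_LEN + 1]  = {0, 0, 0, 4, 12};
static const uint32_t count[MAX_LEN + 1]       = {0, 0, 2, 2, 2};
static const int      first_index[MAX_LEN + 1] = {0, 0, 0, 2, 4};
static const char     symbols[]                = "ABCDEF";

/* Decode one symbol, reading bits MSB-first from a flat bit array.
 * Canonical codes need only one compare per candidate length, which is
 * what makes a short, fixed-depth (pipelinable) decode stage possible.      */
static char decode_one(const uint8_t *bits, int *pos)
{
    uint32_t code = 0;
    for (int len = 1; len <= MAX_LEN; len++) {
        code = (code << 1) | bits[(*pos)++];        /* append next bit      */
        if (code >= first_code[len] &&
            code - first_code[len] < count[len])    /* valid at this length */
            return symbols[first_index[len] + (code - first_code[len])];
    }
    return '?';                                     /* invalid codeword     */
}

int main(void)
{
    /* Bitstream for the symbols B, C, E: 01 100 1100 */
    const uint8_t bits[] = {0,1, 1,0,0, 1,1,0,0};
    int pos = 0;
    while (pos < (int)(sizeof bits / sizeof bits[0]))
        putchar(decode_one(bits, &pos));
    putchar('\n');                                  /* prints BCE */
    return 0;
}
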
ISSN: 1556-6056
1556-6064
DOI: 10.1109/L-CA.2012.31