Modelling Cache Pollution

Bibliographic Details
Published in: International Journal of Modelling & Simulation 1998-01, Vol.18 (2), p.132-138
Main Authors: Casmira, J.P., Kaeli, D.R.
Format: Article
Language:English
Summary: Cache memories are commonly used to reduce the cost of accessing data and instructions in memory. Misses in the cache can severely reduce system performance. It is therefore beneficial to try to anticipate cache misses in an attempt to reduce their frequency. Prefetching can be used to prime a cache with the addresses that will most probably be needed in the future. One side-effect of prefetching is that cache lines that would have been used in the future may be supplanted by prefetched data and instructions that may never be used. This is known as cache pollution. We model this effect using trace-driven cache simulation, and propose a new mechanism called the Prefetch Buffer Filter (PBF) that, combined with tagged prefetching, significantly reduces cache pollution while also reducing the number of memory references issued for prefetching. Through the use of trace-driven simulation we demonstrate that using a 16-entry PBF for both the instruction and the data stream can consistently reduce data cache miss rates while effectively reducing memory traffic.
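The combination described in the summary can be illustrated with a small trace-driven simulator. The sketch below is not the paper's implementation: the fully associative LRU cache, next-line tagged prefetching, and the FIFO buffer that holds prefetched lines until a demand hit promotes them into the cache are all simplifying assumptions chosen to show the idea of filtering prefetches away from the demand cache.

```python
from collections import OrderedDict

def simulate(trace, cache_lines=64, pbf_entries=0, line_bytes=32):
    """Trace-driven simulation of tagged next-line prefetching.

    With pbf_entries == 0, prefetched lines are inserted straight into
    the cache (and may pollute it by evicting useful lines). With a
    nonzero pbf_entries, prefetched lines are held in a small FIFO
    buffer and promoted into the cache only on a demand reference.
    Returns (demand_misses, memory_references).
    """
    cache = OrderedDict()   # line -> tag (True = prefetched, not yet referenced)
    pbf = OrderedDict()     # prefetch buffer, FIFO order
    misses = mem_refs = 0

    def cache_fill(line, tagged):
        cache[line] = tagged
        cache.move_to_end(line)
        while len(cache) > cache_lines:
            cache.popitem(last=False)  # evict LRU line

    def prefetch(line):
        nonlocal mem_refs
        if line in cache or line in pbf:
            return                      # already present: no memory reference
        mem_refs += 1
        if pbf_entries:
            pbf[line] = None            # hold in the buffer, not the cache
            while len(pbf) > pbf_entries:
                pbf.popitem(last=False) # FIFO eviction from the buffer
        else:
            cache_fill(line, tagged=True)

    for addr in trace:
        line = addr // line_bytes
        if line in cache:
            first_use = cache[line]     # tagged prefetch: first demand hit
            cache[line] = False         # to a prefetched line triggers the
            cache.move_to_end(line)     # next prefetch
            if first_use:
                prefetch(line + 1)
        elif line in pbf:
            del pbf[line]               # buffered prefetch pays off:
            cache_fill(line, tagged=False)  # promote into the cache
            prefetch(line + 1)
        else:
            misses += 1                 # demand miss
            mem_refs += 1
            cache_fill(line, tagged=False)
            prefetch(line + 1)
    return misses, mem_refs
```

On a purely sequential trace both variants miss only on the first line, since each demand reference prefetches its successor; the difference between the two configurations appears when irregular references let unused prefetches evict live cache lines.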
ISSN: 0228-6203; 1925-7082
DOI: 10.1080/02286203.1998.11760369