CACHE MEMORIES
What is Cache Memory?
• Cache memory is a small, high-speed RAM buffer located between
the CPU and main memory.
• Cache memory holds a copy of the instructions (instruction cache)
or data (operand or data cache) currently being used by the CPU.
• The main purpose of a cache is to speed up memory access (by bridging the
speed gap between the fast CPU and slower main memory) while keeping the
overall cost of the computer low.
Types of Cache Mapping
1. Direct Mapping
2. Associative Mapping
3. Set Associative Mapping
DIRECT MAPPING
• Block j of main memory maps onto block (j modulo 128) of the cache
(assuming a cache of 128 blocks).
• Thus, when memory blocks 0, 128, and 256 are loaded into the cache, each is
stored in cache block 0. Similarly, memory blocks 1, 129, and 257 are stored
in cache block 1. Contention may arise
1) when the cache is full, or
2) when more than one memory block is mapped onto a given cache-block
position.
• Contention is resolved by allowing the new block to overwrite the currently
resident block.
• The memory address alone determines the placement of a block in the cache.
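The address decoding above can be sketched in Python. This is a minimal illustration, assuming the configuration the slides imply: a 64K-word main memory of 4096 blocks (16 words per block) and a 128-block cache, so a 16-bit address splits into 5 tag bits, 7 cache-block bits, and 4 word bits. The function name and constants are illustrative, not a specific hardware design.

```python
WORDS_PER_BLOCK = 16   # 4 word bits
CACHE_BLOCKS = 128     # 7 block bits

def direct_map(address):
    """Split a 16-bit word address into (tag, cache block, word offset)."""
    word = address % WORDS_PER_BLOCK
    memory_block = address // WORDS_PER_BLOCK
    cache_block = memory_block % CACHE_BLOCKS   # block j mod 128
    tag = memory_block // CACHE_BLOCKS          # remaining upper 5 bits
    return tag, cache_block, word

# Memory blocks 0, 128, and 256 all land in cache block 0,
# distinguished only by their tag bits.
for block in (0, 128, 256):
    tag, cache_block, word = direct_map(block * WORDS_PER_BLOCK)
    print(f"memory block {block} -> cache block {cache_block}, tag {tag}")
```

Note how the tag is what lets the cache tell apart the different memory blocks that share one cache-block position.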
Associative mapping
• The memory-block can be placed into any cache-block position.
• 12 tag bits identify a memory block while it is resident in the cache.
• The tag bits of an address received from the processor are compared with the
tag bits of each block of the cache.
• This comparison is done to see if the desired block is present.
• It gives complete freedom in choosing the cache location.
• If the cache is full, a new block being brought into the cache must replace
an existing block.
• The cache-control hardware must determine whether a given block is currently
in the cache.
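The fully associative lookup described above can be sketched as follows, again assuming the 16-word blocks from the slides' example, so the upper 12 bits of a 16-bit address form the tag. Real hardware compares all tags in parallel; the sequential loop here is only an illustration of the logic.

```python
WORDS_PER_BLOCK = 16

def associative_lookup(cache_tags, address):
    """Compare the address's tag against every resident tag.

    cache_tags: one tag per cache block (an illustrative software stand-in
    for the cache's tag store). Returns the slot index on a hit, None on a miss.
    """
    tag = address // WORDS_PER_BLOCK   # upper 12 bits of a 16-bit address
    for slot, resident_tag in enumerate(cache_tags):
        if resident_tag == tag:
            return slot                # hit: desired block found in this slot
    return None                        # miss: block not in the cache

tags = [3, 700, 41]                    # hypothetical resident tags
print(associative_lookup(tags, 700 * WORDS_PER_BLOCK))   # hit
print(associative_lookup(tags, 9 * WORDS_PER_BLOCK))     # miss
```

Because any block may sit in any slot, every slot's tag must be checked, which is why associative caches need a comparator per block.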
Set Associative Mapping
• It is the combination of direct and associative mapping.
• The blocks of the cache are grouped into sets.
• The mapping allows a block of the main-memory to reside in any block of the
specified set.
• With 2 blocks per set, the 128-block cache has 64 sets, so memory blocks
0, 64, 128, …, 4032 all map into cache set 0.
• A memory block can occupy either of the two block positions within its set.
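The set-association rule above can be sketched like the earlier examples, assuming the same implied configuration: 128 cache blocks grouped into 64 two-way sets, so memory block j maps to set (j mod 64) and the remaining upper 6 bits form the tag. The names are illustrative.

```python
WORDS_PER_BLOCK = 16
NUM_SETS = 64          # 128 blocks / 2 blocks per set

def set_map(address):
    """Split a 16-bit word address into (tag, set index)."""
    memory_block = address // WORDS_PER_BLOCK
    set_index = memory_block % NUM_SETS     # block j mod 64
    tag = memory_block // NUM_SETS          # remaining upper 6 bits
    return tag, set_index

# Memory blocks 0, 64, 128, ..., 4032 all map to set 0 and may occupy
# either of that set's two block positions.
for block in (0, 64, 128, 4032):
    tag, set_index = set_map(block * WORDS_PER_BLOCK)
    print(f"memory block {block} -> set {set_index}, tag {tag}")
```

On a lookup, only the two tags within the selected set are compared, which is far cheaper than the full associative search while avoiding much of direct mapping's contention.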
Replacement algorithms
• Replacement algorithms are used when there is no free space in the cache in
which to place a new block.
Four of the most common cache replacement algorithms are described below:
• Least Recently Used (LRU): The LRU algorithm selects for replacement the item
that has been least recently used by the CPU.
• First-In-First-Out (FIFO): The FIFO algorithm selects for replacement the item
that has been in the cache for the longest time.
• Least Frequently Used (LFU): The LFU algorithm selects for replacement the
item that has been least frequently used by the CPU.
• Random: The random algorithm selects for replacement the item randomly.
• In the direct mapping method,
the position of each block is pre-determined, so no replacement strategy is
needed.
• In the associative & set-associative methods,
the block position is not pre-determined. If the cache is full and new blocks
are brought into the cache, the cache controller must decide which of the old
blocks to replace.
• Under the LRU rule, when a block is to be overwritten, the block that has gone
the longest time without being referenced is the one overwritten.
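The LRU and FIFO policies above can be contrasted with a tiny simulation of a fully associative cache holding 3 blocks. The reference string and block numbers are made up for illustration; the point is only the eviction logic.

```python
from collections import OrderedDict, deque

def simulate_lru(refs, capacity):
    """Count misses under LRU: evict the least recently used block."""
    cache, misses = OrderedDict(), 0
    for block in refs:
        if block in cache:
            cache.move_to_end(block)       # mark as most recently used
        else:
            misses += 1
            if len(cache) == capacity:
                cache.popitem(last=False)  # evict least recently used
            cache[block] = True
    return misses

def simulate_fifo(refs, capacity):
    """Count misses under FIFO: evict the block resident the longest."""
    cache, order, misses = set(), deque(), 0
    for block in refs:
        if block not in cache:
            misses += 1
            if len(cache) == capacity:
                cache.discard(order.popleft())  # evict oldest arrival
            cache.add(block)
            order.append(block)
    return misses

refs = [1, 2, 3, 1, 4, 1, 5]   # hypothetical block reference string
print("LRU misses:", simulate_lru(refs, 3))
print("FIFO misses:", simulate_fifo(refs, 3))
```

On this reference string LRU misses less often than FIFO, because the re-referenced block 1 is kept "fresh" under LRU but is evicted by FIFO as the oldest arrival.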