What is Cache Memory? Levels of Cache Memory
Cache memory is a small, fast type of storage inside or very close to a computer's central processing unit (CPU), where data used by the CPU is kept temporarily until it is needed again. Because the CPU can reach it much faster than main memory, it speeds up the performance of the system. In general, more cache memory means the CPU finds the data it needs more often without waiting on slower RAM.
The size of the cache memory varies depending on the type of processor. Most processors have at least 64 KB (kilobytes) of cache memory, although some have much larger caches. Larger caches allow programs to execute more rapidly since they reduce the number of times that information must be fetched from slower main memory.
A computer can have several different levels of cache memory. The level numbers refer to the distance from the CPU, with Level 1 being the closest. All levels of cache memory are faster than RAM. The cache closest to the CPU is always the fastest, but it generally costs more and stores less data than the levels further away.
Levels of cache memory
Different levels of cache memory are as follows:
Level 1 (L1) Cache
It is also called primary or internal cache and is built directly into the processor chip. It has a small capacity, typically from 8 KB to 128 KB.
Level 2 (L2) Cache
It is slower than the L1 cache but has a larger storage capacity, typically from 64 KB to 16 MB. Current processors include an advanced transfer cache on the processor chip, which is a type of L2 cache. The common size of this cache ranges from 512 KB to 8 MB.
Level 3 (L3) Cache
This cache is traditionally separate from the processor chip, located on the motherboard, and is found on computers whose processors use an L2 advanced transfer cache. It is slower than the L1 and L2 caches. Personal computers often have up to 8 MB of L3 cache.
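On Linux systems, the level, type, and size of each of these caches are typically exposed through sysfs. The short Python sketch below reads them, assuming the usual /sys/devices/system/cpu/cpu0/cache/index* layout is present; paths and file names may differ on other operating systems.

```python
# Minimal sketch: list cache levels and sizes on a Linux machine.
# Assumes the sysfs layout /sys/devices/system/cpu/cpu0/cache/index*/,
# which most modern Linux kernels expose; not portable to other OSes.
from pathlib import Path

def list_caches(cpu: int = 0) -> None:
    base = Path(f"/sys/devices/system/cpu/cpu{cpu}/cache")
    for index in sorted(base.glob("index*")):
        level = (index / "level").read_text().strip()
        ctype = (index / "type").read_text().strip()   # Data, Instruction, or Unified
        size = (index / "size").read_text().strip()    # e.g. "32K", "1024K"
        print(f"L{level} {ctype} cache: {size}")

if __name__ == "__main__":
    list_caches()
```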
Cache Memory Mapping
There are different ways in which cache memory can be mapped to main memory.
Direct Mapped Cache
In direct mapping, each block of main memory maps to exactly one line of the cache, determined directly from the memory address. On a lookup, only that single line needs to be checked: if its tag matches the address, the data is already present in the cache and no further searching is required; if it does not match, the block must be fetched from slower main memory and placed in that one line, replacing whatever block was stored there.
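As a rough illustration (not a model of any particular processor), the Python sketch below shows a toy direct-mapped lookup in which the block address alone selects the single cache line to check; the line count, block size, and tag/index split are assumptions chosen for clarity.

```python
# Toy direct-mapped cache lookup (illustrative sketch only).
# Each memory block maps to exactly one line: line = block_address % NUM_LINES.

NUM_LINES = 8
BLOCK_SIZE = 64  # bytes per cache line (assumed)

# Each line holds (valid, tag, data); start with an empty cache.
cache = [{"valid": False, "tag": None, "data": None} for _ in range(NUM_LINES)]

def lookup(address: int):
    block = address // BLOCK_SIZE   # which memory block the address falls in
    index = block % NUM_LINES       # the single line this block can occupy
    tag = block // NUM_LINES        # identifies which block is stored in that line
    line = cache[index]
    if line["valid"] and line["tag"] == tag:
        return "hit", index
    # Miss: the block is fetched from RAM into this one line,
    # evicting whatever block currently occupies it.
    line.update(valid=True, tag=tag, data=f"block {block}")
    return "miss", index

print(lookup(0x1234))   # first access: miss
print(lookup(0x1234))   # same block again: hit
```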
Fully Associative Cache Mapping
Fully associative mapping allows a memory block to reside anywhere in the cache; there is no fixed pairing between memory blocks and cache locations. Because there is no restriction on where a block may be placed, all locations in the cache are used equally, but a lookup must compare the address against every line.
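The sketch below shows the same toy lookup for a fully associative cache, where a block may occupy any line and the whole cache is searched on each access; the least-recently-used (LRU) replacement policy used here is just one common choice and is an assumption, not something specified above.

```python
# Toy fully associative cache (illustrative sketch only).
# A block may occupy any line, so a lookup compares the tag against every line.
from collections import OrderedDict

NUM_LINES = 8
BLOCK_SIZE = 64  # bytes per cache line (assumed)

# OrderedDict keyed by block number; insertion order doubles as a simple LRU policy.
cache = OrderedDict()

def lookup(address: int) -> str:
    block = address // BLOCK_SIZE
    if block in cache:               # conceptually: compare against all tags
        cache.move_to_end(block)     # mark as most recently used
        return "hit"
    if len(cache) >= NUM_LINES:
        cache.popitem(last=False)    # evict the least recently used block
    cache[block] = f"block {block}"
    return "miss"

print(lookup(0x0000))  # miss
print(lookup(0x0040))  # different block: miss
print(lookup(0x0000))  # still resident: hit
```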
Set Associative Cache Mapping
Set associative mapping acts as a halfway house between direct and fully associative mapping, in that every block is mapped to a small subset of locations within the cache.
Rather than having only a single line that a block can map to (as in direct mapping), lines are grouped together into sets. Memory blocks are mapped to specific sets and can then be placed in any line within that set.
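A minimal sketch of a 2-way set-associative lookup follows; the number of sets, the number of ways, the block size, and the LRU replacement within each set are illustrative assumptions, not parameters of any real cache.

```python
# Toy 2-way set-associative cache (illustrative sketch only).
# A block maps to one set, then may occupy any line inside that set.

NUM_SETS = 4
WAYS = 2          # lines per set
BLOCK_SIZE = 64   # bytes per cache line (assumed)

# Each set is a small list of resident block numbers, ordered by recency.
sets = [[] for _ in range(NUM_SETS)]

def lookup(address: int):
    block = address // BLOCK_SIZE
    set_index = block % NUM_SETS     # which set the block belongs to
    lines = sets[set_index]
    if block in lines:               # only the lines in this set are searched
        lines.remove(block)
        lines.append(block)          # keep the most recently used block at the end
        return "hit", set_index
    if len(lines) >= WAYS:
        lines.pop(0)                 # evict the least recently used line in the set
    lines.append(block)
    return "miss", set_index

print(lookup(0x0000))   # miss
print(lookup(0x0100))   # maps to the same set in this toy example: miss
print(lookup(0x0000))   # still in the set: hit
```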