What is Cache Memory? Definition of Cache in Computing


Cache, or cache memory, is a supplementary memory system for the CPU. It temporarily stores frequently used data so the processor can access it more quickly and efficiently. Essentially, it acts as a small, dedicated RAM for the processor, holding only the most recently used data. That is also why it does not need to be large: the smaller size allows data to be located more quickly.

In this article, we’ll explain in detail what cache memory is and how it works.


What’s Cache Memory

Cache is a very fast, small store placed between the processor and the main memory on the computer’s motherboard. It temporarily holds data and instructions so the processor can access them faster and more efficiently. In general, today’s processors are very fast compared to the main memory of the PC. By acting as an interface between the faster processor and the slower main memory, the cache can supply information to the processor at a very high rate.

Cache helps the CPU by reducing the time required to transfer information between storage and the processor. It increases the efficiency and performance of the CPU and saves both time and electrical energy. Data and instructions that the processor uses frequently are stored in the cache for later use. Thus, cache memory shortens execution time and makes your PC faster.

CPU-Memory Gap: Why Cache Memory Was Invented

The speed difference between processor and memory has always been a concern for computer engineers. Processors depend on memory for their work, but because memory is slower than the processor, the processor cannot deliver its full speed: memory takes a long time to move information to and from the CPU.


In 1980, the average access time of a processor was around 130 ns, while memory was about 124 ns. By 1990 it was roughly 40 ns for the processor and 80 ns for memory. By 2000, processor access time had dropped below 1 ns, while the average memory access time was still 70–75 ns. That difference is huge.

What Role Does Cache Memory Play?

Cache memory affects the performance of your PC differently depending on the CPU. Because it is so fast, more of it is generally better: your computer does not have to wait for data and instructions. When a cache miss occurs, the CPU must look for the information in the slower RAM or storage, and while this happens, clock cycles are wasted doing nothing because the CPU has no data to work with.
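The hit/miss behavior described above can be sketched in a few lines of Python. This is a toy model, not a real CPU cache: the capacity, addresses, and eviction policy are illustrative assumptions.

```python
class SimpleCache:
    """A tiny fixed-size cache in front of a slow 'main memory'."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = {}        # address -> value
        self.hits = 0
        self.misses = 0

    def read(self, address, main_memory):
        if address in self.store:
            self.hits += 1     # fast path: data already in the cache
            return self.store[address]
        self.misses += 1       # cache miss: fetch from slow memory
        value = main_memory[address]
        if len(self.store) >= self.capacity:
            # Evict the oldest entry (a simple FIFO policy for illustration)
            self.store.pop(next(iter(self.store)))
        self.store[address] = value
        return value


memory = {addr: addr * 2 for addr in range(100)}  # stand-in for slow RAM
cache = SimpleCache(capacity=4)
for addr in [1, 2, 1, 1, 3, 2]:   # repeated addresses hit the cache
    cache.read(addr, memory)
print(cache.hits, cache.misses)   # → 3 3
```

Notice that the repeated reads of addresses 1 and 2 are served from the cache; only first-time accesses pay the cost of going to the slow memory.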

In early PCs, a common trait of the hardware was that it was all quite slow. Processor speeds were 8 MHz or less, but that was not a big problem because memory was equally slow. Nowadays you have processors running at 2.4 GHz to 5 GHz or more (the latest Intel Core i9 reaches almost 5.2 GHz), while a PC’s main memory (RAM) runs at around 1200–2400 MHz. A modern top-of-the-line processor performs over 1,000 times faster than the original IBM PC, so the importance of cache memory is very high.

Why is Cache Memory So Fast but So Small?

The CPU cache sits on the same die as the processor, far closer than any other memory. It is solid-state memory, not data on rotating platters as in a hard disk, which shortens the round trip. Moreover, cache cells do not have to be refreshed while working; refreshing wastes cycles and reduces memory efficiency. Another point is that requested data has already been retrieved from storage into the cache, so it does not have to travel over the bus to the processor again.

Disk drives, in addition, are mechanical systems, which makes them slower and less efficient than purely electronic devices. All of this makes cache memory faster than any other kind of memory, and it plays a vital role in increasing the performance of the PC. But why is it so small? The answer is that cache size is proportional to latency (which slows your PC down), to manufacturing cost, and to energy consumption.

So if the cache were large, it would end up slower rather than faster. A huge number of transistors is needed to build a CPU cache, and it is one of the most power-hungry elements on the motherboard. A larger cache also means a larger area of memory to search for the requested data, so access time would be much longer. A large cache would also increase the die area of the processor.

Nowadays manufacturers prefer smaller caches when developing newer CPUs, because an oversized cache is simply wasted. For example, the Intel Core 2 had up to 12 MB of cache, while the more modern Core i3 uses only 3 MB, yet the Core i3 has a 100 MHz higher clock speed than the Core 2. Most modern CPUs have their own independent caches.

RAM vs Cache Memory

The memory of a computer is organized in a pattern called a ‘hierarchy’, arranged in order around the processor. Each level’s position is chosen based on its role, access time, cost, and capacity. RAM and cache memory are both important members of this hierarchy: RAM is considered the main memory of the computer, while the cache is a special memory type used by the CPU.


It may seem that RAM and cache memory are similar, as both help the processor by temporarily storing data for later use. But there are fundamental differences between the two. Cache memory sits closer to the CPU than RAM, and it is much faster and more expensive. Cache management is done entirely in hardware, while RAM is managed by the operating system.

The capacity of RAM is larger: RAM can be 16 GB or more, whereas cache memory is usually around 3–12 MB. RAM is divided into two categories, static (SRAM) and dynamic (DRAM). The cache, in turn, is organized into L1, L2, and L3 levels that differ in speed, cost, and capacity.

Different Types of Cache Memory

There are mainly 3 levels of cache memory in modern CPUs. The levels reflect their distance from the CPU core, with Level 1 the closest. They are:

  1. Level 1 (L1) Cache

L1 cache is called primary or internal cache. It is very small and tightly bound to the actual processing unit of the CPU. It tends to be around 4–32 KB depending on the CPU architecture and can fulfill a data request within about 3 CPU clock ticks.

  2. Level 2 (L2) Cache

L2 cache is larger than L1 but a bit slower, needing around 20 CPU ticks to fulfill a request. It is generally tied to a CPU core, with a larger capacity of 64 KB to 8 MB. Many current CPUs place an Advanced Transfer Cache on the processor chip, which is a kind of L2 cache.

  3. Level 3 (L3) Cache

L3 cache is much larger but slower than the other caches, though still a lot faster than main memory. It sits farther from the processor cores and is typically shared among them. Today’s personal computers often have up to 8 MB of L3 cache.
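One common way to reason about these levels is the average memory access time (AMAT) formula, AMAT = hit time + miss rate × miss penalty, applied level by level. The sketch below uses illustrative cycle counts loosely based on the figures above; the miss rates and the RAM latency are assumptions, not measurements.

```python
def amat(hit_time, miss_rate, miss_penalty):
    """Classic formula: AMAT = hit time + miss rate * miss penalty."""
    return hit_time + miss_rate * miss_penalty

# Build up from main memory outward: an L3 miss goes to RAM,
# an L2 miss goes to L3, and an L1 miss goes to L2.
l3 = amat(hit_time=40, miss_rate=0.10, miss_penalty=200)  # RAM ~200 cycles (assumed)
l2 = amat(hit_time=20, miss_rate=0.20, miss_penalty=l3)
l1 = amat(hit_time=3,  miss_rate=0.05, miss_penalty=l2)
print(round(l1, 2))  # → 4.6
```

Even with main memory costing hundreds of cycles, the effective access time seen by the CPU stays close to the 3-cycle L1 hit time, because the caches absorb most requests.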

Cache types and specialized caches

There are different types of caches, and they work together for different purposes:

  1. Registers.
  2. TLB.
  3. L1 Cache.
  4. L2 Cache.
  5. Virtual Memory.
  6. Buffer Cache.
  7. Disk Cache.
  8. Network Buffer Cache.
  9. Browser Cache.
  10. Web Cache.

Some CPUs also use special types of cache memory, for example:

  • Trace Cache: found in the Intel Pentium 4.
  • Victim Cache: found in Intel’s Crystal Well variant of Haswell.
  • Multilevel Caches: in the IBM POWER4, AMD Phenom II, and Intel Core i7.
  • Write Coalescing Cache: in AMD’s Bulldozer.

Working Principle: How Does Cache Work

At the heart of the system is the processor; then comes cache memory, then RAM, and finally the storage device. The cache is also a type of RAM, but a static RAM called SRAM. SRAM is faster and costlier than DRAM because each cell uses a flip-flop (six transistors) to store a bit, whereas a DRAM cell uses one transistor and a capacitor.

When an application starts, or any data is to be read or written, the data is moved from the storage device (a magnetic device such as a hard disk, an optical device such as a CD drive, etc.) into DRAM. Whenever the processor requires data or instructions, DRAM can provide them at a faster rate than the slower storage devices.

Although much faster than storage, DRAM still cannot keep up with the processor; it cannot transfer the required data or instructions at the rate the processor works. So the required data is moved to the next level of fast memory, known as cache memory, which supplies data to the processor at a very high rate so the processor can work with it.

Cache memory also temporarily stores the data the processor has used recently. If the processor needs that data again, the cache can provide it much faster, because this time it does not have to be collected from the slower DRAM or storage device. This makes the overall process faster.
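This “keep recently used results close at hand” idea is the same one behind software caches. For example, Python’s standard functools.lru_cache memoizes a slow function so repeated calls are served from a cache instead of being recomputed; the function below is just a stand-in for a slow DRAM or disk fetch.

```python
from functools import lru_cache

@lru_cache(maxsize=128)
def slow_lookup(address):
    # Stand-in for a slow DRAM/disk fetch of the value at 'address'
    return address * 2

slow_lookup(7)                       # first call: a miss, computed from scratch
slow_lookup(7)                       # second call: served from the cache
info = slow_lookup.cache_info()
print(info.hits, info.misses)        # → 1 1
```

The `maxsize` argument bounds the cache, and least-recently-used entries are evicted when it fills, mirroring how a CPU cache keeps only the hottest data.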

Cache Mapping

Cache mapping is how the CPU organizes data in the cache memory. The cache is mapped whenever data is to be used by the processor, and it is done to make storing data efficient. There are three cache mapping techniques:

  • Direct Mapping: in direct mapping, each memory address is divided into two parts: an index that selects a fixed cache line and a tag that is stored in the cache to identify the data. This process is simple but not flexible.
  • Associative Mapping: in this type, an associative memory stores both the content and the address, which allows any word to be placed anywhere in the cache. It is considered the fastest and most flexible mapping technique.
  • Set-associative Mapping: this technique is a hybrid of the associative and direct mapping techniques. It is a reasonable compromise between simple direct mapping and the complex hardware needed for fully associative caches.
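For direct mapping, the address split can be shown concretely. The sketch below divides a memory address into tag, index, and offset fields for a hypothetical cache with 64 lines of 16 bytes each; the sizes are illustrative assumptions, not a real CPU’s geometry.

```python
LINE_SIZE = 16    # bytes per cache line -> 4 offset bits
NUM_LINES = 64    # lines in the cache   -> 6 index bits

OFFSET_BITS = LINE_SIZE.bit_length() - 1   # log2(16) = 4
INDEX_BITS = NUM_LINES.bit_length() - 1    # log2(64) = 6

def split_address(addr):
    """Split an address into (tag, index, offset) for a direct-mapped cache."""
    offset = addr & (LINE_SIZE - 1)                 # byte within the line
    index = (addr >> OFFSET_BITS) & (NUM_LINES - 1)  # which cache line
    tag = addr >> (OFFSET_BITS + INDEX_BITS)         # identifies the block
    return tag, index, offset

tag, index, offset = split_address(0x12345)
print(tag, index, offset)   # → 72 52 5
```

Every address with the same index competes for the same cache line, and the stored tag decides whether the line currently holds the requested block; that rigidity is exactly why direct mapping is simple but inflexible.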

Is it changeable?

It is not possible to change your PC’s cache memory on its own, as it is part of the processor; to change the cache you have to change the processor. However, you can manage software caches for different tasks (e.g. the browser cache) with commands.

Frequently Asked Questions

Is cache in RAM or ROM?

Cache is random access memory (RAM) located inside the CPU. Because no data is stored in it permanently, and it is frequently rewritten, it is not ROM.

Why is it called cache?

A cache refers to something hidden or stored away. Because frequently used data is stored inside it, it is called cache memory.

Is cache faster than RAM?

Yes, cache memory is faster than RAM; it sits higher in the memory hierarchy.

Summing UP

Hopefully, after reading this article, the answer to “What is cache memory?” is now clear to you. Each cache technique comes with its own design constraints, advantages, and limitations. There are many advantages, and at the same time plenty of room to improve cache performance. Engineers are working hard on new technologies that will remove these obstacles and make your PC faster and more efficient.
