CACHE, CRASH or CASH:
What the heck is a cache?

By:David K. Every
©Copyright 1999


 

Cache - Pronounced like cash. (You will be ridiculed mercilessly by technophiles if you pronounce it "Ka-shay"). It means --

  1. A hiding place for storing provisions.
  2. A place for concealment and safekeeping.
  3. Computers - A fast storage buffer.

Obviously we are most concerned with the last definition.

Computers have a few different types of caches. Each serves its own purpose, but generally all of them fit the definitions above. A cache is usually a fast pool of memory, meant to sit between a fast device and a slower device (or bus) to speed up the transfer between the two. Not only are computer caches buffers, but they often have electronics or software support logic so that they can "think ahead".

Cache Functions

What do caches do? Caches serve a few different functions. Some caches are dedicated to only one function, but most caches do all of the following -

Write caches are for when you have a fast device trying to send information to a slow device. Normally the fast device would have to sit and wait for the slow device to say "I'm ready for the next word." It would be like trying to hold a conversation with someone who had to translate each word you said with a dictionary. The cache lets you send a whole bunch of information out, and then go do something else while they translate. So write caches often have logic or software so that they flush themselves out (write) at the speed of the slower device automatically, and are empty and ready to go when you get around to using them again.

Imagine the sink in your kitchen. When you want to pour out a gallon of spoilt milk, you can dump the entire gallon into the sink, all at once, even though the drain may not be as fast as you are pouring. This is because the sink works like a cache - it is a buffer between you and the drain. You pour into the sink, and it fills up as fast as you can put it in, and the drain will empty it out at its own speed (which is a little slower). Even if you were pouring out 10 individual gallons of spoilt milk, by the time you get the next gallon and pour it into the sink, the previous gallon will have drained away (thanks to the temporary storage of the sink acting as a cache). If you do fill up the sink, you have to wait until it drains down enough that you can pour in the next gallon, but at least you can go do other things while you are waiting. That is basically what a write cache does.

How much faster one side is (relative to the other) helps decide how big a cache you need.
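
For the code-minded, here is a minimal sketch of the idea in Python. The "slow device" and its timings are invented for illustration; a real write cache lives in hardware or a driver, not in a script like this.

import queue
import threading
import time

class SlowDevice:
    def write(self, item):
        time.sleep(0.1)                  # pretend each write takes 100 ms
        print(f"slow device stored: {item}")

class WriteCache:
    def __init__(self, device, size=16):
        self.device = device
        self.buffer = queue.Queue(maxsize=size)     # the "sink"
        threading.Thread(target=self._drain, daemon=True).start()

    def write(self, item):
        self.buffer.put(item)            # returns almost instantly unless the buffer is full

    def _drain(self):
        while True:                      # flush out at the slow device's own speed
            item = self.buffer.get()
            self.device.write(item)
            self.buffer.task_done()

cache = WriteCache(SlowDevice())
for n in range(10):
    cache.write(f"gallon {n}")           # the fast side moves on immediately
print("fast side is done and free to do other work")
cache.buffer.join()                      # wait for the drain before exiting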

Read caches are for when you have a fast device trying to get information from a slow device. Read caches often have support logic to pre-load (prefetch) what you are going to need next. So they pre-read (slowly) all the information you are likely to need. When you ask for what is in the buffer, it can give it to you really quickly. When you've emptied the buffer, it can go and refill itself while you go on and do other things.

Imagine that same sink, but now you are interested in the drain's point of view. You let someone else fill the sink with water, one small glass at a time. When you want the water out of the bottom, you just open the drain and it comes pouring out (until the sink is empty).
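
Here is a rough sketch of a read cache with prefetch, again in Python with made-up block names and timings. A background thread slowly pre-reads blocks into a buffer so the fast consumer rarely has to wait.

import queue
import threading
import time

def slow_source(block_number):
    time.sleep(0.1)                      # pretend each read takes 100 ms
    return f"block {block_number}"

class ReadCache:
    def __init__(self, total_blocks, depth=4):
        self.buffer = queue.Queue(maxsize=depth)
        threading.Thread(target=self._prefetch, args=(total_blocks,),
                         daemon=True).start()

    def _prefetch(self, total_blocks):
        for n in range(total_blocks):    # read ahead, keeping the buffer topped up
            self.buffer.put(slow_source(n))

    def read(self):
        return self.buffer.get()         # usually already waiting in the buffer

cache = ReadCache(total_blocks=8)
time.sleep(0.5)                          # the consumer does other work first...
for _ in range(8):
    print(cache.read())                  # ...then the early blocks arrive instantly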

Buffer caches are just a pool of what you have already done. They remember the last data or information, so that if you want to look at that data again (or execute that same code) it is already waiting for you. So if you go to repeat anything, it is already there (and easier to do a second or third time). Since a lot of what computer programs do is repetitive work, this happens more than you might guess.
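
Python ships with a ready-made version of this "remember what you already did" idea, functools.lru_cache. The slow lookup function below is invented for illustration; the second call comes back from the cache instead of doing the work again.

import functools
import time

@functools.lru_cache(maxsize=128)
def slow_lookup(key):
    time.sleep(0.2)                      # pretend this is an expensive fetch
    return key.upper()

start = time.time()
slow_lookup("hello")                     # first time: slow, goes to the real source
print("first call:", round(time.time() - start, 2), "seconds")

start = time.time()
slow_lookup("hello")                     # second time: already in the cache
print("second call:", round(time.time() - start, 2), "seconds")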

Combinations - most caches are a combination of all of the above and do some or all of those functions.

Types of Cache

L1 Cache

The processor in your machine is very fast - often running at 200 or 300 MHz. Regular RAM is faster than almost anything else in your computer, but it cannot keep up at anywhere close to the processor's speed. So CPU designers put a special sort of fast memory inside the microprocessor chip. This is the L1 cache, and it runs at the same speed as the processor. Space on the processor is very expensive, so they cannot afford to put a lot of memory in there (probably about 1/1000th of the size of your average total memory). Since computer programs do a lot of looping over the same code, over and over again, instructions and data stored in this cache get executed often. Even small L1 caches can speed up a processor by a large amount. The faster the processor goes relative to main memory, and the more data you are working with, the larger the L1 cache you will want.
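
A tiny simulation shows why even a small L1 cache catches most accesses: a loop touches the same few addresses over and over. The sizes and access pattern below are made-up illustrations, not real hardware numbers.

cache = set()        # stands in for a small L1; 32 addresses never come close to filling it
hits = misses = 0

# simulate a program looping 1,000 times over the same 32 memory addresses
for _ in range(1000):
    for address in range(32):
        if address in cache:
            hits += 1
        else:
            misses += 1
            cache.add(address)           # keep it around for the next pass

print(f"hits: {hits}, misses: {misses}")
print(f"hit rate: {hits / (hits + misses):.1%}")     # roughly 99.9%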

L2 Cache

The fast memory that is on the processor itself (the L1 cache) is small, and the L1 cache and processor are still far, far faster than RAM (up to 50 times faster). When something is not in the L1 cache and the processor has to go out to main memory, there is a stall. The CPU basically has to wait a long time (in its terms) for the RAM to give it more information, and the processor can only twiddle its anthropomorphic thumbs. So between the really fast L1 cache (and the processor) and the really slow RAM (or ROM), we put a secondary buffer - the Level-2 (or L2) buffer/cache.

This buffer is often made out of a fast but relatively expensive memory called Static RAM, which is up to 10 times faster than regular (Dynamic) RAM. People can afford to put 256 KB, or up to a megabyte or two, of Static RAM in a machine, but most can't afford 16 or 32 MB of the stuff in their home computers.

Most of the time your processor wants to do something, it will either be doing something it has done recently (looping), so that memory will be in the L1 cache, or it will have logic to have already prefetched the information into its L1 cache. But computers are executing literally hundreds of millions of instructions per second. Even a 1% miss rate means missing over a million times each second, and real L1 miss rates are closer to 10-20%. So the L2 cache makes a difference as well. When the processor misses in the L1 cache, most of the rest of the time it will get a hit in its L2 cache. So it matters far less that main memory is much slower than the processor -- because what the CPU wants will usually already be in the faster memory called the L2 cache.
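
Here is the arithmetic from that paragraph as a short back-of-the-envelope script. The clock speed, miss rates, and cycle counts are made-up illustrations, not measured figures, but they show how much a layered cache softens the cost of going to RAM.

instructions_per_second = 300_000_000    # a ~300 MHz processor, roughly one instruction per cycle
l1_miss_rate = 0.10                      # 10% of accesses miss the L1
l2_miss_rate = 0.05                      # of those, only 5% also miss the L2

print(f"even a 1% miss rate is {instructions_per_second * 0.01:,.0f} misses per second")
print(f"at a 10% miss rate: {instructions_per_second * l1_miss_rate:,.0f} misses per second")

# average cost per access (in cycles), layering the caches:
# L1 time + L1 miss rate * (L2 time + L2 miss rate * RAM time)
l1_time, l2_time, ram_time = 1, 5, 50
with_l2 = l1_time + l1_miss_rate * (l2_time + l2_miss_rate * ram_time)
without_l2 = l1_time + l1_miss_rate * ram_time
print(f"average cycles per access with an L2 cache: {with_l2:.2f}")
print(f"average cycles per access without one: {without_l2:.2f}")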

Most people who talk about caches are referring to L2 caches. Most home computers today have these two stages (levels) of cache.

L3...L4... Cache

Some systems are so fast that they have more stages or levels of cache between the processor and the main memory. Each of these levels just gets the next number in the sequence (L1, L2, L3 and so on). The Exponential x704 processor is built around a design that has 3 levels of cache, as is the DEC Alpha. L3 caches are rare but not unheard of (especially as we get to 500+ MHz computers), but for now I have not heard of any stages beyond that - no L4 caches or more. With the way computers are going (faster and faster), who knows?

Drive Cache

Just like the processor is much faster than memory (RAM), memory is much faster than the hard drive. So some designers decided to use a buffer (cache) of RAM to sit between the hard drive and the rest of the computer. When the computer writes something out, it gets put in this buffer, and then it gets written out slowly (at the drive's maximum speed) while other parts of the computer are free to do other things. When the computer wants to read something, it may have been pre-fetched, or it may be something that was used earlier, and so it is already in the disk's read cache.

Not only can part of the computer's main memory do this, but hardware designers also add a small amount of memory (relative to the hard drive's total storage) to the hard drives themselves. This is a hardware cache, and it is the norm for almost all hard drives produced today. Unfortunately, people don't differentiate well between "the drive's cache in main memory" and "the hard drive's own cache", other than wording it how I just did. So be really careful with this terminology. They both serve the same function, but they do so in different ways.
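
You can see the main-memory flavor of drive caching from Python, which buffers file writes in RAM and only pushes them toward the platter when asked. The file name below is just an example; even after os.fsync, the drive's own on-board cache may hold the data for a moment longer.

import os

with open("example.log", "w", buffering=65536) as f:     # a 64 KB write buffer in RAM
    for n in range(10_000):
        f.write(f"record {n}\n")         # lands in the buffer, not yet on the disk
    f.flush()                            # hand the buffer down to the operating system
    os.fsync(f.fileno())                 # ask the OS to push it out to the drive itself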

Usually when people talk about caches they are referring to L2 caches, but drive caches (in either flavor) are not uncommon, and they are discussed often enough to cause confusion.

CD-ROM Cache

Hard drives are lightning fast compared to many CD-ROMs. So some smart engineers have decided to use either hard drives or RAM or BOTH (in a multi-stage cache scheme) to speed up CD-ROMs. These work just like a drive cache, and store the data temporarily in either the hard drive or RAM until the computer gets around to reading it. Remember, CD-ROMs are READ-ONLY, so there is no such thing as a write cache for a CD-ROM. (There are CD devices that you can write to, but they are not called CD-ROMs -- usually they have a name like CD-R or PD-CD or Optical Drives, etc.)

There are also read-ahead buffers (caches) built into CD-ROM drives themselves, just like hard drives have. These too are CD-ROM caches.

Network and Serial Buffers

Notice I did not say caches. Networks and other serial ports are much slower than RAM or processors, so engineers give them buffers for reading and writing. A buffer is a type of cache. Yet for some inexplicable reason, network buffers (or serial buffers) are almost never called caches. Go figure!
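
Python's io module shows the same buffer idea with the "buffer" name attached: a raw, unbuffered stream gets wrapped in a read/write buffer. The file below just stands in for a slow port; the names are illustrative only.

import io

raw = io.FileIO("port_capture.bin", "w")         # a raw, unbuffered stream
buffered = io.BufferedWriter(raw, buffer_size=8192)
for n in range(1000):
    buffered.write(b"packet\n")                  # collects in the 8 KB buffer
buffered.flush()                                 # one big write out to the slow side
buffered.close()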

Conclusion

So a cache can be put between any two devices that run at different speeds. The cache exists on the faster device, and takes a small amount of that faster device's space to simulate a fast version of the slower device. Sometimes this is hardware, often it is software. Sometimes caches are used just as a pool of storage; usually they have some logic of their own so that they can flush themselves out or prefetch what you need next.

Thanks to caches, our computers and most of their sub-components work much faster than they would otherwise. Just as a squirrel has a cache of nuts and you can have a cache of cash, computers have a lot of different types of caches -- but the most confusing part of caches (for users) is keeping the names straight.


Created: 04/15/97
Updated: 11/09/02

