Submitted by Dave Astle, Oct 02 1999 05:21 AM | Last updated Oct 02 1999 05:21 AM
A cache is a relatively small, fast location used to store frequently needed items. In the context of operating systems, a disk cache is a region of main memory where commonly accessed disk blocks are kept. Two aspects of disk caching improve performance: read caching and write caching. When a program reads a block from disk, the OS first checks the cache; if the block is there, the OS returns it directly instead of reading from disk. When a program writes a block to disk, the OS first puts it into the cache; the write can be combined with other writes to the same block, and only later (for example, when the disk is idle) does the OS write the block out to disk.

In the context of processors, a cache is a small store of memory on the CPU (level 1 cache) or near the CPU (level 2 cache) that holds parts of main memory. When the CPU requests that memory be transferred into a register, it checks the cache first. When the CPU writes registers to memory, it writes first to the cache.

Applications (including games) can do their own caching, keeping frequently needed data in memory or on disk to avoid going all the way to disk or to the network.
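To make the read-caching and write-back ideas concrete, here is a minimal sketch in Python. The names (`BlockCache`, `read`, `write`, `flush`) are illustrative, not any real OS API, and a plain dictionary stands in for the disk:

```python
class BlockCache:
    """Illustrative sketch of read caching plus write-back caching.
    A dict stands in for the backing disk; all names are hypothetical."""

    def __init__(self, backing):
        self.backing = backing   # dict of block number -> data ("the disk")
        self.cache = {}          # cached copies of blocks
        self.dirty = set()       # blocks modified but not yet written back

    def read(self, block_no):
        # Read caching: serve from the cache when possible,
        # otherwise fetch from "disk" and remember the block.
        if block_no not in self.cache:
            self.cache[block_no] = self.backing[block_no]
        return self.cache[block_no]

    def write(self, block_no, data):
        # Write caching: update the cache now and defer the disk write,
        # so repeated writes to the same block are combined.
        self.cache[block_no] = data
        self.dirty.add(block_no)

    def flush(self):
        # Later (e.g. when the disk is idle), write dirty blocks back.
        for block_no in self.dirty:
            self.backing[block_no] = self.cache[block_no]
        self.dirty.clear()


disk = {0: b"boot", 1: b"data"}
c = BlockCache(disk)
c.write(1, b"new")
before_flush = disk[1]   # still b"data": the write is only in the cache
c.flush()
after_flush = disk[1]    # now b"new": the deferred write reached "disk"
```

Note that a write-back cache like this trades safety for speed: if the machine loses power before `flush`, the dirty blocks are lost, which is why OSes flush periodically and on shutdown.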