Memory

Cache Line

The fixed-size block of memory that a cache stores and transfers as a single unit.

Detailed Explanation

Caches do not fetch or store one byte at a time internally. Instead, they move data in blocks called cache lines. If a processor reads one byte that misses in the cache, the memory system typically fetches the entire line containing that byte and stores it in the cache, because nearby bytes are likely to be used soon as well. This is one of the main ways caches exploit spatial locality.

Each cache line usually contains the data block itself plus metadata such as a valid bit, dirty bit, coherence state, and tag. Larger cache lines improve the efficiency of burst transfers and can raise the hit rate for sequential accesses, but they waste bandwidth when software touches sparse addresses and increase the miss penalty, since more data must be transferred on each fill.

Industry Context

Common line sizes in modern CPUs are 32, 64, or 128 bytes, with 64 bytes the most widespread. Line size is a fundamental architecture choice because it affects cache bandwidth, refill latency, prefetch behavior, and coherence traffic.

Code Example

systemverilog
// A simple 64-byte cache line represented as 16 32-bit words.
// Field widths (e.g. the 20-bit tag) depend on cache geometry; a
// multiprocessor cache would also carry a coherence-state field.
typedef struct packed {
  logic              valid;  // line holds valid data
  logic              dirty;  // line modified relative to memory
  logic [19:0]       tag;    // address tag compared on lookup
  logic [15:0][31:0] data;   // 16 words x 32 bits = 64 bytes
} cache_line_t;