Difficulty: Medium
Correct Answer: The property that ensures all processors see a consistent view of shared memory, so that copies of the same data item in different caches eventually reflect the most recent write
Explanation:
Introduction / Context:
In multiprocessor or multicore systems, each core typically has its own cache. When multiple cores access and update the same shared data, maintaining a consistent view of memory becomes challenging. Cache coherency protocols are designed to ensure that all cores observe updates to shared data in a well-defined, consistent way. Understanding cache coherency is important for reasoning about the behaviour of parallel programs and memory models.
Given Data / Assumptions:
A shared-memory multiprocessor in which each core has a private cache and cores may read and write the same memory locations.
Concept / Approach:
Cache coherency is a property of a memory system with caches in which cached copies of shared data remain consistent with one another. Specifically, if one processor writes a new value to a shared memory location, other processors should eventually see that new value when they read the same location. Coherency protocols, such as MESI or MOESI, use bus snooping or directory-based mechanisms to track which caches hold which lines and to invalidate or update stale copies when writes occur.
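To make this concrete, here is a minimal C++ sketch of what an invalidation-based, MESI-style protocol does to a local copy when it snoops a write from another core. This is an illustration only, not any real protocol implementation; the names CacheLineState and onRemoteWrite are invented for this example, and real protocols add bus or directory messages, write-backs and further transitions.

#include <cstdio>

// Hypothetical, highly simplified model of MESI line states.
enum class CacheLineState { Modified, Exclusive, Shared, Invalid };

// What a snooping cache does to its own copy when it observes another core
// writing to the same line: the local copy is now stale, so it is invalidated.
// (An update-based protocol would refresh the copy instead.)
CacheLineState onRemoteWrite(CacheLineState current) {
    switch (current) {
        case CacheLineState::Modified:   // a real protocol writes back dirty data first (omitted here)
        case CacheLineState::Exclusive:
        case CacheLineState::Shared:
            return CacheLineState::Invalid;
        case CacheLineState::Invalid:
            return CacheLineState::Invalid; // nothing cached, nothing to do
    }
    return CacheLineState::Invalid;
}

int main() {
    CacheLineState line = CacheLineState::Shared;   // this core holds a clean shared copy
    line = onRemoteWrite(line);                     // another core writes to the same line
    std::printf("local copy is now %s\n",
                line == CacheLineState::Invalid ? "Invalid" : "still readable");
}

The essential behaviour is that a write by one core forces every other cached copy of that line out of a readable state, so a stale value cannot be returned on later reads.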
Step-by-Step Solution:
Step 1: Imagine two cores, both caching the same variable X. Initially, both caches and main memory hold the same value.
Step 2: Core one updates X to a new value in its cache, and possibly in main memory as well, depending on whether the cache is write-through or write-back.
Step 3: Without coherency, core two might continue reading the old value from its cache and never see the update from core one.
Step 4: A coherency protocol ensures that when core one writes to X, core two's cached copy is either invalidated or updated, so that any later read by core two observes the new value (a small sketch of this scenario follows these steps).
Step 5: This behaviour matches the idea that all processors eventually see a consistent view of shared memory for each location.
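Hardware coherency is not directly observable from portable code, but its effect on the scenario above can be sketched. The following C++ example is a sketch under assumptions: core_one and core_two are illustrative names, thread-to-core placement is left to the operating system, and std::atomic is used so that the compiler also preserves the store's visibility; on real hardware that visibility is delivered by the coherency protocol.

#include <atomic>
#include <cstdio>
#include <thread>

// The shared variable X from the steps above.
std::atomic<int> X{0};

void core_one() {
    X.store(42, std::memory_order_release);   // Step 2: core one writes a new value
}

void core_two() {
    int v;
    // Steps 3-4: once core one's write has propagated (the stale line in this
    // core's cache is invalidated or updated), the loop observes the new value.
    while ((v = X.load(std::memory_order_acquire)) == 0) { /* spin */ }
    std::printf("core two sees X = %d\n", v);
}

int main() {
    std::thread t1(core_one);
    std::thread t2(core_two);
    t1.join();
    t2.join();
}

Without coherency (and, at the language level, without the atomic), the spinning reader could in principle keep returning the stale value from its own cache indefinitely.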
Verification / Alternative check:
Architecture references define cache coherency in terms of invariants such as writes to a given location being seen in a single order and no processor continuing to read obsolete data indefinitely. Compression, swapping to disk and completely private memories do not appear in standard definitions of coherency.
Why Other Options Are Wrong:
Compressing cache contents is a possible optimization but does not by itself address consistency of shared data.
Swapping cache lines directly to disk is not how hardware caches operate and would be extremely slow; in any case it has nothing to do with the coherency concept.
Having completely private, never shared memories would avoid coherency issues but does not describe coherency; it describes a different architecture model.
Common Pitfalls:
A common confusion is between coherency and consistency models. Coherency concerns the order of writes to a single memory location, while a consistency (memory) model governs the ordering of operations across multiple locations. Another pitfall is assuming coherency comes for free; in reality, coherency protocols add hardware complexity and can influence performance and scalability, for example through invalidation traffic and false sharing.
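A short C++ sketch of that distinction, again with illustrative names only: coherency keeps each of data and flag individually consistent across cores, but with relaxed atomics nothing orders the two writes relative to each other, so the reader may observe flag set while still reading the old value of data. Establishing that cross-location ordering is the job of the consistency model, expressed in C++ through release/acquire or stronger memory orderings.

#include <atomic>
#include <thread>

// Two separate locations. Coherency makes each location individually
// consistent across cores, but says nothing about the ORDER in which
// writes to different locations become visible to another core.
std::atomic<int> data{0};
std::atomic<int> flag{0};

void writer() {
    data.store(42, std::memory_order_relaxed);
    flag.store(1,  std::memory_order_relaxed);  // may become visible before data does
}

void reader() {
    if (flag.load(std::memory_order_relaxed) == 1) {
        // Not guaranteed to see 42 here: each location is coherent, but the
        // relaxed ordering does not carry the writer's ordering across locations.
        int d = data.load(std::memory_order_relaxed);
        (void)d;
    }
}

int main() {
    std::thread t1(writer);
    std::thread t2(reader);
    t1.join();
    t2.join();
}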
Final Answer:
Cache coherency is the property of a shared memory system with caches in which multiple cached copies of the same data item are kept consistent, so that all processors eventually observe the most recent write to that item.