Difficulty: Easy
Correct Answer: to improve disk performance
Explanation:
Introduction / Context:
Disks are orders of magnitude slower than the CPU and main memory (DRAM). Operating systems therefore maintain a buffer cache (block cache) that keeps recently accessed disk blocks in memory and coalesces writes, reducing I/O latency and increasing throughput.
Given Data / Assumptions:
The question asks for the primary purpose of the operating system's buffer cache for disk blocks; no specific hardware or workload figures are given.
Concept / Approach:
The buffer cache keeps frequently used blocks in memory, so repeated reads are satisfied from RAM, and writes can be delayed and batched under a write-back policy (a write-through policy gives up that gain in exchange for immediate durability). Holding blocks in memory also enables read-ahead for sequential access and lets the I/O scheduler reorder pending requests (e.g., elevator scheduling) to reduce seek overhead, substantially improving effective disk performance.
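A minimal sketch of the idea, assuming a toy block device exposed as backing_read/backing_write callables (hypothetical names, not from the question): an LRU-ordered cache that serves read hits from memory, marks written blocks dirty, and writes them back only on eviction or an explicit sync.

```python
from collections import OrderedDict

class BufferCache:
    """Toy write-back buffer cache with LRU eviction (illustrative only)."""

    def __init__(self, backing_read, backing_write, capacity=64):
        self.backing_read = backing_read      # callable: block_no -> bytes
        self.backing_write = backing_write    # callable: (block_no, bytes) -> None
        self.capacity = capacity
        self.cache = OrderedDict()            # block_no -> (data, dirty_flag)
        self.hits = self.misses = 0

    def read(self, block_no):
        if block_no in self.cache:
            self.hits += 1
            self.cache.move_to_end(block_no)  # mark as most recently used
            return self.cache[block_no][0]
        self.misses += 1
        data = self.backing_read(block_no)    # slow path: go to the disk
        self._insert(block_no, data, dirty=False)
        return data

    def write(self, block_no, data):
        # Write-back: update memory now, defer the disk write until eviction/sync.
        self._insert(block_no, data, dirty=True)

    def sync(self):
        # Flush all dirty blocks (analogous to fsync or periodic flusher threads).
        for block_no, (data, dirty) in self.cache.items():
            if dirty:
                self.backing_write(block_no, data)
                self.cache[block_no] = (data, False)

    def _insert(self, block_no, data, dirty):
        if block_no in self.cache:
            dirty = dirty or self.cache[block_no][1]
        self.cache[block_no] = (data, dirty)
        self.cache.move_to_end(block_no)
        if len(self.cache) > self.capacity:
            old_no, (old_data, old_dirty) = self.cache.popitem(last=False)
            if old_dirty:
                self.backing_write(old_no, old_data)  # write back on eviction
```

Real kernels add hashed lookup, per-block locking, and flusher threads that write dirty blocks back periodically rather than only on eviction, but the performance mechanism is the same: hits avoid physical disk operations and writes are batched.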
Step-by-Step Solution:
1. A physical disk access takes milliseconds, while a memory access takes nanoseconds.
2. The buffer cache keeps recently used disk blocks in RAM, so repeated reads hit memory instead of the disk.
3. Writes are absorbed in memory and flushed in batches, reducing the number of physical disk operations.
4. The net effect is lower I/O latency and higher throughput, i.e., improved disk performance.
Verification / Alternative check:
Performance monitoring shows higher cache hit rates correlate with lower average I/O times and fewer physical disk operations.
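As a back-of-the-envelope check (with assumed, illustrative timings, not figures from the question): if a cache hit costs about 100 ns and a physical disk access about 5 ms, even a 90% hit rate cuts the average access time by roughly an order of magnitude.

```python
# Effective access time = hit_rate * memory_time + (1 - hit_rate) * disk_time
# Timings below are assumed, illustrative values.
mem_time_ns = 100            # serving a block from the buffer cache
disk_time_ns = 5_000_000     # a physical disk access (~5 ms)

for hit_rate in (0.0, 0.5, 0.9, 0.99):
    avg = hit_rate * mem_time_ns + (1 - hit_rate) * disk_time_ns
    print(f"hit rate {hit_rate:4.0%}: average access ~{avg / 1_000_000:.3f} ms")
```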
Why Other Options Are Wrong:
Common Pitfalls:
Confusing the buffer cache (which holds disk blocks) with the page cache (which holds file-backed pages) or with the CPU's hardware L1/L2 caches; ignoring the write-ordering and durability trade-offs of write-back policies.
Final Answer:
to improve disk performance