Deadlock theory: Dijkstra’s banker’s algorithm is a classic solution for which operating-system problem related to resource allocation?

Difficulty: Easy

Correct Answer: deadlock avoidance

Explanation:


Introduction / Context:
Deadlocks occur when processes wait cyclically for resources. The banker's algorithm is a resource-allocation strategy that avoids unsafe states by simulating allocations and granting only those requests that keep the system in a safe state.



Given Data / Assumptions:

  • Finite, reusable resources (e.g., devices, locks).
  • Processes declare maximum resource needs.
  • The OS can check safety before granting requests.


Concept / Approach:

Deadlock avoidance aims to ensure the system never enters a state from which deadlock is possible. The banker’s algorithm models future requests and releases, and approves only those resource grants that leave at least one safe sequence to completion for all processes.



Step-by-Step Solution:

  • Model current allocation, maximum demand, and available resources.
  • For a request, pretend to allocate and test for a safe sequence.
  • If the state remains safe, commit; otherwise, defer the request.
  • Thus the algorithm implements deadlock avoidance, not detection or recovery.
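The steps above can be sketched in code. This is an illustrative Python sketch, not a reference implementation; the function and variable names (`is_safe`, `request_granted`, `work`, `need`, etc.) are my own choices:

```python
# Sketch of the banker's algorithm: a safety check plus a tentative-grant step.
# available[j]   = free units of resource type j
# max_need[i][j] = declared maximum demand of process i for resource type j
# alloc[i][j]    = units of resource type j currently held by process i

def is_safe(available, max_need, alloc):
    """Return True if some safe completion order exists for all processes."""
    n = len(alloc)                       # number of processes
    m = len(available)                   # number of resource types
    work = list(available)               # resources free right now
    need = [[max_need[i][j] - alloc[i][j] for j in range(m)] for i in range(n)]
    finished = [False] * n
    progress = True
    while progress:
        progress = False
        for i in range(n):
            # A process can finish if its remaining need fits within 'work'.
            if not finished[i] and all(need[i][j] <= work[j] for j in range(m)):
                # Pretend it runs to completion and releases everything it holds.
                for j in range(m):
                    work[j] += alloc[i][j]
                finished[i] = True
                progress = True
    return all(finished)

def request_granted(pid, request, available, max_need, alloc):
    """Tentatively grant 'request' to process 'pid'; commit only if safe."""
    m = len(available)
    if any(request[j] > max_need[pid][j] - alloc[pid][j] for j in range(m)):
        raise ValueError("request exceeds declared maximum")
    if any(request[j] > available[j] for j in range(m)):
        return False                     # not enough free resources: defer
    # Pretend-allocate, then test for a safe sequence.
    for j in range(m):
        available[j] -= request[j]
        alloc[pid][j] += request[j]
    if is_safe(available, max_need, alloc):
        return True                      # state remains safe: commit the grant
    # Unsafe: roll back the tentative allocation and defer the request.
    for j in range(m):
        available[j] += request[j]
        alloc[pid][j] -= request[j]
    return False
```

On the textbook five-process, three-resource example (available = [3, 3, 2]), `is_safe` finds a safe sequence, and a request of (1, 0, 2) by process 1 is granted because the resulting state is still safe.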


Verification / Alternative check:

OS texts distinguish deadlock prevention, avoidance, and detection/recovery. The banker's algorithm falls into the "avoidance" category because it performs a safety check before granting each allocation.



Why Other Options Are Wrong:

  • Mutual exclusion: a required condition for deadlock, not what banker’s solves.
  • Deadlock recovery: handles deadlock after it occurs; banker’s tries to avoid it.
  • Cache coherence: multiprocessor memory consistency, unrelated to deadlocks.


Common Pitfalls:

Assuming the algorithm detects deadlocks after they occur; forgetting that it requires processes to declare their maximum needs in advance and that its safety checks can be conservative or costly for large systems.


Final Answer:

deadlock avoidance
