Difficulty: Easy
Correct Answer: doubles
Explanation:
Introduction / Context:
In a counter-type (digital-ramp) ADC, a binary counter drives a DAC. The system increases the count until the DAC output just exceeds the analog input; the count at that moment is latched as the conversion result. This architecture’s conversion time depends on how many DAC codes must be tried sequentially.
Given Data / Assumptions:
An N-bit counter-type ADC clocked with period Tclk; the counter starts from zero each conversion. The worst case occurs when the analog input is near full scale, so nearly all 2^N codes must be stepped through.
Concept / Approach:
Worst-case conversion time is proportional to the number of codes to test. For N bits, that is approximately 2^N clock periods. Increasing resolution by one bit doubles the number of possible counts (2^(N+1) vs 2^N), so the worst-case time doubles. Average time also scales proportionally, though exact factors depend on input distribution and start-from-zero or tracking variants.
Step-by-Step Solution:
1) Codes to test (worst case) ≈ 2^N.
2) Conversion time T_worst ≈ 2^N · Tclk.
3) Increase N → N + 1 → T_worst(new) ≈ 2^(N+1) · Tclk.
4) Therefore, T_worst(new) / T_worst(old) ≈ 2 → the time doubles.
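The steps above can be sketched numerically. This is a minimal illustration, assuming a hypothetical 1 MHz clock (Tclk = 1 µs); the specific clock rate is an assumption, not part of the question:

```python
# Worst-case conversion time of a counter-type (digital-ramp) ADC:
# the counter may have to step through all 2^N codes before the DAC
# output just exceeds the analog input.

T_CLK = 1e-6  # assumed clock period in seconds (1 MHz clock)

def worst_case_time(n_bits: int, t_clk: float = T_CLK) -> float:
    """Approximate worst-case conversion time: 2^N clock periods."""
    return (2 ** n_bits) * t_clk

t8 = worst_case_time(8)  # 256 codes x 1 us
t9 = worst_case_time(9)  # 512 codes x 1 us
print(t8, t9, t9 / t8)   # the ratio is 2: one extra bit doubles the time
```

The ratio is independent of the clock period, which cancels out, so the "doubles" conclusion holds for any fixed clock.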
Verification / Alternative check:
Compare 8-bit vs 9-bit: 256 vs 512 counts in the worst case; time per conversion doubles at the same clock.
Why Other Options Are Wrong:
Tripling has no basis in the code-count relationship (2^(N+1) / 2^N = 2). The "decreases" options contradict the fact that the code count, and hence the conversion time, grows monotonically with bit count.
Common Pitfalls:
Confusing the counter-type ADC with SAR (whose conversion time scales roughly linearly with N bit decisions, not exponentially with 2^N codes); also, forgetting that a faster clock can mask the doubling if the clock rate changes along with the resolution.
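The counter-type vs SAR contrast can be made concrete with a rough cycle-count comparison. This is a sketch under the usual textbook approximations (counter-type: up to 2^N clock cycles per conversion; SAR: about N cycles, one per bit decision):

```python
# Approximate worst-case clock cycles per conversion.

def counter_cycles(n_bits: int) -> int:
    # Counter-type: one cycle per DAC code tested, up to 2^N codes.
    return 2 ** n_bits

def sar_cycles(n_bits: int) -> int:
    # SAR: one comparison per bit, so about N cycles.
    return n_bits

for n in (8, 9, 10):
    print(n, counter_cycles(n), sar_cycles(n))
# Adding a bit doubles the counter-type figure but adds only
# one cycle to the SAR figure.
```

This is why the "doubles" answer is specific to the counter-type architecture; for a SAR ADC the same question would have a different answer.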
Final Answer:
doubles