Difficulty: Easy
Correct Answer: Successive-approximation analog-to-digital converter
Explanation:
Introduction / Context:
Different analog-to-digital converter (ADC) architectures trade off speed, accuracy, cost, and complexity. A key specification is conversion time. Some ADCs take a constant time regardless of the input level, while others have input-dependent conversion latency. Knowing which family offers fixed conversion time is essential for real-time sampling and deterministic control loops.
Given Data / Assumptions:
The question asks which ADC architecture completes a conversion in a fixed time, independent of the input voltage. The candidate architectures compared here are the successive-approximation (SAR), digital-ramp (counter-type), single-/dual-slope (integrating), and flash converters.
Concept / Approach:
A SAR ADC performs a binary search on the input using an internal DAC and a comparator. An N-bit SAR needs exactly N comparison steps, so the conversion time is approximately N clock cycles (plus a small fixed overhead) and is independent of the input voltage. In contrast, a digital-ramp (counter-type) ADC counts up until its DAC output reaches the input, so its conversion time grows with the input code. In a dual-slope ADC the first integration phase is fixed, but the de-integration phase lasts a time proportional to the input, so total conversion time is input-dependent. A flash ADC also converts in fixed time, but it is an architectural extreme: one comparison cycle at the cost of roughly 2^N − 1 comparators. Among the listed options, SAR is the mainstream architecture with fixed, input-independent conversion time.
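The binary-search behavior can be sketched with a small simulation. This is a minimal idealized model (hypothetical `sar_convert` helper, ideal DAC and comparator, no noise or settling time), not a description of any particular device; its point is that the step count equals `n_bits` for every input.

```python
def sar_convert(vin, vref=1.0, n_bits=8):
    """Idealized SAR conversion: binary search from MSB to LSB.

    Always performs exactly n_bits comparison steps,
    regardless of the input voltage vin."""
    code = 0
    steps = 0
    for bit in range(n_bits - 1, -1, -1):
        trial = code | (1 << bit)                 # tentatively set this bit
        if vin >= trial * vref / (1 << n_bits):   # comparator decision
            code = trial                          # keep the bit
        steps += 1                                # one clock per bit
    return code, steps

# Any input, same step count:
print(sar_convert(0.01))  # low input, 8 steps
print(sar_convert(0.99))  # near full scale, still 8 steps
```

Running this for inputs anywhere in the range shows the same fixed step count, which is exactly why SAR conversion latency is deterministic.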
Step-by-Step Solution:
1. A SAR ADC resolves one bit per comparison, starting from the MSB.
2. Each comparison takes one clock cycle, so an N-bit conversion takes N cycles (plus a small fixed overhead).
3. The number of comparisons never depends on the input value, only on the resolution N.
4. Therefore the SAR ADC has a fixed, input-independent conversion time.
Verification / Alternative check:
Data sheets of popular 8–16-bit SAR ADCs specify conversion time as a fixed multiple of the internal clock (for example, N + 2 cycles), independent of the input. Counter-type ADC data sheets instead quote a worst-case time of up to 2^N counts, because the count, and hence the conversion time, varies with the input code.
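The contrast with the counter-type converter can be checked the same way. Below is a hedged sketch of a digital-ramp conversion (hypothetical `counter_convert` helper, same idealized DAC model as above): the step count tracks the input code, so a near-full-scale input takes close to 2^N steps.

```python
def counter_convert(vin, vref=1.0, n_bits=8):
    """Idealized digital-ramp (counter-type) conversion.

    Counts up until the DAC output reaches vin; the step
    count is proportional to the input code (worst case 2**n_bits - 1)."""
    full_scale = (1 << n_bits) - 1
    code = 0
    steps = 0
    while code * vref / (1 << n_bits) < vin and code < full_scale:
        code += 1       # one DAC step per clock
        steps += 1
    return code, steps

# Conversion time depends on the input:
print(counter_convert(0.25))  # ~64 steps at quarter scale
print(counter_convert(0.75))  # ~192 steps at three-quarter scale
```

A small input finishes in a few counts while a near-full-scale input needs almost the full 2^N, which is why counter-type data sheets can only quote a worst-case conversion time.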
Why Other Options Are Wrong:
Digital-ramp (counter-type): conversion time is proportional to the input code, so it is not fixed. Single-/dual-slope: the first integration interval is fixed, but the de-integration interval depends on the input, so total time varies. Flash: conversion time is fixed, but it is the extreme high-speed case requiring about 2^N − 1 comparators, and it is not the general-purpose fixed-time architecture the question targets.
Common Pitfalls:
Assuming a dual-slope ADC has fixed conversion time because its first integration interval is fixed, while overlooking the input-dependent de-integration phase. Confusing the SAR's N-cycle conversion with a flash converter's single-cycle conversion, which achieves fixed time through massive comparator parallelism rather than a binary search.
Final Answer:
Successive-approximation analog-to-digital converter