Introduction / Context:
Resolution of an analog-to-digital converter (ADC) quantifies the smallest change in input voltage that results in a one-code change at the output. It is a key specification in data acquisition and signal processing systems.
Given Data / Assumptions:
- ADC resolution: 12 bits.
- Input range: 0 to 10 V (unipolar).
- Ideal mid-tread quantizer assumed; LSB size computed as the full-scale range divided by the number of quantization levels.
Concept / Approach:
For an N-bit ADC, there are 2^N quantization levels. A common, simple estimate for LSB size is: LSB ≈ Full-Scale Range / 2^N. For 12 bits, 2^12 = 4096. Dividing the 10 V range by 4096 yields the LSB step in volts, which we convert to millivolts for readability.
Step-by-Step Solution:
1. Compute the number of levels: 2^12 = 4096.
2. Compute the LSB: 10 V / 4096 = 0.00244140625 V.
3. Convert to millivolts: 0.00244140625 V × 1000 mV/V ≈ 2.44 mV.
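
As a quick numeric check, here is a minimal Python sketch of the same arithmetic (variable names are illustrative, not from the original problem):

```python
# Minimal sketch: LSB size of an ideal 12-bit ADC over a 0-10 V unipolar range.
n_bits = 12
full_scale_v = 10.0

levels = 2 ** n_bits               # 4096 quantization levels
lsb_v = full_scale_v / levels      # 0.00244140625 V
lsb_mv = lsb_v * 1000              # convert volts to millivolts

print(f"levels = {levels}")        # levels = 4096
print(f"LSB = {lsb_mv:.2f} mV")    # LSB = 2.44 mV
```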
Verification / Alternative check:
If using the 2^N − 1 convention instead, LSB ≈ 10 V / 4095 ≈ 2.442 mV, essentially the same to three significant figures, confirming 2.44 mV as the correct choice.
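
A short sketch comparing the two conventions makes the near-identical results explicit (again, the names below are illustrative):

```python
# Compare the two common LSB conventions for a 12-bit, 0-10 V ADC.
n_bits, full_scale_v = 12, 10.0

lsb_2n = full_scale_v / (2 ** n_bits)               # FSR / 2^N
lsb_2n_minus_1 = full_scale_v / (2 ** n_bits - 1)   # FSR / (2^N - 1)

print(f"{lsb_2n * 1000:.4f} mV vs {lsb_2n_minus_1 * 1000:.4f} mV")
# 2.4414 mV vs 2.4420 mV -- both round to 2.44 mV
```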
Why Other Options Are Wrong:
- 24.4 mV is ten times too large, as from a misplaced decimal point; over 10 V it would imply only about 410 levels (roughly 8–9 bits), not 12.
- 1.2 V is off by orders of magnitude; it would correspond to only about 8 levels (≈ 3 bits).
- 0.61 mV would imply 14 bits over 10 V (10 V / 2^14 ≈ 0.61 mV), not 12 bits.
- 'None of the above' is wrong because 2.44 mV is correct.
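
As a sanity check on the bullets above, one can back-compute the bit count each option would imply via N = log2(FSR / LSB). A hedged Python sketch (option labels and names are illustrative):

```python
# Back-compute the bit count implied by each answer option: N = log2(FSR / LSB).
import math

full_scale_v = 10.0
options_mv = {"2.44 mV": 2.44, "24.4 mV": 24.4, "0.61 mV": 0.61, "1.2 V": 1200.0}

for label, lsb_mv in options_mv.items():
    implied_bits = math.log2(full_scale_v / (lsb_mv / 1000))
    print(f"{label}: implies about {implied_bits:.1f} bits")
# 2.44 mV -> ~12.0 bits; 24.4 mV -> ~8.7 bits;
# 0.61 mV -> ~14.0 bits; 1.2 V -> ~3.1 bits
```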
Common Pitfalls:
Confusing resolution (LSB size) with accuracy: an ADC's absolute accuracy may be worse than its resolution due to integral and differential nonlinearity (INL/DNL) and noise.
Final Answer:
2.44 mV