Difficulty: Easy
Correct Answer: resolution
Explanation:
Introduction / Context:
Converter datasheets list several performance metrics. Among them, the number of bits is foundational because it sets the quantization granularity, code range, and ideal step size. Correct terminology avoids confusion when sizing a converter for a given application.
Given Data / Assumptions:
An N-bit data converter (ADC or DAC) with full-scale range FS; behavior is assumed ideal, so quantization is the only limit.
Concept / Approach:
Resolution is the smallest change the converter can discern at its input (ADC) or produce at its output (DAC), ideally one least significant bit (1 LSB) relative to full scale. The number of bits N defines resolution directly as a fraction of full scale: 1 LSB = FS / 2^N. Accuracy, linearity, and monotonicity are separate (and usually worst-case) error and behavior specifications layered on top of this ideal resolution.
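To make the relationship concrete, here is a minimal sketch that evaluates 1 LSB = FS / 2^N; the 5 V full scale and the bit counts are illustrative assumptions, not values from the question.

```python
# Ideal step size of an N-bit converter: 1 LSB = FS / 2^N.
# The 5 V full scale and the bit counts below are assumed for illustration.

def lsb_size(full_scale_v: float, n_bits: int) -> float:
    """Return the ideal 1-LSB step size in volts."""
    return full_scale_v / (2 ** n_bits)

for n in (8, 12, 16):
    print(f"{n}-bit, 5 V full scale: 1 LSB = {lsb_size(5.0, n) * 1e3:.3f} mV")
```

Going from 8 to 16 bits shrinks the ideal step from about 19.5 mV to about 0.076 mV, which is exactly what "finer resolution" means.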
Step-by-Step Solution:
1. Identify which specification is fixed by the number of bits alone.
2. An N-bit converter divides full scale into 2^N codes, so its ideal step size is FS / 2^N; by definition, this is the resolution.
3. Accuracy, linearity, and monotonicity depend on real error sources (offset, gain error, INL/DNL, noise), not on N alone.
4. Therefore the number of bits determines the resolution.
Verification / Alternative check:
Typical datasheet entries read "Resolution: 12 bits", "INL: ±1 LSB", "DNL: ±0.5 LSB", and "Monotonic over temperature", clearly separating resolution from the other metrics.
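Because INL and DNL are quoted in LSBs, their voltage meaning scales with resolution; a short sketch, reusing the hypothetical 12-bit, 5 V converter from above:

```python
# Convert LSB-denominated error specs to volts.
# The 12-bit / 5 V part and the spec values are assumed for illustration.
FS_V, N_BITS = 5.0, 12
LSB_V = FS_V / (2 ** N_BITS)  # ideal step size, ~1.221 mV

for name, lsbs in (("INL", 1.0), ("DNL", 0.5)):
    print(f"{name}: ±{lsbs} LSB = ±{lsbs * LSB_V * 1e3:.3f} mV")
```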
Why Other Options Are Wrong:
Accuracy: the total error (offset, gain, nonlinearity, noise) relative to the ideal transfer function; it is not set by bit count alone.
Linearity: how closely the transfer function follows a straight line, quantified by INL and DNL; independent of N.
Monotonicity: the guarantee that the output never reverses direction for an increasing input; a behavioral specification, not a function of N.
Common Pitfalls:
Assuming that more bits always mean better accuracy; real accuracy depends on error sources and noise, not just resolution, as the effective-bits sketch below illustrates.
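One common way to quantify this pitfall is the effective number of bits, ENOB = (SINAD − 1.76 dB) / 6.02 dB; a minimal sketch with assumed SINAD figures, showing a noisy nominally 16-bit chain delivering fewer effective bits than a clean 12-bit one:

```python
# Effective number of bits from SINAD (dB): ENOB = (SINAD - 1.76) / 6.02.
# Both SINAD figures below are assumed for illustration only.

def enob(sinad_db: float) -> float:
    """Convert signal-to-noise-and-distortion (dB) to effective bits."""
    return (sinad_db - 1.76) / 6.02

print(f"Noisy 16-bit chain, SINAD 70 dB: ENOB = {enob(70.0):.1f} bits")
print(f"Clean 12-bit chain, SINAD 73 dB: ENOB = {enob(73.0):.1f} bits")
```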
Final Answer:
resolution