Difficulty: Easy
Correct Answer: It is the comparison between the actual output of the converter and its expected output.
Explanation:
Introduction / Context:
When selecting a D/A converter, engineers must distinguish among several specs: resolution, accuracy, linearity, and monotonicity. Misreading these terms can lead to surprising system-level errors.
Given Data / Assumptions:
This is a conceptual definition question; no numerical data are given.
Concept / Approach:
Accuracy is the absolute closeness between the DAC’s actual analog output and the correct ideal output voltage or current for a specific digital input code, usually expressed in LSB or percent of full scale.
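As a minimal illustrative sketch (not part of the original question), the accuracy error for one code can be computed by comparing the measured output with the ideal output and expressing the difference in LSB. The 3.3 V reference and 8-bit width below are assumed example values, and the ideal transfer function is the simple straight-line DAC model.

```python
def accuracy_error_lsb(code, v_measured, v_ref=3.3, n_bits=8):
    """Return the absolute accuracy error in LSB for one digital input code.

    v_ref and n_bits are assumed example values; the ideal output is taken
    from the straight-line model V_ideal = code * V_ref / 2**n_bits.
    """
    lsb = v_ref / (2 ** n_bits)             # size of one step (resolution)
    v_ideal = code * lsb                    # expected (ideal) output
    return abs(v_measured - v_ideal) / lsb  # error expressed in LSB

# Example: code 128 should give 1.65 V from an 8-bit, 3.3 V DAC;
# a measured 1.66 V corresponds to an error of about 0.78 LSB.
print(accuracy_error_lsb(128, 1.66))
```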
Step-by-Step Solution:
1. Identify the definition requested: "accuracy."
2. Match it to "actual versus expected output."
3. Exclude the definitions of resolution (step size), monotonicity (no step reversals), and linearity (deviation from a straight line).
Verification / Alternative check:
Datasheets specify accuracy as total unadjusted error or absolute accuracy relative to the ideal transfer function, confirming the definition.
Why Other Options Are Wrong:
Reciprocal of the number of steps: that is resolution, not accuracy.
No reversal of the output for increasing input codes: that is monotonicity.
Deviation from a straight line: integral nonlinearity (linearity), not overall accuracy.
Update rate: a speed specification, not accuracy.
Common Pitfalls:
Assuming that more bits (higher resolution) automatically mean accurate outputs; offset, gain, and nonlinearity errors can still be large.
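A hedged numeric sketch of this pitfall, using assumed figures: a 16-bit DAC with only a 0.5 % gain error is off by hundreds of LSB at full scale, even though its resolution is very fine.

```python
def gain_error_in_lsb(n_bits, gain_error_fraction):
    """Worst-case full-scale output error (in LSB) caused by gain error alone.

    A gain error scales the whole transfer function, so at full scale the
    deviation is gain_error_fraction * full_scale, i.e. many LSB for a
    high-resolution DAC. The 16-bit / 0.5 % figures below are assumed.
    """
    steps = 2 ** n_bits
    return gain_error_fraction * steps  # error at full scale, in LSB

# 16-bit DAC, 0.5 % gain error -> about 328 LSB of error despite 65536 steps.
print(gain_error_in_lsb(16, 0.005))
```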
Final Answer:
It is the comparison between the actual output of the converter and its expected output.