Difficulty: Easy
Correct Answer: Incorrect
Explanation:
Introduction / Context:
In electronics labs and field testing, engineers constantly judge the quality of measurements. Two foundational terms are accuracy and precision. Confusing them leads to poor instrument selection, incorrect uncertainty budgets, and misleading conclusions about whether a design meets specification.
Given Data / Assumptions:
The statement under evaluation equates precision with closeness to the accepted (true) value; that is, it attributes the defining property of accuracy to precision.
Concept / Approach:
Accuracy describes closeness to the accepted value. It is about correctness. Precision describes repeatability or spread when measuring the same quantity multiple times under identical conditions. It is about consistency. A system can be precise but inaccurate (tight groupings far from the bullseye), accurate but imprecise (average near truth but with wide scatter), neither, or both. The statement in the prompt defines accuracy, not precision.
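The four combinations can be illustrated with a short sketch. The readings, the accepted value, and the tolerances below are invented for illustration; real acceptance limits would come from the instrument's specification:

```python
from statistics import mean, stdev

def describe(readings, accepted, bias_tol, spread_tol):
    """Classify repeated readings as accurate and/or precise.

    accurate: mean of the readings is within bias_tol of the accepted value
    precise:  standard deviation of the readings is at most spread_tol
    """
    accurate = abs(mean(readings) - accepted) <= bias_tol
    precise = stdev(readings) <= spread_tol
    return accurate, precise

# Accepted value (volts) and tolerances are illustrative assumptions.
ACCEPTED, BIAS_TOL, SPREAD_TOL = 5.00, 0.05, 0.05

print(describe([4.99, 5.01, 5.00, 4.98], ACCEPTED, BIAS_TOL, SPREAD_TOL))  # (True, True): accurate and precise
print(describe([5.40, 5.41, 5.39, 5.40], ACCEPTED, BIAS_TOL, SPREAD_TOL))  # (False, True): precise but inaccurate
print(describe([4.80, 5.20, 5.05, 4.95], ACCEPTED, BIAS_TOL, SPREAD_TOL))  # (True, False): accurate but imprecise
```

The second case is the classic trap: the readings agree with one another to 10 mV, yet every one of them is 0.4 V from the accepted value.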
Step-by-Step Solution:
1. Accuracy: closeness of a measurement (or the mean of repeated measurements) to the accepted value.
2. Precision: closeness of repeated measurements to one another (small spread), with no reference to the accepted value.
3. The statement assigns the definition of accuracy to precision, so it is incorrect.
Verification / Alternative check:
Plot repeated readings as a histogram. Small spread (low standard deviation) indicates high precision. Compare the mean of these readings to the accepted value. A small bias (mean minus accepted) indicates high accuracy. The two qualities are independent axes of measurement quality.
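This check can be run numerically. The ten readings and the accepted value of 9.00 V below are invented for illustration:

```python
from statistics import mean, stdev

# Hypothetical repeated readings of the same quantity (volts).
readings = [9.12, 9.11, 9.13, 9.12, 9.10, 9.12, 9.11, 9.13, 9.12, 9.11]
accepted = 9.00  # accepted (reference) value, an assumption for this sketch

spread = stdev(readings)          # small spread -> high precision
bias = mean(readings) - accepted  # small bias  -> high accuracy

print(f"precision (std dev): {spread:.4f} V")
print(f"accuracy  (bias):    {bias:+.4f} V")
# The spread is under 10 mV but the bias is about +0.12 V:
# these readings are precise yet inaccurate.
```

Because the two numbers are computed independently, either one can be good while the other is poor, which is exactly why the two qualities are independent axes of measurement quality.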
Why Other Options Are Wrong:
Marking the statement "Correct" would be wrong because closeness to the accepted value defines accuracy; precision concerns only the repeatability of readings, which can be excellent even when every reading carries the same bias.
Common Pitfalls:
Using “precise” when you mean “accurate”; trusting a high-resolution readout as proof of accuracy; ignoring calibration and traceability when claiming accuracy; equating a single reading with precision, which requires repetition.
Final Answer:
Incorrect