A digital voltmeter (0–999 counts) has a full-scale reading of 9.999 V. What is the resolution (smallest change it can display) on this range?

Difficulty: Easy

Correct Answer: 0.01 V

Explanation:


Introduction / Context:
Digital meters display quantized values in counts. Knowing the range and the number of display counts lets you compute the resolution, which is the value represented by one count. This informs uncertainty and least significant digit (LSD) behavior in measurements.


Given Data / Assumptions:

  • Readout spans 0 to 999 counts.
  • Full-scale reading is 9.999 V (display format similar to a 3½-digit DVM).
  • Quantization is uniform.


Concept / Approach:
Resolution = Full-scale value / maximum count. For a 0–999 display on a 9.999 V range, one count equals 9.999 V / 999 ≈ 0.01001 V, which is essentially 0.01 V. Many instruments also quote resolution as 1 count (1 LSD).


Step-by-Step Solution:

Compute resolution: ΔV = 9.999 V / 999 ≈ 0.01001 V.
Round to display granularity: ΔV ≈ 0.01 V.
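The arithmetic above can be checked with a short script; the helper name `dvm_resolution` is illustrative, not a standard API:

```python
# Resolution of a digital voltmeter: full-scale value divided by maximum count.
# Values are taken from the problem statement (9.999 V full scale, 0-999 counts).

def dvm_resolution(full_scale_v: float, max_count: int) -> float:
    """Return the voltage represented by one display count."""
    return full_scale_v / max_count

delta_v = dvm_resolution(9.999, 999)
print(f"One count       = {delta_v:.5f} V")        # ≈ 0.01001 V
print(f"Rounded to LSD  = {round(delta_v, 2)} V")  # 0.01 V
```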


Verification / Alternative check:

Noting that the display spans roughly 10 V in about 1000 count steps, one count corresponds to about 10 mV, so the smallest displayable change is 0.01 V.


Why Other Options Are Wrong:

1 V and 0.1 V are too coarse; 1 mV or 1 μV are far finer than the stated display capability.


Common Pitfalls:

Confusing resolution with accuracy; resolution is just the smallest step, not the measurement error.


Final Answer:

0.01 V
