Difficulty: Medium
Correct Answer: 41.25 seconds
Explanation:
Introduction / Context:
The sensitivity of a level tube (bubble) is defined as the angular tilt of the line of sight per division of bubble movement. By comparing the change in staff reading at a known sight distance when the bubble is intentionally displaced a known number of divisions, we can compute arc-seconds per division. This is a standard field calibration task for levelling instruments.
Given Data / Assumptions:
Change in staff reading when the bubble is displaced: Δs = 0.08 m.
Horizontal sight distance to the staff: D = 80 m.
Bubble displacement: 5 divisions.
Assumptions: the staff is held exactly vertical and the small-angle approximation (tan φ ≈ φ) holds.
Concept / Approach:
Tilting the telescope by a small angle φ shifts the line of sight vertically. Over a horizontal distance D, the change in intercept on the staff is approximately Δs ≈ D * tan φ ≈ D * φ (radians). If 5 divisions correspond to total tilt φ_total, then angle per division = φ_total / 5. Convert radians to arc-seconds using 1 rad ≈ 206265 seconds.
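A minimal numeric sketch of this calculation in Python (variable names are illustrative; the values Δs = 0.08 m, D = 80 m, and 5 divisions come from the problem data):

delta_s = 0.08                            # change in staff reading (m)
D = 80.0                                  # horizontal sight distance (m)
n_div = 5                                 # bubble divisions moved

tilt_total = delta_s / D                  # total tilt in radians (small angle: tan(phi) ~ phi)
tilt_per_div = tilt_total / n_div         # tilt per division (radians)
arcsec_per_div = tilt_per_div * 206265    # 1 rad ~ 206265 arc-seconds
print(round(arcsec_per_div, 2))           # -> 41.25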
Step-by-Step Solution:
1. Total tilt of the line of sight: φ_total = Δs / D = 0.08 / 80 = 0.001 rad.
2. Tilt per division: φ = φ_total / 5 = 0.0002 rad.
3. Convert to arc-seconds: 0.0002 × 206265 ≈ 41.25 seconds per division.
Verification / Alternative check:
Using exact tan instead of the small-angle approximation: tan φ_total = 0.08 / 80 = 0.001, so φ_total = arctan(0.001) ≈ 0.001 rad (the difference is of order 3 × 10⁻¹⁰ rad), giving essentially the same result.
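A quick check of the small-angle claim, sketched with Python's math.atan:

import math
phi_total = math.atan(0.001)              # exact tilt from the intercept ratio
print(abs(phi_total - 0.001))             # ~3.3e-10 rad: the approximation error is negligible
print(phi_total / 5 * 206265)             # still ~41.25 arc-seconds per division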
Why Other Options Are Wrong:
28.8 s and 25.05 s underestimate the angle per division computed from the data.
14.52 s is far too small for the given displacement and distance.
“None” is invalid because a consistent value (≈41.25 s) results from the data.
Common Pitfalls:
Using degrees instead of radians in the conversion; forgetting to divide by the number of divisions; not keeping the staff exactly vertical for the readings.
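To illustrate the degrees-versus-radians pitfall, a short hedged sketch (the factor 206265 = (180/π) × 3600 converts radians, not degrees, to arc-seconds):

import math

tilt_per_div = 0.0002                     # radians per division, from the data
correct = tilt_per_div * 206265           # radians -> arc-seconds: ~41.25
deg = math.degrees(tilt_per_div)          # ~0.01146 degrees per division
also_correct = deg * 3600                 # degrees -> arc-seconds: ~41.25
wrong = deg * 206265                      # mixing degrees with the radian factor: ~2364, clearly off
print(correct, also_correct, wrong)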
Final Answer:
41.25 seconds