Difficulty: Medium
Correct Answer: 0.0001 second of arc
Explanation:
Introduction / Context:
Parallax is the apparent shift in the direction of an object when it is observed from two separated points. In astronomy, stellar parallax is tiny even with Earth’s orbital radius (1 AU) as the baseline. Using the much smaller baseline of Earth’s diameter reduces the angle drastically further. This problem tests order-of-magnitude reasoning for angular measurement in arcseconds.
Given Data / Assumptions:
Parallax of the nearest star (Proxima Centauri) with a 1 AU baseline: about 0.77″. Earth’s mean diameter: ≈ 1.2742×10^4 km. 1 AU ≈ 1.496×10^8 km. Parallax angles are small, so they scale linearly with the baseline.
Concept / Approach:
Since a small parallax angle is proportional to the observing baseline, reducing the baseline from 1 AU (≈ 1.496×10^8 km) to one Earth diameter reduces the angle by the same factor. Ratio = Earth diameter / 1 AU ≈ 1.2742×10^4 km / 1.496×10^8 km ≈ 8.5×10^-5. Multiplying 0.77 arcsecond by 8.5×10^-5 gives roughly 6.5×10^-5 arcsecond, which is on the order of, and below, 0.0001 arcsecond.
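The scaling above can be checked numerically; a minimal sketch, using the rounded constants quoted in this explanation:

```python
# Scale the nearest star's parallax from a 1 AU baseline down to an
# Earth-diameter baseline; small angles scale linearly with baseline.

EARTH_DIAMETER_KM = 1.2742e4   # mean diameter of Earth
AU_KM = 1.496e8                # 1 astronomical unit
PARALLAX_1AU_ARCSEC = 0.77     # parallax of the nearest star (1 AU baseline)

ratio = EARTH_DIAMETER_KM / AU_KM      # baseline ratio, ~8.5e-5
scaled = PARALLAX_1AU_ARCSEC * ratio   # scaled parallax, ~6.5e-5 arcsecond

print(f"baseline ratio  = {ratio:.2e}")
print(f"scaled parallax = {scaled:.2e} arcsec")
print("below 0.0001 arcsec:", scaled < 1e-4)
```

Running this confirms the scaled parallax lands just under the 10^-4 arcsecond mark, matching the chosen answer as an order-of-magnitude bound.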
Step-by-Step Solution:
Step 1: Compute the baseline ratio: Earth diameter / 1 AU = 1.2742×10^4 km / 1.496×10^8 km ≈ 8.5×10^-5.
Step 2: Scale the parallax: 0.77″ × 8.5×10^-5 ≈ 6.5×10^-5 arcsecond.
Step 3: Since 6.5×10^-5″ < 1×10^-4″, the parallax over an Earth-diameter baseline is of order 0.0001 arcsecond, matching the given option.
Verification / Alternative check:
Even the nearest, brightest stars exhibit parallax below 1″ over the 1 AU baseline; shrinking the baseline to Earth’s diameter guarantees an angle near or below 0.0001″, far too small for visual detection and demanding space-grade instrumentation.
Why Other Options Are Wrong:
0.01″ and 0.001″ are too large by orders of magnitude for the given baseline; “None of these” is incorrect because 0.0001″ provides a correct upper bound; 0.1″ is even more unrealistic.
Common Pitfalls:
Confusing Earth’s orbital baseline (1 AU) with Earth’s diameter; forgetting linear proportionality of small parallax angles with baseline.
Final Answer:
0.0001 second of arc