In a cinema hall, if the distance between the projector and the screen is increased by 2 percent, how does the intensity of illumination on the screen change approximately?

Difficulty: Medium

Correct Answer: It decreases by about 4 percent

Explanation:


Introduction / Context:
The brightness of an image on a cinema screen depends on the light intensity reaching it from the projector. As the distance between the projector and the screen changes, the intensity changes according to the inverse square law, a key idea in optics and wave physics. This question asks what happens to the intensity of illumination when the projector-screen distance is increased slightly, by 2 percent.



Given Data / Assumptions:

  • Initial distance between projector and screen is some value d.
  • The distance is increased by 2 percent, making it 1.02 times d.
  • Light spreads roughly uniformly over the screen area as distance changes.
  • The inverse square law applies: intensity is proportional to 1 divided by distance squared.
  • We seek the approximate percentage change in intensity.



Concept / Approach:
The inverse square law states that the intensity I of light from a point source is inversely proportional to the square of the distance r from the source: I is proportional to 1/r^2. When the distance increases by a small factor, the intensity decreases by the square of that factor. For a 2 percent increase, the new distance is 1.02 times the old one, so the new intensity is 1/1.02^2 times the original. Computing this factor and converting it to a percentage decrease gives the approximate answer.



Step-by-Step Solution:
Step 1: Let the original distance be d and the original intensity be I.
Step 2: The new distance is d_new = 1.02 d because of the 2 percent increase.
Step 3: By the inverse square law, the new intensity I_new is proportional to 1/(d_new)^2 = (1/1.02^2) x (1/d^2).
Step 4: Since I is proportional to 1/d^2, it follows that I_new = I / 1.02^2.
Step 5: Compute 1.02^2 = 1.0404 (approximately), so I_new is about I / 1.0404, or about 0.96 I, implying a decrease of roughly 4 percent.
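The arithmetic in these steps can be checked with a short calculation, assuming a simple point-source model for the projector lamp:

```python
# Exact change in intensity under the inverse square law,
# assuming the projector behaves like a point source.
d_factor = 1.02                       # distance increased by 2 percent
i_factor = 1 / d_factor**2            # I is proportional to 1 / d^2
percent_change = (i_factor - 1) * 100

print(round(i_factor, 4))             # 0.9612
print(round(percent_change, 2))       # -3.88, i.e. a decrease of about 4 percent
```

The exact decrease is about 3.88 percent, which rounds to the 4 percent quoted in the answer.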



Verification / Alternative check:
Another way is to use a small-change approximation. For small percentage changes, the relative change in 1/r^2 is about -2 times the relative change in r. A 2 percent increase in distance means a relative change of +0.02, so the relative change in intensity is approximately -2 x 0.02 = -0.04, that is, minus 4 percent. This matches the direct calculation using 1/1.0404 and gives confidence that the correct answer is a 4 percent decrease in intensity.
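The agreement between the differential estimate and the exact result can be verified numerically; this sketch just compares the two:

```python
# Small-change (differential) approximation dI/I = -2 * dr/r,
# compared against the exact inverse-square result.
dr_over_r = 0.02                        # 2 percent increase in distance
approx = -2 * dr_over_r                 # differential estimate of dI/I
exact = 1 / (1 + dr_over_r)**2 - 1      # exact relative change in intensity

print(approx)                           # -0.04
print(round(exact, 4))                  # -0.0388
```

The two values agree to within about 0.1 percentage point, which is why the approximation is safe for small changes like 2 percent.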



Why Other Options Are Wrong:
Decreases by about 2 percent underestimates the effect because intensity depends on the square of distance, not linearly on distance.
An increase of about 2 percent or 4 percent contradicts the inverse relationship: moving the projector farther away spreads the light over a larger area, so the intensity can only decrease.
Remains exactly unchanged would be correct only if distance did not change, but here we specifically increase the distance, so intensity must change.



Common Pitfalls:
A typical mistake is to treat intensity as inversely proportional to distance rather than to distance squared, leading to a 2 percent estimate instead of 4 percent. Some students also forget to square the factor 1.02 properly or to interpret the result as a decrease rather than simply quoting the factor 1 divided by 1.0404. To avoid such errors, always write the inverse square relation explicitly and carefully square the distance factor when calculating percentage changes.



Final Answer:
When the projector screen distance is increased by 2 percent, the intensity of illumination on the screen decreases by about 4 percent.

