
As far as I understand, the inverse square law applies to diverging beams of light. But what about parallel rays in a lossless medium? If their intensity does decrease, how?


2 Answers


If the area the light is spread over does not increase and the light is not absorbed, then the intensity will not decrease. With a perfectly parallel beam of light, you would not see the intensity drop at longer distances.

Of course, that's an idealization. Making perfectly parallel light is, to my understanding, not possible. But one can get close. This is why laser pointers can be so dangerous. While they are not perfectly parallel, and thus spread out with distance, they spread so much more slowly than other sources that it can be surprising how much light still reaches someone's pupil far away.
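A rough numerical sketch of this comparison (the 1 mW power, 1 mm initial beam radius, and 1 mrad divergence below are hypothetical illustration values, not actual laser-pointer specs):

```python
import math

P = 1e-3      # beam power in watts (hypothetical 1 mW laser pointer)
w0 = 1e-3     # initial beam radius in metres (assumed 1 mm)
theta = 1e-3  # divergence half-angle in radians (assumed 1 mrad)

def beam_intensity(d):
    """Intensity (W/m^2) of a slightly diverging beam at distance d,
    treating the spot as a disc of radius w0 + theta*d."""
    w = w0 + theta * d
    return P / (math.pi * w**2)

def isotropic_intensity(d):
    """Intensity (W/m^2) if the same power were radiated equally in all
    directions (inverse square law over a sphere of radius d)."""
    return P / (4 * math.pi * d**2)

for d in (1, 10, 100):  # distances in metres
    print(f"{d:>3} m: beam {beam_intensity(d):.3e} W/m^2, "
          f"isotropic {isotropic_intensity(d):.3e} W/m^2")

# A perfectly parallel beam (theta = 0) would give the same intensity at
# every distance; the small divergence only dilutes it slowly, which is
# why the beam stays far more intense than an isotropic source.
```

With these assumed numbers, even 100 m away the beam delivers millions of times more power per unit area than the same power spread isotropically, which is essentially why laser pointers remain hazardous at range.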


The intensity of parallel light rays does indeed decrease with distance, and this follows the inverse square law.

Imagine a point light source (like a bulb or the Sun) emitting light in all directions. As you move away from the source, the light spreads out over a larger area, so the intensity passing through a fixed area decreases the farther away you are. Specifically, the intensity is proportional to 1/r², where r is the distance from the source. This inverse square law applies not only to light but also to other physical phenomena such as sound, gravity, and electrostatic interactions. So whether you are measuring the brightness of a lamp or of the Sun, the law remains the same.
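A back-of-the-envelope check of that 1/r² scaling (the solar luminosity and Earth-Sun distance below are approximate textbook values used only for illustration):

```python
import math

def point_source_intensity(power, r):
    """Inverse square law: power spread over a sphere of radius r."""
    return power / (4 * math.pi * r**2)

L_sun = 3.8e26    # approximate solar luminosity in watts
r_earth = 1.5e11  # approximate Earth-Sun distance in metres

# ~1.3e3 W/m^2, close to the measured solar constant (~1.36 kW/m^2)
print(point_source_intensity(L_sun, r_earth))

# Doubling the distance cuts the intensity to a quarter
print(point_source_intensity(L_sun, 2 * r_earth))
```

Note that this sketch describes a point source radiating in all directions; an ideally parallel beam keeps a constant cross-section, so the same power-over-area argument gives a constant intensity instead.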

  • Understood. But I want to know about parallel rays, which, I suppose, aren't emitted from a point source, because point sources emit diverging rays. Commented Apr 5 at 17:22
