On this page:
https://www.handprint.com/ASTRO/ae1.html
the author gives this calculation to show that light rays from very distant objects are essentially parallel:
In astronomical applications, light sources are so distant that the concentric wavefronts become a series of equally spaced parallel planes across the width of any practical telescope aperture. To illustrate: across the aperture of a 1 meter (39.4") telescope, light rays from a single point on the Moon, the closest astronomical object at 384,403 kilometers, diverge from parallel by no more than 1/384403000 of a radian, or 0.0000026 millimeter, which is 0.0047 (about 1/200) of a wavelength of "green" light. Since the fabrication limits of the highest quality astronomical optics are around λ/20, or 10 times larger than the wavefront divergence, optical calculations can assume perfectly flat and parallel wavefronts from a distant light source.
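For concreteness, here is my attempt to reproduce his arithmetic in Python. Note that I've assumed 550 nm for "green" light, which the article doesn't state explicitly, and I've guessed that the 0.0000026 mm figure comes from multiplying the angle by the 1 m aperture:

```python
# Reproducing the article's arithmetic (550 nm "green" wavelength is my
# assumption; the article does not give a value).

distance_m = 384_403e3   # Earth-Moon distance: 384,403 km in metres
aperture_m = 1.0         # telescope aperture in metres

# Small-angle approximation: divergence between rays reaching opposite
# edges of the aperture, tan(theta) ~ theta = aperture / distance.
angle_rad = aperture_m / distance_m    # ~2.6e-9 rad, i.e. 1/384403000

# Treating that angle as a wavefront departure across the 1 m aperture
# gives a path difference in metres (angle * aperture).
path_diff_m = angle_rad * aperture_m   # ~2.6e-9 m = 0.0000026 mm

wavelength_m = 550e-9                  # "green" light (assumed)
fraction = path_diff_m / wavelength_m  # ~0.0047, i.e. about 1/200 wave

print(f"angle     = {angle_rad:.3e} rad")
print(f"path diff = {path_diff_m * 1e3:.7f} mm")
print(f"fraction  = {fraction:.4f} (~1/{1 / fraction:.0f} wave)")
```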
That matches his numbers, but I still don't follow the steps in this reasoning. Is he using a small-angle approximation? I know it's very elementary, but could somebody explain it to me?