When you emit a radio signal, it travels at the speed of light. The beam spreads out with every kilometre the signal travels: to a nearby receiver the signal is strong, but a distant receiver sees it grow weaker and weaker until it is lost in the noise. Does the same happen in deep space, and what distance can a radio signal (or an electromagnetic wave of any frequency) travel before it becomes indistinguishable from noise?


2 Answers


The first thing to consider is that a beam's cross-section will spread over long distances. The best situation we can hope for is a diffraction-limited system, in which this spreading is minimized and our received signal maximized. In an ideal world, we would have a perfectly collimated transmission beam that neither diverges nor converges.

In practice, we are still limited by diffraction. A diffraction-limited system is described by the formula

$$ \sin\theta = \frac{1.22\lambda}{D}, $$

which prescribes an angular resolution $\theta$ in terms of wavelength $\lambda$ and a circular aperture of diameter $D$. This is called the Rayleigh criterion. The definition of angular resolution here is that two point sources are just discernible when the principal maximum of the Airy disk pattern of one source coincides with the first minimum of the other. It is this definition that produces the apparently arbitrary constant of $1.22$.
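As a rough illustration, the criterion can be evaluated numerically. In the sketch below the 2.38 GHz frequency and 305 m dish diameter are assumed values for the Arecibo transmission, used only to get a feel for the scales involved:

```python
import math

def rayleigh_angle(wavelength_m, aperture_m):
    """Diffraction-limited angular resolution (radians) for a circular aperture."""
    return math.asin(1.22 * wavelength_m / aperture_m)

# Assumed: 2.38 GHz signal through a 305 m dish (illustrative only)
wavelength = 3e8 / 2.38e9           # ~0.126 m
theta = rayleigh_angle(wavelength, 305.0)

# Approximate beam spread after one light year (~9.46e15 m)
spread_m = math.tan(theta) * 9.46e15
```

Even a beam this well collimated is several billion kilometres across after a single light year, which is why received power, not beam pointing, ends up being the limiting factor.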

We usually think of diffraction being applicable in terms of receiving a signal - for example, a space telescope will usually have a diffraction-limited optical system. However, the exact same laws hold true whether we are receiving or sending a signal. The optical path is the same. Everything is just in reverse!

Side note: if we instead projected an image into space, then to resolve the image acceptably a receiver would need an angular resolution equal to or better than that of the projection. This adds a spatial resolution criterion on top of the signal-to-noise performance discussed below.

To take a real-life example, let's consider a radio signal. Since a distant receiver will be getting a frequency-modulated signal not unlike FM radio, we are not concerned with angular resolution. We don't care if the "image" is blurred, or even if some parts of the originally transmitted beam miss our receiver entirely. All we care about is the modulation of frequency over time - it's a one-dimensional signal.

In this case, a receiver is a noise-limited system. This NASA report outlines some of the limitations that a realistic implementation of interstellar communication must deal with. Even in the case of a quantum noise-limited system, we can still make the best of the limitations dealt to us.

If the signal-to-noise ratio is above an acceptable threshold, the signal can be recovered. There are so many factors to consider that only an order-of-magnitude estimate is really feasible, and I don't know enough to produce a good estimate of the noise levels of a particular system myself.

Project Cyclops (1971) was the initial investigation into the feasibility of a search for extraterrestrial intelligence. For example, on page 41, we can see that the minimum noise temperature of a receiver picking up the 2.4 GHz Arecibo message is about $4~\mathrm{K}$ - the major contributor to the noise here is the CMB. Frequencies of this order of magnitude usually provide the best possible noise performance: too high, and quantum noise and atmospheric effects become significant; too low, and galactic noise takes over.

This noise temperature provides a noise floor for the signal. The receiver itself usually adds a significant noise temperature, typically some tens or hundreds of kelvin, so practical limits on interstellar communication tend to be set by our equipment.
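To see what a noise floor means concretely, the thermal noise power of a receiver can be sketched as $P = kTB$. The figures below (a sky-limited 4 K system versus an assumed 100 K receiver system temperature, in a 1 Hz bandwidth) are illustrative only:

```python
K_B = 1.380649e-23  # Boltzmann constant, J/K

def noise_power_w(temp_k, bandwidth_hz):
    """Thermal noise floor P = k*T*B for a given system temperature."""
    return K_B * temp_k * bandwidth_hz

p_ideal = noise_power_w(4.0, 1.0)    # CMB-limited system, 1 Hz bandwidth
p_real = noise_power_w(100.0, 1.0)   # assumed 100 K receiver temperature
```

A 100 K receiver raises the noise floor 25-fold over the CMB limit; narrowing the bandwidth lowers the floor proportionally, which is why narrowband detection schemes are attractive.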

Although the Arecibo message was broadcast at a good frequency, for very long-distance communication amplitude modulation is superior to frequency modulation as it's easy to increase the pulse duration and interval to compensate for a weaker signal strength.
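The advantage of stretching the pulses can be sketched with a simple energy budget: with a matched filter of bandwidth $B \approx 1/\tau$, the single-pulse signal-to-noise ratio scales as $P\tau/kT$, so a longer pulse directly compensates a weaker received power. A minimal sketch, with illustrative numbers only:

```python
K_B = 1.380649e-23  # Boltzmann constant, J/K

def pulse_snr(p_rx_w, pulse_s, temp_k):
    """Single-pulse energy SNR against the thermal floor:
    received energy P*tau versus noise energy k*T (matched filter, B ~ 1/tau)."""
    return (p_rx_w * pulse_s) / (K_B * temp_k)

# Doubling the pulse duration doubles the collected energy,
# compensating a signal that arrives at half the power
snr_short = pulse_snr(1e-23, 1.0, 4.0)
snr_long = pulse_snr(0.5e-23, 2.0, 4.0)
```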

This table from page 50 of the well-worth-reading Cyclops report shows that a single $100~\mathrm{m}$ transmitter/receiver combination, with a transmission power of $10^5~\mathrm{W}$, could function at a distance of 500 light years.
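A back-of-the-envelope link budget using the Friis transmission equation gives a feel for why such a link demands narrow bandwidths and long integration times. The aperture efficiency (70%) and operating frequency (~2.4 GHz) below are my assumptions, not figures from the report:

```python
import math

def dish_gain(diameter_m, wavelength_m, efficiency=0.7):
    """Approximate gain of a parabolic dish antenna."""
    return efficiency * (math.pi * diameter_m / wavelength_m) ** 2

def received_power_w(p_tx_w, g_tx, g_rx, wavelength_m, distance_m):
    """Friis transmission equation for free space."""
    return p_tx_w * g_tx * g_rx * (wavelength_m / (4 * math.pi * distance_m)) ** 2

LY = 9.461e15   # metres per light year
wl = 0.125      # ~2.4 GHz (assumed)
g = dish_gain(100.0, wl)  # assumed 70% aperture efficiency

p_rx = received_power_w(1e5, g, g, wl, 500 * LY)
```

The received power comes out at a few times $10^{-24}~\mathrm{W}$, of the same order as the thermal noise floor of a cold receiver in a sub-hertz bandwidth, so detection hinges on exactly the narrowband, long-pulse techniques the report describes.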


Building bigger transmitters and receivers will increase the maximum distance of communications. So will increasing the transmission power, pulse duration, and pulse interval. Current technology could let us communicate over tens or hundreds of light years. To communicate further, just build something bigger. The laws of physics place few limits on the distance we can communicate.


All electromagnetic radiation from a point source (which an ordinary radio transmitter effectively is) propagates according to the inverse square law, which means that the intensity of the signal is inversely proportional to the square of the distance. This holds equally on Earth and in deep space.

So for any signal there will be a distance at which it becomes indistinguishable from the background noise of the universe. However, that distance will depend on the initial strength of the signal.
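The inverse square law itself is simple to sketch: for an isotropic source, the intensity is just the transmitted power spread over the surface of a sphere (the $10^5~\mathrm{W}$ figure below is purely illustrative):

```python
import math

def intensity_w_per_m2(p_tx_w, distance_m):
    """Intensity of an isotropic point source via the inverse square law."""
    return p_tx_w / (4 * math.pi * distance_m ** 2)

# Tenfold the distance -> one hundredth the intensity
i_near = intensity_w_per_m2(1e5, 1e3)   # at 1 km
i_far = intensity_w_per_m2(1e5, 1e4)    # at 10 km
```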
