
Interferometry is always in the news, and at radio frequencies it has been for a long time...

The popular press always talks about directly 'interfering' two waves as they come in, but can they tell the exact phase of a single wave?

Don't the Event Horizon Telescope and the new LOFAR do this, 'recording' the phases of incoming radio waves and comparing and contrasting them later, with specialized software?

P.S.: When the amplitude of a wave is at its greatest, that means the 'strength' (power, or energy) of the wave is at its greatest, correct? As measured in eV or joules (energy) or watts (power)? So is that a way detectors can detect and record the amplitude(s) and therefore the phase(s)?

And if you know the time that passes between energy maxima, you can determine the frequency/wavelength that way, too?

Also, is there an independent way to know the direction of the E-field lines? To know if the amplitude of the wave at that moment is 'above' or 'below' the proverbial line? So you'll know if you'll obtain constructive or destructive interference if another wave of the same frequency, also at maximal amplitude, crosses paths with it?

P.P.S.: If a single, extremely short wave or burst passes a detector at minimal amplitude, could the detector fail to detect it?

  • Does this answer your question? How does the Event Horizon Telescope implement the interferometry? – antlersoft, Sep 8, 2021 at 13:33
  • At each antenna the signal is sampled digitally at a very high rate against a very accurate atomic clock reference, and the data are stored on massive arrays of hard drives. See astronomy.stackexchange.com/questions/20082/… – antlersoft, Sep 8, 2021 at 13:37
  • @antlersoft Precisely measuring the phase of a wave in radio astronomy is a big challenge! I think this is a very specific question that requires its own answer. I'm the author of the proposed duplicate question, and I can say that this question is different and the answers there do not sufficiently answer it, nor should they have. Voting to leave open. – uhoh, Sep 9, 2021 at 0:01

1 Answer


How, precisely, do radio astronomers detect (and record) the phases of waves for interferometry?

The popular press always talks about directly 'interfering' two waves as they come in, but can they tell the exact phase of a single wave?

tl;dr: It's a good question. There is no such thing as "the exact phase of a single wave"; only the phase difference between waves has any real meaning in interferometric observations.


"Phase" is a word we hear constantly in all forms of interferometry, as well as in radio engineering, electrical engineering, audio engineering, signals engineering and processing, the popular press and even in science fiction.

Phase, like voltage, is a relative measurement.

Voltmeters have two probes, not one. There is no absolute potential (gravitational, electrical, etc.); there are only potential differences. The reading they display is the electric potential of the "+" probe minus the potential of the "-" probe.

It's the same kind of thing when talking about the phase of a wave.

If the wave striking the antenna has a frequency of, say, ~1 GHz, then its phase advances by a full 360° every nanosecond. Technically, if you amplified and then digitized that wave 10 times per nanosecond, you'd see the phase advance by ~36° between samples.

But that won't tell you anything by itself; you already know the signal is ~1 GHz, so that's exactly what you expect.
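
To make that concrete, here's the same arithmetic as a tiny Python sketch (the 1 GHz signal and 10 samples per nanosecond are just the illustrative numbers from above, not the parameters of any real receiver):

```python
f_signal = 1.0e9    # ~1 GHz wave striking the antenna (illustrative)
f_sample = 1.0e10   # 10 digitizations per nanosecond (illustrative)

# Phase advance of the wave between consecutive digitizations
phase_step_deg = 360.0 * f_signal / f_sample
print(phase_step_deg)   # 36.0 degrees per sample, as described above
```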

However, you could digitize two signals simultaneously: your ~1 GHz sky signal and an ultra-stable 1 GHz oscillator that uses an atomic clock as its reference.

Now if your signal were, say, 1.000001 GHz and you compared the phases of the two signals, you'd see the phase difference increase by 360° every millisecond.

But that's easier to do by mixing the two signals in a nonlinear element and listening for the 1 kHz beat between them. (That's essentially how AM and shortwave radio receivers work.)
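
Here's a minimal sketch of that mixing step in Python/NumPy. To keep the arrays small the frequencies are scaled way down: a 1001 Hz "signal" against a 1000 Hz "reference" gives a 1 Hz beat, the same game as 1.000001 GHz against 1.000000 GHz giving 1 kHz. All numbers are purely illustrative:

```python
import numpy as np

f_signal = 1001.0     # scaled-down "sky" signal, Hz (illustrative)
f_ref    = 1000.0     # scaled-down stable reference oscillator, Hz
f_samp   = 20000.0    # sample rate, Hz
t = np.arange(0, 2.0, 1.0 / f_samp)   # 2 seconds of samples

sky = np.cos(2 * np.pi * f_signal * t)
ref = np.cos(2 * np.pi * f_ref * t)

# Nonlinear mixing = multiplication: produces a sum term (2001 Hz) and a
# difference term (1 Hz), which is the beat.
mixed = sky * ref

# Crude low-pass filter (moving average) keeps only the slow beat term.
kernel = np.ones(400) / 400.0
beat = np.convolve(mixed, kernel, mode="same")

# The strongest spectral line in the filtered product is the 1 Hz beat.
spectrum = np.abs(np.fft.rfft(beat))
freqs = np.fft.rfftfreq(len(beat), 1.0 / f_samp)
print(freqs[np.argmax(spectrum[1:]) + 1])   # ~1.0 Hz
```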

Interferometry compares two or more signals for phase differences

Interferometers in general take two or more samples of a signal, interfere them, and look only at the phase differences.

In the original astronomical interferometry experiments (both radio and optical), the light was collected by two nearby collectors and then brought together in the middle, where the two beams were interfered so that the phase difference between them could be measured.
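
A toy version of such a two-element adding interferometer looks something like this (made-up numbers, nothing tied to a real instrument): the same wave is summed with a delayed copy of itself, and the power of the sum traces out fringes as the delay changes.

```python
import numpy as np

f = 1.0e9                                 # 1 GHz signal (illustrative)
f_samp = 1.6e10                           # sample rate
t = np.arange(0, 200e-9, 1.0 / f_samp)    # 200 ns of samples

def fringe_power(extra_delay):
    """Power of the summed signal for a given geometric delay between the dishes."""
    s1 = np.cos(2 * np.pi * f * t)
    s2 = np.cos(2 * np.pi * f * (t - extra_delay))
    return np.mean((s1 + s2) ** 2)

# Sweep the delay: power oscillates between ~2 (in phase) and ~0 (out of phase),
# repeating every 1 ns, i.e. every 360 degrees of phase difference at 1 GHz.
for delay_ns in [0.0, 0.25, 0.5, 0.75, 1.0]:
    print(delay_ns, round(fringe_power(delay_ns * 1e-9), 3))
```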

In big modern arrays like the VLA, ALMA, or LOFAR, the signals from the many receivers are individually digitized first, then brought together in a big room full of electronics (mostly specialized computers) and interfered mathematically by a computer program. These big machines are called correlators.

At ALMA, for example, the signals from each dish are brought to the correlator over fiber-optic links.
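
For one pair of antennas, what the correlator effectively computes can be sketched in a few lines (again a toy with made-up numbers; a real correlator handles many frequency channels and applies careful delay and phase corrections first):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy correlation for one antenna pair: both antennas see the same noise-like
# sky signal, but antenna 2 sees it with an extra 40 degrees of phase
# (e.g. from the geometric delay).  Illustrative numbers only.
n = 100_000
sky = rng.normal(size=n) + 1j * rng.normal(size=n)   # complex baseband samples
true_phase = np.deg2rad(40.0)

v1 = sky + 0.5 * (rng.normal(size=n) + 1j * rng.normal(size=n))   # + receiver noise
v2 = sky * np.exp(1j * true_phase) + 0.5 * (rng.normal(size=n) + 1j * rng.normal(size=n))

# "Interfere mathematically": multiply one stream by the conjugate of the other
# and average.  The phase of the result is the measured phase difference.
visibility = np.mean(v1 * np.conj(v2))
print(np.rad2deg(np.angle(visibility)))   # close to -40 degrees
```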

But they don't do this at the sky frequency, which can be up to ~1 THz, because that's incredibly hard to digitize. They first shift it down to a lower frequency, of the order of a few GHz, and then digitize that.

A stable local oscillator (LO) signal used for the down-conversion is distributed over fiber as well, in order to preserve phase stability between antennas.
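
Here's a sketch of why that matters: mixing with a shared LO and low-pass filtering moves the signal down in frequency but carries the sky signal's phase, relative to that LO, through unchanged. The numbers below are scaled far below real ALMA frequencies and are only illustrative:

```python
import numpy as np

f_sky = 5000.0          # stand-in "sky" frequency, Hz (illustrative)
f_lo  = 4800.0          # shared local oscillator, Hz -> 200 Hz after mixing
f_samp = 100000.0
t = np.arange(0, 1.0, 1.0 / f_samp)
sky_phase = np.deg2rad(70.0)            # the phase we want to preserve

sky = np.cos(2 * np.pi * f_sky * t + sky_phase)

# Complex mixing with the LO, then a crude low-pass (moving average) to keep
# only the 200 Hz difference term.
mixed = sky * np.exp(-2j * np.pi * f_lo * t)
kernel = np.ones(100) / 100.0
baseband = np.convolve(mixed, kernel, mode="same")

# Compare the measured baseband phase at t = 0.5 s against the expectation
# 2*pi*(f_sky - f_lo)*t + sky_phase: the original phase offset is still there.
i = len(t) // 2
expected = (2 * np.pi * (f_sky - f_lo) * t[i] + sky_phase) % (2 * np.pi)
measured = np.angle(baseband[i]) % (2 * np.pi)
print(np.rad2deg(expected), np.rad2deg(measured))   # nearly equal
```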

In the case of the Event Horizon Telescope (EHT), the dishes are spread all over the Earth. They can't be connected with fiber-optic links of sufficient bandwidth, reliability, and continuity, so instead each station records the phase difference between the incoming signal and a local oscillator that is disciplined by a local atomic clock, with some help from GPS.

This alone is not stable enough, so they also record signals at the same time from another strong, well-known radio source and include that in the data.

It's all recorded to local hard drives, which are flown back to a central location where, again, the interferometry is done in software.
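
At that central facility, finding the "fringe" amounts to searching over trial delays for the one that maximizes the correlation between two recorded streams. A bare-bones sketch with made-up noise data (nothing like the real EHT pipeline) looks like this:

```python
import numpy as np

rng = np.random.default_rng(7)

# Two "recorded" noise-like streams from two stations, one delayed by 37 samples
# relative to the other.  Illustrative numbers only.
n = 200_000
sky = rng.normal(size=n)
true_delay = 37
v1 = sky[true_delay:] + 0.7 * rng.normal(size=n - true_delay)
v2 = sky[:n - true_delay] + 0.7 * rng.normal(size=n - true_delay)

# Software correlation: try a range of delays and pick the one with the
# strongest correlation (the "fringe").
def correlate_at(lag):
    return np.mean(v1 * np.roll(v2, -lag))

lags = np.arange(0, 100)
scores = np.array([correlate_at(lag) for lag in lags])
print(lags[np.argmax(scores)])   # recovers the 37-sample delay
```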

