
The BBC article Event Horizon Telescope ready to image black hole describes the Event Horizon Telescope (EHT), a coordinated observing campaign in which radio telescopes and arrays across the globe form a synthetic aperture with an Earth-sized baseline.

$$\frac{\lambda}{r_\text{Earth}} \sim \frac{r_\text{Sgr A*}}{D_\text{Sgr A*}} \sim 10^{-10}$$

...when plugging in 1 millimeter for $\lambda$, and with $r_\text{Sgr A*}$ and $D_\text{Sgr A*}$ the radius of, and the distance from Earth to, Sagittarius A* (the black hole at the center of the Milky Way Galaxy): about 20 million km and 26,000 light years respectively (values taken from the article).
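As a quick sanity check on those orders of magnitude, here is the back-of-the-envelope calculation using only the values above (the light-year conversion is the standard 9.461×10^15 m):

```python
# Back-of-the-envelope check of the two angular scales (values from the article).
wavelength = 1e-3              # observing wavelength, m (1 mm)
r_earth = 6.371e6              # Earth's radius, m (baseline is Earth-sized)
r_sgr_a = 2e10                 # Sgr A* radius, m (20 million km)
d_sgr_a = 26_000 * 9.461e15    # distance to Sgr A*, m (26,000 light years)

print(f"diffraction limit  lambda / r_Earth   ~ {wavelength / r_earth:.1e} rad")
print(f"angular size       r_SgrA* / D_SgrA*  ~ {r_sgr_a / d_sgr_a:.1e} rad")
# Both come out around 1e-10 radians (tens of microarcseconds), so an
# Earth-sized baseline at 1 mm can just resolve structure on this scale.
```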

The equation above shows that millimeter-wavelength interferometry with an Earth-sized baseline could, in principle, resolve structure on the scale of the black hole's event horizon.

My question is: how does the Event Horizon Telescope implement the interferometry? It would certainly be impossible to bring all the signals together at a central site and perform the interference there in real time as down-converted analog signals, and quite difficult and expensive to do it over dedicated, synchronized digital optical fiber lines. Are the massive amounts of data sent as IP packets over the internet to a central correlator (a numerical interferometer)?
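For a sense of scale, here is a rough data-volume estimate. The per-station recording rate and campaign parameters below are my assumptions (of the order the EHT collaboration has quoted, a few tens of Gbit/s per station); none of these figures are in the BBC article:

```python
# Rough estimate of the data volume an EHT-style campaign produces.
# All numbers are assumed, order-of-magnitude figures.
recording_rate = 32e9      # bits per second per station (assumption)
hours_per_night = 10       # rough track length per night (assumption)
nights = 5                 # rough campaign length (assumption)
stations = 8               # participating sites (approximate)

total_bits = recording_rate * hours_per_night * 3600 * nights * stations
print(f"~{total_bits / 8 / 1e15:.0f} petabytes for the whole campaign")
# A few petabytes -- consistent with the ~5 PB figure mentioned in the
# comments below, and far too much to push over ordinary internet links,
# hence recording to hard drives and shipping them to the correlator.
```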

The article mentions atomic clocks and lots of hard drives, and I have a hunch these have something to do with it.

above: "The eventual EHT array will have 12 widely spaced participating radio facilities." From the BBC's February 16, 2017 article Event Horizon Telescope ready to image black hole, http://www.bbc.com/news/science-environment-38937141

  • If I had to guess - since you can't synchronize these telescopes in real time - it is that each telescope is programmed to take measurements at predefined points in time on the atomic clock, and the interference is then computed on a central server (or rather several servers) over all the recorded data. It would be a very complex setup that involves some pretty nasty calculations, but it could actually be possible.
    – Adwaenyth
    Commented Feb 17, 2017 at 9:44

2 Answers


Assuming I understand your question:

As mentioned in the article, all of the collected data are stored on hard drives, timestamped against an atomic clock, and then flown to a central location where the interferometry actually takes place.
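As a rough illustration of what happens at that central location, here is a minimal sketch of correlating two timestamped recordings offline. This is a toy model of my own; a real VLBI software correlator (such as DiFX) also handles fringe rotation, fractional-sample delay, channelization, clock offsets, and much more:

```python
import numpy as np

# Toy VLBI correlation: two stations record the same noise-like sky signal,
# one with an unknown geometric delay. Offline, the correlator lines the
# recordings up (using the timestamps) and searches for the correlation peak.
rng = np.random.default_rng(0)
n = 2**16
true_delay = 137                       # geometric delay between stations, in samples

sky = rng.standard_normal(n + true_delay)
station_a = sky[:n] + 0.5 * rng.standard_normal(n)                        # sky + local noise
station_b = sky[true_delay:true_delay + n] + 0.5 * rng.standard_normal(n)

# Cross-correlate via FFT and locate the fringe (correlation) peak.
xspec = np.fft.fft(station_a) * np.conj(np.fft.fft(station_b))
xcorr = np.fft.ifft(xspec)
lag = int(np.argmax(np.abs(xcorr)))
lag = lag - n if lag > n // 2 else lag
print(f"recovered delay: {lag} samples (true value: {true_delay})")
```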

Further reading:

  • That certainly sounds like how they will facilitate it. But to implement the interferometry there has to be something more complex going on. Are they just blindly writing GHz raw data directly to a hard drive? Is there a local oscillator? Each site has a substantially different Doppler shift due to the rotation of the Earth. I don't think proper interferometry can be done with just timestamps.
    – uhoh
    Commented Feb 17, 2017 at 15:07
  • @uhoh "Blindly" probably isn't the best word to describe the methodology. We know very precisely the location from which certain data sets were taken, and at what time. We also know the relative position in the sky of the target object very precisely. Therefore, any disturbance resulting from geophysical location (Doppler shift and otherwise) should be possible to compensate for. As for a local oscillator, I cannot say for certain, although I would certainly not be surprised if there were some local heterodyning taking place.
    – user14781
    Commented Feb 17, 2017 at 15:23
  • You are right - it was the wrong word. I meant to draw a contrast between just recording everything possible as raw data (the ADC outputs, ionospheric conditions, water vapor, atomic clock time) as one big data stream to a hard drive, versus trying to heterodyne to an oscillator that has first been synchronized to the clocks and then frequency/phase shifted based on velocity, etc. I don't have a good word for a brute-force approach of "record now, correct later", but the word I used was not what I meant to say!
    – uhoh
    Commented Feb 17, 2017 at 15:30
  • Come to think of it, if there are other objects very close by, it's possible they can be used as a radio "guide star" to compensate for atmospherics offline.
    – uhoh
    Commented Feb 17, 2017 at 15:38
  • @uhoh datacenterdynamics.com/news/…
    Commented Dec 29, 2018 at 13:27
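On the Doppler shifts and heterodyning discussed in the comments above: one of the things a correlator can do after the fact is "fringe rotation" (fringe stopping), removing the predicted, time-varying geometric phase caused by Earth's rotation. The following is purely my own toy illustration of that step, not a description of the EHT pipeline:

```python
import numpy as np

# Toy fringe-stopping correction, applied offline to complex baseband samples.
# The geometric delay tau(t) between two stations changes as the Earth rotates;
# for a narrow band around nu0 the dominant effect is a predictable phase
# 2*pi*nu0*tau(t), which can be rotated back out before correlation.
nu0 = 230e9                      # sky frequency, Hz (~1.3 mm)
omega_earth = 7.292e-5           # Earth's rotation rate, rad/s
baseline = 6.0e6                 # projected baseline, m (toy value)
c = 2.998e8                      # speed of light, m/s

t = np.arange(0.0, 10.0, 1e-3)                    # 10 s of (already channelized) data
tau = (baseline / c) * np.sin(omega_earth * t)    # predicted delay model
phase = 2 * np.pi * nu0 * tau

# Pretend baseband samples whose only content, for this toy, is that phase ramp:
recorded = (1.0 + 0.05j) * np.exp(1j * phase)
# Fringe stopping: rotate the predicted phase back out.
stopped = recorded * np.exp(-1j * phase)
print("residual phase scatter after fringe stopping:",
      float(np.std(np.angle(stopped))), "rad")
```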

A supplementary graphical representation for the accepted answer, from a tweet:

(image: stacks of hard drives holding the recorded data)

  • нαηαzσησ lαη∂ commented: "at the end of the data collection there were over 5 petabytes of data stored in over 100 stacks of hard drives like the ones on the picture." Also, Peter Telford: "You'd think black hole data would compress really well." Commented Jul 24, 2019 at 2:48
  • @KeithMcClary hah!
    – uhoh
    Commented Jul 24, 2019 at 2:56
  • @KeithMcClary okay, now I am curious: where are these quotes from exactly?
    – uhoh
    Commented May 15, 2022 at 1:54
