I can't see how the math supports the claim that 5G frequencies interfere with aircraft altimeters. Can anyone point out the holes in my logic or educate me?
Knowns:
- Aircraft altimeter frequency range: 4.2 - 4.4 GHz
- Closest 5G frequency range: 3.98 GHz -- While the FCC Online Table of Frequency Allocations allocates up to 4.2 GHz, licenses haven't been granted above 3.98 GHz (Source: https://www.fcc.gov/auction/107/factsheet)
- Upper bound on maximum airplane speed: 343 m/s (speed of sound)
Unknown: What speed would produce the Doppler shift required to push a 5G signal into the altimeter band?
Approach: Since airplanes move way slower than the speed of light, we can start with the approximation
Change in frequency $$\Delta f = \frac{\Delta v}{c}f_0$$
and solve for the $\Delta v$ required to shift from $f_0 = 3.98$ GHz to 4.2 GHz. That's $\Delta f = 0.22$ GHz, which gives $\Delta v \approx 16{,}582{,}915$ m/s. Clearly not achievable by a passenger airplane.
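For anyone who wants to check the arithmetic, here's the calculation as a short Python sketch (using the rounded value $c \approx 3\times10^8$ m/s):

```python
# Non-relativistic Doppler approximation: delta_f = (delta_v / c) * f0
# Solve for the closing speed needed to shift 3.98 GHz up to 4.2 GHz.
c = 3e8           # speed of light, m/s (rounded)
f0 = 3.98e9       # top of the licensed 5G C-band, Hz
f_alt = 4.2e9     # bottom of the altimeter band, Hz

delta_f = f_alt - f0              # 0.22 GHz of separation
delta_v = delta_f / f0 * c        # required closing speed
print(f"required delta_v = {delta_v:,.0f} m/s")  # ~16.6 million m/s, ~5.5% of c
```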
So what am I doing wrong? Even if you add a factor of 2 to cover the worst case of a handheld 5G handset on another airplane flying head-on toward you, I can't see how you could possibly get enough Doppler shift to cause interference.
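Running the same approximation forward with that factor-of-2 worst case (two aircraft closing head-on at the speed of sound) shows how tiny the achievable shift is; this is just an illustration of the estimate above, not a claim about real 5G deployments:

```python
# Forward Doppler: how far can 686 m/s of closing speed shift a 3.98 GHz signal?
c = 3e8              # speed of light, m/s (rounded)
f0 = 3.98e9          # 5G transmit frequency, Hz
v_close = 2 * 343    # two aircraft at the speed of sound, head-on, m/s

delta_f = v_close / c * f0
print(f"worst-case Doppler shift = {delta_f:,.0f} Hz")  # ~9 kHz, vs 220 MHz of separation
```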
I acknowledge I've abstracted away center frequencies and filter rolloff, but those details just aren't as important when you're dealing with hundreds of MHz of separation.