This is interesting!
At first I thought that optical communication always wins, because $d/\lambda$ for a 30 cm diameter telescope at 850 nm is about 350,000, whereas for a 3 meter dish on a deep space spacecraft it is only 80 at 8 GHz (X band) or 320 at 32 GHz (Ka band). That factor of 1000 in $d/\lambda$ is a factor of a million in signal strength at the other end, or 60 dB.
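A quick back-of-the-envelope check of those numbers (a sketch with my own illustrative constants, not any mission spec):

```python
import math

# Antenna gain scales as (pi * d / lambda)**2, so the gain *ratio* between
# two links goes as the square of the ratio of their d/lambda values.

def d_over_lambda(diameter_m, wavelength_m):
    return diameter_m / wavelength_m

c = 3e8  # speed of light, m/s

optical = d_over_lambda(0.30, 850e-9)   # 30 cm telescope at 850 nm
x_band  = d_over_lambda(3.0, c / 8e9)   # 3 m dish at 8 GHz (X band)
ka_band = d_over_lambda(3.0, c / 32e9)  # 3 m dish at 32 GHz (Ka band)

print(f"optical d/lambda ~ {optical:,.0f}")  # ~353,000
print(f"X band  d/lambda ~ {x_band:.0f}")    # 80
print(f"Ka band d/lambda ~ {ka_band:.0f}")   # 320

# Factor of ~1000 in d/lambda -> ~60 dB in antenna gain
advantage_db = 20 * math.log10(optical / ka_band)
print(f"gain advantage over Ka band ~ {advantage_db:.0f} dB")
```

The 20 (rather than 10) in front of the log is because gain goes as the square of $d/\lambda$.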
That multiplicative factor of a million goes a long way, but the problem is that the current detection schemes for radio and optical are very different.
Radio signal detection
A radio receiver/detector couples the electric field of the incoming wave into a voltage; that voltage squared, divided by the amplifier's input impedance, is a power ($V^2/R$).
In other words, the received radio power is also the power in the detection circuit, which we compare to the noise equivalent power (NEP) of the amplifier, roughly $k_B T \, \Delta f$, where $k_B$ is the Boltzmann constant, $T$ is the system noise temperature, and $\Delta f$ is the bandwidth.
The signal to noise ratio (S/N) is just the ratio of the received power to the noise equivalent power of the receiver front end.
Let's say we are running at the very edge, with S/N = 1. If the received power drops by a factor of 10 (the distance is $\sqrt{10}$ times greater), then we have to cut $\Delta f$ by a factor of 10 as well to maintain the same S/N.
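That linear trade between received power and bandwidth can be sketched in a few lines (the power and temperature values below are invented for illustration):

```python
# For a radio link at the thermal noise floor, S/N = P_rx / (k_B * T * df),
# so the usable bandwidth df scales linearly with received power,
# i.e. as 1/distance**2.

k_B = 1.380649e-23  # Boltzmann constant, J/K

def max_bandwidth_hz(p_rx_watts, t_sys_kelvin, snr_required=1.0):
    """Largest bandwidth that keeps S/N >= snr_required against k_B*T*df noise."""
    return p_rx_watts / (k_B * t_sys_kelvin * snr_required)

p_rx  = 1e-18  # hypothetical received power, W
t_sys = 20.0   # hypothetical system noise temperature, K

df_near = max_bandwidth_hz(p_rx, t_sys)
df_far  = max_bandwidth_hz(p_rx / 10, t_sys)  # sqrt(10) times the distance

print(f"near: {df_near:.0f} Hz, far: {df_far:.0f} Hz")
# power down 10x -> bandwidth (and thus data rate) down 10x
```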
Photon signal detection
Right now the standard method of converting an optical signal into an electrical signal is to use some kind of photodiode. Most photons that get into the photodiode are absorbed and produce an electron-hole pair. These are collected as an electrical current.
The number of pairs produced, and thus the current, is proportional to the incident optical power. So far so good, but the electrical power in the amplifier is equal to the current squared times the impedance ($I^2R$)!
This means that the electrical power we must compare to the NEP is proportional to the square of the optical power!
Thus once one opens the hood on this problem, one sees that the power collected by the antenna is only half the problem; the method of conversion to electrical signals is so different for optical vs radio that at some very far distance radio may be able to win using conventional detection technology.
But what about UN-conventional detection technology?
There are a few things to consider that can make optical communication's future at extremely large distances brighter.
"Exceeding classical capacity limit in quantum optical channel" (also on ResearchGate) is reference #8 in Toyoshima et al.
The amount of information transmissible through a communications channel is determined by the noise characteristics of the channel and by the quantities of available transmission resources. In classical information theory, the amount of transmissible information can be increased twice at most when the transmission resource (e.g. the code length, the bandwidth, the signal power) is doubled for fixed noise characteristics. In quantum information theory, however, the amount of information transmitted can increase even more than twice. We present a proof-of-principle demonstration of this super-additivity of classical capacity of a quantum channel by using the ternary symmetric states of a single photon, and by event selection from a weak coherent light source. We also show how the super-additive coding gain, even in a small code length, can boost the communication performance of conventional coding technique.
Also, since detectors can count individual photons and record their arrival times to picosecond precision, and some lasers can generate picosecond pulses at micro- to nanosecond intervals, there is a lot of opportunity to use the time structure to boost S/N in a way that is not possible with radio waves, where counting individual photons is far more challenging.
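A toy sketch of why the time structure helps (the pulse period and gate width below are invented, roughly in the range the text mentions):

```python
# If signal photons can only arrive inside known, narrow time slots synced to
# the transmitter's pulse train, background photons landing outside the slots
# can be discarded. The background suppression is the inverse of the duty
# cycle of the acceptance gate.

pulse_period_s = 1e-6    # laser pulses every microsecond (assumed)
gate_width_s   = 100e-12 # accept photons only in a 100 ps window per pulse

duty_cycle = gate_width_s / pulse_period_s  # fraction of time the gate is open
background_rejection = 1.0 / duty_cycle

print(f"duty cycle: {duty_cycle:.1e}")
print(f"background suppressed by ~{background_rejection:,.0f}x")
```

Nothing comparable is available to a conventional radio receiver, which integrates field amplitude over the whole bandwidth rather than time-tagging individual quanta.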
For more on that, see