In quantum optics, coherent light with constant frequency, phase, and amplitude shows Poissonian photon-number statistics:
$$P(n) = \frac{\bar{n}^{n}}{n!}e^{-\bar{n}}.$$
A well-known result for Poisson distributions is that their variance equals their average and therefore their standard deviation is equal to the square root of the average:
$(\Delta n)^2 = \bar{n}$ and $\Delta n = \sqrt{\bar{n}}$.
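As a quick numerical sanity check (not from any textbook, just a direct summation over the distribution with an illustrative $\bar{n} = 5$), one can verify that the mean and variance of $P(n)$ coincide:

```python
from math import exp, factorial

def poisson_pmf(n, nbar):
    """P(n) = nbar^n / n! * exp(-nbar)"""
    return nbar**n / factorial(n) * exp(-nbar)

nbar = 5.0  # illustrative mean photon number
# Truncate the sum where the tail is negligible for nbar = 5.
ns = range(0, 101)
mean = sum(n * poisson_pmf(n, nbar) for n in ns)
var = sum((n - mean)**2 * poisson_pmf(n, nbar) for n in ns)

print(mean)  # ≈ 5.0
print(var)   # ≈ 5.0, equal to the mean
```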
Now, if you were to attenuate a coherent light beam, the average photon number $\bar{n}$ would go down, say by a factor $x$. As far as I understand, when scaling a probability distribution, the average and the standard deviation are both scaled by that factor. So the 'new' light beam would have the following properties in terms of the 'old' one:
$\bar{n}_2 = x \bar{n}_1$ and $\Delta n_2 = x \Delta n_1 = x \sqrt{\bar{n}_1}$.
The variance of the attenuated beam is then the following:
$(\Delta n_2)^2 = x^2 \bar{n}_1$.
This means that now $\bar{n}_2 \ne (\Delta n_2)^2$, indicating that the light is no longer Poissonian. However, that would make it impossible to ever produce coherent light, since there is always some attenuation in any set-up or light source. Furthermore, books like "Quantum Optics: An Introduction" by Mark Fox still speak of Poissonian statistics even after attenuation or inefficient detection. This suggests that the scaling of the probability distribution described above is incorrect.
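To make the tension concrete, here is the same scaling argument carried out numerically (illustrative values $\bar{n}_1 = 5$ and $x = 0.1$; this just restates the reasoning above, scaling each outcome $n \to xn$ while keeping the probabilities fixed):

```python
from math import exp, factorial

def poisson_pmf(n, nbar):
    return nbar**n / factorial(n) * exp(-nbar)

nbar1 = 5.0  # mean photon number of the original beam (illustrative)
x = 0.1      # attenuation factor (illustrative)

ns = range(0, 101)
# Scale the random variable itself: outcome n -> x*n, same probabilities.
mean2 = sum(x * n * poisson_pmf(n, nbar1) for n in ns)
var2 = sum((x * n - mean2)**2 * poisson_pmf(n, nbar1) for n in ns)

print(mean2)  # x * nbar1 = 0.5
print(var2)   # x**2 * nbar1 = 0.05, no longer equal to the mean
```

Under this scaling the variance is $x^2\bar{n}_1$ while the mean is $x\bar{n}_1$, which is exactly the mismatch described above.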
So my question is: how can these two concepts be reconciled? Why do the photon statistics of light not scale in the same way as other probability distributions?