This is my first post here, so I apologize if there's something wrong.

I am studying quantum optics and I am having trouble with the difference between bunching/antibunching and super-Poissonian/sub-Poissonian light.

According to Loudon, The Quantum Theory of Light, p. 248, bunching and antibunching depend on whether $g^{(2)}(\tau)$ (the degree of second-order coherence) is less than or greater than $g^{(2)}(0)$:

  • bunching $g^{(2)}(\tau)$ < $g^{(2)}(0)$
  • random $g^{(2)}(\tau)$ = $g^{(2)}(0)$
  • anti-bunching $g^{(2)}(\tau)$ > $g^{(2)}(0)$

And X. T. Zou and L. Mandel, Photon anti-bunching and sub-Poissonian photon statistics (1989), state that the joint probability of detecting one photon at time $t$ and another at time $t + \tau$ is proportional to $g^{(2)}$; hence $p_2(t, t + \tau)$ decreases or increases as $\tau$ increases according to bunching or antibunching, while the value of $g^{(2)}(0)$ dictates sub- or super-Poissonian statistics. I think this is the correct definition, and it is consistent with Loudon's treatment, while other textbooks and papers seem to confuse the two different aspects.

For example, this answer does not seem correct to me: What is $g^{(2)}$ in the context of quantum optics? And how is it calculated? It states:

A threefold classification of light according to the second-order correlation function can be made as follows:

  • bunched light: $g^{(2)}(0) > 1$,
  • coherent light: $g^{(2)}(0)= 1$,
  • antibunched light: $g^{(2)}(0) < 1$.

To me this classification characterizes sub- or super-Poissonian light, not bunched or antibunched light.

So, I have three related questions:

  1. Why is there this confusion, and what is the correct answer? I am almost sure that Loudon, Zou and Mandel are right.
  2. I understand the math, but I don't completely see the physical meaning or how to distinguish the different effects experimentally. I understand that bunching means that photons tend to arrive at the detector in bunches, but super-Poissonian light seems similar because, e.g. for thermal light, $\Delta n^{2} = \langle n \rangle + \langle n \rangle^{2}$, which also seems to imply an excess of photons.
  3. The Hanbury Brown and Twiss interferometer measures the correlation of the intensity of the electric field. Does this automatically tell us whether the source is bunched or antibunched and super- or sub-Poissonian at the same time? (I guess yes, because if we know $g^{(2)}(\tau)$, we know its value at zero and whether it increases or decreases with $\tau$. This is valid only if Loudon's definition is correct.)

1 Answer


This is a very good first question. Keep going like this!

  1. Why is there this confusion, and what is the correct answer? I am almost sure that Loudon, Zou and Mandel are right.

Let me first clarify the connection between Loudon's classification and the one in the linked answer. For any type of light it is reasonable to assume that the intensity at time $t$ is uncorrelated with the intensity at time $t + \tau$ for very large $\tau$: $$ \langle I(t) I(t+\tau) \rangle \xrightarrow{\tau \to \infty} \langle I(t) \rangle \langle I (t+\tau) \rangle $$

Or, phrased with photon detection probabilities: the probability to detect a photon a long time $\tau$ after a photon has been detected at time $t$ is the same as the probability to detect a photon at time $t + \tau$ under any circumstances, $$ p(t+\tau | t) \xrightarrow{\tau \to \infty} p(t+\tau) $$

With this one can show that $$ g^{(2)}(\tau\to\infty) = \begin{cases} \lim_{\tau\to\infty} \frac{\langle I(t) I(t+\tau) \rangle}{\langle I(t) \rangle \langle I (t+\tau) \rangle} = \frac{\langle I(t) \rangle \langle I(t+\tau) \rangle}{\langle I(t) \rangle \langle I (t+\tau) \rangle} = 1 \\ \lim_{\tau\to\infty} \frac{p(t) \, p(t+\tau | t)}{p(t) \, p(t+\tau)} = \frac{p(t) \, p(t+\tau)}{p(t) \, p(t+\tau)} = 1 \end{cases} $$

Assuming that $g^{(2)}(\tau)$ is monotonic (in some cases it isn't), i.e. that it goes from its value at $\tau=0$ to its value for $\tau \to \infty$ without wiggles:

  • $g^{(2)}(0) > g^{(2)}(\tau)$ is equivalent to $g^{(2)}(0) > 1$.
  • $g^{(2)}(0) = g^{(2)}(\tau)$ is equivalent to $g^{(2)}(0) = 1$.
  • $g^{(2)}(0) < g^{(2)}(\tau)$ is equivalent to $g^{(2)}(0) < 1$.
Therefore both definitions, the one in Loudon's book and the one in the linked answer, are correct. I personally prefer the definition which compares the value of $g^{(2)}(0)$ to $1$, because there you only need to measure or compute $g^{(2)}(0)$, not its whole time evolution.
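
To get a feeling for the $\tau$-dependence, here is a minimal numerical sketch (my own illustration with arbitrary parameters, not taken from the references above). It models chaotic (thermal-like) light as a complex field undergoing an Ornstein-Uhlenbeck process with an assumed coherence time $\tau_c$ and estimates $g^{(2)}(\tau)$ from the intensity: the estimate starts near $2$ at $\tau = 0$ (bunched) and decays to $1$ for $\tau \gg \tau_c$, whereas a constant intensity would give $g^{(2)}(\tau) = 1$ at every delay.

```python
import numpy as np

rng = np.random.default_rng(0)

# Chaotic (thermal-like) light: model the complex field amplitude E(t) as an
# Ornstein-Uhlenbeck process with coherence time tau_c (all units arbitrary).
n_steps, dt, tau_c = 200_000, 0.01, 1.0
E = np.empty(n_steps, dtype=complex)
E[0] = (rng.normal() + 1j * rng.normal()) / np.sqrt(2)  # start in the steady state
for i in range(1, n_steps):
    noise = (rng.normal() + 1j * rng.normal()) / np.sqrt(2)
    E[i] = E[i - 1] * (1 - dt / tau_c) + np.sqrt(2 * dt / tau_c) * noise

I = np.abs(E) ** 2  # instantaneous intensity

def g2(I, lag):
    """Estimate g2(tau) = <I(t) I(t+tau)> / <I>^2 at a delay of `lag` time steps."""
    return np.mean(I[: len(I) - lag] * I[lag:]) / np.mean(I) ** 2

for lag in (0, 10, 100, 500, 2000):
    print(f"tau = {lag * dt:5.2f}   g2 ~ {g2(I, lag):.2f}")
# Expected: g2(0) ~ 2 (bunched), decaying towards 1 for tau >> tau_c = 1.
# A constant intensity (ideal coherent beam) would give g2(tau) = 1 at every delay.
```
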
  2. I understand the math, but I don't completely see the physical meaning or how to distinguish the different effects experimentally.

Photon statistics, and whether they are (sub-/super-)Poissonian, refer to the number distribution $P(n)$ of photons detected within a very short time window, short in comparison to the time over which $g^{(n)}(\tau)$ goes to $1$. Therefore, given the photon statistics $P(n)$, one can calculate $g^{(n)}(0)$. In particular $$ g^{(2)}(0) = \frac{\langle n \, (n-1) \rangle}{\langle n \rangle^2} \text{.} $$ Bunched light is typically classical light with intensity fluctuations. For a more detailed description have a look at this question. In short, if you detect a photon at time $t$, chances are high that this happened within an intensity maximum (a bunch of photons), so the probability to detect another photon shortly afterwards is increased.

Antibunched light is typically the emission from single-photon emitters. An atom / molecule can only emit one photon at a time. So if you just detected one photon from it you know that there can't be another one immediately after it.

Uncorrelated light ($g^{(2)}(\tau) = 1$) is when a detected photon doesn't give you any information about future detection events. See for example this question to see why uncorrelated photons result in a Poissonian number distribution.
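
As a quick numerical check of the formula above (a sketch using NumPy/SciPy; the mean photon number $\bar{n} = 3$ is only illustrative), one can evaluate $g^{(2)}(0)$ and the Fano factor $\Delta n^2 / \langle n \rangle$ directly from a photon-number distribution $P(n)$ for coherent, thermal, and Fock-state light:

```python
import numpy as np
from scipy.stats import poisson

def g2_zero(P, n):
    """g2(0) = <n(n-1)> / <n>^2 computed from a photon-number distribution P(n)."""
    mean = np.sum(n * P)
    return np.sum(n * (n - 1) * P) / mean**2

def fano(P, n):
    """Fano factor Var(n)/<n>: > 1 means super-Poissonian, < 1 sub-Poissonian."""
    mean = np.sum(n * P)
    return (np.sum(n**2 * P) - mean**2) / mean

n = np.arange(200)
nbar = 3.0  # illustrative mean photon number

distributions = {
    "coherent (Poisson)": poisson.pmf(n, nbar),
    "thermal (Bose-Einstein)": nbar**n / (1.0 + nbar) ** (n + 1),
    "Fock state |3>": (n == 3).astype(float),
}
for name, P in distributions.items():
    print(f"{name:24s}  g2(0) = {g2_zero(P, n):.3f}   Fano = {fano(P, n):.3f}")
# coherent: g2(0) = 1, Fano = 1;  thermal: g2(0) = 2, Fano = 1 + nbar;
# Fock |3>: g2(0) = 2/3, Fano = 0 (sub-Poissonian).
```
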

Experimentally one measures $g^{(2)}(\tau)$ by time-tagging single photons and then binning the delays between each pair into a histogram. Here is a question on this.
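
As a sketch of that procedure (with made-up, uncorrelated detector clicks and arbitrary count rates, just to show the bookkeeping), one can histogram the delays between clicks on the two detectors of a Hanbury Brown and Twiss setup and normalize by the expected accidental coincidences, so that $g^{(2)}(\tau) \to 1$ for uncorrelated events:

```python
import numpy as np

def g2_histogram(t1, t2, bin_width, max_tau):
    """Coincidence histogram of delays t2 - t1 between two detectors (HBT setup).

    t1, t2: sorted arrays of photon arrival times (seconds).
    Returns bin centers (s) and raw coincidence counts per bin."""
    edges = np.arange(-max_tau, max_tau + bin_width, bin_width)
    counts = np.zeros(len(edges) - 1)
    lo = np.searchsorted(t2, t1 - max_tau)  # first detector-2 click in the window
    hi = np.searchsorted(t2, t1 + max_tau)  # first detector-2 click past the window
    for ta, i0, i1 in zip(t1, lo, hi):
        counts += np.histogram(t2[i0:i1] - ta, bins=edges)[0]
    return 0.5 * (edges[:-1] + edges[1:]), counts

# Made-up data: two detectors registering uncorrelated clicks (Poissonian source),
# so the normalized histogram should be flat at g2 ~ 1. Rates are arbitrary.
rng = np.random.default_rng(1)
rate, T = 1e4, 10.0  # counts per second, total measurement time in seconds
t1 = np.sort(rng.uniform(0.0, T, rng.poisson(rate * T)))
t2 = np.sort(rng.uniform(0.0, T, rng.poisson(rate * T)))

bin_width, max_tau = 1e-6, 5e-5
tau, counts = g2_histogram(t1, t2, bin_width, max_tau)
accidentals = len(t1) * len(t2) * bin_width / T  # expected counts per bin if uncorrelated
print((counts / accidentals).round(2))           # normalized g2(tau), ~1 everywhere here
```
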

I understand that bunching means that photons tend to arrive at the detector in bunches, but super-Poissonian light seems similar because, e.g. for thermal light, $\Delta n^2 = \langle n \rangle + \langle n \rangle^2$, which also seems to imply an excess of photons.

This is exactly how super-Poissonian light is defined. If the photon statistics have a variance larger than that of a Poissonian distribution with the same mean value $\bar{n}$, the light is called super-Poissonian and in most cases has $g^{(2)}(0) > 1$. If the variance is below that of a Poissonian distribution it is sub-Poissonian and in most cases has $g^{(2)}(0) < 1$.
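
For a single mode this connection can be made explicit (a standard identity, added here for completeness): since $\langle n (n-1) \rangle = \langle n^2 \rangle - \langle n \rangle$, $$ g^{(2)}(0) = \frac{\langle n^2 \rangle - \langle n \rangle}{\langle n \rangle^2} = 1 + \frac{\Delta n^2 - \langle n \rangle}{\langle n \rangle^2} \text{,} $$ so $\Delta n^2 > \langle n \rangle$ (super-Poissonian) corresponds to $g^{(2)}(0) > 1$, and $\Delta n^2 < \langle n \rangle$ (sub-Poissonian) to $g^{(2)}(0) < 1$.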


  3. The Hanbury Brown and Twiss interferometer measures the correlation of the intensity of the electric field. Does this automatically tell us whether the source is bunched or antibunched and super- or sub-Poissonian at the same time? (I guess yes, because if we know $g^{(2)}(\tau)$, we know its value at zero and whether it increases or decreases with $\tau$. This is valid only if Loudon's definition is correct.)

As written before, in most cases bunching means super-Poissonian light and antibunching means sub-Poissonian light. You might wonder what these "most cases" don't include. A counterexample is a source which emits perfect 2-photon states $|2\rangle$. There you have $$ g^{(2)}(0) = \frac{\langle 2 | \hat{n} \, ( \hat{n} - 1 ) | 2 \rangle}{\langle 2 | \hat{n} | 2 \rangle^2} = \frac{2 \cdot 1}{2^2} = \frac{1}{2} < 1 \text{,} $$ but $$ \Delta n^2 = 0 < \bar{n} \text{.} $$ Edit: This example is not fully specified, as it doesn't make any statement about the time evolution. Still, one finds this in many theory papers, partially because their calculations don't depend on the dynamics of the system and can therefore be widely applied, and partially because it's much easier to calculate $g^{(2)}(0)$ than the full evolution $g^{(2)}(\tau)$. Let me specify a concrete example here:

Imagine a system which is in the steady state $\psi_{ss} = |2\rangle$, i.e. $2$ photons in a single mode of a cavity, so we don't have to deal with flying photon wavepackets. If a photon is detected at time $t$ the system is projected into the new state $\psi(t) = |1\rangle$. From there it will evolve again towards the steady state. After an infinitely long time $\tau \to \infty$ the system is back in the steady state. This means the probability to detect a second photon right after the first is only half as large as after an infinite time, simply because of the expected number of photons in the cavity at these times: $$ p(t+0|t) = \tfrac{1}{2} p(t+\infty|t) $$ Together with $g^{(2)}(0) = \frac{1}{2}$ this leads to $g^{(2)}(\infty) = 1$, as it should be for any system.

Now, why is the state $|2\rangle$ not bunched? Because it has no intensity fluctuations which lead to gaps between the bunches. It's like when you compare an infinite forest to a group of trees on a meadow. Seeing the group of trees one would say it's a bunch, but standing in the infinite forest you only know trees everywhere. To make it mathematically precise, take a look at the "group of trees on a meadow" equivalent state: $\sqrt{1-\epsilon} |0\rangle + \sqrt{\epsilon} |2\rangle$. Here the bunches cover a fraction $\epsilon$ of the landscape. For this state $$ g^{(2)}(0) = \frac{\epsilon \langle 2 | \hat{n} \, (\hat{n}-1) | 2 \rangle}{\left( \epsilon \langle 2 | \hat{n} | 2 \rangle \right)^2} = \frac{2 \epsilon}{\left( 2 \epsilon \right)^2} = \frac{1}{2 \epsilon} \text{.} $$ For small $\epsilon$ this can lead to insanely high bunching values. Here is a paper in which they measured $g^{(2)}(0) = 21$.
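
A quick numerical check of these numbers (again just a sketch; the Fock-space amplitudes are the only input, and $\epsilon = 1$ reproduces the pure $|2\rangle$ state with $g^{(2)}(0) = \tfrac{1}{2}$ from above):

```python
import numpy as np

def g2_zero(amps):
    """g2(0) = <n(n-1)> / <n>^2 for a single-mode pure state with Fock amplitudes amps."""
    n = np.arange(len(amps))
    p = np.abs(np.asarray(amps)) ** 2
    mean_n = np.sum(n * p)
    return np.sum(n * (n - 1) * p) / mean_n**2

# eps = 1 reproduces the pure |2> state, small eps the "trees on a meadow" state.
for eps in (1.0, 0.5, 0.1, 0.01):
    state = [np.sqrt(1 - eps), 0.0, np.sqrt(eps)]  # sqrt(1-eps)|0> + sqrt(eps)|2>
    print(f"eps = {eps:4.2f}   g2(0) = {g2_zero(state):6.2f}   1/(2 eps) = {1 / (2 * eps):6.2f}")
```
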

  • First of all, thank you for the great answer. I have some questions. In the part about the counterexample, why do you say "but"? $g^{(2)}(0)$ is related to $\Delta n^2$ for a single mode: $g^{(2)}(0) = 1 + \frac{\Delta n^{2} - \langle n \rangle}{\langle n \rangle^{2}} $, so $\Delta n^{2} = 0$ is equivalent to $g^{(2)}(0) = \frac{1}{2}$. In any case we have sub-Poissonian statistics here, but I don't know whether the light is bunched or not. Intuitively it seems the two photons are in a bunch. But if I think of a single mode, $g^{(2)}$ is in general independent of $\tau$, so it seems like random detection.
    – Mark_Bell
    Commented Mar 24, 2021 at 4:29
  • It seems like the probability neither increases nor decreases, so the detection must be random. Moreover, in this case $g^{(2)}(\tau\to\infty)$ is not 1, so the intensities are not uncorrelated. These things contradict each other. Where am I wrong? Can you please clarify this point? Thank you so much
    – Mark_Bell
    Commented Mar 24, 2021 at 4:32
  • @Mark_Bell I've added a more detailed description of the example.
    – A. P.
    Commented Mar 24, 2021 at 9:44
  • For a single mode $g^{(2)}$ is independent of $\tau$, but here it goes from $\frac{1}{2}$ to 1. I understand the reasoning for why $g^{(2)}$ has to be 1 for $\tau\to\infty$ in the last part of your answer. Is it the case that $g^{(2)}$ changes with time in this example? I mean the time $t$ when you perform the measurement, not the delay time $\tau$; maybe this is related to the fact that the state changes with time?
    – Mark_Bell
    Commented Mar 25, 2021 at 1:46
  • @Mark_Bell The time-dependence in the example makes the system multimode. The temporal length of a mode is given by the coherence time, which in this example coincides with the time the system needs to reach the steady state.
    – A. P.
    Commented Mar 25, 2021 at 8:48
