I often see figures for how long it would take a black hole of a given mass to evaporate away due to Hawking radiation. The Wikipedia page itself mentions that a black hole of about $10^{11}$ kg takes about 2.667 billion years to evaporate. As I understand it, Hawking radiation is a statistical process, so there will be a random factor in this evaporation time. So what is the standard deviation of black hole evaporation times, at least as an order-of-magnitude estimate? Is it always very tiny, does it have a timescale similar to the expected lifetime, or something else? In particular, would that $10^{11}$ kg black hole have any meaningful chance of surviving ~13 billion years, or is it going to be gone long before then with near-absolute certainty? We may assume, as I assume those other calculations do, that the spacetime is flat at infinity and that the black hole is the only thing in the universe (so no microwave background radiation to feed the black hole, etc.).
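
(For reference, that quoted figure is what the standard photons-only evaporation formula $t = 5120\pi G^2 M^3/\hbar c^4$ gives, ignoring greybody factors and extra particle species; a quick sketch to check the number, with rounded constants:)

```python
# Order-of-magnitude check of the quoted lifetime using the standard
# photons-only evaporation formula t = 5120*pi*G^2*M^3 / (hbar*c^4).
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.0546e-34  # reduced Planck constant, J s
c = 2.998e8        # speed of light, m/s
YEAR = 3.156e7     # seconds per year

M = 1e11  # black hole mass, kg
t_evap = 5120 * math.pi * G**2 * M**3 / (hbar * c**4)
print(f"{t_evap / YEAR:.3e} yr")  # ~2.67e9 years, matching the quoted figure
```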

  • I don’t believe that we understand Hawking radiation well enough to answer this question, since we’ve never observed it, and we don’t have a working theory of quantum gravity that we can use to make exact theoretical predictions. Just for example, a proposed solution to the black hole information paradox is that the “lost” information may modulate the Hawking radiation in some way.
    – Mike Scott
    Commented Feb 10, 2023 at 13:05
  • @MikeScott The variation would implicitly be calculated with respect to whatever was used to model the existing lifetime calculations; answerer's choice if there are several (the Wikipedia article mentions some calculations that depend on the masses and number of neutrino varieties). We also don't know whether those are correct or exactly what makes the radiation happen, but the calculations could clearly be done. And if even that's still not enough, an answer explaining that would also be welcome. Commented Feb 10, 2023 at 21:44

2 Answers


Basically the radiation is blackbody radiation, with two added complications: "greybody factors" that modify the spectrum, and the fact that a sufficiently hot hole emits massive particles in addition to photons. These effects shift the lifetime somewhat but do not change the variance much.

For the lifetime to deviate much from the standard formula, there would have to be unusually many high- or low-energy emissions across the lifespan, or consistently shorter or longer delays between successive emissions.

The Planck radiation law, seen as a probability distribution, has a finite variance (there does not seem to be an analytic expression for it in elementary functions, but it exists). That means the central limit theorem applies: as you add together the energies of a large number of emitted particles, the sum approaches a Gaussian distribution. Since the number of particles emitted by a typical black hole is vast, this makes the final energy sum Gaussian to very high precision (the greybody factors do not change this). What this means is that the relative standard deviation, which scales as $1/\sqrt{n}$, will be negligible: the energy sum is completely dominated by the average.
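
A minimal numerical sketch of this, computing the moments of the Planck photon-number spectrum $x^2/(e^x - 1)$ (with $x = E/k_BT$) on a grid; the photon count $n = 10^{38}$ is a round illustrative assumption, not a careful estimate:

```python
# Moments of the Planck photon-number spectrum p(x) ~ x^2/(e^x - 1), x = E/kT,
# and the relative spread of a sum of n independent energy draws (CLT scaling).
import numpy as np

x = np.linspace(1e-6, 60.0, 200_000)
dx = x[1] - x[0]
p = x**2 / np.expm1(x)   # unnormalized number spectrum
p /= p.sum() * dx        # normalize to a probability density

mean = (x * p).sum() * dx                      # ~2.70 (mean photon energy, in kT)
sd = np.sqrt(((x - mean)**2 * p).sum() * dx)
print(f"one photon: sd/mean = {sd / mean:.2f}")  # ~0.65, i.e. order one

n = 1e38  # illustrative photon count for a ~1e11 kg hole (an assumption)
print(f"summed energy: sd/mean = {sd / mean / np.sqrt(n):.1e}")  # ~6e-20
```

The single-photon spread is of order one, but dividing by $\sqrt{n}$ for $n \sim 10^{38}$ emissions crushes it to nothing.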

A similar argument can be made for the particle emission times. The typical emission rate is $\lambda \propto 1/M$, about one particle per light-crossing time $2GM/c^3$, presumably well modelled by a Poisson point process. In the limit of long summing times this also converges to a Gaussian: the time for a very large number of particles to be emitted is Gaussian, with a mean set by the (average) rate and a relative standard deviation going as $1/\sqrt{n}$.
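
A toy check of that scaling, with the mean waiting time set to 1 and the emission count chosen purely for illustration: the total time for $n$ Poisson emissions is a sum of $n$ exponential waiting times, i.e. a Gamma($n$) variable.

```python
# Total time for n Poisson emissions = sum of n exponential waiting times
# (a Gamma(n) random variable); its relative spread is 1/sqrt(n).
import numpy as np

rng = np.random.default_rng(42)
n, trials = 10_000, 5_000
totals = rng.gamma(shape=n, scale=1.0, size=trials)  # unit mean waiting time

print(f"sd/mean  = {totals.std() / totals.mean():.4f}")  # ~0.0100
print(f"1/sqrt(n) = {1 / np.sqrt(n):.4f}")               # 0.0100
```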

So macroscopically, these arguments imply that there will not be any great individual variance in black hole lifetimes. Two black holes of the same mass will decay at the same time... almost. There are still random factors here, so there will be a small standard deviation, but one negligible compared to the overall lifespan.

  • I'm not quite sure the analysis is correct. For the evaporation we are summing a vast number of events, but we are not averaging that vast number of events. If we were averaging, then the distribution would be very approximately normal with deviation scaling as $1/\sqrt{n}$: the average emission event is extremely tight in distribution. But we aren't averaging, we're just summing. The distribution is still approximately normal, but the deviation scales as $\sqrt{n}$. Yes? And since each emission event is not actually independent of the others, doesn't that complicate things? Commented Feb 13, 2023 at 1:23
  • If the evaporation lifetime can be reasonably modeled as a sum of $n$ i.i.d. random variables with mean $\mu>0$ and standard deviation $\sigma$, I think we can say that the ratio (standard deviation of lifetime)/(average lifetime) scales as $1/\sqrt{n}$, which is to say that as $n$ gets very large the standard deviation is, indeed, a very tiny fraction of the expected lifetime (unless $\sigma/\mu$ is very large, which doesn't seem to be the case). Commented Feb 13, 2023 at 1:45

Very small.

The actual maths of Hawking radiation is complicated for me, so I tried to get a feel for the scale by thinking about a very simplified model: constant-rate emission of photons. This model is intentionally simple, so the predictions it makes can only be "order of magnitude" predictions, but it is a start.

So how many photons will the black hole emit? It is big, and all its mass needs to be converted to photons; that is a lot of energy by $E=mc^2$. Photons are small, and each one carries only a small amount of energy by $E=h\nu$, so a black hole will need to emit a huge number of photons to evaporate. Let's say $10^{50}$ for the sake of calculation (later we can vary this to see if it makes a big difference), and suppose it emits these photons over its 10-billion-year life (roughly $10^{17}$ seconds). This is a random process: the number of photons emitted in those 10 billion years would be modelled by a random variable with a Poisson distribution, but since the number of events is large, we can approximate it by a normal distribution with equal mean and variance.
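
(As a sanity check on that guess, and going slightly beyond the toy model: taking the typical photon energy to be $\sim k_BT$ at the Hawking temperature lands closer to $10^{39}$ photons than $10^{50}$; as shown after the next step, either value leaves the conclusion intact.)

```python
# Rough photon count from E = m c^2 and E = h nu, taking the typical photon
# energy as ~ k_B * T_Hawking. An added assumption, not part of the toy model,
# which simply posits 1e50 photons for the sake of calculation.
import math

G, hbar, c, k_B = 6.674e-11, 1.0546e-34, 2.998e8, 1.381e-23

M = 1e11                                       # black hole mass, kg
T = hbar * c**3 / (8 * math.pi * G * k_B * M)  # Hawking temperature, ~1.2e12 K
n_photons = (M * c**2) / (k_B * T)             # total energy / typical photon energy
print(f"T ~ {T:.1e} K, photons ~ {n_photons:.0e}")  # ~5e38
```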

This means that in 10 billion years it would emit an average of $10^{50}$ photons with a standard deviation of $10^{25}$: for a Poisson distribution the variance equals the mean, and the standard deviation is the square root of the variance.

So how much longer might it last after 10 billion years? If it has emitted $10^{50}-10^{25}$ photons (one standard deviation below the mean), it would last for as long as it takes to emit the remaining $10^{25}$ photons. But in this model it emits $10^{50}/10^{17} = 10^{33}$ photons per second, so it would take about $10^{-8}$ seconds extra.
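
The toy arithmetic in one place (all inputs are this answer's illustrative assumptions); note that the extra time is just lifetime$/\sqrt{n}$, so even the smaller photon-count estimate above ($\sim 10^{39}$) only raises it to $\sim 10^{-3}$ seconds:

```python
# The toy model's arithmetic: constant-rate emission with Poisson statistics.
n_photons = 1e50   # assumed total photons (this answer's illustrative number)
lifetime = 1e17    # assumed lifetime in seconds (~10 billion years)

rate = n_photons / lifetime  # 1e33 photons per second
sd = n_photons ** 0.5        # Poisson: sd = sqrt(mean) = 1e25 photons
extra = sd / rate            # = lifetime / sqrt(n_photons)
print(f"rate = {rate:.0e}/s, sd = {sd:.0e} photons, extra time = {extra:.0e} s")
# -> extra time ~ 1e-8 s
```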

This is the kind of scale of randomness involved. The fact that photons are emitted at random in blackbody radiation would cause some black holes (in this massively simplified model) to live about $10^{-8}$ seconds longer or shorter than others.

That gives a sense of the scale of the standard deviation of black hole evaporation times. I conclude that the lifespan of a black hole can be adequately described by a non-random variable. That $10^{11}$ kg black hole would not survive to the present.

However, a larger black hole would survive, and we might see an evaporation signal when we look at distant galaxies, since we may be seeing them as they were about 2.667 billion years after the Big Bang.

