
I need this for worldbuilding, but it is a physics question. I want to be able to calculate the energy delivered over distance from a hypernova of a variable star like Eta Carinae. I also need to understand what energy levels would be dangerous to a planet, and dangerous in what way.

Scientists have numbers saying that a normal supernova would damage Earth from within 50 to 100 light years. I'm more interested in hypernovae, because Eta Carinae is 7500 light years from Earth, and some scientists say it could strip our ozone layer literally any minute, day, year, or century now. I have to wonder: if it might be that bad here, how bad is it 5000 or 1000 light years away? https://en.wikipedia.org/wiki/Eta_Carinae#Possible_effects_on_Earth

I know we have models to predict this, but after repeated attempts at searching I cannot find them; maybe I just don't know the correct terms to search for. I like this answer to a similar question: https://worldbuilding.stackexchange.com/a/19002, but it appears to be about planets orbiting a star undergoing a normal supernova.

I'd like to be able to take the characteristics of the star and create a distance spectrum that runs from planetary evaporation down to an aurora light show, for a wide range of planets: rocky, Neptunian, or Jovian. Being able to estimate this for any star would be a bonus, but I mostly just want to handle massive variable stars.


1 Answer


In short, there are no nice standard formulas for this. One can make some order-of-magnitude calculations, though.

The key formula you need is the inverse-square law: the intensity of a spherically radiating source falls off with the inverse square of distance, $$I(r)=\frac{L}{4\pi r^2},$$ where $L$ is the total energy output. The useful thing is that if you know a source with output $L_1$ has certain effects at distance $r_1$ (say, a supernova being dangerous at 50 light years), then a source with output $L_2$ produces the same effects at a distance $$r_2=\sqrt{\frac{L_2}{L_1}}\,r_1.$$ So a hypernova 10 times more luminous than a supernova would be dangerous out to about 158 light years (assuming the initial 50 light year range - this is where different papers and models might quibble a fair bit). Notice that you need a very large luminosity increase to get a vast range this way.
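
As a minimal sketch of this scaling (the 50 light year danger radius and the factor-of-10 hypernova output are the illustrative numbers from above, not measured values):

```python
def scaled_danger_radius(r1_ly, L1, L2):
    """Distance at which a source of output L2 has the same effect
    that a source of output L1 has at distance r1_ly (inverse-square law)."""
    return r1_ly * (L2 / L1) ** 0.5

supernova_radius_ly = 50.0   # assumed danger radius of a normal supernova
hypernova_factor = 10.0      # assumed: hypernova with 10x the supernova's output

print(scaled_danger_radius(supernova_radius_ly, 1.0, hypernova_factor))
# ~158.1 light years
```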

Hypernovae are likely progenitors of gamma-ray bursts, which release most of their energy along fairly narrow jets, 2-20 degrees across. That would amp up the range significantly. If the jet covers a fraction $f$ of the sky, the range becomes $$r_{2,\mathrm{GRB}}=\sqrt{\frac{L_2}{f L_1}}\,r_1,$$ where the $L$'s are the true energy releases. For two opposite jets with opening half-angle $\theta$, $f=1-\cos(\theta)$, so we should expect $f$ to be between 0.00015 and 0.015, increasing the range by a factor of 81 (narrow jet) down to 8.1 (wide jet) if one is unlucky enough to be in the beam.
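
A quick check of the beam fractions and range boosts quoted above (half-angles of 1 and 10 degrees are just the ends of the 2-20 degree range; the boost assumes the planet sits inside the beam):

```python
import math

def beam_fraction(half_angle_deg):
    """Fraction of the sky covered by two opposite jets with this half-angle."""
    return 1.0 - math.cos(math.radians(half_angle_deg))

for half_angle in (1.0, 10.0):   # 2-20 degrees across, i.e. 1-10 degree half-angles
    f = beam_fraction(half_angle)
    print(f"half-angle {half_angle:4.1f} deg: f = {f:.5f}, range boost = {1 / math.sqrt(f):.1f}x")
# f ~ 0.00015 -> ~81x boost, f ~ 0.015 -> ~8.1x boost
```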

The exact effect on different planets depends on many complex factors. Planetary evaporation is not a simple process unless the star delivers far more energy than the planet's gravitational binding energy. The absorbed energy $\pi \epsilon R^2 I(r)$, where $0<\epsilon<1$ is the absorption efficiency, must be much larger than the binding energy $3GM^2/5R$, i.e. $$ I(r) \gg \frac{3GM^2}{5\pi\epsilon R^3}. $$ This threshold scales with both mass and density, so to evaporate a ten times more massive planet of the same density you need ten times more radiation.
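
To get a feel for how extreme this threshold is, here is the right-hand side evaluated for an Earth-like planet; the mass, radius and $\epsilon=0.1$ are assumed illustrative inputs:

```python
import math

G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2

def evaporation_fluence(mass_kg, radius_m, eps=0.1):
    """Fluence (J/m^2) at which the absorbed energy equals the binding energy 3GM^2/5R."""
    return 3.0 * G * mass_kg**2 / (5.0 * math.pi * eps * radius_m**3)

earth_mass = 5.97e24     # kg
earth_radius = 6.371e6   # m

print(f"{evaporation_fluence(earth_mass, earth_radius):.1e} J/m^2")
# ~1.8e19 J/m^2 -- vastly more than the ~3.5e10 J/m^2 worked out below
```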

Merely melting the surface or blowing off the atmosphere depends on a lot of geophysical detail: $\epsilon$ depends strongly on wavelength, on whether a plasma layer forms, and on similar messy issues. The time course of the light curve also matters, since some of the processes will be hydrodynamic flows rather than instant shockwaves or long-term heating.

A crude model is that the absorbed energy initially heats the atmosphere by $$\Delta T = \frac{\epsilon I(r)}{ C_P M_a},$$ where $C_P$ is the specific heat and $M_a$ the mass of a one-square-metre column of atmosphere. For Earth, $M_a\approx 10300$ kg and $C_P=1.00$ kJ/kg·K. A $10^{47}$ J hypernova at a distance of 50 light years radiating evenly would deposit $10^{47}/4\pi (50 \text{ ly})^2\approx 3.5\cdot 10^{10}$ J/m$^2$, heating the atmosphere by about 345 K if we assume $\epsilon=0.1$ - enough to broil us, but not enough to evaporate the oceans (there $M_a$ is 1000 kg per metre of depth and $C_P=4.2$ kJ/kg·K). Were it just a supernova 100 times less energetic, the effect would be a balmy 3.45 K. On the other hand, a focused GRB beam gives a hundred to 10,000 times more heating (at which point the formula above stops being valid, since the air ionises and becomes plasma).
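
Putting the numbers from this paragraph into a short script (the $10^{47}$ J energy, 50 light year distance, $\epsilon=0.1$ and Earth's column mass are the same illustrative values as above):

```python
import math

LIGHT_YEAR = 9.461e15    # metres

def fluence(total_energy_j, distance_ly, sky_fraction=1.0):
    """Energy per square metre (J/m^2) at a given distance; sky_fraction < 1 models a beamed GRB."""
    r = distance_ly * LIGHT_YEAR
    return total_energy_j / (sky_fraction * 4.0 * math.pi * r**2)

def column_heating(fluence_j_m2, eps=0.1, column_mass=10300.0, c_p=1000.0):
    """Temperature rise (K) of a 1 m^2 atmospheric column, SI units throughout."""
    return eps * fluence_j_m2 / (c_p * column_mass)

F = fluence(1e47, 50.0)
print(f"fluence ~ {F:.2e} J/m^2, heating ~ {column_heating(F):.0f} K")
# ~3.6e10 J/m^2 and ~345 K; a supernova 100x weaker gives ~3.5 K
```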

  • I think I understand. I did a little more research on top of what you posted here. Hypernovae aren't a particular type of supernova. From what I gather, all types of supernovae are basically uniform in energy output. I see hypernova and Type Ic used interchangeably in some places, but I read that hypernovae are really like a secondary blast that is even worse than the first. If I know the effects of the supernova type in question, I can use your equations to amp it up to a hypernova and figure out the specifics of the galactic-scale destruction.
    – dboggs95
    Commented May 18, 2019 at 21:28
  • en.wikipedia.org/wiki/Superluminous_supernova en.wikipedia.org/wiki/Supernova#Energy_output
    – dboggs95
    Commented May 18, 2019 at 21:31
