
From the Stefan-Boltzmann Law ($j^*=\sigma T^4$), the luminosity of the Sun is $L_{sun}=j^*A=[\sigma T^4][4\pi R_{sun}^2]=3.85\times 10^{26}$ W.

If I assume the Sun to be in a steady state, then the energy exiting the core (I will assume $R_{core}\sim 0.25R_{sun}$) should equal the energy the Sun radiates away into space at the surface.

If I set the power exiting the Sun's core (heading into the radiative zone) equal to the Sun's luminosity and apply $j^*=\sigma T^4$ with $L=j^*A$, I find the temperature at the boundary between the core and the radiative zone is 11,558 K (much lower than the 7,000,000 K value quoted online).
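
For concreteness, here is a minimal sketch of that arithmetic (using the standard values for $\sigma$ and $R_{sun}$ and the luminosity above); it reproduces the ~11,600 K figure:

```python
import math

sigma = 5.670e-8            # Stefan-Boltzmann constant, W m^-2 K^-4
R_sun = 6.957e8             # solar radius, m
L_sun = 3.85e26             # solar luminosity, W
R_core = 0.25 * R_sun       # core radius assumed above

# Solve L = sigma * T^4 * 4*pi*R_core^2 for T
T_boundary = (L_sun / (sigma * 4 * math.pi * R_core**2)) ** 0.25
print(f"T at core boundary if it radiated freely: {T_boundary:,.0f} K")  # ~11,600 K
```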

I realise that a more accurate picture would come from solving the equations of hydrostatic equilibrium, but I would think that if all the Sun's energy is created in the core, and the Sun operates at a steady state, then energy conservation would require that the energy exiting the core equal the luminosity of the Sun.

Is there something wrong in my reasoning about energy conservation, or are there ways that the Sun emits energy formed in the core other than by blackbody radiation? Am I wrong to assume the Sun's core emits as a black body (if so, I don't think an emissivity correction would matter much, since I'm off by almost 3 orders of magnitude)?

  • It works well for the Sun's surface. The Stefan-Boltzmann surface is an imaginary surface assumed to lie just below the object's surface, and it assumes the "furnace" behind it is a perfect internal reflector. The Sun is roughly 15 million °C in the core, about 6,000 °C at the surface (the photosphere), and a few million degrees C in the corona, which lies a few thousand kilometres above the photosphere. Commented Apr 18, 2019 at 0:38
  • When you apply the SB law inside the Sun, don't forget the energy moving in both directions. At the Sun's surface there is no incoming energy, but at any surface inside the Sun there is almost exactly as much energy going inwards as outwards. Commented Jul 1, 2023 at 16:09

2 Answers


"If I assume the Sun to be in a steady state, then the energy exiting the core (I will assume $R_{core}\sim 0.25R_{sun}$) should equal the energy the Sun radiates away into space at the surface."

That's where your problem is: the energy flux exiting the surface of the Sun is only $\sim 10^{-12}$ of the flux exiting the core. This is because the interior of the Sun is not a naked Stefan-Boltzmann surface; radiation bounces back and forth by diffusion and is thus mostly trapped. The tiny fraction that escapes in the end is what we see as the surface luminosity.

If you want the core photon fluxes, I'm afraid you'll need a structure model for that, or you can go fully hydrostatic (assuming all fluxes are in balance) and use a polytrope. Once you have the structure, you can play the Stefan-Boltzmann game again and compare the flux exiting the core with the flux exiting the surface.
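
To illustrate (this is my own rough sketch, not a full structure model): integrate the Lane-Emden equation for an $n=3$ polytrope, convert the solution to a central density and pressure using the solar mass and radius, and estimate the central temperature with the ideal gas law and an assumed mean molecular weight $\mu\approx 0.61$. This lands near $1.2\times 10^7$ K, the right ballpark for the core:

```python
import math

# Lane-Emden equation for polytropic index n = 3:
#   theta'' + (2/xi) theta' + theta^n = 0,  theta(0) = 1, theta'(0) = 0
n = 3.0
dxi = 1e-4
xi, theta, phi = 1e-6, 1.0, 0.0        # phi = dtheta/dxi; start just off centre

def derivs(xi, theta, phi):
    return phi, -max(theta, 0.0)**n - 2.0 * phi / xi

while theta > 0.0:                      # integrate out to the first zero, xi_1
    k1t, k1p = derivs(xi, theta, phi)
    k2t, k2p = derivs(xi + 0.5*dxi, theta + 0.5*dxi*k1t, phi + 0.5*dxi*k1p)
    theta += dxi * k2t
    phi   += dxi * k2p
    xi    += dxi

xi1, phi1 = xi, phi                     # xi_1 ~ 6.90, phi(xi_1) ~ -0.0424

# Solar parameters (SI) and an assumed mean molecular weight
G, k_B, m_H = 6.674e-11, 1.381e-23, 1.673e-27
M, R, mu = 1.989e30, 6.957e8, 0.61

rho_mean = M / (4.0/3.0 * math.pi * R**3)
rho_c = -xi1 / (3.0 * phi1) * rho_mean                 # rho_c / rho_mean ~ 54
P_c = G * M**2 / (4.0 * math.pi * (n + 1.0) * phi1**2 * R**4)
T_c = P_c * mu * m_H / (rho_c * k_B)                   # ideal gas, no radiation pressure

print(f"xi_1 = {xi1:.3f}, rho_c/rho_mean = {rho_c/rho_mean:.1f}")
print(f"P_c ~ {P_c:.2e} Pa,  T_c ~ {T_c:.2e} K")       # ~1.2e7 K for n = 3
```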

  • Are you implying that the Sun is not in a steady state, and that there is a sink for energy between the core and the photosphere?
    – Bob
    Commented Apr 17, 2019 at 22:22
  • @Bob: Depends which steady state you mean. The Sun is very nearly perfectly hydrostatic, but it is not in thermodynamic equilibrium, and it is only in radiative equilibrium up to the fraction quoted above. The energy sink is the photosphere, the energy source is the core. I don't know why you think there should be a sink between them. Commented Apr 17, 2019 at 22:28
  • By "not steady state" I mean that significantly more energy enters the radiative zone than exits it (for the entirety of the Sun's main-sequence lifetime)
    – Bob
    Commented Apr 17, 2019 at 22:32
  • @Bob: The photon flux through the surface surrounding the core at $r_c$ is enormously greater than what exits the photosphere. But this is only because the matter surrounding the core is enormously optically thick: it keeps diffusing photons back and forth. Thus, the net photon flux across the shell at $r_c + dr$ is minuscule, and it is this minuscule difference that is transported to the next shell and the next, etc., until you reach the surface, where no return flux is possible. That's why we don't need an energy sink. Commented Apr 18, 2019 at 9:25
  • So you're saying that a shell at radius $R+\delta R$ radiates in both directions (towards a shell at radius $R$ and towards a shell at radius $R+2\delta R$), implying a much steeper temperature gradient and hence a greater temperature near the core (compared to my original estimate)?
    – Bob
    Commented Apr 19, 2019 at 15:13

Your arguments are all nearly 100% spot on, but you are ignoring the fact that the outer shell radiates a nearly equal amount of energy back towards the core. A bit more reasoning will convince you that if there were no temperature gradient, no energy could escape at all. If you are a physicist with the ambition to become as good as Fermi or Feynman, try to make a quick estimate first and calculate more precisely later. HINT: to first approximation the escaping fraction will depend on the ratio of the photon mean free path to the Sun's radius, i.e. on the optical thickness.
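
To make that estimate concrete, here is a rough back-of-the-envelope sketch (my own illustrative numbers: the $\sim 1.5\times 10^7$ K core temperature quoted in the comments, the $0.25R_{sun}$ core radius from the question, and a photon mean free path of order a millimetre). Suppressing the naked blackbody luminosity of the core by the optical depth $\tau\sim R/\ell$ already lands within an order of magnitude of the true luminosity:

```python
import math

sigma = 5.670e-8          # Stefan-Boltzmann constant, W m^-2 K^-4
R_sun = 6.957e8           # solar radius, m
R_core = 0.25 * R_sun     # core radius assumed in the question
T_core = 1.5e7            # core temperature, K (order of magnitude)
ell = 1e-3                # photon mean free path, m (~1 mm, a commonly quoted figure)

# Blackbody power a naked surface at the core temperature would radiate
L_core_bb = sigma * T_core**4 * 4 * math.pi * R_core**2

# Radiation must random-walk out, so the escaping luminosity is suppressed
# by roughly the optical depth tau ~ R/ell (diffusion approximation)
tau = R_sun / ell
L_escape = L_core_bb / tau

print(f"Naked-core blackbody luminosity: {L_core_bb:.2e} W")
print(f"Optical depth tau ~ {tau:.1e}")
print(f"Diffusion-suppressed estimate:   {L_escape:.2e} W")  # within ~10x of 3.85e26 W
```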

  • So you're saying that significantly more energy enters the radiative zone than exits it (for the entire time the Sun spends on the main sequence)?
    – Bob
    Commented Apr 17, 2019 at 23:14
  • Assuming the energy is produced only in the core, all layers outside the core have as much energy entering as leaving. But, and this is crucial, on the whole energy is flowing out. Think about the balance, and you can figure it out.
    – Kphysics
    Commented Apr 18, 2019 at 17:16
  • So (in your original post) you're saying that a shell at radius $R+\delta R$ radiates in both directions (towards a shell at radius $R$ and towards a shell at radius $R+2\delta R$), implying a steeper temperature gradient and hence a greater temperature near the core (compared to my original estimate)?
    – Bob
    Commented Apr 19, 2019 at 15:14
  • Right, each interface has radiation going in both directions. Your estimate of the gradient took into account only the fact that the surface of the core is smaller than the outer surface by a factor of 16, so the temperature had to be higher by a factor of 2. That's a relatively minor effect. The major issue is that if there is no gradient, there is no energy flow, just like with ordinary heat conduction. Except that with ordinary conduction a constant flux gives a constant gradient, whereas with radiation it takes an exponential (!) increase in temperature just to keep a constant flux going.
    – Kphysics
    Commented Apr 19, 2019 at 22:09
