
Some of the wavelengths of light emitted from the Sun will be absorbed by atoms in the outer layers of the Sun and in its atmosphere, and we see this as absorption lines in the spectrum. Now, this absorbed radiation will indeed be re-emitted, so one might think that this emission should "cancel out" the absorption lines. The usual explanation for why this doesn't happen is that the re-emitted light is radiated in all directions, not just towards us, meaning that to us these wavelengths will be much fainter than the other wavelengths.

But the problem I have is that this happens all around the Sun (since the atmosphere is completely surrounding it), and intuitively it seems then that all of this re-emitted light should combine such that far away it would appear that the Sun is radiating at these wavelengths just as it is radiating at all the other wavelengths. And if that is true, then we shouldn't see absorption lines in the spectrum. So what is it that I am missing?


2 Answers


Possibly you are under the misapprehension that the number of photons is a conserved quantity? It isn't: there are more photons at any given wavelength the deeper you go into the star, because there is a temperature gradient. Cooler material further out is less emissive, because fewer of its atoms are in excited states.

The temperature gradient is responsible for the formation of absorption lines. If the photosphere of the Sun were at a single temperature then we would see a perfect blackbody spectrum, for the reasons you outline.

The filling in of absorption by scattering would only take place if the radiation field that the atoms were in were isotropic. But it isn't isotropic because of the temperature gradient.

A much better way to think about the spectrum of a star is to imagine that you can see to a wavelength-dependent depth into the star. Where there is a strong atomic absorption feature, you cannot see very far into the star at that wavelength.

Since the star gets hotter the deeper you go, and the emitted flux scales as $T^4$ (the Stefan-Boltzmann law), the deeper we can see into the star at a given wavelength, the brighter it will appear at that wavelength (and vice versa).
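As a rough numerical illustration of this picture (a minimal sketch of my own, with assumed round-number temperatures rather than values from a solar model): compare the Planck intensity of the deep, hot layer seen in the continuum with that of the shallower, cooler layer seen at the centre of a strong line.

```python
# Sketch: blackbody intensity at the (assumed) continuum-forming temperature
# versus the (assumed) cooler, higher line-forming layer. A ratio < 1 is the
# absorption-line dip in the "how deep can you see" picture.
import math

H = 6.626e-34   # Planck constant (J s)
C = 2.998e8     # speed of light (m/s)
KB = 1.381e-23  # Boltzmann constant (J/K)

def planck(wavelength_m: float, temperature_k: float) -> float:
    """Planck spectral radiance B_lambda in W m^-3 sr^-1."""
    x = H * C / (wavelength_m * KB * temperature_k)
    return (2.0 * H * C**2 / wavelength_m**5) / math.expm1(x)

wavelength = 500e-9    # 500 nm, in the visible
t_continuum = 5800.0   # assumed: deep layer visible in the continuum
t_line = 4500.0        # assumed: higher, cooler layer visible at line centre

ratio = planck(wavelength, t_line) / planck(wavelength, t_continuum)
print(f"line / continuum intensity ratio: {ratio:.2f}")  # well below 1
```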

EDIT:

More formally: the radiative transfer equation, if you want to treat the absorption and re-emission as some sort of scattering process, would be (e.g. equation 1.86 of the canonical textbook "Radiative Processes in Astrophysics", 1979, Rybicki & Lightman, or any book detailing radiative transfer) $$\frac{dI_{\nu}}{ds} = -\sigma_\nu I_{\nu} + \sigma_\nu J_\nu\ ,$$ where $I$ is the specific intensity in the solar photosphere (in this case, directed towards the Earth), $J$ is the mean specific intensity at a point in the solar photosphere averaged over all directions (i.e. $J = \int I\, d\Omega/4\pi$, where $\Omega$ is solid angle), $\sigma$ is the scattering coefficient (assumed isotropic) and $ds$ is a piece of path length towards the observer. The $\nu$ subscript just indicates that everything is wavelength/frequency dependent.

To avoid creating an absorption or emission line, $dI_\nu/ds$ must equal zero (i.e. nothing is added to or subtracted from the beam of light).

This will only happen if $I_\nu = J_\nu$, which would require the specific intensity averaged over all directions to equal the specific intensity emerging from the Sun and heading towards the observer. That is only true if the radiation field is isotropic and equal to $I_\nu$ in all directions.

Whilst this would be true for a blackbody radiation field at a single temperature, it isn't true in the solar photosphere. The specific intensity heading towards us (generally outwards) is always larger than the specific intensity heading away from us (generally inwards); this holds regardless of which portion of the visible solar disc is considered, because the temperature gradient in the photosphere means it is hotter further into the interior. That means $I_\nu$ is always greater than $J_\nu$, hence $dI_\nu/ds < 0$, and we have net absorption.
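To see the sign argument numerically, here is a minimal sketch (my own, with assumed constant toy values for $\sigma_\nu$ and $J_\nu$, not solar numbers) integrating the equation above along a ray: with $J_\nu < I_\nu$, the beam is depleted and the line appears in absorption.

```python
# Sketch: forward-Euler integration of dI/ds = -sigma * (I - J) along a ray.
sigma = 2.0          # assumed scattering coefficient at the line frequency (1/length)
path_length = 1.0    # assumed thickness of the line-forming region (same units)
steps = 10_000
ds = path_length / steps

intensity = 1.0      # outgoing specific intensity, normalized to 1 at the base
j_mean = 0.7         # direction-averaged intensity, < I because of the gradient

for _ in range(steps):
    intensity -= sigma * (intensity - j_mean) * ds

print(f"emergent intensity: {intensity:.3f}")  # < 1.0: net absorption
# Analytic check: I(s) = J + (I0 - J) * exp(-sigma * s) = 0.7 + 0.3/e^2 ~ 0.741
```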

$\endgroup$
  • @ProfRob So the existence of absorption lines due to intergalactic dust and cold gas clouds also has to be related to a necessarily existing temperature gradient in those?
    – trynerror
    Commented Feb 18, 2023 at 14:25
  • @trynerror they exist because they are illuminated by hotter things. The temperature gradient is between the illuminating objects and the absorbing objects. In a star, the illuminating object is a hotter layer beneath a colder layer.
    – ProfRob
    Commented Feb 18, 2023 at 15:24

The atmospheric layer that produces the absorption lines acts somewhat like a mirror at these frequencies and scatters the light back into the Sun (although this is diffuse reflection, not specular reflection like an actual mirror). In principle, light is also scattered outwards (with a probability of 1/2 for each scattering event), but since the layer is very dense at the line frequencies, it takes many scattering events to get through. After two scattering events only a fraction $1/2 \times 1/2 = 1/4$ would remain, after three $1/2 \times 1/2 \times 1/2 = 1/8$, and so on (this is just to demonstrate the principle; in reality it is a bit more complicated due to multiple scattering back and forth in the layer). So many scattering events are required that very little gets through. Almost all of it is scattered back into the lower layers of the atmosphere, where it is eventually converted into photons of different frequencies.
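A small Monte Carlo sketch (my own illustration, not the program referenced further below) makes the point, assuming isotropic scattering and no true absorption in the layer:

```python
# Sketch: 1-D random walk of photons through a purely scattering slab.
# Transmission falls far below 1/2 as the line-centre optical depth grows,
# which is the repeated-halving argument above, done properly.
import math
import random

def transmitted_fraction(tau: float, n_photons: int = 20_000) -> float:
    """Fraction of photons escaping the far side of a slab of optical depth tau."""
    escaped = 0
    for _ in range(n_photons):
        depth, mu = 0.0, 1.0  # photons enter travelling straight into the slab
        while 0.0 <= depth <= tau:
            step = -math.log(1.0 - random.random())  # free path in optical-depth units
            depth += mu * step
            mu = random.uniform(-1.0, 1.0)           # isotropic scattering direction
        if depth > tau:
            escaped += 1
    return escaped / n_photons

for tau in (0.1, 1.0, 10.0):
    print(f"tau = {tau:4.1f}: transmitted ~ {transmitted_fraction(tau):.3f}")
```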

It is a bit similar to why, here on Earth, you have little light from the Sun under a dense cloud layer compared to a clear sky. If you have ever been in an airplane five miles up, above the clouds, you realize that the light missing below the clouds is in fact reflected from their tops back into space, making the clouds appear blindingly white. The solar atmosphere is just the reverse situation (if you could take a spectrum from below the layer responsible for the Fraunhofer lines, looking upwards, you would see all those lines in emission).

Edit: The following diagram (taken from https://courses.lumenlearning.com/astronomy/chapter/formation-of-spectral-lines/) illustrates what happens here:

[Diagram: a continuum source shining through a gas cloud; viewed through the cloud the spectrum shows absorption lines, while the cloud seen from other directions shows the same lines in emission]

The only specific difference here is that the geometry of the scattering layer is different: more like an infinitely extended plane-parallel layer than the roughly cylindrical cloud in the diagram. So in this case you can see the emission-line (bright-line) spectrum only from underneath the solar layer producing the absorption lines, looking upwards (this is the emission the OP was missing in the absorption spectrum). From all other directions you always see, for obvious geometrical reasons, the continuum source behind it (which you have to assume is an extended plane layer as well) and thus the absorption spectrum.

Edit 2: Note that the accepted answer above is incorrect. It claims to describe the scattering of radiation, but the quoted equation effectively neglects the scattering source term when it later associates the source term with the thermal black-body term in order to bring in the temperature argument. The correct equation is (see http://irina.eas.gatech.edu/EAS8803_Fall2017/petty_11.pdf) $$\frac{dI_\nu}{ds} = -\beta_e I_\nu + \beta_e\left[(1-\tilde\omega)\,B_\nu(T) + \frac{\tilde\omega}{4\pi}\int_{4\pi} p(\hat\Omega',\hat\Omega)\, I_\nu(\hat\Omega')\, d\Omega'\right]\ .$$ Note that $\beta_e$ is here the combined absorption/scattering (extinction) coefficient going into the loss term (with the minus sign), $B_\nu(T)$ is the Planck function, $p$ is the scattering phase function, and $\tilde\omega=\beta_s/\beta_e=\beta_s/(\beta_a+\beta_s)$ is the relative contribution of scattering to the extinction. This means that for pure scattering we have $\tilde\omega=1$ and the thermal black-body term vanishes. The temperature argument given in the accepted answer above is therefore not applicable in this case. It is clear from this that the thermal emission is only related to the continuum absorption, which however (a) is negligible in the visible region above the photosphere and (b) cannot produce absorption lines anyway, whether there is a temperature gradient or not.
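A minimal sketch of that bookkeeping (my own, assuming isotropic scattering so that the phase-function integral reduces to the mean intensity $J_\nu$):

```python
# Sketch: the source function S = (1 - w) * B(T) + w * J for isotropic
# scattering, where w is the single-scattering albedo (omega-tilde above).
# B and J are illustrative normalized values, not solar numbers.
def source_function(w: float, planck_b: float, mean_j: float) -> float:
    return (1.0 - w) * planck_b + w * mean_j

B, J = 1.0, 0.4  # assumed: thermal term vs direction-averaged radiation field
for w in (0.0, 0.5, 0.9, 1.0):
    print(f"omega_tilde = {w:3.1f}: S = {source_function(w, B, J):.2f}")
# At omega_tilde = 1 (pure scattering) the thermal black-body term drops out:
# the source function depends only on the radiation field, not on T.
```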

So absorption lines can only be produced by resonance scattering, as already qualitatively explained by the colour illustration above. In this respect I have made some explicit numerical calculations with my own radiative transfer program (reproduced at https://www.plasmaphysics.org.uk/programs/plantrans.htm), modified somewhat to show the actual line profile rather than frequency-integrated intensities.

This is what you get when a mono-directional continuum source falls from one side onto an isothermal, purely scattering, plane-parallel layer with a line-centre optical depth $\tau=10$ (assuming a Doppler (Gaussian) scattering emissivity), for the transmitted line at the other end (looking vertically into the layer and including the continuum source):

[Plot: transmitted line profile for $\tau=10$]

and this is what is vertically reflected back towards the continuum source:

[Plot: reflected line profile for $\tau=10$]


Here is the same for an optical depth $\tau=100$ instead:

[Plot: transmitted line profile for $\tau=100$]

[Plot: reflected line profile for $\tau=100$]

If one looks at the actual numerical scale of the graphs, it is obvious that the amount reflected back does not fully account for the amount missing from the continuum on the other side. This is simply because these plots hold for a fixed (vertical) viewing direction only, and are furthermore normalized to a solid angle of 1 steradian (which is only $1/(2\pi)$ of the full half-space the radiation is scattered back into). If one added up the back-scattered radiation over the complete half-space, taking into account that the line shape and intensity vary with the viewing direction, it would exactly account for the radiation missing from the transmitted spectrum. The question the OP had can only be answered in this way.
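That half-space bookkeeping can be sketched as follows (my own illustration; the angular profile $I(\mu)$ is an assumed toy function, not output from the program above): a single-direction, per-steradian plot understates the total, which requires weighting by $\mu$ and integrating over the full $2\pi$ of exit directions.

```python
# Sketch: reflected flux = 2*pi * integral_0^1 I(mu) * mu dmu, compared with
# the per-steradian intensity in one (vertical) direction that a plot shows.
import math

def reflected_intensity(mu: float) -> float:
    """Assumed toy angular profile of back-scattered intensity (illustrative)."""
    return 0.3 * (1.0 + 0.5 * mu)  # slightly brighter towards the vertical

n = 1_000  # midpoint rule over mu = cos(exit angle)
flux = 2.0 * math.pi * sum(
    reflected_intensity((i + 0.5) / n) * ((i + 0.5) / n) / n for i in range(n)
)

print(f"vertical intensity (per sr): {reflected_intensity(1.0):.3f}")
print(f"total half-space flux:       {flux:.3f}")  # much larger than the per-sr value
```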

$\endgroup$
  • This isn't correct. Outward-scattered light will escape from the layers where the photospheric absorption lines are formed. That is what the photosphere is. The last part is correct, but happens because of the temperature gradient, not because anything is acting like a mirror.
    – ProfRob
    Commented Sep 13, 2020 at 20:31
  • In the end half of the photons will end up going out, half will end up going deeper into hotter layers, and the energy likely comes back in a different form. – Commented Sep 14, 2020 at 2:00
  • @Rob Jefferies If you have an atmospheric layer that is sufficiently dense, its albedo (reflectivity) will be close to 1. Any light falling on it from one side will effectively be reflected as if from a mirror. Very little will come through to the other side. In this case the reflectivity is frequency dependent, being very small in the continuous part of the spectrum but very high at the line frequencies (because the atoms have resonances there). Therefore the latter won't get through but are reflected back. Temperature gradients have nothing to do with it. // See also my edited post above
    – Thomas
    Commented Sep 14, 2020 at 19:00
  • @Loren Pechtel There is much more than just half of the photons missing in the absorption lines.
    – Thomas
    Commented Sep 14, 2020 at 19:04
  • Perhaps you should think about how heat is transported by radiation. It certainly doesn't happen by reflecting photons back into the Sun. It happens by more radiation travelling outwards than inwards. The Sun is totally opaque at all wavelengths just below the photosphere, so by your reasoning light would never escape. The analogy to clouds is quite false.
    – ProfRob
    Commented Sep 14, 2020 at 20:18
