I have read that the reason the Sun produces an absorption spectrum is that the temperature drops as you move away from the center, so that as the various layers of the Sun's atmosphere absorb certain wavelengths, the re-emitted light has a smaller intensity than the absorbed light, causing a dip in the spectrum (i.e., an absorption spectrum). This is consistent with Kirchhoff's laws and Planck's law for blackbody radiation. However, I checked and found that as you go from the photosphere into the chromosphere and corona, the temperature rises instead. So why doesn't the opposite happen, where instead of dips in the spectrum we get peaks (i.e., an emission spectrum)?
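To make the Kirchhoff/Planck argument above concrete, here is a minimal sketch of my own (the two temperatures are assumptions for illustration, roughly ~5800 K for the deep photosphere and ~4500 K for a hypothetical cooler overlying layer, not quoted measurements) showing that at a fixed wavelength the cooler layer re-emits only a fraction of the intensity:

```python
import math

# Physical constants (SI units, CODATA exact values)
h = 6.62607015e-34   # Planck constant, J s
c = 2.99792458e8     # speed of light, m/s
k = 1.380649e-23     # Boltzmann constant, J/K

def planck(wavelength_m, T):
    """Planck spectral radiance B(lambda, T) in W sr^-1 m^-3."""
    x = h * c / (wavelength_m * k * T)
    return (2.0 * h * c**2 / wavelength_m**5) / math.expm1(x)

lam = 500e-9  # 500 nm, in the visible band
hot = planck(lam, 5800.0)   # assumed deep-photosphere temperature
cool = planck(lam, 4500.0)  # assumed cooler overlying layer
print(cool / hot)  # fraction of the intensity the cooler layer re-emits
```

Since B(λ, T) increases monotonically with T at fixed λ, a cooler re-emitting layer always fills the line back in at a lower intensity than it removed, which is exactly the dip.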
(I will note that there is a temperature drop within the photosphere itself, but to my understanding the photosphere is opaque to all wavelengths, meaning that it can't pick out the specific wavelengths of the absorption spectrum.)
Now, I already have some doubts about whether the logic above applies here. It seems to me that one of the assumptions being made is that the energy absorbed by the various layers is first distributed among the atoms in those layers, so that the re-emitted light that comes out is ordinary thermal/blackbody radiation. However, considering how dilute the chromosphere and corona are, it would seem that that isn't the case. Is it more correct to say that the radiation is absorbed and re-emitted (i.e., scattered) so many times on its journey through the Sun's atmosphere that it continually loses small "bits" of energy, so that by the time it reaches us the intensity of the re-emitted light is much smaller than that of the absorbed light (causing a dip in the spectrum)? Or is the gas perhaps so dilute that even that wouldn't work? If not, then again, why do we see an absorption spectrum?
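For what it's worth, the "scattered out of the line of sight" picture can be quantified with the Beer-Lambert law. The toy calculation below is my own, with entirely hypothetical order-of-magnitude inputs for number density, cross-section, and path length (not measured solar values); it only illustrates how the optical depth τ = nσL controls whether a dilute layer can imprint a dip at all:

```python
import math

def optical_depth(n, sigma, L):
    """tau = n * sigma * L, for number density n (m^-3),
    absorption/scattering cross-section sigma (m^2), path length L (m)."""
    return n * sigma * L

def transmitted_fraction(tau):
    """Beer-Lambert law: fraction of the incident beam surviving a layer
    of optical depth tau without being absorbed or scattered."""
    return math.exp(-tau)

# Hypothetical inputs (NOT measured solar values), chosen only to
# contrast a "dense enough" layer with a very dilute one.
sigma = 1e-21  # m^2, assumed cross-section at the line wavelength
L = 1e6        # m, assumed layer thickness
for n in (1e16, 1e10):  # m^-3: denser layer vs. dilute layer
    tau = optical_depth(n, sigma, L)
    print(n, tau, transmitted_fraction(tau))
```

With these made-up numbers the denser layer has τ ≈ 10 and blocks essentially the whole beam at that wavelength (a deep dip), while the dilute layer has τ ≈ 1e-5 and transmits almost everything (no visible line), which is the quantitative version of the "is the gas too dilute?" worry.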