
We can derive the minimum mass of a star in the main sequence using the minimum temperature $T_C$ that triggers hydrogen burning in the core. Then using the stellar structure equations we can derive how the temperature in the core of a generic star is related to its mass and obtain:

$$ T_{core} \propto M^{\frac{4}{7}} \iff \frac{T_{core}}{T_{core,\odot}} = \left(\frac{M}{M_{\odot}}\right)^{\frac{4}{7}} $$

If we insert the minimum core temperature for hydrogen fusion we get an approximation for the minimum stellar mass, but how can I predict $T^{\rm minimum}_{\rm core}$?

In many books I see that $10^{6}$ K is used, but why?
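As a sanity check, the scaling relation can be inverted in a few lines of Python (assuming the known solar core temperature $T_{{\rm core},\odot}\simeq 1.5\times 10^7$ K) to see what a given minimum temperature would imply:

```python
# Invert the scaling T_core/T_core,sun = (M/M_sun)^(4/7)
# => M/M_sun = (T_core/T_core,sun)^(7/4)
T_CORE_SUN = 1.5e7  # K, approximate solar core temperature

def min_mass(t_min):
    """Minimum stellar mass (in solar masses) implied by a
    minimum core temperature t_min (in K)."""
    return (t_min / T_CORE_SUN) ** (7 / 4)

print(min_mass(1e6))  # ~0.009 solar masses
```

With $10^6$ K this gives roughly $0.009M_\odot$, which seems far too small for a star.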

  • If I use the known $T_{{\rm core},\odot}\simeq 1.5\times 10^7$ K and your value for $T_{\rm core}$ then I get a minimum mass of $0.008M_\odot$, which is an order of magnitude too small. – ProfRob, Aug 24, 2023 at 6:27
  • I suspect you have seen $10^6$ K as the minimum temperature for fusion (of deuterium). – ProfRob, Aug 24, 2023 at 19:21

1 Answer


The minimum core temperature at which hydrogen fusion is significant in the lowest mass stars is $\simeq 3\times 10^6$ K. Whilst you can handwave that there must be some smallest core temperature and explain why (see below), the only way to get an accurate number is from a numerical stellar evolutionary model. Inserting this core temperature into your scaling law, together with the known $T_{{\rm core},\odot}\simeq 1.5\times 10^7$ K, gives $M_{\rm min}\simeq 0.06M_\odot$. This is not a bad estimate; a more accurate value is $0.075M_\odot$ (see below).
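The arithmetic here can be checked directly; a minimal Python sketch using the temperatures quoted above:

```python
T_CORE_SUN = 1.5e7  # K, approximate solar core temperature
T_CORE_MIN = 3e6    # K, minimum core temperature for significant H fusion

# Invert the scaling law: M/M_sun = (T_core/T_core,sun)^(7/4)
m_min = (T_CORE_MIN / T_CORE_SUN) ** (7 / 4)
print(f"M_min = {m_min:.2f} M_sun")  # M_min = 0.06 M_sun
```

This reproduces the $0.06M_\odot$ estimate, within a factor of order unity of the accepted $0.075M_\odot$.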

Details

Can you estimate the minimum mass for "starhood" using some minimum temperature to initiate nuclear fusion?

It isn't as simple as that: there isn't a minimum temperature for nuclear fusion - the reaction rate simply scales with density and (a high power of) temperature. The cores of low-mass stars are denser than the solar core but cooler, and as a result they have a much lower luminosity.

The virial theorem tells us that as a ball of gas contracts, half the change in gravitational potential energy is radiated away and half goes into heating the interior, i.e. the core gets hotter.

The dividing line between a star and a brown dwarf is defined by whether the core of a contracting ball of gas ever reaches a temperature such that it supplies the luminosity of the object (basically replacing the rate of gravitational potential energy loss due to contraction) or whether the contraction is halted at high densities by electron degeneracy pressure, which is independent of temperature. If the latter, we have a brown dwarf that will then just cool down at an ever decreasing rate whilst maintaining an almost constant radius (about the size of Jupiter).

Working out where the dividing line is, is not simple and requires a numerical model of the contracting gas. The picture below illustrates these arguments. It shows the core temperature versus time for objects of various mass. These are from a numerical model due to Chabrier & Baraffe (2000).

The cores start cool and get hotter as the ball of gas contracts. Looking at the top curves, we see higher mass objects reach some plateau. This is where nuclear fusion of hydrogen is established and supplies all the luminosity of the star. However, at 0.07 solar masses, we see the core temperature peaks. This is because the core has become electron degenerate, halting the contraction before nuclear fusion is significant and the core just cools thereafter.

[Figure: core temperature of stars and brown dwarfs versus time, from Chabrier & Baraffe (2000)]

From this plot you can see that the core temperature of the lowest mass star will be about $10^{6.5}\simeq 3\times 10^6$ K at just above $0.07M_\odot$.

