$\begingroup$

So I'm reading this book, where after the preface and before the models there is a section called General Notions and Essential Quantities, which introduces some things I don't understand. They concern different temperatures of one system, especially in a non-equilibrium or near-equilibrium state.


First, a direct quote (pages 17–18, section "VI Particle Distribution over Velocities and Energy: Temperatures of Different Degrees of Freedom"):

  • "The Maxwellian distribution over translational energy $\epsilon$ of particle motion is represented by $$f(\epsilon)=\ ...$$ The Boltzmann distribution for the population $N_i$ of the $i$th energy level relative to the population $N_0$ of the ground energy level $E_0$ is $$\frac{N_i}{N_0}=\frac{g_i}{g_0}\exp\left(-\frac{E_i-E_0}{kT}\right)$$ and relative to the total particle number $N$ of the given species is $$\frac{N_i}{N}=\frac{g_i\exp\left(-\frac{E_i}{kT}\right)}{Q},\qquad Q=\sum g_i\exp\left(-\frac{E_i}{kT}\right).$$ Maxwellian and Boltzmann distributions determine the temperature of the considered system. In an equilibrium system the temperatures of the different degrees of freedom (translational, rotational, vibrational, electronic) are equal. In a nonequilibrium system involving the subsystems of the indicated degrees of freedom there is no single temperature. If in any subsystem the velocity distribution or the energy distribution may be approximated by Maxwellian or Boltzmann functions, these functions determine the temperatures of the appropriate degrees of freedom.
    • Translational temperature (gas temperature, or temperature of translational degrees of freedom): This is the parameter of the Gibbs canonical distribution of particles over velocities and energy of the translational motion of particles. It is represented by the quantity $T$ in the Maxwellian distribution as previously described.
    • Rotational temperature (temperature of rotational degrees of freedom): This is the parameter of the Gibbs canonical distribution of molecules over rotational energy. It is represented by the quantity $T_r\equiv T_R$ in the Boltzmann distribution for the population $N^r_i$ of the $i$th rotational level: $$\frac{N_i^r}{N_0^r}=\frac{g_i^r}{g_0^r}\exp\left(-\frac{E_i^r}{k T_r}\right),\qquad \frac{N^r_i}{N^r} = \frac{g_i^r \exp\left(-\frac{E_i^r}{k T_r}\right)}{Q(T_r)}.$$
    • Vibrational temperature: ..."

This is then also followed by several partition functions $Q_i$, with $i=t,r,v,e$ and the formulas $$\epsilon=\sum\epsilon_k,\ \ \ \ Q=\prod Q_k.$$
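To make the bookkeeping concrete for myself, here is a minimal Python sketch (my own, not the book's; the two-level system and its 0.1 eV spacing are made up) of the Boltzmann populations $N_i/N$ and the partition function $Q$:

```python
import math

k_B = 8.617e-5  # Boltzmann constant in eV/K

def boltzmann_populations(levels, degeneracies, T):
    """Relative populations N_i/N for energy levels E_i (eV) at temperature T (K)."""
    # Boltzmann weights g_i * exp(-E_i / kT)
    weights = [g * math.exp(-E / (k_B * T)) for E, g in zip(levels, degeneracies)]
    Q = sum(weights)  # partition function Q = sum_i g_i exp(-E_i / kT)
    return [w / Q for w in weights]

# Hypothetical two-level system: ground state plus one level 0.1 eV above it
pops = boltzmann_populations([0.0, 0.1], [1, 1], T=1000.0)
print(pops)  # the fractions sum to 1; the excited fraction grows with T
```

The ratio `pops[1] / pops[0]` reproduces the book's $N_i/N_0 = (g_i/g_0)\,e^{-(E_i-E_0)/kT}$ form directly.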

Wikipedia knows such quantities ($Q_t$, $Q_v$, $Q_r$, $T_v$, $T_r$) but doesn't explain much about them.


Now of course I could just give the name "temperature" to every combination of quantities which happens to have energy as a unit, but I have a problem with the possibility of defining something like a new temperature (and its own partition function), given that they are supposed to coincide in total equilibrium.

Viewed especially from the microcanonical ensemble, one defines temperature $$\frac{1}{T}=\frac{\partial S(E,V)}{\partial E},$$ which as a variable is fixed by a single number (the curse of non-equilibrium thermodynamics). Say I break my system into parts as suggested above, and it actually turns out that these aspects of the problem are describable by the distributions of an equilibrium system (Maxwell, Boltzmann, Gibbs). What is the temperature of a subsystem, and how do I get to it? Is the plan to define objects which generate the variables $T_k$, the way $T$ is conjugate to $E$ in the sense of the entropy formula stated above? The counting of possible configurations (say, over vibrational degrees of freedom), and therefore the associated entropy function, should depend heavily on the type of degree of freedom, and then it wouldn't feel the functional dependences of the rest of the model.

Why would the values of the different $T_k$ coincide in the perfect equilibrium limit? Especially regarding translational degrees of freedom in the Boltzmann theory, when you consider spatially varying temperatures $\theta(\vec x)$: what would they have in common with a temperature derived from a counting of vibrational degrees of freedom? Is there a concept of a derivative with respect to just one aspect of the energy (translational/kinetic, rotational, vibrational, ...)? The partition functions seem to depend on these separated energies after all, and the computed expressions on Wikipedia look very distinct. I also don't see how in this limit the factors $\frac{1}{Q(T_k)}$ would suddenly join into one big partition function. How does this partitioning of the system work anyway?

I also have a problem with how to get the different temperatures from the partition function. In practice, I can solve an (empirical) equation of state like $pV=NkT$ for the temperature if I know $p$, $V$ and $N$. But how can I compute the different temperatures from the associated, more general expressions in statistical mechanics, when knowing those expressions already presupposes a functional dependence on the temperature of a bath?

$\endgroup$

3 Answers

$\begingroup$

The definition of temperature through Maxwellian and Boltzmann distributions has certain problems in quantum mechanics.

In thermodynamics temperature is usually defined through the derivative of entropy as you say: $$ \frac{1}{T} = \frac{\partial S(E,\mathbf{V})}{\partial E}. \qquad (1) $$

The division of the system into different parts (or different degrees of freedom) can be understood from the microcanonical distribution. Let the system have a Hamiltonian of the following form: $$ H = H(\mathbf{q}, \mathbf{p}, \mathbf{V}); $$ where $\mathbf{q}$ and $\mathbf{p}$ are the vectors of microscopic generalized coordinates and momenta respectively, and $\mathbf{V}$ is the vector of macroscopic parameters that are constant (on average) in equilibrium.

The dimension of $\mathbf{q}$ and $\mathbf{p}$ is the number of degrees of freedom of the system. Note that degrees of freedom of the same type (e.g. translation along the $x$ axis) belonging to different particles are different degrees of freedom. The set of $(\mathbf{q},\mathbf{p})$ pairs is the phase space of the system.

The distribution function for the system is $$ f(\mathbf{q},\mathbf{p}) = \frac{ \delta\bigl( E - H(\mathbf{q}, \mathbf{p}, \mathbf{V}) \bigr) }{\Omega(E, \mathbf{V})}; $$ where $E$ is the internal energy and $\Omega(E, \mathbf{V})$ is the phase density of states, i.e. the number of accessible microscopic states for given $E$ and $\mathbf{V}$: $$ \Omega(E, \mathbf{V}) = \int \delta\bigl( E - H(\mathbf{q}, \mathbf{p}, \mathbf{V}) \bigr) d\mathbf{q} d\mathbf{p}. $$ The entropy is $$ S(E, \mathbf{V}) = \ln \Omega(E, \mathbf{V}) $$ (in units where $k_B = 1$).
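As a toy check of these definitions (my own sketch, not part of the answer): for a monatomic ideal gas $\Omega(E) \propto E^{3N/2 - 1}$, so $S = \ln \Omega$ and $1/T = \partial S/\partial E$ recover equipartition, $T \approx 2E/3N$. A finite-difference version in $k_B = 1$ units:

```python
import math

def entropy(E, N):
    # Monatomic ideal gas, k_B = 1: Omega(E) ~ E^(3N/2 - 1).
    # Volume- and mass-dependent prefactors are E-independent and
    # drop out of dS/dE, so they are omitted here.
    return (1.5 * N - 1) * math.log(E)

def temperature(E, N, dE=1e-3):
    # 1/T = dS/dE, evaluated with a central finite difference
    dS_dE = (entropy(E + dE, N) - entropy(E - dE, N)) / (2 * dE)
    return 1.0 / dS_dE

N, E = 10_000, 15_000.0
print(temperature(E, N))  # ~1.0, matching equipartition E = (3/2) N T
```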

Temperature of a subsystem

Let the system consist of two independent (non-interacting) subsystems. Then $$ \mathbf{q} = (\mathbf{q}_1, \mathbf{q}_2); \quad \mathbf{p} = (\mathbf{p}_1, \mathbf{p}_2); $$ $$ H(\mathbf{q}, \mathbf{p}, \mathbf{V}) = H_1(\mathbf{q}_1, \mathbf{p}_1, \mathbf{V}) + H_2(\mathbf{q}_2, \mathbf{p}_2, \mathbf{V}). \qquad (2) $$

NB:
The subsystems need not be separated spatially. They need not even consist of different particles. The only requirement is that the Hamiltonian have the form (2). We can put all translational coordinates into $\mathbf{q}_1$, rotational into $\mathbf{q}_2$, vibrational into $\mathbf{q}_3$, and so on. If the energy transfer (interaction) between the subsystems is negligible during some period of time, then expression (2) is correct for that period.

We can introduce distribution functions for each subsystem: $$ f_i(\mathbf{q}_i,\mathbf{p}_i) = \frac{ \delta\bigl( E_i - H_i(\mathbf{q}_i, \mathbf{p}_i, \mathbf{V}) \bigr) }{\Omega_i(E_i, \mathbf{V})}; $$ where $E_i$ is the internal energy of the subsystem.

The entropy of the subsystem is then $$ S_i(E_i, \mathbf{V}) = \ln \Omega_i(E_i, \mathbf{V}) $$ and the temperature is $$ T_i = \left( \frac{\partial S_i(E_i, \mathbf{V})}{\partial E_i} \right)^{-1}. \qquad (3) $$ This is the definition of the temperature of a subsystem (degree of freedom).

Temperatures in equilibrium

Since the subsystems are independent, the distribution function of the whole system is the product: $$ f(\mathbf{q},\mathbf{p}) = f_1(\mathbf{q}_1,\mathbf{p}_1)f_2(\mathbf{q}_2,\mathbf{p}_2); $$ and the total number of accessible states is: $$ \Omega(E_1, E_2, \mathbf{V}) = \Omega_1(E_1, \mathbf{V})\Omega_2(E_2, \mathbf{V}). $$ Hence the total entropy is $$ S(E_1, E_2, \mathbf{V}) = S_1(E_1, \mathbf{V}) + S_2(E_2, \mathbf{V}) \qquad (4) $$

If there is an interaction between the subsystems, internal energy is transferred from one subsystem to the other until equilibrium is reached. During this process the total energy is constant: $$ E = E_1 + E_2 = \text{const} $$ The energies of the subsystems change with time and take definite values in equilibrium. According to the 2nd law of thermodynamics, the total entropy is maximal in this state. The condition of the extremum is $$ \frac{\partial S(E_1, E_2(E, E_1), \mathbf{V})}{\partial E_1} = 0. $$ From (4), using $\partial E_2/\partial E_1 = -1$ at fixed $E$, we get: $$ \frac{\partial S(E_1, E_2(E, E_1), \mathbf{V})}{\partial E_1} = \frac{\partial S_1(E_1, \mathbf{V})}{\partial E_1} + \frac{\partial S_2(E_2, \mathbf{V})}{\partial E_2}\frac{\partial E_2}{\partial E_1} = $$ $$ \frac{1}{T_1} - \frac{1}{T_2} = 0 $$ or $$ T_1 = T_2. $$ One can prove that these temperatures are equal to $T$ defined as (1).
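A numerical illustration of this maximization (my sketch; the $S_i = c_i \ln E_i$ form models subsystems with constant heat capacities $c_i$ in $k_B = 1$ units, so that $T_i = E_i/c_i$):

```python
import math

def equilibrium_split(E_total, c1, c2, steps=100_000):
    # Maximize S(E1) = c1*ln(E1) + c2*ln(E_total - E1) over E1 by brute force.
    best_E1, best_S = None, -math.inf
    for k in range(1, steps):
        E1 = E_total * k / steps
        S = c1 * math.log(E1) + c2 * math.log(E_total - E1)
        if S > best_S:
            best_S, best_E1 = S, E1
    return best_E1

E, c1, c2 = 10.0, 3.0, 7.0
E1 = equilibrium_split(E, c1, c2)
T1, T2 = E1 / c1, (E - E1) / c2   # T_i = (dS_i/dE_i)^(-1) = E_i / c_i
print(T1, T2)  # entropy is maximal exactly where T1 = T2 (both ~1.0 here)
```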

$\endgroup$
  • $\begingroup$ +1: Yes, this is the correct answer. I was going to delete mine, but the additive temperature idea is close to the OP's confusion. $\endgroup$
    – Ron Maimon
    Commented Dec 23, 2011 at 19:49
$\begingroup$

In principle, temperature is defined only at equilibrium. In some cases, however, a system may be out of equilibrium and yet some of its degrees of freedom be in quasi-equilibrium. Consider for example a metal at low temperature where the electron-phonon coupling is small, and the electron-electron and phonon-phonon couplings are large. If you heat the electron bath (for example by Joule effect), you may end up in a situation where:

  • there is (almost) an equilibrium between all the electronic degrees of freedom
  • there is also a quasi-equilibrium between all the vibrational degrees of freedom
  • the electron and phonon baths are not in equilibrium with one another.

In this case, you can approximate the system as a set of two uncoupled (or weakly coupled) subsystems: the electrons and the phonons. Each subsystem is at equilibrium and therefore has a well-defined temperature, yet the whole system is not at equilibrium, and heat flows from the hotter subsystem to the colder one. If you let things relax, this heat flow will eventually make both temperatures equal; then you have reached global equilibrium.
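A sketch of that relaxation (mine, not the answerer's; the constant heat capacities $C_e$, $C_{ph}$, the coupling $G$ and all units are made up): integrating $C_e \dot T_e = -G(T_e - T_{ph})$ and $C_{ph} \dot T_{ph} = +G(T_e - T_{ph})$, both temperatures converge to the heat-capacity-weighted average of the initial ones:

```python
def relax(T_e, T_ph, C_e, C_ph, G, dt=1e-3, steps=20_000):
    # Two-temperature model: heat flows from the hotter subsystem to the
    # colder one until T_e == T_ph.  Forward-Euler integration; all
    # quantities are in arbitrary units and the C's are held constant.
    for _ in range(steps):
        q = G * (T_e - T_ph) * dt  # heat transferred in this time step
        T_e -= q / C_e
        T_ph += q / C_ph
    return T_e, T_ph

T_e, T_ph = relax(T_e=300.0, T_ph=100.0, C_e=1.0, C_ph=3.0, G=1.0)
print(T_e, T_ph)  # both -> (C_e*300 + C_ph*100)/(C_e + C_ph) = 150
```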

$\endgroup$
  • $\begingroup$ Thanks for the answer, although it's a bit too informal for me to understand. If the two aspects of the system are coupled, I don't get the notion of two baths: Say they have temperature $T_e, T_{ph}$ (are they just numbers from outside?). These baths have an affect on the electron and photon subsystems and therefore "Each subsystem is at equilibrium ... has a well defined temperature ... yet the whole system is not at equilibrium and heat flows". When the systems relaxes and I approach a single temperature, which temperature is it? I mean given that the baths are still at $T_e, T_{ph}$. $\endgroup$
    – Nikolaj-K
    Commented Dec 20, 2011 at 10:17
  • $\begingroup$ “These baths have an affect on the electron and photon subsystems”. No, the baths are the electron and phonon (not photon) subsystems. “When the systems relaxes and I approach a single temperature, which temperature is it?” The electrons lose heat, so $T_e$ decreases. The phonons gain heat, so $T_{ph}$ increases. The final $T_e = T_{ph}$ will be an average of the two initial temperatures, weighted by the heat capacities of the electron and phonon subsystems (neglecting the temperature dependence of the heat capacities). $\endgroup$ Commented Dec 20, 2011 at 10:32
  • $\begingroup$ So you're saying there is an energy transfer until the subsystems with different kinds of energy dependencies equalize w.r.t. temperatures. This is then equivalent to neighboring systems of different temperatures. Why do you call them baths, if there is nothing else? Usually I'd visualize a bath imposing a constant temperature (like through the Joule effect) on my system of interest, not as the system itself. And in equilibrium, does the temperature computed from $Q$ or $S_{total}(E)$ equal the one computed from the partial partition function $Q_k$ or $S_k$ w.r.t an aspect of the energy $E_k$? $\endgroup$
    – Nikolaj-K
    Commented Dec 20, 2011 at 10:58
  • $\begingroup$ Yes, this is equivalent to neighboring systems of different temperatures. I call them baths by habit, as they may act as baths w.r.t. some other degree of freedom I may be interested in (e.g. the magnetization...). Maybe I should not use the word “bath”... And yes, at equilibrium $S = \sum S_k$, $Q = \prod Q_k$ and $T = T_k$ for any $k$. This works because we are neglecting the interactions between subsystems. These interactions should be small anyway in order for the different temperatures to make sense. $\endgroup$ Commented Dec 20, 2011 at 11:50
  • $\begingroup$ So you say the "write down the different entropies according to the different systems with their own degrees of freedom" approach works. This way I get $T_k$ from $S_k(E_k)$. But given $E_k$ for the sub-system, how do we know the partition function presented here? I.e. what is the $\tfrac{1}{kT}$ term in the "rotational partition function" representing a priori? There is no outside temperature in our case. I don't see why you'd use the canonical system scenario here, which deals with an external bath temperature. How to compute the different temperatures from the different partition functions? $\endgroup$
    – Nikolaj-K
    Commented Dec 20, 2011 at 12:40
$\begingroup$

The terminology the book is using is not standard and not good. The correct way to express the relation between energy and temperature is that, at any given temperature, the energy in each mode is proportional to the temperature.

Then all your questions become moot. The energy of a rotational mode, vibrational mode, and so on are each proportional to the temperature, and you add up all the energies to get the total energy in the system at any given temperature. You do not add up the temperatures of each mode to get a "total temperature"; that concept is absurd.

There is an anecdote in "Surely You're Joking, Mr. Feynman!" where Feynman is reviewing textbooks for California, and a textbook problem asks "What is the total temperature of these stars?" as an example of where addition is useful in science. It seems that not only elementary books contain this bit of silliness.

The energies add, not the temperatures. The energy at any given temperature is $T$ (in Boltzmann units, $k_B=1$) in each harmonic oscillator, $T/2$ in every free-particle direction of motion, and more generally, it is the average of an arbitrary $H(p,q)$ over the distribution $e^{-H/T}$ on the entire phase space.
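A quick Monte Carlo check of that statement (my sketch, $k_B = 1$, made-up unit mass and frequency): sampling a single harmonic oscillator from $e^{-H/T}$, the average energy comes out as $T$, i.e. $T/2$ per quadratic term:

```python
import random

def average_energy_harmonic(T, m=1.0, omega=1.0, samples=200_000):
    # Sample (q, p) from exp(-H/T) with H = p^2/(2m) + m*omega^2*q^2/2.
    # Both marginals are Gaussian: p ~ N(0, m*T), q ~ N(0, T/(m*omega^2)).
    rng = random.Random(0)  # fixed seed so the run is reproducible
    total = 0.0
    for _ in range(samples):
        p = rng.gauss(0.0, (m * T) ** 0.5)
        q = rng.gauss(0.0, (T / m) ** 0.5 / omega)
        total += p * p / (2 * m) + 0.5 * m * omega**2 * q * q
    return total / samples

print(average_energy_harmonic(T=2.0))  # ~2.0: T/2 from p^2 plus T/2 from q^2
```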

EDIT: The book is not stupid! My mistake.

I understand now what the book is getting at: the system can be out of equilibrium between different sectors, but in equilibrium within each sector separately. In this case you have a different temperature describing the translational/rotational/nuclear-spin (whatever) degrees of freedom. Your confusion came from treating these essentially separate systems as one system with an additive temperature.

If the rotational and translational motion of molecules do not interact very much (I can't think of when this would ever happen), you can treat the rotational system and the translational system as two separate systems, like two separated blocks of wood. Heat can flow from translational to rotational and back, restoring global equilibrium, but this is slower than the heat flow from rotational to rotational and from translational to translational (again, I know of no systems which separate rotation and translation like this). Then you can have an effective description in which the temperature is different for the two parts of the system.

This is rare. Usually separate systems have to be physically separate. The only example of degrees of freedom which are separate enough to reach their own separate equilibrium is the nuclear spins. These can have a separate temperature (even a negative temperature, meaning a very hot system where large energies are more likely than small ones) without leaking heat into the electronic degrees of freedom. The reason is that the nuclear spin/electron spin interaction is small, because the chemical and nuclear scales are well separated.
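The negative-temperature remark can be made quantitative by inverting the two-level Boltzmann ratio (a sketch of mine, $k_B = 1$, hypothetical populations): $N_1/N_0 = e^{-\Delta E/T}$ gives $T = -\Delta E / \ln(N_1/N_0)$, which is negative exactly when the populations are inverted:

```python
import math

def spin_temperature(n_excited, n_ground, delta_E=1.0):
    # Invert N1/N0 = exp(-delta_E / T) (k_B = 1): T = -delta_E / ln(N1/N0).
    # Population inversion (N1 > N0) makes the log positive, hence T < 0.
    return -delta_E / math.log(n_excited / n_ground)

print(spin_temperature(1, 2))  # positive T: ground state more populated
print(spin_temperature(2, 1))  # negative T: inverted, "hotter" than any T > 0
```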

$\endgroup$
