$\begingroup$

Take an empty container and fill it with $N$ gas particles (ideally a monoatomic gas), each having the same kinetic energy $E$, then isolate the container. Since initially the speeds don't follow the Maxwell-Boltzmann distribution, such a system cannot be in thermodynamic equilibrium. On the other hand, assuming perfectly elastic collisions (and there is no reason to assume otherwise, since the only form of energy the particles can possibly have is kinetic), I see no way such a system could spontaneously evolve to equilibrium: elastic collisions among equal masses keep speeds unchanged! What gives?

I have no background in non-quasistatic processes, but I tried nonetheless to work out a solution taking into account the container, which necessarily has a certain heat capacity, a certain initial temperature, and whose walls are not necessarily perfectly elastic, etc. Knowing the number of particles and their individual speeds, it's possible to compute the system's total heat content (but is this exactly $NE$, or less?) and thus derive its equilibrium temperature. Since the system is isolated, I take it the quantity that has to change must be entropy (namely increase, as the uniform-speed state seems less likely; the change can probably be arrived at from a strictly combinatorial point of view). At any rate, the process I imagined goes like this: initially, the particles bombarding the wall transfer some amount of heat to it while slowing down; in turn the wall, now heated up, will transfer back some heat to the gas; eventually, the system will reach the expected equilibrium.

Is my assumption of perfectly elastic collisions wrong, and if so, where does the dissipated energy go?

Is there an increase in temperature that accompanies the increase in entropy?

Can someone point me to the rigorous mathematical framework for analyzing the problem?

Is there direct experimental evidence that the speed of gas particles attains the Maxwell-Boltzmann distribution, or is it just a theoretical result that everyone is just happy to work with?

Thanks for any suggestion.

$\endgroup$
  • $\begingroup$ Elastic collisions do not keep speeds unchanged unless in 1D. This allows for energy transfer and thermalization. $\endgroup$
    – fffred
    Commented Sep 10, 2013 at 23:28
  • $\begingroup$ Right, I overlooked the general case of non-head-on collisions. Still, how about the entropy? And the last question? All the energy comes from the kinetic energy of the particles, but since the initial system has some level of organization, not all the energy can be regarded as heat. Is it correct to say that the temperature increases? What's the appropriate approach in terms of free energy, etc? $\endgroup$
    – suissidle
    Commented Sep 11, 2013 at 0:31
  • $\begingroup$ @fffred an elastic collision between two hard spheres or disks of equal mass and radius, and equal initial kinetic energies, does conserve the speeds. The collision can't change the total energy, and (from symmetry) it can't change the distribution of energy between the two particles. So after the collision both particles must have the same energy, and hence the same speed, that they started with. $\endgroup$
    – N. Virgo
    Commented Sep 11, 2013 at 9:46
  • $\begingroup$ @fffred I've just seen my error - I was considering only the case where the two particles' momenta sum to zero - I think you're right after all. I will correct my answer later. $\endgroup$
    – N. Virgo
    Commented Sep 11, 2013 at 12:52
  • $\begingroup$ Nice video here (simulation of 2D isolated gas of hard spheres relaxing to Maxwell-Boltzmann) youtube.com/… see also: physics.stackexchange.com/q/769018/226902 for the 1D case. $\endgroup$
    – Quillo
    Commented Jun 21, 2023 at 12:04

2 Answers

$\begingroup$

WetSavannaAnimal aka Rod Vance has given a good introduction to the issues involved.

When I originally wrote an answer, I thought that you were right: if you had a perfectly ideal system of perfectly hard spheres, with perfectly elastic collisions, in a container with perfectly rigid walls, and if all the particles started out with exactly the same speed, then the velocities could not evolve into a Maxwell-Boltzmann distribution, because I thought there was no process that could make the velocities become non-equal. However, I've realised I was wrong about that. For example, consider this collision:

[Diagram: two equal-mass particles with equal speeds approaching each other off-centre; total $x$-momentum zero, total $y$-momentum positive]

The total $x$-momentum is zero but the total $y$-momentum is positive. This must be the case after the collision as well, so the motion must end up looking like this

[Diagram: the same two particles after the collision; the top particle now moves faster and the bottom particle slower, with the total momentum unchanged]

with the top particle having gained some kinetic energy while the bottom particle loses some. Through this type of mechanism the initially equal velocities will rapidly become unequal, and the system will converge to a Maxwell-Boltzmann distribution just by transferring energy between particles, with no need for energy to enter or leave the system.
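
This mechanism is easy to check numerically. Below is a minimal sketch using the standard equal-mass hard-sphere collision rule (the velocity components along the line of centres are exchanged, the tangential components kept); the initial velocities are arbitrary, chosen only so that both particles start with the same speed:

```python
import math

def collide(v1, v2, n):
    """Elastic collision of equal-mass hard spheres: the velocity
    components along the unit line-of-centres vector n are exchanged;
    the tangential components are unchanged."""
    dot = (v1[0] - v2[0]) * n[0] + (v1[1] - v2[1]) * n[1]
    return ((v1[0] - dot * n[0], v1[1] - dot * n[1]),
            (v2[0] + dot * n[0], v2[1] + dot * n[1]))

speed = lambda v: math.hypot(*v)

# Two particles with the SAME speed (5) colliding along the y axis:
v1, v2 = (4.0, -3.0), (0.0, 5.0)
w1, w2 = collide(v1, v2, (0.0, 1.0))

print(speed(v1), speed(v2))   # 5.0 5.0 before the collision
print(speed(w1), speed(w2))   # ~6.40 and 3.0 -- the speeds now differ
# Total momentum and kinetic energy are conserved:
print(w1[0] + w2[0], w1[1] + w2[1])   # 4.0 2.0, same as before
print(speed(w1)**2 + speed(w2)**2)    # 50.0, same as before
```

Both particles start at speed 5; after the collision their speeds are $\sqrt{41} \approx 6.4$ and 3, while the total momentum and kinetic energy are exactly what they were before.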

However, it's still possible to imagine special initial conditions where this won't happen. For example, we can imagine that all the particles are moving in the same direction, exactly perpendicular to the chamber walls, and are positioned such that they will never collide. A system in this configuration will remain in this special state forever, and not go into a Maxwell-Boltzmann distribution.

However, such a special state is unstable, in that if you start out with even one particle moving in a slightly different direction than all the others, it will eventually collide with another particle, creating two particles out of line, which can collide with others, and so on. Soon all particles will be affected and the system will converge to the Maxwell-Boltzmann distribution.
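
This relaxation can be illustrated with a toy model (a sketch, not a proper hard-sphere simulation: random pairs of particles are collided along random lines of centres, which conserves the total kinetic energy while letting the individual speeds spread out from their initially equal value):

```python
import math
import random

def collide(v1, v2, n):
    # Equal-mass elastic collision: exchange the velocity
    # components along the unit line-of-centres vector n.
    dot = (v1[0] - v2[0]) * n[0] + (v1[1] - v2[1]) * n[1]
    return ((v1[0] - dot * n[0], v1[1] - dot * n[1]),
            (v2[0] + dot * n[0], v2[1] + dot * n[1]))

random.seed(0)
N = 1000

# Every particle starts with speed exactly 1, in a random direction.
vs = []
for _ in range(N):
    a = random.uniform(0, 2 * math.pi)
    vs.append((math.cos(a), math.sin(a)))

# Collide random pairs along random lines of centres.
for _ in range(20 * N):
    i, j = random.sample(range(N), 2)
    a = random.uniform(0, 2 * math.pi)
    vs[i], vs[j] = collide(vs[i], vs[j], (math.cos(a), math.sin(a)))

speeds = sorted(math.hypot(*v) for v in vs)
mean_sq = sum(s * s for s in speeds) / N
print(mean_sq)                 # still 1.0: total energy is conserved
print(speeds[0], speeds[-1])   # speeds have spread far from the initial 1
```

After a few tens of collisions per particle the speed histogram is close to the 2D Maxwell-Boltzmann (Rayleigh) form, even though no energy ever entered or left the system.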

In any case, as Rod Vance said, in practice the walls will not be perfectly rigid and will be in thermal motion, which would prevent any such precise initial state from persisting for very long.

Even so, this seems to imply that the hard sphere gas system has at least one special initial state from which it will never reach a thermal state. But this isn't necessarily a problem for statistical mechanics. In this case (if my intuition is correct) the states with this special property form such a small proportion of the overall phase space that they can essentially be ignored, since the probability of the system being in such an initial state by chance is technically zero.

There can be cases where every initial state has a property like this. This means that the system will always remain in some restricted portion of the phase space and never explore the whole thing. But these are just the cases where there is some other conservation law, in addition to the energy, and we know how to deal with that in statistical mechanics.

People used to worry a lot about proving that systems were "ergodic", which essentially means that every possible initial state will explore every other state eventually. But nowadays a lot less emphasis is put on this. As Edwin Jaynes said, the way we do physics in practice is that we use statistical mechanics to make predictions and then test them experimentally. If those predictions are broken then that's good, because we've found new physics, often in the form of a conservation law. When the new law is taken into account, the new distribution will be seen to be a thermal one after all. So we don't need to prove that systems are ergodic in order to justify statistical mechanics, we just need to assume they are "ergodic enough", until Nature tells us differently.

$\endgroup$
  • $\begingroup$ I apologise for the fact that the latter half of this answer no longer has much relevance to the question. It made more sense before I corrected my mistake. $\endgroup$
    – N. Virgo
    Commented Sep 11, 2013 at 13:11
  • $\begingroup$ Thank you for your answer. I didn't imagine I would be stirring so many concepts, but that a question of elementary mechanics would be the one eventually tripping us up =D As it's now clear that my premise was wrong and that nothing prevents energy transfer even among ideal particles, I feel the case is pretty much closed. At any rate, thanks for mentioning the coherent state, which I didn't explicitly think about. As to the last paragraph, I understand the approach of the working physicist, but allow me to be dissatisfied with it sometimes. $\endgroup$
    – suissidle
    Commented Sep 11, 2013 at 14:04
  • $\begingroup$ "The total x-momentum is zero but the total y-momentum is positive"... In your diagram I'm not completely sure where your x and y axes run (left-right & up-down presumably?) but, whichever way, the total x-momentum cannot be zero and at the same time the two particles have the same speeds before collision. E.g. Pa has velocity Va1=(4,-3) and Pb has velocity Vb1=(0,5). After collision we get Va2=(4,5) and Vb2=(0,-3). Net momenta along x and y are 4 and 2 before and after collision. Momentum is exchanged between particles along the "line-of-collision" which connects their centres at impact. $\endgroup$
    – steveOw
    Commented Jul 12, 2015 at 23:21
$\begingroup$

Bravo on some wonderfully clear and careful thoughts on your problem! I think this paper will help you:

E. T. Jaynes, "Gibbs vs Boltzmann Entropies", Am. J. Phys. 33(5), 391-398 (1965), as well as many of his other works in this field

And a summary of my answer below is:

Statistical correlation between a system constituents' states and the use of the Gibbs entropy vs. the Boltzmann entropy to account for that correlation is the mathematical framework you need.

You are quite right that, in the case of perfectly elastic collisions with the walls, the system can never reach ideal thermodynamic equilibrium, i.e. a state where the states of all the particles have identical probability density distributions AND all these states are perfectly statistically independent (i.e. wholly uncorrelated).

In the practical case, the collisions with the wall are never perfectly elastic. The wall is thermalized, which means that the thermal state of the wall's molecules is randomly varying. Sometimes a gas particle will pick up a little energy in interacting with the wall; other times it will lose a little: at thermodynamic equilibrium the long-term average energy flows into and out of the box are equal. But if the gas molecules' states are highly correlated, as in the special example you cite, then the information (i.e. Shannon entropy; in thermodynamics it corresponds to the Gibbs entropy) needed to define the complete microstate of all the molecules in the box will increase with time, because the particles are interacting with thermalized walls: their collisions are governed by parameters that evolve stochastically with the randomly evolving states of the wall's particles, and you would need to include a description of the history of these random collisions to fully define the states of the box's particles.
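
The role of statistical correlation here can be made concrete with a two-particle toy example (purely illustrative, with binary "particle states" standing in for the real continuous ones):

```python
import math

def H(p):
    # Shannon entropy, in bits, of a discrete probability distribution.
    return -sum(x * math.log2(x) for x in p if x > 0)

# Two binary "particles". Uncorrelated case: the joint distribution
# is the product of the marginals (states 00, 01, 10, 11).
indep = [0.25, 0.25, 0.25, 0.25]
print(H(indep))   # 2.0 bits = sum of the two marginal entropies

# Perfectly correlated case: knowing one particle fixes the other.
corr = [0.5, 0.0, 0.0, 0.5]
print(H(corr))    # 1.0 bit < 2.0: the missing bit is the
                  # mutual information between the particles
```

In the correlated case, knowing one particle's state tells you the other's for free: the joint description needs one bit less than the sum of the individual descriptions, and that missing bit is the mutual information.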

Actually, if the walls are a finite thermal reservoir, the initially perfectly uncorrelated states of the wall's molecules will themselves become weakly correlated with time as they interact with the correlated gas molecules. The Shannon entropy of a closed system always stays constant. If the walls are thought of as the boundary of a bigger and bigger thermodynamic reservoir, the correlation just spoken of becomes spread over more and more wall molecules, so in the limit of an infinite reservoir the correlation between the wall system's particles becomes nought and the interaction with the gas particles does not disturb the wall system's thermodynamic equilibrium. In classical thermodynamic problems, "infinite reservoirs" are essential thought-experiment tools, because they are perfect "dumping grounds" for this kind of statistical correlation. This is why the non-equilibrium thermodynamics of small systems is so hard to analyse: the statistical correlations make the exact analysis of such systems intractable.

Given the above two paragraphs, we can now look at how you thought about the problem:

At any rate, the process I imagined goes like this: initially, the particles bombarding the wall transfer some amount of heat to it while slowing down; in turn the wall, now heated up, will transfer back some heat to the gas; eventually, the system will reach the expected equilibrium.

This is substantially sound: the only thing that is a near miss is the last phrase "eventually, the system will reach the expected equilibrium". In general this statement needs to be qualified: if the walls are a "small" thermodynamic system, with not too many more particles than the gas itself, the correlation between the wall molecule states that arises from the interaction with the at-first-highly-correlated gas molecule states can only be "spread" over a limited number of particles. So the system will wind up with both gas and wall molecules having substantially correlated states. If however the walls are a big system, then the system "eventually reaches its expected equilibrium", for all practical purposes. The size of the total system determines how far towards equilibrium it can proceed.

You might find it helpful now to look at a kind of "backwards" version of your problem: let's think of an "irreversible" change where a gas at first at thermodynamic equilibrium (i.e. all particle states have the same probability density function and they are all statistically uncorrelated) undergoes the irreversible change where the container's volume is suddenly doubled AND the collisions with the walls are perfectly elastic. Its Gibbs entropy is unchanged by the volume doubling, because one can in principle compute all the future particle states from a knowledge of their states just before the doubling. The laws of physics at the molecular level are reversible; beginning states are mapped one-to-one onto future states, so that the former can always in principle be computed from the latter and contrariwise, and therefore no information gets added to a truly closed (i.e. sundered from the rest of the World) thermodynamic system. However, the experimentally observed Boltzmann entropy increases by $k_B \log 2$ per particle: intuitively, this is because each particle's position now needs one more bit of information to specify which half of the newly doubled box volume it lies within, i.e. whether it lies in the former half volume or in the newly allowed volume. But now the particle states are statistically correlated, and so

$$S_B = S_G + k_B N M$$

where $S_B$ is the Boltzmann entropy, $S_G$ the initial, unchanged Gibbs entropy, $N$ the number of particles and $M$ the mutual information (see the Wikipedia page of that name) per particle. $M$ is an information-theoretic measure of how much of a particle's state can be foretold from knowledge of the other particles' states, i.e. an information-theoretic measure of statistical correlation. So the Boltzmann entropy, which can be shown to equal the macroscopic, experimentally observable Clausius entropy, is a measure of information content when the system's constituents are statistically uncorrelated.
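
As a sanity check on the bookkeeping, the extra Boltzmann entropy from the sudden doubling can be tallied numerically, taking $M = \ln 2$ per particle (in nats) and, purely for illustration, a mole of gas:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
N = 6.022e23         # one mole of particles, purely for illustration
M = math.log(2)      # mutual information per particle after doubling, in nats

delta_S = k_B * N * M   # S_B - S_G for the sudden volume doubling
print(delta_S)          # ~5.76 J/K, i.e. R * ln 2 per mole
```

This is the familiar entropy of free expansion from classical thermodynamics, $\Delta S = n R \ln 2$, recovered here purely from the information-counting argument.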

We can think of this irreversible change in terms of the shape of phase-space volumes. Initially, the phase-space volume of the uncorrelated gas in thermodynamic equilibrium looks like a pretty ordinary, convex, simply connected set (if anything in such a fantastically high-dimensional space can be called "ordinary" with a straight face!). With the irreversible change it evolves into something with the same volume (entropies can also be interpreted as phase-space volumes, and the Gibbs entropy is unchanged), but it gets squashed and finely divided: it can end up as a fractal-like foam, something like a Sierpinski carpet. The true volume of this object does not change, but its practically observable volume does, i.e. the volume observed down to a given "coarse-graining" level: the smallest "unfoamy" set (one that is simply connected at scales finer than the coarse graining in question) containing it, just as foam, resolved only coarsely, seems to take up more space than it does. This change is the difference between the Boltzmann and Gibbs entropies. As we make more and more detailed measurements, we can experimentally "see" more and more of the "foamy" structure, but in doing so we depart further and further from the classical Clausius characterisation of entropy, because we must measure and bring in more and more information beyond the classical variables of pressure, temperature and so on to describe the foamy set more fully.

$\endgroup$
  • $\begingroup$ @Nathaniel Your style of technical writing on matters such as these is in general much better than mine (since this is your field of expertise), so you might like to add an answer for the OP if you think this unclear. I'm sure the OP would appreciate it. $\endgroup$ Commented Sep 11, 2013 at 2:30
  • $\begingroup$ Unfortunately you can't "@"-notify someone who hasn't already participated in the discussion - but I saw it anyway - I've posted my own answer as a supplement to yours. $\endgroup$
    – N. Virgo
    Commented Sep 11, 2013 at 10:27
  • $\begingroup$ @Nathaniel Thanks for letting me know - I thought that might be so. Anyway, good to see you here! :) $\endgroup$ Commented Sep 11, 2013 at 10:59
  • $\begingroup$ Thank you for taking the time to write up such a clear and insightful answer. I want to take the time to understand it and the paper you linked well. To clarify the description of the system, I intended the container as a barrier to all kinds of outside interactions. I think this corresponds to your 'small reservoir' wording. $\endgroup$
    – suissidle
    Commented Sep 11, 2013 at 13:58
  • $\begingroup$ Are you saying that in this case, gas + container will converge to a correlated state (as if the entropy just gets spread out over the whole system, but no more is added)? Or, given enough time, will it become thermalized too? I imagine the walls having a natural tendency to become thermalized: it remains to be seen if this process can keep up and ultimately beat the correlated bombardment. However, in light of Nathaniel's observations, it appears the container only plays a marginal role in the thermalization of the gas (unless the initial configuration is quite peculiar). $\endgroup$
    – suissidle
    Commented Sep 11, 2013 at 13:59
