
As early as 1850, macroscopic entropy was introduced by Clausius, Kelvin and others through the second law of thermodynamics, which comes in many equivalent forms. There were various vague physical interpretations of entropy, such as the degree of disorder, the arrow of time, the availability of energy to do work, etc., but none of these is well defined, and all of them can be misleading.

However, a breakthrough came as early as 1877 with Boltzmann's equilibrium statistical mechanics, in the form of the relation $S = k \log W$ between the macroscopic entropy and the thermodynamic probability $W$ (the number of microstates). Only then did the macroscopic entropy acquire a mathematical foundation based on a well-defined particle, or molecular, structure of matter.
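
As a minimal illustration of how this relation is used (a standard counting exercise, with $N$ and $n$ introduced only for this sketch): for $N$ independent two-state spins of which $n$ point up, the number of microstates compatible with that macroscopic specification, and the resulting entropy after Stirling's approximation, are
$$ W = \binom{N}{n}, \qquad S = k\ln\binom{N}{n} \approx -Nk\left[x\ln x + (1-x)\ln(1-x)\right], \qquad x \equiv \frac{n}{N}. $$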

My question, specifically, is: can we rely on the Boltzmann relation as a unique definition of entropy, one that is both necessary and sufficient?

  • You would need to add quantum mechanics (Sackur-Tetrode, quantum statistics) to get it well defined.
    – user137289
    Commented Dec 6, 2019 at 9:59
  • Peter, thanks for the comment. Could that be an answer, meaning there is no uniquely defined entropy within the framework of classical statistical mechanics alone?!
    – user248651
    Commented Dec 6, 2019 at 10:23
  • I’m not conversant in statistical thermodynamics, but based on my reading of a Wikipedia article on the formula, it seems to have limitations regarding its ability to predict macroscopic behavior. For one, it appears to be based on an ideal gas model.
    – Bob D
    Commented Dec 6, 2019 at 11:17
  • Hi Lionheart, please note that in English, spaces come after punctuation marks (comma, period, etc) rather than before it.
    – Kyle Kanos
    Commented Dec 6, 2019 at 12:53
  • Thanks Kyle Kanos. Now I believe it makes a difference.
    – user248651
    Commented Dec 6, 2019 at 16:04

1 Answer


This is a subtle question and there is no quick answer. The subtlety lies in what the symbol $W$ represents, or how it is obtained for any given system. It represents the number of microstates that are ... what? Thereby hangs a tale. One could say "that are accessible to the system" under given macroscopic constraints. But what are macroscopic constraints? What does 'macroscopic' mean? And what does 'accessible' mean? After all, the system does not have time to move among all the states that are ordinarily included in $W$, so in what sense are they accessible? They are accessible in the sense that there is no conservation law preventing the system from changing its state to any among the $W$ that are included. The constraints amount to a way of carving out a region of the state-space and announcing that we have under consideration a system that is known to be in that region, but we say we know nothing else about it. It follows of course that the entropy is not a property of the system alone, but of the system combined with whatever conditions (or knowledge if you like) we have used to specify the constraints.
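
A standard toy illustration of this dependence on the constraints: for $N$ ideal-gas particles in a box of volume $V$, the number of accessible states scales as $W \propto V^{N}$, whereas if the particles are known to be confined to the left half of the box then $W \propto (V/2)^{N}$ and the entropy is lower by $Nk\ln 2$. The particles themselves are unchanged; only the specification of the region of state-space under consideration has changed.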

There is also the question of how to say when one microstate is sufficiently different from another to be counted as different. This was resolved in the early days by taking equal volumes of phase space and arguing that the fundamental unit of volume is not important as long as there is no further structure at smaller scales. It is resolved nowadays by counting mutually orthogonal quantum states. But the reason why this is a good choice is itself interesting: it is because one expects the system to spend equal times in each such accessible state on average. That is a statement about dynamics that itself has to be carefully argued from Schrödinger's equation or by bringing in things like Liouville's theorem.
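
For concreteness, the standard semiclassical version of this counting (the route to the Sackur-Tetrode formula mentioned in the comments) takes one state per phase-space cell of volume $h$ for each degree of freedom, with a factor $1/N!$ for indistinguishable particles,
$$ W \;\approx\; \frac{1}{N!\,h^{3N}}\int_{H(q,p)\le E} d^{3N}q\, d^{3N}p , $$
which for a monatomic ideal gas gives
$$ \frac{S}{Nk} = \ln\!\left[\frac{V}{N}\left(\frac{4\pi m E}{3Nh^{2}}\right)^{3/2}\right] + \frac{5}{2}. $$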

I have written the above largely as a prelude to the following point. In your question you propose that the thermodynamic definition of entropy, which is $$ dS = \frac{dQ_{\rm rev}}{T} $$ (and $S=0$ at $T=0$), is in some sense ill-defined or less well-defined than Boltzmann's $S = k \log W$. My reply is that neither is better than the other; they are equally well-defined and both are useful. In the thermodynamic statement the ambiguity comes in considering which aspects of a transfer of energy amount to heat and which to work. The Boltzmann statement does not really remove that ambiguity so much as move it somewhere else: it is moved to the way we choose to specify the constraints that we ascribe to any given configuration of the system. In both cases the ambiguity goes away in the thermodynamic limit. And in both cases the ambiguity remains for finite systems.
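
As an example of that agreement in the thermodynamic limit, take the monatomic ideal gas with $U = \tfrac{3}{2}NkT$ and $pV = NkT$. The thermodynamic route gives
$$ dS = \frac{dU + p\,dV}{T} = \frac{3}{2}Nk\,\frac{dT}{T} + Nk\,\frac{dV}{V} \;\Longrightarrow\; S = Nk\left(\tfrac{3}{2}\ln T + \ln V\right) + \text{const}, $$
which reproduces exactly the $T$- and $V$-dependence of the Sackur-Tetrode expression obtained from $S = k\log W$; the microscopic counting does no more than fix the additive constant (and, through the $1/N!$, make $S$ properly extensive).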

  • Thanks Andrew Steane. Your answer defines the entropy in a rigorous way and removes the ambiguity between the two definitions of entropy, the macroscopic and the microscopic one. The remaining subtleties are of minor importance: (i) dS = dQ/T applies only to a reversible process, and dS > dQ/T for an irreversible one; (ii) S = k log W needs W itself to be well defined for each ensemble of systems, and it should be complemented by the ergodic hypothesis, as you have implicitly stated. As an agreed conclusion, the entropy can only be defined at or near equilibrium. I suppose your answer is what we are looking for.
    – user248651
    Commented Dec 6, 2019 at 20:29