
I know that entropy is a measure of disorder, and a book (Intro to Thermal Physics by David J. Schroeder) states that $\mathrm dS=\mathrm dq/T$, where $S$ is entropy, $q$ is heat flow, and $T$ is temperature.

What I don't understand is the relation between these two interpretations.

Taking the definition of entropy as a measure of disorder, I am looking for an intuitive proof of the equation $\mathrm dS=\mathrm dq/T$, an explanation of why this equation represents a measure of disorder, or both.

I would also very much like to know why entropy is not a path function, given that it depends on heat (which is a path function).

  • While 'measure of disorder' is correct, it does not help much, especially as 'measure' tends to get dropped. It is better to think of entropy as the number of ways energy levels can be filled (as in the Boltzmann definition $S=k\ln\Omega$). In this sense, heat being a transfer of energy in a random manner, it is easy to follow why entropy increases.
    – porphyrin
    Commented Sep 4, 2021 at 20:34
  • Entropy is not a measure of disorder. This statement has been corrected in several textbooks. Order/disorder is related to information entropy.
    – ACR
    Commented Sep 4, 2021 at 22:34
  • The post is probably about the relation between information entropy and thermodynamic entropy.
    – Poutnik
    Commented Sep 5, 2021 at 5:50

2 Answers


I will just try to answer the second question, as I am a university student who took an Introductory Physical Chemistry course a few months ago.

$\mathrm{d}S=\frac{\mathrm{d}q}{T}$ is the definition of entropy in the classical thermodynamic view: a measure of energy dispersal at a specific temperature. Although it does not have "$S$" sitting alone on one side like many other equations (e.g. $E=mc^2$), it is how we define entropy. It only exists in differential form, because it is tied to a change of heat, and the relation between heat and temperature differs from case to case. Thus, we write the general form as a differential. When we apply this definition under a specific condition, we can simplify it further by integrating.

More precisely, the definition is:

$\mathrm{d}S = \frac{\delta q_\mathrm{rev}}{T}$

The heat in this equation refers only to heat transferred along a reversible pathway (side note: the "δ" sign denotes the differential of a path function, while the "d" sign denotes the differential of a state function). Entropy is a state function because its definition singles out one and only one pathway for the heat, the reversible one; since no other pathway is allowed, the integral becomes path-independent.
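For a concrete case (the standard textbook example, not from the question itself): in the isothermal reversible expansion of an ideal gas from $V_1$ to $V_2$, the gas absorbs $q_\mathrm{rev} = nRT\ln(V_2/V_1)$ at constant $T$, so the definition integrates directly to $$\Delta S = \int\frac{\delta q_\mathrm{rev}}{T} = \frac{q_\mathrm{rev}}{T} = nR\ln\frac{V_2}{V_1}.$$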

I am not sure how scientists in the past came up with this equation as the definition. Why divide $\delta q$ by temperature? Why does it need to be the reversible heat? Maybe someone else can answer that.


It seems like there are two questions you care about:

  1. Given that $dS = dQ/T$ and that $Q$ is a path function, why is $S$ a state function?
  2. How do you relate the interpretation of $S$ as 'measure of disorder' and as 'quantity related to heat production'?

tl;dr: Entropy is related to heat production via $dQ = TdS$, and heat characterizes disordered microscopic motion, so entropy is thus linked to disorder.


The two questions are kind of muddled up in terms of terminology, so let me hide some preconceptions by defining a new function $X$ such that $dX = dQ/T$. This is a useful definition because it turns out that $X$ is a state function. (Mathematically, $dQ$ is an inexact differential, which corresponds to a path function. Inexact differentials can be made into exact differentials by multiplying them by integrating factors. In this case, on physical grounds, the integrating factor for $dQ$ is $1/T$.)
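To make the integrating-factor remark concrete, here is a small check (my own sketch, for one mole of an ideal gas with constant heat capacity; the symbols $C_v$ and $R$ are assumptions for this illustration) that $dQ$ fails the exactness test while $dQ/T$ passes it:

```python
import sympy as sp

T, V = sp.symbols('T V', positive=True)
R, Cv = sp.symbols('R C_v', positive=True)  # gas constant, molar heat capacity

# For one mole of ideal gas: dQ = Cv dT + (R T / V) dV
M = Cv           # coefficient of dT
N = R * T / V    # coefficient of dV

# Exactness test: dQ is exact iff dM/dV == dN/dT
print(sp.diff(M, V), "vs", sp.diff(N, T))  # 0 vs R/V: unequal, so dQ is inexact

# Multiply by the integrating factor 1/T: dS = (Cv/T) dT + (R/V) dV
print(sp.diff(M / T, V), "vs", sp.diff(N / T, T))  # 0 vs 0: equal, so dS is exact
```

The mixed partial derivatives disagree for $dQ$ but agree after dividing by $T$, which is exactly the statement that $1/T$ is an integrating factor here.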

The identification of $X$ as a state function follows from the analysis of Carnot cycles by the eponymous Sadi Carnot, for which it was found that $$\frac{Q_H}{T_H} + \frac{Q_C}{T_C} = 0$$ for the heat exchanged ($Q$) and temperature ($T$) of a hot ($H$) and a cold ($C$) reservoir. Because all thermodynamic cycles can be constructed as a limit of infinitely many infinitesimal Carnot cycles, the identity above establishes the exactness of the differential $dX$, but a satisfactory interpretation of this curious quantity $X$ was never established in macroscopic thermodynamics. (A common incorrect explanation for $S$ being a state function despite $Q$ being a path function is the imposition of a reversible path for the process. This is certainly true, but does not explain why $S$ is a state function. After all, there are still infinitely many reversible paths between two states.)
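The Carnot identity is easy to check numerically; the reservoir temperatures and $Q_H$ below are arbitrary illustrative values of my own, using exact rational arithmetic so the cancellation comes out exact:

```python
from fractions import Fraction

# Illustrative reservoir temperatures (K) and heat absorbed from the hot side (J):
T_H, T_C = Fraction(500), Fraction(300)
Q_H = Fraction(1000)

# For a reversible Carnot cycle, eta = 1 - T_C/T_H, hence Q_C = -Q_H * T_C / T_H
Q_C = -Q_H * T_C / T_H  # heat rejected to the cold reservoir (negative sign: it leaves)

print(Q_H / T_H + Q_C / T_C)  # -> 0
```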


Thus we turn to statistical mechanics. The most basic form of statistical mechanics deals with a microcanonical ensemble with some number of microstates $\Omega$. In order to relate this to macroscopic thermodynamics, let us assume the existence of an extensive thermodynamic variable $S$, which we will call the entropy, that relates the number of microstates to thermodynamic properties. Mathematically, we take $S = f(\Omega)$ for some as yet undetermined function $f$. By extensivity, scaling up this system by a factor of $n$ will give us an entropy $nS$. We should also expect to have $\Omega^n$ microstates, assuming each subsystem is approximately independent. Thus, on physical grounds, we identify the functional equation $$f(\Omega^n) = nf(\Omega).$$ This can be transformed into Cauchy's functional equation, and has as its only continuous solution $$S = f(\Omega) = k\ln\Omega$$ for some constant $k$. This is the expression for the Boltzmann entropy.
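The extensivity property that pins down $f$ is easy to verify for the logarithmic solution (a minimal numerical sketch; $\Omega$ and $n$ are arbitrary illustrative values):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(omega):
    """S = k ln(Omega) for a microcanonical ensemble with Omega microstates."""
    return k * math.log(omega)

# n independent copies of a system with omega microstates have omega**n
# microstates, and the functional equation f(omega**n) == n * f(omega) holds.
omega, n = 10**6, 3
print(math.isclose(boltzmann_entropy(omega**n), n * boltzmann_entropy(omega)))  # -> True
```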

The Gibbs entropy is a generalization of the Boltzmann entropy to canonical, rather than microcanonical, ensembles: take the total entropy (which you cannot measure in practice, not being aware of the number of available microstates) and subtract the entropy associated with the microstates to end up with the entropy associated with the macrostates, $$S = -k \sum_i p_i\ln p_i,$$ where $i$ enumerates your macrostates. For the precise derivation, see Blundell and Blundell's Concepts in Thermal Physics, Ex. 14.5. (There is a bogus derivation that writes the microcanonical entropy as $S = - k\ln p$ and then purports to take the average of this entropy to get $\langle S\rangle = -k\sum p\ln p$, but this does not stand up to scrutiny.)
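As a quick sanity check on the generalization (my own illustration, with $k$ set to 1): for a uniform distribution over $\Omega$ states, the Gibbs entropy reduces to the Boltzmann form $k\ln\Omega$:

```python
import math

def gibbs_entropy(p, k=1.0):
    """S = -k * sum(p_i ln p_i); k = 1 for illustration."""
    return -k * sum(pi * math.log(pi) for pi in p if pi > 0)

# For equal probabilities p_i = 1/Omega, the sum collapses to k ln(Omega).
omega = 8
uniform = [1 / omega] * omega
print(math.isclose(gibbs_entropy(uniform), math.log(omega)))  # -> True
```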

Now that we have an expression for the statistical-mechanical entropy, we can see which thermodynamic variable it corresponds to: not surprisingly, the conclusion is that $$X = S,$$ so the mysterious state function of macroscopic thermodynamics is actually the entropy, which is concerned with the number of microstates of the system! The derivation involves writing the first law of thermodynamics in macroscopic and microscopic form, matching terms, and following a functional-equation treatment similar to the one above. See Hill's An Introduction to Statistical Thermodynamics, Ch. 1.4. (I believe this particular argument was first advanced by Schrödinger, he of quantum-mechanical fame.)
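A compressed version of that matching (my own sketch of the canonical-ensemble argument, not Hill's exact notation): writing the average energy as $U = \sum_i p_i E_i$ gives $$\mathrm{d}U = \sum_i E_i\,\mathrm{d}p_i + \sum_i p_i\,\mathrm{d}E_i.$$ The second sum shifts the energy levels themselves and matches the work term $-p\,\mathrm{d}V$, while the first redistributes the populations over fixed levels and matches the heat term $T\,\mathrm{d}S$; differentiating $S = -k\sum_i p_i\ln p_i$ with $p_i = e^{-E_i/kT}/Z$ and using $\sum_i \mathrm{d}p_i = 0$ indeed yields $T\,\mathrm{d}S = \sum_i E_i\,\mathrm{d}p_i$.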


Finally, after having derived that $X$ was a state function just from thermodynamics, and having interpreted $X$ using statistical mechanics, we revisit thermodynamics from a more enlightened standpoint. The amazing simplicity of thermodynamics is a reduction from $6N$ microscopic degrees of freedom to just three (in the simplest case) macroscopic variables: $S$, $V$, and $N$. What is so special about these variables that they would survive this coarse-graining procedure?

It turns out that $V$ and $N$ are intimately linked to conservation laws and hence preserved by symmetry via Noether's theorem, whereas $S$ basically lumps together all the other microscopic degrees of freedom that were coarse-grained away. This distinction underlies the treatment of $pdV$ and $\mu dN$ in the first law as work terms associated with ordered microscopic motion, whereas $TdS$ is a heat term associated with disordered microscopic motion. See Callen's Thermodynamics and an Introduction to Thermostatistics, Ch. 21.

