
In thermodynamics, as I understand it, entropy is a state function.

A state function is a property whose value does not depend on the path taken to reach it. In contrast, quantities that depend on the path taken between two states are called path functions. Both path and state functions are often encountered in thermodynamics.

Second Law of Thermodynamics (variational statement): Stated in words, at equilibrium, any small change to the state of the system carried out while the internal energy and volume remain constant must lower the entropy of the system.

My problem is this: since entropy is a state function, if the internal energy and volume remain constant, how can the entropy change at all? Would this not contradict the definition of a state function above, in particular "a property whose value does not depend on the path taken to reach that specific value"?

Can anyone see where I am going wrong with my reasoning?

  • Where did you get that statement from? According to the equation $dU = T\,dS - P\,dV$, if $dV$ and $dU$ are zero, $dS$ must be zero. Plus, that is not the variational statement of the 2nd law as I understand it. Commented Dec 4, 2018 at 14:12
  • @ChesterMiller Hi. From MIT lecture notes. See this link.
    – John Doe
    Commented Dec 4, 2018 at 14:17
  • These are not actual changes; these are virtual changes. The concept is similar to static mechanical equilibrium: the equilibrium is stable if all virtual changes consistent with the external constraints would lead to an increase of the energy, and since the state is then in the neighborhood of a minimum, all spontaneous restoring forces point back to the minimum, hence the stability.
    – hyportnex
    Commented Dec 4, 2018 at 14:35
  • In the context of thermodynamics, if all virtual displacements would lead to a decrease of entropy, then the entropy is at a maximum, and one part of one formulation of the 2nd law is just that: in equilibrium the entropy is a maximum while the other extensive variables are kept constant.
    – hyportnex
    Commented Dec 4, 2018 at 14:43
  • Say you bring two $(U,V,N)$ systems together that are at different temperatures but you keep the total energy $U_1+U_2$, total volume $V_1 + V_2$ and the total particle count $N_1 + N_2$ constant. Equilibrium is achieved for that common temperature and common chemical potential for which the total entropy is at a maximum. Now the virtual displacements are $\pm dU$, $\pm dV$ and $\pm dN$, so that $dU_1+dU_2=0$, $dV_1+dV_2=0$ and $dN_1+dN_2=0$.
    – hyportnex
    Commented Dec 4, 2018 at 14:43

4 Answers


This question hits one of the most frequent misunderstandings about maximum and minimum properties of thermodynamic fundamental equations (i.e., the state functions that embody the whole information about the thermodynamic behavior of a system).

Sometimes I hear loose statements like "entropy at equilibrium should be maximum." Without adding "maximum with respect to what," such a statement is either void of meaning or false. Consider, for example, what would result from characterizing thermodynamic equilibrium as the state that maximizes entropy as a function of the volume. Since $\left.\frac{\partial{S}}{\partial{V}}\right|_U=\frac{P}{T}$, equilibrium would be possible only at zero pressure or at infinite temperature. This is nonsense.

The correct statement of the maximum entropy principle is that the equilibrium entropy of a system at fixed volume, energy, and number of particles is maximum with respect to any additional variable that expresses a possible internal constraint on the thermodynamic system and is allowed to vary. For instance, this form of the principle is used when the condition for thermal equilibrium between two subsystems is sought. In that case, one introduces the entropy of the combined system as a function of the total variables $U,V,N$ and an additional variable, the energy of one of the two subsystems, say $U_1$. The resulting entropy for the total system is a function of four variables:

$$ S(U,V,N,U_1) = S_1(U_1,V,N) + S_2(U-U_1,V,N) $$ The condition of extremum w.r.t. $U_1$ at fixed $U,V,N$ corresponds to the equality of the temperature of the two subsystems.
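Writing the extremum condition out explicitly (a routine step, using $\partial S_i/\partial U_i = 1/T_i$ for each subsystem):

$$ \left.\frac{\partial S}{\partial U_1}\right|_{U,V,N} = \frac{\partial S_1}{\partial U_1} - \frac{\partial S_2}{\partial U_2} = \frac{1}{T_1} - \frac{1}{T_2} = 0 \quad\Longrightarrow\quad T_1 = T_2. $$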

There is nothing special about entropy here. It is possible to show that the maximum of entropy w.r.t. constraint variables at fixed energy, volume, and number of particles is equivalent to the condition of a minimum of the internal energy $U$ as a function of the same constraints at fixed $S,V,N$. Moreover, such a minimum principle for the energy can be translated into similar minimum principles for the remaining thermodynamic potentials (Helmholtz and Gibbs free energies, enthalpy), provided their natural variables are kept fixed.

A very terse discussion of such an issue is contained in the classical thermodynamics textbook by H. Callen.


The answer to your question is that entropy is a state function only at equilibrium.

Consider a system with two parts, A and B. Each part is in equilibrium. Then the entropy of part $A$ is $S_A=S(U_A,V_A,N_A)$ and the entropy of part $B$ is $S_B=S(U_B,V_B,N_B)$.

Now consider the two parts as a single system. The entropy of the combined system is $$ S_{A+B} = S_A+S_B=S(U_A,V_A,N_A)+S(U_B,V_B,N_B) $$ In general, $S_{A+B}$ is not a state function of the combined system because it depends on the six variables, $$ U_A, U_B, V_A, V_B, N_A, N_B . $$ In the special case that $S_{A+B}$ happens to be a function of $$ U_A+U_B, V_A+V_B, N_A+ N_B $$ then entropy is indeed a state function for the combined system. It turns out that the conditions for this to happen are: $$ T_A = T_B, P_A=P_B, \mu_A = \mu_B $$ which are of course the equilibrium conditions.
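As a numerical illustration (a toy sketch of my own, not from the answer: it uses a simplified monatomic ideal-gas entropy with additive constants dropped and $k_B = 1$), maximizing the combined entropy over the split of the total energy reproduces the equal-temperature condition:

```python
import math

def entropy(U, V, N):
    # Monatomic ideal-gas entropy, up to additive constants, with k_B = 1
    # (a deliberately simplified stand-in for the Sackur-Tetrode formula).
    return N * (math.log(V / N) + 1.5 * math.log(U / N))

def total_entropy(U1, U, V1, V2, N1, N2):
    # Combined entropy S_A + S_B with total energy U fixed: part B gets U - U1.
    return entropy(U1, V1, N1) + entropy(U - U1, V2, N2)

# Fixed totals: U = U_A + U_B; separate volumes and particle numbers.
U, V1, V2, N1, N2 = 3.0, 1.0, 2.0, 1.0, 2.0

# Grid search for the energy split U1 that maximizes the combined entropy.
candidates = [U * i / 10000 for i in range(1, 10000)]
U1_star = max(candidates, key=lambda u1: total_entropy(u1, U, V1, V2, N1, N2))

# For this gas T = 2U/(3N), so equal temperatures mean U1/N1 = (U - U1)/N2,
# i.e. U1 = U * N1 / (N1 + N2) = 1.0 for these numbers.
print(U1_star)  # ≈ 1.0
```

The maximum lands (to within the grid spacing) at the equal-temperature split, in line with the equilibrium conditions quoted above.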

  • That "entropy is a state function only at equilibrium" is incorrect; entropy is a state function always. Whether or how you can calculate it in non-equilibrium is a different matter.
    – hyportnex
    Commented Dec 4, 2018 at 15:09
  • @hyportnex If the microcanonical state is defined by its energy, volume, and number of particles, this information alone is insufficient to calculate entropy, unless I add that the system is in equilibrium.
    – Themis
    Commented Dec 4, 2018 at 20:15
  • Let me say it again: just because you (we) do not know how to calculate something does not mean it does not exist. Entropy is a state function in or out of equilibrium. The existence and properties of thermodynamic entropy outside of equilibrium is a huge and controversial subject, most of it completely out of the reach of statistical mechanics.
    – hyportnex
    Commented Dec 4, 2018 at 20:23
  • We must be precise: entropy exists in non-equilibrium states, but it is not a function of the global energy, volume, and number of particles. That's a provable statement.
    – Themis
    Commented Dec 4, 2018 at 20:35
  • @Themis: please provide a reference for your claim that entropy exists in non-equilibrium states. Which kind of non-equilibrium are you speaking about? If you refer to systems at local thermodynamic equilibrium, I would agree. For more general conditions, I am curious to know what you are referring to. Commented Dec 5, 2018 at 7:28

I guess what the MIT notes are saying is that, at constant internal energy and constant volume, any change in the state of the system would constitute a deviation from a thermodynamic equilibrium state, which would entail fewer allowable distributions statistically, and thus lower entropy.

This is not what I would regard as a variational statement of the 2nd law of thermodynamics. The variational statement as I understand it would be: In a closed system that undergoes a change in thermodynamic equilibrium state from state 1 to state 2, over all the possible paths between these two end states, there is an extremum (maximum) value for the integral of $dQ_B/T_B$ (where $dQ_B$ is the differential heat transferred from the surroundings to the system and $T_B$ is the temperature at the boundary between the system and surroundings at which this heat transfer takes place). Since this is an extremum, it must depend only on the two end states, and not on the specific path. It thus represents a change in a function of state. We call this function of state the entropy S. This is basically the Clausius inequality which represents a variational statement of the 2nd law of thermodynamics.
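In symbols (my compact paraphrase of the statement above): over all paths between the two equilibrium end states,

$$ S_2 - S_1 = \max_{\text{paths}} \int_1^2 \frac{dQ_B}{T_B}, $$

so that for any particular path $\Delta S \ge \int_1^2 dQ_B/T_B$, with equality for a reversible path; this is the Clausius inequality.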


These are not actual changes; these are virtual changes. The concept is similar to static mechanical equilibrium: the equilibrium is stable if all virtual changes consistent with the external constraints would lead to an increase of the energy, and since the state is then in the neighborhood of a minimum, all spontaneous restoring forces point back to the minimum, hence the stability.

In the context of thermodynamics, if all virtual displacements would lead to a decrease of entropy, then the entropy is at a maximum, and one part of one formulation of the 2nd law is just that: in equilibrium the entropy is a maximum while the other extensive variables are kept constant.

Say you bring two $(U,V,N)$ systems together that are at different temperatures but you keep the total energy $U_1+U_2$, total volume $V_1+V_2$ and the total particle count $N_1+N_2$ constant. Equilibrium is achieved for that common temperature and common chemical potential for which the total entropy is at a maximum. Now the virtual displacements are $\pm dU$, $\pm dV$ and $\pm dN$, so that $dU_1+dU_2=0$, $dV_1+dV_2=0$ and $dN_1+dN_2=0$.
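A quick numerical check of the virtual-displacement picture (my own toy sketch, not part of the answer: a simplified ideal-gas entropy with $k_B = 1$, constants dropped, and only energy exchanged so that $dU_1 + dU_2 = 0$):

```python
import math

def S(U, N):
    # Energy-dependent part of a monatomic ideal-gas entropy, k_B = 1,
    # additive constants dropped (a toy model; only the U-dependence matters here).
    return 1.5 * N * math.log(U / N)

# Two subsystems exchanging only energy; the totals are fixed.
N1, N2, U_total = 1.0, 1.0, 2.0
U1_eq = U_total * N1 / (N1 + N2)  # equal temperatures: U1/N1 = U2/N2

def S_total(U1):
    # Combined entropy with U2 = U_total - U1 enforced by dU1 + dU2 = 0.
    return S(U1, N1) + S(U_total - U1, N2)

# Every virtual displacement +/- dU away from equilibrium lowers the total
# entropy, confirming the equilibrium state is an entropy maximum.
for dU in (1e-3, 1e-2, 1e-1):
    assert S_total(U1_eq + dU) < S_total(U1_eq)
    assert S_total(U1_eq - dU) < S_total(U1_eq)
```

Both signs of the displacement lower $S$, which is exactly the "all virtual displacements decrease the entropy" criterion for a stable equilibrium.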

