
Entropy is defined in my book as $\Delta S = \frac{Q}{T}$. To motivate the formula, it says that entropy should be directly proportional to the heat added, since with more energy the particles would fly about more rapidly. This makes sense. But then it says that entropy must be inversely proportional to the temperature at which the heat is added. To be precise:

Heat added to a system at a lower temperature causes higher entropy increase than heat added to the same system at a higher temperature.

How does this make intuitive sense?

EDIT 1:

I found an answer here. I think this makes sense. Can anyone read it and tell me if it's correct or not?

Rough intuitive answer:

Adding heat to a system increases the system's total energy. This gives more kinetic energy to distribute among the particles in the system, increasing the size of the system's phase space and hence its entropy. However, since momentum is proportional to the square root of kinetic energy, it is clear that the total momentum phase space cannot grow exponentially as you add more kinetic energy; so the rate of growth of its logarithm must taper off. Since entropy is proportional to the log of the size of the phase space, we see that each additional unit of heat must increase the entropy by less and less.

Now, for technical details. The reason why the relationship is precisely $dS = \delta q/T$ rather than some more complicated relationship that depends on the nature of the system is that temperature is defined such that this is true. We know that during first-order phase transitions, the temperature actually stays constant as you add heat! This corresponds to a region where adding heat actually does cause an exponential increase in phase space. Finally, note that some systems actually have negative temperature, and adding more heat to them actually makes the temperature even more negative; so adding heat actually decreases the phase space. So the intuitive answer in the previous paragraph shouldn't be taken too seriously.
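
A quick numeric sketch of the tapering argument, in Python, under the assumed ideal-gas scaling $\Omega(E) \propto E^{3N/2}$ (an assumption for illustration, not something the quoted answer states):

    import numpy as np

    # Assumed scaling for a monatomic ideal gas: Omega(E) ~ E^(3N/2), so
    # S/k = (3N/2) ln(E) + const. Equal increments of heat then raise S
    # by less and less -- the "tapering" described above.
    N = 1000                                # particle count (arbitrary)
    E = np.array([1.0, 2.0, 3.0, 4.0])      # total energy, arbitrary units
    S = 1.5 * N * np.log(E)                 # entropy in units of k, up to a constant

    print(np.diff(S))                       # ~[1040, 608, 432]: each added unit of
                                            # heat increases the entropy by less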

EDIT 2: Moreover, this answer explains why the definition of entropy needs a factor of temperature alongside the heat in order to be consistent with the second law.

This is why heat flows from hot objects to cold objects: the entropy change of the hot object is negative, while that of the cold object is positive, and the magnitude of the entropy change is greater for the cold object: $\Delta S = \frac{Q}{T_c}-\frac{Q}{T_h} > 0$ since $T_h > T_c$.

Keep in mind that entropy increases with temperature. This can be understood intuitively in the classical picture, as you mention. However, at higher temperatures, a certain amount of heat added to the system causes a smaller change in entropy than the same amount of heat at a lower temperature.

The formula is $\Delta S = \frac{Q}{T}$. The change in entropy is related to heat. Remember that heat is a form of energy transfer, not energy; we can talk about heat only when some change takes place.
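
A two-line numeric check of this, with made-up values for $Q$, $T_h$ and $T_c$:

    # Made-up numbers: Q joules pass from a hot body at T_h to a cold body at T_c.
    Q, T_h, T_c = 100.0, 400.0, 300.0   # J, K, K

    dS_hot = -Q / T_h                   # -0.250 J/K lost by the hot body
    dS_cold = Q / T_c                   # +0.333 J/K gained by the cold body
    print(dS_hot + dS_cold)             # +0.083 J/K > 0, as the second law requires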

Please do point out if there is anything wrong with the answers above.

  • The formula you are citing is the change of entropy for a reversible process in the isothermal case, I believe. In the general case one has to integrate: $\Delta S = \int \frac{\delta Q_\mathrm{rev}}{T}$. That's different from the total entropy of a system, which is a function of temperature and other intensive properties (like magnetization, etc.).
    – CuriousOne
    Commented Dec 15, 2014 at 17:22
  • OP: Are you sure that your book says entropy should be directly proportional to "temperature"? Shouldn't that say "heat" instead?
    – BMS
    Commented Dec 15, 2014 at 17:25
  • @BMS Sorry. Made the edit.
    – Yashbhatt
    Commented Dec 15, 2014 at 18:52
  • $\frac QT = \Delta S = \Delta(k \ln\Omega) \approx k \frac 1\Omega \Delta\Omega \Rightarrow \Delta\Omega\approx \Omega \frac Q{kT}$, i.e. transferring heat into a system opens up a number of new microstates $\Delta\Omega$ proportional to the number of existing ones and to the number of degrees of freedom we can excite, the latter given by the heat $Q$ divided by the average energy per degree of freedom, $kT$; cf. physics.stackexchange.com/questions/33372/…
    – Christoph
    Commented Feb 8, 2015 at 13:33
  • Are you asking for a conceptual answer which restricts itself to classical thermodynamics, which only uses macro variables like temperature and volume (where entropy can be defined in the way you wrote), or do you want an answer which starts from the more fundamental definition of entropy in statistical mechanics, in terms of the number of possible microstates associated with a given macrostate? (which is where the notion of entropy as 'disorder' comes from)
    – Hypnosifl
    Commented Feb 8, 2015 at 16:08

8 Answers


You asked for intuitive sense and I'll try to provide it. The formula is:

$$\Delta S = \frac{\Delta Q}{T}$$

So, you can have $\Delta S_1=\frac{\Delta Q}{T_{lower}}$ and $\Delta S_2=\frac{\Delta Q}{T_{higher}}$

Assume the $\Delta Q$ is the same in each case. The denominator controls the "largeness" of the $\Delta S$.

Therefore, $\Delta S_1 > \Delta S_2$

Say you had the same number of hydrogen atoms in each of two containers, the only difference being their temperature. The lower-temperature group is in a less frenzied state than the higher-temperature group. If you increase the "frenziedness" of each group by the same amount, the less frenzied group will show the difference more noticeably than the more frenzied group.

Turning a calm crowd into a riot is much more noticeable than turning a riot into a slightly more frenzied riot. Try to think of the change in entropy as the noticeability of the change in riotous behavior.

  • Inquisitive, that last paragraph is "priceless". Commented Jan 14, 2018 at 18:44

The formula is actually better written $$ \Delta S = \frac{Q}{T}. $$ That is, the change in entropy associated with the flow of heat is inversely proportional to the temperature at which the heat flow occurs. Note that $Q$ is already a change itself: it is not a state variable, but rather something more like $\Delta W$. Physically, this is because adding heat to a hot system doesn't disorder it as much as adding the same heat to a cold system.

If $T$ is changing, you would need to integrate $$ \Delta S = \int \mathrm{d}S = \int \frac{\delta Q}{T}. $$

For an ideal gas at constant volume and particle number, it turns out that $$ \Delta S \propto \log\left(\frac{T_\mathrm{final}}{T_\mathrm{initial}}\right). $$ Thus doubling the temperature always adds the same fixed amount of entropy, whatever the starting temperature; equivalently, it multiplies $e^{S}$ (a measure of the number of accessible states) by a fixed factor.
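
A small numeric check of this (a sketch assuming a monatomic ideal gas at constant volume, with $C_V$ set to 1.5 in arbitrary units): integrating $\mathrm{d}S = C_V\,\mathrm{d}T/T$ on a fine grid reproduces the logarithm.

    import numpy as np

    # Constant-volume heating of an ideal gas; C_V in arbitrary units (assumed).
    C_V = 1.5
    T = np.linspace(300.0, 600.0, 100001)             # heat from 300 K to 600 K

    dS_numeric = np.sum((C_V / T)[:-1] * np.diff(T))  # Riemann sum of dQ/T
    dS_exact = C_V * np.log(T[-1] / T[0])             # the closed form above

    print(dS_numeric, dS_exact)             # both ~1.0397: Delta S ~ log(T_f/T_i)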

  • $\begingroup$ " Physically, this is because adding heat to a hot system doesn't disorder it as much as adding the same heat to a cold system." But I don't get why this is the case? $\endgroup$
    – Yashbhatt
    Commented Jan 30, 2015 at 14:35
  • @Yashbhatt try this coffeeshopphysics.com/articles/2011-08/…
    – pentane
    Commented Jan 30, 2015 at 15:08
  • @pentane Great article. But doesn't answer my question.
    – Yashbhatt
    Commented Feb 5, 2015 at 14:01
  • Interesting! And phase changes are the places where $\Delta S$ has sharp jumps (e.g. in T-S plots).
    – Mathews24
    Commented Dec 3, 2018 at 7:02

Heat added to a system at a lower temperature causes higher entropy increase than heat added to the same system at a higher temperature.

How does this make intuitive sense?

The formula defines entropy change. Since it defines a new word, one not conceived before and thus without prior meaning, it is hard to see how it could make "intuitive sense".

If your question really is why people use this definition and not some other, here is one possible view:

Carnot came to the conclusion that all reversible cycles operating between two temperatures $T_1 < T_2$ (acquiring heats $Q_1$ and $Q_2$, respectively) have the same efficiency (work done divided by heat consumed):

$$ \frac{\sum W}{Q_{2}} = 1 - \frac{T_1}{T_2}. $$

Today this is derived from the first and second laws of thermodynamics in most textbooks on thermodynamics. It is done most easily for an ideal gas, but the result is generally valid for any substance. It is at this point that the division by temperature enters the discussion.

Since $\sum W = Q_1+Q_2$, it follows that

$$ \frac{Q_1}{T_1} + \frac{Q_2}{T_2} = 0. $$

(the sum of the reduced heats equals zero).
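
A quick numeric check of this with made-up reservoir temperatures, using the convention that $Q_1$ and $Q_2$ are heats absorbed by the working substance:

    # Reversible engine between T1 (cold) and T2 (hot); made-up values.
    T1, T2 = 300.0, 600.0          # K
    Q2 = 100.0                     # J absorbed from the hot reservoir
    W = Q2 * (1.0 - T1 / T2)       # 50 J of work, from the Carnot efficiency
    Q1 = W - Q2                    # -50 J: heat is rejected to the cold reservoir

    print(Q1 / T1 + Q2 / T2)       # ~0: the reduced heats sum to zero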

A general cyclic process has the same effect on the surroundings of the system as many Carnot cycles tessellating the original cycle in the work diagram, each operating with a very small amount of heat.

Writing Carnot's equation for all of them and summing the terms, adjacent terms cancel each other, and we are left with a sum over only those terms that correspond to the boundary elements of the curve representing the general cycle:

$$ \sum_{s} \frac{Q_s}{T_s} $$

with both isothermal and adiabatic elements $s$.

We pass from this sum to loop integral in the thermodynamic space of states $\mathbf X$: $$ \oint_\Gamma \frac{\mathbf J}{T}\cdot d\mathbf X = 0 $$

where $\mathbf J$ is a function of $\mathbf X$ and $\Gamma$ such that the integral over a segment of $\Gamma$ (call it $\Delta \Gamma$), $\int_{\Delta \Gamma} \mathbf J\cdot d\mathbf X$, is the heat accepted by the system when it changes state along the curve $\Delta\Gamma$.

The last equation can be expressed in words this way: the line integral of $\mathbf J/T$ along a closed curve $\Gamma$ in the space of thermodynamic equilibrium states $\mathbf X$ is always zero.

It follows that the integral

$$ \int_{\mathbf X_i}^{\mathbf X_f} \mathbf J\cdot d\mathbf X $$

(equal to heat accepted by the system) depends on the path chosen to connect thermodynamic equilibrium states $\mathbf X_i$ and $\mathbf X_f$, but the integral

$$ \int_{\mathbf X_i}^{\mathbf X_f} \frac{\mathbf J}{T}\cdot d\mathbf X $$

does not depend on it; it depends only on those two states. This enables us to define a function on the space of equilibrium states, $$ S(\mathbf X_f) = S(\mathbf X_i) + \int_{\mathbf X_i}^{\mathbf X_f} \frac{\mathbf J \cdot d\mathbf X}{T}, $$ where $S(\mathbf X_i)$ is the value of the function at a reference state $\mathbf X_i$, chosen by convention. It does not matter which path is chosen to connect $\mathbf X_i$ and $\mathbf X_f$; the value of $S$ depends only on the endpoint $\mathbf X_f$.

This function is called entropy. Unfortunately, there is nothing intuitive about it in thermodynamics; it's just a useful definition.
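
The path independence is easy to verify numerically. Below is a minimal sketch, assuming one mole of a monatomic ideal gas and two hypothetical two-leg reversible paths between the same end states: the total heat differs between the paths, but $\int \delta Q/T$ does not.

    import numpy as np

    R = 8.314              # J/(mol K)
    Cv = 1.5 * R           # monatomic ideal gas, one mole (assumed for the sketch)
    T1, T2 = 300.0, 600.0  # K
    V1, V2 = 1.0, 2.0      # only the volume ratio matters

    def isochoric(Ta, Tb):
        """Reversible constant-volume heating: dQ = Cv dT."""
        return Cv * (Tb - Ta), Cv * np.log(Tb / Ta)          # (Q, integral of dQ/T)

    def isothermal(T, Va, Vb):
        """Reversible isothermal expansion: dQ = dW = R T dV / V."""
        return R * T * np.log(Vb / Va), R * np.log(Vb / Va)  # (Q, integral of dQ/T)

    # Path A: heat at V1, then expand at T2. Path B: expand at T1, then heat at V2.
    path_A = [isochoric(T1, T2), isothermal(T2, V1, V2)]
    path_B = [isothermal(T1, V1, V2), isochoric(T1, T2)]

    print(sum(q for q, _ in path_A), sum(q for q, _ in path_B))  # ~7199 J vs ~5470 J
    print(sum(s for _, s in path_A), sum(s for _, s in path_B))  # both ~14.41 J/K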


The question is based on the reasoning that "S rises when Q is added, and T rises when Q is added, so S must rise if T rises." The problem is that T is kinetic energy per particle, while Q is internal energy added to the entire system minus the work done. T is an intensive property, while S and Q are extensive properties. It's true that, for a fixed set of particles in a given box, S ~ ln(T) + c and S ~ ln(Q) + c (which I show below). I think the core problem is that the question mixes a system-wide Q with a per-particle T.

To be precise about how T and Q differ when considering an entire system: to raise T without raising Q you would have to reduce the number of particles carrying the energy, with the energy of the removed particles given to the remaining ones. This would result in lower entropy because, although the number of states per particle increases, the number of particles decreases (the reduced N dominates in gases: S ~ N [a ln(T) + b - c ln(N)]). But to raise Q without raising T you would have to add particles, which raises entropy.

BTW, the original formula should have dQ instead of Q, i.e. dS = dQ/T. But in either case it is obvious from the equation that at a higher T there will be a smaller dS increase for a given amount of Q added.

In the following I'll detail the relationship between Q and S, and then between T and S, for a given system, showing they have the same effect on S for a fixed system, but only if you do not mix T and Q. Here "~" will mean proportional, not approximately equal. My statements should be exact, but a lot is hidden in the proportionalities because they take advantage of log(x^c) = c log(x).

In looking at the Sackur-Tetrode equation for an ideal gas of N particles in a given box, I see:
S ~ ln(states per particle) + constant
(note: the "states per particle" does not change in a simple way if N changes)
states per particle ~ momentum p per particle
momentum per particle ~ SQRT(2m (K.E. per particle))
K.E. per particle ~ temperature, and ~ internal energy and heat per particle (above absolute zero), under the unrealistic idealization that the heat capacity does not change with temperature
ln(x^0.5) ~ ln(x)

So, for a given gas in a given box
S ~ ln(T) + constant
or
S ~ ln(Q) + constant

So here's what's happening at a temperature T and at 2T:
S ~ ln(Q0 + Q1) + constant
versus
S ~ ln(2Q0 + Q1) + constant

Example: Q0 = Q1 = 2
low T has fractional S increase with Q1 added of (1.38+c)/(0.69+c)
2x T has fractional S increase with Q1 added of (1.79+c)/(1.38+c)

I could replace the Q's with T's, so the original question has some reason, but I can't mix and match the T and Q like the question does.
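
A few lines of code reproduce the arithmetic of the example above (with the additive constant c set to 0 for simplicity):

    import numpy as np

    Q0, Q1 = 2.0, 2.0   # the example's made-up heat values, arbitrary units

    # S ~ ln(Q) + c for the fixed gas in a box; add Q1 starting at Q0 vs at 2*Q0.
    dS_low = np.log(Q0 + Q1) - np.log(Q0)            # ln 4 - ln 2 ~ 0.69
    dS_high = np.log(2 * Q0 + Q1) - np.log(2 * Q0)   # ln 6 - ln 4 ~ 0.41

    print(dS_low, dS_high)  # the same Q1 buys less entropy at the higher temperature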


I find that the question here relates directly to the definition of temperature, so I'll give a short version of it. For simplicity, let us consider a system composed of two sub-systems, A and B, in thermal contact (meaning they exchange energy only in the form of heat).

Let me state that, for $A$ and $B$ in thermal equilibrium, the entropies $S_{A}$ and $S_{B}$ of the respective sub-systems depend upon the energy in precisely the same way. This statement only reflects the thermal-equilibrium condition. In more mathematical terms, \begin{equation*} \frac{\text{d}S_{A}}{\text{d}E} = \frac{\text{d}S_{B}}{\text{d}E} \end{equation*} that is, when an infinitesimal quantity of energy $\text{d}E$ is traded between sub-systems $A$ and $B$ such that $S_{A}$ changes, the corresponding change in $S_{B}$ exactly compensates for it. The observation to be made is that the quantity $\text{d}S/\text{d}E$ measures how receptive a system is to accepting energy by thermal means. From this we define temperature as \begin{equation*} \frac{\text{d}S}{\text{d}E} \equiv \frac{1}{T} \end{equation*} and we can see that for temperatures $T_{1} < T_{2}$, the change in entropy is greater when trading energy at $T_{1}$ than at $T_{2}$.

Edit: The intuition is in understanding the definition, although I will admit that it is still rather abstract, since it contains two abstract concepts, energy and entropy. It can be stated somewhat differently if we look more closely at the definition of entropy. For a given system, the entropy is defined as the logarithm of the number of accessible microstates $g$ at some energy $E$, \begin{equation*} S \propto \text{ln}\Big(g(E)\Big) \end{equation*} that is, $g(E)$ counts the number of accessible microstates for a given energy $E$. This $g$ may of course depend on several other factors (volume, for instance). Now give the system some energy $\delta E$ through thermal contact: at a low system temperature, the number of accessible microstates $g(E)$ increases to a greater extent than it does when the system is at a higher temperature. The intuition is that we ''unlock'' more microstates with the same quantity of energy $\delta E$ at low temperature than at high temperature. This is relative: at low temperatures the system has low energy and thus few accessible microstates, so the energy $\delta E$ increases the number of accessible states by a larger factor than it does at high temperatures. The impact of $\delta E$ on $g(E)$ depends on the number of new states it makes available compared to the number already available.
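
A small numeric sketch of this ''unlocking'' picture, assuming a power-law density of states $g(E) \propto E^{a}$ (qualitatively right for many systems, e.g. $a = 3N/2$ for an ideal gas):

    import numpy as np

    a = 50.0     # assumed exponent in g(E) ~ E^a
    dE = 0.1     # the same small energy delivered thermally in both cases

    for E in (1.0, 10.0):                      # "low" vs "high" energy (temperature)
        dS = a * (np.log(E + dE) - np.log(E))  # change in ln g(E)
        print(E, dS)                           # E=1: ~4.77, E=10: ~0.50 -- the same
                                               # dE unlocks far more states, in log
                                               # terms, at the lower energy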

  • This is a nice derivation of the definition of temperature, but I'm not sure how this adds intuition to the issue of lower $T$ leading to higher $\Delta S$.
    – Kyle Kanos
    Commented Feb 8, 2015 at 15:59
  • @KyleKanos Alright, I will have to agree. The whole deal can be given more light. See my edit.
    – Invoker
    Commented Feb 8, 2015 at 17:19

In my statistical mechanics course last year, we derived the relation $\frac{1}{T} = \frac{dS}{dU}$ (equivalently, $dS = \delta Q/T$) from the following considerations:

Consider two boxes (each with $N_i$ particles, $V_i$ volume, $U_i$ internal energy, and $T_i$ temperature, where $i = 1,2$) separated by a wall through which they can exchange energy (heat).

Clearly $\Omega_i$ grows with $U_i$ and, when the total energy $U_0 = U_1 + U_2$ is fixed, shrinks as the other system's energy $U_{3-i}$ grows, where $\Omega_i$ is the number of states of system $i$.

The zeroth law of thermodynamics says that, at equilibrium, $T_1 = T_2$. The second law says that if $T_1 < T_2$, then heat goes from system 2 to system 1, meaning that $U_1 < U_1^{\text{eq}}$, where $U_i^{\text{eq}}$ is the internal energy of system $i$ when the two systems are in equilibrium with each other.

In equilibrium, entropy ($S = k_B\ln\Omega$) is maximized.

Let's now expand the total entropy of the two systems, $S_T$: $$ S_T(U_1) \approx S_1(U_1^{\text{eq}}) + \left.\frac{\partial S_1}{\partial U_1}\right|_{U_1^{\text{eq}}}(U_1 - U_1^{\text{eq}}) + S_2(U_0 - U_1^{\text{eq}}) + \left.\frac{\partial S_2}{\partial U_1}\right|_{U_1^{\text{eq}}}(U_1 - U_1^{\text{eq}}) $$ $$ \approx S_T(U_1^{\text{eq}}) + \left.\left(\frac{\partial S_1(U_1)}{\partial U_1} + \frac{\partial S_2(U_0 - U_1)}{\partial U_1}\right)\right|_{U_1^{\text{eq}}}(U_1 - U_1^{\text{eq}}) $$ From the above statement about equilibrium maximizing entropy, $$ \left.\frac{\partial S_T}{\partial U_1}\right|_{U_1^{\text{eq}}} = 0 = \left.\left(\frac{\partial S_1(U_1)}{\partial U_1} + \frac{\partial S_2(U_0 - U_1)}{\partial U_1}\right)\right|_{U_1^{\text{eq}}} $$ Consider that $$ U_0 = U_1 + U_2 \ \ \Rightarrow \ \ 0 = dU_1 + dU_2 \ \ \Rightarrow \ \ dU_1 = -dU_2 $$ So $$ 0 = \left.\frac{d S_1(U_1)}{d U_1}\right|_{U_1^{\text{eq}}} - \left.\frac{d S_2(U_2)}{d U_2}\right|_{U_0 - U_1^{\text{eq}}} $$ At this point, it might seem reasonable to suppose that $T = \frac{d S}{d U}$ (which it is not). As a check, let's expand about $\tilde{U}_1 < U_1^{\text{eq}}$: $$ S_T(U_1) \approx S_T(\tilde{U}_1) + \left.\left(\frac{\partial S_1}{\partial U_1} + \frac{\partial S_2}{\partial U_1}\right)\right|_{\tilde{U}_1}(U_1 - \tilde{U}_1) $$ $$ \Rightarrow \ \ \frac{d S_T}{d U_1} = \left.\frac{d S_1}{d U_1}\right|_{\tilde{U}_1} - \left.\frac{d S_2}{d U_2}\right|_{U_0 - \tilde{U}_1} > 0 $$ But this implies that $T_1 - T_2 > 0$, and that heat is flowing from cold to hot, in violation of the second law. To make our definition of temperature mesh with the second law, we therefore need $$ \frac{1}{T} = \frac{d S}{d U} \ \ \Rightarrow \ \ dU = T dS $$
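
A toy numeric version of this argument, with an assumed entropy form $S_i = a_i \ln U_i$ so that $1/T_i = a_i/U_i$: maximizing the total entropy over the energy split lands exactly where the temperatures are equal.

    import numpy as np

    a1, a2, U0 = 1.0, 2.0, 3.0                    # toy constants; U0 = U1 + U2 fixed

    U1 = np.linspace(0.01, U0 - 0.01, 100001)
    S_T = a1 * np.log(U1) + a2 * np.log(U0 - U1)  # total entropy as a function of U1

    U1_eq = U1[np.argmax(S_T)]                    # the entropy-maximizing split
    print(U1_eq)                                  # ~1.0
    print(U1_eq / a1, (U0 - U1_eq) / a2)          # T1 ~ T2 ~ 1.0 at the maximum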


Entropy is defined in my book as ΔS=Q/T.

Already, this is not correct. In general, $$ \delta S\ge\delta Q/T\;. $$

In the specific case where the system being heated is always in thermal equilibrium, $$ \delta S=\delta Q/T\;. $$ So, clearly, $\delta S=\delta Q/T$ cannot be taken as the definition of entropy.

The definition of entropy is [See Landau and Lifshitz "Statistical Physics" (2nd edition) Equation 7.7]: $$ S=\log(\Delta \Gamma)\;, $$ where $\Delta \Gamma$ is the statistical weight of the subsystem, which is defined as follows: If $w_n=w(E_n)$ is the probability of finding the system in quantum state $n$ then $w(\bar E)\Delta\Gamma\equiv 1$, where $\bar E$ is the most probable energy of the system (which is the same as the macroscopic internal energy of thermodynamics since fluctuations are negligible). To put it another way, $\Delta \Gamma$ is the number of states within $\Delta E$ of $\bar E$ (where $\Delta E$ is defined as: $\Delta E$ times the energy-probability-distribution at $\bar E$ equals 1).

We can rewrite the Entropy more explicitly as a function of $\bar E$ as: $$ S=-\log(w(\bar E))\;. $$

(As an aside, it turns out that the log of $w$ has to be linear in $E$ so that $$ S=-\log(w(\bar E))=-\sum_n w_n \log(w_n)\;, $$ which is another often-encountered definition of entropy.)
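
For what it's worth, the $-\sum_n w_n \log w_n$ form is easy to play with numerically. A sketch with an assumed ladder of evenly spaced levels and canonical weights $w_n \propto e^{-E_n/T}$ (units with $k = 1$):

    import numpy as np

    E = np.arange(50)                    # assumed evenly spaced levels, E_n = n

    def gibbs_entropy(T):
        w = np.exp(-E / T)               # canonical (Boltzmann) weights
        w /= w.sum()                     # normalize
        return -np.sum(w * np.log(w))    # -sum w_n ln w_n

    print(gibbs_entropy(1.0))            # ~1.04
    print(gibbs_entropy(5.0))            # ~2.61: entropy grows with temperature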

But, anyways, since $S$ is a function of $\bar E$, we can take the derivative of $S$ with respect to $\bar E$. This derivative is defined to be the inverse temperature: $$ \frac{dS}{d\bar E}\equiv \frac{1}{T} $$ The derivative above is taken at fixed system volume, so this says that $\delta S=\delta Q/T$... I can explain that a little bit more below:

One way to change the energy of a system is to perform work on the system. Typically in thermodynamics this happens by compressing or expanding the volume of the system (although it doesn't have to happen this way). If the system is not thermally isolated, there is another way to change its energy, which is called heat. Heat is a direct transfer of energy to/from other systems in thermal contact with the system. Heat and work. Work and heat. This is how we change the energy of the system. I.e., $$ \delta E=\delta W+\delta Q\;. $$ If the system of interest is always in thermal equilibrium throughout the process over which the energy is changed by heat and work, then from the definition of temperature we know that $$ \delta E=\delta W +T\delta S\;. $$ I.e., $\delta Q=T\delta S$ holds.

So, anyways, we've arrived at $\Delta S=Q/T$ by defining $T$ as $$ \frac{dS}{d\bar E}\equiv \frac{1}{T}\;, $$ but why define the right-hand side as the inverse of the temperature; why not define it as, say, the inverse of the square of the temperature? One answer is that with any other definition you will not end up with the usual known thermodynamic laws, such as $PV=NkT$ and whatnot.

But does this definition jive with my intuitive sense of temperature?! I'm sure you are asking... Well, no, it does not. In order to make it jive with your intuitive sense you need Boltzmann's constant $k$: make the substitutions $T\to kT$ and $S\to S/k$. So now does it jive?! Well, that depends on what your intuitive sense of temperature was to begin with. If you think about this, you will probably realize that we humans start off with an intuitive sense of hot and cold and the sense that heat flows from hot to cold (but not an intuitive sense of temperature). Later on we associate temperature with the expansion of mercury in thermometers and whatnot. The above definition of temperature appropriately describes the expansion of mercury in thermometers, just as it appropriately describes the ideal gas law and so on. So, yes, this definition does jive with that simple intuitive sense of temperature, only it is much sharper and more useful.

P.S. Most of the equations use the notation from Landau and Lifshitz. For more info along these lines read chapters one and two of their "Statistical Physics".


A simple explanation: We know that heat travels from a body or system at a higher temperature to one at a lower temperature, and that the entropy change is directly proportional to the heat exchanged. The entropy therefore decreases in the former body (the higher-temperature body) and increases in the latter (the lower-temperature body). Since the second law requires the total entropy to increase, the gain at the lower temperature must outweigh the loss at the higher temperature, which is the content of the statement: "Heat added to a system at a lower temperature causes higher entropy increase than heat added to the same system at a higher temperature."

  • Very simple but clear explanation. Commented May 1, 2022 at 11:33
