
The title is not a reference to a Jason Derulo song.

In any case:

1) How is a change in entropy measured experimentally? I've Googled this for a bit, and I've found all sorts of mathematical equations, such as $\Delta S = q/T$, as a way to get a grasp on entropy. However, I want to know how entropy is actually determined in the lab (as opposed to on paper). I understand that the units of entropy are energy/temperature, so if I had to guess, I would suppose that some sort of calorimetry is used.

2) What is the best way to teach entropy (at an introductory level)? I know that textbooks like to say it's a "measure of disorder." On the other hand, I have lots of websites telling me that entropy is best NOT described as a measure of disorder. So what is it really, and what do you think about the term "disorder" as a synonym? My professor refers to it as the degrees of freedom of a system; he also notes that longer carbon chains tend to have higher entropy values due to a greater ability to "wiggle" along their carbon chains. What do you think of this?

One of those websites argues: "Discarding the archaic idea of 'disorder' in regard to entropy is essential. It just doesn't make scientific sense in the 21st century, and its apparent convenience is often flat-out misleading. As of November 2005, fifteen first-year college texts have deleted 'entropy is disorder', although a few still retain references to energy 'becoming disorderly'. (This latter description is meaningless.)"

  • I really like Boltzmann's statistical view of entropy, $$S = k_\mathrm{B} \ln W,$$ in which $W$ is the number of microstates that contribute to the macrostate described by $S$. In this way you can relate it to the degrees of freedom of a system, as well as the overall uncertainty of finding a particle in a particular state. It is derived for the ideal gas, but I think the philosophical touch is quite general. But I cannot answer your question (especially not part 1). [A small numeric sketch follows this comment thread.]
    – Martin
    Commented Jul 27, 2014 at 7:14
  • I found classical thermodynamics utterly and completely incomprehensible back when I learned it. Luckily, a year later I took a course in statistical physics that gave me genuinely workable insight into the meaning of thermodynamic functions, entropy included. It was much more intuitive for me, so if you have trouble with classical thermodynamics, statistical physics may be the way to go.
    – permeakra
    Commented Jul 27, 2014 at 17:08
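To make the first comment above concrete, here is a minimal numeric sketch (my own illustration, not part of the original thread) of $S = k_\mathrm{B} \ln W$ for a toy system of $N$ independent two-state particles, where every microstate is assumed equally likely and $W = 2^N$:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(n_particles: int) -> float:
    """S = k_B * ln(W) for N independent two-state particles,
    where all W = 2**N microstates are equally likely."""
    ln_w = n_particles * math.log(2)  # ln(2**N) = N * ln 2
    return K_B * ln_w

# One mole of such particles carries R * ln 2 of entropy:
N_A = 6.02214076e23
print(boltzmann_entropy(int(N_A)))  # ~5.76 J/K
```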

3 Answers


You can't measure entropy directly, any more than you can measure interatomic distances. You measure other quantities: for instance, you can often measure energy gain/loss and temperature, and then integrate $dS = dE/T$.
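To make that integration concrete, here is a minimal sketch (my own, with invented data) for a constant-pressure process, where $\Delta S = \int (C_p/T)\,dT$ is evaluated from tabulated heat-capacity measurements with the trapezoidal rule:

```python
# Hypothetical heat-capacity data for illustration -- not real measurements.
temperatures = [100.0, 150.0, 200.0, 250.0, 300.0]  # K
heat_capacities = [20.0, 24.0, 27.0, 29.0, 30.5]    # J/(mol K)

def entropy_change(ts, cps):
    """Trapezoidal integration of dS = (C_p / T) dT over the data points."""
    ds = 0.0
    for (t1, c1), (t2, c2) in zip(zip(ts, cps), zip(ts[1:], cps[1:])):
        ds += 0.5 * (c1 / t1 + c2 / t2) * (t2 - t1)
    return ds

print(entropy_change(temperatures, heat_capacities))  # ~28.1 J/(mol K)
```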

How to explain it? One of the best expositions I know is The Second Law by Henry A. Bent. It is full of insightful examples, lays the groundwork carefully, and avoids woolly talk, unlike many other thermodynamics books.

Some elementary but valid comments on the link between information-theoretic entropy and thermodynamic entropy can be found here: http://www.madsci.org/posts/archives/1998-10/909712896.Ph.r.html
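For what it's worth, the bridge between the two notions fits in a few lines: the Shannon entropy $H = -\sum_i p_i \ln p_i$ of a uniform distribution over $W$ microstates collapses to $\ln W$, which is Boltzmann's formula up to the factor $k_\mathrm{B}$. A small sketch of my own (not from the linked page):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * ln p), in nats."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# For W equally likely microstates, H equals ln W:
W = 8
uniform = [1.0 / W] * W
print(shannon_entropy(uniform), math.log(W))  # both ~2.079
```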

Edit: Greg reminds us that energy and enthalpy are not directly measurable either. That's why I was careful to write energy gain/loss, which is somewhat more accessible to measurement; but of course even when doing a calorimetric experiment you're not measuring heat per se but other quantities: how much gas you burned, or how intense the current was and how long it was left on. And even some of those quantities are themselves measured indirectly.
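A tiny worked example of that kind of indirect measurement (the numbers are invented): the electrical heat is $q = IVt$, and if it is dumped into a bath held at constant temperature, the bath's entropy gain is $\Delta S = q/T$.

```python
# Hypothetical electrical-calibration numbers, for illustration only.
current = 0.50             # A
voltage = 12.0             # V
duration = 120.0           # s
bath_temperature = 298.15  # K, assumed constant while heating

heat = current * voltage * duration  # q = I * V * t = 720 J
delta_s = heat / bath_temperature    # dS = q / T for an isothermal bath
print(delta_s)  # ~2.41 J/K gained by the bath
```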

I agree 100% that "entropy is not more abstract than energy" -- they are all abstractions. Ultimately, however, I do think the measurement of thermodynamic entropy is one step more indirect than that of energy, if only because it involves the notion of temperature, which (if you think about it carefully) is subtle indeed.

  • Just for the record: you measure heat and such, not energy or enthalpy or whatnot. Yes, if you design the experiment well, then the heat will be equal to the energy, or at least a good estimate of it. Entropy is not more abstract than energy or free enthalpy, and its measurement is no more indirect than theirs.
    – Greg
    Commented Jul 27, 2014 at 13:33
  • Wait, are you sure you can't measure absolute entropy? Don't perfect crystals have zero entropy? So can't we actually measure absolute entropy?
    – Dissenter
    Commented Jul 30, 2014 at 20:46
  • Perfect crystals have 0 entropy only at 0 K, and this value is not measured (you can't get to absolute zero). Commented Jul 30, 2014 at 21:15

As an interesting bit of history, Boltzmann was the first to describe entropy as a "measure of disorder" of a system; it's worth noting that he didn't know this was an oversimplification. In reality, entropy is best described as a measure of the number of ways that energy can be distributed in energy levels between or within particles.

From this, it's clear how the number of possible particle arrangements correlates with entropy: a greater number of particle arrangements means that more energy distributions are possible. I would avoid conflating particle disorder with entropy/microstates, though; they are correlated, but not one and the same.
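That counting can be done explicitly for a toy model. The sketch below (my own illustration, not part of the original answer) counts the ways $q$ identical energy quanta can be shared among $N$ distinguishable oscillators, the classic "stars and bars" result $W = \binom{q+N-1}{q}$ used for the Einstein solid:

```python
from math import comb, log

def microstates(quanta: int, particles: int) -> int:
    """Ways to distribute identical energy quanta among distinguishable
    oscillators: W = C(quanta + particles - 1, quanta)."""
    return comb(quanta + particles - 1, quanta)

# Doubling the energy of a 3-oscillator toy solid raises W from 10 to 28,
# i.e. the entropy rises by ln(28/10) ~ 1.03 in units of k_B:
print(microstates(3, 3), microstates(6, 3))
print(log(microstates(6, 3) / microstates(3, 3)))
```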

As for how entropy changes are measured experimentally, I would think the easiest way would be to divide the heat lost by the system by the temperature of the surroundings, $\Delta S_\mathrm{surr} = q/T_\mathrm{surr}$. This essentially tells you how much energy has been dispersed.


Here are two common ways of measuring the entropy change in a reaction:

  1. Measure the equilibrium constant $K$ at multiple temperatures. This gives you the Gibbs energy (via $\Delta_r G^\circ = -RT \ln K$) and, via the van 't Hoff relationship, the enthalpy. You can calculate the entropy from those two (a sketch of this procedure follows the list).
  2. Use microcalorimetry in titration mode. You get the enthalpy directly, and the equilibrium constant by fitting the trace (which tells you the extent of reaction at different concentrations). From those two, you obtain the entropy.
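Here is a minimal sketch of method 1 (my own illustration; the $K$ values are invented): fit $\ln K$ against $1/T$, read $\Delta_r H^\circ$ off the van 't Hoff slope $-\Delta_r H^\circ/R$, compute $\Delta_r G^\circ = -RT \ln K$, and take $\Delta_r S^\circ = (\Delta_r H^\circ - \Delta_r G^\circ)/T$:

```python
import math

R = 8.314  # J/(mol K)

# Hypothetical equilibrium constants measured at several temperatures:
data = [(290.0, 12.0), (300.0, 8.5), (310.0, 6.2), (320.0, 4.6)]

# Least-squares fit of ln K = -(dH/R) * (1/T) + dS/R (van 't Hoff).
xs = [1.0 / t for t, _ in data]
ys = [math.log(k) for _, k in data]
x_mean = sum(xs) / len(xs)
y_mean = sum(ys) / len(ys)
slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
         / sum((x - x_mean) ** 2 for x in xs))

delta_h = -R * slope               # J/mol, from the slope
t, k = data[0]                     # evaluate at 290 K
delta_g = -R * t * math.log(k)     # J/mol
delta_s = (delta_h - delta_g) / t  # J/(mol K)
print(delta_h, delta_g, delta_s)
```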

The entropy change is concentration-dependent, so once you have the standard entropy of reaction, you might have to correct for concentrations that are different from those at standard state.
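For ideal mixtures that correction has a simple closed form (stated here for completeness, under the assumption that $\Delta_r H$ is unaffected by dilution): since $\Delta_r G = \Delta_r G^\circ + RT \ln Q$, the entropy of reaction at reaction quotient $Q$ is $$\Delta_r S = \Delta_r S^\circ - R \ln Q.$$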

2) What is the best way to teach entropy (at an introductory level)?

It depends. However, my favorite analogy to use when guessing which state has higher entropy is "freedom of atoms to move". This works for solid vs. gas, for bound vs. in solution, and for constrained conformations vs. rotatable bonds. It also, hand-wavingly, makes the connection to the number of microstates that Martin references in a comment. The other term you sometimes find is "dispersion of energy", which nicely explains how entropy increases when a rock (that fell off a cliff) hits the ground.

