$\begingroup$

It was recently brought to my attention that my understanding of entropy is wonky at best.

In my experience, entropy was introduced (superficially at best) during general chemistry/foundations of inorganic chemistry and described in terms of the order/disorder of a system, usually followed by some messy-room analogy. In one of my textbooks, the author, Gary Wulfsberg, discusses how a reaction tends to favor the products "if it increases the dispersal of ions/molecules over a larger volume of space..." (Wulfsberg, G., 2018). Wulfsberg goes on to say that

The measure of this dispersal or disorder is known as the entropy (S) of the substance. A positive entropy change for a reaction indicates increasing dispersal or disorder (Wulfsberg, G., Foundations of Inorganic Chemistry; ch. 4, pg. 200)

which I interpreted as him saying that all systems that increase in entropy have a corresponding increase in disorder.

Later in my undergraduate career, Boltzmann's entropy was introduced during thermodynamics and described by my professor as the number of microstates available to the particles within the system, with an increase in entropy corresponding to an increase in the number of states the system can exist in.

All was fine and peachy until recently, when I came across an article that discusses the Shannon Measure of Information (SMI) and entropy. In this article, Arieh Ben-Naim argues that the idea of order/disorder in relation to entropy is not necessarily correct; in fact, it is a fallacy that does not hold true in general, nor can order/disorder be measured definitively for all systems. He states,

It is true that many spontaneous processes may be viewed as proceeding from an ordered to a disordered state. However, there are two difficulties with this interpretation. First, the concept of order is not well-defined, and in many processes it is difficult, if not impossible, to decide which of the two states of the system is more or less ordered.
J. Chem. Educ. 2011, 88 (5), 594–596

Additionally, he notes that some systems do have "order parameters", but these are not related to entropy, and that not every process in which an increase in entropy is observed has a corresponding increase in disorder. He later goes on to describe the SMI treatment of entropy "as the number of binary questions one needs to ask to find the location of the particle." Hence, if the number of yes/no questions one needs to ask to find the location of a particle increases, so has the thermodynamic entropy.
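
For concreteness, here is a minimal sketch (mine, not from Ben-Naim's article) of the binary-questions picture: for a particle equally likely to be in any of $N$ cells, a binary search locates it in $\log_2 N$ yes/no questions, which is exactly the Shannon entropy of the uniform distribution over the cells.

```python
import math

def shannon_entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2 p), in bits (yes/no questions)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A particle equally likely to be in any of 8 cells: halving the
# remaining cells with each question locates it in log2(8) = 3 questions,
# matching the Shannon entropy of the uniform distribution.
n_cells = 8
uniform = [1 / n_cells] * n_cells
print(shannon_entropy_bits(uniform))  # → 3.0

# Doubling the accessible volume (16 cells) adds exactly one question:
print(shannon_entropy_bits([1 / 16] * 16))  # → 4.0
```

This is why dispersal over a larger volume raises the entropy in the SMI picture: more cells means more questions.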

So here are my questions:

Is entropy described by Boltzmann/statistical mechanics the same as the entropy described by Shannon information theory?

Additionally, is there any validity in relating order/disorder in describing the change in entropy of chemical systems?

$\endgroup$
  •
    $\begingroup$ Thanks for raising this query. Hope we will see a lot of activity and various viewpoints. The recent terminology in Atkins and other texts, entropy as "dispersal of energy", is due to the efforts of Frank L. Lambert. Many textbooks adopted his terminology. When he was alive, he used to discuss this in a chemistry education forum. If I understood him correctly at the time, he meant that entropy is a measure of how many ways energy can be distributed, hence the phrase "dispersal of energy". $\endgroup$
    – ACR
    Commented Sep 22, 2020 at 3:14
  •
    $\begingroup$ The biggest problem was created by Shannon himself by using the same word, "entropy", which really led to these "messy room" and "shuffled cards" stories. $\endgroup$
    – ACR
    Commented Sep 22, 2020 at 3:15
  •
    $\begingroup$ Let's distinguish "the fundamental concept is wrong" from "the often used, simplistic explanation for a fundamental concept is wrong/totally misleading". Two very different situations. Entropy has a very well constructed definition in statistical mechanics and thermodynamics, and these definitions hold up well in chemistry, too. The problem is when people want to explain everything at a five-year-old's level: more often wrong than enlightening. Entropy as disorder is a very unhelpful concept in chemical systems. $\endgroup$
    – Greg
    Commented Sep 22, 2020 at 3:54
  •
    $\begingroup$ I'm with @Greg on this: originally it was described (by Boltzmann?) as "a measure of disorder", but the "measure of" got lost along the way, hence the confusion. Why not either think of it in the classical thermodynamic sense only, as $\pm T\Delta S$ related to heat, or, when an atomistic approach is needed, as the number of ways of placing particles into energy levels/boxes etc., which is basically Boltzmann's approach. $\endgroup$
    – porphyrin
    Commented Sep 22, 2020 at 11:10
  •
    $\begingroup$ I wonder why this question was closed. The OP has edited it to be more focused, and it seems a very legitimate question how a very common (and often taught) interpretation of a central thermodynamic concept holds up. $\endgroup$
    – Greg
    Commented Sep 23, 2020 at 3:30

1 Answer

$\begingroup$

The association of entropy with disorder is anthropocentric. "Disorder" is a concept derived from our experience of the world. The association with "disorder" is clearer once we explain what we mean by "order". The following definition among those provided by Merriam-Webster most closely fits the intended meaning:

a regular or harmonious arrangement

Since greater regularity implies, all else being equal, lower entropy, it is fair to associate "order" with lower entropy.

The association with "order" or regularity also squares with Boltzmann's statistical mechanical definition of entropy, $S= k_\mathrm B \log \Omega$, where $\Omega$ is the number of microstates available to the system. More possible unique microstates implies a higher entropy. Greater regularity implies more constraints on the arrangement of the system and therefore fewer possible microstates. Solids are usually more regular and therefore have lower entropy than fluid states at the same temperature; the same holds when comparing liquids and gases. The number of possible microstates increases from solid to liquid to gas.

This also jibes with the informational-content definition. You can use less information (a more compact description) to describe an orderly (regular) system: you need more information to describe all possible arrangements of molecules in a gas or liquid than in a solid. Think of entropy as measuring the length of the recipe required to build all possible arrangements of the system.
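
The two pictures agree quantitatively. A small sketch (my own numerical illustration, not a derivation): for $\Omega$ equally likely microstates, the Gibbs/Shannon form $S = -k_\mathrm B \sum_i p_i \ln p_i$ with $p_i = 1/\Omega$ reduces to Boltzmann's $S = k_\mathrm B \ln \Omega$.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

def boltzmann_entropy(omega):
    """S = k_B * ln(Omega), for Omega equally likely microstates."""
    return K_B * math.log(omega)

def gibbs_shannon_entropy(probs):
    """S = -k_B * sum(p * ln p), the Gibbs/Shannon form."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

# For a uniform distribution over 10^6 microstates the two coincide:
omega = 10**6
uniform = [1 / omega] * omega
print(boltzmann_entropy(omega))       # S in J/K
print(gibbs_shannon_entropy(uniform)) # same value, up to rounding
```

Shannon's entropy differs only by the constant factor (using $\log_2$ and dropping $k_\mathrm B$ gives bits), which is why the statistical mechanical and information-theoretic quantities are regarded as the same concept in different units.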

$\endgroup$
  • $\begingroup$ How would we define regularity at the molecular level? Is it periodicity? How would you explain regularity in liquids vs. gases? $\endgroup$
    – ACR
    Commented Sep 22, 2020 at 18:54
    $\begingroup$ @M.Farooq For a gas you need to describe more possible arrangements, each unique, than in a liquid. Similarly on going from solid to liquid (where the difference is more striking). For a simple description of a solid, all possible arrangements can be considered permutations of one template: once you describe that one template and the way of permuting positions, you are done. For liquids you are somewhere in between. You see this e.g. in the radial distribution function $g(r)$. $\endgroup$
    – Buck Thorn
    Commented Sep 22, 2020 at 19:33
  •
    $\begingroup$ The topic deserves going through the examples in more detail, including those in the comments to the OP. I find the idea of dispersal of energy appealing at times but not entirely satisfying. For instance, in an isolated system (say a gas in an adiabatic box) what exactly do you mean by energy "dispersal"? Are you talking about the kinetic theory of gases? The Boltzmann distribution? $\endgroup$
    – Buck Thorn
    Commented Sep 22, 2020 at 19:38
    $\begingroup$ Yes, I am not fully satisfied with the wording of "energy dispersal" either. Frank Lambert, who was behind this terminology, did a great service by ridding many textbooks of the order/disorder mythology. On the other hand, he also introduced the "energy dispersal" concept. Which is the bigger evil, "disorder" or "energy dispersal"? Only time will tell. $\endgroup$
    – ACR
    Commented Sep 22, 2020 at 19:53
  •
    $\begingroup$ Trying to read the original work of Clausius will require a year to understand where he was coming from and what was going on in his mind when he came up with this idea. His book is available in English. $\endgroup$
    – ACR
    Commented Sep 22, 2020 at 19:54