$\begingroup$

In this article, the author makes the following argument.

  1. The difference between entropy and information is that entropy is the bits that you don't understand while information is the bits you do understand.

  2. Bits can never be destroyed.

  3. Therefore entropy must always increase, because entropy is bits.

Is this a correct derivation of the second law of thermodynamics, or is it wrong? I need discernment from experts in physics because I am not a professional physicist.

$\endgroup$
  • 1
    $\begingroup$ I cannot follow the argument $\endgroup$
    – user65081
    Commented Nov 25, 2018 at 22:04
  • 1
    $\begingroup$ Keep in mind that the laws of thermodynamics (or really any physical laws) are not derived. They are experimentally verified. $\endgroup$ Commented Nov 25, 2018 at 23:58
  • 3
    $\begingroup$ The guy who wrote that article and maintains that blog hasn't a shred of intellectual integrity.... you should do yourself the favor and look up the author's credentials before trying to unpack their arguments, you'll save a lot of time and energy that way. wiki.c2.com/?RichardKulisz $\endgroup$ Commented Nov 26, 2018 at 2:10
  • $\begingroup$ @N.Steinle it wasn't obvious to me that he was being intellectually dishonest. I guess I was fooled. :/ He just seemed like a person who was really into futurism and user interfaces. $\endgroup$
    – Fomalhaut
    Commented Nov 26, 2018 at 3:43
  • $\begingroup$ It's not stated precisely enough to be judged right or wrong, but the general flavour of it is about right. The second law is indeed the result of the conservation of information ("bits") at the microscopic level. The number of bits you understand about a system can decrease if you measure it, but if you don't then it can only increase, which is really what the second law says. However, there are quite a lot of mathematical subtleties involved in going from there to a quantitative equation. $\endgroup$
    – N. Virgo
    Commented Nov 26, 2018 at 5:36

1 Answer

$\begingroup$

It is not an argument at all; at least, not a scientific argument. The notion of "bits that you want or don't want", or "that you understand or do not understand", is at best very vague: is the author suggesting an arbitrarily subjective definition of information or of entropy, depending on what different persons want or understand? Even worse, there is no physical definition of a bit. The minimal unit of information, as taught in introductory courses on information theory, is fine, but what is a bit in the real world according to the author? In any case, information theory never identifies information with entropy. Lacking a definition of what a bit represents in the physical world, claiming that bits can never be destroyed is pure nonsense: is the author able to reconstruct the contents of his hard drive after melting it in a furnace? And how can we know whether it is possible to create bits from nothing? Because, if new bits cannot be created, entropy, according to his definition, would have to stop increasing at some point.

Leaving aside these unjustified claims, there are a few things that can be said on this subject which may help explain why this "derivation" is nonsense.

There are many different concepts which (unfortunately) have all been named "entropy". Those immediately relevant to the second law of thermodynamics are:

  1. the thermodynamic definition introduced by Clausius: $$ S(B) = S(A) + \int_A^B \frac{dQ_{rev}}{T}, $$
  2. the statistical mechanics definition by Boltzmann/Planck/Gibbs, which can be expressed in many ways, depending on the set of state variables one likes to use to describe a macroscopic state;
  3. the information theory definition by Shannon: $$ S = -k \sum_i p_i \log p_i. $$
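
To make the Clausius definition concrete, here is a minimal numerical sketch (my own illustration with made-up values, not something taken from the article): it evaluates $\int dQ_{rev}/T$ for reversibly heating 1 kg of water, assuming a constant specific heat.

```python
# Hypothetical example: Clausius entropy change for reversibly heating
# 1 kg of water from 20 C to 100 C, assuming constant specific heat.
import numpy as np

m = 1.0                                    # mass in kg (assumed)
c = 4186.0                                 # specific heat of water, J/(kg K)
T = np.linspace(293.15, 373.15, 100_000)   # temperature path in kelvin

dQ = m * c * np.gradient(T)                # dQ_rev = m c dT along the path
delta_S = np.sum(dQ / T)                   # S(B) - S(A) = integral of dQ_rev / T

print(f"numerical  dS = {delta_S:.2f} J/K")
print(f"analytical dS = {m * c * np.log(T[-1] / T[0]):.2f} J/K")  # m c ln(T_B/T_A)
```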

In Shannon's formula, a generic system (even a system without any thermodynamic behavior, like a deck of cards) is supposed to be found in its $i$-th state with probability $p_i$. It is reasonable to associate with each state a quantity named information via $\log(1/p_i)$. Thus, a value of the Shannon entropy can be assigned to every system characterized by a given probability distribution over its states. It is evident that Shannon entropy is not the same as information, but it can be seen as the average information embodied in a probability distribution.
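
As a toy illustration of this point (my own sketch, not part of the original answer), the snippet below computes the per-state information $\log(1/p_i)$ and its average, the Shannon entropy, for a uniform and a biased distribution; natural logarithms are used, so the result is in nats rather than in multiples of $k$.

```python
# Shannon entropy as the average of the per-state information log(1/p_i).
import math

def shannon_entropy(p):
    """Average information: -sum_i p_i log p_i (in nats)."""
    return sum(-pi * math.log(pi) for pi in p if pi > 0)

# A uniformly random 52-card draw: every state equally likely,
# so the entropy attains its maximum, log(52).
uniform = [1 / 52] * 52
print(shannon_entropy(uniform), math.log(52))   # both ~3.951

# A biased distribution embodies less average information.
biased = [0.9] + [0.1 / 51] * 51
print(shannon_entropy(biased))                  # ~0.72 < log(52)
```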

It is also interesting to notice that it is possible to assign different probability distributions to the same physical system, according to different ways of listing its (micro)states; a different entropy corresponds to each of these distributions (a small sketch follows). Last but not least, nothing is said, at the level of information theory, about the time evolution of the probabilities. So, in order to make contact with the second law, something more must be added.
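
The dependence on how the states are listed can be seen in one line of arithmetic; here is a hypothetical example contrasting a fine-grained and a coarse-grained description of the same fair die.

```python
# Same physical system, two descriptions, two different Shannon entropies.
import math

def H(p):
    return sum(-pi * math.log(pi) for pi in p if pi > 0)

fine = [1 / 6] * 6          # states listed as the six faces
coarse = [1 / 2, 1 / 2]     # states listed as "odd" vs "even"

print(H(fine))              # log(6) ~ 1.792
print(H(coarse))            # log(2) ~ 0.693
```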

The conceptual chain of links between the three entropies goes as follows:

  1. Shannon entropy reduces to the statistical mechanics expression for the entropy in a given ensemble if the probability distribution used in the information entropy coincides with the probability distribution of that ensemble (see the numerical sketch after this list);
  2. the formulae for the entropy in the different ensembles are not always equivalent, but the corresponding entropies per particle or per unit volume coincide after taking the thermodynamic limit, and in that limit, and only there, they have all the properties of the Clausius thermodynamic entropy.
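
Here is the promised numerical sketch of point 1 (my own check, with arbitrary made-up energy levels and units in which $k = 1$): plugging the canonical distribution $p_i = e^{-E_i/kT}/Z$ into Shannon's formula reproduces the Gibbs entropy $S = k \ln Z + U/T$.

```python
# Check that the Shannon entropy of the canonical (Boltzmann) distribution
# equals the Gibbs entropy S = k ln Z + U / T.
import numpy as np

k = 1.0                                # Boltzmann constant (natural units)
T = 2.0                                # temperature (arbitrary)
E = np.array([0.0, 1.0, 3.0, 7.0])    # hypothetical energy levels

w = np.exp(-E / (k * T))
Z = w.sum()                            # canonical partition function
p = w / Z                              # Boltzmann probabilities

S_shannon = -k * np.sum(p * np.log(p))   # Shannon's formula
U = np.sum(p * E)                        # ensemble-average energy
S_gibbs = k * np.log(Z) + U / T          # thermodynamic expression

print(S_shannon, S_gibbs)              # agree to floating-point precision
```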

In conclusion, only for an infinite system governed by the Boltzmann-Gibbs probability is it possible to establish a sound scientific link between information and the second law. Unfortunately, such a clean conceptual connection is very often ignored or misinterpreted, even in textbooks, although the whole scenario was already clear in the fifties, when Brillouin's book Science and Information Theory was written.

$\endgroup$
  • $\begingroup$ Please note that the author of the linked article did not make the claim presented in this question. There doesn't seem to be anything substantially wrong in this popular-level article, it was just misinterpreted by the OP. The author did not make any of the points listed in the question nor did he make this logical conclusion. $\endgroup$
    – safesphere
    Commented Nov 26, 2018 at 2:38
  • $\begingroup$ @safesphere Point #1: "You see, information, entropy, order, these are all words that mean exactly the same thing. They are physically identical things. Entropy is information. It's just that entropy is the kind of plentiful low-level information that the human sensory and nervous systems screen out as irrelevant." Point #2: "Now to understand the whole thing you have to realize that information can never, ever be erased from the physical universe." Point #3: "So when junk accumulates, it uses up hard drive capacity forever (ie, it obeys the second law of thermodynamics)." $\endgroup$
    – Fomalhaut
    Commented Nov 26, 2018 at 3:42
  • $\begingroup$ @safesphere RK made all of the points I outlined in his blog post, as I have proven by quoting. You may have by accident failed to peruse it thoroughly. $\endgroup$
    – Fomalhaut
    Commented Nov 26, 2018 at 3:56
  • 1
    $\begingroup$ @safesphere: much more important for the present discussion is that, even if he did not make exactly the claim presented in this question, what he writes is purely imaginary, self-made physics. None of his claims would survive a serious analysis. Technical words are used in a meaningless way, without any attempt to provide a reason. In my opinion, that page does not contain anything which could be called physics. If you think I am wrong, you could open a chat to discuss the physical content of that page in a rational way. $\endgroup$ Commented Nov 26, 2018 at 21:02
  • 1
    $\begingroup$ @TomislavOstojich I still read it as information cannot be destroyed, because information is entropy and entropy cannot be destroyed. The second law causes conservation of information, not vice versa. If for some reason it is so important to you what this particular guy meant in some blog, perhaps you could contact him directly for a clarification. Until then the fact is that your statements are your interpretation of what he said, but not direct quotes of what he said. His logic is not as clear as the logic you have presented, so only he can confirm or deny. $\endgroup$
    – safesphere
    Commented Nov 27, 2018 at 21:21
