
Everybody agrees that the concepts of energy and momentum, or the concept of spacetime, are genuine concepts of physics. Not only are these concepts used in physics; they are made precise by physics.

What about the concept of information? The concept is used across a range of sciences, including physics, biology and informatics. There is a striking parallel between Boltzmann's definition of entropy S in thermodynamics and Shannon's definition of entropy H in his information theory.
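For reference, the two formulas behind that parallel, in their textbook forms (the symbols k_B, W and p_i are the usual ones, not specific to any particular source):

$$S = k_B \ln W \quad\text{(Boltzmann)}, \qquad H = -\sum_i p_i \log_2 p_i \quad\text{(Shannon)}.$$

For W equally likely microstates, p_i = 1/W, so H = \log_2 W and S = (k_B \ln 2)\, H: the two entropies agree up to a constant factor.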

As far as I know, there is currently no generally accepted, quantitative definition of information that also captures the aspect of meaning, with its reference to a sender, a receiver, and a message.

My question:

  • Is information considered a genuine concept of physics? That is, can the concept of information, like energy, only be made precise by physics?
  • Or is it more suitable to consider information a concept from informatics with applications in physics?
  • Or is the concept of information still "work in progress", like the concept of energy before its clarification in the 19th century?

I know that these questions have received a broad spectrum of different answers. Therefore I would also welcome some criteria for deciding between them.

  • Can you clarify what you mean by "genuine physical concept"? Do you mean that it expresses something that exists on a physical level, or something else?
    – virmaior
    Commented Apr 15, 2018 at 7:15
  • I have the same problem. The phrase is confusing. 'Physical concept' seems to be an oxymoron.
    – user20253
    Commented Apr 15, 2018 at 11:57
  • "There is a striking parallel between Boltzmann's definition of entropy S in thermodynamics and Shannon's definition of entropy H in his information theory." It's much, much more than just a striking parallel: en.wikipedia.org/wiki/…
    – Not_Here
    Commented Apr 15, 2018 at 20:16
  • Nice question -- see related: philosophy.stackexchange.com/questions/40383/… Commented Apr 16, 2018 at 17:28
  • There is a well-established concept of physical information, which is not co-extensive with the colloquial use of the word. In both regards it is often analogized to energy, and there are even proposals for formulating fundamental physical theories in terms of information, just as energy-momentum formulations are preferred in modern physics to older force-velocity ones.
    – Conifold
    Commented Apr 16, 2018 at 20:25

4 Answers


Yes, it is.

The reason is Maxwell's demon. One of the core tenets of statistical physics is the second law of thermodynamics: you cannot simply redistribute energy at will (for example, concentrate heat in one place) while leaving the first law (energy is conserved) intact. If that were possible, it would allow a perpetuum mobile of the second kind.

Maxwell had a nasty little idea: suppose we have a tiny demon which controls a small door between two reservoirs of equal temperature. The demon observes the approaching molecules: if a hot molecule comes from the right side, it may pass to the left reservoir; if a cold molecule comes from the left side, it may pass, too. So gradually we build up a temperature difference, hot on the left and cold on the right, without using any energy.

It can be shown that building such a passage which switches between open and closed does not itself need energy to operate. The question may have sounded academic 100 years ago, but now we can build microscopic machines, and Maxwell's demon is no longer out of reach. A being or machine which uses and processes information to decide which molecules may pass can seemingly defeat the second law.
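Here is a toy sketch of that setup (invented numbers, in Python, just to show the sorting effect and to count the measurements the demon makes; it does not model the thermodynamic cost discussed next):

import random

# Toy Maxwell's demon: fast ("hot") molecules may pass only right-to-left,
# slow ("cold") ones only left-to-right; each decision uses one measurement.
random.seed(0)
left = [random.gauss(0, 1) ** 2 for _ in range(5000)]   # kinetic energies
right = [random.gauss(0, 1) ** 2 for _ in range(5000)]
threshold = 1.0       # demon's notion of "hot" vs "cold"
bits_used = 0

for _ in range(20000):
    side, pool, other = random.choice([("L", left, right), ("R", right, left)])
    if not pool:
        continue
    i = random.randrange(len(pool))
    e = pool[i]
    bits_used += 1                        # the demon measures the molecule: one bit
    if side == "R" and e > threshold:     # hot molecule from the right passes left
        other.append(pool.pop(i))
    elif side == "L" and e <= threshold:  # cold molecule from the left passes right
        other.append(pool.pop(i))

mean = lambda xs: sum(xs) / len(xs)
print(f"mean energy left : {mean(left):.2f}")    # ends up hotter
print(f"mean energy right: {mean(right):.2f}")   # ends up colder
print(f"bits of measurement used: {bits_used}")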

Leo Szilard finally wrote the first paper which tackled the problem rigorously: "Über die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen" ("On the reduction of entropy in a thermodynamic system by the intervention of intelligent beings"). He showed that the demon needs some kind of information storage to process the information, and that this storage must be reset/erased later to allow indefinite operation. Léon Brillouin refined the argument and showed that the entropy needed to operate the demon at least equals the energy difference gained.
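The bookkeeping can be sketched with the standard one-molecule Szilard engine (textbook values, not quoted from either paper):

$$W_\text{gain} = k_B T \ln 2 \quad\text{(work extracted per cycle from knowing which half of the box the molecule is in)},$$
$$W_\text{erase} \ge k_B T \ln 2 \quad\text{(minimum dissipation for erasing the recorded bit, Landauer's bound)},$$

so $W_\text{gain} - W_\text{erase} \le 0$ over a complete cycle, and the second law survives.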

Both results have deep connections with Shannon's information theory, so physicists cannot ignore information theory: it is built into statistical physics.


Information has more than one sense. The SEP article on information provides half a dozen.

The Shannon concept of information, which you refer to, is rooted in communication theory. It is naturally interpretable as the maximum amount of 'content' that can be conveyed in a message. More generally, we can think of it as a way of quantifying the concept of distinguishability. If I can distinguish between a switch being in the up position or the down position, then I possess information; to be precise, I possess one bit of information. Likewise, if I can distinguish between here and there, off and on, this and that, etc., then these are all examples of information that I possess.

Information can be correlated, so it is not simply additive. If I can see by looking at the light switch whether it is up or down, and also see by looking at the light whether it is on or off, this does not sum to two bits of information, because the two are strongly correlated. Not perfectly correlated, because the light may be off due to a power failure or a circuit breaker tripping, or it may be on because a mischievous person has shorted the switch out with a nail. But Shannon information theory, coupled with Bayesian probability, allows us to express such relationships in a way that quantifies how much distinguishability I have.

Although I speak here in the first person, information in this sense does not have to be possessed by a human agent. One could speak of a measuring instrument or a computer having information. Information can be understood as a quantitative measure of the number of distinctions represented by a set, or a distribution, of possibilities. I believe this is the sense in which the concept of information is most commonly deployed in physics and biology.
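To see how the 'not simply additive' point works out numerically, here is a small sketch in Python with an invented joint distribution for the switch and the light (the exact numbers do not matter, only that the correlation is strong but imperfect):

import math

def entropy(dist):
    """Shannon entropy in bits of a probability distribution (dict of outcome -> prob)."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Hypothetical joint distribution over (switch, light). The light usually
# follows the switch, but a power failure or a shorted switch occasionally
# breaks the correlation.
joint = {
    ("up", "on"): 0.45,
    ("up", "off"): 0.05,   # e.g. power failure
    ("down", "on"): 0.02,  # e.g. shorted switch
    ("down", "off"): 0.48,
}

# Marginal distributions.
switch, light = {}, {}
for (s, l), p in joint.items():
    switch[s] = switch.get(s, 0) + p
    light[l] = light.get(l, 0) + p

H_switch = entropy(switch)   # ~1.00 bit
H_light = entropy(light)     # ~1.00 bit
H_joint = entropy(joint)     # less than H_switch + H_light because of correlation
mutual_info = H_switch + H_light - H_joint

print(f"H(switch) = {H_switch:.3f} bits")
print(f"H(light)  = {H_light:.3f} bits")
print(f"H(joint)  = {H_joint:.3f} bits")
print(f"I(switch; light) = {mutual_info:.3f} bits")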

According to some theorists, it is no accident that Boltzmann entropy and Shannon entropy have a striking parallel, as you put it. It is possible to interpret statistical mechanics as a purely statistical theory based only on some simple properties of matter at the microscopic level. Arieh Ben-Naim has written several books on entropy from this perspective. In particular, in his "A Farewell to Entropy" he shows how the Sackur-Tetrode equation for the absolute entropy of an ideal gas can be derived from information-theoretic considerations. On this view, the second law of thermodynamics is not so much physics as an application of Bayesian statistics. This view is not undisputed, however, and a more detailed consideration would take us into ergodic theory.

At the level of human cognition, we think of information as relating to semantic concepts such as meaning and truth. A message may contain information in the Shannon sense, but if it is the ciphertext of an encrypted message and I don't have the decryption key, it conveys no meaning to me. Also, a message may simply be false: Floridi (The Philosophy of Information) prefers to restrict the term information to things that are meaningful and true.

I think you are correct in saying that there is no generally accepted way of combining these two concepts of information. Dretske (Knowledge and the Flow of Information) made a valiant attempt at it, but it only goes so far. One might say that the problem of relating information in the purely mathematical sense to information in the semantic sense is an example of the problem of relating extension to intension. If we had a systematic way of doing that, logic would be a lot simpler and more powerful.

Speaking of logic, information theory suggests a way of interpreting certain common concepts within logic. A valid argument might be thought of as one in which all of the information in the conclusion is present in the premises. A tautology is a sentence that conveys no information, etc. Information here might be understood in terms of the set of logical possibilities that are excluded. If you are interested in this, a couple of useful papers are David Ellerman, "An Introduction to Logical Entropy and its Relation to Shannon Entropy", International Journal of Semantic Computing (2013) 7(2): 121–145; and Jon Barwise, "Information and Impossibilities", Notre Dame Journal of Formal Logic (1997) 38(4): 488–515.
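As a toy sketch of that 'excluded possibilities' reading (my own illustration, not taken from the Ellerman or Barwise papers), in Python:

from itertools import product

# Model over two atomic sentences p and q. A sentence is a function from a
# valuation (p, q) to True/False; its "information" is the set of valuations
# it rules out.
valuations = list(product([True, False], repeat=2))  # all (p, q) worlds

def excluded(sentence):
    """Return the set of valuations the sentence rules out."""
    return {v for v in valuations if not sentence(*v)}

p_and_q = lambda p, q: p and q
just_p  = lambda p, q: p
taut    = lambda p, q: p or not p   # a tautology

# A tautology excludes nothing, so it carries no information.
print(excluded(taut))                       # set()

# The argument "p and q, therefore p" is valid: everything the conclusion
# excludes is already excluded by the premise.
print(excluded(just_p) <= excluded(p_and_q))  # True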


Yes, it is. There are many things to be said here, but the most important is that physics does evolve. What was valid before may quite easily be considered absurd a little later.

Regarding the information concept specifically, some areas and researchers are so advanced that they already do informational therapy instead of physically and chemically based therapy. The point is that instead of obtaining a result through chemical or biological interaction, you can directly relay the proper information to the proper cells to put them in the desired state, so the bio-chemical process is not needed to accomplish this. This is also recognized as valid, licensed medical therapy, so I'd say we're certainly at the point where information is accepted as a genuine concept.

  • 'Information' seems to me to be similar to the concept 'energy', several hundred years ago. The idea that the fire in the stove and the rock falling down the hill are both facets of the same abstract force must have been a little hard for some to work their minds around. Now, it seems obvious. Commented Apr 16, 2018 at 17:32
  • Think of energy as an information carrier. An organism's brain is the best example, although neurons communicate in a scalar manner, not by Hertzian waves like our radio transmissions.
    – Overmind
    Commented Apr 17, 2018 at 4:53

The usual concept of information conflates physics and semantics, or, in classical terms, it blurs the distinction between form and content. While physical form can be specified in a precise manner, content is infinitely varied and unshapely. Physical laws suggest limitations, but semantics, as understood by Peirce, is about potentialities.

As long as meaning stays imprecise, it is outside the scope of physics. Actually, there is no ultimate theory in linguistics, but two broad views which do not fit well together. Engineers, and Wittgenstein was one, tend to see language mostly as names of things that are just signals, and the rest is physics. This simplistic view is to be contrasted with Saussure's, which insists that words relate to concepts which further relate to the world: a three-tiered construction, not just pairs. The famous semantic triangle makes obvious what remains outside the scope of physics and informatics.

There are currently theories of information (see the SEP) which appear to sweep the problem under the carpet: information itself does not have to be defined; having a way to measure it is enough (just as we don't have to know what space is if we can measure length). Perhaps one should also remember Feynman's sarcastic view of energy (The Character of Physical Law, ch. 8): just a number, without a definite reality.
