23

I tend to think of randomness as a lack of complete information about something. If we look at the history of probability theory, it centers on not knowing the exact outcome of certain games and gambling bets. This I have no problem with, as it rests on epistemological considerations of incomplete information.

However, very often I'll hear someone explain Quantum Mechanics or radioactivity in terms of an inherent indeterminism/randomness in nature. This I cannot understand, and I honestly think the people who nonchalantly say "Well the amazing thing is the Universe is inherently random!" really aren't thinking things through.

Where is this indeterministic/random "event" or "interaction" coming from? Does it come from nowhere, like a space invader in Newtonian dynamics? How is this possible? And why does it seem to follow objective probabilities at the very least? What "fixes" that probabilistic distribution, and why must it be fixed like that instead of being completely random/chaotic (in the sense of completely patternless, not Chaos Theory, which is fully deterministic)?

It seems as if these concerns are never addressed, and in all honesty, true randomness is almost like magic. Someone just has to say "Well something funky happens during the measurement process. Basically a wand is waved and we have a determinate result at the end."

I find that conclusion terribly wanting, and the fact that so many fully deterministic interpretations of QM exist makes it far from something we have to take as settled. If someone can shed some light on these concerns that would be much appreciated. Causal determinism seems like the only way of looking at the world that doesn't suffer from this quasi-mystical "randomness" that no one can seem to adequately explain.

  • 3
    philosophy.stackexchange.com/questions/2439/…
    – Dave
    Commented Nov 5, 2015 at 18:57
  • 1
    @Dave Yea I checked that thread beforehand but thought some of the answers weren't quite up to par. Figured I'd give it another go here and see if we could get some new respondents as well
    – Pete1187
    Commented Nov 5, 2015 at 19:12
  • 2
    The ideal randomness of probability theory is nothing more than a mathematical abstraction characterized by certain axioms. Like the ideal points and lines of mathematical geometry, ideal randomness is merely a convenient figment which never exists in the real world.
    – David H
    Commented Nov 5, 2015 at 19:20
  • 1
    @Pete1187 maybe it will be helpful for future readers.
    – Dave
    Commented Nov 5, 2015 at 20:07
  • 1
    If you're not satisfied with the answers elsewhere, there are different means you should use (a bounty, e.g.). This question is a duplicate and should be closed.
    – user2953
    Commented Nov 6, 2015 at 4:57

9 Answers

34

Like you, I think most uses of the terms 'probable' and 'random' are just epistemic, i.e. they relate to how much information we have. We say of a toss of a coin that it is random, and that there is a probability of (approximately) one half of it falling heads, but this just reflects the information we possess. Tell me more about the force and vector of the impetus given to the coin, its mass, etc., and I may be able to provide a certain answer about whether it will fall heads.
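As a rough illustration of this epistemic point, here is a toy sketch (made-up numbers and a deliberately crude model of a coin flip, nothing more): once the spin rate and flight time are known, the outcome is fixed; without them, the best available answer is roughly 50/50.

```python
import math

def coin_face(omega, t_flight):
    """Crude deterministic model: predict the landing face of a flipped coin
    from its angular speed omega (rad/s) and flight time t_flight (s).
    The face showing is fixed by how many half-turns the coin completes."""
    half_turns = int((omega * t_flight) // math.pi)
    return "heads" if half_turns % 2 == 0 else "tails"

# With full information, the outcome is certain:
print(coin_face(omega=120.0, t_flight=0.550))   # -> "tails"

# A flight roughly 27 ms longer completes one more half-turn and flips the result,
# which is why, lacking that information, P(heads) is taken to be about 0.5:
print(coin_face(omega=120.0, t_flight=0.577))   # -> "heads"
```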

But the situation changes when we talk about quantum mechanics. There are processes such that the best we can do is to say that certain outcomes have a probability, and this remains true no matter how much information we have. Does this mean that God plays dice with the cosmos? Could there be hidden variables that are entirely deterministic that we just don't have access to?

There is evidence that there are no local hidden variables that could be appealed to in order to restore determinism. It is possible, though entirely speculative, that hidden variables could be dispersed throughout the universe, but this would have strange consequences. Otherwise, it looks like the randomness is just a rock bottom fact that we need to deal with. Even then, one's preferred interpretation of QM makes a difference. On a many-worlds interpretation, all possible results of a quantum event occur through the universe splitting and the only randomness for us is that we can't predict which of the split universes we will end up in.

As to your question of where randomness comes from, I find this odd. Randomness is not a thing that causes other things. In fact it is a common error to reify randomness and you find people making strange statements such as "this was caused by random events" or "this is attributable to random variation". Randomness doesn't cause things. Randomness is fundamentally a way of saying we don't know what caused things, and in the case of QM events, no such certain knowledge exists.

  • 9
    Actually, no one knows if quantum mechanics is truly non-deterministic (the so-called "Copenhagen interpretation") or if there is some underlying mechanism. See this answer for more details. Commented Nov 5, 2015 at 23:13
  • 2
    @BlueRaja-DannyPflughoeft Which agrees with Bumble's answer. The key is the non-locality of the competing interpretation.
    – Taemyr
    Commented Nov 5, 2015 at 23:47
  • 2
    "On a many-worlds interpretation, all possible results of a quantum event occur through the universe splitting and the only randomness for us is that we can't predict which of the split universes we will end up in." You end up in all the ones where a version of you survives. There is no random selection process, so there is no random outcome.
    – Keen
    Commented Nov 6, 2015 at 14:22
  • 1
    David Deutsch wrote recently in New Scientist that Probability is as useful to physics as flat-Earth theory. (Link is to a teaser, just the opening paragraph). Commented Nov 7, 2015 at 9:20
  • 1
    @JanDvorak: "Link is to a teaser, just the opening paragraph" and no, I'm not, other than that I subscribe. Great magazine, though. :-) Commented Nov 8, 2015 at 10:36
9

The idea that the underlying core of quantum dynamics is randomness is known as the Copenhagen interpretation. It is the simplest one to lay out, but from the very beginning, it bothered people of a more idealistic bent. It is what drew the famous "The Old Man does not throw dice" quote from Einstein.

Underlying randomness is not the only way to look at this. Theoreticians like Feynman and Everett have proposed alternative ways of interpreting the effects.

Feynman suggests that all interactions actually take place across all possible paths not ruled out by other criteria. So there is no randomness involved, but you as the experimenter do not have access to absolutely all the information that might affect the set of possible routes to an outcome.
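As a toy illustration of this sum-over-paths idea (my own sketch, reduced to just two available paths, as in a double slit): each path contributes a unit amplitude with a phase, the amplitudes are added, and the squared magnitude of the sum gives the familiar interference pattern, with nothing random in the addition itself.

```python
import numpy as np

# Two slits, so only two paths to each point on the screen.
# Each path contributes exp(i * k * path_length); intensity = |sum of amplitudes|^2.
wavelength = 1.0                      # arbitrary units, chosen for illustration
k = 2 * np.pi / wavelength
slit_separation = 5.0
screen_distance = 100.0

for x in np.linspace(-30, 30, 7):     # a few points on the screen
    r1 = np.hypot(screen_distance, x - slit_separation / 2)
    r2 = np.hypot(screen_distance, x + slit_separation / 2)
    amplitude = np.exp(1j * k * r1) + np.exp(1j * k * r2)   # sum over the two paths
    intensity = abs(amplitude) ** 2                         # from 0 (dark) to 4 (bright)
    print(f"x = {x:6.1f}   relative intensity = {intensity:.2f}")
```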

In a less precise and less demanding spirit, Everett suggests that every event splits time into a continuum of options, and those may or may not 'collapse' back into the same 'world'. So there may be multiple possible worlds, and you are just in one of them due to fine details, while other versions of you are in the others. Again, no randomness: every reality gets its own 'you', and all the bases are covered without any random choice.

Ultimately these three, and a few others, are all equally valid ways of framing the observations we run across at the quantum scale and the theories that predict them properly, and of making philosophical sense of them.

Heisenberg's famous 'uncertainty principle' is theoretically traceable to the fact that we are limited to using atoms and fields as ways of getting observations, and that those involve waves that necessarily interact with the thing you wish to measure before you succeed at taking a measurement. Attempts to correct for the aspects of those waves you cannot determine would involve another set of waves, and another set of values impossible to fix. This does not presume any basic underlying randomness of the universe, only effects that can never be measured or known. There are deterministic models of this inequality, although they have undesirable qualities of their own.
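For what it is worth, the formal inequality itself can be checked numerically. Here is a minimal sketch (my own illustration, using an arbitrary Gaussian wave packet on a grid, with ħ set to 1) verifying that the position and momentum spreads satisfy σx·σp ≥ ħ/2; it shows the formal bound rather than the measurement-disturbance story above.

```python
import numpy as np

hbar = 1.0
N, L = 4096, 80.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]

sigma = 1.5                                    # chosen width of the packet
psi = np.exp(-x**2 / (4 * sigma**2))
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)    # normalize

# Position spread from the probability density |psi|^2
prob_x = np.abs(psi)**2 * dx
mean_x = np.sum(x * prob_x)
sigma_x = np.sqrt(np.sum((x - mean_x)**2 * prob_x))

# Momentum spread from the Fourier transform of psi
phi = np.fft.fft(psi)
p = 2 * np.pi * hbar * np.fft.fftfreq(N, d=dx)
prob_p = np.abs(phi)**2
prob_p /= prob_p.sum()
mean_p = np.sum(p * prob_p)
sigma_p = np.sqrt(np.sum((p - mean_p)**2 * prob_p))

print(sigma_x * sigma_p, ">= hbar/2 =", hbar / 2)   # ~0.5: a Gaussian saturates the bound
```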

It is easier to do the math as if we have perfect focus and the target is always moving than to dwell on the idea that our attempts to focus are what moves the target. Again, both interpretations net the same data, so folks adopt the simpler one. If you object to it philosophically, there is still a valid option for addressing why we see the effect.

But the rest of your questions are answered by actually looking at the theory as it evolves. It is simply the best theory to cover what we observe. The Schroedinger equation is a generalization of a set of observations and the math they had in common, unifying an underlying notion of overlapping waves, whose exact phases we cannot know, combining in the most likely way. Its outputs are statistical because its inputs involve unknown quantities. It is not a philosophical statement, and does not address philosophical objections. But the results of experiments fit the distributions it predicts. So we keep it.

  • 4
    So there may be multiple possible worlds, and you are just in one of them due to fine details, .... Again, no randomness: ... How is there "no randomness" in that? Where is the determinism that selects for a specific one of multiple worlds for us? Commented Nov 6, 2015 at 11:12
  • 1
    @user2338816 There is no randomness because there is no selected world. There is no selection process at all. Every world is equal under that interpretation.
    – Keen
    Commented Nov 6, 2015 at 14:19
  • 1
    It's true that the uncertainty principle doesn't imply non-realism. But it's also true that the paragraph OrangeDog objects to is a bit misleading. What does a "perfect measurement" actually mean? In the linear algebra terms of quantum mechanics, a measurement is applying a Hermitian operator and receiving back a real number, which then puts the system in a known eigenvector state of the operator. What quality of that measurement would make it perfect or not, and what do atomic vibrations have to do with any of it? It may not be strictly false, but it does give an inaccurate impression. Commented Nov 6, 2015 at 17:13
  • 1
    There is no selection process at all. That feels like a fundamental characteristic of "randomness". Commented Nov 8, 2015 at 1:03
  • 1
    @user2338816 No, randomness is a selection process. If I make five cakes and give them to someone, they have not chosen a sample of five cakes, they have been given five cakes. Similarly, if there are two hundred copies of you, in two hundred different universes, there is no selection, no sample, no randomness; there is simply redundancy.
    – user9166
    Commented Nov 8, 2015 at 1:09
5

Where is this indeterministic/random "event" or "interaction" coming from? Does it come from nowhere, like a space invader in Newtonian dynamics?

To say that the randomness of quantum phenomena is coming from somewhere/something is to assert what physicists call a hidden variable theory. In such a theory, there would be apparent randomness, but there are hidden variables we are not yet aware of (for lack of better knowledge of physics, lack of better measurement equipment, etc...) which if known would provide a deterministic explanation of the apparent randomness. John Bell proved in his 1964 paper (Bell's theorem) that no local hidden variable theory can reproduce the results of quantum mechanics. His theorem was experimentally validated by Aspect et al in 1981 (and many subsequent teams), in the sense that the experimental results indicate that Quantum Mechanics holds.

The key terms here are "local" and "hidden variable". "Hidden variable" was already explained above. A "local" theory means any theory that precludes instantaneous interaction at a distance.

It is still possible for a non-local hidden variable theory to reproduce the results of quantum mechanics. We would eliminate the quantum randomness, but at the cost of removing locality as well.

You might say "well fine, locality is not essential". But the consequences of non-locality for causality are seen by some as even more disturbing than quantum randomness. Keep in mind that what is meant by non-local here is fully instantaneous interaction, not just faster than the speed of light. This means that it would be possible in theory for you to push a button on Earth and cause a cat to die on the other side of the galaxy without any signal having to travel from here to there. The implications of this spooky action at a distance for causality are significant, especially given the results of relativity (which are much more intuitive than people think they are).
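To make the Bell-theorem point concrete, here is a minimal numerical sketch (my own illustration, using the standard CHSH angles): any local hidden-variable model satisfies |S| ≤ 2, while quantum mechanics predicts |S| = 2√2 ≈ 2.83 for the singlet state, which is what the Aspect-style experiments observe.

```python
import numpy as np

# Quantum prediction for the singlet state: E(a, b) = -cos(a - b),
# where a and b are the two analyzers' angle settings.
def E_quantum(a, b):
    return -np.cos(a - b)

a1, a2 = 0.0, np.pi / 2             # Alice's two settings
b1, b2 = np.pi / 4, 3 * np.pi / 4   # Bob's two settings

S = E_quantum(a1, b1) - E_quantum(a1, b2) + E_quantum(a2, b1) + E_quantum(a2, b2)
print("quantum |S| =", abs(S))                     # 2*sqrt(2) ~ 2.83
print("local hidden-variable bound: |S| <= 2")

# A simple local hidden-variable model for comparison: both particles carry a
# shared hidden angle lam; each detector outputs sign(cos(setting - lam)).
rng = np.random.default_rng(0)
lam = rng.uniform(0, 2 * np.pi, 200_000)

def E_lhv(a, b):
    out_a = np.sign(np.cos(a - lam))
    out_b = -np.sign(np.cos(b - lam))              # anticorrelated, singlet-like
    return np.mean(out_a * out_b)

S_lhv = E_lhv(a1, b1) - E_lhv(a1, b2) + E_lhv(a2, b1) + E_lhv(a2, b2)
print("this local model gives |S| =", abs(S_lhv))  # stays within 2
```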

And why does it seem to follow objective probabilities at the very least? What "fixes" that probabilistic distribution, and why must it be fixed like that instead of being completely random/chaotic (in the sense of completely patternless, not Chaos Theory, which is fully deterministic)?

Quantum randomness follows objective probabilities because of the wave function, which gives a quantifiable measure of the probability of each outcome. Again, the mathematics of quantum wave functions has been validated successfully by experimental means (it is the basis of the CMOS circuits in whatever device you are accessing Philosophy SE on).
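A minimal sketch of how the wave function fixes those probabilities (the particular state vector below is just an arbitrary example of mine): the Born rule assigns each outcome the squared magnitude of its amplitude, and repeated measurements reproduce exactly that distribution.

```python
import numpy as np

# Born rule: for a normalized state vector, the probability of each outcome
# is the squared magnitude of the corresponding amplitude.
state = np.array([1 + 1j, 2, 0.5j])         # arbitrary example state
state = state / np.linalg.norm(state)       # normalize

probabilities = np.abs(state) ** 2
print(probabilities, probabilities.sum())    # fixed by the state; sums to 1

# Simulated repeated measurements follow this objective distribution:
rng = np.random.default_rng(0)
outcomes = rng.choice(len(state), size=100_000, p=probabilities)
print(np.bincount(outcomes) / len(outcomes))  # empirical frequencies ~ probabilities
```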

3

Determinism and randomness play a double role in Quantum Mechanics (QM): the fundamental equations of QM, like the Schroedinger or Dirac equation, are differential equations similar to the differential equations of classical mechanics. And differential equations are the paradigm of deterministic development.

On the other hand, the differential equations of QM do not deal with observable quantities like position or velocity, but with a certain function, named the psi-function. It took some time until Born proposed to interpret the psi-function (more precisely, its squared magnitude) as a probability. Hence

QM deals with the deterministic development of probability.

In many cases the psi-function does not make a deterministic prediction about the outcome of a single experiment. It predicts only the probability of the different outcomes when the experiment is repeated many times with the same preparation. The typical example is the decay of a radioactive atom.
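As a toy sketch of this "deterministic development of probability" (a two-level example of my own, not a realistic model of radioactive decay): the psi-function at every instant is fully determined by the Schroedinger equation, yet it yields only the probability of each outcome of a single measurement.

```python
import numpy as np

hbar = 1.0
Omega = 1.0
H = np.array([[0.0, Omega],
              [Omega, 0.0]])                 # toy two-level Hamiltonian

# Deterministic evolution: psi(t) = exp(-i H t / hbar) psi(0)
eigvals, eigvecs = np.linalg.eigh(H)
def psi(t, psi0=np.array([1.0, 0.0])):
    U = eigvecs @ np.diag(np.exp(-1j * eigvals * t / hbar)) @ eigvecs.conj().T
    return U @ psi0

for t in [0.0, 0.5, 1.0, 1.5]:
    p1, p2 = np.abs(psi(t)) ** 2             # Born rule: probabilities of the two outcomes
    print(f"t = {t}:  P(state 1) = {p1:.3f},  P(state 2) = {p2:.3f}")
```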

A widespread interpretation of these results is the Copenhagen interpretation, which considers QM a complete theory and this type of probability an inherent property of nature. But today this interpretation is continuously questioned, e.g. by the many-worlds interpretation according to Everett.

2

Quantum randomness is indeed different from the classical kind, where uncertainty can be reduced to lack of information about an underlying full picture, and a description of that full picture is hypothetically provided. The mathematical formalism of quantum mechanics does not provide these two levels: there is a probabilistic wave function, but there is no underlying full picture. Some mathematical theorems show that providing such a deterministic underlying picture comes at a high price; in particular, many symmetries are lost, and faster-than-light motion has to be allowed for hidden variables, even though no such possibility remains once everything is translated into the actual predictions of quantum mechanics. Because of this feature, no deterministic interpretation of quantum mechanics has so far been successfully extended to quantum field theory, which supersedes quantum mechanics and special relativity, despite decades-long efforts by Bohm and others.

Perhaps part of the aversion to quantum randomness comes from an analogy to the classical situation, where one thinks of some underlying picture with causal gaps in it that account for miraculous randomness. This is indeed how indeterminism enters the classical mechanics of non-Lipschitz systems, like the n-body problem for gravity, but that is not how it enters quantum theory. On the Bohmian interpretation of quantum mechanics the probability distribution comes from uncertainty about the initial distribution of Bohmian particles, the hidden variables of the theory, which is unknowable because individual particles are undetectable in principle; only the wave function is. Since the luminiferous ether, most physicists are wary of such objects. Mainstream interpretations (Feynman's, for example) ask us instead to give up the idea that there exists a mathematically describable "absolute precision" reality about which one can reason classically in the first place. And there is no more reason to presume that the reality we face is completely lawless than to presume that it is completely deterministic: there is regularity expressed in probabilistic laws, but no classical certainty. All that physics can predict is probabilities of future events conditioned on past events; randomness does not enter "miraculously", it is built in, so to speak.

How satisfactory is such an answer? Clearly, we would like more, and classical physics "spoiled" us into expecting more. But it breaks down at small scales, and there is no reason why our classical intuitions, acquired from experience with macroscopic objects, should have any bearing on how the world should be at small scales (or very large ones, for that matter). The "miracle" of randomness presupposes that we intuitively project classical intuition onto the underlying reality first. For example, asking to explain a random event presupposes some form of the principle of sufficient reason, that no little detail happens without a cause, but that is more or less determinism itself. So why should we presuppose it? Again, it is more or less an extrapolation of classical experience (since in practice we were never really able to predict anything "in every detail", not even the motion of planets). Our best evidence about behavior at small scales comes not from intuition but from empirical evidence summarized in the mathematics of quantum theory. This mathematics so far cannot be fully forced into deterministic interpretations, and for the parts that can be, the result does not retain the regularities manifest in the probabilistic picture. Here is a historical analogy. Early examples of waves were mechanical waves in a medium, so when it was discovered that light was a wave, the natural assumption was that it spreads in a medium too. But eventually the picture of this medium became so unattractive that the idea of a medium was given up, even though the idea of waves without a medium is counterintuitive. There is no a priori reason to think that intuitions and expectations formed classically should be a good guide to describing an aspect of reality that was never encountered classically.

  • I know you and I have been here before when it comes to this topic! Again though, much of your response seems to center on the Bohmian interpretation of QM, as if it is the only realist position. The increasingly popular MWI/Everettian picture doesn't suffer from any of the above problems. It can be trivially extended to QFT, as many of the other deterministic interpretations can be. So I must reiterate that I just don't see how you believe indeterminism to be so well established.
    – Pete1187
    Commented Nov 6, 2015 at 14:49
  • 1
    @Pete1187 I know we are unlikely to persuade each other, but we can explain ourselves to each other and leave it at that. MWI has the same problem as Bohm, proliferation of entities that are unobservable in principle (decohered branches). And like all interpretations of QM that single out time, both deterministic and indeterministic like Copenhagen, it does not naturally extend to QFT, again despite long efforts. Because in QFT time is relative, and running branching or collapses in different frames turns out to be inconsistent.
    – Conifold
    Commented Nov 9, 2015 at 21:36
2

When one actually looks at various cosmologies, as opposed to the modern classical one based on the Newtonian paradigm, the concept of randomness has clear irreducible precursors; for example:

  1. Anaximander's apeiron - the unbounded - the stuff out of which the world is made.

  2. Lucretian atomic physics has the clinamen - an irreducible iota of random motion.

  3. Hesiod's Theogony - a rationalisation of Greek myth - has chaos arising spontaneously.

  4. Genesis, the opening book of the Bible, has 'the earth was without form and void, and darkness was upon the face of the deep'.

  5. The Tao says that it is both being and non-being that are the 'mother of ten thousand things'.

  6. Notably, Hegel, like the Tao, equivocates between both: pure indeterminate Being is equivalent (but not identical) to pure Nothingness.

It's only really the success of the mechanistic philosophy that followed Newtonian physics that made necessity and determinateness the irreducible point of physical phenomena; this is why, when uncertainty was discovered in the early 20th century, first empirically in radioactivity and then theoretically established in QM, it was seen as a new phenomenon rather than the reintroduction and re-establishment of an antique idea.

  • 1
    Never thought of that, but yes indeed, determinism is the exception not the rule. But since you mention it, if I recall correctly some of the stoics adhered to strict determinism, long before Newtonian mechanics was established. Commented Nov 6, 2015 at 6:40
  • @alexander King: I didn't know that about the stoics; I'd go along with Aristotle's characterisation of physical principles, in that they come in pairs: so not just indeterminateness and nothing else, nor just determinateness and nothing else, but both - and irreducibly so. Commented Nov 6, 2015 at 17:35
1

Looking over these answers, the question of whether or not there is true randomness in the universe seems logically equivalent to whether or not there is free will. They're the same question -- how many things in the universe can be said to happen but not be caused by anything external to them.

It makes intuitive sense that there must be at least one uncaused event/thing in the history of the universe. Spinoza calls that thing "God."

The problem with getting any farther in either randomness or free will is that, although there are events or acts for which we do not know the causes, that doesn't prove that the causes do not exist. We very often find that new people, events, and ideas explain things that had been heretofore unexplainable. In fact, in order to pursue any line of thought, we must say "Until such time as new information changes our perspective, we think x."

So, a random event must always mean something like "an event that we currently understand to be not determined by anything external to it, although we might find later that it was." Free will must remain a shorthand for "that which we cannot currently find determination for, and so may very well be uncaused."

Neither is a solvable problem or an answerable question, but both ideas are practically useful. In order to state that a given event is random/an act of free will, you must have total knowledge of all things in the universe. Which you ain't getting anytime soon.

I don't believe in either actual randomness or free will (or, rather, I don't believe that I could prove definitively that any individual event or decision is uncaused), but still use both concepts -- to me, they simply mean "the events or decisions made that I don't currently know the causes of, but still wish to make use of in my thinking."

  • 2
    Logically equivalent, but opposite in value: if I was told that my thoughts were random, to me, that would be the opposite of having free will. We want our causes, and we want them to be OURS.
    – user16869
    Commented Nov 6, 2015 at 16:36
1

One way of looking at things is the "Greenness Disappears" form of argument: it is question-begging to explain something in terms of itself. (A green thing ultimately must be made up of colorless things, like atoms, or you have not explained why it IS green.)

So, the only way that things can be deterministic ultimately, is by being based on uncaused (whether you label that "random" is up to you) events below that. Otherwise you can keep digging with your deterministic shovel until you fall out the other side of the Earth.

When people realize this, they should know to be happy with a sufficient explanation, and not go on asking "why why why " like a child.

0

I didn't see it mentioned here, but it may be worth adding to the discussion that the nth digit of a transcendental number such as π or e is essentially random.

The knowledge needed to calculate such a number, e.g. with a Taylor series, is also typically precisely the mechanism one would use to verify it.

However, if one starts at a random place in a transcendental's expansion, the successive digits are not predictable (without knowing the preceding digits, or knowing the offset and having a formula for calculating the succeeding digits).
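A minimal sketch of that point, assuming the Python mpmath library is available: the first few thousand decimal digits of π have roughly uniform frequencies, which is the sense in which they "look" random, even though every digit is completely determined and recomputation reproduces the same string.

```python
from collections import Counter
from mpmath import mp

mp.dps = 5010                                    # working precision in decimal digits
pi_str = mp.nstr(mp.pi, 5000)                    # "3.14159..." to 5000 significant digits
digits = pi_str[2:]                              # drop the leading "3."

counts = Counter(digits)
for d in sorted(counts):
    print(d, round(counts[d] / len(digits), 3))  # each digit's frequency is close to 0.1
```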

These numbers occur as fundamentals of mathematics and as observable phenomena, such as π being the ratio of every circle's circumference to its diameter. So we see transcendentals in nature, leading to some strange means of prediction.

So the phenomena we observe in reality, such as quantum mechanics, could derive from an arbitrarily determined precision of a transcendental. In such cases, the randomness is a consequence of universal, fundamental, and immutable laws of mathematics.
