
The classic example of an indeterministic system is a radioactive isotope, e.g. the one that kills Schrödinger's cat.

I get that there are arguments against hidden variables in quantum mechanics, but how could they be so sure, back in the twenties, that the strong nuclear forces involved in radioactivity were governed by true randomness rather than hidden variables?

Einstein was very unhappy about the indeterminism of quantum mechanics even for well-understood effects like Young's double-slit experiment, but it seems ideological and brash on the part of Heisenberg & Co. to extend that indeterminism to phenomena they hadn't even begun to understand, like alpha decay.

Is there a reason for this early self-assuredness in postulating indeterminism?

  • The reason is more philosophical than physical: quantum mechanics can be given a completely deterministic interpretation (although it is not local). I will add more on this later (maybe tomorrow).
    – Ali
    Commented Jul 18, 2013 at 19:38
  • De Broglie did propose a hidden-variable theory for the electron: en.wikipedia.org/wiki/…
    – user4552
    Commented Jul 18, 2013 at 20:42
  • @Ali, would you explain more about the deterministic interpretation? Thanks.
    – user26143
    Commented Jul 22, 2013 at 19:15
  • @user26143 Sure.
    – Ali
    Commented Jul 23, 2013 at 9:42

2 Answers


Schrödinger came up with the cat in 1935, which was relatively late in the development of quantum mechanics.

Back in the 1920s there had been much more uncertainty. The Copenhagen school had wanted to quantize the atom while leaving the electromagnetic field classical, as formalized in the Bohr-Kramers-Slater (BKS) theory. De Broglie's 1924 thesis included the hypothesis that there were hidden variables involved in the electron. At that time virtually nothing was known about the nucleus; the neutron had been hypothesized but not experimentally confirmed.

But we're talking about 1935. This was after the uncertainty principle, after the Bothe-Geiger experiment, after the discovery of the neutron, and after the EPR paper. (Schrödinger proposed the cat in a letter discussing the interpretation of EPR.) By then it had long been appreciated that if you tried to quantize one field but not another, as in BKS, you had to pay a high price: conservation of energy-momentum would hold only on a statistical basis. Moreover, experiments had falsified such a mixed picture for electrons interacting with light. It would have been very unnatural to quantize electrons and light but not neutrons and protons: neutrons and protons were material particles, and therefore in the same conceptual category as electrons, which had been the first particles to be quantized. Ivanenko had already proposed a nuclear shell model in 1932.

  • Great answer! I didn't know the history behind the question when I gave my answer.
    – Bubble
    Commented Jul 18, 2013 at 23:52

It was known back in the 1920s that the nucleus existed, and if you ever did experiments with nuclear decay, you would see the hallmarks of a Poisson process. I am talking about simple undergraduate experiments which, I guess, in most countries even theoreticians specializing in other branches have to do before they get their degree (as I did).

You could also see that the half-life of a sample does not depend on the size of the sample. This rules out any claim that some unknown deterministic interaction between nuclei causes the statistics-like behavior, because then the physics would change with the size of the sample. Indeed, if each nucleus decays independently with a fixed probability $\lambda$ per unit time, then $N(t) = N_0 e^{-\lambda t}$, and the half-life $t_{1/2} = \ln 2 / \lambda$ contains no trace of the initial number $N_0$ (the sketch below illustrates this). The only deterministic option left is some hidden determinism inside the nucleus itself that makes decay simulate a Poisson process.

That would make the nucleus a highly complicated system indeed. Statistical physics teaches us how equilibrium classical systems behave, so why would the nucleus not be at equilibrium with itself? It would have to be very special to deviate from Boltzmann's theory, which would require highly unusual behavior and completely unknown physical mechanisms. A theory like that would look very unnatural.

It is a much better and more natural conclusion to extend the quantum indeterminism known from previously understood experiments to the nucleus itself, and in the end this approach proved right. When you are actively researching a completely new theory, you can never be 100% sure that what you are doing is correct until the research is done. You look for the most natural and consistent theory you can find and hope for the best. :)
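To make the sample-size argument concrete, here is a minimal simulation sketch. It is my illustration, not something from the answer above: the per-step decay probability `LAMBDA` and the population sizes are arbitrary choices. If each nucleus decays independently with the same fixed probability per time step, the measured half-life comes out the same for any initial population, and the counts per interval show the Poisson signature of variance roughly equal to mean:

```python
# Minimal sketch (illustration only, not from the original answer):
# simulate memoryless per-nucleus decay and check the two signatures
# described above. LAMBDA and the sample sizes are arbitrary.
import random

random.seed(0)
LAMBDA = 0.02  # decay probability per nucleus per time step (illustrative)

def half_life(n0):
    """Time steps until a population of n0 independent nuclei halves."""
    n, t = n0, 0
    while n > n0 // 2:
        n -= sum(random.random() < LAMBDA for _ in range(n))
        t += 1
    return t

# (a) Half-life is independent of sample size: every run should land
# near ln(2)/LAMBDA ~ 35 steps, whatever the initial population.
for n0 in (1_000, 10_000, 100_000):
    print(f"N0 = {n0:>7}: half-life ~ {half_life(n0)} steps")

# (b) Counts per interval from a source held at constant size N are
# binomial(N, LAMBDA), which for small LAMBDA is close to Poisson:
# the variance of the counts should come out close to their mean.
N, intervals = 20_000, 200
counts = [sum(random.random() < LAMBDA for _ in range(N)) for _ in range(intervals)]
mean = sum(counts) / intervals
var = sum((c - mean) ** 2 for c in counts) / intervals
print(f"counts per interval: mean ~ {mean:.1f}, variance ~ {var:.1f}")
```

A deterministic interaction between nuclei would generically make the first set of print-outs depend on the initial population, which is exactly what the measured half-lives rule out.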

