
The intuition I have for why quantum computing can perform better than classical computing is that the wave-like nature of wavefunctions allows you to interfere multiple states of information with a single operation, which theoretically could allow for exponential speedup.

But if it really is just constructive interference of complicated states, why not just perform this interference with classical waves?

And on that matter, if the figure of merit is simply how few steps something can be calculated in, why not start with a complicated dynamical system that has the desired computation embedded in it? (I.e., why not just create "analog simulators" for specific problems?)

  • Are you familiar w/ photonic or phononic computing?
    – user820789
    Commented Jul 4, 2018 at 3:18
  • @meowzz Yes, I'm familiar. Photonic computing is a particular example which has been shown to be particularly promising for fast matrix multiplication for neural nets (but I'm wondering if anyone looks at nonlinear classical systems). "Quantum analog simulators" are a newish topic that some groups are working on, and I am asking the more general question of why exactly classical "analog simulators" are assumed to be inferior. Commented Jul 4, 2018 at 4:42
  • This question is essentially the same as this one: What's the difference between a set of qubits and a capacitor with a subdivided plate? Commented Jul 5, 2018 at 6:08
  • Where is the main assertion coming from? I mean, the claim that the speed-up is due to the "wave-like nature" of QM? Commented Jul 9, 2018 at 21:26

6 Answers


Your primary assertion, that the mathematics of waves mimics that of quantum mechanics, is the right one. In fact, many of the pioneers of QM used to refer to it as wave mechanics for precisely this reason. It is then natural to ask, "Why can't we do quantum computing with waves?"

The short answer is that quantum mechanics allows us to work with an exponentially large Hilbert space while spending only polynomial resources. That is, the state space of $n$ qubits is a $2^n$ dimensional Hilbert space.
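This doubling can be illustrated in a few lines of Python (a sketch of my own; `hilbert_dim` is a hypothetical helper name, not anything from the answer):

```python
# Sketch: the joint state space of n qubits has dimension 2**n, so the
# number of complex amplitudes needed to write down a general state
# doubles with every qubit added, while the number of physical
# carriers grows only linearly.

def hilbert_dim(n_qubits):
    """Dimension of the n-qubit Hilbert space."""
    return 2 ** n_qubits

for n in (1, 2, 10, 50):
    print(f"{n} qubits -> {hilbert_dim(n)} amplitudes")
```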

One cannot construct an exponentially large Hilbert space from polynomially many classical resources. To see why this is the case, let us look at two different kinds of wave-mechanics-based computers.

The first way to build such a computer would be to take $n$ two-level classical systems. Each system by itself could then be represented by a 2D Hilbert space. For example, one could imagine $n$ guitar strings with only the first two harmonics excited.

This setup will not be able to mimic quantum computing because there is no entanglement. So any state of the system will be a product state and the combined system of $n$ guitar strings cannot be used to make a $2^n$ dimensional Hilbert space.
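The product-state restriction can be checked concretely. As a sketch (plain Python; the `is_product_state` helper is my own illustration, not from the answer): a two-qubit amplitude vector $(a_{00},a_{01},a_{10},a_{11})$ factors into a product of single-system states exactly when $a_{00}a_{11}-a_{01}a_{10}=0$, and an entangled state such as the Bell state fails this test.

```python
import math

# Sketch: a pure state of two 2-level systems with amplitudes
# a00, a01, a10, a11 is a product state exactly when the 2x2
# amplitude matrix has rank 1, i.e. a00*a11 - a01*a10 == 0.

def is_product_state(a00, a01, a10, a11, tol=1e-12):
    """True if the amplitudes factor as (alpha, beta) x (gamma, delta)."""
    return abs(a00 * a11 - a01 * a10) < tol

# |+>|+>: the kind of joint state two independent strings can reach.
print(is_product_state(0.5, 0.5, 0.5, 0.5))   # True (product state)

# Bell state (|00> + |11>)/sqrt(2): no product decomposition exists.
b = 1 / math.sqrt(2)
print(is_product_state(b, 0.0, 0.0, b))       # False (entangled)
```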

The second way one could attempt to construct an exponentially large Hilbert space is to use a single guitar string and to identify its first $2^n$ harmonics with the basis vectors of the Hilbert space. This is done in @DaftWullie's answer. The problem with this approach is that the frequency of the highest harmonic one needs to excite will scale as $O(2^n)$. And since the energy of a vibrating string scales quadratically with its frequency, we will need an exponential amount of energy to excite the string. So in the worst case, the energy cost of the computation can scale exponentially with the problem size.
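The scaling in this argument can be written out explicitly: with fundamental frequency $\omega_1$ and fixed mode amplitude $A$,
$$ \omega_N = N\omega_1, \qquad E_N \propto A^2\omega_N^2, $$
so reaching the $N=2^n$-th harmonic costs $E_{2^n} \propto A^2\omega_1^2\,4^n$, i.e. an energy budget exponential in $n$.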

So the key point here is that classical systems lack entanglement between physically separable parts. And without entanglement, we cannot construct exponentially large Hilbert spaces with polynomial overhead.

  • "This setup will not be able to mimic quantum computing because there is no entanglement." – A quantum computer is not required to have entanglement.
    – Jitendra
    Commented Jan 5, 2019 at 0:46
    @Jitendra What do you mean by "a quantum computer is not required to have entanglement"? Indeed, a quantum computer would technically not need entanglement in order to be a quantum computer, but without entanglement it would not be able to work with $2^n$ tones/waves/states. Commented Mar 9, 2021 at 14:10

I myself often describe the source of the power of quantum mechanics as being due to 'destructive interference', which is to say the wave-like nature of quantum mechanics. From the standpoint of computational complexity, it is clear that this is one of the most important and interesting features of quantum computation, as Scott Aaronson (for example) notes. But when we describe it in this very brief way — that "the power of quantum computation is in destructive interference / the wave-like nature of quantum mechanics" — it is important to notice that this sort of statement is a short-hand, and necessarily incomplete.

Whenever you make a statement about "the power" or "the advantage" of something, it is important to bear in mind: compared to what? In this case, what we are comparing to is specifically probabilistic computing: and what we have in mind is not just that 'something' is acting like a wave, but specifically that something which is otherwise like a probability is acting like a wave.

It must be said that probability itself, in the classical world, already does act a bit like a wave: specifically, it obeys a sort of Huygens' principle (you can understand the propagation of probabilities by summing over the contributions from individual initial conditions — or in other words, by a superposition principle). The difference, of course, is that probability is non-negative, and so can only accumulate, and its evolution is essentially a form of diffusion. Quantum computation manages to exhibit wave-like behaviour with probability-like amplitudes, which can be negative (indeed, complex); and so it is possible to see destructive interference of these amplitudes.
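A minimal numerical illustration of this contrast (plain Python, my own sketch): a Hadamard-style mixing step applied twice to amplitudes undoes itself by destructive interference, while the analogous stochastic mixing of probabilities can only diffuse.

```python
import math

# Amplitudes can cancel, probabilities cannot. Apply the 2x2 Hadamard
# map twice to |0>; the |1> contributions interfere destructively and
# we return to |0> with certainty.

def hadamard(state):
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

amp = hadamard(hadamard((1.0, 0.0)))
print([round(x, 10) for x in amp])   # [1.0, 0.0] -- back to |0>

# The classical analogue: a stochastic "fair mixing" step on the
# probability vector (1, 0). Probabilities only accumulate/diffuse;
# a second application cannot undo the first.
def mix(probs):
    p0, p1 = probs
    return (0.5 * p0 + 0.5 * p1, 0.5 * p0 + 0.5 * p1)

print(mix(mix((1.0, 0.0))))          # (0.5, 0.5) -- no way back
```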

In particular, because the things which are acting as waves are things like probabilities, the 'frequency space' in which the system evolves can be exponential in the number of particles involved in the computation. This general sort of phenomenon is necessary if you want an advantage over conventional computation: if the frequency space scaled polynomially with the number of systems, and the evolution itself obeyed a wave equation, the obstacles to simulation with classical computers would be easier to overcome. If you want to achieve similar computational advantages with other kinds of waves, you have to ask yourself how you intend to squeeze an exponential number of distinguishable 'frequencies' or 'modes' into a bounded energy space.

Finally, on a practical note, there is the question of fault-tolerance. Another side-effect of the wave-like behaviour being exhibited by probability-like phenomena is that you can perform error correction by testing parities or, more generally, coarse-grainings of marginal distributions. Without this facility, quantum computation would essentially be limited to a form of analog computation, which is useful for some purposes but is limited by its sensitivity to noise. We do not yet have fault-tolerant quantum computation in built computer systems, but we know that it is possible in principle and we are aiming for it; whereas it is unclear how any similar thing could be achieved with water waves, for instance.

Some of the other answers touch on this same feature of quantum mechanics: 'wave-particle duality' is a way of expressing the fact that we have something probabilistic about the behaviour of individual particles which are acting like waves, and remarks about scalability and the exponential size of the configuration space follow from this. But underlying these slightly higher-level descriptions is the fact that we have quantum amplitudes, behaving like elements of a multi-variate probability distribution, evolving linearly with time and accumulating, but able to be negative as well as positive.


I don't claim to have a full answer (yet! I hope to update this, as it's an interesting issue to try and explain well). But let me start with a few clarifying comments...

But if it really is just constructive interference of complicated states, why not just perform this interference with classical waves?

The glib answer is that it's not just interference. I think what it really comes down to is that quantum mechanics uses different axioms of probability (probability amplitudes) from those of classical physics, and these are not reproduced in the wave scenario.

When someone writes about "waves", I naturally think about water waves, but that may not be the most helpful picture to have. Let's think instead about an ideal guitar string. On a string of length $L$ (pinned at both ends), the normal modes are $$ y_n(x,t)=A_n\sin\left(\omega_n t\right)\sin\left(\frac{n\pi x}{L}\right). $$ Let's define the concept of a w-bit ("wave bit"). We can limit ourselves to, say, the first 4 modes of the string, so you can associate $$ |00\rangle\equiv y_1 \qquad |01\rangle\equiv y_2\qquad |10\rangle\equiv y_3 \qquad |11\rangle\equiv y_4 $$ Now, since we can prepare the initial shape of the string to be anything we want (subject to the boundary conditions), we can create any arbitrary superposition of those 4 states. So the theory certainly includes things that look like superposition and entanglement.

However, they are not superposition and entanglement as we understand them in quantum theory. A key feature of quantum theory is that it contains indeterminism: the outcomes of some measurements are inherently unpredictable. We don't start or end our computation at these points, but we must pass through them somewhere during the computation$^*$. For example, experimental tests of Bell's theorem have ruled out locally deterministic (local hidden-variable) descriptions of the world (and, so far, conform to what quantum theory predicts). The wave-bit theory is entirely deterministic: I can look at the string of my guitar, whatever weird shape it might be in, and my looking at it does not change its shape. Moreover, I can even determine the values of the $\{A_n\}$ in a single shot, and therefore know what shape it will be in at all later times. This is very different from quantum theory, where there are different bases that can give me different information, but I can never access all of it (indeterminism).

$^*$ I don't have a complete proof of this. We know that entanglement is necessary for quantum computation, and that entanglement can demonstrate indeterminism, but that's not quite enough for a precise statement. Contextuality is a similar measure of indeterminism for single qubits, and results along those lines (see here) have recently started to become available for broad classes of computations.


Another way to think about this might be to ask what computational operations we can perform with these waves. Presumably, even if you allow some non-linear interactions, the operations can be simulated by a classical computer (after all, classical gates include non-linearity). I assume that the $\{A_n\}$ function like classical probabilities, not probability amplitudes.

This might be one way of seeing the difference (or at least heading in the right direction). There's a way of performing quantum computation called measurement-based quantum computation. You prepare your system in some particular state (which, we've already agreed, we could do with our w-bits), and then you measure the different qubits. Your choice of measurement basis determines the computation. But we can't do that here, because we don't have that choice of basis.

And on that matter, if the figure of merit is simply how few steps something can be calculated in, why not start with a complicated dynamical system that has the desired computation embedded in it? (I.e., why not just create "analog simulators" for specific problems?)

This is not the figure of merit. The figure of merit is really "how long does it take to perform the computation?" and "how does that time scale as the problem size changes?". If we choose to break everything down in terms of elementary gates, then the first question is essentially how many gates there are, and the second is how the number of gates scales. But we don't have to break it down like that. There are plenty of "analog quantum simulators"; Feynman's original specification of a quantum computer was one such analogue simulator. It's just that the time feature manifests in a different way. There, you're talking about implementing a Hamiltonian evolution $H$ for a particular time $t_0$, $e^{-iHt_0}$. Now, sure, you could implement $2H$ and replace $t_0$ with $t_0/2$, but practically the coupling strengths in $H$ are limited, so there's a finite time that things take, and we can still ask how that time scales with the problem size. Similarly, there's adiabatic quantum computation. There, the time required is determined by the energy gap between the ground state and the first excited state: the smaller the gap, the longer your computation takes. We know that all three models are equivalent in the time they take (up to polynomial conversion factors, which are essentially irrelevant if you're talking about an exponential speed-up).
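As a toy illustration of the $e^{-iHt_0}$ picture (a single-qubit sketch of my own, not taken from any particular simulator): choosing $H=X$ (the Pauli-X matrix) and $t_0=\pi/2$ makes the evolution a bit flip, up to a global phase, since $e^{-iXt}=\cos(t)\,I-i\sin(t)\,X$.

```python
import math

# Toy model of analogue/Hamiltonian computation: pick H and a time t
# so that exp(-i H t) is the operation you want. For H = X (Pauli-X),
# exp(-i X t) = cos(t) I - i sin(t) X, so t = pi/2 enacts a bit flip
# up to a global phase of -i.

def evolve_under_X(state, t):
    """Apply exp(-i X t) to a single-qubit state (a, b)."""
    a, b = state
    c, s = math.cos(t), math.sin(t)
    return (c * a - 1j * s * b, c * b - 1j * s * a)

out = evolve_under_X((1.0, 0.0), math.pi / 2)
print(out)                # approximately (0, -1j): |0> flipped to |1>
print(abs(out[1]) ** 2)   # 1.0 -- measuring gives outcome 1 with certainty
```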

So, analog quantum simulators are certainly a thing, and there are those of us who think they're a very sensible thing at least in the short-term. My research, for example, is very much about "how do we design Hamiltonians $H$ so that their time evolution $e^{-iHt_0}$ creates the operations that we want?", aiming to do everything we can in a language that is "natural" for a given quantum system, rather than having to coerce it into performing a whole weird sequence of quantum gates.

  • Thanks. Commenting on the first part, I agree that collapse seems to be the main difference. I would think wave-function collapse, in most cases, only slows things down. I believe (maybe incorrectly?) that if you break down a quantum algorithm there's a "write phase", a "processing phase", and a "read phase". I could be wrong, but I think that the number of "steps" or "operations" for a quantum computer is not the number of gate operations, but is determined by how many times you need to measure the system to fully determine your output with high likelihood. Commented Jul 5, 2018 at 18:06
  • If you knew your output state without having to collapse and then reconstruct, I would think that improvements would be even better. (Also, as a separate comment, I wonder if you could simulate collapse by "pinching" the string, which forces a deterministic collapse to a mode matching the new boundary condition.) Commented Jul 5, 2018 at 18:07
  • @StevenSagona Regarding your first comment and the number of times you need to measure: the trick with a quantum algorithm is that the final answer will be something that is definitely in the basis that you're measuring. So you don't need to determine probability distributions or anything: your output is exactly the measurement outcome.
    – DaftWullie
    Commented Jul 6, 2018 at 9:37
  • @StevenSagona Regarding "knowing the state without having to collapse", it's almost the opposite that's true. Imagine there are lots of possible routes from input to output. You want to compute by picking the shortest possible route. Generically, a route will go through positions where you cannot know everything about the system simultaneously. If you make the artificial restriction that you have to follow a path where you always know everything, you're following a more restricted set of paths. Chances are, it doesn't contain the globally shortest path.
    – DaftWullie
    Commented Jul 6, 2018 at 9:40
  • I don't think it is correct to say that this system can produce entanglement. You can represent any vector space using the harmonics of a string, that is correct. But if you take two separate strings and look at the combined space, the state of the system will always be a product state. Entanglement cannot be produced between two separate classical systems.
    – biryani
    Commented Jul 7, 2018 at 10:18

Regular waves can interfere, but cannot be entangled.
An example of an entangled pair of qubits, which cannot occur with classical waves, is given in the first sentence of my answer to this question: What's the difference between a set of qubits and a capacitor with a subdivided plate?

Entanglement is considered to be the crucial thing that gives quantum computers advantage over classical ones, since superposition alone can be simulated by a probabilistic classical computer (i.e. a classical computer plus a coin flipper).
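To make the last point concrete (a hypothetical sketch of my own): the measurement statistics of a single qubit in the state $(|0\rangle+|1\rangle)/\sqrt{2}$ are exactly those of a fair coin, so a classical sampler reproduces them with no quantum hardware at all.

```python
import random

# Sketch: sampling measurements of the single-qubit state
# (|0> + |1>)/sqrt(2). P(0) = P(1) = 1/2, which a fair coin flip
# reproduces exactly.

def measure_plus_state(rng):
    """One measurement outcome of (|0>+|1>)/sqrt(2) in the 0/1 basis."""
    return 0 if rng.random() < 0.5 else 1

rng = random.Random(0)   # seeded for reproducibility
samples = [measure_plus_state(rng) for _ in range(10_000)]
print(sum(samples) / len(samples))   # close to 0.5
```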

  • For the sake of completeness, given that it is directly relevant to your answer, you should maybe copy the relevant part of your other answer rather than making readers chase after it. Commented Jul 13, 2018 at 10:04
  • I agree that it's inconvenient when someone cites a paper/article/book/SE question but doesn't tell you where in the reference to look; then you have to "chase" the relevant part. However, here I said "is given in the first sentence of my answer to quantumcomputing.stackexchange.com/questions/2225/…", so they know the exact sentence to look at. That sentence is even shorter than the sentence here describing it. Commented Jul 13, 2018 at 11:49

What makes quantum wave mechanics different from classical is that the wave is defined over a configuration space with a huge number of dimensions. In nonrelativistic undergraduate quantum mechanics (which is good enough for a theoretical discussion of quantum computing), a system of $n$ spinless point particles in 3D space is described by a wave in $\mathbb{R}^{3n}$, which for $n=2$ already has no analogue in classical mechanics. All quantum algorithms exploit this. It may be possible to exploit classical wave mechanics to improve certain calculations (analog computing), but not using quantum algorithms.

The usual model of quantum computing uses qubits that can only be in two states ($\{0,1\}$), not a continuum of states ($\mathbb{R}^3$). The closest classical analogue to that is coupled pendulums, not continuous waves. But there's still an exponential difference between the classical and quantum cases: the classical system of $n$ pendulums is described by $n$ positions and momenta (or $n$ complex numbers), while the quantum system is described by $2^n$ complex amplitudes (or $2^n$ abstract "positions" and "momenta", though quantum physicists never talk that way).


"why not just perform this interference with classical waves?"

Yes, this is one way we can simulate quantum computers on regular digital computers: we simulate the "waves" using floating-point arithmetic. The problem is that it does not scale. Every qubit doubles the number of dimensions. For 30 qubits you already need about 8 gigabytes of RAM just to store the "wave", i.e. the state vector. At around 40 qubits we run out of computers big enough to do this.
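The memory figure can be reproduced with a quick back-of-envelope calculation (my own sketch, assuming single-precision complex amplitudes at 8 bytes each, consistent with the "about 8 gigabytes for 30 qubits" figure):

```python
# Sketch: RAM needed to hold a dense n-qubit state vector, assuming
# single-precision complex amplitudes (8 bytes each).

def statevector_gib(n_qubits, bytes_per_amplitude=8):
    """Gibibytes required for a dense 2**n-amplitude state vector."""
    return (2 ** n_qubits) * bytes_per_amplitude / 2 ** 30

for n in (30, 40, 50):
    print(f"{n} qubits: {statevector_gib(n):,.0f} GiB")
# 30 qubits need 8 GiB; each extra qubit doubles the requirement.
```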

A similar question was asked here: What's the difference between a set of qubits and a capacitor with a subdivided plate?

  • At the moment there are three answers to this question, all of which have been downvoted several times. It is not clear to me that downvoting is serving any purpose here. Perhaps these answers are not "perfect" or are not addressing the question, but downvoting does not really help to encourage the discussion. Given how new this Stack Exchange is, I think we should hold off on the downvoting unless someone is clearly acting in bad faith. Good answers can be upvoted instead. Commented Jul 5, 2018 at 6:06
  • I have not down-voted your answer, but there are good reasons to down-vote answers below a certain quality on this particular Stack Exchange. Quantum computation is a subject which is conceptually difficult for many, and is the subject of a lot of poor exposition and hyperbole. It is important in such a situation for the experts to give strong feedback about the quality of answers, in order to give a good indication of which information is higher-quality; otherwise we risk getting swamped with noise. (Incidentally, I don't see how the other question you linked is similar.) Commented Jul 9, 2018 at 11:10
