
Reading discussions of the recent quantum supremacy experiment by Google I noticed that a lot of time and effort (in the experiment itself, but also in the excellent blog posts by Scott Aaronson and others explaining the results) is spent on verifying that the quantum computer did indeed compute the thing we believe it to have computed.

From a naive point of view this is completely understandable: the essence of any quantum supremacy experiment is that you have the quantum computer perform a task that is hard for a classical computer to achieve, so surely it would also be hard for the classical computer to verify that the quantum computer did complete the task we gave it, right?

Well, no. About the first thing you learn when starting to read blogs or talk to people about computational complexity is that, counter-intuitive as it may seem, there exist problems that are hard to solve, but for which it is easy to verify the validity of a given solution: the so-called NP problems.

Thus it seems that Google could have saved themselves and others a lot of time by using one of these problems for their quantum supremacy experiment rather than the one they did. So my question is why didn't they?

An answer for the special case of the NP problem factoring is given in this very nice answer to a different question: https://cs.stackexchange.com/a/116360/26301. Paraphrasing: the regime where the quantum algorithm starts to outperform the best known classical algorithm requires more than the 53 qubits currently available.

So my follow-up question is: does this answer for the special case extend to all NP problems where quantum speedups are expected, or is it specific to factoring? And in the first case: is there a fundamental reason related to the nature of NP that quantum supremacy 'kicks in later' for NP problems than for sampling problems, or is it just that better classical algorithms are available for NP problems because they are more famous?

  • I think you answered yourself. Current quantum computers are pretty weak, so they can't really do anything useful. If there were any NP problem they knew how to solve using their current hardware, they would do so. – Commented Oct 28, 2019 at 15:23
  • Because they wouldn't be able to claim "quantum supremacy" otherwise. – Commented Oct 29, 2019 at 1:40
  • Note also that NP problems (including factoring!) tend to be those with a lot of structure. This means that someone could invent a brand-new fast classical algorithm tomorrow that exploits this structure. Google chose the simulation of random circuits (which has essentially no structure) so that there is very little hope of finding a clever classical trick in the future that renders their supremacy claim invalid. – Commented Oct 29, 2019 at 11:39
  • Ironically, I use the letters NP to mean "no problem". – copper.hat, Commented Oct 31, 2019 at 21:01

3 Answers


there exist problems that are hard to solve, but for which it is easy to verify the validity of a given solution: the so-called NP problems.

This statement is wrong. There are many NP problems which are easy to solve. "NP" simply means "easy to verify". It does not mean hard to solve.
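To make the verify-versus-solve asymmetry concrete, here is a minimal sketch (my illustration, not from the answer) using factoring: checking a claimed factorization is a single multiplication, while finding a factor by naive trial division can take on the order of $\sqrt{n}$ steps, which is exponential in the bit length of $n$.

```python
def verify_factorization(n, p, q):
    """Verifying a claimed factorization: one multiplication plus sanity checks."""
    return p > 1 and q > 1 and p * q == n

def find_factor(n):
    """Finding a factor by trial division: up to ~sqrt(n) candidate divisors."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # n itself is prime

n = 3 * 999983                              # a small semiprime
p = find_factor(n)                          # slow in general: loops over candidates
assert verify_factorization(n, p, n // p)   # fast: a single multiplication
```

Real factoring algorithms are far cleverer than trial division, but the asymmetry survives: no known classical algorithm finds factors in polynomial time, while verification is always cheap.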

What you are probably thinking of is the NP-complete problems, a subset of the NP problems for which we have very, very good evidence that they are hard. However, quantum computers are not expected to be able to solve NP-complete problems significantly more "easily" than regular computers.

Factoring is also thought to be hard, but the evidence for this is only "very good" and not "very, very good" (in other words: factoring is likely not NP-complete). Factoring is one of very few natural problems believed to fall in between: neither easy nor NP-complete.

The list of problems that are easy to verify, easy to solve on a quantum computer, but hard classically is even shorter. In fact, I do not know of any problem other than factoring (and the very closely related discrete logarithm problem) with this property.

Moreover, any easy-to-verify problem would likely have the same issue as factoring: $53$ qubits is not that many, and $2^{53}$ is huge, but just within reach of classical computing. $2^{53}$ is less than $10^{16}$, and most classical computers can execute on the order of $10^9$ operations per second, so we could run through all possibilities in about a third of a year on a single classical desktop computer.
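The back-of-the-envelope arithmetic above can be checked directly (the $10^9$ operations per second is the answer's assumed rate for a desktop machine):

```python
# The answer's estimate, made explicit.
ops = 2 ** 53                        # one operation per possibility to enumerate
assert ops < 10 ** 16                # 2^53 = 9_007_199_254_740_992
rate = 10 ** 9                       # assumed classical operations per second
seconds = ops / rate                 # ~9.0e6 seconds of compute
years = seconds / (365 * 24 * 3600)  # ~0.29 years, i.e. about a third of a year
print(f"brute force: {years:.2f} years")
```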

Quantum computers are known to be good at very few applications, and are essentially useless for most hard NP problems.

  • Very good answer. To add, the class of problems that quantum computers are good at is called BQP. – Commented Oct 28, 2019 at 23:42
  • Good answer. Also note that the problem type that is easiest to implement in this sort of quantum supremacy demonstration is a sampling problem. Sampling problems by nature aren't meaningfully in NP since they aren't yes-or-no questions. – JoshuaZ, Commented Oct 29, 2019 at 1:58
  • Comments are not for extended discussion; this conversation has been moved to chat. – Raphael, Commented Oct 30, 2019 at 21:11

Because then their experiment would have been a complete failure.

As I wrote in an answer on a sister site (which was somewhat poorly received there, but I think your question validates what I was saying about how a general audience interprets it):

[the hyping of the result] plays on a discrepancy between what they mean by quantum supremacy (QS) and what people tend to think QS means.

What I find most people think QS is supposed to mean, and what I assumed it meant until a month or so ago, was that there exists a computable problem (in the CTT sense of computation) and an actual quantum computer, such that, at some scales, the problem is tractable on the quantum computer but intractable on all classical computers.

The problem the Google QC folks have demonstrated is not computation in the CTT sense. It is a physical process of sampling that involves computations as part of the process, and as with any physical process, it can be simulated approximately by computation. They have good reason to believe (proof? I'm not sure, but it should reasonably be assumed true by default anyway) that a computation to simulate the process is going to be intractably slow. This is not surprising at all. It's a fundamental consequence of quantum mechanics that lots of physical processes have that property.

  • Comments are not for extended discussion; this conversation has been moved to chat. – Raphael, Commented Oct 30, 2019 at 21:12
  • To reiterate the top comment explaining why this is not a good answer: "but results about sampling problems do tell you about what you call 'CTT' complexity classes." E.g. "if there exists a polynomial-time classical algorithm that samples from the same probability distribution as a linear-optical network, then $P^{\#P} = BPP^{NP}$, and hence the polynomial hierarchy collapses to the third level" scottaaronson.com/papers/optics.pdf – BurnsBA, Commented Oct 31, 2019 at 12:45
  • @BurnsBA: The answer, particularly the first line, is direct and correct. Had they set out to compute a problem believed to be intractable (which is what OP seems to have meant by "NP problem"), they would have failed, because it's much harder and there is still no evidence that it will even be possible with QC, ever. – Commented Oct 31, 2019 at 13:22

Quantum computing is not a piece of magic, and there seems to be a widespread misconception about the power of quantum computers. I am by no means an expert in this field, but as far as I know QC is very well suited to computational problems that exhibit some kind of cyclic (periodic) structure, which quantum algorithms can exploit via period finding. This happens to be true for problems like integer factoring and the discrete logarithm, so if quantum computers become practical, asymmetric ciphers like RSA and any kind of DSA (including ECDSA) will become obsolete. However, it is not known (or widely believed) that QC breaks symmetric ciphers (the key size has to be increased, though) or, more generally, inverts generic one-way functions. QC is also not known to break lattice-based or code-based asymmetric crypto; therefore, NTRU, McEliece, etc. apparently survive QC.
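On why symmetric ciphers merely need larger keys: the standard reason (Grover's algorithm, which the answer does not name) is that unstructured quantum search over $N$ keys takes on the order of $\sqrt{N}$ queries, a quadratic rather than exponential speedup. So doubling the key length restores the original security margin. A sketch of the arithmetic, with illustrative numbers only:

```python
import math

def grover_queries(key_bits):
    # Grover's algorithm searches an unstructured keyspace of size 2^key_bits
    # in on the order of sqrt(2^key_bits) quantum queries.
    return math.isqrt(2 ** key_bits)

# A 128-bit key falls from 2^128 classical guesses to ~2^64 quantum queries...
assert grover_queries(128) == 2 ** 64
# ...so doubling the key length to 256 bits restores ~2^128 effective security.
assert grover_queries(256) == 2 ** 128
```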

Therefore, even if you have a fully functional QC, you cannot arbitrarily pick any computational problem and conjure the solution instantly.

  • You cannot arbitrarily pick a problem, but you could pick any classical problem and not a synthetic "QC simulation". – eckes, Commented Oct 31, 2019 at 20:14
