179
$\begingroup$

I have a computer science degree. I work in IT, and have done so for many years. In that period "classical" computers have advanced by leaps and bounds. I now have a terabyte disk drive in my bedroom drawer amongst my socks, my phone has phenomenal processing power, and computers have revolutionized our lives.

But as far as I know, quantum computing hasn't done anything. Moreover it looks like it's going to stay that way. Quantum computing has been around now for the thick end of forty years, and real computing has left it in the dust. See the timeline on Wikipedia, and ask yourself where's the parallel adder? Where's the equivalent of Atlas, or the MU5? I went to Manchester University, see the history on the Manchester Computers article on Wikipedia. Quantum computers don't show similar progress. Au contraire, it looks like they haven't even got off the ground. You won't be buying one in PC World any time soon.

Will you ever be able to? Is it all hype and hot air? Is quantum computing just pie in the sky? Is it all just jam-tomorrow woo peddled by quantum quacks to a gullible public? If not, why not?

$\endgroup$
  • 2
    $\begingroup$ A comment, not an answer: people think that quantum computers are where conventional ones were in 1950, when they are actually in 1890. The basic problems aren't solved; everything has to be custom made. In 1950 you already had all the basic parts needed to build conventional computers available on the market, like vacuum tubes. We don't have the analogue of the vacuum tube for quantum computers; just as in 1890, even if you had all the math needed to create a computer, you wouldn't be able to build one, because the components didn't exist yet. $\endgroup$
    – Geronimo
    Commented Nov 5, 2019 at 20:42
  • $\begingroup$ Not true. Check this out; it might help: datatracker.ietf.org/meeting/interim-2020-qirg-01/materials/… $\endgroup$
    – Nathan Aw
    Commented Apr 16, 2020 at 10:28

13 Answers

54
$\begingroup$

Is quantum computing just pie in the sky?

So far it is looking this way. We have been reaching for this pie aggressively over the last three decades, but without much success. We do have quantum computers now, but they are not the pie we wanted: a quantum computer that can actually solve a useful problem faster, or with better energy efficiency, than a classical computer.

You won't be buying one in PC World any time soon.

Will you ever be able to?

We cannot predict the future, but if I had to guess right now, I would say "no". There is not yet any application for which quantum computing would be valuable enough. Instead we might have quantum computers at a small number of special institutes where very special calculations are done (like the supercomputer called Titan at Oak Ridge National Lab, or like a cyclotron particle accelerator where special experiments are done).

Is it all hype and hot air?

Most of it is hype, unfortunately.
But applications in quantum chemistry can indeed be game changing. Instead of doing physically laborious experiments on thousands of candidate molecules for a medicine or fertilizer, we can search for the best molecules on a computer. Molecules behave quantum mechanically, and simulating quantum mechanics is not efficient on classical computers, but is on quantum computers. Much of Google's investment in QC is for chemistry applications [1].
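To make the inefficiency concrete: a straightforward classical simulation that stores the full state of an $n$-qubit (or $n$-two-level-molecule) system needs $2^n$ complex amplitudes, so memory grows exponentially. A minimal illustration (the 16-byte figure assumes double-precision complex numbers; the function name is mine, for illustration only):

```python
def statevector_bytes(n_qubits: int) -> int:
    # A full n-qubit state vector holds 2**n complex amplitudes;
    # at double precision each amplitude takes 16 bytes.
    return (2 ** n_qubits) * 16

for n in (10, 30, 50):
    print(f"{n} qubits: {statevector_bytes(n):,} bytes")
# 10 qubits fit in ~16 KiB, 30 qubits need ~17 GB,
# and 50 qubits already exceed 18 petabytes.
```

This is why even modest molecules overwhelm brute-force classical simulation, while a quantum device represents the same state natively in its own qubits.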

Is it all just jam-tomorrow woo peddled by quantum quacks to a gullible public? If not, why not?

Much of it is, unfortunately.

You were probably one of the more talented students in your class at Manchester University. You might have noticed that there were only a few of you, alongside a larger number of mediocre and sub-mediocre students. There is a similar phenomenon at the professor level. Many professors don't find it easy or "natural" to write well-received grant proposals, but they need funding to keep their jobs, and to make sure their PhD students aren't starved of scientific conferences and access to the software they need.

When a professor becomes:

  • desperate for funding, or
  • caught up with other problems in life, such as having to take care of a child with cancer, or
  • aware that they won't make huge scientific discoveries like some scientists did 100s of years ago,

life becomes more about surviving, keeping a happy family, and doing what they enjoy rather than making a better world for their grandchildren's grandchildren. As a professor, I can tell you that many of my colleagues are not as "noble" as the public often perceives scientists to be.

I know around 1000 people with funding to work in quantum computing, and not a single one seems to have ill intentions to fool a "gullible public" in some sinister way. Most of us just apply for grants available through our universities or through our governments, and we don't intend to exaggerate the importance of our work any more than other scientists competing for the same money (we have to compete with molecular physicists pretending their work is important for fixing climate change just because the molecule they're working on is in our atmosphere, or biophysicists pretending their work might cure cancer just because they're working on a molecule that's prominent in the body).

A lot of the "hype" around quantum computing comes from the media. Journalists have twisted the contents of my papers to make eye-catching headlines which will get more clicks on their ads, and their bosses pressure them to do this or they'll lose their jobs to the next intern who doesn't care as much about being honest.

Some of the hype does come from scientists themselves, many of whom truly believe quantum computing will be revolutionary because their PhD supervisor didn't have a great education (remember that Manchester University is one of the best in the world, and most universities are not even close); in rare cases there is hype from people desperate for funding, but not much for reasons other than these.

I do believe the public should invest a bit in quantum computing, as they do for lots of other areas of research which have no guaranteed positive outcome. The hype is often exaggerated by journalists, ignorant scientists, or non-ignorant scientists who think they need it for survival. There is also unfairly harsh criticism from journalists and funding agencies.

Nothing you said in your question is wrong.
I have just given some reasons why your points are correct.

$\endgroup$
108
$\begingroup$

I'll try to approach this from a neutral point of view. Your question is somewhat "opinion-based", but there are still a few important points to be made. Theoretically, there's no convincing argument (yet) as to why quantum computers aren't practically realizable. But do check out: How Quantum Computers Fail: Quantum Codes, Correlations in Physical Systems, and Noise Accumulation - Gil Kalai, and the related blog post by Scott Aaronson where he provides some convincing arguments against Kalai's claims. Also, read James Wootton's answer to the related QCSE post: Is Gil Kalai's argument against topological quantum computers sound?

Math Overflow has a great summary: On Mathematical Arguments Against Quantum Computing.

However, yes, of course, there are engineering problems.

Problems (adapted from arXiv:cs/0602096):

  • Sensitivity to interaction with the environment: Quantum computers are extremely sensitive to interaction with their surroundings, since any interaction (or measurement) leads to a collapse of the state function. This phenomenon is called decoherence. It is extremely difficult to isolate a quantum system, especially one engineered for a computation, without it getting entangled with the environment. The larger the number of qubits, the harder it is to maintain coherence.

    [Further reading: Wikipedia: Quantum decoherence]

  • Unreliable quantum gate actions: Quantum computation on qubits is accomplished by operating upon them with an array of transformations that are implemented in principle using small gates. It is imperative that no phase errors be introduced in these transformations. But practical schemes are likely to introduce such errors. It is also possible that the quantum register is already entangled with the environment even before the beginning of the computation. Furthermore, uncertainty in initial phase makes calibration by rotation operation inadequate. In addition, one must consider the relative lack of precision in the classical control that implements the matrix transformations. This lack of precision cannot be completely compensated for by the quantum algorithm.

  • Errors and their correction: Classical error correction employs redundancy. The simplest way is to store the information multiple times and, if these copies are later found to disagree, take a majority vote. For example, suppose we copy a bit three times, and a noisy error corrupts the three-bit state so that one bit is equal to zero but the other two are equal to one. If we assume that noisy errors are independent and occur with some probability $p$, it is most likely that the error is a single-bit error and the transmitted message is three ones. It is possible that a double-bit error occurred and the transmitted message is equal to three zeros, but this outcome is less likely than the one above. Copying quantum information is not possible due to the no-cloning theorem, which seems to present an obstacle to formulating a theory of quantum error correction. But it is possible to spread the information of one qubit onto a highly entangled state of several (physical) qubits. Peter Shor first discovered this method of formulating a quantum error-correcting code, by storing the information of one qubit onto a highly entangled state of nine qubits. However, quantum error-correcting codes protect quantum information against errors of only limited forms, and they are efficient only for errors in a small number of qubits. Moreover, the number of qubits needed to correct errors doesn't normally scale well with the number of qubits in which errors actually occur.

    [Further reading: Wikipedia: Quantum error correction]

  • Constraints on state preparation: State preparation is the essential first step to be considered before the beginning of any quantum computation. In most schemes, the qubits need to be in a particular superposition state for the quantum computation to proceed correctly. But creating arbitrary states precisely can be exponentially hard (in both time and resource (gate) complexity).

  • Quantum information, uncertainty, and entropy of quantum gates: Classical information is easy to obtain by means of interaction with the system. On the other hand, the impossibility of cloning means that any specific unknown state cannot be determined. This means that unless the system has specifically been prepared, our ability to control it remains limited. The average information of a system is given by its entropy. The determination of entropy would depend on the statistics obeyed by the object.

  • A requirement for low temperatures: Several quantum computing architectures like superconducting quantum computing require extremely low temperatures (close to absolute zero) for functioning.
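The classical three-bit repetition code mentioned in the error-correction point above is easy to sketch. This is a toy model of my own, assuming independent bit flips with probability p; for small p the logical error rate after majority voting is $3p^2 - 2p^3$, well below the raw rate $p$:

```python
import random

def encode(bit):
    # Repetition code: store the bit three times.
    return [bit] * 3

def noisy_channel(codeword, p, rng):
    # Each copy flips independently with probability p.
    return [b ^ (rng.random() < p) for b in codeword]

def decode(codeword):
    # Majority vote: wrong only if two or more copies flipped.
    return int(sum(codeword) >= 2)

rng = random.Random(0)
p, trials = 0.1, 100_000
failures = sum(decode(noisy_channel(encode(0), p, rng)) != 0
               for _ in range(trials))
print(failures / trials)  # close to 3*p**2 - 2*p**3 = 0.028, well below p
```

The quantum analogue cannot simply copy the state (no-cloning), which is exactly why Shor's nine-qubit construction, spreading one logical qubit across an entangled state, was such a breakthrough.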

Progress:

There have been several experimental realizations of CSS-based codes. The first demonstration was with NMR qubits. Subsequently, demonstrations have been made with linear optics, trapped ions, and superconducting (transmon) qubits. Other error-correcting codes have also been implemented, such as one aimed at correcting for photon loss, the dominant error source in photonic qubit schemes.

Conclusion:

Whether we will ever have efficient quantum computers which can visibly outperform classical computers in certain areas is something only time will tell. However, looking at the considerable progress we have been making, it probably wouldn't be too wrong to say that in a couple of decades we should have sufficiently powerful quantum computers. On the theoretical side, though, we don't yet know whether classical algorithms exist (or can exist) which match quantum algorithms in terms of time complexity. See my previous answer about this issue. From a completely theoretical perspective, it would also be extremely interesting if someone could prove that all BQP problems lie in BPP or P!

I personally believe that in the coming decades we will be using a combination of quantum and classical computing techniques (i.e. either your PC will have both classical and quantum hardware components, or quantum computing will be totally cloud-based and you'll only access it online from classical computers). Remember that quantum computers are efficient only for a very narrow range of problems. It would be pretty resource-intensive and unwise to do an addition like 2+3 using a quantum computer (see How does a quantum computer do basic math at the hardware level?).

Now, coming to your point of whether national funds are unnecessarily being wasted on trying to build quantum computers. My answer is NO! Even if we fail to build legitimate and efficient quantum computers, we will still have gained a lot in terms of engineering progress and scientific progress. Already research in photonics and superconductors has increased manyfold and we are beginning to understand a lot of physical phenomena better than ever before. Moreover, quantum information theory and quantum cryptography have led to the discovery of a few neat mathematical results and techniques which may be useful in a lot of other areas too (cf. Physics SE: Mathematically challenging areas in Quantum information theory and quantum cryptography). We will also have understood a lot more about some of the hardest problems in theoretical computer science by that time (even if we fail to build a "quantum computer").

Sources and References:

  1. Difficulties in the Implementation of Quantum Computers (Ponnath, 2006)

  2. Wikipedia: Quantum computing

  3. Wikipedia: Quantum error correction


Addendum:

After a bit of searching, I found a very nice article which outlines almost all of Scott Aaronson's counter-arguments against quantum computing skepticism. I very highly recommend going through all the points given in there. It's actually part 14 of the lecture notes put up by Aaronson on his website. They were used for the course PHYS771 at the University of Waterloo. The lecture notes are based on his popular textbook Quantum Computing Since Democritus.

$\endgroup$
38
$\begingroup$

Classical computing has been around longer than quantum computing. The early days of classical computing were similar to what we are experiencing now with quantum computing. The Z3 (the first working programmable, Turing-complete computer), built in the 1940s, was the size of a room and less powerful than your phone. This speaks to the phenomenal progress we have experienced in classical computing.

The dawn of quantum computing, on the other hand, did not come until the 1980s. Shor's factoring algorithm, the discovery that jump-started the field, appeared in the 1990s. This was followed a few years later by the first experimental demonstration of a quantum algorithm.

There is evidence that quantum computers can work. There is a tremendous amount of progress on the experimental and theoretical aspects of this field every year and there is no reason to believe that it's going to stop. The Quantum threshold theorem states that large scale quantum computing is possible if the error rates for physical gates are below a certain threshold. We are approaching (some argue that we are already there) this threshold for small systems.

It's good to be skeptical about the usefulness of quantum computation. In fact, it's encouraged! It's also natural to compare the progress of quantum computation with that of classical computation, but don't forget that quantum computers are far more difficult to build than classical computers.

$\endgroup$
  • 2
    $\begingroup$ The Z3 was capable of some useful work. Quantum computers are not currently, 40 years after the idea. The few problems where they really compete with conventional computers are classifiable as study of a quantum system that they emulate, not doing any work that people wanted to get done. Things go full-speed in solution-in-search-of-problem mode. In the field of cryptanalysis and factoring, where there has been much work, the results are miserable, and over-hyped. $\endgroup$
    – fgrieu
    Commented Jun 29, 2020 at 18:26
22
$\begingroup$

Early classical computers were built with existing technology. For example, vacuum tubes were invented around four decades before they were used to make Colossus.

For quantum computers, we need to invent the technology before we can make the computer. And the technology is so far beyond what previously existed that just this step has taken a few decades.

Now we pretty much have our quantum versions of vacuum tubes. So expect a Colossus in a decade or so.

$\endgroup$
20
+150
$\begingroup$

When you ask whether it is pie in the sky, that rather depends on what promises you think quantum technologies are trying to fulfil. And that depends on who the people are making those promises.

Consider why you are even aware of quantum computation, given that it hasn't yet managed to produce any devices (or to be more fair, not very many devices) which resemble muscular computer hardware. Where are you hearing about it from, whence the excitement? I'm willing to bet that even if you attend every academic talk about quantum computing that you can personally manage to, not very much of what you hear about quantum computing is coming from academics. Chances are you hear a lot about quantum computing from sources which are more interested in excitement than fact.

There are some corporate sources who are making more or less grandiose claims about what their quantum hardware can do, or will be able to do; and there have been for well over a decade. Meanwhile, there is a large community of people who have simply been trying to make careful progress and not spend too much of their energy making promises they can't deliver. Whom will you have heard more from?

But even granting those, the parties most responsible for excitement about quantum computation are certain kinds of magazines and special-interest websites, which as sources of information are like market-square waffle vendors: they trade very much on sweet vaporous aromas rather than something with substance and bite. The attention-seeking advertising industry, rather than academia, is the main reason why there are such puffed-up expectations of quantum computation. They don't even care about quantum computation in principle: it is one of several magical incantations with which to amaze the crowd, to evoke dreams of pie in the sky, and in the meantime make money from some other company for the mere possibility that an ad was seen for half a second. That industry is very much in the business of selling airborne pastry, both to its clients and to its audience. But does that mean that the world is owed flying fig rolls by those who are actually working on quantum technologies? It's hard enough to accomplish the things which we think might be possible to accomplish, which are more modest, but still worthwhile.

Among my academic peers (theoretical computer scientists and theoretical physicists), the blatant misinformation about quantum computation among the public is a source of significant frustration. Most of us believe that it will be possible to build a quantum computer, and most of those who do also believe that it will have significant economic impacts. But none of us expect that it would turn the world upside-down in five to ten years, nor have we been expecting that for any of the past fifteen years that it started to become fashionable to say that we would have massive quantum computers "in five to ten years". I have always made it a point to say that I hope to see the impacts in my lifetime, and recent activity has made me hope to see it within twenty — but even then you won't be going to the store to buy one, any more than you go down to the store to buy a Cray supercomputer.

Nor do any of us expect that it will allow you to easily solve the Travelling Salesman Problem, or the like. Being able to analyse problems in quantum chemistry and quantum materials is the original, and in the short term still the best, prospective application of quantum computation, and it may be revolutionary there; and perhaps in the longer term we can provide robust and significant improvements in practice for optimisation problems. (D-Wave claims they can already do this in practice with their machines: the jury is still out among academics as to whether this claim is justified.)

The devil of it is, to explain what you can actually expect out of the theory and development of quantum computation, you have to somehow explain a little quantum mechanics. This Is Not An Easy Thing To Do, and as with anything complicated, there is little patience in the larger world for nuanced understanding, especially when "alternative facts" in the form of candy-flavoured 'yakawow' hype is striding mightily around in seven league boots.

The truth about what quantum computation can do (and that it likely won't allow you to teleport across the world, nor solve world hunger or airline chaos at a stroke) is boring. But making significant advances in chemistry and materials science is not. To say nothing of applications not yet developed: how easily could you have extrapolated from gear-based computers for reliably computing taxes or logarithm tables to designing aircraft?

The timeline of classical computing technology extends well before even the 19th century. We have some idea of how to try to re-tread this path with quantum technologies, and we have an idea of the sorts of dividends which may be possible if we do so. For that reason, we hope to reproduce the development to useful computing technology in a much, much faster amount of time than the 370-plus years from Pascal's adders to the modern day. But it's not going to be quite as fast as some people have been promising, particularly those people who are not actually responsible for delivering on those 'promises'.

Some remarks.

"Where's the parallel adder?"

  • We don't have large devices which carry out addition by quantum computers, but we do have some people working on fast addition circuits in quantum computers — some of what quantum computers will have to do would involve more conventional operations on data in superposition.

"Where's the equivalent of Atlas, or the MU5?"

  • To be frank, we're still working on the first reliable quantum analogue of Pascal's adder. I'm hopeful that the approach of the NQIT project (disclosure: I'm involved in it, but not as an experimentalist) of making small, high-quality modules which can exchange entanglement will be a route to rapid scaling via mass production of the modules, in which case we might go from Pascal's adder, to the Colossus, to the Atlas, and beyond in a matter of a few years. But only time will tell.

"It looks like they haven't even got off the ground. You won't be buying one in PC World any time soon."

  • That is perfectly true. However, if you were ever told to expect otherwise, this is more likely to be the fault of PC World (or to be fair, PC World's competitors in the market for your subscription money as a tech enthusiast) than it is ours. Any responsible researcher would tell you that we're striving hard to make the first serious prototype devices.

"Will you ever be able to [buy a quantum computer in PC World]?"

  • Will you ever be able to buy a Cray in PC World? Would you want to? Maybe not. But your university may want to, and serious businesses may want to. Beyond that is wild speculation — I don't see how a quantum computer would improve word-processing. But then again, I doubt that Babbage ever imagined that anything akin to his Difference Engine would be used to compose letters.
$\endgroup$
19
$\begingroup$

TL;DR: Engineering and physics arguments have already been made. I add a historical perspective: I argue that the field of quantum computation is really only a bit more than two decades old, and that it took us more than three decades to build something like the MU5.


Since you mention the timeline, let's have a closer look:

The beginnings

First of all, the mere possibility of something like a quantum computer was voiced by Richard Feynman in the West (1959, or 1981 if you wish) and Yuri Manin in the East (1980). But that's just having an idea; no implementation followed from it.

When did similar things happen with classical computing? Well, a very long time ago. Charles Babbage, for instance, already wanted to build computing machines in the early 19th century, and he already had ideas. Pascal and Leibniz had ideas too. Babbage's Analytical Engine of 1837, which was never built due to funding and engineering challenges (by the way, its precursor, the Difference Engine, has since been built with Lego), is definitely the most recent of these early ideas, and it is already way ahead of what Feynman and Manin proposed for quantum computing, because it proposes a concrete implementation.

The '70s don't see anything related to a quantum computer. Some codes are invented and some theoretical groundwork is done (how much information can be stored?), which is necessary for quantum computing, but it's not really pursuing the idea of the quantum computer.

Codes and communication-related ideas are to quantum computation what telephones and telegraph wires are to classical computing: an important precursor, but not a computer. As you know, Morse code and telegraphs are technologies of the 19th century, and more sophisticated codes for noisy channels were also studied. The mathematical groundwork (in terms of no-go theorems and the like) was done in 1948 by Shannon.

Anyway, it can be argued that punch card computing was developed in 1804 for weaving, but I don't want to claim that this was really the beginning of the classical computation.

Universal (quantum) computers

So when did computation start? I'm going to argue that you need a number of things to get research into universal computing off the ground; before those exist, the number of people and the money invested will be limited.

  1. You need the notion of a universal computer and a theoretical model of what to achieve.
  2. You need an architecture of how to implement a universal computer - on a theoretical level.
  3. You need a real-life system where you could implement it.

When do we get those in quantum computation?

  • Deutsch describes the universal quantum computer in 1985 (33 years ago).
  • Circuit models and gates are developed around the same time.
  • The first complete model of how to put everything together was proposed by Cirac and Zoller in 1994 (merely 24 years ago).

All the other advances in quantum computation before or during that time were limited to cryptography, quantum systems in general or other general theory.

What about classical computation?

  • We have Turing's work on Turing machines (1936) or Church's work (same time frame).
  • Modern architectures rely on von Neumann's model (1945); other architectures exist.
  • As a model, the digital circuit model was designed in 1937 by Shannon.

So, in 1994 we are in a comparable state to 1937:

  • There are a few people doing the theoretical groundwork, and the groundwork has now been done.
  • There are a fair number of people doing engineering work on foundational issues not directly related but very helpful for building a (quantum) computer.
  • And the field is generally not that big and well-funded.
  • But: from that date, funding and people start pouring into the field.

The field is taking off

For classical computing, this is illustrated by the number of different "first computer systems" in the Wikipedia timeline. There were several research groups in at least Germany, England, and the United States, in several locations (e.g. Manchester and Bletchley Park in the UK, to name just two). War-time money was diverted to computing because it was necessary for e.g. the development of the nuclear bomb (see accounts at Los Alamos).

For quantum computation, see e.g. this comment:

The field of QIS began an explosive growth in the early to mid-1990s as a consequence of several simultaneous stimuli: Peter Shor demonstrated that a quantum computer could factor very large numbers super-efficiently. The semiconductor industry realized that the improvement of computers according to Moore’s law would all too soon reach the quantum limit, requiring radical changes in technology. Developments in the physical sciences produced trapped atomic ions, advanced optical cavities, quantum dots, and many other advances that made it possible to contemplate the construction of workable quantum logic devices. Furthermore, the need for secure communications drove the investigations of quantum communication schemes that would be tamper proof.

All in all, from the time that the theoretical groundwork of modern computers had been laid to the time that the first computers became available (Zuse 1941, Manchester 1948, to name just two), it took about a decade. Similarly, it took about a decade for the first systems doing some sort of universally programmable calculation with quantum systems to appear. Granted, their capabilities are lower than those of the first Manchester computers, but still.

Twenty years later, classical computing saw explosive growth in the technology, and a lot of firms got involved. We also saw the advent of new technologies like the transistor (first demonstrated in 1947).

Similarly, 20 years after the beginning of quantum computation, we see the serious entrance of private companies into the field: Google, IBM, Intel, and many others. When I was at my first conference in 2012, their involvement was still academic; today, it is strategic. Likewise, the 2000s saw proposals for a wealth of different quantum computing systems, such as superconducting qubits, which form the basis of the most advanced chips from the three companies mentioned above. In 2012, nobody could claim to have a somewhat reliable system with more than a couple of physical qubits. Today, only six years later, IBM lets you play with their very reliable 16 qubits (5 if you really only want to play around), and Google claims to be testing a 72-qubit system as we speak.

Yes, we still have some way to go to reach a reliable large-scale quantum computer with error-correction capabilities, and the computers we currently have are weaker than the classical computers we had in the '60s, but I (as others explain in other answers) believe this is due to the unique engineering challenges. There is a small chance that it's due to physical limitations we have no idea about, but if it is, given current progress, we should know within a couple of years at the latest.

What's my point here?

  • I argued that the reason we don't see an MU5-class quantum computer yet is partly that the field is just not that old and hasn't received much attention until recently.
  • I argued that from a present-day perspective it seems that classical computers became very good very quickly, but that this neglects decades of prior work in which development and growth didn't seem as fast.
  • I argued that if you believe (as almost everybody in the field does) that the initial engineering problems faced by quantum computers are harder than those faced by classical computers, then you see a research and innovation trajectory very much comparable to that of classical computers. Of course, they are somewhat different, but the basic ideas of how it goes are similar.
$\endgroup$
12
$\begingroup$

To answer part of the question, "will I ever buy a quantum computer", etc. I think there is a fundamental misunderstanding.

Quantum computing isn't just classical computing but faster. A quantum computer solves certain kinds of problems in a short time that would take a classical supercomputer a thousand years; this isn't an exaggeration. But regular computing tasks, adding numbers, moving bits for graphics, and so on, will remain the job of classical computing.
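The kind of separation described here can be made concrete even at toy scale. Below is a minimal NumPy sketch (my illustration, not part of the original answer) of Deutsch's algorithm, the textbook example: deciding whether a one-bit function f is constant or balanced takes two evaluations classically, but a quantum circuit decides it with a single oracle query.

```python
import numpy as np

# Single-qubit Hadamard and its two-qubit (Kronecker product) version.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
HH = np.kron(H, H)

def oracle(f):
    """The unitary U_f |x>|y> = |x>|y XOR f(x)> as a 4x4 permutation matrix."""
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    return U

def deutsch(f):
    """Classify f as 'constant' or 'balanced' using one oracle call."""
    state = np.zeros(4)
    state[0b01] = 1                         # start in |0>|1>
    state = HH @ state                      # Hadamard both qubits
    state = oracle(f) @ state               # the single query to f
    state = np.kron(H, np.eye(2)) @ state   # Hadamard the first qubit
    # Probability that the first qubit measures as |0>:
    p0 = state[0b00] ** 2 + state[0b01] ** 2
    return "constant" if np.isclose(p0, 1) else "balanced"

print(deutsch(lambda x: 0))  # constant
print(deutsch(lambda x: x))  # balanced
```

This is exactly the "different kind of operation" being described: the answer pops out of interference between amplitudes, not from evaluating f on each input in turn.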

If the technology could ever be miniaturized (I don't know), it might be something more like an MMU or a graphics card. An additional feature to your classical computer, not a replacement. In the same way a high end graphics card lets your computer do things that it would not be able to (in reasonable time) with the main CPU, a quantum computer would allow other sorts of operations that can't be done currently.

I recommend you at least skim the first paragraph of the "Principles of operation" section of the Wikipedia article on quantum computing.

$\endgroup$
12
$\begingroup$

TL;DR: I've been working on the theory of quantum computers for about 15 years. I've seen nothing convincing to say that they won't work. Of course, the only real proof that they can work is to make one. It's happening now. However, what a quantum computer will do and why we want it does not match up with the public perception.

Is quantum computing just pie in the sky? Is it all just jam-tomorrow woo peddled by quantum quacks to a gullible public?

As a "quantum quack" (thanks for that), of course I'm going to tell you it's all realistic. But the theory is sound: so long as quantum mechanics is correct, the theory of quantum computation is correct, and there are efficient quantum algorithms for problems we don't know how to solve efficiently on a classical computer. But I don't think anything I write here can convince a skeptic. Either you have to sit down and learn all the details yourself, or wait and see.

Of course, quantum mechanics is only a theory which could be superseded at any time, but its predictions have already been applied to explain the world around us. Quantum computers are not pushing the theory into an untested regime where we might hope for unexpected results (which is what physicists really hope for, because that's where you start to see hints of new physics). For example, quantum mechanics is already applied to condensed matter systems consisting of far more constituents than the number of qubits we're talking about in a near-term quantum computer. It's just that we need an unprecedented level of control over them. A few people think they have arguments for why a quantum computer won't work, but I've not found anything particularly convincing in the arguments that I've read.

Is it all hype and hot air?

There is a lot of hype surrounding quantum computers. I would say that this comes from two main sources:

  • the popular representation of quantum computing in the mainstream media and popular culture (e.g. science fiction books). Ask anybody actively working on quantum computation, I think they will all agree it's poorly represented, giving the impression that it's a universal solution that will make everything run quicker, which is, at least for now, not the case. There has been some jam-tomorrow woo peddled to a gullible public, but that's more through a "lost in translation" attempt to over-simplify what's going on, mostly by non-specialist intermediaries.

  • researchers themselves. For the past 20(ish) years, people have been promising that quantum computing is just over the horizon, and it's never quite materialized. It's quite reasonable that observers get sick of it at that point. However, my perspective from within the field is that many people claiming to be working towards quantum computers haven't been. As funding bodies have got progressively more demanding about the "why" for research, and about ensuring "impact", quantum computing has become the go-to justification for many experimentalists, even if they aren't really interested in doing anything for a quantum computer. If there's been some way they could twist what they're doing so that it sounds relevant to quantum computing, they've tended to do it. That doesn't mean quantum computing can't be done; it just hasn't been as much of a focus as has been implied. Consider, at a slightly different level, the explosion of quantum information theory: few theorists within it have actively worked on the theory of quantum computers and how to make them work (which is not to say they haven't been doing interesting things).

However, we are now hitting a critical mass where there's suddenly a lot of research investment in making quantum computers, and the associated technology, a reality, and things are starting to move. We seem to be just hitting the point, with devices of about 50 qubits, where we might be capable of achieving "quantum supremacy": performing computations whose results we can't feasibly verify on a classical computer. Part of the problem with achieving this has actually been the aforementioned rapid progress of classical computing. With Moore's-Law-style progress yielding exponentially improving classical computational power, the bar for what we need to achieve to be convincing has been constantly shifting.

Quantum computers don't show similar progress. Au contraire, it looks like they haven't even got off the ground.

The point is, it's hard to do, and it's taken a long time to get the basic technology right. This is a slightly imperfect comparison, but it's not too bad: think about the lithography processes used for making processors. Their development has been progressive, making smaller and smaller transistors, but progress has been slowing as it's got harder and harder to deal with, among other things, the quantum effects that are getting in the way. Quantum computers, on the other hand, are essentially trying to step over that whole progressive improvement and jump straight to the ultimate, final result: single-atom transistors (kind of). Perhaps that gives some insight into what the experimentalists are dealing with?

You won't be buying one in PC World any time soon. Will you ever be able to?

It's not clear that you'd even want to. At the moment, we expect quantum computers to be useful for certain, very specific tasks. In that case, we perhaps envisage a few powerful centralized quantum computers that do those specific jobs, and most people will keep going with classical computers. But, since you want to draw analogies with the development of classical computers, then (according to Wikipedia) it is in 1946 that Sir Charles Darwin (grandson of the famous naturalist), head of Britain's National Physical Laboratory, wrote:

it is very possible that ... one machine would suffice to solve all the problems that are demanded of it from the whole country

(variants of this are attributed to people like Watson). This very clearly is not the case. The reality was that once computers became widely available, further uses were found for them. It might be the same for quantum computers, I don't know. One of the other reasons that you wouldn't buy a quantum computer in a shop is its size. Well, the actual devices are usually tiny, but it's all the interfacing equipment and, especially, cooling that takes up all the space. As the technology improves, it'll be able to operate at progressively higher temperatures (look, for example, at the progress of high temperature superconductivity compared to the original temperatures that had to be achieved) which will reduce cooling requirements.

$\endgroup$
10
$\begingroup$

The sad truth for most of the people here is that John Duffield (the asker) is right.

There is no proof that a quantum computer will ever be of any value.

However, for the companies that have invested in quantum computing (IBM, Google, Intel, Microsoft, etc.), it is entirely worth it to try to build one, because if they are successful they will be able to solve some problems exponentially faster than classical computers, and if they are not successful no dent has been put in the billions of dollars they have available.

The attempt to build useful quantum computers, which you can call a failure so far, has at least led to advances in understanding superconductors, photonics, and even quantum theory itself. A lot of the mathematics used for analyzing quantum mechanics was developed in the context of quantum information theory.

And finally, quantum computers might never be marketable, but quantum communication devices by Toshiba, HP, IBM, Mitsubishi, NEC and NTT are already on the market.

In conclusion: I agree with John Duffield that quantum computing may never be of any value. But quantum communication is already marketable, and a lot of new science, mathematics, and engineering (e.g. for superconductors) has been developed through our so-far-unsuccessful attempts at making quantum computing a reality.

$\endgroup$
10
$\begingroup$

Like all good questions, the point is what you mean. As the CTO of a startup developing a quantum computer, I have to emphatically disagree with the proposition that quantum computing is just pie in the sky.

But then you assert, "You won't be buying one in PC World any time soon." Not only do I agree with this; I would suggest that in the foreseeable future you won't be able to, which is as close to "never" as you'll get me to assert.

Why is that? On the first point: there are no engineering reasons preventing us from building a quantum computer, and none of the current obstacles will keep preventing us for much longer. On the second point: a quantum computer is harder to build than a classical computer (you need special conditions such as extremely cold temperatures or a very good vacuum, and the machines are slower), and there are only certain problems at which quantum computers excel. You don't need a laptop to do computational drug discovery, break outdated crypto, or accelerate inverting some function (especially not a laptop that comes with wardrobe-sized support equipment); one or a few supercomputers will do.

Why can I say there are no engineering issues that prevent (large, universal) quantum computers? A single example suffices, so I choose the technology I know best, the one I am pursuing professionally. In ion-trap-based quantum computing, all the ingredients one needs have been demonstrated: high-fidelity, universal quantum gates; successful attempts to move ions (separating them from and recombining them into strings of ions, moving them along paths and through intersections of paths) with suitable performance; plus initialization, measurement, and so on at a fidelity comparable to gate operations. The only things preventing large, universal ion-trap quantum computers from being built are getting the scientists who made the individual contributions together with the right engineers, serious engineering indeed, and finance.

I'm itching to even tell you just how one might go about getting the feat done soon, technically, but I fear I'd make our patent attorney (and my CEO and everyone else in the company) a bit mad. What it boils down to is this:

If quantum computing is indeed pie in the sky, then looking back, people in the future will perceive it as just as low-hanging a fruit as the first microcomputers.

$\endgroup$
9
$\begingroup$

Why would you expect two different technologies to advance at the same rate?

Simply put, quantum computers can be immensely more powerful but are immensely harder to build than classical computers. The theory of their operation is more complicated and based on recent physics, there are greater theoretical pitfalls and obstacles that inhibit their scaling up in size, and their design requires much more sophisticated hardware which is harder to engineer.

Nearly every stage of the development of a quantum computer is unlike that of a classical computer. So, a question for you: why compare them?

$\endgroup$
9
$\begingroup$

See the timeline on Wikipedia, and ask yourself where's the parallel adder?

It seems to me that your answer lies in your question. Looking at the timeline on Wikipedia shows very slow progress from 1959 until about 2009. It was mainly theoretical work until we went from zero to one.

In the mere 9 years since then, the pace of progress has been tremendous, going from 2 qubits to 72, and, if you include D-Wave, up to 2000 qubits. And there's one working in the cloud right now that we have access to. Graph the progress of the last 60 years and I'm sure you'll see quite the knee in the curve you seem to desire, and a rebuttal to your statement "But as far as I know, quantum computing hasn't done anything."

Where's the equivalent of Atlas, or the MU5?

Is that the measure against which your question is based?

Will you ever be able to? Is it all hype and hot air? Is quantum computing just pie in the sky? Is it all just jam-tomorrow woo peddled by quantum quacks to a gullible public?

Yes. No. No. No.

If not, why not?

Because, as your referenced timeline shows, people are making significant progress in the number and stability of qubits as well as in quantum algorithms.

Asking people to predict the future has always been fraught with failure, which is why most of these sites don't allow 'opinion based' questions.

Perhaps more specific (non-opinion based) questions would better serve to answer your questions.

$\endgroup$
7
$\begingroup$

There are many technical challenges to developing a universal quantum computer with many qubits, as pointed out in the other answers. See also this review article. However, there may be workaround ways to get certain nontrivial quantum computing results before we get to the first truly universal quantum computer.

Note that classical computing devices existed a long time before the first universal computer was made. E.g. to numerically solve differential equations, you can construct an electric circuit consisting of capacitors, coils and resistors, such that the voltage between certain points will satisfy the same differential equation as the one you want to solve. This method was popular in astrophysics before the advent of digital computers.
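The analog-computer idea can be sketched numerically. A series LC circuit obeys L·q'' + q/C = 0, so the charge on the capacitor physically traces out a cosine; the circuit "solves" the oscillator equation simply by existing. A minimal semi-implicit Euler integration of that equation (with illustrative component values of my choosing, not from the answer):

```python
import math

# A series LC circuit obeys L*q'' + q/C = 0, i.e. q'' = -q/(L*C): the
# capacitor charge q(t) traces the solution q0*cos(t/sqrt(L*C)).
# We integrate that equation to show what the circuit "computes".
L, C = 1.0, 1.0             # illustrative component values (H, F)
q, i = 1.0, 0.0             # initial charge and current
dt = 1e-4
for _ in range(10_000):     # integrate up to t = 1.0
    i += -q / (L * C) * dt  # di/dt = -q/(LC)
    q += i * dt             # dq/dt = i

print(abs(q - math.cos(1.0)))  # small: matches the analytic solution
```

Reading the voltage across the capacitor of a real circuit would give you the same curve without any numerical integration at all, which is exactly the pre-digital method described above.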

In the case of quantum computing, note that when Feynman came up with the idea, he argued from the difficulty of simulating the quantum mechanical properties of certain physical systems using ordinary computers. He then turned the argument around: the system itself solves the mathematical problem that is hard to solve on ordinary computers, and it is the quantum mechanical nature of the system that makes this so. One can therefore ask whether one can construct quantum mechanical devices able to tackle problems that are hard for ordinary computers.
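Feynman's difficulty can be quantified with a back-of-the-envelope sketch (my illustration): a full description of an n-qubit (or n-spin) system needs 2^n complex amplitudes, so classical memory requirements double with every added particle.

```python
# A full n-qubit state vector has 2**n complex amplitudes; at 16 bytes
# each (complex128), memory doubles with every qubit -- the core of
# Feynman's argument that classical simulation can't keep up.
for n in (10, 30, 50):
    gib = 16 * 2**n / 2**30
    print(f"{n} qubits: {gib:,.6g} GiB")
```

Around 50 qubits the state vector no longer fits in any existing machine's memory, which is why that scale is where the classical-simulation bar is usually placed.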

$\endgroup$
