
Background

IBM, Infleqtion, QuEra, and other quantum hardware companies have announced roadmaps where they expect to have 100 or more fault-tolerant qubits by the end of the decade. It seems increasingly likely that at least one of them will succeed.

In a 2024 preprint (arXiv:2401.16317), Scholten et al. argue that "While much research and development needs to take place, the physics and engineering of quantum computers has been de-risked to the point where these roadmaps should be given credence."

Scholten et al.'s meta-analysis shows that there are many problems in the category of "simulating physics" that require fewer than 1,000 qubits and perhaps only millions of Toffoli gates.

Question

What sorts of non-trivial problems in chemistry or materials science could be studied with 100 fault-tolerant qubits?

The answer(s) I'm looking for are (ideally) papers that have made a fairly specific claim in this regard (e.g. "with our algorithm, you could understand this specific chemical reaction if you had 100 fault-tolerant* qubits") with fairly detailed resource estimates.

An example of the sort of thing I'm looking for is this preprint from Xanadu and Volkswagen, which estimates the resources required to calculate X-ray absorption spectra for systems of 18 orbitals with a quantum computer.

*I'm aware fault tolerant and error corrected don't mean exactly the same thing, but that's not the point of this question.


1 Answer


With $N$ fault-tolerant qubits we can find the ground-state energy of a system of electrons described by $N$ spin orbitals in the basis set used to represent the electronic wavefunction.

This means that with 100 fault-tolerant qubits, you can find the ground-state energy of an electronic system with 100 spin orbitals. The precision (the size of the error bars) desired for the ground-state energy will largely determine how hard the calculation is: if you want error bars of $\pm$1 micro-hartree, you will need more auxiliary qubits for the error correction than if you are content with error bars of $\pm$1 milli-hartree.
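To put rough numbers on both points, here is a minimal Python sketch. The qubit count is standard (one logical qubit per spin orbital); the $2d^2$ surface-code overhead and the $(p/p_{th})^{d/2}$ logical-error scaling are common rules of thumb, not figures taken from the papers cited here:

```python
import math

# One logical qubit per spin orbital under a Jordan-Wigner-style mapping:
# spin orbitals = 2 x spatial orbitals in the chosen basis set.
spatial_orbitals = 50
logical_qubits = 2 * spatial_orbitals          # -> 100 qubits

# Rough surface-code overhead (a rule of thumb, not from the papers
# above): for physical error rate p below threshold p_th, the logical
# error rate falls off roughly like (p / p_th)**(d / 2), so the required
# code distance d grows as the target error shrinks, and each logical
# qubit costs roughly 2 * d**2 physical qubits.
def physical_per_logical(p_phys, p_target, p_th=1e-2):
    d = math.ceil(2 * math.log(p_target) / math.log(p_phys / p_th))
    d += (d + 1) % 2                           # distance is usually odd
    return d, 2 * d ** 2

# Tighter energy error bars mean longer circuits and more repetitions,
# hence a smaller tolerable logical error rate and a bigger code:
for p_target in (1e-6, 1e-12):
    d, n = physical_per_logical(1e-3, p_target)
    print(f"p_target={p_target:g}: distance {d}, "
          f"~{n} physical qubits per logical qubit")
```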

So assuming that you can get the electronic energy with arbitrary precision for an electronic system with 100 spin orbitals (this won't be happening by the end of the decade, regardless of what IBM, Infleqtion, or QuEra say), what scientific problems can be solved? Unfortunately not many, even if we have 50 electrons in the 100 spin orbitals, which is the hardest case: the determinant space peaks at half filling, and beyond 50 electrons the problem starts to get easier again, as the quick count below shows.
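One way to check the half-filling claim is to count Slater determinants, since $\binom{n_{so}}{n_e}$ peaks at $n_e = n_{so}/2$ (plain combinatorics, nothing model-specific):

```python
from math import comb

# Number of Slater determinants for n_e electrons in n_so spin orbitals.
# It peaks at half filling (n_e = n_so / 2); past that point you can
# count holes instead of electrons and the space shrinks again.
n_so = 100
for n_e in (10, 25, 50, 75, 90):
    print(f"{n_e:3d} electrons in {n_so} spin orbitals: "
          f"{comb(n_so, n_e):.3e} determinants")
```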

As mentioned in this QCSE answer, a famous paper enthusiastically claimed that with a quantum computer comprising 109 logical qubits and 34,000 physical qubits (assuming an error rate of $10^{-9}$, which won't be happening any time soon!), one could find the ground-state energy of a system with 108 spin orbitals to a precision of $\pm$1 milli-hartree in about 12 days, assuming quantum gates operating at 100 MHz (which was considered extremely fast). We were able to get the ground-state energy for precisely that same 108-orbital system, with roughly the same order of magnitude for precision and wall time, on a classical computer in 2018. Also, 108 spin orbitals is not nearly enough to gain any meaningful insight into the reaction mechanism of the molecule (FeMoco) that was discussed in those papers.
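For scale, here is the arithmetic implied by the numbers quoted above (it assumes the gates are applied serially, which is a simplification):

```python
# Back-of-the-envelope check of the quoted runtime: 12 days of logical
# gates applied one after another at 100 MHz is on the order of 1e14
# sequential operations.
gate_rate_hz = 100e6              # 100 MHz logical gate rate (optimistic)
runtime_s = 12 * 24 * 3600        # ~12 days in seconds
print(f"~{gate_rate_hz * runtime_s:.1e} sequential logical operations")
# -> ~1.0e+14
```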

I am confident in saying that 100 error-corrected or fault-tolerant qubits will not be enough to solve any known chemistry problem on a quantum computer more cheaply than a classical computer can, because 100 spin orbitals (or 50 spatial orbitals) is small enough that classical algorithms can currently do the same calculations, even for the hardest case of 50 electrons, and ground-state electronic energy calculations are (as far as we know) the calculations that would benefit the most from quantum computation.

  • Note that quantum chemistry algorithms have improved enormously since that 2016 paper. For example, I remember arxiv.org/abs/1805.03662 used millions of times fewer gates (instead of weeks at 100 MHz, think hours at 10 kHz; see the arithmetic sketch after these comments). That said, the space costs haven't improved much, so that will still be a major limiting factor. (Commented May 29 at 1:17)
  • That paper uses plane waves, of which we need tens of thousands more to get the same accuracy for molecules as we would get with a given number of Gaussians. Also, the space cost is larger in that paper than in the 2016 one, because of the need for auxiliary logical qubits, which means that 100 fault-tolerant qubits will get you even less than in the 2016 paper. As for the speed, your paper says that it would take 5.6 hours with 810,000 physical qubits, whereas the 2016 paper estimated that it would take 11 hours with 1,982 qubits. (Commented May 29 at 4:40)
  • Admittedly, a lot of the improvements went into making the physical assumptions sane. The space cost in particular is larger because we assumed physical qubits with $10^{-3}$ noise instead of $10^{-9}$ noise. I am not disagreeing with your overall conclusion, only with the choice of a very outdated paper as the example. (Commented May 29 at 5:04)
  • Thanks for your answer! If I am understanding the Li et al. 2019 paper correctly, they were discussing solving for the ground-state electronic structure, for which efficient classical methods exist. What about other problems, such as dynamics, where there are no efficient classical algorithms? (Commented May 29 at 13:44)
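A quick sanity check of the gate-count comparison in the first comment, using the runtime and clock-rate figures quoted in this thread (the gate counts are implied by those figures, not taken directly from the papers):

```python
# "Millions of times fewer gates": the 2016-era estimate (~12 days of
# gates at 100 MHz) versus the 2018-era one (~5.6 hours at 10 kHz).
old_gates = 12 * 24 * 3600 * 100e6   # ~12 days at 100 MHz
new_gates = 5.6 * 3600 * 10e3        # ~5.6 hours at 10 kHz
print(f"old ~{old_gates:.1e} gates, new ~{new_gates:.1e} gates, "
      f"ratio ~{old_gates / new_gates:.0e}x")
# -> ratio on the order of 5e+05, i.e. roughly "millions of times fewer"
```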
