If the era of quantum computing dawned 3 years ago, its rising sun may have ducked behind a cloud. In 2019, Google researchers claimed to have passed a milestone known as quantum supremacy when their Sycamore quantum computer performed in 200 seconds an abstruse calculation that they said would take a supercomputer 10,000 years to complete. Now, scientists in China have done the calculation in a few hours with ordinary processors. A supercomputer, they say, could beat Sycamore outright.
“I think they’re right that if they had access to a big enough supercomputer, they could have simulated the … task in seconds,” says Scott Aaronson, a computer scientist at the University of Texas at Austin. The breakthrough takes some shine off Google’s claim, says Greg Kuperberg, a mathematician at the University of California, Davis. “Getting within 300 feet of the top is less exciting than getting to the top.”
Still, the promise of quantum computing remains unchanged, say Kuperberg and others. And Sergio Boixo, chief scientist for Google Quantum AI, said in an email that the Google team knew its advantage might not last for long. “In our 2019 paper, we said that classical algorithms would improve,” he said. But, “we don’t think this classical approach can continue with quantum circuits in 2022 and beyond.”
The “problem” solved by Sycamore was designed to be hard for a conventional computer but as easy as possible for a quantum computer, which manipulates qubits that can be set to 0, 1, or, thanks to quantum mechanics, any combination of 0 and 1 at the same time. Together, Sycamore’s 53 qubits, tiny resonant electrical circuits made of superconducting metal, can encode any number from 0 to 2^53 (roughly 9 quadrillion), or even all of them at once.
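The exponential growth described above is the crux of the difficulty: an n-qubit register is described by 2^n complex amplitudes. A minimal sketch in NumPy (a toy 3-qubit register standing in for Sycamore's 53) shows how applying a gate to every qubit spreads the state over all values at once:

```python
import numpy as np

# A register of n qubits is described by 2**n complex amplitudes.
# Sycamore's 53 qubits would need 2**53 (~9 quadrillion) of them,
# which is why brute-force simulation is so costly.
n = 3  # small stand-in for Sycamore's 53
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0  # all qubits set to 0

# A Hadamard gate puts a single qubit into an equal mix of 0 and 1.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# Applying it to every qubit spreads the state over all 2**n values.
for q in range(n):
    state = state.reshape([2] * n)
    state = np.tensordot(H, state, axes=([1], [q]))
    state = np.moveaxis(state, 0, q).reshape(-1)

probs = np.abs(state) ** 2
print(probs)  # each of the 2**n outcomes now has probability 1/2**n
```

Doubling the qubit count squares the number of amplitudes a classical simulator must track, which is what made the 53-qubit circuit look out of reach.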
Starting with all qubits set to 0, the Google researchers applied a random but fixed set of logic operations, or gates, to single qubits and pairs of qubits over 20 cycles, then read out the qubits. Crudely speaking, quantum waves representing all possible outputs sloshed among the qubits, and the gates created interference that amplified some outputs and canceled others. So some outputs should have appeared with greater probability than others. Over millions of trials, a spiky output pattern emerged.
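The interference effect can be illustrated with a toy simulation (not the Google circuit itself): a deep random circuit acts much like a Haar-random unitary, which NumPy can generate by QR-decomposing a complex Gaussian matrix. The resulting output distribution is visibly spiky rather than uniform:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4          # toy stand-in for Sycamore's 53 qubits
dim = 2**n

# A deep random circuit behaves like a random unitary; a standard way to
# build one is QR decomposition of a complex Gaussian matrix, with a
# phase fix so the distribution is uniform (Haar) over unitaries.
z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
q, r = np.linalg.qr(z)
d = np.diagonal(r)
u = q * (d / np.abs(d))

# Start from |00...0>, apply the circuit, read the output distribution.
state = np.zeros(dim, dtype=complex)
state[0] = 1.0
probs = np.abs(u @ state) ** 2

# Interference makes some bit strings likelier than others, so the
# largest probability sits well above the uniform value 1/dim.
print(probs.max(), 1 / dim)
```

Repeating the measurement many times, as Google did over millions of trials, traces out this fingerprint-like pattern of favored and suppressed bit strings.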
Google researchers argued that simulating these interference effects would swamp even Summit, a supercomputer at Oak Ridge National Laboratory, which has 9,216 central processing units and 27,648 faster graphics processing units (GPUs). Researchers at IBM, which developed Summit, quickly countered that if they used every bit of hard disk space available to the computer, it could handle the calculation in days. Now, Pan Zhang, a statistical physicist at the Institute of Theoretical Physics at the Chinese Academy of Sciences, and his colleagues have shown how to beat Sycamore in a paper in press at Physical Review Letters.
Following others, Zhang and colleagues reformulated the problem as a 3D mathematical array called a tensor network. It consisted of 20 layers, one for each gate cycle, with each layer comprising 53 points, one for each qubit. Lines connected the points to represent gates, with each gate encoded in a tensor, a 2D or 4D grid of complex numbers. Running the simulation then reduces to, essentially, multiplying all the tensors. “The advantage of the tensor network method is that we can use multiple GPUs to do the calculations in parallel,” Zhang says.
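A minimal sketch of what "multiplying all the tensors" means (a two-qubit toy network, not the team's actual 53-qubit, 20-layer contraction): each single-qubit gate is a 2x2 tensor, each two-qubit gate a 2x2x2x2 tensor, and `np.einsum` sums over the shared indices, like joining the lines in the network diagram.

```python
import numpy as np

rng = np.random.default_rng(1)

# One-qubit gate: a 2D (2x2) grid of complex numbers.
# Two-qubit gate: a 4D (2x2x2x2) grid (random placeholders here).
one_qubit_gate = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
two_qubit_gate = (rng.normal(size=(2, 2, 2, 2))
                  + 1j * rng.normal(size=(2, 2, 2, 2)))

# Two qubits starting in state 0.
qa = np.array([1.0, 0.0], dtype=complex)
qb = np.array([1.0, 0.0], dtype=complex)

# Contract the network: gate on qubit a, then a joint gate on both.
# einsum sums over the shared (connected) indices m, i, and n.
out = np.einsum("abmn,mi,i,n->ab", two_qubit_gate, one_qubit_gate, qa, qb)
print(out.shape)  # (2, 2): amplitudes for the two output qubits
```

Because each pairwise contraction is just a big tensor multiplication, the work parallelizes naturally across many GPUs, which is the advantage Zhang describes.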
Zhang and colleagues also relied on a key insight: Sycamore’s calculation wasn’t very accurate, so theirs didn’t need to be either. Sycamore calculated the output distribution with an estimated fidelity of 0.2%—enough to distinguish the fingerprint-like spikiness from the noise in the circuit. So Zhang’s team traded accuracy for speed by cutting some lines in its network and eliminating the corresponding gates. Cutting just eight lines made the calculation 256 times faster while maintaining a fidelity of 0.37%.
The researchers calculated the output pattern for 1 million of the 9 quadrillion possible number strings, relying on an innovation of their own to obtain a truly random, representative set. The computation took 15 hours on 512 GPUs and yielded the telltale spiky output. “It’s fair to say that Google’s experiment has been simulated on a conventional computer,” says Dominik Hangleiter, a quantum computer scientist at the University of Maryland, College Park. On a supercomputer, the calculation would take a few dozen seconds, Zhang says — 10 billion times faster than the Google team estimated.
The breakthrough highlights the pitfalls of racing a quantum computer against a conventional computer, the researchers say. “There is an urgent need for better quantum supremacy experiments,” says Aaronson. Zhang suggests a more practical approach: “We need to find some real-world applications to demonstrate the quantum advantage.”
However, Google’s demonstration was not mere hype, the researchers say. Sycamore required far fewer operations and less power than a supercomputer, Zhang notes. And had Sycamore’s fidelity been even slightly higher, he says, his team’s simulation could not have kept up. As Hangleiter puts it, “Google’s experiment did what it was meant to do, start this race.”