Google claims milestone in quantum error correction
'We reached the break-even point' on roadmap, say boffins in peer-reviewed paper, but it's still 'not good enough'
Google is claiming a new milestone on the road to fault-tolerant quantum computers: a demonstration that a key error correction method, which groups multiple physical qubits into logical qubits, can deliver lower error rates as it scales, paving the way for quantum systems that grow reliably.
A team at Google Quantum AI said it has demonstrated that a method of quantum error correction called surface codes can exhibit lower error rates as larger surface codes are employed. Specifically, it tested a distance-5 logical qubit against a distance-3 logical qubit, and the larger code delivered more reliable performance.
The work is described in a peer-reviewed paper published in the journal Nature, entitled "Suppressing quantum errors by scaling a surface code logical qubit". While the authors note that more work is needed to reach the logical error rates required for effective computation, the result demonstrates that this approach may be able to scale to deliver a fault-tolerant quantum computer.
Dr Hartmut Neven, one of the authors, said the Google Quantum AI team aims to build a machine with about a million quantum bits, but for it to be useful those qubits must be capable of participating in a large number of algorithmic steps.
"The only way to achieve this is by introducing quantum error correction," he said, "and our team was able for the first time to demonstrate, in practice, that qubits protected by surface code error correction can indeed be scaled to reach lower error rates."
When engineers previously tried to organize larger and larger ensembles of physical qubits into logical qubits to reach lower error rates, the opposite happened, Dr Neven said. This was because "more qubits means more gates means more readout operations means more stuff that can throw an error," he explained. But if all components in the system have sufficiently low error rates, "then the magic of quantum error correction kicks in."
As described in the paper, surface codes are a family of quantum error-correcting codes in which a logical qubit is defined by the joint entangled state of a d × d square of physical qubits, referred to as data qubits. Errors are detected by periodically measuring special "measure qubits" interspersed with the data qubits, of which there is one fewer than the number of data qubits.
Having one fewer measure qubit keeps one degree of freedom open, and that degree of freedom is used to store the quantum information, Dr Neven explained.
This means that a 3 × 3 surface code uses nine data qubits plus eight measure qubits, for a total of 17 making up one logical qubit. A 5 × 5 surface code uses 25 data qubits plus 24 measure qubits, for a total of 49 in one logical qubit.
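The qubit arithmetic is easy to verify. Here's a minimal Python sketch (our own illustration, not code from the paper) that computes the physical-qubit budget for a distance-d surface code:

```python
def surface_code_qubits(d: int) -> dict:
    """Qubit counts for a distance-d surface code logical qubit.

    A d x d grid of data qubits holds the logical state, and d**2 - 1
    measure qubits interleaved with them detect errors, leaving one
    degree of freedom free to store the quantum information.
    """
    data = d * d
    measure = d * d - 1
    return {"data": data, "measure": measure, "total": data + measure}

for d in (3, 5):
    c = surface_code_qubits(d)
    print(f"distance-{d}: {c['data']} data + {c['measure']} measure = {c['total']} total")
# distance-3: 9 data + 8 measure = 17 total
# distance-5: 25 data + 24 measure = 49 total
```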
In tests with a 72-qubit quantum system, based on Google's existing Sycamore design, the team found that the larger surface code delivered better logical qubit performance (2.914 percent logical errors per cycle) than the smaller surface code (3.028 percent logical errors per cycle).
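Those two figures put the larger code just past the break-even point. A quick sketch shows the per-cycle improvement and what it means over repeated error-correction cycles; the rates are from the paper, while the assumption that errors strike independently each cycle is ours, purely for illustration:

```python
# Logical error rates per error-correction cycle reported in the paper.
p_d3 = 0.03028  # distance-3 surface code
p_d5 = 0.02914  # distance-5 surface code

# Factor by which the larger code suppresses errors per cycle.
print(f"suppression factor: {p_d3 / p_d5:.3f}")  # ~1.039, just past break-even

# Chance a logical qubit survives n cycles error-free, assuming
# independent errors per cycle (an idealisation for illustration).
n = 25
for label, p in (("distance-3", p_d3), ("distance-5", p_d5)):
    print(f"{label}: {(1 - p) ** n:.2f} survival over {n} cycles")
```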
"What we've shown here for the first time is a system implementing error correction that was good enough and large enough to show that the logical error is reduced as the scale of error correction increases," said Dr Julian Kelly, another of the paper authors.
This result does not yet show the scale of performance needed to build an error-corrected machine, Dr Kelly said, but it does demonstrate that error correction works and "gives us key learnings as we move forward to our next milestone, building a large scale and extremely low error logical qubit," he added.
Google published a roadmap in 2020 leading to the goal of an error-corrected quantum computer with a million physical qubits. The first milestone was Google's claimed demonstration of "quantum supremacy" back in 2019, while this logical qubit result represents the second.
That million-qubit machine is intended to have a roughly thousand-to-one overhead of physical to logical qubits, resulting in 1,000 reliable logical qubits available for quantum calculations.
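That overhead arithmetic checks out directly; as a rough sketch (the code-distance estimate at the end is our own back-of-the-envelope figure, not Google's published design):

```python
import math

physical = 1_000_000  # roadmap target machine size
overhead = 1_000      # physical qubits per logical qubit (Google's stated ratio)
print(physical // overhead)  # 1,000 logical qubits for computation

# If each logical qubit were a single 2*d**2 - 1 qubit surface code patch,
# a 1,000-qubit budget per logical qubit would imply a code distance of about:
print(math.isqrt((overhead + 1) // 2))  # 22 -- a rough illustration only
```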
"And we think at that stage, we can confidently promise commercial value," said Dr Nevan.
However, the Google Quantum AI team is under no illusions about the challenges still ahead on this roadmap.
"That is milestone six, and we just achieved milestone two, which as Julian explained, is just the break-even point; we showed for the first time that in more qubits, five by five, instead of three by three data qubits, the error rate came down for the first time," said Dr Nevan.
"So in financial language, we reached the break-even point, but that's of course not good enough, we need to get to an absolute low error rate, let's say only one in a minute or one in a billion times the qubit operations throw an error," he explained.
Gartner VP analyst Matthew Brisse said that any improvement in error correction for quantum processors would be a step forward. "Error correction is a major barrier to quantum scaling. Any improvement in error correction will help quantum systems scale, therefore allowing for more complex algorithms to run with the goal towards general purpose fault tolerant quantum computing," he said.
"Today some problems can be addressed with tens to hundreds of qubits, but the real power of quantum will come with millions of qubits. Error correction at scale would be a major accomplishment."
However, Chris Schnabel, VP at quantum-resistant cryptography outfit Qrypt and a former quantum specialist at IBM, isn't so sure today's development is breaking any new ground.
"As described this doesn't sound particularly novel, as the entire industry has been driving research around the use of surface codes for error correction for quite some time," he said.
"It would be interesting if they could demonstrate the use of fewer qubits to accomplish this, as scaling quantum computers to larger qubit counts often also affects error rates, so it frequently turns into a balancing act." ®