'Quantum supremacy will soon be ours!', says Google as it reveals 72-qubit quantum chip
Don't panic: 'supremacy' is the point at which quantum kit trumps classical computers
Google reckons it's on the cusp of demonstrating “quantum supremacy” with the development of a 72-qubit processor.
Quantum supremacy refers to the point at which a quantum computer should outperform a classical computer, without incurring the performance costs of error correction (as described in this 2016 paper).
Google demonstrated a nine-qubit design in 2014 (published in Nature in 2015), and Julian Kelly, a research scientist at the company's Quantum AI Lab, explained that the previous experiment hit one percent readout error rates, 0.1 percent errors for single-qubit gates, and 0.6 percent errors for two-qubit gates.
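To get a feel for what those per-gate error rates mean for a whole circuit, here's a back-of-envelope sketch (not Google's methodology): if each operation succeeds independently, the circuit's overall fidelity is roughly the product of the per-gate success probabilities. The gate counts below are made up for illustration.

```python
# Error rates reported for the nine-qubit device
SINGLE_QUBIT_ERROR = 0.001   # 0.1% per single-qubit gate
TWO_QUBIT_ERROR = 0.006      # 0.6% per two-qubit gate
READOUT_ERROR = 0.01         # 1% per qubit readout

def estimated_fidelity(n_single: int, n_two: int, n_readouts: int) -> float:
    """Crude multiplicative model: assume every gate fails independently."""
    return ((1 - SINGLE_QUBIT_ERROR) ** n_single
            * (1 - TWO_QUBIT_ERROR) ** n_two
            * (1 - READOUT_ERROR) ** n_readouts)

# Hypothetical circuit: 100 single-qubit gates, 40 two-qubit gates, 9 readouts
fidelity = estimated_fidelity(100, 40, 9)
print(f"{fidelity:.2f}")  # roughly 0.65 -- errors compound fast
```

Even at these error rates, a modest circuit loses around a third of its fidelity, which is why scaling up without degrading gate quality is the hard part.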
The 72-qubit design, dubbed “Bristlecone”, aims to “preserve the underlying physics” of its predecessor, scaling up “the same scheme for coupling, control, and readout”.
The new device doesn't quite achieve quantum supremacy; rather, it's a testbed for research into error rates and scalability “as well as applications in quantum simulation, optimisation, and machine learning”.
However, Kelly's blog post announcing the chip hinted that the processor might be the platform that finally outperforms a classical computer: "We are cautiously optimistic that quantum supremacy can be achieved with Bristlecone", Kelly wrote.
Another important characteristic of a 72-qubit machine, as illustrated in the image below, is that it's still within reach of classical simulation. That matters because simulation is currently the only way to validate both that the quantum computer is operating correctly (you can cross-check its answers) and that it achieves a speed-up over classical computers.
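Why does classical simulation run out of road? A full statevector for n qubits holds 2**n complex amplitudes, so memory doubles with every qubit added. A rough sketch of the arithmetic:

```python
# Back-of-envelope memory cost of brute-force statevector simulation.
# 16 bytes per amplitude assumes complex128 (two 64-bit floats).
def statevector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    return (2 ** n_qubits) * bytes_per_amplitude

print(statevector_bytes(30) / 2**30)  # 30 qubits: 16 GiB -- a beefy workstation
print(statevector_bytes(72) / 2**70)  # 72 qubits: 64 ZiB -- far beyond any machine
```

This is why 72 qubits sits near the crossover: naive simulation of the full state is hopeless, and validating the device leans on cleverer partial-simulation techniques instead.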
Kelly's post drew attention to the need to validate a quantum computer's operation against a simulator, adding that Google has developed a suitable benchmarking tool: "We can assign a single system error by applying random quantum circuits to the device and checking the sampled output distribution against a classical simulation."
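A toy version of that cross-check can be sketched in a few lines: build a random circuit, compute its ideal output distribution with a statevector simulation, draw samples from a "device" (here, noiselessly, from the ideal distribution itself), and measure how far the sampled distribution strays from the simulated one. The circuit structure and distance metric below are illustrative choices, not Google's actual benchmark.

```python
import numpy as np

rng = np.random.default_rng(0)
N_QUBITS = 4            # toy size; real benchmarks use far more qubits
DIM = 2 ** N_QUBITS

def random_unitary(dim: int) -> np.ndarray:
    """Approximately Haar-random unitary via QR of a complex Gaussian matrix."""
    z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, r = np.linalg.qr(z)
    phases = np.diag(r) / np.abs(np.diag(r))
    return q * phases   # fix column phases so the distribution is uniform

# Ideal simulation: apply five random layers to the all-zeros state
state = np.zeros(DIM, dtype=complex)
state[0] = 1.0
for _ in range(5):
    state = random_unitary(DIM) @ state
ideal = np.abs(state) ** 2              # Born-rule output probabilities

# "Device" samples -- a perfect device would sample from the ideal distribution
samples = rng.choice(DIM, size=20_000, p=ideal)
observed = np.bincount(samples, minlength=DIM) / len(samples)

# Total variation distance: near zero means the device matches the simulation
tvd = 0.5 * np.abs(observed - ideal).sum()
print(f"TVD = {tvd:.4f}")
```

A real, noisy device would show a larger gap, and that gap is what lets researchers boil the whole system's performance down to a single error figure.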
Bristlecone would also, Kelly wrote, help Google's boffins develop quantum algorithms on “actual hardware”. ®