The Osprey has landed: IBM's 433-qubit quantum processor
Still quite a way to go before the fabled 4,158-qubit system, scheduled for 2025, lands. Plus: Fujitsu details quantum/HPC hybrid calculation tech
IBM has officially unveiled its Osprey quantum processor featuring 433 qubits, more than three times the qubits seen in its Eagle processor introduced just a year ago.
The news comes soon after Fujitsu said it has designed hybrid quantum/HPC computing technology that automatically finds the "optimal" solution for complex customer workloads.
At its Quantum Summit 2022, IBM detailed its new Osprey quantum processor and gave an update on its upcoming IBM Quantum System Two hardware.
With its 433 qubits, Osprey has the potential to run complex quantum computations well beyond the computational capability of any classical computer, Big Blue claimed, and represents another milestone toward its previously announced goal of delivering a 4,158-qubit system by 2025.
Big Blue's roadmap includes two other quantum processors — the 1,121-qubit Condor and 1,386-qubit Flamingo in 2023 and 2024 respectively — between the Osprey and its planned 4,000-qubit+ Kookaburra processor, which it hopes to also release in 2025.
"The new 'Osprey' processor brings us a step closer to the point where quantum computers will be used to tackle previously unsolvable problems," claimed IBM senior VP and director of research, Dr Darío Gil.
Like last year's 127-qubit Eagle, Osprey includes multi-level wiring to provide flexibility for signal routing and device layout, while also adding in integrated filtering to reduce noise and improve stability, IBM said.
In addition, the company is aiming to address the noise issue in quantum processors with new capabilities that allow users to employ error suppression as part of its Qiskit software development kit for quantum systems. Currently a beta update to the Qiskit Runtime, this allows a user to trade speed for reduced error count via a simple option in the API, IBM said.
Qiskit also now allows users to add error mitigation strategies. The methods available carry different cost/accuracy trade-offs, so IBM has exposed them via a new option on the Qiskit primitives called a "resilience level," which lets users dial in the trade-off suitable for their task. This is also a beta, with full support for both features scheduled for 2025.
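For a sense of how this surfaces to developers, here is a minimal configuration sketch using the beta-era `qiskit-ibm-runtime` package. The option names reflect that package's `Options` class at the time; exact values and defaults may have changed in later releases:

```python
# Sketch: configuring error suppression and mitigation for a Qiskit
# Runtime primitive job (beta-era qiskit-ibm-runtime API).
from qiskit_ibm_runtime import Options

options = Options()
options.resilience_level = 1    # 0 = no mitigation; higher levels trade runtime for accuracy
options.optimization_level = 3  # heavier circuit optimization for error suppression
```

The resulting `options` object is then passed to a Runtime primitive such as `Estimator` when the job is submitted; raising the resilience level invokes costlier mitigation methods, which is the speed-for-accuracy dial IBM describes.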
IBM said its Quantum System Two, the company's first step on its datacenter-style approach to quantum computers, is targeted to be online by the end of 2023. (In fact, a video released by the company says it will be unveiling its first working system at next year's Quantum Summit).
According to IBM, Quantum System Two will form a building block of its vision of quantum-centric supercomputing. This will scale by using a modular architecture linked by quantum communication to increase its computational capacity, as well as deploying hybrid cloud middleware to integrate quantum and classical workflows.
IBM Fellow and VP of IBM Quantum Jay Gambetta said that today's news marks "a pivotal moment in the evolution of the global quantum computing sector," as the company advances along its quantum roadmap.
"As we continue to increase the scale of quantum systems and make them simpler to use, we will continue to see adoption and growth of the quantum industry," he predicted.
Going for broker
Meanwhile, Fujitsu said it was working towards offering customers a computing workload broker that will use AI to automatically select the "optimal" resources for an application from a mix of HPC and quantum computing technologies.
While working on that technology, Fujitsu said it developed a quantum/HPC hybrid calculation technique for solving quantum chemistry problems. Intended as a precursor to the workload broker, it aims to enable high-precision calculations at high speed by combining HPC and quantum resources.
It's basically a prototype workload broker, but built with a single workload in mind: analyzing the properties of materials for drug discovery and new material development.
This comprises three main features. The first is a quantum/HPC algorithm discrimination technology, which Fujitsu said can determine whether quantum or HPC algorithms offer the optimal solution to a given problem. The second is an AI model that estimates in advance the time and cost required to obtain accurate solutions. The third is a system that uses the output of the other two to let customers perform calculations at optimal cost and in optimal time.
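Fujitsu hasn't published code, but the selection step those three features imply can be sketched in a few lines of Python. Everything here (the `Estimate` record, the backend names, the figures) is hypothetical, for illustration only:

```python
# Hypothetical sketch of a workload broker's selection step: given per-backend
# predictions of runtime and cost for reaching a target accuracy (the output
# of the discrimination technology and the AI estimator), pick the cheapest
# option that meets a deadline. All names and numbers are invented.
from dataclasses import dataclass

@dataclass
class Estimate:
    backend: str   # e.g. "hpc" or "quantum_simulator"
    hours: float   # predicted wall-clock time to target accuracy
    cost: float    # predicted cost, arbitrary units

def choose_backend(estimates, deadline_hours):
    """Return the lowest-cost estimate that finishes within the deadline."""
    feasible = [e for e in estimates if e.hours <= deadline_hours]
    if not feasible:
        return None  # nothing meets the deadline; caller must relax constraints
    return min(feasible, key=lambda e: e.cost)

estimates = [
    Estimate("hpc", hours=12.0, cost=300.0),
    Estimate("quantum_simulator", hours=4.0, cost=500.0),
]
print(choose_backend(estimates, deadline_hours=6.0).backend)   # quantum_simulator
print(choose_backend(estimates, deadline_hours=24.0).backend)  # hpc
```

With a tight deadline the broker pays extra for the faster quantum path; with a loose one it falls back to cheaper HPC, which is the cost/time trade the third feature is meant to automate.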
However, instead of using an actual quantum computer, this prototype uses Fujitsu's quantum simulator technology, announced in March. This runs on a cluster of the company's PRIMEHPC FX 700 nodes, which are based on a similar architecture to the Fugaku supercomputer.
The version used here can simulate a quantum computer with 39 qubits, and runs on 512 nodes instead of 64, Fujitsu told us. It also has improved job management, system management, and automated optimization features. The company said it plans to step up to a 40-qubit quantum simulator in the spring of 2023.
Fujitsu said its next step with the quantum/HPC hybrid calculation technology is to verify its effectiveness and develop it further, with the aim of having the workload broker technology ready by the company's fiscal 2023, which starts in April next year. ®