At its 2022 Quantum Summit, IBM made 12 major announcements indicating that quantum computing is on the cusp of significant breakthroughs. One of the most exciting and far-reaching was IBM's plan to begin investigating quantum-centric supercomputing.
A big plus for IBM is that there were no real surprises among the announcements, since each one had been highlighted on IBM’s previously published technology roadmap.
How performance is measured
Before discussing any announcements, it is essential to understand how IBM looks at quantum performance using three key metrics: scale, quality, and speed.
Scale is measured by the number of qubits. Quality is based on quantum volume, which reflects how large a circuit a processor can run reliably. Speed is determined by how fast actual quantum circuits can be run.
A few years ago, IBM created a circuit-efficiency metric called Circuit Layer Operations per Second (CLOPS). CLOPS measures how many actual quantum circuits a quantum processor can run in a given amount of time.
Dr. Jay Gambetta, IBM Fellow and Vice President of IBM Quantum, explained how CLOPS had affected quantum progress. “We continue to see that speed is important,” he said. “We have started many different research activities that focused on improving speed, and it all came together with success.”
IBM started the year with a CLOPS of 1,400 and an established goal of reaching 10,000 by year-end. IBM surpassed that goal, achieving 15,000 CLOPS. Dr. Gambetta explained that IBM researchers achieved this roughly 10x improvement by focusing on the combination of scale, quality, and speed.
IBM also announced a 4x improvement in quantum volume, from a QV of 128 to 512. Both the quantum volume and CLOPS measurements were performed on IBM Falcon processors.
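To see what that 4x jump means in circuit terms, recall that quantum volume is conventionally reported as 2^n, where n is the width and depth of the largest "square" circuit a machine can run reliably. A minimal sketch of that arithmetic:

```python
import math

# Quantum volume is reported as 2**n, where n is the width (and depth)
# of the largest "square" circuit a processor runs reliably.
def qv_circuit_size(quantum_volume: int) -> int:
    return int(math.log2(quantum_volume))

print(qv_circuit_size(128))  # 7
print(qv_circuit_size(512))  # 9
```

So moving from QV 128 to QV 512 corresponds to reliably running square circuits two layers wider and deeper.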
Osprey – IBM’s new 433-qubit quantum processor
One significant 2022 Quantum Summit announcement was IBM’s new Osprey quantum processor. In terms of scale, it is the largest quantum processor IBM has ever built. Equipped with 433 qubits, Osprey more than triples the 127 qubits of the IBM Eagle processor unveiled last year.
With 433 qubits, the Osprey is in a different category than previous IBM quantum processors because it has the potential to run complex quantum circuits beyond the capability of classical computers. To put its computational size in perspective, it would take more classical bits than atoms in the universe for a classical computer to represent a state on the IBM Osprey processor.
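The "more bits than atoms" claim follows from how quantum states scale: an n-qubit state is described by 2^n complex amplitudes. A quick back-of-the-envelope check, using the common order-of-magnitude estimate of about 10^80 atoms in the observable universe:

```python
# An n-qubit state is described by 2**n complex amplitudes. Even if each
# amplitude took only one classical bit (a wild underestimate), a
# 433-qubit state would need 2**433 bits of storage.
n_qubits = 433
amplitudes = 2 ** n_qubits
atoms_in_universe = 10 ** 80  # common order-of-magnitude estimate

print(amplitudes > atoms_in_universe)  # True
print(len(str(amplitudes)))            # 2**433 has 131 decimal digits
```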
Despite its newness, IBM has already iterated a second-generation Osprey, with a 3x improvement in coherence time, to 300 microseconds. IBM produced the second generation quickly because it uses an agile hardware development process in which multiple research teams work in parallel to improve processor scale, quality, and speed. Resulting improvements are incorporated into later chip generations; in this case, lessons from multiple iterations of the third-generation Eagle processor helped create the improved second-generation Osprey.
The longer a qubit can maintain its superposition state (its coherence time), the more computational operations it can perform. One of today’s major quantum challenges is qubit sensitivity: qubits are susceptible to all forms of environmental noise, from quantum controllers, imperceptible vibrations, wiring, cryogenics, heat, and other qubits to even cosmic radiation.
Noise is troublesome because it can cause quantum states to collapse, creating errors. The trouble doesn’t stop there. Because of quantum effects, a single uncorrected error can cascade throughout the entire system and destroy the computation.
Since we don't yet know how to fully correct quantum errors, noise is what stands between today's few-hundred-qubit machines and future systems with thousands or millions of qubits. And although noise can’t be completely removed, and a complete error correction solution doesn’t yet exist, noise can be partially managed using error suppression and error mitigation techniques.
IBM has a very mature error correction research program. In addition, IBM is one of the few companies with a continuous history of error mitigation research. Error correction is a long-term effort, with perhaps five or more years of research needed before a workable solution can be found. On the other hand, error mitigation can be used now through the Qiskit Runtime primitives Sampler and Estimator.
Sampler is a program that takes a user's circuits as input and generates an error-mitigated readout of quasiprobabilities. This gives users a way to better evaluate shot results using error mitigation and to more efficiently assess multiple relevant data points in the presence of destructive interference.
Estimator is a program interface that takes circuits and observables and allows users to selectively group between circuits and observables for execution to efficiently evaluate expectation values and variances for a given parameter input. This primitive will enable users to efficiently calculate and interpret expectation values of quantum operators required for many algorithms.
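To make "expectation value" concrete, here is a toy illustration (deliberately not the Qiskit API) of the kind of quantity an Estimator-style primitive returns. For a single qubit measured in the computational basis, the expectation value of the Pauli-Z observable is just the difference in outcome frequencies:

```python
# Toy illustration of what an Estimator-style primitive computes:
# the expectation value of the Pauli-Z observable from shot counts.
# <Z> = (shots measuring |0> - shots measuring |1>) / total shots.
def z_expectation(counts: dict) -> float:
    shots = sum(counts.values())
    return (counts.get("0", 0) - counts.get("1", 0)) / shots

# 600 of 1,000 shots returned |0>:
print(z_expectation({"0": 600, "1": 400}))  # 0.2
```

In practice the primitive handles grouping, basis rotations, and error mitigation behind this simple interface.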
At the 2022 Quantum Summit, IBM announced the release of a beta update to Qiskit Runtime that simplifies error mitigation by letting users choose how much accuracy is desired. Four incremental resilience settings enable the user to trade speed for a reduced error count.
The settings range from Resilience 0, which applies no mitigation, through Resilience 1 (measurement error mitigation), Resilience 2 (biased error mitigation via zero-noise extrapolation, or ZNE), and Resilience 3 (unbiased estimators via probabilistic error cancellation, or PEC).
Resilience settings, like dynamic circuits, are now built into the software and are available on 18 IBM systems. It is important to know that as the resilience number goes up, so does the cost. Resilience 3 is the most expensive: it produces the fewest errors but could require 100,000x more time.
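The ZNE technique behind the Resilience 2 setting can be sketched in a few lines. The idea: run the same circuit at deliberately amplified noise levels, fit a curve to the results, and extrapolate back to the zero-noise limit. The numbers below are hypothetical measurements, and real ZNE often uses richer extrapolation models than a straight line:

```python
# Minimal sketch of zero-noise extrapolation (ZNE): extrapolate
# expectation values measured at amplified noise back to zero noise.
def zne_linear(noise_factors, measured_values):
    # Least-squares linear fit y = slope*x + intercept;
    # the zero-noise estimate is the intercept.
    n = len(noise_factors)
    mean_x = sum(noise_factors) / n
    mean_y = sum(measured_values) / n
    cov = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(noise_factors, measured_values))
    var = sum((x - mean_x) ** 2 for x in noise_factors)
    slope = cov / var
    return mean_y - slope * mean_x  # value extrapolated to zero noise

# Hypothetical expectation values measured at noise scales 1x, 2x, 3x:
print(zne_linear([1, 2, 3], [0.80, 0.65, 0.50]))  # ~0.95
```

The cost multiplier in the text comes from exactly this kind of repetition: each mitigation level runs the circuit many extra times under varied conditions.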
IBM announced that dynamic circuits will also be built into the same systems as error mitigation. Dynamic circuits are a powerful and important technology that can:
- Extend hardware capabilities to allow reduced circuit depth
- Allow consideration of alternative models for quantum computing, such as the measurement model in contrast to the standard gate array model
- Play a fundamental role in quantum error correction codes that use parity checks that are dependent on real-time classical data
Dynamic circuits have essentially created a much broader family of circuits that combine measurement, computation, and control, allowing future operations to be changed or conditioned on the outcomes of mid-circuit measurements made during circuit execution.
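A toy classical simulation can convey the feed-forward idea. One canonical use of a dynamic circuit is "active reset": measure a qubit mid-circuit and, if it came up |1⟩, immediately apply a flip to return it to |0⟩, rather than waiting many coherence times for it to decay. This sketch models the qubit as a single bit, so the measurement is deterministic; the point is the real-time classical decision:

```python
# Toy sketch of dynamic-circuit feed-forward: a mid-circuit measurement
# whose outcome conditions a later gate, here an "active reset" to |0>.
def active_reset(qubit_state: int) -> int:
    outcome = qubit_state      # mid-circuit measurement (deterministic toy)
    if outcome == 1:           # real-time classical decision...
        qubit_state ^= 1       # ...applies a conditional X gate
    return qubit_state

print(active_reset(0), active_reset(1))  # 0 0
```

The same measure-then-branch pattern is what enables the parity checks used in error correction codes mentioned above.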
At the summit, IBM announced Quantum Safe as a product to enable customers to harden systems and data against intrusion enabled by future quantum technology.
Because quantum computers are maturing quickly, there is a high probability that they will eventually be able to break the two most widely used public-key security protocols in the world.
Until recently, encryption methods based on prime factorization were considered unbreakable and were used to safeguard everything from email to banking to credit cards. It is not only future data that is at risk: all previously captured data secured with prime factorization could become insecure in a matter of years.
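The asymmetry that makes this encryption work is easy to demonstrate at toy scale: multiplying two primes is trivial, but recovering them from the product requires search. Real moduli are hundreds of digits long, far beyond classical brute force, yet potentially within reach of a large fault-tolerant quantum computer running Shor's algorithm:

```python
# Toy illustration of the hard problem behind prime-factorization
# cryptography: given only the product n of two primes, recover them.
# Trial division works at this scale but scales hopelessly for the
# hundreds-of-digits moduli used in real deployments.
def trial_division_factor(n: int) -> tuple:
    p = 2
    while p * p <= n:
        if n % p == 0:
            return (p, n // p)  # smallest prime factor found first
        p += 1
    return (n, 1)  # n itself is prime

print(trial_division_factor(53 * 61))  # (53, 61)
```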
Implementing Quantum Safe across multiple industries will take several years. IBM announced that its first Quantum Safe partnership will be with Vodafone, to understand its application in the telco industry.
Quantum serverless and circuit knitting
IBM announced alpha releases of circuit knitting and quantum serverless. It is much easier for algorithm developers to create and run many small quantum and classical programs than one large program. IBM is integrating Quantum Serverless into its core software stack to enable circuit knitting, which allows large quantum circuits to be solved by splitting them into smaller circuits and distributing them across quantum resources. The results are then recombined through an orchestrated solution using classical CPUs.
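The split-run-recombine workflow can be sketched abstractly. This is a highly simplified illustration with made-up data, not the Circuit Knitting Toolbox API; real circuit cutting expands one large circuit into many sub-circuit variants and recombines them as a weighted sum, which is where the classical orchestration cost comes from:

```python
# Highly simplified sketch of the circuit-knitting workflow:
# run small sub-jobs independently (in practice, on separate quantum
# resources), then recombine the partial results classically.
def run_subjob(subcircuit: dict) -> float:
    # Stand-in for executing a small circuit and returning a result.
    return subcircuit["expectation"]

def knit(results, coefficients) -> float:
    # Classical recombination: a weighted sum over sub-circuit results.
    return sum(c * r for c, r in zip(coefficients, results))

subcircuits = [{"expectation": 0.9}, {"expectation": 0.8}]
results = [run_subjob(s) for s in subcircuits]
print(knit(results, [0.5, 0.5]))  # ~0.85
```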
Quantum-centric supercomputing
One of the most exciting and far-reaching announcements made at the 2022 IBM Quantum Summit was IBM’s vision for a future quantum-centric supercomputer. A complete discussion of the project is contained in the Moor Insights & Strategy video interview with Patrick Moorhead and Dr. Jay Gambetta.
There are three fundamental things necessary before a quantum-centric supercomputer can be built.
First, quantum modularity is essential. It will likely be necessary to have multiple chips, multiple fridges, high-quality control electronics, and modular software.
Second, communication and computation have traditionally been regarded as different fields. However, in a quantum-centric supercomputing environment, it will be necessary for them to come together, much as classical computing allows data to be communicated between processes.
Third, we’ll need middleware for quantum. How do we get quantum and classical working together? It will be essential to have seamlessly integrated workflows that can take the best of classical and quantum to enable quantum as an accelerator in a larger heterogeneous computing architecture.
IBM recognizes that the scope of such a project will require a community effort focused on the first-of-its-kind quantum project.
Moor Insights & Strategy will write a future article on the quantum-centric supercomputer project.
IBM delivered the advanced products and features on its roadmap according to the originally forecasted timelines. That is impressive, considering the complexity and interdependence of the numerous features and products.
Looking back on the roadmap from a few years ago and considering the numerous announcements made at The IBM Quantum Summit 2022, it is clear that quantum computing has made significant progress.
This year’s announcements establish the foundation for 2023, when multiple devices will be connected to grow the number of qubits. It also establishes a preliminary base for the eventual integration of communications and quantum necessary for quantum-centric supercomputing.
- IBM likely developed Osprey as a pathfinder to find a solution for scaling. Although the Heron has fewer qubits than the Osprey, its complexity is similar. The Heron is better suited for attaining higher quantum volume and better fidelity. For higher fidelity, more complexity is needed in the gates. But the Osprey probably becomes the model for scaling a chip large enough for IBM’s 100x100 challenge planned for late 2024.
- IBM’s generation-three control electronics support dynamic circuits, making them compatible with OpenQASM 3.
- One thing not covered in this piece is IBM Quantum System Two, a key component of IBM’s future quantum plans. IBM plans to show it at the 2023 IBM Quantum Summit.
Note: Moor Insights & Strategy writers and editors may have contributed to this article.