IBM Rolls Out A Game-Changing 127-Qubit Quantum Computer That Redefines Scale, Quality, And Speed

By Patrick Moorhead - December 16, 2021

IBM Quantum’s recent announcement of its new 127-qubit Eagle processor is an important milestone for the entire quantum ecosystem. The announcement came within the timeframe forecast by IBM’s September 2020 quantum technology roadmap. 

The 127-qubit Eagle is important for several reasons:

·      It is the first quantum computer with 127 high-quality qubits, surpassing China’s 113-photon Jiuzhang quantum machine. 

·      127 qubits put the processor beyond the ability of any classical computer to fully simulate - unknown territory for quantum science.

·      Eagle’s architecture contains many technical improvements that will help IBM attain its goal of frictionless computing by late 2023. 

·      Technical improvements in Eagle are also foundational to the development of IBM’s future quantum processors, the 433-qubit Osprey in 2022 and the 1,121-qubit Condor in 2023.

·      Eagle is the last quantum processor to be developed for use in the IBM Quantum System One. The following two generations - the Osprey and the Condor - will be developed for the IBM Quantum System Two. 

[Figure: 3D architecture of Eagle. Source: IBM]

Jerry Chow is Director, Quantum Hardware System Development for IBM Quantum. In a recent technology analyst conference, Chow said: “I can categorically say that this is the most advanced quantum computing chip ever built. In fact, not only has it been built, but our Eagle has landed. This is a world's first quantum processor over 100 qubits. And for that matter, it has 127 qubits arranged in our well-known heavy hexagonal lattice. And let me stress this isn't just a processor we fabricated, but a full working system that is running quantum circuits today.”

The Eagle chip uses readout multiplexing instead of sets of control and readout electronics for each qubit. This reduces the amount of wiring and electronics inside the dilution refrigerator.

The processor also incorporates fabrication that provides scalable access wiring to all qubits using 3D integration to place microwave circuit components and wiring on multiple physical levels. 

Scale, quality, and speed are important 

IBM measures quantum computing performance using three key metrics: scale, quality, and speed. For scale, it tracks progress by the number of qubits in a system, with the requirement that any qubit can participate in a 2-qubit gate operation.



Scaling the number of qubits is critical because it determines the degree of computational complexity a quantum machine can handle. Scaling qubits and hardware is not an overnight effort. It is a systematic, long-term research effort that usually results in small but important incremental improvements. 

Consider the improvements made to high-performance race cars. You can't make a vehicle faster simply by installing a new set of tires. A racing science team makes many small changes based on tests performed over a long period, considering fuel mixtures, carburetor settings, gear ratios, body aerodynamics, weight distribution, and much more.

IBM uses the same type of approach to achieve improvements and incremental scaling of qubits. In 2019, the 27-qubit Falcon processor was made possible by making superconducting Josephson junctions more reliable. Additionally, a different qubit lattice arrangement with improved yield reduced qubit-to-qubit interference. 

Last year, IBM released its 65-qubit Hummingbird quantum processor, which reduced qubit readout from one line per qubit to a multiplexed readout using one line for every eight qubits. The innovation created more space in the cryogenic system and made room to expand from 65 qubits to the Eagle’s 127 qubits.
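The wiring savings from that multiplexing change are easy to see with rough arithmetic (the numbers below are my own illustration, not IBM figures):

```python
# Rough arithmetic for the multiplexed-readout change described above:
# one readout line per eight qubits instead of one line per qubit.
import math

qubits = 127
dedicated_lines = qubits                    # one readout line per qubit
multiplexed_lines = math.ceil(qubits / 8)   # one line shared by 8 qubits

print(dedicated_lines, multiplexed_lines)   # 127 vs 16
```

At Eagle's scale, that is roughly an eight-fold reduction in readout wiring inside the dilution refrigerator.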



IBM considers quality to be a measure of how well its technology implements quantum circuits of sufficient depth. It uses a holistic metric called Quantum Volume to benchmark quality. Introduced by IBM in 2017, Quantum Volume takes into account several factors: the number of qubits, how qubits are interconnected, gate and measurement errors, device cross-talk, circuit compiler efficiency, material loss and other imperfections, and control and readout errors. Read more about Quantum Volume in my previous Forbes articles here and here.
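As a simplified sketch of how a Quantum Volume figure is reported: the benchmark runs randomized "square" model circuits (n qubits, depth n) and finds the largest n at which the machine's measured heavy-output probability clears the pass threshold; the reported number is then two to that power. The code below only captures that final step, not the full statistical test:

```python
# Simplified sketch of how a Quantum Volume figure is reported. The real
# benchmark runs randomized n-qubit, depth-n model circuits and checks that
# the heavy-output probability exceeds 2/3 with statistical confidence;
# this function only shows the final reporting step.
def quantum_volume(largest_passing_width: int) -> int:
    """QV = 2^n, where n is the largest square circuit the machine passes."""
    return 2 ** largest_passing_width

print(quantum_volume(6))  # a machine passing at width 6 reports QV 64
```

This is why published Quantum Volume numbers are always powers of two (32, 64, 128, and so on).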

Alongside Quantum Volume, IBM needed a benchmark for speed - how fast a system can solve a problem. IBM defines speed as circuit layer operations per second (CLOPS). Measuring speed is essential because fast, high-quality circuits allow complex problems to be solved in less time.

Once Quantum Volume benchmarking determines circuit quality, the same circuits can then be compiled and run on hardware to calculate CLOPS.
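My understanding of IBM's published CLOPS benchmark is that it executes a set of parameterized circuit templates, each with several parameter updates and shots, and divides the total layer operations by elapsed time; the constants below (100 templates, 10 updates, 100 shots) follow IBM's described setup but should be treated as assumptions, not figures from this article:

```python
# Hedged sketch of the CLOPS calculation as I understand IBM's benchmark:
# M circuit templates x K parameter updates x S shots x D layers, divided
# by total elapsed time. Constants are assumptions, not article figures.
import math

def clops(qv: int, elapsed_seconds: float,
          m: int = 100, k: int = 10, s: int = 100) -> float:
    d = int(math.log2(qv))              # circuit depth tied to Quantum Volume
    return (m * k * s * d) / elapsed_seconds

# e.g. a QV-64 system (depth 6) finishing the whole workload in 400 seconds:
print(round(clops(64, 400.0)))          # 1500
```

The tie between depth and Quantum Volume is the reason IBM presents the three metrics together: a CLOPS number is only meaningful for circuits of benchmark-grade quality.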



Through its extensive research, IBM has developed several ways to achieve increased circuit speed, such as:

·      Fast gates that translate into faster operations on qubits.

·      High-fidelity fast readouts to minimize the time required for qubit resets and reuse.

·      Advanced controller electronics that provide faster operations for preparing registers to run the next circuit, fast 10-20 microsecond qubit resets, and better overall control. In the lithium hydride simulation described in the next section, an advanced control system provided better readout and faster qubit reset performance, reducing the execution time of each batch of circuits from 1,000 microseconds to 70 microseconds.

·      Qiskit Runtime, which speeds up processing time by as much as 120x. Classical and quantum computing work together in an architecture that reduces latency by locating classical resources close to the quantum processor. Essentially, Qiskit Runtime functions as a containerized service for quantum computers. Running quantum programs in the Qiskit Runtime execution environment takes full advantage of the IBM hybrid cloud’s ability to handle most of the work. 
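Back-of-the-envelope arithmetic shows why co-locating the classical side matters for hybrid workloads that iterate millions of times (the latency figures below are my own illustrative assumptions, not IBM measurements):

```python
# Illustrative arithmetic (assumed latencies, not IBM figures) for why
# co-locating classical resources next to the quantum processor pays off:
# it removes a wide-area network round trip from every iteration.
iterations = 1_000_000
wan_round_trip_s = 0.05      # assumed 50 ms internet round trip per iteration
local_round_trip_s = 0.0005  # assumed 0.5 ms when co-located in the data center

wan_hours = iterations * wan_round_trip_s / 3600
local_hours = iterations * local_round_trip_s / 3600
print(round(wan_hours, 1), round(local_hours, 2))  # ~13.9 hours vs ~0.14 hours
```

Under these assumptions, latency alone accounts for a roughly 100x difference, which is broadly consistent with the up-to-120x speedup IBM cites for Qiskit Runtime.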

Why qubit gate speed and qubit quality are important


Simulation of large chemical molecules is a task we expect future quantum computers to perform. However, it will require millions of qubits running on a fault-tolerant quantum computer. A quantum computer of that size and capability is still many years away. 

Today, the limited power of quantum computers restricts simulations to small molecules. In the example above, IBM computed the binding energy of lithium hydride (LiH), a two-atom molecule. The simulation was run entirely on the cloud and used error mitigation to reduce the impact of noise. It also used Qiskit Runtime, which provided a significant speed advantage. 

The computation required running 4.8 billion quantum circuits passed back and forth between a classical computer and a quantum computer. The problem, defined as a quantum circuit, was evaluated by the quantum computer, then updated by the classical computer to find the optimal value, and then sent back to the quantum computer for another run. That process was repeated until a solution was found.
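The loop described above is a variational algorithm. The sketch below captures its shape in plain Python - all names are hypothetical, and the quantum evaluation is stubbed with a toy energy function, since the point is the classical-quantum feedback pattern, not LiH chemistry:

```python
# Minimal sketch (all names hypothetical) of the classical/quantum feedback
# loop described above: the classical optimizer proposes circuit parameters,
# the quantum processor evaluates the resulting energy, and the loop repeats
# until it converges on the optimal value.
def evaluate_on_qpu(theta: float) -> float:
    # Stand-in for running the parameterized circuit on quantum hardware
    return (theta - 1.3) ** 2 - 7.8   # toy "energy" landscape

def minimize(theta: float = 0.0, lr: float = 0.1, iters: int = 200) -> float:
    for _ in range(iters):
        # Finite-difference gradient costs two extra circuit evaluations -
        # one reason the real computation needed billions of circuit runs.
        grad = (evaluate_on_qpu(theta + 1e-3) - evaluate_on_qpu(theta - 1e-3)) / 2e-3
        theta -= lr * grad            # classical update step
    return theta

best = minimize()
print(round(evaluate_on_qpu(best), 3))   # converges to the minimum, -7.8
```

Every pass through the loop is a round trip between classical and quantum hardware, which is exactly why circuit repetition rate and reset time dominate the total runtime.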

In addition to improvements in the algorithm, hardware speed and quality played an essential role in reducing the number of iterations. Improved processor performance resulted in a 10x decrease in the number of repeated circuit runs required by each algorithm iteration. 

Quality and speed improvements in the control systems resulted in better readout and faster qubit reset performance, reducing the time to execute each batch of circuits from 1,000 microseconds to 70 microseconds.

Bottom line: the importance of speed in quantum computing becomes clear in this example. The need to run 4.8 billion quantum circuits means a high circuit repetition rate is critical. 

Considering just one of the factors in this experiment underscores why quantum speed is essential. Because resetting a qubit register took only microseconds, the experiment could be completed in a few hours. Had it taken milliseconds, obtaining the results would have taken almost a year. Rarely will any researcher begin an experiment whose results are a year away.
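The hours-to-a-year claim follows directly from the thousandfold gap between microseconds and milliseconds (the wall-clock figure below is an assumed round number for illustration):

```python
# Illustrative arithmetic (assumed wall-clock figure) for the reset-time
# claim above: a millisecond-scale reset is 1,000x slower than a
# microsecond-scale one, so hours of runtime become the better part of a year.
experiment_hours = 8          # assumed total runtime with microsecond resets
slowdown = 1_000              # milliseconds vs microseconds

slow_hours = experiment_hours * slowdown
print(round(slow_hours / 24))  # ~333 days - close to a year
```

A 1,000x factor turns any reasonable experiment into an unreasonable one, which is the practical argument for investing in fast resets.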

Wrapping up

Jay Gambetta is the IBM Fellow and Vice President, Quantum Computing. In a recent analyst conference, he said: “We anticipate that with Eagle our users will be able to explore uncharted computational territory and experience a key milestone on the path towards practical quantum computation." 

Gambetta further explained that IBM wants to begin focusing on the useful work qubits can do and start talking about performance. 

Personally, I’m looking forward to seeing papers written by researchers using IBM’s new 127-qubit processor. Over 700 papers have already been written using early versions of IBM Quantum systems. 

However, IBM is moving fast. According to the dates on its roadmap, the 433-qubit Osprey is only a year away. Then, 12 months later, we are in for a real treat when the 1,121-qubit Condor comes online.

The next two years should be very interesting.

Patrick founded the firm based on his real-world technology experiences and an understanding of what he wasn’t getting from analysts and consultants. Ten years later, Patrick is ranked #1 among technology industry analysts in terms of “power” (ARInsights) and “press citations” (Apollo Research). Moorhead is a contributor at Forbes and frequently appears on CNBC. He is a broad-based analyst covering a wide variety of topics including the cloud, enterprise SaaS, collaboration, client computing, and semiconductors. He has 30 years of experience including 15 years of executive experience at high tech companies (NCR, AT&T, Compaq, now HP, and AMD) leading strategy, product management, product marketing, and corporate marketing, including three industry board appointments.