Quantinuum Makes A Significant Quantum Computing Breakthrough By Connecting The Dots Of Its Previous Research

By Paul Smith-Goodson, Patrick Moorhead - August 21, 2022

To appreciate Quantinuum’s latest research, it helps to first understand why quantum error correction plays such a central role in quantum computing.

Solutions for world-class problems such as climate change, new pharmaceuticals, custom design of new materials, long-range electric vehicle batteries, and many other applications are beyond the computational ability of today's most powerful supercomputers. 

Quantum computing isn’t just a faster or larger type of computer. It is fundamentally a different type of computing technology, one anchored in the strange world of quantum mechanics. Quantum computing has the potential to solve enormous and complex problems quickly, but only if equipped with the vast numbers of qubits (quantum bits) necessary to do the job.

For example, a classical computer will likely never be able to break Bitcoin’s encryption key, not even if given the remaining lifetime of the universe to solve it. According to the University of Sussex in the UK, it would require a quantum computer with 13 million qubits working for about 24 hours to break Bitcoin’s key. Pumping up the qubit count to 300 million qubits would reduce the quantum computer’s solution time to about an hour or less.

For perspective, today’s quantum computers have small qubit counts, ranging from 50 to several hundred qubits with the likelihood of having several thousand qubits in a few years. As shown by the University of Sussex example, that’s still a tiny fraction of the number needed to do serious and useful computations. 

Can’t we just add lots of qubits to a quantum computer?

Physics and engineering considerations associated with qubit fidelity and error correction restrict the ad hoc addition of large numbers of qubits to a quantum computer. Quantum scientists have yet to develop a usable and scalable method to correct errors.

Classical computers rarely make errors, so it makes little difference if a few bits flip out of trillions of calculations. Qubits are different: unlike classical bits, which are strictly 1 or 0, qubits occupy quantum superposition states that are not exactly 1 or 0.
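To make the distinction concrete, here is a minimal NumPy sketch (an illustration only, not tied to any particular quantum SDK): a qubit’s state is a normalized two-component complex vector, and measuring it yields 0 or 1 with probabilities given by the squared amplitudes.

```python
import numpy as np

# A classical bit is exactly 0 or 1. A qubit |psi> = a|0> + b|1>
# is a normalized complex vector; measurement yields 0 or 1 with
# probabilities |a|^2 and |b|^2.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# An equal superposition of |0> and |1>
psi = (ket0 + ket1) / np.sqrt(2)

probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5] -- each measurement outcome is equally likely
```

Until measured, the qubit carries both amplitudes at once, which is precisely why small disturbances to those amplitudes become computational errors.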

Qubits are also highly susceptible to errors caused by environmental factors such as noise, cabling, and even neighboring qubits. Qubit errors can occur even under exposure to relatively weak galactic radiation. Moreover, a qubit’s quantum state deteriorates rapidly, requiring a quantum computer to complete all of its operations before the quantum states collapse. It is not an overstatement to say that every part of the quantum computing process is a potential source of qubit errors.

Quantum error correction (QEC) is complicated not only because of its quantum nature, but also because there are multiple types of qubit errors. Depending on the quantum technology and process, error rates can range from one error per hundred operations to one error per several thousand operations.
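The “multiple types of qubit errors” can be sketched with the two basic error channels (a hypothetical NumPy illustration, not Quantinuum’s method): a bit flip (Pauli X) swaps |0> and |1> much like a classical error, while a phase flip (Pauli Z) flips the sign between amplitudes and has no classical analogue at all.

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)   # bit-flip error
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # phase-flip error

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)  # |+> superposition state

# A bit flip leaves |+> unchanged, but a phase flip corrupts it to |->,
# even though the 0/1 measurement probabilities look identical.
print(np.allclose(X @ plus, plus))  # True
print(np.allclose(Z @ plus, plus))  # False -- a purely quantum error
```

A quantum code therefore has to detect and correct both families of error, which is a large part of why QEC is so much harder than classical error correction.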

Error correction matters because it is what will allow us to build large, fault-tolerant quantum computers that scale to hundreds of thousands of error-corrected qubits.
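The core idea behind error correction can be illustrated with the classical three-bit repetition code (a toy analogy only; real quantum codes such as the five-qubit code discussed below must also handle phase errors, and must do so without directly measuring the data qubits).

```python
import random

def encode(bit):
    """Protect one logical bit as three physical copies."""
    return [bit, bit, bit]

def apply_noise(bits, p):
    """Flip each physical bit independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    """Majority vote recovers the logical bit if at most one copy flipped."""
    return int(sum(bits) >= 2)

random.seed(0)
trials = 10_000
p = 0.05  # per-bit error probability

# Unprotected: each trial fails with probability p.
raw_errors = sum(random.random() < p for _ in range(trials))
# Protected: a logical error needs at least two of three flips (~3p^2).
coded_errors = sum(decode(apply_noise(encode(0), p)) != 0 for _ in range(trials))
print(raw_errors, coded_errors)  # encoding cuts the error rate sharply
```

The same principle, redundancy plus a clever decoding step, underlies quantum codes, with the added constraint that the syndrome must be extracted without collapsing the encoded superposition.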

Significance of Quantinuum’s fault-tolerant achievement

Quantinuum has published the first research paper to demonstrate a fault-tolerant end-to-end circuit with entangled logical qubits using real-time error correction. It is also the first time that two error-corrected logical qubits performed a circuit with higher fidelity than its constituent physical qubits.

What’s important is that Quantinuum’s fault-tolerant demonstration creates a new starting point that might enable future researchers to expand the number of qubits.

It is important to note that Quantinuum's QCCD (quantum charge-coupled device) architecture has made a significant contribution to the company’s ongoing research by allowing experimentation with geometries. The flexibility of QCCD zones allows qubits to be arbitrarily rearranged to accommodate codes with exotic geometries, including codes that are not confined to one or two dimensions, something not possible on quantum computers with fixed geometries. The QCCD design was first proposed by David Wineland’s group at NIST in a 1998 paper.

Although the initial QCCD architecture contained some unresolved technical issues, Tony Uttley, then president of Honeywell Quantum Systems, and the Honeywell team decided to develop the company’s next-generation quantum system using the QCCD architecture. Fully aware of the risks, Uttley decided the opportunity for greater rewards offset what he believed were manageable risks.

Considering Quantinuum’s technical achievements in 2022 and earlier, the decision to use QCCD architecture has proven to be correct.

Connecting the 2022 dots

The following list details advances made by Quantinuum that established foundational work for follow-on research in 2022 and beyond. 

  • March 3 – A world-record state preparation and measurement (SPAM) error using Barium-137 provided measured evidence of a near-term future with physical SPAM error rates in the 10⁻⁵ range. Improving SPAM fidelity helps reduce errors that accumulate in today’s “noisy” quantum machines, which is critical for moving to fault-tolerant systems that prevent errors from cascading through a system and corrupting circuits.
  • April 14 – Quantinuum’s sixth quantum volume record was measured at 4,096 (2¹²). The QCCD architecture allowed an increase in qubits with a corresponding increase in fidelity, which matters because maintaining fidelity as more qubits are added is what keeps a machine computationally useful. The Quantinuum H1-2 system used all 12 of its qubits for this milestone, signaling the likelihood that Quantinuum would soon add more qubits.
  • May 24 – InQuanto, a quantum computational chemistry software platform built for computational chemists, was released. The platform requires high-performance quantum hardware, such as Quantinuum’s H1 series, to run accurately.
  • June 14 – The H1-1 machine was upgraded from 12 to 20 qubits. The number of gate zones in the QCCD architecture was also increased from three to five, allowing more simultaneous operations and improved parallelization in circuit execution. Prior planning and work in 2021 set the stage for this upgrade.
  • July 11 – Junction transport with Barium and Ytterbium, a measured scaling method that takes advantage of 2-D grids, was demonstrated. It allows two different species of ions to move through a junction in a surface trap together as a pair. The technique will be incorporated into the future design of System Model H3 and is expected to aid scaling, provide faster computation, allow more qubits to be added, and reduce errors.
  • July 20 – A new phase of matter was realized in the H1-1, as described in the research paper “Dynamical topological phase realized in a trapped-ion quantum simulator” (peer-reviewed paper on 2021 work, published in Nature).
  • August 4 – A new quantum dynamics simulation technique was demonstrated, as described in the research paper “Holographic dynamics simulations with a trapped-ion quantum computer” (peer-reviewed paper on 2021 work, published in Nature Physics).
  • August 4 – This article is based on the research paper “Implementing Fault-tolerant Entangling Gates on the Five-Qubit code and the Color Code.” The work substantiates a future where real-time quantum error correction paves the way for a fault-tolerant regime.
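For readers unfamiliar with the quantum volume metric in the April 14 item above, the headline number is simply two raised to the size of the largest square random-circuit test the machine passes (a quick arithmetic sketch):

```python
# Quantum volume (QV) is reported as 2**n, where n is both the width
# (qubits) and depth (layers) of the largest square random-circuit
# test the machine runs successfully. Passing with all 12 qubits gives:
n = 12
quantum_volume = 2 ** n
print(quantum_volume)  # 4096
```

The exponential reporting convention means each added qubit that still passes the square-circuit test doubles the quoted quantum volume, which is why fidelity must keep pace with qubit count.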

The bottom line

Correcting errors in real-time is essential for the continued development of reliable large-scale quantum computing. Error correction is a high priority for almost every company in the quantum ecosystem; it is why a great deal of research is being conducted by a number of companies and universities. 

Quantinuum has made a small but very significant two-qubit step towards fault-tolerance. It has opened the door to a new and promising direction of research. 

Without fault tolerance, today’s quantum computing technology will not be capable of solving the important world-class computational problems we hoped it could solve. So the question is – can we do it? In my opinion, absolutely. 

Note: Moor Insights & Strategy writers and editors may have contributed to this article.

Paul Smith-Goodson

Paul Smith-Goodson is the Moor Insights & Strategy Vice President and Principal Analyst for quantum computing and artificial intelligence. His early interest in quantum began while working on a joint AT&T and Bell Labs project; during 360-degree overviews of Murray Hill advanced projects, Peter Shor gave an overview of his groundbreaking research in quantum error correction.

Patrick Moorhead

Patrick founded the firm based on his real-world technology experiences and the understanding of what he wasn’t getting from analysts and consultants. Ten years later, Patrick is ranked #1 among technology industry analysts in terms of “power” (ARInsights) and “press citations” (Apollo Research). Moorhead is a contributor at Forbes and frequently appears on CNBC. He is a broad-based analyst covering a wide variety of topics, including cloud, enterprise SaaS, collaboration, client computing, and semiconductors. He has 30 years of experience, including 15 years of executive experience at high-tech companies (NCR, AT&T, Compaq (now HP), and AMD) leading strategy, product management, product marketing, and corporate marketing, including three industry board appointments.