Atom Computing Announces Record-Breaking 1,225-Qubit Quantum Computer

By Paul Smith-Goodson, Patrick Moorhead - November 13, 2023

In January 2022, Atom Computing received $60 million in Series B funding. The company’s stated objective for the funds was to build a larger, second-generation optically trapped neutral atom quantum computer. Today, Atom Computing fulfilled that objective with its announcement of the 2024 release of a second-generation neutral atom quantum computer equipped with 1,225 qubits.

I had the opportunity to speak with Rob Hays, president and CEO of Atom Computing, about the new machine and the efforts that went into its development. The announcement is important for the entire quantum industry because Atom Computing will be the first company to release a universal, gate-based quantum computer with over a thousand qubits. It becomes even more significant considering that the company is a relatively new startup.

Beginnings Of Atom Computing

Atom Computing was founded five years ago by Benjamin Bloom, who has a Ph.D. in physics from the University of Colorado, and Jonathan King, who has a Ph.D. in chemical engineering from the University of California at Berkeley. After obtaining seed funding of $5 million, Bloom and King built the world’s first nuclear-spin qubit quantum computer created from optically trapped neutral atoms. Atom Computing’s first prototype, called Phoenix, used a 10×10 array of strontium-87 atoms to create 100 qubits.

The Phoenix machine was developed at Atom Computing’s headquarters in Berkeley. Since its inception, Atom Computing scientists have used Phoenix to advance the capabilities of neutral atom hardware and software, much of which has carried over into the company’s latest-generation computer.

Atom Computing’s commercial operations facility in Boulder, Colorado
Moor Insights & Strategy

Atom Computing’s next-generation 1,225-qubit machine was developed in its newest commercial operations facility in Boulder, Colorado. Patrick Moorhead, founder and chief analyst for Moor Insights & Strategy, and I had a chance to visit and tour the facility late last year during its grand opening.

Earlier this year, Atom Computing was chosen by the Defense Advanced Research Projects Agency (DARPA) to participate in a special program designed to find new methods to scale up qubits and develop a broader set of quantum error correction algorithms needed for fault tolerance. In addition to funding, the DARPA partnership provided Atom Computing with access to experts from the Defense Department, academia and national labs.

Scaling Challenges

I asked Rob Hays about major challenges faced by Atom Computing scientists while building the new machine. I wasn’t surprised when he said that scaling up the number of atoms and creating 1,225 individual traps were challenging.

“You need just the right amount of laser power to hold atoms in place and still be able to manipulate their states while simultaneously maintaining good fidelity,” he said. “The combination of doing all three things at the same time and doing them right is the real challenge.”

The number of qubits determines the computer’s power and the complexity of the algorithms it can handle. However, scaling is difficult because neutral atom qubits, like all qubits, can lose their quantum state due to factors such as stray laser light or magnetic fields, and increasing the number of qubits compounds these problems.

Hays added that the development team also solved a future energy issue while working on the current machine: the scientists achieved enough energy efficiency and precision control to scale the system well beyond what the new machine required.

Fault Tolerance

The long-term goal for quantum computing is to build a fault-tolerant quantum computer. Atom Computing’s initial 100-qubit Phoenix machine and its next-generation 1,225-qubit platform are important milestones in its roadmap to build a fault-tolerant gate-based machine. So far, the company continues to hit its goal of scaling qubits by an order of magnitude in each generation.

A great deal of technical progress has already been made by the quantum scientific community. However, there are still many known and unknown engineering and physics problems yet to be solved before the community can build a fault-tolerant quantum computer capable of running quintillions of circuit operations per second.

Atom Computing has already solved many of the difficult technical issues needed for fault tolerance. It holds the record for qubit coherence time at 40 seconds, which allows longer and more complex algorithms to be run. It was also the first neutral atom quantum company to develop mid-circuit measurement, an important capability for many quantum operations such as error correction and conditional logic. Atom Computing previously demonstrated the ability to measure the quantum state of specific qubits during computation and detect certain types of errors without disturbing other qubits.

Atom Computing is expected to release specific technical details about the new machine when the release date gets closer. It will be a new and interesting experience to see benchmark data for a 1,225-qubit quantum computer.

Swapping Types of Atoms

Average fluorescence of neutral ytterbium-171 atoms trapped in a 1,225-site optical array.
Atom Computing

Obviously, Atom Computing had to make many adjustments and improvements to existing features, along with introducing technical innovations, to scale up from a hundred-qubit machine to one with more than 1,200 qubits. I will cover these technical changes once the data becomes available and I can do a more comprehensive review of what was done and the subsequent benchmarking results.

However, there is one important change I want to discuss here. The initial 100-qubit Phoenix machine used strontium-87 atoms for its qubits, while the new 1,225-qubit quantum computer uses ytterbium-171 atoms. I’m glad to see the change because there are a number of very sound technical reasons why Atom Computing switched to ytterbium-171. Indeed, a recent study concluded that ytterbium-171 may be one of the best atoms of all for qubits.

The main reason for the change is that ytterbium-171 has a nuclear spin of 1/2, compared to strontium-87’s more complicated spin of 9/2. In plain English, that means ytterbium has only two quantum levels that can be accessed in its lowest state. Having only two levels makes ytterbium’s states easier to manipulate and easier to measure than strontium’s more complicated structure. The additional levels that come with a 9/2 spin require more control fields, which can create complications that make a strontium-based system more prone to errors.
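The level counts above follow from a standard quantum mechanics rule: a nucleus with spin I has 2I + 1 magnetic sublevels. A quick back-of-the-envelope check (my own illustration, not from Atom Computing) confirms the two-versus-ten comparison between the isotopes:

```python
from fractions import Fraction

def nuclear_spin_levels(spin: Fraction) -> int:
    """Number of nuclear-spin sublevels for a nucleus with spin I: 2I + 1."""
    return int(2 * spin + 1)

# Ytterbium-171, spin 1/2: exactly two levels -- a natural qubit.
print(nuclear_spin_levels(Fraction(1, 2)))   # 2

# Strontium-87, spin 9/2: ten levels to control and keep separated.
print(nuclear_spin_levels(Fraction(9, 2)))   # 10
```

With only two sublevels, every bit of laser control applied to ytterbium-171 addresses the qubit states directly, whereas strontium-87’s ten sublevels leave eight extra states where population can leak.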

Wrapping Up

Atom Computing’s 100-qubit Phoenix prototype and its next-generation 1,225-qubit platform are important stepping stones in the roadmap that leads to a million-qubit fault-tolerant gate-based machine.

While fault tolerance remains a distant target, there are research signals and commercial results showing that quantum computing is getting close to becoming practical for real-world tasks. Depending on how well Atom Computing’s next-generation processor performs, 1,225 qubits should produce some very useful and very interesting research en route to that goal.


Paul Smith-Goodson is the Moor Insights & Strategy Vice President and Principal Analyst for quantum computing and artificial intelligence. His early interest in quantum began while he was working on a joint AT&T and Bell Labs project; during 360-degree overviews of Murray Hill advanced projects, Peter Shor provided an overview of his groundbreaking research in quantum error correction.


Patrick founded the firm based on his real-world technology experience and an understanding of what he wasn’t getting from analysts and consultants. Ten years later, Patrick is ranked #1 among technology industry analysts in terms of “power” (ARInsights) and “press citations” (Apollo Research). Moorhead is a contributor at Forbes and frequently appears on CNBC. He is a broad-based analyst covering a wide variety of topics including the cloud, enterprise SaaS, collaboration, client computing, and semiconductors. He has 30 years of experience, including 15 years of executive experience at high-tech companies (NCR, AT&T, Compaq (now HP), and AMD) leading strategy, product management, product marketing, and corporate marketing, including three industry board appointments.