Despite what most people think, quantum mechanics is not a new kid on the technology block. It is a concept almost 120 years old.
Max Planck, a German theoretical physicist, first introduced quantum theory in 1900, an innovation that won him the Nobel Prize in Physics in 1918. Then in 1959, a radical-thinking American theoretical physicist named Richard Feynman planted the seeds of quantum computing. He suggested using quantum mechanics to build a new type of computer.
Since then, despite its complexity, we have made good progress in developing small working quantum computers and a limited menu of quantum algorithms.
Even so, Robin Blume-Kohout and Kevin Young, scientists from Sandia National Laboratories, look at our quantum computing progress from a different perspective. They believe we are at the same stage that classical computing was in during the late 1930s.
IBM calls this the Quantum Ready stage. It suggests it is time to get off the corporate couch and begin preparing our systems, people, and resources for the era of quantum computing. By all accounts, full-blown quantum computing won’t be a small wave of change; it will be a technological tsunami.
Most researchers agree that quantum computing is still in the experimental stage. The truth is, a regular computer can do anything today’s quantum computers can do.
However, stay tuned, we have reason to believe that might change very soon.
Quantum supremacy is a techie buzzword. Until now, it’s been an impossible benchmark for quantum researchers to meet. It describes the ability of quantum computers to solve problems that classical computers can’t touch. You can look at quantum supremacy as the Mount Everest of quantum computing, except this mountain hasn’t been climbed yet.
Despite the skeptics, Google has hinted that its gate-based, 72-qubit quantum processor, called Bristlecone, will achieve quantum supremacy sometime this year. Bristlecone is a scaled-up version of its nine-qubit older brother. Scaling up qubits generally increases system noise and errors, but Google has done an excellent job of keeping quantum errors in check with Bristlecone.
The world’s first demonstration of quantum supremacy will be a significant milestone, not only for quantum computing but for the entire scientific community as well.
Breaking this barrier will be just as significant as the spectacular 1997 victory of IBM’s Deep Blue over the world chess champion, Garry Kasparov. That humiliating defeat was more than a computer beating a man. It signaled the supremacy of computer logic over expert human thinking.
If our expectations hold, Moor Insights & Strategy believes that fully mature quantum computing will be the most disruptive technology since the invention of the microprocessor. Here are a few areas where quantum computing will have a significant impact:
- New chemicals, drugs, and materials can be modeled, modified, and designed with custom properties to develop new pharmaceutical, commercial, and business products.
- Today we use supercomputers for a variety of optimization problems, such as Monte Carlo simulations, energy applications, and bond pricing. Quantum computers will allow for more robust simulations on a much larger scale, providing more in-depth insight, higher efficiency, and better forecasting.
- The combination of quantum computing and artificial intelligence is almost a scary thought. Artificial intelligence might become orders of magnitude smarter than it is today.
It will take another three to five years to develop a mid-scale quantum computer. Dr. Jeffrey Welser, Vice President of IBM Research, gave the keynote address at SEMICON West. He said it would take another 10 to 15 years until we realize the real benefits of quantum computing.
Despite projected long-term development, large companies that rely on sophisticated financial or scientific models of their products or industry shouldn’t wait to get into the quantum game.
It’s not half-time yet, but the whistle has blown, and the quantum game has started.
Over the past year, there has been a lot of media hype about quantum computers. Most of it has been spot on, but some of it exaggerated. Despite a few erroneous articles, people have learned that quantum computers are faster and have more processing power than classical computers. However, the real value of quantum computing lies in the ability to solve complex problems that are too difficult, or even impossible, for traditional computers to solve.
As fantastic as it sounds, problems that would require a staggering amount of time for traditional computers to solve – billions and billions of years – will only take a few seconds for quantum computers.
This jaw-dropping power is possible because quantum machines are entirely different from classical computers. They rely on bizarre properties that exist only in the realm of quantum mechanics and give them exponentially more storage and processing capability.
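The "exponentially more storage" claim has a concrete back-of-the-envelope form: fully describing n qubits classically takes 2^n complex amplitudes, so every added qubit doubles the size of the description. A quick illustration (the 16-bytes-per-amplitude figure assumes two 64-bit floats per complex number):

```python
# Classically simulating n qubits requires tracking 2**n complex amplitudes.
for n in (10, 20, 30, 40):
    amplitudes = 2 ** n
    # Assume 16 bytes per amplitude (two 64-bit floats: real + imaginary).
    gib = amplitudes * 16 / 2 ** 30
    print(f"{n} qubits -> {amplitudes:,} amplitudes (~{gib:,.0f} GiB)")
```

At around 30 qubits the state vector already needs 16 GiB of memory, and at 50 qubits it would exceed any classical machine on Earth, which is why even small quantum processors are hard to simulate.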
Just as everyone has a different opinion, every company seems to have a different idea about which technology is best for building quantum computers. To the average person, they all sound like science fiction gone wild. Eye-rollers like trapped ions, superconductors, and particles of light called photons are at the heart of quantum computing’s ability to do lightning-fast calculations and ingest massive amounts of data.
Stranger Things in the real world
Classical physics describes how big things behave and interact in our physical world. Quantum theory is all about the extraordinary and inexplicable interaction of small particles on the invisible scale of atoms, electrons, and photons.
If you lived in the tiny world of particles, you would probably describe it as schizophrenic.
Particles disappear and then reappear; they exert control over their distant, unconnected sibling particles; and they act like a precision drill team when assembled as a group. Some scientists even believe that quantum particles not only time travel but move in and out of other universes as well.
Wacky-but-true quantum concepts
- Wave-particle duality: A quantum particle can behave like both a particle and a wave. Once you measure it, the wave function collapses, and it looks like a particle.
- Superpositioning: Quantum systems called qubits can exist in multiple states simultaneously. A qubit can be a one, or it can be a zero. However, it can also be a weighted combination of one and zero at the same time. If an operation is applied to a qubit while it’s in a superposition state, it affects both states simultaneously.
- Entanglement: Entangled qubits act similarly. If two qubits are entangled, then the state of each qubit is dependent on the other qubit. Theoretically, you can entangle several thousand qubits (we don’t have the technology to do that many yet). If you did an operation on one of the thousand, you could immediately determine the state of all the qubits. Even if you separate entangled particles from each other, they can remain entangled. China recently set the record by creating entanglement between particles on a satellite and particles on Earth.
- Measurement: By merely measuring a quantum particle, you can change it, causing the wave function to collapse.
- Coherence: The amount of time that a qubit maintains its state of entanglement or superpositioning is called coherence. The longer the coherence time, the better, because it allows more computations to take place.
- Teleportation: I know this sounds like beam-me-up-Scotty stuff, but it’s not. It’s an essential part of quantum mechanics. You need a pair of entangled particles to do it. It merely consists of copying the state of one particle to the other entangled particle, then destroying the original state. Teleportation is possible over long distances.
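The superposition, entanglement, and measurement concepts above lend themselves to a small numerical sketch. In the standard state-vector picture, a qubit is just a pair of complex amplitudes, a gate is a rule for mixing them, and measurement probabilities are the squared magnitudes. A minimal sketch in plain Python (a classical toy illustration of the math, not how physical quantum hardware works):

```python
import math

# A qubit is a pair of amplitudes (a, b) for the states |0> and |1>.
zero = (1.0, 0.0)  # definitely |0>

def hadamard(q):
    """The Hadamard gate turns |0> into an equal superposition of |0> and |1>."""
    a, b = q
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

superposed = hadamard(zero)
# Measurement: probabilities are squared magnitudes -> 50/50 for |0> and |1>.
probs = (abs(superposed[0]) ** 2, abs(superposed[1]) ** 2)

# Entanglement: a two-qubit system has four amplitudes for |00>,|01>,|10>,|11>.
# Start in |00>, apply Hadamard to the first qubit, then a CNOT gate.
s = 1 / math.sqrt(2)
state = [1.0, 0.0, 0.0, 0.0]                      # |00>
state = [s * (state[0] + state[2]),               # Hadamard on first qubit
         s * (state[1] + state[3]),
         s * (state[0] - state[2]),
         s * (state[1] - state[3])]
state = [state[0], state[1], state[3], state[2]]  # CNOT: swaps |10> and |11>
# Result: the Bell state (|00> + |11>)/sqrt(2). Only |00> and |11> have any
# amplitude, so measuring one qubit instantly fixes what the other must be.
```

The final `state` is exactly the correlation described under "Entanglement": the outcomes 01 and 10 have zero probability, so the two qubits can never disagree.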
We’re the same but different
Gate-based quantum computers and quantum annealers are the cats and dogs of quantum computers. Both technologies use superconducting architectures and require cooling processors to near absolute zero temperatures. Despite their similarities, the two types are wildly different.
Gate-based quantum computers
These are the computers that will take us into the future. They are designed to solve general problems and perform simulations. Their qubits interact with each other during computation, rather than acting alone as quantum annealers do. As the name implies, a gate-based machine uses quantum gates and can run arbitrary quantum algorithms. They are susceptible to noise and need error correction to perform correctly. Error correction has a very high overhead in terms of space, hardware, and logic.
From a technical standpoint, gate-based computers are very complex machines. However, the theory behind gate-based quantum computers is very mature, which means we know what needs to be done; we just don’t know how to do it yet.
The primary quantum computer companies are Google, IBM, Intel, IonQ, Microsoft, and Rigetti. They all have gate-based quantum processors but use different qubit technologies as shown above.
While gate-based computers are designed to solve most problems, quantum annealers are special-purpose computers that work best for specific types of optimization and sampling problems.
Most of us studied global minimums, local minimums, and optimization problems in high school algebra. If you can remember that far back, then you already have an idea of how quantum annealers work.
D-Wave is the dominant vendor in the quantum annealing space. Its superconducting processor has 2048 qubits. To solve a problem, it explores the landscape of possible solutions to find the lowest-energy configuration. Unlike a gate-based quantum computer, the D-Wave machine cannot manipulate individual qubits during computation, which is one reason it can scale to 2048 qubits. It doesn’t use error correction and runs only one quantum algorithm.
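Those algebra-class minima have a classical cousin that conveys the annealing intuition: simulated annealing, which wanders a bumpy landscape and is allowed to climb "uphill" early on so it doesn't get stuck in a local minimum. A toy sketch (an illustrative classical analogue only, not D-Wave's actual quantum process; the landscape and parameters are made up):

```python
import math
import random

# Toy energy landscape: a shallow local minimum near x = 2.1 and a deeper,
# global minimum near x = -2.35.
def energy(x):
    return 0.1 * x ** 4 - x ** 2 + 0.5 * x

random.seed(7)
x = 2.0                            # deliberately start near the *local* minimum
best_x, best_e = x, energy(x)      # best solution seen so far
temperature = 5.0
while temperature > 1e-3:
    candidate = x + random.uniform(-0.5, 0.5)
    delta = energy(candidate) - energy(x)
    # Always accept downhill moves; accept uphill moves with a
    # temperature-dependent probability, so the search can escape
    # local minima while the "temperature" is still high.
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        x = candidate
        if energy(x) < best_e:
            best_x, best_e = x, energy(x)
    temperature *= 0.999           # cool slowly
# With slow cooling, best_x typically settles near the global minimum.
```

A quantum annealer pursues the same goal, the lowest-energy configuration, but lets quantum effects such as tunneling carry the system through barriers rather than over them.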
Down the road
The end game for quantum computing is a fully functional, universal, fault-tolerant gate-based computer. To fulfill its promise, it needs thousands, maybe even millions, of qubits that can run arbitrary quantum algorithms and solve extremely complex problems and simulations.
Before we can build a quantum machine like that, we have a lot of development work to do. In general terms, we need:
- Advanced algorithms necessary to support extraordinarily complex problems
- Quantum processors with thousands of qubits
- Sophisticated error correction without a football field full of hardware
- More reliable qubits with longer coherence time and less sensitivity to noise
- A better way to minimize noise
Don’t look for a quantum machine like this in five or ten years. It is at least two or three decades away.