24 Jan IBM Doubles Its Quantum Computing Power Again

Quantum Volume growth chart. IBM

IBM announced at CES 2020 that its newest 28-qubit quantum computer, Raleigh, achieved the company’s goal of doubling its Quantum Volume (IBM names its systems after cities). Raleigh reached a Quantum Volume of 32 this year, up from 16 last year. Raleigh draws on the improved hexagonal-lattice connectivity structure developed for IBM’s 53-qubit quantum computer and features improved coherence. According to IBM, the lattice connectivity helped reduce gate errors and exposure to crosstalk.

IBM has doubled its systems’ Quantum Volume every year since 2017, when it first demonstrated a Quantum Volume of 4 with its five-qubit computer called Tenerife. In 2018, the 20-qubit Tokyo obtained a Quantum Volume of 8, and last year the 20-qubit IBM Q System One, called Johannesburg, achieved a Quantum Volume of 16. Increasing the Quantum Volume each year is an important goal, both for IBM and for the quantum computing industry in general. Last year I wrote a more detailed description of Quantum Volume.

The higher the Quantum Volume, the more real-world, complex problems quantum computers can potentially solve, such as those explored by IBM’s quantum network organizations.

Quantum Volume is a full-system quantum computer performance metric developed by IBM researchers in 2017. It produces a numerical value, such as the newly announced QV of 32. Interpreting Quantum Volume is simple—the larger the number, the more powerful the quantum computer.
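For readers who want the arithmetic behind the headline number: Quantum Volume is always a power of two, 2^n, where n is the width and depth of the largest "square" random circuit the machine can run successfully. A minimal sketch of that interpretation (the helper name is mine, not IBM's):

```python
import math

def qv_to_circuit_size(quantum_volume: int) -> int:
    """Return the square-circuit size n implied by a Quantum Volume of 2**n."""
    n = int(math.log2(quantum_volume))
    if 2 ** n != quantum_volume:
        raise ValueError("Quantum Volume is always a power of two")
    return n

# Raleigh's QV of 32 corresponds to 5-qubit circuits of depth 5,
# up from the 4x4 circuits implied by last year's QV of 16.
print(qv_to_circuit_size(32))  # 5
print(qv_to_circuit_size(16))  # 4
```

So doubling the Quantum Volume means the machine can reliably run square circuits one qubit wider and one layer deeper than before.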

You can think of the Quantum Volume number as a grade given by an expert consultant who has evaluated the significant issues affecting a quantum computer’s power and its ability to perform complex computations.

Quantum volume. IBM

Quantum Volume considers such technical factors as how long quantum bits (qubits) can maintain their quantum state, errors made during hardware calibration, crosstalk, spectator errors, gate fidelity and other fidelity measurements. It also considers the number of qubits and their connectivity, as well as circuit software compiler efficiency.
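These factors all enter through a single benchmarking protocol: random square circuits of increasing size are run on the machine, a size n "passes" when the measured heavy-output probability clears IBM's 2/3 threshold with sufficient statistical confidence, and Quantum Volume is 2^n for the largest passing size. A hedged sketch of that final bookkeeping step, assuming the per-size heavy-output probabilities have already been measured (the figures below are made up for illustration):

```python
HEAVY_OUTPUT_THRESHOLD = 2 / 3  # pass mark from IBM's Quantum Volume protocol

def quantum_volume(heavy_output_prob_by_size: dict) -> int:
    """Return 2**n for the largest square-circuit size n whose measured
    heavy-output probability clears the 2/3 threshold (per-size statistical
    confidence is assumed to have been established already)."""
    passing = [n for n, p in heavy_output_prob_by_size.items()
               if p > HEAVY_OUTPUT_THRESHOLD]
    return 2 ** max(passing) if passing else 1

# Hypothetical measurements for circuit sizes 2 through 6.
measured = {2: 0.81, 3: 0.78, 4: 0.74, 5: 0.69, 6: 0.61}
print(quantum_volume(measured))  # 32
```

The hard experimental work is in producing those probabilities; the sketch only shows how the final number is read off them.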

How did IBM manage to double the Quantum Volume? Jay Gambetta and Jerry Chow, IBM Q researchers, said, “To hit our latest Quantum Volume milestone, we had to combine elements of learning which we developed along the generational development threads, together with new ideas from research. From the generations of Penguin and the first of Hummingbird we zeroed in on a sparse lattice arrangement to reduce qubit frequency collisions and too many spectator errors. From the research side, last year, we showed that we had made advances in single-qubit coherence, pushing greater than 10 million quality factor on isolated devices.”

A big plus for Quantum Volume is that it is architecture-independent. That means any quantum circuit-based computer can use it. Likely, most circuit-based quantum computer companies have already run Quantum Volume on their machines. Some, like Rigetti, have reported results, while many have remained silent for various reasons.

Quantum Volume: a tool to reach Quantum Advantage

Quantum Volume is an essential measurement for several reasons. The most obvious is that it represents the relative power of a quantum computer. If you are not a physicist or quantum researcher, it’s hard to gauge the relative power of different quantum computer technologies and configurations. The single-number aspect of Quantum Volume allows anyone to make an easy comparison between gate-based quantum computers.

Quantum Volume can also play a significant role in the ongoing research and development needed to build the bigger and better quantum computers required for Quantum Advantage. According to Bob Sutor, Vice President, IBM Q Strategy & Ecosystem, doubling the Quantum Volume every year is necessary to stay on the path to reaching Quantum Advantage sometime during the next decade.

Quantum Advantage expectations

Quantum Advantage is the point at which quantum computers demonstrate a significant advantage in speed and computational space over classical computers. In other words, it will arrive when quantum computers can solve substantial, relevant problems that would take classical computers too long to solve, if they could solve them at all.

Google’s Quantum Supremacy was an important quantum computing event. However, it was a singular achievement accomplished by a single company, without an immediate impact on industry applications or society. Quantum Advantage, on the other hand, is even more important than Quantum Supremacy. Several technical obstacles remain to be solved before Quantum Advantage arrives. As demonstrated by IBM’s steady progress, Quantum Volume will help us get there.

Unlike Quantum Supremacy, Quantum Advantage, I believe, will begin with multiple companies announcing breakthroughs for different applications, perhaps in finance, simulation or medicine. Then, like a light sprinkle before a downpour, more and more companies will start successfully running previously untouchable, complex algorithms in a hybrid environment of classical and quantum computing.

One thing is almost inevitable—the arrival of Quantum Advantage will signal that a new era of quantum computing has begun, and it will have a significant impact on many facets of our lives.