Quantum Volume: A New Quantum Metric?

Over the past several years, the benchmark for determining whether quantum computing is a viable commercial technology has been quantum supremacy. This term was coined by Caltech professor and quantum computing guru John Preskill to describe the demonstration of a quantum computer carrying out a task that is not possible or practical with a traditional computer. More recently, the term quantum advantage has been applied to the related idea of the point at which quantum computers surpass classical computers at commercially useful tasks, such as artificial intelligence (AI) algorithms.

To achieve that level of performance, significant attention has been paid to the number of qubits – the quantum bits that are the basic units of information in a quantum computer – that can be effectively used to solve problems. Though estimates vary as to the exact number of qubits needed to achieve quantum advantage, IBM researchers have introduced a more comprehensive performance metric that accounts for a variety of elements within a quantum computer.

This metric, known as quantum volume, considers several aspects of a quantum computer’s performance:

  • Number of qubits
  • Connectivity performance
  • Gate set performance
  • Whole-algorithm errors
  • Compilers and software stack performance

IBM’s Quantum Volume Measurement

Quantum volume was originally introduced by IBM roughly two years ago, and the company is now highlighting it alongside recent performance results from its test quantum devices. It is designed to assess the performance of a quantum computer based on how well it can run a complex algorithm, which is ultimately the type of problem quantum computers will be asked to solve in the real world. The idea is to consider not just the number of qubits a system may contain, but also how many operations can be performed before an unacceptable number of errors occurs or the qubits experience decoherence – the point at which qubits fall out of their quantum state and are no longer suitable for use in calculations.

These two factors – the number of qubits used in a quantum computer, and the number of operations that can be performed before decoherence – are treated as the width and depth of a quantum circuit. The greater the depth of a circuit, the more steps the quantum computer can carry out, allowing it to run more complex algorithms than a shallower circuit. To assess the depth of a quantum circuit, the other aspects listed above – level of connectivity, gate performance, coherence time, error rates, and the performance of compilers and the software stack – must be measured as well. Doing so provides a more complete picture of a quantum circuit’s actual performance, which is then expressed as a single numerical value, the quantum volume, that can be compared against other quantum systems.
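
As a rough illustration of how width and depth collapse into one number, the sketch below follows the spirit of IBM’s published protocol: random square circuits (width equal to depth) are run at increasing sizes, a size passes when its measured “heavy-output” probability exceeds two-thirds, and the quantum volume is 2 raised to the largest passing size. The measurement numbers and function name here are hypothetical, and the real test also requires statistical confidence in the threshold:

```python
def quantum_volume(heavy_output_probs, threshold=2 / 3):
    """Condense width-and-depth test results into a single number.

    heavy_output_probs maps a circuit size m (width == depth) to the
    measured heavy-output probability for random model circuits of
    that size. A size passes when that probability exceeds 2/3, and
    the quantum volume is 2**m for the largest passing size.
    """
    passing = [m for m, p in heavy_output_probs.items() if p > threshold]
    return 2 ** max(passing) if passing else 1  # 1 means no size passed

# Hypothetical measurements for a small device at sizes 2 through 5.
measured = {2: 0.81, 3: 0.74, 4: 0.69, 5: 0.61}
print(quantum_volume(measured))  # 16: size 4 passes, size 5 does not
```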

IBM validated this approach by testing it across three of its quantum machines: the 5-qubit system it made publicly available over the cloud in 2017, the 20-qubit system it released in 2018, and the Q System One it released earlier in 2019. The result was that quantum volume doubled each year, from 4 to 8 to 16. According to IBM, this pattern largely mirrors Moore’s Law, popularly summarized as classical computing power doubling every 18 months or so, which has governed the improvement curve of classical computer performance for half a century.
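
The doubling trend is easiest to see on a log scale: because quantum volume is 2 raised to the size of the largest reliable square circuit, doubling the volume each year corresponds to adding one qubit and one layer of depth to that circuit. The short sketch below works through the reported numbers; the closing extrapolation is purely illustrative, not an IBM projection:

```python
import math

# IBM's reported quantum volumes for its 2017, 2018, and 2019 systems.
volumes = {2017: 4, 2018: 8, 2019: 16}

for year, qv in volumes.items():
    m = int(math.log2(qv))  # largest reliable m x m (width x depth) circuit
    print(f"{year}: quantum volume {qv} -> {m} qubits x {m} layers")

# If the yearly doubling held, 2025 would see 2**(4 + 6) = 1024.
print(2 ** (4 + (2025 - 2019)))
```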

It’s also important to note that quantum volume is, at its heart, an IBM metric, and is therefore likely tuned to favor the company’s technology architecture. That said, IBM is actively trying to get other quantum computing researchers and companies to support the metric as well. Such support would, in theory, make it easier to compare quantum systems from different vendors, since all the performance factors are consolidated into a single number.

Perhaps most importantly, IBM’s promotion of quantum volume focuses on the development and adoption of a performance metric that describes a quantum computer’s ability to perform a machine learning technique known as feature mapping. Feature mapping embeds input data into a higher-dimensional space where patterns become easier to separate, and a quantum computer can in principle map data into a state space too large for even the most powerful classical computers to represent.
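
To give a sense of what feature mapping means in practice, the sketch below classically simulates the simplest flavor, angle encoding, in which each feature becomes a single-qubit rotation and the similarity between two data points is the overlap of their encoded states. The function names and the encoding choice are illustrative assumptions, not IBM’s specific circuits:

```python
import numpy as np

def feature_state(x):
    """Encode a classical feature vector as a quantum state: each
    feature sets the rotation angle of one qubit, and the qubits are
    combined with a tensor product."""
    state = np.array([1.0])
    for angle in x:
        qubit = np.array([np.cos(angle / 2), np.sin(angle / 2)])
        state = np.kron(state, qubit)  # state space doubles per qubit
    return state

def quantum_kernel(x, y):
    """Similarity of two data points as the squared overlap of their
    encoded states; a classical algorithm such as an SVM can then
    classify using this kernel."""
    return abs(np.dot(feature_state(x), feature_state(y))) ** 2

print(quantum_kernel([0.1, 0.5], [0.2, 0.4]))  # close to 1: similar points
```

Note that this product-state encoding is easy to simulate classically, as the script itself demonstrates; the advantage IBM is pursuing rests on entangling feature maps whose overlaps are believed to be intractable for classical machines.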

Significant Potential but Uncertain Timeline

On a recent analyst call to discuss quantum volume, IBM researchers said the company’s quantum machines have been able to classify data using short-depth circuits. As quantum volume grows, reflecting wider and deeper circuits, there is significant potential to deploy machine learning applications in the near future, as well as a path toward quantum advantage for machine learning over the longer term.

However, IBM was less specific as to when its quantum systems, or any quantum systems, might actually achieve quantum advantage. Indeed, significant work remains to be done to increase quantum circuit depth, including reducing noise and errors and lengthening qubit coherence times, if quantum advantage is to be reached within the next 5 to 10 years.
