
Validating quantum computers using randomized model circuits

Published 30 Nov 2018 in quant-ph (arXiv:1811.12926v2)

Abstract: We introduce a single-number metric, quantum volume, that can be measured using a concrete protocol on near-term quantum computers of modest size ($n\lesssim 50$), and measure it on several state-of-the-art transmon devices, finding values as high as 16. The quantum volume is linked to system error rates, and is empirically reduced by uncontrolled interactions within the system. It quantifies the largest random circuit of equal width and depth that the computer successfully implements. Quantum computing systems with high-fidelity operations, high connectivity, large calibrated gate sets, and circuit rewriting toolchains are expected to have higher quantum volumes. The quantum volume is a pragmatic way to measure and compare progress toward improved system-wide gate error rates for near-term quantum computation and error-correction experiments.


Summary

  • The paper introduces quantum volume as a single-number metric that evaluates system-wide quantum performance using randomized model circuits.
  • It applies heavy output sampling to assess fidelity, highlighting the impact of two-qubit gate quality and connectivity on overall device reliability.
  • Empirical results from IBM processors demonstrate that compiler optimizations and enhanced gate fidelity significantly boost quantum volume measurements.

Quantum Volume: An Empirical Metric for Assessing Quantum Computers

The paper "Validating quantum computers using randomized model circuits" pioneers a metric called quantum volume (QV) designed to quantify the computational capacity of quantum processors. This metric is independent of architecture, making it a versatile tool for comparing quantum devices with distinct underlying technologies. In this essay, the core contributions of this paper are examined, offering insights into its methodologies, findings, and potential implications for the field of quantum computing.

Definition and Protocol of Quantum Volume

Quantum volume is a single-number metric that measures a quantum computer's ability to implement high-fidelity operations across its entire system. Unlike metrics that focus on isolated components, it captures system-wide behavior, accounting for gate errors, connectivity, and parallelism. The measurement protocol executes model circuits, randomly generated "square" circuits whose depth equals their width m, on the device and checks how faithfully their output statistics are reproduced, in the spirit of whole-system classical benchmarks such as LINPACK. The quantum volume is then 2^m for the largest width m at which the device passes the test.
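The construction of a model circuit can be sketched in plain numpy. This is an illustrative simulation under the paper's stated recipe (each of m layers applies a random qubit permutation followed by Haar-random two-qubit unitaries on adjacent pairs), not the authors' actual code; the function names are mine.

```python
import numpy as np

def random_su4(rng):
    """Draw a Haar-random 4x4 unitary (a generic two-qubit gate) via QR."""
    z = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
    q, r = np.linalg.qr(z)
    d = np.diag(r)
    # Fix the column phases so the distribution is Haar-uniform.
    return q @ np.diag(d / np.abs(d))

def model_circuit(m, rng):
    """A width-m, depth-m model circuit: each layer is a random permutation
    of the m qubits followed by random SU(4) gates on floor(m/2) pairs."""
    layers = []
    for _ in range(m):
        perm = rng.permutation(m)
        gates = [random_su4(rng) for _ in range(m // 2)]
        layers.append((perm, gates))
    return layers

def simulate(layers, m):
    """Ideal statevector simulation of a model circuit on m qubits,
    starting from |0...0>."""
    psi = np.zeros(2 ** m, dtype=complex)
    psi[0] = 1.0
    psi = psi.reshape([2] * m)
    for perm, gates in layers:
        psi = np.transpose(psi, perm)  # relabel qubits per the permutation
        for k, g in enumerate(gates):
            # Apply the k-th two-qubit gate to the qubit pair (2k, 2k+1).
            psi = np.moveaxis(psi, (2 * k, 2 * k + 1), (0, 1))
            psi = np.tensordot(g.reshape(2, 2, 2, 2), psi, axes=([2, 3], [0, 1]))
            psi = np.moveaxis(psi, (0, 1), (2 * k, 2 * k + 1))
    return psi.reshape(-1)
```

Squaring the circuit (depth = width) is what makes the benchmark probe gate fidelity, connectivity, and compilation at once: on a sparsely connected device, the random permutations force SWAP overhead that directly degrades the score.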

Success is judged by heavy output generation. For each model circuit, the ideal output probabilities are computed classically, and the "heavy" outputs are defined as the bitstrings whose ideal probability exceeds the median of that distribution. The device passes at a given width if the fraction of measured outcomes that are heavy exceeds 2/3 with high statistical confidence. Because the test is phrased entirely in terms of circuit outputs, it applies to any architecture with a universal gate set, without customization for specific systems.
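The heavy-output test can be sketched as follows. The helper names are hypothetical, and measured results are assumed to arrive as a dictionary mapping basis-state index to shot count:

```python
import numpy as np

def heavy_set(ideal_probs):
    """Heavy outputs: basis states whose ideal probability exceeds the median."""
    med = np.median(ideal_probs)
    return set(np.flatnonzero(ideal_probs > med))

def heavy_output_probability(ideal_probs, counts):
    """Fraction of measured shots that land in the heavy set of the ideal
    distribution. A given width is passed when this fraction exceeds 2/3
    with sufficient statistical confidence."""
    heavy = heavy_set(ideal_probs)
    shots = sum(counts.values())
    return sum(c for outcome, c in counts.items() if outcome in heavy) / shots
```

For large ideal random circuits the heavy-output probability approaches (1 + ln 2)/2 ≈ 0.85, while a completely depolarized device achieves only 1/2; the 2/3 pass threshold sits between these two extremes.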

Empirical Analysis and Findings

The research employs IBM’s quantum processors—specifically, Tenerife, Melbourne, Tokyo, and Johannesburg—to empirically evaluate quantum volume. Through extensive experiments, a quantum volume as high as 16 is reported, contingent on advances in two-qubit gate fidelity and overall device reliability. The empirical findings reveal that higher quantum volume correlates strongly with enhanced two-qubit operations, connectivity, and coherence times.

One of the key insights is that simple noise models, such as depolarizing error channels, closely approximate the real-world performance of these devices, reinforcing the utility of quantum volume as a robust benchmarking tool. Moreover, compiler optimizations, specifically those that recognize circuit structure and minimize two-qubit gate counts, yield tangible improvements in quantum volume, as demonstrated on the 20-qubit Johannesburg device.
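The depolarizing approximation admits a simple back-of-envelope model. This is an illustration under a simplifying assumption, not the paper's full analysis: if the circuit output is a mixture of the ideal distribution (with weight equal to the circuit fidelity) and the uniform distribution, the heavy-output probability interpolates linearly between the ideal value and 1/2.

```python
import math

def depolarized_heavy_prob(h_ideal, circuit_fidelity):
    """Heavy-output probability when the output distribution is the mixture
    fidelity * ideal + (1 - fidelity) * uniform. The uniform part lands in
    the heavy set about half the time, since heavy outputs are defined by
    the median of the ideal distribution."""
    return circuit_fidelity * h_ideal + (1.0 - circuit_fidelity) * 0.5

# Under this toy model, the circuit fidelity needed to clear the 2/3
# threshold, given the asymptotic ideal value (1 + ln 2)/2:
h_ideal = (1.0 + math.log(2.0)) / 2.0
min_fidelity = (2.0 / 3.0 - 0.5) / (h_ideal - 0.5)
```

Solving gives a required circuit fidelity of 1/(3 ln 2), roughly 0.48, which is why the benchmark tolerates some noise but still degrades sharply as two-qubit error rates accumulate over a depth-m circuit.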

Implications and Future Directions

Quantum volume offers significant theoretical and practical implications for the quantum computing community. Theoretically, it coherently integrates the complexities of quantum gate operations, coherence, and error correction into a unified framework, offering researchers a clearer lens through which to understand and enhance device performance. Practically, it functions as a benchmarking standard, comparable to classical computing metrics, that can drive the development and maturation of quantum technologies.

Speculating on future developments, the paper suggests pathways to increase quantum volume that include further enhancing gate fidelity, optimizing connectivity layouts, and advancing compiler techniques. There is also a pronounced emphasis on refining the measurement accuracy of quantum operations, which remains crucial for achieving scalable quantum computation.

The paper’s coherent articulation of quantum volume provides a foundational metric encouraging progress in designing, optimizing, and deploying quantum computers. As researchers continue to explore this metric across more diverse quantum architectures, it is anticipated that quantum volume will become an integral part of the quantum computing evaluation landscape, facilitating a more standardized comparison of quantum capabilities across different platforms.
