Sample size needed to estimate coherence metrics across manufactured qubit ensembles

Determine the number of superconducting qubits that must be measured, for a given manufacturing and assembly process, to estimate the highest, lowest, and median values of the qubit coherence-time distribution across the ensemble with a specified degree of confidence.

Background

The paper studies coherence statistics across more than 100 superconducting coaxmon qubits integrated on a 3-inch wafer-scale die. The authors note that different statistical descriptors of coherence (e.g., highest, lowest, and median values) are each relevant for characterizing manufacturing quality and for identifying potential error hot spots in large-scale quantum processors.

They explicitly note that, given qubit-to-qubit variability, it is not clear a priori how many qubits must be measured to determine these metrics with confidence. For their own dataset they provide a bootstrapped analysis illustrating how sample size affects the estimation error of the median, minimum, and maximum coherence times, but they call out the general determination of required sample sizes for arbitrary processes as an open question.
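The kind of bootstrapped sample-size analysis described above can be sketched as follows. This is a minimal illustration using synthetic lognormal coherence times, not the authors' dataset or code; the population parameters, function names, and resampling counts are all assumptions chosen for demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for an ensemble of measured T1 times (microseconds);
# a lognormal shape is assumed purely for illustration.
population = rng.lognormal(mean=np.log(50), sigma=0.3, size=500)

def bootstrap_metric_error(data, sample_size, metric, n_boot=2000, rng=rng):
    """Bootstrap the spread of a coherence metric at a given sample size.

    Draws n_boot resamples of `sample_size` qubits (with replacement),
    applies `metric` (e.g., np.median) to each, and returns the standard
    deviation across resamples as a proxy for the estimation error.
    """
    estimates = np.array([
        metric(rng.choice(data, size=sample_size, replace=True))
        for _ in range(n_boot)
    ])
    return estimates.std()

# Estimation error shrinks with sample size, but at different rates for
# the median versus the extremes (min/max), which depend on the tails.
for n in (10, 25, 50, 100):
    err_med = bootstrap_metric_error(population, n, np.median)
    err_min = bootstrap_metric_error(population, n, np.min)
    err_max = bootstrap_metric_error(population, n, np.max)
    print(f"n={n:3d}: sd(median)={err_med:.2f}  "
          f"sd(min)={err_min:.2f}  sd(max)={err_max:.2f}")
```

One design point this makes concrete: the median's bootstrap error falls roughly as 1/sqrt(n), whereas the min and max track the tails of the distribution, so the sample size needed for a given confidence differs by metric.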

References

As coherence times of individual qubits fluctuate over time, different metrics representing coherence times can be quoted, the highest, lowest and median value as well as a, potentially complicated, distribution. Given variation between qubits, it is not clear a priori how many qubits should be measured to, with some degree of confidence, determine these metrics for a given manufacturing and assembly process.

Design and Operation of Wafer-Scale Packages Containing >500 Superconducting Qubits (2602.12773 - Kennedy et al., 13 Feb 2026), Section "Coherence Statistics" (Characterising the Monolithic Die)