Context-aware gate set tomography: Improving the self-consistent characterization of trapped-ion universal gate sets by leveraging non-Markovianity

Published 3 Jul 2025 in quant-ph (arXiv:2507.02542v1)

Abstract: To progress in the characterization of noise for current quantum computers, gate set tomography (GST) has emerged as a self-consistent tomographic protocol that can accurately estimate the complete set of noisy quantum gates, state preparations, and measurements. In its original incarnation, GST improves the estimation precision by applying the gates sequentially, provided that the noise makes them a set of fixed completely positive and trace-preserving (CPTP) maps independent of the history of previous gates in the sequence. This 'Markovian' assumption is sometimes in conflict with experimental evidence, as there might be time-correlated noise leading to non-Markovian dynamics or, alternatively, slow drifts and cumulative calibration errors that lead to context dependence, such that the CP-divisible maps composed during a sequence actually change with the circuit depth. In this work, we address this issue for trapped-ion devices with phonon-mediated two-qubit gates. By a detailed microscopic modeling of high-fidelity light-shift gates, we tailor GST to capture the main source of context dependence: motional degrees of freedom. Rather than invalidating GST, we show that context dependence can be incorporated in the parametrization of the gate set, allowing us to reduce the sampling cost of GST. Our results identify a promising research avenue that might be applicable to other platforms where microscopic modeling can be incorporated: the development of a context-aware GST.
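The distinction the abstract draws between the Markovian assumption and context dependence can be illustrated with a minimal toy model (not from the paper): a noisy single-qubit gate represented as a Pauli transfer matrix, composed over a sequence either as a fixed CPTP map or with an error rate that drifts with circuit depth. All parameter values below are illustrative assumptions.

```python
import numpy as np

# Toy model: a noisy identity gate as a single-qubit depolarizing channel,
# written as a 4x4 Pauli transfer matrix (PTM). Under the Markovian
# assumption, the same PTM is applied at every position in a sequence;
# under context dependence, the effective error rate changes with depth.

def depolarizing_ptm(p):
    """PTM of a single-qubit depolarizing channel with error probability p."""
    return np.diag([1.0, 1 - p, 1 - p, 1 - p])

def sequence_ptm(depth, p_of_k):
    """Compose `depth` gate applications, where the error rate at
    sequence position k is given by p_of_k(k)."""
    ptm = np.eye(4)
    for k in range(depth):
        ptm = depolarizing_ptm(p_of_k(k)) @ ptm
    return ptm

depth = 50
# Fixed CPTP map, independent of gate history (Markovian assumption).
markovian = sequence_ptm(depth, lambda k: 0.01)
# Error rate drifting with position in the circuit (context dependence);
# the 1e-4 drift slope is an arbitrary illustrative value.
context_dep = sequence_ptm(depth, lambda k: 0.01 + 1e-4 * k)

# The surviving Pauli component (diagonal entry [1, 1]) decays faster
# when the effective map changes with circuit depth.
print(markovian[1, 1], context_dep[1, 1])
```

A context-aware parametrization, in this toy picture, would fit the few drift parameters (here, the slope) instead of treating every depth as an independent map, which is the intuition behind the reduced sampling cost mentioned in the abstract.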
