
Quantum encoder for fixed Hamming-weight subspaces

Published 30 May 2024 in quant-ph (arXiv:2405.20408v3)

Abstract: We present an exact $n$-qubit computational-basis amplitude encoder of real- or complex-valued data vectors of $d=\binom{n}{k}$ components into a subspace of fixed Hamming weight $k$. This represents a polynomial space compression of degree $k$. The circuit is optimal in that it expresses an arbitrary data vector using only $d-1$ (controlled) Reconfigurable Beam Splitter (RBS) gates and is constructed by an efficient classical algorithm that sequentially generates all bitstrings of weight $k$ and identifies the gates that superpose the corresponding states with the correct amplitudes. An explicit compilation into CNOTs and single-qubit gates is presented, with the total CNOT-gate count of $\mathcal{O}(k\, d)$ provided in analytical form. In addition, we show how to load data in the binary basis by sequentially stacking encoders of different Hamming weights using $\mathcal{O}(d\,\log(d))$ CNOT gates. Moreover, using generalized RBS gates that mix states of different Hamming weights, we extend the construction to efficiently encode arbitrary sparse vectors. Experimentally, we perform a proof-of-principle demonstration of our scheme on a commercial trapped-ion quantum computer. We successfully upload a $q$-Gaussian probability distribution in the non-log-concave regime with $n = 6$ and $k = 2$. We also showcase how the effect of hardware noise can be alleviated by quantum error mitigation. Numerically, we show how our encoder can improve the performance of variational quantum algorithms for problems that include particle-preserving symmetries. Our results constitute a versatile framework for quantum data compression with various potential applications in fields such as quantum chemistry, quantum machine learning, and constrained combinatorial optimization.
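To make the counting in the abstract concrete, the following sketch enumerates the $d=\binom{n}{k}$ weight-$k$ bitstrings and writes down a standard RBS gate matrix (the two-qubit rotation that mixes $|01\rangle$ and $|10\rangle$ while fixing $|00\rangle$ and $|11\rangle$, hence preserving Hamming weight). This is an illustration of the objects involved, not the authors' encoder-construction algorithm; the function names are ours.

```python
from itertools import combinations
from math import comb
import numpy as np

def weight_k_bitstrings(n, k):
    """Enumerate all n-bit strings of Hamming weight k."""
    out = []
    for ones in combinations(range(n), k):
        bits = ['0'] * n
        for i in ones:
            bits[i] = '1'
        out.append(''.join(bits))
    return out

def rbs(theta):
    """Reconfigurable Beam Splitter gate on two qubits: a rotation by theta
    within span{|01>, |10>}; |00> and |11> are left untouched, so the total
    Hamming weight of the register is conserved."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([
        [1.0, 0.0, 0.0, 0.0],
        [0.0,   c,   s, 0.0],
        [0.0,  -s,   c, 0.0],
        [0.0, 0.0, 0.0, 1.0],
    ])

# The paper's trapped-ion demonstration uses n = 6, k = 2:
n, k = 6, 2
states = weight_k_bitstrings(n, k)
d = comb(n, k)                      # d = 15 basis states in the subspace
# An arbitrary vector in this subspace has d - 1 = 14 free (complex) ratios,
# matching the d - 1 RBS gates the encoder uses.
```

Each output string has exactly `k` ones, and every weight-preserving two-qubit rotation acting on such states stays inside the $\binom{n}{k}$-dimensional subspace.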
