Post-Quantum Cryptography from Quantum Stabilizer Decoding
Published 19 Mar 2026 in quant-ph and cs.CR | (2603.19110v1)
Abstract: Post-quantum cryptography currently rests on a small number of hardness assumptions, posing significant risks should any one of them be compromised. This vulnerability motivates the search for new and cryptographically versatile assumptions that make a convincing case for quantum hardness. In this work, we argue that decoding random quantum stabilizer codes -- a quantum analog of the well-studied LPN problem -- is an excellent candidate. This task occupies a unique middle ground: it is inherently native to quantum computation, yet admits an equivalent formulation with purely classical input and output, as recently shown by Khesin et al. (STOC '26). We prove that the average-case hardness of quantum stabilizer decoding implies the core primitives of classical Cryptomania, including public-key encryption (PKE) and oblivious transfer (OT), as well as one-way functions. Our constructions are moreover practical: our PKE scheme achieves essentially the same efficiency as state-of-the-art LPN-based PKE, and our OT is round-optimal. We also provide substantial evidence that stabilizer decoding does not reduce to LPN, suggesting that the former problem constitutes a genuinely new post-quantum assumption. Our primary technical contributions are twofold. First, we give a reduction from random quantum stabilizer decoding to an average-case problem closely resembling LPN, but which is equipped with additional symplectic algebraic structure. While this structure is essential to the quantum nature of the problem, it raises significant barriers to cryptographic security reductions. Second, we develop a new suite of scrambling techniques for such structured linear spaces, and use them to produce rigorous security proofs for all of our constructions.
The paper introduces quantum stabilizer decoding as a new hardness assumption, Learning Stabilizers with Noise (LSN), and uses it to build cryptographic primitives including OWFs, PKE, and OT.
It reduces the decoding to the SympLPN problem, employing novel scrambling techniques to manage symplectic constraints and quantum noise.
This approach diversifies cryptographic assumptions, enhancing post-quantum security by offering an alternative to traditional noisy linear algebra challenges.
Introduction and Motivation
This paper proposes quantum stabilizer code decoding as a new, quantum-native computational assumption for classical cryptography, offering an alternative foundation for post-quantum cryptography (PQC). The current landscape of PQC relies heavily on problems such as Learning with Errors (LWE), Learning Parity with Noise (LPN), and various lattice-based, code-based, multivariate, and isogeny-based constructions. All known PQC primitives ultimately derive their security from a narrow set of classical hard problems, presenting risks of catastrophic failure if these problems are broken by algorithmic advances.
Unlike foundational assumptions in PQC that are classical but conjectured to resist quantum attacks, quantum stabilizer decoding is intrinsically tied to quantum information processing. Recent results, especially by Khesin et al., show that average-case quantum stabilizer decoding reduces to a classically formulated problem that maintains the essential structure and computational hardness of the original quantum problem. This opens the door to developing classical cryptography rooted in quantum-native complexity, raising both theoretical interest and practical cryptographic utility.
Quantum Stabilizer Decoding and the LSN Problem
The main computational hardness assumption is captured by the Learning Stabilizers with Noise (LSN) problem, a quantum generalization of LPN. In LSN, one must decode a random quantum stabilizer code—represented as a random Clifford encoding and a noisy codeword—back to the original information. The noise follows a quantum depolarizing distribution, differing significantly from the independent Bernoulli errors used in LPN.
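The contrast between the two noise models can be made concrete. The sketch below (illustrative only, not code from the paper) samples classical LPN noise as independent Bernoulli bit flips, and depolarizing noise as independent single-qubit Pauli errors written in the standard symplectic (x|z) bit representation; a Y error flips a bit in both halves, which is exactly the correlation absent from LPN.

```python
import random

def lpn_noise(n, p, rng):
    """Classical LPN noise: n independent Bernoulli(p) bit flips."""
    return [1 if rng.random() < p else 0 for _ in range(n)]

def depolarizing_noise(n, p, rng):
    """Depolarizing noise on n qubits, in the symplectic (x|z) bit
    representation of Pauli errors: each qubit independently gets
    I with prob 1-p, or X, Y, Z each with prob p/3.
    X -> (x=1, z=0), Z -> (x=0, z=1), Y -> (x=1, z=1)."""
    x, z = [], []
    for _ in range(n):
        r = rng.random()
        if r < 1 - p:                 # identity: no error
            x.append(0); z.append(0)
        elif r < 1 - p + p / 3:       # X error
            x.append(1); z.append(0)
        elif r < 1 - p + 2 * p / 3:   # Z error
            x.append(0); z.append(1)
        else:                         # Y error: correlated X and Z flip
            x.append(1); z.append(1)
    return x, z

rng = random.Random(0)
e = lpn_noise(16, 0.125, rng)
x, z = depolarizing_noise(16, 0.125, rng)
# Because a Y error contributes to both halves, the x- and z-components
# of depolarizing noise are correlated, unlike two independent
# Bernoulli error vectors.
```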
Two major insights underpin the cryptographic utility of LSN:
Classical Equivalent Formulation: Average-case quantum stabilizer decoding can be formulated purely with classical inputs and outputs, yet preserves the quantum mechanical structure and symplectic algebraic constraints intrinsic to stabilizer codes.
Hardness Distinct from LPN: Reductions from LPN to LSN exist in certain parameter regimes, yet the relationship is not tight. Evidence is provided that decoding random quantum stabilizer codes does not reduce to LPN, particularly in cryptographically relevant low-noise parameter regimes. This suggests that LSN constitutes a new hardness assumption, not subsumed by established noisy linear algebraic problems.
Cryptographic Constructions from Stabilizer Decoding
Core Primitives Realizable
The paper demonstrates that the average-case hardness of stabilizer decoding suffices to construct essential classical cryptographic objects:
One-way functions (OWFs)
Public-Key Encryption (PKE)
Oblivious Transfer (OT)
Since OT is classically complete for secure computation, these constructions extend to the full power of multi-party computation.
Construction Efficiency
All constructed primitives (OWF, PKE, and OT) achieve efficiency comparable to the best known LPN-based schemes. For example, the PKE construction from LSN in the low-noise regime (p = O(1/n)) achieves O(n²) encryption time and O(n) decryption time, matching the practical state of the art [Alekhnovich03, damgaard2012practical].
Technical Approach
The main technical developments are:
Reduction to Symplectic LPN (SympLPN): The authors reduce quantum stabilizer decoding to a structured linear code problem (SympLPN), which incorporates symplectically orthogonal constraints—retaining crucial quantum code structure and degeneracy effects.
Scrambling Techniques in Symplectic Linear Algebra: New techniques are developed to handle security reductions in the presence of complex algebraic dependencies in symplectic subspaces, enabling rigorous proofs for the security of the proposed cryptographic schemes.
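The symplectic orthogonality at the heart of SympLPN is the standard commutation test from the stabilizer formalism: two Pauli operators, written as (x|z) bit vectors, commute iff their symplectic inner product over F_2 vanishes. A minimal sketch (standard textbook material, not the paper's code):

```python
def symplectic_ip(a, b):
    """Symplectic inner product over F_2 for length-2n vectors (x|z):
    <(x1|z1), (x2|z2)> = x1·z2 + z1·x2 (mod 2).
    Two Pauli operators commute iff this evaluates to 0."""
    n = len(a) // 2
    x1, z1 = a[:n], a[n:]
    x2, z2 = b[:n], b[n:]
    s = sum(u * v for u, v in zip(x1, z2)) + sum(u * v for u, v in zip(z1, x2))
    return s % 2

# Single-qubit sanity checks: X = (1|0), Z = (0|1), Y = (1|1), I = (0|0).
X, Z, Y, I = [1, 0], [0, 1], [1, 1], [0, 0]
assert symplectic_ip(X, Z) == 1   # X and Z anticommute
assert symplectic_ip(X, X) == 0   # every Pauli commutes with itself
assert symplectic_ip(Y, Z) == 1   # Y and Z anticommute
assert symplectic_ip(I, Y) == 0   # identity commutes with everything
```

A stabilizer group is a subspace on which this form vanishes identically, and it is precisely this constraint that a security reduction cannot simply "forget."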
A central result is a non-trivial classical reduction from LSN with k = O(log n) logical qubits to SympLPN of dimension n. The argument is that, at cryptographically meaningful security levels, brute-force attacks are no longer parameterized by the number of qubits but by the cost of correcting errors (reflecting the quantum degeneracy phenomenon).
Security Foundations
The constructions are proven secure under the search or decision versions of SympLPN, with formal reductions for each primitive. PKE and OT constructions require delicate reductions when decreasing the logical dimension by one, for which the authors introduce hyperplane rotations and innovations in noise symmetrization.
Comparative Hardness and Reduction Barriers
The comparative relation between SympLPN and LPN is extensively addressed. Although a reduction from LPN to SympLPN exists for certain regimes, no converse reduction is known or expected. The authors provide strong technical barriers against SympLPN-to-LPN reductions, showing that generic linear mappings intended to "forget" symplectic structure result in entropy deficiencies or amplify noise beyond the Shannon limit for code decoding. Thus, SympLPN appears to be a genuinely new, quantum-founded assumption.
Empirically, the best known classical attacks (information-set decoding, ISD) on PKE schemes built from SympLPN match those on LPN, as ISD is largely agnostic to algebraic code structure and instead governed by noise sparsity. The quantum-inspired depolarizing noise is slightly more challenging for ISD, owing to correlated error pairs.
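Why ISD is governed by noise sparsity rather than code structure is visible even in Prange's algorithm, the simplest ISD variant: it repeatedly guesses a random set of coordinates containing the error support and solves a linear system, so its cost depends on the error weight, not on any algebraic structure of the code. A toy sketch (illustrative only, not the attacks benchmarked in the paper):

```python
import random

def gauss_solve_f2(A, b):
    """Solve A x = b over F_2 for a square bit matrix A (list of rows).
    Returns the solution vector, or None if A is singular."""
    m = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for col in range(m):
        piv = next((r for r in range(col, m) if M[r][col]), None)
        if piv is None:
            return None
        M[col], M[piv] = M[piv], M[col]
        for r in range(m):
            if r != col and M[r][col]:
                M[r] = [a ^ c for a, c in zip(M[r], M[col])]
    return [M[r][m] for r in range(m)]

def prange_isd(H, s, t, rng, iters=2000):
    """Toy Prange ISD: find e with H e = s (over F_2) and weight(e) <= t.
    H is an r x n parity-check matrix, s a length-r syndrome."""
    r, n = len(H), len(H[0])
    for _ in range(iters):
        cols = rng.sample(range(n), r)  # guess: error lives in these columns
        sub = [[H[i][j] for j in cols] for i in range(r)]
        x = gauss_solve_f2(sub, s)
        if x is None or sum(x) > t:
            continue
        e = [0] * n
        for bit, j in zip(x, cols):
            e[j] = bit
        return e
    return None

# Small worked instance: plant a weight-t error and recover a valid solution.
rng = random.Random(1)
n, r, t = 20, 10, 2
H = [[rng.randrange(2) for _ in range(n)] for _ in range(r)]
e_true = [0] * n
for j in rng.sample(range(n), t):
    e_true[j] = 1
s = [sum(H[i][j] & e_true[j] for j in range(n)) % 2 for i in range(r)]
e_found = prange_isd(H, s, t, rng)
```

Nothing in the loop inspects the code's structure; only the weight bound t enters, which is the sense in which ISD is agnostic to the symplectic constraints of SympLPN.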
Implications and Future Directions
Practical Implications
Cryptographic portfolio diversification is strengthened by SympLPN-based primitives, mitigating catastrophic risk if standard noisy linear-algebraic problems become tractable (e.g., via novel quantum algorithms). The efficiency and security of such schemes rival those of classical LPN-based designs in relevant parameter regimes.
Theoretical Advancement
The establishment of a cryptographically complete, quantum-native assumption offers a win-win scenario: if the assumption is broken, this implies dramatic progress in quantum code decoding; if not, it provides an alternative secure basis for classical cryptography grounded in quantum-native complexity.
Future Developments
Deeper Hardness Analysis: Further work is needed to fully characterize the hardness of stabilizer decoding, in particular its resilience to potential yet-undiscovered quantum attacks.
Direct High-Rate Schemes: Investigating whether public-key cryptography can be directly constructed from high-rate, low-noise quantum stabilizer decoding, bypassing current reductions, remains open.
Composable Security: Extending security analyses to composable frameworks (e.g., UC security for OT and general multiparty computation) is a promising avenue.
Conclusion
By anchoring classical cryptography in the average-case hardness of decoding random quantum stabilizer codes, this work establishes a fundamentally quantum foundation for post-quantum cryptographic schemes, with rigorous reductions, strong efficiency, and a plausible separation from existing hardness assumptions. The approach broadens the security landscape of PQC, leverages deep connections between quantum information and cryptography, and motivates continued study of native quantum complexity as a source of practical and theoretically robust cryptographic hardness.
Reference:
"Post-Quantum Cryptography from Quantum Stabilizer Decoding" (2603.19110)