From Promises to Totality: A Framework for Ruling Out Quantum Speedups

Published 31 Mar 2026 in quant-ph | (2603.29256v1)

Abstract: We study when partial Boolean functions can (and cannot) exhibit superpolynomial quantum query speedups, and develop a general framework for ruling out such speedups via two complementary lenses: promise-aware complexity measures and function completions. First, we introduce promise versions of standard combinatorial measures (including block sensitivity and related variants) and prove that if the relevant promise and completion measures collapse, then deterministic and quantum query complexities are necessarily polynomially related, i.e., $D(f)=poly(Q(f))$. We then analyze structured families of promises, including symmetric partial functions and promises supported on Hamming slices, obtaining sharp (up to polynomial factors) characterizations in terms of a single gap parameter for the symmetric case and refined slice-dependent bounds for $k$-slice domains. Next, we formalize completion complexity as the minimum of a measure over total completions of a partial function, and show that completability of a measure captures the possibility of superpolynomial quantum speedups. Finally, we apply this viewpoint to derive broad non-speedup criteria for some classes of functions admitting well-behaved completions, such as functions with low maximum influence on both the standard and $p$-biased hypercubes and functions with efficiently identifiable domains, and then show some hardness results for general completion techniques.

Summary

  • The paper introduces promise-aware complexity measures to assess when superpolynomial quantum speedups cannot occur.
  • It develops a unified hierarchy blending classical measures with completion complexity to relate deterministic and quantum query complexities.
  • It proves that efficient function completions prevent exponential quantum advantages by linking structural properties to computational hardness.

A Framework for Ruling Out Quantum Speedups: From Promise-Aware Measures to Total Function Completion

Introduction

This work systematically investigates which partial Boolean functions admit superpolynomial quantum query speedups, exemplified by separations between quantum and classical (deterministic or randomized) query complexity. Unlike previous approaches relying heavily on symmetric or highly structured domains, this paper introduces a general framework based on promise-complexity measures and function completions. The framework not only unifies existing results but also yields new generalizations, quantitative bounds, and inherent limitations, revealing structural prerequisites for quantum speedups.

Promise-Aware Complexity Measures

Classical complexity measures such as sensitivity, block sensitivity, and certificate complexity have well-understood polynomial relationships for total Boolean functions. These relationships preclude superpolynomial quantum speedups for total functions and are foundational in the polynomial method for lower bounds. The present work extends these concepts to partial functions by constructing promise versions of these measures that refine how undefined inputs are treated. Specifically, promised block sensitivity and promised certificate complexity are defined to account for blocks whose flip leaves the promised domain, since such exits are precisely where the potential for a speedup resides.
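
To make the distinction concrete, here is a minimal brute-force sketch (practical only for toy domain sizes) of block sensitivity alongside one plausible promise-aware variant, in which a block whose flip leaves the promised domain also counts as sensitive. The exact definitions in the paper may differ; this formalization, and the toy function below, are illustrative assumptions.

```python
from itertools import combinations, product

def sensitive_blocks(f, x, promised=False):
    """All nonempty index sets B that are sensitive for f at x.

    Standard: flipping B must stay inside the domain and change f.
    Promised variant (an assumed formalization, not necessarily the
    paper's exact definition): a flip that exits the promise also counts.
    """
    n = len(x)
    blocks = []
    for r in range(1, n + 1):
        for B in combinations(range(n), r):
            y = tuple(b ^ 1 if i in B else b for i, b in enumerate(x))
            if y in f:
                if f[y] != f[x]:
                    blocks.append(frozenset(B))
            elif promised:
                blocks.append(frozenset(B))
    return blocks

def block_sensitivity_at(f, x, promised=False):
    """Maximum number of pairwise disjoint sensitive blocks at x."""
    blocks = sensitive_blocks(f, x, promised)
    best = 0
    def extend(size, used, rest):
        nonlocal best
        best = max(best, size)
        for i, B in enumerate(rest):
            if not (B & used):
                extend(size + 1, used | B, rest[i + 1:])
    extend(0, frozenset(), blocks)
    return best

def block_sensitivity(f, promised=False):
    return max(block_sensitivity_at(f, x, promised) for x in f)

# Toy partial function on {0,1}^3: promise = weights 0 and 2 only.
f = {x: (0 if sum(x) == 0 else 1)
     for x in product((0, 1), repeat=3) if sum(x) in (0, 2)}
```

On this toy promise the standard block sensitivity is 1, while the promised variant jumps to 3 at the all-zeros input, because every single-bit flip exits the domain.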

Main Technical Result: If, for a partial function $f$, the critical block sensitivity and promised block sensitivity (or variants such as critical certificate complexity) are polynomially related, then deterministic and quantum query complexities must also be polynomially related. Thus, unless these promise-aware measures separate superpolynomially, superpolynomial quantum speedup is impossible.

The analytic structure of these relationships is captured in a hierarchy of complexity measures, where polynomial collapse at any point enforces the collapse between deterministic and quantum complexity. The paper gives detailed proofs showing how arguments standard for total functions must be carefully generalized—often failing outright—when undefined points are present.

Characterization for Highly Symmetric and Slice Domains

A central theme in the literature is that the symmetry or structure of the function domain obstructs quantum speedup. This work revisits symmetric functions and slice domains through the lens of the new measures, quantifying the degree of possible quantum advantage in terms of a minimal gap parameter.

For symmetric partial functions (those invariant under the $S_n$ action), all key query complexity measures (deterministic, randomized, and quantum) can be sharply characterized up to polynomial factors by the minimal Hamming-weight separation between disagreeing outputs. For functions defined on a single Hamming slice, the paper refines previous polynomial bounds relating deterministic complexity to quantum complexity, particularly improving prior results for slices near $k = n/2$.
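
As an illustration, the gap parameter for a symmetric partial function (specified by its value on each promised Hamming weight) can be computed directly. The encoding below is a sketch; `spec` and `min_gap` are names introduced here, not the paper's.

```python
def min_gap(spec):
    """Minimal Hamming-weight separation between promised weights on
    which a symmetric partial function disagrees.  spec maps a Hamming
    weight to the function's value there; all other weights lie outside
    the promise.  Assumes both output values occur somewhere."""
    return min(abs(a - b)
               for a in spec for b in spec if spec[a] != spec[b])

# Deutsch-Jozsa-style promise on n = 8 bits:
# constant inputs (weight 0 or 8) map to 0, balanced inputs (weight 4) to 1.
dj = {0: 0, 8: 0, 4: 1}
```

For `dj` the gap is 4; per the characterization above, this single parameter governs all three query complexity measures up to polynomial factors.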

Notably, for these domains, the polynomial method is proved to be essentially tight: approximate degree and quantum query complexity are polynomially related, precluding exponential advantages even when the promise definition is maximally permissive.

Completion Complexity and the Necessity of Incompletable Promises

A key conceptual contribution is the notion of completion complexity: Given a partial function, can one extend it to a total function without causing a superpolynomial blow-up in complexity measures such as degree or quantum query complexity? The paper demonstrates that superpolynomial quantum speedup is possible if and only if no such completion exists preserving polynomial measure relationships.
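
For tiny domains the definition can be made concrete by brute force: enumerate every total extension and minimize a measure over them. Here the measure is exact polynomial degree (the paper also works with approximate degree and quantum query complexity, which this sketch does not compute); all names are illustrative.

```python
from itertools import product

def degree(g, n):
    """Degree of the unique multilinear polynomial agreeing with g on
    {0,1}^n, computed via Möbius inversion of the truth table."""
    deg = 0
    for S in product((0, 1), repeat=n):
        # Coefficient of the monomial prod_{i : S_i = 1} x_i.
        c = sum((-1) ** (sum(S) - sum(x)) * g[x]
                for x in product((0, 1), repeat=n)
                if all(x[i] <= S[i] for i in range(n)))
        if c != 0:
            deg = max(deg, sum(S))
    return deg

def completion_degree(f, n):
    """Completion complexity w.r.t. degree: the minimum degree of any
    total function extending the partial function f (brute force,
    feasible only for very small n)."""
    undefined = [x for x in product((0, 1), repeat=n) if x not in f]
    best = n
    for bits in product((0, 1), repeat=len(undefined)):
        g = dict(f)
        g.update(zip(undefined, bits))
        best = min(best, degree(g, n))
    return best
```

For example, the two-bit partial function with f(0,0) = 0 and f(1,1) = 1 has completion degree 1: it extends to the dictator function x_1, even though completing it along the diagonal in the obvious XOR-like way would cost degree 2.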

This completion-oriented perspective directly links the existence of speedups to the (in)ability to complete partial functions efficiently. Several broad impossibility results are derived:

  • If domain membership can be efficiently checked (by deterministic, randomized, or quantum query algorithms), and the approximate degree is not much larger than the verification complexity, then the (approximate) degree of any completion is also efficient, forbidding superpolynomial speedup.
  • For partial functions with domains efficiently identifiable via low-degree indicator polynomials, the “naïve” completion (extending to 0 or 1 outside the promise) suffices to guarantee polynomial relationship between quantum and classical complexity.
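
A minimal sketch of the naïve completion from the second bullet: extend by a constant outside the promise. The underlying point is that if the domain indicator χ_D has low degree and p approximates f on D, then p·χ_D approximates this completion with only an additive degree cost; the code below performs only the truth-table extension, and the names are ours.

```python
from itertools import product

def naive_completion(f, n, default=0):
    """Extend a partial Boolean function to a total one by assigning a
    fixed default value on every input outside the promise."""
    return {x: f.get(x, default) for x in product((0, 1), repeat=n)}

# Partial function promised on the two constant inputs of {0,1}^2.
fpart = {(0, 0): 1, (1, 1): 1}
total = naive_completion(fpart, 2)
```

`total` agrees with `fpart` on the promise and takes the default value 0 on (0,1) and (1,0).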

Lipschitz, Influence, and Sparsity Constraints

The paper further develops analytic conditions on the approximating polynomial for a partial function. If the approximator is globally Lipschitz with modest constant or if all coordinates have low influence and low sparsity, then extension to a total function can be accomplished without significant degree blow-up. This again precludes quantum speedup: high influence or “ruggedness” is necessary to even permit the possibility of quantum advantage.

This extends to the $p$-biased hypercube, showing the framework's generality beyond the uniform cube and allowing application to Boolean functions relevant to random graph models and property testing.
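
The influence conditions above are easy to evaluate exactly on small examples, both under the uniform measure and under the $p$-biased product measure; this brute-force sketch (with names of our choosing) does just that.

```python
from itertools import product

def p_biased_influences(g, n, p=0.5):
    """Influence of each coordinate of a total function g on {0,1}^n:
    the probability, under the p-biased product measure, that flipping
    that coordinate changes g.  p = 0.5 recovers the uniform cube."""
    infl = [0.0] * n
    for x in product((0, 1), repeat=n):
        weight = p ** sum(x) * (1 - p) ** (n - sum(x))
        for i in range(n):
            if g[x] != g[x[:i] + (1 - x[i],) + x[i + 1:]]:
                infl[i] += weight
    return infl

# Majority on 3 bits: a coordinate is pivotal exactly when the other
# two bits disagree, giving influence 1/2 under the uniform measure.
maj = {x: int(sum(x) >= 2) for x in product((0, 1), repeat=3)}
```

Functions whose maximum influence stays small in both the uniform and $p$-biased regimes fall under the paper's non-speedup criteria.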

Computational Complexity and the Hardness of Completion

From an algorithmic perspective, even when a suitable completion exists, finding the appropriate perturbation of an approximator's coefficients (to yield a good total function) can itself be intractable: the paper proves it NP-complete via a reduction from 3-SAT. This computational barrier aligns with the functional-analytic obstacles that must be surmounted for a quantum speedup to manifest, and it underscores that speedup is rare in a strong sense: both structural and computational conditions must be satisfied.

Implications and Open Problems

The results unify several lines of work on quantum query complexity lower bounds, promise problems, and function symmetries. The completion framework provides a precise “if and only if” demarcation for when quantum-classical separations are possible, reducing the question to the existence of efficiently extendable polynomials. Practically, this suggests that attempts to find exponential quantum advantages should focus on incompletable (from the perspective of the polynomial/certificate hierarchy) partial functions with highly structured, hard-to-identify domains. Theoretically, this shifts future research to finer characterizations of function classes that are resistant to completion and understanding the distribution of such “hard” functions among all partial functions.

Several directions remain open:

  • Fine-grained extensions of promise-aware measures to capture marginal speedups between quantum and randomized complexity.
  • Structural characterization of domains where even approximate completion is infeasible.
  • Systematic study of completion complexity for measures beyond approximate degree and their tight polynomial relationships.

Conclusion

This framework, organized around promise-sensitive complexity measures and completion complexity, offers a comprehensive toolkit for ruling out quantum speedups. By reducing quantum-classical query complexity separation to the behavior of promise-aware measures and completion properties, it provides a clear roadmap to understanding which problems are fundamentally hard for quantum algorithms—both in the presence of promises and for total functions. The theoretical and practical boundaries drawn by these results will inform both lower bound proofs and the ongoing search for natural problems with genuine quantum advantages.

Reference: "From Promises to Totality: A Framework for Ruling Out Quantum Speedups" (2603.29256)
