Regularized Moment-SOS Hierarchy
- Regularized Moment-SOS hierarchies are advanced relaxation schemes that refine classical moment-SOS methods through analytic and algebraic regularizations.
- They integrate functional-analytic, spectral, and homogenization techniques to restore convergence and stability, particularly on non-Archimedean and unbounded domains.
- These approaches can substantially reduce computational cost and, under suitable conditions, ensure finite convergence in applications such as optimal control, combinatorial optimization, and neural network certification.
The regularized Moment-SOS hierarchy is a family of relaxation schemes for global polynomial optimization that enhances or extends classical Moment-SOS and sum-of-squares (SOS) techniques. It incorporates analytic or algebraic regularization to restore convergence, stability, and tightness when standard methods encounter difficulties due to non-Archimedean sets, lack of compactness, or adverse numerical properties. This hierarchy emerges in diverse forms: functional-analytic regularization based on measure determinacy, penalized (norm-based) relaxations, partial homogenization to compactify unbounded domains, spectral (bounded-trace) relaxations, and blockwise/interpolated matrix schemes bridging low and high relaxation degrees.
1. Classical Moment-SOS Hierarchy and Its Limitations
The foundational Moment-SOS hierarchy formulates polynomial optimization problems min{f(x) : x ∈ S} over a semialgebraic set S = {x ∈ ℝⁿ : g_j(x) ≥ 0, j = 1, …, m} via a sequence of primal moment SDPs and their SOS duals. At relaxation order r, one solves:
- Primal: Minimize L_y(f) over pseudo-moment sequences y subject to y_0 = 1, M_r(y) ⪰ 0, and M_{r−d_j}(g_j y) ⪰ 0 for all j.
- Dual: Maximize λ subject to f − λ ∈ Q_r(g), where Q_r(g) = {σ_0 + Σ_j σ_j g_j : σ_j SOS, deg σ_0 ≤ 2r, deg(σ_j g_j) ≤ 2r} is the truncated quadratic module.
The theoretical guarantee of convergence is backed by Putinar's Positivstellensatz, which requires the Archimedean property (existence of N > 0 such that N − ‖x‖² ∈ Q(g)). On non-Archimedean or unbounded sets this property fails: positive polynomials may lie outside the quadratic module, causing the hierarchy to stall or lose tightness (Henrion, 5 Dec 2025).
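To make the primal side concrete, here is a minimal numerical sketch (illustrative only, with hypothetical helper code, not taken from the cited works): it assembles the order-2 moment matrix and the order-1 localizing matrix of the uniform measure on [−1, 1] with constraint g(x) = 1 − x², and checks the PSD feasibility conditions of the moment SDP.

```python
import numpy as np

# Moments m_k = (1/2) * integral of x^k over [-1, 1] for the uniform measure.
m = [1.0 / (k + 1) if k % 2 == 0 else 0.0 for k in range(6)]

# Order-2 moment matrix M_2(y) in the monomial basis (1, x, x^2): Hankel structure.
M2 = np.array([[m[i + j] for j in range(3)] for i in range(3)])

# Localizing matrix for g(x) = 1 - x^2 at order 1, basis (1, x):
# entries are L_y(g * x^(i+j)) = m_{i+j} - m_{i+j+2}.
Mg = np.array([[m[i + j] - m[i + j + 2] for j in range(2)] for i in range(2)])

# Both matrices must be PSD for y to be feasible in the primal moment SDP.
print(np.linalg.eigvalsh(M2).min() >= -1e-12)  # True
print(np.linalg.eigvalsh(Mg).min() >= -1e-12)  # True
```

Because these moments come from an actual measure, feasibility is automatic; the SDP relaxation searches over all such PSD-feasible pseudo-moment sequences, including those not arising from measures.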
2. Functional-Analytic Regularization via Measure Determinacy
Instead of algebraic compactness, one may invoke measure-theoretic determinacy, specifically the multivariate Carleman condition. A Borel measure μ on ℝⁿ is determinate whenever its moments satisfy

Σ_{k≥1} (∫ x_i^{2k} dμ)^{−1/(2k)} = ∞  for each coordinate i = 1, …, n,

and determinacy implies that projections of the quadratic module become dense (in a suitable norm) in the cone of positive polynomials. The regularized Moment-SOS hierarchy accordingly replaces the strict SOS equality with a relaxed certificate (Henrion, 5 Dec 2025):
- Dual (SOS): Maximize λ subject to an identity for f − λ in which the multipliers are SOS of suitable degree.
- Primal (moment): Minimize L_y(f) subject to the standard moment and localizing matrix constraints.
Convergence is achieved without invoking a Positivstellensatz: the sequence of optimal values approaches the global minimum as the relaxation order grows. Penalized variants use a Bernstein-Markov-type constant to self-certify lower bounds, ensuring monotonicity and convergence (Henrion, 5 Dec 2025).
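The Carleman criterion can be probed numerically. The sketch below (hypothetical helper names; it uses the standard fact that the even moments of a standard normal are E[X^{2k}] = (2k−1)!!) computes partial sums of the Carleman series for a Gaussian marginal; their steady growth reflects divergence of the series, hence determinacy.

```python
import math

def gaussian_moment(two_k):
    """Even moment E[X^(2k)] = (2k-1)!! of the standard normal (two_k = 2k)."""
    return math.prod(range(1, two_k, 2))

def carleman_partial_sum(K):
    """Partial sum of m_{2k}^(-1/(2k)) for k = 1..K; divergence implies determinacy.
    Computed in log space so huge factorial-type moments do not overflow floats."""
    return sum(math.exp(-math.log(gaussian_moment(2 * k)) / (2 * k))
               for k in range(1, K + 1))

# The terms decay only like ~ k^(-1/2), so the partial sums grow without bound:
print(carleman_partial_sum(50), carleman_partial_sum(200))
```

For a measure with much heavier tails (e.g. lognormal, the classical indeterminate example), the analogous terms decay fast enough for the series to converge, and determinacy-based regularization no longer applies.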
3. Homogenization and Compactification for Unbounded Domains
For polynomial optimization or optimal control on unbounded sets, partial homogenization introduces auxiliary variables to transform the control or decision variables to a compact domain. For unbounded optimal control:
- Original non-compact problem fails standard Moment-SOS relaxation due to absence of an Archimedean module.
- By identifying each unbounded value u ∈ ℝ with a pair (v, w) satisfying v² + w² = 1 and w > 0, so that u = v/w, the real line is compactified onto an algebraic "sphere" (here, a half-circle).
- Occupation measures are then supported on a compact semialgebraic set, and all constraints become polynomial.
The resulting infinite-dimensional linear programs and Moment-SOS relaxations are equivalent to the original unbounded optimal control problem, with a proven absence of relaxation gap and hierarchy convergence under compactness and coercivity assumptions (no loss of optimal value in the passage from the non-compact to the compactified domain) (Sehnalová et al., 17 Mar 2025).
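One concrete chart realizing such a compactification (an illustrative choice; the exact construction in the cited work may differ) maps u to (v, w) = (u, 1)/√(1 + u²), which lies on the upper unit half-circle and satisfies u = v/w:

```python
import math

def compactify(u):
    """Map u in R to a point (v, w) on the upper half of the unit circle."""
    s = math.sqrt(1.0 + u * u)
    return u / s, 1.0 / s

def decompactify(v, w):
    """Inverse chart: recover u = v / w (valid for w > 0)."""
    return v / w

for u in (-1e3, -2.0, 0.0, 3.5, 1e4):
    v, w = compactify(u)
    assert abs(v * v + w * w - 1.0) < 1e-12              # lands on the algebraic sphere
    assert abs(decompactify(v, w) - u) <= 1e-6 * max(1.0, abs(u))  # round trip
print("ok")
```

Points at infinity correspond to w = 0, so the compactified formulation keeps track of trajectories escaping to infinity with polynomial data only.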
4. Spectral and Bounded-Trace Regularization
Spectral regularization enforces a constant trace property (CTP) on moment matrices indexed on the sphere. This regularizes the spectrahedron and leads to the "spectral" Moment-SOS hierarchy:
- For sphere-constrained POPs, every feasible moment matrix has a fixed trace (which can be normalized to 1 by diagonal scaling).
- The SDP dual reduces to minimizing the largest eigenvalue of an affine matrix pencil plus a linear term, a first-order nonsmooth convex problem.
This approach reformulates ill-conditioned, large-scale SDPs as well-conditioned, low-memory spectral optimization problems. Empirical evidence demonstrates speedups and scalability on problems with thousands of variables, with numerical bounds nearly matching classical SDP relaxations at much lower computational cost (Mai et al., 2020).
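The constant trace property is easy to observe numerically. In this sketch (hypothetical helper name), the degree-1 moment matrix of any probability measure supported on the unit circle has trace 1 + E[x² + y²] = 2, regardless of the measure:

```python
import numpy as np

rng = np.random.default_rng(0)

def moment_matrix_on_circle(angles):
    """Degree-1 moment matrix E[v v^T], v = (1, x, y), for points on the unit circle."""
    x, y = np.cos(angles), np.sin(angles)
    V = np.stack([np.ones_like(x), x, y])      # 3 x N matrix of monomial evaluations
    return (V @ V.T) / len(angles)

# Two very different measures on the circle give the SAME trace 1 + E[x^2 + y^2] = 2.
M_uniform = moment_matrix_on_circle(rng.uniform(0, 2 * np.pi, 10000))
M_spiky = moment_matrix_on_circle(np.array([0.1, 0.1, 2.5]))
print(np.trace(M_uniform), np.trace(M_spiky))  # both 2 (up to rounding)
```

Since the trace is fixed, the feasible spectrahedron is bounded a priori, which is what licenses the eigenvalue-minimization reformulation of the dual.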
5. Sublevel and Blockwise Regularization
The sublevel Moment-SOS hierarchy bridges the gap between a low-order and a higher-order relaxation through blockwise enrichment. For a chosen block size ℓ and block budget s:
- Only a subset of the higher-order SOS blocks is added, each involving ℓ of the variables, with s blocks per constraint.
- This interpolates between very weak first-order relaxations and expensive full higher-order SDPs.
This localized regularization improves bounds and stabilizes the solver. Empirical studies confirm reductions in optimality gaps and computational cost compared to full high-order relaxations in combinatorial, integer, and neural network optimization (Chen et al., 2021).
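The cost motivation is visible from block sizes alone: a full order-r moment matrix in n variables has side C(n+r, r), while a sublevel block on ℓ variables has side C(ℓ+r, r). A quick count (illustrative parameter values, not from the cited study):

```python
from math import comb

def moment_matrix_side(n_vars, order):
    """Side length of the order-r moment matrix in n variables: C(n + r, r)."""
    return comb(n_vars + order, order)

n, r, ell = 50, 2, 5
full = moment_matrix_side(n, r)    # one big SDP block for the full relaxation
sub = moment_matrix_side(ell, r)   # one sublevel block on 5 of the 50 variables
print(full, sub)  # 1326 vs 21
```

Since interior-point cost grows superlinearly in the block side, many small C(ℓ+r, r)-sized blocks are far cheaper than one C(n+r, r)-sized block, which is the arithmetic behind the reported gap/cost trade-off.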
6. Homogenized and Denominator Hierarchies for Unbounded Sets
An explicit homogenization construction introduces a homogenizing variable x₀ and works on the unit sphere in ℝ^{n+1}, with all constraints homogenized. Alternatively, classical denominator hierarchies use multipliers such as (1 + ‖x‖²)^k to transfer positivity from the unbounded domain to SOS certificates. Both approaches achieve finite convergence under closedness at infinity, real radicality of the constraint ideal, and strict optimality conditions at all global minimizers, including those at infinity. The resolution of conjectures on finite convergence for denominator-based hierarchies consolidates their role in regularization for unbounded optimization (Huang et al., 2021).
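A small sketch of homogenization on the classical Motzkin polynomial (nonnegative on ℝ² but not a sum of squares; multiplying it by x² + y² yields an SOS, the prototypical denominator certificate): the slice x₀ = 1 recovers the affine polynomial, while x₀ = 0 exposes the leading form that governs behaviour at infinity.

```python
def motzkin(x, y):
    """Motzkin polynomial: nonnegative on R^2 but not a sum of squares."""
    return x**4 * y**2 + x**2 * y**4 - 3 * x**2 * y**2 + 1

def motzkin_homog(x0, x, y):
    """Degree-6 homogenization x0^6 * M(x/x0, y/x0), extended to x0 = 0."""
    return x**4 * y**2 + x**2 * y**4 - 3 * x**2 * y**2 * x0**2 + x0**6

# On the affine chart x0 = 1 we recover the original polynomial ...
assert abs(motzkin_homog(1.0, 0.7, -1.3) - motzkin(0.7, -1.3)) < 1e-12
# ... and the slice x0 = 0 is the leading form x^4 y^2 + x^2 y^4:
print(motzkin_homog(0.0, 2.0, 3.0))  # 2^4 * 3^2 + 2^2 * 3^4 = 468.0
```

Restricting the homogenized polynomial to the compact unit sphere in (x₀, x, y) is what allows Archimedean-style certificates to apply even though the original domain ℝ² is unbounded.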
7. Numerical Stability, Implementation, and Applications
Regularized Moment-SOS hierarchies address both theoretical convergence and practical numerical stability challenges. Recommended practices include:
- Variable scaling to [−1, 1], exploitation of sparsity or symmetry, reliance on high-precision SDP solvers (e.g., MOSEK), and warm-starting from lower-order solutions.
- Monitoring mass and complementarity of localizing matrices for convergence detection.
- Application to unbounded control, polynomial optimization without compactness, combinatorial problems, integer programming, and neural network certification.
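Convergence detection via moment-matrix ranks can be sketched as follows (hypothetical helper; a rank-one, hence flat, moment matrix corresponds to a Dirac minimizing measure whose support point can be read off from the matrix entries):

```python
import numpy as np

def univariate_moment_matrix(point, order):
    """Order-r moment matrix of the Dirac measure at `point` (rank one by construction)."""
    v = np.array([point**k for k in range(order + 1)])
    return np.outer(v, v)

# Flat extension check: rank M_r == rank M_{r-1} certifies finite convergence
# and allows extraction of minimizers from the moment matrix.
M1 = univariate_moment_matrix(0.5, 1)
M2 = univariate_moment_matrix(0.5, 2)
r1 = np.linalg.matrix_rank(M1, tol=1e-9)
r2 = np.linalg.matrix_rank(M2, tol=1e-9)
print(r1, r2)  # 1 1 -> flat: an atomic (here single-atom) measure is detected
```

In solver output the same test is applied to the numerically computed moment matrices, with the rank tolerance chosen relative to the SDP solution accuracy.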
Penalized and spectral regularizations consistently demonstrate enhanced stability—avoiding ill-conditioning, enabling solution of large-scale problems, and providing self-certified lower bounds even where classical hierarchies fail (e.g., Motzkin polynomial, Stengle's and Prestel–Delzell's examples) (Henrion, 5 Dec 2025, Mai et al., 2020, Chen et al., 2021).
In summary, regularized Moment-SOS hierarchies generalize and refine classical SOS and moment schemes, enabling systematic and convergent approaches to polynomial optimization on non-Archimedean, unbounded, and numerically challenging semialgebraic sets. Techniques include analytic penalization, spectral normalization, sparsity and blockwise interpolation, and homogenization, each rigorously justified and empirically validated for scalability, robustness, and fidelity to the underlying optimization problem.