Sum of Squares Programming

Updated 19 January 2026
  • Sum of squares programming is a convex optimization framework that certifies the nonnegativity of multivariate polynomials by expressing them as sums of squares.
  • It employs Gram matrix representations and linear matrix inequalities to transform hard nonconvex constraints into tractable semidefinite programs.
  • Advanced techniques such as Newton polytope pruning and sparse basis selection enhance its scalability in control, optimization, and verification applications.

Sum of squares (SOS) programming is a structured convex optimization framework for certifying nonnegativity of multivariate polynomials by expressing them as sums of squares, and for using this property to formulate and solve convex relaxations of nonconvex problems. The central insight is that the algebraic condition "p(x) is a sum of squares" is sufficient, but not necessary, for the hard constraint "p(x) ≥ 0 for all x," and that deciding whether a polynomial admits an SOS representation can be reduced to a semidefinite program (SDP). SOS programming is foundational in nonlinear systems control, polynomial optimization, program analysis, robustness analysis, and beyond, providing practical machinery for handling infinite-dimensional or hard combinatorial constraints via convex optimization over matrices.

1. Algebraic Foundations: SOS and Gram-Matrix Characterization

Let $p(x)$ be a real multivariate polynomial in $x \in \mathbb{R}^n$ of degree $2d$. $p$ is called a sum of squares (SOS) if there exist polynomials $f_1(x), \dots, f_k(x)$ such that $p(x) = \sum_{i=1}^k [f_i(x)]^2$. Classical results (Parrilo 2000; Choi–Lam–Reznick; Lasserre) establish that $p(x)$ is SOS if and only if there exists a symmetric, positive semidefinite (PSD) matrix $Q \succeq 0$ such that

$$p(x) = z(x)^\top Q z(x),$$

where $z(x)$ is the vector of all monomials of degree up to $d$ in $x$. The monomials in $z(x)$ index the Gram matrix $Q$. Matching the coefficients of $p(x)$ with those of $z(x)^\top Q z(x)$ yields a linear system in the entries of $Q$, and the requirement $Q \succeq 0$ is an LMI constraint. Thus, verifying or searching for an SOS decomposition is equivalent to solving an SDP (see (Summers et al., 2012, Seiler et al., 2013, Cifuentes et al., 2018)).
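
The Gram-matrix characterization can be illustrated with a small hand-worked example (an illustrative sketch, not produced by any particular solver): for $p(x) = x^4 + 2x^2 + 1$ and $z(x) = (1, x, x^2)$, the rank-one matrix $Q = vv^\top$ with $v = (1, 0, 1)$ is PSD by construction, and $z(x)^\top Q z(x) = (x^2 + 1)^2$ exhibits $p$ as a single square.

```python
# Sketch: verify a hand-constructed Gram decomposition for
# p(x) = x^4 + 2x^2 + 1 with monomial vector z(x) = (1, x, x^2).
# Q = v v^T with v = (1, 0, 1) is PSD by construction, and
# z(x)^T Q z(x) = (x^2 + 1)^2 recovers p exactly.

def p(x):
    return x**4 + 2 * x**2 + 1

def z(x):
    return [1.0, x, x**2]

v = [1.0, 0.0, 1.0]
Q = [[vi * vj for vj in v] for vi in v]   # rank-1 PSD Gram matrix

def gram_form(x):
    """Evaluate z(x)^T Q z(x)."""
    zx = z(x)
    return sum(Q[i][j] * zx[i] * zx[j] for i in range(3) for j in range(3))

# The Gram form reproduces p at every sample point.
for t in [-2.0, -0.5, 0.0, 1.0, 3.0]:
    assert abs(gram_form(t) - p(t)) < 1e-9
```

In a real SOS program the entries of $Q$ are decision variables of an SDP rather than fixed by hand; this sketch only checks one known certificate.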

2. Polynomial Positivity on Semialgebraic Sets and SOS-S-procedure

Testing $p(x) \geq 0$ on $\mathbb{R}^n$ is NP-hard, but the SOS relaxation (the existence of a PSD Gram matrix) provides a tractable sufficient condition. When positivity is required only on a semialgebraic set $\mathcal{R} = \{x \mid g_1(x) \geq 0, \dots, g_m(x) \geq 0\}$, the SOS S-procedure seeks SOS multiplier polynomials $\lambda_j(x)$ such that

$$p(x) - \sum_{j=1}^m \lambda_j(x) g_j(x) \in \Sigma[x], \quad \lambda_j(x) \in \Sigma[x].$$

By the Gram-matrix characterization, all such constraints reduce to a structured SDP (see (Summers et al., 2012, Lin et al., 2023)). This approach is critical in safety verification and constrained optimization, where functional (infinite-dimensional) constraints are replaced by LMI lifts in a finite basis.
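
A minimal S-procedure certificate can be checked by hand (an illustrative sketch with hand-chosen multipliers, not solver output): $p(x) = x^3$ is nonnegative on $\{x : x \geq 0\}$ because the SOS multiplier $\lambda(x) = x^2$ makes the residual $p - \lambda g$ identically zero, which is trivially in $\Sigma[x]$.

```python
# Sketch: an S-procedure certificate that p(x) = x^3 is nonnegative
# on the set {x : g(x) >= 0} with g(x) = x.  The SOS multiplier
# lambda(x) = x^2 makes the residual p - lambda*g identically zero,
# which is (trivially) a sum of squares.

def p(x):        return x**3
def g(x):        return x
def lam(x):      return x**2          # SOS multiplier: lam = (x)^2
def residual(x): return p(x) - lam(x) * g(x)

# The residual vanishes identically, so it lies in Sigma[x]; hence
# p(x) = lam(x)*g(x) + residual(x) >= 0 wherever g(x) >= 0.
for t in [-3.0, -1.0, 0.0, 0.5, 2.0]:
    assert abs(residual(t)) < 1e-12
assert all(p(t) >= 0 for t in [0.0, 0.1, 1.0, 10.0])
```

In general the multipliers are unknown and the SDP searches over their Gram matrices jointly with that of the residual.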

3. Efficient Representations and Model Reduction

The computational burden of general SOS programming depends crucially on the choice and number of monomials: the basis size grows combinatorially with $n$ and $d$. Recent work addresses this scaling via:

  • Newton Polytope Pruning: Given $p(x)$, one keeps only those monomials $x^\alpha$ in $z(x)$ for which $\alpha$ lies in the shrunken Newton polytope $\tfrac{1}{2}C(p)$, substantially reducing the problem dimension (Seiler et al., 2013).
  • Zero-Diagonal (PSD-Based) Pruning: If the SDP encoding forces a diagonal Gram entry $Q_{ii} = 0$, the associated monomial can be eliminated, offering even tighter reductions and faster preprocessing than convex hull–based selection (Seiler et al., 2013).
  • Multivariate and Quotient-Ring Bases: In quotient-ring constructions, $z(x)$ is the monomial basis modulo an ideal $I$, or is chosen with respect to the support of the polynomial on a variety, reflecting problem-specific structure (Cifuentes et al., 2018, Cifuentes et al., 2015).
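
In the univariate case the Newton polytope of $p$ is just the interval spanned by the degrees in its support, so the pruning rule reduces to a simple membership test (a minimal sketch of the idea; the multivariate case requires a genuine convex-hull computation):

```python
# Sketch of Newton-polytope pruning in the univariate case, where the
# Newton polytope of p is the interval [min deg, max deg] of its support.
# A monomial x^k may appear in z(x) only if 2k lies in that interval,
# i.e. k lies in the half-polytope [min/2, max/2].

import math

def pruned_monomial_degrees(support):
    """Degrees k kept in z(x) for a polynomial with the given support."""
    lo, hi = min(support), max(support)
    return [k for k in range(hi // 2 + 1)
            if math.ceil(lo / 2) <= k <= hi // 2]

# p(x) = x^6 + x^4 + x^2 has support {2, 4, 6}: the half-polytope is
# [1, 3], so the constant monomial 1 is pruned from z(x).
assert pruned_monomial_degrees([2, 4, 6]) == [1, 2, 3]

# p(x) = x^6 + 1 has support {0, 6}: no monomial can be pruned.
assert pruned_monomial_degrees([0, 6]) == [0, 1, 2, 3]
```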

4. SOS Programming in Control, Optimization, and Verification

SOS methods have been widely adopted in polynomial control synthesis, Lyapunov and barrier certificate construction, reachability analysis, and program verification:

  • Approximate Dynamic Programming (ADP): The Bellman residual $R(x, u; \alpha)$ is forced to be SOS (possibly subject to constraints), leading to offline SDPs for value-function approximation and online convex policy evaluation (Summers et al., 2012).
  • Safety Verification and Controller Synthesis: Via the S-procedure, safety constraints are encoded as SOS with polynomial multipliers, yielding SDPs certifying forward invariance or enabling synthesis of safe (potentially secondary) controllers (Lin et al., 2023, Zhao et al., 2022, Shakhesi et al., 27 Apr 2025).
  • SOS in Nonlinear Systems: Time-varying and time-invariant Lyapunov certificates for finite-time invariance or region of attraction computation are constructed by enforcing SOS conditions on the derivative of the Lyapunov function along closed-loop trajectories (Tobenkin et al., 2010).
  • Software Pipelines: Computational environments such as Macaulay2 SumsOfSquares, SOSTOOLS, YALMIP, and specialized first-order methods (e.g., SOSADMM) automate Gram-matrix-based SDPs with solvers like MOSEK, SDPA, and SeDuMi (Cifuentes et al., 2018, Zheng et al., 2017).
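
The Lyapunov-type conditions above can be illustrated with a certificate checked by hand (a sketch for an assumed toy system, not the output of SOSTOOLS or any cited work): for $\dot{x}_1 = -x_1^3$, $\dot{x}_2 = -x_2$ and $V(x) = x_1^2 + x_2^2$, the quantity $-\dot{V} = 2x_1^4 + 2x_2^2$ has the explicit SOS decomposition $(\sqrt{2}\,x_1^2)^2 + (\sqrt{2}\,x_2)^2$.

```python
# Sketch: verifying (by hand, not by an SDP solver) the kind of SOS
# Lyapunov condition that tools such as SOSTOOLS automate.  For the
# toy system dx1/dt = -x1^3, dx2/dt = -x2 and V(x) = x1^2 + x2^2,
# -dV/dt = 2*x1^4 + 2*x2^2, which is the explicit sum of squares
# (sqrt(2)*x1^2)^2 + (sqrt(2)*x2)^2, certifying dV/dt <= 0 everywhere.

import math

def V(x1, x2):
    return x1**2 + x2**2

def minus_Vdot(x1, x2):
    # -dV/dt = -(2*x1*(-x1**3) + 2*x2*(-x2))
    return 2 * x1**4 + 2 * x2**2

def sos_decomposition(x1, x2):
    return (math.sqrt(2) * x1**2) ** 2 + (math.sqrt(2) * x2) ** 2

# The two expressions agree at sample points, as the algebra predicts.
for a, b in [(-1.5, 2.0), (0.0, 0.0), (0.3, -0.7), (2.0, 1.0)]:
    assert abs(minus_Vdot(a, b) - sos_decomposition(a, b)) < 1e-9
```

A solver-based pipeline would instead treat the coefficients of $V$ as decision variables and search for a Gram matrix making $-\dot{V}$ SOS.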

5. Advanced Applications, Hierarchies, and Theoretical Guarantees

SOS programming underpins a range of structured relaxations and hierarchies, as well as applications in non-convex polynomial optimization and algebraic geometry:

  • Positivstellensatz Certificates: The emptiness of semialgebraic sets (e.g., in safety index synthesis) can be certified by constructing SOS representations (or their duals) via the Positivstellensatz (Schmüdgen, Putinar) (Zhao et al., 2022, Zhao et al., 2021).
  • Sparse and Structured SOS, Matrix Completion: For quadratic forms on a real variety determined by a sparse ideal (e.g., monomial ideals associated to a graph), SOS decompositions are related to positive semidefinite matrix completion and clique decompositions (Blekherman et al., 2020).
  • Optimization Hierarchies: Hierarchical SOS relaxations (e.g., Lasserre, Parrilo schemes) systematically tighten outer and inner approximations of nonnegative polynomials and polynomial optimization over algebraic varieties (Cifuentes et al., 2015, Adjé et al., 2014).
  • Low-Rank Reductions: When matrices or maps are low rank, one can reduce the ambient dimension or exploit product structure in the associated SDPs (Legat et al., 2017).

The rigorous theoretical foundation of SOS programming guarantees that any SOS certificate, when found, is an explicit, efficiently checkable proof of polynomial nonnegativity or of a safety property, sound up to numerical tolerances. Asymptotic completeness holds in the limit of increasing basis or hierarchy order under standard conditions, and empirical studies confirm that the conservatism of the relaxations is often mild in practical polynomial and control problems.

6. Numerical Implementation, Complexity, and Practical Considerations

The practical deployment of SOS programming relies on tractable SDP formulations and efficient solvers. The Gram matrix has side length $\binom{n+d}{d}$ (the number of monomials of degree at most $d$ in $n$ variables), so the computational cost rises rapidly with $n$ and $d$. Advances in algorithmic sparsification (row-sparsity exploitation, chordal decompositions, first-order ADMM implementations), monomial selection, and basis pursuit enable larger problem instances and better scalability (Seiler et al., 2013, Zheng et al., 2017). Rational postprocessing, Cholesky/LDL$^\top$ factorization for explicit decompositions, and DSOS/SDSOS (LP/SOCP inner approximations) further bridge the gap between computational efficiency and certificate strength (Ahmadi et al., 2015, Cifuentes et al., 2018).
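
The growth of the monomial basis is easy to quantify directly (a short sketch using the standard binomial count):

```python
# Sketch: growth of the Gram-matrix side length binom(n+d, d),
# the number of monomials of degree <= d in n variables.

from math import comb

def gram_side(n, d):
    return comb(n + d, d)

assert gram_side(2, 2) == 6      # monomials 1, x, y, x^2, xy, y^2
assert gram_side(10, 3) == 286
# Doubling the degree in 10 variables inflates the basis sharply:
assert gram_side(10, 6) == 8008
```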

The table below summarizes the principal SOS-to-SDP translation:

| Step | Mathematical Object | SDP/Algebraic Representation |
|---|---|---|
| SOS polynomial $p(x)$ | $p(x) = \sum_i f_i^2(x)$ | $p(x) = z^\top Q z$, $Q \succeq 0$ |
| Positivity on semialgebraic $\mathcal{R}$ | $p(x) \geq 0$ for $x \in \mathcal{R}$ | $p(x) - \sum_j \lambda_j(x) g_j(x) \in \Sigma[x]$ |
| Nonnegativity on variety $V$ | $p(x) \geq 0$ for $x \in V$ | $p(x) \equiv q(x) \bmod I(V)$ with $q \in \Sigma[x]$ |

All matching of polynomial coefficients and PSD constraints can be reduced to (possibly large) finite-dimensional SDPs.
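
The coefficient-matching step can be made concrete for a small instance (an illustrative sketch of how the SDP data are assembled, not any particular library's API):

```python
# Sketch of the coefficient-matching step that turns an SOS test into
# SDP data.  With z(x) = (1, x, x^2), the entry Q[i][j] multiplies
# x^(i+j), so each coefficient of p imposes one linear equation on Q.

from collections import defaultdict

degrees = [0, 1, 2]                  # exponents of z(x) = (1, x, x^2)
matching = defaultdict(list)         # degree -> contributing Q entries
for i in degrees:
    for j in degrees:
        matching[i + j].append((i, j))

# p(x) = x^4 + 2x^2 + 1: each coefficient constrains a sum of Q entries.
coeffs = {0: 1.0, 1: 0.0, 2: 2.0, 3: 0.0, 4: 1.0}

# E.g. the x^2 coefficient requires Q[0][2] + Q[1][1] + Q[2][0] = 2;
# the SDP then searches for Q >= 0 satisfying all such equations.
assert matching[2] == [(0, 2), (1, 1), (2, 0)]
assert len(matching[4]) == 1         # only Q[2][2] reaches x^4
```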

7. Perspectives and Limitations

SOS programming provides a unifying algebraic-optimization paradigm that is both theoretically expressive and practical. It bridges classical real algebraic geometry and modern convex optimization, giving constructive witnesses (Gram matrix certificates) for positivity problems that pervade optimization, control, and formal verification. Computational limitations in scaling remain, especially for high-dimensional or high-degree cases, but continued advances in sparsity exploitation, bespoke SDP algorithms, and problem-specific monomial reduction continue to expand its practical reach (Seiler et al., 2013, Blekherman et al., 2020, Zheng et al., 2017). The relaxation is not tight for nonnegative polynomials that are not SOS (a gap known since Hilbert's 1888 classification, with the Motzkin polynomial as the classical counterexample), but across a broad array of structured problems of moderate size, SOS techniques yield rigorous, computationally verifiable guarantees.
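
The gap between nonnegativity and SOS is witnessed by the Motzkin polynomial, which is nonnegative everywhere (by the AM-GM inequality applied to $x^4y^2$, $x^2y^4$, and $1$) yet admits no SOS decomposition; a quick numerical sketch:

```python
# Sketch: the Motzkin polynomial m(x, y) = x^4 y^2 + x^2 y^4 - 3 x^2 y^2 + 1
# is nonnegative everywhere (by AM-GM on x^4 y^2, x^2 y^4, 1) yet has no
# SOS decomposition, illustrating that the SOS relaxation is not tight.

def motzkin(x, y):
    return x**4 * y**2 + x**2 * y**4 - 3 * x**2 * y**2 + 1

# Nonnegative on a sample grid, with minimum value 0 at |x| = |y| = 1.
samples = [(0.1 * i - 2.0, 0.1 * j - 2.0)
           for i in range(41) for j in range(41)]
assert all(motzkin(x, y) >= -1e-12 for x, y in samples)
assert abs(motzkin(1.0, 1.0)) < 1e-12
assert abs(motzkin(-1.0, 1.0)) < 1e-12
```

Grid sampling of course only suggests nonnegativity; the AM-GM argument proves it, while the non-existence of an SOS certificate is a classical algebraic fact.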
