Sum-of-Squares Framework for Optimization
- The Sum-of-Squares Framework is a method to certify nonnegativity of real polynomials by expressing them as sums of squared polynomials.
- It leverages algebraic tools like Gram matrix equivalence and pruning techniques (Newton polytope and zero-diagonal methods) to reduce computational complexity.
- By streamlining SDP formulations, the framework improves performance in polynomial optimization, robust control, and system verification applications.
A sum-of-squares (SOS) framework refers to the computational and mathematical apparatus that certifies the nonnegativity of real polynomials by expressing them as sums of squared polynomials, and translates this algebraic property into efficiently solvable semidefinite programming formulations. This framework is foundational in polynomial optimization, control, robust estimation, and many areas where nonnegativity of polynomials under constraints must be certified or exploited.
1. Algebraic Foundations and Gram Matrix Equivalence
Let $x = (x_1, \ldots, x_n) \in \mathbb{R}^n$ and consider a real polynomial $p(x) = \sum_{\alpha} c_\alpha x^\alpha$, where $x^\alpha = x_1^{\alpha_1} \cdots x_n^{\alpha_n}$.
$p$ is a sum of squares (SOS) if $p = \sum_{i=1}^{k} f_i^2$ for some polynomials $f_1, \ldots, f_k \in \mathbb{R}[x]$.
SOS implies $p(x) \geq 0$ for all $x \in \mathbb{R}^n$, but the converse fails outside special cases (the Motzkin polynomial is a standard counterexample; Hilbert's 17th problem concerns the weaker representation by squares of rational functions). For $p$ of degree $2d$, collect all monomials of degree at most $d$ into a vector $z(x)$ of length $\binom{n+d}{d}$, with entries $z_i(x) = x^{\alpha_i}$, $|\alpha_i| \leq d$. The Gram-matrix theorem states: there exists a symmetric matrix $Q \succeq 0$ with $p(x) = z(x)^T Q z(x)$
if and only if $p$ is SOS. Equating coefficients of $p$ and $z^T Q z$ yields a system of linear equations $A\,\mathrm{vec}(Q) = b$, which together with the positive semidefiniteness constraint $Q \succeq 0$ defines a linear matrix inequality (LMI) feasibility problem (Seiler et al., 2013).
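As a concrete instance of the Gram-matrix formulation, the following sketch checks a certificate for $p(x) = x^4 + 2x^2 + 1$ with $z(x) = [1, x, x^2]$. It uses plain NumPy, and the matrix $Q$ is written down by hand rather than found by an SDP solver:

```python
import numpy as np

# Candidate Gram matrix for p(x) = x^4 + 2x^2 + 1 with z(x) = [1, x, x^2].
# Matching coefficients of 1, x, x^2, x^3, x^4 in z(x)^T Q z(x) = p(x) gives:
#   Q[0,0] = 1,  2*Q[0,1] = 0,  2*Q[0,2] + Q[1,1] = 2,  2*Q[1,2] = 0,  Q[2,2] = 1
Q = np.array([
    [1.0, 0.0, 1.0],
    [0.0, 0.0, 0.0],
    [1.0, 0.0, 1.0],
])

# Verify the coefficient matching numerically at sample points.
for x in np.linspace(-2.0, 2.0, 9):
    z = np.array([1.0, x, x**2])
    assert abs(z @ Q @ z - (x**4 + 2*x**2 + 1)) < 1e-9

# Q is positive semidefinite, so p is SOS; factoring Q = L^T L
# recovers the decomposition p(x) = (x^2 + 1)^2.
eigvals = np.linalg.eigvalsh(Q)
print("min eigenvalue of Q:", eigvals.min())  # nonnegative => SOS certificate
```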
2. Monomial Basis Pruning: Newton Polytope and Zero-Diagonal Algorithms
The computational tractability of SOS programs depends crucially on pruning unnecessary monomials from the vector $z(x)$.
Newton Polytope Pruning:
Define the Newton polytope $N(p) = \mathrm{conv}\{\alpha : c_\alpha \neq 0\}$. The reduced polytope is $\tfrac{1}{2}N(p) = \{\tfrac{1}{2}\alpha : \alpha \in N(p)\}$. Reznick's theorem guarantees that only monomials $x^\alpha$ with $\alpha \in \tfrac{1}{2}N(p)$ can appear in any SOS decomposition. The pruning algorithm:
- Construct the exponent vector $\alpha$ for each candidate monomial up to degree $d$.
- Compute $\tfrac{1}{2}N(p)$ and obtain its half-space representation $\{y : Hy \leq g\}$.
- Discard $x^\alpha$ from $z(x)$ if $H\alpha \leq g$ fails.
However, the cost of computing the convex hull grows exponentially with the number of variables $n$ in the worst case, which is prohibitive for large $n$ (Seiler et al., 2013).
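For intuition, the pruning steps above can be sketched in pure Python for $n = 2$, where the half-space test reduces to cross-product sign checks against a 2D convex hull (illustrative only; toolbox implementations handle general $n$). The input here is the Motzkin polynomial $x^4 y^2 + x^2 y^4 - 3x^2 y^2 + 1$, and membership $\alpha \in \tfrac{1}{2}N(p)$ is tested as $2\alpha \in N(p)$ to stay in integer arithmetic:

```python
def cross(o, a, b):
    """2D cross product of vectors (a - o) and (b - o)."""
    return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

def convex_hull(points):
    """Monotone-chain convex hull of 2D integer points, CCW vertex order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def in_hull(hull, q):
    """Point-in-convex-polygon test; the boundary counts as inside."""
    if len(hull) == 1:
        return q == hull[0]
    if len(hull) == 2:
        a, b = hull
        return (cross(a, b, q) == 0
                and min(a[0], b[0]) <= q[0] <= max(a[0], b[0])
                and min(a[1], b[1]) <= q[1] <= max(a[1], b[1]))
    return all(cross(hull[i], hull[(i+1) % len(hull)], q) >= 0
               for i in range(len(hull)))

# Exponents of the Motzkin polynomial x^4 y^2 + x^2 y^4 - 3 x^2 y^2 + 1.
exponents = [(4, 2), (2, 4), (2, 2), (0, 0)]
hull = convex_hull(exponents)          # N(p) = conv{(0,0), (4,2), (2,4)}

# Candidate monomials: all exponents of degree <= d = 3 (half of deg p = 6).
candidates = [(i, j) for i in range(4) for j in range(4) if i + j <= 3]
# alpha lies in (1/2) N(p)  iff  2*alpha lies in N(p).
kept = [a for a in candidates if in_hull(hull, (2*a[0], 2*a[1]))]
print(kept)  # → [(0, 0), (1, 1), (1, 2), (2, 1)]
```

Of the ten candidate monomials, only four survive, so the Gram matrix shrinks from $10 \times 10$ to $4 \times 4$ before any SDP is solved.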
Zero-Diagonal (PSD Property) Pruning:
If $Q \succeq 0$ and $Q_{ii} = 0$, then row $i$ and column $i$ of $Q$ are zero, so the monomial $z_i$ can be removed. The iterative algorithm alternates between detecting equations of the form $Q_{ii} = 0$ (arising from the linear system that equates the coefficients of $p$ and $z^T Q z$) and removing the corresponding monomials, rows, and columns from further consideration. This method produces a pruned monomial vector and can yield strict reductions relative to Newton polytope pruning: on an example polynomial in (Seiler et al., 2013), Newton polytope pruning keeps 4 monomials while zero-diagonal pruning reduces this to 3. Its cost is polynomial in the number of candidate monomials, much faster than Newton polytope convex-hull construction in high dimensions.
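A minimal sketch of the zero-diagonal idea follows (illustrative only; the published algorithm also propagates general linear equalities and handles SOS programs with free decision variables). If $(i, i)$ is the only index pair with $\alpha_j + \alpha_k = 2\alpha_i$, coefficient matching forces $Q_{ii} = c_{2\alpha_i}$; when that coefficient is zero, $Q \succeq 0$ forces row and column $i$ to vanish:

```python
def zero_diagonal_prune(coeffs, monos):
    """Iteratively drop monomials whose Gram diagonal entry is forced to zero.

    coeffs: dict mapping exponent tuples of p to its nonzero coefficients.
    monos:  list of candidate exponent tuples for the monomial vector z(x).
    """
    monos = list(monos)
    changed = True
    while changed:
        changed = False
        for a in list(monos):
            target = tuple(2 * t for t in a)
            # All index pairs (u, v) whose exponents sum to 2*alpha_i.
            pairs = [(u, v) for u in monos for v in monos
                     if tuple(s + t for s, t in zip(u, v)) == target]
            if pairs == [(a, a)] and coeffs.get(target, 0) == 0:
                monos.remove(a)   # Q_ii = 0 is forced; prune z_i and iterate
                changed = True
    return monos

# p(x) = x^4 + x^2: the constant monomial is pruned, since Q_00 must equal
# the (zero) constant coefficient of p, leaving z(x) = [x, x^2].
print(zero_diagonal_prune({(4,): 1, (2,): 1}, [(0,), (1,), (2,)]))
# → [(1,), (2,)]
```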
3. Generalization to SOS Programming in Polynomial Optimization
An SOS program has the form:
$$\min_{u \in \mathbb{R}^m} c^T u \quad \text{subject to} \quad p_j(x; u) \in \Sigma[x], \quad j = 1, \ldots, N,$$
where each $p_j(x; u)$ is affine in the decision variable $u$, and $\Sigma[x]$ denotes the cone of SOS polynomials. Each constraint introduces a Gram matrix $Q_j$ and a reduced monomial basis $z_j(x)$ via the zero-diagonal or Newton polytope pruning procedures. The aggregate system is encoded as an SDP in the concatenated variable $(u, Q_1, \ldots, Q_N)$, subject to the linear feasibility conditions $A_j\,\mathrm{vec}(Q_j) = b_j(u)$ with $Q_j \succeq 0$ for each $j$.
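To make the encoding concrete, the following sketch hand-assembles the coefficient-matching system $A\,\mathrm{vec}(Q) = b$ for a single univariate SOS constraint, $p(x) = x^4 + 4x^3 + 6x^2 + 4x + 1 = (x+1)^4$ with $z(x) = [1, x, x^2]$. No solver is invoked; a real toolbox would hand $\{A, b, Q \succeq 0\}$ to an SDP solver instead of guessing $Q$:

```python
import numpy as np

degs = [0, 1, 2]          # exponents of the monomial vector z = [1, x, x^2]
A = np.zeros((5, 9))      # rows: coefficients of x^0..x^4; columns: vec(Q)
for i, di in enumerate(degs):
    for j, dj in enumerate(degs):
        A[di + dj, 3 * i + j] = 1.0   # Q_ij feeds the coefficient of x^(di+dj)
b = np.array([1.0, 4.0, 6.0, 4.0, 1.0])   # coefficients of (x + 1)^4

# A rank-one PSD solution: Q = v v^T with v = [1, 2, 1], i.e. f(x) = (x+1)^2,
# so p = f^2 is certified SOS. An SDP solver would search for such a Q.
v = np.array([1.0, 2.0, 1.0])
Q = np.outer(v, v)
assert np.allclose(A @ Q.flatten(), b)
assert np.linalg.eigvalsh(Q).min() >= -1e-9
```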
The iterative pruning approach can be extended to automatically eliminate unnecessary monomials and even free variables that are forced to zero (Seiler et al., 2013).
Performance experiments show monomial reductions up to 50%, reductions in free scalar variables by 10–20%, and total solver time reductions of 30–70% on benchmark SOS programs, with improved numerical stability.
4. Algorithmic and Numerical Impact
Reducing the monomial basis shrinks the Gram matrices $Q_j$, resulting in smaller SDP blocks and fewer decision variables, directly lowering both memory and computational burden. For the zero-diagonal method, preprocessing cost is polynomial in the number of candidate monomials per constraint, compared to the worst-case exponential scaling in $n$ of Newton polytope convex-hull methods.
Automatic detection and elimination of zero-valued variables or monomials enhances problem conditioning and can prevent solver warnings or numerical issues (Seiler et al., 2013). Implementations are available in prominent software toolboxes such as SOSOPT and SOSTOOLS, enabling practical solution of high-dimensional or higher-degree problems in systems analysis, control, and polynomial optimization.
5. Integration and Practical Toolbox Implementation
The described simplification and pruning techniques are integral to modern SOS optimization toolchains. They underpin the ability to solve SDPs arising from SOS relaxations of polynomial optimization problems, particularly in higher dimensions or with higher-degree polynomials, where brute-force enumeration of all monomials becomes computationally infeasible.
Toolboxes such as SOSOPT and SOSTOOLS provide options to invoke these simplification procedures automatically. Their adoption has been critical in making SOS approaches tractable for practitioners in robust control, system verification, and nonconvex polynomial optimization (Seiler et al., 2013).
The sum-of-squares framework thus consists of a hierarchy of certificate-construction and basis-reduction methods, all ultimately aiming to encode the existence of an SOS decomposition as a tractable SDP or LMI, and supporting this transformation with scalable preprocessing algorithms that exploit polyhedral and algebraic structure to minimize computational overhead (Seiler et al., 2013).