
Sum-of-Squares Framework for Optimization

Updated 27 November 2025
  • The Sum-of-Squares Framework is a method to certify nonnegativity of real polynomials by expressing them as sums of squared polynomials.
  • It leverages algebraic tools like Gram matrix equivalence and pruning techniques (Newton polytope and zero-diagonal methods) to reduce computational complexity.
  • By streamlining SDP formulations, the framework improves performance in polynomial optimization, robust control, and system verification applications.

A sum-of-squares (SOS) framework refers to the computational and mathematical apparatus that certifies the nonnegativity of real polynomials by expressing them as sums of squared polynomials, and translates this algebraic property into efficiently solvable semidefinite programming formulations. This framework is foundational in polynomial optimization, control, robust estimation, and many areas where nonnegativity of polynomials under constraints must be certified or exploited.

1. Algebraic Foundations and Gram Matrix Equivalence

Let $x=(x_1,\ldots,x_n)^T$ and consider a real polynomial

$$p(x) = \sum_{\alpha \in \mathcal{A}} c_\alpha x^\alpha, \qquad c_\alpha \in \mathbb{R},\ \mathcal{A}\subset\mathbb{N}^n \text{ finite}.$$

$p$ is a sum of squares (SOS) if

$$p(x) = \sum_{i=1}^m f_i(x)^2, \qquad f_i\in\mathbb{R}[x].$$

SOS implies $p(x)\ge 0$ for all $x$, but the converse fails in general; the gap between nonnegative and SOS polynomials is the subject of Hilbert's 17th problem. For $p$ of degree $2d$, collect all monomials of degree $\le d$ into a vector $z(x)$ of length $\ell = \binom{n+d}{d}$. Each $f_i(x)=a_i^T z(x)$ with $a_i\in\mathbb{R}^\ell$. The Gram-matrix theorem states:

$$p(x) = z(x)^T Q z(x), \qquad Q \in \mathbb{R}^{\ell \times \ell},\ Q=Q^T,\ Q\succeq 0$$

holds for some such $Q$ if and only if $p$ is SOS. Equating coefficients yields a system of linear equations $A\operatorname{vec}(Q)=b$, which together with the positive semidefiniteness constraint $Q\succeq 0$ defines a linear matrix inequality (LMI) feasibility problem (Seiler et al., 2013).
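The Gram parameterization can be illustrated with a small numerical sketch (the helper names below are illustrative, not toolbox APIs). For $p(x)=(x+1)^4$ with $z(x)=[1,\,x,\,x^2]$, coefficient matching leaves one free parameter in the Gram family; $p$ is SOS because at least one member of that affine family is positive semidefinite.

```python
import numpy as np

# Monomial vector z(x) = [1, x, x^2] for the univariate example
# p(x) = (x + 1)^4 = x^4 + 4x^3 + 6x^2 + 4x + 1  (degree 2d with d = 2).
def z(x):
    return np.array([1.0, x, x**2])

# Equating coefficients of p = z^T Q z fixes most entries and leaves one
# free parameter lam: Q11 = 1, Q12 = 2, Q23 = 2, Q33 = 1, 2*Q13 + Q22 = 6.
def gram(lam):
    return np.array([[1.0, 2.0, lam],
                     [2.0, 6.0 - 2.0 * lam, 2.0],
                     [lam, 2.0, 1.0]])

def is_psd(Q, tol=1e-9):
    return np.linalg.eigvalsh(Q).min() >= -tol

# lam = 1 gives the rank-one Gram matrix of (x^2 + 2x + 1)^2, so p is SOS.
Q = gram(1.0)
assert is_psd(Q)
# lam = 3 gives an indefinite member of the same affine family.
assert not is_psd(gram(3.0))

# Every member of the family reproduces p; PSD-ness is what certifies it.
for x in (-2.0, -0.5, 0.0, 1.7):
    assert abs(z(x) @ Q @ z(x) - (x + 1.0)**4) < 1e-9
```

This also shows why the problem is an LMI feasibility question: the Gram matrix is generally not unique, and an SDP solver searches the affine family $A\operatorname{vec}(Q)=b$ for a PSD member.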

2. Monomial Basis Pruning: Newton Polytope and Zero-Diagonal Algorithms

The computational tractability of SOS programs depends crucially on pruning unnecessary monomials from $z(x)$.
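The motivation is easy to quantify: the full basis has $\ell = \binom{n+d}{d}$ monomials and the Gram matrix has $\ell^2$ entries, so the SDP blows up quickly with $n$ and $d$. A one-line check (function name is illustrative):

```python
from math import comb

# Length of the full monomial vector z(x): all monomials of degree <= d
# in n variables.
def basis_size(n, d):
    return comb(n + d, d)

# The Gram matrix has basis_size(n, d)**2 entries, so even modest n and d
# produce large SDP blocks.
assert basis_size(2, 2) == 6       # bivariate quartics: z has 6 monomials
assert basis_size(10, 5) == 3003   # Gram matrix with ~9 million entries
```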

Newton Polytope Pruning:

Define the Newton polytope $C(p)=\operatorname{conv}\{\alpha:\alpha\in\mathcal{A}\}$ and the reduced polytope $\tfrac{1}{2}C(p)$. Reznick's theorem guarantees that only monomials $x^\beta$ with $\beta\in\tfrac{1}{2}C(p)\cap\mathbb{N}^n$ can appear in any SOS decomposition of $p$. The pruning algorithm:

  1. Construct $z(x)$ from all monomials of degree $\le d$.
  2. Compute the reduced polytope $\tfrac{1}{2}C(p)$ and obtain its half-space representation $\{\beta : H\beta\le g\}$.
  3. Discard $x^\beta$ from $z(x)$ whenever $H\beta\le g$ fails.

However, computing the convex hull $C(p)$ costs $O(|\mathcal{A}|^{\lceil n/2\rceil})$, which is prohibitive for large $|\mathcal{A}|$ (Seiler et al., 2013).
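For the bivariate case the whole procedure fits in a short pure-Python sketch (using Andrew's monotone chain for the hull; all function names are illustrative). It reproduces the example used below: for $p=x_1^2+x_2^2+x_1^4x_2^4$, Newton polytope pruning keeps 4 of the 15 candidate monomials.

```python
def cross(o, a, b):
    # 2D cross product of vectors o->a and o->b.
    return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

def convex_hull(points):
    # Andrew's monotone chain; returns the hull in counter-clockwise order.
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def in_hull(q, hull):
    # q lies inside (or on) a CCW hull iff it is left of every edge.
    n = len(hull)
    return all(cross(hull[i], hull[(i+1) % n], q) >= 0 for i in range(n))

# p = x1^2 + x2^2 + x1^4*x2^4: support A, halved to get (1/2)C(p).
support = [(2, 0), (0, 2), (4, 4)]
hull = convex_hull([(a // 2, b // 2) for a, b in support])

# Candidate monomials: all exponents of degree <= d = 4.
d = 4
candidates = [(a, b) for a in range(d + 1) for b in range(d + 1 - a)]
pruned = [m for m in candidates if in_hull(m, hull)]
print(sorted(pruned))   # [(0, 1), (1, 0), (1, 1), (2, 2)]
```

In higher dimensions the same membership test is usually done with a half-space representation from a dedicated convex-hull code, which is exactly where the exponential cost arises.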

Zero-Diagonal (PSD Property) Pruning:

If $Q\succeq 0$ and $Q_{ii}=0$, then row and column $i$ of $Q$ are zero, so $z_i$ can be removed. The iterative algorithm alternates between detecting equations that force $Q_{ii}=0$ (from the linear system encoding $p=z^T Q z$) and removing the corresponding monomials, rows, and columns from further consideration. This method produces a pruned monomial set $M_\text{new} \subseteq M_\text{NP}$ and can yield strict reductions. For example, for $p=x_1^2+x_2^2+x_1^4 x_2^4$, Newton polytope pruning keeps 4 monomials, while zero-diagonal pruning reduces this to 3 (Seiler et al., 2013). The complexity is $O(\ell^2)$, much faster than Newton polytope convex-hull construction in high dimensions.
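A minimal sketch of the zero-diagonal idea (function name illustrative, not the toolbox routine): a diagonal entry $Q_{ii}$, belonging to monomial $x^\beta$, is forced to zero when the coefficient of $x^{2\beta}$ in $p$ is zero and $2\beta$ has no off-diagonal representation $\beta_j+\beta_k$ with $j\ne k$ in the current basis. Each sweep removes all such monomials simultaneously and the process repeats to a fixed point; on the example above it shrinks the basis to 3 monomials.

```python
def zero_diagonal_prune(coeffs, basis):
    """Iteratively remove exponents beta whose diagonal Gram entry is
    forced to zero: coeff of x^(2*beta) is zero and 2*beta has no
    off-diagonal representation u + v (u != v) within the basis."""
    basis = set(basis)
    changed = True
    while changed:
        changed = False
        removable = set()
        for beta in basis:
            gamma = tuple(2 * b for b in beta)
            if coeffs.get(gamma, 0) != 0:
                continue   # diagonal entry not forced to zero
            # Is there any u != v in the basis with u + v == gamma?
            off_diag = any(
                tuple(g - a for g, a in zip(gamma, u)) in basis
                and tuple(g - a for g, a in zip(gamma, u)) != u
                for u in basis)
            if not off_diag:
                removable.add(beta)
        if removable:
            basis -= removable
            changed = True
    return basis

# p = x1^2 + x2^2 + x1^4*x2^4, starting from all monomials of degree <= 4.
coeffs = {(2, 0): 1, (0, 2): 1, (4, 4): 1}
start = {(a, b) for a in range(5) for b in range(5 - a)}
final = zero_diagonal_prune(coeffs, start)
print(sorted(final))   # [(0, 1), (1, 0), (2, 2)]
```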

3. Generalization to SOS Programming in Polynomial Optimization

An SOS program has the form:

$$\min_{u\in\mathbb{R}^r} c^T u \quad \text{s.t.}\quad a_k(x,u)\in\Sigma[x],\ k=1,\ldots,N$$

where each $a_k(x,u)$ is affine in $u$ and $\Sigma[x]$ denotes the cone of SOS polynomials. Each constraint introduces a Gram matrix $Q_k$ and a reduced monomial basis via the zero-diagonal or Newton polytope pruning procedures. The aggregate system is encoded as an SDP in the concatenated variable $y=[u; \operatorname{vec}(Q_1);\ldots;\operatorname{vec}(Q_N)]$ subject to the feasibility constraints $Ay = b$ and $Q_k\succeq 0$.
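A toy instance makes the structure concrete (this is a hand-rolled sketch, not how a toolbox solves it; real SOS programs go to an SDP solver). Take the program "maximize $\gamma$ subject to $x^2-2x-\gamma\in\Sigma[x]$": with $z=[1,\,x]$, coefficient matching forces a single Gram matrix depending affinely on the decision variable, so the SOS constraint reduces to a PSD condition that can be bisected over one scalar.

```python
import numpy as np

# Toy SOS program: maximize gamma s.t. x^2 - 2x - gamma is SOS.
# With z = [1, x], coefficient matching forces the single Gram matrix
#   Q(gamma) = [[-gamma, -1], [-1, 1]],
# so the SOS constraint is exactly Q(gamma) being PSD. An SDP solver
# handles the general case; bisection suffices for one scalar variable.
def feasible(gamma, tol=1e-12):
    Q = np.array([[-gamma, -1.0], [-1.0, 1.0]])
    return np.linalg.eigvalsh(Q).min() >= -tol

lo, hi = -10.0, 0.0            # feasible at lo, infeasible at hi
for _ in range(60):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if feasible(mid) else (lo, mid)

# The optimum matches min_x (x^2 - 2x) = -1 attained at x = 1.
print(round(lo, 6))   # -1.0
```

The pruning procedures act on exactly this structure: shrinking each $z$ shrinks each $Q_k$, and hence the SDP that the solver sees.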

The iterative pruning approach extends to automatically eliminate unnecessary monomials and even free variables $u_j$ that are forced to zero (Seiler et al., 2013).

Performance experiments show monomial reductions up to 50%, reductions in free scalar variables by 10–20%, and total solver time reductions of 30–70% on benchmark SOS programs, with improved numerical stability.

4. Algorithmic and Numerical Impact

Reducing the monomial basis shrinks the Gram matrices $Q_k$, yielding smaller SDP blocks and fewer decision variables, which directly lowers both memory and computational burden. Preprocessing scales as $O(\ell^2)$ per constraint for the zero-diagonal method, compared to exponential scaling in $n$ for Newton polytope convex-hull methods.

Automatic detection and elimination of zero-valued variables or monomials enhances problem conditioning and can prevent solver warnings or numerical issues (Seiler et al., 2013). Implementations are available in prominent software toolboxes such as SOSOPT and SOSTOOLS, enabling practical solution of high-dimensional or higher-degree problems in systems analysis, control, and polynomial optimization.

5. Integration and Practical Toolbox Implementation

The described simplification and pruning techniques are integral to modern SOS optimization toolchains. They underpin the ability to solve SDPs arising from SOS relaxations of polynomial optimization problems, particularly in higher dimensions or with higher-degree polynomials, where brute-force enumeration of all monomials becomes computationally infeasible.

Toolboxes such as SOSOPT and SOSTOOLS provide options to invoke these simplification procedures automatically. Their adoption has been critical in making SOS approaches tractable for practitioners in robust control, system verification, and nonconvex polynomial optimization (Seiler et al., 2013).


The sum-of-squares framework thus consists of a hierarchy of certificate-construction and basis-reduction methods, all ultimately aiming to encode the existence of an SOS decomposition as a tractable SDP or LMI, and supporting this transformation with scalable preprocessing algorithms that exploit polyhedral and algebraic structure to minimize computational overhead (Seiler et al., 2013).
