
Sparse Copositive Polynomial Optimization

Published 31 Mar 2026 in math.OC | (2604.00180v1)

Abstract: This paper studies the copositive optimization problem whose objective is a sparse polynomial, with linear constraints over the nonnegative orthant. We propose sparse Moment-SOS relaxations to solve it. Necessary and sufficient conditions are shown for these relaxations to be tight. In particular, we prove they are tight under the cop-SOS convexity assumption. Compared to the traditional dense ones, the sparse Moment-SOS relaxations are more computationally efficient. Numerical experiments are given to show the efficiency.

Summary

  • The paper introduces a sparse Moment-SOS relaxation hierarchy that exploits structural sparsity in both the objective and constraints for efficient global minimization.
  • It establishes necessary and sufficient conditions for tightness, including flat truncation tests and cop-SOS convexity certificates, enabling extraction of global optimizers.
  • Numerical experiments validate significant efficiency gains over dense methods, demonstrating the approach’s applicability to large-scale, structured optimization problems.

Sparse Moment-SOS Relaxations for Copositive Polynomial Optimization

Problem Formulation and Context

The paper "Sparse Copositive Polynomial Optimization" (2604.00180) addresses the global minimization of sparse copositive polynomials over linearly constrained convex polyhedral sets in the nonnegative orthant. The objective polynomial $f(x)$ admits a decomposable sparse form:

$$f(x) = f_1(x_{I_1}) + \cdots + f_m(x_{I_m}),$$

where each $f_i$ depends on a subvector $x_{I_i}$, and the index sets $\{I_1, \ldots, I_m\}$ partition the variable set. The feasible set is defined by dense or sparse linear equality and inequality constraints:

$$Ax = b, \quad Cx \geq d, \quad x \geq 0.$$

This framework generalizes standard dense polynomial optimization and various notions of copositive programming, enabling the exploitation of structural sparsity not only in the objective polynomial but also in the constraints.
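The decomposable structure above is the key computational lever: each block reads only its own subvector of $x$. A minimal sketch of that evaluation pattern (the blocks and polynomials below are hypothetical examples, not from the paper):

```python
from typing import Callable, Sequence

def sparse_objective(blocks: Sequence[tuple[Sequence[int], Callable]],
                     x: Sequence[float]) -> float:
    """Evaluate f(x) = f1(x_{I1}) + ... + fm(x_{Im}), where each
    block polynomial f_i only reads the subvector indexed by I_i."""
    return sum(f_i([x[j] for j in idx]) for idx, f_i in blocks)

# Example with m = 2 blocks over 4 variables: I1 = {0, 1}, I2 = {2, 3}.
blocks = [
    ((0, 1), lambda v: v[0] ** 4 + v[0] * v[1] + v[1] ** 2),  # f1(x0, x1)
    ((2, 3), lambda v: v[0] ** 2 * v[1] ** 2 + v[1]),         # f2(x2, x3)
]

print(sparse_objective(blocks, [1.0, 0.0, 0.0, 2.0]))  # f1 = 1, f2 = 2 -> 3.0
```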

Copositive polynomial optimization is highly relevant: copositive polynomials (i.e., $p(x) \geq 0$ for all $x \geq 0$) arise in robust optimization, quadratic programming, graph theory, and tensor analysis. Solving such problems at scale is challenging since checking copositivity is NP-hard and standard (dense) Moment-SOS hierarchies scale poorly with problem size.
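To make the NP-hardness remark concrete, here is a naive grid search for copositivity of a symmetric matrix (the quadratic case, $p(x) = x^T M x$). A negative value is a genuine witness of non-copositivity, but passing the grid search is only evidence, not a certificate; this brute-force check is an illustrative assumption-level sketch, not the paper's method:

```python
import itertools

def quad_form(M, x):
    """Evaluate x^T M x for a matrix given as nested lists."""
    n = len(x)
    return sum(M[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def copositivity_witness(M, resolution=20):
    """Search a grid on the standard simplex for x >= 0 with x^T M x < 0.
    Returns a counterexample point, or None if none is found."""
    n = len(M)
    for combo in itertools.product(range(resolution + 1), repeat=n):
        s = sum(combo)
        if s == 0:
            continue
        x = [c / s for c in combo]
        if quad_form(M, x) < 0:
            return x  # witness: copositivity fails here
    return None

# [[1, -1], [-1, 1]] gives (x0 - x1)^2 >= 0, hence copositive.
print(copositivity_witness([[1, -1], [-1, 1]]))   # None
# [[0, -1], [-1, 0]] is not copositive: x = (1/2, 1/2) gives -1/2.
print(copositivity_witness([[0, -1], [-1, 0]]))   # [0.5, 0.5]
```

The cost of the grid grows exponentially in the number of variables, which is exactly why certificate-based SDP relaxations are needed at scale.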

Sparse Moment-SOS Relaxation Hierarchy

The authors introduce a hierarchy of sparse semidefinite relaxations, derived from Lasserre-type Moment and SOS (Sum-of-Squares) methods, specifically tailored to capture the sparsity pattern of the objective. For each relaxation order $k$, a primal-dual pair of programs is formulated:

  • Sparse Moment Relaxation: The moment variables $y$ are indexed over a union of truncated monomial sets, one per sparse block.
  • Sparse SOS Dual: Nonnegativity constraints are encoded via sparse quadratic modules, decomposed correspondingly.

The crucial innovation is restricting moment and SOS variables to polynomials and moments appearing in the specified sparsity pattern. The dimension of resulting SDP variables grows polynomially with the maximum block size, not the ambient dimension, which makes the approach tractable for large-scale optimization with modest local coupling.
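The scaling claim can be made concrete with a back-of-envelope comparison (an illustrative sketch, not the paper's exact relaxation sizes): the order-$k$ moment matrix over $n$ variables has side length $\binom{n+k}{k}$, so replacing one dense matrix with per-block matrices shrinks the SDP dramatically:

```python
from math import comb

def moment_matrix_side(n_vars: int, k: int) -> int:
    """Side length of the order-k truncated moment matrix in n_vars variables."""
    return comb(n_vars + k, k)

n, k = 100, 2       # 100 variables, relaxation order 2
m, block = 20, 5    # hypothetical: 20 sparse blocks of 5 variables each

dense_side = moment_matrix_side(n, k)
sparse_side = moment_matrix_side(block, k)

print(dense_side)      # 5151: one 5151 x 5151 PSD matrix
print(sparse_side, m)  # 21 20: twenty 21 x 21 PSD matrices
```

The sparse sizes depend only on the block width, not the ambient dimension, which is the source of the tractability claim.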

These relaxations generalize existing sparse Moment-SOS approaches; however, the authors emphasize that, unlike classical settings, the linear constraints in the present model can be dense, complicating standard certificate conditions and precluding reuse of earlier results.

Tightness and Exactness Analysis

A central contribution is the thorough analysis of conditions under which the sparse relaxations deliver globally exact bounds (finite convergence):

  • Characterizations of Tightness: Necessary and sufficient conditions for the tightness of the sparse hierarchy are established (Theorem 3.1). These involve algebraic decompositions of the optimality residual via the sum of quadratic modules and the constraints, with constructive existence results for primal and dual variables.
  • Flat Truncation and Atomic Extractability: The optimizer of the moment relaxation can sometimes be shown (by rank or flat extension tests) to represent a finitely atomic measure. In the setting where local moment matrices have rank one, the global minimizer can be extracted directly from the moment solution (Theorem 3.2).
  • Beyond Rank-One: The flat truncation mechanism is further extended, showing that, under certain support compatibility conditions among extracted atoms across the blocks, global optimizers can still be recovered even if ranks exceed one (Theorem 3.3). This is essential for problems with multiple global minimizers.
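The rank-one extraction idea can be sketched directly: when a normalized ($y_0 = 1$ up to scaling) truncated moment matrix has rank one, it equals $v(x^*)\,v(x^*)^T$ for the monomial vector of a single atom $x^*$, so the minimizer can be read off the degree-one moments. The moment data below (a Dirac measure at $(2, 3)$) is illustrative, not from the paper:

```python
def extract_atom(y: dict, n_vars: int):
    """Read x* off the degree-one moments of a rank-one moment sequence.
    Monomial keys are tuples of exponents, e.g. (1, 0) for x0."""
    y0 = y[(0,) * n_vars]  # zeroth moment (total mass)
    unit = lambda i: tuple(1 if j == i else 0 for j in range(n_vars))
    return [y[unit(i)] / y0 for i in range(n_vars)]

# Moments of the Dirac measure at x* = (2, 3), truncated at degree 2.
y = {(0, 0): 1.0, (1, 0): 2.0, (0, 1): 3.0,
     (2, 0): 4.0, (1, 1): 6.0, (0, 2): 9.0}

x_star = extract_atom(y, 2)
print(x_star)  # [2.0, 3.0]

# Rank-one consistency: higher moments must factor through x*,
# e.g. y_{(1,1)} = x0* * x1* = 2 * 3.
assert y[(1, 1)] == x_star[0] * x_star[1]
```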

The paper addresses failure modes via counterexamples and demonstrates practical strategies for certifying tightness based on decomposition uniqueness and objective value matching.

Cop-SOS Convexity and Finite Convergence

A major theoretical advance is the identification of sufficient conditions—cop-SOS convexity—ensuring the hierarchy is exact at a finite level:

  • Cop-SOS Convexity: The authors introduce a generalized algebraic convexity notion suited to the nonnegative orthant: a polynomial is cop-SOS convex when its convexity over $x \geq 0$ is witnessed by an SOS-type algebraic certificate. All convex quadratic polynomials with the prescribed sparse support are cop-SOS convex, and the property is easily verified for structured higher-order terms.

  • Jensen Inequality for Cop-SOS Convex Polynomials: An analogue of the classic Jensen inequality is proved, leveraging the properties of the associated moment matrices. This underpins the proof that primal solutions of the moment relaxation yield objective values no smaller than the minimum over the feasible set.
  • Exactness for Cop-SOS Convex Objectives: Under cop-SOS convexity (holding strictly for each block $f_i$), the sparse Moment-SOS relaxation yields the global optimum and an extractable minimizer at every feasible relaxation order $k$ (Theorem 4.3). Strong duality and certificate construction are ensured if strict feasibility holds.
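The Jensen-type inequality above can be illustrated numerically: for a convex polynomial $f$ and any probability measure $\mu$ on the feasible set, $f(\mathbb{E}_\mu[x]) \leq \mathbb{E}_\mu[f(x)]$. The polynomial and the discrete measure below are toy choices, not taken from the paper:

```python
# f(x) = x^4 + 2x^2 is convex on the real line, hence on x >= 0.
f = lambda x: x ** 4 + 2 * x ** 2

atoms = [0.5, 1.0, 3.0]     # support of a discrete probability measure mu
weights = [0.2, 0.5, 0.3]   # probabilities (sum to 1)

mean = sum(w * a for w, a in zip(weights, atoms))
lhs = f(mean)                                          # f(E[x])
rhs = sum(w * f(a) for w, a in zip(weights, atoms))    # E[f(x)]

print(lhs <= rhs)  # True: Jensen's inequality for a convex polynomial
```

In the tightness argument, this is the mechanism that converts a feasible moment sequence (a pseudo-expectation) into a bound attained by an actual feasible point.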

Numerical Validation and Computational Implications

Extensive numerical experiments corroborate the theoretical claims:

  • Efficiency Gains: The sparse Moment-SOS approach is demonstrated to solve problems with hundreds of variables and small block sizes orders-of-magnitude faster than dense approaches, which quickly become computationally infeasible.
  • Tightness Even in Large-Scale and Dense-Constraint Settings: For both synthetic and application-inspired instances (including copositive tensor cones and structured graph polynomials), the method efficiently finds certified global optimizers.
  • Failure of Local Algorithms: Benchmarking against classical nonlinear programming routines (e.g., MATLAB's fmincon) shows that, for nonconvex copositive problems, such local solvers frequently fail to attain or even approach the certified global optimum.

Importantly, the techniques are easily adapted to certificate-based copositivity checks for higher-order tensors, as well as optimization on highly structured yet nonconvex domains.

Theoretical and Practical Implications

The research advances both the theory and practice of large-scale polynomial optimization:

  • Theoretical Generalization: The certificate and convergence theory presented bridges the gap between dense and sparse polynomial hierarchies, extending sum-of-squares decomposition methods to settings with strong but nontrivial sparsity, as well as dense constraints.
  • Algorithmic Scalability: By aligning the SDP sizes with the true coupling structure, the approach makes theoretically strong polynomial optimization tools applicable at scales relevant for engineering, combinatorial optimization, and machine learning.
  • Future Work: Open directions include generalizing to non-block structures, further sharpening conditions for necessity of cop-SOS convexity, and extending to settings involving additional integer or combinatorial variables.

Conclusion

The paper delivers a rigorous framework and efficient algorithmic approach for globally minimizing sparse copositive polynomials under general linear constraints. Through the construction of sparse Moment-SOS hierarchies and the establishment of cop-SOS convexity as a key sufficient condition for tightness, both computational scalability and theoretical guarantees are achieved. These methods have significant implications for optimization over structured nonnegative domains, with potential applicability in robust optimization, tensor analysis, and other fields where sparsity and nonnegativity constraints co-occur.
