Convex SDP Relaxation in Optimization
- Convex SDP relaxation is a technique that lifts nonconvex problems into higher-dimensional PSD matrix spaces, facilitating tractable convex optimization.
- It relaxes rank-1 constraints to semidefinite constraints, achieving exact recovery under structured conditions such as Rank-One Generated cones and convex quadratic constraints.
- Key applications include combinatorial optimization, power systems, global registration, and signal recovery, with proven approximation guarantees and theoretical rigor.
Convex Semidefinite Programming (SDP) Relaxation
Convex semidefinite programming (SDP) relaxation is a foundational technique in modern mathematical optimization, providing convex formulations for diverse classes of otherwise nonconvex problems. By lifting nonconvex quadratic or polynomial optimization problems to higher-dimensional matrix spaces and relaxing rank constraints to positive-semidefinite (PSD) or nuclear norm constraints, SDP relaxations enable tractable convex optimization, exact recovery in many structured problems, and provable approximation guarantees. Key applications span combinatorial optimization, power systems, signal recovery, global registration, machine learning, and quantum information.
1. Mathematical Foundation and Standard Lifting
The principle of SDP relaxation is to transform a nonconvex quadratic or polynomial problem into a convex one by "lifting" scalar or vector variables to matrices and relaxing nonconvex (typically rank-1) constraints to convex semidefinite constraints. In the quadratic setting, consider the canonical quadratically constrained quadratic program (QCQP):
$\min_{x \in \mathbb{R}^n} \; x^\top A_0 x + 2 b_0^\top x + c_0 \quad \text{s.t.} \quad x^\top A_i x + 2 b_i^\top x + c_i \le 0, \quad i = 1, \dots, m,$
with symmetric $A_i \in \mathbb{R}^{n \times n}$, $b_i \in \mathbb{R}^n$, and $c_i \in \mathbb{R}$.
The Shor SDP relaxation is constructed by introducing a symmetric matrix variable
$X = \begin{bmatrix} x \\ 1 \end{bmatrix} \begin{bmatrix} x^\top & 1 \end{bmatrix}$
and rewriting each quadratic form as a linear function in $X$:
$x^\top A_i x + 2 b_i^\top x + c_i = \langle M_i, X \rangle,$
where $M_i = \begin{bmatrix} A_i & b_i \\ b_i^\top & c_i \end{bmatrix}$.
Dropping the nonconvex rank-1 constraint and requiring only $X \succeq 0$ yields the convex SDP relaxation:
$\min_X \; \langle M_0, X \rangle \quad \text{s.t.} \quad \langle M_i, X \rangle \le 0, \; i = 1, \dots, m, \quad X_{n+1,\,n+1} = 1, \quad X \succeq 0.$
(Kojima et al., 4 Apr 2025, Kılınç-Karzan et al., 2021)
This lifting paradigm generalizes to higher-order polynomials via sum-of-squares (SOS) and to combinatorial and nonconvex integer QPs by encoding discrete constraints as quadratic forms and relaxing via PSD matrices (Park et al., 2015, Li, 2015).
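The lifting identity at the heart of the Shor relaxation can be checked in a few lines of numpy; `lift` and `rank_one_lift` are illustrative names for the construction of $M_i$ and of the rank-1 feasible point $X(x)$ described above:

```python
import numpy as np

def lift(A, b, c):
    """Build the (n+1)x(n+1) matrix M so that x^T A x + 2 b^T x + c = <M, X(x)>."""
    n = len(b)
    M = np.zeros((n + 1, n + 1))
    M[:n, :n] = A
    M[:n, n] = b
    M[n, :n] = b
    M[n, n] = c
    return M

def rank_one_lift(x):
    """X(x) = [x; 1][x; 1]^T, the rank-1 point of the lifted feasible set."""
    v = np.append(x, 1.0)
    return np.outer(v, v)

# Sanity check: the lifted linear form reproduces the original quadratic.
rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n)); A = (A + A.T) / 2   # symmetric
b = rng.standard_normal(n); c = rng.standard_normal()
x = rng.standard_normal(n)

quad = x @ A @ x + 2 * b @ x + c
lifted = np.trace(lift(A, b, c) @ rank_one_lift(x))   # <M, X> = tr(M X)
```

Because the identity holds for every $x$, minimizing $\langle M_0, X \rangle$ over all PSD $X$ with $X_{n+1,n+1} = 1$ can only relax, never tighten, the original problem.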
2. Classes of Exact Convex SDP Relaxation
Three principal classes of problems admit "exact" SDP relaxations, i.e., the SDP optimum coincides with the global minimum of the original nonconvex problem, often with a rank-1 SDP optimizer recoverable:
- Rank-One Generated (ROG) Cones: A convex cone of feasible matrices is ROG if it is the convex hull of its rank-one elements; QCQPs whose PSD-lifted feasible set is ROG have exact relaxations for any objective (Kojima et al., 4 Apr 2025). Sufficient conditions include a "no bad intersection" property among the pairwise intersections of the constraints' zero sets on the PSD cone.
- Convex Quadratic Constraints: If every constraint matrix $A_i$ is positive semidefinite, so that all constraints are convex quadratics, the nonconvexity vanishes and the classical SDP relaxation is tight.
- Sign-Patterned Matrices: For diagonal QCQPs, exactness holds when every cycle in the constraint graph carries a consistent sign pattern (Kojima et al., 4 Apr 2025, Kılınç-Karzan et al., 2021).
These cases are theoretically characterized by the ability of the SDP feasible region to yield rank-1 solutions spanning all original feasible points, or by explicit geometric conditions based on the dual multiplier cone and its faces (Wang et al., 2019, Wang et al., 2024, 2002.01566).
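The ROG notion is easiest to see on the PSD cone itself, which is the convex hull of its rank-one elements: any PSD matrix decomposes into a nonnegative combination of rank-one terms via its eigendecomposition. A minimal numpy check of this fact:

```python
import numpy as np

rng = np.random.default_rng(1)
G = rng.standard_normal((5, 5))
S = G @ G.T                      # a random PSD matrix

# The eigendecomposition writes S as a nonnegative combination of rank-one
# terms, witnessing that the PSD cone is generated by its rank-one elements.
w, V = np.linalg.eigh(S)
terms = [w[i] * np.outer(V[:, i], V[:, i]) for i in range(len(w))]
S_rebuilt = sum(terms)
```

For a lifted QCQP feasible set, the ROG property is more delicate: the intersection of the PSD cone with the linear constraint slices must still be generated by its rank-one members, which is exactly what the sufficient conditions above certify.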
3. Structural Analysis and Extension Results
Convex SDP relaxation analysis relies on the geometry of the dual Lagrange multiplier cone, its polar, and the facial structure of the corresponding spectrahedral set. Sufficiency and necessity of SDP-exactness are established via:
- Explicit rounding/perturbation directions in the nullspaces of "semidefinite" faces of the dual multiplier cone (Wang et al., 2019, Wang et al., 2024)
- The quadratic eigenvalue multiplicity parameter, which quantifies problem symmetry: when it exceeds the dimension of the affine hulls of the dual faces by at least one, convex hull exactness follows.
- Extension theorems for adding new quadratic constraints to QCQPs while preserving SDP exactness: if, for every added quadratic, the zero set of the corresponding linear functional on the PSD cone is contained in the original lifted feasible cone, exactness is maintained (Kojima et al., 4 Apr 2025).
These provide systematic methodologies for certifying and expanding the scope of SDP-exact nonconvex quadratic problems.
4. Applications: Global Registration, Power Systems, Integer Programming
Convex SDP relaxations are central in diverse domains:
- Global registration: The multiple point cloud registration problem is formulated as a nonconvex least-squares problem over manifold constraints. The SDP relaxation, with Gram matrix lifting and block-diagonal normalization, admits theoretical guarantees for exact and stable recovery, tied to universal rigidity of the underlying "patch graph" (Chaudhury et al., 2013). Experimentally, the SDP exhibits zero relaxation gap up to a critical noise threshold, outperforming spectral and manifold-optimization methods.
- Mixed-integer QCQP in power systems: SDP relaxations can be significantly tightened by adding linearized products of equality constraints, enforcing integrality via disjunctive programming, and applying convex-hull formulations for each integer-coupled bilinear term, yielding the tightest known convex relaxations. These methods enable practical global optimization of large-scale systems (Li, 2015).
- Integer convex quadratic minimization: The SDP relaxation provides lower bounds, with probabilistic (Gaussian-based) rounding schemes producing near-global solutions efficiently; gap analyses and randomized rounding are core (Park et al., 2015).
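The Gaussian rounding idea can be sketched as follows; here `mu` and `Sigma` stand in for the mean/covariance pair that would be extracted from the SDP relaxation's solution (the SDP solve itself is assumed, not shown), and `gaussian_round` is an illustrative name:

```python
import numpy as np

def gaussian_round(P, q, mu, Sigma, n_samples=100, seed=0):
    """Sample candidate integer points from N(mu, Sigma) and keep the best
    value of f(x) = x^T P x + q^T x (P assumed PSD)."""
    rng = np.random.default_rng(seed)
    f = lambda x: x @ P @ x + q @ x
    best_x = np.round(mu)                # naive rounding as baseline candidate
    best_v = f(best_x)
    for z in rng.multivariate_normal(mu, Sigma, size=n_samples):
        x = np.round(z)
        v = f(x)
        if v < best_v:
            best_x, best_v = x, v
    return best_x, best_v

# Toy instance: mu/Sigma play the role of the relaxation's solution.
P = np.array([[2.0, 0.5], [0.5, 1.0]])
q = np.array([-4.0, -2.0])
mu = np.array([0.7, 0.6])
Sigma = 0.25 * np.eye(2)
x_best, v_best = gaussian_round(P, q, mu, Sigma)
```

By construction the sampled-and-rounded candidate is never worse than naively rounding the relaxed mean, and the SDP lower bound certifies how far it can be from the integer optimum.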
5. Robustness, Extension, and Hierarchical Approaches
SDP relaxations extend to robust, semi-infinite, and fractional polynomial programming via hierarchies (e.g., SOS, Lasserre, and measure-based moment relaxations):
- Robust SOS-convex programming: For polytopic or restricted ellipsoidal uncertainty sets, exact SDP (or SOCP) relaxations remain attainable under standard robust Slater conditions (Jeyakumar et al., 2013).
- Fractional semi-infinite programs: Hierarchies of SDP relaxations involving measure-based moment cones and outer approximations of the feasible set lead to convergent sequences of lower bounds and certified outer approximations (Guo et al., 2021).
These hierarchies exploit SOS-convexity to guarantee both convergence and practical tractability.
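The SOS machinery underlying these hierarchies reduces polynomial nonnegativity to a PSD condition on a Gram matrix. A minimal univariate instance: $p(x) = x^4 - 2x^2 + 1 = (x^2 - 1)^2$ admits the Gram matrix below in the monomial basis $(1, x, x^2)$, and its positive semidefiniteness certifies that $p$ is a sum of squares:

```python
import numpy as np

# p(x) = x^4 - 2x^2 + 1; basis z = (1, x, x^2); p(x) = z^T Q z.
Q = np.array([[ 1.0, 0.0, -1.0],
              [ 0.0, 0.0,  0.0],
              [-1.0, 0.0,  1.0]])

# Q PSD  =>  p is a sum of squares, hence globally nonnegative.
eigvals = np.linalg.eigvalsh(Q)

# Numerical spot check that z^T Q z reproduces p at a sample point.
x = 1.7
z = np.array([1.0, x, x**2])
p_from_gram = z @ Q @ z
p_direct = x**4 - 2 * x**2 + 1
```

In the hierarchies above, searching over all valid Gram matrices $Q \succeq 0$ (the entries are linearly constrained by the polynomial's coefficients) is precisely an SDP, and raising the basis degree yields the successive relaxation levels.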
6. Specialized Relaxations and Algorithmic Innovations
- Nuclear-norm and PSD relaxations: For problems with quadratic complementary slackness or cardinality constraints (e.g., optimal sparsity-constrained layout in mechanical design), nuclear-norm penalties are used to convexify rank constraints, ensuring tight relaxation and practical solvability (Zhong et al., 2022).
- Disjunctive and convex-hull reformulations: In problems where MIQCQP structure is prominent, convex relaxations are derived by encoding bilinear integrality links as convex-hull constraints, reducing solution space and enabling compact formulations through variable sparsity (Li, 2015).
- Algorithmic advances: For large-scale SDP relaxations where interior-point methods are infeasible, conditional-gradient methods with random sketching solve problems whose matrix variables are far too large to store explicitly, with provable convergence (Yurtsever et al., 2019).
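The storage-light ingredient of such methods is the linear minimization oracle over the spectrahedron $\{X \succeq 0, \operatorname{tr} X = 1\}$: minimizing a linear form there needs only one extreme eigenvector, so iterates are sums of rank-one updates. A bare-bones conditional-gradient sketch (the full method of Yurtsever et al. additionally handles affine constraints via an augmented Lagrangian and compresses $X$ with matrix sketching, none of which is shown here):

```python
import numpy as np

def frank_wolfe_spectrahedron(C, iters=200):
    """Minimize <C, X> over {X PSD, tr X = 1} by conditional gradient.
    The LMO returns a rank-one extreme point v v^T, with v the eigenvector
    of the smallest eigenvalue of the gradient (here constant, equal to C)."""
    n = C.shape[0]
    X = np.eye(n) / n                        # feasible starting point
    for k in range(iters):
        w, V = np.linalg.eigh(C)             # gradient of <C, X> is C
        v = V[:, 0]                          # eigenvector of smallest eigenvalue
        gamma = 2.0 / (k + 2.0)              # standard step-size schedule
        X = (1 - gamma) * X + gamma * np.outer(v, v)   # rank-one update
    return X

rng = np.random.default_rng(2)
C = rng.standard_normal((6, 6)); C = (C + C.T) / 2
X = frank_wolfe_spectrahedron(C)
val = np.trace(C @ X)
lam_min = np.linalg.eigvalsh(C).min()        # the true optimum for this toy
```

For this linear toy objective the optimum is $\lambda_{\min}(C)$ and the gradient never changes; in a real relaxation the gradient is updated each iteration, but each step still costs only one extreme-eigenvector computation.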
7. Geometric and Exactness Criteria
The convergence, tightness, and descriptive power of convex SDP relaxations depend crucially on:
- The geometry and symmetry of the original problem (e.g., eigenvalue block structure, commutativity, simultaneous diagonalizability).
- The facial structure of the dual cone and the existence of sufficient rounding directions or affine-separation from the origin.
- Explicit construction of polyhedral conic representations, which enables finite checking of convex hull or value exactness (2002.01566, Wang et al., 2019, Wang et al., 2024).
Necessary and sufficient conditions for convex hull exactness, and hence tightness of SDP relaxations, have been characterized geometrically in terms of the faces of dual cones and associated perturbation properties (Wang et al., 2024).
Convex SDP relaxation comprises a rigorous and expressive toolkit for addressing nonconvexity in optimization, blending geometric, algebraic, and algorithmic insights to yield tractable convex surrogates for a broad spectrum of quadratic and polynomial models. Its exactness is theoretically characterized in terms of cone geometry, symmetry-induced multiplicities, and hierarchy-based convergence. In structured cases, convex SDPs not only approximate but solve nonconvex problems exactly, with provable and efficiently recoverable solutions. Key references: (Chaudhury et al., 2013, Li, 2015, Kojima et al., 4 Apr 2025, Wang et al., 2019, Kılınç-Karzan et al., 2021, Wang et al., 2024).