Improving the constant in Nesterov's $\frac{\pi}{2}$-theorem
Abstract: One of the hard optimization problems that admits a semidefinite relaxation with a quantitative bound on the approximation error is the maximization of a convex quadratic form over the hypercube. The relaxation not only yields an upper bound on the optimal value; its solution can also be used to construct random sub-optimal solutions of the original problem whose expected value is at least $\frac{2}{\pi}$ times the value of the relaxation. This constant cannot be improved globally: for every $\epsilon > 0$ there exists a problem instance for which the ratio of the two values in question exceeds $\frac{\pi}{2} - \epsilon$. However, for a given problem instance the relaxation yields a concrete solution which may achieve a much better ratio. In this contribution we present an improved, explicit bound depending on the rank of the solution. We also consider the maximization of a convex Hermitian quadratic form over the complex polydisc, where a bound on the approximation error is given by the $\frac{4}{\pi}$-theorem of Ben-Tal, Nemirovski, and Roos. The derivation of a rank-dependent improved bound is similar to the real case. In the complex case we provide explicit expressions in the form of an infinite series and conjecture a closed-form expression.