Mirror Duality in Convex Optimization
Abstract: While first-order optimization methods are usually designed to efficiently reduce the function value $f(x)$, there has been recent interest in methods that efficiently reduce the magnitude of $\nabla f(x)$, and the findings show that the two types of methods exhibit a certain symmetry. In this work, we present mirror duality, a one-to-one correspondence between mirror-descent-type methods that reduce function value and those that reduce gradient magnitude. Using mirror duality, we obtain the dual accelerated mirror descent (dual-AMD) method, which efficiently reduces $\psi^*(\nabla f(x))$, where $\psi$ is a distance-generating function and $\psi^*$ quantifies the magnitude of $\nabla f(x)$. We then apply dual-AMD to efficiently reduce $\|\nabla f(\cdot)\|_q$ for $q\in [2,\infty)$ and to efficiently compute $\varepsilon$-approximate solutions of the optimal transport problem.
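For readers unfamiliar with the mirror-descent setting the abstract assumes, the following is a minimal background sketch of a classical mirror-descent step, not the paper's dual-AMD method. It uses the standard negative-entropy distance-generating function $\psi(x) = \sum_i x_i \log x_i$ on the probability simplex, which yields the well-known exponentiated-gradient update; the objective `f` and all parameter values are illustrative choices.

```python
import numpy as np

def mirror_descent_entropy(grad, x0, eta=0.5, steps=500):
    """Mirror descent on the probability simplex with the negative-entropy
    mirror map psi(x) = sum_i x_i log x_i. For this choice of psi, the
    mirror step plus Bregman projection reduces to the closed-form
    exponentiated-gradient update."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x * np.exp(-eta * grad(x))  # gradient step in the dual space
        x /= x.sum()                    # Bregman projection back to the simplex
    return x

# Illustrative objective: f(x) = 0.5 * ||x - c||^2, minimized over the
# simplex; since c lies on the simplex, the minimizer is c itself.
c = np.array([0.2, 0.5, 0.3])
grad = lambda x: x - c
x_star = mirror_descent_entropy(grad, np.ones(3) / 3)
```

The choice of $\psi$ determines the geometry of the update; the paper's mirror duality relates methods of this type, which reduce $f(x)$, to dual methods that instead reduce $\psi^*(\nabla f(x))$.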