Mirror Duality in Convex Optimization

Published 29 Nov 2023 in math.OC (arXiv:2311.17296v2)

Abstract: While first-order optimization methods are usually designed to efficiently reduce the function value $f(x)$, there has been recent interest in methods efficiently reducing the magnitude of $\nabla f(x)$, and the findings show that the two types of methods exhibit a certain symmetry. In this work, we present mirror duality, a one-to-one correspondence between mirror-descent-type methods reducing function value and reducing gradient magnitude. Using mirror duality, we obtain the dual accelerated mirror descent (dual-AMD) method that efficiently reduces $\psi^*(\nabla f(x))$, where $\psi$ is a distance-generating function and $\psi^*$ quantifies the magnitude of $\nabla f(x)$. We then apply dual-AMD to efficiently reduce $\|\nabla f(\cdot)\|_q$ for $q\in [2,\infty)$ and to efficiently compute $\varepsilon$-approximate solutions of the optimal transport problem.
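To make the setting concrete, the sketch below runs plain (non-accelerated) mirror descent, not the paper's dual-AMD, with the $p$-norm distance-generating function $\psi(x) = \tfrac12\|x\|_p^2$, whose conjugate $\psi^*(y) = \tfrac12\|y\|_q^2$ (with $1/p + 1/q = 1$) measures gradient magnitude in the dual $q$-norm. The objective, step size, and iteration count are illustrative choices, not from the paper.

```python
import numpy as np

def grad_psi(x, p):
    """Mirror map: gradient of psi(x) = 0.5 * ||x||_p^2."""
    nx = np.linalg.norm(x, p)
    if nx == 0.0:
        return np.zeros_like(x)
    return nx ** (2 - p) * np.sign(x) * np.abs(x) ** (p - 1)

def grad_psi_star(y, q):
    """Inverse mirror map: gradient of psi^*(y) = 0.5 * ||y||_q^2.
    Same formula as grad_psi, with the dual exponent q."""
    return grad_psi(y, q)

def mirror_descent(grad_f, x0, p, eta, iters):
    """Mirror descent: update in the dual space, then map back."""
    q = p / (p - 1)  # dual exponent: 1/p + 1/q = 1
    x = x0.copy()
    for _ in range(iters):
        # nabla psi(x_next) = nabla psi(x) - eta * nabla f(x)
        z = grad_psi(x, p) - eta * grad_f(x)
        x = grad_psi_star(z, q)
    return x

# Toy smooth convex objective f(x) = 0.5 * ||A x - b||^2 (illustrative).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
f = lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2
grad_f = lambda x: A.T @ (A @ x - b)

p, q = 1.5, 3.0
x0 = np.ones(10)
x = mirror_descent(grad_f, x0, p=p, eta=0.005, iters=1000)
print(f(x0), f(x))                      # function value decreases
print(np.linalg.norm(grad_f(x0), q),
      np.linalg.norm(grad_f(x), q))     # ||grad f||_q also shrinks
```

This baseline reduces $f(x)$; dual-AMD, obtained in the paper via mirror duality, is instead designed to drive $\psi^*(\nabla f(x))$ (here, $\tfrac12\|\nabla f(x)\|_q^2$) down at an accelerated rate.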
