
From differential equation solvers to accelerated first-order methods for convex optimization

Published 6 Sep 2019 in math.OC | arXiv:1909.03145v4

Abstract: A convergence analysis of accelerated first-order methods for convex optimization problems is presented from the point of view of ordinary differential equation (ODE) solvers. A new dynamical system, called the Nesterov accelerated gradient flow, is derived from the connection between the acceleration mechanism and the $A$-stability of ODE solvers, and the exponential decay of a tailored Lyapunov function along the solution trajectory is proved. Numerical discretizations are then considered, and convergence rates are established via a unified discrete Lyapunov function. The proposed differential equation solver approach not only covers existing accelerated methods, such as FISTA, G\"{u}ler's proximal algorithm, and Nesterov's accelerated gradient method, but also produces new algorithms for composite convex optimization that possess accelerated convergence rates.
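As a concrete instance of the accelerated methods the abstract names, below is a minimal sketch of Nesterov's accelerated gradient method for an $L$-smooth convex objective. This is the well-known classical iteration, not the paper's new ODE-based discretization; the function `grad_f`, the step size `1/L`, and the quadratic test problem are illustrative assumptions.

```python
import numpy as np

def nesterov_agd(grad_f, x0, step, iters=500):
    """Classic Nesterov accelerated gradient method: O(1/k^2) objective
    decay for L-smooth convex f with step = 1/L. Illustrative sketch,
    not the paper's proposed discretization."""
    x_prev = x = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(iters):
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        # Extrapolated (momentum) point.
        y = x + ((t - 1.0) / t_next) * (x - x_prev)
        # Gradient step taken from the extrapolated point.
        x_prev, x = x, y - step * grad_f(y)
        t = t_next
    return x

# Hypothetical usage: minimize f(x) = 0.5 * x^T A x - b^T x.
A = np.diag([1.0, 10.0, 100.0])   # illustrative test problem
b = np.array([1.0, 1.0, 1.0])
L = 100.0                         # largest eigenvalue of A
x_min = nesterov_agd(lambda x: A @ x - b, np.zeros(3), step=1.0 / L)
```

For the composite setting the abstract mentions, FISTA keeps the same momentum update but replaces the gradient step `y - step * grad_f(y)` with a proximal step `prox_{step*g}(y - step * grad_f(y))` applied to the nonsmooth part $g$.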
