
Fast convex optimization via inertial dynamics combining viscous and Hessian-driven damping with time rescaling

Published 16 Sep 2020 in math.OC (arXiv:2009.07620v1)

Abstract: In a Hilbert setting, we develop fast methods for unconstrained convex optimization. We rely on the asymptotic behavior of an inertial system combining geometric damping with temporal scaling. The convex function to minimize enters the dynamics via its gradient. The dynamics involve three time-dependent coefficients: a viscous damping coefficient, a coefficient attached to the Hessian-driven damping, and a time-scaling coefficient. We study the convergence rate of the values under general conditions involving the damping and time-scale coefficients. The results rest on a new Lyapunov analysis and encompass known results on the subject. We pay particular attention to the case of an asymptotically vanishing viscous damping, which is directly related to Nesterov's accelerated gradient method. The Hessian-driven damping significantly reduces the oscillatory behavior. As a main result, we obtain an exponential rate of convergence of the values without assuming strong convexity of the objective function. The temporal discretization of these dynamics opens the gate to a large class of inertial optimization algorithms.
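To make the abstract's description concrete, here is a minimal sketch (not the paper's own scheme) of an explicit time discretization of an inertial system of the type described: viscous damping of the Nesterov form α/t, Hessian-driven damping, and a time-scaling coefficient b. The Hessian-driven term ∇²f(x(t))ẋ(t) is approximated, as is standard in such discretizations, by the finite difference of consecutive gradients, so no Hessian is ever formed. All parameter values and the step-size rule below are illustrative assumptions, chosen only to show the structure of the resulting inertial algorithm.

```python
import numpy as np

def inertial_hessian_damped(grad, x0, steps=2000, h=0.01,
                            alpha=3.1, beta=1.0, b=1.0):
    """Explicit discretization (illustrative sketch) of the dynamic

        x'' + (alpha/t) x' + beta * Hess f(x) x' + b * grad f(x) = 0.

    The Hessian-driven damping Hess f(x) x' is replaced by the
    gradient difference (grad f(x_k) - grad f(x_{k-1})) / h, which
    avoids any second-order computation.
    """
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev.copy()
    g_prev = grad(x_prev)
    for k in range(1, steps + 1):
        t = k * h + 1.0                       # time shift avoids division by zero
        g = grad(x)
        x_next = (x
                  + (1.0 - alpha * h / t) * (x - x_prev)   # inertia + viscous damping
                  - h * beta * (g - g_prev)                # Hessian-driven correction
                  - h * h * b * g)                         # rescaled gradient step
        x_prev, x, g_prev = x, x_next, g
    return x

# Example: an ill-conditioned quadratic f(x) = 0.5 * x^T diag(d) x
d = np.array([1.0, 100.0])
grad = lambda x: d * x
x_star = inertial_hessian_damped(grad, x0=[1.0, 1.0])
f_val = 0.5 * np.dot(d * x_star, x_star)
```

On the stiff coordinate, the gradient-difference term contributes extra damping proportional to the local curvature, which is the mechanism the abstract credits with reducing the oscillations typical of purely viscous inertial methods.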
