Restart of accelerated first order methods with linear convergence under a quadratic functional growth condition

Published 24 Feb 2021 in math.OC (arXiv:2102.12387v2)

Abstract: Accelerated first order methods, also called fast gradient methods, are popular optimization methods in the field of convex optimization. However, they are prone to oscillatory behaviour that slows their convergence when medium to high accuracy is desired. To address this, restart schemes have been proposed in the literature, which seek to improve practical convergence by suppressing this oscillatory behaviour. This paper presents a restart scheme for accelerated first order methods for which we show linear convergence under a quadratic functional growth condition, thus encompassing a broad class of not necessarily strongly convex optimization problems. Moreover, the worst-case convergence rate is comparable to the one obtained using a (generally non-implementable) optimal fixed-rate restart strategy. We compare the proposed algorithm with other restart schemes by applying them to a model predictive control case study.
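To illustrate the general idea of restarting an accelerated first order method, the sketch below implements Nesterov's fast gradient method with the gradient-based adaptive restart of O'Donoghue and Candès, applied to a convex but not strongly convex quadratic that satisfies quadratic functional growth. This is a minimal illustration of the restart concept, not the specific scheme proposed in this paper; the function names and test problem are assumptions made for the example.

```python
import numpy as np

def accelerated_gradient_with_restart(grad, x0, L, max_iters=1000, tol=1e-8):
    """Fast gradient method with a gradient-based adaptive restart
    (O'Donoghue & Candes style). NOT the paper's proposed scheme.
    grad: gradient oracle; L: Lipschitz constant of the gradient."""
    x = x0.copy()
    y = x0.copy()
    t = 1.0  # momentum parameter
    for _ in range(max_iters):
        g = grad(y)
        x_next = y - g / L  # gradient step from the extrapolated point
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y_next = x_next + ((t - 1.0) / t_next) * (x_next - x)
        # Restart test: if the momentum direction opposes descent,
        # reset the momentum to suppress oscillations.
        if np.dot(g, x_next - x) > 0:
            t_next = 1.0
            y_next = x_next.copy()
        if np.linalg.norm(x_next - x) < tol:
            return x_next
        x, y, t = x_next, y_next, t_next
    return x

# Example (illustrative): f(x) = 0.5 x'Ax - b'x with a singular Hessian,
# so f is convex with quadratic growth but not strongly convex.
A = np.array([[2.0, 0.0], [0.0, 0.0]])
b = np.array([1.0, 0.0])
grad = lambda x: A @ x - b
x_star = accelerated_gradient_with_restart(grad, np.zeros(2), L=2.0)
```

The restart condition resets the momentum whenever it points against the descent direction, which is what suppresses the oscillatory behaviour described in the abstract; the paper's contribution is a restart rule with a provable linear rate under the quadratic growth condition.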
