
Generalizing the optimized gradient method for smooth convex minimization

Published 22 Jul 2016 in math.OC | arXiv:1607.06764v4

Abstract: This paper generalizes the optimized gradient method (OGM), which achieves the optimal worst-case cost-function bound among first-order methods for smooth convex minimization. Specifically, this paper studies a generalized formulation of OGM and analyzes its worst-case rates in terms of both the function value and the norm of the function gradient. It also develops a new algorithm, called OGM-OG, that belongs to the generalized OGM family and has the best known analytical worst-case bound, with rate $O(1/N^{1.5})$, on the decrease of the gradient norm among fixed-step first-order methods. In addition, this paper proves that Nesterov's fast gradient method also has an $O(1/N^{1.5})$ worst-case gradient-norm rate, but with a larger constant than that of OGM-OG. The proofs are based on the worst-case analysis framework known as the Performance Estimation Problem (PEP).
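The abstract centers on OGM, which augments Nesterov-style momentum with an extra correction term. Below is a minimal sketch of the standard OGM iteration (per Kim and Fessler's formulation) for an L-smooth convex objective; this illustrates the general method the paper builds on, not the specific OGM-OG variant introduced here, and the function names and test problem are illustrative assumptions.

```python
import numpy as np

def ogm(grad, x0, L, N):
    """Sketch of the Optimized Gradient Method (OGM) for minimizing an
    L-smooth convex f, given its gradient `grad`, over N iterations."""
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    theta = 1.0
    for i in range(N):
        y_new = x - grad(x) / L  # plain gradient step with step size 1/L
        if i < N - 1:
            theta_new = (1 + np.sqrt(1 + 4 * theta**2)) / 2
        else:
            # final iteration uses a modified momentum factor
            theta_new = (1 + np.sqrt(1 + 8 * theta**2)) / 2
        # OGM momentum: the usual Nesterov term plus an extra
        # (theta / theta_new) * (y_new - x) correction term,
        # which is what distinguishes OGM from Nesterov's method
        x = (y_new
             + ((theta - 1) / theta_new) * (y_new - y)
             + (theta / theta_new) * (y_new - x))
        y, theta = y_new, theta_new
    return y

# Example: minimize the quadratic f(x) = 0.5 * x^T A x with A = diag(1, 10),
# so L = 10 and the minimizer is the origin.
A = np.diag([1.0, 10.0])
x_final = ogm(lambda x: A @ x, np.array([1.0, 1.0]), L=10.0, N=200)
```

Dropping the final correction term (and using only the first theta recursion) recovers Nesterov's fast gradient method, which the abstract compares against.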

Citations (43)
