
A Note on the Gradient-Evaluation Sequence in Accelerated Gradient Methods

Published 6 Mar 2026 in math.OC | (2603.06937v1)

Abstract: Nesterov's accelerated gradient descent method (AGD) is a seminal deterministic first-order method known to achieve the optimal order of iteration complexity for solving smooth convex optimization problems. The description of AGD involves two distinct sequences of iterates: gradients are evaluated at one sequence, while approximate solutions are drawn from the other. The iteration complexity for minimizing the objective function value has been well studied in the literature, but such analysis is almost always carried out only along the approximate-solution sequence. To the best of our knowledge, for projection-based AGD applied to problems with feasible sets, it remains an open question whether the gradient-evaluation sequence, when treated as approximate solutions, also achieves the same optimal order of iteration complexity. It is likewise unknown whether such results hold in the non-Euclidean setting. Motivated by computer-aided algorithm analysis, we provide positive results that answer these open problems affirmatively. Specifically, for the (possibly constrained) problem $f^* := \min_{x\in X} f(x)$, where $f$ is convex and $L$-smooth and $X$ is closed, convex, and projection friendly, we prove that the gradient-evaluation sequence $\{\underline{x}_k\}$ in AGD satisfies $f(\underline{x}_k) - f^* \le \mathcal{O}(L/k^2)$.
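To make the two-sequence structure concrete, below is a minimal sketch of one common projected-AGD (FISTA-style) variant, not taken from the paper itself: the problem instance (a quadratic $f$ over the unit ball), the momentum rule, and all variable names are illustrative assumptions, and the exact scheme the authors analyze may differ. The point is only to show the gradient-evaluation sequence `y` (the paper's $\underline{x}_k$) alongside the approximate-solution sequence `x`.

```python
import numpy as np

# Minimal sketch of projected Nesterov AGD, assuming an L-smooth convex
# quadratic f(x) = 0.5 * x^T A x - b^T x over the Euclidean unit ball
# (a "projection friendly" set X). This FISTA-style variant is an
# illustrative assumption, not necessarily the scheme in the paper.

rng = np.random.default_rng(0)
n = 20
M = rng.standard_normal((n, n))
A = M.T @ M + np.eye(n)          # symmetric positive definite => f convex, L-smooth
b = rng.standard_normal(n)
L = np.linalg.eigvalsh(A).max()  # smoothness constant of f

def f(x):
    return 0.5 * x @ A @ x - b @ x

def grad(x):
    return A @ x - b

def proj_ball(x, r=1.0):
    # Euclidean projection onto the ball of radius r
    nrm = np.linalg.norm(x)
    return x if nrm <= r else (r / nrm) * x

x = np.zeros(n)   # approximate-solution sequence x_k
y = x.copy()      # gradient-evaluation sequence \underline{x}_k
t = 1.0
for k in range(1, 201):
    x_next = proj_ball(y - grad(y) / L)                # projected gradient step at y
    t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))  # momentum parameter update
    y = x_next + ((t - 1.0) / t_next) * (x_next - x)   # extrapolation (may leave X)
    x, t = x_next, t_next

# In this toy run both sequences drive f down at the accelerated rate;
# the paper's contribution is a proof of the O(L/k^2) bound along the
# gradient-evaluation sequence in the constrained setting.
print(f"f(x_k) = {f(x):.6f},  f(y_k) = {f(y):.6f}")
```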
