
Convexity of Optimization Curves: Local Sharp Thresholds, Robustness Impossibility, and New Counterexamples

Published 10 Sep 2025 in math.OC and cs.LG (arXiv:2509.08954v1)

Abstract: We study when the \emph{optimization curve} of first-order methods, the sequence $\{f(x_n)\}_{n\ge 0}$ produced by constant-stepsize iterations, is convex; equivalently, when the forward differences $f(x_n) - f(x_{n+1})$ are nonincreasing. For gradient descent (GD) on convex $L$-smooth functions, the curve is convex for all stepsizes $\eta \le 1.75/L$, and this threshold is tight. Moreover, gradient norms are nonincreasing for all $\eta \le 2/L$, and in continuous time (gradient flow) the curve is always convex. These results complement and refine the classical smooth convex optimization toolbox, connecting discrete and continuous dynamics as well as worst-case analyses.
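The following is a minimal numerical sketch, not taken from the paper, of the quantity the abstract discusses: running constant-stepsize GD with $\eta = 1.75/L$ on a convex $L$-smooth function and checking that the forward differences $f(x_n) - f(x_{n+1})$ are nonincreasing. The quadratic test function, dimension, and iteration count are illustrative assumptions.

```python
# Sketch (assumptions: convex quadratic test function, 20 dims, 200 steps).
# Gradient descent with eta = 1.75/L, the sharp threshold from the paper,
# checking convexity of the optimization curve {f(x_n)}: equivalently,
# that the forward differences f(x_n) - f(x_{n+1}) are nonincreasing.
import numpy as np

rng = np.random.default_rng(0)

# Random positive semidefinite A gives a convex quadratic f(x) = 0.5 x^T A x;
# its smoothness constant is L = lambda_max(A).
M = rng.standard_normal((20, 20))
A = M.T @ M
L = np.linalg.eigvalsh(A).max()

def f(x):
    return 0.5 * x @ A @ x

def grad(x):
    return A @ x

eta = 1.75 / L  # the tight stepsize threshold for curve convexity
x = rng.standard_normal(20)

values = [f(x)]
for _ in range(200):
    x = x - eta * grad(x)
    values.append(f(x))

# Forward differences f(x_n) - f(x_{n+1}); convexity of the curve
# means this sequence is nonincreasing (up to floating-point noise).
diffs = -np.diff(values)
print("curve convex:", bool(np.all(np.diff(diffs) <= 1e-12)))
```

On this example the check passes, consistent with the theorem; the paper's contribution is the worst-case statement that $\eta \le 1.75/L$ suffices for every convex $L$-smooth function and that the constant cannot be improved.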

