Stochastic Gradient Descent on Nonconvex Functions with General Noise Models

Published 1 Apr 2021 in math.OC (arXiv:2104.00423v1)

Abstract: Stochastic Gradient Descent (SGD) is an optimization procedure deployed widely throughout data-driven and simulation-driven disciplines, and there is substantial interest in understanding its global behavior across a broad class of nonconvex problems and noise models. Recent analyses of SGD have made noteworthy progress in this direction and have introduced important and insightful new strategies for understanding SGD. However, these analyses often impose restrictions (e.g., convexity, global Lipschitz continuity, uniform Hölder continuity, expected smoothness) that leave room for innovation. In this work, we address this gap by proving that, for a rather general class of nonconvex functions and noise models, SGD's iterates either diverge to infinity or converge to a stationary point with probability one. By further restricting to globally Hölder continuous functions and the expected smoothness noise model, we prove that -- regardless of whether the iterates diverge or remain finite -- the norm of the gradient evaluated at SGD's iterates converges to zero with probability one and in expectation. As a result of this work, we broaden the scope of nonconvex problems and noise models to which SGD can be applied with rigorous guarantees on its global behavior.
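The result concerns the standard SGD recursion x_{k+1} = x_k - alpha_k * g_k, where g_k is a noisy estimate of the gradient at x_k and alpha_k is the step size. As a rough illustration of this setting only (not the paper's analysis; the objective, step-size schedule, and noise model below are chosen purely for demonstration), here is a minimal sketch of SGD on a simple nonconvex function with additive Gaussian gradient noise:

```python
import numpy as np

def sgd(grad_oracle, x0, step_sizes, num_iters):
    """Plain SGD: x_{k+1} = x_k - alpha_k * g_k, with g_k from a stochastic oracle."""
    x = np.asarray(x0, dtype=float)
    for k in range(num_iters):
        g = grad_oracle(x)            # noisy, unbiased gradient estimate
        x = x - step_sizes[k] * g     # SGD update
    return x

# Illustrative nonconvex objective: f(x) = x^4 - 3x^2,
# with stationary points at x = 0 and x = +/- sqrt(1.5).
rng = np.random.default_rng(0)
true_grad = lambda x: 4 * x**3 - 6 * x
oracle = lambda x: true_grad(x) + rng.normal(size=x.shape)  # additive Gaussian noise
steps = [0.5 / (k + 10) for k in range(10_000)]             # diminishing step sizes
x_final = sgd(oracle, x0=[0.5], step_sizes=steps, num_iters=10_000)
print(x_final, true_grad(x_final))  # iterate near a stationary point, gradient near zero
```

In this toy run the iterates stay bounded and the gradient norm shrinks, matching the dichotomy the paper proves in far greater generality: either the iterates diverge to infinity, or they converge to a stationary point with probability one.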
