High Probability Complexity Bounds for Adaptive Step Search Based on Stochastic Oracles

Published 11 Jun 2021 in math.OC (arXiv:2106.06454v5)

Abstract: We consider a step search method for continuous optimization under a stochastic setting where the function values and gradients are available only through inexact probabilistic zeroth- and first-order oracles. Unlike the stochastic gradient method and its many variants, the algorithm does not use a pre-specified sequence of step sizes but increases or decreases the step size adaptively according to the estimated progress of the algorithm. These oracles capture multiple standard settings including expected loss minimization and zeroth-order optimization. Moreover, our framework is very general and allows the function and gradient estimates to be biased. The proposed algorithm is simple to describe and easy to implement. Under fairly general conditions on the oracles, we derive a high probability tail bound on the iteration complexity of the algorithm when it is applied to non-convex, convex, and strongly convex (more generally, those satisfying the PL condition) functions. Our analysis strengthens and extends prior results for stochastic step and line search methods.
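The abstract describes an algorithm that, instead of following a pre-specified step-size schedule, grows or shrinks the step size based on the estimated progress reported by noisy oracles. A minimal sketch of this idea, assuming an Armijo-style sufficient-decrease test (the function names, parameters, and exact test are illustrative assumptions, not the paper's precise algorithm):

```python
import numpy as np

def adaptive_step_search(grad_oracle, value_oracle, x0, alpha0=1.0,
                         gamma=2.0, theta=0.1, max_iters=1000, tol=1e-6):
    """Sketch of a stochastic step search. `grad_oracle` and `value_oracle`
    return (possibly noisy, possibly biased) estimates of the gradient and
    function value; the step size is adapted from observed progress."""
    x, alpha = np.asarray(x0, dtype=float), alpha0
    for _ in range(max_iters):
        g = grad_oracle(x)                      # noisy first-order estimate
        if np.linalg.norm(g) < tol:
            break
        x_trial = x - alpha * g
        # Estimated sufficient decrease (Armijo-style), using the
        # zeroth-order oracle on both the current and trial points.
        if value_oracle(x_trial) <= value_oracle(x) - theta * alpha * (g @ g):
            x, alpha = x_trial, gamma * alpha   # success: accept, grow step
        else:
            alpha = alpha / gamma               # failure: reject, shrink step
    return x
```

With noiseless oracles for the quadratic f(x) = ||x||^2 (gradient 2x), the sketch shrinks the step once and then converges; with stochastic oracles, the paper's analysis bounds how often such success/failure decisions can be misled by oracle noise.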

Citations (7)
