
Finding Local Minima via Stochastic Nested Variance Reduction

Published 22 Jun 2018 in cs.LG and stat.ML (arXiv:1806.08782v1)

Abstract: We propose two algorithms that can find local minima faster than the state-of-the-art algorithms in both finite-sum and general stochastic nonconvex optimization. At the core of the proposed algorithms is $\text{One-epoch-SNVRG}^+$ using stochastic nested variance reduction (Zhou et al., 2018a), which outperforms the state-of-the-art variance reduction algorithms such as SCSG (Lei et al., 2017). In particular, for finite-sum optimization problems, the proposed $\text{SNVRG}^{+}+\text{Neon2}^{\text{finite}}$ algorithm achieves $\tilde{O}(n^{1/2}\epsilon^{-2}+n\epsilon_H^{-3}+n^{3/4}\epsilon_H^{-7/2})$ gradient complexity to converge to an $(\epsilon, \epsilon_H)$-second-order stationary point, which outperforms $\text{SVRG}+\text{Neon2}^{\text{finite}}$ (Allen-Zhu and Li, 2017), the best existing algorithm, in a wide regime. For general stochastic optimization problems, the proposed $\text{SNVRG}^{+}+\text{Neon2}^{\text{online}}$ achieves $\tilde{O}(\epsilon^{-3}+\epsilon_H^{-5}+\epsilon^{-2}\epsilon_H^{-3})$ gradient complexity, which is better than both $\text{SVRG}+\text{Neon2}^{\text{online}}$ (Allen-Zhu and Li, 2017) and Natasha2 (Allen-Zhu, 2017) in certain regimes. Furthermore, we explore the acceleration brought by third-order smoothness of the objective function.
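The nested variance reduction at the core of the paper builds on the classic single-level variance-reduced gradient estimator. The sketch below illustrates that basic (SVRG-style) building block on a toy least-squares problem; it is not the paper's One-epoch-SNVRG$^+$ (which nests reference points across multiple levels), and the problem, step size, and epoch counts are arbitrary choices for illustration:

```python
import numpy as np

# Minimal sketch of single-level stochastic variance reduction (SVRG-style),
# the building block that nested schemes like SNVRG extend with multiple
# nested reference points. Toy problem (not from the paper):
# minimize f(x) = (1/n) * sum_i 0.5 * (a_i @ x - b_i)**2.
rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.normal(size=(n, d))
x_star = rng.normal(size=d)     # ground-truth minimizer (noiseless targets)
b = A @ x_star

def grad_i(x, i):
    """Gradient of the i-th component function f_i."""
    return (A[i] @ x - b[i]) * A[i]

def full_grad(x):
    """Full gradient of f, averaged over all n components."""
    return A.T @ (A @ x - b) / n

x = np.zeros(d)
eta = 0.02                       # illustrative step size
for epoch in range(30):
    x_ref = x.copy()             # snapshot (reference) point
    g_ref = full_grad(x_ref)     # full gradient at the snapshot
    for _ in range(n):
        i = rng.integers(n)
        # Variance-reduced estimator: unbiased for the full gradient, and
        # its variance shrinks as x approaches the snapshot x_ref.
        v = grad_i(x, i) - grad_i(x_ref, i) + g_ref
        x -= eta * v

print(np.linalg.norm(x - x_star))  # distance to the exact minimizer
```

The key property is that `v` stays unbiased while its variance is controlled by the distance to the snapshot; SNVRG's improvement comes from maintaining several nested snapshots at different refresh frequencies instead of one.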

Citations (23)


Authors (3)
