
Escape saddle points faster on manifolds via perturbed Riemannian stochastic recursive gradient

Published 23 Oct 2020 in math.OC and cs.LG (arXiv:2010.12191v2)

Abstract: In this paper, we propose a variant of the Riemannian stochastic recursive gradient method that achieves a second-order convergence guarantee and escapes saddle points using simple perturbation. The idea is to perturb the iterates when the gradient is small and carry out stochastic recursive gradient updates over the tangent space. This avoids the complication of exploiting Riemannian geometry. We show that under the finite-sum setting, our algorithm requires $\widetilde{\mathcal{O}}\big( \frac{\sqrt{n}}{\epsilon^2} + \frac{\sqrt{n}}{\delta^4} + \frac{n}{\delta^3} \big)$ stochastic gradient queries to find an $(\epsilon, \delta)$-second-order critical point. This strictly improves the complexity of perturbed Riemannian gradient descent and is superior to perturbed Riemannian accelerated gradient descent under large-sample settings. We also provide a complexity of $\widetilde{\mathcal{O}}\big( \frac{1}{\epsilon^3} + \frac{1}{\delta^3 \epsilon^2} + \frac{1}{\delta^4 \epsilon} \big)$ for online optimization, which is novel on Riemannian manifolds in terms of second-order convergence using only first-order information.
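To make the two-part idea in the abstract concrete (perturb when the gradient is small, then run recursive gradient updates inside a fixed tangent space), here is a minimal NumPy sketch of what such a loop could look like. This is not the authors' exact algorithm: the callbacks `pullback_grad` and `retract`, all parameter names, and the ball-sampling step are assumptions introduced for illustration, and the inner loop uses a standard SARAH-style recursive estimator as a stand-in for the paper's estimator.

```python
import numpy as np

def perturbed_rsrg_sketch(x0, pullback_grad, retract, n, step,
                          g_thresh, radius, epoch_len, n_epochs, rng=None):
    """Hedged sketch of a perturbed Riemannian stochastic recursive
    gradient loop (finite-sum setting).

    Hypothetical callbacks the caller must supply for the manifold at hand:
      pullback_grad(x, u, i) -> gradient of the pullback f_i(Retr_x(.))
                                at u in T_x M, as an ndarray
      retract(x, u)          -> retraction Retr_x(u), mapping u back to M
    Tangent vectors are represented as ndarrays with the same shape as the
    iterate (e.g., a manifold embedded in Euclidean space).
    """
    rng = rng or np.random.default_rng()
    x = np.asarray(x0, dtype=float)
    for _ in range(n_epochs):
        u = np.zeros_like(x)  # epoch anchor: the origin of T_x M
        # Full pullback gradient at the anchor (one pass over all n terms).
        v = sum(pullback_grad(x, u, i) for i in range(n)) / n
        if np.linalg.norm(v) <= g_thresh:
            # Small gradient: perturb uniformly from a ball in T_x M to
            # help escape a potential saddle point.
            xi = rng.standard_normal(x.shape)
            xi *= radius * rng.random() ** (1.0 / xi.size) / np.linalg.norm(xi)
            u = xi
            v = sum(pullback_grad(x, u, i) for i in range(n)) / n
        for _ in range(epoch_len):
            u_next = u - step * v
            i = int(rng.integers(n))
            # SARAH-style recursive estimator kept inside the fixed tangent
            # space T_x M, so no vector transports are needed here.
            v = v + pullback_grad(x, u_next, i) - pullback_grad(x, u, i)
            u = u_next
        x = retract(x, u)  # map the tangent-space iterate back to the manifold
    return x
```

The key design point the abstract emphasizes is visible in the inner loop: because all recursive updates live in the single tangent space $T_x M$ of the epoch anchor, the estimator is purely Euclidean and no parallel transport or curvature bookkeeping is required between stochastic steps.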
