Stochastic Variance-Reduced Heavy Ball Power Iteration

Published 24 Jan 2019 in math.OC | (1901.08179v1)

Abstract: We present a stochastic variance-reduced heavy ball power iteration algorithm for solving PCA and provide a convergence analysis for it. The algorithm extends heavy ball power iteration with a step size, so that progress can be controlled according to the magnitude of the variance of the stochastic gradients. The algorithm works with any mini-batch size and, if the step size is chosen appropriately, attains global linear convergence in expectation to the first eigenvector of the covariance matrix. This global linear convergence result in expectation is analogous to those of stochastic variance-reduced gradient methods for convex optimization, but, owing to the non-convexity of PCA, no such result had been established for previous stochastic variants of power iteration, as it requires very different techniques. We provide the first such analysis and stress that our framework can be used to establish convergence of the previous stochastic algorithms for any initial vector and in expectation. Experimental results show that the algorithm attains acceleration in the large-batch regime, outperforming benchmark algorithms especially when the eigen-gap is small.
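The abstract describes a power-iteration update that combines three ingredients: a heavy-ball momentum term, SVRG-style variance-reduced mini-batch estimates of the covariance-vector product, and a step size that tempers each stochastic step. The exact update rule is given in the paper; the following is only an illustrative sketch of how these ingredients fit together, with the update form, function name, and all hyperparameter values (`eta`, `beta`, `batch`, epoch/inner-loop counts) chosen here for demonstration rather than taken from the paper.

```python
import numpy as np

def svr_heavy_ball_power(X, eta=0.5, beta=0.05, batch=64,
                         epochs=20, inner=50, seed=0):
    """Illustrative sketch (not the paper's exact update rule):
    power iteration with a heavy-ball momentum term, where the
    covariance-vector product A @ w (A = X.T @ X / n) is estimated
    with SVRG-style variance reduction over mini-batches."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)
    w_prev = w.copy()
    for _ in range(epochs):
        # Anchor point: exact full-data product A @ w_tilde,
        # recomputed once per epoch as in SVRG.
        w_tilde = w.copy()
        full = X.T @ (X @ w_tilde) / n
        for _ in range(inner):
            idx = rng.choice(n, size=batch, replace=False)
            Xb = X[idx]
            # Variance-reduced estimate of A @ w: mini-batch product
            # at w, corrected by the anchor's mini-batch/full gap.
            g = (Xb.T @ (Xb @ w) - Xb.T @ (Xb @ w_tilde)) / batch + full
            # Heavy-ball step: step size eta, momentum coefficient beta.
            w_new = w + eta * g - beta * w_prev
            w_prev = w
            w = w_new / np.linalg.norm(w_new)
    return w
```

On data whose covariance has a dominant direction, the returned unit vector aligns (up to sign) with the first eigenvector; because the variance-reduced estimate is exact at the anchor point, larger mini-batches shrink the per-step noise, which is the regime where the paper reports acceleration.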

Authors (2)