Guarantees of Riemannian Optimization for Low Rank Matrix Completion

Published 21 Mar 2016 in math.NA and math.OC (arXiv:1603.06610v2)

Abstract: We study Riemannian optimization methods on the embedded manifold of low rank matrices for the problem of matrix completion, i.e., recovering a low rank matrix from a subset of its entries. Assume $m$ entries of an $n\times n$ rank $r$ matrix are sampled independently and uniformly with replacement. We first prove that, with high probability, the Riemannian gradient descent and conjugate gradient descent algorithms initialized by one step hard thresholding are guaranteed to converge linearly to the measured matrix provided \begin{align*} m\geq C_\kappa n^{1.5}r\log^{1.5}(n), \end{align*} where $C_\kappa$ is a numerical constant depending on the condition number of the underlying matrix. The sampling complexity is further improved to \begin{align*} m\geq C_\kappa nr^2\log^2(n) \end{align*} via the resampled Riemannian gradient descent initialization. The analysis of the new initialization procedure relies on an asymmetric restricted isometry property of the sampling operator and the curvature of the low rank matrix manifold. Numerical simulations show that the algorithms are able to recover a low rank matrix from nearly the minimum number of measurements.
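The abstract describes Riemannian gradient descent on the fixed-rank manifold, initialized by one step of hard thresholding (a truncated SVD of the rescaled observations). Below is a minimal NumPy sketch of that scheme, assuming the standard objective $f(X) = \frac{1}{2}\|P_\Omega(X - M)\|_F^2$ with a tangent-space projection and SVD retraction; the function names, exact line-search rule, and stopping test are illustrative rather than the authors' reference implementation.

```python
import numpy as np

def hard_threshold(Z, r):
    """H_r: best rank-r approximation via truncated SVD (used as the retraction)."""
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

def riemannian_gd_completion(M_obs, mask, r, iters=500, tol=1e-8):
    """Sketch of Riemannian gradient descent for matrix completion.

    M_obs holds the observed entries (zeros off the sampling set) and
    mask is the boolean sampling pattern Omega. Minimizes
    f(X) = 0.5 * ||P_Omega(X - M)||_F^2 over rank-r matrices.
    """
    p = mask.mean()                          # sampling ratio m / (n1 * n2)
    X = hard_threshold(M_obs / p, r)         # one-step hard-thresholding init
    for _ in range(iters):
        G = np.where(mask, M_obs - X, 0.0)   # negative Euclidean gradient
        if np.linalg.norm(G) <= tol * np.linalg.norm(M_obs):
            break
        # Tangent-space projection at X = U S V^T:
        #   P_T(G) = U U^T G + G V V^T - U U^T G V V^T
        U, _, Vt = np.linalg.svd(X, full_matrices=False)
        U, Vt = U[:, :r], Vt[:r, :]
        PtG = U @ (U.T @ G) + (G @ Vt.T) @ Vt - U @ (U.T @ G @ Vt.T) @ Vt
        denom = np.linalg.norm(np.where(mask, PtG, 0.0)) ** 2
        if denom == 0.0:
            break
        alpha = np.linalg.norm(PtG) ** 2 / denom  # exact line search along PtG
        X = hard_threshold(X + alpha * PtG, r)    # step, then retract to rank r
    return X

if __name__ == "__main__":
    # Toy usage: recover a random rank-5 matrix from ~30% of its entries.
    rng = np.random.default_rng(0)
    M = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 200))
    mask = rng.random(M.shape) < 0.3
    X = riemannian_gd_completion(np.where(mask, M, 0.0), mask, r=5)
    print(np.linalg.norm(X - M) / np.linalg.norm(M))  # relative error
```

The retraction by truncated SVD keeps every iterate on the rank-$r$ manifold, which is what distinguishes this scheme from plain projected gradient methods on the full matrix space.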
