Guarantees of Riemannian Optimization for Low Rank Matrix Recovery

Published 5 Nov 2015 in math.NA (arXiv:1511.01562v8)

Abstract: We establish theoretical recovery guarantees for a family of Riemannian optimization algorithms for low rank matrix recovery, the problem of recovering an $m\times n$ rank $r$ matrix from $p < mn$ linear measurements. The algorithms are first interpreted as iterative hard thresholding algorithms with subspace projections. Based on this connection, we show that provided the restricted isometry constant $R_{3r}$ of the sensing operator is less than $C_\kappa/\sqrt{r}$, the Riemannian gradient descent algorithm and a restarted variant of the Riemannian conjugate gradient algorithm are guaranteed to converge linearly to the underlying rank $r$ matrix when initialized by one-step hard thresholding. Empirical evaluation shows that the algorithms are able to recover a low rank matrix from nearly the minimum number of measurements necessary.
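To make the iteration concrete, here is a minimal Python sketch of a Riemannian gradient descent step of the kind the abstract describes: form the gradient of $\frac{1}{2}\|\mathcal{A}(X)-y\|_2^2$, project it onto the tangent space of the rank-$r$ manifold at the current iterate, step, and retract by hard thresholding (truncated SVD), starting from the one-step hard thresholding initialization $X_0 = \mathcal{H}_r(\mathcal{A}^*(y))$. The dense Gaussian sensing matrix, exact steepest-descent step size, iteration cap, and stopping test are illustrative assumptions, not the authors' implementation; the function names are hypothetical.

```python
import numpy as np

def hard_threshold(Z, r):
    """H_r(Z): best rank-r approximation of Z via truncated SVD,
    returned together with its singular-vector factors."""
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return U[:, :r] * s[:r] @ Vt[:r, :], U[:, :r], Vt[:r, :]

def riemannian_gd(A, y, m, n, r, iters=500, tol=1e-10):
    """Recover a rank-r m x n matrix X from y = A @ vec(X),
    where A is a p x (m*n) sensing matrix with p < m*n."""
    # One-step hard thresholding initialization: X0 = H_r(A^*(y)).
    X, U, Vt = hard_threshold((A.T @ y).reshape(m, n), r)
    for _ in range(iters):
        # Euclidean gradient of 0.5 * ||A(X) - y||^2:  A^*(A(X) - y).
        G = (A.T @ (A @ X.ravel() - y)).reshape(m, n)
        # Project onto the tangent space of the rank-r manifold at X:
        # P_T(G) = P_U G + G P_V - P_U G P_V.
        PU_G = U @ (U.T @ G)
        G_PV = (G @ Vt.T) @ Vt
        Gt = PU_G + G_PV - (PU_G @ Vt.T) @ Vt
        # Exact steepest-descent step along -Gt (assumed step-size rule):
        # alpha = ||Gt||_F^2 / ||A(Gt)||_2^2.
        AGt = A @ Gt.ravel()
        alpha = np.sum(Gt * Gt) / (AGt @ AGt)
        # Retract: hard threshold the free-space step back to rank r.
        X, U, Vt = hard_threshold(X - alpha * Gt, r)
        if np.linalg.norm(A @ X.ravel() - y) <= tol * np.linalg.norm(y):
            break
    return X
```

A quick synthetic check, with $p$ a few times the $r(m+n-r)$ degrees of freedom of a rank-$r$ matrix (the sizes below are arbitrary):

```python
rng = np.random.default_rng(0)
m, n, r = 30, 30, 2
p = 4 * r * (m + n - r)                            # still p < m*n
A = rng.standard_normal((p, m * n)) / np.sqrt(p)   # Gaussian sensing operator
X_true = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
y = A @ X_true.ravel()
X_hat = riemannian_gd(A, y, m, n, r)
print(np.linalg.norm(X_hat - X_true) / np.linalg.norm(X_true))  # near zero on success
```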
