
Finding Low-Rank Solutions via Non-Convex Matrix Factorization, Efficiently and Provably

Published 10 Jun 2016 in math.OC, cs.DS, cs.IT, cs.LG, cs.NA, and math.IT | arXiv:1606.03168v3

Abstract: A rank-$r$ matrix $X \in \mathbb{R}^{m \times n}$ can be written as a product $U V^\top$, where $U \in \mathbb{R}^{m \times r}$ and $V \in \mathbb{R}^{n \times r}$. One could exploit this observation in optimization: e.g., consider the minimization of a convex function $f(X)$ over rank-$r$ matrices, where the set of rank-$r$ matrices is modeled via the factorization $UV^\top$. Though such a parameterization reduces the number of variables and is more computationally efficient (of particular interest is the case $r \ll \min\{m, n\}$), it comes at a cost: $f(UV^\top)$ becomes a non-convex function w.r.t. $U$ and $V$. We study such parameterization for optimization of generic convex objectives $f$, and focus on first-order, gradient descent algorithmic solutions. We propose the Bi-Factored Gradient Descent (BFGD) algorithm, an efficient first-order method that operates on the $U, V$ factors. We show that when $f$ is (restricted) smooth, BFGD has local sublinear convergence, and linear convergence when $f$ is both (restricted) smooth and (restricted) strongly convex. For several key applications, we provide simple and efficient initialization schemes that provide approximate solutions good enough for the above convergence results to hold.
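The factored approach the abstract describes can be illustrated with a minimal sketch: run gradient descent directly on the $U, V$ factors of the objective $f(UV^\top)$, started from a spectral initialization. The sketch below uses the simple quadratic objective $f(X) = \tfrac{1}{2}\|X - M\|_F^2$ and a plain step size; it is a simplified illustration of the idea, not the authors' exact BFGD algorithm (which uses a tailored step-size rule and operates on general restricted smooth/strongly convex $f$).

```python
import numpy as np

# Illustrative factored gradient descent on f(X) = 0.5 * ||X - M||_F^2,
# parameterizing X = U V^T as in the abstract. Simplified sketch, not the
# paper's exact BFGD (which prescribes a specific step size for general f).

rng = np.random.default_rng(0)
m, n, r = 30, 20, 3
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # rank-r target

# Spectral initialization: balanced factors from the top-r SVD of M,
# then a small perturbation so the iterates actually have work to do.
Uf, s, Vt = np.linalg.svd(M, full_matrices=False)
U = Uf[:, :r] * np.sqrt(s[:r]) + 0.1 * rng.standard_normal((m, r))
V = Vt[:r, :].T * np.sqrt(s[:r]) + 0.1 * rng.standard_normal((n, r))

eta = 1.0 / (2.0 * s[0])  # conservative step size from the top singular value
for _ in range(500):
    G = U @ V.T - M  # gradient of f at X = U V^T
    # Gradients w.r.t. the factors: grad_U = G V, grad_V = G^T U.
    U, V = U - eta * G @ V, V - eta * G.T @ U

err = np.linalg.norm(U @ V.T - M) / np.linalg.norm(M)
print(f"relative error after factored gradient descent: {err:.2e}")
```

With the perturbed spectral start, the iterates contract back to the rank-$r$ target, matching the abstract's theme that a good enough initialization puts first-order updates on the factors inside a region of linear convergence.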

Citations (56)
