
Does $\ell_p$-minimization outperform $\ell_1$-minimization?

Published 15 Jan 2015 in cs.IT, math.IT, math.ST, and stat.TH | (1501.03704v2)

Abstract: In many application areas we face the following question: can we recover a sparse vector $x_o \in \mathbb{R}^N$ from an undersampled set of noisy observations $y \in \mathbb{R}^n$, $y = A x_o + w$? The last decade has witnessed a surge of algorithms and theoretical results addressing this question. One of the most popular algorithms is $\ell_p$-regularized least squares (LPLS), given by the formulation \[ \hat{x}(\gamma, p) \in \arg\min_x \frac{1}{2}\|y - Ax\|_2^2 + \gamma \|x\|_p^p, \] where $p \in [0,1]$. Despite the non-convexity of these problems for $p < 1$, they remain appealing because of the following folklore claims in compressed sensing: (i) $\hat{x}(\gamma, p)$ is closer to $x_o$ than $\hat{x}(\gamma, 1)$; (ii) if we employ iterative methods that aim to converge to a local minimum of LPLS, then under good initialization these algorithms converge to a solution that is closer to $x_o$ than $\hat{x}(\gamma, 1)$. Despite the existence of plenty of empirical results supporting these folklore claims, theoretical progress toward establishing them has been very limited. This paper studies these claims and establishes their scope of validity. Starting with the approximate message passing (AMP) algorithm as a heuristic method for solving LPLS, we study the impact of initialization on the performance of AMP. We then employ replica analysis to show the connection between the solution of AMP and $\hat{x}(\gamma, p)$ in the asymptotic setting. This enables us to compare the accuracy of $\hat{x}(\gamma, p)$ for $p \in [0,1]$. In particular, we accurately characterize the phase transition and noise sensitivity of LPLS for every $0 \leq p \leq 1$. Our results in the noiseless setting confirm that LPLS exhibits the same phase transition for every $0 \leq p < 1$, and this phase transition is much higher than that of LASSO.
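To make the LPLS formulation concrete, here is a minimal sketch of its convex $p = 1$ instance (LASSO) solved by iterative soft thresholding (ISTA). This is illustrative code, not the paper's method: the paper analyzes AMP and the non-convex $p < 1$ cases, which would require replacing the soft-thresholding step with an $\ell_p$ proximal map and would only guarantee convergence to a local minimum. All names, dimensions, and parameter values below are assumptions chosen for the demo.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||x||_1, i.e. the p = 1 case of the penalty.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista_lasso(A, y, gamma, iters=500):
    """Iterative soft thresholding for the p = 1 instance of LPLS:
        min_x 0.5 * ||y - A x||_2^2 + gamma * ||x||_1.
    For p < 1 one would swap in an l_p thresholding rule instead of
    soft_threshold; the objective is then non-convex."""
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient (sigma_max(A)^2)
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - y)              # gradient of the quadratic term
        x = soft_threshold(x - grad / L, gamma / L)  # proximal gradient step
    return x

# Toy demo: recover a sparse x_o from undersampled noisy y = A x_o + w.
rng = np.random.default_rng(0)
N, n, k = 100, 50, 5                       # ambient dim, measurements, sparsity
A = rng.standard_normal((n, N)) / np.sqrt(n)
x_o = np.zeros(N)
x_o[:k] = 1.0
y = A @ x_o + 0.01 * rng.standard_normal(n)
x_hat = ista_lasso(A, y, gamma=0.02)
```

The step size $1/L$ with $L = \sigma_{\max}(A)^2$ guarantees descent for the smooth part of the objective; in practice a backtracking line search or FISTA-style acceleration is often used instead.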

Citations (25)
