Optimization landscape of $\ell_0$-Bregman relaxations
Abstract: In this paper, we study $\ell_0$-regularized optimization problems arising from (noisy) linear systems with general data fidelity terms. Recent approaches to this class of problems consider non-convex exact continuous relaxations that preserve global minimizers while reducing the number of local minimizers. Within this framework, we consider the class of $\ell_0$-Bregman relaxations and establish sufficient conditions under which a critical point is isolated in terms of sparsity, in the sense that any other critical point has strictly larger cardinality. This ensures a form of uniqueness in the solution structure. Furthermore, we analyze the exact recovery properties of such relaxations. To that end, we derive conditions under which the oracle solution (i.e., the one sharing the same support as the ground truth) is the unique global minimizer of the relaxed problem and is isolated in terms of sparsity. Our analysis is primarily built upon a novel property we introduce, termed Bregman Restricted Strong Convexity. Finally, we specialize our general results to both sparse Gaussian (least-squares) and Poisson ((generalized) Kullback-Leibler divergence) regression problems. In particular, we show that our analysis sharpens existing bounds in the LS setting while providing an entirely new result for the KL case.
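The problem class described in the abstract can be sketched as follows; this is an illustrative formulation under standard conventions, and the notation ($A$, $y$, $\lambda$, $F$) is not taken from the paper itself:

```latex
% Hedged sketch of the l0-regularized problem class: a data fidelity
% term F coupled with an l0 penalty. The two fidelity choices below
% correspond to the Gaussian (least-squares) and Poisson (generalized
% Kullback-Leibler) settings mentioned in the abstract.
\min_{x \in \mathbb{R}^n} \; F(Ax; y) + \lambda \|x\|_0,
\qquad
F(u; y) =
\begin{cases}
\tfrac{1}{2}\,\|u - y\|_2^2
  & \text{(Gaussian / least squares)} \\[4pt]
\sum_i \big( u_i - y_i + y_i \log \tfrac{y_i}{u_i} \big)
  & \text{(Poisson / generalized KL)}
\end{cases}
```

Exact continuous relaxations replace the discontinuous $\|x\|_0$ term with a continuous non-convex penalty chosen so that global minimizers are preserved.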