
Minimization Over the Nonconvex Sparsity Constraint Using a Hybrid First-Order Method

Published 9 Apr 2021 in math.OC (arXiv:2104.04400v2)

Abstract: We investigate a class of nonconvex optimization problems in which the objective is continuously differentiable and the feasible set is a level set of a level-bounded nonconvex regularizer. We propose a novel hybrid first-order approach to such structured problems that combines the Frank-Wolfe method with the gradient projection method. The Frank-Wolfe step admits a closed-form solution, while the gradient projection step can be performed efficiently in a reduced subspace. A notable feature of our approach is that it requires no smoothing parameters, so it solves the original nonsmooth problem directly. We establish global convergence of the proposed algorithm and show an $O(1/\sqrt{k})$ convergence rate in terms of the optimality error for nonconvex objectives under reasonable assumptions. Numerical experiments underscore the practicality and efficiency of the proposed algorithm compared to existing state-of-the-art methods. Furthermore, we highlight how the proposed algorithm contributes to the advancement of nonconvex regularizer-constrained optimization.
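To make the hybrid idea concrete, here is a minimal sketch (not the paper's exact algorithm) of alternating a closed-form Frank-Wolfe step with a projected-gradient step restricted to the current support. The $\ell_1$-ball is used as an illustrative stand-in for the regularizer level set, and the alternating schedule, step sizes, and support rule are assumptions for this sketch.

```python
import numpy as np

def fw_step(grad, radius, n):
    # Closed-form Frank-Wolfe oracle over the l1-ball:
    # the minimizer of <grad, s> over ||s||_1 <= radius is a signed vertex.
    i = np.argmax(np.abs(grad))
    s = np.zeros(n)
    s[i] = -radius * np.sign(grad[i])
    return s

def project_l1(v, radius):
    # Euclidean projection onto the l1-ball via sorting + soft-thresholding.
    if np.abs(v).sum() <= radius:
        return v
    u = np.sort(np.abs(v))[::-1]
    css = np.cumsum(u)
    k = np.nonzero(u * np.arange(1, v.size + 1) > css - radius)[0][-1]
    theta = (css[k] - radius) / (k + 1.0)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def hybrid_fw_gp(grad_f, x0, radius, step, iters=200):
    # Alternate a Frank-Wolfe step (global exploration, closed form) with a
    # projected-gradient step on the active support (local refinement).
    # This alternation loosely mirrors the paper's hybrid scheme; the exact
    # switching rule in the paper may differ.
    x = x0.copy()
    for k in range(iters):
        g = grad_f(x)
        if k % 2 == 0:
            s = fw_step(g, radius, x.size)
            gamma = 2.0 / (k + 2.0)          # standard FW step size
            x = x + gamma * (s - x)
        else:
            support = np.abs(x) > 1e-12      # reduced subspace: active support
            d = np.where(support, g, 0.0)
            x = project_l1(x - step * d, radius)
    return x

# Example: sparse least squares, min ||Ax - b||^2 over the l1-ball of radius 1.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
x_true = np.zeros(50); x_true[:3] = [1.0, -0.7, 0.4]
b = A @ x_true
grad = lambda x: 2.0 * A.T @ (A @ x - b)
L = 2.0 * np.linalg.norm(A, 2) ** 2          # gradient Lipschitz constant
x_hat = hybrid_fw_gp(grad, np.zeros(50), radius=1.0, step=1.0 / L)
```

The projection here touches all coordinates; the efficiency gain the paper emphasizes comes from performing that step only in the reduced subspace of active variables, which this sketch approximates by zeroing the gradient off the support.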
