The proximal alternating direction method of multipliers in the nonconvex setting: convergence analysis and rates
Abstract: We propose two numerical algorithms in the fully nonconvex setting for the minimization of the sum of a smooth function and the composition of a nonsmooth function with a linear operator. The iterative schemes are formulated in the spirit of the proximal alternating direction method of multipliers and its linearized variant, respectively. The proximal terms are introduced via variable metrics, which allows us to derive new proximal splitting algorithms for nonconvex structured optimization problems as particular instances of the general schemes. Under mild conditions on the sequence of variable metrics, and assuming that a regularization of the associated augmented Lagrangian has the Kurdyka-Lojasiewicz property, we prove that the iterates converge to a KKT point of the optimization problem. Assuming that the augmented Lagrangian has the Lojasiewicz property, we also derive convergence rates for both the augmented Lagrangian and the iterates.
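To illustrate the structure of the schemes described above, the following is a minimal sketch of a linearized proximal ADMM iteration for a problem of the form min_x f(x) + g(Ax), with f smooth (possibly nonconvex) and g nonsmooth. This is an illustrative simplification, not the paper's algorithm: the variable-metric proximal terms are replaced by a plain gradient step with a fixed step size `tau`, and all problem data (the choices of `f`, `g`, `A`, and the parameters `beta`, `tau`) are hypothetical.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (used here as the prox of g).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def linearized_admm(A, grad_f, prox_g, x0, beta=2.0, tau=0.1, iters=5000):
    """Sketch of a linearized ADMM for min f(x) + g(z) s.t. Ax = z.

    Augmented Lagrangian: f(x) + g(z) + <y, Ax - z> + (beta/2)||Ax - z||^2.
    """
    x = x0.copy()
    z = A @ x
    y = np.zeros_like(z)
    for _ in range(iters):
        # x-update: gradient (linearized) step on the smooth part of the
        # augmented Lagrangian; a stand-in for a variable-metric proximal step.
        x = x - tau * (grad_f(x) + A.T @ (y + beta * (A @ x - z)))
        # z-update: exact proximal step on g.
        z = prox_g(A @ x + y / beta, 1.0 / beta)
        # Dual ascent on the multiplier.
        y = y + beta * (A @ x - z)
    return x, z, y

# Hypothetical instance: f(x) = sum(log(1 + (x - b)^2)) is smooth and
# nonconvex; g = ||.||_1 is nonsmooth; A is a small random linear operator.
rng = np.random.default_rng(0)
n = 5
b = rng.standard_normal(n)
A = rng.standard_normal((n, n))
A = A / (2.0 * np.linalg.norm(A, 2))  # keep ||A|| small so tau is admissible

grad_f = lambda x: 2.0 * (x - b) / (1.0 + (x - b) ** 2)
x, z, y = linearized_admm(A, grad_f, soft_threshold, x0=np.ones(n))
primal_residual = np.linalg.norm(A @ x - z)
```

The step size `tau` must be small relative to the Lipschitz constant of the smooth part (here roughly 1 / (L_f + beta * ||A||^2)), which is the kind of condition the convergence analysis makes precise.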