Acceleration of saddle-point methods in smooth cases
Published 13 Dec 2016 in math.OC | (1612.04141v3)
Abstract: In the present paper we propose a novel convergence analysis of the Alternating Direction Method of Multipliers (ADMM), based on its equivalence with the overrelaxed Primal-Dual Hybrid Gradient (oPDHG) algorithm. We consider the smooth case, which corresponds to the case where the objective function can be decomposed into one differentiable part with Lipschitz continuous gradient and one strongly convex part. An accelerated variant of the ADMM is also proposed, which is shown to converge linearly at the same rate as the oPDHG.
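The smooth setting described in the abstract can be illustrated with a minimal sketch of the overrelaxed PDHG iteration on a toy problem. This is not the paper's exact algorithm or analysis; the problem instance, step sizes, and extrapolation parameter below are illustrative assumptions. We minimize f(Ax) + g(x) with f(y) = ½‖y − b‖² (differentiable, Lipschitz gradient) and g(x) = (μ/2)‖x‖² (strongly convex), matching the structural assumptions of the smooth case.

```python
import numpy as np

# Toy instance: min_x 0.5*||A x - b||^2 + 0.5*mu*||x||^2
# f(y) = 0.5*||y - b||^2  -> conjugate f*(y) = 0.5*||y||^2 + <y, b>
# g(x) = 0.5*mu*||x||^2   -> strongly convex with modulus mu
rng = np.random.default_rng(0)
m, n = 20, 10
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
mu = 0.5

L = np.linalg.norm(A, 2)      # operator norm of A
sigma = tau = 1.0 / L         # step sizes with sigma * tau * L^2 = 1
theta = 1.0                   # extrapolation (overrelaxation) parameter

x = np.zeros(n)
x_bar = x.copy()
y = np.zeros(m)

for _ in range(3000):
    x_old = x
    # dual step: prox of sigma*f*, i.e. v -> (v - sigma*b) / (1 + sigma)
    y = (y + sigma * (A @ x_bar) - sigma * b) / (1.0 + sigma)
    # primal step: prox of tau*g, i.e. w -> w / (1 + tau*mu)
    x = (x - tau * (A.T @ y)) / (1.0 + tau * mu)
    # overrelaxation / extrapolation of the primal variable
    x_bar = x + theta * (x - x_old)

# compare against the closed-form minimizer (A^T A + mu I)^{-1} A^T b
x_star = np.linalg.solve(A.T @ A + mu * np.eye(n), A.T @ b)
print(np.linalg.norm(x - x_star))
```

Because f has a Lipschitz gradient (so f* is strongly convex) and g is strongly convex, both proximal maps contract and the iterates converge linearly to the closed-form solution, which is the regime the paper's linear-rate result addresses.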