AGDA+: Proximal Alternating Gradient Descent Ascent Method with a Nonmonotone Adaptive Step-Size Search for Nonconvex Minimax Problems

Published 20 Jun 2024 in math.OC (arXiv:2406.14371v2)

Abstract: We consider double-regularized nonconvex-strongly concave (NCSC) minimax problems of the form $(P):\min_{x\in\mathcal{X}} \max_{y\in\mathcal{Y}}g(x)+f(x,y)-h(y)$, where $g$, $h$ are closed convex, $f$ is $L$-smooth in $(x,y)$ and strongly concave in $y$. We propose a proximal alternating gradient descent ascent method, AGDA+, that adaptively chooses nonmonotone primal-dual step sizes to compute an approximate stationary point of $(P)$ without requiring knowledge of the global Lipschitz constant $L$ or the concavity modulus $\mu$. Through a nonmonotone step-size search (backtracking) scheme, AGDA+ stands out for its ability to exploit the local Lipschitz structure, eliminating the need for precise tuning of hyper-parameters. AGDA+ achieves the optimal iteration complexity of $\mathcal{O}(\epsilon^{-2})$ and is the first step-size search method for NCSC minimax problems that requires only $3$ calls to $\nabla f$ on average per backtracking iteration. Numerical experiments demonstrate its robustness and efficiency.
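To make the alternating structure concrete, here is a minimal Python sketch of a proximal alternating gradient descent ascent loop with a nonmonotone backtracking primal step. The toy instance (a quadratic with indefinite $B$, a ball constraint playing the role of $g$, and $h=0$) and the generic descent-lemma acceptance test are illustrative assumptions for this sketch; the paper's actual AGDA+ acceptance condition couples primal and dual iterates and is more refined.

```python
import numpy as np

# Toy double-regularized minimax instance (an assumption for illustration, not the
# paper's test problems): f(x, y) = 0.5 x^T B x + x^T A y - 0.5 mu ||y||^2 with an
# indefinite B (nonconvex in x, strongly concave in y), g = indicator of a ball
# (prox = projection), h = 0 (dual prox = identity).
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
B = np.diag([1.0, -2.0, 0.5, -0.5, 1.5])   # indefinite => nonconvex in x
mu = 1.0
radius = 2.0

f = lambda x, y: 0.5 * x @ B @ x + x @ A @ y - 0.5 * mu * y @ y
grad_x = lambda x, y: B @ x + A @ y
grad_y = lambda x, y: A.T @ x - mu * y
proj = lambda x: x if np.linalg.norm(x) <= radius else radius * x / np.linalg.norm(x)

def agda_plus_sketch(x, y, iters=200, tau=1.0, sigma=1.0 / mu, grow=2.0, shrink=0.5):
    """Alternating proximal GDA with a nonmonotone backtracking primal step.

    The acceptance test below is a generic prox-gradient sufficient-decrease
    condition, used here only to show the backtracking structure.
    """
    for _ in range(iters):
        tau *= grow                       # nonmonotone: try a larger step first
        while True:
            x_trial = proj(x - tau * grad_x(x, y))        # proximal descent step
            gap = x_trial - x
            # Descent-lemma-style check; passes once tau matches local smoothness.
            if f(x_trial, y) <= f(x, y) + grad_x(x, y) @ gap + (0.5 / tau) * gap @ gap:
                break
            tau *= shrink                 # backtrack: shrink the primal step
        x = x_trial
        y = y + sigma * grad_y(x, y)      # dual ascent step (h = 0, so no prox)
    return x, y

x_out, y_out = agda_plus_sketch(np.ones(n), np.zeros(n))
print("primal stationarity:", np.linalg.norm(x_out - proj(x_out - grad_x(x_out, y_out))))
print("dual gradient norm: ", np.linalg.norm(grad_y(x_out, y_out)))
```

Because the step size is re-enlarged before every backtracking search, the scheme can track local Lipschitz behavior rather than being pinned to a worst-case global constant, which is the property the paper's nonmonotone search is designed to exploit.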
