Robust Accelerated Primal-Dual Methods for Computing Saddle Points

Published 24 Nov 2021 in math.OC (arXiv:2111.12743v4)

Abstract: We consider strongly-convex-strongly-concave saddle point problems, assuming we have access to unbiased stochastic estimates of the gradients. We propose a stochastic accelerated primal-dual (SAPD) algorithm and show that the SAPD sequence, generated using constant primal-dual step sizes, linearly converges to a neighborhood of the unique saddle point. Interpreting the size of the neighborhood as a measure of robustness to gradient noise, we obtain explicit characterizations of robustness in terms of SAPD parameters and problem constants. Based on these characterizations, we develop computationally tractable techniques for optimizing the SAPD parameters, i.e., the primal and dual step sizes and the momentum parameter, to achieve a desired trade-off between convergence rate and robustness on the Pareto curve. This allows SAPD, as an accelerated method, to enjoy fast convergence while remaining robust to noise. SAPD admits convergence guarantees in the distance metric with a variance term that is optimal up to a logarithmic factor, which can be removed by employing a restarting strategy. We also discuss how the convergence and robustness results extend to the convex-concave setting. Finally, we illustrate our framework on a distributionally robust logistic regression problem.
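
The abstract describes SAPD only at a high level. As a concrete illustration, below is a minimal sketch of a stochastic accelerated primal-dual iteration in the spirit of the method, applied to a toy strongly-convex-strongly-concave problem. The test problem, the coupling matrix A, the step sizes tau and sigma, the momentum parameter theta, and the noise level are all hypothetical choices made for demonstration; the paper's contribution is precisely how to choose these parameters to trade off convergence rate against robustness, which this sketch does not reproduce.

    # Sketch of a stochastic accelerated primal-dual iteration for
    #   min_x max_y  L(x, y),
    # with L strongly convex in x and strongly concave in y, using noisy
    # (unbiased) gradient estimates. Parameters below are illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy problem: L(x, y) = (mu_x/2)||x||^2 + y^T A x - (mu_y/2)||y||^2,
    # whose unique saddle point is (x*, y*) = (0, 0).
    d, m = 20, 10
    A = rng.standard_normal((m, d))
    mu_x, mu_y = 1.0, 1.0
    noise = 0.01  # std of the additive gradient noise (unbiased estimates)

    def grad_x(x, y):
        return mu_x * x + A.T @ y + noise * rng.standard_normal(d)

    def grad_y(x, y):
        return A @ x - mu_y * y + noise * rng.standard_normal(m)

    # Hypothetical constant step sizes and momentum parameter; the paper
    # shows how to tune these for a desired rate/robustness trade-off.
    tau, sigma, theta = 0.05, 0.05, 0.9

    x, y = np.zeros(d), np.zeros(m)
    g_y_prev = grad_y(x, y)

    for k in range(2000):
        # Dual ascent step with momentum (extrapolation) on the dual gradient.
        g_y = grad_y(x, y)
        q = g_y + theta * (g_y - g_y_prev)
        y = y + sigma * q
        # Primal descent step using the freshly updated dual variable.
        x = x - tau * grad_x(x, y)
        g_y_prev = g_y

    # Since the saddle point of the toy problem is the origin, the norms below
    # measure the distance to the saddle point (up to the noise neighborhood).
    print("||x|| =", np.linalg.norm(x), " ||y|| =", np.linalg.norm(y))

With constant step sizes the iterates contract linearly toward the saddle point but, because the gradients are noisy, only down to a neighborhood whose size depends on tau, sigma, and theta; this is the robustness quantity the paper characterizes and optimizes.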
