
Decentralized Conjugate Gradient and Memoryless BFGS Methods

Published 11 Sep 2024 in math.OC (arXiv:2409.07122v3)

Abstract: This paper proposes a new decentralized conjugate gradient (NDCG) method and a decentralized memoryless BFGS (DMBFGS) method for, respectively, nonconvex and strongly convex decentralized optimization: minimizing a finite sum of continuously differentiable functions over a fixed, connected, undirected network. Both methods apply gradient tracking techniques to enhance their convergence properties and numerical stability. In particular, we show global convergence of NDCG with a constant stepsize for general nonconvex smooth decentralized optimization. Our new DMBFGS method uses a scaled memoryless BFGS technique and requires only gradient information to approximate second-order information of the component functions in the objective. We also establish global convergence and a linear convergence rate for DMBFGS with a constant stepsize for strongly convex smooth decentralized optimization. Our numerical results show that NDCG and DMBFGS are highly efficient in both iteration and communication cost compared with other state-of-the-art methods for smooth decentralized optimization.
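The gradient tracking technique the abstract refers to can be illustrated with a standard DIGing-style iteration, in which each agent mixes its iterate with its neighbors' and maintains an auxiliary variable tracking the average gradient. The sketch below is a minimal, generic illustration on quadratic component functions over a 4-agent ring network; the mixing matrix, stepsize, and quadratics are illustrative choices, and this is not the paper's NDCG or DMBFGS algorithm.

```python
import numpy as np

# Sketch of decentralized optimization with gradient tracking:
# n agents cooperatively minimize sum_i f_i(x) over a connected
# network, communicating only with neighbors via a doubly
# stochastic mixing matrix W. All problem data is illustrative.

rng = np.random.default_rng(0)
n, d = 4, 3                                   # agents, variable dimension
A = [rng.standard_normal((d, d)) for _ in range(n)]
Q = [a.T @ a + np.eye(d) for a in A]          # SPD Hessians of the f_i
b = [rng.standard_normal(d) for _ in range(n)]

def grad(i, x):
    # gradient of f_i(x) = 0.5 x'Q_i x - b_i'x
    return Q[i] @ x - b[i]

# doubly stochastic mixing matrix for a 4-agent ring network
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i + 1) % n] = 0.25
    W[i, (i - 1) % n] = 0.25

X = np.zeros((n, d))                          # row i: agent i's iterate
Y = np.array([grad(i, X[i]) for i in range(n)])  # gradient trackers
alpha = 0.02                                  # constant stepsize
for _ in range(500):
    X_new = W @ X - alpha * Y                 # mix, then step along tracker
    G_new = np.array([grad(i, X_new[i]) for i in range(n)])
    G_old = np.array([grad(i, X[i]) for i in range(n)])
    Y = W @ Y + G_new - G_old                 # track the average gradient
    X = X_new

x_star = np.linalg.solve(sum(Q), sum(b))      # minimizer of sum_i f_i
```

After the loop, every agent's row of `X` has reached consensus near `x_star`; the tracker `Y` lets each agent step along an estimate of the network-wide average gradient rather than only its local one.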
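A memoryless BFGS step, as referenced in the abstract, can be sketched as an L-BFGS two-loop recursion with memory one: only the latest pair (s, y) = (x_k − x_{k−1}, g_k − g_{k−1}) is kept, so no matrix is ever stored, and a Barzilai-Borwein-type scaling plays the role of the initial inverse-Hessian estimate. This is the generic scaled memoryless BFGS direction, not the paper's exact DMBFGS update; the quadratic test problem is illustrative.

```python
import numpy as np

def memoryless_bfgs_direction(g, s, y):
    """Return d = -H g, where H is the scaled memoryless BFGS
    inverse-Hessian approximation built from the single pair (s, y)."""
    sy = s @ y
    if sy <= 1e-12:               # curvature condition fails:
        return -g                 # fall back to steepest descent
    rho = 1.0 / sy
    alpha = rho * (s @ g)
    q = g - alpha * y
    gamma = sy / (y @ y)          # Barzilai-Borwein-type scaling of I
    r = gamma * q
    beta = rho * (y @ r)
    return -(r + (alpha - beta) * s)

# usage on a convex quadratic f(x) = 0.5 x'Qx - b'x (illustrative data)
Q = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
x_prev = np.zeros(2)
x = np.array([0.1, 0.1])
g_prev, g = Q @ x_prev - b, Q @ x - b
d = memoryless_bfgs_direction(g, x - x_prev, g - g_prev)
```

Because the curvature condition s'y > 0 holds on a strongly convex function, H is positive definite and the returned d satisfies g'd < 0, i.e. it is a descent direction, while costing only a handful of inner products per iteration.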


Authors (3)
