
The Effectiveness of Local Updates for Decentralized Learning under Data Heterogeneity

Published 23 Mar 2024 in cs.LG and math.OC (arXiv:2403.15654v3)

Abstract: We revisit two fundamental decentralized optimization methods, Decentralized Gradient Tracking (DGT) and Decentralized Gradient Descent (DGD), with multiple local updates. We consider two settings and demonstrate that incorporating local update steps can reduce communication complexity. Specifically, for $\mu$-strongly convex and $L$-smooth loss functions, we prove that local DGT achieves communication complexity $\tilde{\mathcal{O}}\Big(\frac{L}{\mu(K+1)} + \frac{\delta + \mu}{\mu (1 - \rho)} + \frac{\rho}{(1 - \rho)^2} \cdot \frac{L + \delta}{\mu}\Big)$, where $K$ is the number of additional local updates, $\rho$ measures the network connectivity, and $\delta$ measures the second-order heterogeneity of the local losses. Our results reveal the tradeoff between communication and computation and show that increasing $K$ can effectively reduce communication costs when the data heterogeneity is low and the network is well connected. We then consider the over-parameterized regime where the local losses share the same minima. We prove that employing local updates in DGD, even without gradient correction, achieves exact linear convergence under the Polyak-{\L}ojasiewicz (PL) condition, which yields an effect similar to that of DGT in reducing communication complexity. We further specialize the result to linear models, with an improved rate expression. Numerical experiments validate our theoretical results.
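The second result above — that DGD with local updates converges exactly and linearly when the local losses share the same minimizers — can be illustrated with a toy simulation. This is not code from the paper: it is a minimal sketch assuming scalar quadratic local losses with a common minimizer (a stand-in for the over-parameterized regime) and a hand-picked doubly stochastic mixing matrix on a four-agent ring; the function name `local_dgd` and all parameter values are illustrative.

```python
import numpy as np

def local_dgd(grads, W, x0, step, K, rounds):
    """DGD with K extra local gradient steps between communication rounds.

    grads: per-agent gradient functions (illustrative scalar losses)
    W:     doubly stochastic mixing matrix (one gossip step per round)
    """
    n = len(grads)
    x = np.full(n, x0, dtype=float)      # each agent holds its own iterate
    for _ in range(rounds):
        for _ in range(K + 1):           # 1 base step + K additional local updates
            x = x - step * np.array([g(xi) for g, xi in zip(grads, x)])
        x = W @ x                        # single communication (mixing) round
    return x

# Four agents on a ring; losses f_i(x) = a_i/2 * (x - 1)^2 differ in curvature
# but share the minimizer x* = 1, mimicking the shared-minima regime.
a = [1.0, 2.0, 0.5, 1.5]
grads = [lambda x, ai=ai: ai * (x - 1.0) for ai in a]
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])
x = local_dgd(grads, W, x0=5.0, step=0.1, K=4, rounds=50)
err = np.max(np.abs(x - 1.0))            # distance of every agent from x*
```

With a common minimizer, each local gradient step contracts every agent's error toward $x^*$ and the mixing step only averages those errors, so all agents converge to the exact shared minimizer without any gradient correction; with heterogeneous minimizers, plain DGD would instead stall in a neighborhood of the solution.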


Authors (3)
