Analysis of Total Variation Minimization for Clustered Federated Learning

Published 10 Mar 2024 in cs.LG (arXiv:2403.06298v2)

Abstract: A key challenge in federated learning applications is the statistical heterogeneity of local datasets. Clustered federated learning addresses this challenge by identifying clusters of local datasets that are approximately homogeneous. One recent approach to clustered federated learning is generalized total variation minimization (GTVMin). This approach requires a similarity graph, which can be obtained through domain expertise or in a data-driven fashion via graph learning techniques. Under a widely applicable clustering assumption, we derive an upper bound on the deviation between GTVMin solutions and their cluster-wise averages. This bound provides valuable insights into the effectiveness and robustness of GTVMin in addressing statistical heterogeneity within federated learning environments.
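The GTVMin formulation described in the abstract can be illustrated with a small sketch. This is not the paper's full method: it assumes squared-error local losses, a squared-Euclidean coupling term, and plain gradient descent over all local models; the function name `gtvmin`, the edge-list format, and the hyperparameter defaults are illustrative choices, not the authors' implementation.

```python
import numpy as np

def gtvmin(X, y, edges, alpha=1.0, lr=0.005, steps=3000):
    """Minimal GTVMin sketch: node i fits a local linear model w_i to its
    dataset (X[i], y[i]); a penalty alpha * a_ij * ||w_i - w_j||^2 couples
    the models of nodes (i, j) joined by a similarity-graph edge."""
    n, d = len(X), X[0].shape[1]
    W = np.zeros((n, d))  # one local model per node
    for _ in range(steps):
        grad = np.zeros_like(W)
        for i in range(n):
            # gradient of the local least-squares loss ||X_i w_i - y_i||^2
            grad[i] = 2 * X[i].T @ (X[i] @ W[i] - y[i])
        for (i, j, a_ij) in edges:
            # gradient of the GTV coupling alpha * a_ij * ||w_i - w_j||^2
            diff = 2 * alpha * a_ij * (W[i] - W[j])
            grad[i] += diff
            grad[j] -= diff
        W -= lr * grad
    return W

# Demo: two clusters of two nodes each; edges only join nodes whose local
# datasets come from the same underlying model, so the GTV penalty pulls
# each node's solution toward its cluster-wise average.
rng = np.random.default_rng(0)
w_a, w_b = np.array([1.0, -2.0]), np.array([-3.0, 0.5])
X = [rng.normal(size=(20, 2)) for _ in range(4)]
y = [X[0] @ w_a, X[1] @ w_a, X[2] @ w_b, X[3] @ w_b]
W = gtvmin(X, y, edges=[(0, 1, 1.0), (2, 3, 1.0)])
```

In this toy setup, the solutions within each cluster nearly coincide, which mirrors the paper's bound on the deviation between GTVMin solutions and their cluster-wise averages.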

