
Dual-Segment Clustering Strategy for Hierarchical Federated Learning in Heterogeneous Wireless Environments

Published 15 May 2024 in cs.LG, cs.AI, and cs.DC (arXiv:2405.09276v2)

Abstract: Non-independent and identically distributed (non-IID) data adversely affects federated learning (FL), while heterogeneity in communication quality can undermine the reliability of model parameter transmission, potentially degrading the convergence of wireless FL. This paper proposes a novel dual-segment clustering (DSC) strategy that jointly addresses communication and data heterogeneity in FL. This is achieved by defining a new signal-to-noise ratio (SNR) matrix and an information quantity matrix to capture the communication and data heterogeneity, respectively. The celebrated affinity propagation algorithm is leveraged to iteratively refine the clustering of clients based on the newly defined matrices, effectively enhancing model aggregation in heterogeneous environments. The convergence analysis and experimental results show that the DSC strategy can improve the convergence rate of wireless FL and demonstrate superior accuracy in heterogeneous environments compared to classical clustering methods.
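The abstract describes clustering clients on two axes at once: a communication segment (an SNR matrix) and a data segment (an information quantity matrix), fed to affinity propagation. The following is a minimal sketch of that idea, not the paper's exact construction: the per-client features, the weighting `alpha`, and the specific similarity definitions are illustrative assumptions, and scikit-learn's `AffinityPropagation` stands in for the paper's iterative refinement.

```python
import numpy as np
from sklearn.cluster import AffinityPropagation

rng = np.random.default_rng(0)
K = 12  # number of clients (illustrative)

# Hypothetical per-client quantities:
#   snr[i]        -> average uplink SNR in dB (communication heterogeneity)
#   label_hist[i] -> normalized label histogram (data heterogeneity proxy)
snr = rng.uniform(0.0, 30.0, size=K)
label_hist = rng.dirichlet(np.ones(10), size=K)

# Communication segment: clients with similar channel quality are similar.
snr_sim = -np.abs(snr[:, None] - snr[None, :]) / 30.0

# Data segment: clients with similar label distributions are similar.
data_sim = -np.linalg.norm(
    label_hist[:, None, :] - label_hist[None, :, :], axis=-1
)

# Combine the two segments; alpha is an assumed trade-off weight,
# not a value taken from the paper.
alpha = 0.5
similarity = alpha * snr_sim + (1.0 - alpha) * data_sim

# Affinity propagation over the precomputed similarity matrix
# assigns each client to a cluster for hierarchical aggregation.
ap = AffinityPropagation(
    affinity="precomputed", damping=0.9, max_iter=500, random_state=0
)
labels = ap.fit_predict(similarity)
print(labels)
```

In a hierarchical FL deployment, each resulting cluster would aggregate locally (e.g. at an edge server) before contributing to the global model, so that clients grouped together are homogeneous in both channel quality and data distribution.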

