
AdaFed: Fair Federated Learning via Adaptive Common Descent Direction

Published 10 Jan 2024 in cs.LG and cs.AI | arXiv:2401.04993v1

Abstract: Federated learning (FL) is a promising technology in which edge devices (clients) collaboratively train a machine learning model under the orchestration of a server. Learning an unfair model is a known critical problem in federated learning: the trained model may unfairly advantage or disadvantage some of the devices. To tackle this problem, in this work we propose AdaFed. The goal of AdaFed is to find an updating direction for the server along which (i) all the clients' loss functions are decreasing; and (ii) more importantly, the loss functions of the clients with larger loss values decrease at a higher rate. AdaFed adaptively tunes this common direction based on the values of the local gradients and loss functions. We validate the effectiveness of AdaFed on a suite of federated datasets, and demonstrate that AdaFed outperforms state-of-the-art fair FL methods.
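The core idea described in the abstract can be sketched in code. The following is a minimal illustrative sketch, not the paper's exact update rule: client gradients are combined with weights that grow with each client's loss (the exponent `p` is a hypothetical knob for how strongly high-loss clients are emphasized), and the resulting direction is checked to be a common descent direction, i.e. one with positive inner product with every client's gradient.

```python
import numpy as np

def common_descent_direction(grads, losses, p=1.0):
    """Illustrative sketch of a loss-weighted common descent direction.

    grads:  list of per-client gradient vectors g_i
    losses: list of per-client loss values f_i
    p:      hypothetical exponent controlling how strongly
            high-loss clients are favored (not from the paper)
    """
    grads = np.asarray(grads, dtype=float)
    losses = np.asarray(losses, dtype=float)
    # Weights proportional to f_i^p: clients with larger losses
    # pull the aggregate direction more strongly.
    w = losses ** p
    w = w / w.sum()
    d = w @ grads  # weighted combination of client gradients
    # Common-descent check: <d, g_i> > 0 for every client means
    # a small step along -d decreases every client's loss.
    is_common = bool(np.all(grads @ d > 0))
    return d, is_common
```

Under this sketch, the server would apply `theta -= lr * d` only when `is_common` holds; how AdaFed actually guarantees and adapts this property is detailed in the paper itself.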
