
A Federated Learning Mechanism with Feature Drift for Feature Distribution Skew


Abstract:

Federated learning is a nascent distributed machine learning paradigm that enables multiple clients to collaborate in training a model for a specific task under the coordination of a central server, all while safeguarding the privacy of each user's local data. Nevertheless, the constraint that distributed datasets must remain on local nodes introduces data heterogeneity into federated learning training. In this paper, we focus on mitigating the damage that the data heterogeneity of feature distribution skew inflicts on federated learning models during training. To achieve this goal, we propose a feature drift-corrected federated learning algorithm. We design a feature drift variable derived from the clients' local models and the server's global model, and incorporate it into each client's local loss function to rectify the local model parameters. Additionally, we utilize the disparity between successive global models to regulate the local model. Validation experiments are conducted on multiple datasets exhibiting feature distribution skew. The results demonstrate the efficacy of our approach in significantly enhancing the performance of federated learning models under feature distribution skew.
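The abstract describes two correction terms: a feature drift variable built from the client's local model and the server's global model, added to the local loss, plus a regularizer based on the disparity between successive global models. The paper's exact formulation is not reproduced in this excerpt; the sketch below is only one plausible proximal-style reading of that description, and the function name and the hyperparameters `mu` and `beta` are hypothetical.

```python
import numpy as np

def drift_corrected_loss(task_loss, w_local, w_global, w_global_prev,
                         mu=0.1, beta=0.1):
    """Sketch of a drift-corrected client objective (illustrative only).

    task_loss     -- the client's ordinary empirical loss (a scalar)
    w_local       -- list of local model parameter arrays
    w_global      -- current global model parameters from the server
    w_global_prev -- global model parameters from the previous round
    mu, beta      -- penalty strengths (hypothetical hyperparameters)
    """
    # Feature-drift term: penalize divergence of the local model
    # from the current global model.
    drift = sum(float(np.sum((wl - wg) ** 2))
                for wl, wg in zip(w_local, w_global))
    # Regularizer built from the disparity between successive global
    # models, used to regulate the local update.
    progress = sum(float(np.sum((wg - wp) ** 2))
                   for wg, wp in zip(w_global, w_global_prev))
    return task_loss + 0.5 * mu * drift + 0.5 * beta * progress
```

Under this reading, each client minimizes its task loss plus penalties that pull the local parameters toward the global model while accounting for how much the global model itself moved in the last round.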
Date of Conference: 08-11 July 2024
Date Added to IEEE Xplore: 11 October 2024
Conference Location: Venice, Italy


I. Introduction

Federated learning (FL) is a burgeoning machine learning paradigm designed to enable model training without compromising the confidentiality of local private data. As contemporary society places increasing emphasis on safeguarding personal and corporate private data, FL, distinguished by its privacy and security features, has garnered growing attention [1]. In 2017, Google introduced the pioneering FL algorithm, FedAvg [2], establishing a robust foundation for subsequent FL research.
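The FedAvg server step introduced in [2] aggregates client updates by averaging their parameters, weighting each client by the size of its local dataset. A minimal sketch of that aggregation rule (the function name is our own):

```python
import numpy as np

def fedavg_aggregate(client_weights, client_sizes):
    """FedAvg server step: average the clients' parameter lists,
    weighting each client by its local dataset size."""
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    return [
        sum((n / total) * w[i] for w, n in zip(client_weights, client_sizes))
        for i in range(n_params)
    ]

# Two clients holding 1 and 3 samples respectively: the larger
# client dominates the weighted average.
agg = fedavg_aggregate(
    [[np.array([0.0, 0.0])], [np.array([4.0, 8.0])]],
    client_sizes=[1, 3],
)
```

Because the weights are proportional to dataset sizes, the aggregate is an unbiased average over all training samples when client data are identically distributed; feature distribution skew breaks that assumption, which is the setting this paper targets.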

References
1. Q. Li, Z. Wen, Z. Wu, S. Hu, N. Wang, Y. Li, et al., "A survey on federated learning systems: Vision, hype and reality for data privacy and protection", IEEE Transactions on Knowledge and Data Engineering, vol. 35, no. 4, pp. 3347-3366, 2023.
2. B. McMahan, E. Moore, D. Ramage, S. Hampson and B. A. y Arcas, "Communication-efficient learning of deep networks from decentralized data", in Proceedings of the 20th International Conference on Artificial Intelligence and Statistics (AISTATS), vol. 54, pp. 1273-1282, 2017.
3. Q. Yang, Y. Liu, T. Chen and Y. Tong, "Federated machine learning: Concept and applications", ACM Transactions on Intelligent Systems and Technology, vol. 10, no. 2, Article 12, 2019.
4. T. Li, A. K. Sahu, A. Talwalkar and V. Smith, "Federated learning: Challenges, methods and future directions", IEEE Signal Processing Magazine, vol. 37, no. 3, pp. 50-60, 2020.
5. P. Kairouz, H. B. McMahan, B. Avent et al., "Advances and open problems in federated learning", Foundations and Trends in Machine Learning, vol. 14, no. 1, pp. 1-210, 2021.
6. T. Zhou and E. Konukoglu, "FedFA: Federated feature augmentation", in Proceedings of the 11th International Conference on Learning Representations (ICLR), 2023.
7. M. Jiang, Z. Wang and Q. Dou, "HarmoFL: Harmonizing local and global drifts in federated learning on heterogeneous medical images", in Proceedings of the 36th AAAI Conference on Artificial Intelligence (AAAI), vol. 36, pp. 1087-1095, 2022.
8. X. Li, M. Jiang, X. Zhang, M. Kamp and Q. Dou, "FedBN: Federated learning on non-IID features via local batch normalization", in Proceedings of the 9th International Conference on Learning Representations (ICLR), 2021.
9. S. P. Karimireddy, S. Kale, M. Mohri, S. J. Reddi, S. U. Stich and A. T. Suresh, "SCAFFOLD: Stochastic controlled averaging for federated learning", in Proceedings of the 37th International Conference on Machine Learning (ICML), vol. 119, pp. 5132-5143, 2020.
10. B. Gong, Y. Shi, F. Sha and K. Grauman, "Geodesic flow kernel for unsupervised domain adaptation", in IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 2066-2073, 2012.
11. X. Peng, Q. Bai, X. Xia, Z. Huang, K. Saenko and B. Wang, "Moment matching for multi-source domain adaptation", in IEEE/CVF International Conference on Computer Vision (ICCV), pp. 1406-1415, 2019.
12. Y. LeCun, L. Bottou, Y. Bengio and P. Haffner, "Gradient-based learning applied to document recognition", Proceedings of the IEEE, vol. 86, no. 11, pp. 2278-2324, 1998.
13. Y. Netzer, T. Wang, A. Coates, A. Bissacco, B. Wu and A. Y. Ng, "Reading digits in natural images with unsupervised feature learning", in NIPS Workshop on Deep Learning and Unsupervised Feature Learning, 2011.
14. J. J. Hull, "A database for handwritten text recognition research", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 16, no. 5, pp. 550-554, 1994.
15. Y. Ganin and V. Lempitsky, "Unsupervised domain adaptation by backpropagation", in Proceedings of the 32nd International Conference on Machine Learning (ICML), vol. 37, pp. 1180-1189, 2015.
16. A. Krizhevsky, I. Sutskever and G. E. Hinton, "ImageNet classification with deep convolutional neural networks", Communications of the ACM, vol. 60, no. 6, pp. 84-90, 2017.
17. T.-M. H. Hsu, H. Qi and M. Brown, "Measuring the effects of non-identical data distribution for federated visual classification", CoRR, vol. abs/1909.06335, 2019.
18. T. Li, A. K. Sahu, M. Zaheer, M. Sanjabi, A. Talwalkar and V. Smith, "Federated optimization in heterogeneous networks", in Proceedings of Machine Learning and Systems (MLSys), 2020.
