
FNCS: Federated Learning Strategy Based on Cosine Similarity under Resource Constraints



Abstract:

Federated learning has been widely applied in healthcare services and real-time object tracking. However, limited by communication resources such as server bandwidth, and by the heterogeneity of client data, the convergence rate and accuracy of federated learning drop significantly. Hence, this study proposes a novel federated normalization learning strategy based on cosine similarity (FNCS). Starting from a new perspective on the relationship between local and global model updates, FNCS uses cosine similarity to select valuable clients to upload their updates. A regularization term based on cosine-distance update divergence is then inserted into the last layer of the client models. Extensive experiments are conducted in PyTorch for accelerated validation. Results show that FNCS achieves high accuracy on the complex CelebA dataset and reduces the required communication rounds by 44.71% and 41.98% compared with FedAvg and FedProx, respectively.
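The paper's exact selection rule is not reproduced on this page, but the idea described in the abstract, ranking clients by the cosine similarity between each local update and the global update, can be sketched as follows. This is a minimal illustration with hypothetical names (`select_clients`, `top_k`), not the authors' implementation:

```python
import math

def cosine_similarity(u, v):
    # cos(u, v) = <u, v> / (||u|| * ||v||); small eps avoids division by zero
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv + 1e-12)

def select_clients(local_updates, global_update, top_k):
    # Rank clients by how well their local update direction agrees with
    # the previous global update; keep the top_k most aligned clients.
    scores = [(cosine_similarity(u, global_update), i)
              for i, u in enumerate(local_updates)]
    scores.sort(reverse=True)
    return sorted(i for _, i in scores[:top_k])

updates = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
g = [1.0, 0.2]
select_clients(updates, g, top_k=2)  # -> [0, 2]: clients 0 and 2 align best with g
```

Only the selected clients would transmit their updates, which is how a scheme of this kind saves server bandwidth.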
Date of Conference: 20-22 December 2021
Date Added to IEEE Xplore: 30 May 2022
Conference Location: Haikou, Hainan, China

I. Introduction

Artificial intelligence-based services pose data privacy, communication, and computational challenges that are of growing concern. As a distributed machine learning architecture, federated learning (FL) [1]–[3] offers a new approach to these challenges. However, studies on the optimization of FL in real networks [4] are still at an exploratory stage. The federated averaging (FedAvg) algorithm proposed in [5] assumes that all client data are random samples of the real data distribution. Although FedAvg obtains excellent training results under this assumption, the heterogeneous computing resources and differing communication capabilities of clients may invalidate it. Therefore, improving training efficiency is crucial for the successful deployment of FL on edge and mobile devices in resource-constrained and client-heterogeneous environments [6], [7].
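The FedAvg aggregation rule referenced above is simply a data-size-weighted average of the client models, w_global = Σ_k (n_k / n) · w_k. A minimal sketch (parameter vectors flattened to plain lists for illustration; the helper name is ours, not from [5]):

```python
def fedavg_aggregate(client_weights, client_sizes):
    """Weighted average of client parameter vectors (the FedAvg rule):
    each client k contributes in proportion to its local sample count n_k."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [
        sum(n_k / total * w[i] for w, n_k in zip(client_weights, client_sizes))
        for i in range(dim)
    ]

# Two clients holding equal amounts of data: the global model is the plain mean.
fedavg_aggregate([[1.0, 2.0], [3.0, 4.0]], [10, 10])  # -> [2.0, 3.0]
```

When client data are i.i.d. samples of the true distribution, this average behaves like centralized training; the heterogeneity discussed above is precisely what breaks that equivalence.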

References
[1] H. H. Yang, Z. Liu, T. Q. S. Quek and H. V. Poor, "Scheduling Policies for Federated Learning in Wireless Networks", IEEE Transactions on Communications, vol. 68, no. 1, pp. 317-333, Jan. 2020.
[2] S. Luo, X. Chen, Q. Wu, Z. Zhou and S. Yu, "HFEL: Joint Edge Association and Resource Allocation for Cost-Efficient Hierarchical Federated Edge Learning", IEEE Transactions on Wireless Communications, vol. 19, no. 10, pp. 6535-6548, Oct. 2020.
[3] M. M. Wadu, S. Samarakoon and M. Bennis, "Federated Learning under Channel Uncertainty: Joint Client Scheduling and Resource Allocation", 2020 IEEE Wireless Communications and Networking Conference (WCNC), pp. 1-6, 2020.
[4] A. Ratner, D. Alistarh, G. Alonso, D. G. Andersen, P. Bailis, S. Bird et al., "SysML: The new frontier of machine learning systems", Computing Research Repository, 2019.
[5] B. McMahan, E. Moore, D. Ramage, S. Hampson and B. A. y Arcas, "Communication-efficient learning of deep networks from decentralized data", Artificial Intelligence and Statistics, pp. 1273-1282, 2017.
[6] M. S. H. Abad, E. Ozfatura, D. Gunduz and O. Ercetin, "Hierarchical Federated Learning Across Heterogeneous Cellular Networks", ICASSP 2020 — 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 8866-8870, 2020.
[7] K. Yang, T. Jiang, Y. Shi and Z. Ding, "Federated Learning Based on Over-the-Air Computation", ICC 2019 — 2019 IEEE International Conference on Communications (ICC), pp. 1-6, 2019.
[8] T. Nishio and R. Yonetani, "Client Selection for Federated Learning with Heterogeneous Resources in Mobile Edge", ICC 2019 — 2019 IEEE International Conference on Communications (ICC), pp. 1-7, 2019.
[9] C. Yang, Q. Wang, M. Xu, Z. Chen, K. Bian, Y. Liu, et al., "Characterizing Impacts of Heterogeneity in Federated Learning upon Large-Scale Smartphone Data", Proceedings of the Web Conference 2021, pp. 935-946, April 2021.
[10] H. B. McMahan, E. Moore, D. Ramage, S. Hampson and B. A. y Arcas, "Communication-Efficient Learning of Deep Networks from Decentralized Data", Proc. Artificial Intelligence and Statistics Conference (AISTATS), 2017.
[11] Z. Tang, Z. Hu, S. Shi, Y. M. Cheung, Y. Jin, Z. Ren, et al., "Data Resampling for Federated Learning with Non-IID Labels", FTL-IJCAI'21, 2021.
[12] T. Li, A. K. Sahu, M. Zaheer, M. Sanjabi, A. Talwalkar and V. Smith, "Federated optimization in heterogeneous networks", arXiv preprint, 2018.
[13] L. Wang, W. Wang and B. Li, "CMFL: Mitigating communication overhead for federated learning", 2019 IEEE 39th International Conference on Distributed Computing Systems (ICDCS), 2019.
[14] A. F. Aji and K. Heafield, "Sparse communication for distributed gradient descent", Conf. Empirical Methods Natural Lang. Process. (EMNLP), pp. 440-445, Sep. 2017.
[15] Y. Lin, S. Han, H. Mao, Y. Wang and W. J. Dally, "Deep gradient compression: Reducing the communication bandwidth for distributed training", Proc. Int. Conf. Learn. Represent. (ICLR), pp. 1-14, May 2018.
[16] J. Yoon, W. Jeong, G. Lee, E. Yang and S. J. Hwang, "Federated continual learning with weighted inter-client transfer", International Conference on Machine Learning, 2021.
[17] F. Wei, P. Vijayakumar, N. Kumar, R. Zhang and Q. Cheng, "Privacy-Preserving Implicit Authentication Protocol Using Cosine Similarity for Internet of Things", IEEE Internet of Things Journal, vol. 8, no. 7, pp. 5599-5606, 2020.
[18] X. Lin, B. Zhou and Y. Xia, "Online Recursive Power Management Strategy Based on the Reinforcement Learning Algorithm with Cosine Similarity and a Forgetting Factor", IEEE Transactions on Industrial Electronics, pp. 1-1, 2020.
[19] F. Haddadpour and M. Mahdavi, "On the convergence of local descent methods in federated learning", arXiv preprint, 2019.