
FL2DP: Privacy-Preserving Federated Learning Via Differential Privacy for Artificial IoT


Abstract:

Federated learning (FL) is a promising paradigm for collaboratively training networks on distributed clients while retaining data locally. Recent work has shown that personal data can be recovered even though clients only send gradients to the server. To address this gradient leakage issue, differential privacy (DP)-based solutions have been proposed that protect data privacy by adding noise to the gradient before sending it to the server. However, the introduced noise degrades the training efficiency of local clients, resulting in low model accuracy. Moreover, the identity privacy of clients has not been seriously considered in FL. In this article, we propose FL2DP, a privacy-preserving scheme that protects both the data privacy and the identity privacy of clients. Unlike current schemes that add noise sampled from the Gaussian or Laplace distribution, our scheme perturbs the gradient using the exponential mechanism to achieve high training efficiency. Clients then upload the perturbed gradients to a shuffler, which reassigns these gradients to different identities. We give a formal privacy definition, called gradient indistinguishability, to provide strict unlinkability for gradient shuffling. We propose a new gradient shuffling mechanism by adapting the DP-based exponential mechanism, with a designed utility function, to satisfy gradient indistinguishability. As a result, an attacker cannot infer the real identity of a client from a shuffled gradient. We conduct extensive experiments on two real-world datasets, and the results demonstrate the effectiveness of the proposed scheme.
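For orientation, the snippet below is a minimal sketch of the generic DP exponential mechanism on which the scheme's perturbation and shuffling steps are built. The candidate set, utility function, and sensitivity shown here are hypothetical placeholders for illustration only; they are not the utility function designed in the paper.

```python
# Minimal sketch of the DP exponential mechanism (illustration only).
# The candidates, utility, and sensitivity below are hypothetical; FL2DP's
# actual utility function and candidate construction are defined in the paper.
import numpy as np

def exponential_mechanism(candidates, utility, epsilon, sensitivity=1.0, rng=None):
    """Sample one candidate with probability proportional to exp(eps * u / (2 * sensitivity))."""
    rng = rng or np.random.default_rng()
    scores = np.array([utility(c) for c in candidates], dtype=float)
    # Subtract the max score before exponentiating for numerical stability.
    logits = epsilon * (scores - scores.max()) / (2.0 * sensitivity)
    probs = np.exp(logits)
    probs /= probs.sum()
    return candidates[rng.choice(len(candidates), p=probs)]

# Hypothetical usage: pick a perturbed version of a gradient; candidates closer
# to the true gradient receive higher utility and are therefore more likely.
grad = np.array([0.12, -0.40, 0.33])
candidates = [grad + np.random.default_rng(i).normal(0, 0.05, grad.shape) for i in range(16)]
utility = lambda g: -np.linalg.norm(g - grad)
noisy_grad = exponential_mechanism(candidates, utility, epsilon=1.0)
```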
Published in: IEEE Transactions on Industrial Informatics (Volume: 20, Issue: 4, April 2024)
Page(s): 5100 - 5111
Date of Publication: 21 November 2023


I. Introduction

With the rapid development of the Internet of Things (IoT), massive numbers of smart devices generate a high volume of data, including images and voice recordings, which fuels data-driven artificial intelligence (AI) models in various applications, such as staff recognition and risk prediction [1], [2]. However, traditional machine learning requires users to send personal information to a centralized server, risking a breach of data privacy and confidentiality. For example, cameras that share photos with a server may expose sensitive locations. To address this challenge, federated learning (FL) has emerged as a distributed machine learning paradigm in which clients perform the training procedure themselves so that users' data remain local [3], [4]. Rather than sending personal data to the centralized server, clients use the shared global model sent by the server to train on their own data and upload the resulting gradients back for aggregation, preventing direct data leakage.
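As a rough illustration of this workflow (not the paper's implementation), the sketch below runs FedAvg-style rounds in which each client computes a gradient on its private data and the server only ever sees the uploaded gradients. The linear model, synthetic client data, and learning rate are hypothetical.

```python
# Illustrative sketch of federated learning rounds with FedAvg-style averaging.
# The linear model, client data, and learning rate are hypothetical placeholders;
# the paper trains deep networks on real-world datasets.
import numpy as np

def local_gradient(weights, features, labels):
    """Mean-squared-error gradient for a linear model, computed entirely on the client."""
    preds = features @ weights
    return features.T @ (preds - labels) / len(labels)

def federated_round(global_weights, clients, lr=0.1):
    """Each client computes a gradient locally; the server only sees the gradients."""
    grads = [local_gradient(global_weights, X, y) for X, y in clients]
    avg_grad = np.mean(grads, axis=0)          # server-side aggregation
    return global_weights - lr * avg_grad      # updated global model for the next round

rng = np.random.default_rng(0)
clients = [(rng.normal(size=(32, 5)), rng.normal(size=32)) for _ in range(4)]
w = np.zeros(5)
for _ in range(10):
    w = federated_round(w, clients)
```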

