
Differential Privacy in Federated Learning Using Noise Multipliers: An Analysis on MNIST Dataset


Abstract:

Federated learning enables decentralized learning across multiple clients by training locally on each client's data and sharing only model updates with a central server. However, if privacy is not protected, sensitive information about any client can be recovered from those model updates. Differential privacy (DP) provides a robust method of preserving data privacy by adding controlled noise to the gradient updates sent to the central server. This paper explores the application of differential privacy to federated learning using different noise multipliers on the MNIST dataset. We investigate the trade-off between accuracy and privacy protection by analyzing test loss and accuracy over several communication rounds, with noise multipliers of 0.2, 0.3, and 0.7. The results show that increased noise reduces model accuracy but provides stronger privacy protection. We also emphasize the importance of balancing accuracy and confidentiality in federated learning systems to ensure safe yet efficient operation.
Date of Conference: 13-14 December 2024
Date Added to IEEE Xplore: 13 March 2025
Conference Location: Indore, India

I. Introduction

In today's data-driven world, the demand for machine learning (ML) models across industries has grown exponentially. Yet many organizations are reluctant to share their data due to privacy concerns, which motivates decentralized approaches such as federated learning (FL). FL allows multiple clients, such as mobile devices or organizations, to jointly train a model while keeping their data stored locally. Although FL mitigates these privacy concerns, it remains vulnerable to privacy leakage through the gradient updates that clients share.
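To make the mechanism concrete, the minimal Python sketch below simulates one communication round in which each client clips its local update and adds Gaussian noise scaled by a noise multiplier before the server averages the updates. This is an illustration, not the authors' implementation: the clip norm, model dimension, and client count are assumptions, and only the noise multiplier values (0.2, 0.3, 0.7) come from the paper.

# Minimal sketch (not the paper's code): one round of federated averaging
# where each client clips its update and adds Gaussian noise scaled by a
# noise multiplier, in the style of DP-SGD. Constants are illustrative.
import numpy as np

rng = np.random.default_rng(0)

CLIP_NORM = 1.0          # assumed L2 bound on each client's update
NOISE_MULTIPLIER = 0.3   # one of the values studied here (0.2, 0.3, 0.7)
NUM_CLIENTS = 10         # assumed number of participating clients
MODEL_DIM = 784 * 10     # e.g., a linear model on flattened 28x28 MNIST

def privatize_update(update: np.ndarray) -> np.ndarray:
    """Clip the update to CLIP_NORM, then add Gaussian noise with
    standard deviation NOISE_MULTIPLIER * CLIP_NORM (Gaussian mechanism)."""
    norm = np.linalg.norm(update)
    clipped = update / max(1.0, norm / CLIP_NORM)
    noise = rng.normal(0.0, NOISE_MULTIPLIER * CLIP_NORM, size=update.shape)
    return clipped + noise

# Simulated local updates (in practice: local SGD on each client's data).
client_updates = [rng.normal(size=MODEL_DIM) * 0.01 for _ in range(NUM_CLIENTS)]

# Each client privatizes its update before sending it; the server only
# ever sees the clipped, noised updates, never the raw gradients.
noised = [privatize_update(u) for u in client_updates]

# Server-side federated averaging of the privatized updates.
global_update = np.mean(noised, axis=0)
print("aggregated update norm:", np.linalg.norm(global_update))

Raising NOISE_MULTIPLIER in this sketch strengthens the privacy guarantee but perturbs the averaged update more, which is the accuracy-privacy trade-off the paper measures across communication rounds.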
