
Feed: Towards Personalization-Effective Federated Learning



Abstract:

Federated learning (FL) has emerged as a paradigm for cooperatively training models among distributed clients without leaking private data. The performance degradation of FL on heterogeneous data has driven the development of personalized FL (PFL) solutions, where different models are built for individual clients. However, existing PFL approaches often offer limited personalization in terms of both modeling capability and training strategy. In this paper, we propose a novel PFL solution, Feed, that employs an enhanced shared-private model architecture and is equipped with a hybrid federated training strategy. Specifically, to model heterogeneous data for different clients, we design an ensemble-based shared encoder that generates an ensemble of embeddings, and a private decoder that adaptively aggregates these embeddings for personalized prediction. In addition, we propose a server-side hybrid federated aggregation strategy to enable effective training of the heterogeneous shared-private model. To prevent personalization degradation in local model updates, we further optimize the personalized local training on the client side by smoothing the historical encoders. Extensive experiments on the MNIST/FEMNIST, CIFAR-10/CIFAR-100, and Yelp datasets demonstrate that Feed consistently outperforms state-of-the-art approaches.
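To make the shared-private design described above more concrete, the following is a minimal PyTorch-style sketch of such an architecture: a shared ensemble encoder that produces K embeddings per input, a client-private decoder that aggregates them with learned attention weights, and a client-side step that smooths the received shared encoder with the client's historical encoder. All class names, layer sizes, and the smoothing rule are illustrative assumptions and do not reproduce the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class EnsembleEncoder(nn.Module):
    """Shared encoder: an ensemble of K sub-encoders, each emitting one embedding."""
    def __init__(self, in_dim, emb_dim, num_members=4):
        super().__init__()
        self.members = nn.ModuleList(
            nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(), nn.Linear(256, emb_dim))
            for _ in range(num_members)
        )

    def forward(self, x):  # x: (B, in_dim)
        # Stack member embeddings into an ensemble: (B, K, emb_dim)
        return torch.stack([m(x) for m in self.members], dim=1)

class PrivateDecoder(nn.Module):
    """Client-private decoder: adaptively aggregates the embedding ensemble."""
    def __init__(self, emb_dim, num_classes):
        super().__init__()
        self.attn = nn.Linear(emb_dim, 1)      # per-embedding relevance score
        self.head = nn.Linear(emb_dim, num_classes)

    def forward(self, embs):  # embs: (B, K, emb_dim)
        w = F.softmax(self.attn(embs), dim=1)  # (B, K, 1) aggregation weights
        z = (w * embs).sum(dim=1)              # personalized fused embedding
        return self.head(z)

@torch.no_grad()
def smooth_encoder(current, historical, beta=0.5):
    """Client-side smoothing: mix the newly received shared encoder with the
    client's historical encoder (an exponential-moving-average-style rule;
    the exact formulation in the paper may differ)."""
    for p_cur, p_hist in zip(current.parameters(), historical.parameters()):
        p_cur.mul_(1 - beta).add_(beta * p_hist)

In this sketch, the EnsembleEncoder parameters would be federated (aggregated on the server), while each client keeps its own PrivateDecoder locally, so personalization comes from both the private decoder and the per-client attention over the shared embedding ensemble.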
Date of Conference: 13-16 May 2024
Date Added to IEEE Xplore: 23 July 2024
Conference Location: Utrecht, Netherlands

I. Introduction

Federated Learning (FL) is a learning paradigm that collaboratively trains machine learning models among distributed clients while preserving the data privacy of each client. In recent years, FL has been widely applied in domains such as finance and healthcare, where private data are isolated. Although FL has achieved great success in training a shared global model, the performance of the shared model can be unsatisfactory for specific clients. The reason is that each client possesses a distinct data distribution, and a shared model fails to generalize well across the heterogeneous data distributions of different clients. To this end, Personalized Federated Learning (PFL) has emerged, aiming to provide personalized models for different clients [1]–[5]. In the literature, existing PFL solutions fall into two categories, i.e., global model personalization and personalized model construction.
