Abstract:
In this paper, we address the challenges of preserving the privacy of client data and shared models in a distributed machine learning system. Unlike conventional federated learning, where clients share their entire model, we propose a distributed learning framework named FlexSplit in which each client can select how many layers are trained between it and the server, giving each client control over its individual privacy level. FlexSplit improves scalability by performing part of the training process at multiple edge servers in parallel before aggregation at the cloud server. Preliminary experimental results show that FlexSplit achieves higher validation accuracy than a conventional federated learning model. We also highlight the privacy-utility trade-off: a client can increase its privacy level by sharing fewer layers to minimise the risk of privacy attacks, at the cost of lower validation accuracy for its local model.
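To make the split-point idea concrete, the sketch below shows, in a hedged form, how a client-chosen cut layer partitions a model between a client and a server. This is not the authors' implementation: the layer sizes, the variable names (`cut`, `client_part`, `server_part`), and the single client-server pair are illustrative assumptions only; FlexSplit's multi-edge-server parallelism and cloud aggregation are omitted.

```python
# Minimal sketch of a client-selectable split point (assumed names/sizes,
# not the FlexSplit implementation). The client keeps the first `cut`
# layers private and only exposes the cut-layer activations to the server.
import torch
import torch.nn as nn
import torch.nn.functional as F

layers = nn.ModuleList([
    nn.Sequential(nn.Linear(784, 256), nn.ReLU()),
    nn.Sequential(nn.Linear(256, 128), nn.ReLU()),
    nn.Sequential(nn.Linear(128, 64), nn.ReLU()),
    nn.Linear(64, 10),
])

cut = 2  # client-chosen: keeping more layers local raises the privacy level
client_part = nn.Sequential(*layers[:cut])   # stays on the client
server_part = nn.Sequential(*layers[cut:])   # trained at the server

x = torch.randn(32, 784)                     # a local mini-batch
activations = client_part(x)                 # only this leaves the client
logits = server_part(activations)            # server completes the forward pass
loss = F.cross_entropy(logits, torch.randint(0, 10, (32,)))
loss.backward()                              # gradients flow back across the split
```

Raising `cut` shrinks what the server ever sees, which mirrors the privacy-utility trade-off described above: fewer shared layers, less exposure to privacy attacks, but also less of the model benefits from server-side training.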
Date of Conference: 28 May 2023 - 01 June 2023
Date Added to IEEE Xplore: 23 October 2023