
TVFL: Tunable Vertical Federated Learning towards Communication-Efficient Model Serving


Abstract:

Vertical federated learning (VFL) enables multiple participants with different data features and the same sample ID space to collaboratively train a model in a privacy-preserving way. However, high computational and communication overheads hinder the adoption of VFL in many resource-limited or delay-sensitive applications. In this work, we focus on reducing the communication cost and delay incurred by the transmission of intermediate results in VFL model serving. We investigate the inference results and find that a large portion of test samples can be predicted correctly by the active party alone, so the corresponding communication for federated inference is dispensable. Based on this insight, we theoretically analyze this "dispensable communication" and propose a novel tunable vertical federated learning framework, named TVFL, to avoid it in model serving as much as possible. TVFL can smartly switch between independent inference and federated inference based on the features of the input sample. We further reveal that such tunability is highly related to the importance of participants' features. Our evaluations on seven datasets and three typical VFL models show that TVFL can save 57.6% of communication cost and reduce prediction latency by 57.1% with little performance degradation.
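The switching idea in the abstract can be illustrated with a minimal sketch. Note this is an assumption-laden toy, not the paper's actual algorithm: it assumes the active party gates on its local prediction confidence via a threshold `tau`, and that parties exchange logits; the names `tunable_inference` and `passive_logits_fn` are illustrative.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def tunable_inference(x_active, active_model, passive_logits_fn, tau=0.9):
    """Toy TVFL-style switching (hypothetical): the active party
    predicts alone when its confidence exceeds tau, and falls back
    to federated inference (one communication round) otherwise."""
    local_logits = active_model(x_active)
    probs = softmax(local_logits)
    if probs.max() >= tau:
        # Confident: independent inference, no communication needed.
        return int(probs.argmax()), False
    # Not confident: fetch the passive party's intermediate result
    # and fuse it (here, by simply summing logits) before predicting.
    fused = local_logits + passive_logits_fn()
    return int(softmax(fused).argmax()), True
```

Raising `tau` trades communication savings for accuracy: more samples fall back to federated inference, matching the "tunable" trade-off the paper describes.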
Date of Conference: 17-20 May 2023
Date Added to IEEE Xplore: 29 August 2023
Conference Location: New York City, NY, USA


I. Introduction

There are two main categories of federated learning frameworks, horizontal federated learning (HFL) and vertical federated learning (VFL), distinguished by how participants' data are distributed across the feature space and the sample ID space. In HFL, participants share the same feature space but have different sample IDs [1]–[7]; in VFL, participants share the same sample ID space but have different data features [1], [8]–[10]. Although VFL is already used in businesses such as insurance assessment and financial risk control, its high computational and communication overheads hinder adoption in many resource-limited or delay-sensitive applications, e.g., mobile computing and online advertising.
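The HFL/VFL distinction above amounts to splitting a data matrix along different axes. A minimal sketch (the party names and the 6×4 toy matrix are illustrative, not from the paper):

```python
import numpy as np

# Toy dataset: 6 samples (rows = sample IDs) x 4 features (columns).
X = np.arange(24).reshape(6, 4)

# HFL: parties share the feature space but hold different sample IDs,
# i.e., the matrix is partitioned by rows.
hfl_party_a, hfl_party_b = X[:3, :], X[3:, :]

# VFL: parties share the sample ID space but hold different features,
# i.e., the matrix is partitioned by columns.
vfl_active, vfl_passive = X[:, :2], X[:, 2:]
```

In the VFL split, every party sees all sample IDs but only its own feature columns, which is why inference normally requires exchanging per-sample intermediate results.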

