
Snowball: Energy Efficient and Accurate Federated Learning With Coarse-to-Fine Compression Over Heterogeneous Wireless Edge Devices



Abstract:

Model update compression is a widely used technique to alleviate the communication cost of federated learning (FL). However, there is evidence that compression-based FL systems often suffer from two issues: i) implicit degradation of the global model's learning performance due to inaccurate updates, and ii) the limitation of sharing the same compression rate across heterogeneous edge devices. In this paper, we propose an energy-efficient learning framework, named Snowball, that enables edge devices to incrementally upload their model updates in a coarse-to-fine compression manner. To this end, we first design a fine-grained compression scheme that enables a nearly continuous compression rate. We then formulate the Snowball optimization problem, which minimizes the energy consumption of parameter transmission subject to learning performance constraints. By leveraging theoretical insights from the convergence analysis, the optimization problem is transformed into a tractable form. A water-filling algorithm is then designed to solve it, assigning each device a personalized compression rate according to the status of its locally available resources. Experiments indicate that, compared to state-of-the-art FL algorithms, our learning framework can reduce the uplink communication energy required to reach a good global accuracy by a factor of five.
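The paper's actual formulation is not reproduced on this page, but the water-filling idea — raising a common "water level" and letting each device's rate be the gap between that level and its per-bit cost — can be sketched as follows. The function name, the linear-clip rate rule, and the cost/budget semantics are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def water_filling_rates(cost, budget, r_max=1.0):
    """Toy water-filling allocation: device i receives compression rate
    r_i = clip(mu - cost_i, 0, r_max), with the water level mu found by
    bisection so that sum(r_i) meets the total bit budget.  Devices with
    a lower energy cost per bit keep a finer (higher-rate) update."""
    cost = np.asarray(cost, dtype=float)
    lo, hi = 0.0, cost.max() + r_max   # mu is bracketed in [lo, hi]
    for _ in range(100):               # bisect on the water level mu
        mu = 0.5 * (lo + hi)
        if np.clip(mu - cost, 0.0, r_max).sum() > budget:
            hi = mu                    # too many bits kept: lower the level
        else:
            lo = mu                    # budget not reached: raise the level
    return np.clip(0.5 * (lo + hi) - cost, 0.0, r_max)
```

For example, with per-bit costs `[0.1, 0.5, 0.9]` and a total budget of 1.5, the water level settles at 1.0 and the devices receive rates of roughly `[0.9, 0.5, 0.1]`: the cheapest device uploads the least-compressed update.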
Published in: IEEE Transactions on Wireless Communications ( Volume: 22, Issue: 10, October 2023)
Page(s): 6778 - 6792
Date of Publication: 23 February 2023


I. Introduction

According to Cisco’s forecast, there will be 500 billion devices connected to the Internet by 2030 [1]. These devices, equipped with versatile sensors, generate massive data at the network edge, opening up new horizons for data-driven learning methods. Federated learning (FL) is an emerging distributed paradigm that enables multiple edge devices to train a global model without sharing their local training data [2]. The FL-empowered mobile edge computing system is recognized as a promising solution for realizing ubiquitous intelligence [3]. In many real-world scenarios, mobile devices are strictly constrained by computing capability, channel conditions, and battery lifetime [4]. To improve the efficiency of resource utilization, many researchers have proposed compressing the local model update before uploading it to the parameter server.
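As a point of reference for what "compressing the local model update" means in practice, a common baseline (not necessarily the fine-grained scheme proposed in this paper) is top-k sparsification, which transmits only the largest-magnitude entries of the update:

```python
import numpy as np

def topk_compress(update, rate):
    """Top-k sparsification baseline: keep only the fraction `rate` of
    entries with the largest absolute value and zero out the rest, so
    only k values (plus their indices) need to be uploaded."""
    k = max(1, int(rate * update.size))
    flat = update.ravel()
    # indices of the k entries with the largest magnitude
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    sparse = np.zeros_like(flat)
    sparse[idx] = flat[idx]
    return sparse.reshape(update.shape)
```

With `rate = 0.5` and the update `[0.1, -2.0, 0.3, 1.5]`, only `-2.0` and `1.5` survive; the communication cost shrinks with `rate`, at the price of a less accurate update — the trade-off the abstract above refers to.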

Cites in Papers - IEEE (10)

1. Siguang Chen, Qun Li, Yanhang Shi, Xue Li, "Debiased Device Sampling for Federated Edge Learning in Wireless Networks", IEEE Transactions on Mobile Computing, vol.24, no.2, pp.709-721, 2025.
2. Zhixiong Chen, Wenqiang Yi, Hyundong Shin, Arumugam Nallanathan, "Adaptive Semi-Asynchronous Federated Learning Over Wireless Networks", IEEE Transactions on Communications, vol.73, no.1, pp.394-409, 2025.
3. Sakshi Patni, Sungpil Woo, Joohyung Lee, "Predictive Dynamic Virtual Machine Scaling for Federated Learning Over Edge-Cloud Interworking", IT Professional, vol.26, no.6, pp.35-44, 2024.
4. Yuyi Mao, Xianghao Yu, Kaibin Huang, Ying-Jun Angela Zhang, Jun Zhang, "Green Edge AI: A Contemporary Survey", Proceedings of the IEEE, vol.112, no.7, pp.880-911, 2024.
5. Jiayi He, Bingkun Lai, Jiawen Kang, Hongyang Du, Jiangtian Nie, Tao Zhang, Yanli Yuan, Weiting Zhang, Dusit Niyato, Abbas Jamalipour, "Securing Federated Diffusion Model With Dynamic Quantization for Generative AI Services in Multiple-Access Artificial Intelligence of Things", IEEE Internet of Things Journal, vol.11, no.17, pp.28064-28077, 2024.
6. Bingkun Lai, Jiayi He, Jiawen Kang, Gaolei Li, Minrui Xu, Tao Zhang, Shengli Xie, "On-demand Quantization for Green Federated Generative Diffusion in Mobile Edge Networks", ICC 2024 - IEEE International Conference on Communications, pp.2883-2888, 2024.
7. Ruyan Wang, Lan Yang, Tong Tang, Boran Yang, Dapeng Wu, "Robust Federated Learning for Heterogeneous Clients and Unreliable Communications", IEEE Transactions on Wireless Communications, vol.23, no.10, pp.13440-13455, 2024.
8. Guoping Tan, Hui Yuan, Hexuan Hu, Siyuan Zhou, Zhenyu Zhang, "A Framework of Decentralized Federated Learning With Soft Clustering and 1-Bit Compressed Sensing for Vehicular Networks", IEEE Internet of Things Journal, vol.11, no.13, pp.23617-23629, 2024.
9. Peichun Li, Hanwen Zhang, Yuan Wu, Liping Qian, Rong Yu, Dusit Niyato, Xuemin Shen, "Filling the Missing: Exploring Generative AI for Enhanced Federated Learning Over Heterogeneous Mobile Edge Devices", IEEE Transactions on Mobile Computing, vol.23, no.10, pp.10001-10015, 2024.
10. Chao Chen, Bohang Jiang, Shengli Liu, Chuanhuang Li, Celimuge Wu, Rui Yin, "Efficient Federated Learning using Random Pruning in Resource-Constrained Edge Intelligence Networks", GLOBECOM 2023 - 2023 IEEE Global Communications Conference, pp.5244-5249, 2023.