Snowball: Energy Efficient and Accurate Federated Learning With Coarse-to-Fine Compression Over Heterogeneous Wireless Edge Devices

IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS (2023)

Abstract
Model update compression is a widely used technique for alleviating communication cost in federated learning (FL). However, there is evidence that compression-based FL systems often suffer from two issues: i) degraded learning performance of the global model due to inaccurate updates, and ii) the limitation of imposing a single shared compression rate across heterogeneous edge devices. In this paper, we propose an energy-efficient learning framework, named Snowball, that enables edge devices to incrementally upload their model updates in a coarse-to-fine compression manner. To this end, we first design a fine-grained compression scheme that enables a nearly continuous compression rate. We then formulate the Snowball optimization problem, which minimizes the energy consumption of parameter transmission subject to learning performance constraints. Leveraging theoretical insights from the convergence analysis, we transform the optimization problem into a tractable form. We then design a water-filling algorithm to solve it, in which each device is assigned a personalized compression rate according to the status of its locally available resources. Experiments indicate that, compared to state-of-the-art FL algorithms, our learning framework reduces the uplink communication energy required to reach a given global accuracy by a factor of five.
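The abstract does not detail the water-filling algorithm itself. As a rough illustration only, a classic water-filling allocation (a standard technique, not necessarily the paper's exact formulation) distributes a budget across devices by bisecting on a common "water level"; the `costs` and `budget` inputs below are hypothetical stand-ins for per-device resource status and the total transmission budget:

```python
def water_filling(costs, budget, iters=100):
    """Allocate x_i = max(0, mu - costs[i]) so that sum(x_i) == budget.

    Devices with lower cost receive larger allocations; the water
    level mu is found by bisection. This is a generic sketch, not
    the personalized compression-rate assignment from the paper.
    """
    lo, hi = 0.0, max(costs) + budget  # mu is bracketed in [lo, hi]
    for _ in range(iters):
        mu = (lo + hi) / 2.0
        used = sum(max(0.0, mu - c) for c in costs)
        if used > budget:
            hi = mu  # water level too high: spend exceeds budget
        else:
            lo = mu  # water level too low: budget not exhausted
    mu = (lo + hi) / 2.0
    return [max(0.0, mu - c) for c in costs]

# Example: three devices with increasing cost share a budget of 3.0.
alloc = water_filling([1.0, 2.0, 3.0], 3.0)
```

For costs `[1.0, 2.0, 3.0]` and budget `3.0`, the water level settles at 3.0, yielding allocations of roughly `[2, 1, 0]`: the cheapest device gets the most, and the most expensive device is cut off entirely, mirroring how a personalized scheme can assign heterogeneous rates.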
Keywords
Federated learning, gradient compression, wireless resource management