
Enhancing Decentralized and Personalized Federated Learning with Topology Construction

IEEE Transactions on Mobile Computing (2024)

Abstract
The emerging Federated Learning (FL) paradigm permits all workers (e.g., mobile devices) to cooperatively train a model using their local data at the network edge. To avoid the potential bottleneck of the conventional parameter-server architecture, decentralized federated learning (DFL) is built on peer-to-peer (P2P) communication. The non-IID issue is a key challenge in FL and significantly degrades model training performance. To this end, we propose a personalized solution called TOPFL, in which only parts of the local models (not the entire models) are shared and aggregated. Moreover, considering the limited communication bandwidth of workers, we propose a topology construction algorithm to accelerate the training process. To verify the convergence of the decentralized training framework, we theoretically analyze the impact of data heterogeneity and topology on the convergence upper bound. Extensive simulation results show that, compared with the baseline solutions, TOPFL achieves a 2.2× speedup in reaching convergence and 5.8% higher test accuracy under the same resource consumption.
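The core idea of sharing only parts of the local models over a P2P topology can be sketched as follows. This is a minimal illustration under assumptions not stated in the abstract: each worker's parameters are split into a shared "base" and a private "head", and the aggregation rule is a plain neighbor average; the paper's actual partitioning scheme, mixing weights, and constructed topology may differ.

```python
import numpy as np

def p2p_round(models, topology):
    """One hypothetical decentralized round: each worker averages only the
    shared 'base' parameters with its neighbors; the personalized 'head'
    parameters are never communicated and stay local."""
    new_models = []
    for i, model in enumerate(models):
        # Neighbors of worker i according to the (symmetric) adjacency matrix.
        neighbors = [j for j, linked in enumerate(topology[i]) if linked]
        group = [i] + neighbors
        avg_base = np.mean([models[j]["base"] for j in group], axis=0)
        new_models.append({"base": avg_base, "head": model["head"]})
    return new_models

# Three workers on a line topology: 0 -- 1 -- 2 (illustrative adjacency).
topology = [[0, 1, 0],
            [1, 0, 1],
            [0, 1, 0]]
models = [{"base": np.full(2, float(i)), "head": np.full(2, 10.0 * i)}
          for i in range(3)]
models = p2p_round(models, topology)
# Shared 'base' parts move toward consensus; private 'head' parts are unchanged.
```

The constructed topology enters only through the adjacency matrix, which is how a topology construction algorithm could trade communication cost against mixing speed.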
Keywords
Personalized Federated Learning, P2P Communication, Topology Construction, Edge Computing