Unleashing Edgeless Federated Learning With Analog Transmissions

IEEE Transactions on Signal Processing (2024)

Abstract
We demonstrate that analog transmissions and matched filtering alone can realize the function of an edge server in federated learning (FL). Consequently, a network of massively distributed user equipments (UEs) can achieve large-scale FL without an edge server. We also develop a training algorithm that allows UEs to perform local computing continuously, without being interrupted by global parameter uploading, thereby exploiting the full potential of the UEs' processing power. We derive convergence rates for the proposed schemes to quantify their training efficiency. The analyses reveal that when the interference follows a Gaussian distribution, the proposed algorithm recovers the convergence rate of server-based FL. If the interference distribution is heavy-tailed, however, the heavier the tail, the slower the algorithm converges. Nonetheless, the system run time can be substantially reduced by running computation in parallel with communication, and the gain is particularly pronounced when communication latency is high. These findings are corroborated via extensive simulations.
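The core idea that simultaneous analog transmissions superpose in the channel, so the receiver directly observes a (noisy) sum of the UEs' updates, can be sketched as below. This is a minimal illustrative simulation, not the paper's algorithm: the gradient values, noise scales, and the choice of Cauchy noise as a heavy-tailed example are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
num_ues, dim = 50, 8

# Hypothetical local gradients held by each UE (illustrative values).
local_grads = rng.normal(size=(num_ues, dim))
true_avg = local_grads.mean(axis=0)

def ota_average(grads, interference):
    # Analog over-the-air aggregation: concurrent analog transmissions
    # superpose in the wireless channel, so the receiver observes the
    # sum of all updates plus interference; scaling by the UE count
    # yields a noisy estimate of the global average.
    return (grads.sum(axis=0) + interference) / grads.shape[0]

# Gaussian interference vs. a heavy-tailed (Cauchy) alternative.
avg_gauss = ota_average(local_grads, rng.normal(scale=1.0, size=dim))
avg_heavy = ota_average(local_grads, rng.standard_cauchy(size=dim))

err_gauss = np.linalg.norm(avg_gauss - true_avg)
err_heavy = np.linalg.norm(avg_heavy - true_avg)
print(f"aggregation error, Gaussian interference:    {err_gauss:.4f}")
print(f"aggregation error, heavy-tailed interference: {err_heavy:.4f}")
```

With many UEs, the interference is divided down along with the sum, so Gaussian noise perturbs the estimated average only mildly; heavy-tailed draws can occasionally produce much larger errors, which is consistent with the slower convergence the abstract reports for heavy-tailed interference.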
Keywords
Servers, Training, Convergence, Computational modeling, Interference, Global communication, Heavy-tailed distribution, Federated learning, wireless network, analog over-the-air computing, zero-wait training, convergence rate