Adaptive asynchronous federated learning

Future Generation Computer Systems: The International Journal of eScience (2024)

Abstract
Federated learning enables data owners to train an artificial intelligence model collaboratively while keeping all training data local, reducing the risk of personal data breaches. However, the heterogeneity of local resources and the dynamic characteristics of federated learning systems pose new challenges that hinder the development of federated learning techniques. To this end, we propose an Adaptive Asynchronous Federated Learning scheme with Momentum, called FedAAM, comprising an adaptive weight allocation algorithm and a novel asynchronous federated learning framework. First, we dynamically allocate weights for the global model update using an adaptive weight allocation strategy that improves the convergence rate of models in asynchronous federated learning systems. Second, targeting the challenges above, we propose two new asynchronous global update rules based on a differentiated strategy, which is an essential component of the proposed federated learning framework. Furthermore, our asynchronous framework introduces the historical global update direction (i.e., global momentum) into the global update operation to improve training efficiency. Moreover, we prove that the model under the FedAAM scheme achieves a sublinear convergence rate. Extensive experiments on real-world datasets demonstrate that FedAAM outperforms representative synchronous and asynchronous federated learning schemes (i.e., FedAvg and FedAsync) in terms of the model's convergence rate and its capacity to handle dynamic systems.
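To make the idea concrete, the sketch below shows a generic asynchronous server-side aggregation step that blends an arriving client update with a global momentum term (the historical update direction the abstract refers to). This is an illustration only, not FedAAM itself: the paper's adaptive weight allocation and differentiated update rules are not given in the abstract, so the mixing weight `alpha` and momentum factor `beta` here are hypothetical placeholders.

```python
import numpy as np

def async_global_update(global_w, client_w, momentum, alpha=0.6, beta=0.9):
    """One asynchronous aggregation step with global momentum.

    Illustrative sketch only: FedAAM's adaptive weights and differentiated
    update rules are defined in the full paper; `alpha` and `beta` are
    hypothetical constants standing in for those mechanisms.
    """
    delta = client_w - global_w                       # direction of this client's update
    momentum = beta * momentum + (1 - beta) * delta   # historical global update direction
    new_w = global_w + alpha * (delta + momentum)     # blend fresh update with momentum
    return new_w, momentum

# Usage: the server applies updates one at a time, as clients arrive,
# without waiting for stragglers (the asynchronous setting).
w = np.zeros(3)
m = np.zeros(3)
for client_update in [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]:
    w, m = async_global_update(w, client_update, m)
```

Because each client's update is applied immediately on arrival, fast clients are never blocked by slow ones; the momentum term smooths the resulting out-of-order contributions toward a consistent global direction.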
Keywords
Federated learning, Asynchronous aggregation, Distributed machine learning, Momentum