Decay Momentum for Improving Federated Learning

The European Symposium on Artificial Neural Networks (ESANN), 2021

Abstract
We propose two novel Federated Learning (FL) algorithms based on decaying momentum (Demon): Federated Demon (FedDemon) and Federated Demon Adam (FedDemonAdam). In particular, we apply Demon to Momentum Stochastic Gradient Descent (SGD) and Adam in a federated setting; Demon has previously been shown to improve results in a centralized environment. We empirically show that FedDemon and FedDemonAdam achieve a faster convergence rate and performance improvements compared to state-of-the-art algorithms including FedAvg, FedAvgM, and FedAdam.
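The abstract does not spell out the update rule, but the original Demon schedule (Chen et al.) decays the momentum coefficient over training as beta_t = beta_init * (1 - t/T) / ((1 - beta_init) + beta_init * (1 - t/T)), so beta_t starts at beta_init and falls to 0 at round T. Below is a minimal sketch, assuming this schedule is applied to server-side momentum in a FedAvgM-style round; the names (demon_beta, fedavgm_demon_round, beta_init, server_lr) are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def demon_beta(beta_init: float, t: int, T: int) -> float:
    """Demon decay schedule (assumed from the centralized Demon paper):
    beta_t / (1 - beta_t) = (1 - t/T) * beta_init / (1 - beta_init)."""
    frac = 1.0 - t / T
    return beta_init * frac / ((1.0 - beta_init) + beta_init * frac)

def fedavgm_demon_round(global_w, client_ws, momentum, t, T,
                        beta_init=0.9, server_lr=1.0):
    """One hypothetical FedDemon-style server round: average the client
    models (FedAvg), then take a momentum step whose coefficient decays
    across rounds via the Demon schedule."""
    avg_w = np.mean(client_ws, axis=0)       # FedAvg aggregation
    delta = global_w - avg_w                 # server pseudo-gradient
    beta_t = demon_beta(beta_init, t, T)     # decayed momentum coefficient
    momentum = beta_t * momentum + delta     # update server momentum buffer
    new_w = global_w - server_lr * momentum  # apply server step
    return new_w, momentum
```

With beta_init = 0.9, the coefficient stays near 0.9 early in training and decays smoothly to 0 as t approaches T, so late rounds behave like plain FedAvg; an analogous substitution into Adam's first-moment coefficient would give the FedDemonAdam variant.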
Keywords
Federated Learning, Stochastic Gradient Descent, Coordinate Descent, Deep Learning, Convex Optimization