
Federated probability memory recall for federated continual learning.

Inf. Sci. (2023)

Abstract
Federated Continual Learning (FCL) approaches suffer from two major problems: probability bias and imbalanced parameter variations. Both problems lead to catastrophic forgetting of the network during the FCL process. This paper therefore proposes a novel FCL framework, Federated Probability Memory Recall (FedPMR), to mitigate the probability bias problem and the imbalance in parameter variations. First, for the probability bias problem, this paper designs the Probability Distribution Alignment (PDA) module, which consolidates the memory of old probability experience. Specifically, PDA maintains a replay buffer and uses the probability memory stored in the buffer to correct the offset probabilities of previous tasks during two-stage training. Second, to alleviate the imbalance in parameter variations, this paper designs the Parameter Consistency Constraint (PCC) module, which constrains the magnitude of neural weight changes for previous tasks. Concretely, PCC applies a set of adaptive weights to subsets of the regularization term that constrains parameter changes, forcing the current model to remain sufficiently close to the past model in parameter-space distance. Experiments with various levels of task similarity across clients demonstrate that our technique establishes new state-of-the-art performance compared to previous FCL approaches.
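The two modules described in the abstract can be sketched as loss terms. The following is a minimal illustration, not the paper's actual implementation: `pda_alignment_loss` stands in for the PDA idea of aligning current predictions with probability memory stored in a replay buffer (here via KL divergence, a common choice in knowledge distillation), and `pcc_penalty` stands in for the PCC idea of adaptively weighting squared parameter changes. All function names and the specific formulas are assumptions for illustration only.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    # KL(p || q) between a stored probability vector p and a
    # current prediction q; eps guards against log(0).
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return float(np.sum(p * np.log(p / q)))

def pda_alignment_loss(buffer_probs, current_probs):
    # PDA-style sketch: penalize drift between the probability
    # memory replayed from the buffer and the current model's
    # outputs on the same samples (assumed formulation).
    return float(np.mean([kl_divergence(p, q)
                          for p, q in zip(buffer_probs, current_probs)]))

def pcc_penalty(theta_new, theta_old, weights):
    # PCC-style sketch: adaptive per-parameter weights on the
    # squared change, keeping the current model close to the
    # past model in parameter space (assumed formulation).
    return float(np.sum(weights * (theta_new - theta_old) ** 2))
```

For example, identical stored and current probabilities give zero alignment loss, and unchanged parameters give zero penalty; larger, unevenly weighted parameter changes are penalized more heavily.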
Key words
Catastrophic forgetting, Federated continual learning, Knowledge distillation