
Learning Critically: Selective Self-Distillation in Federated Learning on Non-IID Data

IEEE Transactions on Big Data (2024)

Cited 21 | Views 49
Keywords
Data models, Training, Servers, Collaborative work, Adaptation models, Convergence, Feature extraction, Federated learning, Knowledge distillation, Non-identically distributed, Deep learning, Catastrophic forgetting