FedMCSA: Personalized federated learning via model components self-attention

arXiv (2023)

Abstract
Federated learning (FL) enables multiple clients to jointly train a machine learning model without sharing their private data. However, heterogeneous data from different clients that is not independent and identically distributed (Non-IID) poses a tough challenge for FL. Existing personalized FL approaches rely heavily on treating one complete model as the basic unit by default, ignoring the differing significance of individual layers on clients' Non-IID data. In this work, we propose a new framework, federated model components self-attention (FedMCSA), to handle Non-IID data in FL. It employs a model components self-attention mechanism to promote cooperation between different clients at a finer granularity, facilitating collaboration between similar model components while reducing interference between model components with large differences. Extensive experiments demonstrate that FedMCSA outperforms previous methods on four benchmark datasets. Furthermore, we empirically show the effectiveness of the model components self-attention mechanism, which is complementary to existing personalized FL approaches and can significantly improve FL performance.
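The abstract's core idea — aggregating each model component (e.g. a layer) across clients with attention weights derived from component similarity, rather than averaging whole models — can be sketched as follows. This is an illustrative interpretation under stated assumptions (scaled dot-product similarity with a softmax over clients, applied independently per component), not the paper's exact algorithm; the function name and `temperature` parameter are hypothetical.

```python
import numpy as np

def components_self_attention(client_components, temperature=1.0):
    """Illustrative sketch of per-component attention aggregation.

    client_components: list (one entry per client) of 1-D parameter
    vectors for ONE model component (e.g. one layer), all same shape.
    Returns an array of personalized components, one row per client.
    """
    K = np.stack(client_components)                 # (n_clients, dim)
    # Scaled dot-product similarity between clients' components.
    scores = K @ K.T / (np.sqrt(K.shape[1]) * temperature)
    # Row-wise softmax -> each client's attention over all clients.
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    # Each client's personalized component is a similarity-weighted
    # mix: similar components cooperate, dissimilar ones interfere less.
    return w @ K                                    # (n_clients, dim)
```

Because the weights are computed per component, two clients can collaborate strongly on one layer while barely influencing each other on another — the finer granularity the abstract contrasts with whole-model aggregation.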
Keywords
Personalized federated learning, Non-IID, Model components, Self-attention