Contractible Regularization for Federated Learning on Non-IID Data

2022 IEEE International Conference on Data Mining (ICDM), 2022

Abstract
In the medical domain, gathering all data and training a global supervised model is very difficult because data are scattered across different hospitals and subject to security and privacy concerns. In recent years, several federated learning models have been proposed for training over isolated data. These models usually employ a client-server framework: 1) train local models on clients in parallel; 2) aggregate the local models on the server to produce a global one. By iterating these two steps, federated learning aims to approximate the performance of a model trained centrally on all the data. However, due to non-IID data distributions, local models can deviate from the optimal model, resulting in a biased aggregated global model. To address this problem, we propose a contractible regularization (ConTre) that acts on the local model’s latent space. On each client, we first project the input data into a latent space and then apply the regularization to avoid converging too quickly to bad local optima. The proposed regularization can be easily integrated into existing federated learning frameworks without introducing additional parameters. Experimental results on multiple natural and medical image datasets show that ConTre significantly improves the performance of various federated learning frameworks. Our code is available at https://github.com/czifan/ConTre.pytorch.
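To make the integration point concrete, below is a minimal PyTorch sketch of a FedAvg-style loop where a latent-space regularizer is added to each client's local loss. The specific penalty used here (pulling latent codes toward their mini-batch mean), the `SmallNet` model, the synthetic data, and the `contre_weight` hyperparameter are all illustrative assumptions, not the paper's actual ConTre formulation; the sketch only shows how such a term slots into local training without adding learnable parameters.

```python
# Minimal sketch: FedAvg with a placeholder latent-space regularizer in the
# local objective. The regularizer below is a stand-in, NOT the ConTre formula.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F


class SmallNet(nn.Module):
    """Toy classifier with an explicit latent space (encoder output)."""

    def __init__(self, in_dim=32, latent_dim=16, num_classes=10):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(),
                                     nn.Linear(64, latent_dim))
        self.classifier = nn.Linear(latent_dim, num_classes)

    def forward(self, x):
        z = self.encoder(x)          # project the input into the latent space
        return self.classifier(z), z


def local_update(global_model, loader, contre_weight=0.1, lr=0.01, epochs=1):
    """One client's local training step: supervised loss plus a latent-space
    regularizer; no extra learnable parameters are introduced."""
    model = copy.deepcopy(global_model)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(epochs):
        for x, y in loader:
            logits, z = model(x)
            # Placeholder penalty: keep latent codes close to their batch mean,
            # discouraging overly fast convergence to sharp local optima.
            reg = (z - z.mean(dim=0, keepdim=True)).pow(2).mean()
            loss = F.cross_entropy(logits, y) + contre_weight * reg
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model.state_dict()


def fedavg(states):
    """Server-side aggregation: simple parameter averaging over clients."""
    avg = copy.deepcopy(states[0])
    for key in avg:
        avg[key] = torch.stack([s[key].float() for s in states]).mean(dim=0)
    return avg


if __name__ == "__main__":
    torch.manual_seed(0)
    global_model = SmallNet()
    # Synthetic, deliberately skewed client data, purely for demonstration.
    loaders = []
    for c in range(3):
        x = torch.randn(64, 32) + c
        y = torch.randint(0, 10, (64,))
        loaders.append([(x, y)])
    for rnd in range(2):  # federated rounds
        states = [local_update(global_model, dl) for dl in loaders]
        global_model.load_state_dict(fedavg(states))
        print(f"round {rnd} done")
```

Because the regularizer only modifies the local loss, the server-side aggregation step is unchanged, which is what allows this style of regularization to drop into existing federated learning frameworks.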
Keywords
Federated learning, feature learning, image classification, computer vision