ShapleyFL: Robust Federated Learning Based on Shapley Value

KDD 2023

Abstract
Federated Learning (FL) allows clients to form a consortium that trains a global model under the orchestration of a central server while keeping data on the local clients without sharing it, thus mitigating data privacy issues. However, training a robust global model is challenging because the local data are invisible to the server. Clients' local data are naturally heterogeneous, and some clients may hold corrupted data or deliberately send malicious updates to interfere with the training process. Meanwhile, communication and computation costs are unavoidable concerns in designing a practical FL algorithm. In this paper, to improve the robustness of FL, we propose a Shapley value-inspired adaptive weighting mechanism, which treats FL training as a sequence of cooperative games and adjusts clients' weights according to their contributions. We also develop a client sampling strategy based on importance sampling, which reduces communication cost by minimizing the variance of the global update according to the clients' weights. Furthermore, to reduce the server's computation cost, we propose a weight calculation method that estimates the differences between clients' Shapley values. Experimental results on several real-world datasets demonstrate the effectiveness of our approaches.
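The abstract describes two aggregation-side ideas: Shapley-value-based adaptive weighting of client updates, and importance sampling of clients proportional to those weights. Below is a minimal sketch of how such a scheme could look, assuming a permutation-based Monte-Carlo Shapley estimate against a generic validation utility; it is not the paper's exact algorithm, and all names (shapley_weights, eval_utility, sample_clients, etc.) are hypothetical placeholders.

```python
# Sketch (not the paper's algorithm): Shapley-style adaptive weighting and
# importance-based client sampling for federated aggregation.
import numpy as np

def shapley_weights(global_model, client_updates, eval_utility,
                    num_permutations=20, rng=None):
    """Monte-Carlo estimate of each client's marginal contribution.

    client_updates: list of model-delta vectors, one per client.
    eval_utility:   callable(model_params) -> scalar utility (e.g., validation accuracy).
    """
    rng = rng or np.random.default_rng(0)
    n = len(client_updates)
    contrib = np.zeros(n)
    for _ in range(num_permutations):
        order = rng.permutation(n)
        model = global_model.copy()
        prev_utility = eval_utility(model)
        for idx in order:
            model = model + client_updates[idx]        # apply this client's update
            utility = eval_utility(model)
            contrib[idx] += utility - prev_utility     # marginal gain of client idx
            prev_utility = utility
    contrib /= num_permutations
    weights = np.clip(contrib, 0.0, None)              # suppress harmful clients
    total = weights.sum()
    return weights / total if total > 0 else np.full(n, 1.0 / n)

def sample_clients(weights, budget, rng=None):
    """Importance sampling: choose clients with probability proportional to weight."""
    rng = rng or np.random.default_rng(0)
    probs = weights / weights.sum()
    k = min(budget, int(np.count_nonzero(probs)))
    return rng.choice(len(weights), size=k, replace=False, p=probs)

# Toy usage: a utility that rewards models close to the all-ones vector.
if __name__ == "__main__":
    dim = 5
    global_model = np.zeros(dim)
    target = np.ones(dim)
    utility = lambda m: -np.linalg.norm(m - target)
    updates = [0.5 * target, 0.2 * target, -0.4 * target]  # third client is "harmful"
    w = shapley_weights(global_model, updates, utility)
    chosen = sample_clients(w, budget=2)
    aggregated = global_model + sum(w[i] * updates[i] for i in chosen)
    print("weights:", w, "chosen clients:", chosen)
```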
Keywords
Federated Learning, Shapley Value