Dynamic User Clustering for Efficient and Privacy-Preserving Federated Learning

IEEE Transactions on Dependable and Secure Computing (2024)

Abstract
With the wider adoption of machine learning and increasing concern about data privacy, federated learning (FL) has received tremendous attention. FL schemes typically enable a set of participants, i.e., data owners, to individually train a machine learning model using their local data, which are then aggregated with the coordination of a central server to construct a global FL model. Improvements upon standard FL include (i) reducing the communication overheads of gradient transmission by utilizing gradient sparsification and (ii) enhancing the security of aggregation by adopting privacy-preserving aggregation (PPAgg) protocols. However, state-of-the-art PPAgg protocols do not interoperate easily with gradient sparsification due to the heterogeneity of users' sparsified gradient vectors. To resolve this issue, we propose a Dynamic User Clustering (DUC) approach with a set of supporting protocols to partition users into clusters based on the nature of the PPAgg protocol and gradient sparsification technique, providing both security guarantees and communication efficiency. Experimental results show that DUC-FL significantly reduces communication overheads and achieves similar model accuracy compared to the baselines. The simplicity of the proposed protocol makes it attractive for both implementation and further improvements.
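The gradient sparsification the abstract refers to transmits only the largest-magnitude gradient entries to cut communication cost. A minimal top-k sketch of this generic technique (not the paper's DUC protocol; the function name `topk_sparsify` is illustrative):

```python
def topk_sparsify(grad, k):
    """Keep the k largest-magnitude entries of a gradient vector and
    zero the rest, so only k values (plus indices) need transmitting.
    Generic top-k sparsification; not the paper's exact scheme."""
    if k >= len(grad):
        return list(grad)
    # Indices of the k entries with the largest absolute value.
    keep = set(sorted(range(len(grad)),
                      key=lambda i: abs(grad[i]), reverse=True)[:k])
    return [g if i in keep else 0.0 for i, g in enumerate(grad)]

print(topk_sparsify([0.1, -2.0, 0.05, 3.0, -0.4], 2))
# -> [0.0, -2.0, 0.0, 3.0, 0.0]
```

Because each user keeps a different index set, the sparsified vectors are heterogeneous across users, which is exactly the mismatch with standard PPAgg protocols that the paper's clustering approach addresses.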
Keywords
User clustering, gradient sparsification, privacy-preserving aggregation, federated learning