BM-FL: A Balanced Weight Strategy for Multi-stage Federated Learning Against Multi-client Data Skewing

IEEE Transactions on Knowledge and Data Engineering (2024)

Abstract
Federated Learning (FL) combined with Differential Privacy (DP) is widely used in healthcare, finance, and IoT because it can learn from data distributed across multiple clients without centralizing it. However, existing FL approaches overlook the differing impact levels among clients and data redundancy issues, resulting in high computational overhead and limited real-time applicability. Additionally, non-independent and identically distributed (Non-IID) and imbalanced datasets across clients pose challenges for privacy preservation and model overfitting. Therefore, we propose a balanced weight strategy for multi-stage federated learning against multi-client data skewing, called BM-FL, which involves clients, intermediate trust servers (ITSs), and the central server (CS). Firstly, to protect data privacy, an improved Laplace $\epsilon$-differential privacy method is employed. Secondly, a novel generative adversarial network (GAN) called BC-GAN is introduced; it generates realistic fake samples and maintains a balanced proportion of samples across different categories. Then, to make full use of each client's valuable data, we design a balanced weight strategy. Extensive experimental results demonstrate the effectiveness of BM-FL in handling classification tasks on Non-IID and imbalanced datasets while preserving privacy and security. Furthermore, our method attains superior classification accuracy with fewer training epochs compared to relevant classical algorithms. The code is available at https://github.com/ylxzjy/BMFL.git.
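A minimal Python sketch of two of the ideas named in the abstract: client updates perturbed with Laplace noise for $\epsilon$-differential privacy, and a weighted average at the server driven by per-client balance weights. This is an illustrative assumption only; the function names (laplace_perturb, balanced_aggregate) and the balance-score weighting are hypothetical, and the paper's improved Laplace mechanism, BC-GAN, and exact weight formula are not specified in the abstract.

    import numpy as np

    def laplace_perturb(update, epsilon, sensitivity=1.0, rng=None):
        # Standard Laplace mechanism: noise scale = sensitivity / epsilon.
        # The paper uses an "improved" variant whose details are not given here.
        rng = rng if rng is not None else np.random.default_rng()
        return update + rng.laplace(0.0, sensitivity / epsilon, size=update.shape)

    def balanced_aggregate(updates, balance_scores):
        # Weighted average of client updates; weights are normalized from assumed
        # per-client balance scores (e.g., class balance after GAN-based augmentation)
        # rather than raw sample counts as in plain FedAvg.
        w = np.asarray(balance_scores, dtype=float)
        w = w / w.sum()
        return sum(wi * ui for wi, ui in zip(w, updates))

    # Example: three clients, each holding one flattened parameter update.
    clients = [np.random.randn(10) for _ in range(3)]
    noisy = [laplace_perturb(u, epsilon=1.0) for u in clients]
    global_update = balanced_aggregate(noisy, balance_scores=[0.9, 0.6, 0.75])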
Keywords
Federated Learning, Differential Privacy, Balanced Weight Strategy, Data Privacy, Data Skewing