
Personalized and privacy-enhanced federated learning framework via knowledge distillation

Neurocomputing (2024)

Abstract
Federated learning is a distributed learning framework in which all participants jointly train a global model while keeping their data private. In existing federated learning frameworks, all clients share the same global model and cannot customize the model architecture to their needs. In this paper, we propose FLKD (federated learning with knowledge distillation), a personalized and privacy-enhanced federated learning framework. In FLKD, the global model serves as a medium for knowledge transfer, and each client can customize its local model and train it alongside the global model through mutual learning. Furthermore, the participation of heterogeneous local models changes the training strategy of the global model, giving FLKD natural immunity to gradient leakage attacks. We conduct extensive empirical experiments to train and evaluate our framework. The results show that FLKD effectively addresses model heterogeneity and defends against gradient leakage attacks.
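
The abstract describes clients training customized local models together with the shared global model via mutual learning. The sketch below is only a minimal illustration of that general idea, assuming a standard deep-mutual-learning loss (cross-entropy plus bidirectional KL divergence with temperature smoothing); the function name, loss weighting alpha, and temperature T are illustrative assumptions and are not taken from the paper.

# Minimal sketch of client-side mutual learning between a shared global model
# and a customized local model. Assumes a deep-mutual-learning style loss
# (cross-entropy + bidirectional KL divergence); names and weights are
# illustrative, not the paper's actual FLKD training procedure.
import torch
import torch.nn.functional as F

def mutual_learning_step(local_model, global_model, x, y,
                         opt_local, opt_global, alpha=0.5, T=2.0):
    """One client update: each model learns from the labels and from the other."""
    logits_l = local_model(x)
    logits_g = global_model(x)

    # Softened predictions used as distillation targets, detached so each
    # model treats the other's output as a fixed teacher signal for this step.
    p_l = F.softmax(logits_l.detach() / T, dim=1)
    p_g = F.softmax(logits_g.detach() / T, dim=1)

    # Local model: supervised loss + KL toward the global model's predictions.
    loss_local = F.cross_entropy(logits_l, y) + alpha * (T * T) * F.kl_div(
        F.log_softmax(logits_l / T, dim=1), p_g, reduction="batchmean")

    # Global model: supervised loss + KL toward the local model's predictions.
    loss_global = F.cross_entropy(logits_g, y) + alpha * (T * T) * F.kl_div(
        F.log_softmax(logits_g / T, dim=1), p_l, reduction="batchmean")

    opt_local.zero_grad(); loss_local.backward(); opt_local.step()
    opt_global.zero_grad(); loss_global.backward(); opt_global.step()
    return loss_local.item(), loss_global.item()

In an FLKD-style round as the abstract outlines it, the server would distribute the global model, each client would run updates like this on private data with its own local architecture, and only the global model would be communicated back; the paper's actual aggregation scheme and anti-leakage training strategy are not reproduced here.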
Key words
Federated learning, Model heterogeneity, Gradient privacy, Knowledge distillation