FedBiKD: Federated Bidirectional Knowledge Distillation for Distracted Driving Detection

IEEE Internet of Things Journal (2023)

Abstract
Distracted driving is a leading cause of road traffic injuries and deaths. Fortunately, rapidly developing deep learning technology has shown its potential in distracted driving detection. Nevertheless, deep learning-based solutions need to collect large amounts of driving data captured by in-vehicle camera sensors, which raises serious privacy concerns. As a privacy-preserving distributed learning paradigm, federated learning (FL) has recently achieved competitive performance in many applications. Inspired by this, we introduce FL into distracted driving detection. However, we observe that heterogeneous data distributions across drivers lead to significant performance degradation of the model learned in FL. To address this challenge, we propose a simple and effective federated bidirectional knowledge distillation framework, FedBiKD. Specifically, FedBiKD uses knowledge from the global model to guide local training, mitigating local deviation. Meanwhile, the consensus of the ensemble of local models is used to fine-tune the aggregated global model, reducing volatility during training. Extensive experiments demonstrate the effectiveness of FedBiKD in distracted driving detection: it significantly outperforms other FL algorithms in accuracy, communication efficiency, convergence rate, and stability.
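To make the two distillation directions concrete, the sketch below gives one plausible PyTorch realization of the idea described in the abstract: a global-to-local distillation term added to each client's loss, and a server-side step that fine-tunes the aggregated model toward the consensus (averaged logits) of the local-model ensemble. This is a minimal illustration only; the function names, the weighting `kd_weight`, the temperature `T`, and the use of a server-side proxy data loader are assumptions, not details taken from the paper.

```python
# Hypothetical sketch of bidirectional knowledge distillation in FL.
# All names and hyperparameters here are illustrative, not from FedBiKD's code.
import torch
import torch.nn.functional as F

def local_kd_loss(student_logits, teacher_logits, labels, kd_weight=0.5, T=2.0):
    """Client-side loss: cross-entropy on local labels plus KL distillation
    from the frozen global model (teacher) to the local model (student)."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return (1.0 - kd_weight) * ce + kd_weight * kd

def consensus_logits(local_models, x):
    """Ensemble consensus: average the logits of the clients' local models."""
    with torch.no_grad():
        return torch.stack([m(x) for m in local_models]).mean(dim=0)

def finetune_global(global_model, local_models, proxy_loader, lr=1e-3, T=2.0):
    """Server-side step: nudge the aggregated global model toward the
    ensemble consensus on a small (assumed) proxy dataset."""
    opt = torch.optim.SGD(global_model.parameters(), lr=lr)
    for x, _ in proxy_loader:
        target = F.softmax(consensus_logits(local_models, x) / T, dim=1)
        out = F.log_softmax(global_model(x) / T, dim=1)
        loss = F.kl_div(out, target, reduction="batchmean") * (T * T)
        opt.zero_grad()
        loss.backward()
        opt.step()
```

Under this reading, the global-to-local term anchors each client against drifting on its heterogeneous local data, while the local-to-global term smooths the aggregated model between rounds, which is consistent with the stability and convergence gains the abstract reports.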
Keywords
Deep neural networks, distracted driving detection, federated learning (FL), knowledge distillation