Evaluating Knowledge Transfer in the Neural Network for Medical Images

IEEE Access (2023)

Abstract
The performance of deep learning models, such as convolutional neural networks (CNNs), is highly dependent on the size of the training dataset. Consequently, it can be challenging to achieve satisfactory performance when training models from scratch in low-data environments. To address this issue, knowledge transfer from pre-trained networks can be particularly useful. In this study, we implement several experiments with standard transfer learning approaches as our baseline and introduce a novel knowledge transfer approach, called teacher-student learning, to improve the performance of predictive models in diagnostic medical imaging. Specifically, we investigate various configurations of the teacher-student learning framework, inspired by activation-based attention transfer in computer vision models, to help address challenges faced in medical imaging such as the limited availability of annotated data and limited computing resources. We show that the teacher-student learning approach holds great promise for significantly enhancing the performance of diagnostic models. Our findings could be instrumental in improving healthcare accessibility and affordability, as they may enable the development of cost-effective and widely accessible medical imaging technologies, particularly in limited-data environments.
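The abstract does not include code, so the following is only a minimal sketch of what an activation-based attention transfer loss for teacher-student training might look like in PyTorch (in the spirit of Zagoruyko and Komodakis' attention transfer). The function names, the layer pairing, and the weighting factor beta are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def attention_map(features: torch.Tensor) -> torch.Tensor:
    """Collapse a (B, C, H, W) activation into a normalized (B, H*W) attention map
    by averaging squared activations over channels (activation-based attention)."""
    attn = features.pow(2).mean(dim=1)        # (B, H, W)
    attn = attn.flatten(start_dim=1)          # (B, H*W)
    return F.normalize(attn, p=2, dim=1)      # unit L2 norm per sample

def attention_transfer_loss(student_feats, teacher_feats):
    """Mean squared distance between student and teacher attention maps,
    averaged over the paired intermediate layers."""
    losses = [
        (attention_map(s) - attention_map(t)).pow(2).mean()
        for s, t in zip(student_feats, teacher_feats)
    ]
    return torch.stack(losses).mean()

# Hypothetical training objective: the usual task loss on labels plus the
# attention-transfer term, weighted by a tunable coefficient beta.
def total_loss(student_logits, labels, student_feats, teacher_feats, beta=1000.0):
    task_loss = F.cross_entropy(student_logits, labels)
    at_loss = attention_transfer_loss(student_feats, teacher_feats)
    return task_loss + beta * at_loss
```

In such a setup, the teacher network (pre-trained on a larger dataset) is frozen, its intermediate feature maps are collected alongside the student's at matching stages, and only the student's parameters are updated with the combined loss.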
Keywords
Deep learning, diseases, task analysis, medical image, transfer learning, attention transfer, small dataset, teacher-student