A Fusion Algorithm of Multi-model Pruning and Collaborative Distillation Learning

Zihan Liu, Zhiguo Shi

Journal of Physics: Conference Series (2020)

Abstract
Complex deep network models require large numbers of parameters and computations to achieve good prediction performance, at the cost of greatly increased computation time and energy consumption. This paper proposes a fusion algorithm that combines model pruning with joint multi-model knowledge distillation learning. By constructing an adaptive joint learning loss function that includes a distillation term, multiple models are trained together, and this procedure can replace the conventional fine-tuning step that normally follows pruning. First, multi-class classification tasks are run on different model architectures and data sets, and a high-accuracy complex network is trained as the teacher model. Then, channel pruning with a randomly chosen pruning degree is applied to generate multiple student models. Finally, the accuracy of the student models is recovered with the proposed joint distillation training. Experimental results show that the method effectively improves the accuracy of each pruned model and achieves a suitable balance between model efficiency and accuracy.
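The abstract does not give the exact form of the adaptive joint loss, but a standard distillation loss of the kind it describes combines hard-label cross-entropy with a temperature-softened KL term between teacher and student outputs. The sketch below (NumPy, with an assumed temperature T and mixing weight alpha; the paper's actual adaptive weighting is not specified) illustrates the idea:

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Joint loss: alpha * CE(student, labels)
       + (1 - alpha) * T^2 * KL(teacher_soft || student_soft).
       T and alpha are illustrative choices, not the paper's values."""
    p_s = softmax(student_logits)
    ce = -np.log(p_s[np.arange(len(labels)), labels]).mean()
    q_t = softmax(teacher_logits, T)
    q_s = softmax(student_logits, T)
    kl = (q_t * (np.log(q_t) - np.log(q_s))).sum(axis=-1).mean()
    # T^2 rescales the soft-target gradient back to the hard-label scale
    return alpha * ce + (1 - alpha) * (T ** 2) * kl
```

When the student matches the teacher exactly, the KL term vanishes and only the cross-entropy term remains; an adaptive scheme, as the paper suggests, would adjust alpha during joint training rather than fix it.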
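The student-generation step, channel pruning with a random pruning degree, can be sketched as follows. This is a minimal illustration assuming L1-norm channel ranking and a uniform range for the random ratio; the paper does not state its ranking criterion or ratio range:

```python
import numpy as np

def channel_prune(weights, prune_ratio):
    """Drop a prune_ratio fraction of output channels from a conv kernel
       of shape (out_ch, in_ch, kH, kW), keeping those with largest L1 norm.
       L1 ranking is an assumed criterion for illustration."""
    out_ch = weights.shape[0]
    n_keep = max(1, int(round(out_ch * (1 - prune_ratio))))
    l1 = np.abs(weights).reshape(out_ch, -1).sum(axis=1)
    keep = np.sort(np.argsort(l1)[::-1][:n_keep])  # indices of kept channels
    return weights[keep], keep

def random_pruned_student(weights, rng):
    """Generate one student by drawing a random pruning degree
       (range 0.2-0.8 is an assumption, not from the paper)."""
    ratio = rng.uniform(0.2, 0.8)
    pruned, keep = channel_prune(weights, ratio)
    return pruned, keep, ratio
```

Repeating `random_pruned_student` yields the set of differently-sized student models that are then trained jointly against the teacher.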