
AFMPM: adaptive feature map pruning method based on feature distillation

International Journal of Machine Learning and Cybernetics (2024)

Abstract
Feature distillation is a technique that transfers the middle-layer feature maps of a teacher network to a student network as knowledge. These feature maps not only reflect the image content but also encode the feature-extraction ability of the teacher network. However, existing feature distillation methods lack theoretical guidance for evaluating feature maps, and they suffer from size mismatches between high-dimensional and low-dimensional feature maps and from poor information utilization. In this paper, we propose an Adaptive Feature Map Pruning Method (AFMPM) for feature distillation, which recasts feature map pruning as an optimization problem so that the valid information in the feature map is retained to the greatest extent. AFMPM achieves significant improvements in feature distillation, and its effectiveness and generality are verified by experiments on both the teacher-student distillation framework and the self-distillation framework.
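To make the setup concrete, the following is a minimal NumPy sketch of the generic feature-distillation loss the abstract refers to — not the authors' AFMPM. It illustrates the dimension-mismatch problem: when the student's feature map has fewer channels than the teacher's, a channel projection (learned in practice, random here for illustration; the function name and shapes are assumptions for this sketch) maps the student features into the teacher's channel dimension before computing the mean-squared-error loss.

```python
import numpy as np

def feature_distillation_loss(student_feat, teacher_feat, proj=None):
    """MSE between a student and a teacher feature map.

    Feature maps have shape (C, H, W). If channel counts differ,
    a (C_t, C_s) projection matrix — acting like a 1x1 convolution,
    and normally learned jointly with the student — aligns the
    student's channels with the teacher's before the loss is taken.
    """
    c_s = student_feat.shape[0]
    c_t = teacher_feat.shape[0]
    if c_s != c_t:
        if proj is None:
            # Illustrative random projection (scaled for stable magnitude);
            # in real feature distillation this matrix is trained.
            rng = np.random.default_rng(0)
            proj = rng.standard_normal((c_t, c_s)) / np.sqrt(c_s)
        h, w = student_feat.shape[1:]
        # (C_t, C_s) @ (C_s, H*W) -> (C_t, H*W), reshaped back to (C_t, H, W)
        student_feat = (proj @ student_feat.reshape(c_s, -1)).reshape(c_t, h, w)
    return float(np.mean((student_feat - teacher_feat) ** 2))

# Example: teacher layer has 64 channels, student layer 32, spatial size 8x8.
rng = np.random.default_rng(1)
teacher_map = rng.standard_normal((64, 8, 8))
student_map = rng.standard_normal((32, 8, 8))
loss = feature_distillation_loss(student_map, teacher_map)
```

AFMPM's contribution, per the abstract, is deciding *which* feature-map information to keep before such a loss is applied, posed as an optimization problem rather than a heuristic choice.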
Key words
Knowledge distillation, Model compression, Neural networks, Adaptive pruning