Convolutional Neural Network Simplification With Progressive Retraining

Pattern Recognition Letters (2021)

Abstract
Kernel pruning methods have been proposed to speed up (simplify) convolutional neural network (CNN) models. However, the effectiveness of a simplified model often falls below that of the original one. This letter presents new methods based on objective and subjective relevance criteria for kernel elimination in a layer-by-layer fashion. During the process, the CNN model is retrained only when the current layer has been entirely simplified, by adjusting the weights from the next layer back to the first one and preserving the weights of subsequent layers not involved in the process. We call this strategy progressive retraining, in contrast to kernel pruning methods that usually retrain the entire model after eliminating one or a few kernels. Our subjective relevance criterion exploits humans' ability to recognize visual patterns and improves the designer's understanding of the simplification process. We show that our methods can increase effectiveness with considerable model simplification, outperforming two popular approaches and another from the state of the art on four challenging image datasets. An indirect comparison with 14 recent methods on a well-known image dataset also places our approach using the objective criterion among the most competitive ones. (c) 2021 Elsevier B.V. All rights reserved.
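The control flow the abstract describes can be illustrated with a minimal sketch: layers are simplified one at a time, and after a layer is fully pruned, only that layer and the ones up to the next layer are retrained, while all later layers stay frozen. This is an assumption-laden illustration, not the paper's implementation; the names (`Layer`, `prune_kernels`, `progressive_retrain`) and the fixed keep ratio are hypothetical stand-ins for the paper's relevance criteria and training procedure.

```python
# Hypothetical sketch of progressive retraining as summarized in the abstract.
# Each layer holds (kernel_id, relevance) pairs; relevance stands in for the
# objective/subjective criteria. After simplifying layer i, layers 0..i+1 are
# marked trainable and later layers are frozen (their weights are preserved).

class Layer:
    def __init__(self, name, kernels):
        self.name = name
        self.kernels = kernels      # list of (kernel_id, relevance) pairs
        self.trainable = True

def prune_kernels(layer, keep_ratio):
    """Keep only the most relevant kernels (stand-in relevance criterion)."""
    layer.kernels.sort(key=lambda k: k[1], reverse=True)
    n_keep = max(1, int(len(layer.kernels) * keep_ratio))
    layer.kernels = layer.kernels[:n_keep]

def progressive_retrain(model, keep_ratio=0.5):
    """Simplify layer by layer; retrain up to the next layer, freeze the rest."""
    log = []
    for i, layer in enumerate(model):
        prune_kernels(layer, keep_ratio)
        for j, other in enumerate(model):
            other.trainable = (j <= i + 1)   # freeze layers after the next one
        # A real implementation would run a retraining step here.
        log.append((layer.name, len(layer.kernels),
                    [l.trainable for l in model]))
    return log

model = [Layer("conv1", [(0, .9), (1, .1), (2, .5), (3, .3)]),
         Layer("conv2", [(0, .8), (1, .2)]),
         Layer("conv3", [(0, .7)])]
history = progressive_retrain(model)
```

After simplifying `conv1`, only `conv1` and `conv2` are trainable and `conv3` is frozen; by the last layer the whole model is trainable again, matching the layer-by-layer schedule the abstract contrasts with whole-model retraining after each pruned kernel.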
Key words
Kernel pruning, Deep learning, Image classification