Adaptive CNN filter pruning using global importance metric

Computer Vision and Image Understanding (2022)

Abstract
The depth and width of CNNs have increased over the years so as to learn a better representation of the input–output mapping of a dataset. However, a significant amount of redundancy exists among different convolutional kernels. Several pruning methods suggest that trimming redundant parameters can produce compact structures with only minor degradation in classification performance. Existing pruning methods reduce the number of filters at a uniform rate (i.e. pruning the same percentage of filters from each convolutional layer), which is suboptimal. In this paper, we conduct experiments to observe the sensitivity of every filter towards the final performance of the neural network. The value of comparing filter importance on a global scale and subsequently pruning the neural network adaptively is highlighted for the first time in this paper. Based on our observations, we propose a novel method named ‘Global Filter Importance based Adaptive Pruning (GFI-AP)’ that assigns importance scores to all filters based on how the network learns the input–output mapping of a dataset, so that each filter can be compared against all other convolutional filters. Our results show that non-uniform pruning achieves better compression than uniform pruning. We demonstrate that GFI-AP significantly decreases the number of FLOPs (floating point operations) of VGG and ResNet networks on the ImageNet and CIFAR datasets, without a substantial drop in classification accuracy. GFI-AP reduces FLOPs further than existing pruning methods; for example, the ResNet50 variant of GFI-AP provides an additional 11% reduction in FLOPs over Taylor-FO-BN-72% while achieving higher accuracy.
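The key idea — ranking all filters in a single global ordering and pruning the lowest-scoring fraction, so that each layer receives its own non-uniform pruning rate — can be sketched as follows. This is a minimal illustration, not the paper's method: the function name is hypothetical, and a simple L1-norm score stands in for the paper's learned GFI importance metric.

```python
import numpy as np

def global_filter_pruning(layers, prune_fraction=0.3):
    """Hypothetical sketch of globally-ranked filter pruning.

    layers: list of conv weight tensors, each shaped
            (out_channels, in_channels, k, k).
    Returns, per layer, the indices of the filters to keep.
    """
    # Score every filter in the whole network on a single scale.
    # L1 norm is a placeholder; GFI-AP derives importance from how the
    # network learns the input-output mapping, which differs from this.
    scores = []  # (layer_idx, filter_idx, importance)
    for li, w in enumerate(layers):
        imp = np.abs(w).reshape(w.shape[0], -1).sum(axis=1)
        scores.extend((li, fi, s) for fi, s in enumerate(imp))

    # Globally rank by importance and mark the weakest fraction.
    scores.sort(key=lambda t: t[2])
    n_prune = int(len(scores) * prune_fraction)
    pruned = {(li, fi) for li, fi, _ in scores[:n_prune]}

    # The per-layer pruning rate falls out of the global ranking,
    # so layers with many weak filters are pruned more aggressively.
    return [
        [fi for fi in range(w.shape[0]) if (li, fi) not in pruned]
        for li, w in enumerate(layers)
    ]
```

With a global ranking, a layer whose filters all score low can lose most of its filters while a layer full of important filters is left untouched — exactly the non-uniform behaviour that uniform per-layer pruning cannot express.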
Keywords
Adaptive pruning, Convolutional neural networks, Filter pruning, Model compression