Towards Meta-Pruning via Optimal Transport
CoRR (2024)
Abstract
Structural pruning of neural networks conventionally relies on identifying
and discarding less important neurons, a practice often resulting in
significant accuracy loss that necessitates subsequent fine-tuning efforts.
This paper introduces a novel approach named Intra-Fusion, challenging this
prevailing pruning paradigm. Unlike existing methods that focus on designing
meaningful neuron importance metrics, Intra-Fusion redefines the overarching
pruning procedure. By utilizing the concepts of model fusion and Optimal
Transport, we leverage an agnostically given importance metric to arrive at a
more effective sparse model representation. Notably, our approach achieves
substantial accuracy recovery without the need for resource-intensive
fine-tuning, making it an efficient and promising tool for neural network
compression.
Additionally, we explore how fusion can be added to the pruning process to
significantly decrease the training time while maintaining competitive
performance. We benchmark our results for various networks on commonly used
datasets such as CIFAR-10, CIFAR-100, and ImageNet. More broadly, we hope that
the proposed Intra-Fusion approach invigorates exploration into a fresh
alternative to the predominant compression approaches. Our code is available
here: https://github.com/alexandertheus/Intra-Fusion.
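To make the core idea concrete, here is a minimal, hedged sketch of Optimal-Transport-based neuron merging: rather than discarding low-importance neurons, every neuron's weight vector is transported onto a smaller set of target neurons via an entropic OT plan, with source mass given by an arbitrary importance metric. All function names (`sinkhorn`, `ot_merge`) and design choices below (uniform target mass, squared-Euclidean cost, barycentric projection) are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def sinkhorn(a, b, C, reg=0.1, n_iters=200):
    """Entropic-regularized OT plan between histograms a, b with cost C.

    Illustrative Sinkhorn iterations; a library such as POT would be
    used in practice.
    """
    C = C / C.max()                               # normalize cost for stability
    K = np.exp(-C / reg)
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]            # transport plan

def ot_merge(W, importance, keep):
    """Merge all neurons (rows of W) into `keep` target neurons.

    Targets are the `keep` most important neurons; instead of discarding
    the rest, each source neuron's weights are transported to the targets,
    so no neuron's information is simply dropped (hypothetical scheme).
    """
    idx = np.argsort(importance)[-keep:]          # kept neurons act as targets
    a = importance / importance.sum()             # source mass: importance metric
    b = np.full(keep, 1.0 / keep)                 # uniform target mass (assumption)
    # Cost: squared Euclidean distance between weight vectors.
    C = ((W[:, None, :] - W[idx][None, :, :]) ** 2).sum(-1)
    T = sinkhorn(a, b, C)
    # Barycentric projection: each target is a transport-weighted average.
    return (T / T.sum(0, keepdims=True)).T @ W

rng = np.random.default_rng(0)
W = rng.normal(size=(6, 4))                       # toy layer with 6 neurons
imp = rng.uniform(0.1, 1.0, size=6)               # any importance metric
W_small = ot_merge(W, imp, keep=3)
print(W_small.shape)  # → (3, 4)
```

The merged layer has fewer neurons, but each surviving row aggregates weight information from several original neurons, which is the intuition behind recovering accuracy without fine-tuning.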
Keywords
Pruning, Fusion