Channel Regeneration: Improving Channel Utilization for Compact DNNs.

AAAI 2023

Abstract
Overparameterized deep neural networks contain redundant neurons that do not contribute to the network's accuracy. In this paper, we introduce a novel channel regeneration technique that reinvigorates the redundant channels of efficient architectures by re-initializing their batch normalization (BN) scaling factors γ. Re-initializing the BN γ of these channels promotes regular weight updates during training. Furthermore, we show that channel regeneration encourages the channels to contribute equally to the learned representation, further boosting generalization accuracy. We apply our technique at regular intervals of the training cycle to improve channel utilization. Solutions proposed in previous works either raise the total computational cost or increase model complexity. In contrast, integrating the channel regeneration technique into the training methodology of efficient architectures requires minimal effort and comes at no additional cost in size or memory. Extensive experiments on several image classification benchmarks and on a semantic segmentation task demonstrate the effectiveness of applying channel regeneration to compact architectures.
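The core regeneration step described in the abstract can be sketched as follows. This is a minimal, hypothetical simplification assuming that a channel is "redundant" when the magnitude of its BN scaling factor falls below a threshold; the function name, threshold, and re-initialization value are illustrative assumptions, not details from the paper.

```python
def regenerate_channels(bn_scales, threshold=0.01, init_value=0.5):
    """Reset near-zero BN scaling factors (gamma) so the corresponding
    channels resume receiving meaningful gradient updates.

    bn_scales: list of per-channel gamma values from one BN layer.
    Channels whose |gamma| is below `threshold` are treated as redundant
    and re-initialized to `init_value`; all others are left unchanged.
    (Hypothetical sketch of the paper's channel regeneration step.)
    """
    return [init_value if abs(g) < threshold else g for g in bn_scales]


# In the paper's setting this would be invoked at regular intervals of
# the training cycle, e.g. every k epochs, on each BN layer's gamma.
scales = [0.0, 0.4, 0.005, -0.3]
print(regenerate_channels(scales))  # redundant channels 0 and 2 reset
```

Because only existing BN parameters are rewritten in place, the step adds no parameters or memory, which matches the abstract's claim of zero additional cost in size or memory.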
Keywords
channel regeneration, channel utilization