OrthCaps: An Orthogonal CapsNet with Sparse Attention Routing and Pruning
CVPR 2024
Abstract
Redundancy is a persistent challenge in Capsule Networks (CapsNet), leading to
high computational costs and parameter counts. Although previous works have
introduced pruning after the initial capsule layer, dynamic routing's fully
connected nature and non-orthogonal weight matrices reintroduce redundancy in
deeper layers. Besides, dynamic routing requires iterating to converge, further
increasing computational demands. In this paper, we propose an Orthogonal
Capsule Network (OrthCaps) to reduce redundancy, improve routing performance
and decrease parameter counts. Firstly, an efficient pruned capsule layer is
introduced to discard redundant capsules. Secondly, dynamic routing is replaced
with orthogonal sparse attention routing, eliminating the need for iterations
and fully connected structures. Lastly, weight matrices during routing are
orthogonalized to sustain low capsule similarity, which is the first approach
to introduce orthogonality into CapsNet as far as we know. Our experiments on
baseline datasets affirm the efficiency and robustness of OrthCaps in
classification tasks, in which ablation studies validate the criticality of
each component. Remarkably, OrthCaps-Shallow outperforms other Capsule Network
benchmarks on four datasets while using only 110k parameters, a mere 1.25% of a
standard Capsule Network's total; to our knowledge, this is the smallest
parameter count among existing Capsule Networks. Similarly, OrthCaps-Deep
demonstrates competitive performance across four datasets while using only
1.2% of the parameters required by its counterparts.
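The abstract's central idea is that keeping routing weight matrices orthogonal prevents capsules from collapsing into similar, redundant directions. The paper's exact orthogonalization scheme is not given here, so the following is only a minimal illustrative sketch using a QR-based projection (a common, generic way to enforce orthogonality on a weight matrix); the function name and matrix shape are assumptions for the example.

```python
import numpy as np

def orthogonalize(W):
    """Project a routing weight matrix onto a nearby orthogonal matrix
    via QR decomposition. This is a generic illustration, not the
    paper's specific method."""
    Q, R = np.linalg.qr(W)
    # Sign-fix the columns so the factorization is deterministic;
    # multiplying columns by +/-1 keeps Q orthogonal.
    Q *= np.sign(np.diag(R))
    return Q

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 8))   # toy 8x8 routing weight matrix
Q = orthogonalize(W)

# Orthogonal weights preserve norms and angles, which is how low
# capsule similarity is sustained through deeper layers.
print(np.allclose(Q.T @ Q, np.eye(8)))
```

An orthogonal transform leaves pairwise cosine similarities between capsule vectors unchanged, so capsules that the pruning layer kept distinct stay distinct after routing.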