Compact Sparse R-CNN: Speeding up sparse R-CNN by reducing iterative detection heads and simplifying feature pyramid network

AIP Advances (2023)

Abstract
Processing a large number of proposals usually takes a significant proportion of the inference time in two-stage object detection methods. Sparse regions with CNN features (Sparse R-CNN) was proposed to replace anchor-derived proposals with a small number of learnable proposals. To decrease the missing rate, Sparse R-CNN uses six iterative detection heads to gradually regress the detection boxes toward the corresponding objects, which increases inference time. To reduce the number of iterative heads, we propose an iterative Hungarian assigner that encourages Sparse R-CNN to generate multiple proposals for each object at the inference stage. This decreases the missing rate when the number of iterative heads is small. As a result, Sparse R-CNN with the proposed assigner needs fewer iterative heads while achieving higher detection accuracy. We also observe that the multi-layer outputs of the feature pyramid network contribute little to Sparse R-CNN and propose replacing it with a single-layer output neck. The single-layer output neck further improves the inference speed of Sparse R-CNN without sacrificing detection accuracy. Experimental results show that the proposed iterative Hungarian assigner together with the single-layer output neck improves Sparse R-CNN by 2.5 AP50 on the Microsoft common objects in context (MS-COCO) dataset and by 3.0 AP50 on the PASCAL visual object classes (VOC) dataset while reducing floating point operations (FLOPs) by 30%.
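
The matching step underlying the assigner described above is Hungarian (one-to-one) assignment between predicted proposals and ground-truth objects. The following is a minimal sketch of that step only, assuming a plain L1 box cost and scipy's linear_sum_assignment; the paper's actual cost combines classification and box-regression terms, and its iterative, multi-proposal variant is not shown here.

# Minimal sketch: one-to-one Hungarian matching of proposals to ground truth.
# The L1 box cost is an illustrative simplification, not the paper's full cost.
import numpy as np
from scipy.optimize import linear_sum_assignment

def hungarian_match(pred_boxes: np.ndarray, gt_boxes: np.ndarray):
    """pred_boxes: (N, 4), gt_boxes: (M, 4), boxes as (x1, y1, x2, y2)."""
    # Pairwise L1 distance between every proposal and every ground-truth box.
    cost = np.abs(pred_boxes[:, None, :] - gt_boxes[None, :, :]).sum(-1)  # (N, M)
    # Minimal-total-cost one-to-one assignment; extra proposals stay unmatched.
    pred_idx, gt_idx = linear_sum_assignment(cost)
    return list(zip(pred_idx.tolist(), gt_idx.tolist()))

# Example: three learnable proposals, two ground-truth objects.
preds = np.array([[10, 10, 50, 50], [100, 100, 150, 150], [15, 5, 45, 55]], dtype=float)
gts = np.array([[11, 9, 49, 51], [98, 102, 149, 151]], dtype=float)
print(hungarian_match(preds, gts))  # [(0, 0), (1, 1)]; proposal 2 is left unmatched

A one-to-many or repeated assignment across detection heads, as the abstract proposes, would reuse this same matching primitive while allowing several proposals to be supervised by the same object.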
Keywords
iterative detection heads, feature, R-CNN