PCTP: point cloud transformer pooling block for points set abstraction structure

The Visual Computer (2022)

Abstract
The point cloud is a simple yet accurate data form in the 3D domain, and its unordered nature makes feature representation challenging. The transformer architecture, which has been applied successfully in natural language processing, helps establish connections between the discrete points of point cloud data. In this work, focusing on adapting the self-attention mechanism to point cloud data, we propose a point cloud transformer pooling (PCTP) method combined with the typical set abstraction (SA) structure. In the proposed PCTP, the transformer structure fuses non-local features while pooling local features. Because the SA structure is widely used in point cloud networks for a variety of tasks, we apply the PCTP module to multiple baselines that contain SA-like structures. Preliminary experimental results show that the proposed PCTP significantly improves performance on multiple tasks at a small additional computational cost.
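To make the idea concrete, the sketch below illustrates one plausible reading of the abstract: a set-abstraction-style block in which the usual channel-wise max pooling over each local neighborhood is replaced by self-attention pooling. This is a minimal, hypothetical PyTorch sketch; all class names, dimensions, and the exact attention formulation are assumptions and not the authors' implementation.

```python
# Hypothetical sketch: attention-based pooling inside an SA-like block.
# Not the paper's code; shapes and modules are illustrative assumptions.

import torch
import torch.nn as nn


class TransformerPooling(nn.Module):
    """Pools K grouped neighbor features into one feature per center point
    using self-attention, instead of channel-wise max pooling."""

    def __init__(self, dim, num_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.query = nn.Parameter(torch.randn(1, 1, dim) * 0.02)  # learned pooling query

    def forward(self, grouped):                      # grouped: (B * N, K, C)
        q = self.query.expand(grouped.size(0), -1, -1)
        pooled, _ = self.attn(q, grouped, grouped)   # attend over the K neighbors
        return pooled.squeeze(1)                     # (B * N, C)


class SAWithTransformerPooling(nn.Module):
    """Set-abstraction-like block: per-point MLP on grouped neighbors,
    then attention-based pooling per local region."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(in_dim, out_dim), nn.ReLU(),
                                 nn.Linear(out_dim, out_dim))
        self.pool = TransformerPooling(out_dim)

    def forward(self, grouped_pts):                  # grouped_pts: (B, N, K, in_dim)
        B, N, K, C = grouped_pts.shape
        feats = self.mlp(grouped_pts.reshape(B * N, K, C))
        pooled = self.pool(feats)                    # (B * N, out_dim)
        return pooled.reshape(B, N, -1)              # one feature per sampled center


if __name__ == "__main__":
    # Toy usage: 2 clouds, 128 sampled centers, 16 neighbors each, 3-D coordinates.
    x = torch.randn(2, 128, 16, 3)
    block = SAWithTransformerPooling(in_dim=3, out_dim=64)
    print(block(x).shape)  # torch.Size([2, 128, 64])
```

Since the block only changes the pooling step, it can in principle be dropped into any network that already uses an SA-like grouping stage, which matches the abstract's claim of applying PCTP to multiple baselines.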
Keywords
Self-attention, Transformer, Point cloud, Pooling