DSNet: Double Strand Robotic Grasp Detection Network Based on Cross Attention

Yonghong Zhang, Xiayang Qin, Tiantian Dong, Yuchao Li, Hongcheng Song, Yunping Liu, Ziqi Li, Qi Liu

IEEE ROBOTICS AND AUTOMATION LETTERS (2024)

Abstract
In this letter, we propose a Double Strand robotic grasp detection Network (DSNet) that combines a transformer branch and a U-Net branch within an encoder-decoder structure. DSNet is designed to reconcile the differences between these two approaches and to exploit both local and global features. We strategically incorporate bidirectional bridges with cross-attention mechanisms at the bottleneck of each branch. These bridges retrieve abstract semantic information and transfer it to the opposite branch, preserving both local features and global representations. To validate the performance of DSNet, we use complete RGB-D information as input. DSNet achieves accuracy rates of 98.31% and 95.7% on the Cornell and Jacquard grasping datasets, respectively. We used a 6-DoF AUBO i5 robot to perform full-angle grasping of unknown objects, confirming the reliability of the model.
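The core idea of the bidirectional bridge can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: it assumes single-head cross-attention without learned projection matrices, and the token counts (a flattened 8×8 U-Net bottleneck and a 7×7 transformer bottleneck) are illustrative placeholders. Each branch uses its own bottleneck features as queries and the other branch's features as keys and values, then adds the result back residually.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(q_feats, kv_feats, d_k):
    # queries come from one branch; keys/values come from the other branch
    scores = q_feats @ kv_feats.T / np.sqrt(d_k)       # (Nq, Nkv)
    return softmax(scores, axis=-1) @ kv_feats         # (Nq, d)

rng = np.random.default_rng(0)
d = 32  # assumed channel dimension for illustration
cnn_tokens = rng.standard_normal((64, d))  # flattened U-Net bottleneck (8x8 grid)
vit_tokens = rng.standard_normal((49, d))  # transformer bottleneck tokens (7x7 grid)

# bidirectional bridge: each branch attends to the other's bottleneck,
# with a residual connection preserving the branch's own features
cnn_enriched = cnn_tokens + cross_attention(cnn_tokens, vit_tokens, d)
vit_enriched = vit_tokens + cross_attention(vit_tokens, cnn_tokens, d)
```

In the paper's setting, the enriched features would be passed on to each branch's decoder, so the U-Net strand gains global context while the transformer strand gains local detail.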
Keywords
Deep learning in grasping and manipulation, grasping, hybrid network structure, perception for grasping and manipulation, vision transformer