FuncGrasp: Learning Object-Centric Neural Grasp Functions from Single Annotated Example Object
CoRR (2024)
Abstract
We present FuncGrasp, a framework that infers dense yet reliable grasp
configurations for unseen objects from a single annotated object and a
single-view RGB-D observation, using categorical priors. Unlike previous works
that transfer only a discrete set of grasp poses, FuncGrasp transfers
infinitely many configurations parameterized by an object-centric continuous
grasp function across varying instances. To ease the transfer process, we
propose Neural Surface Grasping Fields (NSGF), an effective neural
representation defined on the object surface that densely encodes grasp
configurations. Further, we perform function-to-function transfer using sphere
primitives to establish semantically meaningful categorical correspondences,
which are learned in an unsupervised fashion without any expert knowledge. We
demonstrate the effectiveness of our approach through extensive experiments in
both simulation and the real world. Remarkably, our framework significantly
outperforms several strong baseline methods in the density and reliability of
the generated grasps.
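To make the idea of a surface-defined grasp field concrete, the sketch below shows a minimal, purely illustrative NSGF-like function: a tiny MLP that maps 3D surface points to per-point grasp parameters (a unit-quaternion orientation, a gripper width, and a quality score). The architecture, weights, and output parameterization here are assumptions for illustration only and do not reproduce the paper's actual NSGF model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny MLP weights (random, untrained; illustration only).
W1 = rng.normal(0.0, 0.5, (3, 32))
b1 = np.zeros(32)
W2 = rng.normal(0.0, 0.5, (32, 6))
b2 = np.zeros(6)

def nsgf_forward(points):
    """Map surface points (N, 3) to per-point grasp parameters:
    a unit quaternion (4), a gripper opening width (1), and a
    grasp-quality score (1). Assumed output layout, not the paper's."""
    h = np.tanh(points @ W1 + b1)
    out = h @ W2 + b2
    quat = out[:, :4]
    quat = quat / np.linalg.norm(quat, axis=1, keepdims=True)  # unit rotation
    width = np.abs(out[:, 4:5])                    # non-negative opening width
    quality = 1.0 / (1.0 + np.exp(-out[:, 5:6]))   # sigmoid score in (0, 1)
    return quat, width, quality

# Query the field at a few points sampled on a unit-sphere "surface".
pts = rng.normal(size=(5, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)
quat, width, quality = nsgf_forward(pts)
```

Because the field is defined on the surface, it can be queried at arbitrarily many points, which is what allows the framework to transfer dense (rather than discrete) grasp configurations across instances.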