ArtiGrasp: Physically Plausible Synthesis of Bi-Manual Dexterous Grasping and Articulation
arXiv (Cornell University), 2023
Abstract
We present ArtiGrasp, a novel method to synthesize bi-manual hand-object
interactions that include grasping and articulation. This task is challenging
due to the diversity of the global wrist motions and the precise finger control
that are necessary to articulate objects. ArtiGrasp leverages reinforcement
learning and physics simulations to train a policy that controls the global and
local hand pose. Our framework unifies grasping and articulation within a
single policy guided by a single hand pose reference. Moreover, to facilitate
the training of the precise finger control required for articulation, we
present a learning curriculum with increasing difficulty. It starts with
single-hand manipulation of stationary objects and continues with multi-agent
training including both hands and non-stationary objects. To evaluate our
method, we introduce Dynamic Object Grasping and Articulation, a task that
involves bringing an object into a target articulated pose. This task requires
grasping, relocation, and articulation. We show our method's efficacy on this
task. We further demonstrate that our method can generate motions with
noisy hand-object pose estimates from an off-the-shelf image-based regressor.
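The learning curriculum described above can be sketched as a simple stage schedule. The sketch below is illustrative, not from the paper: the stage names, the success-rate threshold, and the `next_stage` advancement rule are all hypothetical assumptions.

```python
from dataclasses import dataclass

@dataclass
class Stage:
    name: str          # hypothetical stage label
    two_hands: bool    # single-hand vs. multi-agent (both hands) training
    object_fixed: bool # stationary vs. non-stationary object

# Hypothetical curriculum mirroring the abstract's description:
# start with single-hand manipulation of stationary objects, then
# progress to multi-agent training with both hands and free objects.
CURRICULUM = [
    Stage("single_hand_fixed_object", two_hands=False, object_fixed=True),
    Stage("single_hand_free_object", two_hands=False, object_fixed=False),
    Stage("two_hand_free_object", two_hands=True, object_fixed=False),
]

def next_stage(idx: int, success_rate: float, threshold: float = 0.8) -> int:
    """Advance to the next curriculum stage once the policy clears a
    (hypothetical) success-rate threshold on the current one."""
    if success_rate >= threshold and idx < len(CURRICULUM) - 1:
        return idx + 1
    return idx
```

A training loop would evaluate the policy periodically and call `next_stage` to decide when to enable the second hand and unfix the object.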
Keywords
grasping, physically plausible synthesis, articulation, bi-manual