RPMArt: Towards Robust Perception and Manipulation for Articulated Objects
CoRR (2024)
Abstract
Articulated objects are commonly found in daily life. It is essential that
robots can exhibit robust perception and manipulation skills for articulated
objects in real-world robotic applications. However, existing methods for
articulated objects insufficiently address noise in point clouds and struggle
to bridge the gap between simulation and reality, thus limiting their practical
deployment in real-world scenarios. To tackle these challenges, we propose a
framework towards Robust Perception and Manipulation for Articulated Objects
(RPMArt), which learns to estimate the articulation parameters and manipulate
the articulation part from the noisy point cloud. Our primary contribution is a
Robust Articulation Network (RoArtNet) that is able to predict both joint
parameters and affordable points robustly by local feature learning and point
tuple voting. Moreover, we introduce an articulation-aware classification
scheme to enhance its ability for sim-to-real transfer. Finally, with the
estimated affordable point and articulation joint constraint, the robot can
generate robust actions to manipulate articulated objects. After learning only
from synthetic data, RPMArt is able to transfer zero-shot to real-world
articulated objects. Experimental results confirm our approach's effectiveness,
with our framework achieving state-of-the-art performance in both noise-added
simulation and real-world environments. The code and data will be open-sourced
for reproducibility. More results are available on the project website at
https://r-pmart.github.io.
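The abstract's core idea, point tuple voting, can be illustrated with a toy sketch: sample small tuples of points from a (possibly noisy) point cloud, let each tuple cast a vote for the joint axis, and aggregate the votes into a robust estimate. In RoArtNet the per-tuple vote comes from learned local features; the geometric heuristic below (voting the tuple's direction of largest spread) is purely a hypothetical stand-in to show the voting-and-aggregation structure, not the paper's actual network.

```python
import numpy as np

def tuple_votes(points, n_tuples=256, rng=None):
    """Sample point tuples; each casts a vote for the joint axis direction.

    Illustrative stand-in for a learned per-tuple predictor: each tuple
    votes the direction of largest variance among its points.
    """
    rng = np.random.default_rng(rng)
    votes = []
    for _ in range(n_tuples):
        idx = rng.choice(len(points), size=3, replace=False)
        tup = points[idx]
        # Principal direction of the tuple (toy vote for the joint axis).
        _, _, vt = np.linalg.svd(tup - tup.mean(axis=0))
        v = vt[0]
        # Resolve the sign ambiguity of the axis so votes can be averaged.
        votes.append(v if v[-1] >= 0 else -v)
    return np.array(votes)

def aggregate_axis(votes):
    """Aggregate votes into one axis estimate (normalized mean direction)."""
    mean = votes.mean(axis=0)
    return mean / np.linalg.norm(mean)

# Toy example: a noisy point cloud elongated along the z-axis, so the
# aggregated vote should recover a direction close to +z even though
# individual tuples are noisy.
rng = np.random.default_rng(0)
pts = np.column_stack([
    0.05 * rng.standard_normal(500),
    0.05 * rng.standard_normal(500),
    rng.uniform(0.0, 1.0, 500),
])
axis = aggregate_axis(tuple_votes(pts, rng=1))
print(axis)
```

The point of the voting scheme is robustness: outlier tuples corrupted by noise produce scattered votes that average out, while the consistent majority dominates the aggregate, which matches the abstract's claim of robust prediction from noisy point clouds.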