Category-Independent Articulated Object Tracking with Factor Graphs.

IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2022

Abstract
Robots deployed in human-centric environments may need to manipulate a diverse range of articulated objects, such as doors, dishwashers, and cabinets. Articulated objects often come with unexpected articulation mechanisms that are inconsistent with categorical priors: for example, a drawer might rotate about a hinge joint instead of sliding open. We propose a category-independent framework for predicting the articulation models of unknown objects from sequences of RGB-D images. The prediction is performed by a two-step process: first, a visual perception module tracks object part poses from raw images, and second, a factor graph takes these poses and infers the articulation model, including the current configuration between the parts, as a 6D twist. We also propose a manipulation-oriented metric to evaluate predicted joint twists in terms of how well a compliant robot controller would be able to manipulate the articulated object given the predicted twist. We demonstrate that our visual perception and factor graph modules outperform baselines on simulated data and show the applicability of our factor graph on real-world data.
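To make the "articulation model as a 6D twist" concrete, the following is a minimal sketch, not the paper's factor-graph formulation: it recovers a body-frame twist from two tracked poses of a child part relative to its parent part by taking the matrix logarithm of the relative motion. The function name, the use of scipy.linalg.logm, and the toy hinge example are assumptions for illustration only.

```python
# Minimal sketch (assumed, not the paper's method): estimate a 6D joint twist
# from two observed child-part poses expressed in the parent-part frame.
import numpy as np
from scipy.linalg import logm

def relative_twist(T_parent_child_t0, T_parent_child_t1):
    """Return the 6D twist (v, omega) whose exponential maps the child pose
    at t0 to its pose at t1, expressed in the child frame at t0."""
    # Relative motion of the child part between the two timesteps.
    T_rel = np.linalg.inv(T_parent_child_t0) @ T_parent_child_t1
    # Matrix log of an SE(3) element gives an se(3) (twist) matrix.
    xi_hat = np.real(logm(T_rel))
    omega = np.array([xi_hat[2, 1], xi_hat[0, 2], xi_hat[1, 0]])  # rotational part
    v = xi_hat[:3, 3]                                             # translational part
    return np.concatenate([v, omega])

# Toy example: a pure rotation of 0.3 rad about the z-axis (a hinge-like joint).
theta = 0.3
T0 = np.eye(4)
T1 = np.eye(4)
T1[:3, :3] = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                       [np.sin(theta),  np.cos(theta), 0.0],
                       [0.0,            0.0,           1.0]])
print(relative_twist(T0, T1))  # ~[0, 0, 0, 0, 0, 0.3]: no translation, z-axis rotation
```

A drawer that actually rotates about a hinge, as in the abstract's example, would show up here as a twist with a dominant rotational component rather than the purely translational twist a categorical prior would expect.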
Keywords
articulation model, category-independent articulated object tracking, category-independent framework, factor graph modules, human-centric environments, predicted joint twists, unexpected articulation mechanisms, unknown objects, visual perception module