Yale-CMU-Berkeley dataset for robotic manipulation research

Periodicals (2017)

Abstract

In this paper, we present an image and model dataset of the real-life objects from the Yale-CMU-Berkeley Object Set, which is specifically designed for benchmarking in manipulation research. For each object, the dataset provides 600 high-resolution RGB images, 600 RGB-D images, and five sets of textured three-dimensional geometric models. Segmentation masks and calibration information for each image are also provided. These data were acquired using the BigBIRD Object Scanning Rig and Google Scanners. Together with the dataset, Python scripts and a Robot Operating System node are provided to download the data, generate point clouds, and create Unified Robot Description Format (URDF) files. The dataset is also supported by our website, www.ycbbenchmarks.org, which serves as a portal for publishing and discussing test results and for proposing task protocols and benchmarks.
Keywords
Benchmarking, manipulation, grasping, simulation