The Yale Human Grasping Dataset: Grasp, Object, and Task Data in Household and Machine Shop Environments

International Journal of Robotics Research (2015)

Abstract
This paper presents a dataset of human grasping behavior in unstructured environments. Wide-angle head-mounted camera video was recorded from two housekeepers and two machinists during their regular work activities, and the grasp types, objects, and tasks were analyzed and coded by study staff. The full dataset contains 27.7 hours of tagged video and represents a wide range of manipulative behaviors spanning much of typical human hand usage. We provide the original videos, a spreadsheet including the tagged grasp type, object, and task parameters, time information for each successive grasp, and video screenshots for each instance. Example code is provided for MATLAB and R, demonstrating how to load the dataset and produce simple plots.
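The example code shipped with the dataset is in MATLAB and R; as a rough illustration of the same workflow in Python, the sketch below loads a CSV export of the tagging spreadsheet with pandas and plots grasp-type frequencies. The file name and column names used here (`yale_grasp_tags.csv`, `grasp_type`) are assumptions for illustration, not the dataset's actual schema.

```python
# Minimal sketch of loading the tagging spreadsheet and plotting grasp-type
# frequencies. File and column names are assumptions, not the real schema.
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical CSV export of the tagging spreadsheet; one row per grasp instance.
df = pd.read_csv("yale_grasp_tags.csv")

# Count how often each tagged grasp type appears.
counts = df["grasp_type"].value_counts()

# Simple bar chart of grasp-type frequencies.
counts.plot(kind="bar")
plt.xlabel("Grasp type")
plt.ylabel("Number of tagged instances")
plt.title("Grasp type frequency")
plt.tight_layout()
plt.show()
```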
Keywords
Grasping, manipulation, multifingered hands, dexterous