Real-Time Tactile Grasp Force Sensing Using Fingernail Imaging via Deep Neural Networks

IEEE Robotics and Automation Letters (2022)

Abstract
This letter introduces a novel approach for real-time estimation of the 3D tactile forces exerted by human fingertips using vision alone. The approach is entirely monocular and requires no physical force sensor; it is therefore scalable, non-intrusive, and easily fused with other perception systems such as body pose estimation, which makes it ideal for human-robot interaction (HRI) applications where force sensing is necessary. The approach consists of three main modules: finger tracking, for detecting and tracking each individual finger; image alignment, for preserving the spatial information in the images; and a force model, for estimating the 3D forces from coloration patterns in the images. The model has been validated experimentally, achieving a maximum RMS error of 5.8% of the full force range along all three directions. This accuracy is comparable to prior offline models in the literature, such as EigenNail, while the proposed model performs real-time force estimation at 30 frames per second. In addition, unlike previous force estimation models, it does not need to be retrained for each new human subject: once trained on 11 or 12 subjects, it can estimate forces for subjects it has never seen.
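The three-module pipeline described above can be sketched as a per-frame loop: track fingertips, align each nail crop to a canonical pose, then regress a 3D force from the coloration pattern. The sketch below is hypothetical scaffolding, not the authors' implementation; the function bodies are placeholders standing in for the paper's trained deep networks, and all names and shapes are assumptions.

```python
# Hypothetical sketch of the abstract's pipeline:
# finger tracking -> image alignment -> force model.
# Every stage here is a placeholder; in the paper each is a learned model.

def track_fingers(frame):
    """Detect fingertips and return a nail crop per finger (placeholder).

    Stand-in behavior: every frame yields one 8x8 grayscale crop
    for the index finger.
    """
    return {"index": [[0.0] * 8 for _ in range(8)]}

def align_image(crop):
    """Register the crop to a canonical nail pose so coloration stays
    spatially consistent across frames (placeholder: identity)."""
    return crop

def estimate_force(aligned_crop):
    """Map nail coloration to a 3D force vector (Fx, Fy, Fz) in newtons
    (placeholder for the deep force model)."""
    mean = sum(sum(row) for row in aligned_crop) / 64.0
    return (mean, mean, mean)

def process_frame(frame):
    """Run the full pipeline on one camera frame: one 3D force per finger."""
    forces = {}
    for finger, crop in track_fingers(frame).items():
        forces[finger] = estimate_force(align_image(crop))
    return forces

print(process_frame(frame=None))  # e.g. {'index': (0.0, 0.0, 0.0)}
```

At 30 frames per second, `process_frame` would need to complete in under ~33 ms, which is why the paper's alignment and force-model stages must run as a single feed-forward pass rather than an offline optimization.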
Keywords
Force and tactile sensing, deep learning in grasping and manipulation, perception for grasping and manipulation