Egocentric Hand Tracking for Virtual Reality

Hershed Tilak, Royce Cheng-Yue, Gordon Wetzstein, Robert Konrad

Semantic Scholar (2016)

Abstract
Creating an immersive and realistic environment is one of the most challenging problems in contemporary VR. In this respect, the ability to touch and interact with the VR environment is extremely important. In this project, we created an egocentric hand tracking system that could be embedded inside a VR head-mounted display. The pipeline reads RGB-D images from an Intel RealSense camera, passes the RGB components to YOLO, a CNN-based detector, to obtain bounding boxes, retrieves the corresponding depths at these bounding boxes, and renders the resulting hands in VR. For the hand detection portion of the pipeline, we trained three YOLO models, the best of which achieved a test mAP of 64.6%, on par with YOLO's results on the VOC 2012 dataset. With more training, the other models, which use dropout and jitter, should achieve better accuracy. We showcase the results of this pipeline in a video presented in Section 4.2.
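The depth-retrieval step of the pipeline lends itself to a short sketch. The following is a minimal illustration, not the authors' code: it assumes a depth map already aligned to the RGB frame, bounding boxes in `(x, y, w, h)` pixel format, and a median over the box as the per-hand depth estimate; all three are assumptions, as the abstract does not specify them.

```python
import numpy as np

def depth_for_boxes(depth_map, boxes):
    """Estimate one depth value per detected hand.

    depth_map: (H, W) array of depths, assumed aligned to the RGB frame
               (the abstract does not specify units or alignment).
    boxes:     iterable of (x, y, w, h) detector bounding boxes in pixel
               coordinates (assumed format).
    Returns a list of median depths, one per box (None if no valid pixels).
    """
    h_img, w_img = depth_map.shape
    depths = []
    for x, y, w, h in boxes:
        # Clamp the box to the image bounds.
        x0, y0 = max(0, int(x)), max(0, int(y))
        x1, y1 = min(w_img, int(x + w)), min(h_img, int(y + h))
        patch = depth_map[y0:y1, x0:x1]
        # Depth sensors commonly report 0 for invalid pixels; ignore them.
        valid = patch[patch > 0]
        # The median is robust to background pixels inside the box.
        depths.append(float(np.median(valid)) if valid.size else None)
    return depths
```

In the full system, a per-hand depth like this would place each rendered hand at a plausible distance in the VR scene.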