Towards a data-driven method for RGB video-based hand action quality assessment in real time

SAC '20: The 35th ACM/SIGAPP Symposium on Applied Computing, Brno, Czech Republic, March 2020

Abstract
In recent years, the research community has begun to explore Video-Based Action Quality Assessment on the Human Body (VB-AQA), while little work has yet focused on Video-Based Action Quality Assessment on the Human Hand (VH-AQA). Existing VB-AQA work fails to handle the inconsistency between captured features and reality caused by changing camera angles, leaving a large gap between VB-AQA and VH-AQA, and computational efficiency remains another critical problem. In this paper, a novel data-driven method for real-time VH-AQA is proposed. Features are formulated as spatio-temporal hand poses and are extracted in four steps: hand segmentation, 2D hand pose estimation, 3D hand pose estimation, and hand pose organization. Based on the extracted features, an assessment model evaluates the performance of actions and indicates the most promising adjustment as feedback. We demonstrate the evaluation accuracy and computational efficiency of our method on our own Origami Video Dataset; for the latter, two new metrics are designed. Our method thus opens opportunities for real-time digital reconstruction of physical-world activities and timely assessment.
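The four-step feature pipeline and the downstream assessment model described above can be sketched roughly as below. This is a minimal illustrative skeleton, not the authors' implementation: all function names, the 21-joint hand layout, the wrist-relative normalization, and the toy scoring rule are assumptions introduced here to show the data flow only.

```python
# Hypothetical sketch of the abstract's pipeline: segmentation -> 2D pose ->
# 3D pose -> pose organization -> assessment. Every component here is a stub.
from typing import List, Tuple

Joint2D = Tuple[float, float]
Joint3D = Tuple[float, float, float]

def segment_hand(frame: List[List[int]]) -> List[List[int]]:
    """Step 1: crop the frame to the hand region (identity stub here)."""
    return frame

def estimate_2d_pose(hand_crop: List[List[int]]) -> List[Joint2D]:
    """Step 2: predict 21 2D joint locations (fixed stub values)."""
    return [(float(i), float(i)) for i in range(21)]

def lift_to_3d(joints_2d: List[Joint2D]) -> List[Joint3D]:
    """Step 3: lift 2D joints to 3D by attaching a (stubbed) depth estimate."""
    return [(x, y, 0.0) for x, y in joints_2d]

def organize_pose(joints_3d: List[Joint3D]) -> List[float]:
    """Step 4: organize the pose into a feature vector.

    Here we assume a wrist-relative normalization (joint 0 = wrist).
    """
    wx, wy, wz = joints_3d[0]
    feats: List[float] = []
    for x, y, z in joints_3d:
        feats.extend([x - wx, y - wy, z - wz])
    return feats

def assess(features_over_time: List[List[float]]) -> float:
    """Toy assessment model: mean absolute feature value as a stand-in score."""
    total = sum(abs(v) for frame in features_over_time for v in frame)
    count = sum(len(frame) for frame in features_over_time)
    return total / count if count else 0.0

def pipeline(frames: List[List[List[int]]]) -> float:
    """Run all four extraction steps per frame, then assess the sequence."""
    feats = [organize_pose(lift_to_3d(estimate_2d_pose(segment_hand(f))))
             for f in frames]
    return assess(feats)
```

In a real system each stub would be replaced by a learned model (e.g. a segmentation network and pose estimators), and `assess` would also return the "most promising adjustment" as feedback rather than a single scalar.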
Keywords
Video-Based Action Quality Assessment on Human Hand, hand pose organization, real time, Origami dataset, data-driven