Signer independent real-time hand gestures recognition using multi-features extraction and various classifiers

International Journal of Information Technology (2020)

Abstract
In this research paper, an effort has been made to convert 24 American Sign Language (ASL) signer-independent, real-time hand gesture alphabets into human- or machine-recognizable English text. In the proposed work, the ASL hand gestures used in the cognition and recognition processes are completely invariant to scale, luminance, gender, and distance in the complex background of an indoor location. The Viola-Jones algorithm, the CIE Lab color model, and the Canny approximation to the derivative are used for proper hand segmentation. In both the cognition and recognition processes, various features such as boundary, centroid, entropy, Hu moments, Zernike moments, Gabor filters, Histogram of Oriented Gradients (HOG), and Local Phase Quantization (LPQ) are extracted from the hand gestures. The K-Nearest Neighbor (KNN), Multiclass Support Vector Machine (M-SVM), and Decision Tree (DT) classifiers are used to classify the hand gestures. In the recognition task, these classifiers are applied independently to the same set of hand gestures to compare recognition rate and recognition time. Detailed experimentation shows that the KNN classifier achieved an average recognition rate of 92.71% and an average recognition time of 0.48 s per gesture, which is better than the M-SVM and DT classifiers and encouraging compared to state-of-the-art techniques in a real-time environment under the various invariance conditions considered.
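
The following is a minimal illustrative sketch (not the authors' implementation) of the kind of pipeline the abstract describes: skin-based hand segmentation in the CIE Lab color space with Canny edges, HOG feature extraction, and KNN classification. The library choices (OpenCV, scikit-image, scikit-learn), the Lab thresholds, and the HOG/KNN parameters are assumptions made for illustration; the Viola-Jones detection step and the other feature sets (Hu moments, Zernike moments, Gabor, LPQ) are omitted for brevity.

```python
import cv2
import numpy as np
from skimage.feature import hog
from sklearn.neighbors import KNeighborsClassifier


def segment_hand(bgr_image):
    """Rough skin segmentation in CIE Lab space followed by Canny edge detection."""
    lab = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2LAB)
    # Hypothetical Lab skin thresholds; real values would be tuned per dataset.
    mask = cv2.inRange(lab, (20, 130, 130), (255, 170, 180))
    hand = cv2.bitwise_and(bgr_image, bgr_image, mask=mask)
    gray = cv2.cvtColor(hand, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)  # boundary cue, analogous to the Canny step in the paper
    return gray, edges


def extract_features(gray, size=(64, 64)):
    """HOG descriptor of the resized hand region (one of the paper's feature sets)."""
    resized = cv2.resize(gray, size)
    return hog(resized, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), feature_vector=True)


# Training and prediction: X_train / y_train are placeholders for segmented-hand
# feature vectors and the 24 ASL alphabet labels.
# knn = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
# gray, _ = segment_hand(frame)
# letter = knn.predict([extract_features(gray)])[0]
```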
Keywords
American Sign Language, Gabor filters, Hand gesture, HOG, Hu moments, LPQ, Zernike moments, KNN, M-SVM, Decision tree