A TinyML solution for an IoT-based communication device for hearing impaired

S. Sharma, R. Gupta, A. Kumar

EXPERT SYSTEMS WITH APPLICATIONS (2024)

Abstract
Research on the automatic translation of sign language into verbal languages has progressed in recent years to help speech- and hearing-impaired people communicate with non-signers. In this paper, a tiny machine learning (TinyML) solution is proposed for sign language recognition using a low-cost, wearable, internet-of-things (IoT) device. A lightweight deep neural network is deployed on the edge device to interpret isolated signs from Indian Sign Language using time-series data collected from the device's motion sensors. The scarcity of labeled training data is addressed by employing a deep transfer learning approach, in which knowledge gained from data collected with the motion sensors of a different device is used to initialize the model parameters. The performance of the model is assessed in terms of classification accuracy and prediction time for different sampling rates and transfer schemes. The model achieves an average accuracy of 87.18% when all parameters are retrained with just 4 observations of each sign recorded from the motion sensors of the proposed IoT device. The recognized sign is transmitted to a cloud platform in real time. A mobile application, SignTalk, is also developed; it wirelessly receives the predicted signs from the cloud and displays them as text. SignTalk additionally provides text-to-speech conversion to vocalize the predicted sign for better communication.
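To illustrate the transfer scheme described in the abstract, the sketch below shows one plausible way to retrain all parameters of a lightweight time-series classifier on a few target-device recordings and convert it for edge deployment. This is not the authors' code: the network layout, vocabulary size, window length, file names, and the placeholder few-shot data are all assumptions made for illustration.

```python
# Minimal sketch (not the paper's implementation): fine-tune a small
# 1D-CNN pretrained on motion data from a source device using 4 recordings
# per sign from the target IoT device, then export a TFLite model.
import numpy as np
import tensorflow as tf

NUM_SIGNS = 50   # assumed vocabulary of isolated ISL signs
WINDOW = 100     # assumed samples per sign window
CHANNELS = 6     # e.g. 3-axis accelerometer + 3-axis gyroscope

def build_base_model():
    """Small 1D-CNN suitable for microcontroller-class inference."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(WINDOW, CHANNELS)),
        tf.keras.layers.Conv1D(16, 5, activation="relu"),
        tf.keras.layers.MaxPooling1D(2),
        tf.keras.layers.Conv1D(32, 3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling1D(),
        tf.keras.layers.Dense(NUM_SIGNS, activation="softmax"),
    ])

# Initialize from source-domain weights (hypothetical file) learned on
# the motion sensors of a different device.
model = build_base_model()
model.load_weights("source_device_weights.h5")

# Target-domain few-shot set: 4 observations per sign from the proposed
# IoT device (random placeholders here, standing in for real recordings).
x_few = np.random.randn(4 * NUM_SIGNS, WINDOW, CHANNELS).astype("float32")
y_few = np.repeat(np.arange(NUM_SIGNS), 4)

# "All parameters retrained" transfer scheme: every layer stays trainable.
model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_few, y_few, epochs=30, batch_size=8, verbose=0)

# Convert to TensorFlow Lite so the model fits the edge device's memory budget.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
open("signtalk_model.tflite", "wb").write(converter.convert())
```

In such a setup, the on-device prediction could then be forwarded to the cloud over a lightweight publish/subscribe protocol (for example MQTT), from which the SignTalk app would receive and display the sign; the paper does not specify the transport, so this is only one possible design choice.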
Keywords
Tiny machine learning, Deep transfer learning, Internet of things, Wearable sensors, Indian Sign Language