EarCommand: "Hearing" Your Silent Speech Commands In Ear

Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (2022)

Abstract
Intelligent speech interfaces have advanced rapidly to support the growing demand for convenient control of and interaction with wearable/earable and portable devices. To avoid privacy leakage during speech interaction and to strengthen resistance to ambient noise, silent speech interfaces have been widely explored to enable interaction with mobile/wearable devices without audible sound. However, most existing silent speech solutions require either controlled background illumination or hand involvement to hold a device or perform gestures. In this study, we propose a novel earphone-based, hands-free silent speech interaction approach, named EarCommand. Our technique exploits the relationship between the deformation of the ear canal and the movements of the articulator, and takes advantage of this link to recognize different silent speech commands. Our system achieves a word error rate (WER) of 10.02% for word-level recognition and 12.33% for sentence-level recognition when tested on human subjects with 32 word-level commands and 25 sentence-level commands, which indicates the effectiveness of inferring silent speech commands. Moreover, EarCommand shows high reliability and robustness across a variety of configuration settings and environmental conditions. We anticipate that EarCommand can serve as an efficient, intelligent speech interface for hands-free operation, significantly improving the quality and convenience of interaction.
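The reported WER figures follow the standard definition: the minimum number of word substitutions, deletions, and insertions needed to transform the recognized command into the reference, divided by the number of reference words. A minimal sketch of that computation is below (the function name and example commands are illustrative, not taken from the paper's command set):

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = (substitutions + deletions + insertions) / reference length,
    computed over words with standard Levenshtein dynamic programming."""
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between ref[:i] and hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # deleting all i reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # inserting all j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution / match
    return d[len(ref)][len(hyp)] / max(len(ref), 1)

# Hypothetical sentence-level command, for illustration only:
print(word_error_rate("turn on the lights", "turn off the lights"))  # 0.25
```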
Keywords
Acoustic Sensing, Ear Canal Deformation, Earphone, Silent Speech