Communication breakdown: Gaze-based prediction of system error for AI-assisted robotic arm simulated in VR.

Eye Tracking Research & Applications (2024)

Abstract
Neurological degenerative conditions can affect motor functions, making mobility daunting. Recent configurations of mobility devices that leverage artificial intelligence (AI) demonstrate AI's ability to handle complex information such as user input. We create a virtual reality environment to measure participants' reactions to correct and incorrect feedback from an AI-assistance system. Using gaze to evaluate these reactions, we investigate whether we can automatically predict an upcoming system error. Our results show that gaze reactions occur within 300 ms when the system highlights user input, but the delay extends to 1 second without highlighting. Subject-dependent gaze behavior made it difficult to develop a generalizable model based on previous work using temporal convolutional networks (TCNs) for online recognition of upcoming errors. Therefore, models adapted to individual users may be a better alternative for gaze-based accessibility systems.
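The abstract mentions TCNs for online recognition of upcoming errors. The defining building block of a TCN is a causal dilated convolution: the output at time t depends only on the current and past inputs, which is what makes online (streaming) prediction possible. Below is a minimal NumPy sketch of that building block; it is illustrative only and is not the authors' implementation (the function name, kernel, and dilation values are hypothetical).

```python
import numpy as np

def causal_dilated_conv(x, w, dilation=1):
    """Causal dilated 1-D convolution over a feature sequence x.

    Output at time t uses only x[t], x[t-d], x[t-2d], ... (d = dilation),
    so no future samples leak into the prediction -- the property a TCN
    relies on for online error recognition.
    """
    k = len(w)
    pad = (k - 1) * dilation
    xp = np.concatenate([np.zeros(pad), np.asarray(x, dtype=float)])  # left-pad only
    return np.array([
        sum(w[j] * xp[t + pad - j * dilation] for j in range(k))
        for t in range(len(x))
    ])

# Impulse at t=2 with kernel [1.0, 0.5] and dilation 2:
# the response appears at t=2 and t=4, never before t=2 (causality).
x = np.zeros(6)
x[2] = 1.0
out = causal_dilated_conv(x, np.array([1.0, 0.5]), dilation=2)
```

In a full TCN, several such layers with increasing dilation are stacked so the receptive field grows exponentially, letting the model cover the 300 ms to 1 s reaction windows reported above at a modest depth.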