Characterizing information access needs in gaze-adaptive augmented reality interfaces: implications for fast-paced and dynamic usage contexts

HUMAN-COMPUTER INTERACTION (2023)

Abstract
Gaze-adaptive interfaces can enable intuitive hands-free augmented reality (AR) interaction, but unintentional selection (i.e., the "Midas Touch" problem) can have serious consequences during high-stakes real-world AR use. In the present study, we assessed how simulated gaze-adaptive AR interfaces, implementing single and dual gaze inputs, influence Soldiers' human performance and user experience (UX) in a fast-paced virtual reality marksmanship task. In Experiment 1, we investigated 1- and 2-stage dwell-based interfaces and found that confirmatory dual-gaze dwell input effectively reduced Midas Touch but also reduced task performance and UX compared to an always-on (AO) interface. In Experiment 2, we investigated gaze depth-based interfaces and found similar negative impacts of confirmatory dwell on Midas Touch, task performance, and UX. Overall, compared to the AO interface, single gaze input interfaces (e.g., a single dwell or a gaze depth threshold) reduced viewing of task-irrelevant information and yielded similar task performance and UX despite being prone to Midas Touch. Broadly, our findings demonstrate that AR users performing fast-paced, dynamic tasks can tolerate some unintentional activation of AR displays provided reliable and rapid information access is maintained. They also point to the need to develop and refine gaze depth estimation algorithms and novel gaze depth-based interfaces that provide rapid access to AR display content.
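
To make the interface logics concrete, the sketch below illustrates the three activation schemes the abstract contrasts: a 1-stage dwell selector, the confirmatory 2-stage (dual gaze input) variant from Experiment 1, and a simple gaze-depth-threshold activation like the single-input interface in Experiment 2. This is not the authors' implementation; the class, names, and all thresholds are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class DwellSelector:
    """Dwell-based AR panel activation. All timing values are assumptions."""
    dwell_s: float = 0.4        # assumed first-stage dwell threshold (seconds)
    confirm_s: float = 0.3      # assumed confirmatory dwell threshold (seconds)
    two_stage: bool = False     # False -> 1-stage dwell, True -> 2-stage (dual gaze input)
    _t_target: float = 0.0      # accumulated dwell time on the target region
    _t_confirm: float = 0.0     # accumulated dwell time on the confirm region
    _armed: bool = False        # first dwell completed, awaiting confirmation

    def update(self, dt: float, gaze_on_target: bool, gaze_on_confirm: bool) -> bool:
        """Advance timers by dt seconds; return True on the frame the panel activates."""
        # Stage 1: accumulate dwell while gaze stays on the target; reset on gaze-away.
        self._t_target = self._t_target + dt if gaze_on_target else 0.0
        if not self._armed and self._t_target >= self.dwell_s:
            if not self.two_stage:
                # 1-stage: activate immediately (fast access, Midas-Touch-prone).
                self._t_target = 0.0
                return True
            self._armed = True  # 2-stage: require a confirmatory dwell next.
        # Stage 2: confirmatory dwell on a separate confirm region. A production
        # version would also time out the armed state; omitted here for brevity.
        if self._armed:
            self._t_confirm = self._t_confirm + dt if gaze_on_confirm else 0.0
            if self._t_confirm >= self.confirm_s:
                self._armed = False
                self._t_confirm = self._t_target = 0.0
                return True
        return False

def depth_activated(gaze_depth_m: float, threshold_m: float = 1.0) -> bool:
    """Gaze-depth-threshold interface (single gaze input): activate when the
    estimated vergence depth falls nearer than an assumed near-field threshold
    at which the AR panel is rendered."""
    return gaze_depth_m < threshold_m

Fed per-frame eye-tracker hit tests (e.g., selector.update(dt=1/90, gaze_on_target=True, gaze_on_confirm=False) at 90 Hz), the 1-stage variant trades Midas Touch resistance for faster information access, while the 2-stage variant does the reverse, mirroring the trade-off reported in the abstract.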
Keywords
augmented reality interfaces, information access, gaze-adaptive, fast-paced