MRLab: Virtual-Reality Fusion Smart Laboratory Based on Multimodal Fusion

INTERNATIONAL JOURNAL OF HUMAN-COMPUTER INTERACTION (2024)

Abstract
During the COVID-19 pandemic, online classes became the only option for many students. The main challenge for these classes was conducting risky and complex chemical or biological experiments in a domestic environment. To address this challenge, a smart experiment system called MRLab was developed. MRLab used wearables such as a smart glove and a head-mounted device to record sensory data, and a multimodal hybrid fusion model, GVVS, to interpret the user's experimental intent, transforming the user's abstract behavioral actions into a computable probabilistic set of experimental intents. Different experiments in MRLab used different libraries of experimental intents. The SrNet model in GVVS estimated the probability of the user's gesture behavior from smart-glove data, while the SIPA algorithm compared speech entered during the experiment against the experimental intent library to estimate the probability of the user's intent. At the same time, the scene visual channel monitored the object the user intended to operate, with the SVF algorithm computing the probability of the intended object in real time. ANOVA and post-hoc comparative tests conducted with 21 volunteers showed that MRLab outperformed the other experiment modes (WEB, AR, and VR) in intent understanding rate, efficiency, and user satisfaction. MRLab therefore proved to be a useful alternative to traditional physical laboratory experiments during the pandemic, as well as an additional teaching tool for remote learning.
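The fusion step described above (combining gesture, speech, and scene-vision probabilities over a shared intent library) can be sketched as a simple weighted late fusion. This is an illustrative sketch only: the paper's SrNet, SIPA, and SVF models are not reproduced here, and the intent names, channel outputs, and weights below are hypothetical placeholders.

```python
# Hedged sketch of hybrid multimodal fusion over an experimental intent library.
# The per-channel probability vectors stand in for the outputs of SrNet (gesture),
# SIPA (speech), and SVF (scene vision); weights are illustrative, not from the paper.
import numpy as np

# Hypothetical intent library for one experiment
INTENTS = ["pour_liquid", "heat_beaker", "stir_solution"]

def fuse_intent_probabilities(p_gesture, p_speech, p_vision,
                              weights=(0.4, 0.3, 0.3)):
    """Fuse three per-channel probability vectors into one distribution."""
    channels = np.array([p_gesture, p_speech, p_vision], dtype=float)
    w = np.array(weights, dtype=float).reshape(-1, 1)
    fused = (w * channels).sum(axis=0)  # weighted sum across channels
    return fused / fused.sum()          # renormalize to a distribution

# Example: each channel emits a probability over the same intent library
p_g = [0.7, 0.2, 0.1]  # gesture channel (assumed output)
p_s = [0.6, 0.3, 0.1]  # speech channel (assumed output)
p_v = [0.5, 0.4, 0.1]  # scene-vision channel (assumed output)
fused = fuse_intent_probabilities(p_g, p_s, p_v)
predicted_intent = INTENTS[int(np.argmax(fused))]
```

A weighted sum (rather than a product) keeps the fusion robust when one channel assigns near-zero probability, which is one plausible reading of a "hybrid" fusion scheme; the actual GVVS combination rule may differ.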
Keywords
Smart glove, mixed reality, virtual-reality fusion interaction, multimodal fusion, intent understanding