
Visual and haptic feedback in detecting motor imagery within a wearable brain–computer interface

Measurement (2023)

Abstract
This paper presents a wearable brain–computer interface relying on neurofeedback in extended reality for the enhancement of motor imagery training. Visual and vibrotactile feedback modalities were evaluated when presented either individually or simultaneously. Only three acquisition channels were employed, together with state-of-the-art chest-based vibrotactile feedback. Experimental validation was carried out with eight subjects participating in two or three sessions on different days, with 360 trials per subject per session. Neurofeedback led to a statistically significant improvement in performance over the two/three sessions, thus demonstrating for the first time the functionality of a motor imagery-based instrument built from a highly wearable electroencephalograph and a commercial gaming vibrotactile suit. In the best cases, classification accuracy exceeded 80%, an improvement of more than 20% over the initial performance. No single feedback modality was preferable across the whole cohort; rather, the results suggest that the best feedback modality may be subject-dependent.
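The abstract does not specify the classification pipeline, but motor imagery BCIs of this kind typically exploit event-related desynchronization: imagining a hand movement suppresses mu-band (8–12 Hz) power over the contralateral motor cortex. The following is a minimal, hypothetical sketch of such a classifier for a three-channel epoch; the sampling rate, channel layout (C3/Cz/C4), and band limits are assumptions, not details taken from the paper.

```python
import numpy as np

FS = 250           # assumed sampling rate, Hz
MU_BAND = (8, 12)  # mu rhythm band, Hz

def bandpower(epoch, fs=FS, band=MU_BAND):
    """Mean power per channel inside `band`, via a simple FFT periodogram.

    epoch: array of shape (n_channels, n_samples).
    """
    freqs = np.fft.rfftfreq(epoch.shape[-1], d=1.0 / fs)
    psd = np.abs(np.fft.rfft(epoch, axis=-1)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[:, mask].mean(axis=-1)

def classify(epoch, c3_idx=0, c4_idx=2):
    """Lateralized decision rule: lower mu power over left motor cortex
    (C3) than over right (C4) suggests right-hand imagery, and vice versa."""
    p = np.log(bandpower(epoch))
    return "right" if p[c3_idx] < p[c4_idx] else "left"
```

A real system would add band-pass filtering, artifact rejection, and a trained classifier (e.g. CSP + LDA), but the lateralized band-power comparison above is the physiological signal such pipelines build on.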
Key words
Brain–computer interface, Motor imagery, Electroencephalography, Extended reality, Haptic, Neurofeedback