Personalised, Multi-modal, Affective State Detection for Hybrid Brain-Computer Music Interfacing

IEEE Transactions on Affective Computing (2020)

Abstract
Brain-computer music interfaces (BCMIs) may be used to modulate affective states, with applications in music therapy, composition, and entertainment. However, for such systems to work, they must reliably detect their user's current affective state. We present a method for personalised affective state detection for use in BCMI. We compare it to a population-based detection method trained on 17 users and demonstrate that personalised affective state detection is significantly ($p<0.01$) more accurate, with average improvements in accuracy of 10.2 percent for valence and 9.3 percent for arousal. We also compare a hybrid BCMI (a BCMI that combines physiological signals with neurological signals) to a conventional BCMI design (one based upon the use of only EEG features) and demonstrate that the hybrid design results in a significant ($p<0.01$) 6.2 percent improvement in performance for arousal classification and a significant ($p<0.01$) 5.9 percent improvement for valence classification.
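The comparison between a conventional (EEG-only) and a hybrid BCMI can be illustrated with a minimal sketch. The snippet below is not the paper's pipeline: it assumes feature-level fusion (concatenating hypothetical EEG band-power features with hypothetical peripheral features such as heart rate and skin conductance), uses synthetic random data in place of recorded signals, and an off-the-shelf SVM classifier for binary high/low arousal labels.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials = 120

# Hypothetical neurological features: 8 EEG band-power values per trial.
eeg = rng.normal(size=(n_trials, 8))
# Hypothetical physiological features: heart rate and skin conductance.
phys = rng.normal(size=(n_trials, 2))
# Synthetic binary labels standing in for high/low arousal.
y = rng.integers(0, 2, size=n_trials)

# Conventional BCMI design: classifier trained on EEG features only.
eeg_only_acc = cross_val_score(SVC(), eeg, y, cv=5).mean()

# Hybrid BCMI design: feature-level fusion of EEG and physiological signals.
hybrid = np.hstack([eeg, phys])
hybrid_acc = cross_val_score(SVC(), hybrid, y, cv=5).mean()

print(f"EEG-only accuracy: {eeg_only_acc:.2f}")
print(f"Hybrid accuracy:   {hybrid_acc:.2f}")
```

With real affect-labelled recordings rather than random data, the paper's reported gains would correspond to the hybrid score exceeding the EEG-only score.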
Key words
Electroencephalography, Physiology, Music, Calibration, Training, Medical treatment, Psychology