Hypercomplex Multimodal Emotion Recognition from EEG and Peripheral Physiological Signals

2023 IEEE International Conference on Acoustics, Speech, and Signal Processing Workshops (ICASSPW)(2023)

Cited: 1 | Views: 91
Abstract
Multimodal emotion recognition from physiological signals is receiving increasing attention because, unlike behavioral reactions, such signals cannot be controlled at will and thus provide more reliable information. Existing deep learning-based methods still rely on extracted handcrafted features, failing to take full advantage of the learning ability of neural networks, and often adopt a single-modality approach, whereas human emotions are inherently expressed in a multimodal way. In this paper, we propose a hypercomplex multimodal network equipped with a novel fusion module comprising parameterized hypercomplex multiplications. Indeed, by operating in a hypercomplex domain, the operations follow algebraic rules that allow modeling latent relations among learned feature dimensions, yielding a more effective fusion step. We classify valence and arousal from electroencephalogram (EEG) and peripheral physiological signals, employing the publicly available MAHNOB-HCI database and surpassing a multimodal state-of-the-art network. The code of our work is freely available at https://github.com/ispamm/MHyEEG.
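The abstract's "algebraic rules" refer to hypercomplex multiplication, whose simplest instance is the quaternion Hamilton product. The sketch below, a generic illustration and not the authors' MHyEEG implementation, shows how this product mixes all four components of each operand into every output component; in a parameterized hypercomplex layer, an analogous learned rule mixes feature blocks across modalities during fusion.

```python
def hamilton_product(p, q):
    """Hamilton product of two quaternions p = a + bi + cj + dk and
    q = e + fi + gj + hk, each given as a 4-tuple of real numbers.
    Every output component depends on all components of both inputs,
    which is the mixing property hypercomplex fusion layers exploit."""
    a, b, c, d = p
    e, f, g, h = q
    return (a*e - b*f - c*g - d*h,   # real part
            a*f + b*e + c*h - d*g,   # i component
            a*g - b*h + c*e + d*f,   # j component
            a*h + b*g - c*f + d*e)   # k component
```

For example, multiplying the unit quaternion i by itself gives -1, and i times j gives k, matching the defining identities of the quaternion algebra.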
Keywords
Hypercomplex Neural Networks, Hypercomplex Algebra, EEG, Multimodal Emotion Recognition