Multimodal Affective States Recognition Based on Multiscale CNNs and Biologically Inspired Decision Fusion Model

IEEE Transactions on Affective Computing (2023)

Citations: 10 | Views: 5
Abstract
Encouraging progress has been made in recent years on affective states recognition models based on single-modality signals such as electroencephalogram (EEG) signals or peripheral physiological signals. However, affective states recognition methods based on multimodal physiological signals have not been thoroughly explored yet. Here we propose Multiscale Convolutional Neural Networks (Multiscale CNNs) and a biologically inspired decision fusion model for multimodal affective states recognition. First, the raw signals are pre-processed with baseline signals. Then, the High Scale CNN and Low Scale CNN in the Multiscale CNNs predict the probability of affective states for the EEG signals and each peripheral physiological signal, respectively. Finally, the fusion model estimates the reliability of each single-modality signal from the Euclidean distance between the class labels and the classification probabilities produced by the Multiscale CNNs; the decision is made by the more reliable modality while the information from the other modalities is retained. We use this model to classify four affective states on the arousal-valence plane in the DEAP and AMIGOS datasets. The results show that the fusion model significantly improves the accuracy of affective states recognition compared with single-modality signals, achieving recognition accuracies of 98.52% and 99.89% on the DEAP and AMIGOS datasets, respectively.
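The reliability-based fusion step can be sketched as follows. This is a minimal illustration, not the paper's exact implementation: it assumes reliability is measured as the Euclidean distance from a modality's softmax output to the nearest one-hot class label (smaller distance meaning higher confidence), and that the final decision is taken from the most reliable modality. The example probability vectors are hypothetical.

```python
import numpy as np

def modality_reliability(probs):
    """Reliability proxy for one modality: Euclidean distance from its
    probability vector to the nearest one-hot class label.
    Smaller distance = more confident prediction.
    (Assumed formulation; the paper's exact measure may differ.)"""
    one_hot_labels = np.eye(len(probs))          # one row per class
    dists = np.linalg.norm(one_hot_labels - probs, axis=1)
    return dists.min()

def fuse_decisions(modality_probs):
    """Pick the class predicted by the most reliable modality."""
    dists = [modality_reliability(p) for p in modality_probs]
    most_reliable = int(np.argmin(dists))
    return int(np.argmax(modality_probs[most_reliable]))

# Hypothetical softmax outputs over 4 arousal-valence classes
eeg = np.array([0.90, 0.05, 0.03, 0.02])   # confident prediction
gsr = np.array([0.40, 0.30, 0.20, 0.10])   # ambiguous prediction
print(fuse_decisions([eeg, gsr]))  # EEG is closer to a one-hot label -> class 0
```

Here the EEG output lies close to the one-hot vector for class 0, so it is judged more reliable than the flatter peripheral-signal output and its prediction is used as the fused decision.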
Keywords
Physiology, Brain modeling, Feature extraction, Electroencephalography, Biological system modeling, Convolution, Reliability, Multimodal affective states recognition, convolutional neural network, decision fusion model, physiological signals