Emo-FilM: A multimodal dataset for affective neuroscience using naturalistic stimuli

Elenor Morgenroth, Stefano Moia, Laura Vilaclara, Raphael Fournier, Michal Muszynski, Maria Ploumitsakou, Marina Almato-Bellavista, Patrik Vuilleumier, Dimitri Van de Ville

bioRxiv (2024)

Abstract
The Emo-FilM dataset stands for Emotion research using Films and fMRI in healthy participants. It includes detailed emotion annotations by 44 raters for 14 short films with a combined duration of over two and a half hours, as well as recordings of respiration, heart rate, and functional magnetic resonance imaging (fMRI) from a separate sample of 30 individuals watching the same films. The detailed annotations of experienced emotion covered 50 items, including ratings of discrete emotions and emotion components from the domains of appraisal, motivation, motor expression, physiological response, and feeling. Quality assessment of the behavioural data shows a mean inter-rater agreement of 0.38. The parallel fMRI data were acquired at 3 Tesla in four sessions, accompanied by a high-resolution structural (T1) scan and a resting-state fMRI scan for each participant. Physiological recordings during fMRI included heart rate, respiration, and electrodermal activity (EDA). Quality assessment indicators confirm acceptable quality of the MRI data. This dataset is designed for, but not limited to, studying the dynamic neural processes involved in emotion experience. Particular strengths of the dataset are the high temporal resolution of the behavioural annotations and the inclusion of a validation study in the fMRI sample. The combination of high-quality behavioural data with continuous physiological and MRI measurements makes this dataset a treasure trove for researching human emotion in response to naturalistic stimulation in a multimodal framework.

### Competing Interest Statement

The authors have declared no competing interest.
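The abstract reports a mean inter-rater agreement of 0.38 for the emotion annotations but does not state here which agreement metric is used. As a minimal sketch only, assuming agreement is quantified as the mean pairwise Pearson correlation between raters' continuous annotation time courses, such a score could be computed as follows; the function name and the toy data are illustrative and not taken from the dataset.

```python
import numpy as np
from itertools import combinations

def mean_pairwise_agreement(ratings: np.ndarray) -> float:
    """Mean Pearson correlation over all rater pairs.

    ratings: array of shape (n_raters, n_timepoints), one continuous
    annotation time course per rater for a single emotion item.
    (Assumed metric; the paper may use a different agreement measure.)
    """
    pairs = combinations(range(ratings.shape[0]), 2)
    corrs = [np.corrcoef(ratings[i], ratings[j])[0, 1] for i, j in pairs]
    return float(np.mean(corrs))

# Toy example: 5 hypothetical raters annotating 300 time points of a
# shared underlying signal plus independent noise.
rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, 10, 300))
toy_ratings = signal + rng.normal(scale=1.0, size=(5, 300))
print(round(mean_pairwise_agreement(toy_ratings), 2))
```

In practice such a score would be averaged across films and annotation items to yield a single summary value comparable to the 0.38 reported above.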