Decoding the temporal dynamics of affective scene processing

NeuroImage (2022)

Abstract
Natural images containing affective scenes are used extensively to investigate the neural mechanisms of visual emotion processing. Functional MRI (fMRI) studies have shown that these images activate a large-scale distributed brain network that encompasses areas in visual, temporal, and frontal cortices. The underlying spatial and temporal dynamics among these network structures, however, remain to be characterized. We recorded simultaneous EEG-fMRI data while participants passively viewed affective images from the International Affective Picture System (IAPS). Applying multivariate pattern analysis to decode EEG data, and representational similarity analysis to fuse EEG data with simultaneously recorded fMRI data, we found that: (1) ~100 ms after picture onset, perceptual processing of complex visual scenes began in early visual cortex, proceeding to ventral visual cortex at ~160 ms; (2) between ~200 and ~300 ms (pleasant pictures: ~200 ms; unpleasant pictures: ~260 ms), affect-specific neural representations began to form, supported mainly by areas in occipital and temporal cortices; and (3) affect-specific neural representations, lasting up to ~2 s, were stable and exhibited temporally generalizable activity patterns. These results suggest that affective scene representations in the brain are formed in a valence-dependent manner and are sustained by recurrent neural interactions among distributed brain areas.

Competing Interest Statement: The authors have declared no competing interest.
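The time-resolved decoding the abstract describes (training a classifier on EEG channel patterns at each timepoint to ask when condition information emerges) can be sketched roughly as follows. This is a minimal illustration on synthetic data using scikit-learn, not the authors' pipeline; the array shapes, the logistic-regression classifier, and the simulated "pleasant vs. unpleasant" effect are all assumptions for demonstration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def decode_timecourse(epochs, labels, cv=5):
    """Cross-validated decoding accuracy at each timepoint.

    epochs : array, shape (n_trials, n_channels, n_times)
    labels : array, shape (n_trials,) -- condition label per trial
    Returns an array of mean accuracies, one per timepoint.
    """
    n_trials, n_channels, n_times = epochs.shape
    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    accs = np.empty(n_times)
    for t in range(n_times):
        X = epochs[:, :, t]  # spatial (channel) pattern at time t
        accs[t] = cross_val_score(clf, X, labels, cv=cv).mean()
    return accs

# Synthetic demo: a condition difference appears only in the
# second half of the epoch, mimicking a post-onset latency.
rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 80, 16, 20
labels = np.repeat([0, 1], n_trials // 2)
epochs = rng.standard_normal((n_trials, n_channels, n_times))
epochs[labels == 1, :, n_times // 2:] += 1.0  # hypothetical effect

acc = decode_timecourse(epochs, labels)
```

In a real analysis, the latency at which `acc` rises above chance estimates when the corresponding representation forms; extending the same idea by training at one timepoint and testing at others yields the temporal-generalization matrices the abstract refers to.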