Article

Integration of cross-modal emotional information in the human brain: An fMRI study

Interdisciplinary Program in Cognitive Science, Seoul National University, Seoul, Republic of Korea.
Cortex (Impact Factor: 6.04). 07/2008; 46(2):161-9. DOI: 10.1016/j.cortex.2008.06.008
Source: PubMed

ABSTRACT: The interaction of information derived from the voice and facial expression of a speaker contributes to the interpretation of the speaker's emotional state and to the formation of inferences about information that may have been merely implied in the verbal communication. We therefore investigated the brain processes responsible for integrating emotional information originating from different sources. Although several studies have reported possible sites of integration, further investigation using a neutral emotional condition is required to locate emotion-specific networks. Using functional magnetic resonance imaging (fMRI), we explored the brain regions involved in the integration of emotional information from different modalities, in comparison with those involved in integrating emotionally neutral information. There was significant activation in the superior temporal gyrus (STG), inferior frontal gyrus (IFG), and parahippocampal gyrus (including the amygdala) under the bimodal versus the unimodal condition, irrespective of emotional content. We confirmed the results of previous studies by finding that the bimodal emotional condition elicited strong activation in the left middle temporal gyrus (MTG), and we extended this finding by using a neutral condition in the experimental design to isolate the effects of emotional factors. We found anger-specific activation in the posterior cingulate, fusiform gyrus, and cerebellum, whereas happiness-specific activation appeared in the MTG, parahippocampal gyrus, hippocampus, claustrum, inferior parietal lobule, cuneus, middle frontal gyrus (MFG), IFG, and anterior cingulate. These emotion-specific activations suggest that each emotion engages a distinct network for integrating bimodal information while sharing a common network for cross-modal integration.

Full-text available from: Ji-Young Park, Apr 17, 2015
  • ABSTRACT: Emotion perception naturally entails multisensory integration. It is also assumed that multisensory emotion perception is characterized by enhanced activation of brain areas implicated in multisensory integration, such as the superior temporal gyrus and sulcus (STG/STS). However, most previous studies have employed designs and stimuli that preclude other forms of multisensory interaction, such as crossmodal prediction, leaving open the question of whether classical integration is the only relevant process in multisensory emotion perception. Here, we used video clips containing emotional and neutral body and vocal expressions to investigate the role of crossmodal prediction in multisensory emotion perception. While emotional multisensory expressions increased activation in the bilateral fusiform gyrus (FFG), neutral expressions compared with emotional ones enhanced activation in the bilateral middle temporal gyrus (MTG) and posterior STS. Hence, while neutral stimuli activate classical multisensory areas, emotional stimuli invoke areas linked to unisensory visual processing. Emotional stimuli may therefore trigger a prediction of upcoming auditory information based on prior visual information. Such a prediction may be stronger for highly salient emotional information than for less salient neutral information. We thus suggest that multisensory emotion perception involves at least two distinct mechanisms: classical multisensory integration, as shown for neutral expressions, and crossmodal prediction, as evident for emotional expressions.
    Neuropsychologia 11/2014; 66. DOI: 10.1016/j.neuropsychologia.2014.10.038 · 3.45 Impact Factor
  • ABSTRACT: Using fMRI and self-reports, we explore the relationship between ad-elicited emotional arousal and memory for the ad, as well as the mechanisms involved in this relationship. A broad conceptual framework proposes three routes for emotional memory: attention, elaboration, and social cognition. Our exploratory study examines the association between ad-elicited emotional arousal and predetermined ad memorability, as a proxy for memory for the ad. Results reveal greater amygdala activation in memorable (versus unmemorable) ads, reinforcing the association between ad-elicited emotional arousal and memory for the ad. Amygdala activation was accompanied by activation in the superior temporal sulcus (STS), a brain region involved in social cognition. These results are indicative of a sociocognitive emotional memory process that has been neglected in past research. Future research directions are discussed.
    Journal of Advertising 10/2013; 42(2):275-291. DOI: 10.1080/00913367.2013.768065 · 0.99 Impact Factor
  • ABSTRACT: Emotional intelligence (EI) is a multifaceted construct consisting of our ability to perceive, monitor, regulate, and use emotions. Despite much attention being paid to the neural substrates of EI, little is known about the spontaneous brain activity associated with EI during the resting state. We used resting-state fMRI to investigate the association between the amplitude of low-frequency fluctuations (ALFF) and EI in a large sample of young, healthy adults. We found that EI was significantly associated with ALFF in key nodes of two networks: the social-emotional processing network (the fusiform gyrus, right superior orbital frontal gyrus, left inferior frontal gyrus, and left inferior parietal lobule) and the cognitive control network (the bilateral pre-SMA, cerebellum, and right precuneus). These findings suggest that the neural correlates of EI involve several brain regions in two crucial networks, which reflect the core components of EI: emotion perception and emotional control.
    PLoS ONE 10/2014; 9(10):e111435. DOI: 10.1371/journal.pone.0111435 · 3.53 Impact Factor