Integration of cross-modal emotional information in the human brain: An fMRI study

Interdisciplinary Program in Cognitive Science, Seoul National University, Seoul, Republic of Korea.
Cortex, 07/2008; 46(2): 161-169. DOI: 10.1016/j.cortex.2008.06.008


The interaction of information derived from the voice and facial expression of a speaker contributes to the interpretation of the speaker's emotional state and to the formation of inferences about information that may have been merely implied in the verbal communication. We therefore investigated the brain processes responsible for integrating emotional information originating from different sources. Although several studies have reported possible sites of integration, further investigation using an emotionally neutral condition is required to locate emotion-specific networks. Using functional magnetic resonance imaging (fMRI), we explored the brain regions involved in the integration of emotional information from different modalities, in comparison to those involved in integrating emotionally neutral information. There was significant activation in the superior temporal gyrus (STG); inferior frontal gyrus (IFG); and parahippocampal gyrus, including the amygdala, under the bimodal versus the unimodal condition, irrespective of emotional content. We confirmed the results of previous studies by finding that the bimodal emotional condition elicited strong activation in the left middle temporal gyrus (MTG), and we extended this finding by using a neutral condition in the experimental design to isolate the effects of emotional factors. We found anger-specific activation in the posterior cingulate, fusiform gyrus, and cerebellum, whereas happiness-specific activation was found in the MTG, parahippocampal gyrus, hippocampus, claustrum, inferior parietal lobule, cuneus, middle frontal gyrus (MFG), IFG, and anterior cingulate. These emotion-specific activations suggest that each emotion recruits a partly separate network to integrate bimodal information while sharing a common network for cross-modal integration.

    • "Multimodal information from video clips (Eryilmaz, Van De Ville, Schwartz, & Vuilleumier, 2011) or pairing visual scenes with music (Baumgartner, Lutz, Schmidt, & Jancke, 2006; Eldar, Ganor, Admon, Bleich, & Hendler, 2007) also produces vivid emotional experiences where the content in one modality can be boosted or modified by the other modality (Pehrs et al., 2013). Different emotions recruit partly separate networks integrating multimodal information, such as the amygdala or insula, in addition to areas implicated in nonemotional multisensory integration (Klasen et al., 2011; Park et al., 2010). Among other sensory modalities, taste and smell are powerful emotion-eliciting stimuli through their direct access to the amygdala and OFC (Rolls, 2014). "
    Brain Mapping: An Encyclopedic Reference, vol. 3 (Arthur W. Toga, Ed.), chapter: Emotion Perception and Elicitation, pp. 79-90. Academic Press/Elsevier, 01/2015.
    • "Using degrees of handedness to research the extent to which the cerebral hemispheres interact in the processing of emotion is a largely unexplored, but promising territory. Moreover , the experimental manipulation of bilateral eye movements has been demonstrated to increase inter-hemispheric processing (Christman et al., 2003; Parker and Dagnall, 2010) particularly in consistent-handers (Lyle et al., 2008; Shobe et al., 2009). The use of degree of handedness and bilateral eye movements are two methods that can be used to explore inter-hemispheric collaboration during various types and phases of emotional processing. "
    ABSTRACT: Presented is a model suggesting that the right hemisphere (RH) directly mediates the identification and comprehension of positive and negative emotional stimuli, whereas the left hemisphere (LH) contributes to higher-level processing of emotional information that has been shared via the corpus callosum. RH subcortical connections provide initial processing of emotional stimuli, and their innervation to cortical structures provides a secondary pathway by which the hemispheres process emotional information more fully. It is suggested that the LH contribution to emotion processing lies in emotional regulation, social well-being, and adaptation, and in transforming the RH emotional experience into propositional and verbal codes. Lastly, it is proposed that the LH has little ability at the level of emotion identification, having a default positive bias and no ability to identify a stimulus as negative. Instead, the LH must rely on the transfer of emotional information from the RH to engage higher-order emotional processing. As such, either hemisphere can identify positive emotions, but they must collaborate for complete processing of negative emotions. Evidence presented draws from behavioral, neurological, and clinical research, including discussions of subcortical and cortical pathways, callosal agenesis, commissurotomy, emotion regulation, mood disorders, interpersonal interaction, language, and handedness. Directions for future research are offered.
    Frontiers in Human Neuroscience, 04/2014; 8(1): 230. DOI: 10.3389/fnhum.2014.00230
    • "Loss of Ank3 in mice disrupts action potential firing (Zhou et al., 1998), therefore lower expression in the superior temporal gyrus may impair neuronal activity in this region. Although mainly responsible for processing sound and speech, the superior temporal gyrus has also been implicated in emotion processing, specifically emotion recognition in facial expressions and speech (Fruhholz et al., 2012; Park et al., 2010; Robins et al., 2009). A recent study found substantial differences in synchronous neural interactions in the right superior temporal gyrus between veterans with PTSD and resilient veterans, suggesting that modulation of neural networks by trauma, particularly in this region, may be a marker of resilience (James et al., 2013). "
    ABSTRACT: The ankyrin 3 gene (ANK3) produces the ankyrin G protein that plays an integral role in regulating neuronal activity. Previous studies have linked ANK3 to bipolar disorder and schizophrenia. A recent mouse study suggests that ANK3 may regulate behavioral disinhibition and stress reactivity. This led us to hypothesize that ANK3 might also be associated with stress-related psychopathology such as posttraumatic stress disorder (PTSD), as well as disorders of the externalizing spectrum such as antisocial personality disorder and substance-related disorders that are etiologically linked to impulsivity and temperamental disinhibition. We examined the possibility of association between ANK3 SNPs and both PTSD and externalizing (defined by a factor score representing a composite of adult antisociality and substance abuse) in a cohort of white non-Hispanic combat veterans and their intimate partners (n=554). Initially, we focused on rs9804190, a SNP previously reported to be associated with bipolar disorder, schizophrenia, and ankyrin G expression in brain. Then we examined 358 additional ANK3 SNPs utilizing a multiple-testing correction. rs9804190 was associated with both externalizing and PTSD (p=0.028 and p=0.042, respectively). Analysis of other ANK3 SNPs identified several that were more strongly associated with either trait. The most significant association with externalizing was observed at rs1049862 (p=0.00040, corrected p=0.60). The most significant association with PTSD (p=0.00060, corrected p=0.045) was found with three SNPs in complete linkage disequilibrium (LD): rs28932171, rs11599164, and rs17208576. These findings support a role of ANK3 in risk of stress-related and externalizing disorders, beyond its previous associations with bipolar disorder and schizophrenia.
    Psychoneuroendocrinology, 06/2013; 38(10). DOI: 10.1016/j.psyneuen.2013.04.013