Incongruence effects in crossmodal emotional integration

Department of Psychiatry and Psychotherapy, RWTH Aachen University, Aachen, Germany.
NeuroImage (Impact Factor: 6.36). 10/2010; 54(3):2257-66. DOI: 10.1016/j.neuroimage.2010.10.047
Source: PubMed


Emotions are often encountered in a multimodal fashion. Consequently, contextual framing by other modalities can alter the way an emotional facial expression is perceived and lead to emotional conflict. Whole-brain fMRI data were collected while 35 healthy subjects judged emotional expressions in faces and were concurrently exposed to emotional (scream, laughter) or neutral (yawning) sounds. Behaviorally, subjects rated fearful and neutral faces as more fearful when accompanied by screams than by yawns (and, for fearful faces, also than by laughs). The imaging data revealed that incongruence of emotional valence between faces and sounds led to increased activation in the middle cingulate cortex, the right superior frontal cortex, the right supplementary motor area, and the right temporoparietal junction. Contrary to expectations, no incongruence effects were found in the amygdala. Further analyses revealed that, independent of valence congruency, the left amygdala was consistently activated when the information from both modalities was emotional; when the stimulus was neutral in one modality and emotional in the other, left amygdala activation was significantly attenuated. These results indicate that incongruence of emotional valence in audiovisual integration activates a cingulate-fronto-parietal network involved in conflict monitoring and resolution. Furthermore, in audiovisual pairings, amygdala responses seem to signal the absence of any neutral feature rather than merely the presence of an emotionally charged one.

    • "Vice versa, the display of facial expressions has been used as a method to induce emotions in subjects in the fields of neurosciences, functional neuroimaging, and psychology (Gur et al. 2002; Schneider et al. 2006). Consequently, the amygdala has been interpreted as a crucial link between face processing mediated by the fusiform region and affective processing (Breiter et al. 1996; Davis and Whalen 2001; Vuilleumier et al. 2001; Ishai 2008; Herrington et al. 2011; Müller et al. 2011). Aberrant amygdala activations during various tasks involving social cognition such as inferring mental states from pictures of eyes (Baron-Cohen et al. 1999, 2000) and judging facial expressions (Critchley et al. 2000) have led to the hypothesis that the amygdala may fail to assign emotional relevance to social stimuli in ASD patients (Dichter 2012). "
    ABSTRACT: One of the most consistent neuropsychological findings in autism spectrum disorders (ASD) is a reduced interest in and impaired processing of human faces. We conducted an activation likelihood estimation meta-analysis on 14 functional imaging studies on neural correlates of face processing enrolling a total of 164 ASD patients. Subsequently, normative whole-brain functional connectivity maps for the identified regions of significant convergence were computed for the task-independent (resting-state) and task-dependent (co-activations) state in healthy subjects. Quantitative functional decoding was performed by reference to the BrainMap database. Finally, we examined the overlap of the delineated network with the results of a previous meta-analysis on structural abnormalities in ASD as well as with brain regions involved in human action observation/imitation. We found a single cluster in the left fusiform gyrus showing significantly reduced activation during face processing in ASD across all studies. Both task-dependent and task-independent analyses indicated significant functional connectivity of this region with the temporo-occipital and lateral occipital cortex, the inferior frontal and parietal cortices, the thalamus and the amygdala. Quantitative reverse inference then indicated an association of these regions mainly with face processing, affective processing, and language-related tasks. Moreover, we found that the cortex in the region of right area V5 displaying structural changes in ASD patients showed consistent connectivity with the region showing aberrant responses in the context of face processing. Finally, this network was also implicated in the human action observation/imitation network. In summary, our findings thus suggest a functionally and structurally disturbed network of occipital regions related primarily to face (but potentially also language) processing, which interacts with inferior frontal as well as limbic regions and may be the core of aberrant face processing and reduced interest in faces in ASD.
    Full-text · Article · May 2014 · Brain Structure and Function
    • "Most studies using Stroop-like tasks to investigate emotional conflict processing have used other emotions such as fear/angry/happy (audio-visual: e.g. [3]; visual: e.g. [4], [42]; auditory: e.g. "
    ABSTRACT: Often we cannot resist emotional distraction, because emotions capture our attention. For example, in TV-commercials, tempting emotional voices add an emotional expression to a formerly neutral product. Here, we used a Stroop-like conflict paradigm as a tool to investigate whether emotional capture results in contextual integration of loose mental associations. Specifically, we tested whether the associatively connected meaning of an ignored auditory emotion with a non-emotional neutral visual target would yield a modulation of activation sensitive to emotional conflict in the brain. In an fMRI-study, nineteen participants detected the presence or absence of a little worm hidden in the picture of an apple, while ignoring a voice with an emotional sound of taste (delicious/disgusting). Our results indicate a modulation due to emotional conflict, pronounced most strongly when processing conflict in the context of disgust (conflict: disgust/no-worm vs. no conflict: disgust/worm). For conflict in the context of disgust, insula activity was increased, with activity correlating positively with reaction time in the conflict case. Conflict in the context of deliciousness resulted in increased amygdala activation, possibly due to the resulting "negative" emotion in incongruent versus congruent combinations. These results indicate that our associative stimulus-combinations showed a conflict-dependent modulation of activity in emotional brain areas. This shows that the emotional sounds were successfully contextually integrated with the loosely associated neutral pictures.
    Full-text · Article · Mar 2014 · PLoS ONE
    • "Multisensory integration mechanisms are of special importance during the perception and processing of emotions. Research from our own and other groups provides evidence for a neural network involving the amygdala, insula, frontal areas, FG, and STS which are responsible for integration of cross- or multisensory information related to emotional perception during stimulation with dynamic stimulus material of different modalities (Ethofer et al., 2006; Kreifelts et al., 2009; Seubert et al., 2010a,b; Klasen et al., 2011, 2012; Müller et al., 2011, 2012; Regenbogen et al., 2012a,b). Although we gained novel insights into multisensory integration processes with regard to object and emotion perception during the last years, a systematic investigation of multisensory integration in relation to differences between age groups and age-related pathologies using functional imaging means is still missing. "
    ABSTRACT: The rapid demographic shift occurring in our society implies that understanding healthy aging and age-related diseases is one of our major future challenges. Sensory impairments have an enormous impact on our lives and are closely linked to cognitive functioning. Due to the inherent complexity of sensory perceptions, we are commonly presented with complex multisensory stimulation, and the brain integrates the information from the individual sensory channels into a unique and holistic percept. The cerebral processes involved are essential for our perception of sensory stimuli and become especially important during the perception of emotional content. Despite ongoing deterioration of the individual sensory systems during aging, there is evidence for an increase in, or maintenance of, multisensory integration processing in aging individuals. Within this comprehensive literature review on multisensory integration, we aim to highlight basic mechanisms and potential compensatory strategies the human brain utilizes to help maintain multisensory integration capabilities during healthy aging, in order to facilitate a broader understanding of age-related pathological conditions. Further, our goal was to identify where further research is needed.
    Full-text · Article · Dec 2013 · Frontiers in Human Neuroscience