Article

Multisensory emotions: Perception, combination and underlying neural processes

Department of Psychiatry, RWTH Aachen University, Aachen, Germany.
Reviews in the Neurosciences (Impact Factor: 3.33). 08/2012; 23(4):381-92. DOI: 10.1515/revneuro-2012-0040
Source: PubMed

ABSTRACT

In our everyday lives, we perceive emotional information via multiple sensory channels. This is particularly evident for emotional faces and voices in a social context. In recent years, a multitude of studies have addressed the question of how affective cues conveyed by auditory and visual channels are integrated. Behavioral studies show that hearing and seeing emotional expressions can support and influence each other, a notion corroborated by investigations of the underlying neurobiology. Numerous electrophysiological and neuroimaging studies have identified brain regions subserving the integration of multimodal emotions and have provided new insights into the neural processing steps underlying the synergistic confluence of affective information from voice and face. In this paper, we provide a comprehensive review covering current behavioral, electrophysiological and functional neuroimaging findings on the combination of emotions from the auditory and visual domains. Behavioral advantages arising from multimodal redundancy are paralleled by specific integration patterns on the neural level, from encoding in early sensory cortices to late cognitive evaluation in higher association areas. In summary, these findings indicate that bimodal emotions interact at multiple stages of the audiovisual integration process.

    • "The current investigation provides a baseline from normal adults for developmental studies as well as clinical research including individuals with autism, language impairment, or aphasia. Many similar studies have utilized behavioral measures and eventrelated potentials or other functional imaging methods to investigate brain mechanisms for emotional prosody processing (SeeBelin et al. (2004), Klasen et al. (2012, Schirmer and Kotz (2006) for reviews). Our study has two novel features. "
    ABSTRACT: The current study employed behavioral and electrophysiological measures to investigate the timing, localization, and neural oscillation characteristics of cortical activities associated with phonetic and emotional information processing of speech. The experimental design used a cross-modal priming paradigm in which the normal adult participants were presented a visual prime followed by an auditory target. Primes were facial expressions that systematically varied in emotional content (happy or angry) and mouth shape (corresponding to /a/ or /i/ vowels). Targets were spoken words that varied by emotional prosody (happy or angry) and vowel (/a/ or /i/). In both the phonetic and prosodic conditions, participants were asked to judge the congruency status of the visual prime and the auditory target. Behavioral results showed a congruency effect for both percent correct and reaction time. Two ERP responses, the N400 and late positive response (LPR), were identified in both conditions. Source localization and inter-trial phase coherence of the N400 and LPR components further revealed different cortical contributions and neural oscillation patterns for selective processing of phonetic and emotional information in speech. The results provide corroborating evidence for the necessity of differentiating brain mechanisms underlying the representation and processing of co-existing linguistic and paralinguistic information in spoken language, which has important implications for theoretical models of speech recognition as well as clinical studies on the neural bases of language and social communication deficits.
    Article · Jan 2016 · Neuropsychologia
    • "Since then, numerous studies have confirmed the finding that implicit attention to emotion competes with explicit attentional demands not only in the amygdala but also in other brain regions, consequently decreasing preferential emotion processing under conditions of heightened task-load and/or distraction (Blair et al., 2007; Hsu and Pessoa, 2007; Mitchell et al., 2007; Van Dillen et al., 2009; McRae et al., 2010; Yates et al., 2010; Kanske and Kotz, 2011). In addition to studying the interaction of implicit emotion and explicit attention processes, multisensory studies enable examining the interaction of multiple implicit processes by concurrently presenting emotional stimuli in different sensory modalities (for recent reviews see Klasen et al., 2012; Gerdes et al., 2014 ). In according studies, participants view e.g., emotional facial expressions while listening at the same time to human voices with emotionally modulated prosody. "
    ABSTRACT: The present study utilized functional magnetic resonance imaging (fMRI) to examine the neural processing of concurrently presented emotional stimuli under varying explicit and implicit attention demands. Specifically, in separate trials, participants indicated the category of either pictures or words. The words were placed over the center of the pictures, and the picture-word compound stimuli were presented for 1500 ms in a rapid event-related design. The results reveal pronounced main effects of task and emotion: the picture categorization task prompted strong activations in visual, parietal, temporal, frontal, and subcortical regions; the word categorization task evoked increased activation only in left extrastriate cortex. Furthermore, beyond replicating key findings regarding emotional picture and word processing, the results point to a dissociation of semantic-affective and sensory-perceptual processes for words: while emotional words engaged semantic-affective networks of the left hemisphere regardless of task, the increased activity in left extrastriate cortex associated with explicitly attending to words was diminished when the word was overlaid over an erotic image. Finally, we observed a significant interaction between Picture Category and Task within dorsal visual-associative regions, inferior parietal cortex, and dorsolateral and medial prefrontal cortices: during the word categorization task, activation was increased in these regions when the words were overlaid over erotic as compared to romantic pictures. During the picture categorization task, activity in these areas was relatively decreased when categorizing erotic as compared to romantic pictures. Thus, the emotional intensity of the pictures strongly affected brain regions devoted to the control of task-related word or picture processing. These findings are discussed with respect to the interplay of obligatory stimulus processing with task-related attentional control mechanisms.
    Article · Jan 2016 · Frontiers in Psychology
    • "Ample evidence has demonstrated that the congruency of multimodal stimuli may facilitate the perception and identification of emotional (e.g. Paulmann and Pell, 2011; Paulmann et al., 2009; see Klasen et al., 2012 for a review) and non-emotional speech signals (e.g. Schwartz et al., 2004; van Wassenhove et al., 2005) and is mandatorily processed (e.g. de Gelder and Vroomen, 2000) already during early perceptual processing stages (e.g. de Gelder et al., 1999; Gerdes et al., 2013; Pourtois et al., 2000, 2002; Stekelenburg and Vroomen, 2007) probably involving specialized structures (e.g. de Gelder and Van den Stock, 2011), while incongruent audiovisual input can even lead to perceptual illusions (cf. "
    ABSTRACT: Emotional verbal messages are typically encountered in meaningful contexts, for instance, during face-to-face communication in social situations. Yet, they are often investigated by confronting single participants with isolated words on a computer screen, thus potentially lacking ecological validity. In the present study we recorded event-related brain potentials (ERPs) during emotional word processing in communicative situations provided by videos of a speaker, assuming that emotion effects should be augmented by the presence of a speaker addressing the listener. Indeed, compared to non-communicative situations or isolated word processing, emotion effects were more pronounced, started earlier and lasted longer in communicative situations. Furthermore, while the brain responded most strongly to negative words when presented in isolation, a positivity bias with more pronounced emotion effects for positive words was observed in communicative situations. These findings demonstrate that communicative situations, in which verbal emotions are typically encountered, strongly enhance emotion effects, underlining the importance of social and meaningful contexts in processing emotional and verbal messages.
    Article · Jan 2015 · NeuroImage