Vuilleumier P, Pourtois G. Distributed and interactive brain mechanisms during emotion face perception: evidence from functional neuroimaging. Neuropsychologia. 2007;45(1):174–194. DOI: 10.1016/j.neuropsychologia.2006.06.003

Laboratory for Behavioral Neurology & Imaging of Cognition, Clinic of Neurology, University Hospital of Geneva, Geneva, Switzerland.

Brain imaging studies in humans have shown that face processing in several areas is modulated by the affective significance of faces, particularly by fearful expressions, but also by other social signals such as gaze direction. Here we review haemodynamic and electrical neuroimaging results indicating that activity in the face-selective fusiform cortex may be enhanced by emotional (fearful) expressions, without explicit voluntary control, and presumably through direct feedback connections from the amygdala. fMRI studies show that these increased responses in fusiform cortex to fearful faces are abolished by amygdala damage in the ipsilateral hemisphere, despite preserved effects of voluntary attention on fusiform cortex; whereas emotional increases can still arise despite deficits in attention or awareness following parietal damage, and appear relatively unaffected by pharmacological increases in cholinergic stimulation. Fear-related modulations of face processing driven by amygdala signals may implicate not only fusiform cortex, but also earlier visual areas in occipital cortex (e.g., V1) and other distant regions involved in social, cognitive, or somatic responses (e.g., superior temporal sulcus, cingulate, or parietal areas). In the temporal domain, evoked potentials show a widespread time-course of emotional face perception, with some increases in the amplitude of responses recorded over both occipital and frontal regions for fearful relative to neutral faces (as well as in the amygdala and orbitofrontal cortex, when using intracranial recordings), but with different latencies post-stimulus onset. Early emotional responses may arise around 120 ms, prior to a full visual categorization stage indexed by the face-selective N170 component, possibly reflecting rapid emotion processing based on crude visual cues in faces. Other electrical components arise at later latencies and involve more sustained activities, probably generated in associative or supramodal brain areas, and resulting in part from the modulatory signals received from the amygdala. Altogether, these fMRI and ERP results demonstrate that emotion face perception is a complex process that cannot be related to a single neural event taking place in a single brain region, but rather implicates an interactive network with activity distributed in time and space. Moreover, although traditional models in cognitive neuropsychology have often considered that facial expression and facial identity are processed along two separate pathways, evidence from fMRI and ERPs suggests instead that emotional processing can strongly affect brain systems responsible for face recognition and memory. The functional implications of these interactions remain to be fully explored, but they might play an important role in the normal development of face processing skills and in some neuropsychiatric disorders.

    • "Their results demonstrate that both face and scene inversion cause a shift from specialised processing streams towards generic object-processing mechanisms , but this shift only leads to a reliable behavioural deficit in the case of face inversion. The temporal characteristics of neural affective processing have been relatively well documented for emotional faces (Vuilleumier and Pourtois, 2007; Wieser and Brosch, 2012) and emotional scenes (Olofsson et al., 2008; Schupp et al., 2006a). Anatomically, visual information passes through the extrastriate visual cortex where low-lying physical stimulus properties such as luminance and spatial complexity determine which aspects of visual information receive rapid attentional capture and further processing (Clark and Hillyard, 1996; Givre et al., 1994; Hillyard and Anllo-Vento, 1998; Mangun et al., 1993; Rugg et al., 1987). "
    ABSTRACT: In the current study, electroencephalography (EEG) was recorded simultaneously with facial electromyography (fEMG) to determine whether emotional faces and emotional scenes are processed differently at the neural level. In addition, it was investigated whether these differences can be observed at the behavioural level via spontaneous facial muscle activity. Emotional content of the stimuli did not affect early P1 activity. Emotional faces elicited enhanced amplitudes of the face-sensitive N170 component, while its counterpart, the scene-related N100, was not sensitive to the emotional content of scenes. At 220–280 ms, the early posterior negativity (EPN) was enhanced only slightly for fearful as compared to neutral or happy faces. However, its amplitudes were significantly enhanced during processing of scenes with positive content, particularly over the right hemisphere. Scenes of positive content also elicited enhanced spontaneous zygomatic activity from 500–750 ms onwards, while happy faces elicited no such changes. In contrast, both fearful faces and negative scenes elicited enhanced spontaneous corrugator activity at 500–750 ms after stimulus onset. However, relative to baseline, EMG changes occurred earlier for faces (250 ms) than for scenes (500 ms), whereas for scenes activity changes were more pronounced over the whole viewing period. Taking all effects into account, the data suggest that emotional facial expressions evoke faster attentional orienting, but weaker affective neural activity and emotional behavioural responses, compared to emotional scenes.
    • "This effect can also be observed in the absence of conscious awareness. Evidence regarding emotion differences at the level of the N170, a component involved in face processing (see Rossion, 2014), are mixed: while some adult studies show emotion effects at the N170, others do not (Vuilleumier & Pourtois, 2007). For subliminal processing findings are also mixed with some work reporting an emotion effect on the N170 (Pegna, Landis, & Khateb, 2008), while others observed no difference for the N170 (Kiss & Eimer, 2008). "
    ABSTRACT: From early in life, emotion detection plays an important role during social interactions. Recently, 7-month-old infants have been shown to process facial signs of fear in others without conscious perception and solely on the basis of their eyes. However, it is not known whether unconscious fear processing from eyes is present before 7 months of age or only emerges at around 7 months. To investigate this question, we measured 5-month-old infants' event-related potentials (ERPs) in response to subliminally presented fearful and non-fearful eyes and compared these with 7-month-old infants' ERP responses from a previous study. Our ERP results revealed that only 7-month-olds, but not 5-month-olds, distinguished between fearful and non-fearful eyes. Specifically, 7-month-olds' processing of fearful eyes was reflected in early visual processes over occipital cortex and later attentional processes over frontal cortex. This suggests that, in line with prior work on the conscious detection of fearful faces, the brain processes associated with the unconscious processing of fearful eyes develop between 5 and 7 months of age. More generally, these findings support the notion that emotion perception and the underlying brain processes undergo critical change during the first year of life. Therefore, the current data provide further evidence for viewing infancy as a formative period in human socioemotional functioning.
    Journal of Experimental Child Psychology, 10/2015. DOI: 10.1016/j.jecp.2015.09.009
    • "fear > disgust > neutral), helping exclude this possible caveat. It is widely known that threat stimuli evoke enhanced early and late ERPs and hemodynamic responses in multiple brain re- 105 gions, compared with neutral stimuli (Schupp et al., 2000; Vuilleumier and Pourtois, 2007; Sabatinelli et al., 2011). Specifically, neuroimaging studies have evinced response enhancement to both fear and disgust in the amygdala and OFC (Phillips et al., 1998; Rolls, 2004). "
    ABSTRACT: Differential processing of threat can consummate as early as 100 ms post-stimulus. Moreover, early perception not only differentiates threat from non-threat stimuli but also distinguishes among discrete threat subtypes (e.g., fear, disgust, and anger). Combining spatial-frequency-filtered images of fear, disgust and neutral scenes with high-density event-related potentials and intracranial source estimation, we investigated the neural underpinnings of general and specific threat processing in early stages of perception. Conveyed in low spatial frequencies, fear and disgust images evoked convergent visual responses with similarly enhanced N1 potentials and dorsal visual (middle temporal gyrus) cortical activity (relative to neutral cues; peaking at 156 ms). Nevertheless, conveyed in high spatial frequencies, fear and disgust elicited divergent visual responses, with fear enhancing and disgust suppressing P1 potentials and ventral visual (occipital fusiform) cortical activity (peaking at 121 ms). Therefore, general and specific threat processing operates in parallel in early perception, with the ventral visual pathway engaged in specific processing of discrete threats and the dorsal visual pathway in general threat processing. Furthermore, selectively tuned to distinctive spatial-frequency channels and visual pathways, these parallel processes underpin dimensional and categorical threat characterization, promoting efficient threat response. These findings thus lend support to hybrid models of emotion.
    Social Cognitive and Affective Neuroscience, 09/2015. DOI: 10.1093/scan/nsv123