Article

Vuilleumier P, Pourtois G. Distributed and interactive brain mechanisms during emotion face perception: evidence from functional neuroimaging. Neuropsychologia. 2007;45(1):174-194. DOI: 10.1016/j.neuropsychologia.2006.06.003

Laboratory for Behavioral Neurology & Imaging of Cognition, Clinic of Neurology, University Hospital of Geneva, Geneva, Switzerland.
Source: PubMed

ABSTRACT

Brain imaging studies in humans have shown that face processing in several areas is modulated by the affective significance of faces, particularly by fearful expressions, but also by other social signals such as gaze direction. Here we review haemodynamic and electrical neuroimaging results indicating that activity in the face-selective fusiform cortex may be enhanced by emotional (fearful) expressions, without explicit voluntary control, and presumably through direct feedback connections from the amygdala. fMRI studies show that these increased responses of the fusiform cortex to fearful faces are abolished by amygdala damage in the ipsilateral hemisphere, despite preserved effects of voluntary attention on the fusiform cortex; whereas emotional increases can still arise despite deficits in attention or awareness following parietal damage, and appear relatively unaffected by pharmacological enhancement of cholinergic stimulation. Fear-related modulations of face processing driven by amygdala signals may implicate not only the fusiform cortex, but also earlier visual areas in occipital cortex (e.g., V1) and other distant regions involved in social, cognitive, or somatic responses (e.g., superior temporal sulcus, cingulate, or parietal areas). In the temporal domain, evoked potentials show a widespread time-course of emotional face perception, with some increases in the amplitude of responses recorded over both occipital and frontal regions for fearful relative to neutral faces (as well as in the amygdala and orbitofrontal cortex when using intracranial recordings), but with different latencies post-stimulus onset. Early emotional responses may arise around 120 ms, prior to a full visual categorization stage indexed by the face-selective N170 component, possibly reflecting rapid emotion processing based on crude visual cues in faces. Other electrical components arise at later latencies and involve more sustained activity, probably generated in associative or supramodal brain areas and resulting in part from the modulatory signals received from the amygdala. Altogether, these fMRI and ERP results demonstrate that emotional face perception is a complex process that cannot be related to a single neural event taking place in a single brain region, but rather implicates an interactive network with distributed activity in time and space. Moreover, although traditional models in cognitive neuropsychology have often held that facial expression and facial identity are processed along two separate pathways, evidence from fMRI and ERPs suggests instead that emotional processing can strongly affect brain systems responsible for face recognition and memory. The functional implications of these interactions remain to be fully explored, but they might play an important role in the normal development of face processing skills and in some neuropsychiatric disorders.

CITED BY
    • "Brain areas involved in emotional face processing include the amygdala, hippocampus , and surrounding cortex [46, 47]. Despite ongoing debate in the literature, several lesion studies demonstrated the role of the amygdala in processing fear/threat related facial expression4849505152 and in functional imaging studies, fearful facial expression was found to activate the amygdala more reliably than other emotions [53, 54]. In summary, there is evidence for an interaction between WM and emotional processing whereby emotional stimuli can compete for attentional resources and interfere with the task at hand. "
    ABSTRACT: In mild cognitive impairment (MCI), a risk state for Alzheimer's disease, patients have objective cognitive deficits with relatively preserved functioning. fMRI studies have identified anomalies during working memory (WM) processing in individuals with MCI. The effect of a task-irrelevant emotional face distractor on WM processing in MCI remains unclear. We aim to explore the impact of a task-irrelevant fearful-face distractor on WM processing in MCI using fMRI. Hypothesis: Compared to healthy controls (HC), MCI patients will show significantly higher BOLD signal in a priori identified regions of interest (ROIs) during a WM task with a task-irrelevant emotional face distractor. Methods: Nine right-handed female participants with MCI and 12 matched HC performed a WM task with standardized task-irrelevant fearful versus neutral face distractors, randomized and counterbalanced across WM trials. MRI images were acquired during the WM task, and the BOLD signal was analyzed using statistical parametric mapping (SPM) to identify signal patterns during the task response phase. Results: The task-irrelevant fearful-face distractor resulted in higher activation in the amygdala, anterior cingulate, and frontal areas in MCI participants compared to HC. Conclusions: This exploratory study suggests altered WM processing as a result of a fearful-face distractor in MCI.
    Article · Feb 2016 · Behavioural Neurology
    • "Emotion-specific responses may arise outside the face processing network, such as the insula for disgust (Phillips et al., 1998) or the striatum and presupplementary motor area (pre-SMA) for smiling (Krolak-Salmon et al., 2006; Vrticka, Andersson, Grandjean, Sander, & Vuilleumier, 2008). Expression-specific patterns were found in the fusiform cortex using multivariate pattern analyses (MVPA) (see Glossary for definition) in some studies (Harry, Williams, Davis, & Kim, 2013) but not others (Peelen, Atkinson, Andersson, & Vuilleumier, 2007; Peelen et al., 2010). Emotion-specific patterns are also observed with MVPA in the posterior STS and mPFC (Said, Haxby, & Todorov, 2011), reflecting supramodal representations equally activated by vocal and bodily expressions (Peelen et al., 2007, 2010). "
    ABSTRACT: Over the last 20 years, neuroimaging research in humans has progressively charted emotions on neural activity within distributed and interactive networks of brain regions. This work provides compelling evidence that emotional perception and elicitation are intimately intertwined with a broad range of mental functions, challenging a classic divide between cognitive and affective sectors of the mind. This article describes neural systems engaged by different types of emotional stimuli (e.g., faces, voices, and memories) and different types of emotions (e.g., fear, disgust, and regret). Specificities and commonalities across emotion categories are critically discussed in relation to current theoretical models.
    Chapter · Dec 2015
    • "It is speculated that perception of attractiveness and expression share similar processing, given that both attractiveness and expression are derived from facial characteristics (e.g., size, position and movement of eyes, nose and mouth) and that both are capable of eliciting affective experiences in the observers. Besides, previous functional magnetic resonance imaging (fMRI) studies have shown that a number of occipital, limbic, temporal, parietal, and prefrontal brain regions that responded to the manipulation of attractiveness (O'Doherty et al., 2003; Ishai, 2007; Winston et al., 2007; Chatterjee et al., 2009) responded to the manipulation of expression (Vuilleumier and Pourtois, 2007; Fusar-Poli et al., 2009). Moreover, O'Doherty et al. (2003) found that the increased activation elicited by faces of high attractiveness in the orbitofrontal cortex (OFC), a rewardrelated brain region, was enhanced by a smile, suggesting that some common neural processing (e.g., reward) is shared by perceptions of attractiveness and expression. "
    ABSTRACT: Facial attractiveness is closely related to romantic love. To understand whether the neural underpinnings of perceived facial attractiveness and facial expression are similar, we recorded neural signals using an event-related potential (ERP) methodology from 20 participants viewing faces with varied attractiveness and expressions. We found that attractiveness and expression were reflected by two early components, P2-lateral (P2l) and P2-medial (P2m), respectively; their interaction effect was reflected by the LPP, a late component. The findings suggest that facial attractiveness and expression are first processed in parallel to discriminate between stimuli. After this initial processing, more attentional resources are allocated to the faces with the most positive or most negative valence in both the attractiveness and expression dimensions. The findings contribute to theoretical models of face perception.
    Article · Nov 2015 · Frontiers in Psychology