Article

Perception of emotional expressions is independent of face selectivity in monkey inferior temporal cortex.

Laboratory of Brain and Cognition, National Institute of Mental Health, National Institutes of Health, Bethesda, MD 20892, USA.
Proceedings of the National Academy of Sciences (Impact Factor: 9.81). 05/2008; 105(14):5591-6. DOI: 10.1073/pnas.0800489105
Source: PubMed

ABSTRACT: The ability to perceive and differentiate facial expressions is vital for social communication. Numerous functional MRI (fMRI) studies in humans have shown enhanced responses to faces with different emotional valence, in both the amygdala and the visual cortex. However, relatively few studies have examined how valence influences neural responses in monkeys, thereby limiting the ability to draw comparisons across species and thus understand the underlying neural mechanisms. Here we tested the effects of macaque facial expressions on neural activation within these two regions using fMRI in three awake, behaving monkeys. Monkeys maintained central fixation while blocks of different monkey facial expressions were presented. Four different facial expressions were tested: (i) neutral, (ii) aggressive (open-mouthed threat), (iii) fearful (fear grin), and (iv) submissive (lip smack). Our results confirmed that both the amygdala and the inferior temporal cortex in monkeys are modulated by facial expressions. As in human fMRI, fearful expressions evoked the greatest response in monkeys, even though fearful expressions are physically dissimilar in humans and macaques. Furthermore, we found that valence effects were not uniformly distributed over the inferior temporal cortex. Surprisingly, these valence maps were independent of two related functional maps: (i) the map of "face-selective" regions (faces versus non-face objects) and (ii) the map of "face-responsive" regions (faces versus scrambled images). Thus, the neural mechanisms underlying face perception and valence perception appear to be distinct.


Available from: Roger B Tootell, May 22, 2014
  • Source
    ABSTRACT: According to the classic Bruce and Young (1986) model of face recognition, identity and emotional expression information from the face are processed in parallel and independently. Since this functional model was published, a growing body of research has challenged this viewpoint and instead supports an interdependence view. In addition, neural models of face processing (Haxby, Hoffman & Gobbini, 2000) emphasise differences between the processing of changeable and invariant aspects of faces. This article provides a critical appraisal of this literature and discusses the role of motion in both expression and identity recognition and the intertwined nature of identity, expression and motion processing. We conclude by discussing recent advancements in this area and research questions that still need to be addressed.
    Frontiers in Psychology 03/2015; 6. DOI:10.3389/fpsyg.2015.00255 · 2.80 Impact Factor
  • Source
    ABSTRACT: Evidence suggests the anterior temporal lobe (ATL) plays an important role in person identification and memory. In humans, neuroimaging studies of person memory report consistent activations in the ATL to famous and personally familiar faces, and studies of patients report that resection or damage to the ATL causes an associative prosopagnosia in which face perception is intact but face memory is compromised. In addition, high-resolution fMRI studies of non-human primates and electrophysiological studies of humans also suggest that regions of the ventral ATL are sensitive to novel faces. The current study extends previous findings by investigating whether similar subregions in the dorsal, ventral, lateral, or polar aspects of the ATL are sensitive to personally familiar, famous, and novel faces. We present the results of two studies of person memory: a meta-analysis of existing fMRI studies and an empirical fMRI study using optimized imaging parameters. Both studies showed left-lateralized ATL activations to familiar individuals, while novel faces activated the right ATL. Activations to famous faces were quite ventral, similar to what has been reported in previous high-resolution fMRI studies of non-human primates. These findings suggest that face memory-sensitive patches in the human ATL are in the ventral/polar ATL.
    Frontiers in Human Neuroscience 02/2013; 7:17. DOI:10.3389/fnhum.2013.00017 · 2.90 Impact Factor
  • Source
    ABSTRACT: It is widely assumed that the fusiform face area (FFA), a brain region specialized for face perception, is not involved in processing emotional expressions. This assumption is based on the proposition that the FFA is involved in face identification and only processes features that are invariant across changes due to head movements, speaking and expressing emotions. The present study tested this proposition by examining whether the response in the human FFA varies across emotional expressions, using functional magnetic resonance imaging and brain decoding analysis techniques (n = 11). A one vs. all classification analysis showed that most emotional expressions that participants perceived could be reliably predicted from the neural pattern of activity in the left and right FFA, suggesting that the perception of different emotional expressions recruits partially non-overlapping neural mechanisms. In addition, emotional expressions could also be decoded from the pattern of activity in the early visual cortex (EVC), indicating that retinotopic cortex also shows a differential response to emotional expressions. These results cast doubt on the idea that the FFA is involved in expression-invariant face processing, and instead indicate that emotional expressions evoke partially de-correlated signals throughout occipital and posterior temporal cortex.
    Frontiers in Human Neuroscience 10/2013; 7:692. DOI:10.3389/fnhum.2013.00692 · 2.90 Impact Factor