Perception of emotional expressions is independent of face selectivity in monkey inferior temporal cortex.

Laboratory of Brain and Cognition, National Institute of Mental Health, National Institutes of Health, Bethesda, MD 20892, USA.
Proceedings of the National Academy of Sciences (Impact Factor: 9.81). 05/2008; 105(14):5591-6. DOI: 10.1073/pnas.0800489105
Source: PubMed

ABSTRACT The ability to perceive and differentiate facial expressions is vital for social communication. Numerous functional MRI (fMRI) studies in humans have shown enhanced responses to faces with different emotional valence, in both the amygdala and the visual cortex. However, relatively few studies have examined how valence influences neural responses in monkeys, thereby limiting the ability to draw comparisons across species and thus understand the underlying neural mechanisms. Here we tested the effects of macaque facial expressions on neural activation within these two regions using fMRI in three awake, behaving monkeys. Monkeys maintained central fixation while blocks of different monkey facial expressions were presented. Four different facial expressions were tested: (i) neutral, (ii) aggressive (open-mouthed threat), (iii) fearful (fear grin), and (iv) submissive (lip smack). Our results confirmed that both the amygdala and the inferior temporal cortex in monkeys are modulated by facial expressions. As in human fMRI, fearful expressions evoked the greatest response in monkeys, even though fearful expressions are physically dissimilar in humans and macaques. Furthermore, we found that valence effects were not uniformly distributed over the inferior temporal cortex. Surprisingly, these valence maps were independent of two related functional maps: (i) the map of "face-selective" regions (faces versus non-face objects) and (ii) the map of "face-responsive" regions (faces versus scrambled images). Thus, the neural mechanisms underlying face perception and valence perception appear to be distinct.

  • ABSTRACT: According to the classic Bruce and Young (1986) model of face recognition, identity and emotional expression information from the face are processed in parallel and independently. Since this functional model was published, a growing body of research has challenged this viewpoint and instead supports an interdependence view. In addition, neural models of face processing (Haxby, Hoffman & Gobbini, 2000) emphasise differences in the processing of changeable and invariant aspects of faces. This article provides a critical appraisal of this literature and discusses the role of motion in both expression and identity recognition, and the intertwined nature of identity, expression, and motion processing. We conclude by discussing recent advancements in this area and research questions that still need to be addressed.
    Frontiers in Psychology 03/2015; 6. DOI:10.3389/fpsyg.2015.00255 · 2.80 Impact Factor
  • ABSTRACT: To investigate the effect of face inversion and thatcherization (eye inversion) on the temporal processing stages of facial information, single-neuron activity was recorded in the temporal cortex (area TE) of two rhesus monkeys. Test stimuli were colored pictures of monkey faces (four, with four different expressions), human faces (three, with four different expressions), and geometric shapes. Each face picture was modified so that four variations served as stimuli: upright original, inverted original, upright thatcherized, and inverted thatcherized faces. A total of 119 neurons responded to at least one of the upright original facial stimuli. A majority of the neurons (71%) showed activity modulation depending on upright versus inverted presentation, and fewer neurons (13%) showed activity modulation depending on original versus thatcherized conditions. With face inversion, information about the fine category (facial identity and expression) decreased, whereas information about the global category (monkey vs. human vs. shape) was retained for both original and thatcherized faces. Principal component analysis of the neuronal population responses revealed that global categorization occurred regardless of face inversion, and that inverted faces were represented near upright faces in the principal component space. By contrast, face inversion decreased the ability to represent human facial identity and monkey facial expression. Thus, the neuronal population represented inverted faces as faces but failed to represent the identity and expression of the inverted faces, indicating that the neuronal representation in area TE causes the perceptual effect of face inversion.
    The Journal of Neuroscience : The Official Journal of the Society for Neuroscience 09/2014; 34(37):12457-69. DOI:10.1523/JNEUROSCI.0485-14.2014 · 6.75 Impact Factor
  • ABSTRACT: In 1998, several groups reported the feasibility of fMRI experiments in monkeys, with the goal of bridging the gap between invasive nonhuman primate studies and human functional imaging. These studies yielded critical insights into the neuronal underpinnings of the BOLD signal. Furthermore, the technology has been successful in guiding electrophysiological recordings and identifying focal perturbation targets. Finally, invaluable information was obtained concerning human brain evolution. Here we provide a comprehensive overview of awake monkey fMRI studies, mainly confined to the visual system. We review the latest insights into the topographic organization of monkey visual cortex and discuss the spatial relationships between retinotopy and category- and feature-selective clusters. We briefly discuss the functional layout of parietal and frontal cortex and continue with a summary of functional and effective connectivity studies. Finally, we review recent comparative fMRI experiments and speculate about the future of nonhuman primate imaging.
    Neuron 08/2014; 83(3):533-550. DOI:10.1016/j.neuron.2014.07.015 · 15.98 Impact Factor
