The role of the occipital face area in the cortical face perception network

Institute of Cognitive Neuroscience, University College London, Alexandra House, London, WC1N 3AR, UK.
Experimental Brain Research, 02/2011; 209(4):481-93. DOI: 10.1007/s00221-011-2579-1
Source: PubMed


Functional magnetic resonance imaging (fMRI) studies have identified spatially distinct face-selective regions in human cortex. These regions have been linked together to form the components of a cortical network specialized for face perception, but the cognitive operations performed in each region are not well understood. In this paper, we review the evidence concerning one of these face-selective regions, the occipital face area (OFA), to better understand what cognitive operations it performs in the face perception network. Neuropsychological evidence and transcranial magnetic stimulation (TMS) studies demonstrate that the OFA is necessary for accurate face perception. fMRI and TMS studies investigating the functional role of the OFA suggest that it preferentially represents the parts of a face, including the eyes, nose, and mouth, and that it does so at an early stage of visual perception. These studies are consistent with the hypothesis that the OFA is the first stage in a hierarchical face perception network in which the OFA represents facial components prior to subsequent processing of increasingly complex facial features in higher face-selective cortical regions.

    • "This conjecture was based on the observation that category-selective visual cortical regions generally come in anterior–posterior pairs (Taylor & Downing, 2011; Schwarzlose, Swisher, Dang, & Kanwisher, 2008), such as the fusiform face area (FFA; Kanwisher, McDermott, & Chun, 1997) and the occipital face area (OFA; Gauthier et al., 2000), both of which are frequently implicated in fMRI studies of face recognition. One view is that the OFA generates an initial representation of face features or parts that are subsequently integrated with respect to spatial configuration in the FFA (Pitcher, Walsh, & Duchaine, 2011; Liu, Harris, & Kanwisher, 2010; Pitcher, Walsh, Yovel, & Duchaine, 2007). We therefore hypothesized the existence of a secondary VWFA—a more posterior occipital word form area (OWFA) in the left hemisphere—that works together with the VWFA to represent hemifield-split letter strings as whole words. "
    ABSTRACT: Reading requires the neural integration of visual word form information that is split between our retinal hemifields. We examined multiple visual cortical areas involved in this process by measuring fMRI responses while observers viewed words that changed or repeated in one or both hemifields. We were specifically interested in identifying brain areas that exhibit decreased fMRI responses as a result of repeated versus changing visual word form information in each visual hemifield. Our method yielded highly significant effects of word repetition in a previously reported visual word form area (VWFA) in occipitotemporal cortex, which represents hemifield-split words as whole units. We also identified a more posterior occipital word form area (OWFA), which represents word form information in the right and left hemifields independently and is thus both functionally and anatomically distinct from the VWFA. Both the VWFA and the OWFA were left-lateralized in our study and strikingly symmetric in anatomical location relative to known face-selective visual cortical areas in the right hemisphere. Our findings are consistent with the observation that category-selective visual areas come in pairs and support the view that neural mechanisms in left visual cortex, especially those that evolved to support the visual processing of faces, are developmentally malleable and become incorporated into a left-lateralized visual word form network that supports rapid word recognition and reading.
    Journal of Cognitive Neuroscience, 11/2015.
    • "These models did not, however, take into account experimental and clinical data which clearly show: (a) that visual (face), auditory (voice) and verbal (name) recognition modalities have a different hemispheric representation and (b) that different patterns of famous people recognition disorders can be observed in patients with right and left hemispheric lesions. As for the asymmetry of brain areas underlying the person recognition modalities, it is now well known that structures involved in the processing of faces, (such as the fusiform face area (Kanwisher et al., 1997; Yovel and Kanwisher, 2004) and the occipital face area (Pitcher et al., 2007, 2011)) and in the processing of voices, (such as the the superior temporal sulcus (von Kriegstein et al., 2003; von Kriegstein and Giraud, 2004; Dubois et al., 2010)), are more developed in the right than in the left hemisphere. As for clinical data dealing with the different patterns of person recognition disorders observed in right and left brain-damaged patients, in a recent review, Gainotti and Marra (2011) showed that two main sources of variance, related to the intra-and inter-hemispheric locus of lesion, account for the qualitative differences existing between face recognition disorders observed in patients with anterior and posterior, right and left hemisphere lesions. "
    ABSTRACT: The aim of the present survey was to review clinical and experimental data concerning the visual (face), auditory (voice) and verbal (name) channels through which familiar people are recognized, by contrasting these data with the assumptions made by modular cognitive models of familiar people recognition. Particular attention was paid to the fact that visual (face), auditory (voice) and verbal (name) recognition modalities have different hemispheric representations and that these asymmetries have important implications for cognitive models, which have not considered hemispheric differences as an important variable in familiar people recognition. Several lines of research have, indeed, shown that familiar faces and voices are mainly underpinned by the right hemisphere, whereas names are mostly subsumed by the left hemisphere. Furthermore, anatomo-clinical data have shown that familiarity judgements are not generated at the level of the Person Identity Nodes (PINs), as suggested by influential cognitive models, but at the level of the modality-specific recognition units, with a right-hemisphere dominance in the generation of face and voice familiarity feelings. Additionally, clinical and experimental data have shown that PINs should not be considered a simple gateway to a unitary semantic system, which stores information about people in an abstract and amodal format, but rather structures involved in the storage and retrieval of person-specific information, preferentially represented in a sensory-motor format in the right hemisphere and in a language-mediated format in the left hemisphere. Finally, clinical and experimental data have shown that, before the level of the PINs, cross-communication exists between the perceptual channels concerning faces and voices, but not between the latter and personal names. These data show that person-specific representations are mainly based on perceptual (face and voice) information in the right hemisphere and on verbal information in the left hemisphere.
    Neuropsychologia, 09/2015; 77. DOI: 10.1016/j.neuropsychologia.2015.09.002
    • "Taken together, in this article we provide results that may indicate the usefulness of GSR for 1 In contrast to the FFA, the OFA is suggested to be an earlier stage in a hierarchical face perception network (Pitcher et al., 2011), where basic facial components are decoded. The OFA provides input to higher face selective cortical regions, such as the FFA (Haxby et al., 2000), in which more complex features are processed (Haxby et al., 2000; Pitcher et al., 2011). In turn, the FFA is thought to be involved in a later stage of more complex information processing and is assumed to exert the dominant influence (as compared to the OFA) to the extended face processing system, which, among others, includes the amygdala (Fairhall and Ishai, 2007; Herrington et al., 2011; Vuilleumier et al., 2003, 2004; Vuilleumier and Pourtois, 2007). "
    ABSTRACT: The usefulness of applying global signal regression (GSR) to resting-state functional magnetic resonance imaging data is a widely discussed topic. In this article, we report an observation of segregated distribution of amygdala resting-state functional connectivity (rs-FC) within the fusiform gyrus (FFG) as an effect of GSR in a multi-center sample of 276 healthy subjects. Specifically, we observed that amygdala rs-FC was distributed within the FFG as distinct anterior versus posterior clusters delineated by positive versus negative rs-FC polarity when GSR was performed. To characterize this effect in more detail, post hoc analyses revealed the following: first, direct overlays of face-sensitive areas derived from task-based functional magnetic resonance imaging and clusters of positive versus negative amygdala rs-FC showed that the positive amygdala rs-FC cluster corresponded best with the fusiform face area, whereas the occipital face area corresponded to the negative amygdala rs-FC cluster. Second, as expected from a hierarchical face perception model, these amygdala rs-FC defined clusters showed differential rs-FC with other regions of the visual stream. Third, dynamic connectivity analyses revealed that these amygdala rs-FC defined clusters also differed in their rs-FC variance across time to the amygdala. Furthermore, subsample analyses of three independent research sites confirmed the reliability of the effect of GSR, as revealed by similar patterns of distinct amygdala rs-FC polarity within the FFG. In this article, we discuss the potential of GSR to segregate face-sensitive areas within the FFG and how our results may relate to the functional organization of the face-perception circuit.
    Human Brain Mapping, 07/2015; 36(10). DOI: 10.1002/hbm.22900
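    For readers unfamiliar with global signal regression, the sketch below illustrates the core of the technique discussed in the abstract above: the mean time course across all voxels is regressed out of each voxel's time series before functional connectivity is computed. This is a minimal illustration only; the function name, the NumPy-based implementation, and the synthetic data are assumptions for demonstration and do not reproduce the authors' preprocessing pipeline.

      import numpy as np

      def global_signal_regression(bold):
          # bold: 2-D array of shape (n_timepoints, n_voxels) with preprocessed BOLD data.
          # Build a design matrix from the global mean time course plus an intercept,
          # fit it to every voxel time series, and return the residuals.
          global_signal = bold.mean(axis=1, keepdims=True)                   # (T, 1)
          design = np.hstack([global_signal, np.ones_like(global_signal)])   # (T, 2)
          beta, *_ = np.linalg.lstsq(design, bold, rcond=None)               # (2, V)
          return bold - design @ beta                                        # (T, V)

      # Illustrative usage with random data standing in for real BOLD signals.
      rng = np.random.default_rng(0)
      cleaned = global_signal_regression(rng.standard_normal((200, 5000)))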