
Neural correlates of processing facial identity based on features versus their spacing.

Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, Ont., Canada.
Neuropsychologia (Impact Factor: 3.48). 05/2007; 45(7):1438-51. DOI: 10.1016/j.neuropsychologia.2006.11.016
Source: PubMed

ABSTRACT: Adults' expertise in recognizing facial identity involves encoding subtle differences among faces in the shape of individual facial features (featural processing) and in the spacing among features (a type of configural processing called sensitivity to second-order relations). We used fMRI to investigate the neural mechanisms that differentiate these two types of processing. Participants made same/different judgments about pairs of faces that differed only in the shape of the eyes and mouth, with minimal differences in spacing (featural blocks), or pairs of faces that had identical features but differed in the positions of those features (spacing blocks). From a localizer scan with faces, objects, and houses, we identified regions with comparatively more activity for faces, including the fusiform face area (FFA) in the right fusiform gyrus, other extrastriate regions, and prefrontal cortices. Contrasts between the featural and spacing conditions revealed distributed patterns of activity differentiating the two conditions. A region of the right fusiform gyrus (near but not overlapping the localized FFA) showed greater activity during the spacing task, along with multiple areas of right frontal cortex, whereas left prefrontal activity increased for featural processing. These patterns of activity were not related to differences in performance between the two tasks. The results indicate that the processing of facial features is distinct from the processing of second-order relations in faces, and that these functions are mediated by separate and lateralized networks involving the right fusiform gyrus, although the FFA as defined from a localizer scan is not differentially involved.
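The group-level comparison described above (stronger activity for spacing than for featural blocks in a right fusiform region, assessed separately from the localizer-defined FFA) reduces to a paired contrast of condition-wise ROI responses across participants. The following is a minimal illustrative sketch, not the authors' pipeline: it simulates per-participant ROI beta estimates for two hypothetical conditions and runs a paired t-test in Python; all names and values are assumptions for illustration only.

    # Minimal, hypothetical sketch of an ROI-level paired contrast between two
    # fMRI conditions (e.g., "spacing" vs. "featural" blocks). Beta values are
    # simulated; in a real analysis they would come from a first-level GLM,
    # averaged over voxels in a functionally or anatomically defined ROI.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_subjects = 16  # hypothetical sample size

    # Simulated mean ROI betas per participant (arbitrary units), assuming a
    # modestly stronger response during the spacing condition.
    featural_betas = rng.normal(loc=0.40, scale=0.15, size=n_subjects)
    spacing_betas = featural_betas + rng.normal(loc=0.12, scale=0.10, size=n_subjects)

    # Paired t-test across participants: does the ROI respond more strongly
    # when faces differ in spacing than when they differ in features?
    t_val, p_val = stats.ttest_rel(spacing_betas, featural_betas)
    print(f"spacing vs. featural: t({n_subjects - 1}) = {t_val:.2f}, p = {p_val:.4f}")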

  • ABSTRACT: Facial color is important information for social communication, as it provides clues to a person's emotion and health condition. Our previous EEG study suggested that the N170 at the left occipito-temporal site is related to facial color processing (Nakajima et al., 2012, Neuropsychologia 50:2499–2505). However, because of the low spatial resolution of EEG, which brain region is involved in facial color processing remains controversial. In the present study, we examined the neural substrates of facial color processing using functional magnetic resonance imaging (fMRI). We measured brain activity from 25 subjects during the presentation of natural- and bluish-colored faces and their scrambled counterparts. The bilateral fusiform face area (FFA) and occipital face area (OFA) were localized by contrasting natural-colored faces with natural-colored scrambled images. Region-of-interest (ROI) analysis showed that the left FFA was sensitive to facial color, whereas the right FFA and the left and right OFA were not. In combination with our previous EEG results, these data suggest that the left FFA may play an important role in facial color processing. Hum Brain Mapp, 2014. © 2014 Wiley Periodicals, Inc.
    Human Brain Mapping, 04/2014. Impact Factor: 6.88
  • ABSTRACT: This study investigated the neurocognitive mechanisms underlying the role of the eye and mouth regions in the recognition of facial happiness, anger, and surprise. To this end, face stimuli were shown in three formats (whole face, upper half visible, and lower half visible), and behavioral categorization, computational modeling, and event-related potential (ERP) measures were combined. The N170 (150–180 ms post-stimulus; right hemisphere) and the EPN (early posterior negativity; 200–300 ms; mainly right hemisphere) were modulated by the expression of whole faces, but not by separate halves. This suggests that expression encoding (N170) and emotional assessment (EPN) require holistic processing, mainly in the right hemisphere. In contrast, the mouth region of happy faces enhanced left temporo-occipital activity (150–180 ms) and also enhanced LPC (late positive complex; centro-parietal) activity earlier (350–450 ms) than the angry eyes (450–600 ms) or other face regions. Relatedly, computational modeling revealed that the mouth region of happy faces was visually salient by 150 ms after stimulus onset. This suggests that analytical, part-based processing of the salient smile occurs early (150–180 ms), is lateralized to the left hemisphere, and is subsequently used as a shortcut to identify the expression of happiness (350–450 ms). This would account for the happy-face advantage in behavioral recognition tasks when the smile is visible.
    NeuroImage, 02/2014. Impact Factor: 6.25
  • ABSTRACT: We investigated how face-selective cortical areas process configural and componential face information and how the race of faces may influence these processes. Participants saw blurred (preserving configural information), scrambled (preserving componential information), and whole faces during an fMRI scan, and performed a post-scan face recognition task using blurred or scrambled faces. The fusiform face area (FFA) showed stronger activation to blurred than to scrambled faces, and equivalent responses to blurred and whole faces. The occipital face area (OFA) showed stronger activation to whole than to blurred faces, which elicited responses similar to those for scrambled faces. Therefore, the FFA may be tuned more to configural than to componential information, whereas the OFA participates in the perception of both. Differences in recognizing own- and other-race blurred faces were correlated with differences in FFA activation to those faces, suggesting that configural processing within the FFA may underlie the other-race effect in face recognition. [A minimal sketch of this kind of brain-behavior correlation, with simulated data, follows this list.]
    Cognitive Neuroscience, 05/2014. Impact Factor: 2.19
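The brain-behavior relationship reported in the last item above (differences in recognizing own- versus other-race blurred faces correlated with differences in FFA activation) is, at its core, a correlation between two per-participant difference scores. The sketch below is a minimal, hypothetical illustration of that computation with simulated data; it is not the authors' analysis code, and the sample size and effect magnitudes are assumptions.

    # Hypothetical sketch: correlate per-participant FFA activation differences
    # (own-race minus other-race blurred faces) with the corresponding
    # behavioral recognition differences. All values are simulated.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    n_subjects = 20  # hypothetical sample size

    # Simulated difference scores (arbitrary units / proportion correct).
    ffa_diff = rng.normal(loc=0.10, scale=0.08, size=n_subjects)
    behav_diff = 0.5 * ffa_diff + rng.normal(scale=0.05, size=n_subjects)

    # Pearson correlation across participants.
    r_val, p_val = stats.pearsonr(ffa_diff, behav_diff)
    print(f"r({n_subjects - 2}) = {r_val:.2f}, p = {p_val:.4f}")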
