Neural response to specific components of fearful faces in healthy and schizophrenic adults
Perception of fearful faces is associated with functional activation of cortico-limbic structures, which has been found altered in individuals with psychiatric disorders such as schizophrenia, autism and major depression. The objective of this study was to isolate the brain response to the features of standardized fearful faces by incorporating principal component analysis (PCA) into the analysis of neuroimaging data of healthy volunteers and individuals with schizophrenia. At the first stage, the visual characteristics of morphed fearful facial expressions (FEEST, Young et al., 2002) were classified with PCA, which produced seven orthogonal factors, with some of them related to emotionally salient facial features (eyes, mouth, brows) and others reflecting non-salient facial features. Subsequently, these PCA-based factors were included into the functional magnetic resonance imaging (fMRI) analysis of 63 healthy volunteers and 32 individuals with schizophrenia performing a task that involved implicit processing of FEEST stimuli. In healthy volunteers, significant neural response was found to visual characteristics of eyes, mouth or brows. In individuals with schizophrenia, PCA-based analysis enabled us to identify several significant clusters of activation that were not detected by the standard approach. These clusters were implicated in processing of visual and emotional information and were attributable to the perception of eyes and brows. PCA-based analysis could be useful in isolating brain response to salient facial features in psychiatric populations.
Joaquim Radua, Mary L. Phillips, Tamara Russell, Natalia Lawrence, Nicolette Marshall, Wissam El-Hage, Colm McDonald, Vincent Giampietro, Michael J. Brammer, Anthony S. David, Simon A. Surguladze

King's College London Institute of Psychiatry, UK; Institut d'Alta Tecnologia, Parc de Recerca Biomèdica de Barcelona, Spain; Department of Psychiatry, Western Psychiatric Institute and Clinic, University of Pittsburgh School of Medicine, Pittsburgh, USA; Department of Psychological Medicine, Cardiff University School of Medicine, Cardiff, UK; Wales Institute of Cognitive Neuroscience, Cardiff University, Cardiff, UK; INSERM U930 ERL CNRS 3106, Université François Rabelais, Tours, France; Department of Psychiatry, National University of Ireland, Galway, Ireland

Received 13 June 2009; accepted 10 August 2009; available online 20 August 2009.

Keywords: Principal component analysis
© 2009 Elsevier Inc. All rights reserved.

Introduction
The ability to recognize facial emotional expressions in others is an
essential aspect of social cognition. In neuroimaging studies, the
processing of fearful facial expressions has been associated with
functional activation of several brain structures in both the “core” and
the extended face processing systems (Haxby et al., 2002), including
the amygdala (Breiter et al., 1996; Costafreda et al., 2008; Morris et al.,
1998), the orbitofrontal cortex (Blair et al., 1999) and the fusiform
gyrus (Sprengelmeyer et al., 1998; Surguladze et al., 2003). This robust
activation of the limbic network by fearful facial expressions has led to
the wide use of such stimuli in psychiatric research. For instance,
functional magnetic resonance imaging (fMRI) studies have reported
increased responses in amygdala in individuals with depression
(Sheline et al., 2001), social phobia (Phan et al., 2006) or posttraumatic
stress disorder (Rauch et al., 2000), and decreased responses in
individuals with non-paranoid schizophrenia (Phillips et al., 1999) or
Asperger syndrome (Ashwin et al., 2007). Decreased responses in
fusiform gyrus have been reported in individuals with social phobia
(Gentili et al., 2008) and Asperger syndrome (Deeley et al., 2007).
Several strategies have been developed to explore the underlying
mechanisms of these abnormalities in face perception. It is known
that when viewing faces, healthy individuals fixate their gaze on
salient features, e.g. the eyes, mouth and nose (Walker-Smith et al.,
1977). Conversely, deluded schizophrenia patients pay comparatively
less attention to the salient features of faces (Green and Phillips, 2004),
and this is associated with poor facial recognition (Williams et al.,
1999). Individuals with autism or social phobia are also less likely to
direct their gaze to the eyes (Horley et al., 2003; Pelphrey et al., 2002;
Riby et al., 2008). Importantly, the abnormalities in visual scan path are more apparent during the processing of emotional facial expressions, e.g. individuals with schizophrenia fixate less on the salient features when viewing expressions of negative (Green et al., 2003) or even positive (Shimizu et al., 2000) affect. This kind of abnormality has also been described in patients with Alzheimer's disease (Ogrocki et al., 2000), who fixated more on irrelevant rather than salient facial features when exposed to pictures of facial affect. Thus, it follows that the brain response to emotional expressions in different psychiatric populations would differ not only because of illness-related changes in emotional circuits, but also because these individuals differ in their strategies of viewing other people's faces. Recently, Dalton et al. (2005) highlighted the importance of accounting for the visual scan path in individuals with autism: whereas the patients avoided looking at other people's eyes (presented in photographs), taking the visual scan paths into account revealed overactive (rather than under-active, as in previous studies) amygdala and fusiform cortex.

⁎ Corresponding author: King's College London, Institute of Psychiatry, PO 69, Division of Psychological Medicine, London SE5 8AF, UK. Fax: 020 7848 0379. E-mail addresses: Joaquim.Radua@kcl.ac.uk, Joaquim.Radua@iop.kcl.ac.uk (J. Radua). NeuroImage 49 (2010) 939–946.
There have been attempts to examine the brain responses to
distinct facial features. Neuroimaging studies with chimerical (Morris
et al., 2002) or masked faces (isolated eyes area) (Whalen et al., 2004)
demonstrated that processing of other people's eye regions was
associated with activation in the amygdala. Changeable aspects of the
face (mouth movements, gaze shifts) have been found to be processed
by areas in the superior temporal sulcus (Hoffman and Haxby, 2000; Puce
et al., 1998). Conversely, it has been shown that the whole facial
configuration (rather than separate parts) was processed in other
parts of the brain, e.g., the fusiform gyrus (Harris and Aguirre, 2008;
Maurer et al., 2007; Rotshtein et al., 2007). Studies of the dynamics of
the brain response to emotional faces have similarly found that the
integration of emotion-related salient facial features (e.g. the eye
region in fear) precedes and determines the latency
of the N170 event-related potential (Schyns et al., 2007).
In this study we tested a method that allowed us to examine the brain
response to distinct components of facial stimuli expressing different
degrees of fear (i.e., mild or prototypical fear; Young et al., 2002). We
first measured the Facial Action Units based on the Facial Action
Coding System (FACS) (Ekman and Friesen, 1978) and then employed
Principal Component Analysis (PCA) to obtain a few orthogonal facial
factors. It should be noted that PCA has previously been used by Calder
et al. (2001) in a behavioral study of facial expression recognition.
However, our approach differed from that of Calder et al. in that
we measured facial features based on FACS rather than pixel
intensities. Another important difference is that by incorporating the
PCA-based factors into the neuroimaging data analysis we were able to
produce brain maps showing blood oxygenation level dependent (BOLD)
response variation associated with each PCA-based independent facial
factor (e.g. response to eyes, response to brows, etc.). Finally, to
explore the clinical relevance of this approach we have applied this
method to the neuroimaging data of individuals with schizophrenia
who underwent the same facial emotion processing experiments.
Materials and methods
Sixty-three healthy volunteers and thirty-two individuals with a
DSM-IV diagnosis of schizophrenia participated in the study. The main
demographic and clinical characteristics of the samples are shown in
Table 1. It must be noted that our study was not designed to compare
healthy volunteers with individuals with schizophrenia, so we did not
use matched sampling. Healthy volunteers had no history of
psychiatric disorder, traumatic brain injury, or recent substance
abuse. Individuals with schizophrenia were stable out-patients
treated with depot antipsychotic medication: risperidone long-acting
injections (n = 16), flupentixol decanoate (n = 12), fluphenazine
decanoate (n = 2), haloperidol decanoate (n = 1) and pipotiazine
palmitate (n = 1); the mean chlorpromazine equivalent of the depot
antipsychotics was 213 mg/day (British Medical Association and
Royal Pharmaceutical Society of Great Britain, 2006; Goldberg and
Murray, 2006). All participants were right-handed and had normal or
corrected-to-normal vision.
The study protocols were in compliance with the Code of Ethical
Principles for Medical Research Involving Human Subjects of the
World Medical Association (Declaration of Helsinki) and were
approved by the joint ethical committee of the Institute of Psychiatry
and the South London & Maudsley NHS Trust. All study participants
gave written informed consent.
During a 6-min event-related fMRI experiment the participants
(both healthy volunteers and patients) were presented with a series of
photographs of fearful and emotionally neutral male and female faces
from the FEEST. The faces were expressing different levels of fear:
there were 10 photographs with neutral expression (0% fear), 10
morphed photographs with mild (50%) fear and 10 photographs with
prototypical (100%) fear. The presentation order was randomized,
with each of the 30 facial stimuli presented twice, which made 60
presentations in total. Duration of each facial presentation was 2 s.
Stimulus onset asynchrony (SOA) varied from 3 to 13 s according to a
Poisson distribution with average interval 6 s. Immediately after each
facial stimulus the subjects viewed a fixation cross that was used as a
baseline stimulus in subsequent analysis. The participants were
requested to decide upon the sex of each facial stimulus and press
one of two buttons accordingly with the right index or middle finger;
this implicit task has been robustly associated with activation of
limbic structures (Morris et al., 1998; Surguladze et al., 2003). All
participants were able to identify the sex of the faces correctly (at
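The presentation schedule described above (30 faces shown twice in randomized order, SOAs of 3 to 13 s drawn from a Poisson distribution with mean 6 s) can be sketched as follows. This is an illustrative reconstruction, not the authors' actual implementation: the rejection-sampling truncation and the random seed are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# 30 facial stimuli (10 neutral, 10 mild fear, 10 prototypical fear),
# each presented twice, in randomized order: 60 presentations in total.
stimuli = np.repeat(np.arange(30), 2)
rng.shuffle(stimuli)

def draw_soa():
    """Draw one SOA (seconds) from a Poisson(6) truncated to [3, 13]."""
    while True:
        soa = rng.poisson(lam=6)
        if 3 <= soa <= 13:
            return soa

soas = np.array([draw_soa() for _ in range(len(stimuli))])
```

The rejection loop is one simple way to respect the stated 3–13 s range; how the truncation was actually enforced in the experiment is not specified in the text.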
Gradient echo Echo Planar Imaging (EPI) data were acquired on a
GE Signa 1.5-T system (General Electric, Milwaukee, WI, USA) at the
Maudsley Hospital (London). A quadrature birdcage head coil was
used for radio frequency transmission and reception. One hundred
and eighty T2*-weighted images depicting BOLD contrast (Ogawa et al.,
1990) were acquired at each of 16 near-axial non-contiguous 7-mm-
thick planes parallel to the intercommissural (AC–PC) line: echo time
(TE) 40 ms, repetition time (TR) 2 s, in-plane resolution 3.44 mm,
interslice gap 0.7 mm, flip angle 70°, matrix size 64 × 64, field
of view (FOV) 24 cm. In the same scanning session a high-resolution
EPI dataset was acquired with two pulse sequences, gradient echo EPI
and spin echo EPI. The structural images were acquired at 43 near-
axial 3-mm-thick planes parallel to the AC–PC line: TE 73 ms, time for
inversion (TI) 180 ms, TR 16 s, in-plane resolution 1.72 mm, interslice
gap 0.3 mm, matrix size 128 × 128 × 43; FOV 24 cm. This EPI dataset
would later be used to coregister the fMRI datasets acquired from each
individual and normalize them to standard stereotactic space. Prior to
each imaging run, four dummy scans were acquired to reach equilibrium
magnetization. An autoshimming routine was used on each run.

Table 1. Main demographic and clinical characteristics of the samples.

                                          Healthy volunteers (n = 63)   Schizophrenia (n = 32)
Age in years, mean (SD)                   37.8 (10.5)                   43.2 (10.1)
Males/females                             37/26                         17/15
Years of education, mean (SD)             15 (3.4)                      12 (1.0)
Duration of illness in years, mean (SD)   –                             16.8 (8.2)
PANSS general score                       –                             21.1

PANSS: Positive and Negative Syndrome Scale (Kay et al., 1987). GAF: Global Assessment of Functioning (American Psychiatric Association, 2000).
Factorial analysis of the features of facial expressions
Prior to fMRI data analysis, the FEEST photographs were
reclassified using factor analysis.
First, various fear-related features of each photograph were
examined, based on FACS. We measured action unit (AU) 1 (inner
eyebrow upwards), AU 2 (outer eyebrow upwards), AU 4 (eyebrows
together when in combination with action units 1 and 2), AU 5 (upper
eyelid upwards) and AUs 25/26 (lips parted). The distances, angles and
sizes were measured by standard computer image software similarly to
the approach of the Automated Face Analysis (Tian et al., 2001). For
example, the vertical distance in pixels from the top of the iris to the
upper eyelid, or the angle between the inner and the outer halves of
eyebrow, etc., were measured in pictures of each poser and intensity
(Fig. 1). The means of the left and the right measurements were used in
bilateral features. In addition to the FACS-based features, we measured
non-salient parts of the photographs. These included basic structural
features (e.g. the size of the face area, the distance between the two
inner eye corners, etc.), as well as face brightness and contrast.
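As an illustration of such geometric measurements, the angle between the inner and outer halves of an eyebrow can be computed from landmark pixel coordinates. The landmarks below are hypothetical values, not taken from the FEEST set:

```python
import math

# Hypothetical 2D pixel landmarks for one eyebrow (x, y), as might be
# marked up in standard image software: inner corner, mid-point, outer corner.
inner, mid, outer = (120.0, 95.0), (140.0, 88.0), (162.0, 94.0)

def angle_between_halves(inner, mid, outer):
    """Angle (degrees) between the inner and outer halves of the brow."""
    v1 = (inner[0] - mid[0], inner[1] - mid[1])   # mid-point -> inner corner
    v2 = (outer[0] - mid[0], outer[1] - mid[1])   # mid-point -> outer corner
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))

theta = angle_between_halves(inner, mid, outer)
```

Distances (e.g. iris to upper eyelid) would be computed analogously with `math.hypot` on landmark pairs; the actual measurements followed the Automated Face Analysis approach cited above.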
Secondly, in order to avoid using too many variables (i.e. one
variable per measurement) and multicollinearity, a Principal Compo-
nents Analysis (PCA) of the measurements with Equamax rotation
was performed to estimate Anderson–Rubin factors, and each
measurement was included in the factor that held the strongest
correlation with it. Seven uncorrelated facial factors were obtained. It
must be noted that with this procedure a factor can have correlations
with particular measurements included in other factors. However,
these are expected to be much weaker than the correlations with the
measurements included in the factor itself, and the Anderson–Rubin
method ensures that the factors are completely uncorrelated with each other.
Finally, the faces were reclassified so that the facial factors could be
used as regressors in fMRI data analyses. For this purpose, all 30 faces
were re-grouped into 3 equally sized groups according to the values
within each newly derived factor. For example, classification within
the eyes factor implied that the 10 faces with the lowest values in this
factor were labeled as low, the 10 faces with the highest values were
labeled as high and the 10 faces with medium-range values were
labeled as medium. The same procedure was performed 7 times, to
classify the levels of intensity within each of the 7 newly derived factors.
Consequently, the newly derived factors contained 3 levels of
intensity which matched the levels of intensity in the standard
analysis (10 neutral faces, 10 faces with mild fear and 10 faces with
prototypical fear). Therefore, subsequent fMRI analyses would have
exactly the same design and group sizes as the standard analysis (see
Fig. 2 for a diagram of the procedure).
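The two steps above (factor extraction, then tertile regrouping) can be sketched as follows. This is a minimal illustration using unrotated PCA via SVD on simulated stand-in data; the actual analysis used an Equamax rotation with Anderson–Rubin factor scores, and the real input was the matrix of FACS-based measurements.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-in for the real measurement matrix:
# 30 faces (10 neutral, 10 mild, 10 prototypical fear) x 12 features.
X = rng.normal(size=(30, 12))

# Standardize, then PCA via SVD (no rotation here, unlike the paper).
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
k = 7                       # number of retained factors, as in the paper
scores = U[:, :k] * s[:k]   # uncorrelated factor scores, one row per face

# Re-group the 30 faces into 3 equally sized levels within one factor.
def tertile_labels(factor_scores):
    order = np.argsort(factor_scores)
    labels = np.empty(30, dtype=object)
    labels[order[:10]] = "low"
    labels[order[10:20]] = "medium"
    labels[order[20:]] = "high"
    return labels

# e.g. a hypothetical "eyes" factor as the first retained component:
eyes_levels = tertile_labels(scores[:, 0])
```

The resulting low/medium/high labels play the same role as the neutral/mild/prototypical levels of the standard classification, so the fMRI design is unchanged.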
fMRI data analyses
We first computed the BOLD response to the facial stimuli at each
level (i.e., neutral, mild fear, prototypical fear). We then applied a
linear trend analysis across the levels that would reflect the BOLD
response trends to the degrees of intensity within the standard
classification of fear or within a factor. These trends could be either
positive (i.e. BOLD response to high intensity > mild intensity > low
intensity) or negative, with an opposite direction of the BOLD
response (for details of fMRI analysis see Supplementary methods).
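The linear trend analysis can be illustrated with a simple contrast across the three ordered levels. The per-subject BOLD values below are simulated, and the (−1, 0, +1) weighting is a generic linear-trend contrast, not necessarily the exact implementation detailed in the Supplementary methods:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical mean BOLD response at the three intensity levels
# (low / medium / high) for 8 simulated subjects, with a built-in
# positive trend of 0 -> 0.5 -> 1.0 plus noise.
bold = rng.normal(size=(8, 3)) + np.array([0.0, 0.5, 1.0])

# Linear trend contrast across ordered levels: weights (-1, 0, +1).
weights = np.array([-1.0, 0.0, 1.0])
trend = bold @ weights   # one trend estimate per subject

# A positive group-mean trend corresponds to
# BOLD(high) > BOLD(medium) > BOLD(low); a negative one is the reverse.
mean_trend = trend.mean()
```

In the actual analysis such trend values are computed voxel-wise and tested for significance at the cluster level.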
For simplicity we will refer to the fMRI analysis that was based on the
standard classification of fear as the standard analysis, and to the PCA-based
analyses by the corresponding factor name, e.g. eyes factor, brows factor,
etc. We emphasize that all analyses, whether standard or derived
from PCA factors, reflected brain responses to the same facial set. The
difference was only in the procedure of analysis, whereby the PCA-
based analyses targeted the variation of brain response to the three
levels of intensity within each particular facial factor.
Results

PCA of the facial features
PCA (Table 2) produced the following factors: (1) eyes, composed
of vertical distance between the lower and upper eyelids and the
amount of eye white between them, (2) brows, mainly composed of
the elevation of the eyebrows and the distance between them, (3)
mouth, mainly composed of the vertical distance between the upper
and lower lips and the size of the eye whites below the iris, (4) mixed,
composed of both measures of luminance and configuration of brows,
(5) non-emotional I, composed of the size of the face area and the distance
between the eyes and lips, (6) non-emotional II, composed of the
distance between eye corners and lip corners, and (7) non-emotional
III, composed of the mean brightness/luminance of the whole face
area (for details of each factor please see the Supplementary table).

Fig. 1. Measurement of facial components. AU: action unit from the Facial Action Coding System (Ekman and Friesen, 1978). Please note that distances have been hand-drawn for the

Fig. 2. Diagram of the method.
The first three factors corresponded to the salient facial features
that are known to be involved in emotional expressions, i.e. eyes,
brows and mouth; we therefore called them emotional factors. The
last three factors were not expected to have any special meaning, as
variation in features not related to fear was theoretically low.
It must be noted that the eyes and mouth factors were correlated with
the standard classification of fear (r = 0.450, p = 0.013 and r = 0.750,
p < 0.001, respectively). In order to ensure that results of the fMRI
analyses were not confounded by this statistical resemblance, the
angle between these factors and the standard classification was
enlarged (see Supplementary methods). Thus, we obtained a new,
derivative eyes factor which still included the relevant eye features
but was uncorrelated with the standard classification (r = 0.250,
p = 0.183). A new uncorrelated mouth factor could not be obtained.
None of the remaining factors correlated with the standard classifi-
cation of fear (|r| ≤ 0.250, p ≥ 0.183).
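The decorrelation step can be approximated by residualizing the factor on the standard classification. Note this sketch (on simulated scores) performs a full orthogonalization, whereas the authors only enlarged the angle between the two vectors enough to render the correlation non-significant:

```python
import numpy as np

rng = np.random.default_rng(2)
fear = np.repeat([0.0, 1.0, 2.0], 10)     # standard classification: 0/50/100% fear
eyes = 0.5 * fear + rng.normal(size=30)   # hypothetical correlated eyes factor

# Remove the component of the eyes factor explained by the fear
# classification (ordinary least-squares projection on centered fear).
fear_c = fear - fear.mean()
beta = (fear_c @ eyes) / (fear_c @ fear_c)
eyes_orth = eyes - beta * fear_c

# The residualized factor is uncorrelated with the classification.
r = np.corrcoef(eyes_orth, fear)[0, 1]
```

The fully orthogonalized variant corresponds to r = 0; the paper's derivative eyes factor retained a small residual correlation (r = 0.250, n.s.) by design.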
Standard analysis in healthy individuals
Standard analysis (Fig. 3 and Table 3) showed that there was a
positive trend of activation (i.e. BOLD response to prototypical
fear > mild fear > neutral) in bilateral cerebellum, lingual gyri, cunei,
middle and inferior occipital gyri, and right fusiform gyrus. Another
positive trend involved left superior temporal, inferior parietal and
postcentral gyri. A negative trend, which reflected activation to
prototypical fear < mild fear < neutral face, was found in left superior
frontal and bilateral middle and medial frontal gyri.
PCA-based analyses in healthy individuals

The emotional factors eyes, brows and mouth, and to a lesser
extent the mixed factor, reproduced the activation trends in
cerebellum and fusiform/occipital areas. Interestingly, the eyes factor
analysis showed activation in left fusiform gyrus, which was not
significant in the standard analysis (Fig. 3 and Table 3). Positive
activation trends in bilateral lingual and left inferior parietal/superior
temporal gyri were reproduced by the eyes and the mouth factors, but
not by brows, whereas the positive trend in bilateral cunei and the
Table 2. Facial measurements and their Spearman correlations with the Anderson–Rubin factors found.
⁎⁎ Uncorrected p-value < 0.01; ⁎ uncorrected p-value < 0.05; n.s.: not significant. If the top or bottom of the iris was covered by the eyelid, a negative distance was calculated by interpolation.
negative trends in the superior frontal cluster were only reproduced by
the mouth factor. The activation pattern pertaining to the mouth
factor was therefore similar to that obtained by the standard analysis,
as could be expected given the significant correlation between the
factor and the standard classification of faces, so we decided to
exclude this factor from subsequent analyses. The mixed factor was also
excluded as it was heterogeneous and accounted for only a small
proportion (9%) of the standard activation.

At the predetermined level of significance there were no
significant trends of activation to non-emotional factors.
Analyses in individuals with schizophrenia

In order to test the utility of the PCA-based approach we applied
the PCA-based analyses to the data acquired from the patients with
schizophrenia (Fig. 3 and Table 4). The standard analysis showed only
one cluster with a negative trend of activation in the left inferior parietal
region and postcentral gyri, and no positive trends. Conversely, with
the PCA-based analysis we were able to detect several significant
clusters, mainly to the eyes factor (positive activation trend in left
inferior-posterior temporal gyrus and left cerebellum, negative
trend in right fusiform gyrus and amygdala/hippocampus), as well
as to the brows factor (positive activation trend in middle frontal gyrus/
frontal pole).
Discussion

This is the first study of the brain response to fearful faces in which the
analysis incorporated orthogonal factors reflecting the salient features
of the facial stimuli. First, PCA of facial measurements produced seven
factors related to the facial stimuli: eyes, mouth, brows, three non-
emotional factors reflecting spatial and luminance measures irrelevant
to facial emotion, and one mixed factor that included both salient
facial features and a luminance measure. The mouth factor was discarded
because it correlated with the standard classification of fear, and thus
the brain activation associated with this factor simply overlapped
with that obtained by the standard analysis.
The standard analysis of data from healthy volunteers produced
activation maps consistent with the existing literature. Our findings of
positive trends of activation in the visual association cortex in
response to increasing intensity of facial fear replicate previous
results (Morris et al., 1998; Surguladze et al., 2003; Vuilleumier et al.,
2001). The posterior superior temporal cortex activation is also
supported by the existing literature where changeable aspects of face
have been found to be processed by the areas in superior temporal
sulcus (Hoffman and Haxby, 2000; Puce et al., 1998). Finally, the
negative trend of activation in superior frontal gyrus may reflect a
redistribution of resources from areas implicated in cognitive
processing towards those directly engaged in emotion processing (Drevets
and Raichle, 1998).
PCA-based analyses of the same dataset from healthy individuals
showed that brain activation patterns associated with each emotional
factor had commonalities with the results of the standard analysis,
while non-emotional factors elicited no significant brain response.
The common regions with positive activation trends associated with
either the standard or the emotional factor analyses were bilateral
cerebellum and fusiform/occipital cortices. There was also some
factor-related specificity, e.g. the eyes but not the brows factor was
associated with positive trends of activation in bilateral lingual,
inferior parietal and superior temporal gyri.
Fig. 3. BOLD response in the standard analysis and to eyes and brows factors. Significant trends of activation in response to the degrees of intensity, according to standard or PCA-based analysis. Positive trends are depicted in red-yellow colors and negative in blue-purple colors. The left side of the slice corresponds to the left side of the brain. Slice coordinates in Talairach space (Talairach and Tournoux, 1988).
We suggest that both the eye area and the eyebrows are critical
components of emotional expression. These findings are in accor-
dance with the idea that evolutionarily old facial expressions might
serve as reliable signals of threat. For example, displays of fear in
gorillas resemble the human ones, where facial changes involve
movements of eyebrows and mouth (Estes, 1992), and human
Table 3. Trend analyses based on the standard classification of fear and on the PCA-derived factors: healthy subjects (n = 63). Cluster ranges in Talairach space; factor columns give number of voxels (percent of the corresponding standard-analysis cluster).

Region | X range | Y range | Z range | Standard | Eyes | Brows | Mouth | Mixed
Left cerebellum | −47/0 | −89/−41 | −40/−7 | 225 | 137 (61%) | 179 (80%) | 134 (60%) | 14 (6%)
Right cerebellum | 0/40 | −89/−26 | −46/−7 | 223 | 100 (45%) | 98 (44%) | 166 (74%) |
Left fusiform (BA 18) | −47/−25 | −85/−52 | −18/−7 | | 11 (>99%) | 31 (>99%) | |
Right fusiform (BA 19, 37) | 25/43 | −78/−56 | −13/−7 | 56 | 49 (88%) | 44 (79%) | 31 (55%) |
Left MOG (BA 18) | −43/−25 | −85/−70 | −7/9 | 19 | 29 (>99%) | 30 (>99%) | |
Right MOG (BA 18, 19) | 25/36 | −85/−78 | −7/15 | 50 | 35 (70%) | 38 (76%) | 39 (78%) |
Left IOG (BA 18, 19) | −43/−14 | −93/−70 | −13/−2 | 27 | 14 (52%) | 20 (74%) | 22 (81%) |
Right IOG (BA 18) | 32/40 | −81/−70 | −2 | 19 | 12 (63%) | | |
Left lingual (BA 18) | −25/0 | −93/−70 | −13/−2 | 69 | 39 (57%) | | 30 (43%) |
Right lingual (BA 18) | 0/29 | −89/−63 | −13/4 | 146 | 98 (67%) | | 123 (84%) |
Left cuneus (BA 18) | −14/0 | −93/−70 | 4/37 | 25 | | | 41 (>99%) |
Right cuneus (BA 30) | 0/18 | −85/−67 | 9/31 | 45 | | | 68 (>99%) |
Subtotal | | | | 903 | 515 (57%) | 423 (47%) | 635 (70%) | 84 (9%)

Left inferior parietal cluster
Left inferior parietal (BA 40) | −58/−40 | −30/−22 | 26/37 | 76 | 40 (53%) | | 46 (61%) |
Left postcentral (BA 2, 40) | −58/−40 | −33/−19 | 15/37 | 48 | 109 (>99%) | | 34 (71%) |
Left superior temporal (BA 13, 41) | −58/−43 | −44/−7 | −2/15 | 35 | 16 (46%) | | 46 (>99%) |
Subtotal | | | | 159 | 168 (>99%) | | 126 (79%) |

Superior frontal cluster
Left superior frontal (BA 9, 10) | −25/−4 | 30/63 | 15/37 | 127 | | | 95 (75%) |
Left middle frontal (BA 8) | −29/−22 | 15 | 37 | 103 | | | 72 (70%) |
Right middle frontal (BA 8) | 25 | 22/26 | 31/37 | 20 | | | |
Left medial frontal (BA 9) | −22/0 | 33/63 | 9/26 | 51 | | | |
Right medial frontal (BA 9) | 0/22 | 33/56 | 9/37 | 42 | | | 17 (40%) |
Subtotal | | | | 342 | | | 193 (56%) |

There were no significant trends of activation to non-emotional factors. The percent values indicate the size of the PCA-based clusters relative to the size of the corresponding standard-analysis clusters. Subtotals may not coincide with the sum of included regions because of rounding and because regions with fewer than 10 voxels are not reported. MOG: middle occipital gyrus. IOG: inferior occipital gyrus.
Table 4. Trend analyses based on the standard classification of fear and on the PCA-derived factors: individuals with schizophrenia (n = 32). Cluster ranges in Talairach space; factor columns give number of voxels.

Region | X range | Y range | Z range | Standard | Eyes | Brows
Left inferior posterior temporal (BA 37) | −47 | −41/−59 | −18/−24 | | 74 (>99%) |
Left cerebellum | −25 | −74 | −35 | | 22 (>99%) |
Subtotal | | | | | 96 (>99%) |

Frontal pole (BA 10, 46) | −18/−25 | 63/67 | −7 | | | 84 (>99%)
Subtotal | | | | | | 84 (>99%)

Left inferior parietal cluster
Left inferior parietal (BA 40) | −36/−47 | −44 | 37/48 | | 126 (>99%) |
Left postcentral (BA 2, 40) | −58/−40 | −33/−19 | 15/37 | | 109 (>99%) |
Subtotal | | | | | 235 (>99%) |

Left parietal/postcentral cluster
Left inferior parietal (BA 40) | −32/−47 | −30/−37 | 42/48 | 39 | |
Left postcentral gyrus (BA 2) | −43/−47 | −19 | 48/53 | 17 | |

Right inferior temporal cluster
Right fusiform gyrus (BA 20) | 40 | −26 | −24 | | 39 (>99%) |
Right amygdala/hippocampus | 29 | −4 | −29 | | 34 (>99%) |
Subtotal | | | | | 73 (>99%) |

The percent values indicate the size of the PCA-based clusters relative to the size of the corresponding standard-analysis clusters. Subtotals may not coincide with the sum of included regions because of rounding and because regions with fewer than 10 voxels are not reported.
children have been found to focus on eyebrows when interpreting
fearful faces (Sullivan and Kirkpatrick, 1996).
Thus, the PCA-based approach proved to work well when applied
to the healthy individuals' data.
Based on the whole-brain analysis of healthy volunteers we were
not able to detect any linear activation trend in the amygdala. This may
be because the whole-brain trend analysis only picks up large
clusters consistently showing a linear trend of BOLD response. In
order to explore the amygdala, we conducted a region of interest
(ROI) analysis, which showed right (but not left) amygdala activation
in the standard analysis, as well as to the eyes and mouth factors (data
available on request).
At the second stage we tested the utility of the approach by
applying the method to the data from patients with schizophrenia.
While the standard analysis showed only a negative cluster in the left
parietal region/postcentral gyrus, the PCA-based analysis produced
several positive and negative clusters of activation pertaining to the
salient facial features, which were not detected when the data were
analyzed in the standard way.
Results of the standard analysis of data from individuals with
schizophrenia are in line with previous evidence of attenuated
BOLD response in these patients when attending to facial
emotional expressions (Phillips et al., 1999). This lack of activation
may be due to a deviant visual scan path, whereby patients with
schizophrenia avoid looking at other persons' eyes or mouth (Green
et al., 2003). The PCA-based approach might overcome this problem
by focusing the analysis on the processing of salient facial features.
Specifically, we found that the neural
response to eyes in visual association regions appeared similar in
patients with schizophrenia and in healthy controls. Moreover, we
detected a negative trend in activation of amygdala/hippocampal
region associated with the increasing degrees of eyes factor intensity.
It is worth mentioning that a negative trend of activation in amygdala
to fearful faces was demonstrated earlier by our group in a
different sample of patients with schizophrenia (Surguladze et al.,
2006). Those results were obtained by comparing a schizophrenia
group with healthy controls using ANOVA. With the new approach we
were able not only to see this trend in the schizophrenia sample per
se, but also to add substantially to it: the schizophrenia group
demonstrated additional activation to eyes in two large clusters
implicated in visual processing (occipito-cerebellar and parietal
regions) that were comparable to those detected in healthy
volunteers. We also found, only in the schizophrenia sample, positive
activation trends in left frontal polar regions in response to variability
in the brows factor. We suggest that this activation reflects an allocation
of attentional resources in patients with schizophrenia to signals of
threat.
Thus, the PCA-based analysis provided an opportunity to examine the
BOLD response to variability in salient facial features. This analytical
approach may therefore help to clarify the functionality of cortical and
subcortical networks involved in emotion processing in individuals
with schizophrenia.
As mentioned above (Dalton et al., 2005), it is possible to account
for the attention-related differences in the brain response by
employing visual scan path (VSP) methodology. Our study addressed
a slightly different issue. In particular, we were interested in the
variability of BOLD response related to the degree of intensity of
facial components representing fearful expressions. Owing to the very
nature of the stimuli used (neutral, mildly fearful and prototypically
fearful faces), we were able to extract distinct factors and then
examine the trends of BOLD response to increasing intensity of
fear pertaining to these facial factors. We suggest that this
methodology could be useful in studies employing varying
degrees of emotional expressions.
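The pipeline described above (extracting orthogonal factors from graded stimuli, then testing trends against fear intensity) can be sketched as follows. This is a toy sketch with NumPy; the measurement matrix, its dimensions, and the first-factor trend test are illustrative assumptions, not the FEEST measurements or the voxel-wise statistics used in the study:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy facial measurements for 30 morphs (10 identities x 3 intensity levels):
# each column stands for a geometric measurement (eye aperture, mouth width, ...).
intensity = np.repeat([0.0, 0.5, 1.0], 10)   # neutral, mildly fearful, prototypical
measurements = rng.standard_normal((30, 12)) + intensity[:, None]

# PCA via SVD on the centred measurement matrix yields orthogonal stimulus factors.
centred = measurements - measurements.mean(axis=0)
U, S, Vt = np.linalg.svd(centred, full_matrices=False)
scores = U * S                               # factor scores per stimulus, one column per factor

# Linear trend of the first factor's scores against fear intensity;
# in the fMRI analysis an analogous trend test is applied to BOLD voxel-wise.
slope, intercept = np.polyfit(intensity, scores[:, 0], 1)
```

Because the factor scores are mutually orthogonal by construction, each factor's trend can be interpreted without the collinearity that raw feature measurements would introduce.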
The study has limitations. First, the gender of the facial stimuli was not
considered. However, basic structural measurements were taken into
account, thus controlling for facial changes other than fear-related
ones. Second, our measurements were performed on a fearful face set,
limiting the extrapolation of the findings to other emotional facial
expressions. Finally, we had to exclude the mouth factor for its strong
resemblance to the standard classification of fear.
To summarize, our approach proved effective in exploring
the brain response to fear-related characteristics of the salient facial
features. We emphasize that this was accomplished without any
manipulation of the parts of the facial stimuli, which thus could
represent an ecologically valid approach compared with chimerical
faces or masked facial parts. Compared with the standard
analysis, the PCA-based method demonstrated a higher
sensitivity, which was of great importance when applied to the
data obtained from individuals with schizophrenia. We therefore
suggest that the PCA-based approach adds to the methodology of
using pictures of facial affect, widely used in emotion research. By
employing PCA, researchers should be able to further probe the
processing of distinct features of the facial stimuli in psychiatric
populations.
Acknowledgments

We thank the staff of the Centre for Neuroimaging Sciences of King's
College London for their assistance with the study.
Financial disclosures: Dr McDonald received support from a
Medical Research Council (UK) Pathfinder Award. Dr El-Hage was
supported by Servier and the French Association of Biological
Psychiatry (AFPB). Dr David received a research grant from Janssen-
Cilag. Dr Phillips is supported by NIMH grant R01 MH076971-01. The
funding sources had no involvement in influencing the study design;
in the collection, analysis and interpretation of data; in the writing of
the report; or in the decision to submit the paper for publication.
J. Radua, T. Russell, N. Lawrence, N. Marshall, S. Kalidindi, V. Giampietro,
M. Brammer and S. Surguladze reported no biomedical financial
interests or potential conflicts of interest.
Appendix A. Supplementary data
Supplementary data associated with this article can be found, in
the online version, at doi:10.1016/j.neuroimage.2009.08.030.
References

American Psychiatric Association, 2000. Diagnostic and Statistical Manual of Mental
Disorders, DSM-IV-TR. American Psychiatric Association, Washington, DC.
Ashwin, C., Baron-Cohen, S., Wheelwright, S., O'Riordan, M., Bullmore, E.T., 2007.
Differential activation of the amygdala and the 'social brain' during fearful face-
processing in Asperger syndrome. Neuropsychologia 45, 2–14.
Blair, R.J., Morris, J.S., Frith, C.D., Perrett, D.I., Dolan, R.J., 1999. Dissociable neural
responses to facial expressions of sadness and anger. Brain 122, 883–893.
Breiter, H.C., Etcoff, N.L., Whalen, P.J., Kennedy, W.A., Rauch, S.L., Buckner, R.L., Strauss,
M.M., Hyman, S.E., Rosen, B.R., 1996. Response and habituation of the human
amygdala during visual processing of facial expression. Neuron 17, 875–887.
British Medical Association and Royal Pharmaceutical Society of Great Britain, 2006.
British National Formulary. BMJ Publishing Group Ltd., London.
Calder, A.J., Burton, A.M., Miller, P., Young, A.W., Akamatsu, S., 2001. A principal
component analysis of facial expressions. Vis. Res. 41, 1179–1208.
Costafreda, S.G., Brammer, M.J., David, A.S., Fu, C.H.Y., 2008. Predictors of amygdala
activation during the processing of emotional stimuli: a meta-analysis of 385 PET
and fMRI studies. Brain Res. Rev. 58, 57–70.
Dalton, K.M., Nacewicz, B.M., Johnstone, T., Schaefer, H.S., Gernsbacher, M.A.,
Goldsmith, H.H., Alexander, A.L., Davidson, R.J., 2005. Gaze fixation and the neural
circuitry of face processing in autism. Nat. Neurosci. 8, 519–526.
Deeley, Q., Daly, E.M., Surguladze, S., Page, L., Toal, F., Robertson, D., Curran, S.,
Giampietro, V., Seal, M., Brammer, M.J., Andrew, C., Murphy, K., Phillips, M.L.,
Murphy, D.G., 2007. An event related functional magnetic resonance imaging study
of facial emotion processing in Asperger syndrome. Biol. Psychiatry 62, 207–217.
Drevets, W.C., Raichle, M.E., 1998. Reciprocal suppression of regional cerebral blood
flow during emotional versus higher cognitive processes: implications for
interactions between emotion and cognition. Cogn. Emot. 12, 353–385.
Ekman, P., Friesen, W.V., 1978. Manual of the Facial Action Coding System (FACS).
Consulting Psychologists Press, Palo Alto, CA.
J. Radua et al. / NeuroImage 49 (2010) 939–946
Estes, R.D., 1992. The Behavior Guide to African Mammals. University of California
Press, Los Angeles.
Gentili, C., Gobbini, M.I., Ricciardi, E., Vanello, N., Pietrini, P., Haxby, J.V., Guazzelli, M.,
2008. Differential modulation of neural activity throughout the distributed neural
system for face perception in patients with social phobia and healthy subjects.
Brain Res. Bull. 77 (5), 286–292.
Goldberg, D., Murray, R., 2006. Maudsley Handbook of Practical Psychiatry. Oxford
University Press, Inc., New York.
Green, M.J., Phillips, M.L., 2004. Social threat perception and the evolution of paranoia.
Neurosci. Biobehav. Rev. 28, 333–342.
Green, M.J., Williams, L.M., Davidson, D., 2003. Visual scanpaths to threat-related faces
in deluded schizophrenia. Psychiatry Res. 119, 271–285.
Harris, A., Aguirre, G.K., 2008. The representation of parts and wholes in face-selective
cortex. J. Cogn. Neurosci. 20, 863–878.
Haxby, J.V., Hoffman, E.A., Gobbini, M.I., 2002. Human neural systems for face
recognition and social communication. Biol. Psychiatry 51, 59–67.
Hoffman, E.A., Haxby, J.V., 2000. Distinct representations of eye gaze and identity in the
distributed human neural system for face perception. Nat. Neurosci. 3, 80–84.
Horley, K., Williams, L.M., Gonsalvez, C., Gordon, E., 2003. Social phobics do not see eye
to eye: a visual scanpath study of emotional expression processing. J. Anxiety.
Disord. 17, 33–44.
Kay, S.R., Fiszbein, A., Opler, L.A., 1987. The positive and negative syndrome scale
(PANSS) for schizophrenia. Schizophr. Bull. 13, 261–276.
Maurer, D., O'Craven, K.M., Le Grand, R., Mondloch, C.J., Springer, M.V., Lewis, T.L.,
Grady, C.L., 2007. Neural correlates of processing facial identity based on features
versus their spacing. Neuropsychologia 45, 1438–1451.
Morris, J.S., deBonis, M., Dolan, R.J., 2002. Human amygdala responses to fearful eyes.
NeuroImage 17, 214–222.
Morris, J.S., Friston, K.J., Buchel, C., Frith, C.D., Young, A.W., Calder, A.J., Dolan, R.J., 1998.
A neuromodulatory role for the human amygdala in processing emotional facial
expressions. Brain 121 (Pt 1), 47–57.
Ogawa, S., Lee, T.M., Kay, A.R., Tank, D.W., 1990. Brain magnetic resonance imaging with
contrast dependent on blood oxygenation. Proc. Natl. Acad. Sci. U. S. A. 87,
Ogrocki, P.K., Hills, A.C., Strauss, M.E., 2000. Visual exploration of facial emotion by
healthy older adults and patients with Alzheimer disease. Neuropsychiatry
Neuropsychol. Behav. Neurol. 13, 271–278.
Pelphrey, K.A., Sasson, N.J., Reznick, J.S., Paul, G., Goldman, B.D., Piven, J., 2002. Visual
scanning of faces in autism. J. Autism Dev. Disord. 32, 249–261.
Phan, K.L., Fitzgerald, D.A., Nathan, P.J., Tancer, M.E., 2006. Association between
amygdala hyperactivity to harsh faces and severity of social anxiety in generalized
social phobia. Biol. Psychiatry 59, 424–429.
Phillips, M.L., Williams, L., Senior, C., Bullmore, E.T., Brammer, M.J., Andrew, C.,
Williams, S.C., David, A.S., 1999. A differential neural response to threatening and
non-threatening negative facial expressions in paranoid and non-paranoid
schizophrenics. Psychiatry Res. 92, 11–31.
Puce, A., Allison, T., Bentin, S., Gore, J.C., McCarthy, G., 1998. Temporal cortex activation
in humans viewing eye and mouth movements. J. Neurosci. 18, 2188–2199.
Rauch, S.L., Whalen, P.J., Shin, L.M., McInerney, S.C., Macklin, M.L., Lasko, N.B., Orr, S.P.,
Pitman, R.K., 2000. Exaggerated amygdala response to masked facial stimuli in
posttraumatic stress disorder: a functional MRI study. Biol. Psychiatry 47, 769–776.
Riby, D.M., Doherty-Sneddon, G., Bruce, V., 2008. The eyes or the mouth? Feature
salience and unfamiliar face processing in Williams syndrome and autism. Q. J. Exp.
Rotshtein, P., Geng, J.J., Driver, J., Dolan, R.J., 2007. Role of features and second-order
spatial relations in face discrimination, face recognition, and individual face skills:
behavioral and functional magnetic resonance imaging data. J. Cogn. Neurosci. 19,
Schyns, P.G., Petro, L.S., Smith, M.L., 2007. Dynamics of visual information integration in
the brain for categorizing facial expressions. Curr. Biol. 17, 1580–1585.
Sheline, Y.I., Barch, D.M., Donnelly, J.M., Ollinger, J.M., Snyder, A.Z., Mintun, M.A., 2001.
Increased amygdala response to masked emotional faces in depressed subjects
resolves with antidepressant treatment: an fMRI study. Biol. Psychiatry 50,
Shimizu, T., Shimizu, A., Yamashita, K., Iwase, M., Kajimoto, O., Kawasaki, T., 2000.
Comparison of eye-movement patterns in schizophrenic and normal adults during
examination of facial affect displays. Percept. Mot. Skills 91, 1045–1056.
Sprengelmeyer, R., Rausch, M., Eysel, U.T., Przuntek, H., 1998. Neural structures
associated with recognition of facial expressions of basic emotions. Proc. Biol. Sci.
Sullivan, L.A., Kirkpatrick, S.W., 1996. Facial interpretation and component consistency.
Genet. Soc. Gen. Psychol. Monogr. 122, 389–404.
Surguladze, S., Russell, T., Kucharska-Pietura, K., Travis, M.J., Giampietro, V., David, A.S.,
Phillips, M.L., 2006. A reversal of the normal pattern of parahippocampal response
to neutral and fearful faces is associated with reality distortion in schizophrenia.
Biol. Psychiatry 60, 423–431.
Surguladze, S.A., Brammer, M.J., Young, A.W., Andrew, C., Travis, M.J., Williams, S.C.,
Phillips, M.L., 2003. A preferential increase in the extrastriate response to signals of
danger. NeuroImage 19, 1317–1328.
Talairach, J., Tournoux, P., 1988. Co-Planar Stereotaxic Atlas of the Human Brain.
Thieme, New York.
Tian, Y., Kanade, T., Cohn, J.F., 2001. Recognizing action units for facial expression
analysis. IEEE Trans. Pattern Anal. Mach. Intell. 23.
Vuilleumier, P., Armony, J.L., Driver, J., Dolan, R.J., 2001. Effects of attention and emotion
on face processing in the human brain: an event-related fMRI study. Neuron 30,
Walker-Smith, G.J., Gale, A.G., Findlay, J.M., 1977. Eye movement strategies involved in
face perception. Perception 6, 313–326.
Whalen, P.J., Kagan, J., Cook, R.G., Davis, F.C., Kim, H., Polis, S., McLaren, D.G., Somerville,
L.H., McLean, A.A., Maxwell, J.S., Johnstone, T., 2004. Human amygdala responsivity
to masked fearful eye whites. Science 306, 2061.
Williams, L.M., Loughland, C.M., Gordon, E., Davidson, D., 1999. Visual scanpaths in
schizophrenia: is there a deﬁcit in face recognition? Schizophr. Res. 40, 189–199.
Young, A., Perrett, D., Calder, A., Sprengelmeyer, R., Ekman, P., 2002. Facial Expressions
of Emotion: Stimuli and Tests (FEEST). Thames Valley Test Company, Bury St.