NSY 2300 1–8
Neuropsychologia xxx (2006) xxx–xxx
Analysis of face gaze in autism using “Bubbles”
Michael L. Spezio a, Ralph Adolphs a,∗, Robert S.E. Hurley b, Joseph Piven b
a Division of Humanities and Social Sciences, California Institute of Technology, HSS 228-77, Caltech, Pasadena, CA 91125, USA
b Neurodevelopmental Disorders Research Center, University of North Carolina, Chapel Hill, NC 27599, USA
One of the components of abnormal social functioning in autism is an impaired ability to direct eye gaze onto other people’s faces in social
situations. Here, we investigated the relationship between gaze onto the eye and mouth regions of faces, and the visual information that was present
within those regions. We used the “Bubbles” method to vary the facial information available on any given trial by revealing only small parts of the
face, and measured the eye movements made as participants viewed these stimuli. Compared to ten IQ- and age-matched healthy controls, eight
participants with autism showed less fixation specificity to the eyes and mouth, a greater tendency to saccade away from the eyes when information
was present in those regions, and abnormal directionality of saccades. The findings provide novel detail to the abnormal way in which people with
autism look at faces, an impairment that likely influences all subsequent face processing.
© 2006 Published by Elsevier Ltd.
Keywords: Social cognition; Emotion; Autism; HFA; Eyetracking; Facial information
Autism is a neurodevelopmental disorder strongly characterized by abnormal social functioning (1999; Kanner, 1943; Siegel, Vukicevic, & Spitzer, 1990), a dysfunction that is present even when intellectual ability is within
the normal range. Because high functioning children and adults
with autism show (Baron-Cohen et al., 1999; Buitelaar, van
Engeland, de Kogel, de Vries, & van Hooff, 1991; Carpenter,
Pennington, & Rogers, 2002; Castelli, Frith, Happe, & Frith,
2002; Loveland, Pearson, Tunali-Kotoski, Ortegon, & Gibbs,
2001; Ozonoff & Miller, 1995; Pedersen, Livoir-Petersen, &
Schelde, 1989; Rogers, 2000; Rogers, Hepburn, Stackhouse, &
Wehner, 2003) and report (Gilpin, 2002; Grandin, 1996) difficulties in social functioning (e.g., deciding on appropriate social behaviors, etc.), a main focus of research has been their processing of
salient social cues, notably from faces.
There has been a considerable amount of work using static images of faces (Celani,
Battacchi, & Arcidiacono, 1999; Critchley et al., 2000; Grelotti,
Gauthier, & Schultz, 2002; Trepagnier, Sebrechts, & Peterson,
2002; Volkmar, Sparrow, Rende, & Cohen, 1989; Weeks &
∗Corresponding author. Tel.: +1 626 395 4486; fax: +1 626 793 8580.
Hobson, 1987; Trepagnier et al., 2002; van der Geest, Kemner, Camfferman,
Verbaten, & van Engeland, 2002a; van der Geest, Kemner,
Verbaten, & van Engeland, 2002b). Weeks and Hobson (1987)
showed that children with autism sorted static images of emotional facial expressions using non-emotional characteristics, whereas control children matched for verbal ability sorted the faces mainly according to
emotional expression. The difference in behavior was unlikely
to be due to impairments in simple object recognition, since the children with autism performed normally in assembling puzzles displaying pictures of human
faces. Celani et al. (1999) found that children with autism were
impaired both at matching facial emotions in static faces and
at judging valence from static facial expressions, compared to
et al. (2002) found that high functioning people with autism,
while being normal or better than normal at object recognition,
were impaired in facial recognition.
Eyetracking studies of children and adults with autism have
typically found abnormally infrequent gaze to the eyes and
abnormally frequent gaze to the mouth (Klin, Jones, Schultz,
Volkmar, & Cohen, 2002b; but see van der Geest et al., 2002a).
Pelphrey et al. (2002), in a study of high functioning adults with
autism, showed increased gaze to nonfeatural elements of static
faces and decreased gaze to facial features (e.g., eyes, nose,
mouth) compared to controls. Trepagnier et al. (2002) found
that high functioning people with autism showed less gaze onto
faces overall than controls.
Critchley and co-workers (2000) identified neurobiological abnormalities during the processing of emotional facial images: unlike in healthy brains, the
fusiform face area, the left amygdala, and the left cerebellum
failed to show significant activations during implicit process-
ing of faces. Ogai et al. (2003) were also able to differentiate
high functioning people with autism from controls using neu-
roimaging during face processing. Although they did not find
any behavioral differences in the ability to accurately judge the
emotion in a face, they did show reduced activation in the insula
elicited by disgust faces and reduced activation in the middle
frontal gyrus in response to fear faces.
The literature thus documents impairments in eye gaze, in
social judgments, and in brain activation when people with autism view faces. Because impaired face gaze might account for impairments in the other domains, we undertook a detailed assessment of face gaze behavior. Here, we probed several aspects of
face gaze behavior during emotion judgment in autism. Specifi-
cally, we focused on the relationship between, on the one hand,
fixations to and saccades away from the eyes and mouth, and
on the other hand, the visual information present within those regions.
First, we examined whether people with autism would show
less specific fixation behavior to the eyes and mouth, such that
their fixations to those regions would not be as strongly asso-
ciated with information in the regions. Given that people with
autism are reported to make more fixations to the mouth and
less to the eyes (Klin et al., 2002b), compared to controls, one
would expect that their fixations to the mouth would be less
specific than those of controls. Second, we examined the rela-
tionship between saccades away from the eyes and mouth and
the information within those regions. Given evidence for direct
gaze aversion in autism (Dalton et al., 2005; Hutt & Ounsted, 1966; Richer & Coss, 1976), we predicted that when people
with autism make saccades away from the eyes, there would be
greater information in the eye areas, compared to when controls
make saccades away from the eyes. Controls, of course, would
be expected not to make saccades away from the eyes when
task-relevant information is present there. Finally, we examined
whether people with autism make saccades away from the eyes
and mouth in the same general direction as do controls. Given
known facial processing deficits in autism, one would expect
that people with autism would show abnormal directionality in
saccades away from the eyes and mouth, compared to controls.
Our approach, utilizing the “Bubbles” method (Gosselin &
Schyns, 2001), combines the ease of static facial stimuli with
an approach that allows the visual information in the face on
each trial to be varied randomly. During “Bubbles,” a given trial
shows only randomly revealed areas of the face, determined by
the number of “bubbles,” or Gaussian holes in a mask covering the underlying, or base, image. This mask is called the “Bubbles” mask. The more bubbles there are, the greater the portion of the face that is revealed to a viewer. Averaging “Bubbles”
masks in a parameter-specific manner across all the trials (i.e.,
across emotions as well) yields an image, called the “diagnos-
tic image,” that depicts which areas of the face, on average,
were associated most with the parameter under investigation.
For example, if people fixated the ears consistently, and not the
eyes and mouth, when the ears were revealed by the “Bubbles”
method, an analysis driven by fixations to the ears would yield
an image prominently showing the ears but missing the eyes
and mouth. So what is seen in a “Bubbles” diagnostic image is
the information associated with the behavioral parameter under investigation. This property of the method allowed us to probe detailed aspects of face gaze behavior.
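To make the procedure concrete, the revealed-area mask for a single trial can be sketched as follows (a minimal Python illustration; the image size, bubble count, and aperture width below are placeholder values, not the study's exact parameters):

```python
import numpy as np

def bubbles_mask(shape, n_bubbles, sigma, rng):
    """Sum of Gaussian apertures ("bubbles") at random centers.

    Values are clipped to [0, 1]; 1 means fully revealed.
    """
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]
    mask = np.zeros(shape)
    # Bubble centers, uniformly distributed over the image.
    centers = rng.integers(0, [h, w], size=(n_bubbles, 2))
    for cy, cx in centers:
        mask += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma ** 2))
    return np.clip(mask, 0.0, 1.0)

rng = np.random.default_rng(0)
mask = bubbles_mask((256, 256), n_bubbles=20, sigma=8.0, rng=rng)
face = rng.random((256, 256))   # stand-in for a base face image
revealed = face * mask          # only the bubbled regions survive
```

More bubbles mean a larger clipped region of the mask near 1, and hence more of the face revealed, which is the quantity the adaptive procedure below manipulates.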
1. Methods

1.1. Research participants
Research methods were conducted with the approval of the Institutional
Review Boards at the California Institute of Technology and the University of
North Carolina. Eight high functioning male participants with autism (HFA)
were recruited through the Subject Registry of the Neurodevelopmental Disor-
ders Research Center (NDRC) at the University of North Carolina, where they
were tested. All HFA participants met DSM-IV/ICD-10 diagnostic criteria for
autism, and all met the cutoff scores for autism on both the Autism Diagnostic
Interview (LeCouteur, Rutter, & Lord, 1989) and the Autism Diagnostic Observation Schedule. IQ was assessed with the Wechsler Abbreviated Scale of Intelligence (WASI™). The HFA group had a
mean age of 23 years (20, 22, 21, 26, 20, 20, 18, 40), and mean IQ values of 106
verbal (108, 77, 122, 74, 120, 130, 87, 131), 102 performance (111, 118, 104,
97, 91, 119, 82, 94), and 104 full scale (111, 96, 115, 83, 106, 128, 83, 112).¹
Ten male participants were enrolled as controls (C) and tested at Caltech with
the same protocols as were used for the HFA participants. Control participants
had no history of neurological or psychiatric disease or pervasive developmen-
tal disorder or other evidence of developmental disability, or family history of
autism. Controls had a mean age of 28 years (20, 20, 22, 22, 22, 40, 39, 34, 32,
35), and mean IQ values of 101 verbal (83, 76, 81, 123, 104, 109, 121, 105, 95,
117), 111 performance (93, 106, 98, 119, 118, 106, 119, 109, 121, 119), and
106 full scale (86, 88, 88, 125, 111, 109, 124, 108, 108, 118). There was no
significant difference between the HFA group and controls in age, or in verbal,
performance, or full-scale IQ (p>0.1 for each comparison, Wilcoxon rank-sum
test). All participants had normal or corrected-to-normal vision at testing time.
All eyetracking data and button responses were recorded using the Eye-
link II head-mounted eyetracking system (SR Research, Osgoode, Ontario).
Eyetracking data were recorded either at 250 Hz, when a stable corneal reflection was obtainable for a given participant, or at 500 Hz, when pupil-only
recording was used. These two different sampling rates had no effect on the
results. New nine-point calibrations and validations were performed prior to the
start of each experiment in a participant’s session. Accuracy in the validations
typically was better than 0.5° of visual angle. Experiments were run under Windows XP (Microsoft, Inc.) in Matlab (Mathworks, Inc., Natick, MA) using the
Psychophysics Toolbox (Brainard, 1997; Pelli, 1997) and the Eyelink Toolbox
(Cornelissen, Peters, & Palmer, 2002).
¹ High functioning autism is conventionally defined by IQ scores >70. Thus, although our HFA sample included two individuals whose
verbal IQs were in the 70s, they belong to the HFA population, especially given
that their full-scale IQs are >80. Note that three of the individuals in our control
sample had full-scale IQs <90, to ensure IQ comparability.
1.2. Stimuli and task

Judgment of facial expressions in the “Bubbles” task used faces with randomly revealed regions as previously described (Gosselin & Schyns, 2001).
Briefly, on each trial, a randomly selected base facial image was first decomposed into six spatial frequency levels using an image-pyramid toolbox for Matlab (Portilla & Simoncelli, 2000), with a Gaussian filter subtending 1° of visual angle (11w×11h pixels). Levels one through five were then filtered
with a number of bubbles whose centers were randomly distributed across the
image. These bubbles are collectively described as the “Bubbles” mask for a
given spatial frequency on a given trial. After filtering, levels one through five
were combined with a standard background corresponding to the sixth level,
and the resulting image was presented. The number of bubbles was adjusted for
each participant on a trial by trial basis in order to maintain performance accu-
racy of response near 80% correct. Note that bubbles were allowed to overlap,
increasing the amount of the face revealed beyond the size of a single bubble.
Base stimuli (256w×256h; pixel units) were cropped from four Ekman faces
(Ekman & Friesen, 1976), each of a different posing participant (image codes:
A1-6, JB1-9, JJ5-13, MF1-27), and balanced for gender and facial expression
(2 fearful, 2 happy, 2 male, 2 female). Images were normalized for magnitude
across all spatial frequencies and centrally displayed using a monitor resolution
of 640w×480h (pixel units) on a 15.9 in. w × 11.9 in. h monitor, at an eye-to-screen distance of approximately 31 in., thus subtending 11.3° of horizontal visual angle.
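The trial-by-trial adjustment of the number of bubbles can be approximated by a weighted up/down staircase. The sketch below (Python) shows one standard rule that converges near 80% correct; it is offered as an illustration, not as the study's exact procedure, and the step sizes and bounds are assumptions:

```python
def update_n_bubbles(n, correct, n_min=1, n_max=200):
    """Weighted up/down rule (Kaernbach, 1991-style): a correct response
    removes 1 bubble (harder); an error adds 4 (easier). The rule settles
    where 1 * P(correct) = 4 * P(error), i.e. near 80% correct."""
    n = n - 1 if correct else n + 4
    return max(n_min, min(n_max, n))

n = 40
for correct in [True, True, False, True]:
    n = update_n_bubbles(n, correct)
# n is now 40 - 1 - 1 + 4 - 1 = 41
```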
A given trial lasted the time it took participants to decide whether the face
showed fear or happiness (Adolphs et al., 2005), for a maximal decision time of
10s following image onset. Participants were asked to judge whether the bub-
bled face they saw was afraid or happy, by pushing a button. All participants
completed 512 trials. On every fifth trial, a circular annulus was centrally dis-
played and participants were given an opportunity to rest. When they decided
to continue, they fixated the annulus and simultaneously pressed a key. This
advanced the experiment to the next trial and allowed the system to correct
for any drift in eyetracking accuracy. Participants were instructed to decide as
quickly as possible and to always make a decision, even if it was a best guess.
1.3. Analysis of performance and gaze behavior
Eyetracking data were analyzed for fixations using the Eyelink DataViewer
(SR Research, Osgoode, Ontario). Data were collected for both eyes and gaze
coordinates for a given datapoint were calculated by taking the average of the
coordinates for both eyes. In discriminating fixations, we set saccade velocity,
acceleration, and motion thresholds to 30°/s, 9500°/s², and 0.15°, respectively.
Regions of interest (ROIs) were drawn for each facial image, using the drawing tools in DataViewer: the right eye region (including the right eye and the eye socket around it), the left eye region (including the left eye and the eye socket around it), and the mouth. The
designations right and left are anatomical, and not from the perspective of the viewer.
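The velocity criterion at the heart of such a parser can be illustrated with a minimal sketch (Python; toy data, not the Eyelink algorithm itself, which also applies the acceleration and motion thresholds):

```python
import numpy as np

def saccade_samples(x_deg, y_deg, rate_hz, vel_thresh=30.0):
    """Flag samples whose instantaneous gaze speed exceeds vel_thresh
    (deg/s); the remaining samples are fixation candidates."""
    vx = np.gradient(np.asarray(x_deg)) * rate_hz
    vy = np.gradient(np.asarray(y_deg)) * rate_hz
    return np.hypot(vx, vy) > vel_thresh

# Toy 250 Hz trace: steady fixation, a 2-degree horizontal jump, fixation.
x = np.array([0.0] * 10 + [2.0] * 10)
y = np.zeros(20)
flags = saccade_samples(x, y, rate_hz=250)   # True around the jump only
```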
1.4. Association between face gaze and facial information in the
eyes and mouth
Each trial in the “Bubbles” paradigm reveals to a participant some areas of
the face while obscuring others. To determine the extent of group differences in the association of fixations to the eyes and mouth with facial information revealed in these regions, we first calculated, for each
region of interest, a fixation-dependent “Bubbles” mask. A fixation-dependent
“Bubbles” mask for a given region was calculated by summing all “Bubbles”
masks for trials on which a fixation was made to the region. We then subtracted
one group’s fixation-dependent “Bubbles” mask from that of the other group,
for each region.
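In code, the fixation-dependent mask and its group difference amount to a conditional sum over trials. A schematic Python version (array shapes, trial counts, and ROI labels below are made up for illustration):

```python
import numpy as np

def fixation_dependent_mask(masks, fixated_roi, roi):
    """Sum the per-trial "Bubbles" masks over trials with a fixation to `roi`.

    masks:        array (n_trials, h, w) of per-trial revealed-area masks
    fixated_roi:  length-n_trials sequence of ROI labels fixated on each trial
    """
    keep = np.array([f == roi for f in fixated_roi])
    return masks[keep].sum(axis=0)

# Group difference for one ROI: subtract one group's fixation-dependent
# mask from the other's, each built from that group's own trials.
rng = np.random.default_rng(1)
masks_hfa = rng.random((50, 64, 64))
masks_ctl = rng.random((50, 64, 64))
fix_hfa = rng.choice(["mouth", "left_eye", "right_eye"], size=50)
fix_ctl = rng.choice(["mouth", "left_eye", "right_eye"], size=50)
diff = (fixation_dependent_mask(masks_hfa, fix_hfa, "mouth")
        - fixation_dependent_mask(masks_ctl, fix_ctl, "mouth"))
```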
In order to select regions of statistically significant difference, we converted all pixel values in a difference mask into Z-scores relative to that mask's mean and standard deviation. The statistical analyses of the Z-scored classification
image² proceeded by a recently developed method (Chauvin, Worsley, Schyns, & Gosselin, in press) that uses the same approach as that used for the statistical analysis of significant clusters of activation in fMRI and PET data (Friston et al.). After smoothing with a Gaussian filter having sigma = 5, we subjected this Z-scored classification image to cluster-level significance testing, resulting in a group diagnostic difference image for each region, showing which group
demonstrated greater association between information in facial regions and fix-
ations to those regions.
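The thresholding step can be sketched as follows (Python; the smoothing step is omitted and a simple contiguous-cluster size filter stands in for the Chauvin et al. cluster statistic, so the threshold values here are illustrative only):

```python
import numpy as np
from scipy import ndimage

def significant_clusters(diff, z_thresh=3.09, min_size=20):
    """Z-score a difference mask against its own mean/SD, threshold at
    z_thresh (roughly p < .001, one-tailed), and keep only clusters of
    at least min_size contiguous pixels."""
    z = (diff - diff.mean()) / diff.std()
    above = z > z_thresh
    labels, n = ndimage.label(above)       # connected-component labeling
    out = np.zeros_like(above)
    for i in range(1, n + 1):
        cluster = labels == i
        if cluster.sum() >= min_size:      # discard sub-threshold clusters
            out |= cluster
    return z, out
```

Pixels surviving both the height and the cluster-size criterion would then be displayed as the diagnostic difference image for that region.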
We used the procedure described above to determine statistically significant
group differences, resulting in diagnostic difference images for each region. For
each group, these images showed those regions more associated with fixation to
a given region, compared to the other group.
All trials were used for this analysis, including all fixations that occurred
between 50 ms following image onset and the end of the trial (i.e., image offset).
1.5. Association of saccades from the eyes and mouth with facial information
We followed the same procedure as described, but modified to examine
saccade-related facial information. To calculate a saccade-dependent “Bubbles”
mask for a given region, we summed all “Bubbles” masks for trials on which a saccade was made leaving that region.
1.6. Directional analysis of saccades from the eyes and mouth
To determine whether there were group differences in the directionality of saccades, we analyzed all saccades immediately following fixations to the right eye region, left eye region, and mouth, that occurred between 50 ms following image onset and the end of the trial and that ended outside of the region.
Circular rose plots were used for descriptive purposes. Rose plots are his-
tograms displaying the saccade angle in bins and the number of saccades
in a bin. We used a bin size of 5°. Calculation of circular means and dispersions and non-parametric statistical differences proceeded using circular statistics (Fisher, 1995) implemented in the PhasePACK toolbox in Matlab.
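The descriptive quantities can be sketched in Python (the circular mean and the 5° binning only; the circular dispersion and the non-parametric tests of Fisher (1995) are not reproduced here):

```python
import numpy as np

def circular_mean_deg(angles_deg):
    """Mean direction of a set of saccade angles, in degrees [0, 360)."""
    a = np.deg2rad(np.asarray(angles_deg))
    return np.rad2deg(np.arctan2(np.sin(a).sum(), np.cos(a).sum())) % 360

def rose_counts(angles_deg, bin_width=5):
    """Histogram of saccade directions in bins of `bin_width` degrees,
    as used for the descriptive rose plots."""
    edges = np.arange(0, 360 + bin_width, bin_width)
    counts, _ = np.histogram(np.asarray(angles_deg) % 360, bins=edges)
    return counts, edges
```

Note that the circular mean handles the wrap-around at 0°/360° correctly: the mean of 350° and 10° is 0°, not 180°, which is why ordinary arithmetic means cannot be used for saccade directions.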
2. Results

Findings regarding performance on the “bubbles” discrimination task (i.e., accuracy, reaction time), the use of facial
information on which this performance was based (as revealed by the “Bubbles” analysis), and overall fixation patterns are summarized here only for background reference. Briefly, there
were no group differences in accuracy, reaction time, the num-
ber of bubbles required for the task, or overall fixation to either
eye: HFA subjects performed entirely normally on these mea-
sures. Despite this equivalent overall performance, there were
statistically significant group differences in how specific facial information was used: this analysis revealed that the HFA group had a greater reliance on the mouth and a decreased use of both eyes. In addition to this different
use of facial information in the “bubbles” task, the HFA group
also showed an overall increased fixation to the mouth, com-
pared to controls. The aim of the present study, however, was
not to analyze emotion discrimination performance or global
fixation tendencies, but rather specifically to investigate the fix-
ations made onto facial features as a function of what features
were actually revealed in the “bubbles” image.
Fig. 1. Fixations and facial information in the eyes and mouth (a–f). The information associated with gaze fixation to the right eye region (a,d), the left eye region
(b,e), and the mouth (c,f), compared to gaze fixation to the other two regions (g–l). Group differences in facial information associated with gaze fixation to the right
eye region (g,j), the left eye region (h,k), and the mouth (i,l). Note that these images depict statistically thresholded differences; the facial features shown are thus
those that differed significantly (p<0.001 with a cluster threshold t=2.5) in their use between the two subject groups.
2.1. Fixations and facial information in the eyes and mouth
We sought to determine whether people with autism were
more or less likely than controls to demonstrate an associa-
tion between gaze to the eyes and mouth and the information
displayed in those regions. The “Bubbles” method varies the
amount of information in a given region of a face on each trial,
allowing us to determine the average amount of information
present in a region when a participant looked at that region,
compared to when the participant looked at the other regions.
We hypothesized that controls would be fairly specific in
their face gaze. That is, we expected that when they looked at
the eyes, there would, on average, be more information in the
eyes than when they looked at the mouth, and vice versa. How-
ever, we hypothesized that the HFA group would not show the
same gaze specificity for gaze to the mouth as was shown by
controls. That is, we expected the HFA group to gaze at the
mouth even when information was available in the eyes. We
calculated diagnostic difference images to determine the group differences. (In the following, the names of defined regions of interest during fixation analyses are capitalized, while facial features in general are in lower case.)
The results are shown in Fig. 1a–f for HFA-Controls
(Fig. 1a–c) and Controls-HFA (Fig. 1d–f). There was a slight
group difference for information associated with gaze to the
mouth (Fig. 1c,f). The HFA group showed greater information
in the left eye during gaze to the mouth, compared to controls
(i.e., for those trials in which HFA participants looked at the
mouth, there was more information available in the left eye
than for those trials in which controls looked at the mouth).
Controls, on the other hand, showed greater information in
the mouth associated with gaze to the Mouth, compared to
the HFA group. This suggests both that the HFA group was
slightly less dependent upon information in the mouth area in
making fixations to that area and that the HFA group fixated
the mouth when there was information present in the left eye.
Thus, the HFA group showed less gaze specificity to the mouth.
A weaker group difference was seen for the right eye region.
Here, the HFA group showed decreased information at low
spatial frequencies in the right eye during gaze to this region,
again suggesting a decrease in gaze specificity. No group difference in the eyes or mouth was seen for the left eye region.
2.2. Saccades and facial information in the eyes and mouth
To examine whether the HFA group showed a greater propensity to make eye movements away from fixations to the eyes,
compared to controls, we analyzed the association between
facial information in the eyes and saccades from these areas.
We proceeded using the approach described above (see Meth-
ods). Since some reports suggest that people with autism find
direct eye gaze aversive (Dalton et al., 2005; Hutt & Ounsted,
1966; Richer & Coss, 1976), we predicted that on trials when
the HFA group made saccades out of the eye regions, we would
find greater information in the eyes, compared to when con-
trols made saccades out of the eye regions. Fig. 2a,b shows this
result. When the HFA group made saccades leaving the right
eye region (Fig. 2a) or the left eye region (Fig. 2b), there was
more information in those areas, and only in those areas, than
when controls made the same saccades. Controls did not show
any greater information in the eyes or mouth (Fig. 2d,e). Inter-
estingly, there was also a greater amount of information in both
eyes when the HFA group made saccades leaving the mouth,
compared to controls (Fig. 2c).
Fig. 2. Saccades and facial information in the eyes and mouth. Shown are group differences in the information associated with saccades leaving the right eye region
(a,d), the left eye region (b,e), and the mouth (c,f).
2.3. Directional analysis of saccades leaving the eyes and mouth
Given that the HFA group showed an increased tendency
compared to controls to make saccades leaving the eyes when
there was information in the eyes, we sought to determine
whether the direction of these saccades was also abnormal. We
analyzed all saccades leaving the eyes and mouth that immediately followed fixations in those regions. We found no directional difference for the right eye region (HFA, 330° ± 46°; C, 330° ± 61°; p > 0.1; M ± circular dispersion), shown in
Fig. 3a,d. However, there was a difference for the left eye region
(Fig. 3b,e), such that the HFA group made a greater propor-
tion of saccades in the direction of the mouth than did controls
(HFA, 223° ± 52°; C, 214° ± 42°; p < 0.002). We also found a
difference for saccades leaving the Mouth (Fig. 3c,f), in that the
HFA group showed a greater propensity to make saccades in the
Fig. 3. Directionality of saccades leaving the eyes and mouth. Each circular, or “rose,” plot is a histogram of the number of times a saccade leaving a region of interest was in a given direction; bin size was 5°. Shown are data for the right eye region, the left eye region, and the mouth, pooling all saccades leaving the regions immediately following fixations in the region, for the HFA group (a–c) and controls (d–f).
direction of one eye, whereas controls showed nearly equal tendencies to make saccades toward both eyes (C, 87° ± 17°; p < 0.0001). Taken together, these findings
suggest that the HFA group showed saccade behavior that was
different from controls even when both groups fixated the same
key facial regions.
3. Discussion

We investigated how the information revealed in different features of the face affects face gaze during emotion
judgment in autism. We isolated several face processing impair-
ments in people with autism by employing a novel approach
to facial information processing, compared to a group matched
for IQ, performance accuracy, and reaction time. We showed
that individuals with autism were distinguished from controls
in that they exhibited less specificity of fixation to the mouth,
an increased tendency to make saccades away from informa-
tion in fixated eye regions, and abnormal saccade directionality
leaving the left eye and mouth. All face gaze abnormalities
were observed in the absence of group differences in accuracy
and reaction time. Thus, eyetracking in combination with the “Bubbles” method revealed specific abnormalities in how people with autism process faces, and those
abnormalities could not be attributed simply to impaired perfor-
mance accuracy on the task.
These impairments were revealed using facial expressions
of fear and happiness in an emotion judgment task. The pri-
mary reason that we limited our study to these two emotions
was the large number of trials required in the “bubbles” task,
making it infeasible to include additional emotion categories.
It is therefore important to consider whether the findings we
report are specific for these two emotions, or whether they
would generalize to other emotions as well, or even to face pro-
cessing under other cognitive demands (e.g., identity matching,
gender discrimination). Green, Williams, & Davidson (2003)
found that people made more fixations to facial features when
shown facial expressions of anger and fear, compared to non-
threatening facial expressions. No specific featural differences,
though, were noted. Similarly, we have not found any major
differences in fixation patterns onto facial features between the
six basic emotions (Adolphs et al., 2005). Moreover, the highly expression-dependent face gaze specificity seen in a Rhesus monkey study (Nahm, Perret, Amaral, & Albright, 1997) has not been reported in humans. Additionally, Klin and coworkers (Klin, Jones, Schultz, Volkmar, &
Cohen, 2002a; Klin et al., 2002b) do not report expression-
dependent fixation patterns in participants with autism or in
controls. It is thus likely that the associations between gaze and
facial information that are reported here would generalize to
other facial expressions.
Yet, we expect that the same may not be true of other behavioral tasks. For example, judging gender and identity appears to rely on facial information different from that used in emotion judgment (Gosselin & Schyns, 2001; Schyns, Bonnar, & Gosselin, 2002). Thus, people with autism may be less oriented to the mouth in tasks not requiring emotion judgment, and there is little evidence that
NSY 2300 1–8
M.L. Spezio et al. / Neuropsychologia xxx (2006) xxx–xxx
would lead one to expect that people with autism would have a greater propensity to fixate other regions, such as the hair or the chin, when judging identity or gender. To be sure, we
expect that the relation between eye information and saccades
away from the eyes would still be observed in tasks of identity
and gender identification.
In interpreting these findings, one should recall that some
have raised the possibility that the “Bubbles” method, which
reveals only certain areas of an object on any given trial, alters observers' normal visual processing strategies (Murray & Gold, 2004). However, a strong argument has been made that the "Bubbles" method
does not elicit an altered visual processing strategy for faces
in emotion judgment tasks (Gosselin & Schyns, 2004). That
this conclusion applies also in our study is further corroborated
by the identical performances on the “Bubbles” task (both in
terms of accuracy and reaction times). Thus, it is likely that our findings reflect the face processing strategies normally employed by the HFA participants and controls when they process whole faces.
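To make the notion of fixation specificity concrete, one simple hypothetical operationalization (an illustration, not the measure used in this study) is the difference in a region's fixation rate between trials in which the bubbles revealed information there and trials in which they did not:

```python
def fixation_specificity(trials):
    """Toy specificity index for one face region (e.g., the mouth).

    trials: iterable of (info_present, fixated) boolean pairs, one per
    trial.  Returns P(fixated | info present) - P(fixated | info absent).
    A score near 0 means the region is fixated regardless of the
    information revealed there; larger scores mean fixations track the
    revealed information.
    """
    with_info = [fix for info, fix in trials if info]
    without_info = [fix for info, fix in trials if not info]
    p_with = sum(with_info) / len(with_info)
    p_without = sum(without_info) / len(without_info)
    return p_with - p_without

# An observer who fixates the mouth indiscriminately scores near 0;
# one whose mouth fixations follow revealed information scores higher.
indiscriminate = [(True, True), (False, True), (True, True), (False, True)]
selective = [(True, True), (False, False), (True, True), (False, False)]
```

Under this toy index, reduced specificity in the HFA group would correspond to a smaller gap between the two conditional fixation rates.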
The HFA group showed reduced specificity in gaze to the
mouth (Fig. 1c,f), suggesting that the participants with autism
made fixations to the mouth even when information in the eye
areas was present and could presumably have contributed to the
emotion judgment task. Several hypotheses could account for
this result. The HFA group may have a propensity to look at the
mouth whether or not there is any useful information present
there at all. However, the slight difference between controls
and the HFA group seen in Fig. 1f suggests that this is not
the case. Another explanation for the lower specificity shown
in Fig. 1c is that the HFA group may show a greater propen-
sity to look at the mouth when there is task-relevant information
equally present in the eyes and mouth. Still another hypothe-
sis is that the HFA group’s fixation behavior is guided more
by task-irrelevant, low-level attention cues, and that the mouth
region provides more of those even when there is task-relevant
information in the eye regions. We are testing this hypothesis by
examining the HFA fixation behavior in relation to predictions
made by a computational model of low-level attention (Itti & Koch, 2001).
Several investigators have suggested that people with autism find direct eye gaze aversive (Dalton et al., 2005; Hutt &
Ounsted, 1966; Richer & Coss, 1976). We reasoned that if this
were the case with the HFA participants in this study, we would
see a higher level of information in the eyes when the HFA
group made saccades leaving the eyes, compared to controls.
Controls, of course, would likely not have made saccades away
from the eyes if task-relevant information were present in those regions. We were able to assess the information in the eyes associated with saccades leaving the left eye only, due to the low number of fixations the HFA group made to the right eye. Had more fixations to the right eye been available, we could have tested whether the same relation held there. Unfortunately, we did not interview the HFA participants to determine whether they in fact report an aversion to direct eye gaze. Nor did our experimental design lend itself to the recording of psychophysiological measures that could shed light on this issue. Such design considerations are planned for future experiments.
Saccade behavior in autism has been examined primarily to
identify oculomotor deficits in autism, relating these to puta-
tive cortical and cerebellar dysfunction (Chawarska, Klin, &
Volkmar, 2003; Minshew, Luna, & Sweeney, 1999; Rosenhall,
Johansson, & Gillberg, 1988; Takarae, Minshew, Luna, Krisky,
& Sweeney, 2004a; Takarae, Minshew, Luna, & Sweeney,
2004b). Here, we analyzed saccade directionality to determine
whether there is an impairment in how people with autism make
saccades during emotion judgment when they fixate the same key facial regions as controls. The results, which revealed an impairment for saccades leaving the left eye and mouth, are consistent with the view that face processing deficits in autism are partially independent of the foveated visual information. That is, face processing deficits in autism cannot be fully explained by abnormal fixation patterns alone, suggesting that the brains of people with autism treat facial information abnormally even when the same facial regions are foveated.
Acknowledgments

The authors would like to thank the participants and their families for making this study possible, and Dr. Frédéric Gosselin and Dr. Castelli for helpful comments. We gratefully acknowledge the
support of NIMH STAART Center funding and a grant from the
Cure Autism Now Foundation.

References

Adolphs, R., Gosselin, F., Buchanan, T. W., Tranel, D., Schyns, P., & Damasio, A. R. (2005). A mechanism for impaired fear recognition after amygdala damage. Nature, 433(7021), 68–72.
Baron-Cohen, S. (1997). Mindblindness: An essay on autism and theory of
mind. Cambridge, MA: MIT Press.
Baron-Cohen, S., Ring, H. A., Wheelwright, S., Bullmore, E. T., Brammer,
M. J., Simmons, A., et al. (1999). Social intelligence in the normal and
autistic brain: An fMRI study. European Journal of Neuroscience, 11(6),
Brainard, D. H. (1997). The Psychophysics Toolbox. Spatial Vision, 10,
Buitelaar, J. K., van Engeland, H., de Kogel, K. H., de Vries, H., & van Hooff,
J. A. (1991). Differences in the structure of social behaviour of autistic
children and non-autistic retarded controls. Journal of Child Psychology
and Psychiatry, 32(6), 995–1015.
Carpenter, M., Pennington, B. F., & Rogers, S. J. (2002). Interrelations among
social-cognitive skills in young children with autism. Journal of Autism
and Developmental Disorders, 32(2), 91–106.
Castelli, F., Frith, C., Happé, F., & Frith, U. (2002). Autism, Asperger syndrome and brain mechanisms for the attribution of mental states to animated shapes. Brain, 125(Pt 8), 1839–1849.
Celani, G., Battacchi, M. W., & Arcidiacono, L. (1999). The under-
standing of the emotional meaning of facial expressions in people
with autism. Journal of Autism and Developmental Disorders, 29(1),
Chauvin, A., Worsley, K. J., Schyns, P. G., & Gosselin, F. (in press). Accurate
statistical tests for smooth classification images. Journal of Vision.
Chawarska, K., Klin, A., & Volkmar, F. (2003). Automatic attention cueing
through eye movement in 2-year-old children with autism. Child Devel-
opment, 74(4), 1108–1122.
Cornelissen, F. W., Peters, E. M., & Palmer, J. (2002). The Eyelink Toolbox: Eye tracking with MATLAB and the Psychophysics Toolbox. Behavior Research Methods, Instruments, & Computers, 34, 613–617.
Critchley, H. D., Daly, E. M., Bullmore, E. T., Williams, S. C., Van
Amelsvoort, T., Robertson, D. M., et al. (2000). The functional neu-
roanatomy of social behaviour: Changes in cerebral blood flow when
people with autistic disorder process facial expressions. Brain, 123(Pt
Dalton, K. M., Nacewicz, B. M., Johnstone, T., Schaefer, H. S., Gernsbacher,
M. A., Goldsmith, H. H., et al. (2005). Gaze fixation and the neural cir-
cuitry of face processing in autism. Nature Neuroscience, 8(4), 519–526.
Ekman, P., & Friesen, W. V. (1976). Pictures of facial affect [slides]. Palo
Alto, CA: Consulting Psychologists Press.
Fisher, N. I. (1995). Statistical analysis of circular data (2nd ed.). Cambridge:
Cambridge University Press.
Friston, K. J., Worsley, K. J., Frackowiak, R. S. J., Mazziotta, J. C., & Evans,
A. C. (1994). Assessing the significance of focal activations using their
spatial extent. Human Brain Mapping, 1, 214–220.
Frith, C. D., & Frith, U. (1999). Interacting minds – a biological basis.
Science, 286(5445), 1692–1695.
Gilpin, W. (2002). Much more laughing and loving with autism. Arlington,
TX: Future Horizons.
Gosselin, F., & Schyns, P. G. (2001). Bubbles: A technique to reveal the
use of information in recognition tasks. Vision Research, 41(17), 2261–
Gosselin, F., & Schyns, P. G. (2004). No troubles with bubbles: A reply to Murray and Gold. Vision Research, 44(5), 471–477, discussion 479–482.
Grandin, T. (1996). Thinking in pictures: And other reports from my life with
autism. New York, NY: Vintage.
Green, M. J., Williams, L. M., & Davidson, D. (2003). In the face of danger:
Specific viewing strategies for facial expressions of threat? Cognition and
Emotion, 17(5), 779–786.
Grelotti, D. J., Gauthier, I., & Schultz, R. T. (2002). Social interest and the
development of cortical face specialization: What autism teaches us about
face processing. Developmental Psychobiology, 40(3), 213–225.
Hutt, C., & Ounsted, C. (1966). The biological significance of gaze aversion
with particular reference to the syndrome of infantile autism. Behavioral
Science, 11(5), 346–356.
Itti, L., & Koch, C. (2001). Computational modelling of visual attention.
Nature Reviews Neuroscience, 2(3), 194–203.
Kanner, L. (1943). Autistic disturbances of affective contact. Nervous Child,
Klin, A., Jones, W., Schultz, R., Volkmar, F., & Cohen, D. (2002a). Defining
and quantifying the social phenotype in autism. American Journal of
Psychiatry, 159(6), 895–908.
Klin, A., Jones, W., Schultz, R., Volkmar, F., & Cohen, D. (2002b). Visual
fixation patterns during viewing of naturalistic social situations as predic-
tors of social competence in individuals with autism. Archives of General
Psychiatry, 59(9), 809–816.
Le Couteur, A., Rutter, M., & Lord, C. (1989). Autism diagnostic interview: A
standardized investigator-based instrument. Journal of Autism and Devel-
opmental Disorders, 19, 363–387.
Lord, C., Rutter, M., Goode, S., Heemsbergen, J., Jordan, H., Mawhood,
L., et al. (1989). Autism diagnostic observation schedule: A standardized
observation of communicative and social behavior. Journal of Autism and
Developmental Disorders, 19(2), 185–212.
Loveland, K. A., Pearson, D. A., Tunali-Kotoski, B., Ortegon, J., & Gibbs,
M. C. (2001). Judgments of social appropriateness by children and ado-
lescents with autism. Journal of Autism and Developmental Disorders,
Minshew, N. J., Luna, B., & Sweeney, J. A. (1999). Oculomotor evidence for
neocortical systems but not cerebellar dysfunction in autism. Neurology,
Murray, R. F., & Gold, J. M. (2004). Troubles with bubbles. Vision Research,
Nahm, F. K. D., Perret, A., Amaral, D. G., & Albright, T. D. (1997). How do
monkeys look at faces? Journal of Cognitive Neuroscience, 9, 611–623.
Ogai, M., Matsumoto, H., Suzuki, K., Ozawa, F., Fukuda, R., Uchiyama, I.,
et al. (2003). fMRI study of recognition of facial expressions in high-functioning autistic patients. Neuroreport, 14(4), 559–563.
Ozonoff, S., & Miller, J. N. (1995). Teaching theory of mind: A new approach
to social skills training for individuals with autism. Journal of Autism and
Developmental Disorders, 25(4), 415–433.
Pedersen, J., Livoir-Petersen, M. F., & Schelde, J. T. (1989). An ethological
approach to autism: An analysis of visual behaviour and interpersonal
contact in a child versus adult interaction. Acta Psychiatrica Scandinavica,
Pelli, D. G. (1997). The VideoToolbox software for visual psychophysics: Transforming numbers into movies. Spatial Vision, 10, 437–442.
Pelphrey, K. A., Sasson, N. J., Reznick, J. S., Paul, G., Goldman, B. D., &
Piven, J. (2002). Visual scanning of faces in autism. Journal of Autism
and Developmental Disorders, 32(4), 249–261.
Portilla, J., & Simoncelli, E. P. (2000). A parametric texture model based on joint statistics of complex wavelet coefficients. International Journal of Computer Vision, 40, 49–71.
Richer, J. M., & Coss, R. G. (1976). Gaze aversion in autistic and normal
children. Acta Psychiatrica Scandinavica, 53(3), 193–210.
Rogers, S. J. (2000). Interventions that facilitate socialization in children with
autism. Journal of Autism and Developmental Disorders, 30(5), 399–409.
Rogers, S. J., Hepburn, S. L., Stackhouse, T., & Wehner, E. (2003). Imitation
performance in toddlers with autism and those with other developmental
disorders. Journal of Child Psychology Psychiatry, 44(5), 763–781.
Rosenhall, U., Johansson, E., & Gillberg, C. (1988). Oculomotor findings
in autistic children. The Journal of Laryngology and Otology, 102(5),
Schyns, P. G., Bonnar, L., & Gosselin, F. (2002). Show me the features!
Understanding recognition from the use of visual information. Psycho-
logical Science, 13(5), 402–409.
Siegel, B., Vukicevic, J., & Spitzer, R. L. (1990). Using signal detection methodology to revise DSM-III-R: Re-analysis of the DSM-III-R national field trials for autistic disorder. Journal of Psychiatric Research, 24(4),
Takarae, Y., Minshew, N. J., Luna, B., Krisky, C. M., & Sweeney, J. A. (2004a). Pursuit eye movement deficits in autism. Brain, 127(Pt 12),
Takarae, Y., Minshew, N. J., Luna, B., & Sweeney, J. A. (2004b). Oculomotor abnormalities parallel cerebellar histopathology in autism. Journal of Neurology, Neurosurgery, and Psychiatry, 75(9), 1359–1361.
Trepagnier, C., Sebrechts, M. M., & Peterson, R. (2002). Atypical face gaze
in autism. Cyberpsychology & Behavior, 5(3), 213–217.
van der Geest, J. N., Kemner, C., Camfferman, G., Verbaten, M. N., & van
Engeland, H. (2002). Looking at images with human figures: Comparison
between autistic and normal children. Journal of Autism and Developmen-
tal Disorders, 32(2), 69–75.
van der Geest, J. N., Kemner, C., Verbaten, M. N., & van Engeland, H.
(2002). Gaze behavior of children with pervasive developmental disorder
toward human faces: A fixation time study. Journal of Child Psychology
and Psychiatry, 43(5), 669–678.
Volkmar, F. R., Sparrow, S. S., Rende, R. D., & Cohen, D. J. (1989). Facial
perception in autism. Journal of Child Psychology and Psychiatry, 30(4),
Weeks, S. J., & Hobson, R. P. (1987). The salience of facial expression
for autistic children. Journal of Child Psychology and Psychiatry, 28(1),