Uncanny Valley, Robot and Autism: Perception of the Uncanniness in an Emotional Gait
Matthieu Destephe, Member, IEEE, Massimiliano Zecca, Member, IEEE, Kenji Hashimoto, Member, IEEE, Atsuo Takanishi, Member, IEEE

Proceedings of the 2014 IEEE International Conference on Robotics and Biomimetics (ROBIO 2014), December 5-10, 2014, Bali, Indonesia. DOI: 10.1109/ROBIO.2014.7090488
Abstract— While robots are often used in autism therapy, the Uncanny valley effect has never been studied in subjects with Autism Spectrum Disorder (ASD). Since persons with ASD have trouble understanding body language, they may react differently to the Uncanny valley. In this paper, we investigate the possible difference in the perception of the Uncanny valley for an emotional humanoid robot between subjects with ASD and subjects without ASD. Thirty-four adult participants (N = 34, control: 19, ASD: 15; mean age: 28.5) were asked to watch videos of an emotional humanoid robot and rate its emotions and its gait (Perceived Humanness, Eeriness and Attractiveness). We found differences between the two groups in their perception of the robot's Perceived Humanness (p < .05). Also, while the ASD group performed as well as the control group in the emotion recognition task, we found that the ASD group is more sensitive to Uncanny valley effects than the control group. Finally, we discuss what our findings bring to the Human-Robot Interaction field.
I. INTRODUCTION
With the emerging field of robot therapy, psychologists and engineers have been working together to use robots as motivational aids to help individuals with Autism Spectrum Disorder (ASD). ASD is usually characterized by difficulties affecting social interaction, verbal and non-verbal communication with others, and the emergence of unusual behaviors. Two main types of robot are used in robotic autism therapy: animal-like robots such as Pleo [1] and AIBO [2], and human-like robots such as NAO [3] or Kaspar [4]. These robots are usually used to assess and compare the behavior of ASD subjects interacting with a robot or a human, to provoke behaviors, to teach social skills to ASD subjects, or even as a diagnostic tool [5]. According to Robins et al. [4], persons with ASD prefer robots over humans with regard to social interaction.

This work was supported in part by JSPS KAKENHI (#26540137 and #26870639) and by Waseda Special Research Funds (#2013A-888). It was also partially supported by SolidWorks Japan K.K. and DYDEN Corporation, whom we thank for their financial and technical support.
M. Destephe is with the Graduate School of Science and Engineering, Waseda University, #41-304, 17 Kikui-cho, Shinjuku-ku, Tokyo 162-0044, Japan (e-mail: contact@takanishi.mech.waseda.ac.jp).
M. Zecca is with the School of Electronic, Electrical and Systems Engineering, Loughborough University, UK.
K. Hashimoto is with the School of Creative Science and Engineering, Faculty of Science and Engineering, Waseda University, and the Humanoid Robotics Institute, Waseda University, Tokyo, Japan.
A. Takanishi is with the Department of Modern Mechanical Engineering, Waseda University, and the Humanoid Robotics Institute, Waseda University, Tokyo, Japan.
After a review of the literature on autism therapy using robots, we found that only children participated in those studies. While it is important to help children with ASD, to the best of our knowledge no study has been done on adults with ASD and robots. Knowing that the prevalence of autism in the general population varies between 1% and 2.6% [6-7], we think that adults should also benefit from robot therapy if it is proven effective in helping them deal with the difficulties linked to their condition. Moreover, as our society ages, more robots are expected to take care of us and help us in our jobs and daily life. Therefore, a large population of persons, persons with ASD included, will be in contact with robots, and understanding the reactions of different types of persons would be beneficial for the design (appearance and behavior) of robots.
When robots, especially humanoid robots, are designed to interact with persons, there is a risk of rejection by the users. A theory called "the Uncanny Valley", quite popular in Human-Robot Interaction (HRI), tries to explain this phenomenon. Proposed by Mori [8] in 1970, it states that the more human-like a thing is (a doll, a robot, etc.), the more familiar people will feel towards it. Nonetheless, this relationship is not linear. At the point where human-likeness is close to perfect but some differences still exist, the curve collapses and the feeling, which was familiar, becomes the opposite: uncanny. The term uncanny is the English translation of the German Unheimlich, a word describing something felt simultaneously as familiar, strange and scary. When the human-likeness reaches the point where it is quite hard to tell the difference between that thing and a human being, the curve rises steeply again, outlining the shape of a valley and thus giving the phenomenon its name (Figure 1) [9]. Many studies have been done on the effect of robots' appearance, and some on robot movement [10].
Nonetheless, the uncanny effect has not been studied in people with ASD. ASD researchers have done numerous studies on emotion processing in people with ASD, testing mainly facial expression recognition, but only a few were done on other emotional cues such as voice or body movement [21]. Through our work, we want to investigate the possible difference in the perception of the Uncanny valley between persons with ASD and persons without ASD, using a
humanoid robot that expresses its emotions through its gait only. Given the general idea that persons with ASD have an impairment which perturbs their understanding of body language [18], we hypothesize that the ASD group will have more difficulty interpreting the emotional body expressions displayed by the robot. Therefore, we expect them to be less affected by Uncanny valley effects due to their reduced understanding of the displayed emotions. We assume that the perception of the emotions displayed by the robot will affect the control and ASD groups differently, especially their ratings of familiarity (as described by MacDorman and Mori [9]) and attractiveness toward the robot.
II. METHODS
A. The robot
The videos used for this work are based on the humanoid robot WABIAN-2R (Figure 2). Unlike most bipedal humanoid robots, which walk with bent knees, WABIAN-2R is able to perform human-like walking with stretched knees during the stance phase thanks to its 2-DoF waist [11]. WABIAN-2R is 1.5 m tall and weighs 64 kg. Its design allows a human-like gait including heel-contact and toe-off phases. This robot is mainly used for locomotion experiments and to study human movements. In addition to its advanced locomotion technology, its head is a neutral, stylized human-like shape with no distinguishable features. We chose this robot because the absence of facial expressions helps to focus on the expressivity of the whole body without any influence coming from the face. Moreover, other humanoid robots such as ASIMO from Honda and ATLAS from Boston Dynamics do not have facial features either, so our findings might apply to them as well.
B. The emotional gait patterns
The robot's emotional walking patterns used for this work were designed from the results of a previous work in which we assessed the emotion recognition rate of emotional gaits simulated by a virtual robot [12]. Two professional actors (who had acted in plays, dramas and movies) were asked to perform several types of emotional walking, such as Sadness, Happiness, Anger and Fear, with different intensities: Natural (Low, Intermediate, High) and Exaggerated.
From this previous research, we selected the Anger, Happiness and Sadness emotions and, for each of them, two walking patterns of different intensity (High and Exaggerated). We chose those intensities because they achieved a high recognition rate (High intensity / Exaggerated intensity) when we assessed the patterns in simulation with subjects [13] (Anger: 71.4% / 67.9%; Happiness: 75.0% / 85.7%; Sadness: 75.0% / 92.9%). Examples of the patterns used in this study are shown in Figure 3.
C. Questionnaire
The questionnaire we gave to the participants of this study is composed of three sub-questionnaires:
1. A questionnaire asking for general information such as sex, age, nationality, etc.;
2. A questionnaire to assess the participant's personality. This questionnaire is in fact a short autism screening called the AQ-10 (Autism Spectrum Quotient, 10 items) [14];
3. A questionnaire assessing the participant's reactions and feelings about our emotional robot, based on the Ho questionnaire [15].
The questionnaire was administered online. For the last sub-questionnaire (questionnaire #3), the order of the videos was randomized to avoid order effects resulting from a fixed presentation. The participants were also randomly assigned an emotional intensity, and they watched each video once.
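For illustration, the following is a minimal Python sketch of such a randomized presentation. The clip file names, the emotion/intensity labels used as identifiers, and the per-participant seeding are assumptions for this sketch, not details taken from the paper.

```python
import random

# Hypothetical clip labels; each participant saw the Anger, Happiness and
# Sadness gaits at the single intensity (High or Exaggerated) assigned to them.
EMOTIONS = ["Anger", "Happiness", "Sadness"]
INTENSITIES = ["High", "Exaggerated"]

def build_session(participant_id: int) -> tuple[str, list[str]]:
    """Randomly assign one emotional intensity to the participant and
    shuffle the order of the corresponding videos, so that a fixed
    presentation order cannot bias the ratings."""
    rng = random.Random(participant_id)  # reproducible per participant (assumption)
    intensity = rng.choice(INTENSITIES)
    videos = [f"{emotion}_{intensity}.mp4" for emotion in EMOTIONS]
    rng.shuffle(videos)
    return intensity, videos

# Example: build_session(7) might return ('High', ['Sadness_High.mp4', ...])
```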
Figure 1. The Uncanny valley
(Karl F. MacDorman and Takashi Minato [9])
Figure 2. WABIAN-2R platform
Figure 3. Examples of emotional gait patterns shown to the subjects: angry walk, happy walk, and sad walk.
Participants were able to leave a comment at the end of the study, and a message offering to send them a follow-up of the study was displayed.
D. Participants
We recruited 15 participants for the ASD group online on specialized forums and online communities (N_ASD = 15, 26 ± 7.1 years old, female/male ratio: 66.7%, nationality: USA: 6, Canada: 4, Germany: 2, India: 1, Sweden: 2), and 19 participants for the control group on general social network services, forums and by e-mail (N_C = 19, 31 ± 8.3 years old, female/male ratio: 35.7%, nationality: USA: 10, England: 2, Canada: 2, India: 1, France: 2, Norway: 2). The ASD group was composed of persons who scored 6 or more on the AQ-10 screening test [14]. Our experiment was approved by our University's ethics committee, and all participants gave us their written consent.
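As a simple illustration of this group assignment, the sketch below sums binarized AQ-10 item points and applies the cutoff of 6. The function name is hypothetical, and the item-level response keying of the AQ-10 (which answers earn a point) is deliberately left out.

```python
AQ10_CUTOFF = 6  # participants scoring 6 or more were assigned to the ASD group

def assign_group(item_points: list[int]) -> tuple[str, int]:
    """item_points: ten values in {0, 1}, one point per autism-consistent
    answer (the AQ-10 item-level keying is omitted here).
    Returns the group label and the total AQ-10 score."""
    if len(item_points) != 10:
        raise ValueError("the AQ-10 has exactly 10 items")
    score = sum(item_points)
    return ("ASD" if score >= AQ10_CUTOFF else "control"), score

# Example: assign_group([1, 1, 1, 0, 1, 1, 1, 0, 0, 0]) -> ('ASD', 6)
```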
III. RESULTS
A. Emotion recognition
We investigated whether there is any difference between the two groups in the recognition of the emotions. The recognition rate for each emotion was not significantly different between the groups: Anger (control: 5.3%; ASD: 6.7%), Happiness (control: 31.6%; ASD: 40%), Sadness (control: 63.2%; ASD: 53.3%), against a chance rate of 20% (Table I). The overall recognition rate was the same for the two groups (33.3%). Anger was confused with Happiness by the control group (52.6%). The gait characteristics of Anger and Happiness are close, which may explain this confusion: the two emotions are similar in terms of movement dynamics (fast paced) and movement ranges. Anger was confused with No Emotion by the ASD group (66.7%). Emotion recognition in persons with ASD is reported to be impaired, with Anger and Fear being the most affected [16-17].
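For clarity, the sketch below shows one way such per-emotion recognition rates and the most frequent confusions can be tabulated from (displayed, answered) response pairs. The data structure and function name are assumptions for illustration, not the authors' analysis code.

```python
from collections import Counter

def recognition_stats(trials):
    """trials: iterable of (displayed_emotion, answered_emotion) pairs
    for one group. Returns, per displayed emotion, the recognition rate
    and the most frequent confusion (if any)."""
    answers_by_emotion = {}
    for displayed, answered in trials:
        answers_by_emotion.setdefault(displayed, Counter())[answered] += 1

    stats = {}
    for displayed, answers in answers_by_emotion.items():
        total = sum(answers.values())
        rate = answers[displayed] / total
        confusions = Counter({a: n for a, n in answers.items() if a != displayed})
        top = confusions.most_common(1)[0] if confusions else None
        stats[displayed] = {"recognition_rate": rate, "top_confusion": top}
    return stats

# Example with three hypothetical trials:
# recognition_stats([("Anger", "Happiness"), ("Anger", "No emotion"),
#                    ("Sadness", "Sadness")])
```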
B. Questionnaire about the Uncanny Valley
Using the Ho questionnaire [15], we measured the possible differences in the perception of emotional movements and their link to uncanny effects. We also investigated whether several factors, such as the emotional intensity and the type of emotion, would influence this perception. The Ho questionnaire is designed to test the feelings related to the Uncanny valley phenomenon by rating three different groups of items: Perceived Humanness, Eeriness and Attractiveness, each rated from 1 (low) to 5 (high). Perceived Humanness represents the degree of humanity and human-like characteristics in the robot tested. Eeriness describes the feeling of simultaneous strangeness, disgust and familiarity which occurs when something seems natural but some details do not quite conform to expectations. Attractiveness characterizes the level of comfort and physical attraction we might feel when looking at the robot. According to Ho [15], Eeriness and Perceived Humanness can be plotted together to obtain a graph similar to Mori's Uncanny valley figure.
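For illustration, the sketch below computes the three index scores as the mean of their constituent items on the 1-5 scale. The item names and their assignment to indices are placeholders invented for this sketch; the actual items and wording follow Ho and MacDorman [15].

```python
from statistics import mean

# Hypothetical item-to-index mapping (placeholder names); the real item sets
# follow Ho and MacDorman's indices [15], each rated from 1 (low) to 5 (high).
INDEX_ITEMS = {
    "perceived_humanness": ["artificial_natural", "mechanical_organic"],
    "eeriness": ["ordinary_eerie", "plain_freaky"],
    "attractiveness": ["ugly_attractive", "repulsive_agreeable"],
}

def index_scores(item_ratings: dict[str, float]) -> dict[str, float]:
    """item_ratings: item name -> rating in [1, 5] for one video.
    Each index is taken here as the mean of its items."""
    return {index: mean(item_ratings[item] for item in items)
            for index, items in INDEX_ITEMS.items()}

# Example:
# index_scores({"artificial_natural": 2, "mechanical_organic": 3,
#               "ordinary_eerie": 4, "plain_freaky": 3,
#               "ugly_attractive": 2, "repulsive_agreeable": 3})
```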
We used a Generalized Linear Model (GLM) to test each questionnaire item group (Perceived Humanness, Eeriness and Attractiveness) for significant differences, with ASD group, Emotion category and Intensity category as independent variables. We found a main effect of Autism on Perceived Humanness (F(1,90) = 4.234, p < .05), and main effects of Intensity on Eeriness (F(1,90) = 4.105, p < .05) and on Attractiveness (F(1,90) = 4.907, p < .05).
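The paper does not give the exact model specification, so the sketch below is only an approximation: it fits an ordinary least squares linear model with the three categorical factors using statsmodels and reports Type II F tests. The data frame and its column names ('group', 'emotion', 'intensity', and the index score columns) are assumptions.

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

def main_effects(ratings: pd.DataFrame, outcome: str) -> pd.DataFrame:
    """ratings: one row per rated video, with columns 'group' (ASD/control),
    'emotion', 'intensity' and the questionnaire index scores.
    Fits a linear model with the three factors as independent variables
    and returns an ANOVA table with F tests for each factor."""
    model = smf.ols(f"{outcome} ~ C(group) + C(emotion) + C(intensity)",
                    data=ratings).fit()
    return anova_lm(model, typ=2)

# Usage (hypothetical data frame 'ratings_df'):
# print(main_effects(ratings_df, "perceived_humanness"))
```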
We plotted all the Attractiveness and Familiarity scores against the Perceived Humanness scores given by the participants (Figure 4) in order to reproduce the theoretical Uncanny valley. Familiarity is the Eeriness score modified to stay coherent with the Attractiveness score: positive values mean positive feelings. We can see a visible difference in the Attractiveness and Familiarity scores between the control group and the ASD group.
Attractiveness and Perceived Humanness are strongly correlated in the ASD group, r_s(45) = .48, p < .001. For the control group, Eeriness and Perceived Humanness are strongly positively correlated, r_s(57) = .45, p < .001 (and Familiarity is therefore negatively correlated).
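The exact transformation used to derive Familiarity from Eeriness is not specified in the paper; the sketch below shows one plausible reverse-coding on the 1-5 scale, together with the Spearman rank correlation used for the r_s values reported above. Function names are hypothetical.

```python
from scipy.stats import spearmanr

def familiarity(eeriness: float, scale_min: float = 1, scale_max: float = 5) -> float:
    """One plausible reverse-coding of the Eeriness index so that, like
    Attractiveness, higher values mean more positive feelings
    (assumption; not necessarily the transformation used in the paper)."""
    return (scale_max + scale_min) - eeriness

def humanness_correlation(humanness: list[float], other: list[float]):
    """Spearman rank correlation between Perceived Humanness and another
    index (Attractiveness, Eeriness or Familiarity) within one group."""
    rho, p_value = spearmanr(humanness, other)
    return rho, p_value
```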
IV. DISCUSSION
We had two objectives: first, to see whether ASD impairs the ability to understand the emotions expressed by the robot; second, to study the reaction of adults with ASD to the Uncanny valley effect triggered by an emotional humanoid robot.
A. Emotion recognition
We did not find any difference in emotion recognition between the control and ASD groups. Philip et al. [17] found impairment in the recognition of only two of six emotions (Ekman's six basic emotions) in persons with ASD. Loveland et al. [18] suggest that individuals with ASD have no real issue in recognizing emotion information, but rather in processing that information for use during social interaction. Dyck et al. [19] show that, for the same intelligence and developmental age, people with Asperger syndrome perform as well as people without ASD on emotion recognition tests. Our findings are consistent with those previous studies.
Are emotion recognition tests performed with a robot really useful for autism therapy? Some researchers are investigating robots as a diagnostic tool to detect whether a person has autistic traits. Nonetheless, other diagnostic methods such as the Autism Spectrum Quotient test [20] might be more efficient and less costly in time and money than using a robot.

TABLE I
EMOTION RECOGNITION COMPARISON
Group     Anger    Happiness    Sadness
Control   5.3 %    31.6 %       63.2 %
ASD       6.7 %    40.0 %       53.3 %
B. The Uncanny valley effect
Eeriness/Familiarity
While the participants did not perform particularly well (33.3%) in the emotion recognition task, they did recognize the difference between emotional and non-emotional gaits. Therefore, further study of the effect of the emotions on the Uncanny valley remains valid. The Eeriness/Familiarity towards the robot is perceived differently by the two groups. The control group always remains positively familiar with the robot, even though their feeling of familiarity decreases as their perception of the robot's humanness decreases. For the ASD group, the evolution of the familiarity feeling follows a bell curve, reaching its maximum between 2 (not very human-like) and 3 (neither human- nor robot-like) on the Perceived Humanness scale: the more human-like the robot appears, the eerier it is perceived.
According to Gray and Wegner [21], what makes a robot uncanny is its ability to experience and feel. Only the ASD group found the robot quite uncanny (negative score) the more human-like it appeared, which could mean that persons with ASD recognize the robot as a feeling machine. The control group found the robot rather neutral at worst; they do not seem to recognize the robot as a feeling machine, and thus their perception of Eeriness is not affected. Gray and Wegner found that experience, in terms of being able to observe and feel, and agency, in terms of being able to act, have different effects on uncanniness. The perception of experience might be one of the factors causing the feeling of uncanniness, especially in robots: the idea of robots being able to experience is rather disturbing, because being able to experience is one of the main characteristics which makes us human. While the robot we showed to the participants expressed emotions with its body, those emotional motions might not have been enough to overcome the lack of a feeling of experience. Without actual interaction with the participants, the emotion expression might have been perceived as fake and might not have affected their feelings the way Mori predicted. These findings support the idea that persons with ASD recognize emotion information as well as persons without ASD, but have more trouble interpreting it. In our study, this is reflected by the control group interpreting the body language of the robot as non-genuine, with no effect on their perception of Eeriness, and the ASD group interpreting the body language as genuine, which does affect their Eeriness perception.
Do these results mean that Mori's theory of the Uncanny valley can be rejected? Our research shows that a similar trend is shared by persons with and without ASD: low humanness is perceived as less uncanny and high humanness is perceived as more uncanny. Several other studies, such as MacDorman et al. [22] and Seyama et al. [23], tested the phenomenon with relative success. The difference between those studies and ours is that our Perceived Humanness measure is based on the subjects' subjective perception, whereas MacDorman's and Seyama's Perceived Humanness is based on stimuli manipulated by the researchers (face morphs from a virtual or mechanical character to a human). Regarding the relationship between movement and Eeriness, Tung et al. [24] used different robots and tested two conditions (still and moving). They found evidence supporting the existence of the Uncanny valley, but, unlike Mori's theory, the still condition had a greater influence on Eeriness than the moving condition. Moreover, those stimuli were based on different characters or robots, and we only tested one robot. There might be an impassable limit for a given character, robot or machine which makes it impossible to climb back up the far side of the Uncanny valley.
Attractiveness
For the Attractiveness feeling, both groups react in a similar fashion: the more the robot is perceived as human-like, the more they feel attracted to it. The notable difference is that the ASD group rated the robot about twice as attractive when they perceived it as quite human-like (between 4 and 5 on the Perceived Humanness scale). These findings are supported by Robins et al. [4], who reported a clear preference for robots by persons with ASD.
Figure 4. The Uncanny valley: Attractiveness and Familiarity plotted against Perceived Humanness (1-5) for (a) the control group and (b) the ASD group.
V. CONCLUSION
We wanted to understand the effects of the Uncanny valley on persons with ASD. We expected that they would be less affected, due to their difficulty in understanding the emotional content of gestures. While performing as well as the control group during the emotion recognition task, the ASD group was more negatively influenced than the control group when the subjects perceived the robot as more human-like (Perceived Humanness score ≥ 3). Although they perceived it as uncanny, the ASD group rated the robot as much more attractive when it was perceived as more human-like. These results suggest that persons with ASD are more sensitive to robot appearance and motion than persons without ASD, and that researchers working with persons with ASD should therefore carefully choose or design the robot's appearance and movement.
As this is a preliminary study, we plan to study further the effects of the Uncanny valley on the interaction between humans and robots. We will have subjects interact directly with the robot and will further investigate how the attractiveness of the robot might mitigate Uncanny valley effects.
ACKNOWLEDGMENT
This study was conducted as part of the Research Institute
for Science and Engineering, Waseda University, and as part
of the humanoid project at the Humanoid Robotics Institute,
Waseda University.
REFERENCES
[1] Myounghoon, Rayan IA, “The effect of physical embodiment of an
animal robot on affective prosody recognition.”, Proceedings of the
14th international conference on Human-computer interaction:
interaction techniques and environments - Volume Part II (HCII'11),
(2011), 523-532.
[2] Stanton, C.M.; Kahn, P.H.; Severson, R.L.; Ruckert, J.H.; Gill, B.T.,
"Robotic animals might aid in the social development of children with
autism," 3rd ACM/IEEE International Conference on Human-Robot
Interaction (HRI), (2008), 271-278.
[3] Tapus, A., Peca, A., Aly, A., Pop, C., Jisa, L., Pintea, S., Rusu, A., and
David, D. "Exploratory Study: Children's with Autism Awareness of
Being Imitated by Nao Robot", Interaction Studies, 13(3), 2012, 315
–347.
[4] Robins B. , Dautenhahn K., Dubowski J., “Does appearance matter in
the interaction of children with autism with a humanoid robot?”
Interaction Studies 7( 3), 2006, 479-512.
[5] Diehl, J.J., Schmitt, L., Crowell, C.R., & Villano, M. The clinical use
of robots for children with autism spectrum disorders: A critical
review. Research in Autism Spectrum Disorders, 6(1), (2012).
249-262.
[6] Kim, Y.S., Leventhal, B.L., Koh, Y.J., Fombonne, E., Laska, E., et al.
Prevalence of autism spectrum disorders in a total population sample.
American Journal of Psychiatry, 168, (2011), 904–912.
[7] http://www.cdc.gov/ncbddd/autism/data.html
[8] Mori M., The uncanny valley. Energy 7(4), (1970), 33-35.
[9] MacDorman, K. F., Minato, T., Shimada, M., Itakura, S., Cowley, S.
J., Ishiguro, H. Assessing human likeness by eye contact in an android
testbed. In Proceedings of the XXVII Annual Meeting of the
Cognitive Science Society. (2005).
[10] Pollick FE, In Search of the Uncanny Valley. UCMedia, (2009), 69-78
[11] Ogura Y., Shimomura K., Kondo H., Morishima A., Okubo T.,
Momoki S., Lim H., Takanishi A., "Human-like walking with knee
stretched, heel-contact and toe-off motion by a humanoid robot",
IEEE/RSJ International Conference on Intelligent Robots and
Systems, (2006), 3976–3981.
[12] Destephe M., Maruyama T., Zecca M., Hashimoto K., Takanishi A.,
“The Influences of Emotional Intensity for Happiness and Sadness on
Walking”, 35th Annual International Conference of the IEEE
Engineering in Medicine and Biology Society (EMBC 2013), (2013),
7452-7455.
[13] Destephe M., Henning A., Zecca M., Hashimoto K., Takanishi A.,
“Perception of Emotion and Emotional Intensity in Humanoid Robots
Gait”, IEEE Robotics and Biomimetics 2013 (RoBio2013), (2013),
1276-1281.
[14] Allison C., Auyeung B., Baron-Cohen S., Toward Brief “Red Flags”
for Autism Screening: The Short Autism Spectrum Quotient and the
Short Quantitative Checklist in 1,000 Cases and 3,000 Controls,
Journal of the American Academy of Child and Adolescent Psychiatry
51(2), (2012), 202-212.
[15] Ho CC, MacDorman KF. Revisiting the uncanny valley theory:
Developing and validating an alternative to the Godspeed indices.
Comput. Hum. Behav. 26(6), (2010), 1508-1518.
[16] Uljarevic M., Hamilton A., “Recognition of emotions in autism: a
formal meta-analysis,” Journal of Autism and Developmental
Disorders 43(7), (2013), 1517–1526.
[17] Philip, R. C. M., Whalley, H., Stanfield, A. C., et al. Deficits in facial, body movement and vocal emotional processing in autism spectrum disorders. Psychological Medicine 40(11), (2010), 1919-1929.
[18] Loveland, K., Tunali-Kotoski, B., Chen, Y. R., Ortegon, J., et al. Emotion recognition in autism: Verbal and nonverbal information. Development and Psychopathology 9(3), (1997).
[19] Dyck, M. J., Ferguson, K., & Shochet, I. M. Do autism spectrum disorders differ from each other and from nonspectrum disorders on emotion recognition tests? European Child and Adolescent Psychiatry 10, (2001), 105–116.
[20] Woodbury-Smith, M. R.; Robinson, J.; Wheelwright, S.;
Baron-Cohen, S. Screening Adults for Asperger Syndrome Using the
AQ: A Preliminary Study of its Diagnostic Validity in Clinical
Practice, Journal of Autism & Developmental Disorders 35(3);
(2005), 331-335.
[21] Gray K, Wegner DM. Feeling robots and human zombies: mind
perception and the uncanny valley. Cognition 125, (2012), 125–130.
[22] MacDorman, K.F., Ishiguro, H.: The uncanny advantage of using
androids in social and cognitive science research. Interaction Studies
7(3) (2006).
[23] Seyama, J., & Nagayama, R. S. (2007). The uncanny valley: Effect of
realism on the impression of artificial human faces. Presence:
Teleoperators and Virtual Environments, 16(4), 337–351.
[24] Tung FW, Chang TY, Exploring Children's Attitudes towards Static
and Moving Humanoid Robots. HCI (3) 2013, 237-245.
  • Article
    The uncanny valley-the unnerving nature of humanlike robots-is an intriguing idea, but both its existence and its underlying cause are debated. We propose that humanlike robots are not only unnerving, but are so because their appearance prompts attributions of mind. In particular, we suggest that machines become unnerving when people ascribe to them experience (the capacity to feel and sense), rather than agency (the capacity to act and do). Experiment 1 examined whether a machine's humanlike appearance prompts both ascriptions of experience and feelings of unease. Experiment 2 tested whether a machine capable of experience remains unnerving, even without a humanlike appearance. Experiment 3 investigated whether the perceived lack of experience can also help explain the creepiness of unfeeling humans and philosophical zombies. These experiments demonstrate that feelings of uncanniness are tied to perceptions of experience, and also suggest that experience-but not agency-is seen as fundamental to humans, and fundamentally lacking in machines.