Interaction Studies 11:2 (2010), –. doi 10.1075/is.11.2.01sha
issn 1572–0373 / e-issn 1572–0381 © John Benjamins Publishing Company
The crying shame of robot nannies
An ethical appraisal
Noel Sharkey & Amanda Sharkey
University of Sheffield, UK
Childcare robots are being manufactured and developed with the long term aim
of creating surrogate carers. While total childcare is not yet being promoted,
there are indications that it is ‘on the cards’. We examine recent research and
developments in childcare robots and speculate on progress over the coming
years by extrapolating from other ongoing robotics work. Our main aim is to
raise ethical questions about the part- or full-time replacement of primary carers.
The questions are about human rights, privacy, robot use of restraint, deception of
children and accountability. But the most pressing ethical issues throughout the
paper concern the consequences for the psychological and emotional wellbeing
of children. We set these in the context of the child development literature on the
pathology and causes of attachment disorders. We then consider the adequacy
of current legislation and international ethical guidelines on the protection of
children from the overuse of robot care.
Who’s to say that at some distant moment there might
be an assembly line producing a gentle product in
the form of a grandmother – whose stock in trade is
love. From I Sing the Body Electric, Twilight Zone,
Series 3, Episode 35, 1962
1. Introduction
A babysitter/companion on call round the clock to supervise and entertain the
kids is the dream of many working parents. Now robot manufacturers in South
Korea and Japan are racing to fulfil that dream with affordable robot "nannies".
These currently have game playing, quizzes, speech recognition, face recognition
and limited conversation to capture the preschool child's interest and attention.
Their mobility and semi-autonomous function combined with facilities for visual
and auditory monitoring are designed to keep the child from harm. Most are pro-
hibitively expensive at present but prices are falling and some cheap versions are
already becoming available.
Children love robots as indicated by the numbers taking part in robot com-
petitions worldwide. Even in a war zone, when bomb disposal robots entered a
village in Iraq, they were swamped with excited children (Personal communica-
tion, Ronald C. Arkin, 2008). There is a growing body of research showing positive
interactions between children and robots in the home (e.g. Turkle et al. 2006a, b),
and in the classroom (e.g. Tanaka et al. 2007; Kanda et al., 2009). Robots have also
been shown to be useful in therapeutic applications for children (e.g. Shibata et al.,
2001; Dautenhahn, 2003; Dautenhahn & Werry, 2004; Marti et al., 2005; Liu et al.,
2008). The natural engagement value of robots makes them a great motivational
tool for education in science and engineering. We raise no ethical objections to
the use of robots for such purposes or with their use in experimental research or
even as toys.
Our concerns are about the evolving use of childcare robots and the poten-
tial dangers they pose for children and society (Sharkey, 2008a). By extrapolating
from ongoing developments in other areas of robotics, we can get a reasonable
idea of the facilities that childcare robots could have available to them over the
next 5 to 15 years. We make no claims about the precision of the time estimate as
this has proved to be almost impossible for robotics and AI developments (Sharkey,
2008b). Our approach is conservative and explicitly avoids entanglement with
issues about strong AI and super smart machines. Nonetheless, it may not be long
before robots can be used to keep children safe and maintain their physical needs
for as long as required.
To be commercially viable, robot carers will need to enable considerably lon-
ger parent/carer absences than can be obtained from leaving a child sitting in front
of a video or television programme. Television and video have long been used by
busy parents to entertain children for short periods of time. But they are a passive
form of entertainment and children get fidgety after a while and become unsafe.
They need to be monitored with frequent "pop-ins" or the parent has to work in
the same room as the child and suffer the same DVDs while trying to concen-
trate. The robot can extend the length of parent absences by keeping the child safe
from harm, keeping her entertained and, ideally, by creating a relationship bond
between child and robot (Turkle et al. 2006b).
We start with a simple example, the Hello Kitty Robot, which parents are
already beginning to use if the marketing website is to be believed. It gives an
idea of how these robots are already getting a ‘foot in the door.’ Even for such a
robotically simple and relatively cheap robot, the marketing claims are that, "This
is a perfect robot for whoever does not have a lot time [sic] to stay with their
child.” (Hello Kitty website). Although Hello Kitty is not mobile, it creates a lifelike
appearance by autonomously moving its head to four angles and moving its arms.
What gives it an edge is that it can recognise voices and faces so that it can call
children by their names. It has a stereo CCD camera that allows it to track faces
and it can chat. For children this may be enough to create the illusion that it has
mental states (Melson et al. in press b).
Busy working parents might be tempted to think that a robot nanny could pro-
vide constant supervision, entertainment and companionship for their children.
Some of the customer reviews of the "Hello Kitty Robot" on the internet made
interesting reading. These have now been removed, but we kept a copy (some of
the comments are also reproduced at (Bittybobo)):
– Since we have invited Hello Kitty (Kiki, as my son calls her), life has been so
much easier for everyone. My daughter is no longer the built in babysitter for
my son. Hello Kitty does all the work. I always set Kiki to parent mode, and
she does a great job. My two year old is already learning words in Japanese,
German, and French.
– As a single executive mom, I spend most of my home time on the computer
and phone and so don't have a lot of chance to interact with my 18-month old.
The HK robot does a great job of talking to her and keeping her occupied for
hours on end. Last night I came into the playroom around 1AM to find her,
still dressed (in her Hello Kitty regalia of course), curled sound asleep around
the big plastic Kitty Robo. How cute! (And, how nice not to hear those heart-
breaking lonely cries while I'm trying to get some work done.)
– Robo Kitty is like another parent at our house. She talks so kindly to my little
boy. He's even starting to speak with her accent! It's so cute. Robo Kitty puts
Max to sleep, watches TV with him, watches him in the bath, listens to him
read. It's amazing, like a best friend, or as Max says "Kitty Mommy!" Now
when I'm working from home I don't have to worry about Max asking a bunch
of questions or wanting to play or having to read to him. He hardly even talks
to me at all! He no longer asks to go to the park or the zoo – being a parent has
NEVER been so easy! Thank you Robo Kitty!
We are not presenting these anecdotal examples as rigorous evidence of how a
simple robot like Hello Kitty will generally be used. Other parents commenting
on the website were highly critical about these mothers being cold or undeserv-
ing of having children. We cannot authenticate these comments. Nonetheless
this example provides a worrying indication of what might be and what we need
to be prepared for. Perhaps it is only a small minority of parents who would rely
on such a simple robot to mind their pre-school children. But as more sophis-
ticated robots of the type we describe later become affordable, their use could
increase dramatically.
What follows is an examination of the present day and near-future childcare
robots and a discussion of potential ethical dangers that arise from their extended
use in caring for babies and young children. Our biggest concern is about what
will happen if children are left in the regular or near-exclusive care of robots.
First we briefly examine how near-future robots will be able to keep children
safe from harm and what ethical issues this may raise. Then we make the case,
from the results of research on child–robot interaction, that children can and will
form pseudo-relationships with robots and attribute mental states and sociality
to them. Children's natural anthropomorphism could be amplified and exploited
by the addition of a number of methods being developed through research on
human–robot interaction, for example, in the areas of conversation, speech, touch,
face and emotion recognition. We draw upon evidence from the psychological
literature on attachment and neglect to look at the possible emotional harm that
could result from children spending too much time exclusively in the company of
mechanical minders.
In the final section, we turn to current legislation and international ethical
guidelines on the care and rights of children to find out what protections they have
from sustained or exclusive robot care. Our aim is not to offer answers or solutions
to the ethical dangers but to inform and raise the issues for discussion. It is up to
society, the legislature and the professional bodies to provide codes of conduct to
deal with future robot childcare.
2. Keeping children from physical harm
An essential ingredient for consumer trust in childcare robots is that they keep
children safe from physical harm. The main method used at present is mobile
monitoring. For example, the PaPeRo Personal Partner Robot by NEC (Yoshiro
et al., 2005) uses cameras in the robot's 'eyes' to transmit images of the child to a
window on the parent-carer's computer or to their mobile phone. The carer can
then see and control the robot to find the child if she moves out of sight. This is
like having a portable baby monitor but it defeats the purpose of mechanical care.
There is little point in having a childcare robot if the busy carer has to continuously
monitor their child's behaviour. For costly childcare robots to be attractive to con-
sumers or institutions, they will need to have sufficient autonomous functioning
to free the carer’s time and call upon them only in unusual circumstances.
As a start in this direction, some childcare robots keep track of the location of
children and alert adults if they move outside of a preset perimeter. The PaPeRo
robot comes with PaPeSacks, each containing an ultrasonic sensor with a unique
signature. The robot can then detect the exact whereabouts of several children at
the same time and know which child is which. Similarly the Japanese Tmsuk robot
uses radio frequency identification tags. But more naturalistic methods of tracking
are now being developed that will eventually find their way into the care robot
market. For example, Lopes et al. (2009) have developed a method for tracking
people in a range of environments and lighting conditions without the use of sensor
beacons. This means that the robot will be able to follow a child outside and alert
carers of her location or encourage and guide her back into the home.
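To make the perimeter-alert behaviour concrete, the following is a minimal sketch, in Python, of the kind of check such a robot might run. It is illustrative only: the tag identifiers, the map frame, the alert mechanism and all numeric values are our own assumptions, not details of the PaPeRo or Tmsuk systems.

    import math

    # Illustrative perimeter check. Tag IDs, coordinates and the alert
    # channel are assumed for this sketch, not taken from any vendor's API.
    HOME_CENTRE = (0.0, 0.0)   # metres, in the robot's map frame
    SAFE_RADIUS = 10.0         # preset perimeter around the home

    def outside_perimeter(position):
        """True if a tracked (x, y) position lies beyond the safe radius."""
        dx = position[0] - HOME_CENTRE[0]
        dy = position[1] - HOME_CENTRE[1]
        return math.hypot(dx, dy) > SAFE_RADIUS

    def check_children(tag_positions, alert):
        """tag_positions maps a child's sensor-tag ID to an (x, y) estimate,
        e.g. from ultrasonic beacons or RFID. Alerts on any stray child."""
        for tag_id, position in tag_positions.items():
            if outside_perimeter(position):
                alert(tag_id, position)

    # A stand-in alert; a real robot would message the carer's phone.
    check_children({"child_A": (3.2, 4.1), "child_B": (9.5, 7.0)},
                   alert=lambda tag, pos: print("ALERT:", tag, "at", pos))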
We may also see the integration of care robots with other home sensing and
monitoring systems. There is considerable research on the development of smart
sensing homes for the frail elderly. These can monitor a range of potentially dan-
gerous activities such as leaving on water taps or cookers. They can monitor a per-
son getting out of bed and wandering. They can prompt the person with a voice to
remind them to go to the toilet and switch the toilet light on for them (Orpwood
et al., 2008). Vision systems can detect a fall and other sensors can determine if
assistance is required (Toronto Rehabilitation Unit Annual Report 2008, 40–41).
Simple versions of such systems could be adapted for use in robot childcare.
One ethical issue arising from such close monitoring is that every child has a
right to privacy under Articles 16 and 40 of the UN Convention on Child Rights.
It is fine for parents to listen out for their children with a baby alarm. Parents
also frequently video and photograph their young children’s activities. In most
circumstances legal guardians have the right to full disclosure regarding a very
young child. However, there is something different about an adult being present to
observe a child and a child being covertly monitored when she thinks that she is
alone with her robot friend.
Without making too much of this issue, when a child discusses something
with an adult, she may expect the discussion will be reported to a third party –
especially her parents. But sometimes conversations about issues concerning the
parents, such as abuse or injustice, should be treated in confidence. A robot might
not be able to keep such confidences from the parents before reporting the incident
to the appropriate authorities. Moreover, when a child has a discussion with a peer
friend (or robot friend) they may be doing so in the belief that it is in confidence.
With the massive memory hard drives available today, it would be possible
to record a child's entire life. This gives rise to concerns about whether such close
invigilation is acceptable. Important questions need to be discussed here such as,
who will be allowed access to the recordings? Will the child, in later life have the
right to destroy the records?
Privacy aside, an additional way to increase autonomous supervision would
be to allow customisation of home maps so that a robot could encode danger
areas. is could be extended with better vision systems that could detect poten-
tially dangerous activities like climbing on furniture to jump. A robot could make
a first pass at warning a child to stop engaging in a potentially dangerous
activity in the same way that smart sensing homes do for the elderly. But there is
another ethical problem lurking in the shadows here.
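Before turning to that problem, the fragment below sketches what such home-map customisation might look like: cells of a coarse grid map are marked as danger areas and a child's tracked position is tested against them before a warning is issued. The grid encoding, cell size and wording are our own illustrative assumptions, not a real product's map representation.

    # Illustrative danger-area map: a carer marks grid cells (D) around the
    # cooker or stairs. Format and values are assumptions for this sketch.
    GRID = [
        "..........",
        "..DD......",   # e.g. cooker
        "..........",
        ".......DD.",   # e.g. stairwell
        "..........",
    ]
    CELL_SIZE = 0.5  # metres per grid cell

    def in_danger_area(x, y):
        """Map a continuous (x, y) position onto the grid and test it."""
        col, row = int(x / CELL_SIZE), int(y / CELL_SIZE)
        if 0 <= row < len(GRID) and 0 <= col < len(GRID[row]):
            return GRID[row][col] == "D"
        return False  # off the map; left to the perimeter check above

    def warn_if_needed(position, speak):
        if in_danger_area(*position):
            speak("Please move away - that area is not safe.")

    warn_if_needed((1.2, 0.6), speak=print)  # lands in a danger cell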
If a robot could predict a dangerous situation, it could also be programmed
to autonomously take steps to physically prevent it rather than merely warn. For
example, it could take matches from the hands of a child, get between a child and
a danger area such as a re, or even restrain a child from carrying out a dangerous
or naughty action. However, restraining a child to avoid harm could be a slip-
pery slope towards authoritarian robotics. We must ask how acceptable it is for a
robot to make decisions that can affect the lives of our children by constraining
their behaviour.
It would be easy to construct scenarios where it would be hard to deny such
robot action. For example, if a child was about to run across the road into heavy
oncoming traffic and a robot could stop her, should it not do so? The problem is
in trusting the classifications and sensing systems of a robot to determine what is
a dangerous activity. As an extreme case, imagine a child having doughnuts taken
from her because the robot wanted to prevent her from becoming obese. There are
many discussions to be had over the extremes of robots blocking human actions
and where to draw the line (c.f. Wallach & Allen, 2009).
Another ethically tricky area of autonomous care is in the development of
robots to do what some might consider to be the ‘dull and dirty’ work of childcare.
They may eventually be able to carry out tasks such as changing nappies, bathing,
dressing, feeding and adjusting clothing and bedding to accord with temperature
changes. Certainly, robot facilities like these are being thought about and developed
in Japan with an eye to caring for their aging population (Sharkey and Sharkey,
in press). Performing such duties would allow lengthier absences from human
carers but could be a step too far in childcare robotics; care routines are an impor-
tant component in fostering the relationship between a child and her primary carer
to promote healthy mental development. If we are not careful to lay out guidelines,
robots performing care routines could exacerbate some of the problems we discuss
later in the section on the psychological harm of robot childcare.
Carers who wish to leave their charges at home alone with a robot will need
to be concerned about the possibility of intruders entering the home for nefari-
ous purposes. Security is a major growth area in robotics and care robots could
incorporate some of the features being developed. For example, the Seoul authori-
ties, in combination with the private security company KT Telecop, use a school
guard robot, OFRO, to watch out for potential paedophiles in school playgrounds.
It can autonomously patrol areas on pre-programmed routes and alert teachers if
it spots a person over a specific height (Metro, May 31st 2007; The Korea Times,
May 30th 2007). If we combine this with face recognition, already available on
some of the care robots, they could stop adults to determine if they were on the
trusted list and alert the authorities if necessary.
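The kind of trusted-list check this suggests can be sketched in a few lines. We assume a face-recognition backend that returns an identity label with a confidence score; the threshold, the names and the alert path are illustrative and are not drawn from OFRO or any existing care robot.

    # Illustrative trusted-list check layered on face recognition.
    # recognise_face(image) -> (identity_label, confidence in [0, 1]) is an
    # assumed interface standing in for whatever backend the robot has.
    TRUSTED_ADULTS = {"parent_1", "parent_2", "grandmother", "nanny"}
    MIN_CONFIDENCE = 0.8  # assumed threshold

    def handle_adult_sighting(face_image, recognise_face, alert):
        identity, confidence = recognise_face(face_image)
        if confidence < MIN_CONFIDENCE:
            # Too uncertain to act on automatically: flag for human review.
            alert("Unrecognised adult (confidence %.2f)" % confidence)
        elif identity not in TRUSTED_ADULTS:
            alert("Adult '%s' is not on the trusted list" % identity)
        # Otherwise a trusted adult: no action needed.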
3. Relating to the inanimate
Another essential ingredient for consumer trust in childcare robots is that chil-
dren must want to spend time with them. Research has already begun to find
ways to sustain long term relationships between humans and robots (e.g. Kanda
et al., 2004; Mitsunaga, et al., 2006; Mavridis et al., 2009). Care robots are already
being designed to exploit both natural human anthropomorphism and the bond
that children can form with personal toys. The attribution of animacy to objects
possessing certain key characteristics is part of being human (Sharkey & Sharkey
2006). Puppeteers have understood and exploited the willing or unconscious
“suspension of disbelief” for thousands of years as have modern animators and
cartoonists. The characteristics they exploit can be visual, behavioural or auditory.
Even the vaguest suggestion of a face brings an object to life; something as simple
as a sock can be moved in a way that makes it into a cute creature (Rocks et al.,
in press). Robots, by comparison, can greatly amplify anthropomorphic and zoo-
morphic tendencies. Unlike other objects, a robot can combine visual, movement
and auditory features to present a powerful illusion of animacy without a controller
being present.
Young children invest emotionally in their most treasured cuddly toy. They
may have difficulty sleeping without it and become distraught if it gets misplaced
or lost. The child can be asked, "What does Bear think about X?". Bear can reply
through the child's voice or by whispering in the child's ear or by simply nod-
ding or waving an arm. This is a part of normal childhood play and pretence that
requires imagination, with the child in control of the action. As Cayton (2006
p. 283) points out, “When children play make-believe, ‘let’s pretend’ games they
absolutely know it is pretend… Real play is a conscious activity. Ask a child who
is playing with a doll what they are doing and they may tell you matter-of-factly
that they are going to the shops or that the doll is sick but they will also tell you
that they are playing.”
A puppet, on the other hand, is outside of the child’s control and less imagina-
tion and pretence is required. But a child left alone with a puppet soon realises the
illusion and the puppet can then be classified in the 'let's pretend' category. The
difference with a robot is that it can still operate and act when no one is stand-
ing next to it or even when the child is alone with it. This could create physical,
social and relational anthropomorphism that a child might perceive as ‘real’ and
not illusion.
There is a gradually accumulating body of evidence that children of all ages
can come to believe in the reality of a relationship they have with robots. Melson
et al. (in press a) report three studies that employed Sony's robotic dog AIBO: (i) a
content analysis of 6,438 internet discussion forum postings by 182 AIBO owners;
(ii) observations and interviews with 80 preschoolers during a 40-minute play
period with AIBO and a stuffed dog; and (iii) observations and interviews with
72 school-age children from 7 to 15 years old who played with both AIBO and a
living dog. The majority of participants across all three studies viewed AIBO as a
social companion: both the preschool and older children said that AIBO “could
be their friend, that they could be a friend to AIBO, and that if they were sad, they
would like to be in the company of AIBO”.
In a related study, Kahn et al. (2006) looked at the responses of two groups
of preschoolers – 34–50 months and 58–74 months – in a comparison between an
AIBO and a stuffed dog. They found that a quarter of the children, in verbal evalu-
ations, accorded animacy to the AIBO, half accorded biological properties and
around two-thirds accorded mental states. But a very similar pattern of evalua-
tion was found for the stuffed dog. The interesting thing here is that the children's
behaviour towards the two artefacts did not fit with their evaluations. Based on
2,360 coded behavioural interactions, the children exhibited significantly more
apprehensive and reciprocal behaviours with the AIBO whilst they more often
mistreated the stuffed dog (184 occurrences versus 39 for AIBO). Thus the verbal
reports were not as reliable an indicator as the behavioural observations. The robot
was treated more like a living creature than the stuffed dog.
Children can also form relationships with humanoid robots. Tanaka et al. (2007)
placed a “state-of-the-art” social robot (QRIO) in a day care centre for 5 months.
They report that children between 10 and 24 months bonded with the robot in a
way that was significantly greater than their bonding with a teddy bear. Tanaka
et al. claim that the toddlers came to treat the robot as one of their peers. They
looked after it, played with it, and hugged it. They touched the robot more than
they hugged or touched a static toy robot, or a teddy bear. The researchers related
the children's relationship with the robot to Harlow's (1958) "affectional responses".
They claimed that "long-term bonding and socialization occurred between toddlers
and a state-of-the-art social robot” (Tanaka et al., 2007 p. 17957).
Turkle et al. (2006a) report a number of individual case studies that attest
to children’s willingness to become attached to robots. For example, one of the
case studies was of a ten year old girl, Melanie, who was allowed to take home a
robotic doll, "My Real Baby", and an AIBO for several weeks. The development
of the girl's relationship with the robots is apparent from her interview with
the researcher.
"Researcher: Do you think the doll is different now than when you first started
playing with it?
Melanie: Yeah. I think we really got to know each other a whole lot better. Our
relationship, it grows bigger. Maybe when I first started playing with her, she
didn't really know me so she wasn't making as much [sic] of these noises, but
now that she's played with me a lot more, she really knows me and is a lot more
outgoing. Same with AIBO" (Turkle et al. 2006b p. 352).
In another paper, Turkle et al. (2006b) chart the first encounters of 60 children
between the ages of five and thirteen with the MIT robots Cog and Kismet. The
children anthropomorphised the robots, made up "back stories" about their
behaviour, and developed "a range of novel strategies for seeing the robots not
only as 'sort of alive' but as capable of being friends and companions". The chil-
dren were so ready to form relationships with the robots, that when they failed
to respond appropriately to their interactions, the children created explanations
of their behaviour that preserved their view of the robot as being something with
which they could have a relationship. For example, when Kismet failed to speak to
them, children would explain that this was because it was deaf, or ill, or too young
to understand, or shy, or sleeping. Their view of the robots did not even seem to
change when the researchers spent some time showing them how they worked,
and emphasising their underlying machinery.
Melson and her colleagues (Melson et al., in press b) directly compared chil-
dren's views of and interactions with a living dog and a robotic dog. The children did
see the live dog as being more likely than the AIBO to have physical essences,
mental states, sociality and moral standing. However, a majority of the children
still thought of and interacted with AIBO as if it were a real dog; they were as likely
to give commands to the AIBO as to the living dog and over 60% affirmed that
AIBO had “mental states, sociality and moral standing”.
Overall, the pattern of evidence indicates that the illusion of robot animacy
works well for children from preschool to at least early teens. Robots appear to
amplify natural anthropomorphism. Children who spent time with robots saw
them as friends and felt that they had formed relationships with them. They even
believed that a relatively simple robot was getting to know them better as they
played with it more. A large percentage was also willing to attribute mental states,
sociality and moral standing to a simple robot dog. Kahn et al. (2006) suggest that
a new technological genre of autonomous, adaptive, personified and embodied
artefacts is emerging that the English language is not well-equipped to handle.
They believe that there may be a need for a new ontological category beyond the
traditional distinction between animate and inanimate.
4. Extending the reach of childcare robots
There are a number of ways in which current childcare robots interact with chil-
dren. The main methods involve touch, language with speech recognition, tracking,
maintaining eye contact and face recognition among others. Extending social
interaction with better computational conversation and the ability to respond
contingently with facial expressions could result in more powerful illusions of
personhood and intent for a young child. It could make child-robot relationships
stronger and maintain them for longer. We discuss each of the current interactive
features in turn together with their possible near-future extensions.
Touch is an important element of human interaction (Hertenstein et al., 2006)
particularly with young children (Hertenstein, 2002). It has been exploited in the
development of robot companions and several of the manufacturers have inte-
grated touch sensitivity into their childcare machines in different ways. It seems
obvious that a robot responding contingently to touch by purring or making pleas-
ing gestures will increase its appeal. For example, Tanaka et al. (2007) reported
that children were more interested in the QRIO robot when they discovered that
patting it on the head caused it to ‘giggle’.
The PaPeRo robot has four touch sensors on the head and five around its body
so that it can tell if it is being patted or hit. iRobiQ has a bump sensor, and touch
screen as well as touch sensors on the head, arms and wheels. The Probo robot
(Goris et al., 2008, 2009) is being developed to recognise different types of affective
touch such as slap, tickle, pet and poke. The Huggable (Stiehl et al. 2005, 2006) has
a dense sensor network for detecting the affective component of touch in rubbing,
petting, tapping, scratching and other types of interactions that a person nor-
mally has with a pet animal. It has four modalities for touch, pain, temperature
and kinaesthetic information.
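To illustrate the sort of processing affective touch implies, here is a toy classifier that labels a touch event from just its peak pressure and duration. The two features, thresholds and labels are deliberate simplifications of our own; systems such as Probo or the Huggable use far denser sensing than this sketch suggests.

    # Toy affective-touch classifier. Thresholds and labels are illustrative
    # assumptions, far simpler than the dense sensor networks cited above.
    def classify_touch(peak_pressure, duration_s):
        """peak_pressure normalised to 0..1; duration_s in seconds."""
        if peak_pressure > 0.8 and duration_s < 0.3:
            return "slap"    # hard and brief
        if peak_pressure < 0.3 and duration_s < 0.3:
            return "poke" if duration_s < 0.1 else "tickle"
        if peak_pressure < 0.5 and duration_s >= 1.0:
            return "pet"     # gentle and sustained
        return "unknown"

    # A robot might respond contingently, e.g. 'giggle' when patted:
    for event in [(0.9, 0.2), (0.2, 0.05), (0.3, 2.0)]:
        print(event, "->", classify_touch(*event))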
Ongoing experimental research on touch is finding out the best way to create
emotional responses (Yohanan et al., 2005; Yohanan and Maclean, 2008). There is
also research on the impact of a robot proactively touching people – like a "gimme
five" gesture or an encouraging pat on the shoulder (Cramer et al. 2009). Touch
technology will improve over the next few years with better, cheaper and smaller
sensors available to create higher resolution haptic sensitivity. This will greatly
improve the interaction and friendship links with small children.
Robots could even have an advantage over humans in being allowed to touch
children. In the UK, for example, there has been considerable discussion about
the appropriateness of touching children by teachers and child minders. Teachers
are reluctant to restrain children from hurting other children for fear of being
charged with sexual offences or assault. Similarly childcare workers and infant
school teachers are advised strongly not to touch children or hug them. Even
music teachers are asked not to touch children's hands to instruct them on how to
hold an instrument unless absolutely necessary and then only after warning them
very explicitly and asking for their permission. These restrictions would not apply
to a robot because it could not be accused of having sexual intent and so there are
no particular ethical concerns. The only concern would be the child's safety, e.g.
not being crushed by a hugging robot.
Another key element in interaction is spoken language. Even a doll with a
recorded set of phrases that can be activated by pulling a string can keep children
entertained for hours by increasing the feeling of living reality for the child. We
found eight of the current childcare robots that could talk to some extent and had
speech recognition capability for simple commands. For example, iRobi, by Yujin
Robotics of South Korea responds to 1000 words of voice commands. None had
a full blown natural language processing interface, yet they can create the illusion
of understanding.
The PaPeRo robot is one of the most advanced and can answer some simple
questions. For example, when asked, “What kind of person do you like?” it answers,
“I like gentle people”. It can even give children simple quizzes and recognise if
their answers are correct. PaPeRo gets out of conversational difficulties by making
jokes or by dancing to distract children. This is very rudimentary compared to
what is available in the rapidly advancing areas of computational natural language
processing and speech recognition. Such developments could lead to care robots
being able to converse with young children in a superficially convincing way
within the next 5 to 10 years.
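The 'illusion of understanding' described here needs surprisingly little machinery, as the following ELIZA-style sketch shows. It is our own toy illustration, not PaPeRo's dialogue system: keywords trigger canned replies, and anything unmatched falls back to a distraction, much as PaPeRo falls back on jokes or dancing.

    import random
    import re

    # ELIZA-style canned conversation: our own illustration of the
    # technique, not any robot's actual dialogue system.
    CANNED_REPLIES = {
        "like": "I like gentle people.",
        "name": "My name is Robo. What is yours?",
        "play": "Let's play a quiz! What colour is the sky?",
    }
    DISTRACTIONS = ["Watch me dance!", "Knock knock...", "Shall we sing a song?"]

    def reply(utterance):
        """Return a canned reply for the first matching keyword, else distract."""
        words = re.findall(r"[a-z']+", utterance.lower())
        for keyword, canned in CANNED_REPLIES.items():
            if keyword in words:
                return canned
        return random.choice(DISTRACTIONS)

    print(reply("What kind of person do you like?"))  # -> "I like gentle people."
    print(reply("Why is the moon round?"))            # -> a distraction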
Face recognition is another important factor in developing relationships (Kanda
et al., 2004). Some care robots are already able to store and recognise a limited num-
ber of faces, allowing them to distinguish between children and call them by name.
The RUBI robot system has built-in face detection that enables it to autonomously
find and gaze at a face. This is a very useful way to engage a child and convince her
that the robot has “intent”. Spurred on by their importance in security applications,
face recognition methods are improving rapidly. Childcare robots of the future will
adopt this technology to provide rapid face recognition of a wide range of people.
An even more compelling way to create the illusion of a robot having mental
states and intention is to give it the ability to recognise the emotion conveyed by
a child's facial expression. The RUBI project team has been working on expres-
sion recognition for about 15 years with their computer expression recogni-
tion toolbox (CERT) (Bartlett et al., 2008). This uses Ekman's facial action units
(Ekman & Friesen, 1978) which were developed to classify all human expressions.
The latest development uses CERT in combination with a sophisticated robot head
to mimic people's emotional expressions.
The head, by David Hanson, resembles Albert Einstein and is made of a poly-
mer material called Flubber that makes it resemble human skin and provides flexi-
bility of movement. Javier Movellan, the team leader, said that, "We got the Einstein
robot head and did a first pass at driving it with our expression recognition system.
In particular we had Einstein looking at himself in a mirror and learning how to
make expressions using feedback from our expression recognition. This is a trivial
machine learning problem." (personal communication, February 27, 2009). The
head can mimic up to 5,000 different expressions. This is still at an early stage of
development but will eventually "assist with the development of cognitive, social
and emotional skills of your children” (ibid).
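As a rough illustration of the action-unit approach underlying systems like CERT, the sketch below scores a set of detected action units (AUs) against prototype AU combinations for basic emotions (after Ekman & Friesen; e.g. AU6 plus AU12 for happiness). The detection interface, the threshold and the exact prototype sets used here are our assumptions and do not describe CERT's API.

    # Map detected facial action units (AUs) to a basic-emotion label by
    # overlap with prototype AU sets. Illustrative only; not CERT's API.
    PROTOTYPES = {
        "happiness": {6, 12},        # cheek raiser + lip corner puller
        "sadness":   {1, 4, 15},     # inner brow raiser, brow lowerer, lip depressor
        "surprise":  {1, 2, 5, 26},  # brow raisers, upper lid raiser, jaw drop
        "anger":     {4, 5, 7, 23},  # brow lowerer, lid raiser/tightener, lip tightener
    }

    def classify_expression(detected_aus):
        """detected_aus: set of AU numbers reported active by a detector."""
        def score(name):
            proto = PROTOTYPES[name]
            return len(detected_aus & proto) / len(proto)
        best = max(PROTOTYPES, key=score)
        return best if score(best) >= 0.5 else "neutral/unclear"

    print(classify_expression({6, 12}))     # -> happiness
    print(classify_expression({1, 4, 15}))  # -> sadness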
Robots can be programmed to react politely to us, to imitate us, and to behave
acceptably in the presence of humans (Fong et al., 2003). As the evidence pre-
sented earlier suggests, we have reached a point where it is possible to make chil-
dren believe that robots can understand them at least some of the time. Advances
in language processing, touch and expression recognition will act to strengthen
the illusion. Although such developments are impressive, they are not without
ethical concerns.
An infant entertaining a relationship with a robot may not be in a position
to distinguish this from a relationship with a socially and emotionally compe-
tent being. As Sparrow pointed out about relationships with robot pets, “[they]
are predicated on mistaking, at a conscious or unconscious level, the robot for a
real animal. For an individual to benefit significantly from ownership of a robot
pet they must systematically delude themselves regarding the real nature of their
relation with the animal. It requires sentimentality of a morally deplorable sort.
Indulging in such sentimentality violates a (weak) duty that we have to ourselves
to apprehend the world accurately. The design and manufacture of these robots is
unethical in so far as it presupposes or encourages this” (Sparrow, 2002).
Sparrow was talking about the vulnerable elderly but the evidence presented
in this section suggests that young children are also highly susceptible to the belief
that they are forming a genuine relationship with a robot. We could say in abso-
lute terms that it is ethically unacceptable to create a robot that appears to have
mental states and emotional understanding. However, if it is the child’s natural
anthropomorphism that is deceiving her, then it could be argued that there are no
moral concerns for the roboticist or manufacturer. After all, there are many similar
illusions that appear perfectly acceptable to our society. As in our earlier example,
when we take a child to a puppet show, the puppeteer creates the illusion that the
puppets are interacting with each other and the audience. The 'pretend' attitude of
the puppeteer may be supported by the parents to 'deceive' very young children
into thinking that the puppets have mental states. But this minor ‘deception’ might
better be called ‘pretence’ and is not harmful in itself as long as it is not exploited
for unethical purposes.
It is difficult to take an absolutist ethical approach to questions about robots
and deception. Surely the moral correctness comes down to the intended applica-
tion of an illusion and its consequences. Drawing an illusion on a piece of paper to
fool our senses is an entertainment, but drawing it on the road to fool drivers into
crashing is morally unjustifiable. Similarly, if the illusion of a robot with mental
states is created for a movie or a funfair or even to motivate and inspire children
at school, there is no harm.
The moral issue arises and the illusion becomes a harmful deceit both when
it is used to lure a child into a false relationship with a robot and when it leads
parents to overestimate the capabilities of a robot. If such an illusory relationship is
used in combination with near-exclusive exposure to robot care, it could possibly
damage a child emotionally and psychologically, as we now discuss.
5. Psychological risks of robot childcare
It is possible that exclusive or near exclusive care of a child by a robot could result in
cognitive and linguistic impairments. We only touch on these issues in this section
as our main focus here is on the ways in which a child’s relationship with a robot
carer could affect the child's emotional and social development and potentially
lead to pathological states. The experimental research on robot–child interaction
to date has been short term with limited daily exposure to robots and mostly under
adult supervision. It would be unethical to conduct experiments on long term care
of children by robots. What we can do, though, is make a 'smash and grab raid' on
the developmental psychology literature to extract pointers to what a child needs
for a successful relationship with a carer.
A fruitful place to start is with the considerable body of experimental research
on the theory of attachment (Ainsworth et al. 1978; Bowlby, 1969, 1980, 1998). This
work grew out of concerns about young children raised in contexts of less-than-
adequate care giving, who had later difficulties in social relatedness (Zeanah et al.,
2000). Although the term 'attachment' has some definitional difficulties, Hofer
(2006) has noted that it has “found a new usefulness as a general descriptive term
for the processes that maintain and regulate sustained social relationships, much
the same way that appetite refers to a cluster of behavioral and physiological pro-
cesses that regulate food intake” (p. 84).
A fairly standard definition that suits our purposes here is that "Infant attach-
ment is the deep emotional connection that an infant forms with his or her primary
caregiver, often the mother. It is a tie that binds them together, endures over
time, and leads the infant to experience pleasure, joy, safety, and comfort in the
caregiver's company. The baby feels distress when that person is absent. Sooth-
ing, comforting, and providing pleasure are primary elements of the relationship.
Attachment theory holds that a consistent primary caregiver is necessary for a
child's optimal development." (Swartout-Corbeil, 2006). Criticising such defini-
tions, Mercer (in press) acknowledges that while it is true that attachment has a
strong emotional component, cognitive and behavioural factors are also present.
There is always controversy within developmental psychology about the
detailed aspects of attachment. Our aim is not to present a novel approach to
attachment theory but to use the more established findings to warn about the
possibility of harmful outcomes from robot care of children. Here we take a broad
brush stroke approach to the psychological data. Given the paucity of research
on childcare robots we have not been age specific, but our concerns are predomi-
nantly with the lower age groups – babies to preschoolers up to five years old – that
appear to be the target group of the manufacturers.
One well established finding is that becoming well adjusted and socially attuned
requires a carer with sufficient maternal sensitivity to perceive and understand an
infant’s cues and to respond to them promptly and appropriately (Ainsworth et al.,
1974). It is this that promotes the development of secure attachment in infants and
allows them to explore their environment and develop socially. But insecure forms
of attachment can develop even when the primary carer is human. Extrapolating
from the developmental literature, we will argue below that a child left with a robot
in the belief that she has formed a relationship with it, would at best, form an
insecure attachment to the robot but is more likely to suffer from a pathological
attachment disorder.
Responding appropriately to an infant’s cues requires a sensitive and subtle
understanding of the infant’s needs. We have already discussed a number of ways
in which the relationship between a child and a robot can be enhanced when the
robot responds contingently to the child’s actions with touch, speech or emo-
tional expressions. When the responses are not contingent, pre-school children
quickly lose interest as Tanaka et al. (2007) found when they programmed a robot
to perform a set dance routine. However, there is a significant difference between
responding contingently and responding appropriately to subtle cues and signals.
We humans understand and empathise with a child’s tears when she falls because
we have experienced similar injuries when we were children, and we know what
comforted us.
There is more to the meaning of emotional signals than simply analysing
and classifying expressions. Our ability to understand the behaviour of others is
thought to be facilitated by our mirror neurons (Rizzolatti et al., 2000; Caggiano
et al., 2009). Gallese (2001) argues that a mirror matching system underlies our
ability to perceive the sensations and emotions of others. For instance, it is possible
to show that the same neurons become active when a person feels pain as when
observing another feeling pain (Hutchinson et al., 1999).
Responding appropriately to the emotions of others is a contextually sensitive
ability that humans are particularly skilled at from a very young age. Even new-
borns can locate human faces and imitate their facial gestures (Meltzoff & Moore,
1977). By 12 months, infants are able to interpret actions in context (Woodward &
Somerville, 2000). By 18 months, they can understand what another person
intends to do with an instrument and they will complete a goal-directed behaviour
that someone else fails to complete (Meltzoff, 1995; Herrmann et al., 2007).
No matter how good a machine is at classifying expressions or even respond-
ing with matching expressions, children require an understanding of the reasons
for their emotional signals. A good carer’s response is based on grasping the cause
of emotions rather than simply acting on the emotions displayed. We should
respond differently to a child crying because she has lost her toy than because she
has been abused. A child may over-react to a small event and a caring human may
realise that there is something else going on in the child’s life like the parents hav-
ing a row the night before. Appropriate responses require human common sense
reasoning over a very large, possibly innite, number of circumstances to ascertain
what may have caused an unhappy expression. “Come on now, cheer up”, might
not always be the best response to a sad face.
A human carer may not get a full and complete understanding of the context
of an emotion every time but they will make a good guess with a high hit rate and
can then recalculate based on the child’s subsequent responses.
Advances in natural language processing using statistical methods to search
databases containing millions of words could lead to superficially convincing con-
versations between robots and children in the near-future. However we should
not mistake such interactions as being meaningful in the same way as caring
adult–child interactions. It is one thing for a machine to give a convincing con-
versational response to a remark or question and a completely different thing to
provide appropriate guidance or well founded answers to puzzling cultural ques-
tions. There are many cues that an adult human uses to understand what answer
the child requires and at what level.
Language interactions between very young children and adults are transac-
tional in nature – both participants change over time. Adults change register
according to the child's abilities and understanding. They continuously assess the
child's comprehension abilities through both language and non-verbal cues and
push the child's understanding along. This is required both for language devel-
opment and cognitive development in general. It would be extremely difficult to
find specifiable rules that a robot could apply for transactional communication to
adequately replace a carer’s intuitions about appropriate guidance.
The consequence for children of contingent but inappropriate responses could
be an insecure attachment called ‘anxious avoidant attachment’. Typically, mothers
with insecurely attached children are, “less able to read their infant’s behaviour,
leading them to try to socialise with the baby when he is hungry, play with him
when he is tired, and feed him when he is trying to initiate social interaction”
(Ainsworth et al., 1974 p. 129). Babies with withdrawn or depressed mothers are
more likely to suffer aberrant forms of attachment: avoidance, or disorganised
attachment (Martins & Gaffan, 2000).
'Maternal sensitivity' provides a detailed understanding of an infant's emo-
tional state. Responses need to be tailor made for each child's particular personality.
A timid child will need a different response from an outgoing one, and a tired
child needs different treatment from a bored one. Off-the-shelf responses, how-
ever benign, will not create secure attachment for a child: “If he’s bored he needs
a distraction. If he’s hungry he needs food. If he has caught his foot in a blanket, it
needs releasing. Each situation requires its own tailor-made response, suitable for
the personality of a particular baby. Clearly, it isn’t much use being given a rattle
when you are hungry, nor being rocked in your basket if your foot is uncomfort-
ably stuck" (Gerhardt, 2004 p. 197).
Another important aspect of maternal sensitivity is the role played by “mind-
mindedness”, or the tendency of a mother to “treat her infant as an individual with
a mind rather than merely as a creature with needs that must be satisfied" (Meins
et al., 2001). Mind-mindedness has also been shown to be a predictor of the security
of attachment between the infant and mother. It comes from the human ability to
form a theory of mind based on knowledge of one’s own mind and the experience
of others. It allows predictions about what an infant may be thinking or intend-
ing by its actions, expressions and body language. A machine without a full blown
theory of mind (or a mind) could not easily demonstrate mind-mindedness.
Other types of insecure attachment are caused by not paying close enough
attention to a child’s needs. If the primary carer responds unpredictably, it
can lead to an ambivalent attachment where the child tends to overly cling to
her caregiver and to others. More recently, a fourth attachment category, dis-
organised attachment, has been identified (Solomon & George, 1999; Schore,
2001). It tends to result from parents who are overtly hostile and frightening to
their children, or who are so frightened themselves that they cannot attend to
their children’s needs. Children with disorganised attachment have no consistent
attachment behaviour patterns.
While it seems unlikely that a robot could show a sufficient level of sensitivity
to engender secure attachment, it could be argued that the robot is only stand-
ing in for the mother in the same way as a human nanny stands in. But a poor
nanny can also cause emotional or psychological damage to a child. Children and
babies are resilient but there is clear evidence that children do better when placed
with childminders who are highly responsive to them. Elicker et al. (1999) found
that the security of attachment of children (aged 12 to 19 months) to their child-
care providers varied depending on the quality of their interactions. Dettling et al.
(2000) studied children aged between 3 and 5 years old in home-based day care.
They found that when they were looked after by a focused and responsive carer,
their stress levels, as measured by swabbing them for cortisol, were similar to those
of children cared for at home by their mother. In contrast, cortisol testing of chil-
dren cared for in group settings with less focused attention indicated increased
levels of stress. Belsky et al. (2007) found that children between 4.5 and 12 years
old were more likely to have problems, as reported by teachers, if they had spent
more time in childcare centres. At the same time they found that an effect of higher
quality care showed up in higher vocabulary scores.
Thus even regular part-time care by a robot may cause some stress and minor
behavioural problems for children. But we are not suggesting that occasional use
will be harmful, especially if the child is securely attached to their primary carer;
it may be no more harmful than watching television for a few hours. However, it is
difficult at present, without the proper research, to compare the impact of passive
entertainment to a potentially damaging relationship with an interactive artefact.
The impact will depend on a number of factors such as the age of the child, the
type of robot and the tasks that the robot performs.
In our earlier discussion of robot–child interaction research, we noted claims
that children had formed bonds and friendships with robots. However, in such
research, the terms 'attachment', 'bonding' and 'relationship' are often used in a
more informal or different way than in developmental psychology. This makes it
difficult to join them at the seams. Attachment theorists are not just concerned
with the types of attachment but also with their consequences. As Fonagy (2003)
pointed out, attachment is not an end in itself, although secure attachment is
associated with better development of a wide range of abilities and competencies.
Secure attachment provides the opportunity “to generate a higher order regulatory
mechanism: the mechanism for appraisal and reorganisation of mental contents”
(Fonagy, 2003 p. 230).
A securely attached child develops the ability to take another’s perspective.
When the mother or carer imitates or reflects their baby's emotional distress in
their facial expression, it helps the baby to form a representation of their own
emotions. This social biofeedback leads to the development of a second order sym-
bolic representation of the infant's own emotional state (Fonagy, 2003; Gergely &
Watson, 1996, 1999), and facilitates the development of the ability to empathise,
and understand the emotions and intentions of others. These are not skills that any
near-future robot is likely to have.
When a young child encounters unfamiliar or ambiguous circumstances,
they will, if securely attached, look to their caregiver for clues about how to behave.
This behaviour is termed "social referencing" (Feinman 1982). The mother or
carer provides clues about the dangers, or otherwise, of the world, particularly
by means of their facial expressions. For example, Hornik et al. (1987) found that
securely attached infants played more with toys that their mothers made positive
emotional expressions about, and less with those that received negative expres-
sions. A more convincing example of the powerful effect of social referencing is
provided by research using a Gibson visual cliff. The apparatus, frequently used in
depth-perception studies, gives the child an illusion of a sheer drop onto the floor
(the drop is actually made safe by being covered with a clear plexiglass panel). Ten
month olds will look at their mother’s face, and continue to crawl over the appar-
ent perilous edge towards an attractive toy if their mothers smile and nod. They
back away if their mothers look fearful or doubtful (Sorce et al., 1985).
It would certainly be possible to create a robot that provided facial indications
of approval or disapproval of certain actions for the child. But before a robot can
approve or disapprove, it needs to be able to predict and recognise what action the
child is intending. And even if it could predict accurately, it would need to have a
sense of what is or is not a sensible action for a given child in a particular circum-
stance. With such a wide range and large number of possible actions that a child
could intend, it seems unlikely that we could devise a robot system to make appro-
priate decisions. As noted from the studies cited above, it is important that responses
are individually tailored, sensitive to the child’s needs, consistent and predictable.
6. Is robot care better than minimal care?
Despite the drawbacks of robot care, it could be argued that it would be prefer-
able to, and less harmful than, leaving a child with minimal human contact. Studies
of the shocking conditions in Romanian orphanages show the effects of extreme
neglect. Nelson et al. (2007) compared the cognitive development of young children
reared in Romanian institutions to that of those moved to foster care with fami-
lies. Children were randomly assigned to be either fostered, or to remain in insti-
tutional care. The results showed that children reared in institutions manifested
greatly diminished intellectual performance (borderline mental retardation) com-
pared to children reared in their foster families. Chugani et al. (2001) found that
Romanian orphans who had experienced virtually no mothering, differed from
children of comparable ages in their brain development – and had less active orbito-
frontal cortex, hippocampus, amygdala and temporal areas.
But would a robot do a better job than scant human contact? We have no
explicit evidence but we can get some clues from animal research in the 1950s
when there was less concern about ethical treatment. Harlow (1959) compared
the effect on baby monkeys of being raised in isolation with two different types of
artificial "mother": a wire-covered, or a soft terry-cloth covered wire frame surro-
gate "mother". Those raised with the soft mother substitute became attached to it,
and spent more time with it than with the wire covered surrogate even when the
wire surrogate provided them with their food. Their attachment to the surrogate
was demonstrated by their increased confidence when it was present – they would
return and cling to it for reassurance, and would be braver – venturing to explore
a new room and unfamiliar toys, instead of cowering in a corner. The babies fed
quickly from the wire surrogate and then returned to cuddle and cling to the terry
cloth one.
This suggests that human infants might do better with a robot carer than with
no carer at all. But the news is not all good. Even though the baby monkeys became
attached to their cloth covered surrogates, and obtained comfort and reassur-
ance from them, they did not develop normally. They exhibited odd behaviours
and “displayed the characteristic syndrome of the socially-deprived macaque:
they clutched themselves, engaged in non-nutritive sucking, developed stereo-
typed body-rocking and other abnormal motor acts, and showed aberrant social
responses” (Mason & Berkson, 1975).
Although Harlow’s monkeys clearly formed attachments to inanimate surro-
gate mothers, the surrogates left them seriously lacking in the skills needed to
reach successful maturity. Of course, a robot nanny could be more responsive than
the cuddly surrogate statues. In fact when the surrogate terry-cloth mother was
hung from the ceiling so that the baby monkeys had to work harder to hug it as
it swung, they developed more normally than when the surrogate was stationary
(Mason & Berkson, 1975). But these were not ideal substitutes for living mothers.
The monkeys did even better when they were raised in the company of dogs which
were not mother substitutes at all.
We could conclude that robots would be better than nothing in horrific situa-
tions like the Romanian orphanages. But they would really need to be a last resort.
Without systematic experimental work we cannot tell whether or not exclusive
care by a robot would be pathogenic. It is even possible that the severe deprivation
exclusive care might engender could lead to the type of impaired development
pattern found in Reactive Attachment Disorder (RAD) (Zeanah et al., 2000). RAD
was first introduced in DSM-III (American Psychiatric Association, 1980). The
term is used in both the World Health Organization’s International Statistical
Classification of Diseases and Related Health Problems (ICD-10) and in the DSM-
IV-TR (American Psychiatric Association, 1994).
Reactive Attachment Disorder is defined by inappropriate social relatedness,
as manifest either in (i) failure to appropriately initiate or respond to social encoun-
ters, or (ii) indiscriminate sociability or diffuse attachment. Although Rushton and
Mayes (1997) warn against the overuse of the diagnosis of RAD, it is still possible
that the inappropriate and exclusive care of a child by a robot could lead to behaviour
indicative of RAD.
Another worry is that a “robots are better than nothing” argument could lead
to a more widespread use of the technology in situations where there is a shortage
of funding, and where what is actually needed is more staff and better regulation.
It is a different matter to use a teleoperated robot as a parental stand-in for children
who are in hospitals, perhaps quarantined or whose parent needs to be far away.
Robots under development like the MIT Huggable (Stiehl et al. 2005, 2006) or the
Probo (Goris et al. 2008, 2009) full that role and allow carers to communicate
and hug their children remotely. Such robots do not give rise to the same ethical
concerns as exclusive or near exclusive care by autonomous robots.
Overall, the evidence presented in this section points to the kinds of emotional harm that robot carers might cause if infants and young children, lacking appropriate human attachment, were overexposed to them at critical periods in their development. We have reviewed evidence of the kinds of human skills and sensitivities required to create securely attached children and compared these with current robot functionality. While we have no direct experimental support as yet, it seems clear that robots lack the abilities needed to adequately replace human carers. Given the potential dangers, much more investigation needs to be carried out before robot nannies are freely available on the market.
Legal protections and accountability
The whole idea of robot childcare is a new one and has not yet reached the statute books. There have been no legal test cases and there is little provision in the law. The various international nanny codes of ethics (e.g. FICE Bulletin 1998) do not deal with the robot nanny, but they do require the human nanny to ensure that the child is socialised with other children and adults and taught social responsibility and values. These requirements are not enforceable by law.
There are a number of variations in the child protection laws of different European countries, the USA and other developed countries, but essentially legal cases against the overuse of robot care would have to be mounted on grounds of neglect, abuse or mistreatment, and perhaps on grounds of delaying social
and mental development. The National Society for the Prevention of Cruelty to Children (NSPCC) in the UK regards neglect as “the persistent lack of appropriate care of children, including love, stimulation, safety, nourishment, warmth, education and medical attention. It can have a serious effect on a child’s physical, mental and emotional development. For babies and very young children, it can be life-threatening.”
There are currently no international guidelines, codes of practice or legislation specifically dealing with a child being left in the care of a robot. There has been talk from the Japanese Ministry of Trade and Industry (Lewis, 2007) and the South Korean Ministry of Economy, Trade and Industry (Yoon-mi, 2007) about drawing up ethical and safety guidelines. The European Robotics Research Network has also identified a number of areas in robotics needing ethical guidelines (Veruggio, 2006), but no guidelines or codes have yet appeared from any of these sources. Some even argue that, “because different cultures may disagree on the most appropriate uses for robots, it is unrealistic and impractical to make an internationally unified code of ethics” (Guo & Zhang, 2009). There is certainly some substance to this argument, as Guo and Zhang (2009) point out: “the value placed on the development of independence in infants and toddlers could lead to totally divergent views of the use of robots as caregivers for children.” However, despite cultural differences, we believe that there are certain inviolable rights that should be afforded to all children regardless of culture, e.g. all children have a right not to be treated cruelly, neglected, abused or emotionally harmed.
The United Nations Convention on the Rights of the Child gives 40 major rights to children and young persons under 18. The most pertinent of these is Article 19, which states that governments must do everything to protect children from all forms of violence, abuse, neglect and mistreatment. Article 27 requires that “States Parties recognize the right of every child to a standard of living adequate for the child’s physical, mental, spiritual, moral and social development”. These articles could be seen to apply loosely to the care of children by robots, but their application is certainly far from clear.
In the USA, Federal legislation identifies a minimum set of acts or behaviours that define child abuse and neglect. The Federal Child Abuse Prevention and Treatment Act (CAPTA) (42 U.S.C.A. §5106g), as amended by the Keeping Children and Families Safe Act of 2003, defines child abuse and neglect as, at minimum:
– Any recent act or failure to act on the part of a parent or caretaker which results in death, serious physical or emotional harm, sexual abuse or exploitation; or
– An act or failure to act which presents an imminent risk of serious harm.
Under US federal law, neglect is divided into a number of different sections. The most appropriate for our purposes, and one that does not appear under UK or
European law, is emotional or psychological abuse. Emotional or psychological abuse is defined as “a pattern of behavior that impairs a child’s emotional development or sense of self-worth.” This may include constant criticism, threats, or rejection, as well as withholding love, support, or guidance. Emotional abuse is often difficult to prove and, therefore, child protective services may not be able to intervene without evidence of harm or mental injury to the child. “Emotional abuse is almost always present when other forms are identified.” (What is Child Abuse and Neglect Factsheet).
Although much of the research on child–robot interaction has been conducted in the USA, the main manufacturers and, currently, the main target audience are in Japan and South Korea. As in the other countries mentioned, the only legislation available to protect Japanese children from overextended care by robots is the Child Abuse Prevention Law 2000. “The Law defines child abuse and neglect into four categories: (i) causing external injuries or other injuries by violence; (ii) committing acts of indecency on a child or forcing a child to commit indecent acts; (iii) neglecting a child’s needs such as meals, leaving them for a long time, etc.; and (iv) speaking and behaving in a manner which causes mental distress for a child.” (Nakamura, 2002).
In South Korea it may be harder to prevent the use of extended robot childcare. Hahm and Guterman (2001) point out that “South Korea has had a remarkably high incidence and prevalence rates of physical violence against children, yet the problem has received only limited public and professional attention until very recently” (p. 169). The problem is that “South Koreans strongly resist interference in family lives by outsiders because family affairs, especially with regard to child-rearing practices are considered strictly the family’s own business.” The one place where it might be possible to secure a legal case against near-exclusive care by robots is in the recently revised Special Law for Family Violence Criminal Prohibition (1998). This includes the Child Abuse and Neglect Prevention Act, which is similar to the laws of other civilised countries: “the new law recognises that child maltreatment may entail physical abuse, sexual abuse, emotional abuse or neglect”.
In the UK, a case against robot care would have to be built on provisions in the Children and Young Persons Act (1933, with recent updates) for leaving a child unsupervised “in a manner likely to cause unnecessary suffering or injury to health”. The law does not even specify at what age a person can be a babysitter; it only states that when a babysitter is under the age of 16, the parents of the child being “sat” are legally responsible for ensuring that the child does not come to harm.
Under UK law, a child does not have to suffer actual harm for a case of neglect to be brought. It is sufficient to show that the child has been kept in “a manner
likely to cause him unnecessary suffering and injury to health”, as in the case of R v Jasmin, L (2004) 1CR, App.R (s) 3. The Appellants had gone to work for periods of up to 3 hours, leaving their 16-month-old child alone in the home. This happened on approximately three separate occasions. The Appellants were both found guilty of offences relating to neglect contrary to S1(1) Children and Young Persons Act 1933 and were sentenced to concurrent terms of 2 years’ imprisonment. Summing up, Lord Justice Law said that “… there was no evidence of any physical harm resulting from this neglect [but] … both parents had difficulty in accepting the idea that their child was in any danger”.
The outcome would have been different if the parents had left the child alone in exactly the same way but had stayed at home in a different room. If they could have shown that they were monitoring the child with a baby monitor (and perhaps a CCTV camera), the case against them would have been weak and it is highly unlikely that they would have been prosecuted.
This case is relevant for the protection of children against robot care because near-future robots, as discussed earlier, could provide safety from physical harm and allow remote monitoring combined with autonomous alerting and a way for the parents to communicate remotely with their children. The mobile remote monitoring available on a robot would be significantly better than a static camera and baby monitor. If absent parents had such a robot system and could reach the child within a couple of minutes, it would be difficult to prove negligence. The time to get home is probably crucial. We could play the game of gradually moving the parents’ place of work further and further away to derive a threshold time of permissibility. It then becomes like the discussion of how many hairs you have to remove from someone’s head before they can be called bald. These are the kinds of issues that will only be decided by legal precedent.
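To make the point concrete, consider what such an alerting system would actually have to encode. The sketch below is a minimal illustration in Python, not any manufacturer’s actual software: the function names, the notification mechanism and, above all, the 120-second response threshold are our hypothetical assumptions. Its purpose is to show that a ‘threshold time of permissibility’ would have to appear as a literal constant in the robot’s software, even though, as argued above, no principled value for it exists.

```python
# Hypothetical sketch of autonomous alerting with a response-time threshold.
# All names and values are illustrative assumptions, not an existing product.
from dataclasses import dataclass
import time

# The legally undecided quantity discussed in the text: how long a designated
# adult may take to respond before the robot escalates. Two minutes is an
# arbitrary placeholder; any real value would be set by precedent, not code.
PERMISSIBLE_RESPONSE_SECONDS = 120.0

@dataclass
class Alert:
    timestamp: float            # when the hazard was detected
    description: str
    acknowledged: bool = False  # has a designated adult responded?

def notify_designated_adult(alert: Alert) -> None:
    # Stand-in for, e.g., a push message to the parent's phone.
    print(f"NOTIFY parent: {alert.description}")

def escalate(alert: Alert) -> None:
    # Stand-in for contacting a second adult or the emergency services.
    print(f"ESCALATE: no response to '{alert.description}' within threshold")

def monitor_step(hazard_detected: bool, pending: list) -> None:
    """One pass of the monitoring loop: raise an alert on any detected hazard
    and escalate alerts that remain unacknowledged past the threshold."""
    now = time.time()
    if hazard_detected:
        alert = Alert(now, "possible hazard near child")
        pending.append(alert)
        notify_designated_adult(alert)
    for alert in pending:
        if not alert.acknowledged and now - alert.timestamp > PERMISSIBLE_RESPONSE_SECONDS:
            escalate(alert)
            alert.acknowledged = True  # mark handled to avoid repeated escalation
```

Whatever value a manufacturer chose for such a constant would, in effect, pre-empt the legal question of how long a child may permissibly be left waiting for an adult.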
Another important question about robot care is: who would be responsible and accountable for psychological and emotional harm to the child? Under current legislation it would be the parents or primary carers. But it may not be fair to hold parents or primary carers entirely responsible. Assuming that the robot could demonstrably keep the child safe from physical harm, the parents may have been misled by the nature of the product. For example, if a carer’s anthropomorphism had been amplified as a result of some very clever robot–human interaction, then that carer may have falsely believed that the robot had mental states and could form ‘real’ relationships.
This leads to problems in determining accountability beyond the primary carer. Allocating responsibility to the robot would be ridiculous. That would be like holding a knife responsible for a murder – we are not talking about hypothetical sentient robots here. But blaming others also has its difficulties. There is a potentially long chain of responsibility that may involve the carer, the manufacturer
and a number of third parties such as the programmers and the researchers who developed the kit. This is yet another of the many reasons why there is a need to examine the ethical issues before the technology is developed for the mass market. Codes of practice and even legislation are required to ensure that advertising claims are realistic and that the product carries warnings about the potential dangers of overuse.
If a case of neglect is eventually brought to court because of robot care, a large corporation with commercial interests may put the finest legal teams to work. Their argument could be based on demonstrating that a robot could both keep a child safe from physical harm and alert a designated adult about imminent dangers in time for intervention. It would be more difficult to prove emotional harm because many children have emotional problems regardless of their upbringing. Pathological states can be genetic in origin or result from prenatal brain damage, among other possible causes. Thus a legal case of neglect is most likely to be won if an infant or a baby is discovered at home alone with an unsafe robot.
Conclusions
We have discussed a trajectory for childcare robotics that appears to be moving towards sustained periods of care, with the possibility of near-exclusive care. We examined how childcare robots could be developed to keep children safe from physical harm. Then we looked at research showing children forming relationships and friendships with robots and coming to believe that the robots had mental states. After that, we examined the functionality of current childcare robots and discussed how it could be extended in the near future to create more ‘realistic’ interactions between children and robots, and intensify the illusion of genuine relationships.
Our main focus throughout has been on the potential ethical risks that robot childcare poses. The ethical problems discussed here could be among those that society will have to solve over the next 20 years. The main issues and questions we raised were:
– Privacy: every child has a right to privacy under Articles 16 and 40 of the UN Convention on Child Rights. How much would the use of robot nannies infringe these rights?
– Restraint: There are circumstances where a robot could keep a child from serious physical harm by restraining her. But how much autonomous decision authority should we give to a robot childminder?
– Deception: Is it ethically acceptable to create a robot that fools people into believing that it has mental states and emotional understanding? In many circumstances this can be considered to be natural anthropomorphism, illusion and fun pretence. Our concerns are twofold: (i) it could lead parents to overestimate the capabilities of a robot carer and to imagine that it could meet the emotional needs of a child; and (ii) it could lure a child into a false relationship that may damage her emotionally and psychologically if the robot is overused for her care.
– Accountability: Who is morally responsible for leaving children in the care of robots? The law on neglect puts the duty of care on the primary carer. But should the primary carer shoulder the whole moral burden or should others, such as the manufacturers, take some share in the responsibility?
– Psychological damage: Is it ethically acceptable to use a robot as a nanny substitute or as a primary carer? This was the main question explored. If our analysis of the potentially devastating psychological and emotional harm that could result is correct, then the answer is a resounding ‘no’.
In our exploration of the developmental difficulties that could be caused by robot care, we have assumed that the care would be regular, daily and possibly near-exclusive. We also discussed evidence that part-time outside care can cause children minor harm from which they can later recover. Realistically, a couple of hours a day in the care of a robot is unlikely to be any more harmful than watching television – if we are careful about what we permit the robot to do. We just don’t know whether there is a continuum between the problems that could arise with exclusive care and those that may arise with regular short-term care.
In a brief overview of international laws, we found that the main legal protection children have is under the laws of neglect. A major concern was that as robots become safer, protect children from physical harm and ensure that they are fed and watered, it will become harder to make a case for neglect. However, the quality of robot interaction we can expect, combined with the evidence from developmental studies on attachment, suggests that robots would at best be insensitive carers, unable to respond with sufficient attention to the finely detailed needs of individual children.
As we stated at the outset, we are seeking discussion of these matters rather than attempting to offer answers or solutions. The robotics community needs to consider questions like the ones we have raised, and take them up, where possible, with their funders, the public and policy makers. Ultimately, it will be up to society, the legislature and professional bodies to provide codes of conduct to deal with future robot childcare.
Note
1. Maternal sensitivity is a term used even when the primary carer is not the “mother”.
References
Ainsworth, M., Blehar, M., Waters, E. & Wall, S. (1978). Patterns of attachment: a psychological
study of the strange situation. Hillsdale, NJ: Lawrence Erlbaum Associates, Inc.
Ainsworth, M.D.S., Bell, S.M., & Stayton, D.J. (1974). Infant-mother attachment and social development: Socialisation as a product of reciprocal responsiveness to signals. In M.P.M. Richards (Ed.), The introduction of the child into a social world. London: Cambridge University Press.
Bartlett, M.S., Littlewort-Ford, G.C., & Movellan, J.R. (2008). Computer Expression Recogni-
tion Toolbox. Demo: 8th International IEEE Conference on Automatic Face and Gesture
Recognition. Amsterdam.
Belsky, J., Vandell, D.L., Burchinal, M., Clarke-Stewart, K.A., McCartney, K., Owen, M.T. (2007). Are there long-term effects of early child care? Child Development, 78(2), 681–701.
Bittybobo. URL: http://bittybobo.blogspot.com/search?updated-min=2008-01-01T00%3A00%3A00-08%3A00&updated-max=2009-01-01T00%3A00%3A00-08%3A00&max-results=23 (comments under April 17, 2008), last accessed 15 January 2010.
Blum, D. (2003). Love at Goon Park: Harry Harlow and the Science of Affection. John Wiley: Chichester, England.
Bowlby, J. (1969). Attachment and Loss: Volume 1: Attachment. London: Hogarth Press.
Bowlby, J. (1980). Attachment and Loss: Volume 3: Loss. London: Hogarth Press.
Bowlby, J. (1998). (edition originally 1973) Attachment and Loss: Volume 2: Separation, anger
and anxiety. London: Pimlico.
Caggiano, V., Fogassi, L., Rizzolatti, G., Thier, P., & Casile, A. (2009). Mirror neurons differentially encode peripersonal and extrapersonal space of monkeys. Science, Vol. 324, pp. 403–406.
Cayton, H. (2006). From childhood to childhood? Autonomy and dependence through the ages of life. In Julian C. Hughes, Stephen J. Louw, Steven R. Sabat (Eds), Dementia: mind, meaning, and the person. Oxford, UK: Oxford University Press, 277–286.
Children and Young Persons Act 1933. UK Statute Law Database, Part 1 Prevention of cruelty and exposure to moral and physical danger: Offences: 12 Failing to provide for safety of children at entertainments. URL: http://www.statutelaw.gov.uk/legResults.aspx?LegType=All+Legislation&searchEnacted=0&extentMatchOnly=0&confersPower=0&blanketAmendment=0&sortAlpha=0&PageNumber=0&NavFrom=0&activeTextDocId=1109288, last accessed 15 January 2010.
Chugani, H., Behen, M., Muzik, O., Juhasz, C., Nagy, F. & Chugani, D. (2001). Local brain
functional activity following early deprivation: a study of post-institutionalised Romanian
orphans. Neuroimage, 14: 1290–1301.
Cramer, H.S., Kemper, N.A., Amin, A., & Evers, V. (2009). The effects of robot touch and proactive behaviour on perceptions of human–robot interactions. In Proceedings of the 4th ACM/IEEE international Conference on Human Robot interaction (La Jolla, California, USA, March 09–13, 2009). HRI ‘09. ACM, New York, NY, 275–276.
Dautenhahn, K., & Werry, I. (2004). Towards Interactive Robots in Autism Therapy: Background, Motivation and Challenges. Pragmatics and Cognition, 12(1), pp. 1–35.
Dautenhahn, K. (2003). Roles and Functions of Robots in Human Society – Implications from Research in Autism Therapy. Robotica, 21(4), pp. 443–452.
Dettling, A., Parker, S., Lane, S., Sebanc, A., & Gunnar, M. (2000). Quality of care determines whether cortisol levels rise over the day for children in full-day childcare. Psychoneuroendocrinology, 25, 819–836.
Ekman P. & Friesen, W. (1978). Facial Action Coding System: A Technique for the Measurement
of Facial Movement, Consulting Psychologists Press, Palo Alto, CA.
Elicker, J., Fortner-Wood, C., & Noppe, I.C. (1999). The context of infant attachment in family child care. Journal of Applied Developmental Psychology, 20(2), 319–336.
Feinman, S., Roberts, D., Hsieh, K.F., Sawyer, D. & Swanson, K. (1992), A critical review of
social referencing in infancy, in Social Referencing and the Social Construction of Reality in
Infancy, S. Feinman, Ed. New York: Plenum Press.
FICE Bulletin (1998). A Code of Ethics for People Working with Children and Young People. URL:
http://www.ance.lu/index.php?option=com_content&view=article&id=69:a-code-of-
ethics-for-people-working-with-children-and-young-people&catid=10:ce-declaration-
2006&Itemid=29, last accessed 15 January 2010.
Fonagy, P. (2003). The development of psychopathology from infancy to adulthood: The mysterious unfolding of disturbance in time. Infant Mental Health Journal, 24(3), 212–239.
Fong, T., Nourbakhsh, I., & Dautenhahn, K. (2003). A Survey of Socially Interactive Robots. Robotics and Autonomous Systems, 42(3–4), 143–166.
Gallese, V. (2001). The shared manifold hypothesis: From mirror neurons to empathy. Journal of Consciousness Studies, 8, 33–50.
Gergely, G., & Watson, J. (1996). The social biofeedback model of parental affect-mirroring. International Journal of Psycho-Analysis, 77, 1181–1212.
Gergely, G., & Watson, J. (1999). Early social-emotional development: Contingency perception and the social biofeedback model. In P. Rochat (Ed.), Early social cognition: Understanding others in the first months of life (pp. 101–137). Hillsdale, NJ: Erlbaum.
Gerhardt, S. (2004). Why love matters: how affection shapes a baby’s brain. Routledge Taylor and Francis Group, London and New York.
Goris, K., Saldien, J., Vanderniepen, I., & Lefeber, D. (2008). The Huggable Robot Probo, a Multi-disciplinary Research Platform. Proceedings of the EUROBOT Conference 2008, Heidelberg, Germany, 22–24 May, 2008, pages 63–68.
Goris, K., Saldien, J., & Lefeber, D. (2009). Probo: a testbed for human robot interaction. In
Proceedings of the 4th ACM/IEEE International Conference on Human Robot interaction
(La Jolla, California, USA, March 09–13, 2009). HRI ‘09. ACM, New York, NY, 253–254.
Guo, S. & Zhang, G. (2009). Robot Rights, Letter to Science, 323, 876.
Hahm, H.C., Guterman, N.B. (2001). The emerging problem of physical child abuse in South Korea. Child Maltreatment, 6(2), 169–79.
Hello kitty web reference
URL: http://www.dreamkitty.com/Merchant5/merchant.mvc?Screen=PROD&Store_Code=
DK2000&Product_Code=K-EM070605&Category_Code=HKDL.
Hermann, E., Call, J., Hare, B., & Tomasello, M. (2007). Humans have evolved specialized skills of social cognition: The cultural intelligence hypothesis. Science, 317(5843), 1360–1366.
Hertenstein, M.J. (2002). Touch: its communicative functions in infancy, Human Development,
45, 70–92.
Hertenstein, M.J., Verkamp, J.M., Kerestes, A.M., & Holmes, R.M. (2006). The communicative functions of touch in humans, nonhuman primates, and rats: A review and synthesis of the empirical research. Genetic, Social & General Psychology Monographs, 132(1), 5–94.
Hornik, R., Risenhoover, N., & Gunnar, M. (1987). The effects of maternal positive, neutral, and negative affective communications on infant responses to new toys. Child Development, 58, 937–944.
Hutchinson, W., Davis, K., Lozano, A., Tasker, R., & Dostrovsky, J. (1999). Pain-related neurons in the human cingulate cortex. Nature Neuroscience, 2, 403–5.
Kahn, P.H., Jr., Friedman, B., Perez-Granados, D., & Freier, N.G. (2006). Robotic pets in the lives
of preschool children. Interaction Studies, 7(3), 405–436.
Kanda, T., Takuyaki, H., Eaton, D. & Ishiguro, H. (2004). Interactive robots as social partners and peer tutors for children: a field trial. Human Computer Interaction, 19, 61–84.
Kanda, T., Nishio, S., Ishiguro, H., & Hagita, N. (2009). Interactive Humanoid Robots and
Androids in Children’s Lives. Children, Youth and Environments, 19 (1), 12–33. Available
from: www.colorado.edu/journals/cye.
Lewis, L. (2007). The robots are running riot! Quick, bring out the red tape. The Times Online, April 6th. URL: http://www.timesonline.co.uk/tol/news/world/asia/article1620558.ece, last accessed 15 January 2010.
Liu, C., Conn, K., Sarkar, N., & Stone, W. (2008). Online affect detection and robot behaviour adaptation for intervention of children with autism. IEEE Transactions on Robotics, 24(4), pp. 883–896.
Lopes, M.M., Koenig, N.P., Chernova, S.H., Jones, C.V., & Jenkins, O.C. (2009). Mobile human-
robot teaming with environmental tolerance. In Proceedings of the 4th ACM/IEEE inter-
national Conference on Human Robot interaction (La Jolla, California, USA, March 09–13,
2009). HRI ‘09. ACM, New York, NY, 157–164.
Marti, P., Palma, V., Pollini, A., Rullo, A. & Shibata, T. (2005). My Gym Robot. Proceedings of the Symposium on Robot Companions: Hard Problems and Open Challenges in Robot–Human Interaction, pp. 64–73.
Martins, C. & Gaffan, E.A. (2000). Effects of early maternal depression on patterns of infant-mother attachment: A meta-analytic investigation. Journal of Child Psychology and Psychiatry, 42, pp. 737–746.
Mason, W.A. & Berkson, G. (1975). Effects of Maternal Mobility on the Development of Rocking and Other Behaviors in Rhesus Monkeys: A Study with Artificial Mothers. Developmental Psychobiology, 8(3), 197–211.
Mason, W.A. (2002). The Natural History of Primate Behavioural Development: An Organismic Perspective. In D. Lewkowicz & R. Lickliter (Eds), Conceptions of Development: Lessons from the Laboratory. Psychology Press, 105–135.
Mavridis, N., Chandan, D., Emami, S., Tanoto, A., BenAbdelkader, C. & Rabie, T. (2009).
FaceBots: Robots Utilizing and Publishing Social Information in Facebook. HRI’09,
March 11–13, 2009, La Jolla, California, USA. ACM 978-1-60558-404-1/09/03.
Melson, G.F., Kahn, P.H., Jr., Beck, A.M., & Friedman, B. (in press a). Robotic pets in human lives: Implications for the human-animal bond and for human relationships with personified technologies. Journal of Social Issues.
Melson, G.F., Kahn, P.H., Jr., Beck, A.M., Friedman, B., Roberts, T., Garrett, E., & Gill, B.T.
(in press b). Robots as dogs? – Children’s interactions with the robotic dog AIBO and a live
Australian shepherd. Journal of Applied Developmental Psychology.
Mercer, J. (in press for 2010). Attachment theory. Theory and Psychology.
Meins, E., Fernyhough, C., Fradley, E. & Tuckey, M. (2001). Rethinking maternal sensitivity:
mothers’ comments on infants’ mental processes predict security of attachment at 12 months.
Journal of Child Psychology and Psychiatry 42, pp. 637–48.
Meltzoff, A.N. (1995). Understanding the intention of others: Re-enactment of intended acts by 18-month-old children. Developmental Psychology, 32, 838–850.
Meltzoff, A.N. & Moore, M.K. (1977). Imitation of facial and manual gestures by human neonates. Science, 198, 75–78.
Mitsunaga, N., Miyashita, T., Ishiguro, H., Kogure, K., & Hagita, N. (2006). Robovie-IV:
A Communication Robot Interacting with People Daily in an Office, In Proc of IROS,
5066–5072.
Nakamura, Y. (2002). Child abuse and neglect in Japan, Paediatrics International, 44, 580–581.
Nelson, C.A., Zeanah, C.H., Fox, N.A., Marshall, P.J., Smyke, A.T. & Guthrie, D. (2007). Cognitive recovery in socially deprived young children: The Bucharest early intervention project. Science, 318(5858), pp. 1937–1940.
Orpwood, R., Adlam, T., Evans, N., Chadd, J. (2008). Evaluation of an assisted-living smart
home for someone with dementia. Journal of Assistive Technologies, 2, 2, 13–21.
Rocks, C.L., Jenkins, S., Studley, M. & McGoran, D. (in press). ‘Heart Robot’, a public engagement
project. Robots in the Wild: Exploring Human–Robot Interaction in Naturalistic Environ-
ments. Special Issue of Interaction Studies.
Rushton, A. & Mayes, D. (1997). Forming Fresh Attachments in Childhood: A Research Update. Child and Family Social Work, 2(2), 121–127.
Sharkey, N. (2008a). The Ethical Frontiers of Robotics. Science, 322, 1800–1801.
Sharkey, N. (2008b). Cassandra or False Prophet of Doom: AI Robots and War. IEEE Intelligent Systems, 23(4), 14–17, July/August Issue.
Sharkey, N. & Sharkey, A. (in press). Living with robots: ethical tradeoffs in eldercare. In Wilks, Y. (Ed.), Artificial Companions in Society: scientific, economic, psychological and philosophical perspectives. Amsterdam: John Benjamins.
Sharkey, N., & Sharkey, A. (2006). Artificial Intelligence and Natural Magic. Artificial Intelligence Review, 25, 9–19.
Shibata, T., Mitsui, T., Wada, K., Touda, A., Kumasaka, T., Tagami, K. & Tanie, K. (2001). Mental Commit Robot and its Application to Therapy of Children. Proc. of 2001 IEEE/ASME Int. Conf. on Advanced Intelligent Mechatronics, pp. 1053–1058.
Schore, A. (2001). The effects of early relational trauma on right brain development, affect regulation, and infant mental health. Infant Mental Health Journal, 22(1–2), pp. 201–69.
Sorce, J.F., Emde, R.N., Campos, J., & Klinnert, M.D. (1985). Maternal emotional signaling: Its effect on the visual cliff behavior of 1-year-olds. Developmental Psychology, 21(1), 195–200.
Solomon, J. & George, C. (eds) (1999). Attachment Disorganisation, New York: Guilford Press.
Sparrow, R. (2002). The March of the Robot Dogs. Ethics and Information Technology, 4(4), pp. 305–318.
Stiehl, W.D., Lieberman, J., Breazeal, C., Basel, L., & Lalla, L. (2005). The Design of the Huggable: A Therapeutic Robotic Companion for Relational, Affective Touch. AAAI Fall Symposium on Caring Machines: AI in Eldercare, Washington, D.C.
Stiehl, W.D., Breazeal, C., Han, K., Lieberman, J., Lalla, L., Maymin, A., Salinas, J., Fuentes, D., Toscano, R., Tong, C.H., & Kishore, A. (2006). The huggable: a new type of therapeutic robotic companion. In ACM SIGGRAPH 2006 Sketches (Boston, Massachusetts, July 30–August 03, 2006). SIGGRAPH ‘06. ACM, New York, NY, 14.
Swartout-Corbeil, D.M. (2006). Attachment between infant and caregiver. In The Gale Encyclopedia of Children’s Health: Infancy through Adolescence. MI: The Gale Group.
Tanaka, F., Cicourel, A. & Movellan, J.R. (2007). Socialization Between Toddlers and Robots at an Early Childhood Education Center. Proceedings of the National Academy of Sciences, 104(46), 17954–17958.
Turkle, S., Taggart, W., Kidd, C.D., Dasté, O. (2006a). Relational Artifacts with Children and Elders: The Complexities of Cybercompanionship. Connection Science, 18(4), pp. 347–362.
Turkle, S., Breazeal, C., Dasté, O., & Scassellati, B. (2006b). First Encounters with Kismet and Cog: Children Respond to Relational Artifacts. In Digital Media: Transformations in Human Communication, Paul Messaris & Lee Humphreys (eds.). New York: Peter Lang Publishing.
United Nations Convention on the Rights of the Child, URL: http://www2.ohchr.org/english/
law/crc.htm, last accessed 15 January 2010.
Veruggio, G. (2006). The EURON roboethics roadmap. 6th IEEE-RAS International Conference on Humanoid Robots, 612–617.
What is Child Abuse and Neglect Factsheet, URL: http://www.childwelfare.gov/pubs/factsheets/
whatiscan.cfm, last accessed 15 January 2010.
Wallach, W., & Allen, C. (2009). Moral Machines: Teaching Robots Right from Wrong, Oxford
University Press, New York.
Woodward, A.L. & Sommerville, J.A. (2000). Twelve-month-old infants interpret action in context.
Psychological Science, 11, 73–77.
Yohanan, S., & MacLean, K.E. (2008). The Haptic Creature Project: Social Human–Robot Interaction through Affective Touch. In Proceedings of the AISB 2008 Symposium on the Reign of Catz & Dogs: The Second AISB Symposium on the Role of Virtual Creatures in a Computerised Society, volume 1, pages 7–11, Aberdeen, Scotland, UK, April, 2008.
Yohanan, S., Chan, M., Hopkins, J., Sun, H., & MacLean, K. (2005). Hapticat: Exploration of Affective Touch. In ICMI ‘05: Proceedings of the 7th International Conference on Multimodal Interfaces, pages 222–229, Trento, Italy, 2005.
Yoon-mi, K. (2007). Korea drafts Robot Ethics Charter. The Korea Herald, April 28.
Yoshiro, U., Shinichi, O., Yosuke, T., Junichi, F., Tooru, I., Toshihro, N., Tsuyoshi, S., Junichi, O. (2005). Development of Childcare Robot PaPeRo. Nippon Robotto Gakkai Gakujutsu Koenkai Yokoshu, 1–11. (Childcare Robot PaPeRo is designed to play with and watch over children at nursery, kindergarten, school and at home.)
Zeanah, C.H., Boris, N.W. & Lieberman, A.F. (2000). Attachment disorders of infancy. In Arnold J. Sameroff, Michael Lewis, Suzanne Melanie Miller (Eds), Handbook of developmental psychopathology, Birkhäuser, 2nd Edition.
Authors’ address
Noel Sharkey & Amanda Sharkey
Department of Computer Science
University of Sheffield
Regent Court
211 Portobello
Sheffield, S1 4DP
UK
Email: noel@dcs.shef.ac.uk, amanda@dcs.shef.ac.uk