The Wider Supportive Role of Social Robots in
the Classroom for Teachers
Paul Baxter1, Emily Ashurst2, James Kennedy1,
Emmanuel Senft1, Séverin Lemaignan1, and Tony Belpaeme1
1Centre for Robotics and Neural Systems
The Cognition Institute, Plymouth University, U.K.
paul.baxter@plymouth.ac.uk
2School of Nursing and Midwifery
Plymouth University, U.K.
Abstract. Robots are being increasingly used in schools by researchers
keen to assess how they may be used to facilitate learning and provide
support. Based on 15 school experiment visits at 9 different schools in the
U.K., we outline our observations, specifically focusing on the broader
implications of robots in the classroom primarily from the perspective of
the teacher. We then outline the basis for future research considerations
in HRI, centred around the three themes of pedagogy, methodology, and
ethics. For further application of robotics to education, we suggest that
these three themes need to form a central part of continuing research.
Keywords: Education, Ethics, Methodology, Pedagogy, Social HRI,
Teacher Support
1 Introduction
There are increasing numbers of applications of social robots to the domain of
education, and in particular to deployment in classrooms. There are typically
two main, and often overlapping, goals for these efforts. Firstly, they are intended
to augment teaching structures and provide supplementary support to children
by providing an alternative and/or personalised learning experience. Note that
these applications, quite rightly, do not seek to replace any human teaching staff
or reduce human contact time, but are rather intended as supplementary to
existing pedagogical structures. Secondly, they seek to examine the attitudes of
teachers and students regarding robots in the classroom, and solicit their views
on how applications should be implemented and used.
Rather than fit into one of these two categories, in this paper we instead take a step back and address some of the wider implications of robots in the classroom from the perspective of the teachers themselves. We further
consider research methodology, beyond the application to the learning task that
is typically the focus of the work conducted.
We report on the experiences gained from 15 separate school visits to nine
different schools around Plymouth (in the south-west of the U.K.) to run a num-
ber of experiments over the past three years. Our observations are qualitative
and based on extensive interactions during and after the experiments with teach-
ers. Despite this, a number of trends and facets have emerged that we believe
have consequences for the conduct of HRI experiments with children in schools,
but also more generally ‘in the wild’ – a difficult task with many pitfalls [9].
Three outcomes from these observations are derived. Pedagogically, the way
the teachers use the robot in terms of its (social) presence needs to be properly
characterised from the perspective of teaching practice. Methodologically (from the experimental point of view), the way in which children’s anticipation of their interactions with the robot is altered by the changed behaviour of the teacher (beyond the mere presence of the robot) needs to be accounted for. Ethically,
the way the robot is treated by the teacher with respect to the children and the
consequences of this needs to be considered.
2 Robots for Teachers
As noted above, much existing work is focused on having a robot support and
supplement existing teaching structures, by, perhaps, engaging children in addi-
tional one-on-one tuition. While space constraints prevent a complete overview
of the literature here, this is typically conducted from the perspective of the
individual children and/or of the robot itself. For example, Kanda et al. provided
a seminal exploration of how robots interacted and formed relationships with
children over time [5], considering both the social dimension from the point of
view of the children, and the technical and behavioural facets of the robot used
to encourage this. Similar to this, the focus on improving robot competencies
to facilitate interaction with children has encompassed empathic behaviour [8],
interactant-directed behavioural cues [7], etc.
The second strand of work is on teachers’ and children’s perceptions of robots
and how they may be useful. This has, for example, highlighted practical concerns
of robustness and equality of access, but also acknowledged the positive role of
the robot in maintaining the engagement of children [11]. In the domain of
children with health problems, a social robot has been highlighted as a means of
integration at school (a novel experience to share that is not health-related) [1].
3 Teachers with Robots
Beyond the established goals described above, in this section we seek to expand
our observations of other factors involved in running experiments in real schools
that typically do not explicitly form part of the experimental considerations (or
appear in the subsequent publications). We first provide some general observa-
tions, and then provide more specific observations from a recently conducted
longer-term experiment.
3.1 General Observations
In the majority of our experiments (e.g. [7]), we set up the robot in a room
in the school that is not the children’s classroom, but one with which they are
familiar. In this way, we maintain the school environment while not disturbing
the ongoing lessons in another room. Children from our subject group (typically
a single class) are brought by the experimenter one by one to this experiment
room, the experiment is conducted, and the children are subsequently returned
to their classroom.
One consistent observation from our experiments in the schools was that the
teacher adapted their behaviour both in the presence of the robot and also given
just the prospect of the robot being present. This is not so surprising given that,
at the present time at least, robots are not a typical feature in U.K. schools.
A certain level of excitement at a novel experience in school time is thus to be
expected.
However, it was apparent that the teachers, without exception in our experience, would use the occasion to aid in their management of the classroom. We observed two general means by which this was achieved. Firstly, prior to the start of
the experiments, the teachers would emphasise to the children the importance
of getting the parents/guardians to complete the consent form - otherwise the
children would miss out. Secondly, during the experiment, the teachers would
explicitly refer to the robot when requesting that the children settle down to
work - a common phrase used was “if you don’t behave/be quiet, you won’t get
to play with the robot” (emphasis added3).
It must be acknowledged that the teachers would use similar tactics to en-
courage and remind the children to do various other normal school activities4.
However, in the case of the robot, this has a number of methodological conse-
quences. By emphasising that the children would miss out on the novel experi-
ence, the teachers are implicitly raising the expectation in the children that this
is a desirable experience. The children are therefore primed, with consequences
for subsequent interactions with the robot: it has indeed been shown that prior
expectation strongly shapes human-robot interactions, e.g. [4].
3.2 Long-term Embedded Study Observations
In one of our more recent studies, we dispensed with the presence of experi-
menters, and placed robots directly into classrooms without technical supervi-
sion: the teacher (and perhaps a teacher’s assistant) was the only supervising
adult present. In this continuous two-week study (publication in preparation),
the robot interacted with individual children during class time, in the classroom,
where the teacher was responsible for designating which child would be allowed
to interact with the robot next. Two classes were involved in the same school
3This use of the word ‘play’ seems to arise irrespective of the way in which the robot is
presented by the researchers. This could indicate that despite the insistence that we
are attempting to help the children learn, the robot does give rise to the expectations
associated with toys. Alternatively, it could just be the manner the teachers choose
to portray the robot in order to maximise the effect with the children: play is fun,
work is not.
4Perhaps implicitly taking advantage of the children’s desire not to be left out, and
to remain included.
and in the same age-range, with separate teachers. In initial planning discussions, the teachers enquired whether it would be possible to use the robot to indicate to them when the classroom was too noisy, to help them manage the children; however, this interfered with the planned experimental protocol and so was subsequently not pursued. Based on observations of video-recorded interactions, and subsequent discussion with the teachers, this long-term embedded
situation provides additional insight into the altered dynamics in the classroom
when a (social) robot is present.
Firstly, regarding the general attitude of the children over the two-week period: both teachers involved said that while in the first week the behaviour of
the robot caused distraction for the rest of the class (e.g. children turning to
look when the robot spoke), in the second week, this distraction effect wore off.
This meant that the classes continued uninterrupted by the robot despite the
interactions taking place during normal lesson time. Conversely, both teachers
reported that their ability to use the robot as a ‘bribe’ (i.e. a means of motivat-
ing the children) was maintained over the experimental period. In other words,
while the distraction caused by the robot decreased, it still formed a sufficiently
attractive lure to induce behaviour change.
Secondly, prior to this experiment, a great deal of effort was put into designing
the content to be learned by the children. We chose to maintain a strong link
to the curriculum that the children were following, and also chose a topic that
would appear in the syllabus for the following year (thus a novel topic for the
children at the time of the experiment). This was done for two reasons. Firstly,
any positive learning outcomes would be directly useful for the children (and
thus also increasing ecological validity). Secondly, it was a means of making
the experiment relevant to the school, and reducing the potential (perceived)
distraction from the curriculum if a completely unrelated topic were chosen.
4 Discussion
In all of the examples described above, it is clear that the novelty effect provides
a significant influence. As a means of increasing interest in a topic to be learned,
this effect can be bootstrapped to improve child learning, e.g. [10]. However, as
robots become more pervasive in these environments, there will naturally come
a point where this effect of novelty can no longer be relied upon to increase
the interest levels sufficiently to enhance engagement and/or learning. While
long-term studies have examined relationships between children and robots in
classroom settings [6], the equivalent research for learning outcomes has yet to
provide conclusive evidence that learning benefits will remain, although there are
indications that at least a moderate positive effect will persist. Based on these
considerations, we can identify three themes around which our observations and
their implications can be structured: pedagogy, methodology, and ethics.
Pedagogy: Assuming that the goal is to enhance the classroom as a learning
environment with social robots, the wider question is how the robot would fit
in terms of both the teaching style of the teacher and the content to be learned
by the children. Insight can be gained from research in the field of Computer
Supported Cooperative Work/Learning, which has approached the problem from
the more general perspective of technology in the classroom. For example, the
concept of ‘classroom orchestration’ [3] investigates how device appearance and functionality afford improved teacher-directed workflow. Taking advantage of
frameworks such as this will enable the role of the robot to be considered from
a global educational, rather than robot-centred technological, perspective.
Methodology: Given our observations, it would appear that the methodolog-
ical consequences need to be handled. If the children’s expectations are inflated
prior to the interaction with the robot by implicit suggestions from the teacher,
then a practical consequence could be a reduction in measurable difference be-
tween conditions. Consider that in many studies, a single manipulation distin-
guishes between conditions, and it is clear that an initial increase of excitement
or engagement in all subjects could be enough to reduce the significance of any
differences between conditions – already made difficult by children’s propensity to anthropomorphise [2]. This could be an effect that distinguishes research with
children from research with adults for instance. The means of handling this po-
tential confound could however be difficult. One means could be to extend the
length of studies so as to get over this initial biasing effect. Although this may
be practically difficult, another benefit of this solution is that it begins to address
the questions in a more ecologically valid manner.
Ethics: A further consideration is how the manner in which the teacher treats
the robot impacts on the dynamics of the classroom, and the subsequent way
the robot is viewed by the children. This goes beyond the existing concerns of
the long-term effect of children interacting with robots (e.g. in the extreme case
[12]). One example given above was of the teacher requesting that the robot be
used to indicate if the classroom was too noisy. By using the robot in the capacity
of an ‘informant’ in this way, the children in the classroom would begin to see
it as an ‘other’ rather than as a peer (if the robot designers envisaged a peer
role for the robot). Similar scenarios can be constructed that follow similar lines
of argument: a mismatch between the roles apparently ascribed by the teacher
to the robot, and the role the robot takes on when interacting with a child. To
the extent that such discrepancies impact trust, for example, this becomes an
ethical issue that requires consideration.
5 Conclusion
In this paper we have outlined the three themes of pedagogy, methodology and
ethics as areas that are impacted by the presence of (social) robots in the class-
room, but which are not typically considered in research efforts at the present
time. The example from methodology demonstrates what we contend are real
experimental issues that can be addressed if this perspective is taken. These ob-
servations are not intended to be read as criticisms of current practice (we have
been subject to the same effects mentioned here), but rather as opportunities
for further research endeavours. What we advocate, though, is a wider consideration of the role of the robot within the classroom beyond the actual learning
application that is typically the focus of the studies reported in the literature.
Acknowledgements
This work was supported by the EU FP7 ALIZ-E project (grant number 248116),
and the EU FP7 DREAM project (grant number 611391), www.dream2020.eu.
References
1. Baroni, I., Nalin, M., Baxter, P., Pozzi, C., Oleari, E., Sanna, A., Belpaeme, T.:
What a robotic companion could do for a diabetic child. In: RoMAN’14. pp. 936–
941. IEEE Press, Edinburgh, U.K. (Aug 2014)
2. Belpaeme, T., Baxter, P., Greeff, J.D., Kennedy, J., Looije, R., Neerincx, M., Ba-
roni, I., Coti, M.: Child-Robot Interaction: Perspectives and Challenges. In: ICSR.
pp. 452–459. Springer, Bristol, U.K. (2013)
3. Dillenbourg, P.: Design for classroom orchestration. Computers and Education 69,
485–492 (2013)
4. Fischer, K., Soto, B., Pantofaru, C., Takayama, L.: Initiating Interactions in Order
to Get Help: Effects of Social Framing on People’s Responses to Robots’ Requests
for Assistance. In: RoMAN’14. pp. 999–1005. IEEE Press, Edinburgh, U.K. (2014)
5. Kanda, T., Hirano, T., Eaton, D., Ishiguro, H.: Interactive Robots as Social Part-
ners and Peer Tutors for Children: A Field Trial. Human-Computer Interaction
19(1), 61–84 (Jun 2004)
6. Kanda, T., Sato, R., Saiwaki, N., Ishiguro, H.: A two-month field trial in an elemen-
tary school for long-term human-robot interaction. IEEE Transactions on Robotics
23(5), 962–971 (2007)
7. Kennedy, J., Baxter, P., Belpaeme, T.: The Robot Who Tried Too Hard: Social
Behaviour of a Robot Tutor Can Negatively Affect Child Learning. In: HRI’15. pp.
67–74. ACM Press, Portland, Oregon, USA (2015)
8. Leite, I., Castellano, G., Pereira, A., Martinho, C., Paiva, A.: Modelling Empathic
Behaviour in a Robotic Game Companion for Children: An Ethnographic Study
in Real-World Settings. In: HRI’12. pp. 367–374. ACM Press, Boston, MA, U.S.A.
(2012)
9. Salter, T., Werry, I., Michaud, F.: Going into the Wild in Child-Robot Interaction
Studies. Intelligent Service Robotics 1(2), 93–108 (2007)
10. Schiefele, U.: Interest, Learning, and Motivation. Educational Psychologist 26(3-4),
299–323 (1991)
11. Serholt, S., Barendregt, W., Leite, I., Hastie, H., Jones, A., Paiva, A., Vasalou,
A., Castellano, G.: Teachers’ views on the use of empathic robotic tutors in the
classroom. In: RoMAN’15. pp. 955–960. IEEE Press (2015)
12. Sharkey, N., Sharkey, A.: The crying shame of robot nannies: An ethical appraisal.
Interaction Studies 11(2), 161–190 (2010)
... Advances in enabling technologies, both software and hardware, have encouraged a widespread proliferation of social robots in several application domains, including education, therapy, services, entertainment, and arts [1][2][3][4][5]. In all applications, the capacity of social robots to sensibly interact with humans is critical [6][7][8]. ...
Article
Full-text available
Social robots keep proliferating. A critical challenge remains their sensible interaction with humans, especially in real world applications. Hence, computing with real world semantics is instrumental. Recently, the Lattice Computing (LC) paradigm has been proposed with a capacity to compute with semantics represented by partial order in a mathematical lattice data domain. In the aforementioned context, this work proposes a parametric LC classifier, namely a Granule-based-Classifier (GbC), applicable in a mathematical lattice (T,⊑) of tree data structures, each of which represents a human face. A tree data structure here emerges from 68 facial landmarks (points) computed in a data preprocessing step by the OpenFace software. The proposed (tree) representation retains human anonymity during data processing. Extensive computational experiments regarding three different pattern recognition problems, namely (1) head orientation, (2) facial expressions, and (3) human face recognition, demonstrate GbC capacities, including good classification results, and a common human face representation in different pattern recognition problems, as well as data induced granular rules in (T,⊑) that allow for (a) explainable decision-making, (b) tunable generalization enabled also by formal logic/reasoning techniques, and (c) an inherent capacity for modular data fusion extensions. The potential of the proposed techniques is discussed.
... Teacher concerns about surveillance in the classroom are well documented (Page, 2017a,b). Baxter et al. (2015) discussed the importance of ethical explorations regarding educational robots in the classroom. They cited an example where a teacher suggested a surveillance task to the robot and reflected on the barriers created by giving the robot a supervisory role. ...
Article
Full-text available
Currently there are 4.9 million English Language Learners (ELLs) in the United States, however, only 2% of educators are trained to support these vulnerable students. Educational robots show promise for language acquisition and may provide valuable support for ELLs, yet, little is known about social robots for this population. Inviting participants as cultural informants can ensure that the robot is appropriately designed, situated and adopted into that educational community. Therefore, we conducted an exploratory study using interactive group interviews with 95 ELLs (kindergarten through fifth grade) from 18 different home language backgrounds. We also interviewed 39 ELL parents and eight elementary school educators to understand their views of educational robots. Responses to robot images suggested a preference for a popular educational robot. Parents expressed a strong desire for educational robots to support their children at school. While children embraced the idea of a robot at school, some expressed concerns about the potential for robots to be disruptive. School educators saw the potential for educational robots to support teachers in meeting instructional needs but also raised salient concerns. Exploring social robots with ELLs as cultural informants was a valuable exploration to determine important factors in social robot design and implementation for a diverse educational setting.
... Values such as privacy, human contact and accountability are reported to be impacted by the social robots [19]. Hence, the urgency for further research into the moral considerations regarding educational robots is voiced throughout the robotic literature [3,9,19,21]. The aim of this exploratory study is to explore teachers' perspectives regarding the moral concerns that come with social robots in education. ...
... The common element in these three gaps is the aspect of increasing the overall acceptance of humanoid robots by students, teachers and instructors and to instill a positive attitude towards the technology. By providing the teacher with control over the robot and the necessary interfaces and resources to do so, we can overcome the fear and reluctance that may occur in their mind when they think of Educational robots [5]. Our focus and goal is not to replace the teacher but rather enable and facilitate better integration of the robot in the classroom, by working with the teacher to design interaction and pedagogical scenarios that assist in doing so. ...
Article
Social robots are beginning adopted in PreK-12 schools around the world and have the potential to initiate far-reaching changes in education. Analysis from high-quality field studies is essential for educational researchers, administrators, and practitioners to make informed decisions about using these robots. We identified 23 studies between 2000 and 2020 that examined social robots in classroom settings. These studies demonstrated the feasibility of using social robots in natural school settings but revealed how difficult it was to obtain long-term, highly autonomous interaction between robots and children. The studies varied considerably on key conditions (e.g., length of deployment, the autonomy of robotic action, and length of interaction time). The studies did not demonstrate that social robots are more effective than human teachers (or even other forms of technology), and only occasionally explored important ethical and safety issues.
Conference Paper
Full-text available
Human-robot interaction has been a significant area of research with the widespread use of social robots. Many modalities can be used to achieve interaction, including vision. For each modality, many methodologies have been proposed, with varying degrees of effectiveness and efficiency in terms of the computational power needed. The varied nature of these algorithms makes data fusion a complex and application-specific task. This paper introduces a novel Lattice Computing-based methodology to interpret visual stimuli for head pose estimation. An investigation of the various parameters involved and initial results are presented. The aim is to determine head pose in robot-assisted therapy settings and use it in decision making. This work is part of a broader effort to use the Lattice Computing (LC) paradigm as a unified methodology for sensory data interpretation in human-robot interaction.
Conference Paper
Full-text available
Visual stimuli are essential in many applications in human robot interaction. However, such tasks are usually computationally intensive. Also, data received from the various sensors on a robot require different data representation and processing techniques, which increases the complexity and makes the fusion of sensory data for decision making more difficult. An alternative approach is the use of the Lattice Computing (LC) paradigm for hybrid mathematical modelling based on mathematical lattice theory that unifies rigorously numerical data and non-numerical data. This paper presents an application of this approach, and more specifically a novel method for head pose estimation using LC techniques, as an initial step towards using LC as a unified methodology in social robot interaction applications.
Conference Paper
Full-text available
Social robots in education introduce new moral challenges. The aim of this exploratory study is to gain a better understanding of the moral conceptions held by parents regarding the implementation of social robots in primary schools. These moral conceptions are important because parents are the representatives of children, but also experience the effects of robot tutoring first-hand. Through empirical data gathered from focus group sessions with parents, we identified and categorised the concerns and opportunities linked to implementing social robots in an educational context from the perspective of parents. These opportunities and concerns formed the basis for identifying the moral values held by parents that are affected by the introduction of these social robots. We mapped the opportunities and concerns to a list of 14 relevant moral values regarding social robots and education as identified in a review on ethics and educational robots, in order to identify and conceptualise the relevant moral values for parents. We identified the relevant moral values for parents that are affected by the social robot to gain a better understanding of parents' attitudes towards the use of social robots in education, to help the robotic industry integrate parents' moral values in their robot tutor design, and to help create the necessary moral guidelines towards an ethical implementation of social robots in education.
Conference Paper
Full-text available
Being a child with diabetes is challenging: apart from the emotional difficulties of dealing with the disease, there are multiple physical aspects that need to be dealt with on a daily basis. Furthermore, as the children grow older, it becomes necessary to self-manage their condition without the explicit supervision of parents or carers. This process requires that the children overcome a steep learning curve. Previous work hypothesized that a robot could provide a supporting role in this process. In this paper, we characterise this potential support in greater detail through a structured collection of perspectives from all stakeholders, namely the diabetic children, their siblings and parents, and the healthcare professionals involved in their diabetes education and care. A series of brain-storming sessions were conducted with 22 families with a diabetic child (32 children and 38 adults in total) to explore areas in which they expected that a robot could provide support and/or assistance. These perspectives were then reviewed, validated and extended by healthcare professionals to provide a medical grounding. The results of these analyses suggested a number of specific functions that a companion robot could fulfil to support diabetic children in their daily lives.
Conference Paper
Full-text available
Social robots are finding increasing application in the domain of education, particularly for children, to support and augment learning opportunities. With an implicit assumption that social and adaptive behaviour is desirable, it is therefore of interest to determine precisely how these aspects of behaviour may be exploited in robots to support children in their learning. In this paper, we explore this issue by evaluating the effect of a social robot tutoring strategy with children learning about prime numbers. It is shown that the tutoring strategy itself leads to improvement, but that the presence of a robot employing this strategy amplifies this effect, resulting in significant learning. However, it was also found that children interacting with a robot using social and adaptive behaviours in addition to the teaching strategy did not learn a significant amount. These results indicate that while the presence of a physical robot leads to improved learning, caution is required when applying social behaviour to a robot in a tutoring context.
Conference Paper
Full-text available
In this paper, we describe the results of an interview study conducted across several European countries on teachers' views on the use of empathic robotic tutors in the classroom. The main goals of the study were to elicit teachers' thoughts on the integration of the robotic tutors in the daily school practice, understanding the main roles that these robots could play and gather teachers' main concerns about this type of technology. Teachers' concerns were much related to the fairness of access to the technology, robustness of the robot in students' hands and disruption of other classroom activities. They saw a role for the tutor in acting as an engaging tool for all, preferably in groups, and gathering information about students' learning progress without taking over the teachers' responsibility for the actual assessment. The implications of these results are discussed in relation to teacher acceptance of ubiquitous technologies in general and robots in particular.
Conference Paper
Full-text available
Child-Robot Interaction (cHRI) is a promising point of entry into the rich challenge that social HRI is. Starting from three years of experiences gained in a cHRI research project, this paper offers a view on the opportunities offered by letting robots interact with children rather than with adults and having the interaction in real-world circumstances rather than lab settings. It identifies the main challenges which face the field of cHRI: the technical challenges, while tremendous, might be overcome by moving away from the classical perspective of seeing social cognition as residing inside an agent, to seeing social cognition as a continuous and self-correcting interaction between two agents.
Article
Full-text available
The idea of autonomous social robots capable of assisting us in our daily lives is becoming more real every day. However, there are still many open issues regarding the social capabilities that those robots should have in order to make daily interactions with humans more natural. For example, the role of affective interactions is still unclear. This paper presents an ethnographic study conducted in an elementary school where 40 children interacted with a social robot capable of recognising and responding empathically to some of the children's affective states. The findings suggest that the robot's empathic behaviour affected positively how children perceived the robot. However, the empathic behaviours should be selected carefully, under the risk of having the opposite effect. The target application scenario and the particular preferences of children seem to influence the degree of empathy that social robots should be endowed with.
Article
Takayuki Kanda is a computer scientist with interests in intelligent robots and human-robot interaction; he is a researcher in the Intelligent Robotics and Communication Laboratories at ATR (Advanced Telecommunications Research Institute), Kyoto, Japan. Takayuki Hirano is a computer scientist with an interest in human-robot interaction; he is an intern researcher in the Intelligent Robotics and Communication Laboratories at ATR, Kyoto, Japan. Daniel Eaton is a computer scientist with an interest in human-robot interaction; he is an intern researcher in the Intelligent Robotics and Communication Laboratories at ATR, Kyoto, Japan. Hiroshi Ishiguro is a computer scientist with interests in computer vision and intelligent robots; he is Professor of Adaptive Machine Systems in the School of Engineering at Osaka University, Osaka, Japan, and a visiting group leader in the Intelligent Robotics and Communication Laboratories at ATR, Kyoto, Japan.
Abstract: Robots increasingly have the potential to interact with people in daily life. It is believed that, based on this ability, they will play an essential role in human society in the not-so-distant future. This article examined the proposition that robots could form relationships with children and that children might learn from robots as they learn from other children. In this article, this idea is studied in an 18-day field trial held at a Japanese elementary school. Two English-speaking "Robovie" robots interacted with first- and sixth-grade pupils at the perimeter of their respective classrooms. Using wireless identification tags and sensors, these robots identified and interacted with children who came near them. The robots gestured and spoke English with the children, using a vocabulary of about 300 sentences for speaking and 50 words for recognition.
Article
Childcare robots are being manufactured and developed with the long-term aim of creating surrogate carers. While total childcare is not yet being promoted, there are indications that it is 'on the cards'. We examine recent research and developments in childcare robots and speculate on progress over the coming years by extrapolating from other ongoing robotics work. Our main aim is to raise ethical questions about the part- or full-time replacement of primary carers. The questions concern human rights, privacy, robot use of restraint, deception of children, and accountability. But the most pressing ethical issues throughout the paper concern the consequences for the psychological and emotional wellbeing of children. We set these in the context of the child development literature on the pathology and causes of attachment disorders. We then consider the adequacy of current legislation and international ethical guidelines on the protection of children from the overuse of robot care.
"Who's to say that at some distant moment there might be an assembly line producing a gentle product in the form of a grandmother – whose stock in trade is love." From I Sing the Body Electric, Twilight Zone, Series 3, Episode 35, 1960.
Introduction: A babysitter/companion on call round the clock to supervise and entertain the kids is the dream of many working parents. Now robot manufacturers in South Korea and Japan are racing to fulfil that dream with affordable robot "nannies". These currently offer game playing, quizzes, speech recognition, face recognition, and limited conversation to capture the preschool child's interest and attention. Their mobility and semi-autonomous function, combined with facilities for visual and auditory monitoring, are designed to keep the child from harm. Most are prohibitively expensive at present, but prices are falling and some cheap versions are already becoming available.
Conference Paper
Robots often need to ask humans for help, for instance to complete a human component in a larger task or to recover from an unforeseen error. In this paper, we explore how robots can initiate interactions with people in order to ask for help. We discuss a study in which a robot initiated interaction with a participant by producing either an acoustic signal or a verbal greeting. Thereafter, the robot produced a gesture in order to request help in performing a task. We investigate the effect that social framing by means of a verbal greeting may have on people's attention to the robot, on their recognition of the robot's actions and intention, and on their willingness to help. The results show that social framing, in contrast to other methods for getting a person's continued attention, is effective and increases how friendly the robot appears. However, it has little influence on people's willingness to assist the robot, which rather depends on the activities people are engaged in, and on the readability of the robot's request.
Article
The prevalence of information and communication technology (ICT) has considerably transformed the means of publication and circulation, as well as academia and English pedagogy. However, with the availability and convenience of online ...
Article
Recent research related to the concept of interest is reviewed. It is argued that current constructs of motivation fail to include crucial aspects of the meaning of interest emphasized by classical American and German educational theorists. In contrast with many contemporary concepts (e.g., intrinsic learning orientation), interest is defined as a content-specific motivational characteristic composed of intrinsic feeling-related and value-related valences. Results from a number of studies are presented that indicate the importance of interest for the depth of text comprehension, the use of learning strategies, and the quality of the emotional experience while learning. The implications of these results and possible directions for future research are discussed.