Little Bear – A Gaze Aware Learning Companion for
Early Childhood Learners
Deepak Akkil 1, Prasenjit Dey 2, Nitendra Rajput 3 *
1 University of Tampere, Finland
deepak.akkil@uta.fi
2 IBM Research, Bangalore, India
prasenjit.dey@in.ibm.com
3 InfoEdge India Limited
nitendra@acm.org
* This work was done while Nitendra Rajput was at IBM Research.
Abstract. Computing devices such as mobile phones and tablet computers are
increasingly used to support early childhood learning. Currently, touching the
screen is the most common interaction technique on such devices. To augment
the current interaction experience, overcome posture-related issues with tablet
usage, and promote novel ways of engagement, we propose gaze as an input
modality in educational applications for early learners. In this demonstration, we
present Little Bear, a gaze-aware pedagogical agent that tailors its verbal and
non-verbal behaviour based on the visual attention of the child. We built an
application around Little Bear to teach young children the names of everyday
fruits and vegetables. Our demonstration system shows the potential of
gaze-based learning applications and the novel engagement possibilities offered
by gaze-aware pedagogical agents.
Keywords. Gaze, Touch, Pedagogical Agent, Early-childhood learning,
Vocabulary building, Games, Mobile devices, Engagement.
1 Introduction
Touchscreen devices such as the Apple iPad and Microsoft Surface tablets are
increasingly used in early childhood pedagogical environments such as preschools
and kindergartens. The growing popularity of, and access to, mobile devices provides
exciting opportunities to design innovative, ubiquitous, and constructive learning
experiences for young children. While the immense potential of such devices for
early childhood learning is widely accepted, touch-based interaction on these devices
also raises several concerns in early childhood learning environments.
There are several challenges in designing engaging and intuitive touch-based
applications for young children. Plowman and McPake [12] note that if children do not
understand what they need to do, the interactivity offered by such devices may be
counter-productive to learning. This calls for careful design of stimuli and prompts
to make the interaction intuitive [5]. In addition, children's underdeveloped fine
motor skills and finger dexterity [6] cause difficulty in performing certain touch
gestures [11], slower and more error-prone interactions [15], and accidental or
unintended touches [10], affecting the overall interaction.
Another concern when using a mobile device is balancing a placement that is optimal
for touch interaction against a neutral posture for the child. Straker et al. [14]
studied the posture and muscle activity of children using desktop computers and
tablets and found that touch-based interaction on a tablet is linked with asymmetric
spinal and strained neck postures. Teachers and parents raise similar concerns
regarding posture and prolonged tablet use [4]. Researchers have proposed several
recommendations to address the problem, such as promoting task variation [14] and
encouraging elevated placement of the device to promote neutral viewing postures
[16]. However, elevated placement of the device may make touch interaction
difficult.
A third challenge in designing learning applications for children is that children
have very limited attention spans and are easily distracted by environmental factors
(e.g. noise from the hallway, or a colourful object on the tablet screen of a peer)
[2]. Luna [9] notes that the younger the children, the more easily they get
distracted. It is hence important that educational applications designed for early
learners are aware of children's attention and employ ways to reorient that
attention when the child is distracted, to facilitate learning.
We propose gaze as a viable and potentially beneficial input modality in learning
applications for children. Unlike explicit touch-based interaction, which inherently
requires the device to be placed close to the child, gaze lets the child interact
with the device at a distance, so the device can be placed optimally to promote
better posture.
Applications that are aware of the visual attention of the child could implicitly
adapt themselves, integrating learning with the child's curious visual exploration
to provide a rich and embodied experience. In addition, gaze has a strong
association with attention, so gaze-aware learning applications can also keep track
of the child's attention and employ means to reorient it when the child is
distracted.
There are two distinct ways of using gaze information in learning applications.
First, gaze can serve as the only interaction modality, which could be useful for
simple interaction tasks. Second, gaze can be combined with conventional touch-based
interaction, which could be suitable when more complex interactions are required. In
this demonstration, we focus on applications that use gaze as the only input
modality. To showcase the interaction and engagement possibilities offered by the
modality, we designed Little Bear, an animated pedagogical agent that is capable of
spoken communication and is aware of the visual attention of the learner.
Animated pedagogical agents, or virtual characters designed to teach or guide users,
provide engagement and motivational benefits in learning applications. However,
Krämer and Bente [7] note that the current generation of agents neither exhibits
sophisticated non-verbal communication nor exerts a social influence. They envision
that agents that are more aware of the emotional and cognitive state of the user,
and that show capacities for non-verbal communication, may have more pedagogical
value. Gaze-aware agents have been studied in previous research for children with
special needs [8,13] and for adult users [3]. The novelty of our system is that
Little Bear uses gaze information to adapt its verbal as well as non-verbal
behaviour and exhibits emotional states as a means to reorient the child's attention
when it drifts from the learning activity.
2 Demonstration application
Fig. 1. The agent-based learning application. The application is operated using
gaze; the red boxes indicate the gaze-reactive areas.
We designed an application featuring Little Bear, a bear-like animated pedagogical
agent. The application was set in a garden-like 3D environment, where the agent took
the child for a walk; it was designed to teach children the names of everyday fruits
and vegetables. Different fruits and vegetables appeared on screen at pre-defined
locations during the walk, and the child could interact with them using gaze. When
the child glanced at a specific fruit, the bear spoke an interesting detail about
it. The speech was powered by the IBM Watson Text to Speech service and further
customised by choosing parameters for speech rate, pitch, and pauses between words,
to make the speech feel natural, fitting to the character, and easy to understand.
The agent also exhibited realistic lip movement and blink behaviour to complement
the speech.
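
To illustrate, the listing below sketches how such customised speech can be
generated through the Watson Text to Speech service using its SSML support for
prosody and pauses. This is a minimal sketch using the ibm-watson Python SDK; the
API key, service URL, voice, and prosody values are illustrative assumptions, not
the parameters used in our system.

from ibm_watson import TextToSpeechV1
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

# Placeholder credentials; a real deployment supplies its own key and URL.
tts = TextToSpeechV1(authenticator=IAMAuthenticator("YOUR_API_KEY"))
tts.set_service_url("https://api.us-south.text-to-speech.watson.cloud.ibm.com")

# SSML prosody slows the speech and raises the pitch to fit the character;
# <break> inserts a pause between phrases. All values here are illustrative.
ssml = ('<speak version="1.0">'
        '<prosody rate="-15%" pitch="+20%">'
        'You are looking at an apple! <break time="400ms"/> Apples grow on trees.'
        '</prosody></speak>')

response = tts.synthesize(ssml, voice="en-US_AllisonV3Voice",
                          accept="audio/wav").get_result()
with open("apple.wav", "wb") as audio_file:
    audio_file.write(response.content)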
For accurate gaze tracking, we used the Tobii EyeX, an off-the-shelf video-based
gaze tracker. The agent used the gaze information to adapt its verbal and non-verbal
behaviour. For example, when the child was distracted and did not look at the
screen, the character became sad (see Fig. 2a) and used speech to attract attention,
saying, “I become sad when you do not look at me.”
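
A minimal sketch of this distraction handling is shown below, assuming a
hypothetical agent interface (play_animation, say) and normalised gaze coordinates
from the tracker; the timeout value is an assumption, as the exact threshold used in
the demo is not reported here.

import time

DISTRACTION_TIMEOUT = 3.0  # seconds without on-screen gaze (illustrative value)

class AttentionWatchdog:
    """Triggers the agent's sad behaviour after a period of inattention."""

    def __init__(self, agent):
        self.agent = agent                   # hypothetical agent interface
        self.last_on_screen = time.monotonic()
        self.sad = False

    def on_gaze_sample(self, x, y, valid):
        """Called for every sample from the eye tracker; `valid` is False
        when the tracker loses the eyes entirely."""
        now = time.monotonic()
        if valid and 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0:
            self.last_on_screen = now        # gaze is on the screen
            self.sad = False
        elif not self.sad and now - self.last_on_screen > DISTRACTION_TIMEOUT:
            self.sad = True
            self.agent.play_animation("sad")  # see Fig. 2a
            self.agent.say("I become sad when you do not look at me.")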
A fixation of more than 500 ms on a fruit resulted in its activation, and the agent
spoke an interesting detail about the fruit (e.g. “You are looking at an apple” or
“Apples are red”). This fixation duration was selected based on previous work
suggesting that the median gaze fixation duration for children in an image-viewing
task is around 300 ms and that children have difficulty fixating on a target for
longer durations. The speech following the activation of a fruit lasted roughly 3-6
seconds, during which the application did not respond to other gaze fixations.
Choosing a relatively short fixation duration for activation allowed the application
to react implicitly to the interest and visual attention of the child, without
requiring an explicit gaze action. When the character was not speaking, its head
oriented towards the direction the child was looking, giving implicit feedback that
gaze was being tracked and helping to establish joint attention. When the character
was speaking, its head was oriented directly ahead, abiding by the established
social conventions of eye contact in face-to-face conversation.
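
The dwell-based activation and the head-orientation behaviour can be expressed as a
small per-sample state machine, sketched below under the same assumptions as before
(a hypothetical agent interface with is_speaking, describe, and look_at; targets
given as normalised bounding boxes). Of these values, only the 500 ms threshold
comes from the system described here.

import time

DWELL_THRESHOLD = 0.5  # 500 ms fixation activates a target

class DwellSelector:
    """Activates a gaze-reactive target after a 500 ms fixation and
    orients the agent's head towards the gaze point otherwise."""

    def __init__(self, agent, targets):
        self.agent = agent        # hypothetical agent interface
        self.targets = targets    # {name: (x0, y0, x1, y1) bounding box}
        self.current = None       # target currently under the gaze
        self.entered_at = 0.0     # when the gaze entered that target

    def on_gaze_sample(self, x, y):
        if self.agent.is_speaking():       # 3-6 s lockout during speech
            self.current = None
            return
        self.agent.look_at(x, y)           # head follows the gaze point
        hit = self._hit_test(x, y)
        now = time.monotonic()
        if hit != self.current:
            self.current, self.entered_at = hit, now
        elif hit is not None and now - self.entered_at >= DWELL_THRESHOLD:
            self.agent.describe(hit)       # e.g. "Apples are red."
            self.current = None            # require a fresh fixation

    def _hit_test(self, x, y):
        for name, (x0, y0, x1, y1) in self.targets.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                return name
        return None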
3 Summary
In this paper, we described the challenges of touch-based interaction on mobile
devices for children and how gaze could serve as a beneficial input modality in
educational applications for children. We further presented Little Bear, a
gaze-aware pedagogical agent that tailors its verbal and non-verbal behaviour in
response to the visual attention of the user. Our demonstration system will allow
others to experience the potential of gaze-based interaction. The demonstration and
the related user study [1] show the novel engagement possibilities of a gaze-aware
pedagogical agent.
4 Requirements for the demonstration setup
The requirements for the demonstration setup are a desk, access to power sockets for
the tablet, and preferably a demonstration area with no direct sunlight or other
intense sources of infrared light immediately in front of or behind the user.
Fig. 2. The agent's non-verbal behaviour in response to the visual attention of the
child. (a) A frame from the sad animation. (b)-(e) The head orientation of the bear
changes based on the visual attention of the child (the agent looks where the child
is looking).
References
1. Akkil D, Dey P, Salian D, Rajput N. Gaze Awareness in Agent-Based Early-Childhood
Learning Application. In Proceedings of Human-Computer Interaction (2017).
2. Bruckman A, Bandlow A, Forte A. HCI for kids. In: Jacko J, Sears A (eds.)
Handbook of Human-Computer Interaction. Lawrence Erlbaum Associates, 793-809 (2008).
3. D'Mello S, Olney A, Williams C, Hays P. Gaze tutor: A gaze-reactive intelligent
tutoring system. International Journal of Human-Computer Studies. 70(5):377-398
(2012).
4. Fawcett L. Tablets in Schools: How Useful Are They? (2016).
5. Hiniker A, Sobel K, Hong SR, Suh H, Kim D, Kientz JA. Touchscreen prompts for
preschoolers: designing developmentally appropriate techniques for teaching young
children to perform gestures. In Proceedings of Interaction Design and Children.
109-118 (2015).
6. Hourcade JP. Interaction design and children. Foundations and Trends in
Human-Computer Interaction. 1(4):277-392 (2008).
7. Krämer NC, Bente G. Personalizing e-learning. The social effects of pedagogical
agents. Educational Psychology Review. 22(1):71-87 (2010).
8. Lahiri U, Warren Z, Sarkar N. Design of a gaze-sensitive virtual social
interactive system for children with autism. IEEE Transactions on Neural Systems
and Rehabilitation Engineering. 19(4):443-452 (2011).
9. Luna B. Developmental changes in cognitive control through adolescence. Advances
in Child Development and Behavior. 37:233-278 (2009).
10. McKnight L, Fitton D. Touch-screen technology for children: giving the right
instructions and getting the right responses. In Proceedings of Interaction Design
and Children. 238-241 (2010).
11. Nacher V, Jaen J, Navarro E, Catala A, González P. Multi-touch gestures for
pre-kindergarten children. International Journal of Human-Computer Studies.
73:37-51 (2015).
12. Plowman L, McPake J. Seven myths about young children and technology. Childhood
Education. 89(1):27-33 (2013).
13. Ramloll R, Trepagnier C, Sebrechts M, Finkelmeyer A. A gaze contingent
environment for fostering social attention in autistic children. In Proceedings of
Eye Tracking Research & Applications. 19-26 (2004).
14. Straker LM, Coleman J, Skoss R, Maslen BA, Burgess-Limerick R, Pollock CM. A
comparison of posture and muscle activity during tablet computer, desktop computer
and paper use by young children. Ergonomics. 51(4):540-555 (2008).
15. Vatavu RD, Cramariuc G, Schipor DM. Touch interaction for children aged 3 to 6
years: Experimental findings and relationship to motor skills. International
Journal of Human-Computer Studies. 74:54-76 (2015).
16. Young JG, Trudeau M, Odell D, Marinelli K, Dennerlein JT. Touch-screen tablet
user configurations and case-supported tilt affect head and neck flexion angles.
Work. 41(1):81-91 (2012).