Emotionally Expressive Avatars for
Chatting, Learning and Therapeutic Intervention
Marc Fabri, Salima Y. Awad Elzouki, and David Moore
Faculty of Information and Technology, Leeds Metropolitan University, UK
{m.fabri,s.elzouki,d.moore}@leedsmet.ac.uk
Abstract. We present our work on emotionally expressive avatars, animated
virtual characters that can express emotions via facial expressions. Because
these avatars are highly distinctive and easily recognizable, they may be used in
a range of applications. In the first part of the paper we present their use in
computer mediated communication where two or more people meet in virtual
space, each represented by an avatar. Study results suggest that social
interaction behavior from the real-world is readily transferred to the virtual
world. Empathy is identified as a key component for creating a more enjoyable
experience and greater harmony between users. In the second part of the paper
we discuss the use of avatars as an assistive, educational and therapeutic
technology for people with autism. Based on the results of a preliminary study,
we provide pointers regarding how people with autism may overcome some of
the limitations that characterize their condition.
Keywords: Emotion, avatar, virtual reality, facial expression, instant
messaging, empathy, autism, education, therapeutic intervention.
1 Introduction
The word avatar comes from the Sanskrit language and can be translated as God’s
Incarnation on Earth. In the virtual reality community, avatars are 3D humanoid
characters inhabiting virtual space, with varying degrees of animation and behavioral
abilities. Avatars typically represent humans who visit the space virtually. Each
visitor controls their avatar and is aware of other visitors’ avatars and their actions [3].
In this paper we present our work on emotionally expressive avatars. Our avatars
are able to show expressions corresponding to the following emotions: happiness,
disgust, sadness, surprise, anger and fear [cf. 12]. These avatars are animated and
highly distinctive in their appearance. Because humans are so adept at reading facial
expressions in the real world, any avatar that plausibly displays emotions in a virtual
world can be utilized in a range of applications.
In the first part of the paper we discuss the use of avatars in computer mediated
communication. We have developed and tested an Instant Messaging tool called the
Virtual Messenger, which allowed two users to virtually enter a meeting place in
order to communicate and collaborate on a given task. Users see each other's virtual
representations and chat with each other, as well as visibly express emotions via their
animated avatar heads. We develop an evaluation framework for the user’s subjective
experience and discuss study results.
In the second part of the paper we then look into a potential application area for the
Virtual Messenger, or for tools derived from the interaction paradigm employed.
People with autism often display behavior that is considered socially or emotionally
inappropriate [15], and find it hard to relate to other people [40]. There is evidence
that virtual environment technology can address some of these impairments [5,6,31].
However, any technology using avatars has to be designed so that people with
autism can readily understand the avatar’s expressions, and potentially ascribe a
mental and emotional state to the avatar. We present results from a study that
explored the extent to which children and youth with autism could recognize, and
make inferences from, emotions displayed by a humanoid avatar. The positive
findings support the optimism that such avatars could be used effectively as a) an
assistive technology such as the Virtual Messenger to help people with autism to
circumvent their social isolation, b) as a means of educating the person with autism
where the avatar may in some sense become a “teacher”, and c) as actors in virtual
reality role-playing where people with autism may practice their mind-reading skills.
1.1 Why Emotions Are Important
From the real world we know that whenever one interacts with another person, both
monitor and interpret each other's emotional expressions. Argyle [2] argued that
the expression of emotion, in the face or through the body, is part of a wider system of
natural human communication that has evolved to facilitate social life. Emotions can
also have an influence on cognitive processes, including coping behaviors such as
wishful thinking, resignation, or blame-shifting [21]. Findings in psychology and
neurology suggest that emotions are also an important factor in decision-making,
problem solving, cognition and intelligence in general. Picard [37] pointed out that
the emotional state of others influences not only our own emotional state, but directly
the decisions we make. Another area where emotions can be critical is that of
learning. It has been argued that the ability to show emotions and empathy through
body language is central to ensuring the quality of tutor-learner and learner-learner
interaction [7]. Acceptance and understanding of ideas and feelings, criticizing,
silence, questioning – all involve non-verbal elements of interaction [27]. Emotions
can motivate and encourage, they can help us achieve things [7].
2 Avatars for Chatting – The Virtual Messenger
The ‘Virtual Messenger’ is a communication tool designed to allow two spatially
separated users to meet virtually and discuss a topic (Fig. 1). It is probably best
described as an Instant Messaging tool with the added facility of representing
interlocutors as avatars. The tool allowed us to investigate how a user's experience
differs when the avatars representing users are emotionally expressive, as opposed
to being non-expressive. Giving virtual characters expressive abilities has long been
considered beneficial as it potentially leverages the observer’s real-life experience
with social interaction [37,8]. A choice of six avatar heads was available, each
capable of displaying the "universal" facial expressions of emotion: happiness,
surprise, anger, fear, sadness and disgust [12], plus a neutral face. Expressions were
designed to be highly distinctive and recognizable [14]. All characters were based on
identical animation sequences to ensure consistency and validity.
Fig. 1. The Virtual Messenger Interface
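The paper does not specify which facial action units the avatars animated. As a rough illustration of how a shared animation scheme can be driven from Ekman and Friesen's Facial Action Coding System [12], the Python sketch below maps each emotion to commonly cited prototypical action units (our assumption, not the authors' actual mapping) and applies them through a hypothetical avatar interface.

```python
# Illustrative sketch only: the AU combinations below follow commonly cited
# prototypes derived from Ekman & Friesen's FACS [12]; the paper does not say
# which action units the Virtual Messenger avatars actually used.

FACS_PROTOTYPES = {
    "happiness": [6, 12],               # cheek raiser, lip corner puller
    "surprise":  [1, 2, 5, 26],         # brow raisers, upper lid raiser, jaw drop
    "anger":     [4, 5, 7, 23],         # brow lowerer, lid tightener, lip tightener
    "fear":      [1, 2, 4, 5, 20, 26],  # raised/knitted brows, stretched lips, jaw drop
    "sadness":   [1, 4, 15],            # inner brow raiser, brow lowerer, lip corner depressor
    "disgust":   [9, 15, 16],           # nose wrinkler, lip corner depressor, lower lip depressor
    "neutral":   [],                    # no active action units
}

def animate_expression(avatar, emotion: str, intensity: float = 1.0) -> None:
    """Play the same animation sequence on any avatar head for a given emotion,
    mirroring the paper's point that all characters shared identical sequences.
    `avatar.set_action_unit` is a hypothetical interface, not the authors' API."""
    for au in FACS_PROTOTYPES[emotion]:
        avatar.set_action_unit(au, intensity)
```

Because every head would be driven from the same table, distinctiveness comes from the character models while the expressions stay consistent across avatars.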
2.1 Experimental Setup
The Virtual Messenger was evaluated in a between-groups experiment conducted in
pairs. Participants were given a classic survival scenario ("You are stranded in the
desert about 50 miles from the nearest road…") and had to debate what course of
action would be best for their survival. During the experiment, their only means of
communication was the Virtual Messenger. There were two versions of the Virtual
Messenger tool, corresponding to the experimental conditions:
1. Condition (NE): Users could click on emoticons which then appeared in the chat
log of both participants. Participants were represented by avatars, but there was no
change in avatar appearance other than random idle animations such as blinking.
2. Condition (EX): Featured the same emoticons and avatar representations. When a
user clicked an emoticon, it appeared in the chat log and also caused their avatar in
the partner’s messenger window to display that emotion.
By making emotional expressiveness the intervention, we were able to investigate
its effect on the user’s experience. Conditions were assigned to participant pairs, i.e.
interlocutors either could both use their avatar’s expressions, or neither of them could.
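As a minimal sketch of the difference between the two conditions (the Virtual Messenger's implementation is not described at code level in the paper, and all names below are hypothetical), an emoticon click always produces a chat-log entry in both clients, while only condition (EX) additionally animates the sender's avatar in the partner's window:

```python
from dataclasses import dataclass, field

@dataclass
class Participant:
    name: str
    chat_log: list = field(default_factory=list)   # emoticons and messages seen

def on_emoticon_click(expressive: bool, sender: Participant,
                      partner: Participant, emotion: str) -> None:
    """expressive=True corresponds to condition (EX), False to condition (NE)."""
    # Both conditions: the clicked emoticon appears in both chat logs.
    sender.chat_log.append(emotion)
    partner.chat_log.append(emotion)
    # Condition (EX) only: the sender's avatar, as shown in the partner's
    # messenger window, also plays the corresponding facial expression.
    if expressive:
        play_expression_on_remote_avatar(partner, sender, emotion)

def play_expression_on_remote_avatar(viewer: Participant,
                                     owner: Participant, emotion: str) -> None:
    # Stand-in for the real rendering call, which is not described in the paper.
    print(f"{viewer.name}'s view: {owner.name}'s avatar shows '{emotion}'")
```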
2.2 Evaluation Framework
In order to evaluate the user’s experience effectively, we introduced the concept of
“richness of experience” and hypothesized that a user’s experience is richer when
avatars are emotionally expressive, compared to non-expressive avatars. It was
postulated that a richer experience would manifest itself through:
1. More involvement in the task
2. Greater enjoyment of the experience
3. A higher sense of presence during the experience
4. A higher sense of copresence
In addition to these four measures, participants were observed during the
experiment, and given the opportunity to comment on any aspect of their experience
after the task was completed. These qualitative measures formed an important part of
the data analysis. At the time of the experiment no comparable study combining all
four factors of richness existed. Various researchers have looked at these in isolation
[39,29,38,17,20] and their interpretations informed the definition and the choice of
evaluation tools. Below we explore each characteristic in detail:
1. Involvement: Defined here as an objective measure of the number of user-initiated
actions and communications taking place. These were automatically recorded.
2. Enjoyment: Designers of consumer products have long been aware of the
potential that quantifying enjoyment and pleasurability of use can yield for a
product's success [25,22]. We used Nichols’ [32] mood adjective checklist, a self-
report tool specifically designed for measuring aspects of interaction in virtual
reality systems.
3. Presence: Defined as "a psychological state in which the individual perceives
oneself as existing within an environment" [4]. Several instruments for measuring
presence were available varying from technological [41] to psychological,
introspective approaches [39]. We used the fully validated 43-item ITC-SOPI
questionnaire [29] because of its generic applicability. ITC-SOPI considers four
distinct factors of presence: a) Being in a physical space other than the actual place
one is in (spatial), b) the user’s interest in the presented content (engagement), c)
believability and realism of the content (naturalness), and d) negative physical
effects of the experience such as headaches or eyestrain.
4. Copresence: Refers to the sense of being together with another person in a
computer-generated environment, sometimes also referred to as social presence
[20,39]. For the Virtual Messenger investigation, we followed other researchers
[17,38] in measuring the phenomenon of copresence via a short post-experiment
questionnaire covering aspects of space, togetherness and responsiveness. A sketch
of how such questionnaire-based measures might be scored follows this list.
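To make the questionnaire-based measures concrete, the sketch below shows how per-factor presence scores and a copresence score might be aggregated from Likert-type responses. The item-to-factor assignment is invented for illustration; it does not reproduce the actual 43-item ITC-SOPI [29] or the copresence questionnaire used in the study.

```python
from statistics import mean

# Hypothetical item-to-factor assignment: the real ITC-SOPI has 43 items
# grouped into spatial presence, engagement, naturalness and negative effects.
PRESENCE_FACTORS = {
    "spatial":          [1, 2, 3],
    "engagement":       [4, 5, 6],
    "naturalness":      [7, 8],
    "negative_effects": [9, 10],
}

def presence_factor_scores(responses: dict[int, int]) -> dict[str, float]:
    """Average the 1-5 Likert responses belonging to each presence factor."""
    return {factor: mean(responses[item] for item in items)
            for factor, items in PRESENCE_FACTORS.items()}

def copresence_score(responses: list[int]) -> float:
    """Short post-experiment copresence questionnaire: mean of its items,
    covering space, togetherness and responsiveness."""
    return mean(responses)
```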
2.3 Results and Analysis
Thirty-two volunteers took part in the study. They were aged 21–63 years, with an
equal gender split (mean age 28.2 years, SD 13.4). Participants were computer literate, well
educated and skilled in the use of keyboard and mouse. Few had experience of using
console games, virtual reality, or other 3D applications. Several had used Instant
Messaging tools before. In summary, results confirmed that the avatar faces used
were effective and efficient. It is worth looking at each measure in detail because not
all characteristics of richness produced equally conclusive results:
1. Involvement: Sessions lasted between 8 and 35 minutes (average 21.2) excluding
questionnaires. (EX) participants were significantly more involved in the task
(p<0.05). They wrote considerably more messages, their messages were longer, nearly
twice as many items were moved, and emoticons were used eight times more often.
2. Enjoyment: High enjoyment scores were recorded in both conditions (averaging
71.22% for (EX) and 72.01% for (NE)). The difference was not statistically
significant at the 0.05 level (one-way ANOVA).
3. Presence: The subjective sense of presence was consistently high across both
conditions, with no significant difference between the two conditions overall or
when testing for individual factors at the 0.05 level (one-way ANOVA). As an indication,
presence scores for the (EX) condition were 3.06 (Spatial), 3.61 (Engagement),
2.61 (Naturalness) and 1.63 (Negative Effects), all based on a 5-point Likert scale.
4. Copresence: Participants using expressive avatars (EX) reported a significantly
higher sense of copresence (F(1,32)=3.5, p<0.05). Average scores were 4.13 for
condition (EX) and 3.73 for (NE), again on a 5-point Likert scale; a sketch of the
corresponding between-groups test follows below.
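The between-groups comparisons reported above correspond to one-way ANOVAs over per-participant scores. The sketch below shows such a test with scipy; the score lists are placeholders, since the individual participant data are not published in the paper.

```python
from scipy import stats

# Placeholder copresence scores (1-5 scale) for illustration only; the actual
# per-participant values are not reported in the paper.
ex_scores = [4.2, 4.0, 4.5, 3.9, 4.3, 4.1, 4.4, 4.0]   # condition (EX)
ne_scores = [3.6, 3.8, 3.5, 3.9, 3.7, 3.9, 3.6, 3.8]   # condition (NE)

f_stat, p_value = stats.f_oneway(ex_scores, ne_scores)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
# A p-value below 0.05 indicates a significant between-groups difference, as the
# study found for involvement and copresence but not for enjoyment or presence.
```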
2.4 Discussion
Whilst involvement and copresence showed significantly higher scores in the (EX)
condition (supporting the hypothesis), enjoyment and presence showed no
significant difference between the two conditions. The consistently high
enjoyment scores may arguably have been influenced by the novelty of the application.
When looking at quantitative factors in combination with the qualitative data,
interesting patterns emerged: the greater activity under condition (EX), which was
logged as well as visually observed, could be attributed to a richer experience.
However, the way these discussions developed would have been counter-productive
in a real life-threatening situation. (NE) participants acted in a more task-oriented
and efficient way. It is possible that the introduction of emotional expressiveness
may not be appropriate for all types of avatar applications, or in all communication
contexts, as it may distract users from the task at hand. An alternative possibility is that by
focusing on emotional expressions alone the environment may become “hyper-
emotional”, leading to a distracting rather than constructive collaborative experience.
There was a tendency by some participants to mimic emotions displayed by their
partners. This was particularly relevant to the (EX) condition where it affected
predominantly the happiness expression. The mimicry of communicative cues is well-
documented for real life social interaction, typically as a regulator of trust and rapport
[26,28]. Imitative behavior is also considered a good indicator for the existence of
empathy [34]. From the observations we infer that such mimicry may have taken
place during the task, when participants appeared to have copied their partner’s facial
expressions. This in turn may have led to greater likeability between partners, as they
sensed an interlocutor who empathized with them, which is congruent
with post-experiment feedback from these participants. There is, then, some evidence
that mechanisms fostering the emergence of empathy in the real world may apply
equally to interaction through the Virtual Messenger, despite the somewhat artificial
setup. It is quite conceivable that deliberate use of mimicry in future systems has the
potential to be a useful means of communication – and potentially persuasion.
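Mimicry of this kind could, in principle, be detected from the interaction logs. The heuristic below is our own illustration rather than the analysis actually used in the study: it flags cases where one participant displays the same expression their partner showed shortly before.

```python
def count_mimicry(events: list[tuple[float, str, str]], window: float = 10.0) -> int:
    """events: time-ordered (timestamp_seconds, user_id, emotion) tuples from the log.
    Counts occasions where the other user repeats an expression within `window`
    seconds - a crude proxy for imitative behaviour."""
    count = 0
    for i, (t, user, emotion) in enumerate(events):
        for t2, user2, emotion2 in events[i + 1:]:
            if t2 - t > window:
                break
            if user2 != user and emotion2 == emotion:
                count += 1
                break
    return count

# Example with invented log entries: B mirrors A's happiness within 4 seconds.
log = [(12.0, "A", "happiness"), (16.0, "B", "happiness"), (40.0, "A", "surprise")]
print(count_mimicry(log))   # -> 1
```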
In the next section we consider the further application of emotionally expressive
avatars, such as those used here. Our main concern is to find ways to help people with
autism overcome at least some of the limitations that characterize their condition.
3 Avatars for Learning and Therapeutic Intervention
Wing [40] considers autism to involve a "triad of impairments": 1) a social
impairment: the person with autism finds it hard to relate to and empathize with other
people; 2) a communication impairment: the person with autism finds it hard to
understand and use verbal and non-verbal signals, and may display behavior
considered socially or emotionally inappropriate [15]; and 3) a tendency towards
rigidity and inflexibility in thinking, language and behavior. Research suggests that
this triad is underpinned by a “theory of mind deficit” [24]: people with autism may
have difficulty understanding other people’s mental and emotional state, or ascribing
such a state to themselves. Given this understanding of autism, we argue that virtual
reality systems utilizing emotionally expressive avatars can potentially benefit people
with autism in three ways – as an assistive technology, as an educational technology,
and as a means of helping address any Theory of Mind deficit.
3.1 Avatars as an Assistive Technology
Concerning its potential role as an assistive technology, our argument is that people
with autism may be able to use the Virtual Messenger to communicate more fruitfully
with other people. This is important since people with autism may experience social
exclusion because they find it difficult to make friends [6]. Indeed, difficulty in
relating socially to other people is seen as a hallmark of autism [33]. Any means of
addressing these issues, we argue, is therefore worthy of investigation. Tools such as
the Virtual Messenger have the potential to enable communication that is simpler and
less threatening to people with autism than its face-to-face equivalent, thereby
avoiding many of the potential pitfalls [35]. The direct and active control over
interactions may also increase the confidence of people who otherwise feel out of control
in social situations [35]. Users can communicate at their own pace and, if needed,
slow down the rate of interaction in order to gain time to think of alternative ways of
dealing with a particular situation. Thus, tools like the Virtual Messenger can
potentially help people with autism who cannot or do not wish to come together
physically, but who wish to discuss common interests. It may provide a means by
which people with autism can communicate with others, and thus circumvent, at least
in part, their social and communication impairment and sense of isolation.
3.2 Avatars as an Educational Technology
Concerning the potential educational use of the Virtual Messenger, the idea is to use
the technology as a means of educating the user with autism, possibly in an attempt to
help overcome their autism-specific “deficits”. Thus the conversational partner of a
user with autism may be in some sense their “teacher”. One specific way in which this
might be used is for the purposes of practice and rehearsal of events in the “real
world”, for example a forthcoming school visit, family gathering or interview.
Programmes that allow people with autism to practice social skills are often
advocated, partly on the grounds that social impairments can affect general
educational progress [1]. The argument for the use of avatar-based communication in
such programmes is that it enables social skills to be practiced and rehearsed in
realistic settings in real time [6,35,36]. Tools like the Virtual Messenger offer a safe
and controlled environment which can be used repeatedly under the same conditions
in order to learn appropriate social rules, without having to deal face-to-face with
other people [6]. Users’ interactions can be recorded and used for subsequent
educational discussion. This creates an opportunity for people with autism to learn by
making mistakes but without suffering the real consequence of their errors.
3.3 Addressing Theory of Mind Issues
Another interesting possibility is that of using tools like the Virtual Messenger to help
people with autism with any Theory of Mind (ToM) deficit. Although the status of
this alleged deficit is controversial, with, for example, some research suggesting that
the perception of emotions in others is not systematically or specifically deficient in
people with autism [19], many advocate its explicit teaching [e.g. 24,33]. It is argued
[24] that children with autism can be successfully taught to interpret mental states.
We argue, then, that tools like the Virtual Messenger can potentially play a
valuable role concerning ToM. Being able to express their emotions through a choice
of appropriate facial expressions for their avatars, and being required to interpret the
emotions displayed by their interlocutors’ avatars, may help address the ToM issue in
users with autism. McIlhagga and George [30] suggest that users who see other
avatars’ behavior and facial expressions may build a model of the emotional state of
the underlying agent or user. Enabling people with autism to work in such
environments provides them, in principle at least, with an opportunity to practice their
mind reading skills and address ToM issues.
4 Avatars for People with Autism – An Exploratory Study
In order to investigate whether and how people with autism may interact with
emotionally expressive avatars, we conducted a preliminary study with two aims: to
establish whether a) the chosen avatars were readily recognizable, and b) participants
could relate events in simple social scenarios to the relevant emotions. We developed
a single-user computer system, incorporating avatar representations for 4 emotions –
happy, sad, angry, frightened – and involving 3 stages [5]. It should be noted that the
development of the system happened in parallel to the Virtual Messenger
development. While the action units underlying all facial expressions were identical
in the two systems, for technical reasons different avatar models were used.
In Stage 1 the avatar representations of the 4 emotions were sequentially presented
in isolation. Users were asked to select, from a list, the emotion they thought was being
displayed. In a second activity, users were told that a particular emotion was being felt
and asked to select the avatar head they believed corresponded to that emotion. These
two activities form part of a standard procedure to establish the baseline of emotion
recognition [24]. Stage 2 attempts to elicit the possible emotions in the context of a
simple social scenario (Fig. 2). It requires users to predict the likely emotion caused
by certain events. In Stage 3 of the system the user is given an avatar representation
of one of the emotions and asked to select which of a number of given events they
think may have caused this emotion. Throughout the system, the avatar “face” is used
as the means of attempting to portray the emotions. A problematic issue when
developing the system concerned the range of emotions to consider. The literature
suggests that there are 6 universal expressions of emotion [12] as used with the
Virtual Messenger. However, autism researchers [18] argue that it is debatable when
and if children utilize all 6 emotions. Instead, work with individuals with autism tends
to concentrate on a subset of emotions. While subsets vary in the literature, we
followed [24] and used happy, sad, angry and frightened.
Fig. 2. Stage 2 of the system
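The three stages can be thought of as three forced-choice trial types over the same four emotions, with every response written to the log file returned on diskette. The sketch below outlines Stage 1 in that spirit; the user-interface call and the example "user" are stand-ins of our own, not the actual software described in [5].

```python
import random
from typing import Callable

EMOTIONS = ["happy", "sad", "angry", "frightened"]

def run_stage1(show_and_ask: Callable[[str, list[str]], str]) -> list[dict]:
    """Stage 1: each avatar expression is presented in isolation, in random order,
    and the user selects from the list the emotion they think is being displayed.
    `show_and_ask(target, options)` stands in for the real avatar display and input."""
    trials = []
    for target in random.sample(EMOTIONS, len(EMOTIONS)):
        answer = show_and_ask(target, EMOTIONS)
        trials.append({"stage": 1, "target": target,
                       "answer": answer, "correct": answer == target})
    return trials

# Example run with a 'user' who always answers "happy": one of four trials correct.
demo_log = run_stage1(lambda target, options: "happy")
print(sum(t["correct"] for t in demo_log), "of", len(demo_log), "correct")
```

Stages 2 and 3 follow the same pattern, with event-to-emotion and emotion-to-event choices respectively.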
4.1 Results and Discussion
The study involved school-aged participants with a diagnosis of autism. Of 100
potential UK-based participants contacted, 34 replied. Eighteen participants were
reported as children with Asperger's Syndrome and 16 as children with severe autism.
The age range was from 7 to 16 years (mean 9.96 years). Twenty-nine participants
were male and five female.
Each participant was sent a pack consisting of a CD containing the system outlined
above, a blank diskette, a questionnaire asking participants for their views about the
software, a parent questionnaire asking for the participant's age and autism diagnosis
and for the parent's views about the software, brief instructions, and a stamped
addressed envelope. Participants were asked to work through the 3 stages of the
system described above. The software logged their work onto the diskette. Once the
task was completed, participants and their parents were each asked to fill in the
questionnaires. The diskette with log data and the questionnaires were then returned.
Results from analyzing the log files suggest that, for all but one of the questions,
the participants were demonstrating responses significantly above those expected by
chance. Of the 34 participants, 30 were able to use the avatars at levels demonstrably
better than chance. Concerning the four participants who did not demonstrate a
significant difference from chance, it appears that they had real
difficulty in understanding the emotional representations of the avatars. These four
participants were in the group reported as having severe autism as
opposed to Asperger's Syndrome. In general, however, for the participants who
responded, there is very strong evidence that the emotions of the avatars are being
understood and used appropriately.
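Performance "better than chance" can be checked per participant with a binomial test: with four response options the chance level is 0.25. The sketch below uses scipy with an invented trial count and score, since the per-participant counts are not reported here.

```python
from scipy.stats import binomtest

n_trials = 20        # hypothetical number of forced-choice questions answered
n_correct = 14       # hypothetical number answered correctly
chance_level = 0.25  # four options: happy, sad, angry, frightened

result = binomtest(n_correct, n_trials, p=chance_level, alternative="greater")
print(f"p = {result.pvalue:.4f}")
# A p-value below 0.05 indicates responses above chance level - the criterion
# by which 30 of the 34 participants were judged to use the avatars reliably.
```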
5 Summary and Further Work
We have outlined two empirical studies concerning emotionally expressive avatars.
The first investigates how the ability to express and perceive emotions during a
dialogue between two individuals in the Virtual Messenger tool affects their
experience of the given virtual world scenario. The study has also led to the
development of guidelines for making emotionally expressive avatars effective and
efficient [see 13]. The second study can be seen as an application of the first study to
the specific potential user group of people with autism. We believe that this study
gives grounds for optimism that avatars can help to address one or more of the
impairments of people with autism.
We are currently conducting a third empirical study to investigate whether and how
the avatars used in the Virtual Messenger are recognizable, and their emotional
expressions understandable to children with more severe autism. By studying the pre-
validated emotion representations with such a user group, we are arguably testing the
standard in extremis, and hence potentially enabling the standard to be strengthened.
This is an example of an “off-shoot” argument for assistive technology – lessons from
the use of the technology in extraordinary human-computer interaction might lead to
helpful development of the technology for “general” use [11,23]. Similarly, our work
can be expected to contribute towards clarifying the noted lack of guidance in the
literature [cf. 18] regarding how children might understand the behavior of virtual
characters and their emotional signals.
Much remains to be done, therefore, and we hope that the studies reported in this
paper may play a part in moving forward the important area of emotional
expressiveness in avatar-based communications, as well as the use of avatars as an
educational and therapeutic tool.
References
1. Aarons, M., Gittens, T.: Autism: A social skills approach for children and adolescents.
Winslow Press, Oxford (1998)
2. Argyle, M.: Bodily Communication, 2nd edn. Methuen, New York (1988)
3. Bailenson, J., Blascovich, J.: Avatars. Encyclopedia of HCI. Berkshire, pp. 64–68 (2004)
4. Blascovich, J.: Social influence within immersive virtual environments. In: Schroeder, R.
(ed.) The Social Life of Avatars. CSCW Series, pp. 127–145. Springer, London (2002)
5. Cheng, Y.: An avatar representation of emotion in collaborative virtual environment
technology for people with autism. PhD thesis, Leeds Metropolitan University, UK (2005)
6. Cobb, S., Beardon, L., Eastgate, R., Glover, T., Kerr, S., Neale, H., Parsons, S., Benford,
S., Hopkins, E., Mitchell, P., Reynard, G., Wilson, J.: Applied virtual environments to
support learning of social interaction skills in users with Aspergers Syndrome. Digital
Creativity 13(1), 11–22 (2002)
7. Cooper, B., Brna, P., Martins, A.: Effective Affective in Intelligent Systems – Building on
Evidence of Empathy in Teaching and Learning. In: Paiva, A. (ed.) Affective Interactions.
LNCS (LNAI), vol. 1814, pp. 21–34. Springer, London (2000)
8. Cowell, A., Stanney, K.: Manipulation of non-verbal interaction style and demographic
embodiment to increase anthropomorphic computer character credibility. International
Journal of Human-Computer Studies 62, 281–306 (2005)
9. Dautenhahn, K., Woods, S.: Possible Connections between bullying behaviour, empathy
and imitation. In: Proceedings of Second International Symposium on Imitation in
Animals and Artifacts, pp. 68–77, AISB Society (2003) ISBN 1-902956-30-7
10. Desmet, P.M.A.: Measuring emotion: development and application of an instrument to
measure emotional responses to products. In: Blythe, M.A., Monk, A.F., Overbeeke, K.,
Wright, P.C. (eds.) Funology: from usability to enjoyment, pp. 111–123. Kluwer,
Dordrecht (2003)
11. Edwards, A.: Extra-ordinary human-computer interaction. Cambridge University Press,
New York (1995)
12. Ekman, P., Friesen, W.V.: Facial Action Coding System. Consulting Psych. Press (1978)
13. Fabri, M.: Emotionally expressive avatars for collaborative virtual environments. PhD
Thesis, Leeds Metropolitan University, UK (2006)
14. Fabri, M., Moore, D., Hobbs, D.: Mediating the Expression of Emotion in Educational
CVEs. International Journal of Virtual Reality, Springer, London, 7(2), 66–81 (2004)
15. Frith, U.: Autism: Explaining the Enigma. Blackwell, Oxford (1989)
16. Garau, M., Slater, M., Pertaub, D., Razzaque, S.: The responses of people to virtual
humans in an Immersive Virtual Environment. Presence, vol. 14(1), pp. 104–116. MIT
Press, Cambridge (2005)
17. Garau, M.: The Impact of Avatar Fidelity on Social Interaction in Virtual Environments.
PhD Thesis, University College London (2003)
18. George, P., McIlhagga, M.: The communication of meaningful emotional information for
children interacting with virtual actors. In: Paiva, A.M. (ed.) Affective Interactions. LNCS
(LNAI), vol. 1814, pp. 35–48. Springer, Berlin (2000)
19. Gepner, B., Deruelle, C., Grynfeltt, S.: Motion and emotion: A novel approach to the study
of face processing by young autistic children. Journal of Autism and Developmental
Disorders 31, 37–45 (2001)
20. Gerhard, M.: A Hybrid Avatar/Agent Model for Educational Collaborative Virtual
Environments. PhD Thesis, Leeds Metropolitan University, UK (2003)
21. Gratch, J., Marsella, S.: Evaluating a computational model of emotion. Journal of
Autonomous Agents and Multi-Agent Systems 11(1), 23–43 (2005)
22. Hassenzahl, M.: The effect of perceived hedonic quality on product appealingness.
International Journal of Human-Computer Interaction 13(4), 481–499 (2001)
23. Hobbs, D.J., Moore, D.J.: Human computer interaction. FTK Publishing, London (1998)
24. Howlin, P., Baron-Cohen, S., Hadwin, J.: Teaching Children with Autism to Mind-Read:
A Practical Guide for Teachers and Parents. John Wiley and Sons, New York (1999)
25. Jordan, P.W.: Designing Pleasurable Products. Taylor and Francis, London (2000)
26. Kendon, A.: Movement coordination in social interactions. Acta Psych 32(2), 101–125
(1970)
27. Knapp, M.L., Hall, J.A.: Nonverbal Communication in Human Interaction, 3rd edn., Holt,
Rinehart and Winston (1992)
28. LaFrance, M.: Posture Mirroring and Rapport. In: Davis, M. (ed.): Interaction Rhythms:
Periodicity in Communicative Behavior, pp. 279–298 (1982)
29. Lessiter, J., Freeman, J., Keogh, E., Davidoff, J.D.: A Cross-Media Presence
Questionnaire: The ITC Sense of Presence Inventory. Presence, MIT Press, Cambridge,
10(3) (2001)
30. McIlhagga, M., George, P.: Communicating Meaningful Emotional Information in a
Virtual World. In: Paiva, A., Martinho, C. (eds.): Proceedings of International Workshop
on Affect in Interactions, Siena, Italy, pp. 150–155 (1999)
31. Moore, D.J., Cheng, Y., McGrath, P., Powell, N.J.: CVE technology for people with
autism. Focus on Autism and Other Developmental Disabilities 20(4), 231–243 (2005)
32. Nichols, S.: Virtual Reality Induced Symptoms and Effects (VRISE): Methodological and
Theoretical Issues. PhD Thesis, University of Nottingham, UK (1999)
33. Ozonoff, S., Miller, J.: Teaching Theory of Mind. Journal of Autism and Developmental
Disorders 25, 415–433 (1995)
34. Paiva, A., Dias, J., Sobral, D., Aylett, R., Sobreperez, P., Woods, S., Zoll, C., Hall, L.:
Caring for Agents and Agents that Care. In: Proceedings of International Conference on
Autonomous Agents and Multi-Agent System, New York, USA (2004)
35. Parsons, S., Mitchell, P., Leonard, A.: Do adolescents with autistic spectrum disorders
adhere to social conventions in virtual environments? Autism 9, 95–117 (2005)
36. Parsons, S., Mitchell, P., Leonard, A.: The use and understanding of VE by adolescents
with autistic spectrum disorders. J of Autism and Developmental Disorders 34(4), 449–
466 (2004)
37. Picard, R.: Affective Computing. MIT Press, Cambridge (1997)
38. Schroeder, R., Steed, A., Axelsson, A., Heldal, I., Abelin, A., Wideström, J., Nilsson, A.,
Slater, M.: Collaborating in Networked Immersive Spaces. Computers and Graphics 25(5),
781–788 (2001)
39. Slater, M.: Measuring Presence: A Response to the Witmer and Singer Presence
Questionnaire. Presence, MIT Press, Cambridge, 8(5), 560–565 (1999)
40. Wing, L.: The Autism Spectrum. Constable, London (1996)
41. Witmer, B.G., Singer, M.J.: Measuring Presence in Virtual Environments: a Presence
Questionnaire. Presence, MIT Press, Cambridge, 7(3), 225–240 (1998)