Play with Emotional Characters: Improving User Emotional Experience by a Data-driven Approach in VR Volleyball Games

Zechen Bai1*, Naiming Yao1, Nidhi Mishra2, Hui Chen1†, Hongan Wang1, Nadia Magnenat Thalmann2
1 Institute of Software, Chinese Academy of Sciences, China
2 Nanyang Technological University, Singapore

*E-mail: zechen2019@iscas.ac.cn
†Corresponding author: chenhui@iscas.ac.cn
Figure 1: Screenshot of the VR volleyball game with emotional virtual characters (panels labeled "Loser Team" and "Winner Team"). After a point is scored, the virtual characters express appropriate emotions according to the game state.
ABSTRACT
In real-world volleyball games, players are generally aware of the emotions of other players, since they can observe facial expressions, body behaviors, etc., which evokes a rich emotional experience. However, most VR volleyball games concentrate on modeling the game play itself rather than supporting an emotional experience. We introduce a data-driven framework that enhances the user's emotional experience and engagement by building emotional virtual characters in VR volleyball games. The framework enables virtual characters to arouse emotions according to the game state and to express them through facial expressions. Evaluation results demonstrate that our framework enhances the user's emotional experience and engagement.
Index Terms: VR volleyball games—emotional experience—emotional virtual characters—sports games
1 INTRODUCTION
Virtual reality (VR) has the potential to support rich user experiences and has been widely used in immersive sports games. In VR volleyball games, enabling virtual characters to behave like real athletes is crucial for increasing the user's feeling of immersion and engagement. Existing VR volleyball games [2, 3] usually concentrate on imitating physical body movements (e.g., pass, pick-up, volley) that are related to game playing, rather than emotional expressions. We argue that virtual characters' emotional expressions could also enhance the user's experience in terms of emotion and engagement. Therefore, in this work, we develop virtual characters that can generate emotional facial expressions according to the game state (as shown in Figure 1) and evaluate their effect on the user's emotional experience and engagement.
2 FRAMEWORK
In real-world volleyball games, players experience a variety of emotions, such as joy, pride, or disappointment. This process involves (1) arousing an emotion from the game state, such as scores and activities, and (2) expressing the emotion appropriately through facial expressions and/or body behaviors. We postulate that a virtual character should go through a similar procedure. We therefore decompose the problem of building emotional virtual characters into two consecutive parts: (1) a data-driven emotion cause model, which infers characters' emotions from the game state, and (2) an expression model, which generates the target emotional expression and renders a plausible animation.
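As a minimal sketch of this decomposition (the class, field, and function names below are illustrative assumptions, not our actual implementation), the two parts can be chained once per rally:

# Minimal sketch of the two-part pipeline; all names are illustrative.
from dataclasses import dataclass

@dataclass
class GameState:
    game_score: tuple            # e.g., (24, 23) points in the current set
    match_score: tuple           # e.g., (1, 1) sets won
    winner_team: str             # team that scored this rally, "A" or "B"
    score_reason: str            # e.g., "spike", "service ace", "error"
    last_ball_handler_team: str
    last_ball_handler_action: str

def on_rally_end(state: GameState, cause_model, expression_model):
    """Called once per rally: infer emotions, then animate them."""
    # Part 1: the emotion cause model (Sec. 2.1) maps the game state
    # to per-team emotion states.
    emotions = cause_model.predict(state)  # e.g., {"A": ["joy", "pride"], ...}
    # Part 2: the expression model (Sec. 2.2) renders the corresponding
    # facial expression animation for each team's characters.
    for team, team_emotions in emotions.items():
        expression_model.animate(team, team_emotions)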
2.1 Emotion cause model
Conventional emotion cause models are usually based on emotion theories and hand-crafted rules [4]. Such rules are often manually designed and coarse, and risk encoding stereotypes, while the underlying emotion theories may be unsuitable for modeling emotion in sports games [1]. We therefore adopt a data-driven emotion cause model that is capable of learning emotion cause patterns. As the training data come from real-world volleyball games, the optimized model tends to behave more plausibly in VR volleyball games.
To achieve this, we construct a volleyball emotion dataset from real-world volleyball game videos. Specifically, we split the game videos into rally-level clips and annotate each rally with the game state and the resulting emotion states of the two teams. The game state comprises {game score, match score, winner team, score reason, last-ball-handler team, last-ball-handler action}. The possible emotion categories are {joy, admire, satisfied, angry, disliking, disappointed, pride, shame, neutral}.
Figure 2: The framework consists of two parts: the emotion cause model (left) and the expression model (right). The emotion cause model takes the game state as input and predicts emotion states. The expression model then lets the virtual characters express the predicted emotions appropriately.
We annotate two representative emotion categories for each team, as the players in one team may present different kinds of emotions.
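For concreteness, a single rally-level annotation could be represented as follows (the field names and file format are illustrative; the paper fixes only the information content):

# One hypothetical rally-level annotation; field names are illustrative.
rally_annotation = {
    "game_score": [24, 23],            # points in the current set
    "match_score": [1, 1],             # sets won by each team
    "winner_team": "A",                # team that scored this rally
    "score_reason": "spike",           # how the point was scored
    "last_ball_handler_team": "A",
    "last_ball_handler_action": "spike",
    # Two representative emotion categories per team (multi-label targets).
    "emotions": {"A": ["joy", "pride"], "B": ["disappointed", "angry"]},
}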
The emotion cause model is shown in the left part of Figure 2. We formulate the task as a multi-label classification problem and solve it with a multi-layer neural network. Given the game state of a rally as input, the model is trained to predict the appropriate emotion states of the characters at the end of that rally. During training, the parameters of the neural network are optimized by minimizing the widely used binary cross entropy (BCE) loss between the prediction and the ground-truth emotion states. In the testing phase (i.e., while the game is running), the model predicts the top-2 most probable emotion categories for each team (winner and loser).
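A minimal PyTorch sketch of such a model is shown below; the input encoding, layer sizes, and dummy tensors are assumptions, as the paper does not specify them:

# Sketch of the emotion cause model: a multi-layer network trained as a
# multi-label classifier with BCE loss (hyperparameters are assumptions).
import torch
import torch.nn as nn

NUM_EMOTIONS = 9  # joy, admire, satisfied, angry, disliking,
                  # disappointed, pride, shame, neutral

class EmotionCauseModel(nn.Module):
    def __init__(self, state_dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, NUM_EMOTIONS),  # one logit per emotion
        )

    def forward(self, state_vec: torch.Tensor) -> torch.Tensor:
        return self.net(state_vec)

model = EmotionCauseModel(state_dim=16)  # 16-dim encoded game state (assumed)
loss_fn = nn.BCEWithLogitsLoss()         # binary cross entropy on the logits

# Training step: targets are multi-hot vectors over the emotion categories.
states = torch.randn(8, 16)                               # dummy batch
targets = torch.randint(0, 2, (8, NUM_EMOTIONS)).float()  # dummy labels
loss = loss_fn(model(states), targets)
loss.backward()

# Testing (while the game runs): top-2 most probable emotions per team.
with torch.no_grad():
    probs = torch.sigmoid(model(states[:1]))
    top2 = probs.topk(2, dim=-1).indices  # indices into the emotion list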
2.2 Emotional expression model
Regarding emotional expressions, we focus on generating the characters' facial animation. The expression model renders an emotional expression animation for each player according to the predicted emotion states. The generation process comprises three steps: emotion assignment, facial expression generation, and time synchronization.
For emotion assignment, the two predicted emotion states are randomly assigned to the six virtual characters within one team. The facial expression generation module employs blendshapes to generate the peak facial expression. The result is a linear combination of a neutral face B0 plus 46 facial blendshapes B1–B46 and 4 eye-gaze blendshapes B61–B64, following the Facial Action Coding System (FACS). The time synchronization step then generates the emotional expression animation from the peak facial expression: the animation starts with a neutral face, reaches the peak intensity of the emotional expression, and returns to a neutral state at the end. The duration of the facial animation is time-synchronized with the virtual character's ongoing body behavior animation, which already exists in the VR volleyball platform. The intermediate transitions of the animation are implemented with linear interpolation.
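The sketch below illustrates the three steps in NumPy; the per-emotion blendshape weights and the exact assignment policy are assumptions beyond what is stated above:

# Sketch of the expression model: assignment, blendshape combination,
# and a time-synchronized neutral -> peak -> neutral envelope.
import random
import numpy as np

def assign_emotions(predicted_two: list, num_characters: int = 6) -> list:
    """Randomly assign the two predicted emotions to a team's characters."""
    return [random.choice(predicted_two) for _ in range(num_characters)]

def peak_expression(b0: np.ndarray, shapes: np.ndarray, w: np.ndarray):
    """Linear blendshape model: neutral face plus weighted shape offsets.

    b0:     (V, 3) neutral-face vertices (B0)
    shapes: (K, V, 3) blendshape targets (facial B1-B46, eye-gaze B61-B64)
    w:      (K,) per-blendshape activation for the target emotion
    """
    return b0 + np.tensordot(w, shapes - b0, axes=1)

def expression_at(t: float, duration: float, b0, peak):
    """Face at time t: linear ramp up to the peak, then back to neutral.

    `duration` is taken from the character's ongoing body animation so
    the facial animation stays time-synchronized with it.
    """
    half = duration / 2.0
    alpha = t / half if t < half else (duration - t) / half
    return (1.0 - alpha) * b0 + alpha * peak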
3 EXPERIMENTS
Participants and procedure. Twenty participants (10 men, 10 women, aged 19 to 28 years, M = 23) were recruited, all of whom had prior experience with VR games. Two versions of the VR volleyball game were prepared: benchmark versus emotional. The benchmark version is the original game, which only possesses the basic behaviors related to game playing; the emotional version is equipped with the proposed framework. The only difference between the two versions is whether the virtual characters have emotional expressions. The participants played the two versions in a random order and were not told which version they were playing. After playing both versions, all participants answered the questionnaire in Table 1.
Table 1: Questionnaire used to investigate the emotional experience in the VR volleyball game; the scores have five levels: 0 = not at all; 1 = slightly; 2 = moderately; 3 = fairly; 4 = extremely.

Question  Description
Q1        I can feel the emotional changes of the virtual characters as the game progresses.
Q2        I experienced emotional changes in my mind during the game.
Q3        I feel engaged in the game.

Results. A paired-samples t-test shows that participants perceived significantly more emotional changes in the emotional version (M = 3.65, SD = 0.875) than in the benchmark version (M = 2.20, SD = 0.696; t(19) = −6.866, p < 0.01), indicating that
the proposed framework enhanced users' perception of the virtual characters' emotional changes as the game progresses. Participants also reported more inner emotional experiences in the emotional version (M = 3.00, SD = 1.124) than in the benchmark version (M = 1.85, SD = 0.813; t(19) = −5.510, p < 0.01), indicating that our framework substantially increased the users' emotional experience. Moreover, a paired-samples t-test on Q3 compared overall game engagement: participants were more engaged in the emotional version (M = 2.85, SD = 1.182) than in the benchmark version (M = 2.00, SD = 1.076; t(19) = −4.344, p < 0.01), indicating that our framework also increases users' engagement with the VR volleyball game.
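For reference, each comparison above is a standard paired-samples t-test; the sketch below uses SciPy with placeholder ratings, not the data collected in the study:

# Paired-samples t-test for one question (SciPy). The rating arrays are
# placeholders on the 0-4 scale, NOT the actual study data.
import numpy as np
from scipy import stats

benchmark = np.array([2, 1, 3, 2, 2, 1, 3, 2, 2, 2])  # placeholder ratings
emotional = np.array([4, 3, 4, 3, 4, 3, 4, 3, 3, 4])  # placeholder ratings

t_stat, p_value = stats.ttest_rel(benchmark, emotional)
print(f"t({len(benchmark) - 1}) = {t_stat:.3f}, p = {p_value:.4f}")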
4 CONCLUSION
In this paper, we presented a data-driven framework to improve users' emotional experience in VR volleyball games. The framework builds emotional virtual characters that are capable of arousing emotions according to the game state and expressing them appropriately through facial expressions. Evaluation results show that our framework (1) helps users perceive the emotion variations in the volleyball game and (2) increases users' emotional experience and engagement with the game.
ACKNOWLEDGMENTS
This work was supported by the National Key R&D Program (2020YFC2004100), the National Natural Science Foundation of China (NSFC: 61661146002), and the National Research Foundation, Singapore, under its NRF-NSFC Joint Research Grant Call (Data Science) (NRF2016NRF-NSFC001-071).
REFERENCES
[1] P. R. Crocker, K. C. Kowalski, S. D. Hoar, and M. H. McDonough. Emotion in sport across adulthood. Developmental Sport and Exercise Psychology: A Lifespan Perspective, pp. 333–356, 2004.
[2] W. Hai, N. Jain, A. Wydra, N. M. Thalmann, and D. Thalmann. Increasing the feeling of social presence by incorporating realistic interactions in multi-party VR. In Proceedings of the 31st International Conference on Computer Animation and Social Agents, pp. 7–10, 2018.
[3] N. Jain, A. Wydra, W. Hai, N. M. Thalmann, and D. Thalmann. Time-scaled interactive object-driven multi-party VR. The Visual Computer, 34(6-8):887–897, 2018.
[4] B. Ravenet, F. Pecune, M. Chollet, and C. Pelachaud. Emotion and attitude modeling for non-player characters. In Emotion in Games, pp. 139–154. Springer, 2016.