Toward Measuring Empathy in Virtual Reality
Kate Carey
Carnegie Mellon University
Pittsburgh, PA 15213, USA
katharic@andrew.cmu.edu
Mark Micheli
Carnegie Mellon University
Pittsburgh, PA 15213, USA
mmichel1@andrew.cmu.edu
Emily Saltz
Carnegie Mellon University
Pittsburgh, PA 15213, USA
esaltz@andrew.cmu.edu
Judeth Oden Choi
Carnegie Mellon University
Pittsburgh, PA 15213, USA
jochoi@andrew.cmu.edu
Jacob Rosenbloom
Carnegie Mellon University
Pittsburgh, PA 15213, USA
jlrosenb@andrew.hmc.edu
Jessica Hammer
Carnegie Mellon University
Pittsburgh, PA 15213, USA
hammerj@cs.cmu.edu
Permission to make digital or hard copies of part or all of this work for personal or
classroom use is granted without fee provided that copies are not made or distributed
for profit or commercial advantage and that copies bear this notice and the full citation
on the first page. Copyrights for third-party components of this work must be honored.
For all other uses, contact the Owner/Author.
CHI PLAY’17 Extended Abstracts, October 15–18, 2017,
Amsterdam, The Netherlands.
©2017 Copyright is held by the owner/author(s).
ACM ISBN 978-1-4503-5111-9/17/10.
https://doi.org/10.1145/3130859.3131325
Abstract
While VR is often described as an empathy-inducing medium, it is difficult to link specific features of a VR experience to empathy outcomes. This is due in large part to the challenges of measuring empathy constructs, and the difficulty of capturing the subjective emotional experience of a player in VR. In response, the authors propose a lightweight, inexpensive framework that researchers can use to easily establish a VR testing environment and begin gathering data on the subjective empathic response of users in VR. The authors also propose future research needed to expand, modify, and validate the proposed user-testing protocol.
Author Keywords
Virtual Reality, research methods, empathy
ACM Classification Keywords
H.5.1 [Information interfaces and presentation]: Multimedia Information Systems; H.5.3 [Information interfaces and presentation]: Group and Organization Interfaces; K.8.0 [Personal Computing]: Games
Introduction
With the advent of consumer Virtual Reality (VR) products, the idea that VR is the ultimate "empathy machine" is gaining popularity [25]. However, these claims are more reflective of the hopes of VR content creators and their excitement about the medium than they are of the current state of scholarship about empathy and VR.
A critical step for developing empathic VR is to be able to study empathy in VR contexts. However, existing methodologies for measuring empathy fall short when applied to VR. For example, empathy self-report methods can be unreliable or burdensome when compounded with the cognitive overload of a VR experience, while adding biometric monitoring can be expensive [16, 33].
We set out to create a lightweight approach to measuring empathy in VR experiences, building on existing research on empathy measures in VR [1, 2, 3, 13, 20, 23, 27, 28]. Our work serves as a jumping-off point for researchers interested in understanding the affective power of VR experiences, and for those interested in a user testing method that can be deployed in the field quickly and inexpensively.
The Challenges of Studying Empathy in VR
Commonly, empathy is understood as the ability to understand another's emotions. In the psychological sciences, however, there is a distinction between two primary types of empathy: cognitive empathy and emotional empathy. Cognitive empathy is the ability to consciously understand how and why someone feels what they feel (often called "perspective taking") [6]. It is useful when discerning another person's intent, strategy, or thought process [15]. Emotional empathy, on the other hand, is an embodied reaction, reflecting one's ability to respond to another's emotions [6]. It is useful when building a team, understanding the strengths of others, and fortifying interpersonal relationships [15].
Despite these differences, both types of empathy are skills. People have different baseline aptitudes and abilities with each, and empathy can be trained and strengthened with practice [6, 19]. Independent of skill level, empathy can be stimulated by intense emotion, with larger affective intensity leading to higher empathy [12]. There is evidence that carrying out simulated physical acts, more so than imagining them, has the ability to influence emotional states and induce empathy [1, 18, 34]. Empathy can also be attenuated based on the observer's motivation to empathize, the emotion the observed person is feeling, and the familiarity the observer has with the emotion with which they are to empathize. For example, pleasant, familiar emotions induce more empathy than unpleasant, alien ones [11].
VR allows users to take on new perspectives while providing the immersion necessary for embodied thinking, a key component of empathy [6, 22, 30]. Player immersion has been linked to empathy games generally, but is especially relevant to studying the effects of VR on empathy [8, 17]. However, observing, testing for, and measuring empathy as induced by a VR experience is challenging. VR is experienced in the first person, which makes it difficult for observers to capture, especially when the player may not be consciously aware of what they are experiencing. Additionally, VR experiences are influenced by a range of factors unique to the medium, including immersion, embodiment, and presence, all of which are notoriously difficult to capture in their own right [22, 24, 8, 20].
Researchers have measured the empathy effects of experiences in mixed and virtual reality through in-depth interviews on the player's ability to elaborate on a character's inner life [20], and through surveys measuring perspective taking [1]. However, we have yet to see a unified, VR-specific method for capturing data on user emotion and perspective taking [7, 10].
Existing Measurement Methodologies
Researchers studying cognitive and emotional empathy have traditionally collected data through self-report and observation [14]. Researchers have tried to capture user data about VR experiences through retrospective self-report, momentary self-report, and physiological tracking. However, existing methods for measuring empathy and for capturing user data about VR experiences do not neatly align.
Empathy Measures
Figure 1: VR experiences tested, matched to iterations on the protocol.
The most common tools for measuring empathy are self-assessment surveys [14]. A small number of validated measures capture both emotional and cognitive empathy factors, such as the Empathy Quotient (EQ) [21]. Other measures focus exclusively on cognitive or exclusively on emotional empathy, and often capture mediating concepts. For example, one proxy for cognitive empathy is whether the user's sense of self is merged with or distinct from an empathy target, which can be captured by the Inclusion of Other in the Self (IOS) scale [4, 14]. While these methods provide standardized data for researchers, because they rely on self-report, it can be difficult to judge whether the reported responses hold up in real-life scenarios [14].
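Both instruments reduce to simple numeric scores: the published EQ sums 40 empathy items, each scored 0, 1, or 2, for a maximum of 80 [5, 21], and the IOS is a single pictorial item rated 1 to 7 [4]. The following is a minimal scoring sketch in Python, our own illustration rather than part of any published protocol; the item sets shown are placeholders, and the real item key must come from the published instruments.

# Hedged sketch: tallying the two self-report measures named above.
# The item sets below are illustrative placeholders only; which EQ items
# are empathy items, and their polarity, must be taken from the published
# scoring key (Baron-Cohen & Wheelwright 2004).
POSITIVE_ITEMS = {1, 6, 19, 22, 25}    # placeholder positively keyed EQ items
REVERSE_ITEMS = {4, 8, 10, 11, 12}     # placeholder reverse keyed EQ items

def eq_item_score(item: int, response: str) -> int:
    """Score one EQ item 0/1/2 from a four-point agreement response."""
    if item in POSITIVE_ITEMS:
        return {"strongly agree": 2, "slightly agree": 1}.get(response, 0)
    if item in REVERSE_ITEMS:
        return {"strongly disagree": 2, "slightly disagree": 1}.get(response, 0)
    return 0                           # filler items do not contribute

def eq_total(responses: dict[int, str]) -> int:
    """Sum of item scores; the published EQ ranges from 0 to 80."""
    return sum(eq_item_score(item, resp) for item, resp in responses.items())

def ios_score(choice: int) -> int:
    """IOS is one pictorial item: 1 (fully separate) to 7 (fully overlapping)."""
    if not 1 <= choice <= 7:
        raise ValueError("IOS responses range from 1 to 7")
    return choice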
In contrast, observational methods for measuring empathy (for example, ratings and assessment of interactions with therapists [14], or with research confederates [7]) are rooted in real-life scenarios. However, observational methods lack the standardization of surveys, even when comparing the same person across testing sessions [7]. This challenge suggests an opportunity for VR to contribute to empathy research, as VR allows researchers to tightly control realistic virtual environments.
Collecting First-Person Experience Data in VR
Existing methods for capturing subjective user data about VR experiences include retrospective self-report, momentary self-report, and physiological tracking. With retrospective self-report, users answer surveys or participate in interviews after the end of a VR experience. Existing self-report measures mostly focus on core VR concepts, such as presence and immersion [16, 17], rather than on the range and intensity of players' emotions. Because self-report measures are administered after the experience, they are vulnerable to memory decay, bias, and distortion [33]. Momentary self-report techniques such as the think-aloud method [26], real-time sliders [16], and counters [30], meanwhile, can distract from the VR experience, prevent users from engaging with the VR console controls, and be difficult to employ because users may not be consciously aware of the elements they are supposed to be reporting [16]. While physiological tracking techniques that measure arousal through skin conductance, heart rate, and reflex response times promise a less biased way to measure user response compared to self-report, results can be hard to normalize and interpret across users with different baselines [16]. Beyond this, the hardware and setup time necessary to implement physiological tracking is prohibitive for many researchers.
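One common way to handle the baseline problem noted above, for researchers who do use physiological tracking, is to standardize each user's signal against their own resting data. The sketch below is a generic illustration of that idea in Python, not part of our protocol.

# Hedged sketch: making arousal signals (e.g., skin conductance) comparable
# across players by expressing each in-experience sample as a z-score
# relative to that player's own resting baseline.
from statistics import mean, stdev

def normalize_to_baseline(baseline: list[float], session: list[float]) -> list[float]:
    """Return session samples as z-scores relative to the player's baseline."""
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        raise ValueError("baseline has no variance; record a longer resting period")
    return [(x - mu) / sigma for x in session]

# Example: two players with very different raw signal levels can now be
# compared on a common scale.
player_a = normalize_to_baseline(baseline=[2.1, 2.0, 2.2], session=[2.5, 3.0, 2.4])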
Researchers studying empathy have struggled to find methods that are standardized, like self-report surveys; that have ecological validity, like observational studies; and that can be implemented in VR. We believe there is an opportunity to develop new methods for capturing a user's subjective emotional experience in a time-linked way, so that specific game experiences can be connected to emotions. Such a method should allow unobtrusive observation of a first-person experience. Because many users are new to VR, the method should account for player familiarity and comfort with the technology. Given the high cost of external tracking hardware, the method should be lightweight so as to be accessible to more researchers. Lastly, the method should address cognitive and emotional empathy as distinct factors, allowing for integration of existing measures relevant to empathy, such as the EQ [5] and the IOS scale [4].
Methods
To explore methods for capturing empathic responses to VR content, our team playtested multiple VR experiences and iteratively developed protocols for data collection [9]. We generated a list of 34 VR experiences that made claims about inducing empathy by pulling from academic journals and popular articles on VR experiences, and by searching the VR game stores SteamVR and Rift Experiences. However, many VR experiences described in academic journals and blogs are not publicly available through VR game stores. From the publicly available experiences, we selected eight, aiming for wide variation in subject matter and VR experience style, including 360-degree documentary video, 3D scenario reconstruction, and interactive fictional narrative. A complete list of the experiences used in testing can be found in Figure 1.
In each of eight playtest sessions, a member of the research team who had never played the VR experience being tested was assigned to be the "player." Additional researchers, or "observers," managed the video and audio data collection, took notes of their observations, and administered the interview portions of the protocol. Rotating the player role among members of the research team allowed us to consider both the subjective, experiential aspects of empathy-related data collection in VR, and the usefulness of the data collected to researchers. At the end of each session, the team collaboratively debriefed about their observations and experiences, reviewed the data collected, identified gaps in data collection, and made proposals to iterate the protocol. To develop the insights reported in this paper, the team reviewed all data collected from the project, including the multiple versions of the data collection protocol, the field notes generated from each session, and the notes from each debriefing session. All team members reached agreement on the concepts presented.
Insights for Protocol Development
Observing a First Person Experience
A player's subjective experience in VR is shaped by their interactions within the virtual experience, including where they choose to look. While in the VR experience, players often have physical or verbal reactions to the content of which they are unaware. For this reason, the player's body language can provide an important complement to self-reported subjective data. Our iterative protocol development revealed that it was valuable for researchers to be able to observe the player's physical reactions while simultaneously seeing what they were experiencing in VR, so that they could take effective field notes. Additionally, recording the VR headset feed provided a useful prompt for player interviews as well as an artifact for later analysis.
VR platforms, like Oculus Rift and SteamVR, automatically display a video feed of the player's view on a connected screen, and can simultaneously display a webcam capture of the player. However, VR platforms do not natively support piping the audio of the VR experience into the room. We used Voicemeeter to ensure that the VR audio went through both the headset and the computer's speakers [32]. Video capture software such as OBS Studio can then record the audio, the screen view, and the webcam [31].
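OBS Studio and Voicemeeter are configured through their own interfaces. Researchers who prefer a scripted capture could achieve a similar result with command-line tools; the sketch below shows one possible alternative using ffmpeg, not the setup we used, and the device names and capture flags are platform-dependent placeholders (shown for Windows).

# Hedged sketch: record the mirrored headset view (the desktop) and a webcam
# as two separate, timestamped files for later side-by-side review.
# Device names below are placeholders, not real hardware identifiers.
import subprocess
import time

session_id = time.strftime("%Y%m%d-%H%M%S")

screen = subprocess.Popen([
    "ffmpeg", "-y",
    "-f", "gdigrab", "-framerate", "30", "-i", "desktop",   # mirrored VR view
    f"headset-{session_id}.mkv",
], stdin=subprocess.PIPE)

webcam = subprocess.Popen([
    "ffmpeg", "-y",
    "-f", "dshow", "-i", "video=USB Camera:audio=Microphone",  # placeholder devices
    f"player-{session_id}.mkv",
], stdin=subprocess.PIPE)

input("Recording. Press Enter when the playthrough ends...")
for proc in (screen, webcam):
    proc.communicate(input=b"q")   # ask ffmpeg to finish writing the file cleanly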
It is critical for the researcher to be able to observe both the screen and the player without interfering with data capture. One layout that accomplishes this goal can be found in Figure 2, where the observer sits at the edge of the space, between the player and the screen displaying the VR headset's video feed. From this position the observer can note the player's posture, facial expressions, verbalizations, and other unexpected player behavior, as well as identify what is happening in the VR experience at the time.
Figure 2: The layout of the room
during playtesting. The player
stands in the center of a cleared
area with the VR headset. The
observers sit just outside the area
with a view of both the player and
the video stream from the headset,
taking notes and controlling the
recording equipment. A single
camera captures the player from
straight on.
Calibrating Player Experience Measures
Through iterative testing, it became apparent that there are a number of ways relevant empathy measures can be skewed during a VR experience. The player's familiarity with VR affected their emotional experience due to responses like frustration when learning how to use the controls. The highly immersive nature of VR can lead to information overload, making it difficult for the player to process information during the experience and for the observer to ask detailed questions of the player after the experience. We recommend assessing the player's experience level with VR, allowing them time to process after each playthrough, and ensuring their comfort with the material.
Interviewing players about their experience level with VR before they interact with the experience accomplishes two goals. First, it helps the observer establish a baseline for their observations about the player's comfort level or frustrations during play. Second, it helps the player reflect on their own abilities and take frustration with the controls into account when reporting data.
Our original protocol also included a structured post-play interview. However, in our iterative testing, we found that players repeatedly diverged from the interview script to touch on the most immediately memorable points in the experience. Players were using the post-game interview to process the highly immersive experience, which made it difficult to collect other types of data. In response, we inserted open-ended questions at the start of the interview to identify key moments to delve into in more detail later in the interview. This decompression time also exposed the players' feelings of embodiment as they adjusted back to the physical world; they often commented on their awareness of their hands or on their change in height from VR to the physical world.
Through iterative playtests, we saw that players had trouble taking in both the narrative and the mechanics of the game in a single playthrough. Empathic responses are linked, through immersion, to engagement with a rich narrative world [8], and we found that players often reported higher empathic responses after having played an experience multiple times. We therefore recommend that researchers expose players to empathy-inducing VR experiences at least twice. The player spends the first playthrough familiarizing themselves with the experience and the controls; subsequent playthroughs allow the player to actively notice the details of the experience (sounds, interaction with objects, characters) and engage with the narrative world.
Subjective Data Capture
Measures such as emotional response or sense of self are difficult to communicate because of the wide range of possible interpretations across participants, but they are important to empathy measures. For this reason, we employed two tools as communication aids and to standardize responses across players. The first set of tools includes a wheel of emotions (Figure 4) to help players identify emotions, and video playback to situate an emotional response in a particular moment in the experience. The wheel provides a comprehensive list of emotions and standardizes the data collected [29]. When players identify emotions, the researcher prompts them to identify specific moments of play that they associate with the feeling.
Figure 3: The final version of our protocol included at least two playthroughs of the VR experience, and three interviews with supporting material.

The player and observer then watch the indicated emotional moments using the recorded video feed, giving the player and observer a shared language about the VR experience and allowing a targeted discussion of the emotional experience. As the researcher plays back the excerpt of the experience, the player is asked to recall their initial thoughts and emotions, much like a think-aloud. The researcher then asks follow-up questions to probe the player further and discover their responses to specific interactions, narrative events, and characters, including feelings toward the character the player inhabited, if applicable. The process of identifying emotions with the wheel and then situating them in moments of play also helps the researcher separate emotional responses related to learning the interface from feelings induced by the content of the experience.
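To keep wheel-of-emotions responses time-linked to the recording, the interviewer (or a second observer) can log each identified moment as structured data. The following is a minimal logging sketch in Python; the field names and CSV layout are our own illustration, not a fixed part of the protocol.

# Hedged sketch: log each emotion the player picks from the wheel against the
# timestamp of the recorded headset video where it occurred, so responses stay
# tied to specific moments of play.
import csv
from dataclasses import dataclass, asdict

@dataclass
class EmotionMoment:
    session_id: str
    playthrough: int        # 1 = first exposure, 2+ = repeat playthroughs
    video_time_s: float     # offset into the recorded headset feed
    emotion: str            # label chosen from the wheel of emotions
    source: str             # "interface" vs. "content", per the protocol's distinction
    player_comment: str

def append_moment(path: str, moment: EmotionMoment) -> None:
    """Append one annotated moment to a per-study CSV log."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(asdict(moment).keys()))
        if f.tell() == 0:
            writer.writeheader()
        writer.writerow(asdict(moment))

append_moment("emotion_moments.csv", EmotionMoment(
    session_id="p03", playthrough=2, video_time_s=84.5,
    emotion="apprehension", source="content",
    player_comment="Felt tense when the character approached",
))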
Figure 4: The Wheel of Emotions.
Figure 5: The VR-adapted Other in the Self scale.
While the wheel of emotions helps us understand emotional empathy, the spontaneous feeling of oneness with another's emotions, a second set of tools is needed to understand cognitive empathy, which is characterized by understanding another's perspective while also maintaining a distinct sense of self [6]. We employed a VR-adapted Other in the Self Scale [4] to help players articulate how they related to the character they inhabited and others that they encountered.
After revisiting the video recording in the interview after the second playthrough, the player fills out our VR-adapted Other in the Self Scale (Figure 5), in order to articulate their sense of separation from, or overlap with, the characters they inhabited or interacted with. The observer also asks a series of questions about the player's sense of identity and feelings towards other characters throughout the experience, such as the characteristics of who the player was in the experience, a description of what the other characters experienced, and how the player felt watching other characters' experiences. Here we are looking for correlations between the detail with which the player can recount what other characters felt and what they felt themselves.
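As a sketch of what that correlation check might look like once responses from several players have been coded, one could compare IOS ratings against a researcher-coded count of how many of a character's feelings each player recounted. This is our own illustration with made-up numbers, not data from the study.

# Hedged sketch (Python 3.10+ for statistics.correlation): relate the
# VR-adapted IOS rating (1-7, self-character overlap) to an assumed
# researcher-coded tally of distinct character feelings the player recalled.
from statistics import correlation

ios_ratings   = [2, 5, 6, 3, 7, 4]   # one VR-adapted IOS response per player
recall_detail = [1, 4, 5, 2, 6, 3]   # coded number of character feelings recalled

r = correlation(ios_ratings, recall_detail)   # Pearson's r
print(f"IOS vs. recall detail: r = {r:.2f} (n = {len(ios_ratings)})")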
Accessibility and Cost
By focusing our research on low-cost, accessible methods, our protocol has a low barrier for use by researchers of all backgrounds and means. Provided the researcher has access to a VR headset and compatible PC, the materials and methods employed in the protocol are low-cost (webcam, speakers) or free (Voicemeeter, OBS Studio, wheel of emotions, and Other in the Self scale).
Future Work
We have found that much of the rhetoric around the potential of VR and empathy is based on conjecture and the novelty of the medium, rather than on studies of specific empathy constructs in the context of a VR experience. We offer these insights and our sample protocol to support lightweight data collection that emphasizes the first-person emotional qualities of VR, and to work toward a future evaluative framework for understanding VR and empathy.
At this stage, our work implies, but does not validate, methods for collecting empathy-relevant data in VR experiences. A critical next step would be to validate our insights. In particular, we would like to test how well the wheel of emotions is able to capture a comprehensive inventory of emotions, and how closely the Other in the Self scale correlates with existing empathy measures like the Empathy Quotient when used in VR contexts. Additionally, future versions of this protocol will likely contain a method of capturing users' empathy baselines in order to better determine the empathy effect. By validating the protocol with subjects outside this research group, we will be able to further iterate on and refine our procedure. With validated empathy measurements, a baseline empathy capture, and more fine-tuned momentary tracking, we see the potential of this protocol to let experience designers and researchers from all fields measure player empathy resulting from specific features of VR experiences.
REFERENCES
1. Sun Joo Ahn, Amanda Minh Tran Le, and Jeremy
Bailenson. 2013. The effect of embodied experiences
on self-other merging, attitude, and helping behavior.
Media Psychology 16, 1 (2013), 7–38.
2. Sun Joo (Grace) Ahn, Jeremy N. Bailenson, and
Dooyeon Park. 2014. Short- and Long-term Effects of
Embodied Experiences in Immersive Virtual
Environments on Environmental Locus of Control and
Behavior. Comput. Hum. Behav. 39, C (Sept. 2014),
235–245. DOI:
http://dx.doi.org/10.1016/j.chb.2014.07.025
3. Sun Joo Grace Ahn, Joshua Bostick, Elise Ogle,
Kristine L Nowak, Kara T McGillicuddy, and Jeremy N
Bailenson. 2016. Experiencing nature: Embodying
animals in immersive virtual environments increases
inclusion of nature in self and involvement with nature.
Journal of Computer-Mediated Communication 21, 6
(2016), 399–419.
4. Arthur Aron, Elaine N Aron, and Danny Smollan. 1992.
Inclusion of Other in the Self Scale and the structure of
interpersonal closeness. Journal of personality and
social psychology 63, 4 (1992), 596.
5. Simon Baron-Cohen and Sally Wheelwright. 2004. The
empathy quotient: an investigation of adults with
Asperger syndrome or high functioning autism, and
normal sex differences. Journal of autism and
developmental disorders 34, 2 (April 2004), 163–175.
http://view.ncbi.nlm.nih.gov/pubmed/15162935
6. Liz Owens Boltz, Danah Henriksen, Punya Mishra,
Deep-Play Research Group, and others. 2015.
Rethinking Technology & Creativity in the 21st Century:
Empathy through Gaming-Perspective Taking in a
Complex World. TechTrends 59, 6 (2015), 3–8.
7. Dario Bombari, Marianne Schmid Mast, Elena
Canadas, and Manuel Bachmann. 2015. Studying
social interactions through immersive virtual
environment technology: virtues, pitfalls, and future
challenges. Frontiers in Psychology 6 (2015), 869.
DOI:http://dx.doi.org/10.3389/fpsyg.2015.00869
8. Kevin Brooks. 2003. There is nothing virtual about immersion: Narrative immersion for VR and other interfaces. alumni.media.mit.edu/~brooks/storybiz/immersiveNotVirtual.pdf (accessed May 2007) (2003).
9. Judeth Oden Choi, Jodi Forlizzi, Michael Christel,
Rachel Moeller, MacKenzie Bates, and Jessica
Hammer. 2016. Playtesting with a Purpose. In
Proceedings of the 2016 Annual Symposium on
Computer-Human Interaction in Play. ACM, 254–265.
10. Julia Diemer, Georg W Alpers, Henrik M Peperkorn,
Youssef Shiban, and Andreas Mühlberger. 2015. The
impact of perception and presence on emotional
reactions: a review of research in virtual reality.
Frontiers in psychology 6 (2015), 26.
11. Changming Duan. 2000. Being Empathic: The Role of
Motivation to Empathize and the Nature of Target
Emotions. Motivation and Emotion 24, 1 (2000), 29–49.
DOI:http://dx.doi.org/10.1023/A:1005587525609
12. Elisabeth Engelberg and Lennart Sjöberg. 2004. Emotional intelligence, affect intensity, and social adjustment. Personality and Individual Differences 37, 3 (2004), 533–542. DOI: http://dx.doi.org/10.1016/j.paid.2003.09.024
13. Hunter Gehlbach, Geoff Marietta, Aaron M. King, Cody Karutz, Jeremy N. Bailenson, and Chris Dede. 2015. Many ways to walk a mile in another's moccasins: Type of social perspective taking and its effect on negotiation outcomes. Computers in Human Behavior 52 (2015), 523–532. DOI: http://dx.doi.org/10.1016/j.chb.2014.12.035
14. Karen E. Gerdes, Elizabeth A. Segal, and Cynthia A. Lietz. 2010. Conceptualising and Measuring Empathy. The British Journal of Social Work 40, 7 (Oct 01 2010), 2326–2343. https://search-proquest-com.proxy.library.cmu.edu/docview/818793491?accountid=9902
15. Debra Gilin, William W. Maddux, Jordan Carpenter,
and Adam D. Galinsky. 2013. When to Use Your Head
and When to Use Your Heart. Personality and Social
Psychology Bulletin 39, 1 (2013), 3–16. DOI:
http://dx.doi.org/10.1177/0146167212465320
PMID: 23150199.
16. WA IJsselsteijn and H De Ridder. 1998. Measuring
temporal variations in presence. In Presence in Shared
Virtual Environments Workshop, University College,
London. 10–11.
17. Charlene Jennett, Anna L Cox, Paul Cairns, Samira
Dhoparee, Andrew Epps, Tim Tijs, and Alison Walton.
2008. Measuring and defining the experience of
immersion in games. International journal of
human-computer studies 66, 9 (2008), 641–661.
18. Karine Jospe, Agnes Flöel, and Michal Lavidor. 2017. The role of embodiment and individual empathy levels in gesture comprehension. Experimental Psychology 64, 1 (2017), 56–64. http://search.ebscohost.com.proxy.library.cmu.edu/login.aspx?direct=true&db=pdh&AN=2017-08398-007&site=ehost-live
19. Zeinab Khanjani, Elnaz Mosanezhad Jeddi, Issa
Hekmati, Saeede Khalilzade, Mahin Etemadi Nia,
Morteza Andalib, and Parvaneh Ashrafian. 2015.
Comparison of Cognitive Empathy, Emotional Empathy,
and Social Functioning in Different Age Groups.
Australian Psychologist 50, 1 (2015), 80–85. DOI:
http://dx.doi.org/10.1111/ap.12099
20. Martijn JL Kors, Gabriele Ferri, Erik D van der Spek, Cas Ketel, and Ben AM Schouten. 2016. A Breathtaking Journey: On the Design of an Empathy-Arousing Mixed-Reality Game. In Proceedings of the 2016 Annual Symposium on Computer-Human Interaction in Play. ACM, 91–104.
21. E. J. Lawrence, P. Shaw, D. Baker, S. Baron-Cohen, and A. S. David. 2004. Measuring empathy: Reliability and validity of the empathy quotient. Psychological Medicine 34, 5 (07 2004), 911–919. https://search-proquest-com.proxy.library.cmu.edu/docview/204514684?accountid=9902
22. Matthew R Longo, Friederike Schüür, Marjolein PM
Kammers, Manos Tsakiris, and Patrick Haggard. 2008.
What is embodiment? A psychometric approach.
Cognition 107, 3 (2008), 978–998.
23. Geoff Marietta, Julianne Viola, Nneka Ibekwe, Jessica Claremon, and Hunter Gehlbach. 2014. Improving relationships through virtual environments: How seeing the world through victims' eyes may prevent bullying. Working paper, Graduate School of Education, Harvard University (2014).
24. Alison McMahan. 2003. Immersion, engagement and
presence. The video game theory reader 67 (2003),
86.
25. Chris Milk. 2015. Chris Milk: How Virtual Reality Can
Create the Ultimate Empathy Machine. TED Talks.
26. Jakob Nielsen. 1994. Usability engineering. Elsevier.
27. Soo Youn Oh, Jeremy Bailenson, Erika Weisz, and
Jamil Zaki. 2016. Virtually old: Embodied perspective
taking and the reduction of ageism under threat.
Computers in Human Behavior 60 (2016), 398 – 410.
DOI:http://dx.doi.org/10.1016/j.chb.2016.02.007
28. David L Penn, James D Ivory, Abigail Judge, and
others. 2010. The virtual doppelganger: Effects of a
virtual reality simulator on perceptions of
schizophrenia. The Journal of nervous and mental
disease 198, 6 (2010), 437–443.
29. Robert Plutchik. 1980. A general psychoevolutionary
theory of emotion. Theories of emotion 1, 3-31 (1980),
4.
30. Martijn J Schuemie, Peter Van Der Straaten, Merel
Krijn, and Charles APG Van Der Mast. 2001. Research
on presence in virtual reality: A survey.
CyberPsychology & Behavior 4, 2 (2001), 183–201.
31. Open Broadcaster Software. 2017a. OBS Studio. (1
April 2017). https://obsproject.com/.
32. VB Audio Software. 2017b. Voicemeeter. (1 April 2017).
http://www.vb-audio.com/Voicemeeter/index.htm.
33. Arthur A Stone, Christine A Bachrach, Jared B Jobe,
Howard S Kurtzman, and Virginia S Cain. 1999. The
science of self-report: Implications for research and
practice. Psychology Press.
34. Philip J. Walsh. 2014. Empathy, Embodiment, and the
Unity of Expression. Topoi 33, 1 (01 Apr 2014),
215–226. DOI:
http://dx.doi.org/10.1007/s11245-013-9201-z