Investigation of Thermal Perception and Emotional Response in
Augmented Reality using Digital Biomarkers: A Pilot Study
Sangjun Eom*, Department of Electrical and Computer Engineering, Duke University
Seijung Kim, Department of Computer Science, Duke University
Yihang Jiang, Department of Biomedical Engineering, Duke University
Ryan Jay Chen§, Department of Electrical and Computer Engineering, Duke University
Ali R. Roghanizad, Department of Biomedical Engineering, Duke University
M. Zachary Rosenthal||, Department of Psychiatry and Behavioral Sciences and Department of Psychology and Neuroscience, Duke University
Jessilyn Dunn**, Department of Biomedical Engineering, Duke University
Maria Gorlatova††, Department of Electrical and Computer Engineering, Duke University
Figure 1: Hardware setup of the user wearing Magic Leap One AR headset and EEG sensor (a), the AccuTemp blood temperature
sensor on the left forearm (b), varying thermal perception with holograms to induce warm sensations (c) and cooling sensations (d).
ABSTRACT
Dialectical behavior therapy (DBT) is an evidence-based psychother-
apy that helps patients learn skills to regulate emotions as a central
strategy to improve life functioning. However, DBT skills require
a long-term and consistent commitment, typically to group therapy
over the course of months. Patients who might benefit may find this
approach undesirable; it can be challenging to transfer learning from
therapy sessions to daily life, and there is no way to personalize skills
learning based on individualized needs. In this paper we propose the
use of Augmented Reality (AR) and digital biomarkers to enhance
DBT skill exercises to be more immersive and personalized by using
physiological data as real-time feedback. To explore the feasibility
of AR-based DBT skill implementation, we developed AR-based
DBT skill exercises that manipulate the user’s thermal perception
by visualizing different thermal information in holograms. We con-
ducted a user study to evaluate the impact of AR in changing the
thermal perception and emotional states of the user with an analysis
of physiological data collected from wearable devices.
*E-mail: sangjun.eom@duke.edu
E-mail: seijung.kim@duke.edu
E-mail: yihang.jiang@duke.edu
§E-mail: ryan.j.chen@duke.edu
E-mail: ali.roghanizad@duke.edu
||E-mail: mark.rosenthal@duke.edu
**E-mail: jessilyn.dunn@duke.edu
††E-mail: maria.gorlatova@duke.edu
Index Terms: Human-centered computing—Human computer interaction (HCI)—Interaction paradigms—Mixed / augmented reality
1 INTRODUCTION
Dialectical behavior therapy (DBT) is a type of cognitive behav-
ioral therapy developed to treat complex behaviors associated with
emotion dysregulation [7]. DBT is focused on enabling patients to
acquire new skills to improve life functioning by learning to regulate
emotions effectively. In its conventional form, DBT is a 1-year
treatment during which the patients acquire and generalize four sets
of skills: distress tolerance, mindfulness, emotion regulation, and
interpersonal effectiveness [11]. However, the long-term and con-
sistent commitment required in DBT can often be challenging for
therapists and patients. Some patients cannot regularly complete the
given assignments by transferring their learning from the therapy
sessions to their daily life. Furthermore, DBT is based on the indi-
vidual patient’s needs. Therefore, the DBT skill exercises need to be
personalized for each patient based on their learning progress and
changes in emotional states.
Augmented Reality (AR) and digital biomarkers have the poten-
tial to address those challenges by enhancing DBT skill exercises
to be more effective and personalized for patients. Digital biomark-
ers are aggregate metrics from wearable devices that can collect
various types of physiological data from patients. These metrics
can be calculated in real-time, and such real-time feedback can be
used for evaluating and monitoring the changes in users’ emotional
states to allow the AR system to personalize skills based on their
needs. Hence, we propose the first use of AR and digital biomarkers
for DBT through a pilot study investigating the manipulation and
monitoring of thermal perception and emotional arousal.
We developed an AR app that adapts the emotion regulation skills, a set of
DBT skills that patients can practice to help them regulate their
emotional states. Our AR app
manipulates the thermal perception of the user by showing different
levels of thermal information in holograms. We conduct a user
study to investigate 1) the impact of varying thermal information
in AR on user’s thermal perception and emotional response, and
2) the changes in digital biomarkers using various physiological
data including heart rate variability (HRV), electroencephalogram
(EEG), electrocardiogram (ECG), electrodermal activity (EDA), and
forearm core blood temperature.
2 RELATED WORK
AR and Virtual Reality have the potential to make mental health training
more immersive and interactive for patients. Prior studies have shown that
exposure to an augmented environment (e.g., displaying animal holograms for
patients with animal phobia [16] or providing guided meditation to patients
for emotion regulation [14]) helps patients confront certain types of
phobias and reduce negative emotions such as anxiety and depression.
Similarly, exposure to a virtual environment (e.g., reconstruction of war
experiences for post-traumatic stress disorder patients [12], or a calming
nature environment for DBT mindfulness training [6]) helps patients reduce
their level of stress or increase positive emotions.
Additionally, digital biomarkers have the potential to evaluate patients'
emotional responses using quantifiable physiological data. Prior studies
show that physiological data such as HRV, EEG, and EDA can be related to
users' emotional responses (e.g., stress or relaxation levels) [9].
Analysis of gamma, beta, and alpha brain-wave activity in EEG data showed a
reduction in negative moods after AR-based meditation [14]. Similarly,
decreases in both HRV and the skin conductance response derived from EDA
data correlated with decreases in self-reported stress levels (e.g., an
increase in relaxation was observed after AR-based meditation exercises
[9]). Digital biomarkers can provide crucial real-time feedback about
users' emotional responses, which can make DBT skill exercises more
personalized for individuals.
The use of AR in manipulating users’ thermal perception by
displaying thermal information in holograms has been demonstrated
by prior studies. Different types of thermal stimuli (e.g., virtual
flames and icy fogs [5], flames in red and blue colors [4]) have been
used to evoke warming or cooling sensations. Though prior studies have
demonstrated that thermal stimuli are effective in manipulating the thermal
perception of the user, integrating AR-based thermal stimuli into clinical
applications (e.g., helping patients control and reduce pain) still remains
future work.
3 PILOT STUDY DESIGN
We created an AR app using a Magic Leap One AR headset and
Unity 2020.3.14f1. The app consists of two main stimuli: thermal
stimuli for visualizing different thermal information and emotional
stimuli for visualizing pictures that represent different emotional
states as holograms. For collecting physiological data, we used four
wearable devices: 1) the Bittium Faros 180, an FDA-approved 3-lead
wearable ECG device, 2) an Empatica E4 wristband for collecting
HRV and EDA, 3) the OpenBCI Cyton, a 16-channel EEG device
with around-the-ear electrodes, and 4) a ThermaSENSE AccuTemp
wearable blood temperature sensor. In addition to the physiological
data, we collected eye gaze data including gaze fixation and pupil
diameter using the Magic Leap One AR headset.
3.1 AR Application
3.1.1 Thermal Stimuli
To manipulate the thermal perception of the user, we show holograms that
convey different thermal information, overlaid on an object placed on the
user's left hand. We used a metal cube with a printed image marker attached
to the top surface of the cube; a hologram is overlaid on the cube through
Vuforia marker detection. Two different levels of thermal information were
displayed in holograms. The hologram for the warm temperature was displayed
as a burning coal texture with an animation of fire particles emanating
from the cube, as shown in Fig. 1c. The hologram for the cold temperature
was displayed as an ice texture and an animation of snowflakes emanating
from the cube, as shown in Fig. 1d.

Table 1: Classification of emotion labels associated with valence and
arousal ratings [13].

Quadrant Valence Arousal Emotion Labels
LVHA Low High Angry, Distressed, Tense
LVLA Low Low Sad, Depressed, Tired
HVHA High High Excited, Happy, Aroused
HVLA High Low Relaxed, Satisfied, Calm

Figure 2: Selection of pictures in emotion classification model (a) and
display of these pictures in holograms using Magic Leap One headset for
emotional stimuli (b).
3.1.2 Emotional Stimuli
To induce the emotional states of the user, we display pictures that
convey different levels of arousal and valence as holograms. Valence
is a level of pleasantness that a stimulus generates, ranging from
unhappy (i.e., low rating) to happy (i.e., high rating) [15]. Arousal
is a level of autonomic activation that a stimulus generates, ranging
from calm (i.e., low rating) to excited (i.e., high rating) [15]. Based
on the ratings of valence and arousal, emotions can be categorized
into four quadrants in the emotion classification model (shown in
Fig. 2a) [13]. Table 1 shows the list of emotional labels associated
with each quadrant.
We use pictures from the OASIS database [10], which provides normative
arousal and valence ratings. Using this database, we create two sets of
images, shown in Fig. 2b, as emotional stimuli to induce emotional changes
after the thermal perception experiment. The first set comprises two
images: one low valence and high arousal image (i.e., a burning fire) and
one high valence and low arousal image (i.e., a frozen view with ice
particles). These two images are in opposite quadrants of the emotion
classification model (shown in Fig. 2a). In contrast, the second set
comprises two images with similar levels of valence and arousal ratings,
but the images convey different thermal perceptions to the user (i.e., a
summer beach vs. a winter lake). These two images are in the same quadrant
of the emotion classification model. We display each image set as
holograms, allowing participants to observe and record their emotional
states with valence and arousal ratings on the questionnaire.
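To make the quadrant assignment concrete, the sketch below maps a (valence, arousal) rating pair to the labels of Table 1; it assumes OASIS's 1-7 rating scale and an illustrative midpoint threshold of 4.0, not necessarily the cutoff used in our stimulus selection.

```python
def quadrant(valence: float, arousal: float, midpoint: float = 4.0) -> str:
    """Map a (valence, arousal) rating pair to a quadrant of the
    circumplex model (Table 1). Ratings are assumed to be on the
    1-7 OASIS scale; the midpoint threshold is illustrative."""
    v = "HV" if valence >= midpoint else "LV"
    a = "HA" if arousal >= midpoint else "LA"
    return v + a

# Example: a burning-fire image with low valence and high arousal.
print(quadrant(valence=2.1, arousal=5.6))  # -> "LVHA"
```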
3.2 Digital Biomarkers
The four wearable devices used for collecting various types of phys-
iological data in this user study are shown in Fig. 3.
Figure 3: Overall experimental setup including four wearable devices
and Magic Leap One AR headset.
3.2.1 EDA Signal Processing
During the experiments, the users wore the Empatica E4 wristband on their
right forearms (shown in Fig. 3), with the forearm kept at the height of
the user's heart. We used the timestamps to select the signals for each
experimental step for data processing. From the collected EDA data, we can
quantify features that are associated with the level of arousal, such as
the number of skin conductance response peaks, their amplitude, rise time,
and recovery time [2].
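As a rough illustration of this feature extraction, the sketch below detects skin conductance response (SCR) peaks in one trial segment of E4 EDA data (sampled at 4 Hz) and derives peak count, amplitude, and rise time with SciPy; the smoothing window and peak thresholds are illustrative assumptions, not the exact parameters of our pipeline.

```python
import numpy as np
from scipy.signal import find_peaks

FS = 4  # Empatica E4 EDA sampling rate (Hz)

def eda_features(eda: np.ndarray, fs: int = FS) -> dict:
    """Extract simple arousal-related EDA features from one trial segment.
    Peak-detection parameters are illustrative assumptions."""
    # Smooth with a short moving average to suppress sensor noise.
    kernel = np.ones(fs) / fs
    smoothed = np.convolve(eda, kernel, mode="same")
    # SCR peaks: require a minimal prominence (microsiemens) and spacing.
    peaks, props = find_peaks(smoothed, prominence=0.05, distance=fs)
    amplitudes = props["prominences"]
    # Rise time: seconds from the left base of each peak to the peak itself.
    rise_times = (peaks - props["left_bases"]) / fs
    return {
        "num_scr_peaks": len(peaks),
        "mean_amplitude": float(np.mean(amplitudes)) if len(peaks) else 0.0,
        "mean_rise_time": float(np.mean(rise_times)) if len(peaks) else 0.0,
    }
```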
3.2.2 ECG Signal Processing
The preprocessing of the ECG data includes signal filtering, peak
detection, and metric calculation. From the processed data we quantify
heart rate and HRV by calculating the peak-to-peak (R-R) intervals of the
signal. The HRV measurements can indicate the level of relaxation (e.g.,
larger fluctuations in HRV indicate a more relaxed state of the user [1]).
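A minimal sketch of such a pipeline is shown below, assuming a single-lead ECG segment sampled at 250 Hz (a common Faros 180 configuration): band-pass filtering, R-peak detection, and RMSSD as one standard time-domain HRV metric; the filter cutoffs and peak-detection thresholds are illustrative assumptions rather than our exact processing parameters.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def hrv_rmssd(ecg: np.ndarray, fs: int = 250) -> float:
    """Estimate RMSSD (ms) from one raw single-lead ECG segment.
    Filter and detection parameters are illustrative assumptions."""
    # Band-pass filter to emphasize the QRS complex.
    b, a = butter(3, [5 / (fs / 2), 35 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ecg)
    # Detect R peaks: simple height threshold, 0.4 s refractory period.
    height = 0.6 * np.max(filtered)
    peaks, _ = find_peaks(filtered, height=height, distance=int(0.4 * fs))
    # Peak-to-peak (R-R) intervals in milliseconds.
    rr_ms = np.diff(peaks) / fs * 1000.0
    # RMSSD: root mean square of successive R-R differences.
    return float(np.sqrt(np.mean(np.diff(rr_ms) ** 2)))
```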
3.2.3 EEG Signal Processing
The EEG data collected during the experiments are used to quantify alpha,
beta, and gamma band activity. These activities are related to the level of
stress [8]: the alpha/beta power ratio is negatively correlated with the
level of stress. The open-source software from OpenBCI provides the
analysis of the recorded data from the 16-channel EEG device.
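For illustration, the sketch below estimates alpha, beta, and gamma band power for one EEG channel with Welch's method and forms the alpha/beta ratio; the 125 Hz sampling rate (typical for a 16-channel Cyton-with-Daisy setup) and the band edges are assumptions.

```python
import numpy as np
from scipy.signal import welch

BANDS = {"alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}  # Hz, assumed

def band_powers(eeg: np.ndarray, fs: int = 125) -> dict:
    """Absolute band power for one EEG channel via Welch's method.
    Sampling rate and band edges are illustrative assumptions."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = float(np.trapz(psd[mask], freqs[mask]))
    # Alpha/beta ratio, reported to be negatively correlated with stress [8].
    powers["alpha_beta_ratio"] = powers["alpha"] / powers["beta"]
    return powers
```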
3.2.4 Core Blood Temperature Sensing
The AccuTemp blood temperature sensor, created by ThermaSENSE and worn on
the user's forearm (shown in Fig. 3), can directly measure the internal
blood temperature response. By using this unique non-invasive sensor
instead of a skin temperature sensor, we can more directly quantify the
forearm's core blood temperature, which provides information about the
user's thermal perception. We hypothesize that the thermal perception of
the user induces vasomotor changes in the left forearm that cause localized
changes in blood flow and blood temperature; changes in the internal blood
temperature of the forearm can therefore reflect the user's thermal
perception.
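As a simple illustration of how this signal can be summarized per trial, the sketch below computes the change in forearm blood temperature over a trial segment relative to its opening baseline window; the 1 Hz sampling rate and 30-second baseline are assumptions, not the AccuTemp specification.

```python
import numpy as np

def trial_temp_change(temp: np.ndarray, fs: float = 1.0,
                      baseline_s: float = 30.0) -> float:
    """Change in forearm core blood temperature (deg C) over one trial
    segment, relative to the first `baseline_s` seconds of the segment.
    The sampling rate and baseline window are illustrative assumptions."""
    n_base = int(baseline_s * fs)
    baseline = float(np.mean(temp[:n_base]))
    return float(np.mean(temp[-n_base:]) - baseline)
```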
3.3 Experimental Steps
Prior to the trials, the participant put on all wearable devices. The EDA
sensor was worn on the right forearm, and the AccuTemp sensor was worn on
the left forearm. The two electrodes of the ECG sensor were attached to the
right clavicle near the right shoulder and below the lower edge of the
pectoral muscle on the left rib cage (shown in Fig. 3). The microcontroller
of the EEG sensor was attached to the back of the participant's neck, and
two EEG electrodes were attached around the participant's ears. We then
recorded a 15-minute calibration period to establish the baseline for data
collection. Finally, the participant put on the Magic Leap One.
Figure 4: Changes in the core (i.e., internal) blood temperature of one
participant's left forearm.
The participant performed four trials during the study. Each trial
consisted of 5 minutes of interacting with the cube under a different
perceived thermal condition, followed by 5 minutes each of observing the
image sets to stimulate emotions. We created a different thermal perception
of the environment by varying the temperature level of the cube and the
thermal information of the hologram for each trial. Two trials matched the
temperature of the cube to the thermal information of the hologram, and the
other two trials mismatched them. The four trials were 1) cold cube and ice
hologram, 2) warm cube and burning coal hologram, 3) cube at room
temperature and ice hologram, and 4) cube at room temperature and burning
coal hologram. We used the same two image sets as emotional stimuli in all
trials. We recorded the timestamps after each trial for the analysis of the
digital biomarker data.
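For analysis bookkeeping, the 2x2 trial design and the timestamp-based segmentation of each sensor stream can be expressed as in the sketch below; the condition labels and the helper function are hypothetical, for illustration only.

```python
import numpy as np

# The four trial conditions: cube temperature vs. hologram thermal cue.
TRIALS = {
    1: {"cube": "cold", "hologram": "ice"},           # matched
    2: {"cube": "warm", "hologram": "burning coal"},  # matched
    3: {"cube": "room", "hologram": "ice"},           # mismatched
    4: {"cube": "room", "hologram": "burning coal"},  # mismatched
}

def segment(signal: np.ndarray, fs: float,
            start_s: float, end_s: float) -> np.ndarray:
    """Slice one sensor stream to a trial window given the recorded
    start/end timestamps (seconds from the start of the recording)."""
    return signal[int(start_s * fs):int(end_s * fs)]
```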
3.4 Survey Questionnaire
In the pre-experiment survey, participants were asked to record
self-reported emotional states using the self-assessment manikin
(SAM), a clinically validated survey designed for the evaluation of
emotions [3]. In the post-experiment survey, participants were asked
to record self-reported emotional states and invoked emotions from
the image sets using SAM. These self-reported emotional states
can be compared to the analysis of digital biomarker data that can
quantify the changes in users’ emotional states.
4 PRELIMINARY RESULTS
We recruited 6 participants to perform all four trials of the AR-based DBT
skill exercises while wearing the wearable devices. One participant used an
AR headset frequently (i.e., more than once a week); the other five
participants had never used one before. This study was approved by the Duke
University IRB. In this section, we present the analysis of changes in
users' thermal perception and emotional states by processing the data from
the AccuTemp blood temperature sensor and the survey responses.
4.1 Changes in Forearm Core Blood Temperature
We analyzed the changes in the core blood temperature of the user’s
left forearm during the trials. Fig. 4 shows the change in the core
blood temperature from one of the participants in the study over
time. We observed that the core blood temperature of the user’s
left forearm changed without varying the temperature of the cube.
During trials 1 and 2, the core blood temperature changed based on
the temperature of the cube placed on the left hand (e.g., the cold
cube induced a decrease in temperature due to vasoconstriction, and
the warm cube induced an increase in temperature due to vasodila-
tion). However, the core blood temperature of the user still changed
when the temperature of the cube was consistent at room tempera-
ture. We believe that the changes in the core blood temperature are
in response to the perceived temperature of the cube. The results
indicate that the thermal perception created by visualizing ice and
burning coal holograms affected vasomotor function and induced
changes in the core blood temperature of the user’s left forearm.
Figure 5: Changes in valence and arousal ratings after each trial.
4.2 Changes in Emotional States
We explored the changes in the emotional states of the user by calculating
the difference between the self-reported valence and arousal ratings before
and after trials, shown in Fig. 5. We observed that when the cube was at
room temperature, the use of the burning coal hologram (i.e., trial 4)
resulted in a shift towards the LVHA quadrant (i.e., a decrease in valence
and an increase in arousal), while the use of the ice hologram (i.e., trial
3) resulted in a shift towards the HVHA quadrant (i.e., an increase in
valence and an increase in arousal) in the emotion classification model.
This indicates that the ice hologram induced emotions such as happiness or
pleasantness, while the burning coal hologram induced emotions such as
anger or annoyance (Table 1). We hypothesize that this is due to the
changes in the user's thermal perception induced by the visualization of
different thermal information in holograms. The overlay of the ice hologram
on the cube potentially created the illusion of a cold temperature, which
led users to feel happy and pleasant. This shows that the manipulation of
thermal perception through AR has the potential to enhance DBT skill
exercises that are used to help users reduce heightened emotional states.
5 DISCUSSION AND FUTURE WORK
In this study we displayed holograms of cubes with burning coal and ice
textures to manipulate the user's thermal awareness and perception of the
environment. However, the current visualization is overlaid only on the
cube (2 cm by 2 cm by 2 cm), and the hologram animations (i.e., fire and
snowflake particles emanating from the texture) were too subtle for users
to notice. We plan to improve the visualization by expanding the hologram
animations (e.g., snow falling from the sky and accumulating on the floor)
and adding more holograms to the surrounding areas (e.g., a snow surface on
the hand or the table) for a more realistic and immersive environment.
Moreover, in the current study we analyzed the physiological data collected
from the wearable devices only after the experiment. Using the
physiological data as real-time feedback to AR can further enhance the
AR-based DBT skill exercises to be personalized to the users' needs during
their everyday lives. In our future work, we plan to develop a wireless
communication pipeline for our AR system to use physiological data as
real-time feedback. We will evaluate this digital-biomarker-driven AR
intervention by conducting a long-term user study of daily use of AR-based
DBT skills.
6 CONCLUSION
This paper presents the first use of AR and digital biomarkers for
enhancing DBT skill exercises. We conducted a user study to investigate the
impact of AR visualization on the thermal perception and emotional
responses of the users. Our results show that manipulating the user's
thermal awareness and perception by displaying different levels of thermal
information in holograms affects the user's vasomotor function and
emotional states. We will further analyze the relationship between the
physiological data and the emotional responses of the users and evaluate
the clinical use of AR-based DBT skill exercises in DBT therapy sessions.
ACKNOWLEDGMENTS
This work was supported in part by NSF grants CNS-2112562 and
CNS-1908051, NSF CAREER Award IIS-204607, and by a Thomas
Lord Educational Innovation Grant.
REFERENCES
[1] B. M. Appelhans and L. J. Luecken. Heart rate variability as an index of regulated emotional responding. Review of General Psychology, 10(3):229-240, 2006.
[2] M. Z. Baig and M. Kavakli. A survey on psycho-physiological analysis & measurement methods in multimodal systems. Multimodal Technologies and Interaction, 3(2):37, 2019.
[3] M. M. Bradley and P. J. Lang. Measuring emotion: The self-assessment manikin and the semantic differential. Journal of Behavior Therapy and Experimental Psychiatry, 25(1):49-59, 1994.
[4] D. Eckhoff, C. Sandor, G. L. Cheing, J. Schnupp, and A. Cassinelli. Thermal pain and detection threshold modulation in augmented reality. Frontiers in Virtual Reality, p. 130, 2022.
[5] A. Erickson, K. Kim, R. Schubert, G. Bruder, and G. Welch. Is it cold in here or is it just me? Analysis of augmented reality temperature visualization for computer-mediated thermoception. In Proc. of IEEE ISMAR, 2019.
[6] J. Gomez, H. G. Hoffman, S. L. Bistricky, M. Gonzalez, L. Rosenberg, M. Sampaio, A. Garcia-Palacios, M. V. Navarro-Haro, W. Alhalabi, M. Rosenberg, et al. The use of virtual reality facilitates dialectical behavior therapy® "observing sounds and visuals" mindfulness skills training exercises for a Latino patient with severe burns: A case study. Frontiers in Psychology, 8:1611, 2017.
[7] M. Goodman, D. Banthin, N. J. Blair, K. A. Mascitelli, J. Wilsnack, J. Chen, J. W. Messenger, M. M. Perez-Rodriguez, J. Triebwasser, H. W. Koenigsberg, et al. A randomized trial of dialectical behavior therapy in high-risk suicidal veterans. The Journal of Clinical Psychiatry, 77(12):4031, 2016.
[8] N. H. A. Hamid, N. Sulaiman, S. A. M. Aris, Z. H. Murat, and M. N. Taib. Evaluation of human stress using EEG power spectrum. In Proc. of IEEE International Colloquium on Signal Processing & Its Applications, 2010.
[9] Y. Jiang, W. Wang, T. Scargill, M. Rothman, J. Dunn, and M. Gorlatova. Digital biomarkers reflect stress reduction after augmented reality guided meditation: A feasibility study. In Proc. of the Workshop on Emerging Devices for Digital Biomarkers, 2022.
[10] B. Kurdi, S. Lozano, and M. R. Banaji. Introducing the open affective standardized image set (OASIS). Behavior Research Methods, 49(2):457-470, 2017.
[11] M. M. Linehan. Skills Training Manual for Treating Borderline Personality Disorder. Guilford Press, 1993.
[12] A. Rizzo, B. Newman, T. Parsons, G. Reger, K. Holloway, G. Gahm, B. Rothbaum, J. Difede, R. McLay, S. Johnston, et al. Development and clinical results from the virtual Iraq exposure therapy application for PTSD. In Proc. of IEEE Virtual Rehabilitation International Conference, 2009.
[13] J. A. Russell. A circumplex model of affect. Journal of Personality and Social Psychology, 39(6):1161, 1980.
[14] J. Viczko, J. Tarrant, and R. Jackson. Effects on mood and EEG states after meditation in augmented reality with and without adjunctive neurofeedback. Frontiers in Virtual Reality, 2:618381, 2021.
[15] A. B. Warriner, V. Kuperman, and M. Brysbaert. Norms of valence, arousal, and dominance for 13,915 English lemmas. Behavior Research Methods, 45(4):1191-1207, 2013.
[16] M. Wrzesien, M. Alcañiz, C. Botella, J.-M. Burkhardt, J. Bretón-López, M. Ortega, and D. B. Brotons. The therapeutic lamp: Treating small-animal phobias. IEEE Computer Graphics and Applications, 33(1):80-86, 2013.