Figure - available from: Human Behavior and Emerging Technologies
Source publication
In virtual reality (VR), participants may not always have hands, bodies, eyes, or even voices—using VR helmets and two controllers, participants control an avatar through virtual worlds that do not necessarily obey familiar laws of physics; moreover, the avatar’s bodily characteristics may not neatly match our bodies in the physical world. Despite...
Citations
... Therefore, a more multimodal approach could be taken, transcribing participants' behaviours in the context of different companions and virtual interaction conditions. Similarly to Klowait [40] and Klowait and Erofeeva [41], the analysis could focus on more general nonverbal behaviours, such as sequential analysis of body motion and action-based decision-making. A systematic analysis of such behaviours could reveal more about inner processes related to the perceived presence of others, which might not be revealed by focusing on eye-tracking alone. ...
... In the present study, the only sensory modality used to induce copresence was vision. It is important to note, however, the role that multimodal cues can play in invoking a sense of immersion and social presence, including proprioceptive, haptic (touch), olfactory (smell), and auditory cues [23,40]; these cues were beyond the scope of the present study. Any sensory input in addition to vision, or an expectation of such, could additionally contribute to either the congruency or incongruency of the immersive experience, as well as to the vigilance over the occluded physical environment in which immersion occurs. ...
Sharing experiences with others is an important part of everyday life. Immersive virtual reality (IVR) promises to simulate these experiences. However, whether IVR elicits a similar level of social presence as measured in the real world is unclear. It is also uncertain whether AI-driven virtual humans (agents) can elicit a similar level of meaningful social copresence as people-driven virtual humans (avatars). The current study demonstrates that both virtual human types can have a cognitive impact on a social partner. The current experiment tested participants’ cognitive performance changes in the presence of virtual social partners by measuring the social facilitation effect (SFE). The SFE-related performance change can occur through either vigilance-based mechanisms related to other people’s copresence (known as the mere presence effect (MPE)) or reputation-management mechanisms related to other people’s monitoring (the audience effect (AE)). In this study, we hypothesised AE and MPE as distinct mechanisms of eliciting SFE. First, we predicted that, if head-mounted IVR can simulate sufficient copresence, any social companion’s visual presence would elicit SFE through MPE. The results demonstrated that companion presence decreased participants’ performance irrespective of whether the companion was AI- or human-driven. Second, we predicted that monitoring by a human-driven, but not an AI-driven, companion would elicit SFE through AE. The results demonstrated that monitoring by a human-driven companion affected participant performance more than monitoring by an AI-driven one, worsening performance marginally in accuracy and significantly in reaction times. We discuss how the current results explain findings in the prior SFE-in-virtual-worlds literature and map out future considerations for social-IVR testing, such as participants’ virtual self-presence and the affordances of physical and IVR testing environments.
... Third, in our study only one stigma measurement instrument was used, and because stigma is a complex construct, future research should assess the effect on other cognitive, emotional, and behavioral components of the construct (e.g., by measuring implicit stigma). Fourth, our study did not include in its design and analysis mediating variables for the mechanisms of interaction, perception, and behavior in the virtual reality environment that could affect and explain the effects of the intervention, such as the degree of immersion and sense of presence [44] or the sequence and timing of interaction with the virtual environments [45]; future studies should include these variables. Finally, we did not account for the specific characteristics of the participants. ...
Background: Stigma toward people with serious mental illnesses (SMI), like schizophrenia, is a serious global public health challenge that limits the quality of life of those affected and poses a major barrier that keeps people from seeking professional help. There is an urgent need for novel, effective, and scalable interventions to decrease stigmatized perceptions of chronic psychotic disorders and to reduce the health burden imposed by them.
Method: We conducted a randomized controlled trial to assess the impact of a new immersive virtual reality game (Inclúyete-VR) on the level of stigma toward people with SMI, measured by the Attribution Questionnaire (AQ-27). Participants in the experimental group were exposed in an immersive way to hallucinations common in schizophrenia and then shown different psychosocial resources available for their recovery and social inclusion; those in the control group used VR software unrelated to mental health. VR sessions were delivered through Oculus headsets and lasted 25 minutes.
Results: We randomly assigned 124 university students (55% female) to experimental or control conditions (n = 62 each). We used mixed ANOVA to compare outcomes before and after the intervention between the two groups. We found a significant intervention-by-time interaction (P < 0.001), with a reduction in the experimental group of overall stigma levels on the AQ-27 scale and its three subscales: dangerousness-fear, avoidance, and lack of solidarity (P < 0.001 for all).
Conclusions: The Inclúyete-VR software proved effective in the short term in reducing stigma toward people with severe mental illness. The program's longer-term efficacy, scalability, and dissemination remain to be studied.
ClinicalTrials.gov Identifier: NCT05393596.