Figure 1 - uploaded by Michael Wilkinson
Oculus Rift headset and Touch controllers. The headset is worn on the head, against the face, and the goggles are secured with two side straps and a top strap. The controllers are held in the participant's hands and are used to move within the VR environment.
Source publication
The only evidence that seeing in slow motion exists comes from retrospective interviews. An ongoing debate is whether this phenomenon is a figment of memory or a true function of visual perception. Testing these speculations is difficult, given that slow-motion experiences are often associated with intense, stressful, and even threatening situations...
Citations
... They offer real-time control and quantitative measures of user behavior as users are exposed to VR and adapt to virtual environments. Peripheral physiological responses, such as electrodermal activity (EDA) and photoplethysmography (PPG), are indicative metrics for physiological arousal [107], emotional valence [74], or cognitive workload [45]. ...
Virtual reality experiences increasingly use physiological data for virtual environment adaptations to evaluate user experience and immersion. Previous research required complex medical-grade equipment to collect physiological data, limiting real-world applicability. To overcome this, we present SensCon for skin conductance and heart rate data acquisition. To identify the optimal sensor location in the controller, we conducted a first study investigating users' controller grasp behavior. In a second study, we evaluated the performance of SensCon against medical-grade devices in six scenarios regarding user experience and signal quality. Users subjectively preferred SensCon in terms of usability and user experience. Moreover, the signal quality evaluation showed satisfactory accuracy across static, dynamic, and cognitive scenarios. Therefore, SensCon reduces the complexity of capturing and adapting the environment via real-time physiological data. By open-sourcing SensCon, we enable researchers and practitioners to adapt their virtual reality environment effortlessly. Finally, we discuss possible use cases for virtual reality-embedded physiological sensing.
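As a toy illustration of adapting a virtual environment from real-time heart-rate data, the sketch below maps a window of readings onto a coarse adaptation decision. This is not SensCon's API: the function names, thresholds, baseline, and sample readings are all invented for illustration.

```python
from statistics import mean

def arousal_level(hr_window, baseline_hr):
    """Illustrative arousal proxy: mean heart rate relative to a resting baseline."""
    return mean(hr_window) / baseline_hr

def adapt_environment(level, low=1.05, high=1.20):
    """Map the arousal proxy onto a coarse adaptation decision.
    The thresholds are made up for this sketch."""
    if level < low:
        return "increase challenge"
    if level > high:
        return "reduce stressors"
    return "keep current scene"

baseline = 65.0                  # assumed resting heart rate (bpm)
window = [82, 85, 88, 84, 86]    # assumed recent bpm readings from the controller
print(adapt_environment(arousal_level(window, baseline)))
```

In a real system the window would be fed continuously from the sensor stream, and the decision would drive scene parameters rather than a printed label.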
... Recently, Wilkinson et al. (2019) conducted a VR study exploring the subjective experience of slow motion, which consisted of three height-related events for arousal manipulation (walking on a sidewalk, plank-walking at a height of 100 m, and falling from the plank atop a building) coupled with a perceptual encoding task. Heart rate was used as an objective measure of arousal. ...
... Many studies examining presence and immersion in VR compare this [VR] technology to another medium or to the real world. Although some research suggests that VR-HMDs can support participants' feeling of being in the virtual world just as in the physical one, Wilkinson et al. (2019) found a possible ceiling effect: certain scenarios may not produce optimal presence (e.g., the experience of falling was not realistic enough to make participants believe they were actually falling). ...
Virtual reality technology is constantly improving, making virtual environments ever more like physical ones. However, some research evidence suggests that certain virtual reality scenarios feel less real than others to human observers (e.g., the experience of falling from a high place), which limits the use of virtual reality as a research tool for certain tasks. Moreover, since the inception of VR research, the terms presence and immersion have been somewhat conflated and at times even used interchangeably. Using a thematic content analysis of seventeen articles, a theme for each term emerged: presence is an experiential quality in virtual environments, and immersion is associated with the technical aspects of a virtual system that aid the user in feeling a sense of presence. Several new technologies, as well as more traditional approaches, are discussed as potential methods to improve immersion, and therefore presence, in virtual reality.
Electroencephalography (EEG) has become a widely used non-invasive measurement method for brain-computer interfaces (BCI). Hybrid BCIs (hBCI) additionally incorporate other physiological indicators, also called bio-signals, to improve the decoding of brain signals by evaluating a variety of sensor data. Although significant progress has been made in the field of BCI, the correlation between data from different sensors, as well as the possible redundancy of certain sensors, has been studied less frequently. Based on deep learning, our concept presents a theoretical approach to potentially replacing one sensor with the measurements of others. Hence, a costly or difficult-to-obtain sensor measurement could be left out of a setup entirely without losing its functionality. In this context, we additionally propose a conceptual framework that facilitates and improves the generation of scientifically significant data through its collection within a corresponding VR application and setup. The evaluation of the collected sensor data, described in five consecutive steps, is to cluster the data of one sensor and to classify the data from the other sensors into these clusters. Afterwards, the sensor data in each cluster are analysed for patterns. Through predictive analysis of the data from the remaining sensors, the required number of sensors can be reduced, allowing valid statements about the output of the original sensor without actually using it. An artificial intelligence (AI) based EEG emulation, derived from other directly related bio-signals, could therefore potentially replace EEG measurements, indirectly enabling the use of BCI in situations where it was previously not possible. Future work might clarify relevant questions concerning the realisation of the concept and how it could be further developed.
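The cluster-then-classify idea in this abstract can be sketched with a toy example. Everything below is an assumption for illustration only: the synthetic two-state data, the one-dimensional two-centroid k-means, and the nearest-centroid classifier are simple stand-ins for the deep-learning pipeline the authors describe. The point is the structure: cluster sensor A, train a predictor for those cluster labels from sensor B, and check how well B alone recovers them.

```python
import random

random.seed(0)

# Synthetic, illustrative data for two correlated sensors in two arousal
# states (e.g. sensor A = EDA level, sensor B = heart rate). Not real physiology.
samples = [(random.gauss(2.0, 0.3), random.gauss(60, 3)) for _ in range(50)] \
        + [(random.gauss(8.0, 0.3), random.gauss(90, 3)) for _ in range(50)]

# Step 1: cluster sensor A readings (two-centroid k-means in one dimension).
a_values = [a for a, _ in samples]
c_lo, c_hi = min(a_values), max(a_values)
for _ in range(10):
    lo = [a for a in a_values if abs(a - c_lo) <= abs(a - c_hi)]
    hi = [a for a in a_values if abs(a - c_lo) > abs(a - c_hi)]
    c_lo, c_hi = sum(lo) / len(lo), sum(hi) / len(hi)
labels = [0 if abs(a - c_lo) <= abs(a - c_hi) else 1 for a in a_values]

# Step 2: classify sensor B readings into the sensor-A clusters
# (nearest-centroid on B, using the A-derived labels as targets).
b_values = [b for _, b in samples]
b_lo = sum(b for b, l in zip(b_values, labels) if l == 0) / labels.count(0)
b_hi = sum(b for b, l in zip(b_values, labels) if l == 1) / labels.count(1)
predicted = [0 if abs(b - b_lo) <= abs(b - b_hi) else 1 for b in b_values]

# High agreement suggests sensor B could stand in for sensor A.
agreement = sum(p == l for p, l in zip(predicted, labels)) / len(labels)
print(f"agreement: {agreement:.2f}")
```

If the agreement stays high on held-out data, the redundant sensor could be dropped from the setup, which is the sensor-reduction argument the abstract makes.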