Conference Paper

Iterative Prototyping of a Cut for a Finger Tracking Glove


Abstract

The perception of the movement of one's own fingers is important in VR. The sense of touch of the hands also provides crucial information about a touched object. This is especially noticeable when climbing in VR. While prototyping a VR climbing application, we developed the finger tracking glove g1. Glove g1 enables the perception of finger movements but limits the sense of touch of the hand. Hence, we developed the proposed glove g2, which enables the wearer to feel objects with the skin of the palm and the fingertips. In this paper we describe the iterative design process of g2. Furthermore, the pros and cons of g2 compared to g1 are discussed. Finally, an outlook on a future study about the measurement of presence and body ownership is given.

References
Conference Paper
Dealing with fear of falling is a challenge in sport climbing. Virtual reality (VR) research suggests that using physical and reality-based interaction increases presence in VR. In this paper, we present a study that investigates the influence of physical props on presence, stress and anxiety in a VR climbing environment involving whole-body movement. To help climbers overcome the fear of falling, we compared three different conditions: climbing in reality at 10 m height, physical climbing in VR (with props attached to the climbing wall) and virtual climbing in VR using game controllers. From subjective reports and biosignals, our results show that climbing with props in VR increases the anxiety and sense of realism in VR for sport climbing. This suggests that VR in combination with physical props is an effective simulation setup to induce the sense of height.
Conference Paper
While current consumer virtual reality headsets can convey a strong feeling of immersion, one drawback is still the missing haptic feedback when interacting with virtual objects. In this work, we investigate the use of an artificial climbing wall as a haptic feedback device in a virtual rock climbing environment. It enables users to wear a head-mounted display and actually climb on the physical climbing wall, which conveys the feeling of climbing on a large mountain face.
Article
In this paper, I address the question of why participants tend to respond realistically to situations and events portrayed within an immersive virtual reality system. The idea is put forward, based on the experience of a large number of experimental studies, that there are two orthogonal components that contribute to this realistic response. The first is 'being there', often called 'presence', the qualia of having a sensation of being in a real place. We call this place illusion (PI). Second, plausibility illusion (Psi) refers to the illusion that the scenario being depicted is actually occurring. In the case of both PI and Psi the participant knows for sure that they are not 'there' and that the events are not occurring. PI is constrained by the sensorimotor contingencies afforded by the virtual reality system. Psi is determined by the extent to which the system can produce events that directly relate to the participant, and by the overall credibility of the scenario being depicted in comparison with expectations. We argue that when both PI and Psi occur, participants will respond realistically to the virtual reality.
Conference Paper
Hands deserve particular attention in virtual reality (VR) applications because they represent our primary means for interacting with the environment. Although marker-based motion capture with inverse kinematics works adequately for full-body tracking, it is less reliable for small body parts such as hands and fingers, which are often occluded when captured optically, leading VR professionals to rely on additional systems (e.g. inertial trackers). We present a machine learning pipeline to track hands and fingers using solely a motion capture system based on cameras and active markers. Our finger animation is performed by a predictive model based on neural networks trained on a movement dataset acquired from several subjects with a complementary capture system. We employ a two-stage pipeline, which first resolves occlusions and then recovers all joint transformations. We show that our method compares favorably to inverse kinematics by automatically inferring the constraints from the data, provides a natural reconstruction of postures, and handles occlusions better than three proposed baselines.
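As a rough sketch of the two-stage pipeline described above, the following Python/PyTorch listing fills in occluded markers with one network and regresses joint rotations with a second. All names, marker and joint counts, layer sizes and the 6D rotation output are illustrative assumptions, not the authors' actual architecture; the listing only makes the data flow of the two stages concrete.

import torch
import torch.nn as nn

N_MARKERS = 19   # assumed number of active hand markers
N_JOINTS = 16    # assumed number of hand joints

class OcclusionFiller(nn.Module):
    """Stage 1: predict 3D positions of occluded markers from visible ones."""
    def __init__(self):
        super().__init__()
        # input: marker xyz (zeroed where occluded) + per-marker visibility flags
        self.net = nn.Sequential(
            nn.Linear(N_MARKERS * 4, 256), nn.ReLU(),
            nn.Linear(256, 256), nn.ReLU(),
            nn.Linear(256, N_MARKERS * 3),
        )

    def forward(self, markers, visible):
        x = torch.cat([markers.flatten(1), visible.float()], dim=1)
        filled = self.net(x).view(-1, N_MARKERS, 3)
        # keep the measured positions wherever the marker was actually visible
        return torch.where(visible.bool().unsqueeze(-1), markers, filled)

class PoseRegressor(nn.Module):
    """Stage 2: map the completed marker set to per-joint rotations (6D representation)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_MARKERS * 3, 256), nn.ReLU(),
            nn.Linear(256, 256), nn.ReLU(),
            nn.Linear(256, N_JOINTS * 6),
        )

    def forward(self, markers):
        return self.net(markers.flatten(1)).view(-1, N_JOINTS, 6)

# toy forward pass with random data, only to show the data flow
markers = torch.randn(1, N_MARKERS, 3)
visible = torch.randint(0, 2, (1, N_MARKERS))
completed = OcclusionFiller()(markers * visible.unsqueeze(-1), visible)
rotations = PoseRegressor()(completed)
print(rotations.shape)  # torch.Size([1, 16, 6])

In the paper both stages are trained on a movement dataset captured with a complementary system; here the untrained networks are simply run on random inputs to show how the stages connect.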
Article
Multi-fingered haptics is imperative for a truly immersive virtual reality (VR) experience, as many real-world tasks involve finger manipulation. A key obstacle is the absence of technologically and economically viable wearable haptic interfaces that can simultaneously track finger/hand motions and display multi-degree-of-freedom (DOF) contact forces. In this paper, we propose a novel wearable cutaneous haptic interface (WCHI), which consists of: 1) finger tracking modules (FTMs) to estimate complex multi-DOF finger and hand motion; and 2) cutaneous haptic modules (CHMs) to convey three-DOF contact force at the fingertip. By opportunistically utilizing different types of sensors, namely inertial measurement units (IMUs), force sensitive resistor (FSR) sensors, and soft sensors, the WCHI can track complex, anatomically consistent multi-DOF finger motion while avoiding the FTM-CHM electromagnetic interference that can stem from their collocation in the small form-factor interface. It also provides the direction and magnitude of the three-DOF fingertip contact force, and feeding this signal back under closed-loop control significantly enhances the precision of contact force generation despite variability among users. A human subject study with a virtual peg insertion task shows the importance of both multi-DOF finger tracking and three-DOF cutaneous haptic feedback for dexterous manipulation in a virtual environment.
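The closed-loop feedback mentioned above can be illustrated with a short sketch: regulate the fingertip contact force toward a target using force-sensor readings. The PI controller, the gains, the loop rate and the read_force/apply_actuator placeholders are assumptions made only for illustration; they do not reproduce the WCHI's actual control law or hardware API.

import time

KP, KI = 0.8, 0.2   # assumed proportional and integral gains
DT = 0.01           # assumed 100 Hz control loop

def read_force():
    """Placeholder: return the calibrated fingertip contact force in newtons (e.g. from an FSR)."""
    return 0.0

def apply_actuator(command):
    """Placeholder: send a normalized drive command to the cutaneous actuator."""
    pass

def force_control_loop(target_force_n, duration_s=1.0):
    """Drive the measured contact force toward target_force_n with a simple PI loop."""
    integral = 0.0
    t_end = time.time() + duration_s
    while time.time() < t_end:
        error = target_force_n - read_force()
        integral += error * DT
        command = KP * error + KI * integral
        apply_actuator(max(0.0, min(1.0, command)))  # clamp to the actuator's input range
        time.sleep(DT)

force_control_loop(target_force_n=1.5)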
Conference Paper
Scary climbing situations can arise when climbing on big walls. Some stones may be slippery due to humidity or smoothness, and distances between belaying bolts can be large, risking a deep fall. In such situations, the stress level increases, climbing performance decreases and an ascent might even be aborted. Furthermore, hazardous accidents can happen when concentration is disturbed by stress and fear. Moreover, a beginner has to overcome hurdles such as learning movements and belaying techniques as well as overcoming acrophobia or the fear of falling. We try to ease these hurdles with augmentation of the human in Mixed Reality (MR) superhuman sport climbing. In this paper, we present the development of an MR climbing system that tackles these hurdles by allowing a user to climb at a safe height at ground level in reality while seeing his or her own body visualised in a much more demanding situation (e.g. in the middle of a big wall) in a Virtual Environment (VE) through an HMD. Additionally, an audience can be integrated into the VE via a collaboration with an operator of a virtual drone. The drone operator can provide additional feedback, e.g. point to a certain stone with a virtual laser, and can communicate with the climber via microphone. Hence, if a climber gets stuck in a climbing route, the drone operator can suggest the next move in the VE.
Article
Arthritis remains a disabling and painful disease, and involvement of finger joints is a major cause of disability and loss of employment. Traditional arthritis measurements require labour-intensive examination by clinical staff. These manual measurements are inaccurate and open to observer variation. This paper presents the development and testing of a next-generation wireless smart glove to facilitate the accurate measurement of finger movement through the integration of multiple IMU sensors with bespoke control algorithms. Our main objective was to measure finger and thumb joint movement. These dynamic measurements will provide clinicians with a new and accurate way to measure loss of movement in patients with Rheumatoid Arthritis. Commercially available gaming gloves are not fitted with sufficient sensors for this particular application and require calibration for each glove wearer. Unlike these state-of-the-art data gloves, the Inertial Measurement Unit (IMU) glove uses a combination of a novel stretchable substrate material and 9-degree-of-freedom (DOF) inertial sensors in conjunction with complex data analytics to detect joint movement. Our novel iSEG-Glove requires minimal calibration and is therefore particularly suited to the healthcare environment. Inaccuracies may arise for wearers who have varying degrees of movement in their finger joints, variance in hand size or deformities; the developed glove is fitted with sensors to overcome these issues. This glove will help quantify joint stiffness and monitor patient progression during the arthritis rehabilitation process.
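One building block such an IMU glove relies on is turning the orientations reported by two adjacent phalanx-mounted IMUs into a joint angle. The sketch below does this with plain quaternion algebra; it is a generic illustration under that assumption and does not reproduce the iSEG-Glove's actual sensor fusion or calibration.

import math

def quat_conjugate(q):
    w, x, y, z = q
    return (w, -x, -y, -z)

def quat_multiply(a, b):
    """Hamilton product of two (w, x, y, z) quaternions."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (
        aw * bw - ax * bx - ay * by - az * bz,
        aw * bx + ax * bw + ay * bz - az * by,
        aw * by - ax * bz + ay * bw + az * bx,
        aw * bz + ax * by - ay * bx + az * bw,
    )

def joint_angle_deg(q_proximal, q_distal):
    """Angle of the relative rotation between two phalanx-mounted IMU orientations."""
    q_rel = quat_multiply(quat_conjugate(q_proximal), q_distal)
    w = max(-1.0, min(1.0, q_rel[0]))
    return math.degrees(2.0 * math.acos(abs(w)))

# example: the distal IMU is rotated 30 degrees about the flexion axis
q_prox = (1.0, 0.0, 0.0, 0.0)
q_dist = (math.cos(math.radians(15.0)), math.sin(math.radians(15.0)), 0.0, 0.0)
print(round(joint_angle_deg(q_prox, q_dist)))  # -> 30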
Article
Previous studies have examined the experience of owning a virtual surrogate body or body part through specific combinations of cross-modal multisensory stimulation. Both visuomotor (VM) and visuotactile (VT) synchronous stimulation have been shown to be important for inducing a body ownership illusion, each tested separately or both in combination. In this study we compared the relative importance of these two cross-modal correlations, when both are provided in the same immersive virtual reality setup and the same experiment. We systematically manipulated VT and VM contingencies in order to assess their relative role and mutual interaction. Moreover, we present a new method for measuring the induced body ownership illusion through time, by recording reports of breaks in the illusion of ownership ('breaks') throughout the experimental phase. The balance of the evidence, from both questionnaires and analysis of the breaks, suggests that while VM synchronous stimulation contributes the greatest to the attainment of the illusion, a disruption of either (through asynchronous stimulation) contributes equally to the probability of a break in the illusion.
Jean-Baptiste Chossat, Yiwei Tao, Vincent Duchaine, and Yong-Lae Park. Wearable soft artificial skin for hand motion detection with embedded microfluidic strain sensing. In 2015 IEEE International Conference on Robotics and Automation (ICRA), pages 2568-2573. IEEE, May 2015.
Oliver Glauser, Shihao Wu, Daniele Panozzo, Otmar Hilliges, and Olga Sorkine-Hornung. Interactive Hand Pose Estimation using a Stretch-Sensing Soft Glove. In ACM Transactions on Graphics (Proceedings of ACM SIGGRAPH), volume 38, page 15, New York, USA, 2019. ACM Press.