Project

Virtual Reality in Medicine

Goal: To develop Virtual Reality (VR) applications for the medical domain. Examples are facial and cranial reconstructions, where immersive preoperative planning and inspection can help to achieve an aesthetically pleasing outcome.

Updates: 5 (0 new)
Recommendations: 0 (0 new)
Followers: 35 (0 new)
Reads: 1,033 (0 new)

Project log

Jan Egger
added a research item
Virtual Reality (VR) is a promising technology. It sets up a virtual environment, providing an (audio)visual user experience by creating an artificial world into which the user is placed directly. Extended by haptic systems, it can simulate lifelike experiences for any user. In recent years, VR hardware has increasingly become affordable and thus available to ordinary end users in daily life. Limited only by physical restrictions, such as the walls of a room or the cables connecting the hardware devices to a computer, a user is able to move around inside the VR, i.e. physical movement is applied directly to the artificially generated world. To experience VR, one is usually equipped with a so-called Head-Mounted Display (HMD). Containing a screen in front of each eye, this device renders arbitrary content in front of the user's eyes. Today's VR products are used in different fields such as training simulations, gaming and even videos. This work presents how VR can be integrated into medical applications by developing a VR plugin for the medical image processing framework MeVisLab. The thread-based plugin has been developed using OpenVR, a VR library for building vendor- and platform-independent VR applications, and was tested with the HTC Vive.
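The thread-based plugin design mentioned above (a render loop running alongside the host application) can be sketched in a simplified, library-free form. The class and method names below are invented for illustration; this is not the MeVisLab or OpenVR API:

```python
import threading
import time

class VRRenderLoop:
    """Simplified sketch of a thread-based VR render loop.

    The real plugin submits stereo frames to the HMD via OpenVR; here
    "rendering" just records that the most recent scene data was consumed,
    so only the threading pattern itself is demonstrated."""

    def __init__(self):
        self._scene = None                  # latest data from the host app
        self._lock = threading.Lock()
        self._stop = threading.Event()
        self._frames_rendered = 0
        self._thread = threading.Thread(target=self._run, daemon=True)

    def update_scene(self, data):
        """Called from the host application's thread."""
        with self._lock:
            self._scene = data

    def _run(self):
        while not self._stop.is_set():
            with self._lock:
                scene = self._scene
            if scene is not None:
                # In the real plugin: render left/right eye and submit to the HMD.
                self._frames_rendered += 1
            time.sleep(0.001)               # stand-in for vsync/compositor wait

    def start(self):
        self._thread.start()

    def stop(self):
        """Stop the loop and return the number of frames rendered."""
        self._stop.set()
        self._thread.join()
        return self._frames_rendered
```

The key point of the pattern is that the host application only writes the shared scene data under a lock, while the render thread keeps refreshing the headset independently of the host's update rate.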
Jan Egger
added a research item
Apart from its applications in surgical navigation, virtual surgery has become a feasible method for training young surgeons. Since 2006, virtual simulation has been performed for selected patients affected by complex craniomaxillofacial disorders (n = 8), in addition to standard surgical planning based on patient-specific 3D models. Although standard models are sufficient for young surgeons to practice on, an accurate, personalized model is still needed to simulate a surgery more faithfully. Patients' CT images are considered appropriate references for this procedure. CT images are taken along the axis of the body, each representing the structure of a plane perpendicular to the body axis. Each image also contains information such as the gray-scale level of every pixel, which can be used to build a volume model. During an operation, the model should be able to provide haptic feedback as well as update changes to the model in real time. In this project, we realize these functions with the CHAI3D library. The Omega.6 is the device we use for applying forces and receiving feedback.
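The step from axial CT slices to a volume model can be illustrated with a minimal sketch. The data here are tiny synthetic gray-value arrays, not real DICOM slices, and the threshold value is only an illustrative stand-in for a bone-isolation criterion:

```python
# Sketch: stacking axial CT slices into a voxel volume and extracting
# high-attenuation voxels (e.g. bone) for a subsequent surface/haptic model.

def build_volume(slices):
    """Stack 2D slices (lists of rows of gray values) into a 3D volume.

    volume[z][y][x] is the gray level of one voxel; the slice index z runs
    along the body axis, each slice being a plane perpendicular to it."""
    height, width = len(slices[0]), len(slices[0][0])
    for s in slices:
        assert len(s) == height and all(len(row) == width for row in s)
    return list(slices)

def threshold_voxels(volume, lower):
    """Collect coordinates of voxels whose gray level reaches a threshold."""
    return [(x, y, z)
            for z, plane in enumerate(volume)
            for y, row in enumerate(plane)
            for x, value in enumerate(row)
            if value >= lower]

# Two tiny 2x2 "slices" with synthetic gray values
slices = [[[100, 1200], [300, 900]],
          [[1500, 200], [50, 1300]]]
vol = build_volume(slices)
bone = threshold_voxels(vol, 1000)   # voxels >= 1000 treated as "bone" here
```

A real pipeline would read the slices from DICOM files and derive the voxel spacing from the image metadata, but the stacking-and-thresholding structure is the same.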
Jan Egger
added a research item
Haptic-based virtual reality simulation, which creates a fused visual and tactile virtual environment for operators, is an important application in surgical training. The catheter's deformation and force feedback are key techniques in a catheter intervention simulator. The simulation of catheter interventional surgery is developed with the aid of open-source software, such as OpenGL and CHAI3D, and the force-feedback device Omega.6 (Force Dimension, Switzerland). CHAI3D is a platform framework for visualization, computer haptics and interactive real-time simulation. To realize deformation and the haptic effect in the virtual system, a skeleton ball-spring model based on the classic mass-spring model (MSM) is applied to the catheter's 3D model, and an improved GEL dynamics engine is proposed. According to the skeleton ball-spring model, a set of 6-DOF balls is arranged within the 3D model. By defining the parameters of these balls and springs, such as mass, radius, elasticity, damping and external force, the real characteristics of the catheter can be simulated vividly. The improved model is more accurate than the MSM but still computationally efficient. The simulator provides an effective training method that helps operators practice catheter interventional surgery and become more skillful in a safe and easy way, compared with traditional surgical training. Meanwhile, the deformable, collision-detecting catheter model in the simulator provides operators with a more realistic feeling during surgical training. The methods can also be applied to other surgical fields.
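The classic mass-spring model underlying the skeleton ball-spring approach can be sketched in its simplest one-dimensional form. This is not the skeleton ball-spring model or the GEL engine itself, just the basic MSM idea; the parameter names (mass, stiffness, damping) mirror the ball and spring properties listed above, and the values are arbitrary:

```python
# Minimal 1D mass-spring-damper sketch of the deformation idea behind the
# catheter model: a displaced node relaxes back to its rest position.

def simulate(x0, v0, mass=0.01, stiffness=50.0, damping=0.5,
             rest=0.0, dt=1e-3, steps=5000):
    """Integrate m*a = -k*(x - rest) - c*v with explicit Euler.

    Returns the final position and velocity. In a full MSM, every node
    sums spring forces from all its neighbors instead of a single spring."""
    x, v = x0, v0
    for _ in range(steps):
        force = -stiffness * (x - rest) - damping * v   # spring + damping
        a = force / mass
        v += a * dt
        x += v * dt
    return x, v

# A node displaced by 1 cm oscillates and decays back toward rest.
x, v = simulate(x0=0.01, v0=0.0)
```

The same spring force, taken with the opposite sign, is what a haptic device would render to the operator's hand; stiffer springs and shorter time steps trade realism against the computational budget of the real-time loop.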
Jan Egger
added a research item
Virtual Reality (VR) sets up a virtual environment, providing an (audio)visual user experience by creating an artificial world into which the user is placed directly. To experience VR, one is usually equipped with a so-called Head-Mounted Display (HMD). Today's VR products are used in different fields such as training simulations, gaming and even videos. This work presents how VR can be integrated into medical applications by developing a VR plugin for the medical image processing framework MeVisLab. The thread-based plugin has been developed using OpenVR, a VR library for building vendor- and platform-independent VR applications.
Jan Egger
added an update
Video showing the thread-based integration of the HTC Vive into the medical platform MeVisLab via OpenVR (credit to Simon Gunacker):
 
Jan Egger
added an update
The source code for the MeVisLab implementation is available in the following GitHub account: https://github.com/simon-gunacker/vive
 
Jan Egger
added a research item
Virtual Reality, an immersive technology that replicates an environment via computer-simulated reality, gets a lot of attention in the entertainment industry. However, VR also has great potential in other areas, such as the medical domain. Examples are intervention planning, training and simulation. This is especially useful for medical operations where an aesthetic outcome is important, such as facial surgeries. However, importing medical data into Virtual Reality devices is not necessarily trivial, in particular when a direct connection to a proprietary application is desired. Moreover, most researchers do not build their medical applications from scratch, but rather leverage platforms like MeVisLab, MITK, OsiriX or 3D Slicer. These platforms have in common that they use libraries like ITK and VTK and provide a convenient graphical interface. However, ITK and VTK do not support Virtual Reality directly. In this study, the use of a Virtual Reality device for medical data under the MeVisLab platform is presented. The OpenVR library is integrated into the MeVisLab platform, allowing direct and uncomplicated use of the HTC Vive head-mounted display inside MeVisLab. Medical data coming from other MeVisLab modules can be connected directly, per drag-and-drop, to the Virtual Reality module, rendering the data inside the HTC Vive for immersive virtual reality inspection.
Jan Egger
added a research item
Virtual Reality (VR) is an immersive technology that replicates an environment via computer-simulated reality. VR gets a lot of attention in computer games but also has great potential in other areas, such as the medical domain. Examples are planning, simulation and training of medical interventions, for instance facial surgeries where an aesthetic outcome is important. However, importing medical data into VR devices is not trivial, especially when a direct connection and visualization from one's own application is needed. Furthermore, most researchers don't build their medical applications from scratch; rather, they use platforms like MeVisLab, Slicer or MITK. These platforms have in common that they integrate and build upon libraries like ITK and VTK, providing a more convenient graphical interface to them for the user. In this contribution, we demonstrate the use of a VR device for medical data under MeVisLab. To this end, we integrated the OpenVR library into MeVisLab as a dedicated module. This enables the direct and uncomplicated use of head-mounted displays, like the HTC Vive, under MeVisLab. In summary, medical data from other MeVisLab modules can be connected directly, per drag-and-drop, to our VR module and is rendered inside the HTC Vive for immersive inspection.
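The drag-and-drop connection of a data-producing module to the VR module can be sketched as a minimal observer-style module graph. All class and method names below are invented for illustration and are not the MeVisLab API:

```python
# Illustrative sketch of module connections: data produced by one module is
# connected to the VR module's input, which reacts whenever the data changes.

class Module:
    """A generic module with one output that other modules can subscribe to."""

    def __init__(self):
        self._listeners = []
        self.output = None

    def connect(self, other):
        """Connect this module's output to another module's input."""
        self._listeners.append(other)

    def set_output(self, data):
        """Publish new output data and notify all connected modules."""
        self.output = data
        for m in self._listeners:
            m.on_input(data)

class VRModule(Module):
    """Consumes incoming data; stands in for the OpenVR-backed module."""

    def __init__(self):
        super().__init__()
        self.rendered = []

    def on_input(self, data):
        # In the real module: upload the data to the GPU and render it in the HMD.
        self.rendered.append(data)

source = Module()
vr = VRModule()
source.connect(vr)              # the "drag-and-drop" connection
source.set_output("ct-volume")  # new data propagates to the VR module
```

The design choice mirrored here is that the VR module does not pull data itself; whatever upstream module is connected to it pushes updates, so any data source in the graph can drive the headset without the VR module knowing its type.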
Jan Egger
added an update
YouTube Video showing the integration of the HTC Vive into the medical platform MeVisLab:
 
Jan Egger
added an update
Jan Egger
added 11 project references
Jan Egger
added a project goal
To develop Virtual Reality (VR) applications for the medical domain. Examples are facial and cranial reconstructions, where immersive preoperative planning and inspection can help to achieve an aesthetically pleasing outcome.