Article

Live augmented reality: a new visualization method for laparoscopic surgery using continuous volumetric computed tomography.

Department of Diagnostic Radiology and Nuclear Medicine, University of Maryland School of Medicine, 22 South Greene Street, Baltimore, MD 21201, USA.
Surgical Endoscopy (Impact Factor: 3.43). 02/2010; 24(8):1976-85. DOI: 10.1007/s00464-010-0890-8
Source: PubMed

ABSTRACT: Current laparoscopic images are rich in surface detail but lack information on deeper structures. This report presents a novel method for highlighting these structures during laparoscopic surgery using continuous multislice computed tomography (CT). This has resulted in a more accurate augmented reality (AR) approach, termed "live AR," which merges three-dimensional (3D) anatomy from live low-dose intraoperative CT with live images from the laparoscope.
A series of procedures in swine was conducted in a CT room outfitted as a fully equipped laparoscopic surgical suite. A 64-slice CT scanner imaged the surgical field approximately once per second. Each procedure began with a contrast-enhanced, diagnostic-quality CT scan (initial CT) of the liver, followed by continuous intraoperative CT and laparoscopic imaging with an optically tracked laparoscope. Intraoperative anatomic changes included user-applied deformations and those from breathing. Through deformable image registration, an intermediate image-processing step, the initial CT was warped to align spatially with the low-dose intraoperative CT scans. The registered initial CT was then rendered and merged with the laparoscopic images to create live AR.
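The warping step described above can be illustrated with a simple intensity-based deformable registration. The sketch below is a minimal 2D Thirion-demons iteration in Python (NumPy/SciPy), not the authors' implementation: it aligns a synthetic "initial CT" blob with a shifted "intraoperative" blob by repeatedly updating a Gaussian-smoothed displacement field.

```python
import numpy as np
from scipy import ndimage

def warp(img, disp):
    """Resample img at grid positions shifted by the displacement field."""
    yy, xx = np.meshgrid(np.arange(img.shape[0]),
                         np.arange(img.shape[1]), indexing="ij")
    return ndimage.map_coordinates(img, [yy + disp[0], xx + disp[1]],
                                   order=1, mode="nearest")

def demons_step(fixed, warped, disp, sigma=1.0):
    """One Thirion demons update: a force from the intensity difference and
    the fixed-image gradient, then Gaussian smoothing as regularization."""
    gy, gx = np.gradient(fixed)
    diff = fixed - warped
    denom = gx**2 + gy**2 + diff**2 + 1e-9   # avoid division by zero
    disp[0] += diff * gy / denom
    disp[1] += diff * gx / denom
    disp[0] = ndimage.gaussian_filter(disp[0], sigma)
    disp[1] = ndimage.gaussian_filter(disp[1], sigma)
    return disp

# Synthetic stand-ins: "initial CT" (moving) and a shifted
# "low-dose intraoperative CT" (fixed).
yy, xx = np.meshgrid(np.arange(64), np.arange(64), indexing="ij")
moving = np.exp(-((yy - 30)**2 + (xx - 32)**2) / 100.0)
fixed  = np.exp(-((yy - 34)**2 + (xx - 33)**2) / 100.0)

disp = np.zeros((2, 64, 64))
for _ in range(100):
    disp = demons_step(fixed, warp(moving, disp), disp)
registered = warp(moving, disp)  # plays the role of the "registered initial CT"

mse_before = float(np.mean((moving - fixed)**2))
mse_after  = float(np.mean((registered - fixed)**2))
```

In the study the same idea runs in 3D on CT volumes; the residual misalignment after this step is what determines how accurately the rendered anatomy overlays the laparoscopic view.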
Superior compensation for soft tissue deformations using the described method led to more accurate spatial registration between laparoscopic and rendered CT images with live AR than with conventional AR. Moreover, substitution of low-dose CT with registered initial CT helped with continuous visualization of the vasculature and offered the potential of at least an eightfold reduction in intraoperative X-ray dose.
The authors proposed and developed live AR, a new surgical visualization approach that merges rich surface detail from a laparoscope with instantaneous 3D anatomy from continuous CT scanning of the surgical field. Through innovative use of deformable image registration, they also demonstrated the feasibility of continuous visualization of the vasculature and considerable X-ray dose reduction. This study provides motivation for further investigation and development of live AR.
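Merging the rendered CT with the live laparoscopic image depends on the optically tracked laparoscope's pose: 3D points from the registered CT are projected through the camera model into laparoscope pixel coordinates. A minimal pinhole-projection sketch follows; the intrinsics and pose values are illustrative assumptions, not parameters from the study.

```python
import numpy as np

def project(points_world, R, t, K):
    """Project 3D points (world frame) into pixel coordinates using the
    tracked camera pose (R, t) and pinhole intrinsics K."""
    pc = points_world @ R.T + t      # world -> camera frame
    uv = pc @ K.T                    # camera -> homogeneous pixel coords
    return uv[:, :2] / uv[:, 2:3]    # perspective divide

# Hypothetical intrinsics for a 640x480 laparoscope image.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)        # camera orientation, e.g. from optical tracking
t = np.zeros(3)      # camera position

# A CT-derived point 20 mm ahead on the optical axis lands at image center.
pix = project(np.array([[0.0, 0.0, 20.0]]), R, t, K)
```

With a calibrated, tracked laparoscope, every rendered CT pixel can be placed this way on the live video, which is what makes the registration accuracy of the previous step directly visible to the surgeon.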

  • ABSTRACT: This article presents general principles and recent advances in the clinical application of augmented reality-based navigation surgery (AR-based NS) for abdominal procedures, including a description of our clinical trial and its outcomes. Current problems and future prospects are also discussed. The development of AR-based NS in the abdomen has lagged behind other fields because of intraoperative organ deformation and the existence of established modalities. Although there are few reports on the clinical use of AR-based NS in digestive surgery, sophisticated technologies have often been reported in urology. However, the rapidly spreading use of video- and robot-assisted surgery calls for this technology. We have worked to develop an AR-based NS system for hepatobiliary and pancreatic surgery, and developed a short rigid scope that enables surgeons to obtain a 3D view. We recently focused on pancreatic surgery, because intraoperative organ shifting is minimal there. The position of each organ in the overlaid image corresponded closely with that of the actual organ, with a mean registration error of about 5 mm. Intraoperative information generated by this system provided useful navigation. However, AR-based NS still has several problems to overcome, such as organ deformation, evaluation of utility, portability, and cost.
    Surgery Today 06/2014; 0.96 Impact Factor
  • ABSTRACT: Autostereoscopic 3D image overlay for augmented reality (AR)-based surgical navigation has been studied and reported many times. For surgical overlay, the 3D image is expected to have the same geometric shape as the original organ and to be transformable to a specified location for overlay. However, how to generate a 3D image with high geometric fidelity, and how to evaluate its geometric accuracy quantitatively, have not been addressed. This paper proposes a graphics processing unit (GPU)-based computer-generated integral imaging pipeline for real-time autostereoscopic 3D display, and an automatic closed-loop 3D image calibration paradigm for displaying undistorted 3D images. Based on the proposed methods, a novel AR device for 3D image surgical overlay is presented, consisting mainly of a 3D display, an AR window, a stereo camera for 3D measurement, and a workstation for information processing. Evaluation of the 3D image rendering performance at 2560 × 1600 elemental image resolution shows rendering speeds of 50-60 frames per second (fps) for surface models and 5-8 fps for large medical volumes. Evaluation of the undistorted 3D image after calibration yields sub-millimeter geometric accuracy. A phantom experiment simulating oral and maxillofacial surgery was also performed to evaluate the proposed AR overlay device in terms of image registration accuracy, 3D image overlay accuracy, and the visual effects of the overlay. The experimental results show satisfactory image registration and image overlay accuracy, and confirm the system's usability.
    Computerized Medical Imaging and Graphics 11/2014; 1.50 Impact Factor
  • ABSTRACT: Minimally invasive surgery represents one of the main evolutions of surgical technique. However, it adds difficulty that can be reduced through computer technology. From a patient's medical image (ultrasound, computed tomography (CT), or MRI), we have developed an augmented reality (AR) system that extends the surgeon's intraoperative vision by providing a virtual transparency of the patient. AR is based on two major processes: 3D modeling and visualization of the anatomical or pathological structures appearing in the medical image, and registration of this visualization onto the real patient. We have thus developed a new online service, named Visible Patient, providing efficient 3D modeling of patients. We then developed several 3D visualization and surgical planning software tools that combine direct volume rendering and surface rendering. Finally, we developed two registration techniques, one interactive and one automatic, providing an intraoperative augmented reality view. From January 2009 to June 2013, 769 clinical cases were modeled by the Visible Patient service. Moreover, three clinical validations were performed, demonstrating the accuracy of the 3D models and their great benefit, potentially increasing surgical eligibility in liver surgery (20% of cases). From these 3D models, more than 50 interactive AR-assisted surgical procedures were performed, illustrating the potential clinical benefit of such assistance in terms of safety, as well as current limits that automatic augmented reality will overcome. Virtual patient modeling should become mandatory for certain interventions, which remain to be defined, such as liver surgery. Augmented reality is clearly the next step in surgical instrumentation but currently remains limited by the complexity of organ deformation during surgery. Intraoperative medical imaging used in a new generation of automated augmented reality should solve this issue, thanks to the development of the hybrid OR.
    Hepatobiliary Surgery and Nutrition 04/2014; 3(2):73-81.
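The registration errors quoted in the abstracts above (a mean error of about 5 mm; sub-millimeter overlay accuracy) are typically measured by rigidly aligning corresponding point pairs and computing the residual distance. A minimal sketch of that point-based step follows, using the Kabsch (SVD-based) rigid alignment on synthetic landmark points; it is an illustration of the general technique, not code from any of the cited systems.

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rotation R and translation t with dst ~= src @ R.T + t
    (Kabsch algorithm via SVD)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)            # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                       # guard against reflections
    return R, cd - R @ cs

rng = np.random.default_rng(0)
fiducials = rng.uniform(-50.0, 50.0, (6, 3))   # hypothetical landmarks, in mm
theta = 0.3                                     # known ground-truth pose
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
measured = fiducials @ Rz.T + np.array([5.0, -2.0, 1.0])

R_est, t_est = rigid_register(fiducials, measured)
residual = measured - (fiducials @ R_est.T + t_est)
fre = float(np.sqrt(np.mean(np.sum(residual**2, axis=1))))  # registration error
```

With noise-free correspondences the residual is essentially zero; in the clinical systems above, tracking noise, calibration error, and organ deformation are what push this figure to the millimeter range.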