Live augmented reality: A new visualization method for laparoscopic surgery using continuous volumetric computed tomography
Department of Diagnostic Radiology and Nuclear Medicine, University of Maryland School of Medicine, 22 South Greene Street, Baltimore, MD 21201, USA.
Surgical Endoscopy 02/2010; 24(8):1976-85. DOI: 10.1007/s00464-010-0890-8
Current laparoscopic images are rich in surface detail but lack information on deeper structures. This report presents a novel method for highlighting these structures during laparoscopic surgery using continuous multislice computed tomography (CT). This has resulted in a more accurate augmented reality (AR) approach, termed "live AR," which merges three-dimensional (3D) anatomy from live low-dose intraoperative CT with live images from the laparoscope.
A series of procedures with swine was conducted in a CT room with a fully equipped laparoscopic surgical suite. A 64-slice CT scanner was used to image the surgical field approximately once per second. The procedures began with a contrast-enhanced, diagnostic-quality CT scan (initial CT) of the liver, followed by continuous intraoperative CT and laparoscopic imaging with an optically tracked laparoscope. Intraoperative anatomic changes included user-applied deformations and those from breathing. Through deformable image registration, an intermediate image processing step, the initial CT was warped to align spatially with the low-dose intraoperative CT scans. The registered initial CT was then rendered and merged with laparoscopic images to create live AR.
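The warp-and-merge step described above can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the authors' implementation: it presumes a precomputed per-pixel displacement field (the kind of output deformable registration produces), uses nearest-neighbour resampling on a 2D slice for brevity, and blends the rendered CT with the laparoscopic frame by simple alpha compositing; all function names are hypothetical.

```python
import numpy as np

def warp_image(image, displacement):
    """Warp a 2D image by a per-pixel displacement field.

    displacement[..., 0] is the row offset and displacement[..., 1]
    the column offset of the source sample for each output pixel.
    Nearest-neighbour sampling, clamped at the image border."""
    h, w = image.shape
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    src_y = np.clip(np.rint(ys + displacement[..., 0]).astype(int), 0, h - 1)
    src_x = np.clip(np.rint(xs + displacement[..., 1]).astype(int), 0, w - 1)
    return image[src_y, src_x]

def blend(laparoscope_rgb, rendered_ct_rgb, alpha=0.4):
    """Alpha-composite a rendered CT overlay onto the live frame."""
    return (1.0 - alpha) * laparoscope_rgb + alpha * rendered_ct_rgb
```

A zero displacement field leaves the image unchanged, which is a convenient sanity check before plugging in a real registration result; in practice, trilinear interpolation on the full 3D volume would replace the nearest-neighbour lookup.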
Superior compensation for soft tissue deformations using the described method led to more accurate spatial registration between laparoscopic and rendered CT images with live AR than with conventional AR. Moreover, substitution of low-dose CT with registered initial CT helped with continuous visualization of the vasculature and offered the potential of at least an eightfold reduction in intraoperative X-ray dose.
The authors proposed and developed live AR, a new surgical visualization approach that merges rich surface detail from a laparoscope with instantaneous 3D anatomy from continuous CT scanning of the surgical field. Through innovative use of deformable image registration, they also demonstrated the feasibility of continuous visualization of the vasculature and considerable X-ray dose reduction. This study provides motivation for further investigation and development of live AR.
- "In endoscopic and laparoscopic surgery, a telescopic rod lens with a video camera is used to guide surgical instruments to the patient anatomy through small incisions or keyholes. Laparoscopic systems were specified for liver, digestive, abdominal, prostate, urologic and robotic surgery, or simply general laparoscopic surgery. Endoscopic robotic systems were described by Suzuki et al. and Sudra et al."
ABSTRACT: This paper presents a review of the state of the art of visualization in mixed reality image guided surgery (IGS). We used the DVV (data, visualization processing, view) taxonomy to classify a large unbiased selection of publications in the field. The goal of this work was not only to give an overview of current visualization methods and techniques in IGS but more importantly to analyze the current trends and solutions used in the domain. In surveying the current landscape of mixed reality IGS systems, we identified a strong need to assess which of the many possible data sets should be visualized at particular surgical steps, to focus on novel visualization processing techniques and interface solutions, and to evaluate new systems.
Computerized medical imaging and graphics: the official journal of the Computerized Medical Imaging Society 03/2013; 37(2). DOI: 10.1016/j.compmedimag.2013.01.009
ABSTRACT: In this paper, we present a new method for adaptive IIR filtering that promises improved performance under general operating conditions, including both colored and white noise, and both sufficient- and insufficient-order filter cases. By defining the regressor in the adaptive IIR filter update to be a convex combination of the regressors for the Steiglitz-McBride method (SMM) and the recursive prediction error method (RPEM), we are able to trade off the benefits of each. RPEM minimizes the mean square output error (MSOE) directly, and thus has a slow convergence rate and may converge to a local minimum because of the nonconvexity and multimodality of the MSOE surface; SMM converges fast, but may converge to a biased solution or diverge in colored-noise environments. Other composite methods (e.g., the composite regressor method (CRM)) use a similar approach, but focus only on reducing the bias of equation-error estimates for sufficient-order filters in white-noise environments. Conversely, our method, CPRM, extends to general environments, prevents divergence, reduces bias, and increases the likelihood of convergence to the global minimum.
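The convex-combination idea at the heart of this approach can be shown schematically. This is a toy sketch, not the actual SMM/RPEM machinery: `u_smm` and `u_rpem` stand in for the two methods' regressor vectors (whose construction is the substance of the paper), the update is a generic gradient-style correction, and all names are illustrative.

```python
import numpy as np

def composite_update(theta, u_smm, u_rpem, error, lam=0.5, mu=0.01):
    """One parameter update using a convex combination of two regressors.

    lam = 1 recovers a pure SMM-style update, lam = 0 a pure RPEM-style
    update; intermediate values trade off the properties of the two."""
    u = lam * np.asarray(u_smm) + (1.0 - lam) * np.asarray(u_rpem)
    return np.asarray(theta) + mu * error * u
```

Because the combination is convex (weights sum to one and are non-negative), the composite regressor always lies on the line segment between the two candidate regressors, which is what lets the method interpolate between the convergence behaviors of SMM and RPEM.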
1994 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP-94); 05/1994
ABSTRACT: Current-generation minimally invasive surgeries present many visualization challenges, including two-dimensional representation of three-dimensional anatomy and a lack of visualization of deeply recessed structures. Coupled with the loss of tactile feedback, which places greater emphasis on available visual cues, improved surgical visualization remains a long-standing need. Our response to this need is Live Augmented Reality (Live AR), in which processed images from live radiologic scans of the surgical field are merged with optical images, accounting for spatial and temporal registration. We have demonstrated the feasibility of Live AR, but its clinical implementation is hampered by many current technical limitations. In this report, we present guidance of a simple laparoscopic maneuver based entirely on computed tomography (CT) scanning and rapid 3D rendering of the acquired images in the CT room. The capability developed here and the reported results constitute a step toward the eventual goal of routine clinical implementation of Live AR.
11th International Conference on Control, Automation, Robotics and Vision, ICARCV 2010, Singapore, 7-10 December 2010, Proceedings; 01/2010