Live augmented reality: a new visualization method for laparoscopic surgery using continuous volumetric computed tomography.
ABSTRACT: Current laparoscopic images are rich in surface detail but lack information on deeper structures. This report presents a novel method for highlighting these structures during laparoscopic surgery using continuous multislice computed tomography (CT). This has resulted in a more accurate augmented reality (AR) approach, termed "live AR," which merges three-dimensional (3D) anatomy from live low-dose intraoperative CT with live images from the laparoscope.
A series of swine procedures was conducted in a CT room equipped as a full laparoscopic surgical suite. A 64-slice CT scanner imaged the surgical field approximately once per second. Each procedure began with a contrast-enhanced, diagnostic-quality CT scan of the liver (the initial CT), followed by continuous intraoperative CT and laparoscopic imaging with an optically tracked laparoscope. Intraoperative anatomic changes included user-applied deformations and those caused by breathing. In an intermediate image processing step, deformable image registration warped the initial CT to align spatially with the low-dose intraoperative CT scans. The registered initial CT was then rendered and merged with the laparoscopic images to create live AR.
Superior compensation for soft tissue deformations using the described method led to more accurate spatial registration between laparoscopic and rendered CT images with live AR than with conventional AR. Moreover, substitution of low-dose CT with registered initial CT helped with continuous visualization of the vasculature and offered the potential of at least an eightfold reduction in intraoperative X-ray dose.
The authors proposed and developed live AR, a new surgical visualization approach that merges rich surface detail from a laparoscope with instantaneous 3D anatomy from continuous CT scanning of the surgical field. Through innovative use of deformable image registration, they also demonstrated the feasibility of continuous visualization of the vasculature and considerable X-ray dose reduction. This study provides motivation for further investigation and development of live AR.
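The final step of the pipeline described above composites a rendered CT image over the live laparoscopic frame. As an illustration only, this merging can be sketched as a simple alpha blend; the function name, blending weight, and image shapes below are assumptions for the example, not the authors' actual implementation.

```python
import numpy as np

def blend_ar_overlay(laparoscope_frame, rendered_ct, alpha=0.4):
    """Composite a rendered CT image over a laparoscopic frame.

    Both inputs are float arrays in [0, 1] with identical shape
    (H, W, 3); `alpha` is the opacity of the CT overlay. This is a
    plain alpha blend, an illustrative stand-in for the merging step.
    """
    laparoscope_frame = np.asarray(laparoscope_frame, dtype=float)
    rendered_ct = np.asarray(rendered_ct, dtype=float)
    return (1.0 - alpha) * laparoscope_frame + alpha * rendered_ct

# Example: a dark camera frame blended with a bright rendered vessel map.
frame = np.full((4, 4, 3), 0.2)
ct = np.full((4, 4, 3), 0.8)
merged = blend_ar_overlay(frame, ct, alpha=0.5)
print(merged[0, 0, 0])  # 0.5
```

In practice the registered CT would be rendered from the tracked laparoscope's viewpoint before blending; the alpha blend shown here only covers the final compositing.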
ABSTRACT: Laparoscopic liver surgery is particularly challenging owing to restricted access, risk of bleeding, and lack of haptic feedback. Navigation systems have the potential to improve information on the exact position of intrahepatic tumors, and thus facilitate oncological resection. This study aims to evaluate the feasibility of a commercially available augmented reality (AR) guidance system employing intraoperative robotic C-arm cone-beam computed tomography (CBCT) for laparoscopic liver surgery. A human liver-like phantom with 16 target fiducials was used to evaluate the Syngo iPilot® AR system. Subsequently, the system was used for the laparoscopic resection of a hepatocellular carcinoma in segment 7 of a 50-year-old male patient. In the phantom experiment, the AR system showed a mean target registration error of 0.96 ± 0.52 mm, with a maximum error of 2.49 mm. The patient successfully underwent the operation and showed no postoperative complications. The use of intraoperative CBCT and AR for laparoscopic liver resection is feasible and could be considered an option for future liver surgery in complex cases. Surgical Endoscopy, 11/2013.
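The phantom evaluation above reports target registration error (TRE) as mean ± standard deviation with a maximum. A minimal sketch of how such statistics are computed from paired fiducial positions is shown below; the coordinates are made-up illustrative values, not the study's data.

```python
import math

def target_registration_errors(measured, reference):
    """Euclidean distance between each registered fiducial and its
    ground-truth position (one error per fiducial, in mm)."""
    return [
        math.dist(m, r)  # Python 3.8+: Euclidean distance
        for m, r in zip(measured, reference)
    ]

def summarize(errors):
    """Mean, sample standard deviation, and maximum of the errors."""
    n = len(errors)
    mean = sum(errors) / n
    sd = math.sqrt(sum((e - mean) ** 2 for e in errors) / (n - 1))
    return mean, sd, max(errors)

# Illustrative fiducial positions (mm).
reference = [(0, 0, 0), (10, 0, 0), (0, 10, 0), (0, 0, 10)]
measured  = [(0.5, 0, 0), (10, 1.0, 0), (0, 10, 2.0), (0, 0, 10)]
mean, sd, worst = summarize(target_registration_errors(measured, reference))
print(f"TRE {mean:.2f} ± {sd:.2f} mm, max {worst:.2f} mm")
```

With 16 fiducials, as in the phantom study, the same per-fiducial distances would be summarized in exactly this way to yield the reported mean ± SD and maximum.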
ABSTRACT: Minimally invasive surgery represents one of the main evolutions of surgical technique. However, minimally invasive surgery adds difficulty that can be reduced through computer technology. From a patient's medical image (ultrasound [US], computed tomography [CT], or MRI), we have developed an Augmented Reality (AR) system that augments the surgeon's intraoperative vision by providing a virtual transparency of the patient. AR is based on two major processes: 3D modeling and visualization of anatomical or pathological structures appearing in the medical image, and the registration of this visualization onto the real patient. We have thus developed a new online service, named Visible Patient, providing efficient 3D modeling of patients. We have then developed several 3D visualization and surgical planning software tools that combine direct volume rendering and surface rendering. Finally, we have developed two registration techniques, one interactive and one automatic, providing an intraoperative augmented reality view. From January 2009 to June 2013, 769 clinical cases were modeled by the Visible Patient service. Moreover, three clinical validations were performed, demonstrating the accuracy of the 3D models and their substantial benefit, potentially increasing surgical eligibility in liver surgery (20% of cases). From these 3D models, more than 50 interactive AR-assisted surgical procedures have been performed, illustrating the potential clinical benefit of such assistance in improving safety, but also the current limits that automatic augmented reality will have to overcome. Virtual patient modeling should be mandatory for certain interventions that now have to be defined, such as liver surgery. Augmented reality is clearly the next step in surgical instrumentation but currently remains limited owing to the complexity of organ deformations during surgery.
Intraoperative medical imaging used in a new generation of automated augmented reality should solve this issue thanks to the development of the hybrid OR. Hepatobiliary Surgery and Nutrition, 04/2014; 3(2):73-81.
ABSTRACT: The minimally invasive surgeon cannot rely on the sense of touch to orient surgical resection or to identify important structures (vessels, tumors, etc.) by manual palpation. Robotic research has provided technology to facilitate laparoscopic surgery; however, robotics has yet to solve the lack of tactile feedback inherent to keyhole surgery. Misinterpretation of the vascular supply and tumor location may increase the risk of intraoperative bleeding and compromise dissection, with positive resection margins. Augmented reality (AR) consists of the fusion of synthetic computer-generated images (a three-dimensional virtual model) obtained from the preoperative medical imaging work-up with real-time patient images, with the aim of visualizing unapparent anatomical details. In this article, we review the most common modalities used to achieve surgical navigation through AR, along with a report of a case of robotic duodenopancreatectomy using AR guidance complemented with fluorescence guidance. The presentation of this complex, high-technology case of robotic duodenopancreatectomy, and the overview of the current technology that has made it possible to use AR in the operating room, highlight the need for further evolution and the windows of opportunity to create a new paradigm in surgical practice. Surgical Endoscopy, 03/2014.
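The abstracts above repeatedly describe registering a virtual model onto patient images without detailing the math. One common closed-form building block for such point-based registration (when corresponding landmarks are available) is the Kabsch/SVD algorithm for the least-squares rigid transform. The sketch below uses synthetic points and is offered only as an illustration of that general technique, not as any of the cited systems' actual registration method.

```python
import numpy as np

def rigid_register(source, target):
    """Least-squares rigid transform (R, t) mapping `source` landmarks
    onto `target` landmarks via the Kabsch/SVD algorithm.

    source, target: (N, 3) arrays of corresponding points.
    Returns R (3x3 rotation) and t (3-vector) with target ≈ source @ R.T + t.
    """
    src = np.asarray(source, dtype=float)
    tgt = np.asarray(target, dtype=float)
    src_c = src - src.mean(axis=0)          # center both point sets
    tgt_c = tgt - tgt.mean(axis=0)
    H = src_c.T @ tgt_c                     # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Synthetic check: rotate + translate known landmarks, then recover.
rng = np.random.default_rng(0)
pts = rng.random((6, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([1.0, -2.0, 0.5])
moved = pts @ R_true.T + t_true
R, t = rigid_register(pts, moved)
print(np.allclose(R, R_true), np.allclose(t, t_true))  # True True
```

Deformable organs, as the abstracts note, violate the rigidity assumption, which is why intraoperative imaging and deformable registration are needed beyond this closed-form step.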