Virtual Neck Exploration for Parathyroid Adenomas: A First Step Toward Minimally Invasive Image-Guided Surgery

IRCAD/Institut Hospitalo-Universitaire, Strasbourg, France.
JAMA SURGERY (Impact Factor: 4.3). 03/2013; 148(3):232-8; discussion 238. DOI: 10.1001/jamasurg.2013.739
Source: PubMed

ABSTRACT
OBJECTIVE To evaluate the performance of 3-dimensional (3D) virtual neck exploration (VNE) as a modality for preoperative localization of parathyroid adenomas in primary hyperparathyroidism and to assess the feasibility of using augmented reality to guide parathyroidectomy as a step toward minimally invasive image-guided surgery.
DESIGN Enhanced 3D rendering methods can be used to transform computed tomographic scan images into a model for 3D VNE. In addition to a standard imaging modality, 3D VNE was performed in all patients and used to preoperatively plan minimally invasive parathyroidectomy. All preoperative localization studies were analyzed for their sensitivity, specificity, positive predictive value, and negative predictive value for the correct side of the adenoma(s) (lateralization) and the correct quadrant of the neck (localization). The 3D VNE model was used to generate intraoperative augmented reality in 3 cases.
SETTING Tertiary care center.
PATIENTS A total of 114 consecutive patients with primary hyperparathyroidism were included from January 8, 2008, through July 26, 2011.
RESULTS The accuracy of 3D VNE in lateralization and localization was 77.2% and 64.9%, respectively. Virtual neck exploration had superior sensitivity to ultrasonography (P < .001), sestamibi scanning (P = .07), and standard computed tomography (P < .001). Use of the 3D model for intraoperative augmented reality was feasible.
CONCLUSIONS 3-Dimensional VNE is an excellent tool for the preoperative localization of parathyroid adenomas, with sensitivity, specificity, and diagnostic accuracy commensurate with accepted first-line imaging modalities. The added value of 3D VNE includes enhanced preoperative planning and intraoperative augmented reality to enable less invasive image-guided surgery.
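The sensitivity, specificity, and predictive values reported above are standard diagnostic-test statistics derived from a 2x2 confusion matrix. As a minimal sketch (the counts below are hypothetical placeholders for illustration, not the study's data):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Return sensitivity, specificity, PPV, and NPV from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),  # true-positive rate
        "specificity": tn / (tn + fp),  # true-negative rate
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical example counts (illustrative only, not the study's data):
# 88 adenomas correctly lateralized, 12 missed, 8 false calls, 6 true negatives.
m = diagnostic_metrics(tp=88, fp=8, tn=6, fn=12)
print({k: round(v, 3) for k, v in m.items()})
```

Each localization modality (ultrasonography, sestamibi, CT, 3D VNE) yields its own set of counts, which is how the per-modality comparisons in the results are made.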

  • ABSTRACT: Surgical innovation relies on patient safety and quality of life, which require a drastic reduction in iatrogenic impact. A parallel development toward less invasive approaches has occurred in surgery, interventional radiology, and endoscopy. Minimally invasive techniques provide unquestionable benefits to patients in terms of postoperative outcome. However, these techniques are not intuitive, and extensive training is required to overcome their inherent challenges and become proficient; the learning curve is consequently steep. Computer science and robotics departments have developed technologies that might improve minimally invasive techniques. A new concept of cyber therapies is emerging through the development of computer and robotic sciences aiming at human-machine integration. Additionally, the convergence of surgery, endoscopy, and interventional radiology toward a hybrid therapeutic modality, namely image-guided minimally invasive procedures, holds promise insofar as it could maximize benefits in terms of efficacy and iatrogenic impact. In the present manuscript, the mainstays of these new paradigm developments are briefly outlined in light of our experience and vision of the future.
    World Journal of Surgery 11/2014; 39(3). DOI:10.1007/s00268-014-2879-2 · 2.35 Impact Factor
  • ABSTRACT: Augmented reality (AR) in surgery consists of the fusion of synthetic computer-generated images (a 3D virtual model), obtained from the preoperative medical imaging workup, with real-time patient images in order to visualize unapparent anatomical details. The 3D model can also be used for preoperative planning of the procedure. The potential of AR navigation as a tool to improve the safety of surgical dissection is outlined for robotic hepatectomy.
    Langenbeck's Archives of Surgery 11/2014; DOI:10.1007/s00423-014-1256-9 · 2.16 Impact Factor
  • ABSTRACT: The minimally invasive surgeon cannot use the sense of touch to orient surgical resection, identifying important structures (vessels, tumors, etc.) by manual palpation. Robotic research has provided technology to facilitate laparoscopic surgery; however, robotics has yet to solve the lack of tactile feedback inherent to keyhole surgery. Misinterpretation of the vascular supply and tumor location may increase the risk of intraoperative bleeding and worsen dissection, with positive resection margins. Augmented reality (AR) consists of the fusion of synthetic computer-generated images (a three-dimensional virtual model), obtained from the preoperative medical imaging work-up, with real-time patient images, with the aim of visualizing unapparent anatomical details. In this article, we review the most common modalities used to achieve surgical navigation through AR, along with a report of a case of robotic duodenopancreatectomy using AR guidance complemented with fluorescence guidance. The presentation of this complex, high-technology case of robotic duodenopancreatectomy, and the overview of the current technology that has made it possible to use AR in the operating room, highlight the need for further evolution and the windows of opportunity to create a new paradigm in surgical practice.
    Surgical Endoscopy 03/2014; 28(8). DOI:10.1007/s00464-014-3465-2 · 3.31 Impact Factor
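The image fusion these abstracts describe, overlaying a rendered 3D virtual model on a real-time patient view, can be sketched as a simple masked alpha blend. This is a toy illustration under assumed inputs (a synthetic "camera frame" and "rendering" array), standing in for the registration and compositing a real AR pipeline performs:

```python
import numpy as np

def overlay_virtual_model(frame, rendering, alpha=0.5):
    """Alpha-blend a rendered 3D model onto a video frame.

    Blends only where the rendering is non-black; elsewhere the original
    frame shows through unchanged.
    """
    frame = frame.astype(np.float32)
    rendering = rendering.astype(np.float32)
    mask = rendering.sum(axis=-1, keepdims=True) > 0  # model pixels only
    blended = np.where(mask, alpha * rendering + (1 - alpha) * frame, frame)
    return blended.astype(np.uint8)

# Toy example: a gray 4x4 "camera frame" with a red "virtual vessel" patch.
frame = np.full((4, 4, 3), 100, dtype=np.uint8)
rendering = np.zeros((4, 4, 3), dtype=np.uint8)
rendering[1:3, 1:3] = (255, 0, 0)
out = overlay_virtual_model(frame, rendering)
print(out[1, 1], out[0, 0])  # blended model pixel vs. untouched background pixel
```

In practice the rendering would first be registered to the patient (camera calibration plus rigid or deformable alignment of the CT-derived model); the blend itself is the final, and simplest, step.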