Conference Paper

A Navigation System for Augmenting Laparoscopic Ultrasound

DOI: 10.1007/978-3-540-39903-2_23 Conference: Medical Image Computing and Computer-Assisted Intervention - MICCAI 2003, 6th International Conference, Montréal, Canada, November 15-18, 2003, Proceedings, Part II
Source: DBLP

ABSTRACT: Establishing image context is the major difficulty in performing laparoscopic ultrasound. The standard techniques that
transabdominal ultrasonographers use to understand image orientation are difficult to apply with laparoscopic instruments. In
this paper, we describe a navigation system that displays the position and orientation of laparoscopic ultrasound images to
the operating surgeon in real time. The display technique we developed for showing the orientation information uses a 3D model
of the aorta as the main visual reference. This technique is helpful because it provides surgeons with important spatial cues,
which we show improve their ability to interpret the laparoscopic ultrasound images.
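
To make the real-time display concrete: systems of this kind typically stream the probe pose from an optical or electromagnetic tracker and compose it with calibration and registration transforms so that the ultrasound image plane lands in the same coordinate frame as the preoperative 3D model. The sketch below illustrates that standard transform chain; the identity placeholder transforms, function names, and image dimensions are ours for illustration and are not taken from the paper.

    import numpy as np

    # Illustrative placeholder transforms (4x4 homogeneous). In a real system,
    # T_image_to_probe comes from ultrasound probe calibration,
    # T_probe_to_tracker is streamed by the tracker every frame, and
    # T_tracker_to_model comes from registering the tracker frame to the
    # preoperative model (here, the segmented aorta).
    T_image_to_probe = np.eye(4)
    T_probe_to_tracker = np.eye(4)
    T_tracker_to_model = np.eye(4)

    def image_corners_in_model(width_mm, depth_mm):
        """Map the ultrasound image corners into model coordinates so the
        image can be rendered as a textured quad beside the 3D model."""
        corners = np.array([[0.0,      0.0,      0.0, 1.0],
                            [width_mm, 0.0,      0.0, 1.0],
                            [width_mm, depth_mm, 0.0, 1.0],
                            [0.0,      depth_mm, 0.0, 1.0]]).T
        chain = T_tracker_to_model @ T_probe_to_tracker @ T_image_to_probe
        return (chain @ corners)[:3].T  # four (x, y, z) corners in model space

    print(image_corners_in_model(40.0, 80.0))

With real tracker and calibration values in place of the identity matrices, the returned corners position the live ultrasound plane relative to the anatomical reference each frame.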

Cited by:
    • "Kawamata et al. [76] visualize the anatomical context by drawing virtual objects in a larger area of the screen than endoscope images are available. Ellsmere and colleagues [77] suggest augmenting laparoscopic ultrasound images into CT slices and using segmented CT data for improved context sensing. "
    [Show abstract] [Hide abstract]
    ABSTRACT: The impressive development of medical imaging technology during the last decades provided physicians with an increasing amount of patient specific anatomical and functional data. In addition, the increasing use of non-ionizing real-time imaging, in particular ultrasound and optical imaging, during surgical procedures created the need for design and development of new visualization and display technology allowing physicians to take full advantage of rich sources of heterogeneous preoperative and intraoperative data. During 90's, medical augmented reality was proposed as a paradigm bringing new visualization and interaction solutions into perspective. This paper not only reviews the related literature but also establishes the relationship between subsets of this body of work in medical augmented reality. It finally discusses the remaining challenges for this young and active multidisciplinary research community.
    Journal of Display Technology 01/2009; 4(4-4):451 - 467. DOI:10.1109/JDT.2008.2001575 · 1.69 Impact Factor
  • "Several high quality, low cost, portable (battery powered or low power) instruments are now commercially available, creating the opportunity to design such a system for portable casualty care. However, since ultrasound imaging requires a level of care and sophistication that makes it unlikely that a minimally trained, unsupported operator could perform a diagnostic examination under field conditions, tools for 3D visualization, guidance and telemedicine using the toolset of Image Guided Surgery become a necessity [2]."
    ABSTRACT: We present a 3D anatomically guided diagnostic system to detect internal bleeding of patients in the field. Our solution employs high-fidelity digital human models to help medics with minimal training find anatomical structures and subsequently obtain high-quality ultrasound scans that may be shared with doctors located remotely.
    First International Conference on Advances in Computer-Human Interaction, 2008; 03/2008
  • "CAS techniques, whether they enhance traditional methods (e.g. image visualization [35]) or provide new tools such as augmented displays [10], share a common need for an OR-compatible UI. In [7], Cleary reports on a large-scale survey conducted at a workshop on the future of spinal surgery."
    ABSTRACT: We propose an architecture for a real-time multimodal system, which provides non-contact, adaptive user interfacing for Computer-Assisted Surgery (CAS). The system, called M/ORIS (for Medical/Operating Room Interaction System), combines gesture interpretation as an explicit interaction modality with continuous, real-time monitoring of the surgical activity in order to automatically address the surgeon's needs. Such a system will help reduce a surgeon's workload and operation time. This paper focuses on the proposed activity-monitoring aspect of M/ORIS. We analyze the issues of Human-Computer Interaction in an OR based on real-world case studies. We then describe how we intend to address these issues by combining a surgical procedure description with parameters gathered from vision-based surgeon tracking and other OR sensors (e.g. tool trackers). We call this approach Scenario-based Procedure and Activity Monitoring (SPAM). We finally present preliminary results, including a non-contact mouse interface for surgical navigation systems.
    Proceedings of the 6th International Conference on Multimodal Interfaces, ICMI 2004, State College, PA, USA; 10/2004
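
As a rough illustration of the scenario-based monitoring idea described in the entry above: an ordered procedure description is matched against incoming OR sensor events, advancing through expected steps and flagging anything out of sequence. The step and event names below are hypothetical, not taken from the M/ORIS paper, and a real monitor would fuse far richer tracking data.

    from dataclasses import dataclass

    @dataclass
    class Step:
        name: str
        expected_event: str  # hypothetical sensor event confirming this step

    # Hypothetical procedure description; the real system combines this kind
    # of model with vision-based surgeon tracking and OR tool trackers.
    SCENARIO = [
        Step("patient_prep", "table_occupied"),
        Step("port_placement", "trocar_tracked"),
        Step("ultrasound_survey", "us_probe_tracked"),
        Step("closing", "suture_tool_tracked"),
    ]

    def monitor(events):
        """Advance through the scenario as expected events arrive; report
        anything unexpected so the interface can adapt to the surgeon."""
        idx = 0
        for ev in events:
            if idx < len(SCENARIO) and ev == SCENARIO[idx].expected_event:
                print(f"confirmed step: {SCENARIO[idx].name}")
                idx += 1
            else:
                current = SCENARIO[min(idx, len(SCENARIO) - 1)].name
                print(f"unexpected event '{ev}' during step: {current}")

    monitor(["table_occupied", "trocar_tracked", "phone_ring", "us_probe_tracked"])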