AR-Assisted Surgical Guidance System for Ventriculostomy
Sangjun Eom*
Department of Electrical and
Computer Engineering
Duke University
Seijung Kim†
Department of Computer
Science
Duke University
Shervin Rahimpour‡
Department of Neurosurgery
University of Utah
Maria Gorlatova§
Department of Electrical and
Computer Engineering
Duke University
Figure 1: The overall setup of our system, using OptiTrack cameras for real-time tracking and a HoloLens 2 as the AR device (left), and the AR view with ventricle and EVD catheter holograms overlaid for surgical guidance and a cube hologram used for localization (right).
ABSTRACT
Augmented Reality (AR) is increasingly used in medical applications
for visualizing medical information. In this paper, we present an AR-
assisted surgical guidance system that aims to improve the accuracy
of catheter placement in ventriculostomy, a common neurosurgical
procedure. We build upon previous work on neurosurgical AR,
which has focused on enabling the surgeon to visualize a patient’s
ventricular anatomy, to additionally integrate surgical tool tracking
and contextual guidance. Specifically, using accurate tracking of
optical markers via an external multi-camera OptiTrack system, we
enable Microsoft HoloLens 2-based visualizations of ventricular
anatomy, catheter placement, and the information on how far the
catheter tip is from its target. We describe the system we developed,
present initial hologram registration results, and comment on the
next steps that will prepare our system for clinical evaluations.
Index Terms: Human-centered computing—Human computer
interaction (HCI)—Interaction paradigms—Mixed / augmented real-
ity;
1 INTRODUCTION
The surgical field of view can often be limited when looking through
an endoscope or working through a narrow incision. Especially in
neurosurgery, a surgeon often prefers to take minimally invasive
approaches to avoid inadvertent injuries to vascular or nervous struc-
tures due to a limited field of view [6]. In the case of bedside cranial
procedures, the surgeons rely on their expertise, external anatomical
landmarks, and imaging such as computed tomography (CT) to “see
through” the skull. Among these neurosurgical procedures, ven-
triculostomy involves the placement of an external ventricular drain
(EVD), a procedure that entails a twist drill craniotomy and subsequent placement of a catheter into the lateral ventricles to drain cerebrospinal fluid.
*E-mail: sangjun.eom@duke.edu
†E-mail: seijung.kim@duke.edu
‡E-mail: shervin.rahimpour@utah.edu
§E-mail: maria.gorlatova@duke.edu
EVD placement is one of the most com-
mon neurosurgical procedures, practiced more than 20,000 times
annually in the U.S. alone [7]. However, while this procedure relies
on external landmarks, internal ventricular anatomy can vary by pa-
tient and pathology. Hence, we developed an AR-assisted guidance
system for neurosurgery with EVD as the target application.
Our AR-assisted surgical guidance system, shown in Fig. 1, uses
an external 6-camera OptiTrack system for real-time tracking of a
collection of optical markers, attached to different objects, enabling
Microsoft HoloLens 2-based intraoperative visualization of both the hologram of the patient's ventricles and the hologram of the inserted
EVD catheter. OptiTrack is often used for precise motion capture in
video games, movie production, and Virtual Reality (VR). Unlike
previous work, which uses a single fiducial marker to display only the hologram of the patient's ventricles, our system's use
of multiple optical markers allows displaying multiple holograms,
including holograms of moving objects (i.e., the EVD catheter).
We enable OptiTrack and HoloLens 2 to work together via the
transformation of their coordinate systems. Towards this, we devel-
oped a localization procedure that uses a combination of optical and
fiducial markers. In this paper, we compare the results for two vari-
ants of this procedure. We also demonstrate a complete workflow for
AR-assisted ventriculostomy, including the generation of a ventricu-
lar hologram from a patient’s CT scan, tracking and holographically
representing the catheter, and calculating the distance from the tip
of the inserted catheter to the point the neurosurgeons aim to reach
(ipsilateral foramen of Monro). We believe that our system is the
first to integrate tool tracking with image registration in EVD for
adding more contextual guidance to AR-assisted ventriculostomy.¹
In the rest of this paper, we discuss the related work in Section 2,
describe our system architecture in Section 3, detail our system’s
AR-based visualizations in Section 4, share our preliminary results in
Section 5, and comment on our future work directions in Section 6.
2 RELATED WORK
Image registration, specifically overlaying a 3D model of the preop-
erative scan of the patient with the surgeon’s view, has been explored
for different types of surgeries [4,5,10]. The most common approach
¹ The video of AR-assisted ventriculostomy is provided at https://sites.duke.edu/sangjuneom/arevd/
Table 1: Accuracy results of instrument manipulation on a hologram-overlaid target in various surgical applications.

Paper | Area | Device | Result (mm)
Schneider et al. [10] | EVD | HoloLens 1 | 5.20 ± 2.60
Li et al. [5] | EVD | HoloLens 1 | 4.34 ± 1.63
Andress et al. [1] | Orthopedic | HoloLens 1 | 4.47 ± 2.91
Rose et al. [9] | Otolaryngology | HoloLens 1 | 2.47 ± 0.46
to it is the use of fiducial or optical markers to determine the holo-
gram overlay location [2]. Fiducial markers within the Vuforia
engine, which is natively integrated with the Microsoft HoloLens
software stack, have been used in several neurosurgical AR applica-
tions [10]. This method achieves a mean hologram registration error
of 1-3mm. Similar results have been demonstrated with other types
of fiducial markers, namely ARToolKit [1, 8] and AprilTags [3].
The use of a single fiducial marker limits the line of sight and
robustness of marker detection. For example, Vuforia uses feature
extraction for identifying trackable features and ARToolKit uses
contour detection for identifying corners of fiducial markers. Both
algorithms require the fiducial marker to be in the line of sight and the marker pattern to be clearly visible to the camera, thus compromising
the robustness of marker detection. These limitations affect the
accuracy of hologram alignments for image registration.
The OptiTrack system overcomes both limitations by expanding
the line of sight to a broader angle with multiple cameras, and
providing robust real-time tracking with multiple optical markers
while maintaining high accuracy of hologram alignment. Multiple
optical markers were used with a flatbed camera system in [11]; however, instead of an AR device, an external monitor was used to display an augmented video to assist the surgeons.
Various types of surgery involve instrument manipulation tasks in which image registration provides guidance that improves task accuracy. EVD placement
(which is typically a freehand procedure) with no visualization of
internal anatomy can result in inaccurate placement of the catheter,
especially for more junior surgeons [7]. The overlay of a ventricular
hologram has been shown to improve the accuracy of catheter place-
ment. Table 1 shows the accuracy results of instrument manipulation
for various applications. In particular, the EVD application reported
a reduction in the distance between the tip of the catheter and its
intended target from 11.26 mm [5] to 4-5 mm [5, 10]. Hologram drift over time or under erratic user movement, as well as a low first-attempt success rate, remain challenges for image registration. A mean drift of 1.41 mm using Vuforia marker detection was reported for EVD [10].
Integrating tool tracking with image registration can provide additional contextual guidance to the surgeons [12]. In EVD,
both the angle and the inserted length of the catheter are impor-
tant for its accurate placement, yet surgeons have a limited view
of the catheter that is inserted. A holographic representation of the
catheter can provide an important visual reference for the surgeon.
In this work, we enable catheter tracking and demonstrate catheter
tracking-based contextual guidance.
3 OVERALL ARCHITECTURE
In this section, we describe our system’s hardware specifications, our
approach to tracking different objects that comprise our integrated
setup, and the transformation of world coordinate systems between
the two core components of our system, HoloLens 2 and OptiTrack.
Hardware Setup: The proposed setup includes a surgeon-worn HoloLens 2 and six Flex 3 OptiTrack cameras, shown in Fig. 1, whose lenses have a 57.5-degree field of view and an 800 nm long-pass infrared (IR) filter. The OptiTrack cameras are distributed around the table to capture a full 360-degree angular view for stable
Figure 2: (a) 3D-printed localization marker with both fiducial (ARToolKit) and optical markers attached. (b) H-shaped 3D-printed mount for the catheter, attached to the inner stylet. (c) Dimensions of the phantom model, with eight optical markers attached, and Kocher's points. (d) A mold filled with jello inside the phantom model.
and accurate tracking of optical markers. We use the OptiTrack
cameras to track all objects including the HoloLens 2; tracked object
positions and orientations are transmitted back to the HoloLens 2
for visualization within our Unity-based AR app. The HoloLens 2 itself tracks only one fiducial marker (e.g., the ARToolKit-based marker shown in Fig. 2a), which it uses to localize and transform its world coordinates into OptiTrack's coordinate system.
The HoloLens 2 and the OptiTrack system communicate via a desktop-based server set up with the Motive Application Programming Interface (API). The server receives marker positions from the OptiTrack cameras 100 times per second via wired USB connections. Tracked objects' coordinates, calculated on the server, are transmitted to the HoloLens 2 wirelessly in real time at the same rate. These settings achieve imperceptible latency and
minimize visual artifacts that can be associated with lower frame-
rate object tracking (spatial jitter, judder, unexpected disappearance
of holograms).
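For concreteness, the relay loop described above can be sketched as follows. This is a minimal illustration in Python, not our actual implementation: the pose source (get_rigid_body_poses), the HoloLens address, and the JSON-over-UDP packet format are stand-in assumptions for the Motive API streaming and the Unity-side client we actually use.

```python
import json
import socket
import time

UPDATE_RATE_HZ = 100                  # OptiTrack streams marker positions 100 times per second
HOLOLENS_ADDR = ("10.0.0.5", 9000)    # hypothetical address of the HoloLens 2 client app


def get_rigid_body_poses():
    """Stand-in for the tracking backend (e.g., the Motive API).

    Returns a dict mapping rigid-body names to (position, quaternion) pairs
    expressed in OptiTrack's world coordinates; values here are examples.
    """
    return {
        "hololens":   ([0.00, 1.60, 0.00], [0.0, 0.0, 0.0, 1.0]),
        "loc_marker": ([0.20, 1.00, 0.50], [0.0, 0.0, 0.0, 1.0]),
        "phantom":    ([0.00, 1.00, 0.60], [0.0, 0.0, 0.0, 1.0]),
        "catheter":   ([0.05, 1.20, 0.55], [0.0, 0.0, 0.0, 1.0]),
    }


def relay_poses():
    """Send one compact pose packet per tracking update to the HoloLens 2."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    period = 1.0 / UPDATE_RATE_HZ
    while True:
        start = time.time()
        poses = get_rigid_body_poses()
        packet = json.dumps({name: {"pos": p, "rot": q} for name, (p, q) in poses.items()})
        sock.sendto(packet.encode("utf-8"), HOLOLENS_ADDR)
        # Sleep off the remainder of the 10 ms update period to hold the target rate.
        time.sleep(max(0.0, period - (time.time() - start)))
```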
Tracked Objects: There are a total of four objects that the OptiTrack system tracks in real time. 1) The HoloLens 2 has three optical markers attached to it and is tracked in real time to adjust the AR view to the surgeon's perspective.² 2) A localization marker that
allows HoloLens 2 and OptiTrack to operate in the same coordinate
space, as described in detail in the next paragraph. We designed
two localization markers: a 2D Vuforia marker, shown in Fig. 1b,
and a 3D ARToolKit marker, shown in Fig. 2a. 3) The phantom
model of the patient’s head, shown in Figs. 2c and d. We created
the phantom model to be anatomically similar to a patient’s head,
for testing, analysis, and evaluation of our system. The phantom
model has eight optical markers attached to its surface, as shown
² This approach to headset pose tracking is similar to the methods used in modern VR systems, such as Oculus Quest 2 and HTC Vive.
Figure 3: Transformation between two world coordinate systems
through the tracking of a localization marker. This diagram shows a
3D ARToolKit marker; we also experimented with a 2D Vuforia marker.
in Fig. 2c. The phantom model also shows two Kocher’s points,
which in EVD serve as the entry points for the EVD catheter. In
the phantom model we pre-drilled holes at these points, each with a
diameter of approximately 6mm. Inside the phantom model, there
is a 3D printed mold filled with red jello to imitate the target, as
shown in Fig. 2d. We used jello as a low-cost material that imitates
the texture of the ventricle and is capable of holding the catheter
position after placement. 4) The EVD catheter. It is not possible to attach optical markers directly to the catheter, which is a thin tube with a 3.3 mm diameter. Instead, we designed a 3D-printed mount and attached four optical markers to it, each facing in a different
direction. The 3D printed mount is attached to the top of the inner
stylet, inserted inside the catheter to maintain the tube’s rigidity for
the insertion. EVD catheter mount design and attachment are shown
in Fig. 2b.
Transformation of World Coordinates: HoloLens 2 and Opti-
Track operate in different world coordinates. To be able to perform
transformations between the different coordinate systems, both the
HoloLens and the OptiTrack need to locate the same target, to be
used as a reference when calculating the differences in coordinate
values. To enable this, we created a custom rigid object (i.e., the localization marker) that can be tracked by both OptiTrack and HoloLens 2 and whose centroid both systems can compute, allowing the coordinate differences to be calculated.
The transformation of world coordinates from OptiTrack, {O}, to HoloLens 2, {H}, denoted T^H_O, is shown in Eq. 1 and Fig. 3. The transformation between HoloLens 2 and the fiducial markers {F} on the localization marker, T^H_F, is obtained by either the ARToolKit or the Vuforia marker detection method. We placed the optical markers on the localization marker to register it as a rigid body, {R}, with a centroid as close as possible to that of the fiducial markers. This minimizes the transformation between the fiducial marker and the rigid body, T^F_R. The transformation between the rigid body of the localization marker and OptiTrack, T^R_O, is then obtained from OptiTrack's real-time tracking.

T^H_O = T^H_F · T^F_R · T^R_O    (1)
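For illustration, Eq. 1 can be evaluated as a product of 4×4 homogeneous transforms. The sketch below (Python/NumPy) uses made-up example values for the three transforms; in the real system T^H_F comes from fiducial marker detection and T^R_O from OptiTrack's rigid-body tracking.

```python
import numpy as np


def make_transform(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T


# Example values only (identity rotations, small offsets in meters).
T_H_F = make_transform(np.eye(3), np.array([0.10, 0.00, 0.50]))   # fiducial {F} seen from HoloLens {H}
T_F_R = make_transform(np.eye(3), np.array([0.00, 0.01, 0.00]))   # rigid body {R} relative to fiducial {F}
T_R_O = make_transform(np.eye(3), np.array([-0.30, 0.20, 1.00]))  # OptiTrack world {O} relative to {R}

# Eq. 1: compose the chain to map OptiTrack world coordinates into HoloLens 2 coordinates.
T_H_O = T_H_F @ T_F_R @ T_R_O

# A point tracked by OptiTrack, in homogeneous coordinates, expressed in the HoloLens 2 frame.
p_O = np.array([0.05, 0.02, 0.30, 1.0])
p_H = T_H_O @ p_O
```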
4 HOLOGRAPHIC VISUALIZATION
High accuracy of image registration is of utmost importance in
surgical applications. We take advantage of OptiTrack's high-accuracy tracking of each marker position, as seen in Fig. 3, for registering the ventricular hologram and tracking the EVD catheter.
In EVD applications, a patient-specific ventricular model needs
to be extracted from the patient’s CT scan (since ventricular anatomy
varies by patient). We first imported the DICOM data of a patient's CT scan into 3D Slicer, an open-source image analysis and visualization software package commonly used in medical applications. Then, we applied a
Figure 4: Extraction of 3D models of skull frame and ventricle from
the patient’s CT scan.
threshold value to extract the lateral ventricle with the foramen of Monro
and the skull, as seen in Fig. 4, rendered the combined 3D model, and
exported it to load into Unity. The 3D model of the skull was initially
extracted together with the ventricle to evaluate the alignment with
the phantom model; however, the ventricular hologram alone is
sufficient for the EVD. The extraction of the ventricular model from
the patient’s DICOM data is a preoperative step that will increase
surgery preparation time. In our experience, it has been taking
30 minutes on average. We believe that the extraction time can
be shortened by automating the threshold selection. Examples of
ventricular hologram overlays can be seen in Figs. 1 and 5.
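As a rough illustration of the thresholding step (and of the automated threshold selection we would like to add), the sketch below reads a DICOM series and extracts binary masks for the ventricles and the skull. The patient_ct/ path and the Hounsfield-unit ranges are illustrative assumptions; in our current workflow this step is performed interactively in 3D Slicer.

```python
import numpy as np
import SimpleITK as sitk

DICOM_DIR = "patient_ct/"  # hypothetical path to the patient's CT series

# Load the DICOM series into a single 3D volume.
reader = sitk.ImageSeriesReader()
reader.SetFileNames(reader.GetGDCMSeriesFileNames(DICOM_DIR))
ct = reader.Execute()
volume = sitk.GetArrayFromImage(ct)  # intensities in Hounsfield units, indexed (z, y, x)

# Cerebrospinal fluid in the ventricles is roughly 0-15 HU; bone is above roughly 300 HU.
# These ranges are example thresholds, not the exact values used in our pipeline.
ventricle_mask = (volume > 0) & (volume < 15)
skull_mask = volume > 300

# The binary masks would then be turned into surface meshes (e.g., via marching cubes)
# and exported for loading into Unity.
print("ventricle voxels:", int(np.count_nonzero(ventricle_mask)))
print("skull voxels:", int(np.count_nonzero(skull_mask)))
```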
The phantom model of a patient’s head we use for our experiments
(shown in Fig. 2c) does not correspond to a specific patient for
whom we have the ventricular CT scan. Thus, to evaluate our image registration, we obtained a CT scan of the phantom model and extracted its 3D model using the same filtering approach. This allows
us to directly measure the misalignment between the physical and
the virtual (i.e., AR) representations of the phantom model; we
report these results in the next section.
In EVD, once the catheter is inserted through a small opening
in the skull, the tip of the catheter is no longer visible, making it
difficult for surgeons to estimate its true location. A hologram of
the inserted catheter can bring additional guidance by showing the
depth and the angle of the insertion, and the distance from the tip
of the catheter to the ventricle. Since EVD catheters have standard dimensions, we represent the catheter with a similarly-shaped virtual object: namely, a cylinder with a 3.3 mm diameter and a 36 cm length. The resulting visualization is shown in Figs. 1b and 5. We are currently working on improving the robustness of catheter tracking: while it provides useful visualizations in some conditions, it exhibits spatial jitter in others. We will report on these results in
future work.
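Because the stylet keeps the catheter straight during insertion and the catheter dimensions are standard, the cylinder hologram can be placed from the tracked pose of the stylet mount alone. The sketch below is a simplified illustration with assumed axis conventions and helper names (not our Unity code): it computes the two endpoints of the virtual cylinder from the mount's tracked position and orientation.

```python
import numpy as np

CATHETER_LENGTH_M = 0.36      # 36 cm standard EVD catheter length
CATHETER_DIAMETER_M = 0.0033  # 3.3 mm catheter diameter (used to size the cylinder)


def catheter_endpoints(mount_position: np.ndarray, mount_rotation: np.ndarray):
    """Return (hub, tip) endpoints of the catheter cylinder in world coordinates.

    mount_position: 3-vector position of the 3D-printed mount on the inner stylet.
    mount_rotation: 3x3 rotation matrix of the mount; for illustration we assume the
    mount's local -z axis points along the stylet toward the catheter tip.
    """
    insertion_dir = mount_rotation @ np.array([0.0, 0.0, -1.0])
    hub = mount_position
    tip = mount_position + CATHETER_LENGTH_M * insertion_dir
    return hub, tip


# Example with made-up tracked values: identity orientation, mount 20 cm above the origin.
hub, tip = catheter_endpoints(np.array([0.0, 0.2, 0.0]), np.eye(3))
```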
5 PRELIMINARY RESULTS
Our system is intended to be used as follows. First, during the
system initialization phase, the user performs the eye calibration for
HoloLens 2, and the connectivity between the HoloLens 2 and the
OptiTrack is established. Following the initialization, the surgeon
navigates around the localization marker until a white cube hologram
appears, as shown in Fig. 1, which indicates that the fiducial marker
is detected. Once the cube hologram is accurately positioned on the
marker, the surgeon presses a button to compute the transformation
of the world coordinate system using the detected location. After the
transformation, the holograms of the ventricle and the EVD catheter
appear in the surgeon’s view, and the surgeon performs the insertion
of the catheter through Kocher’s point. The surgeon removes the
inner stylet when catheter placement is complete.
Localization: The performance of the two marker detection meth-
ods we examined (namely, ARToolKit and Vuforia) differed signifi-
cantly. We summarize our observations in Table 2. Over 15 trials,
we observed that the ARToolKit localization marker was detected more readily, while the Vuforia localization marker was sensitive to user distance and orientation. Detection stability was unreliable for ARToolKit, which subsequently maintained a poor alignment that depended on the user's movement, whereas for Vuforia, a good
Table 2: Comparison between ARToolKit and Vuforia localization methods in detection and hologram alignment.

Detection Method | ARToolKit | Vuforia
Marker Type | 3D Fiducial | 2D Fiducial
Localization Time (s) | 15.11 ± 7.57 | 12.71 ± 7.05
Registration Error (mm) | x: 1.39 ± 0.57, y: 1.17 ± 0.83, z: 1.39 ± 0.88 | x: 0.96 ± 0.43, y: 1.11 ± 0.27, z: 1.44 ± 1.05
Detection Flexibility | Easy, flexible | Strict to angle and distance
Detection Stability | Unreliable | Reliable
Figure 5: Procedure of AR-assisted ventriculostomy for placing the catheter at the foramen of Monro.
alignment of the cube hologram was observed once detected. The mean ± standard deviation of the user navigation time was 15.11 ± 7.57 seconds for ARToolKit and 12.71 ± 7.05 seconds for Vuforia. The average latency of data communication between OptiTrack and HoloLens 2 was 12.32 ms. The data stream of marker positions maintained a seamless AR experience, reflecting changes in tracked models with no data loss.
Hologram Alignment: Over 15 trials, we measured the difference in alignment between the physical phantom model and its holographic representation, using a digital caliper, in the x, y, and z coordinates (in the coordinate system shown in Fig. 3). As shown in Table 2, we observed mean (x, y, z) registration errors of (1.39 ± 0.57, 1.17 ± 0.83, 1.39 ± 0.88) mm when using ARToolKit, and (0.96 ± 0.43, 1.11 ± 0.27, 1.44 ± 1.05) mm when using Vuforia. The detection stability and higher accuracy of Vuforia are key to keeping the image registration robust and accurate. We will thus use Vuforia as our localization method in our subsequent research.
Catheter Placement: After the catheter is inserted, the surgeon
removes the inner stylet and the jello maintains the position of the
catheter placement, as shown in Fig. 5. In our system, we added contextual guidance in the form of a text display above the ventricle hologram, which shows the calculated distance from the tip of the catheter to the foramen of Monro (which surgeons target in EVD). This distance
calculation is obtained in real time, using the tracked positions of the
phantom model and the catheter. We believe that our system is the
first to provide such guidance for the EVD procedure, which may
enable its use in training novice neurosurgeons in conducting this
procedure. However, the guidance we currently present is subject to
object tracking errors. We will evaluate its correctness by comparing
the distance we calculate with the physical distance between the
foramen of Monro and the tip of the catheter.
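Once the catheter and the phantom model are tracked in the same coordinate frame, the displayed distance reduces to a point-to-point norm. A minimal sketch follows, assuming the catheter tip and the foramen of Monro are given in the local frames of the two tracked rigid bodies; both local points are illustrative inputs, not values from our system.

```python
import numpy as np


def tip_to_target_distance_mm(T_world_catheter: np.ndarray,
                              T_world_phantom: np.ndarray,
                              tip_local: np.ndarray,
                              foramen_local: np.ndarray) -> float:
    """Distance in millimeters between the catheter tip and the foramen of Monro.

    T_world_catheter, T_world_phantom: 4x4 poses of the catheter mount and the phantom
    model in the shared world frame, obtained from real-time tracking.
    tip_local, foramen_local: homogeneous 4-vectors of the tip in the mount frame and of
    the foramen of Monro in the phantom model frame (from the registered ventricular model).
    """
    tip_world = T_world_catheter @ tip_local
    target_world = T_world_phantom @ foramen_local
    return float(np.linalg.norm(tip_world[:3] - target_world[:3]) * 1000.0)


# Example call with identity poses and made-up points 12 mm apart along x.
d = tip_to_target_distance_mm(np.eye(4), np.eye(4),
                              np.array([0.000, 0.0, 0.0, 1.0]),
                              np.array([0.012, 0.0, 0.0, 1.0]))
```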
6 CONCLUSIONS AND FUTURE WORK
Among various surgical applications of AR, neurosurgery benefits
from AR guidance to maintain minimal invasiveness while ensuring
safety. In this paper we integrate AR into a neurosurgical application
by creating an AR-assisted surgical guidance system with image
registration and tool tracking. We demonstrate the visualization of the ventricular anatomy to guide the surgeons and the projection of the catheter tip for EVD catheter placement.
We are currently preparing our system to be evaluated in clinical
settings, for the accuracy of AR-assisted catheter placement and
for other potential benefits of contextual guidance that our system
will provide. We have identified the following next steps to prepare
our system for the user studies. First, we will continue to improve
the robustness of tool tracking by optimizing the number and the
distribution of markers on the 3D-printed mount, and by configuring
the number, positions, orientations, and other parameters of the Op-
tiTrack cameras. Furthermore, we will develop more user-friendly
and intuitive workflows for the surgeons. The current procedure
requires pre-operative steps of extracting a patient-specific ventricu-
lar model and intraoperative steps for performing the localization,
increasing the overall surgery preparation time. We will improve our
system by automating threshold selection of different DICOM data,
and by providing additional AR-based visual guidance for system
localization. We expect to start conducting initial user studies, which
will evaluate different elements of our system with both novice and
experienced neurosurgeons, within the next 6 months.
ACKNOWLEDGMENTS
We thank Emily Eisele for her work on this project during an NSF
REU Program at Duke University. This work was supported in
part by NSF grants CSR-1903136 and CNS-1908051, NSF CAREER Award IIS-2046072, an IBM Faculty Award, and an AANS Neurosurgery Technology Development Grant.
REFERENCES
[1] S. Andress, A. Johnson, M. Unberath, A. F. Winkler, K. Yu, J. Fotouhi, S. Weidert, G. M. Osgood, and N. Navab. On-the-fly augmented reality for orthopedic surgery using a multimodal fiducial. Journal of Medical Imaging, 5(2):021209, 2018.
[2] L. Chen, T. W. Day, W. Tang, and N. W. John. Recent developments and future challenges in medical mixed reality. In Proc. IEEE ISMAR, 2017.
[3] W. Gibby, S. Cvetko, A. Gibby, C. Gibby, K. Sorensen, E. G. Andrews, J. Maroon, and R. Parr. The application of augmented reality–based navigation for accurate target acquisition of deep brain sites: advances in neurosurgical guidance. Journal of Neurosurgery, 1(aop):1–7, 2021.
[4] N. Haouchine, J. Dequidt, I. Peterlik, E. Kerrien, M.-O. Berger, and S. Cotin. Image-guided simulation of heterogeneous tissue deformation for augmented reality during hepatic surgery. In Proc. IEEE ISMAR, 2013.
[5] Y. Li, X. Chen, N. Wang, W. Zhang, D. Li, L. Zhang, X. Qu, W. Cheng, Y. Xu, W. Chen, et al. A wearable mixed-reality holographic computer for guiding external ventricular drain insertion at the bedside. Journal of Neurosurgery, 131(5):1599–1606, 2018.
[6] A. Meola, F. Cutolo, M. Carbone, F. Cagnazzo, M. Ferrari, and V. Ferrari. Augmented reality in neurosurgery: a systematic review. Neurosurgical Review, 40(4):537–548, 2017.
[7] B. R. O'Neill, D. A. Velez, E. E. Braxton, D. Whiting, and M. Y. Oh. A survey of ventriculostomy and intracranial pressure monitor placement practices. Surgical Neurology, 70(3):268–273, 2008.
[8] L. Qian, A. Deguet, and P. Kazanzides. ARssist: augmented reality on a head-mounted display for the first assistant in robotic surgery. Healthcare Technology Letters, 5(5):194–200, 2018.
[9] A. S. Rose, H. Kim, H. Fuchs, and J.-M. Frahm. Development of augmented-reality applications in otolaryngology–head and neck surgery. The Laryngoscope, 129:S1–S11, 2019.
[10] M. Schneider, C. Kunz, A. Pal'a, C. R. Wirtz, F. Mathis-Ullrich, and M. Hlaváč. Augmented reality–assisted ventriculostomy. Neurosurgical Focus, 50(1):E16, 2021.
[11] S. Skyrman, M. Lai, E. Edström, G. Burström, P. Förander, R. Homan, F. Kor, R. Holthuizen, B. H. Hendriks, O. Persson, et al. Augmented reality navigation for cranial biopsy and external ventricular drain insertion. Neurosurgical Focus, 51(2):E7, 2021.
[12] T. Song, C. Yang, O. Dianat, and E. Azimi. Endodontic guided treatment using augmented reality on a head-mounted display system. Healthcare Technology Letters, 5(5):201–207, 2018.