Towards AR-assisted visualisation and guidance for imaging of dental decay
Yaxuan Zhou1,2, Paul Yoo3, Yingru Feng3, Aditya Sankar3, Alireza Sadr4, Eric J. Seibel2
1 Department of Electrical and Computer Engineering, University of Washington, Seattle, WA 98195, USA
2 Human Photonics Lab, Department of Mechanical Engineering, University of Washington, Seattle, WA 98195, USA
3 Paul G. Allen School of Computer Science and Engineering, University of Washington, Seattle, WA 98195, USA
4 School of Dentistry, University of Washington, Seattle, WA 98195, USA
E-mail: eseibel@uw.edu
Published in Healthcare Technology Letters; Received on 19th September 2019; Accepted on 2nd October 2019
Untreated dental decay is the most prevalent dental problem in the world, affecting up to 2.4 billion people and leading to a significant economic and social burden. Early detection can greatly mitigate the irreversible effects of dental decay, avoiding the need for expensive restorative treatment that forever disrupts the protective enamel layer of teeth. However, two key challenges make early decay management difficult: unreliable detection and a lack of quantitative monitoring during treatment. New optically based imaging through the enamel provides the dentist a safe means to detect, locate, and monitor the healing process. This work explores the use of an augmented reality (AR) headset to improve the workflow of early decay therapy and monitoring. The proposed workflow includes two novel AR-enabled features: (i) in situ visualisation of pre-operative optically based dental images and (ii) augmented guidance for repetitive imaging during therapy monitoring. The workflow is designed to minimise distraction, mitigate hand–eye coordination problems, and help guide monitoring of early decay during therapy in both clinical and mobile environments. The results from quantitative evaluations as well as a formative qualitative user study uncover the potential of the proposed system and indicate that AR can serve as a promising tool in tooth decay management.
1. Introduction: Oral health problems have remained a major public health challenge worldwide over the past 30 years, leading to economic and social burden [1–3]. Among them, untreated dental decay is the most prevalent issue and is linked to socio-economic disparities [4,5]. As shown in Fig. 1, the traditional dental care pattern for dental decay management consists of routine examination in clinics, non-destructive treatments for detected early decay and destructive treatments for irreversible decay. There are three limitations to this pattern. First, visual or tactile examination and the current gold-standard x-ray radiography cannot reliably and promptly detect interproximal and occlusal lesions [6], which are the most common types of dental decay. Second, medicinal therapy and instructed cleaning are performed by patients at home without supervision, and patients need to revisit the dental clinic, which limits timely monitoring and often allows early decay to progress into irreversible decay. Lastly, the treatments for irreversible lesions, such as the drill-and-fill procedure, root canal treatment and even dental implants, are all destructive, painful, expensive and time-consuming. These limitations need to be solved to develop an ideal dental care procedure for decay management, also shown in Fig. 1. If early-stage lesions can be detected reliably, patients can be prescribed medicinal therapies and instructed/directed cleaning over time outside the dental clinic [3,7,8]. Also, if the current clinic-revisiting-based monitoring of decay can be enhanced by monitoring at a community health centre or even the patient's home and sharing data with dentists, then timely intervention can be made with fewer clinic visits and less burden on both dentists and patients [3,9]. Early decay can then be detected and healed in time, avoiding destructive and costly procedures. Continued research into such ideal management of tooth decay is needed [3].
To move towards this ideal pattern, there have been significant strides towards developing reliable, sensitive and low-cost imaging modalities to diagnose early decay [10,11]. Three-dimensional (3D) imaging modalities such as cone-beam computed tomography (CBCT) and optical coherence tomography (OCT) are reliable and sensitive but usually require long imaging times on expensive clinical systems. Clinicians typically perform 3D imaging pre-operatively and use the 3D image for planning and intra-operative reference. For intra-operative imaging and also remote monitoring, clinicians also need a 2D imaging modality, e.g. the scanning fibre endoscope (SFE).
Along with the development of imaging modalities, the ease of use of dental imaging needs to be improved in general. Acquiring high-quality images from a desired perspective usually requires expert manipulation of the instrument. For example, to effectively monitor the condition of a carious lesion with SFE, users need to image the decay from the same perspective every time, which is difficult without any assistance [12]. Also, using previous images for navigation requires hand–eye coordination: clinicians need to divert their attention to the display monitor while manually positioning the scope, additionally compensating for patient movement. This is particularly challenging in the dental field, as there is only manual fixation of the patient's jaw and patients are typically not under local anaesthesia during dental procedures. These challenges lead to a lengthy learning curve for providing treatment accurately [13,14]. Moreover, resource-limited areas may lack budgets for well-trained personnel.
In this work, we utilise an augmented reality (AR) head-mounted display (HMD) to develop a platform for visualising dental images from multiple modalities. We also use the HMD as a guidance tool for positioning an imaging probe during repetitive monitoring of dental lesions and their treatments. We built a prototype system using the Magic Leap One AR headset and two dental imaging modalities: OCT and infrared SFE. The key contributions of our work are (i) the design and development of a novel end-to-end system for multi-modal dental image visualisation, (ii) a technique for guided image capture using SFE, and (iii) quantitative evaluations as well as a user study to evaluate the usefulness, usability and limitations of our system and identify areas for future work. To the best of the authors' knowledge, this is the first pilot study to develop an HMD-based AR environment for visualisation and guidance in optically monitoring the status of dental lesions. Continued advances in AR devices, dental imaging modalities, as well as systems that combine these two technologies will together push traditional dental practice towards an ideal future.
Healthcare Technology Letters, 2019, Vol. 6, Iss. 6, pp. 243–248
doi: 10.1049/htl.2019.0082
243
This is an open access article published by the IET under the
Creative Commons Attribution License (http://creativecommons.
org/licenses/by/3.0/)
2. Related work: Near-infrared (NIR) optical imaging has been shown to have the potential to detect early-stage dental decay more reliably [15,16]. In NIR reflection images, dental decay appears brighter than the surrounding sound areas due to an increased scattering coefficient [17]. OCT is a 3D volumetric imaging technique and has been used for NIR imaging of dental decay [18]. Fig. 2a shows a prototype OCT system imaging an extracted human tooth, and a slice of the 3D OCT scan in which two interproximal dental lesions appear as bright spots. OCT systems are expected to be expensive when introduced to dental clinics, and currently a complete 3D scan takes at least several minutes on prototype systems. Also, the OCT probe is bulky and requires expert manipulation to acquire high-quality scans. Thus OCT is more suitable as the pre-operative imaging modality used in clinics.
The SFE is a 2D imaging technique with the advantages of a miniature probe tip and expected low cost. Many SFE prototypes have been used for real-time NIR dental imaging in previous works [19–21]. Fig. 2b shows an SFE imaging an extracted human tooth, and the SFE image in which the white patterns on both sides of the tooth indicate two interproximal dental lesions. In the figure, the SFE is imaging from the biting surface of the tooth, but since NIR light penetrates around 3 mm deep into the surface [20], the interproximal dental lesion under the surface also shows up in the image. This is very helpful for dental decay that is hidden between neighbouring teeth and not accessible to the operator. Due to the above advantages, SFE is well-suited for quick intra-operative screening and long-term monitoring.
AR technology has been introduced into the research areas of dental implants [22–26], oral and maxillofacial surgery [14,27–29], orthodontics [30] as well as dental education [31,32]. In previous work, the introduction of AR has assisted clinicians by displaying and registering virtual models in the operating field, thus reducing the difficulty of hand–eye coordination. However, there is as yet no study aimed at assisting dental imaging modalities in the detection and monitoring of dental decay [33]. Among all available AR devices, HMDs have the advantage of compactness and intuitiveness (as compared to handheld or armature-mounted AR devices). For this study, we chose the Magic Leap One [34] AR headset as the hardware platform. Magic Leap One also includes a hand-held controller with a home button, a bumper, a trigger and a touchpad.
3. Methods: The proposed workflow and corresponding technical components are described in Fig. 3. During the initial appointment in a dental clinic with high resource availability, a pre-operative 3D raw image is acquired and transferred onto the AR headset; dentists can then examine the 3D image in the AR environment intra-operatively and make a diagnosis based on the observed position, dimensions and severity of dental decay. During this process, the dentist can translate, rotate, and scale the 3D image at will to view it from an optimal viewing angle based on their preference and experience. The dentist can also adjust display parameters including intensity, opacity, and contrast threshold to optimise decay visibility and account for varying external lighting conditions. Furthermore, they can examine the image by slicing through the 3D structure to accurately locate the decay.
For long-term monitoring, the dentist can select the desired angle of view for future repetitive 2D imaging. A virtual model of the tooth and imaging instrument, with registered spatial relationships, is then generated and stored. During the monitoring phase, 2D imaging can be performed regularly within or outside of a clinical setting, using the virtual model as guidance. To reproduce the reference image, the operator aligns the position of the selected tooth and the imaging probe with respect to the virtual model so that the same desired view angle is preserved. Alignment of the imaging probe can be done manually or via tracking. 2D images are then transferred into the AR environment and fused
Fig. 2 Demonstration of two NIR dental imaging modalities and their images
a An OCT probe imaging an extracted human tooth; a slice of the 3D OCT scan, where the bright patterns indicate demineralised regions of enamel (dental lesion)
b An SFE probe imaging an extracted human tooth; SFE image, where the bright patterns (marked by arrows) indicate high optical reflectance from dental decay regions
Fig. 1 Comparison of traditional and ideal dental care patterns for tooth decay management. Blue text marks areas under active development. Purple text indicates how our work supports the new approach to healing dental decay
Fig. 3 Diagram of the workflow and corresponding technical components
with the 3D image and all previous 2D images for comparison. The operator or remote dentist can change the desired angle of view according to updated 2D images throughout the monitoring period. After 2D SFE images are acquired, they are fused with the 3D image and transferred to a dentist, with computer-aided image analysis, for interpretation. By comparing the historical images to the present ones, the dentist can determine whether the dental decay is healing or progressing under the current prescription and adjust the prescription accordingly (such as the frequency and dose of medicine application, and/or the time of the next dental visit). We prototyped a software system based on this principle using Unity [35] (version 2019.1.0f1) with the Magic Leap Lumin SDK [34].
3.1. AR-assisted visualisation of pre-operative 3D image: In our pilot study, a pre-operative 3D image of the tooth is acquired using a pre-commercial 1310 nm swept-source OCT (Yoshida Dental Mfg., Tokyo, Japan) with a 110 nm band and a 50 kHz scan rate. The OCT 3D scan is taken from the occlusal view with an imaging range of 10 × 10 × 8 mm³ and an axial imaging resolution of 11 µm. The raw data from the OCT imaging system is first converted into point cloud data and downsampled to reduce the data size without losing useful features. The point intensities are then rescaled to increase the dynamic range. The point cloud data is then rendered as a 3D volumetric object using an open-source Unity package for volumetric rendering [36].
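The preprocessing described above (downsampling plus intensity rescaling) can be sketched as follows. This is an illustrative Python/NumPy sketch, not the Unity implementation used in the paper; the stride and percentile values are assumed example parameters, since the paper only states that the cloud is downsampled and the intensities rescaled.

```python
import numpy as np

def preprocess_oct(points: np.ndarray, intensities: np.ndarray,
                   stride: int = 4, low_pct: float = 2.0,
                   high_pct: float = 98.0):
    """Downsample an OCT point cloud and stretch its intensity range.

    points:      (N, 3) array of point positions from the raw OCT volume.
    intensities: (N,) array of raw backscatter intensities.
    Returns the downsampled points and intensities rescaled to [0, 1].
    """
    # Uniform downsampling: keep every `stride`-th point to reduce data size.
    pts = points[::stride]
    vals = intensities[::stride].astype(np.float64)
    # Percentile-based contrast stretch to increase the dynamic range.
    lo, hi = np.percentile(vals, [low_pct, high_pct])
    vals = np.clip((vals - lo) / (hi - lo + 1e-12), 0.0, 1.0)
    return pts, vals
```

A more feature-aware scheme (e.g. voxel-grid downsampling) could replace the uniform stride without changing the rest of the pipeline.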
Slicing through three orthogonal directions is implemented to allow users to inspect the inner structures of the tooth. By examining cross-section slices, dentists can comprehensively inspect the location and size of dental lesions. More importantly, dentists can find out how deep the dental decay has progressed into the enamel layer, which determines whether a drill-and-fill procedure is needed or medicinal treatment should be prescribed with long-term monitoring. Since the visualisation needs to accommodate different lighting conditions and user preferences, adjustment of three display parameters is provided. Users can adjust the intensity value to change the overall brightness of the volumetric display. They can also adjust the threshold value for saturation, hiding areas that have low contrast. The opacity value can be adjusted to determine the transparency of the volume. Appropriate opacity values allow the user to see the surface structure of the tooth as well as inner features like dental decay or a crack without having to inspect every slice, thus providing an initial and intuitive sense of the existence, position and structure of these features.
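The orthogonal slicing can be illustrated with a minimal sketch: mapping a slider position in [0, 1] to one axis-aligned cross-section of the volume. This is a hypothetical NumPy analogue of the Unity slicing, written only to make the idea concrete.

```python
import numpy as np

def orthogonal_slice(volume: np.ndarray, axis: int, frac: float) -> np.ndarray:
    """Extract one axis-aligned cross-section from a 3D intensity volume.

    volume: 3D array, e.g. (depth, width, height) from the OCT scan.
    axis:   0, 1 or 2 -- which of the three orthogonal directions to slice.
    frac:   slider position in [0, 1], mapped linearly to a slice index.
    """
    idx = int(round(frac * (volume.shape[axis] - 1)))
    return np.take(volume, idx, axis=axis)
```

In the AR interface, each of the three sliders corresponds to one `axis`, and dragging a slider sweeps `frac` through the tooth.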
Slicing and display adjustment are implemented as sliders on a panel. The controller is used to select and adjust the sliders. The panel and the pre-operative 3D image can be selected by aiming the controller at them, holding down the trigger, and physically translating or rotating the controller. When the panel or the image is selected, users can also rescale them by pressing the left side of the touchpad to shrink and the right side to enlarge. See the video in the supplementary material for the interaction demo.
3.2. AR-assisted guidance for 2D imaging: Guidance for 2D imaging is necessary not only because it helps non-dentist personnel take 2D images at the desired view angles, but also because it guarantees that the field of view and perspective of the 2D images remain constant during repetitive imaging, so that the series of images can be quantitatively compared. After dentists spot decay on the 3D OCT image, they can designate the desired view angle for taking 2D images so that the decay can be detected in them. In the view angle selection mode, a virtual cone shape is attached to the end of the controller, corresponding to the view frustum of the endoscope. Since the NIR SFE has a disc-shaped field of view that grows larger as the target moves further from the probe, a cone can represent the field of view of the SFE. The user can aim the cone at the 3D OCT image and adjust the area that is covered by the cone, as shown in Fig. 4a. The user presses the bumper to indicate that the desired view angle is chosen, and a virtual reference model, consisting of a 3D tooth surface model registered with the SFE probe model according to the indicated view angle, is generated for future guidance. The 3D tooth surface model is acquired by an intra-oral scanner (3Shape TRIOS 3, 3Shape, Copenhagen, Denmark).
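The cone model of the SFE's field of view follows directly from the geometry: the radius of the visible disc grows linearly with the working distance. A one-line sketch makes this explicit; the half-angle value in the test is an assumed example, not a published SFE specification.

```python
import math

def sfe_fov_radius(distance_mm: float, half_angle_deg: float) -> float:
    """Radius (mm) of the SFE's disc-shaped field of view at a given distance.

    Modelling the FOV as a cone with apex at the probe tip, the visible
    disc radius is distance * tan(half-angle), so the covered area grows
    as the probe is pulled back from the tooth.
    """
    return distance_mm * math.tan(math.radians(half_angle_deg))
```

This is the relationship the virtual cone visualises: aiming and moving the cone lets the dentist trade off coverage against resolution when choosing the view angle.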
In this pilot study, we strive to keep the system and workflow as concise as possible, so we do not use any fiducial-point-based tracking, which would require an additional tracker. Instead, the alignment between the virtual tooth model and the real tooth is done manually by the user. Since the virtual tooth model is a 3D surface scan of the same tooth, the user can shrink the model to the same size as the tooth and align them. The next step is to use the reference model for guidance of 2D imaging, where the user needs to align the virtual probe model. The alignment of the SFE probe to the virtual model is made more difficult by the probe's smaller scale. Therefore, we designed two virtual SFE probe models, a cylinder model and a tri-colour-plane model, as shown in Figs. 4b and c.
Besides manual alignment, two tracking-based methods are supported by the hardware on Magic Leap One. The first method is based on the image-tracking API provided by Magic Leap [37]. The front-view camera and depth camera on the headset can be used for tracking the spatial position and rotation of a flat image. The target image is printed at a size of 3.4 × 3.2 cm² and attached to the SFE probe. The tracked position and rotation of the target image can then be transformed into the position and rotation of the probe, assuming the offset between the probe and the target image remains rigid and unchanged.
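The pose transformation above is a single rigid-transform composition: the headset tracks the printed target, and the probe pose follows by chaining the tracked pose with the fixed marker-to-probe offset. A minimal sketch using 4 × 4 homogeneous matrices (the calibration offset `T_marker_probe` is assumed to have been measured once beforehand; the paper does not detail this step):

```python
import numpy as np

def probe_pose_from_marker(T_world_marker: np.ndarray,
                           T_marker_probe: np.ndarray) -> np.ndarray:
    """Compute the probe pose from the tracked marker pose.

    Both arguments are 4x4 homogeneous transforms (rotation + translation).
    T_world_marker: pose of the printed target image, as reported by tracking.
    T_marker_probe: fixed rigid offset from the target image to the probe tip.
    Returns the probe pose in world coordinates.
    """
    return T_world_marker @ T_marker_probe
```

The same composition applies to the controller-based method, with the electromagnetically tracked controller pose in place of the marker pose.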
The second method is based on the electromagnetic 6-DoF spatial tracking of the control handle [38]. By fixing the SFE probe to the control handle, the tracked position and rotation of the controller can be transformed into the position and rotation of the probe. Once the probe is being tracked, a red cylinder virtual model is shown to indicate the tracked position and rotation. The user then needs to align the red cylinder virtual model (the tracked position and rotation of the real probe) with the virtual probe model (the desired position and rotation for positioning the real probe).
3.3. Data transfer and image fusion: The 2D SFE images are transferred from the instrument to the AR headset via a web server. A polling-based scheme downloads newly acquired images onto the headset over HTTP. The 2D SFE images and the 3D OCT image can then be registered according to the view angles with which the SFE images were taken. As shown in Fig. 5, an occlusal-view SFE image is registered with the 3D OCT image. With this image fusion, users can interpret and compare images from multiple modalities and also inspect the condition of the decay during monitoring of therapy.
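The polling scheme can be sketched as follows. The paper only states that newly acquired images are downloaded over HTTP at intervals, so the listing/fetching endpoints are abstracted behind callables here rather than inventing concrete URLs.

```python
import time

def poll_once(list_images, fetch, seen: set):
    """One polling step: ask the server for its image list, fetch unseen ones.

    list_images() -> list of image names currently on the server
                     (in practice an HTTP GET against the web server).
    fetch(name)   -> downloads one image (an HTTP GET of the file).
    seen          -> set of names already downloaded; updated in place.
    Returns the list of names fetched in this round.
    """
    new = [n for n in list_images() if n not in seen]
    for name in new:
        fetch(name)
        seen.add(name)
    return new

def poll_forever(list_images, fetch, interval_s: float = 1.0):
    """Polling loop: repeat at a fixed interval, as in the prototype."""
    seen = set()
    while True:
        poll_once(list_images, fetch, seen)
        time.sleep(interval_s)
```

On the headset, each newly fetched image would then be handed to the fusion step for registration against the 3D OCT image.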
Fig. 4 Design of view selection and probe models
a Use of the cone model to select the desired angular view for consistent 2D imaging
b The tri-colour-plane model for probe alignment
c The cylinder model for probe alignment
4. Evaluation
4.1. Experiments: To measure the augmentation quality, we set up a 3D grid coordinate system as shown in Fig. 6a. The grid paper has 1 mm fine grids, 5 mm medium grids and 1 cm large grids. Once the hologram is manually aligned with the object, the observer uses a sharp pointer to localise the position of a certain point on the hologram and then measures the distance between the corresponding points on the real object and the hologram. Jitter and perceived drift of the hologram are quantified by the translation distance measured on the grid paper.
To measure the alignment performance, we also measure the end-to-end accuracy, quantified by keypoint displacement in acquired SFE images. We choose to image a USAF resolution test chart, as shown in Fig. 6b, to simplify the accurate extraction of keypoints in SFE images. Ten keypoints are selected on the test chart. The user first aligns the SFE probe in front of the test chart at the desired viewpoint and takes one image. Then, after putting the SFE probe down for a while, the user realigns the SFE probe with or without guidance and takes another SFE image, attempting to replicate the same viewpoint as in the first image. Three guidance approaches are used in turn for repositioning the SFE probe: 'without any guidance' means that the user aligns the probe only according to their memory of the desired probe position, without referring to the real-time SFE video; 'with AR guidance' means that the user aligns the probe with the AR hint of the desired probe position; 'with video guidance' means that the user aligns the probe by referring to the real-time SFE video and comparing it with the reference image. The three guidance approaches are used in random order for ten runs to avoid training bias. The time it takes to realign the probe to the desired position is
recorded. The x and y positions of the ith keypoint are measured in pixels in the reference image and the repeated image as $p^{\mathrm{ref}}_{x_i}$, $p^{\mathrm{ref}}_{y_i}$, $p^{\mathrm{rep}}_{x_i}$, $p^{\mathrm{rep}}_{y_i}$. The overall keypoint displacement $D$ of the repeated image is then calculated according to

$$D = \frac{1}{10}\sum_{i=1}^{10}\sqrt{\left(p^{\mathrm{rep}}_{x_i}-p^{\mathrm{ref}}_{x_i}\right)^2+\left(p^{\mathrm{rep}}_{y_i}-p^{\mathrm{ref}}_{y_i}\right)^2}$$

Among the ten runs, the mean and standard deviation of $D$ are quantified and used to evaluate the three guidance approaches along with the time.
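The displacement metric is a mean Euclidean distance over the matched keypoints, which a few lines of Python make concrete:

```python
import math

def keypoint_displacement(ref_pts, rep_pts) -> float:
    """Mean Euclidean displacement D (in pixels) between matched keypoints.

    ref_pts, rep_pts: sequences of (x, y) pixel coordinates of the same
    keypoints in the reference image and the repeated image, in matching
    order (ten keypoints in the experiment described above).
    """
    assert len(ref_pts) == len(rep_pts)
    dists = [math.hypot(xr - x0, yr - y0)
             for (x0, y0), (xr, yr) in zip(ref_pts, rep_pts)]
    return sum(dists) / len(dists)
```

A perfectly replicated viewpoint gives D = 0; larger D means the repeated image was taken from a different perspective.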
4.2. User study: We conducted a user study to gather user feedback on this prototype. We used a dentoform model with an extracted human tooth installed on it, as shown in Fig. 6c. The extracted human tooth has two artificial dental lesions on its interproximal surfaces. An OCT 3D image, an occlusal-view SFE 2D image as well as a 3D surface shape scan were acquired from this sample, as shown in Fig. 7.
Six subjects were recruited and asked to conduct tasks with the system, walking through the workflow. Among the six subjects, three self-reported as dental students or clinicians, while the other three were general users without specialised dental knowledge. All users were new to this AR system and the workflow. The protocol that subjects were asked to perform using the Magic Leap One was as follows: (i) examine the 3D OCT image in the headset by slicing and adjusting display parameters; (ii) use the cone to select the desired view angle; (iii) manually align the virtual model with the real tooth; (iv) align the SFE probe with the virtual probe model and compare the two virtual probe models. Manual alignment, image-tracking-based alignment and controller-tracking-based alignment were also compared.
After the tasks were completed, the users were asked to fill out a questionnaire anonymously. See the supplementary material for the questionnaire template.
5. Results and discussion: In the quantitative measurements, we measured the augmentation quality between the hologram and objects manually aligned together. We noticed that the augmentation quality is influenced by jitter, perceived drift and latency, which degrade perception as well as the accuracy and efficiency of the alignment procedure. Jitter is the continuous shaking of the hologram. We measured jitter within the range of 1 mm, which is at the edge of our acceptable range considering that the tooth has a dimension of around 10 mm. Perceived drift means that when the observer moves around a hologram, the perceived position of the hologram drifts away. We measured perceived drift within the range of 5 mm when the observer takes two orthogonal viewpoints. The perceived drift limits users from observing from multiple viewpoints to align the probe with the hologram. However, considering that users are not able to move around freely when aligning the probe, the perceived drift may be less critical for our prototype. Latency is the time lag of the hologram update when the user moves their head and is determined by the distance of head movement. The measured latency is within a range of 2 s when head motion is within the
Fig. 6 Experiment setup
a 3D grid coordinate system for measuring augmentation accuracy between hologram and object
b USAF resolution test chart for measuring end-to-end accuracy during probe repositioning. Ten keypoints are selected from square corners marked by red dots
c Dentoform model with an extracted human tooth installed on top. There are two artificial dental decays on the interproximal surfaces, marked by the two red arrows
Fig. 5 Fusion of OCT 3D image and SFE 2D images
Fig. 7 Extracted human tooth with artificial interproximal decays
a Photograph of the extracted human tooth with two artificial interproximal lesions
b One slice of the OCT 3D image of the tooth
c NIR occlusal-view SFE image
d 3D surface shape scan of the tooth. Note that in (b) and (c), the blue frame indicates an artificial dental decay deep into the dentin, the orange frame indicates an artificial dental decay less than halfway into the enamel, and the green circle indicates a natural dental decay in the groove under the biting surface
general range needed for performing the imaging procedure. We also measured the accuracy of image-tracking-based alignment and controller-tracking-based alignment. The image-tracking-based alignment suffers from the limited capability of the front-facing camera. The image tracking has an error of up to 4 mm and may lose the target when the printed target image moves fast. Furthermore, when the background of the environment is complex, the image tracking may recognise the wrong target. It is recommended that image tracking be used in a well-lit space while avoiding black or very uniform surfaces as well as reflective surfaces like mirrors or glass. The controller-tracking-based alignment suffers from hologram drift when the electromagnetic sensor is rotated around or moved close to conducting surfaces. All that being said, the current image-tracking and controller-tracking-based alignment approaches suffer from instability and accuracy issues and need improvement either in hardware or in the tracking scheme design. So far, manual alignment seems to be more robust in terms of accuracy and efficiency.
The end-to-end accuracy and efficiency of manual alignment are quantified by the keypoint displacement between the acquired reference SFE image and the repeated SFE image, each 400 × 400 pixels. As shown in Table 1, AR guidance has the advantage of better repositioning accuracy compared to no guidance, and of faster repositioning compared to using the real-time SFE video for guidance. By transferring the real-time SFE video to the AR headset and placing it near the operating field, we may further improve the accuracy and efficiency of our prototype.
In the user study, the average time taken to educate each subject to use the system to general proficiency (i.e. familiar with the interaction techniques and able to use them to accomplish the workflow) was 15 min, which is quite fast considering their unfamiliarity with AR devices. Afterwards, all subjects were able to accomplish the protocol. During the process of prototyping and quantitative evaluation, we identified the following factors that may influence the workflow and therefore included qualitative questions regarding their effects. The factors are (i) the latency, which may impede the accuracy and efficiency of alignment of the tooth and probe with the virtual models due to the small scale, and (ii) the available field of view of the headset. For Magic Leap One, the width and height of the AR field of view are currently the largest on the market, and the interface design also avoids borders of frames to mitigate the sense of a limited field of view. However, when the user is too close to the virtual objects, the virtual objects are cut off by a clipping plane. This forces users to work from a distance of about 37 cm away from the virtual objects, which means that users may have to extend their arms away from their body throughout the alignment tasks. Five subjects felt the latency was noticeable but did not impede their workflow, while one dental clinician felt the latency of the headset was an impediment. Five subjects reported that the limits of the AR field of view within the headset were unnoticeable, while only one general user thought the clipping plane of the headset caused discomfort/distraction during the workflow.
As for feedback on the workflow, all three dental personnel thought the AR-assisted visualisation of OCT is an improvement over a standard screen display, in the sense of allowing flexible movement in space while preserving the same information as the standard display. The two dental clinicians who are familiar with OCT images were able to localise the position of both the artificial interproximal lesions (decay) and even the natural decay in the groove. The dental student, not being familiar with OCT images, was not able to do this. They did, however, comment that the rendering speed of the OCT image may become a problem when more 3D scans need to be acquired. All three dental personnel and one general user thought the AR-assisted guidance for SFE 2D imaging was easier than imaging without guidance, while the two other general users thought it was more difficult. These two general users commented that manual alignment of the virtual tooth model to the real tooth is complicated for one major reason: depth perception does not work well when trying to accurately align a virtual object with a real object. This is caused by an inherent issue called occlusion leak, which has also been reported for other AR devices such as the HoloLens [39], and there is ongoing research on solving it [40]. The image tracking and controller tracking also sometimes suffer from instability. The choice between manual alignment and tracking-based alignment methods seems to come down to personal preference. As for the choice of virtual probe model, all three general users preferred the tri-colour-plane model, while the three dental personnel had varied preferences. It is therefore advantageous to have both virtual probe models available and to provide an interface to switch between the two.
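The occlusion-leak problem follows from the additive nature of optical see-through displays: the display can only add light to the real scene, so a virtual pixel can never appear darker than the real surface behind it. A minimal illustrative sketch of this additive image-formation model (the function and pixel values are hypothetical, for illustration only):

```python
import numpy as np

def perceived(real_bg, virtual):
    """Approximate perceived colour on an optical see-through display:
    the display adds light, so the result is the real background plus
    the virtual emission, clipped to the displayable range [0, 1]."""
    return np.clip(np.asarray(real_bg, float) + np.asarray(virtual, float), 0.0, 1.0)

# A black virtual pixel over a bright real tooth leaves the tooth
# visible: the virtual object cannot occlude the real one.
print(perceived([0.9, 0.9, 0.9], [0.0, 0.0, 0.0]))  # bright background remains

# A bright virtual pixel over a dark background does show up.
print(perceived([0.1, 0.1, 0.1], [0.7, 0.7, 0.7]))
```

This is why alignment cues rendered darker than the real tooth tend to wash out, while bright cues remain legible.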
This first-ever prototype showed both clinical potential and technical limitations in our study, which we believe will be a useful reference for future research. First, the AR display can relieve clinicians or general users from constantly switching views between the patient and a computer screen and from the consequent hand–eye coordination problem. Importantly, the AR display preserves the required information in the composite images. Second, this system can assist in the adoption of multiple dental imaging modalities into clinical use, such as safe and informative infrared optical imaging. Since images from multiple modalities can be integrated into the system and provide supplementary information for clinicians, this eases the learning curve for clinicians using these new imaging modalities and also improves the reliability and sensitivity of dental decay quantification. Notably, the prototype can easily be generalised to other dental imaging modalities available in clinics, such as CBCT, NIR and fluorescence dental cameras. Most of these imaging modalities, along with intra-oral scanners, are common in dental clinics. The SFE we use in this study is not commercial, but it is expected to be a low-cost NIR imaging modality. The other addition is the AR headset, which continues to get cheaper. Thus, our prototype is both generalisable and cost-effective. Lastly, the proposed solution can help with repetitive imaging of dental decay for therapy monitoring, which is the core of the ideal dental care protocol for tooth decay management that maintains the integrity of teeth. There are definite limitations in our prototype, reported above. Some stem from the inherent restrictions of the Magic Leap One hardware, such as jitter, perceived drift, latency, occlusion leak and limited FOV. We believe that the rapid progress of AR HMD products will help resolve these limitations. Others stem from our own software and workflow designs, such as the inaccuracy of manual alignment, which may be resolved by an improved tracking mechanism. See the supplementary material for a video demo of our system in use.
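One tracking-based alternative to manual alignment is rigid point-based registration between landmarks on the virtual tooth model and the same landmarks tracked on the real tooth. A minimal sketch using the standard Kabsch algorithm (assuming known point correspondences; this is not the method implemented in our prototype):

```python
import numpy as np

def kabsch(P, Q):
    """Least-squares rigid transform (R, t) such that R @ p + t ~= q for
    matched Nx3 point sets P (virtual-model landmarks) and Q (tracked
    real-tooth landmarks)."""
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t
```

With three or more non-collinear tracked landmarks, this recovers the pose of the virtual model in the headset's world frame without the depth-perception problems of manual placement.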
6. Conclusion: In this work, we proposed an AR-assisted visualisation and guidance system for imaging of dental decay. We introduced a novel workflow, implemented as a software application on the Magic Leap One AR headset. We evaluated the multimodal system and workflow through quantitative measurements as well as a pilot user study, with the recognition that the prototype can be generalised to other, more conventional dental imaging modalities, such as 3D CBCT and 2D oral cameras. Thus, with the addition of an AR headset and
a low-cost 2D imaging modality like SFE, our prototype can be adapted into dental clinics and rural community health centres.

Healthcare Technology Letters, 2019, Vol. 6, Iss. 6, pp. 243–248
doi: 10.1049/htl.2019.0082
This is an open access article published by the IET under the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0/)

Table 1 Comparison of different imaging guidance approaches

Imaging guidance approach    Keypoint displacement, px    Time taken, s
without any guidance         83 ± 10                       3
with AR guidance             31 ± 11                      10
with video guidance           7 ± 2                       20
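The keypoint displacement metric reported in Table 1 can be computed as the mean Euclidean distance, in pixels, between corresponding keypoints in the reference image and the re-acquired image. A small sketch with made-up coordinates (the function name and data are hypothetical):

```python
import numpy as np

def keypoint_displacement(ref_pts, new_pts):
    """Mean and standard deviation (in pixels) of Euclidean distances
    between matched keypoints in two images of the same tooth."""
    d = np.linalg.norm(np.asarray(ref_pts, float) - np.asarray(new_pts, float), axis=1)
    return d.mean(), d.std()

# Hypothetical matched keypoints (x, y) in pixels.
ref = [(100, 100), (150, 120), (200, 180)]
new = [(103, 104), (153, 124), (203, 184)]
mean_px, std_px = keypoint_displacement(ref, new)  # each pair offset by (3, 4), i.e. 5 px
```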
7. Funding and declaration of interests: Financial support was provided by the US NSF PFI:BIC 1631146 award and VerAvanti Inc. Equipment support was provided by the NIH/NIDCR R21DE025356 grant and Yoshida Dental Mfg. Corp. A.S. was supported by the University of Washington (UW) Reality Lab, Facebook, Google, and Huawei. The authors have no personal conflicts of interest outside the UW. UW receives license and funding from Magic Leap Inc., and VerAvanti has licensed SFE patents from UW for medical use.
8. References
[1] Kassebaum N.J., Bernabé E., Dahiya M., ET AL.: 'Global burden of untreated caries: a systematic review and metaregression', J. Dent. Res., 2015, 94, (5), pp. 650–658
[2] Kassebaum N.J., Smith A.G.C., Bernabé E., ET AL.: 'Global, regional, and national prevalence, incidence, and disability-adjusted life years for oral conditions for 195 countries, 1990–2015: a systematic analysis for the global burden of diseases, injuries, and risk factors', J. Dent. Res., 2017, 96, (4), pp. 380–387
[3] Featherstone J.D., Fontana M., Wolff M.: 'Novel anticaries and remineralization agents: future research needs', J. Dent. Res., 2018, 97, (2), pp. 125–127
[4] Rozier R.G., White B.A., Slade G.D.: 'Global burden of untreated caries: a systematic review and metaregression', J. Dent. Res., 2017, 81, (8), pp. 97–106
[5] Gupta N., Vujicic M., Yarbrough C., ET AL.: 'Disparities in untreated caries among children and adults in the U.S., 2011–2014', J. Dent. Res., 2018, 18, (1), p. 30
[6] Shah N., Bansal N., Logani A.: 'Recent advances in imaging technologies in dentistry', World J. Radiol., 2014, 6, (10), pp. 794–807
[7] Gardner G., Xu Z., Lee A., ET AL.: 'Effects of mHealth applications on pediatric dentists' fluoride varnish protocols'. IADR/AADR/CADR, Vancouver, BC, Canada, 2019, 3183697
[8] Savas S., Kucukyilmaz E., Celik E.U.: 'Effects of remineralization agents on artificial carious lesions', Pediatr. Dent., 2016, 38, (7), pp. 511–518
[9] Fontana M., Eckert G.J., Keels M.A., ET AL.: 'Fluoride use in health care settings: association with children's caries risk', Adv. Dent. Res., 2018, 29, (1), pp. 24–34
[10] Karlsson L.: 'Caries detection methods based on changes in optical properties between healthy and carious tissue', Int. J. Dent., 2010, 270729, pp. 1–9
[11] Javed F., Romanos G.E.: 'A comprehensive review of various laser-based systems used in early detection of dental caries', Stoma. Edu. J., 2015, 2, (2), pp. 106–111
[12] Zhou Y., Jiang Y., Kim A.S., ET AL.: 'Developing laser-based therapy monitoring of early caries in pediatric dental settings'. Proc. SPIE 10044, Lasers in Dentistry XXIII, San Francisco, CA, USA, 2017, p. 100440D
[13] Breedveld P., Stassen H.G., Meijer D.W., ET AL.: 'Manipulation in laparoscopic surgery: overview of impeding effects and supporting aids', J. Laparoendosc. Adv. Surg. Tech. A, 1999, 9, (6), pp. 469–480
[14] Bosc R., Fitoussi A., Hersant B., ET AL.: 'Intraoperative augmented reality with heads-up displays in maxillofacial surgery: a systematic review of the literature and a classification of relevant technologies', Int. J. Oral Maxillofac. Surg., 2019, 48, (1), pp. 132–139
[15] Chung S., Fried D., Staninec M., ET AL.: 'Multispectral near-IR reflectance and transillumination imaging of teeth', Biomed. Opt. Exp., 2011, 2, (10), pp. 2804–2814
[16] Fried W.A., Fried D., Chan K.H., ET AL.: 'High contrast reflectance imaging of simulated lesions on tooth occlusal surfaces at near-IR wavelengths', Lasers Surg. Med., 2013, 45, (8), pp. 533–541
[17] Darling C.L., Huynh G., Fried D.: 'Light scattering properties of natural and artificially demineralized dental enamel at 1310 nm', J. Biomed. Opt., 2006, 11, (3), pp. 1–11
[18] Machoy M., Seeliger J., Szyszka-Sommerfeld L., ET AL.: 'The use of optical coherence tomography in dental diagnostics: a state-of-the-art review', J. Healthc. Eng., 2017, 2017, p. 7560645
[19] Zhang L., Kim A.S., Ridge J.S., ET AL.: 'Trimodal detection of early childhood caries using laser light scanning and fluorescence spectroscopy: clinical prototype', J. Biomed. Opt., 2013, 18, (11), p. 111412
[20] Zhou Y., Lee R., Finkleman S., ET AL.: 'Near-infrared multispectral endoscopic imaging of deep artificial interproximal lesions in extracted teeth', Lasers Surg. Med., 2019, 51, (5), pp. 459–465
[21] Lee R., Zhou Y., Finkleman S., ET AL.: 'Near-infrared imaging of artificial enamel caries lesions with a scanning fiber endoscope', Sensors, 2019, 19, (6), p. 1419
[22] Jiang J., Huang Z., Qian W., ET AL.: 'Registration technology of augmented reality in oral medicine: a review', IEEE Access, 2019, 7, pp. 53566–53584
[23] Katić D., Spengler P., Bodenstedt S., ET AL.: 'A system for context-aware intraoperative augmented reality in dental implant surgery', Int. J. Comput. Assist. Radiol. Surg., 2015, 10, (1), pp. 101–108
[24] Lin Y.-K., Yau H.-T., Wang I.-C., ET AL.: 'A novel dental implant guided surgery based on integration of surgical template and augmented reality', Clin. Implant Dentistry Rel. Res., 2015, 17, (3), pp. 543–553
[25] Song T., Yang C., Dianat O., ET AL.: 'Endodontic guided treatment using augmented reality on a head-mounted display system', Healthcare Technol. Lett., 2018, 5, (5), pp. 201–207
[26] Ma L.F., Jiang W., Zhang B., ET AL.: 'Augmented reality surgical navigation with accurate CBCT-patient registration for dental implant placement', Med. Biol. Eng. Comput., 2019, 57, (1), pp. 47–57
[27] Won Y.-J., Kang S.-H.: 'Application of augmented reality for inferior alveolar nerve block anesthesia: a technical note', J. Dental Anesthesia Pain Med., 2017, 17, (2), pp. 129–134
[28] Bijar A., Rohan P.Y., Perrier P., ET AL.: 'Atlas-based automatic generation of subject-specific finite element tongue meshes', Ann. Biomed. Eng., 2016, 44, (1), pp. 16–34
[29] Wang J., Suenaga H., Yang L., ET AL.: 'Video see-through augmented reality for oral and maxillofacial surgery', Int. J. Med. Robot. Comput. Assist. Surg., 2017, 13, (2), p. e1754
[30] Aichert A., Wein W., Ladikos A., ET AL.: 'Image-based tracking of the teeth for orthodontic augmented reality'. Proc. 15th Int. Conf. Medical Image Computing and Computer-Assisted Intervention (MICCAI), Nice, France, 2012, pp. 601–608
[31] Onishi K., Mizushino K., Noborio H., ET AL.: 'Haptic AR dental simulator using Z-buffer for object deformation'. Universal Access in Human-Computer Interaction. Aging and Assistive Environments, Heraklion, Crete, Greece, 2014, pp. 342–348
[32] Wang D.X., Tong H., Shi Y.J., ET AL.: 'Interactive haptic simulation of tooth extraction by a constraint-based haptic rendering approach'. Proc. IEEE Int. Conf. Robotics and Automation (ICRA), Seattle, Washington, USA, 2015, pp. 26–30
[33] Farronato M., Maspero C., Lanteri V., ET AL.: 'Current state of the art in the use of augmented reality in dentistry: a systematic review of the literature', BMC Oral Health, 2019, 19, (135), pp. 1–15
[34] Magic Leap One AR headset. Available at https://www.magicleap.com/magic-leap-one, accessed 2019-07-15
[35] Unity real-time development platform. Available at https://unity.com/, accessed 2019-07-15
[36] Unity package for volume rendering. Available at https://github.com/mattatz/unity-volume-rendering, accessed 2019-08-28
[37] Magic Leap One AR headset image tracking API. Available at https://creator.magicleap.com/learn/guides/sdk-example-image-tracking, accessed 2019-07-15
[38] Magic Leap One AR headset controller tracking API. Available at https://creator.magicleap.com/learn/guides/control-6dof, accessed 2019-07-15
[39] El-Hariri H., Pandey P., Hodgson A.J., ET AL.: 'Augmented reality visualisation for orthopaedic surgical guidance with pre- and intra-operative multimodal image data fusion', Healthcare Technol. Lett., 2018, 5, (5), pp. 189–193
[40] Itoh Y., Hamasaki T., Sugimoto M.: 'Occlusion leak compensation for optical see-through displays using a single-layer transmissive spatial light modulator', IEEE Trans. Visualization Comput. Graphics, 2017, 23, (11), pp. 2463–2473
... Several studies have integrated a variety of AR visualizations. An AR-assisted visualization workflow including two in situ visualization features of a preoperative dental 3D image and repetitive 2D imaging during therapy monitoring was proposed (Zhou et al 2019a). After the image fusion of the 2D images and the 3D image, the condition of decays was able to be inspected by interpreting and comparing images from multiple modalities. ...
... It is also combined with a special controller to realize multi-degree of freedom (DoF) control of the instrument, such as the EM 6-DoF spatial tracking of the control handle was supported by Magic Leap one (Zhou et al 2019a). The pose of the probe can be tracked by fixing it with the control handle. ...
... Song et al (2018) used HoloLens for AR navigation on endodontically guided treatment and validated feasibility with model experiments. Zhou et al (2019a) built an AR-assisted visualization and guidance system for imaging dental caries using Magic Leap One. Ma et al (2019a) used a naked eye 3D OST AR navigation system for dental implant placement guidance and CBCT for patient registration to ensure implant accuracy. ...
Article
Full-text available
Augmented reality (AR) surgical navigation has developed rapidly in recent years. This paper reviews and analyzes the visualization, registration, and tracking techniques used in AR surgical navigation systems, as well as the application of the AR systems in different surgical fields. The types of AR visualization are divided into two categories of in situ visualization and non in situ visualization. The rendering contents of AR visualization are various. The registration methods include manual registration, point-based registration, surface registration, marker-based registration, and calibration-based registration. The tracking methods consist of self-localization, tracking with integrated cameras, external tracking, and hybrid tracking. Moreover, we describe the applications of AR in surgical fields. However, most AR applications were evaluated through model experiments and animal experiments, and there are relatively few clinical experiments, indicating that the current AR navigation methods are still in the early stage of development. Finally, we summarize the contribution and challenges of AR in the surgical fields, as well as the future development trend. Despite the fact that AR-guided surgery has not yet reached clinical maturity, we believe that if its current development trend continues, it will soon reveal its clinical utility.
... In the field of dentistry, AR has played a pivotal role in dental treatment planning [39], [40], visualization of radiographs [41], [42] and, training and education [43], [44]. ...
Article
Full-text available
The integration of augmented reality (AR) technology worldwide has been pivotal for advancements across various sectors. Its accessibility and availability are fundamental to technological evolution. This paper presents a literature review on the applications of AR in health sciences, highlighting its main contributions and advancements in this field. To assess the level of attention from the scientific community, three search algorithms were employed using the Scopus database with relevant keywords and a historical range extending to the present. The search was restricted to review, research, and conference articles. The results indicate a growing and profound interest from the research community in exploring the role of AR in health sciences over recent years. Index Terms-augmented reality, health sciences, literature review, search algorithm, database. Resumen-La inclusión de la tecnología de realidad aumentada (AR, por sus siglas en inglés) a nivel mundial ha sido crucial para el desarrollo de muchos sectores, su accesibilidad y disponibilidad es fundamental para la evolución tecnológica. Este documento presenta una revisión de literatura acerca de AR en las ciencias de la salud, sus principales contribuciones y avances en el campo de estudio. En ese sentido, para indagar sobre la atención de la comunidad científica hacia este tema; se emplearon tres algoritmos de búsqueda a través de la base de datos Scopus mediante palabras clave y años desde la historia hasta la actualidad. La búsqueda se limitó a artículos de revisión, investigación y conferencias. Los resultados demuestran que durante los últimos años ha habido un profundo interés por parte de la comunidad de investigadores para indagar acerca de AR sobre las ciencias de la salud. Palabras claves-realidad aumentada, ciencias de la salud, revisión de literatura, algoritmo de búsqueda, base de datos.
... 26,27 For instance, the intraoral scanners often provide the anterior and posterior teeth arrangement as well as the position of the dental cavity. 28,29 Dentists require teeth images from various angles such as right/left buccal and maxillary/mandibular occlusal for accurate diagnosis of teeth. 30,31 However, previously developed intraoral cameras have technical bottlenecks such as large minimum object distance, thick total track length, narrow viewing angle, and the absence of functionality fusion. ...
... full robot arm placement in minimally invasive gastrectomy (abdominal surgery) fromFotouhi et al. (2020) (Fig. 10 (b)). The remaining applications of surgical guidance cover topics such as stent-graft placement in endovascular aortic repair(Rynio et al., 2019), imaging probe navigation for tooth decay management(Zhou et al., 2019a), C-arm positioning guidance in percutaneous orthopaedic procedures, identification of spinal anatomy underneath the skin(Aaskov et al., 2019) and dissection guidance for vascular pedunculated flaps of the lower extremities presented byPratt et al. (2018) (Fig. ...
Preprint
Full-text available
This article presents a systematic review of optical see-through head mounted display (OST-HMD) usage in augmented reality (AR) surgery applications from 2013 to 2020. Articles were categorised by: OST-HMD device, surgical speciality, surgical application context, visualisation content, experimental design and evaluation, accuracy and human factors of human-computer interaction. 91 articles fulfilled all inclusion criteria. Some clear trends emerge. The Microsoft HoloLens increasingly dominates the field, with orthopaedic surgery being the most popular application (28.6\%). By far the most common surgical context is surgical guidance (n=58) and segmented preoperative models dominate visualisation (n = 40). Experiments mainly involve phantoms (n = 43) or system setup (n = 21), with patient case studies ranking third (n = 19), reflecting the comparative infancy of the field. Experiments cover issues from registration to perception with very different accuracy results. Human factors emerge as significant to OST-HMD utility. Some factors are addressed by the systems proposed, such as attention shift away from the surgical site and mental mapping of 2D images to 3D patient anatomy. Other persistent human factors remain or are caused by OST-HMD solutions, including ease of use, comfort and spatial perception issues. The significant upward trend in published articles is clear, but such devices are not yet established in the operating room and clinical studies showing benefit are lacking. A focused effort addressing technical registration and perceptual factors in the lab coupled with design that incorporates human factors considerations to solve clear clinical problems should ensure that the significant current research efforts will succeed.
... full robot arm placement in minimally invasive gastrectomy (abdominal surgery) fromFotouhi et al. (2020) (Fig. 10 (b)). The remaining applications of surgical guidance cover topics such as stent-graft placement in endovascular aortic repair(Rynio et al., 2019), imaging probe navigation for tooth decay management(Zhou et al., 2019a), C-arm positioning guidance in percutaneous orthopaedic procedures, identification of spinal anatomy underneath the skin(Aaskov et al., 2019) and dissection guidance for vascular pedunculated flaps of the lower extremities presented byPratt et al. (2018) (Fig. ...
Article
Full-text available
This article presents a systematic review of optical see-through head mounted display (OST-HMD) usage in augmented reality (AR) surgery applications from 2013 to 2020. Articles were categorised by: OST-HMD device, surgical speciality, surgical application context, visualisation content, experimental design and evaluation, accuracy and human factors of human-computer interaction. 91 articles fulfilled all inclusion criteria. Some clear trends emerge. The Microsoft HoloLens increasingly dominates the field, with orthopaedic surgery being the most popular application (28.6%). By far the most common surgical context is surgical guidance (n=58) and segmented preoperative models dominate visualisation (n = 40). Experiments mainly involve phantoms (n = 43) or system setup (n = 21), with patient case studies ranking third (n = 19), reflecting the comparative infancy of the field. Experiments cover issues from registration to perception with very different accuracy results. Human factors emerge as significant to OST-HMD utility. Some factors are addressed by the systems proposed, such as attention shift away from the surgical site and mental mapping of 2D images to 3D patient anatomy. Other persistent human factors remain or are caused by OST-HMD solutions, including ease of use, comfort and spatial perception issues. The significant upward trend in published articles is clear, but such devices are not yet established in the operating room and clinical studies showing benefit are lacking. A focused effort addressing technical registration and perceptual factors in the lab coupled with design that incorporates human factors considerations to solve clear clinical problems should ensure that the significant current research efforts will succeed.
Article
Objectives To investigate whether the scanning time, trueness and number of photos are influenced when augmented reality (AR) heads-up display (HUD) is utilized during the intraoral scan of fully dentate mandibular arches. Methods A total of 10 patients (6 females and 4 males) were included. The mandibular arch of each patient was scanned twice using an intraoral scanner (Trios4 Pod IOS: 3Shape): one with and one without AR-HUD (ML2; Magic Leap). Further, alginate impression was taken, and the cast was digitized to acquire the reference model for trueness comparison (T310, Medit). The scan time and number of photos were recorded. Trueness was evaluated qualitatively and quantitatively using colored heat maps and RMSE values respectively. t-test was used to evaluate the difference in scan time, trueness, and number of photos between the two groups (α = .05). Results AR-assisted IOS resulted in significantly faster scan time (44 seconds) compared to the time consumed following conventional scan method without AR-HUD (63 seconds) (P= <.001). The number of photos was also significantly less with AR-assisted IOS (836) compared to IOS using conventional technique without AR-HUD (1209) P= <.001. No statistical difference was detected in RMSE between the test groups. Conclusions Integration of AR technology with IOS process represents a promising potential to acquire digital impressions with reduced scan acquisition time and reduced images count while simultaneously maintaining the trueness of the acquired scans. Clinical Significance Augmented Reality presents an emerging potential in Prosthodontics to acquire digital impressions with decreased number of images and acquisition time.
Chapter
Full-text available
Artificial intelligence (AI) has transformed the medical field by providing intelligent algorithms and machine learning (ML) techniques that aid in accurate diagnosis and effective treatment planning. The chapter delves into the various AI applications like image recognition and natural language processing (NLP), which have significantly improved medical outcomes and patient care. Furthermore, the integration of big data analysis has revolutionized healthcare management by enabling comprehensive data collection, storage, and analysis. The chapter highlights the potential of leveraging big data to enhance clinical decision-making, identify patterns, and predict disease outbreaks. It also emphasizes the significance of data privacy and security in managing sensitive healthcare information. The advent of wearable medical devices has empowered one person to monitor their health condition in real time. The chapter explores the diverse range of wearable devices, including smartwatches, fitness trackers, and biosensors, with their pros and cons. The analysis of this data enables early detection of health issues and provides personalized feedback for improved self-management. Moreover, bio-signals and telemedicine have emerged as key components of modern healthcare delivery. By leveraging advanced technological solutions, these innovations hold the promise of enhancing medical practices and delivering better healthcare outcomes.
Article
Augmented reality (AR) is a technology offering the simultaneous integration of real-world surroundings and information in the form of audio, video, graphics and other virtual aspects. The AR is being used in various fields of engineering and Dentistry which included speciality level practices. The use of AR in dentistry varies from dental implant placement to orthognathic surgery and recently advancements are enabling in various fields of dentistry in orthodontics, Endodontics, Pedodontics, Periodontics, Surgery, Prosthodontics, Oral-pathology. This article summarizes the various applications of AR in dentistry. Key Words: Augmented reality, Digital dentistry, Dentistry 2.0, Digital dental era, Digital face, Technology
Article
Full-text available
In this study, the ozonation of cinnamaldehyde to benzaldehyde catalyzed by Ca(OH)2 was studied by using the bubbling reactor with in situ diffuse reflectance infrared Fourier transform spectrum (DRIFTS) and Density Function Theory (DFT). After the preparation, characterization, and catalytic activity evaluation of Ca(OH)2, the data showed that specific surface area, average pore diameter, and pore volume of Ca(OH)2 had a 1.77‐, 1.44‐, and 3.06‐fold larger than those of CaO, respectively. Compared with CaO catalysis, the activation energy of cinnamaldehyde ozonation decreased 41 % under Ca(OH)2 catalysis. in situ DRIFTS and DFT results indicated that alkaline‐earth metal ozonide was the yield on the Ca(OH)2 surface after the adsorption, and the ozonide selectively oxidated the carbon‐carbon double bond of cinnamaldehyde with the Criegee mechanism.
Article
Full-text available
Background: The aim of the present systematic review was to screen the literature and to describe current applications of augmented reality. Materials and methods: The protocol design was structured according to PRISMA-P guidelines and registered in PROSPERO. A review of the following databases was carried out: Medline, Ovid, Embase, Cochrane Library, Google Scholar and the Gray literature. Data was extracted, summarized and collected for qualitative analysis and evaluated for individual risk of bias (R.O.B.) assessment, by two independent examiners. Collected data included: year of publishing, journal with reviewing system and impact factor, study design, sample size, target of the study, hardware(s) and software(s) used or custom developed, primary outcomes, field of interest and quantification of the displacement error and timing measurements, when available. Qualitative evidence synthesis refers to SPIDER. Results: From a primary research of 17,652 articles, 33 were considered in the review for qualitative synthesis. 16 among selected articles were eligible for quantitative synthesis of heterogenous data, 12 out of 13 judged the precision at least as acceptable, while 3 out of 6 described an increase in operation timing of about 1 h. 60% (n = 20) of selected studies refers to a camera-display augmented reality system while 21% (n = 7) refers to a head-mounted system. The software proposed in the articles were self-developed by 7 authors while the majority proposed commercially available ones. The applications proposed for augmented reality are: Oral and maxillo-facial surgery (OMS) in 21 studies, restorative dentistry in 5 studies, educational purposes in 4 studies and orthodontics in 1 study. The majority of the studies were carried on phantoms (51%) and those on patients were 11 (33%). 
Conclusions: On the base of literature the current development is still insufficient for full validation process, however independent sources of customized software for augmented reality seems promising to help routinely procedures, complicate or specific interventions, education and learning. Oral and maxillofacial area is predominant, the results in precision are promising, while timing is still very controversial since some authors describe longer preparation time when using augmented reality up to 60 min while others describe a reduced operating time of 50/100%. Trial registration: The following systematic review was registered in PROSPERO with RN: CRD42019120058.
Article
Full-text available
Augmented reality (AR) technology, as a computer simulation technology, combines various technologies such as virtual reality, computer vision, computer network, and human-computer interaction. AR has been widely used in medicine. The introduction of AR can effectively help doctors complete preoperative planning, intraoperative guidance, postoperative evaluation and medical training. Oral medicine is a major branch of modern medicine. AR can enhance the doctor’s visual system, making the internal structure of the oral clearer and effectively reducing the difficulty of oral repair/surgery. Real-time tracking, registration, display, and interactive technologies for AR will play an important role in oral medicine. Among them, the registration technology has become an important indicator for evaluating the AR system, and it is also the main bottleneck restricting the stability and applicability of the current AR system. Therefore, we reviewed the registration technology of AR in oral medicine. Firstly, we conducted a hot spot analysis of AR keywords based on Citespace. And then, the registration technology is divided into static registration and real-time registration according to the actual clinical application, among which static registration is divided into rigid registration and non-rigid registration. We discussed problems and limitations of static registration and real-time registration in oral applications at this stage. Finally, the future direction of AR registration technology in oral medicine is proposed.
Article
Full-text available
Several studies have shown that near-infrared imaging has great potential for the detection of dental caries lesions. A miniature scanning fiber endoscope (SFE) operating at near-infrared (NIR) wavelengths was developed and used in this study to test whether the device could be used to discriminate demineralized enamel from sound enamel. Varying depths of artificial enamel caries lesions were prepared on 20 bovine blocks with smooth enamel surfaces. Samples were imaged with a SFE operating in the reflectance mode at 1310-nm and 1460-nm in both wet and dry conditions. The measurements acquired by the SFE operating at 1460-nm show significant difference between the sound and the demineralized enamel. There was a moderate positive correlation between the SFE measurements and micro-CT measurements, and the NIR SFE was able to detect the presence of demineralization with high sensitivity (0.96) and specificity (0.85). This study demonstrates that the NIR SFE can be used to detect early demineralization from sound enamel. In addition, the NIR SFE can differentiate varying severities of demineralization. With its very small form factor and maneuverability, the NIR SFE should allow clinicians to easily image teeth from multiple viewing angles in real-time.
Article
Full-text available
Endodontic treatment is performed to treat inflamed or infected root canal system of any involved teeth. It is estimated that 22.3 million endodontic procedures are performed annually in the USA. Preparing a proper access cavity before cleaning/shaping (instrumentation) of the root canal system is among the most important steps to achieve a successful treatment outcome. However, accidents such as perforation, gouging, ledge and canal transportation may occur during the procedure because of an improper or incomplete access cavity design. To reduce or prevent these errors in root canal treatments, this Letter introduces an assistive augmented reality (AR) technology on the head-mounted display (HMD). The proposed system provides audiovisual warning and correction in situ on the optical see-through HMD to assist the dentists to prepare access cavity. Interaction of the clinician with the system is via voice commands allowing the bi-manual operation. Also, dentist is able to review tooth radiographs during the procedure without the need to divert attention away from the patient and look at a separate monitor. Experiments are performed to evaluate the accuracy of the measurements. To the best of the authors’ knowledge, this is the first time that an HMD-based AR prototype is introduced for an endodontic procedure. © 2018 Institution of Engineering and Technology.All right reserved.
Augmented reality (AR) has proven to be a useful, exciting technology in several areas of healthcare. AR may especially enhance the operator's experience in minimally invasive surgical applications by providing more intuitive and naturally immersive visualisation in procedures that rely heavily on three-dimensional (3D) imaging data. Benefits include improved operator ergonomics, reduced fatigue, and simplified hand–eye coordination. Head-mounted AR displays may hold great potential for enhancing surgical navigation given their compactness and intuitiveness of use. In this work, the authors propose a method that intra-operatively locates bone structures using tracked ultrasound (US), registers them to the corresponding pre-operative computed tomography (CT) data, and generates a 3D AR visualisation of the operated surgical scene through a head-mounted display. The proposed method deploys optically tracked US, bone surface segmentation from the US and CT image volumes, and multimodal volume registration to align pre-operative data to the corresponding intra-operative data. The enhanced surgical scene is then visualised in an AR framework using a HoloLens. They demonstrate the method's utility using a foam pelvis phantom and quantitatively assess accuracy by comparing the locations of fiducial markers in the real and virtual spaces, yielding root mean square errors of 3.22, 22.46, and 28.30 mm in the x, y, and z directions, respectively.
It is challenging to achieve high accuracy in dental implant placement because high-risk tissues need to be avoided. In this study, we present an augmented reality (AR) surgical navigation system with an accurate cone beam computed tomography (CBCT)-to-patient registration method to provide clinically desired dental implant accuracy. A registration device is used to register the preoperative data to the patient outside the patient's mouth. After registration, the device is worn on the patient's teeth for patient tracking. Naked-eye 3D images of the planned path and the mandibular nerve are superimposed onto the patient in situ to form an AR scene. Simultaneously, a 3D image of the drill is accurately overlaid on the real one to guide the implant procedure. Finally, implant accuracy is evaluated postoperatively. A model experiment was performed by an experienced dentist. In total, ten parallel pins were inserted into five 3D-printed mandible models, guided by our AR navigation method and by the dentist's experience, respectively. AR-guided dental implant placement showed better results than the dentist's experience (mean target error = 1.25 mm vs. 1.63 mm; mean angle error = 4.03° vs. 6.10°). Experimental results indicate that the proposed method is expected to be applicable in the clinic.
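The two accuracy metrics above are standard: target error is the Euclidean distance between the planned and actual implant positions, and angle error is the angular deviation between the planned and actual implant axes. A minimal sketch with toy numbers, not the study's measurements:

```python
import math

def target_error(planned_tip, actual_tip):
    """Euclidean distance (mm) between planned and actual implant tips."""
    return math.dist(planned_tip, actual_tip)

def angle_error(planned_axis, actual_axis):
    """Angle (degrees) between planned and actual implant axis vectors."""
    dot = sum(p * a for p, a in zip(planned_axis, actual_axis))
    norms = math.hypot(*planned_axis) * math.hypot(*actual_axis)
    # Clamp to guard against floating-point drift outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norms))))

# Toy values (mm and unit-free direction vectors):
print(target_error((0.0, 0.0, 0.0), (1.0, 0.0, 0.75)))   # 1.25
print(angle_error((0.0, 0.0, 1.0), (0.0, 1.0, 1.0)))     # 45.0
```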
Although the term augmented reality appears increasingly in published studies, real-time, image-guided (so-called 'hands-free' and 'heads-up') surgery techniques are often confused with other virtual imaging procedures. A systematic review of the literature was conducted to classify augmented reality applications in the field of maxillofacial surgery. Publications containing the terms 'augmented reality', 'hybrid reality', and 'surgery' were sought through a search of three medical databases covering the years 1995–2018. Thirteen publications containing enough usable data to perform a comparative analysis of methods and results were identified. Five of the 13 described a method based on a hands-free, heads-up augmented reality approach using smart glasses or a headset combined with tracking. Most of the publications reported a minimum error of less than 1 mm between the virtual model and the patient. Augmented reality during surgery may be classified into four categories: heads-up guided surgery (type I), with tracking (Ia) or without tracking (Ib); guided surgery using a semi-transparent screen (type II); guided surgery based on the digital projection of images onto the patient (type III); and guided surgery based on the transfer of digital data to a monitor display (type IV). © 2018 International Association of Oral and Maxillofacial Surgeons