Preprint (PDF available; may not have been peer reviewed).
Towards Finger Motion Tracking and Analyses
for Cardiac Surgery
Mohammad Fattahi Sani1, Sajeeva Abeywardena1, Efi Psomopoulou1,
Raimondo Ascione2, and Sanja Dogramadzi1
1Bristol Robotics Laboratory, Coldharbour Lane, Bristol, UK
2Bristol Heart Institute, University of Bristol, Bristol, UK
Abstract. Robot Assisted Surgery is attracting an increasing amount of attention as it offers numerous benefits to patients as well as surgeons. Heart surgery requires a particularly high level of precision and dexterity compared to other surgical specialties. Robot assisted heart surgery is not widely performed, for reasons that include a lack of appropriate and intuitive surgical interfaces for controlling minimally invasive surgical tools. In this paper, the finger motion of a surgeon is analyzed during cardiac surgery tasks on an ex-vivo animal model, with the purpose of designing a more intuitive master console. First, a custom finger tracking system is developed using IMU sensors; it is lightweight and comfortable enough to allow free movement of the surgeon's fingers and hands while using instruments. The proposed system tracks finger joint angles and fingertip positions for the three digits involved (thumb, index, middle). The accuracy of the IMU sensors has been evaluated against an optical tracking system (Polaris, NDI). The finger motion of a cardiac surgeon using a Castroviejo instrument is studied in suturing and knotting scenarios. The results show that the PIP and MCP joints have a larger Range of Motion (ROM) and a faster rate of change than the other finger/thumb joints, while the thumb has the largest Fingertip Workspace (FWS) of the three digits.
Keywords: Robot Assisted Surgery, Finger Tracking, Cardiac surgery
1 Introduction
Minimally invasive surgery brought a remarkable improvement over open surgery in terms of hospitalization time and patient scarring [1]. On the other hand, it has complicated the performance of surgery due to a range of problems such as the fulcrum effect, inaccurate scaling, and a lack of precision, dexterity and intuitive handling [2]. Robot Assisted Surgery (RAS) has addressed some of these problems. Among the many available robotic systems, the da Vinci is currently the only one widely used in hospitals [3]. In the field of cardiac surgery, however, RAS has not been widely adopted [3]; its application is mainly limited to mitral valve repair [2-4] and, to a smaller extent, coronary artery revascularization [2,3], closure of simple septal defects, tricuspid valve repair, and cardiac tumor removal [4]. In the literature, the cost and steep learning curve of currently available systems are cited as reasons for this [1,3,5,6]. Despite the da Vinci system's 7-DOF master station and wide range of tools, the control of the fine movements required in heart surgery is still an issue [6-8]. In addition, the lack of haptic feedback [1,5,7,8] and the bulkiness of the system [1,7] motivate research on more intuitive control of robotic surgical tools in this surgical area [8]. The most important step towards designing a teleoperation system that deals with the restricted space of heart surgery and the finer set of motions required is to gain a thorough understanding of the characteristics of the finger/hand motions of cardiac surgeons. These characteristics are necessary for developing better, more customized master-slave robotic tools for cardiac surgery. Therefore, the aim of our work is to track and analyze finger/hand motion during cardiac surgery, with the purpose of extracting the ROM of each finger and thumb joint, as well as the FWS, during different maneuvers of cardiac surgery.
2 Related Works
Finger motion tracking serves many purposes, such as teleoperating a robotic hand [9] or analyzing patient motion in order to study diseases [10,11]. Methods of implementing finger tracking include: 1) Inertial Measurement Unit (IMU) based sensors [12,13]; 2) optical tracking systems [9,11,14-17]; 3) exoskeleton-based systems, including anthropomorphic exoskeletons attached to the fingers and highly redundant exoskeletons attached on top of the hand and wrist [18]; 4) magnetic sensing [19]; 5) flex-sensor-based systems [20]; and 6) fusion of several methods [21]. When it comes to tracking a surgeon's fingers, optical tracking methods are highly vulnerable to occlusion [22]. Exoskeleton-based finger-tracking methods, on the other hand, are usually too bulky for surgery, and some models suffer from inaccuracy due to misalignment between the exoskeleton and the finger joints [18]. IMU sensors are a relatively small, lightweight and occlusion-free means of acquiring three rotation angles for each link of the hand/finger. Nevertheless, their values may drift over time and are susceptible to magnetic interference.
Hand motion analysis is a useful tool in many applications. The researchers in [21] analyzed three different data sources, a data glove, force sensors and EMG sensors, in order to study human hand motion. Hand motion analysis is typically used to study physical impairments [10,11], electrical stimulation of the hand [15], joint loads [16] and gestures [23], or to find the comfort zone of the fingers when interacting with smartphones [17].
Surgical tool tracking has already been studied in the literature, using either 2D or 3D image processing [24] or mounted optical tracking markers [25]. In contrast, a reliable platform for analyzing surgeons' finger motions has yet to be developed.
Our Contributions: We propose a tracking system customized for tracking a surgeon's hand, which is occlusion-free, precise and lightweight, to collect data during specific cardiac surgery tasks. We have identified the ROM and FWS of three digits (middle, index and thumb) in different stages of the surgery.
3 Hand Kinematic Model
In order to study hand/finger motions, a good understanding of the biomechanics of the hand is essential. A full hand consists of 27 bones, including those of the fingers, thumb, palm, and wrist [22]. Each finger is comprised of three bones called phalanges. The fingers are attached to the metacarpals through the metacarpophalangeal (MCP) joint, which is followed by the proximal interphalangeal (PIP) and distal interphalangeal (DIP) joints. The thumb, which plays a crucial role in human manipulation capability, consists of two phalanges followed by a metacarpal bone attached to the carpus [18]. Various approaches have been put forward to model hand/finger kinematics [26,27]. In this study, we are mainly interested in modeling the middle finger, index finger and thumb; we therefore utilize the simple model proposed in [27], which can be seen in Fig. 1.
Fig. 1. Finger joint models used in our method
As can be seen in Fig. 1, the DIP and PIP joints of the index and middle fingers, as well as the IP and MCP joints of the thumb, each have one revolute joint, whereas the MCP and CMC joints have two-DOF revolute joints. We measure the absolute orientation of each phalanx with a single IMU; the joint angles can therefore be calculated by simply subtracting the orientations of two consecutive IMU sensors. The sensors are aligned with the finger/thumb lengthwise (Y axis), while the sensor Z axis is perpendicular to the finger/thumb. Let φ, θ and ψ denote rotation around the Z, Y and X axes, respectively. Then, for instance, φ_MCP and θ_MCP, the φ and θ angles of the MCP joint, can be calculated as follows:
φ_MCP = φ_MC - φ_MP ,    θ_MCP = θ_MC - θ_MP    (1)
Having all the required joint angles allows calculation of the fingertip positions of the two fingers and the thumb using forward kinematics [26]. The Denavit-Hartenberg (DH) parameters of the fingers and the thumb, and the forward kinematics calculations, follow the SynGrasp toolbox [28].
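The joint-angle subtraction of Eq. (1) and the fingertip computation can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the subtraction follows Eq. (1), while the forward kinematics here is simplified to a planar chain with hypothetical phalanx lengths (the paper uses the full 3D DH model from SynGrasp).

```python
import numpy as np

def joint_angles(proximal_euler, distal_euler):
    """Joint angles as the difference between the absolute orientations of
    two consecutive IMUs (Eq. 1), e.g. phi_MCP = phi_MC - phi_MP.
    Euler angles are (phi, theta, psi) about the Z, Y, X sensor axes (deg)."""
    return np.asarray(proximal_euler) - np.asarray(distal_euler)

def fingertip_position(flexions_deg, lengths_mm):
    """Planar forward kinematics for one finger: accumulate the flexion
    angles along the phalanges. Illustrative only; the paper computes 3D
    fingertip positions with the SynGrasp DH model."""
    x = y = 0.0
    cumulative = 0.0
    for angle, length in zip(np.deg2rad(flexions_deg), lengths_mm):
        cumulative += angle
        x += length * np.cos(cumulative)
        y += length * np.sin(cumulative)
    return x, y

# MCP angles from the metacarpal and proximal-phalanx IMU readings
phi_theta = joint_angles([45.0, 7.0, 0.0], [10.0, 2.0, 0.0])  # -> [35, 5, 0]
# Fingertip from hypothetical MCP/PIP/DIP flexions and phalanx lengths (mm)
tip = fingertip_position([35.0, 20.0, 10.0], [45.0, 25.0, 18.0])
```
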
4 Experimental setup
The tracking system consists of 10 BNO055 IMU sensors (4 on the thumb and 3 on each of the index and middle fingers). The IMU sensors are connected to an acquisition board comprised of multiple microcontroller units (MCUs) running Arduino firmware. Each MCU connects to two IMU sensors on a single I2C bus. The orientation from each sensor is sent to the computer at a frequency of 50 Hz through a serial port. Fig. 2 shows the experimental implementation of the data acquisition system.
(a) Our custom hand tracking system. (b) Reconstructed model of the hand on the computer.
Fig. 2. Experimental setup for hand/finger tracking.
In addition to the sensors' internal calibration process, we remove the offset by measuring the angle values at the beginning of the test.
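The offset-removal step can be sketched as follows. This is an assumption-laden illustration: the paper does not state how many rest samples are averaged or what pose the hand holds during them.

```python
import numpy as np

class OffsetCalibrator:
    """Removes the constant orientation offset of each IMU by averaging a
    short burst of readings captured at the start of the test while the
    hand rests in a neutral pose. Minimal sketch; the rest pose and the
    number of averaged samples are assumptions, not stated in the paper."""

    def __init__(self):
        self.offset = None

    def calibrate(self, rest_samples):
        # rest_samples: (N, 3) Euler angles (deg) recorded at rest
        self.offset = np.mean(np.asarray(rest_samples), axis=0)

    def apply(self, euler_deg):
        # Subtract the stored offset from a live reading
        return np.asarray(euler_deg) - self.offset

cal = OffsetCalibrator()
cal.calibrate([[2.0, -1.0, 0.5], [2.2, -0.8, 0.7]])  # readings at rest
corrected = cal.apply([12.1, 4.1, 0.6])  # -> [10.0, 5.0, 0.0]
```
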
4.1 Accuracy assessment of IMU finger tracking system
In order to validate the accuracy of the hand tracking system, the digit orientations and joint angles estimated by the IMUs were compared to a ground truth. As can be seen in Fig. 3a, an angled circle is used as the ground truth to test the accuracy of the IMU sensors in each orientation. The results show that the average error is 4.1° with a standard deviation of 2°. The researchers in [12] assessed the accuracy of the same sensors in a planar pose and reported average errors of 3-6° with a standard deviation of 1.7° across different angles. The researchers in [13] reported static angle errors of 2° for their IMU-based hand tracking system.
(a) Accuracy assessment setup in planar mode. (b) Accuracy assessment in 3D
Fig. 3. Accuracy assessment setup.
In addition, a dynamic angle verification was carried out using a Polaris Spectra system (NDI). A custom tool with optical markers (Fig. 3b), which also houses an IMU sensor, was designed and attached to the index finger, and orientations were measured relative to a marker and a sensor fixed on the palm. Fig. 4 shows simple flexion-extension movements of the index finger and the corresponding error values. According to these experiments, the measurement error is less than 8° for rotation around the Z, Y and X axes of the sensor.
(a) Finger joint angles extracted from IMU
sensors and NDI Polaris system.
(b) Joint angle error with respect to NDI
Polaris system.
Fig. 4. Finger motions: Flexion, Extension. IMU sensors compared with NDI Polaris
motion capture system.
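The error figures above (mean and standard deviation of the angular error against the reference) can be computed from two time-aligned traces; the traces in this sketch are hypothetical, not the recorded data.

```python
import numpy as np

def angle_error_stats(imu_deg, reference_deg):
    """Mean absolute error and its standard deviation between an IMU
    joint-angle series and a time-aligned ground-truth series (e.g. the
    NDI Polaris traces of Fig. 4)."""
    err = np.abs(np.asarray(imu_deg) - np.asarray(reference_deg))
    return float(err.mean()), float(err.std())

# Hypothetical flexion traces sampled at matching instants (deg)
mae, sd = angle_error_stats([10.0, 30.5, 58.0], [12.0, 30.0, 55.0])
```
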
4.2 Cardiac Surgery data collection
An experienced cardiac surgeon performed typical mitral valve surgery and aorta suturing tasks on an ex-vivo pig heart specimen with the tracking system fitted to his hands, as shown in Fig. 5. The tasks were performed using a 7" Castroviejo needle holder (straight) and a 7" Castroviejo micro scissor.
(a) Finger/hand data collection using our developed system during the mitral valve surgery. (b) A set of Castroviejo instruments used in the tasks.
Fig. 5. Cardiac surgery data collection setup and instruments.
4.3 Results and discussion
ROM and FWS are two important features of finger motion; they were extracted in [11] and [15] to study human functional abilities and to explore hand grasp patterns, respectively. In our study, the ROM of each joint of the surgeon's fingers has been extracted and is shown in Table 1. Figs. 6-9 show the joint angles for a sample suturing operation.
Table 1. Range of motion (ROM) of the surgeon's finger and wrist joints (degrees).

           Index                     Middle                    Thumb                   Wrist
           φDIP φPIP φMCP θMCP      φDIP φPIP φMCP θMCP      φIP  φMCP φCMC θCMC     φ    θ
Knotting    23   40   30   10        50   85   60   21        35   38   25   15      25   41
Suturing    15   25   19   12        39   60   55   30        22   15   15   20      45   45

*φ stands for flexion-extension movement of the fingers, whereas θ stands for abduction-adduction.
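The per-joint ROM values tabulated above reduce to a one-line computation over each joint-angle recording; a minimal sketch (the trace is hypothetical):

```python
import numpy as np

def range_of_motion(angles_deg):
    """ROM of a single joint over a task segment: the span between the
    largest and smallest recorded angle, as tabulated in Table 1."""
    a = np.asarray(angles_deg)
    return float(a.max() - a.min())

# Hypothetical PIP flexion trace during a knotting segment (deg)
rom = range_of_motion([12.0, 40.0, 88.0, 55.0, 20.0])  # -> 76.0
```
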
Fig. 6. Middle finger joint angles
Fig. 7. Index finger joint angles
Fig. 8. Thumb joint angles
Fig. 9. Wrist joint angles
The rate of change is another valuable characteristic of finger joint motion, as it shows how fast each joint moves. Taking φ[n] as a discrete-time series of joint angles, the angular velocity ω is calculated as

ω[n] = (φ[n] - φ[n-1]) · Fs    (2)

where Fs = 50 Hz is the sampling frequency, so that 1/Fs is the sampling interval. The average angular velocity ω_avg of a series with N samples is then

ω_avg = (1/N) Σ_{n=1}^{N} |ω[n]|    (3)
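The mean rate of change can be sketched as below. Two assumptions are made explicit here: the backward difference is scaled by the 50 Hz sampling rate to give degrees per second, and the average is taken over the magnitude of the velocity, since signed velocities would largely cancel over repeated flexion-extension cycles.

```python
import numpy as np

FS = 50.0  # sampling frequency (Hz)

def mean_rate_of_change(phi_deg):
    """Backward-difference angular velocity of a joint-angle series
    sampled at FS, averaged in magnitude over the recording. Taking the
    magnitude is an assumption of this sketch."""
    omega = np.diff(np.asarray(phi_deg)) * FS  # deg/s
    return float(np.mean(np.abs(omega)))

# Hypothetical 5-sample flexion trace (deg)
avg = mean_rate_of_change([0.0, 1.0, 3.0, 2.0, 2.5])  # -> 56.25 deg/s
```
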
Fig. 10 shows the rate of change for two different surgical scenarios. In addition, the FWS during the surgery is shown in Fig. 11.
(a) Knotting. (b) Suturing.
Fig. 10. Mean rate of change for each joint (joints are numbered from 1 for DIP joint).
A closer look at the ROM results shows that the MCP and PIP joints of the middle finger have a relatively high range of motion compared to the others. Furthermore, according to Fig. 10, the PIP and MCP joints are the ones that change fastest. Finally, Fig. 11 shows that the FWS of the thumb is larger than that of the middle and index fingers.
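The FWS comparison can be quantified from the recorded fingertip positions. The paper does not state how FWS is measured, so this sketch uses a simple proxy, the axis-aligned bounding-box volume of the fingertip point cloud; a convex-hull volume would be a tighter alternative.

```python
import numpy as np

def fws_bounding_volume(tips_xyz_mm):
    """Proxy for the fingertip workspace (FWS): volume of the axis-aligned
    bounding box enclosing all recorded fingertip positions. Illustrative
    only; the paper's FWS metric is not specified."""
    p = np.asarray(tips_xyz_mm)
    extents = p.max(axis=0) - p.min(axis=0)
    return float(np.prod(extents))

# Hypothetical thumb-tip positions (mm)
vol = fws_bounding_volume([[0, 0, 0], [20, 10, 5], [10, 30, 15]])  # -> 9000.0
```
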
5 Conclusion
In this paper, the motion of the fingers and thumb during typical cardiac surgery tasks was studied. A custom IMU-based finger tracking system has been developed which is lightweight, comfortable, and precise enough to track a surgeon's finger motions. The data captured during heart surgery show that the PIP and MCP joints move faster and through a greater range than the other tracked joints, and that the thumb has the largest FWS. The outcome of this research will be further utilized in designing a new, intuitive way of controlling teleoperated surgical robot tools for heart surgery. In future steps, the surgical tools will also be monitored and their motion analyzed together with the finger/thumb motions.

Fig. 11. Fingertip positions during the suturing process with the Castroviejo tool.
References

1. Peters BS, Armijo PR, Krause C, Choudhury SA, Oleynikov D (2018) Review of emerging surgical robotic technology. Surg Endosc 32:1636-1655
2. Rodriguez E, Chitwood WR (2009) Robotics in cardiac surgery. Scand J Surg
3. Quint E, Sivakumar G (2019) The role of robotic technology in cardiac surgery. Univ West Ont Med J 87:40-42
4. Dearani JA (2018) Robotic heart surgery: Hype or hope? J Thorac Cardiovasc Surg
5. Pettinari M, Navarra E, Noirhomme P, Gutermann H (2017) The state of robotic cardiac surgery in Europe. Ann Cardiothorac Surg 6:1-8
6. Scholar MD, Ashford H (2015) Use of Robots on Cardiac Surgery.
7. Simorov A, Otte RS, Kopietz CM, Oleynikov D (2012) Review of surgical robotics user interface: What is the best way to control robotic surgery? Surg Endosc
8. Freschi C, Ferrari V, Melfi F, Ferrari M, Mosca F, Cuschieri A (2010) Technical review of the da Vinci surgical telemanipulator. Int J Med Robot 6:468-472
9. Cerulo I, Ficuciello F, Lippiello V, Siciliano B (2017) Teleoperation of the SCHUNK S5FH under-actuated anthropomorphic hand using human hand motion tracking. Rob Auton Syst 89:75-84
10. Medicine P (2016) Kinematic Motion Analysis in Upper Extremity Cerebral Palsy.
11. Leitkam ST, Reid Bush T (2014) Comparison Between Healthy and Reduced Hand Function Using Ranges of Motion and a Weighted Fingertip Space Model. J Biomech Eng 137:041003
12. Liu H, Xie X, Millar M, Edmonds M, Gao F, Zhu Y, Santos VJ, Rothrock B, Zhu S-C. A Glove-based System for Studying Hand-Object Manipulation via Joint Pose and Force Sensing.
13. Lin B-S, Lee I-J, Chiang P-Y, Huang S-Y, Peng C-W. A Modular Data Glove System for Finger and Hand Motion Capture Based on Inertial Sensors. J Med Biol Eng. doi: 10.1007/s40846-018-0434-6
14. Cerveri P, De Momi E, Lopomo N, Baud-Bovy G, Barros RML, Ferrigno G. Finger Kinematic Modeling and Real-Time Hand Motion Estimation. doi: 10.1007/s10439-
15. Shin H, Watkins Z, Hu X. Exploration of Hand Grasp Patterns Elicitable Through Non-Invasive Proximal Nerve Stimulation. doi: 10.1038/s41598-017-16824-1
16. Vignais N, Cocchiarella DM, Kociolek AM, Keir PJ (2012) Dynamic Assessment of Finger Joint Loads Using Kinetic and Kinematic Measurements. Digit Hum Model 2013, 1-5
17. Le HV, Mayer S, Bader P, Henze N (2018) Fingers' Range and Comfortable Area for One-Handed Smartphone Interaction Beyond the Touchscreen. 1-12
18. Sarakoglou I, Brygo A, Mazzanti D, Hernandez NG, Caldwell DG, Tsagarakis NG (2016) HEXOTRAC: A highly under-actuated hand exoskeleton for finger tracking and force feedback. IEEE Int Conf Intell Robot Syst, 1033-1040
19. Chen K-Y, Patel S, Keller S (2016) Finexus: Tracking Precise Motions of Multiple Fingertips Using Magnetic Sensing. CHI '16, 1504-1514
20. Saggio G, Pallotti A, Sbernini L, Errico V, Di Paolo F (2016) Feasibility of Commercial Resistive Flex Sensors for Hand Tracking Applications. Sensors & Transducers 201:17-26
21. Ju Z, Liu H (2014) Human Hand Motion Analysis With Multisensory Information. IEEE/ASME Trans Mechatronics 19:456-466
22. Wheatland N, Wang Y, Song H, Neff M, Zordan V, Jörg S. State of the Art in Hand and Finger Modeling and Animation.
23. Tits M (2018) Expert Gesture Analysis through Motion Capture using Statistical Modeling and Machine Learning. doi: 10.13140/RG.2.2.36839.50084
24. Zhang L, Ye M, Chan P-L, Yang G-Z (2017) Real-time surgical tool tracking and pose estimation using a hybrid cylindrical marker. Int J CARS 12:921-930
25. Ye M, Zhang L, Giannarou S, Yang G-Z (2016) Real-time 3D Tracking of Articulated Tools for Robotic Surgery.
26. Van Der Hulst FPJ, Schätzle S, Preusche C, Schiele A (2012) A functional anatomy based kinematic human hand model with simple size adaptation. Proc IEEE Int Conf Robot Autom, 5123-5129
27. Li K, Chen I-M, Yeo SH, Lim CK (2011) Development of finger-motion capturing device based on optical linear encoder. J Rehabil Res Dev 48:69
28. Malvezzi M, Salvietti G, Gioioso G, Prattichizzo D (2015) SynGrasp: A Matlab Toolbox for Underactuated and Compliant Hands, Version 2.2 User Guide.
... Electrospun dressings can mimic the architecture of the extracellular matrix (ECM). These fibers exhibit a large surface area that can be modified to incorporate bioactive compounds within their structure to fulfill specific requirements of the injured area [12]. Polyvinyl alcohol (PVA) is a Food and Drug Administration (FDA)-approved synthetic and water-soluble polymer, biodegradable, biocompatible, with high electrospinnability and suitable mechanical properties, commonly used for biomedical purposes [13]. ...
... Other ratios, namely 70/30, 60/40 and 0/100 v/v, were also tested, yet without success. Ratios were selected with the goal of integrating the cytocompatibility and the low immunogenicity of CA [12] with the wellestablished flexibility, mechanical resilience and electrospinnability of PVA. Medium Mw PVA, partially hydrolyzed, was used to enhance its stability on the solvent and to confer proper surface tension to the solution [37]. ...
Full-text available
In this research, we propose to engineer a nanostructured mat that can simultaneously kill bacteria and promote an environment conducive to healing for prospective wound care. Polyvinyl alcohol (PVA) and cellulose acetate (CA) were combined at different polymer ratios (100/0, 90/10, 80/20% v/v), electrospun and crosslinked with glutaraldehyde vapor. Crosslinked fibers increased in diameter (from 194 to 278 nm), retaining their uniform structure. Fourier-transform infrared spectroscopy and thermal analyses proved the excellent miscibility between polymers. CA incorporation incremented the fibers swelling capacity and reduced the water vapor and air permeabilities of the mats, preventing the excessive drying of wounds. The antimicrobial peptide cys-pexiganan and the immunoregulatory peptide Tiger 17 were incorporated onto the mats via polyethylene glycol spacer (hydroxyl-PEG2-maleimide) and physisorbed, respectively. Time-kill kinetics evaluations revealed the mats effectiveness against Staphylococcus aureus and Pseudomonas aeruginosa. Tiger 17 played a major role in accelerating clotting of re-calcified plasma. Data reports for the first time the collaborative effect of pexiganan and Tiger 17 against bacterial infections and in boosting hemostasis. Cytocompatibility data verified the peptide-modified mats safety. Croslinked 90/10 PVA/CA mats were deemed the most promising combination due to their moderate hydrophilicity and permeabilities, swelling capacity, and high yields of peptide loading.
... We have put forward alternative biomarkers based on the analysis of computed fundus images for individual layers of the retina [14]. Also, we recently demonstrated that texture biomarkers allow not only to distinguish AD patients from age-matched healthy controls correctly but also to distinguish them from age-matched Parkinson's disease patients [15], thus demonstrating the potential of texture biomarkers for the study of neurodegenerative disorders [16]. ...
Full-text available
Alzheimer’s disease (AD) is a progressive neurodegenerative disorder whose diagnosis remains a notable challenge. The literature suggests that cerebral changes precede AD symptoms by over two decades, implying a significantly advanced stage of AD by the time it is usually diagnosed. In the study herein, texture analysis was applied to computed optical coherence tomography ocular fundus images to identify differences between a group of the transgenic mouse model of the Alzheimer’s disease (3×Tg-AD) and a group of wild-type mice, at the ages of one and two-months-old. A substantial difference between groups was found at both time-points across all neuroretina’s layers. Here, the inner nuclear layer stands out both in the level of statistically significant differences and on the extension of these differences which span through the imaged area. Also, the progression of AD is suggested to be spotted by texture analysis as demonstrated by the significant difference found in the inner plexiform and the outer nuclear layers from the age of one to the age of two-months-old. These findings demonstrate the potential of the use of the retina and texture analysis to the diagnosis of AD and monitor AD progression. Besides, the differences between groups found in this study suggest that the 3×Tg-AD model may be inappropriate to study early changes associated with the AD and other animal models should be tested following the same path and rationale. Moreover, these results also suggest that the human genes present in these transgenic mice may have an impact on the neurodevelopment of offspring which would justify the significant changes found at the age of one-month-old.
Full-text available
The aim of this study was to determine a gait pattern, i.e., a subset of spatial and temporal parameters, through a supervised machine learning (ML) approach, which could be used to reliably distinguish Parkinson’s Disease (PD) patients with and without mild cognitive impairment (MCI). Thus, 80 PD patients underwent gait analysis and spatial–temporal parameters were acquired in three different conditions (normal gait, motor dual task and cognitive dual task). Statistical analysis was performed to investigate the data and, then, five ML algorithms and the wrapper method were implemented: Decision Tree (DT), Random Forest (RF), Naïve Bayes (NB), Support Vector Machine (SVM) and K-Nearest Neighbour (KNN). First, the algorithms for classifying PD patients with MCI were trained and validated on an internal dataset (sixty patients) and, then, the performance was tested by using an external dataset (twenty patients). Specificity, sensitivity, precision, accuracy and area under the receiver operating characteristic curve were calculated. SVM and RF showed the best performance and detected MCI with an accuracy of over 80.0%. The key features emerging from this study are stance phase, mean velocity, step length and cycle length; moreover, the major number of features selected by the wrapper belonged to the cognitive dual task, thus, supporting the close relationship between gait dysfunction and MCI in PD.
Full-text available
Teledermatology has given dermatologists a tool to track patients’ responses to therapy using images. Virtual assistants, the programs that interact with users through text or voice messages, could be used in teledermatology to enhance the interaction of the tool with the patients and healthcare professionals and the overall impact of the medication and quality of life of patients. As such, this work aimed to investigate the effectiveness of using a virtual assistant for teledermatology and its impact on the quality of life. We conducted surveys with the participants and measured the usability of the system with the System Usability Scale (SUS). A total of 34 participants (30 patients diagnosed with moderate-severe psoriasis and 4 healthcare professionals) were included in the study. The measurement of the improvement of quality of life was done by analyzing Psoriasis Quality of Life (PSOLIFE) and Dermatology Life Quality Index (DLQI) questionnaires. The results showed that, on average, the quality of life improved (from 63.8 to 64.8 for PSOLIFE (with a p-value of 0.66 and an effect size of 0.06) and 4.4 to 2.8 for DLQI (with a p-value of 0.04 and an effect size of 0.31)). Patients also used the virtual assistant to do 52 medical consultations. Moreover, the usability is above average, with a SUS score of 70.1. As supported by MMAS-8 results, adherence also improved slightly. Our work demonstrates the improvement of the quality of life with the use of a virtual assistant in teledermatology, which could be attributed to the sense of security or peace of mind the patients get as they can contact their dermatologists directly within the virtual assistant-integrated system.
Full-text available
Hyperthermia using High‐Intensity Focused Ultrasound (HIFU) is an acoustic therapy for cancer treatment. This technique consists of an increase in the temperature field of the tumor to achieve coagulative necrosis and immediate cell death. Therefore, for having a successful treatment, the physical problem requires to know several properties due to the high variability from individual to individual, or even for the same individual under different physiological conditions. This paper presents a numerical simulation of hyperthermia therapy for cancer treatment using HIFU, as well as the estimation of parameters that influence the physical problem. Two mathematical models were considered to solve the forward problem. The acoustic model based on acoustic pressure performs a frequency‐domain study, and the bioheat transfer model a time‐dependent study. These models were solved using Comsol Multiphysics® software in a 2D‐axisymmetric rectangular domain to determine the temperature field. Parameter estimation was coded in Matlab Mathworks® environment using a Bayesian approach. The Markov Chain Monte Carlo method by the Metropolis‐Hastings algorithm was implemented, and the simulated temperature measurements were considered. Results suggest that specific HIFU therapy can be performed for each patient by estimating appropriate parameters for cancer treatment and provides the possibility to define procedures before and during the treatment. This article is protected by copyright. All rights reserved.
Full-text available
Background: The status of the data-driven management of cancer care as well as the challenges, opportunities, and recommendations aimed at accelerating the rate of progress in this field are topics of great interest. Two international workshops, one conducted in June 2019 in Cordoba, Spain, and one in October 2019 in Athens, Greece, were organized by four Horizon 2020 (H2020) European Union (EU)–funded projects: BOUNCE, CATCH ITN, DESIREE, and MyPal. The issues covered included patient engagement, knowledge and data-driven decision support systems, patient journey, rehabilitation, personalized diagnosis, trust, assessment of guidelines, and interoperability of information and communication technology (ICT) platforms. A series of recommendations was provided as the complex landscape of data-driven technical innovation in cancer care was portrayed. Objective: This study aims to provide information on the current state of the art of technology and data-driven innovations for the management of cancer care through the work of four EU H2020–funded projects. Methods: Two international workshops on ICT in the management of cancer care were held, and several topics were identified through discussion among the participants. A focus group was formulated after the second workshop, in which the status of technological and data-driven cancer management as well as the challenges, opportunities, and recommendations in this area were collected and analyzed. Results: Technical and data-driven innovations provide promising tools for the management of cancer care. However, several challenges must be successfully addressed, such as patient engagement, interoperability of ICT-based systems, knowledge management, and trust. This paper analyzes these challenges, which can be opportunities for further research and practical implementation and can provide practical recommendations for future work. 
Conclusions: Technology and data-driven innovations are becoming an integral part of cancer care management. In this process, specific challenges need to be addressed, such as increasing trust and engaging the whole stakeholder ecosystem, to fully benefit from these innovations.
Full-text available
Minimally invasive surgical techniques have been developed in order to improve patient outcomes and satisfaction. These minimally invasive techniques have been applied to numerous fields, including cardiac surgery. Currently, mitral valve repair and coronary artery bypass grafting are the most common procedures performed robotically. Numerous studies have shown that robotic technology provides similar outcomes to traditional surgery, which is much more invasive. However, there are numerous barriers to performing robotic surgery, including the cost of robotic systems and the steep learning curve associated with these systems. It is predicted that the indications for robotic cardiac surgery will increase as these limitations are addressed.
Full-text available
The present thesis is a contribution to the field of human motion analysis. It studies the possibilities for a computer to interpret human gestures, and more specifically to evaluate the quality of expert gestures. These gestures are generally learned through an empirical process, limited to the subjectivity and own perception of the teacher. In order to objectify the evaluation of the quality of these gestures, researchers have proposed various measurable criteria. However, these measurements are still generally based on human observation. Enabled by significant steps in the development of Motion Capture (MoCap) and artificial intelligence technologies, research on automatic gesture evaluation has sparked a new interest, due to its applications in education, health and entertainment. This research field is, however, recent and sparsely explored. The few studies on the subject generally focus on a small dataset, limited to a specific type of gestures, and a data representation specific to the studied discipline, hereby limiting the validity of their results. Moreover, the few proposed methods are rarely compared, due to the lack of available benchmark datasets and of reproducibility on other types of data. The aim of this thesis is therefore to develop a generic framework for the development of an evaluation model for the expertise of a gesture. The methods proposed in this framework are designed to be reusable on various types of data and in various contexts. The framework consists of six sequential steps, for each of which an original contribution is proposed in the present thesis: Firstly, a benchmark dataset is proposed to promote further research in the domain and allow method comparison. The dataset consists of repetitions of 13 Taijiquan techniques by 12 participants of various levels from novice to expert, resulting in a total of 2200 gestures. 
Secondly, the MoCap data must be processed to ensure that high-quality data are used when designing an evaluation model. To that end, an original method is proposed for automatic and robust recovery of optical MoCap data, based on a probabilistic averaging of different individual recovery models and the application of automatic skeleton constraints. In an experiment in which missing data were simulated in a MoCap dataset, the proposed method outperforms various methods from the literature, independently of gap length, sequence duration and the number of simultaneous gaps. Thirdly, various motion features are proposed to represent different aspects of motion, potentially correlated with different components of expertise. Additionally, a new set of features is proposed, inspired by Taijiquan ergonomic principles: 36 new motion features representing aspects of stability, joint alignments, optimal joint angles and fluidity. Fourthly, the features must be processed to provide a more relevant representation of expertise. The present work addresses the influence of morphology on motion. Morphology is an individual factor that strongly influences motion but is not related to expertise. A novel method is therefore proposed for extracting motion features independent of morphology. From a linear model of the relation of each feature to a morphological factor, residuals are extracted, providing a morphology-independent version of the motion features. As a consequence, the resulting features are (i) less correlated with each other and (ii) enable a more relevant comparison between the gestures of different individuals, thereby allowing a more relevant modeling of expertise.
Results show that the method, termed Morphology-Independent Residual Feature Extraction (MIRFE), outperforms a baseline method (skeleton scaling) in (i) reducing the correlation with the morphological factor and (ii) improving the correlation with skill, for various gestures of the Taijiquan MoCap dataset and for a large set of motion features. Fifthly, an evaluation model must be developed from these features, allowing the prediction of the expertise level of a new gesture performed by a new user. A model based on feature statistics, dimension reduction and regression is proposed. The model is designed to be usable with any motion feature, in order to be generic and relevant in different contexts, including various users and various types of gestural disciplines. Trained on the Taijiquan MoCap dataset, the model outperforms two methods from the literature in evaluating the gestures of a new user, with a mean relative prediction error of 10% (R = 0.909). Additionally, a first exploration of the use of deep learning for gesture evaluation is proposed: MoCap sequences are represented as abstract RGB images and used for transfer learning on a pre-trained image classification convolutional neural network. Despite a lower performance (R = 0.518), an analysis of the results suggests that the model could achieve better performance given a larger dataset including more novices and experts. Sixthly, and finally, to allow practical use of the evaluation model, a feedback system must provide an intuitive interpretation of the predicted level, allowing effective understanding and assimilation by the user of the system. In the present work, an original and generic feedback system is proposed, based on the synthesis of an improved gesture and its comparison to the user's original gesture.
Both intuitive and precise feedback are proposed, based on (i) synchronized visualization of both gestures and (ii) striped images highlighting the motion features that need improvement. As a validation of the proposed method, examples of feedback are presented for various sequences of the Taijiquan MoCap dataset, demonstrating its practical value for objective and automated supervision.
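The morphology-independence step described above (fit each motion feature linearly against a morphological factor, keep the residuals) can be sketched in a few lines. This is a minimal illustration of the residual idea only; the function names and the closed-form 1-D least-squares fit are illustrative assumptions, not code from the thesis.

```python
# Sketch of the residual idea behind MIRFE as described above: fit a
# linear model of a motion feature against a morphological factor
# (e.g. participant height), then keep the residuals as a
# morphology-independent version of that feature. Names are illustrative.

def linear_fit(x, y):
    """Ordinary least-squares fit y ~ a*x + b for 1-D data."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    cov = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    var = sum((xi - mean_x) ** 2 for xi in x)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

def morphology_independent_residuals(feature, morphology):
    """Residuals of the feature after removing the linear morphology trend."""
    a, b = linear_fit(morphology, feature)
    return [f - (a * m + b) for f, m in zip(feature, morphology)]
```

If a feature is entirely explained by morphology, the residuals vanish; whatever remains is the morphology-independent part that can be compared across individuals.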
This study proposes a modular data glove system to accurately and reliably capture hand kinematics. The modular design enhances the system's flexibility: it can provide the hand's angular velocities, accelerations, and joint angles to physicians for adjusting rehabilitation treatments. Three validations (raw data verification, static angle verification, and dynamic angle verification) were conducted to verify the reliability and accuracy of the data glove. Furthermore, to assess the wearability of the glove, 15 healthy participants and 15 participants with stroke were recruited to test it and complete a questionnaire. The errors of the finger ROMs obtained from the fusion algorithm were less than 2°, demonstrating that the algorithm can measure the wearer's range of motion accurately. The questionnaire results show high participant satisfaction with the data glove, and a comparison with related research shows that the proposed data glove outperforms other data glove systems.
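The abstract above mentions a fusion algorithm that turns raw inertial data into joint angles but does not specify it. A common choice for this kind of glove is a complementary filter that blends the integrated gyroscope rate (accurate short-term) with the accelerometer tilt (drift-free long-term); the sketch below assumes that approach, and the weight `ALPHA` and the 1-D joint model are illustrative, not taken from the paper.

```python
import math

# A minimal complementary-filter sketch of the kind of gyro/accelerometer
# fusion a data glove might use to estimate one joint angle. The paper's
# actual fusion algorithm is not specified; ALPHA and the 1-D model are
# illustrative assumptions.

ALPHA = 0.98  # weight on the integrated gyro angle (assumed)

def fuse_step(angle, gyro_rate, accel_angle, dt):
    """One filter update: trust the gyro short-term, the accelerometer long-term."""
    return ALPHA * (angle + gyro_rate * dt) + (1.0 - ALPHA) * accel_angle

def accel_tilt(ax, az):
    """Tilt angle in degrees from a two-axis accelerometer reading."""
    return math.degrees(math.atan2(ax, az))
```

With the sensor held still (zero gyro rate), repeated updates converge to the accelerometer tilt, while brief accelerometer disturbances are smoothed out by the gyro term.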
Previous research and recent smartphone developments have introduced a wide range of input controls beyond the touchscreen. Fingerprint scanners, silent switches, and Back-of-Device (BoD) touch panels offer additional ways to perform input. However, with the increasing number of input controls on the device, unintentional input or limited reachability can hinder interaction. In a one-handed scenario, we conducted a study to investigate the areas that can be reached without losing grip stability (comfortable area) and with stretched fingers (maximum range) using four different phone sizes. We describe the characteristics of the comfortable area and maximum range for each phone size and derive four design implications for the placement of input controls to support one-handed BoD and edge interaction. Among others, we show that the index and middle fingers are the best suited for BoD interaction and that the grip shifts towards the top edge as phone size increases.
Background: The use of laparoscopic and robotic procedures has increased in general surgery. Minimally invasive robotic surgery has made tremendous progress in a relatively short period of time, realizing improvements for both the patient and the surgeon. This has led to increased use and development of robotic devices and platforms for general surgery. The purpose of this review is to explore current and emerging surgical robotic technologies in a growing and dynamic environment of research and development. Methods: This review explores medical and surgical robotic endoscopic surgery and peripheral technologies currently available or in development. The devices discussed here are specific to general surgery, including laparoscopy, colonoscopy, esophagogastroduodenoscopy, and thoracoscopy. The benefits and limitations of each technology were identified and applicable future directions described. Results: A number of FDA-approved devices and platforms for robotic surgery were reviewed, including the da Vinci Surgical System, Sensei X Robotic Catheter System, FreeHand 1.2, invendoscopy E200 system, Flex® Robotic System, Senhance, ARES, the Single-Port Instrument Delivery Extended Research (SPIDER), and the NeoGuide Colonoscope. Additionally, platforms that have not yet obtained FDA approval were reviewed, including MiroSurge, ViaCath System, SPORT™ Surgical System, SurgiBot, Versius Robotic System, Master and Slave Transluminal Endoscopic Robot, Verb Surgical, Miniature In Vivo Robot, and the Einstein Surgical Robot. Conclusions: The use of and demand for robotic medical and surgical platforms are increasing, and new technologies are continually being developed, often to improve on the capabilities of previously established systems. Future studies are needed to further evaluate the strengths and weaknesses of each robotic surgical device and platform in the operating suite.
Various neurological conditions, such as stroke or spinal cord injury, result in impaired control of the hand. One method of restoring this function is functional electrical stimulation (FES); however, traditional FES techniques often lead to quick fatigue and unnatural ballistic movements. In this study, we sought to explore the capability of a non-invasive proximal nerve stimulation technique to elicit various hand grasp patterns. The ulnar and median nerves proximal to the elbow joint were activated transcutaneously using a programmable stimulator, and the resultant finger flexion joint angles were recorded using a motion capture system. The individual finger motions, averaged across the three joints, were analyzed using a cluster analysis to classify the different hand grasp patterns. With low-intensity stimulation (<5 mA and 100 µs pulse width), all of our subjects demonstrated a variety of consistent hand grasp patterns, including single-finger movements and coordinated multi-finger movements. This study provides initial evidence of the feasibility of a proximal nerve stimulation technique for controlling a variety of finger movements and grasp patterns. Our approach could also be developed into a rehabilitative/assistive tool that enables flexible movements of the fingers.
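The classification step described above (average each finger's three joint angles, then group trials by cluster analysis into grasp patterns) can be illustrated with a small sketch. The study does not state which clustering method or how many clusters were used; the plain 1-D k-means below is an assumption chosen for brevity.

```python
import random

# Illustrative sketch of the clustering step described above: average each
# finger's three joint flexion angles per trial, then group trials with
# k-means into candidate grasp patterns. The actual cluster method and k
# used in the study are not given; this 1-D k-means is an assumption.

def average_joint_angles(trial):
    """Mean of the three joint angles recorded for one finger in one trial."""
    return sum(trial) / len(trial)

def kmeans_1d(values, k, iters=50, seed=0):
    """Plain 1-D k-means; returns the sorted cluster centers."""
    rng = random.Random(seed)
    centers = rng.sample(values, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda j: abs(v - centers[j]))
            clusters[nearest].append(v)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)
```

Trials whose averaged flexion angles fall near the same center would be labelled as the same grasp pattern.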
Purpose: To provide an integrated visualisation of intraoperative ultrasound and endoscopic images for intraoperative guidance, real-time tracking of the ultrasound probe is required. State-of-the-art methods are suitable for planar targets, while most laparoscopic ultrasound probes are cylindrical objects. A tracking framework for cylindrical objects with a large workspace would improve the usability of intraoperative ultrasound guidance. Methods: A hybrid marker design that combines circular dots and chessboard vertices is proposed to facilitate the tracking of cylindrical tools. The circular dots placed over the curved surface are used for pose estimation, while the chessboard vertices provide additional information for resolving the pose ambiguity that arises from using planar model points under a monocular camera. Furthermore, temporal information between consecutive images is used to minimise tracking failures while maintaining real-time computational performance. Results: Detailed validation confirms that the hybrid marker provides a large working space for different tool sizes (6-14 mm in diameter). The tracking framework allows translational movements between 40 and 185 mm along the depth direction and rotational motion around three local orthogonal axes up to [Formula: see text]. Comparative studies confirm that our approach outperforms existing state-of-the-art methods, providing nearly 100% detection rates and accurate pose estimation with mean errors of 2.8 mm and 0.72°. The tracking algorithm runs at 20 frames per second for [Formula: see text] image resolution videos. Conclusion: Experiments show that the proposed hybrid marker can be applied to a wide range of surgical tools with superior detection rates and pose estimation accuracies.
Both the qualitative and quantitative results demonstrate that our framework can be used not only for assisting intraoperative ultrasound guidance but also for tracking general surgical tools in MIS.
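The ambiguity-resolution idea above (a planar marker under a monocular camera admits two candidate poses, and extra chessboard vertices select the correct one) reduces to comparing reprojection errors. The sketch below illustrates only that selection step under a simple pinhole model; representing each candidate pose directly as its transformed 3-D model points, and the focal length, are simplifying assumptions, not the paper's implementation.

```python
import math

# Sketch of the pose-disambiguation step described above: project each
# candidate pose's model points through a pinhole camera and keep the
# pose whose projections best match the detected chessboard vertices.
# The pinhole model and all numbers are illustrative assumptions.

FOCAL = 800.0  # assumed focal length in pixels

def project(point_cam):
    """Pinhole projection of a 3-D point given in camera coordinates."""
    x, y, z = point_cam
    return (FOCAL * x / z, FOCAL * y / z)

def reprojection_error(pose_points, detections):
    """RMS pixel distance between projected model points and detections."""
    errs = [math.dist(project(p), d) for p, d in zip(pose_points, detections)]
    return math.sqrt(sum(e * e for e in errs) / len(errs))

def pick_pose(candidates, detections):
    """Choose the candidate pose with the smallest reprojection error."""
    return min(candidates, key=lambda pts: reprojection_error(pts, detections))
```

In a full pipeline the two candidates would come from a planar pose solver; the extra vertices simply make the wrong candidate's reprojection error measurably larger.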
The human hand is a masterpiece of mechanical complexity, and measuring its motion capabilities can be challenging. Currently, such measurements are generally performed with gold-standard techniques that mostly rely on video-based systems, which are effective but expensive and time-consuming. To overcome these limitations, researchers have proposed new technologies for tracking the posture and motion of the hand. Unfortunately, these technologies are, for the most part, not commercially available, being based on sensor prototypes. In this context, however, commercial resistive flex sensors can be considered an off-the-shelf, valid technological solution for realizing a cost-effective tracking system for both the fingers and the wrist. These sensors have already been used and investigated by researchers but, as far as we know, no comprehensive investigation of their mechanical-electrical transduction and feasibility has been reported. This work intends to fill this gap.
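The mechanical-electrical transduction discussed above boils down to reading a bend-dependent resistance and mapping it to an angle. A common read-out is a voltage divider followed by a two-point calibration; the sketch below assumes that arrangement, and the supply voltage, resistor value, and linear calibration are illustrative, not measurements from the paper.

```python
# Sketch of turning a resistive flex sensor reading into a bend angle:
# the sensor forms one arm of a voltage divider, and a two-point
# calibration maps the measured voltage to degrees. All component
# values and the linear model are illustrative assumptions.

V_SUPPLY = 3.3      # assumed supply voltage (V)
R_FIXED = 10_000.0  # assumed fixed divider resistor (ohm)

def divider_voltage(r_flex):
    """Voltage across the fixed resistor for a given sensor resistance."""
    return V_SUPPLY * R_FIXED / (R_FIXED + r_flex)

def calibrate(v_flat, v_bent, angle_bent=90.0):
    """Return a linear voltage-to-angle map from two calibration poses."""
    slope = angle_bent / (v_bent - v_flat)
    return lambda v: slope * (v - v_flat)
```

In practice the resistance-angle relation is only approximately linear, which is exactly the kind of transduction behaviour the study above sets out to characterize.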
Background: In the past two decades, the introduction of robotic technology has facilitated minimally invasive cardiac surgery, allowing surgeons to operate endoscopically rather than through a median sternotomy. This approach has enabled procedures for several structural heart conditions, including mitral valve repair, atrial septal defect closure and multivessel minimally invasive coronary artery bypass grafting. In this rapidly evolving field, we review the status of robotic cardiac surgery in Europe, with a focus on mitral valve surgery and coronary revascularization. Methods: Structured searches of the MEDLINE, Embase, and Cochrane databases were performed from their dates of inception to June 2016. All original studies, except case reports, were included in this qualitative review. Studies performed in Europe were presented quantitatively. Data provided by Intuitive Surgical Inc. are also presented. Results: Fourteen papers on coronary surgery were included in the analysis and reported a mortality rate ranging between 0-1%, revision for bleeding between 2-7%, conversion to a larger incision between 2-15%, and a patency rate between 92-98%. The number of procedures ranged between 23 and 170 per year. There were only a small number of published reports of robotic mitral valve surgery from European centers. Conclusions: Robotic coronary surgery in Europe has been performed safely and effectively, with very few perioperative complications, over the last 15 years. Mitral surgery, on the other hand, was developed later, with increasing application of this technology only in the last 5-6 years.