This material is based upon work supported by the National Heart, Lung, and Blood Institute (NHLBI) of the
National Institutes of Health under award 1R01HL126896-01A1. The content is solely the responsibility of
the authors and does not necessarily represent the official views of the funding agencies.
Psychophysiological Data and Computer Vision to Assess
Cognitive Load and Team Dynamics in Cardiac Surgery
Roger D. Dias1,2, Steven J. Yule1,3, Lauren Kennedy-Metz4, and Marco A. Zenati3,4
1 STRATUS Center for Medical Simulation, Brigham & Women’s Hospital, Boston, MA, USA
2 Department of Emergency Medicine, Harvard Medical School, Boston, MA, USA
3 Department of Surgery, Harvard Medical School, Boston, MA, USA
4 Division of Cardiac Surgery, VA Healthcare System, Boston, MA, USA
rdias@bwh.harvard.edu
1 Purpose
The cardiac operating room (OR) is a high-risk environment in which several specialized
providers work as a team to care for patients undergoing complex surgical procedures.
Cardiac surgery can be conceptualized as a team-based sociotechnical system
with critical requirements for communication and coordination. Contemporary research
in this realm has moved away from the individual as the unit of cognitive analysis, and
a new focus on the activity system (human actors, their tools and the environment) has
been proposed; this framework has been referred to as "distributed cognition".[1] To
execute highly specialized tasks, both resources and the cognitive demands imposed by
the surgical tasks are distributed across the OR team. Furthermore, the dynamics of
team activities may provide relevant information for understanding the multitude
of factors that impact surgical performance and patient safety outcomes.[2] Previous
research has shown that certain patterns extracted from team members' position and
motion data can predict team coordination and cohesion.[3] In this pilot study, we
describe a novel integrative approach that captures objective measures of team cognitive
load (heart rate variability), as well as position and motion metrics from multiple OR
team members generated by a computer vision system. Our aim was to investigate the
feasibility of using this novel approach to integrate and visualize team dynamics and
cognitive load metrics gathered from the OR team during a real-life cardiac surgery.
2 Methods
2.1 Study Design, Setting and Participants
We studied a cardiac surgical team during a real-life coronary artery bypass grafting
(CABG) surgery. This open-heart procedure requires a highly specialized team of 6-12
providers divided into four subteams. This study was approved
by the local Institutional Review Board (IRB).
2.2 Cognitive Load Metrics (Heart Rate Variability)
Attending surgeons, anesthesiologists and perfusionists were equipped with a heart rate
sensor (Polar H7 chest strap) that captures the intervals between successive heart beats
(RR intervals) in milliseconds and transmits the data, via Bluetooth connection, to a
receiving station (Polar V800 sports watch). Previous research has used RR intervals
as a psychophysiological measure of surgeons' cognitive workload.[4] Raw data were
exported in .txt format and analyzed in Kubios HRV (version 3.1.0), where an automatic
RR artefact correction method was used to remove artefacts and ectopic beats. Artefacts
were replaced with interpolated values using cubic spline interpolation.
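The artefact-correction step described above can be sketched in Python. The exact detection algorithm in Kubios HRV is proprietary; this minimal sketch assumes a simple local-median criterion for flagging artefactual beats (the threshold and window size are illustrative assumptions) and replaces flagged beats with cubic-spline interpolated values, as in the study.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def correct_rr_artifacts(rr_ms, threshold=0.25, half_window=5):
    """Replace artefactual RR intervals with cubic-spline interpolated values.

    A beat is flagged as an artefact when it deviates from the local median
    (over a sliding window of 2*half_window + 1 beats) by more than
    `threshold` as a fraction of that median. This only approximates the
    general approach; Kubios HRV's own detection algorithm is proprietary.
    """
    rr = np.asarray(rr_ms, dtype=float)
    t = np.cumsum(rr) / 1000.0  # beat occurrence times in seconds
    med = np.array([np.median(rr[max(0, i - half_window):i + half_window + 1])
                    for i in range(len(rr))])
    artifact = np.abs(rr - med) / med > threshold
    good = ~artifact
    if artifact.any() and good.sum() >= 4:
        # Fit a cubic spline through the clean beats and evaluate it
        # at the artefact times to obtain replacement values.
        spline = CubicSpline(t[good], rr[good])
        rr[artifact] = spline(t[artifact])
    return rr, artifact
```

For example, a single ectopic beat of 400 ms embedded in a run of 800 ms intervals is flagged and replaced by a value close to its neighbors.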
2.3 Team Dynamics Metrics (Position and Motion Tracking)
A GoPro Hero 4 camera (1280 x 720 pixels, 30 fps) was used to record the entire
procedure. The first 5 minutes of the Separation from Bypass step were analyzed. This
is a critical step that imposes the highest cognitive demand on the OR cardiac team.[5]
OpenPose is an open-source, deep learning-based computer vision system capable
of detecting multiple humans and labeling up to 25 key body points using 2D video
input from conventional cameras. The system uses a two-branch, multi-stage
convolutional neural network (CNN) in which each stage of the first branch
predicts 2D confidence maps of body part locations, and each stage of the second
branch predicts Part Affinity Fields (PAFs), which encode the degree of association between
parts. OpenPose was trained and validated on two benchmarks for multi-person pose
estimation: the MPII multi-person dataset and the COCO 2016 keypoints challenge
dataset. Both datasets contain images collected from diverse real-life scenarios
involving crowding, scale variation, occlusion and contact, and OpenPose exceeded
previous state-of-the-art systems on both.[6] We processed the 5-minute video using
this software (version 1.4.0) and exported the x and y coordinates (in pixels) of each
OR team member's neck position, per frame, in JSON format. A confidence score from
0 to 1 was provided for each detected point. To measure motion, we calculated the
distance (in pixels) from each team member's neck point to a stationary reference
point (the patient's heart) for each frame. We then calculated the frame-to-frame
change in distance, from which we derived the velocity (pixels/second) at which each
participant moved over time. Team centrality was calculated as the distance between
each team member and the patient's heart (surgical field).
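The position and velocity metrics above can be sketched as follows, assuming OpenPose's standard per-frame JSON output (a `people` array whose `pose_keypoints_2d` field is a flat list of x, y, confidence triplets, with the neck at keypoint index 1 in the BODY_25 model). The reference coordinates for the patient's heart are hypothetical values for illustration.

```python
import json
import math

FPS = 30                     # video frame rate used in the study
HEART_XY = (640.0, 360.0)    # hypothetical pixel coordinates of the patient's heart

def neck_distance(frame_json):
    """Return (distance to reference point in pixels, keypoint confidence)
    for the neck of the first detected person in one OpenPose frame."""
    frame = json.loads(frame_json)
    kp = frame["people"][0]["pose_keypoints_2d"]  # flat [x0, y0, c0, x1, y1, c1, ...]
    x, y, conf = kp[3], kp[4], kp[5]              # neck = keypoint index 1
    return math.hypot(x - HEART_XY[0], y - HEART_XY[1]), conf

def velocities(distances, fps=FPS):
    """Convert per-frame distances into frame-to-frame velocities (pixels/second)."""
    return [(d1 - d0) * fps for d0, d1 in zip(distances, distances[1:])]
```

In practice one such distance series is computed per team member and per frame, yielding both the team-centrality measure (the distances themselves) and the motion measure (their frame-to-frame velocities).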
2.4 Data Integration and Visualization
To integrate and synchronize the position and motion data generated by the computer
vision software with the physiological data gathered by the heart rate sensors, we used
the InterAct software (Mangold, v. 16.0). Visual analytics software (Tableau v2018.1)
was used to create a heat map of each team member's position in the OR, displaying
the density of the neck position over the 5-minute period.
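As an illustration of the synchronization step (the study itself used the InterAct software for this), beat-based RR intervals can be aligned with frame-based motion samples by placing both streams on a common time axis and performing an as-of join. The function below is a hypothetical sketch using pandas.

```python
import pandas as pd

def synchronize(rr_ms, motion, fps=30):
    """Align beat-based RR intervals with frame-based motion samples.

    `rr_ms` is a list of RR intervals in milliseconds; `motion` holds one
    value per video frame. RR intervals occur at irregular beat times, so
    each frame is assigned the most recent RR value via a backward as-of
    join. Frames before the first beat receive NaN.
    """
    beat_times = pd.Series(rr_ms).cumsum() / 1000.0  # beat times in seconds
    hrv = pd.DataFrame({"t": beat_times, "rr_ms": rr_ms})
    frames = pd.DataFrame({"t": [i / fps for i in range(len(motion))],
                           "velocity": motion})
    return pd.merge_asof(frames, hrv, on="t")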
3 Results
The analysis of the 5-minute video generated 90,000 frames, detecting 7 different
people in the OR. The neck keypoint of all team members was detected in 96% of the
frames, with a mean confidence score of 0.83 (standard deviation = 0.11). A heat map
plotting the density of people's positions is shown in Fig. 1.
Fig. 1. Density of individual’s position (A), and team centrality (B) measured by the distance
between each team member and the patient’s heart (surgical field) during Separation from By-
pass.
The processed video was overlaid with team members’ skeletons (See supplemental
video) and synchronized with the physiological signals (RR intervals) and motion met-
rics (neck keypoint velocity) of the core cardiac team members. This integrated view
allows the simultaneous analysis of team dynamics and cognitive load (Fig. 2).
Fig. 2. Integrated visualization of motion tracking, cognitive load and team dynamics metrics.
4 Conclusion
This pilot study demonstrates the feasibility of a novel integrative approach to capture
team cognitive load and team dynamics metrics during complex surgical procedures.
This approach uses heart rate variability as an objective and unobtrusive measure of
cognitive load, combined with a deep learning-based system that performs multi-person
pose estimation from videos recorded with regular cameras. The main novelty of the
proposed approach is its ability to monitor in real time the cognitive load imposed by
the surgical tasks on OR team members while simultaneously capturing relevant spatial
relationships between team members, the patient and medical devices in the OR
environment. Future studies can use a similar approach to investigate the relationship
between team cognitive load, team dynamics metrics and surgical patient outcomes.
Bibliography
1. Hazlehurst B, McMullen CK, Gorman PN. Distributed cognition in the heart room: how
situation awareness arises from coordinated communications during cardiac surgery. J
Biomed Inform. 2007;40:539-51.
2. Tiferes J, Hussein AA, Bisantz A, et al. The Loud Surgeon Behind the Console:
Understanding Team Activities During Robot-Assisted Surgery. J Surg Educ. 2016;504-12.
3. Gorman JC, Dunbar TA, Grimm D, et al. Understanding and Modeling Teams As
Dynamical Systems. Front Psychol. 2017;8:1053.
4. Dias RD, Ngo-Howard MC, Boskovski MT, et al. Systematic review of measurement tools
to assess surgeons' intraoperative cognitive workload. Br J Surg. 2018;105(5):491-501.
5. Wadhera RK, Parker SH, Burkhart HM, et al. Is the "sterile cockpit" concept applicable to
cardiovascular surgery critical intervals or critical events? The impact of protocol-driven
communication during cardiopulmonary bypass. J Thorac Cardiovasc Surg. 2010;139:312-
6. Cao Z, Simon T, Wei S, Sheikh Y. Realtime Multi-person 2D Pose Estimation Using Part
Affinity Fields. 2017 IEEE Conference on Computer Vision and Pattern Recognition
(CVPR); 2017. p. 1302-10.