Running with a Drone for Pace Setting, Video Self-Reflection, and
Beyond: An Experiential Study
Aswin Balasubramaniam
Dennis Reidsma
Dirk Heylen
a.balasubramaniam@utwente.nl
d.reidsma@utwente.nl
d.k.j.heylen@utwente.nl
University of Twente, Human Media Interaction
Enschede, Netherlands
ABSTRACT
This paper explores the potential of drones to support running activities as pacesetters and video recorders. Using questionnaires and interviews, insights were gathered from 10 recreational runners regarding their experience running with a drone in the study and viewing drone-captured videos of their run. Results indicated that participants found the drone experience engaging and minimally disruptive, despite perceiving it as somewhat unnatural and holding polarized views on spatial immersion. Analysis of responses unveiled factors affecting runners' experiences, while their reflections on drone-captured run videos revealed benefits of leveraging such footage for post-run self-reflection and opportunities for improvement. Additionally, participants' insights led to the identification of further roles and functions for drones in supporting various running activities beyond pace setting and video recording. This study lays the groundwork for future research, positioning drone utilization in running as a promising avenue for exploration.
CCS CONCEPTS
• Human-centered computing → User studies; Empirical studies in HCI.
KEYWORDS
Drone, Running, Runner Drone Interaction, Pacing, Post-Run Reflections, User Study, Experience Evaluation
ACM Reference Format:
Aswin Balasubramaniam, Dennis Reidsma, and Dirk Heylen. 2023. Running
with a Drone for Pace Setting, Video Self-Reflection, and Beyond: An Experi-
ential Study. In OzCHI 2023 (OzCHI 2023), December 02–06, 2023, Wellington,
New Zealand. ACM, New York, NY, USA, 16 pages. https://doi.org/10.1145/
3638380.3638400
This work is licensed under a Creative Commons Attribution International
4.0 License.
OzCHI 2023, December 02–06, 2023, Wellington, New Zealand
©2023 Copyright held by the owner/author(s).
ACM ISBN 979-8-4007-1707-9/23/12.
https://doi.org/10.1145/3638380.3638400
1 INTRODUCTION
With drones becoming more readily available and the integration of cameras becoming a standard feature, the field of human drone interaction (HDI) has showcased the diverse capabilities of drones in supporting athletes [18]. By leveraging their flight capabilities and video recording functionalities, drones have the potential to accompany athletes, fulfilling a wide range of needs during and after their activities. For instance, Delfa et al. [13] and Zwaan and Barakova [64] have demonstrated the use of drone movements to support activities such as Tai Chi and boxing. Additionally, drone video recordings have proven valuable in sports scenarios like soccer [19, 21, 47], rowing [35], skiing [41], cycling [62], climbing [43], and hiking [27], enabling athletes to reflect on their performances using the video recordings. Similarly, in the context of running, several studies have investigated the potential benefits of using drones and drone videos to support and enhance the running experience (most notably [17, 19, 34, 44]).
Graether and Mueller [17] and Mueller and Muirhead [34] demonstrate the utilization of drones in the running domain and show how their movements can be harnessed to support runners, offering recommendations for the design of interactions for drones that accompany runners based on users' experiences running with a drone. Mueller and Muirhead [34], in particular, elaborate on what functions the runners desired. One such function was for the drone to act as a pacesetter, which was also identified through the work of Seuter et al. [49] via an online survey of self-identified runners. We build on this and, as the first objective of our study, look more elaborately at the runners' experience, quantitatively and qualitatively, while using drones as pacesetters. By identifying crucial aspects of this experience, we can shed light on the factors that impact the overall running experience and help researchers leverage the beneficial aspects to enhance the runners' experience with the drone as a pacesetter.
Higuchi et al. [19] and Romanowski et al. [44] carried out studies investigating the use of aerial videos in the running domain. They have demonstrated the use of aerial videos to facilitate third-person perspective visualization of runners to the runners themselves [19] and to generate support from spectators in marathon settings [44]. However, apart from these identified opportunities, there is a lack of research specifically exploring runners' preferences on how they would like to utilize drone videos of themselves for post-hoc self-reflection. To address this, the second objective of our study was to assess runners' insights after viewing the drone videos
recorded while they were running with a drone. We aim to uncover the benefits of using drone videos and identify ways to enhance their presentation and utilization, enabling runners to engage in meaningful post-hoc self-reflection on their runs.
Furthermore, the third objective of our study was to identify additional roles and functions that drones could support during running, through an analysis of runners' reflections. This exploration went beyond the roles of a pacesetter or video recording device and was informed by the runners' first-hand experiences running with a drone. Through this, we show a wider range of running situations in which off-the-shelf drones could be used to enhance the running activity.
To fulfil our objectives, we designed a study in which 10 runners were instructed to run on an outdoor track while being accompanied by a drone positioned to their side. We used a commercial drone capable of autonomous flight and video recording. The drone was programmed to maintain a ground speed of 10 km/h, as a pacesetter, and to record the runners' run, to facilitate post-hoc video self-reflection. Participants were instructed to synchronize their pace with the drone's position at their side and complete two rounds on the track. After the run, participants were asked to complete the ITC-SOPI questionnaire [28] and undergo interviews to assess their experience. They then watched the recorded drone footage of their run and participated in an interview to gather insights on the use of drone videos for post-run self-reflection. The interviews furthermore invited the participants to reflect more widely on possible roles and functions of the drone. By providing runners with the opportunity to run with a drone outdoors, our aim was to recreate the sensations, emotions, and feelings they might encounter during an actual run, thereby eliciting insightful reflections that we can subsequently evaluate.
By conducting our study, we gained meaningful insights and make the following contributions to this field. The runners' responses to the ITC-SOPI questionnaire highlighted the engaging nature of running with a drone and its minimal disruptive impact on the running experience. However, participants expressed that the drone experience lacked a sense of naturalness, and the level of immersion was unclear. Through analyzing their interview responses, we identified some factors that may have influenced their perceptions: Drone Presence, Position, Speed, Noise, and Length of Experience. Furthermore, participants' reflections on the drone-captured videos revealed the benefits of incorporating drone footage for post-run self-reflection. Their feedback provided valuable insights into refining the presentation and utilization of drone videos to better support post-run self-reflection. Additionally, based on an in-depth analysis of the interview data, we derived various roles and functions for drones in enhancing the running experience, such as AirTrainers, Companions, Trail/Forest Drones, Alerter & Navigators, Social Media Facilitators, Riveting, Exergame Conductors, Behavioural Catalysts, and Sightseeing Coaches. These findings, substantiated by user feedback, not only demonstrate the potential of drones in supporting runners but also inform future research and development efforts to optimize the use of drones and drone videos in pacing activities and post-run self-reflection.
Our contributions aim to guide future work by leveraging the positive outcomes and addressing the challenges associated with utilizing drones to support pacing activities. With the work in this paper, we seek to facilitate the future development of optimized pacing experiences that effectively address the drawbacks identified by runners. Additionally, we aim to raise awareness among researchers and practitioners about the value of drone videos in supporting post-run reflections. By identifying the benefits and offering insights on how to enhance their utilization, we provide valuable background information to support their future endeavors. Furthermore, our identification of specific roles and functions opens up avenues for further investigation and the development of accompanying systems to better support and assist runners.
2 RELATED WORK
Our work is related to 1) technology-driven pace training experiences in running, 2) supporting post-activity self-reflections using video recordings, and 3) deriving roles & functions for drones to support running activities.
2.1 Technology Driven Pace Training
Experiences in Running
Effective pacing strategies and pace training offer significant benefits for runners. By adopting a well-defined pacing strategy, runners can maintain optimal energy levels, enhance endurance, and improve overall running performance [9, 29, 39, 42]. Previous studies have consistently highlighted the advantages of pacing techniques, emphasizing the importance of deliberate practice in this area [2, 56]. Developing pacing abilities would enable runners to manage their effort, maintain a consistent pace, and make informed speed adjustments, resulting in more efficient and successful performances [2, 11, 56].
There are several existing technologies that support pace training or enable runners to maintain a specific pace, as noted in our earlier extensive literature review [3]. Treadmills, for instance, are designed to help runners train at varying and precise paces [46, 48]. However, they fall short for outdoor runners who prefer the challenge of diverse terrains. In outdoor settings, runners typically rely on smartwatches or smartphones (or similar technology) to monitor their pace during or after a run [23]. While post-run pace analysis provides valuable information, real-time access to pace during the run can be more engaging and motivating. Yet, constantly checking pace throughout the run could disrupt the flow and hinder performance [50]. To overcome this, an external visual cue within the running environment could reduce cognitive load compared to secondary actions like glancing at a smart device or relying on vibrations. Visual systems such as a cyclist maintaining a fixed pace, a car equipped with a laser pointer (as seen in Kipchoge's marathon race [4]), light-based systems like WaveLight [54], or even robots could provide such cues [34, 40]. However, systems involving cyclists or cars require additional personnel and may not be logistically feasible for solo training. Light-based systems like WaveLight [54], which utilize ground-based lights in specific stadium locations, limit runners to those particular areas. Alternatively, ground robots like the Puma BeatBot [40] could be used, as they can follow any designated line. Yet, they have limitations tied to level-ground requirements and the need for a predetermined path. Considering these limitations, drones offer a compelling solution. As flying robots, drones can be programmed at different speeds and are not constrained by terrain limitations. While previous research, such as the notable work conducted by Mueller and Muirhead [34],
has demonstrated the feasibility of using flying robots as pacers and has provided valuable insights into their design considerations, our primary focus in this study is to utilize this knowledge to assess the experiences of runners who run with a drone as a pacesetter. In this role, we employ a drone to direct and guide the runners while supporting them in maintaining a steady pace throughout their runs. Through the drone, we offer a continuous visual cue to the runners, reminding them of the pace they should maintain.
2.2 Supporting Post-Activity Self-Reflections Using Video Recordings
Self-reflection empowers athletes to draw upon their prior experiences, effectively leveraging them to improve future performances in pursuit of their goals [63]. Previous studies have explored the use of various tools to support self-reflection on running data, including dashboards [38], applications on smartwatches, smartphones, and smart devices [23, 26, 36], physicalization of data [1, 32], and integrated displays on running shoes [60]. While these reflection tools have different objectives, they share the common goal of enhancing self-knowledge, self-modelling, and goal tracking to help promote positive running behaviour, motor learning [14, 51], and self-development in sports [22].
Although various forms of running data representation have proven useful for post-run reflections, videos of running activity can offer a more detailed perspective, capturing subtleties of movement that may not be visible through the other data representations mentioned above. Videos can help provide a relational and constructive view, aiding in framing and representing movement more accurately than simplified data, which may not capture the complexity of running motion. Research conducted in various sports has demonstrated this, and has also demonstrated the beneficial effects of self-reflection through videos, showing positive impacts on physical performance, skill acquisition, and motor learning [10, 37, 61], while also assisting athletes in effectively articulating their thought processes during sporting activities [45]. While previous studies have demonstrated the potential of video, particularly drone videos, in other sports [19, 21, 27, 35, 41, 43, 47, 62], there is limited research exploring the use of drone videos specifically for running. Higuchi et al. [19] have presented how aerial videos could help runners visualize themselves from a third-person perspective, to improve their performance and support training. Romanowski et al. [44] have shown how drone videos can enhance the experience of marathon runners. Although a few studies have contributed to our understanding of how aerial videos can be utilized for post-run reflections, their potential remains largely unexplored. In contrast to these earlier works, our study aims to understand runners' perspectives on the benefits of viewing drone video of their run with the drone as a pacesetter and how drone videos can be presented and utilized to facilitate meaningful post-run reflections.
2.3 Deriving Roles & Functions for Drones to Support Running Activities
In the context of running, studies by Mayer et al. [31], Romanowski et al. [44], Seuter et al. [49], and Mueller and Muirhead [34] have identified various roles and functions that drones could fulfill to support runners. However, the recommendations of Mayer et al. [31] and Romanowski et al. [44] are based on their experiences working with drones rather than on an analysis of runners' preferences. Seuter et al. [49], on the other hand, considered user preferences through an online survey to understand how self-identified runners would like to incorporate drones in their running activities. Although these studies have identified valuable roles and functions for drones, we believe that one limitation of these works was the absence of runners running with the drone to form their opinions. As interpreted in the work of Cronin et al. [12], having lived an experience helps one better understand the phenomenon of a technology, in our case the drone. By allowing runners to run with drones, they have the opportunity to interpret the technology and provide more meaningful insights by considering the sensations, emotions, and feelings experienced during the run. Mueller and Muirhead [34]'s study adopted this approach, enabling runners to gain a deeper understanding of how drones could be better utilized to support their running activities. However, given the growing accessibility of running technologies that enhance self-understanding and self-realization within the running lifestyle [53], we believe there are undiscovered functionalities for drones to support runners beyond those identified by Mueller and Muirhead [34]. Therefore, in addition to acting as pacesetters and video recording devices, our analysis of runner responses aims to identify further desired roles and their functions, substantiated by the runners' experiential knowledge of running with drones. Furthermore, earlier works in the field of Human-Drone Interaction (HDI) have shown that people associate both utilitarian and hedonic values with drones and attribute specific roles and functions to drones depending on the situation [25]. However, Herdel et al. [18] have pointed out that interaction techniques in HDI are often developed and evaluated without providing users with the necessary context regarding the intended roles and functions of the drone. This contextual information is crucial for obtaining aligned feedback and enhancing study validity. Additionally, the lack of context may affect users' responses, as they are not provided with sufficient information to allow for appropriate feedback. Therefore, by identifying additional roles and functions, we aim to support future researchers in aligning their work.
3 STUDY
To gain insights into the runners' experience of running with a drone as a pacesetter, to understand how the drone footage of their run could be used to better support post-run self-reflections, and to uncover additional roles and functions for drones, we conducted the following study. In the following subsections, we describe our participant pool, the design of the study, the procedure we followed and measurements we took, and finally the procedure we followed to analyse the collected data.
3.1 Participant Pool
We recruited 10 experienced runners (4 female, 6 male) aged 22–60 years (mean = 30.6; median = 26; std. dev. = 11.5). The participants had varying years of running experience (1–14 years, mean = 5.9 years) and ran outdoors 2–4 times every week, covering an average distance of 15–60 km (mean = 21.9; median = 17; std. dev. = 13.6). Most of them run outdoors by themselves. The participants were recruited through local channels and flyers posted around the university and were compensated for their time
Figure 1: DJI Mavic Pro 2 Setup used for the Study
Figure 2: Dimensions of Running Track Used for the Study and Images
Illustrating the Study Setup and Video Captured
through treats. Before this study, the runners had never used or
operated a drone.
3.2 Study Design
The study took place outdoors, at the university's athletics track, in order to replicate a training environment that closely resembles real-life conditions for the runner. A DJI Mavic Pro 2 drone fitted with propeller guards (Figure 1) was used in this study. Litchi, a drone flight planner, was used to program the drone to fly automatically along the inner perimeter of the track at a constant ground speed of 10 km/h for two laps. In case of an emergency, the pilot was ready to take over control of the drone to prevent any damage. The drone's airspeed was calculated using the track dimensions (Figure 2) to maintain a constant ground speed during both straight sections and curves. To ensure consistency and minimize variables in analyzing participants' running patterns using videos for future studies, we maintained a constant drone speed throughout the study.
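The paper does not spell out this speed calculation, but the track geometry behind it can be illustrated. The following minimal Python sketch computes the pace a runner on an outer arc would need in order to stay level with a drone holding a constant speed on an inner arc; the bend radius used (DRONE_ARC_RADIUS_M) is an assumed, illustrative value, while the 10 km/h target and the 7 m lateral offset come from the study description.

```python
# Hedged sketch (not the authors' calculation): the kind of track geometry
# involved when a drone holds a constant speed on an inner arc while the
# runner follows an outer lane. The bend radius is illustrative only.

DRONE_SPEED_KMH = 10.0        # constant ground speed used in the study
LATERAL_OFFSET_M = 7.0        # drone flown 7 m to the inside of the runner
DRONE_ARC_RADIUS_M = 33.0     # assumed radius of the drone's path on a bend

def runner_speed_to_stay_level(drone_speed_kmh: float,
                               drone_radius_m: float,
                               lateral_offset_m: float) -> float:
    """Speed (km/h) a runner on the outer arc must hold so that their
    angular progress around the bend matches the drone's."""
    drone_omega = (drone_speed_kmh / 3.6) / drone_radius_m   # rad/s
    runner_radius = drone_radius_m + lateral_offset_m
    return drone_omega * runner_radius * 3.6

if __name__ == "__main__":
    v = runner_speed_to_stay_level(DRONE_SPEED_KMH, DRONE_ARC_RADIUS_M,
                                   LATERAL_OFFSET_M)
    print(f"Straights: runner matches the drone at {DRONE_SPEED_KMH:.1f} km/h")
    print(f"Bends:     runner must briefly hold about {v:.1f} km/h to stay level")
```

Under these assumed radii, a runner would momentarily need to run faster through the bends to stay abreast of a constant-speed drone, which is consistent with the corner difficulties participants describe in Section 4.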
The drone was carefully positioned near the runner, considering safety, video capture requirements, and environmental constraints. Ample distance was maintained between the runner and the drone throughout the study to ensure safety and allow the pilot to respond in emergencies. The drone's position was optimized to capture the runner's sagittal plane view, even in cases of deviation from the prescribed path. Because of the soccer pitch in the center of the athletics track, a 6.5 m net was installed along the inner perimeter to prevent balls from exiting the pitch. To account for all of these requirements and to ensure a safe and compliant drone flight, the drone was placed 7 m above the ground and 7 m horizontally to the left side of the runner. Additionally, a cyclist accompanied the runner to ensure clear paths and safety for all individuals involved. Given the participants' running experience, we refrained from conducting specific health checks, instead relying on their self-assessment of their fitness to run prior to the activity. The study was reviewed by the ethics board at the university, and the drone pilot followed all the necessary rules and regulations as specified by federal law. The study also took the necessary precautions to uphold the safety of people not involved in the study.
3.3 Procedure and Measurements
Before the start of the study, the runners were asked to sign a consent form and were briefed about the study's objectives. The runners were made aware of the safety procedures in place and were given the option to stop the study at any time. The runners were instructed to maintain their pace using the drone's position and were asked to stay as centered as possible when following the drone on their left. They were also instructed to run in the last lane of the track to ensure the video of their run was captured clearly. At the end of the run, we interviewed the runners and asked them to fill out a modified version of the ITC-SOPI questionnaire [28] to evaluate their experience. To avoid recall bias, the interview was conducted immediately after the run. This was also the reason for keeping the study relatively short. To avoid expectancy bias, the interviewer maintained a neutral expression and refrained from employing suggestive or leading language.
The primary aim of the first interview was twofold: firstly, to fulfill our first objective of understanding runners' experience running alongside a drone as a pacesetter, and secondly, to fulfill our third
objective of gathering insights into their expectations of drones. The interviews were audio-recorded and semi-structured. The first part of the interview included questions about their experience running with a drone: "What was it like to run with a quadcopter?", "Did the quadcopter affect your regular running experience, and if so, how?", "How did you feel when the quadcopter affected your regular running experience?", and "What more can this technology do to help you with the running experience?". After the runners responded to these questions and had enough time for self-reflection, they were asked to fill out a modified version of the ITC-SOPI questionnaire.
The ITC-SOPI questionnaire is used to assess users' presence-related experiences in a displayed environment. It evaluates four dimensions of an experience: Spatial Presence, Engagement, Naturalness, and Negative Effects. While originally designed for virtual experiences, we adapted the questionnaire as we recognized that the questions could be valuable for evaluating dimensions relevant to the real-world experience we created. Ambiguous terms related to virtual worlds were removed, and irrelevant questions were excluded. The on-field researcher provided assistance to clarify any unclear questions for the runner.
After completing the questionnaire, the runners were presented with a video recording of their run captured by the drone. The video was viewed on a laptop using VLC media player, allowing the runners to control the playback speed and pause/play as desired. Subsequently, to fulfill the second objective of the study, the runners were asked specific questions regarding their experience after watching the video and their ability to reflect on their run using the footage. The questions included: "Do you want to change your response about your experience running with a quadcopter after viewing the video of your run, and if so, how?", "Could you comment on how the video helps in perceiving your run, and does it help inform you about the way you run, and how?", and "Does watching and reflecting on your run in the video help you correlate the thoughts that may have been running through your mind during the run, and if so, how?". Additionally, the runners were given the opportunity to share additional feedback or thoughts on the study and their experience.
3.4 Data Analysis
The responses to the ITC-SOPI questionnaire were evaluated following the instructions provided by the authors of the questionnaire through an email request. The audio recordings of the interviews were automatically transcribed using Otter.ai and were also manually checked to correct any discrepancies. The interview data was imported into Atlas.ti, and an open-inductive coding approach was used to analyze the responses [6]. Before being archived and stored, the audio and video data collected during the study were anonymized.
We calculated descriptive statistics (means and standard deviations) of the runners' responses to the ITC-SOPI questionnaire for each dimension. Before computing the statistics, responses on the 5-item Likert scale (Strongly Disagree, Disagree, Neither Agree nor Disagree, Agree, and Strongly Agree) were converted to values from 1–5. The data obtained was visualized using a box plot, as shown in Figure 3. To further analyse the dimensions, we created a stacked bar plot of the responses to the questions in each dimension, as shown in Figure 4. To supplement the ITC-SOPI dimension analysis, the generated codes were grouped together using affinity diagramming, with a focus on sentiments. We identified phrases and terms from the data that indicated a positive, neutral, or negative sentiment toward the experience. We discovered that we could associate the grouped codes under each sentiment with a factor related to the experience and also relate them to the dimensions of the ITC-SOPI questionnaire.
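The paper does not include its analysis scripts; the following minimal Python sketch illustrates the Likert-to-numeric conversion and per-dimension summary described above, using hypothetical item names and example responses rather than the study's actual data.

```python
# Hedged sketch (not the authors' analysis script): converting 5-point Likert
# labels to 1-5 scores and summarising each ITC-SOPI dimension.
# The item-to-dimension mapping and the example responses are illustrative.
import pandas as pd
import matplotlib.pyplot as plt

LIKERT = {"Strongly Disagree": 1, "Disagree": 2,
          "Neither Agree nor Disagree": 3, "Agree": 4, "Strongly Agree": 5}

# Example wide-format responses: one row per runner, one column per item.
responses = pd.DataFrame({
    "SP_Q2":  ["Disagree", "Strongly Disagree", "Neither Agree nor Disagree"],
    "ENG_Q7": ["Agree", "Strongly Agree", "Agree"],
    "NAT_Q1": ["Disagree", "Neither Agree nor Disagree", "Disagree"],
    "NEG_Q3": ["Strongly Disagree", "Disagree", "Strongly Disagree"],
})

# Map Likert labels to 1-5 scores, column by column.
scores = responses.apply(lambda col: col.map(LIKERT))

# Group items by their dimension prefix and compute mean/SD per dimension.
PREFIX_TO_DIMENSION = {"SP": "Spatial Presence", "ENG": "Engagement",
                       "NAT": "Naturalness", "NEG": "Negative Effects"}
long = scores.melt(var_name="item", value_name="score")
long["dimension"] = long["item"].map(lambda c: PREFIX_TO_DIMENSION[c.split("_")[0]])

print(long.groupby("dimension")["score"].agg(["mean", "std"]))

# Box plot of scores per dimension (cf. Figure 3).
long.boxplot(column="score", by="dimension")
plt.suptitle("")
plt.ylabel("Likert score (1-5)")
plt.show()
```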
In addition, we created a similar affinity diagram based on codes derived from the runners' reflections and their perspectives regarding the use of drone videos for reviewing their runs. This coding process aimed to provide deeper insight into the advantages of utilizing drone footage for post-run self-reflection and to identify ways to enhance the presentation of drone videos to support runners in their self-reflection experiences.
Following the previous rounds of analysis of the interview responses, we conducted a round of thematic analysis with the objective of identifying the roles and functions that runners desired from drones. We chose a reflexive thematic analysis approach due to the exploratory nature of this study [7]. This approach involved coding the data by focusing on the data itself (inductive) and the explicit information provided by the runners (semantic). Through discussions with the second author, who has more experience in qualitative data analysis, some of the codes were further clarified and improved, and we then identified nine themes from the generated codes. These themes were defined and named to represent the roles and functions that drones could fulfill to meet the runners' needs. In the subsequent sections, we elaborate on these themes, providing examples of how runners intend to utilize drone technology during their runs and their expectations regarding how drones can support their running activities.
4 RESULTS & ANALYSIS 1: EXPERIENCES OF
RUNNERS PACED BY A DRONE
The summary of the calculated statistical values was as follows: Spatial Presence (Mean: 2.7; SD: 1.1); Negative Effects (Mean: 1.5; SD: 0.9); Naturalness (Mean: 2.6; SD: 0.9); and Engagement (Mean: 3.4; SD: 1.1). Examining the spread of results, as depicted in the accompanying boxplot (Figure 3), offers noteworthy observations. Firstly, it becomes apparent that the majority of the runners found their experience to be engaging. This is particularly interesting considering the physically demanding nature of the running activity. Secondly, it is noteworthy that a substantial number of runners expressed disagreement with the descriptors associated with the negative effects dimension, indicating that most runners did not perceive their experience with the drone as having negative implications. However, the dimension of naturalness received scores leaning towards the lower end of the spectrum, implying that many runners did not perceive their experience as entirely natural. Lastly, despite the fact that the interaction took place in a real outdoor environment, opinions on the sense of spatial immersion varied significantly among the runners. This variability could stem from the diverse ways individuals perceive their spatial engagement with technology in a real-world setting, possibly influenced by individual preferences and prior experiences.
Figure 3: Box Plot Illustrating Responses to the ITC-SOPI Questionnaire, Providing Insights into the Runners' Experience of Running with a Drone.
In order to gain a more comprehensive understanding of the runners' perspectives, we thoroughly examined their responses to the interview questions. By utilizing their interview responses as
a reference, we further analyzed and interpreted their selections in the ITC-SOPI questionnaire, whenever applicable. As indicated earlier, the questions in each dimension of the ITC-SOPI questionnaire were grouped together and presented as a stacked bar plot, as shown in Figure 4. This approach allowed us to delve deeper into the runners' thoughts and provide a richer interpretation of their experiences and viewpoints.
Spatial Presence: The overall mean score for spatial presence (evaluation of the sense of being physically present at/immersed in the space of the experience) does not fully reflect the runners' opinions on the immersiveness of the experience. However, the bar plot in Figure 4a provides additional information that reveals more about their perspectives. The runners expressed that they did not feel a sense of physical contact with the drone (Q2) or that it provided them with a feeling of being transported to different locations (Q3). This was mainly due to the runners being confined to a specific location and running alongside the drone, which maintained a safe distance. Furthermore, the runners indicated that their movements were reactive to the experience (Q5) and that they had limited control over altering the course of the experience (Q9). They also perceived that the drone was not responsive to their actions (Q11). This is evident from the fact that the drone dictated the runners' pace and the experience lacked options for the runners to modify its parameters or have the drone respond to their actions.
The varied responses to the remaining questions do not offer definitive evidence regarding the impact of the drone on the immersiveness of the experience. However, this does not necessarily indicate a negative outcome, as running is an activity that individuals engage in to achieve a state of full immersion, where they can become absorbed in their thoughts, breathing, and personal focus. If the presence of the drone in the physical space did not disrupt this state, it could be considered a positive aspect. Nevertheless, by analyzing the interview responses, we can glean some insights into the factors that positively or negatively influence the spatial immersiveness of running with a drone.
• Drone Presence: Several runners mentioned that after the first round of running, they became accustomed to the presence of the drone, and it had no noticeable impact on their running experience thereafter. They also expressed that the drone did not distract them from estimating their running time. One participant stated, "I tried to adapt as much as possible, but overall, I would say it's not bothering during the run. So it's a good feeling..." [P5]. Another runner mentioned, "In the first lap I had to get accustomed to it but then in the second lap I was just normal routine of running. Get into the rhythm, which I could easily sustain" [P10].
The presence of the drone had a positive impact on some runners as it prompted them to reflect on their performance and make adjustments in the moment. They felt more self-conscious due to the perception of being observed. One participant expressed, "...it has kind of a positive effect because I kind of thought about myself and what I'm doing at the moment and that I don't lose kind of the shape while running... I do not really reflect during running what I'm doing, and this time I really did" [P5]. Another participant mentioned, "...it was adding, how to say, changing my way of checking myself..." [P9]. The feeling of being watched by the drone encouraged these runners to be more mindful of their running form and actions.
• Drone Noise: While the noise generated by the drone was generally perceived negatively by the runners, there were a few exceptions. One runner mentioned that the noise actually had a positive effect as it made them feel more alert and aware of their surroundings. They stated, "...alerted cause I don't know what's going on. And then I did I knew it was a drone..." [P10].
However, for most runners, the noise would be uncomfortable and strange, particularly when they run alone. One participant expressed, "It may be a bit uncomfortable because mostly I run by myself, alone" [P5]. Another runner shared concerns about feeling affected and endangered by the noise, especially when running in secluded areas like the woods [P10].
Figure 4: Stacked Bar Plot Illustrating the Responses to Each Factor of the ITC-SOPI Questionnaire (Top Left: Spatial Presence; Top Right: Negative Effects; Bottom Left: Naturalness; Bottom Right: Engagement)
Interestingly, although some runners initially found the noise triggering, they gradually became accustomed to it as the run progressed. As one participant noted, "In the beginning, you're a little bit triggered but the noise... you're accustomed to, so you don't really hear it... it's not that irritating" [P10].
• Length of Experience: Some runners expressed that the duration of the study or their experience running with the drone was not sufficient for them to fully adjust to the presence of the drone or accurately assess its impact on their running. One participant mentioned, "I think the user experience would be a little bit more different if I would have run for like five or six more rounds so that I really get into this running movement pattern" [P5]. Another participant shared a similar sentiment, stating, "...maybe I should have run longer... because I think there's also kind of time that we need to adjust each other" [P9]. These runners believed that a longer duration of running with the drone would have allowed for better adaptation and a clearer understanding of the drone's influence on their running experience.
Negative Effects: The overall mean score for the negative effects (evaluation of the negative reactions to the experience) dimension indicated that the runners were not negatively impacted by the experience. Upon examining the responses to the questions assessing negative effects (Figure 4b), it is evident that the runners generally did not experience feelings of tiredness (Q2), dizziness (Q3), nausea (Q4), or headaches (Q5) after a short run (less than 5 minutes) with the drone. However, it is worth noting that one runner reported feeling tired, although this occurrence was not documented and could potentially be attributed to their pre-existing physical condition or the environmental conditions, such as sunny weather.
Regarding the sensation of feeling disoriented (Q1), the runners expressed mixed opinions. The reasons for this can be attributed to various factors related to the presence of the drone during the experience, as revealed through qualitative analysis of the interview responses. Excerpts from the interviews provide insights into why these factors may have contributed to the runners' feelings of disorientation.
• Drone Position: The position of the drone during the run was generally perceived as unfavorable by most runners. In order to maintain pace with the drone, runners had to constantly monitor its position, which proved to be distracting and took their focus away from maintaining their running form. One participant expressed, "...I had to keep the mind on the drone too, it was like, distracting from the pace..." [P6]. Another participant described the mental challenge of multitasking, stating, "...looking at the quadcopter I was trying to focus on two things at the same time... which is mentally [exhausting] trying to do two things at the same time" [P2]. The positioning of the drone required runners to not only locate it for pacing, but also constantly monitor its location relative to their own. As one runner stated, "I have to take into account of the drone during the running and not really just flying next to me as a pacer but really focusing on where it is and where I am in the view" [P1]. Additionally, one participant mentioned the difficulty of tracking the drone's position when it happened to be positioned against the sun, obstructing visibility. They stated, "...the drone was exactly in the sun so I could not really look up and see where the drone was..." [P7].
Furthermore, for some runners, the drone proved to be counterproductive to their goal of clearing their mind while running. The need to concentrate on the drone and the surrounding environment became mentally taxing, as expressed by one participant: "...mostly mentally trying, to do two things at the same time" [P2].
• Drone Speed: One runner [P9] expressed concern about the drone's pacing speed and its potential impact on their ability to focus internally during the run. They believed that if the drone's pacing speed were inconsistent, uncontrollable, or failed to adapt to their own speed during the run, it would be an annoying experience. The runner stated that during interval training sessions, their focus is solely on maintaining a constant speed, and any deviations in the drone's pace would be disruptive. They mentioned, "...for interval training I don't hear I don't listen, I don't feel anything. I'm just trying to pace up myself and then keep that constant speed and in that sense, ... I will be really annoyed if the speed does not adapt" [P9].
Additionally, the runner highlighted the annoyance that could arise when they need to slow down or speed up during the run for hydration or other reasons. They stated, "When you would like to slow down and speed up sometimes... it becomes annoying because [the drone] will... go forward and then not really follow me" [P9]. The runner also expressed concern that instead of focusing on themselves, they would be preoccupied with catching up to the drone, further illustrating the potential distraction caused by an inconsistent pacing experience: "...I will not be focusing on myself but focusing on whether or not like I'm catching up with the drone" [P9].
• Drone Noise: The noise produced by the drone's motors is an inherent characteristic that cannot be completely eliminated. However, the specific characteristics of the noise can vary depending on the type of motors and drone configuration used, resulting in different impacts on the runners.
For some runners, the noise of the drone proved to be disruptive and disorienting, causing them to lose focus on their running environment and surroundings. One participant expressed that the constant sound of the drone was disturbing and hindered their awareness of what was happening around them. They stated, "It was a little bit disturbing to hear the sound all the time because I was not really 100% aware of what's going around me" [P4]. Another participant echoed a similar sentiment, noting that the noise of the drone was also distracting and affected their situational awareness [P5].
These responses indicate that, for some runners, the noise produced by the drone had a negative impact on their ability to maintain full awareness of their running environment, potentially affecting their overall running experience.
Naturalness: The overall mean score for the naturalness (evaluation of the reality of the experience) dimension indicated that the runners did not find the experience highly natural. While the runners acknowledged that running with a drone was believable (Q2) and part of the real world (Q3), they also expressed that the experience did not feel entirely natural (Q1) (Figure 4c). Upon analyzing the interview responses, it became evident that certain factors related to the experience and the drone itself contributed to this perception of unnaturalness.
• Drone Position: Some runners found it unnatural and inconvenient to constantly turn their heads to check on the drone's position during their runs. This movement was perceived as forced and unnatural, potentially affecting their overall running experience: "Quite hard as it was flying right above me and next to me instead of in front" [P1], "...you just have to look up and to the left every time so maybe it was a little bit annoying..." [P4], "I think that doing it constantly it will be not good. I have to say because you are like moving muscles in a non-natural way" [P6], and "It feels a bit unnatural to look left up during running well, every now and then to see whether you're still on the same line as the drone is" [P7]. These responses highlight the runners' concerns about the impact of constantly monitoring the drone's position on their body movements and the overall flow of their run.
• Drone Speed: Some runners noticed inconsistencies in the drone's speed, specifically during curves on the track. While the drone's overall speed remained consistent, the adjustments it made to navigate the curves resulted in abrupt changes that were noticed by the runners and impacted their running experience. This led to comments such as, "I found it a bit hard to keep a constant pace" [P3] and "it was a little bit annoying or it didn't keep up the pace" [P4]. Adjusting to the drone's speed proved difficult for some runners: "...sometimes it was a little little hard because I was a little in front or a little back or so on. So I tried to adapt as much as possible..." [P5], "...difficult to maintain the pace... a little bit difficult, especially in the curves" [P6]. The speed variations became particularly noticeable during the corners: "The corners... I am moving faster than the drone... I really want to focus on the speed, so I don't want to check whether or not it's pacing me correctly" [P9] and "During first corner I really had struggles adapting to the speed of the drone" [P10]. Additionally, one runner expressed discomfort with the fixed pace set by the drone, stating, "I don't try to keep a constant pace, but it feels a bit unnatural this way" [P7].
• Drone Noise: Some runners expressed their dissatisfaction with the noise produced by the drone during the running experience. One participant stated, "I did not really like the sound" [P3], while another described it as "a bit loud" [P5]. One runner even compared the noise to that of a construction site, saying, "I feel very annoying with the sound. Yeah, exactly you're in a construction site" [P8]. Another participant humorously remarked, "But like after three seconds, it was okay, because I understood it was the drone and not the bees" [P3].
Engagement: The overall mean score for the engagement (evaluation of the psychological/cognitive involvement in the presented experience) dimension indicated that the runners found the experience engaging. However, a closer look at the individual questions within the engagement dimension provided more insights (Figure 4d). The runners unanimously reported that they paid more attention to the experience than to their own thoughts (Q11). Despite the simplicity of the experience and the lack of input from the runners, they still felt involved and engaged during the experience (Q7). They expressed enjoyment in running with the drone (Q9) and were able to recall their experiences well (Q4). Additionally, they indicated that they would recommend this experience to others (Q5). Most runners agreed that they did not have strong emotional responses during the experience (Q12) and that the experience appealed to them (Q13). However, there was variation in the responses to the remaining questions, which was also reflected in the interview responses. These variations could be attributed to different factors related to the experience that triggered specific reactions from the runners, which we further explain below.
• Drone Presence: For some runners, the drone acting as a pacer provided a fun experience that was different from running behind a cyclist or human pacer. One participant expressed, "I think it was a fun experience. It was different than just behind a cyclist" [P4]. Another participant creatively imagined themselves as a detective being chased in a movie set, saying, "I just tried to imagine I'm a detective in a movie" [P6], highlighting the unique and entertaining aspect of running with the drone.
Some runners indicated that focusing on the drone during their run was immersive and affected their sense of time. One participant stated, "If you run two rounds normally it's just too boring, but with a drone, you're just focused on a point and the experience is going faster" [P1]. Another participant mentioned, "It seems to be longer than normal" [P6], indicating that the presence of the drone had an impact on their perception of time.
• Drone Noise: Focusing on the drone's noise had a positive impact on some runners, allowing them to direct their attention and focus inwards and maintain their pace. One participant expressed, "...you didn't focus on the environment, but only on the buzzing sounds, but it helped me keep the pace" [P4]. Another participant mentioned, "...I felt like I needed to hear where the sound is to feel if I'm doing the right thing or not" [P3], indicating that the drone's noise served as a guiding element for their running pace.
• Drone Speed: Some runners highlighted the impact of the drone's varying speed on their overall experience. Specifically, they noticed how the drone adjusted its speed when navigating the curves of the track. This led to an engaging aspect of the experience, as they described it as playing a game of catch-up with the drone. One runner explained, "In the first corner, I felt I had to go way faster than it used to follow it, and in the second lap, I was used to that, so it was easier to follow in the corner as well" [P8]. Another participant described the experience as a "weird game" where they constantly checked if they were catching up with the drone
or if the drone was catching up with them [P9]. The same participant, however, indicated that this also made them doubt their ability to maintain the pace: "I think there's too much that I'm looking at [the drone] too often" [P9].
• Drone Position: Given the position of the drone, some runners mentioned that it enhanced their enjoyment of the outdoors and made them more aware of their surroundings as they constantly looked up: "I enjoyed the outside more because I was looking towards the sky" [P9], and "I felt more aware of what was happening around me" [P3].
One runner also mentioned that due to the drone's position, they occasionally had to turn their head, which they found to be a pleasant experience: "But occasionally moving the head will be a nice experience" [P6].
Furthermore, one runner pointed out that because of the drone's position, it was challenging to track its location when the sun was behind it: "The drone was exactly in the sun, so I couldn't look up and see where the drone was" [P7].
• General Experience: Some of the generally positive comments were related to the novelty of the experience: "...fun experience..." [P4], "It was nice and I think that it would be a really nice form of training" [P6], and "...first time I did [running with drone], and I think was a interesting experience..." [P5].
5 RESULTS & ANALYSIS 2: REFLECTIONS OF
RUNNERS AFTER VIEWING THE DRONE
VIDEO OF THEIR RUN
Drone videos proved highly beneficial for runners, enabling self-reflection on performance and helping them gain valuable insights into technique. Access to drone video footage offered practical advantages, facilitating convenient self-analysis and identification of areas for improvement. One runner found it easy and convenient to analyse their running form using the videos, allowing them to identify previously unnoticed aspects: "...feel like there's my legs are not moving symmetrically... my upper body is a bit tilted forwards... there are definitely things I would change based on my own knowledge about running" [P3]. Similarly, another participant mentioned using the video in conjunction with their existing knowledge to adjust their running technique: "...I can see my moving pattern... I think I would know how to maybe adapt a little bit or improve the running pattern..." [P5]. Slowing down the videos for detailed examination allowed runners to identify and correct errors or areas for improvement, as emphasized by a participant: "...this bit of video can be slowed down... you can see like the slow-mo movements... and everything... better... like your mistakes" [P6]. In addition to easy access and analysis, runners appreciated the convenience of readily available drone video footage. They no longer had to rely on lab settings or assistance from others to record their runs. Instead, they could easily review the drone-captured video for a comprehensive understanding of their running form: "...before... to analyze, we have to be on a treadmill or in a lab or ask our family to take a video, and then we analyze. But now I can see this, the experience from top..." [P8].
The videos facilitated runners in reflecting on their thoughts and experiences during their runs, leading to a deeper level of introspection. They could confirm observations like a constant downward gaze and distractions from the run through video validation. One participant expressed, "...I'm pretty much looking downwards all the time and it was confirmed on video..." [P10]. Another mentioned, "...not thinking about how I run more about what I going to do with my schedule next like my to do list" [P3]. The videos visually represented runners' thought processes and intentions during the run, including their focus on the drone and their position relative to it. They also captured moments when runners adjusted their pace to match the drone's movements or made adaptations based on environmental factors like shadows or sunlight. These reflections deepened their understanding of their running performance. Participants mentioned, "...thinking about the drone and that I'm not in the middle. And then I see it..." [P4], "...I have to slow down to match the pace of the thing that I can see. You can see that like the movement of my head and then I suddenly speed up or slow down..." [P6] and "...I could see I was moving a little bit outwards... I was thinking oh, I should move outward because there's this little bit of shadow..." [P10].
Furthermore, the videos offered valuable pacing feedback, alleviating runners' concerns about maintaining the right pace. They could see from the videos that their pace remained steady and centered throughout their runs, boosting their confidence in their performance. As one participant described, "...I could see that I was running at a more steady pace than I felt I was..." [P2]. Another participant echoed this sentiment, stating, "...I think I'm almost every time in the middle. So then I had not to worry that much during the run..." [P4]. The strong appeal of and desire for the videos were evident among the runners, as many expressed interest in obtaining a copy for future reference and analysis. This eagerness demonstrated their recognition of the transformative impact drone videos could have on post-run reflections and the overall running experience.
However, some runners expressed that the videos did not provide additional insights into their running technique beyond what they already knew. They mentioned that the high angle and perspective of the video made it challenging to analyze their running form effectively. One participant noted that the video lacked a side view, stating, "I think the video was quite high, so you're not really have the side view" [P1]. Another participant mentioned the difficulty of assessing their technique from the top-down perspective, saying, "...I think there are definitely things I would change based on my own knowledge about running. But I think it's hard to tell because it's filmed from above instead of like the side front" [P3].
Additionally, the runners provided valuable insights on potential enhancements that could be incorporated into the videos to further elevate their self-reflection process. They emphasized the value of expert analysis combined with running data, as one participant stated, "the video itself it doesn't say much because when you see yourself running but it's not like the small details we can see, when we have actual data that is better" [P1]. The integration of video analysis with running data and performance indicators is seen as crucial for a comprehensive understanding of performance. Another participant indicated the importance of peer feedback, and not just feedback from coaches: "So maybe you can improve your own running by watching but I think watching yourself is difficult because mostly you learn when other people say it looks weird what you're doing. I'm not sure if you're able to do it yourself" [P3].
Slowing down the videos was also highly appreciated, allowing for detailed examination of movements and identification of mistakes, as one participant noted: "this bit of video can be slowed down... you can see like the slow-mo movements... and everything... better... like your mistakes" [P6]. Furthermore, the same runner believed that the captured video could be used to generate personalized training experiences to enhance running performance: "this will be also adding some more input to the training experience" [P6].
6 RESULTS & ANALYSIS 3: ROLES AND
FUNCTIONS FOR DRONES TO SUPPORT
RUNNERS
This section highlights the varied roles and functions that drones can play in supporting runners, based on our analysis of participant responses. Through thematic analysis, we devised these roles to explore the broader applications of drone technology beyond the scope of our study. By delving into each role, we present its functions and valuable insights for future research. The roles are presented in order of the number of supporting codes, emphasizing their significance and relevance.
•
AirTrainers [25 codes]: This is a role that entails drones
supporting runners in their training regimens. One of the
drone’s functions would be to help runners set varying pre-
cise speeds for interval training ("...interval trainings... I’m
just trying to pace up myself and then keep that constant
speed..." [P9]) and understand their form at higher speeds
("...when you’re running, ... when you’re going fast, it becomes
more difficult to focus on technique. So it will be nice to be
able to look at that at home and see form" [P10]). The
availability of drone videos received positive reactions from
runners, who indicated a desire for different video angles
("...for sprinters then the side view is more important thing
for long distance" [P1], "Would it be able to also film from
different angles? Because for me, that will make a very, very
big difference" [P3], "...you’re not seeing the sideway so I don’t
think you can see the movements of the legs that well to really
see if you have to change something" [P4]) and combining
visuals with information on their running form ("...if you
would know how to exactly improve the way you run then
these videos could definitely help I think" [P7], "...better to just
a little bit of focus on cadence or step length" [P10]).
Additionally, the runners mentioned that the videos would
help them design better training sessions ("...this [video] will
be also adding some more input to the training experience
because if you can see like the slow-mo movements you can
then pause the video and everything you can see better" [P6]).
They believe that if trainers can use drone videos to develop
tailored training activities and share them on social media,
it can attract more trainees and improve their reputation
in the training world ("If this personal trainer have all the
gadget that is the top one that not so many people have and
this personal trainer immediately if he will stand out in the
business" [P8]).
As AirTrainers, a drone should provide runners with the
autonomy to change the position and speed of the drone.
It should also allow runners to self-analyze their runs and
highlight the small details in their movements, through dash-
boards ("when you see yourself running but it’s not like the
small details we can see, when we have actual data that is
better" [P1]). Additionally, the drone should help conduct
training similar to the convenience of a smartwatch ("...be-
cause now I’m wearing a watch and my watch can determine
my trainings and determine the interval trainings I make"
[P1]).
To fulfil the role of AirTrainers, a drone should have software
capable of detecting runners and their pose in real
time. It should process pose information and derive various
running parameters that can be communicated in real time
or through post-hoc dashboards (a minimal sketch of deriving
one such parameter from pose keypoints is given after this
list of roles). In addition, the physical
design of the drone should be carefully crafted to align with
runners’ preferences, prioritizing trust and likability. These
factors play a crucial role in determining the adoption of
technology within any context of usage [59].
• Companion [7 codes]: This is a role that entails drones enter-
taining runners during their runs. As one participant men-
tioned, they desired the drone to alleviate boredom by adapt-
ing music to their running or simply being present to provide
cheer ("I wouldn’t expect it to repeat or direct me; I would just
expect it to be with me, cheering me up" [P9]).
To fulfil its role as a companion, the drone could create a
sense of agency by adjusting its positions and movements
based on the runner’s actions, while minimizing interference.
For some runners, the ability to control the drone themselves
is essential, especially when running in remote areas or re-
quiring speed variations ("...if I can control the drone myself
without the other person because sometimes the runner they
just go somewhere to do running..." [P8], "when you would like
to slow down and speed up sometimes because I’m like, you
know, feeding myself sometimes I’m drinking water and I slow
down and then my heart rate. And then yeah, it becomes an-
noying because it [drone] will you know, go forward and then
not really follow me" [P9]). Alternatively, the drone could be
tele-operated by an actual companion from a distance, mak-
ing the interaction between the runner and the drone more
realistic. Moreover, careful consideration should be given
to the physical design of the drone to ensure it generates a
positive effect and enhances the overall running experience.
• Trail/Forest Drone [4 codes]: This is a role that entails
drones supporting runs that take place in wooded areas or
forests. One runner expressed enthusiasm for the drone as
a valuable device for trail running ("It will be a really, really
nice product. I think for both track running and trail running"
[P6]). While serving a similar purpose as the AirTrainer, the
trail drone would offer additional benefits. The same runner
highlighted the need for the drone to capture video footage
during trail runs, as it can be challenging for cyclists or other
individuals to run alongside and record videos on uneven and
unpredictable paths ("...more useful than the normal trainings
because in the mountains you don’t have places like to stay
and film people so there is just a path through which you walk
with a drone..." [P6]). Furthermore, they believed it would be
advantageous for the drone to map and analyze the terrain,
allowing them to plan optimal training regimens and routes
("...also see the terrain, the track, you can analyse everything
weather, do a better strategy. It will be a really, really nice
product..." [P6]).
However, researchers must consider that runners venture
into forests to clear their heads and listen to internal and
nature’s sounds, preferring not to hear the noise of a drone.
("...but if I’m running into forests, then I would love to clear
my head and then I would rather not have the sound of the
quadcopter" [P3]). Therefore, in addition to providing in-
formation about the forest paths, it is essential to ensure
that the drone produces minimal noise or allows the runner
to control the noise, contributing to an enhanced running
experience. ("...reducing the volume of the drone would im-
prove the user experience" [P5]). Furthermore, it is crucial
for researchers to consider the necessity of incorporating
advanced obstacle avoidance and navigation systems into
the drone. These systems are essential to prevent the drone
from crashing into trees or other obstacles while accurately
following the runner through the forest.
• Alerter & Navigator [3 codes]: This is a role that entails
drones helping runners stay alert and notifying others while
guiding them along a running track or path. One runner
mentioned that the drone could assist in resolving conflicts
with other track users by providing warnings or instructions
("Sometimes you have conflict with the other guys using the
track then he could so the drone could say hey, go out... or
you want to overtake... it can... help... alert them... give a
warning sign" [P4]). Another emphasized the benefits of hav-
ing a drone during city runs to enhance their awareness of
surroundings, especially in potentially dangerous situations
("If I’m running in the city it’s good to be aware of because
then there’s traffic and there could be dangerous situation."
[P3]). Another believed that being observant of their envi-
ronment, including observing what happens on the streets,
keeps them motivated ("...observe what happens (dangerous
incidents) in the street is quite keep me motivated" [P8]). To
fulfil this role, the drone should be equipped with functions
that allow for alerting the runner or authorities in case of
dangerous situations. There are various opportunities for
researchers to explore different communication modalities,
including incorporating speakers and connecting to haptic
sensors.
• Social Media Facilitator [2 codes]: This is a role that entails
drones supporting runners to record impressive videos of
various training activities from different angles. One participant
expressed that recording and sharing these videos on
social media platforms could help trainers gain popularity
and even generate income ("...if you can show this video, you get
a lot of respect. & .. earn your income as a YouTuber..." [P8]).
In addition to popularity and monetary benets, sharing
drone-recorded videos can enhance the credibility of train-
ers and users, positioning them as individuals who embrace
new technologies to improve their routines. This association
with advanced technology could add to their reputation and
respect among peers ("...drone most of time is used for the
young boy they they want to do have a kind of get respect
because it’s a new technology.. . if you can show this video,
you get a lot of respect within your friend" [P9]).
• Riveting [2 codes]: This is a role that entails drones helping
runners become immersed in the experience as facilitated by
the drone, causing them to lose track of time. Participants
mentioned that focusing on the drone during their runs made
time feel faster and kept their concentration engaged ("...you
have to focus on something so it’s time goes faster” [P1], and
"I have to keep the mind on the drone too . . . usually when
you are running, you’re just concentrating on something. So
if you’re losing concentration with these little things, it seems
to be longer than normal..." [P6]). Researchers could take
advantage of the immersive experience a drone’s presence
creates to manipulate the runners’ perception of time. By
designing interactions and incorporating temporal illusions,
the drone could help create the impression that the runner
has spent more or less time on an exercise, giving them a
sense of maximum benet or providing them with additional
time to focus on other things.
• Exergame Conductor [2 codes]: This is a role that entails
drones conducting games that drive exercise. During the
experiential study, one runner indicated that they sometimes
used the noise generated by the drone to maintain their pace,
leading to a sense of engagement similar to playing a game
("I was always constantly looking up to see if I’m catching up
with it, or it’s catching up with me. So it was kind of a weird
game" [P9]). Another runner expressed that the drone noise
made them perceive being chased by a swarm of bees and
even fostered a detective-like imagination as if in a movie
set ("...later I just tried to imagine I’m a detective in a movie
set" [P8]). Drawing parallels to existing applications like
ZombieRun [16], drones have the potential to replicate such
scenarios, enhancing exergames with their presence and
diverse on-board sensors and actuators.
• Behavioral Catalyst [2 codes]: This is a role that entails
drones acting as a catalyst to influence and inspire the runner
to change their behaviour and adopt a new approach to self-
monitoring, facilitated by the drone’s presence. One participant
indicated that the drone had changed the way they checked
on themselves ("[the drone following them] had kind of a
positive effect because I kind of thought about myself and what
I’m doing at the moment and got me think about what I’m
doing and how I do it at the moment" [P5]) and replaced their
behaviour of monitoring their pace using their smartwatch
("...it replaced my watch checking behaviour . . . I didn’t check
my speed at all [on watch]. I was just checking it and then in
that sense, maybe like we can say enjoying the outside more"
[P9]). This highlights the potential of the presence of external
systems like drones to transform self-monitoring practices
during running.
• Sightseeing Coach [1 code]: This is a role that entails
drones helping runners discover new places or explore un-
familiar trails while also acting as coaches. One runner
mentioned that the drone’s position, when running with
the drone as a pacesetter, enhanced their enjoyment of the
outdoors during their runs ("...enjoying the outside more be-
cause I’m looking towards the sky..." [P9]). Drones can be pro-
grammed with waypoints to guide runners to locations they
may find appealing, both within the city and in scenic out-
skirts like wooded or forest trails. Moreover, when combined
with the AirTrainer function, drones can provide coaching
support while runners engage in their outdoor activities.
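As referenced in the AirTrainers role above, a minimal sketch of deriving one running parameter from pose keypoints is given below. It assumes that per-frame keypoints come from an off-the-shelf pose estimator (rather than from any system used in our study) and applies a simple peak-counting heuristic to one ankle's vertical trajectory to approximate cadence; a production pipeline would need to compensate for drone camera motion and use a more robust footstrike detector.

import numpy as np

def cadence_from_ankle_heights(ankle_y, fps):
    # ankle_y: per-frame vertical position of one ankle keypoint in image coordinates
    # (y grows downward), e.g. taken from a pose estimator's output for the runner.
    y = np.convolve(np.asarray(ankle_y, dtype=float), np.ones(5) / 5.0, mode="same")  # smooth keypoint jitter
    strikes = np.flatnonzero((y[1:-1] > y[:-2]) & (y[1:-1] > y[2:])) + 1  # local maxima ~ foot strikes of one leg
    duration_min = len(ankle_y) / fps / 60.0
    steps = 2 * len(strikes)  # account for both legs, assuming a roughly symmetric gait
    return steps / duration_min if duration_min > 0 else 0.0

Parameters of this kind could then be shown on the post-hoc dashboards the runners asked for, alongside the raw drone video.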
7 DISCUSSION
Upon analyzing the runners’ responses to their experience running
with a drone as a pacesetter, we determined that they found the
experience engaging and the playback of the drone video
recordings from their runs valuable in supporting their post-run
reflections. Additionally, through our interviews with the runners,
we discovered the potential for drones and drone videos to be further
utilized in fulfilling various functions that runners require, as
well as in enhancing their post-run reflections. We now delve into the
benefits, future considerations, and limitations of our work.
Supporting Interval/Pace Training: Based on our work, it is
clear that runners find running with a drone to be engaging. By
addressing the shortcomings we identified in our study, drones
have the potential to support interval/pace training. Although we
did not directly compare this method with traditional approaches,
our findings provide strong evidence in favor of using drones for
such training purposes. We plan on further exploring the utilization
of drones as interval trainers by investigating the range of training
programs they can support compared to traditional methods,
examining the interaction design considerations for such drones,
and studying the proxemics between the runner and the drone for
this use case. Furthermore, the insights we have gathered from our
study aim to serve as an inspiration for future researchers. We hope
our work encourages them to design and test interactive scenarios
tailored to supporting runners during interval and pace training.
Designing Dashboards to Support Reflections Using Drone
Videos: Based on the reflections of the runners, it becomes evident
that they would greatly benefit from having a dedicated dashboard
to process and analyze the drone video recordings of their runs.
While our work already demonstrates the value of using unedited
and unprocessed videos for post-run reflections, we believe that
enhancing their reective experience is possible by incorporating
additional features into the dashboard as suggested by the runners.
These features could include options for viewing the running videos
in slow motion, controlling playback, highlighting key moments,
and presenting meaningful running data in a user-friendly inter-
face that indexes events of interest or allows runners to search for
specific instances using key terms, enabling them to augment their
reflections even further. By leveraging AI tools (DeeperCut [20],
DeepLabCut [30], DeepPose [57], AlphaPose [15], OpenPose [8],
OpenCap [58]), videos can be utilized to estimate poses and running
parameters, thereby providing more valuable insights through
a dashboard. A deep learning model can also help minimize the
reliance on expert intervention, such as a coach, by leveraging pattern
analysis in running data to detect and address problematic
areas more autonomously [24]. However, further investigation is
necessary to fully explore the potential of this approach. To ensure
such a system’s effectiveness and usability, input and feedback from
both coaches and runners should be considered during the devel-
opment process. By incorporating their perspectives, the system
can be tailored to meet the specic requirements and preferences
of the users, maximizing its impact. As part of our project, we plan
to conduct further interviews with additional runners to gain a
deeper understanding of the information they would like to see
on a dashboard and to explore the characteristics of their preferred
ways of interacting with running data.
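As a purely illustrative sketch of how such a dashboard could index events of interest, the snippet below aggregates per-frame metrics into per-second summaries and flags moments a runner might want to jump to in the video; the metric field names, the one-second binning, and the cadence-drop threshold are assumptions for illustration, not a schema from our study.

import json

def build_dashboard_index(per_frame_metrics, fps, cadence_drop_spm=10.0):
    # per_frame_metrics: one dict per video frame, e.g. {"cadence_spm": ..., "trunk_lean_deg": ...}
    frames_per_bin = max(1, int(fps))
    seconds = []
    for start in range(0, len(per_frame_metrics), frames_per_bin):
        window = per_frame_metrics[start:start + frames_per_bin]
        summary = {k: sum(m[k] for m in window) / len(window) for k in window[0]}  # per-second averages
        summary["t_sec"] = start // frames_per_bin
        seconds.append(summary)
    # Flag seconds where cadence dropped sharply relative to the previous second.
    events = [{"t_sec": cur["t_sec"], "label": "cadence drop"}
              for prev, cur in zip(seconds, seconds[1:])
              if prev["cadence_spm"] - cur["cadence_spm"] > cadence_drop_spm]
    return json.dumps({"summary": seconds, "events": events}, indent=2)

The resulting index could back the searchable, event-centred interface described above, with each flagged second linking to the corresponding segment of the drone video.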
Roles & Functions for Drones: The roles and functions of
drones were derived based on the insights provided by runners
regarding how they envision using drones in specic scenarios.
The naming of these roles was thoughtfully chosen to encapsulate
the common codes identified in the data, and the functions assigned
to them were directly shaped by the input of the runners. It is important
to note that these functions need not exist independently;
they could overlap and be combined with one another. For example,
a drone may simultaneously assume the roles of guide and com-
panion, particularly in settings like trail runs. Our intention is to
provide a solid foundation for future researchers, enabling them
to better understand the requirements of runners. We believe that
these insights into the roles and functions of drones, as informed
by runners’ experiences with a drone, will be valuable in guiding
the design and development of drone technologies that cater to
various types of runs. By considering these, future endeavors in
this field can be better aligned with the expectations of the running
community.
Longitudinal Studies with Diverse and More Participants:
Our initial results are promising; however, to draw definitive con-
clusions on the long-term adoption and usage of drones, as well as
how they may complement or compare to existing running tech-
nology, conducting longitudinal studies involving a more diverse
group of runners is crucial. Over time, factors like technology nov-
elty wearing off or the experience becoming less engaging could
inuence runners’ interest in drones. Through longitudinal studies,
we can gain deeper insights into the specic factors contributing to
this potential decline and work towards addressing them in future
designs.
Furthermore, a larger participant sample would have significantly
improved the quality and breadth of our findings. Our study’s
recruitment criteria were tailored to align with a parallel study, leading
to a specific subset of runners being included¹. This subset may
not have represented all possible perspectives on drone technology,
potentially biasing our recorded opinions. By including a broader
range of participants and recording additional running character-
istics, such as technology adoption, motivation, and habits, we
could have unearthed the underlying reasons behind their opinions.
This, in turn, would have enabled us to provide more robust design
recommendations for drones, catering to a broader spectrum of
runners.
It’s important to underscore that our study’s primary goal was
not to make sweeping generalizations but rather to offer valuable
insights within a specific context. As such, our findings should be
¹ In order to align with our parallel biomechanical study, we specifically recruited
participants who met certain running criteria. These criteria included the ability to
maintain a minimum speed of 13 kmph for at least 5 minutes, continue running until
reaching exhaustion, and have no prior injuries.
interpreted within this context, and any extrapolations to wider
populations should be approached with caution.
Study Design Limitations: For ethical reasons, participant
safety was a top priority while operating the potentially harm-
ful drone. Maintaining a safe distance between the drone and the
runner was crucial to allow the pilot to assume control in case of
emergencies without endangering the runner. However, position-
ing the drone in a way that required runners to frequently turn
their heads to the right and look up had a negative impact on their
experience. This aspect, highlighted during the interviews, may
have influenced their perception of running with the drone.
Factors for Evaluating the Experience of Running with a Drone: In
retrospect, based on our qualitative analysis of the runners’ expe-
riences and evaluating the results obtained from our chosen ques-
tionnaire, we realize that additional questions and factors should
have been included and some removed to comprehensively assess
the runners’ experience running with a drone quantitatively. Specif-
ically, we acknowledge the need to incorporate factors related to
physical exertion, cognitive process, and drone-related aspects in
order to obtain a more comprehensive evaluation of the experience.
This realization suggests that there is an opportunity for further
research in this area, with the goal of developing questionnaires
that effectively evaluate physical exertion experiences mediated
through a drone. We hope that the insights we derived from the
runners’ experience will contribute to the development of such a
questionnaire.
Overarching Ethical Concerns: Using camera drones, whether
in research or commercial contexts, entails ethical considerations.
While some of these considerations may not directly align with
our study’s focus, we acknowledge the importance of addressing
relevant ethical aspects. Recording videos in public spaces with
drones can unintentionally capture data beyond the intended scope,
including bystanders and private properties. Our study took pre-
cautions to avoid such unintended data, but further advancements
are needed in system design to discard unnecessary recordings
and ensure privacy. AI technology can aid in developing drones
that selectively record only the intended subject while masking
unintended data [5, 52]. Additionally, addressing the issue of com-
municating with bystanders to ensure they are aware of not being
recorded is an area that requires exploration. Stringent laws reg-
ulate the use of drones weighing over 250g in various countries
for safety and privacy reasons. To comply with these laws, drone
operators must obtain a remote pilot license. While our research
study had a licensed team member, future drone market oerings
for runners may require individual licenses. However, advance-
ments in technology miniaturization, sensors, motors, and batteries
can enable the development of smaller drones weighing less than 250g
[33, 55]. This would grant runners greater freedom, as sub-250g
drones are often subject to fewer restrictions due to their lower
potential harm. In future implementations, researchers should also
consider additional factors related to navigating around bystanders
or buildings and the utilization of such systems in uncontrolled
environments.
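As one possible illustration of such privacy-preserving post-processing, the sketch below blurs everything outside the tracked runner's bounding box; the box is assumed to come from a person detector or tracker locked onto the consenting participant, and the function is a simplification rather than a complete solution for bystander privacy.

import cv2

def mask_bystanders(frame, runner_box):
    # runner_box = (x, y, w, h) of the consenting runner; blur the rest of the frame
    # so bystanders and private property in the background are not identifiable.
    x, y, w, h = runner_box
    blurred = cv2.GaussianBlur(frame, (51, 51), 0)
    blurred[y:y + h, x:x + w] = frame[y:y + h, x:x + w]  # keep only the runner region sharp
    return blurred

Applied per frame before a recording is stored, such a step would complement, rather than replace, the regulatory and design considerations discussed above.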
8 CONCLUSION
In this paper, we showcased that incorporating drones to conduct a
simple pacing exercise can create an engaging experience without
significant negative consequences. Our qualitative analysis of the
interview responses provided valuable insights into the factors that
contribute to runner enjoyment and those that may need address-
ing. Moreover, the runners’ suggestions regarding the utilization
of drone videos for post-run reflections highlight the benefits as-
sociated with drone technology. The insights gained from their
responses guided the formulation of potential roles and functions
for drones, suggesting promising avenues for future research. Build-
ing on the findings of this study, our future research endeavours
will involve conducting a more comprehensive study that compares
the effectiveness of utilizing drones to support advanced pace train-
ing regimens with traditional methods. This will provide additional
evidence supporting the practicality of drones in assisting with
outdoor pace-related training. Lastly, we also plan to investigate
the design of dashboards for post-run reections that enable easy
identification of problem areas and showcase key actions through
accessible drone video highlights, motivating runners to improve
their training regimens.
ACKNOWLEDGMENTS
The work was made possible by the funding of the EEMCS depart-
ment for the UT-EEMCS Theme Team project “Sports Data and
Interaction” (SDI). Moreover, we would like to acknowledge the
help provided by the members of the projects who aided in user
studies, and review of the paper. Last but not least we would like
to thank the runners who participated in this study without whom
the study would not have been possible during the difficult period
of the COVID-19 pandemic.
REFERENCES
[1]
Zann Anderson and Michael Jones. 2020. Tangible Interactions with Physical-
izations of Personal Experience Data. In Proceedings of the 15th International
Joint Conference on Computer Vision, Imaging and Computer Graphics Theory
and Applications. SCITEPRESS - Science and Technology Publications, Setubal,
Portugal, 163–172. https://doi.org/10.5220/0008990201860194
[2]
Simon D. Angus. 2013. Did recent world record marathon runners employ
optimal pacing strategies? Journal of Sports Sciences 32, 1 (July 2013), 31–45.
https://doi.org/10.1080/02640414.2013.803592
[3]
Aswin Balasubramaniam, Dennis Reidsma, and Dirk Heylen. 2023. Drone-
Driven Running: Exploring the Opportunities for Drones to Support Running
Well-being through a Review of Running and Drone Interaction Technolo-
gies. (2023). https://doi.org/10.1145/3623809.3623831 preprint on webpage
at http://camps.aptaracorp.com/ACM_PMS/PMS/ACM/HAI23/20/4212b9ec-5795-11ee-b37c-16bb50361d1f/OUT/hai23-20.html#.
[4]
Kim Bellware. 2019. Lasers, rabbits and new Nikes: How the 2-hour marathon
barrier was broken. https://www.washingtonpost.com/sports/2019/10/15/lasers-
rabbits-new-kicks-how-hour-marathon-barrier-was-broken/
[5]
Thierry Bouwmans, Sajid Javed, Maryam Sultana, and Soon Ki Jung. 2019. Deep
neural network concepts for background subtraction:A systematic review and
comparative evaluation. Neural Networks 117 (Sept. 2019), 8–66. https://doi.org/
10.1016/j.neunet.2019.04.024
[6]
Virginia Braun and Victoria Clarke. 2006. Using thematic analysis in psychology.
Qualitative Research in Psychology 3, 2 (Jan. 2006), 77–101. https://doi.org/10.
1191/1478088706qp063oa
[7]
Virginia Braun and Victoria Clarke. 2020. Can I use TA? Should I use TA? Should
I not use TA? Comparing reflexive thematic analysis and other pattern-based
qualitative analytic approaches. Counselling and Psychotherapy Research 21, 1
(Oct. 2020), 37–47. https://doi.org/10.1002/capr.12360
[8]
Zhe Cao, Gines Hidalgo, Tomas Simon, Shih-En Wei, and Yaser Sheikh. 2021.
OpenPose: Realtime Multi-Person 2D Pose Estimation Using Part Affinity Fields.
IEEE Transactions on Pattern Analysis and Machine Intelligence 43, 1 (Jan. 2021),
172–186. https://doi.org/10.1109/tpami.2019.2929257
[9]
Arturo Casado, Brian Hanley, Pedro Jiménez-Reyes, and Andrew Renfree. 2021.
Pacing profiles and tactical behaviors of elite runners. Journal of Sport and Health
Science 10, 5 (Sept. 2021), 537–549. https://doi.org/10.1016/j.jshs.2020.06.011
[10]
Shannon E. Clark and Diane M. Ste-Marie. 2007. The impact of self-as-a-
model interventions on children's self-regulation of learning and swimming
performance. Journal of Sports Sciences 25, 5 (March 2007), 577–586. https:
//doi.org/10.1080/02640410600947090
[11]
Pedro Corbí-Santamaría, Alba Herrero-Molleda, Juan García-López, Daniel Boul-
losa, and Vicente García-Tormo. 2023. Variable Pacing Is Associated with Per-
formance during the OCC®Ultra-Trail du Mont-Blanc®(2017–2021). Interna-
tional Journal of Environmental Research and Public Health 20, 4 (Feb. 2023), 3297.
https://doi.org/10.3390/ijerph20043297
[12]
C. Cronin, A.E. Whitehead, S. Webster, and T. Huntley. 2017. Transforming,
storing and consuming athletic experiences: a coach’s narrative of using a video
application. Sport, Education and Society 24, 3 (July 2017), 311–323. https:
//doi.org/10.1080/13573322.2017.1355784
[13]
Joseph La Delfa, Mehmet Aydin Baytas, Rakesh Patibanda, Hazel Ngari, Ro-
hit Ashok Khot, and Florian 'Floyd' Mueller. 2020. Drone Chi: Somaesthetic
Human-Drone Interaction. In Proceedings of the 2020 CHI Conference on Hu-
man Factors in Computing Systems. ACM, New York, NY, USA, 1–13. https:
//doi.org/10.1145/3313831.3376786
[14]
P W Dowrick and C Dove. 1980. The use of self-modeling to improve the
swimming performance of spina bifida children. Journal of Applied Behavior
Analysis 13, 1 (1980), 51–56. https://doi.org/10.1901/jaba.1980.13- 51
[15]
Hao-Shu Fang, Shuqin Xie, Yu-Wing Tai, and Cewu Lu. 2017. RMPE: Regional
Multi-person Pose Estimation. In ICCV. IEEE, New York, NY, USA, 2353–2362.
[16]
Nuša Farič, Henry W.W. Potts, Sarah Rowe, Taryn Beaty, Adrian Hon, and Abi
Fisher. 2021. Running App “Zombies, Run!” Users' Engagement with Physical
Activity: A Qualitative Study. Games for Health Journal 10, 6 (Dec. 2021), 420–429.
https://doi.org/10.1089/g4h.2021.0060
[17]
Eberhard Graether and Florian ‘Floyd’ Mueller. 2012. Joggobot: A Flying Robot as
Jogging Companion. In CHI ’12 Extended Abstracts on Human Factors in Computing
Systems (Austin, Texas, USA) (CHI EA ’12). Association for Computing Machinery,
New York, NY, USA, 1063–1066. https://doi.org/10.1145/2212776.2212386
[18]
Viviane Herdel, Lee J. Yamin, and Jessica R. Cauchard. 2022. Above and Beyond:
A Scoping Review of Domains and Applications for Human-Drone Interaction.
In CHI Conference on Human Factors in Computing Systems. ACM, New York, NY,
USA, 1–22. https://doi.org/10.1145/3491102.3501881
[19]
Keita Higuchi, Tetsuro Shimada, and Jun Rekimoto. 2011. Flying sports assistant:
external visual imagery representation for sports training. In Proceedings of the
2nd Augmented Human International Conference. ACM, New York, NY, USA, 1–4.
https://doi.org/10.1145/1959826.1959833
[20]
Eldar Insafutdinov, Leonid Pishchulin, Bjoern Andres, Mykhaylo Andriluka, and
Bernt Schiele. 2016. DeeperCut: A Deeper, Stronger, and Faster Multi-person
Pose Estimation Model. In Computer Vision – ECCV 2016. Springer International
Publishing, New York, NY, USA, 34–50. https://doi.org/10.1007/978-3-319- 46466-
4_3
[21]
Muhammad Shahidul Islam. 2020. Introducing Drone Technology to Soccer
Coaching. International Journal of Sports Science and Physical Education 5, 1
(2020), 1. https://doi.org/10.11648/j.ijsspe.20200501.11
[22]
Laura Jonker, Marije T. Elferink-Gemser, Ilse M. de Roos, and Chris Visscher.
2012. The Role of Reflection in Sport Expertise. The Sport Psychologist 26, 2 (June
2012), 224–242. https://doi.org/10.1123/tsp.26.2.224
[23]
Armagan Karahanoglu, Rúben Hugo De Freitas Gouveia, Jasper Reenalda, and
Geke D.S. Ludden. 2021. How Are Sports-Trackers Used by Runners? Running-
Related Data, Personal Goals, and Self-Tracking in Running. Sensors (Switzerland)
21, 11 (26 May 2021), 1–11. https://doi.org/10.3390/s21113687
[24]
Łukasz Kidziński, Bryan Yang, Jennifer L. Hicks, Apoorva Rajagopal, Scott L.
Delp, and Michael H. Schwartz. 2020. Deep neural networks enable quantitative
movement analysis using single-camera videos. Nature Communications 11, 1
(Aug. 2020), 1–10. https://doi.org/10.1038/s41467-020-17807-z
[25]
Hyun Young Kim, Bomyeong Kim, and Jinwoo Kim. 2016. The Naughty Drone: A
Qualitative Research on Drone as Companion Device. In Proceedings of the 10th
International Conference on Ubiquitous Information Management and Communi-
cation. ACM, New York, NY, USA, 1–6. https://doi.org/10.1145/2857546.2857639
[26]
Francisco Kiss, Konrad Kucharski, Sven Mayer, Lars Lischke, Pascal Knierim, An-
drzej Romanowski, and Paweł W. Wozniak. 2017. RunMerge: Towards Enhanced
Proprioception for Advanced Amateur Runners. In Proceedings of the 2017 ACM
Conference Companion Publication on Designing Interactive Systems. ACM, New
York, NY, USA, 192–196. https://doi.org/10.1145/3064857.3079144
[27]
Shiou Yih Lee, Chengju Du, Zhihui Chen, Hao Wu, Kailang Guan, Yirong Liu,
Yongjie Cui, Wenyan Li, Qiang Fan, and Wenbo Liao. 2020. Assessing Safety
and Suitability of Old Trails for Hiking Using Ground and Drone Surveys. ISPRS
International Journal of Geo-Information 9, 4 (April 2020), 221. https://doi.org/10.
3390/ijgi9040221
[28]
Jane Lessiter, Jonathan Freeman, Edmund Keogh, and Jules Davidoff. 2001. A
Cross-Media Presence Questionnaire: The ITC-Sense of Presence Inventory.
Presence: Teleoperators and Virtual Environments 10, 3 (June 2001), 282–297. https:
//doi.org/10.1162/105474601300343612
[29]
Adriano E. Lima-Silva, Romulo C. M. Bertuzzi, Flavio O. Pires, Ronaldo V. Barros,
João F. Gagliardi, John Hammond, Maria A. Kiss, and David J. Bishop. 2009.
Effect of performance level on pacing strategy during a 10-km running race.
European Journal of Applied Physiology 108, 5 (Dec. 2009), 1045–1053. https:
//doi.org/10.1007/s00421-009- 1300-6
[30]
Alexander Mathis, Pranav Mamidanna, Kevin M. Cury, Taiga Abe, Venkatesh N.
Murthy, Mackenzie Weygandt Mathis, and Matthias Bethge. 2018. DeepLabCut:
markerless pose estimation of user-dened body parts with deep learning. Nature
Neuroscience 21, 9 (Aug. 2018), 1281–1289. https://doi.org/10.1038/s41593- 018-
0209-y
[31]
Sven Mayer, Pascal Knierim, Pawel W Wozniak, and Markus Funk. 2017. How
drones can support backcountry activities. In Proceedings of the 2017 natureCHI
workshop, in conjunction with ACM mobileHCI, Vol. 17. ACM, New York, NY, USA,
6.
[32]
Daphne Menheere, Evianne van Hartingsveldt, Mads Birkebæk, Steven Vos, and
Carine Lallemand. 2021. Laina: Dynamic Data Physicalization for Slow Exercising
Feedback. In Designing Interactive Systems Conference 2021 (Virtual Event, USA)
(DIS ’21). Association for Computing Machinery, New York, NY, USA, 1015–1030.
https://doi.org/10.1145/3461778.3462041
[33]
Syed Agha Hassnain Mohsan, Nawaf Qasem Hamood Othman, Yanlong Li, Mo-
hammed H. Alsharif, and Muhammad Asghar Khan. 2023. Unmanned aerial
vehicles (UAVs): practical aspects, applications, open challenges, security is-
sues, and future trends. Intelligent Service Robotics 16 (Jan. 2023), 109–137.
https://doi.org/10.1007/s11370-022- 00452-4
[34]
Florian ’Floyd’ Mueller and Matthew Muirhead. 2015. Jogging with a Quadcopter.
In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing
Systems (Seoul, Republic of Korea) (CHI ’15). Association for Computing Machin-
ery, New York, NY, USA, 2023–2032. https://doi.org/10.1145/2702123.2702472
[35]
Joseph S Munn. 2016. Using an aerial drone to examine lateral movement in sweep
rowers. Ph. D. Dissertation. The University of Western Ontario (Canada).
[36]
Elizabeth L. Murnane, Xin Jiang, Anna Kong, Michelle Park, Weili Shi, Connor
Soohoo, Luke Vink, Iris Xia, Xin Yu, John Yang-Sammataro, Grace Young, Jenny
Zhi, Paula Moya, and James A. Landay. 2020. Designing Ambient Narrative-
Based Interfaces to Reflect and Motivate Physical Activity. In Proceedings of the
2020 CHI Conference on Human Factors in Computing Systems (Honolulu, HI,
USA) (CHI ’20). Association for Computing Machinery, New York, NY, USA, 1–14.
https://doi.org/10.1145/3313831.3376478
[37]
Flora Panteli, Charilaos Tsolakis, Dimitris Efthimiou, and Athanasia Smirniotou.
2013. Acquisition of the Long Jump Skill, Using Different Learning Techniques.
The Sport Psychologist 27, 1 (March 2013), 40–52. https://doi.org/10.1123/tsp.27.1.
40
[38]
C. Perin, R. Vuillemot, C. D. Stolper, J. T. Stasko, J. Wood, and S. Carpendale. 2018.
State of the Art of Sports Data Visualization. Computer Graphics Forum 37, 3
(June 2018), 663–686. https://doi.org/10.1111/cgf.13447
[39]
Anniek Postema, Arnold B. Bakker, and Heleen van Mierlo. 2021. Work-Sports
Enrichment in Amateur Runners: A Diary Study. The Journal of Psychology 155,
4 (March 2021), 406–425. https://doi.org/10.1080/00223980.2021.1894411
[40]
PUMA. 2016. PUMA introduces the BeatBot - PUMA CATch up. https://www.
puma-catchup.com/puma- introduces-the- beatbot/
[41]
Jiashuo Qi, Dongguang Li, Cong Zhang, and Yu Wang. 2022. Alpine Skiing
Tracking Method Based on Deep Learning and Correlation Filter. IEEE Access 10
(2022), 39248–39260. https://doi.org/10.1109/access.2022.3166949
[42]
Andrew Renfree, Everton Crivoi do Carmo, Louise Martin, and Derek M. Peters.
2015. The Influence of Collective Behavior on Pacing in Endurance Competitions.
Frontiers in Physiology 6 (Dec. 2015), 1–5. https://doi.org/10.3389/fphys.2015.
00373
[43]
Lionel Reveret, Sylvain Chapelle, Franck Quaine, and Pierre Legreneur. 2020. 3D
Visualization of Body Motion in Speed Climbing. Frontiers in Psychology 11 (Sept.
2020), 1–8. https://doi.org/10.3389/fpsyg.2020.02188
[44]
Andrzej Romanowski, Sven Mayer, Lars Lischke, Krzysztof Grudzień, Tomasz
Jaworski, Izabela Perenc, Przemysław Kucharski, Mohammad Obaid, Tomasz
Kosizski, and Paweł W. Wozniak. 2017. Towards Supporting Remote Cheering
during Running Races with Drone Technology. In Proceedings of the 2017 CHI
Conference Extended Abstracts on Human Factors in Computing Systems (Denver,
Colorado, USA) (CHI EA ’17). Association for Computing Machinery, New York,
NY, USA, 2867–2874. https://doi.org/10.1145/3027063.3053218
[45]
Amanda M. Rymal, Rose Martini, and Diane M. Ste-Marie. 2010. Self-Regulatory
Processes Employed During Self-Modeling: A Qualitative Analysis. The Sport
Psychologist 24, 1 (March 2010), 1–15. https://doi.org/10.1123/tsp.24.1.1
[46]
Patrick P.J.M. Schoenmakers and Kate E. Reed. 2018. The physiological and
perceptual demands of running on a curved non-motorised treadmill: Implications
for self-paced training. Journal of Science and Medicine in Sport 21, 12 (Dec. 2018),
1293–1297. https://doi.org/10.1016/j.jsams.2018.05.011
[47]
Atom Scott, Ikuma Uchida, Masaki Onishi, Yoshinari Kameda, Kazuhiro Fukui,
and Keisuke Fujii. 2022. SoccerTrack: A Dataset and Tracking Algorithm for
Soccer with Fish-eye and Drone Videos. In 2022 IEEE/CVF Conference on Computer
Vision and Pattern Recognition Workshops (CVPRW). IEEE, New York, NY, USA,
3569–3579. https://doi.org/10.1109/cvprw56347.2022.00401
[48]
Stephen Seiler and Jarl Espen Sjursen. 2004. Effect of work duration on physio-
logical and rating scale of perceived exertion responses during self-paced interval
training. Scandinavian Journal of Medicine and Science in Sports 14, 5 (Oct. 2004),
318–325. https://doi.org/10.1046/j.1600-0838.2003.00353.x
[49]
Matthias Seuter, Eduardo Rodriguez Macrillante, Gernot Bauer, and Christian
Kray. 2018. Running with Drones: Desired Services and Control Gestures. In
Proceedings of the 30th Australian Conference on Computer-Human Interaction
(Melbourne, Australia) (OzCHI ’18). Association for Computing Machinery, New
York, NY, USA, 384–395. https://doi.org/10.1145/3292147.3292156
[50]
Matthias Seuter, Max Pfeiffer, Gernot Bauer, Karen Zentgraf, and Christian Kray.
2017. Running with Technology. Proceedings of the ACM on Interactive, Mobile,
Wearable and Ubiquitous Technologies 1, 3 (Sept. 2017), 1–17. https://doi.org/10.
1145/3130966
[51]
Diane M Ste-Marie, Michael J Carter, and Zachary D Yantha. 2019. Self-controlled
learning: Current findings, theoretical perspectives, and future directions. Routledge,
Oxfordshire, UK, Chapter Self-controlled learning: Current findings, theoretical
perspectives, and future directions, 1–22.
[52]
ShiJie Sun, Naveed Akhtar, HuanSheng Song, Ajmal S. Mian, and Mubarak Shah.
2019. Deep Affinity Network for Multiple Object Tracking. IEEE Transactions on
Pattern Analysis and Machine Intelligence 43, 1 (2019), 104–119. https://doi.org/
10.1109/tpami.2019.2929520
[53]
Melanie Swan. 2013. The Quantified Self: Fundamental Disruption in Big Data
Science and Biological Discovery. Big Data 1, 2 (June 2013), 85–99. https:
//doi.org/10.1089/big.2012.0002
[54]
Jonathan Taylor, Greg Atkinson, and Russell Best. 2021. Paced to perfection:
Exploring the potential impact of WaveLight Technology in athletics. The Sport
and Exercise Scientist 68, Summer (2021), 8–9.
[55]
Dante Tezza and Marvin Andujar. 2019. The State-of-the-Art of Human–Drone
Interaction: A Survey. IEEE Access 7 (2019), 167438–167454. https://doi.org/10.
1109/access.2019.2953900
[56]
Christian Thiel, Carl Foster, Winfried Banzer, and Jos De Koning. 2012. Pacing
in Olympic track races: Competitive tactics versus best performance strategy.
Journal of Sports Sciences 30, 11 (July 2012), 1107–1115. https://doi.org/10.1080/
02640414.2012.701759
[57]
Alexander Toshev and Christian Szegedy. 2014. DeepPose: Human Pose Es-
timation via Deep Neural Networks. In 2014 IEEE Conference on Computer
Vision and Pattern Recognition. IEEE, New York, USA, 1653–1660. https:
//doi.org/10.1109/cvpr.2014.214
[58]
Scott D. Uhlrich, Antoine Falisse, Łukasz Kidziński, Julie Muccini, Michael Ko,
Akshay S. Chaudhari, Jennifer L. Hicks, and Scott L. Delp. 2022. OpenCap: 3D
human movement dynamics from smartphone videos. bioRxiv 0, 0 (July 2022),
1–48. https://doi.org/10.1101/2022.07.07.499061
[59]
Anna Wojciechowska, Jeremy Frey, Esther Mandelblum, Yair Amichai-
Hamburger, and Jessica R. Cauchard. 2019. Designing Drones: Factors and
Characteristics Influencing the Perception of Flying Robots. Proceedings of the
ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 3, 3 (Sept. 2019),
1–19. https://doi.org/10.1145/3351269
[60]
Paweł W. Woźniak, Monika Zbytniewska, Francisco Kiss, and Jasmin Niess. 2021.
Making Sense of Complex Running Metrics Using a Modified Running Shoe. In
Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems.
ACM. https://doi.org/10.1145/3411764.3445506
[61]
Chih-Hung Yu, Cheng-Chih Wu, Jye-Shyan Wang, Hou-Yu Chen, and Yu-Tzu
Lin. 2020. Learning Tennis through Video-based Reflective Learning by Using
Motion-Tracking Sensors. Journal of Educational Technology & Society 23, 1
(2020), 64–77. https://www.jstor.org/stable/26915407
[62]
Andrea Zignoli and Damiano Fruet. 2022. Insights in road cycling downhill
performance using aerial drone footages and an ‘optimal’ reference trajectory.
Sports Engineering 25, 1 (Oct. 2022), 1–9. https://doi.org/10.1007/s12283-022-
00386-1
[63]
Barry J. Zimmerman. 2000. Attaining Self-Regulation. In Handbook of Self-
Regulation. Elsevier, San Diego, 13–39. https://doi.org/10.1016/b978- 012109890-
2/50031-7
[64]
Sergej G. Zwaan and Emilia I. Barakova. 2016. Boxing against Drones: Drones
in Sports Education. In Proceedings of the The 15th International Conference
on Interaction Design and Children (Manchester, United Kingdom) (IDC ’16).
Association for Computing Machinery, New York, NY, USA, 607–612. https:
//doi.org/10.1145/2930674.2935991