Title: Smartwatch inertial sensors continuously monitor real-world motor fluctuations in Parkinson's disease
Authors: Rob Powers1, Maryam Etezadi-Amoli1, Edith M. Arnold1, Sara Kianian1,2, Irida
Maxsim Gibiansky1, Dan Trietsch1, Alexander Singh Alvarado1, James D. Kretlow1, Todd M.
Herrington3,4, Salima Brillman5, Nengchun Huang6, Peter T. Lin6, Hung A. Pham1, Adeeti V.
1Apple, Inc., Cupertino, California, 95014
2Renaissance School of Medicine, Stony Brook University, Stony Brook, NY 11794
3Massachusetts General Hospital, Department of Neurology, Boston, MA 02114
4Harvard Medical School, Department of Neurology, Boston, MA 02115
5Parkinson’s Disease and Movement Center of Silicon Valley, Menlo Park, CA 94025
6Silicon Valley Parkinson’s Center, Los Gatos, California, 95032
*Corresponding author. Email: firstname.lastname@example.org (A.V.U.)
One Sentence Summary: A smartwatch-based sensing system measures tremor and dyskinesia,
enabling remote monitoring and aiding medication titration in Parkinson’s disease.
Longitudinal, remote monitoring of motor symptoms in Parkinson's disease (PD) could
enable more precise treatment decisions. We developed the Motor fluctuations Monitor for
Parkinson's Disease (MM4PD), an ambulatory monitoring system that used smartwatch
inertial sensors to continuously track fluctuations in resting tremor and dyskinesia. We designed
and validated MM4PD in 343 participants with PD, including a longitudinal study of up to six
months in a 225-subject cohort. MM4PD measurements correlated with clinical evaluations of
tremor severity (ρ = 0.80) and mapped to expert ratings of dyskinesia presence (P < 0.001)
during in-clinic tasks. MM4PD captured symptom changes in response to treatment that matched
the treating clinician's expectations in 94% of evaluated subjects. In the remaining 6% of cases,
symptom data from MM4PD identified opportunities to make clinically applicable changes in
pharmacologic strategy. These results demonstrate the promise of MM4PD as a tool to support
improvements in patient-clinician communication, medication titration, and clinical trial design.
Smartwatches are a well-established tool for continuous activity and fitness tracking (1-6).
Recent studies have shown that smartwatch-based longitudinal tracking can be extended to
clinical applications (3, 7-9), such as remote monitoring of motor symptoms in Parkinson's
disease (PD), the second most common neurodegenerative disease worldwide (4, 10-19).
Quality of life in patients with PD correlates with a clinician's ability to precisely titrate
medications (15, 20, 21). Dopamine replacement therapies improve motor symptoms, but
determining optimal medication schedules remains challenging. The Movement Disorder
Society-sponsored revision of the Unified Parkinson's Disease Rating Scale (MDS-UPDRS) part
III is a broadly accepted tool that uses a quantized five-point scale to measure motor symptom
severity (22-25). These assessments provide only a snapshot view of the patient's day during in-
clinic visits that typically occur every few months. For out-of-clinic symptom tracking, clinicians
rely on patient recall of symptoms (26-30), which is often error-prone, particularly for
medication-induced symptoms like dyskinesia (31). As such, clinicians are limited by infrequent,
coarse patient evaluations that cannot capture subtle disease progression or daily fluctuations
from medication, exercise, diet, or stress (20, 32-38).
We hypothesized that objective, continuous, and sensitive tracking of tremor and
dyskinesia could reveal more granular on-off patterns in a patient's day and serve as a clinical
decision support tool to improve medication titration. Prior work has shown that machine
learning and kinematics-based algorithms applied to inertial sensor data can identify Parkinsonian
resting tremor (12, 18, 22, 39-49) and choreiform dyskinesias, a side effect often associated with
peak doses of dopamine medications (10, 13, 16, 34, 50-56). However, only a few algorithms
have been successfully translated into monitoring systems usable in real-world scenarios and
with sufficient patient adherence to potentially impact clinical decision-making (57-65).
To bridge these gaps, we developed an ambulatory monitoring system, the Motor
fluctuations Monitor for Parkinson's Disease (MM4PD), that unobtrusively and accurately
captures patterns of resting tremor and choreiform dyskinesia. MM4PD was designed with all-
day data from a longitudinal control study, used to establish signal-to-noise cutoffs for tremor
and dyskinesia estimates, and a longitudinal patient study, used to assess the system's ability to
capture motor fluctuations and its potential utility in aiding clinicians during medication titration
across a range of Parkinson's phenotypes in hundreds of participants with PD.
Figure 1. Overview of the motor ﬂuctuation monitor for Parkinson’s disease. (A) Raw accelerometer and
gyroscope data from smartwatches contained features that predicted the presence of dyskinetic, choreiform
movements and the severity of Parkinsonian resting tremor. Second-level choreiform dyskinesia and tremor metrics
were aggregated into minute-level outputs that were accessible via an application programming interface (API) on
the smartwatch. (B) Minute-level outputs were averaged over multiple days to generate smartwatch symptom
proﬁles for each patient. (C) Clinicians evaluated smartwatch symptom proﬁles to capture the effect of medication
titration, deep brain stimulation, and lifestyle changes.
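The minute-to-profile aggregation described in (A) and (B) can be sketched in a few lines of Python. This is an illustrative reconstruction under assumed data shapes (the record format, field meanings, and function names are not from the study), not the on-device implementation:

```python
from collections import defaultdict

def build_symptom_profile(minute_records, bin_minutes=15):
    """Average minute-level symptom outputs into a time-of-day profile.

    minute_records: iterable of (minute_of_day, detected) pairs, where
    minute_of_day is 0-1439 and detected is 0 or 1 for the symptom in
    that worn minute, pooled across multiple days of wear.
    Returns {bin_start_minute: percent_of_worn_minutes_with_symptom}.
    """
    counts = defaultdict(lambda: [0, 0])  # bin -> [detected, worn]
    for minute_of_day, detected in minute_records:
        b = (minute_of_day // bin_minutes) * bin_minutes
        counts[b][0] += int(detected)
        counts[b][1] += 1
    return {b: 100.0 * d / n for b, (d, n) in sorted(counts.items())}

# Two days of data for the 9:00-9:59 hour: tremor detected on day 1 only,
# so each 15-min bin between 9:00 and 9:45 averages to 50%.
day1 = [(540 + m, 1) for m in range(60)]
day2 = [(540 + m, 0) for m in range(60)]
profile = build_symptom_profile(day1 + day2)
```

Averaging across days in this way is what smooths sporadic per-minute false positives into stable symptom profiles.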
MM4PD development and study coverage
MM4PD development occurred in two main phases: (i) algorithm design that converted
raw inertial sensor data into tremor and dyskinesia estimates, and (ii) validation of the ability of
MM4PD outputs to capture fluctuating daily symptom profiles in response to medication schedules
and other factors while subjects engaged in real-world behavior (Fig. 1, fig. S1). The tremor and
dyskinesia algorithms were developed using sensor data captured across three studies (Fig. 2, ﬁg.
S1, table S1) covering 343 unique participants with PD and 171 elderly non-PD controls.
Algorithms were primarily designed using sensor data collected during an initial pilot study
utilizing in-clinic MDS-UPDRS assessments. We established that MM4PD is accurate and
sensitive in estimating symptom presence or severity in controlled scenarios. Using all-day
sensor data from an additional phase of the pilot study consisting of one week of daily device
usage to better assess real-world scenarios, we extended the algorithms' robustness during daily
activities, including walking (table S2). A multi-month longitudinal patient study was then used
for further design and validation of the algorithms' ability to discern changes caused by events
more commonly occurring at these longer time scales, such as medication changes. A cohort of
171 older subjects enrolled in the longitudinal control study, underwent continuous monitoring
with the same smartwatch for the purposes of MM4PD development, and served as age-matched
controls. In both the pilot study and the longitudinal patient study, we recruited from a broad
enrollment pool to test the MM4PD algorithms on patients with varied routines. We did not
restrict by subtype or severity of PD, and patients reported a broad range of dominant symptoms,
from tremor to gait impairment, bradykinesia, and fear of falling (fig. S2).
Figure 2. Summary of study design and validation. (A) Smartwatch sensor data was mapped to MDS-UPDRS
ratings by designing an algorithm that worked in bounded conditions such as in-clinic cognitive distraction tasks,
and further enriched with shorter free-living periods spanning weeks. In the validation phase, motor ﬂuctuations
from smartwatch symptom proﬁles were retrospectively compared to a patient’s prescribed medication times.
Longitudinal datasets were divided into design and hold-out sets. (B) Three studies were performed to map sensor
data to MDS-UPDRS ratings: (i) the pilot study, with 118 patients with Parkinson's disease (PD) and multiple
expert ratings; (ii) the longitudinal patient study, with 6+ months of data from 225 patients with PD; and (iii) the
longitudinal control study, with 171 elderly, non-PD controls.
[Figure 2 panel: study summary — pilot study (n = 118 PD; up to 4 in-clinic sessions with MDS-UPDRS Parts
II-IV plus a 1-week live-on period under bounded conditions with compliant device usage; ratings by 3 movement
disorder specialists through video; smartwatch sensor data vs. MDS-UPDRS ratings; used for algorithm design).
Longitudinal patient study (n = 225 PD; up to 6 months of free-living, freeform device usage; 1 movement disorder
specialist familiar with patient history; algorithm design n = 143, hold-out n = 82; smartwatch outputs vs.
medication data). Longitudinal control study (n = 171 elderly, non-PD controls; up to 12 months of freeform
device usage).]
System performance: tremor
We designed MM4PD to detect and classify the severity of tremor (based on
displacement) as slight (<0.1 cm), mild (0.1-0.6 cm), moderate (0.6-2.2 cm), or strong (>2.2 cm),
or to report it as unknown or absent, during each one-minute interval the smartwatch was worn (fig. S3). First,
we veriﬁed the accuracy of smartwatch estimates of wrist displacement. The Pearson correlation
coefﬁcient between displacement measured by a motion capture system with sub-millimeter
accuracy and the watch estimate was 0.98 in a control subject (Fig. 3A) with a mean signed error
of -0.04 ± 0.17 cm. Wrist displacement estimates responsively tracked increases in tremor
severity as conﬁrmed by time-synchronized video in several subjects (Fig. 3B, ﬁg. S4, movies S1
to S3). Next, we showed that watch displacement correlated with expert MDS-UPDRS tremor
amplitude ratings, with a rank correlation coefﬁcient of 0.80 (Fig. 3C, ﬁg. S5). During cognitive
distraction tasks from the pilot study, MM4PD captured tremor in 97.7% of cases where all raters
agreed (table S3, ﬁg. S6). The median tremor false positive rate over 43,300 hours of all-day data
from 171 elderly, non-PD control subjects in the longitudinal control study was 0.25%. False
positives occurred infrequently during targeted activities in young, healthy controls, such as
manual teeth brushing (8%) and playing a musical instrument (2%) (table S2).
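As an illustration only, the severity banding above can be written as a small classifier. The thresholds are those stated in the text; the displacement estimation itself, the handling of the "unknown" class, and the boundary conventions are assumptions, not MM4PD's implementation:

```python
def classify_tremor(displacement_cm):
    """Map an estimated wrist displacement (cm) for a one-minute
    interval to the severity bands used in the text. None stands in
    for minutes where no tremor was detected; boundary handling
    (inclusive vs. exclusive) is an assumption here."""
    if displacement_cm is None:
        return "absent"
    if displacement_cm < 0.1:
        return "slight"
    if displacement_cm < 0.6:
        return "mild"
    if displacement_cm <= 2.2:
        return "moderate"
    return "strong"
```

For example, an estimated displacement of 1.0 cm falls in the moderate band, and 3.0 cm in the strong band.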
All-day tremor estimates from the longitudinal patient study, as quantified by an
individual's mean percentage of time with tremor detected per day, correlated with their MDS-
UPDRS tremor constancy score assessed during a brief in-clinic visit at the start of the study,
with a Spearman's rank correlation coefficient of 0.72 in both design and hold-out sets (Fig. 3, D and
E). Intra-day fluctuations from sensor estimates corresponded with prescribed times for
carbidopa/levodopa (C/L) doses, as shown in the smartwatch symptom profiles for several
subjects (fig. S7). An example is shown for subject S002 (Fig. 3F), who showed characteristic
wearing-off periods of 1.5 to 2.5 hours between peak and trough times, which were consistent with
established C/L pharmacokinetics in non-fasting patients (32, 33, 35, 66, 67).
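The Spearman rank correlations reported here (ρ = 0.80 in-clinic, 0.72 all-day) follow the standard definition: the Pearson correlation of the two variables' ranks, with tied values receiving their average rank. A minimal standard-library sketch (not the study's analysis code):

```python
def _average_ranks(values):
    """Rank values 1..n, assigning tied values their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = _average_ranks(x), _average_ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5
```

Because only ranks matter, any monotone relationship between sensor estimates and clinical scores yields ρ = 1, which suits ordinal scales like MDS-UPDRS ratings.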
Figure 3. Smartwatch estimates of tremor severity and presence correlate to MDS-UPDRS ratings and show
intra-day motor fluctuations. (A) Displacement estimated from the smartwatch correlated with the motion capture
reference during simulated tremor by a control subject in 9 different seated and standing positions (r2=0.98). (B)
Smartwatch displacements from inertial sensor data sensitively tracked the onset of tremor during cognitive
distraction tasks (movie S1). Upper: accelerometry, lower: estimated displacement. (C) Displacement estimates from
the smartwatch during 253 seated and standing tasks are shown as a box plot separated into boxes by the average
MDS-UPDRS tremor severity ratings from three movement disorder specialists in the pilot study (Spearman rank
correlation of 0.80). Mean daily smartwatch tremor estimates correlated with MDS-UPDRS tremor constancy
ratings from the subject’s last in-clinic visit in (D) design (n=95) and (E) hold-out (n=43) sets. (F) Smartwatch
symptom proﬁles (tremor amounts in 15-min increments) for subject S002 indicating ﬂuctuations characteristic of
the pharmacokinetics of the patient's medication (25/100 mg Carbidopa/Levodopa). Data in (D and E) are shown as
box plots with each box containing the mean daily fraction of time reported with tremor by the smartwatch for
subjects with a tremor constancy assessment shown on the x-axis. (Spearman rank correlations of 0.72 were
calculated for each set).
System performance: dyskinesia
We developed a Choreiform Movement Score (CMS) (Supplementary Methods: Chorea
Detection) that returns an estimate of the frequency of dyskinesia (without classification or
quantification of its severity or amplitude) by selecting features that capture the irregular, jerky
movements, visible at the wrist, of subjects with a diagnosis of chorea. The dyskinesia algorithm
was designed and validated across 343 participants with PD (61 with dyskinesia) and 171
elderly, non-PD controls (ﬁg. S1). We calculated CMS from sensor data in the pilot study, and
compared it to dyskinesia ratings from three MDS-certiﬁed experts during multiple MDS-
UPDRS assessments. CMS showed signiﬁcant differences (P<0.001) for all pairwise
comparisons using a Wilcoxon rank sum test across three groups: (i) 65 subjects with conﬁrmed
absence of in-session dyskinesia by all three raters (89 tasks), (ii) 69 subjects with discordant
dyskinesia ratings (109 tasks), and (iii) 19 subjects with conﬁrmed dyskinesia across all three
raters (22 tasks) (Fig. 4, A to C). Examples of detected and undetected dyskinesia during a
cognitive distraction task are shown in fig. S4 (C to F) and movies S4 to S7.
The amount of dyskinesia detected by MM4PD signiﬁcantly differed between subjects
with PD with known chorea and those without, in both cross-validation and hold-out datasets. In
our cross-validation design set, we detected dyskinesia for an average of 10.7±9.9% (μ±σ) of the
day in 32 subjects with chorea (Fig. 4D, table S4). In contrast, we detected dyskinesia for
2.7±2.2% of the day in 125 patients with PD with no known dyskinesia (P<0.001, Wilcoxon rank
sum test). In a hold-out dataset from the longitudinal patient study, the percentage of time
dyskinesias were detected for the chorea group (5.9±5.3%) signiﬁcantly differed from subjects
with no reported dyskinesias (2.0±2.2%) (P=0.027, Wilcoxon rank sum test) (Fig. 4E, table S4).
Dyskinesia false positive rates were low across common activities like walking (1%). In all-day
data from elderly, non-PD controls in the longitudinal control study, the median false positive
rate was 2.0% (table S2). However, specific activities that mimic choreiform movements, such as
playing the piano, had high false positive rates (table S2). Averaged over time, MM4PD
dyskinesia and tremor estimates nonetheless showed "on-off" fluctuations, alternating between
peaks of Parkinsonian tremor symptoms and dyskinetic side effects of levodopa therapy, as
shown for subject S003 (Fig. 4F).
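The group comparisons above rely on the Wilcoxon rank sum (Mann-Whitney U) test. A generic normal-approximation version can be sketched as follows; this is illustrative, not the study's analysis code, and it omits the tie and continuity corrections that a production implementation (e.g., scipy.stats.ranksums) would apply:

```python
import math

def rank_sum_p(x, y):
    """Two-sided Wilcoxon rank sum test via the normal approximation.
    Returns the p-value; assumes few or no ties (no tie correction)."""
    combined = sorted((v, 0 if i < len(x) else 1)
                      for i, v in enumerate(list(x) + list(y)))
    # Rank sum of group x (ranks are 1-based; ties get average rank).
    r1 = 0.0
    i = 0
    while i < len(combined):
        j = i
        while j + 1 < len(combined) and combined[j + 1][0] == combined[i][0]:
            j += 1
        avg_rank = (i + j) / 2 + 1
        r1 += avg_rank * sum(1 for k in range(i, j + 1) if combined[k][1] == 0)
        i = j + 1
    n1, n2 = len(x), len(y)
    u = r1 - n1 * (n1 + 1) / 2          # Mann-Whitney U for group x
    mu = n1 * n2 / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u - mu) / sigma
    return math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
```

Because the test uses only ranks, it makes no normality assumption about the per-subject percent-time-with-dyskinesia values, which are bounded and skewed.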
Figure 4. In-clinic and all-day smartwatch choreiform dyskinesia detection matches clinical evaluation. (A)
Chorea movement scores created from smartwatch inertial sensor data fall above the dyskinesia detection threshold
during a standardized MDS-UPDRS cognitive distraction task for a patient. Upper: accelerometry, lower: chorea
movement scores. (B) Chorea movement scores remain low for a patient who is walking and does not have
dyskinesia. Upper: accelerometry, lower: chorea movement scores. (C) Chorea Movement Scores (CMS) computed
during in-clinic cognitive distraction tasks for the pilot study differentiated between the presence or absence of
dyskinesia (DK) as based on expert ratings (***P < 0.001 for all pairwise comparisons, using Wilcoxon rank sum
test). (D) The amount of dyskinesia detected in patients signiﬁcantly differed between subjects with and without
chorea using all-day data in a design set (***P < 0.001, using Wilcoxon rank sum test), (E) and a hold-out set (*P =
0.027, using Wilcoxon rank sum test). (F) A smartwatch symptom profile for subject S003 captures alternating
dyskinesia and tremor peaks relative to prescribed medication times.
Real world clinical deployment
Subjects in the longitudinal patient study (aged 71±8.9 years) wore the watch for 10.9±2.5
hours/day and remained in the study for an average of 104±59 days (μ±σ); only 3% of subjects
dropped out of the study (table S1). Subjects spanned a range of all-day behavior and activity,
were treated with a variety of medications, and engaged in diverse activities enabled by
smartwatch functionality, such as activity tracking, workouts, and guided breathing (fig. S8, S9).
To determine whether watch tremor and dyskinesia estimates could capture symptom
patterns relevant in the course of disease and treatment monitoring, we visualized MM4PD
outputs to create individual symptom proﬁles for all subjects in the longitudinal patient study. We
found that smartwatch symptom proﬁles captured changes after individuals underwent surgery
for deep brain stimulation (DBS) (ﬁg. S10A), deep brain stimulation reprogramming (ﬁg. S10B),
and even as a subject became more adherent to their prescribed treatment plan (ﬁg. S11).
MM4PD also captured days with worsened symptoms and improvement with medication, as
shown for subject S004, who experienced worsened tremor until beginning additional morning
doses of controlled-release carbidopa/levodopa (C/L CR) (Fig. 5, fig. S12).
Figure 5. Longitudinal tremor and choreiform dyskinesia symptom levels and proﬁles reﬂect changes in
medication titration. (A) Longitudinal tracking of tremor displacement over time for subject S004. Outlier
symptom burden and severity were observed on day 19, confirmed by the patient's report of a "bad" day in clinical notes.
Med, medication. (B) Mean symptoms per day for subject S004. After day 19, the patient started a new prescribed
medication plan and the amount of tremor reduced again. (C) Smartwatch symptom proﬁles (% per 15-min window)
aggregated across multiple days according to medication schedules for subject S004. Upper plot: Higher tremor
levels were observed in the morning and mid-afternoon, which matched patient-reported "off" times. Lower plot: A
reduction in the amount of tremor was seen during the patient’s mid-afternoon “off" time on the new medication
schedule (day 20 onward).
We then evaluated whether smartwatch symptom measurement could identify unexpected
responses to treatment that may not be caught by traditional clinical assessment. A clinician well-
versed with each subject’s medical history reviewed all medication changes that occurred during
the longitudinal patient study in the absence of additional smartwatch data. The clinician then
completed a second review of the medication changes, with accompanying smartwatch symptom
profiles from before and after treatment, to determine whether the symptom data matched his initial clinical
assessment based on medical history alone. Changes in symptom profiles for 104 subjects who
underwent treatment matched the clinician’s expectation in 94% of cases (Fig. 6A). In the
[Figure 5 panels: medication schedule 1 (19 days) and schedule 2 (25 days) for subject S004, showing prescribed
medication times, percentile bands of median tremor displacement (18-85th, 25-75th, and 40-60th percentiles), and
tremor amounts (<0.1 cm, 0.1-0.6 cm, 0.6-2.2 cm) as % of each 15-minute window by time of day; the patient-
reported "bad" day and "off" times are marked. C/L 25/100x3 = 3 doses of carbidopa/levodopa 25/100 mg; CR =
controlled release.]
remaining 6% of cases (n=6), we found that smartwatch symptom proﬁles helped the clinician
identify likely alternate clinical or pharmacological responses that may have otherwise gone
unnoticed. For example, in three of six cases, the clinician determined that MM4PD detected
symptom changes imperceptible by traditional assessment but that were supported by secondary
clinical evidence (e.g., subject reported upper limb cramps, but did not report dyskinesias) (table
S6, ﬁg. S13A). In the other three cases, the clinician deduced that MM4PD data indicated that
those subjects could have uncommon but known side effects from other prescribed drugs that
impacted their tremor or dyskinesia symptoms in an unexpected manner (e.g., the antipsychotic
medication clozapine was thought to have reduced tremor in some subjects) (table S6, fig. S13).
Additionally, the clinician noted that MM4PD may have the potential to support detection of variable
or subclinical symptoms such as emergent tremor or dyskinesia (fig. S13A, S14, S15). The
clinician also identified distinct patterns of dyskinesia, including peak-dose and
diphasic dyskinesias, which require distinct approaches to medication management (fig. S16).
Finally, we tested whether interpretation of smartwatch symptom proﬁles would
generalize beyond a single clinician familiar with the subject’s history. Three additional
movement disorder specialists classiﬁed smartwatch symptom proﬁles as either “before" or
“after" treatment for a given medication change. No supporting medical history was supplied
other than MDS-UPDRS ratings for tremor and dyskinesia assessed at the start of the study.
Raters correctly matched smartwatch symptom proﬁles in most cases (87.5%). For the remaining
cases, the rater presumed that an alternate medication from the patient's prescription was having
a dominant effect, which resulted in a misclassiﬁcation. This occurred once per rater, and in each
instance the other two raters had correctly classiﬁed the proﬁles (Fig. 6B, ﬁg. S17, S18).
Figure 6. Smartwatch symptom proﬁles match clinician expectation and provide quantitative evidence for
cases with uncertainty. The clinician reviewed the smartwatch symptom profiles of 112 subjects in the longitudinal
patient study who underwent treatment changes; 8 with inconclusive symptom data were excluded, leaving 104
assessed. (A) Symptom changes matched the clinician's expectation of the
prescribed medication change in 94% of cases. Unexpected cases revealed plausible incidence of known side-effects
to medications. (B) Three blinded movement disorder specialists classiﬁed 10 sets of proﬁles as pre- or post-
treatment using only the patient's medication schedule and MDS-UPDRS tremor and dyskinesia ratings from the
intake visit. 87.5% of classiﬁcations were correct; three misclassiﬁcations occurred because raters presumed that an
alternate medication had a dominant effect. Six cases were deemed inconclusive and were excluded.
The MM4PD smartwatch system converts motion sensor data into resting tremor and
dyskinesia measurements. We show that MM4PD outputs correlate with clinician-rated symptom
severity in controlled and real-world environments, and observe low false positive rates in an
elderly, non-PD control cohort. The system also captures motor ﬂuctuations and medication
response, which were validated by comparing smartwatch symptom proﬁles to holistic,
retrospective clinical evaluation and through a blinded classiﬁcation task by three movement
disorder specialists. In the longitudinal patient study, smartwatch symptom proﬁles apprised the
clinician of subclinical symptoms missed in routine care and identiﬁed subjects who had not
adhered to their medications. These ﬁndings demonstrate that by aligning MM4PD with
standardized MDS-UPDRS assessments, the system can complement traditional examination
with interpretable, quantitative and longitudinal symptom data (ﬁg. S19, S20).
Several limitations of this study should be noted. First, assessing MM4PD accuracy is
constrained by the precision and availability of gold standard MDS-UPDRS ratings. This rating
scale requires an in-person clinician evaluation and is not designed for continuous symptom
measurement in everyday life. Furthermore, this reference scale is validated for its overall score
across many aspects of Parkinson's disease, not just its motor subcomponent. Discrepancies can
also occur as clinical raters may miss tremors at the edge of human perception that are easily
identiﬁed with an advanced sensor system (24). Second, the studies were subject to recruitment
bias. Notably, there were fewer than expected cases of severe Parkinson's disease, which may be
explained by reduced willingness in patients with advanced cases to participate in studies, or
better treatment plans and adherence in our cohort. Finally, the study was only conducted with
the Apple Watch; in the future, a more direct comparison of MM4PD with predicate systems, or
with systems employing multiple inertial sensor sites, may be warranted (72).
Our system addresses common barriers patients face in remote care settings. By
embedding the algorithms on a full-featured consumer device, users beneﬁt from discreet,
unobtrusive symptom monitoring without the stigma of a dedicated medical device or burden of
active tasks. Device adherence may increase due to interest in other features like activity tracking
and messaging. Logged workouts and device usage can help patients and clinicians contextualize
how their symptoms are affected by lifestyle factors like exercise and stress.
There are several opportunities to improve the clinical utility of the MM4PD system. The
current system focuses on measuring resting tremor and detecting dyskinesia. Future work could
incorporate additional outputs such as postural tremor, dyskinesia severity, and periods of
simultaneous tremor and dyskinesias. Assessment of other motor symptoms such as
bradykinesia, gait, and posture may also be needed to capture the complete patient picture.
Additionally, because the smartwatch is limited to a single observation point at the wrist,
MM4PD outputs may be less reliable in particular scenarios (e.g. tremor overestimates from a
loose band, false dyskinesia from daily piano practice). However, we observed that most of these
situations occur sporadically and do not unduly impact smartwatch symptom patterns when data
is averaged over multiple days. Symptoms that do not manifest on the limb on which the watch is
worn may also not be robustly detected, and the system is not designed to measure or assess non-
motor symptoms, which may have a noteworthy impact on patients' quality of life (76). Future
studies could determine how to safely and advantageously combine traditional assessments,
including those for non-motor symptoms, with smartwatch data (77). Clinicians could conﬁrm or
identify additional symptoms with in-person or virtual visits or potentially via an app that
enables self-reporting of both motor and nonmotor symptoms. Future algorithms could also
classify temporal patterns of symptoms such as peak-dose or diphasic dyskinesias to guide
clinicians to adjust medication dosing or consider an advanced therapy such as DBS or
continuous levodopa infusion.
Gaps remain before MM4PD can become a decision support tool in clinical workﬂows.
Workflow integration requires secure and compliant data flows from the MM4PD system to
providers; the system is designed to store data solely on-device until the user consents to third-party
access. However, other applications have successfully achieved secure integrations with the
Apple Watch and iPhone platforms (78, 79). Without reimbursement, smartwatches and
smartphones may not be broadly accessible to all patients. Integrating MM4PD into clinical
settings requires careful consideration to avoid further exacerbating socioeconomic disparities
among patients with Parkinson’s disease (80, 81). Finally, United States Food and Drug
Administration (FDA) clearance may be necessary to achieve widespread clinical utilization.
While the underlying hardware (Apple Watch) of MM4PD is not a medical device, examples
exist of both FDA-cleared software built upon the Apple Watch and FDA-cleared wearable
systems for Parkinson's disease (82-84).
In the future, this technology has the potential to serve a wide range of applications.
Clinicians could use smartwatch symptom proﬁles to improve treatment plans, motivate patients
to remain adherent, or quantify post-surgery improvements (85). Researchers could passively
assess disease progression or treatment efﬁcacy without symptom journals, which may burden
participants and confound results by increasing patients’ awareness of symptoms. Clinical
trialists could deploy the smartwatch system in real world environments to screen for
Parkinsonian tremor in asymptomatic cohorts, or serve as a companion diagnostic during drug
development (86). In summary, MM4PD enables continuous symptom monitoring through a
smartwatch, and can help clinicians and researchers incorporate out-of-clinic data for the
treatment and study of Parkinson’s disease.
Materials and Methods:
We conducted three distinct studies to design and validate the algorithms, namely: the pilot
study (n = 118 for algorithm design), the longitudinal patient study (n = 143 for design, n = 82
for validation), and the longitudinal control study (n = 171) (Fig. 2, table S1, fig. S1). All three
included the current clinical gold standard assessment of Parkinson’s disease motor symptoms —
MDS-UPDRS Part III scores — and continuous inertial sensor data from the Apple Watch.
Randomization was not applicable; all clinical ratings used to design and validate the algorithm
were obtained in a blinded manner (unexposed to algorithm development or outputs). We
obtained informed consent from subjects in all three studies in accordance with IRB-approved
protocols (pilot patient study: Protocol #201005, 201014, Midlands Independent Review Board;
longitudinal patient study: Protocol #100.1 Quorum Institutional Review Board; longitudinal
control study: Protocol #18-0424-781, Advarra Institutional Review Board). Individual proﬁle
data is only included from subjects who consented to publication of individual proﬁles.
With pilot study data, we designed the tremor algorithm using data from standardized
MDS-UPDRS Part III motor tasks (pronation-supination, rest, etc.) with expert clinical ratings.
Three movement disorder specialists provided expert ratings via video recordings time-aligned
with smartwatch sensor data. Remote video recordings have previously been shown to be a valid
method for MDS-UPDRS assessments (87). A subset of subjects also participated in a one-week,
out-of-clinic measurement period to capture typical free-living behavior in the pilot study. The
free-living period was book-ended by additional MDS-UPDRS motor assessments with video
reference; however, no clinical (MDS-UPDRS) ratings were performed on these assessments, either by video or on site.
Subjects self-reported updated medication information upon enrolling in the pilot study. In
addition to MDS-UPDRS assessments, subjects underwent a 30-minute instruction period to
familiarize them with their smartwatch (Apple Watch) functionalities. Speciﬁcally, subjects were
shown how to log workouts, check their activity “rings” and start deep breathing sessions
(Supplementary Methods, data ﬁle S1). We also conducted short surveys about the study
subjects’ experience with a smartwatch (table S7, ﬁg. S8).
The longitudinal patient study was conducted in the context of a movement disorder clinic
and combined standardized on-site assessment with free-living observation lasting up to 6
months. A movement disorder specialist administered the MDS-UPDRS Part III Motor
Assessment at enrollment. Subjects were instructed to wear the watch on the side most affected
by tremor or dyskinesia. The first 143 subjects were used for design, whereas the 82 subjects
who enrolled later formed a held-out validation data set for algorithm assessment. Tremor and
dyskinesia detection algorithms also used data from a 171-subject elderly cohort (the longitudinal
control study) of subjects aged 65 and older who did not self-report Parkinson's disease; their
sensor data were collected continuously for up to 12 months (fig. S1).
Three sub-studies were performed within the longitudinal patient study to validate
MM4PD outputs. First, a small pilot investigation (n = 36) was conducted within the longitudinal
patient study in which patients reviewed the smartwatch symptom proﬁles captured by the
tremor and dyskinesia algorithms alongside a clinician (ﬁg. S19). Second, a comprehensive
evaluation of MM4PD as a decision support tool was performed. The study clinician determined
whether intra-day patterns matched the patient’s medication schedules, and whether weekly
symptom burden changed in response to new treatment regimens (ﬁg. S20). The study clinician
initially performed the review without access to patient charts. After the analysis of the
anonymized review was complete, the clinician investigated unexpected symptoms from the
smartwatch symptom proﬁles either through review of the patient's charts, or by directly
discussing with the patient or the patient's primary movement disorder specialist as applicable.
The clinician did not evaluate some smartwatch symptom profiles due to insufficient data (i.e.,
subjects with fewer than five full days of data post-treatment or subjects with signals the watch
could not observe, such as facial dyskinesias). In the final sub-study, ten smartwatch symptom profiles were reviewed
by three blinded and independent movement disorder specialists. Each rater was given
standardized, written instructions on how to interpret smartwatch symptom proﬁles. Raters then
classified the profiles as “before” or “after” treatment for a known medication change. Evaluated
smartwatch symptom profile pairs were selected, through automated filtering rules, from
longitudinal patient study subjects who underwent treatment changes (fig. S17). Smartwatch symptom
profiles were randomly ordered before presenting the task to the raters (fig. S18). The only
additional information provided was the patient’s screening MDS-UPDRS scores for tremor and
dyskinesia (Supplementary Methods).
In the pilot and longitudinal patient studies, the primary inclusion criterion was a
diagnosis of Parkinson's disease made at least 6 months prior, and the primary exclusion criteria were
pregnancy and severe cognitive impairment. Recruitment of patients with PD was initially
structured to span a range of MDS-UPDRS in-clinic scores and was subsequently
broadened in the longitudinal patient study to enroll all who met the inclusion and exclusion criteria,
so as to maximize observations across a range of all-day behaviors and activity. Further details on all studies,
activity and lifestyle characterization, and the patient-clinician pilot within the longitudinal
patient study are provided in Supplementary Methods.
Finally, two control datasets were collected. First, a longitudinal dataset was collected from
elderly participants without Parkinson's disease; each completed a full patient history and
physical exam with a registered nurse at the study start. Second, an engineering controlled study
was run with young, healthy participants performing a range of all-day activities,
including playing musical instruments and driving. Raw accelerometer and gyroscope data from the
Apple Watch were collected, and the tremor and dyskinesia algorithms were simulated from this
raw sensor data to report false positive performance in both free-living and controlled settings.
Materials and Software
Accelerometer and gyroscope data were collected at a sample rate of 100 Hz using the
Apple Watch (Series 2 and above). Smartwatch workout type and duration and activity
information with classification (e.g., pedometer, driving classification) were also logged. Sample
code to create all-day tremor and dyskinesia profiles is available on the Apple developer portal
and the open-source ResearchKit GitHub repository (88). Results presented here have been
veriﬁed and reproduced to match outputs from the Movement Disorder API available on Apple
WatchOS 5 or later (89, 90).
Tremor displacement and detection
Estimates of resting tremor were made at 2.56-s intervals when periodic signals were
detected between approximately 3 and 7 Hz. This window size was selected to capture brief, intermittent tremors
with sufficient signal-to-noise ratio (SNR). For lower SNRs or periods when the subject was
moving (e.g., active tasks or gestures), the algorithm classified tremor as “unknown”. The
remaining regions where the subject was at rest were classiﬁed as “no tremor.” When tremor was
reported, a displacement was computed. Tremor SNR thresholds were determined empirically by
comparing true tremor signals to internal false positive datasets spanning >100 hours of
workouts, mechanical vibrations, and all-day data from healthy controls (data from the 171-subject
subset that consented to publication are shown herein as the longitudinal control study). Tremor
estimates were aggregated into minute-by-minute outputs that showed the percentage of time
tremor was unknown, not present, or detected at slight (1), mild (2), moderate (3), or severe (4)
displacements (fig. S3). Displacement categories were determined by evaluating 1-mm
increments and selecting a threshold that maximized algorithm-rater agreement in 72 tasks across
31 subjects rated to have tremor by at least 2 of 3 raters.
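As an illustration of the per-window logic described above, the following Python sketch (the published analyses used MATLAB) classifies a single 2.56-s, gravity-removed, single-axis accelerometer window. The band-power SNR test, the motion-rejection rule, and the `snr_thresh` and `motion_frac` values are hypothetical stand-ins for the empirically tuned thresholds.

```python
import numpy as np

FS = 100                 # smartwatch IMU sample rate (Hz)
WIN = 256                # 2.56-s analysis window
F_LO, F_HI = 3.0, 7.0    # resting-tremor frequency band (Hz)

def classify_window(accel, snr_thresh=3.0, motion_frac=0.5):
    """Classify one gravity-removed, single-axis accelerometer window (m/s^2).

    Returns (label, peak_to_peak_displacement_m). Thresholds are illustrative
    placeholders, not the published values.
    """
    assert len(accel) == WIN
    spec = np.fft.rfft(accel)
    freqs = np.fft.rfftfreq(WIN, d=1.0 / FS)
    power = np.abs(spec) ** 2
    band = (freqs >= F_LO) & (freqs <= F_HI)
    other = (freqs > 0.5) & ~band
    band_p, other_p = power[band].sum(), power[other].sum()
    # Voluntary movement spreads energy outside the tremor band -> "unknown".
    if other_p > motion_frac * (band_p + other_p) and other_p > 1e-6:
        return "unknown", None
    if band_p < snr_thresh * (other_p + 1e-12):
        return "no tremor", None
    # Keep only tremor-band content, then invert back to the time domain.
    a_band = np.fft.irfft(np.where(band, spec, 0), n=WIN)
    f_peak = freqs[band][np.argmax(power[band])]
    # For a near-sinusoid, displacement = acceleration / (2*pi*f)^2.
    disp_pp = (a_band.max() - a_band.min()) / (2 * np.pi * f_peak) ** 2
    return "tremor", disp_pp
```

For a pure 5.47-Hz sinusoid of 1 m/s² amplitude, this reports a tremor of roughly 1.7 mm peak-to-peak, following the a/(2πf)² relation for sinusoidal motion.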
Motion capture displacement measurements
A healthy, control subject simulated tremor movements with varying amplitudes while
wearing the Apple Watch in seated and standing positions. We attached six reﬂective markers to
the watch and tracked these markers using a motion capture system (Vicon) during the simulated
tremor movements (Fig. 3A). For each of nine tasks, the Vicon data were divided into 2.56-s
segments, and a principal component analysis was performed on each segment to
extract the primary direction of tremor motion. The mean peak-to-peak displacement from the Vicon
data was then calculated along the first principal component of each 2.56-s window.
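This step amounts to projecting each segment onto its first principal component. A minimal sketch, assuming the six markers have already been reduced to one representative (n_samples, 3) trajectory:

```python
import numpy as np

def pca_peak_to_peak(trajectory_xyz):
    """Peak-to-peak displacement along the dominant direction of one 2.56-s
    motion-capture segment. trajectory_xyz: (n_samples, 3) marker positions."""
    centered = trajectory_xyz - trajectory_xyz.mean(axis=0)
    # Right-singular vectors of the centered data are the principal axes.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    proj = centered @ vt[0]            # projection onto the first PC
    return float(proj.max() - proj.min())
```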
Comparison of tremor displacements to MDS-UPDRS ratings
For subjects with a periodic 3- to 7-Hz tremor signal, we compared smartwatch wrist
displacements to rater MDS-UPDRS scores for 253 cognitive distraction and eyes-open standing
tasks across 61 unique subjects from the pilot study (Fig. 3C, ﬁg. S5). The Spearman’s rank
correlation coefﬁcient was computed between the maximum tremor displacement detected
during the task and the mean of the rater MDS-UPDRS scores rounded to the nearest integer.
False negative rates were computed as the proportion of time the algorithm detected no tremor
during cognitive distraction tasks where the clinical raters reported tremor (table S3). We also
compared the mean daily tremor detection rate for all subjects from the longitudinal patient study
to an overall tremor rating that synthesized constancy and severity using the Spearman's rank
correlation coefﬁcient (Fig. 3D). We calculated the daily tremor percentage as the total detected
tremor time divided by the total time period the watch was worn. Watch wear time excluded
periods where the subject was likely asleep, or where the watch was not being worn as indicated
by a lack of device movement. This percentage was then averaged across all the days the subject
was in the study. Six subjects were excluded because they had insufﬁcient data for analysis.
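For illustration, the two quantities used here, the daily tremor percentage and Spearman's rank correlation, reduce to a few lines (a Python sketch; the published analysis used MATLAB, and the rank correlation below averages ranks across ties):

```python
import numpy as np

def daily_tremor_percent(tremor_minutes, worn_minutes):
    """Percent of watch wear time with detected tremor; wear time is assumed
    to already exclude sleep and off-wrist periods."""
    return 100.0 * tremor_minutes / worn_minutes

def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson correlation of tie-averaged ranks."""
    def ranks(a):
        a = np.asarray(a, float)
        r = np.empty(len(a))
        r[np.argsort(a)] = np.arange(1, len(a) + 1, dtype=float)
        for v in np.unique(a):          # average ranks across tied values
            m = a == v
            r[m] = r[m].mean()
        return r
    return float(np.corrcoef(ranks(x), ranks(y))[0, 1])
```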
Choreiform movement score and all-day dyskinesia analyses
We optimized the dyskinesia algorithm for subjects who already had a conﬁrmed
diagnosis of chorea visible at the wrist. Features were designed to capture the irregular, jerky
movements characteristic of chorea. We veriﬁed and selected features by comparing their
predictive power to clinician ratings of dyskinesias during cognitive distraction tasks using
subjects from the pilot study. We then trained the model with the selected features on all-day,
free-living data. For the chorea algorithm, the estimated choreiform movement score (CMS) was calculated four times within a
10.24-second window. We averaged overlapping windows across the task duration to get a mean
CMS for each wrist, and the greater of the two scores from each wrist was compared to expert
ratings. In all-day analyses, averaged CMS values were computed over one-minute intervals.
Scores over 0.16 were reported as chorea during that minute; this threshold was selected to
minimize false positives during all-day analyses. We performed a leave-one-out cross-validation
and hold-out validation on free-living data from subjects in the pilot study and longitudinal
patient study with >24 h of data. A full breakdown of subjects used per analysis and their
associated groups is shown in table S4 and described in the Supplementary Methods.
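A per-minute aggregation consistent with this description might look like the sketch below; the 0.16 threshold comes from the text, while the ~23 estimates per minute (60 s / 2.56 s) and the flat input format are assumptions.

```python
import numpy as np

CMS_THRESH = 0.16      # all-day chorea threshold from the text
EST_PER_MIN = 23       # ~60 s / 2.56 s estimates per minute (assumed)

def minutes_with_chorea(cms_stream):
    """Collapse a flat stream of per-2.56-s CMS estimates into per-minute
    chorea flags: mean CMS over the minute above threshold. A trailing
    partial minute is dropped."""
    n = len(cms_stream) // EST_PER_MIN
    mins = np.asarray(cms_stream[: n * EST_PER_MIN], float).reshape(n, EST_PER_MIN)
    return mins.mean(axis=1) > CMS_THRESH
```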
Movement disorder symptom proﬁles
We generated all-day symptom profiles from 2.56-s tremor and dyskinesia outputs
aggregated into 15-minute periods, capturing symptom response to medications while ensuring
sufficient measurements to prevent outliers from masking the primary signal. Dyskinesia was
reported as the percent of time chorea was detected in 15 minutes; tremor was reported as the
percent of time samples were categorized into displacement groups of <0.1 cm, 0.1-0.6 cm,
0.6-2.2 cm, or >2.2 cm. We removed stationary periods of ﬁve minutes or more, where inertial
accelerations were near zero. Fifteen-minute periods comprising a 24-hour day were organized
according to the local time of the subject. Windows with less than 50% of available data were
discarded. Tremor percentages were averaged over all days with data, and chorea was displayed
as the median (to reduce the effect of occasional large outliers). This created symptom proﬁles
over a set period (e.g., a single week, or for the duration of a given medication schedule). Only
windows with data from at least 5 days or more than 20% of the total period were shown.
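The binning and coverage rules above can be sketched as follows, assuming one displacement value per 2.56-s sample with NaN marking missing or unknown samples:

```python
import numpy as np

DISP_EDGES = (0.1, 0.6, 2.2)   # cm edges: <0.1, 0.1-0.6, 0.6-2.2, >2.2

def tremor_profile_window(disp_cm, min_coverage=0.5):
    """Summarize one 15-minute window of per-2.56-s tremor displacements (cm).

    NaN entries mark missing/unknown samples. Returns percent of valid time in
    each displacement bin, or None when less than half the window has data."""
    disp = np.asarray(disp_cm, float)
    valid = ~np.isnan(disp)
    if valid.mean() < min_coverage:
        return None                 # discard sparse windows
    bins = np.digitize(disp[valid], DISP_EDGES)
    return [100.0 * float(np.mean(bins == k)) for k in range(4)]
```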
Symptom changes between medication schedules
Symptom changes were visualized over time by calculating the mean amount of tremor
and dyskinesia per day, and the mean tremor displacement with a three-day moving average. To
determine whether symptoms differed statistically between medication schedules, a
Wilcoxon signed rank test was performed to determine whether paired windows from distinct
time periods showed significant differences (e.g., between periods with two different medication
schedules, as for subject S004; Fig. 5). Paired windows were matched on both time of day
and day of the week. The average difference across paired windows was computed to determine
the net change between the periods.
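For illustration, a self-contained two-sided Wilcoxon signed-rank test with the usual normal approximation (the published analysis used MATLAB's Statistics Toolbox) over paired, matched windows might look like:

```python
import math
import numpy as np

def wilcoxon_signed_rank(before, after):
    """Two-sided Wilcoxon signed-rank test via the normal approximation.

    before/after: paired symptom measurements from windows matched on time of
    day and day of the week. Returns (W+, p-value). For brevity, sigma omits
    the tie correction."""
    d = np.asarray(after, float) - np.asarray(before, float)
    d = d[d != 0]                                    # drop zero differences
    n = len(d)
    ranks = np.argsort(np.argsort(np.abs(d))).astype(float) + 1.0
    for v in np.unique(np.abs(d)):                   # average ranks across ties
        m = np.abs(d) == v
        ranks[m] = ranks[m].mean()
    w_pos = ranks[d > 0].sum()
    mu = n * (n + 1) / 4.0
    sigma = math.sqrt(n * (n + 1) * (2 * n + 1) / 24.0)
    z = (w_pos - mu) / sigma
    return w_pos, math.erfc(abs(z) / math.sqrt(2))   # two-sided p-value
```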
Figures and statistical analysis were generated using MATLAB and Statistics Toolbox
(Release 2017b, The MathWorks, Inc., Natick, Massachusetts, United States). Box plots are
shown with a central mark at the median, bottom and top edges of the boxes at 25th and 75th
percentiles respectively and whiskers out to the most extreme points within 1.5 times the
interquartile range. Mean difference and limit of agreement (1.96σ) calculations to compare
smartwatch and motion capture displacements were calculated with a custom MATLAB
implementation of a Bland-Altman visualization. Spearman’s rank correlation coefﬁcients were
used to compare tremor displacements and all-day tremor detection to clinical ratings.
Hypothesis testing was performed using Wilcoxon rank sum tests to compare differences in
CMS distributions, dyskinesia detection rates, and tremor rates for different subject groups.
Comparisons in differences in symptom rates between medication schedules were performed
with a Wilcoxon signed rank test. Statistical tests on data used for design and validation are
summarized in table S8. Outliers were displayed for all ﬁgures unless otherwise stated.
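The Bland-Altman quantities named here, the mean difference (bias) and 1.96σ limits of agreement, reduce to a short sketch:

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Mean difference (bias) and 1.96*sigma limits of agreement between two
    measurement methods, e.g. smartwatch vs. motion-capture displacement."""
    diff = np.asarray(method_a, float) - np.asarray(method_b, float)
    bias = float(diff.mean())
    half_width = 1.96 * float(diff.std(ddof=1))      # 1.96 * sample SD
    return bias, (bias - half_width, bias + half_width)
```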
Supplementary Materials List
Materials and Methods
Fig. S1. Overview of data collected for MM4PD development and validation
Fig. S2. MDS-UPDRS Part 2 and Part 4 scores
Fig. S3. Block diagram of tremor and dyskinesia algorithms in MM4PD
Fig. S4. Accelerometer data and displacement estimates for cognitive distraction task from pilot study subjects
Fig. S5. Data from Fig. 3C, separated by phase (seated and standing)
Fig. S6. Inter-rater concordance compared to algorithm displacement thresholds
Fig. S7. Aggregated all-day tremor symptom proﬁles for sample subjects grouped by in-clinic MDS-UPDRS scores
Fig. S8. Longitudinal patient study, subjects provided feedback of their smartwatch experiences; responses were
collected via an optional exit survey
Fig. S9. Workouts logged by subjects in the longitudinal patient study.
Fig. S10. Smartwatch symptom proﬁles show changes in response to DBS surgery and programming
Fig. S11. Increased medication adherence improves consistency in the motor ﬂuctuation proﬁle over averaged
periods of time
Fig. S12. Subject S004’s tremor amounts were signiﬁcantly reduced when additional carbidopa/levodopa controlled
release doses were added
Fig. S13. Example cases of unexpected symptoms as identiﬁed by longitudinal patient study clinician
Fig. S14. Tremor and dyskinesia rates as measured by MM4PD matched clinician expectations in the longitudinal patient study
Fig. S15. Subject with clinically undetected tremor still showed intra-day patterns indicative of medication response
Fig. S16. Examples of diphasic dyskinesia
Fig. S17. Selection criteria and results of matching task from 3 independent raters with 15 medication schedules
Fig. S18. Matching task example where all 3 raters were consistent in rationale and correct
Fig. S19. Subject smartwatch symptom proﬁle review mediated with movement disorder specialist
Fig. S20. Movement disorder specialist smartwatch symptom proﬁle review
Table S1. Study demographics
Table S2. False positive rates
Table S3. Tremor detection rates in a cognitive distraction task compared to ratings from 3 movement disorder specialists
Table S4. Categorization of live-on data by subject
Table S5. Summary of medications across all 225 patients in the longitudinal patient study
Table S6. Details behind cases where clinician expectation differed from smartwatch symptom proﬁles
Table S7. Longitudinal patient study exit survey
Table S8. Statistical test summary
Movie S1. Example of detected moderate-strong tremor
Movie S2. Example of detected distal tremor
Movie S3. Example of detected strong tremor
Movie S4. Example of detected moderate dyskinesia
Movie S5. Example of detected mild dyskinesia
Movie S6. Example of very subtle dyskinesia below detection threshold
Movie S7. Example of mild dyskinesia below detection threshold
Data ﬁle S1. Longitudinal patient study processed data
References
1. E. J. Topol, Transforming medicine via digital innovation. Sci Transl Med 2, 16cm14 (2010); published online EpubJan
2. M. S. Patel, D. A. Asch, K. G. Volpp, Wearable devices as facilitators, not drivers, of health behavior change. JAMA
313, 459-460 (2015); published online EpubFeb 3 (10.1001/jama.2014.14781).
3. S. R. Steinhubl, E. D. Muse, E. J. Topol, The emerging ﬁeld of mobile health. Sci Transl Med 7, 283rv283 (2015);
published online EpubApr 15 (10.1126/scitranslmed.aaa3487).
4. L. Piwek, D. A. Ellis, S. Andrews, A. Joinson, The Rise of Consumer Health Wearables: Promises and Barriers. PLoS
Med 13, e1001953 (2016); published online EpubFeb (10.1371/journal.pmed.1001953).
5. D. R. Bassett, Jr., L. P. Toth, S. R. LaMunion, S. E. Crouter, Step Counting: A Review of Measurement Considerations
and Health-Related Applications. Sports Med 47, 1303-1315 (2017); published online EpubJul (10.1007/
6. S. S. Gambhir, T. J. Ge, O. Vermesh, R. Spitler, Toward achieving precision health. Sci Transl Med 10, (2018);
published online EpubFeb 28 (10.1126/scitranslmed.aao3612).
7. R. Rawassizadeh, B. A. Price, M. Petre, Wearables: has the age of smartwatches ﬁnally arrived? Commun. ACM 58,
8. B. Reeder, A. David, Health at hand: A systematic review of smart watch uses for health and wellness. Journal of
biomedical informatics 63, 269-276 (2016)https://dx.doi.org/10.1016/j.jbi.2016.09.001).
9. K. I. Taylor, H. Staunton, F. Lipsmeier, D. Nobbs, M. Lindemann, Outcome measures based on digital health
technology sensor data: data- and patient-centric approaches. NPJ Digit Med 3, 97 (2020)10.1038/s41746-020-0305-8).
10. S. Patel, K. Lorincz, R. Hughes, N. Huggins, J. Growdon, D. Standaert, M. Akay, J. Dy, M. Welsh, P. Bonato,
Monitoring motor ﬂuctuations in patients with Parkinson's disease using wearable sensors. IEEE transactions on
information technology in biomedicine : a publication of the IEEE Engineering in Medicine and Biology Society 13,
11. R. Araujo, M. Tabuas-Pereira, L. Almendra, J. Ribeiro, M. Arenga, L. Negrao, A. Matos, A. Morgadinho, C. Januario,
Tremor Frequency Assessment by iPhone R Applications: Correlation with EMG Analysis. Journal of Parkinson's
disease 6, 717-721 (2016).
12. S. Cohen, L. R. Bataille, A. K. Martig, Enabling breakthroughs in Parkinson's disease with wearable technologies and
big data analytics. Mhealth 2, 20 (2016)10.21037/mhealth.2016.04.02).
13. L. Li, Q. Yu, B. Xu, Q. Bai, Y. Zhang, H. Zhang, C. Mao, C. Liu, T. Shen, S. Wang, in 2017 8th International IEEE/
EMBS Conference on Neural Engineering (NER). (2017), pp. 70-73.
14. A. L. Silva de Lima, T. Hahn, L. J. W. Evers, N. M. de Vries, E. Cohen, M. Afek, L. Bataille, M. Daeschler, K. Claes,
B. Boroojerdi, D. Terricabras, M. A. Little, H. Baldus, B. R. Bloem, M. J. Faber, Feasibility of large-scale deployment
of multiple wearable sensors in Parkinson's disease. PloS one 12, e0189161 (2017)https://dx.doi.org/10.1371/
15. A. Antonini, G. Gentile, M. Giglio, A. Marcante, H. Gage, M. M. L. Touray, D. I. Fotiadis, D. Gatsios, S. Konitsiotis,
L. Timotijevic, B. Egan, C. Hodgkins, R. Biundo, C. Pellicano, P. D. M. consortium, Acceptability to patients, carers
and clinicians of an mHealth platform for the management of Parkinson's disease (PD_Manager): study protocol for a
pilot randomised controlled trial. Trials 19, 492 (2018); published online EpubSep 14 (10.1186/s13063-018-2767-4).
16. A. Zhan, S. Mohan, C. Tarolli, R. B. Schneider, J. L. Adams, S. Sharma, M. J. Elson, K. L. Spear, A. M. Glidden, M. A.
Little, A. Terzis, E. R. Dorsey, S. Saria, Using Smartphones and Machine Learning to Quantify Parkinson Disease
Severity: The Mobile Parkinson Disease Score. JAMA neurology 75, 876-880 (2018)https://dx.doi.org/10.1001/
17. G. Albani, C. Ferraris, R. Nerino, A. Chimienti, G. Pettiti, F. Parisi, G. Ferrari, N. Cau, V. Cimolin, C. Azzaro, L.
Priano, A. Mauro, An Integrated Multi-Sensor Approach for the Remote Monitoring of Parkinson's Disease. Sensors
(Basel) 19, (2019); published online EpubNov 2 (10.3390/s19214764).
18. M. D. Hssayeni, J. Jimenez-Shahed, M. A. Burack, B. Ghoraani, Wearable Sensors for Estimation of Parkinsonian
Tremor Severity during Free Body Movements. Sensors (Basel) 19, (2019); published online EpubSep 28 (10.3390/
19. C. Morgan, M. Rolinski, R. McNaney, B. Jones, L. Rochester, W. Maetzler, I. Craddock, A. L. Whone, Systematic
Review Looking at the Use of Technology to Measure Free-Living Symptom and Activity Outcomes in Parkinson's
Disease in the Home or a Home-like Environment. J Parkinsons Dis 10, 429-454 (2020)10.3233/JPD-191781).
20. S. Rahman, H. J. Grifﬁn, N. P. Quinn, M. Jahanshahi, Quality of life in Parkinson's disease: the relative importance of
the symptoms. Mov Disord 23, 1428-1434 (2008); published online EpubJul 30 (10.1002/mds.21667).
21. T. Asakawa, K. Sugiyama, T. Nozaki, T. Sameshima, S. Kobayashi, L. Wang, Z. Hong, S. Chen, C. Li, H. Namba, Can
the Latest Computerized Technologies Revolutionize Conventional Assessment Tools and Therapies for a Neurological
Disease? The Example of Parkinson's Disease. Neurol Med Chir (Tokyo) 59, 69-78 (2019); published online EpubMar
22. G. Deuschl, P. Bain, M. Brin, Consensus statement of the Movement Disorder Society on Tremor. Ad Hoc Scientiﬁc
Committee. Mov Disord 13 Suppl 3, 2-23 (1998)10.1002/mds.870131303).
23. The Uniﬁed Parkinson's Disease Rating Scale (UPDRS): status and recommendations. Mov Disord 18, 738-750 (2003);
published online EpubJul (10.1002/mds.10473).
24. C. G. Goetz, B. C. Tilley, S. R. Shaftman, G. T. Stebbins, S. Fahn, P. Martinez-Martin, W. Poewe, C. Sampaio, M. B.
Stern, R. Dodel, B. Dubois, R. Holloway, J. Jankovic, J. Kulisevsky, A. E. Lang, A. Lees, S. Leurgans, P. A. LeWitt, D.
Nyenhuis, C. W. Olanow, O. Rascol, A. Schrag, J. A. Teresi, J. J. van Hilten, N. LaPelle, Movement Disorder Society-
sponsored revision of the Uniﬁed Parkinson's Disease Rating Scale (MDS-UPDRS): scale presentation and clinimetric
testing results. Mov Disord 23, 2129-2170 (2008); published online EpubNov 15 (10.1002/mds.22340).
25. C. G. Goetz, [Movement Disorder Society-Uniﬁed Parkinson's Disease Rating Scale (MDS-UPDRS): a new scale for
the evaluation of Parkinson's disease]. Movement Disorder Society-Uniﬁed Parkinson's Disease Rating Scale (MDS-
UPDRS) : une nouvelle echelle pour l'evaluation de la maladie de Parkinson. 166, 1-4 (2010)https://dx.doi.org/
26. G. K. Montgomery, N. C. Reynolds, Jr., Compliance, reliability, and validity of self-monitoring for physical
disturbances of Parkinson's disease. The Parkinson's Symptom Diary. The Journal of nervous and mental disease 178,
27. R. A. Hauser, J. Friedlander, T. A. Zesiewicz, C. H. Adler, L. C. Seeberger, C. F. O'Brien, E. S. Molho, S. A. Factor, A
home diary to assess functional status in patients with Parkinson's disease with motor ﬂuctuations and dyskinesia. Clin
Neuropharmacol 23, 75-81 (2000); published online EpubMar-Apr (10.1097/00002826-200003000-00003).
28. S. S. Papapetropoulos, Patient diaries as a clinical endpoint in Parkinson's disease clinical trials. CNS Neurosci Ther 18,
380-387 (2012); published online EpubMay (10.1111/j.1755-5949.2011.00253.x).
29. E. R. Dorsey, S. Papapetropoulos, M. Xiong, K. Kieburtz, The First Frontier: Digital Biomarkers for
Neurodegenerative Disorders. Digit Biomark 1, 6-13 (2017); published online EpubSep-Dec (10.1159/000477383).
30. M. K. Erb, D. R. Karlin, B. K. Ho, K. C. Thomas, F. Parisi, G. P. Vergara-Diaz, J. F. Daneault, P. W. Wacnik, H. Zhang,
T. Kangarloo, C. Demanuele, C. R. Brooks, C. N. Detheridge, N. Shaaﬁ Kabiri, J. S. Bhangu, P. Bonato, mHealth and
wearable technology should replace motor diaries to track motor ﬂuctuations in Parkinson's disease. NPJ Digit Med 3,
31. S. Pietracupa, A. Fasano, G. Fabbrini, M. Sarchioto, M. Bloise, A. Latorre, M. Altieri, M. Bologna, A. Berardelli, Poor
self-awareness of levodopa-induced dyskinesias in Parkinson's disease: clinical features and mechanisms.
Parkinsonism Relat Disord 19, 1004-1008 (2013); published online EpubNov (10.1016/j.parkreldis.2013.07.002).
32. J. G. Nutt, W. R. Woodward, J. P. Hammerstad, J. H. Carter, J. L. Anderson, The "on-off" phenomenon in Parkinson's
disease. Relation to levodopa absorption and transport. N Engl J Med 310, 483-488 (1984); published online EpubFeb
33. M. Contin, R. Riva, P. Martinelli, F. Albani, A. Baruzzi, Effect of meal timing on the kinetic-dynamic proﬁle of
levodopa/carbidopa controlled release [corrected] in parkinsonian patients. Eur J Clin Pharmacol 54, 303-308 (1998);
published online EpubJun (10.1007/s002280050464).
34. A. P. Denny, M. Behari, Motor ﬂuctuations in Parkinson's disease. Journal of the neurological sciences 165, 18-23
35. D. J. Brooks, Optimizing levodopa therapy for Parkinson's disease with levodopa/carbidopa/entacapone: implications
from a clinical and patient perspective. Neuropsychiatr Dis Treat 4, 39-47 (2008); published online EpubFeb (10.2147/
36. A. Haahr, M. Kirkevold, E. O. Hall, K. Ostergaard, Living with advanced Parkinson's disease: a constant struggle with
unpredictability. J Adv Nurs 67, 408-417 (2011); published online EpubFeb (10.1111/j.1365-2648.2010.05459.x).
37. D. J. Daley, P. K. Myint, R. J. Gray, K. H. Deane, Systematic review on factors associated with medication non-
adherence in Parkinson's disease. Parkinsonism Relat Disord 18, 1053-1061 (2012); published online EpubDec
38. J. E. Thorp, P. G. Adamczyk, H. L. Ploeg, K. A. Pickett, Monitoring Motor Symptoms During Activities of Daily
Living in Individuals With Parkinson's Disease. Front Neurol 9, 1036 (2018)10.3389/fneur.2018.01036).
39. A. Beuter, R. Edwards, Using frequency domain characteristics to discriminate physiologic and parkinsonian tremors.
Journal of clinical neurophysiology : ofﬁcial publication of the American Electroencephalographic Society 16, 484-494
40. R. Lemoyne, T. Mastroianni, M. Cozza, C. Coroian, W. Grundfest, Implementation of an iPhone for characterizing
Parkinson's disease tremor through a wireless accelerometer application. Conference proceedings : ... Annual
International Conference of the IEEE Engineering in Medicine and Biology Society. IEEE Engineering in Medicine
and Biology Society. Annual Conference 2010, 4954-4958 (2010)https://dx.doi.org/10.1109/IEMBS.2010.5627240).
41. B. T. Cole, S. H. Roy, C. J. De Luca, S. H. Nawab, Dynamical learning and tracking of tremor and dyskinesia from
wearable sensors. IEEE transactions on neural systems and rehabilitation engineering : a publication of the IEEE
Engineering in Medicine and Biology Society 22, 982-991 (2014)https://dx.doi.org/10.1109/TNSRE.2014.2310904).
42. H. Dai, P. Zhang, T. C. Lueth, Quantitative Assessment of Parkinsonian Tremor Based on an Inertial Measurement
Unit. Sensors (Basel, Switzerland) 15, 25055-25071 (2015)https://dx.doi.org/10.3390/s151025055).
43. R. LeMoyne, T. Mastroianni, Use of smartphones and portable media devices for quantifying human movement
characteristics of gait, tendon reﬂex response, and Parkinson's disease hand tremor. Methods in molecular biology
(Clifton, N.J.) 1256, 335-358 (2015)https://dx.doi.org/10.1007/978-1-4939-2172-0_23).
44. O. Bazgir, J. Frounchi, S. A. H. Habibi, L. Palma, P. Pierleoni, in 2015 22nd Iranian Conference on Biomedical
Engineering (ICBME). (2015), pp. 1-5.
45. H. Zach, M. Dirkx, B. R. Bloem, R. C. Helmich, The Clinical Evaluation of Parkinson's Tremor. J Parkinsons Dis 5,
46. H. Jeon, W. Lee, H. Park, H. J. Lee, S. K. Kim, H. B. Kim, B. Jeon, K. S. Park, High-accuracy automatic classiﬁcation
of Parkinsonian tremor severity using machine learning method. Physiological measurement 38, 1980-1999
47. J. Prince, M. de Vos, A Deep Learning Framework for the Remote Detection of Parkinson'S Disease Using Smart-
Phone Sensor Data. Conf Proc IEEE Eng Med Biol Soc 2018, 3144-3147 (2018); published online EpubJul (10.1109/
48. R. A. Ramdhani, A. Khojandi, O. Shylo, B. H. Kopell, Optimizing Clinical Assessments in Parkinson's Disease
Through the Use of Wearable Sensors and Data Driven Modeling. Front Comput Neurosci 12, 72 (2018)10.3389/
49. F. Luft, S. Shariﬁ, W. Mugge, A. C. Schouten, L. J. Bour, A. F. van Rootselaar, P. H. Veltink, T. Heida, A Power
Spectral Density-Based Method to Detect Tremor and Tremor Intermittency in Movement Disorders. Sensors (Basel)
19, (2019); published online EpubOct 4 (10.3390/s19194301).
50. A. Schrag, N. Quinn, Dyskinesias and motor ﬂuctuations in Parkinson's disease. A community-based study. Brain 123 (
Pt 11), 2297-2305 (2000); published online EpubNov (10.1093/brain/123.11.2297).
51. J. Jankovic, Parkinson's disease: clinical features and diagnosis. J Neurol Neurosurg Psychiatry 79, 368-376 (2008);
published online EpubApr (10.1136/jnnp.2007.131045).
52. R. I. Grifﬁths, K. Kotschet, S. Arfon, Z. M. Xu, W. Johnson, J. Drago, A. Evans, P. Kempster, S. Raghav, M. K. Horne,
Automated assessment of bradykinesia and dyskinesia in Parkinson's disease. J Parkinsons Dis 2, 47-55
53. S. Arora, V. Venkataraman, S. Donohue, K. M. Biglan, E. R. Dorsey, M. A. Little, in 2014 IEEE International
Conference on Acoustics, Speech and Signal Processing (ICASSP). (2014), pp. 3641-3644.
54. S. Arora, V. Venkataraman, A. Zhan, S. Donohue, K. M. Biglan, E. R. Dorsey, M. A. Little, Detecting and monitoring
the symptoms of Parkinson's disease using smartphones: A pilot study. Parkinsonism & related disorders 21, 650-653
55. M. Braybrook, S. O'Connor, P. Churchward, T. Perera, P. Farzanehfar, M. Horne, An Ambulatory Tremor Score for
Parkinson's Disease. J Parkinsons Dis 6, 723-731 (2016); published online EpubOct 19 (10.3233/JPD-160898).
56. E. Goubault, H. P. Nguyen, S. Bogard, P. J. Blanchet, E. Bezard, C. Vincent, M. Langlois, C. Duval, Cardinal Motor
Features of Parkinson's Disease Coexist with Peak-Dose Choreic-Type Drug-Induced Dyskinesia. J Parkinsons Dis 8,
57. B. M. Bot, C. Suver, E. C. Neto, M. Kellen, A. Klein, C. Bare, M. Doerr, A. Pratap, J. Wilbanks, E. R. Dorsey, S. H.
Friend, A. D. Trister, The mPower study, Parkinson disease mobile data collected using ResearchKit. Scientiﬁc data 3,
58. R. Lakshminarayana, D. Wang, D. Burn, K. R. Chaudhuri, C. Galtrey, N. V. Guzman, B. Hellman, J. Ben, S. Pal, J.
Stamford, M. Steiger, R. W. Stott, J. Teo, R. A. Barker, E. Wang, B. R. Bloem, M. van der Eijk, L. Rochester, A.
Williams, Using a smartphone-based self-management platform to support medication adherence and clinical
consultation in Parkinson's disease. NPJ Parkinsons Dis 3, 2 (2017)10.1038/s41531-016-0003-z).
59. S. Gernon, A. Fowler, K. Lyons, R. Pahwa, Clinical Experience with Personal KinetiGraph Before and After Deep
Brain Stimulation for Parkinson’s Disease (P3.045). Neurology 90, P3.045 (2018).
60. R. Pahwa, S. H. Isaacson, D. Torres-Russotto, F. B. Nahab, P. M. Lynch, K. E. Kotschet, Role of the Personal
KinetiGraph in the routine clinical assessment of Parkinson's disease: recommendations from an expert panel. Expert
Rev Neurother 18, 669-680 (2018); published online EpubAug (10.1080/14737175.2018.1503948).
61. S. H. Isaacson, B. Boroojerdi, O. Waln, M. McGraw, D. L. Kreitzman, K. Klos, F. J. Revilla, D. Heldman, M. Phillips,
D. Terricabras, M. Markowitz, F. Woltering, S. Carson, D. Truong, Effect of using a wearable device on clinical
decision-making and motor symptoms in patients with Parkinson's disease starting transdermal rotigotine patch: A pilot
study. Parkinsonism Relat Disord 64, 132-137 (2019); published online EpubJul (10.1016/j.parkreldis.2019.01.025).
62. H. Khodakarami, P. Farzanehfar, M. Horne, The Use of Data from the Parkinson's KinetiGraph to Identify Potential
Candidates for Device Assisted Therapies. Sensors (Basel) 19, (2019); published online EpubMay 15 (10.3390/
63. M. Linares-Del Rey, L. Vela-Desojo, R. Cano-de la Cuerda, Mobile phone applications in Parkinson's disease: A
systematic review. Neurologia 34, 38-54 (2019); published online EpubJan - Feb (10.1016/j.nrl.2017.03.006).
64. A. Santiago, J. W. Langston, R. Gandhy, R. Dhall, S. Brillman, L. Rees, C. Barlow, Qualitative Evaluation of the
Personal KinetiGraphTM Movement Recording System in a Parkinson's Clinic. J Parkinsons Dis 9, 207-219
65. E. E. Tan, E. J. Hogg, M. Tagliati, The role of Personal KinetiGraph™ fluctuator score in quantifying the progression
of motor fluctuations in Parkinson's disease. Functional neurology 34, 21-28 (2019).
66. S. P. Khor, A. Hsu, The pharmacokinetics and pharmacodynamics of levodopa in the treatment of Parkinson's disease.
Curr Clin Pharmacol 2, 234-243 (2007); published online EpubSep (10.2174/157488407781668802).
67. M. Barichella, E. Cereda, E. Cassani, G. Pinelli, L. Iorio, V. Ferri, G. Privitera, M. Pasqua, A. Valentino, F. Monajemi,
S. Caronni, C. Lignola, C. Pusani, C. Bolliri, S. A. Faierman, A. Lubisco, G. Frazzitta, M. L. Petroni, G. Pezzoli,
Dietary habits and neurological features of Parkinson's disease patients: Implications for practice. Clinical nutrition
(Edinburgh, Scotland) 36, 1054-1061 (2017); 10.1016/j.clnu.2016.06.020.
68. S. Fahn, The spectrum of Levodopa-induced dyskinesias. Annals of neurology 47, S2-9; discussion S9 (2000);
published online Epub05/01 (10.1002/1531-8249(200001)47:1<2::AID-ANA2>3.0.CO;2-B).
69. G. Fabbrini, J. M. Brotchie, F. Grandas, M. Nomoto, C. G. Goetz, Levodopa-induced dyskinesias. Mov Disord 22,
1379-1389; quiz 1523 (2007); published online EpubJul 30 (10.1002/mds.21475).
70. A. Latorre, M. C. Bloise, C. Colosimo, F. Di Biasio, G. Defazio, A. Berardelli, G. Fabbrini, Dyskinesias and motor
symptoms onset in Parkinson disease. Parkinsonism & related disorders 20, 1427-1429 (2014)https://dx.doi.org/
71. L. Verhagen Metman, A. J. Espay, Teaching Video NeuroImages: The underrecognized diphasic dyskinesia of
Parkinson disease. Neurology 89, e83-e84 (2017); published online EpubAug 15 (10.1212/WNL.0000000000004238).
72. B. Boroojerdi, R. Ghaffari, N. Mahadevan, M. Markowitz, K. Melton, B. Morey, C. Otoul, S. Patel, J. Phillips, E. Sen-
Gupta, O. Stumpp, D. Tatla, D. Terricabras, K. Claes, J. A. Wright, Jr., N. Sheth, Clinical feasibility of a wearable,
conformable sensor patch to monitor motor symptoms in Parkinson's disease. Parkinsonism Relat Disord 61, 70-76
(2019); published online EpubApr (10.1016/j.parkreldis.2018.11.024).
73. J. Cancela, M. Pastorino, M. T. Arredondo, K. S. Nikita, F. Villagra, M. A. Pastor, Feasibility study of a wearable
system based on a wireless body area network for gait assessment in Parkinson's disease patients. Sensors (Basel,
Switzerland) 14, 4618-4633 (2014); 10.3390/s140304618.
74. S. Del Din, M. Elshehabi, B. Galna, M. A. Hobert, E. Warmerdam, U. Suenkel, K. Brockmann, F. Metzger, C. Hansen,
D. Berg, L. Rochester, W. Maetzler, Gait analysis with wearables predicts conversion to Parkinson disease. Ann Neurol
86, 357-367 (2019); published online EpubSep (10.1002/ana.25548).
75. G. Yahalom, Z. Yekutieli, S. Israeli-Korn, S. Elincx-Benizri, V. Livneh, T. Fay-Karmon, K. Tchelet, Y. Rubel, S.
Hassin-Baer, Smartphone Based Timed Up and Go Test Can Identify Postural Instability in Parkinson's Disease. The
Israel Medical Association journal : IMAJ 22, 37-42 (2020).
76. P. Barone, R. Erro, M. Picillo, Quality of Life and Nonmotor Symptoms in Parkinson's Disease. Int Rev Neurobiol 133,
77. A. J. Espay, J. M. Hausdorff, A. Sanchez-Ferro, J. Klucken, A. Merola, P. Bonato, S. S. Paul, F. B. Horak, J. A.
Vizcarra, T. A. Mestre, R. Reilmann, A. Nieuwboer, E. R. Dorsey, L. Rochester, B. R. Bloem, W. Maetzler, on behalf of
the Movement Disorder Society Task Force on Technology, A roadmap for implementation of patient-centered digital outcome
measures in Parkinson's disease obtained using mobile health technologies. Mov Disord 34, 657-663 (2019); published
online EpubMay (10.1002/mds.27671).
78. N. Genes, S. Violante, C. Cetrangol, L. Rogers, E. E. Schadt, Y. Y. Chan, From smartphone to EHR: a case report on
integrating patient-generated health data. NPJ Digit Med 1, 23 (2018); 10.1038/s41746-018-0030-8.
79. A. Henriksen, G. Hartvigsen, S. Grimsgaard, Using Cloud-Based Physical Activity Data from Google Fit and
Apple HealthKit to Expand Recording of Physical Activity Data in a Population Study. Stud Health Technol Inform 245,
80. J. P. Hemming, A. L. Gruber-Baldini, K. E. Anderson, P. S. Fishman, S. G. Reich, W. J. Weiner, L. M. Shulman, Racial
and socioeconomic disparities in parkinsonism. Arch Neurol 68, 498-503 (2011); published online EpubApr (10.1001/
81. A. Saadi, D. U. Himmelstein, S. Woolhandler, N. I. Mejia, Racial disparities in neurologic health care access and
utilization in the United States. Neurology 88, 2268-2275 (2017); published online EpubJun 13 (10.1212/
82. M. Schroeder, "K161717: Personal KinetiGraph (PKG) System Model GKC-2000," (US FDA, Dept Health and Human Services).
83. D.-B. Tillman, "DEN180044: ECG App," (US FDA, US FDA CDRH, 2018).
84. D.-B. Tillman, "DEN180042: Irregular Rhythm Notification Feature," (US FDA, US FDA CDRH, 2018).
85. I. Thomas, M. Alam, F. Bergquist, D. Johansson, M. Memedi, D. Nyholm, J. Westin, Sensor-based algorithmic dosing
suggestions for oral administration of levodopa/carbidopa microtablets for Parkinson's disease: a first experience. J
Neurol 266, 651-658 (2019); published online EpubMar (10.1007/s00415-019-09183-6).
86. F. Lipsmeier, K. I. Taylor, T. Kilchenmann, D. Wolf, A. Scotland, J. Schjodt-Eriksen, W.-Y. Cheng, I. Fernandez-
Garcia, J. Siebourg-Polster, L. Jin, J. Soto, L. Verselis, F. Boess, M. Koller, M. Grundman, A. U. Monsch, R. B.
Postuma, A. Ghosh, T. Kremer, C. Czech, C. Gossens, M. Lindemann, Evaluation of smartphone-based testing to
generate exploratory outcome measures in a phase 1 Parkinson's disease clinical trial. Movement disorders : official
journal of the Movement Disorder Society 33, 1287-1297 (2018); 10.1002/mds.27376.
87. A. Abdolahi, N. Scoglio, A. Killoran, E. R. Dorsey, K. M. Biglan, Potential reliability and validity of a modified
version of the Unified Parkinson's Disease Rating Scale that could be administered remotely. Parkinsonism Relat
Disord 19, 218-221 (2013); published online EpubFeb (10.1016/j.parkreldis.2012.10.008).
88. Apple Inc., ORKParkinsonStudy GitHub Open Source Repository (GitHub, 2018; https://github.com/ResearchKit/
89. Apple Inc., "Movement Disorder API Developer Documentation" (Apple Developer Documentation, https://
90. W. R. Powers III, M. Etezadi-Amoli, A. V. Ullal, D. Trietsch, S. Kianian, H. A. Pham,
Passive Tracking of Dyskinesia/Tremor Symptoms (US Patent Application Number: 20190365286, filed 12/5/2019).
Acknowledgments: We thank K. Tsou, Y. Zhu, M. Agarwal, G. Blanco, S. Chang, C. Currie, B. Cusack, S. Estoesta,
W. Goh, K. Jessop, J. Tung, G. Valsan, J. Beard, L. Huet and J. Yip for supporting study execution. We also thank R.
Huang, M. O’Reilly, C. Mermel, G. Chi-Johnston, J. Leung, C. Jia, S. Friend, A. Trister, B. Tribble and countless
other academic and industry leaders for many helpful discussions. Most of all, we are grateful to the study
participants who shared their time and experiences with Parkinson’s disease with us.
Funding: The study was funded by Apple, Inc.
Author contributions: R.P., M.E.-A., and A.V.U. contributed to the study design, algorithm design,
data analysis, and writing of the paper. H.A.P. contributed to study design, algorithm design, and writing. E.M.A.
contributed to study design and writing. S.K., I.M., and M.G. contributed to study design, data collection, data
analysis of supplementary material, and writing. P.T.L., N.H., T.M.H., and S.B. contributed to study execution,
clinical input, and writing. D.T. contributed to system design and implementation, and data collection. A.S.A.
supported algorithm design and reference data collection. J.D.K. contributed to writing.
Competing interests: All authors with Apple affiliations are either current or former Apple
employees and are Apple shareholders. P.T.L. has received clinical trial support from US WorldMeds, Acadia
Pharmaceuticals, and BioIVT. R.P., M.E.-A., A.V.U., D.T., S.K., and H.A.P. are inventors on patent US 20190365286
submitted by Apple, Inc. that covers passive tracking of tremor and dyskinesia symptoms. There are no other
conflicts of interest to declare.
Data and materials availability: All data associated with this study are present in the paper or the Supplementary
Materials. Raw data cannot be publicly provided because of restrictions in
the informed consent and the risk of re-identification. All symptom profiles for consented
subjects have been made available with corresponding medication schedules in the supplementary
figures for review. Algorithm outputs can be accessed by submitting a request for the Movement
Disorder API entitlement (see developer.apple.com for full API documentation). Details about
this algorithm can also be found in the patent application "Passive Tracking of Dyskinesia/Tremor
Symptoms" (US 20190365286, Dec 5, 2019, U.S. Patent Office).