Title: Smartwatch inertial sensors continuously monitor real-world motor fluctuations in
Parkinson’s disease
Authors: Rob Powers1, Maryam Etezadi-Amoli1, Edith M. Arnold1, Sara Kianian1,2, Irida
Maxsim Gibiansky1, Dan Trietsch1, Alexander Singh Alvarado1, James D. Kretlow1, Todd M.
Herrington3,4, Salima Brillman5, Nengchun Huang6, Peter T. Lin6, Hung A. Pham1, Adeeti V.
1Apple, Inc., Cupertino, California, 95014
2Renaissance School of Medicine, Stony Brook University, Stony Brook, NY 11794
3Massachusetts General Hospital, Department of Neurology, Boston, MA 02114
4Harvard Medical School, Department of Neurology, Boston, MA 02115
5Parkinson’s Disease and Movement Center of Silicon Valley, Menlo Park, CA 94025
6Silicon Valley Parkinson’s Center, Los Gatos, California, 95032
*Corresponding author. Email: (A.V.U.)
One Sentence Summary: A smartwatch-based sensing system measures tremor and dyskinesia,
enabling remote monitoring and aiding medication titration in Parkinson’s disease.
Longitudinal, remote monitoring of motor symptoms in Parkinson's disease (PD) could
enable more precise treatment decisions. We developed the Motor fluctuations Monitor for
Parkinson’s Disease (MM4PD), an ambulatory monitoring system that used smartwatch
inertial sensors to continuously track fluctuations in resting tremor and dyskinesia. We designed
and validated MM4PD in 343 participants with PD, including a longitudinal study of up to six
months in a 225-subject cohort. MM4PD measurements correlated to clinical evaluations of
tremor severity (ρ = 0.80) and mapped to expert ratings of dyskinesia presence (P < 0.001)
during in-clinic tasks. MM4PD captured symptom changes in response to treatment that matched
the treating clinician's expectations in 94% of evaluated subjects. In the remaining 6% of cases,
symptom data from MM4PD identified opportunities to make clinically applicable changes in
pharmacologic strategy. These results demonstrate the promise of MM4PD as a tool to support
improvements in patient-clinician communication, medication titration, and clinical trial design.
Smartwatches are a well-established tool for continuous activity and fitness tracking (1-6).
Recent studies have shown that smartwatch-based longitudinal tracking can be extended to
clinical applications (3, 7-9), such as remote monitoring of motor symptoms in Parkinson’s
disease (PD), the second most common neurodegenerative disease worldwide (4, 10-19).
Quality of life in patients with PD correlates with a clinician’s ability to precisely titrate
medications (15, 20, 21). Dopamine replacement therapies improve motor symptoms, but
determining optimal medication schedules remains challenging. The Movement Disorder
Society-sponsored revision of the Unified Parkinson’s Disease Rating Scale (MDS-UPDRS) part
III is a broadly accepted tool that uses a quantized five-point scale to measure motor symptom
severity (22-25). These assessments provide only a snapshot of the patient's day during in-
clinic visits that typically occur every few months. For out-of-clinic symptom tracking, clinicians
rely on patient recall of symptoms (26-30), which is often error-prone, particularly for
medication-induced symptoms like dyskinesia (31). As such, clinicians are limited to infrequent,
coarse patient evaluations that cannot capture subtle disease progression or daily fluctuations
from medication, exercise, diet, or stress (20, 32-38).
We hypothesized that objective, continuous, and sensitive tracking of tremor and
dyskinesia could reveal more granular on-off patterns in a patient's day and serve as a clinical
decision support tool to improve medication titration. Prior work has shown that machine
learning and kinematics-based algorithms applied to inertial sensor data can identify Parkinsonian
resting tremor (12, 18, 22, 39-49) and choreiform dyskinesias, a side effect often associated with
peak doses of dopamine medications (10, 13, 16, 34, 50-56). However, only a few algorithms
have been successfully translated into monitoring systems usable in real-world scenarios and
with sufficient patient adherence to potentially impact clinical decision-making (57-65).
To bridge these gaps, we developed an ambulatory monitoring system, the Motor
fluctuations Monitor for Parkinson’s Disease (MM4PD), that unobtrusively and accurately
captures patterns of resting tremor and choreiform dyskinesia. MM4PD was designed with all-
day data from a longitudinal control study, used to establish signal-to-noise cutoffs for tremor and
dyskinesia estimates, and a longitudinal patient study, used to assess the system’s ability to capture
motor fluctuations and its potential utility in aiding clinicians during medication titration across a
range of Parkinson’s phenotypes in hundreds of participants with PD.
Figure 1. Overview of the motor fluctuation monitor for Parkinson’s disease. (A) Raw accelerometer and
gyroscope data from smartwatches contained features that predicted the presence of dyskinetic, choreiform
movements and the severity of Parkinsonian resting tremor. Second-level choreiform dyskinesia and tremor metrics
were aggregated into minute-level outputs that were accessible via an application programming interface (API) on
the smartwatch. (B) Minute-level outputs were averaged over multiple days to generate smartwatch symptom
profiles for each patient. (C) Clinicians evaluated smartwatch symptom profiles to capture the effect of medication
titration, deep brain stimulation, and lifestyle changes.
MM4PD development and study coverage
MM4PD development occurred in two main phases: (i) algorithm design that converted
raw inertial sensor data into tremor and dyskinesia estimates, and (ii) validation of the MM4PD
outputs' ability to capture fluctuating daily symptom profiles in response to medication schedules
and other factors while subjects engaged in real-world behavior (Fig. 1, fig. S1). The tremor and
dyskinesia algorithms were developed using sensor data captured across three studies (Fig. 2, fig.
S1, table S1) covering 343 unique participants with PD and 171 elderly non-PD controls.
Algorithms were primarily designed using sensor data collected during an initial pilot study
utilizing in-clinic MDS-UPDRS assessments. We established that MM4PD is accurate and
sensitive in estimating symptom presence or severity in controlled scenarios. Using all-day
sensor data from an additional phase of the pilot study consisting of one week of daily device
usage to better assess real-world scenarios, we extended the algorithms’ robustness during daily
activities, including walking (table S2). A multi-month longitudinal patient study was then used
for further design and validation of the algorithms’ ability to discern changes caused by events
that occur over longer time scales, such as medication changes. A cohort of
171 older subjects enrolled in the longitudinal control study underwent continuous monitoring
with the same smartwatch for the purposes of MM4PD development and served as age-matched
controls. In both the pilot study and the longitudinal patient study, we recruited from a broad
enrollment pool to test the MM4PD algorithms on patients with varied routines. We did not
restrict by subtype or severity of PD, and patients reported a broad range of dominant symptoms,
from tremor to gait impairment to bradykinesia or fear of falling (fig. S2).
Figure 2. Summary of study design and validation. (A) Smartwatch sensor data was mapped to MDS-UPDRS
ratings by designing an algorithm that worked in bounded conditions such as in-clinic cognitive distraction tasks,
and further enriched with shorter free-living periods spanning weeks. In the validation phase, motor fluctuations
from smartwatch symptom profiles were retrospectively compared to a patient’s prescribed medication times.
Longitudinal datasets were divided into design and hold-out sets. (B) Three studies were performed to map sensor
data to MDS-UPDRS ratings: (i) The pilot study with 118 patients with Parkinson's disease (PD) with multiple
[Figure 2 panel: summary table of the three studies (pilot; longitudinal patient; longitudinal control) comparing unique subjects (n = 118 PD; n = 171 control), measurement duration (up to 4 in-clinic sessions plus 1 week live-on; up to 6 months; up to 12 months), reference assessments, clinical raters (3 movement disorder specialists through video), and application to algorithms (design; design n = 143 with hold-out n = 82; design). The design phase mapped smartwatch sensor data to MDS-UPDRS ratings under bounded conditions and free-living use; the validation phase compared smartwatch outputs to medication data.]
expert ratings, (ii) the longitudinal patient study with 6+ months of data from 225 patients with PD, and (iii) the
longitudinal control study with 171 elderly, non-PD controls.
System performance: tremor
We designed MM4PD to detect and classify the severity of tremor (based on
displacement) as slight (<0.1 cm), mild (0.1-0.6 cm), moderate (0.6-2.2 cm), strong (>2.2 cm),
unknown, or absent during each one-minute interval the smartwatch was worn (fig. S3). First,
we verified the accuracy of smartwatch estimates of wrist displacement. The Pearson correlation
coefficient between displacement measured by a motion capture system with sub-millimeter
accuracy and the watch estimate was 0.98 in a control subject (Fig. 3A) with a mean signed error
of -0.04 ± 0.17 cm. Wrist displacement estimates responsively tracked increases in tremor
severity as confirmed by time-synchronized video in several subjects (Fig. 3B, fig. S4, movies S1
to S3). Next, we showed that watch displacement correlated with expert MDS-UPDRS tremor
amplitude ratings, with a rank correlation coefficient of 0.80 (Fig. 3C, fig. S5). During cognitive
distraction tasks from the pilot study, MM4PD captured tremor in 97.7% of cases where all raters
agreed (table S3, fig. S6). The median tremor false positive rate over 43,300 hours of all-day data
from 171 elderly, non-PD control subjects in the longitudinal control study was 0.25%. False
positives occurred infrequently during targeted activities in young, healthy controls, such as
manual teeth brushing (8%) and playing a musical instrument (2%) (table S2).
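The displacement-to-severity binning described at the start of this section can be illustrated with a minimal sketch. Only the displacement cutoffs come from the text (fig. S3); the function name, argument layout, and the convention that a missing detection flag maps to "unknown" are hypothetical.

```python
def classify_tremor_minute(tremor_detected, displacement_cm=None):
    """Map a one-minute tremor estimate to one of the paper's severity bins.

    tremor_detected: True/False, or None when the estimate is unavailable
    displacement_cm: estimated wrist displacement for the minute, in cm

    Thresholds (<0.1 slight, 0.1-0.6 mild, 0.6-2.2 moderate, >2.2 strong)
    follow the text; everything else here is an illustrative assumption.
    """
    if tremor_detected is None:
        return "unknown"
    if not tremor_detected:
        return "absent"
    if displacement_cm < 0.1:
        return "slight"
    if displacement_cm < 0.6:
        return "mild"
    if displacement_cm <= 2.2:
        return "moderate"
    return "strong"
```

For example, a detected tremor with a 0.3 cm displacement estimate would fall in the "mild" bin.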
All-day tremor estimates from the longitudinal patient study, as quantified by an
individual's mean percentage of time with tremor detected per day, correlated with their MDS-
UPDRS tremor constancy score assessed during a brief, in-clinic visit at the start of the study,
with a Spearman's rank correlation coefficient of 0.72 in both design and hold-out sets (Fig. 3, D and
E). Intra-day fluctuations from sensor estimates corresponded with prescribed times for
carbidopa/levodopa (C/L) doses, as shown in the smartwatch symptom profiles for several
subjects (fig. S7). An example is shown for subject S002 (Fig. 3F), who showed characteristic
wearing-off periods of 1.5-2.5 h between peak and trough times, which were consistent with
established C/L pharmacokinetics in non-fasting patients (32, 33, 35, 66, 67).
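The aggregation behind these correlations can be sketched as follows: minute-level tremor flags are collapsed into a per-day percentage, and the daily values are rank-correlated with clinical scores. The minute-flag encoding (1 = tremor detected, 0 = none, None = off-wrist) and the function names are illustrative assumptions; the paper's exact ranking procedure is not specified.

```python
def daily_tremor_percent(minute_flags):
    """Percentage of worn minutes flagged as tremor in one day.

    None entries model off-wrist minutes (an assumed encoding) and are
    excluded from the denominator.
    """
    worn = [f for f in minute_flags if f is not None]
    if not worn:
        return 0.0
    return 100.0 * sum(worn) / len(worn)

def spearman_rho(x, y):
    """Spearman rank correlation with average ranks for ties."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(order):
            j = i
            # group tied values and assign them their average rank
            while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
                j += 1
            avg = (i + j) / 2.0 + 1.0
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

In practice one would compute `daily_tremor_percent` per subject-day, average across days per subject, and correlate those means against the MDS-UPDRS constancy scores.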
Figure 3. Smartwatch estimates of tremor severity and presence correlate to MDS-UPDRS ratings and show
intra-day motor fluctuations. (A) Displacement estimated from the smartwatch correlated to motion capture
reference during simulated tremor by a control subject in 9 different seated and standing positions (r2=0.98). (B)
Smartwatch displacements from inertial sensor data sensitively tracked the onset of tremor during cognitive
distraction tasks (movie S1). Upper: accelerometry, lower: estimated displacement. (C) Displacement estimates from
the smartwatch during 253 seated and standing tasks are shown as a box plot separated into boxes by the average
MDS-UPDRS tremor severity ratings from three movement disorder specialists in the pilot study (Spearman rank
correlation of 0.80). Mean daily smartwatch tremor estimates correlated with MDS-UPDRS tremor constancy
ratings from the subject’s last in-clinic visit in (D) design (n=95) and (E) hold-out (n=43) sets. (F) Smartwatch
symptom profiles (tremor amounts in 15-min increments) for subject S002 indicating fluctuations characteristic of
the pharmacokinetics of the patient's medication (25/100 mg Carbidopa/Levodopa). Data in (D and E) are shown as
box plots with each box containing the mean daily fraction of time reported with tremor by the smartwatch for
subjects with a tremor constancy assessment shown on the x-axis. (Spearman rank correlations of 0.72 were
calculated for each set).
System performance: dyskinesia
We developed a Choreiform Movement Score (CMS) (Supplementary Methods: Chorea
Detection) that returns a frequent estimate of dyskinesia presence or absence, rather than a
quantification of severity or amplitude, by selecting features that capture the irregular, jerky
movements visible at the wrist in subjects with a diagnosis of chorea. The dyskinesia algorithm
was designed and validated across 343 participants with PD (61 with dyskinesia) and 171
elderly, non-PD controls (fig. S1). We calculated CMS from sensor data in the pilot study, and
compared it to dyskinesia ratings from three MDS-certified experts during multiple MDS-
UPDRS assessments. CMS showed significant differences (P<0.001) for all pairwise
comparisons using a Wilcoxon rank sum test across three groups: (i) 65 subjects with confirmed
absence of in-session dyskinesia by all three raters (89 tasks), (ii) 69 subjects with discordant
dyskinesia ratings (109 tasks), and (iii) 19 subjects with confirmed dyskinesia across all three
raters (22 tasks) (Fig. 4, A to C). Examples of detected and undetected dyskinesia during a
cognitive distraction task are shown in fig. S4, C to F, and movies S4 to S7.
The amount of dyskinesia detected by MM4PD significantly differed between subjects
with PD with known chorea and those without, in both cross-validation and hold-out datasets. In
our cross-validation design set, we detected dyskinesia for an average of 10.7±9.9% (μ±σ) of the
day in 32 subjects with chorea (Fig. 4D, table S4). In contrast, we detected dyskinesia for
2.7±2.2% of the day in 125 patients with PD with no known dyskinesia (P<0.001, Wilcoxon rank
sum test). In a hold-out dataset from the longitudinal patient study, the percentage of time
dyskinesias were detected for the chorea group (5.9±5.3%) significantly differed from subjects
with no reported dyskinesias (2.0±2.2%) (P=0.027, Wilcoxon rank sum test) (Fig. 4E, table S4).
Dyskinesia false positive rates were low across common activities like walking (1%). In all-day
data from elderly, non-PD controls in the longitudinal control study, the median false positive
rate was 2.0% (table S2). However, specific activities that mimic choreiform movements, such as
playing the piano, had high false positive rates (table S2). Averaged over time, MM4PD
dyskinesia and tremor estimates nonetheless showed “on-off” fluctuations alternating between
peaks of Parkinsonian tremor symptoms and dyskinetic side effects as a result of levodopa
therapy, as shown for subject S003 (Fig. 4F).
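The group comparisons above use the Wilcoxon rank sum (Mann-Whitney) test. A minimal sketch with a normal approximation is shown below for illustration; the paper's statistical software is not specified, and the function name, tie handling, and approximation are assumptions of this sketch.

```python
import math

def rank_sum_test(x, y):
    """Two-sided Wilcoxon rank-sum test via a normal approximation.

    Ranks the pooled samples (average ranks for ties), forms the
    Mann-Whitney U statistic for group x, and converts it to a two-sided
    p-value with a Gaussian approximation. Returns (U, p).
    """
    combined = [(v, 0) for v in x] + [(v, 1) for v in y]
    combined.sort(key=lambda t: t[0])
    ranks = [0.0] * len(combined)
    i = 0
    while i < len(combined):
        j = i
        # assign tied values their average rank
        while j + 1 < len(combined) and combined[j + 1][0] == combined[i][0]:
            j += 1
        avg = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            ranks[k] = avg
        i = j + 1
    r_x = sum(r for r, (_, g) in zip(ranks, combined) if g == 0)
    n1, n2 = len(x), len(y)
    u = r_x - n1 * (n1 + 1) / 2.0
    mu = n1 * n2 / 2.0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (u - mu) / sigma
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return u, p
```

Applied to per-subject daily dyskinesia percentages from the chorea and no-chorea groups, this kind of test yields the pairwise P values reported above.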
Figure 4. In-clinic and all-day smartwatch choreiform dyskinesia detection matches clinical evaluation. (A)
Chorea movement scores created from smartwatch inertial sensor data fall above the dyskinesia detection threshold
during a standardized MDS-UPDRS cognitive distraction task for a patient. Upper: accelerometry, lower: chorea
movement scores. (B) Chorea movement scores remain low for a patient who is walking and does not have
dyskinesia. Upper: accelerometry, lower: chorea movement scores. (C) Chorea Movement Scores (CMS) computed
during in-clinic cognitive distraction tasks for the pilot study differentiated between the presence or absence of
dyskinesia (DK) as based on expert ratings (***P < 0.001 for all pairwise comparisons, using Wilcoxon rank sum
test). (D) The amount of dyskinesia detected in patients significantly differed between subjects with and without
chorea using all-day data in a design set (***P < 0.001, using Wilcoxon rank sum test), (E) and a hold-out set (*P =
0.027, using Wilcoxon rank sum test). (F) A smartwatch symptom profile for subject S003 captures dyskinesia and
motor fluctuations.
Real world clinical deployment
Subjects aged 71±8.9 years in the longitudinal patient study wore the watch for 10.9±2.5
hours/day and remained in the study for an average of 104±59 days (μ±σ); 3% of subjects
dropped out of the study (table S1). Subjects spanned a range of all-day behavior and activity,
were treated with a variety of medications, and engaged in diverse activities enabled by
smartwatch functionality, such as activity tracking, workouts, and guided breathing (fig. S8, S9,
table S5).
To determine whether watch tremor and dyskinesia estimates could capture symptom
patterns relevant in the course of disease and treatment monitoring, we visualized MM4PD
outputs to create individual symptom profiles for all subjects in the longitudinal patient study. We
found that smartwatch symptom profiles captured changes after individuals underwent surgery
for deep brain stimulation (DBS) (fig. S10A), deep brain stimulation reprogramming (fig. S10B),
and even as a subject became more adherent to their prescribed treatment plan (fig. S11).
MM4PD also captured days with worsened symptoms and improvement with medication, as
shown for subject S004, who experienced worsened tremor until beginning additional morning
doses of controlled release carbidopa/levodopa (C/L CR) (Fig. 5, fig. S12).
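A smartwatch symptom profile of the kind shown for S004 can be sketched as a time-of-day average across days. The 15-minute window comes from the figures; the minute encoding (1 = symptom detected, 0 = none, None = off-wrist) and the function name are illustrative assumptions.

```python
def symptom_profile(days, window_min=15):
    """Average minute-level symptom flags across days into per-window
    percentages over the time of day.

    `days` is a list of day records, each a list of 1440 entries (one per
    minute since midnight): 1 = symptom detected, 0 = none, None = watch
    not worn. The encoding is an assumption for illustration. Returns a
    list of 96 window percentages (None where the watch was never worn).
    """
    n_windows = 24 * 60 // window_min
    profile = []
    for w in range(n_windows):
        start = w * window_min
        detected = worn = 0
        for day in days:
            for m in day[start:start + window_min]:
                if m is None:
                    continue  # skip off-wrist minutes
                worn += 1
                detected += m
        profile.append(100.0 * detected / worn if worn else None)
    return profile
```

Plotting such a profile against the prescribed medication times is what reveals the wearing-off pattern described for subjects S002 and S004.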
Figure 5. Longitudinal tremor and choreiform dyskinesia symptom levels and profiles reflect changes in
medication titration. (A) Longitudinal tracking of tremor displacement over time for subject S004. Outlier
symptom burden and severity was observed on day 19, confirmed by patient-report of a "bad" day in clinical notes.
Med, medication. (B) Mean symptoms per day for subject S004. After day 19, the patient started a new prescribed
medication plan and the amount of tremor reduced again. (C) Smartwatch symptom profiles (% per 15-min window)
aggregated across multiple days according to medication schedules for subject S004. Upper plot: Higher tremor
levels were observed in the morning and mid-afternoon, which matched patient-reported "off" times. Lower plot: A
reduction in the amount of tremor was seen during the patient’s mid-afternoon “off" time on the new medication
schedule (day 20 onward).
We then evaluated whether smartwatch symptom measurement could identify unexpected
responses to treatment that may not be caught by traditional clinical assessment. A clinician well-
versed in each subject’s medical history reviewed all medication changes that occurred during
the longitudinal patient study in the absence of additional smartwatch data. The clinician then
completed a second review of the medication changes with accompanying smartwatch symptom
profiles from before and after treatment to determine whether the observed symptom changes
matched his initial clinical assessment based on medical history alone. Changes in symptom
profiles for 104 subjects who underwent treatment matched the clinician’s expectation in 94% of cases (Fig. 6A). In the
remaining 6% of cases (n=6), we found that smartwatch symptom profiles helped the clinician
identify likely alternate clinical or pharmacological responses that may have otherwise gone
unnoticed. For example, in three of six cases, the clinician determined that MM4PD detected
symptom changes imperceptible by traditional assessment but that were supported by secondary
clinical evidence (e.g., subject reported upper limb cramps, but did not report dyskinesias) (table
S6, fig. S13A). In the other three cases, the clinician deduced that MM4PD data indicated that
those subjects could have uncommon but known side effects from other prescribed drugs that
impacted their tremor or dyskinesia symptoms in an unexpected manner (e.g., the antipsychotic
medication clozapine was thought to have reduced tremor in some subjects) (table S6, fig. S13).
Additionally, the clinician noted that MM4PD may have the potential to support monitoring of
variable or subclinical symptoms such as emergent tremor or dyskinesia (fig. S13A, S14, S15). The
clinician also identified distinct patterns of dyskinesia, including peak-dose and diphasic
dyskinesias, which require distinct approaches to medication management (fig. S16).
Finally, we tested whether interpretation of smartwatch symptom profiles would
generalize beyond a single clinician familiar with the subject’s history. Three additional
movement disorder specialists classified smartwatch symptom profiles as either “before" or
“after" treatment for a given medication change. No supporting medical history was supplied
other than MDS-UPDRS ratings for tremor and dyskinesia assessed at the start of the study.
Raters correctly matched smartwatch symptom profiles in most cases (87.5%). For the remaining
cases, the rater presumed that an alternate medication from the patient's prescription was having
a dominant effect, which resulted in a misclassification. This occurred once per rater, and in each
instance the other two raters had correctly classified the profiles (Fig. 6B, fig. S17, S18).
Figure 6. Smartwatch symptom profiles match clinician expectation and provide quantitative evidence for
cases with uncertainty. The clinician reviewed the smartwatch symptom profiles of 112 subjects in the longitudinal
patient study who underwent treatment changes. (A) Symptom changes matched the clinician's expectation of the
prescribed medication change in 94% of cases. Unexpected cases revealed plausible incidence of known side-effects
to medications. (B) Three blinded movement disorder specialists classified 10 sets of profiles as pre- or post-
treatment using only the patient's medication schedule and MDS-UPDRS tremor and dyskinesia ratings from the
intake visit. 87.5% of classifications were correct; three misclassifications occurred because raters presumed that an
alternate medication had a dominant effect. Six cases were deemed inconclusive and were excluded.
The MM4PD smartwatch system converts motion sensor data into resting tremor and
dyskinesia measurements. We show that MM4PD outputs correlate with clinician-rated symptom
severity in controlled and real-world environments, and observe low false positive rates in an
elderly, non-PD control cohort. The system also captures motor fluctuations and medication
response, which were validated by comparing smartwatch symptom profiles to holistic,
retrospective clinical evaluation and through a blinded classification task by three movement
disorder specialists. In the longitudinal patient study, smartwatch symptom profiles apprised the
clinician of subclinical symptoms missed in routine care and identified subjects who had not
adhered to their medications. These findings demonstrate that by aligning MM4PD with
standardized MDS-UPDRS assessments, the system can complement traditional examination
with interpretable, quantitative and longitudinal symptom data (fig. S19, S20).
Several limitations of this study should be noted. First, assessing MM4PD accuracy is
constrained by the precision and availability of gold standard MDS-UPDRS ratings. This rating
scale requires an in-person clinician evaluation and is not designed for continuous symptom
measurement in everyday life. Furthermore, this reference scale is validated for its overall score
across many aspects of Parkinson's disease, not just its motor subcomponent. Discrepancies can
also occur as clinical raters may miss tremors at the edge of human perception that are easily
identified with an advanced sensor system (24). Second, the studies were subject to recruitment
bias. Notably, there were fewer than expected cases of severe Parkinson's disease, which may be
explained by reduced willingness in patients with advanced cases to participate in studies, or
better treatment plans and adherence in our cohort. Finally, the study was only conducted with
the Apple Watch; in the future, a more direct comparison of MM4PD with predicate systems, or
with systems employing multiple inertial sensor sites, may be warranted (72).
Our system addresses common barriers patients face in remote care settings. By
embedding the algorithms on a full-featured consumer device, users benefit from discreet,
unobtrusive symptom monitoring without the stigma of a dedicated medical device or burden of
active tasks. Device adherence may increase due to interest in other features like activity tracking
and messaging. Logged workouts and device usage can help patients and clinicians contextualize
how their symptoms are affected by lifestyle factors like exercise and stress.
There are several opportunities to improve the clinical utility of the MM4PD system. The
current system focuses on measuring resting tremor and detecting dyskinesia. Future work could
incorporate additional outputs such as postural tremor, dyskinesia severity, and periods of
simultaneous tremor and dyskinesias. Assessment of other motor symptoms such as
bradykinesia, gait, and posture may also be needed to capture a complete picture of each patient.
Additionally, because the smartwatch is limited to a single observation point at the wrist,
MM4PD outputs may be less reliable in particular scenarios (e.g. tremor overestimates from a
loose band, false dyskinesia from daily piano practice). However, we observed that most of these
situations occur sporadically and do not unduly impact smartwatch symptom patterns when data
is averaged over multiple days. Symptoms that do not manifest on the limb on which the watch is
worn may also not be robustly detected, and the system is not designed to measure or assess non-
motor symptoms, which may have a noteworthy impact on patients’ quality of life (76). Future
studies could determine how to safely and advantageously combine traditional assessments,
including those for non-motor symptoms, with smartwatch data (77). Clinicians could confirm or
identify additional symptoms with in-person or virtual visits or potentially via an app that
enables self-reporting of both motor and nonmotor symptoms. Future algorithms could also
classify temporal patterns of symptoms such as peak-dose or diphasic dyskinesias to guide
clinicians to adjust medication dosing or consider an advanced therapy such as DBS or
continuous levodopa infusion.
Gaps remain before MM4PD can become a decision support tool in clinical workflows.
Workflow integration requires secure and compliant data flows from the MM4PD system to
providers; the system is designed to store data solely on-device until the user consents to third-party
access. However, other applications have successfully achieved secure integrations with the
Apple Watch and iPhone platforms (78, 79). Without reimbursement, smartwatches and
smartphones may not be broadly accessible to all patients. Integrating MM4PD into clinical
settings requires careful consideration to avoid further exacerbating socioeconomic disparities
among patients with Parkinson’s disease (80, 81). Finally, United States Food and Drug
Administration (FDA) clearance may be necessary to achieve widespread clinical utilization.
While the underlying hardware (Apple Watch) of MM4PD is not a medical device, examples
exist of both FDA-cleared software built upon the Apple Watch and FDA-cleared wearable
systems for Parkinson's disease (82-84).
In the future, this technology has the potential to serve a wide range of applications.
Clinicians could use smartwatch symptom profiles to improve treatment plans, motivate patients
to remain adherent, or quantify post-surgery improvements (85). Researchers could passively
assess disease progression or treatment efficacy without symptom journals, which may burden
participants and confound results by increasing patients’ awareness of symptoms. Clinical
trialists could deploy the smartwatch system in real world environments to screen for
Parkinsonian tremor in asymptomatic cohorts, or serve as a companion diagnostic during drug
development (86). In summary, MM4PD enables continuous symptom monitoring through a
smartwatch, and can help clinicians and researchers incorporate out-of-clinic data for the
treatment and study of Parkinson’s disease.
Materials and Methods:
Study design
We conducted three distinct studies to design and validate the algorithms, namely: the pilot
study (n = 118 for algorithm design), the longitudinal patient study (n = 143 for design, n = 82
for validation), and the longitudinal control study (n = 171) (Fig. 2, table S1, fig. S1). All three
included the current clinical gold standard assessment of Parkinson’s disease motor symptoms —
MDS-UPDRS Part III scores — and continuous inertial sensor data from the Apple Watch.
Randomization was not applicable; all clinical ratings used to design and validate the algorithm
were obtained in a blinded manner (unexposed to algorithm development or outputs). We
obtained informed consent from subjects in all three studies in accordance with IRB-approved
protocols (pilot patient study: Protocol #201005, 201014, Midlands Independent Review Board;
longitudinal patient study: Protocol #100.1 Quorum Institutional Review Board; longitudinal
control study: Protocol #18-0424-781, Advarra Institutional Review Board). Individual profile
data is only included from subjects who consented to publication of individual profiles.
With pilot study data, we designed the tremor algorithm using data from standardized
MDS-UPDRS Part III motor tasks (pronation-supination, rest, etc.) with expert clinical ratings.
Three movement disorder specialists provided expert ratings via video recordings time-aligned
with smartwatch sensor data. Remote video recordings have previously been shown to be a valid
method for MDS-UPDRS assessments (87). A subset of subjects also participated in a one-week,
out-of-clinic measurement period to capture typical free-living behavior in the pilot study. The
free-living period was bookended by additional MDS-UPDRS motor assessments with video
reference; no clinical (MDS-UPDRS) ratings were performed during the free-living period itself.
Subjects self-reported updated medication information upon enrolling into the pilot study. In
addition to MDS-UPDRS assessments, subjects underwent a 30-minute instruction period to
familiarize them with their smartwatch (Apple Watch) functionalities. Specifically, subjects were
shown how to log workouts, check their activity “rings” and start deep breathing sessions
(Supplementary Methods, data file S1). We also conducted short surveys about the study
subjects’ experience with a smartwatch (table S7, fig. S8).
The longitudinal patient study was conducted in the context of a movement disorder clinic
and combined standardized on-site assessment with free-living observation lasting up to 6
months. A movement disorder specialist administered the MDS-UPDRS Part III Motor
Assessment at enrollment. Subjects were instructed to wear the watch on the side most affected
by tremor or dyskinesia. The first 143 subjects were used for design whereas 82 subjects
enrolling later were separated into a validation data set for algorithm assessment. Tremor and
dyskinesia detection algorithms also used data from a 171-subject elderly cohort (the longitudinal
control study) of adults aged 65 and older who did not self-report Parkinson's disease; sensor data
were collected continuously for up to 12 months (fig. S1).
Three sub-studies were performed within the longitudinal patient study to validate
MM4PD outputs. First, a small pilot investigation (n = 36) was conducted within the longitudinal
patient study in which patients reviewed the smartwatch symptom profiles captured by the
tremor and dyskinesia algorithms alongside a clinician (fig. S19). Second, a comprehensive
evaluation of MM4PD as a decision support tool was performed. The study clinician determined
whether intra-day patterns matched the patient’s medication schedules, and whether weekly
symptom burden changed in response to new treatment regimens (fig. S20). The study clinician
initially performed the review without access to patient charts. After the analysis of the
anonymized review was complete, the clinician investigated unexpected symptoms from the
smartwatch symptom profiles either through review of the patient's charts, or by directly
discussing with the patient or the patient's primary movement disorder specialist as applicable.
The clinician did not evaluate some smartwatch symptom profiles due to insufficient data (i.e.,
subjects with fewer than five full days of data post-treatment or subjects with unobservable signals
such as facial dyskinesias). In the final sub-study, ten smartwatch symptom profiles were reviewed
by three blinded and independent movement disorder specialists. Each rater was given
standardized, written instructions on how to interpret smartwatch symptom profiles. Raters then
classified the profiles as “before” or “after” treatment for a known medication change. Evaluated
smartwatch symptom profile pairs were selected from longitudinal patient study subjects who
underwent treatment changes through automated filtering rules (fig. S17). Smartwatch symptom
profiles were randomly ordered before presenting the task to the raters (fig. S18). The only
additional information provided was the patient’s screening MDS-UPDRS scores for tremor and
dyskinesia (Supplementary Methods).
In the pilot and longitudinal patient studies, the primary inclusion criterion was a
diagnosis of Parkinson's disease made at least 6 months prior, and the primary exclusion criteria were
pregnancy or severe cognitive impairment. Recruitment of patients with PD was initially
structured with the goal of spanning a range of MDS-UPDRS in-clinic scores, and subsequently
broadened in the longitudinal patient study to take all who met inclusion and exclusion criteria to
maximize observations of a range of all-day behaviors and activity. Further details on all studies,
activity and lifestyle characterization, and the patient-clinician pilot within the longitudinal
patient study are provided in Supplementary Methods.
Finally, two control datasets were collected. First, a longitudinal dataset was collected from
elderly participants without Parkinson's disease; each completed a full patient history and
physical exam with a registered nurse at study start. Second, an engineering control study
was run with young, healthy participants performing a range of all-day activities,
including playing musical instruments and driving. Raw accelerometer and gyroscope data from the
Apple Watch were collected, and the tremor and dyskinesia algorithms were simulated from this
raw sensor data to report false positive performance in both free-living and controlled, simulated
activity settings.
Movement Monitor for Parkinson’s Disease
Materials and Software
Accelerometer and gyroscope data were collected at a sample rate of 100 Hz using the
Apple Watch (Series 2 and above). Smartwatch workout type and duration and activity
information with classification (e.g., pedometer, driving classification) were also logged. Sample
code to create all-day tremor and dyskinesia profiles is available on the Apple developer portal
and the open-source ResearchKit GitHub repository (88). Results presented here have been
verified and reproduced to match outputs from the Movement Disorder API available on Apple
WatchOS 5 or later (89, 90).
Tremor displacement and detection
Estimates of resting tremor were made at 2.56-s intervals when periodic signals were
detected between approximately 3 and 7 Hz. This window size was selected to capture brief, intermittent tremors
with sufficient signal-to-noise ratio (SNR). For lower SNRs, or for periods when the subject was
moving (e.g., active tasks or gestures), the algorithm classified tremor as “unknown”. The
remaining regions where the subject was at rest were classified as “no tremor.” When tremor was
reported, a displacement was computed. Tremor SNR thresholds were determined empirically by
comparing true tremor signals to internal false positive datasets spanning >100 hours of
workouts, mechanical vibrations, and all-day data from healthy controls (data from 171 subject
subset consented for publication shown herein as the longitudinal control study). Tremor
estimates were aggregated into minute-by-minute outputs that showed the percentage of time
tremor was unknown, not present, or detected at slight (1), mild (2), moderate (3) or severe
displacements (4) (fig. S3). Displacement categories were determined by evaluating 1-mm
increments and selecting a threshold that maximized algorithm-rater agreement in 72 tasks across
31 subjects rated to have tremor by at least 2 of 3 raters.
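The windowed detection logic above can be sketched in Python as a rough illustration; the published algorithm and its empirically tuned SNR thresholds are not public, so the band-power ratio and motion thresholds below are assumptions standing in for them, not the actual implementation.

```python
import numpy as np
from scipy.signal import periodogram

FS = 100          # Hz, Apple Watch sample rate
WINDOW = 256      # samples -> 2.56 s, matching the paper's interval

def classify_window(accel, band_ratio_thresh=2.0, motion_rms_thresh=0.5):
    """Toy per-window classifier: 'tremor', 'no tremor', or 'unknown'.

    band_ratio_thresh and motion_rms_thresh are illustrative stand-ins
    for the empirically tuned SNR criteria described in the text.
    """
    accel = np.asarray(accel, dtype=float)
    detrended = accel - accel.mean()
    freqs, psd = periodogram(detrended, fs=FS)
    band = (freqs >= 3.0) & (freqs <= 7.0)
    # Fraction of power concentrated in the 3-7 Hz tremor band
    ratio = psd[band].sum() / (psd[~band].sum() + 1e-12)
    if ratio > band_ratio_thresh:
        return "tremor"
    if np.sqrt(np.mean(detrended ** 2)) > motion_rms_thresh:
        return "unknown"   # large non-periodic motion: active task or gesture
    return "no tremor"

# Simulated windows: 5 Hz tremor, quiet rest, and 1 Hz gross movement
t = np.arange(WINDOW) / FS
rng = np.random.default_rng(0)
tremor = np.sin(2 * np.pi * 5 * t) + 0.1 * rng.standard_normal(WINDOW)
rest = 0.05 * rng.standard_normal(WINDOW)
walk = 2.0 * np.sin(2 * np.pi * 1 * t)
```

In this sketch a strong 3-7 Hz peak yields "tremor", low-energy windows yield "no tremor", and large broadband motion yields "unknown", mirroring the three output states described above.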
Motion capture displacement measurements
A healthy, control subject simulated tremor movements with varying amplitudes while
wearing the Apple Watch in seated and standing positions. We attached six reflective markers to
the watch and tracked these markers using a motion capture system (Vicon) during the simulated
tremor movements (Fig. 3A). For each of nine tasks, the Vicon data was divided into 2.56-s
duration segments and a principal component analysis was performed over each segment to
extract the primary direction of tremor motion. The mean peak-to-peak displacement from the
Vicon data was calculated using the first principal component in 2.56-s windows.
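The per-segment PCA step can be illustrated with a short numpy sketch (the original analysis used MATLAB and real Vicon marker data; the oblique-axis trajectory below is simulated):

```python
import numpy as np

def peak_to_peak_displacement(positions):
    """Peak-to-peak displacement along the primary motion axis.

    positions: (n_samples, 3) marker trajectory for one 2.56-s segment.
    Mirrors the described approach: PCA per segment, then the range of
    the data projected onto the first principal component.
    """
    centered = positions - positions.mean(axis=0)
    # First principal component = top right-singular vector of the data
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    projection = centered @ vt[0]
    return projection.max() - projection.min()

# Simulated 1-cm peak-to-peak oscillation along an oblique unit axis
t = np.linspace(0, 2.56, 256)
axis = np.array([1.0, 2.0, 2.0]) / 3.0
motion = 0.5 * np.sin(2 * np.pi * 5 * t)          # +/- 0.5 cm
positions = np.outer(motion, axis) + np.array([10.0, 5.0, 0.0])
```

Because peak-to-peak range is sign-invariant, the arbitrary sign of the singular vector does not affect the result.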
Comparison of tremor displacements to MDS-UPDRS ratings
For subjects with periodic 3-7 Hz tremor signal, we compared smartwatch wrist
displacements to rater MDS-UPDRS scores for 253 cognitive distraction and eyes-open standing
tasks across 61 unique subjects from the pilot study (Fig. 3C, fig. S5). The Spearman’s rank
correlation coefficient was computed between the maximum tremor displacement detected
during the task and the mean of the rater MDS-UPDRS scores rounded to the nearest integer.
False negative rates were computed as the proportion of time the algorithm detected no tremor
during cognitive distraction tasks where the clinical raters reported tremor (table S3). We also
compared the mean daily tremor detection rate for all subjects from the longitudinal patient study
to an overall tremor rating that synthesized constancy and severity using the Spearman's rank
correlation coefficient (Fig. 3D). We calculated the daily tremor percentage as the total detected
tremor time divided by the total time period the watch was worn. Watch wear time excluded
periods where the subject was likely asleep, or where the watch was not being worn as indicated
by a lack of device movement. This percentage was then averaged across all the days the subject
was in the study. Six subjects were excluded because they had insufficient data for analysis.
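The daily tremor percentage computation described above reduces to a masked average over worn minutes; a minimal numpy sketch, with function names chosen for illustration:

```python
import numpy as np

def daily_tremor_percentage(minute_tremor, worn):
    """Percent of worn time with detected tremor for one day.

    minute_tremor: per-minute booleans, tremor detected or not.
    worn: per-minute booleans; False marks likely sleep or non-wear
    (indicated by a lack of device movement), which the paper
    excludes from wear time.
    """
    worn = np.asarray(worn, dtype=bool)
    if not worn.any():
        return np.nan   # no usable wear time this day
    return 100.0 * np.asarray(minute_tremor, dtype=float)[worn].mean()

def mean_daily_rate(daily_rates):
    """Average daily percentages across all study days with data."""
    rates = [r for r in daily_rates if not np.isnan(r)]
    return float(np.mean(rates)) if rates else np.nan
```

The per-subject mean from `mean_daily_rate` is the quantity compared against the overall clinical tremor rating via Spearman's rank correlation.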
Choreiform movement score and all-day dyskinesia analyses
We optimized the dyskinesia algorithm for subjects who already had a confirmed
diagnosis of chorea visible at the wrist. Features were designed to capture the irregular, jerky
movements characteristic of chorea. We verified and selected features by comparing their
predictive power to clinician ratings of dyskinesias during cognitive distraction tasks using
subjects from the pilot study. We then trained the model with the selected features on all-day,
free-living data. For the chorea algorithm, the estimated CMS was calculated four times within a
10.24-second window. We averaged overlapping windows across the task duration to get a mean
CMS for each wrist, and the greater of the two scores from each wrist was compared to expert
ratings. In all-day analyses, averaged CMS values were computed over one-minute intervals.
Scores over 0.16 were reported as chorea during that minute; this threshold was selected to
minimize false positives during all-day analyses. We performed a leave-one-out cross-validation
and hold-out validation on free-living data from subjects in the pilot study and longitudinal
patient study with >24 h of data. A full breakdown of subjects used per analysis and their
associated group are shown in table S4 and described in the Supplementary Methods.
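The minute-level chorea decision can be sketched as follows; the 0.16 threshold is taken from the text, but the fixed windows-per-minute grouping is a simplification of the overlapping-window averaging actually used:

```python
import numpy as np

CMS_THRESHOLD = 0.16  # all-day threshold reported in the text

def chorea_minutes(cms_values, per_minute):
    """Flag chorea per minute from window-level CMS estimates.

    cms_values: 1-D array of CMS scores, ordered in time.
    per_minute: number of CMS windows falling in each minute
    (an illustrative simplification of the overlapping scheme).
    """
    cms = np.asarray(cms_values, dtype=float)
    usable = cms[: len(cms) // per_minute * per_minute]
    minute_means = usable.reshape(-1, per_minute).mean(axis=1)
    return minute_means > CMS_THRESHOLD
```

A minute is reported as chorea only when its averaged CMS exceeds the threshold, which is what keeps all-day false positives low.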
Movement disorder symptom profiles
We generated all-day symptom profiles from 2.56s tremor and dyskinesia outputs
aggregated into 15-minute periods to capture symptom response to medications, while ensuring
sufficient measurements to prevent outliers from masking the primary signal. Dyskinesia was
reported as the percent of time chorea was detected in 15 minutes; tremor was reported as the
percent of time samples were categorized into displacement groups of <0.1 cm, 0.1-0.6 cm,
0.6-2.2 cm, or >2.2 cm. We removed stationary periods of five minutes or more, where inertial
accelerations were near zero. Fifteen-minute periods comprising a 24-hour day were organized
according to the local time of the subject. Windows with less than 50% of available data were
discarded. Tremor percentages were averaged over all days with data, and chorea was displayed
as the median (to reduce the effect of occasional large outliers). This created symptom profiles
over a set period (e.g., a single week, or for the duration of a given medication schedule). Only
windows with data from at least 5 days or more than 20% of the total period were shown.
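The displacement binning used in the profiles maps directly onto a histogram over the four category edges; a small numpy sketch of one 15-minute window (the bin edges are from the text, the function name is illustrative):

```python
import numpy as np

# Displacement category edges from the text (cm)
BIN_EDGES = [0.1, 0.6, 2.2]
BIN_LABELS = ["<0.1 cm", "0.1-0.6 cm", "0.6-2.2 cm", ">2.2 cm"]

def displacement_profile(displacements_cm):
    """Percent of tremor samples in each displacement category for
    one 15-minute window of the symptom profile."""
    d = np.asarray(displacements_cm, dtype=float)
    counts = np.histogram(d, bins=[0.0] + BIN_EDGES + [np.inf])[0]
    return 100.0 * counts / max(len(d), 1)
```

Averaging these per-window percentages over all days with data (mean for tremor, median for chorea) yields the all-day symptom profile.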
Symptom changes between medication schedules
Symptom changes were visualized over time by calculating the mean amount of tremor
and dyskinesia per day, and the mean tremor displacement with a three-day moving average. In
order to determine whether symptoms statistically differed between medication schedules, a
Wilcoxon signed rank test was performed to determine whether paired windows from distinct
time periods showed significant differences (e.g., between periods with two different medication
schedules, such as for subject S004: Fig. 5). Paired windows were matched on both time of day
and day of the week. The average difference across paired windows was computed to determine
the net change between the periods.
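The paired comparison above can be sketched with scipy; representing each schedule as a mapping from (weekday, window index) to percent time with symptom ensures pairs share time of day and day of week, as described. The data layout and function name are illustrative:

```python
import numpy as np
from scipy.stats import wilcoxon

def compare_schedules(before, after):
    """Paired Wilcoxon signed rank test between two medication schedules.

    before / after: dicts mapping (weekday, window_index) -> percent of
    time with symptom, so paired windows match on time of day and
    day of the week. Returns the p-value and the net change.
    """
    keys = sorted(set(before) & set(after))
    b = np.array([before[k] for k in keys], dtype=float)
    a = np.array([after[k] for k in keys], dtype=float)
    stat, p = wilcoxon(b, a)
    return p, float(np.mean(a - b))
```

A negative net change with a small p-value would indicate a reduction in symptom burden under the new schedule.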
Statistical analysis
Figures and statistical analysis were generated using MATLAB and Statistics Toolbox
(Release 2017b, The MathWorks, Inc., Natick, Massachusetts, United States). Box plots are
shown with a central mark at the median, bottom and top edges of the boxes at 25th and 75th
percentiles respectively and whiskers out to the most extreme points within 1.5 times the
interquartile range. Mean difference and limit of agreement (1.96σ) calculations to compare
smartwatch and motion capture displacements were calculated with a custom MATLAB
implementation of a Bland-Altman visualization. Spearman’s rank correlation coefficients were
used to compare tremor displacements and all-day tremor detection to clinical ratings.
Hypothesis testing was performed using Wilcoxon rank sum tests to compare differences in
CMS distributions, dyskinesia detection rates, and tremor rates for different subject groups.
Comparisons in differences in symptom rates between medication schedules were performed
with a Wilcoxon signed rank test. Statistical tests on data used for design and validation are
summarized in table S8. Outliers were displayed for all figures unless otherwise stated.
Supplementary Materials List
Materials and Methods
Fig. S1. Overview of data collected for MM4PD development and validation
Fig. S2. MDS-UPDRS Part 2 and Part 4 scores
Fig. S3. Block diagram of tremor and dyskinesia algorithms in MM4PD
Fig. S4. Accelerometer data and displacement estimates for cognitive distraction task from pilot study subjects
Fig. S5. Data from Fig. 3C, separated by phase (seated and standing)
Fig. S6. Inter-rater concordance compared to algorithm displacement thresholds
Fig. S7. Aggregated all-day tremor symptom profiles for sample subjects grouped by in-clinic MDS-UPDRS scores
Fig. S8. Longitudinal patient study, subjects provided feedback of their smartwatch experiences; responses were
collected via an optional exit survey
Fig. S9. Workouts logged by subjects in the longitudinal patient study.
Fig. S10. Smartwatch symptom profiles show changes in response to DBS surgery and programming
Fig. S11. Increased medication adherence improves consistency in the motor fluctuation profile over averaged
periods of time
Fig. S12. Subject S004’s tremor amounts were significantly reduced when additional carbidopa/levodopa controlled
release doses were added
Fig. S13. Example cases of unexpected symptoms as identified by longitudinal patient study clinician
Fig. S14. Tremor and dyskinesia rates as measured by MM4PD matched clinician expectations in longitudinal
patient study
Fig. S15. Subject with clinically undetected tremor still showed intra-day patterns indicative of medication response
Fig. S16. Examples of diphasic dyskinesia
Fig. S17. Selection criteria and results of matching task from 3 independent raters with 15 medication schedules
Fig. S18. Matching task example where all 3 raters were consistent in rationale and correct
Fig. S19. Subject smartwatch symptom profile review mediated with movement disorder specialist
Fig. S20. Movement disorder specialist smartwatch symptom profile review
Table S1. Study demographics
Table S2. False positive rates
Table S3. Tremor detection rates in a cognitive distraction task compared to ratings from 3 movement disorder specialists
Table S4. Categorization of live-on data by subject
Table S5. Summary of medications across all 225 patients in the longitudinal patient study
Table S6. Details behind cases where clinician expectation differed from smartwatch symptom profiles
Table S7. Longitudinal patient study exit survey
Table S8. Statistical test summary
Movie S1. Example of detected moderate-strong tremor
Movie S2. Example of detected distal tremor
Movie S3. Example of detected strong tremor
Movie S4. Example of detected moderate dyskinesia
Movie S5. Example of detected mild dyskinesia
Movie S6. Example of very subtle dyskinesia below detection threshold
Movie S7. Example of mild dyskinesia below detection threshold
Data file S1. Longitudinal patient study processed data
1. E. J. Topol, Transforming medicine via digital innovation. Sci Transl Med 2, 16cm14 (2010); published online EpubJan
27 (10.1126/scitranslmed.3000484).
2. M. S. Patel, D. A. Asch, K. G. Volpp, Wearable devices as facilitators, not drivers, of health behavior change. JAMA
313, 459-460 (2015); published online EpubFeb 3 (10.1001/jama.2014.14781).
3. S. R. Steinhubl, E. D. Muse, E. J. Topol, The emerging field of mobile health. Sci Transl Med 7, 283rv283 (2015);
published online EpubApr 15 (10.1126/scitranslmed.aaa3487).
4. L. Piwek, D. A. Ellis, S. Andrews, A. Joinson, The Rise of Consumer Health Wearables: Promises and Barriers. PLoS
Med 13, e1001953 (2016); published online EpubFeb (10.1371/journal.pmed.1001953).
5. D. R. Bassett, Jr., L. P. Toth, S. R. LaMunion, S. E. Crouter, Step Counting: A Review of Measurement Considerations
and Health-Related Applications. Sports Med 47, 1303-1315 (2017); published online EpubJul (10.1007/
6. S. S. Gambhir, T. J. Ge, O. Vermesh, R. Spitler, Toward achieving precision health. Sci Transl Med 10, (2018);
published online EpubFeb 28 (10.1126/scitranslmed.aao3612).
7. R. Rawassizadeh, B. A. Price, M. Petre, Wearables: has the age of smartwatches finally arrived? Commun. ACM 58,
45–47 (2014) (10.1145/2629633).
8. B. Reeder, A. David, Health at hand: A systematic review of smart watch uses for health and wellness. Journal of
biomedical informatics 63, 269-276 (2016)
9. K. I. Taylor, H. Staunton, F. Lipsmeier, D. Nobbs, M. Lindemann, Outcome measures based on digital health
technology sensor data: data- and patient-centric approaches. NPJ Digit Med 3, 97 (2020) (10.1038/s41746-020-0305-8).
10. S. Patel, K. Lorincz, R. Hughes, N. Huggins, J. Growdon, D. Standaert, M. Akay, J. Dy, M. Welsh, P. Bonato,
Monitoring motor fluctuations in patients with Parkinson's disease using wearable sensors. IEEE transactions on
information technology in biomedicine : a publication of the IEEE Engineering in Medicine and Biology Society 13,
864-873 (2009)
11. R. Araujo, M. Tabuas-Pereira, L. Almendra, J. Ribeiro, M. Arenga, L. Negrao, A. Matos, A. Morgadinho, C. Januario,
Tremor Frequency Assessment by iPhone R Applications: Correlation with EMG Analysis. Journal of Parkinson's
disease 6, 717-721 (2016).
12. S. Cohen, L. R. Bataille, A. K. Martig, Enabling breakthroughs in Parkinson's disease with wearable technologies and
big data analytics. Mhealth 2, 20 (2016) (10.21037/mhealth.2016.04.02).
13. L. Li, Q. Yu, B. Xu, Q. Bai, Y. Zhang, H. Zhang, C. Mao, C. Liu, T. Shen, S. Wang, in 2017 8th International IEEE/
EMBS Conference on Neural Engineering (NER). (2017), pp. 70-73.
14. A. L. Silva de Lima, T. Hahn, L. J. W. Evers, N. M. de Vries, E. Cohen, M. Afek, L. Bataille, M. Daeschler, K. Claes,
B. Boroojerdi, D. Terricabras, M. A. Little, H. Baldus, B. R. Bloem, M. J. Faber, Feasibility of large-scale deployment
of multiple wearable sensors in Parkinson's disease. PloS one 12, e0189161 (2017)
15. A. Antonini, G. Gentile, M. Giglio, A. Marcante, H. Gage, M. M. L. Touray, D. I. Fotiadis, D. Gatsios, S. Konitsiotis,
L. Timotijevic, B. Egan, C. Hodgkins, R. Biundo, C. Pellicano, P. D. M. consortium, Acceptability to patients, carers
and clinicians of an mHealth platform for the management of Parkinson's disease (PD_Manager): study protocol for a
pilot randomised controlled trial. Trials 19, 492 (2018); published online EpubSep 14 (10.1186/s13063-018-2767-4).
16. A. Zhan, S. Mohan, C. Tarolli, R. B. Schneider, J. L. Adams, S. Sharma, M. J. Elson, K. L. Spear, A. M. Glidden, M. A.
Little, A. Terzis, E. R. Dorsey, S. Saria, Using Smartphones and Machine Learning to Quantify Parkinson Disease
Severity: The Mobile Parkinson Disease Score. JAMA neurology 75, 876-880 (2018)
17. G. Albani, C. Ferraris, R. Nerino, A. Chimienti, G. Pettiti, F. Parisi, G. Ferrari, N. Cau, V. Cimolin, C. Azzaro, L.
Priano, A. Mauro, An Integrated Multi-Sensor Approach for the Remote Monitoring of Parkinson's Disease. Sensors
(Basel) 19, (2019); published online EpubNov 2 (10.3390/s19214764).
18. M. D. Hssayeni, J. Jimenez-Shahed, M. A. Burack, B. Ghoraani, Wearable Sensors for Estimation of Parkinsonian
Tremor Severity during Free Body Movements. Sensors (Basel) 19, (2019); published online EpubSep 28 (10.3390/
19. C. Morgan, M. Rolinski, R. McNaney, B. Jones, L. Rochester, W. Maetzler, I. Craddock, A. L. Whone, Systematic
Review Looking at the Use of Technology to Measure Free-Living Symptom and Activity Outcomes in Parkinson's
Disease in the Home or a Home-like Environment. J Parkinsons Dis 10, 429-454 (2020) (10.3233/JPD-191781).
20. S. Rahman, H. J. Griffin, N. P. Quinn, M. Jahanshahi, Quality of life in Parkinson's disease: the relative importance of
the symptoms. Mov Disord 23, 1428-1434 (2008); published online EpubJul 30 (10.1002/mds.21667).
21. T. Asakawa, K. Sugiyama, T. Nozaki, T. Sameshima, S. Kobayashi, L. Wang, Z. Hong, S. Chen, C. Li, H. Namba, Can
the Latest Computerized Technologies Revolutionize Conventional Assessment Tools and Therapies for a Neurological
Disease? The Example of Parkinson's Disease. Neurol Med Chir (Tokyo) 59, 69-78 (2019); published online EpubMar
15 (10.2176/nmc.ra.2018-0045).
22. G. Deuschl, P. Bain, M. Brin, Consensus statement of the Movement Disorder Society on Tremor. Ad Hoc Scientific
Committee. Mov Disord 13 Suppl 3, 2-23 (1998) (10.1002/mds.870131303).
23. The Unified Parkinson's Disease Rating Scale (UPDRS): status and recommendations. Mov Disord 18, 738-750 (2003);
published online EpubJul (10.1002/mds.10473).
24. C. G. Goetz, B. C. Tilley, S. R. Shaftman, G. T. Stebbins, S. Fahn, P. Martinez-Martin, W. Poewe, C. Sampaio, M. B.
Stern, R. Dodel, B. Dubois, R. Holloway, J. Jankovic, J. Kulisevsky, A. E. Lang, A. Lees, S. Leurgans, P. A. LeWitt, D.
Nyenhuis, C. W. Olanow, O. Rascol, A. Schrag, J. A. Teresi, J. J. van Hilten, N. LaPelle, Movement Disorder Society-
sponsored revision of the Unified Parkinson's Disease Rating Scale (MDS-UPDRS): scale presentation and clinimetric
testing results. Mov Disord 23, 2129-2170 (2008); published online EpubNov 15 (10.1002/mds.22340).
25. C. G. Goetz, [Movement Disorder Society-Unified Parkinson's Disease Rating Scale (MDS-UPDRS): a new scale for
the evaluation of Parkinson's disease]. Movement Disorder Society-Unified Parkinson's Disease Rating Scale (MDS-
UPDRS) : une nouvelle echelle pour l'evaluation de la maladie de Parkinson. 166, 1-4 (2010)
26. G. K. Montgomery, N. C. Reynolds, Jr., Compliance, reliability, and validity of self-monitoring for physical
disturbances of Parkinson's disease. The Parkinson's Symptom Diary. The Journal of nervous and mental disease 178,
636-641 (1990).
27. R. A. Hauser, J. Friedlander, T. A. Zesiewicz, C. H. Adler, L. C. Seeberger, C. F. O'Brien, E. S. Molho, S. A. Factor, A
home diary to assess functional status in patients with Parkinson's disease with motor fluctuations and dyskinesia. Clin
Neuropharmacol 23, 75-81 (2000); published online EpubMar-Apr (10.1097/00002826-200003000-00003).
28. S. S. Papapetropoulos, Patient diaries as a clinical endpoint in Parkinson's disease clinical trials. CNS Neurosci Ther 18,
380-387 (2012); published online EpubMay (10.1111/j.1755-5949.2011.00253.x).
29. E. R. Dorsey, S. Papapetropoulos, M. Xiong, K. Kieburtz, The First Frontier: Digital Biomarkers for
Neurodegenerative Disorders. Digit Biomark 1, 6-13 (2017); published online EpubSep-Dec (10.1159/000477383).
30. M. K. Erb, D. R. Karlin, B. K. Ho, K. C. Thomas, F. Parisi, G. P. Vergara-Diaz, J. F. Daneault, P. W. Wacnik, H. Zhang,
T. Kangarloo, C. Demanuele, C. R. Brooks, C. N. Detheridge, N. Shaafi Kabiri, J. S. Bhangu, P. Bonato, mHealth and
wearable technology should replace motor diaries to track motor fluctuations in Parkinson's disease. NPJ Digit Med 3,
6 (2020) (10.1038/s41746-019-0214-x).
31. S. Pietracupa, A. Fasano, G. Fabbrini, M. Sarchioto, M. Bloise, A. Latorre, M. Altieri, M. Bologna, A. Berardelli, Poor
self-awareness of levodopa-induced dyskinesias in Parkinson's disease: clinical features and mechanisms.
Parkinsonism Relat Disord 19, 1004-1008 (2013); published online EpubNov (10.1016/j.parkreldis.2013.07.002).
32. J. G. Nutt, W. R. Woodward, J. P. Hammerstad, J. H. Carter, J. L. Anderson, The "on-off" phenomenon in Parkinson's
disease. Relation to levodopa absorption and transport. N Engl J Med 310, 483-488 (1984); published online EpubFeb
23 (10.1056/nejm198402233100802).
33. M. Contin, R. Riva, P. Martinelli, F. Albani, A. Baruzzi, Effect of meal timing on the kinetic-dynamic profile of
levodopa/carbidopa controlled release [corrected] in parkinsonian patients. Eur J Clin Pharmacol 54, 303-308 (1998);
published online EpubJun (10.1007/s002280050464).
34. A. P. Denny, M. Behari, Motor fluctuations in Parkinson's disease. Journal of the neurological sciences 165, 18-23
35. D. J. Brooks, Optimizing levodopa therapy for Parkinson's disease with levodopa/carbidopa/entacapone: implications
from a clinical and patient perspective. Neuropsychiatr Dis Treat 4, 39-47 (2008); published online EpubFeb (10.2147/
36. A. Haahr, M. Kirkevold, E. O. Hall, K. Ostergaard, Living with advanced Parkinson's disease: a constant struggle with
unpredictability. J Adv Nurs 67, 408-417 (2011); published online EpubFeb (10.1111/j.1365-2648.2010.05459.x).
37. D. J. Daley, P. K. Myint, R. J. Gray, K. H. Deane, Systematic review on factors associated with medication non-
adherence in Parkinson's disease. Parkinsonism Relat Disord 18, 1053-1061 (2012); published online EpubDec
38. J. E. Thorp, P. G. Adamczyk, H. L. Ploeg, K. A. Pickett, Monitoring Motor Symptoms During Activities of Daily
Living in Individuals With Parkinson's Disease. Front Neurol 9, 1036 (2018) (10.3389/fneur.2018.01036).
39. A. Beuter, R. Edwards, Using frequency domain characteristics to discriminate physiologic and parkinsonian tremors.
Journal of clinical neurophysiology : official publication of the American Electroencephalographic Society 16, 484-494
40. R. Lemoyne, T. Mastroianni, M. Cozza, C. Coroian, W. Grundfest, Implementation of an iPhone for characterizing
Parkinson's disease tremor through a wireless accelerometer application. Conference proceedings : ... Annual
International Conference of the IEEE Engineering in Medicine and Biology Society. IEEE Engineering in Medicine
and Biology Society. Annual Conference 2010, 4954-4958 (2010)
41. B. T. Cole, S. H. Roy, C. J. De Luca, S. H. Nawab, Dynamical learning and tracking of tremor and dyskinesia from
wearable sensors. IEEE transactions on neural systems and rehabilitation engineering : a publication of the IEEE
Engineering in Medicine and Biology Society 22, 982-991 (2014)
42. H. Dai, P. Zhang, T. C. Lueth, Quantitative Assessment of Parkinsonian Tremor Based on an Inertial Measurement
Unit. Sensors (Basel, Switzerland) 15, 25055-25071 (2015)
43. R. LeMoyne, T. Mastroianni, Use of smartphones and portable media devices for quantifying human movement
characteristics of gait, tendon reflex response, and Parkinson's disease hand tremor. Methods in molecular biology
(Clifton, N.J.) 1256, 335-358 (2015)
44. O. Bazgir, J. Frounchi, S. A. H. Habibi, L. Palma, P. Pierleoni, in 2015 22nd Iranian Conference on Biomedical
Engineering (ICBME). (2015), pp. 1-5.
45. H. Zach, M. Dirkx, B. R. Bloem, R. C. Helmich, The Clinical Evaluation of Parkinson's Tremor. J Parkinsons Dis 5,
471-474 (2015) (10.3233/jpd-150650).
46. H. Jeon, W. Lee, H. Park, H. J. Lee, S. K. Kim, H. B. Kim, B. Jeon, K. S. Park, High-accuracy automatic classification
of Parkinsonian tremor severity using machine learning method. Physiological measurement 38, 1980-1999
47. J. Prince, M. de Vos, A Deep Learning Framework for the Remote Detection of Parkinson'S Disease Using Smart-
Phone Sensor Data. Conf Proc IEEE Eng Med Biol Soc 2018, 3144-3147 (2018); published online EpubJul (10.1109/
48. R. A. Ramdhani, A. Khojandi, O. Shylo, B. H. Kopell, Optimizing Clinical Assessments in Parkinson's Disease
Through the Use of Wearable Sensors and Data Driven Modeling. Front Comput Neurosci 12, 72 (2018)10.3389/
49. F. Luft, S. Sharifi, W. Mugge, A. C. Schouten, L. J. Bour, A. F. van Rootselaar, P. H. Veltink, T. Heida, A Power
Spectral Density-Based Method to Detect Tremor and Tremor Intermittency in Movement Disorders. Sensors (Basel)
19, (2019); published online EpubOct 4 (10.3390/s19194301).
50. A. Schrag, N. Quinn, Dyskinesias and motor fluctuations in Parkinson's disease. A community-based study. Brain 123 (
Pt 11), 2297-2305 (2000); published online EpubNov (10.1093/brain/123.11.2297).
51. J. Jankovic, Parkinson's disease: clinical features and diagnosis. J Neurol Neurosurg Psychiatry 79, 368-376 (2008);
published online EpubApr (10.1136/jnnp.2007.131045).
52. R. I. Griffiths, K. Kotschet, S. Arfon, Z. M. Xu, W. Johnson, J. Drago, A. Evans, P. Kempster, S. Raghav, M. K. Horne,
Automated assessment of bradykinesia and dyskinesia in Parkinson's disease. J Parkinsons Dis 2, 47-55
53. S. Arora, V. Venkataraman, S. Donohue, K. M. Biglan, E. R. Dorsey, M. A. Little, in 2014 IEEE International
Conference on Acoustics, Speech and Signal Processing (ICASSP). (2014), pp. 3641-3644.
54. S. Arora, V. Venkataraman, A. Zhan, S. Donohue, K. M. Biglan, E. R. Dorsey, M. A. Little, Detecting and monitoring
the symptoms of Parkinson's disease using smartphones: A pilot study. Parkinsonism & related disorders 21, 650-653
55. M. Braybrook, S. O'Connor, P. Churchward, T. Perera, P. Farzanehfar, M. Horne, An Ambulatory Tremor Score for
Parkinson's Disease. J Parkinsons Dis 6, 723-731 (2016); published online EpubOct 19 (10.3233/JPD-160898).
56. E. Goubault, H. P. Nguyen, S. Bogard, P. J. Blanchet, E. Bezard, C. Vincent, M. Langlois, C. Duval, Cardinal Motor
Features of Parkinson's Disease Coexist with Peak-Dose Choreic-Type Drug-Induced Dyskinesia. J Parkinsons Dis 8,
323-331 (2018) (10.3233/jpd-181312).
57. B. M. Bot, C. Suver, E. C. Neto, M. Kellen, A. Klein, C. Bare, M. Doerr, A. Pratap, J. Wilbanks, E. R. Dorsey, S. H.
Friend, A. D. Trister, The mPower study, Parkinson disease mobile data collected using ResearchKit. Scientific data 3,
160011 (2016)
58. R. Lakshminarayana, D. Wang, D. Burn, K. R. Chaudhuri, C. Galtrey, N. V. Guzman, B. Hellman, J. Ben, S. Pal, J.
Stamford, M. Steiger, R. W. Stott, J. Teo, R. A. Barker, E. Wang, B. R. Bloem, M. van der Eijk, L. Rochester, A.
Williams, Using a smartphone-based self-management platform to support medication adherence and clinical
consultation in Parkinson's disease. NPJ Parkinsons Dis 3, 2 (2017) (10.1038/s41531-016-0003-z).
59. S. Gernon, A. Fowler, K. Lyons, R. Pahwa, Clinical Experience with Personal KinetiGraph Before and After Deep
Brain Stimulation for Parkinson’s Disease (P3.045). Neurology 90, P3.045 (2018).
60. R. Pahwa, S. H. Isaacson, D. Torres-Russotto, F. B. Nahab, P. M. Lynch, K. E. Kotschet, Role of the Personal
KinetiGraph in the routine clinical assessment of Parkinson's disease: recommendations from an expert panel. Expert
Rev Neurother 18, 669-680 (2018); published online EpubAug (10.1080/14737175.2018.1503948).
61. S. H. Isaacson, B. Boroojerdi, O. Waln, M. McGraw, D. L. Kreitzman, K. Klos, F. J. Revilla, D. Heldman, M. Phillips,
D. Terricabras, M. Markowitz, F. Woltering, S. Carson, D. Truong, Effect of using a wearable device on clinical
decision-making and motor symptoms in patients with Parkinson's disease starting transdermal rotigotine patch: A pilot
study. Parkinsonism Relat Disord 64, 132-137 (2019); published online EpubJul (10.1016/j.parkreldis.2019.01.025).
62. H. Khodakarami, P. Farzanehfar, M. Horne, The Use of Data from the Parkinson's KinetiGraph to Identify Potential
Candidates for Device Assisted Therapies. Sensors (Basel) 19, (2019); published online EpubMay 15 (10.3390/
63. M. Linares-Del Rey, L. Vela-Desojo, R. Cano-de la Cuerda, Mobile phone applications in Parkinson's disease: A
systematic review. Neurologia 34, 38-54 (2019); published online EpubJan - Feb (10.1016/j.nrl.2017.03.006).
64. A. Santiago, J. W. Langston, R. Gandhy, R. Dhall, S. Brillman, L. Rees, C. Barlow, Qualitative Evaluation of the
Personal KinetiGraphTM Movement Recording System in a Parkinson's Clinic. J Parkinsons Dis 9, 207-219
65. E. E. Tan, E. J. Hogg, M. Tagliati, The role of Personal KinetiGraph TM fluctuator score in quantifying the progression
of motor fluctuations in Parkinson's disease. Functional neurology 34, 21-28 (2019).
66. S. P. Khor, A. Hsu, The pharmacokinetics and pharmacodynamics of levodopa in the treatment of Parkinson's disease.
Curr Clin Pharmacol 2, 234-243 (2007); published online EpubSep (10.2174/157488407781668802).
67. M. Barichella, E. Cereda, E. Cassani, G. Pinelli, L. Iorio, V. Ferri, G. Privitera, M. Pasqua, A. Valentino, F. Monajemi,
S. Caronni, C. Lignola, C. Pusani, C. Bolliri, S. A. Faierman, A. Lubisco, G. Frazzitta, M. L. Petroni, G. Pezzoli,
Dietary habits and neurological features of Parkinson's disease patients: Implications for practice. Clinical nutrition
(Edinburgh, Scotland) 36, 1054-1061 (2017)
68. S. Fahn, The spectrum of Levodopa-induced dyskinesias. Annals of neurology 47, S2-9; discussion S9 (2000);
published online Epub05/01 (10.1002/1531-8249(200001)47:1<2::AID-ANA2>3.0.CO;2-B).
69. G. Fabbrini, J. M. Brotchie, F. Grandas, M. Nomoto, C. G. Goetz, Levodopa-induced dyskinesias. Mov Disord 22,
1379-1389; quiz 1523 (2007); published online EpubJul 30 (10.1002/mds.21475).
70. A. Latorre, M. C. Bloise, C. Colosimo, F. Di Biasio, G. Defazio, A. Berardelli, G. Fabbrini, Dyskinesias and motor
symptoms onset in Parkinson disease. Parkinsonism & related disorders 20, 1427-1429 (2014)
71. L. Verhagen Metman, A. J. Espay, Teaching Video NeuroImages: The underrecognized diphasic dyskinesia of
Parkinson disease. Neurology 89, e83-e84 (2017); published online EpubAug 15 (10.1212/WNL.0000000000004238).
72. B. Boroojerdi, R. Ghaffari, N. Mahadevan, M. Markowitz, K. Melton, B. Morey, C. Otoul, S. Patel, J. Phillips, E. Sen-
Gupta, O. Stumpp, D. Tatla, D. Terricabras, K. Claes, J. A. Wright, Jr., N. Sheth, Clinical feasibility of a wearable,
conformable sensor patch to monitor motor symptoms in Parkinson's disease. Parkinsonism Relat Disord 61, 70-76
(2019); published online EpubApr (10.1016/j.parkreldis.2018.11.024).
73. J. Cancela, M. Pastorino, M. T. Arredondo, K. S. Nikita, F. Villagra, M. A. Pastor, Feasibility study of a wearable
system based on a wireless body area network for gait assessment in Parkinson's disease patients. Sensors (Basel,
Switzerland) 14, 4618-4633 (2014)
74. S. Del Din, M. Elshehabi, B. Galna, M. A. Hobert, E. Warmerdam, U. Suenkel, K. Brockmann, F. Metzger, C. Hansen,
D. Berg, L. Rochester, W. Maetzler, Gait analysis with wearables predicts conversion to parkinson disease. Ann Neurol
86, 357-367 (2019); published online EpubSep (10.1002/ana.25548).
75. G. Yahalom, Z. Yekutieli, S. Israeli-Korn, S. Elincx-Benizri, V. Livneh, T. Fay-Karmon, K. Tchelet, Y. Rubel, S.
Hassin-Baer, Smartphone Based Timed Up and Go Test Can Identify Postural Instability in Parkinson's Disease. The
Israel Medical Association journal : IMAJ 22, 37-42 (2020).
76. P. Barone, R. Erro, M. Picillo, Quality of Life and Nonmotor Symptoms in Parkinson's Disease. Int Rev Neurobiol 133,
499-516 (2017) (10.1016/bs.irn.2017.05.023).
77. A. J. Espay, J. M. Hausdorff, A. Sanchez-Ferro, J. Klucken, A. Merola, P. Bonato, S. S. Paul, F. B. Horak, J. A.
Vizcarra, T. A. Mestre, R. Reilmann, A. Nieuwboer, E. R. Dorsey, L. Rochester, B. R. Bloem, W. Maetzler, T.
Movement Disorder Society Task Force on, A roadmap for implementation of patient-centered digital outcome
measures in Parkinson's disease obtained using mobile health technologies. Mov Disord 34, 657-663 (2019); published
online EpubMay (10.1002/mds.27671).
78. N. Genes, S. Violante, C. Cetrangol, L. Rogers, E. E. Schadt, Y. Y. Chan, From smartphone to EHR: a case report on
integrating patient-generated health data. NPJ Digit Med 1, 23 (2018) (10.1038/s41746-018-0030-8).
79. A. Henriksen, G. Hartvigsen, S. Grimsgaard, Using Cloud-Based Physical Activity Data from Google Fit and
Apple HealthKit to Expand Recording of Physical Activity Data in a Population Study. Stud Health Technol Inform 245,
108-112 (2017).
80. J. P. Hemming, A. L. Gruber-Baldini, K. E. Anderson, P. S. Fishman, S. G. Reich, W. J. Weiner, L. M. Shulman, Racial
and socioeconomic disparities in parkinsonism. Arch Neurol 68, 498-503 (2011); published online EpubApr (10.1001/
81. A. Saadi, D. U. Himmelstein, S. Woolhandler, N. I. Mejia, Racial disparities in neurologic health care access and
utilization in the United States. Neurology 88, 2268-2275 (2017); published online EpubJun 13 (10.1212/
82. M. Schroeder, "K161717: Personal Kinetigraph (PKG) System Model GKC-2000," (US FDA, Dept Health and Human
Services, 2016).
83. D.-B. Tillman, "DEN180044: ECG App," (US FDA, US FDA CDRH, 2018).
84. D.-B. Tillman, "DEN180042: Irregular Rhythm Notification Feature," (US FDA, US FDA CDRH, 2018).
85. I. Thomas, M. Alam, F. Bergquist, D. Johansson, M. Memedi, D. Nyholm, J. Westin, Sensor-based algorithmic dosing
suggestions for oral administration of levodopa/carbidopa microtablets for Parkinson's disease: a first experience. J
Neurol 266, 651-658 (2019); published online EpubMar (10.1007/s00415-019-09183-6).
86. F. Lipsmeier, K. I. Taylor, T. Kilchenmann, D. Wolf, A. Scotland, J. Schjodt-Eriksen, W.-Y. Cheng, I. Fernandez-
Garcia, J. Siebourg-Polster, L. Jin, J. Soto, L. Verselis, F. Boess, M. Koller, M. Grundman, A. U. Monsch, R. B.
Postuma, A. Ghosh, T. Kremer, C. Czech, C. Gossens, M. Lindemann, Evaluation of smartphone-based testing to
generate exploratory outcome measures in a phase 1 Parkinson's disease clinical trial. Movement disorders : official
journal of the Movement Disorder Society 33, 1287-1297 (2018)
87. A. Abdolahi, N. Scoglio, A. Killoran, E. R. Dorsey, K. M. Biglan, Potential reliability and validity of a modified
version of the Unified Parkinson's Disease Rating Scale that could be administered remotely. Parkinsonism Relat
Disord 19, 218-221 (2013); published online EpubFeb (10.1016/j.parkreldis.2012.10.008).
88. Apple Inc., ORKParkinsonStudy GitHub Open Source Repository (GitHub, 2018).
89. Apple Inc., “Movement Disorder API Developer Documentation” (Apple Developer Documentation, https://
90. W. R. Powers III, M. Etezadi-Amoli, A. V. Ullal, D. Trietsch, S. Kianian, H. A. Pham, Passive Tracking of
Dyskinesia/Tremor Symptoms (US Patent Application 20190365286, filed 12/5/2019).
Acknowledgments: We thank K. Tsou, Y. Zhu, M. Agarwal, G. Blanco, S. Chang, C. Currie, B. Cusack, S. Estoesta,
W. Goh, K. Jessop, J. Tung, G. Valsan, J. Beard, L. Huet and J. Yip for supporting study execution. We also thank R.
Huang, M. O’Reilly, C. Mermel, G. Chi-Johnston, J. Leung, C. Jia, S. Friend, A. Trister, B. Tribble and countless
other academic and industry leaders for many helpful discussions. Most of all, we are grateful to the study
participants who shared their time and experiences with Parkinson’s disease with us.
Funding: The study was funded by Apple, Inc.
Author contributions: R.P., M.E.-A., and A.V.U. contributed to the study design, algorithm design,
data analysis, and writing of the paper. H.A.P. contributed to study design, algorithm design, and writing. E.M.A.
contributed to study design and writing. S.K., I.M., and M.G. contributed to study design, data collection, data
analysis of supplementary material, and writing. P.T.L., N.H., T.M.H., and S.B. contributed to study execution,
clinical input, and writing. D.T. contributed to system design and implementation, and data collection. A.S.A.
supported algorithm design and reference data collection. J.D.K. contributed to writing.
Competing interests: All authors with Apple affiliations are either current or former Apple
employees and are Apple shareholders. P.T.L. has received clinical trial support from US WorldMeds, Acadia
Pharmaceuticals and BioIVT. R.P., M.E.-A., A.V.U., D.T., S.K., and H.A.P. are inventors on patent application US 20190365286
submitted by Apple, Inc. that covers passive tracking of tremor and dyskinesia symptoms. There are no other
conflicts of interest to declare.
Data and materials availability: All data associated with this study are present in the paper or the Supplementary
Materials. Raw data cannot be publicly provided due to restrictions imposed by
the informed consent and the risk of re-identification. All symptom profiles for consented
subjects have been made available with corresponding medication schedules in supplementary
figures for review. Algorithm outputs can be accessed by submitting a request for the Movement
Disorder API entitlement (see for full API documentation). Details about
this algorithm can also be found in the patent application “Passive Tracking of Dyskinesia/Tremor
Symptoms” (US20190365286. Dec 5, 2019. U. S. Patent Office).
... Wearable devices can continuously and objectively monitor changes in a patient's condition in daily life outside the hospital through 24 h continuous monitoring. Indeed, the Apple Watch offers a movement disorder API that can distinguish between tremors and dyskinesia and monitor them separately 17) . Digital recording of an individual's movement symptoms is also expected to serve as a digital biomarker and is expected to become the basis for personalized medicine in the future 18) . ...
Full-text available
The coronavirus disease 2019 pandemic has uncovered several inherent problems in society. While the demand for telemedicine surged worldwide and some countries responded flexibly, in Japan, most telemedicine services were limited to telephone consultations, and full-fledged telemedicine did not become widespread. In addition, the digitalization process in both medicine and wider society lags behind some other nations. It is necessary to accelerate digital transformation in healthcare to build a sustainable society that is resilient to crises, such as new pandemics. In particular, as Japan is facing an issue of super-aged society, a sustainable care model for people with Parkinson’s disease, dementia, and intractable neurological diseases should be established. Many neurodegenerative and intractable neurological diseases are progressive; as the disease progresses, patients could become difficult to visit specialists. Although online medical care has many advantages, it does not provide the same quality of information as face-to-face consultations. However, new technology can overcome the limitations of online medical care. As an evolutionary direction for telemedicine, three-dimensional telemedicine technologies are being developed, which enable online medical treatment to be delivered as if the patient was sharing the same space. Telemonitoring can enable the objective and continuous evaluation of patient information at home through the use of motion capture, wearable devices, and other devices. The advancement of digital transformation in medical care should be a game-changer in accumulating big data and analyzing it using artificial intelligence.
... Further, the smartwatch-technology is a proved relevant source of objective motility monitoring, allowing great precision in recording subtle changes in any patients' home environment. In large cohorts of patients affected by Parkinson's disease, smartwatch accelerometer data could quantify low hand-tremor amplitudes and frequencies with high accuracy (Hadley et al., 2021;Powers et al., 2021). ...
Full-text available
Objective: In the field of non-treatable muscular dystrophies, promising new gene and cell therapies are being developed and are entering clinical trials. Objective assessment of therapeutic effects on motor function is mandatory for economical and ethical reasons. Main shortcomings of existing measurements are discontinuous data collection in artificial settings as well as a major focus on walking, neglecting the importance of hand and arm movements for patients’ independence. We aimed to create a digital tool to measure muscle function with an emphasis on upper limb motility. Methods : suMus provides a custom-made App running on smartwatches. Movement data are sent to the backend of a suMus web-based platform, from which they can be extracted as CSV data. Fifty patients with neuromuscular diseases assessed the pool of suMus activities in a first orientation phase. suMus performance was hence validated in four upper extremity exercises based on the feedback of the orientation phase. We monitored the arm metrics in a cohort of healthy volunteers using the suMus application, while completing each exercise at low frequency in a metabolic chamber. Collected movement data encompassed average acceleration, rotation rate as well as activity counts. Spearman rank tests correlated movement data with energy expenditure from the metabolic chamber. Results: Our novel application “suMus,” sum of muscle activity, collects muscle movement data plus Patient-Related-Outcome-Measures, sends real-time feedback to patients and caregivers and provides, while ensuring data protection, a long-term follow-up of disease course. The application was well received from the patients during the orientation phase. In our pilot study, energy expenditure did not differ between overnight fasted and non-fasted participants. Acceleration ranged from 1.7 ± 0.7 to 3.2 ± 0.5 m/sec ² with rotation rates between 0.9 ± 0.5 and 2.0 ± 3.4 rad/sec. 
Acceleration and rotation rate as well as derived activity counts correlated with energy expenditure values measured in the metabolic chamber for one exercise (r = 0.58, p < 0.03). Conclusion: In the analysis of slow frequency movements of upper extremities, the integration of the suMus application with smartwatch sensors characterized motion parameters, thus supporting a use in clinical trial outcome measures. Alternative methodologies need to complement indirect calorimetry in validating accelerometer-derived energy expenditure data.
... 20 Wearable devices such as smartwatches have the additional benefit of combining survey questions with passively collected sensor data which can be leveraged to detect changes in activity levels and physiological measures such as heart rate and rhythm. 24,25 We have conducted a smartwatch feasibility study (Watch Your Steps) in order to explore how we might harness the potential of smartwatches to explore longitudinal symptom patterns in MLTC-M populations. Participants with MLTC-M submitted their daily ratings of a range of symptoms via a consumer smartwatch touch face for 90 days. ...
Full-text available
Introduction People living with multiple long-term conditions (MLTC-M) (multimorbidity) experience a range of inter-related symptoms. These symptoms can be tracked longitudinally using consumer technology, such as smartphones and wearable devices, and then summarised to provide useful clinical insight. Aim We aimed to perform an exploratory analysis to summarise the extent and trajectory of multiple symptom ratings tracked via a smartwatch, and to investigate the relationship between these symptom ratings and demographic factors in people living with MLTC-M in a feasibility study. Methods ‘Watch Your Steps’ was a prospective observational feasibility study, administering multiple questions per day over a 90 day period. Adults with more than one clinician-diagnosed long-term condition rated seven core symptoms each day, plus up to eight additional symptoms personalised to their LTCs per day. Symptom ratings were summarised over the study period at the individual and group level. Symptom ratings were also plotted to describe day-to-day symptom trajectories for individuals. Results Fifty two participants submitted symptom ratings. Half were male and the majority had LTCs affecting three or more disease areas (N = 33, 64%). The symptom rated as most problematic was fatigue. Patients with increased comorbidity or female sex seemed to be associated with worse experiences of fatigue. Fatigue ratings were strongly correlated with pain and level of dysfunction. Conclusion In this study we have shown that it is possible to collect and descriptively analyse self reported symptom data in people living with MLTC-M, collected multiple times per day on a smartwatch, to gain insights that might support future clinical care and research.
... This study provides pragmatic insights to expand the use of remote evaluations for clinical research in MS, whether observational or interventional-and adding to the repertoire of remote research assessments for other neurologic conditions. [33][34][35] In prospective observational studies, flexibility in transitioning some episodic evaluations to remote monitoring could support recruitment of more diverse (racial, disability, geographic, age, etc.) participants, promote participant access and convenience, maintain longitudinal engagement, and reduce study costs while maintaining the episodic in-person collection of biosamples, imaging, and objective assessments. Remote collection of biosensor data would naturally support and enrich the analyses. ...
Full-text available
Background and Objectives Prospective, deeply phenotyped research cohorts monitoring individuals with chronic neurologic conditions, such as multiple sclerosis (MS), depend on continued participant engagement. The COVID-19 pandemic restricted in-clinic research activities, threatening this longitudinal engagement, but also forced adoption of televideo-enabled care. This offered a natural experiment in which to analyze key dimensions of remote research: (1) comparison of remote vs in-clinic visit costs from multiple perspectives and (2) comparison of the remote with in-clinic measures in cross-sectional and longitudinal disability evaluations. Methods Between March 2020 and December 2021, 207 MS cohort participants underwent hybrid in-clinic and virtual research visits; 96 contributed 100 “matched visits,” that is, in-clinic (Neurostatus-Expanded Disability Status Scale [NS-EDSS]) and remote (televideo-enabled EDSS [tele-EDSS]; electronic patient-reported EDSS [ePR-EDSS]) evaluations. Clinical, demographic, and socioeconomic characteristics of participants were collected. Results The costs of remote visits were lower than in-clinic visits for research investigators (facilities, personnel, parking, participant compensation) but also for participants (travel, caregiver time) and carbon footprint ( p < 0.05 for each). Median cohort EDSS was similar between the 3 modalities (NS-EDSS: 2, tele-EDSS: 1.5, ePR-EDSS: 2, range 0.6.5); the remote evaluations were each noninferior to the NS-EDSS within ±0.5 EDSS point (TOST for noninferiority, p < 0.01 for each). Furthermore, year to year, the % of participants with worsening/stable/improved EDSS scores was similar, whether each annual evaluation used NS-EDSS or whether it switched from NS-EDSS to tele-EDSS. Discussion Altogether, the current findings suggest that remote evaluations can reduce the costs of research participation for patients, while providing a reasonable evaluation of disability trajectory longitudinally. 
This could inform the design of remote research that is more inclusive of diverse participants.
... 16,17 The devices are familiar to many, user friendly, have standardized software upgrades, enable remote data capture, and can inform individuals of results. 11 Devices and data plans were provided to minimize effects of variable access to technology or internet based on socioeconomic status or geographic location. Perhaps re ecting these advantages, interest among sites and participants was high, and enrollment was completed in a timely manner despite the COVID-19 pandemic. ...
Full-text available
Digital health technologies can provide continuous monitoring and objective, real world measures of Parkinson’s disease (PD), but have primarily been evaluated in small, single-site studies. In this 12-month, multicenter observational study, we evaluated whether a smartwatch and smartphone application could measure features of early PD. 82 individuals with early, untreated PD and 50 age-matched controls wore research-grade sensors, a smartwatch, and a smartphone while performing standardized assessments in clinic. At home, participants wore the smartwatch for seven days after each clinic visit and completed motor, speech and cognitive tasks on the smartphone every other week. Features derived from the devices, particularly arm swing, proportion of time with tremor, and finger tapping, differed significantly between individuals with early PD and age-matched controls and had variable correlation with traditional assessments. Longitudinal assessments will inform the value of these digital measures for use in future clinical trials.
... An example is the Apple Heart [43] that studies atrial fibrillation and applies an algorithm that uses pulse rate data to identify and evaluate "atrial fibrillation" in a large group of Apple Watch users. Another example is the apps that investigate the smartwatch for monitoring fluctuations in Parkinson's disease [44,45,46,47]. Using the device has proven to be efficacious in patient-clinician communication, allowing for monitoring motor symptoms in Parkinson's disease. ...
Full-text available
Smartwatches (SWs) can continuously and autonomously monitor vital signs, including heart rates and physical activities involving wrist movement. The monitoring capability of SWs has several key health benefits arising from their role in preventive and diagnostic medicine. Current research, however, has not explored many of these opportunities, including longitudinal studies. In our work, we gathered longitudinal data points, e.g., heart rate and physical activity, from various brands of SWs worn by 1,014 users. Our analysis shows three common heart rate patterns during sleep but two common patterns during the day. We find that heart rate and physical activities are higher in summer and the first month of the new year compared to other months. Moreover, physical activities are reduced on weekends compared with weekdays. Interestingly, the highest peak of physical activity is during the evening.
... Some have proposed standardized templates in EHR documentation to help clinicians input patient-reported wearable data [4]. In much of the research literature, custom interfaces have been developed on top of the EHR to visualize wearables data [19,20]. However, there is now a growing trend toward direct EHR integrations, which will greatly accelerate clinician adoption [21]. ...
Despite the rapid growth of wearables as a consumer technology sector and a growing evidence base supporting their use, they have been slow to be adopted by the health system into clinical care. As regulatory, reimbursement and technical barriers recede, a persistent challenge remains how to make wearable data actionable for clinicians - transforming disconnected grains of wearable data into meaningful clinical ‘pearls’. In order to bridge this adoption gap, wearable data must become visible, interpretable and actionable for the clinician. We showcase emerging trends and best practices that illustrate these three pillars, and offer some recommendations on how the ecosystem can move forward.
Full-text available
Neurological and psychiatric diseases have high degrees of genetic and pathophysiological heterogeneity, irrespective of clinical manifestations. Traditional medical paradigms have focused on late-stage syndromic aspects of these diseases, with little consideration of the underlying biology. Advances in disease modeling and methodological design have paved the way for the development of precision medicine (PM), an established concept in oncology with growing attention from other medical specialties. We propose a PM architecture for central nervous system diseases built on four converging pillars: multimodal biomarkers, systems medicine, digital health technologies, and data science. We discuss Alzheimer’s disease (AD), an area of significant unmet medical need, as a case-in-point for the proposed framework. AD can be seen as one of the most advanced PM-oriented disease models and as a compelling catalyzer towards PM-oriented neuroscience drug development and advanced healthcare practice.
Full-text available
Machine learning represents a growing subfield of artificial intelligence with much promise in the diagnosis, treatment, and tracking of complex conditions, including neurodegenerative disorders such as Alzheimer's and Parkinson's diseases. While no definitive methods of diagnosis or treatment exist for either disease, researchers have implemented machine learning algorithms with neuroimaging and motion-tracking technology to analyze pathologically relevant symptoms and biomarkers. Deep learning algorithms such as neural networks and complex combined architectures have proven capable of tracking disease-linked changes in brain structure and physiology as well as patient motor and cognitive symptoms and responses to treatment. However, such techniques require further development aimed at improving transparency, adaptability, and reproducibility. In this review, we provide an overview of existing neuroimaging technologies and supervised and unsupervised machine learning techniques with their current applications in the context of Alzheimer's and Parkinson's diseases.
Full-text available
Parkinson’s disease is a neurodegenerative disorder impacting patients’ movement, causing a variety of movement abnormalities. It has been the focus of research studies for early detection based on wearable technologies. The benefit of wearable technologies in the domain rises by continuous monitoring of this population’s movement patterns over time. The ubiquity of wrist-worn accelerometry and the fact that the wrist is the most common and acceptable body location to wear the accelerometer for continuous monitoring suggests that wrist-worn accelerometers are the best choice for early detection of the disease and also tracking the severity of it over time. In this study, we use a dataset consisting of one-week wrist-worn accelerometry data collected from individuals with Parkinson’s disease and healthy elderlies for early detection of the disease. Two feature engineering methods, including epoch-based statistical feature engineering and the document-of-words method, were used. Using various machine learning classifiers, the impact of different windowing strategies, using the document-of-words method versus the statistical method, and the amount of data in terms of number of days were investigated. Based on our results, PD was detected with the highest average accuracy value (85% ± 15%) across 100 runs of SVM classifier using a set of features containing features from every and all windowing strategies. We also found that the document-of-words method significantly improves the classification performance compared to the statistical feature engineering model. Although the best performance of the classification task between PD and healthy elderlies was obtained using seven days of data collection, the results indicated that with three days of data collection, we can reach a classification performance that is not significantly different from a model built using seven days of data collection.
Full-text available
Digital health technology tools (DHTT) are technologies such as apps, smartphones, and wearables that remotely acquire health-related information from individuals. They have the potential advantages of objectivity and sensitivity of measurement, richness of high-frequency sensor data, and opportunity for passive collection of health-related data. Thus, DHTTs promise to provide patient phenotyping at an order of granularity several times greater than is possible with traditional clinical research tools. While the conceptual development of novel DHTTs is keeping pace with technological and analytical advancements, an as yet unaddressed gap is how to develop robust and meaningful outcome measures based on sensor data. Here, we describe two roadmaps which were developed to generate outcome measures based on DHTT data: one using a data-centric approach and the second a patient-centric approach. The data-centric approach to develop digital outcome measures summarizes those sensor features maximally sensitive to the concept of interest, exemplified with the quantification of disease progression. The patient-centric approach summarizes those sensor features that are optimally relevant to patients’ functioning in everyday life. Both roadmaps are exemplified for use in tracking disease progression in observational and clinical interventional studies, and with a DHTT designed to evaluate motor symptom severity and symptom experience in Parkinson’s disease. Use cases other than disease progression (e.g., case-finding) are considered summarily. DHTT research requires methods to summarize sensor data into meaningful outcome measures. It is hoped that the concepts outlined here will encourage a scientific discourse and eventual consensus on the creation of novel digital outcome measures for both basic clinical research and clinical drug development.
Full-text available
Background: The emergence of new technologies measuring outcomes in Parkinson's disease (PD) to complement the existing clinical rating scales has introduced the possibility of measurement occurring in patients' own homes whilst they freely live and carry out normal day-to-day activities. Objective: This systematic review seeks to provide an overview of what technology is being used to test which outcomes in PD from free-living participant activity in the setting of the home environment. Additionally, this review seeks to form an impression of the nature of validation and clinimetric testing carried out on the technological device(s) being used. Methods: Five databases (Medline, Embase, PsycInfo, Cochrane and Web of Science) were systematically searched for papers dating from 2000. Study eligibility criteria included: adults with a PD diagnosis; the use of technology; the setting of a home or home-like environment; outcomes measuring any motor and non-motor aspect relevant to PD, as well as activities of daily living; unrestricted/unscripted activities undertaken by participants. Results: 65 studies were selected for data extraction. There were wide varieties of participant sample sizes (<10 up to hundreds) and study durations (<2 weeks up to a year). The metrics evaluated by technology, largely using inertial measurement units in wearable devices, included gait, tremor, physical activity, bradykinesia, dyskinesia and motor fluctuations, posture, falls, typing, sleep and activities of daily living. Conclusions: Home-based free-living testing in PD is being conducted by multiple groups with diverse approaches, focussing mainly on motor symptoms and sleep.
Accurately monitoring motor and non-motor symptoms as well as complications in people with Parkinson’s disease (PD) is a major challenge, both during clinical management and when conducting clinical trials investigating new treatments. A variety of strategies have been relied upon including questionnaires, motor diaries, and the serial administration of structured clinical exams like part III of the MDS-UPDRS. To evaluate the potential use of mobile and wearable technologies in clinical trials of new pharmacotherapies targeting PD symptoms, we carried out a project (project BlueSky) encompassing four clinical studies, in which 60 healthy volunteers (aged 23–69; 33 females) and 95 people with PD (aged 42–80; 37 females; years since diagnosis 1–24 years; Hoehn and Yahr 1–3) participated and were monitored in either a laboratory environment, a simulated apartment, or at home and in the community. In this paper, we investigated (i) the utility and reliability of self-reports for describing motor fluctuations; (ii) the agreement between participants and clinical raters on the presence of motor complications; (iii) the ability of video raters to accurately assess motor symptoms, and (iv) the dynamics of tremor, dyskinesia, and bradykinesia as they evolve over the medication cycle. Future papers will explore methods for estimating symptom severity based on sensor data. We found that 38% of participants who were asked to complete an electronic motor diary at home missed ~25% of total possible entries and otherwise made entries with an average delay of >4 h. During clinical evaluations by PD specialists, self-reports of dyskinesia were marked by ~35% false negatives and 15% false positives. Compared with live evaluation, the video evaluation of part III of the MDS-UPDRS significantly underestimated the subtle features of tremor and extremity bradykinesia, suggesting that these aspects of the disease may be underappreciated during remote assessments. On the other hand, live and video raters agreed on aspects of postural instability and gait. Our results highlight the significant opportunity for objective, high-resolution, continuous monitoring afforded by wearable technology to improve upon the monitoring of PD symptoms.
The increasing prevalence of neurological diseases driven by population aging demands new strategies for disease management. In Parkinson's disease (PD), these strategies should aim at improving diagnosis accuracy and the frequency of clinical follow-up by means of decentralized, cost-effective solutions. In this context, a system suitable for the remote monitoring of PD subjects is presented. It consists of the integration of two approaches investigated in our previous works, each one appropriate for the movement analysis of specific parts of the body: low-cost optical devices for the upper limbs and wearable sensors for the lower ones. The system performs automated assessments of six motor tasks of the Unified Parkinson's Disease Rating Scale, and it is equipped with a gesture-based human-machine interface designed to facilitate user interaction and system management. The usability of the system has been evaluated by means of standard questionnaires, and the accuracy of the automated assessment has been verified experimentally. The results demonstrate that the proposed solution represents a substantial improvement in PD assessment with respect to the former two approaches treated separately, and a new example of an accurate, feasible, and cost-effective means for the decentralized management of PD.
There is no objective gold standard to detect tremors. This concerns not only the choice of the algorithm and sensors; methods are also often designed to detect tremors in one specific group of patients during the performance of a specific task. Therefore, the aim of this study is twofold. First, an objective quantitative method to detect tremor windows (TWs) in accelerometer and electromyography recordings is introduced. Second, the tremor stability index (TSI) is determined to indicate the advantage of detecting TWs prior to analysis. Ten Parkinson’s disease (PD) patients, ten essential tremor (ET) patients, and ten healthy controls (HC) performed a resting, postural, and movement task. Data were split into 3-s windows, and the power spectral density was calculated for each window. The relative power around the peak frequency with respect to the power in the tremor band was used to classify the windows as either tremor or non-tremor. The method yielded a specificity of 96.45%, sensitivity of 84.84%, and accuracy of 90.80% for tremor detection. During tremors, significant differences were found between groups in all three parameters. The results suggest that the introduced method could be used to determine under which conditions and to what extent undiagnosed patients exhibit tremors.
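The windowed spectral classification described in this abstract can be sketched in a few lines. The sampling rate, tremor band, peak half-width, and relative-power threshold below are illustrative assumptions, not the study's tuned values:

```python
import numpy as np
from scipy.signal import welch

FS = 100                  # sampling rate in Hz (assumed)
TREMOR_BAND = (3.0, 8.0)  # tremor band in Hz (assumed)
PEAK_HALFWIDTH = 0.5      # band around the peak frequency in Hz (assumed)
THRESHOLD = 0.7           # relative-power cutoff (assumed)

def is_tremor_window(window, fs=FS):
    """Classify one 3-s recording window as tremor / non-tremor from the
    fraction of tremor-band power concentrated around the spectral peak."""
    freqs, psd = welch(window, fs=fs, nperseg=len(window))
    in_band = (freqs >= TREMOR_BAND[0]) & (freqs <= TREMOR_BAND[1])
    band_power = psd[in_band].sum()
    if band_power == 0.0:
        return False
    peak_freq = freqs[in_band][np.argmax(psd[in_band])]
    near_peak = in_band & (np.abs(freqs - peak_freq) <= PEAK_HALFWIDTH)
    return psd[near_peak].sum() / band_power >= THRESHOLD
```

A window dominated by a 5 Hz oscillation concentrates nearly all band power at the peak and classifies as tremor, while broadband noise spreads its power across the band and falls below the threshold.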
Tremor is one of the main symptoms of Parkinson’s Disease (PD) that reduces the quality of life. Tremor is measured as part of the Unified Parkinson Disease Rating Scale (UPDRS) part III. However, the assessment is based on onsite physical examinations and does not fully represent the patients’ tremor experience in their day-to-day life. Our objective in this paper was to develop algorithms that, combined with wearable sensors, can estimate total Parkinsonian tremor as the patients performed a variety of free body movements. We developed two methods: an ensemble model based on gradient tree boosting and a deep learning model based on long short-term memory (LSTM) networks. The developed methods were assessed on gyroscope sensor data from 24 PD subjects. Our analysis demonstrated that the method based on gradient tree boosting provided a high correlation (r = 0.96 using held-out testing and r = 0.93 using subject-based, leave-one-out cross-validation) between the estimated and clinically assessed tremor subscores in comparison to the LSTM-based method with a moderate correlation (r = 0.84 using held-out testing and r = 0.77 using subject-based, leave-one-out cross-validation). These results indicate that our approach holds great promise in providing a full spectrum of the patients’ tremor from continuous monitoring of the subjects’ movement in their natural environment.
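The subject-based, leave-one-out evaluation of the gradient-boosting approach described above can be sketched with scikit-learn. The data here are synthetic stand-ins (random features with an artificial target), not the study's gyroscope features or tremor subscores:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import LeaveOneGroupOut

# Synthetic stand-in: per-window "gyroscope features" and a proxy tremor
# subscore for a handful of subjects (all values illustrative only).
rng = np.random.default_rng(42)
n_subjects, windows_per_subject, n_features = 8, 30, 6
groups = np.repeat(np.arange(n_subjects), windows_per_subject)
X = rng.standard_normal((n_subjects * windows_per_subject, n_features))
y = 2.0 * X[:, 0] + X[:, 1] + 0.1 * rng.standard_normal(len(X))

# Leave one subject out per fold, so the model is always scored on an
# unseen subject, as in the study's cross-validation design.
preds = np.empty_like(y)
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
    model = GradientBoostingRegressor(n_estimators=100, max_depth=3)
    model.fit(X[train_idx], y[train_idx])
    preds[test_idx] = model.predict(X[test_idx])

# Pearson correlation between held-out predictions and "clinical" scores.
r = np.corrcoef(y, preds)[0, 1]
```

Grouping the folds by subject rather than by window is the key design choice: it prevents windows from the same person appearing in both train and test sets, which would inflate the reported correlation.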
Objective: Quantification of gait with wearable technology is promising; recent cross-sectional studies showed that gait characteristics are potential prodromal markers for Parkinson's disease (PD). The aim of this longitudinal prospective observational study was to establish gait impairments and trajectories in the prodromal phase of PD, identifying which gait characteristics are potentially early diagnostic markers of PD. Methods: 696 healthy controls (HC, 63±7 years) recruited in the TREND study were included. Assessments were performed longitudinally four times at 2-year intervals and people who converted to PD identified. Participants were asked to walk at different speeds under single- and dual-tasking, with a wearable placed on the lower back; fourteen validated clinically relevant gait characteristics were quantified. Cox regression was used to examine whether gait at first visit could predict time to PD conversion after controlling for age and sex. Random effects linear mixed-models (RELMs) were used to establish longitudinal trajectories of gait and model the latency between impaired gait and PD diagnosis. Results: Sixteen participants were diagnosed with PD on average 4.5 years after first visit (converters, PDC). Higher step time variability and asymmetry of all gait characteristics were associated with a shorter time to PD diagnosis. RELMs indicated gait (lower pace) deviates from non-PDC approximately four years prior to diagnosis. Interpretation: Together with other prodromal markers, quantitative gait characteristics can play an important role to identify prodromal PD and progression within this phase.
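The two baseline markers the study highlights, step time variability and asymmetry, are simple summary statistics once step durations are available. A minimal sketch, assuming step durations have already been segmented from the lower-back wearable and that alternating steps correspond to left/right; definitions vary across studies, and these are not necessarily the TREND study's exact formulas:

```python
import numpy as np

def gait_markers(step_times):
    """Step-time variability and asymmetry from a sequence of step
    durations (s). Variability is taken as the sample standard deviation;
    asymmetry as the absolute difference between the mean durations of
    alternating (assumed left/right) steps -- both are illustrative choices."""
    step_times = np.asarray(step_times, dtype=float)
    variability = step_times.std(ddof=1)
    left, right = step_times[0::2], step_times[1::2]
    asymmetry = abs(left.mean() - right.mean())
    return variability, asymmetry
```

A perfectly regular gait yields zero on both markers; a gait that alternates between short and long steps yields high values on both, the pattern the study associates with shorter time to PD diagnosis.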
Device-assisted therapies (DAT) benefit people with Parkinson's disease (PwP), but many referrals for DAT are unsuitable or too late, and a screening tool to aid in identifying candidates would be helpful. This study aimed to produce such a screening tool by building a classifier that models specialist identification of suitable DAT candidates. To our knowledge, this is the first objective decision tool for managing DAT referral. Subjects were randomly assigned to either a construction set (n = 112, to train, develop, cross validate, and then evaluate the classifier’s performance) or to a test set (n = 60 to test the fully specified classifier), resulting in a sensitivity and specificity of 89% and 86.6%, respectively. The classifier’s performance was then assessed in PwP who underwent deep brain stimulation (n = 31), were managed in a non-specialist clinic (n = 81) or in PwP in the first five years from diagnosis (n = 22). The classifier identified 87%, 92%, and 100% of the candidates referred for DAT in each of the above clinical settings, respectively. Furthermore, the classifier score changed appropriately when therapeutic intervention resolved troublesome fluctuations or dyskinesia that would otherwise have required DAT. This study suggests that information from objective measurement could improve timely referral for DAT.
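The sensitivity and specificity quoted for the screening classifier follow the standard confusion-matrix definitions. A minimal helper, with the label convention assumed here to be 1 = suitable DAT candidate:

```python
import numpy as np

def sensitivity_specificity(y_true, y_pred):
    """Sensitivity (true-positive rate) and specificity (true-negative rate)
    for a binary screening classifier."""
    y_true = np.asarray(y_true, dtype=bool)
    y_pred = np.asarray(y_pred, dtype=bool)
    tp = np.sum(y_true & y_pred)    # candidates correctly flagged
    fn = np.sum(y_true & ~y_pred)   # candidates missed
    tn = np.sum(~y_true & ~y_pred)  # non-candidates correctly passed over
    fp = np.sum(~y_true & y_pred)   # non-candidates incorrectly flagged
    return tp / (tp + fn), tn / (tn + fp)
```

For a screening tool of this kind, sensitivity governs how many suitable candidates reach a specialist, while specificity governs how many unnecessary referrals the tool generates.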
Background: There is a need for standardized and objective methods to measure postural instability (PI) and gait dysfunction in Parkinson's disease (PD) patients. Recent technological advances in wearable devices, including standard smartphones, may provide such measurements. Objectives: To test the feasibility of smartphones to detect PI during the Timed Up and Go (TUG) test. Methods: Ambulatory PD patients, divided by item 30 (postural stability) of the motor Unified Parkinson's Disease Rating Scale (UPDRS) into those with a normal (score = 0, PD-NPT) and an abnormal (score ≥ 1, PD-APT) test, and a group of healthy controls (HC) performed a 10-meter TUG while motion sensor data was recorded from a smartphone attached to their sternum using the EncephaLog application. Results: In this observational study, 44 PD patients (21 PD-NPT and 23 PD-APT) and 22 HC similar in age and gender distribution were assessed. PD-APT differed significantly in all gait parameters when compared to PD-NPT and HC. Significant differences between PD-NPT and HC included only turning time (P < 0.006) and step-to-step correlation (P < 0.05). Conclusions: While high correlations were found between EncephaLog gait parameters and axial UPDRS items, the pull test was least correlated with EncephaLog measures. Motion sensor data from a smartphone can detect differences in gait and balance measures between PD with and without PI and HC.
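Gait parameters of the kind extracted during a TUG test typically start from step detection on the body-worn accelerometer signal. A minimal sketch using peak finding on the acceleration magnitude; the sampling rate, prominence, and minimum step spacing are illustrative assumptions, not the EncephaLog application's actual parameters:

```python
import numpy as np
from scipy.signal import find_peaks

FS = 100  # smartphone IMU sampling rate in Hz (assumed)

def count_steps(accel_magnitude, fs=FS):
    """Rough step count from the acceleration magnitude of a chest-worn
    smartphone. Each sufficiently prominent peak, at least 0.3 s after the
    previous one, is counted as a step (thresholds are illustrative)."""
    signal = accel_magnitude - np.mean(accel_magnitude)
    peaks, _ = find_peaks(signal, prominence=0.5, distance=int(0.3 * fs))
    return len(peaks)
```

From the detected step peaks, quantities such as step time variability and step-to-step correlation follow directly from the inter-peak intervals.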
Motor fluctuations (MF) are important determinants of quality of life in Parkinson's disease (PD). To determine whether the Personal KinetiGraph (PKG), a wearable motion tracking device, can define MF progression, we correlated PKG fluctuator scores (FS) with clinical motor fluctuator profiles in a case-control cohort study. 54 subjects completed a 6-day PKG trial and completed a standardized motor diary. We distinguished non-fluctuators (NF), early (EF), moderate (MF) and troublesome fluctuators (TF), based on Wearing Off Questionnaire and Movement Disorders Society-Unified Parkinson's Disease Rating Scale scores. PKG FS significantly differentiated EF and TF, as well as dyskinetic and non-dyskinetic subjects. Motor diaries could not distinguish the four study groups on the basis of average OFF time, while average time with dyskinesia distinguished NF and MF. In conclusion, PKG FS can distinguish EF from TF, as well as dyskinetic from non-dyskinetic patients, but cannot discriminate subtler MF. PKG may provide objective MF measures for routine PD management and clinical trials.