May 2019
BACKGROUND: Technology-enabled ecological momentary assessment (EMA) facilitates the calibration of physiological signals against self-reported data and contexts. However, research using this method rarely considers the impact that user experience (UX) has on data quality.

OBJECTIVE: The purpose of this study is to explore the biases that UX factors induce in self-reported data and physiological signals collected through EMA, and to identify the UX factors with the largest impact on the data.

METHODS: We conducted a retrospective analysis of data from a field feasibility study that used a smartwatch application to measure heart rate variability (HRV) and collect self-reported stress levels. We collected data on event types, age, sex, and personality traits, and engineered 66 UX features (e.g., number of screens viewed, perceived notification frequency). We used a series of random forest models, conditional forest models, linear regression models, and correlation analyses to predict self-reported stress, HRV, and the discrepancies between them, and then used iterated comparative analysis to confirm the effects of UX factors.

RESULTS: Analysis of 1240.6 hours of data from 29 participants reveals that self-reported stress correlates with the HRV signal collected after the EMA notification (HRV2) but not with the HRV signal collected before the notification (HRV1) or after the user interaction starts (HRV3). UX factors explain 6.6%-10% (P<.001) of the variation in self-reported stress. UX factors do not significantly predict the HRV signals themselves but explain 63.8% (P<.001) of the difference between self-reported stress and HRV2. In addition, UX factors have a significant but smaller delayed effect on self-reported stress and on HRV signals collected in the next user interaction cycle.
In almost all models, UX features rank higher in feature importance than the other confounding factors (i.e., age, sex, and personality traits) and, in some models, higher than the main effect (i.e., event types). We discuss specific symptoms of UX-induced biases related to EMA instrument design and study design, the mere-measurement effect, and the observer effect, and propose topics of examination for future studies.

CONCLUSIONS: User experience may induce biases in data collected through the technology-enabled EMA method. In some cases, the impact of these biases may be larger than that of the main effect, of other confounding factors, and of the corresponding data used for calibration.
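The kind of analysis the abstract describes, using a random forest to estimate the share of variance in self-reported stress explained by UX features and to rank their feature importance, can be sketched as follows. This is a hypothetical illustration on synthetic data, not the authors' code; the feature names and coefficients are assumptions, and scikit-learn stands in for whatever software the study actually used.

```python
# Hypothetical sketch: how much variance in a self-reported stress score
# a random forest attributes to UX-style features, plus feature ranking.
# Data here is synthetic; it does not reproduce the study's results.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 300

# Stand-ins for engineered UX features (e.g., number of screens viewed,
# perceived notification frequency) — purely illustrative.
ux_features = rng.normal(size=(n, 4))
# Synthetic self-reported stress with a weak dependence on two features,
# loosely mimicking a small explained-variance setting.
stress = 0.3 * ux_features[:, 0] - 0.2 * ux_features[:, 1] \
    + rng.normal(scale=1.0, size=n)

model = RandomForestRegressor(n_estimators=200, random_state=0)

# Cross-validated R^2 approximates "% of variation explained".
r2 = cross_val_score(model, ux_features, stress, cv=5, scoring="r2").mean()
print(f"cross-validated R^2 (variance explained): {r2:.3f}")

# Impurity-based importances give the per-feature ranking the abstract
# compares against confounders such as age, sex, and personality traits.
model.fit(ux_features, stress)
print("feature importances:", model.feature_importances_)
```

A conditional forest (as also used in the study) would replace the impurity-based importances with conditional permutation importance, which is less biased when features are correlated.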