Evaluating Alert Fatigue Over Time to EHR-Based Clinical Trial Alerts: Findings from a Randomized Controlled Study

Departments of Biomedical Informatics and Internal Medicine, College of Medicine, The Ohio State University, Columbus, Ohio, USA.
Journal of the American Medical Informatics Association (Impact Factor: 3.5). 04/2012; 19(e1):e145-e148. DOI: 10.1136/amiajnl-2011-000743
Source: PubMed


OBJECTIVE: Inadequate participant recruitment is a major problem facing clinical research. Recent studies have demonstrated that electronic health record (EHR)-based, point-of-care clinical trial alerts (CTAs) can improve participant recruitment to certain clinical research studies. Despite their promise, much remains to be learned about the use of CTAs. Our objective was to determine whether repeated exposure to such alerts leads to declining user responsiveness and, if so, to characterize the extent of that decline to better inform future CTA deployments.
METHODS: During a 36-week study period, we systematically documented the response patterns of 178 physician users randomized to receive CTAs for an ongoing clinical trial. Data were collected on: (1) response rates to the CTAs; and (2) referral rates per physician, per time unit. In a Poisson regression, the counts of interest were offset by the log of the total number of alerts received by each physician during each time period.
RESULTS: Response rates demonstrated a significant downward trend over time, decreasing by 2.7% with each advancing time period, a trend significantly different from zero (flat) (p<0.0001). Even after 36 weeks, response rates remained in the 30%-40% range. Subgroup analyses revealed differences between community-based and university-based physicians (p=0.0489).
DISCUSSION: CTA responsiveness declined gradually with prolonged exposure, although it remained reasonably high even after 36 weeks. There were also notable differences between community-based and university-based users.
CONCLUSIONS: These findings add to the limited literature on this form of EHR-based alert fatigue and should help inform future tailoring, deployment, and further study of CTAs.
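The modeling approach described above (response counts with a log-alerts offset in a Poisson regression) can be illustrated with a short, hypothetical sketch. Nothing below is the authors' code or data: the variable names, the simulated decline, the 4-week period grouping, and the use of the statsmodels library are assumptions made purely for illustration.

```python
# Minimal sketch (hypothetical data, not the study's) of a Poisson regression
# in which responses per physician per time period are modeled with an offset
# equal to the log of the alerts received in that period.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_physicians, n_periods = 178, 9  # e.g. 36 weeks split into 4-week periods (assumed)
df = pd.DataFrame({
    "physician": np.repeat(np.arange(n_physicians), n_periods),
    "period": np.tile(np.arange(n_periods), n_physicians),
})
df["alerts"] = rng.poisson(20, len(df)) + 1  # alerts received per period (simulated)
df["responses"] = rng.poisson(0.45 * np.exp(-0.027 * df["period"]) * df["alerts"])

# log E[responses] = beta0 + beta1 * period + log(alerts)
fit = smf.glm("responses ~ period", data=df,
              family=sm.families.Poisson(),
              offset=np.log(df["alerts"])).fit()
print(fit.summary())
print("rate ratio per period:", np.exp(fit.params["period"]))
```

With the log(alerts) offset, exp(beta1) is the response-rate ratio per time period; a ratio near 0.973 corresponds to the roughly 2.7% per-period decline reported in the results.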

    • "In a recent evaluation, the effectiveness and the efficacy of the feasibility process using the EHR4CR QB compared with traditional methods were assessed [13]. However, other systems have been proven to be accurate and effective, while the final software was not usable due to its lack of user-friendliness [14]. Thus, there is a need for a user satisfaction evaluation to ensure that the system fits the user needs and an estimation of the training required for the use of the EHR4CR QB in a production environment. "
    ABSTRACT: The Electronic Health Records for Clinical Research (EHR4CR) project aims to develop services and technology to leverage the reuse of Electronic Health Records with the purpose of improving the efficiency of clinical research processes. A pilot program was implemented to generate evidence of the value of using the EHR4CR platform. User acceptance of the platform is a key success factor in driving its adoption; thus, it was decided to evaluate user satisfaction. In this paper, we present the results of a user satisfaction evaluation for the EHR4CR multisite patient count cohort system. This study examined the ability of testers (n=22 and n=16, from 5 countries) to perform three main tasks (around 20 minutes per task) after a 30-minute period of self-training. The System Usability Scale score obtained was 55.83 (SD: 15.37), indicating moderate user satisfaction. Responses to an additional satisfaction questionnaire were positive about the design of the interface and the procedure required to design a query. Nevertheless, the most complex of the three tasks proposed in this test was rated as difficult, indicating a need to improve the system's support for complicated queries.
    BioMed Research International 07/2015; 2015(3). DOI:10.1155/2015/801436 · 2.71 Impact Factor
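As background for the usability score reported above: the System Usability Scale (SUS) is the standard 10-item questionnaire scored 0-100 (odd items contribute response minus 1, even items contribute 5 minus response, and the sum is multiplied by 2.5). The sketch below is only an illustration with made-up responses, not EHR4CR data.

```python
# Standard SUS scoring; the example responses are invented for illustration.
def sus_score(responses):
    """responses: ten Likert answers (1-5) to the standard SUS items, in order."""
    assert len(responses) == 10
    raw = sum((r - 1) if i % 2 == 1 else (5 - r)  # odd items positive, even items negative
              for i, r in enumerate(responses, start=1))
    return raw * 2.5  # rescale the 0-40 raw sum to 0-100

print(sus_score([4, 2, 4, 3, 3, 3, 4, 3, 3, 3]))  # -> 60.0 (hypothetical respondent)
```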
    • "To address barriers of clinician recruitment, numerous automated methods have been used to identify potentially eligible participants [4,8-12]. While these methods have enhanced recruitment, alert fatigue and dismissal of auto alerts may reduce screening of potentially eligible participants. "
    ABSTRACT: Much of the existing literature on physical activity (PA) interventions involves physically inactive individuals recruited from community settings rather than clinical practice settings. Recruitment of patients into interventions in clinical practice settings is difficult due to the limited time available in the clinic, the identification of appropriate personnel to efficiently conduct the process, and time-consuming methods of recruitment. The purpose of this report is to describe the approach used to identify and recruit veterans from the Veterans Affairs (VA) Pittsburgh Healthcare System Primary Care Clinic into a randomized controlled PA study. A sampling frame of veterans was developed using the VA electronic medical record. During regularly scheduled clinic appointments, primary care providers (PCPs) screened identified patients for safety to engage in moderate-intensity PA and willingness to discuss the study with research staff members. Research staff determined eligibility with a subsequent telephone screening call and scheduled a research study appointment, at which time signed informed consent and baseline measurements were obtained. Of the 3,482 veterans in the sampling frame who were scheduled for a primary care appointment during the study period, 1,990 (57.2%) were seen in the clinic and screened by the PCP; moderate-intensity PA was deemed safe for 1,293 (37.1%), 871 (25.0%) agreed to be contacted for further screening, 334 (9.6%) were eligible for the study, and 232 (6.7%) enrolled. Using a semiautomated screening approach that combined an electronically derived sampling frame with paper-and-pencil prescreening by PCPs and research staff, VA-STRIDE was able to recruit 1 in 15 veterans in the sampling frame. Using this approach, a high proportion of potentially eligible veterans were screened by their PCPs. Trial Registration: ClinicalTrials.gov identifier: NCT00731094.
    Trials 01/2014; 15(1):11. DOI:10.1186/1745-6215-15-11 · 1.73 Impact Factor
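The recruitment funnel reported in the abstract above can be recomputed directly from its counts; the snippet below is just that arithmetic (stage labels are paraphrased, not taken from the trial's materials).

```python
# Recompute the VA-STRIDE screening-funnel percentages from the counts
# reported in the abstract; labels are paraphrased for brevity.
funnel = {
    "sampling frame": 3482,
    "screened by PCP": 1990,
    "PA deemed safe": 1293,
    "agreed to contact": 871,
    "eligible": 334,
    "enrolled": 232,
}
frame = funnel["sampling frame"]
for stage, n in funnel.items():
    print(f"{stage:>18}: {n:5d} ({100 * n / frame:.1f}% of frame)")
print(f"about 1 in {frame / funnel['enrolled']:.0f} veterans in the frame enrolled")
```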
    • "Recruitment is the primary and most costly barrier to clinical and translational research.138 This supplement contains two articles that contribute to the literature on informatics solutions for boosting recruitment.20 139 Embi and Leonard (see page ) evaluated the response patterns over time to EHR-based clinical trial alerts using a randomized clinical trial.139 The authors observed that responses to clinical trial alerts declined gradually over prolonged exposure. "
    ABSTRACT: Clinical research informatics is the rapidly evolving sub-discipline within biomedical informatics that focuses on developing new informatics theories, tools, and solutions to accelerate the full translational continuum: basic research to clinical trials (T1), clinical trials to academic health center practice (T2), diffusion and implementation to community practice (T3), and 'real world' outcomes (T4). We present a conceptual model based on an informatics-enabled clinical research workflow, integration across heterogeneous data sources, and core informatics tools and platforms. We use this conceptual model to highlight 18 new articles in the JAMIA special issue on clinical research informatics.
    Journal of the American Medical Informatics Association 04/2012; 19(e1):e36-e42. DOI:10.1136/amiajnl-2012-000968 · 3.50 Impact Factor