We examined particular subtypes of HAIs to identify specific opportunities to improve HAI surveillance by CSS. In
order to improve surveillance of SSIs that required readmission,
CSS would need access to the information found in emergency
department reports and admission history and physical reports.
These reports contained information about signs, symptoms,
significant white blood cell counts, antimicrobial treatment,
bedside interventions, diagnostic imaging, and physician
impressions. To improve the surveillance of SSIs that occurred
within the current admission, CSS would need access to
information found in general surgery reports; these reports
contained phrases that suggested the presence of a 'post-operative
wound infection.' Improving the detection of LRTIs would
require access to general consult reports, which contained signs,
symptoms, antimicrobial treatment, and physician impressions.
In addition to signs of pneumonia (important for LRTIs),
diagnostic radiology reports contained important evidence of
intra-abdominal and retroperitoneal abscesses, which were
important indicators of SSIs that required radiologically-guided
drainage. Information about one post-procedural LRTI was
found in a death summary report.
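The kind of phrase-based screening of physician narratives described above can be sketched in miniature. The trigger phrases and negation cues below are hypothetical illustrations, not the phrase set used by CSS in this study:

```python
import re

# Illustrative only: a minimal keyword screen of the kind a surveillance
# system might apply to free-text surgery reports. The trigger phrases and
# negation cues are assumptions for this sketch.
TRIGGER_PHRASES = [
    r"post-?operative wound infection",
    r"surgical site infection",
]
NEGATION_CUES = [r"no evidence of", r"without", r"ruled out"]

def flag_report(text: str, window: int = 40) -> bool:
    """Return True if a trigger phrase appears without a nearby negation cue."""
    lowered = text.lower()
    for phrase in TRIGGER_PHRASES:
        for match in re.finditer(phrase, lowered):
            # Look back a fixed character window for a preceding negation cue
            preceding = lowered[max(0, match.start() - window):match.start()]
            if not any(re.search(cue, preceding) for cue in NEGATION_CUES):
                return True
    return False

print(flag_report("Exam suggests a post-operative wound infection."))  # True
print(flag_report("No evidence of postoperative wound infection."))    # False
```

A production system would need a validated phrase lexicon and a more robust negation and context algorithm; this sketch only illustrates why access to the narrative text itself is the prerequisite.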
Information about outpatient ADEs was found in emergency
department reports and admission history and physical reports.
Information about anticoagulation-related bleeding events was
found in general surgery reports, radiology reports, and discharge
summaries. For example, one patient with repeated bleeding
episodes on Coumadin received an inferior vena cava filter,
which was documented in a radiology report because it was
placed under radiographic guidance. Another anticoagulation-
related gastrointestinal bleeding event was recorded in an
endoscopy report. ADEs severe enough to require transfer to the
intensive care unit, including general anesthesia-related events,
cardiac arrests secondary to cardiovascular medications, and
opiate-related sedation, were mentioned in general consult
reports. Almost all ADEs involving narcotic analgesics were also
mentioned in general consult reports, which contained signs
(eg, mental status changes and decreased respiratory rate),
response to naloxone, and physician assessments.
If the clinician did not document their assessment of a suspected
case in the CSS, we were unable to distinguish between false
positive cases and suspected AEs that were not reviewed. This is
an important area for additional investigation, since it would
affect the benefit obtained by the integration of additional data
from physician narratives.
Recommendations for future work
Physician narratives must be available in electronic form, so that
CSS can access their content. The ideal narrative for a concurrent
system like CSS is the progress note, since it is typically
created daily throughout a hospitalization. As hospitals
implement electronic progress notes, we need to understand
what information about AEs is more likely to be recorded in
progress notes than other physician narratives.
Additional investigation of the ADEs missed by CSS is needed to
troubleshoot the system and the surveillance workflow. In these
cases, improvements may be attainable in cognitive burden,
staffing, and the prioritization of patient safety activities.
As public reporting requirements increase, providers must
consider the role of surveillance technologies. In this study, we
identified and described differences between two such systems,
including how each system used information from different
sources. Computerized surveillance system detection of LRTIs,
SSIs, and ADEs would improve if patient signs, symptoms,
interventions, and physician assessments from physician narratives
were integrated using technologies such as natural language
processing.
Acknowledgments We would like to thank Vikrant Deshmukh, MSc, MS for his
expert advice in the design of the database application used for this study.
Funding This project was funded in part by an institutional medical informatics
training grant from the National Library of Medicine (contract number 5T
Competing interests None.
Ethics approval This study was approved by Intermountain Healthcare and the
University of Utah.
Provenance and peer review Not commissioned; externally peer reviewed.
J Am Med Inform Assoc 2011;18:491-497. doi:10.1136/amiajnl-2011-000187