ABSTRACT: High-fidelity medical simulation of sudden cardiac arrest (SCA) presents an opportunity for systematic probing of in-hospital resuscitation systems. Investigators developed and implemented the SimCode program to evaluate simulation's ability to generate meaningful data for system safety analysis and to determine the concordance of observed results with institutional quality data.
Resuscitation response performance data were collected during in situ SCA simulations on hospital medical floors. The SimCode dataset was compared with a chart-review-based dataset of actual (live) in-hospital resuscitation system performance for SCA events of similar acuity and complexity.
One hundred thirty-five hospital personnel participated in nine SimCode resuscitations between 2006 and 2008. Resuscitation teams arrived at 2.5+/-1.3 min (mean+/-SD) after resuscitation initiation, started bag-valve-mask ventilation by 2.8+/-0.5 min, and completed endotracheal intubation at 11.3+/-4.0 min. CPR was performed within 3.1+/-2.3 min; arrhythmia recognition occurred by 4.9+/-2.1 min and defibrillation at 6.8+/-2.4 min. Chart review data for 168 live in-hospital SCA events during a contemporaneous period were extracted from an institutional database. CPR and defibrillation occurred later during SimCodes than reported by chart review (live: 0.9+/-2.3 min, p<0.01, and 2.1+/-4.1 min, p<0.01, respectively). Chart review noted fewer problems with CPR performance (simulated: 43% proper CPR vs. live: 98%, p<0.01). Potential causes of discrepancies between the resuscitation response datasets included sample size and data limitations, simulation fidelity, unmatched SCA scenario pools, and dissimilar determination of SCA response performance by the complementary reviewing methodologies.
On-site simulations successfully generated SCA response measurements for comparison with live resuscitation chart review data. Continued research may refine simulation's role in quality initiatives, clarify methodologic discrepancies, and improve SCA response.
ABSTRACT: To compare the quality of data recorded by a commercially available clinical information system (CIS) to other commonly used methods for obtaining large amounts of patient data.
Five sets of clinical patient data were chosen as a cross-section of all the data collected by a CIS in our intensive care unit (ICU): 1) length of stay in the ICU, 2) vital signs, 3) days of mechanical ventilation, 4) medications, and 5) diagnoses. Data generated by our ICU CIS were compared with other parallel data sets commonly used to obtain the same data for clinical research.
When compared with our CIS, the hospital database recorded a length of stay at least 1 day longer than the actual length of stay 53% of the time. A search of 139,387 sets of vital signs showed a suspected-artifact rate of less than 0.1%. When compared with direct observation, our CIS correctly recorded days of mechanical ventilation in 23 of 26 patients (88%). Two other data sets, medical diagnoses and medications given, showed significant differences from other commonly used databases of the same information collected outside the ICU (billing codes and pharmacy records, respectively).
Compared to other commonly used data sources for clinical research, a commercially available CIS is an acceptable source of ICU patient data.
Journal of Critical Care 04/2004; 19(1):10-5. · 2.50 Impact Factor