Developing and Testing a Clinical Information System Evaluation Tool: Prioritizing Modifications Through End-User Input

Patient Care Services Clinical Information Systems, Children's Mercy Hospitals and Clinics, Kansas City, Missouri 64108, USA.
The Journal of Nursing Administration (Impact Factor: 1.27). 06/2011; 41(6):252-8. DOI: 10.1097/NNA.0b013e31821c4634
Source: PubMed


Objective: The objectives were to develop and validate the Information System Evaluation Tool (ISET), use the resulting feedback to modify the institution's clinical information system (CIS), and determine the success of those modifications.
Background: The ability of a CIS to increase patient safety and care quality depends on its systems and processes. A survey was needed to provide the specificity necessary to make meaningful system improvements.
Methods: The ISET was pilot tested and revised, then administered before implementation of the CIS and at 2 time points after implementation. The ISET was revised after analysis of the results, and comparisons were made among the time points.
Results: The ISET is a valid and reliable instrument. Perceptions of the CIS initially decreased but had significantly improved by 16 months after implementation.
Conclusions: For adoption to be successful, end-users must be convinced that the CIS supports their practice and improves care. The ISET measures these perceptions and highlights areas for improvement.
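
The abstract above does not report which statistical procedures were behind "valid and reliable" or the test used for the change in perceptions over time. Purely as a hedged illustration of how such survey data are commonly analyzed, the Python sketch below computes Cronbach's alpha for a set of Likert-type items and compares overall scores between two post-implementation administrations; the file name, column names, number of items, and the choice of Welch's t test are assumptions, not details taken from the ISET study.

```python
# Hypothetical sketch: internal-consistency reliability and a cross-time
# comparison of survey scores. Nothing here is taken from the ISET paper;
# the file name, column names, and the test are illustrative assumptions.
import pandas as pd
from scipy import stats

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a block of item columns (higher = more consistent)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# responses.csv is a placeholder: one row per respondent, a 'phase' column
# ('pre', 'post_1', 'post_16mo') and one column per survey item (item_1 ...).
df = pd.read_csv("responses.csv")
item_cols = [c for c in df.columns if c.startswith("item_")]
print("Cronbach's alpha:", round(cronbach_alpha(df[item_cols]), 3))

# Compare mean perception scores between the first post-implementation
# administration and the 16-month follow-up.
df["score"] = df[item_cols].mean(axis=1)
post1 = df.loc[df["phase"] == "post_1", "score"]
post16 = df.loc[df["phase"] == "post_16mo", "score"]
t, p = stats.ttest_ind(post1, post16, equal_var=False)  # Welch's t test
print(f"post_1 vs post_16mo: t = {t:.2f}, p = {p:.4f}")
```

An independent-samples test is sketched because the abstract does not say whether the same staff responded at each administration; a paired or repeated-measures approach would be more appropriate if respondents were matched across time points.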

  • ABSTRACT: Background: Two of the current methodological barriers to implementation science efforts are the lack of agreement regarding constructs hypothesized to affect implementation success and identifiable measures of these constructs. In order to address these gaps, the main goals of this paper were to identify a multi-level framework that captures the predominant factors that impact implementation outcomes, conduct a systematic review of available measures assessing constructs subsumed within these primary factors, and determine the criterion validity of these measures in the search articles. Method: We conducted a systematic literature review to identify articles reporting the use or development of measures designed to assess constructs that predict the implementation of evidence-based health innovations. Articles published through 12 August 2012 were identified through MEDLINE, CINAHL, PsycINFO and the journal Implementation Science. We then utilized a modified five-factor framework in order to code whether each measure contained items that assess constructs representing structural, organizational, provider, patient, and innovation-level factors. Further, we coded the criterion validity of each measure within the search articles obtained. Results: Our review identified 62 measures. Results indicate that organization, provider, and innovation-level constructs have the greatest number of measures available for use, whereas structural and patient-level constructs have the least. Additionally, relatively few measures demonstrated criterion validity, or reliable association with an implementation outcome (e.g., fidelity). Discussion: In light of these findings, our discussion centers on strategies that researchers can utilize in order to identify, adapt, and improve extant measures for use in their own implementation research. In total, our literature review and resulting measures compendium increases the capacity of researchers to conceptualize and measure implementation-related constructs in their ongoing and future research.
    Implementation Science 02/2013; 8(1):22. DOI:10.1186/1748-5908-8-22 · 4.12 Impact Factor
  • ABSTRACT: Purpose: Most clinical information systems (CIS) today are technically sound, but the number of successful implementations of these systems is low. The purpose of this study was to develop and test a theoretically based integrated CIS Success Model (CISSM) from the nurse perspective. Methods: Model predictors of CIS success were taken from existing research on information systems acceptance, user satisfaction, use intention, user behavior and perceptions, as well as clinical research. Data collected online from 234 registered nurses in four hospitals were used to test the model. Each nurse had used the Cerner Power Chart Admission Health Profile for at least 3 months. Results: Psychometric testing and factor analysis of the 23-item CISSM instrument established its construct validity and reliability. Initial analysis showed nurses' satisfaction with and dependency on CIS use predicted their perceived CIS use Net Benefit. Further analysis identified Social Influence and Facilitating Conditions as other predictors of CIS user Net Benefit. The level of hospital CIS integration may account for the role of CIS Use Dependency in the success of CIS. Conclusions: Based on our experience, CISSM provides a formative as well as summative tool for evaluating CIS success from the nurse's perspective.
    International Journal of Medical Informatics 02/2013; 82(6). DOI:10.1016/j.ijmedinf.2013.01.011 · 2.00 Impact Factor
  • ABSTRACT: Doctor of Nursing Practice (DNP) graduates are expected to contribute to nursing knowledge through empirically based studies and testing the effectiveness of practice approaches that ultimately benefit patients and health care systems. This article describes publication practices of DNP graduates in the scholarly literature. Published studies (2005 to 2012) with at least one author with a DNP degree were identified. The search yielded 300 articles in 59 journals; 175 met the inclusion criteria and were included in this study. A codebook, consisting of 15 major categories, was used to extract relevant information. Original clinical investigations were the most frequent, followed by practice-focused patient and provider studies. The number of studies published in peer-reviewed journals with DNP-prepared authors increased over time. We recommend greater integration of translational science models into DNP curricula to achieve the goal of publishing scholarly products that use evidence to improve either practice or patient outcomes. [J Nurs Educ. 2013;52(x):xxx-xxx.].
    Journal of Nursing Education 07/2013; 52(8):1-10. DOI:10.3928/01484834-20130718-02 · 0.91 Impact Factor
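
The CISSM study above (International Journal of Medical Informatics, 2013) reports that psychometric testing and factor analysis of the 23-item instrument established its construct validity and reliability, but the abstract does not describe the procedure. The sketch below is only a rough, hedged illustration of an exploratory factor analysis on item-level survey data; the file name, column names, and the choice of four factors are assumptions, not findings from that paper.

```python
# Illustrative only: exploratory factor analysis of a 23-item survey.
# The data file, column names, and number of factors are assumptions.
import pandas as pd
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("cissm_items.csv")    # placeholder: one row per nurse respondent
items = df.filter(like="item_")        # hypothetical columns item_1 ... item_23

# Standardize responses, then fit an exploratory factor model.
X = StandardScaler().fit_transform(items)
fa = FactorAnalysis(n_components=4, random_state=0)  # four factors assumed
fa.fit(X)

# Loadings table: rows are items, columns are factors. Items that load
# strongly on the same factor would be grouped into a candidate subscale
# and its reliability checked before interpreting predictors of Net Benefit.
loadings = pd.DataFrame(
    fa.components_.T,
    index=items.columns,
    columns=[f"factor_{i + 1}" for i in range(4)],
)
print(loadings.round(2))
```

In practice the retained factors would be compared against the constructs the model hypothesizes (for example satisfaction, social influence, and facilitating conditions) before drawing conclusions.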