Article

Evaluating health information technology in community-based settings: lessons learned.

Department of Public Health, Weill Cornell Medical College, New York, New York 10065, USA.
Journal of the American Medical Informatics Association (Impact Factor: 3.93). 07/2011; 18(6):749-53. DOI: 10.1136/amiajnl-2011-000249
Source: PubMed

ABSTRACT Implementing health information technology (IT) at the community level is a national priority to help improve healthcare quality, safety, and efficiency. However, community-based organizations implementing health IT may not have expertise in evaluation. This study describes lessons learned from the authors' experience as a multi-institutional academic collaborative established to provide independent evaluation of community-based health IT initiatives. The authors' experience derived from adapting the principles of community-based participatory research to the field of health IT. To assist other researchers, the lessons learned are presented under four themes: (A) the structure of the partnership between academic investigators and the community; (B) communication issues; (C) the relationship between implementation timing and evaluation studies; and (D) study methodology. These lessons represent practical recommendations for researchers interested in pursuing similar collaborations.

  • ABSTRACT: This article is an analysis of published research on Health Information Technology education. The purpose of this study was to examine selected literature using variables such as journal frequency, keyword analysis, universities associated with the research, and geographic diversity. The analysis presented in this paper has identified intellectually significant studies that have contributed to the development and accumulation of the intellectual wealth of Health Information Technology. The keyword analysis suggests that Health Information Technology research has evolved from establishing concepts and domains of health information systems, technology, and management to contemporary issues such as education, outsourcing, web services, and security. The research findings have implications for educators, researchers, and journals.
    Technology and health care: official journal of the European Society for Engineering and Medicine 07/2012; 20(4):239-246. DOI:10.3233/THC-2012-0673 · 0.64 Impact Factor
  • ABSTRACT: Health information exchange (HIE) has the potential to improve the quality of healthcare by giving providers better access to patient information from multiple sources at the point of care. However, HIE efforts have historically been difficult to establish in the US, and the failure rates of organizations created to foster HIE have been high. We sought to better understand how HIE systems based on regional health information organizations (RHIOs) were used in practice and the challenges care practitioners face in using them. The objective of our study was to investigate how HIE can better meet the needs of care practitioners. We performed a multiple-case study using qualitative methods in three communities in New York State. We conducted interviews onsite and by telephone with HIE users and non-users and observed the workflows of healthcare professionals at multiple healthcare organizations participating in a local HIE effort. The empirical data analysis suggests that challenges still remain in increasing provider usage, optimizing HIE implementations, and connecting HIE systems across geographic regions. Important determinants of system usage and perceived value include the level of available information users experienced and the fit of the system with physician workflows. Challenges remain in increasing provider adoption, optimizing HIE implementations, and demonstrating value. The inability to find information reduced usage of HIE. Healthcare organizations, HIE facilitating organizations, and states can help support HIE adoption by ensuring patient information is accessible to providers through increased patient consent, fostering broader participation, and ensuring systems are usable.
    Applied Clinical Informatics 01/2014; 5:861-877. DOI:10.4338/ACI-2014-06-RA-0055 · 0.39 Impact Factor
  • ABSTRACT: Background: Studies of the effects of electronic health records (EHRs) have had mixed findings, which may be attributable to unmeasured confounders such as individual variability in use of EHR features. Objective: To capture physician-level variations in use of EHR features, associations with other predictors, and usage intensity over time. Methods: Retrospective cohort study of primary care providers eligible for meaningful use at a network of federally qualified health centers, using commercial EHR data from January 2010 through June 2013, a period during which the organization was preparing for and in the early stages of meaningful use. Results: Data were analyzed for 112 physicians and nurse practitioners, consisting of 430 803 encounters with 99 649 patients. EHR usage metrics were developed to capture how providers accessed and added to patient data (eg, problem list updates), used clinical decision support (eg, responses to alerts), communicated (eg, printing after-visit summaries), and used panel management options (eg, viewed panel reports). Provider-level variability was high: for example, the annual average proportion of encounters with problem lists updated ranged from 5% to 60% per provider. Some metrics were associated with provider, patient, or encounter characteristics. For example, problem list updates were more likely for new patients than established ones, and alert acceptance was negatively correlated with alert frequency. Conclusions: Providers using the same EHR developed personalized patterns of use of EHR features. We conclude that physician-level usage of EHR features may be a valuable additional predictor in research on the effects of EHRs on healthcare quality and costs.
    Journal of the American Medical Informatics Association 06/2014; 21(6). DOI:10.1136/amiajnl-2013-002627 · 3.93 Impact Factor
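The usage metrics described in the abstract above (eg, the proportion of encounters with a problem list update per provider, or the relationship between alert frequency and alert acceptance) can be rolled up from encounter-level EHR log data. The Python/pandas sketch below is purely illustrative: the study's code is not published, and every column name used here (provider_id, encounter_date, problem_list_updated, alerts_fired, alerts_accepted) is an assumed schema, not the authors'.

# Illustrative sketch only: assumes one row per encounter, exported from
# EHR audit/usage logs, with 0/1 flags and counts describing what the
# provider did in that encounter. Column names are hypothetical.
import pandas as pd

encounters = pd.read_csv("encounters.csv", parse_dates=["encounter_date"])
encounters["year"] = encounters["encounter_date"].dt.year

# Annual proportion of encounters in which the problem list was updated,
# computed per provider (the abstract reports a 5%-60% range across providers).
problem_list_rate = (
    encounters.groupby(["provider_id", "year"])["problem_list_updated"]
    .mean()
    .rename("problem_list_update_rate")
)

# Per-provider alert burden and alert acceptance, to examine the reported
# negative correlation between alert frequency and acceptance.
alerts = encounters.groupby("provider_id")[["alerts_fired", "alerts_accepted"]].sum()
alerts["alerts_per_encounter"] = (
    alerts["alerts_fired"] / encounters.groupby("provider_id").size()
)
alerts["alert_acceptance_rate"] = alerts["alerts_accepted"] / alerts["alerts_fired"]

# Spread of problem-list updating across providers, and the correlation
# between how often alerts fire and how often they are accepted.
print(problem_list_rate.groupby("provider_id").mean().describe())
print(alerts["alerts_per_encounter"].corr(alerts["alert_acceptance_rate"]))

A negative value from the final correlation would be consistent with the alert-fatigue pattern the abstract reports; the point of the sketch is simply to show how provider-level feature-usage metrics of this kind can be derived from routine EHR logs.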