Addressing Physician Concerns About Performance Profiling: Experience With a Local Veterans Affairs Quality Evaluation Program
UCLA Department of General Internal Medicine and Health Services Research, Los Angeles, CA, USA.
American Journal of Medical Quality (Impact Factor: 1.25). 03/2009; 24(2):123-31. DOI: 10.1177/1062860608330828
The authors investigated the addition of novel quality indicators, patient risk adjustment, and simple statistics to an ongoing clinician feedback initiative that profiles diabetes care for 13 Veterans Affairs (VA) clinics. Data were extracted from a computerized database for calendar years 2004 to 2005. Performance was assessed with 4 monitoring measures, 3 intermediate outcome measures, and 3 appropriate treatment measures. Attainment rates for each indicator were calculated by clinic. The effect of risk adjustment and the significance of clinic performance variation were determined with multivariate logistic models. Analysis of the 10 quality measures revealed lower attainment and greater clinic-level variation for the less familiar indicators. Statistically significant performance variations were detected among clinics, several of a clinically important magnitude. Risk adjustment did not substantially change performance. The addition of clinically relevant quality measures and simple statistics appeared to enhance the characterization of performance by this profiling program.
- "Recent research suggests that the greatest practice variation occurs at the facility level [27,28]. Therefore, sites were selected using a purposive stratified approach based on their performance on a profile of 15 outpatient EPRP measures (see Data Source for Site Selection subsection, below)."
ABSTRACT: The Department of Veterans Affairs (VA) has led the industry in measuring facility performance as a critical element in improving quality of care, investing substantial resources to develop and maintain valid and cost-effective measures. The External Peer Review Program (EPRP) of the VA is the official data source for monitoring facility performance and is used to prioritize the quality areas needing the most attention. Facility performance measurement has significantly improved preventive and chronic care, as well as overall quality; however, much variability still exists in levels of performance across measures and facilities. Audit and feedback (A&F), an important component of effective performance measurement, can help reduce this variability and improve overall performance. Previous research suggests that VA Medical Centers (VAMCs) with high EPRP performance scores tend to use EPRP data as a feedback source. However, the manner in which EPRP data are used as a feedback source by individual providers as well as service line, facility, and network leadership is not well understood. An in-depth understanding of the mental models, strategies, and specific feedback process characteristics adopted by high-performing facilities is thus urgently needed. This research compares how leaders of high-, low-, and moderately performing VAMCs use clinical performance data from the EPRP as a feedback tool to maintain and improve quality of care.
We will conduct a qualitative, grounded theory analysis of up to 64 interviews, using a novel method of sampling primary care, facility, and Veterans Integrated Service Network (VISN) leadership at high-, moderate-, and low-performing facilities. We will analyze the interviews for evidence of cross-facility differences in perceptions of performance data usefulness and in strategies for disseminating performance data and evaluating performance, with particular attention to the timeliness, individualization, and punitiveness of feedback delivery.
Most research examining feedback to improve provider and facility performance lacks a detailed understanding of the elements of effective feedback. This research will highlight the elements most commonly used at high-performing facilities and identify additional features of their successful feedback strategies not previously described. Armed with this information, practices can implement more effective A&F interventions to improve quality of care.