Modifying DRG-PPS to include only diagnoses present on admission: financial implications and challenges.

Agency for Healthcare Research and Quality, Department of Health and Human Services, Rockville, Maryland 20850, USA.
Medical Care (Impact Factor: 3.23). 05/2007; 45(4):288-91. DOI: 10.1097/
Source: PubMed

ABSTRACT: The inability to distinguish complications acquired in the hospital from comorbid conditions that are present on admission (POA) has long hampered the use of claims data in quality and safety research. Now pay-for-performance initiatives and legislation requiring Medicare to reduce payment for hospital-acquired infections add urgency to POA coding. This study used data from 2 states currently coding POA to assess the financial impact if Medicare were to pay based on POA conditions only, and to examine the challenges of implementing POA coding.
Medicare payments were calculated based first on all diagnoses and then on POA diagnoses in the Medicare discharge abstracts from California and New York in 2003, using the Diagnosis Related Group (DRG)-based Prospective Payment System (PPS) formula. The potential savings that result from excluding non-POA diagnoses were calculated. Patterns of POA coding were explored.
Medicare could have saved $56 million in California, $51 million in New York, and $800 million nationwide in 2003 had it paid hospital claims based only on POA diagnoses. Approximately 15% of the claims had non-POA codes, but only 1.4% of the claims were reassigned to lower-cost DRGs after excluding non-POA diagnoses. Excluding non-POA diagnoses reduced payments for operating costs but increased outlier payments, because some claims were designated as "unusually high cost" within the lower-cost DRGs. Observed coding patterns also suggest problems with current POA coding practice.
Incorporating POA coding into the DRG-PPS would be consistent with pay-for-performance principles, would make claims data more useful for quality assurance, and could produce sizable savings for Medicare.
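The recalculation described in the methods above can be illustrated with a toy sketch. Everything here is hypothetical and highly simplified: `DRG_WEIGHTS`, `BASE_RATE`, `assign_drg`, and the listed ICD-9-CM codes are illustrative stand-ins for the real CMS grouper and PPS payment formula, which involve hospital-specific base rates, many DRGs, and outlier adjustments.

```python
from dataclasses import dataclass

# Hypothetical relative weights and base rate; real PPS payments multiply
# a DRG relative weight by a hospital-specific standardized amount.
DRG_WEIGHTS = {"complicated": 1.8, "uncomplicated": 1.0}
BASE_RATE = 5000.0  # illustrative dollars per weight unit

@dataclass
class Diagnosis:
    code: str
    poa: bool  # present-on-admission flag ("Y"/"N" in real claims data)

def assign_drg(diagnoses):
    """Toy grouper: any complicating secondary diagnosis moves the claim
    into the higher-weight DRG (a stand-in for the real CMS grouper)."""
    complicating = {"996.62", "998.59"}  # illustrative complication codes
    if any(d.code in complicating for d in diagnoses):
        return "complicated"
    return "uncomplicated"

def payment(diagnoses, poa_only=False):
    """Pay on all diagnoses, or only on those present on admission."""
    used = [d for d in diagnoses if d.poa] if poa_only else list(diagnoses)
    return DRG_WEIGHTS[assign_drg(used)] * BASE_RATE

# One claim: principal diagnosis POA, postoperative infection acquired
# in the hospital (not POA).
claim = [Diagnosis("410.71", poa=True),
         Diagnosis("998.59", poa=False)]

print(payment(claim))                 # 9000.0 -- complication raises the DRG
print(payment(claim, poa_only=True))  # 5000.0 -- non-POA code excluded
```

Repeating this comparison over every claim in a discharge file, and summing the differences, is the essence of the savings estimate reported in the abstract.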

    ABSTRACT: Objective: Numerous studies have shown the rates of voluntary reporting of patient safety events to be disappointingly low. This article demonstrates that targeted use of administrative data can substantially improve identification of adverse events. The design and pilot test of a reporting system at the University of Michigan Health System (UMHS) is detailed. The system identifies otherwise unreported adverse events and provides information about the preventability of these events. When fully implemented, the reporting system will become an integral part of the UMHS system for improving patient safety. Methods: The patient safety indicator (PSI)-case review system (CRS) is based on the Agency for Healthcare Research and Quality PSIs and uses a diagnosis timing variable to ensure that only inpatients with a condition that was acquired after hospital admission are identified and referred to clinicians for investigation. Results: During a 4-week pilot, 66 PSI cases for 56 patients were identified. Sixty-four cases (96%) were unrelated to adverse events recorded in the voluntary patient safety reporting system. Clinicians reviewed 50 cases, and 43 (86%) had a confirmed event. Nineteen (44%) were deemed potentially preventable and were associated with selected infections due to medical care, postoperative sepsis, postoperative pulmonary embolism or deep vein thrombosis, decubitus ulcer, and accidental puncture or laceration. Conclusion: Although a brief experiment, the pilot test demonstrated that the PSI-CRS can enhance the voluntary patient safety reporting system and identify quality improvement opportunities. As an important side benefit, the system can also help UMHS prepare for federal initiatives to use administrative data to identify and reduce Medicare reimbursement for inpatients with hospital-acquired complications.
In the 2000 report, To Err Is Human, the Institute of Medicine identified voluntary patient safety reporting systems (PSRSs) as a key component of reducing medical errors. 1 The reporting of medical errors, complications or adverse events, and near misses is critical to analyzing the root causes of failures in processes of care and to implementing interventions aimed at their prevention. Voluntary reports of patient safety events (defined here as errors, adverse events, and near misses) in hospitals by physicians, nurses, and other caregivers are often captured in electronic error reporting systems. 2 Although these are considered a key component of hospital patient safety programs, numerous studies have shown reporting rates using electronic error reporting systems to be disappointingly low. Common barriers to reporting include concerns about the confidentiality of the reporter, the time required to report a patient safety event, lack of feedback about the resolution of a reported event, and confusion about the types of events that should be reported. 2-4 While trying to boost voluntary reporting rates, hospitals are also tapping other sources of information to identify patient safety events, such as laboratory and pharmacy data, electronic discharge records, and administrative (billing) data. 5-7 In New York, which has a mandatory statewide program for reporting adverse events, administrative data were successfully used as a supplement to internal voluntary reporting methods to identify and report events to the state. 8 The Agency for Healthcare Research and Quality (AHRQ) promoted the use of administrative data to identify adverse events through the development of the patient safety indicators (PSIs).
The indicators are based on the presence of specific International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) diagnosis codes and are used to report rates of adverse events such as postoperative pulmonary embolism (PE) or deep vein thrombosis (DVT). Hospitals can compare their rates to national rates and investigate individual cases to validate the results and, if an adverse event occurred, assess potential quality-of-care problems. 9 Although researchers have demonstrated the limitations of administrative data when used to screen for complications, they suggested that the validity of results can be improved by the addition of a variable to capture the timing of secondary diagnoses, such as a "present on admission" (PoA) variable. 10-17 Recent studies have shown that the PSI rates include cases where the diagnosis that triggered the PSI was a pre-existing comorbid condition and not a complication acquired during the patient's hospitalization. 18 These false-positive cases inflate the PSI rates and can cause hospital staff to spend time investigating cases that yield little useful information. The federal government is poised to use administrative data to identify and reduce reimbursement for inpatient cases involving hospital-acquired complications. Hospitals are now required to submit a PoA code for all diagnoses. The Deficit Reduction Act of 2005, section 5001(c), calls for a reduction in Medicare reimbursement for inpatient cases with selected preventable hospital-acquired complications beginning in October 2008. Currently, an inpatient case with a diagnosis indicating a certain pre-existing condition and/or hospital-acquired complication can be classified to a higher-paying diagnosis-related group (DRG). Beginning in October 2008, the diagnoses for 8 hospital-acquired complications will no longer be used to classify patients to a DRG. 19
The purpose of this article is to describe the design and pilot test of a reporting system that identifies adverse events from administrative data and refers them for clinical evaluation. The goals of the pilot were to evaluate whether this system could identify adverse events that were not recorded in the voluntary PSRS and provide information about the preventability of these events. Because preventable adverse events are attributed to errors of either commission or omission, this information can be used to reduce the risk of future events.
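The screening idea described above (trigger on a PSI diagnosis code only when it was not present on admission, so the condition is plausibly hospital-acquired) can be sketched as follows. This is a hypothetical, simplified illustration: the two ICD-9-CM codes are stand-ins for the much longer AHRQ PSI code sets, and the record layout is invented for the example.

```python
# Illustrative stand-ins for the postoperative PE/DVT PSI code set; the
# actual AHRQ specifications enumerate many more codes and exclusions.
PE_DVT_CODES = {"415.11", "453.40"}

def flag_psi_cases(records):
    """Return ids of records where a PSI trigger code appears with
    poa == "N", i.e., the diagnosis was acquired after admission.
    Each record: {"id": ..., "diagnoses": [{"code": ..., "poa": "Y"/"N"}]}"""
    flagged = []
    for rec in records:
        for dx in rec["diagnoses"]:
            if dx["code"] in PE_DVT_CODES and dx["poa"] == "N":
                flagged.append(rec["id"])
                break  # one trigger per record is enough
    return flagged

records = [
    {"id": 1, "diagnoses": [{"code": "453.40", "poa": "N"}]},  # acquired: flag
    {"id": 2, "diagnoses": [{"code": "453.40", "poa": "Y"}]},  # comorbid: skip
]
print(flag_psi_cases(records))  # [1]
```

Without the PoA test, record 2 would be counted as a postoperative DVT, which is exactly the kind of false positive that inflates PSI rates and sends clinicians to review cases that yield little useful information.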
    Journal of Patient Safety. 01/2008; 4(1).