Data-driven process and operational improvement in the emergency department: the ED Dashboard and Reporting Application.

Clinical Services Group, Sky Ridge Medical Center, Carepoint, USA.
Journal of Healthcare Management / American College of Healthcare Executives (Impact Factor: 0.73). 57(3):167-80; discussion 180-1.
Source: PubMed

ABSTRACT: Emergency departments (EDs) in the United States are expected to provide consistent, high-quality care to patients. Unfortunately, EDs are encumbered by problems associated with the demand for services and the limitations of current resources, such as overcrowding, long wait times, and operational inefficiencies. While increasing the effectiveness and efficiency of emergency care would improve both access to and quality of patient care, coordinated improvement efforts have been hindered by a lack of timely access to data. The ED Dashboard and Reporting Application was developed to support data-driven process improvement projects. It incorporated standard definitions of metrics, a data repository, and near real-time analysis capabilities, which helped acute care hospitals in a large healthcare system evaluate and target individual improvement projects in accordance with corporate goals. Subsequently, "arrival to greet" time (the time from patient arrival to physician contact) decreased from an average of 51 minutes in 2007 to the goal level of less than 35 minutes by 2010. The ED Dashboard and Reporting Application has also contributed to data-driven improvements in length of stay and other measures of ED efficiency and care quality. Between January 2007 and December 2010, overall length of stay decreased 10.5 percent while annual visit volume increased 13.6 percent. Thus, investing in the development and implementation of a system for ED data capture, storage, and analysis has supported operational management decisions, gains in ED efficiency, and ultimately improvements in patient care.

  • ABSTRACT: The Commission on Information and Accountability for Women's and Children's Health of the World Health Organization (WHO) reported that national health outcome data were often of questionable quality and "not timely enough for practical use by health planners and administrators". Delayed reporting of poor-quality data limits the ability of front-line staff to identify problems rapidly and make improvements. Clinical "dashboards" based on locally available data offer a way of providing accurate and timely information. A dashboard is a simple computerized tool that presents a health facility's clinical data graphically, using a traffic-light coding system to alert front-line staff to changes in the frequency of clinical outcomes. It provides rapid feedback on local outcomes in an accessible form and enables problems to be detected early. Until now, dashboards have been used only in high-resource settings. An overview maternity dashboard and a maternal mortality dashboard were designed for, and introduced at, a public hospital in Zimbabwe. A midwife at the hospital was trained to collect and input data monthly. Implementation of the maternity dashboards was feasible, and 28 months of clinical outcome data were summarized using common computer software. Presentation of these data to staff led to the rapid identification of adverse trends in outcomes and to suggestions for actions to improve health-care quality. Implementation of maternity dashboards was feasible in a low-resource setting and resulted in actions that improved health-care quality locally. Active participation of hospital management and midwifery staff was crucial to their success.
    Bulletin of the World Health Organization 02/2014; 92(2):146-52. DOI:10.2471/BLT.13.124347 · 5.11 Impact Factor
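The traffic-light coding described in the abstract above can be illustrated with a minimal sketch: each monthly outcome frequency is compared against two alert thresholds and coded green, amber, or red. The indicator names and threshold values here are hypothetical, not taken from the study.

```python
def traffic_light(value, amber_threshold, red_threshold):
    """Code a monthly outcome frequency using two alert thresholds."""
    if value >= red_threshold:
        return "red"      # adverse trend: needs immediate attention
    if value >= amber_threshold:
        return "amber"    # warning: monitor closely
    return "green"        # within the expected range

# Hypothetical monthly data: indicator -> (observed rate %, amber, red)
monthly = {
    "caesarean section rate": (24.0, 25.0, 30.0),
    "postpartum haemorrhage": (6.5, 5.0, 8.0),
    "stillbirth rate": (1.2, 1.0, 1.1),
}

for indicator, (value, amber, red) in monthly.items():
    print(f"{indicator}: {traffic_light(value, amber, red)}")
```

Even this simple rule captures the core idea reported by the authors: a midwife enters monthly counts, and the coding makes adverse trends visible to front-line staff without any statistical training.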
  • ABSTRACT: The main objective of this descriptive and developmental study was to introduce a design protocol for radiology dashboards. The first step was to determine key performance indicators (KPIs) for the radiology department. The second step was to determine the infrastructure required to implement radiology dashboards, from both data and technology perspectives. The third step was to determine the main features of the radiology dashboards, and the fourth was to determine key criteria for evaluating them. In all these steps, non-probability sampling methods (convenience and purposive) were employed, and sample size was determined based on a persuasion model. Results identified 92 KPIs, 10 main features for designing dashboards, and 53 key criteria for dashboard evaluation. In addition, a prototype of radiology management dashboards covering four aspects (services, clients, personnel, and cost-income) was implemented and evaluated. Applying such dashboards could help managers enhance the performance, productivity, and quality of services in the radiology department.
    Acta Informatica Medica 10/2014; 22(5):341-6. DOI:10.5455/aim.2014.22.341-346
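The four-aspect layout named in the abstract above (services, clients, personnel, cost-income) suggests a natural data structure: each KPI carries its aspect, current value, and target. The sample KPIs and target values below are hypothetical illustrations, not the study's 92 indicators.

```python
from dataclasses import dataclass

@dataclass
class KPI:
    name: str
    aspect: str       # one of: services, clients, personnel, cost-income
    value: float
    target: float

    def on_target(self):
        return self.value >= self.target

kpis = [
    KPI("reports finalised within 24 h (%)", "services", 91.0, 90.0),
    KPI("patient satisfaction score (%)", "clients", 82.0, 85.0),
    KPI("radiologist utilisation (%)", "personnel", 78.0, 75.0),
    KPI("cost recovery ratio (%)", "cost-income", 104.0, 100.0),
]

# Group KPIs by aspect, mirroring the prototype's four dashboard panels.
by_aspect = {}
for kpi in kpis:
    by_aspect.setdefault(kpi.aspect, []).append(kpi)

for aspect, items in by_aspect.items():
    met = sum(k.on_target() for k in items)
    print(f"{aspect}: {met}/{len(items)} KPIs on target")
```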
  • ABSTRACT: To assess the development of local clinical dashboards in line with UK national guidance and to identify ongoing issues faced by maternity units, across an entire health region, in developing quality assurance systems. A mixed-methods study involving all consultant-led maternity units in the South West of England Strategic Health Authority region (SWSHA): an electronic survey, followed by semi-structured interviews with the lead obstetrician and risk management midwife (or equivalent) of each maternity unit, investigated the methods used to monitor outcomes locally, particularly the development of tools including maternity dashboards. Interviews were audio recorded, transcribed, and thematically analysed to identify conceptual categories and themes. Twelve of 15 eligible consultant-led maternity units participated in the study, and 10 of these 12 (83%) used a dashboard. The maternity units used an excessive number of non-standard indicators: 352 different quality indicators (QIs) covering 37 indicator categories, with up to 39 different definitions for one particular QI. The issues identified were an excess of indicators, disproportionate time taken to produce the dashboard, uncertainty surrounding alert thresholds within the dashboards, and a desire for more guidance and standardisation of indicators and their use. Following a recommendation by the Royal College of Obstetricians and Gynaecologists, maternity dashboards have been widely adopted by maternity units across the SWSHA to provide a local quality assurance system. There is, however, wide variation in both the quality indicators monitored and their definitions, and there is an urgent requirement for a national and international core set of maternity QIs. Further guidance is also required to inform alert thresholds for adverse outcomes. These perinatal data are collected electronically, and automating the production of a standardised dashboard is both possible and desirable.
    European Journal of Obstetrics, Gynecology, and Reproductive Biology 07/2013; 170(1). DOI:10.1016/j.ejogrb.2013.06.003 · 1.63 Impact Factor
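The automation the authors above call for hinges on one shared definition per QI: a numerator and denominator computed identically at every unit from electronic perinatal records. A minimal sketch of that idea follows; the field names, thresholds, and sample records are hypothetical, not a proposed national standard.

```python
def qi_rate(records, numerator, denominator):
    """Compute a QI as numerator events per 100 denominator events."""
    denom = [r for r in records if denominator(r)]
    if not denom:
        return None
    num = [r for r in denom if numerator(r)]
    return 100.0 * len(num) / len(denom)

# Hypothetical electronic perinatal records.
births = [
    {"mode": "caesarean", "blood_loss_ml": 400},
    {"mode": "vaginal", "blood_loss_ml": 1600},
    {"mode": "vaginal", "blood_loss_ml": 300},
    {"mode": "caesarean", "blood_loss_ml": 900},
]

# One agreed definition per QI, applied identically at every unit.
caesarean_rate = qi_rate(births, lambda r: r["mode"] == "caesarean",
                         lambda r: True)
pph_rate = qi_rate(births, lambda r: r["blood_loss_ml"] >= 1500,
                   lambda r: True)
print(f"caesarean section rate: {caesarean_rate:.0f}%")
print(f"PPH (>=1500 ml) rate: {pph_rate:.0f}%")
```

Because each definition is a pair of explicit predicates rather than free text, the "up to 39 different definitions for one QI" problem reported in the survey cannot arise: two units running the same code necessarily measure the same thing.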