Development and implementation of a performance measure tool in an academic pediatric research network

Department of Emergency Medicine, University of Michigan, Ann Arbor, MI, United States.
Contemporary clinical trials (Impact Factor: 1.94). 09/2010; 31(5):429-37. DOI: 10.1016/j.cct.2010.05.007
Source: PubMed


The Pediatric Emergency Care Applied Research Network (PECARN) is a federally funded multi-center research network. To promote high quality research within the network, it is important to establish evaluation tools to measure performance of the research sites.
To describe the collaborative development of a site performance measure tool, or "report card," in an academic pediatric research network; to present the report card template; and to discuss the successes and challenges of the report cards. DEVELOPMENT AND IMPLEMENTATION OF THE NETWORK PERFORMANCE MEASURE TOOL: The PECARN Quality Assurance Subcommittee and the PECARN data center were responsible for developing and implementing the report cards. Using a Balanced Scorecard format, four key metrics were identified to align with PECARN's research goals, and performance indicators were defined for each metric. After two years of development, the final report cards have been administered annually since 2005. Protocol submission time to the Institutional Review Board (IRB) improved between 2005 and 2007. Mean overall report card scores across sites increased during this period, with less variance between the highest- and lowest-performing sites, indicating overall improvement.
Report cards have helped PECARN sites and investigators focus on performance improvement and may have contributed to improved operations and efficiencies within the network.
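To make the Balanced Scorecard aggregation concrete, here is a minimal sketch of how a composite site report-card score might be computed from per-metric indicator scores. The metric names, weights, and site scores below are hypothetical illustrations for the sketch; the paper does not publish PECARN's actual metric definitions or weighting.

```python
# Hypothetical Balanced Scorecard-style aggregation of site report cards.
# Metric names, weights, and scores are illustrative, not PECARN's actual values.

def report_card_score(indicator_scores, weights):
    """Weighted average of per-metric indicator scores (0-100 scale)."""
    total_weight = sum(weights.values())
    return sum(indicator_scores[m] * w for m, w in weights.items()) / total_weight

# Four hypothetical key metrics, equally weighted.
weights = {
    "protocol_startup": 0.25,       # e.g., time to IRB submission
    "subject_enrollment": 0.25,
    "data_quality": 0.25,
    "regulatory_compliance": 0.25,
}

# Illustrative indicator scores for two sites.
sites = {
    "site_A": {"protocol_startup": 90, "subject_enrollment": 80,
               "data_quality": 85, "regulatory_compliance": 95},
    "site_B": {"protocol_startup": 70, "subject_enrollment": 75,
               "data_quality": 80, "regulatory_compliance": 85},
}

scores = {name: report_card_score(s, weights) for name, s in sites.items()}
mean_score = sum(scores.values()) / len(scores)
# Spread between best and worst sites: the gap the network tracked over time.
spread = max(scores.values()) - min(scores.values())
```

Tracking the mean score and the best-to-worst spread year over year mirrors the abstract's finding: rising means with shrinking spread suggest network-wide improvement rather than gains at only a few sites.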

Available from: Richard M Ruddy
    • "Through analysis of other practical materials, the direction and detailed guidelines for electronic data management were analyzed. In addition, the basic framework of this study for evaluating 'clinical trial data management' builds on the study by Stanley et al. [17], which developed measures for evaluating the 'clinical trial itself'"
    ABSTRACT: Electronic data management is becoming important for reducing the overall cost and run-time of clinical data management while enhancing data quality. It must also meet regulatory guidelines governing the overall quality and safety of electronic clinical trials. The purpose of this paper is to develop a performance evaluation framework for electronic clinical data management. Four key metrics are proposed in the areas of infrastructure, intellectual preparation, study implementation, and study completion, covering the major aspects of clinical trial processes. The performance measures evaluate the extent of regulatory compliance, data quality, and the cost and efficiency of the electronic data management process, and provide measurement indicators for each evaluation item. Based on the key metrics, the performance evaluation framework is developed for the three major areas involved in clinical data management: the clinical site, monitoring, and the data coordinating center. As this is an initial attempt, conducted via a Delphi survey, to evaluate the extent of electronic data management in clinical trials, further empirical studies are planned and recommended.
    02/2012; 13(1). DOI:10.7472/jksii.2012.13.1.45
  • ABSTRACT: Background: Academic hospitalist groups (AHGs) are often expected to excel in multiple domains: quality improvement, patient safety, education, research, administration, and clinical care. To be successful, AHGs must develop strategies to balance their energies, resources, and performance. The balanced scorecard (BSC) is a strategic management system that enables organizations to translate their mission and vision into specific objectives and metrics across multiple domains. To date, no hospitalist group has reported on BSC implementation. We set out to develop a BSC as part of a strategic planning initiative. Methods: Based on a needs assessment of the University of California, San Francisco, Division of Hospital Medicine, mission and vision statements were developed. We engaged representative faculty to develop strategic objectives and determine performance metrics across the 4 BSC perspectives. Results: Of the 41 metrics identified, 16 were chosen for the initial BSC. The BSC allowed us to achieve several goals: 1) present a broad view of performance, 2) create transparency and accountability, 3) communicate goals and engage faculty, and 4) ensure we use data to guide strategic decisions. Several lessons were learned, including the need to build faculty consensus and to establish metrics with reliable, measurable data, as well as the power of the BSC to drive goals across the division. Conclusions: We successfully developed and implemented a BSC in an AHG as part of a strategic planning initiative. The BSC has been instrumental in allowing us to achieve balanced success in multiple domains. Academic groups should consider employing the BSC, as it allows for a data-driven strategic planning and assessment process.
    Journal of Hospital Medicine 03/2013; 8(3). DOI:10.1002/jhm.2006 · 2.30 Impact Factor