Advancing the application, quality and harmonization of implementation science measures

Implementation Science (Impact Factor: 4.12). 12/2012; 7(1):119. DOI: 10.1186/1748-5908-7-119
Source: PubMed


The field of implementation science (IS) encompasses a broad range of constructs and uses measures from a variety of disciplines. However, there has been little standardization of measures or agreement on definitions of constructs across different studies, fields, authors, or research groups.

We describe a collaborative, web-based activity using the United States National Cancer Institute’s (NCI) Grid-Enabled Measures (GEM) portal that uses a wiki platform to focus discussion and engage the research community to enhance the quality and harmonization of measures for IS health-related research and practice. We present the history, process, and preliminary data from the GEM Dissemination & Implementation (D&I) Campaign on IS measurement.

As of this writing, the GEM D&I Campaign has been running for eight weeks, using a combination of expert-opinion and crowdsourcing approaches. To date, it has listed definitions for 45 constructs and summarized information on 120 measures. Website usage peaked in week seven, with 124 views from 89 visitors. Users from seven countries have contributed measures and/or constructs, shared their experience with different measures, posted comments, and identified research gaps and needs.

Thus far, this campaign has provided information about different IS measures, their associated characteristics, and comments. The next step is to rate these measures for quality and practicality. This resource and ongoing activity have potential to advance the quality and harmonization of IS measures and constructs, and we invite readers to contribute to the process.

Available from: Enola K Proctor, Aug 20, 2014
    • "Validating the implementation climate measure within the child welfare setting will further support the generalizability of the ICS and provide a useful tool that can be utilized to improve child welfare practice. It is also in line with the recent push by researchers in implementation science for the development and evaluation of measures related to EBP implementation (Rabin et al., 2012; Proctor, Powell, & Feely, 2014). "
    ABSTRACT: There is increasing emphasis on the use of evidence-based practices (EBPs) in child welfare settings and growing recognition of the importance of the organizational environment, and the organization's climate in particular, for how employees perceive and support EBP implementation. Recently, Ehrhart, Aarons, and Farahnak (2014) reported on the development and validation of a measure of EBP implementation climate, the Implementation Climate Scale (ICS), in a sample of mental health clinicians. The ICS consists of 18 items and measures six critical dimensions of implementation climate: focus on EBP, educational support for EBP, recognition for EBP, rewards for EBP, selection for EBP, and selection for openness. The goal of the current study is to extend this work by providing evidence for the factor structure, reliability, and validity of the ICS in a sample of child welfare service providers. Survey data were collected from 215 child welfare providers across three states, 12 organizations, and 43 teams. Confirmatory factor analysis demonstrated good fit to the six-factor model, and the alpha reliabilities for the overall measure and its subscales were acceptable. In addition, there was general support for the invariance of the factor structure across the child welfare and mental health sectors. In conclusion, this study provides evidence for the factor structure, reliability, and validity of the ICS measure for use in child welfare service organizations.
    Article · Nov 2015 · Child Abuse & Neglect
    • "Future studies should take care to integrate a wider range of implementation outcomes whenever possible, as they can serve as indicators of implementation success, proximal indicators of implementation processes, and key intermediate outcomes in relation to service system or clinical outcomes (Proctor et al., 2011). Efforts are underway to catalog and rate the quality of implementation related measures (Rabin et al., 2012) to promote the use of a wider array of valid and reliable instruments and serve to catalyze the development of new measures needed to advance the field. A promising finding of this review was that approximately two thirds (64%) of studies demonstrated beneficial effects of employing implementation strategies to improve intermediate and/or clinical outcomes over the comparison conditions. "
    ABSTRACT: This systematic review examines experimental studies that test the effectiveness of strategies intended to integrate empirically supported mental health interventions into routine care settings. Our goal was to characterize the state of the literature and to provide direction for future implementation studies. A literature search was conducted using electronic databases and a manual search. Eleven studies were identified that tested implementation strategies with a randomized (n = 10) or controlled clinical trial design (n = 1). The wide range of clinical interventions, implementation strategies, and outcomes evaluated precluded meta-analysis. However, the majority of studies (n = 7; 64%) found a statistically significant effect in the hypothesized direction for at least one implementation or clinical outcome. There is a clear need for more rigorous research on the effectiveness of implementation strategies, and we provide several suggestions that could improve this research area.
    Article · Mar 2014 · Research on Social Work Practice
    • "This study will be among the first to provide the public health field with information about the facilitators and strategies that state level practitioners use in evidence based chronic disease prevention. Measures of dissemination among practitioners working in prevention of cancer and other chronic diseases are lacking [79-82]. This study will be among the first to develop, test, and utilize such measures. "
    ABSTRACT: BackgroundCancer and other chronic diseases reduce quality and length of life and productivity, and represent a significant financial burden to society. Evidence-based public health approaches to prevent cancer and other chronic diseases have been identified in recent decades and have the potential for high impact. Yet, barriers to implement prevention approaches persist as a result of multiple factors including lack of organizational support, limited resources, competing emerging priorities and crises, and limited skill among the public health workforce. The purpose of this study is to learn how best to promote the adoption of evidence based public health practice related to chronic disease prevention.Methods/designThis paper describes the methods for a multi-phase dissemination study with a cluster randomized trial component that will evaluate the dissemination of public health knowledge about evidence-based prevention of cancer and other chronic diseases. Phase one involves development of measures of practitioner views on and organizational supports for evidence-based public health and data collection using a national online survey involving state health department chronic disease practitioners. In phase two, a cluster randomized trial design will be conducted to test receptivity and usefulness of dissemination strategies directed toward state health department chronic disease practitioners to enhance capacity and organizational support for evidence-based chronic disease prevention. Twelve state health department chronic disease units will be randomly selected and assigned to intervention or control. State health department staff and the university-based study team will jointly identify, refine, and select dissemination strategies within intervention units. Intervention (dissemination) strategies may include multi-day in-person training workshops, electronic information exchange modalities, and remote technical assistance. 
Evaluation methods include pre-post surveys, structured qualitative phone interviews, and abstraction of state-level chronic disease prevention program plans and progress reports. Trial registration: NCT01978054.
    Article · Dec 2013 · Implementation Science