Advancing the application, quality and harmonization of implementation science measures

Implementation Science (Impact Factor: 4.12). 12/2012; 7(1):119. DOI: 10.1186/1748-5908-7-119
Source: PubMed

ABSTRACT

Background
The field of implementation science (IS) encompasses a broad range of constructs and uses measures from a variety of disciplines. However, there has been little standardization of measures or agreement on definitions of constructs across different studies, fields, authors, or research groups.

Methods
We describe a collaborative, web-based activity using the United States National Cancer Institute's (NCI) Grid-Enabled Measures (GEM) portal, which uses a wiki platform to focus discussion and engage the research community in enhancing the quality and harmonization of measures for IS health-related research and practice. We present the history, process, and preliminary data from the GEM Dissemination & Implementation (D&I) Campaign on IS measurement.

Results
The GEM D&I Campaign has been ongoing for eight weeks as of this writing and has used a combination of expert-opinion and crowd-sourcing approaches. To date it has listed definitions for 45 constructs and summarized information on 120 measures. Website usage peaked in week seven, at 124 views from 89 visitors. Users from seven countries have contributed measures and/or constructs, shared their experience with different measures, posted comments, and identified research gaps and needs.

Conclusions
Thus far, this campaign has compiled information about different IS measures, their associated characteristics, and users' comments. The next step is to rate these measures for quality and practicality. This resource and ongoing activity have the potential to advance the quality and harmonization of IS measures and constructs, and we invite readers to contribute to the process.

Available from: Enola K Proctor, Aug 20, 2014
    • "Future studies should take care to integrate a wider range of implementation outcomes whenever possible, as they can serve as indicators of implementation success, proximal indicators of implementation processes, and key intermediate outcomes in relation to service system or clinical outcomes (Proctor et al., 2011). Efforts are underway to catalog and rate the quality of implementation related measures (Rabin et al., 2012) to promote the use of a wider array of valid and reliable instruments and serve to catalyze the development of new measures needed to advance the field. A promising finding of this review was that approximately two thirds (64%) of studies demonstrated beneficial effects of employing implementation strategies to improve intermediate and/or clinical outcomes over the comparison conditions. "
    ABSTRACT: This systematic review examines experimental studies that test the effectiveness of strategies intended to integrate empirically supported mental health interventions into routine care settings. Our goal was to characterize the state of the literature and to provide direction for future implementation studies. A literature search was conducted using electronic databases and a manual search. Eleven studies were identified that tested implementation strategies with a randomized (n = 10) or controlled clinical trial design (n = 1). The wide range of clinical interventions, implementation strategies, and outcomes evaluated precluded meta-analysis. However, the majority of studies (n = 7; 64%) found a statistically significant effect in the hypothesized direction for at least one implementation or clinical outcome. There is a clear need for more rigorous research on the effectiveness of implementation strategies, and we provide several suggestions that could improve this research area.
    Research on Social Work Practice 03/2014; 24(2):192-212. DOI:10.1177/1049731513505778 · 1.53 Impact Factor
    • "This study will be among the first to provide the public health field with information about the facilitators and strategies that state level practitioners use in evidence based chronic disease prevention. Measures of dissemination among practitioners working in prevention of cancer and other chronic diseases are lacking [79-82]. This study will be among the first to develop, test, and utilize such measures. "
    ABSTRACT: Background: Cancer and other chronic diseases reduce quality and length of life and productivity, and represent a significant financial burden to society. Evidence-based public health approaches to prevent cancer and other chronic diseases have been identified in recent decades and have the potential for high impact. Yet, barriers to implementing prevention approaches persist as a result of multiple factors, including lack of organizational support, limited resources, competing emerging priorities and crises, and limited skill among the public health workforce. The purpose of this study is to learn how best to promote the adoption of evidence-based public health practice related to chronic disease prevention. Methods/design: This paper describes the methods for a multi-phase dissemination study with a cluster randomized trial component that will evaluate the dissemination of public health knowledge about evidence-based prevention of cancer and other chronic diseases. Phase one involves development of measures of practitioner views on, and organizational supports for, evidence-based public health, with data collection through a national online survey of state health department chronic disease practitioners. In phase two, a cluster randomized trial design will be used to test the receptivity and usefulness of dissemination strategies directed toward state health department chronic disease practitioners to enhance capacity and organizational support for evidence-based chronic disease prevention. Twelve state health department chronic disease units will be randomly selected and assigned to intervention or control. State health department staff and the university-based study team will jointly identify, refine, and select dissemination strategies within intervention units. Intervention (dissemination) strategies may include multi-day in-person training workshops, electronic information exchange modalities, and remote technical assistance. Evaluation methods include pre-post surveys, structured qualitative phone interviews, and abstraction of state-level chronic disease prevention program plans and progress reports. Trial registration: NCT01978054.
    Implementation Science 12/2013; 8(1):141. DOI:10.1186/1748-5908-8-141 · 4.12 Impact Factor
    • "It should also be noted that the Powell and colleagues’ compilation includes a number of strategies that could not be reasonably adopted by the participants of this study (e.g., ‘centralize technical assistance’) [9], and those strategies will be eliminated. The survey will also be informed by a conceptual taxonomy of implementation outcomes [74] and other existing surveys drawn from implementation science measures collections [75,76]. In addition to basic demographic questions, stakeholders will be asked whether or not they have experienced each included implementation strategy (yes or no) and will then rate each strategy (using a Likert-style scale) on the following dimensions: ‘effectiveness’ and ‘relative importance’ (i.e., How well did it work and how important was it relative to other strategies?), ‘acceptability’ (i.e., How agreeable, palatable, or satisfactory is the strategy?), "
    ABSTRACT: Improving quality in children's mental health and social service settings will require implementation strategies capable of moving effective treatments and other innovations (e.g., assessment tools) into routine care. Efforts to identify, develop, and refine implementation strategies are likely to be more successful if they are informed by relevant stakeholders and are responsive to the strengths and limitations of the contexts and implementation processes identified in usual care settings. This study will describe: the types of implementation strategies used; how organizational leaders make decisions about what to implement and how to approach the implementation process; organizational stakeholders' perceptions of different implementation strategies; and the potential influence of organizational culture and climate on implementation strategy selection, implementation decision-making, and stakeholders' perceptions of implementation strategies. Methods/design: This study is a mixed-methods multiple case study of seven children's social service organizations in one midwestern city in the United States that compose the control group of a larger randomized controlled trial. Qualitative data will include semi-structured interviews with organizational leaders (e.g., CEOs/directors, clinical directors, program managers) and a review of documents (e.g., implementation and quality improvement plans, program manuals) that will shed light on implementation decision-making and on the specific implementation strategies used to implement new programs and practices. Additionally, focus groups with clinicians will explore their perceptions of a range of implementation strategies. This qualitative work will inform the development of a Web-based survey that will assess the perceived effectiveness, relative importance, acceptability, feasibility, and appropriateness of implementation strategies from the perspective of both clinicians and organizational leaders. Finally, the Organizational Social Context measure will be used to assess organizational culture and climate. Qualitative, quantitative, and mixed-methods data will be analyzed and interpreted at the case level as well as across cases in order to highlight meaningful similarities, differences, and site-specific experiences. This study is designed to inform efforts to develop more effective implementation strategies by fully describing the implementation experiences of a sample of community-based organizations that provide mental health services to youth in one midwestern city.
    Implementation Science 08/2013; 8(1):92. DOI:10.1186/1748-5908-8-92 · 4.12 Impact Factor