A practical, robust implementation and sustainability model (PRISM) for integrating research findings into practice.

Northwest Permanente, Portland, Oregon, USA.
Joint Commission Journal on Quality and Patient Safety / Joint Commission Resources, 05/2008; 34(4):228-43.
Source: PubMed

ABSTRACT

BACKGROUND: Although numerous studies address the efficacy and effectiveness of health interventions, less research addresses successfully implementing and sustaining interventions. As long as efficacy and effectiveness trials are considered complete without considering implementation in nonresearch settings, the public health potential of the original investments will not be realized. A barrier to progress is the absence of a practical, robust model to help identify the factors that need to be considered and addressed and how to measure success. A conceptual framework for improving practice is needed to integrate the key features for successful program design, predictors of implementation and diffusion, and appropriate outcome measures.

DEVELOPING PRISM: A comprehensive model for translating research into practice was developed using concepts from the areas of quality improvement, chronic care, the diffusion of innovations, and measures of the population-based effectiveness of translation. PRISM, the Practical, Robust Implementation and Sustainability Model, evaluates how the health care program or intervention interacts with the recipients to influence program adoption, implementation, maintenance, reach, and effectiveness.

DISCUSSION: The PRISM model provides a new tool for researchers and health care decision makers that integrates existing concepts relevant to translating research into practice.

    ABSTRACT: This paper summarizes recommendations made regarding the National Action Alliance for Suicide Prevention Research Prioritization Task Force’s Aspirational Goal 2, to “determine the degree of suicide risk (e.g., imminent, near-term, long-term) among individuals in diverse populations and in diverse settings through feasible and effective screening and assessment approaches.” We recommend that researchers shift to using “design for dissemination” principles to maximize both the goodness of fit and validity of screening and assessment measures for a given setting. Three specific recommendations to guide research efforts are made to achieve this shift: (1) the parameters related to each setting, including the logistics, scope of practice, infrastructure, and decision making required, should be identified and used to choose or design screening and assessment instruments that have a good fit; (2) to the greatest feasible extent, technology should be used to support screening and assessment; and (3) researchers should study the best methods for translating validated instruments into routine clinical practice. We discuss the potential barriers to implementing these recommendations and illustrate the paradigm shift within the emergency department setting.
    American Journal of Preventive Medicine. 01/2014; 47(3):S163–S169.
    ABSTRACT: The effects of health promotion interventions are the result not only of the interventions themselves, but also of the contexts in which they unfold. The objective of this study was to analyze, through stakeholders' discourse, the characteristics of an intervention that can influence its outcomes.
    BMC Public Health 11/2014; 14(1):1134.
    ABSTRACT: Background: Like many new fields, implementation science has become vulnerable to instrumentation issues that potentially threaten the strength of the developing knowledge base. For instance, many implementation studies report findings based on instruments that do not have established psychometric properties. This article aims to review six pressing instrumentation issues, discuss the impact of these issues on the field, and provide practical recommendations. Discussion: This debate centers on the impact of the following instrumentation issues: use of frameworks, theories, and models; role of psychometric properties; use of 'home-grown' and adapted instruments; choosing the most appropriate evaluation method and approach; practicality; and need for decision-making tools. Practical recommendations include: use of consensus definitions for key implementation constructs; reporting standards (e.g., regarding psychometrics, instrument adaptation); when to use multiple forms of observation and mixed methods; and accessing instrument repositories and decision aid tools. Summary: This debate provides an overview of six key instrumentation issues and offers several courses of action to limit the impact of these issues on the field. With careful attention to these issues, the field of implementation science can potentially move forward at the rapid pace that is respectfully demanded by community stakeholders.
    Implementation Science 09/2014; 9(1):118.