Evaluating the successful implementation of evidence into practice using the PARIHS framework: Theoretical and practical challenges

Green College, University of Oxford, Woodstock Road, Oxford OX2 6HG, UK.
Implementation Science. 2008;3:1. DOI: 10.1186/1748-5908-3-1
Source: PubMed


The PARiHS framework (Promoting Action on Research Implementation in Health Services) has proved to be a useful practical and conceptual heuristic for many researchers and practitioners in framing their research or knowledge translation endeavours. However, as a conceptual framework it remains untested, and therefore its contribution to the overall development and testing of theory in the field of implementation science is largely unquantified.
Given this, the paper first provides an integrated summary of our conceptual and theoretical thinking to date and introduces a typology (derived from social policy analysis) used to distinguish between the terms conceptual framework, theory, and model - important definitional and conceptual issues in refining theoretical and methodological approaches to knowledge translation. Second, the paper describes the next phase of our work, concentrating on the conceptual thinking and mapping that generated the hypothesis that the PARiHS framework is best utilised as a two-stage process: first as a preliminary (diagnostic and evaluative) measure of the elements and sub-elements of evidence (E) and context (C), and then using the aggregated data from these measures to determine the most appropriate facilitation method. The exact nature of the intervention is thus determined by the specific actors in the specific context at a specific time and place. In refining this next phase of our work, we have had to consider the wider issues around the use of theories to inform and shape our research activity; the ongoing challenges of developing robust and sensitive measures; facilitation as an intervention for getting research into practice; and, finally, how current debates around evidence into practice are adopting wider notions that apply to innovations more generally.
The paper concludes by suggesting that the future direction of the work on the PARiHS framework is to develop a two-stage diagnostic and evaluative approach, where the intervention is shaped and moulded by the information gathered about the specific situation and from participating stakeholders. In order to expedite the generation of new evidence and testing of emerging theories, we suggest the formation of an international research implementation science collaborative that can systematically collect and analyse experiences of using and testing the PARiHS framework and similar conceptual and theoretical approaches. We also recommend further refinement of the definitions around conceptual framework, theory, and model, suggesting a wider discussion that embraces multiple epistemological and ontological perspectives.


Available from: Jo Rycroft-Malone
    • "Clinical practice guidelines (CPGs) are commonly used as a strategy to overcome this gap, but the assumption that a CPG will implement itself is long gone, as there are several factors to consider for its implementation [5]. Successful implementation can, according to the Promoting Action on Research Implementation in Health Services (PARIHS) framework, be understood as a function of the relationship between evidence, context, and facilitation [6-8]. The framework suggests that successful implementation is more likely to occur when evidence and context are considered high."

    Full-text · Article · Jan 2016 · Implementation Science
    • "Although only a few authors used the term "uncertainty," authors used other terms (e.g., "unproven knowledge"; Elwyn et al., 2007) to refer to similar concepts. Uncertainty is high when guidance on how to implement an intervention is limited, the evidence base for available EBIs is weak, or the potential to translate an EBI to a new context is unknown (DeGroff et al., 2010; Elwyn et al., 2007; Kitson et al., 2008; Yuan et al., 2010). The evidence base for translating EBIs to new contexts often is less certain when those contexts are complex adaptive systems that are composed of numerous elements that interact in ways that cannot be predicted (Lanham et al., 2013; Snowden & Boone, 2007)."
    ABSTRACT: Public health and other community-based practitioners have access to a growing number of evidence-based interventions (EBIs), and yet EBIs continue to be underused. One reason for this underuse is that practitioners often lack the capacity (knowledge, skills, and motivation) to select, adapt, and implement EBIs. Training, technical assistance, and other capacity-building strategies can be effective at increasing EBI adoption and implementation. However, little is known about how to design capacity-building strategies or tailor them to differences in capacity required across varying EBIs and practice contexts. To address this need, we conducted a scoping study of frameworks and theories detailing variations in EBIs or practice contexts and how to tailor capacity-building to address those variations. Using an iterative process, we consolidated constructs and propositions across 24 frameworks and developed a beginning theory to describe salient variations in EBIs (complexity and uncertainty) and practice contexts (decision-making structure, general capacity to innovate, resource and values fit with EBI, and unity vs. polarization of stakeholder support). The theory also includes propositions for tailoring capacity-building strategies to address salient variations. To have wide-reaching and lasting impact, the dissemination of EBIs needs to be coupled with strategies that build practitioners' capacity to adopt and implement a variety of EBIs across diverse practice contexts.
    Full-text · Article · Oct 2015 · Health Education & Behavior
    • "For programs of research in speech-language pathology that are ready for translation into clinical practice, the use of these kinds of theoretical approaches would serve to identify what factors impact uptake of research evidence into clinical practice, how to best facilitate the implementation process, and how to tell when efforts to make practice change have succeeded or failed. As just one example, the Promoting Action on Research Implementation in Health Services (PARIHS) framework outlines key factors (i.e., determinants) to consider in implementation of evidence-based practices (Rycroft-Malone, 2004; Kitson et al., 2008). In this model, it is the interaction among perceptions of evidence by stakeholders, the organizational context, and characteristics of the innovation (i.e., ensuring that the intervention is perceived as acceptable and manageable by clinicians) that encourages successful implementation of evidence-based practice."
    ABSTRACT: Purpose: To provide a resource of pertinent information concerning implementation science for immediate research application in communication sciences and disorders (CSD). Method: Key terminology related to implementation science is reviewed. Practical suggestions for the application of implementation science theories and methodologies are provided, including an overview of hybrid research designs that simultaneously investigate clinical effectiveness and implementation as well as an introduction to approaches for engaging stakeholders in the research process. A detailed example from education is shared to show how implementation science was utilized to move an intervention program for autism into routine practice in the public school system. In particular, the example highlights the value of strong partnership between researchers, policymakers, and frontline practitioners in implementing and sustaining new evidence-based practices. Conclusions: Implementation science is not just a buzzword. This is a new field of study that can make a substantive contribution in CSD by informing research agendas, reducing health and education disparities, improving accountability and quality control, increasing clinician satisfaction and competence, and improving client outcomes.
    Full-text · Article · Oct 2015 · Journal of Speech Language and Hearing Research