Article

Evaluating the successful implementation of evidence into practice using the PARiHS framework: theoretical and practical challenges

Green College, University of Oxford, Woodstock Road, Oxford OX2 6HG, UK.
Implementation Science (Impact Factor: 3.47). 02/2008; 3:1. DOI: 10.1186/1748-5908-3-1
Source: PubMed

ABSTRACT: The PARiHS framework (Promoting Action on Research Implementation in Health Services) has proved to be a useful practical and conceptual heuristic for many researchers and practitioners in framing their research or knowledge translation endeavours. As a conceptual framework, however, it remains untested, and its contribution to the development and testing of theory in the field of implementation science is therefore largely unquantified.
Given this, the paper first provides an integrated summary of our conceptual and theoretical thinking so far and introduces a typology (derived from social policy analysis) used to distinguish between the terms conceptual framework, theory and model - important definitional issues in refining theoretical and methodological approaches to knowledge translation. Second, the paper describes the next phase of our work, concentrating on the conceptual thinking and mapping that has led to the hypothesis that the PARiHS framework is best used as a two-stage process: first, a preliminary (diagnostic and evaluative) measurement of the elements and sub-elements of evidence (E) and context (C); then, use of the aggregated data from these measures to determine the most appropriate facilitation method. The exact nature of the intervention is thus determined by the specific actors in the specific context at a specific time and place. In refining this next phase of our work, we have had to consider the wider issues around the use of theories to inform and shape our research activity; the ongoing challenges of developing robust and sensitive measures; facilitation as an intervention for getting research into practice; and, finally, the way current debates around evidence into practice are adopting wider notions that fit innovations more generally.
The paper concludes by suggesting that the future direction of the work on the PARiHS framework is to develop this two-stage diagnostic and evaluative approach, in which the intervention is shaped by the information gathered about the specific situation and from participating stakeholders. To expedite the generation of new evidence and the testing of emerging theories, we suggest the formation of an international research implementation science collaborative that can systematically collect and analyse experiences of using and testing the PARiHS framework and similar conceptual and theoretical approaches. We also recommend further refinement of the definitions of conceptual framework, theory, and model, and suggest a wider discussion that embraces multiple epistemological and ontological perspectives.
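To illustrate the two-stage logic proposed above, the sketch below (in Python) shows how ratings of the evidence (E) and context (C) sub-elements described in the PARiHS framework (successful implementation expressed as SI = f(E, C, F)) might be aggregated and then mapped to a facilitation emphasis. This is a minimal illustrative sketch only: the 0-to-1 rating scale, the averaging rule, the 0.5 threshold and the facilitation labels are assumptions introduced here, not validated measures or methods from the paper.

```python
# Illustrative sketch of the hypothesised two-stage use of the PARiHS framework.
# Stage 1: rate the evidence (E) and context (C) sub-elements on a low-to-high
# continuum and aggregate them; Stage 2: use the aggregated E/C profile to
# suggest a facilitation emphasis. The scale, aggregation rule, threshold and
# labels below are assumptions for illustration, not validated PARiHS measures.

from statistics import mean

# Sub-element names follow the PARiHS framework; the ratings (0 = low, 1 = high)
# are hypothetical example data.
evidence = {
    "research": 0.8,
    "clinical_experience": 0.6,
    "patient_experience": 0.5,
    "local_data": 0.4,
}
context = {
    "culture": 0.3,
    "leadership": 0.4,
    "evaluation": 0.2,
}


def diagnose(e_scores, c_scores):
    """Stage 1: aggregate sub-element ratings into summary E and C scores."""
    return mean(e_scores.values()), mean(c_scores.values())


def suggest_facilitation(e, c, threshold=0.5):
    """Stage 2: map the E/C profile to a facilitation emphasis (illustrative)."""
    if e >= threshold and c >= threshold:
        return "enabling facilitation: support practitioners in applying strong evidence"
    if e >= threshold:
        return "context-focused facilitation: strengthen culture, leadership and evaluation"
    if c >= threshold:
        return "evidence-focused facilitation: appraise and build the evidence base"
    return "intensive facilitation: develop both the evidence base and the context"


e_score, c_score = diagnose(evidence, context)
print(f"E = {e_score:.2f}, C = {c_score:.2f}")
print(suggest_facilitation(e_score, c_score))
```

The point of the sketch is the shape of the process rather than any particular scoring rule: a diagnostic and evaluative first stage feeds a second stage in which the facilitation approach is negotiated to fit the specific actors, context, time and place.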

  • ABSTRACT: Background: Meticulous steps and procedures are proposed in planning guidelines for the development of comprehensive multiyear plans for national immunization programmes. However, we know very little about whether the real-life experience of those who adopt these guidelines involves following these procedures as expected. Are these steps and procedures followed in practice? We examined the adoption and usage of the guidelines in planning national immunization programmes and assessed whether the recommendations in these guidelines are applied as consistently as intended. Methods: We gathered information from the national comprehensive multiyear plans developed by 77 low-income countries. For each of the 11 components, we examined how each country applied the four recommended steps of situation analysis, problem prioritization, selection of interventions, and selection of indicators. We then conducted an analysis to determine the patterns of alignment of the comprehensive multiyear plans with those four recommended planning steps. Results: Within the first 3 years following publication of the guidelines, 66 (86%) countries used the tool to develop their comprehensive multiyear plans. The funding conditions attached to the use of these guidelines appeared to influence their rapid adoption and usage. Overall, only 33 (43%) countries fully applied all four recommended planning steps of the guidelines. Conclusions: Adoption and usage of the guidelines for the development of comprehensive multiyear plans for national immunization programmes were rapid. However, our findings show substantial variation between the proposed planning ideals set out in the guidelines and actual use in practice. A better understanding of factors that influence how recommendations in public health guidelines are applied in practice could contribute to improvements in guideline design. It could also help adjust strategies used to introduce them into public health programmes, with the ultimate goal of a greater health impact.
    Implementation Science 04/2015; 10. DOI:10.1186/s13012-015-0239-8 · 3.47 Impact Factor
  • ABSTRACT: To explore improvement facilitators' experiences of handling their commission to implement evidence-based practice in elderly care for frail older persons. Improvement facilitators were put in place across Sweden in a time-limited government project, one part of which was to evaluate the model before establishing this facilitation of evidence-based practice in elderly care. Two focus groups, each comprising three respondents, were interviewed twice. The interviews were analysed using qualitative content analysis. A main theme, 'Moving forward by adjusting to the circumstances', described how the improvement facilitators handle their commitment. Five subthemes emerged: identifying barriers, keeping focus, maintaining motivation, building bridges and finding balance. The improvement facilitators' commitment is ambiguous because of unclear leadership of, and responsibility for, the national investment. They have to handle leaders' different approaches and justify the need for evidence-based practice. The improvement facilitators did not reflect on the impact of programme adaptations on evidence-based practice. The findings emphasise the need for collaboration between the improvement facilitator and the nurse manager. To fully implement evidence-based practice, negotiations with current practitioners for adaptation to local conditions are necessary. Furthermore, the value of improving organisational performance needs to be rigorously communicated throughout the organisation.
    Journal of Nursing Management 04/2015; DOI:10.1111/jonm.12300 · 1.14 Impact Factor
  • ABSTRACT: Implementation science has progressed towards increased use of theoretical approaches to provide better understanding and explanation of how and why implementation succeeds or fails. The aim of this article is to propose a taxonomy that distinguishes between different categories of theories, models and frameworks in implementation science, to facilitate appropriate selection and application of relevant approaches in implementation research and practice, and to foster cross-disciplinary dialogue among implementation researchers. Theoretical approaches used in implementation science have three overarching aims: describing and/or guiding the process of translating research into practice (process models); understanding and/or explaining what influences implementation outcomes (determinant frameworks, classic theories, implementation theories); and evaluating implementation (evaluation frameworks). This article proposes five categories of theoretical approaches to serve these three overarching aims. These categories are not always recognized as separate types of approaches in the literature. While there is overlap between some of the theories, models and frameworks, awareness of the differences is important to facilitate the selection of relevant approaches. Most determinant frameworks provide limited "how-to" support for carrying out implementation endeavours, since the determinants are usually too generic to provide sufficient detail for guiding an implementation process. Conversely, while the relevance of addressing barriers and enablers to translating research into practice is mentioned in many process models, these models do not identify or systematically structure specific determinants associated with implementation success. Furthermore, process models recognize a temporal sequence of implementation endeavours, whereas determinant frameworks do not explicitly take a process perspective of implementation.
    Implementation Science 04/2015; 10(1):53. DOI:10.1186/s13012-015-0242-0 · 3.47 Impact Factor
