The Quality Implementation Framework: A Synthesis of Critical
Steps in the Implementation Process
Duncan C. Meyers · Joseph A. Durlak · Abraham Wandersman
© Society for Community Research and Action 2012
Abstract Implementation science is growing in impor-
tance among funders, researchers, and practitioners as an
approach to bridging the gap between science and practice.
We addressed three goals to contribute to the understand-
ing of the complex and dynamic nature of implementation.
Our first goal was to provide a conceptual overview of the
process of implementation by synthesizing information
from 25 implementation frameworks. The synthesis
extends prior work by focusing on specific actions (i.e., the
“how to”) that can be employed to foster high quality
implementation. The synthesis identified 14 critical steps
that were used to construct the Quality Implementation
Framework (QIF). These steps comprise four QIF phases:
Initial Considerations Regarding the Host Setting, Creating
a Structure for Implementation, Ongoing Structure Once
Implementation Begins, and Improving Future Applica-
tions. Our second goal was to summarize research support
for each of the 14 QIF steps and to offer suggestions to
direct future research efforts. Our third goal was to outline
practical implications of our findings for improving future
implementation efforts in the world of practice. The QIF’s
critical steps can serve as a useful blueprint for future
research and practice. Applying the collective guidance
synthesized by the QIF to the Interactive Systems Frame-
work for Dissemination and Implementation (ISF)
emphasizes that accountability for quality implementation
does not rest with the practitioner Delivery System alone.
Instead, all three ISF systems are mutually accountable for
quality implementation.
Keywords Implementation · Knowledge utilization · Implementation framework · Implementation science
Numerous reviews have investigated the process of imple-
mentation and have advanced our understanding of how it
unfolds (e.g., Fixsen et al. 2005; Greenhalgh et al. 2004; Hall
and Hord 2006; Rogers 2003). We now have a growing body
of: (1) evidence which clearly indicates that implementation
influences desired outcomes (e.g., Aarons et al. 2009;
DuBois et al. 2002; Durlak and DuPre 2008; Smith et al.
2004; Tobler 1986; Wilson et al. 2003) and (2) several
frameworks that provide an overview of ideas and practices
that shape the complex implementation process (e.g.,
Damschroder et al. 2009; Greenberg et al. 2005). In recog-
nition of its critical importance, various professional groups
have determined that one of the criteria related to identifying
evidence-based interventions should involve documentation
of effective implementation (e.g., Society for Prevention
Research, Division 16 of the American Psychological
Association). In addition, various funders are emphasizing
implementation research and making more funds available
to address implementation in research proposals (e.g., The
William T. Grant Foundation, National Cancer Institute,
National Institute of Mental Health).
Prominent research agencies have intensified their role
in the advancement of implementation science. For
example, the National Institutes of Health (NIH) has an
initiative that involves 13 of its 27 Institutes and the Office
of Behavioral and Social Sciences Research in funding
D. C. Meyers (✉) · A. Wandersman
University of South Carolina, Columbia, SC, USA

J. A. Durlak
Loyola University Chicago, Chicago, IL, USA
Am J Community Psychol
DOI 10.1007/s10464-012-9522-x
research to identify, develop, and refine effective methods
for disseminating and implementing effective treatments
(NIH 2011). The Centers for Disease Control and Pre-
vention (CDC) is currently playing a key role in improving
the quality and efficiency of a global public health initia-
tive through addressing operational questions related to
program implementation within existing and developing
health systems infrastructures (CDC 2010). In the United
Kingdom, the National Health Service has established the
National Institute for Health Research (NIHR), which aims
to use research to improve national health outcomes. The
NIHR has built infrastructure through the creation of
Collaborations for Leadership in Applied Health Research
and Care (CLAHRC) which investigate methods of trans-
lating implementation research evidence to practice (Baker
et al. 2009).
These recent developments have been described as
“stepping stones” that reflect the beginnings of an organized and resourced approach to bridging research and
practice (Proctor et al. 2009). New developments bring
new ideas, and these ideas have found their way into recent
dissemination- and implementation-related frameworks.
For example, the Interactive Systems Framework for Dis-
semination and Implementation (ISF) recognized that
quality implementation is a critical aspect of widespread
successful innovation (Wandersman et al. 2008). While the
original special issue on the ISF (American Journal of
Community Psychology 2008) recognized the importance
of implementation, it provided relatively little detail on
implementation frameworks per se (with the notable
exception of the review on implementation performed by
Durlak and DuPre 2008). In this article, we were motivated
to incorporate implementation research and related con-
cepts into the ISF to a greater degree, which, in turn, can
contribute to the field of implementation science. Given the
growing recognition of the importance of implementation,
its quickly expanding evidence base, and the numerous
implementation frameworks that are emerging, we sought
to increase understanding of the critical steps of the
implementation process by undertaking a conceptual syn-
thesis of relevant literature.
Implementation and the Interactive Systems Framework
The ISF (Wandersman et al. 2008) is a framework that
describes the systems and processes involved in moving
from research development and testing of innovations to
their widespread use. It has a practical focus on infra-
structure, innovation capacities, and three systems needed
to carry out the functions necessary for dissemination and
implementation (Synthesis and Translation System,
Support System, Delivery System). The role of the Syn-
thesis and Translation System is to distill theory and evi-
dence and translate this knowledge into user-friendly
innovations (an idea, practice, or object that is perceived as
new by an individual or an organization/community
(Rogers 2003)). To increase the user-friendliness of these
innovations, this system may create manuals, guides,
worksheets, or other tools to aid in the dissemination of the
innovation. This system may strive to develop evidence-
based strategies for implementing a given innovation in
diverse contexts (e.g., Mazzucchelli and Sanders 2010;
Schoenwald 2008). Worthwhile innovations developed by
the Synthesis and Translation System need to be put into
practice, and actual use of these innovations is accom-
plished primarily by the Delivery System.
The Delivery System comprises the individuals,
organizations, and communities that carry out activities
that use the innovations that the Synthesis and Translation
System develops. Implementation in the Delivery System is supported by the Support System. To increase the likelihood that
innovation use will lead to desired outcomes, the Support
System works directly with the members of the Delivery
System to help them implement with quality. The Support
System does this by building two types of capacities through
training, technical assistance, and/or monitoring progress:
(1) innovation-specific capacity—the necessary knowledge,
skills, and motivation that are required for effective use of
the innovation; and (2) general capacity—effective struc-
tural and functional factors (e.g., infrastructure, aspects of
overall organizational functioning such as effective com-
munication and establishing relationships with key com-
munity partners) (Flaspohler et al. 2008b).
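To make these relationships concrete, the three ISF systems and the two capacity types built by the Support System can be sketched as a small data model. This is an illustrative representation of ours, not part of the ISF itself; the class and field names are our own.

```python
# Illustrative sketch only: a tiny data model of the ISF's three systems and
# the two capacity types built by the Support System. Class and field names
# are ours, not terminology defined by Wandersman et al. (2008).
from dataclasses import dataclass, field
from typing import List

@dataclass
class Capacity:
    kind: str            # "innovation-specific" or "general"
    examples: List[str]

@dataclass
class System:
    name: str
    role: str
    builds: List[Capacity] = field(default_factory=list)

synthesis = System(
    name="Synthesis and Translation System",
    role="distill theory and evidence into user-friendly innovations",
)
delivery = System(
    name="Delivery System",
    role="put innovations into practice",
)
support = System(
    name="Support System",
    role="help the Delivery System implement with quality",
    builds=[
        Capacity("innovation-specific", ["knowledge", "skills", "motivation"]),
        Capacity("general", ["infrastructure", "communication",
                             "community partnerships"]),
    ],
)

isf_systems = [synthesis, support, delivery]
```

A model like this makes the bi-directional links discussed next easy to state: each system in `isf_systems` communicates with the other two.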
The three systems in the ISF are linked with
bi-directional relationships. The stakeholders in each system (e.g., funders, practitioners, trainers, and researchers)
should communicate and collaborate to achieve desired
outcomes. In the original ISF special issue, there was an
emphasis on building capacity for quality implementation
(e.g., Chinman et al. 2008; Fagan et al. 2008). This article
seeks to enhance the ISF’s emphasis on implementation
using a synthesis of implementation frameworks to further
inform the types of structures and functions that are
important for quality implementation per se. More specif-
ically, this collective guidance can be applied to the ISF
systems by creating more explicit links (both within and
between systems) that detail specific actions that can be
used to collaboratively foster high quality implementation.
Overview of the Article
This article has conceptual, empirical research, and prac-
tical goals. Our first goal was to provide a conceptual
overview of the implementation process through a syn-
thesis of the literature. The literature synthesis was
designed to develop a new implementation meta-frame-
work which we call the Quality Implementation Frame-
work (QIF). The QIF identifies the critical steps in the
implementation process along with specific actions related
to these steps that can be utilized to achieve quality
implementation.
Our research goal was to summarize the research support that exists for the different steps in the newly developed QIF and to offer some suggestions for future research
efforts. Our practical goal was to outline the practical
implications of our findings in terms of improving future
implementation efforts in the world of practice.
Progress toward these goals will enhance theory related
to implementation research and practice. Theoretical con-
tributions will also be applied to the ISF, since the
framework synthesis will identify actions and strategies
that the three “mutually accountable” ISF systems can
employ to collaboratively foster quality implementation.
Wandersman and Florin (2003) discussed the importance
of interactive accountability in which funders, researchers/
evaluators, and practitioners are mutually accountable and
work together to help each other achieve results. The ISF
helps operationalize how these stakeholders can work
together. When collaborating for quality implementation,
these systems should strive to increase the likelihood that
the necessary standards of the innovation (e.g., active
ingredients, core components, critical features, essential
elements) are met and that the innovation’s desired out-
comes are achieved.
We hypothesized that our literature synthesis would
yield convergent evidence regarding many of the important
steps associated with quality implementation. Our frame-
work review differs from other recent framework reviews,
since we focus on literature relating specifically to the
“how-to” of implementation (i.e., specific procedures and
strategies). Systematically identifying these action-oriented
steps can serve as practical guidance related to specific
tasks to include in the planning and/or execution of
implementation efforts. Another difference is that we
sought to develop a framework that spans multiple research
and practice areas as opposed to focusing on a specific field
such as healthcare (e.g., Damschroder et al. 2009; Green-
halgh et al. 2004). We believed our explicit focus on spe-
cific steps and strategies that can be used to operationalize
“how to” implement would make a useful contribution to
the literature.
In the following section, we provide a brief overview of
prior implementation research that places implementation
in context, discuss issues related to terminology, and
describe prior work depicting the implementation process.
We then describe our literature synthesis and apply its
results to the advancement of the ISF and implementation
theory and practice.
Brief Overview of Implementation Research
In many fields, such as education, health care, mental
health treatment, and prevention and promotion, program
evaluations did not historically include any mention or
systematic study of implementation (Durlak and DuPre
2008). However, beginning in the 1980s, many empirical
studies began appearing that indicated how important
quality implementation was to intended outcomes (e.g.,
Abbott et al. 1998; Basch et al. 1985; Gottfredson et al.
1993; Grimshaw and Russell 1993; Tobler 1986).
As research on implementation evolved, so did our
understanding of its complexity. For example, authors have
identified eight different aspects to implementation such as
fidelity, dosage, and program differentiation, and at least 23
personal, organizational, or community factors that affect
one or more aspects of implementation (Dane and
Schneider 1998; Durlak and DuPre 2008). Because
implementation often involves studying innovations in real
world contexts, rigorous experimental designs encom-
passing all of the possible influential variables are impos-
sible to execute. Individual or multiple case studies have
been the primary vehicle for learning about factors that
affect the implementation process, yet the methodological
rigor and generalizability of these reports vary. Nevertheless, there has been a steady improvement in the number
and quality of studies investigating implementation, and
there are now more carefully done quantitative and quali-
tative reports that shed light on the implementation process
(e.g., Domitrovich et al. 2010; Fagan et al. 2008; Saunders
et al. 2006; Walker and Koroloff 2007).
Although there is extensive empirical evidence on the
importance of implementation and a growing literature on
the multiple contextual factors that can influence imple-
mentation (e.g., Aarons et al. 2011; Domitrovich et al.
2008), there is still a need to know how to increase the
likelihood of quality implementation. Can a systematic,
comprehensive overview of implementation be developed?
If so, what would be its major elements? Could specific
steps be identified to aid future research and practice on
implementation? Our review helps to address these ques-
tions and focuses on issues related to high quality implementation.
Using Rogers’ (2003) classic model, implementation is one
of five crucial stages in the wide-scale diffusion of inno-
vations: (1) dissemination (conveying information about
the existence of an innovation to potentially interested
parties), (2) adoption (an explicit decision by a local unit or
organization to try the innovation), (3) implementation
(executing the innovation effectively when it is put in
place), (4) evaluation (assessing how well the innovation
achieved its intended goals), and (5) institutionalization
(the unit incorporates the innovation into its continuing
practices). While there can be overlap among Rogers’
stages, our discussion of implementation assumes that the
first two stages (dissemination of information and explicit
adoption) have already occurred.
There has yet to be a standardized language for describing
and assessing implementation. For example, the extent to
which an innovation that is put into practice corresponds to
the originally intended innovation has been called fidelity,
compliance, integrity, or faithful replication. Our focus is
on quality implementation—which we define as putting an
innovation into practice in such a way that it meets the
necessary standards to achieve the innovation’s desired
outcomes (Meyers et al. 2012). This definition is consistent
with how the International Organization for Standardiza-
tion (ISO) views quality as a set of features and charac-
teristics of a product or service that bear on its ability to
satisfy stated or implied needs (ISO/IEC 1998). Imple-
mentation is not an all-or-none construct, but exists in
degrees. For example, one may eventually judge that the
execution of some innovations was of low quality, medium
quality, or high quality (e.g., Saunders et al. 2006). This
article focuses on issues related to high quality implementation.
Implementation Frameworks
Implementation scholars have made gains in describing the
process of implementation. These efforts have taken dif-
ferent forms. Sometimes, they are descriptions of the major
steps involved in implementation and at other times they
are more refined conceptual frameworks based on research
literature and practical experiences (e.g., theoretical
frameworks, conceptual models). Miles and Huberman
(1994) define a conceptual framework as a representation
of a given phenomenon that “explains, either graphically or
in narrative form, the main things to be studied—the key
factors, concepts, or variables” (p. 18) that comprise the
phenomenon. Conceptual frameworks organize a set of
coherent ideas or concepts in a manner that makes them
easy to communicate to others. Often, the structure and
overall coherence of frameworks are “built” and borrow
elements from elsewhere (Maxwell 2005).
Implementation frameworks have been described as
windows into the key attributes, facilitators, and challenges
related to promoting implementation (Flaspohler et al.
2008a). They provide an overview of ideas and practices
that shape the complex implementation process and can
help researchers and practitioners use the ideas of others
who have implemented similar projects. Some frameworks
are able to provide practical guidance by describing spe-
cific steps to include in the planning and/or execution of
implementation efforts, as well as mistakes that should be avoided.
Toward a Synthesis of Implementation Frameworks:
A Review of Implementation Frameworks
In this section, we describe our work on our conceptual
goal. We use the term implementation framework to
describe reports that focus on the “how-to” of implementation; that is, sources that offer details on the specific
procedures and strategies that various authors believe are
important for quality implementation. By synthesizing
these frameworks, we are able to cross-walk the critical
themes from the available literature to suggest actions that
practitioners and those who work with them can employ to
ensure quality implementation.
Inclusion Criteria and Literature Search Procedures
To be included in our review of implementation frame-
works, a document about implementation had to meet two
main criteria: (1) contain a framework that describes the
main actions and strategies believed to constitute an
effective implementation process related to using innova-
tions in new settings, and (2) be a published or unpublished
report that appeared in English by the end of June 2011.
The framework could be based on empirical research or be
a theoretical or conceptual analysis of what is important in
implementation based on experience or a literature review.
We placed no restrictions on the content area, population of
interest, or type of innovation being considered; however,
to be retained, the framework needed to focus on specific
details of the implementation process.
Three strategies were used to locate relevant reports: (1)
computer searches of six databases (Business Source Premier, Dissertation Abstracts, Google Scholar, MEDLINE, PsycINFO, and Web of Science) using variants of multiple search terms in various configurations (e.g., “implementation,” “framework,” “model,” “approach,” and “strategy”), (2) hand searches over the last 5 years of four journals that we judged were likely to contain relevant publications (American Journal of Community Psychology, American Journal of Evaluation, Implementation Science,
Prevention Science), and (3) inspection of the reference
lists of each relevant report and review of implementation
research (e.g., Durlak and DuPre 2008; Fixsen et al. 2005;
Greenhalgh et al. 2004).
We did not include reports about implementation based
on a single implementation trial (e.g., Chakravorty 2009),
articles with implementation frameworks that have not
been cited more than once in the literature (e.g., Chinowsky 2008; Spence and Henderson-Smart 2011), articles that
focus on contextual factors that can influence implemen-
tation (e.g., Aarons et al. 2011; Domitrovich et al. 2008),
articles that focus more on fidelity (i.e., adherence, integ-
rity) and less on the implementation process as a whole
(e.g., Bellg et al. 2004), articles that do not contain an
implementation framework (e.g., Chorpita et al. 2002),
articles that focus on a framework that is redundant with
another source, or articles that do not put enough focus on
the process of implementation and instead focus on a more
expansive process (e.g., Simpson 2002). Instead, we only
included reports in which authors attempted to offer a
framework for implementation that was intended to be
applied generally across one or more areas of research or
practice, has been utilized over extended periods of time,
and has been cited more than once in the literature (e.g.,
Kilbourne et al. 2007; Klein and Sorra 1996). Figure 1 is a
flow diagram depicting our study selection for the imple-
mentation framework synthesis. The diagram was created
in light of reporting guidance from the Preferred Reporting
Items for Systematic Reviews and Meta-Analyses (PRISMA;
Liberati et al. 2009).
Once the sample of frameworks was established, we
examined each one and distilled what appeared to be dis-
tinct critical steps for quality implementation, and we
identified specific actions and strategies associated with
each step. We then created broad categories to group
similar steps and actions from the different frameworks to
depict what appears to constitute quality implementation
from beginning to end. Although authors used different
terminology in many cases, the activities they described
greatly assisted the categorization process. Few issues
arose in placing elements in categories, and these were
resolved through discussion among the authors.
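As a purely illustrative aside, the source-screening bookkeeping reported in Fig. 1 can be tallied programmatically. The reason labels below are paraphrased from the figure, and the count for the first reason is inferred so that the reasons sum to the reported total of 125 exclusions.

```python
# Illustrative bookkeeping for the source-screening counts reported in Fig. 1.
# Reason labels are paraphrased; the first count is inferred so that the
# reasons sum to the reported total of 125 exclusions.
exclusion_reasons = {
    "did not focus on the process of implementation": 49,  # inferred: 125 - 76
    "did not contain a framework": 43,
    "focused on contextual factors": 11,
    "based on a single case study": 8,
    "redundant with a framework already sampled": 6,
    "focused on fidelity of implementation": 6,
    "not cited more than once": 2,
}

inspected_in_detail = 152
excluded = sum(exclusion_reasons.values())
retained_sources = inspected_in_detail - excluded

# Two retained sources elaborate frameworks already in the sample
# (Communities That Care and PROSPER), leaving the distinct frameworks.
distinct_frameworks = retained_sources - 2
```

The arithmetic reproduces the reported figures: 27 retained sources describing 25 distinct frameworks.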
A total of 25 frameworks contained in 27 different sources
were retained for the current synthesis. Two sources each
were used for the Communities That Care and the PROS-
PER frameworks, since combining these sources provided
a more elaborate description of the main steps and actions
of each framework. All the sources are listed in Table 1,
which also describes how each framework was based on a
particular literature area, target population, and type of innovation.
[Fig. 1 Flow diagram of selected sources for the implementation framework synthesis. Reports initially screened (n = 1945); excluded as not applicable (n = 1807); inspected in detail for inclusion (n = 152); excluded (n = 125) for the following reasons: source did not focus on the process of implementation (n = 49), source did not contain a framework (n = 43), source focused on contextual factors that impact implementation (n = 11), framework was based on a single case study (n = 8), framework was redundant with one already in our sample (n = 6), source focused on fidelity of implementation (n = 6), source was not cited more than once (n = 2); retained (n = 27 sources). While 27 sources comprised our sample, only 25 frameworks were described in these sources (two additional sources were retained to allow a greater level of detail for the Communities That Care and PROSPER frameworks).]

Most of the 25 frameworks were based on the implementation of evidence-based programs via community-based planning approaches (n = 6) or health care delivery (n = 5), while others related specifically to prevention/promotion (n = 4), evidence-based programs and/or treatments (n = 3), school-based innovations (n = 3), implementing non-specific innovations in organizations (n = 2), or management (n = 2).
Most of the evidence-based programs/treatments targeted
children and adolescents. Many of the health care
innovations were related to integrating different aspects of
evidence-based medicine into routine practice.
The synthesis of the critical steps associated with quality
implementation is summarized in Table 2. Table 3 contains
important questions to answer at each step and the overall
Table 1 Sources for implementation frameworks included in the review
Source | Primary literature areas examined as basis for framework | Target population
CASEL (2011) School-based social and emotional learning Children and adolescents
Chinman et al. (2004)—GTO Community-based substance abuse prevention planning Children and adolescents
Damschroder et al. (2009)—CFIR Evidence-based health care Not specified
Durlak and DuPre (2008) Prevention and health promotion programs Children and adolescents
Feldstein and Glasgow (2008)—PRISM Evidence-based health care Not specified
Fixsen et al. (2005) Implementation of evidence-based practices including
human services (e.g., mental health, social services,
juvenile justice, education, employment services,
substance abuse prevention and treatment),
agriculture, business, engineering, medicine,
manufacturing, and marketing
Not specified
Glisson and Schoenwald (2005)—ARC Evidence-based treatments Children, adolescents, and
their families
Greenberg et al. (2005) School-based preventive and mental health promotion
Children and adolescents
Greenhalgh et al. (2004) Health care Not specified
Guldbrandsson (2008) Health promotion and disease prevention Not specified
Hall and Hord (2006) School-based innovations Children and adolescents
Hawkins et al. (2002)—CTC; Mihalic
et al. (2004)—Blueprints
Evidence-based violence and drug prevention programs Children and adolescents
Kilbourne et al. (2007)—REP Community-based behavioral and treatment
interventions for HIV
Not specified
Klein and Sorra (1996) Management Organizational managers
Okumus (2003) Management Organizational managers
PfS (2003) Community-based prevention planning Children and adolescents
Rogers (2003) Diffusion of innovations in organizations Not specified
Rycroft-Malone (2004)—PARIHS Evidence-based healthcare Not specified
Spoth et al. (2004); Spoth and Greenberg
Population-based youth development and reduction of
youth problem behaviors (e.g., substance use,
violence, and other conduct problems)
Children and adolescents
Sandler et al. (2005) Community-based prevention services Children and adolescents
Stetler et al. (2008)—QUERI Evidence-based health care United States Veterans
Stith et al. (2006) Community-based programs for violence prevention
and substance abuse prevention
Children and adolescents
Van de Ven et al. (1989) Technological innovations Organizational managers
and stakeholders
Walker and Koroloff (2007) Comprehensive, individualized, family-driven mental
health services
Children, adolescents, and
their families
Wandersman et al. (2008)—ISF Injury and violence prevention Children and adolescents
ARC Availability, Responsiveness, Continuity community intervention model, Blueprints for Violence Prevention, CASEL Collaborative for
Academic, Social, and Emotional Learning, CFIR Consolidated Framework for Implementation Research, CTC Communities That Care, GTO
Getting To Outcomes, PfS Partnerships for Success, ISF Interactive Systems Framework, PARIHS Promoting Action on Research Implemen-
tation in Health Services, PRISM Practical, Robust Implementation and Sustainability Model, PROSPER PROmoting School/Community-
University Partnerships to Enhance Resilience, QUERI Quality Enhancement Research Initiative, REP Replicating Effective Programs
frequency with which each step was included in the sampled
frameworks. We call the results of our synthesis the Quality
Implementation Framework (QIF) because it focuses on
important elements (critical steps and actions) believed to
constitute quality implementation. Four important findings
emerged from our synthesis: (1) it was possible to identify 14
distinct steps comprising quality implementation; (2) these
steps could be logically divided into four temporal phases;
(3) there was considerable agreement among the various
sources on many of these steps; and (4) the overall con-
ceptualization of implementation that emerged suggests that
quality implementation is a systematic process that involves
a coordinated series of related elements. These findings offer
a useful blueprint for future research and practice.
For example, the information in Table 3 indicates that
quality implementation can be viewed conceptually as a
systematic, step-by-step, four-phase sequence that contains
over one dozen steps. Most of these steps (10 of the 14)
should be addressed before implementation begins, and
they suggest that quality implementation is best achieved
through a combination of multiple activities that include
assessment, negotiation and collaboration, organized
planning and structuring, and, finally, personal reflection
and critical analysis.
The four-phase conceptualization that appears in Table 3
suggests when and where to focus one’s attention in order
to achieve quality implementation. The first phase, Initial
Considerations Regarding the Host Setting, contains eight
critical steps and focuses on the host setting. Activities in
this phase involve various assessment strategies related to
organizational needs, innovation-organizational fit, and a
capacity or readiness assessment. Each implementation
effort also raises the critical question regarding if and how
the innovation should be adapted to fit the host setting. In
other words, work in the first phase of implementation
focuses primarily on the ecological fit between the inno-
vation and the host setting.
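The step counts reported in the synthesis (14 steps in total, eight in Phase One, and ten of the fourteen occurring before implementation begins) imply a simple phase-level breakdown, sketched here for illustration; the variable names are ours.

```python
# Step-count bookkeeping implied by the synthesis: 14 critical steps across
# four phases, eight of them in Phase One, and ten of the fourteen occurring
# before implementation begins (i.e., in Phases One and Two combined).
total_steps = 14
phase_one_steps = 8            # Initial Considerations Regarding the Host Setting
pre_implementation_steps = 10  # Phases One and Two together

phase_two_steps = pre_implementation_steps - phase_one_steps
remaining_steps = total_steps - pre_implementation_steps  # Phases Three and Four
```

That is, Phase Two contributes two steps, and the four remaining steps fall in the phases that begin once implementation is under way.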
Although it is not noted in Table 3, a clear explanation
and definition of the specified standards for implementation
(e.g., active ingredients, core components, critical features,
or essential elements) should be agreed on by all involved
parties. Therefore, decisions about whether any adaptations
are to be made should occur before explicit buy-in for the
innovation is obtained so all stakeholders understand what
the innovation consists of and what using it entails. If the
core components of the innovation are clearly known,
many of the framework authors emphasized that any
adaptations should preserve these components to maintain
the integrity of the innovation.
An emerging strategy for adaptation calls upon inno-
vation developers and researchers to identify which com-
ponents of innovations can be adapted. Unless practitioners
have a deep understanding of effective implementation and
program theory, they need support and guidance when
adapting innovations to new contexts and populations.
Such support must rely on the local knowledge that these
practitioners have about the setting that hosts the innova-
tion. Multiple frameworks in this review state that inno-
vation developers should provide a foundation for
adaptations by identifying what can be modified (e.g.,
surface structure modifications that are intended to boost
engagement and retention) and what should never be
modified (e.g., an innovation’s core components) as part of
their dissemination strategy. Approaches have been
developed to help resolve the tension between the need for
fidelity and adaptation (e.g., Lee et al. 2008), and such
guidance can foster adherence to an innovation’s protocol
for use while also enhancing its fit and relevance to the
organization/community (Forehand et al. 2010).
In addition, all but two frameworks indicated that steps
should be taken to foster a supportive climate for imple-
mentation and secure buy-in from key leaders and front-
line staff in the organization/community. Some of the
specific strategies suggested in this critical step include: (1)
assuring key opinion leaders and decision-makers are
engaged in the implementation process and perceive that
the innovation is needed and will benefit organizational
Table 2 Summary of the four implementation phases and 14 critical
steps in the Quality Implementation Framework that are associated
with quality implementation
Phase One: Initial considerations regarding the host setting
Assessment strategies
1. Conducting a needs and resources assessment
2. Conducting a fit assessment
3. Conducting a capacity/readiness assessment
Decisions about adaptation
4. Possibility for adaptation
Capacity-building strategies
5. Obtaining explicit buy-in from critical stakeholders and
fostering a supportive community/organizational climate
6. Building general/organizational capacity
7. Staff recruitment/maintenance
8. Effective pre-innovation staff training
Phase Two: Creating a structure for implementation
Structural features for implementation
9. Creating implementation teams
10. Developing an implementation plan
Phase Three: Ongoing structure once implementation begins
Ongoing implementation support strategies
11. Technical assistance/coaching/supervision
12. Process evaluation
13. Supportive feedback mechanism
Phase Four: Improving future applications
14. Learning from experience
Am J Community Psychol
Table 3 Critical steps in implementation, important questions to answer at each step in the Quality Implementation Framework, and the
frequency with which each step was included in the 25 reviewed frameworks
Phases and steps of the quality implementation framework Frequency
Phase one: Initial considerations regarding the host setting
Assessment strategies
1. Conducting a needs and resources assessment:
Why are we doing this?
What problems or conditions will the innovation address (i.e., the need for the innovation)?
What part(s) of the organization and who in the organization will benefit from improvement efforts?
14 (56 %)
2. Conducting a fit assessment:
Does the innovation fit the setting?
How well does the innovation match the:
Identified needs of the organization/community?
Organization’s mission, priorities, values, and strategy for growth?
Cultural preferences of groups/consumers who participate in activities/services provided by the organization/community?
14 (56 %)
3. Conducting a capacity/readiness assessment:
Are we ready for this?
To what degree does the organization/community have the will and the means (i.e., adequate resources, skills and motivation) to
implement the innovation?
Is the organization/community ready for change?
11 (44 %)
Decisions about adaptation
4. Possibility for adaptation
Should the planned innovation be modified in any way to fit the host setting and target group?
What feedback can the host staff offer regarding how the proposed innovation needs to be changed to make it successful in a new
setting and for its intended audience?
How will changes to the innovation be documented and monitored during implementation?
19 (76 %)
Capacity-building strategies (may be optional depending on the results of previous elements)
5. Obtaining explicit buy-in from critical stakeholders and fostering a supportive community/organizational climate:
Do we have genuine and explicit buy-in for this innovation from:
Leadership with decision-making power in the organization/community?
From front-line staff who will deliver the innovation?
The local community (if applicable)?
Have we effectively dealt with important concerns, questions, or resistance to this innovation? What possible barriers to
implementation need to be lessened or removed?
Can we identify and recruit an innovation champion(s)?
Are there one or more individuals who can inspire and lead others to implement the innovation and its associated practices?
How can the organization/community assist the champion in the effort to foster and maintain buy-in for change?
23 (92 %)
Note. Fostering a supportive climate is also important after implementation begins and can be maintained or enhanced through such strategies as
organizational policies favoring the innovation and providing incentives for use and disincentives for non-use of the innovation
6. Building general/organizational capacity:
What infrastructure, skills, and motivation of the organization/community need enhancement in order to ensure the innovation will
be implemented with quality?
Of note is that this type of capacity does not directly assist with the implementation of the innovation, but instead enables the
organization to function better in a number of its activities (e.g., improved communication within the organization and/or with
other agencies; enhanced partnerships and linkages with other agencies and/or community stakeholders).
15 (60 %)
7. Staff recruitment/maintenance:
Who will implement the innovation?
Initially, those recruited do not necessarily need to have knowledge or expertise related to use of the innovation; however, they
will ultimately need to build their capacity to use the innovation through training and on-going support
Who will support the practitioners who implement the innovation?
These individuals need expertise related to (a) the innovation, (b) its use, (c) implementation science, and (d) process evaluation
so they can support the implementation effort effectively
Might roles of some existing staff need realignment to ensure that adequate person-power is put towards implementation?
13 (52 %)
functioning; (2) aligning the innovation with the setting’s
broader mission and values; (3) identifying policies that
create incentives for innovation use, disincentives for non-
use, and/or reduce barriers to innovation use; and (4)
identifying champions for the innovation who will advo-
cate for its use and support others in using it properly.
Advocates for the innovation should be able to answer
the following questions before proceeding further: How
well does the innovation (either as originally intended or in
a modified format) fit this setting? To what extent do
staff understand what the innovation entails? In what ways
will the innovation address important perceived needs of
the organization? Do staff have a realistic view of what
the innovation may accomplish, and are they ready and
able to sponsor, support, and use the innovation?
The second phase of quality implementation, Creating a
Structure for Implementation, suggests that an organized
structure should be developed to oversee the process. At a
minimum, this structure includes having a clear plan for
Table 3 continued
Phases and steps of the quality implementation framework Frequency
8. Effective pre-innovation staff training
Can we provide sufficient training to teach the why, what, when, where, and how regarding the intended innovation?
How can we ensure that the training covers the theory, philosophy, values of the innovation, and the skill-based competencies
needed for practitioners to achieve self-efficacy, proficiency, and correct application of the innovation?
22 (88 %)
Phase two: Creating a structure for implementation
Structural features for implementation
9. Creating implementation teams:
Who will have organizational responsibility for implementation?
Can we develop a support team of qualified staff to work with front-line workers who are delivering the innovation?
Can we specify the roles, processes, and responsibilities of these team members?
17 (68 %)
10. Developing an implementation plan:
Can we create a clear plan that includes specific tasks and timelines to enhance accountability during implementation?
What challenges to effective implementation can we foresee that we can address proactively?
13 (52 %)
Phase three: Ongoing structure once implementation begins
Ongoing implementation support strategies
11. Technical assistance/coaching/supervision:
Can we provide the necessary technical assistance to help the organization/community and practitioners deal with the inevitable
practical problems that will develop once the innovation begins?
These problems might involve a need for further training and practice in administering more challenging parts of the innovation,
resolving administrative or scheduling conflicts that arise, acquiring more support or resources, or making some required
changes in the application of the innovation
20 (80 %)
12. Process evaluation
Do we have a plan to evaluate the relative strengths and limitations in the innovation’s implementation as it unfolds over time?
Data are needed on how well different aspects of the innovation are being conducted as well as the performance of different
individuals implementing the innovation
24 (96 %)
13. Supportive feedback mechanism
Is there an effective process through which key findings from process data related to implementation are communicated, discussed,
and acted upon?
How will process data on implementation be shared with all those involved in the innovation (e.g., stakeholders, administrators,
implementation support staff, and front-line practitioners)?
This feedback should be offered in the spirit of providing opportunities for further personal learning and skill development and
organizational growth that leads to quality improvement in implementation
18 (72 %)
Phase four: Improving future applications
14. Learning from experience
What lessons have been learned about implementing this innovation that we can share with others who have an interest in its use?
Researchers and innovation developers can learn how to improve future implementation efforts if they critically reflect on their
experiences and create genuine collaborative relationships with those in the host setting
Collaborative relationships appreciate the perspectives and insights of those in the host setting and create open avenues for
constructive feedback from practitioners on such potentially important matters as: (a) the use, modification, or application of
the innovation; and (b) factors that may have affected the quality of its implementation
7 (28 %)
implementing the innovation and identifying a team of
qualified individuals who will take responsibility for these
issues. Two important questions to answer before this
phase concludes are: (1) Is there a clear plan for what will
happen, and when it should occur; and (2) who will
accomplish the different tasks related to delivering the
innovation and overseeing its implementation?
The work involved in the first two phases is in prepa-
ration for beginning implementation (i.e., planning imple-
mentation). Implementation actually begins in phase three
of our framework: Ongoing Structure Once Implementa-
tion Begins. There are three important tasks in this phase:
(1) providing needed on-going technical assistance to
front-line providers; (2) monitoring on-going implementa-
tion; and (3) creating feedback mechanisms so involved
parties understand how the implementation process is
progressing. Therefore, the corresponding questions that
require answers involve: (1) Do we have a sound plan in
place to provide needed technical assistance? (2) Will we
be able to assess the strengths and limitations that occur
during implementation? (3) Will the feedback system be
rapid, accurate, and specific enough so that successes in
implementation can be recognized and changes to improve
implementation can be made quickly?
The fourth phase, Improving Future Applications, indi-
cates that retrospective analysis and self-reflection coupled
with feedback from the host setting can identify particular
strengths and weaknesses that occurred during implementation.
The primary question is: "What has this effort
taught us about quality implementation?" This phase only
includes one critical step—learning from experience—
which appears because it was implicit in many of the
frameworks and explicit in a few of them. For example,
many authors implied that they learned about implemen-
tation from practical experience and from the feedback
received from host staff. This is understandable because in
the absence of systematic theory and research on imple-
mentation in many fields of inquiry, learning by doing was
the primary initial vehicle for developing knowledge about
implementation. Several authors revised their frameworks
over time by adding elements or modifying earlier notions
about implementation. While there have been instances of
researchers empirically testing their implementation
framework and modifying it based on data (Klein et al.
2001), modifications were more often shaped by feedback
received from a host setting about ineffective and effective
strategies, by what others were beginning to report in the
literature, and/or by critical self-reflection on one's own
efforts. In sum, over time, based on their own or
others’ experiences, both mistakes and successes in the
field coalesced to shape various conceptualizations of what
quality implementation should look like (e.g., Grol and
Jones 2000; Van de Ven et al. 1989).
Convergent Evidence for Specific Elements
Table 4 indicates how many of the 25 reviewed frameworks
included each of the 14 steps. As we hypothesized,
there was substantial agreement about many of the steps.
We did not expect perfect agreement on each critical step
because the individual frameworks appeared at different
times in the history of implementation research, and the
frameworks came from different content areas (health care,
prevention and promotion, mental health treatment, education,
and industry), served different populations (adults or
children) and had different goals (e.g., promotion, treat-
ment, or increased organizational effectiveness). Never-
theless, there was near universal agreement on the
importance of monitoring implementation (critical step 12;
present in 96 % of the reviewed reports) and strong
agreement on the value of developing buy-in and a sup-
portive organizational climate (critical step 5; 92 %),
training (critical step 8; 88 %), technical assistance (critical
step 11; 80 %), feedback mechanisms (critical step 13;
72 %), the creation of implementation teams (critical step
9; 68 %), and the importance of building organizational
capacity (critical step 6; 60 %). Several other steps were
present in more than half of the frameworks (e.g., critical
steps 1 and 2; assessing the need for the innovation and the
fit of the innovation, respectively).
Research Support for Different Elements
Which elements in our framework have received research
support? It is difficult to make exact comparisons between
our synthesis and the findings from specific research
investigations. Some critical steps represent a combination
of behaviors and actions that may address multiple targets
and constructs and that can be applied somewhat differ-
ently across different contexts. Most research on imple-
mentation has not focused on critical steps for quality
implementation as we define them here, but instead on
specific factors that influence the overall success of
implementation such as challenges inherent in the imple-
mentation process (e.g., Aarons et al. 2011) or contextual
factors that influence quality of implementation (e.g.,
Domitrovich et al. 2008). However, several research stud-
ies have examined issues that relate to one or more activ-
ities within the scope of different critical steps.
Given these considerations, there is some support for each
of the QIF critical steps, with one exception. This support
varies in strength and character depending on the step,
and is discussed in several sources (Durlak and DuPre
2008; Fixsen et al. 2005; Greenhalgh et al. 2004). The
strongest support, in terms of the quantity and quality of
empirical studies, exists for the importance of training and
on-going technical assistance (critical steps 8 and 11,
respectively); the evidence indicates that it is the combination
of training and on-going support that enhances learning
outcomes (Miller et al. 2004; Sholomskas et al. 2005).
Historically, work on implementation focused only on training;
it was only later, as a result of both research findings and
experiences from the field, that the added value of supportive
technical assistance was recognized (e.g., Fixsen et al. 2005;
Joyce and Showers 2002).
Table 4 Steps included in each reviewed framework. The table is a
matrix indicating which of the 14 critical steps appears in each of the
25 reviewed frameworks (e.g., Van de Ven et al. 1989; Hawkins et al.
2002; Mihalic et al. 2004; Glisson and Schoenwald 2005; Hall and
Hord 2006; Walker and Koroloff 2007; Durlak and DuPre 2008;
Stetler et al. 2008).
Using an approach similar to Durlak and DuPre (2008),
we interpreted research support to mean the existence of at
least five reports that generally agree on the importance of
the step. Using this metric indicates that there is research
support for the importance of studying the needs of the host
setting (critical step 1), determining the degree of fit
between the innovation and the setting and target popula-
tion (critical step 2), taking steps to foster a supportive
organizational climate for implementation and having
champions on hand to advocate for the program (critical
step 5), the importance of capacity building (critical step
6), and for monitoring the process of implementation
(critical step 12). There is also both quantitative and
qualitative support for the value of adaptation (critical
step 4).
Support for other elements rests upon conclusions from
the field based mainly on a few individual qualitative case
studies rather than quantitative studies. This refers to the
importance of developing an implementation team and plan
(critical steps 9 and 10), and instituting a feedback system
regarding how well the implementation process is pro-
ceeding (critical step 13). These qualitative investigations
are important because it would be difficult to arrange an
experimental or quasi-experimental study in which these
elements were missing in one program condition but
present in another. Nevertheless, empirical studies have
documented how early monitoring of implementation can
identify those having difficulties, and that subsequent
retraining and assistance can lead to dramatic improve-
ments in implementation (DuFrene et al. 2005; Greenwood
et al. 2003).
Step 7, which involves recruiting staff to deliver the
intervention, does not require research confirmation per se,
but rests on the obvious consideration that someone must
provide the innovation. Most support for the importance of
learning from experience (step 14) is largely implicit and
was inferred from several reports. For example, data from
multi-year interventions indicated how implementation
improves over time (Cook et al. 1999; Elder et al. 1996;
Riley et al. 2001), presumably because authors have seen
the need for and have acted to enhance implementation in
one fashion or another. In other cases, authors recognized
strengths or weaknesses in their implementation efforts—
either in retrospect or as the innovation was being deliv-
ered—that offered important lessons for improving future
trials. There are reports suggesting that subsequent
implementation might be improved through better communication
among stakeholders (Sobo et al. 2008), through changes in
training or technical assistance (Wandersman et al. 2012),
or by modifying the innovation itself to fit the host setting
(Blakely et al. 1987; Kerr et al. 1985; McGraw et al. 1996;
Mihalic et al. 2004).
Temporal Ordering of Elements
Our synthesis suggests there is a temporal order to the
critical steps of quality implementation. Some steps need
attention prior to the beginning of any innovation (namely,
critical steps 1–10), some are ascendant as implementation
unfolds (critical steps 11–13), and the last element offers
opportunities for learning once the first innovation trial is
complete (critical step 14).
The temporal ordering of implementation steps suggests
why some innovations may have failed to achieve their
intended effects because of poor implementation. In some
cases, researchers realized only after the fact that they had
not sufficiently addressed one or more steps in the imple-
mentation process. The need to be proactive about possible
implementation barriers is reported by Mihalic et al. (2004)
in their description of the Blueprints for Violence Pre-
vention initiative. They found that lack of staff buy-in
usually resulted in generalized low morale and eventually
led to staff turnover. Moreover, lack of administrative
support was present in every case of failed implementation.
Proactive monitoring systems can be developed to identify
such challenges as they arise during implementation and
provide feedback to stakeholders so they can take action.
An example of a proactive monitoring system’s benefit is
described in Fagan et al. (2008). The proactive system was
developed to ensure high-fidelity prevention program
implementation in the Community Youth Development
Study. In this study, local input was sought for how to
modify the implementation procedures to increase owner-
ship and buy-in. Together, actively fostering this buy-in
and administrative support, providing training and techni-
cal assistance, and developing a proactive monitoring
system helped support 12 communities in replicating pre-
vention programs with high rates of adherence to the pro-
grams’ core components. Therefore, the sequence offered
in Table 2 may assist other practitioners and researchers in
preventing future problems in implementation, if they
attend to its critical steps.
The temporal order suggested in Table 2 is not invariant
because implementation is a dynamic process. Quality
implementation does not always occur in the exact
sequence of steps illustrated in Table 2. In some cases,
individuals must revisit some of the steps at a later time
(e.g., if necessary, to gather more support and resources, to
re-train some staff, to re-secure genuine buy-in from crit-
ical stakeholders). In other cases, some steps might be
skipped, for example, if evidence exists that the organiza-
tion already has sufficient capacity to conduct the innova-
tion, or if champions are already apparent and have
advocated for the innovation. Furthermore, some steps may
need to be addressed simultaneously because of time,
financial, or administrative pressures. In addition, it may be
more efficient to conduct some steps simultaneously (e.g.,
the self-assessment strategies in Phase 1).
The dynamic nature of the implementation process is
such that some of the phases in Table 2 overlap. For
example, step 5 relates to gaining buy-in and fostering a
climate that is supportive of appropriate use of the inno-
vation. We have included this critical step as part of our
first phase of the QIF, yet our literature review indicated
that this element could also be viewed as part of creating a
supportive structure in the second phase (e.g., enacting
policies that remove barriers to implementation and enable
practitioners to implement an innovation with greater
ease), or in the third phase related to maintaining ongoing
support (e.g., monitoring the enforcement of policies and
evaluating their benefit). We had to make a final decision to
place each step into one of the four phases. In order to
display the dynamic nature of the phases and critical steps
of the QIF, we have provided a figure that suggests the
dynamic interplay (see Fig. 2).
Modifications in implementation might be necessary
because of the complexities of the host setting. Context is
always important. Innovations are introduced into settings
for many reasons and via different routes. Organizations/
communities might become involved because of true per-
ceived needs, because of administrative fiat, or as a result
of political or financial pressures. Such entities also have
varied histories in terms of their ability to promote change
and work effectively together. If the above circumstances
are not clarified, it is likely that their importance will not
emerge until after contact with the host organization or
community has been established. As a result, some critical
steps in implementation might have to be prioritized and
periodically revisited to confirm the process is on a suc-
cessful track. Nevertheless, the QIF can serve as a cross-
walk that can offer guidance in the form of an ordered
sequence of activities that should be considered and
accomplished to increase the odds of successful implementation.
Discussion
Our findings reflected success in achieving our main con-
ceptual, research, and practical goals. Based on our liter-
ature synthesis, we developed the QIF, which provides a
conceptual overview of the critical steps that comprise the
process of quality implementation. The QIF contains four
temporal phases and 14 distinct steps and offers a useful
blueprint for future research and practice. For example, the
QIF indicates that quality implementation is best achieved
by thinking about the implementation process systemati-
cally as a series of coordinated steps and that multiple
activities that include assessment, collaboration and nego-
tiation, monitoring, and self-reflection are required to
enhance the likelihood that the desired goals of the inno-
vation will be achieved.
Fig. 2 Dynamic interplay among the critical steps of the QIF. The
arrows from one phase to the next are intended to suggest that the
steps in each of the phases should continue to be addressed throughout
the implementation process. Steps in each of the phases may need to
be strengthened, revisited, or adapted throughout the use of an
innovation in an organization/community. While a logical order in
which the critical steps unfold was needed to develop a coherent
framework, we believe the manner in which they are implemented in
practice will depend on many factors (e.g., context, resources,
logistical constraints).
Our review of existing frameworks, on which the QIF is
based, differs from previous reviews because its sample of
frameworks (1) came from multiple domains (e.g.,
school-based prevention programs, health care
innovations, management) and (2) focused on the "how to"
of implementation (i.e., details on the specific actions and
strategies that authors believe are important). There was
considerable convergence on many elements in the QIF,
which is an important finding. Science frequently advances
through the identification of principles with broad applicability.
Our findings suggest that similar steps characterize the
implementation process regardless of the type of innovation,
target population, and desired outcomes, and thus offer
guidance to others working in many different fields. The
QIF can assist those interested in incorporating
fields. The QIF can assist those interested in incorporating
more evidence-based innovations into everyday practice by
offering assistance on how to approach implementation in a
systematic fashion.
Our second goal was to summarize the research support
that exists for the QIF’s critical steps for quality imple-
mentation. While support exists in varying degrees for each
of the synthesized elements of implementation presented
here, there are still many unknowns. The strongest empir-
ical support is for the critical steps related to training and
on-going technical assistance (Wandersman et al. 2012).
These support strategies are often essential to quality
implementation and using both is recommended. Other
steps which have empirical support include assessing the
needs and resources of the host setting when planning for
implementation, assessing how the innovation aligns and
fits with this setting, fostering and maintaining buy-in, and
building organizational capacity. Also, it is apparent that
implementation should always be monitored.
Our findings also suggest implementation-related
research questions that require careful study. Research
questions about the host setting where implementation will
take place (Phase One of the QIF) include: How compre-
hensively should we conduct assessments of organizational
needs and the degree of fit between the innovation and each
setting? Who should provide this information and how can it
be obtained most reliably, validly, and efficiently? Which
dimensions of "innovation fit" (e.g., cultural preferences,
organizational mission and values) are most important?
How do we know whether an innovation fits sufficiently with
the host setting? Questions related to capacity are also rel-
evant, including: How can we best capture the current and
future capacity of host organizations? What criteria should
be used to assess when this capacity is sufficient to mount an
innovation? How can we assess the relative effectiveness of
different training strategies, and how do we measure staff
mastery of required skills before we launch the innovation?
In the first phase of the QIF, we need to better under-
stand the conditions when adaptations are necessary and
which criteria should be used to make this determination. If
adaptations are planned, they need to be operationalized
and carefully assessed during implementation, or else the
nature of the new innovation is unclear. What are the most
effective methods to ensure we have clear data on adap-
tation and its effects? How do we judge if the adaptation
improved the innovation or lessened its impact? Is it pos-
sible to conduct an experiment in which the relative
influence of the originally intended and adapted forms of
an innovation can be compared?
In Phase Two, we need more information on what forms
of on-going technical assistance are most successful for
different purposes and how we can accurately measure the
impact of this support. In past research, it seems many
authors have assumed that training or on-going technical
assistance leads to uniform mastery among front-line staff;
yet the empirical literature is now clear that substantial
variability in implementation usually occurs among pro-
gram providers (Durlak and Dupre 2008). There is a need
to develop the evidence base for effective training and
technical assistance (Wandersman et al. 2012).
Additional questions about the QIF include: How can it
be applied to learn more about the degree to which its use
improves implementation, the value and specifics of each
critical step, and the connections and interactions among
these steps? Are there important steps missing from the current framework? Should some steps in the framework be revised?
Our third goal was to discuss the practical implications
of our findings. We will discuss these implications by
applying the elements of quality implementation from the
QIF to the three ISF systems. First we will specify the roles
that the systems of the ISF have in ensuring quality
implementation. Second, we will apply the collective
guidance synthesized via the QIF by making explicit links
between and within these systems, and detail specific
actions that can be used to collaboratively foster high
quality implementation.
In the ISF, innovations are processed by the Synthesis
and Translation System. This system promotes innovations
that can achieve their intended outcomes. The Delivery
System is comprised of the end-implementers (practitio-
ners) of innovations; therefore, quality implementation by
the Delivery System is crucial since this is where innova-
tions are used in real-world settings. In order to ensure
quality implementation by the Delivery System, the Sup-
port System provides ongoing assistance to build and
strengthen the necessary capacities for effective innovation
use. In other words, the Support System aims to build and
help maintain an adequate level of capacity in the Delivery
System, and the Delivery System utilizes its capacities to
put the innovation into practice so that outcomes are likely
to be achieved. In this way, the three systems in the ISF are
mutually accountable for quality implementation and need
to work together to make sure it happens.
The QIF can facilitate how these systems work together,
and the Support System can use this framework to help
plan for how it will provide support to the Delivery System
during implementation. For example, in Phase One, the
Support System can facilitate the assessment of key aspects
of the Delivery System’s environment (e.g., needs and
resources, how the innovation "fits" with the setting, and
whether the organization/community is ready to imple-
ment), help identify appropriate adaptations to the inno-
vation (e.g., cultural or other modifications required by
local circumstances, changes in the manner or intensity of
delivery of program components), ensure adequate buy-in
from key leaders and staff members, and provide necessary
training so the innovation is used properly. Given the interactive nature of this process, there is a need to foster and maintain positive relationships among these systems, and the QIF can help identify key issues that require attention.
In regard to adaptation, our review indicated that the
Synthesis and Translation System plays a critical role in
deciding whether and how to modify an innovation. Given
that this system is charged with developing user-friendly
evidence-based innovations, several frameworks in our
review indicated that this system is accountable for pro-
viding information relevant to adaptation as a critical
aspect of their dissemination strategy. Such information
guides practitioners in the process of adapting programs to
new contexts: this may include consulting at the initial
stages where planning for implementation is taking place.
Such consultation could be considered part of the innova-
tion itself—an innovation that can be tailored to better fit
within the host setting. This is a much more involved
process than disseminating packaged program materials
(e.g., manuals and other tools) that lack guidance on what
can be adapted and what should never be adapted.
In Phase Two, the QIF indicates that the Delivery and
Support systems should work together to develop a struc-
ture that can support implementation. A key component of
this structure is a team that is accountable for implemen-
tation. An implementation plan needs to be created that
serves to guide implementation and anticipate challenges
that may be encountered. This plan can be strengthened by
incorporating the Delivery System’s local knowledge of
the host setting with the Support System’s knowledge of
effective support strategies (e.g., effective methods for
technical assistance) and of the innovation.
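To make this structure concrete, such an implementation plan could be sketched as a simple record. The Python sketch below is only an illustration; the field names, the example entries, and the `unaddressed_challenges` helper are our assumptions, not QIF terminology:

```python
from dataclasses import dataclass, field

@dataclass
class ImplementationPlan:
    """Illustrative Phase Two plan: a team accountable for implementation,
    the tasks that guide it, and the challenges it anticipates."""
    team: list[str]       # members accountable for implementation
    tasks: list[str]      # ordered tasks that guide implementation
    # maps each anticipated challenge to a planned response ("" = none yet)
    anticipated_challenges: dict[str, str] = field(default_factory=dict)

    def unaddressed_challenges(self) -> list[str]:
        """Anticipated challenges that still lack a planned response."""
        return [c for c, plan in self.anticipated_challenges.items() if not plan]

plan = ImplementationPlan(
    team=["site coordinator", "support-system consultant"],
    tasks=["confirm buy-in", "schedule training", "begin delivery"],
    anticipated_challenges={"staff turnover": "cross-train two providers",
                            "funding gap": ""},
)
print(plan.unaddressed_challenges())  # → ['funding gap']
```

A record like this gives the Delivery and Support systems a shared artifact to review together, which is the collaborative point the QIF makes here.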
During Phase Three (when actual implementation transpires), the Support System works to ensure that the Delivery System's implementation is adequately supported. It is fundamental
that sufficient funding be in place during this phase to
ensure that adequate resources are available for innovation
use and support; this has important policy implications for implementation support. A major
mechanism for support is technical assistance which is
intended to maintain the self-efficacy and skill proficiency
that were developed through training (Durlak and DuPre
2008). The key notion here is that support is on-going,
including monitoring and evaluating the implementation process; Durlak and DuPre (2008) argue that this is necessary for implementing innovations. If appropriate adaptations were identified during Phase One, the Support System can ensure that monitoring and evaluation activities are tailored to these adaptations, and can then assess the extent to which the adaptations affect the implementation process and resulting outcomes.
Other aspects of the process that should be monitored
include the extent to which tasks in the implementation
plan are accomplished in a timely manner, whether prac-
titioners are actually using the innovation (adherence), as
well as performance data related to the quality of innova-
tion delivery. This information can be used by the Support
System to enhance quality assurance and should be fed
back to the Delivery System.
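As one hypothetical way to organize the monitoring data just described, the Python sketch below aggregates task timeliness, adherence, and rated delivery quality into a summary that could be fed back to the Delivery System. All field names and the quality threshold are our assumptions, not prescriptions from the QIF:

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class DeliveryObservation:
    """One monitoring record for a single delivery session (hypothetical fields)."""
    task_on_time: bool   # was the corresponding implementation-plan task on schedule?
    adhered: bool        # did the practitioner actually use the innovation?
    quality: float       # rated quality of delivery, 0.0-1.0

def feedback_summary(obs: list[DeliveryObservation],
                     quality_threshold: float = 0.7) -> dict:
    """Aggregate monitoring records into feedback for the Delivery System."""
    return {
        "on_time_rate": mean(o.task_on_time for o in obs),
        "adherence_rate": mean(o.adhered for o in obs),
        "mean_quality": mean(o.quality for o in obs),
        # flag for the Support System when delivery quality drops below threshold
        "needs_support": mean(o.quality for o in obs) < quality_threshold,
    }

obs = [DeliveryObservation(True, True, 0.9),
       DeliveryObservation(False, True, 0.6),
       DeliveryObservation(True, False, 0.5)]
summary = feedback_summary(obs)
print(summary["needs_support"])  # → True (mean quality ≈ 0.67 < 0.7)
```

The point of the sketch is the feedback loop, not the particular metrics: whatever indicators are chosen, they should be summarized and returned to the Delivery System.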
Some researchers are beginning to develop more spe-
cific guidelines on how to monitor the implementation
process. The Collaborative for Academic, Social, and
Emotional Learning (CASEL 2011) has categorized each
of the elements in their implementation framework into one
of five ascending levels. For example, with respect to the availability of human resources, the CASEL guidelines ask change agents to consider whether there is no staff for the program (level one), whether some staff are present (level two), and so on up through level five (formal organizational structures in place that institutionalize adequate human resources, including leadership positions). Such delineations can help determine where more work is needed for quality implementation to occur.
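Leveled guidelines of this kind can be pictured as a simple rubric. In the Python sketch below, the level descriptions paraphrase the human-resources example above, while the element names, the intermediate levels, and the target level are purely illustrative assumptions (CASEL's actual guidelines define their own elements and levels):

```python
# Ascending implementation levels, paraphrasing the human-resources example:
# 1 = element absent ... 5 = formal structures institutionalize it.
LEVEL_DESCRIPTIONS = {
    1: "no staff for the program",
    2: "some staff are present",
    3: "staffing is adequate but informal",           # assumed intermediate level
    4: "staffing is adequate and leadership-backed",  # assumed intermediate level
    5: "formal structures institutionalize adequate human resources",
}

def needs_more_work(ratings: dict[str, int], target_level: int = 4) -> list[str]:
    """Return elements rated below the target level -- i.e., where more
    work is needed for quality implementation to occur."""
    return sorted(element for element, level in ratings.items()
                  if level < target_level)

ratings = {"human resources": 2, "leadership buy-in": 5, "monitoring": 3}
print(needs_more_work(ratings))  # → ['human resources', 'monitoring']
```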
During Phase Four, the Support System engages with
the Delivery System to reflect on the implementation pro-
cess. Reflection can illuminate what lessons have been
learned about implementing this innovation that can be
used to improve future applications and can be shared with
others who have similar interests. Researchers and program
developers are encouraged to form genuine collaborative
relationships that appreciate the perspectives and insights
of those in the Delivery System. Constructive feedback from practitioners in the Delivery System can be important to the use, modification, or application of the innovation, and can highlight factors that may have affected the quality of implementation.
A practical application of our findings was the synthesis
and translation of QIF concepts into a tool that can be used
to guide the implementation process. The tool, called the
Quality Implementation Tool, is described in Meyers et al.
(2012); the article also discusses how this instrument was
applied to foster implementation in two different projects.
Although we searched carefully for relevant articles, it is
likely that some reports were overlooked. The different
terminology used among reviewed authors led us to focus
more on the activities they were describing rather than
what the activities were called. For example, some authors' notions of obtaining critical support paralleled others' discussions of the importance of having local champions, and terminology related to capacity and capacity-building has yet to achieve universal acceptance. As a result, we had to make judgments about
how best to categorize the features of different frameworks.
Although our synthesis identified 14 steps related to quality
implementation, it is possible that others might construe
the literature differently and derive fewer or more steps. As
already noted, some steps consist of multiple actions that
might be broken down further into separate, related steps.
The frameworks we reviewed were based on innova-
tions for adults or children—with or without adjustment or
medical problems—in diverse fields such as health care,
mental health, industry, and primary education. Although
there was convergent evidence for many QIF critical steps,
whether our findings can be generalized to diverse fields of
study needs to be explicitly tested. Whether the QIF can be
used effectively in all these settings to achieve diverse
goals needs empirical support. Such investigation can
identify which conditions might affect its application and
whether its critical steps require modifications to suit par-
ticular circumstances.
Another issue is that we included both peer-reviewed
and non-peer reviewed sources. It could be argued that
peer-reviewed sources have a higher level of rigor when
compared to those which have not been subject to such a
process. In addition, one of the ways that we limited our sample was to exclude sources that had not been cited more than once. This opens up the possibility of a time effect, since more recently published sources have had less time to be cited.
Our findings suggest that the implementation process can be
viewed systematically in terms of a temporal series of linked
steps that should be effectively addressed to enhance the
likelihood of quality implementation. Past research indi-
cated that quality implementation is an important element of
any effective innovation, and that many factors may affect
the ultimate level of implementation attained. The current
synthesis and resulting QIF suggest a conceptual overview
of the critical steps of quality implementation that can be
used as a guide for future research and practice.
References

Aarons, G. A., Hurlburt, M., & Horwitz, S. M. (2011). Advancing a
conceptual model of evidence-based practice implementation in
public service sectors. Administration and Policy in Mental
Health and Mental Health Services Research, 38, 4–23.
Aarons, G. A., Sommerfeld, D., Hecht, D. B., Silovsky, J. F., &
Chaffin, M. J. (2009). The impact of evidence-based practice
implementation and fidelity monitoring on staff turnover:
Evidence for a protective effect. Journal of Consulting and
Clinical Psychology, 77, 270–280.
Abbott, R. D., O’Donnell, J., Hawkins, J. D., Hill, K. G., Kosterman,
R., & Catalano, R. F. (1998). Changing teaching practices to
promote achievement and bonding to school. American Journal
of Orthopsychiatry, 68, 542–552.
Baker, R., Robertson, N., Rogers, S., Davies, M., Brunskill, N., &
Sinfield, P. (2009). The National Institute of Health Research
(NIHR) Collaboration for Leadership in Applied Health
Research and Care (CLAHRC) for Leicestershire, Northampt-
onshire and Rutland (LNR): A programme protocol. Implemen-
tation Science, 4, 72.
Basch, C. E., Sliepcevich, E. M., Gold, R. S., Duncan, D. F., & Kolbe,
L. J. (1985). Avoiding type III errors in health education
program evaluations: A case study. Health Education Quarterly,
12, 315–331.
Bellg, A. J., Borrelli, B., Resnick, B., Hecht, J., Minicucci, D. S., Ory,
M., et al. (2004). Enhancing treatment fidelity in health behavior
change studies: best practices and recommendations from the
NIH Behavior Change Consortium. Health Psychology, 23,
Blakely, C. H., Mayer, J. P., Gottschalk, R. G., Schmitt, N., Davidson,
W. S., Roitman, D. B., et al. (1987). The fidelity-adaptation
debate: Implications for the implementation of public sector
social programs. American Journal of Community Psychology,
15, 253–268.
Centers for Disease Control and Prevention Global AIDS Program.
(2010, August 9). CDC’s role in PEPFAR and the U.S. Global
Health Initiative. Retrieved from
Chakravorty, S. S. (2009). Six sigma programs: An implementation
model. International Journal of Production Economics, 119,
Chinman, M., Hunter, S. B., Ebener, P., Paddock, S. M., Stillman, L.,
Imm, P., et al. (2008). The Getting To Outcomes demonstration
and evaluation: An illustration of the prevention support system.
American Journal of Community Psychology, 41, 206–224.
Chinman, M., Imm, P., & Wandersman, A. (2004). Getting to
Outcomes 2004: Promoting accountability through methods and
tools for planning, implementation, and evaluation. (No. TR-
TR101). Santa Monica, CA: RAND.
Chinowsky, P. S. (2008). Staircase model for new practice imple-
mentation. Journal of Management in Engineering, 24, 187–195.
Chorpita, B. F., Yim, L. M., Donkervoet, J. C., Arensdorf, A.,
Amundsen, M. J., McGee, C., et al. (2002). Toward large-scale
implementation of empirically supported treatments for children: A
review and observations by the Hawaii empirical basis to services
task force. Clinical Psychology: Science and Practice, 9, 165–190.
Collaborative for Academic, Social, and Emotional Learning,
National Center for Mental Health Promotion and Youth
Violence Prevention. (2011). Leading an SEL school: Steps to
implement social and emotional learning for all students. Retrieved 5/20/11, from Education Development Center.
Cook, T. D., Habib, F. N., Phillips, M., Settersten, R. A., Shagle, S.
C., & Degirmencioglu, S. M. (1999). Comer’s school
development program in Prince George’s county, Maryland: A
theory-based evaluation. American Educational Research Jour-
nal, 36, 543–597.
Damschroder, L. J., Aron, D. C., Keith, R. E., Kirsh, S. R., Alexander,
J. A., & Lowery, J. C. (2009). Fostering implementation of
health services research findings into practice: A consolidated
framework for advancing implementation science. Implementa-
tion Science, 4, 50.
Dane, A. V., & Schneider, B. H. (1998). Program integrity in primary
and early secondary prevention: Are implementation effects out of control? Clinical Psychology Review, 18, 23–45.
Domitrovich, C. E., Bradshaw, C. P., Poduska, J., Hoagwood, K.,
Buckley, J., Olin, S., et al. (2008). Maximizing the implemen-
tation quality of evidence-based preventive interventions in
schools: A conceptual framework. Advances in School Mental
Health Promotion, 1, 6–28.
Domitrovich, C. E., Gest, S. D., Jones, D., Gill, S., & DeRousie, R.
M. S. (2010). Implementation quality: Lessons learned in the
context of the Head Start REDI trial. Early Childhood Research
Quarterly, 25, 284–298.
DuBois, D. L., Holloway, B. E., Valentine, J. C., & Cooper, H.
(2002). Effectiveness of mentoring programs for youth: A meta-
analytic review. American Journal of Community Psychology,
30, 157–198.
DuFrene, B. A., Noell, G. H., Gilbertson, D. N., & Duhon, G. J.
(2005). Monitoring implementation of reciprocal peer tutoring:
Identifying and intervening with students who do not main-
tain accurate implementation. School Psychology Review, 34,
Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A
review of research on the influence of implementation on
program outcomes and the factors affecting implementation.
American Journal of Community Psychology, 41, 327–350.
Elder, J. P., Perry, C. L., Stone, E. J., Johnson, C. C., Yang, M.,
Edmundson, E. W., et al. (1996). Tobacco use measurement,
prediction, and intervention in elementary schools in four states:
The CATCH study. Preventive Medicine, 25, 489–494.
Fagan, A. A., Hanson, K., Hawkins, J. D., & Arthur, M. W. (2008).
Bridging science to practice: Achieving prevention program
implementation fidelity in the Community Youth Development
Study. American Journal of Community Psychology, 41, 235–249.
Feldstein, A. C., & Glasgow, R. E. (2008). A practical, robust
implementation and sustainability model (PRISM) for integrat-
ing research findings into practice. Joint Commission Journal on
Quality and Patient Safety/Joint Commission Resources, 34,
Fixsen, D. L., Naoom, S. F., Blasé, K. A., Friedman, R. M., &
Wallace, F. (2005). Implementation research: A synthesis of the
literature. Tampa, FL: University of South Florida, Louis de la
Parte Florida Mental Health Institute, The National Implemen-
tation Research Network (FMHI Publication #231). Retrieved
November 1, 2006, from
Flaspohler, P. D., Anderson-Butcher, D., & Wandersman, A. (2008a).
Supporting implementation of expanded school mental health
services: Application of the Interactive Systems Framework in
Ohio. Advances in School Mental Health Promotion, 1, 38–48.
Flaspohler, P., Duffy, J., Wandersman, A., Stillman, L., & Maras, M.
A. (2008b). Unpacking prevention capacity: An intersection of
research-to-practice models and community-centered models.
American Journal of Community Psychology, 41, 182–196.
Forehand, R., Dorsey, S., Jones, D. J., Long, N., & McMahon, R.
(2010). Adherence and flexibility: They can (and do) coexist!
Clinical Psychology: Science and Practice, 17, 258–264.
Glisson, C., & Schoenwald, S. K. (2005). The ARC organizational
and community intervention strategy for implementing
evidence-based children’s mental health treatments. Mental
Health Services Research, 7, 243–259.
Gottfredson, D. C., Gottfredson, G. D., & Hybl, L. G. (1993).
Managing adolescent behavior: A multiyear, multischool study.
American Educational Research Journal, 30, 179–215.
Greenberg, M. T., Domitrovich, C. E., Graczyk, P. A., & Zins, J. E.
(2005). The study of implementation in school-based preventive
interventions: Theory, research, and practice (volume 3). DHHS
Pub. No. (SMA). Rockville, MD: Center for Mental Health
Services, Substance Abuse and Mental Health Services Admin-
istration, 2005.
Greenhalgh, T., Robert, G., MacFarlane, F., Bate, P., & Kyriakidou,
O. (2004). Diffusion of innovations in service organizations:
Systematic review and recommendations. Milbank Quarterly,
82, 581–629.
Greenwood, C. R., Tapia, Y., Abbott, M., & Walton, C. (2003). A
building-based case study of evidence-based literacy practices:
Implementation, reading behavior, and growth in reading
fluency, K-4. The Journal of Special Education, 37, 95–110.
Grimshaw, J. M., & Russell, I. T. (1993). Effect of clinical guidelines
on medical practice: A systematic review of rigorous evalua-
tions. The Lancet, 342, 1317–1322.
Grol, R., & Jones, R. (2000). Twenty years of implementation
research. Family Practice, 17, S32–S35.
Guldbrandsson, K. (2008). From news to everyday use: The difficult
art of implementation. Östersund, Sweden: Swedish National Institute of Public Health. Retrieved from
Hall, G. E., & Hord, S. M. (2006). Implementing change: Patterns,
principles and potholes (2nd ed.). Boston, MA: Allyn and Bacon.
Hawkins, J. D., Catalano, R. F., & Arthur, M. W. (2002). Promoting
science-based prevention in communities. Addictive Behaviors,
27, 951–976.
International Organization for Standardization. (1998). ISO/IEC
international standard 13236: Information technology - Quality of service: Framework. First edition.
Joyce, R. B., & Showers, B. (2002). Student achievement through
staff development (3rd ed.). Alexandria, VA: Association for
Supervision and Curriculum Development.
Kerr, D. M., Kent, L., & Lam, T. C. M. (1985). Measuring program
implementation with a classroom observation instrument: The
interactive teaching map. Evaluation Review, 9, 461–482.
Kilbourne, A. M., Neuman, M. S., Pincus, H. A., Bauer, M. S., &
Stall, R. (2007). Implementing evidence-based interventions in
health care: Applications of the replicating effective programs
framework. Implementation Science, 2, 42.
Klein, K. J., Conn, A., & Sorra, J. (2001). Implementing computer-
ized technology: An organizational analysis. Journal of Applied
Psychology, 86(5), 811–824.
Klein, K. J., & Sorra, J. S. (1996). The challenge of innovation
implementation. Academy of Management Review, 21, 1055–1080.
Lee, S. J., Altschul, I., & Mowbray, C. T. (2008). Using planned
adaptation to implement evidence-based programs with new
populations. American Journal of Community Psychology, 41,
Liberati, A., Altman, D. G., Tetzlaff, J., Mulrow, C., Gøtzsche, P. C.,
Ioannidis, J. P. A., et al. (2009). The PRISMA statement for
reporting systematic reviews and meta-analyses of studies that
evaluate health care interventions: explanation and elaboration.
BMJ, 339, b2700.
Maxwell, J. A. (2005). Qualitative research design: An interactive approach (2nd ed.). Thousand Oaks, CA: Sage.
Mazzucchelli, T. G., & Sanders, M. R. (2010). Facilitating practi-
tioner flexibility within an empirically supported intervention:
Lessons from a system of parenting support. Clinical Psychol-
ogy: Science and Practice, 17, 238–252.
McGraw, S. A., Sellers, D. E., Stone, E. J., Bebchuk, J., Edmundson,
E. W., Johnson, C. C., et al. (1996). Using process data to
explain outcomes: An illustration from the child and adolescent
trial for cardiovascular health (CATCH). Evaluation Review, 20,
Meyers, D. C., Katz, J., Chien, V., Wandersman, A., Scaccia, J. P., &
Wright, A. (2012). Practical implementation science: Develop-
ing and piloting the Quality Implementation Tool. American
Journal of Community Psychology. doi:10.1007/s10464-012-
Mihalic, S., Fagan, A. A., Irwin, K., Ballard, D., & Elliott, D. (2004).
Blueprints for violence prevention. Washington, DC: Office of
Juvenile Justice and Delinquency Prevention.
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An
expanded sourcebook (2nd ed.). London: Sage.
Miller, W. R., Yahne, C. E., Moyers, T. B., Martinez, J., & Pirritano,
M. (2004). A randomized trial of methods to help clinicians learn
motivational interviewing. Journal of Consulting and Clinical
Psychology, 72, 1050–1062.
National Institutes of Health. (2011, October 25). Dissemination and
implementation. Retrieved from
Okumus, F. (2003). A framework to implement strategies in
organizations. Management Decision, 41(9), 871–882.
Partnerships for Success Community Planning and Implementation
Guide. (2003).
Proctor, E. K., Landsverk, J., Aarons, G., Chambers, D., Glisson, C.,
& Mittman, B. (2009). Implementation research in mental health
services: An emerging science with conceptual, methodological,
and training challenges. Administration and Policy in Mental
Health and Mental Health Services Research, 36, 24–34.
Riley, B. L., Taylor, S. M., & Elliott, S. J. (2001). Determinants of
implementing heart healthy promotion activities in Ontario
public health units: A social ecological perspective. Health
Education Research, 16, 425–441.
Rogers, E. M. (2003). Diffusion of innovations (5th ed.). New York:
Free Press.
Rycroft-Malone, J. (2004). The PARIHS framework: A framework
for guiding the implementation of evidence-based practice.
Journal of Nursing Care Quality, 19, 297–304.
Sandler, I., Ostrom, A., Bitner, M. J., Ayers, T. S., Wolchik, S., &
Daniels, V. S. (2005). Developing effective prevention services
for the real world: A prevention service development model.
American Journal of Community Psychology, 35, 127–142.
Saunders, R. P., Ward, D., Felton, G. M., Dowda, M., & Pate, R. R.
(2006). Examining the link between program implementation
and behavior outcomes in the lifestyle education for activity
program (LEAP). Evaluation and Program Planning, 29,
Schoenwald, S. K. (2008). Toward evidence-based transport of
evidence-based treatments: MST as an example. Journal of
Child & Adolescent Substance Abuse, 17, 69–91.
Sholomskas, D. E., Syracuse-Siewert, G., Rounsaville, B. J., Ball, S.
A., Nuro, K. F., & Carroll, K. M. (2005). We don't train in vain:
A dissemination trial of three strategies of training clinicians in
cognitive-behavioral therapy. Journal of Consulting and Clinical
Psychology, 73, 106–115.
Simpson, D. D. (2002). A conceptual framework for transferring
research to practice. Journal of Substance Abuse Treatment, 22,
Smith, J. D., Schneider, B. H., Smith, P. K., & Ananiadou, K. (2004).
The effectiveness of whole-school antibullying programs: A
synthesis of evaluation research. School Psychology Review, 33,
Sobo, E. J., Bowman, C., Halloran, J., Aarons, G. A., Asch, S., &
Gifford, A. L. (2008). Enhancing organizational change and
improvement prospects: Lessons from an HIV testing interven-
tion for veterans. Human Organization, 67, 443–453.
Spence, K., & Henderson-Smart, D. (2011). Closing the evidence-
practice gap for newborn pain using clinical networks. Journal of
Paediatrics and Child Health, 47, 92–98.
Spoth, R. L., & Greenberg, M. T. (2005). Toward a comprehensive
strategy for effective practitioner-scientist partnerships and
larger-scale community benefits. American Journal of Commu-
nity Psychology, 35, 107–126.
Spoth, R., Greenberg, M., Bierman, K., & Redmond, C. (2004).
PROSPER community-university partnership model for public
education systems: Capacity-building for evidence-based, com-
petence-building prevention. Prevention Science, 5, 31–39.
Stetler, C. B., McQueen, L., Demakis, J., & Mittman, B. S. (2008). An
organizational framework and strategic implementation for
system-level change to enhance research-based practice: QUERI
Series. Implementation Science, 3, 30.
Stith, S., Pruitt, I., Dees, J., Fronce, M., Green, N., Som, A., et al.
(2006). Implementing community-based prevention program-
ming: A review of the literature. The Journal of Primary
Prevention, 27, 599–617.
Tobler, N. S. (1986). Meta-analysis of 143 adolescent drug prevention
programs: Quantitative outcome results of program participants
compared to a control or comparison group. Journal of Drug
Issues, 16, 537–567.
Van de Ven, A. H., Angle, H. L., & Poole, M. S. (1989). Research on
the management of innovation: The Minnesota studies. New
York: Harper and Row.
Walker, J. S., & Koroloff, N. (2007). Grounded theory and backward
mapping: Exploring the implementation context for Wrap-
around. Journal of Behavioral Health Services & Research,
34, 443–458.
Wandersman, A., Chien, V., & Katz, J. (2012). Toward an evidence-
based system for innovation support for implementing innova-
tions with quality: Tools, training, technical assistance, and
quality assurance/quality improvement. American Journal of
Community Psychology. doi:10.1007/s10464-012-9509-7.
Wandersman, A., Duffy, J., Flaspohler, P., Noonan, R., Lubell, K.,
Stillman, L., Blachman, M., Dunville, R., & Saul, J. (2008).
Bridging the gap between prevention research and practice: The
Interactive Systems Framework for dissemination and imple-
mentation. American Journal of Community Psychology, 41,
Wandersman, A., & Florin, P. (2003). Community interventions and
effective prevention. American Psychologist, 58, 441–448.
Wilson, S. J., Lipsey, M. W., & Derzon, J. H. (2003). The effects of
school-based intervention programs on aggressive behavior: A
meta-analysis. Journal of Consulting and Clinical Psychology,
71, 136–149.
... c. Formación (Meyers, Durlak y Wandersman, 2012). d. ...
... La formación de las y los profesionales ha de considerar aspectos referidos al cuándo, cómo y sobre qué contenidos debe realizarse; incorporando tanto las bases teóricas y metodológicas, como herramientas que promuevan la comunicación, el trabajo en equipo, y valores y actitudes positivas hacia la intervención (Herrera, León y Medina, 2007;Máiquez et al., 2015;Sánchez Prieto, Pascual Barrio, Orte Socias y Ballester Brage, 2020). Además, la formación debe ser continua, iniciándose antes de la implementación, y aportando apoyo y asesoramiento durante todo el proceso (Acquah y Thévenon, 2020;Meyers et al., 2012;Small, Cooney y O'Connor, 2009). ...
... Un buen número de intervenciones dirigidas a familias tienen un claro carácter intersectorial y multidisciplinar. Este tipo de acciones requieren la colaboración de los diferentes agentes locales e institucionales implicados, y precisan de un liderazgo que aúne intereses, favorezca alianzas, dé continuidad y estabilidad al programa, y facilite su reconocimiento institucional (Meyers et al., 2012). ...
Full-text available
Se presentan los resultados de la valoración de las/os profesionales del Programa de tratamiento a familias con menores en situación de riesgo o desprotección en Andalucía a partir de la información de las/os 479 profesionales que componen los 147 Equipos de Tratamiento Familiar. Se analizó la fidelidad al programa, algunas dimensiones actitudinales relacionadas con esta (motivación, percepción de utilidad y eficacia, satisfacción con el equipo de trabajo) y la relación de dichas variables con los años de experiencia y el perfil profesional. Además, se examinaron las propuestas de cambio dirigidas a mejorar el programa. Se combinó el análisis cualitativo y cuantitativo de la información a partir de un cuestionario con preguntas abiertas y cerradas. Los resultados evidenciaron una valoración positiva del programa por parte de las/os profesionales, tanto en términos generales como atendiendo a distintas dimensiones específicas. Además, la voz de las/os profesionales puso de manifiesto aspectos, tanto internos como externos al programa, susceptibles de mejora. Entre los aspectos internos destacó la modificación del manual del programa, la temporalización, la revisión de los perfiles profesionales y la formación continua. A nivel externo, señalaron la necesidad de mejorar los canales de coordinación y comunicación, y los procesos de supervisión.
... Interview guide questions focused on issues related with context, implementation and early outcomes to capture key stakeholder perspectives of lessons learned and strategies to support acceptance and adoption of the IPM (see 46 for the interview guide). A sub-set of questions were adapted from key questions identified in the Quality Implementation Framework that are designed to facilitate adoption of innovations [55] to explore how components of the IPM are implemented (see both [18,19]. ...
Background: System-level approaches that target social determinants of health are promising strategies to support substance use prevention, holistic youth development, and wellbeing. Yet the youth services system is largely based on individual-focused programs that do not adequately account for social determinants of health and place the responsibility for wellness on the individual. There is a need to understand how to enhance adoption of complex system-level approaches that support comprehensive youth development. The Icelandic Prevention Model (IPM) represents a collaborative initiative that takes an ecological, system-level approach to prevent substance use and promote wellness in youth. This research was designed to examine key stakeholder perceptions to better understand the social motivations and contextual complexities that influence stakeholder support for community-level adoption of the IPM in a rural Canadian community. Methods: This research applies a case study approach using qualitative interviews to explore strategies that support uptake in the early stages of IPM adoption, associated with developing community buy-in and acceptance. A thematic analysis was applied using QSR NVivo. Results: Nine interviews were conducted with community partners leading the implementation of the IPM. Three overarching themes emerged from the data: 1) Motivating influences, 2) Strategies to develop buy-in, and 3) Resistance to the adoption of the IPM. Findings reflect issues that affect behaviour change in system transformation in general, as well as upstream prevention and the IPM in particular. Conclusions: The findings from this research describe critical insight derived from implementing community-driven initiatives that are designed to support health promotion. It contributes new scientific knowledge related to implementation of complex system-level innovations and practical information that is useful for communities interested in implementing the IPM or following similar approaches to prevent substance use.
... This analysis should inform a prospective, tailored implementation approach. [46][47][48][49] This is a retrospective assessment of program implementation, which introduces recall bias into the reporting. Although three of our co-authors (AS, MB, HHJ) were involved with program development and implementation, a post hoc analysis carries a risk of bias toward overstating program benefits and understating program challenges. ...
Background: The prospective surveillance model (PSM) is an evidence-based rehabilitation care delivery model that facilitates functional screening and intervention for individuals undergoing cancer treatment. While the PSM is empirically validated and feasible in practice, implementation into cancer care delivery has languished. The purpose of this manuscript is to characterize the barriers and facilitators to implementing the PSM in a breast cancer center and to share policy and process outcomes that have sustained the model in practice. Methods: The PSM implementation was undertaken as a quality improvement initiative of our cancer center. We retrospectively assessed barriers to implementation and define those according to the Consolidated Framework for Implementation Research (CFIR). Implementation strategies are defined based on the Expert Recommendations for Implementing Change (ERIC) taxonomy. Breast center policy changes and stakeholder-reported process improvement outcomes at the clinic and system level are described. Results: PSM implementation facilitation was driven primarily by adapting the model to align with the cancer center workflow, engaging interdisciplinary stakeholders as program champions, enhancing knowledge and awareness among cancer care providers, and changing infrastructure to support the model. System- and clinic-level policy and process changes included the development of clinical pathways, EHR order sets and automated referrals, new staffing models, and adapted clinical workflows. Conclusion: Our report provides insight on implementing the PSM at a single institution in a cancer care delivery setting. Successful implementation strategies addressed individual, clinic, and system-level barriers and facilitated process and policy changes that have enabled PSM sustainment.
Implications for Cancer Survivors: Improving integration of rehabilitation services into oncology care has significant implications for survivorship care by enhancing proactive management of functional morbidity.
... As described in our study protocol [26], our implementation approach was informed by our previous implementation research [23][24][25] as well as three frameworks: the Quality Implementation Framework [27], which guided our implementation process; the Implementation Outcomes Taxonomy [28], which guided our evaluation of implementation outcomes; and the Consolidated Framework for Implementation Research [29], which provided factors associated with successful implementation. Based on our previous FBT implementation research [23][24][25], our blended implementation approach in this study consisted of: (a) the establishment of implementation teams at each site; (b) a training workshop; (c) bi-weekly clinical consultation; (d) bi-weekly implementation consultation; and (e) fidelity assessment. ...
Family-Based Treatment (FBT)—the most widely supported treatment for pediatric eating disorders—transitioned to virtual delivery in many programs due to COVID-19. Using a blended implementation approach, we systematically examined therapist adherence to key components of FBT and fidelity to FBT by videoconferencing (FBT-V), preliminary patient outcomes, and team experiences with our FBT-V implementation approach as well as familial perceptions of FBT-V effectiveness. We examined our implementation approach across four pediatric eating disorder programs in Ontario, Canada, using mixed methods. Participants included therapists (n = 8), medical practitioners (n = 4), administrators (n = 6), and families (n = 5; 21 family members in total). We developed implementation teams at each site, provided FBT-V training, and offered clinical and implementation consultation. Therapists submitted video recordings of their first four FBT-V sessions for fidelity rating, as well as patient outcomes. Therapists self-reported readiness, attitudes, confidence, and adherence to FBT-V. Focus groups were conducted with each team and family after the first four sessions of FBT-V. Quantitative data were analyzed using repeated measures ANOVA. Qualitative data were analyzed using directed and summative content analysis. Therapists adhered to key FBT components and maintained FBT-V fidelity. Changes in therapists’ readiness, attitudes, and confidence in FBT-V over time were not significant. All patients gained weight. Focus groups revealed implementation facilitators and barriers, positive and negative experiences with FBT-V training and consultation, suggestions for improvement, and effectiveness attributed to FBT-V. Our implementation approach appeared to be feasible and acceptable. Future research with a larger sample is required to further our understanding of this approach and to explore how organizational factors influence treatment fidelity.
It is widely recognized that the most effective student mental health interventions, tools, and resources are those that are solidly grounded in theory, evidence, and practice. But developing interventions in this way can be a time-consuming, challenging process. This article describes the process of developing a classroom resource to build social emotional learning skills among high school students in Ontario. The resource was informed by the latest research evidence while also being sensitive to the implementation context and needs of educators and students. In creating, evaluating, and revising these resources over several years, lessons have emerged about what it takes to navigate inherent challenges, balance competing needs and priorities, and ultimately develop an intervention that is both evidence-informed and implementation sensitive. Flexible funding, effective partnerships, and a commitment to contextual responsivity are key.
Purpose: This research examines the implementation of the Icelandic Prevention Model (IPM) in Canada to identify opportunities revealed by the COVID-19 pandemic to re-design our social eco-system to promote wellbeing. This paper has two objectives: 1) to provide a conceptual review of research that applies the bioecological model to youth substance use prevention with a focus on the concepts of time and physical space use and 2) to describe a case study that examines the implementation of the IPM in Canada within the context of the COVID-19 pandemic. Method: Study data were collected through semi-structured qualitative interviews with key stakeholders involved in implementing the IPM. Results: Findings are organized within three over-arching themes derived from a thematic analysis: 1) Issues that influence time and space use patterns and youth substance use, 2) Family and community cohesion and influences on developmental context and time use and 3) Opportunities presented by the pandemic that can promote youth wellbeing. Conclusion: We apply the findings to research on the IPM as well as the pandemic to examine opportunities that may support primary prevention and overall youth wellbeing. We use the concepts of time and space as a foundation to discuss implications for policy and practice going forward.
Purpose: Whilst anyone can be scammed, individuals with acquired brain injury (ABI) may have unique risk factors for cyberscams for which tailored interventions are required. To address this, a co-design approach was utilised to develop cybersafety resources with people with living experience of ABI and scams. This study aimed to evaluate the co-design experience to inform future utilisation of co-design methods. Method: Semi-structured qualitative interviews explored perceived benefits and challenges, level of support, and the co-design process for people with ABI (n = 7) and an attendant care worker (ACW) (n = 1). Transcripts were analysed using a six-stage reflexive thematic analysis. Results: Five themes were identified: "An Intervention Addressing Shame"; "Feeling Validated and Valued"; "Experiencing a 'Profound Change Amongst a Group of Peers'"; "Gaining Stronger Scam Awareness"; and "Taking Ownership". Adjustments to support communication, memory impairments, and fatigue in the co-design process were recommended. Conclusions: Participant reflections on the co-design process extended beyond resource design and highlighted therapeutic benefits of increased insight and emotional recovery from shame. Likely mechanisms underpinning these benefits were the peer group format and opportunities to make meaningful contributions. Despite identified challenges in facilitating co-design projects, the practical and emotional benefits reported by participants underscore the value of co-design with people with ABI.
Implications for Rehabilitation: Individuals with acquired brain injury (ABI) may be at increased risk of cyberscams due to cognitive impairments, for which tailored cyberscam interventions are required. Using a co-design approach maximises the relevance of training resources for individuals with ABI. Using a collaborative co-design approach to developing cybersafety training resources may facilitate scam awareness and peer support. Support for communication, memory impairments, and fatigue may be necessary in co-design efforts with people with ABI.
This article presents a comprehensive meta-analysis of international studies on the effects of parent training programs (PTP) on antisocial behavior (ASB) in children and adolescents. From systematic literature searches of 7219 reports, we finally selected 239 eligible reports with 241 independent studies and 279 comparisons between a program and a control condition up to the publication year 2020. Although most interventions were based on a cognitive-behavioral approach, we also found a great variety of programs and applications. Overall, the mean effect for PTP was positive for parent/family and ASB outcomes (d = 0.46 and d = 0.47, respectively, using the random-effects model at postintervention). We also found higher effects on more proximal parental outcomes such as parental stress, parental competencies, and parent–child interaction/relation. However, more distal outcomes such as marital satisfaction or parent psychopathology revealed lower effect sizes. In addition, the link between changes in parental/family outcomes and changes in ASB was significant across several outcome types, thus confirming the general causal assumption of PTP. Postintervention effects were stable across several moderators, although clinical applications revealed slightly higher effect sizes than preventive applications. Several findings cast some doubt on these generally positive results: for example, effect sizes decreased considerably not only in short-term (3 to 12 months) but especially in long-term follow-ups (12 months or more), and the vast majority of outcome assessments stemmed from parent ratings. Finally, we found a clear negative connection between sample size and effect size. Whether this is due to publication bias or indicates better implementation quality in smaller studies remains an open question.
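Several of the meta-analyses summarized here report pooled effects "using the random-effects model." As a minimal sketch of how such pooling works, the following computes a DerSimonian-Laird random-effects estimate; the per-study effect sizes and variances are hypothetical illustrations, not data from any of the reviews above:

```python
import math

# Hypothetical study-level effect sizes (Cohen's d) and sampling variances
effects = [0.30, 0.55, 0.42, 0.61, 0.38]
variances = [0.02, 0.05, 0.03, 0.04, 0.02]

# Fixed-effect (inverse-variance) weights and pooled estimate
w = [1.0 / v for v in variances]
d_fixed = sum(wi * di for wi, di in zip(w, effects)) / sum(w)

# Cochran's Q heterogeneity statistic and the DerSimonian-Laird
# between-study variance tau^2 (truncated at zero)
q = sum(wi * (di - d_fixed) ** 2 for wi, di in zip(w, effects))
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - (len(effects) - 1)) / c)

# Random-effects weights add tau^2 to each study's sampling variance
w_re = [1.0 / (v + tau2) for v in variances]
d_random = sum(wi * di for wi, di in zip(w_re, effects)) / sum(w_re)
se_random = math.sqrt(1.0 / sum(w_re))
ci = (d_random - 1.96 * se_random, d_random + 1.96 * se_random)

print(f"pooled d = {d_random:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```

When heterogeneity is low (Q below its degrees of freedom), tau² truncates to zero and the random-effects estimate coincides with the fixed-effect one; with substantial heterogeneity, the added tau² term down-weights precise studies less sharply and widens the confidence interval.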
This article describes a collaborative research project between practitioner partners and researchers on a model for supporting and training school staff, aimed at the optimal and sustainable implementation of evidence-based programs and interventions concerning students' social development (violence prevention) and psychological well-being. The conceptualization and piloting of this model will rest on a partnership structure and process mobilizing no fewer than 11 organizations representing various school-sector partners, who will work with more than 18 researchers and 10 collaborators over a three-year period.
Why do some organizations succeed and others fail in implementing the innovations they adopt? To begin to answer this question, the authors studied the implementation of manufacturing resource planning, an advanced computerized manufacturing technology, in 39 manufacturing plants (number of individual respondents = 1,219). The results of the plant-level analyses suggest that financial resource availability and management support for technology implementation engender high-quality implementation policies and practices and a strong climate for implementation, which in turn foster implementation effectiveness, that is, consistent and skilled technology use. Further research is needed to replicate and extend the findings.
Systematic reviews and meta-analyses are essential to summarize evidence relating to efficacy and safety of health care interventions accurately and reliably. The clarity and transparency of these reports, however, is not optimal. Poor reporting of systematic reviews diminishes their value to clinicians, policy makers, and other users. Since the development of the QUOROM (QUality Of Reporting Of Meta-analysis) Statement, a reporting guideline published in 1999, there have been several conceptual, methodological, and practical advances regarding the conduct and reporting of systematic reviews and meta-analyses. Also, reviews of published systematic reviews have found that key information about these studies is often poorly reported. Realizing these issues, an international group that included experienced authors and methodologists developed PRISMA (Preferred Reporting Items for Systematic reviews and Meta-Analyses) as an evolution of the original QUOROM guideline for systematic reviews and meta-analyses of evaluations of health care interventions. The PRISMA Statement consists of a 27-item checklist and a four-phase flow diagram. The checklist includes items deemed essential for transparent reporting of a systematic review. In this Explanation and Elaboration document, we explain the meaning and rationale for each checklist item. For each item, we include an example of good reporting and, where possible, references to relevant empirical studies and methodological literature. The PRISMA Statement, this document, and the associated Web site should be helpful resources to improve reporting of systematic reviews and meta-analyses.
We used meta‐analysis to review 55 evaluations of the effects of mentoring programs on youth. Overall, findings provide evidence of only a modest or small benefit of program participation for the average youth. Program effects are enhanced significantly, however, when greater numbers of both theory‐based and empirically based “best practices” are utilized and when strong relationships are formed between mentors and youth. Youth from backgrounds of environmental risk and disadvantage appear most likely to benefit from participation in mentoring programs. Outcomes for youth at‐risk due to personal vulnerabilities have varied substantially in relation to program characteristics, with a noteworthy potential evident for poorly implemented programs to actually have an adverse effect on such youth. Recommendations include greater adherence to guidelines for the design and implementation of effective mentoring programs as well as more in‐depth assessment of relationship and contextual factors in the evaluation of programs.
Although numerous studies address the efficacy and effectiveness of health interventions, less research addresses successfully implementing and sustaining interventions. As long as efficacy and effectiveness trials are considered complete without considering implementation in nonresearch settings, the public health potential of the original investments will not be realized. A barrier to progress is the absence of a practical, robust model to help identify the factors that need to be considered and addressed and how to measure success. A conceptual framework for improving practice is needed to integrate the key features for successful program design, predictors of implementation and diffusion, and appropriate outcome measures.
Presented is a meta-analysis of the outcome results for 143 adolescent drug prevention programs to identify the most effective program modalities for reducing teenage drug use. Glass et al.'s (1981) meta-analysis techniques provided a systematic approach for the accumulation, quantification, and integration of the numerous research findings. Five major modalities were identified and their effect sizes computed for five distinctly different outcomes: Knowledge, Attitudes, Use, Skills, and Behavior measures. The magnitude of the effect size was found to depend on the outcome measure employed and the rigor of the experimental design. These factors were controlled for through use of a standard regression analysis. Peer Programs were found to show a definite superiority in the magnitude of the effect size obtained on all outcome measures. On the ultimate criterion of drug use, Peer Programs were significantly different from the combined results of all the remaining programs (p < .0005). Peer Programs maintained high effect sizes for alcohol, soft drugs, and hard drugs, as well as for cigarette use. Recommendations are made concerning the effectiveness of the underlying theoretical assumptions of the different program modalities. Future programming implications are discussed, as Peer Programs were identified as effective for the average school-based adolescent population, while the Alternatives programs were shown to be highly successful for "at risk" adolescents such as drug abusers, juvenile delinquents, or students having school problems.
Most research on why health care quality improvement implementation succeeds or fails focuses on front-line or provider-based factors. However, background factors related to the structures and processes of projects themselves also pose challenges. Using a focused ethnographic assessment approach, we undertook a case study to characterize particularly challenging background factors in an ongoing implementation effort. We found that the organizational structure of the project under study sustained several key "cultural" differences in stakeholder agendas. Moreover, it fostered the emergence of strategic communication processes that, despite their immediate utility, sometimes undermined progress and threatened long-term relations by distorting information flow in particularly patterned ways. These included a "focus on the local" and "information reconfigurations" or "partiality" that sometimes led to miscommunication or interpretive disjunctions between various stakeholders. Successful cross-organizational communication is in certain ways a cross-cultural achievement, and several guidelines were devised to facilitate this. Our experience with other health care systems and with health services research in general suggests that our findings and recommendations are broadly applicable. Because the main barriers identified were generated by complex organizational arrangements, lessons learned may also be transferable to other complex organizational contexts.