Research Article
Implementation Support Skills: Findings
From a Systematic Integrative Review
Bianca Albers¹,², Allison Metz³, Katie Burke⁴, Leah Bührmann¹,⁵, Leah Bartley³, Pia Driessen⁶, and Cecilie Varsi⁷
Abstract
Purpose: Skills in selecting and designing strategies for implementing research-supported interventions (RSIs) within specific
local contexts are important for progressing a wider RSI adoption and application in human and social services. This also applies to
a particular role in implementation, the implementation support practitioner (ISP). This study examines which strategies have
been reported as being used by ISPs across multiple bodies of research on implementation support and how these strategies were
applied in concrete practice settings. Methods: A systematic integrative review was conducted. Data analysis utilized the Expert Recommendations for Implementing Change (ERIC) compilation of implementation strategies. Results: Studies reported on 18 implementation strategies commonly used by different ISPs, who require mastery in selecting, operationalizing, and detailing these. Two
further strategies not included in the ERIC compilation could be identified. Discussion: Given the use of primarily more feasible
implementation support strategies among ISPs, their potential as agents of change may be underutilized.
Keywords
implementation science, implementation support, integrative review, implementation strategies, capacity building
Implementation scientists have shown an interest in how to best
support frontline staff, that is, those involved in the delivery of
services to children, adults, families, and communities, in their
uptake of research-supported interventions (RSIs) since the
early beginnings of the field (Harvey et al., 2002; Kitson
et al., 1998; Schoenwald et al., 2004). RSIs—be they programs,
strategies, procedures, or policies—are those that have been "evaluated using acceptable standards of scientific evidence and found to yield generally positive outcomes" (Thyer et al.,
2017, p. 86) for their target populations. Obtaining a wide-
spread uptake and sustainable use of these RSIs remains a
challenge in real-world human and social services because
individuals, agencies, and systems often lack sufficient skill
to overcome barriers to their implementation.
This has been a central topic of debate also in social work. In
this profession, evidence-based practice (EBP) is increasingly
viewed as central in tackling societal challenges such as child
maltreatment, homelessness, family violence, or unequal
access to health care (Barth et al., 2017; Nurius et al., 2017).
By EBP, we mean the process of "integrating individual
practice expertise with the best available external evidence
from systematic research as well as considering the values and
expectations of clients” (Gambrill, 1999, p. 346). RSIs repre-
sent the best available external evidence in this context,
which—to establish EBP—would need to be considered and
used in the light of client preferences and professional
expertise.
Knowledge and skill deficiencies have consistently emerged
from studies as a key factor that prevents human and social
service workers from establishing this practice (Ekeland et al.,
2018; Finne, 2020; Goel et al., 2018; Grady et al., 2017; James
et al., 2019; Lery et al., 2015; Scurlock-Evans & Upton, 2015;
Shapira et al., 2017; van der Zwet et al., 2019; Wike et al.,
2019). In recent years, educational institutions have increas-
ingly aimed to accommodate RSIs and EBP by, for example,
adjusting academic or fieldwork curricula (Bertram et al.,
2014, 2018; Drisko & Grady, 2019; Mennen et al., 2018).
However, since the evidence on techniques for effectively
teaching EBP remains scarce (Drisko & Grady, 2019; Spens-
berger et al., 2020), it is unlikely that such efforts alone will
succeed in growing a much needed workforce of professionals
who can progress a more widespread use of the EBP model in
human and social service settings.

1 European Implementation Collaborative, Søborg, Denmark
2 University of Melbourne, Parkville, Victoria, Australia
3 University of North Carolina at Chapel Hill, NC, USA
4 Centre for Effective Services, Dublin, Ireland
5 Vrije Universiteit, Amsterdam, the Netherlands
6 European Alliance Against Depression, Leipzig, Germany
7 Oslo University Hospital, Norway

Corresponding Author:
Bianca Albers, Max-Pechstein-Str. 1, 67061 Ludwigshafen, Germany.
Email: balbers@implementation.eu

Research on Social Work Practice, 1-24
© The Author(s) 2020
Article reuse guidelines: sagepub.com/journals-permissions
DOI: 10.1177/1049731520967419
journals.sagepub.com/home/rsw

Furthermore, it is well-documented in the literature that training, in the form of
teacher-centered didactic education sessions, can be effective
in building theoretical knowledge and shaping attitudes and
beliefs, but it is often insufficient to achieve practical applica-
tion of RSIs in real-world settings (Beidas & Kendall, 2010;
Hecht et al., 2016; Prior et al., 2008). This has increased attention to the importance of how RSIs are implemented in
human and social services, best represented by the field of
implementation research, that is, “the study of methods to
promote the uptake of research findings into routine practice" (Bhattacharyya et al., 2009, p. 491). Insights from this field of
inquiry highlight that the transfer of knowledge gained through
training to the conditions of human and social service organi-
zations and systems is influenced by a variety of factors—for
example, the climate characterizing the setting into which new
knowledge is embedded or the support for knowledge transfer
available in this setting (Blume et al., 2010; Grossman & Salas,
2011; Jackson et al., 2018). Factors like these have to be con-
sidered and addressed in order to enable real-world RSI transfer
and application.
This implies that next to understanding the RSI itself,
stakeholders directly involved in its implementation also
require implementation expertise, that is, knowledge and skills
in selecting and designing strategies that can support RSI
implementation within specific local contexts. This type of
expertise has been highlighted as one of the particularly impor-
tant elements in progressing a wider adoption and application
of RSIs in human and social services (Brownson et al., 2018),
especially in the field of social work, where implementation
science has taken hold only slowly (Barth et al., 2017; Cabassa,
2016). Only recently has the question about how to build this
implementation capacity in service agencies and their staff
gained increasing attention, reflected in publications that report
on theory development (Leeman et al., 2017), the design of
implementation courses (Moore et al., 2018; Mosson et al.,
2019; Park et al., 2018; Provvidenza et al., 2020), or reviews
of the existing literature on strategies and approaches used in
implementation capacity building (Leeman et al., 2015; Stan-
der et al., 2018).
Implementation Support
One of the most frequently used approaches to building imple-
mentation capacity is the provision of ongoing implementation
support (Katz & Wandersman, 2016; Leeman et al., 2015). In
the absence of a commonly shared definition of the concept of
implementation support, we define this support broadly, as any
activities and processes aimed at assisting leadership and staff
of human and social service organizations and systems in
implementing, sustaining, and scaling RSIs for population
impact. Multiple streams of research exist that focus on imple-
mentation support, including studies on, for example, facilita-
tion (Cranley et al., 2017), coaching (Artman-Meeker et al.,
2015), consultation (Nadeem, Gleacher, & Beidas, 2013), or
technical assistance (Dunst et al., 2019), indicating a high
degree of variability in the terms used to describe implemen-
tation support activities.
In human and social service practice settings, the establish-
ment of organizations dedicated to providing implementation
support, for example, purveyors (McWilliam et al., 2016) and
intermediary organizations (Franks & Bory, 2015; Proctor
et al., 2019), technical assistance centers (Bumbarger & Camp-
bell, 2011), or centers of excellence (Biegel et al., 2003; Met-
trick et al., 2017), has contributed to an increasing
professionalization of this support and considerable public
investment in these organizations (Proctor et al., 2019). Espe-
cially in social work, their role has been described as integra-
tive due to their potential to better interlink academic training
efforts with community services and to develop a growing
workforce in need of skills to implement EBP (Shapiro,
2018). As a consequence, growing numbers of professionals
are providing implementation support. These “implementation
support practitioners” (ISPs; Albers, Metz, & Burke, 2020) are
the first focus point of this article.
ISPs
While multiple reviews exist aimed at describing the activities
of ISPs, these mostly focus on a single particular role, for
example, on technical assistance providers (Dunst et al.,
2019; Taylor et al., 2014), knowledge brokers (Taylor et al.,
2014), or facilitators (Elledge et al., 2018). A similar picture
emerges for conceptual studies aimed at identifying and char-
acterizing the key functions and mechanisms of implementa-
tion support as provided by, for example, coaches (P. A. Snyder
et al., 2015), consultants (Edmunds et al., 2013; McLeod et al.,
2018; Nadeem, Gleacher, & Beidas, 2013), or other change
agents (Glisson & Schoenwald, 2005).
However, a comparison of how these key functions have
been defined (see Table 1) illustrates that they are similar in
focus and aim, centered on easing implementation problems and helping others improve the uptake and implementation
of RSIs.
Hence, within the field of implementation science, the scat-
tering of the literature across multiple “schools of thought”
may create silos that increase the risk of research waste and a
lack of cross-school learning. Enabling such learning may help
to strengthen and refine the evidence base for implementation
support work. Furthermore, for stakeholders operating within
real-world implementation practice, it is important to under-
stand the characteristics of quality implementation support and
how it can be provided. For the social work profession, it has
been argued that quality implementation is key in achieving
social progress reflected in, for example, improved youth
development, reduced homelessness, or the elimination of
racism (Cabassa, 2016; Gehlert et al., 2017). Guidance on how
to support such implementation should therefore be based on a
consolidated evidence base that utilizes insights about all rel-
evant types of implementation support, independent of their
label as, for example, facilitation, consultation, coaching, or
knowledge brokering.
A previously developed theoretical program logic reflects
this thinking (Albers, Metz, & Burke, 2020). It presents ISPs as
individuals who, through the use of unique combinations of
knowledge, skills, and attitudes tailored to the contextual set-
tings in which they work, can help individuals, agencies, and
service systems to effectively utilize RSIs, maintain their use,
and generate positive implementation, service, and client out-
comes. Subsequently, a systematic integrative review was con-
ducted to deepen the preliminary thinking reflected in this logic
and detail its key components. One of these components is the
skills required by ISPs to provide implementation support rep-
resented in their use of implementation strategies. These stra-
tegies are the second focus point of this article.
Implementation Strategies
Skill can be defined as the ability to do something well based
on one’s knowledge, practice experience, or aptitude. It repre-
sents an ability to assess the unique context in which an indi-
vidual operates, to integrate technical with practice knowledge,
and to use this information flexibly and creatively in situational
decision making (Devaney et al., 2017).
Implementation strategies, “the specific means or methods
for adopting and sustaining” RSIs (Proctor et al., 2013, p. 1),
represent a central body of knowledge as the “doing” of imple-
mentation. Examples of common implementation strategies are
training, audit and feedback or learning collaboratives. While
further research on the effectiveness of implementation strate-
gies is required (Powell et al., 2019), initial studies indicate that
the choice of implementation strategy may affect outcomes,
including, for example, knowledge and skill acquisition among
policy makers (Sarkies et al., 2017) as well as frontline practi-
tioners (Dimeff et al., 2015), implementation outcomes (Beidas et al.,
2017; Martino et al., 2019), and client outcomes (Chadwick
et al., 2015).
One of the most referenced taxonomies of implementation
strategies in the field of implementation science is that devel-
oped by Powell, Waltz et al. (2015), the Expert Recommenda-
tions for Implementing Change (ERIC) compilation. It
differentiates between 73 distinct implementation strategies
and provides definitions for each. As such, the taxonomy rep-
resents a menu available to ISPs as a means to introduce, teach,
and practice implementation as they collaborate with different
stakeholders in human and social services. Hence, it could be
used to describe the potential skills that ISPs may require to
adequately support others in their implementation practice.
However, scholars have highlighted that the effectiveness of
implementation strategies is dependent on their potential to
sufficiently address the barriers and/or facilitators that influ-
ence an implementation (Grimshaw et al., 2012; Powell et al.,
2014), assigning great importance to the thorough design of
implementation strategies. Taxonomies such as the ERIC com-
pilation are valuable tools in this regard because they provide
both the basic elements of strategy design and a common strat-
egy language that stakeholders can use in jointly assembling
and applying local implementation approaches. Their basic
elements though—the different, predefined implementation
strategies—still represent relatively broad labels for implemen-
tation activities. For ISPs working in often complex settings of
routine practice and policy, this leaves ample room for further
strategy operationalization and tailoring. For example, consul-
tation, included in the ERIC compilation as a form of
expert-based implementation support, has previously been dis-
cussed as a “black box” that contains further activities and
functions (Nadeem, Gleacher, & Beidas, 2013), illustrating the
relevance of examining the concrete details of using a strategy.
Furthermore, the work of ISPs must take into account the
dynamic and highly relational nature of policy and practice
implementation involving multiple layers of context and differ-
ing norms and values among stakeholders (Carey et al., 2019;
Huzzard, 2020; Norris et al., 2017).
Hence, the design of strategies is not a straightforward,
simple selection process but one of tailored strategy operatio-
nalization and integration. Taxonomies of implementation stra-
tegies alone are therefore insufficient to understand the breadth
and depth of the skills that ISPs require in providing imple-
mentation support. To achieve this understanding, it is also
necessary to investigate how strategies have been applied in
implementation practice. This was the aim of this integrative review. In order to describe the skills required by ISPs, it examines (a) which strategies have been reported as being used by ISPs across multiple bodies of research on implementation support and (b) how these strategies were applied in concrete practice settings.

Table 1. Definitions of Key Functions of Different Implementation Supports.

Coaching: "Coaching is defined as a non-evaluative, ongoing process (e.g., occurring over a period of time), in which one individual observes and provides feedback to another individual targeting an intervention, supports or other variables the individual wants to increase." (Stormont et al., 2015, p. 70)

Consultation: "...external support provided within dissemination and implementation efforts. Consultation connotes an imparting of specific expertise in intervention techniques as well as experience and knowledge of the application of these techniques in different settings." (Nadeem, Gleacher, & Beidas, 2013, p. 2)

Facilitation: "Facilitation is a technique where an individual makes things easier for others, by providing support to help them change their ways of thinking and working." (Cranley et al., 2017, p. 1)

Knowledge brokering: "...first, to make evidence more accessible and tailored for clinicians and health care decision makers (knowledge management); second, to facilitate mutual learning between researchers and clinicians (linkage and exchange); and, finally, to develop clinicians' and decision makers' skills and capacity for EIP (capacity building)." (Hoens & Li, 2014, p. 223)

Technical assistance: "...information sharing, expertise, instruction, training, and other supports for improving program, organization, or system capacity to achieve specific goals, objectives, or outcomes." (Dunst et al., 2019, p. 109)
Method
This research is based on the conduct of a systematic integra-
tive review, involving a narrative synthesis of findings guided
by thematic analysis.
The integrative review method “allows for the combination
of diverse methodologies” (Hopia et al., 2016, p. 663), that is,
the inclusion of studies that use different and diverse designs. It
belongs to a group of emerging knowledge synthesis methods
for integrating rich contextual data (Saini & Shlonsky, 2012;
Tricco et al., 2016) and addressing complex practice and policy
questions (Greenhalgh et al., 2018). With theory building being
among its key purposes (Kastner et al., 2016; Tricco et al.,
2016), an integrative review was particularly suitable for this
project’s goal of refining a preliminary ISP program logic pre-
viously developed by the authors (Albers, Metz, & Burke,
2020).
While novel methods suffer at times from a lack of clarity
on how to apply them (Tricco et al., 2016), all operational steps
have been defined for the integrative review in the form of a
framework developed by Whittemore and Knafl (2005). This
framework was applied to this review, which was conducted in
five stages including (1) problem identification, (2) literature
search, (3) data evaluation, (4) data analysis, and (5) data pre-
sentation. With Step (1), presented above, we identified a gap
in our understanding of which implementation strategies ISPs use and how, and thus of the skills that they require to provide
implementation support in social and human services. All other
review steps taken are presented in the following sections.
Literature Search
Integrative reviews are a method for systematically searching
and synthesizing a targeted body of publications that is selected
based on its potential for revealing new patterns and insights.
While a comprehensive, representative set of the literature
should be the basis of an integrative review (Torraco, 2016),
with its key purpose being research integration for theory and
framework development, it is not necessary to include every
study ever published in the area of interest (Snyder, 2019).
Rather, the aim is to achieve a level of "conceptual saturation"
(Brunton et al., 2012, p. 120) through studies representative of,
in this case, the work of ISPs.
With this in mind, a combination of five different search
strategies was used to identify relevant publications for this
review. Systematic searches of nine electronic databases
(ASSIA, CINAHL, Criminal Justice Abstracts, ERIC, Family
and Society Studies Worldwide, Medline, PsycInfo, Scopus,
and SocIndex) were conducted and combined with searches
of 42 different organizational websites, a targeted call for pub-
lications among selected experts, an open call through elec-
tronic media, and reference checks of all included studies.
Searches were conducted between February and April 2019.
A detailed account of all strategies has been included in
the electronic results addendum (ERA; Albers, Bührmann, et al., 2020).
Qualitative as well as quantitative peer-reviewed primary
studies that focused on the role of ISPs were eligible for inclu-
sion. These could be studies of knowledge brokers, facilitators,
consultants, coaches, intermediaries, improvement advisers,
mentors, and so on. Given the diversity in terminology
reflected in just these examples, further labels for implemen-
tation support roles were considered on a case-by-case basis if
they emerged from studies. Studies of support provided
through individuals in formal leadership roles or opinion lead-
ers were excluded. No geographical limitations were defined.
Publications had to be written in English, German, Danish,
Swedish, or Norwegian. The detailed inclusion and exclusion
criteria applied are described in the ERA (Albers, Bührmann, et al., 2020).
All work was conducted by a research team with five mem-
bers, including four research assistants and the lead author. The
team combined methodical expertise in systematic literature
reviews (B.A. and C.V.), with content expertise related to
social work (B.A. and L.Ba.), health services (C.V., P.D., and
L.Bü.), and education (B.A.). The team used double-screening
during both title and abstract and full-text screening, with con-
flicts being solved by a third team member. Weekly calls with
all team members were used to discuss any issues emerging
from screening, data extraction, or coding activities.
The flow of studies through the screening process is outlined
in the ERA in figure ERA1 (Albers, Bührmann, et al., 2020).
Data Evaluation
Taking into account the purpose and character of this review
and the available resources and instruments, a decision was
made to focus the quality assessment of studies on randomized
controlled trials only. In the absence of tools for assessing the
risk of bias in rigorous implementation studies, we focused on
exploring how developed studies of the still relatively new field of implementation support were, using a framework developed by Hodder et al. (2016). This framework enabled us to consider the quality of
included trials based on their integration of theory, consistency
in terminology use, and balanced consideration of both benefits
and harms of implementation interventions. These considera-
tions were summarized and translated into five questions,
which were answered for all included randomized controlled
trials. The tool itself and the assessment of included implementation studies are presented in the ERA (Albers, Bührmann, et al., 2020).
The quality of all other studies included in this review
remained unassessed. This was deemed acceptable in the cur-
rent context, partly because implementation support is a novel
field of inquiry, partly because this review was conducted for
4Research on Social Work Practice XX(X)
the purpose of theory development. Under these conditions, it
can be valuable to review the entire range of existing evidence
because even weaker studies may contain important informa-
tion (Petticrew & Roberts, 2003).
Data Analysis
Descriptive study data were extracted using a standardized
extraction form with 19 items, including study design
and aim, method, geography, sector, setting, sampling strategy,
sample size, clinical intervention, ISP information, outputs,
and outcomes. Each study was extracted by one research team
member, and all data extraction quality was assured by the lead
author.
Following this data extraction, all studies were uploaded to
dedoose, an online qualitative data analysis platform for the
coding of strategy use. This coding was guided by the 73
implementation strategies that comprise the ERIC compilation
(Powell, Waltz, et al., 2015). The label of each strategy, for
example, “distribute educational materials,” was entered into
dedoose together with its basic definition. In this way, each
strategy represented a single code. Research team members
assigned these codes to text excerpts, which could be single
words, one or more sentences, or entire paragraphs describing
ISPs’ use of different strategies. Multiple codes could be
assigned to one text excerpt. Strategies, which did not fit with
any of the compilation labels, were coded as “other” and later
analyzed separately.
The coding scheme was tested by three research team mem-
bers on a sample of five studies, and results were discussed for
further improvement of the scheme. It was then used with all
studies, each of which was coded by one research team member. Of these studies, 42% were double coded by a second
member of the research team, with any conflicts solved by the
lead author. The coding of the remaining 58% of studies was
quality assured by the lead author, through a review of all text
excerpts during data analysis, and a recoding of these as nec-
essary. Simultaneously with coding, all research team members
generated memos to capture ideas related to specific aspects of
strategy use, gaps, or questions emerging from the textual data.
Coding results for all studies were exported from dedoose
by coding category, generating 44 different Excel spreadsheets,
each of which contained the raw coding data for a single code
(i.e., ERIC strategy) used, including (a) publication identifiers,
(b) the text excerpts relating to a particular code, and (c) the
ISP category to which a publication belonged. Memos
(N = 277) were exported separately, grouped under nine different topics, each of which had been defined by different research
team members during the coding process. The lead author
reviewed each spreadsheet separately to assure the quality of
all coding and highlight key content for each excerpt.
As part of the subsequent data comparison, the lead author
reviewed all text excerpts in detail, reassigned codes (i.e., strat-
egy labels) as relevant, identified preliminary ideas for central/
common aspects of strategy use, and finally refined these
through constant comparison with other text excerpts
belonging to the same code. Since excerpts related to the implementation strategies "facilitation" and "providing ongoing consultation" proved to contain the same key elements and thus
were not distinctly different from each other, a decision was
made to merge these into just one code: “providing ongoing
consultation/facilitation.” Furthermore, while a large number
of findings were available, some of these were derived from
only a single or very few included studies. To ensure that the
content of the revised ISP program logic could be built on an
acceptable minimum of scholarly agreement, it was therefore
decided to only consider findings if they emerged from a min-
imum of 10 studies.
Results
Overview of Studies
A total of 109 publications formed the final sample of literature, representing 99 separate studies. A full list of these publications is included in the ERA (Albers, Bührmann, et al.,
2020). The four most prominent ISP roles in this literature were
facilitators (n = 22), consultants (n = 21), technical assistance (TA) providers (n = 12), and knowledge brokers (n = 10), followed by studies on coaches (n = 8), implementation teams (n = 4), and intermediaries (n = 3). Five studies examined a mix of different roles, and a further 14 studies either did not label the ISP role in focus (n = 5) or used other terms for it (n = 9).
Studies were primarily conducted in North America (United
States, n = 67; Canada, n = 15), followed by the UK (n = 7), Central Europe (n = 4), Australia (n = 2), and New Zealand (n = 1). One further study involved multiple international locations, whereas geographical information was missing from two studies. The vast majority of studies were conducted within health services (n = 67), and a smaller number related to social welfare (n = 15), education (n = 14), and crime and justice (n = 1). A mix of sectors was involved in two studies. The types of RSIs implemented and supported included research-supported practices (n = 40), programs (n = 28), guidelines (n = 11), and policies (n = 2). Quality improvement approaches were the focus of a further 18 studies.
All articles included were published between 2000 and 2019, with 2015 being the median year, confirming the newness of this field of inquiry and indicating a growing research interest in implementation support in more recent years. Quantitative (n = 48), qualitative (n = 31), and mixed methods designs (n = 30) were used across the publications included. Studies used quasi-experimental (n = 4), pre-post evaluation (n = 22), or case study methods (n = 25). Cross-sectional designs were used in 12 studies, and a small group of studies used a multiple baseline (n = 2), a single-case approach (n = 3), and a cohort design (n = 1). Finally, 30 studies were randomized controlled trials, with the majority being randomized at the cluster level (n = 24).
In assessing their quality (see the ERA for details), it was
apparent that less than 50% (n = 13) reported on the use of a
theory, model, or framework to inform the design of the
implementation support examined. This included the Reach,
Effectiveness, Adoption, Implementation, Maintenance
framework (n = 4), Getting to Outcomes (n = 2), the Consolidated Framework for Implementation Research (n = 1), the Promoting Action on Research Implementation in Health Services framework (n = 2), and the interactive systems framework (n = 1), together with organizational culture theory, the theory of planned behavior, the theory of continuous
quality improvement, and Rogers' diffusion of innovations
theory. However, the use of such theories and models was
often described in passing, with only a few study reports
providing detailed, theoretically informed explanations of
how a support intervention was anticipated to work (Peterson
et al., 2015; Quanbeck et al., 2018; Williams, Glisson, et al.,
2017). Only three studies included descriptions of the use of
local empirical evidence to ensure that the implementation
support studied would fit with local conditions, two of which
relied on previously collected data on the use of the interven-
tion (Acolet et al., 2011; Calo et al., 2018) and one on the
active codesign of the implementation support with stake-
holders who had developed the guideline to be implemented
locally (Quanbeck et al., 2018). No study referenced the
distinct use of a strategy taxonomy in designing the imple-
mentation support examined.
Implementation Strategies Used by ISPs
In the studies reviewed, ISPs used 37 of the 73 implementation
strategies that form the ERIC compilation (Powell, Waltz,
et al., 2015).
Of these, 18 emerged as particularly common because each was named and/or detailed in more than 10 studies. Table 2 lists these strategies, the number of publications in which they appeared, and the cluster of strategies they belong to, as defined in a previous study (Waltz et al., 2015) aimed at validating the ERIC implementation strategies and grouping them into clusters. The ERA (Albers, Bührmann, et al., 2020) contains a table that summarizes the use of these implementation strategies by ISP role, indicating no clear differences in strategy use between these roles and confirming that there are strong similarities in their key activities.
Table 2 shows that the majority of strategies described as being used by ISPs as part of their support work fell within the cluster of training and educating stakeholders (n = 8). These were followed by evaluative and iterative strategies (n = 4) and interpersonal strategies (n = 4). A further two strategies focused on adapting and tailoring. No strategies belonged to the clusters of "engaging consumers," "changing infrastructure," or "utilizing financial strategies."

Table 2. Common Implementation Strategies Reported as Being Used by Implementation Support Practitioners.

Implementation Strategy^a (Powell, Waltz, et al., 2015) | n Studies Describing Use of Strategy | Implementation Strategy Cluster (Waltz et al., 2015)
Provide consultation/facilitation [55-I]^b | 74 | Train and educate stakeholders/provide interactive assistance
Conduct educational meetings [15-II] | 53 | Train and educate stakeholders
Distribute educational materials [31-I] | 39 | Train and educate stakeholders
Conduct ongoing training [19-I] | 20 | Train and educate stakeholders
Conduct educational outreach visits [16-II] | 17 | Train and educate stakeholders
Make training dynamic [43-I] | 17 | Train and educate stakeholders
Develop educational materials [29-I] | 17 | Train and educate stakeholders
Create a learning collaborative [20-II] | 12 | Train and educate stakeholders
Audit and provide feedback [5-I] | 44 | Use evaluative and iterative strategies
Develop formal implementation blueprint [23-I] | 25 | Use evaluative and iterative strategies
Conduct local needs assessment [18-I] | 21 | Use evaluative and iterative strategies
Assess for readiness and identify barriers and facilitators [4-I] | 18 | Use evaluative and iterative strategies
Promote network weaving [52-III] | 22 | Develop stakeholder interrelationships
Model and simulate change [45-III] | 17 | Develop stakeholder interrelationships
Identify and prepare champions [35-I] | 15 | Develop stakeholder interrelationships
Organize clinician implementation team meetings [48-I] | 13 | Develop stakeholder interrelationships
Tailor strategies [63-I] | 37 | Adapt and tailor to context
Promote adaptability [51-I] | 19 | Adapt and tailor to context

^a This column only contains strategies described in 10 or more studies. Additional strategies (n = 19) identified in texts, but in fewer than 10 studies each, included the following: conduct local consensus discussions (n = 6); conduct cyclical small tests of change (n = 7); facilitate relay of clinical data to providers (n = 1); increase demand (n = 4); purposely reexamine the implementation (n = 8); develop and implement tools for quality monitoring (n = 6); develop and organize quality monitoring systems (n = 5); identify and prepare champions (n = 6); use train-the-trainer strategies (n = 6); recruit, designate, and train for leadership (n = 6); provide local technical assistance (n = 4); obtain and use patient/consumer and family feedback (n = 5); access new funding (n = 4); remind clinicians (n = 3); start a dissemination organization (n = 2); stage implementation scale up (n = 1); revise professional roles (n = 1); provide clinical supervision (n = 1); and centralize technical assistance (n = 1).
^b The Arabic numeral within square brackets is the number assigned to a strategy by Waltz et al. (2015). The Roman numeral describes the location of this strategy in one of four quadrants, each of which represents a particular combination of perceived strategy importance and feasibility in the study by Waltz et al. (2015). These numerals are used in this article to highlight the distribution of identified ISP strategies in Figure 1.

Research on Social Work Practice XX(X)
The analysis of the literature also showed that implementa-
tion support was multifaceted and, in most cases, built on the
combined use of multiple implementation strategies. An average of 4.5 different implementation strategies was reported as being used by ISPs across all randomized controlled trials (n = 30), with a maximum of nine and a minimum of two strategies (SD = 1.9). These represented the entire range of strategies listed in Table 2.
Opening the Black Box of Strategy Use by ISPs
Table 3 provides an overview of the aspects of strategy use that could be identified through the literature. Identifiable types of use for each strategy are listed together with techniques that were reported to support strategy operationalization and components that were apparent as forming strategy use across multiple publications. Since aspects of using strategies with an educational purpose (i.e., conduct educational meetings/ongoing training/outreach visits, develop/distribute educational materials, and create learning collaboratives) were very similar, these were integrated into one strategy labeled "education" in Table 3.
In the following, these aspects of strategy use are outlined in
greater detail.
Train and educate stakeholders. When ISPs provided consultation/facilitation, five common components emerged across studies: (1) identifying the support needs of those
involved in the consultation/facilitation efforts, for example,
through formalized, periodic needs assessments (Duffy et al.,
2012), structured interviews (Bice-Urbach & Kratochwill,
2016), or by explicitly inviting stakeholders to articulate their
support needs in each session (Akin, 2016; Chilenski et al.,
2016); (2) educating and professionally supporting these stake-
holders, for example, through processes such as learning from
others (Akin, 2016), role-plays (Barac et al., 2018), didactic
teaching (Beidas et al., 2013; Chaffin et al., 2016), answering
questions (Chilenski et al., 2016; Hurtubise et al., 2016; Kelly
et al., 2000), or offering advice (Rosen et al., 2012); (3) mon-
itoring the progress and/or performance of stakeholders, for
example, by measuring fidelity (Bice-Urbach & Kratochwill,
2016; Caron & Dozier, 2019; Eiraldi et al., 2018; Murray et al.,
2018), program outcomes (Funderburk et al., 2015; Olson
et al., 2018), or progress toward other implementation or ser-
vice goals (Chilenski et al., 2016; Holtrop et al., 2008; Preast &
Burns, 2018); (4) identifying implementation barriers and
problems faced as part of the change efforts, typically related
to learning a new practice (Dusenbury et al., 2010; Eiraldi
et al., 2018; Kauth et al., 2010; Nadeem, Gleacher, Pimentel,
et al., 2013) and/or enabling its implementation within a par-
ticular local context (Rosella et al., 2018; Saldana & Chamber-
lain, 2012; Tierney et al., 2014); and (5) identifying potential
solutions to these problems, including next steps to initiate
these. This final step was at times characterized as "troubleshooting" (Chaffin et al., 2016; Hodge et al., 2017; Meropol et al., 2014), signaling the more urgent and ad hoc character that this strategy could take. Other studies empha-
sized that consultation/facilitation depended on a climate con-
ducive to jointly identifying and solving problems among
multiple stakeholders (Dogherty et al., 2012; Hurlburt et al.,
2014), thereby indicating that it could take time for ISPs to
create adequate conditions for using this strategy.
Studies reflected that consultation/facilitation was provided
to individuals (Bradshaw et al., 2012; Rosen et al., 2012) as
well as groups of stakeholders (Kousgaard & Thorsen, 2012;
Murray et al., 2018) and occurred either in-person (Anyon
et al., 2016; Dobbins et al., 2018; Eiraldi et al., 2018), or
remotely (C. H. Brown et al., 2014; Gustafson et al., 2013;
Kauth et al., 2010). The use of videotaped work samples or the
direct observation of the work of those supported was described
in multiple studies as a key cross-component activity for ISPs
(Akin, 2016; Bice-Urbach & Kratochwill, 2016; C. H. Brown
et al., 2014; Caron & Dozier, 2019; Dusenbury et al., 2010;
Funderburk et al., 2015).
While the description of ISPs’ educational activities often
lacked detail, study reports displayed a broad spectrum of types of meetings, ranging from formal, comprehensive series of meetings to informal, quick check-ins. On the formal end of
the spectrum, ISPs conducted, for example, an “onsite 6-hour
experiential workshop” (Carson et al., 2014, p. S14), “a series
of six learning sessions” (Anaby et al., 2015, p. 3), a “two-day
initial training" (Bradshaw et al., 2012, p. 181), or "study days" (Gerrish et al., 2011, p. 2010). The use of less formal gatherings
aimed at educating stakeholders included, for example,
addressing the subject informally during lunch breaks
(Aasekjær et al., 2016, p. 35), offering one-to-one tutoring
(Bice-Urbach & Kratochwill, 2016; Kaasalainen et al., 2015),
integrating educational elements into an agency’s routine staff
meeting (Byrnes et al., 2018; Graaf et al., 2017), or initiating ad
hoc informal training sessions upon request by those supported
(Tierney et al., 2014). Study reports that presented the tech-
niques used to educate stakeholders in these meetings reflected
a consistent combination of didactic and dynamic, interactive
teaching elements (Becker et al., 2013; Beidas et al., 2012;
Brownson et al., 2007; Chaffin et al., 2016; Dobbins et al.,
2018; Ryba et al., 2017; Tierney et al., 2014; Yano et al.,
2008), the latter of which typically aimed at integrating the
concrete and individual/local work experience of those
supported.
Use evaluative and iterative strategies. The use of audit and feed-
back involved multiple activities for the ISP, including sourcing
relevant administrative or statistical data or self-collecting such
data (Burns et al., 2008; Dickinson et al., 2014; Jacobson et al.,
2019; Meropol et al., 2014), summarizing these data in relevant
and operational ways (Calo et al., 2018; Chaple & Sacks, 2016;
Chinman et al., 2018; Mold et al., 2008), presenting them to stakeholders (Holtrop et al., 2008; Jacobson et al., 2019), and
promoting a discussion of findings enabling stakeholders to set
priorities for improvements (McCullough et al., 2017; Preast &
Burns, 2018). Multiple studies also included descriptions of
ISPs tracking and assessing fidelity data to better understand
whether interventions were implemented as intended by their
developers (Brunette et al., 2008; Chinman et al., 2018; Eiraldi
et al., 2018; Gunderson et al., 2018; Kirchner et al., 2014),
highlighting that evaluative functions were included in imple-
mentation support.
ISPs also took part in assessing the needs for different clin-
ical or educational interventions (Dogherty et al., 2013; Fort-
ney et al., 2018; Ward et al., 2017; Waterman et al., 2015). This
could occur in formal ways, for example, in using a particular
method such as the nominal group technique (Quanbeck et al., 2018) or as part of purposely scheduled site visits (Anaby et al., 2015; Rosella et al., 2018). In addition, study reports included descriptions of informal approaches to needs assessments, necessary because stakeholders' support needs changed over time and needed to be continuously updated. In these cases, the identification of needs was routinely integrated into ISPs' observation of practice and the regular contact they had with those supported (Becker et al., 2013; Duffy et al., 2012; Rivard et al., 2010).

Table 3. Types, Techniques, and Components of ISPs' Implementation Strategy Use.

Implementation Strategy | Cluster | Aspects of Strategy Use for ISPs
Provide consultation/facilitation | Train and educate stakeholders/provide interactive assistance | Types: individual versus group, in-person versus remote. Five components: (1) identify support needs, (2) educate and professionally support stakeholders, (3) monitor progress/performance, (4) identify barriers/facilitators, and (5) identify solutions and initiate these. Cross-component ingredient: assessing/observing work samples
Education^a | Train and educate stakeholders | Types: formal versus informal, didactic versus dynamic/interactive, one-off versus ongoing, and teacher versus stakeholder focused. Central: integration of individual/local work experience
Audit and provide feedback | Use evaluative and iterative strategies | Five components: (1) source/self-collect data, (2) synthesize data, (3) present data, (4) enable data discussion for decision making, and (5) track/monitor data
Develop formal implementation blueprint | Use evaluative and iterative strategies | Types: higher order (e.g., project charters) versus scoped plans (e.g., coping plans, implementation support plans). Three phases: (1) development, (2) monitoring, and (3) adjustment
Conduct local needs assessment | Use evaluative and iterative strategies | Types: distinct/separate versus integrated, initial versus ongoing. Techniques: method-based (e.g., nominal group technique) versus free format
Assess for readiness and identify barriers and facilitators | Use evaluative and iterative strategies | Types: RSI implementation versus other readiness (e.g., for implementation support or training). Techniques: stakeholder survey, formative evaluation, focus groups, and broader consultation
Promote network weaving | Develop stakeholder interrelationships | Types: link stakeholders to services/agencies of relevance to the implementation; enhance connectivity among stakeholders; develop own networks
Model and simulate change | Develop stakeholder interrelationships | Types: role modeling for skill building (e.g., through role-plays, formal shadowing, participant modeling); walk the talk = demonstrate desired behaviors in own implementation support practice
Identify and prepare champions | Develop stakeholder interrelationships | Types of champions: formal internal implementation leaders, wider stakeholder groups (e.g., steering committees), and government officials. Types of preparation: inform, engage, garner support, and advise
Organize clinician implementation team meetings | Develop stakeholder interrelationships | Types: team participation versus team support. Central: ensure diversity, enable learning/problem-solving, and utilize teams as part of implementation infrastructure
Tailor strategies | Adapt and tailor to context | Types: tailor implementation support to the needs of individuals/groups supported (based on, e.g., stage of professional development, organizational roles, and preferences), the inner context (based on, e.g., organizational climate and culture, policies, priorities, and resources), and the outer context (based on, e.g., partner organizations, availability of other services). Techniques: adjust the intensity of implementation support, its focus, delivery mode, resources, tools, or its documentation
Promote adaptability | Adapt and tailor to context | Types: adaptation to account for target populations' clinical needs; stakeholders' interests, priorities, and cultural preferences; and availability of local resources. Five components: (1) assess adaptation needs; (2) source evidence to guide adaptation; (3) translate and apply this evidence; (4) design adaptation; and (5) document, track, and assess adaptation results

Note. ISP = implementation support practitioner.
^a "Education" covers aspects identified across the following strategies: conduct educational meetings, distribute educational materials, conduct ongoing training, conduct educational outreach visits, make training dynamic, develop educational materials, and create a learning collaborative.
A slight broadening of scope could be observed for the
strategy to develop a formal implementation blueprint. Origi-
nally defined with a focus on planning the implementation of a
clinical intervention, the literature documents that ISPs were
also involved in developing both more targeted plans in the
form of, for example, “coaching delivery plans” (Rushovich
et al., 2015, p. 373) or “coping plans” (Sanetti et al., 2018,
p. 52), describing how to handle potential implementation bar-
riers, and higher level plans described as, for example, “project
charters” (Lavoie-Tremblay et al., 2012, p. 421). In only a few
cases did study reports outline what this planning implied, for
example, using "a checklist [...] to guide the decision-making
process (e.g., identifying [intervention] target population and
exclusion criteria, [intervention] staff members’ roles and
responsibilities, and how to monitor the implementation
process)” (Ritchie et al., 2017, p. 5).
Multiple studies described ISPs as being involved in readi-
ness assessments, which implied ISPs surveying (Holtrop et al.,
2008; Russell et al., 2010) or interviewing stakeholders (Peterson et al., 2015), applying "formative evaluation techniques" (Ritchie et al., 2017, p. 5), or conducting a series of readiness
calls (Brown et al., 2014). The elements of readiness assess-
ments and tools used to conduct them were detailed in only a
few studies (Russell et al., 2010; Waterman et al., 2015). Other
studies reflected that readiness assessments could have a par-
ticular scope and focus on, for example, identifying stake-
holders’ readiness for training (Gerrish et al., 2011; Worton
et al., 2018) or implementation support in general (Feinberg
et al., 2008; Peterson et al., 2015; Yazejian et al., 2019).
Develop stakeholder interrelationships. The importance of net-
work weaving to the work of ISPs was reflected in poignant
terminology used in study reports, characterizing them as a
bridge (Elnitsky et al., 2015), linking agent (Dogherty et al.,
2012), boundary spanner (Dogherty et al., 2012; Graaf et al.,
2017), connector (Franks & Bory, 2015), liaison (Dogherty
et al., 2012), and convener (Worton et al., 2018). ISPs were
described as fulfilling this function in two ways. Firstly, at a
basic level, they shared their knowledge about relevant services
and agencies that those they supported could benefit from in
their own practice. This implied, for example, connecting clin-
ical practices to available community services (Brown, Elliott,
& Leatherdale, 2018; Brown, Elliott, Robertson-Wilson, et al.,
2018a, 2018b; Mader et al., 2016). Secondly, at a more
advanced level, ISPs worked to overcome barriers to stake-
holder connectivity, thereby bringing individuals, organiza-
tions, and systems together for collaborative processes that
these otherwise would struggle to establish themselves (Water-
man et al., 2015; Worton et al., 2018; Yazejian et al., 2019).
This second type of network weaving was described as requir-
ing greater neutrality from ISPs (Worton et al., 2018), enabling
them to consider perspectives that exist outside of the context
that received their direct implementation support.
Through these types of network weaving, ISPs represented a
connection resource to their stakeholders that they themselves
needed to maintain and expand continuously. This was
reflected in studies describing ISPs as needing to regularly
develop their own networks as a form of social capital that
was of general benefit to their work and helped to, for example,
ensure that later implementation activities ran more smoothly
or crucial information was accessible when needed (Waterman
et al., 2015).
In using change modeling as an implementation strategy,
ISPs pursued primarily two purposes: Firstly, to build specific
skills in those supported, for example, clinical skills required
by practitioners to deliver an intervention. These were taught
through, for example, role-plays (Akin, 2016; Barac et al.,
2018; Caron & Dozier, 2019; Dogherty et al., 2012), formal
shadowing (Gerrish et al., 2011), walk-throughs (Jacobson
et al., 2019), or participant modeling (Funderburk et al.,
2015; Graaf et al., 2017; Kinley et al., 2014; Sanetti et al.,
2018). Secondly, ISPs were described as using the exact same
techniques, which they wanted their stakeholders to use with
patients or clients, in their own implementation support. In a
study of consultation, ISPs used the principles of motivational
interviewing (MI) in their support of clinicians learning to
apply MI in their clinical practice (Barac et al., 2018). Simi-
larly, in another study, knowledge brokers used the principles
of EBP in as many aspects of their support work as possible
(Hurtubise et al., 2016).
In identifying and preparing champions for implementation,
ISPs collaborated with formal leaders, labeled as “clinical
leads” (Acolet et al., 2011), “senior management” (Dobbins
et al., 2018), or just “leadership” (Chaffin et al., 2016; McCul-
lough et al., 2017; Ritchie et al., 2017). A few examples also
pointed to ISPs connecting with, for example, full steering
committees (Brunette et al., 2008), boards (Ward et al.,
2017), or government departments (Dobbins et al., 2018). The
function of these connections was predominantly described as
to update champions, “ensuring the right individuals are
informed” (Dogherty et al., 2012, p. 9) about the state and
progress of the implementation. However, implicitly, this also
aimed to engage champions and to garner their support
(Kirchner et al., 2014; McCullough et al., 2017; Ritchie
et al., 2017). In addition, examples could be identified of ISPs being presented as explicit and proactive advisors to implementation or system leaders (Brunette et al., 2008; Chaffin et al., 2016).
Organizing clinician team meetings involved ISPs support-
ing and/or participating in teams explicitly built to shepherd the
implementation effort and labeled, for example, project team
(van der Zijpp et al., 2016), change team (Fortney et al., 2018;
Quanbeck et al., 2018), interdisciplinary facilitation team
(Lessard et al., 2016), improvement team (Dickinson et al.,
2014), community development team (Saldana & Chamber-
lain, 2012), or interagency collaborative team (Chaffin et al.,
2016). ISPs could be full members of these teams, with all
members providing different aspects of implementation sup-
port (Chaffin et al., 2016; Fortney et al., 2018), or they could
be in an assisting role in which they supported a team in its
local work to promote the implementation of an intervention
(Jacobson et al., 2019; Saldana & Chamberlain, 2012). This
means that the key function of this strategy, described in the
ERIC compilation as focused on enabling shared learning
among clinicians, had a broader scope in the ISP literature.
While shared learning and problem-solving was a goal for
teaming (Kaasalainen et al., 2015; Parchman et al., 2013;
Saldana & Chamberlain, 2012), for ISPs these teams also rep-
resented an infrastructure that made it possible to connect the
parts of a system that were involved in and affected by an
implementation and to include these in their support activities.
The following excerpt illustrates this for a change team that
received regular implementation support from consultants:
"Multidisciplinary representation on the team was important. [...] changing a workflow requires understanding the tasks performed by staff members in all occupations involved and securing their cooperation to make the change" (Jacobson et al., 2019, p. 5). The multidisciplinary nature of teams pointed
to in this quote also emerged as a characteristic from other
studies (Kaasalainen et al., 2015; Lessard et al., 2016), indicat-
ing that ISPs worked with teams with considerable professional
diversity, reflected by team members’ educational back-
grounds, professions, or organizational roles.
Adapt and tailor to context. In coding for tailoring strategies, the
focus was on identifying descriptions of how ISPs tailored their
own support, that is, the consultation, facilitation, or TA they
offered. While tailoring was the dominating term used in this
context, adapting, modifying, individualizing, fitting, custo-
mizing, or matching were alternative wordings used in study
reports. Close to a third of these studies (n = 13) provided little
detail on the factors informing the tailoring practice of ISPs,
while a further 24 studies included some, but still relatively
general, detail about what caused and informed tailoring
processes. These could be structured into two main groups.
The first group explained tailoring as a response to the needs
expressed by or identified for the individuals supported by
ISPs, for example, professional development needs (Akin,
2016; Becker et al., 2013; Dusenbury et al., 2010; Shernoff
et al., 2015), needs emerging from particular preferences
(Anaby et al., 2015; Barac et al., 2018; Dobbins et al., 2018),
or from individuals’ organizational role (Gerrish et al., 2011;
Gustafson et al., 2013; Rivard et al., 2010). The second group
of publications described factors in the inner setting in which
the implementation support was provided as causing the tailor-
ing, including organizational culture and climate (Aasekjær
et al., 2016; Anaby et al., 2015; Garbacz et al., 2016), policies
(Anaby et al., 2015; Brunette et al., 2008) and priorities (Kelly
et al., 2000), resources (Chinman et al., 2017), or structures
(Brunette et al., 2008; Garbacz et al., 2016). Only one study
pointed to outer setting factors, in the form of other health
initiatives and organizations operating in the external environ-
ment to an implementation, as a potential trigger for tailoring
(Tierney et al., 2014). Finally, in a study of intermediaries
(Chew et al., 2013), the professional skills and interests of the
ISP were described as leading to tailoring in a system in which
this role was newly established.
There was very little information about the elements of
implementation support that were tailored. Studies containing
such detail pointed to, for example, intensity and focus
(Meropol et al., 2014), delivery mode (Yazejian et al., 2019),
resources (Russell et al., 2010), tools (Rosella et al., 2018), or
documentation (Mackenzie et al., 2011) with more fine-grained
accounts being a rare exception (Quanbeck et al., 2018).
Finally, ISPs worked to support adaptation. These adapta-
tions aimed to address the clinical needs of specific target
populations (Beidas et al., 2013; Shernoff et al., 2015), or their
cultural preferences (Chaffin et al., 2016; Hurlburt et al., 2014),
and also to address the local context of the implementation,
such as resources (Saldana & Chamberlain, 2012) or the inter-
ests and preferences of individuals using a new intervention
(Rosella et al., 2018; Waterman et al., 2015). The specific
activities conducted by ISPs included surveying stakeholders
to assess whether interventions were deemed feasible and
acceptable and inform adaptation needs (Waterman et al.,
2015); sourcing, translating, and applying evidence to guide
adaptation (Beidas et al., 2013); liaising between intervention
developers and providers to enable adaptation (Chaffin et al.,
2016; Fortney et al., 2018; Saldana & Chamberlain, 2012);
helping to adapt guidelines, protocols, tools, and other inter-
vention resources used by providers and other key stakeholders
(Parchman et al., 2013; Quanbeck et al., 2018; Rosella et al.,
2018); and tracking and documenting adaptations to ensure
these were sufficiently captured (Yano et al., 2008).
Strategy Feasibility
The 18 common implementation strategies listed in Table 2 were also assigned a unique identifier in the form of Arabic numerals within square brackets. These identifiers were previously used in a study conducted by Waltz et al. (2015), aimed at locating each strategy in a diagram (Figure 1) by its degree of importance and feasibility.
That study, which was based on a concept mapping process
involving 35 experts, led to the positioning of each strategy in
one of four quadrants—labeled I–IV (the Roman numerals in
Table 2)—each of which represents a particular combination of
perceived strategy importance and feasibility. This is displayed
in Figure 1, in which all 18 ISP strategies presented above and listed in Table 2 have been circled in black. It shows that the vast majority of these (n = 13) fall into Quadrant I. This quadrant represents strategies that were rated as highly important and highly feasible. Three further ISP strategies fall into Quadrant II, representing strategies deemed as feasible as those in Quadrant I but less important. Finally, two strategies—promote network weaving and model and simulate change—belong to Quadrant III, which displays strategies viewed by experts as less feasible and less important. None of the central ISP strategies identified as part of this review fell into Quadrant IV, that is, none were strategies viewed as highly important but less feasible.
Other Implementation Strategies
While the ERIC strategies supported the extraction and coding
of substantial amounts of text material, not all ISP activities
could be covered through this compilation. Forty-three publications examining all ISP roles presented information that was initially coded as "other" and could be synthesized into two further strategies. While the literature on knowledge brokers contributed considerably to the identification of these strategies, studies covering other roles also described their use.
The first strategy was labeled “source, share, and translate
evidence of relevance to stakeholders involved in the
implementation.” ISPs applying this strategy were described
as “culling through the research” (Cameron et al., 2011,
p. 30), selecting journal articles, or measures (Anaby et al.,
2015; Hurtubise et al., 2016; Waterman et al., 2015), sharing
and/or summarizing these resources (Chew et al., 2013; Gerrish
et al., 2011), and helping to apply them (Dogherty et al., 2012;
Gerrish et al., 2011). This application could relate to the inter-
vention to be implemented (Beidas et al., 2013), but it could
also be used for advocacy and other efforts to influence policy
or practice (Franks & Bory, 2015). A small sample of studies
also highlighted a precondition of using this strategy, namely,
that ISPs needed to be avid and competent consumers of
research (Dogherty et al., 2013; Gerrish et al., 2012; Hurtubise
et al., 2016; Rivard et al., 2010).
The second strategy falling outside the ERIC compilation
was “contribute to intervention design” and refers to situations
in which ISPs were involved in developing clinical protocols
(Aasekjær et al., 2016; Holtrop et al., 2008), behavior support
plans (Bice-Urbach & Kratochwill, 2016; Sanetti et al., 2018),
guidelines (Byrnes et al., 2018; Gerrish et al., 2011), practice
policies (Lemelin et al., 2001), best practice models (Franks &
Bory, 2015), or evidenced interventions to be integrated into
clinical practice (Dobbins et al., 2018; Hurtubise et al., 2016;
Waterman et al., 2015). Few studies provided further detail on
the look and feel of this activity.
Discussion
This research identified 18 discrete implementation strategies
commonly used by ISPs when providing implementation sup-
port in human and social service settings. No clear differences
in strategy use could be identified across different ISP roles
(e.g., facilitators, knowledge brokers, TA providers, consul-
tants), indicating considerable similarity in their work and con-
firming that greater research integration in the field of
implementation support is relevant.
The range of strategies identified suggests that an ability to
train and educate stakeholders; to continuously monitor, eval-
uate, and adapt implementation; to develop stakeholder inter-
relationships; and to tailor one’s own implementation support
are central skills of ISPs. Furthermore, the reported variability
with which strategies were used by ISPs reflects that these
strategies can be further broken down into concrete activities,
components, and techniques. This highlights that ISPs also
require skill in operationalizing and detailing the implementation strategies that they decide to integrate into their implementation support.

Figure 1. The feasibility and importance of implementation support practitioner strategies.
Note. This figure is adapted from its original version and licensed under a Creative Commons Generic License (CC BY 4.0 OA). It is attributed to Waltz et al. (2015).

Albers et al. 11

Finally, with implementation support
being generally characterized by the use of multifaceted stra-
tegies, the ISP role demands mastery in selecting, combining,
and using multiple strategies. Mastery describes an aptitude for
accessing strategy knowledge on demand, flexibly linking and
tailoring this knowledge to situational and contextual condi-
tions, with the purpose of enabling sustainable learning and
skill building in others.
Financial strategies, infrastructure changes, and the engagement of consumers were absent from the ISP literature.
Two strategies—“source, share, and translate evidence of rele-
vance to stakeholders involved in the implementation” and
“contribute to intervention design”—were identified as not
being part of the ERIC compilation yet relevant to the work
of ISPs. These strategies are therefore suggested as additions to
the ERIC compilation.
Implications
In scholarly debates about the development of the social work
profession, implementation science has been introduced as a
field of inquiry that, due to its applied nature, can help to create
stronger bidirectional ties between research and practice and
thereby enhance the relevance and quality of social work as
well as its capacity to address societal problems (Bunger &
Hall, 2019; Cabassa, 2016; Gehlert et al., 2017). A central point
of attention in these debates is the connectedness of research
and practice, with learning collaboratives (Bunger et al., 2016;
Stephens et al., 2014), partnerships between academic and
social and human service organizations (McBeath et al.,
2019; Palinkas et al., 2017), and the research-minded practi-
tioner (DePanfilis, 2014; Liedgren, 2020) being among the
suggested solutions for how to create closer linkages between
these domains. With this review, we add a further potential
solution to this list—the ISP—and propose for it to be consid-
ered as a role that can actively bridge the research–practice gap
in social work and help human and social service organizations
to establish and facilitate the adaptive learning required
not only to implement ready-made RSIs but also to apply evi-
dence in the design and improvement of local practice (Mosley
et al., 2019).
However, knowledge about this role in social work is scarce, pointing to a need to develop programs,
initiatives, and funding structures at the system level to gener-
ate a broader experience with utilizing it in different forms and
contexts. As part of such initiatives, the specific conditions
and characteristics of social work and its providers—human
and social service organizations and their staff—should be
taken into account. This would include, for example, the central
role of peer influence (Wharton & Bolland, 2012; Wike et al.,
2014), of inter-agency networks and collaboration (Bergmark
et al., 2018; Palinkas et al., 2011, 2012), and of supervisors and
managers (Bäck et al., 2020; Bunger et al., 2019) for social
workers’ motivation and ability to integrate evidence into their
daily routines. These and other individual and organizational
factors form the fabric into which implementation support—be
it delivered by an individual or a team, an external or an inter-
nal unit—would need to be integrated.
At the level of service provision, the breadth and variability of ISP strategy use identified through this review, and the high level of skill this use demands, naturally raise the question of how to select, recruit, or develop professionals for ISP roles in human and social service settings.
With EBP, RSIs, and implementation science still not being
widely integrated into training and professional development
programs on the one hand and routine human and social ser-
vices on the other, highly skilled ISPs can be expected to
remain in short supply in the future. Moreover, even if an
organization identifies a single individual who is an experi-
enced practitioner and familiar with EBP as well as implemen-
tation support, relying solely on this one person would create
vulnerable implementation capacity at risk of disappearing
quickly in the event of staff turnover, budget cuts, or other
challenges that happen routinely in human and social service
agencies. The literature points to two potential pathways for
minimizing such vulnerability.
Providers of human and social services may want to con-
sider whether implementation support could be distributed
across a team whose members contribute different types
of skill, expertise, and experience. This thinking aligns with a
central point made in the literature on facilitation, which
emphasizes that facilitation represents both a role and a process
(Dogherty et al., 2010). This indicates that facilitation does not
necessitate the establishment of a single, formal facilitator role
and can take place as long as its key functions are appropriately
represented by different members of an organization. Within
human and social services, this means that implementation
support activities such as identifying and preparing champions,
consultation, or informing local opinion leaders could be per-
formed by different members of an implementation team. In
recent years, this team approach to implementation support has
been increasingly discussed in the literature (Higgins et al.,
2012; Metz & Bartley, 2020). It was also applied in a small
number of studies included in this review (Chaffin et al., 2016;
Hurlburt et al., 2014; Lessard et al., 2016) and described as
usable within as well as across organizations.
The intraorganizational model involves establishing an
internal implementation support team formed by an agency's
own staff (Lessard et al., 2016). In the cross-organizational
model, multiple agencies work together to establish the imple-
mentation support team, each contributing different personnel.
In this model, decision makers need to be prepared to balance
different organizational cultures, interests, and priorities that
exist among participating agencies. If these diverge sharply, tensions may emerge among stakeholders (Aarons
et al., 2014) and complicate the use of the team approach.
A second pathway toward developing implementation sup-
port roles can be to collaborate with an intermediary organiza-
tion specialized in providing implementation support. While
research on these intermediaries remains scarce (Proctor
et al., 2019), descriptive studies of their work to support the
implementation of RSIs in human and social services exist.
12 Research on Social Work Practice XX(X)
These confirm the positioning of intermediaries at the nexus of
research, practice, and policy and identify capacity building
as one of their key functions (Cheron et al., 2019; Isett & Hicks,
2019; Smits et al., 2020; Weaver et al., 2017). Human and
social service agencies can use intermediary staff as a tempo-
rary external resource available during implementation efforts
and working to build internal implementation support capacity
that, in the longer term, will make the agency independent of
the support provided by the intermediary.
Of interest in this context are data that were collected
through a recent survey (Proctor et al., 2019) administered to 54 intermediary and purveyor organizations in the United
States. The findings from this survey showed that these orga-
nizations used a range of 32 distinct implementation strategies
across five domains, indicating a greater breadth of strategy use
than identified through this review. This breadth was later con-
firmed as part of a program evaluation, reporting that interme-
diaries used 31 strategies in supporting the early
implementation of three different manualized RSIs in Australia
(Albers, Hateley-Browne, et al., 2020). One explanation for
this difference may be the underreporting of strategy use in the
studies included in this review, a challenge commonly
acknowledged in the literature (Bunger et al., 2017; Hooley
et al., 2020; Pinnock et al., 2017b; Varsi et al., 2019). Further-
more, both of the above studies included implementation sup-
port as delivered by purveyors, that is, companies focused “on
the dissemination of a specific EBP [evidence-based practice]
with the goal of implementing the EBP with fidelity and good
effect” (Franks & Bory, 2015, p. 43). As they depend on the
success of their products, these companies may tend to inten-
sify their implementation support and in doing so draw on a
greater number of strategies. However, the numerical differ-
ences in strategy use may also reflect that implementation sup-
port capacity, when professionalized and institutionalized
within the organizational settings of intermediaries, makes it
possible to develop expertise in applying a broader range of
strategies—for example, because a more diverse set of knowl-
edge and skills is available across multiple ISPs working for an
intermediary; their access to contacts and networks in acade-
mia, practice, or the policy sphere is broader; or their pooled
funding allows for testing and developing a broader range of
implementation support activities. If so, the collaboration with
an intermediary may provide opportunities for tapping into
specialized expertise that would be difficult to generate by a
single human or social service agency alone.
For intermediary organizations, this presupposes that their
staff are sufficiently skilled in applying implementation strate-
gies, that is, in selecting, operationalizing, designing, and tai-
loring them. While this kind of strategy work requires further
investigation and therefore continues to be a high priority for
implementation scientists (Powell et al., 2019), scholars con-
firm its dependency on appropriate methods that allow for, for
example, the integration of multiple stakeholder perspectives,
the identification of crucial barriers to implementation, or the
assessment of available resources for strategy development.
Concept and intervention mapping, group model building, and
conjoint analysis have been identified as such methods
(Fernandez et al., 2019; Powell, Beidas, et al., 2015), all of
which should belong to an ISP’s toolbox. In addition, the use
of theory has been highlighted as an important feature of imple-
mentation strategy design (Lyon et al., 2019), further adding to
the knowledge and skill level required by ISPs and highlighting
the importance of continuously promoting their professional
development alongside progress in implementation science.
Given their organizational capacity, intermediaries may be in
a particularly strong position to meet these needs, but human and social service organizations developing internal implementation support roles or teams should also be aware of them and consider how this skill building can become routine practice.
Within the field of implementation science, the range of
implementation strategies reported to be used by ISPs should
remain a topic for further investigation because it raises a num-
ber of critical questions.
At the basic level, the identification of strategies that are not
currently included in the ERIC compilation—“source, share,
and translate evidence of relevance to stakeholders involved in
the implementation” and “contribute to intervention design”—
suggests that there may be a need to review the compilation and
to consider whether it fully reflects the realities of implemen-
tation as it is practiced and researched in different countries
today. The compilation was developed at an earlier stage of implementation science, and the broader implementation literature and experience now available may help to refine it further. For example, adding strategies could improve the
compilation and make it more relevant to social work as new
strategies may relate more to interventions used in this profes-
sion. Furthermore, given a growth in knowledge about how
strategies are used in practice, they could be more clearly deli-
neated from each other in a new version of the compilation,
thereby enhancing its clarity and usefulness for both research
and practice.
The general literature on implementation strategies has also
discussed the persistent challenges of implementers in appro-
priately matching implementation strategies with implementa-
tion barriers, reflected in difficulties with identifying relevant
implementation strategies, and with using these with the proper
frequency, intensity, and fidelity required to achieve their
intended benefit (Eisman et al., 2019; Powell et al., 2019,
2020; Waltz et al., 2019). Given that the findings of this review
indicate that ISPs primarily apply more feasible strategies, it is
relevant to ask why financial strategies or strategies aimed at
changing infrastructure or engaging consumers were less pres-
ent in the literature. One explanation might be that the strate-
gies used were deemed to be most appropriate, given the
context in which support was provided. However, it could also
reflect that they were “more immediate and concrete and [...]
potentially more in the control of those tasked with supporting
change” (Waltz et al., 2015, p. 6). This latter interpretation
suggests that ISPs may have to neglect potentially effective
support strategies if they are out of their control, for example,
because their use would require the continuous engagement of
senior management or deeper structural changes in an organi-
zation. If this is a characteristic of implementation support
work, then ISPs—rather than challenging the values, norms,
or power structures of the systems in which they work—may be
at risk of just “conforming to existing ways of doing things”
(Kislov et al., 2016, p. 472). This warrants a deeper examina-
tion of the use of strategies by ISPs in order to better understand
their potential as true agents of change. Relevant research ques-
tions to address in this context would be, for example:
What is the rationale underpinning ISPs’ use of particular implementation support strategies?
How does the use of strategies by ISPs change when their roles are positioned or set up differently within a service system (e.g., internal vs. external roles, within vs. outside of leadership structures)?
What characterizes unsuccessful attempts at using particular implementation support strategies?
Finally, in confirming that the use of strategies did not differ
substantially across different ISP roles, and in identifying two
additional implementation strategies commonly used across these
roles, this review highlights the value of greater implementation
research integration. Variability in terminology—centered on,
for example, the differences between dissemination, knowledge
mobilization, translation, and implementation—has been a characteristic of implementation science since its emergence in the
early 2000s (Graham et al., 2006; Khalil, 2016; Rabin et al.,
2008), partly due to the field’s multidisciplinary roots (Rabin
et al., 2008). While a certain level of variability in terminology
may be unavoidable, the investigation of different implementa-
tion support roles makes it visible that this variability also may
have a cost. Different phrases used to coin particular implemen-
tation support roles appear to have generated separate streams of
research as unconnected “schools of thought,” despite consider-
able similarities among them. The potential consequence within
science is a waste of research resources and an unnecessarily
fragmented knowledge base, leading to implications for decision
makers in policy and practice for whom the navigation and use of
this knowledge base becomes needlessly complicated. Future
research activities in this area of implementation science should
therefore be conducted from an integrative perspective, that is, draw on the broadest possible evidence base existing across multiple schools of thought and produce knowledge that is independent of particular ISP role labels.
Limitations
Multiple limitations have to be taken into account when con-
sidering the findings and implications of this study. Firstly,
while the searches conducted for this integrative review were systematic, not every study examining the work of particular ISP roles will have been captured, since concept saturation guided the search process. Hence, readers interested in
the detailed knowledge base that exists for specific ISP roles
should consult the literature for these roles separately. Further-
more, readers should keep in mind that implementation science
is a relatively new field of inquiry. Literature may exist in
which novel implementation terminology has not been used
but functions comparable to implementation support have been
described. The search terms applied in this review build on the
common language used in the field of implementation science
to characterize this support. Hence, studies published, for example, in other, unrelated fields of science or before this field emerged, may have been missed.
Secondly, the evidence base presented with this review con-
tains a relatively small share of studies conducted in human and
social service settings. The decision to take a cross-sector per-
spective was intentional in that the evidence base in the human
and social service sector was expected to be limited and hence
insufficient to achieve concept saturation. Against this back-
drop, some of the review findings will be applicable across
sectors, that is, social, health, and educational settings, while
the utilization of other findings will require a translation that
takes into account the specific realities of, for example, human
and social service organizations such as resource scarcity, high
rates of turnover, or fast-paced working routines.
Thirdly, the results from this review may be affected by an
underreporting of implementation strategies in the studies
included. While standards for reporting implementation studies
have been developed (Pinnock et al., 2017a), these are still
relatively new and not necessarily applied across eligible stud-
ies. The inclusion of diverse study designs—incorporating a
considerable number of detailed qualitative studies of implementation support—aimed to capture the broadest possible
range of strategies used by ISPs. However, readers should be
aware that not all aspects of strategy use may be presented here.
Finally, in using the ERIC strategy compilation for examin-
ing ISP strategy use, we chose a particular lens for our analysis.
The fact that some of its implementation strategies appeared to
be suitable to be merged and others could be further broken
down into detailed activities indicates that not all of its differ-
ent strategies may be fully discrete units that can be clearly
separated from each other. This raises a question about the
compilation’s maturity as an analytical tool and highlights the
importance of continually refining taxonomies as the field of
implementation science progresses. In the meantime, readers
should be aware of a certain blurriness in current implementa-
tion strategy definitions. Furthermore, advocates of relational
theory have criticized the often-gendered nature of the ways in
which we understand organizational phenomena, reflected in,
for example, overemphasizing the linearity of change processes
or the use of technical strategies in enabling such change
(Fletcher, 1998). Viewed from this perspective, the ERIC compilation would represent a masculine-biased view of implementation, which neglects the importance of, for example,
connection, interdependence and collectivity in implementa-
tion processes, and the potential of relational strategies aimed
at, for example, creating reciprocal dialogue among stake-
holders to improve the use of evidence in practice (Metz
et al., 2019). Further studies of ISPs may therefore benefit from
utilizing alternative theories and perspectives in their explora-
tion of this still relatively new role in implementation.
Conclusion
Different ISPs use a similar set of implementation strategies,
indicating considerable similarity in their work as well as their
required skill set and confirming that greater research integra-
tion in the field of implementation support is relevant. The
breadth and depth of strategies used by ISPs suggests that
strong professional development pathways are needed to
enable staff of human and social service and of intermediary
organizations to build, utilize, and offer implementation sup-
port skills. However, the limited range of implementation stra-
tegies applied by ISPs indicates that this role may be far from
reaching the level of change agency that stakeholders in policy
and practice would require for a system-wide move toward
implementation-informed EBP in human and social services.
Further development of the ISP role for this sector is therefore
urgently needed.
Authors’ Note
Cecilie Varsi is also affiliated with the European Implementation Collaborative, Søborg, Denmark.
Declaration of Conflicting Interests
The authors declared no potential conflicts of interest with respect to
the research, authorship, and/or publication of this article.
Funding
The authors received no financial support for the research, authorship,
and/or publication of this article.
ORCID iD
Bianca Albers https://orcid.org/0000-0001-9555-0547
References
Aarons, G. A., Fettes, D. L., Hurlburt, M., Palinkas, L. A., Gunderson,
L. M., Willging, C. E., & Chaffin, M. J. (2014). Collaboration,
negotiation, and coalescence for interagency-collaborative teams
to scale-up evidence-based practice. Journal of Clinical Child &
Adolescent Psychology,43(6), 915–928. https://doi.org/10.1080/
15374416.2013.876642
Aasekjær, K., Waehle, H. V., Ciliska, D., Nordtvedt, M. W., & Hjälmhult, E. (2016). Management involvement—A decisive condition
when implementing evidence-based practice. Worldviews on
Evidence-Based Nursing,13(1), 32–41. https://doi.org/10.1111/
wvn.12141
Acolet, D., Allen, E., Houston, R., Wilkinson, A. R., Costeloe, K., &
Elbourne, D. (2011). Improvement in neonatal intensive care unit
care: A cluster randomised controlled trial of active dissemination
of information. Archives of Disease in Childhood—Fetal and
Neonatal Edition,96, F434–F439. https://doi.org/10.1136/
adc.2010.207522
Akin, B. A. (2016). Practitioner views on the core functions of coach-
ing in the implementation of an evidence-based intervention in
child welfare. Children and Youth Services Review,68, 159–168.
https://doi.org/10.1016/j.childyouth.2016.07.010
Albers, B., Bührmann, L., Driessen, P., Bartley, L., & Varsi, C. (2020). Implementation support skills—Electronic Results Addendum. https://osf.io/9kfqr/
Albers, B., Hateley-Browne, J., Steele, T., Rose, V., Shlonsky, A., & Mildon, R. (2020). The early implementation of FFT-CW®, MST-Psychiatric®, and SafeCare® in Australia. Research on Social Work Practice, 30(6), 658–677. https://doi.org/10.1177/1049731520908326
Albers, B., Metz, A., & Burke, K. (2020). Implementation support
practitioners—A proposal for consolidating a diverse evidence
base. BMC Health Services Research,20(Article 368). https://
doi.org/10.1186/s12913-020-05145-1
Anaby, D., Korner-Bitensky, N., Law, M., & Cormier, I. (2015).
Focus on participation for children and youth with disabilities:
Supporting therapy practice through a guided knowledge transla-
tion process. British Journal of Occupational Therapy,78(7),
440–449. https://doi.org/10.1177/0308022614563942
Anyon, Y., Nicotera, N., & Veeh, C. A. (2016). Contextual influences
on the implementation of a schoolwide intervention to promote
students’ social, emotional, and academic learning. Children &
Schools,38(2), 81–88. https://doi.org/10.1093/cs/cdw008
Artman-Meeker, K., Fettig, A., Barton, E. E., Penney, A., & Zeng, S.
(2015). Applying an evidence-based framework to the early child-
hood coaching literature. Topics in Early Childhood Special Edu-
cation,35(3), 183–196. https://doi.org/10.1177/0271121415
595550
Bäck, A., Schwarz, U. von T., Hasson, H., & Richter, A. (2020).
Aligning perspectives?—Comparison of top and middle-level
managers’ views on how organization influences implementation
of evidence-based practice. British Journal of Social Work,50(4),
1126–1145. https://doi.org/10.1093/bjsw/bcz085
Barac, R., Kimber, M., Johnson, S., & Barwick, M. (2018). The effec-
tiveness of consultation for clinicians learning to deliver motiva-
tional interviewing with fidelity. Journal of Evidence-Informed
Social Work,15(5), 510–533. https://doi.org/10.1080/23
761407.2018.1480988
Barth, R. P., Lee, B. R., & Hodorowicz, M. T. (2017). Equipping the
child welfare workforce to improve the well-being of children.
Journal of Children’s Services,12(2/3), 211–220. https://doi.org/
10.1108/jcs-05-2017-0017
Becker, K. D., Bradshaw, C. P., Domitrovich, C., & Ialongo, N. S.
(2013). Coaching teachers to improve implementation of the good
behavior game. Administration and Policy in Mental Health and
Mental Health Services Research,40, 482–493. https://doi.org/
10.1007/s10488-013-0482-8
Beidas, R. S., Becker-Haimes, E. M., Adams, D. R., Skriner, L.,
Stewart, R. E., Wolk, C. B., Buttenheim, A. M., Williams, N. J.,
Inacker, P., Richey, E., & Marcus, S. C. (2017). Feasibility and
acceptability of two incentive-based implementation strategies for
mental health therapists implementing cognitive-behavioral ther-
apy: A pilot study to inform a randomized controlled trial. Imple-
mentation Science,12(Article 148). https://doi.org/10.1186/s13
012-017-0684-7
Beidas, R. S., Edmunds, J. M., Cannuscio, C. C., Gallagher, M.,
Downey, M. M., & Kendall, P. C. (2013). Therapists perspectives
on the effective elements of consultation following training.
Administration and Policy in Mental Health and Mental Health
Services Research,40(6), 507–517. https://doi.org/10.1007/
s10488-013-0475-7
Beidas, R. S., Edmunds, J. M., Marcus, S. C., & Kendall, P. C. (2012).
Training and consultation to promote implementation of an empiri-
cally supported treatment: A randomized trial. Psychiatric Ser-
vices,63(7), 660–665. https://doi.org/10.1176/appi.ps.201100401
Beidas, R. S., & Kendall, P. C. (2010). Training therapists in evidence-
based practice: A critical review of studies from a systems-
contextual perspective. Clinical Psychology Science and Practice,
17, 1–30. https://doi.org/10.1111/j.1468-2850.2009.01187.x
Bergmark, M., Bejerholm, U., & Markström, U. (2018). Implementa-
tion of evidence-based interventions: Analyzing critical compo-
nents for sustainability in community mental health services.
Social Work in Mental Health,17(2), 129–148. https://doi.org/
10.1080/15332985.2018.1511500
Bertram, R. M., Charnin, L. A., Kerns, S. E. U., & Long, A. C. J.
(2014). Evidence-based practices in North American MSW curri-
cula. Research on Social Work Practice,25(6), 737–748. https://
doi.org/10.1177/1049731514532846
Bertram, R. M., Choi, S.-W., & Elsen, M. (2018). Integrating imple-
mentation science and evidence-based practice into academic and
field curricula. Journal of Social Work Education,54(Suppl 1),
S20–S30. https://doi.org/10.1080/10437797.2018.1434441
Bhattacharyya, O., Reeves, S., & Zwarenstein, M. (2009). What is
implementation research? Research on Social Work Practice,
19(5), 491–502. https://doi.org/10.1177/1049731509335528
Bice-Urbach, B. J., & Kratochwill, T. R. (2016). Teleconsultation:
The use of technology to improve evidence-based practices in rural
communities. Journal of School Psychology,56, 27–43. https://
doi.org/10.1016/j.jsp.2016.02.001
Biegel, D. E., Kola, L. A., Ronis, R. J., Boyle, P. E., Reyes, C. M. D.,
Wieder, B., & Kubek, P. (2003). The Ohio substance abuse and
mental illness coordinating center of excellence: Implementation
support for evidence-based practice. Research on Social Work
Practice,13(4), 531–545. https://doi.org/10.1177/1049731503
013004007
Blume, B. D., Ford, J. K., Baldwin, T. T., & Huang, J. L. (2010).
Transfer of training: A meta-analytic review. Journal of Manage-
ment,36(4), 1065–1105. https://doi.org/10.1177/01492063093
52880
Bradshaw, C. P., Pas, E. T., Goldweber, A., Rosenberg, M. S., & Leaf,
P. J. (2012). Integrating school-wide Positive Behavioral Interven-
tions and Supports with tier 2 coaching to student support teams:
The PBISplus model. Advances in School Mental Health Promo-
tion,5(3), 177–193. https://doi.org/10.1080/1754730x.2012.
707429
Brown, C. H., Chamberlain, P., Saldana, L., Padgett, C., Wang, W., &
Cruden, G. (2014). Evaluation of two implementation strategies in
51 child county public service systems in two states: Results of a
cluster randomized head-to-head implementation trial. Implemen-
tation Science,9(Article 134). https://doi.org/10.1186/s13012-
014-0134-8
Brown, K. M., Elliott, S. J., & Leatherdale, S. T. (2018). Researchers
supporting schools to improve health: Influential factors and
outcomes of knowledge brokering in the COMPASS study.
Journal of School Health,88(1), 54–64. https://doi.org/10.1111/
josh.12578
Brown, K. M., Elliott, S. J., Robertson-Wilson, J., Vine, M. M., &
Leatherdale, S. T. (2018a). Can knowledge exchange support the
implementation of a health-promoting schools approach? Per-
ceived outcomes of knowledge exchange in the COMPASS study.
BMC Public Health,18(Article 351). https://doi.org/10.1186/
s12889-018-5229-8
Brown, K. M., Elliott, S. J., Robertson-Wilson, J., Vine, M. M., &
Leatherdale, S. T. (2018b). “Now what?” Perceived factors influ-
encing knowledge exchange in school health research. Health
Promotion Practice,19(4), 590–600. https://doi.org/10.1177/
1524839917732037
Brownson, R. C., Ballew, P., Brown, K. L., Elliott, M. B., Haire-Joshu,
D., Heath, G. W., & Kreuter, M. W. (2007). The effect of disse-
minating evidence-based interventions that promote physical
activity to health departments. American Journal of Public Health,
97(10), 1900–1907. https://doi.org/10.2105/ajph.2006.090399
Brownson, R. C., Colditz, G. A., & Proctor, E. K. (2018). Dissemina-
tion and implementation research in health. Oxford University
Press. https://doi.org/10.1093/oso/9780190683214.001.0001
Brunette, M. F., Asher, D., Whitley, R., Lutz, W. J., Wieder, B. L.,
Jones, A. M., & McHugo, G. J. (2008). Implementation of inte-
grated dual disorders treatment: A qualitative analysis of facilita-
tors and barriers. Psychiatric Services, 59(9), 989–995. https://
doi.org/10.1176/ps.2008.59.9.989
Brunton, G., Stansfield, C., & Thomas, J. (2012). Finding relevant
studies. In D. Gough, S. Oliver, & J. Thomas (Eds.), An introduc-
tion to systematic reviews (pp. 107–134). Sage.
Bumbarger, B. K., & Campbell, E. M. (2011). A state agency–univer-
sity partnership for translational research and the dissemination of
evidence-based prevention and intervention. Administration and
Policy in Mental Health and Mental Health Services Research,
39(4), 268–277. https://doi.org/10.1007/s10488-011-0372-x
Bunger, A. C., Birken, S. A., Hoffman, J. A., MacDowell, H., Choy-
Brown, M., & Magier, E. (2019). Elucidating the influence of
supervisors’ roles on implementation climate. Implementation
Science,14(Article 93). https://doi.org/10.1186/s13012-019-
0939-6
Bunger, A. C., & Hall, R. L. (2019). Implementation science and
human service organizations research: Opportunities and chal-
lenges for building on complementary strengths. Human Service
Organizations: Management, Leadership & Governance,43(4),
258–268. https://doi.org/10.1080/23303131.2019.1666765
Bunger, A. C., Hanson, R. F., Doogan, N. J., Powell, B. J., Cao, Y., &
Dunn, J. (2016). Can learning collaboratives support implementa-
tion by rewiring professional networks? Administration and Policy
in Mental Health and Mental Health Services Research,43(1),
79–92. https://doi.org/10.1007/s10488-014-0621-x
Bunger, A. C., Powell, B. J., Robertson, H. A., MacDowell, H., Bir-
ken, S. A., & Shea, C. M. (2017). Tracking implementation stra-
tegies: A description of a practical approach and early findings.
Health Research Policy and Systems,15(Article 15). https://
doi.org/10.1186/s12961-017-0175-y
16 Research on Social Work Practice XX(X)
Burns, M. K., Peters, R., & Noell, G. H. (2008). Using performance
feedback to enhance implementation fidelity of the problem-
solving team process. Journal of School Psychology,46(5),
537–550. https://doi.org/10.1016/j.jsp.2008.04.001
Byrnes, A., Young, A., Mudge, A., Banks, M., Clark, D., & Bauer, J.
(2018). Prospective application of an implementation framework
to improve postoperative nutrition care processes: Evaluation of a
mixed methods implementation study. Nutrition & Dietetics,
75(4), 353–362. https://doi.org/10.1111/1747-0080.12464
Cabassa, L. J. (2016). Implementation science: Why it matters for the
future of social work. Journal of Social Work Education,52(Suppl
1), S38–S50. https://doi.org/10.1080/10437797.2016.1174648
Calo, W. A., Gilkey, M. B., Leeman, J., Heisler-MacKinnon, J.,
Averette, C., Sanchez, S., Kornides, M. L., & Brewer, N. T.
(2018). Coaching primary care clinics for HPV vaccination quality
improvement: Comparing in-person and webinar implementation.
Translational Behavioral Medicine,9(1), 23–31. https://doi.org/
10.1093/tbm/iby008
Cameron, D., Russell, D. J., Rivard, L., Darrah, J., & Palisano, R.
(2011). Knowledge brokering in children’s rehabilitation organi-
zations: Perspectives from administrators. Journal of Continuing
Education in the Health Professions,31(1), 28–33. https://doi.org/
10.1002/chp.20098
Carey, G., Dickinson, H., & Olney, S. (2019). What can feminist
theory offer policy implementation challenges? Evidence &
Policy: A Journal of Research, Debate and Practice,15(1),
143–159. https://doi.org/10.1332/174426417x14881935664929
Caron, E., & Dozier, M. (2019). Effects of fidelity-focused consulta-
tion on clinicians’ implementation: An exploratory multiple base-
line design. Administration and Policy in Mental Health and
Mental Health Services Research,46(4), 445–457. https://
doi.org/10.1007/s10488-019-00924-3
Carson, R. L., Castelli, D. M., Kuhn, A. C. P., Moore, J. B., Beets, M.
W., Beighle, A., Aija, R., Calvert, H. G., & Glowacki, E. M.
(2014). Impact of trained champions of comprehensive school
physical activity programs on school physical activity offerings,
youth physical activity and sedentary behaviors. Preventive Med-
icine,69, S12–S19. https://doi.org/10.1016/j.ypmed.2014.08.025
Chadwick, N., Dewolf, A., & Serin, R. (2015). Effectively training
community supervision officers—A meta-analytic review of the
impact on offender outcome. Criminal Justice and Behavior,
42(10), 977–989. https://doi.org/10.1177/0093854815595661
Chaffin, M. J., Hecht, D., Aarons, G. A., Fettes, D. L., Hurlburt, M., &
Ledesma, K. (2016). EBT fidelity trajectories across training
cohorts using the interagency collaborative team strategy. Admin-
istration and Policy in Mental Health and Mental Health Services
Research,43, 144–156. https://doi.org/10.1007/s10488-015-
0627-z
Chaple, M., & Sacks, S. (2016). The impact of technical assistance and
implementation support on program capacity to deliver integrated
services. The Journal of Behavioral Health Services & Research,
43(1), 3–17. https://doi.org/10.1007/s11414-014-9419-6
Cheron, D. M., Chiu, A. A. W., Stanick, C. F., Stern, H. G.,
Donaldson, A. R., Daleiden, E. L., & Chorpita, B. F. (2019). Imple-
menting evidence based practices for children’s mental health: A
case study in implementing modular treatments in community
mental health. Administration and Policy in Mental Health and
Mental Health Services Research,46, 391–410. https://doi.org/
10.1007/s10488-019-00922-5
Chew, S., Armstrong, N., & Martin, G. (2013). Institutionalising
knowledge brokering as a sustainable knowledge translation solu-
tion in healthcare: How can it work in practice? Evidence &
Policy: A Journal of Research, Debate and Practice,9(3),
335–351. https://doi.org/10.1332/174426413x662734
Chilenski, S. M., Perkins, D. F., Olson, J., Hoffman, L., Feinberg, M.
E., Greenberg, M., Welsh, J., Crowley, D. M., & Spoth, R. (2016).
The power of a collaborative relationship between technical assis-
tance providers and community prevention teams: A correlational
and longitudinal study. Evaluation and Program Planning,54,
19–29. https://doi.org/10.1016/j.evalprogplan.2015.10.002
Chinman, M. J., Ebener, P., Malone, P. S., Cannon, J., D’Amico, E. J.,
& Acosta, J. (2018). Testing implementation support for evidence-
based programs in community settings: A replication cluster-
randomized trial of Getting To Outcomes®. Implementation
Science, 13(Article 131). https://doi.org/10.1186/s13012-018-0825-7
Chinman, M. J., McCarthy, S., Hannah, G., Byrne, T. H., & Smelson,
D. A. (2017). Using getting to outcomes to facilitate the use of
an evidence-based practice in VA homeless programs: A
cluster-randomized trial of an implementation support strategy.
Implementation Science,12(Article 34). https://doi.org/10.1186/
s13012-017-0565-0
Cranley, L. A., Cummings, G. G., Profetto-McGrath, J., Toth, F., &
Estabrooks, C. A. (2017). Facilitation roles and characteristics
associated with research use by healthcare professionals: A scop-
ing review. BMJ Open,7(8), e014384. https://doi.org/10.1136/
bmjopen-2016-014384
DePanfilis, D. (2014). Back to the future: Using social work research
to improve social work practice. Journal of the Society for Social
Work and Research,5(1), 1–21. https://doi.org/10.1086/675852
Devaney, J., Hayes, D., & Spratt, T. (2017). The influences of training
and experience in removal and reunification decisions involving
children at risk of maltreatment: Detecting a “beginner dip”. The
British Journal of Social Work,47(8), 2364–2383. https://doi.org/
10.1093/bjsw/bcw175
Dickinson, W. P., Dickinson, L. M., Nutting, P. A., Emsermann, C. B.,
Tutt, B., Crabtree, B. F., Fisher, L., Harbrecht, M., Gottsman, A., &
West, D. R. (2014). Practice facilitation to improve diabetes care in
primary care: A report from the EPIC randomized clinical trial.
The Annals of Family Medicine,12(1), 8–16. https://doi.org/10.13
70/afm.1591
Dimeff, L. A., Harned, M. S., Woodcock, E. A., Skutch, J. M.,
Koerner, K., & Linehan, M. M. (2015). Investigating bang for your
training buck: A randomized controlled trial comparing three
methods of training clinicians in two core strategies of dialectical
behavior therapy. Behavior Therapy,46, 283–295. https://doi.org/
10.1016/j.beth.2015.01.001
Dobbins, M., Traynor, R. L., Workentine, S., Yousefi-Nooraie, R., &
Yost, J. (2018). Impact of an organization-wide knowledge trans-
lation strategy to support evidence-informed public health decision
making. BMC Public Health,18(Article 1412). https://doi.org/
10.1186/s12889-018-6317-5
Albers et al. 17
Dogherty, E. J., Harrison, M. B., Baker, C., & Graham, I. D. (2012).
Following a natural experiment of guideline adaptation and early
implementation: A mixed-methods study of facilitation. Implemen-
tation Science,7(Article 9). https://doi.org/10.1186/1748-5908-7-9
Dogherty, E. J., Harrison, M. B., & Graham, I. D. (2010). Facilitation
as a role and process in achieving evidence-based practice in nur-
sing: A focused review of concept and meaning. Worldviews on
Evidence-Based Nursing,7(2), 76–89. https://doi.org/10.1111/
j.1741-6787.2010.00186.x
Dogherty, E. J., Harrison, M. B., Graham, I. D., Vandyk, A. D., &
Keeping-Burke, L. (2013). Turning knowledge into action at the
point-of-care: The collective experience of nurses facilitating the
implementation of evidence-based practice. Worldviews on
Evidence-Based Nursing,10(3), 129–139. https://doi.org/
10.1111/wvn.12009
Drisko, J. W., & Grady, M. D. (2019). Evidence-based practice:
Teaching and supervision. In J. W. Drisko & M. D. Grady,
Evidence-based practice in clinical social work (pp. 281–295).
Springer Nature. https://doi.org/10.1007/978-3-030-15224-6_19
Duffy, J. L., Prince, M. S., Johnson, E. E., Alton, F. L., Flynn, S., Faye,
A. M., Padgett, P. E., Rollison, C., Becker, D., & Hinzey, A. L.
(2012). Enhancing teen pregnancy prevention in local commu-
nities: Capacity building using the interactive systems framework.
American Journal of Community Psychology,50, 370–385. https://
doi.org/10.1007/s10464-012-9531-9
Dunst, C. J., Annas, K., Wilkie, H., & Hamby, D. W. (2019). Scoping
review of the core elements of technical assistance models and
frameworks. World Journal of Education,9(2), 109–122. https://
doi.org/10.5430/wje.v9n2p109
Dusenbury, L., Hansen, W. B., Jackson-Newsom, J., Pittman, D. S.,
Wilson, C. V., Nelson-Simley, K., Ringwalt, C., Pankratz, M., &
Giles, S. M. (2010). Coaching to enhance quality of implementa-
tion in prevention. Health Education,110(1), 43–60. https://
doi.org/10.1108/09654281011008744
Edmunds, J. M., Beidas, R. S., & Kendall, P. C. (2013). Dissemination
and implementation of evidence–based practices: Training and
consultation as implementation strategies. Clinical Psychology
Science and Practice,20(2), 152–165. https://doi.org/10.1111/
cpsp.12031
Eiraldi, R., Mautone, J. A., Khanna, M. S., Power, T. J., Orapallo, A.,
Cacia, J., Schwartz, B. S., McCurdy, B., Keiffer, J., Paidipati, C.,
Kanine, R., Abraham, M., Tulio, S., Swift, L., Bressler, S. N.,
Cabello, B., & Jawad, A. F. (2018). Group CBT for externalizing
disorders in urban schools: Effect of training strategy on treatment
fidelity and child outcomes. Behavior Therapy,49, 538–550.
https://doi.org/10.1016/j.beth.2018.01.001
Eisman, A. B., Kilbourne, A. M., Dopp, A. R., Saldana, L., &
Eisenberg, D. (2019). Economic evaluation in implementation sci-
ence: Making the business case for implementation strategies. Psy-
chiatry Research,283, 112433. https://doi.org/10.1016/j.psychres.
2019.06.008
Ekeland, T.-J., Bergem, R., & Myklebust, V. (2018). Evidence-based
practice in social work: Perceptions and attitudes among Norwe-
gian social workers. European Journal of Social Work,22(4),
611–622. https://doi.org/10.1080/13691457.2018.1441139
Elledge, C., Avworo, A., Cochetti, J., Carvalho, C., & Grota, P.
(2018). Characteristics of facilitators in knowledge translation:
An integrative review. Collegian, 26, 171–182. https://doi.org/
10.1016/j.colegn.2018.03.002
Elnitsky, C. A., Powell-Cope, G., Besterman-Dahan, K. L., Rugs, D.,
& Ullrich, P. M. (2015). Implementation of safe patient handling in
the U.S. veterans health system: A qualitative study of internal
facilitators’ perceptions. Worldviews on Evidence-Based Nursing,
12(4), 208–216. https://doi.org/10.1186/1748-5908-6-99
Feinberg, M. E., Ridenour, T. A., & Greenberg, M. T. (2008). The
longitudinal effect of technical assistance dosage on the function-
ing of Communities That Care prevention boards in Pennsylvania.
The Journal of Primary Prevention,29(2), 145–165. https://
doi.org/10.1007/s10935-008-0130-3
Fernandez, M. E., Hoor, G. A. ten, Lieshout, S. van, Rodriguez, S. A.,
Beidas, R. S., Parcel, G., Ruiter, R. A. C., Markham, C. M., & Kok,
G. (2019). Implementation mapping: Using intervention mapping
to develop implementation strategies. Frontiers in Public Health,
7(Article 158). https://doi.org/10.3389/fpubh.2019.00158
Finne, J. (2020). Evidence-based practice in social work: Who are the
critics? Journal of Social Work. https://doi.org/10.1177/14680173
20955131
Fletcher, J. K. (1998). Relational practice—A feminist reconstruction
of work. Journal of Management Inquiry,7(2), 163–186. https://
doi.org/10.1177/105649269872012
Fortney, J. C., Pyne, J. M., Ward-Jones, S., Bennett, I. M., Diehl, J.,
Farris, K., Cerimele, J. M., & Curran, G. M. (2018). Implementa-
tion of evidence-based practices for complex mood disorders in
primary care safety net clinics. Families, Systems, & Health,36(3),
267–280. https://doi.org/10.1037/fsh0000357
Franks, R. P., & Bory, C. T. (2015). Who supports the successful
implementation and sustainability of evidence-based practices?
Defining and understanding the roles of intermediary and purveyor
organizations. New Directions for Child and Adolescent Develop-
ment,149, 41–56. https://doi.org/10.1002/cad.20112
Funderburk, B., Chaffin, M., Bard, E., Shanley, J., Bard, D., &
Berliner, L. (2015). Comparing client outcomes for two
evidence-based treatment consultation strategies. Journal of Clin-
ical Child & Adolescent Psychology,44(5), 730–741. https://
doi.org/10.1080/15374416.2014.910790
Gambrill, E. D. (1999). Evidence-based practice: An alternative to
authority-based practice. Families in Society: The Journal of Con-
temporary Social Services,80(4), 341–350. https://doi.org/
10.1606/1044-3894.1214
Garbacz, S. A., Watkins, N. D., Diaz, Y., Barnabas, E. R., Jr.,
Schwartz, B., & Eiraldi, R. (2016). Using conjoint behavioral con-
sultation to implement evidence-based practices for students in
low-income urban schools. Preventing School Failure: Alternative
Education for Children and Youth,61(3), 198–210. https://doi.org/
10.1080/1045988x.2016.1261078
Gehlert, S., Hall, K. L., & Palinkas, L. A. (2017). Preparing our next-
generation scientific workforce to address the Grand Challenges
for Social Work. Journal of the Society for Social Work and
Research,8(1), 119–136. https://doi.org/10.1086/690659
Gerrish, K., McDonnell, A., Nolan, M., Guillaume, L., Kirshbaum,
M., & Tod, A. (2011). The role of advanced practice nurses in
knowledge brokering as a means of promoting evidence-based
practice among clinical nurses. Journal of Advanced Nursing,
67(9), 2004–2014. https://doi.org/10.1111/j.1365-2648.2011.
05642.x
Gerrish, K., Nolan, M., McDonnell, A., Tod, A., Kirshbaum, M., &
Guillaume, L. (2012). Factors influencing advanced practice
nurses’ ability to promote evidence-based practice among frontline
nurses. Worldviews on Evidence-Based Nursing,9(1), 30–39.
https://doi.org/10.1111/j.1741-6787.2011.00230.x
Glisson, C., & Schoenwald, S. K. (2005). The ARC organizational and
community intervention strategy for implementing evidence-based
children’s mental health treatments. Mental Health Services
Research,7(4), 243–259. https://doi.org/10.1007/s11020-005-
7456-1
Goel, K., Hudson, C., & Cowie, J. (2018). Building research capacity
for social work practitioners: A regional perspective. Social Work
Education,37(8), 1028–1034. https://doi.org/10.1080/02615
479.2018.1481205
Graaf, G., McBeath, B., Lwin, K., Holmes, D., & Austin, M. J. (2017).
Supporting evidence-informed practice in human service organi-
zations: An exploratory study of link officers. Human Service
Organizations: Management, Leadership & Governance,41(1),
58–75. https://doi.org/10.1080/23303131.2016.1192575
Grady, M. D., Wike, T., Putzu, C., Field, S., Hill, J., Bledsoe, S. E.,
Bellamy, J., & Massey, M. (2017). Recent social work practi-
tioners’ understanding and use of evidence-based practice and
empirically supported treatments. Journal of Social Work Educa-
tion,54(1), 163–179. https://doi.org/10.1080/10437797.
2017.1299063
Graham, I. D., Logan, J., Harrison, M. B., Straus, S. E., Tetroe, J. M.,
Caswell, W., & Robinson, N. (2006). Lost in knowledge transla-
tion: Time for a map? Journal of Continuing Education in the
Health Professions,26(1), 13–24. https://doi.org/10.1002/chp.47
Greenhalgh, T., Thorne, S., & Malterud, K. (2018). Time to challenge
the spurious hierarchy of systematic over narrative reviews? Eur-
opean Journal of Clinical Investigation,48(6), e12931. https://
doi.org/10.1111/eci.12931
Grimshaw, J., Eccles, M. P., Lavis, J. N., Hill, S. J., & Squires, J. E.
(2012). Knowledge translation of research findings. Implementa-
tion Science,7(Article 50). https://doi.org/10.1186/1748-5908-
7-50
Grossman, R., & Salas, E. (2011). The transfer of training: What really
matters. International Journal of Training and Development,
15(2), 103–120. https://doi.org/10.1111/j.1468-2419.2011.00373.x
Gunderson, L. M., Willging, C. E., Jaramillo, E. M. T., Green, A. E.,
Fettes, D. L., Hecht, D. B., & Aarons, G. A. (2018). The good
coach: Implementation and sustainment factors that affect coach-
ing as evidence-based intervention fidelity support. Journal of
Children’s Services,13(1), 1–17. https://doi.org/10.1108/jcs-09-
2017-0043
Gustafson, D. H., Quanbeck, A. R., Robinson, J. M., Ford, J. H.,
Pulvermacher, A., French, M. T., McConnell, K. J., Batalden, P.
B., Hoffman, K. A., & McCarty, D. (2013). Which elements of
improvement collaboratives are most effective? A cluster-rando-
mized trial. Addiction,108(6), 1145–1157. https://doi.org/10.1111/
add.12117
Harvey, G., Loftus-Hills, A., Rycroft-Malone, J., Titchen, A., Kitson,
A. L., McCormack, B. G., & Seers, K. (2002). Getting evidence
into practice: The role and function of facilitation. Journal of
Advanced Nursing,37(6), 577–588. https://doi.org/10.1046/j.13
65-2648.2002.02126.x
Hecht, L., Buhse, S., & Meyer, G. (2016). Effectiveness of training in
evidence-based medicine skills for healthcare professionals: A sys-
tematic review. BMC Medical Education,16, Article 103. https://
doi.org/10.1186/s12909-016-0616-2
Higgins, M. C., Weiner, J., & Young, L. (2012). Implementation
teams: A new lever for organizational change. Journal of Organi-
zational Behavior,33(3), 366–388. https://doi.org/10.1002/
job.1773
Hodder, R. K., Wolfenden, L., Kamper, S. J., Lee, H., Williams, A.,
O’Brien, K. M., & Williams, C. M. (2016). Developing implemen-
tation science to improve the translation of research to address low
back pain: A critical review. Best Practice & Research Clinical
Rheumatology,30(6), 1050–1073. https://doi.org/10.1016/
j.berh.2017.05.002
Hodge, L. M., Turner, K. M. T., Sanders, M. R., & Forster, M. (2017).
Factors that influence evidence-based program sustainment for
family support providers in child protection services in disadvan-
taged communities. Child Abuse & Neglect,70, 134–145. https://
doi.org/10.1016/j.chiabu.2017.05.017
Hoens, A. M., & Li, L. C. (2014). The knowledge broker’s “fit” in the
world of knowledge translation. Physiotherapy Canada,66(3),
223–224. https://doi.org/10.3138/ptc.66.3.gee
Holtrop, J. S., Baumann, J., Arnold, A. K., & Torres, T. (2008). Nurses
as practice change facilitators for healthy behaviors. Journal of
Nursing Care Quality,23(2), 123–131. https://doi.org/10.1097/
01.ncq.0000313761.79396.37
Hooley, C., Amano, T., Markovitz, L., Yaeger, L., & Proctor, E.
(2020). Assessing implementation strategy reporting in the mental
health literature: A narrative review. Administration and Policy in
Mental Health and Mental Health Services Research,47, 19–35.
https://doi.org/10.1007/s10488-019-00965-8
Hopia, H., Latvala, E., & Liimatainen, L. (2016). Reviewing the meth-
odology of an integrative review. Scandinavian Journal of Caring
Sciences,30(4), 662–669. https://doi.org/10.1111/scs.12327
Hurlburt, M., Aarons, G. A., Fettes, D. L., Willging, C. E., Gunderson,
L. M., & Chaffin, M. J. (2014). Interagency collaborative team
model for capacity building to scale-up evidence-based practice.
Children and Youth Services Review,39, 160–168. https://doi.org/
10.1016/j.childyouth.2013.10.005
Hurtubise, K., Rivard, L., Héguy, L., Berbari, J., & Camden, C.
(2016). Virtual knowledge brokering: Describing the roles and
strategies used by knowledge brokers in a pediatric physiotherapy
virtual community of practice. Journal of Continuing Education in
the Health Professions,36(3), 186–194. https://doi.org/10.1097/
ceh.0000000000000101
Huzzard, T. (2020). Achieving impact: Exploring the challenge of
stakeholder engagement. European Journal of Work and Organi-
zational Psychology. https://doi.org/10.1080/1359432x.2020.
1761875
Isett, K. R., & Hicks, D. (2019). Pathways from research into public
decision making: Intermediaries as the third community.
Perspectives on Public Management and Governance,3(1), 45–58.
https://doi.org/10.1093/ppmgov/gvz020
Jackson, C. B., Brabson, L. A., Quetsch, L. B., & Herschell, A. D.
(2018). Training transfer: A systematic review of the impact of
inner setting factors. Advances in Health Sciences Education,24,
167–183. https://doi.org/10.1007/s10459-018-9837-y
Jacobson, N., Johnson, R., Deyo, B., Alagoz, E., & Quanbeck, A.
(2019). Systems consultation for opioid prescribing in primary
care: A qualitative study of adaptation. BMJ Quality & Safety,
28(5), 397–404. https://doi.org/10.1136/bmjqs-2018-008160
James, S., Lampe, L., Behnken, S., & Schulz, D. (2019). Evidence-
based practice and knowledge utilisation—A study of attitudes and
practices among social workers in Germany. European Journal of
Social Work,22(5), 763–777. https://doi.org/10.1080/13
691457.2018.1469475
Kaasalainen, S., Ploeg, J., Donald, F., Coker, E., Brazil, K.,
Martin-Misener, R., Dicenso, A., & Hadjistavropoulos, T.
(2015). Positioning clinical nurse specialists and nurse practi-
tioners as change champions to implement a pain protocol in
long-term care. Pain Management Nursing,16(2), 78–88. https://
doi.org/10.1016/j.pmn.2014.04.002
Kastner, M., Antony, J., Soobiah, C., Straus, S. E., & Tricco, A. C.
(2016). Conceptual recommendations for selecting the most appro-
priate knowledge synthesis method to answer research questions
related to complex evidence. Journal of Clinical Epidemiology,73,
43–49. https://doi.org/10.1016/j.jclinepi.2015.11.022
Katz, J., & Wandersman, A. (2016). Technical assistance to enhance
prevention capacity: A research synthesis of the evidence base.
Prevention Science,17, 417–428. https://doi.org/10.1007/s11121-
016-0636-5
Kauth, M. R., Sullivan, G., Blevins, D., Cully, J. A., Landes, R. D.,
Said, Q., & Teasdale, T. A. (2010). Employing external facilitation
to implement cognitive behavioral therapy in VA clinics: A pilot
study. Implementation Science,5(Article 75). https://doi.org/
10.1186/1748-5908-5-75
Kelly, J. A., Somlai, A. M., DiFranceisco, W. J., Otto-Salaj, L. L.,
McAuliffe, T. L., Hackl, K. L., Heckman, T. G., Holtgrave, D. R.,
& Rompa, D. (2000). Bridging the gap between the science and
service of HIV prevention: Transferring effective research-based
HIV prevention interventions to community AIDS service provi-
ders. American Journal of Public Health,90(7), 1082–1088.
https://doi.org/10.2105/ajph.90.7.1082
Khalil, H. (2016). Knowledge translation and implementation science:
What is the difference? International Journal of Evidence-Based
Healthcare,14(2), 39–40. https://doi.org/10.1097/xeb.000000
0000000086
Kinley, J., Stone, L., Dewey, M., Levy, J., Stewart, R., McCrone, P.,
Sykes, N., Hansford, P., Begum, A., & Hockley, J. (2014). The
effect of using high facilitation when implementing the Gold Stan-
dards Framework in care homes programme: A cluster randomised
controlled trial. Palliative Medicine,28(9), 1099–1109. https://
doi.org/10.1177/0269216314539785
Kirchner, J. E., Ritchie, M. J., Pitcock, J. A., Parker, L. E., Curran, G.
M., & Fortney, J. C. (2014). Outcomes of a partnered facilitation
strategy to implement primary care—mental health. Journal of
General Internal Medicine,29(Suppl 4), 904–912. https://
doi.org/10.1007/s11606-014-3027-2
Kislov, R., Hodgson, D., & Boaden, R. (2016). Professionals as
knowledge brokers: The limits of authority in healthcare collabora-
tion. Public Administration,94(2), 472–489. https://doi.org/
10.1111/padm.12227
Kitson, A. L., Harvey, G., & McCormack, B. G. (1998). Enabling the
implementation of evidence-based practice: A conceptual frame-
work. Quality in Health Care,7, 149–158. https://doi.org/10.1136/
qshc.7.3.149
Kousgaard, M. B., & Thorsen, T. (2012). Positive experiences with a
specialist as facilitator in general practice. Danish Medical Jour-
nal,59(6), A4443. https://ugeskriftet.dk/files/scientific_article_
files/2018-12/a4443.pdf
Lavoie-Tremblay, M., Richer, M., Marchionni, C., Cyr, G., Biron,
A. D., Aubry, M., Bonneville-Roussy, A., & Vézina, M. (2012).
Implementation of evidence-based practices in the context of a
redevelopment project in a Canadian healthcare organization.
Journal of Nursing Scholarship,44(4), 418–427. https://doi.org/
10.1111/j.1547-5069.2012.01480.x
Leeman, J., Calancie, L., Hartman, M. A., Escoffery, C. T., Herrmann,
A. K., Tague, L. E., Moore, A. A., Wilson, K. M., Schreiner, M., &
Samuel-Hodge, C. (2015). What strategies are used to build practi-
tioners’ capacity to implement community-based interventions and
are they effective? A systematic review. Implementation Science,
10(Article 80). https://doi.org/10.1186/s13012-015-0272-7
Leeman, J., Calancie, L., Kegler, M. C., Escoffery, C. T., Herrmann,
A. K., Thatcher, E., Hartman, M. A., & Fernandez, M. E. (2017).
Developing theory to guide building practitioners’ capacity to
implement evidence-based interventions. Health Education &
Behavior,44(1), 59–69. https://doi.org/10.1177/10901981
15610572
Lemelin, J., Hogg, W., & Baskerville, N. (2001). Evidence to action:
A tailored multifaceted approach to changing family physician
practice patterns and improving preventive care. Canadian Medi-
cal Association Journal,164(6), 757–763. https://www.cmaj.ca/
content/164/6/757.short
Lery, B., Wiegmann, W., & Berrick, J. D. (2015). Building an
evidence-driven child welfare workforce: A university-agency
partnership. Journal of Social Work Education,51(Suppl 2),
S283–S298. https://doi.org/10.1080/10437797.2015.1073080
Lessard, S., Bareil, C., Lalonde, L., Duhamel, F., Hudon, E.,
Goudreau, J., & Lévesque, L. (2016). External facilitators and
interprofessional facilitation teams: A qualitative study of their roles in
supporting practice change. Implementation Science,11(Article
97). https://doi.org/10.1186/s13012-016-0458-7
Liedgren, P. (2020). ‘We know what we are, but know not what we
may be’—Research-minded practitioners and their possible futures
in social work. Nordic Social Work Research. https://doi.org/
10.1080/2156857x.2020.1793807
Lyon, A. R., Cook, C. R., Duong, M. T., Nicodimos, S., Pullmann, M.
D., Brewer, S. K., Gaias, L. M., & Cox, S. (2019). The influence of
a blended, theoretically informed pre-implementation strategy on
school-based clinician implementation of an evidence-based
trauma intervention. Implementation Science,14(Article 54).
https://doi.org/10.1186/s13012-019-0905-3
Mackenzie, T., Innes, J., Boyd, M., Keane, B., Boxall, J., & Allan, S.
(2011). Evaluating the role and value of a national office to coor-
dinate Liverpool care pathway implementation in New Zealand.
International Journal of Evidence-Based Healthcare,9, 252–260.
https://doi.org/10.1111/j.1744-1609.2011.00219.x
Mader, E. M., Fox, C. H., Epling, J. W., Noronha, G. J., Swanger, C.
M., Wisniewsk, A. M., Vitale, K., Norton, A. L., & Morley, C. P.
(2016). A practice facilitation and academic detailing intervention
can improve cancer screening rates in primary care safety net
clinics. Journal of the American Board of Family Medicine,
29(5), 533–542. https://doi.org/10.3122/jabfm.2016.05.160109
Martino, S., Zimbrean, P., Forray, A., Kaufman, J. S., Desan, P. H.,
Olmstead, T. A., Gilstad-Hayden, K., Gueorguieva, R., & Yonkers,
K. A. (2019). Implementing motivational Interviewing for sub-
stance misuse on medical inpatient units: A randomized controlled
trial. Journal of General Internal Medicine,34(11), 2520–2529.
https://doi.org/10.1007/s11606-019-05257-3
McBeath, B., Mosley, J., Hopkins, K., Guerrero, E., Austin, M., &
Tropman, J. (2019). Building knowledge to support human service
organizational and management practice: An agenda to address the
research-to-practice gap. Social Work Research,43(2), 115–128.
https://doi.org/10.1093/swr/svz003
McCullough, M. B., Gillespie, C., Petrakis, B. A., Jones, E. A., Park,
A. M., Lukas, C. V., & Rose, A. J. (2017). Forming and activating
an internal facilitation group for successful implementation:
A qualitative study. Research in Social and Administrative Phar-
macy,13(5), 1014–1027. https://doi.org/10.1016/j.sapharm.
2017.04.007
McLeod, B. D., Cox, J. R., Jensen-Doss, A., Herschell, A. D., Ehren-
reich-May, J., & Wood, J. J. (2018). Proposing a mechanistic
model of clinician training and consultation. Clinical Psychology
Science and Practice,25, e12260. https://doi.org/10.1111/
cpsp.12260
McWilliam, J., Brown, J., Sanders, M. R., & Jones, L. (2016). The
triple P implementation framework: The role of purveyors in the
implementation and sustainability of evidence-based programs.
Prevention Science,17, 636–645. https://doi.org/10.1007/s11121-
016-0661-4
Mennen, F. E., Cederbaum, J., Chorpita, B. F., Becker, K., Lopez, O.,
& Sela-Amit, M. (2018). The large-scale implementation of
evidence-informed practice into a specialized MSW curriculum.
Journal of Social Work Education,54(Suppl 1), S56–S64. https://
doi.org/10.1080/10437797.2018.1434440
Meropol, S. B., Schiltz, N. K., Sattar, A., Stange, K. C., Nevar, A. H.,
Davey, C., Ferretti, G. A., Howell, D. E., Strosaker, R., Vavrek, P.,
Bader, S., Ruhe, M. C., & Cuttler, L. (2014). Practice-tailored
facilitation to improve pediatric preventive care delivery: A ran-
domized trial. Pediatrics,133(6), e1664–e1675. https://doi.org/
10.1542/peds.2013-1578
Mettrick, J., Kanary, P. J., Zabel, M. D., & Shepler, R. (2017). Centers
of excellence—Intermediary experience from the United States.
Developing Practice,48, 62–82. https://search.informit.com.au/
documentSummary;dn=586428262444791;res=IELAPA
Metz, A., & Bartley, L. (2020). Implementation teams: A stakeholder
view of leading and sustaining change. In B. Albers, A. Shlonsky,
& R. Mildon, (Eds.), Implementation science 3.0 (pp. 199–225).
Springer Nature.
Metz, A., Boaz, A., & Powell, B. J. (2019). A research protocol for
studying participatory processes in the use of evidence in child
welfare systems. Evidence & Policy: A Journal of Research,
Debate and Practice,15(3), 393–407. https://doi.org/10.1332/
174426419x15579811791990
Mold, J. W., Aspy, C. A., & Nagykaldi, Z. (2008). Implementation of
evidence-based preventive services delivery processes in primary
care: An Oklahoma Physicians Resource/Research Network
(OKPRN) Study. The Journal of the American Board of Family
Medicine,21(4), 334–344. https://doi.org/10.3122/jabfm.2008.
04.080006
Moore, J. E., Rashid, S., Park, J. S., Khan, S., & Straus, S. E. (2018).
Longitudinal evaluation of a course to build core competencies in
implementation practice. Implementation Science,13(Article 106).
https://doi.org/10.1186/s13012-018-0800-3
Mosley, J. E., Marwell, N. P., & Ybarra, M. (2019). How the “What
Works” movement is failing human service organizations, and
what social work can do to fix it. Human Service Organizations:
Management, Leadership & Governance,43(4), 1–10. https://
doi.org/10.1080/23303131.2019.1672598
Mosson, R., Augustsson, H., Bäck, A., Åhström, M., Schwarz, U.
von T., Richter, A., Gunnarsson, M., & Hasson, H. (2019). Building
implementation capacity (BIC): A longitudinal mixed methods
evaluation of a team intervention. BMC Health Services Research,
19(Article 287). https://doi.org/10.1186/s12913-019-4086-1
Murray, M. E., Khoury, D. Y., Farmer, E. M. Z., & Burns, B. J. (2018).
Is more better? Examining whether enhanced consultation/coach-
ing improves implementation. American Journal of Orthopsychia-
try,88(3), 376–385. https://doi.org/10.1037/ort0000296
Nadeem, E., Gleacher, A., & Beidas, R. S. (2013). Consultation as an
implementation strategy for evidence-based practices across mul-
tiple contexts: Unpacking the black box. Administration and Policy
in Mental Health and Mental Health Services Research,40(6),
439–450. https://doi.org/10.1007/s10488-013-0502-8
Nadeem, E., Gleacher, A., Pimentel, S., Hill, L. C., McHugh, M., &
Hoagwood, K. E. (2013). The role of consultation calls for clinic
supervisors in supporting large-scale dissemination of evidence-
based treatments for children. Administration and Policy in Mental
Health and Mental Health Services Research,40(6), 530–540.
https://doi.org/10.1007/s10488-013-0491-7
Norris, J. M., White, D. E., Nowell, L., Mrklas, K., & Stelfox, H. T.
(2017). How do stakeholders from multiple hierarchical levels of a
large provincial health system define engagement? A qualitative
study. Implementation Science,12(Article 98). https://doi.org/
10.1186/s13012-017-0625-5
Nurius, P. S., Coffey, D. S., Fong, R., Korr, W. S., & McRoy, R.
(2017). Preparing professional degree students to tackle grand
challenges: A framework for aligning social work curricula. Jour-
nal of the Society for Social Work and Research,8(1), 99–118.
https://doi.org/10.1086/690562
Olson, J. R., McCarthy, K. J., Perkins, D. F., & Borden, L. M. (2018).
A formative evaluation of a coach-based technical assistance
model for youth- and family-focused programming. Evaluation
Albers et al. 21
and Program Planning,67, 29–37. https://doi.org/10.1016/
j.evalprogplan.2017.11.002
Palinkas, L. A., Fuentes, D., Finno, M., Garcia, A. R., Holloway, I. W.,
& Chamberlain, P. (2012). Inter-organizational collaboration in the
implementation of evidence-based practices among public agen-
cies serving abused and neglected youth. Administration and
Policy in Mental Health and Mental Health Services Research,
41, 74–85. https://doi.org/10.1007/s10488-012-0437-5
Palinkas, L. A., He, A. S., Choy-Brown, M., & Hertel, A. L. (2017).
Operationalizing social work science through research–practice
partnerships. Research on Social Work Practice,27(2), 181–188.
https://doi.org/10.1177/1049731516666329
Palinkas, L. A., Holloway, I. W., Rice, E., Fuentes, D., Wu, Q., &
Chamberlain, P. (2011). Social networks and implementation of
evidence-based practices in public youth-serving systems:
A mixed-methods study. Implementation Science,6(1), 113.
https://doi.org/10.1186/1748-5908-6-113
Parchman, M. L., Noel, P. H., Culler, S. D., Lanham, H. J., Leykum, L.
K., Romero, R. L., & Palmer, R. F. (2013). A randomized trial of
practice facilitation to improve the delivery of chronic illness care
in primary care: Initial and sustained effects. Implementation
Science,8(Article 93). https://doi.org/10.1186/1748-5908-8-93
Park, J. S., Moore, J. E., Sayal, R., Holmes, B. J., Scarrow, G., Gra-
ham, I. D., Jeffs, L., Timmings, C., Rashid, S., Johnson, A. M., &
Straus, S. E. (2018). Evaluation of the “Foundations in Knowledge
Translation” training initiative: Preparing end users to practice KT.
Implementation Science,13(Article 63). https://doi.org/10.1186/
s13012-018-0755-4
Peterson, D. J., Christiansen, A. L., Guse, C. E., & Layde, P. M.
(2015). Community translation of fall prevention interventions:
The methods and process of a randomized trial. Journal of
Community Psychology,43(8), 1005–1018. https://doi.org/
10.1002/jcop.21728
Petticrew, M., & Roberts, H. (2003). Evidence, hierarchies, and typol-
ogies: Horses for courses. Journal of Epidemiology and Commu-
nity Health,57(7), 527–529. https://doi.org/10.1136/jech.57.7.527
Pinnock, H., Barwick, M. A., Carpenter, C. R., Eldridge, S., Grandes,
G., Griffiths, C. J., Rycroft-Malone, J., Meissner, P., Murray, E.,
Patel, A., Sheikh, A., & Taylor, S. J. C. (2017a, March). Standards
for reporting implementation studies (StaRI) statement. BMJ,356,
i6795. https://doi.org/10.1136/bmj.i6795
Pinnock, H., Barwick, M., Carpenter, C. R., Eldridge, S., Grandes, G.,
Griffiths, C. J., Rycroft-Malone, J., Meissner, P., Murray, E., Patel,
A., Sheikh, A., & Taylor, S. J. C. (2017b, April). Standards for
reporting implementation studies (StaRI): Explanation and ela-
boration document. BMJ Open,7, e013318. https://doi.org/
10.1136/bmjopen-2016-013318
Powell, B. J., Beidas, R. S., Lewis, C. C., Aarons, G. A., McMillen, J.
C., Proctor, E. K., & Mandell, D. S. (2015). Methods to improve
the selection and tailoring of implementation strategies. Journal of
Behavioral Health Services Research,44(2), 177–194. https://
doi.org/10.1007/s11414-015-9475-6
Powell, B. J., Fernandez, M. E., Williams, N. J., Aarons, G. A.,
Beidas, R. S., Lewis, C. C., McHugh, S. M., & Weiner, B. J.
(2019). Enhancing the impact of implementation strategies in
healthcare: A research agenda. Frontiers in Public Health,7(Arti-
cle 3). https://doi.org/10.3389/fpubh.2019.00003
Powell, B. J., Haley, A. D., Patel, S. V., Amaya-Jackson, L., Glienke,
B., Blythe, M., Lengnick-Hall, R., McCrary, S., Beidas, R. S.,
Lewis, C. C., Aarons, G. A., Wells, K. B., Saldana, L., McKay,
M. M., & Weinberger, M. (2020). Improving the implementation
and sustainment of evidence-based practices in community mental
health organizations: A study protocol for a matched-pair cluster
randomized pilot study of the collaborative organizational
approach to selecting and tailoring implementation strategies
(COAST-IS). Implementation Science Communications,1(Article
9). https://doi.org/10.1186/s43058-020-00009-5
Powell, B. J., Proctor, E. K., & Glass, J. E. (2014). A systematic
review of strategies for implementing empirically supported men-
tal health interventions. Research on Social Work Practice,24(2),
192–212. https://doi.org/10.1177/1049731513505778
Powell, B. J., Waltz, T. J., Chinman, M. J., Damschroder, L. J., Smith,
J. L., Matthieu, M. M., Proctor, E. K., & Kirchner, J. E. (2015).
A refined compilation of implementation strategies: Results from
the expert recommendations for implementing change (ERIC)
project. Implementation Science,10(Article 21). https://doi.org/
10.1186/s13012-015-0209-1
Preast, J. L., & Burns, M. K. (2018). Effects of consultation on pro-
fessional learning communities. Journal of Educational and Psy-
chological Consultation,29(2), 1–31. https://doi.org/10.1080/
10474412.2018.1495084
Prior, M., Guerin, M., & Grimmer-Somers, K. (2008). The effective-
ness of clinical guideline implementation strategies—A synthesis
of systematic review findings. Journal of Evaluation in Clinical
Practice,14(5), 888–897. https://doi.org/10.1111/j.1365-2753
.2008.01014.x
Proctor, E. K., Hooley, C., Morse, A., McCrary, S., Kim, H., & Kohl,
P. L. (2019). Intermediary/purveyor organizations for evidence-
based interventions in the US child mental health: Characteristics
and implementation strategies. Implementation Science,14(Article
3). https://doi.org/10.1186/s13012-018-0845-3
Proctor, E. K., Powell, B. J., & McMillen, J. C. (2013). Implementa-
tion strategies: Recommendations for specifying and reporting.
Implementation Science,8(Article 139). https://doi.org/10.1186/
1748-5908-8-139
Provvidenza, C., Townley, A., Wincentak, J., Peacocke, S., & King-
snorth, S. (2020). Building knowledge translation competency in a
community-based hospital: A practice-informed curriculum for
healthcare providers, researchers, and leadership. Implementation
Science,15(Article 54). https://doi.org/10.1186/s13012-020-
01013-y
Quanbeck, A., Brown, R. T., Zgierska, A. E., Jacobson, N., Robinson,
J. M., Johnson, R. A., Deyo, B. M., Madden, L., Tuan, W.-J., &
Alagoz, E. (2018). A randomized matched-pairs study of feasibil-
ity, acceptability, and effectiveness of systems consultation:
A novel implementation strategy for adopting clinical guidelines
for opioid prescribing in primary care. Implementation Science,
13(Article 21). https://doi.org/10.1186/s13012-018-0713-1
Rabin, B. A., Brownson, R. C., Haire-Joshu, D., Kreuter, M. W., &
Weaver, N. L. (2008). A glossary for dissemination and implemen-
tation research in health. Journal of Public Health Management
22 Research on Social Work Practice XX(X)
and Practice,14(2), 117–123. https://doi.org/10.1097/01.phh.
0000311888.06252.bb
Ritchie, M. J., Parker, L. E., Edlund, C. N., & Kirchner, J. E. (2017).
Using implementation facilitation to foster clinical practice quality
and adherence to evidence in challenged settings: A qualitative
study. BMC Health Services Research,17(Article 294). https://
doi.org/10.1186/s12913-017-2217-0
Rivard, L. M., Russell, D. J., Roxborough, L., Ketelaar, M., Bartlett,
D. J., & Rosenbaum, P. (2010). Promoting the use of measurement
tools in practice: A mixed-methods study of the activities and
experiences of physical therapist knowledge brokers. Physical
Therapy,90(11), 1580–1590. https://doi.org/10.2522/ptj.20090408
Rosella, L. C., Bornbaum, C., Kornas, K., Lebenbaum, M., Peirson,
L., Fransoo, R., Loeppky, C., Gardner, C., & Mowat, D. (2018).
Evaluating the process and outcomes of a knowledge translation
approach to supporting use of the diabetes population risk tool
(DPORT) in public health practice. Canadian Journal of Program
Evaluation,33(1), 21–48. https://doi.org/10.3138/cjpe.31160
Rosen, C. S., Nguyen, T., Bernardy, N. C., Hamblen, J. L., Ruzek, J. I.,
& Friedman, M. J. (2012). Evaluation of a mentoring program for
PTSD clinic managers in the U.S. Department of Veterans Affairs.
Psychiatric Services,63(10), 1047–1050. https://doi.org/10.1176/
appi.ps.201100446
Rushovich, B. R., Bartley, L. H., Steward, R. K., & Bright, C. L.
(2015). Technical assistance: A comparison between providers and
recipients. Human Service Organizations: Management, Leader-
ship & Governance,39(4), 362–379. https://doi.org/10.1080/233
03131.2015.1052124
Russell, D. J., Rivard, L. M., Walter, S. D., Rosenbaum, P. L.,
Roxborough, L., Cameron, D., Darrah, J., Bartlett, D. J., Hanna,
S. E., & Avery, L. M. (2010). Using knowledge brokers to facilitate
the uptake of pediatric measurement tools into clinical practice:
A before-after intervention study. Implementation Science,
5(Article 92). https://doi.org/10.1186/1748-5908-5-92
Ryba, M. M., Brothers, B. M., & Andersen, B. L. (2017). Implemen-
tation of an evidence-based biobehavioral treatment for cancer
patients. Translational Behavioral Medicine,7, 648–656. https://
doi.org/10.1007/s13142-016-0459-8
Saini, M., & Shlonsky, A. (2012). Systematic synthesis of qualitative
research. Oxford University Press.
Saldana, L., & Chamberlain, P. (2012). Supporting implementation:
The role of community development teams to build infrastructure.
American Journal of Community Psychology,50(3–4), 334–346.
https://doi.org/10.1007/s10464-012-9503-0
Sanetti, L. M. H., Williamson, K. M., Long, A. C. J., & Kratochwill, T.
R. (2018). Increasing in-service teacher implementation of class-
room management practices through consultation, implementation
planning, and participant modeling. Journal of Positive Behavior
Interventions,20(1), 43–59. https://doi.org/10.1177/10983
00717722357
Sarkies, M. N., Bowles, K.-A., Skinner, E. H., Haas, R., Lane, H., &
Haines, T. P. (2017). The effectiveness of research implementation
strategies for promoting evidence-informed policy and manage-
ment decisions in healthcare: A systematic review. Implementation
Science,12(Article 132). https://doi.org/10.1186/s13012-017-
0662-0
Schoenwald, S. K., Sheidow, A. J., & Letourneau, E. J. (2004).
Toward effective quality assurance in evidence-based practice:
Links between expert consultation, therapist fidelity, and child
outcomes. Journal of Clinical Child & Adolescent Psychology,
33(1), 94–104. https://doi.org/10.1207/s15374424jccp3301_10
Scurlock-Evans, L., & Upton, D. (2015). The role and nature of evi-
dence: A systematic review of social workers’ evidence-based
practice orientation, attitudes, and implementation. Journal of
Evidence-Informed Social Work,12(4), 369–399. https://doi.org/
10.1080/15433714.2013.853014
Shapira, Y., Enosh, G., & Havron, N. (2017). What makes social work
students implement evidence-based practice behaviors? Journal of
Social Work Education,53(2), 187–200. https://doi.org/10.1080/
10437797.2016.1260507
Shapiro, C. J. (2018). Centers of excellence: An opportunity for
academic training in social work. Journal of Social Work Educa-
tion,54(Suppl 1), 1–11. https://doi.org/10.1080/10437797.
2018.1434439
Shernoff, E. S., Lakind, D., Frazier, S. L., & Jakobsons, L. (2015).
Coaching early career teachers in urban elementary schools:
A mixed-method study. School Mental Health,7(1), 6–20.
https://doi.org/10.1007/s12310-014-9136-6
Smits, P., Denis, J.-L., Couturier, Y., Touati, N., Roy, D., Boucher, G.,
& Rochon, J. (2020). Implementing public policy in a non-
directive manner: Capacities from an intermediary organization.
Canadian Journal of Public Health,111, 72–79. https://doi.org/
10.17269/s41997-019-00257-6
Snyder, H. (2019). Literature review as a research methodology: An
overview and guidelines. Journal of Business Research,104,
333–339. https://doi.org/10.1016/j.jbusres.2019.07.039
Snyder, P. A., Hemmeter, M. L., & Fox, L. (2015). Supporting imple-
mentation of evidence-based practices through practice-based
coaching. Topics in Early Childhood Special Education,35(3),
133–143. https://doi.org/10.1177/0271121415594925
Spensberger, F., Kollar, I., Gambrill, E., Ghanem, C., & Pankofer, S.
(2020). How to teach evidence-based practice in social work:
A systematic review. Research on Social Work Practice,30(1),
19–39. https://doi.org/10.1177/1049731519852150
Stander, J., Grimmer, K., & Brink, Y. (2018). Training programmes to
improve evidence uptake and utilisation by physiotherapists:
A systematic scoping review. BMC Medical Education,18(Article
14). https://doi.org/10.1186/s12909-018-1121-6
Stephens, T. N., McGuire-Schwartz, M., Rotko, L., Fuss, A., &
McKay, M. M. (2014). A learning collaborative supporting the
implementation of an evidence-informed program, the “4Rs and
2Ss for children with conduct difficulties and their families.”
Journal of Evidence-Based Social Work,11(5), 511–523. https://
doi.org/10.1080/15433714.2013.831007
Stormont, M., Reinke, W. M., Newcomer, L., Marchese, D., & Lewis,
C. (2015). Coaching teachers’ use of social behavior interventions
to improve children’s outcomes: A review of the literature. Journal
of Positive Behavior Interventions,17(2), 69–82. https://doi.org/
10.1177/1098300714550657
Taylor, J. S., Verrier, M. C., & Landry, M. D. (2014). What do we
know about knowledge brokers in paediatric rehabilitation? A
systematic search and narrative summary. Physiotherapy Canada,
66(2), 143–152. https://doi.org/10.3138/ptc.2012-71
Thyer, B. A., Babcock, P., & Tutweiler, M. (2017). Locating research-
supported interventions for child welfare practice. Child and Ado-
lescent Social Work Journal,34(2), 85–94. https://doi.org/
10.1007/s10560-016-0478-9
Tierney, S., Kislov, R., & Deaton, C. (2014). A qualitative study of a
primary-care based intervention to improve the management of
patients with heart failure: The dynamic relationship between
facilitation and context. BMC Family Practice,15(Article 153).
https://doi.org/10.1186/1471-2296-15-153
Torraco, R. J. (2016). Writing integrative literature reviews: Using the
past and present to explore the future. Human Resource Develop-
ment Review,15(4), 404–428. https://doi.org/10.1177/15344843
16671606
Tricco, A. C., Antony, J., Soobiah, C., Kastner, M., Cogo, E.,
MacDonald, H., D’Souza, J., Hui, W., & Straus, S. E. (2016).
Knowledge synthesis methods for generating or refining theory:
A scoping review reveals that little guidance is available. Journal
of Clinical Epidemiology,73, 36–42. https://doi.org/10.1016/
j.jclinepi.2015.11.021
van der Zijpp, T. J., Niessen, T., Eldh, A. C., Hawkes, C., McMullan,
C., Mockford, C., Wallin, L., McCormack, B., Rycroft-Malone, J.,
& Seers, K. (2016). A bridge over turbulent waters: Illustrating the
interaction between managerial leaders and facilitators when
implementing research evidence. Worldviews on Evidence-Based
Nursing,13(1), 25–31. https://doi.org/10.1111/wvn.12138
van der Zwet, R. J. M., Beneken, D. M., Schalk, R., & van Regen-
mortel, T. (2019). Views and attitudes towards evidence-based
practice in a Dutch social work organization. Journal of
Evidence-Based Social Work,16(3), 245–260. https://doi.org/
10.1080/23761407.2019.1584071
Varsi, C., Nes, L. S., Kristjansdottir, O. B., Kelders, S. M., Stenberg,
U., Zangi, H. A., Børøsund, E., Weiss, K. E., Stubhaug, A., Asb-
jørnsen, R. A., Westeng, M., Ødegaard, M., & Eide, H. (2019).
Implementation strategies to enhance the implementation of
eHealth programs for patients with chronic illnesses: Realist sys-
tematic review. Journal of Medical Internet Research,21(9),
e14255. https://doi.org/10.2196/14255
Waltz, T. J., Powell, B. J., Fernandez, M. E., Abadie, B., & Dams-
chroder, L. J. (2019). Choosing implementation strategies to
address contextual barriers: Diversity in recommendations and
future directions. Implementation Science,14(Article 42). https://
doi.org/10.1186/s13012-019-0892-4
Waltz, T. J., Powell, B. J., Matthieu, M. M., Damschroder, L. J.,
Chinman, M. J., Smith, J. L., Proctor, E. K., & Kirchner, J. E.
(2015). Use of concept mapping to characterize relationships
among implementation strategies and assess their feasibility and
importance: Results from the expert recommendations for imple-
menting change (ERIC) study. Implementation Science,10(Article
109). https://doi.org/10.1186/s13012-015-0295-0
Ward, M. M., Baloh, J., Zhu, X., & Stewart, G. L. (2017). Promoting
action on research implementation in health services framework
applied to TeamSTEPPS implementation in small rural hospitals.
Health Care Management Review,42(1), 2–13. https://doi.org/
10.1097/hmr.0000000000000086
Waterman, H., Boaden, R., Burey, L., Howells, B., Harvey, G.,
Humphreys, J., Rothwell, K., & Spence, M. (2015). Facilitating
large-scale implementation of evidence-based health care: Insider
accounts from a co-operative inquiry. BMC Health Services
Research,15(Article 60). https://doi.org/10.1186/s12913-015-
0722-6
Weaver, N. L., Thompson, J., Shoff, C. R., Copanas, K., & McMillin,
S. E. (2017). A conceptual model for the pathways of effect for
intermediary organizations: A case study from maternal and child
health. Evaluation and Program Planning,63, 69–73. https://
doi.org/10.1016/j.evalprogplan.2017.03.006
Wharton, T., & Bolland, K. (2012). Practitioner perspectives of
evidence-based practice. Families in Society: The Journal of
Contemporary Social Services,93(3), 157–164. https://doi.org/
10.1606/1044-3894.4220
Whittemore, R., & Knafl, K. (2005). The integrative review: Updated
methodology. Journal of Advanced Nursing,52(5),
546–553. https://doi.org/10.1111/j.1365-2648.2005.03621.x
Wike, T. L., Bledsoe, S. E., Manuel, J. I., Despard, M., Johnson, L. V.,
Bellamy, J. L., & Killian-Farrell, C. (2014). Evidence-based prac-
tice in social work: Challenges and opportunities for clinicians and
organizations. Clinical Social Work Journal,42(2), 161–170.
https://doi.org/10.1007/s10615-014-0492-3
Wike, T. L., Grady, M., Massey, M., Bledsoe, S. E., Bellamy, J. L.,
Stim, H., & Putzu, C. (2019). Newly educated MSW social work-
ers’ use of evidence-based practice and evidence-supported inter-
ventions: Results from an online survey. Journal of Social Work
Education,55(3), 504–518. https://doi.org/10.1080/1043
7797.2019.1600444
Williams, N. J., Glisson, C., Hemmelgarn, A., & Green, P. (2017).
Mechanisms of change in the ARC organizational strategy:
Increasing mental health clinicians’ EBP adoption through
improved organizational culture and capacity. Administration and
Policy in Mental Health and Mental Health Services Research,44,
269–283. https://doi.org/10.1007/s10488-016-0742-5
Worton, S. K., Hasford, J., Macnaughton, E., Nelson, G., MacLeod,
T., Tsemberis, S., Stergiopoulos, V., Goering, P., Aubry, T., Dis-
tasio, J., & Richter, T. (2018). Understanding systems change in
early implementation of housing first in Canadian communities:
An examination of facilitators/barriers, training/technical assis-
tance, and points of leverage. American Journal of Community
Psychology,61(1–2), 118–130. https://doi.org/10.1002/ajcp.12219
Yano, E. M., Rubenstein, L. V., Farmer, M. M., Chernof, B. A., Mitt-
man, B. S., Lanto, A. B., Simon, B. F., Lee, M. L., & Sherman, S.
E. (2008). Targeting primary care referrals to smoking cessation
clinics does not improve quit rates: Implementing evidence-based
interventions into practice. Health Services Research,43,
1637–1661. https://doi.org/10.1111/j.1475-6773.2008.00865.x
Yazejian, N., Metz, A., Morgan, J., Louison, L., Bartley, L., Fleming,
W. O., Haidar, L., & Schroeder, J. (2019). Co-creative technical
assistance: Essential functions and interim outcomes. Evidence &
Policy: A Journal of Research, Debate and Practice,15(3),
339–352. https://doi.org/10.1332/174426419x15468578679853