Special Issue: Strengthening the Science and Practice of Implementation Support: Evaluating the Effectiveness of Training and Technical
Assistance Centers
Evaluation & the Health Professions
2024, Vol. 47(2) 204–218
© The Author(s) 2024
Article reuse guidelines:
sagepub.com/journals-permissions
DOI: 10.1177/01632787241248769
journals.sagepub.com/home/ehp
Unpacking Technical Assistance (TA)
Strategies Within a State-Level Prevention
Support System: A Mixed-Method Study in
Determining Types and Amount of TA
Jochebed G. Gayles¹, Sarah M. Chilenski¹, Nataly Barragán¹, Brittany Rhoades Cooper², Janet Agnes Welsh¹, and Megan Galinsky¹
Abstract
The research-practice gap between evidence-based intervention efficacy and its uptake in real-world contexts remains a central
challenge for prevention and implementation science. Providing technical assistance (TA) is considered a crucial support
mechanism that can help narrow the gap. However, empirical measurement of TA strategies and their variation is often lacking.
The current study unpacks the black box of TA, highlighting different TA strategies, amounts, and their relation to intervention
characteristics. First, we qualitatively categorized interactions between TA providers and implementers. Second, we explored
how characteristics of implementing organizations and the intervention related to variations in the amount of TA delivered.
Using data spanning six years, we analyzed over 10,000 encounters between TA providers and implementers. Content analysis
yielded four distinct strategies: Consultation (27.2%), Coordination Logistics (24.5%), Monitoring (16.5%), and Resource
Delivery (28.2%). Organizations with prior experience required less monitoring and resource delivery. Additionally, characteristics of the intervention were significantly associated with the amount of consultation, monitoring, coordination logistics,
and resource delivery provided. The specific features of the intervention showed significant variation in their relation to TA
strategies. These findings provide initial insights into the implications of intervention characteristics in determining how much of
which TA strategies are needed to support implementations in real-world settings.
Keywords
technical assistance, evidence-based programs, implementation support, prevention support system, prevention capacity,
evidence-based preventive interventions
Introduction
Over the past three decades, prevention science has built an
impressive collection of evidence-based preventive inter-
ventions (EBIs) that have proven effective at reducing risk and
promoting healthy outcomes; in this article, we use the term
EBI to reflect universal prevention, selective, and treatment
programs delivered across various community settings.
Achieving population-level public health impact requires
EBIs to be widely adopted, disseminated, and sustained
(Hawkins et al., 2015). However, implementing EBIs across
diverse organizational and community contexts introduces
several challenges that produce a notable gap between find-
ings from research settings and practice contexts. Outside of
research settings, many EBIs encounter implementation
variations where the delivery of the program in practice is not
done as intended by the designer (Durlak, 2013). Implementation variation introduces challenges to fidelity, quality delivery, and other implementation outcomes (Durlak, 2013; Proctor et al., 2011). Furthermore, the prevention capacity of human service providers also affects the quality implementation of programs (Flaspohler et al., 2008). These
challenges contribute to the research-practice gap, where
many EBIs fail to replicate findings from efficacy trials, resulting in diminished public health impact (Glasgow et al., 2003; Powell et al., 2015). The research-practice gap between producing EBIs and their uptake in real-world contexts remains a central challenge for prevention and implementation science.

¹ Edna Bennett Pierce Prevention Research Center, The Pennsylvania State University, University Park, PA, USA
² Department of Human Development, Washington State University, Pullman, WA, USA

Corresponding Author:
Jochebed G. Gayles, Edna Bennett Pierce Prevention Research Center, The Pennsylvania State University, 320 G Biobehavioral Health Building, University Park, PA 16802, USA.
Email: jgg137@psu.edu
Implementation science aims to address the factors that
enhance the uptake and quality delivery of EBIs (Bauer &
Kirchner, 2020) and to develop strategies that promote pre-
vention capacity among program providers (Flaspohler et al.,
2008). Implementation support is one approach to addressing
the challenges that widen the research-practice gap (Durlak &
DuPre, 2008; Edmunds et al., 2013; Fixsen et al., 2005). The
field’s current state recognizes the need for conceptual and
measurement models that move us closer to an evidence base
for the effective and efficient delivery of implementation
support. Furthermore, a comprehensive, evidence-based
framework on technical assistance (TA) mechanisms, tech-
niques, and applications is in its infancy, and even less is
known about what strategies are most essential. Thus, a
question that remains is how much TA is sufficient and what
conditions bolster a particular strategy’s focus. The current
investigation contributes to the growing knowledge base on
implementation support by empirically classifying TA providers’ activities and examining how EBI characteristics contribute to the type and amount of TA delivered.
The Delivery of Implementation Support through the
Support System
The Interactive Systems Framework (ISF), developed by
Wandersman and colleagues (2008), provides a conceptual
logic model for how coordinated implementation support
systems help to narrow the research-practice gap (Hunter
et al., 2009a; Katz & Wandersman, 2016; Mitchell et al.,
2004). The ISF consists of three interconnected systems: the
synthesis and translation system, the support system, and the
delivery system. The framework highlights the reciprocal
influences among the three systems, where research carried
out by the synthesis and translation system informs the support
system, and the activities carried out by the support system
inform further research on effective implementation strategies
and technical assistance needs of the delivery system. The ISF
model conceptualizes how quality dissemination and im-
plementation capacity can be built and sustained through
ongoing collaboration among key stakeholders across the
three systems (Wandersman et al., 2008). We argue that
careful examination of the support system may help to unveil
how much TA is needed for providers tasked with im-
plementing EBIs.
Operationalizing Technical Assistance via the Evidence-based
System for Innovation Support. While the ISF establishes a
structural framework contextualizing systems needed for
diffusing EBIs, Wandersman and colleagues (2008) noted that
the real work of bridging the research-practice gap is
actualized within the “arrows” that connect the interactive
systems. In later work, Wandersman and colleagues (2012)
provided a more granular conceptual lens for how support is
executed via the Evidence-based System for Innovation
Support (EBSIS). The EBSIS logic model augments the ISF
by zooming into the “black box” of how the support system
provides various forms of implementation support to delivery
systems, introducing a conceptual basis for theory, research,
and application. The first step in the EBSIS model involves
performing a needs assessment evaluating the delivery sys-
tem’s current capacity level and support needs. Next, the
model describes four sequential components that characterize
implementation support—tools, training, TA, and quality
assurance and quality improvement (QA/QI). Therefore,
support systems help delivery systems achieve prevention
capacity and implementation outcomes by applying all four
components within the EBSIS model. The provision of tools,
training, TA, and QA/QI is both iterative and additive, such
that each component builds on, enhances, and reinforces the
others in a reciprocal and feedback-loop fashion.
Wandersman and colleagues (2012) emphasized the need
for further operationalization, measurement, and empirical
investigation of support components as an essential next step
in substantiating an evidence base for the practice and ef-
fectiveness of implementation support. The current study is a
response to this call. We aim to further unpack implementation
support by examining the application of TA, specifically
within the context of a state-level support system. We believe
that implementation support components can be further un-
packed to illuminate specific strategies that characterize the
components of support (Albers et al., 2021; Chinman et al., 2018; Metz et al., 2021).
The current study highlights the support system as a viable
system that may be used to unpack the facets of im-
plementation support, specifically regarding technical assis-
tance strategies. The support system functions to build
prevention capacity among program implementers and to
foster collaboration and networking between and within the
synthesis and translation, and delivery systems (Chinman
et al., 2008; Hunter et al., 2009b; Mitchell et al., 2004;
Rhoades et al., 2012). Our current investigation focuses on
classifying different TA strategies delivered within a state-
level support system.
Unpacking Implementation Support through Examination of
Technical Assistance Strategies. Katz and Wandersman (2016)
conceptualized TA as an “individualized hands-on approach to
capacity building in organizations and communities” (p. 418).
Notably, TA is one of the four components of implementation
support (Wandersman et al., 2012), and TA is usually ongoing
following initial training (Durlak & DuPre, 2008; Fixsen et al.,
2009). Technical assistance providers use various strategies
such as training, consultation, program monitoring, im-
plementation tools, peer networking activities, and others to
enhance prevention capacity (Fagan et al., 2012, 2019; Farrell et al., 2019; Feinberg et al., 2004; Hunter et al., 2009b).
Therefore, various TA strategies may be required to achieve
desired implementation outcomes. What is known is that by
receiving structured, tailored, and ongoing implementation
support, program implementers become equipped with
knowledge, skills, tools, and competencies that enable them to
adopt EBIs, deliver them with quality and fidelity, and sustain
them (Fixsen et al., 2013; Katz & Wandersman, 2016;
Wandersman, 2009;Wandersman et al., 2012). The pressing
question remains: how much of which TA strategies are
necessary, and under what conditions?
How Much TA is Needed: For Which Strategies and Under What
Conditions? Fixsen et al. (2005) highlighted the diverse in-
fluences of multiple contexts on implementation processes,
ranging from external factors within different systems to
organizational characteristics and core implementation compo-
nents. It is essential to understand the various factors that
affect implementation outcomes, be they external or internal to
the implementation context. These factors may directly relate to how
much TA is needed to aid program implementers with EBI
delivery.
Evidence-based preventive interventions vary in design,
quality, complexity, and resources available from the program
developer (Damschroder et al., 2015; Damschroder et al.,
2009). As a result, the amount of TA necessary may vary
considerably, especially given these different contextual
factors. The Consolidated Framework for Implementation
Research (CFIR) provides a comprehensive framework for
understanding and evaluating complex interventions in
healthcare, social services, and other fields (Damschroder
et al., 2009). The CFIR model is useful for illuminating
how intervention characteristics contribute to variations in
which TA strategies are delivered. For example, in a case
study, Engell and colleagues (2021) used the CFIR to show
that intervention content and design characteristics contrib-
uted to the implementability of the Enhanced Academic
Support program. Engell et al. found that content designed
with more flexible structure, integration, and tailoring pro-
cesses had greater implementability, requiring less follow-up
training and additional implementation support. Eisman and
colleagues examined the implementability of state-adopted
health curriculums in school-based settings and found that
intervention design characteristics related to acceptability,
program-context fit, and adaptability (Eisman et al., 2022) and
teacher previous experiences with program components
(Eisman et al., 2020) were related to program fidelity and
implementation quality. The aforementioned studies, as with
much of CFIR research (Kirk et al., 2016), show how factors
related to intervention characteristics influence im-
plementation outcomes; these factors also likely relate to the
degree of TA that is delivered to support organizations im-
plementing EBIs.
Additionally, public health specialists have pointed out that
some EBIs may encounter implementation obstacles given the
complexity of design, high costs, and a narrow focus that does
not align with communities’ needs (Green & Mercer, 2001).
Instances in which program implementers encounter these
obstacles with their selected EBIs may necessitate the ef-
fective initiation and maintenance of considerable resources
and infrastructure for the long haul (Aarons et al., 2009;
Brownson et al., 2018). Thus, in such instances, a greater need
for and reliance on various TA strategies may be warranted.
Other factors, such as the differences in the length or intensity
of the delivery of an EBI (Codding et al., 2022; Fallon et al.,
2022) can also impact implementation capacity and, thus, TA
needs. The aforementioned studies and other related work
highlight how factors relating to EBI resources, other infra-
structure supports, and their complexity can contribute to a
provider’s capacity to deliver a program with fidelity, which
may relate to how much TA is needed.
The Current Study
The current study examines the amount of TA delivered to
support program implementation for a variety of EBIs. Our
overarching goal is to delineate how much of different types of
TA strategies are employed given different EBI conditions.
We address our goal via two interrelated sub-studies where we
empirically classify different TA strategies and determine how
often they occur within a state-level support system. In Study
1, we classify TA providers’ strategies to support human service organizations delivering EBIs. We identify specific TA strategies supported by existing evidence (Le et al., 2016) and
aligned with the standard methods employed by a training and
technical assistance center (Rhoades et al., 2012) to aid human
service organizations delivering EBIs; these include consul-
tation, coaching, coordination logistics, monitoring, resource
delivery, and networking coordination. In Study 2, we explore
how intervention and provider characteristics are associated
with the amount of TA delivered. We examine associations
among characteristics of EBI quality and design. Design
characteristics examined in this study include the intervention
continuum of care (Elliott, 2016), focus of the intervention
content (e.g., family-focused; Bailey et al., 1986), charac-
teristics of delivery settings, including the system setting (e.g.,
school-based) and the respondent mode of delivery (e.g.,
group-based). Quality characteristics include the amount of
external support the program developer provides and the
evidence rating for the EBI. Based on personal communi-
cations with TA providers, we had some expectations on how
EBI characteristics might be related to the quantity of TA
delivered. For instance, we hypothesized that more complex programs with greater content focus (i.e., Treatment vs. Universal) would be associated with greater amounts of TA. We also hypothesized that more TA would be delivered for family-focused EBIs than for non-family-focused EBIs, and for school-based settings compared to other settings. We also hypothesized that greater amounts of developer support, measured by developer-provided tools, training, and resources, would be associated with a smaller quantity of TA delivered.
Study 1: Classifying Implementation Support Activities
Method. In the first study, the unit of analysis is the nature of
TA encounters between implementation support specialists
and TA recipients.
Data Source. Fourteen implementation specialists reported on 11,685 TA encounters provided to 168 different human service organizations delivering EBIs (310 program implementations; 16 different EBIs) across Pennsylvania communities. Information about TA provision to support EBI implementation was tracked using a reporting form, starting in 2011, when the tracking system was first implemented, and going through 2017, when the ACCESS tracking system became defunct. Implementation specialists completed the reporting form after each TA encounter.
Measures
TA Contact Form. Implementation support activities were
assessed via a TA contact form created in Microsoft Access.
Implementation specialists reported their direct contact with
human service organizations, including date of reference, mo-
dality, organization name(s), EBI, the number who received
support, TA strategy, and an open-ended general notes section.
The TA category and the open-ended notes were used to code
implementation support activities.
Analysis Plan. To address the first research question, each
instance of TA contact was qualitatively coded into one of the six
content codes conceptually defined by the research team. The
remaining TA contact encounters not selected during coder
training were divided between two research assistants. Once all TA
contact encounters were coded, each content code was summed
across all years, EBIs, and organizations. Descriptive statistics
were computed for each TA strategy across EBIs and years.
Data Reduction Procedures. The number of encounters in
the dataset was reduced using three exclusion criteria:
(1) Any instance that took place during or after 2018 (3 TA
encounters were removed); (2) if the total count of encounters
reported for a given EBI was less than 300 (these included
EBIs that had not been supported across all years of data
collection; 677 TA encounters and 6 EBIs were removed);
and, (3) any instance where there was not enough descriptive
information included to code the reported contact (74 TA
encounters). After applying these criteria, the dataset comprised 10,931 TA
encounters, with 145 human service organizations delivering
341 programs reflecting 10 different EBIs.
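To make the reduction concrete, the sketch below applies the three exclusion criteria in sequence. It is a minimal illustration assuming a hypothetical flat file and column names (year, ebi, notes); the study's actual Microsoft Access extract and field names may differ.

```python
import pandas as pd

# Hypothetical extract of the TA contact records; column names are assumed.
encounters = pd.read_csv("ta_encounters.csv")

# (1) Drop any encounter that took place during or after 2018.
encounters = encounters[encounters["year"] < 2018]

# (2) Drop EBIs with fewer than 300 reported encounters across all years.
ebi_counts = encounters["ebi"].value_counts()
encounters = encounters[encounters["ebi"].isin(ebi_counts[ebi_counts >= 300].index)]

# (3) Drop encounters without enough descriptive information to code;
#     a non-empty notes field is a crude stand-in for that judgment.
encounters = encounters[encounters["notes"].fillna("").str.strip().str.len() > 0]

print(f"{len(encounters)} TA encounters retained")  # 10,931 in the study
```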
Coding Procedures. The Interactive Systems Framework
(Wandersman et al., 2008), the Evidence-based Prevention
and Intervention Support logic model (EPIS; Rhoades et al.,
2012), and an extensive literature review on TA were used to
develop a coding classification system for TA activities. The
co-authors developed, reviewed, and revised content codes.
The codes were crosswalked with existing TA categories in the
EPIS logic model and performance metric reporting system,
resulting in a standardized codebook. A coding manual was
created with content code descriptions. Next, the first author
randomly selected and coded a subset of the 11,685 reported
contacts before the initial data reduction (1115 encounters,
9.5% of the sample). Table 1 shows the coding scheme for six
content codes used to classify the hypothesized TA interac-
tions. Two research assistants were trained to review and
organize primary support activities using the content codes.
Training included discussing code meanings and practicing
coding. The training involved two phases. In phase one, the research assistants independently coded the “gold standard” encounters that the first author had already coded.
Coding was done iteratively, with each coder assigned a
random 20% of the 731 gold standard cases per iteration (about
146 encounters per coder). After each iteration, coding consensus
meetings were held to address discrepancies between the coder
and the gold standard codes. In phase one, the coding manual and
descriptions were revised for clarity and distinctiveness. Phase
one was considered complete once each coder achieved 80% or
greater agreement with the “gold standard” codes. Inter-rater
agreement was assessed using Cohen’s kappa with weighted
disagreements in large samples (Fleiss et al., 1969). Interrater
reliability was moderate to strong for phase one (McHugh, 2012),
averaging 88%, ranging from 76% to 85% across the three coding
iterations. In the second phase, another random subset of TA
encounters was coded by each coder (546 encounters, 5% of the
sample). This process was repeated twice with a new random
subset of encounters. Phase two coding consensus meetings were
held after each iteration to review discrepancies and reach a
consensus. Phase two completion was reached when the two
coders achieved at least 80% agreement for the three subsets of
TA contact encounters; phase two interrater reliability was strong
to almost perfect (McHugh, 2012), demonstrating high percent
agreement among the coders (κ = .92).
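As an illustration of the reliability checks described above, the sketch below computes raw percent agreement (the 80% training threshold) and Cohen's kappa for a toy pair of coders; the labels are the six strategy codes, and the data are fabricated for demonstration only.

```python
from sklearn.metrics import cohen_kappa_score

# Fabricated codes for six encounters: the gold standard vs. one trainee.
gold = ["consultation", "monitoring", "coaching",
        "resource_delivery", "consultation", "coordination_logistics"]
coder = ["consultation", "monitoring", "consultation",
         "resource_delivery", "consultation", "coordination_logistics"]

# Raw percent agreement, compared against the 80% training threshold.
agreement = sum(g == c for g, c in zip(gold, coder)) / len(gold)

# Cohen's kappa corrects that agreement for chance.
kappa = cohen_kappa_score(gold, coder)

print(f"agreement = {agreement:.0%}, kappa = {kappa:.2f}")
```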
Results
The first study presents findings from the qualitative content analysis classifying TA providers’ strategies. Table 1 shows the six TA strategy codes, their descriptions, and example text used to classify TA provider activities—consultation, coaching, coordi-
nation logistics, monitoring, resource delivery, and networking
coordination. All six hypothesized implementation support
practices were classified via the content analysis, but there was
considerable variation in the frequency of content codes assigned
across the different strategies. Table 2 shows the coding fre-
quencies and averages across years, EBIs, and human service
organizations. The most frequently occurring TA encounters were
Resource Delivery (28.2%), Consultation (27.2%), Coordination Logistics (24.5%), and Monitoring (16.5%). Together, these ac-
counted for more than 95% of the content coded. Networking
Coordination and Coaching were infrequently coded, together accounting for less than 5% of all coded content.
Discussion
Study 1 findings show that TA providers employed resource
delivery, consultation, coordination logistics, and monitoring
strategies in their encounters with EBI implementers. Each of
these four strategies was ongoing during EBI implementation.
Some strategies, mainly resource delivery and consultation,
occurred more frequently than others. Our findings support
those offered in the existing literature, indicating that the
process of TA provision is ongoing and multifaceted and
requires tailoring (Katz & Wandersman, 2016), and the
amount needed may vary across strategies to efficiently de-
liver a sufficient degree of support to the program implementers (Feinberg et al., 2004, 2008).
Table 1. Coded Activities, EBSIS Component, Descriptions, Brief Coding Instructions, and Qualitative Text Examples From TA Records

Consultation (EBSIS component: TA)
  Description: Provide expert advice, information, and/or opinions regarding general and program-specific implementation capacity needs.
  Coding instruction: IS provides information to help troubleshoot, resolve, or navigate a practitioner’s concerns, challenges, and needs regarding implementation.
  Example: “Discussed…developer’s call and certification process for facilitators. Identified next steps and set date for follow-up call…”

Coordination Logistics (EBSIS components: QA/QI, TA)
  Description: Support organizing/tracking program logistical needs across implementation stages—i.e., scheduling, planning, and coordinating multiple activities including fidelity verification, facilitator training, staff hiring, etc.
  Coding instruction: IS coordinates efforts or activities such as model-specific trainings, general prevention trainings, networking, or new grantee orientations.
  Example: “Aided [PDS Coordinator] in scheduling [program] facilitator training with [program developer consultant].”

Monitoring (EBSIS components: QA/QI, TA)
  Description: Observe/check the progress of implementation activities, reporting and grantee compliance activities, and maintain regular and systematic review and oversight ensuring grant recipients meet all program and funder deliverables.
  Coding instruction: IS reviews reports/documents prior to submission, asking providers to rerun reports, add information, or correct errors.
  Example: “Contacted [Coordinator] grantee about missing spreadsheet in Egrants. [Coordinator] will submit within the week.”

Resource Delivery (EBSIS components: Tools, QA/QI, TA)
  Description: Distribute, create, and update materials/tools to enhance skills, knowledge, and practices for program implementation; resources could be general prevention or program-specific tools.
  Coding instruction: IS distributes knowledge-building tools and resources to support program implementation, fidelity monitoring, and valid impact assessment.
  Example: “Emailed new fidelity tool and instructions for use to all [program grant recipients].”

Networking Coordination (EBSIS components: Relationships, QA/QI)
  Description: Facilitates interagency collaboration and peer-to-peer networking, fosters networking, and mediates introductions amongst systems and across different stakeholder agencies.
  Coding instruction: IS facilitates or coordinates PDS learning communities and communication across interactive systems.
  Example: “Phone call and email support [provider] who is setting up a county stakeholder meeting to address barriers to sustainability.”

Coaching (EBSIS component: TA)
  Description: Provide guidance or on-the-spot training to support practitioners in achieving skill-based and professional goals; enables learning and development within a mentoring context.
  Coding instruction: IS and practitioner actively engage in discussing problems and identifying solutions for barriers or concerns a site may be facing.
  Example: “Site requested TA to discuss financial roadblocks and identify problem-solving strategies.”

Note. TA = Technical Assistance; QA/QI = Quality Assurance/Quality Improvement; IS = Implementation Specialist; PDS = Prevention Delivery System.
Coaching and
networking coordination were the least common strategies.
However, coaching is often considered the overall approach
through which TA strategies take place (Nadeem et al., 2013;
Strompolis et al., 2020). Thus, it is possible that a coaching
approach was employed in delivering multiple different TA
strategies; this may be the case, particularly in TA encounters
where consultation was provided (Nadeem et al., 2013).
Although networking coordination was not a substantial facet
of TA provision in our findings, its delivery may have oc-
curred nonetheless. Networking coordination involves facil-
itating and mediating interagency collaboration, connecting systems, and fostering peer-to-peer learning communities (Leeman et al., 2015, 2017). It is possible that instances of
networking coordination were recorded via other TA reporting
metrics by the implementation specialists that were not ex-
amined in the current study. Furthermore, Leeman and
colleagues (2015) expanded on the original EBSIS frame-
work by introducing additional capacity-building strategies,
such as peer networking, that also occur within the context of
TA and implementation support, warranting further investi-
gation of this process as an implementation support strategy.
Table 1 also identifies how each TA strategy maps onto the
implementation support components described within the
EBSIS model proposed by Wandersman and colleagues
(2012). In some instances, a strategy is conceptually
mapped onto more than one EBSIS component, given the
activities being executed. While we believe this is an im-
portant area to explore further, it was beyond the scope of the
current investigation.
Our findings support existing literature on TA strategies
used to promote prevention capacity among service providers.
Consultation described activities where TA providers gave the
program provider expert advice, knowledge, or guidance
(Edmunds et al., 2013; Nadeem et al., 2013; Schoenwald et al.,
2004). Consultation is considered unidirectional because in-
formation flows from the TA provider to the program im-
plementer. In contrast, coaching involves a collaborative
approach between the TA provider and the implementer,
where guidance is provided through motivation and shared
learning (Dusenbury et al., 2010;Gunderson et al., 2018).
Coordination logistics described how TA providers organized
and tracked program logistical needs across the stages of
implementation (Nowell, 2009). Monitoring described in-
stances when TA providers gave systematic review and
oversight of implementation activities (Chalmers et al., 2003;
Saunders, 2022). Resource delivery described instances when
TA providers created, distributed, and disseminated tools/
materials relevant to program implementation and building
prevention capacity (Dunst et al., 2019; Le et al., 2016;
Yazejian et al., 2019).
Study 2: Associations Among Intervention Characteristics and Amount of Technical Assistance
Method. In the second study, the unit of analysis is the unique
EBI implementation. The unique implementation reflects a
specific cohort, or cohorts, of EBI delivery for a given human
service organization within a particular grant period. More
information about the data structure and measures developed
for this study is described below.
Data Source. Data for Study 2 initially included a sample of
310 unique EBI implementations. The implementations re-
flected 10 different EBIs delivered by 168 human service
organizations in Pennsylvania.
Measures. TA Strategy. Technical assistance strategy was
the outcome measure and was operationalized along four
dimensions—consultation, coordination logistics, monitoring,
and resource delivery. Consultation denoted encounters
through which providers gave advice, information, and expert
opinions regarding applying services or program activities.
Coordination Logistics described TA encounters through
which providers supported organizing and tracking different
elements of program activities, staff training, and sustainability
planning and networking with community partners. Monitoring
involved TA encounters during which TA providers offered over-
sight to program implementers, including observing and checking
the progress or quality of implementation, data reporting, grantee compliance, and program activities.
Table 2. Frequencies and Distribution of the Implementation Support Activity Codes

Codes                     N       %      Avg. per year M (SD)   Avg. per program M (SD)   Avg. per org M (SD)
Consultation              2969    27.2   438.1 (163.29)         285 (171.48)              25.2 (27.83)
Coordination Logistics    2682    24.5   381.1 (162.31)         248.9 (207.02)            21.5 (25.50)
Monitoring                1801    16.5   421.6 (387.67)         169.2 (184.13)            25.5 (30.80)
Resource Delivery         3080    28.2   256.6 (167.19)         283.2 (305.91)            15.8 (15.86)
Networking Coordination   273     2.5    37.4 (33.69)           25.3 (25.81)              3.6 (4.67)
Coaching                  126     1.2    18.9 (11.36)           12 (12.23)                2.4 (2.10)
Total                     10931   100    1564.3 (839.79)        1030.9 (850.37)           85.4 (94.89)

Note. Avg. = Average; Org = Human service organization.
Finally, Resource Delivery described TA encounters during which TA providers distributed materials and tools to help build skills, knowledge, and competencies and aid in best practices for implementation. The four TA strategies showed good inter-coder reliability (average κ = .88).
Prior EBI Funding. Prior EBI funding was operationalized as
how often a human service organization received funding to
deliver a particular EBI. It was a continuous variable reflecting
the number of times the organization received funding,
ranging from 1 to 11.
EBI Quality Characteristics. The EBI Quality Characteris-
tics were assessed based on EBI developer support and evi-
dence rating. EBI Developer Support was assessed using an
18-item Program Infrastructure Checklist developed by the
first and second authors in consultation with EPIS im-
plementation specialists. Initially, a list of 28 criteria was
identified; the list was then refined to 18 indicators, combining
constructs with overlapping meaning based on feedback from
the implementation specialists. Focused questions were used
to assess the presence or absence of each criterion. Responses
were coded as Yes = 1 or No = 0. For each EBI, the checklist
was completed by the first author and a program im-
plementation specialist at EPIS. The checklist items were
summed, resulting in a scale ranging from 0 to 18, with good
internal consistency (α = 0.86). A full description of the
checklist development, focused questions, and scoring rubric
is available in the supplementary materials.
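For readers who want to see the scoring mechanics, the sketch below sums a hypothetical 18-item yes/no checklist into a developer-support score and computes Cronbach's alpha; the response matrix is simulated, not the study's data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_ebis x n_items) matrix of 0/1 responses."""
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    k = items.shape[1]
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Simulated checklist: 10 EBIs x 18 yes/no indicators (1 = present).
rng = np.random.default_rng(0)
checklist = rng.integers(0, 2, size=(10, 18))

scores = checklist.sum(axis=1)  # developer-support score, 0-18 per EBI
print(scores)
print(round(cronbach_alpha(checklist), 2))
```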
EBI Evidence Rating was assessed using a modified rating
system based on Blueprints for Healthy Youth Development
clearinghouse ratings (https://www.blueprintsprograms.org/).
Evidence categories included Model Plus programs (very high
confidence, experimentally proven, and ready for scale),
Model programs (high confidence, experimentally proven, and ready for scale), and Promising programs
(moderate confidence, some experimental evidence). The
rating was recorded if the EBI was listed on Blueprints and had
an evidence rating. Additional clearinghouses were reviewed
if the EBI was not listed or rated on Blueprints: CrimeSo-
lutions (https://crimesolutions.ojp.gov), What Works Clear-
inghouse (https://ies.ed.gov/ncee/wwc/), and Results First
(https://evidence2impact.psu.edu/what-we-do/research-
translation-platform/results-first-resources/clearing-house-
database/). The evidence ratings were crosswalked and
aligned with the categories in the Blueprints model. Initially,
four levels were assigned: Model Plus (0), Model (1),
Promising (2), or Effective (3). These were then collapsed
into two categories where 0 equaled Promising, and 1
equaled Exemplar. Programs rated Model Plus or Model by
Blueprints and Effective by CrimeSolutions were coded as
Exemplar. Programs rated Promising by both Blueprints and
CrimeSolutions were coded as Promising. More details on
the evidence ratings can be found in the supplementary
materials.
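As a small illustration of the collapsing step, assuming the four crosswalked levels as plain strings:

```python
# Illustrative collapse of the crosswalked ratings into the two analysis
# categories described above; rating strings are hypothetical stand-ins.
def evidence_category(rating: str) -> int:
    """0 = Promising; 1 = Exemplar (Model Plus, Model, or Effective)."""
    return 0 if rating == "Promising" else 1

assert evidence_category("Model Plus") == 1
assert evidence_category("Promising") == 0
```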
EBI Design Characteristics. Evidence-based preventive in-
tervention design characteristics were operationalized across
four domains: prevention continuum, school-based delivery,
family-focused delivery, and group-based delivery. The
Prevention Continuum was identified using the Institute of
Medicine’s model, which includes health promotion, pre-
vention, treatment, and recovery. Programs were assigned to
one of these categories based on program information and
grantee data. The Program Continuum variable had four
levels: Universal = 0, Selective = 1, Indicated = 2, and
Treatment = 3. School-based delivery refers to programs
targeting students or teachers in a school setting versus other
settings. Family-focused programs delivered content focusing
on the family unit versus others (i.e., teachers, parents, youth).
Group-based delivery refers to programs offered to a group of
respondents versus a single individual. A table of EBI clas-
sifications across quality and design characteristics is available
via the online supplementary materials.
Analytic Strategy. Data reduction was executed following
a set of criteria established by the investigative team. The
sample was reduced to only those implementations that
received current or prior funding from the state agency.
This resulted in a reduced number of TA encounters (n = 9063), service organizations (n = 138), and unique EBI implementations (n = 241) included in the sample for Study 2. Next, hierarchical
regression analysis was conducted using the SPSS statis-
tical software version 26 (IBM Corp., 2019) for each TA
strategy. Independent variables were added in sequential
order using the forward inclusion method, with variables
entered based on the magnitude of their zero-order cor-
relations with the outcome (individually predicting the four
TA strategies). Prior EBI funding was added first (Model
1), followed by EBI quality characteristics variables
(Model 2), and finally, EBI design characteristics variables
(Model 3).
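A minimal sketch of this three-block sequence for one outcome is shown below, using statsmodels OLS rather than SPSS; all variable names are hypothetical stand-ins for the measures described above, with Universal set as the reference level for the continuum term.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("implementations.csv")  # hypothetical analytic file

# Three nested blocks mirroring Models 1-3 for a single outcome.
blocks = [
    "consultation ~ prior_funding",                                  # Model 1
    "consultation ~ prior_funding + dev_support + evidence_rating",  # Model 2
    "consultation ~ prior_funding + dev_support + evidence_rating"
    " + C(continuum, Treatment('Universal'))"
    " + school_based + family_focused + group_based",                # Model 3
]

prev_r2 = 0.0
for i, formula in enumerate(blocks, start=1):
    fit = smf.ols(formula, data=df).fit()
    print(f"Model {i}: R2 = {fit.rsquared:.2f}, "
          f"delta R2 = {fit.rsquared - prev_r2:.2f}")
    prev_r2 = fit.rsquared

print(fit.summary())  # coefficients, SEs, t, p, and 95% CIs as in Table 4
```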
Results
Table 3 contains descriptive statistical information for all
variables included in the models. The average amount of TA
delivered to a given EBI ranged from a low of 6.68 (Moni-
toring) to a high of 10.76 (Resource Delivery). On average,
human service organizations had had at least two prior ex-
periences implementing an EBI. Concerning EBI Quality, the
average number of developer-provided support events was
13.67, and more EBIs were rated as Promising (57%) than any
other category. In terms of EBI Design, the largest share of EBIs were Universal programs (41%); most EBIs were not delivered in school-based settings (78%), were not family-focused (74%), and were delivered in a group format (61%).
Relating TA Frequency to EBI Characteristics
Consultation. For Consultation, the hierarchical regression
model explained 17% of the variation in the amount of TA
delivered, R² = .17, F(9, 229) = 5.24, p < .001. Table 4 shows the model results for the hierarchical regression analysis explaining variation in TA encounters coded as Consultation. Prevention Continuum, Family-Focus, and Group-Based were all significantly associated with the amount of consultation delivered. The results showed a positive association between
the Prevention Continuum and Consultation, indicating that,
on average, EBIs for Treatment interventions received more
Consultation TA than Universal EBIs. There was also a
positive association between Family-Focused EBIs and
Consultation. Results showed that Family-Focused EBIs re-
ceived more Consultation TA than EBIs that were only parent-
or youth-focused. Last, a positive association was found
between Group-Based and Consultation. Results indicated
that, on average, EBIs delivered in group settings received
more Consultation TA than those offered to the individual.
Coordination Logistics. For Coordination Logistics, the hi-
erarchical regression model explained 28% of the variation in
the amount of TA delivered, R² = .28, F(9, 229) = 9.96, p < .001. Table 4 shows the model results for the hierarchical
regression analysis explaining variation in TA encounters
coded as Coordination Logistics. EBI Developer Support, EBI
Evidence Rating, Prevention Continuum, Family-Focus, and
Group-Based were all significantly associated with the amount
of Coordination TA provided. The results showed a positive
relationship between EBI Developer Support and Coordina-
tion Logistics, such that increases in the number of developer-
provided supports for an EBI were associated with an increased amount of Coordination Logistics TA. There was also a
negative association between EBI Evidence Rating and Co-
ordination Logistics; results indicated that EBIs of programs
rated as Exemplar received a smaller amount of Coordination
Logistics TA when compared to programs rated as Promising.
Additionally, there was a positive association between the
Prevention Continuum and Coordination Logistics. Results in-
dicated that EBIs for Treatment interventions received more
Coordination logistics TA when compared to Universal EBIs. A
negative association was found between Family-Focused and
Coordination Logistics; results indicated that, on average,
Family-Focused EBIs received less Coordination Logistics TA compared to EBIs that were only parent- or youth-focused. Last, there
was a positive association between Group-Based and Coordination Logistics, indicating that, on average, EBIs delivered in group settings received greater amounts of Coordination Logistics TA compared to those offered to the individual.
Monitoring. For Monitoring, the hierarchical regression
model explained 37% of the variation in the amount of TA
delivery, R² = .37, F(9, 229) = 14.89, p < .001. Table 4 shows the model results for the hierarchical regression analysis explaining variation in TA encounters coded as Monitoring. Only Prior EBI Funding and Family-Focus were significantly
associated with the amount of Monitoring TA provided. A
negative relationship was found between human service orga-
nization experience and Monitoring; results indicated that in-
creases in the number of times a human service organization
previously implemented an EBI were associated with a decrease in the amount of Monitoring TA delivered.
Table 3. Descriptive Statistics for Implementation Support Activities, Program Providers’ Previous EBI Funding, and Characteristics of EBI Quality and Design

Study variables                            M       SD      n      %
No. of Unique EBI Implementations^a                        241    77.7
No. of TA Encounters^b                                     9063   82.9
No. of Service Organizations^c                             114    67.9
Implementation Support Practice
  Consultation                             10.29   9.22    2481   27.38
  Coordination-Logistics                   9.87    7.71    2379   26.25
  Monitoring                               6.68    6.91    1611   17.78
  Resource Delivery                        10.76   12.72   2592   28.60
Predictor variables
  Service Organization EBI Experience
    No. of grants received                 2.09    1.17
  EBI Quality Characteristics
    EBI developer supports                 13.67   1.63
    EBI evidence rating
      Promising                                            138    57.26
      Exemplar                                             103    42.74
  EBI Design Characteristics
    Prevention continuum
      Universal                                            99     41.08
      Selected                                             72     29.88
      Indicated                                            29     12.03
      Treatment                                            41     17.01
    School-based^d
      Not school-based                                     189    78.42
      School-based                                         50     20.75
    Family-focused
      Not family-focused                                   179    74.27
      Family-focused                                       62     25.73
    Group-based
      Not delivered to group                               92     38.17
      Delivered to group                                   149    61.83

Note. No. = Number, TA = Technical Assistance, EBI = Evidence-based preventive intervention.
a Total number of unique EBI implementations; reflects the sample n for the regression models. Percentage is calculated out of the total count of implementations, N = 310. This n also serves as the denominator for calculating percentages for the EBI characteristics.
b Total number of coded contacts across EBI implementations. Percentage is calculated out of the total count of coded contacts, N = 10931.
c Total number of service organizations delivering EBIs. Percentage is calculated out of the count of organizations initially coded, N = 168.
d Two implementations could not be coded to indicate if they were School-based.
There was a positive association between Family-Focus and Monitoring; results showed that Family-Focused EBIs received more Monitoring TA than EBIs that were only parent- or youth-focused.
Resource Delivery. The hierarchical regression model
explained 43% of the variation in Resource Delivery, R² = .43, F(9, 229) = 19.12, p < .001. Table 4 shows the model
results for the hierarchical regression analysis explaining
variation in TA encounters coded as Resource Delivery.
Prior EBI Funding, EBI Level of Evidence Rating, and
Prevention Continuum were all significantly associated
with the amount of Resource Delivery TA provided. A
negative relationship was found between human service
organization experience and Resource Delivery. Results
indicated that increases in the number of times a human service organization previously implemented an EBI were associated with less Resource Delivery.
Table 4. Parameter Estimates for Final Regression Models for Consultation, Coordination-Logistics, Monitoring, and Resource Delivery Regressed on Organization EBI Experience, EBI Supports

Consultation
Variables            B        SE B    β       t       p      95% CI [LL, UL]
Model 1
  # PCCD grants^a    −0.62    0.51    −.08    −1.21   .229   [−1.63, 0.39]
Model 2
  EBI Supports^b      2.36    1.48     .42     1.59   .112   [−0.56, 5.29]
  EBI Rating^c       −1.12    2.87    −.06    −0.39   .698   [−6.77, 4.54]
Model 3
  Selected^d          4.86    2.77     .24     1.75   .081   [−0.60, 10.33]
  Indicated^d         3.40    3.46     .12     0.98   .326   [−3.41, 10.22]
  Treatment^d        12.71    5.25     .52     2.42   .016   [2.37, 23.05]
  School-Based^e      1.44    3.20     .06     0.45   .652   [−4.86, 7.75]
  Family-Focus^f      5.33    2.56     .25     2.08   .038   [0.29, 10.37]
  Group-Based^g      15.43    7.10     .82     2.17   .031   [1.44, 29.43]

Coordination Logistics
Model 1
  # PCCD grants^a     0.54    0.40     .08     1.34   .182   [−0.25, 1.32]
Model 2
  EBI Supports^b      2.83    1.15     .60     2.45   .015   [0.56, 5.11]
  EBI Rating^c       −7.98    2.23    −.51    −3.58   .000   [−12.38, −3.59]
Model 3
  Selected^d          4.15    2.16     .25     1.92   .056   [−0.10, 8.40]
  Indicated^d         4.46    2.69     .19     1.66   .099   [−0.84, 9.76]
  Treatment^d        23.41    4.08    1.15     5.74   .000   [15.37, 31.45]
  School-Based^e     −4.19    2.49    −.22    −1.68   .093   [−9.09, 0.71]
  Family-Focus^f     −7.54    1.99    −.43    −3.79   .000   [−11.46, −3.62]
  Group-Based^g      26.37    5.53    1.67     4.77   .000   [15.49, 37.26]

Monitoring
Model 1
  # PCCD grants^a    −0.67    0.34    −.11    −1.98   .048   [−1.33, 0.00]
Model 2
  EBI Supports^b     −0.34    0.97    −.08    −0.35   .724   [−2.25, 1.57]
  EBI Rating^c       −2.09    1.87    −.15    −1.11   .267   [−5.78, 1.61]
Model 3
  Selected^d          1.90    1.81     .13     1.05   .296   [−1.67, 5.47]
  Indicated^d         0.00    2.26     .00     0.00   .999   [−4.46, 4.45]
  Treatment^d         0.96    3.43     .05     0.28   .780   [−5.80, 7.72]
  School-Based^e      0.95    2.09     .06     0.45   .650   [−3.17, 5.07]
  Family-Focus^f      7.81    1.67     .50     4.67   .000   [4.52, 11.11]
  Group-Based^g       3.20    4.64     .23     0.69   .491   [−5.95, 12.35]

Resource Delivery
Model 1
  # PCCD grants^a    −1.92    0.59    −.18    −3.25   .001   [−3.08, −0.76]
Model 2
  EBI Supports^b      1.30    1.70     .17     0.76   .447   [−2.06, 4.65]
  EBI Rating^c      −11.38    3.29    −.44    −3.46   .001   [−17.87, −4.90]
Model 3
  Selected^d         −3.41    3.18    −.12    −1.07   .285   [−9.68, 2.86]
  Indicated^d        −3.77    3.97    −.10    −0.95   .343   [−11.60, 4.05]
  Treatment^d        18.04    6.02     .53     3.00   .003   [6.18, 29.91]
  School-Based^e      2.12    3.67     .07     0.58   .564   [−5.11, 9.36]
  Family-Focus^f      5.50    2.94     .19     1.87   .062   [−0.29, 11.29]
  Group-Based^g      15.97    8.15     .61     1.96   .051   [−0.10, 32.03]

Note. ***p < 0.001, **p < 0.01, *p < 0.05, +p < 0.10. Bolded values in the published table signify the significant parameters in the regression model for ease of visibility.
a Number of grants received: 1 = 1 grant received, 2 = 2 grants received, 3 = 3 grants received, 4 = 4 or more grants received.
b EBI Developer Supports Score is a continuous variable ranging from 0 to 18; however, for all programs included in the analysis, range = 11–16.
c EBI Evidence Rating: 0 = Promising, 1 = Exemplar.
d Reference group = Universal programs.
e School-Based delivery setting: 0 = Not School-Based, 1 = School-Based Program.
f Family-Focused delivery: 0 = Not Family-Focused, 1 = Family-Focused Program.
g Group-Based: 0 = Single unit or individual, 1 = Program Delivered to Group.
There was also a
negative association between EBI Evidence Rating and
Resource Delivery; results indicated that EBIs of programs
rated as Exemplar received less Resource Delivery than
programs rated as Promising. Lastly, there was a positive
association between the Prevention Continuum and Re-
source Delivery, with EBIs for Treatment interventions
receiving more Resource Delivery than Universal EBIs.
Discussion
In Study 2, the findings generally supported our hypotheses
regarding EBI quality and design characteristics and the amount
of TA provided. Differential patterns within these associations
emerged across the different TA strategies examined. For ex-
ample, more EBI experience within human service organiza-
tions was associated with less TA for Monitoring and Resource
Delivery but not for Consultation or Coordination Logistics.
Prior research has demonstrated that experience implementing
an EBI is associated with greater innovation-specific capacity (Bergling et al., 2022; Dogherty et al., 2013); however, given
our findings, more work is warranted to understand the specific
capacity needs of program implementers with previous expe-
riences implementing EBIs. With respect to EBI Characteris-
tics, the results support the notion that features of the EBI design
and quality contribute to how much TA is delivered to support
quality implementation (Damschroder et al., 2009, 2015).
However, these associations varied along several dimensions.
For instance, EBIs with higher evidence ratings received less
TA, but this finding was only evident for Coordination Logistics
and Resource Delivery. Associations between EBI design characteristics and the amount of TA varied across types of TA strategy and the specific design features (e.g., Family-Focused, Group-Based, Prevention Continuum). Implementing programs in schools
was the only feature that did not show significant associations
with any of the TA strategy outcomes. These findings provide a
first look into how EBI characteristics relate to the amount of
TA provided for different TA strategies. More research is
needed to unpack the complex relationships among EBI
characteristics, TA dosage, and TA strategy in determining how
much of what is needed to support high-quality im-
plementation, given the different features of the intervention.
General Discussion
Research suggests that TA contributes positively to individuals’ and organizations’ capacity to deliver and sustain EBIs
effectively. What is lacking, however, are studies delineating
which types and quantities of TA are most effective and for
whom. The current study represents a first step in “unpacking”
some of these specifics regarding TA. The study’s overall goal
was to develop a method of classifying and empirically ex-
amining the amount of TA delivered as it was tailored to fit the needs of individuals, organizations, and programs.
Study Limitations and Future Considerations
Some limitations warrant consideration. First, TA contact
notes were often brief or offered limited information about the
encounter. In addition, notes provided by implementation
specialists varied in their level of detail. Thus, coding TA
strategies may not have captured the entire nature or quality of the TA encounters. This was especially true in the case of
coaching. Second, this study did not include other essential
dimensions along which TA can vary. For example, TA
provision can vary extensively across dimensions such as
dosage, frequency, modality (Feinberg et al., 2008; Le et al.,
2016;Scott et al., 2022), content focus, and knowledge
brokering (Baumgartner et al., 2018; Moreland-Russell et al.,
2018;Williams et al., 2017). Indices of modality, dosage, and
dissemination to groups versus individuals were not included
in the current analyses. Still, these may be important con-
siderations to clarify how much TA is delivered to whom and
under what conditions.
Another limitation of the study is that we did not link TA
strategies to actual implementation outcomes, such as program
fidelity, quality of delivery, or provider capacity. A future step in understanding these relations would be to examine the effectiveness of TA strategies by delineating how such strategies
improve the motivation and capacity of practitioners who are
implementing EBIs (Wandersman & Scheier, 2024). Our
findings do, however, add value to the discussion on facets of
TA employed by support systems. Further research is also
needed to unpack the mechanisms by which support is delivered and taken up by programs. For example, the logic
model provided by Wandersman and Scott (2022) may pro-
vide a means to identify the processes of TA delivery, TA
recipient outcomes, and, ultimately, the impact of TA on
organizational and community outcomes.
Although these limitations can potentially influence the
interpretation of the study findings, the study has several
strengths. First, direct links between TA and EBI char-
acteristics have rarely been explored. To our knowledge,
this study is the first to provide empirical data linking
specific aspects of TA to the features of the EBIs, shedding
some light on how those features play a role in determining
how much TA to provide. Second, this study represents a
mixed-method design reflecting the coding of over 10,000
archival TA encounters starting from 2011 and continuing
through 2017. In addition, TA encounters reflect those
across more than 200 program implementations for 10
different EBIs. The richness of the dataset presents an
opportunity for future investigations to further unpack TA
delivery along multiple dimensions across various facets
that occur within the implementation context.
Implications for Science, Practice, and Policy
The results of this study indicate that a support system may
employ multiple facets of implementation support to aid
program providers in implementing EBIs. Previous research
situates organizations operating as support systems as an
opportune context for determining, examining, and building
an evidence base for implementation support mechanisms
(Franks, 2010; Franks & Bory, 2017). Our findings lead us to
offer several recommendations for researchers and evaluators
of TA, training and TA funders, and TA providers.
Implications for Researchers and Evaluators of Training and
Technical Assistance. First, we recommend further investiga-
tion into the unique characteristics of EBIs and the specific
types of TA strategies offered to individuals and organizations.
Specifically, using the CFIR may help to illuminate how
different intervention context factors and EBI implementation
indices may be related to the TA needs of human service
provider organizations. Additionally, challenging EBI im-
plementations require exploring methods that improve access
to a broad range of TA strategies and how TA provision occurs
within support systems. In this respect, further investigations
unpacking the EBSIS components are also warranted. We
recommend further empirical validation and measurement of
the specific types of TA strategies employed, with consider-
ation of how different facets of TA are distinct from the
different facets of training, tools, and QA/QI (Wandersman
et al., 2012).
Furthermore, Wandersman and colleagues (2012) note that
the cyclical process of implementation support is embedded
within interpersonal relationships. Thus, ensuring open
communication, collaboration, and trust among stakeholders,
including funders, practitioners, TA providers, researchers,
evaluators, and consumers, might be a key mechanism for the
effective delivery and uptake of TA. Some studies have found
that the relationship between TA providers and program
implementers helps determine the effectiveness of the TA
provided (Chilenski et al., 2016, 2018). Therefore, relation-
ships are essential for the success of implementation support.
It may be beneficial to examine the quality of the relationship
between TA providers and program implementers as an ad-
ditional facet of implementation support or even a mediating
mechanism through which TA strategies are effectively
delivered.
Implications for Training and Technical Assistance Funders. With
increasing demand for EBI support and limited resources to
support TA operations, it is essential to identify effective and
efficient TA approaches that best fit the organizational and
implementation contexts of EBIs. The implications are mul-
tifaceted and significant for funders of technical assistance and
prevention programs. First, we recommend explicit invest-
ment in training and technical assistance to support EBI
implementation. Investing in TA centers demonstrates a
commitment to proactive measures rather than reactive so-
lutions, highlighting a strategic approach to addressing im-
plementation challenges. This investment supports the
development and implementation of innovative solutions and
fosters a culture of prevention, potentially reducing long-term
costs associated with addressing the aftermath of issues that arise with implementation fidelity and poor context fit. Fur-
thermore, funders play a critical role in shaping the landscape
of program implementation, influencing priorities, and en-
couraging collaboration among stakeholders. Effective
funding strategies can enhance program efficiency, ensure the
sustainability of initiatives, and maximize impact. However, it
also requires funders to undertake rigorous due diligence,
maintain flexibility to adapt to emerging challenges, and
commit to ongoing evaluation to measure the impact of TA in
supporting EBIs’success.
Investing in technical assistance has far-reaching impli-
cations beyond the immediate beneficiaries. Resourcing TA
can help create a more comprehensive and coordinated ap-
proach to addressing complex multi-leveled implementations
that address various levels of systems (e.g., families, schools,
and communities). Funding initiatives can also facilitate cross-
sectoral cooperation and promote systemic change by pro-
moting collaboration among stakeholders within the ISF.
Additionally, funders can leverage their investments to attract
other key players, such as private sector partners or govern-
ment agencies, further amplifying the impact of their funding.
Implications for Training and Technical Assistance
Providers. Coordinated TA is necessary for program implementers. This includes training sessions and ongoing support that help implementers use the intervention effectively and understand its features, as well as troubleshooting issues that arise during program implementation. Moreover, technical assistance plays a crucial role in promoting program scalability and adaptability. With continuous guidance and support, prevention programs can more readily be scaled up to reach larger audiences and adapted to meet the unique needs of different communities or populations. In addition, a technical assistance infrastructure can facilitate stronger data collection and evaluation
efforts. We recommend that TA providers proactively develop
accurate and timely data collection procedures and facilitate
data analysis and reporting. These processes should be incor-
porated into daily operations for TA support systems. This not
only helps in measuring the impact of TA but also allows for
continuous improvement and refinement.
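As an illustration of how such routine logging and reporting might work in practice, the minimal sketch below (in Python) records TA encounters and summarizes the percentage of encounters devoted to each of the four strategies identified in this study. It is a hypothetical example under stated assumptions, not the data system used here: the TAEncounter record, its field names, and the strategy_report function are illustrative inventions, and only the four strategy labels come from the study itself.

# Minimal sketch of routine TA-encounter logging and reporting.
# The record structure and field names are hypothetical; the four
# strategy labels mirror the categories reported in this study.
from collections import Counter
from dataclasses import dataclass
from datetime import date

STRATEGIES = ("Consultation", "Coordination Logistics", "Monitoring", "Resource Delivery")

@dataclass
class TAEncounter:
    provider_id: str        # hypothetical identifiers, for illustration only
    organization_id: str
    strategy: str           # must be one of STRATEGIES
    minutes: int            # time spent on the encounter
    occurred_on: date

    def __post_init__(self) -> None:
        # Reject records that fall outside the coding scheme.
        if self.strategy not in STRATEGIES:
            raise ValueError(f"Unknown TA strategy: {self.strategy}")

def strategy_report(encounters):
    """Return the percentage of logged encounters devoted to each strategy."""
    counts = Counter(e.strategy for e in encounters)
    total = sum(counts.values())
    if total == 0:
        return {}
    return {s: round(100 * counts.get(s, 0) / total, 1) for s in STRATEGIES}

# Example: two logged encounters yield a 50/50 split across two strategies.
log = [
    TAEncounter("ta01", "org07", "Consultation", 45, date(2024, 3, 4)),
    TAEncounter("ta01", "org07", "Monitoring", 20, date(2024, 3, 11)),
]
print(strategy_report(log))

Incorporated into daily operations, even a simple tally of this kind would let a TA support system report, at any point, how its effort is distributed across strategies and how that distribution shifts as implementations mature.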
Furthermore, analyzing TA activity can aid in addressing
potential barriers or challenges that may hinder successful
program implementation. By working closely with implementers and users, TA providers can identify roadblocks or issues and offer solutions or alternative approaches. This
promotes a collaborative and problem-solving approach, en-
suring that prevention programs are effective and sustainable.
Conclusion
In conclusion, our findings support the notion that TA delivery
is multifaceted and responsive to the implementing context.
Ultimately, TA should be regarded as an ongoing process that
adapts to changing provider needs, implementation contexts,
and program characteristics. We conclude that different EBI
characteristics contribute to the amount of TA delivered to
human service organizations to support implementation ca-
pacity. Unsurprisingly, our data indicate that EBIs that are more complex and challenging to implement require additional technical support and resources. Policymakers and prevention
support systems can leverage this research to aid decision-
making regarding methods for efficient allocation of resources
to promote EBI adoption, dissemination, and sustainability.
Further, our findings could be used to inform future research
initiatives focused on refining the measurement and oper-
ationalization of implementation support, and specifically TA
strategies, as they are executed within and across prevention
support systems. We hope this research will contribute to a
better understanding of TA metrics and conceptualizations and
their potential to support the quality implementation of EBIs.
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to
the research, authorship, and/or publication of this article.
Funding
The author(s) disclosed receipt of the following financial support for
the research, authorship, and/or publication of this article: We
gratefully acknowledge funding from the Social Science Research Institute at The Pennsylvania State University (Welsh, Chilenski, Gayles).
ORCID iD
Jochebed G. Gayles https://orcid.org/0000-0001-8182-4826
Supplemental Material
Supplemental material for this article is available online.
References
Aarons, G. A., Sommerfeld, D. H., & Walrath-Greene, C. M. (2009). Evidence-based practice implementation: The impact of public versus private sector organization type on organizational support, provider attitudes, and adoption of evidence-based practice. Implementation Science: IS, 4(1), 83. https://doi.org/10.1186/1748-5908-4-83
Albers, B., Metz, A., Burke, K., Bührmann, L., Bartley, L., Driessen, P., & Varsi, C. (2021). Implementation support skills: Findings from a systematic integrative review. Research on Social Work Practice, 31(2), 147–170. https://doi.org/10.1177/1049731520967419
Bailey, D. B., Simeonsson, R. J., Winton, P. J., Huntington, G. S., Comfort, M., Isbell, P., O'Donnell, K. J., & Helm, J. M. (1986). Family-focused intervention: A functional model for planning, implementing, and evaluating individualized family services in early intervention. Journal of Early Intervention, 10(2), 156–171. https://doi.org/10.1177/105381518601000207
Bauer, M. S., & Kirchner, J. A. (2020). Implementation science: What is it and why should I care? Psychiatry Research, 283, Article 112376. https://doi.org/10.1016/J.PSYCHRES.2019.04.025
Baumgartner, S., Cohen, A., & Meckstroth, A. (2018). Providing TA
to local programs and communities: Lessons from a scan of
initiatives offering TA to human services programs. US De-
partment of Health and Human Services, Office of the Assistant
Secretary for Planning and Evaluation.
Bergling, E., Pendleton, D., Shore, E., Harpin, S., Whitesell, N., & Puma, J. (2022). Implementation factors and teacher experience of the integrated nutrition education program: A mixed methods program evaluation. Journal of School Health, 92(5), 493–503. https://doi.org/10.1111/josh.13153
Brownson, R. C., Fielding, J. E., & Green, L. W. (2018). Building capacity for evidence-based public health: Reconciling the pulls of practice and the push of research. Annual Review of Public Health, 39(1), 27–53. https://doi.org/10.1146/annurev-publhealth-040617-014746
Chalmers, M. L., Housemann, R. A., Wiggs, I., Newcomb-Hagood, L., Malone, B., & Brownson, R. C. (2003). Process evaluation of a monitoring log system for community coalition activities: Five-year results and lessons learned. American Journal of Health Promotion: AJHP, 17(3), 190–196. https://doi.org/10.4278/0890-1171-17.3.190
Chilenski, S. M., Perkins, D. F., Olson, J., Hoffman, L., Feinberg, M. E., Greenberg, M., Welsh, J., Crowley, D. M., & Spoth, R. (2016). The power of a collaborative relationship between technical assistance providers and community prevention teams: A correlational and longitudinal study. Evaluation and Program Planning, 54(6), 19–29. https://doi.org/10.1016/j.evalprogplan.2015.10.002
Chilenski, S. M., Welsh, J., Olson, J., Hoffman, L., Perkins, D. F., & Feinberg, M. E. (2018). Examining the highs and lows of the collaborative relationship between technical assistance providers and prevention implementers. Prevention Science: The Official Journal of the Society for Prevention Research, 19(2), 250–259. https://doi.org/10.1007/S11121-017-0812-2
Chinman, M., Ebener, P., Malone, P. S., Cannon, J., D'Amico, E. J., & Acosta, J. (2018). Testing implementation support for evidence-based programs in community settings: A replication cluster-randomized trial of Getting To Outcomes®. Implementation Science: IS, 13(1), 131. https://doi.org/10.1186/s13012-018-0825-7
Chinman, M., Hunter, S. B., Ebener, P., Paddock, S. M., Stillman, L.,
Imm, P., & Wandersman, A. (2008). The getting to outcomes
demonstration and evaluation: An illustration of the prevention
support system. American Journal of Community Psychology,
41(3-4), 206–224. https://doi.org/10.1007/s10464-008-9163-2
Codding, R. S., Collier-Meek, M., & DeFouw, E. (2022). Treatment integrity and intensity: Critical considerations for delivering individualized interventions. https://doi.org/10.1108/s0735-004x20220000032006
Damschroder, L., Hall, C., Gillon, L., Reardon, C., Kelley, C.,
Sparks, J., & Lowery, J. (2015). The consolidated framework for
implementation research (CFIR): Progress to date, tools and
resources, and plans for the future. Implementation Science,
10(S1), A12. https://doi.org/10.1186/1748-5908-10-S1-A12
Damschroder, L. J., Aron, D. C., Keith, R. E., Kirsh, S. R., Alexander, J. A., & Lowery, J. C. (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science: IS, 4(1), 50. https://doi.org/10.1186/1748-5908-4-50
Dogherty, E. J., Harrison, M. B., Graham, I. D., Vandyk, A. D., & Keeping-Burke, L. (2013). Turning knowledge into action at the point-of-care: The collective experience of nurses facilitating the implementation of evidence-based practice. Worldviews on Evidence-Based Nursing, 10(3), 129–139. https://doi.org/10.1111/WVN.12009
Dunst, C. J., Annas, K., Wilkie, H., & Hamby, D. W. (2019). Review of the effects of technical assistance on program, organization and system change. International Journal of Evaluation and Research in Education, 8(2), 330–343. https://doi.org/10.11591/ijere.v8i2.17978
Durlak, J. A. (2013). The importance of quality implementation for research, practice, and policy. ASPE Research Brief, 1–16. https://aspe.hhs.gov/reports/importance-quality-implementation-research-practice-policy-0
Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41(3-4), 327–350. https://doi.org/10.1007/s10464-008-9165-0
Dusenbury, L., Hansen, W. B., Jackson-Newsom, J., Pittman, D. S., Wilson, C. V., Simley, K., Ringwalt, C., Pankratz, M., & Giles, S. M. (2010). Coaching to enhance quality of implementation in prevention. American Journal of Health Education, 110(1), 43–60. https://doi.org/10.1108/09654281011008744
Edmunds, J. M., Beidas, R. S., & Kendall, P. C. (2013). Dissemination and implementation of evidence-based practices: Training and consultation as implementation strategies. Clinical Psychology: A Publication of the Division of Clinical Psychology of the American Psychological Association, 20(2), 152–165. https://doi.org/10.1111/CPSP.12031
Eisman, A. B., Kilbourne, A. M., Greene, D., Walton, M., & Cunningham, R. (2020). The user-program interaction: How teacher experience shapes the relationship between intervention packaging and fidelity to a state-adopted health curriculum. Prevention Science: The Official Journal of the Society for Prevention Research, 21(6), 820–829. https://doi.org/10.1007/s11121-020-01120-8
Eisman, A. B., Palinkas, L. A., Brown, S., Lundahl, L., & Kilbourne,
A. M. (2022). A mixed methods investigation of im-
plementation determinants for a school-based universal pre-
vention intervention. Implementation Research and Practice,
3(190), Article 26334895221124962. https://doi.org/10.1177/
26334895221124962
Elliott, J. O. (2016). Psychosocial interventions for mental and substance use disorders: A framework for establishing evidence-based standards, by Institute of Medicine (IOM). Journal of Social Work Practice in the Addictions, 16(3), 323–324. https://doi.org/10.1080/1533256x.2016.1199840
Engell, T., Løvstad, A. M., Kirkøen, B., Ogden, T., & Amlund Hagen, K. (2021). Exploring how intervention characteristics affect implementability: A mixed methods case study of common elements-based academic support in child welfare services. Children and Youth Services Review, 129, Article 106180. https://doi.org/10.1016/J.CHILDYOUTH.2021.106180
Fagan, A. A., Bumbarger, B. K., Barth, R. P., Bradshaw, C. P., Cooper, B. R., Supplee, L. H., & Walker, D. K. (2019). Scaling up evidence-based interventions in US public systems to prevent behavioral health problems: Challenges and opportunities. Prevention Science: The Official Journal of the Society for Prevention Research, 20(8), 1147–1168. https://doi.org/10.1007/S11121-019-01048-8
Fagan, A. A., Hanson, K., Briney, J. S., & Hawkins, J. D. (2012). Sustaining the utilization and high quality implementation of tested and effective prevention programs using the Communities that Care prevention system. American Journal of Community Psychology, 49(3-4), 365–377. https://doi.org/10.1007/s10464-011-9463-9
Fallon, L. M., DeFouw, E. R., Cathcart, S. C., Berkman, T. S., Robinson-Link, P., O'Keeffe, B. V., & Sugai, G. (2022). School-based supports and interventions to improve social and behavioral outcomes with racially and ethnically minoritized youth: A review of recent quantitative research. Journal of Behavioral Education, 31(1), 123–156. https://doi.org/10.1007/s10864-021-09436-3
Farrell, A. F., Collier-Meek, M. A., & Furman, M. J. (2019). Supporting out-of-school time staff in low resource communities: A professional development approach. American Journal of Community Psychology, 63(3-4), 378–390. https://doi.org/10.1002/AJCP.12330
Feinberg, M. E., Greenberg, M. T., & Osgood, D. W. (2004). Technical assistance in prevention programs: Correlates of perceived need in communities that care. Evaluation and Program Planning, 27(3), 263–274. https://doi.org/10.1016/j.evalprogplan.2004.04.001
Feinberg, M. E., Ridenour, T. A., & Greenberg, M. T. (2008). The longitudinal effect of technical assistance dosage on the functioning of Communities that Care prevention boards in Pennsylvania. The Journal of Primary Prevention, 29(2), 145–165. https://doi.org/10.1007/s10935-008-0130-3
Fixsen, D. L., Blase, K., Metz, A., & Van Dyke, M. (2013). Statewide implementation of evidence-based programs. Exceptional Children, 79(3), 213–230. https://doi.org/10.1177/001440291307900206
Fixsen, D. L., Blase, K. A., Naoom, S. F., & Wallace, F. (2009). Core
implementation components. Research on Social Work Practice,
19(5), 531–540. https://doi.org/10.1177/1049731509335549
Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature (FMHI Publication #231). National Implementation Research Network, Frank Porter Graham Child Development Institute.
Flaspohler, P., Duffy, J., Wandersman, A., Stillman, L., & Maras, M. A. (2008). Unpacking prevention capacity: An intersection of research-to-practice models and community-centered models. American Journal of Community Psychology, 41(3-4), 182–196. https://doi.org/10.1007/S10464-008-9162-3
Fleiss, J. L., Cohen, J., & Everitt, B. S. (1969). Large sample standard
errors of kappa and weighted kappa. Psychological Bulletin,
72(5), 323–327. https://doi.org/10.1037/h0028106
Franks, R. P. (2010). Role of the intermediary organization in promoting and disseminating best practices for children and youth: The Connecticut center for effective practice. Emotional Behav Disord Youth, 10(4), 87–93. https://civicresearchinstitute.com/online/article_abstract.php?pid=5&iid=188&aid=1284
Franks, R. P., & Bory, C. T. (2017). Strategies for developing intermediary organizations: Considerations for practice. Families in Society: The Journal of Contemporary Social Services, 98(1), 27–34. https://doi.org/10.1606/1044-3894.2017.6
Glasgow, R. E., Lichtenstein, E., & Marcus, A. C. (2003). Why don't we see more translation of health promotion research to practice? Rethinking the efficacy-to-effectiveness transition. American Journal of Public Health, 93(8), 1261–1267. https://doi.org/10.2105/AJPH.93.8.1261
Green, L. W., & Mercer, S. L. (2001). Can public health researchers and agencies reconcile the push from funding bodies and the pull from communities? American Journal of Public Health, 91(12), 1926–1929. https://doi.org/10.2105/AJPH.91.12.1926
Gunderson, L. M., Willging, C. E., Trott Jaramillo, E. M., Green, A. E., Fettes, D. L., Hecht, D. B., & Aarons, G. A. (2018). The good coach: Implementation and sustainment factors that affect coaching as evidence-based intervention fidelity support. Journal of Children's Services, 13(1), 1–17. https://doi.org/10.1108/JCS-09-2017-0043
Hawkins, J. D., Jenson, J. M., Catalano, R., Fraser, M. W., Botvin, G. J., Shapiro, V., Brown, C. H., Beardslee, W., Brent, D., Leslie, L. K., Rotheram-Borus, M. J., Shea, P., Shih, A., Anthony, E., Haggerty, K. P., Bender, K., Gorman-Smith, D., Casey, E., & Stone, S. (2015). Unleashing the power of prevention. NAM Perspectives. Discussion Paper. Washington, DC: National Academy of Medicine.
Hunter, S. B., Chinman, M., Ebener, P., Imm, P., Wandersman, A., &
Ryan, G. W. (2009a). Technical assistance as a prevention
capacity-building tool: A demonstration using the getting to
outcomes framework. Health Education & Behavior: The Of-
ficial Publication of the Society for Public Health Education,
36(5), 810–828. https://doi.org/10.1177/1090198108329999
Hunter, S. B., Paddock, S. M., Ebener, P., Burkhart, A. K., & Chinman, M. (2009b). Promoting evidence-based practices: The adoption of a prevention support system in community settings. Journal of Community Psychology, 37(5), 579–593. https://doi.org/10.1002/JCOP.20316
Katz, J., & Wandersman, A. (2016). Technical assistance to enhance prevention capacity: A research synthesis of the evidence base. Prevention Science: The Official Journal of the Society for Prevention Research, 17(4), 417–428. https://doi.org/10.1007/S11121-016-0636-5
Kirk, M. A., Kelley, C., Yankey, N., Birken, S. A., Abadie, B., & Damschroder, L. (2016). A systematic review of the use of the consolidated framework for implementation research. Implementation Science: IS, 11(1), 72. https://doi.org/10.1186/S13012-016-0437-Z
Le, L. T., Anthony, B. J., Bronheim, S. M., Holland, C. M., & Perry, D. F. (2016). A technical assistance model for guiding service and systems change. The Journal of Behavioral Health Services & Research, 43(3), 380–395. https://doi.org/10.1007/S11414-014-9439-2
Leeman, J., Calancie, L., Hartman, M. A., Escoffery, C. T., Herrmann, A. K., Tague, L. E., Moore, A. A., Wilson, K. M., Schreiner, M., & Samuel-Hodge, C. (2015). What strategies are used to build practitioners' capacity to implement community-based interventions and are they effective? A systematic review. Implementation Science: IS, 10(1), 80. https://doi.org/10.1186/S13012-015-0272-7
Leeman, J., Myers, A., Grant, J. C., Wangen, M., & Queen, T. L. (2017). Implementation strategies to promote community-engaged efforts to counter tobacco marketing at the point of sale. Translational Behavioral Medicine, 7(3), 405–414. https://doi.org/10.1007/s13142-017-0489-x
McHugh, M. L. (2012). Interrater reliability: The kappa statistic. Biochemia Medica, 22(3), 276–282. https://doi.org/10.11613/bm.2012.031
Metz, A., Albers, B., Burke, K., Bartley, L., Louison, L., Ward, C., & Farley, A. (2021). Implementation practice in human service systems: Understanding the principles and competencies of professionals who support implementation. Human Service Organizations: Management, Leadership & Governance, 45(3), 1–22. https://doi.org/10.1080/23303131.2021.1895401
Mitchell, R. E., Stone-Wiggins, B., Stevenson, J. F., & Florin, P. (2004). Cultivating capacity: Outcomes of a statewide support system for prevention coalitions. Journal of Prevention & Intervention in the Community, 27(2), 67–87. https://doi.org/10.1300/J005V27N02_05
Moreland-Russell, S., Adsul, P., Nasir, S., Fernandez, M. E., Walker, T. J., Brandt, H. M., Vanderpool, R. C., Pilar, M., Cuccaro, P., Norton, W. E., Vinson, C. A., Chambers, D. A., & Brownson, R. C. (2018). Evaluating centralized technical assistance as an implementation strategy to improve cancer prevention and control. Cancer Causes & Control, 29(12), 1221–1230. https://doi.org/10.1007/s10552-018-1108-y
Nadeem, E., Gleacher, A., & Beidas, R. S. (2013). Consultation as an implementation strategy for evidence-based practices across multiple contexts: Unpacking the black box. Administration and Policy in Mental Health, 40(6), 439–450. https://doi.org/10.1007/s10488-013-0502-8
Nowell, B. (2009). Profiling capacity for coordination and systems change: The relative contribution of stakeholder relationships in interorganizational collaboratives. American Journal of Community Psychology, 44(3-4), 196–212. https://doi.org/10.1007/S10464-009-9276-2
Powell, B. J., Bosk, E. A., Wilen, J. S., Danko, C. M., Van Scoyoc, A., & Banman, A. (2015). Evidence-based programs in "real world" settings: Finding the best fit. In Advances in Child Abuse Prevention Knowledge (Vol. 5, pp. 145–177). Springer. https://doi.org/10.1007/978-3-319-16327-7_7
Proctor, E., Silmere, H., Raghavan, R., Hovmand, P., Aarons, G., Bunger, A., Griffey, R., & Hensley, M. (2011). Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Administration and Policy in Mental Health, 38(2), 65–76. https://doi.org/10.1007/s10488-010-0319-7
Rhoades, B. L., Bumbarger, B. K., & Moore, J. E. (2012). The role of
a state-level prevention support system in promoting high-
quality implementation and sustainability of evidence-based
programs. American Journal of Community Psychology,
50(3-4), 386–401. https://doi.org/10.1007/S10464-012-9502-1
Saunders, R. P. (2022). Implementation monitoring and process
evaluation. Sage. https://doi.org/10.4135/9781071878736
Schoenwald, S. K., Sheidow, A. J., & Letourneau, E. J. (2004). Toward effective quality assurance in evidence-based practice: Links between expert consultation, therapist fidelity, and child outcomes. Journal of Clinical Child and Adolescent Psychology, 33(1), 94–104. https://doi.org/10.1207/S15374424JCCP3301_10
Scott, V. C., Jillani, Z., Malpert, A., Kolodny-Goetz, J., & Wandersman, A. (2022). A scoping review of the evaluation and effectiveness of technical assistance. Implementation Science Communications, 3(1), 1–16. https://doi.org/10.1186/S43058-022-00314-1
Strompolis, M., Cain, J. M., Wilson, A., Aldridge, W. A., Armstrong, J. M., & Srivastav, A. (2020). Community capacity coach: Embedded support to implement evidence-based prevention. Journal of Community Psychology, 48(4), 1132–1146. https://doi.org/10.1002/JCOP.22375
Wandersman, A. (2009). Four keys to success (Theory, implementation, evaluation, and resource/system support): High hopes and challenges in participation. American Journal of Community Psychology, 43(1-2), 3–21. https://doi.org/10.1007/s10464-008-9212-x
Wandersman, A., Chien, V. H., & Katz, J. (2012). Toward an evidence-based system for innovation support for implementing innovations with quality: Tools, training, technical assistance, and quality assurance/quality improvement. American Journal of Community Psychology, 50(3-4), 445–459. https://doi.org/10.1007/s10464-012-9509-7
Wandersman, A., Duffy, J., Flaspohler, P., Noonan, R., Lubell, K., Stillman, L., Blachman, M., Dunville, R., & Saul, J. (2008). Bridging the gap between prevention research and practice: The interactive systems framework for dissemination and implementation. American Journal of Community Psychology, 41(3-4), 171–181. https://doi.org/10.1007/s10464-008-9174-z
Wandersman, A., & Scheier, L. M. (2024). Special issue: Strengthening the science and practice of implementation support: Evaluating the effectiveness of training and technical assistance centers (Introduction to the special issue). Evaluation & the Health Professions. Sage.
Wandersman, A., & Scott, V. A. (2022). Technical Assistance
(TA) effectiveness logic model. www.wandersmancenter.
org
Williams, N. J., Glisson, C., Hemmelgarn, A., & Green, P. (2017). Mechanisms of change in the ARC organizational strategy: Increasing mental health clinicians' EBP adoption through improved organizational culture and capacity. Administration and Policy in Mental Health, 44(2), 269–283. https://doi.org/10.1007/S10488-016-0742-5
Yazejian, N., Metz, A., Morgan, J., Louison, L., Bartley, L., Fleming, W. O., Haidar, L., & Schroeder, J. (2019). Co-creative technical assistance: Essential functions and interim outcomes. Evidence & Policy, 15(3), 339–352. https://doi.org/10.1332/174426419X15468578679853