Development of the Technical Assistance Engagement Scale:
A Modified Delphi Study
Victoria Scott
University of North Carolina at Charlotte https://orcid.org/0000-0002-8382-283X
Jasmine Temple
University of North Carolina at Charlotte
Zara Jilani
University of Georgia at Athens
Research Article
Keywords: technical assistance (TA) relationships, TA engagement, Delphi, measurement development, formative evaluation
Posted Date: April 15th, 2024
DOI: https://doi.org/10.21203/rs.3.rs-4189554/v1
License: This work is licensed under a Creative Commons Attribution 4.0 International License.
Abstract
Background: Technical assistance (TA) is a tailored approach to capacity building that is commonly used to support implementation of evidence-based interventions. Despite its widespread applications, measurement tools for assessing critical components of TA are scant. In particular, the field lacks a robust measure for examining relationship quality between TA providers and recipients. TA relationships are central to TA and significantly associated with program implementation outcomes. The current study seeks to address the gap in TA measurement tools by providing a scale for assessing TA relationships.
Methods: We utilized a modified Delphi approach involving two rounds of Delphi surveys and a panel discussion with TA experts to garner feedback and consensus on the domains and items that compose the TA Engagement Scale.
Results: TA experts represented various U.S. organizations and TA roles (e.g., provider, recipient, researcher), with 25 respondents in the first survey and 26 respondents in the second survey. The modified Delphi process resulted in a scale composed of six domains and 22 items relevant and important to TA relationships between providers and recipients.
Conclusion: The TA Engagement Scale is a formative evaluation tool intended to offer TA providers the ability to identify strengths and areas for growth in the provider-recipient relationship and to communicate about ongoing needs. As a standard measurement tool, it lends a step toward more systematic collection of TA data, a more coherent body of TA evidence, and comparisons of TA relationships across settings.
Contributions to the Literature
The technical assistance (TA) provider-recipient relationship is central to TA. However, no robust measurement tool exists for assessing TA relationships. This article addresses gaps in the TA measurement literature by developing the TA Engagement Scale.
The TA Engagement Scale advances TA practice by providing providers and recipients with a robust, expert-informed instrument for monitoring TA engagement quality. The measure increases TA providers' ability to make collaborative, data-informed adjustments to TA delivery.
The scale advances TA by providing a standard measurement tool that aids systematic data collection, formation of
more coherent TA evidence, and comparisons of TA relationships across settings.
Background
Evidence-based interventions and practices (EBIs/EBPs) are critical for advancing population health and community well-being. However, both uptake and quality implementation of EBIs/EBPs are a persistent challenge to improved health
outcomes globally. A host of contextual factors contribute to implementation-related outcomes, including stakeholder
perceptions of the EBI/EBP, setting capacity for EBI/EBP adoption, and general functioning and climate of the implementation
setting (1–3). A growing body of literature provides evidence that technical assistance improves implementation outcomes
(4–7).
Technical Assistance (TA) is a tailored approach to organizational and community capacity building that is chiefly used to support implementation of EBIs/EBPs (8). TA involves tailored guidance by a TA specialist (or TA organization) to members of a setting (organization, community) regarding a specific practice area(s) (e.g., needs assessment, program monitoring, stakeholder engagement, sustainability planning). TA delivery frequently entails an assortment of activities (e.g., coaching, consultation, resource sharing (9)) that vary by recipient need, with direct TA provider-recipient interactions as a hallmark
feature. Thus, the relationship between a TA provider and recipient(s) is essential for successful TA (10–12), particularly for
intensive models of TA (13). Importantly, a provider’s ability to effectively build relationships with recipients is recognized as a
core TA competency (14, 15).
In a research synthesis of the TA evidence base, TA relationships (defined broadly as "human encounters between TA providers and recipients") were discussed in approximately 50% of articles ((16), p. 418). The synthesis affirmed the significance of provider-recipient relationships to TA, noting trust, collaboration, and a strengths-based orientation as the most commonly reported relationship attributes. When TA providers establish rapport with recipients, recipients view providers as trusting, respectful, patient, and motivating, underscoring the importance of the recipient-provider relationship (17–19).
Collaborative TA relationships are positively associated with implementation-related outcomes, including implementation adherence (19, 20), and with high-quality team functioning, a proximal outcome linked to implementation effectiveness (21).
The value of soliciting client feedback on a professional client-provider relationship is an established best practice across a
variety of professional fields (e.g., clinical therapy/counseling, coaching, consulting). Most providers have access to robust measurement scales for this purpose. For example, clinicians and counselors can select from an assortment of field-tested measures to get patient feedback about the therapeutic relationship (e.g., the Therapeutic Bonds Scale, the Helping Alliance Questionnaire [HAq-I/II], the California Psychotherapy Alliance Scale [CALPAS], and the Working Alliance Inventory [WAI] (22–25)).
Similarly, consultants can draw on the Consulting Effectiveness Survey (26), and coaches can use the Coaching Evaluation Survey (27) and the Executive Coaching Survey (28) to assess client-provider relational quality. Akin to allied professional client-provider relationships, TA provider-recipient relationships are beneficial to assess for attaining desired outcomes (i.e., formative evaluation). However, TA providers have sparse options for existing measures of this construct. Heretofore, TA providers or TA centers interested in measuring relational elements of TA have resorted to developing their own measures. For example, Chilenski and colleagues (21) developed a 7-item instrument to measure collaboration, one specific and important feature of TA relationships. The field of TA is in need of an expert-informed measure of TA provider-recipient relationship quality, particularly an instrument that assesses the multiple dimensions of TA relationships. We aim to fill this need by developing a novel measurement tool for TA engagement.
Purpose & Initial Scale Development
The purpose of the current study was to obtain subject matter expert input and consensus for the domains and items on the TA Engagement Scale, a measure that assesses the quality of engagement (relationship) between TA providers and recipients. We began with a literature review (May–July 2022) to determine how previous research in TA and related fields
(i.e., clinical therapy/counseling, consulting, coaching) measured provider-client relationship quality. We reviewed measures at the domain and item level and retained the most common domains and associated items across each field. For instance, the following constructs were common in coaching measures: benefits, relevance, expectations of coaching, value of coaching, satisfaction with coaching, communication, interpersonal skills, coaching outcomes, and investment. The measurement tool
review generated an initial set of domains and items, which we categorized using the International Coaching Federation (ICF)
Framework. We used the ICF framework to organize the domains and items due to similarities between coaching and TA; no
equivalent framework exists for TA (29, 30).
Next, we solicited TA subject matter expert (henceforth, experts) input through four meetings. Experts were TA providers and
researchers from three organizations identied via convenience sampling. These initial discussions with TA experts focused
on the adequacy of the domains (Did the domains adequately reflect the most salient features of TA relationship quality? Were the domain definitions sufficiently clear and precise?). As we obtained feedback, we revised the pool of domains and items accordingly. For example, experts flagged items that measured effectiveness (outcome measure) rather than engagement
(process measure), so we removed those items. Altogether, the literature review and initial expert input led to a preliminary,
comprehensive set of 14 domains and 75 items. In what follows, we describe our approach to obtaining expert input and
consensus on the domains and items on the TA Engagement Scale using a modified Delphi process. We used a combination
of literature review, preliminary expert input, and Delphi process to develop a TA scale that is grounded in TA research and
practice.
Methods
Participants
The TA research team involved a university faculty member PI (VS) and two doctoral students (JT, ZJ). We utilized
convenience and snowball sampling, wherein TA experts across the United States acquainted with the PI were invited to
participate in the Delphi study. Additionally, we asked prospective experts to share contact information for any TA provider,
recipient, or researcher who might be interested in participating. Participant inclusion criteria included: i) having a minimum of one year of experience with TA, and ii) being an English speaker. TA providers, researchers, and recipients from six organizations participated in the Delphi process. The TA research team did not participate in the Delphi surveys and was not included in the data analysis.
Procedures
The Delphi method is a systematic approach for eliciting and aggregating opinion on a topic from a panel of experts (31).
Developed by the Research and Development (RAND) Corporation in the 1950s, the Delphi method has been used by researchers to garner and consolidate expert feedback and generate consensus. This method has commonly been used to identify the
current state of knowledge on a subject, come to a resolution on controversial topics, and develop measurement and
indicator tools (32–34). The Delphi method has been utilized across disciplines, including health sciences, business, and
educational settings (4, 35, 36). Typically, respondents engage in several rounds of surveys in which they share feedback on a
series of questions. Between rounds of surveys, the researcher analyzes experts’ feedback and consolidates it for the
following round so that items with more consensus proceed to the next round of surveys and items with less consensus are
eliminated. In prior literature, consensus has been defined as 50–97% or more of subject matter experts in agreement, with a 75% median threshold for defining agreement (33). The Delphi method reduces group influence on responses since experts are often asked to provide feedback asynchronously and confidentially, which reduces status effects on which items are retained (e.g., everyone has equal say (37)).
The TA research team utilized the Accurate Consensus Reporting Document (ACCORD) for a Modified Delphi process (38) to
ensure accurate and systematic reporting of the Delphi method. The completed ACCORD document is provided in
supplementary material. In the current study, the Delphi consensus building process consisted of two surveys administered to
subject matter experts (i.e., TA providers, recipients, and researchers) using Qualtrics, a web-based survey platform. Surveys were not piloted prior to administration to the TA Delphi experts; however, feedback about the survey content was received during
discussions with TA experts (see Initial Scale Development section). The two-round survey design was informed by the
research team’s pre-existing work (preliminary feedback from TA experts). Before administration of the first Delphi survey, the
team hosted two orientation sessions with prospective experts to introduce the goals of the national study and communicate
expectations for participation. An orientation session was recorded and shared with prospective experts who were unable to
attend a live session. Each round of the TA Engagement Scale survey was emailed to interested experts (described below), who then consented to participate in the study. For Survey Rounds #1 and #2, we asked experts, “To what extent are the following domains (and items) and their respective definitions relevant to interactions between TA provider and recipient?” Response options ranged from 1 (not at all relevant) to 4 (completely relevant). After each domain and item, we provided a
comment box for experts to leave an open-ended response regarding any comments, suggestions, and concerns related to
each domain and item. Additionally, an open-comment box was included at the end of the survey for any other input (e.g., comments about the measure overall, any suggested items, any general questions). The inclusion of open-ended text boxes
is a common practice in Delphi surveys (39). Survey responses were confidential but not anonymous. The first survey was administered over a three-week period, from August 21 to September 8, 2023, and reminder emails were sent on a weekly basis. After we received feedback on Survey #1, the TA research team summarized the survey results into a report that was shared with the experts. On September 22, 2023, we held a one-hour discussion with the panel of TA experts to discuss their Survey Round #1 feedback and to clarify questions.
The second TA Engagement Scale survey was administered to the experts over the course of two weeks, from October 6 to October 20, 2023. Reminder emails were sent on a weekly basis. In Survey Round #2, we again asked experts to rate the
relevance of the rened scale domains and items. We increased the consensus threshold to 85% to increase condence
around the relevance of retained items (40). Additionally, we asked experts to prioritize the items within each domain based
on relative importance to the domain reecting TA relationships. This national TA Delphi study was approved by the
University of North Carolina at Charlotte Institutional Review Board (IRB-23-0463). A study protocol for this research is
unregistered.
Data Analysis
Data for Round #1 were analyzed in September 2023. We followed a four-step approach to analyze the quantitative and qualitative data from Round #1 of survey input. First, we used a 70% agreement threshold to determine which domains and
items we retained versus removed: if 70% or more respondents indicated that a domain/item was mostly or completely
relevant, then the specic domain/item was retained. Domains and items below the 70% threshold were removed. The second
and third steps were based on qualitative feedback provided by experts in the open-ended responses. In the second step, we
assessed if items needed to be relocated to another domain based on expert input. In the third step, we reviewed domains
and items that were agged by experts as redundant (items with similar wording or attributes). If a domain was redundant
with another domain, then we merged them and revised the domain denition if needed. If an item shared redundancy with
another item, we kept the item with the higher agreement. Lastly, after considering the preceding three steps, we determined whether an item was retained, removed, or relocated.
Data for Round #2 were analyzed between October and December 2023. For the second round of survey input, we followed a five-step approach. First, we used an 85% agreement threshold to strengthen the consensus criteria; if a domain or item did not meet the 85% threshold for agreement, it was removed. Second, we determined whether an item needed to be relocated based on qualitative feedback. Third, we determined whether domains or items shared redundancy with other domains or items, respectively; if they did, we merged the domains and retained the items with higher agreement. Fourth, we included a rank-ordering system so that respondents could indicate each item's relative importance from least to most important. The average ranking of each item was used to determine the rank order, and the lowest-ranked items in a domain were removed. Finally, based on the prior steps, we determined whether an item was retained, removed, or relocated.
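The screening arithmetic described above, agreement thresholds of 70% (Round #1) and 85% (Round #2) plus average-rank ordering, can be sketched in a few lines of Python. This is an illustrative reconstruction only, not the authors' analysis code; the item names, the ratings, and the assumption that ranks are coded with 1 as most important are all hypothetical.

```python
# Sketch of the Delphi screening logic described in Data Analysis.
# Ratings use the survey's 1-4 relevance scale; here ratings of 3 ("mostly
# relevant") and 4 ("completely relevant") are assumed to count toward
# agreement. All item names, ratings, and rank codings are hypothetical.

def agreement(ratings):
    """Proportion of experts rating an item 3 or 4."""
    return sum(r >= 3 for r in ratings) / len(ratings)

def screen(items, threshold):
    """Retain items whose agreement meets the round's threshold
    (0.70 in Round #1, 0.85 in Round #2)."""
    return {name: ratings for name, ratings in items.items()
            if agreement(ratings) >= threshold}

def rank_order(rankings):
    """Order items by their average rank across experts
    (assuming rank 1 = most important)."""
    means = {name: sum(ranks) / len(ranks) for name, ranks in rankings.items()}
    return sorted(means, key=means.get)

# Hypothetical Round #2 ratings for two items in one domain.
items = {
    "trust_1": [4, 4, 3, 4, 4],  # 100% agreement: retained at 85%
    "trust_2": [4, 3, 3, 2, 4],  # 80% agreement: removed at 85%
}
print(sorted(screen(items, 0.85)))
```

Under these assumptions, an item like trust_2 (80% agreement) would have cleared the Round #1 threshold but falls below the stricter Round #2 threshold.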
Results
At the beginning of the modied Delphi process, the
TA Engagement Scale
included 75 items classied across 14 domains.
After the expert input and consensus process, the nal scale was reduced to 22 items across six domains. A summary of
scale modications across the modied Delphi survey rounds is available in Fig.1. A detailed description of the results for
each survey round follows.
Survey Round #1
We sent the rst survey to 32 TA experts. Twenty-ve experts responded to Survey Round #1 (78% response rate). We asked
experts to indicate their role in TA (i.e., provider, recipient, researcher, other), with the option to select multiple roles. Largely,
respondents were TA providers (n = 24), researchers (n = 12), and recipients (n = 6). See Table1 for expert characteristics.
Table 1
Delphi Participant Characteristics

Characteristic | Round #1 n (%) | Round #2 n (%)
Gender: Woman | 21 (84%) | 22 (85%)
Gender: Man | 4 (16%) | 4 (16%)
Total N | 25 (100%) | 26 (100%)
TA Professional Role*: TA Provider | 24 (96%) | 22 (85%)
TA Professional Role*: TA Researcher | 12 (48%) | 9 (35%)
TA Professional Role*: TA Recipient | 6 (24%) | 5 (19%)
TA Professional Role*: Other | 2 (8%) | 5 (19%)

*Note. The TA Professional Role question included ‘select all that apply’ options in which experts could identify as both a TA researcher and provider, for instance. Thus, totals exceed 100%. Some experts, in addition to their main role as a provider, researcher, or recipient, also selected an ‘other’ category, which included roles such as a project officer for a TA center, a funder of TA projects, a TA support person, a funder overseeing TA evaluation contracts, and a grants manager.

[INSERT HERE: Table 1. Delphi Participant Characteristics]
Survey Round #1 invited feedback on 14 domains and 75 items. All domains reached the 70% consensus threshold. However, qualitative expert feedback indicated redundancy between some domains: Mutual Affirmation and Empathy, Mutual Investment and Collaboration, Responsiveness and Effective Communication, and Focused Facilitation and Accountability. In response, we merged these domains and revised their definitions (see Table 2).
Table 2
Round #1 Domain Revisions

Domain | % Agreement: Relevance to TA Relationships | Decision #1: Met the 70% relevance threshold? | Decision #2: Redundant with other domains? | Decision #3: Language needed refining? | Final decision: Retained, removed, or merged?
Professionalism | 96% | Yes | No | Yes | Retained
Responsiveness | 96% | Yes | Yes | Yes | Merged with Effective Communication
Client-Centered | 96% | Yes | Yes | Yes | Removed based on panel discussion
Ecologically-Minded | 92% | Yes | No | Yes | Retained and changed to Contextually-Minded
Proactive | 88% | Yes | No | Yes | Retained
Empathy | 100% | Yes | Yes | Yes | Merged with Mutual Affirmation
Mutual Affirmation | 80% | Yes | Yes | No | Merged with Empathy
Trust | 100% | Yes | No | No | Retained
Collaboration | 92% | Yes | Yes | Yes | Merged with Mutual Investment
Mutual Investment | 80% | Yes | Yes | Yes | Merged with Collaboration
Effective Communication | 100% | Yes | Yes | Yes | Merged with Responsiveness
Focused Facilitation | 80% | Yes | Yes | Yes | Merged with Accountability
Tailored | 96% | Yes | No | No | Retained
Accountability | 100% | Yes | Yes | Yes | Merged with Focused Facilitation

[INSERT HERE: Table 2. Round #1 Domain Revisions]
At the item-level, we rst removed items that did not achieve a 70% consensus threshold (n = 4). Then, an additional 34 items
were removed due to qualitative feedback indicating the items were redundant with other items, were less clear compared to
similar items, and/or were rated lower in comparison to duplicate items. Seven items were relocated as a result of domain-
level revisions and qualitative feedback. Finally, three new items were added to better reect aspects of the Contextually-
Minded, Proactive, and Trust domains; experts indicated that the full range of components of the domain were not captured
in the original set of items. We ultimately retained 37 of the original 75 items and added three new items for a total of 40
items (see Table3 for detailed information about the number of domains and items removed, retained, modied, or added in
round one).
Page 8/27
Table 3
Round #1 Item Revisions
Domain, Item Percentage
Agreement:
Relevance to
TA
Relationships
Decision #1
Did the item
meet the
relevance
threshold?
(70%)
Decision #2
Did the item
need to be
relocated?
Decision
#3
Did the
item share
redundancy
with other
items?
Final
Decision
Was the
item
retained,
removed,
or
relocated?
Professionalism
Item 1.1. My TA provider
demonstrates professionalism
through their conduct.
88% Yes No No Retained
Item 1.2. My TA provider upholds the
agreements of their services. 84% Yes No No Retained
Item 1.3. My TA provider practices
with integrity. 88% Yes No No Retained
Item 1.4. My TA provider treats me
with respect at all times. 100% Yes No No Retained
Item 1.5. My TA provider protects my
organizations sensitive information. 80% Yes No No Retained
Item 1.6. My TA provider creates a
safe space for me to share my
organizations information.
92% Yes No No Retained
Responsiveness
Item 2.1. My TA provider is
responsive to my expressed needs. 100% Yes Yes, relocated
to Effective
Communication
No Relocated
Item 2.2. My TA provider responds to
requests for technical assistance in a
timely manner.
92% Yes Yes, relocated
to Effective
Communication
No Relocated
Item 2.3. My TA provider is easily
accessible and available when I need
them.
92% Yes No Yes Removed
Item 2.4. My TA provider assures me
that I can reach out to them anytime. 80% Yes No Yes Removed
Item 2.5. I feel I can reach out to my
TA provider anytime. 88% Yes No Yes Removed
Client-Centered
Item 3.1. My TA provider encourages
me to generate my own solutions to
challenges.
88% Yes No Yes Removed
Item 3.2. My TA provider tries to help
me arrive at my own solutions when
problems arise, rather than telling me
what he/she would do.
88% Yes No Yes Removed
Item 3.3. During TA sessions, I am
encouraged to share my own views on
issues.
96% Yes No Yes Removed
Page 9/27
Domain, Item Percentage
Agreement:
Relevance to
TA
Relationships
Decision #1
Did the item
meet the
relevance
threshold?
(70%)
Decision #2
Did the item
need to be
relocated?
Decision
#3
Did the
item share
redundancy
with other
items?
Final
Decision
Was the
item
retained,
removed,
or
relocated?
Professionalism
Item 3.4. My TA provider values my
knowledge and expertise. 100% Yes No Yes Removed
Item 3.5. My TA provider encourages
me to tackle problems using my own
judgment.
76% Yes No Yes Removed
Item 3.6. I feel as though my TA
provider thinks my opinion is valuable
for reaching my goals."
80% Yes No Yes Removed
Item 3.7. I feel as though my opinion
is valuable to reaching my goals. 64% No - - Removed
Item 3.8. My TA provider encourages
me to use personal initiative in
carrying out my tasks.
76% Yes No Yes Removed
Ecologically-Minded
Item 4.1. My TA provider offers
suggestions that are appropriate for
the setting in which I work.
100% Yes No No Retained
Item 4.2. My TA provider is sensitive
to relationship issues in my work
setting, including power dynamics.
96% Yes No No Retained
Item 4.3. My TA provider works to
understand the contextual (e.g.,
political, cultural) aspects of my
organization.
100% Yes No No Retained
Item 4.4. My TA provider is aware of
the conditions and capacities of my
organization.
80% Yes No Yes Removed
Proactive
Item 5.1. My TA provider
communicates emerging issues
important to my work and
organization.
84% Yes No Yes Removed
Item 5.2. My TA provider proactively
shares information that can be useful
for my work and organization."
88% Yes No No Retained
Item 5.3. My TA provider reaches out
on their own initiative to provide
support for my work and
organization.
84% Yes No No Retained
Empathy
Item 6.1. I feel my TA provider
understands me. 92% Yes No Yes Removed
Page 10/27
Domain, Item Percentage
Agreement:
Relevance to
TA
Relationships
Decision #1
Did the item
meet the
relevance
threshold?
(70%)
Decision #2
Did the item
need to be
relocated?
Decision
#3
Did the
item share
redundancy
with other
items?
Final
Decision
Was the
item
retained,
removed,
or
relocated?
Professionalism
Item 6.2. I feel as though my TA
provider and I are on the same
wavelength.
64% No - - Removed
Item 6.3. My TA provider works to
understand my position and
perspectives.
96% Yes Yes, relocated
to Mutual
Armation
No Relocated
Item 6.4. My TA provider works to
make sure I feel understood. 88% Yes Yes Yes Removed
Item 6.5. At minimum, our
interactions are warm and friendly. 84% Yes Yes Yes Removed
Mutual Armation
Item 7.1. I am able to be honest with
my TA provider about my feelings and
thoughts on issues.
100% Yes No No Retained
Item 7.2. I feel comfortable sharing
my skepticism and concerns with my
TA provider.
100% Yes No No Retained
Item 7.3. My TA provider engages in
conversations with an open-mind. 100% Yes No No Retained
ITEM 7.4. My TA provider shows their
support for me. 96% Yes No No Retained
Item 7.5. I feel my TA provider and I
support each other in our work
together.
64% No - - Removed
Item 7.6. My TA provider encourages
me to share my thoughts and
opinions, no matter how small.
84% Yes No Yes Removed
Item 7.7. I feel my ideas are accepted
by my TA provider. 84% Yes No Yes Removed
Item 7.8. I feel my thoughts and
feelings are accepted by my TA
provider.
84% Yes No Yes Removed
Trust
Item 8.1. I trust my TA provider. 96% Yes No No Retained
Item 8.2. I feel I can depend upon my
TA provider. 92% Yes No No Retained
Item 8.3. My TA provider shows me
they are trustworthy. 88% Yes No Yes Removed
Item 8.4. My TA provider shows me
they are dependable. 92% Yes No Yes Removed
Page 11/27
Domain, Item Percentage
Agreement:
Relevance to
TA
Relationships
Decision #1
Did the item
meet the
relevance
threshold?
(70%)
Decision #2
Did the item
need to be
relocated?
Decision
#3
Did the
item share
redundancy
with other
items?
Final
Decision
Was the
item
retained,
removed,
or
relocated?
Professionalism
Collaboration
Item 9.1. During TA sessions, my TA
provider and I are focused on a shared
goal(s).
92% Yes No No Retained
Item 9.2. My TA provider and I are
able to work through disagreements
together.
84% Yes No Yes Removed
Item 9.3. My TA provider encourages
us to work through disagreements
and/or conict together.
76% Yes No Yes Removed
Item 9.4. My TA provider and I are
able to work through
project/organization based obstacles
together.
92% Yes No No Retained
Item 9.5. My TA provider and I work
through project/organization based
disagreements together.
76% Yes No Yes Removed
Item 9.6. My TA provider and I work
collaboratively to generate solutions
to challenges.
100% Yes No No Retained
Item 9.7. I feel I am working together
with my TA provider in a joint effort. 80% Yes No Yes Removed
ITEM 9.8. My TA provider encourages
us to work together on accomplishing
tasks and goals.
80% Yes No Yes Removed
Mutual Investment
Item 10.1. I very much want to work
through and accomplish my goals
with a TA provider.
84% Yes No Yes Removed
Item 10.2. I feel that my TA provider
and I are both invested and
determined in meeting our goals.
84% Yes No Yes Removed
Item 10.3. Both my TA provider and I
are committed to our working
relationship.
84% Yes Yes, relocated
to Collaboration No Relocated
Item 10.4. I feel my TA provider is
committed to our work and goals
together.
88% Yes No Yes Removed
| Domain, Item | % Agreement: Relevance to TA Relationships | Decision #1: Did the item meet the relevance threshold (70%)? | Decision #2: Did the item need to be relocated? | Decision #3: Did the item share redundancy with other items? | Final Decision: Was the item retained, removed, or relocated? |
| --- | --- | --- | --- | --- | --- |
| Effective Communication | | | | | |
| Item 11.1. My TA provider asks me questions that show that they are actively listening to me. | 92% | Yes | No | No | Retained |
| Item 11.2. My TA provider asks me questions that elicit new insight about issues. | 96% | Yes | No | No | Retained |
| Item 11.3. My TA provider is clear about our shared expectations. | 88% | Yes | No | Yes | Removed |
| Item 11.4. The things that I am asked to do by my TA provider are presented in a way that I understand. | 96% | Yes | No | No | Retained |
| Item 11.5. My TA provider is a good communicator. | 100% | Yes | No | No | Retained |
| Item 11.6. My TA provider's gestures/mannerisms make them feel approachable. | 60% | No | – | – | Removed |
| Focused Facilitation | | | | | |
| Item 12.1. My TA provider keeps meetings focused and on track. | 92% | Yes | Yes, relocated to Accountability | No | Relocated |
| Item 12.2. When applicable, I have a clear set of next steps at the end of each TA session. | 96% | Yes | No | Yes | Removed |
| Item 12.3. My TA provider asks questions that make me think critically. | 96% | Yes | Yes, relocated to Accountability | No | Relocated |
| Item 12.4. My TA provider has challenged my thoughts and/or actions. | 84% | Yes | No | Yes | Removed |
| Item 12.5. During TA sessions, we explore different approaches for responding to challenges. | 100% | Yes | Yes, relocated to Collaboration | No | Relocated |
| Tailored | | | | | |
| Item 13.1. What we discuss in TA sessions is highly relevant to my workplace responsibilities. | 92% | Yes | No | No | Retained |
| Item 13.2. My TA provider offers suggestions that are reasonable to implement given my organization's capacity. | 96% | Yes | No | No | Retained |
| Item 13.3. My TA provider shares information and resources that are valuable and relevant to my organization. | 96% | Yes | No | No | Retained |
| Item 13.4. My TA provider offers me useful feedback and strategies. | 96% | Yes | No | No | Retained |
| Item 13.5. My TA provider focuses on current assets and resources in my workplace. | 84% | Yes | No | Yes | Removed |
| Accountability | | | | | |
| Item 14.1. My TA provider follows up with me about my commitments. | 84% | Yes | No | Yes | Removed |
| Item 14.2. My TA provider checks in on my progress related to my commitments. | 88% | Yes | No | No | Retained |
| Item 14.3. My TA provider identifies clear goals for each session. | 84% | Yes | No | Yes | Removed |
| Item 14.4. My TA provider guides me to develop clear, achievable action plans for my goals. | 96% | Yes | No | No | Retained |
[INSERT HERE: Table3.
Round #1 Item-level Revisions
]
We shared a report summarizing findings from Survey #1 and invited experts to a one-hour panel discussion to review Survey #1 results and to discuss questions emerging from expert feedback. A total of 17 experts attended the panel discussion. As a result of the discussion, we removed the Client-Centered domain: experts noted that it was more appropriately represented across domains than as a separate domain (that is, nearly every item pertained to the TA provider being client-centered). Additionally, the definition for Professionalism and its items were revised to better measure the issue of privacy in TA relationships.

After completion of Survey Round #1 and the discussion panel, the TA Engagement Scale included nine domains and 40 items.
Survey Round #2
We sent the second survey to 32 experts. Twenty-six experts responded to Survey Round #2 (81% response rate). Similar to the first round of respondents, experts were providers (n = 22), researchers (n = 9), and recipients (n = 5), with some respondents indicating more than one TA role.

Survey Round #2 included nine domains and 40 items. All domains reached the 85% inclusion threshold. Based on qualitative expert feedback, we merged Tailored, Contextually-Minded, and Proactive into a single domain, resulting in six domains: Professionalism, Trust, Collaboration, Communication, Tailored, and Accountability (see Table 4).
Table 4
Round #2 Domain Revisions

| Domain | % Agreement: Relevance to TA Relationships | Decision #1: Did the domain meet the relevance threshold (85%)? | Decision #2: Did the domain share redundancy with other domains? | Decision #3: Did the domain language need to be refined? | Final Decision: Was the domain retained, removed, or merged? |
| --- | --- | --- | --- | --- | --- |
| Professionalism | 92% | Yes | No | No | Retained |
| Contextually-Minded | 100% | Yes | Yes | Yes | Merged with Tailored |
| Proactive | 89% | Yes | Yes | Yes | Merged with Tailored |
| Affirmation | 93% | Yes | Yes | Yes | Merged with Trust |
| Trust | 100% | Yes | Yes | Yes | Merged with Affirmation |
| Effective Communication | 96% | Yes | No | No | Retained; domain name simplified to Communication |
| Collaboration | 100% | Yes | No | No | Retained |
| Tailored | 96% | Yes | Yes | Yes | Merged with Contextually-Minded and Proactive |
| Accountability | 92% | Yes | No | No | Retained |
[INSERT HERE: Table4.
Round #2 Domain Revisions
]
At the item level, Survey Round #2 invited respondents to rank order and rate the importance of the items within each domain. We retained 22 of the 40 items, removing 18 due to failure to meet the 85% consensus threshold, item redundancy[1], or low rank (see Table 5 for additional detail). We shared a report of Survey Round #2 findings with the experts. At the conclusion of the Delphi process, we retained six domains and 22 items (see Table 6 for the final scale).
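The decision sequence applied to each Round #2 item can be sketched in code. This is an illustrative reconstruction of the four decisions described above (relevance threshold, relocation, redundancy, rank), not the authors' analysis code; the field names and example data are hypothetical.

```python
# Illustrative reconstruction of the Round #2 item-decision sequence.
# Field names and example data are assumptions for illustration only.

def decide(item, agreement_threshold=0.85):
    """Return the final decision for one survey item.

    item: dict with 'agreement' (0-1 expert agreement), 'relocate_to'
    (target domain or None), 'redundant' (bool), and 'low_rank' (bool).
    """
    if item["agreement"] < agreement_threshold:  # Decision #1: relevance
        return "Removed"
    if item["relocate_to"] is not None:          # Decision #2: relocation
        return "Relocated"
    if item["redundant"] or item["low_rank"]:    # Decisions #3 and #4
        return "Removed"
    return "Retained"

items = [
    {"agreement": 0.92, "relocate_to": None, "redundant": False, "low_rank": False},
    {"agreement": 0.73, "relocate_to": None, "redundant": False, "low_rank": False},
    {"agreement": 1.00, "relocate_to": "Tailored", "redundant": False, "low_rank": False},
]
print([decide(i) for i in items])  # ['Retained', 'Removed', 'Relocated']
```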
Table 5
Round #2 Item Revisions

| Domain, Item | % Agreement: Relevance to TA Relationships | Importance Ranking | Decision #1: Did the item meet the relevance threshold (85%)? | Decision #2: Did the item need to be relocated? | Decision #3: Did the item share redundancy with other items? | Decision #4: Is the item lower ranked? | Final Decision: Was the item retained, removed, or relocated? |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Professionalism | | | | | | | |
| Item 1.1. My TA provider demonstrates professionalism through their conduct. | 96% | 2 | Yes | No | No | No | Retained |
| Item 1.2. My TA provider upholds the agreements of their services (e.g., confidentiality, responsibilities). | 96% | 1 | Yes | No | No | No | Retained |
| Item 1.3. My TA provider practices with integrity. | 85% | 4 | Yes | No | No | No | Retained |
| Item 1.4. My TA provider treats me with respect at all times. | 92% | 3 | Yes | No | No | No | Retained |
| Contextually-Minded | | | | | | | |
| Item 4.1. My TA provider offers suggestions that are appropriate for the setting in which I work. | 100% | 1 | Yes | Yes, relocated to Tailored | No | No | Relocated |
| Item 4.2. My TA provider is sensitive to relationship issues in my work setting, including power dynamics. | 92% | 3 | Yes | No | Yes | – | Removed |
| Item 4.3. My TA provider works to understand the contextual (e.g., political, cultural) aspects of my organization. | 100% | 2 | Yes | Yes, relocated to Tailored | No | No | Relocated |
| New Item 4.4. "My TA provider encourages me to think about issues holistically." | 84% | 4 | No | – | – | – | Removed |
| Proactive | | | | | | | |
| Item 5.2. My TA provider proactively shares information that can be useful for my work and organization. | 92% | 1 | Yes | No | Yes | – | Removed |
| Item 5.3. My TA provider reaches out on their own initiative to provide support for my work and organization. | 88% | 2 | Yes | No | Yes | – | Removed |
| New Item 5.4. "My TA provider checks in with me without waiting for me to reach out." | 73% | 3 | No | – | – | – | Removed |
| New Item 5.5. "My TA provider stays abreast of issues to anticipate on-going needs of my work and organization." | 85% | 4 | Yes | No | No | Yes | Removed |
| Affirmation | | | | | | | |
| Item 7.1. I can be honest with my TA provider about my thoughts and feelings. | 92% | 2 | Yes | No | Yes | – | Removed |
| Item 7.2. I am comfortable sharing my skepticism and concerns with my TA provider. | 100% | 1 | Yes | Yes, relocated to Trust | No | No | Relocated |
| Item 7.3. My TA provider engages in conversations with an open-mind. | 92% | 3 | Yes | No | Yes | – | Removed |
| Item 7.4. My TA provider shows support for me. | 81% | 5 | No | – | – | – | Removed |
| Item 6.3. My TA provider works to understand my perspectives. | 92% | 4 | Yes | No | No | Yes | Removed |
| Trust | | | | | | | |
| Item 8.1. I trust my TA provider. | 92% | 1 | Yes | No | No | No | Retained |
| Item 8.2. I can depend upon my TA provider. | 96% | 3 | Yes | No | No | No | Retained |
| New Item 8.3. "My TA provider is sincere in our working relationship." | 81% | 4 | No | – | – | – | Removed |
| New Item 8.4. "I have confidence in my TA provider's expertise." | 96% | 2 | Yes | No | No | No | Retained |
| Collaboration | | | | | | | |
| Item 9.1. My TA provider and I are focused on a shared goal(s). | 96% | 1 | Yes | No | No | No | Retained |
| Item 9.4. My TA provider and I can work through project/organization-based obstacles together. | 96% | 3 | Yes | No | No | No | Retained |
| Item 9.6. My TA provider and I work collaboratively to generate solutions to challenges. | 100% | 2 | Yes | No | No | No | Retained |
| Item 12.5. My TA provider and I explore different approaches for responding to challenges. | 92% | 4 | Yes | No | Yes | – | Removed |
| Item 10.3. My TA provider and I are committed to our working relationship. | 88% | 5 | Yes | No | No | Yes | Removed |
| Effective Communication | | | | | | | |
| Item 11.1. My TA provider actively listens to me. | 100% | 1 | Yes | No | No | No | Retained |
| Item 11.2. My TA provider asks me questions that elicit new insights for me. | 96% | 3 | Yes | No | No | No | Retained |
| Item 11.4. I understand the information my TA provider presents to me. | 96% | 4 | Yes | No | No | No | Retained |
| Item 11.5. My TA provider is a good communicator. | 92% | 5 | Yes | No | No | Yes | Removed |
| Item 2.1. My TA provider is responsive to my expressed needs. | 100% | 2 | Yes | No | No | No | Retained |
| Item 2.2. My TA provider responds to requests for technical assistance in a timely manner. | 100% | 6 | Yes | No | Yes | – | Removed |
| Tailored | | | | | | | |
| Item 13.1. What we discuss in TA sessions is highly relevant to my workplace responsibilities. | 92% | 1 | Yes | No | No | No | Retained |
| Item 13.2. My TA provider offers suggestions that are reasonable to implement given my organization's capacity. | 96% | 4 | Yes | No | No | Yes | Removed |
| Item 13.3. My TA provider shares information and resources that are valuable and relevant to me/my organization. | 100% | 3 | Yes | No | No | No | Retained |
| Item 13.4. My TA provider offers me useful feedback and strategies. | 96% | 2 | Yes | No | Yes | – | Removed |
| Accountability | | | | | | | |
| Item 14.1. My TA provider checks in on my progress related to my commitments. | 92% | 2 | Yes | No | No | No | Retained |
| Item 14.4. My TA provider guides me to develop clear, achievable action plans for my goals. | 100% | 1 | Yes | No | No | No | Retained |
| Item 12.1. "My TA provider keeps meetings focused and on track." | 92% | 4 | Yes | No | No | No | Retained |
| Item 12.3. My TA provider asks questions that make me think critically. | 100% | 3 | Yes | No | Yes | – | Removed |
Table 6
Final Domains and Items on the TA Engagement Scale

| Domain & Definition | Item | Average Relevance | Average Ranking |
| --- | --- | --- | --- |
| Professionalism: the extent to which the TA provider upholds integrity and maintains shared agreements about confidentiality and commitments. | My TA provider demonstrates professionalism through their conduct. | 96% | 2 |
| | My TA provider upholds the agreements of their services (e.g., confidentiality, responsibilities). | 96% | 1 |
| | My TA provider practices with integrity. | 85% | 4 |
| | My TA provider treats me with respect at all times. | 92% | 3 |
| Trust: the extent to which the TA provider cultivates a space in which the TA recipient is confident in the dependability, expertise, and openness of the TA provider. | I trust my TA provider. | 92% | 1* |
| | I can count on my TA provider. | 96% | 3 |
| | I have confidence in my TA provider's expertise. | 96% | 2 |
| | I am comfortable sharing my skepticism and concerns with my TA provider. | 100% | 1* |
| Collaboration: the extent to which TA providers and recipients are committed to working together. | My TA provider and I are focused on a shared goal(s). | 96% | 1 |
| | My TA provider and I can work through project/organization-based obstacles together. | 96% | 3 |
| | My TA provider and I work collaboratively to generate solutions to challenges. | 100% | 2 |
| Communication: the quality of how information and ideas are exchanged between the TA provider and recipient. | My TA provider actively listens to me. | 100% | 1 |
| | My TA provider asks me questions that elicit new insights for me. | 96% | 3 |
| | I understand the information my TA provider presents to me. | 96% | 4 |
| | My TA provider is responsive to my expressed needs. | 100% | 2 |
| Tailored: the extent to which TA is proactive and responsive to the motivation, capacities, characteristics, and needs of the recipient/recipient organization. | What we discuss in TA sessions is highly relevant to my workplace responsibilities. | 92% | 1* |
| | My TA provider shares information and resources that are valuable and relevant to me/my organization. | 100% | 3 |
| | My TA provider offers suggestions that are appropriate for the setting in which I work. | 100% | 1* |
| | My TA provider works to understand the contextual (e.g., political, cultural, power dynamics) aspects of my organization. | 100% | 2 |
| Accountability: the extent to which the TA provider is clear about goals of the TA sessions and follows up on issues and commitments to support progress toward desired outcomes. | My TA provider checks in on my progress related to my commitments. | 92% | 2 |
| | My TA provider guides me to develop clear, achievable action plans for my goals. | 100% | 1 |
| | My TA provider keeps meetings focused and on track. | 92% | 4 |

*Note. Some items have the same rank value as others in their respective domains due to item relocation based on qualitative feedback in Survey Round #2. Items were ranked within their original domains and then relocated, meaning some items share rank values.
[INSERT HERE: Table5.
Round #2 Item-level Revisions
]
[INSERT HERE: Table6.
Final TA Engagement Scale
]
Discussion
Formative evaluations of TA are limited in part due to the scarcity of formative measurement tools for TA, including assessments of TA provider relationship quality (8). We used a modified Delphi approach to develop the Technical Assistance (TA) Engagement Scale, a 22-item formative evaluation tool designed to assess TA provider-recipient relationships. Through the Delphi study, we retained six domains: Professionalism, Trust, Collaboration, Communication, Tailored, and Accountability. Five of these domains resemble the relational domains reported in the TA literature synthesis by Katz & Wandersman (16), reinforcing a core set of qualities important to TA relationships: Professionalism (Respect), Trust (Trust), Collaboration (Collaboration), Tailored (Adjusting to Readiness), and Accountability (Roles/Responsibilities). A relational domain that emerged as salient but was not noted in Katz & Wandersman's (16) synthesis is Communication. TA experts showed consistently high consensus on the relevance of Communication to TA relationships (96%-100% agreement at the item and domain levels), suggesting an area of interpersonal relationships for TA providers to attend to in particular. Consistent with this expert input, effective communication is listed as a key practice for effective TA (10).
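The percentage-agreement figures reported throughout the Delphi rounds are simple proportions of experts endorsing an item or domain as relevant. A minimal sketch of that computation, using hypothetical ratings rather than the study data:

```python
# Illustrative computation of the percentage-agreement statistic used in
# the Delphi rounds: the share of experts who rated an item as relevant
# to TA relationships. The ratings below are hypothetical.

def percent_agreement(ratings):
    """ratings: list of booleans, True = expert rated the item relevant."""
    return round(100 * sum(ratings) / len(ratings))

# 24 of 25 hypothetical experts rate the item relevant.
print(percent_agreement([True] * 24 + [False]))  # 96
```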
The TA Engagement Scale critically advances the practice of TA by providing TA providers and recipients with a robust, expert-informed instrument for monitoring TA engagement quality. It enables TA providers and recipients to examine and develop their relationship collaboratively and intentionally. The measure increases TA providers' ability to make data-informed, mid-course adjustments to TA delivery. Further, use of the instrument can signal the provider's high regard for the TA relationship and thereby bolster relationship quality.
In addition to advances in TA practice, the TA Engagement Scale can contribute to developments in the science of TA. A standard TA measurement tool is an advancement toward more systematic collection of TA data and is essential to generating a coherent body of evidence. Consistent use of a robust TA measurement scale across studies will allow TA researchers and evaluators to better compare TA relationships in a variety of settings and to examine correlates of TA relationships with targeted outcomes.
Use of the TA Engagement Scale & Considerations for Research and Practice
The TA Engagement Scale is intended for administration by TA providers to TA recipients on a periodic basis (e.g., monthly, quarterly, semi-annually) to monitor and improve provider-recipient relationship quality. When the instrument is administered, TA recipients complete the scale by rating the extent to which each scale item is present in their relationship with the TA provider using a 5-point frequency scale (5-Always, 4-Often, 3-Sometimes, 2-Rarely, 1-Never). The TA provider reviews the recipients' responses to identify relational strengths and areas for improvement. The TA provider is encouraged to discuss the recipients' feedback with TA colleagues and/or TA recipients. Importantly, this tool is for the purpose of TA relationship monitoring and improvement (formative evaluation). It is not intended as a performance assessment or as a measure of a TA provider's performance for a workplace employee evaluation.
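The scoring step described above can be illustrated with a short sketch that averages a recipient's 1-5 frequency ratings within each domain. The domain groupings and item keys below are hypothetical and abridged; the published scale defines six domains and 22 items.

```python
# Illustrative scoring sketch for TA Engagement Scale responses on the
# 5-point frequency scale (5=Always ... 1=Never). Domain groupings are
# abridged and item keys are hypothetical, for demonstration only.

from statistics import mean

DOMAINS = {
    "Trust": ["trust_1", "trust_2"],
    "Communication": ["comm_1", "comm_2"],
}

def domain_means(responses):
    """Average a recipient's 1-5 ratings within each domain."""
    return {d: mean(responses[i] for i in items) for d, items in DOMAINS.items()}

recipient = {"trust_1": 5, "trust_2": 4, "comm_1": 3, "comm_2": 2}
print(domain_means(recipient))  # {'Trust': 4.5, 'Communication': 2.5}
```

Lower domain averages would flag areas for the provider and recipient to discuss.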
We designed the TA Engagement Scale with several goals in mind: i) to provide a robust measure that captures multiple dimensions (domains) of TA relationships, ii) to bridge the science and practice of TA through a scale development process involving an in-depth cross-walk of research literature and TA expert input, and iii) to create a parsimonious measure of TA engagement that serves as a practical implementation tool. Given the scale's brevity, TA providers can administer the scale regularly with little time burden on the recipients (< 12 min to complete), making it a practical tool for regularly assessing engagement over time and aligning with calls for more pragmatic approaches to implementation monitoring and tailoring (41, 42).
With modifications, the TA Engagement Scale can be used for group TA (i.e., TA involving one or more TA providers and more than one TA recipient). Adaptation of the scale is minor, including revising the scale instructions to reflect group TA and revising the subject of scale items; for example, revising "My TA provider is responsive to my expressed needs." to "My TA provider(s) are responsive to our expressed needs." In collaboration with a national TA center, we have begun to pilot use of the TA Engagement Scale in group TA formats. Of note, our modified Delphi study focused primarily on the development of this scale for dyadic provider-recipient relationships. There may be important relationship dynamics in group TA settings uncaptured by the current version of the TA Engagement Scale. Research on the use of the TA Engagement Scale in group TA is needed to discern whether relational domains beyond the six identified in the scale are important for group TA.
While the TA Engagement Scale can be used to assess TA relationship quality across in-person, virtual, and hybrid modes of TA, it may have increasing value for virtual modes. The COVID-19 pandemic catalyzed an acceleration in the provision of remote TA, as TA provider-recipient meetings and trainings shifted from in-person to online modalities to accommodate social distancing mandates and travel restrictions (42, 43). The increased reliance on remote TA raises new questions about how TA provider-recipient relationships are formed and maintained in virtual spaces, including how virtual TA relationships compare to hybrid (in-person/virtual) and exclusively in-person TA relationships. Remote TA carries unique considerations. For example, virtual settings can present more distractions (e.g., email, social media, multitasking, place-based disruptions) and technological challenges, and professionals who provide remote services, such as TA providers, need specialized preparation to effectively hold virtual spaces and engage remote TA recipients (44, 45). However, the influence of remote TA on TA relationships is less understood. This association merits research as remote TA has become a common practice.
Study Limitations and Future Directions
Though this scale is informed by TA experts through a multi-stage approach, it has yet to be psychometrically validated. The next step in the scale development process is to administer the scale in practice and conduct a confirmatory factor analysis to verify that the items measure the constructs we intend to measure. We recruited the experts for this study through convenience sampling. However, given that experts were geographically dispersed and drawn from multiple organizations and backgrounds, we expect that the results of our Delphi process reflect broad and diverse perspectives on which aspects of TA relationships are most central.
The main purpose of the modified Delphi study was the development of a TA provider-recipient relationship measurement scale. The Delphi study identified six domains highly relevant to TA relationships. We did not seek expert input about the relative importance of these domains over the life course of a TA relationship or across stages of program implementation; the salience of these domains may vary over time. For example, trust may require time to cultivate and thus be positively correlated with relationship length. Collaboration may dwindle over time as TA recipients become more capable and self-reliant. In fact, an association has been reported between the salience of collaboration and implementation stage (16). Systematic research is needed to better understand the relative importance of each of the six relational domains over the life course of TA engagement.
Conclusion
The quality of a TA provider-recipient relationship is central to TA and positively associated with program implementation outcomes. Developed through a modified Delphi approach, the TA Engagement Scale is a research- and expert-informed formative evaluation tool designed to advance the science and practice of TA. It offers TA providers a practitioner-friendly measure for monitoring and improving their relationships with TA recipients. As a standard TA measurement tool, it enables more systematic collection of TA data and, thereby, the ability to generate a more coherent body of evidence. The TA Engagement Scale can be used to assess relationship quality across multiple TA delivery modalities (virtual, in-person, and hybrid).
Abbreviations
ACCORD: Accurate Consensus Reporting Document
EBI: Evidence Based Intervention
EBP: Evidence Based Program
ICF: International Coaching Federation
RAND: Research and Development Corporation
TA: Technical Assistance
Declarations
Ethics Approval and Consent to Participate. The study was approved by the University of North Carolina at Charlotte Institutional Review Board (IRB-23-0463). All participants consented to participate in the study.
Consent for Publication. Not applicable.
Availability of Data and Materials. The dataset generated and analyzed during the current study is not publicly available due to confidentiality reasons but is available from the corresponding author upon reasonable request.
Competing Interests. The authors declare that they have no competing interests.
Funding. No sources of funding were obtained for the current study.
Authors' Contributions. VS, JT, and ZJ conceptualized the study design. VS, JT, and ZJ administered study materials, analyzed study data, and were all major contributors to development of the manuscript. All authors read and approved the final manuscript.
Acknowledgments. The authors would like to acknowledge the following individuals and/or organizations for their contributions to the development of the TA Engagement Scale: Build Initiative (Sherri Stewart); Department of Education; Department of Health and Human Services, Administration for Children and Families (Annalisa Mastri); National Center for Safe and Supportive Learning Environments (Brianna Cunniff, Elizabeth Chagnon, Frank Rider, Jeanne Poduska, Rob Mayo); Other/No Organization (Kathleen Guarino, Laura Hurwitz, Nicole Denmark, Pam Imm); Wandersman Center (Amber Watson, Andrea Lamont, Brittany Cook).
References
1. Shrestha R, Karki P, Altice FL, Dubov O, Fraenkel L, Huedo-Medina T, et al. Measuring Acceptability and Preferences for
Implementation of Pre-Exposure Prophylaxis (PrEP) Using Conjoint Analysis: An Application to Primary HIV Prevention
Among High Risk Drug Users. AIDS Behav. 2018 Apr;22(4):1228–38.
2. Scaccia JP, Cook BS, Lamont A, Wandersman A, Castellow J, Katz J, et al. A practical implementation science heuristic
for organizational readiness: R = MC2. J Community Psychol. 2015 Apr;43(4):484–501.
3. Scott VC, Gold SB, Kenworthy T, Snapper L, Gilchrist EC, Kirchner S, et al. Assessing cross-sector stakeholder readiness to
advance and sustain statewide behavioral integration beyond a State Innovation Model (SIM) initiative. Transl Behav
Med. 2021 Jul 29;11(7):1420–9.
4. Olsen AA, Wolcott MD, Haines ST, Janke KK, McLaughlin JE. How to use the Delphi method to aid in decision making and
build consensus in pharmacy education. Curr Pharm Teach Learn. 2021 Oct;13(10):1376–85.
5. Kegler MC, Redmon PB. Using Technical Assistance to Strengthen Tobacco Control Capacity: Evaluation Findings from
the Tobacco Technical Assistance Consortium. Public Health Rep. 2006;121(5):547–56.
6. Jadwin-Cakmak L, Bauermeister JA, Cutler JM, Loveluck J, Kazaleh Sirdenis T, Fessler KB, et al. The Health Access
Initiative: A Training and Technical Assistance Program to Improve Health Care for Sexual and Gender Minority Youth. J
Adolesc Health Off Publ Soc Adolesc Med. 2020 Jul;67(1):115–22.
7. Sugarman JR, Phillips KE, Wagner EH, Coleman K, Abrams MK. The safety net medical home initiative: transforming care
for vulnerable populations. Med Care. 2014 Nov;52(11 Suppl 4):S1-10.
8. Scott VC, Jillani Z, Malpert A, Kolodny-Goetz J, Wandersman A. A scoping review of the evaluation and effectiveness of
technical assistance. Implement Sci Commun. 2022 Jun 28;3(1):70.
9. Dunst CJ, Annas K, Wilkie H, Hamby DW. Scoping Review of the Core Elements of Technical Assistance Models and
Frameworks. World J Educ. 2019;9(2):109–22.
10. Indicators of Effective Technical Assistance Practices.
11. Skelton SM. Situating my positionality as a Black woman with a dis/ability in the provision of equity-focused technical
assistance: a personal reection. Int J Qual Stud Educ. 2019 Mar 16;32(3):225–42.
12. Mitchell RE, Florin P, Stevenson JF. Supporting community-based prevention and health promotion initiatives: developing
effective technical assistance systems. Health Educ Behav Off Publ Soc Public Health Educ. 2002 Oct;29(5):620–39.
13. Fixsen D, Blase K, Horner R, Sugai G. Intensive technical assistance. Chapel Hill, NC: FPG Child Development Institute,
University of North Carolina at Chapel Hill. Intensive Technical Assistance. 2009.
14. Labas L, Downs J, Lavallee S. Technical Assistance Competencies for Maine's Early Childhood Workforce Self-Assessment Checklist.
15. Implementation Support Practitioner Profile | NIRN [Internet]. [cited 2024 Mar 14]. Available from: https://nirn.fpg.unc.edu/resources/implementation-support-practitioner-profile
16. Katz J, Wandersman A. Technical Assistance to Enhance Prevention Capacity: a Research Synthesis of the Evidence
Base. Prev Sci Off J Soc Prev Res. 2016 May;17(4):417–28.
17. Hunter SB, Chinman M, Ebener P, Imm P, Wandersman A, Ryan GW. Technical assistance as a prevention capacity-
building tool: a demonstration using the getting to outcomes framework. Health Educ Behav Off Publ Soc Public Health
Educ. 2009 Oct;36(5):810–28.
18. Wandersman A, Chien VH, Katz J. Toward an evidence-based system for innovation support for implementing
innovations with quality: tools, training, technical assistance, and quality assurance/quality improvement. Am J
Community Psychol. 2012 Dec;50(3–4):445–59.
19. Yazejian N, Metz A, Morgan J, Louison L, Bartley L, Fleming WO, et al. Co-creative technical assistance: essential
functions and interim outcomes. Evid Policy. 2019 Aug 1;15(3):339–52.
20. Spoth R, Guyll M, Lillehoj CJ, Redmond C, Greenberg M. Prosper study of evidence-based intervention implementation
quality by community–university partnerships. J Community Psychol. 2007;35(8):981–99.
21. Chilenski SM, Perkins DF, Olson J, Hoffman L, Feinberg ME, Greenberg M, et al. The power of a collaborative relationship
between technical assistance providers and community prevention teams: A correlational and longitudinal study. Eval
Program Plann. 2016 Feb 1;54:19–29.
22. Saunders S, Howard K, Orlinsky D. The Therapeutic Bond Scales: Psychometric Characteristics and Relationship to
Treatment Effectiveness. Psychol Assess [Internet]. 1989 Dec 1; Available from:
https://epublications.marquette.edu/psych_fac/304
23. Luborsky L, Barber JP, Siqueland L, Johnson S, Najavits LM, Frank A, et al. The Revised Helping Alliance Questionnaire
(HAq-II) : Psychometric Properties. J Psychother Pract Res. 1996;5(3):260–71.
24. Gaston L, Piper W, Debbane E, Bienvenu JP, Garant J. Alliance and Technique for Predicting Outcome in Short-and Long-
Term Analytic Psychotherapy. Psychother Res. 1994 Jan 1;4(2):121–35.
25. Horvath AO, Greenberg LS. Development and validation of the Working Alliance Inventory. J Couns Psychol.
1989;36(2):223–33.
26. Appelbaum SH, Steed AJ. The critical success factors in the client-consulting relationship. J Manag Dev. 2005 Jan
1;24(1):68–93.
27. Coaching evaluation. Problem Solving. Available from: https://floridarti.usf.edu/
28. de Haan E, Duckworth A, Birch D, Jones C. Executive coaching outcome research: The contribution of common factors such as relationship, personality match, and self-efficacy. Consult Psychol J Pract Res. 2013;65(1):40–57.
29. Le LT, Anthony BJ, Bronheim SM, Holland CM, Perry DF. A Technical Assistance Model for Guiding Service and Systems
Change. J Behav Health Serv Res. 2016 Jul;43(3):380–95.
30. Motes P, Hess P, editors. Index. In: Collaborating with Community-Based Organizations Through Consultation and
Technical Assistance [Internet]. Columbia University Press; 2007 [cited 2024 Mar 14]. p. 197–206. Available from:
https://www.degruyter.com/document/doi/10.7312/mote12872-013/html
31. de Villiers MR, de Villiers PJT, Kent AP. The Delphi technique in health sciences education research. Med Teach. 2005 Nov
1;27(7):639–43.
32. Brady SR. The Delphi Method. In: Jason LA, Glenwick DS, editors. Handbook of Methodological Approaches to
Community-Based Research: Qualitative, Quantitative, and Mixed Methods [Internet]. Oxford University Press; 2015 [cited
2024 Mar 27]. p. 0. Available from: https://doi.org/10.1093/med:psych/9780190243654.003.0007
33. Diamond IR, Grant RC, Feldman BM, Pencharz PB, Ling SC, Moore AM, et al. Defining consensus: a systematic review
recommends methodologic criteria for reporting of Delphi studies. J Clin Epidemiol. 2014 Apr;67(4):401–9.
34. Niederberger M, Spranger J. Delphi Technique in Health Sciences: A Map. Front Public Health [Internet]. 2020 Sep 22
[cited 2024 Mar 10];8. Available from: https://www.frontiersin.org/journals/public-
health/articles/10.3389/fpubh.2020.00457/full
35. Marques JBV, de Freitas D. The DELPHI method: characterization and potentialities for educational research. Pro-Posições. 2018 Aug;29(2):389–415.
36. San-Jose L, Retolaza JL. Is the Delphi method valid for business ethics? A survey analysis. Eur J Futur Res. 2016
Dec;4(1):1–15.
37. Aazami S, Mozafari M. Development of a scale for the evaluation of patients’ rights prerequisites at educational
hospitals in Iran: a study using the Delphi technique. J Med Ethics Hist Med. 2015 Nov 14;8:12.
38. Gattrell WT, Logullo P, Van Zuuren EJ, Price A, Hughes EL, Blazey P, et al. ACCORD (ACcurate COnsensus Reporting
Document): A reporting guideline for consensus methods in biomedicine developed via a modied Delphi. PLOS Med.
2024 Jan 23;21(1):e1004326.
39. Hsu CC, Sandford B. The Delphi Technique: Making Sense of Consensus. Pract Assess Res Eval [Internet]. 2019 Nov
23;12(1). Available from: https://scholarworks.umass.edu/pare/vol12/iss1/10
40. Strøm M, Lönn L, Bech B, Schroeder TV, Konge L, Aho P, et al. Assessment of Competence in EVAR Procedures: A Novel
Rating Scale Developed by the Delphi Technique. Eur J Vasc Endovasc Surg. 2017 Jul 1;54(1):34–41.
41. Robinson CH, Damschroder LJ. A pragmatic context assessment tool (pCAT): using a Think Aloud method to develop an
assessment of contextual barriers to change. Implement Sci Commun. 2023 Jan 11;4(1):3.
42. Stanick CF, Halko HM, Nolen EA, Powell BJ, Dorsey CN, Mettert KD, et al. Pragmatic measures for implementation
research: development of the Psychometric and Pragmatic Evidence Rating Scale (PAPERS). Transl Behav Med. 2021
Feb 11;11(1):11–20.
43. Cross-Technology Transfer Center (TTC) Workgroup on Virtual Learning. Providing behavioral workforce development
technical assistance during COVID-19: adjustments and needs. Transl Behav Med. 2022 Jan 1;12(1):ibab097.
44. Capacity building for household surveys: Providing technical assistance in the face of COVID-19 [Internet]. 2021 [cited
2024 Mar 14]. Available from: https://blogs.worldbank.org/opendata/capacity-building-household-surveys-providing-
technical-assistance-face-covid-19
45. Greenhalgh T, Payne R, Hemmings N, Leach H, Hanson I, Khan A, et al. Training needs for staff providing remote services
in general practice: a mixed-methods study. Br J Gen Pract. 2023 Dec 29;74(738):e17–26.
Footnotes
1. If two items were redundant, we kept the item with the higher rank and agreement.
Figures
Figure 1
Scale modications across modied Delphi survey rounds.
Note. Decisions made during Delphi panel discussions are reected under Delphi Round 1.
Supplementary Files
This is a list of supplementary files associated with this preprint.
ACCORDDELPHIchecklistTAEngagementScaleMs.docx