Administration and Policy in Mental Health and Mental Health Services Research
https://doi.org/10.1007/s10488-022-01248-5
ORIGINAL ARTICLE
The Interaction Between General and Strategic Leadership and Climate on Their Multilevel Associations with Implementer Attitudes Toward Universal Prevention Programs for Youth Mental Health: A Cross-Sectional Study
Yanchen Zhang1 · Clay Cook2 · Lindsay Fallon3 · Catherine Corbin4 · Mark Ehrhart5 · Eric Brown6 · Jill Locke4 · Aaron Lyon4
Accepted: 20 December 2022
© The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2022
Abstract
Emerging literature has highlighted the importance of discerning general and strategic organizational context (OC) factors
(e.g., leadership and climate) and their interaction effect on individual implementation behaviors (e.g., attitudes toward
evidence-based practices; EBPs) in youth mental healthcare. This study aimed to examine how leadership and climate (gen-
eral and strategic) are associated with implementer attitudes toward EBPs across the individual and organizational levels
and their interaction effect in schools. A series of multilevel models (MLMs) were fitted on a diverse sample of schools
actively implementing universal prevention programs for youth mental health (441 implementers from 52 schools). The
organization-level aggregates and individual educators’ perceptions of general and strategic leadership and climate, and their
interaction terms, were entered as level-2 and level-1 predictors of four attitudinal dimensions (Requirement, Openness,
Appeal, and Divergence) based on their level of measurement. At the organizational level, higher levels of strategic leader-
ship and climate, but not their general counterparts, were consistently associated with more favorable attitudes in all four
dimensions. At the individual/within-school level, higher levels of perceived general and strategic leadership and climate
were associated with more favorable attitudes of Requirement and Openness. At the organizational/between-school level,
general climate moderated the positive effect of strategic climate on implementers’ perception of appeal and divergence of
EBPs. Our findings indicate that leaders should make data-based decisions about allocating resources to strategic and/or general leadership and climate to foster favorable staff attitudes toward EBPs, depending on the level of measurement, implementation specificity, and the attitudinal dimension of interest.
Keywords General and strategic organizational factors · Leadership · Organizational climate · Organizational context · Attitudes toward EBPs
* Yanchen Zhang
yanchen-zhang@uiowa.edu
1 Department of Psychological & Quantitative Foundations, The University of Iowa, 361 Lindquist Center, Iowa City, IA 52242, USA
2 Department of Educational Psychology, University of Minnesota, 341 Education Sciences Building, 56 East River Road, Minneapolis, MN 55455, USA
3 Department of Counseling and School Psychology, University of Massachusetts-Boston, 100 William T. Morrissey Blvd., Boston, MA 02125, USA
4 Department of Psychiatry & Behavioral Sciences, University of Washington, 6200 NE 74th Street, Suite 110, Box 354920, Seattle, WA 98115, USA
5 Department of Industrial/Organizational Psychology, University of Central Florida, 4111 Pictor Lane, Orlando, FL 32816, USA
6 Department of Public Health Sciences, University of Miami, Coral Gables, FL 33124, USA
Youth with unmet mental health needs are at risk for long-
term negative outcomes, including interpersonal conflict,
unemployment, encounters with the legal system, and early
death (U.S. Surgeon General's Advisory, 2021). In response, rigorous research has established various evidence-based practices (EBPs) to prevent and address mental health concerns in children and youth (Bruns et al., 2016). However,
these EBPs are infrequently adopted or rarely delivered with
sufficient fidelity to achieve expected youth mental health
outcomes (Lyon & Bruns, 2019). Implementation research-
ers have identified factors (i.e., determinants) across differ-
ent social-ecological levels that either enable or obstruct
the successful implementation of EBPs. One important fac-
tor identified in the literature is the organizational context
(OC), which is the inner setting where service providers
and recipients reside and EBP implementation occurs. The
implementation literature has consistently identified asso-
ciations between OC factors (e.g., leadership and climate)
and implementation outcomes (Lyon etal., 2018a, 2018b;
Williams etal., 2020). Furthermore, emerging research
has highlighted the differential effects of general (i.e.,
molar and non-implementation-specific) versus strategic
OC factors (i.e., specific and proximal to EBP implemen-
tation) on implementation outcomes (Powell etal., 2017).
Additionally, research has identified the characteristics of
implementers as individual-level determinants of success-
ful implementation (e.g., attitudes toward EBPs; Fishman
etal., 2021; Lyon etal., 2019). Although the implementation
literature has established the importance of organizational-
and individual-level factors, little research has examined the
cross-level association between OC factors (e.g., leadership
and climate) and individual-level attitudes towards EBPs.
Thus, this study aims to examine (a) the joint and cross-
level associations between general and strategic OC factors
(leadership and climate) and implementer attitudes toward
EBPs in school-based youth mental healthcare, and (b) the
moderation effects of general OC factors on the associa-
tions between their strategic counterparts and implementer
attitudes toward EBPs.
Education Sector as an Ideal Setting for Universal Prevention for Youth Mental Health
Universal prevention is a critical element of tiered public health frameworks that emphasize providing a continuum of youth mental healthcare. The goal of universal prevention is to keep youth mental health problems from emerging, reduce existing mental health problems, and enhance wellbeing-promoting factors (Greenberg & Abenavoli, 2017). The education sector (e.g., schools) provides unparalleled opportunities for the delivery of universal prevention because it is where youth spend most of their waking hours. Schools provide youth with consistent access to mental healthcare in a less stigmatizing setting and alleviate barriers common to other youth-serving settings (e.g., primary care). This is particularly true for youth from historically underserved groups (Pescosolido et al., 2021). The education sector is therefore one of the most common settings for the delivery of universal prevention for youth mental health (Duong et al., 2021; Durlak et al., 2011).
Numerous universal prevention programs have been
developed and tested for youth mental health in schools
(Horner etal., 2009; Kellam etal., 2011; Nese etal., 2016).
However, the implementation gap has significantly reduced
the public health benefits of these prevention programs,
as evidenced by the persistent prevalence of youth mental
health problems in schools (Greenberg & Abenavoli, 2017).
Implementing universal prevention programs with adequate
fidelity is complicated and requires deliberate attention to
determinants or factors that either obstruct or enable suc-
cessful implementation (Powell etal., 2015; Waltz etal.,
2015). The Consolidated Framework for Implementation
Research (CFIR) categorized implementation factors into
four social-ecological levels of influence, including outer
setting (e.g., policy), inner setting (e.g., leadership, climate),
innovation-specific (e.g., acceptability, feasibility), and
individual (e.g., beliefs and attitudes towards EBPs; Dam-
schroder etal., 2015, 2022). Many studies have focused on
factors at a single level (Allen etal, 2020; Lui etal., 2021).
This single-level approach significantly impedes our under-
standing of how factors across different levels of the imple-
mentation context relate to one another and combine to
explain variability in individuals’ implementation behaviors
and client outcomes (Powell etal., 2017). To promote the
successful implementation of universal prevention programs
in schools, researchers need to examine the cross-level inter-
play between factors within both individual implementers
and the organizational context of the education sector (e.g.,
Locke etal., 2019; Williams etal., 2018).
Organizational Context for EBP Implementation
It is widely recognized that structures and processes in the
organizational context (OC; e.g., leadership and climate)
can either facilitate or impede successful implementation
because the organizational context is the specific microsys-
tem where implementation happens and implementers reside
(Aarons etal., 2014a, 2014b; Lyon etal., 2018a, 2018b).
The organizational context comprises inner setting factors that are more proximal, and more theoretically linked, to implementers' behaviors than outer setting factors that operate from outside the school (e.g., state education policy; Lyon et al., 2018a, 2018b). Moreover, OC
factors are multilevel by nature because they not only repre-
sent the characteristics of an organization when aggregated
across its employees but also reflect an individual psycho-
logical phenomenon when assessed for a single employee
(Weiner, 2009). Consistent with mainstream implementation
frameworks, general (i.e., molar, non-implementation-spe-
cific) and strategic (i.e., implementation-specific) OC fac-
tors describe the immediate context where implementation
occurs (Williams & Glisson, 2014).
General and Strategic Leadership and Climate
Leadership supportive of EBP implementation in schools is
associated with improved implementation outcomes of EBPs
by school-based mental health providers (Langley etal.,
2010) and educators (Lyon etal., 2018a, 2018b), as well
as improved youth mental health outcomes (Fagan etal.,
2019). Implementation researchers differentiate between
general and strategic leadership to understand the specific
ways in which leadership influences EBP implementation
in service settings (Aarons etal., 2014a, 2014b; Carlson
etal., 2021). General leadership represents broad charac-
teristics of leaders (e.g., transformational leadership) that
do not specifically target staff’s implementation of EBPs
but are associated with establishing a generally positive and
supportive context for staff. Strategic leadership represents
implementation-specific strategies that leaders exhibit to
deliberately promote implementation success. For instance, educators perceive a leader high in strategic leadership as being knowledgeable about EBPs and as proactively communicating the expectation that staff prioritize EBP implementation (Aarons et al., 2014a, 2014b).
Another established organizational context factor related
to implementation is organizational climate. Organizational
climate is defined as the “shared meaning organizational
members attach to the events, policies, practices, and pro-
cedures they experience and the behaviors they see being
rewarded, supported, and expected” (Ehrhart etal., 2014a,
2014b, p. 69). Like leadership, researchers differentiate
general versus strategic climate to facilitate a nuanced
understanding of how organizational climate influences the
behaviors of staff working in the organization (Ehrhart etal.,
2014a, 2014b; Williams etal., 2018). Approaches to general
climate focus on describing the overall work environment
and typically capture shared perceptions of staff about the
extent to which the organization encourages productivity or
employee well-being (James etal., 2008; Williams & Glis-
son, 2014). Considering the scope of this study, we operationalize general climate broadly, focusing on school staff's shared perceptions of organizational health and needs (Bradshaw et al., 2008; Domitrovich et al., 2015). Specifically, general climate in the context of this study refers to school staff's shared, gestalt perceptions of their daily functioning and of the organizational needs and factors that contribute to a healthy school environment (Hoy & Fredman, 1987).
Although general climate encompasses many dimensions
(e.g., support, resources, cooperation), it does not capture
the aspects of organizational climate that target EBP imple-
mentation (Lyon et al., 2018a, 2018b). In contrast, strategic climate focuses specifically on the implementation of EBPs and represents people's shared perceptions of whether EBP implementation is expected, prioritized, supported, and rewarded in their organization, based on their experiences with the organization's policies, procedures, and practices (Ehrhart et al., 2014a, 2014b; Weiner et al., 2011). For instance, schools with a strong strategic climate for EBP implementation are likely to establish facilitative policies, recognition systems, and EBP training
systems. These aspects of strategic climate will create the
conditions that school staff directly experience, which will
subsequently influence their perceptions of the prioritiza-
tion of EBP implementation at school (Lyon etal., 2018a,
2018b). Therefore, a positive strategic climate is likely
to influence implementers’ individual-level factors (e.g.,
favorable attitudes and subjective norms for implementing
EBPs) to improve implementation behavior and outcomes
(Aarons etal., 2012a, 2012b).
Interaction Between General and Strategic OC Factors
Neither general nor strategic OC factors (e.g., leadership, climate) alone constitute a necessary and sufficient condition for promoting the implementer characteristics and behaviors that yield positive implementation outcomes (Lyon et al., 2018a, 2018b; Meza et al., 2021). Emerging literature suggests that general and strategic leadership or climate likely interact to establish a microsystem for staff that is conducive to implementation (Williams et al., 2018). Organizational theories suggest that strategic factors do not exist in a vacuum but rely on their general counterparts, which form the foundation of support and shared experience for staff in the organization (Rhoades & Eisenberger, 2002; Schneider et al., 2010). Only when staff feel attached to and rewarded by their organization, owing to positive general OC factors, are they likely to respond to the tasks their organization's strategic leadership and/or climate prioritizes (e.g., implementing EBPs). Few studies have probed the interaction effect
between general and strategic OC factors on individual-level
implementation outcomes (e.g., use of EBPs; Williams etal.,
2018), but the limited evidence is encouraging. In the case
of climate, Williams etal. (2018) found that higher levels of
strategic climate predicted staff’s increased use of EBPs only
in organizations with high levels of general climate, but not
in those with suboptimal general climate.
Attitudes as a Pivotal Individual-Level Implementation Factor
Variation in implementation outcomes of individuals
within the same organizational context is pervasive in
real-world service settings (Sanford DeRousie & Bierman,
2012). Therefore, malleable individual-level factors are
important to target because they are the critical and imme-
diate precursors to individuals’ enactment of expected
implementation behaviors (e.g., successful uptake, imple-
mentation fidelity; Low etal., 2016; Lyon etal., 2018a,
2018b). Implementer attitudes toward EBPs reflect their
favorable or unfavorable evaluative judgments regarding
the adoption and use of EBPs, which is established in the
literature as a prominent motivational implementation
factor (Aarons & Sawitzky, 2006; Fishman etal., 2021).
The most widely used measure for implementer attitudes
toward EBPs in the field of implementation science is the
Evidence-Based Practice Attitude Scale (EBPAS; Aarons
& Sawitzky, 2006), and a collection of existing research
has established that implementer attitudes toward EBPs
(measured by EBPAS) are associated with various imple-
mentation outcomes (e.g., intention to implement, adop-
tion, fidelity, knowledge and use of EBPs, Aarons etal.,
2012a, 2012b; Gregory etal., 2005; Melas etal., 2012).
For instance, educational research indicates that teach-
ers’ attitudes toward certain EBPs influence their fidelity
of implementation (Bowden etal., 2003). In the fields of
healthcare, higher scores on EBPAS have been associated
with clinicians’ higher levels of knowledge about EBPs
(Melas etal., 2012), as well as increased adoption and
delivery of EBPs (Nelson etal., 2012; Smith & Manfredo,
2011). Moreover, based on the theory of planned behav-
ior, attitudes are one of the main mechanisms that influ-
ence clinicians’ intentions to implement EBPs, which is
a proximal predictor of the actual implementation (e.g.,
adoption of EBPs; Godin & Kok, 1996). Although imple-
menter attitudes are commonly conceptualized as an indi-
vidual-level factor, they can also be aggregated to their
organizational-level counterpart that reflects the collec-
tive attitudes toward EBPs held by all individuals in an
organization (List, 2014; Galam & Moscovici, 1991).
Hence, implementation research has widely explored the
multilevel associations between OC factors and attitudes
(Aarons etal., 2012a, 2012b; Fishman etal., 2021).
Associations Between OC Factors and Implementer Attitudes
Social-cognitive theories emphasize how individuals’ atti-
tudes toward certain behaviors are influenced by the way
they perceive themselves and the social context in which
they reside (e.g., social-cognitive theory, Bandura, 1999;
theory of planned behavior; Ajzen, 1991). In the case of
EBP implementation, implementers exist in and experi-
ence the OC where one is expected to implement EBPs.
Hence, their attitudes towards EBPs are likely influenced
by organizational-level factors, including local policies,
leadership behaviors, climate, etc. (Farahnak etal., 2020;
Forman etal., 2013; Han & Weiss, 2005; Powell etal.,
2017). Collectively, OC factors and individual attitudes
can explain the variation in individual-level implementa-
tion behaviors (e.g., adoption and fidelity; Powell etal.,
2017).
Theoretically, the association between OC factors and
implementer attitudes toward EBPs will vary across different
attitudinal dimensions assessed by existing attitude meas-
ures (e.g., EBPAS; Aarons & Sawitzky, 2006). For instance,
the dimension of Appeal (staff perceive EBPs as appeal-
ing) may have a stronger relationship with general climate
that supports autonomy and strategic climate that explic-
itly rewards implementation efforts than other dimensions
(e.g., Requirement, staff perceive EBPs as required). This is
because autonomy and outcome expectancies can positively
influence one’s affective attitudinal beliefs (i.e., consider
something as appealing, Cook etal., 2018; Schwarzer etal.,
2011). Hence, research is needed to disaggregate implement-
ers’ overall attitudes to examine specific attitudinal dimen-
sions. This can help develop and optimize implementation
strategies and policies that target the differential associa-
tions between OC factors and specific attitudinal dimensions
(Birken & Currie, 2021).
Aims
Several gaps exist in the literature that warrant further inves-
tigation into the associations between general and strate-
gic OC factors, and implementer attitudes toward EBPs in
youth mental healthcare. First, although attitudes represent
an important individual-level implementation determinant, a
systematic review identified that implementation research to
date is ambiguous in the methods used to measure and ana-
lyze attitudes (Fishman etal., 2021). For instance, although
some studies have examined the joint and cross-level asso-
ciations between the general and strategic OC factors and
implementer attitudes (e.g., Powell etal., 2017), few have
examined the interaction between general and strategic OC
factors, or the disaggregated effects of OC factors at both
organizational and individual levels. Second, although lit-
erature suggests that general and strategic climate interact to
influence implementers’ adoption of EBPs (Williams etal.,
2018), no study has examined the interaction effect between
general and strategic OC factors on implementer attitudes.
This line of inquiry is important for the development of
implementation strategies to promote implementers’ sup-
portive attitudes toward and subsequent use of EBPs, as the
association between strategic OC factors and implementers’
attitudes may depend on the level of general OC factors.
To address these gaps and extend existing theory and research, this study used multilevel models to explore
the cross-level associations between leadership and climate
(general and strategic) and school-based implementers’
(educators and school mental health professionals) atti-
tudes toward universal prevention programs for school-based
youth mental health (Fig.1). Three research questions (RQs)
were addressed: (1) Do implementers from the same school
share significant similarities in four dimensions of their atti-
tudes toward EBPs (Requirement, Openness, Appeal, and
Divergence)? (2) To what extent are general and strategic
leadership or climate at organization and individual levels
differentially associated with specific dimensions of imple-
menter attitudes toward EBPs after controlling for covari-
ates? (3) Do general and strategic leadership or climate
interact at organization- and/or individual-levels to explain
variation in specific dimensions of implementer attitudes
beyond these OC factors alone?
Methods
Setting and Participants
The analytic sample included 441 school-based practition-
ers (educators, staff, consultants, and mental health pro-
fessionals) nested in 52 public elementary schools in six
diverse urban districts (enrollment at the time of this study:
M(Non-White) = 61.16%; M(FRPL) = 15.1%; Table 1) across three
states in the Midwestern and Western U.S. An average of
eight educators per school were recruited (range: 5 to 14).
The sample demographics are largely consistent with the U.S. teacher population (Tipton & Miller, 2022). Most educators self-identified as female (89%) and as White and non-Hispanic (84%); the modal age range was 25 to 34 years (29%), most held a master's degree (68%), and the average work experience was 11.6 years (SD = 7; Table 2). Upon the completion of data collection,
we ran χ2 tests with continuity correction and one-sample
t-tests to compare schools in the analytic sample and the six
districts on key school characteristics (Blasius & Brandt,
2010). The results generally supported the sample repre-
sentativeness (Table1).
Procedure
Human subject approval was obtained from the university
IRB and partnering schools’ research divisions. This study
is part of a nationwide project for the implementation of uni-
versal prevention programs for school-based youth mental
health in the 2017–2018 academic year. Purposive sampling
(Palinkas et al., 2015) was used to identify candidate schools that were actively implementing either of two universal
prevention programs for youth mental health (school-wide
positive behavior intervention and supports, SWPBIS;
Horner etal., 2009; Promoting Alternative Thinking Strate-
gies curriculum, PATHS; Domitrovich etal., 2007). A strati-
fied cluster random sample was created from the pool of can-
didate schools based on urbanicity, enrollment, and diversity.
Using a random number generator (Urbaniak & Plous, 2013), administrators of participating schools randomly selected 15 employees, who were invited to consent to participate.
Fig. 1 Conceptual model for the hypothetical multilevel associations among general and strategic leadership/climate at school and individual levels and individual implementers' attitudes toward EBPs. Solid red lines = regression coefficients; dashed red lines = the moderation effect of general leadership/climate on the association between its strategic counterpart and each of the four attitudinal dimensions (Color figure online)
For generalizability to the public education sector, there were no inclusion or exclusion criteria for participants (Powell et al., 2017). The literature supports aggregating responses from at least three employees per organization as a reliable approach to measuring organization-level factors (Glick, 1985; Williams et al., 2018). The random sampling of schools (level-2 units) and school-based practitioners (level-1 units) supports generalization of the findings to the broader population involved in school-based youth mental healthcare, because units at both levels were drawn from level-specific random sampling distributions (Lucas, 2014; Mang et al., 2021). Participants were
emailed a Qualtrics survey link with a 1-month window to
complete, followed by weekly email reminders. Respondents
received an incentive of $75 for completing the survey. The
final response rate was 86.67% for schools and 73.5% for
employees, which was consistent with similar school-based
implementation research (Cook etal., 2018).
Measures
Covariates
This study controlled for theoretically relevant covari-
ates at the organization and individual levels as poten-
tial confounders. The covariates were collected via
school administrative data or Qualtrics survey, includ-
ing enrollment (linearly transformed to units of 1,000 students for commensurate scales), school-level demographics of
students (percentages of non-White and receiving free-/
reduced-priced lunch; Table1), and educators (age, gen-
der, grade level, race, and work experience; Table2). In
the survey, educators’ age, gender, race, and grade level
were coded as categorical variables, which were entered
into the MLMs as n-1 dummy variables (n is the number
of categories of a covariate; see note in Table3).
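As a minimal illustration of this coding scheme (not the authors' analysis code), the sketch below shows how categorical covariates can be expanded into n-1 dummy variables with an explicit reference category before model entry; the data frame and column names are hypothetical.

import pandas as pd

# Hypothetical educator-level covariates; values and column names are illustrative only.
educators = pd.DataFrame({
    "gender": ["male", "female", "female", "female"],
    "grade_level": ["K-2", "3-5", "K-2", "3-5"],
})

# Declare the reference category (coded 0 in the note to Table 3) as the first category,
# then drop it so each covariate contributes n-1 dummy columns to the MLMs.
educators["gender"] = pd.Categorical(educators["gender"], categories=["male", "female"])
educators["grade_level"] = pd.Categorical(educators["grade_level"], categories=["K-2", "3-5"])
dummies = pd.get_dummies(educators, columns=["gender", "grade_level"], drop_first=True)
print(dummies.head())  # columns: gender_female, grade_level_3-5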
General Leadership
The Multifactor Leadership Questionnaire-Education ver-
sion (MLQ; Avolio & Bass, 1995) was used to assess edu-
cators’ perceived general leadership qualities of school
leaders. We adopted five subscales assessing two domains
from MLQ based on their relevance to EBP implementa-
tion. The transformational leadership domain (20 items;
α = 0.96) consists of four subscales: idealized influence,
inspirational motivation, intellectual stimulation, and
individualized consideration. The transactional leader-
ship domain (four items; α = 0.85) consists of the contingent reward subscale. All items are rated on a 5-point Likert scale rang-
ing from 0 (not at all) to 4 (frequently). The MLQ has
demonstrated adequate reliabilities (subscales’ α ranged
from 0.87 to 0.91 in this sample) and concurrent and pre-
dictive validity (Aarons & Sawitzky, 2006).
Table 1 School-level demographic information (n = 52)
FRPL free and reduced-priced lunch program
Variables n
State 3
District 6
School 52
Universal Prevention Programs SWPBIS 39
PATHS 13
School demographics M Min Max
Enrollment 532 249 976
Student diversity %White 38.84 0.1 79.5
%Mixed race 9.25 0 17.5
% Pacific Islander 1.19 0 10.3
% Black/African American 15.13 0 97.7
% Asian 13.11 0 39.5
% Native 1.18 0 6.5
% Hispanic 22.01 3 99
% FRPL 15.1 3 47.1
Sample representativeness Test used Statistics p
% FRPL Pearson Chi-Square with continuity correction 0.004 0.95
%Non-white 0.92 0.34
Enrollment One-sample t test 0.087 0.93
Strategic Leadership
The school version of Implementation Leadership Scale
(ILS; Lyon etal., 2018a, 2018b, 2021) was adapted from the
original ILS (Aarons etal., 2014a, 2014b) to assess educa-
tors’ perceptions of their school leaders’ behaviors relevant
to the implementation of generally used EBPs in schools
(e.g., SWPBIS, PATHS). The ILS has 12 items loading onto
four subscales, including Proactive, Knowledgeable, Sup-
portive, and Perseverant. All items are scored on a 5-point
Likert-Scale ranging from 0 (not at all) to 4 (very great
extent). The ILS demonstrated excellent internal consistency
(subscales’ α ranged from 0.95 to 0.98 in this sample) and
discriminant validity from general leadership (Lyon etal.,
2018a, 2018b).
General Organizational Climate
The Organizational Health Inventory for Elementary
Schools (OHI; Hoy & Tarter, 1997) was administered to
assess the general organizational climate and health in
school settings. The theoretical basis of OHI included two
organizational needs (instrumental and expressive) and three
types of influences over these needs for a healthy function-
ing school (technical, managerial, and institutional; Hoy &
Woolfolk, 1993). The OHI consists of 37 items that measure
educators’ collective appraisal of five dimensions, including
institutional integrity (the school’s ability to cope success-
fully with destructive outside forces; teachers are protected
from unreasonable community and parental demands),
staff affiliation (warm and friendly interactions, positive
feelings about colleagues, commitment to students, trust
and confidence among the staff, and sense of accomplish-
ment), academic emphasis (students are cooperative in the
classroom, respectful of other students who get good grades,
and are driven to improve their skills), collegial leadership
(principal’s behavior is friendly, supportive, open, egalitar-
ian, and neither directive nor restrictive), and resource influ-
ence (principal’s ability to lobby for resources for the school
and positively influence the allocation of district resources).
Table 2 Educator demographics (N = 441)
No missing value was detected in demographic variables, therefore the total n for each variable is 441
M mean, SD standard deviation
Demographics Category values n (%)
Age (years) 18 to 24 21 (4.8)
25 to 34 129 (29.4)
35 to 44 121 (27.6)
45 to 54 103 (23.5)
55 to 64 61 (13.9)
65 to 74 4 (0.9)
Gender Male 46 (10.5)
Female 391 (89.3)
Other 1 (0.2)
Ethnicity Latino/Hispanic 31 (7.1)
Non-Latino/Hispanic 407 (92.9)
Race American Indian or Alaskan Native 8 (1.8)
Asian 6 (1.4)
Black or African American 22 (5.1)
Native Hawaiian or Pacific Islander 1 (0.2)
White or Caucasian 363 (83.8)
Multiracial 21 (4.8)
Other 12 (2.8)
Highest degree earned Bachelors 140 (32)
Masters 297 (67.8)
Doctoral 1 (0.2)
Grade K—2nd 191 (43.3)
3rd—5th and other 250 (56.7)
M SD
Years in current profession 11.6 7
The items are rated on a 4-point Likert-scale ranging from
1 (rarely occurs) to 4 (very frequently occurs). The OHI
demonstrated good internal consistency, convergent, and dis-
criminant validity in prior research (e.g., Bevans etal., 2007)
and in this sample (subscales’ α ranged from 0.84 to 0.94).
Strategic Climate
The school-specific Implementation Climate Scale (ICS;
Lyon etal., 2018a, 2018b; Thayer etal., 2022) was adapted
from the original ICS (Ehrhart etal., 2014a, 2014b) to assess
educators’ perceptions of the implementation-specific cli-
mate of their schools that support the implementation of
EBPs generally used in schools. The school-specific ICS has
nine first-order factors (focus on EBP, educational support
for EBP, recognition for EBP, rewards for EBP, selection for
EBP, selection for openness, use of data, existing supports
to deliver EBP, and EBP integration) under a second-order
factor of strategic climate. Twenty-nine items are rated on
a 5-point Likert scale ranging from 0 (not at all) to 4 (to a great extent).
Table 3 Acronyms and descriptive statistics of variables in the models
Level-1 interaction terms are products of group-centered individual-level predictors. Level-2 interaction terms are products of aggregates of
individual-level predictors. The grade level, age range, race, and gender of teachers were categorical variables. Grade level: 0 = ‘Kindergarten to
2nd grade’, 1 = “3rd to 5th grades”. Age range: 0 = “18–24years”, 1 = “25–34 years”, each category increase by 6years with the final category
6 = “75years or older”. Gender: 0 = “male”, 1 = “female”. Race: 0 = “white”, 1 = “non-white minority groups”. 0 was used as reference category
for all categorical variables
M mean, SD standard deviation
Category Level of data Acronym Variable description Valid n, M ± SD Min–max
Demographic covariates School ENR School enrollment size 52, 536.93 ± 172.35 249–976
DIV Percentage of non-white students enrolled 52, 41.13 ± 25.86 0.8–107
Individual GRD Grade level an educator serves 441, NA NA
AGE Age range of an educator in 7-year increments 441, NA NA
GEN Gender of an educator 441, NA NA
EXP Educator’s experience in years 441, 11.62 ± 6.95 1–20
RACE Educators’ race 441, NA NA
Outcome variable Individual ATT Individual teachers’ attitudes toward Tier 1 EBPs as meas-
ured by EBPAS
441, 3.15 ± 0.54 1.29–4
REQ Individual teachers’ perceptions regarding if delivering
EBPs is required
441, 2.96 ± 0.87 0–4
OPN Individual teachers’ perceptions regarding openness to
delivering EBPs
441, 3.11 ± 0.68 0–4
APL Individual teachers’ perceptions regarding if delivering
EBPs is found to be appealing
441, 3.32 ± 0.71 0–4
DVG Individual teachers’ perceptions that diverge from deliver-
ing EBPs
441, 3.13 ± 0.52 1.25–4
Predictors School S-ILS School-level aggregated strategic leadership measured by
ILS
52, 2.9 ± 0.65 0.81–3.71
S-MLQ School-level aggregated general leadership measured by
MLQ
52, 2.82 ± 0.56 0.99–3.51
S-ICS School-level aggregated strategic climate measured by ICS 52, 2.05 ± 0.53 0.58–3.18
S-OHI School-level aggregated general climate measured by OHI 52, 2.92 ± 0.22 2.31–3.28
Individual ILS Individual-level strategic leadership measured by ILS 441, 2.93 ± 0.88 0–4
MLQ Individual-level general leadership measured by MLQ 441, 2.85 ± 0.82 0–4
ICS Individual-level strategic climate measured by ICS 441, 2.07 ± 0.81 0–4
OHI Individual-level general climate measured by OHI 441, 2.94 ± 0.34 2–4
Interaction terms School S-INT-L School-level interaction term of leadership created by the
multiplication of S-ILS and S-MLQ
N/A N/A
S-INT-C School-level interaction term of climate created by the
multiplication of S-ICS and S-OHI
N/A N/A
Individual INT-L Individual-level interaction term of leadership interaction
term created by the multiplication of ILS and MLQ
N/A N/A
INT-C Individual-level interaction term of climate interaction
term created by the multiplication of ICS and OHI
N/A N/A
In this sample, the subscales' α ranged from 0.81 to 0.91 (Lyon et al., 2018a, 2018b).
Attitudes Toward EBPs
The 15-item version of the Evidence-Based Practice Atti-
tude Scale (EBPAS) was used to assess the extent to which
educators possess favorable attitudes toward the adoption
and delivery of EBPs (Aarons & Sawitzky, 2006; Rye etal.,
2017). The EBPAS assesses four attitudinal dimensions:
Appeal (the extent to which educators would adopt an EBP
if it appeared intuitively appealing, could be used correctly,
or their peers were happy with using it), Requirements (the
extent to which educators would adopt an EBP if required by
the school or district), Openness (the extent to which educa-
tors are open to new EBPs and willing to try more manual-
ized EBPs), and Divergence (reverse-scored; the extent to
which educators perceive EBPs as useless and less important
than their field experience); the total score captures imple-
menter global attitudes toward EBP. Items are scored on a
5-point Likert scale ranging from 0 (not at all) to 4 (very
great extent). A higher score indicates more favorable atti-
tudes on that attitudinal dimension. Prior studies supported
EBPAS’ adequate internal consistency, convergent, and dis-
criminant validity (subscales α ranged from 0.59 to 0.93;
e.g., Aarons etal., 2004, 2007; Cook etal., 2018). In this
sample, all four subscales demonstrated adequate reliability.
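The internal consistencies reported for each measure above are Cronbach's alpha coefficients; the following is a minimal, generic sketch of that calculation (not tied to the study's data files), where the item matrix and its shape are assumptions for illustration.

import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    # items: 2-D array with rows = respondents and columns = the items of one subscale.
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the subscale sum score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Example with simulated responses for a hypothetical 12-item subscale (0-4 Likert scale).
rng = np.random.default_rng(0)
simulated = rng.integers(0, 5, size=(441, 12))
print(round(cronbach_alpha(simulated), 2))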
Data Analysis
We used multilevel models (MLMs; i.e., general linear
mixed models; Hoffman & Walters, 2022) to address the
nested data and cross-level associations among the vari-
ables. First, descriptive statistics [distribution, bivariate,
and intra-class correlations (ICCs); Tables1, 2, 3, 4, and
5] were calculated for all variables (uncentered) based on
their levels of measurement. The results supported the sam-
ple adequacy for MLMs after level-specific centering of
variables of OC factors (Hamaker & Muthen, 2020; Online
Appendix1). Specifically, the individual-level variables of
OC factors were centered around their group means. Hence,
the coefficients of these variables represent the within-
school effect of OC factors (i.e., individual-level effect).
The organizational-level variables of OC factors were con-
structed by first aggregating responses of individuals from
each school (i.e., group means; Bliese, 2000, pp. 349–380).
Then, we centered the organizational-level variables of OC
factors around their grand mean to enhance the interpret-
ability (Hoffman & Walters, 2022). The coefficients of these
variables represent the between-school effect of OC factors
(i.e., school-level effect). The average number of implementers per school in this sample was adequate for aggregation because research has shown that at least three participants are required to aggregate individual responses to represent an organizational characteristic (Scherbaum & Ferreter, 2009; Klein & Kozlowski, 2000).
Table 4 Bivariate pearson product-moment correlation matrix of all key variables in the models
The prefix "S_" indicates the school-level aggregate of an OC factor or attitudinal dimension of the EBPAS. The numbers above the diagonal indicate the sample size at the individual or school level. All individual-level and school-level variables were raw data, i.e., not centered
*p < 0.05; **p < 0.01; Level 1: n = 441; Level 2: n = 52
Individual level ILS ICS MLQ OHI OPN DVG APL REQ
ILS 441 441 441 441 441 441 441 441
ICS 0.76** 441 441 441 441 441 441 441
MLQ 0.79** 0.60** 441 441 441 441 441 441
OHI 0.47** 0.35** 0.42** 441 441 441 441 441
OPN 0.45** 0.45** 0.39** 0.34** 441 441 441 441
DVG 0.29** 0.20** 0.23** 0.28** 0.51** 441 441 441
APL 0.33** 0.24** 0.33** 0.32** 0.55** 0.38** 441 441
REQ 0.33** 0.29** 0.32** 0.23** 0.45** 0.29** 0.54** 441
School level S_ILS S_MLQ S_ICS S_OHI S_OPN S_DVG S_APL S_REQ
S_ILS 52 52 52 52 52 52 52 52
S_MLQ 0.87** 52 52 52 52 52 52 52
S_ICS 0.89** 0.75** 52 52 52 52 52 52
S_OHI 0.67** 0.61** 0.6** 52 52 52 52 52
S_OPN 0.61** 0.51** 0.71** 0.57** 52 52 52 52
S_DVG 0.44** 0.32* 0.41** 0.45** 0.69** 52 52 52
S_APL 0.54** 0.48** 0.55** 0.5** 0.76** 0.66** 52 52
S_REQ 0.44** 0.36** 0.46** 0.35* 0.46** 0.42** 0.52** 52
This approach was deemed
appropriate given the significant within-organization inter-
rater agreement (rwg ) in each OC factor, which exceeded the
recommended cutoff of 0.60 (range = 0.76 to 0.88; Lance
etal., 2006). Considering the dimensional research ques-
tions, instead of the total score, the Requirement, Openness,
Appeal, and Divergence subscale scores of EBPAS were
entered into the MLMs as separate outcomes to differentiate
their associations with OC factors.
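The following sketch illustrates the centering and aggregation steps described above (it is not the authors' analysis code); it assumes a long-format data frame with one row per educator, a school_id column, and illustrative column names for the raw OC factor scores.

import pandas as pd

# Hypothetical long-format survey data: one row per educator.
df = pd.read_csv("educator_survey.csv")  # assumed columns: school_id, ILS, MLQ, ICS, OHI
oc_factors = ["ILS", "MLQ", "ICS", "OHI"]

for var in oc_factors:
    school_means = df.groupby("school_id")[var].mean()  # school aggregates (group means)
    # Level-2 predictor: school aggregate centered at the grand mean of the school means.
    df[f"S_{var}"] = df["school_id"].map(school_means - school_means.mean())
    # Level-1 predictor: individual score centered at its own school mean, so its
    # coefficient reflects the within-school (individual-level) effect.
    df[f"{var}_wsc"] = df[var] - df["school_id"].map(school_means)

# Interaction terms as products of the level-specific predictors, mirroring the
# INT-/S-INT- terms in Table 3 (whether aggregates enter the products before or
# after centering is our assumption).
df["INT_C"], df["S_INT_C"] = df["ICS_wsc"] * df["OHI_wsc"], df["S_ICS"] * df["S_OHI"]
df["INT_L"], df["S_INT_L"] = df["ILS_wsc"] * df["MLQ_wsc"], df["S_ILS"] * df["S_MLQ"]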
A series of 2-level random-intercept-only MLMs were
fitted with HLM version 6.08 (Raudenbush etal., 2009) to
assess the cross-level associations between each of the four
attitudinal dimensions and general and strategic leadership
or climate (Online Appendix2). For RQ1, the ICCs were
calculated from the unconditional models (random-inter-
cept-only models without variables) to assess the magnitude
of clustering effects for each attitudinal dimension (Table5).
Then we entered the level-1 (individual) and level-2 (organi-
zational) variables into the unconditional models. Given the
complex configuration of our theoretical models (Fig.1), we
removed nonsignificant covariates for parsimony and power
preservation. Specifically, we first entered all the level-1
and 2 covariates into the unconditional models. Then, we
retained significant covariates in subsequent model building.
For RQ2, we built leadership- and climate-specific models
separately for each attitudinal dimension as the outcome.
First, we added a set of individual- and organizational-level variables of leadership or climate (general and strategic) to the unconditional models (Online Appendix 2). For RQ3, we entered interaction terms of general and strategic leadership or climate into the level-1 and level-2 equations of the models from RQ 2. The simultaneous entry of OC fac-
tors into both levels partitioned the within- and between-
organizational effects of general and strategic leadership or
climate on implementer attitudes.
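In standard two-level notation (our reconstruction of the models described above, with covariates omitted for brevity), the unconditional model for an attitudinal dimension Y and the conditional climate model can be written as:

Y_{ij} = \gamma_{00} + u_{0j} + r_{ij}, \qquad \mathrm{ICC} = \frac{\tau_{00}}{\tau_{00} + \sigma^{2}}

\text{Level 1: } Y_{ij} = \beta_{0j} + \beta_{1j}\,(\mathrm{ICS}_{ij} - \overline{\mathrm{ICS}}_{\cdot j}) + \beta_{2j}\,(\mathrm{OHI}_{ij} - \overline{\mathrm{OHI}}_{\cdot j}) + \beta_{3j}\,\mathrm{INT\text{-}C}_{ij} + r_{ij}

\text{Level 2: } \beta_{0j} = \gamma_{00} + \gamma_{01}\,\mathrm{S\text{-}ICS}_{j} + \gamma_{02}\,\mathrm{S\text{-}OHI}_{j} + \gamma_{03}\,\mathrm{S\text{-}INT\text{-}C}_{j} + u_{0j}; \quad \beta_{1j} = \gamma_{10},\ \beta_{2j} = \gamma_{20},\ \beta_{3j} = \gamma_{30}

where \tau_{00} = \mathrm{Var}(u_{0j}) and \sigma^{2} = \mathrm{Var}(r_{ij}). The leadership models are analogous, substituting the ILS and MLQ terms (and INT-L/S-INT-L) for the climate terms.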
To estimate generalizable level-specific effect sizes of
general and strategic OC factors, unstandardized fixed effect
coefficients were computed (Tables6, 7, and 8; Wang etal.,
2019). We also calculated partial Cohen’s d metric based on
the level-specific approximate t-ratios for each fixed effect,
considering multiple independent variables and multipli-
cative terms in the planned MLMs (Hoffman & Walters,
2022; Brysbaert & Stevens, 2018). The level-1 coefficients
were estimated by the empirical Bayes method, while level-2
coefficients were estimated by the generalized least squares
method (Raudenbush etal., 2009). Interaction effects were
interpreted by simple slope plots (Preacher etal., 2006). In
addition to approximate t-ratio tests for interaction coeffi-
cients, we ran likelihood ratio tests to compare the differ-
ence in deviance statistics (− 2 log-likelihood) between main
effect only MLMs and interaction effect MLMs. The signifi-
cance test used the chi-square value with degrees of freedom
equal to the number of new terms in the interaction MLMs
(Raudenbush, 2004). The test indicated the global contribu-
tion of interaction terms to the overall explained variance
in attitudes (Snijders & Bosker, 2012). This cross-sectional
study followed the reporting guideline of the STROBE
checklist (Online Appendix3).
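For reference, the effect-size conversion and the model-comparison test described above can be expressed as follows (our notation, with q denoting the number of interaction terms added):

d_{\text{partial}} = \frac{2\,t}{\sqrt{df}}, \qquad \chi^{2}(q) = \bigl(-2\mathrm{LL}_{\text{main}}\bigr) - \bigl(-2\mathrm{LL}_{\text{interaction}}\bigr)

For example, for school-level strategic leadership predicting Requirement (Table 6), d = 2(2.613)/\sqrt{49} \approx 0.75, which reproduces the reported partial Cohen's d of 0.747.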
Results
RQ 1: The Clustering Effect of Implementer Attitudes
Toward EBPs
The ICCs for Requirement, Openness, Appeal, and Diver-
gence were 0.06, 0.09, 0.11, and 0.13, respectively. This
indicates that individuals from the same organization shared
similar attitudes in all four dimensions. The level-2 variance components indicated significant between-organization variation in all four attitudinal dimensions, suggesting that educators' shared attitudes vary across schools (Table 5).
The findings supported the appropriateness of multilevel
analysis and the need to discern the between- and within-
organization effects of OC factors.
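As a worked example using the Requirement variance components in Table 5:

\mathrm{ICC}_{\text{Requirement}} = \frac{\tau_{00}}{\tau_{00} + \sigma^{2}} = \frac{0.05}{0.05 + 0.70} \approx 0.067

That is, roughly 6-7% of the variance in Requirement lay between schools; the reported value of 0.06 presumably reflects the unrounded variance components.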
Table 5 Intra-Class Correlations (ICCs) and variance components of the random intercept from the unconditional models

DV            Unconditional model #   Random effects             Variance component   df   Chi-square   p value   ICC
Requirement   1.1.0                   Level 1 intercept (μ0j)    0.05**               51   80.41        0.006     0.06
                                      Level 1 residual (rij)     0.7
Openness      1.2.0                   Level 1 intercept (μ0j)    0.04***              51   93.99        < 0.001   0.09
                                      Level 1 residual (rij)     0.42
Appeal        1.3.0                   Level 1 intercept (μ0j)    0.05***              51   103.99       < 0.001   0.11
                                      Level 1 residual (rij)     0.45
Divergence    1.4.0                   Level 1 intercept (μ0j)    0.24***              51   123.16       < 0.001   0.13
                                      Level 1 residual (rij)     0.04

**p < 0.01; ***p < 0.001; Level 1: N = 441; Level 2: N = 52
Table 6 Fixed effect estimates of multilevel models for RQ 2
Outcomes   Model #   Level of measurement   Fixed effect   b   S.E.   T-ratio   Approx. df   p value   Partial Cohen's d
EBP requirement 1.1.1 leadership Level-2 Intercept 2.96*** 0.045 65.357 49 < 0.001 18.673
Strategic 0.32* 0.122 2.613 49 0.012 0.747
General − 0.086 0.152 − 0.568 49 0.573 − 0.162
Level-1 Strategic 0.19* 0.08 2.372 436 0.018 0.227
General 0.279** 0.078 3.576 436 0.001 0.343
1.1.2 climate Level-2 Intercept 2.959*** 0.045 65.151 49 < 0.001 18.615
Strategic 0.27* 0.1 2.626 49 0.012 0.750
General 0.193 0.263 0.734 49 0.467 0.210
Level-1 Strategic 0.247*** 0.06 4.116 436 < 0.001 0.394
General 0.546*** 0.142 3.852 436 < 0.001 0.369
Openness 1.2.1 leadership Level-2 Intercept 3.08*** 0.106 28.941 48 < 0.001 8.355
Enrollment 0.491** 0.159 3.080 48 0.004 0.889
Strategic 0.303* 0.03 2.327 48 0.024 0.672
General − 0.037 0.15 − 0.247 48 0.81 − 0.071
Level-1 Grade level − 0.191** 0.064 − 3.001 433 0.003 − 0.288
Experience − 0.011** 0.003 − 3.300 433 0.001 − 0.317
Strategic 0.31*** 0.073 4.247 433 < 0.001 0.408
General 0.128* 0.059 2.174 433 0.03 0.209
1.2.2 climate Level-2 Intercept 3.087*** 0.097 31.832 48 < 0.001 9.189
Enrollment 0.454** 0.132 3.429 48 0.002 0.990
Strategic 0.353*** 0.072 4.877 48 < 0.001 1.408
General 0.17 0.184 0.924 48 0.36 0.267
Level-1 Grade level − 0.163** 0.062 − 2.625 433 0.009 − 0.252
Experience − 0.012*** 0.003 − 3.952 433 < 0.001 − 0.380
Strategic 0.306*** 0.041 7.428 433 < 0.001 0.714
General 0.428*** 0.108 3.954 433 < 0.001 0.380
EBP appeal 1.3.1 leadership Level-2 Intercept 2.9*** 0.179 16.187 48 < 0.001 4.673
Enrollment 0.575* 0.237 2.423 48 0.019 0.699
Strategic 0.216* 0.081 2.673 48 0.011 0.772
General 0.096 0.104 0.923 48 0.361 0.266
Level-1 Gender 0.251** 0.092 2.72 433 0.007 0.261
Experience − 0.01* 0.004 − 2.398 433 0.017 − 0.230
Strategic 0.11 0.079 1.399 433 0.163 0.134
General 0.164* 0.083 1.987 433 0.047 0.191
1.3.2 climate Level-2 Intercept 2.969*** 0.171 17.36 48 < 0.001 5.011
Enrollment 0.517* 0.23 2.246 48 0.029 0.648
Strategic 0.318** 0.097 3.281 48 0.002 0.947
General 0.195 0.222 0.873 48 0.387 0.252
Level-1 Gender 0.225* 0.092 2.439 433 0.015 0.234
Experience − 0.012** 0.004 − 3.079 433 0.003 − 0.296
Strategic 0.048 0.049 0.97 433 0.333 0.093
General 0.598*** 0.12 4.966 433 < 0.001 0.477
RQ 2: Associations Between OC Factors
and Attitudinal Dimensions
Leadership Models
At the organizational level, strategic leadership demonstrated consistent, significant associations with all four attitudinal dimensions, while general leadership showed none (Tables 6, 8). The fixed effect sizes of organizational-level
strategic leadership were generally larger for Requirement
and Openness compared to those for Appeal and Divergence. At the individual level, both strategic and general leadership showed significant associations with the four attitudinal dimensions, with two exceptions: strategic leadership was not associated with Appeal and general leadership was not
general leadership showed larger effects than strategic lead-
ership. Conversely, for Openness and Divergence, strategic
leadership showed larger effects. In sum, strategic leader-
ship showed the most consistent associations with all four
attitudinal dimensions across both levels, while general leadership showed consistent associations with attitudes at the individual level.
Climate Models
We found similar patterns of significance in the fixed effects
of general or strategic climate on educator attitudes toward
EBPs at both levels. At the organizational level, only stra-
tegic climate was significantly associated with Require-
ment, Openness, and Appeal. At the individual level, both
general and strategic climates were significantly associated
with Requirement and Openness (Table8). However, for
Appeal and Divergence, only general climate was signifi-
cant. Moreover, the individual-level fixed effects of general climate were consistently larger than those of strategic climate for all four attitudinal dimensions (Table 6).
RQ 3: Interactions Between General and Strategic
OC Factors
Overall, the change in model deviance indicated that the entry of the organizational-level interaction terms significantly improved model fit in explaining the variation in Appeal and Divergence beyond the standalone general and strategic climate (Δ −2 log-likelihood (2) = 12.50). Specifically, at the organizational level, general climate negatively moderated the effect of strategic climate on Appeal (b = −0.59) and Divergence (b = −0.41; Table 7). To facilitate interpretation, we plotted
the simple slope graphs to depict the interaction effects with
estimated marginal means (Figs.2, 3). At the organizational
level, a high level of strategic climate was consistently associated with high levels of Appeal and Divergence, but the magnitude of the positive association between strategic climate and Appeal or Divergence was conditional on the level of general climate. In schools with moderate or low levels
For precision, three decimals are retained. The dependent/outcome variable is the specific dimension of educator attitudes toward universal preventive EBPs measured by the EBPAS
b regression coefficient (unstandardized), S.E. standard error, df degrees of freedom, p value two-tailed significance level
*p < 0.05; **p < 0.01; ***p < 0.001; Level 1: N = 441; Level 2: N = 52
Table 6 (continued)
Outcomes   Model #   Level of measurement   Fixed effect   b   S.E.   T-ratio   Approx. df   p value   Partial Cohen's d
Divergence 1.4.1 leadership Level-2 Intercept 3.12*** 0.03 99.07 48 < 0.001 28.599
Enrollment 0.50* 0.02 2.38 48 0.021 0.687
Strategic 0.23* 0.11 2.04 48 0.046 0.589
General − 0.09 0.12 − 0.74 48 0.461 − 0.214
Level-1 Strategic 0.13* 0.05 2.59 435 0.01 0.248
General 0.06 0.05 1.14 435 0.257 0.109
1.4.2 climate Level-2 Intercept 3.12*** 0.03 101.93 48 < 0.001 29.425
Enrollment 0.50* 0.02 2.56 48 0.014 0.739
Strategic 0.11 0.09 1.26 48 0.216 0.364
General 0.23 0.20 1.18 48 0.244 0.341
Level-1 Strategic 0.05 0.04 1.26 435 0.208 0.121
General 0.40*** 0.09 4.39 435 < 0.001 0.421
Table 7 Fixed interaction effect estimates of multilevel models for RQ 3
Outcome   Model #   Level of measurement   Fixed effect   b   S.E.   T-ratio   Approx. df   p value   Partial Cohen's d
EBP requirement 2.1.1
leadership interaction
Level-2 Intercept 2.968*** 0.059 49.971 48 < 0.001 14.425
Strategic 0.291* 0.138 2.103 48 0.040 0.607
General − 0.100 0.147 − 0.682 48 0.498 − 0.197
Interaction − 0.064 0.094 − 0.675 48 0.502 − 0.195
Level-1 Strategic 0.196* 0.080 2.459 434 0.015 0.236
General 0.284** 0.079 3.607 434 0.001 0.346
Interaction 0.043 0.066 0.642 434 0.521 0.062
2.1.2
climate interaction
Level-2 Intercept 2.985*** 0.053 56.228 48 < 0.001 16.232
Strategic 0.243* 0.099 2.469 48 0.017 0.713
General 0.114 0.274 0.414 48 0.680 0.120
Interaction − 0.304 0.293 − 1.037 48 0.305 − 0.299
Level-1 Strategic 0.246*** 0.060 4.117 434 < 0.001 0.395
General 0.546*** 0.141 3.869 434 < 0.001 0.371
Interaction − 0.143 0.181 − 0.789 434 0.431 − 0.076
EBP openness 2.2.1
leadership interaction
Level-2 Intercept 3.066*** 0.042 73.865 47 < 0.001 21.549
Enrollment 0.539** 0.174 3.093 47 0.004 0.902
Strategic 0.336* 0.131 2.552 47 0.014 0.744
General − 0.025 0.143 − 0.172 47 0.864 − 0.050
Interaction 0.033 0.064 0.527 47 0.600 0.154
Level-1 Grade level − 0.209** 0.065 − 3.233 431 0.002 − 0.311
Experience − 0.009* 0.004 − 2.221 431 0.027 − 0.214
Strategic 0.318*** 0.072 4.414 431 < 0.001 0.425
General 0.143* 0.059 2.399 431 0.017 0.231
Interaction 0.088 0.047 1.856 431 0.064 0.179
2.2.2
climate interaction
Level-2 Intercept 3.106*** 0.031 99.644 47 < 0.001 29.069
Enrollment 0.499** 0.147 3.406 47 0.002 0.994
Strategic 0.363*** 0.082 4.458 47 0.000 1.301
General 0.127 0.194 0.652 47 0.517 0.190
Interaction − 0.050 0.172 − 0.292 47 0.772 − 0.085
Level-1 Grade level − 0.188** 0.064 − 2.943 431 0.004 − 0.284
Experience − 0.010** 0.004 − 2.791 431 0.006 − 0.269
Strategic 0.307*** 0.041 7.432 431 < 0.001 0.716
General 0.420*** 0.110 3.824 431 < 0.001 0.368
Interaction − 0.049 0.139 − 0.350 431 0.727 − 0.034
The dependent/outcome variable is the specific dimension of educator attitudes toward universal preventive EBPs measured by the EBPAS
b regression coefficient (unstandardized), S.E. standard error, df degrees of freedom, p value two-tailed significance level
*p < 0.05; **p < 0.01; ***p < 0.001; Level 1: N = 441; Level 2: N = 52
Table 7 (continued)
Outcome   Model #   Level of measurement   Fixed effect   b   S.E.   T-ratio   Approx. df   p value   Partial Cohen's d
EBP appeal 2.3.1
leadership interaction
Level-2 Intercept 3.106*** 0.101 30.883 47 < 0.001 9.009
Enrollment 0.583* 0.221 2.633 47 0.012 0.768
Strategic 0.154 0.095 1.617 47 0.112 0.472
General 0.074 0.096 0.775 47 0.442 0.226
Interaction − 0.115 0.083 − 1.381 47 0.174 − 0.403
Level-1 Gender 0.256* 0.099 2.591 431 0.01 0.250
Experience − 0.007 0.005 − 1.429 431 0.154 − 0.138
Strategic 0.111 0.080 1.387 431 0.166 0.134
General 0.175* 0.083 2.115 431 0.035 0.204
Interaction 0.044 0.071 0.617 431 0.537 0.059
2.3.2
climate interaction
Level-2 Intercept 3.135*** 0.093 33.686 47 < 0.001 9.827
Enrollment 0.585* 0.22 2.663 47 0.011 0.777
Strategic 0.286** 0.091 3.153 47 0.003 0.920
General 0.002 0.223 0.011 47 0.991 0.003
Interaction − 0.586* 0.240 − 2.447 47 0.018 − 0.714
Level-1 Gender 0.241* 0.097 2.482 431 0.014 0.239
Experience − 0.010* 0.005 − 2.083 431 0.038 − 0.201
Strategic 0.049 0.049 0.993 431 0.322 0.096
General 0.594*** 0.122 4.884 431 < 0.001 0.471
Interaction 0.008 0.140 0.054 431 0.957 0.005
Divergence 2.4.1
leadership interaction
Level-2 Intercept 3.13*** 0.04 81.53 47 < 0.001 23.785
Enrollment 0.41* 0.02 2.33 47 0.024 0.680
Strategic 0.19 0.13 1.50 47 0.140 0.438
General − 0.12 0.14 − 0.82 47 0.416 − 0.239
Interaction − 0.08 0.05 − 1.62 47 0.112 − 0.473
Level-1 Strategic 0.14* 0.05 2.58 433 0.01 0.248
General 0.06 0.06 1.13 433 0.259 0.109
Interaction 0.04 0.04 0.92 433 0.361 0.088
2.4.2
climate interaction
Level-2 Intercept 3.15*** 0.03 97.16 47 < 0.001 28.344
Enrollment 0.5* 0.2 2.61 47 0.013 0.761
Strategic 0.08 0.08 0.99 47 0.328 0.289
General 0.12 0.20 0.62 47 0.537 0.181
Interaction − 0.41* 0.16 − 2.52 47 0.016 − 0.735
Level-1 Strategic 0.05 0.04 1.26 433 0.208 0.121
General 0.40*** 0.09 4.43 433 < 0.001 0.426
Interaction − 0.08 0.12 − 0.63 433 0.528 − 0.061
of general climate, higher levels of strategic climate showed a strong positive association with Appeal and Divergence (i.e., the steeper slopes in Figs. 2 and 3). In schools with high levels of general climate, however, strategic climate showed a much weaker positive association with Appeal and Divergence (i.e., the nearly flat slopes in Figs. 2 and 3).
Discussion
This study examined general and strategic leadership and climate as both organizational factors and individual psychological phenomena, delineating their cross-level and interactive associations with specific dimensions of implementer attitudes toward EBPs.
Varied Associations Between Leadership, Climate,
and Specific Attitudinal Dimensions
The significant clustering effect of the four attitudinal dimen-
sions implied that implementers from the same organization
have some level of commonality in their attitudes toward
EBPs. In this study, the level-specific centering of OC fac-
tors enabled separate interpretations of the fixed effect coef-
ficients of OC factors at each level (Hoffman & Walters,
2022). In the organizational-level equations, differences
Table 8 Inferential significance
of fixed effects of OC factors in
the multilevel models
“Sig” statistically significant, i.e., p < 0.05, 0.01, or 0.001. “–” nonsignificant, Interaction interaction effect
between the general and strategic types of OC factors
OC factors Attitudinal dimensions Organizational level Individual level
Strategic General Interaction Strategic General Interaction
Leadership Requirement Sig – – Sig Sig –
Openness Sig – – Sig Sig –
Appeal Sig – – – Sig –
Divergence Sig – – Sig – –
Climate Requirement Sig – – Sig Sig –
Openness Sig – – Sig Sig –
Appeal Sig – Sig – Sig –
Divergence – – Sig – Sig –
Fig. 2 N = 52. The organizational-level interaction effect between general and strategic climate on the school-level shared perception of EBP Appeal. S-ICS = grand-mean-centered school-level aggregated strategic climate; S-OHI = grand-mean-centered school-level aggregated general climate. Solid line = 1 standard deviation above the grand mean of S-OHI, short-dashed line = grand mean of S-OHI, dotted line = 1 standard deviation below the grand mean of S-OHI
Administration and Policy in Mental Health and Mental Health Services Research
1 3
in the coefficients of OC factors indicated the deviation of
aggregated OC factors in a school from that of all schools
(i.e., grand mean). In the individual-level equations, differ-
ences in the coefficients of OC factors indicated the devia-
tion of an individual's perceived OC factors from that of
their colleagues in the same school (i.e., group mean). Over-
all, the statistical significance and magnitudes of associa-
tions among leadership, climate, and implementer attitudes
toward EBPs varied significantly across levels of measure-
ment (organizational or individual), specificity to imple-
mentation (strategic or general), and attitudinal dimensions
(Requirement, Openness, Appeal, or Divergence; Table8).
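As a schematic of the random-intercept models and level-specific centering described here, the specification likely took roughly the following form (the notation is ours, with i indexing implementers and j indexing schools; this is a sketch rather than the authors' exact equations):

```latex
% Level 1 (implementer i in school j), predictors centered at the school (group) mean:
Y_{ij} = \beta_{0j}
       + \beta_{1}\,(S_{ij} - \bar{S}_{\cdot j})
       + \beta_{2}\,(G_{ij} - \bar{G}_{\cdot j})
       + \beta_{3}\,(S_{ij} - \bar{S}_{\cdot j})(G_{ij} - \bar{G}_{\cdot j})
       + r_{ij}
% Level 2 (school j), aggregates centered at the grand mean; random intercept only:
\beta_{0j} = \gamma_{00}
           + \gamma_{01}\,(\bar{S}_{\cdot j} - \bar{S}_{\cdot\cdot})
           + \gamma_{02}\,(\bar{G}_{\cdot j} - \bar{G}_{\cdot\cdot})
           + \gamma_{03}\,(\bar{S}_{\cdot j} - \bar{S}_{\cdot\cdot})(\bar{G}_{\cdot j} - \bar{G}_{\cdot\cdot})
           + u_{0j}
% Y = an attitudinal dimension; S = strategic OC factor; G = general OC factor.
```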
At the organizational level, only strategic leadership and
climate showed consistent significant and sizable associa-
tions with all four attitudinal dimensions, while neither gen-
eral leadership nor climate exhibited significant associations.
This echoed the findings from the correlation matrix where
most organizational-level strategic OC factors demonstrated
stronger associations with school aggregates of the four atti-
tudinal dimensions than did general OC factors. The find-
ings collectively suggest that, at the organizational level, strategic OC factors tend to exert greater influence than their general counterparts on implementer attitudes toward EBPs. This is congruent with extant theory and research on OC factors (e.g., Lyon et al., 2018a, 2018b; Williams et al., 2018), which position general OC factors as a necessary but insufficient condition for successful EBP implementation (Aarons et al., 2014a, 2014b; Damschroder et al., 2009). Strategic OC factors capture the implementation-specific characteristics of the context most proximal to implementers. Hence, at the organizational level, strategic OC factors exerted more influence on implementer attitudes toward EBPs than their general counterparts, which represent similar but implementation-nonspecific organizational phenomena (Williams et al., 2018).
At the individual level, both general and strategic leadership and climate showed consistent associations with Requirement and Openness, whereas only general leadership and climate showed consistent associations with Divergence and Appeal. Moreover, the fixed-effect sizes of individual-level general leadership and climate were larger than those of strategic leadership and climate. This is consistent with prior research highlighting the varied associations between different attitudinal dimensions and organizational factors (e.g., climate and culture; Aarons et al., 2012a, 2012b). This variability may be attributed to the specific conceptualizations of frequently conflated yet distinct attitudinal dimensions. Social psychology conceptualizes attitudes as comprising three components: affective, behavioral/conative, and cognitive (Calder & Lutz, 1972; de Kok et al., 2020). As defined by the EBPAS and its specific items, Appeal and Divergence primarily involve the affective component of attitudes toward EBPs (i.e., how one feels about EBPs), whereas Requirement and Openness are conceptualized around the cognitive and behavioral components (i.e., acknowledgment of the requirement to use EBPs and willingness to try new EBPs; Aarons & Sawitzky, 2006). Moreover, general climate and leadership capture staff's shared global appraisals of how the organizational context and leadership affect their well-being,
organizational health, and functioning at work (Ehrhart et al., 2014a, 2014b; James et al., 2008). Hence, Divergence and Appeal were more likely to be influenced by general OC factors that are closely tied to feelings and affect, as compared with strategic factors that are action-oriented and specific to implementing EBPs. Moreover, the affect-focused attitudinal dimensions may be more dynamic and susceptible to timing and historical events during the assessment process (e.g., the Divergence subscale has the lowest reliability among the four subscales; Aarons & Sawitzky, 2006). Researchers should therefore use caution and consider longitudinal approaches to measure affect-focused attitudinal dimensions reliably (Fishman et al., 2021).

Fig. 3  N = 52. The organizational-level interaction effect between general and strategic climate on the school-level shared perception of Divergence. Divergence was reverse coded, so higher values indicate more positive attitudes. S-ICS = grand-centered school-level aggregated strategic climate; S-OHI = grand-centered school-level aggregated general climate. Solid line = 1 standard deviation above the grand mean of S-OHI; short-dashed line = grand mean of S-OHI; dotted line = 1 standard deviation below the grand mean of S-OHI
This study adds to the emerging literature on how OC factors show varied associations with different dimensions of implementer attitudes toward EBPs (Powell et al., 2017) and underscores the need to distinguish the definitions of specific attitudinal dimensions to improve measurement and the interpretation of results in implementation research on attitudes. The level-specific findings also highlight the often-overlooked multilevel nature of the conceptualization and measurement of attitudes toward EBPs (Fishman et al., 2021). OC factors are conceptualized at the organizational level, but they must first be perceived and processed by the individuals working in the organization; individuals' subjective appraisals of OC factors then shape implementation-related individual factors (e.g., attitudes). For instance, in a school with high levels of organizational-level strategic leadership, educators' attitudes toward EBPs may still vary because of their differing appraisals of that leadership and other factors (e.g., burnout, poor relationships with leaders; Larson et al., 2018).
Interactions Between General and Strategic
Leadership or Climate
Emerging research suggests that general and strategic OC factors can interact to influence implementation determinants and outcomes (e.g., Aarons et al., 2014a, 2014b; Ehrhart et al., 2014a, 2014b; Williams & Beidas, 2019). For example, leaders with both strategic and general leadership qualities may exert greater influence on individual-level implementation determinants (e.g., attitudes toward EBPs) than those with general or strategic leadership qualities alone (Farahnak et al., 2020; Lyon et al., 2018a, 2018b). Our findings indicated that, at the organizational level, the strength of the positive association between strategic climate and school staff's shared perceptions of EBP Appeal and Divergence depended on the level of general climate. In schools with low and moderate levels of general climate, strategic climate showed a strong positive association with Appeal and Divergence, whereas in schools with high levels of general climate, higher levels of strategic climate showed much weaker associations with Appeal and Divergence. These results suggest that the effect of organizational-level strategic climate on teachers' attitudes toward EBPs is conditional on the existing level of general climate in the school. In schools with suboptimal levels of general climate, strategic climate appeared to compensate for the more negative general environment in promoting teachers' attitudes. When a school already had a strong general climate in place, however, the additive effect of strategic climate was more limited, potentially owing to a "ceiling effect" of the existing general climate (Khosravi, 2020; Neal et al., 2005). Hence, researchers and leaders selecting implementation strategies should base their decisions on the organization's current levels of both general and strategic climate.
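For readers who want to probe such an interaction in their own data, the simple-slopes logic (in the spirit of Preacher et al., 2006) can be computed as sketched below; the function and the numbers passed to it are illustrative placeholders, not estimates from this study:

```python
import math

def simple_slope(b_focal, b_interaction, moderator, var_focal, var_interaction, cov_fi):
    """Conditional slope of the focal predictor (e.g., strategic climate) at a chosen
    value of the moderator (e.g., grand-centered general climate), with its SE."""
    slope = b_focal + b_interaction * moderator
    se = math.sqrt(var_focal
                   + moderator ** 2 * var_interaction
                   + 2 * moderator * cov_fi)
    return slope, se, slope / se  # slope, standard error, t-like ratio

# Hypothetical inputs: probe below, at, and above the moderator's grand mean
for m in (-0.4, 0.0, 0.4):
    print(m, simple_slope(b_focal=0.30, b_interaction=-0.50, moderator=m,
                          var_focal=0.010, var_interaction=0.030, cov_fi=-0.002))
```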
Moreover, our findings differed from those of Williams and colleagues' (2018) seminal work, in which the significant positive effect of strategic climate on clinicians' use of EBPs existed only in organizations with high levels of general climate. The discrepancy may be due to differences in implementation outcomes (attitudes vs. EBP use), model configurations (we included OC factors at both the organizational and individual levels), the nature of the datasets (cross-sectional vs. longitudinal), and study populations (educators vs. clinicians). Future research should extend this study by exploring the interaction between general and strategic OC factors with other methods (e.g., cross-lagged designs, response surface analysis; Shanock et al., 2010) and in different settings and populations. Researchers should also try to identify the optimal threshold at which strategic and general OC factors interact to create a fostering context that improves implementation determinants and outcomes (Chaudoir et al., 2013; Durlak & DuPre, 2008; Williams et al., 2018).
One key theoretical question raised by this study's findings is why the interaction effect of general and strategic climate was significant only for particular attitudinal dimensions (Appeal and Divergence). One possible explanation is that Appeal and Divergence represent psychological phenomena (i.e., perceiving EBPs as appealing or useless) that may be more malleable than Openness, which reflects a personality trait (Silvia & Christensen, 2020), or Requirement, which is defined by factors individual implementers cannot change (e.g., job responsibilities, policies). There may also be attitudinal dimensions not captured by the EBPAS that are influenced by the interaction of general and strategic OC factors. Future research should extend our findings by exploring the interactive and joint influence of general and strategic OC factors on other attitudinal dimensions, such as fit and burden in the expanded EBPAS-50 (Aarons et al., 2012a). It is also important to note that the nonsignificant associations with certain attitudinal dimensions may be due to multicollinearity among general and strategic OC
factors, which can lead to misestimated coefficients and standard errors (Alin, 2010; Vatcheva et al., 2016).
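One practical screen for this issue is to compute variance inflation factors for the general and strategic measures before model fitting; the sketch below uses synthetic data and hypothetical column names, not data from this study:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(0)
n = 200  # synthetic implementer-level records for illustration
df = pd.DataFrame({"strategic_climate": rng.normal(size=n),
                   "general_climate": rng.normal(size=n)})
df["strategic_leadership"] = 0.6 * df["strategic_climate"] + rng.normal(scale=0.8, size=n)
df["general_leadership"] = 0.6 * df["general_climate"] + rng.normal(scale=0.8, size=n)

X = sm.add_constant(df)  # an intercept column is needed for meaningful VIFs
vif = pd.Series([variance_inflation_factor(X.values, i) for i in range(X.shape[1])],
                index=X.columns)
print(vif)  # common rules of thumb flag values above roughly 5-10
```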
Implications
This study demonstrated the advantage of examining OC
factors (general or strategic) concurrently within and across
organizational and individual levels. This approach can
reveal the differential effects of the same OC factor depending on its level of measurement. In our study, the patterns of findings for leadership and climate varied considerably depending on whether they were measured at the organizational or the individual level, yielding different information for practitioners and leaders to act on.
and leaders to act on. For instance, when analyzed as an
organizational-level variable, general climate aggregates
the perceptions of all implementers into a global construct
of shared experience working in an organization. When
analyzed at the individual level, it reflects an implementer's
personal and idiosyncratic appraisal of that global construct.
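Concretely, the two versions of the same climate variable can be constructed from implementer-level survey data along the following lines; the column names and values are hypothetical, shown only to illustrate the between/within distinction:

```python
import pandas as pd

# Hypothetical implementer-level ratings of general climate
df = pd.DataFrame({"school": ["A", "A", "A", "B", "B", "B"],
                   "general_climate": [3.2, 3.8, 3.5, 2.4, 2.9, 2.6]})

school_mean = df.groupby("school")["general_climate"].transform("mean")

# Organizational-level version: the school aggregate, centered at the overall mean
df["general_between"] = school_mean - df["general_climate"].mean()

# Individual-level version: each implementer's deviation from their school's mean
df["general_within"] = df["general_climate"] - school_mean

print(df)
```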
Moreover, our findings suggested the relatively greater utility of strategic OC factors over general ones at the organizational level. At the same time, site leaders should avoid a narrow focus on strategic OC factors alone, because the cross-level comparison of coefficients suggested that general OC factors generally had stronger associations with implementer attitudes at the individual level. Instead, leaders should intervene on both types of OC factors based on their current levels. For instance, to effectively promote successful EBP implementation for youth mental health, school leaders should strategically allocate school-level resources to enhance school-wide strategic OC factors (e.g., strategic climate and leadership). Simultaneously, they should invest in individual-level implementation strategies that enhance individual staff members' perceived general factors (e.g., collegial leadership and social activities that promote staff's sense of belonging to the school and connectedness with colleagues; DeJoy et al., 2010).
Related to the level-specific suggestion above, the sig-
nificant interaction between general and strategic climate
at the organizational level highlights the need for deliberate
care when selecting implementation strategies to improve
the implementation context and individual determinants. For
instance, promoting either organizational-level general or strategic climate alone appeared to be a necessary but insufficient condition for influencing individual attitudes toward EBPs. Our findings suggested that the additive benefit of strategic climate diminished in schools with high levels of general climate. In such cases, leaders can draw on implementation strategies that enhance multiple organizational contextual determinants of implementation, such as Leadership and Organizational Change for Implementation (LOCI; Aarons et al., 2015), to improve organizational climate, leadership, and culture.
Last, our mixed findings echo the concerns of Fishman et al. (2021) about the theoretical and practical validity of the dominant measures of implementer attitudes toward EBPs in implementation science (e.g., the EBPAS). Future research should revise the existing EBPAS based on findings from emerging implementation research on attitudes. For instance, the EBPAS could be extended with new subscales (attitudinal dimensions) such as outcome expectancies and risk perceptions, which are common dimensions of attitudes assessed by health researchers (Fromme, 1997). Moreover, the field of implementation science could benefit from the development of new measures that assess key subconstructs of attitudes grounded in social-cognitive psychology, such as separate assessments of the affective, conative, and cognitive components of implementer attitudes toward EBPs (de Kok et al., 2020; Krosnick et al., 2018).
Limitations and Future Directions
Some of our findings should be interpreted with caution due
to limitations that call for future research. First, although our
sample demonstrated modest power for MLM, the sample
size at the organizational level was relatively small for com-
plex model specifications. Hence, we had to enter climate
and leadership separately into the models and only retained
significant demographic covariates. For future research with
diverse and larger samples (especially at level 2), general
and strategic climate and leadership (and their subscales)
can be simultaneously entered into the MLMs to compare
their differential and additive effects on implementer attitudes toward EBPs. Moreover, theoretically plausible demographic
covariates can be entered into the MLMs to explore whether
the effects of climate and leadership vary across subpopula-
tions of educators and school characteristics. Second, this
study used random-intercept-only MLMs to partition and
explain separately the individual- versus school-level vari-
ance in implementer attitudes (Hoffman & Walters, 2022).
In the field, organizational-level factors often work through
or interact with individual-level factors to influence imple-
menters' cognition and behaviors. Thus, research is needed
to extend the findings from this study by using other types
of MLMs (e.g., multilevel mediation, random-slope MLMs)
to explore the cross-level mediational or interactive relation-
ships between organizational- and individual-level factors
(Preacher et al., 2010; Heisig & Schaeffer, 2019).
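As an illustration of the random-slope extension suggested here, a minimal specification in Python's statsmodels is sketched below on synthetic data; the variable names are hypothetical, and this is not the modeling software or code used in the study:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic data: 40 schools with 10 implementers each
rng = np.random.default_rng(1)
n_schools, n_per = 40, 10
school = np.repeat(np.arange(n_schools), n_per)
between = rng.normal(size=n_schools)[school]        # school-aggregate strategic climate
within = rng.normal(size=n_schools * n_per)         # implementer deviation from school mean
u0 = rng.normal(scale=0.3, size=n_schools)[school]  # school random intercept
openness = 0.4 * between + 0.3 * within + u0 + rng.normal(size=n_schools * n_per)
df = pd.DataFrame({"school": school, "strategic_between": between,
                   "strategic_within": within, "openness": openness})

# Random-intercept-only model, analogous in structure to the models in the present study
ri = smf.mixedlm("openness ~ strategic_within + strategic_between",
                 data=df, groups=df["school"]).fit()

# Random-slope extension: the within-school effect of strategic climate varies by school
rs = smf.mixedlm("openness ~ strategic_within + strategic_between",
                 data=df, groups=df["school"],
                 re_formula="~strategic_within").fit()
print(rs.summary())
```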
Third, this study used two-level univariate MLMs, which did not account for correlations among the outcomes (attitudinal dimensions). A three-level multivariate MLM would be more suitable for multivariate research questions, such as how OC factors predict differential profiles or combinations of attitudinal dimensions (Park et al., 2015). Fourth, OC factors themselves vary at both the individual and organizational levels. Moreover, the sizes of fixed-effect coefficients may
be restricted by the magnitude of variability in OC factors at specific levels (Stauffer & Mendoza, 2001). Hence, researchers should replicate and verify the findings from this study by recruiting a larger and more diverse sample of schools (i.e., with more variability at the school level). Researchers can also use multilevel structural equation models (ML-SEMs) to simultaneously partition and explain the between- and within-school variance of both the outcomes (attitudes) and the predictors (OC factors), illustrating the effects of level-specific variability for all variables in the model (Heck & Thomas, 2020). Last, this
study only delineated the associations between OC factors
and implementer attitudes toward EBPs as an individual-
level implementation factor. Future research should extend
this study to explore whether improved attitudes (resulting
from enhanced OC factors) will subsequently lead to better
implementation outcomes (e.g., adoption, fidelity).
Conclusion
This study contributes to the literature on organizational
contextual factors by elucidating the cross-level interplay
between strategic and general OC factors and implementer
attitudes toward EBPs in the education sector. Our findings indicate that (a) the associations between implementer attitudes toward EBPs and climate and leadership vary depending on the level of measurement, specificity to implementation, and attitudinal dimension, and (b) general organizational climate moderated the positive effect of organizational-level strategic climate on implementer perceptions of EBP Appeal and Divergence. In youth mental healthcare settings (e.g., schools, community clinics), leaders should allocate resources to deliberately enhance strategic leadership and climate, conditional on the existing levels of general leadership and climate. Doing so could promote favorable staff attitudes toward EBPs, which emerging evidence links to positive implementation outcomes. Future research should explore the theoretical causal chain in which OC factors work through implementer attitudes to influence the implementation outcomes of universal prevention programs for youth mental health in the education sector and beyond.
Supplementary Information The online version contains supplementary material available at https://doi.org/10.1007/s10488-022-01248-5.
Author Contributions YZ developed and finalized the manuscript, analyzed and interpreted the data, and created all materials for submission. CLC was a major contributor in writing the manuscript. LF, CC,
ME, EB, JL, and AL all made contributions to reviewing and editing
the drafts of the manuscript. All authors read and approved the final
manuscript.
Funding This study was part of a project funded by Grant
R305A160114 (Lyon and Cook) awarded by the Institute of Education
Sciences. The content is solely the responsibility of the authors and
does not necessarily represent the official views of the Institute of Edu-
cation Sciences.
Data Availability The de-identified datasets generated and analyzed
during the current study are available in the Open Science Framework
[OSF.IO/9X4TE].
Declarations
Competing Interests The authors declare that they have no competing
interests.
Ethical Approval and Consent to Participate This study was approved
by the Institutional Review Board (IRB) at the University of Washing-
ton. Informed consent was obtained from all individual participants
included in the study.
Consent for Publication Not applicable.
Standards of Reporting This is a cross-sectional observational study.
Therefore, we adhered to the STROBE checklist in terms of analysis,
reporting, and preparation of the manuscript. The completed STROBE
checklist is provided as Supplementary Material 3.
References
Aarons, G. A. (2004). Mental health provider attitudes toward adop-
tion of evidence-based practice: The Evidence-Based Practice
Attitude Scale (EBPAS). Mental Health Services Research, 6(2),
61–74.
Aarons, G. A., Cafri, G., Lugo, L., & Sawitzky, A. (2012a). Expanding
the domains of attitudes towards evidence-based practice: The
evidence-based practice attitude scale-50. Administration and
Policy in Mental Health and Mental Health Services Research,
39(5), 331–340.
Aarons, G. A., Glisson, C., Green, P. D., Hoagwood, K., Kelleher, K.
J., & Landsverk, J. A. (2012b). The organizational social context
of mental health services and clinician attitudes toward evidence-
based practice: A United States study. Implementation Science,
7(1), 1–15.
Aarons, G. A., Ehrhart, M. G., & Farahnak, L. R. (2014a). The imple-
mentation leadership scale (ILS): Development of a brief meas-
ure of unit level implementation leadership. Implementation
Science, 9(1), 1–10.
Aarons, G. A., Ehrhart, M. G., Farahnak, L. R., & Sklar, M. (2014b).
Aligning leadership across systems and organizations to develop
a strategic climate for evidence-based practice implementation.
Annual Review of Public Health, 35, 255–274.
Aarons, G. A., Ehrhart, M. G., Farahnak, L. R., & Hurlburt, M. S.
(2015). Leadership and organizational change for implementation
(LOCI): A randomized mixed method pilot study of a leadership
and organization development intervention for evidence-based
practice implementation. Implementation Science, 10(1), 1–12.
Aarons, G. A., Hurlburt, M., & Horwitz, S. M. (2011). Advancing
a conceptual model of evidence-based practice implementation
in public service sectors. Administration and Policy in Mental
Health and Mental Health Services Research, 38(1), 4–23.
Aarons, G. A., McDonald, E. J., Sheehan, A. K., & Walrath-Greene, C.
M. (2007). Confirmatory factor analysis of the Evidence-Based
Practice Attitude Scale (EBPAS) in a geographically diverse
sample of community mental health providers. Administration
and Policy in Mental Health and Mental Health Services
Research, 34(5), 465–469.
Aarons, G. A., & Sawitzky, A. C. (2006). Organizational culture and
climate and mental health provider attitudes toward evidence-
based practice. Psychological Services, 3(1), 61.
Ajzen, I. (1991). The theory of planned behavior. Organizational
Behavior and Human Decision Processes, 50(2), 179–211.
Alin, A. (2010). Multicollinearity. Wiley Interdisciplinary Reviews:
Computational Statistics, 2(3), 370–374.
Allen, P., Pilar, M., Walsh-Bailey, C., Hooley, C., Mazzucca, S.,
Lewis, C. C., & Brownson, R. C. (2020). Quantitative measures
of health policy implementation determinants and outcomes: A
systematic review. Implementation Science, 15(1), 1–17.
Avolio, B. J., & Bass, B. M. (1995). Individual consideration viewed at
multiple levels of analysis: A multi-level framework for examin-
ing the diffusion of transformational leadership. The Leadership
Quarterly, 6(2), 199–218.
Bandura, A. (1999). Social cognitive theory: An agentic perspective.
Asian Journal of Social Psychology, 2(1), 21–41.
Bevans, K., Bradshaw, C., Miech, R., & Leaf, P. (2007). Staff-and
school-Level predictors of school organizational health: A mul-
tilevel analysis. Journal of School Health, 77(6), 294.
Birken, S. A., & Currie, G. (2021). Using organization theory to posi-
tion middle-level managers as agents of evidence-based practice
implementation. Implementation Science, 16(1), 1–6.
Blasius, J., & Brandt, M. (2010). Representativeness in online surveys
through stratified samples. Bulletin of Sociological Methodology/
Bulletin de Méthodologie Sociologique, 107(1), 5–21.
Bliese, P. D. (2000). Within-group agreement, non-independence, and
reliability: Implications for data aggregation and analysis. In K. J. Klein & S. W. J. Kozlowski (Eds.), Multilevel theory, research, and methods in organizations: Foundations, extensions, and new directions. Jossey-Bass.
Bowden, R. G., Lanning, B. A., Pippin, G., & Tanner, J. F. (2003).
Teachers' attitudes towards abstinence-only sex education cur-
ricula. Education.
Bradshaw, C. P., Koth, C. W., Bevans, K. B., Ialongo, N., & Leaf, P.
J. (2008). The impact of school-wide positive behavioral inter-
ventions and supports (PBIS) on the organizational health of
elementary schools. School Psychology Quarterly, 23(4), 462.
Bruns, E. J., Duong, M. T., Lyon, A. R., Pullmann, M. D., Cook, C.
R., Cheney, D., & McCauley, E. (2016). Fostering SMART part-
nerships to develop an effective continuum of behavioral health
services and supports in schools. American Journal of Orthopsy-
chiatry, 86(2), 156.
Brysbaert, M., & Stevens, M. (2018). Power analysis and effect size
in mixed effects models: A tutorial. Journal of Cognition, 1(1).
Calder, B. J., & Lutz, R. J. (1972). An investigation of some alterna-
tives to the linear attitude model. ACR Special Volumes.
Carlson, M. A., Morris, S., Day, F., Dadich, A., Ryan, A., Fradgley,
E. A., & Paul, C. (2021). Psychometric properties of leadership
scales for health professionals: A systematic review. Implementa-
tion Science, 16(1), 1–22.
Chaudoir, S. R., Dugan, A. G., & Barr, C. H. (2013). Measuring fac-
tors affecting implementation of health innovations: A system-
atic review of structural, organizational, provider, patient, and
innovation level measures. Implementation Science, 8(1), 1–20.
Cook, C. R., Davis, C., Brown, E. C., Locke, J., Ehrhart, M. G., Aarons,
G. A., & Lyon, A. R. (2018). Confirmatory factor analysis of
the Evidence-Based Practice Attitudes Scale with school-based
behavioral health consultants. Implementation Science, 13(1),
1–8.
Damschroder, L., Hall, C., Gillon, L., Reardon, C., Kelley, C.,
Sparks, J., & Lowery, J. (2015). The Consolidated Framework
for Implementation Research (CFIR): Progress to date, tools
and resources, and plans for the future. In Implementation Science (Vol. 10, No. 1, pp. 1–1). BioMed Central.
Damschroder, L. J., Aron, D. C., Keith, R. E., Kirsh, S. R., Alexan-
der, J. A., & Lowery, J. C. (2009). Fostering implementation of
health services research findings into practice: A consolidated
framework for advancing implementation science. Implementa-
tion Science, 4(1), 1–15.
Damschroder, L. J., Reardon, C. M., Opra Widerquist, M. A., &
Lowery, J. (2022). Conceptualizing outcomes for use with the
Consolidated Framework for Implementation Research (CFIR):
The CFIR Outcomes Addendum. Implementation Science,
17(1), 1–10.
de Kok, L. C., Oosting, D., & Spruit, M. (2020). The influence of
knowledge and attitude on intention to adopt cybersecure
behaviour. Information & Security, 46(3), 251–266.
DeJoy, D. M., Wilson, M. G., Vandenberg, R. J., McGrath-Higgins,
A. L., & Griffin-Blake, C. S. (2010). Assessing the impact of
healthy work organization intervention. Journal of Occupa-
tional and Organizational Psychology, 83(1), 139–165.
Domitrovich, C. E., Cortes, R. C., & Greenberg, M. T. (2007).
Improving young children’s social and emotional competence:
A randomized trial of the preschool “PATHS” curriculum. The
Journal of Primary Prevention, 28(2), 67–91.
Domitrovich, C. E., Pas, E. T., Bradshaw, C. P., Becker, K. D., Kep-
erling, J. P., Embry, D. D., & Ialongo, N. (2015). Individual
and school organizational factors that influence implementa-
tion of the PAX good behavior game intervention. Prevention
Science, 16(8), 1064–1074.
Duong, M. T., Bruns, E. J., Lee, K., Cox, S., Coifman, J., Mayworm,
A., & Lyon, A. R. (2021). Rates of mental health service uti-
lization by children and adolescents in schools and other com-
mon service settings: A systematic review and meta-analysis.
Administration and Policy in Mental Health and Mental Health
Services Research, 48(3), 420–439.
Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A
review of research on the influence of implementation on
program outcomes and the factors affecting implementation.
American Journal of Community Psychology, 41(3), 327–350.
Durlak, J. A., Weissberg, R. P., Dymnicki, A. B., Taylor, R. D., &
Schellinger, K. B. (2011). The impact of enhancing students’
social and emotional learning: A meta-analysis of school-based
universal interventions. Child Development, 82(1), 405–432.
Ehrhart, M. G., Aarons, G. A., & Farahnak, L. R. (2014a). Assess-
ing the organizational context for EBP implementation: The
development and validity testing of the Implementation Cli-
mate Scale (ICS). Implementation Science, 9, 157.
Ehrhart, M. G., Schneider, B., & Macey, W. H. (2014b). Organiza-
tional climate and culture: An introduction to theory, research,
and practice. Routledge.
Fagan, A. A., Bumbarger, B. K., Barth, R. P., Bradshaw, C. P.,
Cooper, B. R., Supplee, L. H., & Walker, D. K. (2019). Scal-
ing up evidence-based interventions in US public systems to
prevent behavioral health problems: Challenges and opportuni-
ties. Prevention Science, 20(8), 1147–1168.
Farahnak, L. R., Ehrhart, M. G., Torres, E. M., & Aarons, G. A.
(2020). The influence of transformational leadership and leader
attitudes on subordinate attitudes and implementation suc-
cess. Journal of Leadership & Organizational Studies, 27(1),
98–111.
Fishman, J., Yang, C., & Mandell, D. (2021). Attitude theory and
measurement in implementation science: A secondary review
of empirical studies and opportunities for advancement. Imple-
mentation Science, 16(1), 1–10.
Forman, S. G., Shapiro, E. S., Codding, R. S., Gonzales, J. E., Reddy,
L. A., Rosenfield, S. A., & Stoiber, K. C. (2013). Implementation
science and school psychology. School Psychology Quarterly,
28(2), 77.
Fromme, K. (1997). Outcome expectancies and risk-taking behavior.
Cognitive Therapy and Research, 21(4), 421–442. https://doi.org/10.1023/A:1021932326716
Galam, S., & Moscovici, S. (1991). Towards a theory of collective
phenomena: Consensus and attitude changes in groups. European
Journal of Social Psychology, 21(1), 49–74.
Glick, W. H. (1985). Conceptualizing and measuring organizational
and psychological climate: Pitfalls in multilevel research. Acad-
emy of Management Review, 10(3), 601–616.
Godin, G., & Kok, G. (1996). The theory of planned behavior: A
review of its applications to health-related behaviors. American
Journal of Health Promotion, 11(2), 87–98.
Greenberg, M. T., & Abenavoli, R. (2017). Universal interventions:
Fully exploring their impacts and potential to produce popula-
tion-level impacts. Journal of Research on Educational Effective-
ness, 10(1), 40–67.
Gregory, A., Aarons, G., & Carmazzi, A. (2005). Organizational cul-
ture and climate and attitude toward innovation adoption. In 20th
annual SIOP conference California.
Hamaker, E. L., & Muthén, B. (2020). The fixed versus random effects
debate and how it relates to centering in multilevel modeling.
Psychological Methods, 25(3), 365.
Han, S. S., & Weiss, B. (2005). Sustainability of teacher implementa-
tion of school-based mental health programs. Journal of Abnor-
mal Child Psychology, 33(6), 665–679.
Heck, R. H., & Thomas, S. L. (2020). An introduction to multilevel
modeling techniques: MLM and SEM approaches. Routledge.
Heisig, J. P., & Schaeffer, M. (2019). Why you should always include
a random slope for the lower-level variable involved in a cross-
level interaction. European Sociological Review, 35(2), 258–279.
Hoffman, L., & Walters, R. W. (2022). Catching up on multilevel mod-
eling. Annual Review of Psychology, 73, 659–689.
Horner, R. H., Sugai, G., Smolkowski, K., Eber, L., Nakasato, J., Todd,
A. W., & Esperanza, J. (2009). A randomized, wait-list controlled
effectiveness trial assessing school-wide positive behavior sup-
port in elementary schools. Journal of Positive Behavior Inter-
ventions, 11(3), 133–144.
Hoy, W. K., & Feldman, J. A. (1987). Organizational health: The con-
cept and its measure. Journal of Research and Development in
Education, 20(4), 30–37.
Hoy, W. K., & Tarter, C. J. (1997). The road to open and healthy
schools: A handbook for change (middle and secondary school
ed.). Corwin.
Hoy, W. K., & Woolfolk, A. E. (1993). Teachers’ sense of efficacy
and the organizational health of schools. The Elementary School
Journal, 93(4), 355–372.
James, L. R., Choi, C. C., Ko, C. H. E., McNeil, P. K., Minton, M. K.,
Wright, M. A., & Kim, K. I. (2008). Organizational and psy-
chological climate: A review of theory and research. European
Journal of Work and Organizational Psychology, 17(1), 5–32.
Kellam, S. G., Mackenzie, A. C., Brown, C. H., Poduska, J. M., Wang,
W., Petras, H., & Wilcox, H. C. (2011). The good behavior game
and the future of prevention and treatment. Addiction Science &
clinical practice, 6(1), 73.
Khosravi, A. A. (2020). The relationship between the authentic lead-
ership self-rating of teachers and the organizational climate of
elementary public schools. Our Lady of the Lake University.
Klein, K. J., & Kozlowski, S. W. J. (Eds.). (2000). Multilevel theory,
research, and methods in organizations: Foundations, extensions,
and new directions. Jossey-Bass.
Krosnick, J. A., Judd, C. M., & Wittenbrink, B. (2018). The measure-
ment of attitudes. In The handbook of attitudes (pp. 45–105).
Routledge.
Lance, C. E., Butts, M. M., & Michels, L. C. (2006). The sources of
four commonly reported cutoff criteria: What did they really say?
Organizational Research Methods, 9(2), 202.
Langley, A. K., Nadeem, E., Kataoka, S. H., Stein, B. D., & Jay-
cox, L. H. (2010). Evidence-based mental health programs in
schools: Barriers and facilitators of successful implementation.
School Mental Health, 2(3), 105–113.
Larson, M., Cook, C. R., Fiat, A., & Lyon, A. R. (2018). Stressed
teachers don't make good implementers: Examining the inter-
play between stress reduction and intervention fidelity. School
Mental Health, 10(1), 61–76.
List, C. (2014). Three kinds of collective attitudes. Erkenntnis, 79(9),
1601–1622.
Locke, J., Lawson, G. M., Beidas, R. S., Aarons, G. A., Xie, M.,
Lyon, A. R., & Mandell, D. S. (2019). Individual and organi-
zational factors that affect implementation of evidence-based
practices for children with autism in public schools: A cross-
sectional observational study. Implementation Science, 14(1),
1–9.
Low, S., Smolkowski, K., & Cook, C. (2016). What constitutes high-
quality implementation of SEL programs? A latent class analy-
sis of Second Step® implementation. Prevention Science, 17(8),
981–991.
Lucas, S. R. (2014). An inconvenient dataset: Bias and inappropriate
inference with the multilevel model. Quality & Quantity, 48(3),
1619–1649.
Lui, J. H., Brookman-Frazee, L., Lind, T., Le, K., Roesch, S., Aarons,
G. A., & Lau, A. S. (2021). Outer-context determinants in the
sustainment phase of a reimbursement-driven implementation
of evidence-based practices in children’s mental health services.
Implementation Science, 16(1), 1–9.
Lyon, A. R., & Bruns, E. J. (2019). From evidence to impact: Joining
our best school mental health practices with our best implemen-
tation strategies. School Mental Health, 11(1), 106–114.
Lyon, A. R., Cook, C. R., Brown, E. C., Locke, J., Davis, C., Ehrhart,
M., & Aarons, G. A. (2018a). Assessing organizational imple-
mentation context in the education sector: Confirmatory factor
analysis of measures of implementation leadership, climate, and
citizenship. Implementation Science, 13(1), 1–14.
Lyon, A. R., Whitaker, K., Locke, J., Cook, C. R., King, K. M., Duong,
M., & Aarons, G. A. (2018b). The impact of inter-organizational
alignment (IOA) on implementation outcomes: Evaluating
unique and shared organizational influences in education sector
mental health. Implementation Science, 13(1), 1–11.
Lyon, A. R., Cook, C. R., Duong, M. T., Nicodimos, S., Pullmann, M.
D., Brewer, S. K., & Cox, S. (2019). The influence of a blended,
theoretically-informed pre-implementation strategy on school-
based clinician implementation of an evidence-based trauma
intervention. Implementation Science, 14(1), 1–16.
Lyon, A. R., Corbin, C. M., Brown, E. C., Ehrhart, M. G., Locke, J.,
Davis, C., & Cook, C. R. (2021). Leading the charge in the edu-
cation sector: Development and validation of the School Implementation Leadership Scale (SILS). Implementation Science.
Mang, J., Küchenhoff, H., Meinck, S., & Prenzel, M. (2021). Sampling
weights in multilevel modelling: An investigation using PISA
sampling structures. Large-Scale Assessments in Education, 9(1),
1–39.
Melas, C. D., Zampetakis, L. A., Dimopoulou, A., & Moustakis, V.
(2012). Evaluating the properties of the Evidence-Based Practice
Attitude Scale (EBPAS) in health care. Psychological Assess-
ment, 24(4), 867.
Meza, R. D., Triplett, N. S., Woodard, G. S., Martin, P., Khairuzzaman,
A. N., Jamora, G., & Dorsey, S. (2021). The relationship between
first-level leadership and inner-context and implementation out-
comes in behavioral health: A scoping review. Implementation
Science, 16(1), 1–21.
Neal, A., West, M. A., & Patterson, M. G. (2005). Do organizational
climate and competitive strategy moderate the relationship
between human resource management and productivity? Journal
of Management, 31(4), 492–512.
Nelson, M. M., Shanley, J. R., Funderburk, B. W., & Bard, E. (2012).
Therapists’ attitudes toward evidence-based practices and imple-
mentation of parent–child interaction therapy. Child Maltreat-
ment, 17(1), 47–55.
Nese, R., McIntosh, K., Nese, J., Hoselton, R., Bloom, J., Johnson, N.,
& Ghemraoui, A. (2016). Predicting abandonment of school-
wide positive behavioral interventions and supports. Behavioral
Disorders, 42(1), 261–270.
Palinkas, L. A., Horwitz, S. M., Green, C. A., Wisdom, J. P., Duan,
N., & Hoagwood, K. (2015). Purposeful sampling for qualitative
data collection and analysis in mixed method implementation
research. Administration and Policy in Mental Health and Mental
Health Services Research, 42(5), 533–544.
Park, R., Pituch, K. A., Kim, J., Chung, H., & Dodd, B. G. (2015).
Comparing the performance of multivariate multilevel modeling
to traditional analyses with complete and incomplete data. Meth-
odology, 11(3), 100.
Pescosolido, B. A., Halpern-Manners, A., Luo, L., & Perry, B. (2021).
Trends in public stigma of mental illness in the US, 1996–2018.
JAMA Network Open, 4(12), e2140202–e2140202.
Powell, B. J., Mandell, D. S., Hadley, T. R., Rubin, R. M., Evans, A. C.,
Hurford, M. O., & Beidas, R. S. (2017). Are general and strate-
gic measures of organizational context and leadership associated
with knowledge and attitudes toward evidence-based practices in
public behavioral health settings? A Cross-Sectional Observa-
tional Study. Implementation Science, 12(1), 1–13.
Powell, B. J., Waltz, T. J., Chinman, M. J., Damschroder, L. J., Smith,
J. L., Matthieu, M. M., & Kirchner, J. E. (2015). A refined com-
pilation of implementation strategies: Results from the Expert
Recommendations for Implementing Change (ERIC) project.
Implementation Science, 10(1), 1–14.
Preacher, K. J., Curran, P. J., & Bauer, D. J. (2006). Computational
tools for probing interactions in multiple linear regression, multi-
level modeling, and latent curve analysis. Journal of Educational
and Behavioral Statistics, 31(4), 437–448.
Preacher, K. J., Zyphur, M. J., & Zhang, Z. (2010). A general multilevel
SEM framework for assessing multilevel mediation. Psychologi-
cal Methods, 15(3), 209.
Raudenbush, S. W. (2004). HLM 6: Hierarchical linear and nonlinear
modeling. Scientific Software International.
Raudenbush, S. W., Bryk, A. S., & Congdon, R. (2009). HLM 6.08
[computer software]. Scientific Software International.
Rhoades, L., & Eisenberger, R. (2002). Perceived organizational sup-
port: A review of the literature. Journal of Applied Psychology,
87(4), 698.
Rye, M., Torres, E. M., Friborg, O., Skre, I., & Aarons, G. A. (2017).
The Evidence-based Practice Attitude Scale-36 (EBPAS-36):
A brief and pragmatic measure of attitudes to evidence-based
practice validated in US and Norwegian samples. Implementation
Science, 12(1), 1–11.
Sanford DeRousie, R. M., & Bierman, K. L. (2012). Examining the
sustainability of an evidence-based preschool curriculum: The
REDI program. Early Childhood Research Quarterly, 27, 55–65.
Scherbaum, C. A., & Ferreter, J. M. (2009). Estimating statistical
power and required sample sizes for organizational research
using multilevel modeling. Organizational Research Methods,
12(2), 347–367.
Schneider, B., Ehrhart, M. G., & Macey, W. H. (2010). Organizational
climate research: achievements and the road ahead. In N. M.
Ashkanasy, C. P. M. Wilderom, & M. F. Peterson (Eds.), The
handbook of organizational culture and climate (2nd ed., pp.
29–49). Sage.
Schwarzer, R., Lippke, S., & Luszczynska, A. (2011). Mechanisms
of health behavior change in persons with chronic illness or
disability: The Health Action Process Approach (HAPA). Reha-
bilitation Psychology, 56(3), 161.
Shanock, L. R., Baran, B. E., Gentry, W. A., Pattison, S. C., & Heg-
gestad, E. D. (2010). Polynomial regression with response sur-
face analysis: A powerful approach for examining moderation
and overcoming limitations of difference scores. Journal of Busi-
ness and Psychology, 25(4), 543–554.
Silvia, P. J., & Christensen, A. P. (2020). Looking up at the curious
personality: Individual differences in curiosity and openness to
experience. Current Opinion in Behavioral Sciences, 35, 1–6.
Smith, B. D., & Manfredo, I. T. (2011). Frontline counselors in organi-
zational contexts: A study of treatment practices in community
settings. Journal of substance abuse treatment, 41(2), 124–136.
Snijders, T. A. B., & Bosker, R. J. (2012). Multilevel analysis: An
introduction to basic and advanced multilevel modelling (2nd
ed.). Sage.
Stauffer, J. M., & Mendoza, J. L. (2001). The proper sequence for cor-
recting correlation coefficients for range restriction and unreli-
ability. Psychometrika, 66(1), 63–68.
Thayer, A., Brown, E., Cook, C. R., Ehrhart, M. G., Locke, J., Davis,
C., Aarons, G. A., Picozzi, E., & Lyon, A. R. (2022). Construct
validity of the School-Implementation Climate Scale. Implemen-
tation Research and Practice, 3, 1–14.
Tipton, E., & Miller, K. (2022). The generalizer. Webtool hosted at https://www.thegeneralizer.org.
Urbaniak, G. C., & Plous, S. (2013). Research randomizer (Version 4.0) [Computer software]. Retrieved on June 22, 2022, from http://www.randomizer.org/
Vatcheva, K. P., Lee, M., McCormick, J. B., & Rahbar, M. H. (2016).
Multicollinearity in regression analyses conducted in epidemio-
logic studies. Epidemiology.
Waltz, T. J., Powell, B. J., Matthieu, M. M., Damschroder, L. J., Chin-
man, M. J., Smith, J. L., & Kirchner, J. E. (2015). Use of concept
mapping to characterize relationships among implementation
strategies and assess their feasibility and importance: Results
from the Expert Recommendations for Implementing Change
(ERIC) study. Implementation Science, 10(1), 1–8.
Wang, L., Zhang, Q., Maxwell, S. E., & Bergeman, C. S. (2019).
On standardizing within-person effects: Potential problems of
global standardization. Multivariate Behavioral Research, 54(3),
382–403.
Weiner, B. J. (2009). A theory of organizational readiness for change.
Implementation Science, 4(1), 1–9.
Weiner, B. J., Belden, C. M., Bergmire, D. M., & Johnston, M. (2011).
The meaning and measurement of implementation climate.
Implementation Science, 6(1), 1–12.
Williams, N. J., & Beidas, R. S. (2019). Annual research review: The
state of implementation science in child psychology and psychia-
try: A review and suggestions to advance the field. Journal of
Child Psychology and Psychiatry, 60(4), 430–450.
Williams, N. J., Ehrhart, M. G., Aarons, G. A., Marcus, S. C., & Bei-
das, R. S. (2018). Linking molar organizational climate and stra-
tegic implementation climate to clinicians’ use of evidence-based
psychotherapy techniques: Cross-sectional and lagged analyses
from a 2-year observational study. Implementation Science,
13(1), 1–13.
Williams, N. J., & Glisson, C. (2014). The role of organizational cul-
ture and climate in the dissemination and implementation of
empirically supported treatments for youth.
Williams, N. J., Wolk, C. B., Becker-Haimes, E. M., & Beidas, R. S.
(2020). Testing a theory of strategic implementation leadership,
implementation climate, and clinicians’ use of evidence-based
practice: A 5-year panel analysis. Implementation Science, 15(1),
1–15.
Youth mental health reports and publications. (2021, December 7). Retrieved from https://www.hhs.gov/surgeongeneral/reports-and-publications/youth-mental-health/index.html
Publisher's Note Springer Nature remains neutral with regard to
jurisdictional claims in published maps and institutional affiliations.
Springer Nature or its licensor (e.g. a society or other partner) holds
exclusive rights to this article under a publishing agreement with the
author(s) or other rightsholder(s); author self-archiving of the accepted
manuscript version of this article is solely governed by the terms of
such publishing agreement and applicable law.