School-Wide Positive Behavioral Interventions and Supports in Dutch Elementary Schools
The Implementation and Effects of a Whole-School Intervention Approach
Doctoral dissertation to obtain the degree of doctor from Radboud University Nijmegen, on the authority of the rector magnificus, Prof. dr. J.H.J.M. van Krieken, according to the decision of the Council of Deans, to be defended in public on Wednesday, 1 September 2021 at 14.30 hours precisely.
©2021 Monique Nelen
ISBN 978-94-6421-392-8
Design: Studio Susan Bijen
Illustrations: Suus van den Akker
Printing: Ipskamp Printing

Funding
This research was supported by the Netherlands Organization for Scientific Research (NWO) under grant number 023.005.043.
Table of contents

Chapter 1. General introduction
Chapter 2. Cultural adaptation
Chapter 3. Fidelity of implementation
Chapter 4. Results of SWPBIS
Chapter 5. General conclusion and discussion
References
Appendix A: Technical report on SWPBIS and academic achievements
Summary in Dutch
Acknowledgements
About the author
List of publications
Supervisors:
Prof. dr. R.H.J. Scholte
Prof. dr. E. Denessen (Universiteit Leiden)

Co-supervisor:
Dr. A. Blonk (Hogeschool Windesheim)

Doctoral Thesis Committee:
Prof. dr. E. H. Kroesbergen
Prof. dr. A. Popma (Amsterdam UMC)
Prof. dr. A.M. Hintz (Carl von Ossietzky Universität Oldenburg, Germany)
Chapter 1. General introduction
School plays an important role in the life of children. It is the place where they learn to read and write, meet their peers, become more and more independent of their parents, and much more. It is also a place where they spend a lot of time: in the Netherlands, children aged 4 to 12 spend 7,280 hours at school over 8 years (www.rijksoverheid.nl/onderwerpen/schooltijden-en-onderwijstijd/overzicht-aantal-uren-onderwijstijd). To support children's cognitive and social-emotional development, schools strive to create safe and effective learning environments. In fact, Dutch schools are bound by law to guarantee and monitor the safety of their students. As freedom of education is a constitutional right in the Netherlands, schools can decide how to meet this legal obligation. A national inspectorate monitors the quality of the education schools provide, based on the educational goals determined by the Dutch government (OCW, 2020).
Safety is pivotal for learning. If students do not feel safe at school, learning is unlikely to take place. S. E. Goldstein, Young, and Boyd (2008) defined safe schools as environments where students are likely to remain free from victimization and harassment. School safety represents the degree of physical and emotional security provided by the school, as well as the presence of effective, consistent, and fair disciplinary practices, and is considered to be part of the more general concept of school climate (M. T. Wang & Degol, 2016). Other dimensions of school climate are: academic climate (e.g., teaching and learning), community (e.g., relations and connectedness), and institutional environment (e.g., structural organization; M. T. Wang & Degol, 2016). Next to a sense of school belonging and good interpersonal relationships, safety is one of the key determinants of the social-emotional well-being of students. By improving school safety, social-emotional skills, positive attitudes towards self and school, and positive social behavior are enhanced.
Safety is not only important for students. Unsafe classrooms can disturb interpersonal relations between students and impede their cognitive functioning, but low levels of school safety can also have a severe impact on the well-being and, as a result, the functioning of teachers. Tensions in the classroom can cause emotional stress for teachers, and interfere with classroom management and effective instruction. Teachers can also be harassed themselves. For example, Brunsting, Sreckovic, and Lane (2014) found that unsafe situations in schools contribute to teacher burnout. As a result of being harassed themselves, or of being confronted with misbehavior from one student towards another, teachers can suffer from emotional strain and burnout that affects their feelings of commitment and self-efficacy. This can lead to negative or clinical, cold or distant attitudes towards all students in general (Cornell & Mayer, 2010).
Logically, in line with the definition of safe schools presented above, unsafe schools are schools where harassment and other problem behaviors occur that cause feelings of victimization for both students and teachers. Research has shown that the occurrence of problem behaviors not only has a negative impact on school safety, but also contributes to a poor school climate (Ögülmüs & Vuran, 2016), presents a barrier for learning (Chitiyo, Makweche-Chitiyo, Park, Ametepee, & Chitiyo, 2010), negatively impacts students' quality of life (Emerson et al., 2014), and adversely affects peers (Dishion & Tipsord, 2011). The occurrence of problem behaviors (such as verbal or physical aggression, truancy, or bullying) and depression has been linked to high amounts of school conflict, disorder, and friction among students (M. T. Wang & Degol, 2016). Severe misbehaviors can have a long-lasting impact on students, who may experience anxiety over bullying or fear for their personal safety (Cornell & Mayer, 2010). Students who perceive their school climate as peaceful, experiencing less aggressive resolutions to peer conflicts, are likely to engage less in risky behaviors themselves.
Many schools are struggling with behavioral issues, varying from students not following teachers' directions and attitude problems to truancy or violence. What is considered problem behavior depends on the context in which it occurs. When behavior interferes with schools' daily practice, it is considered problematic. There is a distinction between minor and major problem behavior. Minor problem behaviors are behaviors that can be addressed by the teacher without support from outside the class, for example name calling or not following the teacher's directions. Examples of major problem behavior are physical violence, theft, or vandalism. These behaviors usually require a broader approach in which, for example, school psychologists or external specialists are deployed. As the definition of problem behavior is normative, it is hard to estimate how often problem behavior occurs. Although no research has been done in the Netherlands to examine the actual percentage of students exhibiting problem behavior in Dutch schools, teachers often experienced feelings of incompetence in dealing with challenging student behavior on a daily basis. In fact, this was found to be a major cause of teacher burnout (Goei & Kleijnen, 2009). Among all educational needs, teachers especially experienced students with problem behavior to be the most challenging (Smeets, Ledoux, & Van Loon-Dikkers, 2019). This is endorsed by McEvoy and Welker (2000), who stated that problem behavior is a priority area to be addressed in educational agendas.
Creating safe schools
To create safe schools, teachers' effectiveness in handling disciplinary infractions and bullying behaviors, and school attitudes toward acceptable levels of aggression within the school, are important (M. T. Wang & Degol, 2016). M. T. Wang and Degol (2016) further argued that norms and values shared by the school may shape both student and teacher attitudes and beliefs regarding acceptable versus unacceptable behaviors in schools. These norms determine what counts as a behavior incident and also impact teachers' efficacy at preventing behavior incidents. Schoolwide programs addressing safety, such as School-Wide Positive Behavioral Interventions and Supports (SWPBIS), were found to have a positive impact on enhancing students' perceptions of safety (Horner et al., 2009) and improving school climate (Bradshaw, Koth, Bevans, Lalongo, & Leaf, 2008). As teachers are part of a team, combining all individual teacher efforts into a schoolwide approach is likely to be more effective than teachers struggling on their own with their challenges to improve safety.
Although there were few studies in their meta-analysis that directly compared the effects of classroom-based programming with schoolwide programs, Durlak, Weissberg, Dymnicki, Taylor, and Schellinger (2011) did not find an additional benefit of schoolwide or multicomponent programs over single-component (i.e., classroom only) programs, contrary to findings that have been reported in other reviews (Catalano et al., 2002; Greenberg et al., 2001; Tobler et al., 2000). They explained that the reduced program impact could have been due to the fact that multicomponent programs were more likely to encounter implementation problems than single-component programs. Other scholars argued that school safety issues, such as bullying, require a comprehensive approach focusing on school climate (e.g., Anyon, Nicotera, & Veeh, 2016; Bosworth & Judkins, 2014). As safety is part of the broader concept of school climate, a comprehensive schoolwide approach that increases consistency across teachers seems more logical than a classroom-based or single-teacher approach to addressing safety.
Schoolwide prevention models

Over the last decades, schoolwide prevention models have emerged to address problem behavior and create safe learning environments (Greenwood, Kratochwill, & Clements, 2008). Schoolwide prevention models have a strong emphasis on the whole school and include all students and staff members across all school settings. Prevention models are designed to manage risk and link quick, targeted actions to reductions in negative outcomes. This is achieved by universal progress monitoring to identify those at risk, combined with early, differentiated tiers of intervention (universal, targeted, and indicated; Greenwood et al., 2008). Universal, or Tier 1, interventions are expected to meet the needs of about 80% of the population of the school. Examples of Tier 1 interventions are defining and teaching behavioral expectations to all students. Targeted, or Tier 2, interventions contain interventions for those students who need more support, and typically accommodate 15% of the students shown not to be benefiting from primary prevention alone. An example of a Tier 2 intervention is a standardized behavioral education plan that organizes a student to check in every morning with a designated person, who reminds him or her to focus on standardized goals related to the behavioral expectations, such as being respectful to the teacher. Indicated, or Tier 3, interventions are reserved for approximately 5% of the students in a school who have not responded to primary and secondary interventions. Based on a thorough analysis of the behavior in question, individualized interventions are developed, such as an anger management training. See Figure 1 for the multi-tiered system of support.
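The data-driven side of this tiered logic can be illustrated with a small sketch. The Python fragment below is not part of the dissertation; it simply shows how a leadership team might sort students into support tiers from counts of recorded behavior incidents. The cut-off values (0-1, 2-5, 6 or more incidents) and the data are illustrative assumptions only.

from collections import Counter

# Illustrative, hypothetical cut-offs for sorting students into tiers
# based on the number of recorded behavior incidents per school year.
def assign_tier(incident_count: int) -> str:
    if incident_count <= 1:
        return "Tier 1 (universal)"
    if incident_count <= 5:
        return "Tier 2 (targeted)"
    return "Tier 3 (indicated)"

def tier_summary(incidents_per_student: dict) -> Counter:
    # Count how many students fall into each tier.
    return Counter(assign_tier(n) for n in incidents_per_student.values())

# Fictitious example data: incident counts for four students.
incidents = {"student_a": 0, "student_b": 3, "student_c": 7, "student_d": 1}
print(tier_summary(incidents))
# Counter({'Tier 1 (universal)': 2, 'Tier 2 (targeted)': 1, 'Tier 3 (indicated)': 1})

In a school implementing such a model, a summary of this kind would feed the team's regular data-based decision making, for example to check whether the proportions roughly follow the 80-15-5 pattern described above.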
Based on a synthesis of the research literature, Sørlie and Ogden (2007) described components of the most effective schoolwide programs to address problem behavior: (a) multi-component; (b) interventions targeting students at different risk levels, sometimes also involving parents; (c) guided by an explicit theory; (d) research based; (e) developmentally and culturally appropriate; (f) a focus on the importance of skills training (e.g., student reading and social skills, teacher classroom management skills); (g) well planned and systematically implemented; and (h) with research-based and predefined intervention components. Some of these components also emerged in other studies. In their meta-analysis of 221 schoolwide prevention programs, Wilson, Lipsey, and Derzon (2003) endorsed the importance of a schoolwide focus on social competence training (component f) as one of the most effective program components in preventing and reducing aggressive and disruptive behavior. They also concluded that behavioral approaches (based on behavioral theory and applied behavior analysis [ABA]) were found to be very effective (component c). ABA is a scientific approach to understanding behavior that refers to a set of principles that focus on how behaviors change, or are affected by the environment, as well as how learning of (new) behaviors takes place (H. Goldstein, 2002). Functional Behavior Assessment (FBA), a thorough analysis of the function of (problem) behavior in a specific context, is rooted in ABA (Crone & Horner, 2003). Accordingly, Cook, Gottfredson, and Na (2010) stated that programs that teach self-control or social competency skills using cognitive-behavioral or behavioral instructional methods can reduce crime in schools (components f and c). Research shows that schools with schoolwide discipline management policies and practices (e.g., rules that are fair, clearly stated, and consistently enforced, and students participating in establishing mechanisms for reducing misbehavior) experience less disorder (Cook et al., 2010). In addition to discipline management, norms and expectations for behavior in the school, and the quality of relationships among and between students and adults in the school, also predict problem behavior (Gottfredson, 2017). Considering the criteria mentioned above by Sørlie and Ogden (2007), SWPBIS can be considered an example of a schoolwide prevention model that meets the criteria of effective prevention programs and provides structures and routines for implementation.
Figure 1. Multi-tiered system of support, retrieved from www.pbis.org. [The figure ("Designing School-Wide Systems for Student Success") shows parallel triangles for academic and behavioral systems, each divided into universal interventions (all settings, all students; preventive, proactive; 80-90%), targeted group interventions (some at-risk students; high efficiency; rapid response; 5-10%), and intensive, individual interventions (individual students; assessment-based; high intensity; 1-5%).]
School-Wide Positive Behavioral Interventions and Supports (SWPBIS): history and foundations

SWPBIS is a schoolwide approach that supports schools in creating a safe learning environment (Sugai & Horner, 2009). It was originally developed in the US in the 1980s by researchers from the University of Oregon (Sugai & Simonsen, 2012), and more than 26,000 U.S. schools are currently working with SWPBIS. The theoretical and conceptual foundations of SWPBIS are firmly linked to behavioral theory and ABA (Sugai & Horner, 2009). To understand why certain (problem) behavior occurs, not only observable behavior but also antecedents and consequences that are linked to the targeted behavior are studied. Well-known techniques focusing on manipulating consequences for behavior are positive and negative reinforcement, and punishment. Environmental redesign is used to promote desired behaviors and minimize the development and support of problem behaviors. ABA is an applied science that tries to change behavior by first assessing the functional relationship between a specific (problem) behavior and the environment. When this function of behavior is determined, socially acceptable alternatives for the problem behavior are developed.
Originally, SWPBIS started as a behavioral approach called PBS (Positive Behavior Support) for resolving serious problem behaviors of individuals with severe developmental disabilities. PBS emerged as an alternative to the prevailing behavior management practices that emphasized the manipulation of consequences in order to establish a change in behavior. PBS was considered a "breakaway movement from the field of ABA based on moral revulsion at aversive treatments" (Singer & Wang, 2009, p. 18). PBS differed from ABA in the foundational belief that there are effective positive alternatives to aversive treatments, and in the commitment to use behavioral interventions to improve the quality of life of PBS recipients instead of only focusing on target behavior. The adjective "positive" refers to both behavior and support: positive behavior, which can be seen as desirable, adaptive, prosocial behavior, and positive behavioral support as differentiated from nonpositive support, which might involve the use of aversive, humiliating, or stigmatizing interventions (Dunlap, Kincaid, Horner, Knoster, & Bradshaw, 2014). Later on, PBS grew into an approach that not only focused on individuals, but also included implementation of strategies aimed at groups of children in classrooms and schools, as well as children and adults in a variety of early education and service programs (Kincaid et al., 2015). Apart from the behavioral roots, PBS combines cognitive, biophysical, social, developmental, and environmental psychology. For this approach, six characteristics were described: (1) an emphasis on lifestyle change, (2) functional analysis, (3) antecedent and setting variables, (4) teaching of adaptive behavior, (5) minimizing the use of punishment procedures, and (6) using multi-component interventions. In the early 1990s, the main focus was still on the individual with severe problem behaviors in a specific context that could be used to modify behavior. In the beginning of this century, PBS expanded rapidly. Apart from individuals with severe problem behaviors, other populations benefitted from applying PBS (e.g., young children or children with autism spectrum disorders). In addition, PBS began to be applied with groups, and became a major influence in school restructuring in the US. As PBS also refers to a U.S. broadcast company, the name was changed into PBIS: Positive Behavioral Interventions and Supports. The federally funded US Office of Special Education Programs (OSEP) Technical Assistance Center on PBIS began a program of systematically disseminating the PBIS framework for entire schools and classrooms (www.pbis.org). Schoolwide PBIS was embraced by thousands of educators and related professionals in the US (Kincaid et al., 2015).
SWPBIS: key features
Sugai and Horner (2009) described SWPBIS as "a systems approach for establishing the social culture and individualized behavior supports needed for a school to be a safe and effective learning environment for all students" (p. 309). They specifically stated that SWPBIS is an approach, not a curriculum, intervention or program, as it is a large constellation of systems and practices that needs to be adjusted to the context it is implemented in, referred to as "contextual fit" (McIntosh, Filter, Bennett, Ryan, & Sugai, 2010). Each school (a) defines contextually acceptable and measurable academic and social behavior outcomes; (b) collects information or data to guide decision making and, accordingly, to select effective behavioral interventions; and (c) uses evidence-based interventions to support students both academically and behaviorally. SWPBIS offers systems support designed to increase the accuracy and durability of practice implementation. Prevention is one of the defining characteristics of SWPBIS, emphasizing the establishment of a multi-tiered system of support. A school that has implemented SWPBIS at Tier 1 typically has established schoolwide behavioral expectations that are being taught, systematically acknowledges positive student behavior, has a schoolwide system for handling problem behavior (including procedures for responding to problem behavior with consistent consequences), uses techniques such as positive reinforcement and active supervision, and develops preventive interventions based on behavioral data. A SWPBIS leadership team (a delegation of staff including the administrator) is responsible for the implementation process in school, and continuously measures outcomes and evaluates fidelity of implementation. For students whose behaviors are not responsive to Tier 1 interventions and who, therefore, need more intensive behavioral support (approximately 10-15% of all students), Tier 2 interventions are executed. Tier 2 interventions typically include similar implementation across students, and are continuously available and quickly accessible. Tier 3 interventions are developed for students with chronic or severe behavior needs whose behaviors are not responsive to Tier 1 and Tier 2 interventions and who need individualized support (approximately 1-5% of all students). Functional behavioral assessments (FBA) give directions for these individualized behavior support plans. Another defining characteristic of SWPBIS is the instructional focus: the direct teaching and training of social behaviors (Sugai & Horner, 2009). These behaviors are grounded in school values and norms and identified by stakeholders: educators, students, and preferably parents and community stakeholders.
Contextual fit
SWPBIS (a) is a schoolwide approach that meets the criteria of effective prevention programs and provides structures and routines for implementation (Sørlie & Ogden, 2007), (b) has proven to be effective in terms of creating safe schools and reducing problem behavior (Horner, Sugai, & Anderson, 2010), and (c) can be adjusted to the context it is implemented in. McIntosh et al. (2010) define contextual fit as the development and alignment of SWPBIS strategies and interventions within the context of an individual school. This not only applies to implementing SWPBIS in diverse U.S. cultural contexts, but also to implementation in national cultures different from the US. According to Singer and Wang (2009), SWPBIS reflects values and beliefs embedded in the American mainstream culture. Therefore, adopting and implementing SWPBIS into the Dutch educational context needed to be carefully considered.
In the Netherlands, schools take the lead in deciding how to achieve the educational goals that are set by the Dutch government. Two core goals are imparting knowledge and skills, and promoting social-emotional development and citizenship. Citizenship refers to teaching students how to participate in society politically, socially, and economically (van Oers, Leeman, & Volman, 2009). Apart from qualification and socialization, education also has a social mission, which can be defined as schools' response to societal issues, such as preventing segregation. In other words, education does not take place in a vacuum, but is part of society and, as a result, a bearer of cultural values, norms, and customs. Correspondingly, taking care of the contextual fit of SWPBIS in the Netherlands should not just be about adjusting strategies and interventions, but should also involve a careful consideration of Dutch educational culture. After all, SWPBIS is not a goal in itself; it is a schoolwide approach that provides schools with tools to reach the outcomes they value within their context, for example creating a social culture where students feel safe.
Fidelity of implementation
By developing schoolwide systems and procedures that promote positive changes in student behavior, educators are provided with tools to arrange school environments according to students' needs. Training and technical assistance of educators, including direct assessment of and feedback on performance of newly acquired skills, are part of the implementation process, as this is related to implementation quality (Fixsen, Naoom, Blase, Friedman, & Wallace, 2005). From a systems perspective, SWPBIS gives priority to establishing local capacity and expertise, majority agreements and commitments, high levels of implementation readiness, high fidelity of implementation, and continuous implementation and outcome evaluation. Data are systematically collected and used for decision making to determine if defined practices are implemented with fidelity and if those practices have a positive impact on student outcomes (Sugai & Horner, 2009). Practices and implementation strategies are adjusted to the cultural context of the school in which SWPBIS is implemented.
A major challenge in implementing schoolwide approaches in general, and SWPBIS in particular, is that the processes, structures and routines of schools often are not sufficient to support the adoption and sustained use of evidence-based interventions (Chard et al., 2008). Schools are complex systems of classrooms that involve professionals, policies, programs, and practices that interact in complex ways (Simmons, Kame'enui, Stoolmiller, Coyne, & Harn, 2003). To increase the likelihood of successfully adopting and implementing SWPBIS, it is important to first assess the contextual fit of the intervention to the host environment (e.g., classroom or school), then to establish a formative, continuous feedback loop at school level to provide information on its effectiveness in a timely manner, and finally to ensure there is commitment of school leadership and staff members to using the schoolwide approach (Chard et al., 2008).
According to Fixsen et al. (2005), implementation can be defined as "a set of activities designed to put into practice an activity or program of known dimensions" (p. 5). It may take two to four years to fully complete implementation of schoolwide approaches such as SWPBIS. In the process of implementation, six functional stages, which are not linear but interact with each other in complex ways, can be distinguished: (1) exploration (identify the need, acquire information, assess the fit, prepare the organization); (2) installation (organizing resources, structural support of staff); (3) initial implementation (changes in the overall practice environment); (4) full implementation (the new learning becomes integrated in practices, policies and procedures); (5) innovation (learning more about the approach itself and the conditions under which it can be used with effect); and (6) sustainability (long-term survival and continued effectiveness) (Fixsen et al., 2005). In the US, school-based leadership teams receive further support in different stages of implementation from district and state-level leadership teams (Office of Special Education Programs Technical Assistance Center on PBIS, 2015).
The last stage of implementation is sustainability, the point at which an approach ceases to be a project or an initiative and becomes institutionalized (McIntosh, Horner, & Sugai, 2009). In many schools, cycles of repeated implementation of different programs are not uncommon. This not only brings along high costs in terms of money, effort, direct intervention time, and school in-service programming, but also increased resistance to new implementation efforts. The crux to sustainability is to identify why a school wants to implement SWPBIS (McIntosh et al., 2009). There are many threats and barriers to sustained implementation that need to be constantly considered: changes in the context of the school (such as a lack of contextual fit, new challenges that emerge, or competing initiatives that occur), changes in capacity (such as a loss of funding and attrition of key personnel), and changes in consequences (such as diminished effectiveness due to poor fidelity of implementation, or outcomes that are no longer perceived as important). The SWPBIS leadership team plays an important role in identifying and addressing barriers. Regular measurement of fidelity of implementation can be helpful to identify barriers and adjust the approach to the current situation.
Fidelity of implementation refers to the extent to which components of an intervention, as conceptualized in a theoretical model or manual, are implemented as intended (Lane, Bocian, MacMillan, & Gresham, 2004; Schulte, Easton, & Parker, 2009). When SWPBIS is implemented with fidelity, students, educators, and schools experience positive outcomes, including improved school climate (Bradshaw et al., 2008; Bradshaw, Koth, Thornton, & Leaf, 2009; Horner et al., 2010), enhanced perceptions of school safety (Horner et al., 2009), increased prosocial skills (Bradshaw, Waasdorp, & Leaf, 2012), reduced problem behavior (Bradshaw et al., 2012; Waasdorp, Bradshaw, & Leaf, 2012), and increased teacher self-efficacy (e.g., Kelm & McIntosh, 2012) and well-being (e.g., Ross, Romer, & Horner, 2012). Durlak and DuPre (2008) reported that interventions that monitored implementation obtained effect sizes two to three times larger than interventions that reported no monitoring. Many SWPBIS effect studies showed that fidelity of implementation is critical to achieve the desired outcomes (e.g., Bradshaw et al., 2009; Simonsen et al., 2012). In SWPBIS studies, assessing fidelity was operationalized by measuring to what extent core features and standard procedures of SWPBIS were present in schools. Regular measurement of fidelity of implementation is part of the SWPBIS framework. Several instruments have been developed to measure Tier 1 fidelity of implementation: the Schoolwide Evaluation Tool (SET; Horner et al., 2004), the Benchmarks of Quality (BoQ; Kincaid, Childs, & George, 2005), and the latest one, the Tiered Fidelity Inventory (TFI; McIntosh et al., 2017). Most instruments are completed by the SWPBIS leadership team of a school, preferably with guidance by an external SWPBIS coach to ensure as much objectivity as possible.
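To illustrate how such Tier 1 fidelity assessments are typically summarized, the sketch below computes a percentage score from item ratings. It assumes, as in the TFI, that each item is rated 0 (not implemented), 1 (partially implemented), or 2 (fully implemented); the example ratings and the 70% criterion are illustrative assumptions, not values taken from the studies in this dissertation.

def fidelity_percentage(item_scores: list) -> float:
    # Percentage of points earned out of points possible (2 per item).
    if not item_scores:
        raise ValueError("No item scores provided.")
    if any(score not in (0, 1, 2) for score in item_scores):
        raise ValueError("Each item must be scored 0, 1, or 2.")
    return 100 * sum(item_scores) / (2 * len(item_scores))

# Hypothetical Tier 1 item ratings for one school.
scores = [2, 2, 1, 2, 0, 1, 2, 2, 1, 2, 2, 1, 0, 2, 2]
pct = fidelity_percentage(scores)
print(f"Tier 1 fidelity: {pct:.0f}% (criterion of 70% met: {pct >= 70})")

A leadership team completing such a measure with its coach would track this percentage over time and use it, alongside student outcome data, to adjust its implementation plan.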
Fidelity of implementation can be at odds with contextual fit. Adaptations that are made to make SWPBIS fit more closely to the school context must be in line with the conceptual foundations to avoid weakening the potential efficacy (Ringwalt, Vincus, Ennett, Johnson, & Rohrbach, 2004). Castro, Barrera, and Martinez (2004) call this the tension between fidelity and fit.
SWPBIS in the Netherlands
In 2009, a consortium of universities of applied sciences and youth care organizations (i.e., Windesheim University of Applied Sciences, Fontys University of Applied Sciences, PI Research, Pica Pedia Support, and Yorneo) introduced SWPBIS to the Netherlands to support schools in dealing with student problem behavior. After studying key publications and visiting schools in Oregon and Norway, the consortium saw the benefits of implementing SWPBIS in Dutch schools. First, SWPBIS needed to be contextualized, not only by translating materials into Dutch, but also by taking into account Dutch culture and the educational context by making core features, practices and implementation strategies compatible with the cultural patterns, meanings and values of those being served. The five consortium partners collaborated in adapting SWPBIS for the Dutch context. They summarized core features in what is known as "the five pillars of SWPBIS" (M. J. M. Nelen et al., 2016): 1) Schoolwide approach based on shared values, 2) Prevention (including a multi-tiered system of support, and consistent response to problem behaviors), 3) Teaching expectations and acknowledging positive behavior, 4) Data-driven decision making, and 5) Partnership with parents and cooperation with stakeholders. Consortium partners received SWPBIS training from U.S. experts, and an array of implementation blueprints, materials and procedures was developed.
In 2009-2010, SWPBIS was pilot tested in elementary and secondary schools, supported by consortium members who were trained to be SWPBIS coaches. Later on, the consortium started to train Dutch coaches to support schools in implementing SWPBIS at an independent level. Several modalities in supporting schools emerged: schools were coached by either an internal or an external SWPBIS coach, schools started without the guidance of a coach, and networks of SWPBIS schools arose (M. J. M. Nelen, van Oudheusden, & Goei, 2017). At a national level, SWPBIS experts participated in two teams exploring data-based decision making in SWPBIS, and adjusting and developing materials for schools. In 2015, a national SWPBIS leadership team was established. This team developed a procedure to assess SWPBIS implementation in Dutch schools, based on the TFI. In the beginning, SWPBIS was mostly embraced by elementary schools and schools for special education, followed by secondary schools. Currently, vocational education is also increasingly interested in working with SWPBIS (M. J. M. Nelen, Verveer, & Kamstra, 2020). No figures exist to date on how many Dutch schools are working with SWPBIS. The national SWPBIS leadership team estimated that approximately 350 schools are implementing SWPBIS (approximately 4.5% of all Dutch schools).
Research context
The studies in this dissertation are situated in the context of Dutch elementary education. The Netherlands has approximately 17 million inhabitants and a surface area of 41,543 km2. In 2019, there were 6,431 elementary schools (students aged 4-12 years), 638 secondary schools (students aged 12-16/18 years, depending on type of education), and 549 schools for special education (both elementary and secondary; www.onderwijsincijfers.nl). Since 2015, new legislation has urged schools to be more inclusive, but still approximately 2-5% of all students attend schools for special education. Many schools for primary education are relatively small (50% of schools have fewer than 200 students). As a result of freedom of education, all schools can decide how to educate their students and they all receive an allocated budget from the Dutch government. Many schools differ in religious affiliation (Catholic, Protestant, and so on) or in educational philosophy (Montessori, Dalton, or Jenaplan; www.statline.cbs.nl). The Dutch government determines the educational goals, and a national inspectorate monitors the quality of education in the schools by assessing the educational process, school climate, educational outcomes, school quality of education policy, and financial management (OCW, 2020). Dutch schools have the second highest amount of (teacher) autonomy in the world (after Japan) in choosing tests and curriculum (OECD, 2011). Parents are free to select a school of their choice, and costs are minimal.
The data that are part of the studies presented in this dissertation stem from collaborative work between consortium partners and their regional school and coaching partners. Schools were recruited through invitations posted on Dutch SWPBIS websites, flyers distributed at the national Dutch SWPBIS conference, and invitations sent by several SWPBIS expertise centers (mostly indirectly via SWPBIS coaches). Schools themselves also contacted the researchers asking if they could participate in the project. Most schools received support from an external SWPBIS coach, mainly at the beginning of the implementation process. Schools received no funding for participating in the studies. Schools were located in all Dutch provinces, except for Zeeland, both in rural and urban areas. Most schools, except nine, were already implementing SWPBIS at the start of study 2. The average period of implementing SWPBIS at study onset was 29 months (SD = 16.68). In total, the studies include data from 117 schools, 1,207 teachers, 22,336 students, and 96 SWPBIS professionals (including coaches).
Research questions
Most research on SWPBIS is US-oriented, and at the start of this research project, research in the Netherlands was mainly focused on describing practices in schools (Blonk, Das, Haasen, Hoetmer, & Wichers-Bots, 2014; NieuwMeesterschap, 2013; van Kuijk & van Rens, 2013). SWPBIS has a solid theoretical foundation; the next step in building effective interventions for the Dutch context is gathering evidence on SWPBIS effects (van Yperen, Veerman, & Bijl, 2017). The main objectives of this dissertation were threefold: (1) examining the cultural adaptation of SWPBIS to the Dutch educational context; (2) describing fidelity of implementation of SWPBIS in Dutch schools; and (3) exploring the relation between fidelity of implementation and student outcomes at school level. In this dissertation, the school is the unit of analysis.
The general research questions were:
1. How was SWPBIS modified to fit the Dutch educational context?
2. To what extent is SWPBIS implemented with fidelity in Dutch schools?
3. What is the relation between fidelity of implementation and student outcomes (i.e., social safety, behavior incidents, additional behavioral support) in Dutch elementary schools?
To answer these questions, three studies were conducted. First, as contextual fit plays an important role in successfully implementing SWPBIS, Dutch SWPBIS experts were questioned to examine which core features and procedures were known to them. With the introduction of SWPBIS in the Netherlands, several adaptations were made by a consortium of cooperating partners to make SWPBIS fit into the Dutch educational context. The importance of fidelity, and the possible tension between fidelity and contextual fit, emphasized the need to explore how the core features and procedures were further adapted to the Dutch context. Given the autonomy of SWPBIS coaches and schools, and the diversity of consortium partners, it was not clear whether Dutch SWPBIS experts held shared views about the core features, how they elaborated on the meaning and practical implications of core features, and how they reflected on the procedures. By drawing upon the perceptions of Dutch SWPBIS experts on the characteristics of SWPBIS as implemented in Dutch schools, we aimed to gain insight into the core features of SWPBIS in the Dutch context, how these experts defined and agreed upon these features, and the adaptation of procedures used to implement SWPBIS in Dutch schools.
The leading questions for the first study were: What are perceptions of Dutch experts on core features and procedures of SWPBIS in the Netherlands?
1. Which core features are identified by Dutch experts and how do they define these features?
2. How do Dutch experts reflect on procedures with regard to the Dutch school context?
Second, by questioning Dutch SWPBIS experts, we gained an impression of the core features and procedures of SWPBIS in the Netherlands. However, what SWPBIS actually looked like in Dutch schools remained unclear. Fidelity measurements could give a more adequate overview of the prevalence of core features and procedures in Dutch schools. For that purpose, two fidelity measures (the TFI and SET) were translated, pilot tested, and conducted in 117 Dutch schools.
Our main research questions for the second study were:
1. To what extent are core features and standard procedures of SWPBIS Tier 1 present in Dutch schools according to TFI and SET scores?
2. What are psychometric properties of the TFI and SET as they were modified to fit Dutch culture?
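As a rough indication of what examining psychometric properties can involve, the sketch below computes Cronbach's alpha as an index of internal consistency for a set of fidelity items. The data are fabricated for the example, and the analyses actually reported in Chapter 3 may use different procedures.

import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    # items: 2D array with rows = schools and columns = item scores.
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Made-up ratings: 6 schools scored on 5 items (0-2 scale).
ratings = np.array([
    [2, 2, 1, 2, 2],
    [1, 1, 0, 1, 1],
    [2, 1, 2, 2, 2],
    [0, 1, 0, 0, 1],
    [2, 2, 2, 1, 2],
    [1, 0, 1, 1, 0],
])
print(round(cronbach_alpha(ratings), 2))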
Third, to explore the effects of SWPBIS on student outcomes and examine the relation between fidelity of implementation and student outcomes at school level, we collected data for three consecutive years in Dutch elementary schools. The main objectives of SWPBIS are behavior related. The rationale is that by creating schoolwide systems that establish the social culture and a multi-tiered system of behavior support needed for a safe learning environment, social safety increases and problem behavior decreases. Therefore, the third study focused on fidelity of implementation and behavior outcomes at school level.
Our research questions were:
1. To what extent do fidelity of Tier 1 SWPBIS implementation and student outcomes (i.e., students' perceptions of social safety, the prevalence of behavior incidents, and the percentage of students receiving additional support for behavior) in Dutch elementary schools change over time?
2. What is the relation between SWPBIS Tier 1 fidelity of implementation and student outcomes in participating schools?
3. Is an increase of SWPBIS Tier 1 fidelity of implementation related to improvement in student outcomes in participating schools?
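To give an idea of how such school-level questions can be examined, the sketch below fits a linear mixed model with a random intercept per school, relating a Tier 1 fidelity score to a school-level outcome across measurement years. The variable names, data, and model specification are assumptions made for illustration; they do not reproduce the analyses reported in Chapter 4.

import pandas as pd
import statsmodels.formula.api as smf

# Long-format panel: one (fictitious) row per school per measurement year.
data = pd.DataFrame({
    "school":   ["s1", "s1", "s1", "s2", "s2", "s2",
                 "s3", "s3", "s3", "s4", "s4", "s4"],
    "year":     [0, 1, 2] * 4,
    "fidelity": [55, 68, 74, 40, 52, 60, 70, 78, 85, 35, 45, 58],  # % of points earned
    "safety":   [3.2, 3.4, 3.5, 3.0, 3.1, 3.3,
                 3.5, 3.6, 3.7, 2.9, 3.0, 3.2],  # mean perceived safety rating
})

# Fixed effects for year and fidelity; random intercept for school.
model = smf.mixedlm("safety ~ year + fidelity", data, groups=data["school"])
result = model.fit()
print(result.summary())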
Outline of the dissertation
Chapter 2 presents a qualitative study that focuses on the contextual fit of SWPBIS in the Netherlands. Sixteen Dutch SWPBIS experts were questioned on their opinions on core features and procedures of SWPBIS in Dutch schools. Chapter 3 focuses on measuring fidelity in Dutch schools. In a descriptive study, data from 117 Dutch schools implementing SWPBIS were analyzed to measure the prevalence of SWPBIS characteristics, and the psychometric properties of the fidelity measures TFI and SET as they were modified to the Dutch educational context. Chapter 4 reports on a longitudinal study into the relation between fidelity of SWPBIS implementation and student outcomes. Finally, Chapter 5 contains a general conclusion of this dissertation addressing its contribution to science and implications for practice. This chapter also provides a critical discussion of this dissertation and directions for future research.
Chapter 2. Cultural adaptation
Abstract
The transfer and adoption of schoolwide approaches, like School-Wide Positive Behavior Interventions and Supports (SWPBIS), from one country to another is an under-examined process. SWPBIS was mainly developed in the United States. Although research shows that implementation of SWPBIS contributes to a positive school climate and a decrease in problem behavior, little is known about the generalizability of the effects in other countries. Of special interest is the role of underlying cultural values and concepts as reflected in SWPBIS. This can influence the acceptance of teachers and principals when implementing SWPBIS in another country. SWPBIS procedures need to be adjusted to the educational context where it is implemented. As a consequence, fidelity of implementation can be at stake when adjustments not only affect SWPBIS procedures (e.g., the way expected behavior is taught), but also core features (e.g., teaching of behavior). In this study, we explored cultural adaptation efforts in the Netherlands. We have drawn on perceptions of Dutch SWPBIS experts. In two sessions, 12 and then 10 experts were questioned. Results suggested that core features of SWPBIS seemed to be quite consistent across cultures, but adaptations in procedures were necessary.

This chapter is based on: Nelen, M. J. M., Willemse, T. M., van Oudheusden, M. A., & Goei, S. L. (2019). Cultural challenges in adapting SWPBIS to a Dutch context. Journal of Positive Behavior Interventions. First published online. doi:10.1177/109830071987609
Introduction
The transfer and adoption of schoolwide approaches, like School-Wide Positive Behavior Interventions and Supports (SWPBIS), from one country to another is an under-examined process. SWPBIS is an approach developed in the US to guide schools in creating schoolwide systems that establish the social culture and individualized behavior supports needed for a safe and effective learning environment (Sugai & Horner, 2009). This approach provides schools with accurate systematic implementation and use of evidence-based practices related to behavior management in a multi-tiered system of behavior support (Sugai & Horner, 2009). There is sound and growing empirical evidence for the effectiveness of SWPBIS in diverse settings and contexts across the US (Benedict, Horner, & Squires, 2007; Carr & Pratt, 2007; Kutash, Duchnowski, & Lynn, 2006; Vaughn, Clarke, & Dunlap, 1997). Although effects of implementing SWPBIS have also been reported for other countries, such as Canada (McIntosh, Bennett, & Price, 2011), Australia (Yeung Alexander, Craven, Mooney, Tracey, & Barker, 2016), and Norway (Sørlie & Ogden, 2015), there is little or no documented work about the introduction and process of implementation of this US approach in countries with different cultural standards of behavior and social norms. Singer and Wang (2009) state that "many of the PBS features reflect values and beliefs embedded in the American mainstream culture that differ from beliefs found in some other cultures" (p. 39). Most research in this area deals with adapting SWPBIS strategies to different subcultural environments within the US (Bal, Schrader, Afacan, & Mawene, 2016; M. Wang, McCart, & Turnbull, 2007). With the growing interest in and subsequent spread of SWPBIS worldwide (APBS Newsletter, 2013, 2014, 2016), the need for research about what it takes to successfully implement a US approach in a foreign country with a different national culture is increasing. On the one hand, SWPBIS is not a treatment with a specific protocol. On the other hand, it has distinctive core features that need to be implemented with fidelity. Implementing with fidelity refers to the extent to which core features, prominent or essential components of SWPBIS, as conceptualized in a theoretical model or manual, are implemented as intended (Lane et al., 2004; Schulte et al., 2009). The research of Simonsen et al. (2012) shows that implementation fidelity is critical to achieve desired outcomes. A distinctive feature of SWPBIS is the so-called "contextual fit" (McIntosh et al., 2010): strategies and interventions are developed and modified in alignment with the context of the individual school. However, adaptations made to make SWPBIS fit more closely to the (national) school context must be in line with the conceptual foundations of the practice to avoid weakening the potential efficacy of the original practice (Ringwalt et al., 2004). When adapting strategies and interventions to make them fit to the context, fidelity of implementation can be at stake and, as a result, a tension between fidelity and fit might occur (Castro et al., 2004). This not only applies to implementing SWPBIS in diverse US cultural contexts, but also to implementation in national cultures different from the US.

In 2009, SWPBIS was introduced in the Netherlands. This study aimed to gain more insight into the contextual and cultural challenges of adapting SWPBIS to a setting that does not necessarily align with values rooted in the US context.
Core Features of SWPBIS
McIntosh, Mercer, Hume, Frank, and Turri (2013) stated that core features of SWPBIS need to be implemented with fidelity. However, we could not find an unambiguous description of the core features in the literature. This is partly because several authors have used slightly different concepts like "guiding principles", "characteristics", "key features", or "(core) features and procedures." In this article, we define a core feature as a prominent, essential component of SWPBIS. With the introduction of SWPBIS in the Netherlands, core features were summarized in what is known as "the (Dutch) five key features of SWPBIS" (M. J. M. Nelen et al., 2016): 1) Schoolwide approach based on shared values, 2) Prevention (including a multi-tiered system of support, and consistent response to problem behaviors), 3) Teaching expectations and acknowledging positive behavior, 4) Data-driven decision making, and 5) Partnership with parents and cooperation with stakeholders. These features were identified based on extensive study of key publications, such as the Handbook of Positive Behavior Support (Sailor, Dunlap, Sugai, & Horner, 2009) and the guiding principles mentioned in the PBIS implementation blueprints (Office of Special Education Programs, 2004). In addition, Horner, Blitz, and Ross (2014) make a distinction between the core features of an intervention, which are considered to be constant across settings, and the procedures, which vary according to context, that are used to put those core features in place. Teaching social behavior is, for example, a core feature, while selecting specific behaviors to be taught, and the way of teaching, are considered procedures.
Implementing SWPBIS
Successful and sustainable implementation of SWPBIS depends on the way members of a school community align the framework to the school organization and culture (Fallon, O'Keeffe, & Sugai, 2012). This is called contextual fit or "environmental redesign" (McIntosh et al., 2010). Albin, Lucyshyn, Horner, and Flannery (1996) defined contextual fit as the congruency between the core features of a practice and the identified needs and environments of a school. Horner et al. (2014) specified contextual fit as "the match between strategies, procedures, or elements of an intervention and the values, needs, skills, and resources available in a setting" (p. 1). Moreover, in order to foster the cultural fit, it is important to take notice of the cultural characteristics of a country and the way it organizes education (e.g., legislation, funding and resources, educational structure, school size, and support systems inside and outside schools). In addition, it is also important to take into account both the perceptions of those who implement, typically teachers, and those who receive the intervention, typically students and families. With regard to the latter, Sugai, O'Keeffe, and Fallon (2012) recommended considering the cultural and contextual learning histories of students and their families when designing and implementing SWPBIS practices in the area of assessment, interventions, and evaluation. This was endorsed by Vincent, Randall, Cartledge, Tobin, and Swain-Bradway (2011), who argued that teachers' knowledge of cultural dimensions (e.g., collectivistic versus individualistic orientation, expressiveness, communication styles, interactions between generations, the role and status of authority and language) is necessary, because culturally responsive practices function as mediators, affecting the manner and extent to which implementation of core features of SWPBIS leads to desired outcomes.

Perceptions of educational professionals are, like those of students and their families, grounded in national culture. According to Kincaid, Childs, Blase, and Wallace (2007), a lack of teacher support (including philosophical differences about core elements of the approach) is the most important barrier to successful and high-quality implementation of SWPBIS. After all, implementation of a schoolwide approach often depends on individual classroom teachers, whose regular interactions with students should be consistent with the core features of the approach (Han & Weiss, 2005). Personal beliefs, values, and motivation are influenced by the dominant culture in which a person is raised (Jones, Ross, Lynam, Perez, & Leitch, 2011). Therefore, personal
beliefs, values, and motivation are strongly linked with the acceptance of an approach (also referred to as "buy-in") and, consequently, with implementation fidelity and effectiveness (McIntosh, Mercer, et al., 2013). Swain-Bradway, Pinkney, and Flannery (2015) report that staff buy-in is an important condition for successful and sustainable implementation of SWPBIS. To maximize staff buy-in, it is necessary to take into account differences in customs, traditions, and underlying values. Therefore, it is highly relevant to understand the "national culture" in which teachers participate. Hofstede (1986) stated that the underlying values of teachers are acquired in childhood and grounded in national cultures and, as a result, are hard to change. He suggested, based on a comparison of national groups on cultural differences regarding teaching and learning, that there are some differences between U.S. and Dutch teachers. For example, according to Hofstede's dimensions of national culture (1986), in the Netherlands, teachers often avoid openly praising students. Mutual solidarity is often more important than competition between students. In contrast, according to Hofstede (1986), for teachers in the US praise seems to be more common, and they focus more on fostering competition and excellence in students. These differences in national culture may influence the acceptance of SWPBIS practices in Dutch schools. For instance, many Dutch teachers are not familiar with the theory and practice of applied behavior analysis (ABA). In general, they hold negative preconceptions about systematic use of praise and tokens (van Kuijk & van Rens, 2013). Although praise and token reinforcement are common in SWPBIS, they are not the only means of reinforcing student behavior. Positive relationships, compliments, and gestures such as a thumbs up can also function to reinforce student behavior. Targeted professional development in behavioral theory, principles, and procedures is therefore needed to create systems that include a continuum of positive reinforcement procedures that are socially acceptable in a Dutch context.

A lack of cultural fit can lead to limited commitment and engagement by those involved. Discovering and understanding how educational professionals, such as teachers, and other stakeholders in different countries would respond to SWPBIS is therefore crucial to increase the cultural relevance and the efficacy of the intervention.
SWPBIS in the Netherlands
To understand the cultural fit to the Dutch context, the educational system and the introduction of SWPBIS in the Netherlands are described. The Netherlands has approximately 17 million inhabitants and a surface area of 41,543 km2. There are 6,431 schools for primary education and 638 schools for secondary education. Many schools for primary education are relatively small (50% of schools have fewer than 200 students). Freedom of education is a Dutch constitutional right. All schools can decide how to educate their students and they all receive an allocated budget from the Dutch government. Many schools differ in religious affiliation (Catholic, Protestant, and so on) or in educational philosophy (Montessori, Dalton, or Jenaplan). The Dutch government establishes the educational goals, and a national inspectorate monitors the quality of education in the schools. Dutch schools have the second highest amount of (teacher) autonomy in the world in choosing tests and curriculum (OECD, 2011). Parents are free to choose a school, and costs are minimal.
SWPBIS was introduced in the Netherlands in 2009 by a consortium of universities of applied sciences and youth care organizations that collaborated in adapting SWPBIS for the Dutch context. Consortium members developed the five key features of SWPBIS, as mentioned earlier, based on studying PBIS literature, translating core features and procedures into Dutch, and discussing which terminology to use to make concepts accessible for practitioners. Development was mainly driven by the urge to provide a clear overview of the content for the field of education. The focus of the consortium was to pilot SWPBIS in Dutch schools. For that purpose, consortium members developed SWPBIS implementation guides, training, and other materials for Dutch educational professionals in order to support schools implementing SWPBIS. Later on, when the five features were evaluated (M. J. M. Nelen et al., 2016), consortium members found that some of them were conceptual (Feature 1, Schoolwide approach), while others were more procedural (Feature 3, Teaching and acknowledging behavior). There seemed to be an overlap between some features, especially between Features 2 (Prevention) and 3 (Teaching and acknowledging behavior). For clarification purposes, the five features were revised only in details, mainly because these features were already widely adopted in many practices. However, the actual development of the Dutch key features was not part of this study. The main goal was to evaluate them to see if, and how, these features were used in schools.
Purpose of this study
The importance of the delity of implementation of SWPBIS, and the possible tension
between delity, contextual, and (national) cultural t, emphasize the need to explore
how the core features and procedures are further adapted to the Dutch context.
Despite general adaptations for the Dutch context, and given the autonomy of SWPBIS
coaches and schools, and the diversity of consortium par tners, it was not clear if
Dutch SWPBIS experts held shared views about the core features, how they elaborated
on the meaning and practical implications of core features, and how they reected
on the procedures. By drawing upon the perceptions of Dutch SWPBIS experts on the
characteristics of SWPBIS as implemented in Dutch schools, we aimed to gain insight
into the core features of SWPBIS in the Dutch context, how these experts dene and
agree on these features, and the adaptation of procedures used to implement SWPBIS
in Dutch schools.
Scientically, the adaptation of SWPBIS to a non-US culture is relatively unexplored.
Therefore, this explorative and descriptive study has the potential to reveal how adap-
tations actually manifest themselves in the specific cultural context of the Nether-
lands. The results of this study can also contribute to enhancing the implementation
steps of SWPBIS in the Netherlands.
The leading questions for the first study were: What are perceptions of Dutch experts
on core features and procedures of SWPBIS in the Netherlands?
1. Which core features are identified by Dutch experts and how do they
define these features?
2. How do Dutch experts reflect on procedures with regard to the Dutch
school context?
Method
Research Design
The study used an explorative and qualitative design that evaluated perceptions of
Dutch SWPBIS experts on core features of SWPBIS through a two-step systematic
assessment consisting of an online survey and an online discussion meeting.
Participants
The criteria for participation were that individuals had at least 3 years of experience in
coaching and training SWPBIS and worked as either an internal or external SWPBIS
coach in schools. Preferably, experts also carried out research in the domain of
SWPBIS, or delivered SWPBIS coach training. All experts received formal SWPBIS
training and were actively engaged in coaching schools; some also published in
professional Dutch journals. Two experts were part of the initial consortium introducing
SWPBIS in the Netherlands. Six experts from session one also participated in session two.
Additional information about the experts' characteristics can be found in Table 1.
Measures
Session One: online survey. To discern which particular core features were identified
by the experts, how they defined these features, and to gather a deeper understanding
of whether any consistency appeared among experts, an online survey was developed by two
members of the research team. Based on feedback from the other members, the survey
was improved by adding questions about Features 2 and 3, because of the overlap and
complexity of these features. To guarantee the validity of experts' perceptions, the first
question invited them to reflect on what they considered core features of SWPBIS in
the Netherlands without presenting any information about the core features. In addi-
tion, experts were asked to elaborate on their understanding of the five identified features
via questions 2-10. Finally, experts could add features and characteristics, which they
felt the researchers did not include in the survey, in the final question (see Table 2 for
survey questions).
Table 1. Participants (Session 1 / Session 2)

Gender: 10 female, 1 male, 1 unknown / 9 female, 1 male
Highest level of education: 8 Masters, 4 Bachelors (all with additional courses) / 8 Masters, 2 Bachelors (all with additional courses)
PBIS experience: M = 4.27 years / M = 4.30 years
PBIS roles: 10 external PBIS coaches (one also being researcher/coach trainer, two also being coach trainers, one also being a principal) and 2 internal PBIS coaches / 8 external PBIS coaches (one also being researcher/coach trainer, three also being coach trainers) and 2 internal PBIS coaches

Table 2. Survey questions related to the Dutch key features of SWPBIS

I. What are, in your opinion, core features of SWPBIS in the Netherlands?
II. Please explain what you consider characteristic/essential about addressing behavior challenges schoolwide based on shared values? (Feature 1)
III. Please explain what you consider characteristic/essential about responding at a systematic level to problem behavior? (Feature 2)
IV. Please explain what you consider characteristic/essential about preventing problem behavior? (Feature 2)
V. Please explain what you consider characteristic/essential about responding at a systematic level to desired behavior? (Feature 3)
VI. Please explain what you consider characteristic/essential about teaching expectations? (Feature 3)
VII. Please explain what you consider characteristic/essential about acknowledging positive behavior? (Feature 3)
VIII. Please explain what you consider characteristic/essential about data-driven decision making? (Feature 4)
IX. Please explain what you consider characteristic/essential about collaboration with parents? (Feature 5)
X. Please explain what you consider characteristic/essential about cooperating with stakeholders? (Feature 5)
XI. Did you miss any characteristics or core features about SWPBIS in the Dutch context in this survey which you would like to add?
Session Two: online discussion. The aim of the second session was to examine how
core features were translated into procedures, if experts agreed on procedures, and
which arguments were used to choose particular procedures. Therefore, seven pro-
positions were developed based on the answers to the survey, to be discussed in an
online meeting with experts (see Table 3). Each proposition was formulated to provoke
discussion in order to identify (dis)agreements among participants.
Procedure
Data were obtained in two sessions conducted by the research team consisting of all
authors. Dutch SWPBIS experts were recruited from several professional SWPBIS-net-
works in the Netherlands and via the annual national SWPBIS conference. The first
author sent out an invitation email to 64 experts to participate in an online survey in
Formdesk (www.formdesk.com) through a link embedded in the email. Within the 4
weeks given, twelve experts responded. Answers were anonymous; no identifying in-
formation could be retrieved. Respondents consented to participating for the purpose
of research by clicking ‘submit’ at the end of the survey. In the second session, experts
were asked to voice their opinions on SWPBIS procedures in an online discussion. For
this purpose, we scheduled an online meeting and invited the original 64 experts to
participate by sending out another invitation email. Ultimately, 10 experts were able to
participate in session two. Six of them indicated they had already taken part in the survey of
session one.
Development of the propositions. The development and selection of the propositions
took place in three steps. First, data from session one were analyzed. One member of
the research team categorized all survey text fragments (e.g., "teaching behavior" or
"use of tokens"). Next, these categories were randomly divided among the other members
of the research team and then analyzed (L. Cohen, Manion, & Morrison, 2000). The
division took place by allocating the first category to the first member, the second to the
second, and so on. Then, categories and allocation of text fragments were compared.
Differences were discussed in the research team until consensus was reached (inves-
tigator triangulation, Brantlinger, Jimenez, Klingner, Pugach, & Richardson, 2005). This
resulted in seven themes about SWPBIS procedures, related to the five key features:
use of tokens, guidelines for reaction procedures for problem behavior, autonomy of
teachers when implementing SWPBIS, methods for teaching expectations, level of
parental involvement, the need for stakeholder cooperation during implementation,
and data-driven decision making. Second, the research team discussed in what way
the themes were also a subject of discussion in the national networks, two authors being
involved in two different national network groups. Propositions emerged from
the themes to reflect more specific perceptions about the procedures associated with
SWPBIS. Third, the research team made sure that for all five features there was at least
one proposition, taking into account the discussions at the national level. This resulted
in seven propositions (see Table 3).
Online discussion. For the online discussion, a chat room (Adobe Connect) was used.
All participants received information via email about time and date, login number and
procedure, and the formulated propositions one week prior to discussion. To prevent
bias, the participants were numbered and participated anonymously in the discus-
sion. The video and audio functions were disabled for participants, so they could only
type their answers. All propositions were discussed separately. The discussion started
with a poll in which participants could vote whether they agreed or disagreed with the
proposition. Then those who voted for and those who voted against were invited to share
and discuss their arguments. The online discussion was sometimes confusing. Not all
responses were clear. Some participants wrote longer responses than others. Sometimes,
responses referred to an earlier response in the discussion. Therefore, the discussion was
moderated by members of the research team. Researchers asked clarifying questions
when necessary (mostly orally, sometimes in writing) such as, "As I read all the
comments above, one can say that six of you agree with the statement of participant
number one that all PBIS schools should always use tokens. Is that a correct assump-
tion?" After discussing a proposition, a member of the research team summarized, in
writing, what all participants agreed on, such as, "So everyone agrees on the fact that
the use of tokens is an important tool to reinforce 'new' behavior." The duration of the
online discussion was approximately 2 hours. Proposition seven was not discussed due
to a lack of time. All remarks were saved in Word, and sent to participants for an extra
member check (Brantlinger et al., 2005).
Table 3. Online discussion: Propositions and expert voting (n = 10)

Proposition 1: The use of tokens is an indispensable element of the positive approach of SWPBIS. Agreement: 70%
Proposition 2: Within a SWPBIS context, the reaction procedure (to respond to problem behavior) should be carried out according to strict guidelines. Agreement: 80%
Proposition 3: Inside the classroom teachers decide how to conduct SWPBIS practices. Agreement: 20%
Proposition 4: It does not matter how expectations are taught to students, as long as this is done. Agreement: 40%
Proposition 5: Considering SWPBIS, one cannot identify (school) values without parents involved in this procedure. Agreement: 20%
Proposition 6: In implementing SWPBIS in a school, cooperation with stakeholders (family support systems or youth care) is obligatory. Agreement: 90%
Proposition 7: Every school can decide for themselves which data is suitable for data-driven decision making. Agreement: 40%
Analysis. Data were inductively analyzed by the research team. The (main) unit
of analysis was all written answers on the survey from session one and all written
utterances from the discussion of the experts in session two. For Session 2, each
proposition was divided randomly among the research team. First, each member
of the research team read and reread the discussion transcripts in order to identi-
fy signal words such as “tokens” or “parent participation.” Second, for each signal
word identied in the text, utterances were selected and grouped, and displayed in
a table (Miles, Huberman, & Saldana, 2014; Patton, 2015). Third, another member of
the research team coded the same text fragments and also summarized them in a
table (e.g., “boundaries of a schoolwide approach” or “added value of cooperating with
parents”). The second researcher was responsible for comparing both tables. When the
formulation of the codes was different, this was discussed in the research team and
a final code was chosen based on consensus. Mostly, the codes chosen were almost
the same (e.g., 'when to use tokens' or 'reason to use tokens'). When a specific text
fragment was coded differently, this also was discussed by the research team until full
agreement was reached. As such, the codebook was developed by discussion of the
codes by the research team. Using this procedure, there was 100% agreement in the
allocation of the text fragments to the codes. In four meetings, the research team
ultimately identified 31 specific codes, derived from the research questions as
sensitizing concepts (Strauss & Corbin, 1998).
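To make the agreement check concrete, the short sketch below computes a percent agreement between two coders' allocations of text fragments to codes; the code labels and the helper function are illustrative only and are not part of the study's materials.

# Illustrative sketch (not part of the study): percent agreement between two
# coders who each assigned the same set of text fragments to codes.
def percent_agreement(codes_coder_1, codes_coder_2):
    """Share of fragments that both coders allocated to the same code."""
    assert len(codes_coder_1) == len(codes_coder_2)
    matches = sum(a == b for a, b in zip(codes_coder_1, codes_coder_2))
    return 100 * matches / len(codes_coder_1)

# Hypothetical allocations for three fragments:
coder_1 = ["when to use tokens", "reason to use tokens", "parent involvement"]
coder_2 = ["when to use tokens", "reason to use tokens", "parent involvement"]
print(percent_agreement(coder_1, coder_2))  # 100.0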
Results
Survey: Identifying core features. The first question invited the 12 experts to reflect
on core features. Most experts' answers referred in one way or another to core features
as identified in the Dutch five key features of SWPBIS (see Table 4). They made
71 remarks in total. In questions 2-10 of the survey, experts were asked to define their
understanding of the presented aspects of the five features (see Table 5). With regard
to Feature 1 (Schoolwide approach based on shared values), most experts emphasized
the importance of identifying shared school values in relation to behavioral expec-
tations. These values should be visualized, established in cooperation with all team
members, and shared and discussed with students and parents. In the experts' views,
values and expectations needed to be connected with the school vision and mission
statement.
Considering Feature 3 (Teaching expectations and acknowledging positive behavior),
experts provided many examples of positive reinforcement, emphasizing this as an
important characteristic of this feature. Examples included, "A token is a reminder for
a teacher to pay attention to positive student behavior" and "There are many ways to
provide positive feedback: praise, non-verbal signals, a positive note home etc." Out
of the 12 experts, eight wrote about the importance of a positive teacher attitude and
argued that a characteristic of this feature is that all teachers act in the same way.
Others (n = 4) expressed their concern that teaching expectations and acknowledging
positive behavior should be done in an authentic way (suitable for each individual
teacher), for example stating, "It is also important to look for less American
style reinforcers." It seemed that at least some of the experts perceived some char-
acteristics of SWPBIS as a more US approach that was, according to them, not always
suitable for the Dutch context.
With regard to Feature 4 (Data-driven decision making), it is important to note that
Office Discipline Referrals (ODRs) are not used in Dutch schools. Instead, behavior
incident forms were developed to track data on student behavior. Examples of
Feature 4 provided by Dutch experts were registration of behavior incidents, data
about academic and social development of students, and data about the process of
implementation and the level of fidelity (e.g., "Twice a year, our teachers fill in the
Strengths and Difficulties Questionnaire as an inventory of problems" and "We collect
data at several levels, such as opinions of staff and data on student behavior").
Table 4. Results survey: Question one

Dutch features / # of remarks* / Examples of quotes

1. Schoolwide approach / 10 (n = 9**) / "SWPBIS is a schoolwide approach for all participants in school community, teaching and non-teaching staff, students, parents" and "SWPBIS is based on shared values"
2. Prevention / 14 (n = 11) / "Schools are creating a multi-tiered system of support", "The main focus of SWPBIS is on prevention", and "Each school must create a consistent system for responding to problem behavior"
3. Teaching and acknowledging behavior / 19 (n = 10) / "One defines and teaches schoolwide expectations, which are related to school values" and "SWPBIS is all about a positive approach with a focus on what is going well", and "You should acknowledge positive behavior systematically"
4. Data-driven decision making / 11 (n = 8) / "Typical for SWPBIS is a systematic collection and use of data", and "Action planning based on data is important", and "Systematic implementation of SWPBIS"
5. Cooperation / 8 (n = 6) / "It's important to cooperate with parents" and "Somehow, stakeholders should be involved"

* Total number of remarks for question one = 71
** n = number of experts making the remarks.
Finally, many remarks (n = 54) were provided about the collaboration with parents
or stakeholders. A majority (n = 24) appeared to be examples about how to inform
parents (e.g., "Some schools have parent's panels to discuss school related issues" and
"Schools do not only organize SWPBIS information evenings, but also coffee meetings
to discuss relevant themes"), or about the importance of building positive relationships
with parents (e.g., "Schools strive to make parents feel important partners.").
Online discussion: Experts’ opinions about the Dutch adaptations regarding
procedures. Experts shared the same views about most of the propositions. With
regard to Proposition 3, Inside the classroom, teachers decide how to conduct SWPBIS
practices, eight out of ten experts emphasized the importance of a schoolwide
approach: Teacher autonomy in class is limited by the boundaries of schoolwide
agreements to create a predictable school environment. However, 15 comments (of
63) were also made about the Dutch freedom of education that provides a certain
amount of autonomy for teachers. One expert emphasized, "We have to be aware of
the Dutch culture, teachers are not robots and they are allowed to add a personal
touch, even working with SWPBIS." With regard to Proposition 2, Within a SWPBIS
context, the reaction procedure [to respond to problem behavior] should be carried
out according to strict guidelines, eight out of ten experts agreed, which may indicate
this is an important SWPBIS issue. When responding to problem behavior, it was
considered essential to give students a choice to strengthen their self-regulation.
Other important aspects mentioned were to minimize attention for (minor) problem
behavior, make expectations clear, and provide clear consequences that will actually
be followed. However, seven experts emphasized the importance of a uniform way
(corresponding to guidelines) of responding to problem behavior, whereas five experts
argued that it is not about a specific procedure, but up to teachers' and schools'
autonomy. The same discussion appeared in relation to Proposition 1, The use of tokens
is an indispensable element of the positive approach of SWPBIS, and 4, It does not
matter how expectations are taught to students, as long as this is done. Discussing
these propositions took more time than the other propositions. This was possibly
due to the controversy about the subject. All experts agreed on the importance of
using tokens in learning new behavior, indicating it stimulates students to behave
according to the behavioral expectations and reminds teachers to focus on positive
behavior instead of challenging behavior. Four experts emphasized the importance of
building positive relationships with students and supporting a positive school climate
by using positive social reinforcements (a compliment, thumbs up etc.). Statements
included, "A positive attitude and strong relationships with students are the most
powerful tools for a teacher." With regard to teaching expectations, all experts agreed
on the importance of actually teaching expectations and not just mentioning them
as being relevant. However, six out of ten experts thought it is not necessary to teach
expectations according to formalized steps, for example stating, "There are different
ways to teach expectations." The discussion seemed to divide the group of experts,
with one group wanting to follow strict procedures, and the other to leave more room
for teachers’ personal practices. Finally, it was remarkable that when discussing
Feature 5, Partnership with parents and collaboration with stakeholders, most experts
emphasized the importance of this topic, but they also expressed the opinion that
parents were mainly to be informed, and school personnel decided what should
happen (e.g., "Parents should be involved, but it is schools that are in charge", which is
a rather limited concept of partnership). Concerning collaboration with stakeholders,
four experts also mentioned several pitfalls, like trying to involve too many
stakeholders. In sum, it seemed that all experts agreed upon the five core features,
although regarding procedures, some experts emphasized following strict procedures
more than others.
Discussion
The implementation of SWPBIS, whose core features and procedures reflect the values
and beliefs that are embedded in the US culture, might cause tension in schools in other
countries. Since SWPBIS is a framework for the implementation of evidence-based
practices, rather than a prescribed intervention or curriculum, it allows for the flexibil-
ity to align SWPBIS practices with the values, needs, skills, and resources in schools in
different cultural settings. However, fidelity of implementation can be at stake when
contextual fit efforts drift too far away from prominent and essential parts of SWPBIS.
In this study we explored the cultural adaptation of SWPBIS in the Netherlands drawing
upon the perceptions of Dutch SWPBIS experts. Research questions were, "Which core
features are identified by Dutch experts and how do they define these features?" and
"How do Dutch experts reflect on procedures with regard to the Dutch school context?"
Table 5. Results survey: Questions 2-10

Dutch features / # of remarks* / Examples of quotes

1 / 25 (of 27) / "Values and behavioral expectations are connected with school vision and mission statement"
2 / 54 (of 70) / "A school should reflect on suitable consequences, which are aimed at learning instead of punishing"
3 / 80 (of 86) / "It is important that teachers establish a positive focus" and "Expectations should be systematically taught and actively practiced, in which students play an active role"
4 / 32 (of 37) / "An user-friendly data system is very important for a school" and "Data managers must have basic knowledge on collecting and analyzing data"
5 / 56 (of 66) / "A school should take parents seriously, and work on building positive relations with parents"

* Total number of remarks for questions 2-10 = 299

We distinguished core features of an intervention that are constant across settings,
and procedures that can vary across contexts when putting those core elements in
place. All experts dened the ve key features of SWPBIS, which were formulated at
the introduction of SWPBIS in the Netherlands. At rst glance, consultation of experts
seemed to show that cultural adaptation was merely about adjusting certain proce-
dures. For example, Feature 5, Partnershipwithparentsandcooperationwithstake-
holders, was mentioned less by experts. Possibly, this can be explained by the fact
that in this stage of implementation, the focus is mainly on school related issues. When
SWPBIS elements in schools are established, a more external focus might arise. Proce-
dures that were adjusted to the Dutch cultural and educational context (lesson plans,
ways of responding to problem behavior, collecting data, and procedures to involve
students, parents, and professional partners outside education) were recognized and
agreed on by most experts.
One of the most striking issues discussed by Dutch experts was the use of token econ-
omy systems. All experts agreed on the fact that acknowledging student behavior was
an important core feature. Token economy systems were accepted as a powerful tool,
but needed to be adjusted to the context. Experts emphasized the importance of culturally
relevant social reinforcers, like compliments or thumbs up. They seemed to share Hof-
stede’s (1986) opinion that openly praising students is often considered “over the top.”
It is likely that the resistance to using token economy systems is based on a limited no-
tion of positive reinforcement, where applied behavior analysis is equated with praise
and tokens. However, SWPBIS and its technology are grounded in applied behavior
analysis (Sugai & Horner, 2009). This does not mean that all schools need to use proce-
dures like token economies, but establishing school systems that include a continuum
of positive reinforcement procedures is a fundamental element of SWPBIS. Targeted
professional development in behavioral theory, principles, and procedures, with em-
phasis on how principles such as reinforcement can be used in a variety of ways fitting
the specific context, is necessary for those involved in implementing SWPBIS.
Given the fact that acceptance of a schoolwide approach is linked to the personal be-
liefs, values and motivation of teachers, which are all grounded in one’s own historical
and cultural background, the extent to which SWPBIS reects important aspects of
U.S. culture needs to be taken into account when adapting SWPBIS to another (nation-
al) environment. Although research by Hofstede and colleagues (e.g., Degens, Endrass,
Hofstede, Beulens, & Andre, 2017; Minkov & Hofstede, 2014; Schwartz & Sagie, 2000;
P. B. Smith et al., 2002) indicates that there are differences between Dutch and US
culture, others argue against this concept of national culture (McSweeney, 2002).
Some professionals might consider SWPBIS to be just another US intervention (van
Kuijk & van Rens, 2013). These preconceptions may hinder staff buy-in and, therefore,
need to be considered. This could argue for adding a step to the process of SWPBIS
implementation in which the values and core features are overtly defined and dis-
cussed when SWPBIS is being considered for adoption outside the US. Another aspect
of Dutch culture, which may undermine staff buy-in and the accompanying cultural
adaptation, is the autonomy of Dutch educational professionals (OECD, 2011). Some
of the experts participating in this study emphasized the importance of taking this
autonomy into account when adapting SWPBIS to Dutch schools. Although we did not
study teachers' actual practices, the question can be raised whether teacher auto-
nomy might hinder staff buy-in of SWPBIS as a schoolwide approach.
This study also showed a diversity of opinions on how to use and develop SWPBIS pro-
cedures. Dierent implementation strategies seemed to emerge in the consultation of
experts. Roughly there seemed to be two leading tendencies, also described by Castro
et al. (2004) as a tension between delity and t: following strict procedures and tech-
niques according to a manual, or using techniques and implementation strategies in a
more exible way, modifying them to accommodate the needs of specic schools for
example using SWPBIS as a tool for school development. One limitation of this study is
that only a small group of experts was questioned. Therefore, careful consideration of
outcomes is necessary. However, this small group represented a variety of professional
backgrounds including both internal and external coaches, coach trainers and one re-
searcher, and experts trained by different training institutes. Even in this small sample
of Dutch SWPBIS experts, this diversity was present. Further research is necessary to
investigate how SWPBIS is practiced in Dutch schools and whether differences would
be found between schools supported by experts following strict procedures and those
who follow the procedures less strictly. Another limitation of this study is that imple-
mentation delity data have not yet been collected in schools to assess their eects
on experts’ opinions. Nevertheless, some ndings seem to endorse the argument for
careful adaptation of SWPBIS in other (sub)cultures (e.g., M. Wang et al., 2007). Fidelity
measures, such as SWPBIS Tiered Fidelity Inventory (TFI, McIntosh et al., 2017) and
Schoolwide Evaluation Tool (SET, Horner et al., 2004), reflect core features and standard
procedures, and might provide insight into the characteristics of SWPBIS in the
Netherlands in schools' daily practices. Still, this study clearly shows that
when implementing SWPBIS in countries outside the US, it is important to pay atten-
tion to existing cultural pre- or misconceptions about SWPBIS core features and pro-
cedures. Dierent interpretations of implementation strategies (strict or more exible
views) need to be taken into account.
Chapter 3.
Fidelity of
implementation
Abstract
Schoolwide positive behavioral interventions and sup-
ports (SWPBIS) is a schoolwide approach to create a safe
and positive school climate. SWPBIS is a framework in
which core features and procedures need to be adjusted
to its specic school context, referred to as contextual t.
Implementing with delity is related to positive outcomes
such as a decrease of behavioral problems. Therefore,
when adapting SWPBIS to the context, delity of imple-
mentation needs to be assured. At the introduction of SW-
PBIS in the Netherlands in 2009, several procedures were
adapted to the Dutch educational context, and dierent
modalities of supporting schools in implementing SWPBIS
emerged. In this study, the Tiered Fidelity Inventory (TFI)
and Schoolwide Evaluation Tool (SET) were used to assess
delity of Tier 1 implementation in 117 Dutch schools. The
average period of SWPBIS implementation was 2 years
and 5 months. Results showed that all core features and
procedures were present. Mean total scores were 60% for
the TFI and 70% for the SET. Most participating schools
appeared to have leadership teams, expectations were
taught, and acknowledgement provided. Teams had been
trained, and discipline data collected. Compared to other
features, annual evaluation, data-based decision making
and stakeholder involvement were less well implemented.
This chapter is based on: Nelen, M. J. M., Blonk, A., Scholte, R. H. J.,
& Denessen, E. (2020). School-Wide Positive Behavior Interventions
and Supports: delity of Tier 1 implementation in 117 Dutch schools.
Journal of Posi tive Behavior Interventions, 22(3), 156-166. https://
doi.org/10.1177/10983007198796
Introduction
Schoolwide positive behavioral interventions and supports (SWPBIS) is a schoolwide
approach to create a positive school climate. It has been developed in the US and
implemented in many other countries such as Australia, Canada, Norway, and the
Netherlands (APBSNewsletter, 2013, 2014, 2016). SWPBIS is based on behavioral and
biomedical sciences and can be applied to address problem behavior in schools. It is
not a program or treatment with a specic protocol and standardized interventions.
Rather, it is a framework with distinctive core features and standard procedures (e.g.,
a multi-tiered system of support, the teaching of behavior, the ongoing collection of
data for decision making, and the use of evidence-based practices) that need to be
aligned with the specic school context it is implemented in. Horner et al. (2014) stated
that core features are considered to be constant across settings. Procedures are
used to put core features in place and can vary according to context. When adapting
SWPBIS strategies and interventions to make them fit the context, called "contextual
fit" (McIntosh et al., 2010), fidelity of implementation (implementing an intervention
as intended) can be at stake, especially when the adaptations are not in line with the
theoretical basis of the framework. Fidelity measures, like the SWPBIS Tiered Fidelity
Inventory (TFI) and the Schoolwide Evaluation Tool (SET), reflect core features and
standard procedures of the framework and are used to determine the extent to which
a school is using SWPBIS (McIntosh et al., 2017).
In 2009, SWPBIS was introduced in the Netherlands. During the introduction, core
features were translated into Dutch, and specific procedures were developed to align
SWPBIS to the educational context. The present study aimed to examine the extent to
which SWPBIS Tier 1 core features and procedures were present in 117 Dutch schools
and whether SWPBIS was implemented with fidelity, given these cultural adaptations.
SWPBIS
SWPBIS supports schools in creating schoolwide systems that establish the social
climate and individualized behavior supports needed for a safe and eective learning
environment for all students (Sugai & Horner, 2009). It is aimed at reducing problem
behavior, improving school climate, and providing teachers with tools to improve prac-
tice. Research-validated practices and systems change are used to reach valued out-
comes, which are dened and operationalized by the school in which it is implemented
(OSEP, 2004). Research has shown that SWPBIS directly contributes to reduction of
referrals and suspensions, and indirectly to an improved classroom learning climate, a
decrease in segregation of students, and improvement of academic outcomes
(Algozinne, Wang, & Violette, 2011; Bradshaw, Mitchell, & Leaf, 2010; McIntosh, Reinke,
Kelm, & Sadler, 2013; Sørlie & Ogden, 2015).
Theoretical and conceptual characteristics of SWPBIS are described by Sugai and
Horner (2009) as: (a) the behavioral foundation of SWPBIS; (b) emphasis on preven-
tion in a multi-tiered system of behavior support; (c) teaching of behavior; (d) the use
of evidence- or research-based practices; (e) the implementation of systems that
support eective practices related to school safety; and (f) the on-going collection
and use of behavioral data to develop (preventive) strategies. A school that has imple-
mented SWPBIS at Tier 1 typically has established schoolwide behavioral expectations
which are being taught, systematically acknowledges positive student behavior, has
a schoolwide system for handling problem behavior (including procedures for how to
respond to problem behavior with consistent consequences), uses techniques such as
positive reinforcement and active supervision, and develops preventive interventions
based on behavioral data. A multi-tiered system of support is in place with universal
interventions for all students (Tier 1), targeted interventions for students who need
more support (Tier 2), and individual interventions for students with chronic or severe
behavior needs who need individualized support (Tier 3). A SWPBIS leadership team (a
delegation of staff including the administrator) is responsible for the implementation
process in school, establishing local capacity and expertise, majority agreements and
commitments, measuring fidelity of implementation, and outcome evaluation.
Contextual Fit
Contextual t, or environmental redesign as McIntosh et al. (2010) called it, is crucial
for successful implementation. Sugai et al. (2012) recommended considering cultur-
al contexts and learning histories of students and families, faculty, and community
members to further enhance implementation. They dened culture as “a reection of
a collection of common verbal and overt behaviors that are learned and maintained by
a set of similar social and environmental contingencies (i.e., learning history), and are
occasioned (or not) by actions and objects (i.e., stimuli) that dene a given setting or
context” (p. 204). Taking into account dierent contexts applies not only for schools in
diverse cultural settings within the US, but also in other countries. Indeed, implement-
ing SWPBIS in another country brings additional issues that need to be addressed.
Singer and Wang (2009) claimed that “many of the PBS features reect values and
beliefs embedded in the American mainstream culture that dier from beliefs found in
some other cultures” (p. 39). Therefore, not only do local contexts need to be taken into
account, but also important aspects of national culture, legislation and structures, and
underlying values and perceptions of educational professionals need to be considered,
as they all inuence the successful introduction and acceptance of an approach in one
specic country. This recommendation is endorsed by Bernal, Jimenez-Chafey, and
Rodriguez (2009), who stated that language, culture, and context need to be taken
into account when modifying an evidence-based program to make it compatible with
cultural patterns, meanings and values of those being served. M. Wang and Lam (2017)
argued that “EBP’s [evidence-based practices] often reect the dominant culture’s (as
the norm) inuence in dening what EBP is and determining what constitutes eective
interventions” (p. 54). Implementing practices with delity in education is challeng-
ing, and therefore those practices need to be culturally adapted to be eective and
sustainable. After specifying core components and causal mechanisms of a program,
implementation delity needs to be dened in a rigorous, adaptive, and exible way,
leaving room for cultural adaptation (M. Wang & Lam, 2017).
When SWPBIS was introduced in the Netherlands, core features were formulated in a
recognizable and culturally acceptable language, and procedures were adjusted to the
Dutch educational context. Earlier research (M. J. M. Nelen, Willemse, van Oudheusden,
& Goei, 2019) in which Dutch SWPBIS experts were questioned about core features and
procedures of SWPBIS in the Netherlands, showed that experts agreed on the im-
portance of all core features and that cultural adaptation was merely about adjusting
procedures. All adaptations seemed to be in line with the theoretical basis of SWPBIS.
However, to investigate delity of implementation, more empirical evidence, elabo-
rating what SWPBIS actually looks like in Dutch schools according to frequently used
instruments, is needed.
SWPBIS implementation in the Netherlands
Implementing in any setting requires an understanding of the cultural context in which
that setting is embedded. In the Netherlands there are 6,268 elementary schools (age
4-12 years), 638 secondary schools (age 12-16/18 years, depending on type of educa-
tion), and 549 schools for special education (both elementary and secondary schools).
Many elementary schools are relatively small (50% of elementary schools have fewer
than 200 students). Since 2015, new legislation urged schools to be more inclusive, but
still approximately 2-5% of all students attend schools for special education. Special
education is organized in four clusters based on the students’ impairments. Most of
the special schools (87%) serve students with learning disabilities or challenging be-
havioral and emotional disturbances. A typical special school has smaller classes and
more sta (both teaching and non-teaching sta) to support students. Freedom of
education is a Dutch constitutional right. This means that all schools are funded by the
Dutch government that prescribes national educational goals. A national inspectorate
monitors the quality of education in the schools.
In 2009, SWPBIS was introduced in the Netherlands by a consortium of universities of
applied sciences and youth care agencies. The consortium presented several adap-
tations in procedures to make SWPBIS fit the Dutch context. For the ongoing use of
behavioral data to develop preventive strategies, the consortium introduced a behav-
ior incident form. In Dutch schools, problem behavior is mostly classroom managed,
and Office Discipline Referrals (ODRs) do not exist. Installing SWPBIS student teams or
boards was encouraged, due to a strong emphasis on student involvement in Dutch
schools. A different way of collaborating with stakeholders was developed, based on
local legislation and different organizational structures. For example, because parents
often provide supervision at lunchtime in Dutch elementary schools, many schools
have developed SWPBIS training materials for these parents. Elements of behavioral
theory, more specically the use of token economy systems, are regularly met with re-
sistance of teachers (M. J. M. Nelen, Willemse, et al., 2019). Openly praising students is
in the Netherlands often considered “over the top.” Part of the resistance was probably
also due to a limited notion of the concept of positive reinforcement, where Applied
Behavior Analysis (ABA) is often equated with praise and tokens. Finally, to establish
sta buy-in, cultural adaptive coaching of implementation of SWPBIS was developed,
taking into account the high amount of autonomy of Dutch educational professionals
and the at hierarchy in Dutch schools (OECD, 2011).
Initially, the consortium trained SWPBIS coaches to support Dutch schools in imple-
menting SWPBIS. The consortium itself had no involvement in the implementation
processes in schools. After that, different modalities of supporting the implementation
of SWPBIS emerged: schools were coached by officially trained coaches or by coaches
who acquired SWPBIS knowledge just by reading (both internal and external coaches),
networks of SWPBIS schools arose, and schools started implementation on their own,
without guidance of a coach. Generally, two tendencies in implementation strategies
could be distinguished: following manualized SWPBIS procedures and techniques or
using implementation strategies and techniques in a more flexible way (M. J. M. Nelen,
Willemse, et al., 2019). Castro et al. (2004) referred to these tendencies as the tension
between fidelity and contextual fit. Today, SWPBIS has been implemented in approx-
imately 350 schools (approximately 4.5% of all Dutch schools). Most SWPBIS schools
in the Netherlands are elementary schools, although SWPBIS is also embraced by
special education. Implementation of SWPBIS in secondary and vocational education is
now increasing rapidly. Due to the broad variety in implementation strategies and the
autonomy of Dutch schools and coaches, which could lead to a less rigorous applica-
tion of the approach nationwide, it is not clear what SWPBIS looks like in daily practice.
Fidelity measures can give insight into which core features are present in Dutch schools.
Fidelity of Implementation
Fidelity of implementation refers to the extent to which components of an interven-
tion, as conceptualized in a theoretical model or manual, are implemented as intended
(Lane et al., 2004; Schulte et al., 2009). Many studies (Bradshaw et al., 2010; Flannery,
Fenning, Kato, & McIntosh, 2014; Horner et al., 2009; Simonsen et al., 2012; Sørlie
& Ogden, 2015) reported that fidelity of implementation is associated with positive
outcomes of SWPBIS such as a decrease of behavioral problems and an increase of
social safety. In these studies, assessing fidelity of school systems was operationalized
by measuring to what extent core features and standard procedures of SWPBIS were
present in schools. When a school reaches a certain degree of implementation, it is
considered to be implementing with fidelity. Regular measurement of fidelity of imple-
mentation is part of the SWPBIS framework.
To measure Tier 1 delity of implementation, several instruments have been developed.
Most of them are self-assessment instruments, meaning the SWPBIS leadership
team of a school completes a questionnaire (with or without guidance of an external
SWPBIS coach), which results in a score indicating the level of realized features.
Examples of these measures are the Benchmarks of Quality (BoQ, Kincaid et al., 2005),
Team Implementation Checklist (TIC, Sugai, Horner, & Lewis-Palmer, 2001) and the
PBIS Self-Assessment Survey (SAS, Sugai, Horner, & Todd, 2000). The most recent
fidelity measure developed is the SWPBIS Tiered Fidelity Inventory (TFI, Algozinne et
al., 2014), which is based on all former instruments and designed to be a briefer
and more comprehensive measure of fidelity. The TFI is completed by an external evaluator
(e.g., the PBIS coach) facilitating the PBIS leadership team. Apart from self-report
measures, the Schoolwide Evaluation Tool (SET, Horner et al., 2004) is a fidelity
measure that is completed by an independent SWPBIS expert. The SET is mostly used
in research studies because it is considered to be a more objective measure, being
completed by an external assessor; however, completion is more time consuming
compared to the other instruments. Almost all fidelity measures have been subject to
extensive research to validate them in the U.S. context (N = 105 schools for the BoQ, R.
Cohen, Kincaid, & Childs, 2007; N = 150 schools for the SET, Horner et al., 2004; N = 789
schools for the TFI, McIntosh et al., 2017).
Mercer, McIntosh, and Hoselton (2017) compared the convergent validity of several
SWPBIS Tier 1 delity measures (SET, TFI, BoQ, TIC, and SAS) to examine whether they
assessed the same construct and the extent to which comparable scores are ge-
nerated. They found that the measures were comparable to one another and that the
total scores can be used similarly to indicate the level of implementation. The cut-o
scores, used to determine whether a school is adequately implementing SWPBIS, dier
for the TFI and SET, with a 70% on the total score for the TFI, and an 80% both on the
total score and Behavioral Expectations Taught subscale for the SET indicating delity.
Mercer and colleagues (2017) found that Total scores on the SET were signicantly
higher than on the TFI (and all other measures). Correlation between TFI and SET was
high, although the TFI sample size was relatively low (r = .92, p < .001, n = 36). This was
due to the fact that fewer years of TFI data were available, and Mercer and colleagues
used an inclusion criterion of paired assessment within 30 days, which reduced the
number of assessments available for analysis.
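Expressed as decision rules, the cut-off scores described above look roughly as follows; this is a sketch under the assumption that total and subscale scores are already available as percentages, and the function names are ours, not part of either instrument.

# Sketch of the fidelity criteria described above (assumed inputs are percentages).
def tfi_meets_criterion(tfi_total_pct):
    """TFI criterion: Tier 1 total score of 70% or higher."""
    return tfi_total_pct >= 70.0

def set_meets_criterion(set_total_pct, expectations_taught_pct):
    """SET 80/80 criterion: at least 80% on the total score and on the
    Behavioral Expectations Taught subscale."""
    return set_total_pct >= 80.0 and expectations_taught_pct >= 80.0

# Hypothetical school: TFI 73%, SET total 82%, Behavioral Expectations Taught 78%.
print(tfi_meets_criterion(73.0))        # True
print(set_meets_criterion(82.0, 78.0))  # False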
In the present study, we used the TFI and the SET to measure Tier 1 fidelity of imple-
mentation in Dutch schools. The TFI was chosen because it is the most recently devel-
oped and up-to-date instrument, it is brief, and it is based on the factors and features
of existing validated fidelity measures. Only Tier 1 (universal SWPBIS features) of the
TFI was assessed due to the fact that most Dutch schools did not yet have Tier 2 and
3 systems in place. The SET was chosen to compare the TFI Tier 1 measurements with
more objective data. The goal of this study was to examine the extent to which core
features and procedures of SWPBIS were present in Dutch schools, and if SWPBIS Tier
1 was implemented with delity. We also wanted to evaluate the psychometric proper-
ties of the TFI and the SET as they were modied to t Dutch culture. For that purpose,
we completed both the TFI Tier 1 and the SET in 117 schools.
Our main research questions were:
1. TowhatextentarecorefeaturesandstandardproceduresofSWPBISTier1
presentinDutchschoolsaccordingtoTFIandSETscores?
2. WhatarepsychometricpropertiesoftheTFIandSETastheyweremodiedto
tDutchculture?
Method
Participating Schools
In this study, 117 Dutch schools participated: 92 elementary schools and 25 schools
for special education. Special education schools were both elementary and second-
ary schools. The average number of students per school was 191 (210 students for
elementary schools, 121 students for special schools). The average period of imple-
menting SWPBIS was 29 months (SD 16.68) for all schools, 28 months (SD 16.41) for
elementary, and 31 months (SD 17.85) for special education. Schools were recruited
through invitations posted on Dutch SWPBIS websites, flyers distributed at the national
Dutch SWPBIS conference, and through invitations sent by several SWPBIS expertise
centers (mostly indirectly via SWPBIS coaches). Schools themselves also contacted
the researchers asking if they could participate in the project. All participating schools
chose to implement SWPBIS voluntarily and financed the implementation process
themselves. Many, but not all schools, received support from an external SWPBIS
coach, mainly at the beginning of the implementation process. Researchers had no
involvement in implementing SWPBIS in participating schools.
Measures
Instrument translation. Both the TFI and the SET were translated into Dutch and
double checked by a native speaker. Small adjustments were made to use the proper
Dutch terminology, for example “Tier 1 team” was replaced by “SWPBIS team”. When in
doubt about a translation, two authors of the measures were consulted in person or
by email (i.e., Drs. Horner and McIntosh) to make sure that the original content was not
aected. Also scoring issues were discussed, for example, how to score the presence
of (a person with) expertise in behavioral theory on the SWPBIS leadership team. Each
item that needed to be discussed was carefully introduced by the researchers by
explaining the specic Dutch context. Questions or suggestions how to score were
presented to Drs. Horner and McIntosh, who both replied with detailed instructions for
both instruments. In the end, after several feedback rounds, the following changes
were made with consent of Drs. Horner and McIntosh: “schoolwide expectations,” visi-
ble in several items of both the TFI and the SET, was translated as “school values,” and
“school rules” was translated as “behavioral expectations.” “Discipline referral form”
was replaced by “behavior incident form.”
The SET was pilot tested in two elementary schools (M. J. M. Nelen & van Bergen, 2013)
and the TFI in six elementary schools. Based on the feedback of the SWPBIS coaches
who completed the instruments, small textual adjustments were made to clarify the
meaning of items in question. All adjustments made were discussed with and ap-
proved by Drs. Horner and McIntosh.
TFI and SET. The TFI Tier 1 (version 2.1) had 15 questions, divided into three subscales:
Team, Implementation, and Evaluation. Each subscale had different numbers of items.
The SET measures only Tier 1 and has seven subscales, called features ("A" through
"G"); each feature has a different number of items (see Table 8). All items on the TFI
can be scored 2 (fully implemented), 1 (partially implemented), or 0 (not implemented).
The SET has a similar way of scoring, apart from four questions (F3, F8, G1 and G2), for
which only 2 (item is present) or 0 (item is not present) can be scored. The total score,
indicating the level of realized features in schools, used in most of our analysis, was
the sum of all separate items. For the SET total score percentage, a weighted score
was used by adding all seven subscale scores (maximum score 1 per subscale), dividing
by 7 and multiplying by 100. For the TFI Tier 1 total score percentage, the sum of 15
items was divided by 30 (total possible score) and multiplied by 100 (see Table 6 for
total score in percentages of the TFI and SET). Both instruments were digitalized in a
web-based software program that was used to process data from questionnaires to
diminish the chance of missing data or errors and also provided safe storage conforming
to ethical standards.
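As an illustration of the scoring arithmetic just described, the sketch below turns raw TFI item scores and SET subscale proportions into total-score percentages; the function and variable names are ours and the example scores are hypothetical.

# Sketch of the total-score calculations described above.
def tfi_tier1_total_pct(item_scores):
    """item_scores: the 15 TFI Tier 1 items, each scored 0, 1, or 2."""
    assert len(item_scores) == 15
    return sum(item_scores) / 30 * 100   # divide by total possible score, times 100

def set_total_pct(subscale_scores):
    """subscale_scores: the 7 SET features A-G, each expressed as a proportion 0-1."""
    assert len(subscale_scores) == 7
    return sum(subscale_scores) / 7 * 100  # weighted across the seven subscales

# Hypothetical example scores:
print(round(tfi_tier1_total_pct([2, 1, 2, 1, 1, 1, 2, 1, 2, 1, 0, 1, 1, 2, 1]), 1))  # 63.3
print(round(set_total_pct([0.9, 0.8, 0.75, 0.6, 0.5, 0.7, 0.8]), 1))                 # 72.1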
Procedure
Data collection. All data were collected in school years 2015-2016 and 2016-2017.
Each school was assessed once. The TFI was completed by discussing the 15 ques-
tions of Tier 1 during a SWPBIS leadership team meeting, guided by an external SWPBIS
coach. Preferably, this coach also was (or had been) responsible for coaching the
school during SWPBIS implementation. If the school did not have an external coach
who could complete TFI Tier 1, one was provided. Prior to the leadership team meeting,
the SWPBIS coach briey interviewed students and sta, and made some observa-
tions.
For the SET, the procedure was different. A SWPBIS professional who did not have any
connection with the school in question visited the school to collect data. This assessor
conducted structured interviews with the principal, staff members, and students,
then reviewed developed products such as school policies, SWPBIS Handbook or
documents, and data systems. For example, to determine how well school values and
accompanying behavioral expectations had been taught, the assessor studied lesson
plans and asked at least 15 students and 10 staff members if they could state the
values and expectations of their school.
First, the TFI Tier 1 section was completed in schools. Following that, the SET was
completed within 2 weeks. If it was not possible to complete the TFI first, SET scores
(consisting of an overview of scored items, a SET scoring profile, and a written report)
were only sent to schools after completion of the TFI in order not to influence the TFI
measurement. This was the case at 31 schools. TFI scores were immediately available
after completion.
Assessors. The TFI and SET were completed by 82 SWPBIS professionals. Only pro-
fessionals who were familiar with PBIS and received PBIS training prior to the current
study were included as SET assessor. Only PBIS coaches, who were previously trained
as a PBIS coach and who were actually coaching one or more schools, were used as
assessors of the TFI. All assessors were selected and trained by the same researcher
during a 4-hr course in groups of 10 persons maximum. All items and scoring proce-
dures for both TFI and SET were discussed during training. Examples were provided to
practice scoring and to check if assessors demonstrated a minimal level of competen-
cy on the topics covered in the training. All coaches also received a manual and written
instructions. The rst author was available (by telephone or email) to answer questions
during or after completion of the instruments.
Interrater agreement. Interrater reliability was assessed for the SET (N = 10 schools).
In each school, two SWPBIS assessors collected data together but scored the SET form
independently. Completion of the SET is highly structured. Each step in the assess-
ment is manualized by using several checklists or forms in order to minimize variation
in scoring. For example, the assessor must count the areas in which school values are
visible, or note the number of staff members who could state the same procedure as
the principal. Cohen's kappa was used to determine agreement among observers.
Because scoring TFI items is based on discussions in the SWPBIS leadership team,
which makes independent scoring difficult, and because McIntosh et al. (2017) found a
strong agreement among raters (ICC for all tiers, all items and overall were all .99), we
decided not to measure the interrater reliability of this instrument.
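For readers unfamiliar with the statistic, the sketch below shows one way Cohen's kappa for two assessors scoring the same SET items could be computed; the item scores are hypothetical and the implementation is ours, not the procedure used in the study.

# Illustrative computation of Cohen's kappa for two assessors (hypothetical scores).
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n             # observed agreement
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in categories) / (n * n)  # chance agreement
    return (observed - expected) / (1 - expected)

assessor_1 = [2, 2, 1, 0, 2, 1, 2, 0, 1, 2]   # ten SET items scored 0, 1, or 2
assessor_2 = [2, 2, 1, 1, 2, 1, 2, 0, 0, 2]
print(round(cohens_kappa(assessor_1, assessor_2), 2))  # 0.68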
Research design. To determine to what extent core features and procedures of
SWPBIS were present in Dutch schools, we calculated frequencies of each item in both
the TFI and the SET. To check if completion of the TFI and the SET in Dutch schools
showed inconsistencies or remarkable discrepancies compared to the completion of
these instruments in U.S. schools, we repeated the analysis of Horner et al. (2004). We
calculated Cronbach's coefficient alpha to determine whether the internal consistency of
the Dutch TFI and SET was as strong as in the U.S. versions. We also conducted a se-
ries of correlational analyses (Pearson correlations) to determine content cohesiveness
and discriminability of items (for TFI Tier 1) or features (for the SET) and the total score
of both the TFI (Tier 1) and the SET in our sample. Because school types were not
equally distributed among participants, we conducted our analysis in two ways: all schools
grouped together, and separate analyses for elementary and special education
schools.
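A rough sketch of these analyses is given below, using numpy and randomly generated placeholder scores rather than the study data; with real TFI item scores, which are positively correlated, alpha would be substantially higher than for the random data used here.

# Sketch of the internal-consistency and correlation analyses (placeholder data).
import numpy as np

def cronbach_alpha(item_scores):
    """item_scores: 2-D array, rows = schools, columns = TFI Tier 1 items (0-2)."""
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]
    item_variances = item_scores.var(axis=0, ddof=1)
    total_variance = item_scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

rng = np.random.default_rng(0)
tfi_items = rng.integers(0, 3, size=(117, 15))          # 117 schools, 15 Tier 1 items
print(round(cronbach_alpha(tfi_items), 2))              # near 0 for random items

tfi_totals = tfi_items.sum(axis=1) / 30 * 100
set_totals = tfi_totals + rng.normal(0, 10, size=117)   # correlated by construction
print(round(np.corrcoef(tfi_totals, set_totals)[0, 1], 2))  # Pearson r between totals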
Results
Descriptive Analyses. Table 6 presents the TFI and SET total score in percentag-
es. Table 7 (TFI) and Table 8 (SET) provide basic descriptive statistics (means and
standard deviations) for all TFI and SET items. For the TFI, we calculated Cron-
bach’s alpha for all 15 Tier 1 items. The internal consistency was good: α = .83 for
all types of schools (α = .85 for elementary schools only, α = .73 for special schools
only). In the SET, calculation of Cronbach’s alpha was based on subscales: α = .73
for all types of schools (α = .74 for elementary schools only, α = .72 for special edu-
cation only). Cohen's kappa (interrater reliability, N = 10) varied from low (k = .12) to
almost perfect (k = .84). The average kappa score was moderate (k = .58) (Landis
& Koch, 1977). All core features and standard procedures of SWPBIS, as displayed
in the fidelity measures, occurred in participating schools. For some items, scores
were not equally divided among the scoring options. Results showed that 33% of all
participating schools met the TFI criteria for adequately implementing SWPBIS Tier
1, consisting of a TFI Tier 1 total score of 70% or more. The percentage of schools
reaching the cut-off score for the SET was 30% on the SET total score only, whereas
25% of all schools met the 80/80 criteria on both the total score and Behavioral
Expectations Taught subscale. Of all participating schools, 17% reached the cut-off
score on both TFI and SET (total/subscale). Below we describe the results of items
which represent high visibility of core features in schools and point out several
striking results.
Items in place. A large majority of schools (≥ 90%) met most of the requirements
regarding the SWPBIS leadership team, representing school staff members, and the
principal being an active member. Leadership team operating procedures (regular
team meetings, roles dened, and taking minutes) were also partially or fully imple-
mented in almost all schools. The same pattern was visible for teaching behavior:
In almost all participating schools, schoolwide behavioral expectations (e.g., “Be
respectful” or “Be responsible”) were established and systematically taught. A
high percentage of schools (66% for the TFI, 62-80% for the three SET items) fully
implemented procedures concerning feedback and acknowledgement, meaning
that schools had a reward system in place to systematically provide students with
positive feedback. Most participating schools used some kind of token economy
system, for example students collected tickets or marbles as both individual and
group rewards. The use of SWPBIS in individual classrooms could only be scored in
the TFI.
In most participating schools, SWPBIS classroom procedures were present to some extent (89% of all schools scored 1 or 2). Most schools paid attention to professional development of their school staff. Team members were trained in specific elements of Tier 1 interventions, such as teaching and acknowledging behavior, and responding to problem behavior. High scores on SET Item D3 "Documented crisis plan for responding to extreme dangerous situations" are due to requirements of Dutch law: all schools need to have a proper evacuation plan in place. In 52% of all participating schools, discipline data were being collected: these schools used a behavior incident form that met all the required criteria. A similar pattern was seen for SET feature E "Monitoring and Evaluation". On the TFI, in the area of collection and use of fidelity data (Item 1.14), 53% of all schools scored 2. In many schools, the implementation of SWPBIS was an important part of the school improvement plan (SET Item F1): 77% of all schools scored 2.
Table 6. Fidelity of Implementation (N = 117)

TFI                              M Total    Min       Max       SD       % Fid.   r TFI-SET
All schools (N = 117)            59.54%     3.33%     100%      18.92    33       .71**
Elementary education (n = 92)    58.88%     3.33%     100%      19.74    31       .71**
Special education (n = 25)       62.00%     26.67%    86.67%    15.63    36       .69**

SET                              M Total    Min       Max       SD       % Fid.   r TFI-SET
All schools (N = 117)            69.83%     27.80%    100%      15.83    25       .71**
Elementary education (n = 92)    66.24%     25.49%    89.87%    15.88    23       .71**
Special education (n = 25)       70.15%     46.93%    96.88%    14.01    28       .69**

Note: TFI = Tiered Fidelity Inventory; SET = Schoolwide Evaluation Tool; % Fid. = percentage of schools in the sample at or above the fidelity criterion of the measure (70% on total score for the TFI, and for the SET 80% both on total score and on the Behavioral Expectations Taught subscale); r TFI-SET = (Pearson) correlation between the total scores of the TFI and the SET; ** Correlation is significant at the 0.01 level (2-tailed).
Table 7. TFI Descriptive Data for Different School Types

TFI Tier 1                                     All schools (N = 117)   Elementary education (n = 92)   Special education (n = 25)
                                               M       SD              M       SD                       M       SD
1.1  Team composition                          1.34    0.49            1.40    0.51                     1.12    0.33
1.2  Team operating procedures                 1.47    0.55            1.48    0.52                     1.44    0.65
1.3  Behavioral expectations                   1.68    0.48            1.65    0.50                     1.76    0.44
1.4  Teaching expectations                     1.22    0.56            1.23    0.58                     1.20    0.50
1.5  Problem behavior definitions              0.99    0.76            0.90    0.77                     1.32    0.63
1.6  Discipline policies                       1.09    0.75            1.01    0.72                     1.36    0.81
1.7  Professional development                  1.27    0.68            1.24    0.68                     1.40    0.64
1.8  Classroom procedures                      1.22    0.62            1.20    0.62                     1.32    0.63
1.9  Feedback and acknowledgement              1.49    0.77            1.50    0.76                     1.44    0.82
1.10 Faculty development                       1.19    0.75            1.18    0.75                     1.20    0.76
1.11 Student/Family/Community involvement      0.75    0.74            0.76    0.75                     0.72    0.74
1.12 Discipline data                           1.25    0.86            1.17    0.87                     1.52    0.77
1.13 Data-based decision making                0.81    0.68            0.79    0.67                     0.88    0.73
1.14 Fidelity data                             1.31    0.81            1.36    0.79                     1.12    0.88
1.15 Annual evaluation                         0.79    0.70            0.78    0.72                     0.80    0.64

Note: Each item can be scored 0 (Not implemented), 1 (Partially implemented), or 2 (Fully implemented).
Table 8. SET Descriptive Data for Different School Types

SET                                                                           All schools (N = 117)   Elementary education (n = 92)   Special education (n = 25)
                                                                              M       SD              M       SD                       M       SD
A  Expectations defined                                                       2.85    1.23            2.72    1.27                     3.32    0.94
   1 Documentation on staff agreement school rules                            1.62    0.79            1.57    0.83                     1.80    0.58
   2 Expectations publicly posted                                             1.23    0.88            1.15    0.89                     1.52    0.77
B  Expectations taught                                                        6.90    2.03            6.96    2.02                     6.68    2.10
   1 Documented system for teaching behavioral expectations                   1.56    0.69            1.57    0.68                     1.52    0.71
   2 Staff states that teaching has occurred                                  1.54    0.70            1.59    0.67                     1.36    0.81
   3 Staff states that schoolwide program has been taught/reviewed            1.79    0.50            1.84    0.49                     1.64    0.49
   4 Students state school rules                                              0.50    0.70            0.48    0.69                     0.56    0.77
   5 Staff lists 67% of school rules                                          1.51    0.67            1.49    0.69                     1.60    0.58
C  Reward system                                                              4.97    1.54            4.93    1.56                     5.08    1.47
   1 Documented system for rewarding student behavior                         1.50    0.70            1.47    0.72                     1.64    0.64
   2 Students have received rewards                                           1.70    0.65            1.71    0.64                     1.68    0.69
   3 Staff has delivered rewards                                              1.76    0.52            1.76    0.54                     1.76    0.44
D  Violation system                                                           4.52    1.87            4.30    1.90                     5.32    1.52
   1 Documented system for dealing with and reporting specific behavioral
     violations                                                               1.28    0.80            1.18    0.81                     1.64    0.64
   2 Staff-administration agreement on what problems are office/classroom
     managed                                                                  0.92    0.82            0.84    0.83                     1.24    0.72
   3 Documented crisis plan for responding to extreme dangerous situations    1.68    0.67            1.71    0.62                     1.60    0.82
   4 Staff-administration agreement on procedures handling extreme
     emergencies                                                              0.63    0.75            0.58    0.71                     0.84    0.85
E  Monitoring and evaluation                                                  5.09    2.71            4.99    2.79                     5.48    2.38
   1 Discipline referral list present                                         1.44    0.85            1.39    0.88                     1.64    0.70
   2 System for collecting referral data                                      1.32    0.75            1.28    0.76                     1.48    0.71
   3 Discipline data reported to team                                         1.25    0.87            1.29    0.86                     1.08    0.91
   4 Discipline data being used for behavior support efforts                  1.08    0.87            1.02    0.86                     1.28    0.89
F  Management                                                                 13.90   2.59            13.05   2.52                     12.80   2.10
   1 SWPBIS is in top 3 school improvement plan                               1.64    0.70            1.67    0.68                     1.52    0.77
   2 Staff reports that there is a schoolwide team established to address
     behavior support systems                                                 1.93    0.31            1.93    0.32                     1.92    0.28
   3 Administrator reports team membership includes representation of
     all staff                                                                5.16    17.74           4.97    17.37                    5.88    19.40
   4 Staff can identify team leader                                           1.31    0.80            1.29    0.82                     1.36    0.76
   5 Administrator is an active member                                        1.66    0.63            1.73    0.59                     1.40    0.71
   6 Team meets monthly                                                       1.46    0.55            1.42    0.56                     1.60    0.50
   7 Team reports progress                                                    1.79    0.53            1.85    0.47                     1.60    0.71
   8 Recent action plan with goals less than one year old                     5.58    19.84           4.58    17.45                    9.28    27.02
G  District support                                                           2.58    1.12            2.57    1.12                     2.64    1.11
   1 Allocated budget for schoolwide behavioral support                       1.71    0.71            1.72    0.70                     1.68    0.75
   2 Out-of-school liaison in district or state                               0.87    1.00            0.85    0.99                     0.96    1.02

Note: Each item can be scored 0 (Not implemented), 1 (Partially implemented), or 2 (Fully implemented). For Items F3, F8, G1, and G2 only 0 (No) and 2 (Yes) could be scored.
Items partly or not implemented. Some aspects of SWPBIS seemed harder for participating schools to implement. Student, family, and community involvement was low. For data-based decision making, 15% of all schools met all the TFI criteria. However, 42% of all schools scored 2 on SET Item E4 "Discipline data being used for behavior support efforts". "Annual evaluation" is only measured with the TFI: 38% of all schools did not have any type of evaluation. For both the TFI and the SET, students were asked whether they could state the school values: on the SET, 62% of all schools scored 0 on this item. On the TFI, stating school values is part of "Teaching expectations": 64% of all schools scored 1 on this item.
Adapted items. Examining the items that represent procedures adapted to the Dutch context, we saw that teaching expectations met all the fidelity criteria, except for students being able to state the school values. Feedback and acknowledgement for positive student behavior was fully implemented in many participating schools. The replacement of ODRs by a behavior incident form still showed consistency between the corresponding TFI and SET items. Despite the Dutch focus on student and parent involvement, this TFI item scored low.
Comparing elementary and special education. Results for elementary and special education separately showed similar patterns. For some core features, however, we saw striking differences between elementary and special education. "Defining problem behavior" was partially or fully implemented in 92% of all participating special schools. In elementary schools, this percentage was lower (65% of schools scored 1 or 2). SET Item D1 "Availability of a documented system for dealing with and reporting on behavior violations" showed a similar pattern for special education: 72% of special schools fully implemented D1 (versus 43% of elementary schools). Agreement among staff on how to handle emergencies was difficult for both special and elementary schools to achieve: 44% of special schools and 55% of elementary schools scored 0 on this item. Finally, Tier 1 fidelity data were more commonly used in elementary than in special education.
Discussion
Two delity measures were used to describe SWPBIS core features and procedures
that were present in Dutch schools, and the extent to which Tier 1 was implemented
with delity. TFI Tier 1 and SET measurements in 117 Dutch schools showed that all core
features and standard procedures were partially or fully implemented. The correla-
tion between the TFI and SET scores was strong (ɾ = .71), although smaller than in the
research of Mercer et al. (2017) (ɾ = .92, n = 36). All other correlations in their research
varied from .59 to .71, comparable to the correlation we found in our sample. Mercer et al.
had a relatively small sample when the TFI was compared with other measures. This may
explain the high correlation between TFI and SET in their research. The data in this study
showed that the TFI and SET could be modied to t Dutch culture without weakening
the psychometric properties of the instruments. This allows comparisons of delity
scores across cultural contexts. Mean total scores were 60% for the TFI and 70% for
the SET. To compare, U.S. schools scored 74% on the TFI total score (SD = 24) in a study
by Kittelman, Eliason, Dickey, and McIntosh (2018). The mean SET total score was 10%
higher than TFI Tier 1 total score, which is consistent with ndings in U.S. schools (Mercer
et al., 2017). The percentage of schools in our sample meeting the criteria for adequately
implementing Tier 1 was lower than the percentages for U.S. schools found by Mercer et
al. (2017, p. 4): 33% for the TFI (58% of U.S. schools scored ≥ 70% on TFI Tier 1 total score)
and 25% for the SET (61% of U.S. schools scored ≥ 80% on both total score and Behavior
Expectation Taught subscale). More research is necessary to comprehend the dierence
in percentages of schools reaching the cut-o score for the TFI Tier 1 and the SET. When
only total scores of both instruments are compared, scores are comparable: 33% for
TFI – 30% for SET. Horner et al. (2004) and McIntosh et al. (2017) argued that a SET total
score of 80%, and correspondingly, a TFI total score of 70% are minimum levels to expect
positive outcomes. The results found in the current study suggest that, with an average
implementation period of 2 years and 5 months, reaching the criterion for adequate im-
plementation is not certain. This is endorsed by McIntosh, Mercer, et al. (2013) who stat-
ed that implementing SWPBIS with delity takes time, eort, and resources. Research
of Nese, Nese, McIntosh, Mercer, and Kittelman (2019) showed that the average time for
U.S. elementary schools from PBIS training to reaching the level of adequate implemen-
tation was 2 years. The lower scores for participating Dutch schools, compared to U.S.
schools, can probably be explained by the fact that SWPBIS was not only fairly new for
Dutch schools, but also for professionals coaching these schools. No routines were yet
developed, and all steps in the implementation process had to be discovered, which can
be compared with building a bridge while walking on it.
Cultural adaptations. With the introduction of SWPBIS in the Netherlands, a consortium of partners from education and care organizations discussed how to adapt core features into feasible interventions fitting the cultural context, learning histories, and values and beliefs of Dutch educators. In translating the TFI and SET, these adaptations were taken into account. Of special interest are those items and subscales that reflect issues where cultural adaptations were made. Schoolwide behavioral expectations in Dutch schools are grounded in school values, established at the start of implementation by all staff members. Although almost all schools systematically taught these expectations, many students were not able to state the school values. Presumably, teachers may not have actively connected the expectations taught with the value in question ("You should be quiet because it is respectful"). Another explanation can be that schoolwide values may have been too abstract for students and that specific school rules are easier for students to memorize.
The high percentage of schools that had a reward system fully implemented was also striking. In fact, this was the only item scoring at the same level as in U.S. schools (M = 1.49), whereas all other items scored lower (Kittelman et al., 2018). At the introduction of SWPBIS in the Netherlands, the systematic use of token economy systems met with resistance from teachers (M. J. M. Nelen, Willemse, et al., 2019). However, for most participating schools this no longer seemed to be an issue, as they clearly have managed to implement token economy systems according to the requirements. Further research is needed on how schools and coaches managed to influence teachers to implement a token system, given teachers' great autonomy.
Concerning another culturally adapted procedure, the behavior incident form as a replacement for ODRs, results showed that this form was commonly used (fully implemented at 52% of the schools). It seemed to be much harder to use these data to develop preventive strategies: data-based decision making was fully implemented in only 15% of participating schools. This could be explained by the fact that in Dutch schools, different teams are responsible for discussing behavioral and academic data. Therefore, few schools met all criteria (score 2) on this item. The 42% of all schools that fully implemented SET Item E4 "Discipline data being used for behavior support efforts" seems to confirm this explanation, as this SET item does not focus on discussing academic data. Moreover, each TFI item contains several aspects of the construct concerned and is therefore more complex to score than a single SET item. More research is needed to verify this explanation and to explore the way Dutch schools use data-driven decision making.
Other TFI Tier 1 and SET items where potential differences could occur were classroom procedures and stakeholder involvement. Most of the time, classroom procedures were in line with schoolwide procedures. This was an important finding because implementing with fidelity strongly depends on how teachers act in daily practice. McIntosh, Mercer, et al. (2013) found that school team functioning was highly related to sustainable implementation, including SWPBIS-congruent behaviors. According to Kincaid et al. (2007), a lack of teacher support can be an important barrier to successful implementation of SWPBIS. Dutch schools, however, are known for high teacher autonomy (OECD, 2011), which might lead to more variation in teacher behaviors, even behaviors not supporting SWPBIS practices. To fully implement SWPBIS, support may be needed to foster teachers' regular interactions with students and colleagues that are consistent with the core features of SWPBIS (Han & Weiss, 2005).
Student involvement and cooperation with parents are important issues in the Netherlands. However, on the stakeholder involvement item, 43% of all schools scored 0. The mean score for U.S. schools was also low (M = 1.08; Kittelman et al., 2018). The fact that the involvement of students, parents, and community members was measured in a single item could have influenced this score. Another explanation can be that schools' focus at the start of implementation was probably more internally oriented and less directed at possible partners outside their school team.
Results for all participating schools on the one hand, and for elementary and special schools on the other hand, showed similar patterns, except for items concerning how to deal with problem behavior. Special education schools deal with challenging and complex student behavior. The motivation to define problem behavior and discipline policies will therefore be higher in special education than in regular schools. It is striking, however, that agreement among staff on how to handle emergencies was low for both special and elementary schools. This argues for more systematic, schoolwide procedures for responding to these situations.
Limitations and Future Directions
Maintaining the validity of delity measures TFI and SET to achieve program integrity
of SWPBIS in the Netherlands was an important objective of this study. However, a
number of limitations necessitate discussion. First, we only succeeded in collecting
interrater agreement data for the SET in 10 schools, which is less than 10% of all parti-
cipating schools. Second, we did not collect similar data for the TFI. Although both SET
and TFI assessments were highly structured, and the nature of TFI assessment makes
it hard to collect objective data about interrater agreement, we cannot state that
the training provided was eective in preparing our assessors to collect delity data.
Future research should strive to collect more data on interrater agreement. Finally, the
impact of culturally adapted coaching and sta buy-in on delity cannot be measured
by only assessing TFI or SET. A more detailed (case) description of the implementation
process and the adaptations made in culturally diverse schools outside the US, could
help us better understand the cultural adaptation process of SWPBIS in other coun-
tries.
Conclusions
By completing the TFI and SET, we assessed what SWPBIS looks like in 117 Dutch schools. All core features and standard procedures as displayed in the fidelity measures were present in participating schools. Adaptations in procedures and culturally adaptive coaching to align SWPBIS with the Dutch educational context did not seem to interfere with fidelity of implementation of Tier 1. The level of implementation needed to achieve positive outcomes in Dutch schools cannot yet be determined. The next step is to investigate the relation between fidelity and outcome data in Dutch schools.
Chapter 4.
Results
of SWPBIS
Abstract
In 2009, School-Wide Positive Behavioral Interventions and Supports (SWPBIS) was introduced in the Netherlands to support schools in creating safe learning environments. In this longitudinal study, we explored effects of SWPBIS on student outcomes in the Netherlands. Fidelity of implementation of SWPBIS has been associated with improved student outcomes. The purpose of this study was to examine the relation between changes in fidelity and student outcomes. Sixty-six elementary schools (n = 14,256 students) were followed for 3 years (2015-2018). We collected yearly data on fidelity, social safety (consisting of students' social well-being, general feeling of safety, harassment, and the prevalence of unsafe locations in and around schools), behavior incidents, and additional behavioral support. Using repeated measures ANOVAs, we saw an increase in fidelity scores and a decline in the percentage of students stating there were unsafe locations in and around school. Multiple regression analyses showed that changes in fidelity were related to changes in both students' social well-being and the number of behavior incidents. Limitations are discussed, such as the absence of comparison schools not implementing SWPBIS, schools being at different stages of implementation, and the handling of missing data.
Keywords: School-Wide Positive Behavioral Interventions and Supports, fidelity, effects
This chapter is based on: Nelen, M. J. M., Scholte, R. H. J., Blonk, A., van der Veld, W., Nelen, W., & Denessen, E. (2021). School-Wide Positive Behavioral Interventions and Support (SWPBIS) in Dutch elementary schools: exploring fidelity and effects. Psychology in the Schools, 1-15. https://doi.org/10.1002/pits.22483
Introduction
In 2009, School-Wide Positive Behavioral Interventions and Supports (SWPBIS) was introduced in the Netherlands to support schools in dealing with problem behavior and creating safe environments. SWPBIS was originally developed in the US in the 1980s by researchers from the University of Oregon (Sugai & Simonsen, 2012), and more than 26,000 U.S. schools are currently working with SWPBIS. Its aim is to develop schoolwide systems and procedures that promote positive changes in student behavior by targeting staff behavior (Bradshaw et al., 2010). SWPBIS is a framework, not a method with specific protocols or standardized interventions: Strategies and interventions are developed and modified in alignment with the context of the individual school, referred
to as contextual t (McIntosh et al., 2010). Research has shown that SWPBIS resulted
in a decrease in problem behavior, an increase in prosocial skills and perceptions of
school safety, and an improvement of the overall school climate (e.g., Bradshaw et al.,
2008; Bradshaw et al., 2009; Bradshaw et al., 2012; Horner et al., 2010; Horner et al.,
2009; Waasdorp et al., 2012). Most SWPBIS research has been U.S.-oriented, although
other countries such as Norway and Australia have been building evidence for the ef-
fectiveness of SWPBIS as well (Sørlie & Ogden, 2015; Yeung et al., 2016). Implementing
SWPBIS with delity has been shown to be important for achieving positive outcomes
(McIntosh, Mercer, et al., 2013). In this study, we aimed to explore eects of SWPBIS in
the Netherlands, with particular attention to the role of delity of implementation. We
followed 66 elementary schools (14,256 students) for 3 years, collecting data on delity
of implementation and student outcomes.
SWPBIS Features
Sugai and Horner (2009) described the theoretical and conceptual characteristics of SWPBIS as (a) the behavioral foundation of SWPBIS; (b) emphasis on prevention in a multi-tiered system of behavior support; (c) teaching of behavior; (d) the use of evidence-based or research-based practices; (e) the implementation of systems that support effective practices related to school safety; and (f) the ongoing collection and use of behavioral data to develop (preventive) strategies. The multi-tiered system of student support (Greenwood et al., 2008) contains universal interventions for all students (Tier 1), targeted interventions for students who need additional support (Tier 2), and individual interventions for students with chronic or severe behavioral needs who need individual support (Tier 3). At Tier 1, a SWPBIS school typically has established schoolwide expectations (such as "Be responsible") that are being taught, systematically acknowledges positive student behavior, and has a system for handling problem behavior, including procedures for how to respond to problem behavior with consistent consequences (OSEP, 2015). Data-driven decision making is a central feature of SWPBIS (McIntosh, Ellwood, McCall, & Girvan, 2018). Behavioral data such as office discipline referrals (ODRs) are collected and used to develop and evaluate preventive interventions. Systems change and research-validated practices are used to reach valued outcomes that are defined and operationalized by the school (Sugai et al., 2012). A SWPBIS leadership team (a representative group of stakeholders including educators, school administrator(s), family members, and students) is responsible for the implementation process at the school, establishing local capacity and expertise, setting up majority agreements and commitments, measuring fidelity of implementation, and outcome evaluation (Lewis et al., 2016; Sailor et al., 2009). In the US, school-based leadership teams receive further support from district- and state-level leadership teams (OSEP, 2015).
All the separate components mentioned above are part of the SWPBIS framework and draw from several decades of systematic research in education, mental health, and behavior analysis (Horner et al., 2010). The efficacy of SWPBIS is based on focusing on the whole school approach, emphasizing the multiple tiers of support that are delivered as early as possible, tying educational practices to the organizational systems needed to deliver these practices with fidelity, and the systematic use of data for decision making (Sugai, Horner, & Lewis, 2009). Adapting the framework to the school context is crucial for successful implementation (McIntosh et al., 2010). This not only applies to implementation of SWPBIS in diverse US cultural contexts, but also to implementation in other countries (M. J. M. Nelen, Willemse, et al., 2019). However, adaptations made to make SWPBIS fit more closely to the (national) school context must be in line with the conceptual foundations of the framework to avoid weakening the efficacy (T. B. Smith, Domenech Rodríguez, & Bernal, 2011). When SWPBIS was introduced in the Netherlands, essential features of the framework were formulated in recognizable and culturally acceptable words, and interventions and strategies were adjusted to fit the Dutch schools. M. J. M. Nelen, Willemse, et al. (2019) have described the process of cultural adaptation of the framework to the Dutch educational context.
SWPBIS in the Netherlands
Discussing eects of SWPBIS in a country requires understanding of the cultural con-
text. In the 2015-2016 school year, there were 6,431 elementary schools (grades 1-8,
ages 4-12 years) in the Netherlands. Many elementary schools are relatively small (50%
of all elementary schools have fewer than 200 students, M = 224 students). The aver-
age class size in elementary school is approximately 24 students. Almost all schools
are funded by the Dutch government, as long as prescriptive goals are achieved. Dutch
schools are known for their high (teacher) autonomy (OECD, 2011). Every school is free
to choose its curriculum and methods, achievement measures, and sta-to-student
ratio. A national inspectorate monitors the quality of education in the schools. Parents
are free to choose a school, and costs are minimal.
In 2009, a consortium of universities of applied sciences and youth care agencies introduced SWPBIS in the Netherlands and initiated PBIS coach training. The consortium presented several adaptations to SWPBIS procedures. As problem behavior is mostly classroom-managed and ODRs do not exist in Dutch schools, a behavior incident form was developed for the ongoing use of behavioral data. Collecting behavior incident data for preventive reasons is not common in Dutch schools. Therefore, during SWPBIS implementation, schools are usually coached on determining when, what, and how to register. In 2014, a Dutch version of the Schoolwide Information System (SWIS, May et al., 2010) was introduced in the Netherlands. As openly praising students in the Netherlands is often considered "over the top", the introduction of token economy systems initially met with some resistance from teachers (M. J. M. Nelen, Willemse, et al., 2019). However, research on the use of fidelity measures in Dutch schools showed that feedback and acknowledgement for positive student behavior was fully implemented at most schools (M. J. M. Nelen, Blonk, et al., 2020). This suggests that culturally appropriate ways of reinforcing student behavior were found (such as group awards or a "thumbs up"). Finally, culturally adaptive ways of coaching were developed, taking into account the high degree of autonomy of Dutch teachers. There are currently different modalities for supporting schools in implementing SWPBIS in the Netherlands: schools
can be coached by a SWPBIS coach, networks of SWPBIS schools have arisen, and some schools have started SWPBIS without the guidance of a coach (M. J. M. Nelen, Blonk, et al., 2020). Today, SWPBIS has been implemented in approximately 350 schools (approximately 4.5% of all Dutch schools), mostly elementary schools.
Fidelity of Implementation
Many studies have reported that implementing SWPBIS with delity is associated with
positive school outcomes such as improvement of school climate and safety, and a
decrease in behavioral problems (e.g., Bradshaw et al., 2009; Simonsen et al., 2012).
Fidelity of implementation is the extent to which components of an intervention, as
conceptualized in a theoretical model or manual, are implemented as intended (Schulte
et al., 2009). In SWPBIS studies, delity has been operationalized by measuring to what
extent the core features and standard procedures of SWPBIS were present in schools.
Fidelity measures reect core features and standard procedures and contain items on
the SWPBIS leadership team (composition, procedures and universal screening), imple-
mentation (teaching behavioral expectations, problem behavior denitions, classroom
procedures, providing students with feedback and acknowledgement, stakeholder
involvement and professional development), and evaluation (collecting discipline data,
data based decision making, measuring delity and annual evaluation). As the process
of implementation can vary across schools in dierent countries, measuring delity
provides information regarding the extent to which a school has succeeded in imple-
menting core features and procedures (McIntosh et al., 2017).
Fidelity does not happen automatically: schools work hard to contextualize and implement core features and procedures. Usually, SWPBIS coaches support schools in their implementation efforts. Fixsen, Blase, Naoom, and Wallace (2009) distinguished several stages of implementation: creating readiness, initial implementation, and institutionalization. Nese et al. (2019) found that most schools reached adequate implementation at Tier 1 during their second year of implementation following training. The initial years of implementation are crucial, as threats like administrator or team turnover can easily lead to abandoning SWPBIS. Embedding SWPBIS practices into school routines may even take three to five years (Sugai, Horner, & McIntosh, 2008). Reaching implementation early is a strong predictor of sustained implementation (McIntosh, Mercer, Nese, Strickland-Cohen, & Hoselton, 2015).
To measure delity of Tier 1 implementation, several instruments have been developed.
The most recently developed is the Tiered Fidelity Inventory (TFI, McIntosh et al., 2017).
The SWPBIS leadership team of a school completes a questionnaire, preferably with
guidance by a SWPBIS coach to ensure as much objectivity as possible. The School-
wide Evaluation Tool (SET, Horner et al., 2004) is another delity measure, mostly used
in research studies because it is considered to be a more objective measure, as it is
completed by an external assessor. Both instruments are valid and reliable, and assess
the same construct (Mercer et al., 2017). They both result in a total score, indicating
the level at which features are realized. Higher scores mean greater delity. When the
total score meets or exceeds a criterion (e.g., 80% for the SET and 70% for the TFI),
it indicates that a school is implementing SWPBIS “with delity” (Mercer et al., 2017).
In the present study, we used both the TFI and the SET to measure delity of Tier 1
implementation in Dutch schools. The TFI was chosen because it is the most recently
developed and up-to-date instrument, it is brief, and it is based on the factors and
features in existing validated delity measures. The SET was chosen to compare TFI
measurements with more objective data (M. J. M. Nelen, Blonk, et al., 2020).
School Safety
SWPBIS, when implemented with delity, is expected to promote safe schools, not
only by reducing problem behavior or improving school climate (Horner et al., 2009),
but also by enhancing schools’ organizational context (Bradshaw et al., 2009). Safe
schools are pivotal for learning. According to J. Cohen, McCabe, Michelli, and Pickeral
(2009), positive school climate is associated with and predictive of academic achieve-
ment, school success, eective violence prevention, students’ healthy development,
and teacher retention. Nijs et al. (2014) stated that school environment is an impor-
tant determinant of psychosocial function and may also be related to mental health.
Kutsyuruba, Klinger, and Hussain (2015) found that school climate, feelings of school
attachment/connectedness and personal safety are some of the most important vari-
ables for understanding school safety.
In the Netherlands, school safety is framed in terms of social safety. The Dutch Ministry of Education, Culture, and Science defined three aspects of social safety: social and physical safety of students, and social well-being. A school is considered safe when students' safety is not being violated by others (W. Nelen et al., 2018). Yearly monitoring of school safety is mandatory for Dutch schools. Although the government organizes a bi-yearly measurement of school safety, each school is free to choose an instrument for monitoring school safety. In this study, we followed the Dutch government's definition of social safety, which we operationalized as students' perceptions of school safety and the prevalence of behavior incidents. Social well-being is defined as the way students perceive their class, contacts with classmates, and being at school. Physical safety is defined as the absence of physical harassment (such as hurting, pushing, or fighting; W. Nelen et al., 2018). Research showing that SWPBIS contributes to improved social safety has mainly been conducted in countries other than the Netherlands (for the US: e.g., Bradshaw, Mitchell, & Leaf, 2010; Canada: e.g., McIntosh, Bennett, & Price, 2011; and Norway: e.g., Sørlie & Ogden, 2015). Therefore, we wanted to explore whether these results were replicable for the Netherlands.
Purpose of the Study
Fidelity of implementation has been associated with positive student outcomes, such as a decrease in problem behavior and an increase in social safety. To examine this, some studies have used fidelity cut-off scores (meeting or exceeding a criterion) in their analyses (e.g., Simonsen et al., 2012). Others used fidelity as both a continuous and a dichotomous variable (e.g., Bradshaw et al., 2009). The relation between changes in fidelity and changes in student outcomes has been examined less often.
All participating schools started implementing SWPBIS before study onset. The average duration of implementation at study onset was 22.97 months (SD = 16.53 months, range 2 to 74 months). All schools received support from a trained SWPBIS coach, mainly at the beginning of the implementation process. The training covered, among other things, implementing and monitoring fidelity of SWPBIS implementation. The authors had no involvement in implementing SWPBIS in participating schools. The process of implementation was not part of this study. Schools in our sample were comparable with other Dutch elementary schools in size, location, and affiliation. Twenty-five schools reported that they were located in a multi-problem neighborhood. We defined this as a neighborhood where multiple problems occur, such as unemployment, violence, criminality, addiction-related problems, and health problems such as higher mortality rates and obesity (e.g., Marlet, Poort, & van Woerkens, 2009). See Table 9 for summary information about the numbers of teachers, students, and classes at participating schools.
Procedure
Data collection. Data were collected for 3 consecutive years, with a focus on the first and last wave (T1 and T3), in repeated measurements of fidelity of Tier 1 implementation and student outcomes (social safety, behavior incidents, and the percentage of students receiving additional behavioral support). All data were collected between October 2015 and August 2018. In defining our measures, we stayed as close as possible to daily practice in schools. We chose measures that were either part of SWPBIS (behavior incident form) or part of schools' obligation to collect data on social safety (social safety monitor).
Fidelity, social safety, and the percentage of students receiving additional support were measured yearly. Data on behavior incidents were collected several times per year, in 10 periods of 4 weeks each. Data collection was synchronized each year. For behavior incidents and students receiving additional support, we asked schools to anonymize their data before sending them by email. Most data were at the school level, except for the social safety monitor, for which individual student data were collected.
In the Netherlands, to our knowledge, research on the relation between fidelity and student outcomes has not been conducted before. In Dutch schools, there are also different modalities for supporting SWPBIS implementation. To examine whether the core components of SWPBIS were being implemented as intended, measuring fidelity of implementation was important. Earlier research on the use of fidelity measures in the Netherlands showed that all items displayed in the TFI and SET were present in participating schools (M. J. M. Nelen, Blonk, et al., 2020), and, therefore, these measures could be used to measure fidelity of implementation.
The number of Dutch schools implementing SWPBIS is relatively small, and in the Netherlands there is usually no (research) funding to finance the costs of implementation. Therefore, we decided to focus on elementary schools that were already implementing SWPBIS, rather than on schools that started at study onset. For 3 consecutive years, we measured fidelity of Tier 1 implementation, students' perceptions of social safety, and the prevalence of behavior incidents. To determine the distribution of the multi-tiered model in participating Dutch schools, we also collected data on the percentage of students receiving additional support for their behavior.
Our research questions were:
1. To what extent do fidelity of Tier 1 SWPBIS implementation and student outcomes (i.e., students' perceptions of social safety, the prevalence of behavior incidents, and the percentage of students receiving additional support for behavior) in Dutch elementary schools change over time?
2. What is the relation between SWPBIS Tier 1 fidelity of implementation and student outcomes in participating schools?
3. Is an increase in SWPBIS Tier 1 fidelity of implementation related to improvement in student outcomes in participating schools?
Method
Participating Schools
Elementary schools implementing SWPBIS were recruited through invitations posted on Dutch SWPBIS websites, flyers distributed at the annual Dutch SWPBIS conference, and invitations sent by several SWPBIS expertise centers (mostly indirectly, via SWPBIS coaches). Of the 83 schools asked to participate in the 3-year study, 76 initially accepted the invitation. Of these schools, six declined before study onset. During data collection, four schools withdrew due to management changes or not being able to provide the data requested. In the end, 66 schools participated for all 3 years. Effect sizes for SWPBIS have been reported to vary across studies from relatively small (d = 0.31; Simonsen et al., 2012) to very large (d = 2.63; Bradshaw et al., 2010), and to depend on the variables assessed (Horner et al., 2009). For student outcomes, mean effect sizes are around d = 0.32 (Simonsen et al., 2012), and for fidelity measures effect sizes are well above 1 (d varies between 1.08 and 2.63). Based on the smallest reported effect size (0.31), an alpha of .05, and a power of .80, a total sample size of 52 schools for a repeated measures ANOVA was considered large enough to detect significant effects.
Table 9. Descriptive data for participating schools at T1 (N = 66)

                        M        Min     Max     SD
Number of students      216      57      476     104.73
Number of teachers      17.35    6       42      8.33
Number of classes       9.29     3       19      3.97
The TFI was completed by the same assessor every year, whereas the SET assessor varied each year. All TFI and SET assessors were familiar with SWPBIS, and were selected and trained by the first author in completing both instruments. The interrater agreement of SET assessors was moderate (k = .58) when measured in an earlier study on the use of the TFI and SET in Dutch schools (M. J. M. Nelen, Blonk, et al., 2020). That study also included the data for the first fidelity measurements at T1 used in this study (66 of the 117 schools included in that study). The interrater agreement for the TFI was not calculated in that study, because scoring TFI items is based on discussions in the SWPBIS leadership team, which makes independent scoring difficult. For a more detailed description of the use of these fidelity measures in Dutch schools, see M. J. M. Nelen, Blonk, et al. (2020).
Social safety was measured with an online survey assessing perceptions of social safety and required interventions, and harassment (Mooij, De Wit, & Fettelaar, 2011). The survey consists of eight different topics, for example, "About school", "Feeling safe", "Being bullied", and "Being a bully." An example of a question is "Are you being bullied at school?" This question could be scored "Every day", "Every week, but not every day", "Sometimes, but not every week", "Almost never", or "No, never". At the beginning of each page, students were reminded that the questions were about the present school year. There was a maximum of 71 questions. Several questions were shown or hidden depending on the answer to a previous question. Most questions were answered by multiple choice or on a Likert scale. The number and content of the options varied depending on the question. In the survey, four dimensions of positive or negative aspects of social safety were distinguished: 1) the perception of safety at different school locations; 2) unacceptable behavior, represented by the prevalence of behavior incidents and substance abuse; 3) harassment of students; and 4) the perceived need for extra interventions to improve social safety in and around the school (W. Nelen et al., 2018). For the purpose of this study, we only used questions about students' social well-being, general feeling of safety, unsafe locations, and harassment. "Well-being" was operationalized as the average of the scores for three questions about liking one's class, the number of contacts with classmates, and the appreciation of these contacts (a scale consisting of three items, with Cronbach's alpha varying from .61 in 2017 to .65 in 2016 and 2018). "General safety" was operationalized by asking students how safe they generally felt at school, on a five-point scale (single question, validated with similar questions on safety). "Unsafe locations" was operationalized by asking students whether there were locations (a total of seven, e.g., classroom, hallway, playground) in or around school where they did not feel safe at any time during the past year. "Harassment" was operationalized by asking whether students had been a victim of various types of harassment at any time during the past year (a scale consisting of six items, with Cronbach's alpha varying from .81 in 2016 to .97 in 2017 and 2018). Here, a mean score was calculated for being bullied and/or being a victim of minor physical (e.g., hurting, pushing, or fighting), social (e.g., exclusion, ignoring, or threatening), material (e.g., destroying or stealing), and/or verbal (e.g., name-calling or yelling) harassment.
Schools were invited by email to subscribe to the safety monitor. In accordance with the official survey procedure, only students from grades 7 and 8 (10- to 12-year-olds) received a login code (more than 3,500 students), so they could complete the survey anonymously. The safety monitor used in this study is one of the social safety monitors officially approved by the Dutch inspectorate of education. Since monitoring social safety is prescribed by law, no parental consent for the participation of students was needed. The internal review board of the research institute approved the study (ECSW 2016-2501-369). At the beginning of the school year, each school received an overview of which data were planned to be collected and when. When a school did not provide the data requested, several reminder emails were sent.
Measures
Fidelity of implementation was measured with both the TFI Tier 1 and the SET. We focused on Tier 1, because not many schools had implemented Tiers 2 and 3 yet. The TFI Tier 1 (version 2.1) has 15 questions, divided into three subscales: "Team", "Implementation", and "Evaluation" (McIntosh et al., 2017). The SET was originally designed for academic research and is completed by an external assessor (Horner et al., 2004). It has seven subscales: "Expectations defined", "Behavior expectations taught", "Reward system", "Violations system", "Monitoring and evaluation", "Management", and "District support". There are multiple items per subscale, with a total of 28 items. For each subscale, the sum score is divided by the maximum score per scale. In both measures, items can be scored 2 (fully implemented), 1 (partially implemented), or 0 (not implemented). The total score indicates, in percentages, the level at which features are realized in schools. A weighted score was used for the SET total score by adding all seven subscale scores (maximum score of 1 per subscale), dividing by 7, and multiplying by 100. For the TFI Tier 1 total score, the sum of the 15 items was divided by 30 (the total possible score) and multiplied by 100.
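To make these score calculations concrete, the following small Python sketch implements the two formulas as just described; it is an illustration only, not the official scoring tool.

    # Sketch of the TFI Tier 1 and weighted SET total-score formulas described above.

    def tfi_tier1_total(item_scores):
        """TFI Tier 1 total: sum of the 15 item scores (0-2 each) / 30 * 100."""
        assert len(item_scores) == 15
        return sum(item_scores) / 30 * 100

    def set_total(subscale_scores, subscale_maxima):
        """Weighted SET total: mean of the 7 subscale proportions * 100."""
        proportions = [score / maximum for score, maximum
                       in zip(subscale_scores, subscale_maxima)]
        return sum(proportions) / len(proportions) * 100

    # Example: a school scoring 2 on ten TFI items and 1 on the remaining five:
    # tfi_tier1_total([2] * 10 + [1] * 5)  # -> 83.3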
The TFI Tier 1 was completed rst, by discussing the 15 questions in order to reach con-
sensus during a SWPBIS leadership team meeting. The meeting was guided by a SWPBIS
coach, who explicitly asked for substantiation of the choices made. Prior to the meeting,
the SWPBIS coach made some observations, and briey interviewed both students and
teachers about school values and behavioral expectations, and acknowledging students.
Preferably, this SWPBIS coach also was (or had been) responsible for coaching the school
during SWPBIS implementation. When the school did not have a SWPBIS coach to assist
with completing the measurement, one was provided (approximately 14 times). Follow-
ing that, the SET was completed within two weeks by a dierent SWPBIS professional
who was not familiar with the school. This professional conducted structured interviews
with the administrator, sta members and students, observed the school environment,
and reviewed developed products such as school policies, SWPBIS Handbook or docu-
ments, and data systems. For example, to determine how well a school’s values and
accompany ing behavioral expectations had been taught, the assessor studied lesson
plans and asked at least 15 students and 10 sta members whether they could state the
values and behavioral expectations of their school.
To answer the rst research question, we used within-subjects repeated measures
ANOVAs to examine how group means for Tier 1 delity of implementation and
student outcomes changed over time. To examine the relation between delity
of implementation and student outcomes (Research Questions 2 and 3), we used
regression analyses, as is recommended for testing associations between a predictor
and outcomes. We conducted 6 multiple regression analyses with SET scores at T1 as
the independent variable, using student outcomes at T3 as the dependent variables.
We also conducted these analyses with TFI scores as the independent variable. These
analyses enabled us to determine whether the level of implementation was related to
changes in student outcomes (see Table 11). Next, we performed 6 multiple regression
analyses with changes in delity (i.e., the dierence between delity scores at T3
and T1) scores as the independent variable, rst for SET, and second for TFI, again
controlling for student outcomes at T1. These results were used to study whether
changes in student outcomes depended on changes in delity (see Table 11). As many
studies have focused on the results of schools that started implementing SWPBIS at
study onset, we also calculated means for both delity and outcome variables for the
nine schools that started in August 2015, and reported on their results separately, to
give an impression of their progress across 3 years.
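As an illustration of these models (the analyses themselves were run in SPSS), the sketch below fits the level-of-fidelity and change-in-fidelity regressions for one outcome using ordinary least squares. The data generated here are synthetic placeholders and the column names are assumptions, not the study's dataset.

    # Illustrative sketch only; synthetic data stand in for the school-level dataset.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 66  # number of schools; the values below are made up for the example
    schools = pd.DataFrame({
        "wellbeing_t1": rng.normal(84, 9, n),
        "set_t1": rng.normal(69, 17, n),
        "set_t3": rng.normal(84, 11, n),
    })
    schools["wellbeing_t3"] = schools["wellbeing_t1"] + rng.normal(1.6, 5, n)
    schools["delta_set"] = schools["set_t3"] - schools["set_t1"]

    # Level of fidelity: outcome at T3 ~ outcome at T1, SET at T1, and their interaction.
    level_model = smf.ols("wellbeing_t3 ~ wellbeing_t1 * set_t1", data=schools).fit()
    # Change in fidelity: outcome at T3 ~ outcome at T1, change in SET, and interaction.
    change_model = smf.ols("wellbeing_t3 ~ wellbeing_t1 * delta_set", data=schools).fit()
    print(change_model.params)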
Results
Relation of outcomes and delity. Table 10 gives descriptive data and results of
repeated measures ANOVAs, to see if student outcomes and delity changed over the
years. Fidelity of implementation improved signicantly. In addition, the percentage
of students stating there were locations in or around school where they felt unsafe
decreased signicantly. The other variables did not change signicantly, although the
decrease in behavior incidents showed a small eect. For the nine schools that started
implementing SWPBIS just before study onset, all means for student outcome varia-
bles improved, but the number of cases was too low to draw conclusions. As there was
a considerable variation in months of SWPBIS implementation for participating schools
that could have inuenced the results found, we checked whether using months of
implementation as a between-subjects factor in the repeated measures ANOVAs re-
vealed any dierences for student outcomes. This was not the case.
Table 11 displays the multiple regression analyses with student outcomes at T3 as dependent variables and TFI and SET scores as independent variables. Whereas the ANOVAs use group means, the multiple regression analyses were conducted to identify patterns in individual school scores. The first row of Table 11 presents the contribution of Well-being at T1 to predicting Well-being at T3, interpreted as the stability of Well-being scores. For all variables, stability appeared to be low, although for two variables (Well-being and Behavior incidents) there were statistically significant β values. For Well-being, the β value was .34 (p < .05), indicating that stability was not perfect, so there was change in students' social well-being at individual schools. This was also the case for the number of behavior incidents (β = .51). In the next two rows, we controlled for the level of fidelity. We saw no effect of the predictors SET or TFI at T1 on Well-being at T3, nor on any other variable.
All data were aggregated at the school level. First, answers were dichotomized (e.g., for bullying: "Almost never" and "No, never" as "0", and "Every day", "Every week, but not every day", and "Sometimes, but not every week" as "1"). Next, the answers of all students were aggregated at the school level. In our example of the item on bullying, this resulted in the percentage of students stating that they were being bullied during the last school year.
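For illustration, a dichotomize-then-aggregate step of this kind could look as follows in Python; the response labels come from the example above, while the DataFrame and column names are assumptions.

    # Sketch of the aggregation described above, assuming student-level responses in a
    # DataFrame with (hypothetical) columns "school_id" and "bullied_at_school".
    import pandas as pd

    NOT_BULLIED = {"Almost never", "No, never"}

    def percent_bullied_per_school(students: pd.DataFrame) -> pd.Series:
        """Dichotomize the bullying item (0/1) and return the percentage per school."""
        bullied = (~students["bullied_at_school"].isin(NOT_BULLIED)).astype(int)
        return bullied.groupby(students["school_id"]).mean() * 100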
Behavior incidents. To measure the prevalence of behavior that was not tolerated at a school, we asked schools to provide data on the number and location (in or outside class) of major and minor problem behaviors, using the schools' own data collection method. Behavior was considered an incident if it interfered (or could interfere) with daily practice in schools. Minor incidents could be resolved quickly without disturbing class, with no need for support from outside class; examples are not following a teacher's directions or name-calling. Examples of major problem behaviors are physical violence, theft, or vandalism. Most Dutch SWPBIS schools first define what particular behaviors can be considered problem behavior (both minor and major), as this can vary across contexts. Second, each school decides what, when, and how to report. For this study, to support schools in collecting data on behavior incidents, we provided them with descriptions and examples based on the Dutch version of the SWIS. Data were recorded by means of the Dutch SWIS or Excel sheets programmed by the Dutch SWPBIS consortium. For our analyses, we counted the total number of behavior incidents (major and minor) and standardized this by calculating the average number of incidents per 100 students per day, for two intervals from the same 4-week period, at T1 and T3. For example: school A had 19 incidents over 18 school days in the 4-week period, and a total of 128 students. This resulted in the following score: [(19/18)/128] * 100 = 0.82 incidents per 100 students per day.
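The standardization can be written as a one-line formula; the short sketch below simply reproduces the worked example for school A.

    # Incidents per 100 students per day, as described above.

    def incidents_per_100_students_per_day(n_incidents, n_school_days, n_students):
        return (n_incidents / n_school_days) / n_students * 100

    # Worked example for school A: 19 incidents, 18 school days, 128 students.
    print(round(incidents_per_100_students_per_day(19, 18, 128), 2))  # 0.82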
Additional support. We asked schools to complete a form each year with the number of students receiving additional support for behavior. We defined this as extra arrangements for students, comparable to Tier 2 or Tier 3 interventions, including examples such as Check-In-Check-Out or an individual behavior plan with different rules for playing outside at recess. Each student could only be counted once. For each school, we calculated the percentage of students who received additional behavioral support.
Analyses. In our study, the school was the unit of analysis. All analyses were performed with SPSS version 20 for Windows 10. Not all schools provided all the data requested. Therefore, the number of participating schools varied across time. We focused on the first (T1) and last (T3) waves of data collection, as we would have lost 20% of our data if we had used all three waves. Comparing T1 and T3, two years apart, would also allow for more change over time to occur that could be related to fidelity of implementation. We tested whether the non-response over time (i.e., attrition) was systematic or not. We compared the T1 scores of schools with incomplete data at T3 with the T1 scores of schools with complete data at T3. There were no significant differences for any of the outcome variables. We therefore concluded that the non-response was random and not selective, as the two groups did not differ systematically at T1.
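The attrition check can be illustrated as follows; the chapter does not specify the exact test used, so this sketch uses independent-samples (Welch) t-tests as one straightforward option, with assumed variable names.

    # Sketch: compare T1 scores of schools with and without complete T3 data.
    import pandas as pd
    from scipy import stats

    def attrition_check(t1_scores: pd.DataFrame, complete_at_t3: pd.Series) -> pd.DataFrame:
        """Welch t-test per outcome: completers vs. non-completers at T1 (illustrative)."""
        rows = []
        for outcome in t1_scores.columns:
            completers = t1_scores.loc[complete_at_t3, outcome].dropna()
            dropouts = t1_scores.loc[~complete_at_t3, outcome].dropna()
            t, p = stats.ttest_ind(completers, dropouts, equal_var=False)
            rows.append({"outcome": outcome, "t": t, "p": p})
        return pd.DataFrame(rows)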
On the fourth row, the interaction effects are presented. The effects displayed in the third and fourth columns indicate the extent to which the stability depended on the level of the SET or TFI score. For none of the variables was the interaction effect statistically significant.
We repeated these analyses using changes in fidelity scores instead of the level of fidelity. Again, Well-being changed from T1 to T3 (β = .49, p < .01). In contrast to the result for the absolute level of fidelity, the effect of the interaction on Well-being at T3 was significant (β = .51, p < .01). This indicates that the change in students' social well-being depended on the changes in fidelity. Behavior incidents showed a similar, though slightly different pattern: a significant change in the number of behavior incidents occurred, which was predicted by the change in fidelity (β = .27, p < .05). The other variables did not show significant changes.
For the TFI, results were similar. Well-being and Behavior incidents changed significantly. Other variables did not show significant change. TFI total scores were not related to student outcomes at T3. Changes in Well-being were significantly related to changes in TFI scores, indicating that students' social well-being increased at schools with increasing levels of implementation fidelity. In contrast to the SET, there was no significant relation between Behavior incidents and changes in TFI scores, indicating that the number of behavior incidents did not decrease at schools with increasing levels of fidelity.
Table 10. Repeated measures ANOVAs: change over time

                      Number of   M T1      SD T1    M T3     SD T3    M T3-T1   95% CI of the difference   p      Cohen's d
                      schools                                                    lower      upper
TFI                   66          57.48a    20.97    82.83    15.54    25.35     19.84      30.87           .00    1.13
SET                   66          68.56a    16.99    84.29    11.06    15.73     11.32      20.15           .00    0.88
Well-being            39          84.38b    8.77     85.97    7.63     1.59      -1.53      4.71            .31    0.17
General safety        39          85.47c    8.17     86.21    5.73     0.74      -2.39      3.88            .63    0.08
Unsafe location       39          25.31d    10.06    20.61    9.75     -4.70     -8.46      -0.93           .02    -0.41
Harassment            39          32.27e    10.03    30.18    10.14    -2.09     -6.89      2.70            .38    -0.14
Additional support    38          4.17f     2.70     3.83     2.33     -0.34     -7.66      3.25            .52    -0.13
Behavior incidents    42          1.61g     1.65     1.23     1.32     -0.37     -0.84      0.09            .11    -0.25

Note: a Total score, meaning the percentage of realized SWPBIS features. b The average score for liking one's class, contact with classmates, appreciation of these contacts, and liking being at school, in percentages. c The percentage of students stating they generally felt safe. d The percentage of students stating there were various locations in and around school where they did not feel safe at any time during the past year. e The percentage of students stating they had been a victim of various types of harassment at any time during the past year. f The percentage of students receiving additional behavioral support. g Incidents per 100 students per day.
Level of delity Change in delity
B SET β SET B TFI β TFI B SET β SET B TFI β TFI
Level of delity Change in delity
B SET β SET B TFI β TFI B SET β SET B TFI β TFI
Well-being T1         .30*     .34*     .28*     .32            Well-being T1         .43**    .49**    .37*     .42
SET T1                -.10     -.21                             ΔSET (T3-T1)          .08      .20
TFI T1                                  -.07     -.19           ΔTFI (T3-T1)          .09      .28
Interaction T1b       .01      .11      .01      .25            Interaction T1c       .02**    .51**    .01*     .37
Table 11. Regression analyses with student outcomes at T3 as dependent variables
General safety T1     .04      .06      .04      .06            Well-being T1         -.00     -.01     .02      .03
SET T1                .01      .02                              ΔSET