Design Study for the Summer Learning Program Quality Intervention (SLPQI): Final-Year Intervention
Design and Evaluation Results
This project was funded by The Wallace Foundation and The Raikes Foundation. This document, last
updated on 4/25/2017, is submitted to the funders as a final report for the project.
Charles Smith, Ph.D.
Ravi Ramaswamy
Katie Helegda
Colin Macleod
Barbara Hillaker, Ph.D.
Poonam Borah
Stephen C. Peck, Ph.D.
Executive Summary
The Summer Learning Program Quality Intervention (SLPQI) is a continuous improvement
intervention for summer learning systems and settings. The intervention includes: (a) standards and
measures for high-quality instructional practices, (b) data products and technology for meaningful
feedback, (c) a plan-assess-improve cycle at each summer site, and (d) supports necessary to design and
implement the prior three parts. The SLPQI focuses on instructional practices that build student skills
during summer and increase school success during subsequent school years.
The SLPQI was the subject of a four-year Design Study involving 152 providers in seven cities.
In the final year of the study, the SLPQI was implemented citywide in Denver, CO; St. Paul, MN; and
Seattle, WA (N = 106 sites). This report presents final specification of the SLPQI design, supports,
measures, and performance benchmarks. Key findings from 2016 include:
The SLPQI was implemented at moderate to high fidelity, at scale, in three citywide systems with
local provision of supports. The proportion of sites implementing the SLPQI at high fidelity was high in
all three systems, and partnerships of school districts, city agencies, community-based providers, and
quality intermediary organizations developed capacity to implement the SLPQI at scale. A large
proportion of non-school-based sites were connected with information about students’ success in the prior
school year.
Summer program staff positively valued the SLPQI and the assessor-coach role. System leaders,
site managers, and assessors reported that implementation of the SLPQI was a good use of their time and
a good fit with their work. They also reported that the Summer Learning Program Quality Assessment
(PQA) successfully differentiated between higher and lower quality. Staff valued the assessor-coach
who observed, generated performance feedback, and provided coaching for the site manager.
Performance data indicate that instructional quality and student outcomes improved as
predicted by the SLPQI theory of change. Performance data indicate that instructional quality improved
from 2015 to 2016. Lower-performing sites improved the most, and high performance was sustained.
Innovations were focused on identified areas of low quality: students' management of their
skills, motivation, and emotions. Students in higher-quality summer settings had greater academic skill
gains in both 2015 and 2016 compared to students participating in lower-quality summer settings.
Recommendations include (a) marketing the SLPQI in cities with strong summer partnerships; (b)
marketing the SLPQI to school districts that hope to build summer partnerships; (c) continuing efforts to
improve the Summer Learning PQA as a standard for high-quality instruction tailored specifically for
students with difficult SEL histories; and (d) conducting a randomized efficacy trial for the SLPQI.
Organizational Background
In 2013, the David P. Weikart Center for Youth Program Quality (Weikart Center) and the
National Summer Learning Association (NSLA) began a collaboration to address summer learning
program quality improvement. NSLA is the only national nonprofit focused exclusively on closing
achievement gaps by increasing access to high-quality summer learning opportunities. NSLA recognizes
and disseminates what works, offers expertise and support for programs and communities, and
advocates for summer learning as a means for promoting equity and excellence in education. The
Weikart Center’s mission is to empower education and human-service leaders to adapt, implement, and
scale best-in-class, research-validated quality improvement systems to advance child and youth
development. The Weikart Center is an affiliate division of the Forum for Youth Investment.
Contents
Executive Summary ...................................................................................................................................... ii
Organizational Background ......................................................................................................................... iii
Contents ....................................................................................................................................................... iv
I. Introduction to the Summer Learning Design Study ................................................................................. 1
Overview of the Four-Year Design Study ................................................................................................ 2
In This Report ........................................................................................................................................... 4
II. SLPQI Design, Supports, and Benchmarks .............................................................................................. 5
SLPQI Theory of Change ......................................................................................................................... 5
Performance Measures .............................................................................................................................. 7
Standard and Measure for Instructional Quality .................................................................................. 9
Cycle for Continuous Improvement ........................................................................................................ 10
Supports: Training, Technical Assistance, Technology .......................................................................... 12
Design Wisdom ....................................................................................................................................... 15
Cost to Implement a QIS anchored by SLPQI ........................................................................................ 15
III. 2016 Performance Study Questions, Sample, and Procedures.............................................................. 16
Methods .................................................................................................................................................. 16
IV. Results for Implementation ................................................................................................................... 18
Implementation of Supports .................................................................................................................... 19
SLPQI Implementation Fidelity and Feasibility ..................................................................................... 19
Staff Valuation of SLPQI ........................................................................................................................ 20
Quality of Management Practices ........................................................................................................... 22
V. Results for Quality of Instructional Practices ........................................................................................ 23
Quality of Instructional Practice, Field Perspective ................................................................................ 23
Change in Quality of Instruction, 2015 to 2016 ...................................................................................... 27
Instructional Innovation During the SLPQI ............................................................................................ 29
Quality and Academic Skill Growth ....................................................................................................... 32
VI. Discussion of Findings and Recommendations .................................................................................... 33
Findings .................................................................................................................................................. 33
Recommendations ................................................................................................................................... 34
References ............................................................................................................................................... 36
Appendix A 2016 SLPQI Performance Benchmarks ............................................................................. A-1
Appendix B Notes on SLPQI Design Adjustments since the 2015 report ............................................. B-1
Appendix C Summer Learning PQA Measures ..................................................................................... C-1
Appendix D Implementation by Sites .................................................................................................. D-1
Appendix E Site Manager Responses to Open-Ended Questions .......................................................... E-1
I. Introduction to the Summer Learning Design Study
Summer learning programs are positioned to play an important role in reducing summer learning
losses that disproportionately affect disadvantaged students (Alexander, Entwisle, & Olson, 2007; Cooper,
Nye, Charlton, Lindsay, & Greathouse, 1996; Gershenson, 2013; Matsudaira, 2013), and summer
learning programs with an explicit focus on improving academic skills are an important part of the out-of-
school time landscape (Boss & Railsback, 2002; Newhouse, Neely, Freese, Lo, & Willis, 2013).
Although a growing literature suggests that summer learning programs can impact academic and other
school-related skills (Borman & Dowling, 2006; Chaplin & Capizzano, 2006; McCombs, Augustine, &
Schwartz, 2011; McCombs et al., 2014; Roderick, Engel, & Nagaoka, 2003), few rigorous studies have
closely examined the specific features and practices that mediate or moderate relations between summer
program participation and school success outcomes (Arbreton et al., 2008; Augustine et al., 2016;
Spielberger & Halpern, 2002).
This relatively limited understanding of the specific instructional practices that support skill
development in young learners presents a number of challenges. First, without a sufficient description of
promising practices, it is impossible to evaluate the effectiveness of those specific practices. Second,
without standards and measures for promising practices that are both precise and feasible to implement, it
is difficult to plan for high-quality services or provide the performance feedback necessary for
accountability and improvement. Finally, and perhaps most importantly, without standards and measures
for promising practices it is difficult to promote the most important kinds of staff practices for at-risk
students. These are practices that help children be open to, and engaged with, academic content and that
support the development of social, emotional, and executive skills that are likely to make students more
effective learners in all settings and with all content.
The Summer Learning Program Quality Intervention (SLPQI) and the Summer Learning
Program Quality Assessment (PQA) directly address these challenges. The SLPQI is a continuous
improvement intervention for summer learning systems and settings that includes four core parts: (a)
standards and measures for high-quality practice anchored by the Summer Learning PQA, (b) data
products and technology that support meaningful feedback to summer staff, (c) a plan-assess-improve
cycle adapted to each summer site, and (d) coaching, training, and technical assistance necessary to
design and implement the prior three parts. The SLPQI and Summer Learning PQA focus summer
learning systems on the difficult task of improving instructional practices that build student skills in
summer to increase students' school success in subsequent school years.
Overview of the Four-Year SLPQI Design Study
Since 2013, the National Summer Learning Association (NSLA) and the David P. Weikart Center
for Youth Program Quality (Weikart Center), multiple national funders, and dozens of place-based
organizations have partnered to implement a design and development study for the SLPQI.[1] The study,
and the intervention design and supports produced through the process, were conducted in partnership
with expert practitioners and designers in the organizations listed in Table 1.
Table 1. SLPQI Design Study Partnership by Year

2013
Summer program sites: 16
Largest providers: West Michigan Public Schools, Higher Achievement (sites in Grand Rapids, MI; Oakland, CA; Baltimore, MD)
Collaborating funders: National Center for Summer Learning, W.T. Grant Foundation

2014
Summer program sites: 32
Largest providers: YMCA of San Joaquin County, Stockton Unified School District, City of Seattle Parks and Recreation, YMCA of Greater Seattle, Bay Area Community Resources (sites in Grand Rapids, MI; Northern California; Seattle Public Schools, Washington)
Collaborating funders: David and Lucile Packard Foundation, The Doug and Maria DeVos Foundation, The Raikes Foundation, and The Wallace Foundation

2015
Summer program sites: 62
Largest providers: Denver Public Schools, St. Paul SPROCKETS, Boys and Girls Club, DU Bridge Project, St. Paul Parks and Recreation
Collaborating funders: The Wallace Foundation, David and Lucile Packard Foundation, The Raikes Foundation

2016
Summer program sites: 106
Largest providers: Denver Public Schools, St. Paul SPROCKETS, Seattle Public Schools, Boys and Girls Club, DU Bridge Project, St. Paul Parks and Recreation
Collaborating funders: The Wallace Foundation, The Raikes Foundation
There were two primary design and evaluation tasks completed over the four-year period. The
design task was to engage the summer learning experts identified in Table 1 to translate or adapt an
existing continuous improvement intervention, the Youth Program Quality Intervention (YPQI), for use
in summer learning systems and settings. The YPQI is an evidence-based continuous improvement
intervention and was the core design from which the SLPQI was adapted.[2]

[1] The purpose of design and development research is to develop new or improved interventions or strategies to
achieve well-specified learning goals or objectives, including making refinements on the basis of small-scale testing.
Typically, this research involves four components: (a) development of a solution (for example, an instructional
approach; design and learning objects, such as museum exhibits or media; or education policy) based on a well-
specified theory of action appropriate to a well-defined end user; (b) creation of measures to assess the
implementation of the solution(s); (c) collection of data on the feasibility of implementing the solution(s) in typical
delivery settings by intended users; and (d) conducting a pilot study to examine the promise of generating the
intended outcomes (Institute of Education Sciences, 2013; Czajkowski et al., 2016).
The evaluation task was to evaluate each iteration of the SLPQI design on three criteria: First, as
the beta versions were fielded, the focus of evaluation was on implementation fidelity to the standard and
the feasibility of the effort necessary to attain the standard. Second, we continuously asked the
implementers (i.e., city leads, site managers, and instructional staff) about the value of the SLPQI (e.g.,
Was the SLPQI a good use of their time? Did the SLPQI fit with their local circumstances and resources?
What worked and didn’t work?). Third, wherever possible, we attempted to answer more specific
questions about the validity of the theory of change. In particular, we wanted to know what the effects of
implementation of SLPQI were on both instructional quality and growth in child skills. Several reports
were produced over the four-year study period.[3]
The basic design for intervention and supports was completed at the end of the second year.
During the third and final phase (2015 and 2016), the delivery of the intervention supports (e.g., training,
technical assistance, project management) was transitioned to the local intermediary organizations and
their summer network partners. The sequence of the study's three phases was:
Phase I (Summer 2013): Pilot for proof of concept resulting in design of beta intervention and
beta supports.
Phase II (Summer 2014): Feasibility study for beta intervention and beta supports delivered by
developer.
Phase III (Summers 2015 and 2016): Scaled intervention with evaluation of implementation
fidelity and student outcomes with local delivery of supports.
During these three phases, SLPQI concepts and practices were tested and evaluated with 152 unique
provider organizations, and data from hundreds of observations, surveys, focus groups, and interviews
were collected and analyzed. This information was part of a feedback loop: first to frontline site
managers and teachers, as they worked to improve their practice and curriculum, and then to the technical
partners who were using staff input and feedback to improve the design.

[2] The Youth Program Quality Intervention is the most widely used quality-assurance process in the afterschool field
and was the subject of a randomized trial that demonstrated that high fidelity to the same four continuous
improvement elements improved the quality of instructional experiences for at-risk youth (Smith et al., 2012).
Subsequent validation studies have linked exposure to high-quality instructional practices, as defined by the
Program Quality Assessment (PQA), to improved school success outcomes, including school behavior and
achievement (Naftzger, 2014; Naftzger et al., 2013; Naftzger, Tanyu, & Stonehill, 2010; Naftzger, Vinson,
Manzeske, & Gibbs, 2011).

[3] Summer Learning Program Quality Assessment: 2013 Phase I Pilot Report (Ramaswamy, Gersh, Sniegowski,
McGovern, & Smith, 2014); Summer Learning Program Quality Intervention (SLPQI): Phase II Feasibility Study
(Smith, Ramaswamy, Gersh, & McGovern, 2015); Summer Learning Program Quality Intervention Phase III
Interim Report (Smith, Ramaswamy, Hillaker, Helegda, & McGovern, 2015); Quality-Outcomes for Seattle Public
Schools Summer Programs: Summer 2015 Program Cycle (Smith et al., 2015); Quality-Outcomes Study for Seattle
Public Schools Summer Programs, Summer 2016 Program Cycle, Interim Findings (Smith, Roy, Peck, Helegda, &
Macleod, 2016); Summer Learning Program Quality Intervention Handbook (Ramaswamy et al., 2017).
Although the project design involved substantial commitments and internal costs for all
participants, total external funding for the four-year design and development study was approximately
$725,000. This funding was distributed across: city intermediary organizations that coordinated the
work, managed contracts with assessors and coaches, and transitioned to delivery of the SLPQI supports
in the final year; technical partners (e.g., NSLA, Weikart, funders) that led the design and evaluation
efforts; and direct service providers that in many cases were already receiving programmatic support
from the funders.
This approach to conducting a design and development study is notable for its efficiency in
extracting user experience into several cycles of design iteration, thus leading to a greater likelihood of
successful implementation at scale. During the period of the study in the participating sites, higher-
quality services were delivered to an estimated 3,350 summer students.
In This Report
This report covers the final year (i.e., 2016) of the study that was fielded in three cities: St. Paul,
Denver, and Seattle. The primary objective was to evaluate SLPQI implementation fidelity and feasibility
when SLPQI was delivered at city-wide scale and where training and technical assistance supports were
provided through local capacity. Experiences from the first two years of the study suggested that cities
with mature OST networks and a high-capacity quality intermediary organization (QIO) would be ideally
suited to scaling up quality improvement systems[4] for summer learning programs. Denver Afterschool
Alliance (Denver [https://www.denvergov.org/denverafterschoolalliance]), Sprockets (St. Paul
[http://www.sprocketssaintpaul.org]), and School's Out Washington (Seattle
[https://www.schoolsoutwashington.org]) are high-capacity QIOs that manage mature QISs anchored by
the Youth Program Quality Intervention (YPQI).
Part II of this report describes the summer 2016 SLPQI design (e.g., parts, sequence, and roles),
supports, performance benchmarks for high fidelity, and a rudimentary assessment of costs. Although the
study did not focus on analysis of costs, we draw upon the information available to discuss costs of
implementation so that school districts, local funders, and QIOs can be as fully informed as possible about
what it takes to build effective QISs for summer.

[4] Quality Improvement Systems (QIS) provide normative frameworks for positive youth development and articulate
standards for management practices, service quality, and program effectiveness that a wide variety of service
providers can agree on and are willing to be accountable for. QIS also frequently create opportunities for cross-age,
cross-sector, and cross-town planning and coordination, effectively blending resources from multiple public and
private funders through the shared purposes of accountability and improvement. QIS typically include Quality
Intermediary Organizations (QIO), as dissemination agents for quality improvement interventions, and technical
supports necessary for program managers to participate in the QIS. QIOs also often provide services related to
performance measurement, participation tracking, curriculum, and other professional development (Smith, 2013).
Although the intervention design work was less prominent in 2016, we were still improving the
design and supports. Appendix B describes improvements to the design and supports that were new in
2016 and how some of the 2015 changes were maintained in 2016.
The results in Sections IV and V draw upon routine performance data produced when the SLPQI is
implemented to address a range of research questions related to the validity of summer program designs.
First, data describing the quality of instructional practices in summer settings is reviewed to better
understand (a) the prevalence of specific instructional practices and (b) the need for quality improvement
in the wider summer sector. Second, because the motivation of frontline staff is so important to
successful implementation, responses from the summer staff who implemented SLPQI in 2016 are
reviewed in order to understand how they valued the SLPQI. Third, data from summer sites participating
in two years of SLPQI are used to describe how quality of instruction changed in the three cities.
According to the SLPQI theory of change, if summer systems implement SLPQI, then quality should
improve, and lower-quality sites should improve the most. Fourth, we present information from
interviews regarding the kinds of instructional innovations that occurred during the summer of 2016 as a
result of the SLPQI implementation. According to the SLPQI theory of change, teachers should make
instructional improvements in response to performance data indicating areas of low performance. Finally,
we summarize findings from the one system that also collected pre- and post-program student academic skill data.
Again, according to the theory of change, students participating in higher-quality summer settings should
have greater academic skill growth compared to students in lower-quality settings.
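To make the third prediction concrete: the quality-change question reduces to grouping sites by baseline score and comparing mean change. Below is a minimal Python sketch of that logic; the site scores, the 3.5 cut point, and the two-group split are invented for illustration and are not study data or the study's actual analytic method.

    from statistics import mean

    # Illustrative only: invented site-level quality scores on the PQA's 1-5 range.
    sites = [
        {"q2015": 2.1, "q2016": 3.4},
        {"q2015": 2.6, "q2016": 3.1},
        {"q2015": 3.9, "q2016": 4.1},
        {"q2015": 4.5, "q2016": 4.4},
        {"q2015": 4.7, "q2016": 4.6},
    ]

    # Group sites by a hypothetical baseline cut point, then compare mean change.
    lower = [s for s in sites if s["q2015"] < 3.5]
    higher = [s for s in sites if s["q2015"] >= 3.5]
    for label, group in (("Lower-performing", lower), ("Higher-performing", higher)):
        change = mean(s["q2016"] - s["q2015"] for s in group)
        print(f"{label} sites: mean 2015-to-2016 change {change:+.2f}")

Under the theory of change, the lower-performing group should show the larger positive change, as in the made-up numbers above.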
II. SLPQI Design, Supports, and Benchmarks
QISs anchored by the Program Quality Intervention approach and the Program Quality
Assessment have proven to be an effective way to bring promising practices to scale in the
organizationally and programmatically diverse OST field (Smith & Akiva, 2008; Smith, Akiva, Sugar, Lo
et al., 2012; Smith et al., 2017). Because high-quality summer settings are uniquely positioned to address
summer learning loss with vulnerable students, the SLPQI was designed to help summer learning leaders
and staff focus deeply on instructional practices, assess their strengths, and improve the quality and
effectiveness of their services over multiple cycles. In this section, we document the final design
specification for the SLPQI, including the overarching theory of change, standards and measures, the
parts of continuous improvement cycles that sites implement, the supports (e.g., coaching, training,
technical assistance) available to SLPQI adopters, design wisdom, and the costs of implementation.
SLPQI Theory of Change
The SLPQI theory of change describes a cascade of multilevel intervention effects designed to
maximize the motivation of frontline managers and staff: motivation to work on improving instructional
practices that they believe are most critical for their own students’ success. At the system level, system
leaders connect public and private organizations with shared goals for summer outcomes, coordinate with
the QIO to manage delivery of supports, and send signals to site managers and teachers that the SLPQI is
a priority. At the organization level, site managers receive training in the SLPQI and lead their site teams
to implement the continuous improvement cycle for their site. Although the decision to adopt the SLPQI
may occur at the system level, the most critical work of SLPQI implementation occurs at the site level
where the cycle is implemented. The site manager’s responsibility for implementation of the cycle is a
critical level of accountability in the SLPQI.
At the point-of-service level (POS; e.g., classroom), teachers implement high-quality instructional
practices and curricula that are identified in the continuous improvement cycle. Assessment of both
instructional quality and student skill growth occur at the POS level as students demonstrate academic
and other skills in response to instructional practices. Finally, as students build toward mastery of social-
emotional learning skills (e.g., management of emotions, executive processes, and social role mastery)
and domain-specific academic content skills (e.g., math and literacy) in the summer setting, the likelihood
of skill transfer to school day classrooms in the subsequent year increases.
Figure 1. SLPQI Theory of Change
Although the cascade metaphor in Figure 1 describes a top-down flow of effects, the YPQI design
is focused on building, simultaneously across levels, motivation in specific summer learning roles (e.g.,
system leader, site manager, teacher) by focusing on developing empowerment and expertise appropriate
to each role. We refer to this as a lower-stakes accountability approach (Smith, 2013), wherein most
individual site managers' and teachers' experiences of accountability in the QIS include the beliefs that
performance standards and measures are fair, attainment of the standards is possible, sufficient supports
are available for improvement, and any single performance measure is insufficient for evaluation of
quality.
Perhaps most importantly, in lower-stakes systems, the logic of negative incentives is inverted[5] so
that low performers receive additional supports. This logic, where “low performers receive extra help,” is
critical for an intervention like the SLPQI that includes performance measures that could promulgate
perverse incentives and behaviors under higher-stakes models (e.g., gaming measures, minimum
compliance, or outright resistance). To summarize: We can most easily help people collect meaningful
and precise data, using their existing organizational resources, if they are not also at risk of being
summarily sanctioned for identifying their own low performance.
Further, although SLPQI performance measures can supply a wealth of valuable performance
data, these data do not become meaningful information without a professional learning community.
Learning from data and using it effectively requires site managers and teachers to engage in conversations
about that information that lead to decisions about both curriculum and professional development. This
process of coming together around standards and data in a lower-stakes context is an integral feature of
the lower-stakes approach. Together, the performance data and learning community provide important
informational and purposive incentives for high-fidelity implementation.
Finally, implementation of the YPQI in a lower-stakes context, with an active professional
learning community, is a proven framework for growing public-private partnerships in a region. Access
to high-quality supports and a shared technical language of summer learning can bring summer-focused
public and private actors into partnership (Yohalem et al., 2010; Yohalem, Devaney, Smith, &
Wilson-Ahlstrom, 2012).
In the remainder of Part II, five parts of the final iteration of the SLPQI design are described: (a)
Performance measures, (b) Plan-Assess-Improve cycle sequence and roles, (c) training and technical
assistance supports, (d) start-up wisdom, and, finally, (e) costs to implement the SLPQI.
Performance Measures
The 2016 suite of performance measures for the SLPQI includes eight composite measures (and
their requisite domains, scales, and items) to describe the organization level of setting and seven
composite measures (and their requisite domains, scales, and items) to describe the point-of-service level
of setting. These measures are described in Table 2. Appendix C presents additional descriptive
information and a summary discussion of their reliability and validity.
[5] The dominant accountability model in education comes from the No Child Left Behind policy that produces
higher-stakes experiences where low performers are identified publicly, outcome measures lack validity, and the
cost of improvement is borne by the low-performing organization.
Table 2. SLPQI Performance Measures (data source in parentheses)

System Level

Accurate and On-time (Project Records): All assessors certified as external assessors; all assessments
completed, reports delivered, and coaching visits conducted on time.

Org Level

Staff Training (PQA Form B Interview): Staff have adequate preparation and receive comprehensive
orientation; high staff retention and adequate staff-to-student ratios; staff have time to plan curriculum
to meet student objectives.

Planning (PQA Form B Interview): Site manager plans proactively, articulates mission and goals for
youth; strategic plan formally reviewed and communicated to staff; youth included in curriculum
development; staff have a framework for lessons. Data is collected and used for improvement planning.

Individualization (PQA Form B Interview): Student skill assessment to provide individualized
instruction; site director and staff discuss needs of individual students. Youth attend sessions frequently,
meet program recruitment criteria, have a high retention rate, and receive a high level of program hours.

Family Connections (PQA Form B Interview): Program communicates with family year-round, staff
have relationships with families, and families have opportunities for participation in program offerings.

Align to School Achievement (Site Manager Survey): Staff review students' school data from the
previous year; students are recruited based on prior year's school performance or recommendation from
school district or staff.

Staff Capacity and Expertise (Site Manager Survey): High staff retention and adequate staff-to-student
ratios; staff skill assessed, trained in advance, provided year-round professional development; frequent
collaboration and feedback.

SLPQI Implementation Fidelity (Site Manager Survey): Site manager attended trainings (Summer
Institute, Coaching); engaged the assessor-coach and used the Summary Report; created a Program
Improvement Plan; coached staff on instruction using the SLPQA.

Staff Valuation of SLPQI (Site Manager Survey): Participation in SLPQI was a good use of time and a
good fit with the job, and administrative support was provided.

POS Level

Instructional Total Score (PQA Form A Observation): Total score for instructional quality composed of
ratings of practice in three domains: Supportive Environment, Interaction, and Engagement.

Safe Environment (PQA Form A Observation): Practices that support psychological, emotional, and
physical safety; supports for a positive, inclusive atmosphere; physical activity; and a healthy
environment.

Supportive Environment (PQA Form A Observation): Practices that support basic skill learning using
both exploratory methods (e.g., engage with materials, encourage trying new skills, multiple types of
activities) and direct scaffolding methods (e.g., break down tasks, staff models, monitor challenge);
positive emotionality and learning from mistakes; conflict resolution.

Interaction (PQA Form A Observation): Practices that support peer friendships and shared values;
group process, social roles, help-giving and seeking, leadership; shared control and work with adults.

Engagement (PQA Form A Observation): Practices that support executive functions necessary for
planning and reflection; supports for extension of knowledge; supports for development of strategies
and rules for problem solving.

Math and Literacy (PQA Form A Observation): In math, access to mathematical problem solving and
reasoning, in different contexts, linked to examples. In literacy, access to literacy activities at a variety
of levels, in multiple contexts and modalities; write about experiences, talk about the meaning of words.

Greeting, Transition, Departure (PQA Form A Observation): Students and families experience warmth
and guided interaction at entry and exit; transitions are planned and children are prepared; departure is
constructive for remaining students.

Student Skill (Summer Program Skill Assessment): Academic achievement test scores in math and
reading, including sight word assessments, oral fluency, summer staircase math assessment, and math
practice.

Student Skill Transfer (School Data and Records): Subsequent-year academic achievement test scores
or proficiency levels, grades, school behavior records.
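The Form A and Form B measures in Table 2 follow the PQA convention of rolling item ratings up into scale, domain, and total scores. The sketch below illustrates that rollup in Python, assuming items rated 1, 3, or 5 and simple averaging at each level; the scale names and ratings are invented, and the actual SLPQA item set and scoring rules are documented in Appendix C.

    from statistics import mean

    # Hypothetical nesting: domain -> scale -> item ratings (1, 3, or 5).
    # Names are illustrative, not the actual SLPQA item set (see Appendix C).
    form_a_ratings = {
        "Supportive Environment": {
            "Skill Building": [5, 3, 5],
            "Encouragement": [3, 3, 5],
        },
        "Interaction": {
            "Collaboration": [3, 5, 3],
            "Leadership": [1, 3, 3],
        },
        "Engagement": {
            "Planning": [3, 1, 3],
            "Reflection": [1, 3, 1],
        },
    }

    # Roll up: scale score = mean of items; domain score = mean of scales;
    # instructional total = mean of the three domain scores.
    domain_scores = {
        domain: mean(mean(items) for items in scales.values())
        for domain, scales in form_a_ratings.items()
    }
    instructional_total = mean(domain_scores.values())

    for domain, score in domain_scores.items():
        print(f"{domain}: {score:.2f}")
    print(f"Instructional Total Score: {instructional_total:.2f}")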
Standard and Measure for Instructional Quality
The Summer Learning PQA Form A is the anchor measure for the SLPQI. Although current
research on summer learning programs continues to extend our understanding of program features such as
teacher expertise and curricula (Augustine et al., 2016; McCombs et al., 2011), formative analyses of
instructional practices as delivered are relatively rare. The Summer Learning PQA Form A was developed
to assess instructional practices that build student skills according to an explicit standard for high-quality
practice, the active-participatory approach, which was developed over several decades at the HighScope
Educational Research Foundation (Ilfeld, 1996; Oden, Kelley, Ma, & Weikart, 1992). This instructional
approach supports learning in two ways.
First, active-participatory refers to a pedagogical approach (i.e., active learning) that makes the
presentation of academic content more engaging by blending exploratory learning methods that maximize
motivation for novices (e.g., choice, concrete and abstract, open-ended questions), direct skill scaffolding
designed to move students upward on specific skill hierarchies (e.g., clarity of instruction, adult modeling,
encouragement to higher levels), and application of academic learning strategies and rules (e.g., identify
strategies, attribute success to effort, guided error correction) that support more sophisticated forms of
academic problem solving. Each of these instructional practices (exploration, direct skill scaffolding,
and use of strategies and rules) is known to increase student engagement with academic content (Gagne,
Briggs, & Wager, 1988; Martin & Reigeluth, 1999).
Second, the active-participatory approach is also a set of supports for learning social, emotional,
and executive skills that make students more effective learners in all settings and with all content.[6] In
particular, students who have been exposed to chronic stressors associated with lower-income
neighborhoods, under-resourced schools, or environmental contaminants are more likely to achieve the
basic regulation and attention skills necessary for learning where there are additional supports in place.
More specifically, these staff practices provide normative guidance in social interaction (e.g., values
communicated, help another child, lead groups), supports for positive emotionality (e.g., warm and
respectful, staff acknowledge feelings, opportunities to get to know), and opportunities to practice
managing the executive processes necessary for decision making (e.g., make plans, intentional reflection,
make connections). Each of these aspects of practice (guidance for social interaction, supports for
positive emotionality, and active management of executive processes such as secondary appraisal or
meta-cognitive strategies) is also known to increase learning in academic content (e.g., Li & Julian,
2012; Linnenbrink, 2007; Marzano, 1999).

[6] Crosswalks of practices named in PQA items are available upon request for the common core habits of mind
(Devaney & Yohalem, 2012), the Danielson Framework, SEL practices (Smith, McGovern, Peck, Larson, et al.,
2016), and other school-day practice frameworks.
Cycle for Continuous Improvement
The SLPQI improvement cycle can be seen in the exemplar of the SLPQI cycle timelines,
activities, and supports presented in Table 3. Determining the sequence of supports for implementation of
the site-level cycle is a critical part of the technical assistance that system leaders receive early in the
process. The generic sequence of PLAN-ASSESS-IMPROVE, where IMPROVE includes coaching and
training, is shown in Table 4, with additional detail on support trainings and actions required. There is
overlap in the final two stages of the process to recognize that performance feedback and improvement
happen both during and after the summer session ends.
Table 3. Sample SLPQI Cycle Timeline, Activities, and Supports
(Elements: Plan, then Assess, then Improve [Coach & Train], with overlap across the final two stages)

March 1: SLPQI Kickoff Webinar or Recruitment meeting (optional)
April 23: Live Summer Learning Institute training (site lead plus other staff as needed)
May 18: Live SLPQA Assessor training (reliable PQA external assessors only)
May 19: Live Quality Coaching workshop (assessors and/or site leads)
June 22: START OF SUMMER PROGRAM SESSION
July 2: External Assessment site visits and reporting
July 8: Assessor-Coach site visits (assessors, site leads, and staff)
July 9-24: Mid-session program improvement (site leads and staff)
July 31: END OF SUMMER PROGRAM SESSION
August 10: Live Planning with Data workshops (site leads and staff as available)
September 1: Live Youth Work Methods workshops
Table 4. SLPQI Process Overview

PLAN (pre-summer)
- Adapt SLPQI design & costs. Support: Consulting. Action example: System leaders receive guidance on fitting SLPQI to local purposes; review relevant exemplars.
- Plan for high-quality instruction. Support: Summer Learning Institute training. Action example: Site leads learn the quality standard for instruction and amend the curriculum plan to increase the prevalence of specific high-quality practices.
- Build coaching capacity. Support: Quality Coaching training. Action example: Assessors and/or site leads gain skills for providing ongoing, meaningful coaching support to instructional staff.

ASSESS (during summer)
- Collect performance data on PQA Forms A & B. Supports: PQA Reliability & SLPQA Assessor trainings. Action example: Assessors conduct site visits to collect data and create site-level reports.

IMPROVE (during and post-summer)
- Coach staff. Support: NA. Action example: Site manager engages staff through coaching around the plan for instruction.
- Summary Report. Support: NA. Action example: Assessor-coach visits the site team to discuss the Summary Report.
- Mid-Session Improvement. Support: NA. Action example: Assessor-coach supports the site team to pursue a short-term improvement.
- Post-Summer Improvement Planning. Support: Planning with Data training. Action example: Site leads engage in longer-term, data-driven improvement planning, setting specific goals for improvement in the following summer session.
- Post-Summer Improvement. Support: Youth Work Methods Workshops trainings. Action example: Site leads and staff engage in targeted professional development for long-term improvement of summer program quality.
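The SLPQI Implementation Fidelity measure in Table 2 is, in essence, a checklist over cycle steps like those above: each site reports which steps were completed, and the share completed indicates fidelity. Below is a minimal Python sketch of that scoring, with invented step names and an invented 80 percent threshold; the actual benchmark definitions appear in Appendix A.

    # Hypothetical fidelity checklist based on the cycle elements above; step
    # names and the threshold are illustrative, not the Appendix A benchmarks.
    cycle_steps = {
        "attended_summer_learning_institute": True,
        "received_external_assessment": True,
        "discussed_summary_report_with_coach": True,
        "completed_mid_session_improvement": False,
        "created_program_improvement_plan": True,
        "attended_planning_with_data": False,
    }

    # Fidelity = proportion of cycle steps the site completed.
    fidelity = sum(cycle_steps.values()) / len(cycle_steps)
    print(f"Cycle steps completed: {fidelity:.0%}")
    # A site might be counted as "high fidelity" above some threshold, e.g., 80%.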
Roles and responsibilities
There are several key roles that support high-fidelity implementation of the SLPQI. Each role is
described briefly below. Table 5 presents the role tasks for each phase of the plan-assess-improve cycle.
Roles and responsibilities include:
System Lead. The System Lead (or Network Lead) is responsible for overseeing the SLPQI at the
city, school district, or region level. The System Lead sets overall goals for the network, provides clear
messaging and advocacy around the process, coordinates training logistics, communicates timelines to
sites, and provides troubleshooting supports as sites implement the SLPQI.
Site Manager. The Site Manager (or site supervisor, site coordinator) is responsible for leading a
site team through the SLPQI. It is important that this person has sufficient time to attend trainings before
and after the summer session and coordinate staff engagement with the process during the summer
session. Key responsibilities include communicating with assessors, managing improvement planning,
and seeing that improvement plans are carried out.
Instructional Staff. The Instructional Staff are primarily responsible for working directly with
students and enacting improvements in the quality of youth experience available at the summer program.
They may also have some responsibilities for leading their team through elements of the SLPQI.
External assessors. The External Assessors are data collectors for the SLPQI process. They
receive training on the SLPQA, conduct site visits, and score the tool so that the site lead has immediate
access to the data. External assessors may also function as coaches (see next paragraph).
Assessor-Coach. The Assessor-Coaches are trained assessors who provide both data collection
and coaching on the results of the assessment for the site team. Rather than being in a position of
monitoring or performance evaluation, assessor-coaches employ a lower-stakes and strengths-based
coaching method.
Data Collection Coordinator. The Data Collection Coordinator is responsible for overseeing the
scheduling and logistics of the data collection process. This role is essential for ensuring that assessors
are paired promptly and appropriately with sites and that site visits and reporting happen in a timely
manner. System Leads can take on this role in certain circumstances.
Table 5. SLPQI Roles and Responsibilities by Intervention Step

PLAN (pre-summer)
- Network Lead: Sets SLPQI timeline for training events and data collection. Sets SLPQI network-level goals. Attends Summer Learning Institute.
- Site Lead: Attends Summer Learning Institute. Completes Pre-Summer Quality Plan.
- Program Staff: Participates in pre-summer professional development.
- External Assessor: Attends PQA reliability training as needed. Attends SLPQA assessor training. Receives site assignment(s).
- Assessor-Coach: External Assessor roles, plus attends Coaching training.
- Data Collection Coordinator: Creates final master list of sites and external assessors. Works with site leads and assessors to complete the data collection calendar.

ASSESS
- Network Lead: Oversees process.
- Site Lead: Receives external assessor on pre-determined day. Participates in Form B interview.
- Program Staff: Observed by external assessor.
- External Assessor: Visits site(s) to conduct data collection. Inputs data into Scores Reporter to generate report.
- Assessor-Coach: External Assessor roles, plus establishes follow-up visit date for coaching.
- Data Collection Coordinator: Monitors data collection process. Ensures timeliness of reporting.

IMPROVE
- Network Lead: Oversees process. Reviews reports and Program Improvement Plans. Coordinates professional development opportunities aligned with program goals. Engages in network-level reflection and planning for the subsequent year.
- Site Lead: Engages staff in mid-session planning using the Summary Report. Participates in Planning with Data at end of summer. Coaches staff during the session to improve practice. Oversees execution of the improvement plan between summer sessions.
- Program Staff: Participates in mid-session planning process. Engages in improvement actions during the session. Engages in targeted professional development between sessions.
- External Assessor: No role at this step.
- Assessor-Coach: Facilitates mid-session improvement discussion based on the Summary Report. Coaches staff during the session to improve practice. Leads sites in sustaining the improvement plan between sessions.
- Data Collection Coordinator: Monitors completion of Program Improvement Plans. Ensures that data is available for improvement and planning between sessions.
Supports: Training, Technical Assistance, Technology
Several types of supports are provided through the SLPQI, including (a) technical assistance for
system leaders supporting summer learning systems, (b) training for site managers on the content of the
SLPQI, and (c) training for assessors and assessor-coaches. This section briefly describes each of the
component trainings that are a part of the SLPQI.
Training for Site Managers
Summer Learning Institute. The Summer Learning Institute is a planning workshop designed to
familiarize site leads with the SLPQA tool and research-based best practices in summer learning
programs. During this training, site leads have opportunities to anticipate their program’s strengths and
areas for improvement as they create a plan for summer quality.
Quality Coaching. Site leads can also attend a coaching training with, or separate from, assessor-
coaches. This training provides concrete skills for supporting staff in making improvements to their
practice during the summer session.
Planning with Data. After the summer session is over, site leads convene with their peers to
review in detail all of their performance data and create longer-term improvement plans. These plans are
intended to apply directly to summer sessions in subsequent years.
Training for Instructional Staff
Summer Learning Institute. If staff are able to attend with their site leads, they can benefit from
learning more about the SLPQA and participating in planning for quality. For staff that cannot attend the
training, the site lead is encouraged to communicate to staff back at the site what they have learned at the
training and engage site staff in the process.
Youth Work Methods. The Methods workshops are aligned with the SLPQA and are designed to
provide program staff with meaningful professional development opportunities to improve their skills.
Specific Methods workshops (i.e., workshops focused on specific learning goals) are selected based on
each site’s improvement goals.
Training for Assessors and Assessor-Coaches
PQA Reliability Training. Assessors who wish to use the SLPQA must first be reliable in either
the Youth or School-Age PQA. This rigorous training and certification process is designed to ensure as
much consistency as possible in the quality of data collected. Experience conducting and scoring
observations using the Youth or School-Age PQA is also very helpful as preparation for using the
SLPQA.
SLPQA Assessor Training. Reliable PQA assessors then attend a one-day training that focuses on
the unique elements of the SLPQA. Participants have a chance to practice scoring and discussing the new
items. The training also teaches assessors the data collection methodology for the SLPQA, which differs
from the standard school-year PQAs.
Quality Coaching. Assessors who will also work as coaches for their sites can attend a Quality
Coaching training in order to improve their coaching skills and learn the observation-reflection method
for instructional coaching.
Data Products
The assessor scores the SLPQA and draws from the performance data to produce
recommendations for improvement in the remaining weeks of the program. This process of converting
data to a customized data product supporting the performance feedback and improvement process is
facilitated by the Online Scores Reporter, an on-line data entry and report-sharing portal that supports
PQI-type interventions (http://cypq.org/content/scores-reporter-30). For the design study, the Online
Scores Reporter was set up to allow assessors to input their scores and generate reports on their own. The
reports include:
- The morning and afternoon scores for all SLPQA items and scales
- A one-page guide about how and where the data could be used during the summer session
- A one-page overview of the quality standards referenced by the performance data
- A take-it-back agenda for a 30-minute workshop on the Summary Report
- Guidance on interpreting PQA data
- The Summary Report, a one-page narrative summary of strengths, suggested improvement actions, and other specific feedback from the assessor (an example of this report can be found here: http://cypq.org/sites/cypq.org/examplereport)
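The Online Scores Reporter itself is a hosted portal, but the core logic behind a Summary-Report-style product, ranking scale scores and pairing the weakest with suggested improvement actions, can be illustrated in a few lines of Python. Everything in the sketch below (scale names, scores, suggestion text, and the two-scale cutoff) is hypothetical and not drawn from the actual Scores Reporter:

    # Hypothetical sketch of turning SLPQA scale scores into a short list of
    # strengths and suggested improvement areas. All values are illustrative.
    scale_scores = {
        "Warm Welcome": 4.6,
        "Skill Building": 4.1,
        "Shared Control": 2.3,
        "Planning": 1.9,
        "Reflection": 2.7,
    }

    suggestions = {
        "Shared Control": "Offer students real choices within activities.",
        "Planning": "Open each offering with a brief student planning step.",
        "Reflection": "Close each offering with structured reflection questions.",
    }

    # Rank scales from lowest to highest score.
    ranked = sorted(scale_scores.items(), key=lambda kv: kv[1])
    strengths = [name for name, score in ranked if score >= 4.0]
    growth_areas = [name for name, score in ranked[:2]]  # two lowest scales

    print("Strengths:", ", ".join(strengths))
    for name in growth_areas:
        print(f"Improve {name}: {suggestions.get(name, 'Discuss with assessor-coach.')}")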
Design Wisdom
Make the Design Fit the Resources. In order for the SLPQI to be successful, even in a pilot year,
a network or site must have resources (both time and funds) to support the process. If a program is short
on time, money, or human resources, it may be possible to acquire the necessary resources in some other
way, reassign roles, or adjust goals and scale back the intervention to a level that matches the available
resources. Perhaps most importantly, it may be necessary to start small and focus resources on a few
sites.
Start Small with Motivated Participants. Even though a complete SLPQI process is designed to
involve all of the elements described above, a single program could start on its own by downloading a
copy of the SLPQA (available at cypq.org/downloadpqa), reviewing the handbook (Ramaswamy et al.,
2017), and spending some time in a staff meeting or training discussing the standards. A site lead or other
designated person could even conduct a short observation and then discuss their notes with a colleague as
they think about what the scores might be. Simply engaging with the Summer Learning PQA Form A is a
first step that should build buy-in and momentum.
Systemic Implementation in a Region Requires a Network of Providers and a Quality
Intermediary Organization. It is most effective to be part of a larger network so that resources can be
pooled and a QIO can manage resources and systems, connect to the Weikart Center and NSLA, and
bring the learning community of service providers together. A strong QIO will be able to help set and
manage network goals, timelines, trainings, and data collection.
Cost to Implement a QIS anchored by SLPQI
Although the SLPQI Design Study was not focused specifically on analyses of cost, the
contracting model used to fund the study (wherein funds pass to the QIO and other network actors, who
then purchase services from the Weikart Center and each other) did require monetization of some aspects
of the work as well as estimation of staff time for various actors in the QIS to carry out their roles.
Using a hypothetical summer learning system consisting of 25 sites, costs are likely to be structured in
the following way: The SLPQI package of services and supports from Weikart/NSLA costs
approximately $30,000 per year to produce. Staff time for participation in training and coordination of
the improvement work at each site, valued at $40 per hour, is projected to cost approximately $1,000 per site, or
$25,000 for a system with 25 sites. Finally, overall project management and assessor-coaching services
are estimated to cost approximately $25,000 in a project where assessment and coaching visits (including
preparation and follow up in the per-visit cost) were valued at $400 each. In total, to bring a summer
learning QIS to scale would cost between $80,000 and $100,000 per year for two years. The higher range
would include additional consulting for the curriculum and an evaluation report paralleling sections III-V
of this report.
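The arithmetic behind those figures can be laid out explicitly. In the sketch below, the $30,000 package, the $40 hourly rate, the $400 per-visit cost, and the two $25,000 subtotals come from the text; the 25 hours per site and the $20,000 upper-range add-on are backed out of the stated totals rather than given directly, so treat them as assumptions.

    # Cost sketch for a hypothetical 25-site summer learning QIS, using the
    # per-unit figures in the text. Hours-per-site and the upper-range add-on
    # are inferred from the stated totals, not given directly in the report.
    sites = 25

    package = 30_000                 # Weikart/NSLA services and supports, per year

    hourly_rate = 40                 # valued staff time, $/hour
    hours_per_site = 25              # inferred: $1,000 per site / $40 per hour
    staff_time = sites * hours_per_site * hourly_rate   # $25,000

    visit_cost = 400                 # per assessment/coaching visit, incl. prep and follow-up
    management_and_visits = 25_000   # project management plus visits (about 62 visits at $400)

    total = package + staff_time + management_and_visits
    print(f"Base annual cost: ${total:,}")                               # $80,000
    print(f"With curriculum consulting and evaluation: up to ${total + 20_000:,}")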
III. 2016 Performance Study Questions, Sample, and Procedures
Methods
In this section, we discuss the research questions, participant organizations and staff, and
procedures for collection of performance data.
Research Questions
This report addresses the following questions related to implementation fidelity and program
effectiveness: Was the SLPQI implemented at high fidelity? Was the SLPQI valued by the staff who led
implementation? What level of quality is being achieved in the field? What are examples of instructional
innovation that occur as a result of SLPQI participation? Did the quality of instruction in summer
program settings improve after two years of implementing the SLPQI? Is student academic skill growth
related to participation in high-quality summer settings?[7]
Sample of Organizations, Sites, Curricula, Staff, and Students
Table 6 describes the number of provider organizations (e.g., St. Paul Public Schools, Seattle
YMCA), the number of summer learning sites, the number of staff represented in the performance data
for all sites, and an estimate for the number of children and youth served across these summer learning
sites.
Table 6. Study Sample by Year and City

2015
Number of Organizations: City A 7, City B 13, City C 6; Total 26
Number of Sites: City A 13, City B 15, City C 34; Total 62
Number of Staff*: City A 31, City B 36, City C 52; Total 119

2016
Number of Organizations: City A 9, City B 16, City C 34; Total 59
Number of Sites: City A 24, City B 18, City C 64; Total 106
Number of Staff*: City A 48, City B 46, City C 107; Total 201
Number of Children (estimated): 3,350 across all cities
[7] Although we summarize information about student skill gains related to participation in higher-quality
summer settings, these data were collected only in the Seattle system and are discussed in greater detail in
two reports: Quality-Outcomes for Seattle Public Schools Summer Programs: Summer 2015 Program Cycle (Smith
et al., 2015); Quality-Outcomes Study for Seattle Public Schools Summer Programs, Summer 2016 Program Cycle,
Interim Findings (Smith, Roy, Peck, Helegda, & Macleod, 2016).
The summer learning sites in 2016 reflected a mix of designs, auspices, and organizational
purposes related to minimizing or eliminating summer learning loss. These networks and sites reflected
the diversity of the summer learning field through, in particular, a varying emphasis on blending
academic and enrichment content using both public and community-based providers. It is interesting to
note that despite the relatively low proportion of school-district sites, a high percentage of site managers
in the study reported having access to student records for targeting and diagnostic purposes, suggesting
that these summer systems were connected to public schools but administered through public-private
partnerships.
Table 7. Summer Learning Program Designs

| Characteristic | City A | City B | City C |
|---|---|---|---|
| Morning academic curriculum content | Academic 61%; Literacy 65%; Math 26% | Academic 58%; Literacy 67%; Math 25% | Academic 80%; Literacy 84%; Math 73% |
| Includes afternoon enrichment | 14% | 10% | 19% |
| Site is a school | 8% | 0% | 38% |
| Staff is a certified teacher (CT) or social worker (SW) | 16% (CT); 16% (SW) | 8% (CT); 4% (SW) | 45% (CT); 13% (SW) |
| Program targets academically at risk | 87% | 46% | 56% |
| Program uses school year or other diagnostic data on achievement | 64% | 29% | 85% |
Data Collection Procedures
Data collection for the 2016 year included the following data sources, measures, and procedures:
Project records. Project records included records of training attendance, assessor reliability test
results, dates for submission of observations, dates when performance reports were sent to each of the 106
programs, and notes from technical assistance calls.
SLPQA Form A. Form A is an observational measure designed to evaluate the quality of staff instructional practices when interacting with children and youth at the “point-of-service” during program offerings. Assessors were required to achieve 80 percent or greater perfect agreement with gold-standard scores of a videotaped program offering before conducting observations.
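The reliability rule can be expressed compactly. The sketch below illustrates the 80-percent perfect-agreement check described above; the function name and item scores are invented for illustration.

```python
# A minimal sketch of the reliability rule: an assessor qualifies when at
# least 80% of item scores exactly match the gold-standard scores for the
# videotaped offering. The item scores below are invented.
def perfect_agreement(assessor_scores, gold_scores):
    """Proportion of items on which the assessor matches the gold standard."""
    pairs = list(zip(assessor_scores, gold_scores))
    return sum(a == g for a, g in pairs) / len(pairs)

gold = [5, 3, 5, 1, 3, 5, 3, 5, 1, 5]
rater = [5, 3, 5, 3, 3, 5, 3, 5, 1, 5]
rate = perfect_agreement(rater, gold)
print(f"{rate:.0%} ->", "reliable" if rate >= 0.80 else "retest required")  # 90% -> reliable
```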
Each observation, morning and afternoon, utilized a method in which the assessor collected a detailed running record of staff behavior and youth responses during 15-30 minute observation blocks in a cross-section of program offerings led by different program staff. Each rating was based on a total of approximately 90 minutes of observation time. Assessors then used the anecdotal records to score the rubrics that constitute Form A, typically requiring about 60 minutes to convert the anecdotal records into a complete Form A rating. For full-day programs, a distinct Form A rating was produced for the morning and the afternoon sessions. For half-day programs, only the respective morning or afternoon rating was produced.
Assessors also completed a checklist related to basic best practices for three transition periods during the program day. Ratings for the Greetings Index were collected only during morning observations, whereas ratings for the Departures Index were collected only during afternoon observations. The remaining ratings were produced during all observations.
SLPQA Form B. Form B is an interview-based assessment of management practices. The
assessor interviews the program manager and records written responses. Later, this written record is used
to score the Form B rubrics, typically requiring about 30 minutes.
Site Manager Survey. The program manager survey was developed to assess a number of
attributes at each site, including: (a) fidelity of the SLPQI implementation, (b) staff valuation of the
SLPQI and the Summer Learning PQA, (c) any innovations or changes during the program as a result of
receiving the Summer Learning PQA data, and (d) the implementation of management practices regarding
the capacity of the staffing model and school connections related to targeting students based on academic
risk and prior academic performance.
Assessor survey. The assessor survey was developed to better understand successes and
challenges in the assessment process and to gain assessor perspective on the SLPQA. Ninety-nine
external assessors completed an assessor survey via an online data collection system.
Staff interviews. Phone interviews (N = 12 in 2016) were conducted with staff members from
each network. Interviewees were nominated for an interview if their site manager thought they were
making innovative instructional responses to the SLPQI.
Missing Data
Performance data for the 2016 year had little missing data: the 269 PQA Form A assessments included information on all participating sites; the 106 PQA Form B assessments included all sites; the 113 site manager surveys covered 93% of sites; and the 64 assessor surveys covered 97% of sites.
IV. Results for Implementation
This section presents 2016 results for implementation of the SLPQI in 106 summer learning
program sites in three cities. This section describes (a) implementation of SLPQI supports, (b) fidelity
and feasibility of the SLPQI sequence, and (c) staff valuation of the SLPQI process.
Implementation of SLPQI Supports
SLPQI supports are the training and technical assistance necessary for program managers to
implement the work. Participants (N = 203) gave the trainings positive ratings. Eighty-four percent of
participants indicated that the trainings were a good use of their time, 89% indicated that the trainings
were a good fit with their current position, and 82% indicated that they had administrative support to
implement the content of the training. Assessors (N = 58) attended a Summer Learning PQA Reliability
Training. All assessors completed training evaluations and reported that the events were worth their time
and that they either acquired new skills or strengthened skills they already had. Eighty-seven percent of
assessors reported previous experience with the YPQI and Youth PQA. Table 8 describes training
locations, dates, and attendance.
Table 8. Training Events

| Training Event | Location | Date | Attendance |
|---|---|---|---|
| Program Staff Training Events | | | |
| Summer Learning Institute | City A | April 6, 2016 | 33 |
| Summer Learning Institute | City B | April 12, 2016 | 21 |
| Summer Learning Institute | City C | May 13, 2016 | 34 |
| Summer Learning Institute | City C | May 14, 2016 | 32 |
| Quality Coaching | City B | May 3, 2016 | 21 |
| Quality Coaching | City A | May 10, 2016 | 17 |
| Assessor Training Events | | | |
| Assessor Reliability Training | City A | May 5, 2016 | 18 |
| Assessor Reliability Training | City B | May 10, 2016 | 15 |
| Assessor Reliability Training | City C | June 3, 2016 | 12 |
SLPQI Implementation Fidelity and Feasibility
Implementation Fidelity
SLPQI implementation fidelity was assessed by creating a fidelity index to describe overall implementation of four SLPQI elements: Planning, Assessment, Coaching, and Training. The index ranges from 0 to 4 and was created by summing responses to four dichotomous items (where 1 = implemented and 0 = not implemented) corresponding to each of the SLPQI elements. Across the three cities, in 2016, 75% of sites achieved a high level of implementation fidelity, defined as implementing at least three of the four elements (see Table 9). Because the SLPQI was implemented at greater scale in each city in 2016 and used local capacity to produce supports, we interpret this level of implementation as a success benchmark for scaled implementation.[8] SLPQI performance benchmarks are described in Appendix A.

[8] In 2014, when the beta version of the SLPQI was introduced, high fidelity was achieved by 63% of sites (N = 11); in 2015, when the SLPQI supports and coordination were being delivered by the developer (Weikart Center), high fidelity was achieved by 99% of sites.
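As a concrete illustration of how the fidelity index is computed, the sketch below sums four dichotomous indicators per site and applies the three-of-four high-fidelity threshold. The site records and field names are invented for illustration.

```python
# A minimal sketch of the 0-4 fidelity index and the "high fidelity"
# threshold (at least 3 of the 4 elements implemented). Example data invented.
ELEMENTS = ("planning", "assessment", "coaching", "training")

def fidelity_index(site: dict) -> int:
    """Sum four dichotomous indicators (1 = implemented, 0 = not)."""
    return sum(int(bool(site.get(e, 0))) for e in ELEMENTS)

def high_fidelity(site: dict) -> bool:
    return fidelity_index(site) >= 3

sites = [
    {"planning": 1, "assessment": 1, "coaching": 1, "training": 0},  # index 3
    {"planning": 1, "assessment": 0, "coaching": 0, "training": 1},  # index 2
]
share = sum(high_fidelity(s) for s in sites) / len(sites)
print(f"{share:.0%} of sites at high fidelity")  # 50%
```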
For comparison, we include in Table 9 comparable implementation data from the YPQI Study, a randomized trial that produced a Cohen’s d effect size (Cohen, 1988) of d = .55 for the relation between assignment to the group of sites implementing the YPQI and instructional quality (i.e., PQA Form A) following one year of implementation. Fidelity in the SLPQI study was close to the fidelity achieved in the YPQI study treatment group and substantially greater than that achieved by the YPQI study control group. Further, because the lowest fidelity occurred in the City C system, it is important to note that many of the City C sites were new to the SLPQI in 2016, meaning they did not have a 2015 year dedicated to preparing for scale-up. For this reason, communication with the City C sites (that is, communication among summer learning sites, the QIO, and technical partners) was not as tightly coupled as it was in Cities A and B. It is also the case that City C summer learning settings were already characterized, on average, by very high instructional quality (i.e., PQA Form A), so in some cases very high-quality sites may have made rational choices not to fully participate in the SLPQI elements.
Table 9. Comparison of the SLPQI Implementation Fidelity Index with the YPQI Study Treatment and Control Groups

| | 2016 SLPQI: All Sites (N=87) | 2016 SLPQI: City A (N=24) | 2016 SLPQI: City B (N=16) | 2016 SLPQI: City C (N=47) | YPQI: Treatment (N=37) | YPQI: Control (N=42) |
|---|---|---|---|---|---|---|
| % sites 0 practices | 3 | 0 | 6 | 4 | 0 | 0 |
| % sites 1 practice | 3 | 0 | 0 | 9 | 4 | 40 |
| % sites 2 practices | 18 | 17 | 17 | 21 | 13 | 34 |
| % sites 3 practices | 33 | 38 | 33 | 28 | 32 | 10 |
| % sites 4 practices | 42 | 46 | 44 | 38 | 53 | 16 |

Note: Practices include program improvement planning, assessing, training, and coaching.
Feasibility
We asked site managers about the timeliness of trainings and the success of their implementation to address the question of feasibility. In 2016, 83% of site managers indicated that the trainings (Summer Learning Institute, Quality Coaching) were “provided in a timely fashion to meet the needs of your programs,” and 78% of site managers indicated that their site was “able to successfully implement the SLPQI.”
Staff Valuation of SLPQI
With respect to the overall value of the SLPQI, an average of 74% of site managers agreed that participation in the SLPQI was “worth my time and effort” (less than 10 percent disagreed), 78% agreed that the SLPQI “is applicable to my current job position and fits my role,” and 70% indicated that they “have administrative support… to implement the SLPQI.” When asked to describe what was most valuable about the process, most site coordinators mentioned the process of PQA assessment and coaching with the assessor-coach.
Site managers and assessors were also asked about the value of the Summer Learning PQA.
Seventy-eight percent of site managers and assessors agreed that the Summer Learning PQA was “able to
accurately assess the presence of academic practices” at their site. Eighty-eight percent of assessors
agreed that the Summer Learning PQA “was able to capture essential differences in the quality of
programs.”
Table 10 presents a sample of the responses to two questions: “What aspect of your experience with the SLPQI was most valuable?” and “Please share any additional thoughts you may have about any aspect of your experience with the Summer Learning PQI.” In general, comments indicated that staff saw positive value in the SLPQI, particularly in the feedback visit with the assessor-coach. However, it is also clear that “fitting” the intervention to local circumstances is critical to achieving both high implementation fidelity and staff value. All open-ended survey responses are provided in Appendix E.
Table 10. Open-Ended Responses Regarding SLPQI Value and Fit

- The visit and review with the site assessor was extremely valuable to our site. As a team, we were able to ask clarifying questions and receive detailed descriptions on how we could improve our practices.
- It was very helpful to sit down with the assessor and the data. During this time we were able to have a conversation about the strengths and areas of improvement of the programming. I appreciated the time to dialogue and brainstorm ways to strengthen the program offerings.
- I feel that when we participate in the SLPQI the feedback, help and the training myself and my staff receive make us better able to provide a stronger program for all the youth in our community. If at any point in time I need to talk to my coach he would have been available. The support we receive is invaluable and could never be replaced.
- Seeing our program through the eyes of another program coordinator. It was really helpful to hear some things a neutral party noticed--both good and bad--and to be able to use this feedback to help our staff hear alternative ways to do things.
- The training just reaffirmed my philosophy of teaching and learning. It was nice to get the reports after each observation. The reports provided an honest lens from an outside source that has no idea about how we run our program. We were able to adjust as needed.
- It would have been helpful to have two coaching sessions; one at the beginning of the program and one towards the middle.
- The summer is such a fast moving train, that even when the SLPQA was done in the 2nd week of program the results and coaching were not available until the 4th or 5th week of program and the program was finished after the 6th week. I think we will see the value in using those results to influence our school year planning and the planning for next summer, but we were not really able to make changes in the moment.
- Some of the aspects are really difficult and don't actually match up with what the district has asked us to do. For example, the SLPQI places high value on total student choice. The Math for Love curriculum gives a directive for narrowly limiting student choice. We can't do right by both.
- It was an overall positive experience. The assessor was understanding and flexible. I thought he led a very pleasant feedback session in which the teachers came out with a positive outlook.
Quality of Management Practices
The SLPQA Form B includes 31 items in four domains (Planning, Staff Training, Family Connection, and Individualization) that are described in the second panel of Table 2. Median scores for each domain across all of the 2016 summer sites are provided in Figure 2. The median of each variable (domain) is marked by the dark line in the middle of the box. The box represents the interquartile range, which describes where the middle 50% of cases fall around the median (i.e., 25% above and 25% below the median). The distance between the lowest and highest markers delineates the range of scores on the variable, and the numbered circles denote outlier cases. Form B data are best interpreted within the context of local policies and regulatory environments because local policies and regulations tend to vary widely. In other words, the best comparisons for Form B performance are local, and not all indicators in each domain necessarily apply to all organizations, given regional variation in policy and regulations.
With that caveat in mind, scores of 4 or higher indicate that most of the desired management practices included in the domain score were present in the setting, whereas a score of 1 indicates the absence of the practice. Overall, 2016 sites reflected moderately high quality of management practices, with practices supporting individualization scoring lowest. Descriptive data for the 31 Form B items are provided in Appendix Table C-3.
Figure 2. Median Quality of Management Practices in 2016
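For readers who want to reproduce this type of display, the following minimal sketch builds an equivalent box plot with matplotlib. The simulated domain scores are illustrative only, not the study values.

```python
# A minimal sketch of the Figure 2 box-plot display, assuming matplotlib;
# the simulated medians are illustrative, not the study data.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
domains = ["Planning", "Staff Training", "Family Connection", "Individualization"]
# Simulate 106 site-level Form B domain scores on the 1-5 scale.
scores = [np.clip(rng.normal(m, 0.6, 106), 1, 5) for m in (4.3, 4.1, 3.9, 3.4)]

fig, ax = plt.subplots()
ax.boxplot(scores)                   # box = interquartile range; center line = median;
ax.set_xticklabels(domains)          # circles beyond the whiskers = outlier sites
ax.set_ylabel("Form B domain score (1-5)")
plt.show()
```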
V. Results for Quality of Instructional Practices
This section describes instructional practices used in summer learning programs. First, a field-wide interpretation of the results is supported by drawing on the complete sample of summer learning settings across the wide range of program designs, public and private auspices, staff expertise, and connections to school-year content. This broad perspective is useful for developing an appreciation of the need for quality improvement in the summer sector and for identifying specific instructional practices, and the student skill groups those practices support, that may not be supported sufficiently in summer programs as currently conceived.
Second, the results for change in quality from 2015 to 2016 are presented, drawing on data from
46 sites that implemented the SLPQI in both years. According to the SLPQI theory of change,
implementation of the SLPQI at high fidelity should improve the quality of instruction available in
summer settings where SLPQI is implemented over multiple summer cycles.
Quality of Instructional Practice, Field Perspective
To best understand “quality in the field” of summer learning settings, we constructed a data file consisting of the 425 independent observational ratings available across the 2015 and 2016 program years.
Four Domains of Quality, Plus Math and Literacy Scales
The third panel of Table 2 refers to measures of performance at the POS level. Measures at this
level describe staff practices assessed in each of four domains of instructional quality: Safe Environment,
Supportive Environment, Interaction, and Engagement. POS-level measures of instructional quality also
include indexes for the presence of domain-specific Math Practices and Literacy Practices.
Figure 3 presents the SLPQA median scores for the four domains and two academic practice
indexes. These scores represent overall quality of summer learning services available in the three cities
in summer 2016. A score of 4 or higher indicates that most of the desired instructional practices included
in the domain score were present in the setting, whereas a score of 1 indicates the absence of the practice.
On average, summer learning settings are safe and support skill building with both more exploratory and
more direct-scaffolding types of instruction. However, social and executive skills (e.g., planning,
reflection) are less well supported. Finally, not all summer learning programs included explicit math or
literacy practices. Descriptive information for all item, scale, domain, and total scores associated with
Figure 3 is provided in Appendix C.
Figure 3. Quality of Instructional Practices, All Years, Academic and Enrichment (N = 425)
Low-Scoring Items
Table 11 presents a selection of the lowest-scoring items across the 425 ratings for 2015 and 2016. Staff practices identified in the table were not present during 30% or more of those sessions. The infrequent opportunities for “examine actions and consequences,” “suggest solutions,” “express in writing,” “staff seek youth input,” “make plans,” “provide feedback,” and “intentional reflection” suggest that summer programs could seek to add practice and curriculum elements related to means-ends thinking, reflection, and other forms of executive functioning. Infrequent opportunities for “use reasoning to evaluate,” “link concrete examples,” and “identify learning strategy” suggest the need for greater emphasis on learning strategies.[9]
Table 11. Low-Scoring SLPQA Items

| Item | Short Name | Percentage Scoring "1" |
|---|---|---|
| Y.RC.3** | (Y) Youth examine actions and consequences | 73% |
| Y.Ld.3 | (Y) All youth lead group | 72% |
| SA.MF.4** | (SA) Children suggest solutions | 64% |
| Y.RC.2** | (Y) Staff seeks youth input | 53% |
| Lit.3 | Staff encourage expression in writing | 53% |
| Y.Pn.1 | (Y) Opportunities to make plans | 52% |
| Rf.3 | Structured opportunities to provide feedback | 47% |
| Y.RC.4** | (Y) Staff acknowledges and follows up | 46% |
| Math.3 | Use reasoning to evaluate | 45% |
| Math.4 | Linking concrete examples | 43% |
| A.LL.2 | (A) Staff has youth identify learning strategy | 42% |
| Rf.1 | Intentional reflection | 41% |
| Y.Co.2 | (Y) Interdependent roles | 37% |
| SA.MF.1** | (SA) Staff acknowledges feelings | 37% |
| Y.AE.4 | (Y) Tangible products or performances | 36% |
| Math.5 | Support the conveying of concepts | 36% |
| SA.Ld.2 | (SA) Opportunities to help another child | 33% |
| SA.MF.2** | (SA) Staff asks children to explain situation | 32% |
| SL.Be.6 | (SL) Values communicated and integrated | 31% |

* This item was scored in less than 50% of offerings.
** This item was scored in less than 25% of offerings.

[9] Appendix C, Table C-2, presents information for the PQA Form A measures of quality for greetings, transitions, and departures, another important aspect of classroom quality and student experience.
Profiles of Instructional Practices
Although the prevalence of specific types of practice is informative for thinking about program design and improvement goals, other policy-relevant questions can be addressed best by considering the prevalence of patterns, or profiles, of instructional practices; that is, by using all of the data on instructional practices simultaneously to identify, for example, groups of sites that do not achieve high quality on any of the measures. These lower-performing sites may not be producing positive effects on student learning and are obvious targets for lower-stakes QIS policies.
Figure 4 presents results from a cluster analysis[10] using data for instructional quality in the three domains described in the third panel of Table 2 (i.e., Supportive Environment, Interaction, and Engagement). The distribution of the three profiles of instructional practices shown in Figure 4 indicates that 20% of summer settings were characterized by the lowest-performing profile and 37% were characterized by the highest-performing profile.
[10] A hierarchical cluster analysis (using Ward’s method on squared Euclidean distances), followed by k-means relocation analysis, was conducted using the ROPstat (version 2.0) statistical package for pattern-oriented analyses (Vargha, Torma, & Bergman, 2015) to identify relatively homogeneous subgroups of sites based on profiles of the three instructional PQA domain scores in 2016. The analysis revealed that a 3-cluster solution was the most parsimonious and yielded meaningful profile interpretations. Full analytic details are available upon request.
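The analysis itself was run in ROPstat, which is not scriptable from Python. For readers who want to approximate the two-stage procedure with open-source tools, the sketch below shows a rough analogue (SciPy's Ward linkage to seed centroids, then k-means relocation via scikit-learn) on simulated domain scores; it is an approximation, not the study code.

```python
# A rough analogue of the footnote's two-stage clustering, assuming NumPy,
# SciPy, and scikit-learn are available. The domain scores are simulated.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# 106 sites x 3 instructional domains (Supportive Environment, Interaction,
# Engagement), each scored on the 1-5 PQA scale.
X = rng.uniform(1, 5, size=(106, 3))

k = 3
tree = linkage(X, method="ward")                 # Ward's minimum-variance criterion
hier_labels = fcluster(tree, t=k, criterion="maxclust")
seeds = np.vstack([X[hier_labels == c].mean(axis=0) for c in range(1, k + 1)])

# k-means "relocation" step: refine the hierarchical solution.
km = KMeans(n_clusters=k, init=seeds, n_init=1, random_state=0).fit(X)
print(np.bincount(km.labels_))                   # sites per profile
```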
Figure 4. Profiles of Instructional Practices (cluster profiles Low, Moderate, and High, plotted on the 1-5 PQA scale for the Supportive Environment, Interaction, and Engagement domains)
In Table 12, we present characteristics of the three 2016 profiles of instructional practices. Higher-performing sites tend to be located in City C, tend not to offer academic enrichment in the afternoon, tend to be located in schools, and are more informed about students’ school-year achievement. Interestingly, the high-quality instruction subgroup reported lower levels of SLPQI implementation than the lower-quality subgroups, possibly indicating a rational choice by higher-performing sites to focus less on improvement.[11] The high-performing sites are considered high because scores of 4 or more in all three domains indicate that most of the instructional practices identified in the Summer Learning PQA are present on an average day of programming.
Lower-performing sites tended to be located in City A, tended to offer full-day programming with enrichment, were most frequently located in community-based organizations, and were moderately informed about students’ school-year achievement. The low-performing sites are of concern because they fail to provide basic supports for skill building, positive emotionality, and executive functions.

[11] However, also impactful here was the relative latecomer status of City C to the project (joining the SLPQI study in 2016), which may have had the effect of lowering implementation fidelity because summer sites in City C were less well connected to the technical partners or the quality intermediary organization.
Table 12. Profiles of Instructional Practices by City, Program Structure, Auspice, Benchmarks for SLPQI Fidelity, and School Connection

| | % within Cluster 1 - Low | % within Cluster 2 - Moderate | % within Cluster 3 - High |
|---|---|---|---|
| City A | 57 | 22 | 5 |
| City B | 24 | 22 | 8 |
| City C | 19 | 57 | 87 |
| Full Day Only | 67 | 61 | 36 |
| Morning Only | 5 | 24 | 59 |
| CBO Sites | 91 | 80 | 60 |
| School District Sites | 10 | 20 | 41 |
| Above School Connection Benchmark | 5 | 17 | 42 |
Change in Quality of Instruction from 2015 to 2016
The Instructional Total Score (ITS) for the PQA Form A is a composite score constructed from 42 items in 12 scales across three domains; the average of these items produces the total score. With sufficient observation time, this rating can be used to reliably differentiate between summer settings and between time points for the same setting (Smith, 2013). This section presents results for analyses of change in the ITS from 2015 to 2016 in a subsample of 46 sites for which we had data for both years. The ITS increased significantly, from 3.58 in 2015 to 3.82 in 2016 (p = .004; d = .64). ITS scores in 2015 and 2016 were normally distributed, displayed homogeneous variances, and are shown in Figure 6.
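A minimal sketch of this change analysis follows, using simulated ITS scores for 46 matched sites rather than the study data. The report does not state which Cohen's d denominator was used; the sketch uses the standard deviation of the paired differences, so its output will not exactly match the published values.

```python
# A minimal sketch of a paired year-over-year comparison with a Cohen's
# d-type effect size; the site-level ITS scores are simulated, not real.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
its_2015 = rng.normal(3.58, 0.45, size=46)               # simulated 2015 ITS
its_2016 = its_2015 + rng.normal(0.24, 0.35, size=46)    # simulated 2016 ITS

t, p = stats.ttest_rel(its_2016, its_2015)               # paired t-test over sites
diff = its_2016 - its_2015
d = diff.mean() / diff.std(ddof=1)                       # d for paired differences
print(f"mean change = {diff.mean():.2f}, p = {p:.3f}, d = {d:.2f}")
```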
To better understand which sites improved the most, cluster analysis methods were used to identify four performance subgroups in the 2015 data.[12] As shown in Figure 7, the average ITS for sites in the Low cluster (n = 8) increased from 2.79 to 3.71 (p < .001; d = 2.79). Comparison of profile memberships across 2015 and 2016 indicates that, on average, sites in the two lower-performing clusters, particularly the lowest-performing cluster, increased their ITS, whereas sites in the two higher-performing clusters stayed about the same.
[12] A hierarchical cluster analysis (using Ward’s method on squared Euclidean distances), followed by k-means relocation analysis, was run using the ROPstat (version 2.0) statistical package for pattern-oriented analyses (Vargha et al., 2015) to identify low-performing sites based on the scores of the four PQA domains in 2015. The analysis revealed that a 4-cluster solution was the most parsimonious and yielded meaningful profile interpretations. Full analytic details are available upon request.
Figure 6. Instructional Total Score for sites in 2015 and 2016
Figure 7. Instructional Total Score Change from 2015 to 2016 by Profiles of Instructional Practices
Instructional Innovation During the SLPQI
In order to better understand the effect of the SLPQI on the quality of instruction (e.g., teaching practices, curriculum, or youth’s learning experiences), we conducted qualitative analyses of interview data from staff. This interview information is important given the short duration of the summer session and the rapid turnaround time necessary for participants to receive the Summary Report and the assessor-coach visit early in the summer session. In 2016, 64% of site managers said this feedback visit was the most valuable part of the SLPQI, and 78% reported coaching their staff based on the results of the Summary Report. When asked “How did instruction change as a result of participation in the SLPQI?”, 80% of the site managers who coached staff reported that there were resulting innovations in instruction.
Twelve instructional staff were either self-nominated or recommended by their site manager to participate in an interview at the end of the summer 2016 sessions. Four staff from each of the three cities were selected from a pool of nominees from sites that also had high SLPQI fidelity ratings. The specific interview questions focused on staff experience with the SLPQI and staff assessment of the SLPQI’s impact on instruction during the 2016 summer session. Our qualitative method involved three steps: (a) conducting a structured interview with each staff member, (b) conducting thematic analyses to summarize the major types of innovation that staff described, and (c) identifying at least one primary instructional innovation discussed in each of the twelve interviews.
Table 13 summarizes the results of the thematic analyses. Across the twelve interviews, some of the changes reported involved improvements in staff planning practices and learning experiences. However, all of the staff interviewed were able to describe benefits for youth resulting from the innovations or adjustments they made. Several staff reported improved behavior; for example, the innovation or adjustment cut down on behavioral issues because youth had a role, or it reduced recess and lunch conflicts because staff were more actively involved and supervising. Some staff reported that youth experienced a greater sense of belonging or had more fun.
The specific themes, and their definitions in terms of the interview content, suggested that participation in the SLPQI provided the following benefits:
- Staff saw the need to be more intentional about planning their objectives and trainings.
- The SLPQI incentivizes learning because it raises standards and creates opportunities for intentional reflection.
- A common framework helped staff to discuss and evaluate their program using a common language.
- More opportunities for student choice and voice improved student engagement, behavior, and retention.
Table 13. Themes and Exemplary Quotations from Staff Interviews

Incentivizes intentional planning
- “It (SLPQI data) kind of put a mirror in front of my eyes saying ‘Hey guys, that’s what you do.’ And we said, ‘Okay, we would like it to look a little different.’”
- “Just because you know everything was so hectic in the beginning that having another set of eyes really made us see what we were missing.”
- “But after the evaluation just to have specific objectives and specific goals for them (assistants) to focus on was really good. …It definitely grew deeper relationships with the kids and be more purposeful with their learning.”
- “We found that we need to be more intentional about the kind of training that we offer our staff prior to the summer just to make sure some of these missed areas are included prior to the summer starting just so everybody is on the same page.”

Incentivizes staff learning and innovation
- “The reason we are so confident that we can do Math better is because Reading worked (SLPQI goal from previous year).”
- “I appreciate the standards that it sets. And it has introduced me to some better practices.”
- “The YPQA process has really taken me out of my comfort zone as far as teaching things and doing activities that the kids tell me they enjoy, even if it is not something I’m very good at. I take the time to learn it now so I make it so I can teach it.”
- “It was a good way to position me to reflect on my own teaching, which doesn’t always happen during the summer programs because they tend to be so short. It feels like it is just starting and then it is over so having some intentional reflection felt really good.”

Common framework for staff and students
- “The whole assessment allowed me to name, label and identify those points of my program, like the good, the bad, what do you work on. And it allowed me to just kind of be more intentional about improving my program.”
- “It’s a good way to have a common language among youth workers… I like having a similar language and being intentional about how we program and what the benefits are. I like the commonality and the intentionality it creates.”
- “(The Assessments) started conversation and it kind of brought us on board… I think that really kind of opened our eyes to say ‘Okay we have this tool we can use and we can do it on an in level and we can have people externally who come out and let us know what they see.’”

Improvement in student engagement, behavior, and retention
- “One of the first things that I see when the kids really enjoy something is just a huge drop in needing to manage behavior.”
- “I definitely think the students felt more engaged in the class. I don’t know how it affected them overall but I think it affected their kind of behavior in the class… They were more willing to participate.”
- “I have been a part of summer program and attendance would dwindle a bit but it remained steady. And I think partly because you know how we were able to use what we observed (data) and stuff to change.”
- “They felt more confident about leading activities. So we did a lot of you know one-on-one conversations. Especially me, the director, sat with the youth and had one-on-one conversations about improvement and how it goes, what’s wrong, what’s good, how we can improve what we do and give them lots of feedback.”
Table 14 provides a summary of specific instructional-quality improvements (e.g., teaching practice, curriculum, and youth’s learning experience) resulting from each interviewee’s involvement with the SLPQI. For each improvement, the table provides a description of the specific innovation, a relevant quotation for that innovation, and the corresponding PQA domain to which the innovation applies. From these alignments, it is possible to extrapolate to the student skills that the innovation supports. It appears that the primary innovations were focused on students’ executive function skills (e.g., reflection and planning), motivation management skills (e.g., choice and leadership), and basic emotional regulation (e.g., emotional safety and belonging).
Table 14. Specific Improvements in the Quality of Instruction During the Summer Session

| Innovation | Description of Innovation | Impacts on teaching practice, curriculum, and youth's learning experience | Alignment with PQA |
|---|---|---|---|
| Teacher-student combined book creation | Teachers collaborate with students to create their own books for reading. | "The book idea was a great way to encourage them." "Once you make it (books) their own they will be more likely to read." | Leadership, Collaboration, Planning, Choice |
| Tally | Students reflect on the completed session with three choices: "They did not like it," "They were okay with it," or "They loved it." | | Reflection, Active Engagement |
| Line on a Barometer | Students reflect on their session by lining up on a board barometer divided into "loved it," "I am neutral," and "I don't get it/I didn't like it." | "Trying to improve the environment for our students and make them feel welcome and have those good relationships with adults." | Reflection, Active Engagement |
| Surveys, registration sign-ups | Students are provided with content/project choices through surveys and sign up for registration with no set of required classes. | "I think the students felt more engaged in the class…but I think it affected their kind of behavior in class." "Affected the participation level of students who were just used to sitting on sidelines." | Choice, Planning |
| Round Robin | Students do a Round Robin reflecting on things they have learned. | | Reflection, Active Engagement |
| Short journals | Short journals on things learned when the content is dense; discussion of what they have learned and how to use it in day-to-day lives. | "Provide opportunities for their curriculum development time. We did not really have that set aside prior to this assessment so we were able to kind of implement more time in the day." | Active Engagement, Reflection |
| Techperts | A group of students assigned as Techperts who are the teacher's first defense when it comes to small technical problems. | | Adult Partners, Leadership |
| Popcorn | One idea of what you reflected on today; if the content is dense, the reflection strategy is more in depth. | | Reflection, Active Engagement |
| Money Matters | Learning about credit, understanding bank accounts, savings, and checking. Part of the regular school year; given more focus during the summer after the assessment. | "Learning about credits, understanding bank accounts, savings and checking accounts." | Active Engagement, Planning, Adult Partners |
| Learn five names | Teachers keep a notebook to write at least five names of students and, as they learn them, cross them out to learn the next five. | | Warm Welcome, Emotional Safety, Belonging |
| Cahoot quiz | Students create their own quiz using class content and compete with each other. | | Planning, Choice, Active Engagement |
| Legos | Students build their group Legos based on their choice and solve a problem given by the teacher in connection to what they are building. | "The YPQA process has really taken me out of my comfort zone as far as teaching things and doing activities that the kids tell me they enjoy even if it is not something I'm very good at. I take the time to learn it now so I make it so I can teach it." | Active Engagement, Choice, Planning, Collaboration, Skill Building |
Quality and Academic Skill Growth
According to the SLPQI theory of change, students who participate in summer settings with higher-quality instruction, as defined by PQA Form A, will gain more academic skills than students who participate in lower-quality settings. In one of the study cities in 2015 (n = 30 summer classrooms) and 2016 (n = 60 summer classrooms), several academic skill assessments were administered to summer students at baseline and a second time point. Multiple observational ratings were also produced for each sample of classrooms, producing more reliable information about instructional practices. Findings to date for both the 2015 and 2016 summer sessions suggest that participating in high-performing summer classrooms (e.g., the High profile shown in Figure 4) results in greater skill gains in both math and literacy compared to participating in lower-performing summer classrooms (e.g., the Low profile shown in Figure 4). Detailed findings are available in two reports (see note 7). Additional findings will follow receipt of school-day achievement, grades, and behavior data for the 2016-2017 school year.
VI. Discussion of Findings and Recommendations
The Summer Learning Program Quality Intervention (SLPQI) is a continuous improvement
intervention for summer learning systems and settings. The intervention includes: (a) standards and
measures for high-quality instructional practice and student skill growth anchored by the Summer
Learning PQA, (b) data products and technology that support meaningful feedback to summer managers
and teachers, (c) a plan-assess-improve cycle adapted to operations at each summer site, and (d) training
and technical assistance necessary to implement the prior three parts. The SLPQI and Summer Learning
PQA focus on instructional practices that build student skills in summer and increase school success in
subsequent school years.
The SLPQI has been the subject of a four-year design study involving 152 summer learning providers in seven cities. In the final year of the study, the SLPQI was implemented in three citywide
summer learning networks in Denver, CO; St. Paul, MN; and Seattle, WA (N = 106 sites). This final
report presents final specification of the SLPQI design, supports, measures, and performance benchmarks
for implementation fidelity, instructional quality, and student skill growth.
Key Findings from 2016
The SLPQI was implemented at moderate to high fidelity, at scale, in three citywide systems with
local provision of training and technical assistance supports. The proportion of sites implementing the
SLPQI at high fidelity (i.e., in at least three of the four planning, assessing, coaching, and training steps)
was high in all three systems. In each city, partnerships of a local quality intermediary organization, the
public school district, city agencies, and numerous community-based providers developed sufficient
capacity to support the intervention at scale in multiple sites. School districts were sufficiently connected
to private providers to supply information about students’ success in the prior school year to a majority of
non-school sites.
Summer program staff positively valued the SLPQI; in particular, the assessor-coach role.
Participants in the SLPQI (e.g., system leaders, site managers, and assessors) felt that the Summer
Learning PQA successfully differentiated between higher- and lower-quality settings and that
implementation of the SLPQI was a good use of their time and a good fit with their work. In particular,
staff valued the assessor-coach who observed, generated performance feedback, and provided coaching.
Performance data indicate that instructional quality and student outcomes improved as predicted
by the SLPQI theory of change. Performance data from the three citywide systems indicate that
instructional quality improved from 2015 to 2016. Lower-performing sites improved the most, whereas
higher-performing sites sustained high quality over two years. Instructional innovations were focused on
areas of low quality (e.g., student management of their own executive skills, motivation, and emotions)
and, importantly, these are skills that support academic learning in all contexts. In the one city that
collected academic performance data, students in higher-quality summer settings had greater academic
skill gains in both 2015 and 2016 compared to students participating in lower-quality summer settings.
Recommendations
In each city, a partnership of quality intermediary organizations, public school districts, city
recreation departments, and numerous community-based providers joined to improve the quality of
summer instructional services over the course of two summers. Based on this experience and the study
findings, we offer the following three recommendations related to dissemination of the SLPQI:
Disseminate the SLPQI to partnerships between regional funders, OST intermediary
organizations, school districts, and networks of summer program providers. The SLPQI is designed for
use with public-private partnerships that include providers, a quality intermediary organization, and
funders. The SLPQI can be scaled quickly and efficiently in cities that have these summer partnerships in
place.
Disseminate the SLPQI as a summer system-building initiative for individual school districts. The
SLPQI requires coordinated action from system-level actors, making the intervention well suited to building summer service systems. In each city participating in the study, the network of service providers included
public schools, city agencies, community-based organizations, quality improvement organizations,
funders, and students and families who used the summer services. Adopting the SLPQI brought these actors together around a common vision for summer instruction and coordinated both action and substantial resources among them to deliver that vision. The SLPQI is a method for building
summer learning systems that should be valuable for school districts interested in building a summer
learning partnership as part of their ESSA (Every Student Succeeds Act) compliance plan.
Weikart Center should continue to seek support for validation work on the Program Quality
Assessment. PQA Form A observational measures are ready for widespread use to help programs identify
staff training needs and develop effective program improvement plans. However, the SLPQI design study
has also dovetailed with work on a PQA Form A reconfigured to address social and emotional learning
(SEL). We are now in a position to validate a new version of the PQA measure that would extend from
emphases on exploratory, direct skill scaffolding, and learning strategy methods present in the Summer
Learning PQA to include assessment of practices for students who have had difficult SEL histories.[13]
Pursue funding for an efficacy trial. Based on the prior four-year design study sequence (see footnote 1, above), we have met all criteria recommended as a foundation for an efficacy trial design (IES, 2013).[14] The SLPQI is ready to be tested using a randomized design, and because the structure of summer learning experience is much simpler than school-year learning (i.e., there is no need to tease apart the different effects of afterschool and school-day learning), summer programming is an ideal place to test the impact of access to high-quality instruction on academic skills and on the SEL skills that support academic learning.

[13] See the discussion at the end of Appendix C.

[14] Efficacy research should be justified by one or more of the following: (a) empirical evidence of the promise of the intervention from a well-designed and implemented pilot study (e.g., a study conducted as part of a design and development project); (b) empirical evidence from at least one well-designed and implemented Early-Stage or Exploratory Research study supporting all the critical links in the intervention’s theory of action; (c) evidence that the intervention is widely used, even though it has not been adequately evaluated to determine its efficacy; or (d) if the intent is to replicate an evaluation of an intervention with a different population and there is evidence of favorable impacts from a previous well-designed and implemented efficacy study and justification for studying the intervention with the new target population.
References
Augustine, C. H., McCombs, J. L., Pane, J. F., Schwartz, H. J., Schweig, J., McEachin, A., & Siler-Evans,
K. (2016). Learning from summer: Effects of voluntary summer learning programs on low-
income urban youth. RAND Summer Learning Series. Santa Monica, CA: RAND.
Alexander, K. L., Entwisle, D. R., & Olson, L. S. (2007). Lasting consequences of the summer learning
gap. American Sociological Review, 72, 167-180.
Arbreton, A., Sheldon, J., Bradshaw, M., Goldsmith, J., Jucovy, L., & Pepper, S. (2008). Advancing achievement: Findings from an independent evaluation of a major after-school initiative: INSIGHT: Lessons learned from the CORAL initiative. San Francisco: The James Irvine Foundation and Public/Private Ventures.
Borman, G. D., & Dowling, N. M. (2006). Longitudinal achievement effects of multiyear summer school: Evidence from the Teach Baltimore randomized field trial. Educational Evaluation and Policy Analysis, 28(1), 25-48.
Boss, S., & Railsback, J. (2002). Summer school programs: A look at the research, implications for
practice, and program sampler (pp. 1-43). Washington, DC: Northwest Regional Educational
Laboratory.
Chaplin, D., & Capizzano, J. (2006). Impacts of a Summer Learning Program: A Random Assignment
Study of Building Educated Leaders for Life (BELL). Retrieved from http://www.urban.org/.
Cooper, H., Nye, B., Charlton, K., Lindsay, J., & Greathouse, S. (1996). The effects of summer vacation
on achievement test scores: A narrative and meta-analytic review. Review of Educational
Research, 66(3), 227-268.
Cronbach, L. J., Nageswari, R., & Gleser, G. C. (1963). Theory of generalizability: A liberation of
reliability theory. British Journal of Statistical Psychology, 16, 137-163.
Czajkowski, S. M., Lynch, M. R., Hall, K. L., Stipelman, B. A., Haverkos, L., Perl, H., ... & Shirley, M.
C. (2016). Transdisciplinary translational behavioral (TDTB) research: Opportunities, barriers,
and innovations. Translational Behavioral Medicine, 6(1), 32-43.
Gershenson, S. (2013). Do summer time-use gaps vary by socioeconomic status? American Educational
Research Journal, 50(6), 1219-1248. doi:10.3102/0002831213502516
Ilfeld, E. M. (1996). Learning comes to life: An active learning program for teens. Ypsilanti, MI:
High/Scope.
Institute of Education Sciences. (2013). Common guidelines for education research and development: A report from the Institute of Education Sciences, U.S. Department of Education and the National Science Foundation. Retrieved from https://ies.ed.gov/.
Li, J., & Julian, M. M. (2012). Developmental relationship as the active ingredient: A unifying working
hypothesis of "what works" across intervention settings. American Journal of Orthopsychiatry, 1-
14.
Linnenbrink, E. A. (2007). The role of affect in student learning: A multi-dimensional approach to considering the interaction of affect, motivation, and engagement. In P. A. Schutz & R. Pekrun (Eds.), Emotion in education (pp. 107-124). San Diego, CA: Elsevier.
Martin, B., & Reigeluth, C. M. (1999). Affective education and the affective domain: Implications for
instructional-design theories and models (Vol. II). Mahwah, NJ: Erlbaum.
Marzano, R. J. (1998). A theory-based meta-analysis of research on instruction. Aurora, Colorado: Mid-
continent Regional Educational Laboratory.
Matsudaira, J. D. (2013). Summer school and student achievement in the United States. In J. Hattie & E. M. Anderman (Eds.), International guide to student achievement (pp. 164-166). New York: Routledge.
McCombs, J. S., Augustine, C. H., & Schwartz, H. L. (2011). Making summer count: How summer
programs can boost children's learning. Santa Monica, CA: RAND.
McCombs, J. S., Pane, J. F., Augustine, C. H., Schwartz, H. L., Martorell, P., & Zakaras, L. (2014).
Ready for Fall? Near-Term Effects of Voluntary Summer Learning Programs on Low-Income
Students’ Learning Opportunities and Outcomes. Santa Monica, CA: RAND.
Naftzger, N. (2014). A summary of three studies exploring the relationship between afterschool program
quality and youth outcomes. Paper presented at the Ready by 21 National Meeting, Covington,
KY.
Naftzger, N., Manzeske, D., Nistler, M., Swanlund, A., Rapaport, A., Shields, J., . . . Sugar, S. (2013).
Texas 21st century community learning centers: Final evaluation report. Naperville, IL:
American Institutes for Research.
Naftzger, N., Tanyu, M., & Stonehill, R. (2010). The impact of self-assessment and quality advisor
support on afterschool program quality: Summary of year three findings from WASCIP quality
advisor study. Naperville, IL: Learning Point Associates.
Naftzger, N., Vinson, M., Manzeske, D., & Gibbs, C. (2011). New Jersey 21st century community
learning centers (21st CCLC) impact report 2009-2010. Naperville, IL: American Institutes for
Research.
Newhouse, C., Neely, P., Freese, J., Lo, J., & Willis, S. (2013). Summer matters: How summer learning strengthens students' success. Oakland, CA: Public Profit.
Oden, S., Kelly, M. A., Ma, Z., & Weikart, D. P. (1992). Challenging the potential: Programs for
talented disadvantaged youth. Ypsilanti, MI: High/Scope.
Ramaswamy, R., Gersh, A., Sniegowski, S., McGovern, G., & Smith, C. (2014). Summer learning
program quality assessment: 2013 Phase I pilot report. Ypsilanti, MI: Weikart Center for Youth
Program Quality.
Ramaswamy, R., Smith, C., Hillaker, B., Jones, M. M., Mauck, S., McGovern, G., . . . Sennaar, K.
(2017). Summer Learning Program Quality Intervention Handbook. Ypsilanti, MI: Weikart
Center for Youth Program Quality.
Raudenbush, S., & Sampson, R. (1999). Assessing direct and indirect effects in multilevel designs with
latent variables. Sociological Methods & Research, 28(2), 123-153.
Roderick, M., Engel, M., & Nagaoka, J. (2003). Ending social promotion: Results from summer bridge.
Chicago, IL: Consortium on Chicago School Research.
Seidman, E. (2012). An emerging action science of social settings. American Journal of Community
Psychology, 50(1-2), 1-16.
Smith, C. (2013). Moving the needle on “moving the needle:” Next stage technical guidance for
performance based accountability systems in the expanded learning field with a focus on
performance levels for the quality of instructional services. Ypsilanti, MI: Weikart Center for
Youth Program Quality.
Smith, C., & Akiva, T. (2008). Quality accountability: Improving fidelity of broad developmentally
focused interventions. In B. Shinn & H. Yoshikawa (Eds.), Towards positive youth development:
Transforming social settings (pp. 192-212). New York: Oxford University Press.
Smith, C., Akiva, T., Sugar, S., Lo, Y. J., Frank, K. A., Peck, S. C., & Cortina, K. S. (2012). Continuous
quality improvement in afterschool settings: Impact findings from the youth program quality
intervention study. Washington, DC: Forum for Youth Investment.
Smith, C., Ramaswamy, R., Gersh, A., & McGovern, G. (2015). Summer Learning Program Quality
Intervention (SLPQI): Phase II Feasibility Study. Ypsilanti, MI: Weikart Center for Youth
Program Quality.
Smith, C., Hallman, S., Hillaker, B., Sugar, S., McGovern, G., & Devaney, E. (2012). Development and
early validation evidence for an observational measure of high quality instructional practice for
science, technology, engineering and mathematics in out-of-school time settings: The STEM
supplement to the Youth Program Quality Assessment. Ypsilanti, MI: Weikart Center for Youth
Program Quality.
Smith, C., Helegda, K., Ramaswamy, R., Hillaker, B., McGovern, G., & Roy, L. (2015). Quality-
Outcomes Study for Seattle Public Schools Summer Programs: Summer 2015 Program Cycle.
Ypsilanti, MI: Weikart Center for Youth Program Quality.
Smith, C., Roy, L., Peck, S. C., Helegda, K., & Macleod, C. (2016). Quality-Outcomes Study for Seattle
Public Schools Summer Programs, Summer 2016 Program Cycle, Interim Findings. Ypsilanti,
MI: Weikart Center for Youth Program Quality.
Smith, C., Ramaswamy, R., Hillaker, B., Helegda, K., & McGovern, G. (2015). Summer Learning
Program Quality Intervention Phase III Interim Report. Ypsilanti, MI: Weikart Center for Youth
Program Quality.
Smith, C., Roy, L., Peck, S. C., Moxley, K., McGovern, G., & Helegda, K. (2017). Evaluation of Quality Improvement System Performance: Oklahoma 21st Century Community Learning Centers. Ypsilanti, MI: Weikart Center for Youth Program Quality.
Spielberger, J., & Halpern, R. (2002). The role of after-school programs in children's literacy
development. Chicago, IL: Chapin Hall Center for Children at the University of Chicago.
Vargha, A., Torma, B., & Bergman, L. R. (2015). ROPstat: A general statistical package useful for
conducting person-oriented analyses. Journal for Person-Oriented Research, 1, 87-98.
Wheeler, K. A., & Proche, M. (2011). Evaluation Results for the Summer Literacy and Learning
Promotion Initiative. Retrieved from http://www.researchconnections.org/.
Yohalem, N., Devaney, E., Smith, C., & Wilson-Ahlstrom, A. (2012). Building citywide systems for
quality: A guide and case studies for afterschool leaders. Washington, DC: Forum for Youth
Investment.
Yohalem, N., Ravindath, N., Bertoletti, J., Smith, C., Wallace, L., & Sugar, S. (2010). Making quality
count: Lessons learned from the Ready by 21 Quality Counts initiative. Washington, DC: Forum
for Youth Investment.
Appendix A 2016 SLPQI Performance Benchmarks
Tables A-1 and A-2 present performance benchmarks at the system (e.g., network), organization
(e.g., site), and point-of-service (e.g., classroom) levels. These key performance indicators empirically
represent the cascade of effects described in the SLPQI theory of change: (1) system-level supports, (2)
site-level implementation of the SLPQI cycle, (3) prevalence of instructional practices at the point of
service, (4) student skill gains during the summer session, and (5) student skill gains in subsequent
school-day classrooms.
When aggregated to the system level, these benchmarks provide a policy-relevant perspective on
regional performance and help leaders strategize about investment and improvement. The system level of
aggregation can also be used for normative comparison with summer systems in other places.
When disaggregated to the site level, the benchmarks provide within-system performance norms
for implementation fidelity of the SLPQI, the quality of instruction available to specific groups of
students, and the proportion of those students making gains in desired skills. Appendix Table A-1
summarizes the benchmarks across the three systems (at the "field" level), and Table A-2 provides
benchmark data for each of the three summer systems for the 2016 year.
Table A-1. Multi-level Performance Objectives, Data Sources, and Benchmarks

Performance Objective | Benchmark | Data Source
System-Level: rater reliability; report timeliness | 100% of raters reliable; 100% of reports on time | Project records
Organization-Level: SLPQI fidelity; staff valuation; school year connection (recruit academic risk, review academic skill data) | Implement 3 of 4 SLPQI parts; site manager score > 4.5; site manager score > 3.67 | Site manager survey
Point of Service-Level: comprehensive rating for quality of instructional setting and practices; an overall quality rating for the site | PQA ITS Score > 4.1; score change > .33 if in low quartile at baseline | PQA Form A
Student Skill Change: rate student skills at two time points and describe growth | Effect size depends on skill assessment; Cohen's d type effect size range d = .3-.7 | Summer data
Student Skill Transfer: compare students by exposure to high-quality, low-quality, or no summer program | Effect size depends on skill assessment; TBD | School data
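The student skill change benchmark above is expressed as a Cohen's d type effect size. As a minimal sketch of that computation (hypothetical ratings, not project data; the exact estimator used would depend on the skill assessment, as Table A-1 notes), a standardized mean difference between ratings at two time points can be computed as follows:

```python
import math

def cohens_d(pre, post):
    """Standardized mean difference (Cohen's d) using the pooled standard deviation.

    A generic d-type estimator; the estimator actually used would depend on
    the skill assessment, per Table A-1.
    """
    n1, n2 = len(pre), len(post)
    m1, m2 = sum(pre) / n1, sum(post) / n2
    # Sample variances (denominator n - 1).
    v1 = sum((x - m1) ** 2 for x in pre) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in post) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m2 - m1) / pooled_sd

# Hypothetical ratings of one skill at two time points (1-5 scale).
pre = [2.0, 3.0, 2.5, 3.5, 2.0, 3.0]
post = [3.0, 3.5, 3.0, 4.0, 2.5, 3.5]
print(round(cohens_d(pre, post), 2))  # 1.03 for this toy data
```

A d falling in the .3 to .7 range would meet the summer skill-change benchmark above.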
A-2
Design Study for the Summer Learning Program Quality Intervention (SLPQI): Final-Year Intervention Design and Evaluation Report
Table A-2. Benchmarks for SLPQI by City

Benchmark Name | Raw Score, Whole Sample | Raw Score by City | Benchmark Definition Notes
Implementation Fidelity (N = 104) | 70% | City A 79%; City B 79%; City C 62% | Implementation Index; percent of programs that scored a 3 or above based on the sum of 4 implementation indicators
Instructional Quality (N = 425) | 4.12 | City A 3.56; City B 4.01; City C 4.26 | SLPQA Instructional Total Score for all forms; bottom of top quartile (i.e., 75th percentile)
Staff Value SLPQI (N = 104) | 4.50 | City A 4.00; City B 4.50; City C 4.50 | Average of items (1) good use of time and (2) good fit; bottom of top quartile
School Connection (N = 104) | 3.67 | City A 3.67; City B 2.50; City C 4.25 | Average of scale scores for Targeting Academic Risk and Student Data; bottom of top quartile
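Table A-2 defines the Implementation Index as the percent of programs that scored a 3 or above on the sum of four implementation indicators. A minimal sketch of that computation, assuming hypothetical site names and binary indicator flags:

```python
# Each site reports four binary implementation indicators (Summer Institute,
# assessor visit, coach staff, improvement plan); a site counts toward the
# Implementation Fidelity benchmark when the sum of its indicators is 3 or more.
sites = {
    "Site 1": [1, 1, 1, 1],  # all four elements implemented
    "Site 2": [1, 1, 0, 1],  # sum = 3: meets the threshold
    "Site 3": [1, 0, 0, 1],  # sum = 2: below the threshold
}

high_fidelity = [name for name, flags in sites.items() if sum(flags) >= 3]
pct = 100 * len(high_fidelity) / len(sites)
print(f"{pct:.0f}% of sites at high fidelity")  # 67% for this toy data
```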
Appendix B Notes on the SLPQI Design Adjustments since the 2015 Report
Although most of the design work on the SLPQI occurred prior to 2016 (i.e., the final year of the
study), improvement of the intervention design and supports remained a goal throughout the project
period. In this section, we describe adjustments that were made to the SLPQI design and supports
through the 2016 cycle. Many of these updates are a continuation of more substantial changes made in
2015.
We made critical improvements to the SLPQI design to make it more effective, including
assessors serving as coaches and improving the timing and quality of performance feedback.
Additional improvements were made after the 2015 cycle, including improved trainings, a revised
and flexible Form B, and use of the Online Scores Reporter.
Based on feedback from the 2015 cycle, the 2016 training agenda was revised to focus more on the
tool itself and to provide participants with clear opportunities to explore the items, especially the
Form B items on quality management, and to make plans for the coming summer.
We have found that training assessors in how to conduct feedback and coaching sessions is as
important as training them in using the SLPQA. For Phase III, this training was integrated into the
Quality Coaching session for the participating networks. We will continue to include this training
as part of the full SLPQI suite of supports.
In addition to the supports described for Phase II, Phase III included several improvements to
training and technical assistance, including:
o Denver and St. Paul network leaders were brought together several times by the Weikart
Project Manager to share their experiences and reflections.
o An adaptation of the Quality Instructional Coaching training for assessor-coaches was
piloted in St. Paul.
o Planning with Data workshops, held in September, asked program managers to use 2015
summer data to plan for the 2016 school year. The intention was to use 2015's
improvement plans as a point of reference for planning in the 2016 cycle.
o Performance report recommendations were not automated but generated by the assessor-
coach with the intention of providing sites with a more personalized experience.
o The professional learning community was formalized and expanded to include School’s
Out Washington and Seattle Public Schools with support from the Raikes Foundation.
NSLA facilitated quarterly calls and meetings at the Forum for Youth Investment’s
National Meeting and NSLA’s annual conference.
o The Summer Learning Institute training underwent substantial revisions to further
improve the experience for participants.
o The assessor-coach module piloted in 2015 was integrated into the Quality Coaching
trainings in Denver and Seattle.
o All data collection and reporting was done in the Online Scores Reporter.
Appendix C Summer Learning PQA Measures
This appendix provides descriptive information for the Summer Learning PQA Forms A and B,
as well as reliability and validity information for the Form A data. The Summer Learning PQA Form A
consists of 74 items nested within 18 scales nested within eight domains (Safety, Supportive
Environment, Interaction, Planning-Choice-Reflection, Learning Strategies, Higher Order Thinking,
Math, and Literacy). Table C-1 provides item-, scale-, and domain-level descriptive information for 269
completed ratings in 106 sites during 2016.
Assessors also completed a checklist of basic best practices for three transition periods in the
program day: greetings, transitions, and departures. Table C-2 presents the percent of sites
demonstrating each of seventeen transition practices (e.g., children are greeted by staff).
The Summer Learning PQA Form B consists of 13 items nested within four domains (Planning,
Staff Training, Family Connections, and Individualization). Table C-3 provides descriptive information
for the 106 sites at the item and domain levels and an overall Total Score (the average across all four
domain scores).
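As described above, the Form B Total Score is the average across the four domain scores, each of which summarizes item-level ratings. A minimal sketch of this item-to-domain-to-total rollup, assuming hypothetical ratings on the PQA's 1-3-5 rubric and an illustrative, not exhaustive, item set:

```python
def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical Form B ratings grouped by the four domains named above.
form_b = {
    "Planning": [5, 3, 5, 3],
    "Staff Training": [3, 3, 5],
    "Family Connections": [5, 5, 3],
    "Individualization": [3, 5, 3],
}

domain_scores = {domain: mean(items) for domain, items in form_b.items()}
total_score = mean(list(domain_scores.values()))  # average across the 4 domains
print(domain_scores)
print(round(total_score, 2))  # 3.92 for this toy data
```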
Table C-1. Descriptive Statistics for Summer Learning PQA Form A, 2016 (N = 269)

Domain / Scale / Item | Range | Mean | SD
Safe Environment | 1.93 | 4.55 | 0.33
  Emotional Safety: Psychological and emotional safety is promoted | 4.00 | 4.67 | 0.65
    Positive emotional climate | 4.00 | 4.51 | 0.96
    Lack of bias | 4.00 | 4.72 | 0.85
    Removal of exclusive behavior | 4.00 | 4.78 | 0.71
  Healthy Environment: The physical environment is safe and free of health hazards | 2.67 | 4.80 | 0.47
    Free of health and safety hazards | 4.00 | 4.75 | 0.77
    Clean and sanitary | 4.00 | 4.75 | 0.71
    Suitable for all activities | 2.00 | 4.92 | 0.40
  Emergency Procedures: Appropriate emergency procedures and supplies are present | 3.00 | 4.24 | 0.66
    Posted emergency procedures | 4.00 | 4.34 | 1.20
    Fire extinguisher | 4.00 | 3.76 | 1.23
    First-aid kit | 4.00 | 3.89 | 1.33
    Other safety equipment | 4.00 | 4.78 | 0.92
    Supervised entrances | 4.00 | 4.64 | 0.90
    Supervised access to outdoor space | 4.00 | 4.59 | 0.97
  Health and Nutrition: Healthy food and physical activity are provided | 3.00 | 4.48 | 0.60
    Available drinking water | 4.00 | 4.81 | 0.62
    Plentiful food and drinks | 4.00 | 4.90 | 0.54
    Nutritious food and drink | 4.00 | 4.44 | 1.00
    Dedicated physical activity | 4.00 | 3.80 | 1.47
Supportive Environment | 3.27 | 4.29 | 0.55
  Warm Welcome: Staff provides a welcoming atmosphere | 3.33 | 4.60 | 0.63
    Youth greeted | 4.00 | 4.20 | 1.32
    Staff warm and respectful | 4.00 | 4.76 | 0.69
    Positive staff body language | 4.00 | 4.82 | 0.60
  Program Flow: Session flow is planned, presented, and paced for youth | 2.80 | 4.56 | 0.53
    Sufficient materials | 4.00 | 4.68 | 0.81
    Explains activities clearly | 4.00 | 4.70 | 0.77
    Appropriate time for activities | 4.00 | 4.49 | 0.98
    Multiple types of activities | 4.00 | 4.31 | 1.14
    Consistent routines and guidelines | 4.00 | 4.58 | 0.95
  Active Learning: Activities support active engagement | 4.00 | 4.10 | 0.75
    Youth engage with materials or ideas | 4.00 | 4.68 | 0.81
    Youth talk about activities | 4.00 | 4.09 | 1.34
    Balance of concrete and abstract | 4.00 | 4.43 | 1.02
    Tangible products or performances | 4.00 | 3.20 | 1.82
  Skill Building and Encouragement: Staff encourages and supports youth in building skills | 4.00 | 4.09 | 0.95
    Learning focus linked to activity | 4.00 | 3.59 | 1.80
    Staff encourages youth to try new skills | 4.00 | 4.21 | 1.26
    Staff model skills | 4.00 | 4.27 | 1.38
    Staff breaks down tasks | 4.00 | 4.30 | 1.28
    Staff monitors difficulty | 4.00 | 4.08 | 1.32
    Staff guides initiative in learning | 4.00 | 4.08 | 1.37
  Reframing Conflict: The staff uses youth-centered approaches to reframe conflict | 4.00 | 3.13 | 1.41
    Staff approaches calmly | 4.00 | 4.09 | 1.64
    Staff seeks youth input | 4.00 | 2.64 | 1.75
    Youth examine actions and consequences | 4.00 | 2.11 | 1.45
    Staff acknowledges and follows up | 4.00 | 2.75 | 1.67
  Managing Feelings: The staff encourages children to manage feelings and resolve conflicts appropriately | 4.00 | 3.12 | 1.51
    Staff acknowledges feelings | 4.00 | 3.24 | 1.76
    Staff asks children to explain situation | 4.00 | 3.39 | 1.72
    Staff helps children respond appropriately | 4.00 | 3.39 | 1.67
    Children suggest solutions | 4.00 | 2.16 | 1.59
Interaction | 3.50 | 3.54 | 0.70
  Belonging: Youth have opportunities to develop a sense of belonging | 4.00 | 3.56 | 1.17
    Opportunities for children to get to know each other | 4.00 | 3.73 | 1.26
    Values communicated and integrated | 4.00 | 3.39 | 1.70
  Collaboration and Leadership: Youth have opportunities to collaborate and work cooperatively with others | 4.00 | 3.04 | 0.99
    Interdependent roles | 4.00 | 3.39 | 1.74
    Practice group process skills | 4.00 | 4.03 | 1.39
    Opportunities to demonstrate, explain | 4.00 | 2.95 | 1.54
    All youth lead group | 4.00 | 1.75 | 1.14
  Adult Partners: Youth have opportunities to partner with adults | 3.00 | 4.03 | 0.77
    Staff shares control with youth | 4.00 | 3.32 | 1.62
    Staff actively involved with youth | 4.00 | 4.79 | 0.64
    Staff and youth accountable to expectations | 4.00 | 3.47 | 1.37
    Positive behavior management style | 4.00 | 4.30 | 1.09
Engagement | 3.64 | 3.48 | 0.78
  Planning, Choice, and Reflection: Youth have opportunities to direct their own learning | 4.00 | 3.10 | 0.93
    Opportunities to make plans | 4.00 | 2.64 | 1.68
    Content alternatives | 4.00 | 3.36 | 1.59
    Process alternatives | 4.00 | 3.57 | 1.65
    Intentional reflection | 4.00 | 3.26 | 1.78
    Structured opportunities to provide feedback | 4.00 | 2.70 | 1.70
  Learning How to Learn: Youth are supported in developing learning initiative and persistence | 4.00 | 3.51 | 1.04
    Problem-solve for improvement | 4.00 | 4.00 | 1.43
    Identify learning strategies | 4.00 | 2.69 | 1.60
    Effort-achievement beliefs | 4.00 | 3.85 | 1.17
  Higher Order Thinking: Youth are supported in developing higher order thinking skills | 4.00 | 3.82 | 1.14
    Staff encourages youth to deepen knowledge | 4.00 | 3.81 | 1.59
    Connecting activity and other knowledge | 4.00 | 3.60 | 1.64
    Encourage use of creativity, curiosity, or imagination | 4.00 | 4.08 | 1.24
Total Score | 4.00 | 3.26 | 1.57
Instructional Total Score | 4.00 | 3.70 | 1.73
Math: Youth are supported in mathematical problem solving | 4.00 | 3.58 | 1.73
  Participate in problem solving | 4.00 | 2.86 | 1.79
  Opportunities to apply knowledge and skills | 4.00 | 2.85 | 1.75
  Use reasoning to evaluate | 4.00 | 3.33 | 1.85
  Linking concrete examples | 3.00 | 4.03 | 0.77
  Support the conveying of concepts | 4.00 | 3.32 | 1.62
Literacy: Youth are supported in reading and writing | 4.00 | 3.80 | 1.04
  Participate in literacy activities | 4.00 | 4.32 | 1.28
  Opportunities to read in multiple settings | 4.00 | 3.98 | 1.37
  Staff encourage expression in writing | 4.00 | 2.69 | 1.86
  Vocabulary discussed | 4.00 | 3.73 | 1.69
  Available materials and reading environment | 4.00 | 3.91 | 1.45
  Multiple reading and writing activities | 4.00 | 4.16 | 1.41
During the study, checklist data were assembled on a critical aspect of quality: greetings,
transitions, and departures. Although time did not allow for further analyses, the data suggest that
summer programs in the study were, on the whole, quite planful about transitions into, during, and out
of the program. However, practices that assure student experiences of a "safe space" and "clarity of
expectations" were absent during transitions in 40% or more of summer settings, and one third of
programs left children unattended during the departure period. Item-level descriptive information is
provided in Appendix Table C-2.
Table C-2. Descriptive Statistics for Summer Learning PQA Transition Checklists, 2016 (N = 269)

Checklist / Item | Range | Mean | SD
Greetings: Opening and arrival time | 1.00 | 0.74 | 0.26
  Children greeted by staff | 1.00 | 0.85 | 0.36
  Session starts within 10 minutes of scheduled time | 1.00 | 0.98 | 0.15
  Welcoming activity or icebreaker | 1.00 | 0.61 | 0.49
  Incorporates themes or aspects of program culture | 1.00 | 0.52 | 0.50
Transitions: Group moves to new activity | 1.00 | 0.68 | 0.30
  Smooth and quick transition times | 1.00 | 0.69 | 0.46
  Clear transition communication | 1.00 | 0.85 | 0.36
  On task and ready for transition | 1.00 | 0.52 | 0.50
  Activity choices clearly communicated | 1.00 | 0.46 | 0.50
  Program lessons incorporated | 1.00 | 0.88 | 0.32
Departure: When children leave for the day | 1.00 | 0.75 | 0.21
  Organized process | 1.00 | 0.89 | 0.32
  Smooth process | 1.00 | 0.90 | 0.31
  Constructive activities while waiting | 1.00 | 0.85 | 0.36
  Children left unattended | 1.00 | 0.33 | 0.47
  Utilizes parent engagement opportunity | 1.00 | 0.91 | 0.29
  Verification system | 1.00 | 0.91 | 0.28
  Program incorporated | 1.00 | 0.57 | 0.50
Table C-3. Descriptive Statistics for the 2016 Summer Learning PQA Form B (N = 110 Interviews)

Scale / Item | Range | Mean | SD
Organizational Planning | 3.27 | 3.90 | 0.75
  Mission Alignment | 4.00 | 4.33 | 1.16
  Strategic Plan | 4.00 | 3.44 | 1.70
  Strategic Plan Reviewed | 4.00 | 3.06 | 1.70
  Proactive Planning | 4.00 | 3.62 | 1.41
  Goals | 4.00 | 4.15 | 1.25
  Staff Input | 4.00 | 4.42 | 1.16
  Youth Input | 4.00 | 3.33 | 1.73
  Lesson Plan Framework | 4.00 | 3.84 | 1.40
  Data Collection Methods | 4.00 | 4.49 | 0.99
  Stakeholder Groups | 4.00 | 4.11 | 1.26
  Improvement Planning | 4.00 | 3.97 | 1.32
Staff Training | 3.36 | 3.69 | 0.78
  Staff Retention | 4.00 | 3.46 | 1.30
  Adult-Youth Ratio | 4.00 | 4.15 | 1.10
  Defined Competencies | 4.00 | 3.51 | 1.69
  Training Based on Competencies | 4.00 | 3.50 | 1.69
  Year-Round PD | 4.00 | 4.04 | 1.24
  Staff Training | 4.00 | 4.00 | 1.50
  Support for Non-Certified Teachers | 4.00 | 3.42 | 1.68
  Certified Teacher Available | 4.00 | 3.64 | 1.73
  Staff Collaboration | 4.00 | 3.84 | 1.44
  Staff Observation and Feedback | 4.00 | 3.38 | 1.68
Family Connections | 2.86 | 4.03 | 0.74
  Year-Round Contact with Families | 4.00 | 3.59 | 1.47
  Relationship-Building with Families | 4.00 | 3.31 | 1.56
  Family Participation Opportunities | 4.00 | 2.98 | 1.32
Individualization | 4.00 | 3.29 | 0.99
  Youth Assessment | 4.00 | 3.71 | 1.70
  Individualized, Tailored Instruction | 4.00 | 3.62 | 1.69
  Curriculum Implementation | 4.00 | 4.31 | 1.25
  Average Attendance | 4.00 | 4.36 | 1.05
  Year-to-Year Retention | 4.00 | 3.56 | 1.24
  Recruitment Criteria | 4.00 | 4.81 | 0.65
  Number of Programming Hours | 4.00 | 3.76 | 1.33
Interview Total Average Score | 2.79 | 3.73 | 0.61
Reliability and Validity of the Summer Learning PQA
Evaluating reliability and validity of data from observation-based measures of settings requires
cautious application of standard psychometric concepts and tools (Cronbach, Nageswari, & Gleser, 1963;
Raudenbush & Sampson, 1999; Seidman, 2012) and careful alignment between (a) the different purposes
for which scores will be used and (b) the different methods used to determine score reliability and validity.
For these reasons, our approach to assessing the reliability and validity of the Summer Learning PQA
consisted of a set of steps, following the Weikart Center's approach to the development of observational
measures (Smith, Hallman, et al., 2012), designed to maximize our understanding of these complex
issues within the limitations imposed by the project budget.
Reliability and validity of the PQA Form A data were addressed more fully in the year-two report
for the SLPQI design study (Smith et al., 2015). Those analyses included 44 unique session ratings
collected at 32 program sites by 18 assessors, with a subsample of paired raters. In that report, findings
for reliability and validity of instructional quality data were characterized in the following way:
Precision and meaningfulness of Summer Learning PQA data is promising. The Summer
Learning PQA Form A was endorsed by program managers and assessors as effectively describing high-
quality instructional practices and differentiating between programs of high and low quality. The results
of several reliability analyses indicated that, where multiple ratings from the same site are combined as a
composite score, the Form A Instructional Total Score demonstrated adequate consistency across raters
and short time periods; that is, there is sufficient consistency within organizations to produce a program-
level quality rating. Validity evidence suggested that the Form A scores are associated in the expected
direction with several important characteristics of summer learning programs (Smith et al., 2015, p. 32).
In Table C-4 we present descriptive statistics for the SLPQA domain and scale scores using the
combined total sample of 245 offerings summarized in Figure 3. The final column presents Cronbach’s
alpha reliability coefficients for all domain and scale scores. Table C-5 shows the bivariate correlations
among the four domains and academic practices scales.
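For readers replicating the reliability analyses, Cronbach's alpha for a k-item scale is k / (k - 1) multiplied by (1 minus the ratio of the summed item variances to the variance of the total score). A minimal sketch, using hypothetical ratings rather than study data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a ratings matrix of shape (n_observations, k_items)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical ratings for a 3-item scale across six offerings (1-5 scale).
ratings = np.array([
    [5, 4, 5],
    [3, 3, 4],
    [4, 4, 4],
    [2, 3, 2],
    [5, 5, 4],
    [3, 2, 3],
])
print(round(cronbach_alpha(ratings), 2))  # 0.89 for this toy data
```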
Table C-4. Descriptive and Reliability Statistics for the SLPQA Domain and Scale Scores

Level | Name | Mean | SD | Skewness | Kurtosis | Range | Cronbach's Alpha
Domain | Safe Environment | 4.53 | 0.35 | -1.08 | 1.62 | 2.03 | .53
Domain | Supportive Environment | 4.25 | 0.57 | -1.06 | 1.17 | 3.27 | .92
Domain | Interaction | 3.42 | 0.73 | -0.14 | -0.61 | 3.67 | .72
Domain | Engagement | 3.34 | 0.80 | -0.20 | -0.60 | 4.00 | .67
Domain | Math | 3.22 | 1.51 | -0.39 | -1.41 | 4.00 | .91
Domain | Literacy | 3.72 | 1.06 | -1.01 | 0.44 | 4.00 | .77
Scale | Emotional Safety | 4.69 | 0.65 | -2.74 | 8.88 | 4.00 | .66
Scale | Healthy Environment | 4.77 | 0.49 | -3.09 | 13.15 | 4.00 | .54
Scale | Emergency Procedures | 4.20 | 0.66 | -0.97 | 1.37 | 4.00 | .65
Scale | Health and Nutrition | 4.46 | 0.62 | -1.31 | 1.85 | 3.00 | .29
Scale | Warm Welcome | 4.55 | 0.68 | -1.62 | 2.36 | 3.33 | .49
Scale | Program Flow | 4.54 | 0.53 | -1.47 | 2.31 | 2.80 | .46
Scale | Active Learning | 4.07 | 0.77 | -0.85 | 0.89 | 4.00 | .36
Scale | Skill Building | 4.03 | 0.99 | -1.09 | 0.52 | 4.00 | .77
Scale | Reframing Conflict | 2.79 | 1.33 | 0.50 | -0.90 | 4.00 | .82
Scale | Managing Feelings | 2.97 | 1.45 | -0.04 | -1.40 | 4.00 | .85
Scale | Belonging | 3.44 | 1.18 | -0.18 | -1.06 | 4.00 | .39
Scale | Collaboration and Leadership | 2.88 | 0.98 | -0.05 | -0.78 | 4.00 | .56
Scale | Adult Partners | 3.93 | 0.78 | -0.51 | -0.10 | 4.00 | .48
Scale | Planning, Choice, Reflection | 2.91 | 0.98 | -0.05 | -0.67 | 4.00 | .51
Scale | Learning How to Learn | 3.41 | 1.13 | -0.34 | -0.76 | 4.00 | .63
Scale | Higher Order Thinking | 3.69 | 1.13 | -0.50 | -0.76 | 4.00 | .60
Table C-5. Correlations among SLPQA Domain Scores

Domain | Safe Env. | Supportive Env. | Interaction | Engagement | Math | Literacy
Safe Environment | 1 | .38 | .26 | .23 | .14 | .16
Supportive Environment | .38 | 1 | .59 | .65 | .48 | .38
Interaction | .26 | .58 | 1 | .61 | .35 | .28
Engagement | .23 | .65 | .61 | 1 | .58 | .46
Math | .14 | .48 | .35 | .58 | 1 | .63
Literacy | .16 | .38 | .28 | .46 | .63 | 1
Note: All correlations are significant at the p < .01 level (2-tailed).
Note on Recommendation to Continue Validation of the PQA, Form A.
Reliability coefficients for many of the PQA scales and domains are lower than would be
preferred. This is due to a measurement challenge that we describe below for clarification. The Weikart
Center's near-unique position to advance the field of instructional performance assessment is reflected in
the third recommendation in the Discussion section of this report: to continue validation work on the
PQA Form A. The paragraphs that follow describe the measurement problem and our pending efforts to
improve the precision of measurement for instructional practices. We believe that improvements of this
sort are critically valuable, as they will facilitate evaluation of specific types of instructional practices for
specific subgroups of students, in particular students whose successful learning requires greater supports
due to exposure to stressors during childhood. By introducing more objectivity into the assessment of
instructional practices, our ability to verify instructional theory will be greatly enhanced. By way of
further explanation:
For the PQA Form A, domain and scale scores were created using a standard measurement
process: combining responses to construct-specific subsets of items into scale scores (e.g., by calculating
means across items) and then combining construct scores into composite scores (e.g., by calculating
means across scale scores). This standard measurement process works best where the items and scales
function in a reflective manner; reflective items each “reflect” the underlying unidimensional construct,
such that scores assigned to any given reflective item within a construct scale correspond to similar scores
assigned to any other reflective item within a scale.
However, close observation and analysis of PQA items suggests that many PQA items function in
a formative manner, such that a high score on any of several formative items within a multidimensional
scale indicates the presence of a high-quality instructional practice for that scale and does not necessarily
require a high score on every formative item (e.g., there are often several different ways to convey a
message, and any of these ways is often sufficient without the others).
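To make the contrast concrete, the sketch below scores one hypothetical scale both ways: as a reflective mean and with a simple formative-style rule in which any single high-quality practice marks the scale as present. The max rule is only one way to operationalize the formative logic; it is an illustration, not the Center's pending rescoring method.

```python
# Hypothetical item ratings (1-3-5 rubric) for one multidimensional scale; the
# item names echo the Skill Building scale, but the values are invented.
items = {
    "staff_models_skill": 1,
    "staff_breaks_down_task": 5,
    "staff_monitors_difficulty": 1,
}

# Reflective scoring: items are assumed to reflect one construct, so they are averaged.
reflective_score = sum(items.values()) / len(items)

# Formative-style scoring: any one high-quality practice is treated as sufficient.
formative_score = max(items.values())

print(round(reflective_score, 2))  # 2.33: the mean is pulled down by unused alternatives
print(formative_score)             # 5: at least one high-quality practice was observed
```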
If PQA items function in both formative and reflective manners, then we may be able to
substantially improve the precision of the PQA and its composite scores by explicitly taking these
formative and reflective properties into account when creating scale and composite scores (Bollen &
Davis, 2009; Coltman et al., 2008; Diamantopoulos & Siguaw, 2006). In addition, properly integrating
such multidimensional constructs into more extensive structural equation models (e.g., models containing
other predictor and criterion variables) requires specifying measurement models that take such formative
and reflective indicators explicitly into account.
For example, one way to ensure that the measurement model for a multidimensional construct is
“identified” (i.e., specified in a way that allows for a unique mathematical solution to each of the implied
parameters) is to include at least two reflective indicators together with one or more formative indicators
(Bollen & Davis, 2009). This criterion calls for a re-assessment of each PQA scale by reference to the
formative and reflective properties of the corresponding items composing the original and, in some cases,
re-conceptualized PQA scales. Consequently, in an effort to increase the precision and validity of both
scale and composite scores, we are in the process of examining and revising the PQA scoring system by
conducting theoretical, descriptive, predictive validity, and model testing analyses.
Appendix D SLPQI Implementation by Sites
In this appendix, we provide SLPQI implementation data by site for the three city networks. For
each site, the table reports how many of the four implementation elements (Summer Institute, assessor
visit, coach staff, and improvement plan) were reported as implemented on the manager survey. This
information is summarized in the Implementation Results section of this report.
Table D-1. 2016 SLPQI Implementation Elements by Site as Reported on the Manager Survey

Organization | Site | Elements Implemented (of 4)

Denver
BGCMD | Arthur Johnson Club | 4
BGCMD | Broncos | 4
BGCMD | FoJoGo | 4
BGCMD | Cole Beacons | 3
BGCMD | Boettcher Club | 3
BGCMD | Cope Club | 3
DELCS DPS | Summer Slam | 4
DELCS DPS | High Tech | 3
DELCS DPS | Kaiser Neighborhood Center | 4
DELCS DPS | Southmoor | 4
DELCS DPS | Swigert Neighborhood Center | 4
Denver Parks and Rec | City Park | 2
Denver Parks and Rec | Sloan's Lake | 2
DU Bridge Project | Westwood | 2
DU Bridge Project | Quigg Newton | 3
DU Bridge Project | Columbine | 4
DU Bridge Project | Lincoln Park | 4
Mi Casa | Mi Casa Lake Campus | 4
OpenWorld Learning | OWL Eagleton | 3
Summer Scholars | SS Ashley | 2
Summer Scholars | SS Stedman | 4
Summer Scholars | SS Florida Pitt Waller | 3
YMCA | Omar D Blair | 3
YMCA | Wyatt Academy | 3

St. Paul
SPPS OST | 21st Century | 3
Operation Neighborhood | Ames Lake | 2
Breakthrough Saint Paul | Breakthrough Saint Paul Site | 2
Sabo Center for Democracy and Citizenship | Center for Democracy and Citizenship Site | 2
ComMUSICation | ComMUSICation Site | 3
Conservation Corps | Conservation Corps Site | 4
Interfaith Action of Greater Saint Paul | Department of Indian Work | 3
The Sanneh Foundation | Dreamline | 4
Saint Paul Parks and Recreation | East | 3
Fred Wells Tennis and Education Center | FWTEC | 4
Good Neighbor Center | Good Neighbor Center Site | 3
YWCA of Minneapolis | Mpls (YMCA) | 4
YMCA of Minneapolis | Mpls (YWCA) | 4
Roseville Area Schools | Roseville | 4
Saint Paul Parks and Recreation | South | 4
Saint Paul Urban Tennis | SPUT | 0
Saint Paul Parks and Recreation | West | 4

Seattle
Community - Pierce | Baker Middle School | 2
SLPQA DEEL (City of Seattle) | CISC Afterschool | 3
SLPQA DEEL (City of Seattle) | CISC Afterschool | 3
SLPQA DEEL (City of Seattle) | CISC Afterschool | 4
Community - Pierce | Communities In Schools of Lakewood | 4
SLPQA DEEL | Denise Louie - Beacon Hill | 4
SLPQA DEEL | Denise Louie - International District | 3
SLPQA DEEL (Community) | Denny Middle School | 1
SLPQA RoadMap | EACS - New Holly - Classroom 2 SSCC | 0
Community - Pierce (Raikes Pierce County) | Fab 5 | 4
Community - Pierce | FCMS Eagle Center - Summer Learning and Enrichment Academy | 3
Community - Pierce | Hilltop Artists | 2
Community - Pierce | Hilltop Artists | 2
SLPQA RoadMap | Neighborhood House - Burndale | 4
SLPQA RoadMap | Neighborhood House - Seola Gardens | 3
Community - Pierce (Raikes Pierce County) | Northwest Leadership Foundation | 4
Community - Pierce | Parents and Students in Action - The Youth Connection | 3
SLPQA DEEL (Community) | Seattle Parks and Recreation - Aki Kurose Middle School | 4
SLPQA DEEL (Community) | Seattle Parks and Recreation - North Hub at McClure | 4
SLPQA DEEL (Community) | Seattle Parks and Recreation - South Hub at Mercer | 3
SLPQA DEEL | Sound Child Care - RIFC | 2
SLPQA DEEL | Sound Child Care - RIFC | 3
SLPQA DEEL | SPS - HS Credit Retrieval Program - Roosevelt HS | 2
SLPQA RoadMap (Boys and Girls Club of King County) | SRV Childcare | 2
SLPQA RoadMap | SWYFS - Arbor Heights | 3
SLPQA RoadMap | SWYFS - Windsor Heights | 1
SLPQA RoadMap | SWYFS - Woodridge Park | 3
SLPQA DEEL | UW - Native Youth Enrichment Program | 2
SLPQA DEEL (Community) | Washington Middle School | 3
SLPQA DEEL (Raikes King County) | Woodland Park Zoo | 2
SLPQA DEEL | YMCA - Y.U. Learn Jams Nathan Hale H.S. | 1
SLPQA RoadMap | YMCA of Greater Seattle - Beacon Hill Elementary | 1
SLPQA RoadMap | YMCA of Greater Seattle - Summer Language Journey | 4

Seattle Public Schools
Seattle Public Schools | Roxhill Site | 2
Seattle Public Schools | West Seattle Elementary Site | 4
Seattle Public Schools | BF Day Elementary Site | 4
Seattle Public Schools | John Rogers Elementary Site | 4
Seattle Public Schools | Sand Point Elementary Site | 4
Seattle Public Schools | Graham Hill/South Shore Site | 0
Seattle Public Schools | Viewlands Elementary Site | 4
Seattle Public Schools | Olympic Hills Site | 4
Seattle Public Schools | Highland Park Elementary Site | 4
Seattle Public Schools | Dearborn Park Site | 2
Seattle Public Schools | John Muir Site | 4
Seattle Public Schools | MLK Jr. Elementary Site | 4
Seattle Public Schools | Northgate Site | 2
Seattle Public Schools | Hawthorn Elementary | 3
Appendix E Site Manager Responses to Open-Ended Questions
In this appendix we provide text responses to two questions: "What aspect of your experience with
the SLPQI was most valuable?" and "Please share any additional thoughts you may have about any aspect
of your experience with the Summer Learning PQI" (see Tables E-1 and E-2).
Table E-1. “What aspect of your experience with the SLPQI was most valuable?”
Open-Ended Responses
City A
Speaking with my external assessor to look at our report and discussing next steps.
Coaching conversations with assessor.
Seeing staff and students grow in areas that needed to be.
It Is hard in such a short time and for a first year program to establish goals and implement them.
It allowed us to focus on some of the higher level aspects of the pyramid, and tie it in to our program when
applicable.
The SLPQI process makes the work with the youth intentional.
Coaching.
Assessors’ observations and feedback were very useful in validating a few of my own observations and
pointing out a few different ones. He was very supportive and offered to help in way he could which I
really appreciated.
Visiting with the Assessor.
Seeing an outside perspective of how to improve the programming.
The summary and feedback.
Meeting with the observer after.
The coaching training.
Planning with Data prior to summer.
Working with my assessor was incredibly helpful. I really appreciated her support and having the
designated time to speak with her about the goals we had at the site.
Feedback from EA.
The report you get after and that it gets to you much faster.
Having Yvette come out and see what the program was about and getting the feedback.
It was very helpful to sit down the assessor and the data. During this time we were able to have a
conversation about the strengths and areas of improvement of the programming. I appreciated the time to
dialogue and brainstorm ways to strengthen the program offerings.
Meeting with the SLPQI coach and receiving the outside observations report.
City B
Coaching session, going over our data, summary report was the most valuable in my mind because it breaks
down the strength, improvement actions, and reflection in each domain.
I think just having time to reflect with coworkers on what could be improved and getting an outside
"unbiased" observation.
Observation and coaching conversation.
Meeting w/ external assessor
The immediate feedback and coaching session. I was able to implement changes before the session ended.
Sitting down and talking with my assessor while we went over the Summary Report was the most helpful.
While much of the report did make sense, it made a difference to be able to talk it over with her. The few
suggestions that she did make in response to some of our scores on the report were very helpful and we
have already used some of them in our program this summer.
I enjoyed being able to talk through the results with my assessor following the observation and interview.
The overall conclusion was the most helpful for my team to see where we were hitting our mark and where
we could build from.
Seeing our program through the eyes of another program coordinator. It was really helpful to hear some
things a neutral party noticed--both good and bad--and to be able to use this feedback to help our staff hear
alternative ways to do things.
Scores helped us build some intentional reflective and choice based activities. Reviewing our scores and
our struggles helped us improve our work with youth, specifically in regards to higher order thinking.
I like the reflection and the way it looks at the program and the interactions.
The added learning community
Assessment results and summary along with coaching, planning, and implementing next steps.
Seeing which areas our program could improve the most in
Having the assessor come to the site and speak with staff who do not attend trainings. I believe that their
explanation of the benefits of using this tool helped them understand the importance of quality.
Purposeful reflection time to try and improve our program in a non-stressful context.
The framework of SLPQI was very helpful when we planned the program.
Having an outside accessor come in to access our work
Giving the participants choices throughout the course of the program
Last year the coaching was extremely valuable, as was the comparison data between our morning and
afternoon programming. We realized where some of our gaps were!
Feedback session with our assessor
City C
I expect the coaching session to be valuable, but have not yet had it.
It was nice to have a "check list" of sorts to ensure we were consistently doing what is best for kids.
student engagement, choice, and voice
It is always nice to get feedback.
Meeting with assessor.
The training just reaffirmed my philosophy of teaching and learning. It was nice to get the reports after each
observation. The reports provided an honest lens from an outside source that has no idea about how we run
our program. We were able to adjust as needed.
Looking at strengths and then finding places where growth was most needed and helpful.
Having time to connect with students in a more relaxed environment.
This assessment was a great jumping point regarding what to implement in the program. It confirmed the
things we should be doing. I appreciated the quick feedback I received right after observations.
Outside perspective
Training in descriptors of what quality summer learning looks like. Feedback from assessors.
The feedback observation forms and the coaching session.
My conversations with my assessor were extremely helpful. This summer was my site’s first time being
involved with SLPQI. My assessor broke everything down for me and took time to explain what exactly
SLPQI is, what they are observing, and the feedback given was helpful. With my assessors’ assistance, I
begin to think / plan for what I could be doing differently for next year's summer program and what
changes need to be made to make the program more successful.
The one on one coaching.
The visit and review with the site assessor was extremely valuable to our site. As a team, we were able to
ask clarifying questions and receive detailed descriptions on how we could improve our practices.
The training offered to teachers and program staff
The evaluation asks good interview questions. Most of those questions were things that I was able to work
on ahead of time (training for staff and planning).
I appreciated to meeting with the evaluator after the results were available.
Feedback on our program is always helpful.
I feel that when we participate in the SLPQI the feedback and help and training myself and my staff receive
make us better able to provide a stronger program for all the youth in our community. If at any point in time
I need to talk to my coach he would have been available. The support we receive is invaluable and could
never be replaced.
It helped us with the overall structure of our program. It provided guidance and enabled us to improve how
we managed our students to ensure that they are engaged and feel safe.
Learning Communities.
To receive coaching tied to a grant that is less about compliance that quality improvement is a gift and
should be replicated elsewhere.
Those areas which were applicable to our age group - preschool.
The strengths observed during the SLPQI was spot on with my observation of the classroom as well. I
thought it was helpful to know the areas we as a program need to improve on.
It’s difficult to answer this question because our assessment hasn't been completed.
The training was very informative.
Receiving the report and sharing the information with the staff.
Receiving objective feedback from a neutral observer.
Getting feedback from the lens and perspective from an outside neutral source that could only enhance and
make the program better.
The coaching and the assessment have helped us with program implementation.
Common language tool.
Consistent feedback that can be compared from year-to-year.
I enjoy the observation from the assessors, I use then in my personal development and ability to lead a
team.
Feedback from evaluator.
Having an outside evaluator look at our program from a fresh set of eyes and provide useful feedback for
improvements is something the staff and myself really look forward too.
Getting feedback from someone who can view the program without bias. An outside perspective.
Receiving feedback on specific items where our program could improve. It provided a start for
conversation with staff and gave direction on what to tackle first in terms of support/training.
Having someone from outside is really good, sometimes we are not able to see things that another person
can see. And having the report is very important because we can see in what part of our program we need to
improve.
Giving common language and data to discuss with my team members for coaching opportunities.
Accountability and additional ideas to make the program even better.
Thinking about structure and common language about quality to implement in summer program. Beginning
to think about ways to work math into program.
Table E-2. 2016 Site Manager Responses to "Please share any additional thoughts you may have
about any aspect of your experience with the Summer Learning PQI."
Open-Ended Responses
City A
I think that it is silly to have math and reading tied into every program that is done, while it is important to tie it
in to the summer program, it shouldn't be tied into all. I also think that it is ridiculous that a certified teacher
should be on staff or consulted when doing programming, especially when many teachers are failing the youth
that we currently serve. I also think there should be more of a focus on fun within the tool, the program can be
high quality, but if fun isn't infused there won't be many youth choosing to be in the program and then it
becomes forced participation. Some staff will tie the fun in, but if it isn't emphasized by the tool then some
staff will lose sight of this and the overall program suffers.
Too small a window to do it right, and we have other forms of evaluation.
With the short period of time it makes it hard to implement in the summer. I think it would be a better fit during
an extended period of time to properly put together improvement plan.
As a completely new staff member and Site Director, I wish I had been better versed in what this process
looked like earlier on. By the time that Andrea and I were able to meet, it was a bit challenging to change things
since we were already midway through the summer. Still, the second half of our summer was much better than
the first and I attribute that in part to this SLPQI process.
If assessment and feedback could happen earlier I believe it could help us make it positively impact our
programming.
The feedback was very helpful. However, I feel most of the questions did not apply to the outdoor parks and
rec youth program.
City B
It would be nice to add a summary report to the reg, YPQA/ YPQI so when Supervisor meets with their team
everyone can see firsthand.
Incredibly well-organized and helpful.
The trainings and the coaching session was the most beneficial. I am looking forward to follow up trainings this
fall.
Our assessor went out of town right after she observed at my site, so I didn't receive the Summary Report until
a couple weeks later. This was the only downside to the experience. We received the Summary Report with
plenty of time left in the summer to implement changes; however, we would have appreciated a more timely
report.
I really appreciate a dialogue component in this process.
My observer was very helpful and hands on. She gave great concrete feedback.
A lot of the SLPQI does not apply to our program. We are the same year round. We have talked about
changing summer programing and have done different things in the past (and that always gets suggested in the
two years we have participated in this) it doesn't fit/work with the needs of our program in relation to what we
have. We are a drop in tutoring program that serves dinner to everyone. We have one staff person and the rest
are volunteers that are under no contract/obligation. So we are quite different than a lot of programs.
As a participant our coach was well versed and prepared. In the future we would appreciate a more in depth
coaching session.
As a recreation center, we do not look at grades or have access to educational & testing information. I could not
answer the first set of questions as they truly did not apply to this summer program. If Summer PQI could be
modified to more social, recreation & leisure based youth programming we would be more successful in our
implementation.
Was great to have the same assessor for 2 years in a row. Gave us consistent feedback from year to year with a
perspective of our improvement from year to year.
I think that SLPQI is serve better traditional after school program and I would like to see how we can
change/modify the tool that will serve programs in Community centers better.
This summer our external assessor was not easy to reach through email and missed several appointments,
including the date of our evaluation. This contributed to the data not being helpful for improvement during the
summer, even though we are looking forward to looking at it now, after the summer.
Some of the questions were not applicable at our site or on the day of observation. It would be great to see some
questions about physical activity as that is our program focus.
City C
Our assessment was completed in one hour of academic work and did not reflect our program day. I feel like
such a short snapshot does not give us accurate feedback about the entire program.
Thank you for all your time and effort that was put in to helping us make our program stronger and better for
each youth that comes through our doors. It is the best thing i have ever been through with P.S.I.A. and it has
really made a huge impact on our future programs.
Super helpful, provides great opportunities to bolster programs in seasons of higher need. Grateful to be able to
connect with best practice leaders and other local programs working on similar program goals.
This assessment is more youth oriented in general and less helpful than some other assessments we have
participated in. Our assessor was fantastic in explaining this and pulling out learning opportunities that do still
apply for our teachers.
It was an overall positive experience. The assessor was understanding and flexible. I thought he led a very
pleasant feedback session in which the teachers came out with a positive outlook.
I appreciated the training opportunity and instruction guide.
If possible do observation and assessment in the first two weeks which allows coordinators to make necessary
adjustments to learning environment.
I am feeling a little discouraged, as additional staff training was conducted to promote emotional safety at
camp, but we are still seeing significant bias and conflict.
I would enjoy the observation happening and results coming back sooner so implementation of systemic
strategies to improve can begin during the session.
A wonderful program-our only difficultly is finding the time to really take advantage of all it offers.
Useful and worthwhile. Hoping to have improvements for next year.
I hope you were able to do more observations, I felt one sometime is not enough.
We had a great experience with Summer PQI. I know the timing is so short but a post test would have been
very helpful.
The process felt rushed in just 6 weeks of summer program. We were able to complete the assessments and
coaching, but it didn't feel like we had enough time to make changes based on the feedback (our coaching
sessions happened in the week of program). In theory, the coaching session was great (and very interesting to
speak to the person who actually assessed us), but I felt our time with the coaches could have been more
substantive. They told us their observations and had specific suggestions, but in general they didn't have
broader ideas to coach us to higher quality.
Questions in the site manager interview that focused on data and reporting felt repetitive.
In response to the interview's questions about planning for summer: it's important for funders and other
supporters in the field to understand that our ability to plan depends in large part on staffing and money. Much
of that information (including SLPQI training and support) comes to us in May or later, making it difficult to
plan much further ahead. We would love to be able to get everything worked out much earlier in the spring, and
I hope the field as a whole can work to respond to this need.
The summer is such a fast moving train, that even when the SLPQA was done in the 2nd week of program the
results and coaching were not available to the 4th or 5th week of program and the program was finished after
the 6th week. I think we will see the value in using those results to influence our school year planning and the
planning for next summer, but we were not really able to make changes in the moment.