David P. Weikart Center for Youth Program Quality
CONTINUOUS QUALITY IMPROVEMENT IN AFTERSCHOOL SETTINGS:
Impact findings from the Youth Program Quality Intervention study
Executive Summary
The David P. Weikart Center for Youth Program Quality is a division of the Forum for Youth Investment.
Abstract
Citation: Smith, C., Akiva, T., Sugar, S., Lo, Y. J., Frank, K. A., Peck, S. C., Cortina, K. S., & Devaney, T. (2012). Continuous quality improvement in afterschool settings: Impact findings from the Youth Program Quality Intervention study. Washington, DC: The Forum for Youth Investment.
Background: Out-of-school time programs can have positive effects on young people's development; however, programs do not always produce such effects. The quality of instructional practices is logically a key factor, but quality improvement interventions must be understood within a multi-level framework including policy, organization, and point of service if they are to be both effective and scalable.
Purpose: To evaluate the effectiveness of the Youth Program Quality Intervention (YPQI), a data-driven continuous improvement model for afterschool systems. Research questions include:
• Does the YPQI increase managers' focus on instruction and the use of continuous improvement practices by site-based teams?
• Does the YPQI improve the quality of afterschool instruction?
• Does the YPQI increase staff tenure?
• Can the YPQI be taken to scale across programs that vary widely in terms of structure, purposes, and funding, using resources available to public agencies and community-based organizations?
• Will afterschool organizations implement the YPQI under lower-stakes conditions, where compliance with the model is focused on the improvement process rather than attainment of pre-determined quality ratings?
Participants: Eighty-seven afterschool sites in five diverse afterschool networks participated in the study. Each site employed the equivalent of one full-time program manager and between two and ten direct staff; had an average annual enrollment of 216 youth; and had an average daily attendance of 87 youth.
Research Design: This is a cluster randomized trial. Within each of the five networks, between 17 and 21 sites were randomly assigned to an intervention (N=43) or control group (N=44). Survey data were collected from managers, staff, and youth in all sites at baseline prior to randomization (spring 2006), at the end of the implementation year of the study (spring 2007), and again at the end of the follow-up year (spring 2008). External observers rated instructional practices at baseline and at the end of the implementation year. Implementation data were collected from both intervention and control groups. Hierarchical linear models were used to produce impact estimates.
Findings: The impacts of the YPQI on the central outcome variables were positive and statistically significant. The YPQI produced gains in continuous improvement practices with effect sizes of .98 for managers and .52 for staff. The YPQI improved the quality of staff instructional practices, with an effect size of .55. Higher implementation of continuous improvement practices was associated with higher levels of instructional quality, with effects nearly three times greater than the overall experimental impact. Level of implementation was sustained in intervention group sites in the follow-up year.
Conclusions: This study demonstrates that a sequence of continuous improvement practices implemented by a site-based team - standardized assessment of instruction, planning for improvement, coaching from a site manager, and training in specific instructional methods - improves the quality of instruction available to children and youth. The YPQI produces a cascade of positive effects beginning with provision of standards, training, and technical assistance, flowing through managers' and staff members' implementation of continuous improvement practices, and resulting in effects on staff instructional practices. Evidence also suggests that participation in the YPQI may increase the length of staff tenure and that YPQI impacts are both sustainable and scalable.
CONTINUOUS QUALITY IMPROVEMENT
IN AFTERSCHOOL SETTINGS:
Impact findings from the Youth Program Quality Intervention study
Executive Summary
As investments in the afterschool field have grown over the past decade, so too has the body of evidence suggesting that out-of-school time (OST) settings can be important contexts for positive youth development and learning (Mahoney, Vandell, Simpkins, & Zarrett, 2009). Afterschool settings provide childcare for working parents, safe places for youth during nonschool hours, and assistance with homework - services that are highly important to parents and policymakers alike. However, organized activities during out-of-school time can also provide opportunities for youth to experience a rich array of contexts and content - relational, cultural, artistic, scientific, recreational, and natural - which are available in communities but usually not in schools, and not to all households, due to costs of time, transportation, and tuition (Pedersen & Seidman, 2005). Afterschool settings can also provide exposure to instructional methods that are more responsive to individual youths' needs, interests, imagination, and time, and less focused on memorization and test preparation, which increasingly animate school-day routines (Halpern, 2003). i
Many studies of human development and learning from outside the afterschool field indicate that the qualities of afterschool settings should matter. Youth experiences of emotional support, competence, and autonomy build youth interest and motivation to engage with the processes and content in a setting (e.g., Deci & Ryan, 2000). Youth experiences of engagement, interest, and motivation are associated with a wide range of learning and developmental outcomes (e.g., Wigfield, Eccles, Schiefele, Roeser, & Kean, 2006), and youth experiences that combine positive affect, concentration, and moderately difficult effort promote skill development in multiple domains, especially when accompanied by adults' modeling of the learning task (e.g., Fischer & Bidell, 2006). In research on afterschool programs specifically, afterschool experiences are associated with higher levels of youth engagement than either the school day or unstructured time with peers (e.g., Larson, 2000) and can positively influence outcomes over a wide range of cognitive, emotional, and applied skills (e.g., Durlak, Weissberg, & Pachan, 2010).
The critical active ingredients of afterschool programs may be defined as manager and staff behaviors that influence the qualities of youth experience. However, it is clear that not all afterschool contexts promote developmentally powerful experiences. Reviews of numerous evaluation studies suggest that afterschool impacts vary and that afterschool settings that lack certain qualities are unlikely to enhance academic or developmental outcomes (Durlak, Weissberg, & Pachan, 2010; Lauer et al., 2006). Evaluations of the largest and most generic program models have found few effects on academic achievement and mixed impacts on other developmental outcomes (Black, Doolittle, Zhu, Unterman, & Grossman, 2008; Gottfredson, Cross, Wilson, Rorie, & Connell, 2010; James-Burdumy et al., 2005). Following literature in the early childhood and school-day fields, there is likely a relationship between uneven or low instructional quality in afterschool settings and these weak effects. ii
Research, funding, and policy-making communities have endorsed efforts to introduce quality improvement into afterschool networks (Grossman, Lind, Hayes, McMaken, & Gersick, 2009; Metz, Goldsmith, & Arbreton, 2008; Princiotta & Fortune, 2009), and a growing number of intermediary organizations are engaged in supporting these policies (Collaborative for Building Afterschool Systems, 2005; Keller, 2007).
However, despite this pattern of policy innovation, relatively few intervention models explicitly address the complex, multilevel nature of afterschool systems (Durlak & DuPre, 2008), particularly the role that managers may play as leaders of site-level continuous improvement processes. To date, no experimental studies have examined the impact of quality improvement interventions in the afterschool field (Gardner, Roth, & Brooks-Gunn, 2009), and evidence regarding the impact, sustainability, scalability, and cost of such interventions is scarce in the wider fields of education, human services, prevention, and public health.
This report summarizes findings from the three-year Youth Program Quality Intervention Study conducted by the David P. Weikart Center for Youth Program Quality, a division of the Forum for Youth Investment. The study was designed to examine the impact of the Youth Program Quality Intervention (YPQI), a data-driven continuous improvement model for school- and community-based sites serving youth during afterschool hours.
The YPQI Study was designed to rigorously answer several specific questions related to both impact and implementation:
• Does the YPQI increase managers' focus on instruction and the use of continuous improvement practices by site-based teams?
• Does the YPQI improve the quality of afterschool instruction?
• Does the YPQI increase staff tenure?
The study was also intended to inform field-level questions that pertain to quality improvement systems currently being created or considered by policy entrepreneurs in public sector agencies, private foundations, and community-based organizations. These questions include:
• Can the YPQI be taken to scale across programs that vary widely in terms of structure, purposes, and funding, using resources available to public agencies and community-based organizations?
• Will afterschool organizations implement the YPQI under lower-stakes conditions, where compliance with the model is focused on the improvement process rather than attainment of pre-determined quality ratings?
The primary impact of interest in the YPQI Study was the quality of staff instructional practice. Like most youth development researchers, our long-term aim is greater understanding of the relations between program context and youth developmental change. iii However, in the current study our strategy was to design an intervention that promotes high-quality instructional practices in a coherent, cost-effective way, and then to rigorously study whether this approach affects instruction in the ways intended. We were particularly interested in isolating a sequence of effects that begins at the policy level, extends through several steps of implementation, and results in improved quality of instruction at the point of service, where adults and youth meet.
Overview of the Intervention
The YPQI Theory of Action (Figure 1) is an implementation sequence that spans the policy, organization, and point-of-service levels of afterschool settings. In this model, actors engage in activities at one level, which leads them to enact behaviors at the level below. In perhaps the most important cross-level step, managers engage site-based teams of staff in continuous improvement practices, leading staff to enact higher-quality instructional practices at the point of service with youth. We refer to the Theory of Action as producing a cascade of effects because implementation begins with a policy-level decision and produces effects both across multiple levels and from a single site manager to multiple staff and youth. (For additional detail regarding the intervention, see Chapter 1 and Appendix A in the full YPQI technical report.)
Standards and Supports
The YPQI begins with a policy-level definition of standards, both for site managers' continuous improvement practices and for high-quality instruction, through adoption of a quality assessment tool. Aligned training and technical assistance (T&TA) supports are introduced to support performance against the standards at all levels of the setting. T&TA supports are delivered by contract consultants or local staff using locally available resources and in regional proximity to sites. Recruitment and logistics are handled by network leaders. TA coaches are recruited locally and trained in the TA coaching method specific to the intervention.
Continuous Improvement Practices
YPQI continuous improvement practices include quality assessment, improvement planning, coaching by site managers during staff instruction, and staff attendance at targeted trainings for instructional skill building. These four practices are enacted by site teams in the assess-plan-improve sequence described in Figure 2. The sequence begins with use of the Youth Program Quality Assessment (Youth PQA), a standardized observational measure of instructional practice for afterschool and other settings (HighScope, 2005; Smith, Akiva, & Henry, 2006).
Figure 2. YPQI continuous improvement sequence
The Youth PQA is used in two ways during the first step of the sequence: (a) a reliable rater conducts two or more external assessments, and (b) the manager leads a site team to conduct program self-assessment, a process of multiple peer observations and team-based scoring of a single assessment for the entire program. iv Data from both applications of the Youth PQA are used for improvement planning, in which the team interprets the meaning of their data and selects areas to improve. During the months when site teams enact their improvement plans, staff members attend training modules for targeted instructional practices and receive performance coaching from their site manager. Both training and coaching align with and reinforce the site's quality improvement plan.
Training and technical assistance supports for the YPQI continuous improvement practices consist of training and one or more visits by a Technical Assistance (TA) coach. The Youth Work Management training sequence consists of three 6-hour workshops for site managers: Youth PQA Basics prepares managers to lead the site team through internal assessment and to generate online quality profiles. Planning with Data prepares managers to lead the site team through improvement planning and to manage a change process. Instructional Coaching prepares managers to deliver feedback to staff following observation of staff instruction. TA coaches lightly support managers to enact the assess-plan-improve sequence (averaging 10 hours per site).
Instructional Practices
The YPQI standards for instructional quality are depicted in Figure 3 and include a range of specific instructional practices grouped in four domains of quality: safety, support, interaction, and engagement. These practices, when enacted together as an instructional approach, provide youth with opportunities for positive developmental experiences in afterschool settings. Further, as a result of exposure to higher-quality instructional practices, we expect youth to become more engaged with content. v Both of these elements - intentional infusion of higher-quality instructional practices and corresponding higher levels of engagement from youth - are expected to drive an upward spiral of youth engagement and staff proficiency at implementing higher-quality instructional practices.
Figure 3. Pyramid of Youth Program Quality
Training and technical assistance supports for these instructional practices consisted of the Youth Work Methods training portfolio of 10 two-hour workshops rooted in the HighScope active participatory approach to youth development (Smith, 2005): Voice and Choice, Planning and Reflection, Building Community, Cooperative Learning, Active Learning, Scaffolding for Success, Ask-Listen-Encourage, Reframing Conflict, Structure and Clear Limits, and Homework Help. These workshops were selected based on improvement plans and delivered at an all-site event in each network. Managers were encouraged to attend with their staff.
Timeline
Implementation of the study and intervention occurred over three years: baseline (year 1), implementation (year 2), and follow-up (year 3). The timeline is depicted in Figure 4. During the follow-up year, the wait-listed control group was granted access to the YPQI, and T&TA supports were offered again in each network, although attendance was not mandatory for either the control or intervention group.
Figure 4. Implementation Timeline
About the Study
The YPQI study was implemented in 87 afterschool sites (i.e., buildings that housed afterschool programs) in five networks in four states. The five networks were selected to include a mix of rural and urban settings and a diverse set of afterschool policies, including fee-based school-age child care, 21st Century Community Learning Centers, and community-based providers with both local and national affiliations. The sample also included substantial variation in the educational characteristics of program staff and in characteristics of the youth sample in terms of income, ethnicity, and risk.
Networks also shared important characteristics, such as sites operating during the entire school year, full-time site managers, average attendance of at least 30 youth each day, and a program model that included distinct program offerings. vi In addition, participating network leaders agreed that the Youth PQA was an appropriate standard for high-quality instruction. Finally, most site managers in the study reported that academic support was the primary objective of the overall program, although a wide range of aims were reported.
The following outcomes were analyzed in order to determine the impact of the YPQI:
• Site Improvement Focus is a manager-reported binary measure indicating whether a site's improvement focus included an instructional topic during the implementation year.
• Continuous Improvement Practices were measured using an index of practices: implementation of program self-assessment, improvement planning, instructional coaching, and participation in training on instructional methods.
• Staff Instructional Practices was the primary outcome of interest in the study and was constructed as a composite score for nine equally weighted scales describing distinct staff instructional practices: Staff Disposition, Welcoming Atmosphere, Inclusion, Conflict Resolution, Active Skill Building, Support for Group Participation, Opportunities to Make Choices, Opportunities for Planning, and Opportunities for Reflection. vii
• Staff Employment Tenure is indicated using two variables: a binary measure of the presence or absence of staff employment at the site during the past 10 months, and staff employment of two years or greater.
In addition to these primary outcomes, we used data from on-site observations, surveys, interviews, and training and technical assistance records to assess managers' and staff members' attitudes, background, knowledge, and exposure to the intervention. Implementation data were also collected in the control group at all time points to determine the extent to which control sites were implementing YPQI-like practices or utilizing YPQI-like T&TA supports.
The study employed a cluster randomized design (Bloom, 2004; Raudenbush, Martinez, & Spybrook, 2007) with random assignment of sites within networks. This design created a group of sites exposed to the intervention and an equivalent control group within each of the five networks. The basic strategy for assessing the impact of the YPQI was to estimate impact within each network and then pool these estimates into an overall estimate of impact. We also conducted tests to see if impact estimates differed significantly across networks, and in most cases they did not. Because multiple staff were nested within each site, two-level statistical models were used to produce the impact estimates.
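To make the estimation strategy concrete, the sketch below shows how a two-level impact model of this kind could be specified with standard statistical software. This is an illustrative sketch under stated assumptions, not the study's analysis code; the data file and column names (site_id, network, treated, instructional_practices) are hypothetical.

```python
# Two-level impact model: staff members nested within sites, a site-level
# treatment indicator from random assignment, and network fixed effects.
import pandas as pd
import statsmodels.formula.api as smf

staff = pd.read_csv("staff_outcomes.csv")  # one row per staff member (hypothetical file)

# A random intercept for each site accounts for the clustering of staff within
# sites; C(network) pools the impact estimate across networks while allowing
# network-level mean differences.
model = smf.mixedlm(
    "instructional_practices ~ treated + C(network)",
    data=staff,
    groups=staff["site_id"],
)
result = model.fit()
print(result.summary())  # the coefficient on 'treated' is the pooled impact estimate
```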
Impact estimates for the YPQI study reported here provide an intent-to-treat analysis of the impact of the intervention because they reflect the effects on the entire baseline sample, regardless of participation and implementation (both of which were uneven). Although participating networks were discouraged from providing YPQI-like supports to the control group during the baseline and implementation years, the control group sites were not prevented from engaging in YPQI-like practices or from seeking out YPQI-like T&TA supports from other sources. For this reason, we characterize the control condition as "business as usual" and interpret impact estimates as effects over and above quality improvement practices already widespread in the field.
Findings
There are two types of findings in the YPQI study. Impact findings are those based on estimation of an experimental contrast between the randomly assigned intervention and control groups. Implementation findings represent our best effort to extend our understanding of the impact findings by asking questions like "How much?" and "Under what conditions?" These questions lie outside of the experimental design but are critical for potential adopters of the YPQI. (For additional detail regarding YPQI study findings, see Chapters 4 and 5 and related appendices in Smith et al. [2012].)
Impact Findings
In this section we consider each step in the YPQI Theory of Action and describe the "amount" of YPQI impact at each step. In general, we describe the impact in terms of the original metric, but for some of the impact estimates we also present a standardized effect size viii to facilitate comparison across measures and studies. The impact of the YPQI was positive and statistically significant (p < .01) for all primary outcome variables except staff employment tenure, which was positive but only marginally significant for both the 10-month (p = .08) and 2-year (p = .09) measures.
Manager Participation in YPQI T&TA Supports (site manager "dose"). During the implementation year, managers in the intervention group were more likely than those in the control group to receive T&TA supports for: data collection using an observational assessment (76% vs. 12%); improvement planning (76% vs. 19%); coaching staff on instructional practices (88% vs. 21%); and on-site assistance from a TA coach to strategize and plan about quality improvement (78% vs. 23%). Each of these differences was statistically significant (p < .01). This evidence warrants subsequent impact analyses because random assignment caused the intervention group sites to receive a substantial dose of YPQI T&TA supports, in marked contrast to the much smaller dose received by the control group.
Site Improvement Focus. It is important to know whether the site team is actually focused on instructional quality, because it is possible for site teams to focus on other issues (e.g., parent involvement), which may weaken the cross-level cascade of effects. At baseline, 10% of intervention group managers (and 13% of control group managers) indicated any instructional improvement focus. During the implementation year, 43% of intervention group managers (24% of control) indicated that their site's improvement efforts were focused on an instructional issue.
Manager Continuous Improvement Practices. Site managers assigned to the YPQI enacted continuous improvement practices at higher rates than their control group peers (standardized effect size = 0.98, p < .001). In practical terms, on average, site managers in the YPQI implemented one more of the continuous improvement practices than controls. If we consider implementation fidelity, substantially more intervention group managers were high implementers of continuous improvement practices ix in comparison to their control group peers (53% vs. 16%), and fewer intervention group managers were not implementing any such practices in comparison to their control group peers (4% vs. 16%).
Staff Continuous Improvement Practices. Staff in afterschool sites assigned to the intervention engaged in continuous improvement practices at significantly higher rates than their control group counterparts (standardized effect size = .54, p = .003). In practical terms, on average, site staff in the YPQI group implemented approximately one more practice at two-thirds of the sites in each network. If we consider implementation fidelity, 40% of the intervention group staff reported engaging in all four continuous improvement practices, while only 21% reported equally high fidelity in the control group. x
Instructional Quality. Staff in afterschool sites assigned to the intervention group had higher levels of instructional quality than staff in the control group (standardized effect size = .55, p = .003). In practical terms, this effect size can be interpreted as an average increase of one level on two of the nine practices (or an increase of two levels on one practice) measured in the composite score used to assess instructional practices. For example, this change could represent a site extending skill-building practices from some to all youth or introducing youth planning opportunities where none had existed before. If we consider offerings that achieved high fidelity for staff instructional practices, 65% of intervention group staff received a mean Staff Instructional Practices Total Score of 4 or higher, while only 39% reported equally high levels of instructional quality in the control group.
Staff Employment Tenure. Participation in the YPQI had a positive but marginally significant (p = .08) effect on short-term staff tenure. At the end of the implementation year, participation in the YPQI increased the odds that staff were employed at the site for 10 months or more (84% of staff in the intervention group vs. 74% in the control group) and that staff were employed at the site for 2 years or more (69% intervention vs. 57% control).
Implementation Findings
In this section we address several questions related to implementation of the YPQI. While none of these questions can be answered with the level of certainty provided by the experimental design, we did collect data specifically to address these key issues related to how and why the YPQI achieved impact.
Does higher-fidelity implementation of continuous improvement practices produce higher-quality instruction? Yes. Staff engagement in the four continuous improvement practices is positively related to the quality of staff instruction. This finding also supports an important cross-level link in the cascade of effects described in the YPQI Theory of Action. Managers who engage more staff in more of the continuous improvement practices can expect those staff to enact higher-quality instruction in point-of-service settings with youth.
Is the effect on instructional quality robust across program conditions that are common in the field? Yes. We examined the extent to which the association between continuous improvement and instructional quality was moderated by high manager turnover, low staff education levels, and youth-adult ratios. None of these features had a statistically significant moderation effect. This evidence suggests that even in settings characterized by some of the field's most challenging conditions, the YPQI may still be effective.
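As an illustration of how one such moderation check could be run, the sketch below adds an interaction term to the same kind of two-level model used above; the column names are again hypothetical, and this is not the study's code.

```python
# Moderation check: does a site-level condition (here, a hypothetical indicator
# for low staff education) change the association between staff continuous
# improvement practices and instructional quality?
import pandas as pd
import statsmodels.formula.api as smf

staff = pd.read_csv("staff_outcomes.csv")

# The ci_practices:low_staff_education interaction carries the moderation
# effect; a coefficient near zero is consistent with a robust main association.
moderation = smf.mixedlm(
    "instructional_practices ~ ci_practices * low_staff_education + C(network)",
    data=staff,
    groups=staff["site_id"],
).fit()
print(moderation.summary())
```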
Were YPQI practices sustained in the follow-up year, when participation was not required or requested? Yes. Using data collected from intervention group sites during the baseline, implementation, and follow-up years, we analyzed trends on three outcome measures over time: site improvement focus, staff continuous improvement practices, and staff employment tenure. In each case, the difference between baseline performance and the level of performance sustained in the follow-up year was positive and statistically significant. This finding suggests that YPQI T&TA supports have a sustained effect in subsequent years.
How much time did it take for site managers and staff to participate in YPQI T&TA supports and then implement continuous improvement practices at their sites? Based on service logs from the YPQI study and subsequent deployments of the intervention, we estimated that a site manager spends an average total of 52 hours over 18 months: 25 hours attending training, 12 hours implementing continuous improvement practices, and 15 hours with a coach or conducting miscellaneous tasks. On average, three additional staff on the site team spent a combined total of 71 hours. xi
What was the cost of the T&TA supports in the YPQI Study? The estimated cost for YPQI T&TA supports was $333 per staff member, or $3,028 per site, during the implementation year.
Discussion
This study finds a preponderance of evidence that the YPQI works. When afterschool site managers implement a sequence of continuous improvement practices with site teams, the quality of instructional practices available to youth improves. Furthermore, the positive and nearly significant impact on staff tenure hints at the effect of the YPQI on building a positive organizational culture and climate that increases staff retention. These findings are the product of a rigorously designed intervention and provide some of the first experimental impact estimates regarding quality improvement systems in the afterschool field.
As described in the YPQI Theory of Action (see Figure 1), the intervention was designed to produce a cascade of effects across multiple levels of afterschool settings: from a single site manager engaging with standards and supports in the policy setting, to the creation of a site-based improvement team with multiple staff in an afterschool organization, and, finally, to the transfer of improvement plans into point-of-service instructional performances. Importantly, the YPQI Study design produced an experimental estimate at each step in this model, providing rare "black box" impact estimates that suggest how the intervention mechanism produces effects across multiple actors and levels of afterschool settings. Figure 5 presents standardized effect sizes for each of the outcomes described in the YPQI Theory of Action. xii
Figure 5. Cascading Effects
* indicates statistical significance at the p < .01 level
Non-experimental analyses supported the hypothesis that one critical link in the chain of effects - the opportunity for staff to engage in continuous improvement practices - was associated with variation in the quality of instruction. xiii This association provides strong non-experimental evidence supporting the YPQI Theory of Action and a specific cross-level effect: when site staff are more deeply engaged in a continuous quality improvement process, the quality of their instruction improves.
Additional implementation analyses support further important conclusions. First, the YPQI has robust impact across widely varied afterschool systems and achieves effects despite challenging structural features that characterize individual sites, including staff education, youth-adult ratios, and staff turnover. Second, analyses across three years suggest that levels of staff participation in continuous improvement teams are sustained over time.
Finally, we asked if the YPQI could be carried out using resources normally available to public agencies and community-based organizations. While we could not answer this question directly, we calculated time estimates and costs for the intervention as delivered in the study, noting that the YPQI was carried out using human resources already available in each of the networks. Elsewhere, we have attempted to compare the intensity of the YPQI to other interventions producing similar standardized effect sizes, suggesting that the YPQI is cost-effective for the afterschool field. xiv
Conclusions
The YPQI Study makes a much-needed contribution to our understanding of how a site-level continuous improvement intervention can work and be implemented at scale in quality improvement systems. Of particular interest to policymakers is the fact that the policy-level performance standards for continuous improvement and instruction in the YPQI model were "lower" stakes. Sites were not penalized by their leadership or by their customers if they failed to attain a certain level of quality. Despite this lack of either performance data publicity or direct sanction, program quality still improved in response to standards and supports that were designed first and foremost to empower site managers to enact the four continuous improvement practices.
Limitations of the Study
The primary limitation of this study is that it does not examine in detail the relations between the intervention and child-level changes in engagement and skill building. For reasons of both design feasibility and cost, child-level change was not the object of evaluation in this study. Nevertheless, extension of the concept of a "cascade" of intervention effects across levels in future studies should ultimately include detailed longitudinal assessment of child engagement in afterschool settings and long-term skill building. Another limitation raised by several reviewers is that the intervention group was trained on the outcome measure; that is, the Youth PQA was both a standard for performance in the intervention and the source of the focal outcome measures. Although it is possible that staff in randomly sampled afterschool offerings could have performed for the rater who observed their offering because they were familiar with the Youth PQA (raters were blind to condition), this kind of peak-performance response is difficult to achieve. A final weakness of the study was our inability to more thoroughly track effects into subsequent years. Our follow-up year data collection did not include observation of instructional quality; it focused only on measures that could be completed using manager and staff self-reports on surveys. A major unanswered question for the YPQI relates to cumulative effects over time. It seems likely that both manager continuous improvement skills and staff instructional skills could improve over multiple years, increasing the increment added to instructional quality each year until a threshold or ceiling is reached. Our study did not allow us to evaluate these questions.
References
Akiva, T., Brummet, Q., Sugar, S., & Smith, C. (2011, April). Staff instructional practices, youth engagement, and belonging in out-of-school time programs. In Shernoff, D. J. (Chair), Advances in out-of-school time research: Examining the variables important for successful OST programming and experiences. Paper symposium conducted at the annual meeting of the American Educational Research Association, New Orleans, LA.
Bloom, H. (2004). Randomizing groups to evaluate place-based programs. Unpublished manuscript.
Cohen, D. K., Raudenbush, S. W., & Loewenberg Ball, D. (2003). Resources, instruction, and research. Educational Evaluation and Policy Analysis, 25, 119-142.
Deci, E. L., & Ryan, R. M. (2000). The "what" and "why" of goal pursuits: Human needs and self-determination of behavior. Psychological Inquiry, 11, 227-268.
Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41, 327-350.
Durlak, J. A., Weissberg, R. P., & Pachan, M. K. (2010). A meta-analysis of after-school programs that seek to promote personal and social skills in children and adolescents. American Journal of Community Psychology, 45, 294-309.
Fischer, K. W., & Bidell, T. R. (2006). Dynamic development of action, thought, and emotion (6th ed., Vol. 1). New York: Wiley.
Gardner, M., Roth, J. L., & Brooks-Gunn, J. (2009). Can afterschool programs level the playing field for disadvantaged youth? New York: Teachers College Press.
Gottfredson, D., Cross, A. B., Wilson, D., Rorie, M., & Connell, N. (2010). Effects of participation in after-school programs for middle school students: A randomized trial. Journal of Research on Educational Effectiveness, 3, 282-313.
Grossman, J. B., Lind, C., Hayes, C., McMaken, J., & Gersick, A. (2009). The cost of quality out of school time programs. Philadelphia, PA: Public/Private Ventures.
Halpern, R. (2003). Making play work: The promise of after-school programs for low-income children. New York: Teachers College Press.
Hansen, D. M., & Larson, R. W. (2007). Amplifiers of developmental and negative experiences in organized activities: Dosage, motivation, lead roles, and adult-youth ratios. Journal of Applied Developmental Psychology, 28, 360-374.
Hansen, D. M., & Skorupski, W. (2012, March). Association of urban young adolescents' peer processes and the quality of structured youth programs. Paper submitted for the Biennial Meeting of the Society for Research on Adolescence, Vancouver, British Columbia, Canada.
Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. New York: Routledge.
HighScope Educational Research Foundation. (2005). Youth PQA program quality assessment: Administration manual. Ypsilanti, MI: HighScope Press.
James-Burdumy, S., Dynarski, M., Moore, M., Deke, J., Mansfield, W., Pistorino, C., & Warner, E. (2005). When schools stay open late: The national evaluation of the 21st Century Community Learning Centers program final report. U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance.
Keller, E. (2010). Cutting costs, keeping quality: Financing strategies for youth-serving organizations in a difficult economy. Washington, DC: The Finance Project.
Larson, R. (2000). Toward a psychology of positive youth development. American Psychologist, 55, 170-183.
Lauer, P. A., Akiba, M., Wilkerson, S. B., Apthorp, H. S., Snow, D., & Martin-Glenn, M. L. (2006). Out-of-school time programs: A meta-analysis of effects for at-risk students. Review of Educational Research, 76, 275-313.
Mahoney, J., Vandell, D. L., Simpkins, S., & Zarrett, N. (2009). Adolescent out-of-school activities. In R. M. Lerner & L. Steinberg (Eds.), Handbook of adolescent psychology (3rd ed., pp. 228-269). New York: John Wiley.
Mashburn, A. J., Pianta, R. C., Hamre, B. K., Downer, J. T., Barbarin, O. A., Bryant, D., et al. (2008). Measures of classroom quality in prekindergarten and children's development of academic, language, and social skills. Child Development, 79, 732-749.
Metz, R. A., Goldsmith, J., & Arbreton, A. J. A. (2008). Putting it all together: Guiding principles for quality after-school programs serving preteens. Philadelphia, PA: Public/Private Ventures.
Pedersen, S., & Seidman, E. (2005). Contexts and correlates of out-of-school activity participation among low-income urban adolescents. In J. Mahoney, R. W. Larson, & J. S. Eccles (Eds.), Organized activities as contexts of development: Extracurricular activities, after-school and community programs (pp. 65-84). Mahwah, NJ: Erlbaum.
Pianta, R. C., & NICHD ECCRN. (2008). Developmental science and education: The NICHD study of early child care and youth development - findings from elementary school. In R. V. Kail (Ed.), Advances in child development and behavior. New York: Elsevier.
Princiotta, D., & Fortune, A. (2009). The quality imperative: A state guide to achieving the promise of extended learning opportunities. Washington, DC: Council of Chief State School Officers and National Governors Association Center for Best Practices.
Raudenbush, S., Martinez, A., & Spybrook, J. (2007). Strategies for improving precision in group-randomized experiments. Educational Evaluation and Policy Analysis, 29, 5-29.
Smith, C. (2005). Evidence of effectiveness for training in the HighScope Participatory Learning Approach. Ypsilanti, MI: HighScope Press.
Smith, C., Akiva, T., & Henry, B. (2006). Quality in the out-of-school time sector: Insights from the Youth PQA Validation Study. Paper presented at the Society for Research on Adolescence biennial meeting, San Francisco, CA.
Smith, C., Akiva, T., Sugar, S., Lo, Y. J., Frank, K. A., Devaney, T., Peck, S. C., & Cortina, K. S. (2012). Continuous quality improvement in afterschool settings: Impact findings from the Youth Program Quality Intervention study. Washington, DC: The Forum for Youth Investment.
Smith, C., Pearson, L., Peck, S. C., Denault, A., & Sugar, S. (2009). Managing for positive youth development: Linking management practices to instructional performances in out-of-school time organizations. Paper presented at the annual conference of the American Educational Research Association, San Diego, CA.
Smith, C., Peck, S. C., Denault, A., Blazevski, J., & Akiva, T. (2010). Quality at the point of service: Profiles of practice in after-school settings. American Journal of Community Psychology, 45, 358-369.
Wigfield, A., Eccles, J. S., Schiefele, U., Roeser, R. W., & Kean, P. D. (2006). Development of achievement motivation. In W. Damon & R. M. Lerner (Series Eds.) & N. Eisenberg (Vol. Ed.), Handbook of child psychology (6th ed., Vol. 3): Social, emotional and personality development. New York: Wiley.
Zaslow, M., Anderson, R., Redd, Z., Wessel, J., Tarullo, L., & Burchinal, M. (2010). Quality dosage, thresholds, and features in early childhood settings: A review of the literature (OPRE 2011-5). Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.
Notes
i Our data suggest that "academic support" is the most widely endorsed priority of afterschool program managers and that an amazingly diverse set of academic enrichment and non-academic enrichment activities is delivered to support school-related content using methods that complement rather than replicate those used during the school day.
ii This conclusion has been reached in a number of related fields where the qualities of how adults interact with children have been associated with child effects. In the early childhood and school-day fields, numerous high-quality studies, reviews, and meta-analyses conclude that "process quality" or "instruction" is an important determinant of child learning and development. See Cohen, Raudenbush, & Loewenberg Ball, 2003; Hattie, 2009; Mashburn et al., 2008; Pianta & NICHD ECCRN, 2008; Zaslow, Anderson, Redd, Wessel, Tarullo, & Burchinal, 2010.
iii The YPQI study was designed to assess context-level effects, not child-level outcomes. In pragmatic terms, the sample size necessary to detect context-level effects in relation to the quality of manager behavior and staff instruction was very large (e.g., N=100 sites in the original design). Further, given the transience of afterschool program participation, our ability to adequately track individual subjects across so many sites was beyond the available resources. However, we did collect unidentified child-level data at several points in this study to establish group equivalence at baseline and to examine the proximal association between quality and youth engagement. These and other correlational findings using child-level data are discussed elsewhere (e.g., Akiva, Brummet, Sugar, & Smith, 2011).
iv In theory, other behavior-focused measures of practice could be inserted into this intervention model, depending on the definition of high-quality practice that is used.
v Akiva, Brummet, Sugar, & Smith (2011) and Hansen & Skorupski (2012) describe the relation between the quality of afterschool offerings and youth engagement in several independent samples. According to our theory of change, high-quality instruction produces youth engagement during a given session. Simultaneous presence of high-quality instruction and high youth engagement across multiple sessions produces mastery experiences in a number of domains, depending on the content of the offering sessions. These content-specific mastery experiences in the afterschool context produce longer-term skill development and corresponding skill transfer outside of the afterschool setting.
vi Program offerings are defined as micro-settings with the same staff, same youth, and same learning purpose being pursued over multiple sessions. The YPQI sample of offerings was designed to exclude activities characterized primarily as homework, tutoring, competitive sport, and unstructured time.
vii These scales were selected as the most reliable and representative subset of the published Youth PQA. For details and confirmatory analyses, see Smith et al. (2010).
viii The standardized effect sizes presented for all outcomes (except staff tenure) are based on Cohen's d: the mean difference between the intervention and control groups divided by the pooled standard deviation for the control group at baseline. See Chapter 4 and Appendix F in Smith et al. (2012) for details on how a two-level statistical model was used to produce the adjusted means and variance estimates necessary to calculate standardized effect sizes.
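A minimal worked illustration of this effect-size convention, using made-up numbers rather than the study's estimates:

```python
# Cohen's d as described in note viii: the adjusted mean difference between the
# intervention and control groups divided by the control group's baseline
# standard deviation. All values here are hypothetical.
def cohens_d(mean_intervention, mean_control, sd_control_baseline):
    return (mean_intervention - mean_control) / sd_control_baseline

# A 0.44-point adjusted difference on a scale whose control-group baseline
# standard deviation is 0.80 corresponds to d = 0.55.
print(round(cohens_d(3.84, 3.40, 0.80), 2))  # 0.55
```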
ix High implementation for managers was defined as implementing all three practices counted in the Manager Continuous Improvement Practices Score. See Chapter 3 of Smith et al. (2012) for full details.
x See Chapter 3 in Smith et al. (2012).
xi These estimates do not include time spent implementing higher-quality instruction during point-of-service offerings with youth.
xii Although the declining size of standardized effects is clearly intriguing, the stronger claims that effects more proximal to the intervention are either (a) the direct cause of impacts at subsequent levels or (b) larger because they are more proximal to the intervention cannot be experimentally substantiated in this study. However, the critical cross-level effect of staff continuous improvement on instruction is explored directly in Chapter 5 of Smith et al. (2012).
xiii Because staff engagement in continuous improvement practices introduced by the site manager is a critical link in the hypothesized chain of effects, we conducted an instrumental variables analysis using assignment to the YPQI as an instrument to remove unwanted error variance from the staff continuous improvement practices score. This score was a positive and statistically significant predictor of the quality of staff instruction.
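The sketch below shows one simplified way such a two-stage instrumental-variables analysis could be set up; it is not the study's code, the column names are hypothetical, and a production analysis would also correct the second-stage standard errors and account for staff clustering within sites.

```python
# Two-stage least squares: random assignment to the YPQI ('treated') serves as
# the instrument for the staff continuous improvement practices score.
import pandas as pd
import statsmodels.formula.api as smf

staff = pd.read_csv("staff_outcomes.csv")

# Stage 1: predict continuous improvement practices from random assignment.
stage1 = smf.ols("ci_practices ~ treated + C(network)", data=staff).fit()
staff["ci_practices_hat"] = stage1.fittedvalues

# Stage 2: regress instructional quality on the instrumented practices score.
stage2 = smf.ols(
    "instructional_practices ~ ci_practices_hat + C(network)", data=staff
).fit()
print(stage2.summary())
```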
xiv We did compare the YPQI standardized effect on instruction to several other studies and meta-analytic findings that employed rigorous designs and observational assessments with some similarity to the Youth PQA to produce comparable outcome estimates on classroom and setting instruction. Across studies, YPQI impact estimates on instruction were of similar magnitude, the critical difference being that in each of these studies the intensity of the training and coaching interventions was much greater and there was no "cascading" effect, meaning that these interventions directly targeted staff instructors and caregivers. These comparisons suggest that the YPQI may be more cost-effective than other, more traditional intervention models, but future research will be necessary to adequately address this question.