Quality in the Out-of-School Time Sector:
Insights from the Youth PQA Validation Study
Society for Research on Adolescence Biennial Meeting
March 26, 2006
DRAFT
Charles Smith, Tom Akiva & Brenda Henry
High/Scope Educational Research Foundation
DRAFT – Quality in the Out-of-School Time Sector: Insights from the Youth PQA Validation Study
SRA Biennial Meeting ▪ March 26, 2006 ▪ High/Scope Educational Research Foundation
2
Introduction
Over the last decade the High/Scope Educational Research Foundation has developed and
validated an observational assessment instrument for out-of-school time (OST) programs, the
Youth Program Quality Assessment (High/Scope, 2005; Smith, 2005a), and several
methodologies for its use (Smith, 2005b). This experience has been instrumental in shaping our
ideas about what program quality is, how it works, and how OST organizations can consistently
produce it (Akiva, 2005; Akiva, 2006). There is much discussion in the field of youth
development about the nature and effects of program quality—even arguably a rough consensus
on program practices and elements that define quality in youth development settings (Eccles &
Gootman, 2002; Gambone et al., 2001; Forum for Youth Investment, 2004). However, there is
less guidance available regarding the relative importance of specific quality practices, how to
know if a program is producing them well enough, and perhaps most importantly, how these
elements of quality can be intentionally applied to improve OST settings. This article attempts to
join what we have learned about program quality in OST programs—what counts and how best
to see it—to a framework that describes organizational structure and change. We hope that this
effort will inform setting-level intervention, improvement, and accountability work in the OST field.
After collecting hundreds of structured observations in a wide variety of youth work
settings, we frame the issue like this: a high quality program provides youth with access to key
experiences that advance adaptive, developmental and learning outcomes. However, OST
organizations frequently miss opportunities to provide these key experiences for the youth who
attend them. These missed opportunities represent systemic underperformance because they arise from existing structures, practices, and policies across the OST sector. The areas of
underperformance can be identified, described and assessed through two setting-level constructs:
quality at the point of service (POS quality) and quality of the professional learning community
(PLC quality). POS occurs where youth, staff, and resources come together, and POS quality
involves both (1) the delivery of key developmental experiences and (2) the level of access
participating youth have to these experiences. In high quality programs, the PLC exists primarily
to build and sustain the POS. PLC quality, as we construe it, is primarily focused on (1) the role
of supervisors as human resource managers and (2) the creation of knowledge management
systems that facilitate the translation of program data/information into plans for action related to
POS quality. Our ideas about the elements of quality that make up POS and PLC constructs are
not new. What is important is how these setting-level constructs allow us to see quality more
clearly and in ways that are linked to structure and change dynamics in OST organizations.
In the next three sections we examine, in turn, a diagram of how the PLC and POS settings occur in organizations, a theory of the dynamics that influence these settings (making quality higher or lower), and the contents and psychometric characteristics of our primary POS quality measure, the Youth Program Quality Assessment (Youth PQA). With these pieces in
hand, we then move to the task of defining the empirical context of POS and PLC quality across
a wide range of OST settings.
OST-Setting Structural Diagram
In order to better understand where and how quality happens in OST programs we need
to open up the “black box” of program quality – to look inside the buildings, classrooms and
workgroups that are sites of youth development programming and see how they operate.
However, despite the apparent simplicity of the metaphor, the “black box” of program quality is
not just a container for human interaction, relationships and resources that can be directly
revealed; the PLC and POS represent different boxes, and inside, each has its own fairly
elaborate structures, sites and roles. The top section of Figure 1 presents a structural diagram for
the PLC and POS settings in an OST organization. Although this model is causally complex, we
are primarily interested in boxes A, B and C – and especially Box B, POS quality. Very simply,
Box B is where the youth and staff meet and the quality of this experience leads to the staff and
youth outcomes represented by Box C. Box B has a lot to do with the performance of individual
staff. However, Box B is also influenced by Box A, the priorities, values, and artifacts that a
community of staff share, and for which program managers should take primary responsibility.
Box C is important because it represents youth-level outcomes. Box C also suggests that
important staff outcomes (retention, professionalization, job satisfaction, motivation) may follow
from the experience of high POS quality (see related discussion in Pianta, 2003). Further, the
large arrow moving from Box C back to Box B represents how youth learn to “take over” in
programs with high POS quality: as the kids learn routines and buy in to the program experience, they become contributors to POS quality.
Figure 1 is also a multi-level model. While PLC and POS are both setting-level constructs, they represent distinct levels of organizational functioning. The bottom part of Figure
1 (measurement model) provides a visual representation of the nested structure that exists within
Boxes A, B and C above. An OST organization usually contains several program offerings:
relatively stable groupings of youth and staff who meet for an ongoing purpose. For example, an
after-school program may have the following program offerings: arts & crafts, homework help,
team sports, and technology workshop. Each program offering contains at least one staff member
and usually several youth. In Figure 1, some youth (Y1, Y2, and Y3) attend multiple program
offerings and one staff member (S2) leads both program offerings 2 and 4. So within a single
organization, youth experience occurs in several settings.
Figure 1 - OST Structural and Measurement Model

[Figure 1 shows two linked models. Structural model (top): Box A, PLC Quality (continuous improvement tools; prioritize POS; collaborative staff structures; best practice methods) leads to Box B, POS Quality (safe environment; supportive environment; interaction; engagement; youth voice), which leads to Box C, Individual Level (short-term youth outcomes and short-term staff outcomes), with a large feedback arrow from Box C back to Box B. Exogenous influences include quality standards, licensing, & accreditation; organization purpose, structure, content, resources; supervisor & staff education & experience; and child & family background, neighborhood characteristics, etc. Measurement model (bottom): the organization contains Offerings 1-4; youth Y1, Y2, and Y3 attend multiple offerings; staff S1, S2, and S3 lead offerings, with S2 leading both Offerings 2 and 4.]
From the perspective of an individual youth, the experience of quality may differ substantially
depending on which program offerings he or she attends1 and, during the offering, which
developmental experiences he or she has access to. POS quality (access to key developmental
1 We are not trying to sort out the effect of dosage, but making the point that it is possible for kids to come to the
same building and have dramatically different experiences depending on what offerings they attend and/or to have
highly varied experiences of POS quality at different moments during the same program day.
experiences) is not necessarily an organization-wide quantity, but is a distinct mix of attributes
in each of the several program offerings that organizations provide. Because the primary
purpose of a PLC should be to build and sustain the POS, the most general definition of PLC quality describes a manager who is able to focus staff on quality in each individual offering where sustained adult-youth and youth-youth experiences occur.
Theory of Setting Dynamics
If the PLC and POS constructs are to be useful for improvement, intervention, and
accountability, we also need to understand how they are related and how they can change. The
theory of behavior settings, as introduced by Barker (1968) and extended by Schoggen (1989),
provides a theoretical perspective on the change dynamics that occur within the POS and PLC
behavior settings. Behavior settings are small-scale social systems that are self-generated, are
bound by space and time, and have fairly clear boundaries between internal patterns and other
external patterns of behavior (Wicker, 1992; Schoggen, 1989). The structural and material
characteristics of settings have a profound impact on the human roles that develop, and
consequently, the actions that are taken by the people who inhabit the setting. A given behavior
setting contains setting features—the material (e.g., a poster that describes steps for conflict
resolution) and structural (e.g., annual staff meeting to discuss ratings on the Youth PQA) and
social (e.g., a group song after snack) characteristics of organizations that influence the behaviors
of setting participants. According to the theory, the artifacts and traditions that reside in the
setting regulate behavior to a considerable degree beyond more distal sources such as attitudes,
beliefs, education level, and background (Barker, 1968, 202-205). For example, when a person
enters a café to buy a cup of coffee, she follows the patterns of behavior of a café customer.
Whoever this person is, and whatever her background, she is extremely likely to follow the
behavioral patterns—the role—of a customer. Although we do not offer a specific discussion of
setting features in this paper, setting features that drive behavior related to PLC and POS quality
might include:
• Posters describing youth-centered values of the organization.
• Surveys of youth interests that drive programming.
• New staff orientation process that focuses on positive youth development
values of the organization.
• Staff performance evaluation forms that emphasize POS quality.
• Regular training in a youth development method for line staff such as those
provided by NTI, High/Scope, Search or NIOST.
• Use of a program quality assessment tool in conjunction with a data-driven
continuous improvement process.
• A youth advisory council that incorporates youth voice into operation and
management of the organization.
• Supervisors starting staff meetings with structured reflections on recent
successes with challenging students.
Figure 1 then describes two related but distinguishable types of behavior settings, the
POS (a behavior setting for each program offering emphasizing a staff role as facilitator) and the
PLC (a behavior setting for the organization as a whole emphasizing a managerial role as a
human resources developer). The POS is the ecological environment present whenever staff and
youth or youth and youth interact directly in a behavior setting. POS is the behavior setting that
happens in a program offering; sometimes that grouping is every child and adult present at the
site, or sometimes it is a sub-grouping, such as a journalism workshop that meets in a particular
classroom for a set time. The PLC is less confined to time and place. In youth serving
organizations, the PLC usually occurs in staff-specific settings, or formal staff meetings, but can
be extended to the range of moments when staff meet formally or informally to talk about how
they will design and deliver programming.
The theory speaks to the OST sector. What are the material and social features that can
be literally put into the POS and PLC that will influence the behavior of setting participants to
produce high POS quality (the experiences described in the Appendix)? We suggest that there is
a systemic disconnect between features of the setting and the delivery of key developmental
experiences for youth. Improvement of the POS (raising quality) directly, and indirectly through
improvement of PLC, will require setting features that drive staff and youth behaviors related to
more regular production of key developmental experiences. This sounds more complicated than
it needs to be: in our experience, most OST organizations put little coordinated effort into raising
the quality of youth experiences and have no “yardstick” for the quality of adult-youth or youth-
youth process that occurs. Simply by starting to pay attention, as a group, to the issue of quality,
the PLC will be affecting the POS in a positive direction.
The theory of behavior settings is particularly powerful in the OST sector because it
directly addresses two of the most challenging obstacles to quality (and just about everything else) in the sector: (1) the limited professional knowledge/experience of entry-level staff and (2) the
frequency and volume of turnover for both staff and youth. If behavior norms reside in a setting
to some significant extent, it should be possible to introduce setting features that focus energy on
the production of high POS quality, despite the limited professional experience and the transience of setting participants.
The Youth PQA and Validation Study
The Youth Program Quality Assessment (Youth PQA) and the Youth PQA Validation
study were efforts to identify and understand the POS quality in OST settings. Through this
work, we have reached the conclusion that high POS quality does not often exist without high
PLC quality to support it. POS quality as we have defined it in our own work consists of five
dimensions: safe environment, supportive environment, interaction, engagement, and youth voice. These five dimensions are subscales in the Youth PQA. The Appendix provides a
complete list of the Youth PQA subscales, items, and indicators that we use to represent POS
quality.
The Youth Program Quality Assessment Validation Study2 was a comprehensive
evaluation of the Youth PQA designed to rigorously assess the instrument’s reliability, validity,
and usefulness.3 The primary validation study was conducted on two waves of data for a sample
totaling 1,635 youth nested within 162 offerings which were in turn nested within 51 youth
serving organizations.4 The Youth PQA Validation Study was designed to determine the
applicability of the POS quality construct across a wide range of settings as a generic quality
measure, independent of program content or purpose. For example, the wave-2 data sample (the
largest in the study) consisted of 35% school-based programs, 57% community-based programs,
and 8% camps. Across these program auspices, 7% were school-day interventions, 53% were
after-school programs, 17% were summer programs, and 16% were residential programs. Over
90% of the program offerings met at least one time per week. Sixty percent of the youth sample
was female and nearly 40% were between the ages of 10 and 12 years old (the rest were older).
Finally, 38% of the youth sample was African-American, 40% were White, 8% were Latino, 2%
were Middle Eastern, 1% were Asian, and 12% were listed as other.
Table 1 presents basic psychometric information for the Youth PQA’s four observational
subscales. The internal consistency column presents Cronbach's alpha coefficients for 118 Youth
PQA ratings drawn from 36 organizations. The inter-rater reliability column presents intraclass
2 See reports available at youth.highscope.org (Smith, 2005a; Smith 2005b; Smith 2006a; Akiva, 2005; Akiva 2006)
3 The study was funded by the W.T. Grant Foundation during the period 2001-2005. The instrument review process
was facilitated by several organizations including the W.T. Grant Foundation, the Michigan Department of
Education, and the Forum for Youth Investment.
4 The overall sample has since been extended and now includes over 70 organizations. The findings presented in this
report are drawn from the published findings regarding the first two waves of data collection for the 51
organizations, the primary validation study, and from unpublished data analyses for an extended sample that
combines the 51 organizations with Youth PQA data from an additional 20.
correlation coefficients (ICCs) calculated for 44 rater-pairs.5 The concurrent validity column
presents bivariate correlation coefficients for 454 youth who attended 29 program offerings to
describe the level of association between Youth PQA subscale scores and youth survey reports6
on aligned measures. The four subscales in Table 1 reflect findings from factor analyses on two waves of data from a total of 162 program offerings, which confirmed these related but distinguishable subscales (Smith, 2005a).
Table 1 Youth PQA Reliability and Validity Evidence

Youth PQA Observation Subscale     Internal consistency    Inter-rater reliability    Concurrent validity (Pearson r w/
                                   (alphas; N=118          (ICCs; N=44 paired         aligned youth survey subscales;
                                   ratings)                ratings)                   N=29 offerings, 454 youth)
Safe Environment (5 items)         .43                     .48                        .42*
Supportive Environment (6 items)   .84                     .69                        .29+
Interaction (4 items)              .64                     .83                        .44**
Engagement (3 items)               .70                     .72                        .32*

+p<.1, *p<.05, **p<.01
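The inter-rater statistic above can be illustrated with a short sketch of the one-way random-effects intraclass correlation, in which rater effects are treated as error (as footnote 5 describes). The `icc_oneway` helper and the rating pairs are our own invented example, not the study's data.

```python
# Sketch: one-way random-effects ICC (ICC(1,1)) for paired ratings.
# Compares within-pair variance to between-target variance; rater
# effects are absorbed into error, matching a random selection of raters.

def icc_oneway(pairs):
    """pairs: list of (rater1_score, rater2_score), one pair per offering."""
    k = 2                         # ratings per target (rater pairs)
    n = len(pairs)                # number of rated targets
    grand = sum(a + b for a, b in pairs) / (k * n)
    # Between-target and within-target sums of squares
    ss_between = sum(k * ((a + b) / k - grand) ** 2 for a, b in pairs)
    ss_within = sum((a - (a + b) / 2) ** 2 + (b - (a + b) / 2) ** 2
                    for a, b in pairs)
    ms_between = ss_between / (n - 1)
    ms_within = ss_within / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Invented rating pairs for five observed offerings
pairs = [(4.2, 4.0), (3.1, 3.4), (2.5, 2.8), (4.8, 4.6), (3.9, 3.5)]
print(round(icc_oneway(pairs), 2))
```

High within-pair agreement relative to the spread across offerings yields an ICC near 1; disagreement between raters pushes it toward 0.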
The Youth PQA is a POS quality measure that produces scores of known reliability and
validity. This discussion of PQA performance provides support to the empirical discussion of
POS quality which follows. Further, successful use of the instrument in the validation study, and
the evidence of cohesion and performance for the subscales, supports the more general argument
that a generic POS quality construct is applicable across a wide range of OST settings.
5 ICCs compare the variance within the rater-pair scores to the variance between all scores for all raters. ICCs were
estimated using an equation that assumes a random selection of raters across rater pairs (i.e., rater effects are error).
6 Youth survey data was collected, entered, and prepared by an independent organization, Youth Development
Strategies, Inc. of Philadelphia, PA. For more information visit www.ydsi.org.
POS Quality Defined
In this section we define POS quality further, first, by examining patterns of stability and
variance that occur in quality scores across many different settings and, second, by situating the
POS quality construct within a theory that integrates POS quality with youth outcomes. To
support this discussion, we rely upon findings from research conducted using the Youth PQA
with a quality scale that ranges between 1 and 5. These scores can be interpreted as follows: at
the indicator level a score of 1 indicates the absence of a particular developmental experience, a
score of 3 indicates presence of the experience but with limited access for youth in attendance,
and a score of 5 usually indicates both the presence of the experience and access by nearly all
youth in attendance.
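The scoring rubric just described can be expressed as a small helper function. This is a hypothetical sketch: the function name and the treatment of the intermediate scores 2 and 4 are our own assumptions, since the rubric anchors interpretations only at 1, 3, and 5.

```python
# Sketch: indicator-level interpretation of Youth PQA scores as described
# above. Scores of 2 and 4 fall between anchors; grouping them with the
# lower anchor here is an illustrative choice, not part of the instrument.

def interpret_indicator(score):
    if not 1 <= score <= 5:
        raise ValueError("Youth PQA indicator scores range from 1 to 5")
    if score < 3:
        return "developmental experience absent"
    if score < 5:
        return "experience present, but with limited access for youth"
    return "experience present with access for nearly all youth"

print(interpret_indicator(3))
```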
The Empirical Context of POS Quality
POS quality is low for subscales II, III and IV. Table 2 provides POS quality scores for
several samples of data collected by trained outside observers7 including the Youth PQA
Validation Study full sample (column 1), center-based school-age care providers (column 2), 21st
Century elementary after-school sites (column 3), and model programs8 where POS quality as
defined by the Youth PQA was a primary emphasis of the PLC (column 4). POS quality on the
supportive environment, interaction and engagement subscales is fairly low across OST settings
described in the first three columns.
7 Currently, trained observers are required to achieve 80% perfect agreement with an expert rater. Observations for
each offering in the study entailed two hours of observation and anecdotal evidence gathering and one hour of
scoring.
8 Model programs were the High/Scope Institute for IDEAS 2004 session and the High/Scope Civic and Community Service Institute, 2004 and 2005 sessions.
Table 2 Youth PQA Subscale Scores for Offerings in Several Samples, Including Model Programs for Reference (Program Offering Level)

Subscale                      Youth PQA Validation    Center-Based       21st Century    Model
                              Study, All Offerings    School-Age Care    Elementary      Programs
                              (N=148)                 (N=12)             (N=15)          (N=14)
I. Safe environment           4.36                    4.10               4.38            4.45
II. Supportive environment    3.61                    3.14               3.69            4.17
III. Interaction              2.90                    2.97               2.93            3.48
IV. Engagement                2.61                    1.70               2.71            3.46
The ratings provided in Table 2 are considered low because programs cannot score high
unless most of the youth in attendance have access to the kinds of developmental experiences
that define high POS quality. Scores of 3 and lower indicate that substantial numbers of the
youth who are present do not get access to key developmental experiences. For example, in the
sample of 71 organizations reflected in Table 2: in 53% of those settings youth did not have
opportunities to lead groups; in 56% youth did not have opportunities to mentor other youth; in
30% youth did not have opportunities to practice group process skills; in 47% youth were not
given opportunities to work in small groups; in 55% no explicit grouping strategies were used; in
30% youth did not have opportunities to make open-ended content choices; in 27% youth did not
have opportunities to make open-ended process choices; in 45% youth did not have opportunities
to make plans; in 60% youth were not exposed to any intentional planning strategies; and in 47%
youth were not given opportunities to reflect.
POS quality has a hierarchical structure. A second critical attribute of quality in the OST
sector is a hierarchical structure. Across each of the samples presented in Table 2, the same
pattern is present in the data. Programs score highest on the safe environment subscale, and scores decline incrementally, first to supportive environment, then to interaction, and finally to their lowest for the engagement subscale. This pattern is reflected in the Pyramid of Quality diagram in
Figure 2 and is of interest for several reasons.
First, the items that make up the safety and supportive environment subscales (see
Appendix) describe the regulated sphere of youth programs, areas of quality that fall under the
focus of state licensing requirements, national accreditation guidelines, and quality standards that
have been developed by cities, counties and states across the country. These items also tend to be
what people in the field mean when they use colloquial definitions of “safe places” and
“relationships” as key dimensions of quality. We suggest that the field has reached neither regulatory nor ideological consensus about the types of items positioned at the top of the POS quality pyramid, so it is not surprising that scores are lower in these areas.

Figure 2 POS Quality Pyramid

[Figure 2 depicts POS quality as a four-level pyramid. From base to apex: Safe Environment (psychological and emotional safety; physically safe environment; program space and furniture; healthy food and drinks); Supportive Environment (welcoming atmosphere; appropriate session flow; active engagement; skill building; encouragement; reframing conflict); Interaction (experience belonging; be in small groups; lead and mentor; partner with adults); Engagement (set goals and make plans; make choices; reflect).]
In the model programs the distance between subscale scores looks different from that in the other samples; indeed, the difference between the interaction and engagement subscales for the model programs nearly disappears. This suggests that the hierarchical structure of POS quality is not fixed: when the appropriate setting features9 are in place, POS quality ratings may even out
at uniformly high levels.
POS quality varies with the performance of individual staff. Table 2 represents the empirical reality of POS quality in youth programs in the aggregate. However, each rating of POS quality in Table 2 (described by the Ns in its columns) was produced during a
single program offering with a unique staff member and purpose. The 189 individual offerings in
Table 2 are actually nested within 71 organizations – between 1 and 3 offerings were captured
within each of the 71 organizations. Looking at the individual offerings nested within their
respective organizations provides evidence about what is going on within Table 2’s aggregate
picture of quality in the OST field.
First, quality ratings for individual staff members are quite stable over time when
multiple ratings are collected for an offering with the same staff, same kids and same purpose. In
the Youth PQA Validation study we conducted multiple sequential observations in 44 program
offerings.10 We sent an observer into each program offering (for example, a photography
workshop that met every Tuesday from 3:00-4:30) three times over a three-month period. Youth
9 Setting features in these programs related to PLC and POS quality included: (1) training of all staff in a uniform
youth development method, (2) use of the Youth PQA as a self-assessment with all staff, (3) opportunities for youth
participation in governance, (4) supervisors who realize that how you work with staff is how they will work with
kids.
10 Only scores from the first offering are included in Table 2 to preserve independence of observations.
PQA scores for these sequential ratings produced Pearson-r correlations of 0.8 or greater
between the first rating and each subsequent rating in the same offering (Smith, 2004).
If quality scores for individual staff demonstrate stability over time, quality ratings
across offerings for different staff within the same organization vary substantially. In both three-
level unconditional HLM models and simpler ANOVAs, we found that the variance in PQA scores for the 189 individual offerings in Table 2 was quite evenly divided between individual staff nested within their respective organizations (level 1) and between-organization variance (level 2). Further, when variance in POS quality scores for individual
offerings was partitioned at three levels, the third level being organizational type, the
organizational type factor did not represent a substantial proportion of variance. For example,
in an ANOVA using program type as a factor (type=school-based, community-based, and
camps) the Youth PQA total score was significantly related to the program type (F=2.481,
N=197) but the proportion of between group score variance was only 4% of total variance.
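The variance partitioning described in this paragraph can be sketched as eta-squared, the between-group share of total score variance in a one-way ANOVA. The program types and offering scores below are invented for illustration; only the form of the calculation mirrors the analysis.

```python
# Sketch: between-group share of total variance (eta-squared) from a
# one-way ANOVA, analogous to partitioning offering-level Youth PQA
# scores by program type. Groups and scores are hypothetical.

def eta_squared(groups):
    """groups: dict mapping group label -> list of offering-level scores."""
    all_scores = [s for scores in groups.values() for s in scores]
    grand = sum(all_scores) / len(all_scores)
    ss_total = sum((s - grand) ** 2 for s in all_scores)
    ss_between = sum(
        len(scores) * ((sum(scores) / len(scores)) - grand) ** 2
        for scores in groups.values()
    )
    return ss_between / ss_total   # share of variance lying between groups

scores_by_type = {
    "school-based":    [3.1, 2.8, 3.4, 2.9],
    "community-based": [3.0, 3.3, 2.6, 3.2],
    "camp":            [2.9, 3.1, 3.0],
}
print(round(eta_squared(scores_by_type), 3))
```

With group means this close together, eta-squared is small: most score variance lies within rather than between program types, mirroring the 4% figure reported above.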
Content was not related to POS quality scores in the Youth PQA Validation study. Using
content data from the Youth PQA validation study, we constructed content emphasis scores11 for
each offering in the study. Content types were: (1) recreation, leadership and service, (2) math,
science and shop, (3) sport and physical fitness, (4) life skills, social, religion, (5) writing and
computers, (6) art and sewing, (7) music and dance. There was no pattern of association between
POS quality scores and the content emphasis scores, except for weak negative relationships
11 We experimented with several types of content scores with the same effect in each case. The final method was:
Youth PQA offering titles were coded 1=yes/0=no on 21 content types. Factor analysis was used to determine
content clusters that tended to occur in programs. Scores were created by adding the individual 1/0 content variables
into the seven clusters suggested by the factor analyses and then dividing by the total number of content types coded
for a given offering.
between POS quality and sports and life skills content clusters. No relationships between content
and quality were found in multivariate models.
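Footnote 11's content-scoring procedure can be sketched as follows. The cluster definitions, content types, and the example coding here are hypothetical; the study derived its seven clusters from factor analysis of 21 coded content types.

```python
# Sketch of the footnote 11 method: offering titles are coded 1/0 on
# content types, the codes are summed within clusters, and each cluster
# sum is divided by the total number of content types coded for the
# offering. Clusters and codings below are invented for illustration.

CLUSTERS = {
    "sport_fitness": {"sport", "physical fitness"},
    "art_sewing": {"art", "sewing"},
    "writing_computers": {"writing", "computers"},
}

def content_emphasis(coded_types, clusters=CLUSTERS):
    """coded_types: set of content types coded 1 for an offering's title."""
    total = len(coded_types)
    if total == 0:
        return {name: 0.0 for name in clusters}
    return {
        name: len(coded_types & members) / total
        for name, members in clusters.items()
    }

# e.g., an offering whose title was coded on three content types:
print(content_emphasis({"sport", "physical fitness", "art"}))
```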
POS Quality-Outcomes Theory
The relationship between POS quality and youth outcomes is a critical source of
validation for the POS quality construct. We employ Maslow’s (1943) hierarchical model of
human needs as a theoretical frame for interpretation of the quality-outcomes relationship in
OST settings.
The Quality Pyramid and Actualization Outcomes. In Maslow’s hierarchy of needs there
are basic needs (“deficiency” or D-needs), which when unmet lead to discomfort. These include
physiological, safety, belonging, and esteem needs. Meeting these needs is necessary before an individual can successfully focus on higher-order experiences; when unmet, the basic needs tend to direct an individual's attention. For example, if you do not feel safe, it is hard to focus on other tasks. When the D-needs are met, a person can move to the B-needs (“being needs”) related to actualization:
intrinsic growth and motivation. This hierarchy provides an interesting parallel to the pyramid
structure of quality that we found in OST programs. The lower levels of the pyramid represent
the kinds of experiences that might satisfy the D-needs and the higher levels of the pyramid are
likely to be related to the experience of satisfying B-needs. The pyramid of quality and
actualization as a program purpose provide organizational improvement with an interesting
telos: successful program improvement is about getting past safety and good relationships to a
greater focus on the quality of group interaction and task engagement in the setting.
Actualization is useful as a broad category of youth outcomes because it resides in the
theoretical company of a number of narrower concepts in the psychology and education literature
such as thriving (Lerner, 2002), flow (Csikszentmihalyi, 1990), intrinsic motivation (Dweck,
2000), zone of proximal development (Vygotsky, 1962), and development of initiative (Larson,
2000). First, the term simplifies the discussion about what kinds of outcomes programs produce.
Second, because actualization resides at the top of the pyramid, it also captures the social and
higher-order skill building that accompanies experiences of interaction and engagement.
Experiences at the top of the pyramid move youth closer to desirable outcomes related
to motivation and psychological well-being (Halpern, 2005; Reisner, 2005, p. 5) and to the
development of generic skills that support social and cognitive functioning in the community and
labor market (Murnane & Levy, 1996; Honey et al., 2005).
Testing POS-actualization relationships in our data. In recent analyses of data from the
Youth PQA Validation Study (Smith, 2005c), we found that a composite score for 10 Youth
PQA items drawn primarily from the interaction and engagement subscales was related to three
youth actualization outcomes: youth reports of interest in the OST program, a sense of growth
from the OST program, and reports that they were developing skills through participating in the
OST program. In a series of OLS regression models described in Table 3, the POS quality
variable was statistically significant and explained between 25% and 40% of the total variance in
the three youth outcome variables. Each of the models controlled for organizational capacity,
program structure, urbanicity, age of youth, and staff-youth ratio, with the POS quality variable
entered last in order to see its effect on explained variance. When the same models were run
with POS quality scores from the safe environment and supportive environment subscales, no
relationships were evident. In this admittedly rough test, access to POS quality
experiences at the top of the pyramid appears to be related to actualization outcomes.
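The hierarchical entry described above (controls entered first, the POS quality composite entered last, watching the change in adjusted R-squared) can be sketched with simulated data. The variable names, simulated effects, and reduced set of controls below are illustrative assumptions, not the study's actual data or estimates.

```python
import random

def ols_adjusted_r2(predictors, y):
    """Fit OLS via the normal equations (Gauss-Jordan elimination)
    and return the adjusted R-squared of the fitted model."""
    n, p = len(y), len(predictors)
    X = [[1.0] + [col[i] for col in predictors] for i in range(n)]  # intercept first
    k = p + 1
    # Build the augmented system [X'X | X'y]
    M = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(k)]
         + [sum(X[i][a] * y[i] for i in range(n))] for a in range(k)]
    for c in range(k):  # Gauss-Jordan with partial pivoting
        piv = max(range(c, k), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(k):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * b for a, b in zip(M[r], M[c])]
    beta = [M[i][k] / M[i][i] for i in range(k)]
    yhat = [sum(b * x for b, x in zip(beta, row)) for row in X]
    ybar = sum(y) / n
    ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    r2 = 1.0 - ss_res / ss_tot
    return 1.0 - (1.0 - r2) * (n - 1) / (n - p - 1)

# Simulated organization-level data: 44 orgs, two controls, and a POS
# quality composite with a built-in effect on a youth interest outcome.
random.seed(1)
n_orgs = 44
staff = [random.gauss(5, 2) for _ in range(n_orgs)]      # control: paid staff
age = [random.gauss(13, 2) for _ in range(n_orgs)]       # control: mean youth age
quality = [random.gauss(3, 0.6) for _ in range(n_orgs)]  # POS quality score
interest = [0.5 * q + 0.02 * s + random.gauss(0, 0.4)
            for q, s in zip(quality, staff)]

controls_only = ols_adjusted_r2([staff, age], interest)
with_quality = ols_adjusted_r2([staff, age, quality], interest)
print(round(with_quality - controls_only, 2))  # variance explained by quality
```

Entering the quality composite last, as in Table 3, isolates its contribution to explained variance over and above the controls.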
Table 3 OLS Models: POS Quality and Youth Interest, Growth, Skills

                               Interest         Growth           Skill Build
Number Paid Service Staff      .00 (-.09)       -.01 (-.19)      -.01 (-.18)
Multi-purpose (dummy)          .00 (.00)        .14 (.22)        .21 (.21)
Suburban (dummy)               -.27 (-.36)*     -.25 (-.48)**    -.42 (-.52)**
21st Century (dummy)           -.34 (-.40)*     -.25 (-.41)*     -.35 (-.36)*
Age level                      -.17 (-.38)+     .02 (.08)        -.02 (-.05)
Ratio                          .02 (.24)        .00 (.08)        .01 (.17)
Adjusted R-sq                  .25              .25              .16
Quality Score                  .34 (.70)**      .18 (.52)*       .22 (.41)+
Adjusted R-sq, whole model     .43              .35              .20

Standardized coefficients in parentheses; + p < .10; * p < .05; ** p < .01.
Organization-level Youth PQA data for 44 organizations, with surveys of 1,030 youth aggregated
up to the organization level. OLS models include controls for organizational capacity (number of
paid service staff and a dummy variable for 21st Century), mission (dummy variable for multi-
purpose versus single purpose), context (dummy variable for suburban versus urban), and child
characteristics (age and youth-staff ratio).
Dependent variables are from the YDSI Youth Survey: Interest (3 items), Growth (2 items), and
Skill Building (7 items).
The quality score variable is the average of 10 Youth PQA items: II.H, II.I, III.L, III.N, III.O,
IV.P, IV.Q, IV.R, V.C, V.D.
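The quality-score construction described in the table note (ten item scores averaged within an observation, then aggregated up to the organization) might look like the following sketch. The record layout and function name are hypothetical; the item labels and the 1-3-5 scoring rubric follow the Youth PQA.

```python
from collections import defaultdict

# The ten Youth PQA items named in the table note
ITEMS = ["II.H", "II.I", "III.L", "III.N", "III.O",
         "IV.P", "IV.Q", "IV.R", "V.C", "V.D"]

def org_quality_scores(observations):
    """Average the ten items within each observation, then aggregate
    the observation-level composites up to the organization level."""
    by_org = defaultdict(list)
    for obs in observations:
        composite = sum(obs[item] for item in ITEMS) / len(ITEMS)
        by_org[obs["org"]].append(composite)
    return {org: sum(scores) / len(scores) for org, scores in by_org.items()}

# Two toy observations for one organization, scored on the 1-3-5 rubric
demo = [
    {"org": "A", **{item: 3 for item in ITEMS}},
    {"org": "A", **{item: 5 for item in ITEMS}},
]
print(org_quality_scores(demo))  # → {'A': 4.0}
```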
Conclusions from the Empirical Definition of POS Quality
From the empirical findings presented above, we draw two conclusions. First, the key
quality issue for OST organizations is to raise POS quality, primarily at the top levels of the
pyramid. Second, focusing on the performance of individual staff -- setting behaviors at the
offering level -- is the key to improving the quality of developmental experiences available to
youth at an OST organization. A systematic approach to raising the quality of performances by
individual staff requires attention to the next level in the multi-level system, the PLC.
PLC Quality Defined
Although the term “professional learning community” comes from research on schools,
similar conclusions about the link between professional community and the quality of client
services have been reached in the social work administration literature (Jaskyte, 2003; Kayser et
al., 2000; Latting, 2004). Our specific use of the school literature is to understand the kinds of
setting features that education organizations have used in order to focus their PLCs on the quality
of instruction.
Our thinking about the PLC construct and its relation to POS quality has been influenced
by the school-based research program conducted by Talbert, McLaughlin and others (for a
review see Talbert & McLaughlin, 1999). We draw upon three points from their discussion.
The first is that school effects happen at the classroom level in the interactions between teachers
and students (the school equivalent of POS). The second is that research on school effects should
use models that both sample from multiple classrooms and embed these composite classroom-
level experiences within higher levels of school operation. The third is that these embedded
classroom-level contexts are not necessarily permeated by decision-making and resource
allocation at higher managerial and administrative levels. Teachers have to intentionally open
their classrooms, and administrators have to seek out pathways that are likely to engage front-line
staff in the process of raising POS quality. These three points are critical in a political
context where system-level forces require greater accountability for outcomes and return on
investment, while simultaneously, decentralized policy structures empower site-level staff to
simply resist and avoid top-down accountability pressure. The concept of PLC quality is critical
to creating conditions where front-line staff are open to administrative emphases on
accountability and quality improvement.
PLCs in the School Literature
School-based research on PLCs is of particular interest because one of the central issues
in this literature is the disconnect between accountability data and instructional behaviors.
Effective PLCs represent a “tight coupling” between (1) program standards, (2) managerial
priorities, (3) performance data, and (4) professional development. This pattern is contrasted
with traditional school structures where decisions about these elements by teachers and
administrators lack continuity and shared focus (Halverson et al., 2005; Tyack & Cuban,
1995; Desimone, 2002). In the 1990s, school-wide reform research suggested that educational
change was more apt to succeed and be sustained in schools where teachers worked together to
plan reform and develop the capacity to use data to make decisions (DuFour & Eaker, 1998;
Schmoker, 1996; Spillane & Thompson, 1997). As the case study literature about successful
school improvement efforts has grown, emphasis has shifted to include the critical role played by
school leaders who empower staff to make collective decisions within parameters set by data and
in the context of intentional organizational learning structures like PLCs (Mason, 2003;
Halverson et al., 2005; Kannapel et al., 2005; Leithwood et al., 2004). In this formulation, there
is an emerging role of the school principal as a human resources developer focused on the
growth of staff skills related to instruction (POS quality).
However, being rooted in data is a critical dimension of the new human-resources role for
managers, and the PLC concept addresses two barriers that make learning from data more
challenging than simply reading numbers. First, in order for school personnel to make meaning
from data, it has to be contextualized within local networks of data producers and users -- data
must be converted to information by adding local context and then converted to knowledge
through plans for action (Fullan, 2001). Without a structured and collaborative process that helps
local users of data move through sequential stages of data reduction, interpretation, and planning
and action, the process of meaning-making and improvement action is stopped short (Halverson
et al., 2005; Schmoker, 1996). Second, it is clear from the literature on professional learning
communities, systemic reform theory, and knowledge management theory that improving
instruction has the greatest impact on student learning, yet implications for teaching practice are
the meaning least readily drawn from standardized outcome data (Mason, 2003; Darling-
Hammond, 1999). In response to teachers' struggles to determine the implications of student
outcomes data for teaching practice, it has become clear to some that accountability data focused
on quality rather than outcomes can be a key support to the meaning-making and
improvement process (Pianta, 2003; Hilberg et al., 2004).
The PLC literature helps to define the elements of PLC quality (setting features and staff
behaviors) that produce high POS quality. First, managers should prioritize POS quality values
in their everyday work. This will allow them both to provide clear leadership about what is
important in the organization and to have more focused criteria for hiring and educating
new staff. The greatest short-term quality gains can probably be achieved through the hiring
process. Second, managers should implement setting features (staff meetings, planning time, self-
assessment, outside observation) that empower front-line staff to think about quality and how to
produce it12 and that regularly put diagnostic data regarding setting quality in front of staff.
Third, because the interpretability of data is so important for the development of shared meaning
12 Maslow came to the same conclusion in his later work on management (1998): the pyramid in Figure 2
applies not only to youth but also to staff, who likewise thrive when supported to reach actualization outcomes.
and action, OST organizations should seek to produce data that directly informs discussion about
POS quality.
The Empirical Context of PLC Quality
There is little quantitative evidence that we are aware of regarding the use of setting
features related to quality accountability in the OST sector, other than to say that these are often
missing (see discussion in NRC, 2002 p. 254). In our research, organizational interviews with
youth program directors revealed the following for 71 organizations: over one third reported that
staff meetings were sporadic and did not include all staff; two-thirds of the organizations
reported having no formal plan for the assessment of program quality; two-thirds did not use data
as a part of the program planning process.
Results from a recent (unpublished) survey of 245 front-line staff, site coordinators and
program directors from 21st Century Community Learning Centers are presented in Table 4. It
appears that the majority of staff in this sample never come into contact with evaluative data
regarding the purposes or products of their work, or the interests of their clients. The staff who do
have access to this kind of information tend to be at the managerial level, suggesting that
program managers are not pushing this kind of information downward to site-level and front-line
staff. A similar pattern holds for program planning based on data from any of the several
sources listed, except that overall levels of participation in data-driven planning are even
lower. This is partly because useful data is not easily available in these systems, but it
also reflects a lack of priority on the part of leadership, and perhaps a lack of interest on the part
of site and front-line staff as well.
If leadership does not prioritize use of data in program planning (even surveys of youth
interests), neither is there a strong indication of why site coordinators and front-line staff should
be asking for it. In Table 4, less than half of the front-line staff reported substantial input into
the structure and content of the programming that it is their job to deliver. Staff who
are not engaged in decision-making about their own work are probably less likely
to be interested in seeing data about its effectiveness.
Table 4 PLC Quality in School-Based After-school Programs

                                                   Front-line    Front-line    Site          Program
                                                   staff, not    staff and a   coordinator   director
                                                   school-day    school-day    (N=70)        (N=16)
                                                   teachers      teacher
                                                   (N=50)        (N=109)

Have you ever read, discussed or seen presentations about:
  Yes, for surveys of youth interests                 34%           25%           56%           69%
  Yes, for observational assessment                   18%            5%           21%           44%
  Yes, for local evaluations (youth outcomes,
    quality, parent involvement)                      22%           22%           46%           56%
  Yes, for system-level evaluation                    18%           11%           41%           62%
Have you ever done program planning based on any of these data sources?
  Yes                                                 11%            6%           35%           37%
Are you familiar with your state's program standards?
  Yes                                                  2%            3%           23%           50%
How frequently do you observe other staff in your programs?
  Sometimes or more                                   78%           65%           87%           80%
How much input did you have on the structure and content of your program?
  Substantial                                         45%           42%           84%           75%
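Each row of Table 4 is a simple yes-rate broken out by respondent role. That tabulation might be computed as in the sketch below; the field names are hypothetical, not the actual survey items.

```python
from collections import defaultdict

def pct_yes_by_role(responses, question):
    """Percent of respondents answering 'yes' to a question, by role."""
    counts = defaultdict(lambda: [0, 0])  # role -> [yes count, total]
    for resp in responses:
        counts[resp["role"]][1] += 1
        if resp.get(question) == "yes":
            counts[resp["role"]][0] += 1
    return {role: round(100 * yes / total)
            for role, (yes, total) in counts.items()}

# Toy responses; "planned_from_data" is an illustrative field name
demo = [
    {"role": "site coordinator", "planned_from_data": "yes"},
    {"role": "site coordinator", "planned_from_data": "no"},
    {"role": "front-line staff", "planned_from_data": "no"},
]
print(pct_yes_by_role(demo, "planned_from_data"))
# → {'site coordinator': 50, 'front-line staff': 0}
```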
Perhaps the most interesting piece of information in Table 4 has to do with familiarity
with program quality standards. In the state(s) where this data was collected, program standards
explicitly for use in the 21st Century program have been developed and disseminated to program
managers by agencies of state government. The fact that only 2-3% of front-line staff have ever
seen the standards is a clear sign that, again, engaging front-line staff in an ongoing discussion
about quality is not a priority of program management. It may also be a sign that standards
documents are not typically produced in a form and format that are easy to use.
The process of aligning the Youth PQA with the existing quality standards in several
system-level initiatives provides another perspective on the current state of the field regarding
PLC and POS quality. We have conducted Youth PQA-to-standards alignments with several
national-, state-, county-, and city-level sets of OST program standards.13 In each case,
alignments conducted by High/Scope and others14 concluded that existing standards did not
provide depth in the critical areas of adult-youth and youth-youth interaction that are represented
at the top of the POS quality pyramid. Additionally, few of the quality standards covered
dimensions of PLC quality having to do with regularity of staff meetings, use of observational
assessment, and managerial responsibility for human resources development, data driven
planning, or staff roles in program development.
Conclusion
As the OST field works to conduct improvement interventions and demonstrate
accountability for quality, it is critical to build a research base around (1) what it is most
important to be accountable for and (2) how to raise quality in areas that need improvement. Our
13 National After-school Association Accreditation Standards (http://www.naaweb.org/); Michigan Model Out-of-
School Time Standards (http://www.michigan.gov/documents/OST_Standards_43292_7.pdf); Palm Beach County
Standards for After-school Quality (http://primetimepbc.org/); After-school Standards developed by the Youth,
Sports and Recreation Commission for the City of Detroit (http://www.ysrc.org/); the Expanded Learning
Opportunities Standards for Quality After-school Programs in the City of Grand Rapids, Michigan
(http://www.ci.grand-rapids.mi.us/index.pl?page_id=3167); and The After-school Corporation of New York quality
assessment standards (http://www.tascorp.org/).
14 Report to the Youth, Sports and Recreation Commission of the City of Detroit conducted by the Center for Urban
Studies at Wayne State University (2005).
research using the Youth PQA suggests that while OST programs tend to be physically and
psychologically safe places, they do not consistently deliver key developmental experiences
related to group interaction, task engagement, and youth voice.
The POS and PLC constructs provide insight into OST setting structures and roles by
suggesting both how to assess quality and where to focus improvement efforts. The POS
construct suggests that quality happens in program offerings with individual staff and that
currently in the OST field, there is substantial variance in quality across offerings within the
same organizations. This inconsistency of quality within organizations suggests that the
managerial role in OST organizations is not focused on the production of POS quality. The PLC
literature describes a number of setting features that are being used during the school day to
focus program staff and program resources on the improvement of classroom instruction, and
these are likely to be useful tools for the OST field as well.
Bibliography
Akiva, T. (2005, Fall/Winter). Turning training into results: The new Youth Program Quality
Assessment. High/Scope ReSource, 24(2), 21-24.
Akiva, T. (in press). Getting to engagement: Building an effective after-school program.
High/Scope ReSource.
Csikszentmihalyi, M. (1990). Flow: The psychology of optimal experience. New York: Harper
& Row.
Darling-Hammond, L. (1999, Fall). Instructional policy into practice: "The power of the
bottom over the top." Educational Evaluation and Policy Analysis, 12(3).
Desimone, L.M., Porter, A.C., Garet, M.S., Yoon, K.S., & Birman, B.F. (2002). Effects of
professional development on teachers’ instruction: Results from a three-year longitudinal
study. Educational Evaluation and Policy Analysis, 24(2), pp. 81-112.
DuFour, R & Eaker, R. (1998). Professional learning communities at work: Best practices for
enhancing student achievement. Bloomington, IN: National Education Service, and
Alexandria, VA: Association for Supervision and Curriculum Development.
Dweck, C. (2000). Self-theories: Their role in motivation, personality, and development.
Philadelphia: Psychology Press.
Gambone, M.A., Klem, A.M., & Connell, J.P. (2002). Finding out what matters for youth: Testing
key links in a community action framework for youth development. Philadelphia: Youth
Development Strategies, Inc., and Institute for Research and Reform in Education. (pp.
3-4).
Halverson, R., Grigg, J., Prichett, R., Thomas, C. (2005). The new instructional leadership:
creating data-driven instructional systems in schools. Wisconsin Center for Education
Research. University of Wisconsin-Madison.
Halpern, R. (2005). Confronting the big lie: The need to reframe expectations of afterschool
programs. Report from the Partnership for After School Education.
Hilberg, R.S., Waxman. H.C., & Tharp, R. G. (2003). Introduction: Purposes and perspectives on
classroom observation research. In H. C. Waxman, R.G. Tharp, & R.S. Hilberg (Eds),
Observational research in U.S. classrooms. Cambridge: Cambridge University Press.
Honey, M., Fasca, C., Gersick, A, Mandinach, E., & Suparnah, S. (2005). Assessment of 21st
century skills: The current landscape (pre-publication draft). Downloaded from
www.21stcenturyskills.org.
Jaskyte, K. (2003). Assessing changes in employee’s perceptions of leadership, job design, and
organizational arrangements in their job satisfaction and commitment. Administration in
Social Work, 27(4), 25-39.
Kayser, K., Walker, D., & Demaio, J. (2000). Understanding social workers' sense of competence
within the context of organizational change. Administration in Social Work, 24(4), 1-20.
Larson, R. (2000). Toward a psychology of positive youth development. American
Psychologist, 55(1), 170-183.
Latting, J.K., Beck, M.H., Slack, K.J., Tetrick, L.E., Jones, A.P., Etchegaray, J.M., & Da Silva, N.
(2004). Promoting service quality and client adherence to the service plan: The role of
top management's support for innovation and learning. Administration in Social Work,
28(2), 29-48.
Leithwood, K., Louis, K.S., Anderson, S., & Wahlstrom, K. (2004). Review of research: How
leadership influences student learning. Downloaded from the Wallace Foundation
website.
Lerner, R.M., Brentano, C., Dowling, E.M., & Anderson, P.M. (2002). Positive youth
development: Thriving as the basis of personhood and civil society. New Directions for
Youth Development: Pathways to Positive Development Among Diverse Youth.
Maslow, A.H. (1943). A theory of human motivation. Psychological Review, 50, 370-396.
Barker, R.G. (1968). Ecological psychology. Stanford, CA: Stanford University Press.
Maslow, A. H. (1998). Maslow on management. New York: John Wiley & Sons, Inc.
Mason, S.A. (2003). Learning from data: The roles of professional learning communities.
University of Wisconsin-Madison. Paper presented at the annual meeting of the American
Educational Research Association, April 2003.
Murnane, R.J., & Levy, F. (1996). Teaching the new basic skills: Principles for educating
children to thrive in a changing economy. New York: Free Press.
National Research Council and Institute of Medicine. (2002). Community programs to promote
youth development. Committee on Community-Level Programs for Youth. Jacquelynne
Eccles and Jennifer A. Gootman, eds. Board on Children, Youth, and Families, Division
of Behavioral and Social Sciences and Education. Washington, DC: National Academy
Press.
Pianta, R. (2003). Standardized classroom observations from pre-k to third grade: A mechanism
for improving quality classroom experiences during the p-3 years. Unpublished paper.
Reisner, E.R. (2005). Using evaluation methods to promote continuous improvement and
accountability in afterschool programs: A guide. Policy Studies Associates.
Schmoker, M. (1996). Results: The key to continuous school improvement. Alexandria, VA:
Association for Supervision and Curriculum Development.
Schoggen, P. (1989). Behavior settings: a revision and extension of Roger G. Barker’s
ecological psychology. Stanford University Press.
Smith, C. (2004). Youth Program Quality Assessment Validation Study: Wave-1
findings for reliability and validity analyses. Report to the W.T. Grant Foundation.
Smith, C. (2005a). The Youth Program Quality Assessment validation study: Findings for
instrument validation. Report available from the High/Scope Foundation at
http://youth.highscope.org.
Smith, C. (2005b). Measuring quality in Michigan’s 21st Century afterschool programs: The
Youth PQA self-assessment pilot study. Report available from the High/Scope
Foundation at http://youth.highscope.org.
Smith, C. (2005c). What matters for after-school quality? Presentation to the W.T. Grant
Foundation After-school Grantees Meeting, December 19, 2005. Available:
http://youth.highscope.org.
Spillane, J.P. & Thompson, C.L. (1997). Reconstructing conceptions of local capacity: The
local education agency’s capacity for ambitious instructional reform. Educational
Evaluation and Policy Analysis, 19(2), 185-203.
Talbert, J.E., & McLaughlin, M.W. (1999). Assessing the school environment: Embedded
contexts and bottom-up research strategies. In Sarah Friedman and Theodore Wachs
(Eds). Measuring environment across the lifespan: Emerging methods and concepts.
Washington D.C.: American Psychological Association.
Vygotsky, L. S. (1962). Thought and language. Cambridge, MA: Harvard University Press.
Wicker, A. W., (1992). Making sense of environments. In W. B. Walsh, K. H. Clark, & R. H.
Price (Eds.), Person-environment psychology: Models and perspectives. Hillsdale, NJ:
Lawrence Erlbaum Associates. (pp. 158-191).
Appendix – Youth PQA Subscales, Items and Indicators
I. SAFE ENVIRONMENT
A. Psychological and emotional safety is promoted.
(I-A1) The emotional climate of the session is predominantly positive (e.g., mutually respectful, relaxed,
supportive; characterized by teamwork, camaraderie, inclusiveness, and an absence of negative behaviors).
Any playful negative behaviors (not considered offensive by parties involved) are mediated (countered,
curtailed, defused) by staff or youth.
(I-A2) There is no evidence of bias but rather there is mutual respect for and inclusion of others of a
different religion, ethnicity, class, gender, ability, appearances, or sexual orientation.
B. The physical environment is safe and free of health hazards.
(I-B1) The program space is free of health and safety hazards.
(I-B2) The program space is clean and sanitary.
(I-B3) Ventilation and lighting are adequate in the program space.
(I-B4) The temperature is comfortable for all activities in the program space.
C. Appropriate emergency procedures and supplies are present.
(I-C1) Written emergency procedures are posted in plain view.
(I-C2) At least one charged fire extinguisher is accessible and visible from the program space.
(I-C3) At least one complete first-aid kit is accessible and visible from the program space.
(I-C4) Other appropriate safety and emergency equipment (e.g., for water or vehicle safety, sports, or
repairs) is available to the program offering as needed, can be located by staff, and is maintained in full-
service condition.
(I-C5) All entrances to the indoor program space are supervised for security during program hours (can
include electronic security system).
(I-C6) Access to outdoor program space is supervised during program hours.
D. Program space and furniture accommodate the activities offered.
(I-D1) Program space allows youth and adults to move freely while carrying out activities (e.g., room
accommodates all participants without youth blocking doorways, bumping into one another, and
crowding).
(I-D2) Program space is suitable for all activities offered (e.g., furniture and room support small and large
groups; if athletic activity is offered, then program space supports this).
(I-D3) Furniture is comfortable and of sufficient quantity for all youth participating in the program
offering.
(I-D4) Physical environment can be modified to meet the needs of the program offering (e.g., furniture
and/or supplies can be moved).
E. Healthy food and drinks are provided.
(I-E1) Drinking water is available and easily accessible to all youth.
(I-E2) Food and drinks are plentiful and available at appropriate times for all youth during the session.
(I-E3) Available food and drink is healthy (e.g., there are vegetables, fresh fruit, real juice, or homemade
dishes).
II. SUPPORTIVE ENVIRONMENT
F. Staff provide a welcoming atmosphere.
(II-F1) All youth are greeted by staff within the first 15 minutes of the session.
(II-F2) During activities, staff mainly use a warm tone of voice and respectful language.
(II-F3) During activities, staff generally smile, use friendly gestures, and make eye contact.
G. Session flow is planned, presented, and paced for youth.
(II-G1) Staff start and end session within 10 minutes of scheduled time.
(II-G2) Staff have all materials and supplies ready to begin all activities (e.g. materials are gathered, set
up).
(II-G3) There are enough materials and supplies prepared for all youth to begin activities.
(II-G4) Staff explain all activities clearly (e.g. youth appear to understand directions; sequence of events
and purpose are clear).
(II-G5) There is an appropriate amount of time for all of the activities (e.g., youth do not appear
rushed, frustrated, bored, or distracted; most youth finish activities).
H. Activities support active engagement.
(II-H1) The bulk of the activities involve youth in engaging with (creating, combining, reforming)
materials or ideas or improving a skill through guided practice.
(II-H2) The program activities lead (or will lead in future sessions) to tangible products or performances
that reflect ideas or designs of youth.
(II-H3) The activities provide all youth one or more opportunities to talk about (or otherwise communicate)
what they are doing and what they are thinking about to others.
(II-H4) The activities balance concrete experiences involving materials, people, and projects (e.g., field
trips, experiments, interviews, service trips, creative writing) with abstract concepts (e.g., lectures,
diagrams, formulas).
I. Staff support youth in building new skills.
(II-I1) All youth are encouraged to try out new skills or attempt higher levels of performance.
(II-I2) All youth who try out new skills receive support from staff despite imperfect results, errors, or
failure; staff allow youth to learn from and correct their own mistakes and encourage youth to keep trying
to improve their skills.
J. Staff support youth with encouragement.
(II-J1) During activities, staff are almost always actively involved with youth (e.g., they provide directions,
answer questions, work as partners or team members, check in with individuals or small groups).
(II-J2) Staff support at least some contributions or accomplishments of youth by acknowledging what
they’ve said or done with specific, nonevaluative language (e.g., “Yes, the cleanup project you suggested is
a way to give back to the community.” “I can tell from the audience response that you put a lot of thought
into the flow of your video.”).
(II-J3) Staff make frequent use of open-ended questions (e.g., staff ask open-ended questions throughout
the activity and questions are related to the context).
K. Staff use youth-centered approaches to reframe conflict.
(II-K1) Staff predominantly approach conflicts and negative behavior in a nonthreatening manner (i.e.,
approach calmly, stop any hurtful actions, and acknowledge youth’s feelings).
(II-K2) Staff seek input from youth in order to determine both the cause and solution of conflicts and
negative behavior (e.g., youth generate possible solutions and choose one).
(II-K3) To help youth understand and resolve conflicts and negative behavior, staff encourage youth to
examine the relationship between their actions and consequences.
(II-K4) Staff acknowledge conflicts and negative behavior and follow up with those involved afterward.
III. INTERACTION
L. Youth have opportunities to develop a sense of belonging.
(III-L1) Youth have structured opportunities to get to know each other (e.g., there are team-building
activities, introductions, personal updates, welcomes of new group members, icebreakers, and a variety of
groupings for activities).
(III-L2) Youth exhibit predominantly inclusive relationships with all in the program offering, including
newcomers.
(III-L3) Youth strongly identify with the program offering (e.g., hold one another to established guidelines,
use ownership language, such as “our program,” engage in shared traditions such as shared jokes, songs,
gestures).
(III-L4) The activities include structured opportunities (e.g., group presentations, sharing times, recognition
celebrations, exhibitions, performances) to publicly acknowledge the achievements, work, or contributions
of at least some youth.
M. Youth have opportunities to participate in small groups.
(III-M1) Session consists of activities carried out in at least 3 groupings—full, small, or individual.
(III-M2) Staff use 2 or more ways to form small groups (e.g., lining up by category and counting off,
grouping by similarities, signing up).
(III-M3) Each small group has a purpose (i.e., goals or tasks to accomplish), and all group members
cooperate in accomplishing it.
N. Youth have opportunities to act as group facilitators and mentors.
(III-N1) All youth have multiple opportunities to practice group-process skills (e.g., actively listen,
contribute ideas or actions to the group, do a task with others, take responsibility for a part).
(III-N2) During activities, all youth have one or more opportunities to mentor an individual (e.g., teach or
coach another).
(III-N3) During activities, all youth have one or more opportunities to lead a group (e.g., teach others; lead
a discussion, song, project, event, outing, or other activity).
O. Youth have opportunities to partner with adults.
(III-O1) Staff share control of most activities with youth, providing guidance and facilitation while
retaining overall responsibility (e.g., staff use youth leaders, semiautonomous small groups, or individually
guided activities).
(III-O2) Staff always provide an explanation for expectations, guidelines, or directions given to youth.
IV. ENGAGEMENT
P. Youth have opportunities to set goals and make plans.
(IV-P1) Youth have multiple opportunities to make plans for projects and activities (individual or group).
(IV-P2) In the course of planning the projects or activities, 2 or more planning strategies are used (e.g.,
brainstorming, idea webbing, backwards planning).
Q. Youth have opportunities to make choices based on their interests.
(IV-Q1) All youth have the opportunity to make at least one open-ended content choice within the content
framework of the activities (e.g., youth decide topics within a given subject area, subtopics, or aspects of a
given topic).
(IV-Q2) All youth have the opportunity to make at least one open-ended process choice (e.g., youth decide
roles, order of activities, tools or materials, or how to present results).
R. Youth have opportunities to reflect.
(IV-R1) All youth are engaged in an intentional process of reflecting on what they are doing or have done
(e.g., writing in journals; reviewing minutes; sharing progress, accomplishments, or feelings about the
experience).
(IV-R2) All youth are given the opportunity to reflect on their activities in 2 or more ways (e.g., writing,
role playing, using media or technology, drawing).
(IV-R3) In the course of the program offering, all youth have structured opportunities to make
presentations to the whole group.
(IV-R4) Staff initiate structured opportunities for youth to give feedback on the activities (e.g., staff ask
feedback questions, provide session evaluations).
V. YOUTH-CENTERED POLICIES AND PRACTICES
C. Youth have an influence on the setting and activities in the organization.
(V-C1) Youth and adults share the responsibility for decisions about the design and use of the physical
environment (e.g., they make plans for furniture arrangement, determine design additions and displays
relevant to youth activities).
(V-C2) Youth and adults share the responsibility in determining program schedules and program offerings.
(V-C3) Youth take charge of (with appropriate support from adults) and facilitate or lead (not just assist)
sessions or activities for peers or younger youth.
D. Youth have an influence on the structure and policy of the organization.
(V-D1) Youth participate in program quality review and plans for improvement.
(V-D2) Youth and staff share responsibilities for hiring, training, and evaluating staff (e.g., they help set
qualifications, are present for interviews, and are involved in making decisions about candidates for staff
positions).
(V-D3) Youth and staff share responsibilities for planning recruitment and actually recruiting other youth
to join the organization or program offerings.
(V-D4) Youth and staff share responsibilities for community outreach efforts (i.e., interaction with
families, schools, other youth-serving organizations, and the community).
(V-D5) Youth and staff share responsibilities on program governing bodies (e.g., boards, advisory panels,
standing committees, task forces).