Evaluation of ‘Enhancing Emotional Literacy
through Visual Arts’
Brad Astbury and Ruth Aston
Centre for Program Evaluation
Melbourne Graduate School of Education
University of Melbourne
December 2013
‘Programs do not “work”, rather it is the action of stakeholders that makes them
work, and the causal potential of an initiative takes the form of providing
reasons and resources to enable participants to change.’
Ray Pawson and Nick Tilley1 (1997, p. 215)
1 Pawson, R., & Tilley, N. (1997). Realistic evaluation. London: Sage.
Acknowledgments
A number of people provided valuable input into this evaluation. The
eleven members of the Dax working party acted as an advisory group
for the study, contributing guidance throughout the entire research
process. Amy Parthenopoulos and Isabel Brookes from the Centre for
Program Evaluation assisted with teacher interviews.
Our thanks to the principal and teachers at each of the pilot schools,
who participated in the survey, made themselves available for
interviews, arranged access for school visits and consented to the
presence of the evaluation team at training sessions. This evaluation
would not have been possible without your contribution.
In addition, we wish to acknowledge the support of Dr Eugen Koh, past
Director of the Dax Centre, who commissioned this study and was
instrumental in supporting the development of ELVA. Margaret Nixon,
lead facilitator and key architect of ELVA, deserves special mention for
practising what she preaches: her openness to being ‘evaluated’,
commitment to the project and ability to model what it looks like to be
an emotionally literate teacher are to be commended.
Brad Astbury & Ruth Aston
Melbourne, December, 2013
Table of Contents
Introduction and Background
The Dax Centre
Origins of the Approach
Evaluating the Approach
Reading the Report
Understanding How the Approach Works
A Working Model of the Program
Explaining the Model
Using the Model to Guide the Evaluation
Implementation and Early Evidence of Effects
Overview of the Pilot Schools
Perceptions of the Training
Critical Features of the Training
Implementation Strategies and Challenges
Effects on Teachers’ Learning and Classroom Practice
Areas for Improvement
Training Transfer: Results from a Six Month Follow-Up
Method and Data Source
Key Findings from the Survey
Discussion and Limitations of the Analysis
Conclusions and Implications
Suggestions to Improve Design and Delivery
Strengthening the Research and Evidence-Base
Pathways to Sustainability: Some General Considerations
References
Executive Summary
In 2010 the Dax Centre received funding from a private philanthropic
trust to develop a classroom-based model that teachers could use to
enhance the emotional literacy of primary school children using visual
art as a medium. This initiative has become known as ‘Enhancing
Emotional Literacy through Visual Arts’ (ELVA).
What is ELVA?
ELVA is designed to address an important need: helping schools and
teachers to support the development of student emotional literacy.
There are many programs that focus on social and emotional well-being;
few, however, explicitly link the role of visual arts in facilitating emotional
awareness, connection and resilience. Another important distinguishing
feature of ELVA is that it adopts an experiential rather than competency
or skills-based approach.
The initiative was initially trialled in four schools, with positive results
reported. In mid-2011, eight new pilot schools were recruited. Seven
full-day training sessions were delivered over an 18-month period, and
classroom units of work were delivered to approximately 2000 students in
total. A second round of pilot schools commenced this year.
The Dax Centre commissioned the Centre for Program Evaluation at the
University of Melbourne to undertake an independent evaluation of
ELVA. This publication summarises key findings regarding design,
implementation, and initial outcomes.
Findings in brief
Teachers expressed high levels of overall satisfaction with ELVA
and professional learning opportunities, including the quality of
facilitators and follow-up support. Levels of attendance at
training sessions were high.
ELVA incorporates several critical features that are consistent
with what is known about best practice in teacher professional
learning. These include: (a) sufficient duration; (b) theory-
practice mix; (c) active, experiential learning; (d) collective
participation; and (e) ongoing support and materials.
A range of factors affecting implementation were identified,
including characteristics of the teacher-participants, facilitators,
features of the training, and school context. In most cases
schools were able to overcome obstacles, and found the approach
easy to implement in the classroom.
A 6-month follow-up survey found that teachers have changed
classroom practice as a result of participating in the training. This
is an important finding: research suggests that many teacher
professional learning initiatives rely on a ‘train and hope’ model.
ELVA supports training transfer.
Although provisional, there is also information to suggest that
positive impacts on students, parents and the broader school
community are beginning to occur across school sites. This is
significant for just one year of involvement.
Summing up
On the basis of these findings, the Centre for Program Evaluation
recommends that ELVA be continued beyond the pilot phase. To enhance
quality, consistency, and sustainability, a number of suggested areas for
improvement have been identified, and these relate chiefly to securing
alternative funding sources. Ongoing monitoring and evaluation is also
essential so that better evidence of longer-term impacts on students can
be obtained.
Introduction and Background
Over the past decade, educational authorities in the United Kingdom,
the United States and Australia have become increasingly active in
delivering emotional literacy programs in schools. Terminology in this
area varies considerably, and these initiatives are often referred to as
social and emotional learning (SEL), emotional intelligence training, or
emotional well-being [For a recent review and meta-analysis see Durlak
et al. (2011)]. The Victorian Department of Education and Early
Childhood Development (DEECD) has also emphasised the important
role that social and emotional competencies can play in improving
student engagement, behaviour and learning outcomes.
One major driver of current initiatives focusing on emotional well-being
in schools is recent knowledge from neuroscience2. For example, the
publication from the Institute of Medicine and National Research Council
titled, From Neurons to Neighborhoods: The Science of Early Childhood
Development is often cited by education authorities. A key contributor to
this report, Shonkoff (2006), has stated that:
...if we really want to build a strong platform for healthy
development and effective learning in the early childhood years,
then we must pay as much attention to children’s emotional
wellbeing and social capacities as we do to their cognitive abilities
and early literacy skills (p.2188).
This landmark study has had a significant influence on early childhood
and education policy and discourse. In part, this is because of the way in
which emerging evidence from the interface between neuroscience and
psychology has rapidly captured the imagination of policy-makers,
educators and the general public. Books on the brain, mind and
emotions continue to top best-seller lists.
However, translating knowledge into effective educational practice has
been challenging. Evaluators still know little about what kinds of
emotional literacy programs work best, for whom and under what
circumstances. Different traditions within psychology continue to
compete for claims to ‘hard’ evidence from neuroscience; often simply to
prop up favourite theories. Meanwhile, teachers struggle to find the time
and support necessary for introducing and sustaining curriculum
changes. Not surprisingly, implementation problems are widespread and
these are exacerbated further by confusion surrounding the actual
concept and measurement of emotional literacy.
2 One should also not overlook the influence of research on ‘resilience’ from the field of developmental
psychology (Howard, Dryden & Johnson, 1999) and growing interest in the positive psychology
movement (Seligman, 1991, 2011). Further influences include the work of Salovey and Mayer (1990)
who first coined the term emotional intelligence, and more controversially, Dan Goleman (1995), who
despite genuine concerns about his popularisation of neuroscientific and psychological evidence in his
book Emotional Intelligence: Why it Can Matter More than IQ certainly contributed significantly to raising
public awareness. McLaughlin (2008) sums things up well: “The emphasis on emotional well-being has
its roots in many different traditions and arguments. It is a network of research and thinking...driven by a
set of social and policy concerns” (p. 355).
The Dax Centre
The Dax Centre is named after its founder, Dr Eric Cunningham Dax
(1908-2008), a British psychiatrist who during the 1940s pioneered the
use of art to facilitate understanding, diagnosis and treatment of mental
disorders. In 1953 Dax published findings of his research in
Experimental Studies in Psychiatric Art, and began collecting artworks
produced by psychiatric patients. He arrived in Melbourne in 1952, and
up until the time of his death was a prominent leader in the psychiatric
field. Over the years, Dax amassed a large collection of art. During
retirement he spent much of his time cataloguing, preserving and
exhibiting the art to a wide audience.
Today, the Dax Centre continues this tradition, maintaining the
Cunningham Dax Collection, which consists of approximately 15,000
works created by people with an experience of mental illness and
trauma. The Centre’s primary mission is to ‘promote mental health and
wellbeing by fostering a greater understanding of the mind, mental
illness and trauma through art and creativity(Dax Centre, 2013). Five
core values underpin the work of the centre - respect, empathy,
equality, integrity, and creativity.
In recent years, the scope of Centre activities has expanded from
being a gallery with an education program, to being ‘a central hub for all
those interested in the relationship between the mind, mental illness,
psychological trauma, art and creativity’ (Dax Centre, 2013). In line with
these developments, the Centre is currently undertaking a pilot initiative
designed to improve student emotional literacy through the use of visual
arts (2010-2014).
Origins of the Approach
In 2010 the Dax Centre received funding from a private philanthropic
trust to develop a classroom-based model that teachers could use to
enhance the emotional literacy of primary school children using visual
art as a medium. This initiative has become known as the ‘Enhancing
Emotional Literacy through Visual Arts’ Program (ELVA).
The project involves participation in a series of day-long teacher
professional learning sessions, the development and trialling of
classroom activities or ‘units of work’, and support from Dax staff to
embed the approach in schools. ELVA has been informed by a multi-
disciplinary working party and seeks to integrate psychodynamic
theories with neuroscientific research, a developmental perspective on
early intervention and prevention, and aspects of art therapy theory and
practice. Six inter-related domains were considered as a framework
around which to develop the approach:
1. The activity
2. The student
3. The teacher
4. The student-teacher relationship
5. The classroom environment
6. The broader system of school, parents, community.
The domain framework informs the planning of units of work: lessons
that teachers can use in the classroom. These vary according to length
of time to implement and student grade level, although all units
incorporate four dimensions: the pre-activity teacher reflection, the
activity, the experience, and teacher response.
According to internal documents, the overall aim of ELVA is to improve
‘the capacity for children to be emotionally alive, emotionally aware, and
emotionally connected with themselves, others and with experiences
and situations’. Emphasis is placed on the significance of ‘creating time,
space and place for children to reflect on themselves [and] their
interactions with their social setting within a safe and supportive
environment’. A key assumption is that:
A child who is capable of understanding their emotional
experiences and their feelings associated with it, is better
equipped to develop capacities to engage with and manage
emotional difficulties they may encounter and are more resilient
in the face of emotional challenges.
In early 2011 ELVA was trialled in four schools, with positive results
reported in an internal review document. This included feedback from
teachers that students were noticeably more engaged in their art work,
there was a ‘quieter atmosphere’ in the classroom, and that more
meaningful and individual work was being produced. Participants also
reported that since implementing the approach they had become more
reflective about their teaching practice and developed a stronger
understanding of students.
In mid-2011, eight new ‘pilot’ schools were recruited following a call for
expressions of interest. In total, seven full-day training sessions were
delivered over an 18-month period; four in 2011 and three in 2012. A
second round of pilot schools commenced in mid-2013. A broad
timeline listing key stages of the project is presented below.
Figure 1: Project timeline and stages
2010 - Stage 1: Development
2011 - Stage 2: Trial Phase
2012 - Stage 3: Pilot Phase
2013 - Stage 4: Refinement and Implementation
2014 - Stage 5: Sustaining Implementation
Evaluating the Approach
The Centre for Program Evaluation (CPE) was commissioned in 2012 by
the Dax Centre to undertake an independent evaluation of ELVA. The
terms of reference included, broadly, an assessment of the design,
implementation and initial outcomes of the approach, and identification
of ways in which this pilot initiative could be improved and sustained. As
one of the Dax Centre’s early ventures into teacher professional
learning, the evaluation may also generate lessons for developing future
school-based initiatives.
Aims and key evaluation questions
Considering the stage of program development, it was determined that
the evaluation should be primarily formative, rather than summative, in
nature. Formative evaluation asks the question ‘How are we doing?’ It
focuses on generating information about program establishment and
implementation. This information is then used by staff to inform
decisions about program development and improvement.
The rationale here is that if ELVA is not well designed and delivered then
it will fail to produce desired outcomes. Undertaking a summative study
too soon could result in the premature termination of a potentially
effective intervention. After the initiative has had time to settle and
mature, it is anticipated that a summative evaluation will be undertaken
to generate more robust estimates of impact.
Within this context, there were three linked aims of the evaluation.
These were to:
1. Clarify ELVA design by articulating the relationship between
program processes and intended short, medium and longer-term
outcomes;
2. Document and examine a range of stakeholder perceptions and
experiences regarding the implementation and effectiveness of
the ELVA approach; and
3. Work with Dax Centre staff and other key stakeholders to
develop their capacity to use evaluative information for informing
future refinement and development of ELVA.
To achieve these aims, a theory-based approach was used to identify
and prioritise key evaluation questions and structure data collection
activities. The following questions were formulated and agreed upon by
stakeholders (see Box 1).
Box 1: Key evaluation questions
1. Are teacher professional learning activities well-designed and
delivered?
2. Are teachers satisfied with the professional learning activities and
on-going support provided by the Dax Centre?
3. What effect does professional learning in emotional literacy, using
the Dax Approach, have on teacher knowledge, attitudes, capacity,
and confidence?
4. What changes occur in the classroom learning environment when
teachers participate in the program?
5. How, and to what extent, does the approach contribute to enhanced
student emotional literacy?
6. Are there any unintended outcomes associated with the approach
(positive and/or negative)?
7. Are there particular conditions under which the approach works
better than others (sustainability and potential transferability)?
The first four questions were prioritised in light of the stage of program
development, stakeholder information needs and available time,
resources and other feasibility constraints. A mixed-methods data
collection approach was utilised to address evaluation questions, as
described below:
Ongoing discussions with members of the Dax working party to
develop a preliminary logic model for ELVA and reach a common
understanding of the purposes of the evaluation;
Review of the existing evidence-base for the approach, including
program documents as well as relevant literature on teacher
professional development and student emotional literacy;
Detailed observation of how teacher professional learning was
delivered by attending all sessions throughout 2012;
Semi-structured interviews with teachers and school principals.
These interviews sought detailed information about:
o levels of satisfaction with the professional learning
sessions and on-going support provided by the Dax
Centre;
o the difference this has made in terms of knowledge,
capacity, confidence, attitudes and beliefs;
o the effect that the ELVA approach has had on classroom
practice and any associated impacts on student emotional
literacy;
o what contributed to the success (or lack of success) of the
approach;
o what could be done differently in order to maximise
results.
Thematic analysis of existing data and documents collected by
the Dax Centre and participating schools (e.g. training feedback
sheets, school policy documents, photos, student artwork, etc).
Descriptive analysis of data from a post-training survey of
teachers to assess durability of knowledge gains and extent of
training transfer.
The University of Melbourne Graduate School of Education Human
Research Ethics Committee and the Victorian Department of Education
and Early Childhood Development (Education Policy and Research
Division) approved the research components of the evaluation.
A summary list of methods and data sources and their relationship to
key evaluation questions is presented in Table 1 below.
Reading the Report
Section 2 provides an overview of a provisional model that specifies how
ELVA works to generate desired outcomes. This model guided the
development of evaluation questions and data collection activities. The
next two sections constitute the main findings of the study, and draw on
multiple data sources to assess implementation quality and provide provisional
findings of effects on teachers, students and the broader school
community. The final section summarises the main messages of the
study and identifies several implications for future research and
development efforts.
Table 1: Evaluation schedule
1. Are teacher professional learning activities well-designed and delivered?
   Data sources: program documents; observation; teachers; existing literature
   Methods: semi-structured interviews; observation of training; research synthesis
   Timing: end of Term 4; 28 Feb, 9 May, 23 July, 25 Oct; ongoing

2. Are teachers satisfied with the learning activities and on-going support provided by the Dax Centre?
   Data sources: teachers
   Methods: semi-structured interviews
   Timing: Term 4

3. What effect does professional learning in emotional literacy, using the Dax approach, have on teacher knowledge, attitudes, skills and confidence?
   Data sources: teachers; principal; students
   Methods: semi-structured interviews; student focus groups
   Timing: Term 4

4. What changes occur in the classroom learning environment when teachers participate in the program?
   Data sources: teachers; students
   Methods: semi-structured interviews; student focus groups; site visits; post-training survey
   Timing: end of Term 4

5. How, and to what extent, does the approach contribute to enhanced student emotional literacy?
   Data sources: students; teachers; principal
   Methods: student focus groups; semi-structured interviews; program theory analysis
   Timing: end of Term 4; ongoing

6. Are there any unintended outcomes associated with the approach (positive and/or negative)?
   Data sources: teachers; research literature
   Methods: interviews; program theory analysis
   Timing: ongoing

7. Are there particular conditions under which the approach works better than others (sustainability and potential transferability)?
   Data sources: all of the above
   Methods: all methods
   Timing: ongoing
Understanding How the Approach Works
All programs3 are based on some sort of ‘if/then’ hypothesis which
asserts that ‘If we deliver a program in this way, then it will generate
certain kinds of desired changes’. When evaluating a program it is often
useful to start by asking a fundamental question: ‘Is the change process
presumed in program conceptualisation and design plausible and
logically sound?’ Clarifying how and why a program is intended to work
can assist program planners and evaluators to identify any critical flaws
in the design of a new initiative, and support program improvement
efforts.
One technique or tool for understanding the design and functioning of a
program is logic modelling. A logic model is a ‘plausible and sensible
model of how a program is supposed to work’ (Bickman, 1987, p. 5).
Program logic models can be expressed in different ways: a graphic
display of boxes and arrows, a table, a narrative description and so on.
The level of detail and complexity can also vary significantly. Regardless
of the way in which it is depicted, a logic model should clearly identify
the underlying premises about the way in which the elements of a
program fit together in a simple causal sequence.
The basic elements of a simple linear logic model are shown in Figure 2
below, although more elaborated logic models are sometimes
preferable.
Figure 2: Basic elements of a logic model
Source: W. K. Kellogg Foundation (2004)
3 Evaluators use the term program in a broad sense to refer to any intentional and organised effort to
bring about some kind of positive change in individuals, groups, organisations or communities. In
educational settings the term ‘program’ is now increasingly replaced with ‘approach’, ‘initiative’ or
‘philosophy’. One reason for this is to avoid narrow conceptions of the word program among some
quarters of the education community (i.e. program = a top-down, highly scripted set of classroom
instructional materials; and/or off-the-shelf, prescribed practices that are imposed on local actors). As
this section is written primarily from an evaluation perspective we retain the word program. Elsewhere
we use the preferred nomenclature of the Dax Centre, referring to ELVA as an approach.
The key elements of the model can be summarised briefly as follows:
Inputs: human, financial and physical capital as well as other
resources such as partnerships, infrastructure, materials, policy,
and research knowledge that are invested in a program.
Activities: all the actions, procedures and processes that are
necessary to produce program outputs.
Outputs: the immediate results of activities, often stated as
specific and measurable process indicators, such as participant
numbers, materials produced and disseminated, number of programs
delivered, and so on.
Outcomes: the changes you expect will occur as a result of
activities and outputs. These are often broken down further into
a structured sequence or outcomes hierarchy (e.g. short-,
medium- and long-term outcomes).
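To make these elements concrete, the structure above can be expressed as a simple data object. The following sketch (written in Python purely for illustration; the field names and example entries are ours, not drawn from ELVA program documentation) shows how the four basic elements relate:

# Hypothetical sketch of the basic logic-model elements described above.
# Field names and example entries are illustrative only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class LogicModel:
    inputs: List[str] = field(default_factory=list)      # resources invested in the program
    activities: List[str] = field(default_factory=list)  # actions needed to produce outputs
    outputs: List[str] = field(default_factory=list)     # immediate, measurable results of activities
    outcomes: List[str] = field(default_factory=list)    # expected short-, medium- and long-term changes

example = LogicModel(
    inputs=["facilitators", "philanthropic funding", "training materials"],
    activities=["deliver professional learning sessions", "provide on-going support to schools"],
    outputs=["number of sessions delivered", "number of teachers attending"],
    outcomes=["increased teacher knowledge and confidence",
              "changed classroom practice",
              "enhanced student emotional literacy"],
)
print(example.outcomes)

Reading the object from inputs through to outcomes mirrors the left-to-right causal sequence that a logic model is intended to convey.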
Programs are not isolated interventions. They are always introduced into
a larger organisational and social system. Consequently, logic models
should also identify important features of the broader context in which
the program operates. Contextual issues might include variations in
participant demographics and motivation, social, political and economic
considerations, time and so on. These factors are often outside the
direct control of the program, but are important nevertheless because
they can affect implementation processes and outcomes.
A Working Model of the Program
This section articulates a logic model for the ELVA approach (see Figure
3, p. 19). The model offers a visual depiction of the hypothesised
relationship between activities and intended outcomes. Key contextual
features that may influence the generation of outcomes are also
identified.
As a starting point, a rudimentary logic for the program can be stated in
narrative form as follows:
If teachers experience effective training in the ‘Dax approach’
then they will acquire new insights, knowledge, attitudes and
skills regarding emotional literacy. If teachers are motivated and
provided with support to apply this learning appropriately in the
classroom, then student emotional literacy will be enhanced.
The model was developed initially by examining key documents and
talking with program architects to surface their assumptions about the
logic underlying the approach. This was supplemented by examining
relevant literature, including Kirkpatrick’s (1959/1977) four level
framework for evaluating training programs as well as various theories
of adult experiential learning and training transfer (see for example,
Knowles, 1989; Kolb, 1984; McKeough, Lupart, & Marini, 1995;
Mezirow, 1991; Prawat, 1989; Schon, 1988).
The model was subsequently refined as data was collected and
iteratively ‘tested’ against the conceptual ideas and presumed causal
linkages identified in the original schematic model.
Explaining the Model
The model should be read from left to right - starting with the
components of the program process theory, then to the outcomes
grouped under program impact theory. Examples of contextual factors
that may influence the operation and effectiveness of the program are
not depicted in the diagram, but are discussed below. The arrows
indicate the hypothesised linkages between the various elements of the
program. In many cases these may not necessarily be linear, as
suggested by the model, but could possibly interact in a reciprocal
fashion or also feed back into other components of the model.
For example, when a change in teachers’ classroom practice impacts
positively on student emotional literacy, this may result in a virtuous
‘feedback loop’, leading to reinforcement of these changes among
individual teachers (i.e. continued and/or greater application of Dax
emotional literacy lesson plans).
Program process theory
Four major components are identified in the model. First, a key process
is the development and maintenance of an effective strategy for
delivering teacher professional learning. A second important step is
careful selection and recruitment of schools and teachers who are
motivated to learn. The next stage involves delivering the training
sessions. A key enabler of these three components of the
implementation chain is the ongoing support and direction for pilot
schools provided by the Dax Centre.
In future versions of this model it may become important to include the
following items: (a) details of ongoing operational costs associated with
training; (b) delivery mode and format (e.g. online, intensive, on-site); and
(c) new information on training needs of teachers, as well as other
inputs that may be required to support development of the program,
such as inter-agency partnerships.
While not explicitly identified in the model, outputs are implicit in the
activities themselves. Examples of outputs derived from program
activities include: manuals and documents, number and types of
training hours/sessions conducted, number and types of participants
who attend, number of site visits to pilot schools, and so on.
Program impact theory
A commonly used framework for evaluating the effectiveness of training
programs has been developed by Donald Kirkpatrick. His idea of a four-
level model was first published in 1959 in an article for the American
Training and Development Journal. Since then, Kirkpatrick’s model has
been used extensively in industrial/organisational psychology and
management-oriented evaluation to cluster training program outcomes
according to four levels:
1. Reaction: how those who participate in the training program
respond to it (i.e. a measure of satisfaction).
2. Learning: the extent to which participants change attitudes,
improve knowledge and/or increase skill as a result of attending
the training program.
3. Behaviour: the extent to which change in behaviour has
occurred (i.e. integration of new learning into everyday practice).
4. Results: the final results that occurred because the participants
attended the program. These are usually long-term outcomes
and constitute the reason for having the training program (e.g.
enhanced student emotional literacy).
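As a purely illustrative aid (our own sketch, not part of Kirkpatrick's framework or the ELVA documentation), the four levels can be treated as an ordered taxonomy against which the evaluation's evidence sources are clustered:

# Hypothetical sketch: clustering evidence sources by Kirkpatrick level.
# Level names follow the list above; the mapping of evidence to levels
# is illustrative only.
from enum import IntEnum

class KirkpatrickLevel(IntEnum):
    REACTION = 1   # satisfaction with the training
    LEARNING = 2   # change in attitudes, knowledge or skill
    BEHAVIOUR = 3  # integration of new learning into everyday practice
    RESULTS = 4    # long-term outcomes, e.g. enhanced student emotional literacy

evidence = {
    KirkpatrickLevel.REACTION: ["post-training feedback sheets"],
    KirkpatrickLevel.LEARNING: ["teacher interviews on knowledge and confidence"],
    KirkpatrickLevel.BEHAVIOUR: ["six-month follow-up survey of classroom practice"],
    KirkpatrickLevel.RESULTS: ["student focus groups"],
}

# Work through the levels in order, as Kirkpatrick recommends.
for level in sorted(evidence):
    print(level.value, level.name, evidence[level])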
According to Kirkpatrick, the four levels are sequential and should be
evaluated progressively. The rationale behind this is that incorrect
conclusions about program performance and effectiveness may be
drawn if information about reactions, motivations and learning is not
obtained prior to assessment of behaviour and results. Kirkpatrick and
Kirkpatrick (2006) explain this point well, offering the following advice:
My suggestion is to start at level 1 and proceed through the other
levels as time and opportunity allow. Some trainers are anxious to
get to level 3 or 4 right away because they think the first two
aren’t as important. Don’t do it. Suppose, for example, that you
evaluate at level 3 and discover that little or no change in
behaviour has occurred. What conclusions can you draw? The first
conclusion is probably that the training program was no good and
we had better discontinue it or at least modify it. This conclusion
may be entirely wrong…the reason for no change in job behaviour
may be that the climate prevents it. Supervisors may have gone
back to the job with the necessary knowledge, skills, and attitudes,
but the boss wouldn’t allow change to take place (p. 71).
Evidence relating to outcomes at level 1 and 2 is often easier to
measure and collect than for level 3 and 4 outcomes. This is because it
is usually very difficult to determine if it was the training program that
led to results. Typically, there are a range of plausible alternative
explanations that may account for why changes in behaviour and results
did, or did not, occur. For example, an increase in student emotional
literacy may be the result of other interventions or policy changes at the
school, rather than something to do with the actual training program.
Kirkpatrick’s notion of sequentially evaluating outcomes at four levels
has been incorporated into the preliminary logic model. According to the
causal sequence described in the model, a chain of positive outcomes
should flow from implementation of activities. At level 1, it can be seen
that the dispositions and reactions of participants are an important
precursor for outcome achievement. It is critical to be clear here that it
is not programs per se that work. The people who participate in them
are active players who have the volition to determine if they will
participate in the training (or not); take on board knowledge that the
trainers are attempting to impart (or not); consciously reflect on what
they have learnt and apply new knowledge and skills in their workplaces
(or not) and so forth. Consequently, it is important to document
participant reactions to training and not assume that improved learning,
behaviour and results will occur automatically.
If training is successfully implemented, participants are motivated to
learn, and react favourably to the training, then there should be some
form of learning enhancement (Level 2). The next two levels in the
outcomes hierarchy are more difficult to evaluate, as there are many
factors outside the direct control of the program that can affect the
achievement of ‘teacher practice change’ (Level 3) and ‘enhanced
student emotional literacy’ (Level 4). In these circumstances, evaluators
sometimes introduce some form of experimental control design, to
quantitatively estimate what happened with the program in place,
compared to what would have happened to an equivalent group who did not
receive the program (i.e. a counterfactual). This type of design was not
appropriate or feasible for this study, given the stage of program
development.
Contextual influences
As noted previously, programs do not operate in a vacuum. There are a
wide range of potential factors that may facilitate or inhibit program
processes and generation of desired outcomes. These are not presented
in the diagram to reduce visual complexity. As a tentative taxonomy, we
have grouped the main contextual influences into the following
categories: (a) teacher and student characteristics; (b) school-level
factors; (c) home environment; and (d) the broader policy context.
These capture, broadly, the main levels of external enablers and
constraints that we believe shape implementation and co-determine the
type and magnitude of effects on students, teachers and the broader
school community. As knowledge of the program develops, these
categories could be usefully unpacked to understand specific
configurations of contextual influence in different school settings.
Using the Model to Guide the Evaluation
The model of the ELVA approach that has been developed here should
be viewed as a tentative attempt to explain how the program works to
achieve desired outcomes. A more nuanced understanding of the
program and its operation is likely to emerge over time, and with
successive evaluations that build, test and refine the model further.
It is important to reiterate that the model is not intended as some kind
of fixed ‘blueprint’; rather it is a simplification of reality that is designed
to facilitate communication and common understanding about how the
program, in an ideal sense, is supposed to work. In a more practical
sense, the model provided an organising structure and framework to
guide the evaluation. For example, the model was used to facilitate
identification of key evaluation questions and critical program functions
that then became the focus of data collection efforts.
Figure 3: A working model of the emotional literacy pilot initiative

Program process theory: design an effective teacher professional learning
approach; identify and recruit appropriate schools and teachers; deliver
high-quality professional learning sessions; and provide on-going support
for teachers and schools, including a part-time co-ordinator to enhance
training transfer, a multi-disciplinary working party to guide
implementation, a research and knowledge-base to inform content and
curriculum, and materials and other resources to support classroom
instruction.

Program impact theory: training provides opportunity for experiential,
practice-based learning; increased teacher knowledge, capacity and
confidence, and change in attitudes and beliefs; teacher practice change;
which contributes to enhancing the emotional literacy of students.
Implementation and Early Evidence of Effects
This section integrates findings from observations of training sessions
and post-training interviews with teacher participants and school
principals. Originally, we had also anticipated obtaining feedback from
students via a series of small group discussions at each of the eight pilot
schools. Despite our best efforts, this was only possible at one of the
pilot school sites. The results are organised as follows:
Overview of the schools
Perceptions of the training
Critical features of the training
Implementation strategies and challenges
Effects on teachers’ learning and classroom practice
Suggested areas for improvement
Overview of the Pilot Schools
Table 2 provides a summary of the pilot schools. Schools varied quite
substantially in terms of sector, type, size and socio-economic profile.
However, they did share a common reason for being involved in the
ELVA pilot. Essentially, they see in ELVA a complementary fit with the
ethos of the school, especially as this relates to existing well-being
policy and programs. The novelty of ELVA was also cited as a factor
affecting the decision to participate in the trial: ‘It was something
completely different that we wanted to try’.
Table 2: Summary of participating schools
School | School sector | School type | Total enrolments | ICSEA | Location
A | Government | Combined | 2143 | 916 | Metropolitan
B | Government | Primary | 165 | 991 | Rural
C | Government | Primary | 523 | 1127 | Metropolitan
D | Government | Primary | 381 | 992 | Metropolitan
E | Government | Special | 344 | - | Metropolitan
F | Government | Primary | 133 | 1088 | Rural
G | Independent | Primary | 33 | 1174 | Metropolitan
H | Government | Primary | 543 | 938 | Rural
Perceptions of the Training
Reflecting on the simple ‘causal’ sequence outlined in our logic model,
we hypothesised that if teachers respond favourably to the training then
changes in knowledge and practice are more likely to occur. In turn,
these changes may contribute to improvements in student emotional
literacy. This is really nothing more than common sense. Teachers
usually attend professional development sessions with high hopes of
learning something new. However, if they are not satisfied with the
training, and this could occur for a number of reasons, then it is very
unlikely that changes will be implemented in the classroom. Effectively,
we were interested in progressively answering four basic questions:
Did participants like the training?
Did they learn something from the training?
Did they change their teaching practice as a result of the
training?
Did this contribute to positive effects on students?
Gathering evidence of training satisfaction is usually achieved by simply
asking participants informally - during and at the end of sessions. Often
this is supplemented by post-training ‘feedback sheets’ that record
participants’ reactions and provide an opportunity for immediate
response to facilitators. As an initial step in evaluating levels of
satisfaction we analysed post-training feedback sheets from each of the
seven professional development sessions (including four sessions that
occurred prior to the commencement of the evaluation). Box 2 below
provides a representative summary of participant responses, organised
by session.
Box 2: Post-training participant feedback by session
Session 1: Introduction to the Dax approach
“Thanks for a really useful, stimulating and interesting day. I’m looking
forward to the rest of the sessions”
“A very effective day balanced theory/philosophy well with practical
activities. Today gave a sound starting point.”
“The smoke/fire analogy to explain the psychodynamic approach was
really helpful”
“Carefully thought-out process/agenda for the day”
“The information about adolescent brains and neurons was very
interesting”
Session 2: Experience, relationships and reflection
“I really enjoyed the passion and enthusiasm of the facilitators”
“Liked hearing the ‘real life’ examples of how it is working in other
schools”
“I have a deeper understanding of the process and implications [but]
would like to know more about how to successfully implement such a
valuable program”
“Demystified psychoanalysis - a thorough and easy to understand
presentation”
“Great sense of collegiality across everyone and the sense of learning
together”
Session 3: Understanding relationships
“Wealth of experience here, honesty and sense of mutual purpose”
“Thank you very much for another valuable day!”
“I enjoyed hearing about how the activities can be used in an art class.
This was a practical example that helped to bring the information
together”
“It was a really interesting day. Congratulations! I am getting excited
about implementing the program.”
“How to support the implementation of the program so that it fits with
the rest of what we do at our school?”
Session 4: Creating a safe and supportive environment
“It’s a rare and delightful joy to attend a training program where there
are as many, or more, personal developments and benefits as there are
for my teaching practice”
“Beautiful, happy presenters”
“Amazed by the fantastic presentation of practical units of work”
“The experience of other teachers - always great for gaining new ideas”
“Really loved all the presentations today. Can now really see a path to
how I can implement in our school (me), although my ‘deep end’
continues to be the whole schools aspect”
Session 5: Implementation and Support
“Helpful learning how to explain/promote program to other staff in a
way that ‘guarantees’ support/success”
“At this time in place a little overloaded with information and need to
read over resources”
“Great to get a clear group understanding of the Dax approach”
“The importance of boundaries, and ‘acceptable flexibility/adaptation’ of
the units of work”
“Was looking forward to coming again. Usually can’t stand PD’s”
Session 6: Implementation and Support
“Great to get handouts re: notices to parents/newsletter”
“Greater maturing of thinking and understanding was shown. I certainly
will go away with greater knowledge and understanding of what is useful
and required for effective implementation”
“Going through the planning for units of work. Discussion led to several
Ah-ha moments.”
Session 7: Show and tell
“I have so enjoyed this opportunity. It has not only enhanced my
teaching and the experiences I am able to make possible for our
children, but what I’ve learnt and seen with myself, my journey, my
teaching, parenting, etc”
“This approach must be central to all teacher training at every level”
“At the beginning of the project the ‘how’ was a big question. The lesson
plans were very helpful in giving us a ‘safe’ starting point’. The
neuroscience aspect at the beginning was crucial”
“Deepest gratitude and thanks for the opportunity to be inspired and
trained by ‘amazing’ teachers...their expertise, endeavour, passion,
knowledge and warmth made the PD a ‘top class’ learning experience
for us all”
“Very helpful listening to others share what they have done - what
worked and what didn’t”
“Variations on previous units covered - great how many different ways
of using the same ‘unit’. Simple hints, practical applications”
“Thoroughly enjoyable PD. I have learnt and experienced growth in my
teaching”
“Thank you! It’s been so inspiring. Truly a positive life-changing
experience”
These responses have not been selectively ‘handpicked’. Several specific
comments were made about the quality of the two facilitators, as well as
guest speakers. There were very few negative remarks, and these
related mainly to: (a) logistical aspects of the training, such as
‘uncomfortable’ chairs, parking facilities, etc, and (b) some uncertainty
and lack of clarity about the ‘Dax approach’ and how it might be
practically implemented in schools and classrooms. There was a
noticeable decline in comments regarding implementation feasibility as
the training progressed.
While these findings are promising, and suggest high levels of
satisfaction, it is important to acknowledge the problem of ‘grateful
testimonials’ which is linked to social desirability bias. Campbell
(1969) defines the difficulty as follows:
Human courtesy and gratitude being what it is, the most
dependable means of assuring a favourable evaluation is to use
voluntary testimonials from those who have had the
treatment...The rosy glow resulting is analogous to the
professor’s impression of his teaching success when it is based
solely upon the comments of those students who come up and
talk with him after class. In many programs, as in
psychotherapy, the recipient, as well as the agency, has devoted
much time and effort to the program and it is dissonance
reducing for himself, as well as common courtesy to his
therapist, to report improvement... (p. 426).
This may strike some as unduly harsh, but in our experience it is
common for training programs to assume success based on grateful
testimonials. To dig deeper we observed a number of training sessions
and conducted interviews with teachers at the end of the seven
sessions. We stressed the importance of honest appraisal, reminded
interviewees that this was an independent evaluation and reinforced
that responses were confidential and anonymous.
The interviews and our observation of training quality corroborate the
positive testimonials provided by teachers on post-training feedback
sheets. Selection bias cannot be ruled out completely, but it seems
improbable. While the majority of teachers appeared motivated to learn
and held prior beliefs that were consistent with the ethos of the training,
these characteristics and predispositions do not account for high levels
of reported satisfaction. There was no evidence to suggest that
participants were telling us what we wanted to hear. Teachers are
generally not reluctant to share criticism of poor quality training,
especially when there are financial costs, such as the need to provide
casual relief staff.
In short, participants were authentic and effusive in their praise, with
several stating that this was the best professional development they had
attended. As one school principal reflected:
“Professional development is often about what we teach, about
how we teach, about when we teach it. But, we leave out the
who’s teaching it and I think it’s the who that is the most
important thing...the most powerful tool that we have as
teachers - so we really need to know who we are, and know our
own emotional capacity”.
Critical Features of the Training
Why did participants react so favourably to the training? The above
quote provides a glimpse of one critical overarching feature:
recognition that teacher emotional literacy is an important precursor for
enhancing student emotional literacy. Certainly, it would be difficult for
teachers to foster students’ ability to understand the significance of
emotions if they are not aware of how their own thoughts and feelings
impact the classroom. This concept of ‘knowing me, knowing you’
emerged as a pithy summary of the program among facilitators and
participants.
During interviews we asked a range of probing questions to further
explore successful features of the training, such as: ‘What was it exactly
about the training that you liked?’, ‘You must have attended a lot of
teacher professional development programs; how does this compare?’ and
‘Why do you say that this training was much better?’
From these responses emerged a useful catalogue of program design
and delivery features that facilitated teacher engagement and learning:
(a) duration; (b) theory-practice mix; (c) experiential learning; (d)
characteristics of the facilitators; (e) collective participation, and (f)
ongoing support and materials. These critical features are explained in
more detail below with the support of illustrative comments from the
interview transcripts.
Duration
Participants frequently mentioned that in contrast to other professional
development experiences, the length of the training (7 sessions over 18
months) encouraged deeper learning and supported practice change
that could be sustained over time. One teacher summarised this well: “I
think I rate it highly compared to other training that I’ve received mainly
because the training was ongoing”. Another echoed these sentiments,
reinforcing the point that: “It’s not like you go to a PD and come away
and think that’s really good, and that’s the end of it. It was ongoing and
great that they implemented it over that period of time”.
Sequential learning was important because it allowed teachers to trial
classroom practices, obtain feedback, reflect and apply what was learnt
to new situations. One teacher explained the importance of this cycle of
do-reflect-apply as follows:
“You had the opportunity to take everything away after the
session, think about the information that you’ve been given and
how it would work in your school, try it out, and then go back
within that certain timeframe and revisit at the next training”.
A few teachers reported that some of the early sessions were very
heavy going, especially for those new to the content. Thus, pacing of
the sessions over school terms, with time to digest the readings and
training helped to minimise cognitive overload - “it’s not something that
you could condense into a week-long training - you need time to
download.”
Theory-practice mix
Another positive feature of the training, as reported by several
participants, was the blend between theory and practice. As noted
earlier, the program is informed by various psychodynamic theories,
neuroscientific research and a developmental perspective on early
intervention and prevention. The facilitators provided participants with a
reading list of book chapters and articles that reflect this
multidisciplinary foundation. The training itself is facilitated by a
teacher-clinician dyad allowing for a bridge between educational and
psychodynamic perspectives on emotional literacy. Throughout the
training, guest speakers were utilised, including Dax Centre staff,
members of the working party, and classroom teachers who had
previously developed and tested classroom units of work during the trial
phase.
Due to the timing of the evaluation, we were not able to observe the
first four sessions. These leaned more toward psychodynamic theory in
terms of content. However, our observation of later sessions, review of
curriculum materials and teacher interviews suggest that the majority of
participants felt that possessing some knowledge, or at least a basic
awareness, of psychodynamic theory, was important for effectively
implementing units of work in the classroom.
In fact, appreciation or theory-awareness may have been the actual
learning outcome, rather than deeper abstract knowledge. At the time of
the interviews, few participants could elaborate specific theories,
concepts or ideas from the knowledge-base of the program, beyond
perhaps a general familiarity and recognition of the potential significance
of, say, Freud’s notion of the unconscious mind, as in the ‘smoke/fire’
metaphor often cited by participants as informative. Teachers conveyed
the sense that the theory and evidence-base legitimated existing
beliefs and provided a strong rationale for visual arts as a medium for
fostering emotional literacy and student well-being.
Experiential learning
Training sessions were organised to facilitate active, experiential
learning usually through a non-didactic process of reflection via some
direct, concrete experience. In the context of the training, teachers were
involved in ‘making art’, following the approach that would later be
adapted in the classroom. For example, in one session participants
completed the ‘inside/outside’ feelings activity. This involves, among
other things, developing a representational container or box depicting
how others perceive you and how you perceive yourself; usually in
relation to a specific context. Inner thoughts/emotions may or may not
be shared with the group.
The importance of creating a safe and supportive environment is also
emphasised. Discussion and reflection on the activity often leads to
greater insight into teachers’ emotional space. It also enables teachers
to understand how the activity may look or feel for students
(including potential anxieties about whether their art will be ‘good
enough’ or ‘right’). Comments provided during interviews demonstrate
the significant insights for teachers that this ‘learning by doing’ approach
achieved. For instance:
“One of the important parts of the Dax training was that we were
actually involved in doing the units of work, so it puts you in the
situation of the student, and I think in particular, for classroom
teachers that’s really valuable putting yourself in the students
shoes and seeing things through their eyes”.
And:
“Being put in that position of making the art, you realise what
emotions you are sitting on yourself, so yeah it can have a really
big impact”.
A related benefit of active learning is that it provides teachers with an
opportunity to rehearse lessons and see how the units of work might be
transferred to the classroom. The role of the facilitators in modelling the
lessons was regularly cited as important for implementing the training
materials effectively. One participant recalled that ‘seeing [the
facilitator] model how to teach those units, you would hear the little
gems that she would throw in and just watching that and learning from
that was invaluable’.
Characteristics of the facilitators
It seems almost a truism to say that quality of facilitation is a key
ingredient for training success. It is perhaps surprising then, that much
of the research in education does not explicitly identify facilitator
characteristics as predictors of the effectiveness of teacher professional
learning. Instead, the focus is often on training content, school and
policy factors and individual learner variables such as motivation
(Penuel, Fishman, Yamaguchi & Gallagher, 2007).
There is much discussion, however, about the importance of trainers in
the extant literature on training transfer. Good trainers are
knowledgeable about the subject matter, display professionalism,
recognise the needs of individual learners, and understand the principles
of adult learning. Effective trainers involve participants in the learning
process; using techniques such as role plays, small group exercises and
collaborative activities (Burke & Hutchins, 2007).
In the sessions that we observed, facilitators displayed all of the above
attributes. Furthermore, they prepared well for the sessions and showed
empathy, patience and honesty, including an ability to place
themselves in the situation of teachers, recognising uncertainties and
anxieties when learning a new way of thinking about classroom practice.
They used humour and personal anecdotes to put participants at ease
and convey difficulties they had encountered in their teaching.
Participants regarded the facilitators as leaders. During interviews,
teachers spoke positively about their capacity to listen and show respect
for the experience of teacher participants. We noted several instances
where the facilitators fostered the development of a safe learning
environment and modelled what it looks like to be an emotionally
literate teacher. In our view, the facilitators are a key factor in
understanding positive reactions to the training.
Collective participation
Another critical aspect of the training was that it supported the
development of a community of learners. Collective approaches to
professional development were widely regarded by teachers as “a great
way to learn, because you’re all learning about something together”.
One precondition for participation was that schools were required to
involve, at a minimum, two teachers from the school in training sessions
(and ideally the principal). In practice this was not possible for all
sessions, but the majority of pilot schools were able to maintain regular
group attendance of teaching staff.
There were a number of advantages associated with this design feature.
First, it resulted in opportunities for teachers to share and discuss
concepts, practices, and problems both within and across schools:
“It’s just been so rewarding...the networks made with the other
teachers and talking with them extensively about what they do at
their school. They shared lots of examples of how they dealt with
certain kids and the outcomes that they got, so then it kind of
gave us ideas...that’s been something that I’ve really taken
away, the ideas from others.”
Another benefit of joint professional learning is that teachers who work
together in the same school can share knowledge about students’ needs
across classes. Teachers can also provide each other with mutual
support to address implementation challenges such as securing ‘buy-in’,
dealing with staff turnover and sustaining changes in practice over time.
Travelling to and from the training provided an avenue for some
teachers to consolidate learning and collaboratively plan strategies to
implement the approach within the particular context of their school.
Ongoing support and materials
There was strong agreement among the pilot schools that a significant
success factor was the ongoing technical support provided to teachers
outside training days. In addition to delivering the training sessions, the
lead facilitator communicated regularly with participants via email and
phone conversations. Individual ‘coaching’ was provided in some
instances, for example when a new participant commenced mid-way
through the training.
To develop a better understanding of the school implementation context
and promote awareness of the approach, site visits and presentations
were also offered to pilot schools. The response from participants was
that this support was invaluable. It helped raise the profile of ELVA at
their school and addressed any misunderstandings or concerns that
other staff may have held about the purpose of the approach.
Various resource materials were provided, such as sample letters to
send parents, items for newsletters, a list of frequently asked questions,
and practical tips for adapting lesson plans to suit classroom needs. This
level of involvement from the lead facilitator also served an important
quality assurance function, as any new materials developed for use in
the classroom were reviewed to ensure consistency with the principles
underlying ELVA.
It is worth ending our discussion with a direct quote from one of the
participants. The comment captures in narrative form the way in which
the training successfully combined several critical features of effective
teacher professional development to assist engagement and learning:
It’s the best PD, best training I’ve ever done on so many levels.
From a teaching perspective I like the way it was spaced out, I
loved the way it had the balance between the theory but also the
practical, being able to network with other teachers and learn
together over that time, and every session you’d end with I don’t
think I quite get this. But everyone else felt the same, and it was
great having the time to go away, absorb it, try a few different
things and then come back at that next level of understanding
and challenge yourself a bit more. And then, doing the practical
activities, it taught you how to do it but then personally it was
also great to process your own feelings that way, so you learnt
through doing, it was really good.
The next section describes some of the ways in which teachers
implemented knowledge from the training and applied it within the
context of their school and classroom. We also examine factors that
were seen to support successful implementation and identify common
challenges raised by teachers and principals.
Implementation Strategies and Challenges
Delivering ELVA in each of the eight pilot schools required different
strategies, due to differences in school context (e.g. size, location,
student characteristics, and available facilities). During interviews we
asked participants to describe key steps they had undertaken over the
past 12 months, individually and at the broader school level, to
implement the approach. We also asked teachers to identify any
challenges encountered and how they had managed these. This enabled
us to learn more about contextual factors that influenced training
transfer.
Involved and committed leadership was seen by all schools to be a
critical factor for ensuring effective implementation. In the majority of
cases this precondition was established as part of the selection of pilot
schools, but for some schools maintaining active involvement was
problematic. A number of school principals, however, regularly attended
training, and could articulate how they saw the approach contributing to
student well-being:
I am 100% behind it. We’ve always been aware of the need to
provide good support and program development for students
around well-being. Emotional literacy is an important part of
what we’ve been doing, but to see this program operate within
the visual arts program, that was the key for us, from my point
of view, how it added to the programs that we were already
running at the school. How it gave the vehicle of visual arts as an
operating way of improving students’ emotional literacy and it
just makes sense to us that that’s what visual arts is all about. It
fits in with the other programs and the philosophy that we’re
trying to implement.
Another important initial step was to find ways to secure the support of
the whole school community. This was seen as a continual process,
rather than a one-off event. Existing workload commitments and
competing demands were often mentioned as obstacles. Finding time to
co-ordinate planning, promotional activities and inform staff was
challenging, especially for teachers working in larger schools and where
ELVA was being introduced as a ‘whole-of-school’ rather than individual
classroom approach. Some teachers mentioned that particularly at the
beginning they did not feel confident enough to explain the approach to
other staff.
One common strategy for raising visibility and support for ELVA among
teachers, parents and the broader school was to use visual displays or
‘exhibitions’ of student artwork that were produced as a result of
trialling the classroom units of work:
“The comments that people gave, visiting principals from other
schools, parents, people delivering things, people selling
educational resources, everybody who walked past it [referring to
students’ artwork using the Dax approach] stopped and looked
and commented. They always say how nice our displays look but
this one stopped people in their tracks and they talked about
them. I’ve never seen that level of interest and comment from
people who are visiting our school.”
Another strategy, noted earlier, was to draw on existing samples of
promotional materials, disseminate a ‘frequently asked questions’ and
answers list to parents, and encourage staff to attend a presentation by
the lead Dax facilitator. The background presence of the Dax Centre, as
the agency responsible for the training, should not be underestimated. It
provided a useful way to establish credibility, particularly in a context
where staff or parents may devalue visual arts, view emotional literacy
as ‘wishy-washy’, or express a misunderstanding that the approach will
encourage teachers to ‘pseudo-analyse’ students, generating
unnecessary distress:
“Some staff were afraid at the beginning of what might come out
from students but I think with the way that we’ve approached
this, the communication with teachers and with parents that it’s
set them all at ease. Also, putting things up on display and
constantly letting them know visually and verbally what was
happening”.
For some schools, implementation challenges related to personnel and
infrastructure issues. For example, they could not afford to send more
staff to the training, did not have a specialist arts teacher, or there were
limited facilities, space and materials to support delivery of the
classroom units of work. This presented some difficulties, as described
by one teacher:
For me probably the most challenging thing was that we didn’t
have an art room here, so it was done within the classroom. So it
was travelling from class to class and I couldn’t use all the
materials that I may have used if I was in an art room.
In a similar vein, another participant felt that “it was tricky, because I
am a generalist teacher with no art background, so having the time and
skills to incorporate the lessons...it would be wonderful to have a
specific art person, I could see that really working well”.
A perceived need for ELVA and compatibility with existing teacher
beliefs, school mission, priorities and values were also important overall
factors affecting the implementation process. The approach worked best
when there was congruence or a ‘fit’ with preferred ways of working with
students:
“I believe that the emotions drive the learning process, not the
cognitive stuff. So, if students are emotionally ready to learn,
they’re feeling comfortable and so on, they will learn easier than
if they’re distressed or anxious.”
A number of participants felt that the ELVA approach might struggle in a
school ‘obsessed with NAPLAN scores’ or where ‘measuring literacy and
numeracy outcomes’ were paramount. Other important factors for
supporting uptake in schools were flexibility and ease of use. For
instance, teachers viewed specific units of classroom work positively, as
something that could be integrated into existing practices and routines
at the school, rather than a time-consuming and difficult add-on.
Teachers were also able to develop and appropriately modify lesson
plans to suit their local context. For example, one school completed the
‘inside/outside feelings box’ but incorporated it as part of an immigration
unit where the box was a suitcase, and the reflection involved
considering the process of migrating to a new country: what you would
feel on the inside and what you would show on the outside.
By way of summary, Table 3 provides an overview of critical features
identified through this evaluation as important factors for enabling
successful implementation of ELVA in pilot schools.
Table 3: Factors affecting the implementation process

Teacher characteristics: academic background/discipline; motivation and self-efficacy; beliefs about teaching; time constraints; openness to change
Facilitator characteristics: empathy; expertise and qualifications; patience; knowledge of adult learning principles
Features of the training: duration; content; experiential/collegial learning; ongoing support and materials
School context: leadership support; teachers’ workload; availability of funding for teacher release; perceived need and levels of support; compatibility or ‘fit’; ease of implementation; evidence of a culture that reflects prioritisation of student well-being
Effects on Teachers’ Learning and Classroom Practice
The main perceived benefits that emerged from the experience of pilot
schools are discussed below, from the perspective of teachers, students
and the broader school community. The positive effects reported by
teachers, in relation to their own learning and classroom practice, are
consistent with the outcomes hierarchy identified in the initial logic
model (see p.19).
Reflecting back on the model, we hypothesised that teacher learning and
practice change are mediators for improvement in student emotional
literacy, the ultimate long-term goal of the program. In other words, we
regard the development of teacher emotional literacy as an important
precondition for enhancing student emotional literacy. You cannot teach
what you do not have yourself.
Following this logic, we started from the premise that the training
component of the program is only useful if, in fact, teachers acquire a
deeper understanding of their own emotions and are able to transfer
learning to classroom situations. A range of factors can affect training
transfer, as documented earlier. Often there simply is not enough time
or support for teachers to put what they have learnt into practice.
In relation to ELVA, one observable indicator of training transfer is
implementation of the Dax units of work. These units are lesson plans
that outline different art experiences that teachers can utilise to enhance
students’ capacity to understand their own and others’ emotions. There
are approximately 17 units that have been developed, and these vary in
terms of grade level (P-6) and length of time (3 weeks to a full term). The
units comprise four components: the pre-activity teacher reflection; the
activity and experience; reflection; and observation of teacher and child
response.
There is considerable evidence to demonstrate that all teachers who
participated in the training had implemented at least one classroom unit
of work. Some teachers had trialled more than one unit in their
classroom, and a few had exclusively used the Dax units to replace
existing art lessons. Teachers reported that they intended to continue
implementing the units of work after completion of the training. The
most typical units implemented were ‘inside/outside feelings’, ‘self-
portraits’ and the ‘rainbow of feelings’.
Many teachers felt that as a result of the knowledge acquired during
training and experience delivering classroom units of work, they had
developed greater emotional self-awareness. That is, they were more
conscious of how their own thoughts, feelings, emotions and actions can
influence students. As one teacher reflected:
“It was actually through doing the training and units of work in
the training that I realised what emotions I was feeling as a
classroom teacher and how these might be projected onto
students...quite overwhelming. I don’t understand how people
cannot even consider it [your own emotions]. It absolutely
baffles me. You need to know who you are, what you feel like, who I
am as a teacher and how that really does influence things.”
The training also provided teachers with a greater understanding of the
importance of creating a safe environment for expressing emotions, and
emphasising the notion of respect for the students’ choice in sharing
their work with others. This can take time to develop, but was
considered essential for enabling students to feel comfortable sharing
and talking about feelings associated with their art work, without fear of
being judged. The teachers’ own experience in completing units of work
during the training helped illustrate the importance of a supportive
context.
Several teachers emphasised that the training had ‘changed the way
they teach, and think about teaching’ and reinforced their beliefs about
the significance of emotions within daily classroom life. During
interviews teachers spoke about being more reflective, empathetic and
mindful of how they communicate and relate to students. For example, a
number of teachers indicated that they were more ‘tuned-in’ - aware of
the language they used in the classroom and confident in their capacity
to perceive student emotions accurately:
“I realised that now I’m not just looking at their artwork and
saying, ‘oh this is great’ or ‘beautiful’. I am now asking them
questions: ‘tell me about your artwork’ and ‘why did you do
that?’. I’m noticing things about them that I didn’t see before”.
Many teachers elaborated that as a result of implementing the units of
work they ‘think more about the kids and their needs’, have ‘gotten to
know students much more deeply’ and are ‘a lot more clued in to
dealing with individual students’. This has been extremely beneficial for
student engagement, relationship development and the overall learning
and teaching process (including outside the art curriculum). For
example, one teacher described that following implementation of the
‘inside/outside feelings’ unit a number of students ‘described feelings on
their inside that I thought, “oh wow, I didn’t realise that’s the way they
were feeling until then”’.
Others described that previously disengaged students would now often
approach them outside class and during recess to talk about how they
were feeling, often articulating their emotions through colours and
symbols that they had learnt through completing the units of work.
According to one teacher, who ran a visual arts session with ‘kids who
are quite troubled’, the response was ‘amazing’ and ‘so many of those
kids have maintained a relationship with me, and actually seek me out
to talk about things’.
Benefits for students
The effects on teachers described above are logically connected to
student outcomes. An improvement in teacher self-awareness,
knowledge about how to build safe and supportive classroom
environments and increased understanding of students are likely to have
a positive flow-on effect. Although we were only able to speak directly to
a small sample of students at one of the pilot schools, a provisional,
qualitative assessment of impact was attempted by asking teachers to
provide examples and evidence of changes they had observed in
students.
The stories conveyed by teachers illustrate that one of the significant
benefits they observed was a greater level of emotional awareness and
expression among students, especially those who previously
were very reserved. For example:
“Students who don’t have the words, who don’t have the literacy
and numeracy to express their knowing and their understandings
about the world around them and their own position in their
world, the artwork was amazing. I’ve got amazing things from a
student with Asperger’s who normally doesn’t give any emotion.”
This observation was supported by students who participated in a group
discussion with the evaluators. When asked ‘What did you learn when
making the artwork?’, one student commented: ‘I learnt that it’s okay to
express myself, and you don’t have to share something you don’t want
to’. Another commented that he ‘learnt how artwork can represent your
feelings and emotions’.
A more positive, gentle classroom atmosphere was also linked to
implementation of the Dax units of work. The activities were seen to
have facilitated teacher-student and student-to-student engagement,
trust and rapport. Teachers reported that students were visibly more
enthusiastic about their art work and creating more thoughtful art,
rather than the ‘uniform and bland pieces of work that were produced in
the past’. One teacher explained that:
It has given students who would never participate in art or even
appreciate the work an opportunity to engage. They are now
loving art because there’s not that fear...those kids are producing
things they’re really proud of. You can definitely see a change.
Students who would never talk and share are coming out of their
shell and are a lot more confident.
And,
...the kids in my room feel a safety about their well-being and
their emotional well-being, not judged. They’re very open and
they do talk and they do communicate. I know my grade can be
quite difficult, but within the room they are very supportive of
one another and they actually as a team get on really, really
well.
Moreover, involvement in the Dax units of work was seen to have helped
students ‘appreciate why somebody made that artwork’ and to
understand the power and ‘meaning of art, not just producing it because
they have to’. Teachers felt that as a consequence students were
learning more about each other and developing a greater capacity to
empathise: ‘just seeing how concerned and connected they were when
sharing feelings about their artwork was incredible’. Another teacher
commented that ‘when I was implementing the units, the calming effect
on them was amazing...previously it would be ‘your work is dumb’ or
‘stupid’ and so on. Now, kids were sharing feelings and finding things in
common with other kids that they didn’t have a lot of contact with’.
This claim was substantiated by students who commented that as a
result of participating in the Dax units of work, they ‘learnt things about
some other people, not just my friends that I didn’t know before’. Other
positive reactions from students were that the experience was
‘something different from what we normally do...it was really fun and we
looked forward to it each week. Sometimes we were so eager we
reminded the teacher a lot’. Another student, a grade 3 boy, felt that
the artwork he produced was more individual and meaningful: ‘it’s
something that represents you, not what the teacher has told you to do,
to paint’.
Wider effects
Teachers reported some initial changes within the school community.
Primarily, this was linked with efforts to promote visibility of ELVA in
school newsletters, meetings and other forums. Often this was achieved
by inviting the Dax Centre lead facilitator to attend briefing sessions for
staff. This helped raise awareness, address any potential concerns and
misunderstandings, and legitimise the approach.
Many schools also experimented with the use of visual displays of
student artwork in reception areas, classrooms and at parent-teacher
nights. This proved to be a useful strategy for ‘creating conversations’
about the value of visual arts and the relationship to emotional literacy.
For example, one principal spoke at length about the unusually high
number of unsolicited comments he received from visitors, who almost
invariably remarked on the uniqueness of student work: “it’s not like
traditional primary school art; it didn’t look all the same”.
When asked about parent responses to ELVA, several teachers described
examples of favourable reactions, noting that support for the innovation
was strong. At events, some parents recounted to teachers that their
child had spoken positively at home about their experiences of creating
and talking about art, and were astonished at the depth of insight,
understanding and sophistication of language used to describe emotions
and feelings. One pilot school, in a rather unique situation, was able to
successfully apply their learning from ELVA to support students,
teachers and parents who were dealing with the stress, anxiety and
heightened emotions associated with impending school closure.
Innovations produce a variety of effects, some of which are unintended.
One foreseeable negative outcome of ELVA is ‘leakage’. This involves
teachers sharing materials with colleagues who have not participated in
the training, thus potentially undermining implementation integrity.
Related to this is the idea that teachers may misunderstand the
philosophy underlying the program, and come to view their role as akin
to an ‘art therapist’. We did not find any evidence to suggest this had
occurred, but note that the training facilitators spent considerable time
within the curriculum discussing these matters.
Areas for Improvement
During the interviews, teachers were invited to provide suggestions for
ways in which ELVA could be improved. Comments were focused mainly
on practical matters such as the desirability of having a consolidated
package of training materials and lesson plans, including online
resources to support implementation. Some commented that they would
have liked to focus less on theory and more on practice, while others felt
that conceptual underpinnings could have been elaborated, possibly
involving the use of more expert ‘guest’ speakers.
Many teachers struggled with how best to ‘sell’ the approach in their
particular school, and would like more guidance and resources to enable
this to happen. Sharing practice-based examples of how the innovation
had been applied in different settings, and with different student groups,
was frequently mentioned as a good way to enhance adoption.
Videotaping of classroom sessions and opportunities to ‘see’ how
experienced teachers run Dax lessons with their students were also
highlighted as effective strategies to enhance learning and uptake. To
support practice change, many respondents expressed an interest in
developing structures to maintain communication and knowledge
sharing between the growing network of participating schools and
teachers.
Training Transfer:
Results from a Six Month Follow-Up
In this component of the evaluation we obtained, amongst other things,
follow-up evidence to help determine whether and to what extent
transfer of learning occurred, six months following participation in the
pilot. Three key questions guided data collection efforts: Do teachers
continue to use ELVA in their daily practice? What factors in the school
setting help or hinder practice change? Are teachers observing any
benefits of practice change on students?
Method and Data Source
Participants
In total, 13 teachers participated in the follow-up questionnaire,
representing an acceptable response rate of 62%4. However, given the
low sample size, caution needs to be exercised when interpreting results.
Of this group, 12 were female. Half of the respondents (50%) were aged
between 50 and 59 years, with the second most common age category
being 30-39 years (21.5%). Most participants reported having 20 or more
years of teaching experience (57%), followed by equal proportions with
1-5 and 6-10 years of experience (14% each).
All respondents were located in the government school sector, and a
little over two-thirds of the sample reported working in large schools
with 300+ students. In terms of job position, most described themselves
as classroom teachers (46%), followed by art teachers (38%) and 15%
indicated they were part of the leadership group at their school. There
was an even spread of grade level taught, from prep to year six.
Measures
The survey instrument was a modified version of the tool developed by
the Australian Council for Educational Research (ACER) to measure the
impact of teacher professional development programs on knowledge,
practice, self-efficacy and student learning outcomes (Ingvarson, Meiers,
& Beavis, 2005). Teachers provided demographic information as
indicated above. The survey also contained indicators of participant
satisfaction with the quality of the training and structured self-report
data on levels of use and barriers to implementing Dax units of work.
4 One respondent was removed from the analysis due to a high number of missing data/skipped
responses. Note that with a sample size of 13 and a population of 21, the margin of
error is approximately 18%.
Procedures
An online survey method was selected to obtain feedback from
participants. Questions were developed in collaboration with Dax
facilitators and pilot tested on a separate sample of teachers familiar
with the ELVA approach. The survey was designed and administered
through SurveyMonkey®. The principal evaluator sent email requests to
all 21 eligible teachers requesting their participation in the survey. The
email contained a link to the survey. Two rounds of reminder emails
were sent, roughly two weeks apart.
Key Findings from the Survey
Descriptive results are reported below for measures relating to five key
domains of the survey: (a) school support, (b) perceptions of training
quality, (c) impact on teacher knowledge and practice change, (d) levels
of use and barriers to training transfer, and (e) student outcomes.
Although we see potential relationships among these domains, given the
small number of respondents it was not possible to investigate these in
this study. It was not possible, for example, to conduct analyses to test for
associations between variables such as level of school support and
teacher knowledge and practice change.
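By way of illustration only, an analysis of this kind with a larger sample might resemble the sketch below. The variable names and scores are hypothetical and are not drawn from the ELVA survey data; the example simply assumes ordinal Likert-style ratings and the availability of the SciPy library.

```python
# Illustrative sketch only: hypothetical scores, not ELVA survey data.
# With a larger sample, the association between school support and reported
# practice change could be examined with a rank correlation (Spearman's rho),
# which suits ordinal Likert-style ratings.

from scipy import stats

# Hypothetical per-teacher scores (e.g. averaged five-point Likert ratings).
school_support = [4.2, 3.8, 4.6, 3.1, 4.9, 3.5, 4.0, 4.4, 2.9, 3.7]
practice_change = [4, 3, 5, 3, 5, 3, 4, 4, 2, 4]

# Rank correlation between the two sets of ratings.
rho, p_value = stats.spearmanr(school_support, practice_change)
print(f"Spearman's rho = {rho:.2f}, p = {p_value:.3f}")
```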
School support
School support for professional development was measured by asking
respondents to indicate the extent of agreement with the following
statements:
The leaders at my school actively support and encourage all staff
to take part in professional development
Insufficient time is available in my school to support teachers'
professional learning
Follow up support for professional development is available within
my school
Teachers at my school work collaboratively to resolve teaching
and learning issues
The arts are valued and supported at my school
According to Ingvarson et al. (2005), these factors constitute control
variables that might account for variance in the impact of professional
development on teacher outcomes. The responses to these items were
positive overall. This suggests that at most pilot schools there were
good levels of support for teachers to implement knowledge and insights
gained through participation in ELVA.
Perceptions of training quality
Previous sections of this report present findings regarding training
satisfaction, drawing on post-training feedback forms and participant
interviews. These data indicate that immediately following completion of
professional learning, teachers were very positive about the quality of
ELVA. To investigate this finding further, teachers were asked to provide
an overall rating on a scale with response options of ‘excellent’, ‘good’,
‘average’, ‘poor’ or ‘very poor’. All 13 respondents rated the professional
learning as excellent.
Additional data were obtained by asking teachers to report on the extent
to which they disagreed or agreed with a series of statements regarding the
training (1=strongly disagree, 5= strongly agree). Table 4 reports these
results.
Table 4: To what extent do you disagree or agree with the following statements:
Impact on teachers
To estimate the impact of ELVA on different dimensions of teacher
learning we asked participants to rate their level of knowledge (low,
medium, high) before and then after the training. Table 5 provides
retrospective pre-post mean scores. These show positive gains for all
learning dimensions. This suggests that teachers feel there have been
significant knowledge gains associated with the training, especially in
terms of: (1) their understanding of the importance of emotional literacy
for student development; and (2) the role of art-making experiences for
enhancing emotional literacy.
Table 5: Teachers’ retrospective pre-post ratings of knowledge
(each row shows pre mean / post mean / change)

Emotional literacy and its importance for child development: 8 / 13 / +5
Teacher-student relationships in the process of providing a safe and supportive environment: 11.7 / 13 / +1.3
The role of art making in enhancing emotional literacy in children: 7.3 / 12.3 / +5
Creating an emotionally safe and supportive environment: 9.3 / 12.3 / +3
The role of experiences in enhancing emotional literacy: 7.7 / 13 / +5.3
The implications of a teacher reflecting on their own emotional life: 7.7 / 12 / +4.3
Participants in the survey were also invited to indicate the extent to
which professional learning through ELVA had made a difference to the
way they teach, using the following five-point scale:
1. No difference
2. Slight difference
3. Some difference
4. A great difference
5. Tremendous difference
Almost one-third (31%) stated that the training made a ‘tremendous
difference’, 62 percent noted a ‘great difference’ in their teaching
following ELVA and 8 percent ‘some difference’. No-one felt that the
training had made ‘no’ or only a ‘slight’ difference.
Levels and barriers to use
The vast majority of teachers indicated they had trialled at least one of
the seventeen Dax units of work in their classroom (92%). The most
frequently mentioned units implemented after the training were inside-
outside containers, rainbow of feelings, self-portraits and body maps. All
thirteen participants indicated an intention to use one or more units of
work in the future.
Over half of the respondents reported experiencing barriers to
implementing the Dax units of work, with the most common issues
being disrupted working conditions at the school (83%) and lack of time
(67%). Few respondents reported lack of support from leadership or
peers as a barrier, although some noted challenges associated with
space and/or materials to run art activities. These findings are
consistent with positive results from the school support measures
described earlier.
Student outcomes
Indirect measures of student and teacher outcomes were obtained by
asking teachers to respond to a series of statements that convey
important desired effects of the initiative. Figure 4 suggests that
teachers believe their participation in ELVA has helped them to create a
classroom environment that better supports the development of student
emotional literacy. Many also reported feeling more confident in their
teaching practice.
All participants agreed that overall, students are more engaged, have a
greater awareness of the emotional world of other students, and are
more reflective about the impact of their emotions and behaviours on
others.
Figure 4: Teacher perceptions of student outcomes and related effects on
classroom environment and teaching confidence
Discussion and Limitations of the Analysis
Professional learning and development programs are a ubiquitous
feature of the Victorian school sector. Considerable amounts of funding
are provided each year to support teacher participation in a wide variety
of training programs. Although professional development is recognised
as a key factor in supporting learning and student outcomes, the
assumption that knowledge is automatically applied back at school and
in the classroom is questionable. Teacher practice change is hard to
achieve and often even harder to maintain.
Training transfer research indicates that when participants return to the
workplace, many struggle to implement what they have learnt, despite
high reported levels of satisfaction and gains in pre-post knowledge
levels (Burke & Hutchins, 2007). In fact, some studies estimate that as
little as 10% of training leads to practice change, with others suggesting
that around half of training investments do not result in any noticeable
improvements in the workplace (Georgenson, 1982; Saks, 2002).
This can occur for a variety of reasons, but most commonly there is an
absence of a supportive environment and/or insurmountable time and
resource constraints. Work practices continue as before, with limited
opportunities to introduce and embed new innovations.
Given the difficulty of achieving training transfer, we believe the results
contained in this section constitute an important aspect of
understanding the effectiveness of ELVA5. A number of summary
interpretations can be drawn from results of the post-training follow-up
survey:
Supportive context: School leadership, advocacy and
commitment are necessary for enabling ongoing implementation
and routinisation;
Training quality: Teacher judgements of training quality have not
changed half a year post-training, remaining very positive, thus
reducing the plausibility of initial halo or recency effects;
Teacher knowledge and practice change: Teachers report that
they are still using what they have learnt, both directly in terms
of the Dax units of work, and conceptually, with respect to
changes in teaching attitudes, confidence and practice; and
Student impacts: There is partial evidence to suggest that as a
result of changes in classroom practice, students are
experiencing more opportunities to reflect on the significance of
their own and others’ emotions.
Overall, the survey findings are promising and triangulate well with data
reported in earlier sections of this report. Even so, it is important to
emphasise that the sample of respondents is small (n=13), and may not
be representative of the population of teachers who participated in the
ELVA pilot (n=21). Larger sample sizes are also needed to draw clearer
causal inferences about effects on teacher practice, tease out possible
sub-group differences, and explore the impact of individual and school
context factors on training transfer.
5 Certainly in terms of teacher practice change, which according to the program model is an important
link in the causal chain that contributes to improved student emotional literacy outcomes.
Conclusions and Implications
This section examines a number of implications for the ongoing
development of ELVA that have arisen during the course of this project.
Strategic priorities are grouped according to the following key themes:
(1) improving design and delivery; (2) strengthening the research and
evidence-base; and (3) planning for sustainability. The Dax Centre is
already actioning plans to respond to each of these developmental
priorities, although several areas require funding to be generated before
meaningful progress can be made.
Suggestions to Improve Design and Delivery
An important initial step in the evaluation of ELVA was to engage
stakeholders in a process of clarifying how and why the initiative is
assumed to work. A logic model was developed as a way to capture and
‘represent’, albeit in a simplified way, the basic links between inputs,
processes and outcomes. In the real world, things are never so simple.
Nevertheless, the model provides a useful frame of reference for guiding
practice and development to ensure that outcomes are maximised.
For example, the findings suggest that the processes involved in the
implementation of ELVA are more likely to produce the desired
outcomes when teachers are located in supportive school contexts. If
this is not the case then transfer of training to the classroom is more
difficult to achieve. This has clear implications for selection criteria, and
supports the importance of Dax maintaining current protocols for
identifying schools and recruiting teachers.
We were also able to confirm and elaborate the logic model component
that specifies the design of teacher professional development as an
integral aspect of ELVA. We now know that there are several non-
negotiable core elements to ‘protect’ and strengthen as the initiative
moves forward:
Appropriate duration
Theory-practice mix
Experiential learning
High-quality facilitators
Collective participation; and
Ongoing support and materials6.
There is also evidence to suggest there are various factors associated
with teachers and the broader school context that are important and
potentially modifiable influences on effective implementation (see Table 3).
6 These elements are largely consistent with findings from research on core features associated with
effective teacher professional development (see for example, Desimone, 2009).
Our impression is that Dax Centre staff involved in this pilot were
intuitively aware of these elements and features of effective teacher
professional learning, and this explains in part the favourable response
from teachers and schools.
We recommend that more explicit attention now be provided to each of
the above aspects of the training, and that this is used to inform future
training sessions7.
Strengthening the Research and Evidence-Base
The information on program outcomes and impacts that has been put
forward in this report should be viewed as a provisional attempt to
strengthen the empirical evidence-base of ELVA. More detailed
description and statistical measurement of causal links, as well as
deeper explanatory knowledge of how the initiative works to generate
outcomes, will emerge over time as successive research and evaluation
efforts are conducted.
The findings of our investigation suggest that there would be
considerable value to be gained by improving the way in which
outcomes are measured and monitored. One way of achieving this would
be to enhance data collection systems so that they were able to more
readily capture information on the kinds of outcomes we have identified
in this evaluation. At present, there is no dedicated database for
tracking teacher outcomes over time, nor is there an agreed upon
instrument for measuring change in students.
There is also clear scope for improvement in terms of quantitative
impact evaluation, as these techniques offer a comparative advantage in
terms of formally addressing the counterfactual. For example, what
would have happened if teachers did not participate in ELVA? Do
students who experience Dax lessons differ from those who do not?
Given the circumstances, there would seem to be too many difficulties
associated with using a randomised controlled design. Prospective studies
would most likely need to consider a variety of quasi-experimental
options, tailored to different components of the initiative.
7 For example, Kolb’s (1984) four-stage model of experiential learning could help to inform the theory
and practice underlying this feature of effective professional development.
Suggested Actions
1. Consolidate teaching materials and resources, including the development of a training manual for use by current and future facilitators
2. Initiate an e-learning platform to support demand for a community of practice site for ELVA teachers
3. Consider moving toward a more formalised system for ‘qualifying’ teachers in the Dax approach
4. Support continued professional development of teachers through conferences, master classes and other learning opportunities
For example, evaluators might be able to use non-equivalent matched
comparison groups of students to estimate effects on those who receive
Dax units of work versus a comparable group who do not (assuming of
course, that a valid instrument to assess student outcomes is
identifiable, or can be developed). Similarly, although simple pre-post
designs are subject to several validity threats, they provide one way of
assessing whether there have been gains in teacher knowledge before
and after participation in ELVA professional learning.
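As a purely illustrative sketch of the simple pre-post design mentioned above, and assuming hypothetical paired knowledge ratings for the same group of teachers before and after ELVA professional learning, the core analysis might look like the following; the data, scale and sample size are invented for the example and the SciPy library is assumed to be available.

```python
# Illustrative sketch only: hypothetical paired ratings, not ELVA data.
# A simple pre-post design compares the same teachers' knowledge ratings
# before and after professional learning with a paired (dependent-samples) t-test.

from scipy import stats

# Hypothetical knowledge ratings on a 1-5 scale for 13 teachers.
pre = [2, 3, 2, 2, 3, 2, 3, 2, 2, 3, 2, 3, 2]
post = [4, 4, 3, 4, 5, 4, 4, 3, 4, 5, 4, 4, 4]

# Paired t-test: did mean knowledge change from pre to post?
t_stat, p_value = stats.ttest_rel(post, pre)
mean_gain = sum(b - a for a, b in zip(pre, post)) / len(pre)
print(f"Mean gain: {mean_gain:.2f}, t = {t_stat:.2f}, p = {p_value:.3f}")
```

Even with an analysis of this kind, the validity threats noted above (for example, maturation, history and testing effects) remain, so results would need cautious interpretation.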
It is important to caution that there will always be an element of
ambiguity when attempting to measure, in a quantitative sense, the
impact of ELVA. This is unavoidable as the initiative deals with complex,
multi-dimensional outcomes such as teacher practice change and
student emotional literacy. Therefore, there is also a need to diversify
future measurement efforts through the use of multiple qualitative
methods, as this would compensate for the weaknesses of statistical
performance measures. As Patton (1997) reminds us: ‘[It is better to
have] soft or rough measures of important goals rather than highly
precise, quantitative measures of goals that no one much cares about’
(p. 161).
Suggested Actions
1. Develop internal data collection systems, quantitative and qualitative, that enable ongoing and regular monitoring of ELVA
2. As part of the above, specify criteria and standards that can be used to make judgements about the merit and worth of ELVA performance and impact (e.g. 70% of teachers report using at least one Dax unit of work 6 months post training)
3. Undertake a scoping study to determine the suitability of using or adapting existing measures of student emotional literacy, and/or the need for developing a purpose-built tool

Pathways to Sustainability: Some General Considerations
Although sustainability is a common concern among program
stakeholders, many innovations, even those that have demonstrated
effectiveness, terminate within a relatively short time after start-up
resources are expended. Conservative estimates suggest that as many
as 40% of pilot initiatives are not sustained (Elsworth & Astbury, 2005).
Moving forward, a critical question for ELVA is: ‘What happens after the
initial 5-year funding ends?’ There are some general lessons that have
been learnt about ways of enhancing sustainability that can be
incorporated into ELVA forward planning. Some of the key factors
include:
Build support by developing and maintaining a diverse and effective range of networks and partnerships
Initiate broad-based marketing and promotion to disseminate early achievements
Establish regular internal monitoring and evaluation processes
Stabilise the funding base and where possible diversify funding streams (i.e. not relying on a single source of funding)
Develop a closer ‘fit’ between the initiative and the organisational context in which it operates
Leverage from existing and complementary initiatives and identify organisations that could support ELVA activities into the future
ELVA is not expensive to run. The major costs for the Dax Centre are
staff time to co-ordinate the initiative and deliver professional
development activities. Recently, a new business model has been trialled
to ascertain the feasibility of a user-pays system. This will need to be
reviewed, but could offer one option to (partially) cover staff
expenditure. Additional revenue streams may need to be considered,
including one-off funding proposals, provision of modified/customised
ELVA training activities for professionals working in other sectors, and
corporate fund-raising events.
To date, almost all of the teacher professional learning, and indeed
conceptual development and co-ordination of the entire ELVA approach,
has been undertaken by one part-time lead facilitator, with support from
a clinical co-facilitator. There is a clear need to build training capacity
and ‘spread the knowledge’ to ensure sustainability of the initiative.
******
Suggested Actions
1. Develop a 3-year strategic plan and funding proposal for ELVA, with links to current and future Dax Centre directions
2. Allocate additional staff resources (0.4 EFT) to manage and further develop ELVA
3. Maintain the ELVA working group
4. Provide follow-up support to schools and teachers that participated in the pilot
5. Develop a dedicated ELVA website presence, including a space for schools to disseminate good practice examples, case studies, literature and other resources
References
Bickman, L. (1987). The functions of program theory. In L. Bickman
(Ed.), Using program theory in evaluation, New Directions for
Program Evaluation, 33, San Francisco, CA: Jossey-Bass (pp.5-
18).
Burke, L. A., & Hutchins, H. M. (2007). Training transfer: An integrative
literature review. Human Resource Development Review, 6(3),
263-296.
Campbell, D. T. (1969). Reforms as experiments. American
Psychologist, 24, 409-429.
Dax Centre (2013). About us. Retrieved
from http://www.daxcentre.org/about-us/
Desimone, L. M. (2009). Improving impact studies of teachers’
professional development: Toward better conceptualisations and
measures. Educational Researcher, 38(3), 181-199.
Durlak, J. A., Weissberg, R. P., Dymnicki, A. B., Taylor, R. D. &
Schellinger, K. B. (2011). The impact of enhancing students’
social and emotional learning: A meta-analysis of school-based
universal interventions. Child Development, 82(1), 405-432.
Elsworth, G. R., & Astbury, B. (2005). Sustainability in health promotion:
Case studies of two food insecurity demonstration projects.
Melbourne: Victorian Health Promotion Foundation.
Georgenson, D. L. (1982). The problem of transfer calls for partnership.
Training and Development Journal, 36, 75-78.
Goleman, D. (1995). Emotional intelligence. New York: Bantam.
Howard, S., Dryden, J., & Johnson, B. (1999). Childhood resilience:
Review and critique of literature. Oxford Review of Education,
25(3), 307-323.
Ingvarson, L., Meiers, M., & Beavis, A. (2005). Factors affecting the
impact of professional development programs on teachers’
knowledge, practice, student outcomes and efficacy. Education
Policy Analysis Archives, 13(10). Retrieved
from http://epaa.asu.edu/ojs/article/view/115
Kirkpatrick, D. L. (1959). Techniques for evaluating training programs.
Journal of the American Society of Training and Development,
13, 3-9.
Kirkpatrick, D. L. (1977). Evaluating training: Evidence vs. proof.
Training and Development Journal, 31, 9-14.
Kirkpatrick, D. L., & Kirkpatrick, J. D. (2006). Evaluating training
programs: The four levels. San Francisco: Berrett-Koehler.
Knowles, M. (1989). The making of an adult educator: An
autobiographical journey. San Francisco: Jossey-Bass.
Kolb, D. A. (1984). Experiential learning: Experience as a source of
learning and development. New Jersey: Prentice-Hall.
McKeough, A., Lupart, J. & Marini, A. (eds.) (1995). Teaching for
transfer: Fostering generalization in learning. New Jersey:
Lawrence Erlbaum Associates.
McLaughlin, C. (2008). Emotional well-being and its relationship to
schools and classrooms: A critical reflection. British Journal of
Guidance and Counselling, 36(4), 353-366.
Mezirow, J. (1991). Transformative dimensions of adult learning. San
Francisco: Jossey-Bass.
Patton, M.Q. (1997). Utilization-focused evaluation: The new century
text (3rd ed.). Thousand Oaks, CA: Sage.
Penuel, W. R., Fishman, B., Yamaguchi, R., & Gallagher, L. P. (2007).
What makes professional development effective? Strategies that
foster curriculum implementation. American Educational
Research Journal, 44(4), 921-958.
Prawat, R. S. (1989). Promoting access to knowledge, strategy, and
disposition in students: A research synthesis. Review of
Educational Research, 59(1), 1-41.
Saks, A. M. (2002). So what is a good transfer of training estimate? A
reply to Fitzpatrick. The Industrial-Organizational Psychologist,
39, 29-30.
Salovey, P., & Mayer, J. D. (1990). Emotional intelligence. Imagination,
Cognition, and Personality, 9, 185-211.
Schon, D. A. (1988). Educating the reflective practitioner: Toward a new
design for teaching and learning in the professions. San
Francisco: Jossey-Bass.
Seligman, M. E. P. (1991). Learned optimism: How to change your mind
and your life. New York: Knopf.
Seligman, M. E. P. (2011). Flourish: A visionary new understanding of
happiness and well-being. New York: Free Press.
Shonkoff, J. P. (2006). A promising opportunity for developmental and
behavioural paediatrics at the interface of neuroscience,
psychology and social policy: Remarks on receiving the 2005 C.
Anderson Aldrich Award. Pediatrics, 118(5), 2187-2191.
W. K. Kellogg Foundation. (2004). Logic model development guide.
Battle Creek, MI: Author.