Please cite this paper as:
Lucas, B., G. Claxton and E. Spencer (2013), “Progression
in Student Creativity in School: First Steps Towards New
Forms of Formative Assessments”, OECD Education
Working Papers, No. 86, OECD Publishing.
http://dx.doi.org/10.1787/5k4dp59msdwk-en
OECD Education Working Papers
No. 86
Progression in Student
Creativity in School
FIRST STEPS TOWARDS NEW FORMS OF
FORMATIVE ASSESSMENTS
Bill Lucas, Guy Claxton, Ellen Spencer
Unclassified EDU/WKP(2013)1
Organisation de Coopération et de Développement Économiques
Organisation for Economic Co-operation and Development
10-Jan-2013
English - Or. English
DIRECTORATE FOR EDUCATION
PROGRESSION IN STUDENT CREATIVITY IN SCHOOL: FIRST STEPS TOWARDS NEW FORMS
OF FORMATIVE ASSESSMENTS
OECD Education Working Paper No. 86
Bill Lucas, Guy Claxton and Ellen Spencer
Creativity is widely accepted as being an important outcome of schooling. Yet there are many different views
about what it is, how best it can be cultivated in young people and whether or how it should be assessed. And in
many national curricula creativity is only implicitly acknowledged and seldom precisely defined. This paper
offers a five dimensional definition of creativity which has been trialled by teachers in two field trials in schools
in England. The paper suggests a theoretical underpinning for defining and assessing creativity along with a
number of practical suggestions as to how creativity can be developed and tracked in schools. Two clear benefits
of assessing progress in the development of creativity are identified: 1) teachers are able to be more precise and
confident in developing young people’s creativity, and 2) learners are better able to understand what it is to be
creative (and to use this understanding to record evidence of their progress). The result would seem to be a
greater likelihood that learners can display the full range of their creative dispositions in a wide variety of
contexts.
This paper was drafted for the CERI Innovation Strategy for Education and Training. It is the output of a
collaborative research project commissioned by Creativity, Culture and Education (CCE) in partnership with the
OECD Centre for Educational Research and Innovation (CERI).
Contacts:
Francesco Avvisati, Analyst: francesco.avvisati@oecd.org
Stéphan Vincent-Lancrin, Senior Analyst and Project Leader: stephan.vincent-lancrin@oecd.org
OECD DIRECTORATE FOR EDUCATION
OECD EDUCATION WORKING PAPERS SERIES
This series is designed to make available to a wider readership selected studies drawing on the work
of the OECD Directorate for Education. Authorship is usually collective, but principal writers are named.
The papers are generally available only in their original language (English or French) with a short
summary available in the other.
This document and any map included herein are without prejudice to the status of or sovereignty over
any territory, to the delimitation of international frontiers and boundaries and to the name of any territory,
city or area.
The opinions expressed in these papers are the sole responsibility of the author(s) and do not
necessarily reflect those of the OECD or of the governments of its member countries.
You can copy, download or print OECD content for your own use, and you can include excerpts from
OECD publications, databases and multimedia products in your own documents, presentations, blogs,
websites and teaching materials, provided that suitable acknowledgement of OECD as source and
copyright owner is given. All requests for public or commercial use and translation rights should be
submitted to rights@oecd.org.
Comment on the series is welcome, and should be sent to edu.contact@oecd.org.
www.oecd.org/edu/workingpapers
Copyright OECD 2012.
The statistical data for Israel are supplied by and under the responsibility of the relevant Israeli authorities. The use of such data by
the OECD is without prejudice to the status of the Golan Heights, East Jerusalem and Israeli settlements in the West Bank under
the terms of international law.
TABLE OF CONTENTS
ABSTRACT
RÉSUMÉ
ACKNOWLEDGMENTS
1. Why assessing creativity in schools matters
   A creative challenge
   Some pros and cons of assessing creativity
   The principles guiding our development of a framework and associated tool
   Creativity in schools
   Assessing creativity in schools
2. Thinking about creativity and its assessment
   Differing views of creativity
   Describing creativity in individuals
   Freeranging versus Disciplined
3. Our prototype tool for assessing pupils’ creativity in schools
   The Five Creative Dispositions Model
   Trialling and refining the tool
   Findings in more detail
   Reflections on fieldwork in schools
   Refining the second field trial
4. Conclusion and next steps
REFERENCES
APPENDIX 1: FIELD TRIAL 2 – ASSESSMENT TOOL
APPENDIX 2: PUPIL QUESTIONNAIRE, FIELD TRIAL 2
APPENDIX 3: EXAMPLE TEACHER QUESTIONNAIRE, FIELD TRIAL 2

Figures

Figure 1. Creativity: Person and location
Figure 2. Creativity: Learnable or Innate
Figure 3. Field Trial 1 Tool

Boxes

Box 1. Early attempts at assessing creativity in schools
Box 2. Field Trial Methodology
ABSTRACT
Creativity is widely accepted as being an important outcome of schooling. Yet there are many
different views about what it is, how best it can be cultivated in young people and whether or how it should
be assessed. And in many national curricula creativity is only implicitly acknowledged and seldom
precisely defined. This paper offers a five dimensional definition of creativity which has been trialled by
teachers in two field trials in schools in England. The paper suggests a theoretical underpinning for
defining and assessing creativity along with a number of practical suggestions as to how creativity can be
developed and tracked in schools. Two clear benefits of assessing progress in the development of creativity
are identified: 1) teachers are able to be more precise and confident in developing young people’s
creativity, and 2) learners are better able to understand what it is to be creative (and to use this
understanding to record evidence of their progress). The result would seem to be a greater likelihood that
learners can display the full range of their creative dispositions in a wide variety of contexts.
RÉSUMÉ
La créativité est largement acceptée comme étant un résultat scolaire important. Pourtant il y a
beaucoup d’opinions différentes sur ce qu’elle est, comment on peut la cultiver chez les jeunes gens, et si
et comment on devrait l’évaluer. De plus, dans beaucoup de programmes scolaires, la créativité n’est
reconnue que de manière implicite et rarement définie de manière précise. Ce document offre une
définition de la créativité reposant sur cinq dimensions, qui a été testée par des enseignants durant deux
expériences de terrain dans des écoles en Angleterre. Le document propose un soubassement théorique
pour définir et évaluer la créativité ainsi que nombre de suggestions pratiques sur le développement et le
suivi de la créativité à l’école. Deux bénéfices clairs d’évaluer le progrès dans le développement de la
créativité sont identifiés : 1) les enseignants peuvent être plus précis et confiants lorsqu’ils développent la
créativité des jeunes gens, et 2) les apprenants sont davantage en mesure de comprendre ce que « être
créatif » signifie (et à utiliser cette compréhension pour documenter et relater leur progrès). Le résultat
semble être une plus grande probabilité que les apprenants témoignent de toute l’étendue de leurs
dispositions à la créativité dans un large éventail de contextes.
ACKNOWLEDGMENTS
We are very grateful to Francesco Avvisati and Stéphan Vincent-Lancrin for their extremely helpful
detailed reading of earlier drafts of this paper and for their editorial guidance.
Our thanks also to the project’s steering group, whose members were: Dr. Francesco Avvisati, Paul
Collard, Prof. Anna Craft, Dr. David Parker, Naranee Ruthra-Rajan, Prof. Julian Sefton-Green, Jo
Trowsdale, and Dr. Stéphan Vincent-Lancrin.
PROGRESSION IN STUDENT CREATIVITY IN SCHOOL:
FIRST STEPS TOWARDS NEW FORMS OF FORMATIVE ASSESSMENTS
by
Bill Lucas, Guy Claxton and Ellen Spencer*
In Spring 2011, Creativity, Culture and Education (CCE), in partnership with the OECD Centre for
Educational Research and Innovation (CERI), commissioned the Centre for Real-World Learning (CRL) at
the University of Winchester to undertake research to establish the viability of creating an assessment
framework for tracking the development of young people’s creativity in schools.
After reviewing the literature on creativity and its assessment, CRL consulted expert practitioners using both structured interviews and an appreciative inquiry approach (Cooperrider and Whitney,
2005). In the light of this preliminary investigative work we created a framework for teachers to assess the
development of young people’s creativity, and associated processes for trialling this framework in schools.
We then ran two field trials in 12 schools, the first as a proof of concept and the second one exploring
issues raised in the first trial.
Three overarching questions guided us:
a) Is it possible to create an assessment instrument that is sufficiently comprehensive and sophisticated for teachers to find it useful (the proof of concept)?
b) Would any framework be useable across the entire age span of formal education?
c) If a framework is to be useful to teachers and pupils, what approach to assessment should it
adopt?
The paper describes the approach adopted by the CRL research team and the conclusions we reached.
It includes a highly selective summary of a more extensive literature review (Spencer et al., 2012a) and a
description of the assessment tool we developed along with an analysis of its effectiveness.
1. Why assessing creativity in schools matters
‘From its modest beginnings in the universities of the eighteenth century and the school systems
of the nineteenth century, educational assessment has developed rapidly to become the
unquestioned arbitrator of value, whether of pupils’ achievements, institutional quality or
national educational competitiveness.’
Patricia Broadfoot (2000:xi)
* Centre for Real-World Learning at the University of Winchester. Contact: Bill.Lucas@winchester.ac.uk.
A creative challenge
Most people agree that schools need to develop creativity in students just as much as they need to
produce literate and numerate learners. Yet across the educational world there is no widely used definition
of what creativity is, no agreed framework for assessing its development in schools and few assessment
tools specifically designed to track learners’ progress.
If creativity is to be taken more seriously by educators and educational policy-makers then we need to
be clearer about what it is. We also need to develop an approach to assessing it which is both rigorous
enough to ensure credibility and user-friendly enough to be used by busy teachers. In this way we can add
the kind of value referred to in the epigraph above.
In approaching this challenge, our working definition of creativity includes the following elements.
Creativity, we believe, is:
− Complex and multi-faceted, occurring in all domains of life (Treffinger et al., 2002);
− Learnable (Csikszentmihalyi, 1996);
− Core to what it is to be successful today (Sternberg, 1996);
− Capable of being analysed at an individual level in terms of dispositions1 (Guilford, 1950); and
− Strongly influenced by context and by social factors (Lave and Wenger, 1991).
Some pros and cons of assessing creativity
Both assessment and creativity are enormous subjects, each with extensive bodies of literature.
Education stakeholders also have strong opinions on both assessment and creativity. An anecdote from
early in our project illustrates this. At an appreciative inquiry session with teachers, creative agents2 and
experts, those present strongly agreed with the proposition that it is possible (although not straightforward)
to assess progress in the development of creativity in young people and that there is a range of ways in which this could be done. Presented with a circular, bull’s-eye-like matrix showing a number of levels of creative skill in a number of different areas, the group was entirely comfortable.
But when exactly the same conceptualisation was presented in the form of a table, with progression
levels explicitly numbered (as opposed to being implicitly graded in the bull’s-eye figure, with ‘higher’
being shown by a larger wedge of shading), teachers and creative agents expressed anger, hostility and
bewilderment.
The only difference was the presentational format. The circle merely hinted at levels of ‘progression’, while the table looked all too much like the attainment levels teachers associate with core subjects such as literacy or numeracy.
Thus we learned early on that the problem we faced was one not only of identifying a number of
facets of creativity, each of which could be described in terms of a developmental trajectory; we had also
to take into account the practicability, plausibility and acceptability of any such conceptualisation to
teachers.
Despite the complexity of the task, the potential advantages of attempting to measure and/or track the
development of creativity in schools are easy to see. They include:
− Indicating that creative-mindedness is taken seriously as an important aspect of the formal
curriculum in schools;
− Inspiring the development of curricula and teaching activities that foster creativity;
− Providing a way of articulating an applied vision of creativity (Hingel, 2009) that allows teachers
and others to understand more about different dimensions of pupils’ progression and to support
their mental development more effectively (Craft et al., 2007);
− Helping teachers to be more precise in their understanding of creativity;
− Providing formative feedback to pupils to enable them to develop their creativity more
effectively (Black and Wiliam 2000);
− Providing feedback to teachers and focusing their attention on this dimension;
− Starting a discussion on the nature of creativity and building a consensus; and
− Understanding more about individual progressions and trajectories in creativity learning.
The problem is that there is no consensus on what creativity is. Possible disadvantages or challenges
associated with the formative assessment of creativity in schools include, therefore:
− Encouraging overly simplistic interpretations of what creativity is (as indicated by the anecdote
earlier in this section);
− Potentially being confused pejoratively with a comment about a pupil’s character, for example,
being unimaginative3;
− If we assume that making summative comparisons of individuals’ creativity is not an appropriate
goal, there is also the risk that assessment ‘scores’ could be used inappropriately for summative
comparisons of performance both between schools and within schools;
− Concerns about assessments being made without due regard to context (Koestler, 1964); and
− The practical difficulties inherent in measuring something which manifests itself in a range of
school subjects.
The principles guiding our development of a framework and associated tool
We developed a set of guiding principles to help us balance the inevitable tensions between rigour and
useability. These criteria (which we list on the next page) seek to combine scholarship with pragmatic
common-sense.
We decided that our framework should:
− Deliberately identify those dispositions which the literature suggests are at the core of creativity (Claxton, 2006; Feist, 2010; Kaufman and Sternberg, 2010);
− Be explicitly premised on the ‘grow-ability’ of creative mindedness (Lucas and Claxton, 2010; Perkins, 1995; Sternberg, 1996);
− Be as comprehensive in terms of existing research as possible; and
− Be internally coherent, with distinct elements.
In addition we were determined (and strongly supported in this by our steering group) that we should
highlight both the social/contextual component of creativity and learning (Lave and Wenger, 1991) as well
as the technical and craft aspects (Berger, 2003; Ericsson et al., 1993). This meant including notions of
‘disciplined’ and ‘collaborative’ in our definition of the creative individual. An individual with
‘disciplined’ dispositions would be likely to devote time and effort to crafting and improving, and to the
development of their technique, while one with ‘collaborative’ dispositions would be likely to work with
others as appropriate, and share the results of their creativity – an important output of creativity.
In describing these two ‘choices’, we are explicitly aligning ourselves with a broadly social-constructivist tradition within education, as well as drawing on the literature exploring the acquisition of expert performance and how individuals progress from novice to expert practitioners.
In most countries creativity is not a statutory element of the school curriculum (even if it is highly
valued by many teachers and employers). Consequently any assessment activity undertaken by teachers in
relation to their students’ creative development needs to be seen by them as intrinsically valuable. This was
clearly the case in England where, for practical reasons, we undertook the first phase of this project. In
terms of principles, it was therefore essential that any assessment tools should be:
− Seen as useful by teachers;
− At the right ‘grain’ of analysis: neither too abstract to be directly observable, nor too detailed to
become unwieldy;
− Clear and accessible in its use of terminology; and
− Applicable to a broad range of types of creativity.
Creativity in schools
In England the status of creativity in schools has waxed and waned. In the first decade of this century,
in the years following the report by the influential National Advisory Committee on Creative and Cultural
Education (National Advisory Committee on Creative and Cultural Education, 1999)4, creativity seemed to
be in the ascendancy. Indeed, for a recent period, it seemed as if creativity was set to become embedded in
the curriculum.
As Jeffrey Smith and Lisa Smith put it: "creativity and education sit and look at one another from a
distance, much like the boys and girls at the seventh-grade dance, each one knowing that a foray across the
gym floor might bring great rewards but is fraught with peril." (Smith and Smith, 2010:251).
As in most OECD countries, education policy in the United Kingdom officially gives some place to
creativity. However, while Personal, Learning and Thinking Skills (PLTS) in England (and their equivalents
in Scotland, Wales and Northern Ireland) still exist as a framework, they are rarely referred to by policy
makers and education stakeholders. The PLTS framework comprises six groups of cross-curricular skills,
of which ‘creative thinking’ is one.
There are economic and social reasons why creativity might have a place within the school
curriculum. Creativity is held as one of the most important competencies by 21st century employers
(Florida, 2002), and when creativity is acknowledged by and promoted through policy it is often in
response to employability and competitiveness concerns. Education policy widely positions itself as
putting creativity at the centre in order that pupils are able to solve problems and challenges beyond the
classroom. For example, The Qualification and Curriculum Authority’s understanding of creativity is that
it ‘improves pupils’ self-esteem, motivation and achievement’; it ‘prepares pupils for life’; and it ‘enriches
pupils’ lives’ (Banaji et al., 2010:23).
From the literature it is clear that creativity can also be seen as a ‘social good’ (Banaji et al., 2010)
and that it is important, therefore, for ‘the social and personal development of young people in
communities and other social settings’. There is often an ‘economic imperative’ involved as well. The
National Advisory Committee on Creative and Cultural Education (NACCCE) explicitly argued that
creativity in education enables a country ‘to compete in a global market, having a flexible workforce,
facing national economic challenges, feeding the ‘creative industries’ and enabling youth to adapt to
technological change’ (Banaji et al., 2010:35).
A central challenge for the cultivation of creativity in schools is their subject-dominated nature. Thus,
while creativity spans all subject areas and is not limited to the ‘arts’, there are inherent conflicts in
attempting to ensure assessment of cross-curricular concepts. The degree to which creativity is context-free
or domain-specific is also ambiguous. For example, in the United Kingdom, the National Curriculum
generally treats creativity as cross-curricular, and yet, in the early years curriculum, creativity is located in
a set of specific domains including art, design, music, and play. As Anna Craft (2008b) comments, this
makes the decision about what exactly to assess (and indeed what not to assess) problematic. In
developing our assessment framework we tried two different approaches, one in each of the field trials, to
explore this further.
A further issue for schools in England is the overriding agenda of school accountability (grades, assessment systems and their league tables, new pay regimes, and a sense of reduced professional freedom in making curriculum choices locally), which competes with serious attempts at fostering creativity (Menter, 2010). It may be that a formative assessment valuing creative dispositions is at odds with the performance
agenda of national testing, and is therefore subordinated (Looney, 2009). Craft’s report for Futurelab notes:
‘the powerful drive to raise standards and to make performance judgments about individuals and about
schools, can be seen as being in tension with an almost equally powerful commitment to nurturing
ingenuity, flexibility, capability’ (Craft, 2008b:3).
Yet a closer examination of research, for example into meta-cognitive processes (including mental
processes such as ‘challenging assumptions’ – itself a disposition of the creative individual), reveals clear
evidence to suggest that the embedding of creative (and other learning) dispositions into lessons actually
raises achievement, with attempts to enhance creativity and develop more powerful learners leading to
increases in measured test results (Watkins, 2010). The two agendas need not be mutually exclusive. It is
certainly feasible both to cultivate creative dispositions and to raise achievement levels in subjects. Indeed,
research commissioned by CCE into the impact of Creative Partnerships on attainment found small but
significant attainment gains, especially for young people at Key Stages 3 and 4 (Cooper et al., 2011). With
the creation of a tool to measure progression in creativity, this relationship would be clearer to see.
Unsurprisingly, many teachers focus more closely on high-stakes state-mandated testing than on
tracking the development of dispositions such as creativity (Wiliam et al., 2004). The lack of any
requirement to assess creativity in a national, summative way (or even formatively in class) also
contributes to the undervaluing of creativity. But the lack of school-friendly tools to assess creativity is
arguably another reason for paying less attention to creativity than to content or procedure knowledge.
Assessing creativity in schools
Despite the difficulties, attempts to assess creativity have a rich history (Hocevar, 1981; Plucker and
Makel, 2010). Yet our review found no examples of widely used and credible methods of assessing
creativity in schools, although it uncovered some noble attempts and experiments, many stimulated by
CCE’s work.
The purpose of any assessment activity critically influences the selection of methods. Boud and
Falchikov (2006:401) tell us that there are two fundamentally distinct purposes of assessment: one is to
provide certification of achievement, the other is to facilitate learning. Assessment can thus be formative,
helping pupils and teachers improve, or summative, enabling comparison. Indeed, one can sometimes make formative use of summative assessments, although it is more difficult the other way around (Looney, 2011). Formative assessment rests on a view of reality as socially constructed rather than objective. Variables assessed formatively are complex, interwoven, and difficult to measure. A summative
use of formative data would fall down on its requirement for ‘validity’ and ‘reliability’, while formative
data uses different criteria: ‘trustworthiness’ and ‘credibility’, for example. Approaches to formative
assessment in English schools have been shaped significantly by the Assessment for Learning (AfL)
movement in recent years5. AfL uses a range of feedback methods to help learners achieve mandated levels
of examined performance more effectively (see also OECD, 2005, for an overview).
Box 1. Early attempts at assessing creativity in schools
Plucker and Makel (2010) suggest tests for creativity fall into a number of categories:
• Psychometric tests for divergent thinking;
• Behaviour or personality tests of past behaviour or personality characteristics;
• Personality tests of personality correlates of creative behaviour;
• Activity checklists of experience associated with creative production;
• Scales assessing attitudes towards important aspects of creativity or divergent thinking;
• Advanced techniques for the assessment of creative products;
• Expert judges to assess level of creativity in a product or response (Consensual Assessment Technique);
• Six components to assess creative design of product (Consumer Product Design Models): newness, ability
to resolve problems, level of pleasure induced, ability to match needs of customer, importance to needs of
customer, level of desirability or criticalness.
2. Thinking about creativity and its assessment
‘Despite the abundance of definitions of creativity and related terms, few are widely used and
many researchers simply avoid defining relevant terms at all.’
Jonathan Plucker and Matthew Makel (2010:48)
This section introduces the theoretical foundations for our assessment framework, building on ideas
introduced above and drawing selectively on a much larger review of the literature (Spencer et al., 2012a).
The psychological and social components of creative performance are hard to disentangle. Because
our study attempted to develop a framework for assessment of individuals in schools, however, the
literature review focused on the characteristics of creative dispositions that might be assessable, rather than
on exploring the nature of creative outputs and performances, or of environments that might support
creativity more effectively.
This section begins by summarizing some tensions between different views of creativity, then brings
together key conceptualisations about the dispositions that make up a creative individual, and considers the
challenges presented for anyone seeking to create an assessment framework for creativity.
Inevitably in developing any assessment framework, choices have to be made with regard to earlier
thinking about the subject. Informed by our literature review, the decisions we took with regard to
assessing creativity can be summarized thus:
a) We describe creativity in terms of individual creative dispositions selecting a cohesive set of
dispositions drawn from the literature. We chose consciously to focus directly on what is going
on for the learner during acts of creativity, not on the environment in which this takes place nor
on any creative products produced per se (although these may well be used by learners to indicate
their own sense of progress);
b) While recognizing and valuing the social and collaborative nature of creativity, we focused on
assessing creativity within individuals and we deliberately included one disposition which
specifically acknowledges the collaborative nature of creativity;
c) We explicitly adopted a view of creativity (and of intelligence) that sees it as largely learnable
rather than essentially innate;
d) We acknowledged the importance of context by valuing both creativity within subjects (in music
and in mathematics, for example) as well as creativity in its more generalisable forms (such as
being able to have good ideas in a range of domains); and
e) We included an emphasis on the discipline of being creative as well as on the well-documented
value of free-thinking.
Differing views of creativity
Craft’s (2008a) model (Figure 1) helpfully maps a range of views of creativity. These range from
creativity as an individualised endeavour to creativity as a collective phenomenon. It also serves to point up
the tension between creativity as domain-specific versus it being domain-generic.
Figure 1. Creativity: Person and location
Describing creativity in individuals
Guilford was one of the first researchers to examine creativity from the perspective of creative
dispositions, commonly referred to as psychological trait theory. Trait theory focuses on habitual patterns
of mind and their associated behaviours to describe and account for different personalities. Guilford’s
definition of traits linked them with the broad categories of aptitudes, interests, attitudes and
temperamental qualities. From his perspective, the ‘creative personality is then a matter of those traits that
are characteristics of creative persons’ (Guilford, 1950).
There is increasing consensus about which dispositions might serve as indicators of the strength of
creative-mindedness in individuals. In a comprehensive meta-analytical review of the creativity literature,
Treffinger et al. (2002) compared 120 definitions of creativity in papers exploring the ‘traits’,
‘characteristics’, and other personal ‘attributes’ distinguishing highly creative individuals from their peers.
From these 120 definitions they compiled a list of creative dispositions (cognitive, personality, and
biographical), cited in at least three sources, clustering them into four categories:
− Generating ideas;
− Digging deeper into ideas;
− Openness and courage to explore ideas; and
− Listening to one’s ‘inner voice’.
There have been several attempts to map the dispositions that underlie creative performance (e.g.
Kaufman and Sternberg, 2010; Root-Bernstein and Root-Bernstein, 1999). Some lists of creativity-related
dispositions were simply too long for teachers to find manageable. Root-Bernstein and Root-
Bernstein (1999), for example, list 13 such dispositions, all of which have a degree of both empirical and
face validity. They are careful observation; use of sensory imagination; the ability to abstract essentials;
recognizing patterns in information; forming new patterns; generating useful analogies; use of intuition and
embodied cognition; empathy and shifting perspectives; mapping between different dimensional
representations; creating and adapting models; playfulness with material and ideas; transforming ideas into
different media; and synthesizing elements of thought into a coherent whole.
Individual versus social components of creativity
Treffinger et al.’s (2002) list of dispositions, while a helpful starting point, is incomplete as a
framework for assessment in so far as manifestations of creativity are, to a degree, almost always the result
of complex collaboration across social groups. The challenge of using such a categorisation to create an
assessment framework is that such dispositions are not simply located within the individual, they are also a
function of what the broader context affords. As the authors note, many definitions of creativity challenge
the notion that dispositions alone are sufficient.
Fillis and McAuley (2000:9), for instance, cite the work of Amabile as they assert that ‘examining
creativity from a trait perspective can have limited impact, given that social surroundings have also been
shown to impact upon creative behaviour’.
An early authoritative text on creativity was Arthur Koestler’s (1964) The Act of Creation, which
takes a broad conception of creativity and emphasises its social dependencies. Koestler’s general theory of
human creativity in art, humour, and scientific discovery pinpointed the role of external influences on an
individual’s creative thought process. Citing the scientific ‘discoveries’ of Kepler, Kelvin, Newton,
Pasteur, and Fleming, Koestler demonstrated the way all ideas develop through cross-fertilisation and
recombination of existing components. Human beings do not, he argued, ever ‘create’ wholly original
thinking.
Regarding the social element many current approaches to creativity stress the social and collaborative
nature of the creative process. John-Steiner, for example, tells us that:
'The notion of the solitary thinker still appeals to those moulded by the Western belief in
individualism. However, a careful scrutiny of how knowledge is constructed and artistic forms
are shaped reveals a different reality. Generative ideas emerge from joint thinking, from
significant conversations, and from sustained, shared struggles to achieve new insights by
partners in thought.' (John-Steiner, 2006:3).
The challenge for anyone creating an assessment tool exploring individual creativity is to allow
sufficient scope for the social element of creativity to be accounted for. This could be achieved by
including a ‘collaborative’ dimension as an important, assessable, element of the creative individual.
Subject-specific versus general creativity
Csikszentmihalyi wrote that the key difference between ‘big C’ creative people and their less
creative peers is the ‘complexity’ of their tendencies of thought and action. Those veering toward creativity
‘tend to bring together the entire range of human possibilities within themselves’ (1996:57). This is not to
say that only a privileged few have capacity for creativity, but that the creative side is nurtured and
cultivated in the process of developing maturity and that it is likely to draw on experiences in different
contexts.
Looking at the subject-specific/domain-free continuum, Craft (2008a:7) comments that:
'Whilst some views of creativity argue that at its heart, creativity in one domain is the same as in
another, in that it ultimately involves asking ‘what if?’ in appropriate ways for the domain…,
others would argue… that creativity cannot be understood without reference to the domain of
application.'
A tool for assessing creativity therefore needs to allow for assessment across a range of contexts, which might be particular subject domains, both inside and outside the school environment.
Learnable versus innate

Assessment of creativity only has value if we take the view that children can learn to become more creative. We take the well-supported view that creativity is comparable to intelligence in a number of ways, including in its ubiquity and in its ‘learnability’. This latter tension is presented graphically in Figure 2 below.

Figure 2. Creativity: Learnable or Innate

It is clear, for example, that every individual is creative to some degree (Csikszentmihalyi, 1996). Creativity also has levels, so that we can ask ‘how creative’ an individual is (Treffinger et al., 2002). Heindel and Furlong (2000) suggest that while Torrance believed that creativity could be taught like any other skill, Csikszentmihalyi believed that, while children could not be taught creativity, the right combination of personal characteristics and an encouraging environment could produce it. Perkins has made a powerful case for the learnability of intelligence, including many aspects of creativity identified in the creativity literature (Perkins, 1995).

To be of formative use, a framework for assessing creativity should thus include assessable elements of behaviour that represent learnable dispositions over which individuals have a degree of control.

Freeranging versus Disciplined

One important aspect of generalized creativity is ‘divergent thinking’ – the ability to generate many ideas from a range of perspectives without being limited by preconceived thinking. Divergent thinking is important, but it is not a proxy for creativity. Rather tests of it represent ‘estimates of the potential for creative thinking and problem solving’. Being imaginative can be seen as the divergent aspect, while being ‘disciplined’ is the convergent and important parallel one (Runco, 2010:424).

Our tool for assessing creativity should thus include dispositions that represent both the divergent and convergent aspects of the creative individual. This would include, on the one hand, notions of playfulness and imagination and, on the other, the disposition of reflecting upon a range of choices critically in order to narrow down options.

Assessing Creativity – a specific challenge?

In section 1.5 we noted the difficulties others have found in assessing creativity. Here we briefly consider the specific challenges which assessing creativity in schools may bring as well as the wide issues of the purposes of any assessment.

At a very practical level assessing something like creativity, if reductionist, could give rise to ridicule, as we have observed elsewhere in a review of wider skills: ‘The idea that young people could come out of school labelled as a “level 7 imaginer” or “grade C collaborator” is horrific – yet clearly some kind of evaluation of success is necessary’ (Lucas and Claxton, 2009:25).
Our quotation illustrates clearly the tension between, on the one hand, providing post hoc comparative
data to decision-makers particularly at policy level and, on the other, giving children and young people the
information they need in order to develop their thinking.
As we began to explore above, the paradigms within which formative and summative assessment sit are clearly different (Kaufman et al., 2008).
A summative framework would necessarily have to establish, as a minimum, its validity and
reliability. To ensure its reliable implementation it would require the development and trialling of criteria,
as well as a system of moderator training and moderation to ensure its consistent application. A formative
framework, on the other hand, would require a different approach.
While any assessment can be used summatively (without making a claim for its validity), not all can
make the additional claim of serving formative functions. Indeed, Taras (2005:466) argues that ‘formative
assessment is in fact summative assessment plus feedback which is used by the learner’ and, in addition, it
may be used by the teacher. A framework of progression can be both summative and formative, although
the ability of an assessment to serve both formative and summative functions is a fine balancing act, with
many criticising the notion that this is even possible (Wiliam and Black, 1996). Teachers can make use of
both formative and summative assessment data in planning lessons. ‘In-the-moment’ formative assessment
might, however, provide more relevant information to help teachers manipulate lessons by focusing on
areas of learning or subject knowledge as required.
The evidence for the benefits of using formative assessment is strong. Black and Wiliam’s (1998:142)
seminal paper Inside the Black Box: Raising standards through classroom assessment presented firm
evidence that formative assessment can raise standards of achievement. In doing this they drew on more
than 250 high-quality published journal articles.
Leahy and Wiliam’s address to the American Educational Research Association conference in 2009
similarly suggested that there is a strong case for the use of formative assessment to improve learner
outcomes. They observed that over the past 25 years, ‘at least 15 substantial reviews of research,
synthesizing several thousand research studies, have documented the impact of classroom assessment
practices on students’ (Leahy and Wiliam, 2009:2). They quantified the ‘substantial increases in student
achievement – in the order of a 70 to 80 percent increase in the speed of learning’ (2009:15).
Wiliam (2006) argues that all activities under the ‘assessment for learning’ banner can be expressed as
one of five key strategies and that anything not fitting into this set of strategies is, in fact, not assessment
for learning:
− Clarifying and understanding learning intentions and criteria for success;
− Engineering effective classroom discussions, questions and tasks that elicit evidence of learning;
− Providing feedback that moves learners forward;
− Activating students as instructional resources for each other; and
− Activating students as owners of their own learning.
Our review found a variety of assessment instruments assessing the development of traits linked to creativity (Beattie, 2000; Hocevar, 1981). In each case this necessitates an instrument that
captures instances of those dispositions in action. The literature has explored a variety of possible ways
forward including:
− Use of descriptive rubrics supported by examples (Lindström, 2006);
− Assessment by peers;
− Assessment using portfolios;
− Assessment using mixed methods (Treffinger et al., 2002); and
− Self-assessment.
Ultimately then, it would appear that if an assessment framework is to be of formative use to teachers
and learners, its utility is likely to be in developing shared understanding between teacher and learner, and
in shedding light on the necessary steps for progression for each of them rather than in providing
individuals with a crude labelling of their creativity. It might be further developed to serve a secondary
function as a summative tool. A range of approaches could be taken to gathering data, with the above
examples providing a starting point.
3. Our prototype tool for assessing pupils’ creativity in schools
‘Genius is one percent inspiration and ninety-nine per cent perspiration’
Thomas Edison
Our prototype model of the creative individual resulted directly from what we learned from our
interaction with practitioners and from our research literature (Spencer et al., 2012a). It was further
informed by the criteria we evolved with our steering group to help us gain maximum value from our two
field trials.
The Five Creative Dispositions Model
The five dispositions on which we decided to focus were arrived at after careful weighing up of the
pros and cons of existing lists of creative dispositions in the light of our criteria. Our model explored the
following five core dispositions of the creative mind:
1. Inquisitive. Clearly creative individuals are good at uncovering and pursuing interesting and
worthwhile questions in their creative domain.
− Wondering and questioning – beyond simply being curious about things, the questioning individual poses concrete questions. This enables them, and others, to think things through and develop new ideas.
− Exploring and investigating – questioning alone does not lead to creativity. The creative individual acts on their curiosity through exploration, and the investigating individual follows up on their questions by actively going out, seeking, and finding out more.
− Challenging assumptions – a degree of appropriate scepticism is an important trait of the
creative individual. This means not taking things at face value without critical
examination.
2. Persistent. In line with Thomas Edison’s remark quoted above, persistence is a repeatedly emphasised aspect of the creative individual.
− Sticking with difficulty – persistence in the form of tenacity is an important habit of mind
enabling an individual to get beyond familiar ideas and come up with new ones.
− Daring to be different – creativity demands a certain level of self-confidence as a pre-
requisite for sensible risk-taking as well as toleration of uncertainty.
− Tolerating uncertainty – being able to tolerate uncertainty is important if an individual is
going to move ‘off of the starting blocks’ on a project or task where actions or even goals
are not fully set out.
3. Imaginative. At the heart of a wide range of analyses of the creative personality is the ability to
come up with imaginative solutions and possibilities.
− Playing with possibilities – developing an idea involves manipulating it, trying it out,
improving it.
− Making connections – this process of synthesising brings together a new amalgam of
disparate things.
− Using intuition – the use of intuition allows individuals to make new connections and arrive
at thoughts and ideas that would not necessarily materialise given analytical thinking
alone.
4. Collaborative. Many current approaches to creativity, such as that of John-Steiner (2006), stress
the social and collaborative nature of the creative process.
− Sharing the product – this is about the creative output itself impacting beyond its creator.
− Giving and receiving feedback – this is the propensity to want to contribute to the ideas of
others, and to hear how one’s own ideas might be improved.
− Cooperating appropriately – the creative individual co-operates appropriately with others.
This means working collaboratively as needed, not necessarily all the time.
5. Disciplined. As a counterbalance to the ‘dreamy’, imaginative side of creativity, there is a need
for knowledge and craft in shaping the creative product and in developing expertise.
− Developing techniques – skills may be established or novel but the creative individual will
practise in order to improve. This is about devoting time to a creative endeavour.
− Reflecting critically – once ideas have been generated, evaluation is important. We could
call this ‘converging’. It requires decision-making skills.
− Crafting and improving – this relates to a sense of taking pride in one’s work. The
individual pays attention to detail, corrects errors, and makes sure the finished article
works perfectly, as it should.
We chose to describe the five dispositions with relatively abstract adjectives, while using the gerund
to indicate the sub-dispositions in an attempt to reinforce the action required to ‘live’ each disposition
fully.
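For readers who find it helpful to see the model’s structure at a glance, the five dispositions and their fifteen sub-dispositions can be written down as a simple nested mapping. The sketch below (in Python) is purely illustrative and is ours rather than part of the assessment tool itself; the variable name is invented for the example.

```python
# Illustrative sketch only: the Five Creative Dispositions Model expressed
# as a nested mapping from each disposition to its three sub-dispositions.
# The variable name is invented for this example.
FIVE_CREATIVE_DISPOSITIONS = {
    "Inquisitive": [
        "Wondering and questioning",
        "Exploring and investigating",
        "Challenging assumptions",
    ],
    "Persistent": [
        "Sticking with difficulty",
        "Daring to be different",
        "Tolerating uncertainty",
    ],
    "Imaginative": [
        "Playing with possibilities",
        "Making connections",
        "Using intuition",
    ],
    "Collaborative": [
        "Sharing the product",
        "Giving and receiving feedback",
        "Cooperating appropriately",
    ],
    "Disciplined": [
        "Developing techniques",
        "Reflecting critically",
        "Crafting and improving",
    ],
}
```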
In terms of the different approaches to creativity summarized in Figure 1 in section 2.1, we sought to
be inclusive, accommodating as many of them as possible within the context of the schools with which we
were working. Our prototype, we believe, holds relevance within each area of the school curriculum, while
recognizing that the way a particular disposition is expressed may be different depending upon context.
At the outset we assumed that it was at least worth exploring the use of the prototype tool across the
age range 4-16. This was not, in fact, possible at Key Stage 4 given the strong performance culture
prevalent at this stage6. For time reasons, we did not explore the different ways in which learners of
different ages demonstrate creative dispositions. Nevertheless, assuming a common definition of creative-
mindedness, we explored variations in how the tool was used and understood across age ranges.
Our first field trial was planned as a proof of concept, aiming to show us how easily teachers could
understand and use the tool at a moment in time to assess pupils. The second trial focused on self-
assessment by individual learners. Throughout the project we have favoured a formative approach to
assessment tool design, while remaining agnostic about potential summative uses.
Figure 3. Field Trial 1 Tool
The tool tested initially is shown graphically in Figure 3. It was designed so that development of each
of the 15 sub-dispositions could be tracked along three dimensions:
− Strength – this was seen in the level of independence demonstrated by pupils in terms of their
need for teacher prompts or scaffolding, or their need for favourable conditions;
− Breadth – this was seen in the tendency of pupils to exercise creative dispositions in new
contexts, or in a new domain; and
− Depth – this was seen in the level of sophistication of disposition application and the extent to
which application of dispositions was appropriate to the occasion.
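As a purely hypothetical illustration of how such a snapshot might be recorded digitally (the field trials themselves used paper-based reporting tools), one could store a teacher’s judgement for each sub-disposition along the three dimensions above. The class and the 0-3 ordinal scale below are our own assumptions made for the sake of the example, not part of the trialled tool.

```python
from dataclasses import dataclass

# Hypothetical sketch: one snapshot judgement for one pupil on one
# sub-disposition, recorded along the tool's three dimensions.
# The 0-3 ordinal scale is an assumption made for this illustration;
# the trialled tool used shaded paper grids rather than numeric scores.
@dataclass
class SubDispositionSnapshot:
    pupil_id: str
    disposition: str      # e.g. "Imaginative"
    sub_disposition: str  # e.g. "Making connections"
    strength: int         # 0-3: independence from teacher prompts or scaffolding
    breadth: int          # 0-3: use of the disposition in new contexts or domains
    depth: int            # 0-3: sophistication and appropriateness of use

# Example of a single recorded snapshot.
example = SubDispositionSnapshot(
    pupil_id="pupil-01",
    disposition="Imaginative",
    sub_disposition="Making connections",
    strength=2,
    breadth=1,
    depth=2,
)
```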
Trialling and refining the tool
'I have noticed that the children are far more aware of how and when they use their imagination
and are now independently identifying this throughout lessons for themselves.'
Primary Teacher, Field Trial 2
Two field trials were carried out. The first was designed to show:
− How easily teachers were able to map a pupil onto the framework;
− How easily teachers were able to decide on, and gather, suitable decision-making evidence/data;
− What the sticking points and ‘hard parts’ were in the process; and
− How we could improve this process.
The second was designed to ascertain:
− The extent to which pupils perceived that they were able to self-assess ‘imagination’;
− The extent to which pupils were able to provide sufficient supporting evidence; and
− How the tool could be modified, in order to provide direction for development.
Data collection for both trials was situated within a case study design. The case study research design
is typically used for qualitative data collection involving a number of groups of participants. The field trials were teacher-led, with design, planning and co-ordination provided by the project team. Both primary and
secondary schools were involved: six for the first trial, and 12 for the second. Each school identified a
project coordinator who attended our ‘train the trainer’ session. Coordinators communicated the project to
participating teachers at their school. Teachers were each asked to focus on six to 12 pupils in the year
groups we specified. For each pupil, they were asked to attempt to map the child’s profile at a single
moment in time in relation to the three dimensions ‘strength’, ‘breadth’, and ‘depth’ of the habit
‘imaginative’. Coordinators finally arranged for teachers to gather for completion of a pre-formatted end-
of-project report, which involved responding to 52 questions that probed respondents’ views on every
aspect of the tool.
Field Trial 2 was a broader trial, requiring pupil involvement and a longer timeframe than the single
snapshot specified for the first. Responsibility for using the tool lay with pupils, with teachers taking a
more facilitative role. The broad aims above gave rise to a number of in-depth research questions. Inquiries
were made of research participants (both teachers and pupils) through the use of questionnaires as a data
collection tool (see Appendixes 2 and 3).
Findings in more detail
On the evidence of our field trials in twelve schools, the concept of an assessment framework for
creativity in schools is valuable and relevant. Its value resides in its use as a prompt to teachers to enable
them to maintain focus and as a formative assessment tool to track pupil creativity. The language of the
tool provides pupils with a new (and sometimes stretching) vernacular with which to describe their
behaviour and monitor different dimensions of their learning, and helps teachers consider the opportunities
for creative development they provide. Among those we worked with we found no appetite for a
summative creativity instrument.
As a proof of concept, this study shows us that it is possible for both teachers and pupils to assess
pupils’ creativity, and that the five habits have face validity. Our conception of creativity fits teachers’
understandings of the creative dispositions that they would wish pupils to develop. The habits are said to
be sufficiently distinctive and useful, and there is a strong sense among teachers that our framework
encompasses a learnable set of dispositions.
While we had originally speculated that the framework could be of use between the ages of 4 and 16,
the trials suggest that we should initially focus on the 5-14 age range, although some practitioners may find
it useful with younger and older pupils. Outside England, piloting of the tool should initially avoid year groups in which teachers or students face high-stakes examinations.
Box 2. Field Trial Methodology
Field Trial 1

School        Year group (number of teachers)                                Assessment tools completed
3 primary     Nursery (5), Reception (1), Year 3 (2), Year 5 (3)             48
3 secondary   Year 7 (7); Year 8 (1); Year 9 (2); Year 10 (7); Year 11 (1);  161
              Year 12/13 (1); Year 13 (1)

Field Trial 2

Schools       Year groups                                   Teacher          Pupil            Pupil self-
                                                            questionnaires   questionnaires   reporting tools
5 primary     Nursery; Year 2 (x3); Year 1/2; Year 4/5;     16               61               98
              Year 4; Year 5; Year 5/6; Year 6 (x2)
6 secondary   Year 7; Year 8 (x3); Year 9 (x2)              9                68               90
It is not currently of use at the Foundation stage (age 3-5). There are two reasons for this. First, the tool always had a self-assessment element and this made it too complex for very young pupils. Secondly, early years practitioners already have a range of useful formative assessment processes for what is, in England, a largely play-based curriculum.
The tool is almost at the right grain of analysis: the use of five habits appears to be sufficiently comprehensive without being unwieldy. Consolidating the three sub-habits into one exemplar statement for pupils is too blunt an instrument to ensure they address all three aspects of the statement. Using the three dimensions of strength, breadth, and depth explicitly generates an assessment task that is too burdensome and complex, yet making them more implicit loses some of the subtlety. The tool is clear and accessible in its use of terminology and is applicable to a broad range of real-world types of creativity. It is sufficiently comprehensive and internally coherent: no missing habits or sub-habits, and no overlaps between sub-habits, were identified during the trials.
Benefits of using the assessment tool are broad and relate to:
− The potential for pupils to use feedback material formatively, supporting them in harnessing more of their creativity.
− The additional focus and precision which our research-informed synthesis of five dispositions afforded teachers in their classroom activities.
− The influence of the tool on teachers and its help in refining their practice, prompting them to think specifically about how they could cultivate the full range of creative dispositions.
− The boost to the status of creativity afforded by our clarification and refining of a practically
useful definition of creativity for those trying to argue its case. This is particularly pertinent in the
current educational landscape, as many ‘creative’ subjects are not included in the forthcoming
English Baccalaureate7. A more precise, research-led definition could be helpful in countering
potentially negative impacts of a narrower curriculum upon creativity.
− The balance of simplicity and rigour. This project has attempted to span the gap between theory
and practice, and has found that teachers will only use a tool that achieves this balance.
− The role that this tool, with its vocabulary, could have in structuring a community of practitioners
interested in teaching creativity.
− The emergence of an open-ended curriculum for developing creative-mindedness as a result of
collecting, tagging, and exchanging teaching and learning materials using the tool’s structure.
Reflections on fieldwork in schools
Both field trials focused on just one of the five dispositions: ‘being inquisitive’ for the first trial and ‘being imaginative’ for the second. In the first trial, teachers at six schools (3 primary and 3 secondary) were asked
to focus on 6 to 12 pupils and attempt to map each child’s profile onto a copy of the reporting tool at a
single moment in time by shading in the appropriate ‘strength’, ‘breadth’, and ‘depth’. They were given
full instructions. From the first trial we received a report from each of the six schools, and copies of over
200 completed assessment tools. In terms of validating the whole tool, teachers were asked to share their
thoughts on the five broad habits, and 15 sub-habits upon which the tool was built. Teachers’ feedback
provided us with a ‘proof of concept’ that the habits and sub-habits were useful, and that they could be
monitored and assessed. Teachers perceived the framework to be complete, with all expected aspects of
creativity being present.
For the second trial teachers at 11 schools (5 primary and 6 secondary) trialled a modified tool
– this time for pupils to self-assess with – in one of their classes for a period of four to six weeks. Teachers
implemented the project in a variety of ways, generally following the guidelines given by the project team.
Most teachers showed an online presentation and video we had prepared that explained the concept of
creativity, why its assessment would be beneficial, and how we planned for them to do this. The
presentation and video were received warmly by pupils. Many pupils were given the opportunity to
develop their own definitions of imagination, through various means including discussion, mind-mapping,
and blogging. Teachers held two to three sessions with the class, prompting pupils to self-assess using the pre-formatted pupil reporting tool. Pupils were asked to consider, drawing on recent examples, how imaginative they had been in comparison with the exemplar statement on the tool (which can be seen in Appendix 1), and to justify their self-assessment of how closely they fitted it. We received 25 teacher questionnaires, over 120 pupil questionnaires, and
copies of over 180 pupil reporting tools from participants of the second trial.
Some teachers linked their introductory session explicitly to the piece of work the class would tackle
that lesson. In one history class, for example, a discussion was held to develop the class’ awareness of the
role of creativity in history. A religious studies teacher used the introduction as a way of bringing in
consideration of thinking skills to a topic containing ‘big philosophical ideas’. While these activities do not
necessarily cultivate creativity, they show how teachers strive to place sessions on creativity in context.
Teachers’ views
The majority of teachers involved with the trial told us that their experiences with it had impacted
positively upon their practice. Three teachers talked of how the trial had broadened their awareness of
creativity; the different forms it takes and places it emerges; and how it helped them to value and celebrate
it. Another told us how she had benefitted from the narrow focus on just one aspect of creativity, and planned to continue this focus by looking at a small number of ‘skills’ on a half-termly basis.
At five schools, teachers talked about impacts of the trial on their practice, such as more listening to
(and questioning of) pupils in order to notice imaginative behaviour; more praise and encouragement of
pupils; more time for reflection; and more planning for imagination. Planning opportunities for
imagination into lessons and into wider schemes-of-work was the most common change teachers
mentioned.
Pupils’ views
Pupils were asked to return to the tool during 2-3 lessons (at their teacher’s discretion) over the course
of around six weeks. To test the proof of concept (i.e. whether pupils could use the tool), they were asked to think about concrete examples of when they had recently been imaginative. Teachers were told
they could focus pupils’ efforts during a particular lesson or project, but also to allow pupils to bring in
evidence from lessons taught by their colleagues – whether in primary or secondary. Pupils were asked to
compare their own behaviour with the exemplar statement on the tool and decide on the extent to which the
behaviour was ‘like me’ (not at all, a little, quite, very much). They were required to seek evidence. For
example, was their evidence in their written work? With such a formative tool, more important than an
objective relationship between the box ticked and the evidence provided would be the pupil’s justification
for their choice of box, and a dialogue with the teacher to help develop the pupil’s judgment.
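To make this record structure concrete, and looking ahead to the online version of the tool discussed later in this paper, the following minimal sketch (in Python, purely illustrative) shows how a single pupil self-assessment entry of this kind might be represented digitally. The class and field names are our own assumptions and do not come from the project’s reporting tool.

from dataclasses import dataclass, field
from datetime import date
from enum import Enum


class LikeMe(Enum):
    # The four 'like me' levels pupils chose from on the reporting tool.
    NOT_AT_ALL = 0
    A_LITTLE = 1
    QUITE = 2
    VERY_MUCH = 3


@dataclass
class SelfAssessmentEntry:
    # One dated self-assessment against the exemplar statement for a habit.
    # Field names are illustrative assumptions, not taken from the paper tool.
    pupil: str
    habit: str                        # e.g. "being imaginative"
    rating: LikeMe                    # how closely the pupil felt the statement was 'like me'
    justification: str                # the pupil's own reasons for ticking that box
    evidence: str                     # concrete evidence, e.g. a piece of written work
    context: str                      # subject or out-of-school setting the evidence came from
    recorded_on: date = field(default_factory=date.today)
    teacher_signed_off: bool = False  # set to True after dialogue with the teacher


# Example: a hypothetical Year 8 pupil logging evidence from a history lesson.
entry = SelfAssessmentEntry(
    pupil="Pupil A",
    habit="being imaginative",
    rating=LikeMe.QUITE,
    justification="I linked ideas from two sources without being prompted.",
    evidence="Essay comparing two accounts of the same event",
    context="History",
)
print(entry.rating.name, entry.context, entry.recorded_on)

Collating dated entries of this kind per pupil and per sub-habit would, in principle, give teachers the kind of record of progression that the recommendations later in this paper call for.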
As would be hoped, most pupils developed understanding of the key words and concepts used in the
tool. Having used the tool, pupils were overwhelmingly more aware of when they were being imaginative;
many were also actively seeking opportunities to be more so. Those pupils who claimed that the tool
had not made them more aware fell broadly into two categories. Some believed that they had sufficient
awareness anyway. Others showed that they had held onto their original views that creativity and
imagination could not be taught; that they were currently unable to be imaginative; or that creativity had
too many meanings for them to try and define it. In one school in particular, pupils’ responses indicated a
lack of sufficient contact time with the tool, but also that initial input from teachers had not been sufficient
to develop their understanding. It is to be expected that these factors would be concurrent with a narrow
view of what it means to be imaginative.
The vast majority of pupils told us that they found the tool accessible and evidence easy to gather, and that the tool became easier to use as it became more familiar.
The quality of self-assessments varied, for a number of reasons. Some pupils just listed work they had done or, more broadly, lessons they had been present in where they had used their imagination. Detail was generally sparse and insufficient for teachers to give guidance for improvement, although some pupils provided significantly more detail than others. On the whole, teachers were satisfied that evidence was reasonably justifiable and appropriate. Evidence tended to be better when it was concrete and sufficiently detailed, although one primary teacher told us that she found verbal evidence easier to agree with because her pupils were able to articulate themselves better orally than in writing. Evidence from lessons other than the teacher’s own was harder to judge. We therefore advise that the assessment tool be used for lessons given, or projects led, by the teacher who trials it.
Pupils often mentioned how much easier it was to gather evidence in subjects where they felt that their creativity was used more naturally, or where they were familiar with using their imagination in the sense they had always understood it.
Difficulties some pupils had related to: finding examples of when they had been imaginative; relating their examples to the exemplar statement; finding solid evidence; and deciding which of the ‘like me’ statements their evidence suggested they should tick. Putting thoughts into writing (particularly for pupils with a limited vocabulary) was the most commonly expressed of all these issues, although several of the mentions were by pupils who claimed to find the tool easy to use, suggesting it was only a minor issue for them. In the school where it was cited most frequently (by 15 pupils), however, the teacher considered recording evidence to be an issue only for the less able children.
In two schools, reflecting and discussing with peers was more popular than noting experiences down, while some pupils tried to include too much detail on their reporting sheet. For a formative tool intended to map progression, however, the question arises of how useful this level of detail is. While a certain level of relevant detail is helpful in ensuring teachers understand pupils’ evidence, reams of descriptive, uncritical narrative are unlikely to be read and absorbed by teachers with a view to assisting in the formation of deeper levels of creativity, and even less likely to be drawn on by pupils as they hit problems in the future or wish to reflect and decide upon their own personal development targets.
A common theme was that developing creativity in maths lessons posed particular challenges. The teaching had to be conducive to pupils using their imagination; pupils needed confidence to believe they could be imaginative (particularly girls); and they needed hard evidence, which was less easy to obtain from looking at a piece of maths work. Pupils in one maths teacher’s class did, however, report that it became easier with practice.
Refining the second field trial
The first field trial highlighted some aspects of the tool that were burdensome or difficult to use for
young students in a limited time. Field trial 2 therefore simplified the tool to make it more friendly and
accessible. Key differences from the first trial were:
− Being ‘imaginative’ was the creativity habit in focus, rather than being ‘inquisitive’.
− The assessment tool was simplified in terms of both process and content.
− Assessment was undertaken by pupils, with teachers taking a facilitative ‘signing off’ role.
− The assessment process was embarked upon over a period of time rather than carried out at a
snapshot moment.
− Having trialled quite a complex approach to mapping creativity using dimensions of ‘strength’,
‘breadth’, and ‘depth’ in the first field trial, the second trial simplified the tool in this regard. Our
approach to ‘strength’ and ‘depth’ involved the following criteria attached to the exemplar
statement, rather than attempting to give separate scores for ‘strength’, ‘breadth’, and ‘depth’
(seen in the tool in Appendix 1):
‘I can do these things without being prompted. I am confident about doing these things’.
− The ‘breadth’ dimension (intended to reflect the degree to which individuals showed creative
tendencies across a range of contexts) was accounted for by pupils considering examples and
evidence from various contexts. Pupils were not expected to give themselves a single score for
‘breadth’, but rather to consider a range of contexts.
− Key Stage 4 (age 14 to 16) was omitted due to potential conflicts with statutory examinations. Schools were asked to focus on Years 2, 4, 6 and 8, as well as the Foundation stage.
Following field trial 1, the concept of ‘strength’ was replaced with the more transparent idea of
‘independence’; the idea of being able to do things without being prompted. Confidence was used as a
proxy for ‘depth’.
This consolidated approach to tracking strength and depth was apparently successful only at those schools where the tool was entirely unproblematic. At three schools, teachers were satisfied that pupils understood the requirement and had no problem paying attention to both. At seven of the other schools, teachers themselves did not provide us with feedback relating to this specific question, suggesting strongly that the consolidated approach was too subtle or intangible for them to notice.
Having trialled a more complex approach to assessment of the three sub-habits of ‘being imaginative’
in field trial 1, for field trial 2 we developed a combined exemplar statement that described what it would
look like if an individual was doing all three sub-habits well. Pupils varied in the degree to which they
evidenced one, two, or three sub-habits. In some instances pupils did not comprehend the question we
asked regarding the number of sub-habits they had attempted to evidence. This suggests the consolidated
approach was not sufficiently directive for some.
Of the three sub-habits, if one was given slightly more attention by those telling us what they found
difficult, it was using their ‘intuition’ (being able to carry on even when you cannot fully explain your
reasoning). This said, difficulties with intuition were mentioned only infrequently. Not a familiar word to
begin with, it became more so with practice and also with hindsight. Some found it less easy to notice
when they themselves were being intuitive, although teachers told us pupils did use their own intuition. It
is quite possible that the problem with intuition (if indeed there really was one) may not have been the
wording, because teachers would have used different words to explain what it meant, but the concept itself.
Intuition is perhaps inherently difficult to notice and, therefore, sometimes hard to evidence. As it is so
intangible it is also harder to write about, even when noticed. It is, nevertheless, an important aspect of
creativity that appears regularly in other analyses of the traits of creative individuals. Perhaps the
difficulties of demonstrating it should not prevent its inclusion in an assessment framework.
Teachers at two schools expressed a preference to focus on capturing evidence for only one sub-habit at a time.
Our approach to ‘breadth’ involved asking participating teachers to allow pupils to bring in evidence
from other lessons as well as their own. Pupils indicated to us whether they had drawn upon a single
subject only; a narrow range of subjects; a broad range; or a broad range plus out-of-school examples. The
range of subject examples drawn from and the range which exists within subjects helped pupils to establish
breadth.
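As a purely illustrative aside, the short sketch below shows how the contexts a pupil cites might be mapped onto the four breadth categories used in the pupil questionnaire (Appendix 2). The numeric thresholds are our own assumptions; the trial did not specify how many subjects count as a ‘narrow’ or ‘broad’ range.

def breadth_category(contexts: set, out_of_school: bool = False) -> str:
    # Map the contexts a pupil drew evidence from onto the four breadth
    # categories used in the pupil questionnaire. The numeric thresholds are
    # illustrative assumptions: the trial did not define how many subjects
    # count as a 'narrow' or 'broad' range.
    if out_of_school:
        return "other subjects and out of school too"
    if len(contexts) <= 1:
        return "one subject only"
    if len(contexts) <= 3:
        return "a narrow range of subjects"
    return "a broad range of subjects"


# Example: evidence cited from history, English and science lessons only.
print(breadth_category({"History", "English", "Science"}))  # a narrow range of subjects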
At this stage in the development of the tool, pupils were not led by the research team or teachers to be
systematic about collecting their evidence and only around a dozen mentioned out-of-school evidence.
This was expected, given the arrangement whereby pupils did this work with only one teacher. The trial
was to see whether pupils could refer to other subjects. A common theme in this regard was selectivity,
with the most overtly ‘creative’ subjects being considered more readily by some.
Remembering contexts outside of pupils’ immediate experience was a problem for a few pupils; for
some even recalling what they had done earlier in the lesson in which they were reflecting was a challenge.
Subject silos also kept, to some degree, pupils’ minds confined to the subject in which they were working.
This suggests that a method of capturing thoughts that works on the spot would be best. Note, however,
that a teacher at one school believed that not overdoing the reporting was a good approach.
Field Trial 2 findings suggest that it would certainly be worth developing the tool because of the
significant impact it had upon pupils’ understanding of important learning concepts, upon their vocabulary,
and upon teachers’ professional practice.
To continue with this work, further development of the tool is needed. Most significantly, accompanying training materials are needed in order to maximise the benefits of tracking creativity in classes, and the tool should be made more formatively useful. Teachers and pupils need detailed exemplars and
clear guidance about how best to utilise the tool to ensure that evidence is always pointing pupils’
development on an upward trajectory.
There are remaining issues of incentivisation. For example, how do we ensure the tool becomes part
of the schools’ data collection, reporting, and reward system? To what extent is this necessary? How could
technology assist with this? Issues of moderation also remain. How do we ensure teachers and pupils share
a common understanding of the key terms? How might we use moderation to develop some exemplar
pieces of evidence?
For future development, however, we believe the tool’s most useful direction is as a formative
instrument for pupils and teachers to concentrate on action for the future, rather than as a record of past
achievement.
4. Conclusion and next steps
‘We need innovative practitioner research within the field of curriculum and assessment studies –
research that will change assessment policy and creative learning practices within the
classroom in different socio-cultural contexts.’
Pamela Burnard (2011:140-149)
The fact that, after recent years of considerable investment in promoting creativity in schools in
England, there is no widely used assessment tool or framework has a number of possible explanations.
It could be that assessing creativity is just too difficult in schools. Or it might be a consequence of
being in an education system already perceived as over-tested. Or the subject-dominated nature of schools
may simply throw up too many logistical barriers. Or, we suspect, as was revealed in the anecdote we cited
on page 6, teachers who are interested in creativity may remain wary about assessing it. Or it may be that there is no clear understanding and consensus about what creativity means in different contexts.
We started our research and development with three questions:
1. Is it possible to create an assessment instrument that is sufficiently comprehensive and sophisticated for teachers to find useful (the proof of concept)?
2. Would any framework be useable across the entire age span of formal education?
3. If a framework is to be useful to teachers and pupils, what approach to assessment should it
adopt?
Here we give answers to these questions and offer some more finely grained reflections on what we
found. Our full report can be found on CCE’s website (Spencer et al. 2012b)8.
1. It is possible to create an assessment instrument that teachers find useful and to this extent the
concept is proved.
2. The framework seems most useable between the ages of 5 and 14. Post-14, the pressure of examinations and the pull of subjects seems too great. Pre-5, early years teachers already have
excellent formative learning tools for use in a curriculum which is much more playful and into
which the development of creativity already fits easily.
3. We are clear that the primary use of the tool is in enabling teachers to become more precise and
confident in their teaching of creativity and as a formative tool to enable learners to record and
better develop their creativity.
The teachers who trialled our tool found the underlying framework both rigorous and plausible. They
liked the tool and could see how it could be useful in the classroom for formative assessment. On the basis
of this small-scale study, there would seem to be an appetite for a tool like ours to help teachers focus more
on creative mindedness and to help learners develop their own creativity more effectively. But we are only
at the very beginning of a larger process.
The teachers we worked with clearly preferred an approach to assessment which was formative, not
summative. We got the strong sense that there is little appetite for the creation of a complex summative
matrix against which the creativity of pupils can be compared and cross-checked. This feeling came
particularly strongly from feedback at our pre-trial work with teachers and headteachers which
demonstrated their aversion to the notion of grading pupils in light of their creativity.
Thus far we have tried the tool only with teachers who declared an interest in creativity, and only in English schools. While the concept seems to be a useful one, the tool has only been used by teachers and pupils over very short periods of time. The assessment tool existed only on paper rather than in an online version, and its design was simple and not specially tailored to the different ages of the pupils who used it.
We have found that the balance of simplicity and rigour in an assessment tool is key, because teachers will only use a tool that achieves this balance. The use of five habits is sufficiently detailed without being too unwieldy, and the five habits we trialled were validated by practitioners and pupils.
In terms of what the assessment tool might look like in light of our findings we recommend the
following approaches:
− Maintaining the emphasis upon the learnability of creativity. The steering group was strongly in
support of our decision to emphasise this aspect, and also that of sociability.
− Incorporating the tool into the school’s data collection, reporting, and reward systems.
− Developing training materials and resources for teachers to demonstrate best practice, making the
assessment process more tangible for teachers. Materials might relate to:
− communicating the purpose
− linking evidence to the exemplar statement
− demonstrating the level of detail required
− preparing very young pupils
− Developing the layout to separate out again the three sub-habits of each creativity habit.
− Using a clear font that is easily decipherable by the youngest pupils.
− Scrutinising language to ensure it is sufficiently clear, particularly for younger children and those
with special needs, but also those less familiar with creativity or learning vocabulary. This may
mean creating different versions of the tool for different age groups, although comments did not
justify this as a definitive course of action.
− Developing best practice relating to how teachers might choose to focus on a small aspect of the
tool at a time.
− Developing a more formative tool that prompts pupils and teachers to consider how they could improve, rather than just logging past behaviour. From a practical point of view, the present tool does not allow progression to be captured adequately because of a lack of space. Some pupils did date their notes, which showed progression to a small degree. Separation of the sub-habits would allow more focused notes on progression.
− Capturing ‘breadth’ more systematically in the tool, by establishing how it could be used in
multiple contexts, and whether there would be any issues of ultimate ownership. This may
involve exploring how schools best deal with the issue of coordination to ensure that assessments
are undertaken systematically and collated in a useful format for both learners and teachers to use
formatively. For example, schools may need to assign a coordinator role to ensure that
assessments are undertaken. This role may fall naturally to the ‘assessment coordinator’ at
primary level.
− Developing a more systematic evidence collection process, together with materials to prompt teachers’ thinking about the opportunities they provide in the curriculum.
− Developing the tool for the virtual environment.
− Trialling the tool with the ‘unconverted’. In light of the fact that participating schools were a self-
selecting group of ‘keen’ practitioners, the tool is yet to be exposed to the ‘unconverted’. Its
introduction to a group of schools unfamiliar with assessment of creativity would further test its
practicality and utility. Given its non-statutory status, however, it could be that it would be better
to focus on those schools which actively want to explore creativity.
NOTES
1 The word Guilford actually used was ‘trait’. There are many near synonyms of which we are aware, each with
slightly different nuances, including ‘characteristic’, ‘quality’, ‘attribute’, ‘habits of mind’ and ‘disposition’. We have chosen largely to use the word ‘disposition’ throughout this paper, except when ‘trait’ has already become widely associated with a line of thinking. We prefer to refer to creative ‘dispositions’ because ‘disposition’ conveys an unequivocal connection with the idea that such aspects of any individual can be cultivated and learned, becoming stronger and deeper.
2 ‘Creative agents’ is the term used to describe professionals from a range of disciplines funded to work in schools as
part of CCE’s Creative Partnerships scheme.
3 This phenomenon can also be seen in other subjects, notably mathematics, where poor numeracy levels can be unfairly taken as a proxy for being ‘stupid’.
4 In 1999, the National Advisory Committee on Creative and Cultural Education (NACCCE) produced a report to the
UK Government: All Our Futures: Creativity, Culture and Education. The committee’s inquiry coincided
with the review of the National Curriculum in England and Wales, and, thus, made recommendations for
this review. It also included recommendations for a wider national strategy for creative and cultural
education. The NACCCE report was a response to the Government’s 1997 White Paper, Excellence in
Schools, and it highlighted an undervaluing of the arts, humanities, and technology. Our literature review
(Spencer et al., 2012) elaborates further on how the NACCCE report shaped the development of creativity
within education in the United Kingdom.
5 In 1999 the Nuffield Foundation funded a piece of research called the King’s-Medway-Oxfordshire Formative
Assessment Project (KMOFAP). As a result of the project, Assessment for Learning (AfL) has become
central to education policy in England and Scotland. AfL is any assessment that prioritises pupil learning
first and foremost.
6 In England, Key Stages of education sit within the National Curriculum framework. Key Stage 4 comprises school years 10-11, and children aged 14-16. Pupils are assessed at the end of KS4, which marks the end of compulsory education.
7 Introduced as a performance measure in the 2010 performance tables, the EBacc is a measure of where pupils have
attained a grade C or above in a core of academic subjects (English, maths, history or geography, the
sciences and a language). It enables comparison of schools in terms of their provision for the key academic
subjects that are preferred or required for entry to degree courses.
8 www.creativitycultureeducation.org/wp-content/uploads/Progression-in-Creativity-Final-Report-April-2012.pdf
REFERENCES
Adams, K. (2005), The Sources of Innovation and Creativity, National Center on Education and the
Economy.
Banaji, S., A. Burn and D. Buckingham (2010), The Rhetorics of Creativity: A Literature Review, 2nd edition, Creativity, Culture and Education, Newcastle.
Beattie, D. (2000), "Creativity in Art: The Feasibility of Assessing Current Conceptions in the School
Context", Assessment in Education: Principles, Policy and Practice, Vol. 7, No. 2, pp. 175-192.
Berger, R. (2003), An Ethic of Excellence: Building a Culture of Craftsmanship with Students, Heinemann
Educational Books, Portsmouth, NH.
Black, P. and D. Wiliam (1998), "Inside the Black Box: Raising Standards Through Classroom
Assessment", The Phi Delta Kappan, Vol. 80, No. 2, pp. 139-144 and 146-148.
Boud, D. and N. Falchikov (2006), "Aligning Assessment with Long-Term Learning", Assessment and
Evaluation in Higher Education, Vol. 31, No. 4, pp. 399-413.
Broadfoot, P. (2000), "Preface, pp ix-ixxx" in Filer, A. (ed.), Assessment: Social practice and social
product, RoutledgeFalmer, London.
Burnard, P. (2011) "Constructing Assessment for Creative Learning", in Sefton-Green, J., K. Jones and L.
Bresler (eds.), The Routledge International Handbook of Creative Learning, Routledge, Abingdon.
Claxton, G. (2006) "Cultivating Creative Mentalities: A Framework for Education", Thinking Skills and
Creativity, Vol. 1, No. 1, pp. 57-61.
Cooper, L., T. Benton and C. Sharp (2011), The Impact of Creative Partnerships on Attainment and Attendance in 2008-09 and 2009-10, NFER, Slough.
Cooperrider, D. and D. Whitney (2005), Appreciative Inquiry: A Positive Revolution in Change, Berrett-Koehler Publishers Inc, San Francisco, CA.
Craft, A. (2008a), "Approaches to Assessing Creativity in Fostering Personalisation", paper prepared for
discussion at DCSF Seminar – Assessing The Development of Creativity: Is it possible, and if so
what approaches could be adopted? 3rd October 2008, Wallacespace, London.
Craft, A. (2008b), Creativity in the School [Online], www.beyondcurrenthorizons.org.uk/creativity-in-the-
school/, accessed 6th July 2011.
Csikszentmihalyi, M. (1996), Creativity: Flow and the Psychology of Discovery and Invention,
HarperCollins, New York.
Ericsson, A., R. Krampe and C. Tesch-Römer (1993), "The Role of Deliberate Practice in the Acquisition
of Expert Performance", Psychological Review, Vol. 100, No. 3, pp. 363-406.
Feist, G. (2010), "The Function of Personality in Creativity: The Nature and Nurture of the Creative
Personality", in Kaufman, J. and R. Sternberg (eds.), The Cambridge Handbook of Education,
Cambridge University Press, Cambridge.
Fillis, I. and A. McAuley (2000), "Modeling and Measuring Creativity at the Interface", Journal of
Marketing Theory and Practice, Vol. 8, No. 2, pp. 8-17.
Florida, R. (2002), The Rise of the Creative Class…and How it’s Transforming Work, Leisure, Community
and Everyday Life, Basic Books, New York, NY.
Guilford, J. (1950), "Creativity", American Psychologist, Vol. 5, No. 9, pp. 444-454.
Heindel, C. and L. Furlong (2000), "Philosophies of Creativity: Two Views", Zip Lines: The voice for
adventure education, No. 40, pp. 47-48.
Hingel, A. (2009), "Creativity Measurement in the European Policy Context", pp 421-422, in Villalba, E.
(ed.), Measuring Creativity: The book, www.ec.europa.eu/education/lifelong-learning-
policy/doc2082_en.htm, accessed 6th December 2011, European Commission.
Hocevar, D. (1981), "Measurement of Creativity: Review and Critique", Journal of Personality Assessment, Vol. 45, No. 5, pp. 450-464.
John-Steiner, V. (2006), Creative Collaboration, Cambridge University Press, Cambridge.
Kaufman, D., D. Moss and T. Osborn (eds.) (2008), Interdisciplinary Education in the Age of Assessment,
Routledge, New York, NY.
Kaufman, J. and R. Sternberg (eds.) (2010), The Cambridge Handbook of Creativity, Cambridge
University Press, Cambridge.
Koestler, A. (1964), The Act of Creation, Penguin Books, New York, NY.
Lave, J. and E. Wenger (1991), Situated Learning: Legitimate Peripheral Participation, Cambridge
University Press, Cambridge.
Leahy, S. and D. Wiliam (2009), "From Teachers to Schools: Scaling up Formative Assessment", paper
presented at the 2009 AERA Annual Meeting on Disciplined Inquiry: Education research in the
circle of knowledge, San Diego.
Lindström, L. (2006), "Creativity: What is it? Can you assess it? Can it be taught?", International Journal
of Art and Design Education, Vol. 25, No. 1, pp. 53-66.
Looney, J. W. (2009), “Assessment and Innovation in Education”, OECD Education Working Papers,
No. 24, OECD Publishing, http://dx.doi.org/10.1787/222814543073.
Looney, J. W. (2011), “Integrating Formative and Summative Assessment: Progress Toward a Seamless
System?”, OECD Education Working Papers, No. 58, OECD Publishing,
http://dx.doi.org/10.1787/5kghx3kbl734-en.
Lucas, B. and G. Claxton (2009), Wider Skills for Learning: What are they, how can they be cultivated,
how could they be measured and why are they important for innovation, NESTA, London.
Lucas, B. and G. Claxton (2010), New Kinds of Smart: How the Science of Learnable Intelligence is
Changing Education, McGraw Hill Open University Press, Maidenhead.
Menter, I. (2010), Teachers - Formation, Training and Identity: A Literature Review, Creativity, Culture and
Education, Newcastle Upon Tyne.
National Advisory Committee on Creative and Cultural Education (1999), All Our Futures: Creativity,
Culture and Education, DCMS and DfEE, London.
Organisation for Economic Co-operation and Development (OECD) (2005), Formative Assessment.
Improving Learning in Secondary Classrooms, OECD Publishing, Paris.
Perkins, D. (1995), Outsmarting IQ: The Emerging Science of Learnable Intelligence, Free Press, New
York, NY.
Plucker, J. and M. Makel (2010), "Assessment of Creativity", pp 48-73, in Kaufman, J. and R. Sternberg
(eds.), The Cambridge Handbook of Creativity, Cambridge University Press, Cambridge.
Root-Bernstein, R. and M. Root-Bernstein (1999), Sparks of Genius, Houghton Mifflin, Boston.
Runco, M. (2010), "Divergent Thinking, Creativity, and Ideation", in Kaufman, J. and R. Sternberg (eds.),
The Cambridge Handbook of Creativity, Cambridge University Press, Cambridge.
Saltelli, A. and E. Villalba (2008), "How About Composite Indicators?", pp. 17-24, in Villalba, E. (ed.),
Measuring Creativity: The book, www.ec.europa.eu/education/lifelong-learning-
policy/doc2082_en.htm, accessed 6th December 2011, European Commission.
Smith, J. and L. Smith (2010), "Educational Creativity", in Kaufman, J. and R. Sternberg (eds.), The
Cambridge Handbook of Creativity, Cambridge University Press, Cambridge.
Spencer, E., B. Lucas and G. Claxton (2012a), Progression in Creativity - Developing New Forms of
Assessment: A literature Review, CCE, Newcastle.
Spencer, E., B. Lucas and G. Claxton (2012b), Progression in Creativity - Developing New Forms of
Assessment. Final Research Report, CCE, Newcastle.
Sternberg, R. (1996), Successful Intelligence: How Practical and Creative Intelligence Determine Success
in Life, Simon and Schuster, New York.
Taras, M. (2005), "Assessment - Summative and Formative - Some Theoretical Reflections", British
Journal of Educational Studies, Vol. 53, No. 4, pp. 466-478.
Treffinger, D., G. Young, E. Selby, and C. Shepardson (2002), Assessing Creativity: A Guide for
Educators, The National Research Centre on the Gifted and Talented, Connecticut.
Villalba, E. (2008), Towards an Understanding of Creativity and its Measures, European Commission
Joint Research Centre, Luxembourg.
Watkins, C. (2010), "Learning, Performance and Improvement", International Network for School
Improvement Research Matters, No. 34., Institute of Education.
Wiliam, D. (2006), "Assessment for Learning: Why, What and How?" [Online], Cambridge Assessment
Network talk, Excellence in Assessment: Assessment for Learning: A supplement to the Cambridge
Assessment Network 'Assessment for Learning' seminar held on 15 September 2006 in Cambridge,
UK, Cambridge Assessment Network www.assessnet.org.uk/e-
learning/file.php/1/Resources/Excellence_in_Assessment/Excellence_in_Assessment_-_Issue_1.pdf,
accessed 12th December 2011.
Wiliam, D. and P. Black (1996), "Meanings and Consequences: A Basis for Distinguishing Formative and
Summative Functions of Assessment?", British Educational Research Journal, Vol. 22, No. 5,
pp. 537-548.
Wiliam, D., C. Lee, C. Harrison and P. Black (2004), "Teachers Developing Assessment for Learning:
Impact on Student Achievement", Assessment in Education: Principles, Policy and Practice,
Vol. 11, No. 1, pp. 49-65.
APPENDIX 1: FIELD TRIAL 2 – ASSESSMENT TOOL

[The assessment tool is reproduced as a full-page image in the original document.]
APPENDIX 2: PUPIL QUESTIONNAIRE, FIELD TRIAL 2
(Question numbering cross-referenced to in-depth research questions)
Please enter the following information
Today’s date: __ / __ / 2011
Your school’s name:
Name of the teacher who introduced the pupil recording sheet to you:
Your full name (first name and surname):
3. Thinking about the process of trying to track how imaginative you are, and how it might help you

3.3 Now that you’ve tried out the ‘pupil recording sheet’, are you more aware of when you are being imaginative? Please tell us why you chose that answer. Please type below and highlight one of the boxes to the right.
☐ It has not really made me more aware
☐ It has made me more aware
☐ I am more aware and I also think more about how I could be more imaginative
3.5 How easy did you find the pupil recording sheet to use? Please type below and highlight one of the boxes to the right. If difficult, what was hard? Should we change any of the words? Which ones, and why?
☐ Very easy   ☐ Easy   ☐ Less easy   ☐ Very difficult
5. Thinking about how easily you can track your ‘imagination’ and provide evidence

5.4 Did you find it easy to decide on and gather ‘evidence’ to support the box you ticked on the recording sheet? What bits were difficult and why? Please type below and highlight one of the boxes to the right.
☐ Very easy   ☐ Easy   ☐ Ok   ☐ Less easy   ☐ Very hard
7. Thinking about how much attention you paid to the three aspects of 'imagination' in the exemplar statement (not narrowing ideas down too quickly, linking facts and ideas, using intuition)

7.1 There were three parts to 'being imaginative' (see the box above). Please tick one of the boxes, and then write which area(s) you ignored and why. Please type below and highlight one of the boxes to the right.
☐ I tended to evidence one area only
☐ I tended to evidence two areas
☐ I tended to evidence all three elements of being imaginative
8. Thinking about the different places you got evidence from

8.1 From which subject areas did you draw your evidence? Please tick one of the boxes, and then tell us which subject area(s) you considered and why. Please type below and highlight one of the boxes to the right.
☐ One subject only   ☐ A narrow range of subjects   ☐ A broad range of subjects   ☐ Other subjects and out of school too
9. Thinking about the whole project – from hearing about it, to trying it out, to reflecting in class
For you, what were the two best things about this pupil reporting tool?
What were the two most difficult parts?
What two things have you learned about yourself?
APPENDIX 3: EXAMPLE TEACHER QUESTIONNAIRE, FIELD TRIAL 2
Supporting the Development of Creativity in Schools – Assessing Creativity
Field Trial 2: Questionnaire for teacher participants
Dear teacher
Thank you for taking part in this field trial. Your views are really important to us, so:
− Everything you write in the questionnaire will be analyzed thoroughly to help us understand your
experiences of the pupil reporting tool.
− Please be as detailed as you can. Text boxes expand as you type.
− Where there are multiple choice answers, please check the box and also add comments to
explain.
− We should be grateful if you would complete this form electronically. This means it can then be
analyzed without transcription. We can email a paper version (for printing) if necessary.
Please enter the following information
Today’s date: 5 / 12 / 2011
Year group of the participating class: Year 8
Date that you first introduced the ‘pupil recording sheet’ to your class: 18 / 11 / 2011
Number of subsequent reflection sessions you held with the same class: 3
1. Thinking about your previous experience of tracking creativity in pupils

1.1 How are you already assessing creativity in your school?
    Response: We use a criteria that moves from dependence to independence in Guy Claxton’s 4 R’s
1.2 In which contexts / subject areas are you assessing creativity in your school?
    Response: In the Evolve Curriculum (Year 7), in our days of learning: Perform, Create, Innovate, Communicate, Explore (Year 8)
1.3 How did this project sit within other assessment activities in your school?
    Response: It sat really well because we regularly discuss creativity and imagination
1.4 How did pupils understand ‘imagination’ before the project? (If given the opportunity to come up with their own definitions at the start.) If you have photographs of mind-maps, or similar, prepared by pupils when this project was started please send these to the project co-ordinator along with this questionnaire.
    Response: I will send the mind-maps they created to you as evidence
2. Thinking about how you introduced the project to pupils

2.1 How did you introduce the project to pupils (the idea of assessing creativity; the idea of learning to get better at noticing ‘being imaginative’)?
    Response: I did a workshop based around the question: “What is creativity?” And then “What does being imaginative mean?”
2.2 What went well, or less well, in introducing the project to pupils?
    Response: The students really engaged with the workshop, they seemed to enjoy our 3 hour discussion around creativity and noticing when they were being imaginative
2.3 You were asked to hold 2-3 reflection sessions with pupils. How did you conduct the reflection sessions? What did you ask pupils to do? How did you ask them to interact with one another?
    Response: I asked the pupils to complete the evidence sheets and we discussed these experiences. I also had asked that they peer assess, noticing others imagination and telling them when they see evidence of it.
2.4 What activities or approaches to the reflection sessions did you find worked best / less well, and why?
    Response: They preferred to discuss rather than note experiences down
3. Thinking about how the process of trying to assess their own creativity might have affected pupils’ behaviour

3.1 How did pupils respond to the concept of tracking ‘imagination’ and how did this change (if it did)? Please highlight one of the options to the right and explain below:
    Options: Remained positive / Changed to negative / Changed to positive / Remained negative
3.2 How successfully do you believe you were able to guide pupils in tracking the development of their imagination, using this pupil recording sheet, over the duration of the project? Please explain why.
    Response: I feel we were quite successful, the students are very independent in this sample and they were able to track their development well, the record sheet was very user friendly.
3.3 Did you notice a change in the way pupils talked about, and understood, ‘being imaginative’ as the project progressed? Please explain.
    Response: They were able to notice each others imagination and feedback to them, this certainly didn’t happen before.
3.4 How did pupils respond to the reflection sessions?
    Response: Really positively
4. Thinking about how the process of helping pupils to track their own creativity might have impacted your own practice

4.1 Were other members of staff (non ‘teacher participants’) able to support the project? If so, please give details. Please highlight one of the options to the right and explain below:
    Options: They dropped examples of being imaginative into their lessons / They knew about the project and we talked about it / They knew but were not supportive / I do not know if they knew about the project
4.2 Now that you have helped pupils track their imagination, what things do you do or think differently and why?
    Response: Question them more regularly on when they have been imaginative, encourage them to be more imaginative.
5. Thinking about how pupils were able to self-assess ‘imagination’ and provide enough evidence to support the box they ticked

5.1 Pupils had to select the tick-box that showed how closely they ‘fit’ with the exemplar statement. Did you find it easy to ‘sign off’ this evidence? What was difficult? Please highlight one of the options to the right and explain below:
    Options: Very easy / Easy / Ok / Less easy / Very hard
5.2 Pupils were encouraged to provide evidence from areas outside of your subject area. Did you find it easy to ‘sign off’ this evidence? What was difficult? Please highlight one of the options to the right and explain below:
    Options: Very easy / Easy / Ok / Less easy / Very hard
5.3 Were some sorts of evidence more or less persuasive than others? Please give examples of evidence that was hard to sign off, and evidence that was easy to reach a consensus on with pupils. Please tell us why.
    Response: Creative writing was easier to sign off. Maths lessons were far harder
6. Thinking about how much attention pupils paid to the whole exemplar statement. It talked about using imagination ‘without being prompted’ and ‘confidently’

6.1 Did pupils seem to pay attention to each part of the exemplar statement when supporting their choice of tick-box with evidence, or did they provide evidence for some and ignore other aspects? Which bits were ignored? Please highlight one of the options to the right and explain below:
    Options: They tended not to refer to confidence and not needing prompting / They tended to evidence confidence OR not needing prompting / They tended to evidence both confidence AND not needing prompting
6.2 Did pupils tend to provide enough evidence for their choice of tick box? Please highlight one of the options to the right and explain below:
    Options: Evidence was missing / Evidence was tenuous or ambiguous / Evidence was alright / Evidence was good / They really understood how to make the evidence clear and relevant
7. Thinking about how much attention pupils paid to the definition of 'imagination' with its three aspects (trying things out, combining ideas from different places, being able to carry on even when you can't fully explain your reasoning)

7.1 Did pupils pay equal attention to all three aspects of being imaginative when they provided evidence? Please select from the right and explain below:
    Options: They tended to evidence one area only / They tended to evidence two areas / They tended to evidence all three elements of being imaginative

8. Thinking about the range of learning settings (including out of school contexts) that pupils drew on

8.1 To what extent did pupils draw on a range of learning contexts, not just your own subject area? Please highlight one of the options to the right and explain below:
    Options: Your subject only / A narrow range of subjects / A broad range of subjects / Other subjects and out of school too

9. Thinking about the whole project – from hearing about it, to receiving the materials, to introducing it to pupils, to trying it out and re-visiting it with them

9.1 What three things worked well?
    Response: 1. The new student record sheet
    2. The emphasis on the students to gather evidence
    3. The focus on just one element of the model
9.2 What were the three most difficult parts?
    Response: a. Time
    b.
    c.
9.3 What three things have you learned?
    Response: That students have a really good understanding of what good imagination looks like
    They are very observant noticing when each other use imagination
    That there aren’t enough hours in the day to really dig deep into the impact of this field study
10. Your additional comments
Please tell us anything else you think might be useful to us.
RECENT OECD PUBLICATIONS OF RELEVANCE TO THIS WORKING PAPER
Foray, D. and J. Raffo (2012), “Business-driven innovation: is it making a difference in education? An
analysis of educational patents”, OECD Education Working Papers, No. 84, OECD Publishing,
http://dx.doi.org/10.1787/5k91dl7pc835-en.
Kärkkäinen, K. (2012), “Bringing About Curriculum Innovations: Implicit Approaches in the OECD
Area”, OECD Education Working Papers, No. 82, OECD Publishing,
http://dx.doi.org/10.1787/5k95qw8xzl8s-en.
Lubienski, C. (2009), “Do Quasi-markets Foster Innovation in Education?: A Comparative Perspective”,
OECD Education Working Papers, No. 25, OECD Publishing,
http://dx.doi.org/10.1787/221583463325.
Looney, J. (2009), “Assessment and Innovation in Education”, OECD Education Working Papers, No. 24,
OECD Publishing, http://dx.doi.org/10.1787/222814543073.
OECD (2012), Better Skills, Better Jobs, Better Lives: A Strategic Approach to Skills Policies, OECD
Publishing, Paris.
OECD (2010), Innovative Workplaces. Making better use of skills within organisations, OECD Publishing,
Paris.
OECD (2010), The OECD Innovation Strategy: Getting a Head Start on Tomorrow, OECD Publishing,
Paris.
OECD (2009), Working Out Change. Systemic Innovation in Vocational Education and Training, OECD
Publishing, Paris.
OECD (2007), Evidence in Education. Linking Research and Policy, OECD Publishing, Paris.
OECD (2004), Innovation in the Knowledge Economy – Implications for Education and Learning, OECD
Publishing, Paris.
THE OECD EDUCATION WORKING PAPERS SERIES ON LINE
The OECD Education Working Papers Series may be found at:
− The OECD Directorate for Education website: www.oecd.org/edu/workingpapers
− Online OECD-ilibrary: http://www.oecd-ilibrary.org/education/oecd-education-working-
papers_19939019
− The Research Papers in Economics (RePEc) website: www.repec.org
If you wish to be informed about the release of new OECD Education working papers, please:
− Go to www.oecd.org
− Click on “My OECD”
− Sign up and create an account with “My OECD”
− Select “Education” as one of your favourite themes
− Choose “OECD Education Working Papers” as one of the newsletters you would like to receive
For further information on the OECD Education Working Papers Series, please write to:
edu.contact@oecd.org.