ABSTRACT
IS QUANTITATIVE DATA-DRIVEN INSTRUCTION APPROPRIATE
IN VISUAL ARTS EDUCATION?
By
Stephanie T. Butler
May 2015
The use of quantitative Data-Driven Instruction and Assessment in the visual arts curriculum could affect student creativity if employed within the visual arts, a content area that relies primarily on qualitative pedagogy and assessment. In this paper I examine the effect upon measured creativity of the use of Quantitative Data-Driven Assessment compared to the use of Authentic Assessment in the Visual Arts curriculum.
This initial experimental research exposed eighth grade Visual Arts students to Authentic Assessment in one group and Quantitative Data-Driven Assessment in another. Two experiments were conducted from the results. In the first experiment, the posttest artworks of both groups are compared for mean creativity scores as judged by an independent expert panel of Art Educators. The second experiment compares gains in pretest/posttest creativity as assessed by the teacher, and gains in mean creativity scores are compared between groups. Differences in assessment motivation are discussed as possible influencing factors.
IS QUANTITATIVE DATA-DRIVEN INSTRUCTION APPROPRIATE
IN VISUAL ARTS EDUCATION?
A THESIS
Presented to the School of Art
California State University, Long Beach
In Partial Fulfillment
of the Requirements for the Degree
Master of Arts in Art Education
Committee Members:
Carlos Silveira, Ph.D. (Chair)
Laurie Gatlin, Ph.D.
Erin M. Craig, Ed.D.
College Designee:
Karen Kleinfelder, Ph.D.
By Stephanie T. Butler
B.A., 2005, California State University, Long Beach
May 2015
ACKNOWLEDGEMENTS
I am humbled with gratitude for the encouragement and support of many people
during this process of research and discovery. My dynamic and brilliant committee of
extraordinary educators, Dr. Carlos Silveira, Dr. Laurie Gatlin, and Dr. Erin M. Craig,
have challenged and supported me, providing their expertise and unique perspectives to
help me focus much like a lens through which I was able to clarify my muddied thoughts.
My advisor Carlos has influenced my education and professional life for over a
decade. I admire his energy, respect his honesty and integrity, and cherish his sacred
appreciation for the value and transformative power of Art Education. His passion for
the beauty within the process of research has inspired me to persevere through this
process. Erin has balanced this process with a logical, literal, linear focus to temper my
holistic, intuitive, divergent mind. Laurie has been constant in centering the ideas and
discussions to the process of art, the process of learning. My classmate and colleague,
Roxana Taboada-Pena has been a phenomenal friend, and study-buddy. The ladies in the
Thesis Office, the Librarians and Graduate Advisor, Rebecca Sittler at CSULB have been
so very helpful, responsive, and kind. Thank you all.
I humbly thank my administrators at Jeffrey Trail Middle School including
Principal Scott Bowman and Vice Principal Kimberly Cardoza for encouraging me to
conduct this initial experimental research and investigate ideas that will guide my
practice as an educator, and help me bring a high quality arts education to our students. I
am ever grateful to Music Educator Thomas Kahelin for guiding me toward some very
relevant creativity research, and to Nancy Karamanos for helping to research and clarify our population data. I am blessed and grateful each day to have the best job in
the world largely because of those with whom I work each day, namely my colleagues at
Jeffrey Trail Middle School and especially our outstanding students.
Special thanks are also in order to the National Art Education Association, and the
California Art Education Association for their advocacy, research, and conferences in the
field of Art Education. Also, thank you to the State Education Agency Directors of Arts Education for your support and approval of use of the National Core Arts Standards in
this paper.
Mostly, I profoundly thank my husband, Tom, for his unfailing support and
encouragement, for his insights as an educator and testing coordinator, for being a
phenomenal husband to me and caring father to our wonderful sons, Nathan and
Benjamin. Thank you, Tom. “2 – 5”
TABLE OF CONTENTS

ACKNOWLEDGEMENTS

LIST OF TABLES

LIST OF FIGURES

CHAPTER

1. THE PROBLEM
     Introduction and Statement of the Problem
     Purpose of the Study
     Need for the Study
     Definition of Terms
     Assumptions and Limitations

2. REVIEW OF LITERATURE
     Introduction
     Preface: The Purpose of Creativity in Arts Education, Specific to This Study
     Assessment of Creativity
     Authentic Assessment
     Quantitative Data-Driven Instruction and Assessment
     Summary and Conclusions
          Assessment of Creativity Summary and Conclusions
          Authentic Assessment Summary and Conclusions
          Quantitative Data-Driven Instruction and Assessment Summary and Conclusions

3. METHODOLOGY
     Instrument A: Expert Panel Posttest Only Experiment Hypothesis
     Instrument B: Creativity Pretest/Posttest Gains Experiment Hypothesis
     Research Procedure
          Test Groups
          Population and Sample
     The Design
     Instruments
     Independent, Dependent, and Controlled Variables
     Treatments
          Art Experience Survey Results
          The Unit of Visual Arts Instruction
          Lesson plan 1: tar-paper slab construction clay projects
               Production process
               Discussion board A
               Reflection
          Lesson plan 2: Ceramics surface treatment techniques
               Production process
               Treatment
               Discussion board B
               Discussion board C
               Reflection

4. RESULTS
     Instrument A: Expert Panel Posttest Only Results
     Instrument B: Creativity Gains Results

5. CONCLUSIONS
     General Conclusions
     Limitations of The Study
     Discussion of Hypotheses
          Instrument A: Expert Panel Posttest Only Discussion of Hypotheses
          Instrument B: Creativity Gains Discussion of Hypotheses
     Implications For Implementation of Quantitative Data-Driven Assessment in The Visual Arts, and Upon Creativity

APPENDICES

A. DISCUSSION BOARD “A” RESPONSES, CONTROL GROUP
B. DISCUSSION BOARD “A” RESPONSES, EXPERIMENTAL GROUP
C. DISCUSSION BOARD “B” RESPONSES, CONTROL GROUP
D. DISCUSSION BOARD “B” RESPONSES, EXPERIMENTAL GROUP
E. DISCUSSION BOARD “C” RESPONSES, CONTROL GROUP
F. DISCUSSION BOARD “C” RESPONSES, EXPERIMENTAL GROUP
G. TAR PAPER LESSON PLAN
H. PRETEST CRITIQUE AND ASSESSMENT
I. POSTTEST CRITIQUE AND ASSESSMENT
J. STUDENT WORK IMAGES

BIBLIOGRAPHY
LIST OF TABLES

TABLE

3.1. Butler Experimental Research Methodology
3.2. Art Experiences Survey
4.1. Expert Panel Results Summary
LIST OF FIGURES

FIGURE

4.1. Overall expert panel posttest only results
4.2. Null hypothesis 1 ANOVA results
4.3. Null hypothesis 1 ANOVA results boxplot
4.4. Null hypothesis 2 ANOVA results
4.5. Null hypothesis 2 ANOVA results boxplot
4.6. Null hypothesis 3 ANOVA results
4.7. Null hypothesis 3 ANOVA results boxplot
4.8. Null hypothesis 4 ANOVA results
4.9. Null hypothesis 4 ANOVA results boxplot
4.10. Instrument B null hypothesis 5 ANOVA results
4.11. Instrument B null hypothesis 5 ANOVA table results
4.12. Instrument B null hypothesis 6 ANOVA results
4.13. Instrument B null hypothesis 6 ANOVA table results
4.14. Instrument B null hypothesis 7 ANOVA results
4.15. Instrument B null hypothesis 7 ANOVA table results
4.16. Instrument B null hypothesis 8 ANOVA results
4.17. Instrument B null hypothesis 8 ANOVA table results
CHAPTER 1
THE PROBLEM
Introduction and Statement of the Problem
Since the No Child Left Behind mandate of 2001, Data-Driven Instruction and Assessment has become a widely used method of assessing student achievement in classrooms. In attempts at uniform implementation across schools and districts, some administrators have employed Data-Driven Instruction and Assessment school-wide, across the entire scholastic curriculum. Because of the predominantly quantitative requirements for collecting and analyzing assessment data, employing quantitative Data-Driven Instruction and Assessment in the visual arts curriculum could produce a difference in the creativity of student work. If that difference amounts to a limitation or measured decline in creativity, employing quantitative Data-Driven Instruction and Assessment could limit and excessively narrow the quality of the visual arts curriculum, a content area that relies primarily on qualitative pedagogy and assessment, and one in which demonstrated creativity in student process and product is among the desired outcomes (Zimmerman, 2010a).
One of the most important outcomes of an education with art is encouraging “creativity and innovative thinking in young minds” (Dwyer, 2011, President’s Committee on the Arts and the Humanities [PCAH], p. 8). Creativity as an applied skill in employees has been named as highly desired by employers, surpassing the
importance of basic knowledge and ranking in the top five of applied skills (Ruppert, 2010). While Art Education does not hold a monopoly on teaching creativity within the schools, it is one of the core academic subjects in which the success of the art program is continually approved or condemned by the creativity of the curriculum and the demonstrated creativity of the student work produced, the end product. Art Education curricula regularly use creativity as a vehicle for acquiring and practicing the higher-order thinking and communication skills that educational reformers want students to attain (U.S. Department of Education, National Assessment of Educational Progress [NAEP], 2008). Therefore, not only is creativity a large part of the Art Education end product, it is integral to the process of making, and also to the pedagogy and curriculum. Because of Art Education’s symbiotic relationship with creativity, and the recognition that creativity is a major part of what students aspire to practice and demonstrate in a Visual Arts course, the author posits that it is essential to this study to investigate creativity and its relationship to Visual Arts pedagogical and assessment methodologies.
In quantitative Data-Driven Instruction and Assessment, students take formative
and summative assessments that are aligned to specific academic standards (state or
national standards in the pre-Common Core era, or now the electively adopted Common
Core State Standards). The assessments are designed by the teacher, test designer, or
textbook publisher to be closely representative of, if not modeled directly after, content
on state and national standardized achievement tests. The driving focus in data-driven
instruction is the creation and implementation of the assessment, with the pedagogy
and curriculum aligned to these tests in order to attain higher assessment results
(Bambrick-Santoyo, 2010; Mertler, 2014; Sindelar, 2011).
Data-driven tests are mostly quantitative in nature, using true/false, multiple-choice, and scored-rubric items to assess student mastery of standards. When used along with bubble-style test sheets and optical character recognition (OCR) software, the data-driven instruction software compiles student score data by overall results, mastery of specific standards, and progress over time. Frequent assessments are recommended to build a record of results over time. Also key to the success of Data-Driven Instruction is the alignment of the assessments and curricula to the content standards.
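As a rough, hypothetical illustration of the kind of per-standard report such software produces (the data layout, standard codes, and field names below are invented for this sketch, not taken from any particular product or from this study), the following Python sketch tallies a class's answers against a key that maps each question to a content standard and reports the percent correct per standard.

from collections import defaultdict

# Hypothetical answer key: question number -> (correct choice, content standard code)
answer_key = {
    1: ("B", "VA.8.1.1"),
    2: ("D", "VA.8.1.1"),
    3: ("A", "VA.8.2.3"),
    4: ("C", "VA.8.2.3"),
}

# Hypothetical student responses, as an optical scanner might report them
responses = {
    "student_01": {1: "B", 2: "D", 3: "C", 4: "C"},
    "student_02": {1: "A", 2: "D", 3: "A", 4: "B"},
}

def mastery_by_standard(answer_key, responses):
    """Return the percent of answers correct per content standard, across all students."""
    correct = defaultdict(int)
    attempted = defaultdict(int)
    for answers in responses.values():
        for question, choice in answers.items():
            right_choice, standard = answer_key[question]
            attempted[standard] += 1
            if choice == right_choice:
                correct[standard] += 1
    return {std: 100.0 * correct[std] / attempted[std] for std in attempted}

print(mastery_by_standard(answer_key, responses))
# {'VA.8.1.1': 75.0, 'VA.8.2.3': 50.0}

A report of this shape is what allows the per-standard grouping and longitudinal tracking described in the next paragraph.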
Data-Driven Instruction and Assessment allows this data to be shared among all stakeholders: teachers, students, parents, administrators, and so forth. The immediate, quantitative feedback provides educators with the opportunity to group low performers by content area or standard, and to design and provide targeted student interventions. Pedagogical shifts can be made after the teacher reflects on what the report of student scores demonstrates. If implemented by the teacher with an opportunity for error analysis by the students, students can reflect on their own mastery of content standards, reconsidering patterns of thought that may have led to the initial errors specific to the content standard. Longitudinal score data can be an opportunity for student reflection on their own progress toward standards mastery.
This quantitative true/false, multiple-choice, or scored-rubric analysis employed in quantitative Data-Driven Instruction and Assessment is appropriate for content areas that are quantitative and convergent by nature; quantitative and convergent subjects usually present one right answer to attain. The score reflects whether the student’s
answer was correct or incorrect. According to the data collected, the student either
mastered the content standard linked to that test question or did not.
In content areas that are dominantly qualitative and divergent in nature, as the Visual Arts are, there is an opportunity to examine the suitability of this type of assessment as implemented within the Visual Arts. Specific to the Visual Arts curriculum in California, there is elasticity within the California Visual and Performing Arts content standards and the newly released (at the time of this initial experimental research) National Core Arts Standards to allow for pedagogical shifts responding to fluid and ever-changing cultural and political dominances and shifts in aesthetic philosophies, and even to allow for fluctuations in the availability of supplies and media or in the budget specific to each individual classroom (CDE, VAPA, 2001; NCAS, 2014).
The experience of teaching and learning within the visual arts is contextual according to medium, technique, aesthetic philosophies, cultural influences, and cultural/societal mores. The elasticity of the California State Visual and Performing Arts Standards and of the National Core Arts Standards is necessary due to the dynamic nature of all the arts, and to meet the needs of the students and the educators’ immediate community. The California State Superintendent of Public Instruction encourages educators to use the standards to design the curriculum and subsequent instructional strategies to fit the needs of their local communities (CDE, VAPA, 2001; NCAS, 2014).
Learning within the visual arts employs a style of analysis and thinking that is primarily divergent in nature. Often there are many possible correct answers to the questions asked in a visual arts curriculum. Since part of the aim and purpose of education in the Visual Arts is to build creativity and encourage communication and innovation, skills “such as problem solving, creative thinking, effective planning, time management, teamwork, effective communication, and an understanding of technology,” according to Ruth E. Green, President of the California State Board of Education (CDE, VAPA, 2001, p. v), all of which are divergent and qualitative in instructional nature, is it possible that assessing the Visual Arts through convergent, quantitative measures would be inappropriate for seeking accurate data from which to drive instruction and pedagogical shifts? “Creative learning is multidimensional. It challenges established thinking and practice in assessment, requiring a creative solution” (Ellis, 2008, p. 1).
Shannon Pella argues that quantitative data obtained in data-driven instruction is
not sufficient in qualitative subjects. “The information from a test score is general and
vague and often reveals little about what could have been done differently during
instruction. …The quantitative data alone from this data-driven pedagogy is insufficient
in developing a responsive pedagogy” (Pella, 2012, p. 59).
Since the aims and pedagogies of the Visual Arts overall are often to encourage
divergent thinking, to increase creativity and innovation (Zimmerman, 2010a), it is
possible that the quantitative and convergent data collected from quantitative and
convergent Data-Driven Instruction and Assessment could be extraneous to Art
Educators and counterproductive for desired outcomes for student learning and
achievement.
Perhaps a case can be made for revising the data collection and analysis tools and
software for the purpose of collecting and analyzing the qualitative data readily available
and regularly employed within the Visual Arts curriculum. People with different work
roles have different data needs. Coburn and Talbert advocate for collecting and
employing a wide range of data in order to answer all the questions that different people
confront in different work roles (Coburn & Talbert, 2006).
For the purpose of this initial experimental research, I will examine the suitability
of the use of quantitative data-driven instruction and assessment in the Visual Arts
curriculum.
Purpose of the Study
Initial experimental research was conducted involving eighth grade Visual Arts students in a classroom setting, whose work was compared for creativity as measured in two ways: contextually and consensually. The purpose of this study is two-fold:
1. To compare creativity between a control group utilizing Authentic Assessment and an experimental group using Quantitative Data-Driven Assessment, through an initial experimental study involving an expert panel of appropriate observers.
2. To compare gains in mean creativity scores between a control group utilizing Authentic Assessment and an experimental group using Quantitative Data-Driven Assessment, through an initial experimental study utilizing pretest and posttest creativity scores from teacher evaluations (a sketch of this kind of between-groups comparison follows this list).
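Chapter 4 reports these between-group comparisons as one-way ANOVA results. As a minimal, hypothetical illustration only (the creativity ratings below are invented and are not the study's data), such a comparison of group means could be run in Python with scipy.stats.f_oneway:

from scipy import stats

# Hypothetical creativity ratings (e.g., on a 1-5 scale) for posttest artworks
control_scores = [3.2, 4.0, 3.5, 3.8, 4.1, 3.6]        # Authentic Assessment group
experimental_scores = [3.0, 3.4, 3.1, 3.7, 3.3, 3.5]   # Quantitative Data-Driven group

# One-way ANOVA; with two groups this is equivalent to an independent-samples t test
f_stat, p_value = stats.f_oneway(control_scores, experimental_scores)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
# A p-value below the chosen alpha (e.g., 0.05) would reject the null hypothesis
# that the two group means are equal.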
Need for the Study
The findings of this study will be useful for students, parents, Art Educators, educators of other subject areas that rely primarily upon qualitative assessment, School Administrators, Curricular Specialists, school stakeholders, post-secondary education institutions, and employers. It is anticipated that the results from this initial experimental research will assist the development of a data collection and analysis tool
specific to visual arts education, supporting Art Educators’ continued use of best practices for visual arts instruction.
Although this initial experimental research specifically addresses the immediate concerns of the author’s inquiry, namely California Visual Arts Education in grades 7 to 12, with a focus on the California State Standards and the implementation of the recently released National Core Arts Standards (NCAS, 2014), the information found could be relevant across the varied Visual and Performing Arts disciplines and other educational areas in which Authentic Assessment is currently the dominant form of assessment.
Definition of Terms
The following definitions are provided for key terms used in this study:
Data-Driven Instruction and Assessment: A quantitative assessment tool used to gather, score, and compile data reports on student performance on teacher-created
questions directly linked to the content standards. Such tools may include but are not
limited to Data Director, Illuminate, TetraData, EDmin, Cognos, and Schoolnet.
Quantitative assessment: Assessments in which responses can be readily
measured against a standard of correct or incorrect. Most quantitative assessment data
are answers in the form of true/false, or Multiple Choice.
Qualitative assessment: Assessments in which responses can be readily measured
contextually, and subjectively. Qualitative assessment data can take the form of the
following, but are not limited to: analysis of student interviews, group critiques, journal
writing, and concept mapping.
Authentic Assessment: The qualitative style of assessment used throughout Art Education, including the Visual Arts, involving the measurement of “intellectual
accomplishments that are worthwhile, significant, and meaningful, as compared to
multiple choice standardized tests” (Wehlage, Newmann, & Secada, 1996, p. 23).
Standards-Based Grading: A grading system in which all assessment grades are
tied directly to mastery of a pre-defined content standard. When standards-based grading is utilized, performance objectives demonstrate students’ level of mastery of the particular content standard.
Creativity: The experience of creating something new and valuable that transcends traditional ideas, rules, patterns, and relationships. The creation can be meaningful new ideas, forms, methods, interpretations, processes, or products.
Consensual Definition of Creativity:
A product or response is creative to the extent that appropriate observers
independently agree it is creative. Appropriate observers are those familiar with
the domain in which the product was created or the response articulated. Thus,
creativity can be regarded as the quality of product or responses judged to be
creative by appropriate observers, and it can also be regarded as the process by
which something so judged is produced. (Amabile, 1983, p. 357)
Contextual Definition of Creativity:
A product or response will be judged as creative to the extent that ( a ) it is both a
novel and appropriate, useful, correct or valuable response to the task at hand, and
( b ) the task is heuristic rather than algorithmic. (Amabile, 1983, p. 369)
Visual Arts Education: The two-dimensional and three-dimensional media of Art
Education addressing primarily the Visual Arts, including but not limited to Drawing,
Painting, Sculpting, Ceramics, Graphic Design, Printmaking, Digital Media, Film, Fibers,
Jewelry, Wood, Metals, Electronic Media, Art History, Art Criticism, Aesthetic
Philosophy.
Convergent thinking: The thought process of sorting through many options and
processes to find one right answer. Joy Paul Guilford (1957) defines it as giving the
“correct” answer to questions that do not require considerable creativity.
Divergent thinking: The thought process of exploring multiple options and
processes to find many possible right answers. A process used to generate ideas by
exploring multiple possible solutions.
Medium: In the Visual Arts, the materials specific to the product of an art work.
The properties and employment of technique can vary within a group of media, adding
further considerations while assessing student work in the visual arts. For example,
within the realm of painting, oil paints require different supplies and the employment of
different techniques from the supplies required and techniques used in watercolor paints,
acrylic paints, and gouache and tempera paints. Similar considerations and differences
are found throughout all groupings of medium within the Visual Arts.
Technique: The medium-specific procedures employed when creating works in
the Visual Arts.
Aesthetic Philosophy: The cultural and philosophical lens through which a person
views, analyzes, values, and makes judgments about a work of art.
Cultural Mores: The dominant values and social preferences within a culture.
Bloom’s Taxonomy: A classification of learning objectives educators set for
students in order to create a more holistic education, with emphasis on cognitive,
affective, and psychomotor domains (Bloom, Engelhart, Furst, Hill, & Krathwohl, 1956).
Anderson & Krathwohl’s Revised Taxonomy: A 2001 revision of Bloom’s
Taxonomy to define and delineate the Knowledge Dimension of learning’s relationship to
the Cognitive Process of learning (Anderson, Krathwohl, & Bloom, 2001).
Heer Combined Taxonomy Model: A model of taxonomy defining learning
objectives exhibited by students when the cognitive process level of learning intersects
with the knowledge dimension of learning (Heer, 2012).
CA VAPA Standards (California Visual and Performing Arts Standards): A
framework of standards adopted by the California State Board of Education for visual
and performing arts (CDE, VAPA, 2001).
CCSS (Common Core State Standards): The framework of core conceptual
understandings and procedures communicating what is expected of students at each grade
level, in each subject (Common Core State Standards, 2010).
National Core Arts Standards: The framework of core concepts, philosophies,
structures, and outcomes for student achievement in Visual and Performing Arts (State
Education Agency Directors of Arts Education, 2014).
Visual Literacy: The five-stage scale developed by Abigail Housen and Philip Yenawine upon which people experiencing works of art can be ranked according to their levels of response, appreciation, and interaction with the work of art (Housen, 2007).
Stage Descriptions are quoted directly from Housen’s descriptions, as follows:
Stage 1: Accountive: Accountive viewers are storytellers. Using their
senses, memories, and personal associations, they make concrete observations
about a work of art that are woven into a narrative. Here, judgments are based on
what is known and what is liked. Emotions color viewers' comments, as they
seem to enter the work of art and become part of its unfolding narrative.
Stage 2: Constructive: Constructive viewers set about building a
framework for looking at works of art, using the most logical and accessible tools:
their own perceptions, their knowledge of the natural world, and the values of
their social, moral and conventional world. If the work does not look the way it is
supposed to, if craft, skill, technique, hard work, utility, and function are not
evident, or if the subject seems inappropriate, then these viewers judge the work
to be weird, lacking, or of no value. Their sense of what is realistic is the standard
often applied to determine value. As emotions begin to go underground, these
viewers begin to distance themselves from the work of art.
Stage 3: Classifying: Classifying viewers adopt the analytical and
critical stance of the art historian. They want to identify the work as to place,
school, style, time and provenance. They decode the work using their library of
facts and figures which they are ready and eager to expand. This viewer believes
that properly categorized, the work of art's meaning and message can be explained
and rationalized.
Stage 4: Interpretive: Interpretive viewers seek a personal encounter
with a work of art. Exploring the work, letting its meaning slowly unfold, they
appreciate subtleties of line and shape and color. Now critical skills are put in the
service of feelings and intuitions as these viewers let underlying meanings of the work, what it symbolizes, emerge. Each new encounter with a work of art presents
a chance for new comparisons, insights, and experiences. Knowing that the work
of art's identity and value are subject to reinterpretation, these viewers see their
own processes subject to chance and change.
Stage 5: Re-Creative: Re-creative viewers, having a long history of
viewing and reflecting about works of art, now willingly suspend disbelief. A
familiar painting is like an old friend who is known intimately, yet full of surprise,
deserving attention on a daily level but also existing on an elevated plane. As in
all important friendships, time is a key ingredient, allowing Stage 5 viewers to
know the ecology of a work — its time, its history, its questions, its travels, its
intricacies. Drawing on their own history with one work in particular, and with
viewing in general, these viewers combine personal contemplation with views
that broadly encompass universal concerns. Here, memory infuses the landscape
of the painting, intricately combining the personal and the universal. (p. 173)
Assumptions and Limitations
This initial experimental research specifically addresses the immediate concerns
of the author’s inquiry: California Visual Arts Education in grades 7 to 12, with focus on
the California State Standards and the implementation of the recently released National
Core Arts Standards (NCAS, 2014). The initial experimental research focuses on the
impact of these methods of assessment on measured creativity as demonstrated in student
work.
For the purpose of this study, it was assumed that the Expert Panel Posttest Only
Experiment instrument would be finalized consensually just prior to creativity assessment
by the expert panel primarily to elicit a creativity rating scale that was contextual to the
task of the student art projects (posttest artworks).
Furthermore, it was assumed that teacher perceptions of demonstrated creativity
were equally valuable as data to be examined in determining creativity in the Creativity
Pretest/Posttest Gains Experiment.
The design chosen for the Expert Panel Posttest Only Experiment was the Quasi-
Experimental Non-Randomized Posttest Only Design (Campbell & Stanley, 1963). The
design chosen for the Pretest/Posttest Gains Experiment was the Nonrandomized Pretest-
Posttest Design (Van Dalen & Meyer, 1979).
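For the Pretest/Posttest Gains Experiment, the comparison rests on simple gain scores (posttest creativity rating minus pretest creativity rating) averaged within each group. A minimal Python sketch of that calculation, using invented ratings and a hypothetical 1-5 teacher scale rather than the study's actual instrument, might look like this:

# Hypothetical teacher-assigned creativity ratings (1-5 scale) per student
control = {"pretest": [2.5, 3.0, 2.0, 3.5], "posttest": [3.5, 4.0, 3.0, 4.0]}
experimental = {"pretest": [2.5, 3.0, 2.5, 3.0], "posttest": [3.0, 3.5, 2.5, 3.5]}

def mean_gain(group):
    """Average of (posttest - pretest) across students in a group."""
    gains = [post - pre for pre, post in zip(group["pretest"], group["posttest"])]
    return sum(gains) / len(gains)

print("Control mean gain:", mean_gain(control))            # 0.875
print("Experimental mean gain:", mean_gain(experimental))  # 0.375

The resulting group mean gains are then compared, as reported for Instrument B in Chapters 4 and 5.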
Despite the fact that randomization of subject selection would offer a more
accurate experimental control, this author chose non-randomization through the use of
intact classes for experimental and control groups. The main reasons for this decision
are:
1. The site administrator’s reluctance to disrupt school schedules (Van Dalen & Meyer, 1979); and
2. The use of intact classes is more suitable when one of the priorities of the study is the subjects’ unawareness of the experiment (Van Dalen & Meyer, 1979).
Student posttest project samples ( n ) were affected by omitted samples. Some student posttest work that had been judged by the panelists had been very recently completed and was still in a “green ware” state: un-fired clay that is very fragile and still malleable if it is at all wet. Green ware, when dry, is extremely fragile and easily broken. Project explosions in the kiln during firing throughout the last week of the treatment lesson, due to inclement weather and other unforeseen circumstances, negatively affected the quantity of projects in both the control group and experimental group results samples. In two separate kiln firings, a student had left tarpaper on a green ware project and covered the tarpaper with clay to hide it. It is suspected that in these two kiln firings the green ware with tarpaper emitted steam and exploded, in such a way that the shrapnel broke many nearby projects in the kiln.
In spite of the setbacks, students felt compelled to participate in the panel gallery
showing, and had rushed to create last-minute green ware for inclusion into the gallery
exhibit for the expert panelists. The expert panel rated the green ware, but the data
collected from the green ware was omitted from the results for this initial experimental
research. The data for the Expert Panel Posttest Only Experiment reflects the absence of
these omitted works.
It was decided that, despite the study’s limitations and due to its potential importance to the field of Art Education, this study should be conducted as an initial
experimental study, which will offer insights for continued research.
CHAPTER 2
REVIEW OF LITERATURE
Introduction
The purpose of this literature review is to examine relevant studies pertaining to Quantitative Data-Driven Instruction and Assessment and to qualitative Authentic Assessment, particular to the pedagogy and assessment of visual arts education. It will be
demonstrated how the proposed research is related to previous investigations, and how it
can make a unique contribution in the field of Art Education. This chapter will follow a
sequential structure--from comprehensive to specific--with divisions identified with the
following headings:
A: Assessment of Creativity. --This section will provide information based on
recent studies pertaining to the nature of Creativity as it pertains to the Visual Arts, and
the assessment of creativity. Focus will be placed upon the benefits and limitations of
creativity, and how one can appropriately assess creativity within the Visual Arts.
B: Authentic Assessment. --This section will provide information based on recent
studies pertaining to the nature of authentic assessment, relevant to Art Education and
particular to the visual arts. Divergent thinking will be discussed. Focus will be placed
upon the benefits and limitations of this style of pedagogy and assessment, and the
relevant value of the data that is collected.
C: Quantitative Data-Driven Instruction and Assessment. --This section will
provide information based on recent studies pertaining to the nature of Quantitative Data-
Driven Instruction and Assessment. Convergent thinking will be discussed. Focus will
be placed upon the benefits and limitations of this style of pedagogy and assessment, and
the relevant value to educators and students of the data that is collected.
Preface: The Purpose of Creativity in Arts Education, Specific to this Study
One of the most important outcomes of an education with art is encouraging “creativity and innovative thinking in young minds” (Dwyer, PCAH, 2011, p. 8). Creativity as an applied skill in employees has been named as highly desired by employers, surpassing the importance of basic knowledge and ranking in the top five of applied skills (Ruppert, 2010). It is important to acknowledge that Art Education does not hold a monopoly on teaching creativity within the schools. However, it is one of the core academic subjects in which the success of the art program is continually approved or condemned by the creativity of the curriculum and the creativity of the student work produced, the end product. Art Education curricula regularly use creativity as a vehicle for acquiring and practicing the higher-order thinking and communication skills that educational reformers want students to attain (NAEP, 2008). Therefore, not only is creativity a large part of the Art Education end product, it is integral to the process of making, and also to the pedagogy and curriculum. Because of the field of Art Education’s symbiotic relationship with creativity, and the recognition that creativity is a major part of what students aspire to practice and demonstrate in a Visual Arts course, the author posits that it is essential to this study to
investigate creativity and its relationship to Visual Arts pedagogical and assessment
methodologies.
Assessment of Creativity
Through an examination of recent literature pertaining to creativity, this section will provide information about the nature of creativity, specific to the assessment of creativity, with a particular focus on the process and products of the visual arts in secondary schools.
It is important to recognize the breadth of scope in the field of creativity research,
as psychological research, as neurocognitive research, as systems analysis and work flow
research, and as personal trait or discipline habit. Presently, over 100 definitions of
creativity have been documented from literature and research (Treffinger, 2002). For the
purposes of this study, which are very specific, the definitions agreed upon and utilized
for this research will be the ones most recognized within the field of Art Education, for
the purpose of education within the Visual Arts classroom and evaluation of assessment
techniques for their suitability of use in the Visual Arts classroom.
Although much of the research in the field of creativity focuses on personal traits
and divergent thinking, creativity is used in this study as an outcome that is to be
measured in process and product, assisting in determining the suitability of
implementing Quantitative Data-Driven Assessment in Visual Arts assessment processes.
Mihaly Csikszentmihalyi clearly characterizes creativity as a social process across
the three related components defining creativity: domain, field, and individual.
Csikszentmihalyi defines creativity as “an act, idea, or product that changes an existing
domain, or that transforms an existing domain into a new one” (Csikszentmihalyi, 1996,
p. 28). Creativity affects a domain when the creative response alters the set of rules,
symbols, and procedures that govern the domain. Examples of a domain would be
mathematics or biology or, pertinent to this experimental research, art.
Within the domain is the field: the people acting as “gatekeepers” who decide which
changes to the domain will be included or rejected. Within the field are the individuals,
the people identifying or generating the novelty selected by the field to be included in the
domain. “A domain cannot be changed without the explicit or expressed consent of a
field responsible for it” (Csikszentmihalyi, 1996, p. 28). For the purpose of this
experimental research, the author finds it important to include other appropriate observers
within the field of Art Education in the assessment of creativity. The field of
“gatekeepers” employed is an expert panel of art educators within the district of the
experiment sites. The expert panel will be explained in further detail in Chapter 3.
Teresa Amabile (1983, 1996) continues Csikszentmihalyi’s inquiry into the nature
of creativity, also recognizing that creativity cannot be defined nor experienced by the
individual alone. Amabile develops two ways to define creativity: Contextually and
Consensually.
In Contextual Creativity, the end product is evaluated for creativity as well as the
process of creating the end product. A product or response is judged as creative “to the
extent that ( a ) it is both a novel and appropriate, useful, correct or valuable response to
the task at hand, and ( b ) the task is heuristic rather than algorithmic” (Amabile, 1983, p.
358). It is within the context of the task that creativity is judged as useful, valuable, or
novel, and the process of creating the end product is valued as much as the end product itself.
The task must be a heuristic one, though.
The conceptual definition of creativity states that a creative response is a novel and
appropriate solution to a heuristic task. If the path to solution is clear and
straightforward, the task is an algorithmic one, and responses to it simply cannot be
considered creative. To allow responses that may be considered creative, the task
must be open-ended to some degree. Some search for solution paths is required.
Raymond Veon agrees that the skills identified as integral to creativity are non-
algorithmic and based in uncertainty. This complexity and uncertainty, when “not everything that bears on the task at hand is known,” develop and exercise the
metacognitive skills of higher-order thinking (Veon, 2014b). We do not assert higher
order thinking in someone when someone else directs the task at every step (Goldstein &
Ford, 2001). For the purpose of this initial experimental research, the experimenter finds
it important to keep the tasks of the instrument as heuristic as can be practically
accomplished.
Judging creativity consensually is much like an art critique, in which the
process is as important as the finished product. Amabile asserts that “appropriate
observers are those familiar with the domain in which the product was created or the
response articulated. Thus, creativity can be regarded as the quality of product or
responses judged to be creative by appropriate observers, and it can also be regarded as
the process by which something so judged is produced” (1983, p. 357). “A product or
response is creative to the extent that appropriate observers independently agree it is
creative”(Amabile, 1983, p. 357).
Because creativity occurs in a social context, researchers agree that some of the
evaluation of creativity needs to be done within this context (Csikszentmihalyi, 1999;
Terry & Mynatt, 2002; Terry, Mynatt, Nakakoji & Yamamoto, 2004). Amabile (1983,
1996) recognizes the importance of judging creativity by the process and product, and
also within the community of appropriate observers familiar with the domain. For the
purposes of this initial experimental research, the experimenter will follow Amabile’s
recommendations on testing hypotheses about creativity: “the task ( 1 ) is an open-ended
(heuristic) one; ( 2 ) does not depend heavily on special skills; and ( 3 ) is one in which
subjects actually make an observable product or a response that can be recorded and later
judged on creativity” (1983, p. 359).
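In practice, the consensual technique reduces to having each appropriate observer rate each product independently and then aggregating those independent ratings. A minimal Python sketch of that aggregation step is shown below; the panelist labels, artworks, and ratings are hypothetical and do not represent the expert panel instrument described in Chapter 3.

from statistics import mean

# Hypothetical independent creativity ratings (1-5) from an expert panel
panel_ratings = {
    "artwork_01": {"judge_A": 4, "judge_B": 5, "judge_C": 4},
    "artwork_02": {"judge_A": 2, "judge_B": 3, "judge_C": 2},
}

# Mean creativity score per artwork across independent judges
for artwork, ratings in panel_ratings.items():
    print(artwork, round(mean(ratings.values()), 2))
# artwork_01 4.33
# artwork_02 2.33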
Authentic Assessment
This section will provide information based on recent studies relevant to the nature
of qualitative authentic assessment in Art Education, particular to the visual arts.
Divergent thinking will be discussed. Focus will be placed upon the benefits and
limitations of this style of pedagogy and assessment, and the value of the data that is
collected.
Authentic Assessment is the “examination of student performance on worthy
intellectual tasks. Traditional assessment … relies on indirect or proxy 'items'--efficient,
simplistic substitutes from which we think valid inferences can be made about the
student's performance at those valued challenges” (Wiggins, 1990). It is a qualitative style of assessment used throughout Art Education, including the Visual Arts, involving the
measurement of "intellectual accomplishments that are worthwhile, significant, and
meaningful, as compared to multiple choice standardized tests” (Wehlage, et al., 1996, p.
23). Grant Wiggins characterizes the designs and uses of assessments as authentic when
the following characteristics are present (Wiggins, 1990):
( 1 ) Acquired knowledge is effectively performed by students.
( 2 ) A full array of tasks performed by students mirrors the best instructional practices.
( 3 ) Students can craft polished, thorough and justifiable answers, performances or products.
( 4 ) Assessments involve “ill-structured” challenges and roles reflecting the “complex ambiguities” of applied domain scenarios.
( 5 ) Assessments are enabling and forward-looking, not just reflective of prior teaching. (p. 3)
In Authentic Assessment, it is sufficient for an assessment to present some or most of these characteristics to be considered authentic. Lack of one or more of the vital characteristics will not invalidate the process of authentic assessment.
Wiggins (1990) further describes benefits of Authentic Assessment, such as that the teacher-mediated assessment itself is easily shareable and open to public examination. In the utilization of Authentic Assessment, essential intellectual abilities and habits of mind are prioritized over the achievement of one singular answer. Edward Chittenden (1991) adds that Authentic Assessment capitalizes on the actual work done within the classroom, negating some of the need for additional or quantitative assessments, and enhancing student and teacher cooperative involvement in evaluation.
Howard Gardner and Rieneke Zessoules (1990) explain that authentic assessment is an ongoing process in which the teacher monitors students’ expressions of thoughtfulness, creativity, curiosity, and self-directed independence. Students are
able to make use of a variety of skills they have learned in a variety of contexts. The
Authentic Assessment process employs divergent thinking in which many possible
correct solutions can be employed to demonstrate understanding. Divergent thinking is
the idea-generating process in which many unexpected connections are drawn.
Convergent thinking is the process of narrowing and organizing these ideas into a structured closure.
Limitations to Authentic Assessment include that the process is labor-intensive
and time-intensive. If assessed nationally or on a larger scale than the immediate locale,
the cost of Authentic Assessment can be considerable in comparison to older methods of
standardized assessment. (In the 1990 article, Gardner and Zessoules cite an estimated
cost of $2.00 per student for Authentic Assessment, compared to mere cents per student
using standardized testing. However, it is worth noting that the cost comparison now in
2015 would demonstrate little to no difference per student, especially taking into
consideration the start-up and maintenance costs of hardware, software, and Internet
access capabilities for test administration. These costs are repeated every few years as
upgrades and technological accommodations are required.) The suggestion to utilize
sampling with Authentic Assessment in a larger-scale assessment is raised by Wiggins (1990) to ameliorate the costs of a national-scale Authentic Assessment within a domain.
Gardner and Zessoules (1990) illustrate the comparison of Authentic Assessment
with the administration of the Grade 4 Manipulative Skills Test in New York State. In a
traditional standardized test, students would be required to demonstrate possession of
facts, and recitations of knowledge. The Manipulative Skills Test, an authentic
assessment, requires a hands-on demonstration of understanding, and a practical
application of domain-based skills. The Authentic Assessment is a reflective and cyclical
practice of how to think and work in the individual role, within the community of the
domain.
Both intrinsic and extrinsic motivations are employed in Authentic Assessment, as the teacher is continually a reflective practitioner and the student is an active participant in the assessment (Zessoules & Gardner, 1990). Numerous studies have found that students who are intrinsically motivated persist longer, conquer more challenges, and demonstrate greater accomplishments in their academic endeavors than those who are extrinsically motivated (Pintrich & Garcia, 1991).
Quantitative Data-Driven Instruction and Assessment
Through an examination of recent literature, this section will provide information
pertaining to the quantitative nature of Data-Driven Instruction and Assessment. Focus
will be placed upon the benefits and limitations of this style of pedagogy and assessment,
and the value of the quantitative data that is collected specific to its employment as a tool
for use within the visual arts classroom.
Data-Driven Instruction in classrooms has become a widely utilized method of
assessment of student achievement. Data-Driven Instruction is a pedagogical system
with the overall purpose of aligning the assessments of student achievement to the
curriculum. Focus is placed primarily on the assessments, secondarily on the instruction,
in order to improve student achievement scores (Bambrick-Santoyo, 2010; Mertler, 2014; Sindelar, 2010). “The practices of Data-Driven Instruction are inextricably bound up with the process of assessment” (Bambrick-Santoyo, 2010, loc. 647). “After we align our local curriculum and assessments to standards, we experience a gradual shift in our understanding and use of assessment” (Sindelar, 2010, loc. 339).
Citing the disconnect between curriculum and assessment, proponents of Data-
Driven Instruction state that the most important tenet of Data-Driven Instruction and
Assessment is to design standards-based assessments first, then plan the curriculum
backward to support achievement on the tests. “We use our quizzes and unit tests as
assessments for learning rather than assessments of learning. Rather than just putting
another grade in the grade book, we analyze test results, diagnose learning difficulties,
and identify the next steps we need to take to remediate our students’ weaknesses” (Sindelar, 2010, loc. 340).
In quantitative Data-Driven Instruction and Assessment, students take formative
and summative assessments that are aligned to specific academic standards (state or
national standards in the pre-Common Core era, or now the Common Core State
Standards). The assessments are designed to be closely representative of content that is
on state and national standardized achievement tests. When curriculum is separated from
assessment, assessment results will not fairly reflect what has been taught. “If the
curriculum scope and sequence do not precisely match the standards on the interim
assessments, then teachers will be teaching one thing, and assessing something else
altogether. Then any assessment results have no bearing on what actually happens in the
classroom” (Bambrick-Santoyo, 2010, loc. 465).
Test structures and questions are designed by the teacher, test designer, or
textbook publisher to represent student mastery of the designated content standards.
“When defining your learning targets, you need to have whatever standards you
want your students to learn close at hand. It’s also a good idea to review state test
data for your school to determine standards that are ‘key’ based on difficulty and
number of test items, as well as to identify areas where students’ scores are low”
(Sindelar, 2010, loc. 423).
Data-driven tests are mostly quantitative in nature, using true/false, multiple-choice, and scored-rubric items to assess student mastery of standards. The data-driven software collects scoring data by using an optical scanner to read student test answers recorded on a bubble-style answer document. The software compiles the score data and reports to educators, administrators, students, and parents the level of mastery of each standard attained by students at the time of the test.
Data-Driven Instruction and Assessment allows this data to be shared among all stakeholders: teachers, students, parents, administrators, and so forth. The immediate, quantitative feedback provides educators with the opportunity to group low performers by content area or standard, and to design and provide targeted student interventions. Pedagogical shifts can be made after the teacher reflects on what the report of student scores demonstrates. If implemented by the teacher with an opportunity for error analysis by the students, students can reflect on their own mastery of content standards, reconsidering patterns of thought that may have led to the initial errors specific to the content standard. Longitudinal score data can be an opportunity for student reflection on their own progress toward standards mastery.
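As a concrete, hypothetical illustration of the intervention-grouping step described above (the mastery cutoff, standard codes, and scores below are invented for this sketch, not drawn from any vendor's report or from this study), the following Python sketch flags, for each standard, the students scoring below a cutoff so they can be grouped for targeted re-teaching.

# Hypothetical per-student, per-standard percent-correct scores
scores = {
    "student_01": {"VA.8.1.1": 90, "VA.8.2.3": 40},
    "student_02": {"VA.8.1.1": 55, "VA.8.2.3": 85},
    "student_03": {"VA.8.1.1": 70, "VA.8.2.3": 45},
}

MASTERY_CUTOFF = 60  # hypothetical percent-correct threshold for "mastery"

def intervention_groups(scores, cutoff=MASTERY_CUTOFF):
    """Group students below the cutoff by the standard they have not yet mastered."""
    groups = {}
    for student, by_standard in scores.items():
        for standard, pct in by_standard.items():
            if pct < cutoff:
                groups.setdefault(standard, []).append(student)
    return groups

print(intervention_groups(scores))
# {'VA.8.2.3': ['student_01', 'student_03'], 'VA.8.1.1': ['student_02']}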
The limitations of quantitative Data-Driven Assessment occur frequently enough
that even the proponents of this assessment pedagogy predict failure of the system if all
ideal conditions are not successfully met at all times. Lack of one or more of the vital
characteristics will invalidate the process of Data-Driven Assessment. If the interim
assessments are less frequent than every six-to-eight weeks, if the results are delayed
more than forty-eight hours from the assessment administration, if teachers are not
provided time by the school for a quality data analysis, if teaching and analysis are
separated, or if there is a disconnect between curriculum and assessment, the entire system is unreliable and the data is worthless (Bambrick-Santoyo, 2010). “Making sweeping, important decisions about students or instruction on the basis of limited sets of data” (Russell & Airasian, 2012, in Mertler, 2014, loc. 45) can result in over-interpretation of the quantitative results, leading to errors in the system. Craig Mertler also describes a
loss of longitudinal focus with the over-use of Data-Driven Assessment, “if your
engagement in these processes progresses across multiple years, it will likely become
increasingly difficult to keep track of where you have been and to map out where you
might be headed” (2014, loc. 286).
When utilized correctly, the quantitative true/false, multiple-choice, or scored-rubric
analysis employed in quantitative Data-Driven Instruction and Assessment seems
appropriate for content areas that are quantitative and convergent by
nature, where there is usually only one right answer to attain. The score reflects whether
the student’s answer was correct or incorrect. According to the data collected, it is
assumed that the student either mastered the content standard linked to that test question
or did not.
“In addition to informing instructional practice, the use of software creates ease
and efficiency for teachers because tests are scored and analyzed in minutes, and
hand scoring and tallying of test results, which often takes hours of teacher time,
are eliminated” (Sindelar, 2010, loc. 261).
The value of collecting and using quantitative data within the Visual Arts
curriculum is at times extraneous for determining learning in a qualitatively assessed and
divergent subject such as the Visual Arts. Proponents of qualitative Data-Driven
Assessment support the inclusion of qualitative data with which to score students
quantitatively. Standards-based rubrics, formative assessments (including teacher
observations and student responses and reflections), and summative assessments
(including portfolios and performance-based assessments) are encouraged as “legitimate
and viable sources of student data for this process” (Mertler, 2014, loc. 37). “In the end, both
interim assessments and in-the-moment assessments are necessary and important”
(Bambrick-Santoyo, 2010, loc. 2780).
Some proponents of quantitative Data-Driven Assessment even contradict their own
emphasis on quantitative measures, acknowledging the value of qualitative data for use in
Data-Driven Assessment.
“Because of vocabulary and value-laden questions, some traditional assessment
tools may greatly underestimate the knowledge a student possesses. The results of
performance assessments tend to be more indicative of students’ actual
understanding of a concept or a skill, while forced-choice, short constructed-
response tests may provide a less valid data.” (Almeida, 2007, in Sindelar, 2010,
loc. 1343)
“Performance assessments, such as rubric-graded projects, labs, speeches, and
performances, all do a better job of assessing content knowledge rather than content
knowledge and the ability to communicate it in Standard English. When these
assessments include tools or artifacts that are used in a student’s everyday life, the
assessment’s ability to capture a student’s understanding of a concept or a problem
is also increased.” (Edd Taylor, in Beck, 2009, p. 94)
It is worthwhile to note that extrinsic motivations are employed in Quantitative
Data-Driven Assessment. Numerous studies have found that students motivated
extrinsically tend to focus on earning higher grades and on obtaining rewards and acceptance
from peers (Pintrich & Garcia, 1991). Some researchers have also posited that extrinsic
motivational factors can diminish students’ intrinsic motivation (Biehler & Snowman,
1990). If the ambition of the assessment is to increase student learning beyond the
immediacy of the assessment itself, consideration of the type of assessment and its
motivation is entirely appropriate.
The employment of standards-based rubrics is again prescribed as the bridge to
assess quantitatively a qualitative and divergent domain. The use of standards-based
curricula and assessment promotes the idea that students will learn the same information
and be assessed consistently by different raters over time (Sindelar, 2010).
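As a minimal sketch of this idea (the criteria and score levels below are invented, not the rubric used in any source cited here), a standards-based rubric can be represented as shared criteria with fixed score ranges, so that different raters score the same qualitative work on a common quantitative scale:

```python
# Illustrative sketch only: a standards-based rubric as shared criteria with
# fixed score levels, so that different raters produce comparable totals.
# The criteria and 0-5 score levels are hypothetical.

RUBRIC = {
    "Craftsmanship": (0, 5),              # criterion -> (minimum, maximum) score
    "Use of surface treatment": (0, 5),
    "Response to the task": (0, 5),
}

def score_work(ratings):
    """Validate one rater's scores against the rubric and return the total."""
    total = 0
    for criterion, (low, high) in RUBRIC.items():
        value = ratings[criterion]
        if not low <= value <= high:
            raise ValueError(f"{criterion} must be between {low} and {high}")
        total += value
    return total

rater_1 = {"Craftsmanship": 4, "Use of surface treatment": 3, "Response to the task": 5}
rater_2 = {"Craftsmanship": 4, "Use of surface treatment": 4, "Response to the task": 5}
print(score_work(rater_1), score_work(rater_2))   # 12 13
```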
Summary And Conclusions
The conclusions of this review of literature are divided into three major headings:
1. Assessment of Creativity
2. Authentic Assessment
3. Quantitative Data-Driven Instruction and Assessment
Assessment of Creativity Summary and Conclusions
This section provided the reader with a basic understanding of recent studies on
the nature of assessing creativity, relevant to Art Education, specific to the visual arts.
Creativity is heuristic, not algorithmic. Creativity is a social process that is defined in a
multi-faceted way, and assessment of creativity naturally follows a multi-faceted
approach. Creativity is properly assessed at the individual level and also at
the field “gatekeeper” level, within the context of the task and within the context of the
domain. Process and product are equally weighted in importance when considering the
assessment of creativity.
Authentic Assessment Summary and Conclusions
This section provided the reader with a basic understanding of recent studies on
the nature of Authentic Assessment, relevant to Art Education, specific to the visual arts.
Literature indicates that Authentic Assessment is more indicative of intellectual abilities
and of the demonstration of understanding. The underlying principles are that the habits of
mind and essential abilities of the student are prioritized over a recitation of knowledge.
Divergent thinking is practiced and encouraged. Uncertainty is necessary. Absence of
one characteristic of Authentic Assessment does not negate the validity of the system.
Quantitative Data-Driven Instruction and Assessment Summary and Conclusions
This section provided the reader with a basic understanding of recent studies on
the nature of quantitative Data-Driven Instruction and Assessment. Literature indicates
that Quantitative Data-Driven Assessment is a valuable instrument for improving student
achievement scores. Assessment is prioritized, with all curricular matters aligned to the
assessment and with content that appears on state and national standardized tests preferred
to content that does not. Alignment of the content to the test is key. Convergent thinking is
practiced and encouraged. Quantitative assessment practices prevail. Absence of
merely one of the characteristics negates the reliability of the entire system.
Based on what has been covered in this review of literature, the experimenter
draws three essential conclusions:
1. The appropriateness of employment of Quantitative Data-Driven Assessment
in the Visual Arts curricula has not been tested. Although employment of
Quantitative Data-Driven Assessment has been initiated in some school settings,
further research in this area is needed.
2. Extrinsic and intrinsic motivations can impact student achievement,
specifically in relation to the extent to which students are assessed qualitatively or
quantitatively. It is entirely worthwhile to examine the motivations employed
when examining the appropriateness of an assessment. Further research in the
area of assessment motivation is needed.
3. Creativity as a process and product is a valuable outcome of a Visual Arts
curriculum. The overall reticence from the field of Art Education to affirm and
declare this value is reflective of the former difficulties and ambiguities in
defining creativity and how to assess it. Recent analyses in the scholarly literature
demonstrate that assessment of creativity can be performed. With the renewal of
educational purposes and aims, including the implementation of the National Arts
Standards, and electively adopted Common Core Standards in Math and
Language Arts, an opportunity arises to incorporate creativity as an equally
important process and product that can be assessed and encouraged. While the
Visual Arts hold no monopoly over the encouragement of creativity, as previously
stated, the demonstration of creativity is part and parcel of what Art Educators do,
and how we are judged as effective educators.
There is clear evidence that further research in the area of the appropriateness of
quantitative assessment in qualitative subjects is needed. This study proposes to
contribute to the literature on an assessment’s effect upon demonstrated creativity,
relevant to Art Education, and specific to the Visual Arts.
CHAPTER 3
METHODOLOGY
From the previous review of the literature, two experiments were developed to
measure an assessment’s motivational impact on creativity of student art projects.
Instrument A for the first experiment is the Expert Panel Posttest Only Experiment.
Instrument B for the second experiment is the Creativity Pretest/Posttest Gains
Experiment. The methods of assessment, referred to here as Authentic Assessment, and
Quantitative Data-Driven Assessment (QDDA), represent the independent variable in
both initial experiments.
In both initial experiments the independent variable is the posttest assessment
motivation presented to students. The control group using Authentic Assessment was
presented with the motivation of “increasing creativity” from pretest to posttest. The
experimental group using Quantitative Data-Driven Assessment was presented with the
motivation of “increasing creativity scores” from pretest to posttest. The Lesson Plans,
three Discussion Boards, Pretest and Posttest Evaluation Rubrics, and assessment rubrics
were identical between control group using Authentic Assessment and experimental
group using QDDA. Only the posttest assessment motivation differed between groups.
Instrument A: Expert Panel Posttest Only Experiment Hypothesis
The first experiment is a quasi-experimental posttest-only design (Campbell &
Stanley, 1963). In the first experiment referred to as the Expert Panel Posttest Only
Experiment, an independent panel of appropriate expert observers who are familiar
with the task formed a consensual definition of creativity. The expert panel used its
consensually defined creativity criteria to judge student posttest-only artworks for
creativity. Expert panel judges have reviewed recent and relevant scholarly literature
about creativity narrowed specifically to the context of Art Education. Expert panel
judges have reviewed all student lesson information including the three Discussion
Boards, Pretest and Posttest Evaluation Rubrics, Lesson Plans, a Student-Generated
Creativity Rubric, and visual reference materials. Images of student pretest projects were
made available as requested by the expert panel.
The expert panel’s consensual creativity agreement was defined using four
criteria, which subsequently constituted and defined the following four hypotheses:
Hypothesis 1: It is hypothesized that an analysis of variance (ANOVA) will
determine a statistically significant difference in creativity between the control group
utilizing Authentic Assessment and experimental group utilizing Quantitative Data-
Driven Assessment (QDDA) posttest only artworks in the consensually panel-defined
“Work Synthesizes Ideas in Original and Surprising Ways” with experimental group
utilizing QDDA demonstrating significantly lower creativity scores in this category.
Hypothesis 2: It is hypothesized that an analysis of variance (ANOVA) will
determine a statistically significant difference in creativity between the control group
utilizing Authentic Assessment and experimental group utilizing Quantitative Data-
Driven Assessment (QDDA) in the consensually panel defined “Novel or Valuable
Response to the Task” with experimental group utilizing QDDA demonstrating
significantly lower creativity scores in this category.
Hypothesis 3: It is hypothesized that an analysis of variance (ANOVA) will
determine there will be a statistically significant difference in creativity between the
control group utilizing Authentic Assessment and experimental group utilizing
Quantitative Data-Driven Assessment (QDDA) in the consensually panel defined “Work
Is Improved from Pretest to Posttest” with experimental group utilizing QDDA
demonstrating significantly lower creativity scores in this category.
Hypothesis 4: It is hypothesized that an analysis of variance (ANOVA) will
determine there will be a statistically significant difference in creativity between the
control group utilizing Authentic Assessment and experimental group utilizing
Quantitative Data-Driven Assessment (QDDA) in the consensually panel defined “Work
Is Original from Others in the Class” with experimental group utilizing QDDA
demonstrating significantly lower creativity scores in this category.
Instrument B: Creativity Pretest/Posttest Gains Experiment Hypothesis
The second experiment is a measurement of gains in creativity scores from pretest
projects and posttest projects using data from Teacher Evaluations on the creativity
rubric.
A unit of instruction in Ceramics, specifically creating Tar-Paper Slab Projects,
was delivered. The control group received the Authentic Assessment, and the
experimental group received the Quantitative Data-Driven Assessment (QDDA). In both
initial experiments the independent variable is the posttest assessment motivation presented
to students. The control group using Authentic Assessment was presented with the
motivation of “increasing creativity” from pretest to posttest. The experimental group
using QDDA was presented with the motivation of “increasing creativity scores” from
pretest to posttest. The Lesson Plans, three Discussion Boards, Pretest and Posttest
Evaluation Rubrics, and assessment rubrics were identical between control group using
Authentic Assessment and experimental group using QDDA. Only the posttest
assessment motivation differed between groups. Results were then compared for gains in
creativity with pretest-posttest data from Teacher Evaluations.
The initial experimental research pretest/posttest gains experiment tested the
following four hypotheses in order to determine the impact of a method of assessment on
creativity gains between the control group using Authentic Assessment and the
experimental group using QDDA.
Hypothesis 5: It is hypothesized that an analysis of variance (ANOVA) will
determine a statistically significant decrease in creativity in the experimental group
utilizing Quantitative Data-Driven Assessment (QDDA) from pretest to posttest
demonstrated in Teacher Evaluations in the creativity category “Synthesis of Ideas In
Original and Surprising Ways”.
Hypothesis 6: It is hypothesized that an analysis of variance (ANOVA) will
determine a statistically significant decrease in creativity in the experimental group
utilizing Quantitative Data-Driven Assessment (QDDA) from pretest to posttest
demonstrated in Teacher Evaluations in the creativity category “Novel or Valuable
Response to the Task”.
Hypothesis 7: It is hypothesized that an analysis of variance (ANOVA) will
determine a statistically significant decrease in creativity in the experimental group
utilizing Quantitative Data-Driven Assessment (QDDA) from pretest to posttest
demonstrated in Teacher Evaluations in the creativity category “Work Makes Student
Ask New Questions to Build Upon An Idea”.
Hypothesis 8: It is hypothesized that an analysis of variance (ANOVA) will
determine a statistically significant decrease in creativity in the experimental group
utilizing Quantitative Data-Driven Assessment (QDDA) from pretest to posttest
demonstrated in Teacher Evaluations in the creativity category “Enables Student To
Discover, Learn Something Not Directly Instructed”.
Research Procedure
Test Groups
One control group and one experimental group were employed in this initial
experimental research. The control group was exposed to Authentic Assessment, and the
experimental group was exposed to the Quantitative Data-Driven Assessment (QDDA).
Both treatments will be explained in detail later in this chapter.
Population and Sample
At the time of this initial experiment, data on this school were continually evolving
due to the growth and development of a new school in an established district. The
student population at this school site is largely reflective of the district
under which it operates. The district educates a diverse population of more than 30,000
K-12 students in twenty-four elementary schools, six middle schools, four high schools
and two alternative education sites. As of the 2014-2015 academic year, the district has
fifteen Federally designated “Title I” schools, one of which is the site of this initial
experimental research, Jeffrey Trail Middle School. The district boasts nationally
recognized schools as student performance is well-above state and national comparisons;
and maintains comprehensive programs in academics, the arts, and athletics.
Approximately 12,260 students, or nearly 39 percent of students in the district, have a
native language other than English. Common native languages of students within the
school district include Arabic, Chinese, Farsi, Japanese, Korean, and Spanish. (IUSD,
2015) More than 5,300 students, or about 17 percent of students in the district are limited
in their English proficiency.
This initial experimental research took place in the fall of 2014 at Jeffrey Trail
Middle School located in Irvine, California. The school site for this initial experimental
research is a middle school in its second year of operation, with 981 students enrolled.
Of these students, 445 are in the 7th grade and 536 are in the 8th grade. Jeffrey Trail has received
US Federal “Title I” designation.
At the school site, only 71 7th grade students and 50 8th grade students are labeled
as English Language Learners. However, the district has in place a Newcomer Program
for English Language Learners scoring low on CELDT placement tests. The Newcomers
are taught with special support in their home language at a separate school site, making
the site data at Jeffrey Trail Middle School for English Language Learners an inaccurate
representation of the overall district population. Students who have scored low on the
CELDT placement test but still attend Jeffrey Trail Middle School without language
support do so voluntarily, legally waiving their right to language support for instruction
(IUSD, 2014).
The district offers access to Visual Art courses at the Middle School level, and an
introduction to Art experiences at the Elementary School level, led by Art Educators for at
least six hours of instruction per academic year in grades 4, 5, and 6. Art programs
increase in variety and availability at the high school level and include Drawing,
Painting, Ceramics, Graphic Arts, Digital Media, Animation, Metals and Jewelry Design.
It is worthwhile to note that this district also has a large performing arts program with
theater, dance, and music at all grade levels, having earned the distinction of one of the
“Best Communities for Music Education” by the NAMM Foundation in 2014, 2013 and
2010. The Grammy Foundation has also awarded Irvine Unified School District high
schools ten Grammy Signature Awards, currently leading all other districts in California
(IUSD, 2014).
Permission to conduct this initial experimental research was granted by the
Jeffrey Trail Middle School Principal with the condition that the instruction provided be part
of the regular “8th Grade Art” curriculum. In this way there could be no interruption of
student schedules and of the regular art program. This condition was a major determinant
for choosing intact classes as experimental and control groups for this initial experimental
research rather than a randomized selection of students.
The student subjects for the experiments were enrolled in two separate sections of
8th grade Art class. These sections constituted both the control and experimental groups.
Period 1 constituted the control group, using Authentic Assessment. Period 6 constituted
the experimental group, employing the Quantitative Data-Driven Assessment. Student
populations for both experimental and control groups were surveyed regarding their
experience in the Visual Arts. Of the seventy-three students polled in this study,
sixty responded to the survey. Of all student respondents, thirty-five percent report never
having taken an Art class in school before the implementation of this initial experimental
research. Forty-eight percent of respondents have not taken Art courses outside of school
prior to the survey. The results of the survey are significant due to the recent population
changes in the district: many students and families are recent immigrants to the area
from various international communities, bringing a variety of experience with, and
personal or cultural investment in, Art Education.
The Design
Two initial experiments were conducted in a middle school Visual Arts classroom
with 8th grade students. Due to the practical considerations of conducting pilot studies in
which one mode of assessment was compared with another mode of assessment,
randomization of the population was neither possible nor practical.
In Instrument A, an analysis of student creativity was performed by an expert
panel of art educators, who had reached a consensual definition of creativity and then
rated student art work on the consensually defined rubric of creativity.
Instrument B is a comparative analysis of gains in student creativity made through
a difference in scores recorded in pretest and posttest. The rubric to score for creativity
relied upon a contextual definition of creativity.
Instruments
The two instruments designed for this study had two main purposes.
1. Instrument A is a comparison between groups of student creativity as
demonstrated in student art work and determined by appropriate observers: an
expert panel of art educators within the district of the student population. This
panel performed a quantitative analysis of student creativity. The rubric to score
for creativity relied upon a contextual definition of creativity.
2. Instrument B is a comparison between control and experimental groups of
decreases or gains in creativity as demonstrated in student artwork as measured by
Teacher Evaluation from pretest to posttest. Assessments of student artwork are
rubric-based, utilizing multiple criteria.
The two instruments were necessary for the purpose of this study due to the
scholarly research and accepted definitions regarding creativity, specific to Art
Education. Although over one hundred definitions of creativity currently exist, for the
purposes of this experimental research in visual arts education, creativity is the
experience of creating something new and valuable that transcends traditional ideas,
rules, patterns, or relationships. The creation can be meaningful new ideas, forms,
methods, interpretations, processes, or products. Creativity is defined within the context
of the task, and by a consensus of appropriate observers. The products are considered
creative when the task itself is heuristic, not algorithmic.
Independent, Dependent, and Controlled Variables
Two initial experiments were developed to measure an assessment’s motivational
impact on creativity of student art projects. Instrument A for the first experiment is the
Expert Panel Posttest Only Experiment. Instrument B for the second experiment is the
Creativity Pretest/Posttest Gains Experiment.
In both initial experiments the independent variable is the posttest assessment
motivation presented to students. The control group using Authentic Assessment was
presented with the motivation of “increasing creativity” from pretest to posttest. The
experimental group using Quantitative Data-Driven Assessment was presented with the
motivation of “increasing creativity scores” from pretest to posttest.
The dependent variable in Instrument A, the Expert Panel Posttest Only
Experiment, is represented by the mean creativity scores, compared between the control group
using Authentic Assessment and the experimental group using Quantitative Data-Driven
Assessment.
The dependent variable for Instrument B, the Creativity Pretest/Posttest Gains
Experiment, is the gain in mean creativity scores from Teacher Evaluation, compared
between the control group using Authentic Assessment and the experimental group using
Quantitative Data-Driven Assessment.
The Lesson Plans, three Discussion Boards, Pretest and Posttest Evaluation
Rubrics, and assessment rubrics represent the controlled variables, remaining identical
between control group using Authentic Assessment and experimental group using
Quantitative Data-Driven Assessment (QDDA). Only the posttest assessment motivation
differed between groups.
Treatments
Two units of instruction were employed in this experiment:
1. The Tar Paper Slab Construction lesson, which was administered to both the
control group and experimental group.
2. The Ceramics Surface Treatments lesson which was administered to both the
control group and experimental group.
Over the course of the two units of instruction, three Discussion Boards were
employed as complementary instruction to both the control group utilizing Authentic
Assessment and the experimental group utilizing Quantitative Data-Driven Assessment.
1. Discussion Board A: “What is creativity?”
2. Discussion Board B: “What surface treatments are you considering?”
3. Discussion Board C: “List how you are going to make the second Tar-Paper
Slab Project more creative than the first.”
Between completing Discussion Board B and Discussion Board C, students
were asked to create a rubric to evaluate themselves on their own definition of creativity.
The resulting rubric later became the “Student-Generated Creativity Rubric”.
In the first experiment, Instrument A: Expert Panel Posttest Only Experiment,
posttest student artwork was exhibited to an Expert Panel of appropriate observers (local
Art Educators within the district of the initial experiment site). Control group works
utilizing Authentic Assessment and experimental group works utilizing Quantitative
Data-Driven Assessment were randomly assorted throughout the exhibit. The Expert
Panel for creativity judged each posttest student artwork.
In the second experiment, Instrument B: Creativity Pretest/Posttest Gains
Experiment, Pretest/Posttest creativity scores from Teacher Evaluations on the creativity
rubric were recorded and compared between groups.
TABLE 3.1. Butler Experimental Research Methodology
CONTROL GROUP: Authentic Assessment | EXPERIMENTAL GROUP: Quantitative Data-Driven Assessment
1. Art Experiences Survey (both groups)
2. Lesson 1: Tar-Paper Slab Construction Ceramics Projects
3. Discussion Board A: “What is Creativity?”
4. Pretest Projects Completed; Student Self-Evaluation; Teacher Evaluation
5. Lesson 2: Surface Treatments on Tar-Paper Slab Projects
6. Discussion Board B: “Which Surface Treatments would you employ?”
7. “Student-Generated Creativity Rubric” Discussion and Creation
8. Discussion Board C: “List how you are going to make the second Tar-Paper Slab Project more creative than the first.”
9. CONTROL TREATMENT: Posttest Assessment Motivation “How can we increase our creativity?” / EXPERIMENTAL TREATMENT: Posttest Assessment Motivation “How can we increase our creativity scores according to the rubric?”
10. Posttest Projects Completed; Student Self-Evaluation; Teacher Evaluation
11. INSTRUMENT A: EXPERT PANEL POSTTEST ONLY EXPERIMENT
12. INSTRUMENT B: CREATIVITY PRETEST/POSTTEST GAINS EXPERIMENT
Art Experience Survey Results
A survey about student experience with the Visual Arts was employed prior to the
initial experimental research in order to select the most appropriate art medium to use for
the unit of instruction in the treatment. Effective lessons in art require some difficulty to
provide students a challenge, and a level of comfort and ease in order to avoid too much
frustration (Bartel, 1999). Just as in all academic subjects, the skills specific to the
medium are learned through practice. Bartel continues, “The best lessons are those that
include practice training together with interest builders that motivate self initiated skill
practice” (1999, web). Due to these motivational influences, the author felt it necessary
to select a medium in which interest was high and student experience was evenly
balanced between the familiar and the strange.
In prior lessons, students in both the control and experimental groups demonstrated
polarized abilities in drawing and painting. Some students demonstrated extraordinarily high
levels of proficiency far above grade level expectations in drawing, or in painting. The
other students were at or far below grade or developmental level in drawing and/or
painting skills (Edwards, 1999; Lowenfeld, 1947; McGregor, 1990; Perry & Wolf, 1988).
Just as in most aspects of work and education, motivation has a direct influence
on perception of ability. Students often present an attitude of incompetence in art
because they feel untalented (Bartel, 1999), especially in comparison to peers with greater
levels of skills practice. Choosing the most appropriate art medium to use for the unit of
instruction in the treatment was essential for fair and authentic results for this initial
experimental research. Drawing and Painting were both excluded as a potential medium
for this initial experimental research because of the disproportionate ability and
experience reported from students.
Results from the pre-pilot-study survey indicate 65% responded that they have not
taken Art classes in school prior to this course. Student response indicates 52% have
taken Art classes outside of school including at Art Camp (5%), Private Art Tutoring
(15%), Art Workshops (1%), and other.
Student respondents report their experience with types of art making according to
media. 98% of respondents have made Drawings. 93% of respondents have made
Paintings. 31% have practiced Printmaking. 50% of respondents report having made
hand-built Ceramics projects, and 11% have experience with Ceramics in wheel
throwing. 51% of respondents report experience in photography although most of these
respondents report that use of their smartphone for photography qualified their perception
of photography experience. 30% of student respondents report experience in making
Graphic Arts and/or Graphic Design. 33% of student respondents report experience in
creating hand-drawn Animations, and 15% report experience in digital Animations. 11%
of respondents have experience in Filmmaking. 16% of respondents report experience in
digital image manipulation (such as Photoshop, Illustrator, etc.) although again, most of
these respondents report that use of their smartphone for this purpose qualified their
perception of digital image manipulation experience. 41% of student respondents report
experience with making Collage. 35% of students report experience in Sculpture, with
18% reporting experience in carving or building wood, and only 3% building or sculpting
in metals. No student respondents report experience with Jewelry-Making of any type,
and no respondents report experience with other not-listed options of art making.
Sixty-three percent of respondents have entered their art work in some sort of competition;
51% of respondents have had their art work shown at a school art display, 21% have
displayed work in a gallery, and 10% have displayed their work in an online gallery.
The most common responses for years of experience in art making, in and outside of
school, were one year and three years (13% each). However, 23% of
respondents report ten or more years of experience in art making, in and outside of
school. The majority of student respondents report that they have either not yet taken art
classes in any form (28%) or have taken one cumulative year of art classes in any
form (25%).
Student respondents demonstrate little to no experience in attending an art
museum within the past five years. 28% of student respondents have never been to an art
museum, 23% have attended an art museum once in the past five years, and 35% of
student respondents have attended an art museum within the past five years. Similar
results are reported regarding art gallery attendance. 31% of student respondents have
never attended an art gallery, 30% of student respondents have attended an art gallery
once in the past five years, and 26% of student respondents have attended an art gallery
two to three times in the past five years.
Student respondents rated their perceptions of themselves as artistically skilled.
10% of student respondents report a perception of “not at all skilled”. 15% of student
respondents report a perception of having very little artistic skills. 43% of students
perceive themselves as “o.k., not the best, but not the worst” in artistic skills. Twenty-six
percent report being comfortable with their artistic skills while still desiring to improve them,
and 6% of student respondents report a very confident perception of their artistic skills,
perceiving themselves as one of the top five in their class.
Limited access to the technology required for some art media at the initial
experimental research site at the time of the initial experimental research excluded the
exploration of certain art media: Photography, Digital Photography, Graphic Art /
Graphic Design, Digital Animation, and Digital Image Manipulation. Wood, Metals,
Sculpture, and Collage were also precluded from consideration for employment in the
initial experimental research due to site-specific concerns and limitations.
For the purposes of this initial experimental research, the experimenter will follow
Teresa Amabile’s recommendations on testing hypotheses about creativity: “the task (1)
is an open-ended (heuristic) one; (2) does not depend heavily on special skills; and (3)
is one in which subjects actually make an observable product or a response that can be
recorded and later judged on creativity” (Amabile, 1983).
It was decided that the polarized results and skills practice experience for
drawing and painting precluded both media from inclusion in the
treatment of this initial experimental research. Hand-Building methods in the medium of
Ceramics presented the most appropriate level of familiarity and challenge (see Table
3.2). In order to present students with a learning challenge in which they could explore a
medium that is somewhat familiar and employ a novel technique in hand-building skills,
the Ceramics Tar-Paper Slab Project was employed as the unit of instruction.
TABLE 3.2. Art experience survey results.
Students report experience in the following media prior to the initial experimental research:
Drawing: 98%
Painting: 93%
Printmaking: 31%
Ceramics (Hand-Building): 50%
Ceramics (Wheel-Throwing): 11%
Photography: 51%
Digital Photography: 31%
Graphic Art / Graphic Design: 30%
Animation (Hand-drawn): 33%
Animation (Digital): 15%
Film-Making: 11%
Digital Image Manipulation: 16%
Collage: 41%
Sculpture: 35%
Wood (building or carving): 18%
Metals (building or sculpting): 3%
Jewelry making (any type): 0%
Other: 0%
The Unit of Visual Arts Instruction
Lesson plan 1: Tar paper clay slab project. The objective of this lesson is for
students to learn basic Ceramics concepts and practice basic skills of ceramic hand-
building techniques in order to create a tarpaper ceramic slab vessel. Various scaffolded
and medium-specific processes include but are not limited to recognizing the various
stages of clay (green ware, bisque ware, and glaze ware), wedging clay, throwing slabs of
clay by hand, scoring and use of slip to adhere parts of clay together, kiln firing,
vitrification, glazing, creating patterns, use of spatial planning, surface treatments, and
clay construction techniques.
The use of tarpaper as a support structure in this lesson is important because it allows
flexibility in vertical and horizontal shapes that are not normally able to be
created in a slab-constructed clay project without such support. The
tarpaper provides rigidity to the green ware (still wet and heavy) clay. The support of
the tarpaper template allows for many options of shapes and sides for the finished clay
vessel product.
Production process. Students create and cut a template from tarpaper, using a
pattern piece for each side of the vessel, and the bottom of the vessel. It is important to
note that for this project, part of the requirements for the project grade is that the vessel
has “a bottom, and at least two sides”. Students attach the patterns to their hand-thrown
slabs of clay and cut the clay patterns out. The pattern pieces are assembled into the
vessel shape with scoring and slip application to join clay pieces. After sufficient
drying to a leather-hard state and the application of decorative techniques, the green ware
vessel is fired in the kiln and vitrified, becoming bisque ware. Students learn to apply
glaze to the bisque ware, and the glazed vessel is again fired in the kiln to complete the
project.
Discussion board A. At the beginning of the unit of instruction, students were
instructed to answer the discussion board on the classroom website. The discussion
board prompt is as follows:
Answer the following in complete sentences. Use examples to support your ideas
if needed. You do not need to respond to anyone's post, but I would appreciate
seeing a lively academic discussion. If you disagree with someone, now is a good
time to appropriately practice doing so, and using evidence to back up your
opinion.
What is creativity? What is it like for you when you experience
creativity? Attach a link to an artwork or idea (video or website) that you think is
creative. Explain why you think it is creative. Keep your content academically
appropriate.
Reflection. Students were introduced to a project-specific critique process in
which they reflected upon the process of creating the tarpaper clay slab projects. The
focus of the critique is an analysis of the finished work (the pretest) and of the process of
creating the work. Construction and creative choices were discussed and reflected upon
by students, and the students reviewed the techniques they employed. The overall
success of the completed work is discussed with opportunity to imagine changes that the
student would have made if they were able to re-do the exact same project. (See
Appendix, Ceramics Critique 1). It is important to note that success of student artwork is
discussed for each individual project. At the end of the teacher-led group discussion, a
“gallery walk” around the classroom to view the finished pretest projects was employed
to assist students in viewing their personal works comparatively to the other works in the
class, also assisting in the critique process. The teacher discussed the observation of
themes and similarities within the group, and also original solutions to the task. During
the group “gallery walk” critique, students practice their presentation, analysis, and
communication skills.
At the end of the self-critique document is a Creativity Criteria chart. The
students reflected and wrote about how they met the creativity criteria, and also rated
their success in meeting the creativity criteria on a scale of 0 to 5. The teacher also rated
student work according to the Creativity Criteria with the evidence of the project itself,
and taking into consideration the complexity of text within the student responses, using
Housen’s Visual Literacy Scale as a guide. Students used the teacher rating as part of a
conversation, or data-driven “error analysis”. The question from the
written Ceramics Critique 1 was revisited: “What could be improved if you were to recreate
your project?” Students responded with their answers and were then informed that their next
ceramics project would have the same exact requirements, that the project “must have a
bottom and at least two sides”.
Lesson plan 2: Ceramics surface treatment techniques.
Both groups were provided a weeklong lesson and demonstration on ceramics
surface treatments. Surface treatments demonstrated and discussed included but were not
limited to incising, carving, stamping, applique, application of coils, burnishing,
application of an under glaze, glazing and reglazing techniques, sgraffito, smoothing,
textural applications with textiles, slip decoration, and use of design motifs and
iconographic components.
Production process. Production process for the posttest project, the Tar Paper
Clay Slab Project Including Ceramics Surface Treatments, followed much of the same
procedure and many of the same considerations. Students again use tarpaper patterns to
cut clay patterns out and assemble them into the vessel shape. After thorough drying and
the application of decorative techniques, the green ware vessel is fired in the kiln and vitrified,
becoming bisque ware. Students apply glaze to the bisque ware, and the glazed vessel is
again fired in the kiln to complete the project.
Treatment. Identical to the pretest tar paper slab project, part of the requirements
for the posttest tarpaper slab project grade is that the vessel has “a bottom, and at least
two sides”. Both groups used their critiques and creativity rubrics for the error analysis.
The difference in treatment between groups is the motivation for the posttest assessment.
The control group exposed to the Authentic Assessment was frequently directed to focus
on the essential question for revision, “How can we make our projects more creative?”,
and the experimental group exposed to QDDA was frequently directed to focus on the
essential question, “How can we increase our creativity scores according to the rubric?”.
Throughout the treatment, discussions were held during production time in the classroom
about creativity. All discussion questions were repeated in both control and experimental
group, with the repetition of the objective for the project revision. The objective for the
control group exposed to the Authentic Assessment remained, “How can we make our
projects more creative?”, and the objective for the experimental group exposed to QDDA
remained, “How can we increase our creativity scores according to the rubric?”. During
an in-class discussion, students were asked to create their own way to judge if a project is
creative or not, leading to the Student-Generated Creativity Rubric. That rubric is
included in the posttest Ceramics Critique 2 as part of the student self-reflection process.
It is worthwhile to note that there are some very striking similarities between the Student-
Generated Creativity Rubric and the professional recommendations for assessing creativity
from the research of Teresa Amabile, Raymond Veon, and Mihaly Csikszentmihalyi, as
reviewed in the Chapter II Review of Literature.
Discussion board B. Toward the end of the first unit of instruction on
ceramics surface treatments and work production, students were instructed to answer the
discussion board on the classroom website. The discussion board prompt is as follows:
“For your next Tar Paper slab project, what surface treatments are you
considering? Describe them and/or link to images of them here.”
Results to the discussion board are presented in the Appendix.
Discussion board C. Toward the middle of the second lesson of instruction and
work production, students were instructed to answer the discussion board on the
classroom website. The discussion board prompt is as follows:
Here's your chance to show me what you plan on doing to improve on your first
tar-paper slab project! We're investigating new techniques, we're sketching things
out, and here: List how you are going to make the second one more creative than
the first:.
Results to the discussion board are presented in the Appendix.
Reflection. Students repeated the project-specific critique process in which they
reflected upon the process of creating the posttest tarpaper clay slab projects. (See
Appendix Ceramics Critique 2). The focus of the critique is an analysis of the finished
work, and of the process of creating the work. The overall success of the completed work
is discussed, just as in the pretest Ceramics Critique 1 reflection. The Student-Generated
Creativity Rubric was included in the posttest Ceramics Critique 2 reflection. At the end
of the self-critique document is the same Creativity Criteria chart as in
the pretest Ceramics Critique 1. The students again reflected and wrote about how they
met the creativity criteria, and also rated their success in meeting the creativity criteria on
a scale of 0 to 5.
As was done with the Ceramics Critique 1, the teacher also rated student work
according to the Creativity Criteria for the Ceramics Critique 2 with the evidence of the
project itself, and taking into consideration the complexity of text within the student
responses, using Housen’s Visual Literacy Scale as a guide. The data from the
teacher rating of creativity on the Creativity Criteria Rubric were collected to compare for
gains in individual scores from pretest to posttest.
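A minimal sketch of the gain comparison described here, using invented scores rather than the data actually collected in this study:

```python
# Illustrative sketch only, with invented scores: compute each student's
# pretest-to-posttest gain on a creativity criterion, then compare the
# mean gain of the control group with that of the experimental group.

def mean_gain(pretest, posttest):
    """Average gain across students present in both score sets."""
    gains = [posttest[s] - pretest[s] for s in pretest if s in posttest]
    return sum(gains) / len(gains)

control_pre  = {"C1": 2, "C2": 3, "C3": 4}     # hypothetical 0-5 rubric scores
control_post = {"C1": 4, "C2": 4, "C3": 5}
experimental_pre  = {"E1": 3, "E2": 2, "E3": 4}
experimental_post = {"E1": 3, "E2": 3, "E3": 4}

print(f"Control mean gain:      {mean_gain(control_pre, control_post):.2f}")            # 1.33
print(f"Experimental mean gain: {mean_gain(experimental_pre, experimental_post):.2f}")  # 0.33
```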
Consensual Creativity Scores were determined by an independent panel of
appropriate observers who are familiar with the task and have come to a consensual
agreement on what creativity is pertaining directly to the project the students have
completed, and specific to the project tasks.
Panelists were local professionals, educators within the district of the initial
experimental research, working in the same district with the same age population as in
the initial experimental research. Four panelists are Middle School Visual Arts Teachers
within the district of the initial experimental research. One panelist is an Instructional
Assistant in the Special Education program at the initial experimental research site, with
an undergraduate degree in Art. One panelist is a Math teacher at the initial experimental
research school site, with an undergraduate minor in Art. Another panelist, a Science
teacher at the initial experimental research school site, has a background in Art specific to
Ceramics. This panelist became ill just prior to the juried exhibit, and was unable to
participate.
Panelists were informed of the pretest project, the task of the pretest project, the
treatment lessons and supporting discussion boards, the images and responses presented
to the students within the treatment, and the student-generated creativity criteria for the
posttest project. The posttest project had the same tasks and parameters as the pretest
project. The difference between groups was that the control group aimed for an increase
in creativity, and the experimental group aimed for an increase in creativity score
according to the rubric. After completion of the pretest, both groups received
individualized teacher feedback with a rating from the teacher in response to the student
creativity score, and written and verbal feedback for each individual student, specific to
their project.
Panelists were also provided with blank copies of both pretests and posttests.
Panelists asked the teacher/experimenter if student grades for the projects (pretests and
posttests) were influenced by the creativity rubric scores. It was discussed that the
creativity scores were introduced as an important factor to guide the student reflection on
their work as they critique the project, but students were aware that the creativity scores
were not included in student grades. In order to avoid results that could have been
influenced further by extrinsic motivations such as grades, student grades were entirely
independent of the creativity rubric scores.
Panelists used all this information to create a consensual agreement on the panel
assessment for creativity of all student posttest works. Panelists disregarded the use of a
linear scale for the creativity rating, noting that the creativity rubrics students had
generated as a student group seemed more inclusive and appropriate to the task of the
project. The panelists decided to retain much of the creativity assessment criteria used on
student pretests and posttests.
The first two criteria used on student pretests and posttests rubrics (“Work
synthesizes ideas in original and surprising ways”, and “Novel or valuable response to the
task”) were kept. Panelists adopted the latter two criteria from the student-generated
creativity criteria (“Work is improved from Pretest to Posttest”, and “Work is original
from others in the class”). Inclusion of the third criterion (improvement from pretest to
posttest) required that an image of each pretest project be placed
beneath the corresponding posttest project. It is important to note that the panelists’ rating of
creativity applied to posttest work, but panelists used the pretest images for comparison on the
third criterion (improvement from pretest to posttest). When a student had completed a
posttest but not a pretest, the pretest image was unavailable. Panelists were instructed to skip
rating on that third criterion only for projects without a pretest. While these
posttest-only projects were rated overall, the data were not included in the results for this
initial experimental research, as it was unclear whether the posttest was actually the student’s
first or second attempt at the project.
CHAPTER 4
RESULTS
The results can be divided into two main sections:
1. Instrument A: Expert Panel Posttest Only Results.
2. Instrument B: Creativity Gains Results.
Each of these is discussed below.
As previously discussed, although over one hundred definitions of creativity currently
exist, for the purposes of this experimental research in visual arts education creativity is
the experience of creating something new and valuable that transcends traditional ideas,
rules, patterns, or relationships. The creation can be meaningful new ideas, forms,
methods, interpretations, processes, or products. Creativity is defined within the context
of the task, and by a consensus of appropriate observers. The products are considered
creative when the task itself is heuristic, not algorithmic.
Instrument A: Expert Panel Posttest Only Results
An independent expert panel of appropriate observers who are familiar with the
task came to a consensual agreement on what creativity is pertaining directly to the
project the students have completed, and specific to the project tasks. This consensual
creativity agreement was defined using four criteria, as defined in the four hypotheses:
Hypothesis 1: It is hypothesized that an analysis of variance (ANOVA) will
determine a statistically significant difference in creativity between the control group
utilizing Authentic Assessment and experimental group utilizing Quantitative Data-
Driven Assessment (QDDA) posttest only artworks in the consensually panel-defined
“Work Synthesizes Ideas in Original and Surprising Ways” with experimental group
utilizing QDDA demonstrating significantly lower creativity scores in this category.
Hypothesis 2: It is hypothesized that an analysis of variance (ANOVA) will
determine a statistically significant difference in creativity between the control group
utilizing Authentic Assessment and experimental group utilizing Quantitative Data-
Driven Assessment (QDDA) in the consensually panel defined “Novel or Valuable
Response to the Task” with experimental group utilizing QDDA demonstrating
significantly lower creativity scores in this category.
Hypothesis 3: It is hypothesized that an analysis of variance (ANOVA) will
determine there will be a statistically significant difference in creativity between the
control group utilizing Authentic Assessment and experimental group utilizing
Quantitative Data-Driven Assessment (QDDA) in the consensually panel defined “Work
Is Improved from Pretest to Posttest” with experimental group utilizing QDDA
demonstrating significantly lower creativity scores in this category.
Hypothesis 4: It is hypothesized that an analysis of variance (ANOVA) will
determine there will be a statistically significant difference in creativity between the
control group utilizing Authentic Assessment and experimental group utilizing
Quantitative Data-Driven Assessment (QDDA) in the consensually panel defined “Work
Is Original from Others in the Class” with experimental group utilizing QDDA
demonstrating significantly lower creativity scores in this category.
The four research hypotheses previously stated for the Expert Panel Posttest Only
Experiment were converted to null hypotheses in order to discuss the importance of the
findings.
The statistical inference test considered to retain or reject the four between-group
hypotheses was the sign test. The sign test is a nonparametric technique that disregards
the magnitude of the difference between scores, taking into consideration only their
direction (sign), making it a rather insensitive test (Pagano, 1990). Despite its
limitations, the sign test seemed to be a suitable technique for testing the four between-
group hypotheses, because the sample scores gathered
in this study are not randomized samples from normally distributed populations (Pagano,
1990). Likewise, the ordinal scale of the data collected in this experiment
also pointed toward a nonparametric statistical inference technique for testing the between-group
null hypotheses. In this study, however, the statistical inference test used was the Analysis of
Variance (ANOVA), comparing the two groups. The alpha level
was set at 0.05; therefore, the level of significance for accepting or rejecting the null
hypothesis was 0.05.
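For illustration only (the scores below are invented, not the study data), both tests mentioned here could be run with scipy.stats at the 0.05 alpha level; since SciPy has no dedicated sign-test function, the sign test is carried out as a binomial test on the signs of the paired differences:

```python
# Illustrative sketch only, with invented scores (not the study data):
# a one-way ANOVA between the two groups' panel creativity scores, and a
# sign test on paired pretest/posttest rubric scores, both at alpha = 0.05.
from scipy import stats

ALPHA = 0.05

control      = [13, 14, 12, 15, 13, 11, 14]   # hypothetical panel scores
experimental = [12, 13, 11, 14, 12, 12, 13]

f_stat, p_value = stats.f_oneway(control, experimental)
decision = "reject" if p_value < ALPHA else "retain"
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.3f} -> {decision} the null hypothesis")

# Sign test: keep only non-tied pairs, count positive differences, and test
# whether that count is consistent with chance (p = 0.5).
pretest  = [2, 3, 3, 4, 2, 3]                 # hypothetical rubric scores
posttest = [3, 3, 4, 5, 3, 4]
diffs = [post - pre for pre, post in zip(pretest, posttest) if post != pre]
positives = sum(1 for d in diffs if d > 0)
sign_result = stats.binomtest(positives, n=len(diffs), p=0.5)
decision = "reject" if sign_result.pvalue < ALPHA else "retain"
print(f"Sign test: p = {sign_result.pvalue:.3f} -> {decision} the null hypothesis")
```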
Instrument A: Expert Panel Posttest Only Experiment Null Hypotheses
Overall results demonstrate no significant difference between groups in creativity
scores (see Figure 4.1).
FIGURE 4.1. Overall expert panel posttest only results.
The results will be addressed in the order of null hypotheses as consensually
defined by the independent panel of appropriate observers who are familiar with the
task.
Null hypothesis 1: It is hypothesized that an analysis of variance (ANOVA) will
determine no statistically significant difference in creativity between the control group
utilizing Authentic Assessment and experimental group utilizing Quantitative Data-
Driven Assessment (QDDA) posttest only artworks in the consensually panel-defined
“Work Synthesizes Ideas in Original and Surprising Ways” with experimental group
utilizing QDDA demonstrating similar creativity scores in this category.
[Figure 4.1 charts the mean panel creativity scores in each of the four consensually defined
categories (Work Synthesizes Ideas in Original and Surprising Ways; Novel or Valuable
Response; Improvement from Pretest to Posttest; Original from Others), comparing the
control group (Authentic Assessment) with the experimental group (Quantitative
Data-Driven Assessment).]
Null Hypothesis 1 Results
The mean score (with a 95% confidence interval) for “Work synthesizes ideas in
original and surprising ways” was higher for the control group, at 13.11, than for the
experimental group, at 12.63. However, the ANOVA difference is not
statistically significant; therefore, Null Hypothesis 1 could not be rejected.
When examining the range of scores between groups, however, it is important to
note that the control group demonstrated a greater range of scores in the category of
“Work synthesizes ideas in original and surprising ways” as defined by the panel, and
also demonstrated higher top scores.
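For reference, the 95% confidence interval around a group mean reported throughout this chapter can be computed from a group's scores with the t distribution; the sketch below uses invented scores, not the study data:

```python
# Illustrative sketch only, with invented scores: the mean and its 95%
# confidence interval for one group's creativity scores, via the t distribution.
from scipy import stats

scores = [13, 14, 12, 15, 13, 11, 14]          # hypothetical panel scores
n = len(scores)
mean = sum(scores) / n
sem = stats.sem(scores)                        # standard error of the mean
margin = stats.t.ppf(0.975, df=n - 1) * sem    # two-sided 95% critical value
print(f"mean = {mean:.2f}, 95% CI = ({mean - margin:.2f}, {mean + margin:.2f})")
```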
FIGURE 4.2. Null hypothesis 1 ANOVA results.
FIGURE 4.3. Null hypothesis 1 ANOVA results boxplot.
Null hypothesis 2: It is hypothesized that an analysis of variance (ANOVA) will
determine no statistically significant difference in creativity between the control group
utilizing Authentic Assessment and experimental group utilizing Quantitative Data-
Driven Assessment (QDDA) in the consensually panel defined “Novel or Valuable
Response to the Task” with experimental group utilizing QDDA demonstrating similar
creativity scores in this category.
Null Hypothesis 2 Results
The mean score (with a 95% confidence interval) for “Novel or Valuable Response to
the Task” was higher for the control group, at 11.95, than for the
experimental group, at 11.84. However, the ANOVA difference is not statistically
significant; therefore, Null Hypothesis 2 could not be rejected.
When examining the range of scores between groups, however, it is important to
note that the control group demonstrated a greater range of scores in the category of
“Novel or Valuable Response to the Task” as defined by the panel, and also demonstrated
higher top scores, just as with the H1 category (“Work synthesizes ideas in original
and surprising ways”).
FIGURE 4.4. Null hypothesis 2 ANOVA results.
FIGURE 4.5. Null hypothesis 2 ANOVA results boxplot.
Null hypothesis 3: It is hypothesized that an analysis of variance (ANOVA) will
determine there will be no statistically significant difference in creativity between the
control group utilizing Authentic Assessment and experimental group utilizing
Quantitative Data-Driven Assessment (QDDA) in the consensually panel defined “Work
Is Improved from Pretest to Posttest” with experimental group utilizing QDDA
demonstrating similar creativity scores in this category.
Null Hypothesis 3 Results
The mean score (with a 95% confidence interval) for “Work is improved from pretest to posttest” was higher for the control group, at 13.00, than for the experimental group, at 11.11. However, the ANOVA difference is not statistically significant. The null hypothesis for Hypothesis 3 was therefore retained, and the research hypothesis was not confirmed.
When examining the range of scores between groups, however, it is important to note that the experimental group demonstrated a greater range of scores in the category of “Work is improved from pretest to posttest” as defined by the panel, and also demonstrated lower low scores.
FIGURE 4.6. Null hypothesis 3 ANOVA results.
Null hypothesis 4: It is hypothesized that an analysis of variance (ANOVA) will
determine there will be no statistically significant difference in creativity between the
control group utilizing Authentic Assessment and experimental group utilizing
Quantitative Data-Driven Assessment (QDDA) in the consensually panel defined “Work
Is Original from Others in the Class” with experimental group utilizing QDDA
demonstrating similar creativity scores in this category.
FIGURE 4.7. Null hypothesis 3 ANOVA results boxplot.
Null Hypothesis 4 Results
The mean score (with a 95% confidence interval) for “Work is Original From Others in the Class” was higher for the control group, at 13.79, than for the experimental group, at 12.37. However, the ANOVA difference is not statistically significant. The null hypothesis for Hypothesis 4 was therefore retained, and the research hypothesis was not confirmed.
When examining the range of scores between groups, however, it is important to
note that the control group demonstrated a greater range of scores in the category of
“Work is Original From Others in the Class” as defined by the panel, and also
demonstrated higher Originality scores.
FIGURE 4.8. Null hypothesis 4 ANOVA results.
Some student posttest work that had been judged by the panelists had been completed very recently and was still in a green ware state: un-fired clay that is very fragile and still malleable if it is at all wet. Even when dry, green ware is extremely fragile and easily broken. Kiln explosions during the last week of the treatment lesson, due to inclement weather and other unforeseen circumstances, had negatively affected the quantity of projects in both the control group and experimental group results samples. In each of two separate kiln firings, a student had left tarpaper on a green ware project and covered the tarpaper with clay to hide it. It is suspected that in these two firings the green ware with tarpaper exploded in such a way that the shrapnel broke many nearby projects in the kiln.
FIGURE 4.9. Null hypothesis 4 ANOVA results boxplot.
In spite of the setbacks, students felt compelled to participate in the panel gallery showing and rushed to create last-minute green ware for inclusion in the gallery for the panelists. The green ware was rated, but the data collected from it were omitted from the results of this initial experimental research. The green ware projects were completed on a much shorter timeline (some within three hours instead of two to three weeks) and under increased pressure from rapidly changing ambient humidity levels, kiln firing schedules, and a rapidly approaching deadline to assemble the gallery exhibit for the arriving panelists.
With the omission of the green ware projects, and of the projects that had no pretest data available, from the overall panel results data for this initial experimental research, nineteen projects remained in the control group and nineteen projects remained in the experimental group.
The panel had a few questions regarding the second criterion, a “novel or valuable response to the task”. The criterion defined “novel” as “new” to the task and “valuable” as “something that can be used” after the task, just as it had been explained to the students. However, it is important to note that many student posttest projects were
cylindrical in form. The task of the project, pretest and posttest, was to “create a tar
paper slab project with a bottom and at least two sides”. Many of the panelists contended
that the task of the project was not met if the posttest project was cylindrical in form such
as a mug or vase, as cylinders have only a bottom and one side: the cylinder wall. This is
significant because both the control group and experimental groups exhibited posttest
projects that were cylindrical. One project had no bottom, and no side, as the final
product was a hanging wind-chime. The wind-chime and cylindrical-shaped posttest
projects were not removed from the data sample.
TABLE 4.1: Expert panel posttest summary results

                              Work          Novel or      Work is        Work is
                              synthesizes   valuable      improved       original
                              ideas in      response to   from pretest   from others
                              original and  the task      to posttest    in the class
                              surprising
                              ways

SUMS OF PANEL SCORES
  Control group               249           227           247            262
  Experimental group          240           225           211            235
  Difference                    9             2            36             27

MEAN PANEL SCORES
  Control group               13.11         11.95         13.00          13.79
  Experimental group          12.63         11.84         11.11          12.37
  Difference                   0.48          0.11          1.89           1.42

AVERAGE PANEL SCORE PERCENTAGE
  Control group               44%           40%           43%            46%
  Experimental group          42%           39%           37%            41%
  Difference                   2%            1%            6%             5%
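The three blocks of Table 4.1 are arithmetically related. The mean rows appear to be the sums divided by the nineteen rated projects per group (the sample size reported earlier in this chapter), and the percentage rows appear to be those means expressed against a maximum possible panel score of 30. The short sketch below, again assuming R as the analysis environment, verifies that reading; the maximum score of 30 is an inference from the reported percentages and is not a value stated in the text.

  # Verification sketch for Table 4.1 (R assumed; 30-point maximum is inferred).
  sums_control      <- c(249, 227, 247, 262)
  sums_experimental <- c(240, 225, 211, 235)

  # Mean panel scores: sums divided by the nineteen projects per group.
  round(sums_control / 19, 2)        # 13.11 11.95 13.00 13.79
  round(sums_experimental / 19, 2)   # 12.63 11.84 11.11 12.37

  # Average panel score percentages, assuming an inferred maximum score of 30.
  round(sums_control / 19 / 30 * 100)        # 44 40 43 46
  round(sums_experimental / 19 / 30 * 100)   # 42 39 37 41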
Instrument B: Creativity Pretest/Posttest Gains Experiment Results
As previously discussed, over one hundred definitions of creativity currently exist. For the purposes of this experimental research in visual arts education, creativity is
the experience of creating something new and valuable that transcends traditional ideas,
rules, patterns, or relationships. The creation can be meaningful new ideas, forms,
methods, interpretations, processes, or products. Creativity is defined within the context
of the task, and by a consensus of appropriate observers. The products are considered
creative when the task itself is heuristic, not algorithmic.
The second instrument in this initial experimental research tested four hypotheses in order to determine whether the method of assessment increased or reduced creativity between the control group using Authentic Assessment and the experimental group using Quantitative Data-Driven Assessment. Teacher Evaluation Scores were compared for gains from pretest to posttest between groups for the following four hypotheses:
Hypothesis 5: It is hypothesized that an analysis of variance (ANOVA) will
determine there will be significant decrease in creativity in the experimental group
utilizing Quantitative Data-Driven Assessment (QDDA) from pretest to posttest
demonstrated in Teacher Evaluations in the creativity category “Synthesis of Ideas In
Original and Surprising Ways”.
Hypothesis 6: It is hypothesized that an analysis of variance (ANOVA) will
determine there will be significant decrease in creativity in the experimental group
utilizing Quantitative Data-Driven Assessment (QDDA) from pretest to posttest
demonstrated in Teacher Evaluations in the creativity category “Novel or Valuable
Response to the Task”.
Hypothesis 7: It is hypothesized that an analysis of variance (ANOVA) will
determine there will be significant decrease in creativity in the experimental group
utilizing Quantitative Data-Driven Assessment (QDDA) from pretest to posttest
demonstrated in Teacher Evaluations in the creativity category “Work Makes Student
Ask New Questions to Build Upon An Idea”.
Hypothesis 8: It is hypothesized that an analysis of variance (ANOVA) will
determine there will be significant decrease in creativity in the experimental group
utilizing Quantitative Data-Driven Assessment (QDDA) from pretest to posttest
demonstrated in Teacher Evaluations in the creativity category “Enables Student To
Discover, Learn Something Not Directly Instructed”.
The four research hypotheses previously stated for the Creativity Pretest/Posttest Gains Experiment were converted to null hypotheses in order to discuss the importance of the findings.
One statistical inference test considered for retaining or rejecting the four between-group hypotheses was the sign test. The sign test is a nonparametric technique that disregards the magnitude of the difference between scores and takes into consideration only their direction (sign), making it a rather insensitive test (Pagano, 1990). Despite its limitations, the sign test seemed a suitable technique for the four between-group hypotheses, chiefly because the sample scores gathered in this study are not randomized samples from normally distributed populations (Pagano, 1990). Likewise, the ordinal scale of the data collected in this experiment also pointed toward a nonparametric statistical inference technique for testing the between-group null hypotheses. Nevertheless, the statistical inference test ultimately used in this study was the Analysis of Variance (ANOVA) comparison between the two groups. The alpha level was set at 0.05; therefore, the level of significance for accepting or rejecting the null hypotheses was 0.05.
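To make the analysis pipeline described above concrete, the sketch below reproduces the form of the between-group gain-score ANOVA reported in Figures 4.11, 4.13, 4.15, and 4.17, followed by a sign test of the kind discussed. It is a minimal sketch under stated assumptions, not the study's own script: R is assumed (the ANOVA tables in this chapter match the output of R's aov() and summary() functions), and the data frame, column names, and gain values are hypothetical.

  # Hypothetical pretest-to-posttest gain scores (illustrative values only).
  gains <- data.frame(
    gain  = c( 1,  0,  2, -1,  1,  0,  1,  2, -1,  1,    # control group
              -1,  0, -2,  1, -1, -2,  0, -1,  1, -2),   # experimental group
    group = factor(rep(c("Control", "Experimental"), each = 10))
  )

  # One-way ANOVA on gain scores; alpha = 0.05. summary() prints a table in the
  # same format as the ANOVA tables shown in this chapter.
  summary(aov(gain ~ group, data = gains))

  # Sign test on the experimental group's gains: an exact binomial test on the
  # number of positive differences, ignoring their magnitude.
  exp_gain <- gains$gain[gains$group == "Experimental"]
  binom.test(sum(exp_gain > 0), sum(exp_gain != 0), p = 0.5)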
Instrument B: Creativity Pretest/Posttest Gains Experiment Null Hypotheses.
The results will be addressed in the order of the null hypotheses.
Null hypothesis 5: It is hypothesized that an analysis of variance (ANOVA) will
determine there will be no significant decrease in creativity in the experimental group
utilizing Quantitative Data-Driven Assessment (QDDA) from pretest to posttest
demonstrated in Teacher Evaluations in the creativity category “Synthesis of Ideas In
Original and Surprising Ways”.
Null Hypothesis 5 Results
Mean Teacher Evaluation scores for “Synthesis of Ideas In Original and Surprising Ways” increased in the control group from 2.97 to 3.40, while mean scores for the experimental group decreased from 3.28 to 3.00. The ANOVA difference in gains between groups is not statistically significant for the Teacher Evaluation results, so research Hypothesis 5 was not confirmed.
When examining the gain scores between groups, it is important to note that the control group demonstrated a gain from pretest to posttest, while the experimental group demonstrated a decrease in creativity from pretest to posttest in the category of “Synthesis of Ideas In Original and Surprising Ways”.
FIGURE 4.10. Instrument B null hypothesis 5 ANOVA results.
ANOVA TABLE: Synthesis of ideas in original & surprising ways
(No significant difference)
Df Sum Sq Mean Sq F value Pr(>F)
group 1 0.71 0.7063 0.268 0.608
Residuals 32 84.26 2.6333
39 observations deleted due to missingness
FIGURE 4.11. Instrument B null hypothesis 5 ANOVA table results.
Null hypothesis 6: It is hypothesized that an analysis of variance (ANOVA) will
determine there will be no significant decrease in creativity in the experimental group
utilizing Quantitative Data-Driven Assessment (QDDA) from pretest to posttest
demonstrated in Teacher Evaluations in the creativity category “Novel or Valuable
Response to the Task”.
Null Hypothesis 6 Results
Mean gain scores “Novel or Valuable Response to the Task” for the control group
demonstrated an increase in Teacher Evaluation Scores from 2.59 to 2.69. Mean gain
scores “Novel or Valuable Response to the Task” for the experimental group
demonstrated an increase in Teacher Evaluation Scores from 2.88 to 3.23. The ANOVA
difference is not statistically significant between groups for Teacher Evaluation Results.
Research Hypothesis 6 was not confirmed, as there is no statistically significant difference in gains between the control and experimental groups in the creativity scores for the category of “Novel or Valuable Response to the Task”.
It is important to summarize that, in the Teacher Evaluation, both the control group and the experimental group demonstrated a gain in scores from pretest to posttest in the category of “Novel or Valuable Response to the Task”, with the experimental group demonstrating greater gains.
Null hypothesis 7: It is hypothesized that an analysis of variance (ANOVA) will determine there will be no significant decrease in creativity in the experimental group utilizing Quantitative Data-Driven Assessment (QDDA) from pretest to posttest demonstrated in Teacher Evaluations in the creativity category “Work Makes Student Ask New Questions to Build Upon An Idea”.
FIGURE 4.12. Instrument B null hypothesis 6 ANOVA results.
ANOVA TABLE: Novel or valuable response to the task
(No significant difference)
Df Sum Sq Mean Sq F value Pr(>F)
group 1 7.88 7.880 2.725 0.109
Residuals 31 89.64 2.891
40 observations deleted due to missingness
FIGURE 4.13. Instrument B null hypothesis 6 ANOVA table results.
Null Hypothesis 7 Results
Mean gain scores “Work Makes Student Ask New Questions, Build Upon An
Idea” for the control group demonstrated a decrease in Teacher Evaluation Scores from
2.18 to 2.06. Mean gain scores “Work Makes Student Ask New Questions, Build Upon
An Idea” for the experimental group demonstrated a decrease in Teacher Evaluation
Scores from 3.24 to 1.91. The ANOVA difference between groups is statistically significant for the Teacher Evaluation results. Research Hypothesis 7 was therefore confirmed and the null hypothesis rejected, as there is a statistically significant difference in gains/losses between the control and experimental groups on the creativity scores for the category of “Work Makes Student Ask New Questions, Build Upon An Idea”. The control group demonstrated less loss of score in the Teacher Evaluation for this category.
FIGURE 4.14. Instrument B null hypothesis 7 ANOVA results.
ANOVA TABLE: Work makes student ask new questions, build upon an idea
(Demonstrates significant difference)
Df Sum Sq Mean Sq F value Pr(>F)
group 1 14.85 14.850 5.227 0.029 *
Residuals 32 90.91 2.841
39 observations deleted due to missingness
FIGURE 4.15. Instrument B null hypothesis 7 ANOVA table results.
Null hypothesis 8: It is hypothesized that an analysis of variance (ANOVA) will determine there will be no significant decrease in creativity in the experimental group utilizing Quantitative Data-Driven Assessment (QDDA) from pretest to posttest demonstrated in Teacher Evaluations in the creativity category “Enables Student To Discover, Learn Something Not Directly Instructed”.
Null Hypothesis 8 Results
Mean Teacher Evaluation scores for “Enables Student To Discover, Learn Something Not Directly Instructed” increased in the control group from 2.11 to 2.56, while mean scores for the experimental group decreased from 3.00 to 2.27. The ANOVA difference in gains between groups is statistically significant for the Teacher Evaluation results. Research Hypothesis 8 was therefore confirmed and the null hypothesis rejected, as there is a statistically significant difference in gains/losses between the control and experimental groups on the Teacher Evaluation creativity scores for the category of “Enables Student To Discover, Learn Something Not Directly Instructed”.
FIGURE 4.16. Instrument B null hypothesis 8 ANOVA results.
ANOVA TABLE: Enables student to discover, learn something not directly instructed
(Demonstrates significant difference)
Df Sum Sq Mean Sq F value Pr(>F)
group 1 18.74 18.741 8.895 0.00553 **
Residuals 31 65.32 2.107
40 observations deleted due to missingness
FIGURE 4.17. Instrument B null hypothesis 8 ANOVA table results.
CHAPTER 5
CONCLUSIONS
General Conclusions
Although over one hundred definitions of creativity currently exist, for the
purposes of this experimental research in Visual Arts Education, creativity is the
experience of creating something new and valuable that transcends traditional ideas,
rules, patterns, or relationships. The creation can be meaningful new ideas, forms,
methods, interpretations, processes, or products. Creativity is defined within the context
of the task, and by a consensus of appropriate observers. The products are considered
creative when the task itself is heuristic, not algorithmic.
One of the most important outcomes of an education with art is encouraging “creativity and innovative thinking in young minds” (PCAH, 2011, p. 8). It is important to again acknowledge that Art Education does not hold a monopoly on teaching creativity within the schools. However, it is one of the core academic subjects in which student success and the success of the art program are continually approved or condemned on the basis of the creativity of the curriculum and the creativity of the student work produced, the end product. Art Education curricula regularly incorporate creativity as a vehicle for acquiring and practicing the higher-order thinking and communication skills that educational reformers strive for students to acquire (NAEP, 2008). It is the opinion of the author that any limit to creativity in this scope is
counterproductive to classroom, school, community, and educational expectations. The impact broadens further when students are no longer our students but have graduated and gone on to lead productive lives as employees, employers, policy-makers, and consumers and creators of the culture, where convergent and quantitative answers will not always be the appropriate ones to the many complicated issues faced throughout a lifetime.
The employment of Quantitative Data-Driven Instruction and Assessment is not
an objectionable pedagogical or assessment method when it is done appropriately.
Quantitative Data-Driven Instruction and Assessment is best employed in academic
content areas that seek convergent thinking and answers, and primarily use quantitative
instructional and assessment methods. The quantitative use of data has been employed
quite successfully in behavioral monitoring and disciplinary issues at the school site in
which this experiment took place. Clearly seeing trends in absences and tardies, or
identifying repeated behavioral issues has helped teachers, administrators, parents, and
students to separate perception from the reported data. It is when Quantitative Data-Driven Instruction is employed only as a means to increase student achievement, narrowly defined as test scores, that the student as a learner takes a subordinate position to the test score. Such implementation is myopic.
Based on the results of this experimental research, a difference in motivation toward the assessments negatively affects the demonstrated creativity. The control group’s intrinsic motivation of making the posttest work “more creative” influenced the students to demonstrate higher creativity scores than did the extrinsic motivation of the experimental group of “increasing creativity scores according to the rubric.”
Due to the field of Art Education’s symbiotic relationship with creativity, and with the recognition that creativity is a major part of what students aspire to practice and demonstrate in a Visual Arts course, any limit to creativity in this scope is counterproductive to classroom, school, community, and educational expectations. It is the opinion of the author that continued testing of any assessment methodology should occur prior to mandating its implementation in the classroom. The desire for uniformity across all curricula in a school or district should never override best practices in any academic content area, including the Visual Arts. It is the opinion of the author that further testing along the lines of this research is highly appropriate.
The following conclusions were drawn from the analysis of data gathered in this
study.
( 1 ) Instrument A: Overall, mean creativity scores as determined by an Expert
Panel of Art Educators were greater in the control group utilizing Authentic Assessment
than the experimental group utilizing Quantitative Data-Driven Assessment. Differences
in mean creativity scores were not statistically significant in the creativity categories,
defined as “Work Synthesizes Ideas in Original and Surprising Ways”, “Novel or
Valuable Response to the Task”, “Work is Improved from Pretest to Posttest”, and
“Work Is Original From Others In Class”. Though the differences between groups are not statistically significant, a difference exists: the control group using Authentic Assessment showed greater creativity, as juried by an independent expert panel, than the experimental group using Quantitative Data-Driven Assessment. Based on the results of
Instrument A in this experimental research, it is the opinion of the author that the use of
Quantitative Data-Driven Assessment requires more testing for appropriateness prior to
mandating the implementation of such assessment methods.
( 2 ) Instrument B: Overall, creativity gain scores as determined by Teacher
Evaluation from the creativity rubric were again greater in the control group utilizing
Authentic Assessment than in the experimental group utilizing Quantitative Data-Driven
Assessment. Differences in creativity gain scores demonstrated statistical significance in two of the four creativity categories, “Work Makes Student Ask New Questions to Build Upon An Idea” and “Enables Student to Discover, Learn Something Not Directly Instructed”, but not in the creativity categories “Work Synthesizes Ideas in Original and Surprising Ways” and “Novel or Valuable Response to the Task”. The experimental group using Quantitative Data-Driven Instruction did perform better than the control group using Authentic Assessment in the category of “Novel or Valuable Response to the Task”, although no explanation or theory is available at this time. Though the differences between groups are not statistically significant in two of the four categories for Instrument B, a difference in demonstrated creativity again exists: the control group using Authentic Assessment showed greater creativity than the experimental group using Quantitative Data-Driven Assessment.
Based on the results of Instrument B in this experimental research, it is the opinion of the
author that the use of Quantitative Data-Driven Assessment requires more testing for
appropriateness prior to mandating the implementation of such assessment methods.
( 3 ) The employment of Quantitative Data-Driven Assessment in the Visual Arts demonstrates a decrease in creativity, although the decrease is not statistically significant in most creativity categories. The range of creativity scores also generally decreases when Quantitative Data-Driven Assessment is employed. The independent variable, the posttest assessment motivation, appears to be the fundamental influence upon the creativity results in both instruments. Based on the results of this experimental research, a difference in motivation toward the assessments negatively affects the demonstrated creativity. The control group’s intrinsic motivation of making the posttest work “more creative” influenced the students to demonstrate higher creativity scores than did the extrinsic motivation of the experimental group of “increasing creativity scores according to the rubric.”
Since the aims and pedagogies of the Visual Arts overall are to encourage divergent thinking and to increase creativity and innovation (Zimmerman, 2010b), the results of these experiments demonstrate that the assessment motivations employed in the use of quantitative and convergent Data-Driven Instruction and Assessment are counterproductive to the desired outcomes for student learning and achievement in the visual arts. Further testing of Quantitative Data-Driven Assessment is highly appropriate prior to mandating the implementation of such assessments.
Limitations of the Study
The unpredicted non-significance of the effect of the posttest assessment motivation (the independent variable) upon the mean creativity scores of the control group utilizing Authentic Assessment and the experimental group utilizing Quantitative Data-Driven Assessment (the dependent variable) may be clarified by an examination of the methodological limitations revealed during the course of this initial experimental study. These limitations are as follows:
Limitation 1: Due to the existing conditions within the middle school available for this initial experimental research, samples were non-randomized. Random assignment is a useful tool for controlling the undesirable presence of confounding variables; through effective matching techniques, randomization provides a significant degree of equivalency between experimental and control groups (Dalen & Meyer, 1979). When randomization cannot be employed, another procedure used to control variation between experimental and control groups is the Analysis of Variance (ANOVA). ANOVA was chosen to compare mean creativity scores between groups for Instrument A, and to compare pretest/posttest creativity gains for Instrument B.
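Because group assignment could not be randomized, one illustrative check (not a step reported in this study) is to compare the groups' pretest scores for baseline equivalence before interpreting between-group differences. The sketch below, assuming R as the analysis environment and using hypothetical data and names, shows such a check with a one-way ANOVA on pretest teacher-evaluation scores.

  # Hypothetical pretest teacher-evaluation scores (illustrative values only).
  pretest <- data.frame(
    score = c(3, 2, 4, 3, 3, 2, 4, 3, 2, 3,    # control group pretest
              3, 4, 3, 3, 2, 4, 3, 3, 4, 3),   # experimental group pretest
    group = factor(rep(c("Control", "Experimental"), each = 10))
  )
  # A non-significant result is consistent with comparable baselines; a
  # significant result would signal a pre-existing group difference.
  summary(aov(score ~ group, data = pretest))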
Limitation 2: Samples of student posttest artworks were limited to nineteen
samples in the control group utilizing Authentic Assessment and nineteen samples in the
experimental group utilizing Quantitative Data-Driven Assessment. Some student
posttest work that had been judged by the expert panelists had been completed very recently and was still in a “green ware” state: un-fired clay that is very fragile and still malleable if it is at all wet. Even when dry, green ware is extremely fragile and easily broken. Kiln explosions during the last week of the treatment lesson, due to inclement weather and other unforeseen circumstances, had negatively impacted the quantity of projects in both the control group and experimental group results samples.
In each of two separate kiln firings, a student had left tarpaper on a green ware project and covered the tarpaper with clay to hide it. It is suspected that in these two firings the green ware with tarpaper exploded in such a way that the shrapnel broke many nearby projects in the kiln. See image (x.x)
In spite of the setbacks, students felt compelled to participate in the expert panel gallery exhibit and rushed to create last-minute green ware for inclusion in the gallery exhibit for the expert panelists. The green ware was rated, but the data collected from it were omitted from the results of this initial experimental research. The green ware projects were completed on a much shorter timeline (some within three hours instead of two to three weeks) and under increased pressure from rapidly changing ambient humidity levels, kiln firing schedules, and a rapidly approaching deadline to assemble the gallery exhibit for the arriving expert panelists.
With the omission of the green ware projects, and of the projects that had no pretest data available, from the overall panel results data for this initial experimental research, nineteen projects remained in the control group and nineteen projects remained in the experimental group.
Limitation 3: In Instrument A, the Expert Panel Posttest Only Experiment, the panel had a few questions regarding the second criterion, a “novel or valuable response to the task”. The criterion defined “novel” as “new” to the task and “valuable” as “something that can be used” after the task, just as it had been explained to the students.
However, it is important to note that many student posttest projects were cylindrical in
form. The task of the project, pretest and posttest, was to “create a tar paper slab project
with a bottom and at least two sides”. Many of the panelists contended that the task of
the project was not met if the posttest project was cylindrical in form such as a mug or
vase, as cylinders have only a bottom and one side: the cylinder wall. This is significant
because both the control group and experimental groups exhibited posttest projects that
were cylindrical. One project had no bottom, and no side, as the final product was a
hanging wind-chime. The wind-chime and cylindrical-shaped posttest projects were not
removed from the data sample.
Limitation 4: Students began to demonstrate fatigue and dissatisfaction with the
use of clay for two large projects, sequentially. Toward the end of the second project
(posttest), students became increasingly vocal about their frustrations and concerns about repeating the same type of project. “When are we going to be done with this?” and “When will we get back to drawing?” were frequent student requests during the last few weeks for a transition from the clay medium into another type of project. The nature
of the study necessitated a pretest / posttest comparison. Fridays were regularly
scheduled each week as “Sketchbook Fridays” in order to ameliorate any medium-
fatigue, and facilitate planning and idea generation time, where students could choose to
draw or paint in their classroom sketchbooks, either deciding from a list of ideas or
generating ideas of their own. Excitement about the use of clay, and the project tasks at
hand made the use of sketchbooks each Friday very rare at the beginning of this study.
Toward the end of the study, students were vocally grateful for the break in medium and
pace.
Limitation 5: Due to the practical concerns of the student population, and
conducting the experimental research within the structure of a classroom, the amount of
time allotted for this experiment was limited to nine weeks. Ideally, the experimental
research would be conducted over a longer amount of time, perhaps three to four months.
With the need for student evaluation, entering grades into the grade-book, and keeping the curricular pace for the academic year, the units of instruction in clay for this study were limited to nine weeks. At this particular school site, there is a modified block schedule in which class periods are increased in duration from the normal 55 minutes to 90 minutes, two days per week. Students were also encouraged to visit the Art Classroom during their “Advantage” enrichment course period, twenty minutes twice per week. Also, during the course of this experimental research study, some students stayed after school to work on their projects for a few hours per week, sometimes up to eight hours a week. Opportunities were available to students to receive individualized
instruction if requested. Still, the obligations and responsibilities of the experimenter
within the classroom setting governed the allotted time available to conduct this
experimental research.
Discussion of Hypotheses
Instrument A: Expert Panel Posttest Only Experiment
Hypothesis 1: It was hypothesized that an analysis of variance (ANOVA) would
determine a statistically significant difference in creativity between the control group
utilizing Authentic Assessment and experimental group utilizing Quantitative Data-
Driven Assessment (QDDA) posttest only artworks in the consensually panel-defined
“Work Synthesizes Ideas in Original and Surprising Ways” with experimental group
utilizing QDDA demonstrating significantly lower creativity scores in this category. This
was not confirmed.
An Expert Panel of Art Educators scored student artwork from the control group
utilizing Authentic Assessment as more creative than the experimental group utilizing
QDDA in the creativity category “Work Synthesizes Ideas in Original and Surprising
Ways”. However, the difference in mean creativity scores between the control and experimental groups was not statistically significant.
The creativity score range demonstrated in the control group is greater than the creativity score range demonstrated in the experimental group.
Hypothesis 2: It was hypothesized that an analysis of variance (ANOVA) would
determine a statistically significant difference in creativity between the control group
utilizing Authentic Assessment and experimental group utilizing Quantitative Data-
Driven Assessment (QDDA) posttest only artworks in the consensually panel-defined
“Novel or Valuable Response to the Task” with experimental group utilizing QDDA
demonstrating significantly lower creativity scores in this category. This was not
confirmed.
An Expert Panel of Art Educators scored student artwork from the control group
utilizing Authentic Assessment as more creative than the experimental group utilizing
QDDA in the creativity category “Novel or Valuable Response to the Task”. However,
the difference in mean creativity scores between the control and experimental groups was not statistically significant.
The creativity score range demonstrated in the control group is greater than the creativity score range demonstrated in the experimental group.
Hypothesis 3: It was hypothesized that an analysis of variance (ANOVA) would
determine a statistically significant difference in creativity between the control group
utilizing Authentic Assessment and experimental group utilizing Quantitative Data-
Driven Assessment (QDDA) posttest only artworks in the consensually panel-defined
“Work Is Improved From Pretest to Posttest” with experimental group utilizing QDDA
demonstrating significantly lower creativity scores in this category. This was not confirmed.
An Expert Panel of Art Educators scored student artwork from the control group utilizing Authentic Assessment as more creative than the experimental group utilizing QDDA in the creativity category “Work Is Improved from Pretest to Posttest”. The difference in mean creativity scores between the control and experimental groups, however, was not statistically significant.
It is interesting to note that the creativity score range demonstrated in the control group is smaller than the creativity score range demonstrated in the experimental group. In all other creativity categories for Instrument A, the control group demonstrates a greater range than the experimental group.
Hypothesis 4: It was hypothesized that an analysis of variance (ANOVA) would
determine a statistically significant difference in creativity between the control group
utilizing Authentic Assessment and experimental group utilizing Quantitative Data-
Driven Assessment (QDDA) posttest only artworks in the consensually panel-defined
“Work Is Original From Others In the Class” with experimental group utilizing QDDA
demonstrating significantly lower creativity scores in this category. This was not
confirmed.
An Expert Panel of Art Educators scored student artwork from the control group
utilizing Authentic Assessment as more creative than the experimental group utilizing
QDDA in the creativity category “Work Is Original From Others In the Class”.
However, the difference in mean creativity scores between the control and experimental groups was not statistically significant.
The creativity score range demonstrated in the control group is greater than the creativity score range demonstrated in the experimental group.
Instrument B: Creativity Pretest/Posttest Gains Experiment
Hypothesis 5: It was hypothesized that an analysis of variance (ANOVA) would
determine a significant decrease in creativity in the experimental group utilizing
Quantitative Data-Driven Assessment (QDDA) from pretest to posttest demonstrated in
Teacher Evaluations in the creativity category “Synthesis of Ideas In Original and
Surprising Ways”. This was not confirmed.
Creativity gains demonstrated in student artwork showed the control group utilizing Authentic Assessment as more creative than the experimental group utilizing QDDA in the creativity category “Synthesis of Ideas in Original and Surprising Ways”. However, the difference in creativity gains between the control and experimental groups was not statistically significant.
Hypothesis 6: It was hypothesized that an analysis of variance (ANOVA) would
determine a significant decrease in creativity in the experimental group utilizing
Quantitative Data-Driven Assessment (QDDA) from pretest to posttest demonstrated in
Teacher Evaluations in the creativity category “Novel or Valuable Response To The
Task”. This was not confirmed.
Creativity gains demonstrated in student artwork showed the control group utilizing Authentic Assessment as less creative than the experimental group utilizing QDDA in the creativity category “Novel or Valuable Response to the Task”. However, the difference in creativity gains between the control and experimental groups was not statistically significant.
Hypothesis 7: It was hypothesized that an analysis of variance (ANOVA) would
determine a significant decrease in creativity in the experimental group utilizing
Quantitative Data-Driven Assessment (QDDA) from pretest to posttest demonstrated in
Teacher Evaluations in the creativity category “Work Makes Student Ask New Questions
to Build Upon an Idea”. This was confirmed.
Creativity gains demonstrated in student artwork showed the control group utilizing Authentic Assessment as more creative than the experimental group utilizing QDDA in the creativity category “Work Makes Student Ask New Questions to Build Upon an Idea”. The difference in creativity gains between the control and experimental groups was statistically significant.
Hypothesis 8: It was hypothesized that an analysis of variance (ANOVA) would
determine a significant decrease in creativity in the experimental group utilizing
Quantitative Data-Driven Assessment (QDDA) from pretest to posttest demonstrated in
Teacher Evaluations in the creativity category “Enables Student to Discover, Learn
Something Not Directly Instructed”. This was confirmed.
Creativity gains demonstrated in student artwork showed the control group utilizing Authentic Assessment as more creative than the experimental group utilizing Quantitative Data-Driven Assessment in the creativity category “Enables Student to Discover, Learn Something Not Directly Instructed”. The difference in creativity gains between the control and experimental groups was statistically significant.
Implications of Employment of Quantitative Data-Driven Instruction
and Assessment in the Visual Arts
The most important finding of this initial experimental research is that the employment of Quantitative Data-Driven Assessment in the Visual Arts demonstrates a decrease in creativity, defined as “Work Synthesizes Ideas in Original and Surprising Ways”, “Novel or Valuable Response to the Task”, “Work is Improved from Pretest to Posttest”, and “Work Is Original From Others In Class”, although the decrease is not statistically significant in most creativity categories. The range of creativity scores also decreases in most creativity categories when Quantitative Data-Driven Assessment is employed. The independent variable, the posttest assessment motivation, appears to be the fundamental influence upon the creativity results in both instruments. Since the aims and pedagogies of the Visual Arts overall are to encourage divergent thinking and to increase creativity and innovation (Zimmerman, 2010b), the results of these experiments demonstrate that the assessment motivations employed in the use of quantitative and convergent Data-Driven Instruction and Assessment are counterproductive to the desired outcomes for student learning and achievement in the visual arts. Further testing of Quantitative Data-Driven Assessment is highly appropriate prior to mandating the implementation of such assessments.
Based on the research findings of this initial experiment, it is the opinion of the
experimenter that additional discussion and research within the education community and
within the Art Education community should occur to clarify the necessity of practicing
and demonstrating creativity in the Visual Arts. Furthermore, the intrinsic and extrinsic motivations involved in creativity assessment methodologies likewise require additional research and discussion.
APPENDICES
APPENDIX A
DISCUSSION BOARD “A” RESPONSES, CONTROL GROUP
ID
RESPONSE
LINK AND COMMENTS
AL007
Creativity comes from the flow of
piled-up knowledge of art such as
technology to structure design and
pottery to manufactured goods.
Experiencing creativity is something
way absolutely beyond my dreams
and it always has me at "wow!"
Creativity is a VERY major and
inspirational factor to the world that
leads an age of outstanding
technology and many other
things.
http://designyoutrust.com/2012/08/cre
ative-inspirational-street-art-1/ (Links
to an external site.)
Here I have attached a link about
inspirational street art that shows a few
original 3-Dimensional drawings by
talented artists on the streets. The
reason why I think this link (the
paintings) is full of creativity is that
there are realistic shadings of
highlights and shadows and SO many
more countless things, such as
techniques, which all comes together
to create a massive image.
HB033
To be creative is to turn new ideas into something equally interesting and realistic. It is also the ability to identify the world around you and solving the endless problems in different ways. So basically being able to think out of the box. When I experience creativity it normally just comes to me while watching shows that I like and when I find art that pops out to me. I then create it into something original that I come up with later one, which normally inquires mashing different art ideas together.
http://www.pinterest.com/ The link that I chose expresses creativity by letting people from around the world show their creativity through writing, blogs, artwork, pictures, etc. and it enables others to learn from other people. Also it inspires people to find new ways to recreate old things. Oh and I would also think as deviantART as a good website to show creativity. It is basically like pinterest.
PC063
Creativity is "thinking outside of the
box" and going above and beyond
during anything. This means to not
do something that anyone else could
do and be original to yourself and
your ideas. Creativity is not being
boring and making something boring
into something unique and original.
http://www.pinterest.com/raventracks/
abstract-art/ (Links to an external
site.) Abstract art is being creative by
turning what you see into something
wildly imaginative and creative. Also
pinterest is a website where people can
express their creativity and inspire
others.
YP035
Creativity is something you make up in your mind that isn't simple and unique. When I experience creativity, I think of something on my mind and I combine it with the task on hand.
http://www.creaktif.com/
JL057
creativity is defined as the tendency
to generate or recognize ideas,
alternatives, or possibilities that may
be useful in solving problems,
communicating with others, and
entertaining ourselves and others.
when I experience creativity it is
really interesting because, I can
imagine or draw something that is
impossible. Also I don't want to
draw common things I want to draw
more unique things.
https://www.youtube.com/watch?v=Kt
coUNlaO4s (Links to an external
site.) I think this is very creative art to
me because,3D painting make a street
look eye catchy and a complete
distraction Also,3D Street Art
Paintings look fantasizing and
dreamy.
RL005
Creativity is freedom to express
yourself in any way your mind lets
you. When I experience creativity,
it feels good because I can do
whatever I want.
http://www.youtube.com/watch?v=l2K
SPiTOMR8 (Links to an external
site.) I think that video is creative
because it allows them (the person) to
make fun of the trailer while
entertaining us (the audience).
NM039
In my opinion, creativity is using your imagination, including your own thoughts and idea, to create different types of artwork. When I experience creativity, I feel free to create different types of art by using my own imagination, thoughts, and ideas
Link (Links to an external site.) - I think this artwork is creative, because the artist used a human face with its features in their artwork and creatively made it all fit together to create a realistic piece of art
HH027
Creativity is when a person uses
their imagination and have fun with
their artwork by adding different
types of colors that shows your
feelings. Feelings are a big part of
art to me because most of the time
when people make art they use real
life experiences or memories to
really have a meaning in it. When I
experience creativity I do it in my
art and in many other ways but when
I do paint or sketch things that I love
like sunsets or my memories and
reflect on it my feelings towards it.
http://www.youtube.com/watch?v=Nq
edBekgLdo (Links to an external site.)
In this video I think It is very creative
because it is quite unique from other
types of art. Other art wouldn't be
painted on water and afterwards be
reflected onto paper. To me I just
think it is unique and different and
that's why I like it.
AN021
Creativity is something that is unique, you don't copy anyone that you think has a good idea. It is something that you come up with yourself and not taking any ideas or suggestions from anybody. When I experience creativity, I think of imaginative because in order for someone to make something creative, they have to think about it first. So that person has to imagine what they are going to make or do.
Here is a link to a picture I think it is creative because when I saw it, I questioned why there was a dog being held up by balloons and then started using my own creativity to come up with a story to explain how and why. http://www.google.com/url?sa=i&rct=j&q=&esrc=s&frm=1&source=images&cd=&cad=rja&uact=8&ved=0CAcQjRw&url=http%3A%2F%2Fitsmyviews.com%2F%3Fattachment_id%3D9288&ei=IO1LVJ-nJcLj8gHQoYDIDg&bvm=bv.77880786,d.aWw&psig=AFQjCNEjvzkY1cJLCub62-0NsGIkdbNhCQ&ust=1414348407755270 (Links to an external site.)
GC043
Creativity is when ideas are put into
the physical world from the artist's
imagination. It is important that it is
supposed to be original and come
from the mind. When I experience
creativity, I feel free since I can put
whatever I think onto paper using
whatever I want, pencil, pen, etc.
The image below in my opinion is
creative, since it gives me the idea
that our minds can create something
colorful and creative, and still retain
the idea of imagination.
EF055
Creativity is the ability of you being
able to think of your own new and
unique ideas. When you experience
it you can make it in your head and
know what to do.
I think the link below is creative
because this artist came up with his
own idea to do these 3D paintings and
paint them in layers using resin.
http://www.thisiscolossal.com/2013/04
/three-dimensional-animals-painted-in-
layers-of-resin-by-keng-lye/
JT029
Creativity is the use of imagination
and original ideas, to make
something new. When I experience
creativity it is something that
catches my eye, something I have
never seen before, that is new to me.
http://forum.xcitefun.net/creative-art-
on-plates-t83446.html (Scroll down a
little.) I think this is creative because
the art work uses unlikely objects to
create an image that relates to
something else. This is a very original
idea. (Links to an external site.)
AD017
Creativity is when someone expresses their ideas to the world. The way they express their creativity can be in many, many ways. Creativity is different for every person. When I experience creativity I feel empowered because I am putting my ideas into the world and making a, even if it isn't very big, difference.
http://www.youtube.com/watch?v=5cBrM2syccU&list=UUqxKavbK7Sx-daJqZBLsgog (Links to an external site.) I think that the person who made this design is creative. I think so because I know how difficult it is to come up with designs yet this person does it with ease. Also by making a video showing people how to create the design they are sharing their creativity with the world. I also think it is creative because if a normal person were to look at what was made this person made they would most likely say "wow how did you make that?" That is why I think that design is creative.
SK045
Creativity is art that is "out of the box" of unique. Creativity does not always have to be the best artwork. It can be a title font or anything else. Creativity allows variety of things such as different houses and cars. When I experience creativity, I feel very proud of my work and accomplished. It may not always look good but it definitely expresses my thoughts.
https://www.youtube.com/watch?v=De4KEvIUn2c (Links to an external site.) this is a video of street spray paint art. This is very creative because this man uses different tools to make shapes even and sharp
IL051
Creativity is the art of expressing
what you feel and what you imagine,
through your artwork. When I
experience creativity it makes me
feel good and accomplished. I feel
as if I can show exactly what I am
feeling through designs and patterns.
I think the painting in the link below is
creative because it tells a story and is
beautiful in its own
way. http://www.daydaypaint.com/ima
ges/Print-Painting/Abstract-Painting-
011.jpg (Links to an external site.)
SJ047
Creativity is when your mind is used in an outside-the-box way. Creativity can be sprouted from anything, such as other people, your interests, your happiness, and even fear or hate. (and anything can be the result of creativity) When I experience Creativity, I usually feel captivated by the thought, and want to grow the idea into something bigger.
Example of creatvity: http://www.youtube.com/watch?v=PjnMFqnGdac (Links to an external site.) This is a video that shows the making of a costume that is based off of a game called Animal Crossing. I feel like this video shows creativity because she (Commander Holly) is creating something inspired from one of her interests, something that required thinking, because she is bringing something that exists in a 2-d world to the world we live in.
CK067
Creativity is a way one self
expresses themselves in any way
shape or form
JP059
Creativity is something or some idea
that is created by an individual
which expresses his or her
uniqueness. In this theory, anything
can be creative, also long as it is
unique to the artist and is true to him
or her. For me, experiencing
creativity is making something that
express who I am and something
that is different and not overly used.
That would be a type of happy and
cheery kind of artwork. Creativity
is all about the perspective and
opinions of the artist, but also the
audience.
http://upload.wikimedia.org/wikipedia/
commons/d/dd/Labille-Guiard,_Self-
portrait_with_two_pupils.jpg (Links to
an external site.) To me this painting
is very creative. When this is first
overlooked, people wouldn't see it as
creative. Because this type of artwork
is so overly used, it seems very plain,
but to me I find it very creative. To
see it, we must changed our
perspective and go back in time when
paintings were the only type of
photographs people had. During this
time these paintings were new and
different from the old medieval
paintings. This is why so many people
recognize this kind of artwork.
FG013
I believe creativity is the ability to
create. Hence, the ability to create is
the ability to form, to design, to
construct, to generate, to organize, to
invent, to discover and even just to
plain plan, anything. From the way
one may hold their paintbrush to the
invention of a new computer, any
person can obtain creativity for
anything their mind desires.
Thus, I decided to attach an image of
the world, exemplifying the idea that
everybody on this earth resembles and
is able to acquire creativity. However
creativity is not always positive,
because creativity can also be
portrayed as ideas of war, or plans of
conflict against one another. This is
why I put the earth as what I perceive
is creative. To show the vagueness of
how general creativity is and what it
can be.
Screen-shot-2013-10-31-at-
18.13.53.png (Links to an external
site.)
JL061
Creativity is colors of imagination pieced together. For me using my creativity is like using imagination in pictures on paper. I think it's creative because your imagination is how you determine something by yourself and what to make out of your thinking
https://www.youtube.com/watch?v=g857UNIKQsM (Links to an external site.) I think this video is creative because she is using the colors from her creativity and creating a face of many colors. As well as using a tiny bit of imagination because the figure is not a regular face it is expressed with many colors of shades.
PR025
Creativity is the use of ones
imagination based on their own
unique style and thought process.
When I see something that inspires
me, I become creative. To me,
creativity is everything. From the
design of a building to the murals on
the wall.
http://logopond.com/ (Links to an
external site.) This website is creative
and makes me want to create a logo
myself. Luckily, you can! There is a
spot where you can design your own
logo.
EC037
Creativity is the ability to think of
new ideas and make new things.
Creativity to me can be doodling
random stuff. Sometimes, creativity
is a hard concept.
This is my example of creativity
because it is a doodle
sketch. http://www.needlenthread.com
/Images/patterns/Embroidery_Doodles
/doodle_design_02.jpg
EM053
Creativity to me means imagination
we wouldn't live without it. when I
experience creativity I try to think
open-minded and unique
. I think this pic would be a good
example of creativity because
someone used their imagination and it
looks unique.
CG023
I believe creativity comes from the heart, the mind, the soul. What I mean is, creativity is when you have the power to do what you want on a canvas, a essay or in your actions.
No link provided.
YL009
Creativity is making each
individuals different, because every
people have different ideas. For me,
when I make something very
different or unique, I feel like I
experienced creativity.
I felt this is creative, because when the
person in the video was making the
bowl on the wheel, he used his fingers
to make the flower-shape bowl, which
I thought it was special, because it was
not just a simple ordinary bowl.
https://www.youtube.com/watch?v=qz
667FhL9-M (Links to an external
site.)
APPENDIX B
DISCUSSION BOARD “A” RESPONSES, EXPERIMENTAL GROUP
ID
RESPONSE
LINK AND COMMENTS
SK042
Creativity is a spark of your
imagination, something original!
When I experience creativity, it's
fun! Not much erasing... not
being creative is being like
copying down a report that
someone else wrote - boring!
How many people have been
asked to duplicate a real life apple
on a sheet of paper? Experienced
or not, it's still frustrating and
some people just toss their hands
in the air in annoyance
. http://www.zazzle.com/cloud_p
uking_rainbow_card_post_cards-
239306030289591130 (Links to
an external site.)
This artwork is creative since it is
fun and seems to be out of
imagination and dreams, not from
the real old world.
MB040
To me creativity is simply when
you create ideas or use your
imagination. When I am being
creative it’s like solving the
universe's questions in as many
ways as possible.
http://businesshacker.co/wp-
content/uploads/2014/02/creativit
y.jpg (Links to an external site.) I
believe that this is very creative
because of what they used to
make into art and the idea behind
it.
CA004
Creativity is what every artists
needs. When I first experienced
creativity was good to know how
to do all the work you can think
of.
Pinterest.com is creative. It's
creative because of how you will
see different art works that other
people have created.
SK046
Creativity comes from what you
do. When you are trying to make
a painting, sculpture, or anything
you add creativity to it.
http://www.youtube.com/watch?v
=1NXcZWxBGSA (Links to an
external site.) I think this video is
very creative. I like how she used
crayon and glued them on a board
and then she melted them to make
them look like rain.
RA010
creativity is a unique drawing,
imagination or a item that has lots
of designs. when I experienced
creativity it was like experience a
whole other part of you.
(no link recorded in student
response)
JH056
Creativity is an idea of being original and unique. Whenever I tend to be more creative, I make or draw things that you would not expect me to make/draw, just like everyone when they experience creativity. The artwork shown in the following link is very creative because the artist chose not to draw on a plain sheet of paper or anything that artists usually use to draw on.
http://cubebreaker.tumblr.com/post/100874700586/visual-artist-alice-pasquini-completed-this#notes (Links to an external site.)
The artist also decided to draw "in memory of late poet Alfonso Gatto" which is creative and interesting because the poet himself is not a widely known person and not a lot of people would draw in memory of any poets.
AL014
For me, creativity is anything you
can imagine or thinking beyond.
When I experience creativity, I
feel amazed at what I couldn't
think about and excited because
imagination never stops.
http://4.bp.blogspot.com/-
z_RBev08qCA/TiGSGyZlVjI/A
AAAAAAAAGg/iLQntmUwvw0
/s1600/Awesome%2BCollection
%2BOf%2BImagination%2BArt
%2BWork..jpg (Links to an
external site.) I think this picture
is creative because there are so
many colors in it and imagination
was everywhere, the designs were
something I never seen before.
IN006
I think creativity is using your imagination or own original thoughts or ideas. When I experience creativity I keep on having one idea after another. For the most part, I also normally have almost everything thought/planned out how I want it.
I think this link is creative because instead of going out and spending money to solve their problems, they're being creative and finding easy, simple, and cool solutions that work just as well (or maybe even better than a bought product). http://www.buzzfeed.com/alannaokun/crafthackz
CM022
Creativity is something that
stands out, that is original, that is
like nothing ever seen before.
When I experience something
creative, I think it is really
awesome because I have never
seen something like it.
http://inspirationfeed.com/articles
/design-articles/5-awesome-
optical-illusions-with-impossible-
objects/
JL020
Creativity is to be able to express yourself or something freely. It can also just be allowing the ideas in your head and putting it into reality. Also it is something unique in your own image. it maybe not be perfect but its creative. Everything around us is creative. What it is like for me when I experience creativity is that I'm able to express my own ideas and be able to think of new ideas in my head freely. To be able to be able to put my ideas and be unique.
http://www.youtube.com/watch?v=sIgFAXcdVAI (Links to an external site.)
This video shows how this spray painter artist is being creative and unique with his mind. Also he is able to use what he has to be able to make what he wants to make.
LG066
Creativity is when you use your
imagination to come up with
original ideas. Or, you can take
other concepts and apply parts of
them to your own piece of work
to create something unique.
When I experience creativity, I
have a sudden burst of
imagination that comes to my
head and I have to write down my
idea before I forget it.
http://24.media.tumblr.com/tum
blr_lcxij3c9NX1qzlgueo1_500.pn
g (Links to an external site.) I
think that this photograph is
creative because it has special
way in showing what it is trying
to express. You can see that
everything else is blurred out
except for the sign, so you can
focus on the words "One Way or
Another". The blurred out
background shows cars on a busy
street in the city, which helps
symbolize the words. Also, the
filter that has been applied to the
picture makes it bright and
colorful so it catches your eye and
is pleasing to look at
AF030
Creativity is when you do
something out of the ordinary and
its something that your mind does
when it he use of the imagination
or original ideas is displayed like
in a art work piece or drawing of
some sort. For me I get
motivated it make more creative
things and that is what sparks
my imagination.
http://hovercraftdoggy.com/2013/
01/18/we-pin-up/
HS064
Creativity is the essence of
problem solving, curiosity,
science, art, and many, many
other things. When I experience
creativity, I feel as if I must
immediately write or draw my
idea immediately.
Here's my link: 180px-
14331_view.jpg (Links to an
external site.) Toothless, from
how to train you dragon. The
books, just so you know. I love
the movies too,
though. c0085950_50c97a2d49f5
7.jpg (Links to an external
site.) Hobbit! yay! marvel-
studios-avengersbg-logo-
img.jpg (Links to an external
site.) lastly, Marvel! Yay
Avengers and X-Men!
JH032
Not sure what the rest of you did since I can't see the rest of the responses. Harry shows his story through the narration. His dance with pigments correspond to the narration. Each color represents a emotion that builds up in his life time. By the end, he is a mess. But with the colors, he takes advantage and makes the best of it, decides to make the colors who he is today. It shows his life in another way besides telling or showing. creativity is something that everyone is born with. it is the skill of simply picturing things in your mind. Everybody has it, even if he/she/it can't show it physically. I can't say much more than that, I can't think of anything else I'm a little sleepy.
https://www.youtube.com/watch?v=elILetNPyr4 (Links to an external site.) Link above.
DF068
Creativity is a type of way you
use your imagination to make art
or anything else. When I
experience creativity I use my
imagination.
http://www.art.com (Links to an
external site.) , is what I think
creative is because their are so
many pieces of artwork that show
different techniques and colors
AM048
Creativity to me is expressing the
mind and implementing images
on to a canvas or something.
When I experience creativity I let
go of all thing I was thing about.
This picture shows that the right
side of the brain is what you use
for your art. I think it is creative
because its colorful, and it has the
other side too.
MM050
Creativity is using your
imagination at anytime and when
I experience creativity, it's
exciting.
Link (Links to an external
site.) I think this photograph is art
because it has amazing colors and
the photography is beautiful.
DE002
Creativity is something new and meaningful. I sometimes get these bursts of creativity and start sketching my ideas out! Sometimes, my creativity switch is on "off" and no matter how hard I try, I simply can't draw anything that worth finishing (I have so many unfinished drawings!).
I was going through the internet recently and I found a drawing that I completely fell in love with!!!! It' called Liquid Life by Kirrui: http://www.deviantart.com/art/Liquid-Life-471103385 (Links to an external site.) You HAVE to check it out!!!!
APPENDIX C
DISCUSSION BOARD “B” RESPONSES, CONTROL GROUP
ID
STUDENT RESPONSE
MB040
My project is going to be like a piggy bank, and because it will be square I
really want it to seem like an actual pig so my texture will be kind of
smooth but mostly cracking and realistic.
CA004
My next tar paper slab project is going to have patterns like a diamond
shape surface around it.
DE002
I plan on making the spots on my giraffe shaped like triangles! I might
glaze them red to make the giraffe more colorful.
AF030
For my next slab project I will have scale like patterns for a vase or a box.
RA010
I’m going to make a tall vase. The structure of the surface is going to be a
pattern with lines and sketches.
SK046
For my next tar paper slab project is going to be a vase that will have
patterns and lots of ridges it will be rough and bumpy.
JL020
What I am planning to do is make a mug or a cup and make like zig zag
and star shaped designs on it or do like a curly swirly line around the cup
and stars around it.
CM022
My project is going to have a surface texture that is like the bottom of my
shoe so it will have a zig zag design.
JH056
I am planning onto making a cup and using a carving tool to carve flower
patterns into it.
IN006
For my next Tar Paper slab project, I am going to carve the outlines of
building (like a silhouette) onto a box. My goal is to make it look like a
city at night.
BP044
For my next tar paper slab project, I want to make a box, not a boring one
but one that has all different kinds of stuff and patterns on it. I want it to be
very colorful, and have vines going all across it in different directions.
KD024
For my next slab project i want to make a box with engravings of words on
it.
LG066
For my next tar paper slab project, I am planning on carving out a picture
rather than just painting with the glaze, which is what I did last time
DB054
For my next Tar slab project since I’m making a skateboard I’m going to
make the surface of it rough and grainy
SK042
For my next Tar Paper slab project, I am considering a smooth, sea shell-
like surface or a rough, sand-like surface on a cookie. To specify, the
chocolate chips might have that smooth surface whilst the crumbly part of
the cookie (the dough) might have the rough surface
SK060
For my next tar paper slab project, my idea is to create a cylinder shaped jar
with water drop designs on the surface.
DF068
For my next Tar paper slab project I am going to have diamond, swirls and
scales on my cup.
AL014
For my next Tar Paper slab project, I would have all different kinds of
flowers for the surface treatments, to see how all the flowers together look.
APPENDIX D
DISCUSSION BOARD “B” RESPONSES, EXPERIMENTAL GROUP
ID
RESPONSE
JT029
I am going to have scales on another, taller, box. Scales that overlap each
other and look like fish scales. I also might have a lined diamond pattern on
the inside
PR025
For my next project I am going to do a box/vase that will have shell patterns
on it.
JP059
For my next tar paper slab project, I would like to make a jar but with prints
that would represent my personality. Rather than prints that go inward, I
would like to make prints that stick out giving it a nice texture. I could add
different emojis on the outside sticking them with slip such as the smiley
face or the poop emoji.
HH027
For my next tar paper slab I will definitely consider using the techniques
that I have learned for the past week or so and just have and make my
project unique and nice. I might create an elephant and outline the features
on there. Like patterns and different shapes.
HB033
For my next Tar Paper slab project I was thinking of making a clay book
cover for one of my book series and I would have to make the slab really
thin if I want to glue it to a hard cover book or other object. I would maybe
print an image of that book and put it on the cover, so I would be using
image transfer on to my bisque ware. If that does not work I will try to
make a clay book and again I will use image transfer for the cover.
EC037
For my next tar paper slab project, I am thinking of carving shapes instead
of drawing it in with the glaze on my previous project. Also, I want the
corners of my next project to be round so I will use a sponge to wipe it
down.
NM039
For my next tar paper slab project, I have considered using techniques, such
as carving, glazing, and application. First, by using the carving technique, I
could cut out a design from the surface of the clay. Next, by using the
glazing technique, I could cover my project with a shiny coating. Lastly, by
using the application technique, I could paste an object or image onto the
clay's surface.
CK067
I consider doing carving and engraving
YL009
For my next Tar Paper slab project, I am going to make a candy box that can
be opened and closed. Rather than carving the shapes what I did this time, I
am thinking to put clay on it, like in the video that we watched in the class.
(Using the slab to draw on the project, like using it for the cake decoration.)
With that tool, I want to draw a mustache on my project.
AL007
For my next Tar Paper slab project, I would like to create a tall vase with
geometric patterns such as triangles, circles, and diamonds to represent the
plain-ness which I like. I would also make the shapes pop out and make it
"realistic" rather than just glazing it but with shades and highlights.
RL005
For my next tar paper slab project, I will make some indents into the clay
and they glaze those different colors.
TK031
My project is like a cup with writings and strips around it with stars.
FG013
I’m planning to design a boxed type of vase, with each side as square tiles
with mosaic and tile incorporated designs.
AD017
For my next tar paper project I'm thinking of making the clay look like bark
from a tree. I may also want to make the clay look like it is made of scales.
GC043
For the next Tar Paper slab project, I will consider giving it a texture similar
to smooth rocks and indented cracks on the surface. I will make it look like
a volcano.
SK045
I am considering making a pineapple. For the texture, I figured out that
using the bottom of your hand while it is in a fist creates a texture that looks
like the pointy part of a pineapple.
SJ047
For my next Tar paper slab project, I would like to make laptop. One of the
surface decorations I am considering is carving in many little keyboards
instead of painting them on with glaze.
IL051
For my next Tar Paper slab project I am considering making a large bowl
and using triangular patterns to design it. The triangles will line up side by
side to make a cool pattern effect at the top and bottom of the bowl.
APPENDIX E
DISCUSSION BOARD “C” RESPONSES, CONTROL GROUP
ID:
RESPONSE:
MJ074
Trying to make all sides with tar paper on it as equal as possible and
flatten the inside and out more. And maybe even putting borders or
edges on the sides
NH034
make the sides even
JH056
For my next project, I'm going to put more surface designs and make
sure the surfaces are smooth and even. Also, I would make it more
sturdy.
SK046
To make my tar paper slab project more creative I will put more surface
designs and try to make the sides very even, and also take my time so
my project won't explode.
JH032
I will try to make it a concave, not something that is open air, like a
box. By something concave, I mean like a vase, that passes the
imaginary vertical line twice.
MB040
I will attempt to make more surface decorations and use multiple glazes
to color the outside
LG066
I plan to improve my tar paper slab project by making the slabs more
even and scoring better. I also plan on using glazes to create more
detailed paintings as well as other surface treatments.
RA010
I plan to improve slab project by taking my time and smoothing out my
clay.
DB054
I am going to add more scorings and make everything fit together
AF030
I am going to improve my tar paper slab project by sketching it out
better and I am going to put patterns in my slab project and come up
with a different shape then just making a box or mug.
CK012
I will make my tar paper slab project better by making the sides even
and put a better design and glaze it different colors
AL014
To make the second one more creative than the first, I would add more
interesting surface decorations and make a different shape than just an
ordinary box.
SM036
I plan to improve my slab art project by using the slip properly so that it
wont break again unlike all four of my other ones.
MM050
For the next tar-paper slab project I will try to make the sides even and
carve more details into the clay. Also I would like to make something
that is more creative than just a box because now I have more
experience.
DF068
Make the sides rounder.
JS018
I plan to improve my project by giving it more color and a eye popping
design.
CA004
Make all the sides even to hold up your design
JL020
how i will improve my tar-paper slab project is by making the sides
more even and more creative like the designs on the project. also i
would make sure to take my time on this second project.
DE002
I will try to make another giraffe that will stand better and have a much
smoother surface. I will also make it very colorful.
CM022
trying to add more designs and make it smoother and flatter
AM048
I can improve my art slab project by making it by my self with out tar
paper
RA010
I’m going to make a vase with designs on it
SK060
I plan to make my tar paper slab project better by evening out the sides
and smoothing the surfaces. I will use a ruler and measure the size.
Before firing my work I will make sure to find any cracks, dents, or
holes.
SK042
I plan to improve my tar paper slab project by putting more creativity
into it and more care. My last tar paper slab project was rather simple
and I spent most of the time trying to keep things together and trying to
not snap my project. So, instead I will: 1. more expressive than a
cookie (literally) 2. more originality than a brownie (literally) I'm
making a blooming flower with a fox playing in the middle, which is
better than the cookie and brownie I made last time.
NL076
Four sides even and more creative with my designs and colors.
BP044
I'll try to make the sides more even, and not have some of the parts fall
off my project.
APPENDIX F
DISCUSSION BOARD “C” RESPONSES, EXPERIMENTAL GROUP
ID:
RESPONSE:
AN021
To make the project more creative, you can add more sides to the
project or have the project be rounded on the outside. It can also be
made in different sizes, shapes, and color.
CG023
I would not rush and take more time making it perfect. When i didn't
take my time, my pot blew up. Also I would be more creative with my
ideas because my last one was bad and lazy.
CE015
Making all the sides even, and scored better.
EF055
I would take my time and make sure that all edges are cut precisely and
all sides are even. I would also make use of the slip more to make sure
my project sticks together better.
GC043
Next project, I will make the tar paper more equal and even. The clay
will be cut more evenly and be more form fitting.
NM039
Since we've been investigating new techniques, I am going to make my
second tar paper slab project more creative than my first by creating a
different structure than my first, using tar paper to reinforce the joints
of the structure and some of the surface techniques that I learned about,
and using time wisely to perfect the color and shape of my project so
that it wouldn't look as if I had rushed through it.
PR025
I plan to make it better by making exact measurements on the tar paper
so my squares would be the same size. I would also make something
more than just a box since I know how to do it now.
FG013
I am going to add more detail and preciseness to my next tar paper slab
project, by refining the edges and smoothing out all sides to ensure
evenness.
AL007
For my next tar paper slab project, I plan to improve on creating a
simpler but professional project than the first one which cracked into
pieces since I spent too much time on it before first firing. I secondly
will try my best to make my tar paper pieces more accurate in shape
and size than my first one which was hard to work with. I would also
use some clay decorating techniques such as glazing or carving onto
my next project.
HB033
For my next tar paper project I plan on making all the sides more even
and come up with a better idea on what to make. I might also make the
construction a little bit neater and try to use image transfer on to my
bisque ware.
MW011
I would check for cracks and holes before firing it. And maybe being
more creative with the project instead of making one huge boring
box.
HH027
For my second project I will probably try to be creative then just
having a cup with just one boring color but instead I can make carvings
or any other techniques I learned in the past few weeks.
SJ047
-Try out more ways to use tar paper -surface decorations -more
different shapes -color schemes maybe -different techniques with
glaze
YL009
To improve my tar paper slab project, I will be more focused on the
shape, to make the good circle shape, not just focusing on brushing the
project. To make it more creative, I can make the mug's base unique,
like heart shape or star shape.
RL005
I would make mine a better shape. I won't make it such a box-type
shape, but circular.
PC063
I could add more detail on the leaves or maybe make the shape of the
container more unique as appose to just a rectangle.
YP035
Next time, I will the make the sides even and try not to have any cracks
in my art.
IL051
I'm going to make my second project more creative than the first by
making my slabs of clay more neat and by carving in designs and
painting them with different colored glaze.
JP059
Next time I do this project, I'm going to add more texture to make my
work come alive and to really reach out to the audience. I also want to
manage my time and not make some too big otherwise it would harden
as I put it together and it wouldn't even come together.
TA001
make the sides even and make it a good shape
HS041
Instead of just a plain box, i will try an more intricate design. I will
also try to give the pot a texture.
AD017
I'm going to make my second slab project more creative by adding a
variety of glaze colors and a a design on the outside of the project .
JL061
I'm going to make my sculpture more creative by adding more curves
and swirls into my project on the outside.
AG019
I'm going to make a normal box, but I'm going to cut holes in the shape
of playing card symbols in the middle of each side and add slip trail
designs around the holes.
CK067
I am going to focus more on details and color scheme on this next
project and make it look neater and with a bit more symmetry
EC037
I plan to carve shapes into my next project. I might also do slip trailing
on more project to make it more creative
JL057
Make all sides even and try to think which design would be more
creative.(not cylinder or box etc..)
APPENDIX G
TARPAPER LESSON PLAN
CLAY Construction- starring…. TAR PAPER!
Vase / Abstract Vessels
with Amy Cranfill http://www.amyswindowseat.com
and Stephanie Pickens http://www.stephaniepickens.com/
MATERIALS
- Clay, canvas, newsprint (to draw templates), TAR PAPER, pencil, plastic bags, wood, brayer,
clay knife,
RECYCLE HINT
- Tar Paper is recyclable!! Call your
local Waste Transfer Station for details.
Construction Procedures
1. Design, draw, and cut out the template. Make sure you identify each side with
appropriate letters or numbers so that you understand which sides connect together.
Don't forget the bottom piece, which is a combination of the bottoms of all your side pieces.
2. Trace your templates onto the tar paper; you can use a pencil. Don't forget to label your new
pattern pieces with the corresponding letters or numbers you used to identify the sides.
3. Cut out your new pattern shapes from the tar paper using scissors.
4. Create your clay slabs by throwing or rolling your clay out (use ¼ inch dowel rods for guides).
These slabs need to be big enough for all your pattern pieces to lay out on the surface.
5. Next, attach the Tar Paper to the clay. First spritz the surface of the slabs with a spray
bottle; you don't want water to pool, just enough to dampen the surface.
APPENDIX H
PRETEST CRITIQUE AND ASSESSMENT
Ceramics Critique 1
Part of the learning experience in creating art is being able to analyze the work. By taking time to study a piece of art and
express your thoughts about the strengths and weaknesses of the work, you will develop a better understanding of what
appeals to you and what does not. The things you write for this critique should demonstrate thought and need to
be written in complete sentences.
Title of your work:
____________________________________________
Dimensions:
Height:________________”
Width:________________”
Length:_______________”
Glaze Used:
____________________________________________
Draw (or paste an image of) an accurate picture of your finished work above.
Meeting Project Requirements:
- This project had specific requirements. How did you meet the requirements?
- How did you fail to meet them?
- How did you exceed the basic requirements?
Construction Techniques:
- What method(s) did you use to construct this project? Any special techniques used?
- Did you have any construction/craftsmanship problems? If yes, how did you solve them?
- What do you need to do to improve your skill level?
Time Management:
- The quality of work you do reflects the time spent on it. How does this piece reflect the use of your time?
APPENDIX I
POSTTEST CRITIQUE AND CREATIVITY ASSESSMENT
APPENDIX J
STUDENT WORK IMAGES
BIBLIOGRAPHY
Amabile, T. (1983). The social psychology of creativity: A componential
conceptualization. Journal of personality and social psychology , 45 (2), 357 –
376.
Amabile, T. (1996). Creativity in context. Boulder, CO: Westview Press.
Anderson, L. W., Krathwohl, D. R., & Bloom, B. S. (2001). A taxonomy for learning,
teaching, and assessing: A revision of Bloom's taxonomy of educational
objectives. Boston, MA: Allyn & Bacon.
Anderson, L. W., Krathwohl, D. R., & Bloom, B. S. (2009). A taxonomy for learning,
teaching, and assessing: A revision of Bloom's taxonomy of educational objectives
(Abridged ed.). New York, NY: Pearson Education.
Bambrick-Santoyo, P. (2010). Driven by data: A practical guide to improve instruction.
San Francisco, CA: Jossey-Bass.
Bartel, M. (2002). Goshen arts education. Retrieved from Drawing Stages Charts:
https://www.goshen.edu/art/ed/artlsn.html
Batey, M. (2012). The measurement of creativity: From definitional consensus to the
introduction of a new heuristic framework. Creativity Research Journal , 24 (1),
55-65.
Beattie, D. K. (1997). Art Education in Practice Series: Assessment in art education.
Worcester, MA: Davis Publications.
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956).
Taxonomy of educational objectives: The classification of educational goals:
Handbook I. Cognitive domain. New York, NY: David McKay Company.
Burton, J. H. (2000). Learning in and through the arts: The question of transfer.
Studies in Art Education , 41 (3), 228-257.
Campbell, D. T., & Stanley, J. C. (1963). Experimental and quasi-experimental designs
for research on teaching. Washington, D.C.: American Educational Research
Association.
California Department of Education (2001). Visual and performing arts content
standards. Retrieved October 2013, from http://www.cde.ca.gov/be/st/ss/vamain.asp
CCSESA Arts Initiative, (2008). Arts Assessment Resource Guide. San Diego CA: San
Diego County Office of Education
Chittenden, E. (1991) Authentic assessment, evaluation, and documentation of student
performance. In V. Perrone (Ed) Expanding Student Assessment. Alexandria, VA.
Association for Supervision and Curriculum Development
Clark, G. & Zimmerman, E. (2004) in Cukierkorn, J. (2006). Teaching talented art
students: Principles and practices. Lawrenceville, NJ: Roeper Review, 28(3),
180.
Coburn, C. E. (2006). Conceptions of evidence use in school districts: Mapping the
terrain. American Journal of Education , 112 (4), 469-495.
Common Core State Standards Initiative. (2010). Common Core State Standards for
mathematics. Washington, DC: National Governors Association Center for Best
Practices and the Council of Chief State School Officers.
Csikszentmihalyi, M. (1997). Creativity: Flow and the psychology of discovery and
invention. New York, NY: Harper Perennial.
Csikszentmihalyi, M. (1999). Implications of a systems perspective for the study of
creativity. In Handbook of creativity (pp. 313-316). Cambridge, United
Kingdom: Cambridge University Press.
Csikszentmihalyi, M. (1996). Creativity: The work and lives of 91 eminent people.
New York, NY: Harper Collins.
DeSantis, K., & Housen, A. (2009). VTS: Visual thinking strategies. Brooklyn, NY:
Visual Thinking Strategies.
Dorn, C. M. (2004). Assessing expressive learning. London, England: Lawrence
Erlbaum Associates.
DTZ Consulting & Research. (2006). Arts and employability. Retrieved from
http://www.scotland.gov.uk/Resource/Doc/1600091/0043606.pdf
Dwyer, M. C. (2011). Reinvesting in arts education: Winning America's future through
creative schools. Washington, D.C.: President's Committee on the Arts and the
Humanities.
Eisner, E. (2002). The arts and the creation of mind. New Haven, CT: Yale University Press.
Ellis, S. (2008). The assessment of creative learning. Creative Learning , pp. 73-89.
Elton, L. (2007). Assessing creativity in an unhelpful climate. Cardiff, Wales: Higher
Education Academy.
European Commission on Culture (2009). Measuring creativity. Retrieved from
http://ec.europa.eu/education/lifelong-learning
Gardner, H. (1982). Art, mind, and brain: A cognitive approach to creativity. New
York, NY: Basic Books Press.
Goldstein, I. L., & Ford, J. K. (2002). Training in organizations. Belmont, CA:
Wadsworth.
Guilford, J. P. (1957). Creative abilities in the arts. Psychological Review , 64 (2), 110 -
112.
Guilford, J. P. (1959). Three faces of intellect. American Psychologist , 14 (8), 469-472.
Hardiman, E. M. (2009). Neuroeducation: Learning, arts, and the brain. (E. B. Rich,
Ed.). New York, NY: Dana Press.
Heer, R. (2012). A model of learning objectives—Based on A Taxonomy for Learning,
Teaching, and Assessing: A revision of Bloom’s taxonomy of educational
objectives. Retrieved from Center for Excellence in Learning and Teaching,
Iowa State University: http://www.celt.iastate.edu/teaching/RevisedBlooms1.html
Herpin, S. A. (2012). Improving assessment of student learning in the arts - State of the
field recommendations. Washington, D.C.: National Endowment for the Arts.
WestEd.
Hewett, T., Czerwinski, M., Terry, M., Nunamaker, J., Candy, L., Kules, B., & Sylvan, E.
(2005). Creativity support tool evaluation methods and metrics. Creativity
Support Tools, 10-24.
Home Affairs Bureau, (2005). A study on creativity index. Tamar, Hong Kong. The Hong
Kong Special Administrative Region Government, Home Affairs Bureau.
Housen, A. (1987). Three methods for understanding museum audiences. Museum
Studies Journal , 2 (4), 41-49.
Housen, A. & Yenawine, P. (2005) Basic VTS at a glance. Visual understanding in
education. Chicago, IL: University of Chicago
Housen, A. (2007). Art viewing and aesthetic development: Designing for the viewer.
From periphery to center: Art museum education in the 21st century, 172-179.
Irvine Unified School District. (2013). About the Irvine Unified School District.
Retrieved from https://www.iusd.org/district_news_information/
AboutTheIUSD.html
Jensen, E. (1998). Teaching with the brain in mind. Alexandria, VA: Association for
Supervision and Curriculum Development.
Jones, J. C. (1973). Strategies for divergent thinking. College Student Journal , 7 (3), 48-
51.
Kagan, P. J. (2009). Learning, arts, and the brain. Los Angeles, CA: The Dana
Foundation.
Krathwohl, D. R. (2002). A revision of Bloom's taxonomy: An overview. Theory into
Practice , 41 (4), 212-218.
Kulp, M., & Tarter, B. J. (1986). The Creative Processes Rating Scale. Creative Child &
Adult Quarterly, 11(3), 166-173, 176.
Lampert, N. (2006). Critical thinking dispositions as an outcome of arts education.
Studies in Art Education , 47 (3), 215-228.
Merriam, S. B. (2004). The role of cognitive development in Mezirow’s transformational
learning theory. Adult education quarterly , 55 (1), 60-68.
Mertler, C. (2014). The data-driven classroom: How do I use student data to improve
my instruction? (ASCD Arias). Alexandria, VA: ASCD.
National Arts Policy Roundtable (2010). 2010 Final Report The Role of the Arts in
Educating America for Great Leadership and Economic Strength. Sundance, UT:
Americans for the Arts.
National Research Center on the Gifted and Talented. (2002). Assessing creativity: A
guide for educators.
Newman, F. M., Secada, W. G., & Wehlage, G. G. (1995). A guide to authentic
assessment and instruction: Vision, scoring and standards. Madison: Wisconsin
Center for Educational Research.
Pella, S. (2012). What should count as data for data-driven instruction? Toward
contextualized data-inquiry models for teacher education and professional
development. Middle Grades Research Journal , 7 (1), 57-75.
Penn State University. (n.d.). The history of art education timeline 1870-1879.
Retrieved February 5, 2013, from http://www.personal.osu.edu/mas53/
timln870.html
Pintrich, P. R., & Garcia, T. (1991). Student goal orientation and self-regulation in the
college classroom. Advances in motivation and achievement: Goals and self-
regulatory processes, 7 (371-402).
Ruppert, S. S. (2010). Creativity, innovation and arts learning: Preparing all students
for success in a global economy. Arts Education Partnership.
Sindelar, N. (2011). Assessment powered teaching. Thousand Oaks, CA: Corwin.
Slavin, R. E., Cheung, A., Holmes, G., Madden, N. A., & Chamberlain, A. (2012).
Effects of a data-driven district reform model on state assessment outcomes.
American Educational Research Journal, 0002831212466909.
Smarter Balanced Assessment Consortium (2013). Preliminary test blueprints. Retrieved
October 11, 2013, from http://www.smarterbalanced.org/wordpress/wpcontent/
uploads/2011/12/Smarter-Balanced-Premilinary-Test-Blueprints.pdf
Spencer, E., Lucas, B., & Claxton, G. (2012). Progression in creativity: Developing new
forms of assessment: A literature review. Creativity, Culture and Education.
State Education Agency Directors of Arts Education (SEADAE). NCCAS.
(2014). National Core Arts Standards. Retrieved January 2015, from
http://www.nationalartsstandards.org
Sternberg, R. J. (1999). Handbook of creativity. Cambridge, United Kingdom: Cambridge
University Press.
Taylor, B. (2013, Spring). A conversation with Tom Torlakson. Retrieved from California
School Boards Association website: http://www.csba.org/Newsroom/CASchoolsMagazine/2013/Spring/InThisIssue/2013_SpringCSM_Torlakson.aspx
Terry, M., & Mynatt, E. D. (2002b). Recognizing creative needs in user interface design.
In Proceedings of the 4th conference on Creativity & Cognition (pp. 38-44).
ACM.
Terry, M., Mynatt, E. D., Nakakoji, K., & Yamamoto, Y. (2004). Variation in element
and action: supporting simultaneous development of alternative solutions. In
Proceedings of the SIGCHI conference on Human factors in computing systems
(pp. 711-718). ACM.
Torrance, E. P. (1968). Torrance tests of creative thinking. San Francisco, CA: Personnel
Press, Incorporated.
Treffinger, D. J. (2002). Assessing Creativity: A Guide for Educators. National Research
Center on the Gifted and Talented.
University of North Texas. (n.d.). History of art education. Retrieved February 5, 2013,
from http://art.unt.edu/ntieva/HistoryofArtEd/references.html
U.S. Department of Education, National Assessment Governing Board (2008). Arts
education assessment framework. Washington, D.C.: Author.
Van Dalen, D. B., & Meyer, W. J. (1979). Understanding educational research: An
introduction. New York, NY: McGraw-Hill.
Veon, R. E. (2014a, March). Creativity infusion training. National Art Education
Association Conference, San Diego, CA.
Veon, R. E. (2014b, March). Creativity instructional matrix for art. National Art
Education Association Conference, San Diego, CA.
Veon, R. E. (2014c). Leading change: The art administrator’s role in promoting
creativity. Art Education , 67 (1), pp. 20-27.
Wiggins, G. (1990). The case for authentic assessment. ERIC Digest . Retrieved from
ERIC database. (ED328611)
Yenawine, P. (1998). Visual art and student-centered discussions. Theory into Practice,
37 (4), 314-321.
Zimmerman, E. (2010a). Creativity and art education: A personal journey in four acts.
Art Education , 63 (5), 84-94.
Zimmerman, E. (2010b). Reconsidering the role of creativity in art education. Art
Education , 63 (2), 4-5.