A heuristic framework for the determination of the
critical elements in authentic assessment
Kevin Hugh Ashford-Rowe
University of Wollongong
+*"'%#'#+,"()'++#'+,#,-,#('%*)(+#,(*0 (*,"'#.*+#,0( (%%('!('!(* -*,"*#' (*&,#('(',,,"#**0
*+*")-+-(/--
(&&'#,,#('
+" (*(/.#'-!""-*#+,# *&/(*$ (*,",*&#',#('( ,"*#,#%%&',+#'-,"',#++++&',(,(*(
-,#(',"+#+-%,0( -,#(''#.*+#,0( (%%('!('!"2)*(-(/--,"++
A HEURISTIC FRAMEWORK FOR THE DETERMINATION OF
THE CRITICAL ELEMENTS IN AUTHENTIC ASSESSMENT
A thesis submitted in partial fulfilment of the requirements for the award of the
degree
DOCTOR OF EDUCATION
FROM
UNIVERSITY OF WOLLONGONG
BY
KEVIN HUGH ASHFORD-ROWE, BACHELOR OF ARTS
(HONOURS), POST GRADUATE CERTIFICATE IN EDUCATION,
GRADUATE DIPLOMA IN MULTIMEDIA, MASTER OF
PROFESSIONAL STUDIES, MASTER OF EDUCATION
FACULTY OF EDUCATION
2009
DECLARATION
I, Kevin H. Ashford-Rowe, declare that this thesis, submitted in partial fulfilment of the
requirements for the award of Doctor of Education, in the Faculty of Education,
University of Wollongong, is wholly my own work unless otherwise referenced or
acknowledged. The document has not been submitted for qualifications at any other
academic institution.
Kevin H. Ashford-Rowe
23 January 2009.
TABLE OF CONTENTS
Declaration ................................................................................................................ii
List of tables ..............................................................................................................vii
List of Figures.............................................................................................................vii
Abstract .............................................................................................................viii
Acknowledgments.......................................................................................................ix
Chapter 1: Introduction................................................................................................1
Background to the study — Authenticity in educational assessment..................... 1
Assessment, authenticity and educational technology............................................ 4
Research questions and the study ........................................................................... 8
The organisation of the thesis............................................................................... 10
Chapter 2: Authentic Assessment: A General Review of the Literature ................12
Previous findings .................................................................................................. 12
Assessment ........................................................................................................... 12
Assessment in higher education ........................................................................... 17
What is authentic assessment? ............................................................... 19
Assessment and educational technology .............................................................. 23
Characteristics of authentic assessment................................................................ 31
Chapter 3: Research Design and Methodology.......................................................37
Introduction .......................................................................................................... 37
Design-based research .......................................................................................... 37
PHASE 1: Exploration of the problem........................................................ 43
PHASE 2: Development of a solution ........................................................ 44
PHASE 3: Implementation and evaluation ................................................. 50
PHASE 4: Presentation of findings............................................................. 58
Summary of the research plan .............................................................................. 59
Conclusion ............................................................................................................ 62
Chapter 4: An Effective Model for Task Design in Flexible Learning
Environments......................................................................................63
Analysis of the elements of authentic assessment................................................ 63
Practitioner feedback ............................................................................................ 66
Evolving and further developing the critical elements ......................................... 69
1. Degree of challenge(s) presented to the assessed student....................... 70
2. Performance, or product, as final assessment outcome........................... 72
3. Transfer of learning (skills/knowledge/attitude) required....................... 74
4. Critical reflection and self-assessment or evaluation required................ 75
5. Accuracy in product or performance, and fidelity of assessment
environment, is displayed .................................................................. 76
6. Fidelity of assessment tools used ............................................................ 78
7. Discussion and feedback required........................................................... 79
8. Collaboration required ............................................................................ 80
Expert review........................................................................................................ 81
Expert Reviewer 1....................................................................................... 82
Expert Reviewer 2....................................................................................... 85
Expert Reviewer 3....................................................................................... 88
Summary of feedback from expert reviewers....................................................... 90
1. Degree of challenge(s) presented to the assessed student....................... 91
2. Performance, or product, as final assessment outcome........................... 92
3. Transfer of learning (skills/knowledge/attitude) required....................... 92
4. Critical reflection and self-assessment or evaluation required................ 93
5. Accuracy in product or performance, and fidelity of assessment
environment, is displayed .................................................................. 93
6. Fidelity of assessment tools used ............................................................ 94
7. Discussion and feedback required........................................................... 94
8. Collaboration required ............................................................................ 95
Revision of critical elements from expert review................................................. 95
From critical elements to critical questions — A summary ................................. 98
The critical questions .................................................................................. 98
Chapter 5: Applying the Critical Questions of Authentic Assessment in
the Design of a Learning Module ....................................................100
Development of — Evaluating Educational Multimedia................................... 100
Introduction ............................................................................................... 100
The re-design of Evaluating Educational Multimedia ....................................... 101
1. To what extent does the assessment activity challenge the
assessed student? ............................................................................. 105
2. Is a performance, or product, required as a final assessment
outcome?.......................................................................................... 106
3. Does the assessment activity require that transfer of learning has
occurred, by means of demonstration of skill?................................ 107
4. Does the assessment activity require that metacognition is
demonstrated, by means of critical reflection, self-assessment
or evaluation?................................................................................... 108
5. Does the assessment require a product or performance that could
be recognised as authentic by a client or stakeholder?.................... 110
6. Is fidelity required in the assessment environment? And in the
assessment tools (actual or simulated)?........................................... 114
7. Does the assessment activity require discussion and feedback?.......... 115
8. Does the assessment activity require that students collaborate?.......... 116
9. Description of how the critical questions were applied in the
design and structure of the learning outcomes and assessment
criteria of Module 10 ....................................................................... 117
The role of formative assessment in the redesign of the module ....................... 133
The application of the elements to the learning environment ............................ 135
Conclusion .......................................................................................................... 136
Chapter 6: Learners’ Responses to Authentic Assessment ................................139
Learning Module Implementation...................................................................... 140
Learning Module Evaluation and Analysis ........................................................ 140
Method of implementation........................................................................ 140
The method of analysis ............................................................................. 141
Applying the constant comparative method ....................................................... 142
Analysis of responses ......................................................................................... 143
Researcher’s observation on students’ responses by data source....................... 144
Interview ................................................................................................... 144
Observation ............................................................................................... 145
Video ......................................................................................................... 146
Notes on students’ performance made on observation during the delivery of
the module................................................................................................. 147
Notes on student’s performance made on researcher review of the video
content recorded during the delivery of the module ................................. 150
The student’s response to the critical questions ................................................. 152
1. To what extent does the assessment activity challenge the
assessed student? ............................................................................. 152
2. Is a performance, or product, required as a final assessment
outcome?.......................................................................................... 155
3. Does the assessment activity require that transfer of learning has
occurred, by means of demonstration of skill?................................ 157
4. Does the assessment activity require that metacognition is
demonstrated, by means of critical reflection, self-assessment
or evaluation?................................................................................... 159
5. Does the assessment require a product or performance that could
be recognised as authentic by a client or stakeholder?.................... 160
6. Is fidelity required in the assessment environment? And the
assessment tools (actual or simulated)?........................................... 162
7. Does the assessment activity require discussion and feedback?.......... 163
8. Does the assessment activity require that students collaborate?.......... 165
Summary of the student’s response to the application of the critical
questions in the redesign of Module 10 .................................................... 167
The student’s response to the assessment activity.............................................. 170
Discussion........................................................................................................... 172
Chapter 7: Discussion .............................................................................................181
Research questions — Data analysis......................................................... 183
1. To what extent does the assessment activity challenge the
assessed student? ............................................................................. 186
2. Is a performance or product required as a final assessment
outcome?.......................................................................................... 188
3. Does the assessment activity require that transfer of learning has
occurred, by means of demonstration of skill?................................ 189
4. Does the assessment activity require that metacognition is
demonstrated by means of critical reflection, self-assessment
or evaluation?................................................................................... 190
5. Does the assessment require a product or performance that could
be recognised as authentic by a client or stakeholder?.................... 191
6. Is fidelity required in the assessment environment? And the
assessment tools (actual or simulated)?........................................... 193
7. Does the assessment activity require discussion and feedback?.......... 194
8. Does the assessment activity require that students collaborate?.......... 195
Summary of student response and impact on the critical questions.......... 199
Chapter 8: Conclusion.............................................................................................201
Introduction ........................................................................................................ 201
Summary and review of process......................................................................... 201
PHASE 1: Exploration of the problem...................................................... 204
PHASE 2: Development of a solution ...................................................... 204
PHASE 3: Implementation and evaluation ............................................... 206
PHASE 4: Presentation of findings........................................................... 207
Description of the principles............................................................................... 207
Findings of the study .......................................................................................... 209
Principal research question ....................................................................... 210
Subordinate research question 1................................................................ 211
Subordinate research question 2................................................................ 215
Conclusion .......................................................................................................... 217
Limitations of the study...................................................................................... 218
Recommendations for further research............................................................... 219
References ............................................................................................................221
APPENDIX 1 Expert Reviewer Interview Questionnaire ......................................239
APPENDIX 2 Student Evaluation Questionnaire ..................................................243
APPENDIX 3 Student interview Questionnaire.....................................................247
APPENDIX 4 Computer Based Learning Practitioners Course — Module
10 — Evaluating educational multimedia .......................................252
LIST OF TABLES
Table 3.1: The way in which the stages of the design-based research
process are applied in this study...........................................................42
Table 3.2: Summary of the research plan..............................................................61
Table 4.1: Researcher’s synthesis of the elements of authentic assessment
from the literature..................................................................................64
Table 4.2: Researcher’s translation of characteristics to critical elements of
authentic assessment with practitioner feedback .................................67
Table 4.3: Revision of critical elements from expert reviewer feedback to
produce the critical questions ...............................................................96
Table 5.1: Proposed application of the critical questions to the re-design of
Module 10 ...........................................................................................104
Table 6.1: Student Feedback on the Critical Elements........................................179
Table 7.1: Consideration of the students’ responses with reference to the
research questions..............................................................................183
Table 7.2: Student Feedback on the Critical Questions.......................................198
Table 8.1: Stages of the design-based research process in this study................203
LIST OF FIGURES
Figure 3.1: Design-based research (2006, p. 59) ...................................................39
Figure 5.1: Apply the process of educational multimedia evaluation to the
Army’s Training Technology Centre developed Computer Based
Learning Practitioners Course ............................................................110
Figure 5.2: Apply the process of educational multimedia evaluation to a TTC
developed CBLP — Trainees will construct their own ........................113
Figure 5.3: Process of multimedia evaluation model assessment activity ............117
ABSTRACT
Higher education is currently undergoing a period of significant challenge and
transformation. It is likely that these challenges will, in a comparatively short period of
time, lead to changes in the ways in which the higher education experience is both
mediated and accessed. These changes have arisen from a number of factors: the
information revolution and the consequent pace of technological innovation, increased
demand from both employers and government for a more highly skilled workforce, and
the desire to make the higher education experience accessible to an increasing
proportion of the overall population.
All of this has impacted upon the ways in which the higher education experience is
represented, and in turn, by which students gain access to the knowledge and skills that
will underpin their ability to both learn and perform. Higher education is increasingly
being challenged to demonstrate its continued value to the broader community,
especially employers, by ensuring that it provides capable, competent and informed
citizens adequate to the challenges of a twenty-first century lifetime. If these factors
are considered drivers for change, then it is important that the higher education sector
can continue to demonstrate its ongoing value to the students who undertake it.
It is against this background that this study was developed, with the purpose of
identifying from the literature, and then codifying into an applicable framework, the
critical elements that determine an assessment to be authentic. The study took
as its starting point the importance, in the current educational context, of being able to
determine the elements that define an educational experience as being an authentic one.
The research commenced with a review of the literature to identify and collate those
elements that had been identified by previous researchers in the field. Next, these
elements, once refined iteratively in practice, were developed into a framework that
could be applied by the designer of instruction and assessment, in order to ascertain
whether such a framework could be used to support the design of a more authentic
assessment experience. This framework was then applied in practice, the students’
responses to the learning and assessment designed according to these elements were
evaluated, and the elements were further reviewed and revised on the basis of these
data. Thus the study was conducted in four phases: in the first, the researcher explored
the problem; in the second, the researcher developed a solution; in the third, this
solution was implemented and evaluated; and in the fourth, the findings were presented.
The findings of this study suggest not only that it is possible to codify the elements
critical to the determination of authenticity into such a framework, but also that these
elements can be applied systematically in the design of assessment activities. The
implication of this research for educators and educational designers who seek
workplace relevance in the design of their education and assessment activities is that
they will be better placed to identify, and then apply, specific design principles that
help them develop assessment outcomes with clearer workplace applicability.
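In practical terms, the framework takes the form of the eight critical questions listed in the table of contents above. As a purely illustrative sketch, and not something drawn from the thesis itself, those questions could be recorded as a simple design checklist that an instructional designer works through when reviewing a proposed assessment task; the function name and the yes/no structure below are assumptions made solely for illustration.

```python
# Illustrative sketch only: the eight critical questions of authentic assessment
# (as listed in Chapters 4 and 5) treated as a design checklist. The checklist
# structure and function are assumptions for illustration, not part of the thesis.

CRITICAL_QUESTIONS = [
    "To what extent does the assessment activity challenge the assessed student?",
    "Is a performance, or product, required as a final assessment outcome?",
    "Does the assessment activity require that transfer of learning has occurred, "
    "by means of demonstration of skill?",
    "Does the assessment activity require that metacognition is demonstrated, "
    "by means of critical reflection, self-assessment or evaluation?",
    "Does the assessment require a product or performance that could be "
    "recognised as authentic by a client or stakeholder?",
    "Is fidelity required in the assessment environment and in the assessment "
    "tools (actual or simulated)?",
    "Does the assessment activity require discussion and feedback?",
    "Does the assessment activity require that students collaborate?",
]


def review_assessment_design(answers):
    """Given one yes/no answer per critical question, return the questions
    the proposed assessment task does not yet satisfy."""
    if len(answers) != len(CRITICAL_QUESTIONS):
        raise ValueError("One answer is required per critical question.")
    return [q for q, met in zip(CRITICAL_QUESTIONS, answers) if not met]


# Example review: questions 6 and 8 are not yet addressed in the draft design.
unmet = review_assessment_design([True, True, True, True, True, False, True, False])
for question in unmet:
    print("Not yet addressed:", question)
```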
ACKNOWLEDGMENTS
In the completion of this work, I acknowledge the help, guidance and assistance freely
given to me by a number of people over many years, from both within and outside of
the field of education.
First, and foremost, I thank my wife, Tina, for her love and unstinting support, and my
children, Holly, Jamie and Sam, for their continued interest and the questions that have
ensured that I saw this through. I also thank all of them for the understanding that has
enabled me to sacrifice time with them to the completion of this study.
I thank my parents Ken and Catherine whose love, support, guidance and belief ensured
that both I, and my brothers, Jeremy, Alan and Ian, have been adequate to the joys and
challenges that a life can present.
From an educational perspective, I acknowledge the inspiration and assistance given to
me by many people, in particular, Robert Pepper and Lotte Deeble, Bosvigo County
Primary, Cornwall (1971–1974), Rosemary King, Penweathers County Secondary and
Richard Lander Comprehensive, Cornwall (1974–1979) and Margaret Garland and
David Worley of Cornwall College, Cornwall (1979–1982).
I also sincerely acknowledge the assistance and advice of Professor John Hedberg and
Professor Barry Harper.
Finally, I am deeply indebted to, and thank, my supervisors Associate Professor Jan
Herrington and Doctor Christine Brown, who have provided me with both support and
guidance over the last several years, and without whose vision and the means to express
it, I would not have come close to completing this work.
I dedicate this work to my father, Leslie Kenneth Rowe, Kernowyon, (1931–1992) who
understood the true value and importance of education as the enabler that can allow us
to fulfil our potential and raise ourselves up.
My thanks to Jill Ryan for her excellent assistance in proofreading this thesis.
‘Then said a teacher, Speak to us of Teaching.
And he said:
No man can reveal to you aught but that which already lies half asleep in the dawning
of your knowledge.
The teacher who walks in the shadow of the temple, among his followers, gives not of
his wisdom but rather of his faith and his lovingness.
If he is indeed wise he does not bid you to enter the house of his wisdom, but rather
leads you to the threshold of your own mind.’ (Khalil Gibran, 1923, p. 74)