Assessment in Serious Games: An Enhanced Approach for Integrated Assessment Forms and Feedback to Support Guided Learning
Mohammad AL-Smadi and Gudrun Wesiak
Graz University of Technology, Graz, Austria
msmadi@iicm.edu

Christian Guetl
Graz University of Technology, Graz, Austria
Curtin University of Technology, Perth, WA, Australia
cguetl@iicm.edu
Abstract—Over the last 100 years, the learning process has changed from rote repetition to new forms of learning based on understanding, independence, learners' empowerment, and skills improvement. Game-based learning is an example of these new forms of learning, in which experiential and guided intuitive learning are advocated. As a main part of the learning process, assessment is no longer used to discriminate between students; rather, it is used to enhance students' learning and to encourage further progress and success. In this new era of assessment, students play major roles in the assessment process, where they participate in alternative forms of assessment based on their behavior and performance. Moreover, they are provided with timely, high-quality feedback to scaffold their learning process and to maintain their progress and success. This paper proposes an enhanced approach for assessment in serious games through which instructors can define assessment rules to guide students through dynamic feedback. A proof-of-concept has been developed, and first findings show the applicability of providing dynamic assessment and feedback in stealth mode for serious games.
Keywords- e-assessment; dynamic assessment; serious games;
feedback; game-based learning; intuitive guided learning;
achievement.
I. INTRODUCTION
Over the last decades, our modern life has been influenced by the shift towards a more global and knowledge-centered society, accompanied by rapid technological development. Educational systems, including teaching and learning, have been struggling to cope with this shift and its challenges. Therefore, new and modern teaching and learning styles, settings, and practices have emerged to meet those challenges. These modern settings require people to improve their skills as well as their expertise to cope with the rapid changes in their societies [1].
The emergence of Web 2.0 and the influence of information and communication technology (ICT) have fostered e-learning 2.0, which is more interactive, challenging, and situated [2]. Nowadays learners use technology anywhere and anytime, and thus they require learning formats that are challenging and engaging [3]. Given the variety of learning styles and teaching strategies, educators face the challenge of developing learning activities that motivate students and maintain engagement. Therefore, e-learning systems have evolved to incorporate social and collaborative learning tools as well as highly interactive learning material such as games and simulations.
Learner motivation and engagement have become a challenge for developers of e-learning systems. Therefore, learning types with a high level of interaction and challenge, such as game-based learning, have become widely used. The use of games technology for learning is not new, and online games have been available for more than a decade. According to [4], interactive learning environments foster knowledge transfer and the improvement of skills and abilities in general and social skills in particular. A variety of educational online games have become available to increase learners' motivation, support collaborative learning, and help students gain knowledge [5]. Online games range from sophisticated simultaneous games for small groups of players to massive multiplayer online games (MMOGs). World of Warcraft is the most popular Western MMOG title, reaching over $1.4 billion in consumer spending in North America and Europe in 2008; since 2005, cumulative spending on subscriptions has exceeded $2.2 billion [6]. Research advocating the use of video games in education can be found in [7] [8] [9] [10] [11] [12] [13].
The potential of learning through gaming has been highlighted in the literature (cf. [7] [8]). For instance, [8] discusses the impact of serious games on cognitive development and identifies 36 learning principles that can be supported through gaming. Among these principles are the support of different learning types, e.g., intuitive and experiential learning, and the raising of motivation and engagement. When students play, they interact with the game by making decisions and taking right or wrong actions and paths. Serious games should therefore allow the definition of checkpoints (assessment rules) to assess players' interactions and decisions without breaking the non-linearity of the game. Moreover, serious games should provide valuable feedback. However, defining assessment rules to meet the learning objectives during the design and development of the game limits the educator's control over the learning process on the one hand and eliminates the flexibility of having non-linear and intuitive learning paths within the game scenarios on the other. Therefore, this paper discusses an enhanced approach for decoupling assessment from the game platform and giving the educator more flexibility in defining assessment rules based on the learning objectives, the game context, and the targeted audience.
To this end, the rest of this paper is organized as follows: Section II discusses aspects related to assessment and feedback in serious games. Section III proposes an enhanced approach to guide learning in serious games based on assessment and feedback; as part of this approach, a framework to design assessment and feedback for serious games is proposed, the solution architecture is illustrated, and a proof-of-concept is demonstrated. Section IV explores related work from the literature, whereas conclusions and future work are presented in Section V.
II. GUIDING LEARNING IN SERIOUS GAMES THROUGH
ASSESSMENT AND FEEDBACK
Digital game content is highly interactive and thus more engaging. This high level of interaction can be utilized to support learning. When players interact with the game, they take actions pre-defined in the game's action model. These interactions can be used to define assessment rules: the player's activities are monitored and all actions within the game session are logged, so that they can be assessed. Serious games thus represent a challenging as well as a rich domain for assessment practices. However, the efficacy of any assessment approach is highly related to the target demographic, usage context, choice of technology, and underlying pedagogy [10]. Hence, an assessment model that has been evaluated in one setting typically lacks applicability when transferred to other groups of learners, contexts, and educational situations.
Focusing on feedback, literature reviews of feedback in digital educational games highlight the importance of formative models for feedback provision [14] [15]. Reference [16] discusses feedback aspects in digital educational games based on Rogers' classification of feedback into evaluative (players get a score), interpretive (players get a score and the wrong action), supportive (players get a score and guidance information), probing (players get a score and an analysis of why the action was wrong), and understanding (players get a score, an analysis of why the action was wrong, and guidance towards supportive steps or learning material) forms [17]. Moreover, they propose a four-dimensional approach for feedback provision in serious games. According to this approach, the following aspects should be considered (an illustrative encoding of these dimensions is sketched after the list):
Type: feedback type differs based on Roger’s
classification -discussed above- with respect to
students, teachers, or technology thus required
aspects to classify feedback – such as measure
variables, their relationships model, learner model,
knowledge model, and domain model- should be
considered.
Content: content can be classified with respect to
the learning outcomes into essential or desirable.
Format: the media used to represent feedback (e.g.
text, image, voice, etc.).
Frequency: the rate of feedback provision to
students differs with respect to instructors,
technology, pedagogy, and learner preferences
control. Hence, feedback can be immediate,
delayed, or dynamic based on the domain and
learner action type.
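To illustrate, a single feedback definition covering these four dimensions could be encoded as in the following minimal XML sketch; the element and attribute names are illustrative assumptions and are not part of the approach described in [16].

    <!-- Hypothetical sketch: one feedback definition annotated along the four
         dimensions discussed above; element and attribute names are illustrative.
           type      : Rogers' classification (evaluative, interpretive, supportive, ...)
           content   : relevance to the learning outcomes (essential or desirable)
           format    : media used to present the feedback (text, image, voice, ...)
           frequency : timing of provision (immediate, delayed, or dynamic) -->
    <feedback type="supportive" content="essential" format="text" frequency="immediate">
      <message>Crawl below the smoke so that you can breathe more easily.</message>
    </feedback>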
III. ENHANCED APPROACH FOR ASSESSMENT IN SERIOUS
GAMES
Providing assessment and feedback in serious games highly depends on the game context, the nature of the scenario within the game, and the learning objectives, or underlying pedagogy, to be achieved by using the game in education. Therefore, educational games should be integrated with LMSs so that their content, scenarios, and didactic objectives, e.g., the type of feedback provided, can be adapted to the learners' (i.e., players') preferences as well as their skill and knowledge state (cf. [18]). LMSs can use the log of player interactions within the game session to provide more personalized and adaptive content. The player's flow within the game forms a kind of learning path; a third-party tool is needed to interact with the game engine, retrieve the player state, and communicate with the LMS so that the learner model can be updated and adaptive, personalized content can be provided during the next phases of the game.
To this end, providing assessment for serious games holds some challenges, especially when it comes to providing dynamic evaluation and feedback for players (i.e., students) based on their progress and interactions within the game [19] [20] [21]. In order to tackle this problem, an enhanced approach for integrated assessment in 'stealth' mode has been developed. The next sub-sections discuss in detail the framework of the approach and the solution architecture with its developed components, and provide proof-of-concept scenarios based on them.
A. Framework for Dynamic Assessment and Feedback
Provision in Serious Games
This sub-section proposes a framework for externalizing assessment and feedback from the game engine based on achievements. The framework considers three main phases: (a) the game development phase, in which aspects such as target users, learner and game context, pedagogical approaches, and game fidelity are considered by the game developer, who tags game objects with pedagogic impact and uses them to define potential achievements players can earn during the game; (b) the assessment and feedback definition phase, in which the educator defines assessment rules based on possible interactions of the player with the tagged game objects from the development phase. Moreover, feedback can be defined in the meta-description of the assessment rule as part of the consequences tag (see Fig. 2). The feedback can range from simple guidance messages through initiating a dialogue with a virtual character to changing the current scene, the game level, or achievements by giving points, as sketched below; and (c) the gameplay phase, where the assessment rules are triggered based on the player's interactions with the game and subsequent feedback is provided to the player through the communication mechanism.
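To make the range of possible consequences concrete, the following minimal XML sketch shows a consequences section covering the feedback types mentioned for phase (b); the tag and attribute names are illustrative assumptions rather than the exact schema of the assessment model.

    <!-- Hypothetical sketch of a consequences section; tag and attribute names
         are illustrative, not the actual schema of the assessment model. -->
    <consequences>
      <!-- a simple guidance message shown to the player -->
      <feedback format="text">You should not collect your possessions before evacuating.</feedback>
      <!-- initiate a dialogue with a virtual character -->
      <action type="startDialogue" character="teacher"/>
      <!-- change the current scene or game level -->
      <action type="changeScene" scene="corridor_ground_floor"/>
      <!-- award (or deduct) points towards an achievement -->
      <action type="givePoints" achievement="safe_evacuation" points="-10"/>
    </consequences>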
This framework has been used to provide achievement-based assessment and feedback for a serious game developed to teach school pupils how to evacuate the school building in case of a fire threat. The next sub-sections explain in more detail how the assessment framework can be used to provide flexible and integrated assessment and feedback to guide learning in serious games. Moreover, they discuss technical aspects related to the solution architecture and the assessment and feedback types and specifications.
Figure 1. Architecture for an enhanced approach for integrated
assessment in serious games
B. Solution Architecture
The architectural design of the proposed assessment
approach is based on the service-oriented flexible and
interoperable assessment (SOFIA) [22]. This service-
oriented approach has been used to foster flexible integration
of the game in the enhanced learning environment. The
architecture is designed to consider two main scenarios for
assessment:
• Dynamic Assessment and Feedback: an 'assessment interface' is attached to the game engine in order to handle events coming from the players' interactions; it calls the 'assessment engine' to evaluate those actions based on pre-defined assessment rules in the 'assessment model' and provides the pre-defined feedback associated with those assessment rules dynamically to the player.
• Post Evaluation: a 'log file' has been designed to hold all the actions related to the assessment scenario for a specific context, e.g. fire evacuation training, by tracking the players' interactions. Moreover, an 'assessment engine' is developed to interact with an 'assessment model' to evaluate the player's progress, represented by the log file actions, against pre-authored assessment rules in order to assess specific learning objectives, e.g. crawling in smoky areas during evacuation.
As depicted in Fig. 1, the architecture consists of the following components:
• Assessment Model: an XML-based description of behaviour patterns and associated consequences. Behaviour patterns are defined through sequences of possible player actions and conditional matches, while consequences have the primary goal of providing feedback (messages or actions) to the player within the game engine after the assessment engine detects a specific pattern. Consequences can also take the form of actions that enable internal measurement operations (e.g., stopwatches). The assessment model is authored by the teacher based on the achievements and their associated objects and interactions from the game development phase. Moreover, the four-dimensional approach for feedback provision in serious games is used to define the feedback as part of the assessment and feedback definition phase. Fig. 2 depicts an example of an assessment rule defined to teach students not to collect their possessions during a fire evacuation.
• Assessment Engine: loads the related assessment model once it is invoked; the retrieval of the assessment model is based on the learning task and context. Using the model, the assessment engine analyses and matches all applicable assessment rules when it is invoked by the game engine with new game-flow events. Possible event sources are log files (for the post-evaluation scenario) or direct calls from the game engine (for the dynamic assessment and feedback scenario).
• Assessment Interface: handles the communication between the game engine and the assessment engine. The assessment engine is managed through a web service developed as part of the SOFIA middleware. For this web service, an interface is provided and used to call the assessment engine's methods. The service is described using the Web Services Description Language (WSDL) and uses the Simple Object Access Protocol (SOAP) for message communication and transport (an illustrative call is sketched after the component list).
• Log file: created by the game engine, which tracks the player's interactions and environment changes and logs them in an XML-based log file. The log file is used for post evaluation to provide a report based on the player's behaviour and performance within the game environment. Fig. 3 depicts the part of the log file that records the action of collecting an object (i.e., a bag) during fire evacuation.
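As an illustration of the communication through the assessment interface, a dynamic-assessment call could be a SOAP message along the following lines. This is only a minimal sketch: the operation name (evaluateEvent), the namespace, and the payload fields are assumptions for illustration, since the concrete WSDL of the SOFIA service is not reproduced here.

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Hypothetical SOAP request forwarding a game-flow event to the assessment
         engine; operation name, namespace, and payload fields are assumptions. -->
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
                   xmlns:ae="http://example.org/sofia/assessment">
      <soap:Body>
        <ae:evaluateEvent>
          <ae:sessionId>player-42</ae:sessionId>
          <ae:event object="bag" action="collect" scene="classroom" time="00:01:37"/>
        </ae:evaluateEvent>
      </soap:Body>
    </soap:Envelope>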
Figure 2. An example of an assessment rule to teach students not to
collect their possessions during fire evacuation.
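A rule of the kind shown in Fig. 2 could, for example, combine a behaviour pattern with a feedback consequence as in the following minimal XML sketch; the element names are illustrative assumptions and do not reproduce the exact schema of the assessment model.

    <!-- Hypothetical sketch of an assessment rule discouraging the collection of
         possessions during evacuation; element names are illustrative. -->
    <assessmentRule id="no-possessions">
      <pattern>
        <!-- triggered when the player interacts with the tagged object 'bag' -->
        <playerAction object="bag" action="collect"/>
        <condition context="fire_evacuation"/>
      </pattern>
      <consequences>
        <feedback type="interpretive" format="text" frequency="immediate">
          You took your bag during fire evacuation! You should not collect
          your possessions before evacuating.
        </feedback>
      </consequences>
    </assessmentRule>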
Figure 3. Part of the log file related to the event of a player collecting their bag during fire evacuation.
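A log record for this event might take the following form; again, this is a hypothetical sketch, and the element and attribute names are illustrative assumptions.

    <!-- Hypothetical sketch of a log entry for the 'collect bag' event;
         element and attribute names are illustrative. -->
    <logEntry timestamp="00:01:37">
      <player id="player-42"/>
      <action type="collect" object="bag" scene="classroom"/>
      <environment fireAlarm="true" smoke="false"/>
    </logEntry>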
C. Proof-of-Concept
The proposed approach has been developed in the context of the ALICE project (http://www.aliceproject.eu/) to provide integrated assessment forms for serious games [23]. The assessment approach aims at providing feedback in order to support students, through guided learning, in learning how to evacuate a school building in case of a fire threat. The game, developed at the Serious Games Institute (SGI) at Coventry University, adopts a freely navigable 3D environment created with the Unity engine (http://unity3d.com/). The game contains elements of crowd simulation within fire evacuation scenarios, effectively placing the player within the building and monitoring their actions as they evacuate. To provide effective feedback and assessment, it is essential that the game monitors and correctly identifies key actions which may indicate correct or incorrect behaviors.
Based on the framework discussed earlier, the principal means of assessment is the implementation of virtual 'checkpoints' within each scenario, recording players' time and state as they pass within a radius of a single point in the virtual space. In addition, the game designer annotates the pedagogical objects and shares these annotations as an XML file with the LMS. The annotations are used by the assessment designer to define assessment rules and thus to provide feedback to the player once an event regarding one of these pedagogical objects is sent to the Assessment Engine.
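The annotation file exchanged between the game designer and the LMS could, for instance, take the following form. This is a hypothetical sketch based on the scenarios described in this paper (bag, elevator, smoky areas); the tag and attribute names are illustrative assumptions.

    <!-- Hypothetical sketch of pedagogical object annotations exported by the
         game designer for the assessment designer; names are illustrative. -->
    <pedagogicalObjects scenario="fire_evacuation">
      <object id="bag" location="classroom">
        <interaction>collect</interaction>
      </object>
      <object id="elevator" location="ground_floor_corridor">
        <interaction>use</interaction>
      </object>
      <area id="smoke_zone_1" location="stairwell" property="smoke">
        <interaction>crawl</interaction>
      </area>
    </pedagogicalObjects>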
Figure 4. The player's action of collecting his possessions triggers an assessment rule whose consequence is to provide feedback through a virtual character.
The game engine tracks the player's behaviour and environment changes and saves them to a log file. More precisely, the player's actions on tagged pedagogical objects fire an event that saves a record to the log file. A JavaScript function sends this event through the interface to the assessment engine. For instance, a scenario for teaching students that they should not collect their possessions during a fire evacuation was tested using this approach. The instructor uses the Assessment Editor and designs an assessment rule for the game object 'bag', with the feedback message "You took your bag during fire evacuation! You should not collect your possessions before evacuating" added to the consequence section of the assessment rule (see Fig. 2). During play, if the player collects his bag before leaving the classroom, the action fires an event and saves it to the log file (see Fig. 3). The JavaScript function in the Web-based game player calls the interface with that event. The Assessment Engine evaluates the event against the assessment rule defined by the instructor and replies with the feedback message. This assessment scenario can thus be used to guide the player not to collect their possessions during a fire evacuation using dynamic feedback provision (see Fig. 4).
Similarly, another scenario was used to teach students not to use the elevator during a fire evacuation. Moreover, more complex scenarios based on context, time, and environment state are used to evaluate the player's behaviour. For instance, crawling in areas of smoke was evaluated: in the assessment model, rules based on environment state and action can be defined. Hence, when the player does not crawl in sections where breathing is difficult (i.e., tagged by the game designer as areas with smoke), feedback is provided based on the assessment rule to stimulate the desired behaviour of crawling.
IV. RELATED WORK
Several related approaches to assessment in digital educational games can be found in the literature. One example is the extension of the evidence-centered design (ECD) assessment model [24] with an action model instead of a task model, which has been used with Bayesian networks to track player actions within the game and to provide evidence of progress [19]. The aim of that research is to use a so-called stealth assessment approach within immersive games to track players' actions and, with respect to the ECD model, to provide formative and dynamic feedback to support students' learning. However, the discussed approach is rather summative: it evaluates the player's progress in terms of interactions and compares it with the evidence model (part of ECD) to provide evidence of learning and skill achievement.
Another example is the so-called micro-adaptivity approach for assessment in educational games [21] [25]. The approach has been developed in the context of the Enhanced Learning Experience and Knowledge TRAnsfer (ELEKTRA) project (http://www.elektraproject.org). The ELEKTRA framework uses Competence-based Knowledge Space Theory (CbKST) to model the competencies required by the student to achieve a learning goal. The basic idea of CbKST is to associate problems in a domain with skills in order to provide a model of competencies for a specific domain, which can be used to update the knowledge and skill state of the learner in that domain. The ELEKTRA game and its successor, the 80Days game (http://www.eightydays.eu), track the player's interactions and use them to update the competence state represented in the CbKST for the learning domains provided in the games. However, according to [25], the approach places an extra load on authoring, since all required information for the models has to be defined, as well as a computational load, since the game updates the CbKST after each player action.
Another example is the adventure game engine <e-Adventure> [20]. Its authors claim that <e-Adventure> is flexible enough to be used with educational modeling languages, i.e., IMS Learning Design (IMS LD), to design the pedagogical impact of assessment rules in the game engine, to evaluate the progress of the players, and hence to provide personalized and adaptive digital educational games.
V. CONCLUSIONS AND FUTURE WORK
Video games have a prominent impact on raising motivation and engagement. However, using video games for education demands a compromise between entertainment and instruction; moreover, it requires additional attention to game design. It is intuitively plausible that serious games have a positive impact on students' motivation and engagement [7] [8] [12]. However, designing games for education demands further alignment with instruction and learning, among other things the stimulation of pedagogically desired behaviors within the game. Quality assessment forms play a major role in raising motivation and engagement [26] [27]. Therefore, this paper proposes an assessment and feedback approach to guide learning in serious games.
The proposed approach builds on a pedagogic-
prominent framework to design assessment and feedback
apart from the game engine and use them to stimulate
desired learning. Through an externalization of the
assessment process, the framework supports interaction
between educators and serious game designers, allowing
pedagogic content to be identified at the development stage
and then manipulated dynamically in response to learner
actions and interactions within an overarching set of
pedagogic goals defined by the educator or trainer. The
method supports integration with automated assessment
technologies, allowing such tools to recognize and respond
immediately to learner actions by modifying the game
environment or triggering feedback.
The decoupling of the serious game, as a complex learning resource, from the assessment engine, utilized via web services, fosters the accommodation of various learning contexts and pedagogical approaches. A proof-of-concept is illustrated in Section III, where the player receives feedback from a virtual character within the game, through which the player is guided to learn by taking actions (correct or incorrect). The approach advocates an intuitive way of learning by allowing the learner to explore the game environment and eventually receive constructive feedback. Looking ahead, the main concern for future work is to experiment with this approach in different 3D virtual worlds in order to evaluate the flexibility and reliability of the architecture as well as to scale the features provided for assessment and feedback. A further goal is to enhance the assessment model to optimize the evaluation of players' behaviour, so as to provide not only interpretive feedback (players get a score and the wrong action) but also other forms of feedback such as supportive (players get a score and guidance information), probing (players get a score and an analysis of why the action was wrong), and understanding (players get a score, an analysis of why the action was wrong, and guidance towards supportive steps or learning material) [17].
ACKNOWLEDGMENT
This work has been supported by the European
Commission under the Collaborative Project ALICE
”Adaptive Learning via Intuitive/Interactive, Collaborative
and Emotional Systems”, VII Framework Programme,
Theme ICT-2009.4.2 (Technology-Enhanced Learning),
Grant Agreement n. 257639.
REFERENCES
[1] C. Gütl and V. Chang, "Ecosystem-based Theoretical Models for Learning in Environments of the 21st Century", International Journal of Emerging Technologies in Learning (iJET), December 2008.
[2] M. AL-Smadi, C. Guetl, and V. Chang, "Addressing e-Assessment Practices in e-Learning Activities: A Review", Proceedings of Global Learn Asia Pacific 2011 - Global Conference on Learning and Technology, Melbourne, Australia, March 28 - April 1, 2011.
[3] M. Prensky, "Digital Natives, Digital Immigrants", On the Horizon, 9(5), NCB University Press, October 2001.
[4] W. C. Kriz, "Creating Effective Learning Environments and Learning Organizations through Gaming and Simulation Design", Simulation and Gaming, Vol. 34, Iss. 4, pp. 495-511, 2003.
[5] C. Gütl, "The Support of Virtual 3D Worlds for Enhancing Collaboration in Learning Settings", in F. Pozzi and D. Persico (Eds.), Techniques for Fostering Collaboration in Online Learning Communities: Theoretical and Practical Perspectives, IGI Global, pp. 278-299, 2011.
[6] Screen Digest, "There's life beyond World of Warcraft. Subscription MMOGs: Life beyond World of Warcraft", retrieved on 8 December 2009 from http://www.screendigest.com/press/releases/pdf/PR-LifeBeyondWorldOfWarcraft-240309.pdf
[7] M. Prensky, Digital Game-Based Learning. McGraw-Hill, 2001.
[8] J. P. Gee, What Video Games Have to Teach Us About Learning and Literacy. Palgrave Macmillan, 2003.
[9] S. de Freitas, "Learning in Immersive Worlds: A Review of Game-Based Learning", JISC e-Learning Programme, 2006.
[10] S. de Freitas and M. Oliver, "A Four Dimensional Framework for the Evaluation and Assessment of Educational Games", paper accepted for the Computer Assisted Learning Conference, 2005.
[11] R. Van Eck, "Digital Game-Based Learning: It's Not Just the Digital Natives Who Are Restless", Educause Review, 41, pp. 16-30, 2006.
[12] E. Klopfer, S. Osterweil, and K. Salen, Moving Learning Games Forward: Obstacles, Opportunities and Openness. The Education Arcade, Massachusetts Institute of Technology, 2009.
[13] J. McGonigal, Reality Is Broken: Why Games Make Us Better and How They Can Change the World. Penguin Books, 2011.
[14] V. J. Shute, "Focus on Formative Feedback", Review of Educational Research, 78(1), pp. 153-189, 2008.
[15] E. H. Mory, "Feedback Research Revisited", in D. H. Jonassen (Ed.), Handbook of Research on Educational Communications and Technology, 2nd Edition, Lawrence Erlbaum Associates, pp. 745-783, 2004.
[16] I. Dunwell, S. Jarvis, and S. de Freitas, "Four-Dimensional Consideration of Feedback in Serious Games", in P. Maharg and S. de Freitas (Eds.), Digital Games and Learning, Continuum Publishing, 2010.
[17] C. Rogers, Client-Centered Therapy: Its Current Practice, Implications and Theory. London: Constable, 1951.
[18] P. Moreno-Ger, D. Burgos, J. L. Sierra, and B. Fernández-Manjón, "Educational Game Design for Online Education", Computers in Human Behavior, 24, pp. 2530-2540, 2008.
[19] V. J. Shute, M. Ventura, M. I. Bauer, and D. Zapata-Rivera, "Melding the Power of Serious Games and Embedded Assessment to Monitor and Foster Learning: Flow and Grow", in U. Ritterfeld, M. Cody, and P. Vorderer (Eds.), Serious Games: Mechanisms and Effects, pp. 295-321. Mahwah, NJ: Routledge, Taylor and Francis, 2009.
[20] D. Burgos, P. Moreno-Ger, J. L. Sierra, B. Fernández-Manjón, M. Specht, and R. Koper, "Building Adaptive Game-Based Learning Resources: The Marriage of IMS Learning Design and <e-Adventure>", Simulation & Gaming, 39, pp. 414-431, 2008.
[21] M. D. Kickmeier-Rust, C. M. Steiner, and D. Albert, "Non-invasive Assessment and Adaptive Interventions in Learning Games", International Conference on Intelligent Networking and Collaborative Systems (INCoS 2009), pp. 301-305, 2009.
[22] M. AL-Smadi and C. Guetl, "Service-Oriented Flexible and Interoperable Assessment: Towards a Standardised e-Assessment System", International Journal of Continuing Engineering Education and Life-Long Learning, Vol. 21, No. 4, pp. 289-307, 2011.
[23] M. Al-Smadi, C. Guetl, I. Dunwell, and S. Caballe, "D5.2.2: Enriched Learning Experience V2", ALICE (Adaptive Learning via Intuitive/Interactive, Collaborative and Emotional Systems) project, co-funded by the European Commission within the 7th Framework Programme (2007-2013), Grant Agreement n. 257639, 2012.
[24] R. Almond, L. Steinberg, and R. Mislevy, "Enhancing the Design and Delivery of Assessment Systems: A Four Process Architecture", The Journal of Technology, Learning, and Assessment, 1(5), 2002.
[25] M. Kickmeier-Rust and D. Albert, "Micro-adaptivity: Protecting Immersion in Didactically Adaptive Digital Educational Games", Journal of Computer Assisted Learning, 26, pp. 95-105, 2010.
[26] R. J. Stiggins, Student-Involved Assessment for Learning (4th ed.). Upper Saddle River, NJ: Prentice-Hall, 2001.
[27] W. Harlen, "The Role of Assessment in Developing Motivation for Learning", in J. Gardner (Ed.), Assessment and Learning, pp. 61-80. London: Sage Publications, 2006.