Assessment in Serious Games
An Enhanced Approach for Integrated Assessment Forms and Feedback to Support Guided
Learning
Mohammad AL-Smadi and Gudrun Wesiak
Graz University of Technology,
Graz, Austria
msmadi@iicm.edu
Christian Guetl
Graz University of Technology, Graz, Austria
Curtin University of Technology, Perth, WA, Australia
cguetl@iicm.edu
Abstract—Over the last 100 years, the learning process has changed from rote repetition to new forms of learning based on understanding, independence, learner empowerment, and skills improvement. Game-based learning is an example of these new forms, in which experiential learning and guided intuitive learning are advocated. As a main part of the learning process, assessment is no longer used to discriminate between students; rather, it is used to enhance students' learning and to encourage further progress and success. In this new era of assessment, students play major roles in the assessment process, participating in alternative forms of assessment based on their behavior and performance. Moreover, they are provided with timely, high-quality feedback to scaffold their learning process and to maintain their progress and success. This paper proposes an enhanced approach for assessment in serious games through which instructors can define assessment rules to guide students through dynamic feedback. A proof-of-concept has been developed, and first findings show the applicability of providing dynamic assessment and feedback in stealth mode for serious games.
Keywords- e-assessment; dynamic assessment; serious games;
feedback; game-based learning; intuitive guided learning;
achievement.
I. INTRODUCTION
Over the last decades, our modern life has been influenced by the shift to a more global and knowledge-centered society, along with rapid developments in technology. Educational systems - including teaching and learning - have been struggling to cope with this shift and its challenges. Therefore, new and modern teaching and learning styles, settings, and practices have emerged to meet these challenges. These modern settings require people to improve their skills as well as their expertise to cope with the rapid changes in their societies [1].
The emergence of Web 2.0 and the influence of information and communication technology (ICT) have fostered e-learning 2.0, which is more interactive, challenging, and situated [2]. Nowadays, learners use technology anywhere and anytime, and thus they require learning types that are adequately challenging and engaging [3]. Given the different learning styles and teaching strategies, educators are faced with the challenge of developing learning activities that motivate students and maintain engagement. Therefore, e-learning systems have evolved to incorporate social and collaborative learning tools as well as highly interactive learning material such as games and simulations.
Learner motivation and engagement have become a challenge for e-learning system developers. Therefore, learning types with a high level of interaction and challenge - such as game-based learning - have become widely used. The use of games technology for learning is not new, and online games have been available for more than a decade. According to [4], interactive learning environments foster knowledge transfer and the improvement of skills and abilities in general, and of social skills in particular. A variety of educational online games have become available that increase learners' motivation, support collaborative learning, and may help students gain knowledge [5]. Sophisticated online games exist for settings ranging from small groups of players to massive multiplayer online games (MMOGs). World of Warcraft is the most popular Western MMOG title, reaching over $1.4 billion in consumer spending in North America and Europe in 2008 and over $2.2 billion in cumulative subscription spending since 2005 [6]. Research advocating the use of video games in education can be found in [7] [8] [9] [10] [11] [12] [13].
The potential of learning through gaming has been highlighted in the literature (cf. [7] [8]). For instance, [8] discusses the impact of serious games on cognitive development and identifies 36 learning principles that can be supported through gaming. Among these principles are the support of different learning types - e.g. intuitive and experiential learning - and the raising of motivation and engagement. When students play, they interact with the game by making decisions and taking right or wrong actions and paths. Serious games should therefore make it possible to define checkpoints (assessment rules) so that players' interactions and decisions can be assessed without breaking the non-linearity of the game. Moreover, serious games should provide valuable feedback. However, defining assessment rules to meet the learning objectives during the design and development of the game limits the educator's control over the learning process on the one hand and eliminates the flexibility of having non-linear and intuitive learning paths within the game scenarios on the other. Therefore, this paper discusses an enhanced approach for decoupling assessment from the game platform, giving the educator more flexibility in defining assessment rules based on the learning objectives, the game context, and the target audience.
To this end, the rest of this paper is organized as follows: Section II discusses aspects related to assessment and feedback in serious games. Section III proposes an enhanced approach to guide learning in serious games based on assessment and feedback; as part of this approach, a framework to design assessment and feedback for serious games is proposed, the solution architecture is illustrated, and a proof-of-concept is demonstrated. Section IV explores related work from the literature, and conclusions and future work are presented in Section V.
II. GUIDING LEARNING IN SERIOUS GAMES THROUGH
ASSESSMENT AND FEEDBACK
Digital game content is highly interactive and thus more engaging. This high level of interaction can be utilized to support learning: when players interact with the game, they take actions that are pre-defined in the game's model of actions, and these interactions can be used to define assessment rules. By monitoring player activities and logging all actions within the game session, the player's behaviour within the game can be assessed. Serious games thus represent a challenging as well as a rich domain for assessment practices. However, the efficacy of any assessment approach is highly related to the target demographic, usage context, choice of technology, and underlying pedagogy [10]. Hence, an assessment model that works well in one setting typically lacks applicability when transferred to other groups of learners, contexts, and educational situations.
Focusing on feedback, literature reviews of feedback in digital educational games highlight the importance of formative models for feedback provision [14] [15]. Reference [16] discusses feedback aspects in digital educational games based on Rogers' classification of feedback into evaluative (players get a score), interpretive (players get a score and the wrong action), supportive (players get a score and guidance information), probing (players get a score and an analysis of why the action was wrong), and understanding (players get a score, an analysis of why the action was wrong, and guidance towards supportive steps or learning material) forms [17]. Moreover, they propose a four-dimensional approach for feedback provision in serious games. According to their approach, the following aspects should be considered (a schematic sketch of these dimensions is given after the list):
• Type: the feedback type differs, following Rogers' classification discussed above, with respect to students, teachers, or technology. Thus, the aspects required to classify feedback - such as the measured variables and their relationship model, the learner model, the knowledge model, and the domain model - should be considered.
• Content: content can be classified with respect to
the learning outcomes into essential or desirable.
• Format: the media used to represent feedback (e.g.
text, image, voice, etc.).
• Frequency: the rate of feedback provision to students differs with respect to instructors, technology, pedagogy, and learner preferences. Hence, feedback can be immediate, delayed, or dynamic, depending on the domain and the type of learner action.
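To make these dimensions more tangible, the following TypeScript sketch models a feedback descriptor along the four dimensions and Rogers' classification. The type and field names, as well as the example message, are illustrative assumptions and do not come from the implementation described in this paper.

// Minimal sketch of the four feedback dimensions; names are assumptions.
type FeedbackType =
  | "evaluative"     // score only
  | "interpretive"   // score + wrong action
  | "supportive"     // score + guidance information
  | "probing"        // score + analysis of why the action was wrong
  | "understanding"; // score + analysis + guidance or learning material

type FeedbackContent = "essential" | "desirable";            // relation to learning outcomes
type FeedbackFormat = "text" | "image" | "voice" | "video";  // media used to present feedback
type FeedbackFrequency = "immediate" | "delayed" | "dynamic";

interface FeedbackDescriptor {
  type: FeedbackType;
  content: FeedbackContent;
  format: FeedbackFormat;
  frequency: FeedbackFrequency;
  message: string; // the concrete feedback shown to the player (invented example below)
}

// Example: supportive, essential feedback delivered dynamically as text.
const exampleFeedback: FeedbackDescriptor = {
  type: "supportive",
  content: "essential",
  format: "text",
  frequency: "dynamic",
  message: "Head for the nearest marked exit and stay calm.",
};

console.log(exampleFeedback.message);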
III. ENHANCED APPROACH FOR ASSESSMENT IN SERIOUS
GAMES
Providing assessment and feedback in serious games highly depends on the game context, the nature of the scenario within the game, and the learning objectives - or underlying pedagogy - to be achieved through using the game in education. Therefore, educational games should be integrated with LMSs so that their content, scenarios, and didactic objectives - e.g. the type of provided feedback - can be adapted to the learners' (i.e. players') preferences and to their skill and knowledge state (cf. [18]). LMSs use the log of player interactions within the game session to provide more personalized and adaptive content. The player's flow within the game forms a kind of learning path, and a third-party tool is needed to interact with the game engine, retrieve the player's state, and communicate with the LMS so that the learner model can be updated and adaptive, personalized content can be provided during the next phases of the game.
Providing assessment for serious games holds some challenges, especially when it comes to providing dynamic evaluation and feedback to players - i.e. students - based on their progress and interactions within the game [19] [20] [21]. In order to tackle this problem, an enhanced approach for integrated assessment in 'stealth' mode has been developed. The next sub-sections discuss in detail the framework of the approach, the solution architecture and developed components, and proof-of-concept scenarios based on them.
A. Framework for Dynamic Assessment and Feedback
Provision in Serious Games
This sub-section proposes a framework for externalizing assessment and feedback from the game engine based on achievements. The framework considers three main phases: (a) the game development phase, in which aspects such as target users, learner and game context, pedagogical approaches, and game fidelity are considered by the game developer in order to tag game objects with pedagogic impact and to use them to define potential achievements players can attain during the game; (b) the assessment and feedback definition phase, in which the educator defines assessment rules based on possible interactions of the player with the tagged game objects from the development phase. Feedback can be defined in the meta-description of the assessment rule as part of the consequences tag (see Fig. 2) and can range from simple guidance messages to initiating a dialogue with a virtual character, changing the current scene or game level, or awarding points towards achievements; and (c) the gameplay phase, in which the assessment rules are triggered by the player's interactions with the game and the subsequent feedback is provided to the player through the communication mechanism.
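To illustrate the artifacts produced in phases (a) and (b), the following TypeScript sketch models a tagged game object and an assessment rule whose consequence carries the feedback. The structures only approximate the XML meta-description referred to in Fig. 2; all names and fields are hypothetical.

// Hypothetical sketch of development-phase and definition-phase artifacts.
interface TaggedGameObject {
  id: string;               // identifier assigned by the game developer
  pedagogicImpact: string;  // why the object matters for learning
  actions: string[];        // player actions the object supports
}

interface Consequence {
  feedbackMessage?: string; // simple guidance message
  triggerDialogue?: string; // id of a virtual-character dialogue
  changeScene?: string;     // scene or level to switch to
  awardPoints?: number;     // points counted towards achievements
}

interface AssessmentRule {
  objectId: string;         // refers to a TaggedGameObject
  triggeringAction: string; // player interaction that fires the rule
  consequence: Consequence; // defined by the educator
}

// Example: the educator discourages collecting possessions during evacuation.
const bag: TaggedGameObject = {
  id: "bag",
  pedagogicImpact: "collecting possessions delays evacuation",
  actions: ["collect"],
};

const noBagRule: AssessmentRule = {
  objectId: bag.id,
  triggeringAction: "collect",
  consequence: {
    feedbackMessage:
      "You took your bag during fire evacuation! " +
      "You should not collect your possessions before evacuating.",
  },
};

console.log(noBagRule.consequence.feedbackMessage);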
This framework has been used to provide achievement-based assessment and feedback in a serious game developed to teach school pupils how to evacuate the school building in case of a fire threat. The next sub-sections explain in more detail how the assessment framework can be used to provide flexible and integrated assessment and feedback to guide learning in serious games. Moreover, they discuss technical aspects related to the solution architecture and to the types and specifications of assessment and feedback.
Figure 1. Architecture for an enhanced approach for integrated
assessment in serious games
B. Solution Architecture
The architectural design of the proposed assessment approach is based on service-oriented flexible and interoperable assessment (SOFIA) [22]. This service-oriented approach is used to foster flexible integration of the game into the enhanced learning environment. The architecture is designed to support two main scenarios for assessment (a minimal sketch of the two entry points is given after the list):
• Dynamic Assessment and Feedback: an 'assessment interface' is attached to the game engine in order to handle events coming from player interactions; it calls the 'assessment engine' to evaluate those actions against the pre-defined assessment rules in the 'assessment model' and dynamically provides the player with the feedback associated with those rules.
• Post Evaluation: a 'log file' holds all the actions related to the assessment scenario for a specific context - e.g. fire evacuation training - by tracking the players' interactions. The 'assessment engine' interacts with the 'assessment model' to evaluate the player's progress - represented by the logged actions - against pre-authored assessment rules that address specific learning objectives - e.g. crawling in smoky areas during evacuation.
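The sketch below outlines the two entry points as a minimal TypeScript interface. The method names (evaluateEvent, evaluateLog) and data shapes are assumptions for illustration and are not the actual SOFIA API.

// Illustrative sketch of the two assessment scenarios; names are assumptions.
interface GameEvent {
  timestamp: number; // when the interaction happened (ms since game start)
  objectId: string;  // tagged game object involved
  action: string;    // player action, e.g. "collect", "use"
}

interface Feedback {
  message: string;
}

interface AssessmentEngine {
  // Dynamic assessment and feedback: called per event by the assessment
  // interface attached to the game engine; feedback is returned immediately.
  evaluateEvent(event: GameEvent): Feedback | null;

  // Post evaluation: the full log of a game session is evaluated against the
  // pre-authored assessment rules to produce a report of matched feedback.
  evaluateLog(log: GameEvent[]): Feedback[];
}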
As depicted in Fig. 1, the architecture consists of the following components:
• Assessment Model: an XML-based description of behaviour patterns and associated consequences. Behaviour patterns are defined through sequences of possible player actions and conditional matches, while consequences have the primary goal of providing feedback - messages or actions - to the player within the game engine after the assessment engine detects a specific pattern. Consequences can also take the form of actions that enable internal measurement operations (e.g. stopwatches). The assessment model is authored by the teacher based on the achievements and their associated objects and interactions from the game development phase. Moreover, the four-dimensional approach for feedback provision in serious games is used to define feedback as part of the assessment and feedback definition phase. Fig. 2 depicts an example of an assessment rule defined to teach students not to collect their possessions during fire evacuation.
• Assessment Engine: loads the related assessment model once it is invoked; the retrieval of the assessment model is based on the learning task and context. Using the model, the assessment engine analyses and matches all applicable assessment rules when it is invoked by the game engine with new game-flow events. Possible event sources are log files - for the post-evaluation scenario - or direct calls from the game engine - for the dynamic assessment and feedback scenario.
• Assessment Interface: handles the communication between the game engine and the assessment engine. The assessment engine is managed through a web service developed as part of the SOFIA middleware; for this web service, an interface is provided and used to call the assessment engine's methods. The service is described using the Web Services Description Language (WSDL) and uses the Simple Object Access Protocol (SOAP) for message communication and transport (a sketch of such a call is given after this list).
• Log File: created by the game engine, which tracks the player's interactions and environment changes and logs them in an XML-based log file. The log file is used for post evaluation to provide a report based on player behaviour and performance within the game environment. Fig. 3 depicts the part of the log file that records the action of collecting an object - i.e. a bag - during fire evacuation.
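Since the assessment interface exposes the assessment engine as a SOAP web service, a game-side call could look roughly like the sketch below. The endpoint URL, XML namespace, operation name, and message fields are purely hypothetical; only the general SOAP-over-HTTP pattern is taken from the description above.

// Hypothetical sketch of a game-side SOAP call; all names are assumptions.
const ASSESSMENT_ENDPOINT = "https://example.org/sofia/assessmentService"; // assumed URL

function buildSoapEnvelope(objectId: string, action: string): string {
  return `<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <evaluateEvent xmlns="urn:example:assessment">
      <objectId>${objectId}</objectId>
      <action>${action}</action>
    </evaluateEvent>
  </soap:Body>
</soap:Envelope>`;
}

// Sends one game-flow event and returns the raw SOAP response, which would
// carry the feedback defined in the matching assessment rule (if any).
async function sendEvent(objectId: string, action: string): Promise<string> {
  const response = await fetch(ASSESSMENT_ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "text/xml; charset=utf-8" },
    body: buildSoapEnvelope(objectId, action),
  });
  return response.text();
}

// Example call for the "collect bag" event used in the proof-of-concept.
sendEvent("bag", "collect").then((xml) => console.log(xml));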
Figure 2. An example of an assessment rule to teach students not to
collect their possessions during fire evacuation.
Figure 3. Part of the log file related to the event of a player collecting their bag during fire evacuation.
C. Proof-of-Concept
The proposed approach has been developed in the
context of ALICE1 project to provide integrated assessment
forms for serious games [23].The assessment approach aims
at providing feedback, thus to support students through a
guided learning approach to learn how to evacuate a school
building in case of fire threat. The game – developed at the
Serious Games Institute (SGI) at Coventry University -
adopts a freely navigable 3D environment, created within the
Unity Engine 2. The game contains elements of crowd
simulation within fire evacuation scenarios, effectively
placing the player within the building and monitoring their
actions as they evacuate. Hence, provide effective feedback
and assessment. It is essential that the game monitors and
correctly identifies key actions which may indicate correct
and incorrect behaviors.
Based on the framework discussed earlier, the principal means through which this is achieved is the implementation of virtual 'checkpoints' within each scenario, recording the player's time and state as they pass within a radius of a single point in the virtual space. In addition, the game designer annotates the pedagogical objects and shares these annotations as an XML file with the LMS. The annotations are used by the assessment designer to author assessment rules, so that feedback can be provided to the player once an event regarding one of these pedagogical objects is sent to the Assessment Engine.
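The checkpoint mechanism can be approximated by a simple radius test in the 3D scene, as sketched below. Positions, the radius value, and the recorded player state are illustrative assumptions rather than the actual Unity implementation used in the game.

// Illustrative checkpoint test; not the actual Unity/SGI implementation.
interface Vec3 { x: number; y: number; z: number; }

interface Checkpoint {
  id: string;
  position: Vec3; // single point in the virtual space
  radius: number; // players passing within this radius are recorded
}

interface CheckpointRecord {
  checkpointId: string;
  time: number;        // game time in seconds
  playerState: string; // e.g. "walking", "crawling"
}

function distance(a: Vec3, b: Vec3): number {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}

// Records the player's time and state when they pass inside the checkpoint radius.
function checkCheckpoint(
  cp: Checkpoint,
  playerPos: Vec3,
  time: number,
  playerState: string,
  records: CheckpointRecord[]
): void {
  if (distance(cp.position, playerPos) <= cp.radius) {
    records.push({ checkpointId: cp.id, time, playerState });
  }
}

// Example: a checkpoint at the classroom door with a 2-unit radius.
const records: CheckpointRecord[] = [];
const door: Checkpoint = { id: "classroom-door", position: { x: 0, y: 0, z: 0 }, radius: 2 };
checkCheckpoint(door, { x: 1, y: 0, z: 1 }, 12.5, "walking", records);
console.log(records);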
Figure 4. The player's action of collecting his possessions triggers an assessment rule whose consequence is to provide feedback through a virtual character.
The game engine tracks the player's behaviour and environment changes and saves them to a log file. More precisely, the player's actions on tagged pedagogical objects fire events that save records to the log file. A JavaScript function was developed to send such an event through the interface to the assessment engine. For instance, a scenario teaching students that they should not collect their possessions during fire evacuation was tested using this approach. The instructor uses the Assessment Editor to design an assessment rule for the game object 'bag', with the feedback message "You took your bag during fire evacuation! You should not collect your possessions before evacuating" added to the consequence section of the rule (see Fig. 2). During play, if the player collects his bag before leaving the classroom, the action fires an event and saves it to the log file (see Fig. 3). The JavaScript function in the Web-based game player calls the interface with that event. The Assessment Engine evaluates the event against the assessment rule defined by the instructor and replies with the feedback message. This assessment scenario can thus guide the player not to collect their possessions during fire evacuation using dynamic feedback provision (see Fig. 4).
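Putting the pieces together, the bag scenario roughly corresponds to the evaluation step sketched below. The rule representation and function names are assumptions that mirror, but do not reproduce, the XML rule shown in Fig. 2.

// Sketch of the dynamic evaluation step for the "bag" scenario; names and
// structures are illustrative assumptions, not the project's actual code.
interface GameEvent { objectId: string; action: string; }

interface Rule {
  objectId: string;
  action: string;
  feedback: string; // consequence: message shown via a virtual character
}

// Rule as the instructor might author it in the Assessment Editor.
const bagRule: Rule = {
  objectId: "bag",
  action: "collect",
  feedback:
    "You took your bag during fire evacuation! " +
    "You should not collect your possessions before evacuating.",
};

// The assessment engine matches the incoming event against the defined rules
// and replies with the associated feedback message, if any.
function evaluate(event: GameEvent, rules: Rule[]): string | null {
  const match = rules.find(
    (r) => r.objectId === event.objectId && r.action === event.action
  );
  return match ? match.feedback : null;
}

// The game fires this event when the player collects the bag before leaving.
const feedback = evaluate({ objectId: "bag", action: "collect" }, [bagRule]);
console.log(feedback); // delivered to the player by the virtual character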
Similarly, another scenario was used to teach students not to use the elevator during fire evacuation. Furthermore, more complex scenarios based on context, time, and environment state are used to evaluate the player's behaviour. For instance, crawling in areas of smoke was evaluated: in the assessment model, rules based on environment state and player actions can be defined. Hence, when the player does not crawl in sections where breathing is difficult - i.e. tagged by the game designer as areas with smoke - feedback is provided based on the assessment rule to stimulate the desired behaviour of crawling.
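A rule that combines environment state and player behaviour, as in the smoke scenario, might be sketched as follows. The predicate form, field names, and the wording of the feedback message are assumptions made for illustration.

// Illustrative sketch of a rule combining environment state and behaviour.
interface EnvironmentState {
  areaTag: string; // e.g. "smoke", as tagged by the game designer
}

interface PlayerState {
  posture: "walking" | "crawling";
}

interface StateRule {
  // The rule fires when the condition holds, triggering its feedback.
  condition: (env: EnvironmentState, player: PlayerState) => boolean;
  feedback: string;
}

const crawlRule: StateRule = {
  condition: (env, player) =>
    env.areaTag === "smoke" && player.posture !== "crawling",
  feedback: "There is smoke here. Crawl to stay below it while evacuating.",
};

// Evaluated periodically while the player moves through tagged areas.
function checkStateRules(
  env: EnvironmentState,
  player: PlayerState,
  rules: StateRule[]
): string[] {
  return rules.filter((r) => r.condition(env, player)).map((r) => r.feedback);
}

console.log(checkStateRules({ areaTag: "smoke" }, { posture: "walking" }, [crawlRule]));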
IV. RELATED WORK
One example is the extension of the evidence-centered design (ECD) assessment model [24] with an action model instead of a task model, used together with Bayesian networks to track player actions within the game and to provide evidence of progress [19]. The aim of this research is to use a so-called stealth assessment approach within immersive games to track players' actions and, with respect to the ECD model, to provide formative and dynamic feedback that supports students' learning. However, what is discussed represents a summative approach, in which the player's progress in terms of interactions is compared with the evidence model - part of ECD - to provide evidence of learning and skill achievement.
Another example is the so-called micro-adaptivity approach for assessment in educational games [21] [25]. The approach has been developed in the context of the Enhanced Learning Experience and Knowledge TRAnsfer (ELEKTRA)³ project. The ELEKTRA framework uses Competence-based Knowledge Space Theory (CbKST) to model the competencies required by the student to achieve a learning goal. The basic idea of CbKST is to associate problems in a domain with skills in order to provide a model of competencies for that domain, which can then be used to update the knowledge and skill state of the learner. The ELEKTRA game and its successor, the 80Days⁴ game, track the player's interactions and use them to update the competence state represented in the CbKST for the learning domains provided in the games. However, according to [25], the approach imposes an extra authoring load to define all the required information for the models, as well as a computational load, as the game updates the CbKST after each player action.
³ http://www.elektraproject.org
⁴ http://www.eightydays.eu
Another example is the adventure game engine called <e-Adventure> [20]. Its authors claim that <e-Adventure> is flexible enough to be used with educational modeling languages - i.e. IMS Learning Design (IMS LD) - to design the pedagogical impact of assessment rules in the game engine, to evaluate the progress of players, and hence to provide personalized and adaptive digital educational games.
V. CONCLUSIONS AND FUTURE WORK
Video games have a prominent impact on raising motivation and engagement. However, using video games for education demands a compromise between entertainment and instruction; moreover, it requires more attention to game design when games are used for education. It is intuitive that serious games have a positive impact on students' motivation and engagement [7] [8] [12]. However, designing games for education demands further alignment with instruction and learning; among other things, this includes stimulating pedagogically desired behaviors within the game. Quality assessment forms play a major role in raising motivation and engagement [26] [27]. Therefore, this paper proposes an assessment and feedback approach to guide learning in serious games.
The proposed approach builds on a pedagogically grounded framework to design assessment and feedback apart from the game engine and to use them to stimulate the desired learning. Through an externalization of the
assessment process, the framework supports interaction
between educators and serious game designers, allowing
pedagogic content to be identified at the development stage
and then manipulated dynamically in response to learner
actions and interactions within an overarching set of
pedagogic goals defined by the educator or trainer. The
method supports integration with automated assessment
technologies, allowing such tools to recognize and respond
immediately to learner actions by modifying the game
environment or triggering feedback.
The decoupling of the serious game - as a complex learning resource - from the assessment engine - utilized via web services - fosters the accommodation of various learning contexts and pedagogical approaches. A proof-of-concept is illustrated in Section III, where the player receives feedback from a virtual character within the game, guiding the player to learn by taking actions - correct or incorrect. The approach advocates an intuitive way of learning by allowing the learner to explore the game environment and eventually receive constructive feedback.
Looking ahead, the main concern for future work is to experiment with this approach in different 3D virtual worlds in order to evaluate the flexibility and reliability of the architecture and to scale the features provided for assessment and feedback. Further work will also enhance the assessment model to optimize the evaluation of player behaviour, so as to provide not only interpretive feedback (players get a score and the wrong action), but also other forms of feedback such as supportive (players get a score and guidance information), probing (players get a score and an analysis of why the action was wrong), and understanding (players get a score, an analysis of why the action was wrong, and guidance towards supportive steps or learning material) [17].
ACKNOWLEDGMENT
This work has been supported by the European
Commission under the Collaborative Project ALICE
”Adaptive Learning via Intuitive/Interactive, Collaborative
and Emotional Systems”, VII Framework Programme,
Theme ICT-2009.4.2 (Technology-Enhanced Learning),
Grant Agreement n. 257639.
REFERENCES
[1] C. Gütl & V. Chang, "Ecosystem-based Theoretical Models for Learning in Environments of the 21st Century", International Journal of Emerging Technologies in Learning (iJET), submitted Nov. 2008, December 2008.
[2] M. AL-Smadi, C. Guetl, & V. Chang, "Addressing e-Assessment practices in e-Learning Activities: A Review", Proceedings of Global Learn Asia Pacific 2011 - Global Conference on Learning and Technology, Melbourne, Australia, March 28-April 1, 2011.
[3] M. Prensky, "Digital Natives, Digital Immigrants", On the Horizon, 9(5), NCB University Press, October 2001.
[4] W. C. Kriz, “Creating Effective Learning Environments and Learning
Organizations through Gaming and Simulation Design”, Simulation
and Gaming, Vol. 34, Iss. 4, pp 495-511, 2003.
[5] C. Gütl, “The Support of Virtual 3D Worlds for enhancing
Collaboration in Learning Settings”, in Francesca Pozzi and
Donatella Persico (Eds.) ,Techniques for Fostering Collaboration in
Online Learning Communities: Theoretical and Practical
Perspectives, IGI Global, 2011, 278-299.
[6] Screen Digest. “There’s life beyond World of Warcraft. Subscription
MMOGs: Life beyond World of Warcraft”, Retrieved on 8
December, 2009 from
http://www.screendigest.com/press/releases/pdf/PR-
LifeBeyondWorldOfWarcraft-240309.pdf
[7] M. Prensky, Digital-game based learning. McGraw-Hill, 2001.
[8] J. P. Gee, What video games have to teach us about learning and literacy. Palgrave Macmillan, 2003.
[9] S. De Freitas, “Learning in immersive worlds: A review of game-
based learning”. JISC e-Learning Programme, 2006.
[10] S. De Freitas, & M. Oliver, “A four dimensional framework for the
evaluation and assessment of educational games”. Paper accepted for
Computer Assisted Learning Conference 2005.
[11] R. Van Eck, “Digital game based learning: It’s not just the digital
native who are restless”, Educause Review, 41, 16–30, 2006.
[12] E. Klopfer, S. Osterweil, & K. Salen, Moving learning games forward: obstacles, opportunities and openness, The Education Arcade. Massachusetts Institute of Technology, 2009.
[13] J. McGonigal, Reality is broken: Why games make us better and how they can change the world. Penguin Books, 2011.
[14] V. J. Shute, "Focus on formative feedback", Review of Educational Research, 78(1), 153-189, 2008.
[15] E. H. Mory, "Feedback research revisited", in Handbook of Research on Educational Communications and Technology, 2nd Edition, D. H. Jonassen (Ed.), Lawrence Erlbaum Associates, pp. 745-783, 2004.
[16] I. Dunwell, S. Jarvis, & S. de Freitas, "Four-dimensional consideration of feedback in serious games", in Digital Games and Learning, P. Maharg & S. de Freitas (Eds.), Continuum Publishing, 2010.
[17] C. Rogers, Client-centered Therapy: Its Current Practice, Implications
and Theory. London: Constable, 1951
[18] P. Moreno-Ger, D. Burgos, J. L. Sierra, & B. Fernández-Manjón,
“Educational game design for online education”, Computers in
Human Behavior, 24, 2530-2540, 2008.
[19] V. J. Shute, M. Ventura, M. I. Bauer, & D. Zapata-Rivera, "Melding the power of serious games and embedded assessment to monitor and foster learning: Flow and grow", in U. Ritterfeld, M. Cody, & P. Vorderer (Eds.), Serious games: Mechanisms and effects (pp. 295-321). Mahwah, NJ: Routledge, Taylor and Francis, 2009.
[20] D. Burgos, P. Moreno-Ger, J. L. Sierra, B. Fernández-Manjón, M. Specht, & R. Koper, "Building adaptive game-based learning resources: The marriage of IMS learning design and <e-Adventure>", Simulation & Gaming, 39, 414-431, 2008.
[21] M. D. Kickmeier-Rust, C. M. Steiner, & D. Albert, "Non-invasive Assessment and Adaptive Interventions in Learning Games", International Conference on Intelligent Networking and Collaborative Systems (INCoS 09), pp. 301-305, 2009.
[22] M. AL-Smadi, and C. Guetl, Service-oriented flexible and
interoperable assessment: towards a standardised e-assessment
system, Int. J. Continuing Engineering Education and Life-Long
Learning, Vol. 21, No. 4, pp.289–307, 2011.
[23] M. Al-Smadi, C. Guetl, I. Dunwell, and S. Caballe., "D5.2.2:
Enriched Learning Experience V2," ALICE (Adaptive Learning Via
Intuitive/Interactive, Collaborative And Emotional Systems) project
co-funded by the European Commission within the 7th Framework
Programme (2007-2013), n. 257639 (2010). 2012.
[24] R. Almond, L. Steinberg, & R. Mislevy, "Enhancing the design and delivery of assessment systems: A four process architecture", The Journal of Technology, Learning, and Assessment, 1(5), 2002.
[25] M. Kickmeier-Rust & D. Albert, "Micro-adaptivity: protecting immersion in didactically adaptive digital educational games", Journal of Computer Assisted Learning, 26, 95-105, 2010.
[26] R. J. Stiggins, Student-involved assessment for learning (4th ed).
Upper Saddle River, NJ: Prentice-Hall, 2001.
[27] W. Harlen, “The role of assessment in developing motivation for
learning”, In J. Gardner (Ed.). Assessment and Learning (pp. 61-80).
London: Sage Publications, 2006.