Chapter 1
Game-Based Assessment: The Past Ten Years and Moving Forward

Yoon Jeon Kim and Dirk Ifenthaler

The implementation of assessment features into game-based learning environments is still in its early stages because it adds a very time-consuming step to the design process. The impact on learning and questions about the reliability and validity of technology-based assessment systems remain under debate. To answer the question of what people are learning from playing games, researchers have been using a variety of methods including external measures, log data capturing in-game actions, and game-related actions beyond the game context. This chapter seeks to identify why research on game-based assessment is still in its infancy, what advances have been achieved over the past 10 years, and which challenges lie ahead for advancing assessment in game-based learning.

© Springer Nature Switzerland AG 2019
D. Ifenthaler, Y. J. Kim (eds.), Game-Based Assessment Revisited, Advances in Game-Based Learning
1.1 Introduction
Educational assessment practice is challenging as there are a number of diverse concepts referring to the idea of assessment. Newton (2007) laments that the distinction between formative and summative assessment hindered the development of sound assessment practices on a broader level. Black (1998) defines three main types of assessment: (a) formative assessment to aid learning; (b) summative assessment for review, for transfer, and for certification; and (c) summative assessment for accountability to the public. Pellegrino, Chudowsky, and Glaser (2001) extend these definitions with three main purposes of assessment: (a) assessment to assist learning (formative assessment), (b) assessment of individual student achievement (summative assessment), and (c) assessment to evaluate programs (evaluative assessment). A common thread among the many definitions points to the concept of feedback for a variety of purposes, audiences, and methods of assessment (Ifenthaler, Greiff, & Gibson, 2018).
Digital game-based technologies are nudging the field of education to redefine what is meant by learning, instruction, and assessment. Proponents of game-based learning argue that students should be prepared to meet the demands of the twenty-first century by teaching them to be innovative, creative, and adaptable so that they can deal with the demands of learning in domains that are complex and ill-structured (Federation of American Scientists, 2005; Gee, 2003; Ifenthaler, Eseryel, & Ge, 2012; Prensky, 2001; Shaffer, 2006). On the other hand, opponents of games argue that games are just another technological fad, which emphasize superficial learning. In addition, opponents argue that games cause increased violence, aggression, inactivity, and obesity while decreasing prosocial behaviors (Walsh, 2002).

Y. J. Kim
Teaching Systems Lab, Massachusetts Institute of Technology, Cambridge, MA, USA

D. Ifenthaler (*)
Learning, Design and Technology, University of Mannheim, Mannheim, Germany
Curtin University, Perth, Australia
However, Ifenthaler etal. (2012) argue that the implementation of assessment
features into game-based learning environments is only in its early stages because it
adds a very time-consuming step to the design process. Also, the impact on learning
and questions toward reliability and validity of technology-based assessment sys-
tems are still being questioned. Three distinguishing features of game-based assess-
ment have been proposed and are widely accepted: (1) game scoring, (2) external,
and (3) embedded assessment of game-based learning (Ifenthaler etal., 2012). Only
recently, an additional feature has been introduced which enables adaptive game-
play and game environments, broadly dened as learning analytics (Ifenthaler,
2015) and specically denoted as serious games analytics (Loh, Sheng, & Ifenthaler,
2015). Serious games analytics converts learner-generated information into action-
able insights for real-time processing. Metrics for serious games analytics are simi-
lar to those of learning analytics including the learners’ individual characteristics
(e.g., socio-demographic information, interests, prior knowledge, skills, and com-
petencies) and learner-generated game data (e.g., time spent, obstacles managed,
goals or tasks completed, navigation patterns, social interaction, etc.) (Ge &
Ifenthaler, 2017; Ifenthaler, 2015; Loh, Sheng, & Ifenthaler, 2015).
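As a purely illustrative sketch of what such metrics might look like in practice, the following snippet combines the two metric families named above, learner characteristics and learner-generated game data, into one per-player record that an analytics pipeline could act on. All field names, event names, and values are invented for illustration and are not taken from any of the cited systems.

```python
# Illustrative sketch: aggregate a raw gameplay log into serious-games-
# analytics-style metrics and merge them with a (hypothetical) learner profile.
from collections import Counter

profile = {"player": "p01", "prior_knowledge": "low", "interests": ["physics"]}

events = [  # raw gameplay log: (action, seconds spent on that action)
    ("obstacle_cleared", 42), ("task_completed", 95),
    ("obstacle_cleared", 18), ("chat_message", 4),
]

def game_metrics(log):
    """Reduce an event log to the metric categories named in the text."""
    counts = Counter(action for action, _ in log)
    return {"time_spent_sec": sum(t for _, t in log),
            "obstacles_managed": counts["obstacle_cleared"],
            "tasks_completed": counts["task_completed"],
            "social_interactions": counts["chat_message"]}

# One actionable record per player: characteristics + behavioral metrics.
record = {**profile, **game_metrics(events)}
```

A real pipeline would of course compute such records continuously from a streaming log rather than from a fixed list; the point here is only the pairing of individual characteristics with behavioral aggregates.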
This chapter seeks to identify why research on game-based assessment is still in its infancy, what advances have been achieved over the past 10 years, and which challenges lie ahead for advancing assessment in game-based learning.
1.2 Game-Based Assessment and Assessment of Learning in Games: Why?
Games—both digital and nondigital—have become an important aspect of young people's lives. According to a survey conducted in the United States, 72% of youth ages 13–17 play games daily or weekly (Lenhart, 2015). Gaming is also one of the most popular social activities, especially for boys, 55% of whom play games in person or online with friends daily or weekly. As gaming gained popularity in people's daily lives in the early 2000s, educational researchers began to investigate the potential educational benefits of games for learning and what we can learn from well-designed games about learning and assessment (Gee, 2003).
So what are the affordances of games for learning? First, people learn in action in games (Gee, 2008). That is, people interact with all aspects of the game and take intentional actions within it. For its part, the game continuously responds to each action, and through this process, the player gradually creates meaning. Clearly, how people are believed to learn within video games contrasts with how people typically learn at school, which often entails memorization of decontextualized and abstract concepts and procedures (Shute, Ventura, Bauer, & Zapata-Rivera, 2009). Second, due to its interactive nature, learning by playing games can lead to conceptual understanding and problem-solving (Eseryel, Ge, Ifenthaler, & Law, 2011) in addition to domain-specific skills and practices (Bressler & Bodzin, 2016) that go beyond the basic content knowledge more commonly taught in the classroom. Steinkuehler and Duncan (2008) have found players in virtual worlds frequently engaging in social knowledge construction, systems-based reasoning, and other scientific habits of mind. This body of work shows that games in general have a lot of potential for contributing to a deep learning environment. In video games, players engage in active and critical thinking, they take on different identities, and they have opportunities to practice skills and find intrinsic rewards as they work on increasingly difficult challenges on their path to mastery (Eseryel, Law, Ifenthaler, Ge, & Miller, 2014; Gee, 2003).
Numerous studies have reported the benefits of games as a vehicle to support student learning. In a meta-analysis, Clark, Tanner-Smith, and Killingsworth (2016) reported that, compared to nongame conditions, digital games had a moderate to strong effect on overall learning outcomes, including cognitive and interpersonal skills. Similarly, a literature review by Boyle et al. (2016) reports that games are beneficial for various learning outcomes such as knowledge acquisition, affect, behavior change, perception, and cognition. Numerous studies have also reported academic domain-specific benefits of games for learning, including in science and mathematics (Divjak & Tomić, 2011). To answer the question of what people are learning from playing games, researchers have been using a variety of methods including external measures, log data capturing in-game actions, and game-related actions beyond the game context (Ifenthaler et al., 2012; Loh et al., 2015).
1.3 Game-Based Assessment: Past 10 Years
Several meta-analyses and systematic reviews have been published focusing on game-based learning. For example, Baptista and Oliveira (2019) highlight important variables in their literature search of more than 50 studies focusing on serious games, including intention, attitude, enjoyment, and usefulness. A systematic review by Alonso-Fernández, Calvo-Morata, Freire, Martínez-Ortiz, and Fernández-Manjón (2019) focuses on the application of data science techniques to game learning data and suggests specific game learning analytics. Ke (2016) presents a systematic review on the integration of domain-specific learning in game mechanics and game world design. Another systematic review by Ravyse, Seugnet Blignaut, Leendertz, and Woolner (2017) identifies five central themes of serious games: backstory and production, realism, artificial intelligence and adaptivity, interaction, and feedback and debriefing. However, none of the abovementioned meta-analyses and systematic reviews has a clear focus on assessment of game-based learning.
Still, a line of research that emerged over the past 10 years relates to the question of how we can use games as interactive and rich technology-enhanced environments to advance assessment technologies. That is, the primary goal of this line of work is to advance assessment using games (Ifenthaler et al., 2012). Earlier game-based assessment work primarily focused on applying the evidence-centered design framework to develop assessment models with specific learning outcomes and skills in mind (Behrens, Mislevy, Dicerbo, & Levy, 2012). For example, Shute et al. (2009) describe an approach called stealth assessment, in which in-game behavioral indicators (e.g., specific actions taken within a quest in Oblivion) are identified from logged data and used to make inferences about the player's underlying skills (e.g., creative problem-solving) without interrupting the flow of gameplay. Using this approach, one can use existing games to measure latent constructs, even if the game was not explicitly developed for the purpose of learning or assessment, as long as the game provides ample contexts (or situations) that elicit evidence for the underlying skills and constructs (Loh et al., 2015). Similarly, using the popular game SimCity, GlassLab developed SimCityEDU to assess students' systems thinking (Dicerbo et al., 2015). These approaches have primarily used the evidence-centered design framework (Almond, Steinberg, & Mislevy, 2002) to align what people might learn from the game with what they do in games.
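The inferential core of stealth assessment can be illustrated with a minimal sketch. The snippet below is not Shute et al.'s implementation, which rests on full Bayesian networks built through evidence-centered design; it shows only the underlying idea of updating a belief about a latent skill from logged in-game indicators. Indicator names and likelihood values are hypothetical.

```python
# Minimal stealth-assessment-style sketch: a two-state Bayesian update of
# P(skill = high) from observed in-game indicators. All names and numbers
# below are invented for illustration.

def update_skill(prior, evidence, likelihoods):
    """Bayes update after one indicator.

    likelihoods[indicator] = (P(indicator | high skill),
                              P(indicator | low skill))
    """
    p_high, p_low = likelihoods[evidence]
    numerator = p_high * prior
    return numerator / (numerator + p_low * (1.0 - prior))

LIKELIHOODS = {
    "solved_quest_novel_route": (0.70, 0.20),  # unusual solution path
    "reused_standard_route":    (0.40, 0.60),  # conventional solution path
}

belief = 0.5  # uninformative prior on "creative problem-solving"
for event in ["solved_quest_novel_route", "solved_quest_novel_route",
              "reused_standard_route"]:
    belief = update_skill(belief, event, LIKELIHOODS)
```

Because the update runs on each logged event, the estimate evolves silently during play, which is precisely what allows assessment without breaking gameplay.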
Eseryel, Ifenthaler, and Ge (2011) provide an integrated framework for assessing complex problem-solving in digital game-based learning in the context of a longitudinal design-based research study. In a longitudinal field study, they examined the impact of the massively multiplayer online game (MMOG) Surviving in Space on students' complex problem-solving skill acquisition, mathematics achievement, and motivation. Two different methodologies were applied to assess students' learning progress in complex problem-solving. The first utilized adapted protocol analysis (Ericsson & Simon, 1980, 1993) to analyze students' responses to the given problem scenario within the framework of the think-aloud methodology. The second utilized the HIMATT methodology (Eseryel, Ifenthaler, & Ge, 2013; Pirnay-Dummer, Ifenthaler, & Spector, 2010) to analyze students' annotated causal representations of the phenomena in question. The automated text-based analysis function of HIMATT directly tracks the association of concepts in texts of 350 or more words, hence producing an adaptive assessment and feedback environment for game-based learning. For future game design, the algorithms produce quantitative measures and graphical representations which could be used for instant feedback within the game or for further analysis (Ifenthaler, 2014).
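The idea of tracking concept associations in text can be sketched minimally as follows. This is not the HIMATT or AKOVIA algorithm itself; it is an illustrative reduction in which terms from a hypothetical concept list are linked whenever they co-occur within a sentence, yielding the edges of a simple association graph of the kind such tools visualize.

```python
# Illustrative sketch of concept-association extraction from learner text.
# The concept vocabulary is hypothetical; matching is naive substring search,
# whereas real tools use far more robust linguistic processing.
from itertools import combinations

CONCEPTS = {"oxygen", "plants", "photosynthesis", "sunlight"}

def concept_pairs(text):
    """Return the set of concept pairs that co-occur within a sentence."""
    pairs = set()
    for sentence in text.lower().split("."):
        found = sorted(c for c in CONCEPTS if c in sentence)
        pairs.update(combinations(found, 2))
    return pairs

text = ("Plants use sunlight for photosynthesis. "
        "Photosynthesis releases oxygen.")
pairs = concept_pairs(text)
```

Comparing a learner's pair set against an expert's pair set then gives a crude similarity measure, which hints at how text-based analysis can feed automated feedback.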
More recently, researchers have introduced learning analytics and data mining techniques to broaden what game-based assessment means (Loh et al., 2015). For example, Rowe et al. (2017) built machine-learned "detectors" that use log data from the game to measure implicit understanding of physics, different strategies associated with productivity in the game, and computational thinking. While they did not use formal measurement models (e.g., IRT or Bayes nets), these detectors are implemented in the game engine to make real-time inferences about players. Similarly, Shadowspect, developed at the MIT Playful Journey Lab (Kim & Rosenheck, 2018), is another example of game-based assessment that utilizes new advancements in learning analytics and educational data mining techniques in the process of game design and development for the purpose of assessment.
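A detector of this kind can be pictured as a lightweight classifier over features computed from the log stream. The sketch below is an illustration, not the detectors of Rowe et al.; the feature names and weights are hypothetical stand-ins for parameters that would in practice be machine-learned from labeled gameplay data.

```python
# Illustrative "detector" sketch: a hand-specified logistic model over
# log-data features that flags, in real time, whether a player's recent
# actions suggest a productive strategy. Names and weights are hypothetical.
import math

WEIGHTS = {"tool_switches_per_min": -0.8,   # thrashing between tools
           "time_on_obstacle_sec": -0.02,   # being stuck
           "goal_progress_rate": 2.5}       # progress toward the goal
BIAS = 0.1

def detect_productive(features, threshold=0.5):
    """Return True if the logistic score over the features crosses threshold."""
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z)) >= threshold
```

Because the model is just a weighted sum, it is cheap enough to run inside the game engine on every window of events, which is what makes real-time inference about players feasible.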
Hence, the application of serious games analytics opens up opportunities for the assessment of engagement within game-based learning environments (Eseryel et al., 2014). The availability of real-time information about learners' actions and behaviors stemming from key decision points or game-specific events provides insights into the extent of learners' engagement during gameplay. The analysis of single actions or behaviors and the investigation of more complex series of actions and behaviors can elicit patterns of engagement and therefore provide key insights into learning processes (Ge & Ifenthaler, 2017).
Ifenthaler and Gibson (2019) report how highly detailed data traces captured by the Challenge platform, with many events per learning activity, when combined with new input devices and approaches, bring the potential for measuring indicators of physical, emotional, and cognitive states of the learner. The data innovation of the platform is its ability to capture event-based records of the higher-frequency and higher-dimensional aspects of learning engagement, which is in turn useful for analysis of the effectiveness and impact on the physical, emotional, and cognitive layers of learning caused or influenced by the engagements. This forms a high-resolution analytics base on which research into digital learning and teaching, as well as into how to achieve better outcomes in scalable digital learning experiences, can be conducted (Gibson & Jackl, 2015).
1.4 Challenges andFuture Work
While interest in game-based assessment peaked in 2009, when GlassLab was launched to scale up this approach across the broader education system, many promises of game-based learning and assessment have not been fully realized in actual education systems. Based on reflection on the field's achievements in the past 10 years and on the contributions to the current volume, challenges remain that the field of game-based assessment still faces, as well as future work that researchers, game designers, and educators should address to transform how games are used in the education system.
While evidence-centered design (ECD) has been the most predominant framework for designing assessment in games, it is often unclear how different development processes leverage ECD to conceptualize game design around the competency of interest (Ke, Shute, Clark, & Erlebacher, 2019). For example, how can assessment models be formalized? How can formalized assessment models be translated into game design elements? When in the game design process does this translation occur most effectively? How can competency models be transformed into interesting, engaging game mechanics? How can psychometric qualities be ensured without being too prescriptive?
Many established game-based assessment approaches focus on understanding the continuous progression of learning, thinking, reasoning, argumentation, and complex problem-solving during digital game-based learning. From a design perspective, it seems important that the game mechanisms address the underlying affective, behavioral, and cognitive dispositions, which must therefore be considered carefully at various stages of the learning process, that is, while conceptualizing and designing games for learning (Bertling, Jackson, Oranje, & Owen, 2015; Eseryel et al., 2014; Ge & Ifenthaler, 2017).
Advanced data analytics methodologies and technological developments enable researchers, game designers, and educators to easily embed assessment and analysis techniques into game-based learning environments (Loh et al., 2015). Internal assessment and instant analysis, including personalized feedback, can be implemented in a new generation of educational games. However, it is up to educational research to provide theoretical foundations and empirical evidence on how these methodologies should be designed and implemented. We have just arrived in the age of educational data analytics. Hence, it is up to researchers, technologists, educators, and philosophers to make sense of these powerful technologies and thus better help learners to learn.
Among the challenges brought on by game-based assessments including data analytics is that the large amount of data now available to teachers is far too complex for conventional database software to store, manage, and process. Accordingly, analytics-driven game-based assessments underscore the need to develop assessment literacy in stakeholders of assessment (Ifenthaler et al., 2018; Stiggins, 1995). Game designers and educators applying data-driven game-based assessments require practical hands-on experience with the fundamental platforms and analysis tools for linked big game-based assessment data. Stakeholders need to be introduced to several data storage methods and how to distribute and process data, to possible ways of handling analytics algorithms on different platforms, and to visualization techniques for game-based assessment analytics (Gibson & Ifenthaler, 2017). Well-prepared stakeholders may demonstrate additional competencies such as understanding large-scale machine learning methods as foundations for human-computer interaction, artificial intelligence, and advanced network analysis (Ifenthaler et al., 2018).
The current research ndings also indicate that design research and development
are needed in automation and semi-automation (e.g., humans and machines work-
ing together) in assessment systems. Automation and semi-automation of assess-
ments to provide feedback, observations, classications, and scoring are increasingly
being used to serve both formative and summative purposes in game-based learning.
Gibson, Ifenthaler, and Orlic (2016) proposed an open assessment resources
approach that has the potential to increase trust in and use of open education
resources (OER) in game-based learning and assessment by adding clarity about
assessment purposes and targets in the open resources world. Open assessment
resources (OAR) with generalized formative feedback are aligned with a specic
educative purpose expressed by some user of a specic OER toward the utility and
expectations for using that OER to achieve an educational outcome. Hence, OAR
may be utilized by game designers to include valuable and competence-based
assessments in game-based learning.
The application of analytics-driven game-based assessments opens up opportunities for the assessment of engagement and other motivational (or, even broader, non-cognitive) constructs within game-based learning environments (Eseryel et al., 2014). The availability of real-time information about learners' actions and behaviors stemming from key decision points or game-specific events provides insights into the extent of learners' engagement during gameplay. The analysis of single actions or behaviors and the investigation of more complex series of actions and behaviors can elicit patterns of engagement and therefore provide key insights into ongoing learning processes within game-based learning environments.
To sum up, the complexity of designing adaptive assessment and feedback systems has been discussed widely over the past few years (e.g., Sadler, 2010; Shute, 2008). The current challenge is to make use of data—from learners, teachers, and game learning environments—for assessments. Hence, more research is needed to unveil diverse methods and processes related to how design teams, often including learning scientists, subject-matter experts, and game designers, can seamlessly integrate design thinking and the formalization of assessment models into meaningful assessment for game-based learning environments.
References
Almond, R. G., Steinberg, L. S., & Mislevy, R. J. (2002). Enhancing the design and delivery
of assessment systems: A four process architecture. Journal of Technology, Learning, and
Assessment, 1(5), 3–63.
Alonso-Fernández, C., Calvo-Morata, A., Freire, M., Martínez-Ortiz, I., & Fernández-Manjón, B.
(2019). Applications of data science to game learning analytics data: A systematic literature
review. Computers & Education, 141, 103612.
Baptista, G., & Oliveira, T. (2019). Gamification and serious games: A literature meta-analysis and integrative model. Computers in Human Behavior, 92, 306–315.
Behrens, J., Mislevy, R., Dicerbo, K., & Levy, R. (2012). Evidence centered design for learning and assessment in the digital world. In M. Mayrath, J. Clarke-Midura, D. Robinson, & G. Schraw (Eds.), Technology-based assessments for 21st century skills (pp. 13–54). Charlotte, NC: Information Age Publishers.
Bertling, M., Jackson, G. T., Oranje, A., & Owen, V. E. (2015). Measuring argumentation skills with game-based assessments: Evidence for incremental validity and learning. In C. Conati, N. Heffernan, A. Mitrovic, & M. Verdejo (Eds.), Artificial intelligence in education. AIED 2015 (Vol. 9112, pp. 545–549). Cham, Switzerland: Springer.
Black, P.J. (1998). Testing: Friend or foe? The theory and practice of assessment and testing.
London, UK: Falmer Press.
Boyle, E.A., Hainey, T., Connolly, T.M., Grant, G., Earp, J., Ott, M., … Pereira, J.(2016). An
update to the systematic literature review of empirical evidence of the impacts and outcomes
of computer games and serious games. Computers & Education, 94, 178–192. https://doi.
Bressler, D.M., & Bodzin, A.M. (2016). A mixed methods assessment of students’ ow experi-
ences during a mobile augmented reality science game. Journal of Computer Assisted Learning,
29, 505–517.
Clark, D.B., Tanner-Smith, E.E., & Killingsworth, S.S. (2016). Digital games, design, and learn-
ing: A systematic review and meta-analysis. Review of Educational Research, 86(1), 79–122.
Dicerbo, K., Bertling, M., Stephenson, S., Jia, Y., Mislevy, R. J., Bauer, M., & Jackson, G. T. (2015). An application of exploratory data analysis in the development of game-based assessments. In C. S. Loh, Y. Sheng, & D. Ifenthaler (Eds.), Serious games analytics: Methodologies for performance measurement, assessment, and improvement (pp. 319–342). New York, NY: Springer.
Divjak, B., & Tomić, D. (2011). The impact of game-based learning on the achievement of learning goals and motivation for learning mathematics: Literature review. Journal of Information and Organizational Sciences, 35(1), 15–30.
Ericsson, K.A., & Simon, H.A. (1980). Verbal reports as data. Psychological Review, 87, 215–251.
Ericsson, K.A., & Simon, H. A. (1993). Protocol analysis: Verbal reports as data. Cambridge,
MA: MIT Press.
Eseryel, D., Ge, X., Ifenthaler, D., & Law, V. (2011). Dynamic modeling as cognitive regulation
scaffold for complex problem solving skill acquisition in an educational massively multiplayer
online game environment. Journal of Educational Computing Research, 45(3), 265–287.
Eseryel, D., Ifenthaler, D., & Ge, X. (2011). Alternative assessment strategies for complex problem solving in game-based learning environments. In D. Ifenthaler, P. Isaías, Kinshuk, D. G. Sampson, & J. M. Spector (Eds.), Multiple perspectives on problem solving and learning in the digital age (pp. 159–178). New York, NY: Springer.
Eseryel, D., Ifenthaler, D., & Ge, X. (2013). Validation study of a method for assessing com-
plex ill-structured problem solving by using causal representations. Educational Technology
Research and Development, 61(3), 443–463.
Eseryel, D., Law, V., Ifenthaler, D., Ge, X., & Miller, R.B. (2014). An investigation of the inter-
relationships between motivation, engagement, and complex problem solving in game-based
learning. Journal of Educational Technology & Society, 17(1), 42–53.
Federation of American Scientists. (2005). Summit of educational games: Harnessing the power of
video games for learning. Washington, DC: Author.
Ge, X., & Ifenthaler, D. (2017). Designing engaging educational games and assessing engagement
in game-based learning. In R.Zheng & M.K. Gardner (Eds.), Handbook of research on serious
games for educational applications (pp.255–272). Hershey, PA: IGI Global.
Gee, J.P. (2003). What video games have to teach us about learning and literacy. NewYork, NY:
Palgrave Macmillan.
Gee, J. P. (2008). Learning and games. The ecology of games: Connecting youth, games, and
learning. In K.Salen (Ed.), The John D. and Catherine T.MacArthur Foundation series on
digital media and learning (pp.21–40). Cambridge, MA: MIT.
Gibson, D.C., & Ifenthaler, D. (2017). Preparing the next generation of education researchers for
big data in higher education. In B.K. Daniel (Ed.), Big data and learning analytics: Current
theory and practice in higher education (pp.29–42). NewYork, NY: Springer.
Gibson, D.C., Ifenthaler, D., & Orlic, D. (2016). Open assessment resources for deeper learning.
In P. Blessinger & T.J. Bliss (Eds.), Open education: International perspectives in higher
education (pp.257–279). Cambridge, UK: Open Book Publishers.
Gibson, D.C., & Jackl, P. (2015). Theoretical considerations for game-based e-learning analyt-
ics. In T.Reiners & L. Wood (Eds.), Gamication in education and business (pp.403–416).
NewYork, NY: Springer.
Ifenthaler, D. (2014). AKOVIA: Automated knowledge visualization and assessment. Technology,
Knowledge and Learning, 19(1–2), 241–248.
Ifenthaler, D. (2015). Learning analytics. In J.M. Spector (Ed.), The SAGE encyclopedia of edu-
cational technology (Vol. 2, pp.447–451). Thousand Oaks, CA: Sage.
Ifenthaler, D., Eseryel, D., & Ge, X. (2012). Assessment for game-based learning. In D.Ifenthaler,
D.Eseryel, & X.Ge (Eds.), Assessment in game-based learning. Foundations, innovations, and
perspectives (pp.3–10). NewYork, NY: Springer.
Y. J. Kim and D. Ifenthaler
Ifenthaler, D., & Gibson, D. C. (2019). Opportunities of analytics in challenge-based learning. In A. Tlili & M. Chang (Eds.), Data analytics approaches in educational games and gamification systems. Cham, Switzerland: Springer.
Ifenthaler, D., Greiff, S., & Gibson, D.C. (2018). Making use of data for assessments: Harnessing
analytics and data science. In J. Voogt, G. Knezek, R. Christensen, & K.-W. Lai (Eds.),
International handbook of IT in primary and secondary education (2nd ed., pp. 649–663).
NewYork, NY: Springer.
Ke, F. (2016). Designing and integrating purposeful learning in game play: A systematic review.
Educational Technology Research and Development, 64(2), 219–244.
Ke, F., Shute, V.J., Clark, K.M., & Erlebacher, G. (2019). Interdisciplinary design of game-based
learning platforms. Cham, Switzerland: Springer.
Kim, Y.J., & Rosenheck, L. (2018). A playful assessment approach to research instrument devel-
opment. Paper presented at the Thirteenth International Conference of the Learning Sciences,
London, UK.
Lenhart, A. (2015). Teens, social media and technology overview 2015. Washington, DC: Pew Research Center.
Loh, C.S., Sheng, Y., & Ifenthaler, D. (2015). Serious games analytics: Theoretical framework.
In C.S. Loh, Y.Sheng, & D. Ifenthaler (Eds.), Serious games analytics. Methodologies for
performance measurement, assessment, and improvement (pp.3–29). NewYork, NY: Springer.
Newton, P.E. (2007). Clarifying the purposes of educational assessment. Assessment in Education:
Principles, Policy & Practice, 14(2), 149–170.
Pellegrino, J.W., Chudowsky, N., & Glaser, R. (Eds.). (2001). Knowing what students knwo: The
science and design of educational assessment. Washington, DC: National Academy Press.
Pirnay-Dummer, P., Ifenthaler, D., & Spector, J. M. (2010). Highly integrated model assessment technology and tools. Educational Technology Research and Development, 58(1), 3–18.
Prensky, M. (2001). Digital game-based learning. New York, NY: McGraw-Hill.
Ravyse, W.S., Seugnet Blignaut, A., Leendertz, V., & Woolner, A. (2017). Success factors for seri-
ous games to enhance learning: A systematic review. Virtual Reality, 21(1), 31–58.
Rowe, E., Asbell-Clarke, J., Baker, R. S., Eagle, M., Hicks, A., Barnes, T., … Edwards, T. (2017). Assessing implicit science learning in digital games. Computers in Human Behavior, 76, 617–.
Sadler, D. R. (2010). Beyond feedback: Developing student capability in complex appraisal. Assessment & Evaluation in Higher Education, 35(5), 535–550.
Assessment & Evaluation in Higher Education, 35(5), 535–550.
Shaffer, D. W. (2006). How computer games help children learn? New York, NY: Palgrave
Shute, V. J. (2008). Focus on formative feedback. Review of Educational Research, 78(1), 153–189.
Shute, V. J., Ventura, M.I., Bauer, M., & Zapata-Rivera, D. (2009). Melding the power of seri-
ous games and embedded assessment to monitor and foster learning: Flow and grow. In
U.Ritterfeld, M.Cody, & P.Vorderer (Eds.), Serious games: Mechanisms and effects (pp.295–
321). NewYork, NY: Routledge.
Steinkuehler, C., & Duncan, S. (2008). Scientific habits of mind in virtual worlds. Journal of Science Education and Technology, 17(6), 530–543.
Stiggins, R.J. (1995). Assessment literacy for the 21st century. Phi Delta Kappan, 77(3), 238–245.
Walsh, D. (2002). Video game violence and public policy. Retrieved from http://www.soc.iastate.
1 Game-Based Assessment: ThePast Ten Years andMoving Forward
... Second, because of the very nature of games as an interactive environment, they capture the full process of learning and solving problems, instead of capturing evidence at one time point unlike how assessment is typically done at the end of unit or lesson. Therefore, teachers should understand that game environments provide evidence based on the process, not just based on something that students do at the end of the gameplay (Kim & Ifenthaler, 2019). Third, teachers should understand that specific actions and choices in the game can be linked to non-cognitive skills and dispositions, different strategies, different problem-solving styles, how they collaborate with other players in the game and how they are progressing in the game. ...
... This chapter reports a work that is situated at the intersection of these two problems-the limited use of games for learning in classrooms and creating learning analytics and supporting tools to enhance practices on the ground. While multiple studies used learning analytics techniques in games, for example to examine how students are collaborating with each other , to function as game-based assessment purposes (Kim & Ifenthaler, 2019), or to model learning behaviors within the game (Kang et al., 2017), teachers' implementation of games coupled with learning analytics in classrooms are still somewhat limited. One of the barriers is the lack of actionable assessment data, the fact that teachers often do not have a clear sense of how students are interacting with the game, and if the gameplay is leading to productive learning (Martınez et al., 2020). ...
Full-text available
The use of learning analytics (LA) in educational technology has emerged as a key interest to researchers with the promise that this technology will help teachers and schools make data-informed decisions that were not feasible without big data and AI-driven algorithms. Despite its potential, LA has not yet effectively connected research and practice broadly. In the field, we have yet to understand how research-based advances in LA can become accessible assets for teachers, and often LA tools are generally not aligned with teachers’ needs. To see the real impact of LA in classrooms, the first step is to understand teacher literacy for using sophisticated technology-enhanced learning systems that use algorithms and analytics. In this chapter, we present a framework that enables a collaborative design and development process for learning analytics and data visualizations, specifically using games developed for learning and assessment purposes. Using a 3D puzzle game, Shadowspect, the team has been exploring a balanced design of data visualization that considers teachers’ needs and desires as well as their assessment literacy. In this chapter, we (1) define what it means to be assessment literate in the context of game-based learning and assessment, (2) present a process of creating data visualizations with teachers as co-designers, and (3) present several use cases. This chapter can contribute to establishing the foundations of how to design dashboard systems for learning games that can lead to broad use of game data in classrooms.
... Regarding this last point, the question of which type of VG use is more effective can be explained by the different ways of playing or using these VG. Traditionally, it has been assumed that VG favor learning due to specific characteristics linked to their own design (Kors et al., 2015; Mitgutsch & Alvarado, 2012; Smith & Just, 2017), such as the immediate feedback they provide to the player's actions; their playful character, which favors motivation by stimulating the dopaminergic reward system with its reinforcers (Aprea & Ifenthaler, 2021; Eseryel et al., 2014; Greenfield, 2014; Howard-Jones et al., 2011; Kim & Ifenthaler, 2019); the ease of embodying oneself in the characters, which favors cognitive or emotional empathy (Alhabash & Wise, 2012; Aprea & Ifenthaler, 2021; Bachen et al., 2016); or the enactive learning they promote, arising from the action itself without the need for reflection on the actions (Pozo, 2017). ...
... On the other hand, university students who spend more time playing VG value the epistemic thinking features as much as the pragmatic aspects, which may be linked to a certain knowledge of VG that allows them to appreciate potential learning aspects not detected by teachers whose own use is more casual. These results confirm the findings of studies aimed at analyzing the characteristics of VG as learning enhancers (Alhabash & Wise, 2012; Aprea & Ifenthaler, 2021; Bachen et al., 2016; Eseryel et al., 2014; Ge & Ifenthaler, 2017; Howard-Jones et al., 2011; Huizinga et al., 2017; Kim & Ifenthaler, 2019; Malinverni & Pares, 2014). When we examined which specific factors the students in our research identified as important in teaching with VG, teacher supervision stood out. ...
One of the factors associated with the educational use of video games is the conception that teachers and students have about their educative usefulness. However, there are no studies that identify which aspects are considered more effective for learning with video games and which kinds of learning are more accessible through them. This study aims to identify pre-service teachers' conceptions regarding video game use for learning and, specifically, which aspects and kinds of learning they consider most feasible. Likewise, we analyzed the effect of pedagogical training on these conceptions for three groups of university students: primary pre-service teachers (who received general pedagogical training), secondary pre-service teachers (who received pedagogical training in only one area of knowledge), and other university students without pedagogical training. We applied a questionnaire to a sample of 422 university students. This questionnaire had two dimensions that differentiated between the pragmatic and epistemic uses of video games for learning and three dimensions covering the verbal, procedural, and attitudinal learning that can be achieved with them. The results showed wide acceptance of video games as a learning resource among university students, but secondary pre-service teachers in particular reported higher possibilities of achieving learning with video games than primary pre-service teachers. On the other hand, university students reported more learning when video games were used in an epistemic way. In addition, they considered that video games favor verbal and procedural learning more than attitudinal learning. In conclusion, despite the students' positive conceptions about learning with video games, we observed a less positive pattern among pre-service teachers with general pedagogical training. These results suggest that video game incorporation in schools is not being carried out fruitfully by education faculties.
Therefore, we advocate 21st-century teacher training that optimizes new conceptions and uses of video games.
... One innovative way to meet the requirements of dynamic assessment in ITSs is to implement stealth assessment [3]. Stealth assessment refers to assessment that is seamlessly embedded in a computer-based or gaming environment such that learners are largely unaware of being assessed [4]. Unlike in traditional assessments, testing items are replaced with learning or gaming tasks and activities, making stealth assessment unobtrusive. ...
iSTART is a game-based intelligent tutoring system (ITS) designed to improve students’ reading skills by providing training on reading comprehension strategies. Game-based practice in iSTART follows two main approaches: generative practice and identification practice. Generative practice games ask students to author self-explanations using one or more of the instructed strategies. Identification practice games require students to recognize or select appropriate strategies based on their analysis of example texts. This study explored the feasibility of implementing stealth assessments in iSTART using only an identification game. Specifically, this study examined the extent to which participants’ performance and attitudes related to a simple vocabulary game could predict the outcomes of standardized reading assessments. MTurk participants (N = 211) played identification games in iSTART and then rated their subjective gameplay experience. Participants also completed measures of their vocabulary and reading comprehension skills. Results indicated that participants’ performance in a vocabulary practice game was predictive of literacy skills. In addition, the possibility that students’ attitude towards the game moderated the relation between game performance and literacy skills was ruled out. These findings argue for the feasibility of implementing stealth assessment in simple games to facilitate the adaptivity of ITSs.
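The kind of prediction reported above can be illustrated with a toy computation: a Pearson correlation between in-game performance and scores on an external literacy measure. The data and the helper function below are hypothetical, a minimal sketch rather than the study's actual analysis.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Hypothetical per-participant data: vocabulary-game score and reading-test score
game_scores = [12, 18, 9, 22, 15, 20, 7, 17]
reading_scores = [55, 70, 48, 82, 63, 75, 40, 68]

print(f"Pearson r = {pearson(game_scores, reading_scores):.2f}")
```

A fuller analysis along these lines would also test whether attitude toward the game moderates the relation, for example via an interaction term in a regression model, as the study describes.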
Virtual reality (VR) is a potential assessment format for constructs that depend on certain perceptual characteristics (e.g., a realistic environment and an immersive experience). The purpose of this series of studies was to explore methods of evaluating reliability and validity evidence for virtual reality assessments (VRAs) compared with traditional assessments. We intended to provide the basis of a framework for evaluating VR assessments, given that there are important fundamental differences between VR assessments and traditional assessment formats. Two commercial off-the-shelf (COTS) games (i.e., Project M and Richie's Plank Experience) were used in Studies 1 and 2, while a game-based assessment (GBA; Balloon Pop, designed for assessment) was used in Study 3. Studies 1 and 2 provided limited evidence for the reliability and validity of the VRAs. However, no meaningful constructs were measured by the VRA in Study 3. Findings demonstrate limited evidence for these VRAs as viable assessment options through the validity and reliability methods utilized in the present studies, which in turn emphasizes the importance of aligning the assessment purpose to the unique advantages of a VR environment. Practitioner points: (1) findings were mixed in correlating the VRA scores with similar assessments of the intended constructs; (2) details are provided on the design and scoring of the presented VRAs; (3) although research using VRAs is still preliminary, there are promising methods through which unique behavior-based evaluations might be designed.
In an attempt to predict a player's learning during a content-agnostic educational video game session, this study continuously recorded participants' gameplay interactions and used a dynamic Bayesian network to make real-time inferences about learning performance. The predicted learning was then correlated with post-test scores to establish the validity of the assessment. The assessment was moderately positively correlated with the post-test scores, demonstrating support for its validity.
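One common instantiation of such a dynamic Bayesian network is Bayesian Knowledge Tracing, in which a hidden binary "learned" state is updated after each observed in-game action. The sketch below is a generic illustration of this update with made-up parameter values; it is not the model or the parameters used in the study.

```python
def bkt_update(p_learned, correct, p_slip=0.10, p_guess=0.20, p_transit=0.15):
    """One step of Bayesian Knowledge Tracing: compute the posterior over the
    hidden 'learned' state given one observed action, then apply a learning
    transition."""
    if correct:
        evidence = p_learned * (1 - p_slip)
        posterior = evidence / (evidence + (1 - p_learned) * p_guess)
    else:
        evidence = p_learned * p_slip
        posterior = evidence / (evidence + (1 - p_learned) * (1 - p_guess))
    # The learner may transition to the learned state after this opportunity.
    return posterior + (1 - posterior) * p_transit

# Real-time inference over a hypothetical stream of logged in-game actions
p = 0.30  # prior probability that the skill is already learned
for action_correct in [True, True, False, True]:
    p = bkt_update(p, action_correct)
print(f"P(learned) after 4 actions = {p:.2f}")
```

Correct actions raise the estimated probability that the skill is learned and incorrect actions lower it, which is the mechanism that lets such a model emit a running learning estimate that can later be correlated with post-test scores.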
We experience a multifaceted challenge when we try to understand complex, dynamic systems (CDS) and communicate our understanding of such systems. This chapter gives an overview of the challenges that we experience and how the different challenges synergize to make our problems more complex. The chapter also provides a brief review of the existing instructional design theories or models, methods, techniques, and tools used to support learning in and about CDS and highlights the gaps that still need further research.
All video games, by design, are oriented toward narrative. Some games exemplify a colloquial notion of narrative (e.g., MYST), while others seem less story-like (e.g., Tetris). From a literacy perspective, narrative extends beyond the story. Literacy, similarly, is more than the discrete acts of listening, speaking, reading, or writing that are typically associated with narrative storytelling. Literacy, in its most inclusive and broad sense, involves the encapsulating, intertextual interactions between and among all modes of communication and their respective contexts. Literacy includes metacognitive skills, critical thinking skills, and social skills. More importantly, sociocultural influences and contexts undergird both games and literacy. Epistemologically, games are literacy events, and researchers stand to benefit from understanding the ways the two domains are isomorphically related. This chapter is dedicated to establishing the relationship between the field of literacy and game-based learning. This relationship is demonstrated through a literacy definition of text, context, skills, and application (of literacy) as each applies to the games Super Mario Bros., The Deed, and World of Warcraft. From the perspective that learning is a process and that games serve as sites of application for literacy, several implications for research and practice are provided.
Serious games are garnering popularity in learning environments and as assessment tools. We propose a summative assessment of a serious game as an assessment tool by merging assessment standards with serious game mechanics. To this end, we apply instructional design components with a focus on the evidence-centered game design approach (ECgD). Simultaneously, we introduce a different approach to game design and the traditional chain of effects toward competence assessment. Our leading questions are: How can competences be operationalized and translated into game mechanics? Through which serious game mechanics can we prompt players to act in typical domain-specific situations and demonstrate their sustainable creative competence? How, and through which statistical models, can we match the observed competence of players with the intended competence model formulated a priori? To answer these questions, we developed the domain-specific serious game MyBUY to assess the sustainable creative competence (SC competence) of young adults in Vocational Education and Training in the field of retail and sales. By matching the intended competence (theoretical model) with the SC competence observed while playing the serious game (empirical model), we found that the models were highly compatible. Further confirmation is given by the results of questionnaires on usability, cognitive load, and motivation. Our results affirm the need for future studies to apply our algorithm to design domain-specific serious games as competence assessment tools and to extend data collection and data analytics procedures in longitudinal studies.
Game-based learning is a dynamic field that has recently garnered much interest from different areas. Although game-based learning is not limited to digital games, and has in fact a long-standing history in human learning and development far beyond the times of digitalization, the growing interest in educational settings can also be attributed to technological advances as well as to particular preferences in users’ digital media behavior. If games exert such a fascination, it is only natural to ask whether and how they can be used to successfully promote learning purposes. This edited volume presents a wide-ranging collection of work and findings on game-based learning inside and across various disciplines. This concluding chapter aims to situate the single contributions of the volume within more general reflections on the concept and potential benefits of game-based learning as well as to provide an analysis and synthesis of major themes that have emerged in the previous chapters. Finally, it intends to sketch ideas for future research on game-based learning from an inter- and transdisciplinary perspective.
Data science techniques, nowadays widespread across all fields, can also be applied to the wealth of information derived from student interactions with serious games. Use of data science techniques can greatly improve the evaluation of games, and allow both teachers and institutions to make evidence-based decisions. This can increase both teacher and institutional confidence regarding the use of serious games in formal education, greatly raising their attractiveness. This paper presents a systematic literature review on how authors have applied data science techniques on game analytics data and learning analytics data from serious games to determine: (1) the purposes for which data science has been applied to game learning analytics data, (2) which algorithms or analysis techniques are commonly used, (3) which stakeholders have been chosen to benefit from this information and (4) which results and conclusions have been drawn from these applications. Based on the categories established after the mapping and the findings of the review, we discuss the limitations of the studies analyzed and propose recommendations for future research in this field.
In recent years we have witnessed a growing number of companies and institutions embedding game mechanics and game design techniques in all types of information systems, applications, and services. Following this trend, it is possible to find an increasing number of publications studying these subjects. With this meta-analysis we synthesise and integrate all the earlier literature and information available on gamification and serious games, assessing the current state-of-the-art in the field, filling a literature gap on this subject. We calculated meta-analysis effects from a total of 54 studies and 59 datasets collected from the literature. Attitude, enjoyment, and usefulness are the most relevant predictors of intention to use gamification. Intention, enjoyment, and usefulness are the most relevant predictors of the brand attitude towards gamification. Our results allow us to present a theoretical model that will be of value to future gamification studies.
The increased availability of vast and highly varied amounts of data from learners, teachers, learning environments, and administrative systems within educational settings is overwhelming. The focus of this chapter is on how data with a large number of records, of widely differing datatypes, and arriving rapidly from multiple sources can be harnessed for meaningful assessments and supporting learners in a wide variety of learning situations. Distinct features of analytics-driven assessments may include self-assessments, peer assessments,
This study is part of a research programme investigating the dynamics and impacts of learning engagement in a challenge-based digital learning environment. Learning engagement is a multidimensional concept which includes an individual's ability to behaviourally, cognitively, emotionally, and motivationally engage in an ongoing learning process. Challenge-based learning gives significant freedom to the learner to decide what and when to engage and interact with digital learning materials. In light of previous empirical findings, we expect that learning engagement is positively related to learning performance in a challenge-based online learning environment. This study was based on data from the Challenge platform, including transaction data from 8951 students. Findings indicate that learning engagement in challenge-based digital learning environments is, as expected, positively related to learning performance. Implications point toward the need to develop personalised and adaptive learning environments that cater for the individual needs of learners in challenge-based online learning environments.
The focus of this chapter is on designing engaging educational games for cognitive, motivational, and emotional benefits. The concept of engagement is defined and its relationship with motivation and cognition is discussed. Design issues with many educational games are examined in terms of factors influencing sustained motivation and engagement. A theoretical framework to design engaging digital games is presented, including three dimensions of engagement (i.e., behavioral, cognitive, and emotional). Later, the chapter considers how to harness the appealing power of engaging games for designing engaging educational games. Various motivational features of game design and learner experiences are considered. In conclusion, the chapter discusses various methods to assess engagement in order to inform the design of educational games that motivate learners.