Chapter 1
Game-Based Assessment: The Past Ten Years and Moving Forward
Yoon Jeon Kim and Dirk Ifenthaler

Y. J. Kim
Teaching Systems Lab, Massachusetts Institute of Technology, Cambridge, MA, USA
e-mail: yjk7@mit.edu

D. Ifenthaler
Learning, Design and Technology, University of Mannheim, Mannheim, Germany
Curtin University, Perth, Australia
e-mail: dirk@ifenthaler.info
1.1 Introduction
Educational assessment practice is challenging as there are a number of diverse concepts referring to the idea of assessment. Newton (2007) laments that the distinction between formative and summative assessment hindered the development of sound assessment practices on a broader level. Black (1998) defines three main types of assessment: (a) formative assessment to aid learning; (b) summative assessment for review, for transfer, and for certification; and (c) summative assessment for accountability to the public. Pellegrino, Chudowsky, and Glaser (2001) extend these definitions with three main purposes of assessment: (a) assessment to assist learning (formative assessment), (b) assessment of individual student achievement (summative assessment), and (c) assessment to evaluate programs (evaluative assessment). A common thread among the many definitions points to the concept of feedback for a variety of purposes, audiences, and methods of assessment (Ifenthaler, Greiff, & Gibson, 2018).
Digital game-based technologies are nudging the field of education to redefine what is meant by learning, instruction, and assessment. Proponents of game-based learning argue that students should be prepared to meet the demands of the twenty-first century by teaching them to be innovative, creative, and adaptable so that they can deal with the demands of learning in domains that are complex and ill-structured (Federation of American Scientists, 2005; Gee, 2003; Ifenthaler, Eseryel, & Ge,
2012; Prensky, 2001; Shaffer, 2006). Opponents, on the other hand, argue that games are just another technological fad that emphasizes superficial learning, and that games cause increased violence, aggression, inactivity, and obesity while decreasing prosocial behaviors (Walsh, 2002).
However, Ifenthaler et al. (2012) argue that the implementation of assessment features into game-based learning environments is still in its early stages because it adds a very time-consuming step to the design process. Moreover, the impact on learning, as well as the reliability and validity of technology-based assessment systems, is still being questioned. Three distinguishing features of game-based assessment have been proposed and are widely accepted: (1) game scoring, (2) external assessment, and (3) embedded assessment of game-based learning (Ifenthaler et al., 2012). Only recently has an additional feature been introduced which enables adaptive gameplay and game environments, broadly defined as learning analytics (Ifenthaler, 2015) and specifically denoted as serious games analytics (Loh, Sheng, & Ifenthaler, 2015). Serious games analytics converts learner-generated information into actionable insights for real-time processing. Metrics for serious games analytics are similar to those of learning analytics, including the learners' individual characteristics (e.g., socio-demographic information, interests, prior knowledge, skills, and competencies) and learner-generated game data (e.g., time spent, obstacles managed, goals or tasks completed, navigation patterns, social interaction) (Ge & Ifenthaler, 2017; Ifenthaler, 2015; Loh et al., 2015).
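To make these metrics concrete, the following minimal sketch aggregates a hypothetical stream of gameplay events into session-level learner metrics; the event schema, event types, and metric names are our own illustrative assumptions rather than part of any cited framework.

```python
# Minimal sketch: aggregating hypothetical gameplay events into the kinds of
# learner-generated metrics listed above (time spent, tasks completed, etc.).
from dataclasses import dataclass
from collections import Counter

@dataclass
class GameEvent:
    player_id: str
    timestamp: float  # seconds since session start
    event_type: str   # e.g., "task_completed", "obstacle_managed"

def summarize_session(events: list[GameEvent]) -> dict:
    """Reduce one player's event stream to simple per-session metrics."""
    if not events:
        return {}
    counts = Counter(e.event_type for e in events)
    times = [e.timestamp for e in events]
    return {
        "time_spent": max(times) - min(times),
        "tasks_completed": counts["task_completed"],
        "obstacles_managed": counts["obstacle_managed"],
        "social_interactions": counts["chat_message"],
    }

# Example session log for one player:
session = [
    GameEvent("p1", 0.0, "level_start"),
    GameEvent("p1", 42.5, "obstacle_managed"),
    GameEvent("p1", 90.0, "task_completed"),
    GameEvent("p1", 95.0, "chat_message"),
]
print(summarize_session(session))
```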
This chapter seeks to identify why research on game-based assessment is still in its infancy, what advances have been achieved over the past 10 years, and which challenges lie ahead for advancing assessment in game-based learning.
1.2 Game-Based Assessment and Assessment of Learning in Games: Why?
Games, both digital and nondigital, have become an important aspect of young people's lives. According to a recent survey conducted in the United States, 72% of youth ages 13–17 play games daily or weekly (Lenhart, 2015). Gaming is also one of the most popular social activities, especially for boys: 55% of boys play games in person or online with friends daily or weekly. As gaming gained popularity in daily life, educational researchers, starting in the early 2000s, began to investigate the potential educational benefits of games for learning and what well-designed games can teach us about learning and assessment (Gee, 2003).
So what are the affordances of games for learning? First, people learn in action in games (Gee, 2008). That is, people interact with all aspects of the game and take intentional actions within it. For its part, the game continuously responds to each action, and through this process the player gradually creates meaning. Clearly, how people are believed to learn within video games contrasts with how people typically learn at school, which often entails memorization of decontextualized and
abstract concepts and procedures (Shute, Ventura, Bauer, & Zapata-Rivera, 2009). Second, due to its interactive nature, learning by playing games can lead to conceptual understanding and problem-solving (Eseryel, Ge, Ifenthaler, & Law, 2011) in addition to domain-specific skills and practices (Bressler & Bodzin, 2016) that go beyond the basic content knowledge more commonly taught in the classroom. Steinkuehler and Duncan (2008) found that players in virtual worlds frequently engage in social knowledge construction, systems-based reasoning, and other scientific habits of mind. This body of work shows that games in general have great potential for contributing to a deep learning environment. In video games, players engage in active and critical thinking, take on different identities, and have opportunities to practice skills and find intrinsic rewards as they work on increasingly difficult challenges on their path to mastery (Eseryel, Law, Ifenthaler, Ge, & Miller, 2014; Gee, 2003).
Numerous studies have reported on games as a vehicle to support student learning. In a meta-analysis, Clark, Tanner-Smith, and Killingsworth (2016) reported that, compared to nongame conditions, digital games had a moderate to strong effect on overall learning outcomes, including cognitive and interpersonal skills. Similarly, a literature review by Boyle et al. (2016) reports that games are beneficial for a variety of learning outcomes such as knowledge acquisition, affect, behavior change, perception, and cognition. Numerous studies have also reported domain-specific academic benefits of games for learning, including in science and mathematics (Divjak & Tomić, 2011). To answer the question of what people are learning from playing games, researchers have been using a variety of methods, including external measures, log data capturing in-game actions, and game-related actions beyond the game context (Ifenthaler et al., 2012; Loh et al., 2015).
1.3 Game-Based Assessment: Past 10 Years
Several meta-analyses have been published focusing on game-based learning. For example, Baptista and Oliveira (2019) highlight important variables, including intention, attitude, enjoyment, and usefulness, in their review of more than 50 studies of serious games. A systematic review by Alonso-Fernández, Calvo-Morata, Freire, Martínez-Ortiz, and Fernández-Manjón (2019) focuses on the application of data science techniques to game learning data and suggests specific game learning analytics. Ke (2016) presents a systematic review on the integration of domain-specific learning in game mechanics and game world design. Another systematic review, by Ravyse, Seugnet Blignaut, Leendertz, and Woolner (2017), identifies five central themes of serious games: backstory and production, realism, artificial intelligence and adaptivity, interaction, and feedback and debriefing. Notably, none of the abovementioned meta-analyses and systematic reviews has a clear focus on assessment of game-based learning.
Still, one line of research that emerged over the past 10 years concerns the question of how games can be used as interactive and rich technology-enhanced environments to advance assessment technologies. That is, the primary goal of this line is to advance assessment using games (Ifenthaler et al., 2012). Earlier game-based assessment work primarily focused on applying the evidence-centered design framework to develop assessment models with specific learning outcomes and skills in mind (Behrens, Mislevy, Dicerbo, & Levy, 2012). For example, Shute et al. (2009) describe an approach called stealth assessment, in which in-game behavioral indicators (e.g., specific actions taken within a quest in Oblivion) are identified from logged data and used to make inferences about the player's underlying skills (e.g., creative problem-solving) without interrupting the flow of gameplay. Using this approach, one can use existing games to measure latent constructs, even if the game was not explicitly developed for the purpose of learning or assessment, as long as the game provides ample contexts (or situations) that elicit evidence for the underlying skills and constructs (Loh et al., 2015). Similarly, building on the popular game SimCity, GlassLab developed SimCityEDU to assess students' systems thinking (Dicerbo et al., 2015). These approaches have primarily used the evidence-centered design framework (Almond, Steinberg, & Mislevy, 2002) to align what people might learn from the game with what they do in the game.
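At its core, the inference step of stealth assessment can be illustrated by updating the probability of a latent skill each time an evidence-bearing action is observed. The following minimal sketch assumes a single binary skill, conditionally independent observations, and invented likelihood values; operational stealth assessments use full Bayesian networks over many competency and evidence variables rather than this single-node shortcut.

```python
# Illustrative only: Bayesian updating of a single binary latent skill from
# in-game evidence. The likelihoods below are invented; real stealth
# assessment models are calibrated from data and span many variables.

def update_skill_belief(prior: float, observations: list[bool],
                        p_success_given_skill: float = 0.8,
                        p_success_given_no_skill: float = 0.3) -> float:
    """Apply Bayes' rule once per observed evidence-bearing action."""
    belief = prior
    for success in observations:
        # Likelihood of this observation under "skilled" vs. "not skilled".
        l_skill = p_success_given_skill if success else 1 - p_success_given_skill
        l_no_skill = p_success_given_no_skill if success else 1 - p_success_given_no_skill
        evidence = l_skill * belief + l_no_skill * (1 - belief)
        belief = l_skill * belief / evidence
    return belief

# A player who succeeds on three of four evidence-bearing quest actions:
print(update_skill_belief(prior=0.5, observations=[True, True, False, True]))
# -> roughly 0.84, the posterior belief that the player has the skill
```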
Eseryel, Ifenthaler, and Ge (2011) provide an integrated framework for assessing complex problem-solving in digital game-based learning in the context of a longitudinal design-based research study. In a longitudinal field study, they examined the impact of the massively multiplayer online game (MMOG) Surviving in Space on students' complex problem-solving skill acquisition, mathematics achievement, and motivation. Two different methodologies were applied to assess students' learning progress in complex problem-solving. The first utilized adapted protocol analysis (Ericsson & Simon, 1980, 1993) to analyze students' responses to the given problem scenario within the framework of the think-aloud methodology. The second utilized the HIMATT methodology (Eseryel, Ifenthaler, & Ge, 2013; Pirnay-Dummer, Ifenthaler, & Spector, 2010) to analyze students' annotated causal representations of the phenomena in question. The automated text-based analysis function of HIMATT enables the direct tracking of associations between concepts in texts of 350 or more words, hence supporting an adaptive assessment and feedback environment for game-based learning. For future game design, the algorithms produce quantitative measures and graphical representations which could be used for instant feedback within the game or for further analysis (Ifenthaler, 2014).
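To convey the underlying idea of text-based concept tracking, the sketch below simply links concepts that co-occur within a sentence of a learner's written explanation. HIMATT's actual algorithms are considerably more sophisticated; the function, concept list, and sample text here are invented for illustration.

```python
# Toy version of concept-association tracking in learner-written text:
# count how often predefined concept terms co-occur within a sentence.
import itertools
import re
from collections import Counter

def concept_pairs(text: str, concepts: set[str]) -> Counter:
    """Count within-sentence co-occurrences of predefined concept terms."""
    pairs = Counter()
    for sentence in re.split(r"[.!?]", text.lower()):
        found = sorted({c for c in concepts if c in sentence})
        pairs.update(itertools.combinations(found, 2))
    return pairs

# A two-sentence learner explanation and four target concepts:
text = ("Gravity pulls the satellite toward the planet. Velocity keeps the "
        "satellite moving, so gravity bends its path into an orbit.")
print(concept_pairs(text, {"gravity", "velocity", "satellite", "orbit"}))
```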
More recently, researchers have introduced learning analytics and data mining techniques to broaden what game-based assessment means (Loh et al., 2015). For example, Rowe et al. (2017) built "detectors," machine-learned algorithms that use log data from the game to measure implicit understanding of physics, different strategies associated with productivity in the game, and computational thinking. While they did not use formal measurement models (e.g., IRT or Bayes nets), these detectors are implemented in the game engine to make real-time inferences about players. Similarly, Shadowspect, developed at the MIT Playful Journey Lab (Kim & Rosenheck, 2018), is
another example of game-based assessment that utilizes recent advances in learning analytics and educational data mining techniques in the process of game design and development for the purpose of assessment.
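In its simplest form, such a detector is a supervised classifier trained on log-derived features from gameplay clips that human coders have labeled, then applied to the live feature stream. The sketch below, which assumes the widely used scikit-learn library, uses a logistic regression for this purpose; the features, labels, and values are fabricated for illustration and are not taken from the cited studies.

```python
# Sketch of a gameplay "detector": a classifier trained on labeled log
# features, then queried in (near) real time. All data here are fabricated.
from sklearn.linear_model import LogisticRegression

# Each row: [moves_per_minute, retries_on_level, hint_requests]
X_train = [
    [12, 0, 0], [15, 1, 0], [4, 5, 3], [5, 4, 2],
    [14, 1, 1], [3, 6, 4], [11, 2, 0], [6, 5, 3],
]
# 1 = human coders judged the clip to show implicit understanding; 0 = not.
y_train = [1, 1, 0, 0, 1, 0, 1, 0]

detector = LogisticRegression().fit(X_train, y_train)

# At run time, the game engine streams the same features for a live player:
live_features = [[13, 1, 0]]
print(detector.predict_proba(live_features)[0][1])  # P(implicit understanding)
```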
Hence, the application of serious games analytics opens up opportunities for the assessment of engagement within game-based learning environments (Eseryel et al., 2014). The availability of real-time information about learners' actions and behaviors, stemming from key decision points or game-specific events, provides insights into the extent of learners' engagement during gameplay. The analysis of single actions or behaviors, as well as the investigation of more complex series of actions and behaviors, can elicit patterns of engagement and therefore provide key insights into learning processes (Ge & Ifenthaler, 2017).
Ifenthaler and Gibson (2019) report how highly detailed data traces captured by the Challenge platform, with many events per learning activity, when combined with new input devices and approaches, bring the potential for measuring indicators of the physical, emotional, and cognitive states of the learner. The data innovation of the platform is its ability to capture event-based records of the higher-frequency and higher-dimensional aspects of learning engagement, which is in turn useful for analyzing the effectiveness of, and the impact on, the physical, emotional, and cognitive layers of learning caused or influenced by those engagements. This forms a high-resolution analytics base on which research into digital learning and teaching, as well as into how to achieve better outcomes in scalable digital learning experiences, can be conducted (Gibson & Jackl, 2015).
1.4 Challenges and Future Work
While interest in game-based assessment peaked in 2009, when GlassLab was launched to scale up this approach across the broader education system, many promises of game-based learning and assessment have not been fully realized in actual education systems. Reflecting on the field's achievements in the past 10 years and on the contributions to the current volume, we outline below the challenges that the field of game-based assessment still faces, as well as the future work that researchers, game designers, and educators should address to transform how games are used in education.
While evidence-centered design (ECD) has been the predominant framework for designing assessment in games, it is often unclear how different development processes leverage ECD to conceptualize game design around the competency of interest (Ke, Shute, Clark, & Erlebacher, 2019). For example: How can assessment models be formalized? How can formalized assessment models be translated into game design elements? When in the game design process does this translation occur most effectively? How can competency models be transformed into interesting, engaging game mechanics? How can psychometric qualities be ensured without being too prescriptive?
Many established game-based assessment approaches focus on understanding
the continuous progression of learning, thinking, reasoning, argumentation, and
complex problem-solving during digital game-based learning. From a design perspective, it seems important that the game mechanics address the underlying affective, behavioral, and cognitive dispositions, which must be assessed carefully at various stages of the learning process, that is, already while conceptualizing and designing games for learning (Bertling, Jackson, Oranje, & Owen, 2015; Eseryel et al., 2014; Ge & Ifenthaler, 2017).
Advanced data analytics methodologies and technological developments enable researchers, game designers, and educators to embed assessment and analysis techniques into game-based learning environments with relative ease (Loh et al., 2015). Internal assessment and instant analysis, including personalized feedback, can be implemented in a new generation of educational games. However, it is up to educational research to provide theoretical foundations and empirical evidence on how these methodologies should be designed and implemented. We have just arrived in the age of educational data analytics. Hence, it is up to researchers, technologists, educators, and philosophers to make sense of these powerful technologies and thus better help learners to learn.
Among the challenges brought on by game-based assessments, including data analytics, is that the large amount of data now available to teachers is far too complex for conventional database software to store, manage, and process. Accordingly, analytics-driven game-based assessments underscore the need to develop assessment literacy among assessment stakeholders (Ifenthaler et al., 2018; Stiggins, 1995). Game designers and educators applying data-driven game-based assessments require practical hands-on experience with the fundamental platforms and analysis tools for linked big game-based assessment data. Stakeholders need to be introduced to several data storage methods and how to distribute and process data, to possible ways of handling analytics algorithms on different platforms, and to visualization techniques for game-based assessment analytics (Gibson & Ifenthaler, 2017). Well-prepared stakeholders may demonstrate additional competencies such as understanding large-scale machine learning methods as foundations for human-computer interaction, artificial intelligence, and advanced network analysis (Ifenthaler et al., 2018).
The current research findings also indicate that design research and development are needed in automation and semi-automation (e.g., humans and machines working together) in assessment systems. Automation and semi-automation of assessments to provide feedback, observations, classifications, and scoring are increasingly being used to serve both formative and summative purposes in game-based learning. Gibson, Ifenthaler, and Orlic (2016) proposed an open assessment resources approach that has the potential to increase trust in, and use of, open education resources (OER) in game-based learning and assessment by adding clarity about assessment purposes and targets in the open resources world. Open assessment resources (OAR) with generalized formative feedback are aligned with a specific educative purpose that a user of a specific OER expresses regarding the utility of, and expectations for, using that OER to achieve an educational outcome. Hence, OAR may be utilized by game designers to include valuable, competence-based assessments in game-based learning.
The application of analytics-driven game-based assessments also opens up opportunities for the assessment of engagement and other motivational (or, more broadly, non-cognitive) constructs within game-based learning environments (Eseryel et al., 2014). As noted above, real-time information about learners' actions and behaviors at key decision points or game-specific events, together with the analysis of both single actions and more complex series of actions, can elicit patterns of engagement and provide key insights into ongoing learning processes within game-based learning environments.
To sum up, the complexity of designing adaptive assessment and feedback systems has been discussed widely over the past few years (e.g., Sadler, 2010; Shute, 2008). The current challenge is to make use of data from learners, teachers, and game-based learning environments for assessment. Hence, more research is needed to unveil diverse methods and processes related to how design teams, often including learning scientists, subject-matter experts, and game designers, can seamlessly integrate design thinking and the formalization of assessment models into meaningful assessment for game-based learning environments.
References
Almond, R. G., Steinberg, L. S., & Mislevy, R. J. (2002). Enhancing the design and delivery of assessment systems: A four process architecture. Journal of Technology, Learning, and Assessment, 1(5), 3–63.
Alonso-Fernández, C., Calvo-Morata, A., Freire, M., Martínez-Ortiz, I., & Fernández-Manjón, B. (2019). Applications of data science to game learning analytics data: A systematic literature review. Computers & Education, 141, 103612. https://doi.org/10.1016/j.compedu.2019.103612
Baptista, G., & Oliveira, T. (2019). Gamification and serious games: A literature meta-analysis and integrative model. Computers in Human Behavior, 92, 306–315.
Behrens, J., Mislevy, R., Dicerbo, K., & Levy, R. (2012). Evidence centered design for learning and assessment in the digital world. In M. Mayrath, J. Clarke-Midura, D. Robinson, & G. Schraw (Eds.), Technology-based assessments for 21st century skills (pp. 13–54). Charlotte, NC: Information Age Publishers.
Bertling, M., Jackson, G. T., Oranje, A., & Owen, V. E. (2015). Measuring argumentation skills with game-based assessments: Evidence for incremental validity and learning. In C. Conati, N. Heffernan, A. Mitrovic, & M. Verdejo (Eds.), Artificial intelligence in education. AIED 2015 (Vol. 9112, pp. 545–549). Cham, Switzerland: Springer.
Black, P. J. (1998). Testing: Friend or foe? The theory and practice of assessment and testing. London, UK: Falmer Press.
Boyle, E. A., Hainey, T., Connolly, T. M., Grant, G., Earp, J., Ott, M., … Pereira, J. (2016). An update to the systematic literature review of empirical evidence of the impacts and outcomes of computer games and serious games. Computers & Education, 94, 178–192. https://doi.org/10.1016/j.compedu.2015.11.003
Bressler, D. M., & Bodzin, A. M. (2016). A mixed methods assessment of students' flow experiences during a mobile augmented reality science game. Journal of Computer Assisted Learning, 29, 505–517. https://doi.org/10.1111/jcal.12008
Clark, D. B., Tanner-Smith, E. E., & Killingsworth, S. S. (2016). Digital games, design, and learning: A systematic review and meta-analysis. Review of Educational Research, 86(1), 79–122. https://doi.org/10.3102/0034654315582065
Dicerbo, K., Bertling, M., Stephenson, S., Jia, Y., Mislevy, R. J., Bauer, M., & Jackson, G. T. (2015). An application of exploratory data analysis in the development of game-based assessments. In C. S. Loh, Y. Sheng, & D. Ifenthaler (Eds.), Serious games analytics: Methodologies for performance measurement, assessment, and improvement (pp. 319–342). New York, NY: Springer.
Divjak, B., & Tomić, D. (2011). The impact of game-based learning on the achievement of learning goals and motivation for learning mathematics: Literature review. Journal of Information and Organizational Sciences, 35(1), 15–30.
Ericsson, K. A., & Simon, H. A. (1980). Verbal reports as data. Psychological Review, 87, 215–251.
Ericsson, K. A., & Simon, H. A. (1993). Protocol analysis: Verbal reports as data. Cambridge, MA: MIT Press.
Eseryel, D., Ge, X., Ifenthaler, D., & Law, V. (2011). Dynamic modeling as cognitive regulation scaffold for complex problem solving skill acquisition in an educational massively multiplayer online game environment. Journal of Educational Computing Research, 45(3), 265–287.
Eseryel, D., Ifenthaler, D., & Ge, X. (2011). Alternative assessment strategies for complex problem solving in game-based learning environments. In D. Ifenthaler, P. Isaias, Kinshuk, D. G. Sampson, & J. M. Spector (Eds.), Multiple perspectives on problem solving and learning in the digital age (pp. 159–178). New York, NY: Springer.
Eseryel, D., Ifenthaler, D., & Ge, X. (2013). Validation study of a method for assessing complex ill-structured problem solving by using causal representations. Educational Technology Research and Development, 61(3), 443–463. https://doi.org/10.1007/s11423-013-9297-2
Eseryel, D., Law, V., Ifenthaler, D., Ge, X., & Miller, R. B. (2014). An investigation of the interrelationships between motivation, engagement, and complex problem solving in game-based learning. Journal of Educational Technology & Society, 17(1), 42–53.
Federation of American Scientists. (2005). Summit on educational games: Harnessing the power of video games for learning. Washington, DC: Author.
Ge, X., & Ifenthaler, D. (2017). Designing engaging educational games and assessing engagement in game-based learning. In R. Zheng & M. K. Gardner (Eds.), Handbook of research on serious games for educational applications (pp. 255–272). Hershey, PA: IGI Global.
Gee, J. P. (2003). What video games have to teach us about learning and literacy. New York, NY: Palgrave Macmillan.
Gee, J. P. (2008). Learning and games. In K. Salen (Ed.), The ecology of games: Connecting youth, games, and learning (The John D. and Catherine T. MacArthur Foundation series on digital media and learning) (pp. 21–40). Cambridge, MA: MIT Press.
Gibson, D. C., & Ifenthaler, D. (2017). Preparing the next generation of education researchers for big data in higher education. In B. K. Daniel (Ed.), Big data and learning analytics: Current theory and practice in higher education (pp. 29–42). New York, NY: Springer.
Gibson, D. C., Ifenthaler, D., & Orlic, D. (2016). Open assessment resources for deeper learning. In P. Blessinger & T. J. Bliss (Eds.), Open education: International perspectives in higher education (pp. 257–279). Cambridge, UK: Open Book Publishers.
Gibson, D. C., & Jackl, P. (2015). Theoretical considerations for game-based e-learning analytics. In T. Reiners & L. Wood (Eds.), Gamification in education and business (pp. 403–416). New York, NY: Springer.
Ifenthaler, D. (2014). AKOVIA: Automated knowledge visualization and assessment. Technology, Knowledge and Learning, 19(1–2), 241–248. https://doi.org/10.1007/s10758-014-9224-6
Ifenthaler, D. (2015). Learning analytics. In J. M. Spector (Ed.), The SAGE encyclopedia of educational technology (Vol. 2, pp. 447–451). Thousand Oaks, CA: Sage.
Ifenthaler, D., Eseryel, D., & Ge, X. (2012). Assessment for game-based learning. In D. Ifenthaler, D. Eseryel, & X. Ge (Eds.), Assessment in game-based learning: Foundations, innovations, and perspectives (pp. 3–10). New York, NY: Springer.
Ifenthaler, D., & Gibson, D. C. (2019). Opportunities of analytics in challenge-based learning. In A. Tlili & M. Chang (Eds.), Data analytics approaches in educational games and gamification systems. Cham, Switzerland: Springer.
Ifenthaler, D., Greiff, S., & Gibson, D. C. (2018). Making use of data for assessments: Harnessing analytics and data science. In J. Voogt, G. Knezek, R. Christensen, & K.-W. Lai (Eds.), International handbook of IT in primary and secondary education (2nd ed., pp. 649–663). New York, NY: Springer.
Ke, F. (2016). Designing and integrating purposeful learning in game play: A systematic review. Educational Technology Research and Development, 64(2), 219–244. https://doi.org/10.1007/s11423-015-9418-1
Ke, F., Shute, V. J., Clark, K. M., & Erlebacher, G. (2019). Interdisciplinary design of game-based learning platforms. Cham, Switzerland: Springer.
Kim, Y. J., & Rosenheck, L. (2018). A playful assessment approach to research instrument development. Paper presented at the Thirteenth International Conference of the Learning Sciences, London, UK.
Lenhart, A. (2015). Teens, social media and technology overview 2015. Washington, DC: Pew Research Center.
Loh, C. S., Sheng, Y., & Ifenthaler, D. (2015). Serious games analytics: Theoretical framework. In C. S. Loh, Y. Sheng, & D. Ifenthaler (Eds.), Serious games analytics: Methodologies for performance measurement, assessment, and improvement (pp. 3–29). New York, NY: Springer.
Newton, P. E. (2007). Clarifying the purposes of educational assessment. Assessment in Education: Principles, Policy & Practice, 14(2), 149–170. https://doi.org/10.1080/09695940701478321
Pellegrino, J. W., Chudowsky, N., & Glaser, R. (Eds.). (2001). Knowing what students know: The science and design of educational assessment. Washington, DC: National Academy Press.
Pirnay-Dummer, P., Ifenthaler, D., & Spector, J. M. (2010). Highly integrated model assessment technology and tools. Educational Technology Research and Development, 58(1), 3–18. https://doi.org/10.1007/s11423-009-9119-8
Prensky, M. (2001). Digital game-based learning. New York, NY: McGraw-Hill.
Ravyse, W. S., Seugnet Blignaut, A., Leendertz, V., & Woolner, A. (2017). Success factors for serious games to enhance learning: A systematic review. Virtual Reality, 21(1), 31–58.
Rowe, E., Asbell-Clarke, J., Baker, R. S., Eagle, M., Hicks, A., Barnes, T., … Edwards, T. (2017). Assessing implicit science learning in digital games. Computers in Human Behavior, 76, 617–630. https://doi.org/10.1016/j.chb.2017.03.043
Sadler, D. R. (2010). Beyond feedback: Developing student capability in complex appraisal. Assessment & Evaluation in Higher Education, 35(5), 535–550.
Shaffer, D. W. (2006). How computer games help children learn. New York, NY: Palgrave Macmillan.
Shute, V. J. (2008). Focus on formative feedback. Review of Educational Research, 78(1), 153–189.
Shute, V. J., Ventura, M., Bauer, M., & Zapata-Rivera, D. (2009). Melding the power of serious games and embedded assessment to monitor and foster learning: Flow and grow. In U. Ritterfeld, M. Cody, & P. Vorderer (Eds.), Serious games: Mechanisms and effects (pp. 295–321). New York, NY: Routledge.
Steinkuehler, C., & Duncan, S. (2008). Scientific habits of mind in virtual worlds. Journal of Science Education and Technology, 17(6), 530–543.
Stiggins, R. J. (1995). Assessment literacy for the 21st century. Phi Delta Kappan, 77(3), 238–245.
Walsh, D. (2002). Video game violence and public policy. Retrieved from http://www.soc.iastate.edu/sapp/videogames2.pdf