Gameful Design Heuristics:
A Gamification Inspection Tool
Gustavo F. Tondello1,2[0000-0002-4174-7245], Dennis L. Kappen5[0000-0003-4815-153X],
Marim Ganaba1 and Lennart E. Nacke1,2,3,4[0000-0003-4290-8829]
1 HCI Games Group, Games Institute; 2 Cheriton School of Computer Science
3 Drama and Speech Communication; 4 Stratford School of Interaction and Business
University of Waterloo, ON, Canada
5 Humber College, Toronto, ON, Canada
gustavo@tondello.com, dennis.kappen@humber.ca,
mariganaba@gmail.com, lennart.nacke@acm.org
Abstract. Despite the emergence of many gameful design methodologies in the
literature, there is a lack of methods to evaluate the resulting designs. Gameful
design techniques aim to increase the user’s motivation to interact with a soft-
ware, but there are presently no accepted guidelines on how to find out if this
goal was achieved during the design phase of a project. This paper presents the
Gameful Design Heuristics, a novel set of guidelines that facilitate a heuristic
evaluation of gameful software, with a focus on the software’s potential to af-
ford intrinsic and extrinsic motivation for the user. First, we reviewed several
gameful design methods to identify the most frequently employed dimensions
of motivational affordances. Then, we devised a set of 28 gamification heuris-
tics that can be used to rapidly evaluate a gameful system. Finally, we conduct-
ed a summative empirical evaluation study with five user experience profes-
sionals, which demonstrated that our heuristics can help the evaluators find
more motivational issues in interactive systems than they would without the
heuristics. The suggested method fulfills the need for evaluation tools specific
to gameful design, which could help evaluators assess the potential user experi-
ence of a gameful application in the early phases of a project.
Keywords: Gameful Design Heuristics, Heuristic Evaluation, User Experience,
Gamification, Gameful Design.
1 Introduction
Many gameful design methods have recently emerged as part of the user experience
(UX) design toolkit. They aim to augment and improve the UX of interactive systems
with gamification, defined as the use of game design elements in non-game contexts [1].
Even though these tools have been increasingly adopted during the design phase of
software projects, designers still lack standard evaluation methods. There are no
guidelines for experts (i.e., people with background knowledge in UX) to evaluate a
gameful implementation early on in a project.
For usability evaluation, two standard approaches exist. First, the gold standard is a
usability test. UX researchers can either run a formative usability test (where they
usually sit close to the participant and observe their behaviour) or a summative one
(where they are often present locally or virtually, but the participant is working
through an assigned task or scenario while some outcome measures are recorded).
However, the second approach (heuristic evaluation, or usability inspection) is cheaper
and easier to set up, and can be conducted before planning an expensive usability
test. Heuristic evaluations or usability inspections allow experts to evaluate a design
based on a set of principles or guidelines (i.e., heuristics). These are fast and inexpen-
sive methods that can be used to identify and address design issues.
These expert guidelines date back to the early days of software design (e.g., Smith
and Mosier [2]) and have over the past decades improved how we develop software
and interactive applications. In the established areas of UX, heuristic evaluation or
inspection methods [3, 4] are commonly used as evaluation tools during the project
design and implementation phases. These are not meant to replace user testing, but
rather complement the set of evaluation tools. While it has become more common to
conduct user tests with gamified applications (just as games user researchers have
done in the video game industry), the domain is still lacking robust methodologies for
evaluating gameful designs.
The benefit of using a gamification inspection method is that it allows rapid and
early evaluation of a gameful design. While several studies have investigated the ef-
fectiveness of gameful applications by studying their users [5], user tests are conduct-
ed after a prototype has already been implemented. Although concerns have been
voiced that heuristic evaluation can be influenced by subjective interpretations [6], it
remains a valuable tool for practitioners, who operate under tighter time constraints
than researchers. Heuristic evaluation affords researchers a finer focus in the user tests
that are usually done subsequently to this initial validation, since the most basic issues
will have already been discovered at that point.
While UX tests focus on identifying issues related to usability, ergonomics, cogni-
tive load, and affective experiences, gamification is concerned with understanding
and fostering the user’s motivation to use a product, system, or service. Thus, gamifi-
cation methods rely on motivational psychology research, such as self-determination
theory (SDT) [7–10], to understand human motivation. Our heuristics were informed
by this theoretical framework.
Several gameful design frameworks and methods have been suggested [11, 12]
with prescriptive guidelines for augmenting an application with motivational af-
fordances (note that we refer to gamification and gameful design interchangeably
because both frame the same set of phenomena from different points of view [1]).
Motivational affordances are properties added to an object, which allow its users to
experience the satisfaction of their psychological needs [13, 14]. In gameful design,
motivational affordances are used to facilitate intrinsic and extrinsic motivation. Thus,
motivational affordances supporting a user’s feelings of competence, autonomy, and
relatedness can facilitate intrinsic motivation, whereas external incentives or rewards
facilitate extrinsic motivation.
Our work contributes to the human-computer interaction (HCI) and gamification
communities by presenting a new set of guidelines for heuristic evaluation of gameful
design in interactive systems. We began our research by reviewing several gameful
design frameworks and methods to identify which dimensions of motivational af-
fordances were common among them. Next, we created a set of heuristics focused on
each of the identified dimensions. The resulting set of heuristics provides a new way
of evaluating gameful user experiences. It is the first inspection tool focused specifi-
cally on evaluating gameful design through the lens of intrinsic and extrinsic motiva-
tional affordances. The aim of our inspection tool is to enable any UX expert to con-
duct a heuristic evaluation of a gameful application more easily, even if they have no
background expertise in gameful design or motivational psychology.
To evaluate the proposed heuristics, we conducted a study with five UX or HCI
professionals who evaluated two online gameful applications. Three participants used
our gameful design heuristics, while the remaining two used a two-page description of
gamification and motivational affordances. Results showed that usage of our heuris-
tics led to more motivational issues being identified in the evaluated applications, as
well as a broader range of identified issues, comprising a larger number of different
dimensions.
2 Related Work and Model Development
2.1 Heuristic Evaluation for Games
In usability engineering, heuristics are broad usability guidelines that have been used
to design and evaluate interactive systems [15]. Heuristic evaluation is the use of
these principles by experts in a usability inspection process to identify usability prob-
lems in an existing design as part of an iterative design process [3, 4]. These inspec-
tions are usually done early in the design process to identify application errors before
scheduling user tests.
Several authors have suggested heuristic evaluation models for games. These mod-
els vary both in their goals and in the dimensions they address: while some are more
general, aimed at evaluating any game genre or type, others focus more narrowly, for
example, on networked or mobile games. Some of the most relevant heuristic evalua-
tion models for game design are shown in Table 1.
Some heuristics for evaluating games or playability may also be applied to gameful
applications. Some of the dimensions addressed by most game design heuristics are of
relevance to gameful design, such as goals, challenge, feedback, and social interac-
tion. However, heuristics for games include several dimensions that are not applicable
to most gameful applications, such as control and concentration.
Additionally, some of the game heuristics cover issues that can be addressed in
gameful applications using general UX principles, such as screen layout or naviga-
tion. These heuristics might be necessary when evaluating games because game de-
sign often uses its own user interface principles, which can be different from tradi-
tional application interfaces. However, most gameful applications follow current de-
sign standards for user interfaces; thus, general UX evaluation methods can be easily
applied to gameful applications to address issues such as usability or ergonomics.
Game design heuristics do not cover the full range of common motivational af-
fordances used in gamification. For example, meaning, rewards, and scarcity are di-
mensions of motivational affordances often used in gameful design that are not cov-
ered by existing game heuristics. This makes it difficult to use game design heuristics
to evaluate gameful applications. In order to do so, an evaluator would have to decide
first which dimensions from the game heuristics should be used and which should not;
next, they would also have to be concerned with motivational issues that are not cur-
rently covered by game heuristics. Consequently, we conclude that we need an in-
spection method better suited to assess gameful applications.
Before creating our set of gameful design heuristics, we reviewed the abovemen-
tioned game heuristics and considered the possibility of extending the existing models
rather than proposing a new one. However, we encountered the same issues men-
tioned above: we would have to separate which heuristics from the existing models
are applicable to gameful design and which are not. The resulting model would be
confusing and difficult to apply. Therefore, we decided to create a new set of gameful
design heuristics by analyzing existing gameful design methods rather than analyzing
and extending existing game design heuristics.
2.2 Heuristic Evaluation for Playful Design
The Playful Experiences (PLEX) Framework [16, 17] provides an understanding of
pleasurable user experience, which can be applied to both games and gameful applica-
tions. It classifies playful experiences according to 22 categories (see Table 2).
Table 1. Existing heuristic evaluation models for games.

Heuristic Evaluation for Playability (HEP) [31]: A set of heuristics for playability comprising four categories: gameplay, game story, game mechanics, and game usability.
Games Usability Heuristics (PLAY) [32]: A set of 48 principles aimed at evaluating action-adventure, RTS, and FPS games. The heuristics are organized according to three categories: gameplay; coolness / entertainment / humor / emotional immersion; and usability & game mechanics.
Game Approachability Principles (GAP) [33]: A set of guidelines to create better tutorials or experiences for new players.
Playability Heuristics for Mobile Games [34]: A set of heuristics for mobile games comprising three categories: game usability, mobility, and gameplay.
Networked Game Heuristics (NGH) [35]: A set of heuristics that consider specific issues related to group play over a network.
Heuristics for Social Games [36]: A set of heuristics created from a critical review of prior video game evaluation heuristics.
GameFlow [37, 38]: A comprehensive heuristic set designed as a tool to evaluate player enjoyment in eight dimensions: concentration, challenge, player skills, control, clear goals, feedback, immersion, and social interaction.
The PLEX framework can be used as a tool for heuristic evaluation of gameful in-
teractive systems, similar to the gameful design heuristics we are presenting. Never-
theless, PLEX is focused on classifying the types of experiences that the system can
afford, rather than the motivational potential of these experiences. Therefore, the
PLEX framework and the gameful design heuristics are two complementary tools,
which can each provide insights into different characteristics of interactive systems
that work together to afford an enjoyable user experience.
2.3 Review of Gameful Design Methods
To the best of our knowledge, no extant set of heuristics is available for evaluating
motivation in gameful design. Some of the existing gameful design methods, namely
Octalysis [18], HEXAD [19], and Lens of Intrinsic Skill Atoms [11], suggest proce-
dures to evaluate an existing system. Nevertheless, these procedures only provide a
starting point for the design process. They are less suited for being used as an evalua-
tion tool by a quality control team because they lack a concise set of heuristics with
brief descriptors which could be quickly checked by a UX practitioner. Moreover, the
lack of a succinct rubric implies that an evaluator would need to study the methods
intensively before being able to conduct an evaluation. Therefore, presently, there is
no evaluation method for gameful applications that can be easily learned by UX pro-
fessionals who are not familiar with gameful design. Our research fills this gap.
Table 2. The 22 categories of the PLEX framework.

Captivation: Forgetting one’s surroundings
Challenge: Testing abilities in a demanding task
Competition: Contest with oneself or an opponent
Completion: Finishing a major task, closure
Control: Dominating, commanding, regulating
Cruelty: Causing mental or physical pain
Discovery: Finding something new or unknown
Eroticism: A sexually arousing experience
Exploration: Investigating an object or situation
Expression: Manifesting oneself creatively
Fantasy: An imagined experience
Fellowship: Friendship, communality, or intimacy
Humor: Fun, joy, amusement, jokes, gags
Nurture: Taking care of oneself or others
Relaxation: Relief from bodily or mental work
Sensation: Excitement by stimulating senses
Simulation: An imitation of everyday life
Submission: Being part of a larger structure
Subversion: Breaking social rules and norms
Suffering: Experience of loss, frustration, anger
Sympathy: Sharing emotional feelings
Thrill: Excitement derived from risk, danger
Several gameful design frameworks and methods are currently available (see [11,
12, 20] for comprehensive reviews). Therefore, we decided to review these existing
methods to extract the different dimensions of motivational affordances that need to
be considered in gameful design. Since the reviewed methods synthesize the current
set of best practices in gameful design, we considered that they could provide an
adequate starting point to identify motivational dimensions of concern. However, only
a few of the reviewed methods feature a classification of motivational affordances in
different dimensions, which we could use as a theoretical background to devise our
heuristics. This was unfortunate, since our goal was to use these dimensions of
motivational affordances as the starting point for the development of our framework.
Gameful design methods that do not provide a classification of dimensions of
motivational affordances would not be helpful in creating our gameful design
heuristics. Therefore, we expanded the scope of our analysis to include methods that
presented some sort of classification of motivational affordances. Table 3 lists the
frameworks and methods we considered, as well as the rationale for their inclusion or
otherwise in our analysis.
After reviewing the frameworks and methods and selecting six of them for further
analysis (see Table 3), we conducted a comparison of the motivational dimensions in
each model to map the similarities between them, using the following procedure:
1. The first framework was added as the first column of a table, with each one of its
suggested motivational dimensions as separate rows. We chose the Octalysis
framework as the first one because it comprised the highest number of dimensions
(eight), which facilitated the subsequent steps, but we could have chosen
any of the frameworks as a starting point.
2. Next, we added each one of the remaining models as additional columns into the
table. For each added model, we compared each one of its suggested dimensions
with the rows that already existed in the table. When the new dimension to be add-
ed corresponded to one of the dimensions already in the table, we added it to the
relevant existing row. Otherwise, we added a new row to the table creating a new
dimension. In some cases, the addition of a new dimension also prompted the sub-
division of an existing row. For example, the competence dimension was split into
challenge/competence and completeness/mastery.
3. After adding all the models to the table, we observed the characteristics of the di-
mensions named in the rows and created for each of the latter a unique label, com-
prising the meaning of all the dimensions it encompassed.
The resulting model consists of twelve common dimensions of motivational af-
fordances (see Table 4). The similarity analysis between dimensions of different
models was conceptual, meaning that we studied the description of each dimension as
presented by their original authors and decided whether they represented the same
core construct as any of the dimensions already present in the table. Similarly, we
derived the labels for each one of the twelve resulting dimensions (first column of
Table 4) by identifying the core concepts of each dimension. In the resulting classifi-
cation, we noted that these dimensions were strongly based on: (1) the theories of
intrinsic and extrinsic motivation (SDT; [7–9]), (2) behavioural economics [21], and
(3) the practical experience of the authors of the analyzed frameworks. The entire
initial analysis was conducted by one of the researchers; next, three other researchers
(co-authors) also analyzed the resulting table. We then conducted an iterative loop of
feedback and editing until none of the researchers had additional suggestions to im-
prove the final model.
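The merging procedure described above can be illustrated with a minimal Python sketch. This is not the tooling we used (the analysis was manual and conceptual); the equivalence map below is a hypothetical stand-in for the researchers' similarity judgments, with the example mappings taken from Table 4.

```python
# Illustrative sketch of the dimension-merging procedure (steps 1-3 above).
# Frameworks are processed in order; each dimension either joins an existing
# unified row or starts a new one. The equivalence map stands in for the
# manual, conceptual comparison performed by the researchers.
from collections import OrderedDict

frameworks = OrderedDict([
    ("Octalysis", ["Epic Meaning & Calling", "Development & Accomplishment"]),
    ("HEXAD", ["Philanthropist", "Achiever"]),
])

# Hypothetical equivalence judgments (examples drawn from Table 4):
# framework-specific dimension -> unified dimension label.
equivalence = {
    "Epic Meaning & Calling": "Purpose and Meaning",
    "Philanthropist": "Purpose and Meaning",
    "Development & Accomplishment": "Challenge and Competence",
    "Achiever": "Challenge and Competence",
}

rows = OrderedDict()  # unified label -> {framework: [original dimensions]}
for framework, dimensions in frameworks.items():
    for dimension in dimensions:
        label = equivalence.get(dimension, dimension)  # new row if unmatched
        rows.setdefault(label, {}).setdefault(framework, []).append(dimension)

for label, cells in rows.items():
    print(label, cells)
```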
Table 3. A summary of the gameful design frameworks & methods considered in our research, with the rationale for their inclusion or exclusion from the analysis.

Gamification by Design (Zichermann and Cunningham [39]): Does not provide a classification of dimensions of motivational affordances.
Gamification Framework (Francisco-Aparicio et al. [40]): Does not provide a classification of dimensions of motivational affordances.
Gamification Model Canvas (Jiménez [41]): Does not provide a classification of dimensions of motivational affordances.
Gamify (Burke [42]): Does not provide a classification of dimensions of motivational affordances.
User Types HEXAD (Tondello et al. [19]): Provides a classification with six user types that are further used to classify sets of game elements for each type.
The Kaleidoscope of Effective Gamification (KEG) (Kappen and Nacke [43]): Provides a classification with several layers of motivational affordances that can be used to design or evaluate gameful systems.
Motivational Design Lenses (MDL) (Deterding [11]): Provides a classification of motivational design lenses that can be used to evaluate gameful systems.
Loyalty 3.0 (Paharia [44]): Does not provide a classification of dimensions of motivational affordances.
The RECIPE for Meaningful Gamification (Nicholson [45]): Provides six different motivational dimensions for gameful design.
Octalysis Framework (Chou [18]): Provides a classification with eight dimensions of motivation that can be used to design or evaluate gameful systems.
Six Steps to Success (Werbach and Hunter [46]): Does not provide a classification of dimensions of motivational affordances.
Super Better (McGonigal [47]): Proposes a gameful design method based on seven steps, which can be mapped as motivational dimensions.
3 Gameful Design Heuristics
Our set of heuristics enables experts to identify gaps in a gameful system’s design.
This is achieved by identifying missing affordances from each of the dimensions.
Prior to creating the heuristics, we reviewed the research on motivation [7, 8] to
help categorize the twelve dimensions into intrinsic, extrinsic, and context-dependent
motivational categories. This is a common practice in gameful design and many of the
reviewed methods also employ a similar classification. Although it is a simplification
of the underlying theory, this simple categorization helps designers and evaluators
better understand the guidelines and focus their attention on specific motivational
techniques. We chose SDT as the theoretical background for this classification be-
cause it is the motivational theory most frequently employed in gameful design meth-
odologies [11, 12].
Table 4. Dimensions of motivational affordances from the reviewed gameful design methods (for each dimension, the corresponding concepts in the reviewed methods are listed; methods with no corresponding concept are omitted).

Purpose and Meaning: Epic Meaning & Calling (Octalysis [18]); Philanthropist (HEXAD [19]); Information, Reflection (RECIPE [45]); Epic win (Super Better [47]).
Challenge and Competence: Development & Accomplishment (Octalysis); Achiever (HEXAD); Competence, Challenge (KEG [43]); Challenge lenses, Intrinsic rewards (MDL [11]); Engagement (RECIPE); Challenge, Bad guys (Super Better).
Completeness and Mastery: Development & Accomplishment (Octalysis); Achiever (HEXAD); Competence, Achievements (KEG); Goal and Action lenses, Intrinsic rewards (MDL); Complete quests (Super Better).
Autonomy and Creativity: Creativity & Feedback (Octalysis); Free Spirit (HEXAD); Autonomy (KEG); Object lenses, Intrinsic rewards (MDL); Play, Choice (RECIPE).
Relatedness: Social Influence & Relatedness (Octalysis); Socialiser (HEXAD); Relatedness (KEG); Intrinsic rewards (MDL); Engagement (RECIPE); Recruit allies (Super Better).
Immersion: Perceived Fun (KEG); Exposition (RECIPE); Secret identity (Super Better).
Ownership and Rewards: Ownership & Possession (Octalysis); Player (HEXAD); Extrinsic motivation (KEG); Intrinsic rewards (MDL); Power-ups (Super Better).
Unpredictability: Unpredictability & Curiosity (Octalysis); Free Spirit (HEXAD); Varied challenge, Varied feedback, Secrets (MDL); Play (RECIPE).
Scarcity: Scarcity & Impatience (Octalysis).
Loss Avoidance: Loss & Avoidance (Octalysis).
Feedback: Creativity & Feedback (Octalysis); Feedback lenses (MDL).
Change and Disruption: Disruptor (HEXAD).
We used the following criteria to split our heuristics into categories:
Intrinsic motivation includes affordances related to the three intrinsic needs intro-
duced by SDT [7, 8] (competence, autonomy, and relatedness), as well as ‘purpose’
and ‘meaning’ as facilitators of internalization [22–24] and ‘immersion’, as
suggested by Ryan and Rigby [9, 25] and Malone [26].
Extrinsic motivation includes affordances that provide an outcome or value sepa-
rated from the activity itself as suggested by SDT [8] and Chou [18]: ownership
and rewards, scarcity, and loss avoidance.
Context-dependent motivation includes the feedback, unpredictability, and disrup-
tion affordances, which can afford either intrinsic or extrinsic motivation depend-
ing on contextual factors. For example, the application can provide feedback to the
user regarding either intrinsically or extrinsically motivated tasks; therefore, feed-
back might afford intrinsic or extrinsic motivation according to the type of task
with which it is associated.
We constructed the heuristics based on an examination of the literature cited in Table
4, by writing adequate guidelines for each of the twelve identified dimensions. Fol-
lowing the literature review, we created these guidelines by studying the descriptions
of each dimension in the original models, identifying the main aspects of each dimen-
sion, and writing concise descriptions of each aspect to assist expert evaluation. We
employed the following procedure:
1. For each one of the twelve motivational dimensions, we first studied the underly-
ing concepts and wrote a short description of the dimension itself, aimed at guiding
expert evaluators’ understanding of each dimension.
2. Next, for each dimension, we identified the main aspects of concern, meaning the
aspects that should be considered by designers when envisioning a gameful system,
as suggested by the reviewed frameworks or methods. We argue that these aspects
of concern, when designing a system, should also be the main points of evaluation.
3. For each aspect of concern, we then wrote a concise description aimed at guiding
experts in evaluating whether the aspect being scrutinized was considered in the
evaluated system’s design.
Tables 5–7 present the final set of 28 heuristics, organized within the 12 dimensions,
following the initial analysis, framing, and iterative feedback described above; an
earlier version was presented as a work in progress [27].
Additionally, we have extended the gameful design heuristics by writing a set of
questions for each heuristic. These questions inquire about common ways of imple-
menting each guideline, helping the evaluators assess whether the guideline is imple-
mented in the system at all. We do not include the complete set of questions here
because of space constraints, but we provide them on our website1.
1 http://gamefuldesign.hcigames.com/
Table 5. Intrinsic motivation heuristics.
Intrinsic Motivation Heuristics
Purpose and Meaning: Affordances aimed at helping users identify a meaningful goal that
will be achieved through the system and can benefit the users themselves or other people.
I1. Meaning: The system clearly helps users identify a meaningful contribution (to them-
selves or to others).
I2. Information and Reflection: The system provides information and opportunities for re-
flection towards self-improvement.
Challenge and Competence: Affordances aimed at helping users satisfy their intrinsic need
of competence through accomplishing difficult challenges or goals.
I3. Increasing Challenge: The system offers challenges that grow with the user’s skill.
I4. Onboarding: The system offers initial challenges for newcomers that help them learn
how it works.
I5. Self-challenge: The system helps users discover or create new challenges to test them-
selves.
Completeness and Mastery: Affordances aimed at helping users satisfy their intrinsic need
of competence by completing series of tasks or collecting virtual achievements.
I6. Progressive Goals: The system always presents the next actions users can take as tasks
of immediately doable size.
I7. Achievement: The system lets users keep track of their achievements or advancements.
Autonomy and Creativity: Affordances aimed at helping users satisfy their intrinsic need
of autonomy by offering meaningful choices and opportunities for self-expression.
I8. Choice: The system provides users with choices on what to do or how to do something,
which are interesting but also limited in scope according to each user’s capacity.
I9. Self-expression: The system lets users express themselves or create new content.
I10. Freedom: The system lets users experiment with new or different paths without fear or
serious consequences.
Relatedness: Affordances aimed at helping users satisfy their intrinsic need for relatedness
through social interaction, usually with other users.
I11. Social Interaction: The system lets users connect and interact socially.
I12. Social Cooperation: The system offers users the opportunity to work together towards achieving common goals.
I13. Social Competition: The system lets users compare themselves with others or challenge
other users.
I14. Fairness: The system offers everyone similar opportunities for success and progression, and gives newcomers the means to feel motivated even when comparing themselves with veterans.
Immersion: Affordances aimed at immersing users in the system in order to improve their
aesthetic experience [48], usually by means of a theme, narrative, or story, which can be real
or fictional.
I15. Narrative: The system offers users a meaningful narrative or story to which they can relate.
I16. Perceived Fun: The system affords users the possibility of interacting with and being
part of the story (easy fun; [49]).
Table 6. Extrinsic motivation heuristics.
Extrinsic Motivation Heuristics
Ownership and Rewards: Affordances aimed at motivating users through extrinsic rewards
or possession of real or virtual goods. Ownership differs from Competence when acquiring goods, rather than feeling competent, is what the user perceives as the reason for interacting with the system.
E1. Ownership: The system lets users own virtual goods or build an individual profile over
time, which can be developed by continued use of the system and to which users can relate.
E2. Rewards: The system offers incentive rewards for interaction and continued use, which
are valuable to users and proportional to the amount of effort invested.
E3. Virtual Economy: The system lets users exchange the result of their efforts with in-
system or external rewards.
Scarcity: Affordances aimed at motivating users through feelings of status or exclusivity by
means of acquisition of difficult or rare rewards, goods, or achievements.
E4. Scarcity: The system offers interesting features or rewards that are rare or difficult to
obtain.
Loss Avoidance: Affordances aimed at leading users to act with urgency, by creating situa-
tions in which they could lose acquired or potential rewards, goods, or achievements if they
do not act immediately.
E5. Loss Avoidance: The system creates urgency through possible losses unless users act
immediately.
Table 7. Context-dependent heuristics.
Context-dependent Heuristics
Feedback: Affordances aimed at informing users of their progress and the next available
actions or challenges.
C1. Clear and Immediate Feedback: The system always informs users immediately of any changes or accomplishments in an easy and graspable way.
C2. Actionable Feedback: The system always informs users about the next available actions
and improvements.
C3. Graspable Progress: Feedback always tells users where they stand and what is the path
ahead for progression.
Unpredictability: Affordances aimed at surprising users with variable tasks, challenges,
feedback, or rewards.
C4. Varied Challenges: The system offers unexpected variability in the challenges or tasks
presented to the user.
C5. Varied Rewards: The system offers unexpected variability in the rewards that are of-
fered to the user.
Change and Disruption: Affordances aimed at engaging users with disruptive tendencies
[19] by allowing them to help improve the system, in a positive rather than destructive way.
C6. Innovation: The system lets users contribute ideas, content, plugins, or modifications
aimed at improving, enhancing, or extending the system itself.
C7. Disruption Control: The system is protected against cheating, hacking, or other forms of
manipulation from users.
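For practitioners who want to turn the heuristics into a checklist or fillable form (such as the one used in the study described in Section 4.2), the set in Tables 5–7 can be encoded as plain data. The Python sketch below is illustrative and abridged: only a few heuristics are included, and the remaining entries from the tables follow the same pattern.

```python
# Abridged, illustrative encoding of the heuristic set in Tables 5-7 as plain
# data, e.g. for generating a fillable evaluation form. Only a handful of
# entries are shown; the rest follow the same (id, name, guideline) pattern.
HEURISTICS = {
    "Intrinsic Motivation": {
        "Purpose and Meaning": [
            ("I1", "Meaning", "The system clearly helps users identify a "
             "meaningful contribution (to themselves or to others)."),
            ("I2", "Information and Reflection", "The system provides "
             "information and opportunities for reflection towards "
             "self-improvement."),
        ],
        "Challenge and Competence": [
            ("I3", "Increasing Challenge", "The system offers challenges "
             "that grow with the user's skill."),
        ],
    },
    "Extrinsic Motivation": {
        "Scarcity": [
            ("E4", "Scarcity", "The system offers interesting features or "
             "rewards that are rare or difficult to obtain."),
        ],
    },
    "Context-dependent": {
        "Feedback": [
            ("C1", "Clear and Immediate Feedback", "The system always "
             "informs users immediately of any changes or accomplishments."),
        ],
    },
}

def blank_form():
    """Return an empty notes field for every heuristic, keyed by its ID."""
    return {hid: "" for category in HEURISTICS.values()
            for dimension in category.values()
            for hid, _name, _guideline in dimension}

print(sorted(blank_form()))  # ['C1', 'E4', 'I1', 'I2', 'I3']
```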
3.1 Using the Gameful Design Heuristics
Similar to previous heuristic UX evaluation methods, gamification heuristics should
be used by experts to identify gaps in a gameful system’s design. Experts should con-
sider each guideline to evaluate whether it is adequately implemented into the design.
Prior studies have shown that evaluations conducted by many evaluators are more
effective in finding issues than those conducted by an individual evaluator [3, 28, 29].
Thus, we recommend that the evaluation be conducted by two or more examiners.
When applying the heuristics, the evaluators should first familiarize themselves
with the application to be analyzed and its main features. Then, for each heuristic,
they should read the general guideline and observe the application, identifying and
noting what the application does to implement this guideline. Next, they should read
the questions associated with the heuristic and answer them to identify possible gaps
in the application’s design. The evaluation is focused on observing the presence or
absence of the motivational affordances and, if the evaluator has enough expertise, on
evaluating their quality. However, it does not aim to observe the actual user experi-
ence, which is highly dependent on the users themselves in addition to the system.
Therefore, this method cannot evaluate the user experience; its goal is to evaluate the
system’s potential to afford a gameful, engaging experience. As we have stated be-
fore, the heuristic evaluation should be subsequently validated by user studies to es-
tablish whether the observed potential translates into actual gameful experiences.
It is important to note that the questions associated with each heuristic act as guide-
lines to facilitate the evaluation process. They are not intended to represent every
aspect related to the heuristic. Therefore, it is important that the evaluator also thinks
beyond the suggested questions and considers other issues that might be present in the
application regarding each heuristic.
After evaluating all the dimensions, counting the issues identified per dimension can
help establish which motivational dimensions (from the heuristics) require more
attention to improve the system’s potential to engage users.
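A minimal sketch of this per-dimension tally is shown below. It is not part of the published method; the heuristic-to-dimension map covers only the IDs used in the example (see Tables 5–7), and the recorded issues are hypothetical.

```python
# Hedged sketch: tally hypothetical evaluator notes per motivational dimension.
from collections import Counter

# Heuristic ID -> dimension, for the IDs used in this example (Tables 5-7).
DIMENSION_OF = {
    "I4": "Challenge and Competence",
    "I5": "Challenge and Competence",
    "I11": "Relatedness",
    "E2": "Ownership and Rewards",
}

# Hypothetical notes taken during an evaluation: heuristic ID -> issues found.
issues = {
    "I4": ["Tutorial exists but is hidden behind a menu."],
    "I5": ["No way for users to define their own challenges."],
    "I11": [],
    "E2": ["Rewards are not proportional to the effort invested."],
}

per_dimension = Counter()
for heuristic_id, notes in issues.items():
    per_dimension[DIMENSION_OF[heuristic_id]] += len(notes)

for dimension, count in per_dimension.most_common():
    print(f"{dimension}: {count} issue(s)")
```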
3.2 Turning the Evaluation Results into Actionable Design
Since the gameful design heuristics are an evaluation method, they do not provide the
means to turn the identified issues into actionable design ideas to improve the applica-
tion’s design. Although the heuristics identify which dimensions of motivational affordances are implemented in, or excluded from, the system, they do not provide any information about the need (or otherwise) to implement the missing dimensions.
Depending on the goals of the gameful software being developed, including motiva-
tional affordances for all dimensions might be either necessary or unimportant. There-
fore, we suggest that the identified design gaps should be considered within an itera-
tive gameful design method, which can then provide the tools to assess the need for
including new motivational affordances into the system to address the gaps. The
methods used to inform the development of the heuristics (see Table 4) are adequate
for this goal because they make it easy to map the dimensions where gaps are identi-
fied to the design element categories suggested by these gameful design methods.
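As an illustration of this mapping step, the sketch below encodes two rows of Table 4 and returns, for each dimension with identified gaps, the corresponding categories in the reviewed design methods; it is a hypothetical helper, not part of the heuristics themselves.

```python
# Illustrative lookup from gap dimensions to the categories of the reviewed
# gameful design methods (two rows of Table 4 shown; others follow suit).
TABLE_4_EXCERPT = {
    "Purpose and Meaning": {
        "Octalysis": ["Epic Meaning & Calling"],
        "HEXAD": ["Philanthropist"],
    },
    "Relatedness": {
        "Octalysis": ["Social Influence & Relatedness"],
        "HEXAD": ["Socialiser"],
    },
}

def design_leads(gap_dimensions):
    """Suggest where to look in the reviewed methods for candidate affordances."""
    return {dim: TABLE_4_EXCERPT.get(dim, {}) for dim in gap_dimensions}

print(design_leads(["Relatedness"]))
```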
4 Evaluation
We conducted a summative study with five UX or HCI experts to evaluate the game-
ful design heuristics. We asked participants to evaluate two online gameful applica-
tions: Habitica2 and Termling3. Data were collected between August and December
2016.
Three participants (P3, P4, P5) conducted the evaluation using the heuristics and
the remaining two (P1, P2) without them, enabling us to compare how many motivational
design issues were found by experts with and without the heuristics. Furthermore,
three participants (P2, P3, P4) had expertise in gamification or games, whereas two
(P1, P5) were knowledgeable in UX or HCI, but did not have a specific background in
gamification. This enabled us to assess if prior gamification expertise would influence
the evaluators’ ability to identify motivational design issues.
4.1 Participants
We initially invited 18 experts in UX, HCI, or gamification to participate in the study.
Potential participants were selected from the authors’ acquaintances and from previ-
ous project collaborators. The criterion was that potential participants should have an
expertise either in gamification or games (including design practice or research expe-
rience) or in using other UX or HCI methods to evaluate interactive digital applica-
tions. Potential participants were contacted by email or in person. No compensation
was provided for participation.
From the 18 invited participants, 10 initially agreed to participate and were sent the
instructions; of these only five participants completed the procedures (likely because
of scheduling difficulties and the lack of compensation). Of these five, two partici-
pants completed the evaluation of Habitica only; however, we decided to include their
feedback in the study anyway. This meant that we collected five evaluations for
Habitica, but only three for Termling. Table 8 summarizes the demographics of the
participants.
2 http://habitica.com/, last accessed December 2016.
3 http://www.termling.com/, last accessed December 2016.
Table 8. Participant demographics.

P1: Male; Graduate Student (HCI); gamification expertise: no; has studied gamification before: yes (4 months).
P2: Male; Creative Director; gamification expertise: yes; has studied gamification before: yes (3 years).
P3: Female; Professor (HCI); gamification expertise: yes; has studied gamification before: yes.
P4: Male; Creative Lead; gamification expertise: yes; has studied gamification before: yes.
P5: Female; Graphic Designer; gamification expertise: no; has studied gamification before: no.
4.2 Procedure
Initially, participants read and signed a consent form and filled out a short demo-
graphic information form (see Table 8). Next, the instructions to evaluate the two
applications were sent out. Since both applications were free and available online,
participants were instructed to create a free account to test them. We instructed partic-
ipants P1 and P2 to carry out the evaluation without the gameful design heuristics and
participants P3, P4, and P5 to use the heuristics. Assignment to experimental condi-
tions was not random because we needed to ensure that we had participants with and
without gamification expertise in both conditions (with or without the heuristics).
The instructions for P1 and P2 contained a one-page summarized introduction
about gamification and motivation, followed by instructions requesting them to reflect
on the applications’ design and motivational affordances, try to understand how they
afford intrinsic and extrinsic motivation, and then list any issue they identified related
to the motivational affordances (or the lack thereof).
Participants P3, P4, and P5 received information that contained the same introduc-
tion about gamification and motivation, followed by an introduction to the gameful
design heuristics, and instructions that asked them to reflect on the applications and
identify motivational issues using the gameful design heuristics. Participants were
given a complete copy of the gameful design heuristics to guide them during the eval-
uation, including the full list of heuristics with all the accompanying questions to
guide the evaluation (see Section 3). The heuristics were formatted as a fillable form,
which offered an additional column where participants could take notes about the
issues observed in the applications. After receiving the instructions, participants could
conduct their evaluations at their own pace and discretion; they were not supervised
by the researchers. After completing the evaluation, participants emailed the forms
back to the researchers.
4.3 Results
Table 9 shows the number of issues found in the two evaluated applications by the
participants. Overall, participants who used the gameful design heuristics identified
more issues than those who did not use any heuristics.
The number of issues identified by the participant who had no prior gamification
expertise and used the heuristics (P5) was just slightly higher than that of the
participants who did not use the heuristics (P1 and P2), whether they had gamification expertise or
not. However, it is noteworthy that the heuristics helped P5 identify issues in more
dimensions than did P1 and P2: while P5 identified issues in 10 different dimensions
for Habitica, P1’s and P2’s issues were concentrated in only six dimensions.
Moreover, congruent to our intentions, the heuristics helped evaluators focus their
analyses on the motivational affordances instead of other usability issues or bugs.
This is demonstrated by the fact that P1 and P2 both reported some issues that were
not related to the motivational affordances at all (e.g., usability issues or bugs),
whereas P3, P4, and P5 only reported motivational issues.
Furthermore, a qualitative comparison of participants’ responses shows that when
they used the heuristics, their comments were generally more focused on the motiva-
tional aspects, whereas the comments from participants who did not use the heuristics
were more general. For example, regarding Habitica’s onboarding, P1, P2, and P3
mostly recognized the fact that some information or tutorial material is missing or
hidden. However, they did not comment on how this would affect the user’s motivation.
On the other hand, P4 and P5 pointed out that, although a set of instructions
existed, it did not motivate the user because it was not challenging or fun. Thus, it
seems that the heuristics are useful in focusing the evaluator’s attention on the
motivational issues of the application.
Table 9. Number of issues found by participants. Each row lists the counts for Habitica (P1, P2, P3, P4, P5) followed by Termling (P1, P2, P3, P4, P5); a dash means the participant did not evaluate that application.

Used heuristics?: No, No, Yes, Yes, Yes | No, No, -, Yes, -
I1. Meaning: 1, 3, 0, 0, 0 | 0, 2, -, 1, -
I2. Information and Reflection: 0, 0, 1, 0, 1 | 0, 0, -, 1, -
I3. Increasing Challenge: 0, 0, 1, 0, 0 | 0, 0, -, 1, -
I4. Onboarding: 2, 1, 1, 1, 1 | 1, 3, -, 1, -
I5. Self-challenge: 0, 0, 0, 0, 0 | 0, 0, -, 0, -
I6. Progressive Goals: 0, 0, 0, 1, 1 | 0, 0, -, 1, -
I7. Achievement: 0, 0, 1, 0, 0 | 0, 1, -, 1, -
I8. Choice: 0, 0, 2, 1, 0 | 0, 0, -, 2, -
I9. Self-expression: 1, 0, 0, 0, 0 | 1, 2, -, 0, -
I10. Freedom: 0, 0, 0, 0, 1 | 0, 0, -, 1, -
I11. Social Interaction: 0, 1, 2, 0, 0 | 0, 1, -, 0, -
I12. Social Cooperation: 0, 0, 1, 0, 1 | 0, 0, -, 0, -
I13. Social Competition: 0, 0, 0, 1, 0 | 0, 0, -, 2, -
I14. Fairness: 0, 0, 0, 1, 0 | 0, 0, -, 0, -
I15. Narrative: 0, 0, 1, 2, 0 | 0, 0, -, 1, -
I16. Perceived Fun: 1, 0, 0, 1, 0 | 0, 0, -, 1, -
E1. Ownership: 0, 0, 0, 0, 0 | 0, 0, -, 1, -
E2. Rewards: 1, 0, 0, 0, 0 | 1, 0, -, 1, -
E3. Virtual Economy: 1, 0, 0, 2, 0 | 0, 0, -, 2, -
E4. Scarcity: 0, 0, 0, 0, 0 | 0, 1, -, 0, -
E5. Loss Avoidance: 0, 1, 0, 1, 1 | 0, 1, -, 1, -
C1. Clear & Immediate Feedback: 0, 0, 0, 0, 1 | 1, 0, -, 1, -
C2. Actionable Feedback: 0, 0, 0, 0, 0 | 0, 0, -, 1, -
C3. Graspable Progress: 0, 0, 0, 1, 1 | 0, 0, -, 1, -
C4. Varied Challenges: 0, 1, 1, 1, 1 | 0, 1, -, 0, -
C5. Varied Rewards: 0, 1, 1, 1, 1 | 0, 0, -, 1, -
C6. Innovation: 0, 0, 0, 0, 0 | 0, 0, -, 2, -
C7. Disruption Control: 0, 0, 0, 2, 0 | 0, 0, -, 0, -
Total motivational issues: 7, 8, 12, 16, 10 | 4, 12, -, 24, -
Other issues (usability, bugs): 3, 2, -, -, - | 4, 2, -, -, -
Additionally, the participants who had prior gamification expertise and used the
heuristics could identify approximately twice as many motivational issues as the par-
ticipant who also had gamification expertise but did not use the heuristics. In compar-
ison, P3 found 12 and P4 found 16 motivational issues in Habitica, whereas P2 found
only eight. In Termling, P4 found 24 motivational issues while P2 found only 12.
We can also observe that the motivational dimensions under which participants
classified the issues sometimes differ. However, this is not specific to our tool; it is a
known limitation of heuristic evaluation in general that a single evaluator usually
does not notice all the existing issues. This is why it is recommended that a heuristic
evaluation be conducted by several experts instead of only one [3, 28, 29]. By
combining the issues identified by the different experts, good coverage of the issues
existing in the system can be achieved.
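A small sketch of this aggregation step follows; the evaluator labels and issue strings are hypothetical, loosely echoing the onboarding example discussed above.

```python
# Illustrative merge of findings from several evaluators: the union of issues
# per heuristic approximates the coverage no single evaluator achieves alone.
from collections import defaultdict

evaluations = {  # hypothetical findings, keyed by evaluator and heuristic ID
    "Evaluator A": {"I4": {"Tutorial content is hard to find."}},
    "Evaluator B": {"I4": {"Tutorial is not challenging or fun."},
                    "E2": {"Rewards feel disconnected from effort."}},
}

combined = defaultdict(set)
for evaluator, findings in evaluations.items():
    for heuristic_id, issue_set in findings.items():
        combined[heuristic_id] |= issue_set

for heuristic_id in sorted(combined):
    print(heuristic_id, sorted(combined[heuristic_id]))
```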
In summary, the results provided the following evidence:
A participant who had no prior gamification expertise, but used the gameful design
heuristics, could find as many motivational issues as participants who did not use
the heuristics (with or without prior gamification expertise), but in a broader range
of motivational dimensions;
Participants who had prior gamification expertise and used the gameful design
heuristics could find twice as many motivational issues as participants who did
not use the heuristics or did not have prior expertise;
Using the gameful design heuristics helped participants focus their analyses on the
motivational issues, avoiding any distraction with other types of problems.
5 Discussion
We have created a set of 28 gameful design heuristics for the evaluation and
identification of design gaps in gameful software. Due to the lack of direct
applicability of existing heuristics from game design, we deliberately decided to create
a new set of heuristics specific to gameful design, based on motivational theories and
gameful design methods, rather than extending the existing heuristics for game
design. By deriving our set of heuristics from common dimensions of motivational
affordances employed by different gameful design methods, we have presented a
novel and comprehensive approach that encompasses a broad range of motivational
affordances. Furthermore, to enable expert evaluation, the heuristics are written in a
concise form, together with supportive questions for reflection.
Our study with five UX and HCI experts provided empirical evidence that:
gameful design heuristics can help UX evaluators who are not familiar with gami-
fication to evaluate a gameful system at least as well as a gamification expert who
does not use the heuristics; and
gameful design heuristics can greatly improve the ability of gamification experts to
perform a heuristic evaluation, leading them to find twice as many issues as they
would find without the heuristics.
The implications of our findings are twofold. First, we provide evidence that evalua-
tion of gameful applications without a support tool is subjective; therefore, even gam-
ification experts might miss important issues. A probable reason for this is the com-
plexity of gameful design and the number of motivational dimensions involved. Sec-
ond, we demonstrate that usage of the gameful design heuristics can significantly
improve the results of heuristic evaluations conducted both by gamification experts
and non-experts. Considering that gameful design still suffers from difficulties in
reproducing successful results and that several studies have reported mixed results
[5, 30], our work sheds light on one probable cause of this situation.
Consequently, the gameful design heuristics represent an important instrument, which
can be used to improve the chances of building effective gameful applications.
Nevertheless, the study was limited by the small sample size. Thus, although these
initial results seem promising, future studies will be needed to support them. Addi-
tionally, even though the proposed method was meant to be generic enough to work in
any heuristic evaluation of gameful applications, future studies will need to consider
diverse usage scenarios to investigate if adaptations are needed for specific purposes.
6 Conclusion
Evaluation using heuristics is a way of identifying issues during various stages of
software development, ranging from ideation, design, and prototyping to implementa-
tion and tests. While many heuristics exist in fields such as usability and game
design, guidelines specific to gameful design were still lacking because of the distinct
types of solutions that emerge from this domain. Therefore, our work addresses this gap
and contributes to gameful design research and practice by identifying key motiva-
tional dimensions and presenting a novel evaluation tool specific for gameful systems.
The gameful design heuristics provide a method for evaluating interactive systems at
various stages of their development. The suggested method fulfills a need for UX
evaluation tools specific to gameful design, which could help evaluators assess the
potential UX of a gameful application in the early phases of the software project. The
expert evaluation study of the gameful design heuristics showed that the heuristics
enabled experts to identify the presence of motivational affordances from several
dimensions, as well as the absence of specific affordances from other dimensions.
This is valuable information, which could help software developers and systems de-
signers to incorporate the missing elements. We expect the gameful design heuristics
to be of use to both researchers and practitioners who design and evaluate gameful
software, whether in research studies or in industry applications.
Acknowledgments. We would like to thank the participants, who generously offered
their time to help us. This work was supported by CNPq, the National Council for
Scientific and Technological Development (Brazil); SSHRC, the Social Sciences and
Humanities Research Council (Canada) [895-2011-1014, IMMERSe]; NSERC, the
Natural Sciences and Engineering Research Council of Canada [RGPIN-418622-
2012]; CFI, the Canada Foundation for Innovation [35819]; and Mitacs [IT07255].
References
1. Deterding, S., Dixon, D., Khaled, R., Nacke, L.E.: From Game Design Elements to
Gamefulness: Defining “Gamification.” In: Proceedings of the 15th International Academic
MindTrek Conference. pp. 9–15. ACM, Tampere, Finland (2011).
doi:10.1145/2181037.2181040.
2. Smith, S.L., Mosier, J.N.: Guidelines for designing user interface software. Mitre
Corporation, Bedford, MA (1986).
3. Nielsen, J.: Finding usability problems through heuristic evaluation. In: Proceedings of the
SIGCHI Conference on Human Factors in Computer Systems - CHI ’92. pp. 373–380
(1992). doi:10.1145/142750.142834.
4. Nielsen, J.: Heuristic Evaluation. In: Nielsen, J. and Mack, R.L. (eds.) Usability Inspection
Methods. pp. 25–62. John Wiley & Sons, New York (1994).
5. Hamari, J., Koivisto, J., Sarsa, H.: Does gamification work? - A literature review of
empirical studies on gamification. In: Proceedings of the Annual Hawaii International
Conference on System Sciences. pp. 3025–3034 (2014). doi:10.1109/HICSS.2014.377.
6. White, G.R., Mirza-Babaei, P., McAllister, G., Good, J.: Weak inter-rater reliability in
heuristic evaluation of video games. In: Proceedings of the CHI 2011 Extended Abstracts
on Human Factors in Computing Systems - CHI EA ’11. pp. 1441–1446. ACM (2011).
doi:10.1145/1979742.1979788.
7. Ryan, R.M., Deci, E.L.: Self-determination theory and the facilitation of intrinsic
motivation, social development, and well-being. Am. Psychol. 55, 68–78 (2000).
doi:10.1037/0003-066X.55.1.68.
8. Ryan, R.M., Deci, E.L.: Intrinsic and Extrinsic Motivations: Classic Definitions and New
Directions. Contemp. Educ. Psychol. 25, 54–67 (2000). doi:10.1006/ceps.1999.1020.
9. Ryan, R.M., Rigby, C.S., Przybylski, A.: The motivational pull of video games: A self-
determination theory approach. Motiv. Emot. 30, 347–363 (2006).
doi:10.1007/s11031-006-9051-8.
10. Deci, E.L., Ryan, R.M.: Intrinsic Motivation and Self-Determination in Human Behavior.
Plenum, New York and London (1985).
11. Deterding, S.: The Lens of Intrinsic Skill Atoms: A Method for Gameful Design. Human-
Computer Interact. 30, 294–335 (2015). doi:10.1080/07370024.2014.993471.
12. Mora, A., Riera, D., González, C., Arnedo-Moreno, J.: Gamification: a systematic review
of design frameworks. J. Comput. High. Educ. 1–33 (2017).
doi:10.1007/s12528-017-9150-4.
13. Deterding, S.: Situated motivational affordances of game elements: A conceptual model. In:
Gamification: Using Game Design Elements in Non-Gaming Contexts, A Workshop at CHI
2011. ACM (2011).
14. Zhang, P.: Motivational Affordances: Reasons for ICT Design and Use. Commun. ACM.
51, 145–147 (2008). doi:10.1145/1400214.1400244.
15. Nielsen, J.: Enhancing the explanatory power of usability heuristics. In: Proceedings of the
SIGCHI Conference on Human Factors in Computing Systems - CHI ’94. pp. 152–158
(1994). doi:10.1145/191666.191729.
16. Lucero, A., Holopainen, J., Ollila, E., Suomela, R., Karapanos, E.: The playful experiences
(PLEX) framework as a guide for expert evaluation. In: Proceedings of the 6th International
Conference on Designing Pleasurable Products and Interfaces - DPPI ’13. pp. 221–230.
ACM (2013). doi:10.1145/2513506.2513530.
17. Lucero, A., Karapanos, E., Arrasvuori, J., Korhonen, H.: Playful or Gameful? Creating
delightful user experiences. Interactions. 21, 34–39 (2014). doi:10.1145/2590973.
18. Chou, Y.: Actionable Gamification - Beyond Points, Badges, and Leaderboards. Octalysis
Media (2015).
19. Tondello, G.F., Wehbe, R.R., Diamond, L., Busch, M., Marczewski, A., Nacke, L.E.: The
Gamification User Types Hexad Scale. In: Proceedings of the 2016 Annual Symposium on
Computer-Human Interaction in Play - CHI PLAY ’16. pp. 229–243. ACM, Austin, TX,
USA (2016). doi:10.1145/2967934.2968082.
20. Morschheuser, B., Werder, K., Hamari, J., Abe, J.: How to gamify? A method for designing
gamification. In: Proceedings of the 50th Annual Hawaii International Conference on
System Sciences (HICSS). IEEE, Hawaii, USA (2017). doi:10.24251/HICSS.2017.155.
21. Hamari, J., Huotari, K., Tolvanen, J.: Gamification and Economics. In: Walz, S.P. and
Deterding, S. (eds.) The Gameful World: Approaches, Issues, Applications. pp. 139–161.
The MIT Press (2015).
22. Deci, E.L., Eghrari, H., Patrick, B.C., Leone, D.R.: Facilitating Internalization: The Self-
Determination Theory Perspective. J. Pers. 62, 119–142 (1994).
doi:10.1111/j.1467-6494.1994.tb00797.x.
23. Huta, V., Waterman, A.S.: Eudaimonia and Its Distinction from Hedonia: Developing a
Classification and Terminology for Understanding Conceptual and Operational Definitions.
J. Happiness Stud. 15, 1425–1456 (2014). doi:10.1007/s10902-013-9485-0.
24. Peterson, C., Park, N., Seligman, M.E.P.: Orientations to happiness and life satisfaction: the
full life versus the empty life. J. Happiness Stud. 6, 25–41 (2005). doi:10.1007/s10902-004-
1278-z.
25. Rigby, S., Ryan, R.M.: Glued to Games: How Video Games Draw Us In and Hold Us
Spellbound. Praeger, Santa Barbara, CA (2011).
26. Malone, T.W.: Toward a Theory of Intrinsically Motivating Instruction. Cogn. Sci. 4, 333–369 (1981).
27. Tondello, G.F., Kappen, D.L., Mekler, E.D., Ganaba, M., Nacke, L.E.: Heuristic Evaluation
for Gameful Design. In: Proceedings of the 2016 Annual Symposium on Computer-Human
Interaction in Play Extended Abstracts - CHI PLAY EA ’16. pp. 315–323. ACM (2016).
doi:10.1145/2968120.2987729.
28. Nielsen, J., Molich, R.: Heuristic Evaluation of User Interfaces. In: Proceedings of the
SIGCHI Conference on Human Factors in Computer Systems - CHI ’90. pp. 249–256
(1990). doi:10.1145/97243.97281.
29. Nielsen, J., Landauer, T.K.: A mathematical model of the finding of usability problems. In:
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems - CHI
’93. pp. 206–213. ACM (1993). doi:10.1145/169059.169166.
30. Seaborn, K., Fels, D.I.: Gamification in theory and action: A survey. Int. J. Hum. Comput.
Stud. 74, 14–31 (2014). doi:10.1016/j.ijhcs.2014.09.006.
31. Desurvire, H., Caplan, M., Toth, J.A.: Using heuristics to evaluate the playability of games.
In: CHI’04 Extended Abstracts on Human Factors in Computing Systems. pp. 1509–1512
(2004). doi:10.1145/985921.986102.
32. Desurvire, H., Wiberg, C.: Game Usability Heuristics (PLAY) for Evaluating and
Designing Better Games: The Next Iteration. In: Online Communities and Social
Computing: Third International Conference - OCSC 2009. pp. 557–566. Springer Berlin
Heidelberg (2009). doi:10.1007/978-3-540-73257-0.
33. Desurvire, H., Wiberg, C.: User Experience Design for Inexperienced Gamers: GAP – Game Approachability Principles. In: Bernhaupt, R. (ed.) Evaluating User Experience in
Games. pp. 131–148. Springer (2010). doi:10.1007/978-1-84882-963-3_1.
34. Korhonen, H., Koivisto, E.M.I.: Playability heuristics for mobile multi-player games. In:
Proceedings of the 2nd International Conference on Digital Interactive Media in
Entertainment and Arts. pp. 28–35 (2007). doi:10.1145/1306813.1306828.
35. Pinelle, D., Wong, N., Stach, T., Gutwin, C.: Usability heuristics for networked multiplayer
games. In: Proceedings of the ACM 2009 International Conference on Supporting Group
Work. pp. 169–178 (2009). doi:10.1145/1531674.1531700.
36. Paavilainen, J.: Critical review on video game evaluation heuristics: Social Games
Perspective. In: Proceedings of the International Academic Conference on the Future of
Game Design and Technology. pp. 56–65 (2010). doi:10.1145/1920778.1920787.
37. Sweetser, P., Wyeth, P.: GameFlow: A Model for Evaluating Player Enjoyment in Games.
Comput. Entertain. 3, 3 (2005). doi:10.1145/1077246.1077253.
38. Sweetser, P., Johnson, D.M., Wyeth, P.: Revisiting the GameFlow model with detailed
heuristics. J. Creat. Technol. (2012).
39. Zichermann, G., Cunningham, C.: Gamification by Design: Implementing game mechanics
in web and mobile apps. O’Reilly, Sebastopol, CA (2011).
40. Francisco-Aparicio, A., Gutiérrez-Vela, F.L., Isla-Montes, J.L., Sanchez, J.L.G.:
Gamification: Analysis and Application. In: Penichet, V., Peñalver, A., and Gallud, J. (eds.)
New Trends in Interaction, Virtual Reality and Modeling. pp. 113–126. Springer London
(2013). doi:10.1007/978-1-4471-5445-7_9.
41. Jiménez, S.: Gamification Model Canvas, http://www.gameonlab.com/canvas/.
42. Burke, B.: Gamify: How gamification motivates people to do extraordinary things.
Bibliomotion, Brookline, MA (2014).
43. Kappen, D.L., Nacke, L.E.: The Kaleidoscope of Effective Gamification: Deconstructing
Gamification in Business Applications. In: Proceedings of Gamification 2013. pp. 119–122.
ACM, Stratford, ON, Canada (2013). doi:10.1145/2583008.2583029.
44. Paharia, R.: Loyalty 3.0: How to revolutionize customer and employee engagement with
big data and gamification. McGraw-Hill, New York, NY (2013).
45. Nicholson, S.: A RECIPE for Meaningful Gamification. In: Reiners, T. and Wood, L.C.
(eds.) Gamification in Education and Business. pp. 1–20. Springer, Cham (2015).
doi:10.1007/978-3-319-10208-5.
46. Werbach, K., Hunter, D.: For the Win: How game thinking can revolutionize your business.
Wharton Digital Press, Philadelphia, PA (2012).
47. McGonigal, J.: SuperBetter: A Revolutionary Approach to Getting Stronger, Happier,
Braver and More Resilient. Penguin Books, New York, NY (2015).
48. Hunicke, R., LeBlanc, M., Zubek, R.: MDA: A Formal Approach to Game Design and
Game Research. In: Workshop on Challenges in Game AI. pp. 1–4 (2004).
49. Lazzaro, N.: The four fun keys. In: Isbister, K. and Schaffer, N. (eds.) Game Usability:
Advancing the Player Experience. pp. 315–344. Elsevier, Burlington (2008).