Contents lists available at ScienceDirect
Computers & Education
journal homepage: www.elsevier.com/locate/compedu
Comparing success and engagement in gamified learning experiences via Kahoot and Quizizz
Derya Orhan Göksün, Gülden Gürsoy
Adiyaman University, Computer Education and Instructional Technologies Department, Turkey
Adiyaman University, Science Education Department, Turkey
Keywords: Improving classroom teaching; Interactive learning environments
The purpose of the present study was to investigate the reflections of gamification activities used as a formative assessment tool on academic achievement and student engagement in learning environments. It was also aimed to investigate whether the gamification tool utilized led to a difference in academic achievement and student engagement. Three research groups were determined: two experimental groups, in which the 7E instructional model gamified with Kahoot and Quizizz was implemented, and a control group, in which the conventional 7E instruction method was implemented. The groups were determined by random assignment of 97 pre-service teachers who took the scientific research methods course in the 2017–2018 academic year spring semester. However, since only 71 of the assigned pre-service teachers voluntarily participated in the study, the study data included 71 pre-service teachers. At the beginning and the end of the six-week-long instruction activities, an academic achievement test and a student engagement scale on the content instructed in the six-week-long scientific research methods course were applied. Furthermore, in-depth views of the pre-service teachers were obtained with focus group interviews; the study was therefore conducted with mixed design principles. The study findings demonstrated that the scientific research methods academic achievement × student engagement × group interaction model (Wilks's lambda = .819, F = 7.301, p < 0.05) was significant. The activities gamified with the Kahoot application, albeit statistically insignificantly, had a more positive impact on academic achievement and student engagement when compared to the other groups. On the other hand, the positive impact of the activities gamified with the Quizizz application was lower than that of the instruction method utilized in the control group, based on both academic achievement (Δx̄ = 38.116 vs. Δx̄ = 38.776) and student engagement (Δx̄ = 12.176 vs. Δx̄ = 14.218). In contrast to the quantitative findings, the pre-service teachers expressed views on the problems they experienced under the sub-themes of motivation, reinforcement, entertainment and competition in the gamification activities; they stated that they were generally positive about the activities and that the problems they experienced were related to the infrastructure and the tool.
Received 23 October 2018; Received in revised form 7 January 2019; Accepted 25 February 2019
Corresponding author: Adiyaman University, Faculty of Education, Number: a/56, 02100, Adiyaman, Turkey.
E-mail addresses: email@example.com (D. Orhan Göksün), firstname.lastname@example.org (G. Gürsoy).

While the digital revolution rapidly changes the world, it also changes humankind. Modern children, who were born into digital technologies and are called Millennials or Generation Y, are raised in a world where everyone has a computer in their pocket (Koivisto & Hamari, 2014). These modern children both learn differently (Prensky, 2014) and prefer to learn the information that is
useful, fun and relevant (Jukes & Dosaj, 2004). The need to know how this generation can learn better and what their preferred learning styles are (Arabaci & Polat, 2013) has emerged as a new educational problem (Campbell, 2016). The education industry faced new challenges and had to be redesigned around the needs, preferences and orientations of digital natives in order to be successful in the 21st century (Prensky, 2001). The view that instructional activities designed according to student needs increase success in education (Demirtaş & Kahveci, 2010) introduced the need for new methods.
The lack of student motivation to learn (Lee & Hammer, 2011) and the lack of student engagement in the instructional environment (Kumar & Khurana, 2012) became fundamental problems in modern education as a result of the digital revolution (Kiryakova, Angelova, & Yordanova, 2014). Several studies in the literature reported that this problem cannot be resolved, and motivation and engagement in learning cannot be achieved, with conventional methods (Barata, Gama, Jorge, & Gonçalves, 2015; Bell, 2014; Buckley & Doyle, 2016; Erhel & Jamet, 2013; Hamari & Koivisto, 2014; Jui-Mei, Chun-Ming, Hwang, & Yueh-Chiao, 2011; Kapp, 2012; Muntean, 2011; Zichermann & Cunningham, 2011), and it was suggested that "gamification," which could be effective in improving learners' motivation and engagement, should be introduced to the education system as a new approach.
Marczewski (2012) reported that the concept of gamification was first introduced by Nick Pelling in 2002, while Gaggioli (2012) attributed the concept solely to Jesse Schell (2008). There are differences in the definition of the gamification concept, similar to the differences in the roles attributed to different individuals at its inception (Burke, 2011; Deterding, 2011; Deterding, Dixon, Khaled, & Nacke, 2011; Gökkaya, 2014; Kapp, 2012; Lee & Hammer, 2011). Gamification is used to describe a type of connection between games and anything that is not a game (Campbell, 2016). Gamification is a method used to apply game elements to non-game contexts (Deterding et al., 2011). The aim of gamification is not to create a game-like new world, but to transfer game elements into the real world to capture similar senses without leaving reality (Arkün-Kocadere & Samur, 2016, pp. 397–414). In education, gamification is a way of playing creative games in the classroom without jeopardizing the scientific nature of a curriculum (Nolan & McBride, 2014). In an educational setting, gamification supports individuals in acquiring the potential to develop critical thinking and multi-tasking, while training successful 21st century digital natives (Kapp, 2012; Prensky, 2001). Gamification makes learning more entertaining, increasing the motivation of students to learn and study (Muntean, 2011). Furthermore, gamification provides data on student learning by enabling more efficient, accurate and timely information for teachers, parents, administrators and public policy makers (Darling-Hammond, 2010).
The instant feedback capability of game elements such as scores, badges, rankings and rewards in gamification leads to student engagement in the learning environment and reinforces behavior toward targets (Glover, 2013), as well as providing the opportunity to monitor learning achievements and to assess these achievements transparently (Clarisó et al., 2017, pp. 105–116; Kapp, 2012; Lee & Hammer, 2011). Feedback is a significant component of the assessment process. Formative assessment, one of the assessment methods, is focused on the active use of feedback (Delacruz, 2011). According to Shute and Spector (2010), the use of gamification as a formative assessment tool, first, provides information on the learning processes of individuals. Second, it allows us to observe the motivation of individuals, to monitor their emotional and metacognitive traits, and to understand their specific behavior. Third, instant feedback based on embedded or confidential evaluations allows individuals to be aware of the difficulties that they experience in games (Shute & Spector, 2010). The use of gamification as an assessment tool also demonstrates the strengths and weaknesses of the game design. Delacruz (2011) examined the impact of game feedback at different levels of detail on students' mathematics performance. In that experimental study, three scoring methods (detailed, minimal, and no scoring information) were used. It was concluded that the mathematics performance of the students increased as the level of detail in the gamification increased. Rapti (2013, pp. 255–262) demonstrated that gamification could be successful as an alternative method of student assessment.
A literature review demonstrated that there were only a few studies in which gamification was used for assessment purposes (Arkün-Kocadere & Çağlar, 2015; Attali & Arieli-Attali, 2015; Delacruz, 2011; Dichev & Dicheva, 2017; Ismail & Mohammad, 2017; Oliver, 2017; Rapti, 2013, pp. 255–262; Turan & Meral, 2018). There are gaps in the literature on the impact of gamification on assessment. In particular, determining the differences between conventional assessment and assessment with gamification would address this gap in the literature (Jackson & McNamara, 2013). In the present study, the course was designed with the 7E
instructional model based on the constructivist approach. The design was based on the 7E steps proposed by Eisenkraft (2003).
Eisenkraft (2003), in contrast to Bybee (2003), argued that the prior knowledge of the students should be tested in the engagement
step. In the present study, gamiﬁcation was used as a formative assessment tool in the engagement step where the prior knowledge of
the students is tested and in the evaluation step where whether the students learned the content is tested. Three groups were designed
and three groups were instructed with the constructivist 7E instruction approach in the scientiﬁc research methods course. However,
gamiﬁcation was used in the stages of engagement and evaluation in the experiment group, while only the question-answer method
was used in the control group. The present study aimed to determine the impact of formative assessment conducted with conven-
tional method and gamiﬁcation tools on the success and engagement of pre-service teachers.
There are several free applications that can be used for formative assessment: Edmodo (LMS, exams, surveys, and indicators); Socrative (exams, surveys, gamification, indicators); Kahoot (quizzes, gamification, surveys and indicators); Quizizz (quizzes and words, cultural games, etc.); Google Forms and Flubaroo (exams and indicators); Padlet; Mentimeter; Edpuzzle (video quizzes that can be integrated into Edmodo). A review of previous studies in the literature (Biçen & Kocakoyun, 2018; Borrell, Cosmas, Grymes, & Radunzel, 2017; Ismail & Mohammad, 2017; Licorish, George, Owen, & Daniel, 2017; Licorish, Owen, Daniel, & George, 2018; Medina & Hurtado, 2017; Solmaz & Çetin, 2017; Tsihouridis, Vavougios, & Ioannidis, 2017; Yapıcı & Karakoyun, 2017) demonstrated that the Kahoot application was used more frequently in gamification activities when compared to the other applications. Kahoot is a globally accepted online learning platform based on the behavioral approach, with more than 30 million users (Plump & LaRosa, 2017).
Quizizz is a similar Web 2.0 tool that is popular in the field of assessment. Quizizz has advantages and disadvantages when compared to Kahoot. A comparison of the features of the two applications with respect to assessment demonstrated differences in the presentation of questions, feedback, progression speed and method, technical requirements, etc. These differences are summarized in Table 1.
As demonstrated in Table 1, there are certain diﬀerences between the Kahoot and Quizizz applications, which were frequently
used in previous gamification studies. Each application has both advantages and disadvantages, and each new gamification activity should aim to turn the disadvantages of previous gamified activities into advantages. It is therefore necessary to determine which gamified design works better as a formative assessment tool. In the present study, two experimental groups and a control group were determined; the Kahoot application was used in one experimental group and the Quizizz application in the other. The aim of the present study was to determine the strengths and weaknesses of these gamification experiences via the two different tools.
The question "What are the effects of gamification used as a formative assessment tool on academic achievement and student engagement?" forms the basis of the research.
The sub-questions investigated within the framework of the basic question of the study are as follows:
1. Does gamiﬁcation as a formative assessment tool aﬀect academic achievement?
a. What is the impact of activities gamiﬁed with Kahoot application on academic achievement?
b. What is the impact of activities gamiﬁed with Quizizz application on academic achievement?
2. Does gamiﬁcation as a formative assessment tool aﬀect student engagement?
a. What is the impact of activities gamiﬁed with Kahoot application on student engagement?
b. What is the impact of activities gamiﬁed with Quizizz application on student engagement?
3. What are the professional and personal views of pre-service teachers on gamiﬁcation applications?
a. What are the views and recommendations of the pre-service teachers who participated in activities gamified with the Kahoot application?
b. What are the views and recommendations of the pre-service teachers who participated in activities gamified with the Quizizz application?
The study was designed with mixed method principles. The mixed method is defined as the integration of qualitative and quantitative approaches, data collection tools and data analysis in order to obtain in-depth research data or to validate the collected data (Johnson, Onwuegbuzie, & Turner, 2007, p. 123). Furthermore, in mixed method research, both qualitative and quantitative research questions are investigated (Creswell & Plano Clark, 2007, p. 5). The first and second research questions were investigated with the quantitative research approach and the third research question with the qualitative research approach; thus, the research was conducted in a manner consistent with the nature of mixed methodology.

In the study, the convergent parallel design of mixed methodology was implemented. The convergent parallel design is a mixed-method design in which both the qualitative and quantitative research steps and data are collected and analyzed separately, with equal priority and without correlation (Dede & Demir, 2014; Trans., p. 79). Quantitative study data were collected with an experimental process. In the experimental process, data were collected via an academic achievement test and a student engagement scale. In the experimental
Table 1
Comparison of Kahoot and Quizizz applications based on instructional quizzes.

Presentation of the questions
  Kahoot: Questions are asked to the whole group on a projector or computer screen; only the answer options are shown on the participant screens.
  Quizizz: Both the questions and the answer options are presented individually on participant screens, in a different order for each participant.

Progression
  Kahoot: All participants move to the next question after everyone has answered the previous question or the time allowed for the question is over.
  Quizizz: Each participant moves to the next question after answering the previous question on her/his own screen or when the time allowed for that question is over.

Feedback
  Kahoot: The statistics for the answers to a particular question are presented between the questions.
  Quizizz: Positive or negative messages are presented immediately after each response, based on whether the participant's answer is correct or incorrect.

Technical requirements
  Kahoot: Requires a large screen on which all participants can read the questions (a projection device or smartboard) and an Internet-connected device such as a smartphone, tablet, laptop or computer that each participant can use to answer.
  Quizizz: Requires only an Internet-connected device such as a smartphone, tablet, laptop or computer on which the instructor can initiate the quiz and the participants can answer the questions.

The length of questions
  Kahoot: Each question can include a maximum of 95 characters and each answer option a maximum of 60.
  Quizizz: There is no character limitation.

Development of questions
  Kahoot: 4 multiple choice answers; visuals can be included in the questions; no preview when developing the questions.
  Quizizz: The number of multiple choice answers is flexible; both questions and answer options can include visuals; a preview is available when developing the questions.
process, qualitative data were collected with focus group interviews conducted with pre-service teachers who experienced instruction
activities gamiﬁed with Kahoot and Quizizz applications to obtain their in-depth views. More speciﬁcally, quantitative and quali-
tative data were collected in processes that did not interfere with each other in the study. The study data were analyzed separately
and associated by responding to the research questions. In short, the study was conducted with the convergent parallel design within
the framework of mixed methodology.
The study participants included 97 pre-service teachers who took scientiﬁc research methods (SRM) course in Adıyaman
University, Faculty of Education during the 2017–2018 academic year. Based on the study purposes, two experimental groups were
determined since it was considered that the diﬀerences mentioned in Table 1 could alter the eﬃciency and productivity of gami-
ﬁcation activities. In the ﬁrst group, Kahoot, and in the second, Quizizz activities were conducted. A control group was organized in
order to determine whether the changes were a result of the presentation of the course content or the gamiﬁcation application. Thus,
instruction activities were conducted with three groups: two experiment groups and one control group. In the experimental process,
simple random sampling technique was used to determine the group members. The participants were randomly divided into three
branches in the study. Group A (Kahoot experiment group) included 30, Group B (Quizizz experiment group) included 33, Group C
(control group) included 34 participants. However, since the pre-tests were conducted before the presentation of the content, participation in the research process was voluntary, and the data of participants who missed either the pre-test or the post-test could not be analyzed, the number of participating pre-service teachers changed. These changes are presented in Table 2.
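The simple random assignment described above can be sketched as follows. This is only an illustration: the participant IDs, group labels and seed are hypothetical, and the study's actual group sizes (30/33/34) differed slightly from the near-equal split this round-robin split produces.

```python
import random

def assign_groups(participants, group_labels, seed=None):
    """Randomly partition participants into len(group_labels) groups
    of near-equal size (simple random assignment without replacement)."""
    rng = random.Random(seed)
    shuffled = participants[:]          # copy so the caller's list is untouched
    rng.shuffle(shuffled)               # randomize the order
    groups = {label: [] for label in group_labels}
    for i, p in enumerate(shuffled):    # deal participants out round-robin
        groups[group_labels[i % len(group_labels)]].append(p)
    return groups

# Illustrative: 97 pre-service teachers into Group A (Kahoot),
# Group B (Quizizz) and Group C (control).
groups = assign_groups(list(range(1, 98)), ["A", "B", "C"], seed=42)
```

Any such assignment is exhaustive and exclusive: every participant lands in exactly one group.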
As seen in Table 2, the number of analyzed participants varied between the groups. In addition to the previously explained reasons related to the data collection process, certain students who had taken the course in the previous year but failed it, and who, due to the requirements of the Turkish higher education system, did not have to attend the course in the second year, were not able to participate in the study. Although the number of participants decreased due to the above-mentioned factors, the difference between the numbers of participants in the groups was not considered substantial.
The pre-service teachers that participated in the focus group interviews were determined with the maximum diversity sampling
technique. The maximum diversity sampling, a non-probabilistic sampling method, entails selection of individuals that could provide
answers relevant for the purpose of the study and who are experienced in the research questions in a way to reﬂect diversity with
respect to the signiﬁcant study variable or variables (Balcı, 2011, p. 103; McMillan, 2004, p.114). It was generally advised that
groups that included between six and eight participants were the optimum size for focus group interviews (Bloor, Frankland, Thomas,
& Robson, 2001, p. 26). Thus, six pre-service teachers were selected for each focus group interview. Attention was paid to ensuring that the pre-service teachers who participated in the focus group interviews attended the activities regularly and included highly, moderately and less successful pre-service teachers. Since the pre-service teachers participated in the focus group interviews voluntarily, it was considered that both the voluntary basis required by qualitative research approaches and the criterion-based requirement of maximum diversity among participants were met in the present study.
2.2. Data collection tools and process
Data were collected using three processes and tools. An academic achievement test, developed to compare academic achievement
on the content of the initial six weeks of the SRM course, and a student engagement scale developed by Günüç and Kuzu (2015) to
investigate the impact of the course instructed with the 7E method on the engagement of pre-service teachers were utilized. In the
ﬁnal stage, data were collected with focus group interviews that were conducted to determine the views and assessments of pre-
service teachers on gamiﬁcation techniques that were not scrutinized by the measurement tools.
2.2.1. Academic achievement test
The academic achievement test was based on the 24 achievements included in the first six units of the SRM course. In order to measure all achievements, a table of specifications and a 30-item pool, in which each achievement was measured with at least one question, were developed. This form was applied to 100 pre-service teachers who had previously taken the SRM course. The scores obtained in the test were divided into upper and lower 27% groups, and difficulty and discrimination indices were calculated from the scores in these groups. It was planned to exclude items with a discrimination index lower than 0.20; however, since excluding the items with a discrimination index of 0.19 would have reduced the content validity, these items were retained in the achievement test and their answer options were reorganized. Thus, a 20-item achievement test with a mean difficulty of 0.58 and discrimination indices between 0.19 and 0.73, where each item had 5 options, was finalized.

Table 2
Distribution of the participants based on the groups and data collection tools.

Study Group         Planned   Academic achievement test   Student engagement scale   Focus group interviews
Kahoot experiment   30        20                          20                         6
Quizizz experiment  33        26                          25                         6
Control             34        25                          24                         –
Total               97        71                          69                         12
achievement pre-test and post-test data for the pre-service teachers that participated in the experimental process were collected using
the above-mentioned test.
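The upper/lower 27% item-analysis procedure described above can be sketched in a few lines. This is a minimal illustration of the standard indices, not the study's actual scoring data; the response matrix below is hypothetical.

```python
def item_indices(scores):
    """Item difficulty and discrimination from a 0/1 response matrix.

    scores: list of lists; scores[s][i] is 1 if student s answered item i
    correctly. The indices are computed from the upper and lower 27% groups
    of students ranked by total score.
    """
    n = len(scores)
    k = int(round(n * 0.27))                      # size of each extreme group
    ranked = sorted(scores, key=sum, reverse=True)
    upper, lower = ranked[:k], ranked[-k:]
    results = []
    for i in range(len(scores[0])):
        p_upper = sum(s[i] for s in upper) / k    # proportion correct, upper group
        p_lower = sum(s[i] for s in lower) / k    # proportion correct, lower group
        difficulty = (p_upper + p_lower) / 2      # item difficulty index
        discrimination = p_upper - p_lower        # item discrimination index
        results.append((difficulty, discrimination))
    return results
```

Items with discrimination below 0.20 are the ones the procedure above flags as candidates for exclusion.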
2.2.2. Student engagement scale
Another data collection tool used in the study was the student engagement scale developed by Günüç and Kuzu (2015). The scale includes 41 items under six factors: valuing, sense of belonging, cognitive engagement, peer relationships, relationships with faculty members, and behavioral engagement. The total variance explained was 59%, and the Cronbach's alpha (α) internal consistency reliability coefficient for the initial scale was 0.929. Thus, it was concluded that the scale was reliable for the purposes of the present study.
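The internal consistency coefficient reported above can be illustrated with a minimal pure-Python sketch of Cronbach's alpha; the item scores below are hypothetical, not the scale's data.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a set of item-score columns.

    items: list of k lists, each holding the scores of one item across the
    same n respondents.
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)
    """
    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

    k = len(items)
    n = len(items[0])
    totals = [sum(item[s] for item in items) for s in range(n)]
    item_var_sum = sum(variance(item) for item in items)
    return k / (k - 1) * (1 - item_var_sum / variance(totals))
```

Perfectly consistent items yield an alpha of 1.0; values above roughly 0.7 are conventionally read as acceptable reliability.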
2.2.3. Focus group interviews
Focus group interviews were the technique used to collect the qualitative study data, and five questions were asked to the participants. These questions aimed to determine the views of the pre-service teachers on the positive/negative effects of the tools used in the gamification activities with respect to learning, peer interaction and learner-teacher interaction; on the problems encountered during implementation and their solutions; and on the feedback provided. The following questions were included in the interviews after the opinions of two field experts were obtained:
1. In your opinion, what are the positive eﬀects of Kahoot/Quizizz applications with respect to
a. Your learning
b. Social interactions, etc.
2. What were the problems you experienced in Kahoot/Quizizz applications?
a. What could be the reasons for these problems?
b. What are your recommendations for the solution of these problems?
3. What are your views on the impact of the feedback provided during the Kahoot/Quizizz applications?
4. Would you like to participate in another Kahoot/Quizizz application? Why?
The above-mentioned data collection tools were used to obtain data for diﬀerent purposes of the study. Quantitative study data
were collected during a six-week period. The instructional activities were conducted on different days but at the same time of day, in the same classroom, and by the same instructor. Thus, an attempt was made to ensure that the probable differences between the experimental
groups were not aﬀected by external variables. During the 2017–2018 academic year spring semester, after information was provided
for the pre-service teachers about the research process, the academic achievement test developed by the researchers and the student
engagement scale developed by Günüç and Kuzu (2015) were applied. This was immediately followed by the instruction of the course
with the 7E model. During the five-week course, instructional activities were conducted using the 7E method, with engagement and evaluation activities gamified with Kahoot in the first experimental group and with Quizizz in the second experimental group.
control group, instructional activities were conducted with 7E method that did not include gamiﬁcation activities. After the ﬁve-week
instruction process conducted with the above-mentioned methods, the academic achievement test and student engagement scale
were reapplied to the three groups. The collected data constituted the pre-test and post-test of the quantitative dimension of the study.

Immediately after the completion of the instruction activities, focus group interviews were conducted with the first and second
experimental groups in separate sessions. The focus group interviews were conducted in the researcher's oﬃce during the hours
determined by the researcher and pre-service teachers. Focus group interviews aimed to reveal the views of the pre-service teachers
on the advantages and disadvantages of the utilized tools (Kahoot and Quizizz), the problems experienced during the process and
solution recommendations, and their satisfaction or dissatisfaction about the process. After the written and verbal consent of the pre-
service teachers were obtained, focus group interviews were conducted on a suitable date and time for the group. The focus group
interviews with the experimental group where Kahoot-based instructional activities were conducted lasted for 17 min and 1 s and the
focus group interviews with the experimental group where Quizizz-based instructional activities were conducted lasted for 24 min
and 38 s and both interviews were recorded on tape.
Within the frame of mixed method principles, data were collected via the above-mentioned tools throughout the experimental process. The experimental process was designed with three different groups: two experimental groups and one control group. All groups completed the student engagement scale and the academic achievement test before and after the process, and the courses were conducted within the frame of the 7E teaching method. Of the experimental groups, one took quizzes designed with Kahoot at the beginning and at the end of each course, and the other took quizzes designed with Quizizz. The control group participants were given the same questions and feedback verbally at the beginning and at the end of each course. In this way, both the difference between gamified and non-gamified quizzes and the difference between the gamification tools could be shown. After the experimental process, focus group interviews were conducted with the experimental groups in order to detail these differences and reveal the views of the participants. The interviews were not conducted with the control group participants because they did not have any experience with the gamification activities or tools. After the data were collected, the data analysis process began.
2.3. Data analysis
In the quantitative dimension of the study, the dependent variables, each measured with a pretest and a posttest, and a three-category independent variable comprising two experimental groups and a control group were compared. The data analysis was conducted with mixed design MANOVA, which analyzes all of the above-mentioned data at once (Field, 2009, p. 822). The prerequisites of this test, and the consistency of the research data with these prerequisites, were as follows:
1. The study dataset was analyzed based on Cook's distance and centered leverage values, and it was observed that there were no outliers. As described in the participants section of the present study, the prerequisites that the minimum data count in each independent variable cell should equal the number of dependent variables and that there should be at least 20 participants per cell (Pearson, Pearson, & Hartley, 1958; Tabachnick & Fidell, 2012, pp. 73, 252) were met.
2. The linearity of the dependent variables was analyzed with a scatter plot, and it was observed that the variables exhibited a multivariate normal distribution. The scatter plot is presented in Fig. 1.
3. The homogeneity of the variance-covariance matrices was tested with Box's test of equality of covariance matrices. As a result, it was observed that the variance and covariance matrices were equal (Box's M = 14.45, F = 0.658, p > 0.05) (Çokluk, Şekercioğlu, & Büyüköztürk, 2012, p. 20). Furthermore, when Levene's test of equality of error variances was examined, it was observed that the error variances of the dependent variables were equal (F = 0.581, p > 0.05; F = 0.533, p > 0.05; F = 0.557, p > 0.05).
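The equality-of-error-variances check above can be sketched with a minimal pure-Python implementation of the mean-centred Levene statistic (the score lists are illustrative; in practice a statistics package reports the statistic together with its p-value):

```python
def levene_statistic(*groups):
    """Levene's test statistic for equality of variances (mean-centred form).

    W = ((N - k) / (k - 1)) * between-group SS of |deviations| / within-group SS,
    where the deviations are taken from each group's mean.
    """
    k = len(groups)
    n_total = sum(len(g) for g in groups)
    # absolute deviations of each score from its own group mean
    z = [[abs(x - sum(g) / len(g)) for x in g] for g in groups]
    z_means = [sum(zi) / len(zi) for zi in z]
    z_grand = sum(sum(zi) for zi in z) / n_total
    between = sum(len(zi) * (zm - z_grand) ** 2 for zi, zm in zip(z, z_means))
    within = sum((v - zm) ** 2 for zi, zm in zip(z, z_means) for v in zi)
    return (n_total - k) / (k - 1) * between / within
```

Groups with identical spreads give a statistic of 0; larger values indicate heterogeneous variances, and the statistic is compared against an F distribution to obtain the p-value.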
Since the necessary requirements reported above were met, mixed design MANOVA was conducted in the study.
The data obtained with the focus group interviews, through which the qualitative study data were collected, were analyzed with the inductive content analysis technique. Inductive content analysis is defined as the detailed and systematic analysis of a certain material in order to determine patterns, themes, judgments and meanings (Bogdan & Biklen, 2007, p. 173; Neuendorf, 2002, p. 17). In other words, a detailed study of the obtained codes is defined as inductive content analysis (Lune & Berg, 2017, p. 183). In the analysis, the audio recordings of the participants were reviewed, and the statements of the participants were replaced with codes for use in the analysis
and reporting processes. For the trustworthiness of the study, the interviews were conducted by the researcher who had taught the lessons throughout the semester; according to Guba (1981), when participants know the interviewer and the interviewer has become a natural part of the process, they answer the questions without suspicion. Additionally, the privacy of the participants was guaranteed: direct quotations were reported in the participants' own words under nicknames. Since the analysis of the focus group interviews was conducted independently, the participants were assigned code names by the interviewer; a K-nickname format was used for the Kahoot focus group interviewees and a Q-nickname format for the Quizizz focus group interviewees, and these nicknames were not shared with anyone. According to Shenton (2004), debriefing, confirmation of the analysis, and member checking are ways of supporting the trustworthiness of qualitative analysis. The data of this study were coded separately by two different researchers, and the codes were then unified and structured. After that, the themes and sub-themes with their related codes were shown to the participants, who were asked whether these reflected what they had meant, as a member check. Since they expressed that there were no misunderstandings or mistakes, the qualitative data analysis process was finalized.

Fig. 1. Scatter graph of independent variables.
3. Findings
In this section of the study, the quantitative and qualitative findings are presented separately. The main reason for this is that the data were collected and analyzed separately due to the nature of the research design. In the conclusion section, these findings are integrated and discussed as a whole.
3.1. Quantitative ﬁndings
Based on the study purposes and the nature of the collected data, the quantitative findings were obtained with a mixed design MANOVA. The first and second research questions were answered using the MANOVA results. Specifically, this test analyzed whether the gamified 7E instructional model led to a change in the academic achievement and student engagement of the students in the experimental groups. The analysis results are presented in Table 3.
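As background for the statistics reported below, Wilks's lambda for a (M)ANOVA effect is the determinant ratio det(E)/det(E + H) of the within-groups (error) and between-groups (hypothesis) sums-of-squares-and-cross-products matrices; values near 1 indicate little group separation. A minimal sketch in Python (the function name and the synthetic group means and sizes are ours for illustration, not the study's raw data):

```python
import numpy as np

def wilks_lambda(groups):
    """Wilks's lambda = det(E) / det(E + H), where E is the within-groups
    and H the between-groups sums-of-squares-and-cross-products matrix."""
    all_obs = np.vstack(groups)
    grand_mean = all_obs.mean(axis=0)
    p = all_obs.shape[1]
    E = np.zeros((p, p))
    H = np.zeros((p, p))
    for g in groups:
        m = g.mean(axis=0)
        E += (g - m).T @ (g - m)                # within-group scatter
        d = (m - grand_mean).reshape(-1, 1)
        H += len(g) * (d @ d.T)                 # between-group scatter
    return np.linalg.det(E) / np.linalg.det(E + H)

rng = np.random.default_rng(0)
# three groups, two dependent variables (achievement, engagement) -- synthetic
kahoot  = rng.normal([45.0, 30.0], 5.0, size=(24, 2))
quizizz = rng.normal([39.0, 25.0], 5.0, size=(23, 2))
control = rng.normal([42.0, 28.0], 5.0, size=(24, 2))
lam = wilks_lambda([kahoot, quizizz, control])  # 0 < lam <= 1; smaller = stronger group effect
```

The smaller the lambda, the more of the dependent-variable variance is attributable to group membership, which is why the significant effects reported below pair small lambdas with large F values.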
The analysis revealed (a) a significant SRM academic achievement interaction (Wilks's lambda = .066, F(1, 66) = 940.17, p < 0.05); (b) a significant student engagement interaction (Wilks's lambda = .211, F(1, 66) = 247.394, p < 0.05); (c) a significant student engagement × group interaction (Wilks's lambda = .821, F(2, 66) = 7.203, p < 0.05); (d) a significant SRM academic achievement × student engagement interaction (Wilks's lambda = .204, F(1, 66) = 257.853, p < 0.05); and (e) a significant SRM academic achievement × student engagement × group interaction (Wilks's lambda = .819, F(2, 66) = 7.301, p < 0.05). Review of the significance levels presented in Table 3 demonstrated that there was no statistically significant SRM academic achievement × group interaction (Wilks's lambda = .979, F(2, 66) = 0.693, p > 0.05). When the observed power for this comparison was analyzed, it could be suggested that the sample size was not sufficient, since the power was under 0.80 (Power = .162) (Cohen, 1988, p. 248).
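The observed power values in Table 3 can be approximately reproduced from the noncentral F distribution. A sketch in Python (the function name is ours; we assume the common convention λ = f² · (df_num + df_den + 1) with f² = η²p / (1 − η²p), so the result approximates, rather than exactly replicates, the statistical package's output):

```python
from scipy.stats import f, ncf

def observed_power(partial_eta_sq, df_num, df_den, alpha=0.05):
    """Post-hoc power of an F test from its partial eta squared."""
    f2 = partial_eta_sq / (1 - partial_eta_sq)   # Cohen's f^2
    nc = f2 * (df_num + df_den + 1)              # noncentrality (assumed convention)
    f_crit = f.ppf(1 - alpha, df_num, df_den)    # critical F under H0
    return 1 - ncf.cdf(f_crit, df_num, df_den, nc)

# SRM academic achievement x group effect (Table 3): eta_p^2 = .021, df = 2, 66
power = observed_power(0.021, 2, 66)  # roughly matches the reported .162
```

With η²p as small as .021, the power falls far below Cohen's 0.80 criterion, which is the basis for the insufficient-sample-size remark above.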
All of the statistically significant effects had power values greater than 0.5, which indicated that, in terms of power, the study produced valid findings explaining the scrutinized model (Cohen, 1988, p. 116). However, the effect sizes were smaller than 0.5 in the analyses conducted with the comparative models based on the group variable (student engagement × group and SRM academic achievement × student engagement × group). This was mainly because the measurements could not explain the whole model, and other variables that affect the model were not measured in the present study. When power and effect sizes were considered holistically, it was determined that a sufficient sample size was obtained for the conducted analyses; however, the presence of other variables led to the above-mentioned variance.
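The effect sizes in Table 3 can also be cross-checked against the Wilks's lambda values quoted above: for these effects, the reported values are consistent with the multivariate partial eta squared reducing to η²p = 1 − Λ (i.e., s = 1). A quick verification in Python (the dictionary keys are our shorthand for the Table 3 rows):

```python
# Table 3's partial eta squared values equal 1 - Wilks's lambda for each effect
pairs = {  # effect: (wilks_lambda, reported_eta_p_sq)
    "achievement":        (0.066, 0.934),
    "engagement":         (0.211, 0.789),
    "engagement x group": (0.821, 0.179),
    "ach x engagement":   (0.204, 0.796),
    "ach x eng x group":  (0.819, 0.181),
}
for name, (lam, eta) in pairs.items():
    assert abs((1 - lam) - eta) < 1e-9, name
```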
Furthermore, the results of multiple comparison tests were examined to determine the source of the above-mentioned variance. Since Levene's test of equality of error variances (p > 0.05) demonstrated that the error variances were homogeneous, the analysis was conducted with the post-hoc Scheffé test. The Scheffé test results are presented in Table 4.
Table 3
Mixed design MANOVA results.
Effect F df η²p Power p
SRM academic achievement 940.17 1, 66 .934 1.000 .000
SRM academic achievement*group 0.693 2, 66 .021 .162 .504
Student engagement 247.394 1, 66 .789 1.000 .000
Student engagement*group 7.203 2, 66 .179 .924 .001
SRM academic achievement*student engagement 257.853 1, 66 .796 1.000 .000
SRM academic achievement*student engagement*group 7.301 2, 66 .181 .927 .001
Table 4
Scheffé test scores.
Tested Measure (I) Group (J) Group Δx̄ SE p
SRM academic achievement* student engagement kahoot quizizz 1.977 1.6143 .476
control 1.720 1.6292 .575
quizizz kahoot −1.977 1.6143 .476
control -.257 1.5378 .986
control kahoot −1.720 1.6292 .575
quizizz .257 1.5378 .986
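The p-values in Table 4 can be recovered from the reported mean differences and standard errors, since the Scheffé procedure refers the squared contrast to (k − 1) · F(k − 1, df_error). A minimal sketch (the function name is ours; the inputs below are the Kahoot vs. Quizizz row of Table 4, with k = 3 groups and df_error = 66):

```python
from scipy.stats import f

def scheffe_p(mean_diff, se_diff, k, df_error):
    """Scheffe p-value for a pairwise contrast given its standard error."""
    F_contrast = mean_diff**2 / (se_diff**2 * (k - 1))
    return 1 - f.cdf(F_contrast, k - 1, df_error)

# Kahoot vs. Quizizz row of Table 4: diff = 1.977, SE = 1.6143
p = scheffe_p(1.977, 1.6143, 3, 66)  # ~ .476, matching the table
```

Running the same function on the other rows of Table 4 reproduces the remaining p-values to rounding, confirming the internal consistency of the reported comparisons.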
Based on the Scheffé test results, it was determined that there was no variance between the measurements based on the groups. As is known, multivariate analysis simultaneously analyzes several measurements or variables and the impact of the correlation between these variables on the findings, preventing Type I errors (Huck, 2009, p. 212). Thus, it is possible for variables that are scrutinized separately in multiple comparison tests to fail to produce a significant difference (Huck, 2009, p. 218). In other words, the variances resulting from the MANOVA test are due to inter-measurement interaction; since analyzing the variance by group eliminates the above-mentioned correlation, no variance is observed. More clearly, the statistical difference observed in the model came from the interaction between the test scores and the group variable. From this point of view, the scores should be examined separately to see the basis of this difference. However, although there was no statistically significant variance between the groups, a certain level of variance was determined in the academic achievement and student engagement tests. The distribution of the variance between the SRM academic achievement pretest and posttest scores among the groups is presented in Fig. 2.
As seen in Fig. 2, there was a difference between the SRM academic achievement pretest and posttest scores; this is expected in experimental studies. What should be examined in the graph is the variation between the groups. While the mean SRM academic achievement pretest scores were similar (x̄ = 3.412), there was a difference between the posttest scores based on the groups (x̄ = 42.188). The most significant difference was observed in the Kahoot experiment group (Δx̄ = 41.701). Similarly, the distribution of the variation between the groups in student engagement pretest and posttest scores was analyzed and is presented in Fig. 3.
Analysis of the mean student engagement pretest scores presented in Fig. 3 demonstrated that there were minor variances between the groups (x̄ = 15.691). Although there was an increase in posttest scores in all groups (x̄ = 29.909), the highest variation was observed in the Kahoot experiment group (Δx̄ = 21.393). This explains why no variance was found in the post-hoc tests despite the variation observed in the MANOVA model; it is also consistent with the effect sizes below 0.50 and with the variance in the pretest results. Thus, it could be suggested that, in addition to the experimental measurements, other variables, such as variations among the pre-service teachers assigned by random sampling and their attitudes towards scientific research, could have affected the study findings.
Holistic analysis of Figs. 2 and 3 demonstrated that the activities gamified with the Kahoot application, albeit statistically insignificant, had a more positive impact on academic achievement and student engagement when compared to the other groups. On the other hand, the positive impact of the activities gamified with the Quizizz application was lower than that of the instruction method utilized in the control group, both for academic achievement (Δx̄ = 38.776) and student engagement (Δx̄ = 14.218). Thus, it could be suggested that the use of the Quizizz application in the gamification activities utilized in the present study did not have a positive effect on academic achievement or student engagement.
Fig. 2. Distribution of SRM academic achievement test scores by experiment groups.
3.2. Qualitative ﬁndings
In this section of the study, the focus group interviews conducted to answer the third research question were analyzed. Due to
the nature of the present study, the focus group interviews were conducted in two diﬀerent sessions; with the group that was
instructed using activities gamiﬁed with the Kahoot application, and the group that was instructed using activities gamiﬁed with the
Quizizz application. Since the questions were developed for the relevant gamiﬁcation tool based on the experiences of pre-service
teachers, the group data were compared and analyzed independently. In this section, the ﬁndings of the focus group interviews
conducted with six participants in the Kahoot experiment group and the ﬁndings of the focus group interviews conducted with six
participants in the Quizizz experiment group are presented and the themes derived from the two interview group ﬁndings were
compared. To protect the participants' privacy, nicknames beginning with the first letter of the relevant gamification tool were used: Kahoot focus group participants were named in K-nickname format, and Quizizz focus group participants in Q-nickname format.
3.2.1. The Kahoot focus group interviews
Focus group interviews were analyzed by the two researchers who conducted the present study; according to Krueger (2006), it is better for more than one individual to analyze focus group interview data. The independent analyses were combined, and the views of the pre-service teachers were categorized under the sub-themes of “reinforcement,” “motivation,” “competition,” “entertainment,” “active participation,” “infrastructure problems,” and “tool-based problems.” Detailed analysis of the codes under these sub-themes led to the identification of the following main themes: “impact on learning,” “interaction,” and “problems.” The above-mentioned qualitative findings are visualized in Fig. 4.
As seen in Fig. 4, the focus group interviews revealed a structure that included three main themes and seven sub-themes. The pre-service teachers discussed the effects of the study tools on their learning mainly in the sub-themes of reinforcement and motivation. On this topic, K-Özer stated: “I can state that it is a nice material for reinforcement and comprehension of the subject,” while K-Eda stated: “I had a good grade in the exam; since I gave wrong answers to Kahoot questions, I learned the correct answers and my grade was good … For instance, Kahoot made it possible for me to see my mistakes since I generally gave the wrong answers; we learn from our mistakes, don't we?” K-Sude stated: “It allowed us to repeat the previous week, it was very useful in this respect, it reminded us of the things we learned.” Furthermore, the following views were considered under the motivation sub-theme. K-Funda stated: “Thanks to Kahoot, we are able to listen to the instructions better so that we could answer the questions correctly;” K-Eda: “In the first weeks, my grades were bad, but later I came to the class prepared for the quizzes that we took at the beginning of each class; I think this led to ambition;” and K-Sude: “I think it is also important to use such technological tools in the course, we absolutely comprehend the course better.” The views of the pre-service teachers on this theme were positive. It can be suggested that the integration of these technologies into the instruction process was effective, as noted by the pre-service teachers.
Fig. 3. Distribution of student engagement scale scores by experiment groups.
The views of the pre-service teachers classified under the theme of interaction were analyzed within the framework of the competition, entertainment, and active participation sub-themes. In the context of the competition sub-theme, they mentioned the positive reflections of competitive motivation in the classroom environment, as stated by K-Özer: “The fun competition among us made a positive contribution to listening and concentration on the course.” K-Funda stated that the course was quite entertaining, and K-Sude stated that it was out of the ordinary, leading to the suggestion that it was possible to introduce entertainment, one of the basic elements of gamification, into the classroom environment. K-Eda and K-Özer stated their views on the sub-theme of active participation using the following statement: “The student participation is more active, and there are more interesting aspects.” Furthermore, K-Meltem stated: “Previously, you just instructed the subject and we listened; in other words, we had passive participation, but you included us with the Kahoot application, we interacted with you, its applied nature was nice,” underlining the impact of both interaction and active participation on learning. Thus, it was suggested that the active participation sub-theme was correlated with both the learning and interaction themes.
As in several other studies on educational technologies, a number of infrastructural problems were experienced in the present study. These problems were described by K-Özer in the most general sense: “We had problems with Internet access and the school infrastructure.” Furthermore, K-Ecem stated: “Not all had Internet access, they were not able to connect to the school network, some of them had slow connections and they could not participate.” In this statement, the pre-service teacher described a situation that affected the implementation process and the participation of the pre-service teachers as a whole, not an individual impact. Thus, it was suggested that the infrastructure problems had a negative impact on interaction, and this sub-theme was associated with the interaction theme. In addition to infrastructural problems, the pre-service teachers reported certain Kahoot-oriented problems. These were mentioned by K-Funda as follows: “Questions were viewable only on the screen and this distracted me a lot; I was even searching for the question in the beginning, I lost a great amount of time.” K-Eda stated: “There were no options on the phone, we even marked the wrong option while looking at the screen,” and K-Meltem stated: “We were allowed to answer the question within five seconds; sometimes we did not understand the questions; when we solved the questions with you, we retained more information.” Thus, it was concluded that the high hardware requirements of the Kahoot tool and the fact that the questions and multiple-choice answers were on different screens distracted the learners, and therefore problems were experienced in implementation.
3.2.2. The Quizizz focus group interviews
Quizizz focus group interview sub-themes and themes were consistent with Kahoot focus group interview sub-themes and themes.
The only diﬀerence was the presence of participant data that demonstrated that the sub-themes of entertainment and competition
were correlated with the impact on learning theme. The remaining correlations were the same. It was suggested that this was due to
the fact that the same questions were used in both interviews, the participants experienced the same quiz questions and the same
content in the experimental process, and although there were diﬀerences between the Kahoot and Quizizz tools, they were similar
with respect to the fact that they are both gamiﬁcation tools. The correlations between the themes and sub-themes in Quizizz focus
group interviews are presented in Fig. 5.
Fig. 4. Themes and sub themes obtained in Kahoot focus group interviews.
The analysis model presented in Fig. 5 includes seven sub-themes and three main themes. The first of these themes explains the impact of Quizizz on learning. During the interviews, the participants were engaged in the following dialogue within the framework
of the reinforcement sub-theme:
“Q-Nilay: It helped reinforce the content we learned in the course better
Q-Faruk: We had the opportunity to revise the topics we learned
Q-Ozan: We had the opportunity to correct our mistakes by learning the accurate content. We revised the questions later on as follows:
when I make a mistake, I remember this better, as you explained the questions afterwards, I remembered (the content) better.”
Based on the above-mentioned dialogue, particularly Q-Ozan's statement, it can be suggested that the implementation of Quizizz
reinforced learning, however this should be supported by face-to-face feedback. Furthermore, the participants' views were positive on
the sub-theme of motivation as observed in the following statements: Q-Nilay: “Since we knew that Quizizz would be given at the end of
the course, we were more focused during the instruction.”Q-Faruk: “Our ambition was productive, we started to learn more.”
It was determined from their responses that the pre-service teachers interpreted the Quizizz application within the sub-themes of competition, entertainment, and active participation, under the dimension of interaction. On the competition sub-theme, Q-Yunus
stated the following: “The eﬀect of the course was immediate, such as ‘you were/were not the leader,’in a funny sense,”and Q-Azim stated
that “Due to the time allotted, there was an entertaining competition.”The dialogue between Q-Yunus and Q-Azim quoted below was the
main reason for the correlations between competition sub-theme, interaction theme and impact on learning theme:
“-The competitive environment was great
-We learned about the learning levels of our friends, who learned how much.”
Furthermore, the following statement by Q-Esin also contributed to the above-mentioned conclusion: “We reinforced our knowledge
and the competitive environment was fun.”
Although the participants expressed positive views on the theme of entertainment using statements similar to Q-Faruk (“Seeing
Trump and smileys make it fun”), as Q-Yunus mentioned in the dialogue below, there were contradicting views about the visual
feedback in Quizizz:
“Q-Yunus: I do not think anybody cares for these feedback
Q-Esin: I cared, I liked them very much
Q-Azim: Me, too, ugly pictures appeared when we made a mistake
Q-Yunus: I tried to answer the next question as soon as possible, I only remember Trump
Q-Nilay: No, I liked these emoji-like things very much, they were nice.”
As seen in the dialogue above, a neutral if not negative approach was observed regarding the feedback function of the tool.
Fig. 5. Themes and sub themes in Quizizz focus group interviews.
Participant Q-Esin established an association between entertainment and learning with the following statement: “We had fun as well as learning, and learning was fun.” Thus, the entertainment sub-theme was associated with both the interaction and impact on learning themes. Similarly,
active participation sub-theme was correlated with the two above-mentioned themes. The following statement by Q-Nilay conﬁrmed
this correlation: “Professor, we already sit on the desks without moving for forty minutes, we listen to the teacher, so at least it is fun learning
with this method.”
The problems that the pre-service teachers experienced during the implementation of Quizizz were categorized in two sub-themes. The first was the sub-theme of infrastructural problems. On this issue, Q-Azim stated: “An application that does not require the Internet could be developed, for example, because the Internet is always a problem.” Participants emphasized Internet- and application-related problems as follows: Q-Nilay: “For instance, we both answer the same question within five seconds, but his Internet speed is better than mine, and at the end, he is considered to have answered the question quicker than me.” Q-Ozan: “Even if everyone had the same Internet speed, the old phones could have slowed it down.” Within the context of this sub-theme, it was determined that infrastructural problems
negatively aﬀected participation, hence the interaction as observed in the following statement by Q-Nilay: “When the Internet slows
down, you would stop doing the quiz on the third question thinking that others have already answered and I cannot keep up with them.”Based
on this statement, it can be suggested that the sub-theme of the infrastructure problems was correlated with the interaction theme.
Pre-service teachers stated the following on the problems they experienced due to the Quizizz application. Q-Nilay: “I answer the
question correctly as well, but when someone answers the question a second earlier, he becomes the leader.”Q-Yunus: “That is how I could not
become the leader twice, it is a problem that there is only one leader.”The following statement by Q-Yunus on this sub-theme was
remarkable: “The projected images should be diﬀerent, when the same image appears every time, it becomes diﬃcult to concentrate.”Based
on this statement, it can be suggested that the pre-service teachers expected more diverse feedback in the Quizizz application.
In both focus group interviews, it was observed that the participants expressed mostly positive views on the sub-themes of reinforcement, motivation, competition, entertainment, and active participation, and considered the Kahoot and Quizizz applications similar at this level. Although Internet infrastructure problems were common to both applications, it was determined that Kahoot was more limited than Quizizz due to problems such as its requirement for additional equipment and the distractions it caused. Furthermore, the visual feedback mechanism in the Quizizz application was limited since it is individual-based, leading to technological problems at the individual level.
However, all pre-service teachers in both groups replied “Yes” to the question “Would you like to participate in another Kahoot/Quizizz application? Why?” The following dialogue between Q-Azim and Q-Yunus was among the most significant replies to this question:
“Q-Azim: It will not work in every course, for instance, it would be better in verbal courses, in certain courses such as literature, like you
will ﬁnd the elements of a sentence, etc.
Q-Yunus: In other words, the Quizizz of that course would be boring as well.”
The dialogue above demonstrated that the pre-service teachers preferred the gamified quiz applications for content where theoretical but short questions could be asked. Thus, it can be suggested that the discrepancy between the positive views expressed in the focus group interviews and the absence of a statistically significant effect could be due to the SRM course content.
4. Conclusion and discussion
The present study aimed to investigate the reﬂections of gamiﬁcation activities (Kahoot and Quizizz) used as formative assess-
ment tools for academic achievement and student engagement on learning environments. It was also aimed to investigate whether the
use of one gamification application led to differences in academic achievement and student engagement. In the study, where two experiment groups and one control group were assigned, formative assessments were conducted at the beginning and end of each class, using Quizizz with one experiment group and Kahoot with the other, in the engagement and evaluation steps of the 7E teaching model. In the control group, formative assessment was conducted with conventional questions and answers.
The quantitative study ﬁndings suggested that gamiﬁcation aﬀected academic achievement and student engagement in the scientiﬁc
research methods course. However, inability to determine the direction of this impact by post-hoc tests indicated the possibility of an
eﬀect due to inter-measurement interaction. Thus, the graphs produced by MANOVA were examined. In the graphs, it was observed
that the impact of Kahoot-based instructional activities on academic achievement and student engagement was higher when com-
pared to that of the control group. On the other hand, the educational activities that were conducted with Quizizz were less eﬀective
when compared to the control group. Limited visual feedback capacity of the Quizizz application, the fact that the questions pro-
gressed at an individual pace and the individual technological problems experienced by the participants may have prevented aca-
demic achievement and student engagement as demonstrated by the qualitative ﬁndings.
Literature review demonstrated that certain studies reported that the gamiﬁcation used for assessment purposes improved
achievement and engagement (Biçen & Kocakoyun, 2018;Bolat, Şimşek, & Ülker, 2017;Bury, 2017;Fotaris, Mastoras, Leinfellner, &
Rosunally, 2016; Tsahouridis, Vavougios & Ioannidis, 2018; Turan & Meral, 2018). Bury (2017), in a study that examined whether
online assessment tools (Kahoot and Quizizz) increased student motivation, participation and learning, concluded that online as-
sessment tools developed students' grammar knowledge and the students desired to specialize in the content of online tools. Fur-
thermore, it was determined that the reason students desired to use Kahoot and Quizizz applications in the classroom was due to
students' need for strong stimuli or the will to receive immediate feedback on how well they performed on the test. Bolat et al. (2017)
investigated the impact of using Kahoot application as a formative assessment tool on the academic achievements of pre-service
teachers and reported that Kahoot application had an eﬀect on students' retention levels based on the revised Bloom taxonomy,
however it did not have an eﬀect on application levels. Turan and Meral (2018) investigated the eﬀects of game-based and non-game-
based online student response systems on student achievement, engagement, and exam anxiety levels in a study where Kahoot was
used in the experimental group and Socrative online student response system was used in the control group for four weeks. The study
results demonstrated that the game-based student response systems increased achievement and participation, and decreased the test
anxiety levels of the students. They proposed the use of online game-based student response systems in different topics in social studies
courses. In an experimental study conducted in a Basic Software Development course, Fotaris et al. (2016) aimed to determine the impact of gamification applications (Kahoot, Who Wants to Be a Millionaire, and Codecademy) used as formative and summative assessment tools on students. The data collected with observations, survey forms, interviews, and documents demonstrated that these applications had a positive effect on students' motivation, retention, and performance. Furthermore, the study findings demonstrated that the Kahoot and Who Wants to Be a Millionaire gamification applications allowed the immediate application of knowledge, reinforcing learning outcomes, and that students felt good about receiving immediate feedback on their achievements, which in turn improved their self-esteem.
The qualitative study ﬁndings revealed a structure that included three main themes and seven sub-themes. The sub-themes
indicating the impact of gamiﬁcation on learning and interaction and the sub-themes on the presence of application-related problems
are presented in the ﬁndings section with direct quotes. The interviews conducted with Kahoot and Quizizz application groups
demonstrated that the quizzes conducted at the beginning and at the end of each class reinforced the topical knowledge of the
students and the students came to class prepared since they knew about the pre-class quizzes, which in turn motivated the students
for the course. Furthermore, in the interviews conducted with the Quizizz group, it was determined that the students followed the instruction more carefully due to the post-instruction quiz. It could be suggested that the feedback provided via gamification achieved its main purpose. Considering the quantitative findings, it can be said that these positive views on motivation are consistent with the engagement and academic achievement differences between the groups, as discussed by Pekrun and Linnenbrink-Garcia (2012). In the sub-
themes of entertainment and competition, it was found that the students had fun with the emojis provided as feedback, their curiosity
about the achievements of their friends were satisﬁed, they were ambitious about being the class leader in scores, and they considered
this competitive environment as fun. In the sub-themes of infrastructure problems and application-related problems, it was determined that the students experienced Internet speed problems, and since they had problems connecting to the school network, this led to competition problems. Furthermore, it was determined that the high hardware requirements of the Kahoot application and the
presentation of the questions and answer options on diﬀerent screens were distracting for the learners in the study. It was also
identiﬁed in the study that the feedback emojis of the Quizizz application were inadequate. Previous study ﬁndings were consistent
with the above-mentioned ﬁndings (Biçen & Kocakoyun, 2018;Cahyani, 2016;Licorish et al., 2018;Medina & Hurtado, 2017;Yapıcı
& Karakoyun, 2017). In a study that aimed to determine student views on gamiﬁcation-based interactive response systems, Solmaz
and Çetin (2017) utilized and compared three gamiﬁcation applications (Kahoot, Socrative, Plickers) in the IT course. In the study, it
was determined that students preferred Plickers, Kahoot, and Socrative applications, respectively. It was found in the study that
Kahoot led to the most competitive environment and was the easiest to use, and the students liked the feedback form of the Plickers
application the most and had the most fun with this application. They liked the fact that Kahoot was colorful, while the presentation
of the questions was the most unpopular aspect of the application. While the most popular aspect of Socrative was its feedbacks, the
most unpopular aspect was the scoring system. Students liked the QR code system in Plickers, however the risk of misrepresenting the
card was not appreciated. Cahyani (2016) reported that students were happy when they conducted gamiﬁed learning activities.
Furthermore, the study revealed that gamiﬁed learning activities challenged the students, and the students desired to master in all
activities and pass all levels. In a study that aimed to determine student views on Kahoot application, Biçen and Kocakoyun (2018)
determined that gamiﬁcation increased the interest of the students in the course and encouraged them to be more ambitious for
achievement. Furthermore, the study ﬁndings demonstrated that the reward system increased student motivation in the course, the
students feel important when they win badges, and the competitive environment shortens the student response time. Licorish et al.
(2018) reported that Kahoot contributed to teacher-student interaction, however it sometimes led to negative emotions due to the
extreme competitive environment. They argued that applications such as Kahoot were necessary in long courses. They also de-
monstrated that students' desire to perform well in Kahoot also increased their interest in the course and interactions with each other.
Glover (2013) reported that gamiﬁcation helps students to overcome their negative attitudes in a competitive environment and
encourages them for more productive behavior.
The present study aimed to determine the positive reflections of assessments conducted with gamification, instead of conventional assessment, on the students. Due to the positive effects of technology on the field of measurement and evaluation, it was emphasized in the International Education Technology Standards that teachers should conduct numerous and varied formative assessments using technology in evaluation activities, especially in active learning activities. Determining the effectiveness of gamification in summative and diagnostic measurement and evaluation remains a need in the literature.
Declaration of interest
Appendix A. Supplementary data
Supplementary data to this article can be found online at https://doi.org/10.1016/j.compedu.2019.02.015.
Arabaci, I. B., & Polat, M. (2013). Dijital yerliler, dijital göçmenler ve sinif yonetimi. Elektronik Sosyal Bilimler Dergisi, 11–20.
Arkün-Kocadere, S., & Çağlar, Ş. (2015). The design and implementation of a gamiﬁed assessment. Journal of e-Learning and Knowledge Society, 11(3).
Arkün-Kocadere, S., & Samur, Y. (2016). Oyundan oyunlaştırmaya. içinde A. İşman, F. Odabaşı, ve B. Akkoyunlu Eğitim Teknolojileri Okumaları.Tojet- Sakarya Üni-
Attali, Y., & Arieli-Attali, M. (2015). Gamiﬁcation in assessment: Do points aﬀect test performance? Computers & Education, 83, 57–63.
Balcı, A. (2011). Sosyal bilimlerde araştırma: Yöntem, teknik ve ilkeler (Extended 10th ed.). Ankara: Pegem Akademi.
Barata, G., Gama, S., Jorge, J., & Gonçalves, D. (2015). Gamiﬁcation for smarter learning: Tales from the trenches. Smart Learning Environments, 2(1), 1–23.
Bell, K. R. (2014). Online 3.0: The rise of the gamer educator, the potential role of gamification in online. Unpublished doctoral thesis. Philadelphia: University of
Biçen, H., & Kocakoyun, Ş. (2018). Perceptions of students for gamiﬁcation approach: Kahoot as a case study. International Journal of Emerging Technologies in Learning,
Bloor, M., Frankland, J., Thomas, M., & Robson, K. (2001). Focus groups in social research. London: Sage Publications.
Bogdan, R. C., & Biklen, S. K. (2007). Qualitative research for education: An introduction to theories and methods (5th ed.). New York: Pearson Education.
Bolat, Y. İ., Şimşek, Ö., & Ülker, Ü. (2017). Oyunlaştırılmış çevrimiçi sınıf yanıtlama sisteminin akademik başarıya etkisi ve sisteme yönelik görüşler. Abant İzzet Baysal
Üniversitesi Eğitim Fakültesi Dergisi, 17(4), 1741–1761.
Borrell, J., Cosmas, N., Grymes, J., & Radunzel, J. (2017). The Eﬀectiveness of Kahoot! as a pre-lesson assessment tool. Retrieved October 16, 2018 https://www.usma.
Buckley, P., & Doyle, E. (2016). Gamiﬁcation and student motivation. Interactive Learning Environments, 24(6), 1162–1175.
Burke, B. (2011). Gartner enterprise architecture summit 2011. Retrieved September 05, 2016 http://www.gartner.com/newsroom/id/1629214.
Bury, B. (2017). Testing goes mobile – Web 2.0 formative assessment tools. ICT4LL 2017: International Conference ICT for Language Learning, 9–10 November, Florence,
Bybee, R. W. (2003). Why the seven E's. Retrieved September 05, 2016 http://www.miamisci.org/ph/lpintro7E.html.
Cahyani, A. D. (2016). Gamification approach to enhance students' engagement in studying language course. MATEC Web of Conferences. https://doi.org/10.1051/
Campell, A. A. (2016). Gamification in higher education: Not a trivial pursuit. Unpublished doctoral thesis. Florida: St. Thomas University Miami Gardens.
Clarisó, R., Arnedo Moreno, J., Bañeres Besora, D., Caballé Llobet, S., Conesa, J., & Gañán Jiménez, D. (2017). Gamification as a service for formative assessment e-learning tools. 1st Workshop on Gamification and Games for Learning (GamiLearn'17).
Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Erlbaum.
Çokluk, Ö., Şekercioğlu, G., & Büyüköztürk, Ş. (2012). Sosyal bilimler için çok değişkenli istatistik: SPSS ve LISREL uygulamaları. Ankara: Pegem Akademi Yayıncılık.
Creswell, J. W., & Plano Clark, V. L. (2007). Designing and conducting mixed methods research. Thousand Oaks, CA: Sage.
Darling-Hammond, L. (2010). Teacher education and the American future. Journal of Teacher Education, 61, 35–47. https://doi.org/10.1177/0022487109348024.
Delacruz, G. C. (2011). Games as formative assessment environments: Examining the impact of explanations of scoring and incentives on math learning, game performance, and help seeking. CRESST Report 796. National Center for Research on Evaluation, Standards, and Student Testing (CRESST).
Demirtaş, Z., & Kahveci, G. (2010). İlköğretim ikinci kademe öğrencilerinin okullarına yönelik beklenti ve memnuniyet düzeyleri. E-Journal of New World Sciences
Academy, 5(4), 2150–2161.
Deterding, S. (2011). Meaningful play: Getting gamification right. Retrieved June 11, 2014 http://www.youtube.com/watch?v=7ZGCPap7GkY.
Deterding, S., Dixon, D., Khaled, R., & Nacke, L. (2011). From game design elements to gamefulness: Defining gamification. 15th International Academic MindTrek Conference: Envisioning Future Media Environments (pp. 9–15).
Dichev, C., & Dicheva, D. (2017). Gamifying education: What is known, what is believed and what remains uncertain: A critical review. International Journal of
Educational Technology in Higher Education, 14(1), 9. https://doi.org/10.1186/s41239-017-0042-5.
Eisenkraft, A. (2003). Expanding the 5E model: A proposed 7E model emphasizes "transfer of learning" and the importance of eliciting prior understanding. The Science Teacher, 70(6), 56–59.
Erhel, S., & Jamet, E. (2013). Digital game-based learning: Impact of instructions and feedback on motivation and learning eﬀectiveness. Computers & Education, 67,
Field, A. (2009). Discovering statistics using SPSS (3rd ed.). London: Sage Publications.
Fotaris, P., Mastoras, T., Leinfellner, R., & Rosunally, Y. (2016). Climbing up the leaderboard: An empirical study of applying gamiﬁcation techniques to a computer
programming class. Electronic Journal of e-Learning, 14(2), 94–110.
Gaggioli, A. (2012). Cyber sightings. Cyberpsychology, Behavior and Social Networking, 15(3), 184.
Glover, I. (2013). Play as you learn: Gamification as a technique for motivating learners. In J. Herrington (Ed.), World Conference on Educational Multimedia, Hypermedia and Telecommunications. Chesapeake, VA: AACE.
Gökkaya, Z. (2014). Yetişkin eğitiminde yeni bir yaklaşım: Oyunlaştırma. Hasan Âli Yücel Eğitim Fakültesi Dergisi, 11–1(21), 71–84.
Guba, E. G. (1981). Criteria for assessing the trustworthiness of naturalistic inquiries. Educational Communication & Technology Journal, 29, 75–91.
Günüç, S., & Kuzu, A. (2015). Student engagement scale: Development, reliability and validity. Assessment & Evaluation in Higher Education, 40(4), 587–610.
Hamari, J., & Koivisto, J. (2014). Measuring ﬂow in gamiﬁcation: Dispositional ﬂow scale-2. Computers in Human Behavior, 40, 133–143.
Huck, S. W. (2009). Statistical misconceptions. Taylor & Francis Group.
Ismail, M. A.-A., & Mohammad, J. A.-M. (2017). Kahoot: A promising tool for formative assessment in medical education. Education in Medicine Journal, 9(2), 19–26.
Jackson, G. T., & McNamara, D. S. (2013). Motivation and performance in a game-based intelligent tutoring system. Journal of Educational Psychology, 105, 1036–1049.
Johnson, R. B., Onwuegbuzie, A. J., & Turner, L. A. (2007). Toward a deﬁnition of mixed methods research. Journal of Mixed Methods Research, 1(2), 112–133.
Jui-Mei, Y., Chun-Ming, H., Hwang, G. J., & Yueh-Chiao, L. (2011). A game-based learning approach to improving students' learning achievements in a nutrition
course. The Turkish Online Journal of Educational Technology, 10(2), 7.
Jukes, I., & Dosaj, A. (2004). Understanding DK (digital kids): Teaching and learning in the new digital landscape. Retrieved June 23, 2005 http://www.
Kapp, K. M. (2012). The gamiﬁcation of learning and instruction: Game-based methods and strategies for training and education. San Francisco, CA: Pfeiﬀer.
Kiryakova, G., Angelova, N., & Yordanova, L. (2014). Gamiﬁcation in education. 9th international Balkan education and science conference.
Koivisto, J., & Hamari, J. (2014). Demographic diﬀerences in perceived beneﬁts from gamiﬁcation. Computers in Human Behavior, 35, 179.
Krueger, R. A. (2006). Analyzing focus group interviews. The Journal of Wound, Ostomy and Continence Nursing, 33(5), 478–481.
Kumar, B., & Khurana, P. (2012). Gamification in education – learn computer programming with fun. International Journal of Computers and Distributed Systems, 2(1),
Lee, J. J., & Hammer, J. (2011). Gamification in education: What, how, why bother?
Licorish, S. A., George, L. J., Owen, H. E., & Daniel, B. (2017). "Go Kahoot!" Enriching classroom engagement, motivation and learning experience with games. 25th International Conference on Computers in Education. New Zealand: Asia-Pacific Society for Computers in Education.
Licorish, S. A., Owen, H. E., Daniel, B., & George, J. L. (2018). Students' perception of Kahoot!’s inﬂuence on teaching and learning. Research and Practice in Technology
Enhanced Learning, 13(1), 9.
Lune, H., & Berg, B. L. (2017). Qualitative research methods for the social sciences (9th ed.).
Marczewski, A. (2012). Gamification: A simple introduction & a bit more: Tips, advice and thoughts on gamification (2nd ed.). Kindle edition.
McMillan, J. H. (2004). Educational research: Fundamentals for the consumer (4th ed.). Boston: Pearson.
Medina, E. G. L., & Hurtado, C. P. L. (2017). Kahoot! A digital tool for learning vocabulary in a language classroom. Revista Publicando, 4(12), 441–449.
Muntean, C. I. (2011). Raising engagement in e-learning through gamification. 6th International Conference on Virtual Learning (pp. 323–329).
Neuendorf, K. A. (2002). The content analysis guidebook. Thousand Oaks, CA: Sage.
Nolan, J., & McBride, M. (2014). Beyond gamiﬁcation: Reconceptualizing game-based learning in early childhood environments. Information, Communication & Society,
Oliver, E. (2017). Gamiﬁcation as transformative assessment in higher education. HTS Teologiese Studies/Theological Studies, 73(3) a4527.
Pearson, E. S., Pearson, K., & Hartley, H. O. (1958). Biometrika tables for statisticians. New York: Cambridge University Press.
Pekrun, R., & Linnenbrink-Garcia, L. (2012). Academic emotions and student engagement. Handbook of research on student engagement (pp. 259–282). Boston, MA:
Plump, C. M., & LaRosa, J. (2017). Using kahoot! In the classroom to create engagement and active learning: A game-based technology solution for elearning novices.
Management Teaching Review, 2(2), 151–158.
Prensky, M. (2001). Digital natives, digital immigrants: Part 1. On the Horizon, 9(5), 1–6.
Prensky, M. (2014). The world needs a new curriculum. Retrieved September 25, 2018 http://marcprensky.com/wp-content/uploads/2013/05/Prensky-5-The-World_
Rapti, K. (2013). Implementing alternative assessment methods through gamification. ICERI2013.
Shenton, A. K. (2004). Strategies for ensuring trustworthiness in qualitative research projects. Education for Information, 22(2), 63–75.
Shute, V. J., & Spector, J. M. (2010). Stealth assessment in virtual worlds. Retrieved April 22, 2010 http://www.adlnet.gov/Technologies/Evaluation/Library/Additional
Solmaz, E., & Çetin, E. (2017). Ask-response-play-learn: Students' views on gamiﬁcation based interactive response systems. Journal of Educational and Instructional
Studies in The World, 7(3).
Tabachnick, B. G., & Fidell, L. S. (2012). Using multivariate statistics (6th ed.).
Tsihouridis, C., Vavougios, D., & Ioannidis, G. S. (2017). Assessing the learning process playing with Kahoot – A study with upper secondary school pupils learning electrical circuits. International Conference on Interactive Collaborative Learning (pp. 602–612).
Turan, Z., & Meral, E. (2018). Game-based versus non-game-based: The impact of student response systems on students' achievements, engagements and test anxieties. Informatics in Education, 17(1).
Yapıcı, İ. Ü., & Karakoyun, F. (2017). Gamification in biology teaching: A sample of Kahoot application. Turkish Online Journal of Qualitative Inquiry, 8(4), 396–414.
Zichermann, G., & Cunningham, C. (2011). Gamiﬁcation by design: Implementing game mechanics in web and mobile apps. Sebastopol, CA: O'Reilly Media, Inc.