Jl. of Interactive Learning Research (2010) 21 (3), 343-365.
Examining the Use of Audience Response Systems
in Secondary School Classrooms: A Formative Analysis
ROBIN KAY, ANN LESAGE AND LIESEL KNAACK
University of Ontario Institute of Technology, Canada
robin.kay@uoit.ca
ann.lesage@uoit.ca
liesel.knaack@uoit.ca
To date, extensive research has been done on the use of Audience Response Systems (ARSs) in colleges and universities, but not in secondary schools. The purpose of this study was to conduct a detailed formative analysis on the benefits, challenges, and use of ARSs from the perspective of 659 secondary school students. Key benefits reported were increased levels of engagement and motivation, the effective use of formative assessment, and a better quality learning environment. Key challenges included a perceived decrease in learning performance when an ARS was used for summative assessment, technological malfunctions, resistance to using a new method of learning, and increased stress due to perceived time constraints. Finally, students consistently rated the use of an ARS significantly higher when it was used for formative as opposed to summative assessment.
Overview
An Audience Response System (ARS), also known as a personal response system (Hinde & Hunt, 2006), electronic voting system (Kennedy & Cutts, 2005), student response system (Kaleta & Joosten, 2007), and clickers (Bergtrom, 2006), permits students to respond to multiple choice questions using a remote control device. After students click in their responses, the results are instantly aggregated and displayed in chart form for the entire class to review. Responses are often anonymous, but the teacher can assign an ARS remote device to specific students for evaluation purposes. Once the feedback is displayed, a teacher can change the course of instruction accordingly, or students can work out misconceptions and difficulties through peer or classroom discussion. While the use of ARSs started in 1966 (Judson & Sawada, 2002), mainstream use of this tool in secondary schools and higher education is a relatively new phenomenon beginning in 2003.
Several research reviews have been completed examining the use of ARSs (Caldwell, 2007; Fies & Marshall, 2006; Judson & Sawada, 2002); however, only one paper could be found examining the use of ARSs in secondary school classrooms (Penuel, Abrahamson, & Roschelle, 2006). The purpose of the current article was to conduct a detailed formative analysis of the potential benefits, challenges, and use of ARSs from the perspective of secondary students.
Literature Review
A detailed review of the ARS literature revealed four key areas of focus:
(a) overall attitudes toward ARSs, (b) student involvement, (c) assessment,
and (d) learning. Each of these areas will be discussed in detail.
Overall Attitudes Toward Using an ARS
Prior to 1992, overall student acceptance of ARSs was quite high (Judson & Sawada, 2002), although much of the evidence presented was judged
to be anecdotal (Fies & Marshall, 2006). However, more recent studies
have reported considerable quantitative and qualitative evidence to suggest
that higher education students have positive attitudes toward using ARSs
(Caldwell, 2007; Carnaghan & Webb, 2007; Draper & Brown, 2004; Judson & Sawada, 2002; Kaleta & Joosten, 2007; Paschal, 2002; Preszler, Dawe, Shuster, & Shuster, 2007; Reay, Bao, Li, Warnakulasooriya, & Baugh, 2005; Sharma, Khachan, Chan, & Byrne, 2005; Slain, Abate, Hodges, Stamatakis, & Wolak, 2004).
Student Involvement and ARSs
Engagement. Engagement is a precursor to student-centered learning.
If a student is actively constructing knowledge, overall engagement should
be higher. Students who used an ARS reported being more interested or en-
gaged in the content presented (Bergtrom, 2006; Preszler, et al., 2007; Simp-
son & Oliver, 2007).
Participation. Another essential element for student-centred learning is participation—students need to be part of the learning process. There is considerable evidence to suggest that students participate more when an ARS is used in the classroom (Caldwell, 2007; Draper & Brown, 2004; Greer & Heaney, 2004; Jones, Connolly, Gear, & Read, 2001; Siau, Sheng, & Nan, 2006; Stuart, Brown, & Draper, 2004; Uhari, Renko, & Soini, 2003; Van Dijk, Van Den Berg, & Van Keulen, 2001).
Attention. There is some evidence to suggest that student attention wanes
after about 20 minutes in a classroom environment (d’Inverno, Davis, &
White, 2003; Jackson, Ganger, Bridge, & Ginsburg, 2005). Consequently,
offering ARS questions at 20 minute intervals is one technique of segment-
ing a long lecture and allowing students to shift their attention and active-
ly participate in the learning process. Numerous studies have reported that
higher education students are more attentive when an ARS is used (Berg-
trom, 2006; Burnstein & Lederman, 2001; Caldwell, 2007; d’Inverno et al.;
Draper & Brown, 2004; Jackson et al.; Jones et al., 2001; Latessa & Mouw,
2005; Siau, et al., 2006; Slain, et al., 2004).
Discussion. Several researchers have observed that an ARS promotes in-
creased discussion, particularly when used with a peer instruction strategy
(Beatty, 2004; Brewer, 2004; Draper & Brown, 2004; Jones et al., 2001; Ni-
col & Boyle, 2003). With this strategy, students felt they were better able
to discuss and calibrate their understanding of specific concepts (Draper &
Brown, 2004). In addition, students noted they were more engaged and ani-
mated when peer discussions were occurring as a result of ARS feedback
(Jones et al.; Nicol & Boyle).
Assessment and ARSs
Feedback. An ARS improves the feedback cycle between instructor and
students by the rapid, anonymous collection and presentation of all student
responses (Abrahamson, 2006; McCabe, 2006; Pelton & Pelton, 2006). In
addition, with an ARS students are required to think about a question or
problem and then commit to an answer. This commitment to a response is
particularly important when students are required to articulate and defend
their answers in a peer-instruction situation (Abrahamson; Beatty, 2004;
Pradhan, Sparano, & Ananth, 2005).
Compare with other students. When an ARS is used, all responses from
the class are publicly presented, so students are able to compare their indi-
vidual understanding with their fellow classmates. There is some evidence
to suggest that students like to see how well they are doing relative to their
peers (Burton, 2006; Caldwell, 2007; Draper & Brown, 2004; Hinde &
Hunt, 2006; Simpson & Oliver, 2007).
Formative assessment. There are two common forms of assessment that can be used in a class. Summative assessment involves tests and assignments that count toward a student's final grade. The vast majority of assessment in higher education is summative. Formative assessment, on the other hand, is used to determine student understanding of concepts without grades, to identify misconceptions, and to alter classroom instruction accordingly (Donovan, Brown, & Cocking, 1999). Regular use of an ARS can offer
formative feedback to both instructors and students as to how well concepts
are being understood. Students can gauge and discuss their understanding
of concepts as they are being presented. There is considerable evidence to
suggest that an ARS is an effective tool for providing formative assessment
(Beatty, 2004; Bergtrom, 2006; Brewer, 2004; Caldwell, 2007; Draper &
Brown, 2004; Dufresne & Gerace, 2004; Greer & Heaney, 2004; Jackson et
al., 2005; Siau et al., 2006; Simpson & Oliver, 2007; Stuart et al., 2004).
Summative assessment. While it has been suggested by Fies & Marshall
(2006) that ARSs have been used primarily for summative assessment, lit-
tle research has been done on the impact of using an ARS for formal test
situations. There is some indirect evidence to suggest that higher education
students do not like using ARSs for “participation” credit (Caldwell, 2007).
More research is needed to examine the impact of summative assessment on
classroom learning.
Learning and ARSs
Quality of Learning. Numerous studies have asserted that higher educa-
tion students report increased learning when an ARS is used (Greer & Hea-
ney, 2004; Nicol & Boyle, 2003; Pradhan et al., 2005; Preszler et al., 2007;
Siau et al., 2006; Slain et al., 2004; Stuart et al., 2004; Uhari et al., 2003).
Some students like hearing explanations about ARS questions from their
peers who have a similar language and therefore can explain misconceptions
more effectively than the instructor (Nicol & Boyle, 2003; Caldwell, 2007).
Other students report that using an ARS helps them think more deeply about
important concepts (Draper & Brown, 2004; Greer & Heaney).
Learning performance. A number of researchers have provided descrip-
tive data to suggest that learning performance increased as a direct result of
using an ARS (Brewer, 2004; Caldwell, 2007; Carnaghan & Webb, 2007; Kennedy & Cutts, 2005; Latessa & Mouw, 2005; Schackow, Milton, Loya, & Friedman, 2004). In addition, many experimental studies have reported that ARS-based classes significantly outperform traditional lectures (Fagan, Crouch, & Mazur, 2002; Kaleta & Joosten, 2007; Kennedy & Cutts, 2005; Pradhan et al., 2005; Preszler et al., 2007; Schackow et al.; Slain et al., 2004).
Challenges Associated with Using an ARS
Technological challenges. Two main technological difficulties were reported when an ARS was used. When students were responsible for buying their own remote devices, they didn't always bring them to class or they lost them. Because of the dependence on technology, students without remote devices were unable to fully participate in an ARS-based class (Caldwell,
2007; Reay et al., 2005). A more critical issue was when remote devices did
not work or the signal was not received by the instructor’s computer. This
was a particularly stressful experience when students were being evaluated
for marks (Sharma et al., 2005; Siau et al., 2006).
Adjusting to a new method. Some students may react adversely to the
use of an ARS because the learning “game plan” has been changed. They
are used to a lecture format and a switch of methods can lead to stress, frustration, and resistance at first (Beatty, 2004; Fagan et al., 2002). Other students are distracted by the use of an ARS (Siau et al., 2006). Still others
doubt their ability to direct their own learning using an ARS (Allen & Tan-
ner, 2005).
Effort. Trees & Jackson (2007) noted that using an ARS requires more
cognitive energy and cooperation from students. This kind of extended ef-
fort may not suit all students, especially those who are more accustomed
to passive lectures. More research is needed, though, to determine wheth-
er students adapt to and accept the extended effort required when using an
ARS. The popularity of ARSs and their positive impact on learning suggest
that the additional effort required may not be a significant barrier (Caldwell,
2007; Fies & Marshall, 2006; Judson & Sawada, 2002; Simpson & Oliver,
2007).
K-12 Investigation of an ARS
Penuel et al. (2006) have done the only comprehensive study involving
the use of ARSs with K-12 teachers. Teachers used an ARS for two main
reasons: (a) to improve learning and instruction (formative assessment) or
(b) for evaluating students (summative assessment). Frequent users of an
ARS had positive perceptions about using this tool in the classroom. Over-
all, the impact of the ARSs in K-12 was to increase motivation and improve
student learning.
Summary and Purpose of Study
Previous research on ARSs has been conducted almost exclusively in the domain of higher education. Reported benefits include greater student involvement, positive reaction to regular assessment of knowledge during a class, and improved quality of learning. Key challenges include ensuring that the technology functions properly and helping students adjust to a new way of learning. Since the use of ARSs is increasing rapidly in secondary schools (Abrahamson, 2006), it is prudent to broaden the scope of current ARS research. Therefore, the purpose of the current study is to examine the benefits and challenges of using ARSs in secondary schools. Three questions were addressed:
1. What were the perceived benefits of using an ARS?
2. What were the perceived challenges of using an ARS?
3. How did teaching strategy (summative vs. mixed vs. formative assessment) influence the use of ARSs?
METHOD
Sample
Students. The student sample consisted of 659 Canadian students (327 males, 327 females, 5 missing data), enrolled in grades 9 (n=71), 10 (n=233), 11 (n=149), and 12 (n=206). Subject areas where an ARS was
used included accounting, biology, business, chemistry, civics, computer en-
gineering, English, law, mathematics, marketing, physics, technology, and
world issues. Eighty-seven percent (87%) of the students claimed that they
were comfortable or very comfortable with technology (n=572). Sample
population data was collected from 23 classrooms in 15 different secondary
schools located within an urban population of 400,000 people. All students
were selected through convenience sampling and had to obtain signed pa-
rental permission to participate.
Teachers. The teacher sample consisted of 23 teachers (16 males, 7 fe-
males), with 1 to 26 years of teaching experience (M = 15.9, SD = 7.9). Al-
most all teachers reported that they were comfortable or very comfortable
with technology (n=22, 96%).
Procedure
Teachers were emailed by an educational coordinator and informed of
the ARS study. Participation was voluntary and subjects could withdraw
from the study at any time without penalty or prejudice. Each teacher re-
ceived two half days of training in November and February on how to use
the ARS software and possible strategies for using an ARS in the classroom.
Teachers were then asked to use an ARS in their regular classrooms for one
month, although frequency of use was controlled by the individual teacher.
In pairs, teachers shared a laptop computer, an LCD projector, and one ARS system (32 clickers and a receiver) from E-Instruction (see http://www.einstruction.com/ for details of the specific ARS brand used). E-Instruction did not fund nor was it affiliated in any way with this study. All teachers used the ARSs for a three-month period; however, data collected for this study focused on the last month. During the final month in which ARSs were used, 94% (n=617) of secondary students reported using an ARS one to two times. Only 6% (n=41) of the students used an ARS once a week.
Data Sources
Student survey. Based on the final month in which an ARS was used, students completed the ARS Attitude Survey for Students (Kay, 2008a). This survey consisted of 11 seven-point Likert scale items that focused on the
four key areas reported in previous ARS research: (a) overall attitudes, (b)
student involvement, (c) assessment, and (d) learning. Since this was a for-
mative analysis of student attitudes toward ARS and its use, reliability and
construct validity were not calculated. Instead, each item was analyzed indi-
vidually to glean as much information as possible.
Student comments. Students were asked the following open-ended ques-
tion: “What was the impact of clickers on your learning in the past month?”
A coding scheme was developed to categorize 714 student comments made
by 526 students (Kay, 2008b). Some students made more than one comment when they filled in their survey, while other students offered no response.
Each comment was then rated on a 5-point Likert scale (-2 = very negative,
-1 = negative, 0 = neutral, 1 = positive, 2 = very positive). Two raters as-
sessed all comments made by students based on category and rating value.
After round one, inter-rater reliability was 83% for categories and 93% for
ratings. Comments where categories or ratings were not exactly the same
were shared and reviewed a second time by each rater. After round two, an
inter-rater reliability of 98% was reached for categories and 99% for the rat-
ing values.
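The two-round agreement procedure above boils down to a simple percent-agreement statistic. The sketch below is a hypothetical illustration (the category labels and paired codes are invented, not the study's data) of how agreement between the two raters could be computed.

```python
# Percent agreement between two raters coding the same set of comments.
# Category labels below are invented for illustration; the paper reports
# 83% category agreement after round one, rising to 98% after round two.

def percent_agreement(rater_a, rater_b):
    """Return the percent of paired codes on which two raters agree."""
    if len(rater_a) != len(rater_b):
        raise ValueError("raters must code the same comments")
    matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return 100.0 * matches / len(rater_a)

# Hypothetical category codes assigned to six comments:
rater1 = ["engagement", "stress", "feedback", "memory", "stress", "discussion"]
rater2 = ["engagement", "stress", "feedback", "memory", "participation", "discussion"]

agreement = percent_agreement(rater1, rater2)  # 5 of 6 codes match
```

Disagreements (here, the fifth comment) would then be reviewed jointly before a second round of coding.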
Teacher questions. Teachers were asked to indicate gender, years of
teaching experience, their comfort level with technology, and whether they
used an ARS for formative and/or summative assessment.
RESULTS
Benefits to Using an ARS
Student survey results are presented in Table 1 including percent of stu-
dents who disagreed (selected Strongly Disagree, Disagree or Slightly Dis-
agree) or agreed (selected Strongly Agree, Agree or Slightly Agree) with an
item, mean score (based on a 7-point Likert scale), and standard deviation. In addition, a summary of student comments about ARSs is presented in Table 2, which includes mean rating score (ranging from -2 to +2; see Appendix B for more details), standard deviations, and number of negative and positive
comments.
Table 1
Summary of ARS Quantitative Student Survey Questions (n=659)

Question                                                                    n    Disagree1  Agree2   M    SD
Overall Attitude
1. I would prefer to use clickers.                                          653  15%        69%      5.3  1.8
2. When clickers were used, the class was better.                           659  16%        52%      4.7  1.5
Student Involvement
3. I was more engaged in the lesson when clickers were used.                657  10%        75%      5.3  1.5
4. I was more motivated when clickers were used.                            656  11%        70%      5.2  1.5
5. I participated more than I normally would when clickers were used.       659  15%        55%      5.1  1.6
6. Using clickers generated more class discussion.                          657  18%        55%      4.6  1.5
Assessment
7. Using clickers was a good way to test my knowledge.                      655  9%         77%      5.4  1.4
8. I liked seeing what other students in the class selected for answers.    656  14%        57%      4.8  1.5
9. I DID NOT feel bad when most students got an answer right and I didn't.  658  26%        47%      4.6  1.7
10. I liked using clickers for tests.                                       636  21%        54%      4.7  1.8
Learning
11. I learned more when clickers were used.                                 656  23%        44%      4.4  1.5

1 Combining "Slightly Disagree", "Disagree" and "Strongly Disagree" responses
2 Combining "Slightly Agree", "Agree" and "Strongly Agree" responses
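The summary statistics in Table 1 can be derived from raw 7-point responses with a short routine. The sketch below uses invented sample data and follows the grouping stated in the table footnotes (1-3 counted as disagreement, 5-7 as agreement, 4 as neutral).

```python
# Sketch of how Table 1's per-item statistics are built from raw
# 7-point Likert responses (1 = Strongly Disagree ... 7 = Strongly Agree).
# The sample responses below are invented for illustration only.

from statistics import mean, stdev

def summarize_likert(responses):
    """Return (% disagree, % agree, mean, SD) for 7-point Likert data."""
    n = len(responses)
    disagree = sum(1 for r in responses if r <= 3)  # 1-3: any disagreement
    agree = sum(1 for r in responses if r >= 5)     # 5-7: any agreement
    return (100.0 * disagree / n, 100.0 * agree / n,
            round(mean(responses), 1), round(stdev(responses), 1))

# Hypothetical responses to "I was more engaged when clickers were used":
item3 = [6, 5, 7, 4, 6, 2, 5, 7, 6, 5]
pct_disagree, pct_agree, m, sd = summarize_likert(item3)
```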
Overall attitudes. Students were relatively positive about using ARSs
with 69% agreeing that they preferred to use an ARS (M = 5.3, SD = 1.8,
possible range was 1 to 7) and 52% indicating that their class was better
when an ARS was used (M = 4.7, SD =1.5). General comments offered from
the open-ended survey question also indicated that students were either neu-
tral or accepting of ARS use in their classroom. Although many comments
were neutral (n=41 out of 70), 22 comments were positive. The following
were typical comments made by students in favour of ARSs.
Loved the clickers—they were radical
I thought that they were a great addition to the class
They helped me with some things I was not sure about.
Table 2
Summary of Student Comments about Audience Response Systems (n=526)

Category                   Mean   (SD)     No. Negative  No. Positive  Total
                                           Comments      Comments      Comments
Overall Attitude
General comments            0.30  (0.89)   7             22            70
Different methods used      0.43  (1.03)   12            30            46
Student Involvement
Engagement                  1.23  (0.43)   0             111           111
Participation               1.04  (0.35)   0             49            49
Paid attention more         0.93  (0.38)   1             27            28
Discussion                  1.11  (0.33)   0             9             9
Stress                     -1.00  (1.05)   23            5             28
Assessment
Formative Assessment        0.94  (0.41)   1             49            53
Compare with others         0.88  (0.34)   0             14            16
Feedback                    0.86  (0.53)   1             3             14
Wrong answer - Reaction    -1.00  (0.00)   2             0             2
Learning
Learning process            0.81  (0.78)   4             72            104
Review previous concepts    1.13  (0.44)   0             51            53
Memory                      0.86  (0.73)   1             19            21
Teacher explained better    1.00  (0.00)   1             5             5
Learning performance       -0.26  (1.16)   45            30            80
Technology Issues          -0.48  (1.05)   16            7             25
Of the 46 comments made about student reaction to using a different
method of teaching, 30 were positive. Sample positive comments were as
follows:
I enjoyed it because it was a new form of lesson that we all
beneted from.
It is better than reading questions out of a textbook.
It was more interesting with the clickers rather than just raising
your hand.
Student involvement. Students indicated that they were substantially more involved when an ARS was used. Based on the survey, 75% of the students agreed that they were more engaged (M=5.3, SD = 1.5), 70% agreed they were more motivated (M=5.2, SD = 1.5), 55% agreed that they participated more than they normally would (M=5.1, SD = 1.6), and 55% agreed that more class discussion was generated (M=4.6, SD = 1.4).
Student comments regarding involvement were very positive with only one of the 201 comments rated as being negative. The mean rating for all comments was high (M=1.23, SD = 0.43; scale ranged from -2 to +2). Representative comments were as follows:
It made class more fun and enjoyable.
I wasn’t falling asleep. And I was very interested.
Much of the class seemed more interested in the Math we were
doing
Fewer comments were made about participation than engagement (n=49),
although the mean rating was high (M=1.04, SD = 0.35). Typical comments
about participation included:
Clickers motivated me into participating in class discussions in
general!
More hands on and intriguing way to learn. More class in-
volvement.
It forced people who normally don't feel comfortable participating in class discussion to participate.
Only 27 student comments were made about paying attention, but the
mean rating was high (M=0.93, SD = 0.38). Sample attention-based com-
ments were as follows:
It made me concentrate more.
The clickers allowed me to stay focused on the subject while
in class
I felt I read all the questions more carefully to make sure I got
the right answer and paid attention more than I normally would
when we just talked about something.
Finally, mean ratings about class discussion were high (M=1.11, SD =
0.33), but only nine students commented on this aspect of using ARS. Typi-
cal comments about discussion included:
[It] takes more time with more discussion, but the discussion
and more in depth learning outweighed the time used.
Using the clickers prompted much more class discussion and it
was much better than simply answering in a paper.
Assessment. Students appeared to be most enthusiastic when an ARS
was used for formative assessment. Over 75% of students agreed that using ARSs was a "good way" to self-test their knowledge (M=5.4, SD = 1.4). Comparing answers with other students (M=4.8, SD = 1.5) was rated lower, with only 57% agreeing that they liked to see the responses of other students.
Finally, just over 50% of the students agreed that they liked using an ARS
for formal tests (M=4.7, SD = 1.8).
Comments about formative assessment (M=0.94, SD = 0.41), comparing
responses (M=0.88 , SD = 0.34), and feedback (M=0.86, SD = 0.53) were
rated relatively high. The following are typical comments made about for-
mative assessment:
[I] got to see how much I really understood in that class.
It helped me see what parts of law I needed improvement in,
judging by the questions I got right or wrong.
[It] helped me by testing my knowledge and getting to know
where the class and I are at.
Sample comments about comparing responses with the class were as fol-
lows:
I liked clickers because I can see how I’m doing compared to
the class.
I also learned more about my classmates and what they were
interested in.
It is interesting to see what the rest of the class was answering.
Finally, representative comments about getting feedback included:
I liked to be able to see the answer right away.
I enjoyed knowing if I got a question right or wrong immedi-
ately.
Learning. Only 44% of the students agreed that they learned more when an ARS was used. The mean score for this item was 4.4 (SD=1.5), the lowest of all survey questions. However, open-ended comments painted a different picture. Ratings for comments about learning process (M=0.81, SD = 0.78), reviewing previous concepts (M=1.13, SD = 0.44), helping to improve memory (M=0.86, SD = 0.73), and teacher explanations (M=1.00, SD = 0.00) were relatively high. However, the mean rating for learning performance comments was negative (M=-0.26, SD = 1.16). Students were divided on whether learning performance improved or decreased. Comments made about the overall learning process (n=104) and reviewing concepts (n=53) were the most frequent. Typical comments made about the learning process were:
I liked the way you had to think very fast.
I have learned better and observe the question better.
Learning was more fulfilling.
A very innovative and exciting way to learn.
Representative comments about using an ARS to review concepts
included
I think it was a nice way to test our knowledge for reviews.
It was great review of the units and topics we just covered.
I was able to obtain answers to test-like questions easily giving
me insight as to what more studying was needed.
Comments made about an ARS helping to improve memory were less
frequent (n=21) but noteworthy.
They helped me remember small but important details that
were on my test.
Clickers helped make the material that much more easy to re-
member.
I nd I remember things better if I get them wrong rst on a
test and this system always seems like a test for me so I feel I
learn more this way.
Positive comments (see negative comments in challenges section) made
about learning performance focused on positive reactions to test situations:
I think clickers had a very good impact on my learning. They
made my tests a lot more enjoyable and relaxing.
Well we only used the clickers twice and both times it made
me concentrate more and I got better marks. But when we did,
I actually didn’t care that we had to do a test and I’m pretty
sure I did better than I would [have] on a sheet of paper.
Challenges Experienced Using ARSs
Technology. Perceptions of ARS technology were not assessed using the survey, but the mean rating for comments was negative (M=-0.48, SD = 1.05), at least for the 25 students who chose to reflect on this aspect of ARS use. The majority of student complaints concerned confidence that the remote control functioned properly and registered the intended response.
The whole process of aiming the clicker [was] bothersome.
Well at first I didn't like the clickers—I found them rather complicated and also mine was not working correctly.
They are too difficult to use and it was difficult to tell when the correct answer was chosen.
Overall attitudes. While most general attitudes toward ARSs were positive, comments from some students indicated a strongly negative reaction to the ARS. Some of the more negative comments offered were as follows:
I did not like using the clickers.
Horrendous—takes up too much time.
Other comments were targeted at having to use a new method to learn.
I prefer writing my answers on paper. I still learned during the
process of clickers, but I prefer class without them.
Student involvement. Overall, students were very positive about using ARSs, with the exception of one category: stress. Mean ratings were quite low (M = -1.00, SD = 1.05) and 23 of the 28 comments made were negative. Typical comments referred to increased pressure about understanding concepts, using an ARS in a test situation, or feeling pressure to respond.
It made me more worried about not knowing what I need to
know!!
Using clickers on a test is very stressful. Not only are we stressed because of the test conditions, but it's difficult to work the clickers.
I felt that clickers made me feel as if I had to rush to get my answer selected. I was nervous and felt pressure, when usually I'm confident.
Assessment. The survey indicated that 26% of secondary students "felt bad" when most of their peers responded correctly and they did not (M=4.6, SD = 1.7). As stated earlier, student comments about assessment and ARSs
were almost always positive.
Learning. Even though reviewing concepts, improving memory, and ob-
taining explanations from the teacher were rated as positive effects of us-
ing an ARS, a sizeable group of students (n=45) were quite vocal about the
negative impact of the ARSs on learning performance. Negative comments
included not having the questions available after they were answered elec-
tronically and the time pressures experienced when completing tests with an
ARS.
If anything, I didn’t do as well on units test—I will be [not]
able to do as well on exams as I could have because I don’t
have the questions in front of me to help me study. Therefore,
I soon forgot all the questions from the quiz/test.
The clickers seem to create more pressure to answer correctly
which often led to answering incorrectly.
I hated the time constraint. I was more nervous about the time,
than I was working on the questions.
Strategies
Detailed data on how ARSs were used in secondary schools was not collected; however, three general strategies were reported by teachers: (a) using an ARS for formative assessment only (formative, n=398), (b) using an ARS for both summative tests and formative assessment (mixed, n=110), and (c) using an ARS for summative assessment only (summative, n=103). Means
for all items on the ARS student attitude scale for each strategy selected are
presented in Table 3. Note that mean values for most attitude scale items are
lowest when a summative strategy was used and highest when a formative
strategy was employed.
Table 3
Mean ARS Survey Item Scores as a Function of Strategy Used (n=611)

                                                              Summative      Mixed          Formative
                                                              Assessment     (Form. & Summ.) Assessment
Survey Item                                                   M     (SD)     M     (SD)     M     (SD)
Overall Attitudes
1. I would prefer to use clickers.                            4.42  (1.94)   4.72  (1.73)   5.70  (1.62)
2. When clickers were used, the class was better.             4.19  (1.46)   4.43  (1.32)   4.97  (1.50)
Student Involvement
3. I was more engaged in the lesson when clickers were used.  4.52  (1.57)   5.05  (1.44)   5.65  (1.33)
4. I was more motivated when clickers were used.              4.44  (1.60)   4.79  (1.47)   5.57  (1.39)
5. I participated more than I normally would when
   clickers were used.                                        4.39  (1.51)   4.85  (1.56)   5.41  (1.50)
6. Using clickers generated more class discussion.            3.83  (1.35)   4.45  (1.37)   4.85  (1.49)
Assessment
7. Using clickers was a good way to test my knowledge.        4.56  (1.59)   5.27  (1.39)   5.71  (1.28)
8. I liked seeing what other students in the class
   selected for answers.                                      4.10  (1.47)   5.23  (1.43)   4.92  (1.48)
9. I did not feel bad when most students got an answer
   right and I didn't.                                        4.32  (1.59)   3.92  (1.75)   4.89  (1.70)
10. I liked using clickers for tests.                         4.09  (2.16)   4.19  (1.93)   5.11  (1.62)
Learning
11. I learned more when clickers were used.                   3.63  (1.48)   4.13  (1.37)   4.65  (1.46)
A MANOVA was run to compare formative, mixed, and summative approaches to using the ARS (Table 4). Using an ARS for formative assessment was rated significantly more positively than using an ARS for summative assessment on all 11 Likert-scale items in the ARS attitude scale. Using an ARS for formative assessment also resulted in significantly higher scores on most survey items when compared to a mixed approach (formative & summative).
Table 4
MANOVA Examining Attitude toward ARS as a Function of Teaching Strategy
Columns show df, F, and Scheffé's post hoc analysis (p < .05).

Overall Attitude
1. I would prefer to use clickers.  2  31.5*  Formative > Mixed & Summative
2. When clickers were used, the class was better.  2  14.6*  Formative > Mixed & Summative

Student Involvement
3. I was more engaged in the lesson when clickers were used.  2  29.6*  Formative > Mixed > Summative
4. I was more motivated when clickers were used.  2  31.9*  Formative > Mixed & Summative
5. I participated more than I normally would when clickers were used.  2  21.3*  Formative > Mixed > Summative
6. Using clickers generated more class discussion.  2  21.4*  Formative & Mixed > Summative

Assessment
7. Using clickers was a good way to test my knowledge.  2  30.3*  Formative > Mixed > Summative
8. I liked seeing what other students in the class selected for answers.  2  17.6*  Formative & Mixed > Summative
9. I did not feel bad when most students got an answer right and I didn't.  2  16.4*  Formative > Mixed & Summative
10. I liked using clickers for tests.  2  20.6*  Formative > Mixed & Summative

Learning
11. I learned more when clickers were used.  2  22.6*  Formative > Mixed > Summative

* p < .001
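The shape of this comparison can be illustrated with a small sketch. This is not the authors' analysis pipeline: it simulates 7-point Likert responses for a single survey item (group sizes and approximate means taken from Table 3) and runs a one-way ANOVA with Bonferroni-adjusted pairwise t-tests as a simplified stand-in for the MANOVA with Scheffé's post hoc test used in the study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulated 7-point Likert responses for one survey item under each
# strategy; means loosely echo Table 3 (formative > mixed > summative).
groups = {
    "summative": rng.normal(4.4, 1.5, 103).clip(1, 7),
    "mixed":     rng.normal(4.8, 1.5, 110).clip(1, 7),
    "formative": rng.normal(5.6, 1.5, 398).clip(1, 7),
}

# Omnibus one-way ANOVA across the three strategies
f_stat, p_value = stats.f_oneway(*groups.values())
print(f"F = {f_stat:.1f}, p = {p_value:.2e}")

# Pairwise comparisons with a Bonferroni correction (3 comparisons)
names = list(groups)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        t, p = stats.ttest_ind(groups[names[i]], groups[names[j]])
        print(f"{names[i]} vs {names[j]}: adjusted p = {min(p * 3, 1.0):.4f}")
```

With effect sizes of this magnitude and these group sizes, the omnibus test and the formative-versus-summative contrast come out clearly significant, mirroring the pattern reported in Table 4.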
DISCUSSION
The purpose of this study was to conduct a formative analysis of the use of ARSs in secondary school classrooms. Three areas were examined: (a) benefits, (b) challenges, and (c) teaching strategy.
Benefits
Overall attitudes. The survey data suggest that secondary students were positive about the use of ARSs, a result that is consistent with previous studies conducted in higher education. However, not all students benefited equally when using an ARS, a conclusion that is supported by some student comments. To provide a more in-depth analysis, Fies and Marshall (2006) suggested that general impressions and anecdotal comments about attitudes toward ARSs need to be supplemented by a more detailed, focused analysis.
Student involvement. The evidence provided by this study suggests that ARSs helped increase engagement, participation, and to a lesser extent, attention paid in class. These results are consistent with a number of studies looking at ARS use in higher education (Caldwell, 2007; Fies & Marshall, 2006; Judson & Sawada, 2002; Simpson & Oliver, 2007). More data is needed, though, focusing on what is engaging about ARSs, the quality of participation, and why students are more focused when an ARS is used.
Assessment. Secondary students appreciated the use of an ARS for formative assessment, particularly with respect to test preparation and review. Comparing answers with other students and getting general feedback played a secondary role. While the value of formative assessment is well documented in higher education (Beatty, 2004; Bergtrom, 2006; Brewer, 2004; Caldwell, 2007; Draper & Brown, 2004; Dufresne & Gerace, 2004; Greer & Heaney, 2004; Jackson et al., 2005; Siau et al., 2006; Simpson & Oliver, 2007; Stuart et al., 2004), the relative importance of different kinds of formative assessment has not been examined in detail.
Learning. A number of students commented that using an ARS improved the quality of the learning process, specifically within the context of reviewing concepts. The impact on memory for the material presented was noteworthy, but commented on by only a few students. While challenges will be discussed later, it is important to note that an improved learning process did not necessarily lead to increased learning performance. At least 45 students felt that learning performance was hampered, sometimes appreciably, by using an ARS, particularly when it was used as a summative assessment tool. To thoroughly investigate the impact of ARSs on learning, more detailed data collection and a wider variety of teaching strategies need to be explored.
Challenges
Overall attitudes. It is evident that some students had trouble adjusting to the novelty of using an ARS and the associated changes in teaching strategies. While this difficulty was experienced by less than 2% of the student sample, it has also been observed in higher education. The suggested remedy has been to clearly explain the rationale for using an ARS and the intended benefits of this technology for learning (Crouch & Mazur, 2001; Trees & Jackson, 2007).
Student involvement. One key challenge for students in this study was the stress experienced when being required to respond to an ARS question, particularly in a summative test situation. The comments concerning increased stress were not frequent, but they were intense. Given students' frustration when an ARS was used for summative tests (see Strategy, next section), it might be wise to use this technology for formative assessment purposes only. In addition, student anxiety might be reduced by explaining why ARSs are being used, with an emphasis on learning rather than on obtaining the right answer.
Learning. The most serious challenge associated with using an ARS was
the negative impact on learning performance experienced by students who
were required to use this tool for completing graded tests. Time pressure and
not having a paper copy of the questions and answers to reference at a later
date were two notable complaints. More thought needs to be directed toward
whether summative assessment is an effective use of an ARS.
Strategy
Only one previous study reviewed the use of ARSs in a K-12 environment (Penuel et al., 2006). These researchers observed that teachers naturally divided into two groups with respect to using ARSs: (a) formative (instructional) and (b) summative (testing) use. A similar pattern was observed in the current study with secondary students. Most teachers (65%) chose to use an ARS as a formative assessment tool. Based on the survey results, students overwhelmingly preferred formative over summative use. No research to date has directly compared formative and summative approaches, so further research is needed to confirm this conclusion.
Recommendations to Educators
The current study was intended to be a formative analysis of the use of ARSs in secondary school classrooms, so it would be premature to offer unequivocal advice about their use in the classroom. That said, several recommendations are worth noting.
First, most secondary students in this study did not respond well when the ARS was used as a test-taking device; in fact, learning performance was perceived to decrease. To minimize the negative impact of this tool, it might be wise to restrict the use of an ARS to formative assessment, particularly when reviewing concepts. Students appeared to appreciate this strategy the most.
Second, it is probably a good idea to clearly explain why an ARS is being used in the classroom, particularly if it will be used regularly. Caldwell (2007) and Trees and Jackson (2007) noted that instructors who expect full student support need to explain why they are using ARSs and what they expect to gain from the technology.
Finally, because technology problems were experienced by some students, it would be prudent for educators to test the ARS equipment ahead of time to ensure that all remote devices respond properly. Batteries, for example, might need to be changed. When infrared remote devices are used, fluorescent lights may need to be turned off to avoid signal interference.
Caveats and Future Research
This study was a first attempt to investigate the use of ARSs in secondary school classrooms. Two main data collection sources were used: survey questions and open-ended comments. Analysis of the data was based on a comprehensive review of the literature examining the use of ARSs in higher education. There are a number of caveats, though, that need to be considered when interpreting the results.
First, the results are intended to provide a preliminary perspective on ARS use in secondary school classrooms. More detailed examination is required in key areas such as the catalysts for student engagement, the impact of different teaching strategies, and why memory is improved when an ARS is used.
Second, the results are based on short-term use of an ARS (once or twice over a one-month period). The impact could be decidedly different if an ARS were used regularly: engagement may decline as students become accustomed to the tool, while the impact on learning may become more significant with increased use.
Third, teaching strategies should be expanded to acquire a more complete understanding of the effectiveness of using an ARS. One promising strategy that has yet to be examined in secondary school, but which has received considerable acclaim in higher education, is peer-based instruction. This technique involves displaying a higher-level, multiple-choice question that targets a misconception, asking students to select an answer using an ARS, giving students time to discuss and defend their answers with two to four peers, taking a revote on the original question, and having the instructor provide a brief summary of the content presented. A number of researchers have reported considerable success with this approach in higher education (Brewer, 2004; Burnstein & Lederman, 2001; Crouch & Mazur, 2001; Cutts, 2006; Draper & Brown, 2004; Hinde & Hunt, 2006; Jones et al., 2001; Kennedy & Cutts, 2005; Miller, Santana-Vega, & Terrell, 2006; Nicol & Boyle, 2003; Pelton & Pelton, 2006).
Finally, more research is needed on learning performance. This study collected systematic data on the perceived quality of learning, but actual measures of student performance are needed to firmly establish the learning impact of ARSs.
Summary
The purpose of this study was to conduct a detailed formative analysis of the potential benefits, challenges, and use of ARSs from the perspective of secondary school students. Key benefits reported were increased engagement and motivation, effective formative review of concepts, and a higher quality learning environment. Key challenges observed included decreased learning performance when an ARS was used for summative assessment, technological glitches, negative reactions to using a new method of learning, and higher stress due to perceived time constraints when responding to questions. Finally, students rated the use of an ARS higher when it was used for formative as opposed to summative assessment.
References

Abrahamson, L. (2006). A brief history of networked classrooms: Effects, cases, pedagogy, and implications. In D. A. Banks (Ed.), Audience response systems in higher education (pp. 1-25). Hershey, PA: Information Science Publishing.

Allen, D., & Tanner, K. (2005). Infusing active learning into the large-enrolment Biology class: Seven strategies, from the simple to complex. Cell Biology Education, 4, 262-268.

Beatty, I. (2004). Transforming student learning with classroom communication systems. EDUCAUSE Research Bulletin, 3, 1-13. Retrieved November 3, 2007, from http://www.educause.edu/ir/library/pdf/ERB0403.pdf

Bergtrom, G. (2006). Clicker sets as learning objects. Interdisciplinary Journal of Knowledge and Learning Objects, 2. Retrieved November 3, 2007, from http://ijklo.org/Volume2/v2p105-110Bergtrom.pdf

Brewer, C. A. (2004). Near real-time assessment of student learning and understanding in biology courses. BioScience, 54(11), 1034-1039.

Burnstein, R. A., & Lederman, L. M. (2001). Using wireless keypads in lecture classes. The Physics Teacher, 39(1), 8-11.

Burton, K. (2006). The trial of an audience response system to facilitate problem-based learning in legal education. In D. A. Banks (Ed.), Audience response systems in higher education (pp. 265-276). Hershey, PA: Information Science Publishing.

Caldwell, J. E. (2007). Clickers in the large classroom: Current research and best-practice tips. Life Sciences Education, 6(1), 9-20.

Carnaghan, C., & Webb, A. (2007). Investigating the effects of group response systems on student satisfaction, learning, and engagement in accounting education. Issues in Accounting Education, 22(3), 391-409.

Crouch, C. H., & Mazur, E. (2001). Peer instruction: Ten years of experience and results. American Journal of Physics, 69(9), 970-977.

Cutts, Q. (2006). Practical lessons from four years of using an ARS in every lecture of a large class. In D. A. Banks (Ed.), Audience response systems in higher education (pp. 65-79). Hershey, PA: Information Science Publishing.

d'Inverno, R., Davis, H., & White, S. (2003). Using a personal response system for promoting student interaction. Teaching Mathematics and Its Applications, 22(4), 163-169.

Donovan, M. S., Brown, A. L., & Cocking, R. (Eds.) (1999). How people learn: Brain, mind, experience, and school. Washington, DC: National Academy Press.

Draper, S. W., & Brown, M. I. (2004). Increasing interactivity in lectures using an electronic voting system. Journal of Computer Assisted Learning, 20(2), 81-94.

Dufresne, R. J., & Gerace, W. J. (2004). Assessing-to-learn: Formative assessment in physics instruction. The Physics Teacher, 42, 428-433.

Fagan, A. P., Crouch, C. H., & Mazur, E. (2002). Peer instruction: Results from a range of classrooms. The Physics Teacher, 40(4), 206-209.

Fies, C., & Marshall, J. (2006). Classroom response systems: A review of the literature. Journal of Science Education and Technology, 15(1), 101-109.

Greer, L., & Heaney, P. J. (2004). Real-time analysis of student comprehension: An assessment of electronic student response technology in an introductory earth science course. Journal of Geoscience Education, 52(4), 345-351.

Hinde, K., & Hunt, A. (2006). Using the personal response system to enhance student learning: Some evidence from teaching economics. In D. A. Banks (Ed.), Audience response systems in higher education (pp. 140-154). Hershey, PA: Information Science Publishing.

Jackson, M., Ganger, A. C., Bridge, P. D., & Ginsburg, K. (2005). Wireless handheld computers in the undergraduate medical curriculum. Medical Education Online, 10(5). Retrieved November 3, 2007, from http://www.med-ed-online.org/pdf/t0000062.pdf

Jones, C., Connolly, M., Gear, A., & Read, M. (2001). Group integrative learning with group process support technology. British Journal of Educational Technology, 32(5), 571-581.

Judson, E., & Sawada, D. (2002). Learning from past and present: Electronic response systems in college lecture halls. Journal of Computers in Mathematics and Science Teaching, 21(2), 167-181.

Kaleta, R., & Joosten, T. (2007). Student response systems: A University of Wisconsin system study of clickers. EDUCAUSE Research Bulletin, (10), 1-12.

Kay, R. H. (2008a). Audience response systems attitude scale. Retrieved November 22, 2008, from http://faculty.uoit.ca/kay/papers/arshs/AppendixA_ARS.pdf

Kay, R. H. (2008b). Coding scheme for secondary school students comments about audience response systems. Retrieved November 22, 2008, from http://faculty.uoit.ca/kay/papers/arshs/AppendixB_ARS.pdf

Kennedy, G. E., & Cutts, Q. I. (2005). The association between students' use of electronic voting systems and their learning outcomes. Journal of Computer Assisted Learning, 21(4), 260-268.

Latessa, R., & Mouw, D. (2005). Use of audience response system to augment interactive learning. Family Medicine, 37(1), 12-14. Retrieved November 3, 2007, from http://www.stfm.org/fmhub/fm2005/January/Robyn12.pdf

McCabe, M. (2006). Live assessment by questioning in an interactive classroom. In D. A. Banks (Ed.), Audience response systems in higher education (pp. 276-288). Hershey, PA: Information Science Publishing.

Miller, R. L., Santana-Vega, E., & Terrell, M. S. (2006). Can good questions and peer discussion improve calculus instruction? PRIMUS, 16(3), 1-9.

Nicol, D. J., & Boyle, J. T. (2003). Peer instruction versus class-wide discussion in large classes: A comparison of two interaction methods in the wired classroom. Studies in Higher Education, 28(4), 457-473.

Paschal, C. B. (2002). Formative assessment in physiology teaching using a wireless classroom communication system. Advances in Physiology Education, 26(4), 299-308.

Pelton, L. F., & Pelton, T. (2006). Selected and constructed response systems in mathematics. In D. A. Banks (Ed.), Audience response systems in higher education (pp. 175-186). Hershey, PA: Information Science Publishing.

Penuel, W. R., Abrahamson, L., & Roschelle, J. (2006). Theorizing the transformed classroom: Sociocultural interpretation of the effects of audience response systems in higher education. In D. A. Banks (Ed.), Audience response systems in higher education (pp. 187-208). Hershey, PA: Information Science Publishing.

Pradhan, A., Sparano, D., & Ananth, C. V. (2005). The influence of an audience response system on knowledge retention: An application to resident education. American Journal of Obstetrics and Gynecology, 193(5), 1827-1830.

Preszler, R. W., Dawe, A., Shuster, C. B., & Shuster, M. (2007). Assessment of the effects of student response systems on student learning and attitudes over a broad range of biology courses. CBE-Life Sciences Education, 6(1), 29-41.

Reay, N. W., Bao, L., Li, P., Warnakulasooriya, R., & Baugh, G. (2005). Toward the effective use of voting machines in physics lectures. American Journal of Physics, 73(6), 554-558.

Schackow, T. E., Milton, C., Loya, L., & Friedman, M. (2004). Audience response system: Effect on learning in family medicine residents. Family Medicine, 36, 496-504.

Sharma, M. D., Khachan, J., Chan, B., & O'Byrne, J. (2005). An investigation of the effectiveness of electronic classroom communication systems in large lectures. Australasian Journal of Educational Technology, 21(2), 137-154.

Siau, K., Sheng, H., & Nah, F. (2006). Use of classroom response system to enhance classroom interactivity. IEEE Transactions on Education, 49(3), 398-403.

Simpson, V., & Oliver, M. (2007). Electronic voting systems for lectures then and now: A comparison of research and practice. Australasian Journal of Educational Technology, 23(2), 187-208.

Slain, D., Abate, M., Hodges, B. M., Stamatakis, M. K., & Wolak, S. (2004). An interactive response system to promote active learning in the doctor of pharmacy curriculum. American Journal of Pharmaceutical Education, 68(5), 1-9.

Stuart, S. A. J., Brown, M. I., & Draper, S. W. (2004). Using an electronic voting system in logic lectures: One practitioner's application. Journal of Computer Assisted Learning, 20(2), 95-102.

Trees, A. R., & Jackson, M. H. (2007). The learning environment in clicker classrooms: Student processes of learning and involvement in large university courses using student response systems. Learning, Media, and Technology, 32(1), 21-40.

Uhari, M., Renko, M., & Soini, H. (2003). Experiences of using an interactive audience response system in lectures. BMC Medical Education, 3(12), 1-6.

Van Dijk, L. A., Van Den Berg, G. C., & Van Keulen, H. (2001). Interactive lectures in engineering education. European Journal of Engineering Education, 26(1), 15-28.
... However, SRSs have been incorporated into courses including English for the media (Tan, Ganapathy, & Kaur, 2018), English preparation class (Celik, 2015); health care English course (Korkealehto, & Siklander, 2018); biology (Wolter et al., 2011;Knight & Wood, 2005), chemistry (Ares, Bernal, Nozal, Sánchez, & Bernal, 2018), computer science course (Dolezal et al., 2018), information systems strategy and governance course (Licorish, Owen, Daniel, & George, 2018), economics (Simpson & Oliver, 2007), and educational psychology (Mayer et al., 2009). Moreover, Kay, LeSage, & Knaack (2010) conducted a study to examine benefits and challenges of using SRSs in secondary schools in accounting, biology, business, chemistry, civics, computer engineering, English, law, mathematics, marketing, physics, technology, and world issues courses by including 659 Canadian students and 23 teachers. As it can be seen, there is no constraints on the subject it can be used in. ...
... Furthermore, the results of the study conducted by Han & Finkelstein (2013) showed that the use of SRSs as a way of formative feedback had more influence on students' perceptions in terms of engagement and learning than summative feedback. Caldwell (2007) stated that when SRSs were included to teaching learning processes, students emotionally and psychologically committed themselves to answer questions even if sometimes they just guessed, which increased their involvement (Bicen & Kocakoyun 2018;Celik, 2015;Dolezal et al., 2018;Han & Finkelstein, 2013;Kay et al., 2010;Korkealehto, & Siklander, 2018;Knight & Wood, 2005;Licorish et al., 2018;Tan, Ganapathy, & Kaur, 2018). Also, SRSs improved the motivation of students (Dolezal et al., 2018;Licorish et al., 2018). ...
... Kahoot is one of the SRSs which was included in the current study. In the literature, it was stated that Kahoot increases motivation, the involvement of students, interaction among students and between students and instructor and assesses student learning formatively, lets students compare their responses relative to their peers, assess pre-class studies, gives immediate feedback, reveals common misconceptions of students, makes instruction fun, increases class attendance, enhance learning, saves time and students can assess their own understanding of a concept (Ares, et al., 2018;Bicen & Kocakoyun 2018;Caldwell, 2007;Celik, 2015;Draper & Brown, 2004;Dolezal et al., 2018;Gok, 2011;Han & Finkelstein, 2013;Kay & LeSage, 2009;Kay et al., 2010;Korkealehto, & Siklander, 2018;Knight & Wood, 2005;Licorish et al., 2018;Martyn, 2007;Steinert & Snell, 1999;Tan, Ganapathy, & Kaur, 2018). On the other hand, in the literature, one of the biggest problems of using student response systems in education is stated as technical problems such as poor internet connection or some issues that stem from mobile phones may restrain learning (Bicen & Kocakoyun 2018;Kay & LeSage, 2009). ...
Conference Paper
Full-text available
The aim of this study was to investigate the perceptions of pre-service teachers about a student response system, Kahoot. This study was based on case study research design and conducted in the fall semester of 2018-2019 for 9 weeks at a state university in Turkey. The participants of the study were 112 pre-service teachers chosen according to convenience sampling method. Among the 112 pre-service teachers, 79 (70.5%) of them were female and 33 (29.5%) of them were male, 53 (47.3%) of them were from undergraduate Mathematics Teaching Department and 59 (52.7%) of them from Turkish Language Teaching Department. This study was conducted in the Principles and Methods of Instruction course. In this study, after the introduction of subjects using PowerPoint presentations, the instructor included Kahoot plays. Each Kahoot application consisted of 5 to 6 multiple-choice questions about the course topics. Data were collected through a questionnaire and they were analyzed using descriptive statistical analysis techniques. The findings of the study showed that most of the pre-service teachers have positive attitudes towards Kahoot, informed that Kahoot accounted for their learning and knowledge retention. Finally, pre-service teachers stated that Kahoot encouraged their active participation, helped them to get more involved in the course. On the other hand, a small percentage of pre-service teachers (less than 5%) stated that Kahoot applications are time-consuming, boring and have some technological problems. The findings of this study were discussed and some implications for researchers were suggested. Öz Bu çalışmanın amacı, öğretmen adaylarının bir öğrenci yanıtlama sistemi olan Kahoot hakkındaki algılarını incelemektir. Bu çalışma durum çalışması araştırma yöntemine dayanmaktadır ve 2018-2019 güz döneminde 9 hafta süreyle bir devlet üniversitesinde yürütülmüştür. Araştırmanın katılımcıları kolay ulaşabilir örnekleme yöntemine göre seçilen 112 öğretmen adayıdır. 
112 öğretmen adayı arasında 79'u (%70.5) bayan, 33'ü (%29.5) erkek, 53'ü (%47.3) Matematik Öğretmenliği bölümüne kayıtlıyken, 59'u (%52.7) Türkçe Öğretmenliği Bölümüne kayıtlıdır. Bu çalışma Öğretim İlke ve Yöntemleri dersinde yürütülmüştür. Bu çalışmada konular PowerPoint sunumları kullanılarak öğretim elemanı tarafından anlatıldıktan sonra, Kahoot oyunlarını yer almıştır. Her Kahoot uygulaması ders konusu hakkında 5 ile 6 çoktan seçmeli sorudan oluşmuştur. Veriler bir anket aracılığıyla toplanmıştır ve betimsel istatistik analiz teknikleri kullanılarak analiz edilmiştir. Çalışmanın bulguları, öğretmen adaylarının çoğunun Kahoot'a karşı olumlu tutumları olduğunu göstermiştir, Kahoot'un öğrenme ve bilgilerin kalıcılığını sağladığı belirlenmiştir. Son olarak, öğretmen adayları Kahoot'un aktif katılımlarını teşvik ettiğini, derse daha fazla katılmaları için teşvik ettiğini belirtmişlerdir. Diğer taraftan, öğretmen adaylarının küçük bir bölümü (% 5'ten az), Kahoot uygulamalarının zaman alıcı, sıkıcı ve bazı teknolojik sorunları olduğunu belirtmiştir. Bu çalışmanın bulguları tartışılmış ve araştırmacılar için bazı önerilerde bulunulmuştur. Anahtar Kelimeler: Oyunlaştırma, öğrenci yanıt sistemleri, Kahoot.
... Roschelle (2003) suggests that clickers are effective in gathering instantaneous feedback. Many favorable study results (Roschelle, 2003;Kay & Knaak, 2009;Martyn, 2007) make it clear from studies involving university classrooms, where classroom response systems are most often used, that there are benefits that come from using these systems. They can be used to assess pre-and post-lesson knowledge, provide a springboard for class discussion, and collect attendance. ...
... However, classroom response system technology has had limited use in secondary schools (Fies and Marshall, 2006). Kay and Knaak (2009) conducted an early study in high school science classes which suggested that this technology did not work well in test-taking situations, and that they might best be used in formative assessment situations or as the impetus for class discussion. Kay and Knaak (2009) point out that there are barriers to including classroom response systems in high school classes, including the barriers teachers face using technology, the time needed for developing an understanding of the formative assessment aspects of active learning, and time and skill needed to create questions to gain valuable information from the responses. ...
... Kay and Knaak (2009) conducted an early study in high school science classes which suggested that this technology did not work well in test-taking situations, and that they might best be used in formative assessment situations or as the impetus for class discussion. Kay and Knaak (2009) point out that there are barriers to including classroom response systems in high school classes, including the barriers teachers face using technology, the time needed for developing an understanding of the formative assessment aspects of active learning, and time and skill needed to create questions to gain valuable information from the responses. Yet, the possibilities for their use in high school classrooms remain. ...
Article
Classroom Response Systems are personal response devices that allow teachers to poll a group of students. As the use of technology, including tablets, iPads, and clicker systems, is introduced into Saudi Arabian classrooms, teachers will need to be trained in all the possible uses for the tablets and iPads that will be given to the students. Classroom Response Systems are a natural outgrowth of the type of uses teachers will need to integrate into their now-technological classrooms. The current study looks at the literature on classroom response systems and offers reasoning for the professional development of Saudi teachers in the use of classroom response systems for feedback and discussion in the science classroom. The target audience for this study are the professional development trainers of Saudi Arabian teachers, particularly teachers of high school Biology. Science curriculum has recently been changed in Saudi Arabia, and teachers are searching for ways to coordinate the curriculum with the new classroom technology that has also been recently introduced. Professional development will need to be developed to train teachers in the utilization of the new technology. Training in classroom response systems will be one part of that training. The study includes a link to an author-created training website, which includes videos of examples of classroom response systems in the classroom, interviews with students, sample tests, and guidelines for the purchase of classroom response systems.
... One of the active learning methods to maximize student participation, interaction, and engagement are Student Response Systems (SRSs). SRSs are also referred to as Audience Response Technology (ART), Audience Response System (ARS), Personal Response System (PRS), Classroom Communication System (CCS), electronic voting systems or in general known as "clickers" (Caldwell, 2007;Draper & Brown, 2004;Han & Finkelstein, 2013;Kay & LeSage, 2009;Kay, Lesage et al., 2010;Mayer et al., 2009;Nicol & Boyle, 2003;Simpson & Oliver, 2007;Wolter et al., 2011). In the current study, Kahoot, one of the SRSs, was included. ...
... It was stated in the literature that Kahoot has many significant effects on students' learning and creates a positive classroom environment. The use of Kahoot or other SRSs may change the dynamics of traditional classroom structure by providing an equal chance to all students to reply to the instructor about their understandings and immediate feedback to students (Celik, 2015;Gok, 2011;Kay, Lesage et al., 2010;Knight & Wood, 2005;Mayer et al., 2009;Steinert & Snell, 1999). In this way, the inclusion of Kahoot in the teaching-learning process reveals the common misconceptions of students (Knight & Wood, 2005). ...
... Hence, to cover the topic during class lectures, there is little time for more interactive teaching-learning activities. On the other hand, if SRSs are used for brief quizzes on assigned readings or homework for formative assessment (Caldwell, 2007;Gok, 2011;Han & Finkelstein, 2013;Kay, Lesage et al., 2010), this may encourage pre-class preparation (Caldwell, 2007). In this way, Kahoot saves class time and this time can be spent in more productive ways than content coverage (Martyn, 2007). ...
... Only a handful of studies have looked at the use of SRSs in K-12 environments [9][10][11][12][13][14]. Similar benefits (formative feedback, student involvement, quality of teaching) and challenges (e.g., technology set-up and question creating time, summative assessment) observed in higher education were noted in K-12 classrooms [9][10][11][12][13][14]. ...
... Only a handful of studies have looked at the use of SRSs in K-12 environments [9][10][11][12][13][14]. Similar benefits (formative feedback, student involvement, quality of teaching) and challenges (e.g., technology set-up and question creating time, summative assessment) observed in higher education were noted in K-12 classrooms [9][10][11][12][13][14]. However, the number of studies on SRS use in K-12 classrooms is too small to make firm conclusions. ...
... Overall, five key benefits of using SRSs in STEM-based secondary school classrooms were identified: formative assessment for student learning, formative assessment for teachers, increased student involvement, improved quality of teaching, and summative assessment. The first four benefits mirrored previous research on the use of SRSs in higher education [1][2][3][4][5][6][7][8] and K-12 classrooms [9][10][11][12][13][14]. However, qualitative data provided additional useful information about the nature of these benefits. ...
Conference Paper
Full-text available
Student response systems (SRSs), also known as audience response systems, classroom response systems and clickers, allow students to answer electronically displayed questions. SRSs can be hardware-based (e.g., a physical device) or software-based (e.g., mobile apps). Typically, all responses are instantly presented in chart form, then reviewed and discussed by the instructor and the class. Considerable research has been conducted on the use of SRSs in higher education [1-8] but not in secondary school environments. The purpose of the current study was to examine the effectiveness of SRSs in STEM-based secondary school classrooms. Twenty-two STEM-focused secondary school teachers with 1 to 32 years of teaching experience participated in this study. After four months of integrating SRSs in their classrooms, all teachers completed a survey, responded to open-ended questions, and participated in hour-long focus groups. Key benefits of using SRSs included providing formative feedback on learning and teaching, increasing student involvement, improving the quality of teaching, and offering an alternative method for conducting summative assessments. Key challenges observed were technical problems (software and hardware), increased preparation time for lessons, accounting for individual differences, resistance to summative assessment, and classroom management.
... Studies examining early versions of ARSs found improved enthusiasm for learning but limited gains in learning [24]. When student-centered learning strategies were combined with these technologies, improvements were evidenced in several areas, including student attendance and participation (e.g., [25,26]), collaborative learning and student engagement [26][27][28][29][30], student comprehension (e.g., [31][32][33]), and student satisfaction with classes [24]. For example, Cheng and colleagues [34] found stronger motivation for learning and academic performance among undergraduate students who exhibited greater social presence in classes with ARSs. ...
... As described earlier, there is a significant history of research that has supported the efficacy of ARS for engaging students in the learning process [25][26][27][28][29][30] as well as student achievement in Algebra I [15][16][17][18]. The present study extends the CCMS project and its findings by showing significant differences on an Algebra I EOC assessment in one school. ...
Article
Full-text available
This article reports on two aspects of a professional learning (PL) and research study. Twenty-five teachers participated in a two-year PL program that sought to support teachers to implement classroom connectivity technology (CCT) in their Algebra I classrooms. Students in one school who learned Algebra I with CCT outperformed their peers who learned without CCT. Therefore, we explored the classroom practices of one teacher whose students attained higher achievement. There were several aspects of this teacher’s instruction that created the context for noticing and potentially led to the significant difference in Algebra I achievement. First, we describe the foundational components of the classroom context that established the expectations for learning and lesson mini cycles that provided a consistent format for students. We discuss several components of this work that supported student noticing, including connections to prior knowledge, task analysis, and carefully curated lessons. Students’ observations were codified in the conjectures that they developed individually and in groups as well as in the summaries of the classroom engagement in several ways. The implications of these results and future research are discussed.
... The idea of using a CRS to gather evidence of students' thinking, and using this as a starting point for generating productive classroom discussions, harmonizes with principles of formative assessment. In fact, literature reviews (Caldwell, 2007;Egelandsdal et al., 2019;Kay & LeSage, 2009) and other case studies in mathematics education (Kay et al., 2010;King & Robinson, 2009;Krumsvik & Ludvigsen, 2012) point out that one of the most important benefits of using a CRS is the possibility to improve formative assessment practices, and especially provide feedback to students. As such, the use of a CRS creates opportunities for instant formative feedback in direct relation to the students' work with mathematical tasks. ...
Article
Full-text available
This article reports on results from a design research project that implements a formative assessment practice with support from a classroom response system (CRS). Cumulatively building on earlier research and drawing on iteratively generated data from secondary schools in Sweden, the article elaborates on design principles and task types for constructing CRS tasks aimed at generating mathematical classroom discussions. The study generated three design principles, six task types and 31 empirically evaluated tasks useful for developing and using CRS tasks in classroom practices. The results are discussed in relation to earlier research, methodological considerations and the cultural context of Sweden.
... A second contribution is that we examine the effects in secondary education, whereas previous research has focused on the university level. Further research on this topic, particularly in secondary education, is deemed necessary, since the use of polling technology in secondary education is rapidly increasing (Kay, LeSage, & Knaack, 2010; Molin, Cabus, Haelermans, & Groot, 2019). Third, we compare two treatment groups, each receiving a different type of feedback, with a control group, enabling us to compare differences in metacognitive skills and motivation. ...
Article
Full-text available
This study explores the effects of formative feedback on students’ metacognitive skills when feedback strategies are used with polling technology. Using a randomized field experiment among 633 physics students in six schools in Dutch secondary education, we study assessments with the polling technology Socrative by dividing students into three groups. Students in the cooperative group use a combination of peer discussions and teacher feedback, while students in the individual group use teacher feedback only. To compare differences in metacognitive skills, students in the control group use Socrative but do not receive formative feedback from either the teacher or peers. The results show a significant positive effect of the cooperative treatment on both metacognitive skills and motivation in comparison with the control group. We find that students with low metacognitive skills benefit significantly more from the cooperative treatment than students with high metacognitive skills. No effects are found for the individual treatment. However, with the individual treatment, girls significantly increase their metacognitive skills and are more motivated than boys. Additionally, a mediation analysis shows that motivation partially mediates the relation between the cooperative treatment and metacognitive skills. Based on these results, we recommend a combination of peer discussions and teacher feedback in physics courses.
... The Classroom Teaching Interactive Response System (CIRS) is a teaching application system that uses electronic devices such as hand-held sensors, wireless voting devices, and smartphones to enable teacher-student interaction and information feedback in the classroom. The CIRS proposed by Stanford University [5] was widely applied in Europe and the United States after the development of infrared wireless technology. China did not introduce CIRS until 2000, when it began building a CIRS-based teaching model. ...
Article
As education reform continues to deepen, improving teaching quality has become a precondition for realizing effective teaching. At the same time, the informationization of teaching requires teachers to keep abreast of the times, adopt multimedia means of teaching, and improve teaching efficiency. Since new teaching concepts and methods are required in the new era, five-star teaching theory, which focuses on problem solving and on the consistency of the teaching process with the learning process, provides theoretical guidance for curriculum design through four steps: activating prior knowledge, presenting and demonstrating new knowledge, applying it through practice, and achieving integrated mastery. The classroom teaching interactive response system (CIRS), which emphasizes interactive feedback between teachers and students and strengthens student participation in the classroom, allows real-time monitoring of teaching progress and real-time adjustment of teaching content. Combining five-star teaching theory with CIRS is therefore well suited to improving effective teaching. Taking an internal medicine course as an example, this study implemented a smartphone-based CIRS teaching model grounded in the 5-Star Instructional Model, carried out experiments, and evaluated the results through a questionnaire survey. The results show that the teaching model was fully recognized by students, improved students’ scores in the internal medicine course, and achieved good teaching results.
Conference Paper
Full-text available
Student response systems (SRSs) allow students to answer digitally displayed questions and receive immediate feedback. While considerable research has been conducted on the use of SRSs in higher education (e.g., Chien et al., 2016; Kay & LeSage, 2009; Keough, 2012), limited research has focused on elementary school classrooms. The purpose of the study was to examine grade 4 to 6 students' (n = 88) attitudes toward the use of SRSs over three months. Six themes of use were examined: general impressions, motivation, participation, social interactions, learning, and technology. Most students were motivated when using SRSs and believed they increased participation. Social interactions appeared to be positive when using SRSs, although some students felt pressure to respond quickly or felt bad when their answers were incorrect. A majority of students particularly enjoyed being tested by SRSs and receiving feedback. The main challenge with using SRSs was that the technology sometimes did not work, thereby preventing students from participating.
Article
Full-text available
The objective of this chapter is to explain the huge, burgeoning sense of excitement surrounding response systems and, more generally, networked classrooms today, and why, for an idea apparently more than 40 years old, it took this long to happen. Beginning with a brief history of early response systems, it takes up the story from the author's own experience, leading through hardware barriers, misconceptions about pedagogy, and classroom successes, to summarize the variety of uses and how they lead to improved teaching and learning. It then discusses why this is such a potentially important area of study for improving education, and finally goes on to describe the emerging characteristics of, and rationale for, more powerful types of modern systems.
Article
Full-text available
Since 2002 we have been investigating the use of an electronic classroom communication system in large first-year lecture classes. Handheld keypads were distributed to teams of students during a lecture class. Students used the keypads to answer two-step multiple choice problems after a discussion within their group. The questions were generated using students’ answers from previous exams. We have evaluated our use of the classroom communication system using a survey about how comfortable students are with this type of interaction. In addition, we have tried to determine if the use of the classroom communication system can be linked to student performance on exams. Our results show that students are comfortable with this technology and feel that, on the whole, interactive lectures are useful. At first glance, there is an improvement in students’ exam performance, but there are too many competing factors to clearly say that this improvement is solely due to the use of the classroom communication system. Even though this paper is based in physics and a physics example is used to illustrate points, the technique can be applied to other discipline areas.
Article
Full-text available
Research and practice in the use of electronic voting systems has developed over the last five years. Electronic voting systems, also known as personal response systems, audience response systems or classroom communication systems, use handsets to elicit responses from students as part of structured teaching sessions, typically lectures. The use of this information has implications for pedagogy; they are associated with the introduction of interactive, discursive and more segmented approaches to teaching. The pedagogic and organisational implications of adopting such systems are summarised, along with the perceptions that staff and students hold. Comparisons are drawn between practice up to 2002 and between 2002 and 2006; these reveal how both practice and research on this topic has matured, highlighting (for example) the development of models that seek to abstract and share practice. The paper concludes by outlining the ways in which such tools can be used to improve lecturing, and identifies an agenda for future work in this area.
Chapter
This chapter presents an overview of audience response systems as tools for promoting learning in an interactive classroom. The comparison with a popular TV programme is used to introduce the role of questioning in face-to-face teaching, before examining the intrinsic pedagogical benefits of questions. Technical aspects of different systems are summarised, and pedagogical issues discussed, based on the author's personal teaching experience. Appropriate ways of using audience response systems in the classroom, and guidelines for question delivery are presented. New technology will continue to open up opportunities, but the challenge for the future remains to use the technology effectively and in pedagogically sound ways.
Chapter
This chapter examines two types of response technologies (selected and constructed) available to support discussion and participation in the classroom, and describes our experiences using and observing them in a variety of mathematics, science, and computer science classes at various educational levels. Selected response systems (a.k.a., clickers) display multiple-choice questions, and then collect and analyze student responses, and present distribution summaries to the class. Constructed response systems allow students to use handheld computers to generate free-form graphical responses to teacher prompts using various software applications. Once completed, students submit their responses to the instructor's computer wirelessly. The instructor may then select and anonymously project these authentic student work samples or representations to promote classroom discussion. We review the purpose, design, and features of these two types of response systems, highlight some of the issues underlying their application, discuss our experiences using them in the classroom, and make recommendations.
Article
Recent increases in class size in higher education have focused more attention on the nature of the face-to-face learning experience. This chapter examines how a keypad technology facilitates active learning in the lecture hall using a number of pedagogically proven approaches. We survey 219 first-year business studies students tackling introductory economics, and find that the technology enhances learning in lectures because, among other things, it improves concentration, provides instantaneous and more effective student feedback, and allows students to make comparisons on how well they fare relative to their peers. Interestingly, we find less statistical support for the benefits of using the technology to allow students to respond anonymously, and explore some reasons for this result. Finally, we demonstrate our use of the tool to engage in teaching the Prisoner's Dilemma game. This forms part of the emerging knowledge on how to teach classroom experiments using keypad technology.
Article
Electronic student response technologies (SRT) are capable of assessing teaching and learning methods in real time, and they offer an exceptional means of introducing active learning protocols in classes with large enrollments. These wireless systems allow students to key in responses with remote control units to questions posed by an instructor in the classroom. Student responses then are displayed in real time, allowing both students and instructors to gauge student comprehension instantaneously. From Spring 2002 to Spring 2003, we utilized SRT in 4 sections of a high-enrollment introductory Earth Science course (Geosc 020: Planet Earth) at Penn State University. We conducted a multi-faceted assessment of the use of SRT in our course that included quantitative and qualitative perception data from students enrolled in the course and faculty/administrator visitors to our classroom. Our preliminary assessment of the pedagogical merits of SRT in our course suggests that this technology is an effective tool for introductory geoscience education.
Article
Objective. The purpose of this study was to evaluate the impact of an interactive student response (ISR) system on student learning, interest, and satisfaction. Methods. Students enrolled in 3 courses, Clinical Pharmacokinetics, Medical Literature Evaluation, and Pathophysiology and Therapeutics, were taught using either a traditional lecture format (study year 1) or an ISR system format (study year 2). Primary outcomes of interest were performance on examinations and student attitudes. Results. Students using the ISR system had better scores on the Clinical Pharmacokinetics examination questions (mean scores, 82.6% ± 9.6% vs 63.8% ± 8.3%, p < 0.001), on the cumulative final examination for Medical Literature Evaluation (82.9% ± 11.5% vs 78.0% ± 12.2%, p = 0.016), and on the evaluable "analysis type" examination questions in the Pathophysiology and Therapeutics course (82.5% ± 8.7% vs 77.4% ± 12.5%, p = 0.0002). Students using the ISR system in all 3 courses were positive about the system. Conclusion. The ISR system was a useful tool for encouraging active student learning and was well received by students. This system can be efficiently used to gauge student understanding in the classroom and enhance student performance.
Article
We surveyed Peer Instruction users worldwide to collect data on their experiences with the pedagogy. Force Concept Inventory pre- and post-test scores at a range of institutions show learning gains above the level for traditional pedagogies and consistent with interactive engagement.