Exploring the Use of Audience Response Systems in Secondary
School Science Classrooms
Robin Kay · Liesel Knaack
Published online: 18 February 2009
© Springer Science+Business Media, LLC 2009
J Sci Educ Technol (2009) 18:382–392
DOI 10.1007/s10956-009-9153-7

R. Kay (✉) · L. Knaack
University of Ontario Institute of Technology, Oshawa, ON, Canada
e-mail: robin.kay@uoit.ca
Abstract An audience response system (ARS) allows
students to respond to multiple choice questions using
remote control devices. Once the feedback is collected and
displayed, the teacher and students discuss misconceptions
and difficulties experienced. ARSs have been extremely
popular and effective in higher education science class-
rooms, although almost no research has been done at the
secondary school level. The purpose of this study was to
conduct a detailed formative analysis of the benefits,
challenges, and use of ARSs from the perspective of 213
secondary school science students. Perceived benefits were
increased student involvement (engagement, participation,
and attention) and effective formative assessment of
student understanding. Perceived challenges included
decreased student involvement and learning when ARSs
were used for summative assessment, occasional techno-
logical malfunctions, resistance to using a new method of
learning, and increased stress due to time constraints when
responding to questions. Finally, students rated the use of
ARSs significantly higher when it was used for formative
as opposed to summative assessment.
Keywords Audience response systems · Secondary school · Science · Evaluation
Examining the Use of Audience Response Systems
in Secondary School Classrooms: A Formative Analysis
Overview
An Audience Response System (ARS), also known as a
personal response system (Hinde and Hunt 2006), elec-
tronic voting system (Kennedy and Cutts 2005), student
response system (Kaleta and Joosten 2007), and clickers
(Bergtrom 2006), permits students to click in answers to
electronically displayed multiple choice questions using a
remote control device. After students select an answer, the
results are instantly aggregated and displayed in chart form
for the entire class to review. Responses are usually
anonymous, but a teacher can assign individual remote
devices to specific students for the purpose of evaluation. Once
the feedback is presented, a teacher can alter the course of
instruction or students can work out misconceptions and
difficulties through peer or classroom discussion. Wide-
spread use of ARSs in secondary schools is a relatively
new phenomenon beginning in 2003.
Several researchers have completed extensive reviews
on the use of ARSs (Caldwell 2007; Fies and Marshall
2006; Judson and Sawada 2002); however, only one paper
could be found examining the use of ARSs in secondary
school classrooms (Penuel et al. 2006). The purpose of the
current study was to conduct a detailed formative analysis
of the potential benefits, challenges, and use of ARSs from
the perspective of secondary science students.
ARSs in Secondary Schools
Research on the use of ARSs in secondary schools is
sparse. Penuel et al. (2006) have conducted the only study
of ARSs in a K-12 environment. They noted that teachers
used ARSs for two principal reasons: formative assessment
(to improve learning and instruction) or summative
assessment (to deliver and grade tests). They also observed
that training increased frequency of use and that teachers
rarely used ARSs to promote discussion. Teachers who
adopted the view that students should play a significant,
active role in learning were more likely to use ARSs for
formative assessment. Overall, the main impact of ARSs in
K-12 was to increase motivation and improve student
learning.
Benefits of Using ARSs in Higher Education
Because there is limited research on the use of ARSs in the
secondary school domain, it is worth reviewing the impact
of ARSs in higher education, particularly since the majority
of ARS use has occurred in science-related subject areas
(e.g., Allen and Tanner 2005; Beatty et al. 2006; Dufresne
and Gerace 2004; Fagan et al. 2002; Preszler et al. 2007;
Sharma et al. 2005). A detailed review of the ARS litera-
ture in higher education revealed four key areas of focus:
overall attitudes toward ARSs, student involvement,
assessment, and learning. Each of these areas will be dis-
cussed in detail.
Overall Attitudes Toward ARS
Extensive data has been collected indicating that student
attitudes toward ARSs are very positive (Caldwell 2007;
Fies and Marshall 2006; Judson and Sawada 2002; Simpson
and Oliver 2007). Students have also reported that ARSs
are easy to learn and use (e.g., Hinde and Hunt 2006;
Pelton and Pelton 2006; Pradhan et al. 2005; Sharma et al.
2005; Siau et al. 2006).
Student Involvement
When an ARS is used in higher education classrooms,
students are more engaged in the content presented (e.g.,
Bergtrom 2006; Preszler et al. 2007; Simpson and Oliver
2007), participate more (Caldwell 2007; Draper and Brown
2004; Greer and Heaney 2004; Siau et al. 2006; Stuart et al.
2004; Uhari et al. 2003), and pay more attention to concepts
presented (e.g., Bergtrom 2006; Draper and Brown 2004;
Jackson et al. 2005; Latessa and Mouw 2005; Siau et al.
2006; Slain et al. 2004). In addition, a number of
researchers have observed that ARSs stimulate discussion,
especially when they are used with a peer instruction
strategy (Beatty 2004; Brewer 2004; Jones et al. 2001;
Nicol and Boyle 2003). With this strategy, students felt they
were better able to discuss and improve their understanding
of higher level concepts (Draper and Brown 2004).
Assessment
Evidence suggests that using an ARS improves the feed-
back cycle between instructor and students with the rapid,
anonymous collection and presentation of all student
responses to questions asked (Abrahamson 2006; McCabe
2006; Pelton and Pelton 2006). Students also enjoy seeing
how well they are doing relative to their peers (Burton
2006; Caldwell 2007; Draper and Brown 2004; Hinde
and Hunt 2006; Simpson and Oliver 2007). In addition,
considerable data supports the use of ARSs for providing
effective formative assessment of student knowledge
(Beatty 2004; Bergtrom 2006; Brewer 2004; Caldwell
2007; Draper and Brown 2004; Dufresne and Gerace 2004;
Greer and Heaney 2004; Jackson et al. 2005; Siau et al.
2006; Simpson and Oliver 2007; Stuart et al. 2004).
Finally, some indirect evidence suggests that students do
not like using ARSs when participation is associated with a
grade (Caldwell 2007).
Learning
Many higher education students report that they learn more
when ARSs are used (Greer and Heaney 2004; Nicol and
Boyle 2003; Pradhan et al. 2005; Preszler et al. 2007; Siau
et al. 2006; Slain et al. 2004; Stuart et al. 2004; Uhari et al.
2003). Some students prefer discussing ARS-based ques-
tions with their peers, who share a similar language and
therefore can articulate solutions more successfully than
the instructor (Nicol and Boyle 2003; Caldwell 2007).
Other students maintain that answering ARS questions forces
them to think more about the important concepts (Draper
and Brown 2004; Greer and Heaney 2004) and resolve
misconceptions (D’Inverno et al. 2003). Finally, there is
substantial qualitative and quantitative evidence to suggest
that learning performance increases as a direct result of
using ARSs (e.g., Carnaghan and Webb 2007; Fagan et al.
2002; Kaleta and Joosten 2007; Kennedy and Cutts 2005;
Pradhan et al. 2005; Preszler et al. 2007; Schackow et al.
2004).
Challenges Associated with Using an ARS
Three categories of challenges have been reported by
students and instructors using ARSs: technology, time, and
method. With respect to technology, students who forgot to
bring their remote devices to class were unable to fully
participate in ARS-based discussions (Caldwell 2007; Reay
et al. 2005). Another more critical technological issue
occurred when the remote devices did not work, a partic-
ularly stressful situation during a formal evaluation
(Sharma et al. 2005; Siau et al. 2006).
Several challenges associated with ARSs are related to
time. There are very few classroom-ready, subject-specific
collections of ARS questions available, so instructors have
to develop original questions. Many find this task extre-
mely time consuming (Allen and Tanner 2005; Beatty et al.
2006; Fagan et al. 2002; Freeman et al. 2007; Paschal
2002). In addition, the time required to set up an ARS,
hand out the remote controls at the beginning of the class,
and collect remote controls at the end of the class can
be significant (Hatch et al. 2005; Stuart et al. 2004). Fur-
thermore, some discussions, if not moderated properly,
can consume considerable class time by drifting away from
the main concepts (Nicol and Boyle 2003; Reay et al.
2005).
A final set of challenges involves students who react
unfavorably to the use of an ARS because the method of
learning has been changed. They are accustomed to lec-
tures and a switch in teaching format can lead to stress,
frustration, and resistance at first (Beatty 2004; Fagan et al.
2002). In addition, using ARSs requires more cognitive
energy and cooperation from students and this extended
effort may not suit students who are accustomed to passive
lectures (Trees and Jackson 2007). Finally, some students
doubt their ability to direct their own learning when ARSs
are used (Allen and Tanner 2005).
Summary and Purpose of Study
Previous research on the use of ARSs in classrooms has
been conducted almost exclusively in the domain of higher
education. Benefits reported when ARSs are used have
included greater student involvement, a positive reaction to
formative assessment, improved quality of learning, and
increased learning performance. Challenges identified
when using ARSs have consisted of malfunctioning tech-
nology, increased demands on teaching time, and students
having to adjust to a new way of learning. Since the use of
ARSs is increasing rapidly in secondary schools (Abra-
hamson 2006), it is prudent to broaden the scope of current
ARS research in the area of science where a majority of
higher education instructors have used ARSs. Therefore,
the purpose of the current study was to examine benefits,
challenges, and strategies associated with using ARSs in
secondary school science classrooms. Three questions were
addressed:
1. What were the perceived benefits of using an ARS in a
secondary school science classroom?
2. What were the perceived challenges in using an ARS
in a secondary school science classroom?
3. How did teaching strategy (summative vs. mixed vs.
formative assessment) influence the use of ARSs in a
secondary school science classroom?
Method
Sample
Students
The student sample consisted of 213 students (107 males,
105 females, 1 missing data), enrolled in grades 10 (n=50),
11 (n=81), and 12 (n=82). Subject areas where an ARS
was used included biology, chemistry, physics and general
science. Eighty-six percent of the students claimed that they
were comfortable or very comfortable with technology
(n=182). Sample population data was collected from seven
classrooms in six different secondary schools. All students
were selected through convenience sampling and had to
obtain signed parental permission to participate.
Teachers
The teacher sample consisted of seven teachers (5 males,
2 females), with 8–26 years of teaching experience
(M = 18.0, SD = 7.0). Six of the seven teachers
reported that they were comfortable or very comfortable
with technology.
Procedure
Teachers were emailed by an educational coordinator and
informed of the ARS study. Participation was voluntary
and subjects could withdraw from the study at any time
without penalty or prejudice. Each teacher received two
half days of training in November and February on how to
use the ARS software and possible strategies for using an
ARS in the classroom. Teachers were then asked to use an
ARS in their regular classrooms, although frequency of use
was controlled by the individual teacher. Each teacher had
access to a laptop computer, an LCD projector, and one
ARS system (32 clickers and a receiver) from E-Instruction
(see http://www.einstruction.com/ for details of the specific
ARS brand used). This equipment was purchased inde-
pendently by the school board where the study was
conducted. All teachers used the ARS over a three-month
period; however, the data collected for this study focussed
on the last month, during which all of the secondary school
science students reported using the ARS one to two times.
Data Sources
Student Survey
Based on the last month in which an ARS was used, stu-
dents completed the ARS Attitude Survey for Students
(Kay 2008a). This survey consisted of 11 seven-point
Likert scale items that focused on the four key areas
reported in previous ARS research: overall attitudes, stu-
dent involvement, assessment, and learning. Cronbach’s
reliability coefficient for the ARS survey was 0.88. How-
ever, since this was a formative analysis of student
attitudes toward the use of ARSs in secondary school
science classrooms, items on the scale were analyzed
individually to gather as much information as possible.
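For readers who want to reproduce this kind of reliability check, the sketch below shows one way to compute Cronbach's alpha in Python. It is a minimal illustration rather than the authors' analysis code, and the response matrix it generates is hypothetical, matching only the survey's dimensions (213 students by 11 seven-point items).

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha: one row per respondent, one column per scale item."""
    k = items.shape[1]
    sum_item_var = items.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of total scores
    return (k / (k - 1)) * (1 - sum_item_var / total_var)

# Hypothetical response matrix: 213 students x 11 seven-point Likert items
rng = np.random.default_rng(0)
responses = rng.integers(1, 8, size=(213, 11)).astype(float)
print(f"alpha = {cronbach_alpha(responses):.2f}")
```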
Student Comments
Students were asked the following open-ended question:
‘‘What was the impact of clickers on your learning in the
past month?’’ A coding scheme was developed to catego-
rize 255 student comments made by 175 students (Kay
2008b). Some students made more than one comment,
while other students offered no response. No student made
more than three comments. Each comment was rated on a
five-point Likert scale (-2 = very negative, -1 = negative,
0 = neutral, +1 = positive, +2 = very positive). Two raters
(the author and a trained research assistant) assessed all
comments made by students based on category and rating
value. After round one, inter-rater reliability was 83% for
categories and 93% for ratings. Comments where catego-
ries or ratings were not exactly the same were shared and
reviewed a second time by each rater. After round two, an
inter-rater reliability of 98% was reached for categories and
99% for the rating values.
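The inter-rater reliability figures above are simple percent-agreement values. A sketch of that calculation, using hypothetical category labels in place of the actual coding data, might look as follows.

```python
def percent_agreement(rater_a, rater_b):
    """Percentage of comments assigned identical codes by two raters."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return 100 * matches / len(rater_a)

# Hypothetical category codes for four comments
rater_a = ["engagement", "stress", "formative assessment", "memory"]
rater_b = ["engagement", "stress", "formative assessment", "participation"]
print(f"{percent_agreement(rater_a, rater_b):.0f}% agreement")  # 75% agreement
```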
Teacher Questions
Teachers were asked to indicate their gender, years of
teaching experience, comfort level with computer tech-
nology, and whether they used an ARS for formative and/
or summative assessment. Finally, teachers were asked to
respond to open ended written questions about the overall
impact of ARS on their classroom and suggestions for
future teachers.
Results
Benefits to Using an ARS
Overall Attitudes
Students were somewhat positive about using ARSs,
with 62% agreeing that they preferred to use an ARS
(M=5.0 ±0.26, possible range was 1–7) and 42% indi-
cating that their class was better when an ARS was used
(M=4.5 ±0.21, see Table 1). General comments offered
from the open-ended survey question were mixed, with the
majority being neutral (n=13). Representative neutral
comments about ARS were ‘‘It didn’t affect me’’ and ‘‘It
did not make much of a difference’’. Typical negative
comments (n=5) were ‘‘Clickers are a waste of time’’ and
‘‘Horrendous—takes up too much time’’. Characteristic
positive comments were ‘‘It was good’’ and ‘‘Amazing’’.
Table 1 Summary of ARS quantitative student survey questions (n = 213)

Question                                                                    n    Disagree (%)a  Agree (%)b  M    SD
Overall attitude
1. I would prefer to use clickers                                           209  23             62          5.0  1.9
2. When clickers were used, the class was better                            213  21             42          4.5  1.6
Student involvement
3. I was more engaged in the lesson when clickers were used                 213  13             70          5.2  1.6
4. I was more motivated when clickers were used                             213  17             63          5.0  1.7
5. I participated more than I normally would when clickers were used       213  19             62          5.0  1.6
6. Using clickers generated more class discussion                           213  20             53          4.6  1.5
Assessment
7. Using clickers was a good way to test my knowledge                       210  12             74          5.3  1.5
8. I liked seeing what other students in the class selected for answers    211  13             56          4.8  1.5
9. I did not feel bad when most students got an answer right and I didn't  213  35             39          4.3  1.8
10. I liked using clickers for tests                                        211  29             33          4.1  1.8
Learning
11. I learned more when clickers were used                                  211  24             43          4.3  1.5

a Combining ‘‘Slightly Disagree’’, ‘‘Disagree’’ and ‘‘Strongly Disagree’’ responses
b Combining ‘‘Slightly Agree’’, ‘‘Agree’’ and ‘‘Strongly Agree’’ responses
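The Disagree/Agree percentages in Table 1 pool the three response categories on each side of the scale, as the table footnotes indicate. A short sketch of that data reduction, assuming the standard seven-point layout (1–3 disagree, 4 neutral, 5–7 agree) and using hypothetical responses:

```python
import numpy as np

def summarize_item(responses: np.ndarray):
    """Return (% disagree, % agree, M, SD) for one seven-point survey item."""
    n = len(responses)
    pct_disagree = 100 * (responses <= 3).sum() / n  # pools categories 1-3
    pct_agree = 100 * (responses >= 5).sum() / n     # pools categories 5-7
    return pct_disagree, pct_agree, responses.mean(), responses.std(ddof=1)

item = np.array([7, 5, 4, 6, 2, 5, 3, 6, 5, 1], dtype=float)  # hypothetical
print(summarize_item(item))
```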
Of the 17 comments made about student reaction to
using a different method of teaching, 11 were positive (see
Table 2). Sample positive comments were as follows:
It’s fun to use and adds a certain aspects to the class
that you wouldn’t have in a regular class.
I learned better because it promoted friendly com-
petition within the class. Also it made reviewing
easier and more fun. It is better than reading ques-
tions out of a textbook.
I learn better than sitting and listening and it
improves my multiple choice skills.
Student Involvement
Students indicated that they were more involved in science
class when an ARS was used. Based on the survey data, 70%
of the students agreed that they were more engaged
(M=5.2 ±0.21), 63% agreed they were more motivated
(M=5.0 ±0.23), 62% agreed that they participated more
than they normally would (M=5.0 ±0.21), and 53% agreed
that more class discussion was generated (M=4.6 ±0.20).
Student comments regarding engagement were univer-
sally positive (M = 1.24; the scale ranged from -2 to +2).
Typical comments were as follows:
It motivated me and helped me a little with constantly
paying attention.
Using the clickers made the lesson more enjoyable.
Keeps me awake and helps to keep me interested in
the subjects we are learning.
Fewer comments were made about participation than
engagement (n=17), although the mean rating was high
(M=0.94) and most comments were positive (n=15).
Typical comments about participation included:
I’ve only used them once but I really liked them. I
thought it was a great way to be involved. Everybody
was doing something.
It forced people who normally don’t feel comfortable
participating in class discussion to participate.
Only 12 comments were made about paying attention,
but the mean rating was high (M=1.00). Sample atten-
tion-based comments were as follows:
I am forced to pay more attention in class which helps
me learn better.
I was more inclined to listen in class because I knew
that the lesson would be shortened and therefore I
could focus more on the lesson.
Table 2 Summary of student comments about audience response systems (n = 255)

Category                     Mean (SD)      No. negative  No. positive  Total
                                            comments      comments      comments
Overall attitude
General comments             -0.04 (0.93)   5             5             23
Different methods used        0.35 (0.93)   5             11            17
Student involvement
Engagement                    1.24 (0.43)   0             42            42
Participation                 0.94 (0.43)   0             15            17
Paid attention more           1.00 (0.00)   0             12            12
Discussion                    1.00 (0.00)   0             2             2
Stress                       -1.00 (1.10)   9             2             11
Assessment
Formative assessment          1.00 (0.34)   0             17            18
Compare with others           1.00 (0.00)   0             4             4
Feedback                      1.00 (0.00)   0             3             3
Wrong answer (reaction)      -1.00 (0.00)   2             0             2
Learning
Learning process              0.52 (0.91)   3             16            29
Review previous concepts      1.20 (0.41)   0             15            15
Memory                        0.62 (1.06)   1             7             8
Teacher explained better      1.00 (0.00)   0             2             2
Learning performance         -0.71 (0.97)   23            6             31
Technology issues            -1.00 (0.93)   7             1             8
Assessment
Students liked using an ARS for formative assessment. Just
under 75% of students agreed that using ARSs was a ‘‘good
way’’ to test their knowledge (M=5.3 ±0.20). Compar-
ing answers with other students (M=4.8 ±0.20) was
rated somewhat lower, with 56% agreeing that they liked to
see the responses of their peers. Students were split on how
they felt when they selected a wrong answer and the rest of
the class chose a correct answer—35% felt bad while 39%
did not feel bad. Finally, only 33% of the students agreed
that they liked using an ARS for formal tests or summative
assessment (M=4.1 ±0.24).
Comments about formative assessment were rated high
(M = 1.00); however, few comments were made about
comparing responses or not getting a correct answer. The
following are typical comments made about formative
assessment:
Helped me by testing my knowledge and getting to
know where the class and I are at.
Clickers were fairly good in a normal class and fairly
good way to test our knowledge. However, on tests, it
is very inefficient and can waste time.
We found out what common mistakes were made
more commonly, then discussed why that was the
wrong answer.
Learning
Only 43% of the students agreed that they learned more
when an ARS was used. The mean score for this item was
4.3 (±0.20), the second lowest of all survey questions.
However, open-ended comments revealed a different
picture. Ratings for comments about learning process
(M=0.52), reviewing previous concepts (M=1.20),
helping to improved memory (M=0.62), and teacher
explanations (M=1.00) were relatively high. However,
the mean rating for learning performance comments was
negative (M=-0.71). It is important to note, though, that
all of the negative comments about learning performance
were made when ARSs were used for formal testing, as
opposed to formative assessment. Comments made about
the overall learning process (n=29) and reviewing con-
cepts (n=15) were the most common. Typical remarks
made about the learning process were:
Helped me figure out how much I learned.
Using the clickers helped me get a better under-
standing of some of the concepts that I did not
understand before.
Since clickers were used to review for a test, I was
able to obtain answers to test-like questions easily
giving me insight as to what more studying was
needed.
Representative comments about using an ARS to review
concepts included,
It reinforced the idea of physics just before the test.
However, I found the practice test we did with the
clickers very helpful.
It was great review of the units and topics we just
covered, and I find I remember things better if I get
them wrong first on a test and this system always
seems like a test for me so I feel I learn more this
way.
Comments made about an ARS helping to improve
memory were less frequent (n=8) but worth noting.
It actually helped me remember facts we have been
learning better.
Because we could see the correct answer immediately
after it helped me remember the answer and get my
questions answered.
Challenges Experienced Using ARSs
Technology
Attitudes toward ARS technology were not assessed using
the survey data, but the mean rating for comments was
negative (M = -1.00), albeit based on only eight students. The
majority of student complaints concerned whether the
remote devices actually worked.
They are too difficult to use and it was difficult to tell
when the correct answer was chosen.
Took more time to learn to use them than time spent
learning; caused confusion; didn’t like pointing at the
receiver; not good in a test situation.
Overall Attitudes
While most ‘‘overall’’ attitudes toward ARSs were neutral,
comments from five students were exceptionally negative
(e.g., ‘‘Clickers are a waste of time’’ or ‘‘Horrendous—
takes up too much time’’). Other comments focussed on
using a new method of learning.
I prefer to listen to the teacher—using the clickers
constantly would become tedious.
It went slowly and I don’t like the time limit
questions.
Student Involvement
As stated earlier, most students were enthusiastically
involved when an ARS was used; however, some students
were stressed. Typical comments referred to feeling
pressure during a test.
Using the clickers on a test is very stressful. Not only
are we stressed because of the test conditions, but it’s
difficult to work the clickers.
The clickers have made me nervous in test situations
and have made me lose confidence in situation where
immediate feedback was given. The clickers have had
an altogether negative feedback on my learning
experience in biology class.
Assessment
The survey indicated that 35% of secondary students ‘‘felt
bad’’ selecting an incorrect response when almost everyone
else in the class selected a correct response (see Table 1).
Most comments about assessment (see ‘‘Benefits’’) and
ARSs were positive.
Learning Performance
Even though reviewing concepts, improving memory, and
obtaining explanations from the teacher were rated as
positive consequences of using an ARS, a small group of
students (n=23) were adamant about the negative impact
of the ARSs on learning performance. Representative
comments included:
Frustrating, stressful to use in test situations.
The time constraints compared to a normal test were
very annoying since the slow students miss out and
the speedy students get bored.
The clickers are NOT good to use on long tests;
they’re frustrating and hard to check your answers.
Teaching Strategy Used with ARSs
Comprehensive information was not collected on how
ARSs were used in secondary science classrooms; how-
ever, three general assessment-based strategies were
reported by teachers: using an ARS for formative assess-
ment only (formative, n = 130), using an ARS for both
summative tests and formative assessment (mixed,
n = 45), and using an ARS for summative assessment only
(summative, n=38). Means for all items on the ARS
student attitude scale as a function of strategy selected are
presented in Table 3. Mean values for all attitude scale
items were lowest when a summative assessment strategy
was used and highest when a formative assessment strategy
was employed.
A MANOVA was run to compare formative, mixed, and
summative approaches to using the ARS (Table 4). Using
an ARS for formative assessment was rated significantly
more positively than using an ARS for summative assess-
ment on all 11 Likert scale items in the ARS attitude scale.
Using an ARS for formative assessment also resulted in
significantly higher scores on most survey items when
compared to a mixed approach (formative & summative)
(Table 4).
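For readers who wish to replicate this kind of analysis, the sketch below runs the per-item comparison reported in Table 4: a one-way test across the three strategy groups followed by pairwise Scheffe post hoc contrasts (the study itself reports an overall MANOVA; this sketch covers only the per-item follow-up). It is not the authors' code, and the ratings are randomly generated placeholders sized to match the strategy groups (n = 38, 45, and 130).

```python
import numpy as np
from scipy import stats

def anova_with_scheffe(groups, alpha=0.05):
    """One-way ANOVA plus pairwise Scheffe post hoc tests for k groups."""
    k = len(groups)
    n_total = sum(len(g) for g in groups)
    grand_mean = np.concatenate(groups).mean()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    ms_within = ss_within / (n_total - k)
    f_stat = (ss_between / (k - 1)) / ms_within
    f_crit = stats.f.ppf(1 - alpha, k - 1, n_total - k)
    pairs = []
    for i in range(k):
        for j in range(i + 1, k):
            diff = groups[i].mean() - groups[j].mean()
            # Scheffe F for the pairwise contrast, compared against f_crit
            f_pair = diff ** 2 / (
                ms_within * (1 / len(groups[i]) + 1 / len(groups[j])) * (k - 1)
            )
            pairs.append((i, j, round(diff, 2), bool(f_pair > f_crit)))
    return f_stat, pairs

# Hypothetical seven-point ratings for one survey item, by strategy group
rng = np.random.default_rng(0)
summative = rng.integers(1, 8, 38).astype(float)
mixed = rng.integers(1, 8, 45).astype(float)
formative = rng.integers(1, 8, 130).astype(float)
f_stat, pairs = anova_with_scheffe([summative, mixed, formative])
print(f"F = {f_stat:.1f}", pairs)
```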
Teachers’ Comments
When science teachers were asked about the overall impact
of ARSs, two main themes emerged: learning and moti-
vation. With respect to learning, comments focussed on the
value of feedback, not having to write everything down,
and examining areas of weakness.
The immediacy of the feedback permits me to iden-
tify difficulties and address them quickly.
My class really struggles with written work so the
clickers were very popular.
By using clickers as an assessment tool, I can focus
lessons on areas of weakness.
Regarding motivation, using an ARS appeared to
increase the overall energy level in the science classroom.
Clickers have made my review sessions very exciting
and motivating for my students.
They benefited—the positive energy was good. For
example they would say yeah we’re using the clickers
today instead of your usual boring lesson!
When prompted to give suggestions to future teachers
about using ARSs, science teachers in this study offered the
following observations. First, setting up an ARS is time-
consuming, as is creating good questions. Second, keep
the clicker sessions short to avoid losing the focus of a
lesson. Finally, the remote devices sometimes do not work
and students have difficulty registering responses; using
the more reliable radio-frequency (as opposed to infrared)
remote devices was encouraged. It is important to
remember that these comments were collected from only
seven secondary school science teachers.
Discussion
The purpose of this study was to conduct a formative
analysis of ARS use in secondary school science class-
rooms. Three areas were examined: benefits, challenges,
and teaching strategies.
Benefits
Overall Attitudes
The survey data suggests that science students were mod-
erately positive about using ARSs, a result that is partially
Table 3 Mean ARS survey item scores as a function of strategy used (n = 213)

Survey item                                                                 Summative   Mixed (formative  Formative
                                                                            assessment  & summative)      assessment
                                                                            M (SD)      M (SD)            M (SD)
Overall attitudes
1. I would prefer to use clickers                                           3.3 (1.7)   4.3 (1.7)         5.6 (1.7)
2. When clickers were used, the class was better                            3.4 (1.3)   4.6 (1.3)         4.8 (1.7)
Student involvement
3. I was more engaged in the lesson when clickers were used                 3.8 (1.5)   4.9 (1.4)         5.6 (1.4)
4. I was more motivated when clickers were used                             3.6 (1.6)   4.6 (1.5)         5.6 (1.4)
5. I participated more than I normally would when clickers were used       4.1 (1.5)   4.6 (1.5)         5.4 (1.5)
6. Using clickers generated more class discussion                           3.5 (1.3)   4.8 (1.2)         4.8 (1.6)
Assessment
7. Using clickers was a good way to test my knowledge                       4.3 (1.6)   5.2 (1.5)         5.7 (1.2)
8. I liked seeing what other students in the class selected for answers    3.5 (1.0)   5.4 (1.5)         5.0 (1.4)
9. I did not feel bad when most students got an answer right and I didn't  4.3 (1.5)   3.3 (1.5)         4.6 (1.8)
10. I liked using clickers for tests                                        2.3 (1.7)   3.7 (1.8)         4.8 (1.4)
Learning
11. I learned more when clickers were used                                  3.1 (1.4)   4.1 (1.3)         4.6 (1.5)
Table 4 MANOVA examining attitude toward ARS as a function of teaching strategy

Source                                                                      df  SS     F      Scheffe's post hoc analysis (p < .05)
Overall attitude
1. I would prefer to use clickers                                           2   173.8  30.7*  Formative > Mixed & Summative
2. When clickers were used, the class was better                            2   58.7   12.8*  Formative & Mixed > Summative
Student involvement
3. I was more engaged in the lesson when clickers were used                 2   97.6   24.5*  Formative > Mixed > Summative
4. I was more motivated when clickers were used                             2   118.1  26.5*  Formative > Mixed & Summative
5. I participated more than I normally would when clickers were used       2   62.5   13.8*  Formative > Mixed & Summative
6. Using clickers generated more class discussion                           2   49.8   11.8*  Formative & Mixed > Summative
Assessment
7. Using clickers was a good way to test my knowledge                       2   58.0   15.6*  Formative & Mixed > Summative
8. I liked seeing what other students in the class selected for answers    2   84.6   23.3*  Formative & Mixed > Summative
9. I did not feel bad when most students got an answer right and I didn't  2   48.5   8.3*   Formative & Mixed > Summative
10. I liked using clickers for tests                                        2   164.6  35.3*  Formative > Mixed > Summative
Learning
11. I learned more when clickers were used                                  2   59.3   29.7*  Formative & Mixed > Summative

* p < .001
consistent with previous studies conducted in higher edu-
cation. Part of the problem with assessing overall attitudes
is that positive and negative attitudes can cancel each other
out, leading to a neutral result. As Fies and Marshall (2006)
suggest, general impressions and anecdotal comments
about attitudes toward ARSs need to be supplemented by a
more detailed, focussed analysis.
Student Involvement
Relative to assessment and learning, student involvement
was rated highest in both the survey data and the open-
ended comments. Data collected strongly suggested that
ARSs helped increase engagement, participation and, to a
lesser extent, attention paid in class. These results are
consistent with those observed in higher education
(e.g., Caldwell 2007; Fies and Marshall 2006; Judson and
Sawada 2002; Simpson and Oliver 2007). It is unclear,
though, whether increases in student involvement are a
superficial reaction to the excitement of using TV-like
remote devices in a game-like atmosphere, or a reflection of
a more profound change in student learning and cognitive
engagement. The next stage of research that needs to be
done with respect to student involvement is to examine
what in particular is engaging about ARSs, the quality of
the participatory efforts by students, and why students are
more focussed when an ARS is used.
Assessment
A majority of secondary school science students appreci-
ated the use of ARSs for formative assessment, particularly
with respect to checking their understanding and reviewing
for tests. Comparing answers with other students and get-
ting general feedback played a secondary role. In addition,
most teacher comments enthusiastically endorsed the use of
ARSs for formative assessment. These results are consis-
tent with those observed in higher education (Beatty 2004;
Bergtrom 2006; Brewer 2004; Caldwell 2007; Draper and
Brown 2004; Dufresne and Gerace 2004; Greer and Hea-
ney 2004; Jackson et al. 2005; Siau et al. 2006; Simpson
and Oliver 2007; Stuart et al. 2004). The success of using
formative assessment with ARSs stands in marked contrast
to students’ resistance to using summative assessment.
Learning
A number of students commented that using ARSs to
review concepts improved the overall learning process. The
impact of using ARSs for improving recall of material
presented was noteworthy, but commented on by only a
few students. While challenges will be discussed in the
next section, it is important to note that perceptions of
an improved learning process did not necessarily lead to
increased learning performance. At least 23 students felt
that learning performance was hampered, sometimes
appreciably, when ARSs were used to deliver summative
assessment. Simply using an ARS does not guarantee
success; the choice of pedagogy clearly affects students'
perception of learning.
Challenges
Overall Attitudes
It is evident that a few students had difficulty adjusting to
the newness of using an ARS. While this type of problem
was experienced by less than 2% of the student sample, it is
a challenge that has also been noted in higher education.
The suggested remedy has been to run fun practice ses-
sions before using ARSs for teaching, clearly explain the
rationale for using an ARS, and identify the intended
benefits for students (Crouch and Mazur 2001; Trees and
Jackson 2007).
Student Involvement
While a majority of students were more involved when
ARSs were used, some students were decidedly stressed,
particularly in a summative testing situation. The com-
ments about increased stress were not frequent, but they
were passionate. Given the student frustration level when
an ARS was used for summative tests (see ‘‘Teaching
Strategy Used with ARSs’’) it might be prudent to use this
technology for formative assessment purposes only. Stu-
dent anxiety might also be reduced by emphasizing
learning rather than having to get the right answer.
Learning
The most noteworthy learning challenge when using ARSs
was the negative impact on learning performance perceived
by students who were asked to use this tool to complete
graded tests. This practice received the lowest rating of all
survey items. However, this study was a preliminary
investigation of the use of ARSs in secondary school
science classrooms, and more thought and analysis needs to
be directed toward the use of summative assessment.
Teaching Strategy Used with ARSs
Only one previous study reviewed the use of ARSs in a
K-12 environment (Penuel et al. 2006). These researchers
observed that teachers naturally migrated to one of two
strategies when using ARSs: formative (instructional) or
summative (testing) use. A similar pattern was observed in
the current study with secondary school science teachers.
Most of the science teachers in the current study chose to
use an ARS as a formative assessment tool. Based on the
survey results, students overwhelmingly preferred forma-
tive over summative use. Overall attitude, student
involvement and the value of ARSs as effective learning
tools were all rated significantly lower when students
experienced summative as opposed to formative assess-
ment. No previous research has compared
formative and summative approaches, so further research is
needed to confirm these initial findings.
Recommendation to Educators
The current study is a formative analysis of the use of ARSs
in secondary school science classrooms, so it would be
inappropriate to offer unequivocal advice about their use in
the classroom. That said, there are several tentative sug-
gestions that may be worth noting.
First, many secondary school science students did not
respond well when an ARS was used as a test-taking tool.
Student involvement and learning performance were per-
ceived as decreasing. In the interest of minimizing the
negative impact of this tool, at least in its initial imple-
mentation in secondary schools, it might be safer to limit
the use of ARSs to formative assessment.
Second, it may be important to explain why an ARS is
being used in the classroom. This practice might help the
small, but vocal group of students who resisted this new
method of learning. Caldwell (2007) and Trees and Jackson
(2007) note that gaining full student acceptance requires
explaining why ARSs are being used and what the tech-
nology is expected to achieve.
Third, teachers should be aware of the time needed to
set up an ARS and, more importantly, to create effective
questions. While only two teachers commented on this
issue, previous research in higher education suggests that
question development is very time consuming (Allen and
Tanner 2005; Beatty et al. 2006; Fagan et al. 2002; Free-
man et al. 2007; Paschal 2002).
Finally, because stress was experienced by students
when the technology did not work, teachers should test the
ARS equipment ahead of time in order to ensure that all
remote devices respond properly. Batteries, for example,
might need to be changed.
Caveats and Future Research
This study investigated the use of ARSs in secondary
school science classrooms. Two principal data collection
tools were used—survey questions and open-ended com-
ments. The design and analysis of the data collection tools
were based on a thorough review of the literature exam-
ining ARS use in higher education. Nonetheless, at least
three caveats need to be considered when interpreting the
current results.
First, the data are intended to provide a starting point for
investigating use of ARSs in secondary school science
classrooms. More detailed examination is required in key
areas such as identifying sources of student engagement,
the effect of different teaching strategies, and why long-
term memory might be enhanced with the use of ARSs.
Second, the results are based on limited use of an ARS:
once or twice over a one-month period. The impact could
be distinctly different if an ARS was used regularly. For
example, engagement may be reduced when students
become more familiar with ARSs. On the other hand, the
impact of an ARS on learning may be more significant with
increased use.
Finally, learning performance needs to be examined in
more detail. While systematic data on the perceived quality
of learning was collected for the current study, evaluation
of student performance is needed to firmly establish the
actual learning impact of ARSs.
References
Abrahamson L (2006) A brief history of networked classrooms:
effects, cases, pedagogy, and implications. In: Banks DA (ed)
Audience response systems in higher education. Information
Science Publishing, Hershey, pp 1–25
Allen D, Tanner K (2005) Infusing active learning into the large-
enrolment biology class: seven strategies, from the simple to
complex. Cell Biol Educ 4:262–268. doi:10.1187/cbe.05-08-
0113
Beatty I (2004) Transforming student learning with classroom
communication systems. EDUCAUSE Res Bull 3:1–13.
Retrieved 3 Nov 2007 from http://www.educause.edu/ir/library/
pdf/ERB0403.pdf
Beatty ID, Leonard WJ, Gerace WJ, Dufresne RJ (2006) Question
driven instruction: teaching science (well) with an audience
response system. In: Banks DA (ed) Audience response systems
in higher education. Information Science Publishing, Hershey,
pp 96–115
Bergtrom G (2006) Clicker sets as learning objects. Interdiscip
J Knowl Learn Objects 2. Retrieved 3 Nov 2007 from http://
ijklo.org/Volume2/v2p105-110Bergtrom.pdf
Brewer CA (2004) Near real-time assessment of student learning and
understanding in biology courses. Bioscience 54(11):1034–1039.
doi:10.1641/0006-3568(2004)054[1034:NRAOSL]2.0.CO;2
Burton K (2006) The trial of an audience response system to facilitate
problem-based learning in legal education. In: Banks DA (ed)
Audience response systems in higher education. Information
Science Publishing, Hershey, pp 265–276
Caldwell JE (2007) Clickers in the large classroom: current research
and best-practice tips. Life Sci Educ 6(1):9–20
Carnaghan C, Webb A (2007) Investigating the effects of group
response systems on student satisfaction, learning, and engage-
ment in accounting education. Issues Acc Educ 22(3):391–409.
doi:10.2308/iace.2007.22.3.391
Crouch CH, Mazur E (2001) Peer instruction: ten years of experience
and results. Am J Phys 69(9):970–977. doi:10.1119/1.1374249
D’Inverno R, Davis H, White S (2003) Using a personal response
system for promoting student interaction. Teach Math Its Appl
22(4):163–169. doi:10.1093/teamat/22.4.163
Draper SW, Brown MI (2004) Increasing interactivity in lectures
using an electronic voting system. J Comput Assist Learn
20(2):81–94. doi:10.1111/j.1365-2729.2004.00074.x
Dufresne RJ, Gerace WJ (2004) Assessing-to-learn: formative
assessment in physics instruction. Phys Teach 42:428–433. doi:
10.1119/1.1804662
Fagan AP, Crouch CH, Mazur E (2002) Peer instruction: results from
a range of classrooms. Phys Teach 40(4):206–209. doi:
10.1119/1.1474140
Fies C, Marshall J (2006) Classroom response systems: a review of
the literature. J Sci Educ Technol 15(1):101–109. doi:10.1007/
s10956-006-0360-1
Freeman M, Bell A, Comerton-Forder C, Pickering J, Blayney P (2007)
Factors affecting educational innovation with in class electronic
response systems. Australas J Educ Technol 23(2):149–170
Greer L, Heaney PJ (2004) Real-time analysis of student compre-
hension: an assessment of electronic student response technology
in an introductory earth science course. J Geosci Educ 52(4):
345–351
Hatch J, Jensen M, Moore R (2005) Manna from heaven or clickers
from hell. J Coll Sci Teach 34(7):36–39
Hinde K, Hunt A (2006) Using the personal response system to
enhance student learning: some evidence from teaching eco-
nomics. In: Banks DA (ed) Audience response systems in higher
education. Information Science Publishing, Hershey, pp 140–154
Jackson M, Ganger AC, Bridge PD, Ginsburg K (2005) Wireless
handheld computers in the undergraduate medical curriculum.
Med Educ Online 10(5). Retrieved 3 Nov 2007 from http://www.
med-ed-online.org/pdf/t0000062.pdf
Jones C, Connolly M, Gear A, Read M (2001) Group integrative
learning with group process support technology. Br J Educ
Technol 32(5):571–581. doi:10.1111/1467-8535.00226
Judson E, Sawada D (2002) Learning from past and present:
electronic response systems in college lecture halls. J Comput
Math Sci Teach 21(2):167–181
Kaleta R, Joosten T (2007) Student response systems: a university of
Wisconsin system study of clickers. EDUCAUSE Res Bull
2007(10):1–12
Kay RH (2008a) Audience response systems attitude scale. Retrieved 22
Nov 2008 from http://faculty.uoit.ca/kay/papers/arshs/AppendixA_
ARS.pdf
Kay RH (2008b) Coding scheme for secondary school students
comments about audience response systems. Retrieved 22 Nov
2008 from http://faculty.uoit.ca/kay/papers/arshs/AppendixB_
ARS.pdf
Kennedy GE, Cutts QI (2005) The association between students’ use
of electronic voting systems and their learning outcomes.
J Comput Assist Learn 21(4):260–268. doi:10.1111/j.1365-2729.
2005.00133.x
Latessa R, Mouw D (2005) Use of audience response system to
augment interactive learning. Fam Med 37(1):12–14. Retrieved
3 Nov 2007 from http://www.stfm.org/fmhub/fm2005/January/
Robyn12.pdf
McCabe M (2006) Live assessment by questioning in an interactive
classroom. In: Banks DA (ed) Audience response systems in
higher education. Information Science Publishing, Hershey,
pp 276–288
Nicol DJ, Boyle JT (2003) Peer instruction versus class-wide
discussion in large classes: a comparison of two interaction
methods in the wired classroom. Stud High Educ 28(4):457–473.
doi:10.1080/0307507032000122297
Paschal CB (2002) Formative assessment in physiology teaching
using a wireless classroom communication system. Adv Physiol
Educ 26(4):299–308
Pelton LF, Pelton T (2006) Selected and constructed response systems
in mathematics. In: Banks DA (ed) Audience response systems
in higher education. Information Science Publishing, Hershey,
pp 175–186
Penuel WR, Abrahamson L, Roschelle J (2006) Theorizing the
transformed classroom: sociocultural interpretation of the effects
of audience response systems in higher education. In: Banks DA
(ed) Audience response systems in higher education. Information
Science Publishing, Hershey, pp 187–208
Pradhan A, Sparano D, Ananth CV (2005) The influence of an
audience response system on knowledge retention: an applica-
tion to resident education. Am J Obstet Gynecol 193(5):1827–
1830. doi:10.1016/j.ajog.2005.07.075
Preszler RW, Dawe A, Shuster CB, Shuster M (2007) Assessment of
the effects of student response systems on student learning and
attitudes over a broad range of biology courses. CBE Life Sci
Educ 6(1):29–41. doi:10.1187/cbe.06-09-0190
Reay NW, Bao L, Li P, Warnakulasooriya R, Baugh G (2005) Toward
the effective use of voting machines in physics lectures. Am
J Phys 73(6):554–558. doi:10.1119/1.1862638
Schackow TE, Milton C, Loya L, Friedman M (2004) Audience
response system: effect on learning in family medicine residents.
Fam Med 36:496–504
Sharma MD, Khachan J, Chan B, O’Byrne J (2005) An investigation
of the effectiveness of electronic classroom communication
systems in large lectures. Australas J Educ Technol 21(2):
137–154
Siau K, Sheng H, Nah F (2006) Use of classroom response system to
enhance classroom interactivity. IEEE Trans Educ 49(3):398–
403. doi:10.1109/TE.2006.879802
Simpson V, Oliver M (2007) Electronic voting systems for lectures
then and now: a comparison of research and practice. Australas
J Educ Technol 23(2):187–208
Slain D, Abate M, Hodges BM, Stamatakis MK, Wolak S (2004) An
interactive response system to promote active learning in the
doctor of pharmacy curriculum. Am J Pharm Educ 68(5):1–9
Stuart SAJ, Brown MI, Draper SW (2004) Using an electronic voting
system in logic lectures: one practitioner’s application. J Comput
Assist Learn 20(2):95–102. doi:10.1111/j.1365-2729.2004.
00075.x
Trees AR, Jackson MH (2007) The learning environment in clicker
classrooms: student processes of learning and involvement in
large university-level courses using student response systems.
Learn Media Technol 32(1):21–40. doi:10.1080/17439880601141179
Uhari M, Renko M, Soini H (2003) Experiences of using an
interactive audience response system in lectures. BMC Med
Educ 3(12):1–6