Handbook of Research on
New Media Literacy at the
K-12 Level:
Issues and Challenges
Leo Tan Wee Hin
National Institute of Education, Nanyang Technological University Singapore
R. Subramaniam
National Institute of Education, Nanyang Technological University Singapore
Hershey • New York
Information Science Reference
Volume II
Director of Editorial Content: Kristin Klinger
Senior Managing Editor: Jamie Snavely
Managing Editor: Jeff Ash
Assistant Managing Editor: Carole Coulson
Typesetter: Chris Hrobak
Cover Design: Lisa Tosheff
Printed at: Yurchak Printing Inc.
Published in the United States of America by
Information Science Reference (an imprint of IGI Global)
701 E. Chocolate Avenue,
Hershey PA 17033
Tel: 717-533-8845
Fax: 717-533-8661
E-mail: cust@igi-global.com
Web site: http://www.igi-global.com/reference
and in the United Kingdom by
Information Science Reference (an imprint of IGI Global)
3 Henrietta Street
Covent Garden
London WC2E 8LU
Tel: 44 20 7240 0856
Fax: 44 20 7379 0609
Web site: http://www.eurospanbookstore.com
Copyright © 2009 by IGI Global. All rights reserved. No part of this publication may be reproduced, stored or distributed in any form or by
any means, electronic or mechanical, including photocopying, without written permission from the publisher.
Product or company names used in this set are for identification purposes only. Inclusion of the names of the products or companies does
not indicate a claim of ownership by IGI Global of the trademark or registered trademark.
Library of Congress Cataloging-in-Publication Data
Handbook of research on new media literacy at the K-12 level : issues and challenges / Leo Tan Wee Hin and R. Subramaniam, editors.
p. cm.
Includes bibliographical references and index.
Summary: "This book provides coverage of significant issues and theories currently combining the studies of technology and literacy"--
Provided by publisher.
ISBN 978-1-60566-120-9 (hardcover) -- ISBN 978-1-60566-121-6 (ebook) 1. Mass media in education--Handbooks, manuals, etc. 2. Media
literacy--Handbooks, manuals, etc. 3. Educational technology--Handbooks, manuals, etc.
I. Tan, Leo Wee Hin, 1944- II. Subramaniam, R. (Ramanathan), 1952-
LB1043.H329 2009
302.23071--dc22
2009003229
British Cataloguing in Publication Data
A Cataloguing in Publication record for this book is available from the British Library.
All work contributed to this book is new, previously-unpublished material. The views expressed in this book are those of the authors, but not
necessarily of the publisher.
Chapter XLV
A Formative Analysis of
Interactive Classroom
Communication Systems Used
in Secondary School Classrooms
Robin Kay
University of Ontario Institute of Technology, Canada
Copyright © 2009, IGI Global, distributing in print or electronic forms without written permission of IGI Global is prohibited.
ABSTRACT
Extensive research has been done on the use of Interactive Classroom Communication Systems (ICCS) in higher education, but not in secondary schools. This chapter provides a detailed overview of the benefits, challenges, and strategies observed when using ICCS in higher education. This overview is then used to analyze and interpret quantitative and qualitative data collected from 659 secondary school students. The main benefits that students identified for using ICCS were increased use of formative assessment, higher engagement and motivation, enhanced participation, and improved focus during class. Students were relatively neutral, though, with respect to whether ICCS improved class discussion or learning. The main challenges that students reported were increased stress and uncertainty of answers when ICCS were used in a formal test situation. Males were significantly more positive toward ICCS, as were students who had higher comfort levels with technology. When ICCS were used for formative assessment as opposed to formal tests, students were significantly more accepting. The chapter concludes with suggestions for educators and future research.
OVERVIEW AND HISTORY
Interactive Classroom Communication Systems
(ICCS) allow students to respond to multiple
choice questions using a remote control device.
After students click in their responses, the results
are instantly aggregated and displayed in chart
form, usually a histogram. Responses are often
anonymous but can be linked to specific students
for evaluation purposes. A comprehensive review
of the literature reveals no fewer than 26 different
labels for ICCS including audience response sys-
tems (e.g., Caldwell, 2007), classroom response
systems (e.g., Siau, et al., 2006), electronic voting
systems (e.g., Simpson & Oliver, 2007), personal
response systems (e.g., Brewer, 2004), clickers
(e.g., Bergtrom, 2006) and zappers (e.g., d’Inverno,
Davis, & White, 2003). The label “Interactive
Classroom Communication Systems” was cho-
sen for this chapter because the tool promotes
“interactivity”, is used in a “classroom” setting,
helps "communicate" information to both students
and teachers, and is used most effectively within
a “system” of thoughtful pedagogical learning
strategies.
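To make the basic workflow described at the start of this section concrete (students submit multiple choice responses from remote devices, and the responses are instantly aggregated and displayed as a distribution), the sketch below tallies a hypothetical batch of responses in Python. The device identifiers and options are illustrative assumptions, not a description of any particular commercial ICCS product.

# Minimal sketch of the ICCS cycle described above: each remote submits a
# choice for a multiple-choice question, and responses are aggregated into a
# simple text histogram. Device IDs and options are hypothetical placeholders.
from collections import Counter

# (device_id, selected_option) pairs as they might arrive from a receiver
responses = [("r01", "A"), ("r02", "C"), ("r03", "B"), ("r04", "C"),
             ("r05", "C"), ("r06", "A"), ("r07", "D"), ("r08", "C")]

counts = Counter(option for _, option in responses)

# Display an aggregated, anonymous summary; the per-device pairs above are
# what would allow answers to be linked to specific students for evaluation.
for option in "ABCD":
    n = counts.get(option, 0)
    print(f"{option}: {'#' * n} ({n})")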
Judson and Sawada (2002) noted in a com-
prehensive review of early work on ICCS that
student attitudes toward these systems in higher education were universally positive. However, the cost of using ICCS at that time was prohibitive. It was not until 1992 that the first popular ICCS became commercially available. A new generation of easier-to-use, affordable ICCS started gaining
acceptance at universities in 1999 (Beatty, 2004).
Several research reviews have been completed
examining the use of ICCS (Caldwell, 2007; Fies
& Marshall, 2006; Judson & Sawada, 2002); however, all but one of the numerous papers reviewed
(Penuel, Boscardin, Masyn & Crawford, 2006)
focussed on higher education. Little is known
about the use of ICCS in secondary schools.
The purpose of this chapter is to discuss
the potential benefits, challenges, and strategies
associated with using ICCS, then to present and
evaluate the feedback and comments from 659
secondary school students who used ICCS over
a period of one month.
BENEFITS TO USING ICCS
General Student Attitudes
Prior to 1992, students' overall acceptance of ICCS
was quite high (Judson & Sawada, 2002), although
much of the evidence presented was anecdotal.
A more recent analysis of student attitudes is
consistent with previous results. There is con-
siderable quantitative and qualitative evidence
to suggest that higher education students have
positive attitudes toward using ICCS (Caldwell,
2007; Carnaghan & Webb, 2006; Draper & Brown,
2004; Judson & Sawada, 2002; Kaleta & Joosten,
2007; Paschal, 2002; Poulis et al., 1998; Prezler,
Dawe, Shuster, & Shuster, 2007; Reay, Bao, Li,
Warnakulasooriya, & Baugh, 2005; Sharma,
Khachan, Chan, & O’Byrne, 2005; Abate, Hidges,
Stamatakis, & Wolak, 2004). However, it is critical to focus on specific benefits in order to truly understand whether ICCS is a viable tool in the classroom. The advantages that have been researched in higher education with respect to the use of ICCS include:
1. Improving attendance;
2. Increasing participation and interaction;
3. Making participation anonymous;
4. Increasing student attention;
5. Enhancing student engagement;
6. Increasing class discussion;
7. Increasing the use of effective formative assessment; and
8. Enhancing learning.
Each of these potential benefits will be dis-
cussed in detail.
Attendance
ICCS have been introduced at universities to
help address attendance problems. A number of
studies note that attendance does improve when
use of ICCS is associated with part of the course
grade. Several researchers observed dramatic
increases in attendance when 15% of a student’s
grade was linked to ICCS participation (Burn-
stein & Lederman, 2001; Greer & Heany, 2004).
Caldwell (2007), though, found that a five percent reward for using ICCS was sufficient to increase
the number of students who came to class. Ide-
ally, one would like to have students attend class
because they felt using ICCS was beneficial to
their learning. Two studies did report increased
attendance in ICCS classes when grades were not
used as a motivator. Greer & Heany (2004) added
that students were not happy about being forced
to come to class because of ICCS participation
credit. Using external rewards to increase class-
room attendance may undermine the process of
using ICCS if students are resentful.
Participation
There is considerable evidence to suggest that
students participate more when ICCS are used
in the classroom (Bullock, et al., 2002; Caldwell,
2007; Draper & Brown, 2004; Greer & Heany,
2004; Jones, Connolly, Gear, & Read, 2001; Siau,
et al., 2006; Stuart, Brown, & Draper, 2004;
Uhari, Renko, & Soini, 2003; Van Dijk, Van Den
Berg, & Van Keulen, 2001). One study noted that
“shy” students participated more (Greer & Heany,
2004). Bullock et al. (2002) added that when a
portion of a student’s grade was allotted to ICCS
use, participation increased by 1400%. Specific strategies of ICCS use appear to influence the
level of student involvement. For example, one
study reported that ICCS were more effective
when case studies were used (Jones et al., 2001).
Other researchers have observed that students
were more involved when ICCS were used in
groups as opposed to individually (Jones et al.,
2001; Van Dijk et al., 2001). Regardless of the type
of student, influence of grade, or strategy used,
it appears ICCS promote student participation in
higher education classrooms.
Anonymity
One of the proposed benefits of using ICCS is
anonymity. Unlike traditional classrooms, stu-
dents can respond to ICCS questions without
being judged by peers, a tutor, or the instructor.
A number of researchers have reported that higher
education students like this feature (Caldwell,
2007; Draper & Brown, 2004; Jones et al., 2001;
Siau, et al., 2006; Simpson & Oliver, 2007; Stuart
et al., 2004).
Attention
In order for learning to occur, students need to
be focused and paying attention. There is some
evidence to suggest that student attention wanes
after about 20 minutes in a classroom (d’Inverno,
et al., 2003; Jackson, Ganger, Bridge, & Ginsburg,
2005). Since a traditional lecture lasts anywhere
from 50 minutes to three hours, there are times
when students are not able to concentrate on what
is being discussed. Presenting ICCS questions at
20 minute intervals is one way of breaking up a
long lecture and allowing students to shift their
attention and actively participate in the learning
process. Numerous studies have reported that
higher education students are more attentive when
ICCS are used (Bergtrom, 2006; Burnstein & Lederman, 2001; Caldwell, 2007; d'Inverno, et al., 2003; Draper & Brown, 2004; Elliott, 2003; Jackson et al., 2005; Jones et al., 2001; Latessa & Mouw, 2005; Siau, et al., 2006; Slain et al.,
2004).
Engagement
Engagement is also a key component of learning.
Students who used ICCS had more fun (Caldwell,
2007; Draper & Brown, 2004; Latessa & Mouw,
2005; Siau, et al., 2006), were more interested or
engaged (Bergtrom, 2006; Prezler et al., 2007;
Simpson & Oliver, 2007), and were more likely
to go to class (Greer & Heany, 2004). Only one
study reported no significant differences in mo-
tivation between ICCS and traditional classroom
formats.
Discussion
Several researchers have observed that ICCS
promotes increased discussion, particularly when
used with a peer instruction strategy (Beatty, 2004;
Brewer, 2004; Draper & Brown, 2004; Jones et
al., 2001; Nicol & Boyle, 2003). Peer instruc-
tion occurs when an instructor asks a question
using ICCS, collects and presents amalgamated
responses from the class, but does not give away
the correct answer. Instead, the class is told to
discuss the possible solutions in pairs and then a
second vote is taken. In this situation, students
felt they were better able to discuss and calibrate
their understanding of specic concepts (Draper
& Brown, 2004). In addition, students felt they
were more engaged and animated when ICCS
discussions were occurring (Jones et al., 2001;
Nicol & Boyle, 2003).
Formative Assessment
There are two common forms of assessment
that can be used in a class. Summative assess-
ment involves tests and assignments that count
toward a student's final grade. The vast majority
of assessment in higher education is summative.
Formative assessment, on the other hand, is used
to determine student understanding of concepts
without grades, to identify misconceptions, and
alter classroom instruction accordingly. Without
ICCS, it is very difficult to calibrate student un-
derstanding of concepts presented in large classes.
Regular use of ICCS can offer feedback to both
instructors and students as to how well concepts
are being understood. Experienced teachers can
quickly modify their explanations or mode of
instruction accordingly. Students can gauge and
discuss their understanding of concepts as they are
being presented. There is considerable evidence
to suggest that ICCS are an effective means for
providing formative assessment (Beatty, 2004;
Bergtrom, 2006; Brewer, 2004; Bullock et al.,
2002; Caldwell, 2007; Draper & Brown, 2004;
Dufresne & Gerace, 2004; Elliott, 2003; Greer
& Heany, 2004; Hatch, Jensen, & Moore 2005;
Jackson et al., 2005; Siau, et al., 2006; Simpson
& Oliver, 2007; Stuart et al., 2004).
Learning
Ultimately, when one introduces a new learning
tool such as ICCS, one wants to improve student
learning. Many studies have reported that students
feel they learn more when ICCS are used in higher
education classrooms (Elliott, 2003; Greer &
Heany, 2004; Hatch et al., 2005; Nicol & Boyle,
2003; Pradhan, Sparano, & Ananth, 2005; Prezler et
al., 2007; Siau, et al., 2006; Slain et al., 2004; Stuart
et al., 2004; Uhari et al., 2003). Some students like
hearing explanations about ICCS questions from
their peers who have common experiences and can
explain misconceptions more effectively than the
instructor (Nicol & Boyle, 2003; Caldwell, 2007).
Other students believe that using ICCS helps
them think more about the important concepts
(Draper & Brown, 2004; Greer & Heany, 2004).
Still others note that the use of ICCS helps them
discover and resolve misconceptions (d’Inverno,
et al., 2003). The one drawback noted by several
teachers is that not as many concepts can be cov-
ered when ICCS are integrated into the classroom
(Elliott, 2003; Caldwell, 2007). However, it is felt
that reduced coverage is more than compensated
for by the increased material that students truly
understand (Elliott, 2003).
A strong argument can be made for the use of
ICCS based on anecdotal and experimental evi-
dence. A number of researchers have reported that
learning performance has increased as a result of
using ICCS (Carnaghan & Webb, 2006; Brewer,
2004; Caldwell, 2007; Kennedy & Cutts, 2005; La-
tessa & Mouw, 2005; Poulis et al., 1998; Schackow,
Milton, Loya, & Friedman, 2004), although these
observations are based on descriptive feedback.
However, many experimental studies have been
done where ICCS classes significantly outperform
traditional lecture formats (Bullock et al., 2002;
Crouch & Mazur, 2001; El-Rady, 2006; Fagen,
Crouch, & Mazur, 2002; Hake, 1998; Kaleta &
Joosten, 2007; Kennedy & Cutts, 2005; Pradhan
et al., 2005; Prezler et al., 2007; Schackow et al.,
2004; Slain et al., 2004).
It is important to note, though, that simply us-
ing ICCS may not necessarily improve learning
performance (Draper & Brown, 2004; Van Dijk
et al., 2001). For example, it is recommended by
several researchers that peer-instruction be used
to achieve the maximum benefit (Brewer, 2004;
Bullock et al., 2002; Burnstein & Lederman, 2001;
Caldwell, 2007; Crouch & Mazur, 2001). However,
limited research has been done examining and
comparing specic strategies used with ICCS.
CHALLENGES TO USING ICCS
Overview
While ICCS appear to improve learning and have been well received by both students and teach-
ers, there are several challenges that have been
reported including:
1. Changing to a new method of teaching for
teachers and students;
2. Time required to develop effective ICCS
classes;
3. Technological problems; and
4. Developing effective questions.
Each of these challenges will be discussed in
turn.
New Method of Teaching - Teachers
Three main challenges emerged for faculty who
used ICCS in their classrooms. First, there was
a steep learning curve for some teachers when
learning how to use ICCS. Some instructors were
overwhelmed with the new technology (Kaleta
& Joosten, 2007). A second problem emerged
when teachers attempted to modify instruction
based on misunderstandings reported from ICCS
questions. Less experienced teachers, for example,
had more difficulty modifying their explanations "on the fly" (Beatty, 2004). The most significant
concern for instructors, though, was changing
their teaching philosophy from teacher-centered
lectures to student-based discussions stimulated
by ICCS questions (Brewer, 2004; Freeman, Bell,
Comerton-Forder, Pickering, & Blayney, 2007).
New Method of Teaching - Students
Some students may react adversely to the use
of ICCS because the learning “game plan” has
been changed. They are used to lectures, and a
switch of methods can lead to stress, frustration,
and resistance at first (Beatty, 2004; Fagen et al.,
2002). Other students are distracted by the use of
ICCS (Siau, et al., 2006). Still others doubt their
own ability to direct their learning using ICCS
(Allen & Tanner, 2005). Finally, certain students,
not unlike instructors, feel that less content is
covered when using the ICCS approach (Allen &
Tanner, 2005). While resistance to using ICCS is
relatively limited (Fagen et al., 2002), it is impor-
tant to adequately explain to the class why ICCS
are being used (Crouch & Mazur, 2001).
Time
Time has been reported as a concern when ICCS
are implemented into the classroom. Four key ar-
eas have been noted by researchers. First, several
instructors have commented that it takes consider-
able time to learn how to use ICCS, 20 hours in
some cases (El-Rady, 2006; Freeman et al., 2007;
Hatch et al., 2005). Second, time required to set
up the ICCS, hand out the remote controls at the
beginning of the class, and collect them back at
the end of the class can be significant (Hatch et al., 2005; Stuart et al., 2004). Third, time taken up
during class to address and discuss ICCS questions
instead of covering new material is a concern for
a number of teachers (Caldwell, 2007; Draper & Brown, 2004; Fagen et al., 2002; Freeman et al., 2007; Kaleta & Joosten, 2007; Siau, et al., 2006;
Slain et al., 2004). Fourth, in order for ICCS to
be effective, thoughtful questions focusing on
key misconceptions need to be developed. This
is probably the most time consuming aspect of
ICCS use and requires extensive commitment
from an instructor (Beatty, 2004; Fagen et al., 2002; Freeman et al., 2007; Paschal, 2002; Steinert
& Snell, 1999).
Technology
Two main technology-based difficulties were re-
ported when ICCS were used. When students were
responsible for buying their own remote devices,
they did not always bring them to class or they lost
them. Because of the dependence on technology,
students without remotes were not able to fully
participate in ICCS classes (Caldwell, 2007; Reay
et al., 2005). A more critical issue arose when re-
mote devices did not work or the signal was not
received by the instructor’s computer. This was
a particularly stressful experience when students
were being evaluated for marks (El-Rady, 2006;
Hatch et al., 2005; Sharma et al., 2005; Siau, et al.,
2006). For ICCS to be a successful learning tool,
the technology has to work. Possible solutions to these problems include handing out remote devices in every class instead of relying on students to bring them (Reay et al., 2005) and using radio frequency devices, which are more reliable than the less expensive infrared models.
Questions
Writing good ICCS questions turns out to be
a daunting task. Researchers have noted that
the most effective questions address a specific
learning goal, make students aware of opinions
other than their own, locate misconceptions and
confusion, explore ideas in a new context, and
elicit a wide range of responses (Caldwell, 2007;
Crouch & Mazur, 2001; Miller, Santana-Vega, &
Terrell, 2006). Since there are very few collec-
tions of ICCS questions available for most fields,
it falls upon the instructor to create questions from
scratch. Many instructors find question creation
very time consuming and challenging (Allen &
Tanner, 2005; Beatty, Gerace, Leonard, & Du-
fresne, 2006; El-Rady, 2006; Fagen et al., 2002;
Freeman et al., 2007; Paschal, 2002).
Miscellaneous Concerns
A variety of miscellaneous, somewhat isolated
problems were reported by students using ICCS
involving anonymity, attendance, cheating, class
discussion, distractibility, effort, and misuse.
Some students were much less confident about using ICCS when their answers were not anonymous (Elliott, 2003). Other students did not like
ICCS being used to monitor attendance. As a
result, 20% to 58% of students reported seeing
other students cheat by bringing multiple remote
handsets to class to record attendance for miss-
ing classmates (Caldwell, 2007). Occasionally,
significant problems emerged when ICCS ques-
tions were discussed. Some students dominated
group discussion. Others reported that discussion
of different viewpoints led to more confusion.
Still others noted that class-wide discussion took too much time and that it was easy to drift away from the main concept being addressed (Nicol & Boyle, 2003; Reay et al., 2005). Occasionally,
students felt that ICCS use distracted them from
learning (Draper & Brown, 2004). In certain cases,
students realized that using ICCS required more
effort and participation and they preferred the
more passive lecture approach (Trees & Jackson,
2007). Finally, some students did not take ICCS
use seriously and kept pressing buttons or voting
randomly to mislead the instructor (Draper &
Brown, 2004). While these problems were
sporadic and do not represent the norm when using
ICCS, they still need to be addressed, especially
when learning is undermined.
K-12 ICCS RESULTS
Penuel et al. (2006) have done the only compre-
hensive study of ICCS with K-12 teachers. They
reported a number of findings. First, teachers used
ICCS for two main reasons: to improve learning
and instruction or for summative assessment in-
volving marks. Second, frequent users of ICCS
had the most positive perceptions about use in the
classroom. Third, there was no relation between
subject area taught and type of use. Fourth, train-
ing increased the likelihood that teachers would
be frequent users. Fifth, teachers rarely used
ICCS to promote discussion. Finally, teachers
who adopted the view that students should play
a significant, active role in learning were more
likely to use ICCS to alter instructional practice.
Overall, the main impact of ICCS in K-12 ap-
pears to be increasing motivation and improving
student learning.
STRATEGIES FOR USING ICCS
Technology does not improve learning by itself.
It is the pedagogical strategies selected to use the
technology that have a fundamental influence on
success (Reay et al., 2005; Simpson & Oliver,
2007; Stuart et al., 2004). A detailed analysis
and comparison of strategies used with ICCS is
beyond the scope of this paper, but it is worth
noting that a wide range of strategies has been
used including:
1. Explaining why ICCS are being used in
the class and giving students time to practice
using the system (Beatty, 2004; Caldwell,
2007; Dufresne & Gerace, 2004; Trees &
Jackson, 2007);
2. Requiring students to read material
ahead of class so that ICCS questions can
be discussed more meaningfully (Beatty,
2004; Bergtrom, 2006; d’Inverno, et al.,
2003; El-Rady, 2006; Uhari et al., 2003);
3. Using questions that uncover misconcep-
tions (Beatty, et al., 2006; Brewer, 2004;
Draper & Brown, 2004; Judson & Sawada,
2002; Trees & Jackson, 2007);
4. Increasing peer and classroom discus-
sion (Beatty et al., 2006; Caldwell, 2007;
Draper et al., 2002; Dufresne & Gerace,
2004; Judson & Sawada, 2002; Kennedy &
Cutts, 2005; Nicol & Boyle, 2003; Simpson
& Oliver, 2007; Slain et al., 2004);
5. Contingent teaching where the
path of instruction is contingent on
ICCS feedback from the students
(Brewer, 2004; Draper & Brown, 2004; El-
liott, 2003; Greer & Heany, 2004; Jackson
et al., 2005; Kennedy & Cutts, 2005; Poulis
et al., 1998);
6. Case-study questions (Jones et al., 2001);
7. Conducting experiments (Draper et al.,
2002; Simpson & Oliver, 2007);
8. Peer-instruction where students discuss
responses to challenging questions and
attempt to work out the answers without
significant instructor input (Brewer, 2004;
Bullock et al., 2002; Burnstein & Lederman,
2001; Caldwell, 2007; Crouch & Mazur,
2001; Draper & Brown, 2004; Jones et al.,
2001; Kennedy & Cutts, 2005; Miller et al.,
2006; Nicol & Boyle, 2003);
9. Summative assessment worth marks
(Draper et al., 2002; Fies & Marshall, 2006;
Simpson & Oliver, 2007); and
10. Formative assessment (Beatty, 2004; Ken-
nedy & Cutts, 2005; Simpson & Oliver,
2007).
EXAMINING SECONDARY SCHOOL
CLASSROOMS
Benefits, challenges, and strategies related to ICCS use in higher education have been presented
in detail. The second goal of this chapter was
to examine use of ICCS in a secondary school
environment.
METHOD
Sample
Students. The student sample consisted of 659 Ca-
nadian students (327 males, 327 females, 5 missing
data), enrolled in grades 9 (n= 71), 10 (n=233), 11
(n=149), and 12 (n= 206). Subject areas where ICCS
were used included accounting, biology, business,
chemistry, civics, computer engineering, English,
law, mathematics, marketing, physics, technol-
ogy, and world issues. Eighty-seven percent of the
students claimed that they were comfortable or
very comfortable with technology (n=572). Sample
population data was collected from 23 different
classrooms. The students were selected through
convenience sampling and had to obtain signed
parental permission to participate.
Teachers. The teacher sample consisted of
23 teachers (16 males, 7 females), with 1 to 26
years of teaching experience (M = 15.9, SD =
7.9). Almost all teachers reported that they were
comfortable or very comfortable with technology
(n=22, 96%).
Procedure
Teachers were emailed by an educational coordi-
nator and informed of the ICCS study. Participa-
tion was voluntary and a subject could withdraw
from the study at any time. Each teacher received
two half days of training in November and Febru-
ary on how to use the ICCS software and possible
strategies for using ICCS in the classroom. They
were then asked to use ICCS in their classrooms
for one month, although how often ICCS were
used was up to the individual teachers. In pairs,
teachers shared a laptop computer, an LCD pro-
jector, and one ICCS system from E-Instruction.
All students in a given teacher’s classroom par-
ticipated in ICCS lessons. However, only those
students with signed parental permission forms
were permitted to fill in an anonymous, online
survey about their use of the ICCS.
Data Sources
Student survey. After using ICCS for one month,
students completed the ICCS Attitude Survey
for Students (see Appendix A). Since this was
a formative analysis of student attitudes toward
the use of ICCS, reliability and construct validity
were not assessed. Instead, each item was ana-
lyzed individually to glean as much information
as possible.
Student comments. Students were asked “What
was the impact of clickers on your learning in the
past month?” A coding scheme was developed to
categorize 760 student comments (see Appendix
B). Note that some students wrote more than
one comment when they filled in their survey,
whereas other students offered no response.
Each comment was then rated on a five-point
Likert scale (-2 = very negative, -1 = negative, 0
= neutral, 1 = positive, 2 = very positive). Two
raters assessed all comments made by students
based on category and rating on the Likert scale.
Inter-rater reliability was 83% for categories and
93% for ratings. Comments where categories or
ratings were not exactly the same were shared and
reviewed a second time by each rater. An inter-
rater reliability of 98% was reached for categories
and 99% for the rating values.
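The inter-rater reliability figures above are simple percent agreement values. The short sketch below shows how such agreement might be computed; the category labels and ratings are hypothetical stand-ins for the two raters' codes, not the study's actual data.

# Hypothetical sketch of the percent-agreement calculation described above:
# two raters assign a category and a -2..+2 rating to each comment, and
# agreement is the share of comments on which their codes match exactly.
def percent_agreement(rater_a, rater_b):
    matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return 100.0 * matches / len(rater_a)

# Example codes for five comments (placeholders, not the study's data)
categories_a = ["Engagement", "Learning", "Stress", "Participation", "Learning"]
categories_b = ["Engagement", "Learning", "Stress", "Participation", "Feedback"]
ratings_a = [2, 1, -1, 1, 1]
ratings_b = [2, 1, -1, 1, 0]

print(f"Category agreement: {percent_agreement(categories_a, categories_b):.0f}%")
print(f"Rating agreement: {percent_agreement(ratings_a, ratings_b):.0f}%")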
Question n Min Max Mean S.D. C.I
Using ICCS were a good way to test my knowledge. 655 1 7 5.4 1.4 ± 0.11
I was more engaged in the lesson when ICCS were used. 657 1 7 5.3 1.5 ± 0.11
I would PREFER to use ICCS. 653 1 7 5.3 1.8 ± 0.14
I was more motivated when ICCS were used. 656 1 7 5.2 1.5 ± 0.11
I participated more than I normally would when ICCS were used. 659 1 7 5.1 1.6 ± 0.12
The class was NOT out of control when ICCS were used. 658 1 7 5.0 1.5 ± 0.11
I liked seeing what other students in the class selected for answers. 656 1 7 4.8 1.5 ± 0.11
I liked using ICCS for tests. 636 1 7 4.7 1.8 ± 0.14
When ICCS were used, the class was better. 659 1 7 4.7 1.5 ± 0.11
I DID NOT feel bad when most students got an answer right and I didn’t. 658 1 7 4.6 1.7 ± 0.13
Using ICCS generated more class discussion. 657 1 7 4.6 1.5 ± 0.11
I learned more when ICCS were used. 656 1 7 4.4 1.5 ± 0.11
Table 1. Summary of ICCS quantitative survey questions
Category Mean S.D. n Total Impact
(Mean x n)
Engagement 1.23 0.43 111 137
Learning 0.81 0.78 104 84
Review previous concepts 1.13 0.44 53 60
Participation 1.04 0.35 49 51
Formative Assessment 0.94 0.41 53 50
Paid attention more 0.93 0.38 28 26
General comment 0.30 0.89 70 21
Different methods used 0.43 1.03 46 20
Memory 0.86 0.73 21 18
Compare progress with other students 0.88 0.34 16 14
Feedback 0.86 0.53 14 12
Discussion 1.11 0.33 9 10
Teacher explained better 1.00 0.00 5 5
Class environment 0.14 1.23 14 2
Did not use Enough 0.00 0.00 32 0
Wrong answer - Reaction -1.00 0.00 2 -2
Technology Issues -0.48 1.05 25 -12
Learning performance -0.26 1.16 80 -21
Stress -1.00 1.05 28 -28
Table 2. Summary of student comments about ICCS
RESULTS
Benefits to Using ICCS
Student survey. Table 1 provides a summary of the
means and standard deviations for each of the items
on the ICCS Attitude Scale (see questions 8 to 19
in Appendix A). Overall, students felt that:
• ICCS were a good way to test knowledge;
• They were more engaged and motivated when ICCS were used;
• They preferred to use ICCS;
• They participated more in class; and
• The class was not out of control.
Student comments. Comments made by stu-
dents about ICCS are presented by category in
Table 2. Note that the total impact of a particular category is calculated by multiplying the mean rating (as determined from Appendix B) by the number of comments in that category. For example, engagement received a mean rating of 1.23 and was referred to by 111 students, so the total impact was calculated as 1.23 × 111, or approximately 137.
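A small sketch of this calculation is shown below. Because the mean rating multiplied by the number of comments equals the sum of the ratings, total impact can also be read as the summed Likert score for a category. The per-comment ratings are illustrative placeholders; only the engagement figures (mean 1.23, n = 111) come from Table 2.

# Sketch of the "total impact" score described above: the mean Likert rating
# of a category (-2 very negative to +2 very positive) multiplied by the
# number of comments in that category. Ratings below are illustrative only.
comment_ratings = {
    "Engagement": [1, 2, 1, 1, 2],      # hypothetical per-comment ratings
    "Stress": [-1, -2, 0, -1],
    "Technology Issues": [0, -1, 1, -2],
}

for category, ratings in comment_ratings.items():
    mean_rating = sum(ratings) / len(ratings)
    total_impact = mean_rating * len(ratings)    # equivalently, sum(ratings)
    print(f"{category}: mean={mean_rating:+.2f}, n={len(ratings)}, impact={total_impact:+.1f}")

# Reproducing the engagement row reported in Table 2:
print(round(1.23 * 111))    # 137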
Based on total impact scores, students felt that
ICCS had the biggest positive effect on engage-
ment, learning, reviewing previous concepts,
participation, formative assessment and attention.
ICCS also had a positive impact on improving
memory of concepts, seeing how one is doing
relative to other class members, and getting
feedback about learning; however, students made
fewer comments in these areas. Table 3 offers
sample comments made by students about the
benefits of ICCS.
Category Defining Criteria
Engagement “It made class more fun and enjoyable”.
“I wasn’t falling asleep. And I was very interested.”
“Much of the class seemed more interested in the Math we were doing”
Learning "Clickers impacted my learning slightly"
"I liked the way you had to think very fast."
"It was a faster learning process."
"I have learned better and observe the question better"
"Learning was more fulfilling."
“A very innovative and exciting way to learn.”
Review Previous
Concepts “I think it was a nice way to test our knowledge for reviews”
“It helped me study for the up coming test”
“It was great review of the units and topics we just covered”
“I was able to obtain answers to test-like questions easily giving me insight as to what more studying was needed.”
Participation “It made the class more interactive.”
“ICCS motivated me into participating in class discussions in general!”
“More hands on and intriguing way to learn. More class involvement.”
“It forced people who normally don’t feel comfortable participating in class discussion to participate.”
“[ICCS] makes you want to participate because it’s not like school work.”
Formative
Assessment "Got to see how much I really understood in that class."
"I did figure out what I need to go over in my notes."
“It helped me see what parts of law I needed improvement in, judging by the questions I got right or wrong.”
“Helped me by testing my knowledge and getting to know where the class and I are at.”
Increased
Attention or Focus “It made me concentrate more”
“The clickers allowed me to stay focused on the subject while in class”
“I felt I read all the questions more carefully to make sure I got the right answer and paid attention more than I
normally would when we just talked about something.”
“I am forced to pay more attention in class which helps me learn better.”
Table 3. Sample comments illustrating benefits of using ICCS
Challenges Using ICCS
Student survey. Students were relatively neutral
about whether they learned more using ICCS,
whether ICCS generated more class discussion, how they felt when their answers were wrong compared to the rest of the class, and whether they liked using ICCS for tests of summative assessment (Table 1).
Student comments. Some students felt that
when ICCS were used they were more stressed,
learning performance was negatively affected,
and that technology issues were a problem. The
relative impact of challenges, though, compared to
the benefits discussed earlier was small. In other words, fewer students voiced concerns about the negative aspects of ICCS, and the mean impact, although negative, was modest.
Table 4 offers sample comments made by students about the challenges of using ICCS. The majority of comments refer to students not liking to use ICCS for summative assessment (testing) because of the pressure and stress created. Students also noted that when the remote devices did not work, they were somewhat frustrated.

Category Defining Criteria
Stress "It made me more worried about not knowing what I need to know!!"
"I realized that I cannot work well under the pressure when ICCS were used."
"ICCS made me nervous in test situations"
"Using ICCS on a test is very stressful. Not only are we stressed because of the test conditions, but it's difficult to work the clickers."
"I felt that ICCS made me feel as if I had to rush to get my answer selected. I was nervous and felt pressure, when usually I'm confident."
Learning performance "IT hindered my performance on the test."
"ICCS does not help serve the purpose of testing our knowledge conveniently and effectively. It's hard to go back to change your selected answer and the whole process of aiming the clicker is bothersome."
"ICCS seem to create more pressure to answer correctly which often led to answering incorrectly."
"I didn't like using ICCS for tests because they take up more time; and also its not easy to go back and change your answer if you want to."
Technology Issues "They are too difficult to use and it was difficult to tell when the correct answer was chosen."
"The whole process of aiming the remote is bothersome"
"I disliked ICCS because I didn't like having such a small target to aim at. If the target was bigger and maybe on the front board it would be easier and much more efficient."
"Well at first I didn't like ICCS - I found it rather complicated and also mine was not working correctly"
Table 4. Sample comments illustrating challenges using ICCS
Individual Differences in Using ICCS
Responses, both qualitative and quantitative, were
quite varied in this study. Some students were very
positive about using ICCS. Other students were
neutral or stressed. Analysing individual differ-
ences is one way of sorting out possible factors
that influenced student attitudes. Three key areas were examined: gender, computer comfort (self-efficacy), and strategies used with ICCS.
Gender. A MANOVA was run to compare males and females on each of the items in the ICCS Attitude Scale (Appendix A). Hotelling's T was significant (p < .001), so individual comparisons were done on each survey question. From Table 5, it is clear that males and females differed significantly on all but two items from the ICCS Attitude Survey. Specifically, males were more motivated and engaged when using ICCS, they participated more, they liked using ICCS to test their knowledge, especially in summative evaluation, they did not feel as bad as females when they got incorrect answers, they thought ICCS generated more class discussion, they felt ICCS helped improve their learning, and overall, they thought ICCS classes were better. The only two items where males and females did not differ significantly were feeling the class was out of control when using ICCS and liking to see other students' answers.
Measure Females (M, SD) Males (M, SD) F
8. I was more motivated when clickers were used. 5.01 1.56 5.46 1.43 14.18 *
9. I was more engaged in the lesson when clickers were used. 5.18 1.56 5.50 1.35 7.21 **
10. I participated more than I normally would when clickers were used. 4.97 1.60 5.29 1.51 6.53 ****
11. The class was in control when clickers were used. 5.04 1.50 4.99 1.54 0.16
12. I liked seeing what other students in the class selected for answers. 4.72 1.49 4.95 1.52 3.62
13. Using the clickers were a good way to test my knowledge. 5.30 1.49 5.55 1.33 4.72 ****
14. I liked using clickers for tests. 4.41 1.86 5.13 1.73 24.68 *
15. I would prefer to use clickers. 5.01 1.85 5.62 1.64 18.70 *
16. I did not feel bad when most students got an answer right and I didn't. 4.41 1.73 4.82 1.71 8.50 **
17. Using the clickers generated more class discussion. 4.36 1.48 4.82 1.71 14.95 *
18. I learned more when clickers were used. 4.17 1.51 4.60 1.44 13.11 *
19. When clickers were used, the class was better. 4.44 1.51 5.01 1.43 23.07
Table 5. Gender differences in attitudes toward ICCS
* p < .001
** p < .005
*** p < .01
**** p < .05
Survey Item Not Comfortable (M, SD) Comfortable (M, SD) F
8. I was more motivated when clickers were used. 4.28 1.82 5.38 1.40 39.86 *
9. I was more engaged in the lesson when clickers were used. 4.51 1.79 5.47 1.36 31.63 *
10. I participated more than I normally would when clickers were used. 4.36 1.82 5.25 1.49 23.42 *
11. The class was in control when clickers were used. 4.88 1.48 5.02 1.54 0.65
12. I liked seeing what other students in the class selected for answers. 4.60 1.33 4.89 1.51 2.59
13. Using the clickers were a good way to test my knowledge. 4.71 1.71 5.54 1.34 24.65 *
14. I liked using clickers for tests. 3.78 2.01 4.92 1.76 28.53 *
15. I would prefer to use clickers. 4.38 2.14 5.44 1.68 25.91 *
16. I did not feel bad when most students got an answer right and I didn't. 4.03 1.90 4.71 1.68 11.31 *
17. Using the clickers generated more class discussion. 4.05 1.48 4.68 1.47 12.54 *
18. I learned more when clickers were used. 3.80 1.59 4.47 1.47 14.36 *
19. When clickers were used, the class was better. 4.13 1.46 4.82 1.49 15.39 *
Table 6. Differences in attitudes toward ICCS based on computer comfort level
* p < .001
Computer comfort. A MANOVA was run to compare students who were not comfortable with technology (n=80) with students who were comfortable with technology (n=534) on the ICCS Attitude Scale items (Appendix A). Hotelling's T was significant (p < .001), so individual comparisons were examined. Students who were more comfortable with technology were more motivated and engaged when using ICCS, participated more, liked using ICCS to test their knowledge, especially in summative evaluation, did not feel as bad when they got incorrect answers, thought ICCS generated more class discussion, felt ICCS helped improve their learning, and overall, thought ICCS classes were better. The only two items where there were no significant differences were feeling the class was out of control and liking to see other students' answers.
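To illustrate the analysis pipeline described in this section (an omnibus multivariate test followed by per-item univariate comparisons), the sketch below shows how such a two-group comparison might be run in Python with pandas, SciPy, and statsmodels. The file name and column names (group, item_8 through item_19) are hypothetical placeholders; this is a sketch under those assumptions, not a claim about how the chapter's statistics were actually produced.

# Hedged sketch of the two-step analysis described above: an omnibus MANOVA
# (including Hotelling's trace) across all survey items, followed by per-item
# univariate F tests for two groups (e.g., gender or comfort with technology).
# The CSV file and column names are hypothetical placeholders.
import pandas as pd
from scipy import stats
from statsmodels.multivariate.manova import MANOVA

df = pd.read_csv("iccs_survey.csv")            # one row per student
items = [f"item_{i}" for i in range(8, 20)]    # survey items 8 to 19

# Omnibus test: do the groups differ on the set of items taken together?
formula = " + ".join(items) + " ~ group"
print(MANOVA.from_formula(formula, data=df).mv_test())

# Follow-up: one univariate F test per item, as summarized in Tables 5 and 6.
for item in items:
    groups = [g[item].dropna() for _, g in df.groupby("group")]
    f_stat, p_value = stats.f_oneway(*groups)
    print(f"{item}: F = {f_stat:.2f}, p = {p_value:.4f}")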
Strategies used with ICCS. Detailed data on
how ICCS were used in secondary schools was not collected; however, three general strategies
were observed: using ICCS for formative assess-
ment only (formative, n=398), using ICCS for both
summative tests and formative assessment (mixed,
n=110), using ICCS for summative assessment
only (summative, n =103). Means for all items
on the ICCS Attitude Scale based on strategy
selected are presented in Table 7. Note that the
means for most items show a steady rise in value
from summative methods, to mixed methods, to
a formative approach.
A MANOVA was run to compare these three approaches to using ICCS based on items from the ICCS Attitude Scale (Table 8). With the exception of the item asking about classroom control (Item 11), using ICCS for formative assessment was rated significantly more positively than using ICCS for summative assessment on all items on the ICCS Attitude Scale. The formative assessment approach outperformed the mixed method (formative & summative) approach on most items as well.

Survey Item df SS F Scheffe's Post Hoc Analysis (p < .05)
8. I was more motivated when clickers were used. 2 132.1 31.9 * Formative > Mixed & Summative
9. I was more engaged in the lesson when clickers were used. 2 115.2 29.6 * Formative > Mixed > Summative
10. I participated more than I normally would when clickers were used. 2 97.7 21.3 * Formative > Mixed > Summative
11. The class was in control when clickers were used. 2 13.0 2.9 No differences
12. I liked seeing what other students in the class selected for answers. 2 76.0 17.6 * Formative & Mixed > Summative
13. Using the clickers were a good way to test my knowledge. 2 111.0 30.3 * Formative > Mixed > Summative
14. I liked using clickers for tests. 2 130.4 20.6 * Formative > Mixed & Summative
15. I would prefer to use clickers. 2 181.4 31.5 * Formative > Mixed & Summative
16. I did not feel bad when most students got an answer right and I didn't. 2 93.4 16.4 * Formative > Mixed & Summative
17. Using the clickers generated more class discussion. 2 89.2 21.4 * Formative & Mixed > Summative
18. I learned more when clickers were used. 2 94.5 22.6 * Formative > Mixed > Summative
19. When clickers were used, the class was better. 2 62.0 14.6 * Formative > Mixed & Summative
Table 8. MANOVA examining attitude toward ICCS as a function of teaching strategy
* p < .001
DISCUSSION
Benets Based on Previous
Research
Each of the benefits identified in previous research
in higher education and secondary school class-
rooms will be discussed.
Survey Item Summative Assessment (M, SD) Mixed, Formative & Summative (M, SD) Formative Assessment (M, SD)
8. I was more motivated when clickers were used. 4.44 1.60 4.79 1.47 5.57 1.39
9. I was more engaged in the lesson when clickers were used. 4.52 1.57 5.05 1.44 5.65 1.33
10. I participated more than I normally would when clickers were used. 4.39 1.51 4.85 1.56 5.41 1.50
11. The class was in control when clickers were used. 5.25 1.38 5.21 1.38 4.92 1.56
12. I liked seeing what other students in the class selected for answers. 4.10 1.47 5.23 1.43 4.92 1.48
13. Using the clickers were a good way to test my knowledge. 4.56 1.59 5.27 1.39 5.71 1.28
14. I liked using clickers for tests. 4.09 2.16 4.19 1.93 5.11 1.62
15. I would prefer to use clickers. 4.42 1.94 4.72 1.73 5.70 1.62
16. I did not feel bad when most students got an answer right and I didn't. 4.32 1.59 3.92 1.75 4.89 1.70
17. Using the clickers generated more class discussion. 3.83 1.35 4.45 1.37 4.85 1.49
18. I learned more when clickers were used. 3.63 1.48 4.13 1.37 4.65 1.46
19. When clickers were used, the class was better. 4.19 1.46 4.43 1.32 4.97 1.50
Table 7. Mean survey item scores as a function of strategy used
Attendance. Previous research suggested that
ICCS were used to improve student attendance. In several cases, ICCS participation counted for a percentage of a student's final grade. In this
study, ICCS were not used by teachers to encour-
age students to come to class, at least directly.
However, about 17% of the teachers used ICCS
exclusively for formal tests. Most students would
be motivated to come to class when a summative
test of any format was taking place. It should
also be noted that secondary school classes are
usually much smaller than university classes, so
it is much easier to determine whether students
are present without ICCS. Therefore, high school
teachers may have thought it unnecessary to use
this tool for attendance purposes.
Participation. Based on the mean survey scores
on the ICCS attitude survey and the qualitative
comments, students felt that they participated more
when ICCS were used. This result is consistent
with previous research in higher education. Of
course, by design, ICCS forces participation to a
certain extent. Each student has a remote device
and is asked to click in an answer. It is possible that a student could choose not to click in an answer, but most students, when given the opportunity to answer a question, did. It is not clear, though, whether
participation helped improve learning from the
data gathered in this study.
Anonymity. Higher education students pre-
ferred their responses to be anonymous when
ICCS were used. The ICCS Attitude Survey in
this study did not ask about anonymity directly,
however, not one of the 760 comments mentioned
anonymity as a concern or benefit in the learning
process. It is speculated that since ICCS were used
as an instructional tool by most teachers, ICCS
were probably used anonymously most of the
time. When ICCS were not used anonymously,
they appeared to be used for tests, where students would accept that identification was a necessary
component. Therefore, the type of strategy used
with ICCS may have reduced or eliminated the
impact of anonymity.
Student attention. Students were not asked
about their attention levels in the ICCS attitude
survey, but several students spontaneously men-
tioned that they focussed more when ICCS were
used. While increased attention in class is well
documented in higher education, it appears that it
is a minor issue in secondary school. One reason
for this difference may be the size and duration of
an average high school class. Attention span may
not be an issue in small, relatively short classes.
Engagement. Previous research on ICCS and
higher education indicated that engagement was
a relatively minor impetus for using ICCS. In this
study, though, engagement and motivation were
among the top rated qualities of ICCS based on
both quantitative and qualitative evidence. En-
gagement may be more of an issue for secondary
school students, who have much less choice about
the subjects they take than their university
counterparts.
Class discussion. There is some evidence
to suggest that higher education students enjoy
class discussion precipitated by the use of ICCS,
particularly when they can confer about questions
with their peers. High school students, though,
rated discussion relatively low compared to other
possible benefits. Both quantitative and qualitative data confirmed that students in this study did not see ICCS as promoting discussion. This result may be a reflection of how ICCS were used by
teachers. Teachers in this study were not trained
to use discussion-promoting techniques like peer-
instruction.
Formative assessment. Secondary school stu-
dents rated the testing of their own knowledge as
the number one benefit of using ICCS. In addition, qualitative comments confirmed that students liked using ICCS for formative assessment or to review for tests. This finding is consistent with
extensive data reported in higher education.
Based on ICCS Attitude Survey results, students
overwhelmingly preferred using ICCS for formative
as opposed to summative assessment. It is clear
that many high school students did not like being
tested for marks using ICCS.
Learning. There is considerable research to
suggest ICCS improved the learning environment
and performance for higher education students.
Based on the survey results in this study, learning
was the lowest rated item. Students noted that they
were more engaged, involved, and focussed. Yet,
they were neutral about whether ICCS improved
their learning. The qualitative results were mixed.
Some students felt ICCS improved their learning,
while others were unsure. Other students noted
that ICCS were not used enough to determine the
impact on learning. Examining individual differ-
ences in perceptions of learning revealed higher
ratings for males, students who were comfortable
with technology, and using ICCS for formative
assessment. Learning is a complicated process
involving many factors, so it may not be fruitful to
ask the question “Does ICCS improve learning?”
A better question might be “Under what conditions
does ICCS help improve learning?”
Challenges Based on Previous
Research
Most of the challenges discussed in higher educa-
tion were from the teacher’s perspective: increased
time to learn software, preparing good questions,
and integrating ICCS into the class. The data
presented in this chapter reported challenges from
the perspective of the student.
Based on the qualitative data, secondary school
students identied several challenges to using
ICCS in the classroom. Some students were quite
stressed when ICCS were used in a formal test situ-
ation. They were uneasy with the technology and
in some cases, felt that ICCS hindered their per-
formance. In addition, technological difficulties
were distracting and frustrating, particularly when
answers were not registered by the ICCS receiver.
Anxiety about an answer being properly recorded
was magnified in test situations. Increased stress, poorer learning performance, and technological difficulties appear to be directly related to using
ICCS for summative assessment.
Other relatively minor issues noted by a few
students included the class being somewhat out
of control and not using ICCS enough. Inter-
estingly enough, most students did not feel
uncomfortable in situations where they selected
a wrong answer and most of their peers selected
the correct response.
Individual Differences
Previous literature on individual differences in
the use of ICCS is almost nonexistent. The data
from this study provide a preliminary analysis
of specific factors that might influence the use
of ICCS.
While there is considerable evidence to sug-
gest that males, on average, have more positive
attitudes toward computers (e.g., AAUW, 2000;
Barker & Aspray, 2006; Kay, 1992; Sanders, 2006;
Whitley, 1997), it was not anticipated that these
differences would be present using a relatively
straightforward tool like ICCS, where all a student
had to do was click in an answer with a remote
control. However, males preferred using ICCS
significantly more than females in all but two of
the 12 ICCS attitude scale items. It is unclear why
these differences exist. More qualitative research
needs to be done.
In addition, those students who felt comfort-
able with technology preferred ICCS more than
students who felt less comfortable. Again, because
ICCS are relatively easy to use, this result was
not anticipated. Finally, strategies used with ICCS
had a significant influence on overall perceptions.
When ICCS were used for instructional purposes
(formative assessment), students' ratings were significantly higher. When ICCS were used for
summative assessment, a number of students were
stressed and frustrated. Using a new technology
combined with not knowing whether responses
were accurately recorded likely increased stu-
dent anxiety when using ICCS in formal test
situations.
Summary and Suggestions for
Educators Using ICCS
The main benefits that students identified for
using ICCS in secondary school classrooms were
increased use of formative assessment, engage-
ment and motivation, more participation, and
to a lesser extent, improved focus during class.
Students were relatively neutral with respect to
whether ICCS improved class discussion or learn-
ing. Key challenges reported were based on the
use of ICCS used for test situations where some
students were stressed and unsure whether their
answers were recorded. Males were significantly
more positive toward ICCS, as were students
who were relatively comfortable with technol-
ogy. Finally, when ICCS were used for formative
assessment as opposed to graded tests, students
were signicantly more accepting.
It is wise to interpret the results from this
chapter cautiously. While the sample size was
large, reliability and validity of the assessment
tools have not been established. Nonetheless, it
appears that at least two suggestions are worth
noting. First, ICCS should probably be used for
formative assessment only. The negative reaction
to summative assessment was intense and clearly
had an adverse impact on learning in some cases.
Second, it might be beneficial to offer a detailed
orientation to using ICCS before it is formally
employed in a secondary school classroom. Ex-
planations of why ICCS is being incorporated
combined with thoughtful practice sessions might
help reduce anxiety in students. This approach
has been encouraged in higher education as well
(Beatty, 2004; Caldwell, 2007; Dufresne & Ger-
ace, 2004; Trees & Jackson, 2007).
Suggestions for Future Research on
ICCS
This chapter provided a review of benefits and challenges identified in previous research on
ICCS in higher education. In addition, a forma-
tive analysis of ICCS use in secondary education
was offered. Both quantitative and qualitative
evidence was gathered from over 650 students.
In order to build on this research, the following
suggestions for future research are offered:
1. Further develop the ICCS Attitude Survey
and establish reliability and validity for
constructs assessed;
2. Examine and compare a wide range of
strategies for using ICCS based on those
identified in higher education;
3. Conduct interviews of male and female
students to determine possible reasons for
differences in acceptance of ICCS technol-
ogy; and
4. Explore other sources of individual differ-
ences such as subject area taught and grade
level.
REFERENCES
Allen, D., & Tanner, K. (2005). Infusing active learning into the large-enrollment biology class:
Seven strategies, from the simple to complex. Cell
Biology Education, 4, 262-268.
American Association of University Women
(2000). Tech-Savvy: Educating girls in the new
computer age. Washington, DC: American As-
sociation of University Women Foundation.
Retrieved Dec 1, 2006 from http://www.aauw.
org/member_center/publications/TechSavvy/
TechSavvy.pdf
Barker, L. J., & Aspray, W. (2006). The state of
research on girls and IT. In J. M. Cohoon & W.
Aspray (Eds.), Women and information technology
(pp. 3-54). Cambridge, MA: The MIT Press.
Beatty, I. (2004). Transforming student learning with classroom communication systems. EDUCAUSE Research Bulletin, (3), 1-13. Retrieved Nov 3, 2007 from http://www.educause.edu/ir/library/pdf/ERB0403.pdf
Beatty, I. D., Gerace, W. J., Leonard, W. J., &
Dufresne, R. J. (2006). Designing effective ques-
tions for classroom response system teaching.
American Journal of Physics, 74(1), 31-39.
Bergtrom, G. (2006). Clicker sets as learning objects. Interdisciplinary Journal of Knowledge and Learning Objects, 2. Retrieved Nov 3, 2007 from http://ijklo.org/Volume2/v2p105-110Bergtrom.pdf
Brewer, C. A. (2004). Near real-time assessment of student learning and understanding in biology courses. BioScience, 54(11), 1034-1039.
Bullock, D. W., LaBella, V. P., Clinghan, T., Ding,
Z., Stewart, G., & Thibado, P.M. (2002). Enhanc-
ing the student-instructor interaction frequency.
The Physics Teacher, 40, 30-36.
Burnstein, R. A., & Lederman, L. M. (2001).
Using wireless keypads in lecture classes. The
Physics Teacher, 39(1), 8-11.
Caldwell, J.E. (2007). Clickers in the large class-
room: Current research and best-practice tips. Life
Sciences Education, 6(1), 9-20.
Carnaghan, C., & Webb, A. (2007). Investigating
the effects of group response systems on student
satisfaction, learning, and engagement in account-
ing education. Issues in Accounting Education,
22(3), 391-409.
Crouch, C. H., & Mazur, E. (2001). Peer instruc-
tion: Ten years of experience and results. Ameri-
can Journal of Physics, 69(9), 970-977.
D’Inverno, R., Davis, H., & White, S. (2003).
Using a personal response system for promoting
student interaction. Teaching Mathematics and
Its Applications, 22(4), 163-169.
Draper, S.W., & Brown, M.I. (2004). Increasing
interactivity in lectures using an electronic voting
system. Journal of Computer Assisted Learning,
20(2), 81-94.
Dufresne, R. J., & Gerace, W. J. (2004). Assessing-to-learn: Formative assessment in physics instruction. The Physics Teacher, 42, 428-433.
Elliott, C. (2003). Using a personal response system in economics teaching. International Review of Economics Education, 1(1). Retrieved Nov 3, 2007 from http://www.economicsnetwork.ac.uk/iree/i1/elliott.htm
El-Rady, J. (2006). To click or not to click:
That’s the question. Innovate Journal of On-
line Education, 2(4). Retrieved Nov 3, 2007
from http://www.innovateonline.info/index.
php?view=article&id=171
Fies, C., & Marshall, J. (2006). Classroom
response systems: A review of the literature.
Journal of Science Education and Technology,
15(1), 101-109.
Freeman, M., Bell, A., Comerton-Forde, C., Pickering, J., & Blayney, P. (2007). Factors affecting educational innovation with in class electronic response systems. Australasian Journal of Educational Technology, 23(2), 149-170.
Greer, L., & Heaney, P. J. (2004). Real-time analysis of student comprehension: An assessment of electronic student response technology in an introductory earth science course. Journal of Geoscience Education, 52(4), 345-351.
Hake, R. R. (1998). Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses. American Journal of Physics, 66(1), 64-74.
Hatch, J., Jensen, M., & Moore, R. (2005). Manna
from heaven or clickers from hell. Journal of Col-
lege Science Teaching, 34(7), 36-39.
Jackson, M., Ganger, A. C., Bridge, P. D., & Ginsburg, K. (2005). Wireless handheld computers in the undergraduate medical curriculum. Medical Education Online, 10(5). Retrieved Nov 3, 2007 from http://www.med-ed-online.org/pdf/t0000062.pdf
Jones, C., Connolly, M., Gear, A., & Read, M.
(2001). Group integrative learning with group
process support technology. British Journal of
Educational Technology, 32(5), 571-581.
Judson, E., & Sawada, D. (2002). Learning from
past and present: Electronic response systems
in college lecture halls. Journal of Computers
in Mathematics and Science Teaching, 21(2),
167-181.
Kaleta, R., & Joosten, T. (2007). Student response
systems: A University of Wisconsin system study
of clickers. EDUCAUSE Research Bulletin, (10),
1-12.
Kay, R. H. (1992). An analysis of methods used to
examine gender differences in computer-related
behaviour. Journal of Educational Computing
Research, 8(3), 323-336.
Kennedy, G. E., & Cutts, Q. I. (2005). The association between students' use of an electronic voting system and their learning outcomes. Journal of Computer Assisted Learning, 21(4), 260-268.
Latessa, R., & Mouw, D. (2005). Use of audience response system to augment interactive learning. Family Medicine, 37(1), 12-14. Retrieved Nov 3, 2007 from http://www.stfm.org/fmhub/fm2005/January/Robyn12.pdf
Miller, R. L., Santana-Vega, E., & Terrell, M. S.
(2006). Can good questions and peer discussion
improve calculus instruction? PRIMUS, 16(3),
1-9.
Nicol, D. J., & Boyle, J. T. (2003). Peer instruc-
tion versus class-wide discussion in large classes:
a comparison of two interaction methods in the
wired classroom. Studies in Higher Education,
28(4), 457-473.
Paschal, C. B. (2002). Formative assessment in
physiology teaching using a wireless classroom
communication system. Advances in Physiology
Education, 26 (4), 299-308.
Penuel, W. R., Boscardin, C. K., Masyn, K., & Crawford, V. M. (2007). Teaching with student response systems in elementary and secondary education settings: A survey study. Educational Technology Research and Development, 55(4), 315-346.
Poulis, J., Massen, C., Robens, E., & Gilbert, M. (1998). Physics lecturing with audience paced feedback. American Journal of Physics, 66(5), 439-441.
Pradhan, A., Sparano, D., & Ananth, C. V. (2005). The influence of an audience response system on knowledge retention: An application to resident education. American Journal of Obstetrics and Gynecology, 193(5), 1827-1830.
Preszler, R.W., Dawe, A., Shuster, C.B., & Shuster,
M. (2007). Assessment of the effects of student
response systems on student learning and attitudes
over a broad range of biology courses. CBE-Life
Sciences Education, 6(1), 29-41.
Reay, N. W., Bao, L., Li, P., Warnakulasooriya,
R., & Baugh, G. (2005). Toward the effective use
of voting machines in physics lectures. American
Journal of Physics, 73(6), 554-558.
Sanders, J. (2006). Gender and technology: A research review. In C. Skelton, B. Francis, & L. Smulyan (Eds.), Handbook of gender and education. London: Sage.
Schackow, T. E., Milton, C., Loya, L., & Friedman,
M. (2004). Audience response system: Effect on
learning in family medicine residents. Family
Medicine, 36, 496-504.
Sharma, M. D., Khachan, J., Chan, B., & O’Byrne,
J. (2005). An investigation of the effectiveness of
electronic classroom communication systems in
large lectures. Australasian Journal of Educa-
tional Technology, 21 (2), 137-154.
Siau, K., Sheng, H., & Nah, F. (2006). Use of classroom response system to enhance classroom interactivity. IEEE Transactions on Education, 49(3), 398-403.
Simpson, V., & Oliver, M. (2007). Electronic voting systems for lectures then and now: A comparison of research and practice. Australasian Journal of Educational Technology, 23(2), 187-208.
Slain, D., Abate, M., Hodges, B. M., Stamatakis, M. K., & Wolak, S. (2004). An interactive response system to promote active learning in the doctor of pharmacy curriculum. American Journal of Pharmaceutical Education, 68(5), 1-9.
Steinert, Y., & Snell, L. S. (1999). Interactive lecturing: Strategies for increasing participation in large group presentations. Medical Teacher, 21(1), 37-42.
Stuart, S. A. J., Brown, M. I., & Draper, S.W.
(2004). Using an electronic voting system in logic
lectures: one practitioner’s application. Journal of
Computer Assisted Learning, 20 (2), 95-102.
Trees, A. R., & Jackson, M. H. (2007). The learning environment in clicker classrooms: Student processes of learning and involvement in large university-level courses using student response systems. Learning, Media and Technology, 32(1), 21-40.
Uhari, M., Renko, M., & Soini, H. (2003). Experi-
ences of using an interactive audience response
system in lectures. BMC Medical Education, 3
(12), 1-6.
Van Dijk, L. A., Van Den Berg, G. C., & Van Keu-
len, H. (2001). European Journal of Engineering
Education, 26(1), 15-28.
Whitley, B. E., Jr. (1997). Gender differences
in computer-related attitudes and behaviors: A
meta-analysis. Computers in Human Behavior,
13, 1-22.
KEY TERMS AND DEFINITIONS
Formative Assessment: Is used to determine student understanding of concepts without assigning grades, to identify misconceptions, and to alter classroom instruction accordingly.
Interactive Classroom Communication
Systems (ICCS): Allow students to respond to
multiple choice questions using a remote control
device. After students click in their responses, the
results are instantly aggregated and displayed in
chart form, usually a histogram. Responses are
often anonymous but can be linked to specific
students for evaluation purposes.
Summative Assessment: Involves tests and assignments that count toward a student's final grade.
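To make the aggregation step in the ICCS definition above concrete, the following minimal sketch (hypothetical Python, not tied to any particular commercial clicker system) tallies a set of multiple choice responses and prints a simple text histogram of the kind an ICCS typically displays; the response values are invented for illustration only.

```python
from collections import Counter

# Hypothetical responses to one multiple choice question (options A-D).
# In a real ICCS these would arrive from the students' remote control devices.
responses = ["A", "C", "B", "C", "C", "D", "A", "C", "B", "C"]

counts = Counter(responses)
total = len(responses)

# Print a simple text histogram of the aggregated results.
for option in ["A", "B", "C", "D"]:
    n = counts.get(option, 0)
    percent = 100 * n / total
    print(f"{option}: {'#' * n} ({percent:.0f}%)")
```

In an actual system the chart would update as each student clicks in, and the anonymous counts could optionally be linked to student identifiers when responses are graded.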
APPENDIX A: ICCS ATTITUDE SURVEY FOR STUDENTS
1. Class ID: _________
2. What grade are you in (circle one)? ______
3. Gender (circle one) Male Female
4. How comfortable are you with technology? (circle one)
Not at all comfortable Somewhat Comfortable Comfortable Very Comfortable
5. On average, how much do you participate in class when clickers ARE NOT used? (circle one)
Not at all Some of the Time Most of the Time All of the Time
6. About how many times did you use the clickers in the past month?
Never 1-2 times Once a week 2-3 times per week Almost daily
7. In what subjects did you use clickers? _______________________________________
Items 8-19 below were rated on a 7-point scale: 1 = Strongly Disagree, 2 = Disagree, 3 = Slightly Disagree, 4 = Neutral, 5 = Slightly Agree, 6 = Agree, 7 = Strongly Agree.
8. I was more motivated when clickers were used 1 2 3 4 5 6 7
9. I was more engaged in the lesson when clickers
were used 1 2 3 4 5 6 7
10. I participated more than I normally would when
clickers were used. 1 2 3 4 5 6 7
11. The class was in control when clickers were used 1 2 3 4 5 6 7
12. I liked seeing what other students in the class
selected for answers. 1 2 3 4 5 6 7
13. Using the clickers was a good way to test my knowledge 1 2 3 4 5 6 7
14. I liked using clickers for tests 1 2 3 4 5 6 7
15. I would prefer to use clickers 1 2 3 4 5 6 7
16. I did not feel bad when most students got an
answer right and I didn’t. 1 2 3 4 5 6 7
17. Using the clickers generated more class
discussion 1 2 3 4 5 6 7
18. I learned more when clickers were used 1 2 3 4 5 6 7
19. When clickers were used, the class was better 1 2 3 4 5 6 7
20. What was the impact of clickers on your learning in the past month? ____________________
____________________________________________________________________________
____________________________________________________________________________
____________________________________________________________________________
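Because the suggestions for future research call for establishing the reliability of this survey, the sketch below shows one common starting point: Cronbach's alpha computed over the 7-point Likert items. This is a hypothetical Python illustration with made-up responses, not an analysis performed in the study.

```python
from statistics import variance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a list of per-student lists of item scores."""
    k = len(item_scores[0])                 # number of survey items
    items = list(zip(*item_scores))         # transpose: scores grouped by item
    item_vars = [variance(item) for item in items]
    total_scores = [sum(student) for student in item_scores]
    total_var = variance(total_scores)
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical 1-7 Likert responses from five students on four survey items
# (illustrative values only, not data from the study).
responses = [
    [6, 5, 6, 7],
    [4, 4, 5, 4],
    [7, 6, 6, 6],
    [3, 4, 3, 4],
    [5, 5, 6, 5],
]
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```

Values above roughly 0.70 are conventionally taken to indicate acceptable internal consistency for a construct, so a computation of this kind would be one piece of the reliability evidence the chapter notes is still missing.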
APPENDIX B: CODING SCHEME FOR STUDENT COMMENTS ABOUT ICCS
Categories
Each comment was assigned a category code according to the following defining criteria:

Attend or focus more in class: Focuses more on the class or the questions; concentrates more.
Classroom environment: Talks about a change in the nature of the class (e.g., out of control or more calm).
Compare with other students: Talks about how the student did compared to the rest of the class.
Different Method: Liked the clickers because they were a new method or way of learning or doing things; scored positively if the clicker method was preferred and negatively if another method (e.g., paper and pencil) was preferred.
Discussion in Class: Brought about class discussion.
Engagement: Describes the experience as fun, motivating, or interesting.
Feedback: Refers to the feedback received, such as its quickness or seeing the answer right away.
Formative Assessment: Refers to formative assessment, including homework and quizzes (which students do not appear to consider summative), but not to a "summative test"; the student may refer to testing knowledge but not to "a test."
General Comment: Vague or general reference to learning, or a general comment.
Learning: Makes specific reference to learning or thinking.
Memory: Talks about remembering better.
Participation: Refers to increased participation, interactivity, getting involved, or more hands-on learning.
Performance: Refers to increased or decreased performance on a test (the impact on summative assessment), including improved ability to answer multiple choice questions.
Review: Talks about reviewing or getting prepared for a test.
Stressed: Talks about feeling rushed, or about stress, pressure, or frustration in general.
Teacher Explain: Notes that, as a result of using clickers, the teacher explained answers.
Technology: Refers to the use of the clicker technology itself.
Use was too infrequent: Did not use the clickers enough to comment on their impact.
Wrong Answers: Talks about feeling bad when getting wrong answers.
Rating
-2: Strongly negative comment, indicated by an adverb describing impact (e.g., very, too, really), a strong negative adjective (e.g., hate, annoying), a serious issue such as reduced confidence, more than one negative adjective, or an exclamation mark (e.g., "this was terrible!").
-1: Negative comment.
0: Neutral comment (e.g., no effect, no impact).
+1: Positive comment.
+2: Strongly positive comment, indicated by an adverb describing impact (e.g., very, too, really), a strong positive adjective (e.g., love, awesome, captivating, great), more than one positive adjective, or an exclamation mark (e.g., "this was good!").
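As a small illustration of how comments coded with the categories and ratings above might be summarized, the following sketch (hypothetical Python; the coded records are invented examples, not study data) counts the comments in each category and reports a mean rating on the -2 to +2 scale.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical coded comments: (category code, rating on the -2 to +2 scale).
# These records are invented examples, not data from the study.
coded_comments = [
    ("Engagement", 2),
    ("Engagement", 1),
    ("Formative Assessment", 1),
    ("Stressed", -1),
    ("Stressed", -2),
]

# Group ratings by category code.
ratings_by_category = defaultdict(list)
for category, rating in coded_comments:
    ratings_by_category[category].append(rating)

# Report the number of comments and the mean rating for each category.
for category, ratings in sorted(ratings_by_category.items()):
    print(f"{category}: n={len(ratings)}, mean rating={mean(ratings):+.2f}")
```

A summary of this kind makes it easy to see, for example, whether negatively rated comments cluster in the Stressed category, which is consistent with the pattern reported for summative uses of ICCS in this chapter.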