Exploring the Use of Audience Response Systems in Secondary
School Science Classrooms
Robin Kay · Liesel Knaack
Published online: 18 February 2009
© Springer Science+Business Media, LLC 2009
Abstract An audience response system (ARS) allows students to respond to multiple choice questions using remote control devices. Once the feedback is collected and displayed, the teacher and students discuss misconceptions and difficulties experienced. ARSs have been extremely popular and effective in higher education science classrooms, although almost no research has been done at the secondary school level. The purpose of this study was to conduct a detailed formative analysis of the benefits, challenges, and use of ARSs from the perspective of 213 secondary school science students. Perceived benefits were increased student involvement (engagement, participation, and attention) and effective formative assessment of student understanding. Perceived challenges included decreased student involvement and learning when ARSs were used for summative assessment, occasional technological malfunctions, resistance to using a new method of learning, and increased stress due to time constraints when responding to questions. Finally, students rated the use of ARSs significantly higher when it was used for formative as opposed to summative assessment.
Keywords Audience response systems · Secondary school · Science · Evaluation
Examining the Use of Audience Response Systems
in Secondary School Classrooms: A Formative Analysis
Overview
An Audience Response System (ARS), also known as a personal response system (Hinde and Hunt 2006), electronic voting system (Kennedy and Cutts 2005), student response system (Kaleta and Joosten 2007), and clickers (Bergtrom 2006), permits students to click in answers to electronically displayed multiple choice questions using a remote control device. After students select an answer, the results are instantly aggregated and displayed in chart form for the entire class to review. Responses are usually anonymous, but a teacher can assign individual remote devices to specific students for purposes of evaluation. Once the feedback is presented, a teacher can alter the course of instruction or students can work out misconceptions and difficulties through peer or classroom discussion. Widespread use of ARSs in secondary schools is a relatively new phenomenon, beginning in 2003.
Several researchers have completed extensive reviews on the use of ARSs (Caldwell 2007; Fies and Marshall 2006; Judson and Sawada 2002); however, only one paper could be found examining the use of ARSs in secondary school classrooms (Penuel et al. 2006). The purpose of the current study was to conduct a detailed formative analysis of the potential benefits, challenges, and use of ARSs from the perspective of secondary science students.
ARSs in Secondary Schools
Research on the use of ARSs in secondary schools is
sparse. Penuel et al. (2006) have conducted the only study
of ARSs in a K-12 environment. They noted that teachers
R. Kay (corresponding author) · L. Knaack
University of Ontario Institute of Technology, Oshawa, ON, Canada
e-mail: robin.kay@uoit.ca
J Sci Educ Technol (2009) 18:382–392
DOI 10.1007/s10956-009-9153-7
used ARSs for two principal reasons: formative assessment
(to improve learning and instruction) or summative
assessment (to deliver and grade tests). They also observed
that training increased frequency of use and that teachers
rarely used ARSs to promote discussion. Teachers who
adopted the view that students should play a significant,
active role in learning were more likely to use ARSs for
formative assessment. Overall, the main impact of ARSs in
K-12 was to increase motivation and improve student
learning.
Benefits of Using ARSs in Higher Education
Because there is limited research on the use of ARSs in the secondary school domain, it is worth reviewing the impact of ARSs in higher education, particularly since the majority of ARS use has occurred in science-related subject areas (e.g., Allen and Tanner 2005; Beatty et al. 2006; Dufresne and Gerace 2004; Fagan et al. 2002; Preszler et al. 2007; Sharma et al. 2005). A detailed review of the ARS literature in higher education revealed four key areas of focus: overall attitudes toward ARSs, student involvement, assessment, and learning. Each of these areas will be discussed in detail.
Overall Attitudes Toward ARS
Extensive data has been collected indicating that student attitudes toward ARSs are very positive (Caldwell 2007; Fies and Marshall 2006; Judson and Sawada 2002; Simpson and Oliver 2007). Students have also reported that ARSs are easy to learn and use (e.g., Hinde and Hunt 2006; Pelton and Pelton 2006; Pradhan et al. 2005; Sharma et al. 2005; Siau et al. 2006).
Student Involvement
When an ARS is used in a higher education classroom, students are more engaged in the content presented (e.g., Bergtrom 2006; Preszler et al. 2007; Simpson and Oliver 2007), participate more (Caldwell 2007; Draper and Brown 2004; Greer and Heaney 2004; Siau et al. 2006; Stuart et al. 2004; Uhari et al. 2003), and pay more attention to concepts presented (e.g., Bergtrom 2006; Draper and Brown 2004; Jackson et al. 2005; Latessa and Mouw 2005; Siau et al. 2006; Slain et al. 2004). In addition, a number of researchers have observed that ARSs stimulate discussion, especially when they are used with a peer instruction strategy (Beatty 2004; Brewer 2004; Jones et al. 2001; Nicol and Boyle 2003). With this strategy, students felt they were better able to discuss and improve their understanding of higher level concepts (Draper and Brown 2004).
Assessment
Evidence suggests that using an ARS improves the feedback cycle between instructor and students through the rapid, anonymous collection and presentation of all student responses to questions asked (Abrahamson 2006; McCabe 2006; Pelton and Pelton 2006). Students also enjoy seeing how well they are doing relative to their peers (Burton 2006; Caldwell 2007; Draper and Brown 2004; Hinde and Hunt 2006; Simpson and Oliver 2007). In addition, considerable data supports the use of ARSs for providing effective formative assessment of student knowledge (Beatty 2004; Bergtrom 2006; Brewer 2004; Caldwell 2007; Draper and Brown 2004; Dufresne and Gerace 2004; Greer and Heaney 2004; Jackson et al. 2005; Siau et al. 2006; Simpson and Oliver 2007; Stuart et al. 2004). Finally, some indirect evidence suggests that students do not like using ARSs when participation is associated with a grade (Caldwell 2007).
Learning
Many higher education students report that they learn more when ARSs are used (Greer and Heaney 2004; Nicol and Boyle 2003; Pradhan et al. 2005; Preszler et al. 2007; Siau et al. 2006; Slain et al. 2004; Stuart et al. 2004; Uhari et al. 2003). Some students prefer discussing ARS-based questions with peers who share a similar language and can therefore articulate solutions more successfully than the instructor (Nicol and Boyle 2003; Caldwell 2007). Other students maintain that answering ARS questions forces them to think more about the important concepts (Draper and Brown 2004; Greer and Heaney 2004) and resolve misconceptions (D'Inverno et al. 2003). Finally, there is substantial qualitative and quantitative evidence to suggest that learning performance increases as a direct result of using ARSs (e.g., Carnaghan and Webb 2007; Fagan et al. 2002; Kaleta and Joosten 2007; Kennedy and Cutts 2005; Pradhan et al. 2005; Preszler et al. 2007; Schackow et al. 2004).
Challenges Associated with Using an ARS
Three categories of challenges have been reported by students and instructors using ARSs: technology, time, and method. With respect to technology, students who forgot to bring their remote devices to class were unable to fully participate in ARS-based discussions (Caldwell 2007; Reay et al. 2005). Another, more critical technological issue occurred when the remote devices did not work, a particularly stressful situation during a formal evaluation (Sharma et al. 2005; Siau et al. 2006).
Several challenges associated with ARSs are related to time. There are very few classroom-ready, subject-specific collections of ARS questions available, so instructors have to develop original questions. Many find this task extremely time consuming (Allen and Tanner 2005; Beatty et al. 2006; Fagan et al. 2002; Freeman et al. 2007; Paschal 2002). In addition, the time required to set up an ARS, hand out the remote controls at the beginning of the class, and collect remote controls at the end of the class can be significant (Hatch et al. 2005; Stuart et al. 2004). Furthermore, some discussions, if not moderated properly, can consume considerable class time by drifting away from the main concepts (Nicol and Boyle 2003; Reay et al. 2005).
A final set of challenges involves students who react unfavorably to the use of an ARS because the method of learning has been changed. They are accustomed to lectures, and a switch in teaching format can lead to stress, frustration, and resistance at first (Beatty 2004; Fagan et al. 2002). In addition, using ARSs requires more cognitive energy and cooperation from students, and this extended effort may not suit students who are accustomed to passive lectures (Trees and Jackson 2007). Finally, some students doubt their ability to direct their own learning when ARSs are used (Allen and Tanner 2005).
Summary and Purpose of Study
Previous research on the use of ARSs in classrooms has been conducted almost exclusively in the domain of higher education. Benefits reported when ARSs are used have included greater student involvement, a positive reaction to formative assessment, improved quality of learning, and increased learning performance. Challenges identified when using ARSs have consisted of malfunctioning technology, increased demands on teaching time, and students having to adjust to a new way of learning. Since the use of ARSs is increasing rapidly in secondary schools (Abrahamson 2006), it is prudent to broaden the scope of current ARS research in the area of science, where a majority of higher education instructors have used ARSs. Therefore, the purpose of the current study was to examine benefits, challenges, and strategies associated with using ARSs in secondary school science classrooms. Three questions were addressed:
1. What were the perceived benefits of using an ARS in a secondary school science classroom?
2. What were the perceived challenges in using an ARS in a secondary school science classroom?
3. How did teaching strategy (summative vs. mixed vs. formative assessment) influence the use of ARSs in a secondary school science classroom?
Method
Sample
Students
The student sample consisted of 213 students (107 males, 105 females, 1 missing data) enrolled in grades 10 (n=50), 11 (n=81), and 12 (n=82). Subject areas where an ARS was used included biology, chemistry, physics, and general science. Eighty-six percent of the students claimed that they were comfortable or very comfortable with technology (n=182). Sample population data was collected from seven classrooms in six different secondary schools. All students were selected through convenience sampling and had to obtain signed parental permission to participate.
Teachers
The teacher sample consisted of seven teachers (5 males, 2 females) with 8–26 years of teaching experience (M=18.0, SD=7.0). Six of the seven teachers reported that they were comfortable or very comfortable with technology.
Procedure
Teachers were emailed by an educational coordinator and informed of the ARS study. Participation was voluntary, and subjects could withdraw from the study at any time without penalty or prejudice. Each teacher received two half days of training in November and February on how to use the ARS software and possible strategies for using an ARS in the classroom. Teachers were then asked to use an ARS in their regular classrooms, although frequency of use was controlled by the individual teacher. Each teacher had access to a laptop computer, an LCD projector, and one ARS system (32 clickers and a receiver) from E-Instruction (see http://www.einstruction.com/ for details of the specific ARS brand used). This equipment was purchased independently by the school board where the study was conducted. All teachers used the ARS for a three-month period; however, data collected for this study focussed on the last month. During the final month in which ARSs were used, all of the secondary school science students reported using the ARS one to two times.
Data Sources
Student Survey
Based on the last month in which an ARS was used, students completed the ARS Attitude Survey for Students (Kay 2008a). This survey consisted of 11 seven-point Likert scale items that focused on the four key areas reported in previous ARS research: overall attitudes, student involvement, assessment, and learning. Cronbach's reliability coefficient for the ARS survey was 0.88. However, since this was a formative analysis of student attitudes toward the use of ARSs in secondary school science classrooms, items on the scale were analyzed individually to gather as much information as possible.
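For readers who wish to replicate this kind of internal-consistency check, the sketch below shows the standard Cronbach's alpha calculation in Python. It is a minimal illustration, not the study's analysis script; the response matrix is simulated purely so the function is runnable.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of Likert scores."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # variance of each survey item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical data: 213 respondents x 11 seven-point items (illustration only)
rng = np.random.default_rng(42)
base = rng.integers(1, 8, size=(213, 1))        # shared attitude component per respondent
noise = rng.integers(-1, 2, size=(213, 11))     # item-level variation
responses = np.clip(base + noise, 1, 7)
print(f"alpha = {cronbach_alpha(responses):.2f}")
```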
Student Comments
Students were asked the following open-ended question: "What was the impact of clickers on your learning in the past month?" A coding scheme was developed to categorize 255 student comments made by 175 students (Kay 2008b). Some students made more than one comment, while other students offered no response. No student made more than three comments. Each comment was rated on a five-point Likert scale (−2 = very negative, −1 = negative, 0 = neutral, 1 = positive, 2 = very positive). Two raters (the author and a trained research assistant) assessed all comments made by students based on category and rating value. After round one, inter-rater reliability was 83% for categories and 93% for ratings. Comments where categories or ratings were not exactly the same were shared and reviewed a second time by each rater. After round two, an inter-rater reliability of 98% was reached for categories and 99% for the rating values.
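Percent agreement of this kind is straightforward to compute. A minimal sketch follows, assuming two equal-length lists of codes; the comment labels are invented for illustration and do not come from the study data.

```python
def percent_agreement(rater_a: list, rater_b: list) -> float:
    """Share of comments assigned the identical code by both raters, as a percentage."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Both raters must code the same set of comments")
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return 100 * matches / len(rater_a)

# Hypothetical round-one category codes for five comments
codes_a = ["engagement", "stress", "formative assessment", "general", "memory"]
codes_b = ["engagement", "stress", "formative assessment", "participation", "memory"]
print(percent_agreement(codes_a, codes_b))  # 80.0
```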
Teacher Questions
Teachers were asked to indicate their gender, years of teaching experience, comfort level with computer technology, and whether they used an ARS for formative and/or summative assessment. Finally, teachers were asked to respond to open-ended written questions about the overall impact of ARSs on their classroom and suggestions for future teachers.
Results
Benefits to Using an ARS
Overall Attitudes
Students were somewhat positive about using ARSs, with 62% agreeing that they preferred to use an ARS (M=5.0 ± 0.26, possible range was 1–7) and 42% indicating that their class was better when an ARS was used (M=4.5 ± 0.21, see Table 1). General comments offered in response to the open-ended survey question were mixed, with the majority being neutral (n=13). Representative neutral comments about ARSs were "It didn't affect me" and "It did not make much of a difference". Typical negative comments (n=5) were "Clickers are a waste of time" and "Horrendous—takes up too much time". Characteristic positive comments were "It was good" and "Amazing".
Table 1 Summary of ARS quantitative student survey questions (n = 213)

Question                                                                    n    Disagree (%)b  Agree (%)a  M    SD
Overall attitude
1. I would prefer to use clickers                                           209  23             62          5.0  1.9
2. When clickers were used, the class was better                            213  21             42          4.5  1.6
Student involvement
3. I was more engaged in the lesson when clickers were used                 213  13             70          5.2  1.6
4. I was more motivated when clickers were used                             213  17             63          5.0  1.7
5. I participated more than I normally would when clickers were used       213  19             62          5.0  1.6
6. Using clickers generated more class discussion                           213  20             53          4.6  1.5
Assessment
7. Using clickers was a good way to test my knowledge                       210  12             74          5.3  1.5
8. I liked seeing what other students in the class selected for answers     211  13             56          4.8  1.5
9. I did not feel bad when most students got an answer right and I didn't  213  35             39          4.3  1.8
10. I liked using clickers for tests                                        211  29             33          4.1  1.8
Learning
11. I learned more when clickers were used                                  211  24             43          4.3  1.5

a Combining "Slightly Agree", "Agree" and "Strongly Agree" responses
b Combining "Slightly Disagree", "Disagree" and "Strongly Disagree" responses
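The aggregate columns in Table 1 follow directly from the footnoted collapsing rule. A minimal sketch of that arithmetic, assuming a hypothetical array of seven-point responses for a single item (responses of 1–3 count as disagreement, 5–7 as agreement, 4 as neutral):

```python
import numpy as np

def summarize_item(scores: np.ndarray) -> dict:
    """Table 1-style aggregates for one seven-point Likert item."""
    return {
        "n": scores.size,
        "disagree_pct": round(100 * np.mean(scores <= 3)),  # slightly..strongly disagree
        "agree_pct": round(100 * np.mean(scores >= 5)),     # slightly..strongly agree
        "M": round(float(scores.mean()), 1),
        "SD": round(float(scores.std(ddof=1)), 1),
    }

# Hypothetical responses to "I would prefer to use clickers"
scores = np.array([7, 6, 5, 5, 4, 3, 6, 7, 2, 5, 6, 1, 5, 4, 6])
print(summarize_item(scores))
```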
Of the 17 comments made about student reaction to
using a different method of teaching, 11 were positive (see
Table 2). Sample positive comments were as follows:
It’s fun to use and adds a certain aspects to the class
that you wouldn’t have in a regular class.
I learned better because it promoted friendly com-
petition within the class. Also it made reviewing
easier and more fun. It is better than reading ques-
tions out of a textbook.
I learn better than sitting and listening and it
improves my multiple choice skills.
Student Involvement
Students indicated that they were more involved in science class when an ARS was used. Based on the survey data, 70% of the students agreed that they were more engaged (M=5.2 ± 0.21), 63% agreed they were more motivated (M=5.0 ± 0.23), 62% agreed that they participated more than they normally would (M=5.0 ± 0.21), and 53% agreed that more class discussion was generated (M=4.6 ± 0.20).
Student comments regarding engagement were universally positive (M=1.24, scale ranged from −2 to +2).
Typical comments were as follows:
It motivated me and helped me a little with constantly
paying attention.
Using the clickers made the lesson more enjoyable.
Keeps me awake and helps to keep me interested in
the subjects we are learning.
Fewer comments were made about participation than
engagement (n=17), although the mean rating was high
(M=0.94) and most comments were positive (n=15).
Typical comments about participation included:
I’ve only used them once but I really liked them. I
thought it was a great way to be involved. Everybody
was doing something.
It forced people who normally don’t feel comfortable
participating in class discussion to participate.
Only 12 comments were made about paying attention, but the mean rating was high (M=1.00). Sample attention-based comments were as follows:
I am forced to pay more attention in class which helps
me learn better.
I was more inclined to listen in class because I knew
that the lesson would be shortened and therefore I
could focus more on the lesson.
Table 2 Summary of student comments about audience response systems (n = 255)

Category                    Mean (SD)     No. negative  No. positive  Total
Overall attitude
General comments            −0.04 (0.93)  5             5             23
Different methods used      0.35 (0.93)   5             11            17
Student involvement
Engagement                  1.24 (0.43)   0             42            42
Participation               0.94 (0.43)   0             15            17
Paid attention more         1.00 (0.00)   0             12            12
Discussion                  1.00 (0.00)   0             2             2
Stress                      −1.00 (1.10)  9             2             11
Assessment
Formative assessment        1.00 (0.34)   0             17            18
Compare with others         1.00 (0.00)   0             4             4
Feedback                    1.00 (0.00)   0             3             3
Wrong answer—reaction       −1.00 (0.00)  2             0             2
Learning
Learning process            0.52 (0.91)   3             16            29
Review previous concepts    1.20 (0.41)   0             15            15
Memory                      0.62 (1.06)   1             7             8
Teacher explained better    1.00 (0.00)   0             2             2
Learning performance        −0.71 (0.97)  23            6             31
Technology issues           −1.00 (0.93)  7             1             8
Assessment
Students liked using ARSs for formative assessment. Just under 75% of students agreed that using ARSs was a "good way" to test their knowledge (M=5.3 ± 0.20). Comparing answers with other students (M=4.8 ± 0.20) was rated somewhat lower, with 56% agreeing that they liked to see the responses of their peers. Students were split on how they felt when they selected a wrong answer and the rest of the class chose a correct answer: 35% felt bad, while 39% did not. Finally, only 33% of the students agreed that they liked using an ARS for formal tests or summative assessment (M=4.1 ± 0.24).
Comments about formative assessment were rated high (M=1.00); however, few comments were made about comparing responses or not getting a correct answer. The following are typical comments made about formative assessment:
Helped me by testing my knowledge and getting to
know where the class and I are at.
Clickers were fairly good in a normal class and fairly
good way to test our knowledge. However, on tests, it
is very inefficient and can waste time.
We found out what common mistakes were made
more commonly, then discussed why that was the
wrong answer.
Learning
Only 43% of the students agreed that they learned more when an ARS was used. The mean score for this item was 4.3 (± 0.20), the second lowest of all survey questions. However, open-ended comments revealed a different picture. Ratings for comments about the learning process (M=0.52), reviewing previous concepts (M=1.20), helping to improve memory (M=0.62), and teacher explanations (M=1.00) were relatively high. However, the mean rating for learning performance comments was negative (M=−0.71). It is important to note, though, that all of the negative comments about learning performance were made when ARSs were used for formal testing, as opposed to formative assessment. Comments made about the overall learning process (n=29) and reviewing concepts (n=15) were the most common. Typical remarks made about the learning process were:
Helped me figure out how much I learned.
Using the clickers helped me get a better understanding of some of the concepts that I did not understand before.
Since clickers were used to review for a test, I was
able to obtain answers to test-like questions easily
giving me insight as to what more studying was
needed.
Representative comments about using an ARS to review concepts included:
It reinforced the idea of physics just before the test.
However, I found the practice test we did with the
clickers very helpful.
It was great review of the units and topics we just
covered, and I find I remember things better if I get
them wrong first on a test and this system always
seems like a test for me so I feel I learn more this
way.
Comments made about an ARS helping to improve
memory were less frequent (n=8) but worth noting.
It actually helped me remember facts we have been
learning better.
Because we could see the correct answer immediately
after it helped me remember the answer and get my
questions answered.
Challenges Experienced Using ARSs
Technology
Attitudes toward ARS technology were not assessed using the survey data, but the mean rating for comments was negative (M=−1.00), albeit from only eight students. The majority of student complaints were targeted toward whether the remote devices actually worked.
They are too difficult to use and it was difficult to tell
when the correct answer was chosen.
Took more time to learn to use them than time spent
learning; caused confusion; didn’t like pointing at the
receiver; not good in a test situation.
Overall Attitudes
While most "overall" attitudes toward ARSs were neutral, comments from five students were exceptionally negative (e.g., "Clickers are a waste of time" or "Horrendous—takes up too much time"). Other comments were focussed on using a new method to learn.
I prefer to listen to the teacher—using the clickers
constantly would become tedious.
It went slowly and I don’t like the time limit
questions.
Student Involvement
As stated earlier, most students were enthusiastically involved when an ARS was used; however, some students were stressed. Typical comments referred to feeling pressure during a test.
Using the clickers on a test is very stressful. Not only
are we stressed because of the test conditions, but it’s
difficult to work the clickers.
The clickers have made me nervous in test situations
and have made me lose confidence in situation where
immediate feedback was given. The clickers have had
an altogether negative feedback on my learning
experience in biology class.
Assessment
The survey indicated that 35% of secondary students "felt bad" selecting an incorrect response when almost everyone else in the class selected a correct response (see Table 1). Most comments about assessment and ARSs (see "Benefits") were positive.
Learning Performance
Even though reviewing concepts, improving memory, and obtaining explanations from the teacher were rated as positive consequences of using an ARS, a small group of students (n=23) was adamant about the negative impact of ARSs on learning performance. Representative comments included:
Frustrating, stressful to use in test situations.
The time constraints compared to a normal test were
very annoying since the slow students miss out and
the speedy students get bored.
The clickers are NOT good to use on long tests;
they’re frustrating and hard to check your answers.
Teaching Strategy Used with ARSs
Comprehensive information was not collected on how ARSs were used in secondary science classrooms; however, three general assessment-based strategies were reported by teachers: using an ARS for formative assessment only (formative, n=130), using an ARS for both summative tests and formative assessment (mixed, n=45), and using an ARS for summative assessment only (summative, n=38). Means for all items on the ARS student attitude scale as a function of strategy selected are presented in Table 3. Mean values for all attitude scale items were lowest when a summative assessment strategy was used and highest when a formative assessment strategy was employed.
A MANOVA was run to compare formative, mixed, and summative approaches to using the ARS (Table 4). Using an ARS for formative assessment was rated significantly more positively than using an ARS for summative assessment on all 11 Likert scale items in the ARS attitude scale. Using an ARS for formative assessment also resulted in significantly higher scores on most survey items when compared to a mixed approach (formative and summative).
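The per-item comparisons reported in Table 4 can be approximated with a one-way ANOVA followed by Scheffé's post hoc criterion. The sketch below is an illustrative reimplementation, not the authors' analysis code; the group sizes mirror the study (formative n=130, mixed n=45, summative n=38), but the response values are simulated.

```python
import numpy as np
from scipy import stats

def anova_with_scheffe(groups: list, alpha: float = 0.05):
    """One-way ANOVA plus Scheffé pairwise contrasts for a single survey item."""
    k = len(groups)
    n_total = sum(len(g) for g in groups)
    grand_mean = np.concatenate(groups).mean()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    ms_within = ss_within / (n_total - k)
    f_stat = (ss_between / (k - 1)) / ms_within
    f_crit = stats.f.ppf(1 - alpha, k - 1, n_total - k)
    significant = {}
    for i in range(k):
        for j in range(i + 1, k):
            diff = abs(groups[i].mean() - groups[j].mean())
            se = np.sqrt(ms_within * (1 / len(groups[i]) + 1 / len(groups[j])))
            # Scheffé criterion: |difference| must exceed sqrt((k - 1) * F_crit) * SE
            significant[(i, j)] = diff > np.sqrt((k - 1) * f_crit) * se
    return f_stat, significant

# Simulated seven-point responses (group 0 = formative, 1 = mixed, 2 = summative)
rng = np.random.default_rng(7)
groups = [np.clip(rng.normal(m, 1.5, n), 1, 7)
          for m, n in [(5.6, 130), (4.3, 45), (3.3, 38)]]
f_stat, sig = anova_with_scheffe(groups)
print(f"F = {f_stat:.1f}", sig)
```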
Teachers’ Comments
When science teachers were asked about the overall impact of ARSs, two main themes emerged: learning and motivation. With respect to learning, comments focussed on the value of feedback, not having to write everything down, and examining areas of weakness.
The immediacy of the feedback permits me to iden-
tify difficulties and address them quickly.
My class really struggles with written work so the
clickers were very popular.
By using clickers as an assessment tool, I can focus
lessons on areas of weakness.
Regarding motivation, using an ARS appeared to
increase the overall energy level in the science classroom.
Clickers have made my review sessions very exciting
and motivating for my students.
They benefited—the positive energy was good. For
example they would say yeah we’re using the clickers
today instead of your usual boring lesson!
When prompted to give suggestions to future teachers about using ARSs, science teachers in this study offered the following observations. First, setting up an ARS is time consuming, as is the creation of good questions. Second, keep clicker sessions short to avoid losing the focus of a lesson. Finally, sometimes the remote devices do not work and students have difficulty registering responses; using the more reliable radio frequency (as opposed to infrared) remote devices was encouraged. It is important to remember that these comments were collected from only seven secondary school science teachers.
Discussion
The purpose of this study was to conduct a formative analysis of ARS use in secondary school science classrooms. Three areas were examined: benefits, challenges, and teaching strategies.
Benefits
Overall Attitudes
Table 3 Mean ARS survey item scores as a function of strategy used (n = 213)

Survey item                                                                Summative   Mixed       Formative
                                                                           M (SD)      M (SD)      M (SD)
Overall attitudes
1. I would prefer to use clickers                                          3.3 (1.7)   4.3 (1.7)   5.6 (1.7)
2. When clickers were used, the class was better                           3.4 (1.3)   4.6 (1.3)   4.8 (1.7)
Student involvement
3. I was more engaged in the lesson when clickers were used                3.8 (1.5)   4.9 (1.4)   5.6 (1.4)
4. I was more motivated when clickers were used                            3.6 (1.6)   4.6 (1.5)   5.6 (1.4)
5. I participated more than I normally would when clickers were used      4.1 (1.5)   4.6 (1.5)   5.4 (1.5)
6. Using clickers generated more class discussion                          3.5 (1.3)   4.8 (1.2)   4.8 (1.6)
Assessment
7. Using clickers was a good way to test my knowledge                      4.3 (1.6)   5.2 (1.5)   5.7 (1.2)
8. I liked seeing what other students in the class selected for answers    3.5 (1.0)   5.4 (1.5)   5.0 (1.4)
9. I did not feel bad when most students got an answer right and I didn't 4.3 (1.5)   3.3 (1.5)   4.6 (1.8)
10. I liked using clickers for tests                                       2.3 (1.7)   3.7 (1.8)   4.8 (1.4)
Learning
11. I learned more when clickers were used                                 3.1 (1.4)   4.1 (1.3)   4.6 (1.5)

Mixed = formative & summative assessment
Table 4 MANOVA examining attitude toward ARS as a function of teaching strategy

Source                                                                     df  SS     F      Scheffé's post hoc (p < .05)
Overall attitude
1. I would prefer to use clickers                                          2   173.8  30.7*  Formative > Mixed & Summative
2. When clickers were used, the class was better                           2   58.7   12.8*  Formative & Mixed > Summative
Student involvement
3. I was more engaged in the lesson when clickers were used                2   97.6   24.5*  Formative > Mixed > Summative
4. I was more motivated when clickers were used                            2   118.1  26.5*  Formative > Mixed & Summative
5. I participated more than I normally would when clickers were used      2   62.5   13.8*  Formative > Mixed & Summative
6. Using clickers generated more class discussion                          2   49.8   11.8*  Formative & Mixed > Summative
Assessment
7. Using clickers was a good way to test my knowledge                      2   58.0   15.6*  Formative & Mixed > Summative
8. I liked seeing what other students in the class selected for answers    2   84.6   23.3*  Formative & Mixed > Summative
9. I did not feel bad when most students got an answer right and I didn't 2   48.5   8.3*   Formative & Mixed > Summative
10. I liked using clickers for tests                                       2   164.6  35.3*  Formative > Mixed > Summative
Learning
11. I learned more when clickers were used                                 2   59.3   29.7*  Formative & Mixed > Summative

* p < .001
The survey data suggests that science students were moderately positive about using ARSs, a result that is partially consistent with previous studies conducted in higher education. Part of the problem with assessing overall attitudes is that positive and negative attitudes can cancel each other out, leading to a neutral result. As Fies and Marshall (2006) suggest, general impressions and anecdotal comments about attitudes toward ARSs need to be supplemented by a more detailed and focussed analysis to provide greater depth.
Student Involvement
Relative to assessment and learning, student involvement was rated highest by students in both the survey data and open-ended comments. The data collected strongly suggested that ARSs helped increase engagement, participation, and, to a lesser extent, attention paid in class. These results are consistent with those observed in higher education (e.g., Caldwell 2007; Fies and Marshall 2006; Judson and Sawada 2002; Simpson and Oliver 2007). It is unclear, though, whether increases in student involvement are a superficial reaction to the excitement of using TV-like remote devices in a game-like atmosphere, or a reflection of a more profound change in student learning and cognitive engagement. The next stage of research on student involvement is to examine what in particular is engaging about ARSs, the quality of the participatory efforts by students, and why students are more focussed when an ARS is used.
Assessment
A majority of secondary school science students appreciated the use of ARSs for formative assessment, particularly with respect to checking their understanding and reviewing for tests. Comparing answers with other students and getting general feedback played a secondary role. In addition, most teacher comments enthusiastically endorsed the use of ARSs for formative assessment. These results are consistent with those observed in higher education (Beatty 2004; Bergtrom 2006; Brewer 2004; Caldwell 2007; Draper and Brown 2004; Dufresne and Gerace 2004; Greer and Heaney 2004; Jackson et al. 2005; Siau et al. 2006; Simpson and Oliver 2007; Stuart et al. 2004). The success of using formative assessment with ARSs stands in marked contrast to students' resistance to summative assessment.
Learning
A number of students commented that using ARSs to review concepts improved the overall learning process. The impact of using ARSs on improving recall of material presented was noteworthy, but commented on by only a few students. While challenges will be discussed in the next section, it is important to note that perceptions of an improved learning process did not necessarily lead to increased learning performance. At least 23 students felt that learning performance was hampered, sometimes appreciably, when ARSs were used to deliver summative assessment. Simply using an ARS does not guarantee success, and the choice of pedagogy clearly affects students' perception of learning.
Challenges
Overall Attitudes
It is evident that a few students had difficulty adjusting to the newness of using an ARS. While this type of problem was experienced by less than 2% of the student sample, it is a challenge that has also been noted in higher education. The suggested remedy has been to engage in fun practice sessions before using ARSs for teaching, to clearly explain the rationale for using an ARS, and to identify the intended benefits for students (Crouch and Mazur 2001; Trees and Jackson 2007).
Student Involvement
While a majority of students were more involved when ARSs were used, some students were decidedly stressed, particularly in a summative testing situation. The comments about increased stress were not frequent, but they were passionate. Given the student frustration level when an ARS was used for summative tests (see "Teaching Strategy Used with ARSs"), it might be prudent to use this technology for formative assessment purposes only. Student anxiety might also be reduced by emphasizing learning rather than having to get the right answer.
Learning
The most noteworthy learning challenge when using ARSs was the negative impact on learning performance perceived by students who were asked to use this tool to complete graded tests. The overall rating of this practice was the lowest of all survey items. However, this study was a preliminary investigation of the use of ARSs in secondary school science classrooms, and more thought and analysis needs to be directed toward the use of summative assessment.
Teaching Strategy Used with ARSs
Only one previous study has reviewed the use of ARSs in a K-12 environment (Penuel et al. 2006). These researchers observed that teachers naturally migrated to one of two strategies when using ARSs: formative (instructional) or summative (testing) use. A similar pattern was observed in the current study with secondary school science teachers. Most of the science teachers in the current study chose to use an ARS as a formative assessment tool. Based on the survey results, students overwhelmingly preferred formative over summative use. Overall attitude, student involvement, and the value of ARSs as effective learning tools were all rated significantly lower when students experienced summative as opposed to formative assessment. No research to date has compared formative and summative approaches, so further research is needed to confirm these initial findings.
Recommendation to Educators
The current study is a formative analysis of the use of ARSs in secondary school science classrooms, so it would be inappropriate to offer unequivocal advice about their use in the classroom. That said, there are several tentative suggestions that may be worth noting.
First, many secondary school science students did not respond well when an ARS was used as a test-taking tool. Student involvement and learning performance were perceived as decreasing. In the interest of minimizing the negative impact of this tool, at least in its initial implementation in secondary schools, it might be safer to limit the use of ARSs to formative assessment.
Second, it may be important to explain why an ARS is being used in the classroom. This practice might help the small but vocal group of students who resisted this new method of learning. Caldwell (2007) and Trees and Jackson (2007) note that if you expect to gain full student acceptance, you need to explain why you are using ARSs and what you expect to achieve with the technology.
Third, teachers should be aware of the time needed to set up an ARS and, more importantly, to create effective questions. While only two teachers commented on this issue, previous research in higher education suggests that question development is very time consuming (Allen and Tanner 2005; Beatty et al. 2006; Fagan et al. 2002; Freeman et al. 2007; Paschal 2002).
Finally, because stress was experienced by students
when the technology did not work, teachers should test the
ARS equipment ahead of time in order to ensure that all
remote devices respond properly. Batteries, for example,
might need to be changed.
Caveats and Future Research
This study investigated the use of ARSs in secondary school science classrooms. Two principal data collection tools were used: survey questions and open-ended comments. The design and analysis of the data collection tools were based on a thorough review of the literature examining ARS use in higher education. Nonetheless, at least three caveats need to be considered when interpreting the current results.
First, the data are intended to provide a starting point for investigating the use of ARSs in secondary school science classrooms. More detailed examination is required in key areas such as identifying sources of student engagement, the effect of different teaching strategies, and why long-term memory might be enhanced with the use of ARSs.
Second, the results are based on limited use of an ARS: once or twice over a one-month period. The impact could be distinctly different if an ARS were used regularly. For example, engagement may be reduced as students become more familiar with ARSs. On the other hand, the impact of an ARS on learning may be more significant with increased use.
Finally, learning performance needs to be examined in
more detail. While systematic data on the perceived quality
of learning was collected for the current study, evaluation
of student performance is needed to firmly establish the
actual learning impact of ARSs.
References
Abrahamson L (2006) A brief history of networked classrooms:
effects, cases, pedagogy, and implications. In: Banks DA (ed)
Audience response systems in higher education. Information
Science Publishing, Hershey, pp 1–25
Allen D, Tanner K (2005) Infusing active learning into the large-
enrolment biology class: seven strategies, from the simple to
complex. Cell Biol Educ 4:262–268. doi:10.1187/cbe.05-08-
0113
Beatty I (2004) Transforming student learning with classroom
communication systems. EDUCAUSE Res Bull 3:1–13.
Retrieved 3 Nov 2007 from http://www.educause.edu/ir/library/
pdf/ERB0403.pdf
Beatty ID, Leonard WJ, Gerace WJ, Dufresne RJ (2006) Question
driven instruction: teaching science (well) with an audience
response system. In: Banks DA (ed) Audience response systems
in higher education. Information Science Publishing, Hershey,
pp 96–115
Bergtrom G (2006) Clicker sets as learning objects. Interdiscip
J Knowl Learn Objects 2. Retrieved 3 Nov 2007 from http://
ijklo.org/Volume2/v2p105-110Bergtrom.pdf
Brewer CA (2004) Near real-time assessment of student learning and
understanding in biology courses. Bioscience 54(11):1034–1039.
doi:10.1641/0006-3568(2004)054[1034:NRAOSL]2.0.CO;2
Burton K (2006) The trial of an audience response system to facilitate
problem-based learning in legal education. In: Banks DA (ed)
Audience response systems in higher education. Information
Science Publishing, Hershey, pp 265–276
Caldwell JE (2007) Clickers in the large classroom: current research
and best-practice tips. Life Sci Educ 6(1):9–20
Carnaghan C, Webb A (2007) Investigating the effects of group
response systems on student satisfaction, learning, and engage-
ment in accounting education. Issues Acc Educ 22(3):391–409.
doi:10.2308/iace.2007.22.3.391
Crouch CH, Mazur E (2001) Peer instruction: ten years of experience
and results. Am J Phys 69(9):970–977. doi:10.1119/1.1374249
D’Inverno R, Davis H, White S (2003) Using a personal response
system for promoting student interaction. Teach Math Its Appl
22(4):163–169. doi:10.1093/teamat/22.4.163
Draper SW, Brown MI (2004) Increasing interactivity in lectures
using an electronic voting system. J Comput Assist Learn
20(2):81–94. doi:10.1111/j.1365-2729.2004.00074.x
Dufresne RJ, Gerace WJ (2004) Assessing-to-learn: formative
assessment in physics instruction. Phys Teach 42:428–433. doi:
10.1119/1.1804662
Fagan AP, Crouch CH, Mazur E (2002) Peer instruction: results from
a range of classrooms. Phys Teach 40(4):206–209. doi:
10.1119/1.1474140
Fies C, Marshall J (2006) Classroom response systems: a review of
the literature. J Sci Educ Technol 15(1):101–109. doi:10.1007/
s10956-006-0360-1
Freeman M, Bell A, Comerton-Forder C, Pickering J, Blayney P (2007)
Factors affecting educational innovation with in class electronic
response systems. Australas J Educ Technol 23(2):149–170
Greer L, Heaney PJ (2004) Real-time analysis of student compre-
hension: an assessment of electronic student response technology
in an introductory earth science course. J Geosci Educ 52(4):
345–351
Hatch J, Jensen M, Moore R (2005) Manna from heaven or clickers
from hell. J Coll Sci Teach 34(7):36–39
Hinde K, Hunt A (2006) Using the personal response system to
enhance student learning: some evidence from teaching eco-
nomics. In: Banks DA (ed) Audience response systems in higher
education. Information Science Publishing, Hershey, pp 140–154
Jackson M, Ganger AC, Bridge PD, Ginsburg K (2005) Wireless
handheld computers in the undergraduate medical curriculum.
Med Educ Online 10(5). Retrieved 3 Nov 2007 from http://www.
med-ed-online.org/pdf/t0000062.pdf
Jones C, Connolly M, Gear A, Read M (2001) Group integrative
learning with group process support technology. Br J Educ
Technol 32(5):571–581. doi:10.1111/1467-8535.00226
Judson E, Sawada D (2002) Learning from past and present:
electronic response systems in college lecture halls. J Comput
Math Sci Teach 21(2):167–181
Kaleta R, Joosten T (2007) Student response systems: a university of
Wisconsin system study of clickers. EDUCAUSE Res Bull
2007(10):1–12
Kay RH (2008a) Audience response systems attitude scale. Retrieved 22
Nov 2008 from http://faculty.uoit.ca/kay/papers/arshs/AppendixA_
ARS.pdf
Kay RH (2008b) Coding scheme for secondary school students
comments about audience response systems. Retrieved 22 Nov
2008 from http://faculty.uoit.ca/kay/papers/arshs/AppendixB_
ARS.pdf
Kennedy GE, Cutts QI (2005) The association between students’ use
of electronic voting systems and their learning outcomes.
J Comput Assist Learn 21(4):260–268. doi:10.1111/j.1365-2729.
2005.00133.x
Latessa R, Mouw D (2005) Use of audience response system to
augment interactive learning. Fam Med 37(1):12–14. Retrieved
3 Nov 2007 from http://www.stfm.org/fmhub/fm2005/January/
Robyn12.pdf
McCabe M (2006) Live assessment by questioning in an interactive
classroom. In: Banks DA (ed) Audience response systems in
higher education. Information Science Publishing, Hershey,
pp 276–288
Nicol DJ, Boyle JT (2003) Peer instruction versus class-wide
discussion in large classes: a comparison of two interaction
methods in the wired classroom. Stud High Educ 28(4):457–473.
doi:10.1080/0307507032000122297
Paschal CB (2002) Formative assessment in physiology teaching
using a wireless classroom communication system. Adv Physiol
Educ 26(4):299–308
Pelton LF, Pelton T (2006) Selected and constructed response systems
in mathematics. In: Banks DA (ed) Audience response systems
in higher education. Information Science Publishing, Hershey,
pp 175–186
Penuel WR, Abrahamson L, Roschelle J (2006) Theorizing the
transformed classroom: sociocultural interpretation of the effects
of audience response systems in higher education. In: Banks DA
(ed) Audience response systems in higher education. Information
Science Publishing, Hershey, pp 187–208
Pradhan A, Sparano D, Ananth CV (2005) The influence of an
audience response system on knowledge retention: an applica-
tion to resident education. Am J Obstet Gynecol 193(5):1827–
1830. doi:10.1016/j.ajog.2005.07.075
Preszler RW, Dawe A, Shuster CB, Shuster M (2007) Assessment of
the effects of student response systems on student learning and
attitudes over a broad range of biology courses. CBE Life Sci
Educ 6(1):29–41. doi:10.1187/cbe.06-09-0190
Reay NW, Bao L, Li P, Warnakulasooriya R, Baugh G (2005) Toward
the effective use of voting machines in physics lectures. Am
J Phys 73(6):554–558. doi:10.1119/1.1862638
Schackow TE, Milton C, Loya L, Friedman M (2004) Audience
response system: effect on learning in family medicine residents.
Fam Med 36:496–504
Sharma MD, Khachan J, Chan B, O’Byrne J (2005) An investigation
of the effectiveness of electronic classroom communication
systems in large lectures. Australas J Educ Technol 21(2):
137–154
Siau K, Sheng H, Nah F (2006) Use of classroom response system to
enhance classroom interactivity. IEEE Trans Educ 49(3):398–
403. doi:10.1109/TE.2006.879802
Simpson V, Oliver M (2007) Electronic voting systems for lectures
then and now: a comparison of research and practice. Australas
J Educ Technol 23(2):187–208
Slain D, Abate M, Hodges BM, Stamatakis MK, Wolak S (2004) An
interactive response system to promote active learning in the
doctor of pharmacy curriculum. Am J Pharm Educ 68(5):1–9
Stuart SAJ, Brown MI, Draper SW (2004) Using an electronic voting
system in logic lectures: one practitioner’s application. J Comput
Assist Learn 20(2):95–102. doi:10.1111/j.1365-2729.2004.
00075.x
Trees AR, Jackson MH (2007) The learning environment in clicker
classrooms: student processes of learning and involvement in
large university course using student response systems. Learn
Media Technol 32(1):21–40. doi:10.1080/17439880601141179
Uhari M, Renko M, Soini H (2003) Experiences of using an
interactive audience response system in lectures. BMC Med
Educ 3(12):1–6
... This includes assessment evidence about student achievement, but in the case of DBDM, it also includes data such as information about student characteristics and structured classroom observation data (Kerr et al., 2006). Furthermore, the use of formative assessment requires knowledge and skills related to constructing and using a range of assessment instruments, such as paper-and-pencil tests and homework assignments (Christoforidou et al., 2014;Feldman & Capobianco, 2008;Gottheiner & Siegel, 2012;Kay & Knaack, 2009;Ní Chróinín & Cosgrave, 2013;Yin et al., 2013), and the knowledge and skills to critically evaluate these assessment instruments (Gottheiner & Siegel, 2012). ...
... Further, teachers need to be able to transform data into information based on their analysis and interpretation of the data. This step concerns identifying student learning needs (Blanc et al., 2010;Schildkamp & Kuiper, 2010;Schildkamp et al., 2012;Schildkamp, Karbautzki et al., 2014;Young, 2006), and determining appropriate actions to take in the classroom, such as re-teaching certain content, grouping students differently, or differentiating instruction (Blanc et al., 2010;Datnow, Park, & Kennedy-Lewis, 2013;Feldman & Capobianco, 2008;Fuchs et al., 1999;Gottheiner & Siegel, 2012;Kay & Knaack, 2009;Kerr et al., 2006;Lee, 2011;Levin & Datnow, 2012;McNaughton et al., 2012;Penuel, Boscardin, Masyn, & Crawford, 2007;Schildkamp & Kuiper, 2010;Schildkamp & Visscher, 2010a;Schildkamp et al., 2012;Schildkamp, Karbautzki et al., 2014, 2010bYoung, 2006). Young (2006) x DBDM US n/a QL 4 n/a n/a . ...
... International Journal of Educational Research 103 (2020) 101602 3.1.2. Pedagogical content knowledge Fifteen studies concluded that teachers need pedagogical content knowledge (PCK) to be able to implement DBDM or AfL (Aschbacher & Alonzo, 2006;Birenbaum et al., 2011;Blanc et al., 2010;Feldman & Capobianco, 2008;Fletcher & Shaw, 2012;Fox-Turnbull, 2006;Fuchs et al., 1999;Gottheiner & Siegel, 2012;Harris, Brown, & Harnett, 2014;Kay & Knaack, 2009;Lee, 2011;McNaughton et al., 2012;Penuel et al., 2007;Yin et al., 2013). PCK refers to subject-matter content knowledge, as well as knowledge about how to teach subject-matter knowledge. ...
Article
Full-text available
Formative assessment has the potential to support teaching and learning in the classroom. This study reviewed the literature on formative assessment to identify prerequisites for effective use of formative assessment by teachers. The review sought to address the following research question: What teacher prerequisites need to be in place for using formative assessment in their classroom practice? The review was conducted using a systematic approach. A total of 54 studies were included in this review. The results show that (1) knowledge and skills (e.g., data literacy), (2), psychological factors (e.g., social pressure), and (3) social factors (e.g., collaboration) influence the use of formative assessment. The prerequisites identified can inform professional development initiatives regarding formative assessment, as well as teacher education programs.
... These appear to result in improved levels of student engagement (DeSorbo et al. 2013;Vital 2011) and achievement (Mun et al. 2009), possibly associated with increased demands for participation and the availability of (generic) formative feedback. Reported challenges regarding the use of clickers include the malfunction of the hardware, time pressure, student maturity levels, limited response formats, the construction of appropriate questions, utilizing feedback formatively, and discomfort with the new technology that interfered with students' learning (Kay and Knaack 2009;Lee et al. 2012;Vital 2011). However, with the increasing accessibility of technology and use in all aspects of life, it is anticipated that these issues would have diminished over time. ...
... Although it has been applied to different academic disciplines such as psychology (Morling et al. 2008;Dallaire 2011;Ludvigsen et al. 2015), health (Schackow et al. 2004;Hughes et al. 2011), sciences (Kay and Knaack 2009;Moss and Crowley, 2011) or accounting (Carnaghan and Webb 2007;Camacho-Miñano and Del Campo 2014), it cannot yet be considered as widely used. In this sense, Emenike and Holme (2012) argue, using a methodological analysis of binary data between users and non-users of the system for a sample carried out at faculties of chemistry, that this technology is still at an early user stage, in transition towards a more widespread use status taking the life cycle of technology adoption (technology adoption life cycle -TALC). ...
Article
Full-text available
p class="Textoindependiente21"> The European Higher Education Area (EHEA) involves changing traditional methods to promote innovative teaching experiences. This paper has two main aims: a) to show evidence of the use of Interactive Response Systems (IRS) to identify gaps in the understanding of the course contents and b) to investigate factors influencing students’ attitudes towards the use of IRS. The experience was developed through a collective tutoring session in the subject of Economics using IRS. Economics is a first-year subject in the Degree of Business Administration and Management offered by the University of Cadiz, which includes contents of Microeconomics and Macroeconomics and uses economic models to explain the function of the economy and the behaviour of economic agents. Results show that IRS technique allows detecting gaps in learning and comprehension. From our econometric estimations, we also identify two strongly significant variables affecting students’ attitudes towards IRS: gender and received explanations regarding the use of IRS. Variables such as first enrolment in the subject and the number of hours devoted to studying have a positive and significant effect on the attitude to IRS, but at a lower level of significance (from 5% to 10%). </p
... Similarly to what others have found regarding student feedback on the use of a student response system (Addison et al. 2009;Caldwell 2007;Draper 2002;FitzPatrick et al. 2011;Geertsen 2015;Kay & Knaack 2009), the perception of students enrolled in the A&P I and II courses on the i>clicker use was highly positive and consistent across semesters and courses. The overall response rate of student completing the survey was very good. ...
... This includes the collection of student assessment data, but can also include other types of data -for example, from surveys and/or classroom observations (Kerr et al. 2006) -all aimed at gaining information about student learning in relation to the learning goals (Gulikers and Baartman 2017). To be able to identify the learning needs of students, teachers need to be able to use a range of different assessments, such as paper-and-pencil tests, homework assignments, and so forth (Feldman and Capobianco 2008;Gottheiner and Siegel 2012;Kay and Knaack 2009;Yin, Tomita, and Shavelson 2013). Furthermore, alongside the collection of these more formal types of data, as often described in data-based or data-informed decision-making approaches (e.g. ...
Article
Full-text available
Background: To enable all students to reach their full potential, teachers have to adapt their instruction to students’ varying needs. In order to do this, teachers need to engage in activities associated with formative assessment, as well as those associated with differentiation. However, both of these types of activities are, in themselves, difficult for teachers to carry out. Furthermore, as both fields tend to use their own terminology, frameworks, and cycles of teacher activities, it can be even more difficult for teachers to navigate both. Although the notion of the strong relationship between formative assessment and differentiation is not new, we argue that a better understanding of the close relationship between the two is needed in the context of teacher education. Purpose: Our aim was to develop a conceptual framework which offers teachers and teacher educators a coherent set of teacher activities in which both formative assessment and differentiation are represented. Sources of evidence: International literature in the fields of formative assessment and differentiation was reviewed. Through this process, we sought to identify, describe and compare teacher activities regarded as crucial for formative assessment and for differentiation. The review was based on extant review studies and frameworks used in both fields and handbooks on both topics. Main argument: Our analysis demonstrated that both approaches have much in common, but differ substantially in terms of the emphasis placed on different activities and the depth of elaboration. As such, we argue that the approaches complement each other well and that it is feasible to present teachers with one coherent set of teacher activities in which both approaches are unified. We propose a conceptual framework for Assessment-Informed Differentiation (AID), which involves a continual cycle of in-depth activities related to preparing and providing differentiated instruction based on assessment data. Conclusion: Formative assessment and differentiation approaches need to be treated as an integrated set of activities in order to realise the full potential of all students. Further research should focus on the usability and effectiveness of the proposed cycle. The conceptual framework we propose could ultimately be used in many different teacher education settings internationally, forming a starting point for much-needed teacher professional development in this area.
... CRS improved student learning and motivation at the secondary level (Satori, 2008). CRS enhanced the motivation of students in grades 10 to 12 (Kay & Knaack, 2009). CRS improved the performance and motivation of grade 6 science students (Bloemers, 2004). ...
Article
Technology Enhanced Formative Assessment (TEFA) is an advanced and evidence-based pedagogical approach for science and mathematics teaching using a classroom response system. Question-driven instruction, dialogical discourse, formative assessment, and meta-level communication are the four core principles on which TEFA is built. These are implemented through a question cycle in the classroom. An eight-week intervention study was carried out in two urban high schools, one from each stratum (i.e., boys and girls), using a non-equivalent comparison group quasi-experimental design on a sample of 183 participants from the 8th-grade mathematics class. The study participants included 42 girls and 47 boys in the experimental group, and 54 girls and 40 boys in the control group. A mathematics motivation survey based on the IMI (i.e., intrinsic motivation inventory) was used to measure students' mathematics learning motivation levels, and a two-way ANOVA was conducted to assess them. Analysis of the results showed that the motivation level of the experimental group was significantly higher than that of the control group.
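For readers unfamiliar with the analysis, a minimal sketch of a group-by-gender two-way ANOVA of this kind is shown below; the file name and column names (group, gender, motivation) are illustrative assumptions, not details from the study.

```python
# Hypothetical two-way ANOVA on IMI motivation scores, mirroring the
# group (experimental/control) x gender design described above.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Assumed layout: one row per student with an IMI composite score.
df = pd.read_csv("imi_scores.csv")  # columns: group, gender, motivation

# Model main effects of group and gender plus their interaction.
model = ols("motivation ~ C(group) * C(gender)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # F and p values per effect
```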
Chapter
This chapter focuses on the purpose of assessment and presents the main characteristics of formative and summative evaluation, emphasizing that the two are complementary and that both are necessary for proper monitoring of student learning. The main tensions between summative and formative assessment are discussed, and the chapter concludes with suggested good practices, resources, and instruments that teachers can use to integrate formative and summative assessment into their classroom practice.
Article
Full-text available
The objective of this chapter is to explain the huge, burgeoning sense of excitement surrounding response systems and, more generally, networked classrooms today, and why, for an idea apparently more than 40 years old, it has taken this long to happen. Beginning with a brief history of early response systems, it takes up the story from the author's own experience, leading through hardware barriers, misconceptions about pedagogy, and classroom successes, to summarize the variety of uses and how they lead to improved teaching and learning. It then discusses why this is such a potentially important area of study for improving education, and finally goes on to describe the emerging characteristics of, and rationale for, more powerful types of modern systems.
Article
Full-text available
Electronic student response technologies (SRT) are capable of assessing teaching and learning methods in real time, and they offer an exceptional means of introducing active learning protocols in classes with large enrollments. These wireless systems allow students to key in responses with remote control units to questions posed by an instructor in the classroom. Student responses then are displayed in real time, allowing both students and instructors to gauge student comprehension instantaneously. From Spring 2002 to Spring 2003, we utilized SRT in 4 sections of a high-enrollment introductory Earth Science course (Geosc 020: Planet Earth) at Penn State University. We conducted a multi-faceted assessment of the use of SRT in our course that included quantitative and qualitative perception data from students enrolled in the course and faculty/administrator visitors to our classroom. Our preliminary assessment of the pedagogical merits of SRT in our course suggests that this technology is an effective tool for introductory geoscience education.
Article
We report data from ten years of teaching with Peer Instruction (PI) in the calculus- and algebra-based introductory physics courses for nonmajors; our results indicate increased student mastery of both conceptual reasoning and quantitative problem solving upon implementing PI. We also discuss ways we have improved our implementation of PI since introducing it in 1991. Most notably, we have replaced in-class reading quizzes with pre-class written responses to the reading, introduced a research-based mechanics textbook for portions of the course, and incorporated cooperative learning into the discussion sections as well as the lectures. These improvements are intended to help students learn more from pre-class reading and to increase student engagement in the discussion sections, and are accompanied by further increases in student understanding.
Chapter
This chapter provides a case study of how a lecturer in the School of Law at the Queensland University of Technology (QUT) used an audience response system (ARS) in a lecture for a second-year, core, undergraduate law subject to facilitate problem-based learning. It identifies the positive student response to the use of an ARS, and explores the main benefits, for example, active engagement of students in the learning process, facilitation of formative assessment where the students develop initiative and peer relationships, and the provision of timely and worthwhile feedback. The chapter also identifies some of the problems that the author faced in the trial, and provides some suggested solutions and recommendations. The author hopes to encourage other lecturers to take advantage of an ARS to enhance student learning, and identifies some future ARS research opportunities.
Chapter
This chapter presents an overview of audience response systems as tools for promoting learning in an interactive classroom. The comparison with a popular TV programme is used to introduce the role of questioning in face-to-face teaching, before examining the intrinsic pedagogical benefits of questions. Technical aspects of different systems are summarised, and pedagogical issues discussed, based on the author's personal teaching experience. Appropriate ways of using audience response systems in the classroom, and guidelines for question delivery are presented. New technology will continue to open up opportunities, but the challenge for the future remains to use the technology effectively and in pedagogically sound ways.
Chapter
This chapter examines two types of response technologies (selected and constructed) available to support discussion and participation in the classroom, and describes our experiences using and observing them in a variety of mathematics, science, and computer science classes at various educational levels. Selected response systems (a.k.a., clickers) display multiple-choice questions, and then collect and analyze student responses, and present distribution summaries to the class. Constructed response systems allow students to use handheld computers to generate free-form graphical responses to teacher prompts using various software applications. Once completed, students submit their responses to the instructor's computer wirelessly. The instructor may then select and anonymously project these authentic student work samples or representations to promote classroom discussion. We review the purpose, design, and features of these two types of response systems, highlight some of the issues underlying their application, discuss our experiences using them in the classroom, and make recommendations.
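As an illustration of the selected-response workflow the chapter describes (collect one answer per remote, then present the class-wide distribution), a minimal sketch follows; the device identifiers and responses are invented for the example.

```python
# Illustrative tally-and-display loop for a selected response system:
# each remote submits one multiple-choice answer, and the class-wide
# distribution is rendered as a simple text histogram.
from collections import Counter

responses = {"remote_01": "B", "remote_02": "C", "remote_03": "B",
             "remote_04": "A", "remote_05": "B"}  # device id -> answer

counts = Counter(responses.values())
total = len(responses)
for option in "ABCD":
    n = counts.get(option, 0)
    print(f"{option}: {'#' * n}  {n / total:.0%}")
```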
Article
Recent increases in class size in higher education have focused more attention on the nature of the face-to-face learning experience. This chapter examines how a keypad technology facilitates active learning in the lecture hall using a number of pedagogically proven approaches. We survey 219 first-year business studies students tackling introductory economics, and find that the technology enhances learning in lectures because, among other things, it improves concentration, provides instantaneous and more effective student feedback, and allows students to make comparisons on how well they fare relative to their peers. Interestingly, we find less statistical support for the benefits of using the technology to allow students to respond anonymously, and explore some reasons for this result. Finally, we demonstrate our use of the tool to engage in teaching the Prisoner's Dilemma game. This forms part of the emerging knowledge on how to teach classroom experiments using keypad technology.