A CASE STUDY EXPLORING THE USE OF STUDENT RESPONSE
SYSTEMS IN STEM-BASED SECONDARY SCHOOL CLASSROOMS
R. Kay
University of Ontario Institute of Technology (CANADA)
Abstract
Student response systems (SRSs), also known as audience response systems, classroom response
systems, and clickers, allow students to answer electronically displayed questions. SRSs can be hardware-based (e.g., a physical device) or software-based (e.g., mobile apps). Typically, all responses are instantly presented in chart form, then reviewed and discussed by the instructor and the class. Considerable research has been conducted on the use of SRSs in higher education [1-8], but far less in secondary school environments. The purpose of the current study was to examine the effectiveness of
SRSs in STEM-based secondary school classrooms. Twenty-two STEM-focused secondary school
teachers with 0.5 to 32 years of teaching experience participated in this study. After four months of
integrating SRSs in their classrooms, all teachers completed a survey, responded to open-ended
questions, and participated in hour-long focus groups. Key benefits of using SRSs included providing
formative feedback on learning and teaching, increasing student involvement, improving the quality of
teaching, and offering an alternative method for conducting summative assessments. Key challenges
observed were technical problems (software and hardware), increased preparation time for lessons,
accounting for individual differences, resistance to summative assessment, and classroom
management.
Keywords: student response systems, clickers, audience response systems, secondary school, STEM.
1 INTRODUCTION
Student response systems (SRSs) allow students to answer digitally presented questions (often multiple
choice) and receive immediate feedback. Originally, SRSs consisted of costly, physical clickers
connected to a computer using infrared or radio frequency technology [1]. The
technology has advanced, though, and students can now respond to questions using laptops or mobile
phones and free, web-based software (e.g., Kahoot©, Poll Everywhere©, Socrative©).
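Conceptually, a software-based SRS implements a short loop: collect one answer per student, tally the responses, and instantly display a summary chart for class discussion. The following minimal Python sketch illustrates that loop; the function names and simulated responses are hypothetical and do not correspond to any particular SRS product or API.

```python
from collections import Counter

# Hypothetical sketch of the core SRS loop: collect one answer per
# student, tally the responses, and render an instant summary chart.
# No real SRS product's API is used here.

def tally_responses(responses: dict[str, str]) -> Counter:
    """Count how many students chose each option (A-D)."""
    return Counter(responses.values())

def display_chart(counts: Counter, options: str = "ABCD") -> None:
    """Print a simple horizontal bar chart, as an SRS would project."""
    total = sum(counts.values()) or 1
    for option in options:
        n = counts.get(option, 0)
        bar = "#" * n
        print(f"{option} | {bar:<25} {n:>2} ({100 * n / total:.0f}%)")

# Illustrative class: 24 of 26 students answer correctly (option B),
# so the teacher can move on; a 50/50 split would signal more discussion.
responses = {f"student{i}": ("B" if i < 24 else "C") for i in range(26)}
display_chart(tally_responses(responses))
```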
At least eight literature reviews have been conducted on the use of SRSs in higher education classrooms
[1-8]. Reported benefits of using SRSs include providing effective formative feedback to students to help
them learn [3,5,6], collecting formative feedback to help instructors adjust their teaching [1,3], increasing
student involvement (e.g., engagement and participation) [1,3-8], and improving the quality of teaching [1,2,4,5]. Reported challenges of using SRSs involve dealing with technological problems [5], the time required to set up SRSs [5] and to create effective questions [1,5,9], and resistance to summative assessment [3,5].
Only a handful of studies have looked at the use of SRSs in K-12 environments [9-14]. Similar benefits (formative feedback, student involvement, quality of teaching) and challenges (e.g., technology set-up and question-creation time, summative assessment) observed in higher education were noted in K-12 classrooms [9-14]. However, the number of studies on SRS use in K-12 classrooms is too small to draw firm conclusions. Furthermore, most studies rely on quantitative data, making it difficult to understand the precise nature of the benefits and challenges reported. Finally, only one paper could be found examining the use of SRSs from the perspective of the teacher [13]. The purpose of this study, then, was to use a mixed-methods approach to provide a detailed analysis of the potential benefits and challenges of using SRSs from the perspective of secondary school teachers.
2 METHODOLOGY
2.1 Participants
Twenty-two teachers (7 females, 15 males), from 14 different high schools within a suburban region of over 200,000, volunteered to participate in the study. Grades taught were 9 (n=6), 10 (n=9), 11 (n=4), and 12 (n=3). STEM-based subject areas taught included business (n=3), mathematics (n=5), science
(n=7) and technology (n=7). Teaching experience ranged from 0.5 to 32 years, with a mean of 15.5
(SD=7.9) years. On a four-point scale ranging from “Not at all Comfortable” to “Very Comfortable”, mean
comfort level with technology was 3.8 (SD=0.6).
2.2 Data Collection and Analysis
2.2.1 Survey Data
Participants completed an online survey after they used SRSs in their classrooms for a period of four
months. The first section of the survey collected demographic data (e.g., gender, subject area, grade,
experience level), preparation time, and frequency of SRS lessons. The second section consisted of 12 four-point Likert-scale items focussing on the intended purposes for using SRSs. The final section included five seven-point Likert-scale items examining perceptions about the overall impact of SRSs (see Table 1).
Finally, one open-ended question asked participants to share the perceived benefits and/or challenges
of using SRSs in their classroom.
A frequency analysis including means and standard deviations was used to analyze the quantitative
survey data. An emergent content analysis was used to analyze open-ended responses about benefits
(n=44 comments) and challenges (n=29). Four themes emerged for benefits experienced: formative
assessment for students, formative assessment for teachers, student involvement, and summative
assessment. Five main themes were identified for challenges observed: technology, preparation,
individual differences, summative assessment and classroom management.
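To make the frequency side of this analysis concrete, the short Python sketch below tallies hand-coded comments by emergent theme and reports them in the n (%) form used throughout the results. The coded comments shown are invented for illustration and are not the study's data.

```python
from collections import Counter

# Sketch of the frequency step of an emergent content analysis: once
# each open-ended comment has been hand-coded with a theme, counts
# per theme are straightforward to report. Data here are illustrative.

BENEFIT_THEMES = [
    "formative assessment for students",
    "formative assessment for teachers",
    "student involvement",
    "summative assessment",
]

coded_comments = [
    ("quick check of understanding", "formative assessment for students"),
    ("told me what to reteach", "formative assessment for teachers"),
    ("quiet students joined in", "student involvement"),
]

theme_counts = Counter(theme for _, theme in coded_comments)
total = len(coded_comments)
for theme in BENEFIT_THEMES:
    n = theme_counts.get(theme, 0)
    print(f"{theme}: n={n} ({100 * n / total:.0f}%)")
```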
2.2.2 Focus Groups
Three one-hour focus groups (4-6 participants each) were conducted. Participants discussed three main questions:
1. How did they use SRSs in the classroom?
2. What were the main benefits of using SRSs and why?
3. What were the main challenges of using SRSs and why?
All responses were audio recorded and transcribed, producing 166 comments (n=102 on benefits, n=64 on challenges). With respect to benefits, five themes were noted: formative assessment for students, formative assessment for teachers, student involvement, quality of teaching, and summative assessment. Regarding challenges, three themes were identified: technology, preparation, and classroom management.
2.3 Procedure
Fourteen secondary schools were contacted to enlist teachers to participate in a study involving the use
of SRSs in the classrooms. Twenty-two volunteers enrolled in a full-day workshop focussing on using
the SRS hardware and software, as well as implementing effective strategies for maximizing the learning
benefits of SRSs. At the end of the workshop, each teacher had access to their own SRS for a period of four months (one full academic term). After the four-month term was completed, each teacher
filled in an online survey and participated in a one-hour focus group.
3 RESULTS
3.1 Preparation Time and Frequency of Use
Preparation time for using SRSs in a lesson ranged from 10 to 75 minutes with a mean of 40.6 (SD=18.4)
minutes. SRSs were used almost daily (n=1 teacher), 2-3 times per week (n=1 teacher), once each week (n=10 teachers), once (n=8 teachers), or never (n=2 teachers).
3.2 Overall Impact
Eighty-five percent of teachers agreed or strongly agreed that students were more engaged, and 90% that students were more motivated, when using SRSs in the classroom. Between 65% and 70% of teachers agreed or strongly agreed that SRSs improved the quality of instruction, made lessons more effective, and helped in planning future lessons (Table 1).
Table 1. Overall Impact of SRSs in Secondary School Classrooms.

| Item | n | % Agree¹ | Mean (SD)² |
|------|---|----------|------------|
| Students were more engaged with SRSs | 20 | 85% | 6.2 (1.0) |
| Students were more motivated with SRSs | 20 | 90% | 6.1 (0.7) |
| SRSs helped improve quality of instruction | 19 | 68% | 6.0 (0.9) |
| SRSs made lesson more effective | 20 | 65% | 5.9 (0.9) |
| Feedback from SRSs helped in planning future lessons | 20 | 70% | 5.6 (1.2) |

¹ Agree + Strongly Agree
² Seven-point scale (Strongly Disagree to Strongly Agree)
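For clarity, the "% Agree" column in Table 1 collapses the top two points of the seven-point scale (Agree + Strongly Agree), per the table footnote. The Python sketch below shows how those columns could be derived from raw responses; the rating vector is invented for illustration and is not the study's raw data.

```python
from statistics import mean, stdev

# Sketch of deriving the Table 1 columns from raw 7-point Likert
# responses (1 = Strongly Disagree ... 7 = Strongly Agree). "% Agree"
# is a top-two-box figure (6 = Agree, 7 = Strongly Agree).

def summarize(responses: list[int]) -> str:
    n = len(responses)
    pct_agree = 100 * sum(r >= 6 for r in responses) / n
    return (f"n={n}, {pct_agree:.0f}% agree, "
            f"mean={mean(responses):.1f} (SD={stdev(responses):.1f})")

# Illustrative only: 20 hypothetical ratings, 17 of which are 6 or 7,
# yielding the same 85% agreement as the engagement row of Table 1.
ratings = [7, 7, 7, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 5, 5, 4]
print(summarize(ratings))
```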
3.3 Benefits of Using SRSs
3.3.1 Formative Feedback – Student Learning
A majority of teachers used SRSs to obtain formative feedback on student learning. Over 90% of these
teachers rated this type of use as effective or highly effective when used to check for understanding
during or after a lesson. Eighty percent of teachers rated SRSs as effective or highly effective when
used to check for individual understanding (e.g., when answers were linked to specific students) (Table
2).
Table 2. Formative Use of SRSs for Student Learning.

| Use | n | % Effective¹ | Mean (SD)² |
|-----|---|--------------|------------|
| Before a Lesson | 15 | 67% | 3.1 (0.9) |
| During a Lesson | 14 | 93% | 3.4 (0.6) |
| End of a Lesson | 17 | 94% | 3.5 (0.6) |
| Individual Understanding | 15 | 80% | 3.1 (0.7) |

¹ Effective + Highly Effective
² Four-point scale (Not at all Effective to Highly Effective)
Over 30% (n=53) of the open-ended survey and focus group comments referred to the effectiveness of
SRS feedback in supporting student learning. Sample comments were:
“I liked it because of the quick response from the kids; you know when 24 of 26 got it right – I knew they understood it – when only half did it, I knew we needed more discussion on that certain topic.”
“The immediacy of the feedback is good. It’s not 1 or 2 kids putting their hands up, they all have
to, and you can see the results right away, you can address issues that come up right away.”
“[With the SRS], you give the students 25 multiple-choice [questions], and they know what to
study instead of wasting time studying other stuff.”
“I used them as a formative tool and as a diagnostic tool about what my students knew and then
we could actually fine-tune the review the next day.”
3.3.2 Formative Feedback – Teacher Learning
About 10% (n=15) of the open-ended survey and focus group responses indicated that SRSs were
useful for providing feedback on how well concepts were taught. Representative comments were:
“We did it at the end of each unit as a diagnostic tool to help them know what they know and to
help me know what I didn’t teach them well enough in.”
“I use frequent quizzes at the start of class to review the previous day’s work, and have modified
my lesson as a result of the feedback.”
“When I've taken the time to incorporate it, the process has caused me to rethink and come up
with a stronger lesson. In essence, the new technology is forcing my creativity.”
“By using clickers as an assessment tool, I can focus lessons on areas of weakness.”
“[I liked the] feedback being right away was so powerful that I could look at it. Kind of like this
where I can look at it to see if the wording is bad, if I taught it bad if there’s consistently one kid
who’s missing the concept.”
“The one benefit brought up from the teacher aspect from improving your teaching, improving
how you word question to draw out the answer that you want.”
“You can see what the whole class knows, where they are, and you can also say what review you have to do as a teacher.”
3.3.3 Student Involvement
Almost 80% of teachers (n=15) believed that using SRSs to motivate or engage students was effective
or highly effective with a mean rating of 3.1 out of 4 (SD=0.8). Comments from the open-ended question
and focus groups (n=25, 15%) supported this finding. Sample responses were:
“The kids really liked using it because it was something different; especially with math. It wasn’t
the same routine every day.”
“I’ve found that kids are very interested in doing it.”
“Students are enthusiastic and engaged; they look forward to days when clickers are used.”
“Students are always very excited to use the clickers.”
Aside from motivation/engagement, open-ended (n=4, 9%) and focus group (n=8, 8%) responses
indicated that using SRSs increased student participation. Sample comments were:
“[SRSs] has given a voice to some of the quieter students.”
“The reason I used it so much was because the kids liked it, there was 100% participation (I could see it).”
“The kids that never participated typically wanted to discuss the reasons as to why the answers
were given.”
“When I had the clickers out, [students with special needs] were focused, they were right there, I
wasn’t losing them, they were engaged and a part of the process.”
3.3.4 Quality of Teaching
SRSs were also used to augment the quality of teaching in five areas. All of the teachers who used a game-based learning approach with SRSs rated it as effective. Three-quarters of the teachers who used SRSs to increase discussion, introduce a new topic, or check homework rated them as effective. Only four teachers used SRSs as a cooperative tool, where students shared a clicker and discussed the selection of a response before answering. This approach was viewed as moderately effective (Table 3).
Table 3. Formative Use of SRSs for Improving the Quality of Teaching.

| Use | n | % Effective¹ | Mean (SD)² |
|-----|---|--------------|------------|
| Game-Based Learning | 10 | 100% | 3.7 (0.5) |
| Increased Discussion | 18 | 78% | 3.0 (0.7) |
| Introduce a Topic | 11 | 73% | 3.1 (0.9) |
| Checking Homework | 11 | 73% | 2.9 (1.1) |
| Cooperative Learning | 4 | 50% | 2.8 (1.0) |

¹ Effective + Highly Effective
² Four-point scale (Not at all Effective to Highly Effective)
Teachers offered only a few open-ended or focus group comments about SRSs improving the quality of
their teaching. With respect to gaming, teachers noted:
“The game is a fun and [an] engaging review lesson. The majority of the kids love it.”
“And I found that jeopardy [with SRSs] better than the regular jeopardy – [with SRSs] everyone
answers.”
Regarding homework, sample comments were:
“[with regular homework checks] they would never put up their hands you’d have to be pulling the
answers out of them. [With SRSs] they’re engaged, and they’re clicking, and the whole bit and
they see the answer come up and it’s surprised me sometimes because the kids that are cheering
[and it] actually gave me feedback that they’re doing their homework.”
“I find it quite good and I’ve had students be excited that, I hear them, that they’re getting a reward
for doing their homework for one.”
Finally, for increasing discussion, sample comments were:
“You get the bar graph, and you say whoa, half of you got this wrong, you stop, and you have a
discussion about why you chose it.”
“[SRSs] created more discussion in the classroom.”
3.3.5 Summative Assessment
Fifty-four percent of teachers (n=12) rated the use of SRSs for summative assessment (formal testing) as effective or highly effective, with a mean rating of 2.4 out of 4 (SD=1.0). This was the lowest rated strategy for using SRSs; however, participants from the focus groups reflected extensively on this method of using SRSs (n=19 comments, 19%). A number of teachers used SRSs to prepare students for multiple-choice questions that they might experience in higher education. Sample comments were:
“My purpose in doing this was to give the kids practice with doing multiple choice questions.”
“I was using it [as] a tool to effectively teach them how to approach multiple choice questions. I tell you no one does this and I tell you that’s all they’re going to see [this in university].”
“A lot of people don’t spend much time how to do multiple choice questions – that’s what they get
in university because that’s what they’re faced with in the future.”
Other responses about using SRSs for summative assessment included:
“I used clickers in formal testing situations for grade 11 and 12 biology and grade 9 mathematics.”
“What I appreciated there was the immediate, instant feedback.”
3.4 Challenges of Using SRSs
3.4.1 Technology
Thirty-five percent (n=34) of the open-ended and focus group responses noted that the technology (software and hardware) involved in using SRSs was challenging at times. Sample comments related to software issues were:
to software issues were:
“The software is just a pain, but I’m used to that in education because education and software.”
“You know I could use it right now without really thinking but I found it a little bit less than ideal,
and I thought maybe I’m the older generation, but I would think that a lot of people who aren’t
computer-savvy would find it frustrating initially.”
“Also, with the graph, every time I want to do something graphical the formats that were there just
didn’t apply, and that was frustrating like crazy.”
“I spent over an hour preparing Exam View questions at home, and when I went to use them on
the computer at school, it didn't work. I was extremely frustrated and wasted so much time in
class.”
Typical comments about hardware challenges were:
“We’ve had the radio frequency ones, and they’re no problems. The IR frequency is the problem
… There are a couple of kids that just couldn’t do it.”
“I did notice that a couple of kids would hold down the button too long, hit two buttons at the same time because of the way they’re holding the remote.”
“I’m teaching the BTA course or information technology-type course in a computer lab, it got pretty
frustrating because they had to move to go over the monitors, or between them, stand up,
because they had to raise their hand really high and push the button for all to see.”
3.4.2 Preparation
One-third (n=32) of the open-ended and focus group comments referred to preparation as a significant challenge when using SRSs. The two main issues were the time required to create effective questions and the time needed to set up the SRS technology. Sample comments about creating questions were:
“It takes a decent amount of time to find questions, to think of questions, once you’ve found
questions you need to edit them to fit your exact topic of what you taught that year.”
“It’s sort of time-consuming to make the questions.”
“It's important to have well thought-out and precisely worded questions to prevent student confusion.”
“It took more time to prepare questions. It took a lot of time.”
Sample comments about setting up SRSs were:
“Technology takes forever to set up and set down – I don’t care how long people think it takes.
You’re not doing it while you’re teaching and if that’s your plan, you should have something else
for the kids to do.”
“I find it a challenge to set up the equipment – the projector, the laptop, the clickers are in the bag.
It’s four corridors away, so I’ve got to go walk down to the library, bring the equipment in the room,
put it on the cart.”
3.4.3 Individual Differences
Over ten percent (n=11) of open-ended and focus group comments noted individual differences with respect to SRS use. Specific issues involved some students feeling pressure to respond quickly, getting lost in the intensity of using the SRSs, and subject areas not matching the question format typically used in SRSs (e.g., multiple choice). Sample comments were:
“Some students told me … that the intensity was a bit too much for them, that they felt they
couldn’t think that quickly on the spot.”
“I noticed a couple of things. One is the students who work very quickly and buzz in their answers, and there’s the kids who are right down to the second. That creates some challenges.”
“It's difficult to use the clickers when developing a concept or topic in computer engineering.”
3.4.4 Summative Assessment
Ten percent (n=10) of open-ended and focus group responses claimed that using SRSs to conduct
summative assessment was a problem. Sample responses included:
“I just felt from a time constraint, not being able to go back and not necessarily and look at their
answers for the kids that struggle anyways, I thought that would be too much of a challenge.”
“I had at least one student who loved the clickers but felt that it shouldn’t be used for a test but
was more comfortable writing the traditional test.”
“I tried using them for assessment, and the students balked. They blamed the clickers for their
poor performance on the multiple-choice questions.”
“There’s certainly dissatisfaction [from the students], this is not working, and I’m getting kind of
fed up with this. And before, all I needed to do was write down ABC or D and you can’t double
check your answers – that’s another problem.”
“The feedback we got from the students was “yeah it’s neat, but it’s way too easy to cheat” – since
we have the IR ones and they have to get up and cheat from what other people are doing. So
after that, we never used it for that again.”
3.4.5 Classroom Management
Just under ten percent (n=8) of open-ended and focus group responses identified classroom
management issues when using SRSs. Typical comments included:
“[SRSs] sometimes cause a bit of a disturbance with kids trying to answer wrong on purpose to
get a laugh.”
“It is weird; some classes get more rambunctious while using the clickers and it takes me longer
to get through my lessons because I have to settle them down several times.”
“I found too that students would get a little silly and you have them buzzing in a number of times
or buzzing in a letter F when there is no F.”
4 CONCLUSIONS
This study provided a detailed analysis of the benefits and challenges of using SRSs in STEM-based
classrooms, from the perspective of 22 secondary school teachers who used SRSs for a four-month
academic term. Most teachers used SRSs weekly and took, on average, an extra 40 minutes to prepare for SRS lessons.
4.1 Benefits
Overall, five key benefits of using SRSs in STEM-based secondary school classrooms were identified:
formative assessment for student learning, formative assessment for teachers, increased student
involvement, improved quality of teaching, and summative assessment. The first four benefits mirrored
previous research on the use of SRSs in higher education [1-8] and K-12 classrooms [9-14]. However,
qualitative data provided additional useful information about the nature of these benefits.
With respect to formative assessment for student learning, it appeared that this approach was most
effective when used during or at the end of a lesson (as opposed to the beginning of a lesson or to
garner individual formative feedback). This type of feedback, described as efficient and quick, saved
time and helped students focus on what they needed to learn.
While not as prominent as formative feedback for student learning, formative feedback to guide teaching
had a specific impact on teachers adjusting or modifying instruction and revisiting previously taught but
poorly understood concepts. This type of feedback also inspired creativity, improved the quality of
questions asked, and increased the efficiency of instruction by targeting areas of weakness.
Regarding student involvement, the unique finding in this study was the intensity, enthusiasm, focus, and full participation of all students when SRSs were used. The interest and excitement of using SRSs were prominent. Furthermore, and perhaps more importantly, SRSs gave a voice to students who rarely, if ever, participated in the traditional classroom.
Previous studies noted that SRSs improved the quality of teaching [1,2,4,5], particularly with respect to improving the nature of peer and class discussions. This study added the use of game-based review for upcoming tests (universally popular with students), introducing new topics, checking homework, and, to a lesser extent, promoting cooperative learning (e.g., students sharing a clicker and working together to answer questions).
The benefits of summative assessment, not emphasized in previous studies, were also noted by
teachers in this study, albeit to a far lesser extent than the four previously articulated benefits. The main
purpose for using summative assessment was to prepare students for answering challenging multiple-
choice questions that they might have to face in university. This is a unique finding, perhaps specific to STEM-based subject areas, and would need to be replicated to establish reliability.
4.2 Challenges
Previous research has not looked in depth at the challenges experienced while using SRSs. Secondary school teachers in this study identified five problem areas: technology (software and hardware challenges), preparation time, individual differences in using SRSs, summative assessment, and classroom management. Previous research had noted some hardware issues [11,12]; in this study, however, software and hardware problems were the most frequently cited challenge when using SRSs. Time wasted learning the software and trouble connecting to the receiver were two relatively prominent areas of concern.
of concern. Given that free, easy-to-use, software-based SRSs are becoming more prevalent in the
classroom, software and hardware problems may dissipate in the future.
Preparation time, identified in some previous studies as a challenge, focussed on set-up time [5] and
creating effective questions [1,5,9]. While set-up time may be alleviated somewhat by using software-based SRSs, creating meaningful, thought-provoking questions is critical for the effective use of SRSs and will likely take substantial time to do from scratch. Teachers might consider pooling their resources to create a shared database of questions organized by subject area and course, as sketched below.
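As a minimal sketch of what such a pooled question bank might look like, assuming a simple in-memory structure (the schema, field names, and sample question are hypothetical, not part of any existing tool):

```python
from dataclasses import dataclass, field

# One possible shape for a shared SRS question bank, organized by
# subject area and course as suggested above. The schema and helper
# are hypothetical illustrations.

@dataclass
class Question:
    subject: str          # e.g., "mathematics"
    course: str           # e.g., "Grade 10 Academic Math"
    prompt: str
    options: list[str]
    answer: int           # index into options
    tags: list[str] = field(default_factory=list)

def find(bank: list[Question], subject: str, course: str) -> list[Question]:
    """Pull every pooled question matching a subject and course."""
    return [q for q in bank if q.subject == subject and q.course == course]

bank = [
    Question("mathematics", "Grade 10 Academic Math",
             "What is the slope of y = 3x + 2?",
             ["2", "3", "5", "-3"], answer=1, tags=["linear relations"]),
]
print(len(find(bank, "mathematics", "Grade 10 Academic Math")), "question(s) found")
```

In practice, a shared spreadsheet or database with the same fields would serve equally well; the key design choice is tagging questions by subject area and course so colleagues can retrieve and adapt them quickly.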
Individual differences in receptivity to using SRSs were observed by a few teachers in this study,
specifically with respect to the intensity of the classroom when SRSs were used and the perceived time
pressure for students to answer questions. Maintaining a calm atmosphere while using SRSs and
allowing sufficient time for students to reply is clearly necessary for some students.
While using SRSs for summative assessment might save time and help students prepare for multiple-
choice questions, some teachers in this study noted a strong negative reaction to using SRSs for formal
testing. The key issue appeared to revolve around the anxiety of being evaluated combined with
answering questions using new and occasionally unpredictable technology that did not allow students
to review previous responses. This negative feedback is consistent with two previous studies where
SRSs were used for summative assessment in secondary schools [11,12]. Given that the benefits of using SRSs for summative purposes are limited, teachers might reconsider using this tool for formal evaluation.
Classroom management issues were not identified as a problem in previous studies in higher education.
One might expect older students to behave appropriately when using SRSs at the college or university level. However, in secondary school classrooms, several teachers reported that some students did not have the maturity
to use SRSs and deliberately tried to sabotage the process. These discipline problems were not
prevalent, but need to be addressed with younger students to limit distraction for other students and
maximize learning.
4.3 Limitations and Future Research
A mixed-methods case study approach was used to investigate the benefits and challenges of using
SRSs in STEM-based, secondary school classrooms. While a more detailed understanding of SRS use
was presented based on rich comments from the focus groups, the generalizability of the results cannot
be established given the small sample size. Future research could (a) use the qualitative findings from this study to develop a more comprehensive survey for a much larger audience, (b) develop and test interventions that help maximize the benefits and minimize the challenges of using SRSs, (c) explore the effectiveness of different types of questions on student learning, and (d) conduct research that applies the results and principles revealed in previous studies on physical SRSs to free, software-based formats that use both multiple-choice and open-ended questions.
REFERENCES
[1] C. K. Boscardin & W. R. Penuel, “Exploring benefits of audience-response systems on learning: a review of the literature,” Academic Psychiatry, vol. 36, no. 7, pp. 401-407, 2012.
[2] Y. T. Chien, Y. H. Chang, & C. Y. Chang, “Do we click in the right way? A meta-analytic review of
clicker-integrated instruction,” Educational Research Review, vol. 17, pp. 1-18, 2016.
[3] J. H. Han, “Closing the missing links and opening the relationships among the factors: A literature
review on the use of clicker technology using the 3P model,” Journal of Educational Technology &
Society, vol. 17, no. 4, pp. 150-168, 2014.
[4] N. J. Hunsu, O. Adesope, & D. J. Brady, “A meta-analysis of the effects of audience response
systems (clicker-based technologies) on cognition and affect,” Computers & Education, vol. 94,
pp. 102-119, 2016.
[5] R. H. Kay & A. LeSage, “Examining the benefits and challenges of using audience response
systems: A review of the literature,” Computers & Education, vol. 53, no. 3, pp. 819-827, 2009.
[6] S.M. Keough, “Clickers in the Classroom: A Review and a Replication,” Journal of Management
Education, vol. 36, no. 6, pp. 822-847, 2012.
[7] R. E. Landrum, “Teacher-ready research review: Clickers,” Scholarship of Teaching and Learning
in Psychology, vol. 1, no. 3, pp. 250-254, 2015.
[8] C. Liu, S. Chen, C. Chi, K. P. Chien, Y. Liu, & T. L. Chou, “The effects of clickers with different teaching strategies,” Journal of Educational Computing Research, vol. 55, no. 5, pp. 603-628, 2017.
[9] R. Shieh & W. Chang, “Implementing the interactive response system in a high school physics
context: Intervention and reflections,” Australasian Journal of Educational Technology, vol. 29, no.
5, pp. 748-761, 2013.
[10] Y. T. Chien, Y. Lee, T. Y. Li, & C. Y. Chang, “Examining the effects of displaying clicker voting results on high school students' voting behaviors, discussion processes, and learning outcomes,” Eurasia Journal of Mathematics, Science & Technology Education, vol. 11, no. 5, pp. 1089-1104, 2015.
[11] R. Kay, A. LeSage, & L. Knaack, “Examining the use of audience response systems in secondary
school classrooms: A formative analysis,” Journal of Interactive Learning Research, vol. 21, no. 3,
pp. 343-365, 2010.
[12] R. Kay & L. Knaack, “Exploring the use of audience response systems in secondary school
science classrooms,” Journal of Science Education and Technology, vol. 18, no. 5, pp. 382-392,
2009.
[13] W. R. Penuel, C. K. Boscardin, & K. Masyn, “Teaching with student response systems in
elementary and secondary education settings: A survey study,” Educational Technology Research
and Development, vol. 55, no. 4, pp. 315-346, 2007.
[14] F. Vital, “Creating a positive learning environment with the use of clickers in a high school chemistry classroom,” Journal of Chemical Education, vol. 89, no. 4, pp. 470-473, 2011.