A CASE STUDY EXPLORING THE USE OF STUDENT RESPONSE
SYSTEMS IN STEM-BASED SECONDARY SCHOOL CLASSROOMS
R. Kay
University of Ontario Institute of Technology (CANADA)
Abstract
Student response systems (SRSs), also known as audience response systems, classroom response
systems and clickers, allow students to answer electronically displayed questions. SRSs can be
hardware-based (e.g., a physical device) or software-based (e.g., mobile apps). Typically, all responses are instantly presented in chart form and then reviewed and discussed by the instructor and the class.
Considerable research has been conducted on the use of SRSs in higher education [1-8] but not in
secondary school environments. The purpose of the current study was to examine the effectiveness of
SRSs in STEM-based secondary school classrooms. Twenty-two STEM-focused secondary school
teachers with 1 to 32 years of teaching experience participated in this study. After four months of
integrating SRSs in their classrooms, all teachers completed a survey, responded to open-ended
questions, and participated in hour-long focus groups. Key benefits of using SRSs included providing
formative feedback on learning and teaching, increasing student involvement, improving the quality of
teaching, and offering an alternative method for conducting summative assessments. Key challenges
observed were technical problems (software and hardware), increased preparation time for lessons,
accounting for individual differences, resistance to summative assessment, and classroom
management.
Keywords: student response systems, clickers, audience response systems, secondary school, STEM.
1 INTRODUCTION
Student response systems (SRSs) allow students to answer digitally presented questions (often multiple
choice) and receive immediate feedback. Originally, SRSs consisted of costly, physical clickers
connected to a computer using infrared or radio frequency technology [1]. The technology has since advanced, and students can now respond to questions using laptops or mobile
phones and free, web-based software (e.g., Kahoot©, PollEveryWhere©, Socrative©).
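To make the response-aggregation step concrete, the short Python sketch below is illustrative only: the function name, answer options, and response counts are hypothetical and do not correspond to any particular SRS product or to data from this study. It tallies a class's answers to one multiple-choice question and prints the kind of distribution an instructor would typically project for discussion.

# Illustrative sketch (hypothetical example, not any specific SRS product):
# tally student responses to a multiple-choice question and print the
# aggregate distribution that would be displayed back to the class.
from collections import Counter

def summarize_responses(responses, options=("A", "B", "C", "D")):
    """Return the count and percentage of each answer option."""
    counts = Counter(responses)
    total = len(responses) or 1
    return {opt: (counts.get(opt, 0), 100 * counts.get(opt, 0) / total)
            for opt in options}

# Example: 24 hypothetical students answer a single question.
class_responses = ["A"] * 14 + ["B"] * 6 + ["C"] * 3 + ["D"] * 1
for option, (n, pct) in summarize_responses(class_responses).items():
    print(f"{option}: {n:2d} students ({pct:4.1f}%)  {'#' * n}")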
At least eight literature reviews have been conducted on the use of SRSs in higher education classrooms
[1-8]. Reported benefits of using SRSs include providing effective formative feedback to students to help
them learn [3,5,6], collecting formative feedback to help instructors adjust their teaching [1,3], increasing
student involvement (e.g., engagement and participation) [1,3,4-8], and improving the quality of teaching
[1,2,4,5]. Reported challenges of using SRSs involve dealing with technological problems [5], time
required to set up SRSs [5] and to create effective questions [1,5,9], and resistance to summative
assessment [3,5].
Only a handful of studies have looked at the use of SRSs in K-12 environments [9-14]. Similar benefits
(formative feedback, student involvement, quality of teaching) and challenges (e.g., technology set-up
and question creating time, summative assessment) observed in higher education were noted in K-12
classrooms [9-14]. However, the number of studies on SRS use in K-12 classrooms is too small to
draw firm conclusions. Furthermore, most studies rely on quantitative data, making it difficult to
understand the precise nature of benefits and challenges reported. Finally, only one paper could be
found examining the use of SRSs from the perspective of the teacher [13]. The purpose of this study,
then, was to use a mixed method approach to provide a detailed analysis of potential benefits and
challenges using SRSs from the perspective of secondary school teachers.
2 METHODOLOGY
2.1 Participants
Twenty-two teachers (7 females, 22 males), from 14 different high schools within a suburban region of
over 200,000, volunteered to participate in the study. Grades taught were 9 (n=6), 10 (n=9), 11 (n=4),
and 12 (n=3). STEM-based subject areas taught included business (n=3), mathematics (n=5), science
(n=7) and technology (n=7). Teaching experience ranged from 0.5 to 32 years, with a mean of 15.5
(SD=7.9) years. On a four-point scale ranging from “Not at all Comfortable” to “Very Comfortable”, mean
comfort level with technology was 3.8 (SD= 0.6).
2.2 Data Collection and Analysis
2.2.1 Survey Data
Participants completed an online survey after they used SRSs in their classrooms for a period of four
months. The first section of the survey collected demographic data (e.g., gender, subject area, grade,
experience level), preparation time and frequency of SRS lessons. The second section consisted of 12,
four-point Likert-scale items focussing on the intended purposes for using SRSs. The final section
included five, five-point Likert-scale items examining perceptions about the overall impact of SRSs.
Finally, one open-ended question asked participants to share the perceived benefits and/or challenges
of using SRSs in their classroom.
A frequency analysis including means and standard deviations was used to analyze the quantitative
survey data. An emergent content analysis was used to analyze open-ended responses about benefits
(n=44 comments) and challenges (n=29). Four themes emerged for benefits experienced: formative
assessment for students, formative assessment for teachers, student involvement, and summative
assessment. Five main themes were identified for challenges observed: technology, preparation,
individual differences, summative assessment and classroom management.
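To illustrate the frequency analysis described above, the following minimal Python sketch shows how the mean, standard deviation, and percentage of agreeing respondents could be computed for a single Likert item. The function, ratings, and agreement cut-off are hypothetical; this is not the study's actual data or analysis script.

# Illustrative sketch of a frequency analysis for one Likert item
# (hypothetical ratings; not the study's raw data or actual code).
import statistics

def summarize_likert(ratings, agree_threshold=6):
    """Return n, mean, SD, and % of ratings at or above the agreement cut-off."""
    n = len(ratings)
    mean = statistics.mean(ratings)
    sd = statistics.stdev(ratings) if n > 1 else 0.0
    pct_agree = 100 * sum(r >= agree_threshold for r in ratings) / n
    return {"n": n, "mean": round(mean, 1), "sd": round(sd, 1),
            "pct_agree": round(pct_agree)}

# Example: 20 hypothetical responses on a seven-point scale
# (6 = Agree, 7 = Strongly Agree).
ratings = [7, 7, 6, 6, 6, 7, 5, 6, 7, 6, 6, 5, 7, 6, 4, 6, 7, 6, 6, 7]
print(summarize_likert(ratings))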
2.2.2 Focus Groups
Three one-hour focus groups (4-6 participants each) were conducted. Participants discussed three main questions:
1. How did they use SRSs in the classroom?
2. What were the main benefits of using SRSs and why?
3. What were the main challenges of using SRSs and why?
All responses were audio recorded and transcribed, producing 166 comments (n=102 on benefits, n=64 on challenges). With respect to benefits, five themes were noted: formative assessment for students,
formative assessment for teacher, student involvement, quality of teaching and summative assessment.
Regarding challenges, three themes were identified: technology, preparation, classroom management.
2.3 Procedure
Fourteen secondary schools were contacted to enlist teachers to participate in a study involving the use
of SRSs in the classrooms. Twenty-two volunteers enrolled in a full-day workshop focussing on using
the SRS hardware and software, as well as implementing effective strategies for maximizing the learning
benefits of SRSs. At the end of the workshop, each teacher had access to their own SRS system for a
period of four months (one full academic term). After the four-month term was completed, each teacher
filled in an online survey and participated in a one-hour focus group.
3 RESULTS
3.1 Preparation Time and Frequency of Use
Preparation time for using SRSs in a lesson ranged from 10 to 75 minutes with a mean of 40.6 (SD=18.4)
minutes. SRSs were used almost daily (n=1 teacher), 2-3 times per week (n=1 teacher), once each
week (n=10 teachers), once (n= 8 teachers), or never (n=2 teachers).
3.2 Overall Impact
Eighty-five to 90% of teachers agreed or strongly agreed that students were more engaged or motivated
when using SRSs in the classroom. Over two-thirds of the teachers agreed or strongly agreed that SRSs
improved the quality of instruction, made the lesson more effective and helped to plan for future lessons.
Table 1. Overall Impact of SRSs in Secondary School Classrooms.

Item | n | % Agree1 | Mean (SD)2
Students were more engaged with SRSs | 20 | 85% | 6.2 (1.0)
Students were more motivated with SRSs | 20 | 90% | 6.1 (0.7)
SRSs helped improve quality of instruction | 19 | 68% | 6.0 (0.9)
SRSs made lesson more effective | 20 | 65% | 5.9 (0.9)
Feedback from SRSs helped in planning future lessons | 20 | 70% | 5.6 (1.2)

1 Agree + Strongly Agree
2 Seven-point scale (Strongly Disagree to Strongly Agree)
3.3 Benefits of Using SRSs
3.3.1 Formative Feedback: Student Learning
A majority of teachers used SRSs to obtain formative feedback on student learning. Over 90% of these
teachers rated this type of use as effective or highly effective when used to check for understanding
during or after a lesson. Eighty percent of teachers rated SRSs as effective or highly effective when
used to check for individual understanding (e.g., when answers were linked to specific students) (Table
2).
Table 2. Formative Use of SRSs for Student Learning.

Use of SRSs | n | Mean (SD)2
Before a Lesson | 15 | 3.1 (0.9)
During a Lesson | 14 | 3.4 (0.6)
End of a Lesson | 17 | 3.5 (0.6)
Individual Understanding | 15 | 3.1 (0.7)

1 Effective + Highly Effective
2 Four-point scale (Not at all effective to Highly Effective)
Over 30% (n=53) of the open-ended survey and focus group comments referred to the effectiveness of
SRS feedback in supporting student learning. Sample comments were:
“I liked it because of the quick response from the kids; you know, when 24 or 26 got it right, I knew they understood it; when only half did it, I knew we needed more discussion on that certain topic.”
“The immediacy of the feedback is good. It’s not 1 or 2 kids putting their hands up, they all have to, and you can see the results right away; you can address issues that come up right away.”
“[With the SRS], you give the students 25 multiple-choice [questions], and they know what to
study instead of wasting time studying other stuff.”
“I used them as a formative tool and as a diagnostic tool about what my students knew and then
we could actually fine-tune the review the next day.”
3.3.2 Formative Feedback: Teacher Learning
About 10% (n=15) of the open-ended survey and focus group responses indicated that SRSs were
useful for providing feedback on how well concepts were taught. Representative comments were:
“We did it at the end of each unit as a diagnostic tool to help them know what they know and to help me know what I didn’t teach them well enough in.”
“I use frequent quizzes at the start of class to review the previous day’s work, and have modified my lesson as a result of the feedback.”
“When I've taken the time to incorporate it, the process has caused me to rethink and come up with a stronger lesson. In essence, the new technology is forcing my creativity.”
“By using clickers as an assessment tool, I can focus lessons on areas of weakness.”
“[I liked the] feedback being right away was so powerful that I could look at it. Kind of like this where I can look at it to see if the wording is bad, if I taught it bad, or if there’s consistently one kid who’s missing the concept.”
“The one benefit brought up from the teacher aspect [was] improving your teaching, improving how you word questions to draw out the answer that you want.”
“You can see what the whole class knows, where they are, and you can also say what review you have to do as a teacher.”
3.3.3 Student Involvement
Almost 80% of teachers (n=15) believed that using SRSs to motivate or engage students was effective
or highly effective with a mean rating of 3.1 out of 4 (SD=0.8). Comments from the open-ended question
and focus groups (n=25, 15%) supported this finding. Sample responses were:
“The kids really liked using it because it was something different, especially with math. It wasn’t the same routine every day.”
“I’ve found that kids are very interested in doing it.”
“Students are enthusiastic and engaged; they look forward to days when clickers are used.”
“Students are always very excited to use the clickers.”
Aside from motivation/engagement, open-ended (n=4, 9%) and focus group (n=8, 8%) responses
indicated that using SRSs increased student participation. Sample comments were:
“[SRSs] has given a voice to some of the quieter students.”
“The reason I used it so much was because the kids liked it; there was 100% participation (I could see it).”
“The kids that never participated typically wanted to discuss the reasons as to why the answers were given.”
“When I had the clickers out, [students with special needs] were focused, they were right there, I
wasn’t losing them, they were engaged and a part of the process.”
3.3.4 Quality of Teaching
SRSs were also used to augment the quality of teaching in five areas. All of the teachers who used a game-based learning approach with SRSs rated it as effective. Three-quarters of the teachers who used SRSs to increase discussion, introduce a new topic or check homework rated them as effective. Only 4 teachers used SRSs as a cooperative tool where students shared a clicker and discussed the selection of a response before responding. This approach was viewed as moderately effective (Table 3).
Table 3. Formative Use of SRSs for Improving the Quality of Teaching.

Use of SRSs | n | % Effective1 | Mean (SD)2
Game-Based Learning | 10 | 100% | 3.7 (0.5)
Increased Discussion | 18 | 78% | 3.0 (0.7)
Introduce a Topic | 11 | 73% | 3.1 (0.9)
Checking Homework | 11 | 73% | 2.9 (1.1)
Cooperative Learning | 4 | 50% | 2.8 (1.0)

1 Effective + Highly Effective
2 Four-point scale (Not at all effective to Highly Effective)
Teachers offered only a few open-ended or focus group comments about SRSs improving the quality of
their teaching. With respect to gaming, teachers noted:
“The game is a fun and [an] engaging review lesson. The majority of the kids love it.”
“And I found that jeopardy [with SRSs was] better than the regular jeopardy; [with SRSs] everyone answers.”
Regarding homework, sample comments were:
“[with regular homework checks] they would never put up their hands you’d have to be pulling the
answers out of them. [With SRSs] they’re engaged, and they’re clicking, and the whole bit and
they see the answer come up and it’s surprised me sometimes because the kids that are cheering
[and it] actually gave me feedback that they’re doing their homework.”
“I find it quite good, and I’ve had students be excited that, I hear them, that they’re getting a reward for doing their homework for one.”
Finally, for increasing discussion, sample comments were:
“You get the bar graph, and you say, whoa, half of you got this wrong; you stop, and you have a discussion about why you chose it.”
“[SRSs] created more discussion in the classroom.”
3.3.5 Summative Assessment
Fifty-four percent of teachers (n=12) rated the use of SRSs for summative assessment (formal testing) as effective or highly effective, with a mean rating of 2.4 out of 4 (SD=1.0). This was the lowest rated strategy for using SRSs; however, participants from the focus groups reflected extensively on this method of using SRSs (n=19 comments, 19%). A number of teachers used SRSs to prepare students
for multiple choice questions that they might experience in higher education. Sample comments were:
“My purpose in doing this was to give the kids practice with doing multiple choice questions.”
“I was using it as a tool to effectively teach them how to approach multiple choice questions. I tell you no one does this, and I tell you that’s all they’re going to see [in university].”
“A lot of people don’t spend much time [on] how to do multiple choice questions; that’s what they get in university because that’s what they’re faced with in the future.”
Other responses about using SRSs for summative assessment included:
“I used clickers in formal testing situations for grade 11 and 12 biology and grade 9 mathematics. What I appreciated there was the immediate, instant feedback.”
3.4 Challenges of Using SRSs
3.4.1 Technology
Thirty-five percent (n=34) of the open-ended and focus group responses noted that the technology (software and hardware) involved in using SRSs was challenging at times. Sample comments related to software issues were:
“The software is just a pain, but I’m used to that in education because education and software.”
“You know I could use it right now without really thinking, but I found it a little bit less than ideal, and I thought maybe I’m the older generation, but I would think that a lot of people who aren’t computer-savvy would find it frustrating initially.”
“Also, with the graph, every time I want to do something graphical, the formats that were there just didn’t apply, and that was frustrating like crazy.”
“I spent over an hour preparing Exam View questions at home, and when I went to use them on the computer at school, it didn't work. I was extremely frustrated and wasted so much time in class.”
Typical comments about hardware challenges were:
“We’ve had the radio frequency ones, and they’re no problems. The IR frequency is the problem. There are a couple of kids that just couldn’t do it.”
“I did notice that a couple of kids would hold down the button too long, [or] hit two buttons at the same time because of the way they’re holding the remote.”
“I’m teaching the BTA course, or information technology-type course, in a computer lab; it got pretty frustrating because they had to move to go over the monitors, or between them, stand up, because they had to raise their hand really high and push the button for all to see.”
3.4.2 Preparation
One-third of the open-ended and focus group (n=32) comments referred to preparation as being a
significant challenge when using SRSs. The two main issues were the time required to create effective
questions and setting up the SRS technology. Sample comments about creating questions were:
“It takes a decent amount of time to find questions, to think of questions; once you’ve found questions, you need to edit them to fit your exact topic of what you taught that year.”
“It’s sort of time-consuming to make the questions.”
“It's important to have well thought-out and precisely worded questions to prevent student confusion.”
“It took more time to prepare questions. It took a lot of time.”
Sample comments about setting up SRSs were:
“Technology takes forever to set up and set down; I don’t care how long people think it takes. You’re not doing it while you’re teaching, and if that’s your plan, you should have something else for the kids to do.”
“I find it a challenge to set up the equipment: the projector, the laptop, the clickers are in the bag. It’s four corridors away, so I’ve got to go walk down to the library, bring the equipment in the room, put it on the cart.”
3.4.3 Individual Differences
Over ten percent (n=11) of open-ended and focus group comments noted that there were individual
differences with respect to SRS use. Specific differences noted involved some students feeling pressure
to respond quickly, getting lost in the intensity of using the SRSs, and subject area not matching the
format of questions typically used in SRSs (e.g., multiple-choice). Sample comments were:
“Some students told me that the intensity was a bit too much for them, that they felt they
couldn’t think that quickly on the spot.”
“I noticed a couple of things. One is the students who work very quickly and buzz in their answers, and there’s the kids who are right down to the second. This creates some challenges.”
“It's difficult to use the clickers when developing a concept or topic in computer engineering.”
3.4.4 Summative Assessment
Ten percent (n=10) of open-ended and focus group responses claimed that using SRSs to conduct
summative assessment was a problem. Sample responses included:
“I just felt, from a time constraint, not being able to go back and not necessarily look at their answers for the kids that struggle anyways, I thought that would be too much of a challenge.”
“I had at least one student who loved the clickers but felt that it shouldn’t be used for a test and was more comfortable writing the traditional test.”
“I tried using them for assessment, and the students balked. They blamed the clickers for their poor performance on the multiple-choice questions.”
“There’s certainly dissatisfaction [from the students]: this is not working, and I’m getting kind of fed up with this. And before, all I needed to do was write down A, B, C or D, and you can’t double check your answers; that’s another problem.”
“The feedback we got from the students was “yeah it’s neat, but it’s way too easy to cheat” since
we have the IR ones and they have to get up and cheat from what other people are doing. So
after that, we never used it for that again.”
3.4.5 Classroom Management
Just under ten percent (n=8) of open-ended and focus group responses identified classroom
management issues when using SRSs. Typical comments included:
“[SRSs] sometimes cause a bit of a disturbance with kids trying to answer wrong on purpose to
get a laugh.”
“It is weird; some classes get more rambunctious while using the clickers, and it takes me longer to get through my lessons because I have to settle them down several times.”
“I found too that students would get a little silly, and you have them buzzing in a number of times or buzzing in a letter F when there is no F.”
4 CONCLUSIONS
This study provided a detailed analysis of the benefits and challenges of using SRSs in STEM-based
classrooms, from the perspective of 22 secondary school teachers who used SRSs for a four-month
academic term. Most teachers used SRSs at least once per week, and preparing an SRS lesson required roughly 40 minutes of additional time.
4.1 Benefits
Overall, five key benefits of using SRSs in STEM-based secondary school classrooms were identified:
formative assessment for student learning, formative assessment for teachers, increased student
involvement, improved quality of teaching, and summative assessment. The first four benefits mirrored
previous research on the use of SRSs in higher education [1-8] and K-12 classrooms [9-14]. However,
qualitative data provided additional useful information about the nature of these benefits.
With respect to formative assessment for student learning, it appeared that this approach was most
effective when used during or at the end of a lesson (as opposed to the beginning of a lesson or to
garner individual formative feedback). This type of feedback, described as efficient and quick, saved
time and helped students focus on what they needed to learn.
While not as prominent as formative feedback for student learning, formative feedback to guide teaching
had a specific impact on teachers adjusting or modifying instruction and revisiting previously taught but
poorly understood concepts. This type of feedback also inspired creativity, improved the quality of
questions asked, and increased the efficiency of instruction by targeting areas of weakness.
Regarding student involvement, the unique finding in this study is the intensity, enthusiasm, focus and
full participation of all students when SRSs were used. The interest and excitement of using SRSs were
prominent. Furthermore, and perhaps more importantly, SRSs gave voice to students who rarely, if
ever, participated in the traditional classroom.
Previous studies noted that SRSs improved the quality of teaching [1,2,4,5], particularly with respect to
improving the nature of peer and class discussions. This study added the use of game-based review for upcoming tests (which was universally popular with students), introducing new topics, checking homework, and, to a lesser extent, promoting cooperative learning (e.g., when students shared a clicker device and worked together to answer questions).
The benefits of summative assessment, not emphasized in previous studies, were also noted by
teachers in this study, albeit to a far lesser extent than the four previously articulated benefits. The main
purpose for using summative assessment was to prepare students for answering challenging multiple-
choice questions that they might have to face in university. This is a unique finding, perhaps specific
to STEM-based subject areas, and would need to be replicated to establish reliability.
4.2 Challenges
Previous research has not looked in depth at the challenges experienced while using SRSs. Secondary
school teachers in this study identified five problem areas: technology (software and hardware
challenges), preparation time, individual differences in using SRSs, summative assessment and
classroom management. Previous research had noted some hardware issues [11,12], however, in this
study software and hardware challenges were the most challenging issue when using SRSs. Time
wasted learning the software and trouble connecting to the receiver were two relatively prominent areas
of concern. Given that free, easy-to-use, software-based SRSs are becoming more prevalent in the
classroom, software and hardware problems may dissipate in the future.
Preparation time, identified in some previous studies as a challenge, focussed on set-up time [5] and creating effective questions [1,5,9]. While set-up time may be alleviated somewhat by using software-based SRSs, creating meaningful, thought-provoking questions is critical for the effective use of SRSs and will likely take substantial time. Teachers might consider pooling their resources to create a shared database of questions organized by subject area and course.
Individual differences in receptivity to using SRSs were observed by a few teachers in this study,
specifically with respect to the intensity of the classroom when SRSs were used and the perceived time
pressure for students to answer questions. Maintaining a calm atmosphere while using SRSs and
allowing sufficient time for students to reply is clearly necessary for some students.
While using SRSs for summative assessment might save time and help students prepare for multiple-
choice questions, some teachers in this study noted a strong negative reaction to using SRSs for formal
testing. The key issue appeared to revolve around the anxiety of being evaluated combined with
answering questions using new and occasionally unpredictable technology that did not allow students
to review previous responses. This negative feedback is consistent with two previous studies where
SRSs were used for summative assessment in secondary schools [11,12]. Given that the benefits of
using SRSs for summative purposes are limited, teachers might re-consider using this tool for formal
evaluation.
Classroom management issues were not identified as a problem in previous studies in higher education.
One might expect older students to behave when using SRSs at the college or university level. However,
in secondary school classrooms, several teachers reported that some students did not have the maturity
to use SRSs and deliberately tried to sabotage the process. These discipline problems were not
prevalent, but need to be addressed with younger students to limit distraction for other students and
maximize learning.
4.3 Limitations and Future Research
A mixed method, case study approach was used to investigate the benefits and challenges of using
SRSs in STEM-based, secondary school classrooms. While a more detailed understanding of SRS use
was presented based on rich comments from the focus groups, the generalizability of the results cannot
be established based on the small sample size. Future research could (a) use the qualitative findings
from this study to develop a more comprehensive survey for a much larger audience, (b) develop and test
interventions that help maximize the benefits and minimize the challenges of using SRSs, (c) explore
the effectiveness of different types of questions on student learning, and (d) conduct research that
applies the results and principles revealed in previous studies on physical SRSs to a free, software-
based format that uses both multiple choice and open-ended questions.
REFERENCES
[1] C. Boscardin & W. Penuel, “Exploring benefits of audience-response systems on learning: a review of the literature,” Academic Psychiatry, vol. 36, no. 7, pp. 401-407, 2012.
[2] Y. T. Chien, Y. H. Chang, & C. Y. Chang, “Do we click in the right way? A meta-analytic review of clicker-integrated instruction,” Educational Research Review, vol. 17, pp. 1-18, 2016.
[3] J. H. Han, “Closing the missing links and opening the relationships among the factors: A literature
review on the use of clicker technology using the 3P model,” Journal of Educational Technology &
Society, vol. 17, no. 4, pp. 150-168, 2014.
[4] N. J. Hunsu, O. Adesope, & D. J. Brady, “A meta-analysis of the effects of audience response systems (clicker-based technologies) on cognition and affect,” Computers & Education, vol. 94, pp. 102-119, 2016.
[5] R. H. Kay & A. Lesage, “Examining the benefits and challenges of using audience response
systems: A review of the literature,” Computers & Education, vol. 53, no. 3, pp. 819-827, 2009.
[6] S.M. Keough, “Clickers in the Classroom: A Review and a Replication,” Journal of Management
Education, vol. 36, no. 6, pp. 822-847, 2012.
[7] R. E. Landrum, “Teacher-ready research review: Clickers,” Scholarship of Teaching and Learning
in Psychology, vol. 1, no. 3, pp. 250-254, 2015.
[8] C. Liu, S. Chen, C. Chi, K. P. Chien, Y. Liu, & T. L. Chou, “The effects of clickers with different teaching strategies,” Journal of Educational Computing Research, vol. 55, no. 5, pp. 603-628.
[9] R. Shieh & W. Chang, “Implementing the interactive response system in a high school physics context: Intervention and reflections,” Australasian Journal of Educational Technology, vol. 29, no. 5, pp. 748-761, 2013.
[10] T. Y. Chien, Y. Lee, T. Y. Li, & C. Y. Chang, “Examining the effects of displaying clicker voting results on high school students' voting behaviors, discussion processes, and learning outcomes,” Eurasia Journal of Mathematics, Science & Technology Education, vol. 11, no. 5, pp. 1089-1104, 2015.
[11] R. Kay, A. LeSage, & L. Knaack, “Examining the use of audience response systems in secondary
school classrooms: A formative analysis,” Journal of Interactive Learning Research, vol. 21, no. 3,
pp. 343-365, 2010.
[12] R. Kay & L. Knaack, “Exploring the use of audience response systems in secondary school
science classrooms,” Journal of Science Education and Technology, vol. 18, no. 5, pp. 382-392,
2009.
[13] W. R. Penuel, C. K. Boscardin, & K. Masyn, “Teaching with student response systems in
elementary and secondary education settings: A survey study,” Educational Technology Research
and Development, vol. 55, no. 4, pp. 315-346, 2007.
[14] F. Vital, “Creating a positive learning environment with the use of clickers in a high school chemistry classroom,” Journal of Chemical Education, vol. 89, no. 4, pp. 470-473, 2011.