Australasian Journal of
Educational Technology
2009, 25(2), 235-249
A strategic assessment of audience response
systems used in higher education
Robin H. Kay and Ann LeSage
University of Ontario Institute of Technology
An audience response system (ARS) permits students to respond to multiple choice
questions using a remote control device. All responses are instantly displayed, usually
in chart form, and subsequently reviewed and discussed by the instructor and the
class. This paper offers a comprehensive review of teaching strategies used with ARS
and includes a discussion of general, motivational, assessment based, and learning
based approaches. Several promising strategies have been identified, particularly
collecting formative assessment feedback and peer based instruction. More systematic,
detailed research in a broader range of contexts is recommended.
Overview
An audience response system (ARS) allows students to respond to multiple choice
questions using a remote control device. After students click in their responses, the
results are instantly amalgamated and displayed in chart form. More often than not,
responses are anonymous, but can be linked to specific students for evaluation
purposes. The key advantage of using an ARS is that it can give feedback to both
students and instructors on how well the entire class understands concepts presented.
Once this feedback is obtained, a teacher can modify the course of instruction, or
students can work out misconceptions via peer or classroom discussion.
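To make the mechanics concrete, the following Python sketch (ours, for illustration only; it is not drawn from any system described in this paper) shows how an ARS might tally clicker responses and render them as the kind of chart a class would review. The option labels and vote counts are invented.

```python
# Illustrative sketch of ARS response aggregation: tally anonymous
# clicker responses and print a bar chart of the class distribution.
from collections import Counter

def display_results(responses, options=("A", "B", "C", "D")):
    """Tally clicker responses and print one bar per answer option."""
    tally = Counter(responses)
    total = len(responses) or 1  # avoid division by zero on empty input
    for option in options:
        count = tally.get(option, 0)
        bar = "#" * round(40 * count / total)  # scale bars to 40 characters
        print(f"{option}: {bar} {count} ({100 * count / total:.0f}%)")

# Example: 30 students click in; the split reveals a common misconception.
display_results(["B"] * 14 + ["A"] * 9 + ["C"] * 5 + ["D"] * 2)
```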
ARSs were first introduced at Stanford and Cornell Universities in the mid 1960s, but
did not become commercially available until 1992 (Abrahamson, 2006; Judson &
Sawada, 2002). In 1999, a new generation of more affordable ARSs was created, with
widespread use emerging in 2003. Today, numerous colleges and universities use
ARSs (Abrahamson, 2006).
The purpose of the following review is to provide a current, comprehensive analysis of
strategies used with ARSs in higher education.
Method
Previous literature reviews
At least four previous literature reviews have been completed on ARSs (Caldwell,
2007; Fies & Marshall, 2006; Judson & Sawada, 2002; Simpson & Oliver, 2007). Judson
& Sawada (2002) provided a summary of ARS use prior to 1998; however, their review
included only eight peer reviewed references. Moreover, because the widespread use
of ARSs began in earnest after 2003, Judson & Sawada's (2002) review is now dated.
Fies & Marshall (2006) completed an extensive analysis of methods used to assess
ARSs; however, they examined only 16 peer reviewed studies, of which only two were
published after 2004. Therefore, some of their findings and conclusions are incomplete.
For example, they noted that few studies reported the use of ARSs for formative
assessment purposes; however, since 2004, 16 new studies examining formative
assessment have been completed. The authors also claimed that ARSs were used
primarily for individual, not collaborative, interactions, a practice that has changed
markedly since 2004.
A more recent review by Simpson & Oliver (2007) analysed more than 40 papers. Only
17 of the cited articles were from peer reviewed journals, with the majority of the
results based on five references. Finally, Caldwell (2007) analysed 25 peer reviewed
articles. Most papers examined were published after 2000 with 10 studies completed
after 2004. The principal foci of Caldwell’s work were to discern the primary users of
ARSs, determine the rationale for using ARSs, explore questioning techniques utilised
during ARS lessons, and identify best practices associated with ARS classroom use.
However, fewer details were offered concerning specific teaching strategies used, and
their impact on student learning.
Current study
Data
In order to address some of the limitations of previous research reviews on ARSs, a
comprehensive search of peer reviewed journals and edited books was completed.
Conference papers or reports were not included. The reasons for not including
conference papers or reports were threefold. First, it was assumed that peer reviewed
journal articles, on balance, were more rigorous than conference papers in terms of
method and quality of data. Journal articles typically report on finished studies,
whereas many conference papers are works in progress. Second, the sheer volume of
conference papers made it unrealistic to include this type of publication. A recent
search of AACE conference proceedings revealed 395 conference papers on ARSs.
Third, finding a consistent, objective method to assess the quality, validity and volume
of conference papers is challenging. Some conferences have a strict review process, but
the details for review criteria are not as transparent and consistent as they are with
journal articles. All methodological choices come with inherent biases, and this paper
is no different. By not including conference papers, it is recognised that new and
innovative strategies for using ARSs will be missed (e.g. peer assessment from
Richards, Braiding & Vaughan, 2006; or the impact of multiculturalism on ARS use
from Albon & Jewels, 2007).
The articles selected for this review focus solely on strategies used with ARSs in the
higher education domain. Twenty-six labels (see Appendix A in Kay, 2008a) were used
to search for relevant articles. A total of 52 papers and chapters were analysed. Given
that previous literature reviews included no more than 25 peer reviewed papers, one
can be reasonably confident that this review accurately reflects the state of current
research on ARSs. Of the 52 studies analysed, 50 were conducted between 2000 and
2007, with 37 articles published since 2004.
Data analysis
Each of the studies included in this review was analysed based on the following
categories: rationale or theory for using ARS, context of use, benefits and challenges
associated with using ARS, strategies and pedagogical use, and individual differences
in the use of ARS. See Appendix B (Kay, 2008b) for a detailed description of the
coding of variables used in this study. See Appendix C (Kay, 2008c) for a description of
all the articles reviewed for this study.
Results and discussion
Context of using ARS
Before presenting a detailed analysis of strategies used with ARSs, it is informative to
understand the context in which ARSs have been used to date. Seventeen studies were
in the domain of science, six in medicine, four in mathematics or computer science,
three in business, and three in engineering. The remaining 19 studies covered a variety
of subject areas including anatomy, biology, earth science, economics, law,
pharmacology, philosophy, and social science. Sample size ranged from 14 to 1542
students. Eighty percent of the 30 studies that did report sample size examined over 80
students, with an overall mean of 241, suggesting that ARSs were typically used in large
classrooms.
In summary, the conclusions from the current review reflect the attitudes and learning
efforts of large classes of undergraduate students who were studying mathematics or
science based subject areas.
Strategies and ARS
While there is considerable evidence to suggest that higher education students are very
positive toward the use of ARSs (e.g., Caldwell, 2007; Fies & Marshall, 2006; Simpson
& Oliver, 2007), the integration of an ARS into a classroom does not guarantee
improved student learning (Draper & Brown, 2004; Van Dijk, Van Den Berg & Van
Keulen, 2001). It is the implementation of pedagogical strategies in combination with
the technology that ultimately influences student success (Reay, Bao, Li,
Warnakulasooriya & Baugh, 2005; Simpson & Oliver, 2007; Stuart, Brown & Draper,
2004). While limited research has examined and compared specific
instructional strategies used with ARSs, a number of techniques have been employed
over the past eight years (see Table 1 for a summary).
Four categories of strategies were examined in the current review. General strategies are
those referring to the preparation and process involved in using an ARS. Motivational
strategies involve explicit attempts to involve or engage students. Assessment strategies
refer to the use of an ARS to guide instruction or evaluate progress. Finally, learning
based strategies include specific techniques designed to increase learning performance.
General strategies for using an ARS
Explain why an ARS is being used
As stated earlier, higher education students appear to embrace the use of ARSs in the
classroom. However, a number of students have resisted the use of ARSs, especially
during the early stages of adoption. Some objections have included concerns over the
extra effort required to discuss answers presented by an ARS (Trees & Jackson, 2007),
wanting responses to be anonymous (Abrahamson, 2006), feeling uncomfortable when
responding incorrectly, particularly when a majority of the class responds correctly
(Carnaghan & Webb, 2007), becoming overly distracted by the use of an ARS (Siau,
Sheng & Nah, 2006), and general resistance to a new method of learning (Beatty, 2004;
Fagan, Crouch & Mazur, 2002).
Table 1: Summary of strategies for using ARS

General strategies
• Explain: Explain to the class why the ARS is being used (Beatty, 2004; Caldwell,
2007; Dufresne & Gerace, 2004; Trees & Jackson, 2007).
• Preparation: Planning required to develop effective questions (Allen & Tanner, 2005;
Beatty, 2004; Beatty et al., 2006; Boyle, 2006; Caldwell, 2007; McCabe, 2006;
Poulis et al., 1998; Stuart et al., 2004).
• Type: Types of questions that work with an ARS (Beatty, 2004; Beatty et al., 2006;
Brewer, 2004; Caldwell, 2007; Crouch & Mazur, 2001; Cutts, 2006; Dufresne &
Gerace, 2004; Fies & Marshall, 2006; Horowitz, 2006; Kennedy & Cutts, 2005;
McCabe, 2006; Miller et al., 2006; Poulis et al., 1998).
• Format: Format in which questions are offered (Caldwell, 2007; Cutts, 2006;
Horowitz, 2006; McCabe, 2006; Robertson, 2000; Simpson & Oliver, 2007; Uhari et
al., 2003).

Motivational strategies
• Attendance: Students go to class more (Burnstein & Lederman, 2001; Caldwell, 2007;
Greer & Heaney, 2004).
• Engagement: Students are more engaged in class (Bergtrom, 2006; Caldwell, 2007;
Draper & Brown, 2004; Latessa & Mouw, 2005; Preszler et al., 2007; Siau, Sheng &
Nah, 2006; Simpson & Oliver, 2007).
• Participation: Students participate with peers more in class to solve problems
(Bullock et al., 2002; Caldwell, 2007; Draper & Brown, 2004; Greer & Heaney, 2004;
Jones et al., 2001; Siau, Sheng & Nah, 2006; Stuart et al., 2004; Uhari et al., 2003;
Van Dijk et al., 2001).

Assessment strategies
• Formative: Assessment that improves student understanding and the quality of
teaching (Beatty, 2004; Bergtrom, 2006; Brewer, 2004; Bullock et al., 2002; Caldwell,
2007; Draper & Brown, 2004; Dufresne & Gerace, 2004; Elliott, 2003; Greer &
Heaney, 2004; Hatch et al., 2005; Jackson et al., 2005; Siau, Sheng & Nah, 2006;
Simpson & Oliver, 2007; Stuart et al., 2004).
• Contingent teaching: Adjust teaching method based on feedback from the class
(Beatty, 2004; Brewer, 2004; Draper & Brown, 2004; Elliott, 2003; Greer & Heaney,
2004; Jackson et al., 2005; Kennedy & Cutts, 2005; Poulis et al., 1998; Simpson &
Oliver, 2007).
• Summative: Use of the ARS for graded tests (Draper et al., 2002; Fies & Marshall,
2006; Simpson & Oliver, 2007).

Learning based strategies
• Attention: Students are more focused in class (Bergtrom, 2006; Burnstein &
Lederman, 2001; Caldwell, 2007; d'Inverno et al., 2003; Draper & Brown, 2004;
Elliott, 2003; Jackson et al., 2005; Jones et al., 2001; Latessa & Mouw, 2005; Siau,
Sheng & Nah, 2006; Slain et al., 2004).
• Interaction: Students interact more with peers to discuss ideas (Beatty, 2004;
Bergtrom, 2006; Caldwell, 2007; Elliott, 2003; Freeman et al., 2007; Kennedy et al.,
2006; Sharma et al., 2005; Siau, Sheng & Nah, 2006; Slain et al., 2004; Stuart et al.,
2004; Trees & Jackson, 2007; Van Dijk et al., 2001).
• Peer based instruction: Promote peer discussion and resolution of problems (Brewer,
2004; Bullock et al., 2002; Burnstein & Lederman, 2001; Caldwell, 2007; Crouch &
Mazur, 2001; Draper & Brown, 2004; Jones et al., 2001; Kennedy & Cutts, 2005;
Miller et al., 2006; Nicol & Boyle, 2003).
• Student preparation: Students read materials ahead of class (Beatty, 2004; Bergtrom,
2006; d'Inverno et al., 2003; El-Rady, 2006; Uhari et al., 2003).
• Class based discussion: Promote class discussion and resolution of problems (Beatty
et al., 2006; Caldwell, 2007; d'Inverno et al., 2003; Nicol & Boyle, 2003; Reay et al.,
2005; Sharma et al., 2005).
• Case studies: Present and solve case studies (Jones et al., 2001).
• Experiments: Present and solve experiments (Draper et al., 2002; Simpson & Oliver,
2007).
Because of the potential problems that could arise when introducing an ARS, a number
of researchers recommend that instructors should explain why this new technology is
being used. Beatty (2004) suggests that students will be more comfortable when they
understand the rationale for a specific classroom practice. Caldwell (2007) and Trees &
Jackson (2007) add that if teachers expect to garner full student support for this new
method, they need to explain why ARSs are being used and what they expect to gain
by using the technology. Finally, although ARSs are reportedly easy to use (d'Inverno
et al., 2003; Elliott, 2003; Hinde & Hunt, 2006; Jones, Connolly, Gear & Read, 2001;
Pradhan, Sparano, & Ananth, 2005; Sharma, Khachan, Chan & O'Byrne, 2005; Siau et
al., 2006), it might be wise to have practice questions to allow students to become
familiar with using the technology (Caldwell, 2007). To date, the need for explanation
and practice has not been systematically studied; however, it seems to be a reasonable
approach for addressing some of the challenges noted earlier. For example, if students
understand why they are using an ARS, they may be more accepting of this new way of
learning, more willing to put forth the extra effort required to discuss and solve
problems in class, and less sensitive when they respond incorrectly to the questions
presented.
Question preparation
While setting up and using an ARS is a relatively simple and quick task, creating
effective multiple choice questions is a challenging and time consuming process (Allen
& Tanner, 2005; Boyle, 2006). A number of recommendations have been tendered with
respect to preparing effective questions. It has been argued that every question should
have an explicit pedagogical purpose (Beatty, 2004; Beatty, Leonard, Gerace &
Dufresne, 2006; Caldwell, 2007; Poulis, Massen, Robens & Gilbert 1998). In addition,
McCabe (2006) notes that questions should be thoughtfully linked together. Boyle
(2006) adds that even when questions have been prepared and used, there is need for
continual refinement. Stuart et al. (2004) suggest that spontaneous questions based on
student feedback work well; however, most research supports a more systematic,
carefully planned approach (Allen & Tanner, 2005; Beatty, 2004; Beatty et al., 2006;
Caldwell, 2007; McCabe, 2006; Poulis et al., 1998). Current research, though, is lacking
with respect to concrete evidence on the impact of various question preparation
strategies. In addition, there are few classroom-ready collections of ARS questions
available in most fields. Thus it is incumbent upon the instructor to develop original
questions, and many find this task extremely time consuming (Allen & Tanner, 2005;
Beatty et al., 2006; El-Rady, 2006; Fagan et al., 2002; Freeman, Comerton-Forder,
Pickering & Blayney, 2007; Paschal, 2002).
Question type
Allen & Tanner (2005) maintain that the cognitive benefits of ARSs are only as strong
as the questions asked. Beatty et al. (2006) add that the critical challenge is to create
questions that cultivate productive classroom interaction and discourse. A wide range
of suggestions have been offered regarding the most effective type of ARS questions,
including those that:
• allow students to apply knowledge recently acquired (Poulis et al., 1998);
• are higher level and require students to compare situations or data, make
predictions, and explore causal relationships (Beatty et al., 2006; Poulis et al., 1998);
• are ill-defined and vague with respect to possible answers so that students are
required to process and debate (Beatty, 2004; Beatty et al., 2006; Caldwell, 2007;
Dufresne & Gerace, 2004; Miller, Santana-Vega & Terrell, 2006);
• focus on process and reasoning as opposed to factual content (Beatty, 2004; Beatty
et al., 2006; Crouch & Mazur, 2001; Fies & Marshall, 2006);
• identify and help resolve misconceptions (Beatty et al., 2006; Brewer, 2004; Crouch
& Mazur, 2001; Cutts, 2006; Horowitz, 2006); and
• support a comprehensive review of a specific set of concepts (Kennedy & Cutts,
2005; McCabe, 2006).
It should be noted that the above characteristics are not mutually exclusive. For
example, questions could be designed to be ill-defined, require students to make
predictions, but also involve several responses that isolate misconceptions. It is also
worth mentioning that the type of questions created may be partially dependent on the
subject being taught (McCabe, 2006). For example, more technical, application type
questions may be suited to applied science, mathematics or engineering, while more
open ended, less concrete questions may be more appropriate in philosophy or social
sciences. Some good examples of ARS questions are available at the A2L Library
(2008), Chemistry Concept Tests (2008), and the Cornell Mathematics Database (2008).
For a more detailed discussion of ARS questioning, consult Beatty (2004), Caldwell
(2007), and Beatty et al. (2006).
Question format
Once a question is developed, decisions have to be made about the number of
questions to ask, the number of options to provide within a multiple choice question,
and how long to take when working with a specific question. Most researchers agree
that questions should be sprinkled judiciously throughout a lecture at the rate of two
to five questions per 50 minute time period (Allen & Tanner, 2005; Burton, 2006;
Caldwell, 2007; Preszler, Dawe, Shuster, & Shuster, 2007; Robertson, 2000). The main
reason given for limiting the number of questions is to maintain student interest and
enthusiasm (Allen & Tanner, 2005; Burton, 2006; Preszler et al., 2007; Robertson, 2000).
Given that thoughtful ARS questions take 5-10 minutes to display, discuss, and resolve
(Cutts, 2006), it would be challenging to present more than one question every 10-15
minutes. Finally, because many teachers are concerned about reduced content
coverage when using an ARS, limiting the number of questions asked is probably a
reasonable strategy (e.g., Beatty, 2004; Beatty et al., 2006; Caldwell, 2007; Cutts, 2006;
Draper & Brown, 2004).
Many researchers suggest that no more than four or five options be offered when
asking an ARS question (Caldwell, 2007; Cutts, 2006; Robertson, 2000; Uhari, Renko &
Soini, 2003). This suggestion is based on the experiences of the researchers, yet no
specific reasons are given for this restriction. Several researchers have suggested that
to accurately monitor whole class understanding, an “I don’t know” option be
included as a potential response (McCabe, 2006; Simpson & Oliver, 2007).
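As a purely illustrative sketch, the hypothetical question record below encodes two of the format recommendations above: no more than four or five answer options, and an explicit "I don't know" choice appended so that whole class understanding can be monitored. The class name and sample question are our own inventions.

```python
# Hypothetical ARS question record enforcing the recommended format:
# two to five answer options, plus an automatic "I don't know" choice.
from dataclasses import dataclass

@dataclass
class ARSQuestion:
    stem: str
    options: list  # answer choices, excluding "I don't know"
    correct: int   # index of the correct choice in options

    def __post_init__(self):
        if not 2 <= len(self.options) <= 5:
            raise ValueError("use two to five answer options per question")
        self.options = self.options + ["I don't know"]

q = ARSQuestion(
    stem="Which measure of central tendency is most robust to outliers?",
    options=["Mean", "Median", "Mode", "Range"],
    correct=1,
)
print(q.options)  # four choices plus the "I don't know" escape option
```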
Little research has been published on the optimum amount of time to take when
asking an ARS question. Beatty et al. (2006) anecdotally reported that when the noise
level drops in a class, it is time to stop peer discussion and move on. Time allotted for
discussion of an ARS question may be related to subject area, level of thinking
required, question difficulty level, and the pedagogical goals of the instructor. To date,
limited data has been collected comparing the efficacy of different question formats.
Motivational strategies for using an ARS
Attendance
In an effort to address attendance issues, ARSs have been introduced at some
universities. Several studies point out that attendance does improve when an ARS is
used, provided it is associated with a portion of the course grade. Specifically, several
researchers observed dramatic increases in attendance when 15% of a student’s grade
was linked to ARS participation (Burnstein & Lederman, 2001; Greer & Heaney, 2004).
However, Caldwell (2007) reported that allotting just five percent of a student's grade to
ARS participation was sufficient motivation to improve regular classroom attendance.
Only two studies have reported increased attendance in ARS classes when grades were
not used as a motivator (El-Rady, 2006; Preszler et al., 2007).
However, the strategy of using an ARS to increase attendance is not universally
popular and sometimes has unexpected consequences. For example, Greer & Heaney
(2004) added that students were displeased about being forced to attend class in order
to gain academic credit for ARS participation. An unfortunate consequence of this
attendance monitoring practice is that 20% to 58% of students observed their peers
bringing multiple remote devices to class to record attendance for missing classmates
(Caldwell, 2007). Ideally, students should want to attend class for intrinsic, not
extrinsic reasons. ARSs should provide an inherent learning incentive so that students
want to attend. Attaching a grade to ARS monitored attendance may foster resistance
and undermine the goal of developing a student centered environment.
Participation
Many instructors and learning theorists would agree that students need to participate
actively in the learning process. One of the main reasons an ARS is used is to increase
participation. Substantial evidence indicates that using an ARS increases student
participation when compared to participation rates in classrooms where an ARS was
not used (Bullock et al., 2002; Caldwell, 2007; Draper & Brown, 2004; Greer & Heaney,
2004; Jones et al., 2001; Siau, Sheng & Nah, 2006; Stuart et al., 2004; Uhari et al., 2003;
Van Dijk et al., 2001). For example, one study observed that “shy” students
participated more in classrooms using ARSs (Greer & Heaney, 2004). Bullock et al.
(2002) added that when a portion of students’ grades were assigned to ARS use,
participation markedly increased. Another study reported that ARSs were more
effective when case studies were employed (Jones et al., 2001). Still, other researchers
have noted that students were more involved when ARSs were used in groups as
opposed to individually (Jones et al., 2001; Van Dijk et al., 2001).
Engagement
An implicit strategy for using ARSs is their engagement value: if students are
engaged, it is argued, they are more likely to actively construct knowledge. In general,
students in ARS based classes report being more interested or engaged in concepts
presented and discussed (Bergtrom, 2006; Preszler et al., 2007; Simpson & Oliver,
2007). It is important to note, though, that detailed data on why students are
engaged has not been collected to date. For example, students may be
more engaged because they are actively involved in the learning process. An
alternative explanation might involve the novelty of the technology - it may simply be
fun to use a remote control device in class and observe other students’ responses. More
comprehensive, qualitative research is required to explore plausible explanations for
increased student engagement with ARS use.
Assessment based strategies for using an ARS
Formative assessment
There are two main forms of classroom assessment: summative and formative.
Summative assessment is used for tests or assignments that count toward a student’s
final grade. Formative assessment is used to determine student understanding of
concepts without grades in order to identify misconceptions and alter classroom
instruction accordingly (Bransford, Brown & Cocking, 1999). Without a tool like an
ARS, it is challenging to calibrate overall student understanding of concepts presented
in class. Regular use of an ARS can offer feedback to both instructors and students as
to how well concepts are being understood. Experienced teachers can quickly modify
explanations or mode of instruction and students can gauge and discuss their
understanding of concepts as they are presented. There is considerable evidence to
suggest that ARSs help provide effective formative assessment (Beatty, 2004; Bergtrom,
2006; Brewer, 2004; Bullock et al., 2002; Caldwell, 2007; Draper & Brown, 2004;
Dufresne & Gerace, 2004; Elliott, 2003; Greer & Heaney, 2004; Hatch, Jensen, & Moore,
2005; Jackson et al., 2005; Siau et al., 2006; Simpson & Oliver, 2007; Stuart et al., 2004).
Contingent teaching
With contingent teaching, the material that is presented and discussed in class is
largely dependent on student feedback from the ARS. The instructor is presenting
ideas, gathering formative assessment data, and adjusting content and teaching
strategies based on how well students understand the concepts (Brewer, 2004; Cutts,
2006; Draper & Brown, 2004; Elliott, 2003; Greer & Heaney, 2004; Hinde & Hunt, 2006;
Jackson et al., 2005; Kennedy & Cutts, 2005; Poulis et al., 1998; Stuart et al., 2004). There
is some evidence to suggest that this approach has been successful (Brewer, 2004;
Greer & Heaney, 2004), although Abrahamson (2006) speculates that success may be
dependent on an instructor’s experience level and ability to instantly address problems
and misconceptions.
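The decision logic behind contingent teaching can be pictured as a simple branch on the response distribution. The sketch below is a hypothetical illustration; the thresholds are invented for the example and are not prescribed by the studies cited above.

```python
# Illustrative contingent teaching rule: branch on the share of correct
# clicker responses to decide what the instructor does next.
def next_step(num_correct, num_responses):
    fraction = num_correct / num_responses
    if fraction > 0.8:
        return "Most students understand: move on to the next concept"
    if fraction > 0.4:
        return "Class is split: run peer discussion, then re-vote"
    return "Widespread confusion: re-teach with a different explanation"

print(next_step(26, 30))  # -> move on
print(next_step(14, 30))  # -> peer discussion and re-vote
```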
Summative assessment
Fies & Marshall (2006) reported that most higher education instructors regularly employ
summative assessment strategy when using an ARS. However, limited evidence exists
to support this claim - most studies report the formative use of ARSs (e.g.,
Abrahamson, 2006; Beatty, 2004; Caldwell, 2007; Elliott, 2003; Jackson et al., 2005). It
has been suggested that summative assessment encourages rote learning and cannot
be used to shape instruction in a dynamic fashion (Dufresne & Gerace, 2004). Indeed,
how well students understand concepts in traditional, lecture based classes often
remains a mystery until after the first exam (Bullock et al., 2002). There is some
evidence to indicate that higher education students do not enjoy using an ARS for
grades (Caldwell, 2007). While using an ARS for formal testing situations might be
attractive to an instructor in terms of quick, easy marking, it may not be the most appropriate
pedagogical choice. More research is needed examining the benefits and challenges of
using ARSs for summative assessment.
Learning based strategies for using an ARS
Attention
One of the fundamental conditions for effective learning is that students are focused
and paying attention to the content presented. Some evidence indicates that in
classroom situations, student attention wanes after approximately 20 minutes
(d'Inverno et al., 2003; Jackson et al., 2005). Given that the length of a traditional lecture
is between 50 minutes and three hours, it is inevitable that students will be unable to
concentrate on the lesson content for the entire duration of the class. Presenting ARS
questions at 20 minute intervals may be one technique for segmenting a long lecture by
requiring students to shift their attention and actively participate in the learning
process. This hypothesis is supported by numerous studies which have reported that
higher education students are more attentive when an ARS is used during lectures
(Bergtrom, 2006; Burnstein & Lederman, 2001; Caldwell, 2007; d'Inverno et al., 2003;
Draper & Brown, 2004; Elliott, 2003; Jackson et al., 2005; Jones et al., 2001; Latessa &
Mouw, 2005; Siau et al., 2006; Slain, Abate, Hodges, Stamatakis & Wolak, 2004). It
should be noted, though, that a direct link between increased attention in ARS based
classrooms and learning performance has not been established.
Interaction
It could be argued that participation is a necessary, but not sufficient component for
learning. The quality of participatory effort is perhaps more important. Students need
to be interacting with each other, the instructor and new concepts being introduced.
They must also be “cognitively” engaged (Van Dijk et al., 2001). Numerous studies
suggest that frequent and positive interaction occurs with ARSs (Beatty, 2004;
Bergtrom, 2006; Caldwell, 2007; Elliott, 2003; Freeman et al., 2007; Kennedy, Cutts &
Draper, 2006; Sharma et al., 2005; Siau et al., 2006; Slain et al., 2004; Stuart et al., 2004;
Trees & Jackson, 2007). When an ARS is used, researchers have reported greater
articulation of student thinking (Beatty, 2004), more probing questions and an
increased focus on student needs (Beatty, 2004; Siau et al., 2006), effective peer to peer
discussions (Bergtrom, 2006; Caldwell, 2007; Kennedy et al., 2006), and active learning
(Elliott, 2003; Kennedy et al., 2006; Slain et al., 2004; Stuart et al., 2004).
Peer based learning
One of the most common and successful strategies used with ARS is peer based
instruction, which involves displaying a higher level question that could identify a
misconception, asking students to click in a response, giving students time to discuss
and defend their answers with two to four peers, taking a re-vote on the original
question, and having the instructor provide a brief summary (Brewer, 2004; Bullock et
al., 2002; Burnstein & Lederman, 2001; Crouch & Mazur, 2001; Cutts, 2006; Draper &
Brown, 2004; Hinde & Hunt, 2006; Jones et al., 2001; Kennedy & Cutts, 2005; Miller et
al., 2006; Nicol & Boyle, 2003). Crouch & Mazur (2001) provided 10 years of evidence
suggesting that peer based instruction provides significant gains in student learning
performance. Nicol & Boyle (2003) added that peer instruction is central to the
development of student conceptual understanding.
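The cycle can be summarised in a short sketch. The code below is our illustrative rendering of the vote, discuss, re-vote sequence, not a published implementation; the vote tallies are invented to show how an instructor might gauge whether peer discussion shifted the class toward the correct answer.

```python
# Illustrative peer based instruction cycle: compare the vote
# distribution before and after small group discussion.
from collections import Counter

def report_revote(before, after, correct="B"):
    """Summarise how peer discussion shifted responses on a question."""
    b, a = Counter(before), Counter(after)
    n = len(before)
    print(f"Correct before discussion: {b[correct]}/{n}")
    print(f"Correct after discussion:  {a[correct]}/{n}")

# First vote splits the class; after defending answers in groups of two
# to four, most students converge on the correct option before re-voting.
first_vote = ["A"] * 12 + ["B"] * 13 + ["C"] * 5
second_vote = ["A"] * 5 + ["B"] * 23 + ["C"] * 2
report_revote(first_vote, second_vote)
```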
One of the main concerns about using an ARS on a regular basis is coverage of content.
Abundant research suggests that teachers, and occasionally students, feel that less
content is addressed when using an ARS as opposed to a more traditional lecture
format (Beatty, 2004; Beatty et al., 2006; Burnstein & Lederman, 2006; Caldwell, 2007;
d'Inverno, et al., 2003; Burton, 2006; Cutts, 2006; Draper & Brown, 2004; Fagan et al.,
2002; Freeman et al., 2007; Hatch et al., 2005; Sharma et al., 2005; Siau et al., 2006; Slain
et al., 2004; Steinert & Snell, 1999; Stuart et al., 2004). Responding to and discussing
knowledge centred questions that identify and target misconceptions can take
considerably more time than simply presenting material in a lecture. Some researchers
have noted, though, that what is covered in a traditional lecture may not be
understood as well as concepts learned with an ARS (Beatty et al., 2006; Caldwell,
2007). However, the fact remains that curriculum coverage is a reality that many
lecturers have to face. One way to compensate for material not covered in class is to
require students to do more reading and class preparation outside of the lecture
(Bergtrom, 2006; Bullock et al., 2002; Burnstein & Lederman, 2001; Caldwell, 2007; Slain
et al., 2004). This strategy is discussed next.
Require student preparation before class
A number of researchers have suggested that a good strategy for addressing reduced
content coverage is to require students to read materials prior to class so that the class
time can be devoted to refining and extending student thinking and knowledge
(Beatty, 2004; Bergtrom, 2006; d'Inverno, et al., 2003). D'Inverno et al. (2003) add that
providing notes to augment a lecture was a popular instructional practice. Student
attitudes toward the extra effort required for class preparation have not been measured,
nor has the impact of this strategy on the quality of learning.
Classroom discussion
A variation of the peer instruction approach augments the role of the entire class. A
multiple choice question is displayed, students are immediately asked to discuss
possible solutions with their peers before voting, responses are collected and
displayed, and then volunteers are asked to explain the answers that they selected. At
some point, the instructor stops the class discussion and summarises the results (Beatty
et al., 2006; d'Inverno, et al., 2003; Nicol & Boyle, 2003; Reay et al., 2005; Sharma et al.,
2005). One research study suggested that students prefer peer based instruction to
classroom discussion (Nicol & Boyle, 2003) - students wanted to think about their
answers on their own before discussing them with peers. A discussion first approach
tended to mute conversation, with some students simply conceding to the most
dominant student because they did not have sufficient time to formulate a response.
Other researchers have suggested that the classroom discussion approach may be
better suited to smaller classes (d'Inverno, et al., 2003).
Occasionally, significant problems emerge when ARS questions are presented for
classroom discussion. Some students may dominate group discussions (Nicol & Boyle,
2003) or discussion of different viewpoints may precipitate student confusion (Nicol &
Boyle, 2003; Reay et al., 2005). Occasionally, students feel that ARS use distracts them
from the lesson (Draper & Brown, 2004) or view class discussion as intimidating and a
source of anxiety (Nicol & Boyle, 2003). While these problems have not been reported
extensively, instructors and researchers need more information about creating effective
discussion that is focussed, non-threatening, and efficient.
Case studies
Jones et al. (2001) used ARSs successfully with university students by incorporating
case study questions in a peer based instruction format. The case studies appeared to
make classes far more animated and students talked considerably more with peers
instead of working on their own. This approach may be limited to the subject area
being covered. For example, it may be more difficult to come up with reasonable case
studies for mathematics or physics. Clearly more research is needed to gain a
comprehensive picture of the effectiveness of case studies.
Experiments
While this approach has not been tested rigorously, conducting psychology
experiments in class and requiring students to predict outcomes has been examined by
Simpson & Oliver (2007). Clearly more research is needed to determine the validity of
this approach, but, again, success may be partially dependent on subject area. Science,
for example, may be a good fit for an experimental approach, whereas history or
English may not.
Summary of strategies and future research
An analysis of 52 peer reviewed articles and chapters has revealed helpful information
regarding strategies used with ARSs. Promising ARS general strategies reported
included explaining why an ARS is being used, asking practice questions, taking time
to develop effective, meaningful questions, developing questions that stimulate class
discussion, focussing on higher level as opposed to factual questions, asking only two
to five questions per hour, and limiting the number of options in a multiple choice
question to four or five. Effective ARS motivational strategies involved using ARSs to
increase participation and engagement. Promising ARS assessment strategies included
collecting formative feedback and contingency teaching. Finally, with respect to
learning strategies, using an ARS to increase attention levels, improving the quality of
classroom interaction, peer based instruction, and requiring students to read extra
materials before an ARS class appeared to work well. Preliminary research suggests
that using ARSs with experiments and case studies is effective.
Nonetheless, a number of significant ARS strategy based problems have yet to be
examined. Overall, the results of the current review are limited to math and science
based subject areas. More research is needed on the use of ARSs in a broader range of
subject areas. Regarding ARS general strategies, formal evaluation comparing the
efficacy of specific question types and formats is needed. With respect to ARS
motivational strategies, it is unclear why students are more engaged when ARSs are
used. Anecdotal explanations have included the novelty effect of using a new
technology, the fun of clicking in answers, and increased commitment to learning
concepts being presented, but more in depth research is needed to explore and confirm
plausible explanations for increased student engagement. Concerning ARS assessment
strategies, it appears that formative assessment is a productive technique; yet more
research is necessary focussing on the benefits and challenges of using ARS for
summative assessment. Finally, when examining learning based strategies, learning
performance needs to be assessed to determine the effectiveness of certain strategies
such as the impact of increased attention levels, extra student preparation before class,
experiments, and case studies.
References
A2L Item Library (2008). A2L- Assessing to learn physics. [viewed 28 Nov 2008]
http://a2l.physics.umass.edu/library/
Abrahamson, L. (2006). A brief history of networked classrooms: Effects, cases, pedagogy, and
implications. In D. A. Banks (Ed.), Audience response systems in higher education (pp. 1-25).
Hershey, PA: Information Science Publishing.
Albon, R. & Jewels, T. (2007). The impact of audience response systems in a multicultural Asian
context. In ICT: Providing choices for learners and learning. Proceedings ascilite Singapore 2007.
http://www.ascilite.org.au/conferences/singapore07/procs/albon.pdf
Allen, D. & Tanner, K. (2005). Infusing active learning into the large-enrolment Biology class:
Seven strategies, from the simple to complex. Cell Biology Education, 4(4), 262-268.
http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=1305885
Beatty, I. (2004). Transforming student learning with classroom communication systems.
EDUCAUSE Research Bulletin, 2004(3), 1-13. [viewed 3 Nov 2007]
http://www.educause.edu/ir/library/pdf/ERB0403.pdf
Beatty, I. D., Leonard, W. J., Gerace, W. J. & Dufresne, R. J. (2006). Question driven instruction:
Teaching science (well) with an audience response system. In D. A. Banks (Ed.), Audience
response systems in higher education (pp. 96-115). Hershey, PA: Information Science Publishing.
Bergtrom, G. (2006). Clicker sets as learning objects. Interdisciplinary Journal of Knowledge and
Learning Objects, 2. [viewed 3 Nov 2007] http://ijklo.org/Volume2/v2p105-110Bergtrom.pdf
Boyle, J. (2006). Eight years of asking questions. In D. A. Banks (Ed.), Audience response systems in
higher education (pp. 289-304). Hershey, PA: Information Science Publishing.
Bransford, J. D., Brown, A. L. & Cocking, R. R. (Eds.) (1999). How people learn: Brain, mind,
experience, and school. Washington, DC: National Academy Press. [verified 8 May 2009]
http://www.nap.edu/openbook.php?record_id=6160
Brewer, C. A. (2004). Near real-time assessment of student learning and understanding in
biology courses. BioScience, 54(11), 1034-1039.
Bullock, D. W., LaBella, V. P., Clinghan, T., Ding, Z., Stewart, G. & Thibado, P. M. (2002).
Enhancing the student-instructor interaction frequency. The Physics Teacher, 40, 30-36.
Burnstein, R. A. & Lederman, L. M. (2001). Using wireless keypads in lecture classes. The Physics
Teacher, 39(1), 8-11.
Burnstein, R. A. & Lederman, L. M. (2006). The use and evolution of an audience response
system. In D. A. Banks (Ed.), Audience response systems in higher education (pp. 40-52).
Hershey, PA: Information Science Publishing.
Burton, K. (2006). The trial of an audience response system to facilitate problem-based learning
in legal education. In D. A. Banks (Ed.), Audience response systems in higher education (pp. 265-
276). Hershey, PA: Information Science Publishing.
Caldwell, J. E. (2007). Clickers in the large classroom: Current research and best-practice tips. Life
Sciences Education, 6(1), 9-20. http://www.lifescied.org/cgi/reprint/6/1/9.pdf
Carnaghan, C. & Webb, A. (2007). Investigating the effects of group response systems on student
satisfaction, learning, and engagement in accounting education. Issues in Accounting
Education, 22(3), 391-409.
Chemistry Concept Tests (2008). [viewed 28 Nov 2008]
http://jchemed.chem.wisc.edu/JCEDLib/QBank/collection/ConcepTests/
Cornell Mathematics Database (2008). [viewed 28 Nov 2008]
http://www.math.cornell.edu/~GoodQuestions/GQbysubject_pdfversion.pdf
Crouch, C. H. & Mazur, E. (2001). Peer instruction: Ten years of experience and results. American
Journal of Physics, 69(9), 970-977.
Cutts, Q. (2006). Practical lessons from four years of using an ARS in every lecture of a large
class. In D. A. Banks (Ed.), Audience response systems in higher education (pp. 65-79). Hershey,
PA: Information Science Publishing.
D’Inverno, R., Davis, H. & White, S. (2003). Using a personal response system for promoting
student interaction. Teaching Mathematics and Its Applications, 22(4), 163-169.
Draper, S. W. & Brown, M. I. (2004). Increasing interactivity in lectures using an electronic voting
system. Journal of Computer Assisted Learning, 20(2), 81-94.
Dufresne, R. J. & Gerace, W. J. (2004). Assessing-to-learn: Formative assessment in physics
instruction. The Physics Teacher, 42, 428-433.
Elliott, C. (2003). Using a personal response system in economics teaching. International Review of
Economics Education, 1(1). [viewed 3 Nov 2007]
http://www.economicsnetwork.ac.uk/iree/i1/elliott.htm
El-Rady, J. (2006). To click or not to click: That's the question. Innovate Journal of Online Education,
2(4). [viewed 3 Nov 2007] http://www.innovateonline.info/index.php?view=article&id=171
Fagan, A. P., Crouch, C. H. & Mazur, E. (2002). Peer Instruction: Results from a range of
classrooms. The Physics Teacher, 40(4), 206-209.
Fies, C. & Marshall, J. (2006). Classroom response systems: A review of the literature. Journal of
Science Education and Technology, 15(1), 101-109.
Freeman, M., Bell, A., Comerton-Forder, C., Pickering, J. & Blayney, P. (2007). Factors affecting
educational innovation with in class electronic response systems. Australasian Journal of
Educational Technology, 23(2), 149-170. http://www.ascilite.org.au/ajet/ajet23/freeman.html
Greer, L. & Heaney, P. J. (2004). Real-time analysis of student comprehension: An assessment of
electronic student response technology in an introductory earth science course. Journal of
Geoscience Education, 52(4), 345-351.
Hatch, J., Jensen, M. & Moore, R. (2005). Manna from heaven or clickers from hell. Journal of
College Science Teaching, 34(7), 36-39.
Hinde, K. & Hunt, A. (2006). Using the personal response system to enhance student learning:
Some evidence from teaching economics. In D. A. Banks (Ed.), Audience response systems in
higher education (pp. 140-154). Hershey, PA: Information Science Publishing.
Horowitz, H. M. (2006). ARS evolution: Reflections and recommendations. In D. A. Banks (Ed.),
Audience response systems in higher education (pp. 53-63). Hershey, PA: Information Science
Publishing.
Jackson, M., Ganger, A. C., Bridge, P. D. & Ginsburg, K. (2005). Wireless handheld computers in
the undergraduate medical curriculum. Medical Education Online, 10(5). [viewed 3 Nov 2007]
http://www.med-ed-online.org/pdf/t0000062.pdf
Jones, C., Connolly, M., Gear, A. & Read, M. (2001). Group integrative learning with group
process support technology. British Journal of Educational Technology, 32(5), 571-581.
Judson, E. & Sawada, D. (2002). Learning from past and present: Electronic response systems in
college lecture halls. Journal of Computers in Mathematics and Science Teaching, 21(2), 167-181.
Kay, R. H. (2008a). Appendix A - Labels used to describe audience response systems.
http://faculty.uoit.ca/kay/papers/arsrev/AppendixA_Labels.pdf
Kay, R. H. (2008b). Appendix B – Coding of research papers reviewed for ARS strategy paper.
http://faculty.uoit.ca/kay/papers/arsrev/AppendixB_Coding.pdf
Kay, R. H. (2008c). Appendix C – List of studies reviewed for ARS strategy paper.
http://faculty.uoit.ca/kay/papers/arsrev/AppendixC_ListOfPapers.pdf
Kennedy, G. E. & Cutts, Q. I. (2005). The association between students’ use of electronic voting
systems and their learning outcomes. Journal of Computer Assisted Learning, 21(4), 260-268.
Kennedy, G. E., Cutts, Q. & Draper, S. W. (2006). Evaluating electronic voting systems in
lectures: Two innovative methods. In D. A. Banks (Ed.), Audience response systems in higher
education (pp. 155-174). Hershey, PA: Information Science Publishing.
Latessa, R. & Mouw, D. (2005). Use of audience response system to augment interactive learning.
Family Medicine, 37(1), 12-14. [viewed 3 Nov 2007]
http://www.stfm.org/fmhub/fm2005/January/Robyn12.pdf
McCabe, M. (2006). Live assessment by questioning in an interactive classroom. In D. A. Banks
(Ed.), Audience response systems in higher education (pp. 276-288). Hershey, PA: Information
Science Publishing.
Miller, R. L., Santana-Vega, E. & Terrell, M. S. (2006). Can good questions and peer discussion
improve calculus instruction? PRIMUS, 16(3), 1-9. [verified 8 May 2009; preprint]
http://www.math.cornell.edu/~maria/mathfest_eduction/preprint.pdf
Nicol, D. J. & Boyle, J. T. (2003). Peer instruction versus class-wide discussion in large classes: A
comparison of two interaction methods in the wired classroom. Studies in Higher Education,
28(4), 457-473.
Paschal, C. B. (2002). Formative assessment in physiology teaching using a wireless classroom
communication system. Advances in Physiology Education, 26(4), 299-308.
Poulis, J., Massen, C., Robens, E. & Gilbert, M. (1998). Physics lecturing with audience paced
feedback. American Journal of Physics, 66(5), 439-441.
Pradhan, A., Sparano, D. & Ananth, C. V. (2005). The influence of an audience response system on
knowledge retention: An application to resident education. American Journal of Obstetrics and
Gynecology, 193(5), 1827-1830.
Preszler, R. W., Dawe, A., Shuster, C. B. & Shuster, M. (2007). Assessment of the effects of
student response systems on student learning and attitudes over a broad range of biology
courses. CBE-Life Sciences Education, 6(1), 29-41.
http://www.lifescied.org/cgi/content/full/6/1/29
Reay, N. W., Bao, L., Li, P., Warnakulasooriya, R. & Baugh, G. (2005). Toward the effective use of
voting machines in physics lectures. American Journal of Physics, 73(6), 554-558.
Richards, D., Braiding, K. & Vaughan, A. (2006). Fun and feedback at the press of a button. In
Who's learning? Whose technology? Proceedings ascilite Sydney 2006.
http://www.ascilite.org.au/conferences/sydney06/proceeding/pdf_papers/p151.pdf
Robertson, L. J. (2000). Twelve tips for using a computerised interactive audience response
system. Medical Teacher, 22(3), 237-239.
Sharma, M. D., Khachan, J., Chan, B. & O'Byrne, J. (2005). An investigation of the effectiveness of
electronic classroom communication systems in large lectures. Australasian Journal of
Educational Technology, 21(2), 137-154. http://www.ascilite.org.au/ajet/ajet21/sharma.html
Siau, K., Sheng, H. & Nah, F. (2006). Use of classroom response systems to enhance classroom
interactivity. IEEE Transactions on Education, 49(3), 398-403.
Simpson, V. & Oliver, M. (2007). Electronic voting systems for lectures then and now: A
comparison of research and practice. Australasian Journal of Educational Technology, 23(2), 187-
208. http://www.ascilite.org.au/ajet/ajet23/simpson.html
Slain, D., Abate, M., Hodges, B. M., Stamatakis, M. K. & Wolak, S. (2004). An interactive response
system to promote active learning in the doctor of pharmacy curriculum. American Journal of
Pharmaceutical Education, 68(5), 1-9.
Steinert, Y. & Snell, L. S. (1999). Interactive lecturing: Strategies for increasing participation in
large group presentations. Medical Teacher, 21(1), 37-42.
Stuart, S. A. J., Brown, M. I. & Draper, S.W. (2004). Using an electronic voting system in logic
lectures: one practitioner’s application. Journal of Computer Assisted Learning, 20(2), 95-102.
Trees, A. R. & Jackson, M. H. (2007). The learning environment in clicker classrooms: Student
processes of learning and involvement in large university-level courses using student
response systems. Learning, Media and Technology, 32(1), 21-40.
Uhari, M., Renko, M. & Soini, H. (2003). Experiences of using an interactive audience response
system in lectures. BMC Medical Education, 3(12), 1-6. http://www.biomedcentral.com/1472-
6920/3/12
Van Dijk, L. A., Van Den Berg, G. C. & Van Keulen, H. (2001). Interactive lectures in engineering
education. European Journal of Engineering Education, 26(1), 15-28.
Dr Robin H. Kay
University of Ontario Institute of Technology, Faculty of Education
2000 Simcoe St. North, Oshawa, Ontario L1H 7L7, Canada
Email: Robin.Kay@uoit.ca Web: http://faculty.uoit.ca/kay/home/
Dr Ann LeSage, University of Ontario Institute of Technology, Faculty of Education
Email: ann.lesage@uoit.ca
Web: http://education.uoit.ca/EN/main/13738/159923/profiles_lesage.php
... An important advantage is that using these game-based learning platforms increases classroom dynamics, attendance, student motivation and engagement, as students may perceive the learning environment as more interesting and competitive (Kalogiannakis et al., 2021;Owen & Licorish, 2020;Rojabi et al., 2022) while making lectures more interactive. Gamification, therefore, can turn students' learning environment into a positive classroom atmosphere, with "challenges, clear goals, feedback and an element of playfulness" (Kay & Lesage, 2009;Plum & LaRosa, 2017). ...
Article
Full-text available
Education has undergone a fundamental change driven by technological advances, significantly altering the way teaching and learning is conceptualized in the classroom. Currently the variety of learning and teaching platforms is extensive and in general, these have been very well received by language teachers. The use of these platforms is often driven by teachers' eagerness to keep students motivated, engaged, and focused throughout a class. This exploratory study investigates engineering undergraduates' perceptions of integrating Kahoot! (a Game-based Student Response System (GSRS)) to review technological vocabulary and grammar before their unit tests. Students' perceptions were assessed using quantitative and qualitative methods throughout one semester. The findings indicated that students found the use of Kahoot! to be beneficial in relation to: (1) improving technological vocabulary in English (2) review more complex grammatical structures; (3) enjoy a more relaxed classroom environment. Finally, the present article presents some implications that may be useful to language lecturers.
... Each quiz had 3-4 questions and was focused on relevant contents explained in class. We included multiple choice and true/false questions, bearing in mind that the most effective questions to be used in SRS are those (Kay and LeSage 2009b) that allow students to apply knowledge recently acquired; are higher level; focus on process and reasoning as opposed to factual content; identify and help resolve misconceptions; and support a comprehensive review of a specific set of concepts. b) Statement of the rules of the game. ...
Article
Full-text available
p>This paper describes a teaching experience using Socrative, a third party electronic tool, for real-time questioning in lectures of Econometrics. Econometrics is a theoretical-practical subject, but traditionally a large proportion of our students tend to focus on the practical and discard the theory, often skipping classes on theory and avoiding studying its content, probably motivated by its complexity. As a consequence, students’ marks obtained in the theoretical part of the exam are usually low. In this context, we put forward a change in our teaching methodology to include the use of Socrative, a freely available app, that allows students to answer teachers’ short, true/false, or multiple choice questions posed during each class using their smartphones (or other electronic devices with Internet connection). The objectives of this project are twofold: 1) to engage students and increase attendance at lectures; 2) to improve feedback on the learning process. The results of a survey of a sample of 186 students reveal that Socrative has been an effective tool for achieving these objectives.</p
... Finally, recent reviews present two different trends: the first one leaning into more complex and fully integrated learning environments (De Freitas, 2006), the second towards more simple approaches like puzzle based and simple game mechanics, allowing quick engagement, easily understood educational relevance, and short efficient duration of play (Abdul Jabbar and Felicia, 2015). In an extensive literature review, Kay and LeSage (2009) showed that quizzes and polls have proven to be an effective educational tool as students attended more classes, paid more attention, and were more engaged. Similarly, Raes et al. (2020) found that their students' intrinsic motivation was higher during quiz times than during other moments of the lecture. ...
Article
Purpose – This research examines how students’ learning was affected by the transition from a flipped classroom model to remote learning during the Covid19 lockdown and hybrid learning after return to class. Methodology – This study features quantitative analysis of undergraduate students’ online interactions with the course material over two semesters with the same instructor, one completely online, the other in the form of hybrid learning imposed by Covid19 restrictions. Participation in the game-based quizzes before class, in-class multiplayer game sessions, and presence in the classroom or online during the lectures were measured, as well as the students’ score to the final exam. Students also took a survey at the end of each semester to provide feedback on the course. Finally, interviews were made with some of the students to document their detailed impression of the learning challenges of the period. Findings – Results from this study identified four groups of students with different attitudes regarding learning and challenges during and after Covid19 restrictions. Some came back immediately in the physical class while others remained online or did both. Average scores correlate increased face-to-face time and regular online preparation before class with better results in the final exam. Qualitative data from students’ interviews confirm these changing attitudes to learning due to Covid19. Practical implication – This study allows us to reflect on best practices for hybrid learning and opportunities to improve on the flipped classroom model under changing modes of delivery. Furthermore, it underlines strategic importance to engage different profiles and challenges for students with less time and opportunity to engage in face-to-face learning. Interest – While research on flipped and hybrid classrooms is prevalent, little has been done in comparing the two models and their impact on students’ learning attitudes Furthermore, research on classroom adaption and adjustment during Covid19 is still at early stages. This study presents opportunities and challenges for the hybrid classroom moving forward.
... Other findings revealed that when online quizzes were administered to students and personalized feedback based upon their performance was provided, they performed significantly better compared to previous years without online quizzes (Pennebaker et al., 2013). Although quizzes are generally considered a useful educational strategy that leads students to engage during lectures (Dobbins & Denton, 2017; Kay & LeSage, 2009; Raes et al., 2020), it is likely that the benefits lie not in the quizzes themselves, but in the feedback based on students' answers. ...
Article
The role of individual feedback in (online) learning has been widely studied by researchers, but collective feedback based on quizzes and its impact on various academic outcomes has been overlooked to date. The aim of the present study was to compare the effects of displaying individual or collective feedback during an online course on students' attitudes (interest in the course), emotions (social comparison threat), and behaviors (engagement and academic performance). A two-hour online learning session was conducted among undergraduate students in psychology (N = 155) to examine the effects of collective or individual feedback using quizzes on academic outcomes. In a control condition, each student received individual feedback from the teacher privately on their own device indicating the correct/incorrect answers to each quiz question. In an experimental condition, collective feedback was given publicly to the whole online class by displaying the distribution of answers by response option for each quiz question via a videoconferencing system that shared the teacher's computer screen. Results showed that students perceived a greater threat of social comparison, and demonstrated lower behavioral engagement and academic performance, in the collective than in the individual feedback condition. The opposite effect was found for interest in the course, which was higher when collective feedback based on quizzes was displayed than when it was not. These findings suggest that collective feedback can be considered a "double-edged sword", generating both positive and negative impacts depending on the type of academic outcome taken into consideration.
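The contrast between the two feedback conditions is easy to make concrete. The following minimal Python sketch, not taken from the study itself, illustrates the two aggregation steps the abstract describes: per-student correct/incorrect results for private individual feedback, and a class-wide distribution of answers by response option for public collective feedback. All names and data are hypothetical.

from collections import Counter

# Hypothetical responses: each student's chosen option for one quiz question.
answers = {"s1": "A", "s2": "C", "s3": "A", "s4": "B", "s5": "A"}
correct_option = "A"

# Individual feedback: each student privately sees whether their own answer was correct.
individual = {student: choice == correct_option for student, choice in answers.items()}

# Collective feedback: the whole class sees how answers were distributed across options,
# e.g. displayed as a chart on the teacher's shared screen.
collective = Counter(answers.values())

print(individual)   # {'s1': True, 's2': False, 's3': True, 's4': False, 's5': True}
print(collective)   # Counter({'A': 3, 'C': 1, 'B': 1})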
... As a result of the research, it was revealed that the use of Mentimeter helps students focus their attention and facilitates learning. There are studies with similar findings (Kay & LeSage, 2009). At the same time, students learn about their mistakes by receiving immediate feedback on their answers. ...
Article
Full-text available
The use of audience response systems in English teaching increases students' interest and participation. Mentimeter is one such audience response system, allowing course evaluation activities to be prepared; it lets students convey their opinions via phones, tablets, laptops or personal computers. This study aimed to gather the opinions of 10th grade students about the use of Mentimeter in English class. The research found that the course taught with Mentimeter had positive effects on the students: it increased students' participation in the lesson, made them enjoy the lesson, motivated them, and helped them to focus. Mentimeter created a competitive environment, allowing students to have a pleasant time and to correct their mistakes by receiving feedback.
Article
Full-text available
Data is essential in decision-making, so it is important that citizens understand the value of open data. With the aim of innovating and promoting the use of open data, the artistic-technological project "Falla Inmaterial" is presented, through which scientific dissemination is carried out at mass festivals such as Las Fallas. This is an initiative aligned with citizen science that goes beyond the Fallas festivities and is conceived as a gamification tool for showing social applications of engineering and data science.
Chapter
The emergence of the COVID-19 pandemic in March 2020 disrupted learning and teaching and brought in-person teaching to an abrupt halt, leading to a rapid pivot towards online and technological solutions, including in Anatomy, where access to cadaveric specimens and labs was halted. But emerging from the pandemic, with the resumption of more in-person teaching, would these technological innovations find the metaphorical "Room 101", or could some of them successfully enhance the anatomy educators' toolkit? In this chapter, two such technological adaptations that we intend to continue to use in our institution are described, along with their development and the pedagogy underpinning their successful use. The first of these is "Live from the Lab", a livestreamed cadaveric demonstration session. Teaching Anatomy through demonstration is not a new concept; however, beaming demonstrations of cadaveric material to students in remote locations requires consideration of ethics and of the Code of Conduct for students and teachers alike, which is described here along with how to set up such a session. Next, the use of an online assessment platform for spotter-type Anatomy assessments is described along with its benefits. Both adaptations involve the online use of images and/or videos of cadaveric material and have proven to be pedagogically useful.
Chapter
Technology provides an excellent framework for integrating formative assessment into teaching sessions. It allows the learner to incorporate and synthesize information and gives important feedback to the teacher regarding student comprehension. Using technology allows the seamless incorporation of questions and serves as an opportunity to chunk material to improve student retention. A variety of technologies are available, and working with instructional designers may be useful when choosing and implementing a new technology, particularly in different learning environments. Keywords: Technology, Audience response system, Formative assessment, Feedback, Question design
Article
Full-text available
The transfer of conventional face-to-face teaching into the digital space usually requires an assessment of the learning objectives as well as the methods used, owing to the changed technical boundary conditions. In this context, the experiences of the transition to asynchronously provided audio-annotated PowerPoint presentations, and of a three-part lecture concept subsequently implemented on a trial basis, are reflected upon. This three-part concept combines 1) asynchronous lectures with annotated presentations, 2) synchronous lectures with "conventional" explanations of prepared presentation slides as web meetings, and 3) lectures according to the inverted classroom principle. The evaluation of these different lecture types by one and the same group of students showed a self-critical perception of the inverted classroom as a suitable element for independent, semester-long, in-depth engagement with the course content. Furthermore, the adaptation of an experimental lecture to the digital space demonstrated the importance of audience response systems in supporting students' perception of actively influencing the lecture and experiments.
Article
Full-text available
The objective of this chapter is to explain the huge, burgeoning sense of excitement surrounding response systems and, more generally, networked classrooms today, and why, for an idea apparently more than 40 years old, it took this long to happen. Beginning with a brief history of early response systems, it takes up the story from the author's own experience, leading through hardware barriers, misconceptions about pedagogy, and classroom successes, to summarize the variety of uses and how they lead to improved teaching and learning. It then discusses why this is such a potentially important area of study for improving education, and finally describes the emerging characteristics of, and rationale for, more powerful types of modern systems.
Article
Interactive lecturing involves an increased interchange between teachers, students and the lecture content. The use of interactive lectures can promote active learning, heighten attention and motivation, give feedback to the teacher and the student, and increase satisfaction for both. This article describes a number of interactive techniques that can be used in large group presentations, as well as general strategies that can promote interactivity during lectures.
Chapter
Eight years ago, the Department decided to embark upon a radical change to its first-year teaching. A core feature of that change was the introduction of "classroom feedback systems" in large, engineering science classes, starting with ClassTalk and then moving on to the Personal Response System. This chapter gives a brief history of the reasons for this change, which involved other, complementary teaching and learning strategies, our experiences, current developments, and a look to the future, in particular the way we would like to see the technology developing.
Chapter
This chapter presents an overview of audience response systems as tools for promoting learning in an interactive classroom. The comparison with a popular TV programme is used to introduce the role of questioning in face-to-face teaching, before examining the intrinsic pedagogical benefits of questions. Technical aspects of different systems are summarised, and pedagogical issues discussed, based on the author's personal teaching experience. Appropriate ways of using audience response systems in the classroom, and guidelines for question delivery are presented. New technology will continue to open up opportunities, but the challenge for the future remains to use the technology effectively and in pedagogically sound ways.
Chapter
This chapter describes my 25-year journey and experience with audience response systems (ARS), starting with my first realization of the potential of ARS while teaching at a university as an adjunct professor. A synopsis of the initial ARS experiment conducted in the mid-1980s at IBM's Management Development Center serves as a baseline. The conclusions from this study justified the use of keypads in the classroom at IBM and, after publication, set the stage for the growth of the ARS industry. The ARS activities pursued after retiring from IBM in 1988 are described, including the advances that my companies made in software, graphics, and keypad technology, which we incorporated into our products. Finally, the chapter offers 10 recommendations for higher quality questions developed by ARS users. I conclude that these recommendations are critical prerequisites to the continued growth of the ARS industry in academia.
Chapter
This chapter reports the authors' use and refinement of a wireless audience response system (ARS) over a 10-year period. The motivation for this effort was to replace the traditional passive lecture with a more interactive classroom. Our classroom procedures and the evolution of these procedures are detailed. The authors also illustrate how ARSs can be applied in a variety of ways, including the use of modified multiple choice questions. ARSs allow for both formative and summative assessment in real time, which is a unique and desirable property. The use of systems similar to the ones described has increased rapidly in the past few years. A brief survey of current and proposed commercial wireless keypad systems is included.
Article
Recent increases in class size in higher education have focused more attention on the nature of the face-to-face learning experience. This chapter examines how a keypad technology facilitates active learning in the lecture hall using a number of pedagogically proven approaches. We survey 219 first-year business studies students tackling introductory economics, and find that the technology enhances learning in lectures because, among other things, it improves concentration, provides instantaneous and more effective student feedback, and allows students to compare how well they fare relative to their peers. Interestingly, we find less statistical support for the benefits of using the technology to allow students to respond anonymously, and we explore some reasons for this result. Finally, we demonstrate our use of the tool to teach the Prisoner's Dilemma game. This forms part of the emerging knowledge on how to teach classroom experiments using keypad technology.
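The Prisoner's Dilemma exercise mentioned at the end lends itself to a compact illustration. As a minimal sketch (assuming the standard textbook payoff values, which the abstract does not specify), the following Python snippet encodes the payoff matrix and verifies the property that makes the game worth polling a lecture hall about: defection is each player's best response regardless of what the other does, even though mutual cooperation pays more than mutual defection.

# Standard Prisoner's Dilemma payoffs (row player, column player);
# C = cooperate, D = defect. Values are the usual textbook choices, not the authors'.
payoffs = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

def best_response(opponent_action):
    # Return the row player's payoff-maximizing action against a fixed opponent action.
    return max(("C", "D"), key=lambda a: payoffs[(a, opponent_action)][0])

# Defection dominates: it is the best response whatever the opponent plays,
# yet mutual cooperation (3, 3) beats mutual defection (1, 1).
assert best_response("C") == "D"
assert best_response("D") == "D"

A keypad or ARS session can then ask each student to choose C or D, display the resulting class distribution, and compare the outcome with this dominant-strategy prediction.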
Article
This chapter provides practical advice on the evaluation of electronic voting systems (EVSs), particularly in relation to two evaluation methods. It begins by considering two potential educational advantages of using EVSs in large-group lectures in higher education. Four evaluation questions that are commonly asked by lecturers who use EVSs are linked to these two pedagogical advantages. The main body of the chapter focuses on two methods, observation and audit trails, and shows how these can be used to innovatively evaluate the use of EVSs. The development of an observational coding schema is described, and a case study of its use in two learning contexts is presented. Practical and technical issues associated with the use of audit trails are then discussed before a second case study is presented. The two case studies presented in this chapter draw extensively on data collected in evaluations of EVS implementations at the University of Glasgow.