Examining gender differences in attitudes toward interactive classroom
communications systems (ICCS)
Robin H. Kay *
University of Ontario Institute of Technology, Faculty of Education, 2000 Simcoe St. North, Oshawa, Ontario, Canada L1H 7L7
article info
Article history:
Received 27 June 2008
Received in revised form 22 November 2008
Accepted 24 November 2008
Keywords: Evaluate, Assess, Quality, Scale, Gender, Self-efficacy, Confidence, Individual differences, Secondary school, High school, Audience response systems, Interactive classroom communication systems
Abstract

An interactive classroom communication system (ICCS) involves the use of remote devices that permit all students in a class to respond to multiple choice questions displayed on an LCD projector. After responses are clicked in, the results are instantly aggregated and displayed in chart form. The purpose of this study was to examine gender differences in attitudes toward ICCSs for 659 secondary school students. The initial results suggested that male students had significantly more positive attitudes than female students with respect to student involvement, assessment, and perceived learning. However, a number of these differences disappeared when computer comfort level and type of use were added as covariates. Male students still perceived that ICCSs improved the overall learning process more than female students, regardless of computer comfort level or type of use.

© 2008 Elsevier Ltd. All rights reserved.
1. Overview
An interactive classroom communication system (ICCS), also known as an audience response system (Caldwell, 2007) or clickers (Bergtrom, 2006), allows students to respond to multiple choice questions using a remote control device. After students click in their responses, the results are automatically collected and displayed in chart form, usually a histogram. Responses are often anonymous but can be linked to specific students for evaluation purposes. The principal advantage of using an ICCS is that it gives feedback to both students and instructors on how well the entire class understands the concepts being presented. Once this feedback is attained, a teacher can alter the course of instruction accordingly, or students can work out misconceptions and difficulties through peer or classroom discussion.
Considerable research has explored gender differences in computer-related behaviour (see AAUW (2000), Barker and Aspray (2006), Kay (2008), and Whitley (1997) for detailed reviews), where persistent but small differences in attitude, ability, and use have been observed, usually in favour of males. To date, though, no research has been done on gender differences in the use of ICCSs. Instead, researchers of ICCSs have focussed on potential benefits, challenges, and strategies (Caldwell, 2007; Fies & Marshall, 2006; Kay, in press; Simpson & Oliver, 2007). While the use of ICCSs started in 1966 (Abrahamson, 2006; Judson & Sawada, 2002), mainstream integration of this tool into secondary schools and higher education is a relatively new phenomenon, beginning in 2003. It is prudent to investigate gender differences in attitudes toward ICCSs early in order to address negative consequences and educational bias. The purpose of this paper, then, was to examine gender differences in attitudes toward ICCSs in secondary school classrooms.
0360-1315/$ - see front matter © 2008 Elsevier Ltd. All rights reserved.
doi:10.1016/j.compedu.2008.11.015
*Tel.: +1 905 721 8668x2679.
E-mail address: robin.kay@uoit.ca
Computers & Education 52 (2009) 730–740
1.1. Fundamental difference between ICCS and traditional classrooms – the feedback cycle
In a traditional classroom, feedback can be acquired by multiple means, including a show of hands, asking volunteers to share answers, the use of small individual whiteboards or tablets to display answers, or using coloured cards to represent multiple choice responses (Abrahamson, 2006; Draper, Cargill, & Cutts, 2002; McCabe, 2006; Pelton & Pelton, 2006). However, these methods have notable disadvantages. A show of hands, for example, is limited because it is difficult to obtain a quick, accurate sense of class understanding, particularly in a large lecture. Furthermore, some students are inclined to copy the responses of others. In addition, when hands are lowered, the data are lost (Abrahamson, 2006; Pelton & Pelton, 2006). Relying on volunteers is also somewhat restrictive because, typically, only the very brave or confident raise their hands (Burton, 2006; Slain, Abate, Hodges, Stamatakis, & Wolak, 2004). Note also that with a show of hands or a call for volunteers, anonymity is lost. Whiteboards and coloured cards are more anonymous, but amalgamating responses is a relatively slow process. In contrast to traditional lectures, an ICCS-based classroom has several key advantages: all students participate and respond to class questions, student commitment to a response tends to increase interest in the concepts being discussed (Abrahamson, 2006; Beatty, 2004; Pradhan, Sparano, & Ananth, 2005), and answers from the entire class are instantly displayed so that misconceptions can be quickly identified and discussed.
1.2. Benefits to using ICCS
Student reactions to the use of ICCSs in higher education have been universally positive (Caldwell, 2007; Fies & Marshall, 2006; Judson & Sawada, 2002; Simpson & Oliver, 2007). Researchers have also revealed more specific areas of benefit, including student involvement, assessment, and learning. With respect to student involvement, there is considerable data to suggest that students using an ICCS are more engaged in the concepts covered (e.g., Preszler, Dawe, Shuster, & Shuster, 2007; Siau, Sheng, & Nah, 2006; Simpson & Oliver, 2007), participate more (e.g., Draper & Brown, 2004; Greer & Heaney, 2004; Stuart, Brown, & Draper, 2004), pay more attention in class (e.g., Jackson, Ganger, Bridge, & Ginsburg, 2005; Latessa & Mouw, 2005), and are more involved in class discussions (e.g., Beatty, 2004; Nicol & Boyle, 2003). Regarding assessment, an ICCS helps improve the feedback cycle by ensuring anonymity, collecting and summarizing responses from all students in larger classes very quickly, and limiting the copying of answers (Abrahamson, 2006; Beatty, 2004; Pradhan et al., 2005). In addition, the regular use of an ICCS can offer feedback to both instructors and students as to how well concepts are being understood (e.g., Bergtrom, 2006; Bullock et al., 2002; Simpson & Oliver, 2007). With respect to learning, numerous studies have reported that students feel they learn more when an ICCS is used in higher education classrooms (e.g., Greer & Heaney, 2004; Nicol & Boyle, 2003; Pradhan et al., 2005; Preszler et al., 2007; Uhari, Renko, & Soini, 2003). Furthermore, many experimental studies have found that ICCS-based classes score significantly higher on tests and examinations than classes exposed to traditional lecture formats (Crouch & Mazur, 2001; El-Rady, 2006; Kaleta & Joosten, 2007; Kennedy & Cutts, 2005).
1.3. Challenges associated with using ICCS
Two main challenges have been associated with the use of an ICCS: technology and adjusting to a new method of learning. With respect to technology, on occasion, signals from some remote devices do not register on the teacher's computer, a particularly stressful experience when students are being evaluated for grades (El-Rady, 2006; Sharma, Khachan, Chan, & O'Byrne, 2005; Siau et al., 2006). Regarding new methods of learning, some students react adversely to the use of an ICCS because the overall approach to learning changes. They are accustomed to lectures, and a switch of methods leads to stress, frustration, and resistance at first (Beatty, 2004; Fagan, Crouch, & Mazur, 2002). Other students are distracted by the use of an ICCS (Siau et al., 2006). Still others doubt their ability to direct their own learning (Allen & Tanner, 2005).
1.4. Gender and the use of ICCS
1.4.1. Overall impact
Numerous studies have investigated the role of gender in computer behaviour (see AAUW (2000), Barker and Aspray (2006), Kay (2008), and Whitley (1997) for detailed reviews), and the following conclusions have been made. First, most studies have looked at computer attitude, ability, and/or use. Second, roughly 30–50% of the studies report differences in favour of males, 10–15% in favour of females, and 40–60% no difference. Third, the differences reported, while statistically significant, are often small. Overall, one could say there is a persistent pattern of small differences in computer attitude, ability, and use that favours males (Kay, 2008). To date, no research has been done investigating gender differences in attitudes toward ICCSs.
1.4.2. Perceived comfort level
According to Kay (2008), comfort level with technology is one critical determinant of gender differences in computer-related behaviours. Perceived comfort level with using computers, also known as self-efficacy, has been shown to be particularly influential on computer ability and use (e.g., Barbeite & Weiss, 2004; Durndell & Haag, 2002; Shapka & Ferrari, 2003; Solvberg, 2002). Research on computer self-efficacy and ICCSs has not been done; however, many studies have reported that students find ICCSs easy to learn and use (Hinde & Hunt, 2006; Pelton & Pelton, 2006; Pradhan et al., 2005; Sharma et al., 2005). Therefore, even though self-efficacy has been a prominent factor influencing behaviours involving more advanced computer technologies, it is speculated that it will not play a major role in influencing attitudes toward simple, easy-to-use ICCS equipment.
1.4.3. Type of computer use
With respect to computer use, there is some evidence to suggest that gender differences are partially dependent on how computers are being used (AAUW, 2000; Barker & Aspray, 2006; Kay, 2008; Whitley, 1997). For example, gender differences are far more pronounced in favour of males when computers are used for entertainment as opposed to educational applications (Barker & Aspray, 2006; Kay, 2008). It is conceivable that potential gender disparities in the use of ICCSs may be partially dependent on how these tools are used in classrooms.
A number of researchers have argued that the type of educational strategy used with an ICCS can have a fundamental influence on acceptance and overall success (Reay, Bao, Li, Warnakulasooriya, & Baugh, 2005; Simpson & Oliver, 2007; Stuart et al., 2004). For example, there is data to suggest that higher education students do not like using an ICCS for participation grades or summative assessment (Caldwell, 2007), but that they react well to formative assessment (e.g., Bergtrom, 2006; Dufresne & Gerace, 2004; Jackson et al., 2005). If gender differences do exist, it is important to examine the influence of context, or how an ICCS is used.
1.5. Summary and purpose of study
Widespread use of ICCSs is a relatively recent phenomenon, particularly in secondary schools. Previous research in higher education has
focussed on the benefits and challenges of using ICCSs. No research has been done examining gender differences in attitudes toward ICCSs
and only one study could be found looking at secondary school classrooms (Penuel, Boscardin, Masyn, & Crawford, 2007). The purpose of
the current study was to investigate gender differences in secondary school students’ attitudes toward ICCSs as a function of computer
comfort level and type of use.
2. Method
2.1. Sample
2.1.1. Students
The student sample consisted of 659 students (327 males, 327 females, 5 missing data), enrolled in grades 9 (n = 71), 10 (n = 233), 11 (n = 149), and 12 (n = 206). Subject areas where ICCSs were used included business, computer technology, social science, science, and math. Eighty-seven percent (n = 572) of the students claimed that they were comfortable or very comfortable with technology. Sample population data were collected from 23 different classrooms. The students were selected through convenience sampling and had to obtain signed parental permission to participate.
2.1.2. Teachers
The teacher sample consisted of 23 teachers (16 males, 7 females) with 1–26 years of teaching experience (M = 15.9, SD = 7.9). Almost all teachers reported that they were comfortable or very comfortable with technology (n = 22, 96%).
3. Procedure
Teachers were emailed by an educational coordinator and informed of the ICCS study. Participation was voluntary, and a subject could withdraw from the study at any time. Each teacher received two half-days of training, in November and February, on how to use the ICCS software and possible strategies for using an ICCS in the classroom. They were then asked to use an ICCS in their classrooms, although how often the ICCS was used was up to the individual teachers. In pairs, teachers from each high school shared a laptop computer, an LCD projector, and one ICCS system from E-Instruction. All students in a given teacher's classroom participated in ICCS-based lessons. However, only those students with signed parental permission forms were permitted to fill in an anonymous, online survey about their use of the ICCS. All teachers used the ICCS for a three-month period; however, data collected for this study focussed on the last month. During the final month in which ICCSs were used, 94% (n = 617) of secondary students reported using an ICCS one to two times. Only 6% (n = 41) of the students used an ICCS once a week.
3.1. Data sources
3.1.1. Survey – attitudes
Based on the final month in which an ICCS was used, students completed the ICCS attitude survey for students (see Appendix A). This survey consisted of nine seven-point Likert-scale items. Items were constructed based on a review of the ICCS literature (Caldwell, 2007; Fies & Marshall, 2006; Kay, in press; Simpson & Oliver, 2007) and focussed on general attitudes, student involvement, assessment, and learning. The internal reliability for the total nine-item scale was 0.89.
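The paper reports internal reliability without naming the coefficient; for multi-item Likert scales this is conventionally Cronbach's alpha. As a hedged illustration only (not the author's actual computation), alpha can be derived from a respondents-by-items score matrix:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Two perfectly correlated items give the maximum alpha of 1.0
scores = np.array([[1, 1], [2, 2], [3, 3], [4, 4]])
print(round(cronbach_alpha(scores), 2))  # 1.0
```

Real item responses are only partially correlated, which is why observed values such as the 0.89 reported here fall below 1.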
3.1.2. Survey – Computer comfort level
Student computer comfort level was assessed using a scale developed by Kay and Knaack (2005) which showed good construct validity
and high reliability. The internal reliability for the scale used in this study was 0.81.
3.1.3. Survey – type of use
Three types of ICCS use were examined, involving two categories of assessment: formative and summative. Formative assessment referred to the use of an ICCS to assess student understanding of key concepts, whereas summative assessment referred to using an ICCS to formally evaluate student knowledge for grades. The three categories of ICCS use, then, were formative (n = 398), mixed (n = 110; formative and summative), and summative (n = 103).
3.1.4. Student comments – attitudes
Students were asked "What was the impact of clickers on your learning in the past month?" A coding scheme was developed to categorize 714 student comments (see Appendix B). Note that some students made more than one comment when they filled in their survey, while other students offered no response. Each comment was then rated on a five-point Likert scale (−2 = very negative, −1 = negative, 0 = neutral, 1 = positive, 2 = very positive). Two raters assessed all comments made by students based on category and rating value. After
round one, inter-rater reliability was 83% for categories and 93% for ratings. Comments where categories or ratings were not exactly the same were shared and reviewed a second time by each rater. After round two, an inter-rater reliability of 98% was reached for categories and 99% for the rating values.
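The inter-rater figures above are simple percent agreement. A minimal sketch (the category labels below are invented for illustration and are not the study's coding scheme):

```python
def percent_agreement(rater1, rater2):
    """Share of items (as a percentage) on which two raters gave identical codes."""
    matches = sum(a == b for a, b in zip(rater1, rater2))
    return 100.0 * matches / len(rater1)

# Hypothetical codes for six comments from two raters; they disagree on one
r1 = ["learning", "stress", "engagement", "stress", "feedback", "memory"]
r2 = ["learning", "stress", "engagement", "attention", "feedback", "memory"]
print(round(percent_agreement(r1, r2), 1))  # 83.3
```

Disputed items are then re-examined, which is how round-two agreement rises toward 98–99%.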
3.2. Key research questions
In order to examine gender differences in student attitudes toward the use of ICCSs in secondary school classrooms, the following questions were addressed:
1. How do male and female high school students differ in their attitudes toward ICCSs?
2. How does computer comfort level (self-efficacy) influence gender differences in attitude toward ICCSs?
3. How does type of use influence gender differences in attitude toward ICCSs?
4. Results
4.1. Gender differences – overall
4.1.1. Survey results
With respect to total ICCS attitude score, male secondary school students (M = 46.1, SD = 9.5) had significantly more positive attitudes toward ICCSs than female secondary school students (M = 42.5, SD = 10.4) (t = 44.93, df = 615, p < .001). The effect size of 0.36 is considered to be in the medium range by Cohen (1988). Since overall attitudes toward ICCSs were significantly different, a MANOVA was run to compare male and female students on each of the nine Likert-scale survey items examining attitudes toward using ICCSs (Appendix A). Hotelling's T was significant (p < .001), so individual comparisons were done on each survey question. Male and female students differed significantly on all but one of the items from the ICCS attitude survey (Table 1). Specifically, male students were more motivated and engaged when using ICCSs, participated more in ICCS-based classrooms, liked using ICCSs to test their knowledge, especially in summative evaluation, thought ICCSs generated more class discussion, felt ICCSs helped improve their learning, and, overall, thought ICCS-based classes were better. The only item where male and female students did not significantly differ was "liking to see other student answers".
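The reported effect size of 0.36 corresponds to Cohen's d computed from the group means and standard deviations. A sketch using the values above (the pooled SD is simplified to the equal-n form, which is reasonable here with 327 students per group):

```python
import math

def cohens_d(m1, sd1, m2, sd2):
    """Cohen's d using an equal-n pooled standard deviation."""
    pooled_sd = math.sqrt((sd1**2 + sd2**2) / 2)
    return (m1 - m2) / pooled_sd

# Male vs. female total ICCS attitude scores from the study
print(round(cohens_d(46.1, 9.5, 42.5, 10.4), 2))  # 0.36
```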
4.1.2. Student comments
Multiple t-tests were run comparing comment ratings for student involvement, assessment, learning, and challenges experienced when an ICCS was used (see Appendix B for comment categories). A probability level of p < .005 was used to compensate for the number of t-tests done (see Kirk (1982), p. 102). With respect to student involvement and assessment associated with ICCS use, male and female student attitudes were not significantly different. However, male students had significantly more positive attitudes toward the impact of ICCSs on the learning process and learning performance. Female students, on the other hand, were significantly more stressed than male students about using an ICCS and reacted more negatively to ICCS technology problems (Table 2). It is important to emphasize that the differences observed for stress (n = 27) and technology problems (n = 24) were based on a small portion of the total comments (n = 714).
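Lowering the per-test threshold in this way amounts to dividing the familywise alpha across the tests performed, in the Bonferroni style; the exact count used to arrive at .005 is an assumption here, not stated in the text:

```python
def adjusted_alpha(family_alpha, n_tests):
    """Bonferroni-style per-test significance level."""
    return family_alpha / n_tests

# Ten comparisons at a familywise level of .05 yield the .005 threshold used above
print(adjusted_alpha(0.05, 10))
```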
It is worth looking at the content of specific comments in the four areas where gender differences were observed when ICCS technology was used: learning process (n = 104 comments), learning performance (n = 77 comments), stress (n = 27 comments), and technology problems (n = 24 comments). Gender differences in the learning process were a matter of degree as opposed to substance. A small group of female students (n = 4) felt that using an ICCS had a negative impact on their learning.
"Clickers made me lose confidence in situation where immediate feedback was given."
"It was confusing at times."
Table 1
Gender differences in attitudes toward ICCSs based on survey results.

Measure                                                                  Males M (SD)   Females M (SD)   F
Overall Attitude
  When an ICCS was used, the class was better                            5.01 (1.43)    4.44 (1.51)      23.07*
Student Involvement
  I was more engaged in the lesson when an ICCS was used                 5.50 (1.35)    5.18 (1.56)       7.21**
  I was more motivated when an ICCS was used                             5.46 (1.43)    5.01 (1.56)      14.18*
  I participated more than I normally would when an ICCS was used        5.29 (1.51)    4.97 (1.60)       6.53***
  Using an ICCS generated more class discussion                          4.82 (1.71)    4.36 (1.48)      14.95*
Assessment
  Using an ICCS was a good way to test my knowledge                      5.55 (1.33)    5.30 (1.49)       4.72***
  I liked seeing what other students in the class selected for answers   4.95 (1.52)    4.72 (1.49)       3.62
  I liked using an ICCS for tests                                        5.13 (1.73)    4.41 (1.86)      24.68*
Learning
  I learned more when an ICCS was used                                   4.60 (1.44)    4.17 (1.51)      13.11*

* p < .001. ** p < .005. *** p < .05.
"Took more time to learn to use them than time spent learning."

This negative reaction was not present for males. On the other hand, there was a sizeable group of male students (n = 14) who felt using an ICCS was a very positive way to learn.

"They are fun and made it more interesting to learn. I felt more involved in the class discussion. It was interesting using this object in a testing format."
"I learned that these clickers made a high impact on my learning by teaching me all different things I didn't know."
"It made learning easier and more fun because there was less writing and we got our results instantly."
This enthusiastic reaction was not observed often for female students. Overall, the majority of comments made by male and female students about the learning process and the use of ICCSs were neutral or somewhat positive.

Regarding learning performance and ICCSs, typical male student comments were neutral or slightly positive.

"Nothing really changed. Basically, I think they're fine. The clickers aren't good or bad, they just make it easier for the teacher to test the students."
"I found it easier to use the clickers than to do it [tests] on paper."
"I found that multiple choice questions for a test went smoother and faster and I liked the way we could know our results almost instantly."
Female reaction to learning performance was markedly different from that of males and focussed on the use of an ICCS for summative assessment. Many comments (n = 30) reflected a negative reaction to the use of ICCSs for formal tests.

"I didn't like how we couldn't go back and change the answers."
"It helped that one time where each answer that was answered mostly wrong was explained, but it hindered my performance on the test."
"If anything, I didn't do as well on units test - I will be [not] able to do as well on exams as I could have because I don't have the questions in front of me to help me study. Therefore, I soon forgot all the questions from the quiz/test."
"I don't like using clickers, I don't like the time limit and would much rather answer questions on paper within a given amount of time to do the full tests. I do not enjoy the clickers at all & using the clickers has had no impact at all on my learning."
"The clickers seem to create more pressure to answer correctly which often led to answering incorrectly. The clickers were fun - This technology for M/C questions is innovative but I'm not sure they're good for tests and quizzes."
Regarding stress, a small but notable group of female students (n = 23) was very anxious about using ICCSs. The source of stress and frustration appeared to be connected to the use of ICCSs for summative assessment.

"While using the clickers for tests, it took extra time that was needed to complete it entirely. I found it to be inconvenient and frustrating."
"The impact of clickers on my learning in the past month was nerve wracking."
"I realized that I cannot work well under the pressure when the clickers were used."
"Although the clickers seem to control a test situation, they become stressful to use on a time constraint during tests."
Table 2
Gender differences in attitudes toward ICCSs based on ratings of student comments.

Category                                Males M (SD)    Females M (SD)    t
Student Involvement
  Engagement (n = 110)                  1.16 (0.37)     1.31 (0.47)       1.81
  Participation (n = 49)                1.00 (0.00)     1.07 (0.47)       0.73
  Attention (n = 28)                    0.71 (0.76)     1.00 (0.00)       1.80
Assessment
  Formative assessment (n = 53)         1.00 (0.31)     0.90 (0.47)       0.84
  Compare with other students (n = 16)  0.80 (0.42)     1.09 (0.00)       1.15
  Feedback (n = 14)                     0.50 (1.00)     1.00 (0.00)       1.69
Learning
  Learning process (n = 104)            1.14 (0.72)     0.58 (0.74)       3.86*
  Review concepts (n = 53)              1.29 (0.59)     1.06 (0.33)       1.89
  Memory (n = 21)                       0.88 (0.35)     0.85 (0.90)       0.09
  Learning performance (n = 77)         0.14 (1.16)     −0.60 (1.05)      3.00**
Challenges
  Technology issues (n = 24)            0.27 (0.91)     −1.08 (0.76)      3.98**
  Stress (n = 27)                       0.50 (1.00)     −1.35 (0.71)      4.52*
  Different methods used (n = 45)       0.82 (0.81)     0.18 (1.09)       2.11

* p < .001. ** p < .005.
Finally, there were clear differences when looking at the small subset of male and female students who commented about the ICCS technology. All comments made by female students were negative, whereas most comments by male students were either neutral or somewhat positive. Typical female comments about ICCS technology were as follows:

"They are too difficult to use and it was difficult to tell when the correct answer was chosen."
"If we did not have to point them at the little device in the ceiling, maybe it wouldn't be so bad."
"It was sometimes frustrating to aim the clicker accurately."
Male student comments about ICCS technology were more positive:

"It's fun to push the buttons."
"I enjoy the advancement of technology."
"I thought that it was a fun way to mark tests because it was new technology."
4.2. Gender and computer comfort (self-efficacy)
Male students had significantly higher computer comfort scores (M = 3.47, SD = 0.67) than female students (M = 3.05, SD = 0.70; t = 7.83, df = 650, p < .001). The effect size of 0.61 is considered to be in the moderate to large range by Cohen (1988). It is therefore reasonable to compare male and female attitudes toward ICCSs using computer comfort as a covariate.
A MANOVA was run to compare male and female students on the nine Likert-scale survey items examining attitudes toward using an ICCS (Appendix A), with computer comfort as a covariate. Hotelling's T was significant (p < .001), so individual comparisons were done on each survey question. Male students still had significantly more positive attitudes than female students about ICCSs making the overall class better, using an ICCS to generate discussion, using ICCSs for tests, and perceiving that they learned more when ICCSs were used. However, male and female students no longer differed with respect to attitudes about engagement, motivation, participation, and ICCSs being a good way to test knowledge (Table 3).
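The covariate logic here can be illustrated outside a full MANOVA: fit a linear model with the covariate and a gender indicator, and read the adjusted gender gap off the indicator's coefficient. This is a toy sketch with invented numbers, not the study's data or its actual analysis:

```python
import numpy as np

def adjusted_gender_gap(attitude, comfort, is_male):
    """Gender coefficient from attitude ~ intercept + comfort + male (ANCOVA-style)."""
    X = np.column_stack([np.ones(len(attitude)), comfort, is_male.astype(float)])
    beta, *_ = np.linalg.lstsq(X, attitude, rcond=None)
    return beta[2]  # gender gap after controlling for comfort

# Toy data: attitude tracks comfort exactly, and the males happen to be more
# comfortable, so the raw gender gap vanishes once comfort is controlled.
comfort = np.array([3.0, 4.0, 1.0, 2.0])
attitude = 2 * comfort
is_male = np.array([True, True, False, False])
print(round(adjusted_gender_gap(attitude, comfort, is_male), 6))
```

This mirrors the pattern reported above: several raw gender differences shrink to non-significance once comfort is held constant.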
An analysis of student comments was done using multiple ANOVAs, with computer comfort level as a covariate and a probability level of p < .005 to compensate for the number of tests done (see Kirk (1982), p. 102). Male students (M = 1.08, SE = 0.11) still had significantly more positive comments than female students (M = 0.63, SE = 0.09) about the learning process when an ICCS was used (p < .005). However, male perceptions that learning performance improved with the use of an ICCS (M = 0.07, SE = 0.20) were not significantly different from female perceptions (M = −0.55, SE = 0.19) once computer comfort was included as a covariate. Female students still reported significantly higher stress levels (M = −1.32, SE = 0.16) than male students (M = 0.36, SE = 0.39) with respect to using ICCSs, even when computer comfort was added as a covariate (p < .005), but there were no gender differences observed for attitudes toward ICCS technology. All other comment categories listed in Appendix B showed no significant gender differences.
4.3. Gender and type of use
A chi-square analysis revealed significant differences between male and female students (χ²(2, N = 649) = 10.60, p < .01) with respect to the "type of use" experienced in the classroom. Male students experienced formative use of ICCSs more than female students (71% for males vs. 62% for females), and female students experienced summative use more than male students (21% for females vs. 11% for males). It is therefore reasonable to analyze gender differences in attitudes toward ICCSs with "type of use" added as a covariate.
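The chi-square statistic compares observed cell counts with those expected if gender and type of use were independent. A self-contained sketch (the counts below are invented for illustration, not the study's contingency table):

```python
def chi_square_stat(table):
    """Pearson chi-square statistic for an r x c contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed - expected) ** 2 / expected
    return stat

# Perfectly balanced 2 x 2 table: no association, so the statistic is 0
print(chi_square_stat([[10, 10], [10, 10]]))  # 0.0
```

The statistic is then compared against the chi-square distribution with (r − 1)(c − 1) degrees of freedom, here 2 for a 2 × 3 gender-by-use table.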
A MANOVA was run to compare male and female students on the nine Likert-scale survey items examining attitudes toward using ICCSs (Appendix A), with "type of use" included as a covariate. Hotelling's T was significant (p < .001), so individual comparisons were done on each survey question. Male students were still more positive than female students about the overall class being better with an ICCS, being more motivated, generating more discussion, liking to use an ICCS for formal tests, and believing that they learned more when an ICCS was used. However, male students were no longer significantly more positive about the engagement, participation, and formative assessment value of an ICCS (Table 4).
An analysis of student comments was done using multiple ANOVAs, with "type of use" as a covariate and a probability level of p < .005 to compensate for the number of tests done (see Kirk (1982), p. 102). Male students (M = 1.09, SE = 0.10) still had significantly more positive comments than female students (M = 0.62, SE = 0.08) about the quality of the learning process when an ICCS was used (p < .001). However, male perceptions that learning performance improved with ICCS use (M = 0.10, SE = 0.17) were not significantly different from female perceptions (M = −0.57, SE = 0.16) once "type of use" was added as a covariate. Female students still reported significantly higher stress levels (M = −1.30, SE = 0.15) than male students (M = 0.21, SE = 0.39) with respect to the use of ICCSs, even when "type of use" was added as a covariate (p < .005). In addition, differences between male (M = 0.21, SE = 0.27) and female (M = −1.02, SE = 0.24) student attitudes toward ICCS technology persisted with "type of use" as a covariate (p < .005). All other comment categories listed in Appendix B showed no significant gender differences.
4.4. Gender, computer comfort and type of use
The combined impact of computer comfort and type of use was examined using a MANOVA to compare male and female students on the nine Likert-scale survey items used to assess attitudes toward using ICCSs (Appendix A). Hotelling's T was significant (p < .001), so individual comparisons were done on each survey question. Significant gender differences were still observed with respect to overall attitude about the class use of ICCSs (p < .01), generating more class discussion with ICCSs (p < .05), and liking to use an ICCS for tests (p < .01). However, male students were no longer significantly more positive than female students about perceptions of how much was learned when an ICCS was used. All other items showed no significant differences between male and female students.
An analysis of student comments was done using multiple ANOVAs, with computer comfort and "type of use" as covariates and a probability level of p < .005 to compensate for the number of tests done (see Kirk (1982), p. 102). Male students (M = 1.05, SE = 0.10) still had significantly more positive comments than female students (M = 0.64, SE = 0.08) about the learning process experienced when an ICCS was used (p < .005). However, male perceptions that learning performance improved with ICCS use were not significantly different from female perceptions once computer comfort and "type of use" were included as covariates. In addition, female and male students no longer differed with respect to the stress level associated with ICCS use and reactions to ICCS technology. All other comment categories listed in Appendix B showed no significant gender differences.
5. Discussion
The purpose of this study was to examine gender differences in attitudes toward ICCSs in the context of computer comfort level and type
of ICCS use.
5.1. Computer comfort level
A simple analysis of gender differences revealed that male secondary students were significantly more positive than females on eight of the nine ICCS survey items. Male students were more involved, more receptive to both formative and summative assessment, and felt they learned more when an ICCS was used. However, male students in this study were significantly more comfortable with computers than their female counterparts. Differences in computer comfort levels appeared to play a direct role in attenuating attitudes toward ICCSs for female students. When gender comparisons were made with computer comfort level as a covariate, many of the differences between male and female students disappeared (Table 3). Male perceptions of overall learning (items 1 and 9 in Appendix A) were still significantly more positive than those of female students, but attitudes about student involvement and formative assessment showed no differences. As noted in earlier research (e.g., Barbeite & Weiss, 2004; Durndell & Haag, 2002; Shapka & Ferrari, 2003; Solvberg, 2002), self-efficacy or computer comfort can play a significant role in modifying gender differences in computer-related behaviour.

Table 3
Gender differences in attitudes toward ICCSs with computer comfort as a covariate.

Measure                                                                  Males             Females           F
                                                                         M(a)     SE       M(a)     SE
Overall attitude
  When ICCS was used, the class was better                               4.92     0.08     4.53     0.08     9.77*
Student involvement
  I was more engaged in the lesson when ICCS was used                    5.37     0.08     5.31     0.08     0.21
  I was more motivated when ICCS was used                                5.33     0.08     5.14     0.08     2.46
  I participated more than I normally would when ICCS was used           5.17     0.09     5.09     0.09     0.38
  Using ICCS generated more class discussion                             4.74     0.09     4.44     0.09     6.30**
Assessment
  Using ICCS was a good way to test my knowledge                         5.45     0.08     5.41     0.08     0.14
  I liked seeing what other students in the class selected for answers   4.92     0.09     4.77     0.09     1.40
  I liked using ICCS for tests                                           5.01     0.10     4.54     0.10     10.12*
Learning
  I learned more when ICCS was used                                      4.52     0.09     4.25     0.08     4.62**

* p < .005; ** p < .05.
(a) Estimated mean with computer comfort as a covariate.

Table 4
Gender differences in attitudes toward ICCSs with type of use as a covariate.

Measure                                                                  Males             Females           F
                                                                         M(a)     SE       M(a)     SE
Overall attitude
  When ICCS was used, the class was better                               4.98     0.08     4.49     0.08     17.74*
Student involvement
  I was more engaged in the lesson when ICCS was used                    5.44     0.08     5.24     0.08     3.12
  I was more motivated when ICCS was used                                5.40     0.08     5.07     0.08     8.30**
  I participated more than I normally would when ICCS was used           5.24     0.09     5.02     0.09     3.05
  Using ICCS generated more class discussion                             4.79     0.08     4.41     0.08     10.39**
Assessment
  Using ICCS was a good way to test my knowledge                         5.50     0.08     5.36     0.08     1.61
  I liked seeing what other students in the class selected for answers   4.92     0.09     4.75     0.09     1.83
  I liked using ICCS for tests                                           5.07     0.10     4.47     0.10     17.57*
Learning
  I learned more when ICCS was used                                      4.55     0.09     4.22     0.08     8.15**

* p < .001; ** p < .005.
(a) Estimated mean with type of use as a covariate.
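The covariate-adjusted ("estimated") means reported in Tables 3 and 4 can be sketched with ordinary least squares: regress the outcome on a group indicator plus the covariate, then evaluate the fitted model for each group at the covariate's grand mean. The data, effect sizes, and variable names below are synthetic illustrations, not the study's.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
group = rng.integers(0, 2, n)                  # 0 = female, 1 = male (synthetic)
comfort = rng.normal(0, 1, n) + 0.5 * group    # males more comfortable, as in the study
attitude = 4.5 + 0.2 * group + 0.6 * comfort + rng.normal(0, 1, n)

# OLS fit: attitude ~ intercept + group + comfort
X = np.column_stack([np.ones(n), group, comfort])
beta, *_ = np.linalg.lstsq(X, attitude, rcond=None)

# Adjusted means: predictions for each group at the grand-mean comfort level
grand = comfort.mean()
adj_female = beta[0] + beta[2] * grand
adj_male = beta[0] + beta[1] + beta[2] * grand
print(round(adj_male - adj_female, 2))  # group gap after controlling for comfort
```

Because comfort is correlated with group here, the adjusted gap is typically smaller than the raw difference in group means, which is the attenuation pattern described above.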
Comments from students revealed that some, but clearly not a majority, of female secondary students felt that using ICCSs increased stress levels, particularly when ICCSs were used to administer formal tests. A number of male students, on the other hand, were more accepting of ICCSs, seeing them as a fun way to learn or take tests. It is important to note that, while a small but vocal group of male and female students clearly differed in their attitudes toward the use of ICCSs, the majority of students were more similar than different in their comments about using ICCSs.
The impact of computer comfort level is puzzling for four reasons. First, the remote devices that come with an ICCS are very easy to use. They require a student to pick up the device, point it at the receiver, and select one of six buttons labeled A–F. There are no other buttons on the remote device to confuse even the most technologically challenged user. It is assumed that most students would have experienced using such a device at home when operating a television. Second, most students had worked with ICCSs for at least three months, so it is also assumed that they had achieved a certain comfort level with the technology. Third, not one student in this study, female or male, complained that the ICCS remote devices were complicated or difficult to use. Finally, extensive research suggests that students like using ICCSs and find them easy to use (e.g., Caldwell, 2007; Hinde & Hunt, 2006; Pelton & Pelton, 2006; Penuel et al., 2007; Pradhan et al., 2005; Sharma et al., 2005). So why did overall computer comfort level or self-efficacy influence how males and females reacted to the use of ICCSs?
One possible explanation for the impact of computer comfort level is the presence of tangential attitude constructs such as affective or cognitive attitudes. When some students feel uncomfortable with computers, they may also experience a level of anxiety that interferes with learning, regardless of how difficult or easy the specific technology in question is to use. These students may also have a negative cognitive bias toward using technology in general. Gender differences in affective and cognitive attitudes, while small, have been well documented (AAUW, 2000; Barker & Aspray, 2006; Kay, 2008; Whitley, 1997). Another explanation might involve the more impersonal nature of using ICCSs. Using a remote device may be less satisfying than verbal interaction for some female students. Clearly, more research, perhaps in the form of focus groups or interviews, is needed to explain the impact of computer comfort level on attitudes toward ICCS use.
5.2. Gender and type of use
With a sample size of 659 students, it was anticipated that there would be no gender differences in how ICCSs were used. For some unknown reason, though, female students experienced summative assessment more than male students, and male students experienced formative assessment more than female students. When comparisons between male and female students were controlled for "type of use", several significant differences disappeared. Male students were no longer significantly more positive about the engagement, participation, and formative assessment value of ICCSs. In other words, these differences would not have been observed had each gender experienced the same strategies for using ICCSs. On the other hand, male ratings of ICCS-based classes being better overall, more motivating, generating more discussion, and increasing the amount learned continued to be significantly higher than those of female students. While a number of researchers have argued that the type of educational use selected for ICCSs in the classroom can have a fundamental influence on acceptance and overall success (Reay et al., 2005; Simpson & Oliver, 2007; Stuart et al., 2004), male students were still more positive about this technology in several key areas regardless of the instructional strategy used. Of course, only three strategies (summative assessment, formative assessment, and mixed) were examined, based on the type of assessment goals dictated by the teachers, so different results might occur if a wider range of strategies were investigated.
5.3. Gender, computer comfort, and type of use
A third analysis was completed to explore whether the impacts of computer comfort and type of use were additive. When both of these variables were added as covariates, males and females differed on only three scale items: overall attitude toward ICCS-based classes, generating more class discussion with ICCSs, and liking to use ICCSs for tests. While male students were still more positive about the use of ICCSs in the classroom, many of the initial gender differences observed were no longer significant.

Extending the analysis of gender differences in attitudes toward ICCSs to include possible confounding variables proved to be an effective strategy for identifying why male students preferred to use ICCSs. While there may be additional factors that moderate gender differences, it is clear that computer comfort level and "type of use" had an impact on gender-based attitudes toward ICCSs.
5.4. Recommendations for educators
Given the formative nature of this study, offering firm recommendations to educators using ICCSs in secondary school classrooms is probably risky. However, two suggestions for addressing and perhaps reducing gender differences are tentatively offered. First, it may be tempting to
launch into the instructional use of an ICCS relatively quickly because (a) valuable teaching time is lost setting up an ICCS and handing out
the remote devices, and (b) the technology is easy to use and learn. However, a small but disproportionate number of female students may
feel uncomfortable with the idea of using remote devices to respond to questions. Therefore, it is probably a good idea to clearly explain
why and how an ICCS will be used in the classroom to help reduce the anxiety level of some female students. In addition, it might be wise to
use practice problems designed to relax and help students feel at ease with the process of using ICCSs.
R.H. Kay / Computers & Education 52 (2009) 730–740 737
The second suggestion is to use ICCSs for formative as opposed to summative assessment. A number of female students reacted quite negatively to using an ICCS as a testing tool (summative use). However, most students, regardless of gender, reacted positively to using an ICCS to assess understanding of concepts (formative use). If some female students are alienated by the summative use of ICCSs and a majority of students respond positively to the formative use of ICCSs, it is reasonable to adopt the latter approach.
5.5. Caveats and future research
This study was a first attempt to explore gender differences in attitudes toward ICCSs in secondary school classrooms. Two main data collection sources, survey questions and open-ended comments, were used to establish triangulation and reliability of the data. Nonetheless, there are several caveats that need to be considered when interpreting the results.

First, the results are intended to provide a preliminary or formative analysis of attitudes. A more detailed examination is required to confirm and address the factors that influence gender differences. For example, qualitative information in the form of focus groups or interviews might help provide a more in-depth understanding of female and male reactions to ICCSs.

Second, a comprehensive, reliable, and valid measure needs to be developed in order to assess the full effects of ICCSs. To date, no such instrument exists, but in order to build a cohesive knowledge base, researchers need to establish a common ground.
Finally, evidence was presented in this study to suggest that computer comfort and type of use influenced gender differences in secondary school student attitudes toward ICCSs. A stronger argument could be made if teachers were asked to reduce the impact of these two variables. For example, a program could be developed to help students get used to and feel more comfortable with ICCSs. The reason for using this new tool could be explained, followed by practice sessions and more regular use. In addition, using an ICCS for summative assessment could be minimized or eliminated. Assessing the effect of these interventions could help to confirm the influence of computer comfort level and type of use.
6. Summary
The purpose of this study was to explore gender differences in secondary school students' attitudes toward ICCSs while accounting for computer comfort level and type of ICCS use. Three key areas were examined: student involvement, assessment, and perceived learning. A simple comparison showed that male students were significantly more positive than female students in all three categories. However, when both computer comfort level and type of use were added as covariates, most differences in attitudes between male and female students were no longer significant. Regardless of computer comfort level and type of use, male students still felt ICCSs improved the overall learning process. It was suggested that offering a more thorough introduction to ICCSs, increasing the frequency of use, and eliminating the practice of using ICCSs as formal test-taking tools might help to limit the negative impact of ICCSs on female students.
Appendix A. ICCS attitude survey for students
A. What grade are you in? ______
B. Gender (circle one): Male  Female
C. How comfortable are you with technology? (circle one)
   Not at all comfortable   Somewhat comfortable   Comfortable   Very comfortable
D. How often did you use ICCSs in the past month?
   Never   1–2 times   Once a week   2–3 times per week

Each item below was rated on a seven-point scale: 1 = Strongly disagree, 2 = Disagree, 3 = Slightly disagree, 4 = Neutral, 5 = Slightly agree, 6 = Agree, 7 = Strongly agree.

General attitude
1. When ICCS was used, the class was better.

Student involvement
2. I was more engaged in the lesson when ICCS was used.
3. I was more motivated when ICCS was used.
4. I participated more than I normally would when ICCS was used.
5. Using ICCS generated more class discussion.

Assessment
6. Using ICCS was a good way to test my knowledge.
7. I liked seeing what other students in the class selected for answers.
8. I liked using ICCS for tests.

Learning
9. I learned more when ICCS was used.
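The nine items above can be aggregated into the three attitude areas discussed in the text (involvement, assessment, learning) plus the single overall-attitude item. The helper below is an illustrative scoring sketch under that assumption, not the study's actual scoring code.

```python
# Illustrative subscale keys: indices into the nine 7-point items,
# in the order the items appear in Appendix A (item 0 = overall attitude).
SUBSCALES = {
    "involvement": [1, 2, 3, 4],   # engaged, motivated, participated, discussion
    "assessment": [5, 6, 7],       # test knowledge, see others' answers, tests
    "learning": [8],               # learned more
}

def subscale_means(responses):
    """responses: list of nine integers, each 1-7. Returns mean per subscale."""
    if len(responses) != 9 or any(not 1 <= r <= 7 for r in responses):
        raise ValueError("expected nine ratings between 1 and 7")
    return {name: sum(responses[i] for i in idx) / len(idx)
            for name, idx in SUBSCALES.items()}

print(subscale_means([5, 6, 6, 5, 4, 6, 5, 4, 5]))
```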
Appendix B. Coding scheme for student comments about ICCS
Categories

Student involvement
  Engagement: Fun/motivated/interested
  Participation: Refers to increased participation/interactivity/getting involved/more hands-on learning
  Attention: Focuses on the class more, focuses on the questions more, or concentrates more

Assessment
  Formative assessment: Refers to formative assessment, including homework; does not mention a "summative test". Quizzes are included (students do not seem to consider them summative). A student could refer to testing knowledge, but not "a test".
  Compare with other students: Talks about how they did compared to the rest of the class
  Feedback: Refers to the feedback they get, such as the quickness of feedback or getting to see the answer right away

Learning
  Learning process: Makes specific reference to learning or thinking
  Review: Talks about reviewing or getting prepared for a test
  Memory: Talks about remembering better
  Learning performance: Makes reference to learning performance on a test or how well they did

Challenges
  Technology: Refers to the use of the clicker technology
  Different method used: Liked because it was a new method or way of learning/doing things/something different (positive score if the clicker method was preferred; negative score if another method, e.g., paper and pencil, was preferred)
  Stress: Talks about feeling rushed, or about stress/pressure/frustration in general

Rating
  -2  Strongly negative: uses an adverb to describe impact (e.g., very, too, really), a strong negative adjective (e.g., hate, annoying), a serious issue like reduced confidence, more than one negative adjective, or an exclamation mark (e.g., "this was terrible!")
  -1  Negative comment
   0  Neutral (e.g., no effect, no impact)
  +1  Positive comment
  +2  Strongly positive: uses an adverb to describe impact (e.g., very, too, really), a strong positive adjective (e.g., love, awesome, captivating, great), more than one positive adjective, or an exclamation mark (e.g., "this was good!")
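The rating rubric above lends itself to a simple keyword-based scorer. The keyword sets below are a toy approximation of the rubric for illustration; the study's comments were coded by hand, not by this function.

```python
# Toy keyword lists approximating the Appendix B rubric (illustrative only)
NEG = {"hate", "annoying", "boring", "stressful"}
POS = {"love", "awesome", "captivating", "great", "fun"}
INTENSIFIERS = {"very", "too", "really"}

def rate_comment(text):
    """Score a comment on the -2..+2 scale sketched in Appendix B."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    neg, pos = words & NEG, words & POS
    if not neg and not pos:
        return 0                       # neutral: no evaluative language found
    sign = 1 if len(pos) >= len(neg) else -1
    # "Strong" if intensified, multiple adjectives, or an exclamation mark
    strong = (len(pos) + len(neg) > 1) or (words & INTENSIFIERS) or ("!" in text)
    return sign * (2 if strong else 1)

print(rate_comment("clickers were great"))         # mild positive
print(rate_comment("I really hate the clickers"))  # intensified negative
print(rate_comment("no effect on me"))             # neutral
```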
References
Abrahamson, L. (2006). A brief history of networked classrooms: Effects, cases, pedagogy, and implications. In D. A. Banks (Ed.), Audience response systems in higher education
(pp. 1–25). Hershey, PA: Information Science Publishing.
Allen, D., & Tanner, K. (2005). Infusing active learning into the large-enrolment biology class: Seven strategies, from the simple to complex. Cell Biology Education, 4, 262–268.
American Association of University Women (2000). Tech-Savvy: Educating girls in the new computer age. Washington, DC: American Association of University Women
Foundation. Retrieved Nov 21, 2008 from <http://www.aauw.org/research/upload/TechSavvy.pdf>.
Barbeite, F. G., & Weiss, E. M. (2004). Computer self-efficacy and anxiety scales for an internet sample: Testing measurement equivalence of existing measures and
development of new scales. Computers in Human Behavior, 20(1), 1–15.
Barker, L. J., & Aspray, W. (2006). The state of research on girls and IT. In J. M. Cohoon & W. Aspray (Eds.), Women and information technology (pp. 3–54). Cambridge, MA: The
MIT Press.
Beatty, I. (2004). Transforming student learning with classroom communication systems. EDUCAUSE Research Bulletin, 2004(3), 1–13. Retrieved Nov 3, 2007 from <http://www.educause.edu/ir/library/pdf/ERB0403.pdf>.
Bergtrom, G. (2006). Clicker sets as learning objects. Interdisciplinary Journal of Knowledge and Learning Objects, 2. Retrieved Nov 3, 2007 from <http://ijklo.org/Volume2/
v2p105–110Bergtrom.pdf>.
Bullock, D. W., LaBella, V. P., Clinghan, T., Ding, Z., Stewart, G., & Thibado, P. M. (2002). Enhancing the student-instructor interaction frequency. The Physics Teacher, 40, 30–36.
Burton, K. (2006). The trial of an audience response system to facilitate problem-based learning in legal education. In D. A. Banks (Ed.), Audience response systems in higher
education (pp. 265–276). Hershey, PA: Information Science Publishing.
Caldwell, J. E. (2007). Clickers in the large classroom: Current research and best-practice tips. CBE-Life Sciences Education, 6(1), 9–20.
Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.
Crouch, C. H., & Mazur, E. (2001). Peer instruction: Ten years of experience and results. American Journal of Physics, 69(9), 970–977.
Draper, S. W., & Brown, M. I. (2004). Increasing interactivity in lectures using an electronic voting system. Journal of Computer Assisted Learning, 20(2), 81–94.
Draper, S. W., Cargill, J., & Cutts, Q. (2002). Electronically enhanced classroom interaction. Australian Journal of Educational Technology, 18, 13–23.
Dufresne, R. J., & Gerace, W. J. (2004). Assessing-to-learn: Formative assessment in physics instruction. The Physics Teacher, 42, 428–433.
Durndell, A., & Haag, Z. (2002). Computer self efficacy, computer anxiety, attitude towards the internet and reported experience with the internet, by gender, in an east European sample. Computers in Human Behavior, 18(5), 521–535.
El-Rady, J. (2006). To click or not to click: That’s the question. Innovate Journal of Online Education, 2(4). Retrieved Nov 3, 2007 from <http://www.innovateonline.info/
index.php?view=article&id=171>.
Fagan, A. P., Crouch, C. H., & Mazur, E. (2002). Peer instruction: Results from a range of classrooms. The Physics Teacher, 40(4), 206–209.
Fies, C., & Marshall, J. (2006). Classroom response systems: A review of the literature. Journal of Science Education and Technology, 15(1), 101–109.
Greer, L., & Heaney, P. J. (2004). Real-time analysis of student comprehension: An assessment of electronic student response technology in an introductory earth science
course. Journal of Geoscience Education, 52(4), 345–351.
Hinde, K., & Hunt, A. (2006). Using the personal response system to enhance student learning: Some evidence from teaching economics. In D. A. Banks (Ed.), Audience response
systems in higher education (pp. 140–154). Hershey, PA: Information Science Publishing.
Jackson, M., Ganger, A. C., Bridge, P. D., & Ginsburg, K. (2005). Wireless handheld computers in the undergraduate medical curriculum. Medical Education Online, 10(5).
Retrieved Nov 3, 2007 from <http://www.med-ed-online.org/pdf/t0000062.pdf>.
Judson, E., & Sawada, D. (2002). Learning from past and present: Electronic response systems in college lecture halls. Journal of Computers in Mathematics and Science Teaching,
21(2), 167–181.
Kaleta, R., & Joosten, T. (2007). Student response systems: A University of Wisconsin system study of clickers. EDUCAUSE Research Bulletin, 2007(10), 1–12.
Kay, R. H. (2008). Exploring gender differences in computer-related behaviour: Past, present, and future. In T. T. Chen & I. Chen (Eds.), Social information technology: Connecting
society and cultural issues (pp. 12–30). Hershey, PA: Information Science Reference.
Kay, R. H., & Knaack, L. (2005). Developing learning objects for secondary school students: A multi-component model. Interdisciplinary Journal of Knowledge and Learning
Objects, 1, 229–254.
Kay, R. H. (in press). A formative analysis of interactive classroom communication systems used in secondary school classrooms. Handbook of research on new media literacy at
the K-12 level.
Kennedy, G. E., & Cutts, Q. I. (2005). The association between students’ use of electronic voting systems and their learning outcomes. Journal of Computer Assisted Learning,
21(4), 260–268.
Kirk, R. E. (1982). Experimental design (2nd ed.). Monterey, CA: Brooks/Cole Publishing Company.
Latessa, R., & Mouw, D. (2005). Use of audience response system to augment interactive learning. Family Medicine, 37(1), 12–14. Retrieved Nov 3, 2007 from <http://
www.stfm.org/fmhub/fm2005/January/Robyn12.pdf>.
McCabe, M. (2006). Live assessment by questioning in an interactive classroom. In D. A. Banks (Ed.), Audience response systems in higher education (pp. 276–288). Hershey, PA:
Information Science Publishing.
Nicol, D. J., & Boyle, J. T. (2003). Peer instruction versus class-wide discussion in large classes: A comparison of two interaction methods in the wired classroom. Studies in
Higher Education, 28(4), 457–473.
Pelton, L. F., & Pelton, T. (2006). Selected and constructed response systems in mathematics. In D. A. Banks (Ed.), Audience response systems in higher education (pp. 175–186).
Hershey, PA: Information Science Publishing.
Penuel, W. R., Boscardin, C. K., Masyn, K., & Crawford, V. M. (2007). Teaching with student response systems in elementary and secondary education settings: A survey study.
Educational Technology Research and Development, 55(4), 315–346.
Pradhan, A., Sparano, D., & Ananth, C. V. (2005). The influence of an audience response system on knowledge retention: An application to resident education. American Journal
of Obstetrics and Gynecology, 193(5), 1827–1830.
Preszler, R. W., Dawe, A., Shuster, C. B., & Shuster, M. (2007). Assessment of the effects of student response systems on student learning and attitudes over a broad range of
biology courses. CBE-Life Sciences Education, 6(1), 29–41.
Reay, N. W., Bao, L., Li, P., Warnakulasooriya, R., & Baugh, G. (2005). Toward the effective use of voting machines in physics lectures. American Journal of Physics, 73(6),
554–558.
Shapka, J. D., & Ferrari, M. (2003). Computer-related attitudes and actions of teacher candidates. Computers in Human Behavior, 19(3), 319–334.
Sharma, M. D., Khachan, J., Chan, B., & O’Byrne, J. (2005). An investigation of the effectiveness of electronic classroom communication systems in large lectures. Australasian
Journal of Educational Technology, 21(2), 137–154.
Siau, K., Sheng, H., & Nah, F. (2006). Use of a classroom response system to enhance classroom interactivity. IEEE Transactions on Education, 49(3), 398–403.
Simpson, V., & Oliver, M. (2007). Electronic voting systems for lectures then and now: A comparison of research and practice. Australasian Journal of Educational Technology,
23(2), 187–208.
Slain, D., Abate, M., Hodges, B. M., Stamatakis, M. K., & Wolak, S. (2004). An interactive response system to promote active learning in the doctor of pharmacy curriculum. American Journal of Pharmaceutical Education, 68(5), 1–9.
Solvberg, A. (2002). Gender differences in computer-related control beliefs and home computer use. Scandinavian Journal of Educational Research, 46(4), 410–426.
Stuart, S. A. J., Brown, M. I., & Draper, S. W. (2004). Using an electronic voting system in logic lectures: One practitioner’s application. Journal of Computer Assisted Learning,
20(2), 95–102.
Uhari, M., Renko, M., & Soini, H. (2003). Experiences of using an interactive audience response system in lectures. BMC Medical Education, 3(12), 1–6.
Whitley, B. E., Jr. (1997). Gender differences in computer-related attitudes and behaviors: A meta-analysis. Computers in Human Behavior, 13, 1–22.