J. EDUCATIONAL TECHNOLOGY SYSTEMS, Vol. 35(4) 411-448, 2006-2007
A SYSTEMATIC EVALUATION OF LEARNING
OBJECTS FOR SECONDARY SCHOOL STUDENTS
ROBIN KAY
University of Ontario Institute of Technology
ABSTRACT
Empirical research evaluating the effectiveness of learning objects is
noticeably absent. No formal research has been done on the use of learning
objects in secondary schools. The purpose of this study was to evaluate the
use of learning objects by high school students. The evaluation metric used
to assess benefits and quality of learning objects was theoretically sound,
reliable, and partially validated. Overall, two-thirds of the students stated they
benefitted from using learning objects. Students benefitted more if they were
comfortable with computers, the learning object had a well organized layout,
the instructions were clear, and the theme was fun or motivating. Students
appreciated the motivational, interactive, visual qualities of the learning
objects most. Computer comfort was significantly correlated with learning
object quality and benefit. Younger students appeared to have less positive
experiences than their older counterparts. There were no gender differences in
perceived benefit or quality of learning objects, with one exception. Females
emphasized the quality of help features significantly more than males.
INTRODUCTION
Over the past 10 years, a substantial effort has been made to increase the use of
technology in the classroom (Compton & Harwood, 2003; McRobbie, Ginns, &
Stein, 2000; Plante & Beattie, 2004; U.S. Department of Education, National
Center for Education Statistics, 2002). In spite of these efforts, a number of
researchers have argued that technology has had a minor or negative impact on
student learning (e.g., Cuban, 2001; Robertson, 2003; Russell, Bebell, O’Dwyer,
& O’Connor, 2003; Waxman, Connell, & Gray, 2002). Part of the problem stems
© 2007, Baywood Publishing Co., Inc.
from a considerable list of obstacles that have prevented successful implementation of technology, including a lack of time (Eifler, Greene, & Carroll, 2001;
Wepner, Ziomek, & Tao, 2003), limited technological skill (Eifler et al., 2001;
Strudler, Archambault, Bendixen, Anderson, & Weiss, 2003; Thompson,
Schmidt, & Davis, 2003), fear of technology (Bullock, 2004; Doering, Hughes, &
Huffman, 2003), a clear lack of understanding about how to integrate technology
into teaching (Cuban, 2001), and insufficient access (e.g., Bartlett, 2002; Brush
et al., 2003; Russell et al., 2003).
The Role of Learning Objects
Learning objects, defined in this article as “interactive Web-based tools that
support learning by enhancing, amplifying, and guiding the cognitive processes of
learners” (Agostinho, Bennett, Lockyer, & Harper, 2004; Butson, 2003; Friesen,
2001; Gibbons, Nelson, & Richards, 2000; Littlejohn, 2003; Metros, 2005;
McGreal, 2004; Muzio, Heins, & Mundell, 2002; Parrish, 2004; Polsani, 2003;
Wiley, 2000; Wiley et al., 2004), offer a number of key components that can
reduce the impact of potential obstacles observed in the past (accessibility, ease
of use, reusability) and enhance student learning (interactivity, graphics, reduc-
tion of cognitive load, adaptive).
In contrast to other learning technologies burdened with implementation
challenges and costs, learning objects are readily accessible over the Internet and
users need not worry about excessive costs or not having the latest version (Wiley,
2000). Well over 90% of all public schools in North America and Europe now
have access to the Internet (and therefore learning objects) with most having
high-speed broadband connections (Compton & Harwood, 2003; McRobbie et al.,
2000; Plante & Beattie, 2004; U.S. Department of Education, National Center
for Education Statistics, 2002). In addition, because of their limited size and
focus, learning objects are relatively easy to learn and use, making them much
more attractive to busy educators who have little time to learn more complex,
advanced software packages (Gadanidis, Gadanidis, & Schindler, 2003). Finally,
reusability permits learning objects to be useful for a large audience, particularly
when the objects are placed in well organized, searchable databases (e.g.,
Agostinho et al., 2004; Duval, Hodgins, Rehak, & Robson, 2004; Rehak &
Mason, 2003).
With respect to enhancing learning, many learning objects are interactive
tools that support exploration, investigation, constructing solutions, and manipu-
lating parameters instead of memorizing and retaining a series of facts. This
constructivist based model is well documented in the literature (e.g., Albanese &
Mitchell, 1993; Bruner, 1983, 1986; Carroll, 1990; Carroll & Mack, 1984; Collins,
Brown, & Newman, 1989; Vygotsky, 1978). In addition, a number of learning
objects have a graphical component that helps make abstract concepts more
concrete (Gadanidis et al., 2003). Furthermore, certain learning objects allow
students to explore higher level concepts by reducing cognitive load. They act as
perceptual and cognitive supports, permitting students to examine more complex
and interesting relationships (Sedig & Liang, 2006). Finally, learning objects are
adaptive, allowing users a degree of control over their learning
environments, particularly over when and for how long they learn.
In spite of this long list of potential benefits, little research has been done
examining the actual use and impact of learning objects in the classroom (Bradley
& Boyle, 2004; Kenny, Andrews, Vignola, Schilz, & Covert, 1999; Van Zele,
Vandaele, Botteldooren, & Lenaerts, 2003). The few studies examining the use
of learning objects have concentrated exclusively on higher education. No
formal, published studies were found investigating the use of learning objects
for secondary school students.
Learning Object Research
An extensive review of the literature uncovered 58 articles related to learning
objects. Fifty of these papers were theoretical in nature—they did not offer formal
evaluations to support claims made. Acknowledging that a single paper could
cover more than one topic, the following topics were addressed: design (n = 24; 41%); metadata (n = 17; 29%); learning (n = 17; 29%); reusability (n = 12; 21%); development (n = 11; 19%); evaluation (n = 11; 19%); definition (n = 9; 16%); repositories and searching (n = 9; 16%); use (n = 7; 12%); and standards (n = 5; 9%). The majority of the topics covered are technology focused, with relatively little discussion of learning, evaluation, and use.
A number of authors note that the “learning object” revolution will never take
place unless instructional use and pedagogy is explored and evaluated (Maclaren,
2004; Muzio et al., 2002; Richards, 2002; Wiley, 2000). Agostinho et al. (2004)
and Wiley (2000) add that the learning object research agenda must begin to
investigate how learning objects can be used to create high quality instruction
or “we will find ourselves with digital libraries full of easy to find learning
objects we don’t know how to use” (Agostinho et al., 2004, p. 2). Finally, Duval
et al. (2004) note that while many groups seem to be grappling with issues
that are related to the pedagogy and learning objects, few papers include a
detailed analysis of specific learning object features that affect learning. Clearly,
there is a need for empirical research that focuses on the pedagogical qualities
of learning objects.
Evaluation of Learning Objects
Only eight of the 58 papers reviewed in this study evaluated the actual
use of learning objects (Adams, Lubega, Walmsley, & Williams, 2004; Bradley
& Boyle, 2004; Cochrane, 2005; Kenny et al., 1999; Krauss & Ally, 2005;
Macdonald et al., 2005; Nesbit, Belfer, & Vargo, 2002; Van Zele et al., 2003).
Various methods of evaluation were used including informal or qualitative
feedback (Adams et al., 2004; Bradley & Boyle, 2004; Cochrane, 2005;
Macdonald et al., 2005), descriptive analysis (Krauss & Ally, 2005; Macdonald
et al., 2005), convergent participation (Nesbit et al., 2002), formal surveys
(Cochrane, 2005; Krauss & Ally, 2005), and learning outcomes (Adams et al.,
2004; Bradley & Boyle, 2004; Macdonald et al., 2005; Van Zele et al., 2003).
In all eight studies, students and/or professors reported that learning objects
had a positive impact. Learning objects that offered clear instructions, engaging
activities, and interactivity (Cochrane, 2005; Krauss & Ally, 2005; Macdonald
et al., 2005) were rated as most successful. The results from these studies,
though, are limited in several ways. First, the research has been done exclusively
with university students. Second, while all eight studies reported that students
benefitted from the learning objects, the evidence was gathered, for the most part,
from loosely designed assessment tools with no validity or reliability. Third, a
formal statistical analysis was done in only two studies (Adams et al., 2004;
Van Zele et al., 2003). Finally, only three studies explored the impact of specific
learning object characteristics (Cochrane, 2005; Kenny et al., 1999; Krauss &
Ally, 2005). More formal, systematic research on broader populations is needed
with respect to evaluating the impact of learning objects (Duval et al., 2004).
Computer Attitudes
Considerable research has been done on the effect of attitude on computer
related behavior (Barbeite & Weiss, 2004; Christensen & Knezek, 2000; Durndell
& Haag, 2002; Kay, 1989, 1993b; Liu, Maddux, & Johnson, 2004; Torkzadeh,
Pflughoeft, & Hall, 1999). In general, more positive computer attitudes are associated with higher levels of computer ability and use. Self-efficacy, or perceived comfort with using computers, has been shown to be particularly influential
on computer related behaviors (e.g., Barbeite & Weiss, 2004; Durndell & Haag,
2002; Shapka & Ferrari, 2003; Solvberg, 2002). Self-efficacy has not been
examined with respect to the use of learning objects, although it is expected
that students who are more comfortable with computers will benefit more.
Gender Differences in Computer Behavior
In 1992, Kay reviewed 36 studies on gender and computer related behaviors.
While there were clear measurement concerns regarding the assessment of gender
differences in computer ability, attitude, and use (e.g., Kay, 1992, 1993a), the
overall picture indicated that males had more positive attitudes, higher ability,
and used computers more. Five years later, a meta-analysis by Whitley (1997)
revealed the imbalance between males and females continued to exist with respect
to computer attitudes. Males had greater sex-role stereotyping of computers,
higher computer self-efficacy, and more positive affect about computers than
females. In a recent review (Kay, 2006), differences between males and females
appear to be lessening somewhat, although male dominance is still prevalent with
respect to attitude, ability, and use. Only one study has looked at gender differ-
ences with respect to computer attitudes and learning objects—no significant
differences were found between male and female university students (Van Zele et al., 2003).
Purpose
The purpose of this study was to systematically evaluate the use of learning
objects by secondary school students. The key questions addressed were:
1. What were the reported benefits of using the learning objects (perceived
benefit)?
2. What did students like and dislike about using learning objects (quality of
learning objects)?
3. What was the influence of individual differences (computer comfort level,
gender, grade, and learning object type) on the perceived benefits and
assessment of quality?
METHOD
Sample
Students
The sample consisted of 221 secondary school students (104 males, 116
females, 1 missing data), 13 to 17 years of age, in grades 9 (n = 85), 11 (n = 67),
and 12 (n = 69) from 12 different high schools and three boards of education.
The students were obtained through convenience sampling.
Teachers
A total of 30 teachers (9 experienced, 21 preservice) participated in the
development of the learning objects. Experienced teachers had taught for 10
or more years. Preservice teachers had completed at least a B.A. or B.Sc. and
were enrolled in an eight-month bachelor of education program. The breakdown
by subject area was eight for Biology (two experienced, six preservice), five
for Chemistry (two experienced, three preservice), five for Computer Science
(one experienced, four preservice), five for Physics (one experienced, four
preservice), and seven for Math (three experienced, four preservice).
Learning Objects
Five learning objects in five different subject areas were evaluated by secondary
school students. Seventy-eight students used the Mathematics learning object
(grade 9), 40 used the Physics learning object (grades 11 and 12), 37 used the
Chemistry learning object (grade 12), 34 used the Biology learning object (grades
9 and 11), and 32 used the Computer Science learning object (grades 11 and 12).
All learning objects can be accessed at: http://education.uoit.ca/learningobjects.
A brief description is provided below.
The mathematics learning object (Deep Space Line) was designed to help grade
9 students explore the formula and calculations for the slope of a line. Students
used their knowledge of slope to navigate a spacecraft through four missions. As
the missions progressed from level one to level four, less scaffolding was provided
to solve the mathematical challenges.
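The computation the missions scaffold is the slope formula, m = (y2 − y1)/(x2 − x1). The helper below is a hypothetical sketch for illustration, not code from Deep Space Line:

```python
def slope(x1, y1, x2, y2):
    """Slope of the line through (x1, y1) and (x2, y2): rise over run."""
    if x1 == x2:
        raise ValueError("vertical line: slope is undefined")
    return (y2 - y1) / (x2 - x1)

# A spacecraft moving from (1, 2) to (5, 10) travels along a line of slope 2
print(slope(1, 2, 5, 10))  # 2.0
```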
The physics learning object (Relative Velocity) helped grade 11 and 12 students
explore the concept of relative velocity. Students completed two case study
questions, and then actively manipulated the speed and direction of a boat, along
with the river speed, to see how these variables affect relative velocity.
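The relative velocity the students manipulated is vector addition: the boat's velocity over the ground is the sum of its velocity through the water and the river current. The function and axis conventions below are illustrative assumptions, not the learning object's code:

```python
import math

def resultant_velocity(boat_speed, boat_heading_deg, river_speed):
    """Velocity of the boat relative to the ground.

    The current flows along +x; the boat's heading is measured in degrees
    from the downstream (+x) direction. Returns (speed, angle_deg).
    """
    vx = boat_speed * math.cos(math.radians(boat_heading_deg)) + river_speed
    vy = boat_speed * math.sin(math.radians(boat_heading_deg))
    return math.hypot(vx, vy), math.degrees(math.atan2(vy, vx))

# Boat at 4 m/s straight across (90 degrees) in a 3 m/s current:
# the classic 3-4-5 triangle gives a 5 m/s resultant
speed, angle = resultant_velocity(4.0, 90.0, 3.0)
print(round(speed, 2))  # 5.0
```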
The biology learning object (Groovy Genetics) was designed to help grade 11
students investigate the basics of Mendel’s genetics relating the genotype (genetic
trait) with the phenotype (physical traits) including monohybrid and dihybrid
crosses. Students received visual instruction on completing Punnett squares. Each
activity finished with an assessment.
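A monohybrid Punnett square of the kind the object walks students through pairs each allele of one parent with each allele of the other. This generator is a hypothetical sketch, not taken from Groovy Genetics:

```python
from itertools import product

def punnett_square(parent1, parent2):
    """Monohybrid Punnett square: cross every allele of parent1 with
    every allele of parent2 (e.g. 'Aa' x 'Aa')."""
    # Sort so the dominant (uppercase) allele is written first: 'aA' -> 'Aa'
    return [''.join(sorted(a1 + a2)) for a1, a2 in product(parent1, parent2)]

# Aa x Aa yields the classic 1:2:1 genotype ratio
print(punnett_square('Aa', 'Aa'))  # ['AA', 'Aa', 'Aa', 'aa']
```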
The chemistry learning object (Le Chatelier’s Principle) demonstrated the
three stresses (concentration, temperature, and pressure change) that can be
imposed on a system at chemical equilibrium. Students explored how equilibrium
shifts related to Le Chatelier’s Principle. Students assessed their learning in a
simulated laboratory environment by imposing changes to equilibrated systems
and predicting the correct outcome.
The computer science learning object (Logic Flows) was designed to teach
grade 10 or 11 students the six basic logic operations (gates) AND, OR, NOT,
XOR (exclusive OR), NOR (NOT-OR), and NAND (NOT-AND) through a visual
metaphor of water flowing through pipes. Students selected the fewest inputs (water taps) needed to produce a result at the single output (water holding tank), learning the logical function of each operation.
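The logical functions behind the water-pipe metaphor can be written directly as truth functions. This generic table of gates is a sketch, not Logic Flows' implementation:

```python
# The two-input logic operations covered by the learning object;
# NOT is unary (simply `not a`) and is omitted from the table.
GATES = {
    'AND':  lambda a, b: a and b,
    'OR':   lambda a, b: a or b,
    'XOR':  lambda a, b: a != b,
    'NOR':  lambda a, b: not (a or b),
    'NAND': lambda a, b: not (a and b),
}

def truth_table(name):
    """Map every pair of boolean inputs to the gate's output."""
    gate = GATES[name]
    return {(a, b): gate(a, b) for a in (False, True) for b in (False, True)}

print(truth_table('NAND')[(True, True)])  # False
```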
Developing the Learning Objects
The design of the learning objects was based on the following principles.
First, the learning objects were created at the grassroots level by preservice and
inservice teachers. Wiley (2000) maintained that learning objects need to be
sufficiently challenging, so inservice teachers were asked to brainstorm about and
select areas where their students had the most difficulty. Second, the learning
objects were designed to be context rich; however, they focused on relatively specific topic areas that could be shared by different grades. Reusability, while
important, took a back seat to developing meaningful and motivating problems.
This approach is supported by a number of learning theorists (Brown, Collins, &
Duguid, 1989; Lampert, 1986; Larkin, 1989; Lave & Wenger, 1991; Sternberg,
1989). Third, the learning objects were both interactive and constructivist in
nature. Students interacted with the computer, but not simply by clicking “next,
next, next.” They had to construct solutions to genuine problems. Fourth, the
“octopus” or resource model proposed by Wiley et al. (2004) was used. The
learning objects were designed to support and reinforce understanding of specific
concepts. They were not designed as stand alone modules that could teach
concepts. Finally, the learning objects went through many stages of development
and formative evaluation, including a pilot study involving secondary school
students. This approach is supported by Downes (2001) and Polsani (2003). Note
that a more detailed description and analysis of the development of learning
objects used in this study is offered by Kay and Knaack (2005).
Procedure
Preservice and inservice teachers administered the survey to their classes after
using one of the learning objects within the context of a lesson. Students were
told the purpose of the study and asked to give written consent if they wished to
volunteer to participate. Teachers and teacher candidates were instructed to use
the learning object as authentically as possible. Often the learning object was used
as another teaching tool within the context of a unit. Students were taken to a
computer lab, given some preliminary introduction to the learning object, and then
asked to use it. Since learning objects are best surrounded by teacher input and
assistance, preservice and inservice teachers provided help to students in need. After one
period of using the learning object (approximately 70 minutes), students were
asked to fill out a survey (see Appendix A).
Data Sources
The data for this study were gathered from students using seven 7-point Likert
scale items and two open ended questions (see Appendix A). The questions
yielded both quantitative and qualitative data. A similar five-item Likert survey
was given to teachers who used the learning objects in their classrooms (see
Appendix B). A detailed review and analysis of the evaluation tool used in this
study is presented in Kay and Knaack (2007).
Quantitative Data
A principal components analysis on the seven Likert scale items (student
survey—Appendix A) revealed two distinct constructs (Table 1). The first construct consisted of items 1 to 4 and was labeled “perceived benefit” of the
learning object. The second construct, computer comfort rating, consisted of
items 5 to 7. The internal reliability estimates were .87 for perceived benefit
and .79 for computer comfort rating. Criterion related validity for perceived
benefit score was assessed by correlating the survey score with the qualitative
ratings (Item 9—see scoring). The correlation was significant (r = .64, p < .001).
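The internal reliability estimates quoted above are consistent with Cronbach's alpha. A from-scratch sketch of that computation on made-up response data (the numbers are illustrative, not the study's):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a set of survey items.

    items: list of equal-length lists, one list of scores per item.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
    """
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = sum(var(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_vars / var(totals))

# Hypothetical 7-point responses from five students on three items
items = [[5, 6, 4, 7, 5], [5, 7, 4, 6, 5], [6, 6, 3, 7, 4]]
print(round(cronbach_alpha(items), 2))  # 0.91
```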
Qualitative Data—Learning Object Quality
Item 8 (Appendix A) asked students what they liked and did not like about the
learning object. A total of 757 comments were written down by 221 students.
Student comments were coded based on well-established principles of instruc-
tional design. Thirteen categories are presented with examples and references in
Table 2. In addition, all comments were rated on a 5-point Likert scale (–2 = very
negative, –1 = negative, 0 = neutral, 1 = positive, 2 = very positive).
Two raters assessed the first 100 comments made by students and achieved
inter-rater reliability of .78. They then met, discussed all discrepancies and
attained 100% agreement. Next, the raters assessed the remaining 657 comments with an inter-rater reliability of .66. All discrepancies were reviewed and 100%
agreement was reached again.
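The inter-rater reliability figures above can be read as correlations between the two raters' scores. A minimal Pearson correlation sketch on invented ratings (not the study's data):

```python
def pearson(x, y):
    """Pearson correlation between two raters' scores for the same comments."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical ratings of eight comments on the -2..+2 scale
rater1 = [2, 1, -1, 0, 1, -2, 0, 1]
rater2 = [1, 1, -1, 0, 2, -2, -1, 1]
print(round(pearson(rater1, rater2), 2))  # 0.88
```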
Qualitative Data—Benefits of Learning Objects
Item 9 (Appendix A) asked students whether the learning object was beneficial.
Two hundred and twenty-five comments were made and categorized according to
nine post-hoc categories (Table 3). Each comment was then rated on a 5-point
Likert scale (–2 = very negative, –1 = negative, 0 = neutral, 1 = positive, 2 = very
positive). Two raters assessed all comments made by students and achieved
Table 1. Varimax Rotated Factor Loadings on Learning Object Survey

Item                      Factor 1   Factor 2
1. Another strategy          .85
2. Understanding             .88
3. Did benefit(a)            .83
4. Would use again           .64
5. Enjoy computers                      .82
6. Graphics help                        .84
7. Interactive helps                    .73

Factor   Eigenvalue   PCT of VAR   CUM PCT
1           4.02         57.5        57.5
2           1.05         15.0        72.5

(a) Reverse scored because of a negatively worded question.
inter-rater reliability of .72. They then met, discussed all discrepancies and
attained 100% agreement.
Data
Predictor Variables
Four predictor variables were examined in this study: computer comfort
rating (Items 5 to 7 in Appendix A), gender (male and female), grade (9, 11,
and 12), and learning object (Mathematics, Physics, Chemistry, Biology, and
Computer Science).
Response Variables
Two main response variables were looked at: perceived benefit and quality.
Perceived benefit of the learning object was measured using survey (Items 1 to 4 in
Appendix A) and qualitative feedback (Item 9—Appendix A). The quality of
learning objects was assessed from qualitative feedback (Item 8—Appendix A).
RESULTS
Perceived Benefit of Learning Object
Students
Based on the average perceived benefit rating from the survey (Items 1 to
4—Appendix A), it appears the students felt the learning object was more beneficial than not (M = 4.8, SD = 1.5; scale ranged from 1 to 7). Fourteen percent of all students (n = 30) disagreed (average score of 3 or less) that the learning object was of benefit, whereas 55% (n = 122) agreed (average score of 5 or more) that
it was useful.
The qualitative comments (Q9—Appendix A) supported the survey results.
Twenty-four percent of the students (n = 55) felt the learning object was not beneficial; however, 66% (n = 146) felt it did provide benefit.
A more detailed examination indicated that the motivational, interactive, and
visual qualities were most important to students who benefitted from the learning
object. Whether they learned something new was also cited frequently and rated
highly. Presenting the learning object after the topic had already been learned and
poor instructions were the top two reasons given by students who did not benefit
from the learning object (Table 4).
Teachers
Overall, experienced and preservice teachers strongly agreed that the learning
object was a beneficial learning strategy for students (Item 1—M= 6.5, SD = 0.6)
Table 2. Coding Scheme for Assessing Learning Object Quality (Item 8—Appendix A)

1. Organization/Layout (Calvi, 1997; Koehler & Lehrer, 1998; Lorch, 1989; Madhumita, 1995)
   Criteria: Refers to the location or overall layout of items on the screen.
   Sample comments: “Sometimes we didn’t know where/what to click.” “I found that they were missing the next button.” “Easy to see layout.” “[Use a] full screen as opposed to small box.”

2. Learner Control over Interface (Akpinar & Hartley, 1996; Bagui, 1998; Druin et al., 1999; Hanna et al., 1999; Kennedy & McNaught, 1997)
   Criteria: Refers to the control the user has over specific features of the learning object, including pace of learning.
   Sample comments: “[I liked] that it was step by step and I could go at my own pace.” “I liked being able to increase and decrease volume, temperature, and pressure on my own. It made it easier to learn and understand.” “It was too brief and it went too fast.”

3. Animation (Gadanidis et al., 2003; Oren, 1990; Stoney & Wild, 1998; Sedig & Liang, submitted)
   Criteria: Refers specifically to animation features of the program.
   Sample comments: “You don’t need all the animation. It’s good to give something good to look at, but sometimes it can hinder progress.” “I liked the fun animations.” “Like how it was linked with little movies... demonstrating techniques.” “I liked the moving spaceship.”

4. Graphics (Gadanidis et al., 2003; Oren, 1990; Stoney & Wild, 1998; Sedig & Liang, submitted)
   Criteria: Refers to graphics (non-animated elements of the program), colors, and size of text.
   Sample comments: “The pictures were immature for the age group.” “I would correct several mistakes in the graphics.” “The graphics and captions that explained the steps were helpful.” “Change the colors to be brighter.”

5. Audio (Gadanidis et al., 2003; Oren, 1990; Stoney & Wild, 1998; Sedig & Liang, submitted)
   Criteria: Refers to audio features.
   Sample comments: “Needed a voice to tell you what to do.” “Needs sound effects.” “Unable to hear the character (no sound card on computers).”

6. Clear Instructions (Acovelli & Gamble, 1997; Jones et al., 1995; Kennedy & McNaught, 1997; Macdonald et al., 2005)
   Criteria: Refers to the clarity of instructions before feedback or help is given to the user.
   Sample comments: “Some of the instructions were confusing.” “I... found it helpful running it through first and showing you how to do it.” “[I needed]... more explanations/clearer instructions.”

7. Help Features (Acovelli & Gamble, 1997; Jones et al., 1995; Kennedy & McNaught, 1997; Macdonald et al., 2005)
   Criteria: Refers to help features of the program.
   Sample comments: “The glossary was helpful.” “Help function was really good.” “Wasn’t very good in helping you when you were having trouble... I got more help from the teacher than it.”

8. Interactivity (Akpinar & Hartley, 1996; Bagui, 1998; Druin et al., 1999; Hanna et al., 1999; Kennedy & McNaught, 1997)
   Criteria: Refers to the general interactive nature of the program.
   Sample comments: “Using the computer helped me more for genetics because it was interactive.” “I like that it is on the computer and you were able to type the answers.” “I liked the interacting problems.”

9. Incorrect Content/Errors
   Criteria: Refers to incorrect content.
   Sample comments: “There were a few errors on the sight.” “In the dihybrid cross section, it showed some blond girls who should have been brunette.”

10. Difficulty/Challenge Levels (Hanna et al., 1999; Klawe, 1999; Savery & Duffy, 1995)
    Criteria: Was the program challenging? Too easy? Just the right difficulty level?
    Sample comments: “Make it a bit more basic.” “For someone who didn’t know what they were doing, the first few didn’t teach you anything but to drag and drop.” “I didn’t like how the last mission was too hard.”

11. Useful/Informative (Sedig & Liang, submitted)
    Criteria: Refers to how useful or informative the learning object was.
    Sample comments: “I like how it helped me learn.” “I found the simulations to be very useful.” “[The object] has excellent review material and interesting activities.” “I don’t think I learned anything from it though.”

12. Assessment (Atkins, 1993; Kramarski & Zeichner, 2001; Sedighian, 1998; Wiest, 2001; Zammit, 2000)
    Criteria: Refers to summative feedback/evaluation given after a major task (as opposed to a single action) is completed.
    Sample comments: No specific comments offered by students.

13. Theme/Motivation (Akpinar & Hartley, 1996; Harp & Mayer, 1998)
    Criteria: Refers to the overall theme and/or motivating aspects of the learning object.
    Sample comments: “Very boring. Confusing. Frustrating.” “Better than paper or lecture—game is good!” “I liked it because I enjoy using computers, and I learn better on them.”
and were interested in using the learning object in their classrooms again (Item
3—M= 6.6, SD = 0.6). The teachers moderately agreed that the learning object
helped students with respect to understanding concepts (M= 5.4, SD = 1.2) and
that students would want to use the learning object again (M= 5.5, SD = 1.7).
Teachers agreed that the learning objects would have been more successful if they
had been implemented at the right time in the curriculum (M= 6.0, SD = 1.4).
Recall that the learning objects were used during the field experience placements
and may not have been introduced at a pedagogically appropriate time. Finally,
there were no significant differences between experienced and preservice teachers
with respect to the perceived benefits (Items 1 to 4) of the learning objects
(Hotelling’s T², n.s.).
Quality of Learning Object
Overview
Students were relatively negative with respect to their comments about learning
object quality (Item 8—Appendix A). Fifty-seven percent of all comments were either very negative (n = 42, 6%) or negative (n = 392, 52%), whereas only 42% of the comments were positive (n = 258, 34%) or very positive (n = 57, 8%).
Categories
An analysis of categories evaluating learning object quality (see Table 2 for
description) identified animation, interactivity, and usefulness as the highest rated
areas and audio, correct information, difficulty, clarity of instructions, and help
functions as the lowest rated. Table 5 provides means and standard deviation
for all categories assessing the quality of learning objects.
A one-way ANOVA comparing categories of learning object quality was
significant (p < .001). Audio, correct information, and difficulty were rated significantly lower than animations, interactivity, and usefulness (Scheffé post hoc analysis, p < .05).
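An omnibus test of this kind can be sketched with SciPy's one-way ANOVA. The ratings below are fabricated for illustration, and a Scheffé post hoc comparison would require a separate package (e.g., scikit-posthocs), so only the omnibus step is shown:

```python
from scipy.stats import f_oneway

# Hypothetical -2..+2 quality ratings for three of the coded categories
audio       = [-2, -1, -1, -2, 0, -1]
animation   = [1, 2, 1, 0, 1, 2]
interactive = [1, 1, 2, 0, 1, 1]

# One-way ANOVA: does mean rating differ across categories?
f_stat, p_value = f_oneway(audio, animation, interactive)
print(p_value < 0.001)  # True: a significant omnibus result, which
                        # would then warrant a post hoc test
```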
Categories—Likes Only
One might assume that categories with mean ratings close to zero are not
particularly important with respect to evaluation. However, it is possible that a
mean of zero could indicate an even split between students who liked and disliked
a specific category. Therefore, it is worth looking at what students liked about the
learning objects, without dislikes, to identify polar “hot spots.” A comparison of
means for positive comments confirmed that usefulness (M = 1.33) was still important, but that theme and motivation (M = 1.35), learner control (M = 1.35), and organization of the layout (M = 1.20) also received high ratings. These areas
had mean ratings that were close to zero when negative comments were included
Table 3. Coding Scheme for Assessing Learning Object Benefits (Item 9—Appendix A)

1. Timing
   Criteria: When the learning object was introduced in the curriculum.
   Sample comments: “I think I would have benefitted more if I used this program while studying the unit.” “It didn’t benefit me because that particular unit was over. It would have helped better when I was first learning the concepts.”

2. Review of Basics/Reinforcement
   Criteria: Refers to reviewing, reinforcing a concept, practice.
   Sample comments: “Going over it more times is always good for memory.” “It did help me to review the concept and gave me practice in finding the equation of a line.”

3. Interactive/Hands On/Learner Control
   Criteria: Refers to the interactive nature of the process.
   Sample comments: “I believe I did, cause I got to do my own pace... I prefer more hands on things (like experiments).” “Yes, it helped because it was interactive.”

4. Good for Visual Learners
   Criteria: Refers to visual aspects of the process.
   Sample comments: “I was able to picture how logic gates function better through using the learning object.” “I found it interesting. I need to see it.”

5. Computer Based
   Criteria: Refers more generally to liking to work with computers.
   Sample comments: “I think that digital learning kind of made the game confusing.” “I think I somewhat did because I find working on the computer is easier than working on paper.”

6. Fun/Interesting
   Criteria: Refers to the process being fun, interesting, motivating.
   Sample comments: “I think I learned the concepts better because it made them more interesting.” “I think I did. The learning object grasped my attention better than a teacher talking non-stop.”

7. Learning Related
   Criteria: Refers to some aspect of the learning process.
   Sample comments: “I don’t think I learned the concept better.” “It did help me teach the concept better.”

8. Clarity
   Criteria: Refers to the clarity of the program and/or the quality of instruction.
   Sample comments: “I think it was very confusing and hard to understand.” “Yes, this helped me. It made it much clearer and was very educational.”

9. Not Good at Subject
   Criteria: Refers to personal difficulties in subject areas.
   Sample comments: “No, to be honest it bothered me. In general I don’t enjoy math and this did not help.”

10. Compare to Other Method
    Criteria: Compared to another teaching method/strategy.
    Sample comments: “Yes, because it... is better than having the teacher tell you what to do.” “Would rather learn from a book.”

11. No Reason Given
    Sample comments: “I didn’t benefit from any of it.” “Yes.”
(see Table 4). This indicates that students had relatively polar attitudes about
these categories.
Categories—Dislikes Only
A comparison of means for negative comments indicated that usefulness
(M = –1.33) remained important; however, theme and motivation (M = –1.32)
was also perceived as particularly negative. Students appeared to either like or
dislike the theme or motivating qualities of the learning object.
Correlation between Quality and Perceived Benefit Scores
Theme and motivation (r = .45; p < .01), the organization of the layout (r = .33;
p < .01), clear instructions (r = .33; p < .01), and usefulness (r = .33; p < .01)
were significantly correlated with the perceived benefit score measured by the
survey (Items 1 to 4—Appendix A).
Computer Comfort
Overview
Students in this study appeared to be relatively comfortable using computers,
with an average computer comfort score of 5.2 (SD = 1.3; scale range from 1 to 7;
Items 5 to 7—Appendix A). Sixty-six percent of the students (n = 146) were
comfortable working with computers (average score of 5 or more), while only
7% (n = 15) were uncomfortable (average score of 3 or less).
Table 4. Mean Ratings for Reasons Given for Benefits of Learning Objects (Q9)

Reason                          n     Mean    Std. Deviation
Fun/Interesting                17     1.35    0.74
Visual Learners                33     1.24    0.84
Interactive                    30     1.17    1.34
Learning Related               37     0.81    1.13
Good Review                    60     0.80    1.04
Computer Based                  5     0.20    1.40
Compare to Another Method      24     0.00    1.18
Timing                         21    –0.29    1.19
Clarity                        33    –0.55    0.00
Not good at subject             3    –1.35    0.38
Correlation with Perceived Benefit
The overall correlation between computer comfort and perceived benefit
(survey) was high and significant (r = .59; p < .001). Computer comfort was
also significantly correlated (r = .52; p < .001) with benefit ratings derived from
the qualitative data (Item 9—Appendix A).
Correlation with Quality of Learning Object
Computer comfort was significantly correlated with overall ratings of learning
object quality (r = .37; p < .001). An analysis of specific categories revealed
that computer comfort was significantly correlated with theme/motivation (r = .38;
p < .001) and clear instructions (r = .22; p < .01).
Gender
Overall
No significant differences between males and females were found with respect
to perceived benefit of the learning objects (survey or qualitative data), learning
object quality, or computer comfort.
Table 5. Mean Ratings for Evaluating Learning Object Quality

Category                n     Mean    Std. Deviation
Animations             27     0.81    0.74
Interactivity          47     0.66    0.84
Useful                 39     0.51    1.34
Assessment              9     0.44    1.13
Graphics               84     0.25    1.04
Theme/Motivation      125     0.12    1.40
Organization           34    –0.06    1.18
Learner Control        75    –0.12    1.19
Help Functions         42    –0.43    1.02
Clear Instructions    138    –0.61    0.95
Difficulty            107    –0.67    0.81
Information Correct    17    –1.00    0.00
Audio                  13    –1.15    0.38
Perceived Benefit—Specific Categories
Independent t-tests done for gender and perceived benefit category (Table 3)
revealed two significant differences. First, males (M = .97) rated the learning
object higher than females (M = .63) in terms of being a good review or
reinforcement of concepts (p < .05). Second, females (M = –.90) saw clarity of
instructions as significantly more important than males (M = .08) with respect
to whether a learning object was beneficial (p < .05).
Quality—Specific Categories
An analysis of learning quality categories (Table 2) indicated that females
(M = –.91) rated help features significantly lower than males (M = .10; t = 3.67,
p < .005). All other categories showed no significant differences.
Grade
Overview
A MANOVA with computer comfort score as a covariate was used to examine
differences among grade levels with respect to perceived benefits and quality
of learning objects. Significant differences among grades were observed for per-
ceived benefit (survey questions—p < .005; open-ended question—no significant
difference) and learning object quality (p < .001). An analysis of contrasts
revealed that grade 9 students rated the benefits of learning objects lower than
grade 12 students. Grade 9 (p < .001) and 11 (p < .005) students rated the quality
of learning objects lower than grade 12 students (see Tables 6 and 7).
Perceived Benefit Rating Categories (Q9)
A series of ANOVAs analyzing the 11 benefit categories (Table 3) showed no
significant differences among grades.
Table 6. MANOVA for Learning Object Quality, Learning Object
Benefits, and Computer Comfort for Grade

Source                  df     SS      MS       F        Post Hoc Analysis
Learning Quality (Q8)    2     7.1     3.5     10.55*    G12 > G11 and G9
Benefits (Scale)         2   282.5   141.3      6.38**   G12 > G9
Benefits (Q9)            2     3.3     1.7      1.81

*p < .001. **p < .005.
Quality of Learning Object
A series of ANOVAs was run comparing mean grade ratings of the categories
used to assess learning object quality. While a majority of the categories showed
no significant effect for grade, four areas showed significantly higher ratings by
older students: learner control, graphics, clear instructions, and theme/motivation
(see Table 8). These results are partially compromised because students from
each grade did not experience all learning objects.
Learning Object Type
Overview
A MANOVA with computer comfort score as a covariate was used to examine
differences among learning objects with respect to quality and benefits (per-
ceived benefit score and open-ended response). Significant differences among
learning objects were observed for quality (p < .001), perceived benefit (p < .001),
and benefits rated from the open-ended question (Q9—Appendix A; p < .001).
An analysis of contrasts revealed that the mathematics and physics learning
objects were rated significantly lower than the biology, chemistry, and
computer science learning objects with respect to quality and benefits (p < .001;
see Tables 9 and 10).
Learning Object Quality
A series of ANOVAs was run comparing mean learning object ratings for the
categories used to assess learning object quality (Table 2). The computer science,
chemistry, and biology learning objects had significantly higher ratings than the
math or physics learning objects with respect to learner control, clear instructions,
and theme/motivation. All other comparisons were not significant or the sample
size was too small (see Table 11). These results are partially compromised
because students from each grade did not experience all learning objects.
Table 7. Adjusted Mean Scores for Learning Object Quality and Benefits by Grade

            Quality (Q8)       Benefit (Scale)     Benefit (Q9)
            Mean      S.E.     Mean      S.E.      Mean      S.E.
Grade 9     –0.29a    0.06     17.90a    0.53      0.30a     0.11
Grade 11    –0.11a    0.07     19.33a    0.59      0.51a     0.12
Grade 12     0.20a    0.08     20.82a    0.62      0.61a     0.13

aCovariate (Computer Comfort) appearing in the model is evaluated at 15.9.
Table 8. Grade Differences Among Categories Evaluating Learning Quality

Category             Grade 9       Grade 11      Grade 12
                     Ma (SD)b      Ma (SD)       Ma (SD)      Post Hoc Comparisons
Organization          .00 (1.1)    –.29 (1.3)     .00 (1.3)   No significant difference
Learner Control      –.25 (1.2)    –.43 (1.0)     .48 (1.3)   G12 > G11, G9; p < .05
Animation            1.00 (0.0)    –.55 (0.8)    1.00 (0.7)   No significant difference
Graphics             –.18 (1.1)     .50 (0.8)     .58 (1.0)   G12, G11 > G9; p < .05
Audio                                                         Sample too small
Clear Instructions   –.77 (0.9)    –.72 (0.8)    –.25 (1.0)   G12 > G9; p < .05
Help Features        –.55 (1.0)    –.20 (1.1)    –.13 (0.9)   No significant difference
Interactivity         .79 (0.09)    .65 (0.8)     .65 (0.9)   No significant difference
Info. Correct                                                 Sample too small
Difficulty           –.55 (1.0)    –.73 (0.7)    –.79 (0.6)   No significant difference
Useful                .18 (1.5)    1.0 (1.2)      .58 (1.2)   No significant difference
Assessment                                                    Sample too small
Theme/Motivation     –.25 (1.4)     .48 (1.5)     .96 (0.9)   G12 > G9; p < .005

aMean. bStandard Deviation.
Table 9. Adjusted Mean Scores for Learning Object Quality
and Benefits by Learning Object

                    Quality (Q8)      Benefit (Scale)     Benefit (Q9)
                    Mean     S.E.     Mean      S.E.      Mean     S.E.
Biology              0.05    0.10     22.35     0.89      0.80     0.17
Chemistry            0.35    0.07     22.58     0.52      0.77     0.11
Computer Science     0.09    0.11     19.21     0.87      0.87     0.18
Math                –0.31    0.10     17.77     0.81      0.25     0.17
Physics             –0.33    0.09     17.08     0.74      0.00     0.15

aCovariate (Computer Comfort) appearing in the model is evaluated at 15.9.
Benefits of Learning Objects
Sample sizes were too small to run reliable ANOVAs comparing learning
objects with respect to benefit categories outlined in Table 3.
DISCUSSION
The purpose of this study was to systematically evaluate the use of learning
objects by secondary school students. A metric, based on well-tested principles of
instructional design, was used to examine: a) the perceived benefit of learning
objects; b) the quality of learning objects; and c) the role of individual differences.
Evaluation Tool
This study is unique in attempting to develop a reliable and valid metric to
evaluate the benefits and specific impact of a wide range of learning object
qualities. The perceived benefits scale proved to be reliable and valid. The quality
scale was also reliable, although validity was not tested. The coding scheme
(see Table 2) based on sound principles of instructional design was particularly
useful in identifying salient qualities of learning objects. Overall, the evaluation
tool used in this study provided a reasonable foundation with which to assess
the impact of learning objects. Further research, though, is needed to correlate
specific learning qualities with actual learning outcomes.
Perceived Benefit of Learning Object
To date, the use of learning objects by secondary school students has not been
formally examined. The results from this study suggest that learning objects are a
viable learning tool for this population. Two-thirds of all students felt that learning
objects were beneficial, particularly when they had a motivating theme with visual
supports and interactivity. These results are consistent with previous research on
instructional design (e.g., Akpinar & Hartley, 1996; Bagui, 1998; Druin et al.,
1999; Gadanidis et al., 2003; Hanna, Risden, Czerwinski, & Alexander, 1999;
Table 10. MANOVA for Learning Object Quality, Learning Object
Benefits, and Computer Comfort for Learning Object

Source                  df     SS      MS       F       Post Hoc Analysis
Learning Quality (Q8)    4    11.1     2.8     8.70*    Bio., Chem., Comp. Sci. > Math, Phys.
Benefits (Scale)         4   835.9   209.0    10.69*    Bio., Chem. > Math, Phys.
Benefits (Q9)            4    19.2     4.8     5.65*    Bio., Chem., Comp. Sci. > Math, Phys.

*p < .001.
Table 11. Learning Object Differences Among Categories Evaluating Learning Quality

Category             Physics       Math          Biology       Chemistry     Comp. Sci.
                     Ma (SD)b      M (SD)        M (SD)        M (SD)        M (SD)       Post Hoc Comparisons
Organization                                                                              Sample too small
Learner Control      –.43 (1.0)    –.29 (1.2)    –.48 (1.0)    1.17 (0.8)    –.13 (1.2)   Chem. > Bio. & Math; p < .05
Animation                                                                                 Sample too small
Graphics              .60 (0.8)    –.24 (1.1)     .45 (0.9)     .29 (1.3)     .62 (1.0)   No significant difference
Audio                                                                                     Sample too small
Clear Instructions   –.91 (0.5)    –.77 (1.0)     .57 (1.1)    –.05 (1.1)    –.62 (0.8)   Bio. > Math & Phys.; p < .005; Chem. > Math & Phys.; p < .05
Help Features                                                                             Sample too small
Interactivity         .82 (0.6)     .57 (1.1)     .78 (0.7)     .56 (0.9)     .55 (1.0)   No significant difference
Info. Correct                                                                             Sample too small
Difficulty          –1.06 (0.2)    –.54 (1.0)    –.67 (0.8)    –1.0 (0.0)    –.56 (0.8)   No significant difference
Useful                                                                                    Sample too small
Assessment                                                                                Sample too small
Theme/Motivate        .10 (1.4)    –.24 (1.4)     .15 (1.3)     .50 (1.4)    1.2 (0.9)    Comp. Sci. > Math; p < .005

aMean. bStandard Deviation.
Harp & Mayer, 1998; Kennedy & McNaught, 1997; Oren, 1990; Sedig & Liang,
2006; Stoney & Wild, 1998). While student ratings might be considered nebulous,
the teachers who used the learning objects as an educational aid in this study were
very positive regarding the benefits of learning objects, and most would use
them again.
Quality of Learning Object
The results from this article suggest that four of the 13 learning quality cate-
gories (Table 2) are particularly important in terms of learning object quality
and benefit (usefulness, clear instructions, organization/layout, and theme/
motivation). If the learning object provides clear instructions, is well organized,
motivating, and perceived as being useful, secondary school students are more
likely to feel they have benefitted from the experience. These results match the
qualitative feedback reported by Cochrane (2005) and MacDonald et al. (2005)
for higher education students.
Computer Comfort
Previous research on computer-related behaviors (e.g., Barbeite & Weiss, 2004;
Durndell & Haag, 2002; Shapka & Ferrari, 2003; Solvberg, 2002) predicted that
computer self-efficacy would play a significant role in the use of the learning
objects evaluated in this study. This prediction was supported: computer comfort
level was strongly correlated with perceived benefits and quality of the learning
object. It appears that students who did not feel comfortable using computers did
not work well with the learning objects. Educators need to be aware of this bias;
however, two-thirds of the 221 secondary school students in this study were
comfortable with computers. Overall, the secondary school population seems
predisposed to using computer instruction.
Gender
Twenty years of persistent bias in computer-related behavior in favor of males
(e.g., Kay, 1992, 1993a, 2007; Whitley, 1997) would have led one to expect a
similar bias with learning objects. However, there were very few gender differ-
ences for perceived quality and benefits of the learning objects used. Females
were more critical of the help features and rated clarity of instructions as more
important than males. Shapka and Ferrari (2003) reported female preservice
teachers were more likely to use help features than males. One could speculate
that this pattern was mirrored by secondary school students. Female students
may have relied on help more than males and therefore they were more critical.
More in-depth research needs to be done in this area.
Grade
No previous research has been done comparing grade levels and use of learning
objects. Results from this study suggest that younger students rated learning
objects as being poorer in quality and less beneficial than older students. This
result is partially confounded by the fact that different age groups used different
learning objects. For example, grade 9 students used the mathematics learning
objects exclusively, and these objects were rated lower than some of the learning
objects used by grade 11 and 12 students. It is conceivable that younger age groups
are more oriented toward gaming and might therefore be more critical of relatively
simple learning objects. It is also possible that younger students see working with
computers as a game-like activity, whereas older students may be more focused
on their learning. Further research needs to be done in this area.
Learning
The underlying theme in this article is a clear emphasis on the cognitive aspects
of learning objects. It is strongly felt that the ultimate goal of any learning object
is to help a student learn. This question was examined indirectly by assessing the
perceived quality and benefits of learning objects.
The question “Are learning objects beneficial to secondary school students?”
is too simplistic given the results of this study. A better question is “Under what
conditions do learning objects provide the most benefit to secondary school
students?” The evidence suggests that students will benefit more if they are
comfortable with computers, the learning object has a well organized layout, is
interactive, visual representations are provided that make abstract concepts more
concrete, instructions are clear, and the theme is fun or motivating.
Overall, secondary school students appear to be relatively receptive to using
learning objects. While almost 60% of the students were critical about one or more
learning object features, roughly two-thirds of all students perceived learning
objects as beneficial because they were fun, interactive, visual, and helped them
learn. Students who did not benefit felt that the learning objects were presented
at the wrong time (e.g., after they had already learned the concept) or that the
instructions were not clear enough. Interestingly, these reasons, both positive and
negative, are learning focused. While reusability, accessibility, and adaptability
are given heavy emphasis in the learning object literature (Appendix C), when
it comes to the end user, learning features appear to be more important.
Caveats
This study was the first attempt to systematically evaluate learning object
qualities in the eyes of secondary school students. While the study produced useful
information for educators and researchers, there are at least four key areas that
could be improved in future research. First, a set of pre- and post-test content
questions is important to assess whether any learning actually occurred. Second, a
more systematic survey requiring students to rate all quality and benefit categories
(Tables 2 and 3) would help to provide more comprehensive assessment data.
Third, details about how each learning object is used are necessary to open up a
meaningful dialogue on the kinds of instructional wraps that are effective. Finally,
a more detailed assessment of computer ability, attitudes, experience, and learning
styles might provide insights about the effect of individual differences on the
use of learning objects.
Future Research
There are numerous opportunities for future research based on a clear absence
of empirical investigation in the learning object literature and the results observed
in this study. These include:
further research in secondary schools and new research in elementary schools;
further research in individual differences (gender, learning style, computer
ability);
exploration on the role of context, motivation, and theme in determining the
impact of learning objects;
an evaluation of education applets that match many, if not all, of the qualities
of learning objects;
examining the role of instructional wrap or guidance on the effectiveness of
learning objects; and
developing a more detailed and comprehensive evaluation scheme for assess-
ing learning object effectiveness.
Summary
The purpose of this study was to evaluate the use of learning objects by
secondary school students. Clear emphasis was placed on a learning-focused
definition of learning objects, a comprehensive evaluation metric examining
the quality and benefits of learning objects, and individual differences in use
(computer comfort, gender, grade level, and learning object type). The evaluation
metric used was theoretically sound, reliable, and partially validated. Overall,
two-thirds of the students stated they benefitted from using the learning object.
Students benefitted more if they were comfortable with computers, the learning
object had a well organized layout, instructions were clear, and the theme was fun
or motivating. Students appreciated the motivating, hands-on, and visual qualities
of the learning objects most. Computer comfort was significantly correlated with
learning object quality and benefit. Younger students appeared to have less
positive experiences than their older counterparts. There were no gender differences
in perceived quality or benefit of learning objects, with one exception. Females
emphasized the quality of help features significantly more than males.
APPENDIX A
Learning Object Survey for Students

Items 1 to 7 are rated on a seven-point scale: 1 = Strongly Disagree,
2 = Disagree, 3 = Slightly Disagree, 4 = Neutral, 5 = Slightly Agree,
6 = Agree, 7 = Strongly Agree.

1. The learning object has some benefit in terms of providing me with another
   learning strategy/another tool.
2. I feel the learning object did benefit my understanding of the subject matter's
   concept/principle.
3. I did not benefit from using the learning object.
4. I am interested in using the learning object again.
5. I enjoy working on the computer to learn.
6. I find computer graphics and interactive programs help me learn.
7. I benefit from learning through interactive & engaging activities.

8. You used a digital learning object on the computer. Tell me about this experience
   when you used the object.
   a) What did you like? (found helpful, liked working with, what worked well for you)
   b) What didn't you like? (found confusing, or didn't like, or didn't understand)

9. Do you think you benefitted from using this particular learning object? Do you think
   you learned the concept better? Do you think it helped you review a concept you just
   learned? Why? Why not?
APPENDIX B
Learning Object Survey for Teachers

Items are rated on a seven-point scale: 1 = Strongly Disagree, 2 = Disagree,
3 = Slightly Disagree, 4 = Neutral, 5 = Slightly Agree, 6 = Agree,
7 = Strongly Agree.

1. The learning object has benefit in terms of providing students with another
   learning strategy in my classroom.
2. The learning object did benefit my students in terms of their understanding
   of the concept/principle explored in the learning object.
3. I would be interested in using the learning object again in my class.
4. There would have been more success with the learning object had it been
   implemented during the proper time within the unit.
5. Students were interested in using the learning object again.
APPENDIX C
Key Focus of Learning Object Papers Reviewed in the Study

Papers were classified by key focus area (Definition, Design, Develop, Metadata,
Reusable, Repository, Standards, Learning, Evaluation, Use). The papers
reviewed were:

Adams et al., 2004; Agostinho et al., 2004; Atif et al., 2003; Bartz, 2002;
Baruque & Melo, 2004; Bennett & McGee, 2005; Boyle, 2003; Bradley & Boyle,
2004; Butson, 2003; Buzza et al., 2004; Carey et al., 2004; Christiansen &
Anderson, 2004; Cochrane, 2005; COHERE Group, 2002; Collis & Strijker, 2003;
Downes, 2001; Downes, 2003; Duval et al., 2004; Fiaidhi & Mohammed, 2004;
Friesen, 2001; Gadanidis et al., 2003; Gibbons et al., 2000; Hamel & Ryan-Jones,
2002; Harman & Koohang, 2005; Jaakkola & Nurmi, 2004; Jonassen & Churchill,
2004; Kenny et al., 1999; Koppi et al., 2005; Krauss & Ally, 2005; Littlejohn,
2003; MacDonald et al., 2005; Maclaren, 2004; McGreal, 2004; McGreal et al.,
2004; Metros, 2005; Morris, 2005; Muzio et al., 2002; Nesbit et al., 2002; Orill,
2000; Paquette & Rosca, 2002; Parrish, 2004; Petrinjak & Graham, 2004; Polsani,
2003; Poupa & Forte, 2003; Rehak & Mason, 2003; Richards, 2002; Richards
et al., 2002; Schell & Burns, 2002; Sedig & Liang, 2006; Seki et al., 2005;
Siquerira et al., 2004; Sloep, 2003; Valderrama et al., 2005; Van Zele et al., 2003;
Wiley et al., 2004; Wiley, 2000; Wilhelm & Wilde, 2005; Williams, 2000.
REFERENCES
Acovelli, M., & Gamble, M. (1997). A coaching agent for learners using multimedia
simulations. Educational Technology, 37(2), 44-49.
Adams, A., Lubega, J., Walmsley, S., & Williams, S. (2004). The effectiveness of assess-
ment learning objects produced using pair programming. Electronic Journal of
e-Learning, 2(2). Retrieved July 28, 2005 from
http://www.ejel.org/volume-2/vol2-issue2/v2-i2-artl-adams.pdf.
Agostinho, S., Bennett, S., Lockyer, L., & Harper, B. (2004). Developing a learning
object metadata application profile based on LOM suitable for the Australian higher
education market. Australasian Journal of Educational Technology, 20(2), 191-208.
Akpinar, Y., & Hartley, J. R. (1996). Designing interactive learning environments. Journal
of Computer Assisted Learning, 12(1), 33-46.
Albanese, M. A., & Mitchell, S. A. (1993). Problem-based learning: A review of the
literature on its outcomes and implementation issues. Academic Medicine, 68, 52-81.
Atif, Y., Benlarmi, R., & Berri, J. (2003). Learning objects based framework for self-
adaptive learning. Education and Information Technologies, 8(4), 345-368.
Atkins, M. J. (1993). Theories of learning and multimedia applications: An overview.
Research Papers in Education, 8(2), 251-271.
Bagui, S. (1998). Reasons for increased learning using multimedia. Journal of Educational
Multimedia and Hypermedia, 7(1), 3-18.
Barbeite, F. G., & Weiss, E. M. (2004). Computer self-efficacy and anxiety scales for an
Internet sample: Testing measurement equivalence of existing measures and develop-
ment of new scales. Computers in Human Behavior, 20(1), 1-15.
Bartlett, A. (2002). Preparing preservice teachers to implement performance assessment
and technology through electronic portfolios. Action in Teacher Education, 24(1),
90-97.
Bartz, J. (2002). Great idea, but how do I do it? A practical example of learning object
creation using SGML/XML. Canadian Journal of Learning and Technology, 28(3).
Retrieved July 1, 2005 from http://www.cjlt.ca/content/vol28.3/bartz.html.
Baruque, L. B., & Melo, R. N. (2004). Learning theory and instructional design using
learning objects. Journal of Educational Multimedia and Hypermedia, 13(4), 343-370.
Bennett, & McGee. (2005). Transformative power of the learning object debate. Open
Learning, 20(1), 15-30.
Boyle, T. (2003). Design principles for authoring dynamic, reusable learning objects.
Australian Journal of Educational Technology, 19(1), 46-58.
Bradley, C., & Boyle, T. (2004). The design, development, and use of multimedia learning
objects. Journal of Educational Multimedia and Hypermedia, 13(4), 371-389.
Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of
learning. Educational Researcher, 18(1), 32-42.
Bruner, J. (1983). Child’s talk. Learning to use language. Toronto, Canada: George J.
McLeod Ltd.
Bruner, J. (1986). Actual minds, possible worlds. Cambridge, MA: Harvard University
Press.
Brush, T., Glazewski, K., Rutowski, K., Berg, K., Stromfors, C., Van-Nest, M., Stock, L., &
Sutton, J. (2003). Integrating technology in a field-based teacher training program:
The PT3@ASU project. Educational Technology, Research and Development, 51(1),
57-73.
Bullock, D. (2004). Moving from theory to practice: An examination of the factors
that preservice teachers encounter as they attempt to gain experience teaching with
technology during field placement experiences. Journal of Technology and Teacher
Education, 12(2), 211-237.
Butson, R. (2003). Learning objects: Weapons of mass instruction. British Journal of
Educational Technology, 34(5), 667-669.
Buzza, D. C., Bean, D., Harrigan, K., & Carey, T. (2004). Learning design repositories:
Adapting learning design specifications for shared instructional knowledge. Canadian
Journal of Learning and Technology, 30(3), 79-101.
Calvi, L. (1997). Navigation and disorientation: A case study. Journal of Educational
Multimedia and Hypermedia, 6(3/4), 305-320.
Carey, T., Swallow, J., & Oldfield, W. (2002). Educational rationale metadata for learning
objects. Canadian Journal of Learning and Technology, 28(3). Retrieved July 1, 2005
from http://www.cjlt.ca/content/vol28.3/carey_etal.html.
Carroll, J. B. (1990). The Nurnberg funnel. Cambridge, MA: MIT Press.
Carroll, J. M., & Mack, R. L. (1984). Learning to use a word processor: By doing, by
thinking, and by knowing. In J. C. Thomas & M. Schneider (Eds.), Human factors in
computer systems. Norwood, NJ: Ablex.
Christensen, R. W., & Knezek, G. A. (2000). Internal consistency reliabilities for 14
computer attitude scales. Journal of Technology and Teacher Education, 8(4),
327-336.
Christiansen, J., & Anderson, T. (2004). Feasibility of course development based on
learning objects: Research analysis of three case studies. International Journal of
Instructional Technology and Distance Learning, 1(3). Retrieved July 30, 2005 from
http://www.itdl.org/Journal/Mar_04/article02.htm.
Cochrane, T. (2005). Interactive QuickTime: Developing and evaluating multimedia
learning objects to enhance both face-to-face and distance e-learning environments.
Interdisciplinary Journal of Knowledge and Learning Objects, 1. Retrieved August 3,
2005 from http://ijklo.org/Volume1/v1p033-054Cochrane.pdf.
COHERE Group. (2002). The learning object economy: Implications for developing
faculty expertise. Canadian Journal of Learning and Technology, 28(3). Retrieved July
1, 2005 from http://www.cjlt.ca/content/vol28.3/cohere.html.
Collins, A., Brown, J. S., & Newman, S. E. (1989). Cognitive apprenticeship: Teaching the
crafts of reading, writing, and mathematics. In L. B. Resnick (Ed.), Knowing, learning,
and instruction (pp. 453-494). Hillsdale, NJ: Erlbaum Associates.
Collis, B., & Strijker, A. (2003). Re-usable learning objects in context. International
Journal of E-Learning, 2(4), 5-16.
Compton, V., & Harwood, C. (2003). Enhancing technological practice: An assessment
framework for technology education in New Zealand. International Journal of
Technology and Design Education, 13(1), 1-26.
Cuban, L. (2001). Oversold and underused: Computers in the classroom. Cambridge, MA:
Harvard University Press.
Doering, A., Hughes, J., & Huffman, D. (2003). Preservice teachers: Are we thinking
with technology? Journal of Research on Technology in Education, 35(3), 342-361.
Downes, S. (2001). Learning objects: Resources for distance education worldwide.
International Review of Research in Open and Distance Learning. Retrieved July 1,
2005 from http://www.irrodl.org/content/v2.1/downes.html.
Downes, S. (2003). Design and reusability of learning objects in an academic context: A
new economy of education? USDLA Journal, 17(1). Retrieved July 1, 2005 from
http://www.usdla.org/html/journal/JAN03_Issue/article01.html.
Druin, A., Bederson, B., Boltman, A., Miura, A., Knotts-Callahan, D., & Platt, M. (1999).
The Design of Children’s Technology. San Francisco: Morgan Kaufmann Publishers,
Inc.
Durndell, A., & Haag, Z. (2002). Computer self efficacy, computer anxiety, attitude
towards the Internet and reported experience with the Internet, by gender, in an east
European sample. Computers in Human Behavior, 18(5), 521-535.
Duval, E., Hodgins, W., Rehak, D., & Robson, R. (2004). Learning objects symposium
special issue guest editorial. Journal of Educational Multimedia and Hypermedia,
13(4), 331-342.
Eifler, K., Greene, T., & Carroll, J. (2001). Walking the talk is tough: From a single
technology course to infusion. The Educational Forum, 65(4), 366-375.
Fiaidhi, J., & Mohammed, S. (2004). Design issues involved in using learning objects
for teaching programming language within a collaborative eLearning environment.
International Journal of Instructional Technology and Distance Learning, 1(3).
Retrieved August 3, 2005 from http://www.itdl.org/Journal/Mar_04/article03.htm.
Friesen, N. (2001). What are educational objects? Interactive Learning Environments,
9(3).
Gadanidis, G., Gadanidis, J., & Schindler, K. (2003). Factors mediating the use of online
applets in the lesson planning of pre-service mathematics teachers. Journal of
Computers in Mathematics and Science Teaching, 22(4), 323-344.
Gibbons, A. S., Nelson, J., & Richards, R. (2000). The nature and origin of instructional
objects. In D. A. Wiley (Ed.), The Instructional Use of Learning Objects: Online
Version. Retrieved July 1, 2005 from http://reusability.org/read/chapters/gibbons.doc.
Hamel, C. J., & Ryan-Jones, D. (2002). Designing instruction with learning objects.
International Journal of Educational Technology, 3(1). Retrieved June 1, 2005 from
http://www.ao.uiuc.edu/ijet/v3n1/hamel/.
Hanna, L., Risden, K., Czerwinski, M., & Alexander, K. J. (1999). The role of usability in
designing children’s computer products. In A. Druin (Ed.), The design of children’s
technology. San Francisco: Morgan Kaufmann Publishers, Inc.
Harman, K., & Koohang, A. (2005). Discussion board: A learning object. Interdisciplinary
Journal of Knowledge and Learning Objects, 1. Retrieved August 5, 2005 from
http://ijklo.org/Volume1/v1p067-077Harman.pdf.
Harp, S. F., & Mayer, R. E. (1998). How seductive details do their damage: A theory of
cognitive interest in science learning. Journal of Educational Psychology, 90(3),
414-434.
Jaakkola, T., & Nurmi, S. (2004). Learning objects—A lot of smoke but is there a fire?
Academic impact of using learning objects in different pedagogical settings. Turku:
University of Turku. Retrieved July 25, 2005 from
http://users.utu.fi/samnurm/Final_report_on_celebrate_experimental_studies.pdf.
Jonassen, D., & Churchill, D. (2004). Is there a learning orientation in learning objects?
International Journal on E-Learning, 3(2), 32-41.
Jones, M. G., Farquhar, J. D., & Surry, D. W. (1995). Using metacognitive theories to
design user interfaces for computer-based learning. Educational Technology, 35(4),
12-22.
Kay, R. H. (1989). A practical and theoretical approach to assessing computer atti-
tudes: The computer attitude measure (CAM). Journal of Research on Computing in
Education, 21(4), 456-463.
Kay, R. H. (1992). An analysis of methods used to examine gender differences in
computer-related behaviour. Journal of Educational Computing Research, 8(3),
323-336.
Kay, R. H. (1993a). A critical evaluation of gender differences in computer-related
behaviour. Computers in the Schools, 9(4), 81-93.
Kay, R. H. (1993b). An exploration of theoretical and practical foundations for assessing
attitudes toward computers: The computer attitude measure (CAM). Computers in
Human Behavior, 9, 371-386.
Kay, R. H. (2006). Addressing gender differences in computer ability, attitudes and use:
The laptop effect. Journal of Educational Computing Research, 34(2), 187-211.
Kay, R. H., & Knaack, L. (2005). Developing learning objects for secondary school
students: A multi-component model. Interdisciplinary Journal of Knowledge and
Learning Objects, 1, 229-254. Retrieved December 1, 2005 from
http://ijklo.org/Volume1/v1p229-254Kay_Knaack.pdf.
Kay, R. H., & Knaack, L. (2007). Evaluating the learning in learning objects. Open
Learning, 22(1), 5-28.
Kennedy, D. M., & McNaught, C. (1997). Design elements for interactive multimedia.
Australian Journal of Educational Technology, 13(1), 1-22.
Kenny, R. F., Andrews, B. W., Vignola, M. V., Schilz, M. A., & Covert, J. (1999). Towards
guidelines for the design of interactive multimedia instruction: Fostering the reflec-
tive decision-making of preservice teachers. Journal of Technology and Teacher
Education, 7(1), 13-31.
Klawe, M. M. (1999). Computer games, education and interfaces: The E-GEMS Project.
Retrieved January 15, 2000 from http://taz.cs.ubc.ca/egems/home.html.
Koehler, M. J., & Lehrer, R. (1998). Designing a hypermedia tool for learning about
children’s mathematical cognition. Journal of Educational Computing Research,
13(2), 123-145.
Koppi, T., Bogle, L., & Lavitt, N. (2005). Institutional use of learning objects: Lessons
learned and future directions. Journal of Educational Multimedia and Hypermedia,
13(4), 449-463.
Kramarski, B., & Zeichner, O. (2001). Using technology to enhance mathematical reasoning:
Effects of feedback and self-regulation learning. Educational Media International,
38(2/3).
Krauss, F., & Ally, M. (2005). A study of the design and evaluation of a learning object
and implications for content development. Interdisciplinary Journal of Knowledge
and Learning Objects, 1. Retrieved August 4, 2005 from
http://ijklo.org/Volume1/v1p001-022Krauss.pdf.
Lampert, M. (1986). Teaching multiplication. Journal of Mathematical Behavior, 5,
241-280.
Larkin, J. H. (1989). What kind of knowledge transfers? In L. B. Resnick (Ed.), Knowing,
learning and instruction (pp. 283-305). Hillsdale, NJ: Erlbaum Associates.
Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation.
New York: Cambridge University Press.
Littlejohn, A. (2003). Issues in reusing online resources. Journal of Interactive Media
in Education, 1, Special issue on reusing online resources. Retrieved July 1, 2005
from http://www-jime.open.ac.uk/2003/1/.
Liu, L., Maddux, C., & Johnson, L. (2004). Computer attitude and achievement: Is time
an intermediate variable? Journal of Technology and Teacher Education, 12(4),
593-607.
Lorch, R. F. (1989). Text-signaling devices and their effects on reading and memory
processes. Educational Psychology Review, 1(3), 209-234.
MacDonald, C. J., Stodel, E., Thompson, T. L., Muirhead, B., Hinton, C., & Carson, B.
(2005). Addressing the eLearning contradiction: A collaborative approach for
developing a conceptual framework learning object. Interdisciplinary Journal of
Knowledge and Learning Objects, 1. Retrieved August 2, 2005 from
http://ijklo.org/Volume1/v1p079-098McDonald.pdf.
Maclaren, I. (2004). New trends in Web-based learning: Objects, repositories and learner
engagement. European Journal of Engineering Education, 29(1), 65-71.
Madhumita, & Kumar, K. L. (1995). Twenty-one guidelines for effective instructional design.
Educational Technology, 35(3), 58-61.
McGreal, R. (2004). Learning objects: A practical definition. International Journal of
Instructional Technology and Distance Learning, 1(9). Retrieved August 5, 2005 from
http://www.itdl.org/Journal/Sep_04/article02.htm.
McGreal, R., Anderson, T., Babin, G., Downes, S., Friesen, N., & Harrigan, K. (2004).
EduSource: Canada’s learning object repository network. International Journal of
Instructional Technology and Distance Learning, 1(3). Retrieved July 24, 2005 from
http://www.itdl.org/Journal/Mar_04/article01.htm.
McRobbie, C. J., Ginns, I. S., & Stein, S. J. (2000). Preservice primary teachers’ thinking
about technology and technology education. International Journal of Technology
and Design Education, 10, 81-101.
Metros, S. E. (2005). Visualizing knowledge in new educational environments: A course
on learning objects. Open Learning, 20(1), 93-102.
Morris, E. (2005). Object oriented learning objects. Australasian Journal of Educational
Technology, 21(1), 40-59.
Muzio, J. A., Heins, T., & Mundell, R. (2002). Experiences with reusable e-learning objects
from theory to practice. Internet and Higher Education, 2002(5), 21-34.
Nesbit, J., Belfer, K., & Vargo, J. (2002). A convergent participation model for evaluation
of learning objects. Canadian Journal of Learning and Technology, 28(3). Retrieved
July 1, 2005 from http://www.cjlt.ca/content/vol28.3/nesbit_etal.html.
Oren, T. (1990). Cognitive load in hypermedia: Designing for the exploratory learner. In
S. Ambron & K. Hooper (Eds.), Learning with interactive multimedia (pp. 126-136).
Washington: Microsoft Press.
Orrill, C. H. (2000). Learning objects to support inquiry-based online learning. In
D. A. Wiley (Ed.), The instructional use of learning objects: Online version. Retrieved
July 1, 2005 from http://reusability.org/read/chapters/orrill.doc.
Paquette, G., & Rosca, I. (2002). Organic aggregation of knowledge object in educational
systems. Canadian Journal of Learning and Technology, 28(3). Retrieved July 1, 2005
from http://www.cjlt.ca/content/vol28.3/paquette_rosca.html.
Parrish, P. E. (2004). The trouble with learning objects. Educational Technology Research
& Development, 52(1), 49-67.
Petrinjak, A., & Graham, R. (2004). Creating learning objects from pre-authored course
materials: Semantic structure of learning objects—Design and technology. Canadian
Journal of Learning and Technology, 30(3). Retrieved July 1, 2005 from
http://www.cjlt.ca/content/vol30.3/petrinjak.html.
Plante, J., & Beattie, D. (2004). Education, skills and learning—Research papers connec-
tivity and ICT integration in Canadian elementary and secondary schools: First results
from the information and communications technologies in schools survey, 2003-2004.
Statistics Canada. Retrieved August 29, 2004 from
http://www.schoolnet.ca/home/documents/Report_EN.pdf.
Polsani, P. R. (2003). Use and abuse of reusable learning objects. Journal of Digital
Information, 3(4). Retrieved July 1, 2005 from
http://jodi.ecs.soton.ac.uk/Articles/v03/i04/Polsani/.
Poupa, C., & Forte, E. (2003). Collaborative teaching with learning objects in an inter-
national, non-profit context. The example of the ARIADNE community. Educational
Media International, 3-4, 239-248.
Rehak, D., & Mason, R. (2003). Chapter 3: Keeping the learning in learning objects.
Journal of Interactive Media in Education, 2003(1). Retrieved July 1, 2005 from
http://www-jime.open.ac.uk/2003/1/reuse-05.html.
Richards, G. (2002). Editorial: The challenges of learning object paradigm. Canadian
Journal of Learning and Technology, 28(3). Retrieved July 1, 2005 from
http://www.cjlt.ca/content/vol28.3/editorial.html.
Richards, G., McGreal, R., Hatala, M., & Friesen, N. (2002). The evolution of learning object
repository technologies: Portals for on-line objects for learning. Journal of Distance
Education, 17(3), 67-79.
Robertson, H. (2003). Toward a theory of negativity: Teacher education and information
and communications technology. Journal of Teacher Education, 54(4), 280-296.
Russell, M., Bebell, D., O’Dwyer, L., & O’Connor, K. (2003). Examining teacher technology
use: Implications for preservice and inservice teacher preparation. Journal of
Teacher Education, 54(4), 297-310.
Savery, J. R., & Duffy, T. M. (1995). Problem-based learning: An instructional model
and its constructivist framework. Educational Technology, 35(5), 31-34.
Schell, G. P., & Burns, M. (2002). A repository of e-learning objects for higher education.
e-Service Journal, 1(2), 53-64.
Sedig, K., & Liang, H. (2006). Interactivity of visual mathematical representations:
Factors affecting learning and cognitive processes. Journal of Interactive Learning
Research, 17(2), 179-212.
Sedighian, K. (1998). Interface style, flow, and reflective cognition: Issues in designing
interactive multimedia mathematics learning environments for children. Unpublished
Doctor of Philosophy dissertation, University of British Columbia, Vancouver.
Seki, K., Matsui, T., & Okamoto, T. (2005). An adaptive sequencing method of the learning
objects for the e-learning environment. Electronics and Communication in Japan,
85(3), 330-344.
Shapka, J. D., & Ferrari, M. (2003). Computer-related attitudes and actions of teacher
candidates. Computers in Human Behavior, 19(3), 319-334.
Siqueira, S. W. M., Melo, R. N., & Braz, M. H. L. B. (2004). Increasing the semantics of
learning objects. International Journal of Computer Processing of Oriental Languages,
17(1), 27-39.
Sloep, P. (2003). Commentary on Diana Laurillard and Patrick McAndrew, Reusable
educational software: A basis for generic e-learning tasks. In A. Littlejohn (Ed.),
Reusing online resources: A sustainable approach to e-learning (Chapter 7).
London: Kogan Page.
Solvberg, A. (2002). Gender differences in computer-related control beliefs and
home computer use. Scandinavian Journal of Educational Research, 46(4),
410-426.
Sternberg, R. J. (1989). Domain-generality versus domain-specificity: The life and
impending death of a false dichotomy. Merrill-Palmer Quarterly, 35(1), 115-130.
Stoney, S., & Wild, M. (1998). Motivation and interface design: Maximizing learning
opportunities. Journal of Computer Assisted Learning, 14(1), 40-50.
Strudler, N., Archambault, L., Bendixen, L., Anderson, D., & Weiss, R. (2003). Project
THREAD: Technology helping restructure educational access and delivery. Educa-
tional Technology Research and Development, 51(1), 39-54.
Thompson, A. D., Schmidt, D. A., & Davis, N. E. (2003). Technology collaboratives
for simultaneous renewal in teacher education. Educational Technology Research
and Development, 51(1), 73-89.
Torkzadeh, R., Pflughoeft, K., & Hall, L. (1999). Computer self-efficacy, training
effectiveness and user attitudes: An empirical study. Behaviour and Information
Technology, 18(4), 299-309.
U.S. Department of Education, National Center for Education Statistics. (2002) Internet
access in U.S. public schools and classrooms: 1994-2002. Retrieved August 30, 2004
from http://nces.ed.gov/programs/digest/d02/tables/dt419.asp.
Valderrama, R. P., Ocana, L. B., & Sheremetov, L. B. (2005). Development of intelligent
reusable learning objects for Web-based education systems. Expert Systems with
Applications, 28(2), 273-283.
Van Zele, E., Vandaele, P., Botteldooren, D., & Lenaerts, J. (2003). Implementation
and evaluation of a course concept based on reusable learning objects. Journal of
Educational Computing Research, 28(4), 355-372.
Vygotsky, L. S. (1978). Mind in society. Cambridge, MA: Harvard University Press.
Waxman, H. C., Connell, M. L., & Gray, J. (2002). A quantitative synthesis of recent
research on the effects of teaching and learning with technology on student outcomes.
Naperville, IL: North Central Regional Laboratory.
Wepner, S. B., Ziomek, N., & Tao, L. (2003). Three teacher educators’ perspectives
about the shifting responsibilities of infusing technology into the curriculum. Action
in Teacher Education, 24(4), 53-63.
Whitley, B. E., Jr. (1997). Gender differences in computer-related attitudes and behaviors:
A meta-analysis. Computers in Human Behavior, 13(1), 1-22.
Wiest, L. R. (2001). The role of computers in mathematics teaching and learning.
Computers in the Schools, 17(1/2), 41-55.
Wiley, D. A. (2000). Connecting learning objects to instructional design theory: A defini-
tion, a metaphor, and a taxonomy. In D. A. Wiley (Ed.), The instructional use of
learning objects: Online version. Retrieved July 1, 2005 from
http://reusability.org/read/chapters/wiley.doc.
Wiley, D., Waters, S., Dawson, D., Lambert, B., Barclay, M., & Wade, D. (2004).
Overcoming the limitations of learning objects. Journal of Educational Multimedia
and Hypermedia, 13(4), 507-521.
Wilhelm, P., & Wilde, R. (2005). Developing a university course for online delivery based
on learning objects: From ideals to compromises. Open Learning, 20(1), 65-81.
Williams, D. D. (2000). Evaluation of learning objects and instruction using learning
objects. In D. A. Wiley (Ed.), The instructional use of learning objects: Online version.
Retrieved July 1, 2005 from http://reusability.org/read/chapters/williams.doc.
Zammit, K. (2000). Computer icons: A picture says a thousand words. Or does it? Journal
of Educational Computing Research, 23(2), 217-231.
Direct reprint requests to:
Robin Kay
University of Ontario Institute of Technology
2000 Simcoe Street North
Oshawa, Ontario, Canada L1H 7K4
e-mail: robin.kay@uoit.ca