Learning With Technology: The Impact of Laptop Use on Student Achievement
The Journal of Technology, Learning, and Assessment
Volume 3, Number 2 · January 2005
A publication of the Technology and Assessment Study Collaborative
Caroline A. & Peter S. Lynch School of Education, Boston College
www.jtla.org
James Cengiz Gulek & Hakan Demirtas
Learning With Technology: e Impact of Laptop Use on Student
Achievement
James Cengiz Gulek and Hakan Demirtas
Editor: Michael Russell
russelmh@bc.edu
Technology and Assessment Study Collaborative
Lynch School of Education, Boston College
Chestnut Hill, MA 02467
Copy Editor: Rachel Kay
Design and Layout: omas Hoffmann
JTLA is a free on-line journal, published by the Technology and Assessment Study
Collaborative, Caroline A. & Peter S. Lynch School of Education, Boston College.
Copyright ©2005 by the Journal of Technology, Learning, and Assessment
(ISSN 1540-2525).
Permission is hereby granted to copy any article provided that the Journal of Technology,
Learning, and Assessment is credited and copies are not sold.
Preferred citation:
Gulek, J. C., & Demirtas, H. (2005). Learning with technology: The impact of laptop use on student achievement. Journal of Technology, Learning, and Assessment, 3(2). Available from http://www.jtla.org
Abstract:
Rapid technological advances in the last decade have sparked educational practitioners' interest in utilizing laptops as an instructional tool to improve student learning. There is substantial evidence that using technology as an instructional tool enhances student learning and educational outcomes. Past research suggests that compared to their non-laptop counterparts, students in classrooms that provide all students with their own laptops spend more time involved in collaborative work, participate in more project-based instruction, produce writing of higher quality and greater length, gain increased access to information, improve research analysis skills, and spend more time doing homework on computers. Research has also shown that these students direct their own learning, report a greater reliance on active learning strategies, readily engage in problem solving and critical thinking, and consistently show deeper and more flexible uses of technology than students without individual laptops. The study presented here examined the impact of participation in a laptop program on student achievement. A total of 259 middle school students were followed via cohorts. The data collection measures included students' overall cumulative grade point averages (GPAs), end-of-course grades, writing test scores, and state-mandated norm- and criterion-referenced standardized test scores. The baseline data for all measures showed no statistically significant difference in English language arts, mathematics, writing, or overall grade point average between laptop and non-laptop students prior to enrollment in the program. However, laptop students showed significantly higher achievement on nearly all measures after one year in the program. Cross-sectional analyses in Year 2 and Year 3 concurred with the Year 1 results. Longitudinal analysis independently confirmed the substantial impact of laptop use on student learning outcomes.
Learning With Technology: e Impact
of Laptop Use on Student Achievement
Introduction
Technological advances, such as more powerful personal computers, directly affect the way people live in this information age. In their analysis of Fifty Trends Now Changing the World, Cetron and Davies (2001) noted that technology is increasingly dominating both the economy and society. Schools are no exception. The Digest of Education Statistics (National Center for Education Statistics, 2000) reports that the percentage of students using computers at school more than doubled between 1984 and 1997. Similarly, Education Week notes that the United States, along with Australia, leads the world in the number of students per computer, with a ratio of five to one in 2003 (Technology Counts, 2004). Education Week (Technology Counts, 2004) also notes that 98 percent of the nation's schools have Internet access and that more than 12 percent of the nation's schools have used laptops as an instructional tool. In addition, 38 states' standards for teacher certification include technology, 15 states require technology training or coursework for an initial teacher license, and 9 states require a technology test for an initial teacher license. There is no doubt that educational leaders are increasingly looking for ways to increase instructional uses of technology.
Over the past decade, rapid technological advances have sparked interest in utilizing laptops as an instructional tool to improve student learning. According to The New Lab for Teaching and Learning's Dalton Council Task Force Report on Laptop Technology (2001), Beaufort County in South Carolina launched a pilot program in 1994 using laptops for instruction with 330 sixth graders. The program was expanded to all middle school students by 2000. The same report indicates that Clovis Unified School District in California and New York City Community School District Six (with 2,700 students in grades four through seven) launched similar laptop immersion programs in 1996 (Dalton Council Task Force Report, 2001). Similarly, in 2000, the state of Maine piloted a laptop immersion program with one middle school, expanding it to 241 middle schools in 2001,
and increasing to a total of 36,000 laptops with 33,000 students and 3,000 teachers by 2003. The second largest initiative occurred in Henrico County Schools in Virginia, which piloted a laptop program in 2001 and expanded enrollment to 23,000 students by the end of 2003. Texas is the most recent state to join this trend. More than 7,300 students in thirteen schools will be given wireless laptop computers next fall for use at home and at school as part of a Technology Immersion Project that is expected to fundamentally change the way students learn.
On a smaller scale, in 1996 Microsoft Corporation launched the Anytime Anywhere Learning Project in partnership with Toshiba America Information Systems' Notebooks for Schools. The following year, full implementation of the program occurred in 52 schools across the United States. By the year 2000, 800 schools with 125,000 students and teachers participated in the laptop program. To evaluate the effectiveness of Microsoft's laptop program, an independent research organization in San Francisco, California, was contracted. Multiple evaluations of Microsoft's laptop immersion program yielded positive results on student learning and curriculum delivery (Rockman et al., 1997, 1998, 2000). Key evaluation findings fall into two categories: student outcomes and teacher outcomes.
Student outcomes include:
• Laptop students spend more time engaging in collaborative work than non-laptop students
• Laptop students participate in more project-based instruction
• Laptops lead to more student writing and to writing of higher quality
• Laptops increase access to information and improve research analysis skills
• Laptop students become collaborators (interact with each other about their work)
• Laptop students direct their own learning
• Laptop students report a greater reliance on active learning strategies
• Laptop students readily engage in problem solving and critical thinking
• Laptop students consistently show deeper and more flexible uses of technology
• Laptop students spend more time doing homework on computers
Teacher outcomes include:
• Teachers who use laptops use a more constructivist approach to teaching
• Teachers who use laptops feel more empowered in their classrooms
• Teachers who use laptops spend less time lecturing
As reported by Rockman et al. (1997, 1998, 2000), laptop use not only reinforces the utilization of successful learning strategies but also enables students to transfer the knowledge across disciplines. This is believed to occur because laptop students: (1) are highly engaged in focused activities, spending more time on their work and completing larger projects; (2) frequently apply active learning strategies; (3) interact with each other about their work; (4) solve problems through project-based activities, which usually involve more critical thinking; and (5) regularly find information, make sense of it, and communicate it. Research provides evidence that students who engage in collaborative work and participate in more project-based learning have higher levels of motivation (Wigfield et al., 1998; Guthrie & Wigfield, 2000). When students are motivated, they demonstrate improved achievement (White, 1989; Roth & Paris, 1991; Roderick & Engel, 2001; Haydel & Roeser, 2002; Gulek, 2003), they produce longer and higher quality writing samples (Reeves, 2001; Goldberg, Russell, & Cook, 2003), and they spend more time doing homework (Parschal, Weinstein & Walberg, 1984; Walberg, 1984, 1994; Walberg & Haertel, 1997). Similarly, teachers using a constructivist approach feel more empowered and spend less time lecturing (von Glaserfeld, 1995a, 1995b), have fewer classroom management problems (Marzano et al., 2003), and have more engaged learners in their classrooms (von Glaserfeld, 1987; Jonassen, 1991; Fosnot, 1996; Marzano et al., 2003). As seen in the evaluations conducted by Rockman et al. (1997, 1998, 2000), many of these outcomes were observed when students were provided with their own laptop through the Anytime Anywhere Learning Project.
Examining the Impact of the Harvest Park Laptop Immersion Program
There is substantial evidence that using technology as an instructional tool enhances student learning and educational outcomes (Berger, 1984; Choi & Gennaro, 1987; White & Horowitz, 1988; Garza, 1991; Geban, Askar & Ozkan, 1992; Secules, Herron & Tomasello, 1992; Njoo & de Jong, 1993; Lehman, 1994; Beauvois, 1997; Soloway et al., 1997; Gonzalez-Bueno, 1998; Schecker, 1998; Spitulnik et al., 1998; Hanna & de Nooy, 2003). The study presented in this article examines the impact of Harvest Park Middle School's laptop immersion program on student learning. Specific research questions include the following:
1. Does the laptop program have an impact on students' grade point average?
2. Does the laptop program have an impact on students' end-of-course grades?
3. Does the laptop program have an impact on students' essay writing skills?
4. Does the laptop program have an impact on students' standardized test scores?
Learning outcomes examined in this study include overall grade point averages (GPAs), end-of-course grades, District Writing Assessment results for sixth and eighth grade students, results of the Standardized Testing and Reporting (STAR) Norm-Referenced Test (NRT-CAT/6), and California Standards Tests in English-language arts and mathematics.
Description of the Laptop Immersion Program
Harvest Park Middle School, located in Pleasanton Unified School District in Pleasanton, California, established its Laptop Immersion Program in 2001. Located 40 miles southeast of San Francisco, in the center of what is rapidly becoming the new "Silicon Valley," suburban Pleasanton has experienced considerable growth in its residential and business base over the last two decades and is now home to an increasingly diverse population of more than 60,000. A highly educated, high-income community has developed in the midst of what not too long ago were acres of fruit orchards and cattle fields on the edge of Alameda County. As a school experiencing rapid growth over a short period of time, Harvest Park faced the challenge of maintaining the same high level of academic excellence while building an infrastructure that would meet the demands of its student population. Harvest Park's laptop program emerged out of a partnership between the high-tech businesses in the community and the school's search for innovative programs.
How Does the Laptop Program Work?
Students in the Laptop Immersion Program receive the same grade-level curriculum offered to all students in the district. The differences are seen in the method of curriculum delivery and in the latitude of options students are given to demonstrate curriculum mastery through the use of technology.
All students are eligible to participate in the program. Parents purchase the laptops used by their students in this program. For families who cannot afford to purchase a laptop, a Laptop Advisory Committee, comprising an administrator and teacher and parent representatives, reviews parent requests for loaner laptops. The loaner applications are reviewed during the spring enrollment period for the program. To date, no student has been denied an opportunity to participate in the Laptop Program. The loaner program provides students with computers approximately one week before the start of the school year and allows students to keep them until the end of the school year. At the end of the year, students in the loaner program are required to return their computers to the school.
All students participate in a computer camp where they are introduced to the basics of using the computer. Teachers who regularly teach in the Laptop Program also teach the Computer Camp. Students are trained in understanding the capabilities of laptops, in navigating and operating the computer, and in installing the software that they will use later in the year. The session also covers the rules and expectations around laptop use in the classroom, Internet/e-mail safety, and virus safety. Students receive hands-on training in all of these areas.
Students then use laptops on a daily basis during the school year. The laptop use varies depending on the subject matter. The most common laptop applications in the classroom include essay writing and on-line grading in English, researching information on the web, and developing PowerPoint presentations for projects in history/social science classes. Students also use laptops to develop websites, access web-based lab projects and activities in science, and design posters and logos. Note-taking for all subjects in the classroom is also performed with laptops.
Program Enrollment
The Laptop Immersion Program started with sixth grade students as a pilot program in the 2001–02 school year and was subsequently expanded to seventh and eighth grade students. Students were allowed to join the program at any point during their middle school years, as long as any course scheduling conflicts were resolved. The program caught the growing interest of many students and expanded to a current enrollment of 259. Table 1 shows the enrollment in the program and school-wide enrollment by grade level in 2003–04.
Table 1: Laptop Immersion Program Enrollment by Grade

Grade    Laptop Program Enrollment    Total School Enrollment
6        91                           353
7        93                           361
8        75                           371
Total    259                          1085
Student Demographics
To compare the demographics of students enrolled in the Laptop Program to the demographics of students school-wide at Harvest Park Middle School, several key indicators were identified. These key demographic data included students' ethnic background, gender, Gifted and Talented (GATE) program enrollment, special education status, enrollment in the National School Lunch Program (NSLP; economically disadvantaged status), English Learner status, and parent education level. The data are summarized in Table 2.
Table 2: Student Demographics – Laptop Immersion Program Versus School-Wide

Student Demographics            Laptop    School-wide
Ethnicity
  Asian                         14%       16%
  Filipino                      1%        2%
  Hispanic/Latino               6%        7%
  African American              0%        1%
  White                         79%       74%
Gender
  Female                        44%       49%
  Male                          56%       51%
Gifted and Talented             27%       24%
Special Education               5%        10%
Economically Disadvantaged      1%        4%
English Learner                 1%        3%
Parent Education Level
  Graduate School               42%       37%
  College Graduate              46%       44%
  Some College                  10%       12%
  High School Graduate          2%        6%
  Not High School Graduate      0%        1%
Table 2 indicates that no demographic indicator differs by more than five percentage points between laptop students and the school as a whole. This indicates that the demographic composition of students enrolled in the program closely mirrors that of the entire school population.
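That five-point bound can be checked directly from the table. A minimal Python sketch, not part of the original analysis, with the percentages transcribed from Table 2:

```python
# Demographic percentages from Table 2: (laptop, school-wide)
indicators = {
    "Asian": (14, 16), "Filipino": (1, 2), "Hispanic/Latino": (6, 7),
    "African American": (0, 1), "White": (79, 74),
    "Female": (44, 49), "Male": (56, 51),
    "Gifted and Talented": (27, 24), "Special Education": (5, 10),
    "Economically Disadvantaged": (1, 4), "English Learner": (1, 3),
    "Graduate School": (42, 37), "College Graduate": (46, 44),
    "Some College": (10, 12), "High School Graduate": (2, 6),
    "Not High School Graduate": (0, 1),
}
# Largest absolute gap between laptop and school-wide percentages
max_gap = max(abs(laptop - school) for laptop, school in indicators.values())
print(max_gap)  # 5
```

The largest gaps (White, gender, special education, graduate-school parents) all sit exactly at five percentage points.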
Data Collection Measures
The data collection measures included students' overall cumulative grade point averages (GPAs), end-of-course grades, district writing assessment scores, California Standardized Testing and Reporting (STAR) Program norm-referenced test scores (California Achievement Test, Survey Form, Sixth Edition), and criterion-referenced test scores from the STAR California Standards Tests. Each measure is described below.
Grade Point Averages (GPAs)
The grade point averages (GPAs) were reported on a 4-point scale to two decimal places. The cumulative overall GPAs were used for this study, including grades from all middle school courses in which students were enrolled. The cumulative GPA represents the average of grades awarded in all courses taken by a given student during the current academic year.
End-of-Course Grades
The end-of-course grades are reported on a 5-point scale as A, B, C, D, and F. The pluses and minuses within each letter grade were rounded to the same letter grade for analytic purposes. For example, grades of A+, A, and A- were all reported as the letter grade of A. End-of-course grades were collected for English and mathematics and represent the cumulative grade within each course awarded at the end of the school year.
District (Local) Writing Assessment
The district writing assessment, formally known as the District Writing Sample, is given to all middle school students in grades six and eight, unless the student has an Individualized Education Plan (IEP) or Section 504 Plan stating differently. Seventh grade students take the state writing test and not the district test. State writing test results are not reported separately but are incorporated into the California Standards Test in English-language arts scores. The district writing assessment results in grades six and eight are reported in four proficiency levels. Below is the interpretation of these proficiency levels:
• Minimal Proficiency: does not meet the grade level expectation.
• Limited Proficiency: approaches the grade level expectation.
• Solid Proficiency: competent achievement, meets the grade level expectation.
• Advanced Proficiency: exceeds the grade level expectation.
Standardized Norm-Referenced Test Results
In April 2002, the California State Board of Education designated the California Achievement Tests, Sixth Edition Survey (CAT/6), published by CTB/McGraw-Hill (which replaced the Stanford Achievement Test, Ninth Edition, or SAT/9), as the national norm-referenced test for the Standardized Testing and Reporting (STAR) Program. The SAT/9 had been the designated norm-referenced test for the previous five years in California. Like the SAT/9, the CAT/6 is a national norm-referenced achievement test, and the test questions and scoring are the same from year to year. All students in grades 2–11 are tested in reading, language, and mathematics. Students in grades 2–8 are also tested in spelling, and students in grades 9–11 are tested in science. The purpose of administering the CAT/6 is to determine how well each California student is achieving academically compared to a national sample of students tested in the same grade at the same time of the school year.
California Standards Tests
The California Standards Tests (CSTs) are given to public school students in grades 2–11 as part of the state's Standardized Testing and Reporting (STAR) Program. The CSTs are criterion-referenced tests, developed specifically for California public schools and aligned to state-adopted academic content standards that describe what students should know and be able to demonstrate at each grade level and subject area tested. Results are based on how well students achieve the academic content standards, not how student results compare with those of other students taking the same tests. Questions for all CSTs, except for the writing tests in grades four and seven, are in a multiple-choice format. The California State Board of Education established five performance levels: advanced, proficient, basic, below basic, and far below basic.
Laptop Immersion Program and Student Achievement
The data analysis was conducted using a three-layered approach. Descriptive statistics constituted the first layer, in which the authors examined whether there were any notable differences between laptop and non-laptop students to warrant further analyses. These results are presented in Tables 3–7. The initial results prompted further analyses of the data using inferential statistics with particular data points to determine whether there were differences between laptop and non-laptop students prior to enrolling in the program. The results from the inferential statistics, along with the initial descriptive statistics, are presented in Tables 9–14. When inferential statistics yielded no significant results prior to program enrollment, but significant results after enrolling in the program, the authors then applied model-based longitudinal analysis to the data. Table 15 presents results from the longitudinal analysis.
Grade Point Average (GPA)
The 2003–04 school year cumulative GPAs of laptop and non-laptop students were averaged by grade level. Table 3 summarizes these data.
Table 3: 2003–04 Cumulative Grade Point Averages by Grade

         Program Enrollment
Grade    Laptop    Non-Laptop
6        3.50      3.13
7        3.28      2.94
8        3.23      3.07
Results indicate that Harvest Park Middle School students in the Laptop Immersion Program attained higher GPAs than non-participating students in their respective grades. The greatest difference was observed in sixth grade GPAs.
End-of-Course Grades
End-of-course grades in English and math were also collected to identify any difference between laptop and non-laptop students. Table 4 displays the percentage of students receiving each end-of-course grade for English and mathematics by grade level and by program enrollment.
Table 4: 2003–04 End-of-Course Grades by Subject, Grade, and Program

End-of-Course          Grade 6              Grade 7              Grade 8
Letter Grade       Laptop  Non-Laptop   Laptop  Non-Laptop   Laptop  Non-Laptop

English Language Arts
A                  50%     38%          39%     23%          36%     39%
B                  42%     32%          45%     33%          54%     40%
C                  7%      21%          11%     28%          10%     17%
D                  1%      6%           3%      9%           0%      3%
F                  0%      3%           2%      7%           0%      1%

Mathematics
A                  40%     33%          37%     30%          24%     23%
B                  41%     31%          38%     32%          36%     29%
C                  14%     20%          18%     21%          20%     28%
D                  2%      6%           5%      8%           20%     11%
F                  3%      10%          2%      9%           0%      9%
Results indicate that there is a substantial difference between laptop and non-laptop students in terms of their end-of-course grades. A notably higher percentage of laptop students attained A grades, and a significantly lower percentage attained F grades, in their English and mathematics courses. The largest difference between the percentages of laptop and non-laptop students obtaining A grades was in seventh grade English, and the smallest difference was in eighth grade mathematics. One exception to this trend was that a slightly higher percentage of non-laptop students obtained A grades in eighth grade English; however, results favored laptop students (90%) over non-laptop students (79%) in terms of the percentage of students attaining a B or higher in eighth grade English.
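That last comparison is a cumulative count over the Table 4 grade distributions. A minimal Python sketch, not part of the original analysis, with the eighth grade English percentages transcribed from the table:

```python
# Percent of students at each letter grade, eighth grade English (Table 4)
laptop = {"A": 36, "B": 54, "C": 10, "D": 0, "F": 0}
non_laptop = {"A": 39, "B": 40, "C": 17, "D": 3, "F": 1}

def at_or_above(dist, floor="B"):
    """Percent of students earning the floor grade or better."""
    order = ["A", "B", "C", "D", "F"]
    return sum(dist[g] for g in order[: order.index(floor) + 1])

print(at_or_above(laptop), at_or_above(non_laptop))  # 90 79
```

Summing A and B percentages reproduces the 90% versus 79% figures cited above.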
District (Local) Writing Assessment Results
The 2004 district writing assessment is given to all middle school students in grades six and eight. The grade level expectations for the writing sample focused on responding to the assigned topic, having discernible organization, demonstrating adequate word choice, and including supporting material. Specific expectations included but were not limited to
grammar, mechanics, sentence structure, fluency, sentence development, organization, and neatness. Table 5 compares the most recent assessment results (March 2004 testing) for laptop students, the entire school, and the district.
Table 5: 2004 District Writing Assessment Results by Grade and Program Enrollment

                      Score of 4      Score of 3      Score of 2      Score of 1
                      (Advanced       (Solid          (Limited        (Minimal
                      Proficiency)    Proficiency)    Proficiency)    Proficiency)
Grade 6
  Laptop Program      17%             78%             5%              0%
  Harvest Park        16%             68%             16%             1%
  District Average    9%              72%             19%             2%
Grade 8
  Laptop Program      15%             76%             9%              0%
  Harvest Park        17%             66%             17%             2%
  District Average    16%             68%             16%             2%
Results indicate that eighth grade laptop students at both the highest (advanced proficiency) and lowest (minimal proficiency) scoring levels do not show noteworthy differences from Harvest Park Middle School or from district-wide results. However, a considerably higher percentage of laptop students scored 3 (solid proficiency) compared with their school average and district-wide results. Overall, a substantially higher percentage of laptop students (95% in Grade 6; 91% in Grade 8) met or exceeded grade level expectations in writing compared to Harvest Park school-wide averages (84% in Grade 6; 83% in Grade 8) and district-wide averages (81% in Grade 6; 84% in Grade 8).
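The "met or exceeded" figures are the sum of the solid (score 3) and advanced (score 4) proficiency percentages. A minimal Python sketch, not part of the original analysis, with the values transcribed from Table 5:

```python
# Percent scoring at each level (score 4, 3, 2, 1), transcribed from Table 5
results = {
    ("Grade 6", "Laptop Program"):   (17, 78, 5, 0),
    ("Grade 6", "Harvest Park"):     (16, 68, 16, 1),
    ("Grade 6", "District Average"): (9, 72, 19, 2),
    ("Grade 8", "Laptop Program"):   (15, 76, 9, 0),
    ("Grade 8", "Harvest Park"):     (17, 66, 17, 2),
    ("Grade 8", "District Average"): (16, 68, 16, 2),
}
# "Met or exceeded grade level expectations" = advanced + solid
met_or_exceeded = {k: adv + solid for k, (adv, solid, _, _) in results.items()}
print(met_or_exceeded[("Grade 6", "Laptop Program")])  # 95
print(met_or_exceeded[("Grade 8", "Laptop Program")])  # 91
```

Running the same sum over the school and district rows reproduces the 84/83 and 81/84 figures cited above.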
STAR Norm-Referenced Test Results
The state-mandated norm-referenced test (NRT) data were also compiled to measure the effect of the Laptop Immersion Program on student learning. The current NRT is the California Achievement Test, Sixth Edition, Survey Form (CAT/6 Survey). Table 6 presents the 2004 STAR NRT results for current students in total language and total mathematics by grade level and by program enrollment, in terms of the percentage of students scoring at or above the national average (50th percentile).
Table 6: 2004 STAR Norm-Referenced Test Results by Grade: Percent of Students Scoring At or Above National Average (50th Percentile)

                       Total Language    Total Mathematics
Grade 6   Laptop       88%               96%
          Non-Laptop   78%               83%
Grade 7   Laptop       86%               83%
          Non-Laptop   76%               76%
Grade 8   Laptop       89%               83%
          Non-Laptop   77%               77%
Results indicate that a considerably higher proportion of laptop students scored at or above the national average in both the language and mathematics portions of the CAT/6 across all grade levels. The largest differences in NRT outcomes occurred between current sixth grade laptop and non-laptop students in math, and between current eighth grade students in total language, when they were tested at the end of their first year of enrollment in the laptop program.
STAR California Standards Tests Results
The California Standards Tests (CSTs) are given to public school students in grades 2–11 as part of the state's Standardized Testing and Reporting (STAR) Program. The CST results are based on how well students achieve on state academic content standards. Table 7 presents the 2004 STAR CST results in English-language arts and mathematics in terms of the percentage of students scoring proficient or advanced by grade level and program enrollment.
Table 7: 2004 STAR California Standards Tests in English-Language Arts and Mathematics Results: Percent of Students Scoring Proficient or Advanced

                       English-Language Arts    Mathematics
Grade 6   Laptop       80%                      86%
          Non-Laptop   68%                      66%
Grade 7   Laptop       83%                      73%
          Non-Laptop   64%                      57%
Grade 8   Laptop       76%                      58%
          Non-Laptop   56%                      49%
The CST results indicate that a notably higher percentage of students enrolled in the Laptop Immersion Program at Harvest Park Middle School, across all grade levels, met or exceeded state content standards in English-language arts and mathematics when compared to their non-laptop counterparts. The average difference in the percentage of students scoring proficient or advanced on the CSTs between laptop and non-laptop students was 17 percentage points for English-language arts and 18 percentage points for mathematics, both favoring the laptop program. The greatest difference, 20 percentage points between the two groups, was observed in the sixth grade mathematics scores and eighth grade English-language arts scores.
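The per-grade gaps behind those summary figures can be recomputed from Table 7. A minimal Python sketch, not part of the original analysis, with the values transcribed from the table:

```python
# Percent proficient or advanced (laptop, non-laptop), from Table 7
ela_scores = {"Grade 6": (80, 68), "Grade 7": (83, 64), "Grade 8": (76, 56)}
math_scores = {"Grade 6": (86, 66), "Grade 7": (73, 57), "Grade 8": (58, 49)}

def gaps(scores):
    """Laptop-minus-non-laptop gap, in percentage points, per grade."""
    return {grade: laptop - non for grade, (laptop, non) in scores.items()}

print(gaps(ela_scores))   # {'Grade 6': 12, 'Grade 7': 19, 'Grade 8': 20}
print(gaps(math_scores))  # {'Grade 6': 20, 'Grade 7': 16, 'Grade 8': 9}
```

The 20-point maxima in sixth grade mathematics and eighth grade English-language arts match the narrative above.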
Examining Prior Achievement and Impact of the Laptop Immersion Program
Random assignment to treatments, and the use of prior-achievement measures in identifying the impact of a particular program such as the Laptop Immersion Program on learning, are ongoing issues in field-based educational research. It is particularly difficult to estimate the extent to which prior differences among students enrolled in the Laptop Program contribute to the difference in student performance as a group. This study attempts to address this issue by comparing student learning outcomes from the year prior to the students' participation in the program.
Prior Achievement and Follow-up Data: Statistical
Comparisons
Additional data analyses were conducted for students at different stages of the Laptop Program, and achievement outcomes are reported prior to, and at the end of, their first, second, and third years of enrollment in the program. Students were followed via cohorts, defined by the number of years enrolled in the program. There are currently three cohorts in the program. The cohort design is presented in Table 8 below.
Table 8: Cohort Design by Academic Year

Academic    Baseline Data   Year 1 Data   Year 2 Data   Year 3 Data
Year        (Grade 5)       (Grade 6)     (Grade 7)     (Grade 8)
2000–01     Cohort 1
2001–02     Cohort 2        Cohort 1
2002–03     Cohort 3        Cohort 2      Cohort 1
2003–04                     Cohort 3      Cohort 2      Cohort 1
Cohort 1 Analyses
Cohort 1 members were the eighth grade students in the 2003–2004 school year. They are the first group of students who went through the program for three consecutive years. Their baseline data were gathered from 2000–01, when they were fifth graders. Results of the STAR NRT language arts and math tests, the STAR CST in English-language arts, and the district writing assessments were used to determine this cohort's achievement at the end of the first year. STAR NRT scores are presented as the percent of students scoring at or above the national average (at/above the 50th percentile); STAR CST and district writing scores are presented as the percent of students scoring proficient or advanced (meeting or exceeding the [state] standard). Table 9 presents the results.
Table 9: Prior Differences in Achievement for Cohort 1 by Program Enrollment

                                    Baseline Data
                                    (Prior to Laptop
                                    Enrollment)        End of Year 1   Change
STAR NRT LA        Laptop           97% (N=67)         100% (N=70)     +3
                   Non-laptop       94% (N=202)        84% (N=228)     -10
STAR NRT Math      Laptop           98% (N=66)         99% (N=70)      +1
                   Non-laptop       93% (N=203)        86% (N=235)     -7
STAR CST ELA       Laptop           79% (N=66)         74% (N=70)      -5
                   Non-laptop       60% (N=214)        63% (N=228)     +3
District Writing   Laptop           81% (N=63)         96% (N=68)      +15
Test               Non-laptop       83% (N=210)        95% (N=230)     +12
Baseline data in Table 9 indicate that there were minor differences between laptop and non-laptop students on three of the four measures, with the exception of CST English-language arts (which favored laptop students). However, the follow-up data showed that laptop students demonstrated more positive change in scores than non-laptop students on three of the four measures. To examine whether differences in prior achievement (baseline) and the first-year follow-up scores were statistically significant, a series of t-tests was conducted with cross-sectional data. Scaled scores were used when analyzing the STAR test data. The results are presented in Table 10.
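Because each comparison involves only two groups, the t-tests reported here are equivalent to one-way ANOVAs: the F values shown in Table 10 equal the squared t statistics, with identical p-values. A quick synthetic check of this equivalence (the means, SDs, and Ns are loosely modeled on Table 10's Year 1 NRT language arts row; scipy is assumed available):

```python
import numpy as np
from scipy.stats import ttest_ind, f_oneway

# Simulated scaled scores, loosely modeled on Table 10's Year 1 NRT language
# arts row (laptop: mean 698, SD 24, N=70; non-laptop: mean 688, SD 35, N=232).
rng = np.random.default_rng(0)
laptop = rng.normal(698, 24, 70)
non_laptop = rng.normal(688, 35, 232)

t, p_t = ttest_ind(laptop, non_laptop)  # pooled-variance two-sample t-test
F, p_F = f_oneway(laptop, non_laptop)   # one-way ANOVA on the same two groups

# With exactly two groups, F = t^2 and the two p-values coincide.
assert np.isclose(t**2, F) and np.isclose(p_t, p_F)
```

This is why a table of F statistics and a narrative describing "t-tests" are describing the same analysis when only laptop and non-laptop groups are compared.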
Table 10: Cross-Sectional Analysis of Cohort 1 Achievement by Program Enrollment

                          Baseline (2000–01)   Year 1 (2001–02)   Year 2 (2002–03)   Year 3 (2003–04)
Measure      Statistic    Laptop  Non-Laptop   Laptop  Non-Laptop  Laptop  Non-Laptop  Laptop  Non-Laptop

STAR NRT Language Arts
             Mean SS      691     685          698     688         697     690         707     692
             SD           27      32           24      35          33      42          43      45
             N            67      202          70      232         72      269         72      279
             F            3.10                 9.84                2.51                0.02
             p            >.05 (NS*)           <.005               >.1 (NS)            >.1 (NS)

STAR NRT Mathematics
             Mean SS      700     692          718     701         710     699         729     718
             SD           33      35           28      43          43      48          45      54
             N            66      203          70      235         71      268         72      279
             F            0.02                 13.89               1.79                1.15
             p            >.1 (NS)             <.001               >.1 (NS)            >.1 (NS)

STAR CST English-Language Arts (a)
             Mean SS      n/a                  372     362         383     367         374     361
             SD                                34      48          40      54          43      55
             N                                 70      228         72      267         72      277
             F                                 10.68               6.87                6.88
             p                                 <.005               <.01                <.01

STAR CST Mathematics (b)
             Mean SS      n/a                  395     370         381     363         383     367
             SD                                48      65          54      66          72      80
             N                                 70      232         72      268         72      275
             F                                 8.57                2.54                0.07
             p                                 <.005               >.1 (NS)            >.1 (NS)

District Writing Assessment (c)
             Mean         2.9     3.0          3.7     3.7         n/a                 3.1     2.9
             SD           0.9     0.8          0.8     0.8                             0.5     0.6
             N            63      210          70      228                             72      275
             F            0.34                 0.05                                    5.53
             p            >.1 (NS)             >.1 (NS)                                <.05

Overall GPA (d)
             Mean         n/a                  3.51    3.18        3.24    2.97        3.23    3.07
             SD                                0.46    0.70        0.57    0.79        0.58    0.67
             N                                 70      237         72      273         72      285
             F                                 14.47               12.65               2.13
             p                                 <.001               <.001               >.1 (NS)

Notes: (*) NS means non-significant; N means sample size; SD means standard deviation.
(a) At baseline, limited reports were produced and scaled scores were not reported.
(b) Test not operational at baseline.
(c) District writing assessment not offered at the Year 2 grade level.
(d) GPAs not calculated at Grade 5.
The baseline data for three measures (NRT language arts and math, and the district writing test) show that there was no statistically significant difference in achievement between laptop and non-laptop students prior to enrollment in the program. However, the comparison between the two groups after one year in the program indicates that laptop students showed significantly higher achievement in NRT language arts (F=9.84, p<.005) and NRT mathematics (F=13.89, p<.001). The difference between the two groups in writing achievement at the end of Year 1 was not significant; however, the Year 3 results significantly favored the laptop students (F=5.53, p<.05).
Although no baseline data were available to assess prior achievement on the STAR CST (English-language arts and mathematics) and overall GPA, statistical comparisons between laptop and non-laptop students were made for these measures as well. Cross-sectional analyses comparing the difference in mean scores indicate that laptop students consistently scored higher than non-laptop students in CST English-language arts at the end of Year 1 (F=10.68, p<.005), Year 2 (F=6.87, p<.01), and Year 3 (F=6.88, p<.01). The difference between laptop and non-laptop students in CST math performance was significant in Year 1 (F=8.57, p<.005), but non-significant in Year 2 and Year 3. The STAR NRT results indicate no statistically significant difference between laptop and non-laptop students in language and math achievement in Year 2 and Year 3. The comparison of overall GPAs demonstrated that laptop students obtained higher GPAs throughout their enrollment in the program; the differences were statistically significant in Year 1 (F=14.47, p<.001) and Year 2 (F=12.65, p<.001), but not in Year 3.
Cohort 2 Analyses
Cohort 2 members were the seventh grade students in the 2003–04 school year. Those who participated in the Laptop Program completed their second year. Their baseline data were extracted from the 2001–2002 school year test scores, when they were fifth graders. The STAR CSTs in English-language arts and mathematics were the two data points available for comparison between baseline and follow-up achievement for this cohort. The CST scores are reported in terms of the percent of students scoring proficient or advanced (meeting or exceeding the [state] standard). Table 11 shows the results for students in Cohort 2.
Table 11: Prior Differences in Achievement for Cohort 2 by Program Enrollment

                                 Baseline Data (2001–02)
                                 (Prior to Laptop Enrollment)   Year 1 (2002–03)   Change
NRT Language       Laptop        92% (N=85)                     88% (N=91)         -4
                   Non-laptop    79% (N=205)                    72% (N=231)        -7
NRT Mathematics    Laptop        93% (N=85)                     89% (N=91)         -4
                   Non-laptop    80% (N=205)                    75% (N=231)        -5
CST ELA            Laptop        75% (N=85)                     87% (N=91)         +12
                   Non-laptop    67% (N=205)                    68% (N=231)        +1
CST Math           Laptop        73% (N=85)                     81% (N=91)         +8
                   Non-laptop    62% (N=204)                    59% (N=232)        -3

Notes: 1. GPAs and end-of-course grades are not reported at Grade 5.
       2. The writing assessment changed from a 6-point scoring rubric in 2001–02 to a 4-point scoring rubric in 2003.
       3. The norm-referenced test changed from the SAT/9 in 2002 to the CAT/6 in 2003.
Data presented in Table 11 indicate that Cohort 2 laptop students showed notable differences in both English-language arts and math achievement prior to enrolling in the program. After one year in the program, students in the Laptop Program experienced a larger positive change in their CST English-language arts and math achievement than students not in the program. NRT results showed a decline in language and math scores for both groups.
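The Change column in Table 11 is simply the Year 1 percentage minus the baseline percentage for each group; this bookkeeping can be sketched with the table's values transcribed directly:

```python
# Percent proficient/advanced as (baseline, Year 1), transcribed from Table 11.
cohort2 = {
    "NRT Language":    {"laptop": (92, 88), "non_laptop": (79, 72)},
    "NRT Mathematics": {"laptop": (93, 89), "non_laptop": (80, 75)},
    "CST ELA":         {"laptop": (75, 87), "non_laptop": (67, 68)},
    "CST Math":        {"laptop": (73, 81), "non_laptop": (62, 59)},
}

# Change = Year 1 minus baseline, in percentage points.
change = {measure: {group: year1 - base for group, (base, year1) in groups.items()}
          for measure, groups in cohort2.items()}

print(change["CST ELA"])  # laptop +12 vs. non-laptop +1
```

Note that these are differences between two cross-sectional percentages, not gains tracked student by student.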
A series of t-tests was conducted with cross-sectional data to investigate the significance of differences in student achievement in English-language arts and mathematics prior to, and one year after, enrolling in the program. Scaled scores were used for the analysis. In addition, mean differences in GPAs between laptop and non-laptop students were also compared in Year 2. Table 12 presents the results.
Table 12: Cross-Sectional Analysis of Cohort 2 Achievement by Program Enrollment

                          Baseline (2001–02)   Year 1 (2002–03)   Year 2 (2003–04)
Measure      Statistic    Laptop  Non-Laptop   Laptop  Non-Laptop  Laptop  Non-Laptop

STAR NRT Language Arts
             Mean SS      689     681          701     680         698     688
             SD           28      37           46      43          35      45
             N            85      205          91      232         92      241
             F            3.12                 0.81                2.86
             p            >.05 (NS)            >.1 (NS)            >.05 (NS)

STAR NRT Mathematics
             Mean SS      697     684          709     689         719     700
             SD           36      38           42      50          51      51
             N            86      206          90      232         92      241
             F            0.58                 0.74                0.04
             p            >.1 (NS)             >.1 (NS)            >.1 (NS)

STAR CST English-Language Arts
             Mean SS      381     365          397     372         387     370
             SD           39      45           40      54          49      58
             N            85      204          91      232         92      240
             F            1.68                 10.12               9.42
             p            >.1 (NS)             <.005               <.005

STAR CST Mathematics
             Mean SS      401     379          394     369         395     368
             SD           65      68           53      72          69      70
             N            85      205          91      231         92      241
             F            0.02                 9.22                0.27
             p            >.1 (NS)             <.005               >.1 (NS)

District Writing Assessment (a)
             Mean         3.5     3.5          3.0     2.9         n/a
             SD           0.8     0.9          0.5     0.6
             N            82      193          72      210
             F            2.79                 7.73
             p            >.05 (NS)            <.01

Overall GPA (b)
             Mean         n/a                  3.48    3.15        3.28    2.94
             SD                                0.50    0.79        0.60    0.81
             N                                 92      240         92      252
             F                                 19.97               12.06
             p                                 <.001               <.005

Notes: (a) District writing assessment not offered at the Year 2 grade level.
(b) GPAs not calculated at Grade 5.
Baseline data show that the differences in both English-language arts and math performance between laptop and non-laptop students prior to enrolling in the program were not statistically significant. However, Year 1 results showed that laptop students, compared to their non-laptop counterparts, demonstrated significantly higher achievement in CST English-language arts (F=10.12, p<.005) and CST mathematics (F=9.22, p<.005). The district writing assessment results for Cohort 2 were consistent with the CST results: the cross-sectional mean comparison for writing achievement indicated that students' prior writing skills did not differ significantly, but laptop students demonstrated significantly higher achievement at the end of Year 1 (F=7.73, p<.01). The STAR NRT scores for language arts and mathematics, however, did not show significant differences in any comparison. Because overall GPAs are not calculated at Grade 5, baseline data were not available for comparison. Nevertheless, cross-sectional comparisons between laptop and non-laptop students show that laptop students obtained significantly higher overall GPAs in both Year 1 (F=19.97, p<.001) and Year 2 (F=12.06, p<.005).
Cohort 3 Analyses
Cohort 3 members were the sixth grade students in the 2003–04 school year who were enrolled in the program for one school year. Their baseline data came from the 2002–03 school year test scores, when students were in the fifth grade. STAR NRT and CST results in English-language arts and math, and district writing assessment results, were used to evaluate student achievement. Table 13 presents student achievement results prior to enrolling in, and after completing one year of, the program.
Table 13: Prior Differences in Achievement for Cohort 3 by Program Enrollment

                                    Baseline Data
                                    (Prior to Laptop Enrollment)   End of Year 1   Change
STAR NRT LA        Laptop           92% (N=67)                     88% (N=70)      -4
                   Non-laptop       81% (N=202)                    78% (N=228)     -3
STAR NRT Math      Laptop           97% (N=66)                     96% (N=70)      -1
                   Non-laptop       83% (N=203)                    83% (N=235)     0
STAR CST ELA       Laptop           89% (N=66)                     80% (N=70)      -9
                   Non-laptop       73% (N=214)                    68% (N=228)     -5
STAR CST Math      Laptop           92% (N=66)                     86% (N=70)      -6
                   Non-laptop       71% (N=214)                    66% (N=228)     -5
District Writing   Laptop           76% (N=63)                     95% (N=68)      +19
Test               Non-laptop       68% (N=210)                    79% (N=230)     +11
Results presented in Table 13 indicate that there were notable differences in achievement between laptop and non-laptop students in the baseline year. The change in scores after one year of enrollment in the program provided mixed results: whereas the STAR CST and NRT scores did not show notable differences in change between the groups, laptop students showed a substantially more positive change in writing achievement. A cross-sectional comparison of mean scores was made to test the differences in achievement.
As seen in Table 14, results from the cross-sectional comparison of achievement for laptop and non-laptop students indicate that the baseline differences were not statistically significant for any measure. However, laptop students showed significantly higher achievement in writing after the first year of enrollment in the laptop program (F=4.02, p<.05). STAR CST and NRT scores in English-language arts and mathematics indicate that there was no statistically significant difference in baseline achievement between laptop and non-laptop students. Whereas English-language arts scores did not show any statistical differences after one year in the program, laptop students demonstrated significantly higher math achievement on the NRT (F=5.09, p<.05) and the CST (F=4.91, p<.05) in Year 1. In addition, laptop students obtained significantly higher overall GPAs after their first year in the program (F=17.29, p<.001). Although there are no baseline data to determine whether student achievement prior to enrollment in the program contributed to the higher overall GPAs, the baseline STAR results for this cohort and the results from the other cohorts are consistent with these findings.
Table 14: Cross-Sectional Analysis of Cohort 3 Achievement by Program Enrollment

                          Baseline (2002–03)   Year 1 (2003–04)
Measure      Statistic    Laptop  Non-Laptop   Laptop  Non-Laptop

STAR NRT Language Arts
             Mean SS      690     681          704     691
             SD           30      39           49      44
             N            88      255          90      272
             F            3.57                 1.46
             p            >.05 (NS)            >.1 (NS)

STAR NRT Mathematics
             Mean SS      700     685          711     699
             SD           43      51           33      46
             N            88      256          90      273
             F            1.56                 5.09
             p            >.1 (NS)             <.05

STAR CST English-Language Arts
             Mean SS      387     374          389     371
             SD           37      43           44      51
             N            88      255          90      273
             F            2.60                 1.57
             p            >.1 (NS)             >.1 (NS)

STAR CST Mathematics
             Mean SS      428     394          403     380
             SD           65      76           53      66
             N            88      255          90      273
             F            2.17                 4.91
             p            >.1 (NS)             <.05

District Writing Assessment
             Mean         2.9     2.8          3.1     2.9
             SD           0.6     0.6          0.4     0.6
             N            86      244          89      261
             F            2.63                 4.02
             p            >.1 (NS)             <.05

Overall GPA (GPAs not calculated at Grade 5)
             Mean         n/a                  3.50    3.13
             SD                                0.54    0.82
             N                                 90      279
             F                                 17.29
             p                                 <.001
Model-Based Statistical Analyses
The analyses presented above compare the performance of students at single points in time. In this section, more sophisticated statistical methods are employed to conduct longitudinal analyses of the effects of participation in the laptop program. This model-based statistical approach addresses the complications introduced by missing data and by correlations among outcomes that may not be captured in cross-sectional analyses. For more information about the key issues of the model-based statistical approach and longitudinal data analysis, please see the Technical Notes at the end of this paper.
The NRT mathematics and language arts scaled scores, and overall cumulative GPAs, were used for the analysis in this section. In addition, cumulative math GPAs were also incorporated into the longitudinal analysis. Student scores for this analysis were drawn from the Cohort 1 and Cohort 2 students in the Laptop Program; these two cohorts were selected simply because their scores were available longitudinally, providing the most comprehensive information about the long-term structure of learning outcomes. Furthermore, although the SAT/9 and CAT/6 are different norm-referenced tests, SAT/9 2001 and 2002 scores were combined with CAT/6 2003 and 2004 scores because a linking study conducted by the California Department of Education (2003) indicates that the two tests measure comparable content knowledge. The scores in the linking study also exhibited similar trends, particularly at the seventh and eighth grade levels, from which the CAT/6 scores for this analysis were extracted. In addition, the mean and median scores of the CAT/6 and SAT/9 were nearly identical, with a correlation of approximately 0.7 between 2002 SAT/9 and 2003 CAT/6 scores. The longitudinal analysis of overall cumulative GPAs and math cumulative GPAs was based on data from 2002, 2003, and 2004.
Four separate analyses were conducted, one for each of the longitudinal math, language, math GPA, and overall GPA scores. Linear Mixed-Modeling (LMM) software developed by Schafer (1998) was employed to analyze the repeated-measures data (see Technical Notes). Results are presented in Table 15.
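A model of this general shape can be sketched in Python with statsmodels' MixedLM as a stand-in for Schafer's LMM software. The data below are synthetic, and the effect sizes built into them (a 16-point laptop effect, a 9-point yearly gain) are merely illustrative:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic repeated-measures data: three yearly test scores per student,
# with a student-specific random intercept (all values are illustrative).
rng = np.random.default_rng(42)
rows = []
for student in range(300):
    laptop = int(student < 80)   # roughly 80 laptop students, as in Cohort 1
    u = rng.normal(0, 15)        # random intercept for this student
    for year in range(3):
        score = 686 + 9 * year + 16 * laptop + u + rng.normal(0, 20)
        rows.append((student, year, laptop, score))
df = pd.DataFrame(rows, columns=["student", "year", "laptop", "score"])

# Fixed effects: intercept, Year, Laptop, and Year-by-Laptop interaction,
# matching the fixed component described in the Table 15 notes. The random
# component here is a per-student intercept; a random slope for the time
# trend could be added with re_formula="~year".
fit = smf.mixedlm("score ~ year * laptop", df, groups=df["student"]).fit()
print(fit.params[["Intercept", "year", "laptop", "year:laptop"]])
```

The fitted coefficients recover the built-in effects, which is the same logic by which Table 15's Laptop coefficient is read as the program's estimated score advantage.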
Table 15: Longitudinal Analysis of the Effect of Laptop Use on Individual Scores

                       Coefficient        Standard Error     t-ratio            p
Indicator              Coh. 1   Coh. 2    Coh. 1   Coh. 2    Coh. 1   Coh. 2    Coh. 1   Coh. 2

Math
  Intercept            685.93   680.59    2.78     2.98      247.09   228.31    0.000    0.000
  Year                 9.42     8.94      0.75     1.29      12.57    6.94      0.000    0.000
  Laptop               15.50    17.21     5.96     5.56      2.60     3.10      0.009    0.002
  Year by Laptop       -1.16    1.95      1.55     2.37      -0.75    0.82      0.453    0.411

Language
  Intercept            680.20   677.24    2.32     2.63      293.04   257.60    0.000    0.000
  Year                 4.03     4.33      0.67     1.27      5.99     3.43      0.000    0.001
  Laptop               10.79    15.15     4.94     4.89      2.18     3.10      0.029    0.002
  Year by Laptop       0.69     -0.51     1.38     2.34      0.50     -0.22     0.620    0.827

Math GPA
  Intercept            2.85     2.84      0.06     0.09      45.47    31.21     0.000    0.000
  Year                 -0.16    -0.05     0.03     0.04      -5.28    -1.23     0.000    0.221
  Laptop               0.40     0.40      0.14     0.17      2.99     2.36      0.003    0.018
  Year by Laptop       -0.06    -0.01     0.06     0.08      -0.97    -0.15     0.332    0.884

Overall GPA
  Intercept            3.12     3.33      0.04     0.06      75.48    57.36     0.000    0.000
  Year                 -0.05    -0.20     0.01     0.02      -3.36    -8.37     0.001    0.000
  Laptop               0.34     0.34      0.09     0.11      3.70     3.06      0.000    0.002
  Year by Laptop       -0.08    0.01      0.03     0.05      -2.76    0.18      0.006    0.860

Notes: 1. The fraction of missing values for the learning outcomes analyzed ranges from 5 to 25 percent.
2. For this analysis, population-averaged effects were included in the fixed-effects component and subject-specific effects in the random component. For the random component of the math and language scores, a random slope and intercept for time trends were incorporated. For the fixed component, an intercept, Laptop Enrollment, Year, and the Year by Laptop interaction were used.
3. Throughout the analysis, missing values were assumed to be Missing At Random. Multiple imputation (see Technical Notes), also assuming Missing At Random, was conducted as well: missing values were replaced with multiple plausible values, and the linear mixed modeling analysis was repeated on the completed datasets (results not shown for brevity). Little or no discernible difference was found, suggesting that there is no systematic pattern in the occurrence of missing values.
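The multiple-imputation check described in note 3 can be sketched as follows, here using scikit-learn's IterativeImputer as an assumed stand-in for the authors' imputation software, on synthetic scores with values missing at random:

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

# Synthetic scores for three years (true column means 700, 709, 718),
# with roughly 15% of values deleted completely at random.
rng = np.random.default_rng(1)
scores = rng.normal(700, 40, size=(200, 3)) + np.arange(3) * 9.0
scores[rng.random(scores.shape) < 0.15] = np.nan

# Create m=5 completed datasets (sample_posterior=True draws plausible values
# rather than point predictions), run the analysis on each, and pool.
estimates = []
for seed in range(5):
    imputer = IterativeImputer(random_state=seed, sample_posterior=True)
    completed = imputer.fit_transform(scores)
    estimates.append(completed.mean(axis=0))  # the "analysis": yearly means
pooled = np.mean(estimates, axis=0)           # pooled estimate across imputations
print(pooled)
```

Here the "analysis" is just a column mean; in the paper it is the linear mixed model, but the impute-analyze-pool structure is the same.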
Results presented in Table 15 indicate that laptop enrollment has a significant effect on mathematics and language scores. Specifically, participation in the laptop program is associated with an average per-student gain of 16 points on mathematics scores and 13 points on language scores obtained from the state-mandated standardized NRTs. The Year (number of years in the program) by laptop enrollment interaction was not significant for either math or language arts, suggesting that the effectiveness of laptop use on test scores is not influenced by time, once its overall effect is accounted for. Results also indicate that laptop enrollment seems to improve math cumulative GPA and overall cumulative GPA, yielding a 0.40 increase in math cumulative GPA and a 0.34 increase in overall cumulative GPA.
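Under the fixed-effects specification in the Table 15 notes, the predicted laptop advantage at a given year follows directly from the Laptop and Year-by-Laptop coefficients; for example, using the Cohort 1 mathematics estimates:

```python
# Fixed-effect estimates for Cohort 1 NRT mathematics, taken from Table 15.
intercept, year_coef, laptop_coef, interaction = 685.93, 9.42, 15.50, -1.16

def predicted_score(year, laptop):
    """Population-averaged prediction from the fixed-effects part of the model."""
    return intercept + year_coef * year + laptop_coef * laptop + interaction * year * laptop

# The laptop advantage equals laptop_coef + interaction * year; because the
# interaction is non-significant, the gap is statistically flat across years.
for year in range(3):
    gap = predicted_score(year, 1) - predicted_score(year, 0)
    print(year, gap)
```

Averaging this gap with the corresponding Cohort 2 estimate (17.21) is what underlies the roughly 16-point mathematics figure quoted above.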
Discussion
Numerous studies have examined the link between technology use and its impact on student socio-emotional factors, such as student motivation, disruptive classroom behavior, classroom participation/engagement, and students' interaction with their peers or teachers (Rockman et al., 1997, 1998, 2000; Fisher & Stolarchuk, 1998; Harris & Smith, 2004; Shaver, 2004). Other studies have attempted to estimate the effect of technology use on student achievement in writing (Goldberg, Russell, & Cook, 2003), science (Fisher & Stolarchuk, 1998; Gabel, 2004), foreign language (Met, 2004), and social studies (Shaver, 2004). The contribution of this study to the body of educational research is that it explored student achievement with not just one indicator, but with multiple indicators of learning (state and district test results and overall grade point averages). The purpose of this study was to examine the effect of participation in a laptop immersion program on student achievement. Although students were not randomly assigned to the laptop immersion program, an examination of achievement indicators shows that students who participated in the program and those who did not performed similarly prior to the start of the laptop program. Analyses of outcome measures collected after participation in the laptop program, however, indicate that students who did participate tended to earn significantly higher test scores and grades in writing, English-language arts, mathematics, and overall grade point average (GPA).
The findings related to writing are consistent with the results of a recent meta-analysis of studies that investigated the effect of computers on student writing (Goldberg, Russell, & Cook, 2003). This meta-analysis found that students who use computers when learning to write are not only more engaged and motivated in their writing, but also produce work that is of greater length and higher quality, especially at the secondary level. The study presented above indicates that participation in the laptop immersion program had significant positive effects on students' writing skills.
Due to the small sample size, this study did not analyze the data for special education students. However, laptop use with special education students certainly seems to be a promising classroom-instruction strategy and an avenue for future research. Laptop computers offer students with disabilities an opportunity for success that may not otherwise be available. Laptops provide special education students an additional visual representation of learning material, which directly addresses the needs of these students. As evidenced by Goldberg, Russell, and Cook (2003), the effect of computers on student writing had the strongest positive impact on students with disabilities. In addition, Harris and Smith (2004) found that teachers' ratings of special education students' appropriate behavior, motivation, engagement/interest, independent work, and retention of material increased with their exposure to and use of laptop computers.
One limitation that might have a confounding effect on student achievement in this study is teacher assignment to the Laptop Program. Participating teachers volunteer for the program. As with most field-based research in education, in the absence of random assignment into the program, the differences in student performance may be partly due to differences among the teachers volunteering for this program.
A second limitation stems from the lack of data regarding the extent to which students used laptops for specific types of learning. As Bebell, Russell, and O'Dwyer (2004) demonstrate, technology can be used in many different ways by students and teachers, and depending upon how technology use is measured, the lessons learned from a study of technology can vary dramatically. In the study presented here, we know that students who participated in the laptop immersion program had full access to a laptop computer, and anecdotal evidence indicates that students used the laptops for a variety of purposes. But since we did not systematically collect information about how individual students used their laptops or other types of technology (e.g., desktop computers), we are unable to provide estimates of the effect of specific technology uses on student achievement.
Despite these shortcomings, this study provides evidence that participation in the laptop immersion program had a significant impact on student achievement. Given the cost of creating a one-to-one laptop environment, this finding raises important questions about equity. Clearly, school systems are not technologically or financially equal (Jameson, 1999). Inequities also exist between schools within the same district and among classrooms within the same school building. These inequities result from differences in funding, technical infrastructure, access to technology, instructional practices, and teachers' experiences. Given the potential benefits that may come from learning in laptop settings, it is important that schools begin taking steps to create more equitable settings with respect to technology access and skills. These steps include: (1) remediating students who lack experience with technology; (2) increasing teachers' technology skills; (3) providing students with greater access to computers; and (4) developing teacher and student standards for technology proficiency. While concerns about technology equity often focus on the development of technology-related skills and on preventing an information society divided into haves and have-nots, the findings from this study suggest that the digital divide may also create a society divided by academic achievement. To increase the achievement of all students, the findings from this study suggest that all students must have equal access to technology-rich environments in which technology is no longer a shared commodity.
References
American Psychological Association (2001). Publication manual of the
American Psychological Association (5th Ed.). Washington, DC: APA.
Beauvois, M. H. (1997). Computer-mediated communication: Technology
for improving speaking and writing. In M. Bush & R. Terry (Eds.),
Technology-enhanced language learning. Lincolnwood, IL: National
Textbook Co.
Bebell, D., Russell, M., & O’Dwyer, L. (2004). Measuring teachers’
technology uses: Why multiple-measures are more revealing. Journal
of Research on Technology in Education, 37(1), 45–64.
Berger, C. (1984). Learning more than facts: Microcomputer simulation
in the science classroom. In D. Peterson (Ed), Intelligent school house:
Readings on computers and learning. Reston, VA: Reston Publishing
Company.
California Department of Education (2003). California Standardized
Testing and Reporting Stanford 9 – CAT/6 linking study.
Cetron, M. J. & Davies, O. (2001). Special report: Fifty trends now changing
the world. Bethesda, MD: World Future Society.
Choi, B. S. & Gennaro, E. (1987). The effectiveness of using computer simulated experiments in junior high students' understanding of the volume displacement concept. Journal of Research in Science Teaching, 24, 539–552.
Demirtas, H. & Schafer, J. L. (2003). On the performance of random-
coefficient pattern-mixture models for nonignorable dropout.
Statistics in Medicine, 22, 2553–2575.
Fosnot, C. (1996). Constructivism: A psychological theory of learning.
In C. Fosnot (Ed.), Constructivism: Theory, perspectives, and practice.
New York: Teachers College Press.
Fisher, D. & Stolarchuk, E. (1998). The effect of using laptop computers on
achievement, attitude to science and classroom environment in science.
Proceedings from Western Australian Institute for Educational
Research Forum. Retrieved June 15, 2004, from
http://education.curtin.edu.au/waier/forums/1998/fisher.html
Furger, R. (2004). Laptops for all. The George Lucas Educational Foundation
Edutopia Online. Retrieved July 15, 2004, from http://www.glef.org/
php/print.php?id=Art_895&template=printarticle.php
Gabel, D. (2004). Science. In G. Cawelti (Ed), Handbook of research
on improving student achievement (pp. 202–225). Arlington, VA:
Educational Research Service.
Garza, T. J. (1991). Evaluating the use of captioned video material in
advanced foreign language learning. Foreign Language Annals, 14,
239–58.
Geban, O., Askar, P., & Ozkan, I. (1992). Effects of computer simulations
and problem solving approaches on high school students. Journal of
Educational Research, 86 (1), 5–10.
Goldberg, A., Russell, M., & Cook, A. (2003). The effect of computers on student writing: A meta-analysis of studies from 1992 to 2002. Journal of Technology, Learning, and Assessment, 2(1).
Gonzalez-Bueno, M. (1998). e effects of electronic mail on Spanish L2
discourse. Language Learning and Technology, 1, 55–70.
Gulek, C. (2003). Preparing for high-stakes testing. Theory Into Practice, 42(1), 42–50.
Guthrie, J. T. & Wigfield, A. (2000). Engagement and motivation in
reading. In M.K. Kamil, P.T. Mosenthal, P.D. Pearson, & R. Barr (Eds.),
Handbook of reading research, Volume III (pp. 403–22). Mahwah, NJ:
Erlbaum.
Hanna, B. E. & de Nooy, J. (2003). A funny thing happened on the way
to the forum: Electronic discussion and foreign language learning.
Language Learning and Technology, 7, 71–85.
Harris, W. & Smith, L. (2004). Laptop use by seventh grade students with
disabilities: Perceptions of special education teachers. University of
Maine, Maine Education Policy Institute (Report No. 2).
Haydel, A. M. & Roeser, R. W. (2002). On the links between students
motivational patterns and their perceptions of, beliefs about, and
performance on different types of science assessments: A multidimensional
approach to achievement validation. Los Angeles, CA: National Center for Research and Evaluation (Report No. 573).
Jameson, R. (1999). Equity and access to educational technology. Thrust
for Educational Leadership, 26 (9), 28–31.
Jonassen, D. (1991). Evaluating constructivist learning. Educational
Technology, 36 (9), 28–33.
Laird, N. M. & Ware, J. H. (1982). Random-effects models for
longitudinal data. Biometrics, 38, 963–974.
Lehman, J. (1994). Secondary science teachers’ use of microcomputers
during instruction. School Science and Mathematics, 94, 413–420.
Little, R. J. A. & Rubin, D. B. (2002). Statistical analysis with missing data
(2nd Ed.). New York: Wiley.
Martin, M. O. et al. (1999). TIMSS 1999 international science report:
Findings from IEA’s repeat of the third international mathematics
and science study at eighth grade. Newton, MA: Boston College
International Study Center.
Marzano, R. J., Marzano, J. S. & Pickering, D. J. (2003). Classroom
management that works: Research-based strategies for every teacher.
Alexandria, VA: Association for Supervision and Curriculum
Development.
Met, M. (2004). Foreign language. In G. Cawelti (Ed), Handbook of
research on improving student achievement (pp. 86–111). Arlington, VA:
Educational Research Service.
Mullis, I. V. S. et al. (1999). TIMSS 1999 international mathematics report:
Findings from IEA’s repeat of the third international mathematics
and science study at eighth grade. Newton, MA: Boston College
International Study Center.
National Center for Education Statistics (2000). Digest of education
statistics. Washington, DC: Office of Educational Research and
Improvement, U.S. Department of Education.
Njoo, M. & de Jong, T. (1993). Exploratory learning with a computer
simulation for control theory: Learning processes and instructional
support. Journal of Research in Science Teaching, 30, 821–44.
Paschal, R. T., Weinstein, T. & Walberg, H. J. (1984). Effects of
homework: A quantitative synthesis. Journal of Educational Research,
78, 97–104.
Reeves, D. B. (2001). 101 Questions about standards, assessment and
accountability. Denver, CO: Advanced Learning Press.
Rockman et al. (1997). Report of a laptop program pilot: A project for
Anytime Anywhere Learning by Microsoft Corporation Notebooks for
Schools by Toshiba America Information Systems. San Francisco, CA:
Rockman et al.
Rockman et al. (1998). Powerful tools for schooling: Second year study of the
laptop program – A project for Anytime Anywhere Learning by Microsoft
Corporation Notebooks for Schools by Toshiba America Information
Systems. San Francisco, CA: Rockman et al.
Rockman et al. (2000). A more complex picture: Laptop use and impact in
the context of changing home and school access – the third in a series of
research studies on Microsoft’s Anytime Anywhere Learning program.
San Francisco, CA: Rockman et al.
Roderick, M. & Engel, M. (2001). The grasshopper and the ant:
Motivational responses of low-achieving students to high-stakes
testing. Educational Evaluation and Policy Analysis, 23 (3), 197–227.
Roth, J. L. & Paris, S. G. (1991). Motivational differences in students’
perceptions of classroom and standardized achievement tests. Paper
presented at the annual meeting of the American Educational
Research Association. Chicago, IL: April 3–7.
Rubin, D. B. (1976). Inference and missing data. Biometrika, 63, 581–592.
Rubin, D. B. (1987). Multiple imputation for nonresponse in surveys.
New York: Wiley.
Schafer, J. L. (1998). Some improved procedures for linear mixed models.
Technical Report #98-27, The Pennsylvania State University,
Prevention Methodology Center, University Park, PA.
Schecker, H. P. (1998). Integration of experimenting and modeling by
advanced educational technology: Examples from Nuclear Physics.
In B. J. Fraser & K. G. Tobin (Eds.), International handbook of science
education. Dordrecht, Boston, and London: Kluwer Academic
Publishers.
Secules, T., Herron, C., & Tomasello, M. (1992). The effect of video
context on foreign language learning. Modern Language Journal, 76,
480–90.
Shaver, J. P. (2004). Social studies. In G. Cawelti (Ed), Handbook of
research on improving student achievement (pp. 226–244). Arlington,
VA: Educational Research Service.
Soloway, E. et al. (1997). ScienceWare’s Model-It: Technology to support
authentic science inquiry. T. H. E. Journal, 1, 54–56.
Spitulnik, M. et al. (1998). Using technology to support students’
artifact construction in science. In B. J. Fraser & K. G. Tobin (Eds.),
International handbook of science education. Dordrecht, Boston, and
London: Kluwer Academic Publishers.
Technology Counts 2004: Global links-lessons from the world [Special
Report]. (2004, May 6). Bethesda, MD: Education Week.
The New Lab for Teaching and Learning (2001). Dalton Council Task Force
Report on Laptop Technology. Retrieved 7/4/2004 from
http://www.dalton.org/departments/nltl/index.html
Verbeke, G. & Molenberghs, G. (2000). Linear mixed models for
longitudinal data. New York: Springer-Verlag.
von Glasersfeld, E. (1987). Learning as a constructive activity. In C.
Janvier (Ed.), Problems of representation in the teaching and learning of
mathematics. New Jersey: Lawrence Erlbaum Associates, Inc.
von Glasersfeld, E. (1995a). A constructivist approach to teaching. In
L. Steffe & J. Gale (Eds), Constructivism in education. New Jersey:
Lawrence Erlbaum Associates, Inc.
von Glasersfeld, E. (1995b). Sensory experience, abstraction, and
teaching. In L. Steffe & J. Gale (Eds), Constructivism in education. New
Jersey: Lawrence Erlbaum Associates, Inc.
Walberg, H. J. (1984). Improving the productivity of American schools.
Educational Leadership, 41 (8), 19–27.
Walberg, H. J. (1994). Homework. In T. Husen & T. N. Postlethwaite
(Eds), International encyclopedia of education (2nd Ed.). Oxford,
England: Pergamon.
Walberg, H. J. & Haertel, G. D. (1997). Psychology and educational practice.
Berkeley, CA: McCutchan Publishing.
Walberg, H. J. & Paik, S. J. (2004). Effective general practices. In G.
Cawelti (Ed), Handbook of research on improving student achievement
(pp. 25–38). Arlington, VA: Educational Research Service.
White, B. Y. & Horowitz, P. (1988). Computer microworlds and
conceptual change: A new approach to science education. In P.
Ramsden (Ed.), Improving learning: New perspectives. London: Kogan
Page.
White, N. (1989). Developmental relationships between students’ attitudes
toward reading and reading achievement in grades 1 through 8 (Report
No. 040, 143). (ERIC Document Reproduction No. ED 329 905).
Wigfield, A., Eccles, J. S., & Rodriguez, D. (1998). The development
of children’s motivation in school contexts. In P.D. Pearson, & A.
Iran-Nejad (Eds.). Review of research in education. Washington, DC:
American Educational Research Association.
Appendix A: Technical Notes
A key strength of repeated measurements (longitudinal) research is
that this type of design makes it possible to obtain information concerning
individual patterns of change. The main advantage is economy of
participants: the design extracts the maximum possible learning outcome
data from each student. For example, when studying the effects of laptop
use over time, it is usually desirable to observe the same participants
repeatedly rather than to observe different participants at each specified
time point. A final consideration is that data can often be collected more
reliably in a study in which the same participants are followed repeatedly
than in a cross-sectional study.
There is one major challenge in the analysis of data from repeated
measures studies: the analysis is complicated by the dependence among
repeated observations made on the same experimental unit. The
investigator often cannot control the circumstances under which
measurements are obtained, so the data may be unbalanced or partially
incomplete. For example, in a longitudinal study, the response from a
participant may be missing at one or more of the time points due to
factors that may or may not be related to the outcome of interest (Little &
Rubin, 2002; Demirtas & Schafer, 2003).
Missing observations are common in longitudinal studies. In the
presence of incomplete data, the risk of reaching incorrect conclusions is
higher, because missing data can degrade the performance of confidence
intervals, bias parameter estimates, and reduce statistical power. Handling
incomplete data generally requires special techniques and inferential
tools. The properties of missing-data methods depend on the manner in
which the data became missing; every missing-data technique makes implicit
or explicit assumptions about the missing-data mechanism. Many missing-data
procedures in use today assume that missing values are missing at
random (MAR) (Rubin, 1976). Under MAR, the probability distribution of
the indicators of missingness may depend on the observed data but must
be conditionally independent of the missing data given the observed data.
Intuitively, MAR means that once appropriate account is taken of what we
have observed, there remains no dependence of the missingness on unob-
served quantities. An important special case of MAR is missing completely
at random (MCAR). Under MCAR, the response probabilities are indepen-
dent of both the observed and unobserved random variables in the data-
set. If MAR is violated, the response probabilities depend on unobserved
data in some fundamental way. In this case, the missing values are said to
be missing not at random (MNAR). MNAR situations require special care;
to obtain correct inferences, one must specify a joint probability model for
the complete data and the indicators of missingness.
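The MCAR/MAR distinction can be made concrete with a small simulation (a hypothetical sketch with invented numbers and names, not data from the study). Baseline and follow-up scores are correlated; under MCAR a follow-up score is dropped with a constant probability, while under MAR the drop probability depends only on the observed baseline score.

```python
import random

random.seed(42)

# Correlated (baseline, follow_up) score pairs for 10,000 hypothetical students.
students = []
for _ in range(10_000):
    b = random.gauss(50, 10)
    students.append((b, b + 5 + random.gauss(0, 5)))  # follow-up tracks baseline

def drop_mcar(data, p=0.3):
    """MCAR: every follow-up score is missing with the same probability p."""
    return [(b, None if random.random() < p else f) for b, f in data]

def drop_mar(data):
    """MAR: missingness depends only on the OBSERVED baseline score --
    students below the baseline mean are missing far more often."""
    return [(b, None if random.random() < (0.6 if b < 50 else 0.1) else f)
            for b, f in data]

def observed_mean(data):
    obs = [f for _, f in data if f is not None]
    return sum(obs) / len(obs)

full_mean = sum(f for _, f in students) / len(students)  # roughly 55
print(f"complete-data mean: {full_mean:.1f}")
print(f"MCAR observed mean: {observed_mean(drop_mcar(students)):.1f}")  # near the full mean
print(f"MAR  observed mean: {observed_mean(drop_mar(students)):.1f}")   # biased upward
```

A complete-case mean is roughly unbiased under MCAR but overstates the outcome under MAR, because low-scoring students are disproportionately missing; conditioning on the observed baseline, which MAR permits, is what removes that bias.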
Another important concept in the theory of missing data, closely
related to MAR, is ignorability. A missingness mechanism is said to be
ignorable when the missing values are MAR and a minor technical
condition called distinctness holds. Under ignorability, one does not need
to model the missingness mechanism explicitly. In most cases, the
plausibility of MAR can neither be verified nor contradicted by examination
of the observed data; it needs to be treated as an assumption (Little & Rubin,
2002).
Linear mixed models (Laird & Ware, 1982; Verbeke & Molenberghs,
2000) have become very popular for analyzing incomplete repeated
measures data. In linear mixed models, the variation in participants’
longitudinal profiles arises at two levels. At the first level, the vector of
repeated measures for each participant is related to time and time-varying
covariates by a relatively small number of estimated participant-specific
regression coefficients. At the second level, these coefficients are related
to additional time-varying and static covariates such as laptop group,
baseline characteristics, gender, and so forth. Additional levels of
clustering can easily be handled by augmenting the model. The linear
mixed model paradigm combines these two or more stages into a single
modeling procedure. These models are also known as multilevel models,
random-coefficient models, random regression models, and hierarchical
linear models. In linear mixed effects models, the vector of repeated
measures on each participant follows a linear regression model in which
some of the regression coefficients are common to the population, whereas
others vary by participant. In other words, one can model commonalities
and heterogeneities among participants by separating participant-specific
and population-averaged effects. The fixed-effects design matrix contains
the variables thought to estimate population-averaged effects, whereas the
random-effects design matrix is used to assess perturbations due to
inter-participant variation. These regressor matrices do not assume any
particular form, so the overall model can handle time-varying covariates
and unequally spaced measurements. By varying the forms of the regressor
matrices, one can test a wide range of hypotheses.
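The two levels can be illustrated with a toy random-intercept model: each student's scores follow a common line in time, shifted up or down by a student-specific intercept. The sketch below simulates such data and recovers the population-averaged slope and the two variance components with simple moment estimators for a balanced design. All names and numbers are invented; real analyses would use the likelihood-based machinery of Laird and Ware rather than these moment formulas.

```python
import random
from statistics import mean, variance

random.seed(1)

N_STUDENTS, N_WAVES = 500, 4
BETA0, BETA1 = 2.5, 0.15           # population-averaged intercept and time slope
SD_SUBJECT, SD_NOISE = 0.40, 0.20  # between-student and within-student SD

# Level 1: repeated GPA-like scores over time for each student;
# Level 2: a student-specific random intercept b_i shifts each line.
data = []
for i in range(N_STUDENTS):
    b_i = random.gauss(0, SD_SUBJECT)
    for t in range(N_WAVES):
        y = BETA0 + b_i + BETA1 * t + random.gauss(0, SD_NOISE)
        data.append((i, t, y))

# Population-averaged (fixed-effect) slope via pooled least squares on time:
tbar = mean(t for _, t, _ in data)
ybar = mean(y for _, _, y in data)
slope = (sum((t - tbar) * (y - ybar) for _, t, y in data)
         / sum((t - tbar) ** 2 for _, t, _ in data))

# Moment estimates of the two variance components (balanced design):
resid = [(i, y - (ybar + slope * (t - tbar))) for i, t, y in data]
subj_means = [mean(r for j, r in resid if j == i) for i in range(N_STUDENTS)]
var_within = mean(variance([r for j, r in resid if j == i])
                  for i in range(N_STUDENTS))
# Variance of subject means still contains var_within / N_WAVES; remove it.
var_between = variance(subj_means) - var_within / N_WAVES

print(f"estimated slope:     {slope:.3f}")
print(f"within-student var:  {var_within:.3f}")
print(f"between-student var: {var_between:.3f}")
```

The estimates land close to the simulated truths (slope 0.15, within-student variance 0.04, between-student variance 0.16), showing how the model separates population-averaged effects from inter-participant variation.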
Imputation, the practice of filling in missing data with plausible values,
is an attractive approach to analyzing incomplete data because it appears
to solve the missing-data problem at the outset of the analysis. The
question of how to obtain valid inferences from imputed data was
addressed by Rubin’s (1987) book on multiple imputation (MI). MI is a
Monte Carlo technique in which the missing values are replaced by m > 1
simulated versions, where m is typically small. In Rubin’s method for
repeated-imputation inference, each of the simulated complete datasets is
analyzed by standard methods, and the results are combined to produce
estimates and confidence intervals that incorporate missing-data
uncertainty.
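Rubin's combining rules are simple enough to state in a few lines of code. The sketch below is generic; the function name and the example numbers are invented for illustration and do not come from the study's analysis.

```python
from statistics import mean, variance

def rubin_combine(estimates, variances):
    """Combine m completed-data results via Rubin's (1987) rules.

    estimates -- point estimates Q_1..Q_m from the m imputed datasets
    variances -- their squared standard errors U_1..U_m
    Returns the pooled estimate and its total variance, which adds the
    between-imputation spread (inflated by 1 + 1/m) to the average
    within-imputation variance.
    """
    m = len(estimates)
    q_bar = mean(estimates)    # pooled point estimate
    u_bar = mean(variances)    # within-imputation variance
    b = variance(estimates)    # between-imputation variance
    total = u_bar + (1 + 1 / m) * b
    return q_bar, total

# Hypothetical example: a GPA-difference estimate from m = 5 imputed datasets.
est = [0.21, 0.24, 0.19, 0.26, 0.22]
se2 = [0.004, 0.005, 0.004, 0.005, 0.004]
q, t = rubin_combine(est, se2)
print(f"pooled estimate {q:.3f}, total variance {t:.5f}")
```

Because the total variance exceeds the average within-imputation variance whenever the m estimates disagree, confidence intervals built from it are wider than any single completed-data interval, which is how MI propagates missing-data uncertainty.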
Author Biographies
Dr. James Cengiz Gulek is currently the Director of Assessment and
Evaluation for the Pleasanton Unified School District in California. He is
also an adjunct professor teaching research and statistics in the Educational
Leadership Program at Saint Mary’s College of California. He received
his Bachelor’s and Master’s degrees in Educational Sciences, majoring in
School Counseling, from Bogazici University in Istanbul, Turkey; and his
doctorate in Educational Research, Measurement, and Evaluation from
Boston College, Massachusetts, in 1999. He has worked in K–12
education for eight years, first as a counselor and then as a central office
administrator. His research interests include program evaluation, data use in
decision-making, classroom assessment, and mixed research methods.
Dr. Hakan Demirtas currently works as an Assistant Professor of
Biostatistics in the Division of Epidemiology and Biostatistics at the University of
Illinois at Chicago. He received his doctoral degree in mathematical statis-
tics from Pennsylvania State University in 2003. His expertise is in miss-
ing data and statistical computing. His research interests include weighted
estimating equations, multiple imputation, statistical computing, tech-
niques for nonignorable missingness, analysis of incomplete multivariate
data, analysis of longitudinal data, random number generation, advanced
computational tools such as the Markov chain Monte Carlo algorithm and its
variants (data augmentation, Gibbs sampling, etc.), and generalized linear
and nonlinear mixed models. More information is available on his website:
http://apr.sph.uic.edu/faculty_profile/facultyprofile.asp?i=demirtas.
Technology and Assessment Study Collaborative
Caroline A. & Peter S. Lynch School of Education, Boston College
www.jtla.org
The Journal of Technology, Learning, and Assessment
... Students' grades in the course for mid-semester and final exams of computer architecture course were also analyzed. This method helps to evaluate students' achievements in the course under investigation (Gulek & Demirtas, 2005; Rivera & Rice, 2002). The grades were included before and after one year of the implementation of the SAtTaP. ...
... However, this possibility can be ignored because grade comparison results are consistent with our other results of the survey, interviews, and 5-Star rating. These results are consistent with Gray and Bunte (2022) and Gulek and Demirtas (2005) that examined the relationship between students' performance and grades. ...
Article
Full-text available
Computer architecture is considered one of the dry and hard-to-learn courses for students in the computer science discipline. Therefore, educational institutions experiment with different pedagogical approaches to enhance students’ performance and interest in courses. This study introduces a Simplified Approach to Teach a Processor (SAtTaP), a reformed version of the computer architecture course. The study aims to confirm that the proposed simplified approach, SAtTaP, enhances students’ performance and interest in the courses. Data was collected through multiple methods including an online survey based on the Technology Acceptance Model, students’ interviews, and students’ grades obtained in pre and post-implementation of the SAtTaP. The study participants were students enrolled in an undergraduate program in computer science discipline at the University of Bari, Italy. The data was analyzed using SmartPLS and NVivo. The data analysis shows that the students were delighted and interested in the computer architecture course when taught using the SAtTaP. Furthermore, by comparing the grades of pre and post-implementation of the SAtTaP, an improvement in the student’s performance and a decrease in the failure rate were found. The study’s findings have the potential to inform and shape the way computer science is taught and learned in higher education, with potential benefits for both students and educators.
... Among the courses, Algebra was the only one to show a significant difference in favor of tablet PC use [40] . In another study, positive effects were observed when the effects of mobile computing on state achievement test scores in writing were examined [41] . ...
Article
Full-text available
The purpose of this study is to explore the effect of mobile learning on writing achievement using tablet PCs. It used a pretest-posttest control group design to assess the effect at a private university in Türkiye. The study involved 55 students and two instructors who volunteered to participate. A 16-week tablet-assisted language learning (TALL) material was used for the treatment group while the control group was instructed in a regular way of using a coursebook pack and a notebook. A validated writing achievement test was used to collect data. To reveal a potential difference between the pretest and posttest results of the groups, a non-parametric equivalent to the t-test for dependent samples, Wilcoxon Signed-Ranks test was employed. To reveal a potential difference between posttest results of the groups, a non-parametric equivalent to the t-test for independent samples, Mann-Whitney U test was employed. In addition, some descriptive statistics were calculated to support the results. The findings revealed that TALL made a significant change in students’ achievement scores in EFL writing; however, the treatment group scores were not statistically and significantly higher than the control group scores based on comparison of both groups’ posttest scores. The findings show that TALL can impact educational practices by replacing traditional classroom practice if need be.
... Research has shown that effective technology integration can positively impact student performance (Bragdon & Dowler, 2016;Henderson et al., 2017;Rashid & Asghar, 2016). For instance, studies have demonstrated that students with access to individual laptops achieved significantly higher scores than those without laptops, even though their grades within specific courses were not significantly different (Gulek & Demirtas, 2005). Augmented Reality on mobile devices has also been found to enhance motivation and improve academic achievement among engineering students (Fonseca et When combined with appropriate teaching strategies, effective technology usage has been shown to positively influence student achievement and performance (Ainin et al., 2015;Bragdon & Dowler, 2016;Grinager, 2008;Schacter, 1999). ...
Article
Full-text available
This article explores the disconnect between student and educator perspectives regarding practical technology usage in higher education. As technology continues to play an increasingly prominent role in the educational landscape, understanding the differing viewpoints of students and educators is crucial for designing impactful technology integration strategies. Through a comprehensive literature review, this article examines the factors contributing to the disconnect and its implications for teaching and learning in higher education. Additionally, recommendations are provided to bridge this gap and promote a more aligned understanding of effective technology usage between students and educators.
... Influence of Simulations on Students' Achievement in Physics in Secondary Schools in Nandi East, Kenya 71 2013). Research indicates that integrating mobile technologies into education fosters greater student engagement, collaboration, and access to comprehensive information, thereby improving academic performance and learning outcomes (Gulek & Demirtas, 2005). Moreover, students report enhanced problem-solving skills, critical thinking abilities, and a more positive attitude towards learning when mobile technologies are effectively integrated into their educational experiences (Berge & Muilenburg, 2013). ...
Article
Full-text available
Many secondary schools in Kenya are yet to incorporate Mobile Technologies into their educational practices, despite widespread ownership and use by both students and teachers. This study investigated the influence of simulations on students' achievement in physics in secondary schools in Nandi East, Kenya. The dismal performance in physics locally and nationally may be linked to slow technological integration, hence the need for the study. The target population consisted of Form Three physics students and 30 Heads of Department (HODs) from 31 secondary schools. A descriptive survey design was utilized. Random and purposive sampling was used to select participants, and data collection involved questionnaires and interviews. Using Taro Yamane’s Formula, the sample size was determined to be 300 students, 30 HODs, and 30 schools. Data were analyzed using SPSS, providing descriptive statistics such as frequencies, percentages, pie charts, means, and standard deviations (SD). The reliability of the questionnaire items was confirmed with a Cronbach Alpha coefficient above 0.7. The findings emphasize the significant benefits of integrating simulations on students' academic achievement in secondary school physics learning. While the study offers important insights, its geographical limitation, technological access issues, and narrow scope constraints must be considered. These results have implications for various stakeholders including learners, parents, teachers, school administrators, curriculum developers, and policymakers. Learners will benefit from improved effectiveness, engagement, and personalized instruction, while policymakers will gain insights for future improvements. The study recommended learners be allowed to bring their own devices to school to ensure equality and equity in access to digital learning materials.
... For this reason, along with the provision of infrastructure, teacher training must be addressed, and not just in its most basic form, as this will enable the full utilization of all the possibilities offered by technology (Liu et al., 2020;Sailer et al., 2021). In fact, it has been demonstrated that ICT can increase academic performance, but only when they are used effectively in teaching processes (Gulek & Demirtas, 2005;Van der Spoel et al., 2020;Zheng et al., 2016;Zucker & Hug, 2008). If this dimension is not addressed, the investment in digital infrastructure can be worthless (Backfisch et al., 2021;Dawson et al., 2008;Rutledge et al., 2007;Scherer et al., 2021). ...
Article
Full-text available
In recent decades, Information and Communication Technologies have gained significant relevance in all areas of society, including the field of education. This is due, among other reasons, to the transformative methodological potential they present and their multiple possibilities in teaching and learning processes. The presence of digital infrastructure within schools is crucial for integrating technology into them, but it is also necessary for teachers to know how to use it in a didactic and pedagogical way with students. Generally, digital training for teachers has focused more on understanding the tools and their most basic functions than on their pedagogical use within the classrooms. This study analyzes the number and themes of digital training courses offered to teachers by Spanish educational administrations, classifying them into two main areas: (i) knowledge of tools and initiation, and (ii) didactics and curricular content. The results show a change in the content of these teacher training programs over time.
... Publikasi pada tahun 2004 yang telah dikutip lebih banyak dari tahun lainnya, yaitu dengan 3 publikasi dapat dilihat pada tabel 2 dibawah. Penelitian yang dilakukan oleh (Gulek & Demirtas, 2004) dengan judul "Learning with technology: The impact of laptop use on student achievement" mendapatkan jumlah kutipan terbanyak yaitu sebanyak 116 kali, artikel tersebut menunjukkan bahwa para siswa sekolah menengah mengarahkan pembelajaran mereka sendiri, melaporkan ketergantungan yang lebih besar pada strategi pembelajaran aktif, siap terlibat dalam pemecahan masalah dan pemikiran kritis, dan secara konsisten menunjukkan penggunaan teknologi yang lebih dalam dan lebih fleksibel daripada siswa yang tidak memiliki laptop. Sumber publikasi dari table diatas juga dapat dijadikan acuan bagi peneliti yang ingin mempublikasikan artikelnya terkait berpikir kritis pada sekolah menengah. ...
Article
Full-text available
Critical Thinking is a process that involves mental operations such as deduction, induction, classification, evaluation, and reasoning. This study aims to capture the landscape of Critical Thinking research, especially in secondary schools. The method used is descriptive bibliometric analysis. The source data obtained comes from the Scopus database. The study results show that the most publications related to Critical Thinking are in high schools, namely in 2022, with a total of 18 publications. The publication with the most citations was in 2004, which was cited 133 times. The country that has had the greatest impact on Critical Thinking research in secondary schools in the United States of America. The research focus is divided into sections, namely, 1) development and technology; 2) science and outcomes; 3) curriculum and mathematics, and 4) environment and motivation. The new themes in this field are self-efficacy, STEM education, and innovation. The keywords innovation and Critical Thinking in high school are separate from the other two new theme keywords, self-efficacy and STEM education. The keywords in the first research focus, namely technology and Critical Thinking in high schools, are separate from the keyword reasoning
Article
21st century learning is a collaboration of technology and science that can change students' abilities from LOTS (lower order thinking skills) to HOTS (higher order thinking skills). In its application, 21st century learning is supported by teaching resources and media based on environmental phenomena and technological developments. This research aims to determine the feasibility of the physics e-module based android application that has been developed after validation testing by media experts, material experts, and learning experts. This e-module was developed to serve as an innovation in physics learning in schools to improve high school students from LOTS to HOTS abilities. This type of research is development research by applying 4D model consisting of define, design, develop and disseminate stages. At the define stage, researchers examine and analyze phenomena in schools such as curriculum, student learning styles, teacher teaching styles, learning resources and learning media used. In the design stage, researchers design the needs of students and teachers in learning such as learning modules. In the develop stage, researchers develop products for student learning needs through expert validation from lecturers and teachers. In the disminate stage, researchers apply products that have been developed in schools. The population of this study was all class X students at SMAN 5 Mataram. Sampling in this research was carried out using a purposive sampling method. The classes chosen as research samples were classes X IPA 1 and The average score from the media validation results was 4.8 with a very valid category. The average score from the learning validation results was 4.7 in the very valid category. Based on the validation results, it was concluded that the development product in the form of an integrated physics e-module with local wisdom based on an Android application was suitable for use in schools.
Article
Full-text available
This study investigated the use of e-learning devices for effective students' learning outcomes in private secondary schools in Rivers State, Nigeria. The study adopted a descriptive research design with a population of 6,908 teachers and 40,107 students from all the private secondary schools in the study area. A stratified random sampling technique was used to select 500 teachers (7%) and 1600 students (4%) drawn from 80 private secondary schools in Rivers State. Data were collected with a research instrument of twelve items. The instrument was validated by two experts from the Department of Educational Management. The reliability of the instrument was ascertained with the use of the test-retest method. Pearson Product Moment Correlation analysis was used to get the coefficient of 0.85. Mean and standard deviation statistics were used to answer the research questions, while z-test statistics were used to test the formulated null hypothesis at a 0.05 level of significance The study revealed that the various e-learning devices such as laptops, tablets, smartphones, and interactive whiteboards, among others, enhance students learning outcomes in private secondary schools in Rivers State. It is concluded that the use of e-learning devices offers huge prospects for enhancing students' learning outcomes in private secondary schools in today's digital age. The study, amongst others, recommends that private school owners should partner with internet service providers to offer affordable internet access.
Article
When making sampling distribution inferences about the parameter of the data, theta, it is appropriate to ignore the process that causes missing data if the missing data are 'missing at random' and the observed data are 'observed at random', but these inferences are generally conditional on the observed pattern of missing data. When making direct likelihood or Bayesian inferences about theta, it is appropriate to ignore the process that causes missing data if the missing data are missing at random and the parameter of the missing data process is 'distinct' from theta. These conditions are the weakest general conditions under which ignoring the process that causes missing data always leads to correct inferences.
Article
The abstract for this document is available on CSA Illumina.To view the Abstract, click the Abstract button above the document title.