USEIT Report 1: An Overview of the USEIT Study and the Participating Districts

Abstract

The Use, Support, and Effect of Instructional Technology (USEIT) Study was undertaken to better understand: a) how educational technologies are being used by teachers and students in the classroom, b) what factors influence these uses, and c) how the use of technology affects student learning. The three-year study began during the Spring of 2001 and was divided into two phases. During the first phase (2001-2002 school year), information about district technology programs, teacher and student use of technology in and out of the classroom, and factors that influence these uses was collected through site visits and surveys. In total, survey responses were obtained from 120 district-level administrators, 122 principals, 4,400 teachers, and 14,200 students in Grades 5, 8, and 11. During the second phase (2002-2003 school year), four case studies and a study focusing on the relationship between student use of technology and academic performance will be conducted. Although the data collected during the first phase of this study were not designed to be representative of all districts in Massachusetts or of districts across the nation, the data represent the largest and most comprehensive data set related to the support and use of technology in schools collected to date. This study also provides a data set that links information provided by students with information provided by their teachers, their principals, and their district leaders. Thus, this study presents an opportunity to understand how factors that operate at the student, teacher, school, and district level interact to influence the ways and the extent to which teachers and students use technology in school. The purpose of the current report is to provide readers with a sense of why the study was conducted, how it was conducted, and who participated.
Use, Support, and Effect of Instructional Technology Study
Report 1
An Overview of the USEIT Study and the Participating Districts
Michael Russell, Damian Bebell, & Laura O’Dwyer
Published by inTASC – January 2003
Preferred Citation:
Russell, M., Bebell, D., & O’Dwyer, L. (2003). Use, support, and effect of instructional technology study: An overview of the USEIT study and the participating districts. Boston, MA: Technology and Assessment Study Collaborative.
Available for download at http://www.intasc.org/PDF/useit_r1.pdf
Michael K. Russell, Project Director/Boston College
Copyright © 2003 Technology and Assessment Study Collaborative, Boston College
Supported under the Field Initiated Study Grant Program, PR/Award Number R305T010065, as administered by the Office of Educational Research and Improvement, U.S. Department of Education.
The findings and opinions expressed in this report do not reflect the positions or policies of the Office of Educational Research and Improvement or the U.S. Department of Education.
An Overview of the Study and the Participating Districts
Over the past decade, expenditures on, access to, and use of computer-based technologies by school-aged children have increased sharply. Between 1995 and 2001, federal expenditures on educational technology increased from $21 million to $729 million, while the student-to-computer ratio decreased from 9:1 to 4:1 nationally (Glennan & Melmed, 1996; Market Data Retrieval, 1999, 2001). In 2001, the U.S. Census Bureau’s Current Population Survey reported that American children between the ages of 9 and 17 use computers more than any other reported subgroup of the American population (92.6%) (A Nation Online, 2002).
Despite these large expenditures, this increased access, and nearly universal use by school-aged children, several observers have questioned the extent to which technology is impacting teaching and learning. In 1997, Oppenheimer argued that there is no solid evidence that computers have improved student performance, as measured by test scores. Others, like Stoll (1999) and Healy (1998), have criticized investments in educational technologies, arguing that there is little evidence that they impact teaching and learning in a positive way and, in fact, asserted that computer use may be harming children and their learning. More recently, Cuban (2001) argued that computers have been oversold as a vehicle for reforming educational practices and are generally underused as an instructional tool by teachers at all levels of education.
While it is certainly appropriate to question the effects computer-based technologies are having on teaching and learning, the data collected to date provide incomplete, and sometimes misleading, images of the ways teachers and students use technology, the effects of these uses, and the factors that influence these uses. To deepen our understanding of these issues, the Technology and Assessment Study Collaborative (inTASC) at Boston College undertook a three-year study of technology use, support, and impact across 22 districts located throughout Massachusetts.
Introduction to the USEIT Study
The Use, Support, and Effect of Instructional Technology (USEIT) Study was undertaken to better understand:
a) how educational technologies are being used by teachers and students in the classroom,
b) what factors influence these uses, and
c) how the use of technology affects student learning.
The three-year study began during the Spring of 2001 and was divided into two phases.
During the first phase (2001-2002 school year), information about district technology programs, teacher and student use of technology in and out of the classroom, and factors that influence these uses was collected through site visits and surveys. In total, survey responses were obtained from 120 district-level administrators, 122 principals, 4,400 teachers, and 14,200 students in Grades 5, 8, and 11.
During the second phase (2002-2003 school year), four case studies and a study focusing on the relationship between student use of technology and academic performance will be conducted.
Although the data collected during the first phase of this study were not designed to be representative of all districts in Massachusetts or of districts across the nation, the data represent the largest and most comprehensive data set related to the support and use of technology in schools collected to date. This study also provides the first data set that links information provided by students with information provided by their teachers, their principals, and their district leaders. Thus, this study is the first opportunity to understand how factors that operate at the student, teacher, school, and district level interact to influence the ways and the extent to which teachers and students use technology in school.
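To illustrate what such a linked data set looks like in practice, the sketch below joins hypothetical student, teacher, and district survey tables on shared identifier columns. The table contents, column names, and values are invented for illustration and are not the actual USEIT files or variables.

```python
# Illustration only: hypothetical tables and column names, not the actual USEIT data.
import pandas as pd

# Each survey response carries an identifier that ties it to the level above it.
students = pd.DataFrame({
    "student_id": [1, 2, 3],
    "teacher_id": [10, 10, 11],
    "home_computer": [1, 1, 0],        # 1 = reports a computer at home
})
teachers = pd.DataFrame({
    "teacher_id": [10, 11],
    "district_id": [100, 101],
    "instructional_use": [3, 1],       # e.g., a 1-5 frequency scale
})
districts = pd.DataFrame({
    "district_id": [100, 101],
    "tech_support_staff": [2.0, 0.5],  # district-level support staffing (FTE)
})

# Link the levels: student -> teacher -> district.
linked = (students
          .merge(teachers, on="teacher_id")
          .merge(districts, on="district_id"))
print(linked)
```

A merged table of this kind is what allows student responses to be analyzed alongside the characteristics of their teachers, schools, and districts.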
To provide school leaders, teachers, policy makers, and the general public with timely information about the use of technology in schools and the various factors that influence teacher and student use of technology in the classroom, inTASC will be releasing a series of reports based on the survey and site visit interview data collected during the first phase of this study. Like this document, the series of reports will be available online in PDF format (www.intasc.org). The focus of the reports will range from descriptive analyses of student, teacher, principal, and district-level beliefs and practices to more sophisticated analyses that explore the influence various factors have on teacher and student use of technology. The purpose of the current report is to provide readers with a sense of why the study was conducted, how it was conducted, and who participated.
Characteristics of the Study Districts
This study was spawned by an inquiry from a group of districts located in the greater metropolitan area surrounding Boston, Massachusetts. This group of districts asked the project director to develop a multi-district research plan that focused on how technology was being used to impact learning and how to maximize investments in technology. The proposed plan was submitted to, and ultimately generously funded by, the Office of Educational Research and Improvement’s Field Initiated Studies Program.
To increase the diversity of districts participating in the study, 12 Massachusetts school districts that were not part of the original group were asked to participate. Of these, 11 opted to participate. Thus, when data collection began, the study included 22 districts located throughout Massachusetts. Of these, three are considered small urban districts, five are rural, and the remaining 14 are suburban. Figure 1 below shows the geographic location of the participating school districts.
Figure 1 Geographic Locations of the USEIT Participating School Districts
Background
Throughout the first phase of the study, districts were actively engaged in developing and piloting survey instruments. Through survey review meetings, district representatives critiqued and informed the development of all teacher, principal, and student survey items. To increase response rates and to provide teachers with ample time to complete the surveys, participating districts agreed to administer all teacher surveys during a faculty or department meeting. Student surveys were administered during the first 15 minutes of a class period (usually during English class). District leaders were asked to complete their respective surveys at a time of their choosing. Copies of the principal, teacher, and student surveys will be included in the forthcoming technical report available on the inTASC web site.
In addition to the surveys, a two-day site visit was conducted in each participating district. During each site visit, district-level leaders were interviewed. At least one elementary, one middle, and one high school in each district were also visited. At each school, the principal, librarian, and a technology specialist were interviewed. In the high school, the mathematics, science, social studies, and English department heads were also interviewed. All interviews focused on issues related to the support and use of technology in the district.
It should be noted that before the data were analyzed to answer the primary research questions, each district was provided with complete copies of its summary statistics for all teacher and student survey items, as well as the raw data collected from its teachers and students. In addition, a half-day workshop that focused on understanding, using, and interpreting these data was held in each district.
It also should be noted that our research has been greatly aided by the thoughtful advice and suggestions made by the USEIT Study Advisory Board. The Advisory Board has met twice, once in the Fall of 2001, when instruments were being developed, and again in the Fall of 2002, when preliminary analyses were beginning. The Advisory Board members included Hank Becker, Chris Dede, David Dwyer, Cheryl Lemke, and Linda Roberts.
[Figure 1 map labels (participating districts): Arlington, Ashburnham/Westminster, Bedford, Belmont, Brookline, Concord/Carlisle, Everett, Hampden/Wilbraham, Ipswich, Mohawk Trail, Mount Greylock, Newton, Northbridge, Norwood, Shrewsbury, Sutton, Swampscott, Waltham, Watertown, Weston, Winchester]
Research Questions Addressed During Phase One (2001–2002 School Year)
The research conducted as part of this study builds on prior research by focusing on the relationships among different types of district-level supports, classroom practices, and impacts on student outcomes. Specifically, the surveys incorporate several scales developed by Becker and his colleagues (1999) so that we can examine relationships among pedagogical beliefs, instructional practices, school climate, and instructional uses of technology. Recognizing the relationship between classroom and student-level factors (such as prior experience with computers) and instructional uses of technology documented by Schofield (1995), a portion of our student surveys focuses specifically on documenting students’ prior home and school computer experiences, and how these experiences shape instructional uses of technology in their classrooms. Finally, recognizing that instructional technology programs are influenced by factors that exist at different levels within a school system, multiple approaches to analyzing survey data are being employed. These methods include within-level correlational and regression analyses, across-level hierarchical linear modeling (HLM), and structural equation modeling (SEM) methods.
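As a rough illustration of the distinction between within-level and across-level analyses, the sketch below fits an ordinary regression and a two-level random-intercept model using the statsmodels library. The input file and the variable names (instructional_use, constructivist_beliefs, prof_dev_hours, school_id) are hypothetical; this is not the project’s actual modeling code.

```python
# Illustrative sketch only: hypothetical file and column names, not the USEIT analysis code.
import pandas as pd
import statsmodels.formula.api as smf

# Assume one row per teacher, with a school identifier available for grouping.
df = pd.read_csv("teacher_survey.csv")  # hypothetical input file

# Within-level analysis: ordinary least-squares regression of instructional
# technology use on teacher-level predictors.
ols_fit = smf.ols(
    "instructional_use ~ constructivist_beliefs + prof_dev_hours", data=df
).fit()
print(ols_fit.summary())

# Across-level analysis: a two-level random-intercept model in the spirit of HLM,
# allowing the intercept to vary from school to school.
hlm_fit = smf.mixedlm(
    "instructional_use ~ constructivist_beliefs + prof_dev_hours",
    data=df,
    groups=df["school_id"],
).fit()
print(hlm_fit.summary())
```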
This study employs common data collection instruments and procedures across the 22 school districts to examine the effects different district-level technology support structures have on teaching and learning. Among the several specific questions addressed in phase one are:
• How and to what extent are teachers and students using technology in and out of the classroom?
• How much influence do district leadership, shared vision, provision of resources, and technical support have on the ways in which, and extent to which, teachers use technology for instructional purposes?
• How do different approaches to professional development impact instructional uses of technology?
To address these questions, surveys and site visits were used to collect information about several factors related to technology in schools. These factors include:
• District Vision for Technology
• School and District Culture
• Leadership
• Technology Resources
• Technology Support
• Professional Development
• Technology Policies and Standards
• Technology Beliefs
• Teaching Philosophy/Instructional Model
• Equity
• Community
• Demographic Information
• Instructional Uses of Technology
• Physical Infrastructure
• Barriers to Use
• Teacher Preparedness
• Non-instructional Uses of Technology
Although each of the first 13 factors is believed to impact instructional uses of technology (either positively or negatively), the factors reside at different levels of a school system. As Figure 2 depicts, the majority of the factors originate at the district level. At the school level, these district-level factors may be moderated by local leadership and culture. At the classroom level, factors internal to the teacher and characteristics of the students may further influence the ways in which technology is and is not used for instructional and preparatory purposes.
Figure 2 Origination of Factors Influencing Instructional Use of Technology
[Figure 2: diagram not reproduced. It depicts the factors listed above originating at the district level, moderated at the school level by local leadership and culture, and further shaped at the classroom level by teacher and student characteristics.]
Table 1 indicates how information about each factor was collected. Since many of these factors may play different roles at different levels, information about these factors is collected at multiple levels.
Table 1 Information Collected from Specific Individuals
[Table 1 is a checklist (not fully reproduced here) that cross-references each factor with the sources from which information about it was collected: the teacher, principal, district leader, and student surveys, and the site visits. The factors listed are Vision; Personal Teaching Philosophy; School/District Culture; Demographics; Professional Development; Technology Beliefs; Support; Resources; Equity Issues; Policies; Instructional Use; and Barriers.]
Comparing USEIT to National and State Data Sources
The Use, Support, and Effect of Instructional Technology Study was designed to enable intense and sustained investigation of a series of issues across multiple school districts. Many of the issues examined are generally encountered by schools across the nation. The 22 districts that have participated, and continue to participate, in this study were selected to allow us easy and extensive access to their schools. When selecting participating districts, we made a concerted effort to include districts in rural, suburban, and small urban settings. In addition, we included districts that were believed to have educational technology programs in different stages of development – from very advanced and well established to those still being established.
We recognize that districts from across the nation will be interested in using the information provided by this study to inform their own educational technology programs. We also understand that districts will ask how similar the study districts are to themselves. For this reason, we provide several comparisons of the characteristics of the study participants with schools, students, and teachers across the nation. In doing so, we emphasize that we are not attempting to argue that the set of districts that participated in the study is representative of districts across the nation. Rather, we present these comparisons to help readers better understand the characteristics of the study participants and how these characteristics compare with other groups of schools, students, and teachers.
Demographics and Technology Access
Each year the state of Massachusetts provides basic demographic information for each of the 372 school districts operating in the state. In addition, the Massachusetts Department of Education’s Instructional Technology Group published a “state of the state” report in 2002, compiled from the school technology plans that districts are required to submit. When combined, these two sources allow us to compare the participating USEIT districts to the state averages on demographic variables as well as on some technology measures. Table 2 summarizes this information.
Table 2 Selected Demographic Data Comparing USEIT Districts and Massachusetts
Columns: District Name | # of Students | Elementary Schools | Middle Schools | High Schools | % White | % Free Lunch | Students per A/B Computer | Students per Computer (all types)
Acton-Boxborough 2,269 0 1 1 88% 1% 6.0 4.1
Acton 2,386 5 0 0 87% 3% 7.1 5.5
Boxborough 634 1 0 0 89% 3% 4.21 4.08
Arlington 4,178 7 1 1 87% 10% 8.1 4.9
Ashburnham-Westminster 3,820 3 1 1 96% 7%
Bedford 2,086 2 1 1 89% 4% 4.6 4.3
Belmont 3,608 4 1 1 86% 7% 11.0 9.2
Carlisle 787 1 1 0 94% 0% 15.13 5.39
Concord 2,063 3 1 0 90% 4% 5.25 4.32
Concord-Carlisle 1,016 0 0 1 86% 4% 6.3 26.9
Everett 5,377 7 0 1 74% 37% 7.15 5.83
Hampden-Wilbraham 3,820 6 1 1 96% 6% 6.85 6.85
Ipswich 1,953 2 1 1 97% 8% 3.5 2.96
Mohawk Trail 1,719 4 0 1 97% 23% 1.17 0.9
Mount Greylock 833 0 0 1 94% 9% 5.05 5.05
Newton 11,248 16 4 2 82% 5% 9.54 6.14
Northbridge 2,422 4 1 1 95% 28%
Norwood 3,539 6 1 1 87% 9% 6.85 6.85
Shrewsbury 4,512 6 1 1 88% 5%
Sutton 1,593 2 1 1 99% 2%
Swampscott 2,379 4 1 1 96% 4% 8.78 8.47
Waltham 5,187 9 2 1 65% 26% 10.5 7.84
Watertown 2,657 3 1 1 92% 19% 6.12 4.18
Weston 2,147 3 1 1 81% 3%
Winchester 3,285 5 1 1 93% 3% 8.37 6.77
Total USEIT 75,518 103 23 22 89% 9% 7.1 6.5
MASSACHUSETTS 979,593 1270 282 318 76% 25% 5.7 4.8
Data source: Massachusetts Department of Education
Table 2 shows the basic topography of the participating school districts. In most cases, a school district had at least one middle and one high school and about five elementary schools. In some cases, a regional high school was shared across two school districts (for example, Concord-Carlisle and Acton-Boxborough). The participating USEIT districts differ from the state average on free/reduced lunch participation and ethnic composition. Specifically, the USEIT sample districts are composed of 89 percent white students compared to the state average of 76 percent. Nine percent of the USEIT district students participate in a free/reduced lunch program compared to the state average of 25 percent. Looking specifically at access to technology in schools, the state reports more computers per student than the USEIT districts report. When examining only A/B type computers (see Endnote 1), the USEIT districts and the Massachusetts averages are 7.1 and 5.7 students per computer, respectively. Across all types of computers this relationship changes little, with 6.5 and 4.8 students per computer reported for the USEIT and total state averages, respectively. Thus, the USEIT districts enroll fewer minority and financially burdened students than the state as a whole, but they also have fewer computers per student than the state average.
Student Access To and Use of Computers: Comparison to the Nation
In early 2001, Harris Interactive, Inc. conducted a nationally representative survey of students in Grades 7 through 12. Table 3 below outlines the differences between the two data sources.
Table 3 Comparison of USEIT and Harris Data Sources

                           USEIT                 Harris
Methodology                Questionnaire         Telephone survey
Date of data collection    Spring 2002           January 15–28, 2001
Sample (n)                 8,371                 500
Sample (grades)            8th and 11th          7th–12th
Sample (geography)         Massachusetts         Across USA
Data collected in          Classroom (school)    Student’s home
The two surveys have some overlap in their content. In some cases, the questions are nearly identical. In other cases, the questions differ in scale, wording, and tone, yet the general content is similar enough to allow comparison.
Both surveys addressed students’ access to technology at home, and the responses were similar. The Harris Poll reported that 94 percent of the surveyed students had computer access in their home. Similarly, in the USEIT survey 95 percent of all Grade 8 and 11 students report having a computer at home. Both surveys asked students about their access to the Internet in their home. Again, the responses from the two surveys are similar, with 88 percent of Harris Poll students reporting Internet access at home and 91.4 percent reporting home access in the USEIT sample.
Both surveys also addressed students’ access to technology in their schools. Although most of the items in the two surveys address different topics or were worded differently and do not allow direct comparison, some of the items are similar enough to provide a meaningful comparison. One item asks students about the availability of computers at their school. Table 4 shows the differences in the questions as well as the responses from both survey instruments.
Table 4 Comparison of Ease of Students’ Computer Access in School

USEIT–Student*: “When you want to use a computer in school is it…”
Harris*: “Which of the following best describes the availability of computers at your school?”

                                          USEIT    Harris
Always easy to find a computer             29%      47%
Usually easy to find a computer            44%      43%
Sometimes difficult to find a computer     18%       8%
Always hard to find a computer              7%       2%

* USEIT Study (grades 8 and 11), Harris Poll (grades 7 through 12)
Generally, the USEIT results indicate that students have more difficulty accessing technology in school than the Harris results suggest. Specifically, the percentage of students reporting that it is “always easy to find a computer” was 18 percentage points higher in the Harris Poll than in the USEIT survey.
Since both surveys ask students about their use of technology in different subjects, it is possible to compare results for English, Math, Science, and Social Studies classes. Table 5 shows the percentage of students who use computers across the four main subject areas.
Table 5 Comparison of Student Computer Use in English, Math, Science, and Social Studies

USEIT–Student*‡: “How often do you use a computer in [subject] class?”
Harris*: “In which of the following classes do you use computers?”

                  USEIT    Harris
English            49%      61%
Math               29%      26%
Science            53%      50%
Social Studies     64%      55%

* USEIT Study (grades 8 and 11), Harris Poll (grades 7 through 12)
‡ The percentage represents students who report using computers a couple of times a year or more.
The above table shows that across the two surveys, student use is very similar in Math and Science, but differs in English and Social Studies. It is important to note that these results do not address the relative frequency of students’ technology use in these classes, only whether they have used technology in the class or not. Unfortunately, a more detailed comparison is not possible since the two instruments employ different scales.
Another question that was similar enough to allow meaningful comparison addressed where students use technology the most in their schools. Table 6 displays the frequencies of responses across both samples of students.
Table 6 Comparison of Where Students Use Technology the Most in School

USEIT–Student*: “Where do you use technology (computers, AlphaSmarts, etc.) most in school?”
Harris*: “Where do you use computers most often in school?”

                               USEIT    Harris
In the classroom                10%      24%
In a computer lab               68%      39%
In the library/media center     20%      35%

* USEIT Study (grades 8 and 11), Harris Poll (grades 7 through 12)
Note: Table percentages do not equal 100% because of missing data.
Again, there are differences in the school use of technology between the two groups. Specifically, the USEIT 8th and 11th grade students predominantly access technology in computer labs (68 percent), while Harris grade 7–12 students use computers more regularly in classrooms and in the library/media center.
These comparisons suggest that the USEIT and Harris samples are quite similar in home access, but differ regarding use of technology in school. Specifically, the Harris study reported greater access to technology in school as compared to the students participating in the USEIT study. Moreover, the USEIT sample appears to have less access to technology in the classroom and the library than the Harris respondents. With respect to where students learn new things about technology and computers, Table 7 indicates that a higher percentage of USEIT students report learning new things at home as compared to the Harris sample.
Table 7 Comparison of Where Students Learn New Things With Technology

USEIT–Student*: “Where do you usually learn how to do new things with computers?”
Harris*: “Where have you learned the most about using computers?”

             USEIT    Harris
At home       70%      56%
At school     25%      39%

* USEIT Study (grades 8 and 11), Harris Poll (grades 7 through 12)
Teacher Access to Technology
It is also useful to examine how closely the participating USEIT teachers’ responses resemble teachers’ access to technology across the nation. Unfortunately, a direct comparison is not possible since no national surveys of teachers’ technology access have been conducted recently. However, in Becker’s Fall 1998 Teaching, Learning and Computing Survey, 80 percent of teachers reported that they had a computer in their home. Similarly, the 2002 U.S. Census Current Population Survey report, “A Nation Online,” reports that between 1998 and 2002 home access to computers for adults who have earned a college degree has grown at a rate of 5.3 percent per year. Since teachers have college degrees, it is reasonable to assume that their growth rate is similar to 5.3 percent a year. Applying a growth rate of 5.3 percent a year to Becker’s findings, one would expect a 15.9 percentage-point increase in the proportion of teachers who own a computer at home, which results in an estimated 95.9 percent of teachers having a computer in their home. The percentage of USEIT teachers who own a home computer is remarkably similar: 95 percent. Although there are no other data that can be compared directly or indirectly to national data sources related to teachers’ use of computers in or out of school, it appears that home access to computers for the teachers participating in the USEIT study is similar to the projected access of teachers nationwide.
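The projection described above is a simple linear extrapolation, and the short sketch below reproduces its arithmetic. The three-year span is inferred from the figures reported above (a 15.9-point increase at 5.3 points per year), and the growth rate is treated as additive percentage points, as in the report’s calculation.

```python
# Reproduces the back-of-the-envelope projection of teacher home computer access.
becker_1998_access = 80.0   # percent of teachers with a home computer (Becker, Fall 1998)
annual_growth = 5.3         # percentage points per year ("A Nation Online", 2002)
years = 3                   # span implied by the 15.9-point increase used in the report

projected_access = becker_1998_access + annual_growth * years
print(f"Projected teacher home access: {projected_access:.1f}%")  # 95.9%
print("USEIT teachers reporting a home computer: 95%")
```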
Variability Across USEIT Study Districts
As presented above, the USEIT teachers and students appear to have similar access to computers at home as compared to national samples. Students participating in the USEIT study, however, report higher use of computers in school labs and libraries and appear to depend less on school for learning new things about computers as compared to the national data source. Despite these similarities and differences, it is important to recognize that within the USEIT data set there is significant variation among students, schools, and districts.
As an example, Figure 3 displays four measures of technology usage for each participating high school. These technology uses include teacher use of technology for preparation, teacher use of technology during instruction, student use of technology in the classroom as reported by teachers, and student use of technology in the classroom as reported by students. For each participating middle and high school, the mean for each measure of technology usage is displayed.
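For readers unfamiliar with the z-score units used in Figures 3 and 4, the sketch below applies the standard z-score calculation to a set of hypothetical school means. The numbers are invented for illustration, and the exact scaling procedure used in the report may differ.

```python
# Hypothetical school means standardized to z-score units, as in Figures 3 and 4.
school_means = [2.1, 2.8, 1.4, 3.6, 2.5]   # e.g., mean reported instructional use per school

mean = sum(school_means) / len(school_means)
sd = (sum((x - mean) ** 2 for x in school_means) / (len(school_means) - 1)) ** 0.5

z_scores = [(x - mean) / sd for x in school_means]
for school, z in zip("ABCDE", z_scores):
    # z = 0 is the average school; positive values indicate above-average reported use.
    print(f"School {school}: z = {z:+.2f}")
```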
Figure 3 High Schools’ Reported Use of Technology as Reported by Teachers and Students (sorted by Instructional Use)
[Figure 3: chart not reproduced. For each participating high school (identified by an anonymous code), it plots four measures of reported use on a scale from -3.00 to 3.00 in z-score units, ranging from infrequent to frequent use: Instructional Use, Prep Use, Student Use (reported by teachers), and Student Use (reported by students).]
* Data not available for “Student Use” as reported by students
Note: School and district names have been made anonymous to protect the identity of participating districts.
As seen in Figures 3 and 4, the extent to which technology is used inside and outside of the classroom varies greatly across schools. From a research perspective, this variation provides opportunities for us to identify factors that influence technology uses (such as teacher beliefs, pedagogical practices, student home use, principal leadership, shared district vision, etc.).
Figure 4 Middle Schools’ Reported Use of Technology as Reported by Teachers and Students (sorted by Instructional Use)
[Figure 4: chart not reproduced. For each participating middle school (identified by an anonymous code), it plots the same four measures of reported use on a scale from -3.00 to 3.00 in z-score units: Instructional Use, Prep Use, Student Use (reported by teachers), and Student Use (reported by students).]
* Data not available for “Student Use” as reported by students
Note: School and district names have been made anonymous to protect the identity of participating districts.
Summary
In this report we have provided an overview of the research design and presented background information for the 22 participating districts. Comparisons to state and national data were made to provide the reader with a sense of how similar or dissimilar the participating districts are to other districts across the nation. We leave it to the individual reader to determine how similar or applicable our results may be to their respective situations and environments. Although the sample of participating districts was not designed to be representative of districts across Massachusetts or the nation, we believe that the participating districts do not differ greatly from districts located in non-isolated rural areas or in areas outside of large urban centers. In other words, the participating districts should be viewed as a fair cross-section of small urban, suburban, and rural communities.
For readers interested in more detailed information regarding the sample of districts, technical issues regarding survey development and scaling, and the details of the site visit data collection and coding, this information will be presented in a forthcoming technical report. Following the technical report, a series of reports will be released that focus on specific aspects of the study. The initial set of reports will focus on descriptive results from the surveys and site visits. Additional reports will explore district, school, and classroom-level factors that influence teacher and student use of technology. Together, this collection of reports will help inform a variety of research-based decisions for teachers, administrators, policy makers, and other researchers regarding the use, support, and effect of instructional technology in schools.
Endnote
1 In a 2002 Massachusetts Department of Education report, A and B computers were classified the following way: “During the period that this data was collected, Type A computers were defined as machines with 64 MB RAM or higher, which are capable of running multimedia applications, high-end applications, and streamed video. Type B computers were defined as multimedia computers 16 MB RAM to 64 MB RAM, which have CD-ROM access and Internet capability using a browser.” (Massachusetts DOE, 2001, p. 5)
References
Cuban, L. (2001). Oversold & Underused: Computers in the Classroom. Cambridge, MA: Harvard University Press.
Market Data Retrieval. (2001). Technology in Education. (A report issued by Market Data Retrieval). Shelton, CT: Market Data Retrieval.
Market Data Retrieval. (1999). Technology in Education 1999. (A report issued by Market Data Retrieval). Shelton, CT: Market Data Retrieval.
Glennan, T. K., & Melmed, A. (1996). Fostering the use of educational technology: Elements of a national strategy. Santa Monica, CA: RAND.
Massachusetts Department of Education. (2002). EdTech 2001. Retrieved January 10, 2003, from http://www.doe.mass.edu/edtech
Edwards, V. (Ed.). (2002). Technology Counts 2002 [Special issue]. Education Week, 21(35).
Ravitz, J., Wong, Y., & Becker, H. (1999). Teacher and teacher directed student use of computers and software. Irvine, CA: Center for Research on Information Technology and Organizations.
U.S. Department of Commerce. (2002). A Nation Online: How Americans Are Expanding Their Use of the Internet. Retrieved January 10, 2003, from http://www.ntia.doc.gov/ntiahome.dn/index.html
inTASC is a not-for-profit research group that works collaboratively with schools, educational agencies, and businesses to conduct research and development on a variety of issues related to technology and assessment. inTASC brings together researchers who have examined several aspects of technology and assessment in schools over the past decade to focus on new questions and issues that arise from the field. inTASC is unique in that it does not develop research studies and then seek schools to participate in research activities. Instead, schools, educational agencies, and businesses approach inTASC with their own ideas and/or questions that require systematic research to address. Research conducted by inTASC is developed, conducted, and often disseminated in collaboration with our educational and business partners.
inTASC believes that advances in educational technology and continuously emerging applications of those technologies, coupled with growing demands to document impacts on teaching and learning, require a dual focus on instructional uses of technology and applications of technology to new forms of assessment. For this reason, inTASC collaborates on research that focuses on instructional uses of technology and on applications of computer-based technologies to the technology of testing and assessment. It is our hope that this dual focus will enable us to provide research-based information to schools and educational leaders about the impacts of educational technology, and to produce new forms of assessment that capitalize on the powers of computer-based technologies and that are more sensitive to the types of learning enabled by educational technologies.
Use, Support, and Effect of Instructional Technology Study
www.intasc.org