
Abstract

A microeconomics principles course employing random assignment across three sections with different teaching models is used to explore learning outcomes as measured by a cumulative final exam for students who participate in traditional face-to-face classroom instruction, blended face-to-face and online instruction with reduced instructor contact time, and a purely online instructional format. Evidence indicates learning outcomes were reduced for students in the purely online section relative to those in the face-to-face format by 5 to 10 points on a cumulative final exam. No statistically significant differences in outcomes are observed for students in the blended relative to the face-to-face section.
American Economic Review: Papers & Proceedings 2016, 106(5): 1–5
http://dx.doi.org/10.1257/aer.p20161057
A Randomized Assessment of Online Learning
By W T. A, K A. C,  O R. H*
* Alpert: University of Connecticut-Stamford, 1
University Place, Stamford, CT 06901 (e-mail: alpert@
uconn.edu); Couch: University of Connecticut, 365 Fairfield
Way, Storrs, CT 06269 (e-mail: kenneth.couch@uconn.edu);
Harmon: University of Connecticut-Stamford, 1 University
Place, Stamford, CT 06901 (e-mail: oskar.harmon@uconn.
edu).
Go to http://dx.doi.org/10.1257/aer.p20161057 to visit
the article page for additional materials and author disclo-
sure statement(s).
This paper contains estimates of the impact
of different instructional models that incorporate
online course content on the learning outcomes
of college students in a principles of
microeconomics course, using a randomized
study design. In the
existing literature, there are only three published
studies (Figlio, Rush, and Yin 2013; Bowen et al.
2014; and Joyce et al. 2015) that use a random
design to explore the impact of online education
in a semester-length college course on learning outcomes.
Thus, this study provides an important extension
to a literature that is extraordinarily small given
the widespread adoption of online teaching and
its potential impact on student outcomes at the
postsecondary level.
There is a large prior literature addressing the
impact of incorporating online content deliv-
ery into educational instruction. That literature
examines impacts on classes at the primary,
secondary, and postsecondary levels. As sum-
marized in a meta-analysis released by the US
Department of Education (2010, p. xii), the ini-
tial search for articles within this massive liter-
ature located 1,132 publications as of 2008. Of
those, however, only a handful had a random
design and none considered a semester-length
college-level course.
Against this backdrop, the Figlio, Rush, and
Yin (2013) study of the impact of teaching
microeconomics principles in a purely online or
face-to-face classroom setting on student learn-
ing outcomes was the rst randomized study for
a semester-length course at the postsecondary
level. Students were randomized after enroll-
ment into either a classroom-based or purely
online section. In assessing the difference in
outcomes across sections, estimates indicate
(Figlio, Rush, and Yin 2013, Table 3) that stu-
dents attending the live lectures do roughly 3
points better than those in the purely online sec-
tion on a mid-term exam and about 2.5 points
better on their nal exam. Thus, the average stu-
dent performed worse in the online sections.
The study of Joyce et al. (2015) similarly uses
a randomized study design to examine different
instructional models for principles of micro-
economics focusing on a contrast between a
face-to-face section that met twice a week versus
a blended one that instead met once per week.
Both sections had access to online materials.
This contrast of two course sections in a blended
teaching format that varied by providing more
or less face-to-face contact time addresses an
important margin of decision making for edu-
cational administrators. Across the sections with
more or less contact time, the study reports that
no signicant differences are found for the aver-
age nal exam score.
The analysis of blended versus purely online
delivery of a basic statistics course contained in
Bowen et al. (2014), while not directly applica-
ble to instruction of microeconomics, is nonethe-
less notable within the very small set of studies
that employ a randomized research design. The
study was conducted across six campuses com-
paring a machine-based learning model with one
hour per week of instructor contact time to class-
room courses with about three hours per week of
contact time. The study examines a number of
learning outcomes including nal exam scores
and concludes that outcomes across sections are
about the same. This comparison of student out-
comes associated with a blended course design
versus traditional classroom instruction arrives
at a similar conclusion to the study by Joyce
et al. (2015).
We highlight two important differences
between this study and the prior literature. First,
the random design of this study simultaneously
considers three course formats (face-to-face,
blended, and purely online) whereas prior stud-
ies provide a single contrast (Figlio, Rush, and
Yin 2013, face-to-face versus online; Joyce et
al. 2015 and Bowen et al. 2014, face-to-face
versus blended). This allows us to consolidate
and conrm prior ndings within a single study.
Second, this study randomly assigned students
at the point of expressing interest in taking
the principles course rather than after enrollment
(Figlio, Rush, and Yin 2013; Bowen et al. 2014;
Joyce et al. 2015). This allows us to examine the
potential impact of nonenrollment and fail-
ure to complete courses on learning outcomes.
I. The Randomized Trial
The sample was collected from a microeco-
nomics principles course taught over 4 con-
secutive 16-week semesters at a large public
university in the Northeast. Each semester a
principles of microeconomics course was listed
with 3 sections, each capped at 35 students and
each offering one of the instructional models. As an
incentive to participate, students were given five
points on their term average for the course. Five
hundred nineteen students were randomized
across the four semesters.
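The paper does not describe the assignment mechanism in code; the sketch below illustrates one way such a three-arm randomization could be implemented. Only the three arm names and the 35-student section cap come from the text; everything else (data structures, seeding, handling of a full section) is an assumption for illustration.

```python
# Minimal sketch of a three-arm randomization (assumed implementation;
# only the arm names and the 35-student cap come from the paper).
import random

ARMS = ["face_to_face", "blended", "online"]
SECTION_CAP = 35  # each section capped at 35 students per semester


def assign_arms(student_ids, seed=0):
    """Randomly assign interested students to the three arms,
    respecting the per-section cap (overflow handling is assumed)."""
    rng = random.Random(seed)
    ids = list(student_ids)
    rng.shuffle(ids)
    counts = {arm: 0 for arm in ARMS}
    assignment = {}
    for sid in ids:
        open_arms = [a for a in ARMS if counts[a] < SECTION_CAP]
        if not open_arms:
            break  # all three sections are full this semester
        arm = rng.choice(open_arms)
        assignment[sid] = arm
        counts[arm] += 1
    return assignment


# Hypothetical usage for one semester's pool of interested students:
one_semester = assign_arms(range(1, 131))
```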
A. Instructional Formats
The experimental design randomly assigned
students to one of three delivery modalities:
classroom instruction, blended instruction with
some online content and reduced instructor
contact, and purely online instruction. The tra-
ditional section met weekly for two 75-minute
sessions, alternating between a lecture and dis-
cussion period. The blended section met weekly
with the instructor for a 75-minute discussion
period. As a substitute for the lecture period,
students were given access to online lecture
materials. The online section had class discus-
sion in an online asynchronous forum, and for
the lecture period students were given access to
the same online lecture materials as students in
the blended section. The online materials were
developed using best practice standards from
the Higher Ed Program Rubric for online edu-
cation as described on the website of the Quality
Matters (2014) organization. For the three arms
of the experiment, lectures, discussions, and
other instructional content were prepared and
delivered by the same instructor. The measure
of learning outcomes examined is a cumulative
final exam score.
All three sections were given access to the
PowerPoint slides used in the lectures. Here we
discuss the additional online lecture materials
provided to the online and blended sections.
One additional online lecture material was a
tape of the live lecture originally given to the
face-to-face section. To reduce the incidence of
lecture-listening fatigue, the online and blended
sections were also given access to a compact,
20-minute version of the lecture that used some
of the same PowerPoint slides as the full length
one. The compact version is closed-captioned,
which is helpful for students for whom English
is a second language and hearing-impaired stu-
dents. Additionally, a version of the shortened
lecture closed-captioned in Spanish was pro-
vided to the online and blended sections. The
closed-captioning is keyword searchable so stu-
dents can easily locate sections of the lecture for
review.
The online and blended sections were also
given access to an enhanced PowerPoint version
of the lecture with hyperlinks separating it into
five manageable chunks and a voice-over of the
main points. The slides contain economic dia-
grams drawn sequentially by user mouse clicks
and guided by audio narration. Students are pro-
vided with a drawing tool and encouraged to
draw the diagram. The script of the audio nar-
ration appears in a notes pane and the translate
function in PowerPoint will display the script in
a language of choice.
B. Online Discussion Forums
As a substitute for the learning from inter-
actions between an instructor and students and
between students that can occur in a classroom
setting, the best practice recommendation from
Quality Matters is to provide an online discus-
sion forum that engages students and fosters a
sense of community. In the online section, asyn-
chronous discussion was conducted using two
equal-sized Facebook groups. A portion of the
course grade depended on the level of participa-
tion. Participation included posting (or respond-
ing to) questions, posting links to articles or
videos helpful for understanding topics in the
lecture, and engaging in discussion questions
posted by the instructor.
II. Descriptive Statistics and Estimation Results
A. Course Completers
First, we present evidence that across the arms
of the experiment (face-to-face, blended, and
online) descriptive characteristics of the sample
were largely indistinguishable at the point of
course completion. Of the randomized students,
323 completed the course.
Online Appendix Tables A.1 and A.2 provide
comparisons of average demographic and back-
ground academic characteristics for students
who completed the course in the face-to-face
relative to the purely online (Table A.1) and
blended (Table A.2) sections. Across virtually
every characteristic, the random design appears
to have been successful with mean values for the
online and blended sections indistinguishable
from the face-to-face sections at conventional
levels of signicance. The only statistically sig-
nicant difference ( t-statistic = 2.08) indicates
that those in the purely online course have taken
six additional credit hours in the past. Given the
number of comparisons made, this one signif-
icant difference is within the range of random
error. Controls for prior credit hours and the
other covariates shown in the descriptive tables
are included in our estimates. We conclude that
the randomization was successful, most importantly
for the key characteristics likely to be closely
related to subsequent academic outcomes (prior
GPA and SAT scores).
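For readers who want to reproduce this style of balance check, the sketch below runs two-sample Welch t-tests of covariate means between the face-to-face arm and a treatment arm, in the spirit of Appendix Tables A.1 and A.2. The data frame, column names, and simulated values are all hypothetical stand-ins; the paper's underlying data are not reproduced here.

```python
# Sketch of a covariate balance check (hypothetical data and column
# names); compares mean characteristics across two experimental arms.
import numpy as np
import pandas as pd
from scipy import stats


def balance_table(df, arm_col, control, treatment, covariates):
    """Welch t-tests of covariate means, control arm vs. treatment arm."""
    rows = []
    for cov in covariates:
        a = df.loc[df[arm_col] == control, cov].dropna()
        b = df.loc[df[arm_col] == treatment, cov].dropna()
        t_stat, p_val = stats.ttest_ind(a, b, equal_var=False)
        rows.append({"covariate": cov, "mean_control": a.mean(),
                     "mean_treatment": b.mean(),
                     "t_stat": t_stat, "p_value": p_val})
    return pd.DataFrame(rows)


# Hypothetical usage with simulated data standing in for the sample:
rng = np.random.default_rng(0)
sample = pd.DataFrame({
    "arm": rng.choice(["face_to_face", "blended", "online"], size=323),
    "prior_gpa": rng.normal(3.0, 0.4, size=323),
    "prior_credits": rng.normal(45.0, 20.0, size=323),
    "sat": rng.normal(1100.0, 150.0, size=323),
})
print(balance_table(sample, "arm", "face_to_face", "online",
                    ["prior_gpa", "prior_credits", "sat"]))
```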
We use linear regression to estimate the
differences in outcomes
across the face-to-face, online, and blended class
modalities for those who completed the course.
Panel A of Table 1 contains parameter estimates
for differences in scores (out of a possible 100)
on a cumulative nal exam taken in a similar
setting by all students. Column 1 contains esti-
mates not including available covariates. As
can be seen there, students in the purely online
course score about 4.9 (t-statistic = 3.09) points
worse than those in the face-to-face course. In
column 2, covariates are added, most related to
learning outcomes (prior GPA, prior credits, and
SAT scores). With these controls, students in
the purely online course are still found to score
5.2 (t-statistic = 3.26) points lower on the final
exam. Column 3 includes all available covari-
ates. There, students are still found to score 4.2
(t-statistic = 2.68) points lower in the online sec-
tion than the face-to-face variant. The sign of the
impact of participating in the blended course is
negative but the parameters are not significantly
different than zero at conventional levels across
the three columns. Online Appendix Table B.1
provides a full set of parameter estimates and
diagnostic statistics for the models associated
with the entries in panel A of Table 1.
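A minimal sketch of the panel A specification follows: the final exam score is regressed on treatment dummies, with face-to-face as the omitted category, on the sample of completers. The column names are assumptions, and the extra controls in the third column are placeholders, since the paper says only "all available covariates" without listing them.

```python
# Sketch of the Table 1, panel A regressions (column names and the
# exact covariate list are assumed; the paper names prior GPA, prior
# credits, and SAT scores as the covariates most related to learning).
import statsmodels.formula.api as smf


def panel_a_models(df):
    """Estimate the three panel A columns on course completers."""
    completers = df[df["completed"] == 1]
    specs = {
        "col1": "final_exam ~ online + blended",
        "col2": "final_exam ~ online + blended"
                " + prior_gpa + prior_credits + sat",
        # "All available covariates": the extra controls here are
        # illustrative placeholders, not the paper's actual list.
        "col3": "final_exam ~ online + blended + prior_gpa"
                " + prior_credits + sat + age + female + work_hours",
    }
    return {name: smf.ols(f, data=completers).fit()
            for name, f in specs.items()}


# Hypothetical usage:
# models = panel_a_models(sample)
# print(models["col1"].params["online"])  # about -4.9 in the paper
```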
B. Accounting for Differential Attrition
An additional concern with online education
is the willingness of students to take a course and
the impact of the delivery method on comple-
tion. The development of the online tools used
in this experiment was consistent with current
best practices in course design. Nonetheless,
there was clearly greater attrition among those
randomly given the opportunity to enroll in the
online section.
From the point students received permission
to enroll to course completion, potential par-
ticipation in the face-to-face section declined
from 175 to 120 students (30 percent). For the
blended course section, the decline was from
172 randomized students to 110 completers (36
percent). The largest decline was observed for
the online arm where 172 students were assigned
to the course and 93 completed (46 percent).
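As a quick check, the attrition shares quoted above follow directly from the counts reported in the text:

```python
# Attrition rates recomputed from the counts reported in the text.
for arm, randomized, completed in [("face-to-face", 175, 120),
                                   ("blended", 172, 110),
                                   ("online", 172, 93)]:
    rate = (randomized - completed) / randomized
    print(f"{arm}: {rate:.0%} did not complete")
# face-to-face: 31% (the text rounds this to 30 percent)
# blended: 36%
# online: 46%
```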
T 1—OLS E F E  M
(1) (2) (3)
Panel A. Course completers
Dummy online = 14.912*** 5.201*** 4.232***
(3.09) (3.26) (2.68)
Dummy blended = 11.454 1.703 0.996
(0.95) (1.12) (0.66)
Observations 323 265 215
Panel B. Adjusting for attrition and noncompletion
Dummy online = 113.32*** 9.517** 10.28**
(3.29) (2.20) (2.17)
Dummy blended = 14.261 1.518 0.121
(1.05) (0.35) (0.03)
Observations 519 411 324
Note: Table entries are ordinary least squares (OLS) param-
eter estimates ( t-statistics).
Source: Author calculations.
*** Signicant at the 1 percent level.
** Signicant at the 5 percent level.
* Signicant at the 10 percent level.
Descriptive statistics at the point of randomiza-
tion are provided in online Appendix Tables A.3
and A.4. As seen at course completion, average
student characteristics across the course sections
are almost identical. Similar descriptive statis-
tics at the point of enrollment are contained in
online Appendix Tables A.5 and A.6.
To gauge the potential impact on learning
related to differential attrition across the three
arms of the experiment, we use the complete
sample of those given a permission number
along with outcomes for those who completed
the course to provide estimates where we assign
those who did not complete the course an out-
come of zero and recalculate the estimates
contained in panel A of Table 1. Those esti-
mates, contained in panel B, indicate that for
the online course section, the additional attri-
tion relative to the face-to-face setting ampli-
fies the potential negative impact of the course
on the students to be served. As shown in col-
umn 1, without including any covariates, the
opportunity to enroll in the online course offering
has an estimated negative impact of 13.3
(t-statistic = 3.29) points on a scale of 100.
Including covariates most related to student
learning in column 2, the estimated negative
impact is 9.5 (t-statistic = 2.20) points. Including
all covariates in column 3, the estimated
reduction in the test score is 10.3 (t-statistic =
2.17) points. While all of the estimates for the
impact of online course delivery are significantly
different than zero at conventional levels of
significance, none are for
the blended sections. Online Appendix Table
B.2 contains a full set of parameter estimates
and diagnostic statistics associated with panel B
of Table 1.
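The panel B adjustment can be sketched in a few lines: every randomized student who did not complete the course is assigned a final exam score of zero, and the same specification is re-estimated on the full randomized sample. As before, the column names are hypothetical.

```python
# Sketch of the panel B adjustment: zero is imputed for the final exam
# score of every randomized student who did not complete the course,
# then the OLS specification is re-run on the full sample.
import statsmodels.formula.api as smf


def panel_b_model(df, formula="final_exam_adj ~ online + blended"):
    """Re-estimate treatment effects with non-completers scored as zero."""
    adjusted = df.copy()
    adjusted["final_exam_adj"] = adjusted["final_exam"].where(
        adjusted["completed"] == 1, 0.0)
    return smf.ols(formula, data=adjusted).fit()


# Hypothetical usage:
# model = panel_b_model(sample)
# print(model.params["online"])  # about -13.3 without covariates
```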
III. Conclusion
This paper contains results from a random-
ized study of student outcomes on a cumulative
semester exam in a face-to-face course ver-
sus two popular variants of online education,
a purely online course and a blended model
that reduces instructor contact by one-half.
Descriptive evidence indicates that randomiza-
tion was successful at the time permission num-
bers were assigned, at the time they were used, and at course
completion. Additionally, across most character-
istics, those in the experiment appear similar to
students enrolled in a nonexperimental section
of the same course (see online Appendix Table
A.7).
Estimates indicate that those who completed
the purely online course had learning outcomes
that were signicantly worse than those in the
face-to-face section of the course: this differ-
ence is about four to ve points or one-half
of a letter grade. Differences in outcomes for
those who completed the blended relative to the
face-to-face course were not significantly differ-
ent than zero.
Ignoring possible contamination through
enrollment outside the experiment, we provide
estimates that incorporate the potential negative
impact from differential attrition across sections.
Those estimates indicate that assignment to the
purely online course resulted in a reduction in
average scores on a cumulative nal exam of
about 10 points out of 100—the equivalent of a
letter grade. However, no statistically significant
findings were found for the blended course for-
mat when accounting for attrition.
The learning outcomes on exam scores are not
meaningfully different for the average student in
the control group (classroom instruction) and
blended treatment group for any of the estimates
presented, consistent with the results of Bowen
et al. (2014) and Joyce et al. (2015). However,
exam scores are consistently worse for the
online treatment group, consistent with findings
from Figlio, Rush, and Yin (2013) and earlier
studies (e.g., Brown and Liedholm 2002) based
on observational data. These results suggest that a
potentially promising avenue for economizing
on teaching resources while maintaining student
outcomes is to use blended teaching models of
instruction for economics principles that reduce
instructor contact time in a classroom setting.
REFERENCES
Bowen, William G., Matthew M. Chingos, Kelly
A. Lack, and Thomas I. Nygren. 2014. “Inter-
active Learning Online at Public Universities:
Evidence from a Six-Campus Randomized
Trial.” Journal of Policy Analysis and Manage-
ment 33 (1): 94–111.
Brown, Byron W., and Carl E. Liedholm. 2002.
“Can Web Courses Replace the Classroom
in Principles of Microeconomics?” American
Economic Review 92 (2): 444–48.
Figlio, David, Mark Rush, and Lu Yin. 2013. “Is
It Live or Is It Internet? Experimental Estimates
of the Effects of Online Instruction on Student
Learning.” Journal of Labor Economics 31 (4):
763–84.
Joyce, Ted, Sean Crockett, David A. Jaeger, Onur
Altindag, and Scott D. O’Connell. 2015. “Does
Classroom Time Matter?” Economics of Edu-
cation Review 46: 44–67.
Quality Matters. 2014. “Higher Ed Program
Rubric.” https://www.qualitymatters.org/rubric
(accessed February 7, 2014).
US Department of Education. 2010. “Evaluation
of Evidence-Based Practices in Online Learn-
ing: A Meta-Analysis and Review of Online
Learning Studies.” Washington, DC: US
Department of Education, Ofce of Planning,
Evaluation, and Policy Development.