Higher education: The impact of recreational marijuana
on college applications
Christopher D. Blake
Joshua Hess
Danna Kang Thomas
Abstract
We investigate the effects of local recreational marijuana (RMJ) policy changes on college applications and find significant boosts in quantity. There is no evidence of a change in application quality. These results are robust across regression discontinuity models, traditionally used fixed effects panel models, and Difference-in-Difference models with robust treatment impacts for multiple periods. However, results for application quantity diminish as: 1) smaller schools are included; and 2) more states legalize RMJ. We conclude that RMJ legalization generated some benefits for large schools in early adopting states, without negatively affecting any school.
JEL: I23; I18; J18
Keywords: Recreational Marijuana, College Applications, Difference-in-Difference
Oxford College, Emory University (christopher.blake@emory.edu)
Department of Economics, Darla Moore School of Business, University of South Carolina
(joshua.hess@moore.sc.edu)
Department of Economics, Darla Moore School of Business, University of South Carolina
(danna.thomas@moore.sc.edu)
1 Introduction
Where should I apply to go to college? This question is complicated, and given the
economic significance of college attendance, it is important to uncover the determinants of
college application and enrollment decisions. The traditional economic approach has been
to model college choice as a sophisticated maximization problem over the present value of
expected wages less the economic cost of attendance–which is determined by variables such
as school quality, foregone wages, distance from home, and tuition (Willis and Rosen, 1979; Fuller et al., 1982; Manski and Wise, 1983; Long, 2004; Griffith and Rask, 2007; McDuff, 2007). However, recent research has drawn attention to how students' application consideration sets are impacted by factors beyond net future earnings, such as campus quality-of-life and amenities (Alter and Reback, 2014; Jacob et al., 2018), charismatic college leadership (Bastedo et al., 2014), sports (Pope and Pope, 2009; Perez, 2012; Pope and Pope, 2014; Anderson, 2017; Eggers et al., 2020), and individual information shocks (Bond et al., 2018).
Unexamined to our knowledge, though, is the impact of local public policy changes that
do not directly affect the economic costs and benefits to application decisions. For exam-
ple, public policy that potential applicants view as restrictive (e.g. blue laws or vape store
bans) could remove colleges from students’ consideration sets. In contrast, well-publicized,
permissive public policy could spur student interest in local universities either through in-
creased salience, applicants’ tastes and preferences, or the view that liberalized policies occur
alongside other desirable attributes (e.g., tolerance, lower policing, "party school").
Therefore, in this paper, we investigate the impacts of a visible, permissive public policy
change–recreational marijuana (RMJ) legalization–on the quantity and quality of college
applications. Previous studies have shown high initiation rates of marijuana use by college students (Pinchevsky et al., 2012; Phillips et al., 2017) as well as increased marijuana use by young adults and college students after marijuana laws liberalize (Wen et al., 2015; Kerr et al., 2017, 2018; Miller et al., 2017; Pearson et al., 2017; Bae and Kerr, 2020).
This stream of research suggests that less restrictive marijuana laws may be attractive for
college students.
In order to evaluate how RMJ laws impact applications, we adopt a three-pronged empiri-
cal strategy utilizing (1) regression discontinuity models, (2) fixed effects models as tradition-
ally used for Difference-in-Difference estimation for panel models, and (3) novel approaches
to panel data Difference-in-Difference model estimation that incorporate cohort effects. The
regression discontinuity models allow us to test for immediate short-run impacts of recre-
ational marijuana legalization while the fixed effects models–which are in the spirit of Pope
and Pope (2009) and Pope and Pope (2014)–allow us to investigate not only the short-run
but also longer run trends in applications. Moreover, our most flexible specifications allow
for cohort-specific effects while remaining robust to staggered assignment of treatment.
We evaluate the models using a panel constructed from the Integrated Postsecondary Education Data System (IPEDS) of nearly all four-year postsecondary institutions for the years 2008-2020. The IPEDS data contain not only information on the total number of applications but also the number of applications by sex. Further, IPEDS provides the distribution of ACT and SAT scores for admitted applicants, which we use as a measure of applicant quality.
Our preferred panel specifications suggest a nearly 15% increase in applications after RMJ
becomes available in a state, a result consistent in each form of Difference-in-Difference esti-
mation we use. We also find gender differences in these effects: men’s applications increase
by an average of 20% after RMJ availability whereas women’s applications increase 13%
on average. Furthermore, large Colorado schools experience a nearly 30% ceteris paribus increase. Large Washington schools similarly observe an increase in applications, though this effect is less precisely measured. Across all specifications, subsequent state legalization
policies generate diminished increases in applications with each subsequent cohort.
Finally, we assess the relationship between RMJ availability and admitted cohort quality
using SAT scores as a proxy. We find some evidence, albeit weak, that RMJ availability is
associated with a higher-quality admitted cohort. This effect appears persistent over time
for non-flagship schools. We interpret this to mean that, at best, RMJ availability improved
student cohort quality and, at worst, had no effect.
These findings are related to a growing body of research on the economic impacts of
legalized marijuana (see Hansen et al., 2021, for an expansive review). Specifically, this study complements the existing studies on the effects of RMJ on universities as well as on academic outcomes. For example, Marie and Zölitz (2017) find that removing access to RMJ has a positive effect on the academic performance of currently enrolled students at Maastricht University. Wright and Krieg (2020) similarly investigate the effect of RMJ access on the academic performance of Western Washington University students.
Our study differs in that we are focused on the impact of liberalized policy rather than
usage. Consequently, we examine the impact of RMJ availability rather than legal access
(which is limited to those 21 years of age and older) on the universe of American institutions
of higher education. Whereas previous studies indicate that marijuana use has deleterious effects on students' academic outcomes, our study finds that academically weaker students are not more attracted to schools by RMJ policies than stronger students are.
The remainder of this paper is organized as follows: Section 2 discusses the relevant background on previous work and the characteristics of 'treated' schools, Section 3 covers our data sources, Section 4 presents our econometric approach, Section 5 presents our results, and Section 6 concludes.
2 Background and Descriptive Evidence
State marijuana laws in the U.S. have tended toward greater liberalization since California became the first state to legalize marijuana for medicinal purposes in the 1990s. (Figure 1a offers a graphical depiction of this progression.) While the trend has been consistent, its most dramatic stage began in November 2012, when Colorado and Washington became the first two states to legalize recreational marijuana for adults over the age of twenty-one.
Figure 1: U.S. Marijuana Policies
(a) Adoption of MJ Laws Over Time (b) Current State Policies
RMJ sales in Colorado and Washington began in January 2014 and July 2014, respectively,
and since then, seventeen other states have passed their own RMJ laws. These states are
highlighted in Figure 1b.
Figure 2a provides descriptive evidence on how these laws may have affected college
applications in the early adopting RMJ states, displaying the mean number of applications
by year. The mean for all U.S. schools–including Colorado and Washington–is also exhibited
for comparison. The green vertical bar indicates the 2013 application period before any RMJ
sales begin.
The graph shows that applications to Colorado schools for Fall 2014 were higher than in any of the previous six years. Additionally, the mean value of applications to Colorado schools surges but seems to slow within three years of RMJ availability. As noted above, RMJ
was not commercially available in Washington until July 2014, well after the application
deadline. The increase in Washington’s applications, consequently, does not appear to occur
until 2015. However, it appears to follow a similar pattern to Colorado after availability:
increased growth for three years before tapering. This is in contrast to the mean for all U.S.
schools which shows a steady linear trend rather than the upturn that seems to occur for
Figure 2: Applications Over Time
(a) Mean Applications for U.S. Schools (b) Total for Top Five CO Schools
Notes: The green vertical bar indicates the 2013 fall application period before RMJ sales begin in Colorado
and Washington.
both Colorado and Washington.
A deeper look at the makeup of Colorado's increase in applications shows an uneven distribution of the gains. This can be seen in Figure 2b. In particular, CU Boulder receives the most applications of any Colorado school, Colorado School of Mines receives the second most, Colorado State University the third most, and so on. Comparing Figure 2a to Figure 2b indicates the surge in applications to Colorado for the Fall 2014 semester went mostly to the state's largest school, CU Boulder. In fact, applications to the next two largest schools, Mines and CSU, decreased.
3 Data
The primary data set comes from the Integrated Postsecondary Education Data System (IPEDS)¹ provided by the National Center for Education Statistics. Collected through mandatory surveys, IPEDS data provides detailed snapshots on applications, enrollment, graduation, faculty and staff, financial aid, and more for all colleges and vocational institutions that participate in federal student financial aid programs throughout the United States.
¹ https://nces.ed.gov/ipeds/
We utilize IPEDS data on the number of applications and the distribution of standardized test scores for admitted applicants for nearly all four-year postsecondary institutions in the United States. Our panel covers 2008-2020. Given the relative recency of first RMJ availability in some states, our study is most relevant to schools in the first few legalizing cohorts, particularly those in Colorado and Washington.
Summary statistics for the IPEDS data are provided in the first two sections of Table 1. As seen in the first section, four-year, primarily undergraduate degree-granting schools receive nearly 7,000 applications on average, though significant variation exists in the number of applications across schools. Further, colleges in general have nearly 900 more female applicants on average than male applicants. The second section of the table summarizes school-specific characteristics such as net tuition prices and the 75th percentile of standardized test scores for each school–which we use in our empirical analysis as a measure of admitted student quality. The average for all schools in our data is 26 for the composite ACT score, while for the SAT it is 598 and 594 for the math and verbal sections, respectively.
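As a concrete illustration of the kind of aggregation behind Table 1, the short pandas sketch below computes the count, mean, standard deviation, minimum, and maximum for each variable. The file name and column names are hypothetical placeholders, not the authors' actual IPEDS variable names.

```python
import pandas as pd

# Hypothetical IPEDS extract: one row per school-year with application counts,
# prices, and admitted-student test-score percentiles.
ipeds = pd.read_csv("ipeds_panel.csv")

cols = ["applications", "applications_men", "applications_women",
        "net_price", "sat_math_75", "sat_verbal_75", "act_composite_75"]

# Count, mean, std. dev., min, and max for each variable, mirroring Table 1.
summary = ipeds[cols].agg(["count", "mean", "std", "min", "max"]).T
print(summary.round(0))
```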
Our empirical models also require several controls which account for non-school factors
that may influence student applications and success. For example, prospective students
may consider whether they should enter the collegiate system or, instead, enter the labor
market. This decision may be informed by a combination of labor market tightness and
perceived labor market opportunities. For the former, we use the state unemployment rate
as an indicator of how challenging it may be to get a job. As the unemployment rate rises,
students may view attending college as the more attractive option, making the unemployment
rate and number of applications positively related. Similarly, high unemployment rates may
incentivize continued enrollment and persistence within the college environment.
In the absence of changes to unemployment, application decisions may also be swayed
by potential earnings. We assume entry-level positions to be the baseline in most potential
students’ decision-making and use weekly wages in the service, leisure, and retail sectors
as potential factors influencing college choice. As any of these wages rise, the perceived opportunity cost of attending college increases, pressuring applications downward. Data for
both unemployment rates and sectoral wages–which are also summarized in the final section
of Table 1–are provided publicly through the Bureau of Labor Statistics (BLS).
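The lagged local-conditions controls can be constructed with a simple state-year merge. The sketch below is illustrative only (the file and column names are assumptions); it shifts the BLS series forward one year so that a fall-of-year-t application cohort is paired with year t-1 conditions, matching the one-year lag described in Section 4.

```python
import pandas as pd

# Hypothetical inputs: 'ipeds_panel.csv' has one row per school (unitid) per year;
# 'bls_state_year.csv' has one row per state-year with the unemployment rate and
# weekly wages in the service, leisure, and retail sectors.
ipeds = pd.read_csv("ipeds_panel.csv")    # unitid, state, year, applications, ...
bls = pd.read_csv("bls_state_year.csv")   # state, year, unemp, wage_service, wage_leisure, wage_retail

# Shift the BLS series forward one year so that year-t applications are matched
# with year t-1 economic conditions (the applicants' junior year).
bls_lag = bls.copy()
bls_lag["year"] = bls_lag["year"] + 1
bls_lag = bls_lag.rename(
    columns={c: f"{c}_lag1" for c in ["unemp", "wage_service", "wage_leisure", "wage_retail"]}
)

panel = ipeds.merge(bls_lag, on=["state", "year"], how="left")
```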
Table 1: Summary Statistics for Four-Year, Primarily Undergraduate Degree-Granting Institutions: 2008-2020
obs. mean std. dev. min. max.
Total applications 14,277 6,943 10,261 2 113,754
Male applications 14,261 3,047 4,779 0 52,198
Female applications 14,261 3,902 5,592 0 61,556
Net tuition price 14,276 21,606 12,817 876 61,788
75th %tile
SAT Math 13,281 598 79 338 800
SAT Verbal 13,190 594 69 260 800
ACT Composite 14,277 26 4 9 36
ACT English 12,682 26 4 7 36
ACT Math 12,684 25 4 8 36
State unemployment rate (%) 14,211 6 2 2 14
Weekly Sectoral Wage ($)
Service job 13,170 347 69 228 690
Leisure job 13,170 387 86 235 689
Retail job 13,170 546 75 426 1,197
4 Econometric Approach
Figure 2 informs our empirical strategy. For instance, the response in applications seems to follow RMJ availability rather than legality. Hence, in all our models we consider a college "treated" if students would observe legal RMJ in a school's state as they apply for the fall semester start of that year.²
² For example, Colorado is "treated" in 2014 as the Fall 2014 cohort observed legal sales starting January 1, 2014, ahead of the end-of-January application deadline. Washington is "treated" in 2015 as the first sale did not occur until the summer of 2014, well after the deadline. Notably, both states passed legalization measures in November of 2012.
Because we see jumps in applications to early adopting states after RMJ sales begin in
Figure 2a, we utilize fuzzy regression discontinuity design (RDD) models. These models–
which incorporate controls such as tuition rates, unemployment rates, and wages–allow us
to assess the discontinuity in application trends post-RMJ availability in order to measure
short term policy impacts.
On the other hand, Figure 2a also suggests differential trends in applications after marijuana is available. Since RDD models obfuscate longer-term trends and group effects, we utilize fixed effects models that allow for different linear trends in outcomes for schools in treated versus untreated states (similar to what is found in Pope and Pope, 2009). Such fixed effects models have traditionally been used as a panel data version of a Difference-in-Difference (DID) estimator; we therefore refer to them as panel DID models.
More recently, studies have highlighted the bias inherent in using these fixed effects
specifications, particularly in cases with several treatment time periods that effectively create
unique cohorts of treated individuals. To account for this, we present more robust panel DID
models and refer to them as panel DID models with cohort effects. The gains in applications
may also differ based on school size (Figure 2b), so these models account for a school’s
flagship status.
We offer additional details for our three approaches below, but each model revolves
around commonalities among the variables outlined here. The units of observation are year t and school i. Our general functional form can be expressed as:

$$A_{i,t} = f\left(REC_{i,t},\, FLAG_{i,t},\, \tau,\, \delta_{i,t-1},\, \gamma_{i,t-1},\, \epsilon\right) \qquad (1)$$
A_{i,t} is either the log of applications (quantity) or the SAT/ACT scores of the 75th percentile admitted student (quality). REC_{i,t} is a dummy variable taking on the value of one if RMJ was available prior to the application deadline and zero otherwise. As indicated by Figure 2 and discussed in Section 2, the bump in applications potentially attributable to RMJ appears to have gone to the largest schools in Colorado and Washington. So, we introduce a dummy variable, FLAG_{i,t}, that is one if the school is one of the state's largest by application quantity and zero otherwise.³ We use a variety of specifications for FLAG_{i,t}, ranging from just the largest school in a state to the largest ten. The definition of the variable τ varies slightly in each of our three econometric approaches, which we highlight in the following subsections, but it is always an indicator of time.
The remaining variables are controls and ε is the error term. δ_{i,t-1} captures other characteristics of the school that may affect the number of applicants, including tuition price, notoriety, and admitted student quality. γ_{i,t-1} represents local economic conditions that would affect the decision to attend college on the extensive margin, including unemployment rates and entry-level wages. School characteristics and economic conditions are lagged one year on the assumption that most prospective students are high-schoolers who evaluate their collegiate options in their junior years.
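To make these definitions concrete, the sketch below shows one way the REC and FLAG dummies could be coded; it is a sketch under stated assumptions, not the paper's code. The first-availability years listed are an illustrative subset, and the top-N cutoff and column names are hypothetical.

```python
import pandas as pd

panel = pd.read_csv("panel_with_controls.csv")  # unitid, state, year, applications, ...

# REC: one if legal RMJ sales had begun in the school's state before that year's
# fall application deadline (illustrative subset of first "treated" years).
first_treated = {"CO": 2014, "WA": 2015, "OR": 2016, "NV": 2018}
panel["rec"] = [
    int(year >= first_treated.get(state, 10_000))
    for state, year in zip(panel["state"], panel["year"])
]

# FLAG: one if the school is among its state's top-N schools by 2014 applications.
N = 2
top_n = (
    panel.loc[panel["year"] == 2014]
    .sort_values("applications", ascending=False)
    .groupby("state")
    .head(N)["unitid"]
)
panel["flag"] = panel["unitid"].isin(set(top_n)).astype(int)
```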
4.1 RDD Models
As previously noted, our specifications differ in their interpretations of τ from Equation (1). Regression discontinuity design (RDD) models interpret τ as a time trend of application quantity and quality from year to year. τ is therefore the year itself (e.g. 2008, 2009, etc.) and one year is selected as the treatment year (e.g. 2014 for Colorado). We estimate fuzzy RDD models, with the aforementioned controls, and (as is standard for RDD models) interpret results to be the discontinuity in trend that occurs in the year of treatment, i.e. first RMJ availability.
³ We originally thought of this as being the "flagship" school but were unable to find a consensus on what that meant.
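For intuition, the following is a simplified, parametric sketch of the discontinuity-in-trend idea for a single early-adopting state; it is not the authors' exact fuzzy RDD estimator. It regresses log applications on a post-cutoff indicator, a recentered year trend, their interaction, and lagged controls, so the coefficient on the indicator is the estimated jump at first RMJ availability. The data set and column names continue the hypothetical sketches above.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("panel_with_controls.csv")  # hypothetical columns as above
cutoff = 2014                                   # e.g., first RMJ availability in Colorado

co = panel[(panel["state"] == "CO") & (panel["applications"] > 0)].copy()
co["lnapp"] = np.log(co["applications"])
co["run"] = co["year"] - cutoff                 # recentered running variable
co["post"] = (co["run"] >= 0).astype(int)
co = co.dropna(subset=["lnapp", "unemp_lag1", "wage_service_lag1"])

rdd = smf.ols(
    "lnapp ~ post + run + post:run + unemp_lag1 + wage_service_lag1",
    data=co,
).fit(cov_type="cluster", cov_kwds={"groups": co["unitid"]})
print(rdd.params["post"])  # estimated discontinuity in log applications at the cutoff
```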
4.2 Panel DID Models
RDD models are best used to understand short-term effects rather than longer-run trends. Until very recently, that is to say traditionally, fixed effects models with linear trends have been used as a form of difference-in-difference estimation applied to panel data (see Nichols, 2007; Singer and Willett, 2003; Skrondal and Rabe-Hesketh, 2004, for discussions of this approach). The variable definitions and econometric approach outlined here are aligned with this tradition.
For the sake of interpretation, we define τin two ways for these models:
1. YRSPOST–The number of years after RMJ legalization.
(a) This variable would be negative in years leading up to legalization, zero the year prior to implementation, and positive integer values for years following (e.g. YRSPOST = 1 for Colorado in 2014, Washington in 2015).
(b) Such a specification removes cohort effects and realigns each treated school as if on a common timeline.
* Results are displayed in Table 3 and Table 4.
2. YEAR#–The current year, expressed as an integer.
(a) This variable collapses into the individual-specific effect when it is not interacted with another variable. Interacting it with REC_{i,t} yields information about differential time trends among schools in legalized and non-legalized states.
(b) Such a specification does not divide schools into cohorts, but does elicit how trends might differ between treated and non-treated schools through time.
* Results are displayed in Table 5.
In all three specifications, our main focus is the coefficient estimates on the availability of RMJ, which we have named REC in the tables, as well as any variable interacted with it (denoted with the · operator).
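A pared-down sketch of the flagship specification, in the spirit of Column (2) of Table 3 (year effects, the FLAG dummy, REC, and their interaction, with standard errors clustered by school), appears below. The YRSPOST trend terms and several controls are omitted for brevity, and all names remain the hypothetical ones from the earlier sketches.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("panel_with_treatment.csv")  # unitid, year, applications, rec, flag, controls
panel = panel[panel["applications"] > 0].copy()
panel["lnapp"] = np.log(panel["applications"])
panel = panel.dropna(subset=["lnapp", "rec", "flag", "unemp_lag1", "wage_service_lag1"])

did = smf.ols(
    "lnapp ~ rec * flag + unemp_lag1 + wage_service_lag1 + C(year)",
    data=panel,
).fit(cov_type="cluster", cov_kwds={"groups": panel["unitid"]})

print(did.params[["rec", "flag", "rec:flag"]])  # immediate RMJ effect and flagship terms
```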
4.3 Panel DID Models with Cohort Effects
Rothstein (2009, 2010) shows that standard difference-in-difference models may generate a bias toward erroneous precision levels when multiple treatments are present, particularly when the models avoid assumptions about impending policy treatment. As our data set includes several cohorts of legalizing states and many states without treatment, our specifications may suffer from this bias. What follows is an important robustness check; however, our intuition is that policy anticipation is unimportant because (in our experience) college students' forward-thinking does not typically extend very far. Consequently, our preferred specifications are the cohort results presented with an assumption of zero-period anticipation.
To tease out differences in legalization cohorts, we follow Callaway (2022) by including models that account for multiple staggered treatments. These "doubly robust" estimators first attempt to pair treated schools from each year of legalization (cohort) with not-yet-treated schools using propensity score matching. Schools are matched based on the regressors identified in the model (e.g. incoming student test scores, local wages, and unemployment) for similarity within a specified degree of confidence.⁴ Assuming matches can be found, the model then evaluates the average treatment effect on the treated (ATT) stemming from the policy change for each legalizing cohort. Put otherwise, treated schools in Colorado are matched with non-treated schools in states that never legalize and are treated as a distinct cohort from Washington schools, which are similarly matched to non-treated schools. In this way, the method Callaway (2022) suggests is robust to differences that might exist between treated schools in different treatment periods and between the entire pool of treated schools and non-treated schools.
Adapting the method of Callaway (2022) requires that we make two assumptions. The first concerns what constitutes a treated school. As suggested by Figure 2, larger schools (which we call flagship schools) may have received the brunt of the effects from RMJ legalization. Consequently, our preferred specification uses flagship schools in states with RMJ availability as the treatment group; that is, we subset our definition of schools that are "truly" treated to those that are largest.⁵ This brings the added benefit that a smaller pool of treated schools increases the likelihood that propensity score matching can find a non-treated partner school for comparison.
⁴ We use a confidence of 95%.
The second assumption regards the degree to which prospective college students anticipate (and respond to) impending marijuana legalization. This is important to consider because Callaway (2022) notes that sometimes treatment is not contemporaneous with policy implementation. Prospective students may have anticipated the policy change and behaved accordingly. While, as we note above, experience suggests this particular group has limited anticipation, we display results that assume anywhere between zero- and two-period anticipation.⁶
Combined, these various specifications provide a set of very robust results with minimal concrete assumptions. One of the added benefits of this method is that we can also summarize treatment effects at the cohort level, providing insight into the average impact of RMJ policy on schools affected by each year of legalization.
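To convey the group-time comparison structure, the sketch below computes an unconditional ATT(g, t) for a single legalization cohort against never-treated schools. It is only an illustration of which comparisons each cohort effect draws on; the paper's estimates rely on the doubly robust group-time estimator of Callaway and Sant'Anna (2021), which adds propensity-score and outcome-regression adjustments. Column names are hypothetical.

```python
import pandas as pd

def att_gt_simple(panel: pd.DataFrame, cohort: int) -> pd.Series:
    """Unconditional group-time ATT for one legalization cohort: the change in
    mean log applications among the cohort's flagship schools, relative to the
    change among never-treated schools, measured from the last pre-treatment year.
    Assumes columns: year, lnapp, flag, first_treated (NaN if never treated)."""
    treated = panel[(panel["first_treated"] == cohort) & (panel["flag"] == 1)]
    never = panel[panel["first_treated"].isna()]
    base = cohort - 1  # last pre-treatment year

    def ybar(df, yr):
        return df.loc[df["year"] == yr, "lnapp"].mean()

    post_years = sorted(y for y in treated["year"].unique() if y >= cohort)
    att = {
        t: (ybar(treated, t) - ybar(treated, base)) - (ybar(never, t) - ybar(never, base))
        for t in post_years
    }
    return pd.Series(att, name=f"ATT(g={cohort}, t)")

# Example usage with the hypothetical panel built in the earlier sketches:
# print(att_gt_simple(panel, 2014))  # the Colorado cohort
```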
5 Results
5.1 RDD Results
We begin with the results from the regression discontinuity design (RDD) models, which are displayed in Table 2. We report only the estimated coefficient on REC, the dummy variable indicating whether or not RMJ was available in the state prior to the fall
⁵ Various specifications of flagship are used, ranging from just the largest school in a state to the largest ten.
⁶ We use two as the "anticipation cap" to match the anticipation that might have occurred for Colorado and Washington prospective students that observed legalization two years prior to actual first sale.
application deadline. The first and second lines of Table 2 present the outcomes of RDD models for Colorado and Washington, respectively. The third line pools schools from all states with RMJ laws.
Table 2: REC Coefficient Estimates from RDD Models
lnAPP ACT SAT
Math/English Math/Verbal
Cohort 1 0.010 0.404 -3.162
(0.123) (0.377) (9.866)
Cohort 2 0.114∗∗ 1.056 -5.097
(0.050) (0.707) (6.562)
All REC Schools -0.238∗ -0.101 -16.482∗∗∗
(0.124) (0.164) (3.703)
Notes: Significance levels denoted as: ∗ p<0.05; ∗∗ p<0.01; ∗∗∗ p<0.001. Cohort 1 consists of all Colorado schools, Cohort 2 consists of all Washington schools, and "All REC Schools" are all schools that eventually receive RMJ treatment.
While we measure positive coefficients for both application quantity and ACT results for Colorado (Cohort 1), the results are imprecise. However, some positive, statistically significant effects on application quantity exist for Washington (Cohort 2). We measure potentially negative effects on application quantity and quality for the remaining schools pooled together. As a whole, the results are mixed, hinting at the need for further investigation using our fixed effects models.
While we combined application quantity and quality coefficients for our RDD results, we
divide the results for our remaining empirical models into two sections: (1) those pertaining
to the quantity of applications and (2) those regarding the quality of applications.
Panel DID Results for the Quantity of Applications
Given how closely related the results are, we present the results from both our 1) panel DID model and 2) panel DID model with cohort effects here. The estimated coefficients from our panel DID models are presented in Table 3. As before, we do not report coefficients of control
regressors (i.e. tuition, accommodation wages, unemployment rate, ACT scores) for brevity, focusing instead only on the variables of interest.
Table 3: Results from Log of Applications Regression
(1) (2)
REC 0.065∗∗ 0.148∗∗
(0.024) (0.052)
YRSPOST -0.004
(0.035)
REC · YRSPOST 0.013 -0.021
(0.017) (0.029)
FLAG 1.255∗∗∗
(0.209)
REC · FLAG -0.069
(0.101)
YRSPOST · FLAG -0.005
(0.008)
REC · YRSPOST · FLAG 0.031
(0.047)
Fixed Effects
School Yes
Year Yes Yes
Flagship Yes
Observations 6906 6906
R² 0.024 0.277
Adjusted R² -0.063 0.274
Notes: Significance levels denoted as ∗ p<0.05; ∗∗ p<0.01; ∗∗∗ p<0.001. All standard errors are clustered by school.
Column (1) shows the results of the regression on the log of applications with both year
and school fixed effects. The well-estimated coefficient for REC indicates a 6.5% increase
in applications to schools immediately following availability of RMJ. However, as seen in
Figure 2b, large schools like CU Boulder, which has twice as many applications as a similarly
sized Colorado State, could be driving these results.
Therefore, results for our model that controls for flagship school status are reported in
Column (2). The dummy variable FLAG indicates whether a school is one of the top two most-applied-to in the state based on 2014 applications. This is our preferred specification as it accounts for much more variation, as indicated by the much higher R² and Adjusted R² than those of Column (1). The coefficient on FLAG is precise and indicates that, on average, the largest schools in a state receive 125% more applications than the smaller schools in a state.
For non-flagship schools, we measure a 14.8% increase in applications after RMJ becomes available. Flagship schools, on the other hand, experience an estimated immediate increase in applications of 14.8% - 6.9% = 7.9%, though the difference between non-flagship and flagship schools is not statistically significant. This large increase relative to non-RMJ schools does not necessarily endure: the coefficient on the interaction term REC · YRSPOST is -0.021, and together these two coefficients imply that the year after RMJ becomes available, applications increase by a diminished 14.8% - 2.1% = 12.7%. (We note, however, that the coefficient on REC · YRSPOST lacks statistical significance.)
Additionally, we leverage the IPEDS data on the reported sex of applicants to assess
any difference between men’s and women’s responses to RMJ availability. We report the
estimated coefficients from our preferred specification in Table 4.
Overall, the results are largely consistent with Table 3. The coefficient on REC is statistically significant for men at the 0.1% level and significant for women at the 5% level. Women's applications increased by 12.8% immediately after RMJ became available relative to control schools, and the estimated coefficient on REC · YRSPOST for women is negative and significant. We interpret these estimates to imply that the first year of RMJ availability saw a 12.8% increase in women's applications relative to similar schools in non-RMJ states, while the next year saw a more muted 12.8% - 6.9% = 5.9% increase. On the other hand, men's applications increased by 20.2% - 4.9% = 15.3% in the first year of RMJ availability, albeit less precisely estimated. With the caveat that some of these coefficients are not statistically significant, the estimates do suggest an overall picture that schools experience
Table 4: Results from Log of Applications Regression Binned by Sex
WOMEN MEN
REC 0.128∗ 0.202∗∗∗
(0.076) (0.070)
YRSPOST 0.021∗ 0.017
(0.012) (0.010)
REC · YRSPOST -0.069∗∗ -0.049
(0.033) (0.033)
FLAG 1.171∗∗∗ 1.215∗∗∗
(0.228) (0.202)
REC · FLAG -0.025 -0.075
(0.116) (0.101)
YRSPOST · FLAG -0.012 -0.004
(0.014) (0.014)
REC · YRSPOST · FLAG 0.036 0.039
(0.060) (0.052)
Observations 6382 6472
R² 0.294 0.284
Adjusted R² 0.290 0.280
Notes: Significance levels denoted as ∗ p<0.05; ∗∗ p<0.01; ∗∗∗ p<0.001. All standard errors are clustered by school.
positive effects of RMJ on applications but that these effects abate over time.
Table 5 reports the estimates of Equation (1) when considering year-by-year effects rather than the effects of years post availability. This effectively allows for trend differences in applications for treated schools relative to untreated schools. The first two columns of Table 5 are the estimated coefficients when the dependent variable is the natural log of all applications. The remaining columns show the estimated coefficients for the natural log of applications for women and for men.
As described above, Table 5 effectively represents a panel DID model where our time trend component from Equation (1) is YEAR#. The marginal effect of RMJ availability in a given year is the sum of the estimated coefficient on REC and the estimated coefficient on REC interacted with that year. As an example, we can interpret these results to imply that the Fall 2016 cohort saw a -0.039 + 0.115 = 0.076 log-point (roughly 7.6%) increase (the clustered standard error on the sum is 0.034, so it is statistically significant at the 5% level) in applications to schools where RMJ was available prior to the 2016 application deadline relative to those that had not legalized in that year. Taken altogether, the first column of Table 5 shows diminishing returns to school applications from RMJ availability from 2015 to 2018, and the 2019 effects are statistically indistinguishable from zero. Interestingly, the precision of the estimates of the coefficients on REC, and all its interactions, dissipates almost completely when FLAG is included in the second column. This is in line with our previous results that most of the gains from RMJ availability appear to go to the largest schools in the state.
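For concreteness, the year-specific marginal effect described above is just the sum of two coefficients from the first column of Table 5; the 2016 case works out as:

$$\widehat{ME}_{2016} = \hat{\beta}_{REC} + \hat{\beta}_{REC \times 2016} = -0.039 + 0.115 = 0.076 \approx 7.6\%.$$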
As suggested by Rothstein (2009) and Rothstein (2010), the coefficient estimates in Table 5 may contain some bias. For robustness, and following Callaway (2022), we display cohort-specific, robust results in Figure 3 and Figure 4. Figure 3, our baseline, displays our findings under the assumption that agents do not anticipate a policy change. Observations are grouped by year of legalization. In keeping with Callaway (2022), the results are doubly robust, as we pair treated schools in each time period with similar but untreated schools, thereby facilitating a direct comparison. This method relies on a narrow specification
Table 5: Results from Log of Applications Regression Allowing for Differential Time Trends in Treated Schools
ALL WOMEN MEN
REC -0.039∗ -0.058 -0.134∗∗∗ -0.059 -0.068 0.008
(0.020) (0.078) (0.050) (0.085) (0.046) (0.079)
REC · 2014 0.066 0.549 0.057 0.642∗ 0.022 0.417
(0.085) (0.341) (0.208) (0.369) (0.196) (0.348)
REC · 2015 0.148∗∗ 0.357 0.253∗ 0.392 0.197 0.279
(0.062) (0.258) (0.153) (0.279) (0.143) (0.263)
REC · 2016 0.115∗∗∗ 0.047 0.124 -0.053 0.133 0.028
(0.041) (0.170) (0.102) (0.186) (0.096) (0.175)
REC · 2017 0.086∗∗ 0.006 0.094 -0.070 0.096 -0.024
(0.041) (0.168) (0.101) (0.183) (0.095) (0.173)
REC · 2018 0.063∗∗ 0.066 0.024 0.067 0.039 0.065
(0.028) (0.112) (0.068) (0.123) (0.063) (0.115)
REC · 2019 0.006 0.030 0.024 0.008 0.014 0.027
(0.027) (0.109) (0.065) (0.119) (0.061) (0.111)
FLAG 1.431∗∗∗ 1.452∗∗∗ 1.411∗∗∗
(0.173) (0.188) (0.177)
REC · FLAG 0.064 0.059 0.024
(0.279) (0.302) (0.285)
Fixed Effects
School Yes Yes Yes
Time Yes Yes Yes Yes Yes Yes
Flagship Yes Yes Yes
Observations 17,641 17,641 17,288 17,288 17,502 17,502
R² 0.011 0.263 0.176 0.275 0.183 0.268
Adjusted R² -0.075 0.260 0.094 0.272 0.102 0.265
Notes: Significance levels denoted as ∗ p<0.05; ∗∗ p<0.01; ∗∗∗ p<0.001. All standard errors are clustered by school.
for treated schools. Given our results above, it appears that flagship schools observed the greatest effects of RMJ legalization. As such, Figure 3 assumes that only the top-2 schools by application are treated in RMJ-legal states.
Figure 3: ATT Effect of RMJ Legalization by Cohort of Legalization
Like Table 5, Figure 3 also demonstrates the diminishing effect of RMJ on application quantity. In each case, the coefficient on the x-axis can be interpreted as the controlled percentage effect on application quantity for treated schools relative to non-treated schools. (95% confidence intervals of the coefficient estimates are displayed as the lines surrounding each cohort's dot.) From this, we see that the 2014 cohort of legalization observed an increase in applications, given quality of school, tuition price, and otherwise available wages. For schools with first legal sale in 2016, 2018, or 2019, there does not appear to be a statistical effect of legalization on the quantity of applications. This suggests the effect of RMJ policy on the quantity of applications diminishes rapidly.
As a robustness check, in Figure 4 we expand the number of "flagship" schools to the top-4 by applications in 2014. With such a specification, 95% confidence intervals no longer suggest that the 2014 cohort of RMJ-legalizing states observed an increase in applications.
However, the 2015 cohort does have significant results, and the average treatment effect of
the treated group also diminishes with each subsequent cohort, reinforcing at least one result
from Figure 3.
Figure 4: ATT Effect of RMJ Legalization by Cohort of Legalization
Though directionally consistent, Figure 4 does suggest sensitivity with respect to the treatment definition. That is, how large does a school have to be in order for it to observe the positive effects exhibited in Figures 3 and 4? To alleviate concerns regarding this choice, Table 6 displays the RMJ coefficients in a variety of difference-in-difference specifications. Each column assumes a different anticipation level of the policy, while the rows are grouped by legalization cohort and by how flagship schools are defined.
The reduced pool of treated schools makes it less likely that the model can find a matching, non-treated school through propensity score matching, particularly when only two schools per state are defined as flagship schools. However, the overall results are largely consistent with Table 5 and Figure 3. For example, Table 6 demonstrates a nearly 30% controlled increase in applications for Colorado schools (Cohort 2014) regardless of anticipation level. All coefficients are positive outside of the 2018 and 2019 cohorts, and several are statistically
Table 6: Results from Log of Applications Regressions with Anticipation
Anticipation: Zero-Period One-Period Two-Period
Flagship Defined as Top:
2 Schools
2014 NA NA NA
2015 NA 0.173∗ 0.169
(0.072) (0.007)
2016 NA NA NA
2018 -0.020 -0.007 -0.015
(0.038) (0.016) (0.016)
2019 -0.003 0.027 0.015
(0.143) (0.031) (0.028)
3 Schools
2014 0.311∗ 0.330∗ 0.288
(0.069) (0.088) (0.122)
2015 NA 0.145∗ 0.104
(0.064) (0.096)
2016 0.056 0.073 0.101
(0.069) (0.111) (0.151)
2018 0.005 0.032 -0.013
(0.040) (0.058) (0.058)
2019 -0.071 -0.108 -0.092
(0.103) (0.111) (0.097)
4 Schools
2014 0.100 0.156 0.155
(0.168) (0.146) (0.149)
2015 0.225∗ 0.140 0.124
(0.083) (0.091) (0.080)
2016 0.090 0.083 0.127
(0.091) (0.117) (0.160)
2018 -0.014 0.010 -0.017
(0.035) (0.044) (0.040)
2019 -0.069 -0.102 -0.101
(0.080) (0.092) (0.081)
Notes: Significance level denoted as ∗ p<0.05. All results doubly robust. NA results exist when too few comparison schools can be found between the treated group of X schools and never-treated schools. Results then interpreted as ceteris paribus application increases relative to otherwise comparative schools for those top schools with legal RMJ sales.
significant at the 10% level.
Altogether these results reinforce earlier estimates showing positive effects of RMJ avail-
ability for flagship schools in early cohorts and diminishing gains with each passing cohort.
Panel DID Results for the Quality of Applications
Up to now, our analysis has focused on the effects of RMJ policy on the size of the application pool. We now assess whether or not RMJ policy affected applicant pool quality. To this end, we estimate Equation (1) using the third quartile of applicant SAT and ACT scores as the dependent variable. Here again, we define our time trend as the number of years post-legalization to start. The results, which are reported in Table 7, suggest that the standardized test scores for admitted students at schools where RMJ had recently become available are higher than those at their non-RMJ counterparts. As we assume that colleges admit the best applicants from the pool, and that there are no heterogeneous effects in who chooses to attend once admitted, we interpret our positive estimates to mean that the initially admitted class is better than previous classes due to the improved quality of the enlarged application pool.
Further, the coefficients on REC · YRS are also significant, and the estimates in Table 7 can then be interpreted to mean that a non-flagship school's second treated cohort would have higher 75th percentile SAT scores by 30.7 points for women and 30.3 points for men relative to non-treated schools. As a robustness check, we also focus on cohort effects using empirical techniques in the style of Callaway and Sant'Anna (2021). The results are reported in Table 8, found in Appendix A. The coefficients remain positive. However, we caution that there is a general lack of precision due to the small number of matched schools generated by the propensity score matching algorithm. Taken with the estimates in Table 7, these results seem to suggest that RMJ availability may have improved cohort quality or, at worst, had no effect.
Table 7: Results from SAT Regression
WOMEN MEN
REC 27.221∗∗∗ 27.141∗∗∗
(3.707) (3.601)
YRS -2.508∗∗∗ -2.486∗∗∗
(0.344) (0.345)
REC · YRS 3.450∗∗∗ 3.149∗∗
(1.332) (1.322)
FLAG 26.059∗∗∗ 25.322∗∗∗
(6.975) (7.073)
REC · FLAG -21.784∗∗∗ -21.271∗∗∗
(8.128) (7.813)
YRS · FLAG 1.379∗ 1.271
(0.733) (0.714)
REC · YRS · FLAG -1.894 -1.553
(3.170) (2.710)
Observations 5,709 5,795
R² 0.836 0.838
Adjusted R² 0.835 0.837
Notes: Significance levels denoted as ∗ p<0.05; ∗∗ p<0.01; ∗∗∗ p<0.001. All standard errors are clustered by school. Each model specification is a two-way fixed effects (within) model.
A caveat to this section's analysis is that using the SAT scores of the admitted class does not account for the deleterious effects of marijuana consumption on academic performance and cognitive functioning while in college. Graduation rates or GPAs would perhaps be a better indicator of how RMJ legalization impacted a state's college-educated population. While not presented within the context of this paper, we did analyze two years of data on retention and graduation rates. Preliminary evidence suggests that retention and graduation rates were unaffected by RMJ legalization. However, we cannot speak much (yet) to the effect on student outcomes because using six-year graduation rates leaves only Colorado schools in the data set at this time. (The COVID-19 pandemic also creates disruptions in the data.) Therefore, we leave this to future research.
6 Conclusion
We provide evidence that state and local policies influence an individual's application consideration set by showing that the availability of recreational marijuana leads to an increase in the size of the application pool for colleges. Colleges and universities experience an almost 15% increase in applications after recreational marijuana becomes available, with a 12.8% and 20.2% increase in female and male applications, respectively. When cohorts are more robustly analyzed, the quantity of applications for flagship schools rises in excess of 25% for Washington and Colorado schools. This is despite RMJ policies not having a clear direct impact on the costs and benefits of a college education.
These large effects are of importance to both college administrators and policy makers
in understanding how permissive public policy impacts the college location choices of young
college-bound individuals–decisions that could potentially have longer run local economic
consequences. Our findings suggest that many of them, irrespective of academic prowess,
find RMJ availability to be desirable. Nonetheless, these particular policy impacts accrued primarily to the first few cohorts of legalizing states, and as more states allowed RMJ to be legally available, this effect diminished.
Administrators may also worry that RMJ legalization would have negative effects on the
quality of the admitted cohort. However, we find no evidence that the increase in applications
came largely from low-quality students. In fact, there is weak evidence that 75th percentile
SAT scores improved for schools in legalizing states.
Both results contribute to the important ongoing research regarding how more permissive
RMJ policy might affect certain groups. As more states legalize RMJ and federal policymak-
ers consider the same for the whole nation, our work stresses the potential for positive gains
for colleges looking to improve their applicant pools, with no evidence of negative effects.
Moreover, our use of relatively novel econometric approaches for panel DID models reinforces
our results. With evident significance on our coefficients, even with so many specifications
and assumptions for robustness, we are confident in our results for application quantity and
quality. Nonetheless, more work is needed on understanding how RMJ laws–as well as other
permissive public policies–impact other academic outcomes of interest. These are left for
future studies, with ours being a good next step on the road to understanding RMJ policy
effects.
References
Alter, Molly and Randal Reback (2014) “True to Your School? How Changing Reputations
Alter Demand for Selective Colleges,” Educational Evaluation and Policy Analysis, 36 (3),
346–370.
Anderson, Michael L. (2017) “The Benefits of College Athletic Success: An Application of
the Propensity Score Design,” The Review of Economics and Statistics, 99 (1), 119–134.
Bae, Harold and David C. R. Kerr (2020) “Marijuana Use Trends among College Students
in States with and without Legalization of Recreational Use: Initial and Longer-Term
Changes from 2008 to 2018,” Addiction, 115 (6), 1115–1124.
Bastedo, Michael N., Elias Samuels, and Molly Kleinman (2014) “Do Charismatic Presi-
dents Influence College Applications and Alumni Donations? Organizational Identity and
Performance in US Higher Education,” Higher Education, 68, 397–415.
Bond, Timothy N., George Bulman, Xiaoxiao Li, and Jonathan Smith (2018) "Updating Human Capital Decisions: Evidence from SAT Score Shocks and College Applications," J. Labor Econ., 36 (3), 807–839.
Callaway, Brantly (2022) “Difference-in-Differences for Policy Evaluation,” arXiv preprint
arXiv:2203.15646.
Callaway, Brantly and Pedro HC Sant’Anna (2021) “Difference-in-differences with multiple
time periods,” J. Econometrics, 225 (2), 200–230.
Eggers, Austin F., Peter A. Groothuis, Parker Redding, Kurt W. Rotthoff, and Michael
Solimini (2020) “Universities Behaving Badly: The Impact of Athletic Malfeasance on
Student Quality and Enrollment,” Journal of Sports Economics, 21 (1), 87–100.
Fuller, Winship C., Charles F. Manski, and David A. Wise (1982) "New Evidence on the Economic Determinants of Post-secondary Schooling Choices," J. Hum. Resour., 17 (4), 477–498.
Griffith, Amanda and Kevin Rask (2007) “The Influence of the US News and World Report
Collegiate Rankings on the Matriculation Decision of High-Ability Students: 1995-2004,”
Econ. Educ. Rev., 26, 244–255.
Hansen, Benjamin, Keaton Miller, and Caroline Weber (2021) “Up in Smoke? The Market
for Cannabis.”
Jacob, Brian, Brian McCall, and Kevin Strange (2018) “College as a Country Club: Do
Colleges Cater to Students’ Preferences for Consumption,” J. Labor Econ., 36 (2), 306–
348.
Kerr, David C. R., Harold Bae, and Andrew L. Koval (2018) “Oregon Recreational Marijuana
Legalization: Changes in Undergraduates’ Marijuana Use Rates From 2008 to 2016,”
Psychology of Addictive Behaviors, 32 (6), 670–678.
Kerr, David C. R., Harold Bae, Sandi Phibbs, and Adam C. Kern (2017) “Changes in
Undergraduates’ Marijuana, Heavy Alcohol and Cigarette Use Following Legalization of
Recreational Marijuana Use in Oregon,” Addiction, 112 (11), 1992–2001.
Long, Bridget Terry (2004) “How Have College Decisions Changed Over Time? An Appli-
cation of the Conditional Logistic Choice Model,” J. Econometrics, 121, 271–296.
Manski, Charles F. and David A. Wise (1983) College Choice in America: Harvard University
Press.
Marie, Olivier and Ulf Zölitz (2017) ""High" Achievers? Cannabis Access and Academic Performance," Rev. Econ. Stud., 84 (3), 1210–1237.
McDuff, DeForest (2007) “Quality, Tuition, and Applications to In-State Public Colleges,”
Econ. Educ. Rev., 26, 433–449.
Miller, Austin M., Robert Rosenman, and Benjamin W. Cowan (2017) “Recreational Mari-
juana Legalization and College Student Use: Early Evidence,” SSM - Population Health,
3, 649–657.
Nichols, Austin (2007) “Causal inference with observational data,” The Stata Journal, 7 (4),
507–541.
Pearson, Matthew R., Bruce S. Liese, Robert D. Dvorak, and the Marijuana Outcomes Study Team (2017) "College Student Marijuana Involvement: Perceptions, Use, and Consequences across 11 College Campuses," Addictive Behaviors, 66, 83–89.
Perez, Stephen J. (2012) “Does Intercollegiate Athletics Draw Local Students to a Univer-
sity?,” Journal of Sports Economics, 13 (2), 198–206.
Phillips, Kristina T., Trent L. Lalonde, Michael M. Phillips, and Maryia M. Schneider (2017)
“Marijuana Use and Associated Motives in Colorado University Students,” The American
Journal on Addictions, 26 (8), 830–837.
Pinchevsky, Gillian M., Amelia M. Arria, Kimberly M. Caldeira, Laura M. Garnier-Dykstra,
Kathryn B. Vincent, and Kevin E. O’Grady (2012) “Marijuana Exposure Opportunity
and Initiation during College: Parent and Peer Influences,” Prevention Science, 13 (1),
43–54.
Pope, Devin G and Jaren C. Pope (2009) “The Impact of College Sports Success on the
Quantity and Quality of Student Applications,” South. Econ. J., 75 (3), 750–780.
(2014) “Understanding College Application Decisions: Why College Sports Success
Matters,” Journal of Sports Economics, 15 (2), 107–131.
Rothstein, Jesse (2009) “Student sorting and bias in value-added estimation: Selection on
observables and unobservables,” Education finance and policy, 4 (4), 537–571.
(2010) “Teacher quality in educational production: Tracking, decay, and student
achievement,” Q. J. Econ., 125 (1), 175–214.
Singer, Judith D. and John B. Willett (2003) Applied Longitudinal Data Analysis: Modeling Change and Event Occurrence: Oxford University Press.
Skrondal, Anders and Sophia Rabe-Hesketh (2004) Generalized latent variable modeling:
Multilevel, longitudinal, and structural equation models: Chapman and Hall/CRC.
Wen, Hefei, Jason M. Hockenberry, and Janet R. Cummings (2015) “The Effect of Medical
Marijuana Laws on Adolescent and Adult Use of Marijuana, Alcohol, and Other Sub-
stances,” J. Health Econ., 42, 64–80.
Willis, Robert J. and Sherwin Rosen (1979) “Education and Self-Selection,” J. Polit. Econ.,
87 (5), S7–S36.
Wright, Adam C. and John M. Krieg (2020) “Getting into the Weeds: Does Legal Marijuana
Access Blunt Academic Performance in College?,” Econ. Inq., 58 (2), 607–623.
A SAT Robust Results
Table 8 notably displays very few results. This highlights the stringency of our econometric approach in finding results that are "doubly robust." Because each treated school in a cohort is matched with a non-treated school with similar characteristics through propensity score matching, changing the definition of flagship schools (on our argument that only these are truly treated) affects the likelihood the model can discern results. This is less of an issue for our regressions on applications, in part because larger schools have a lot of commonality in application quantity as a baseline for finding econometrically valid matches. In the regressions for ACT/SAT scores, large schools may be fundamentally different in terms of admitted student quality. With this, the likelihood of a match declines with one less potential common parameter between the two.
Table 8: Results SAT Score Regressions with Anticipation
Anticipation: Zero-Period One-Period Two-Period
Flagship Defined as Top:
2 Schools
2014 NA NA NA
2015 NA NA NA
2016 NA NA NA
2018 20.630 60.869 20.630
(48.855) (52.809) (44.725)
2019 -3.646 -2.527 -3.646
(3.247) (2.486) (3.355)
3 Schools
2014 NA NA NA
2015 NA NA NA
2016 NA NA NA
2018 NA 26.939 12.299
(24.519) (24.170)
2019 -20.272 -2.171 -3.552
(32.853) (2.567) (3.278)
4 Schools
2014 NA NA NA
2015 NA NA NA
2016 NA NA NA
2018 -11.224 8.922 7.914
(15.451) (17.656) (19.710)
2019 NA -0.187 -11.213
(2.829) (5.979)
Notes: Significance level denoted as ∗ p<0.05. All results doubly robust.
NA results exist when too few comparison schools can be found between
the treated group of X schools and never-treated schools. Results then
interpreted as ceteris paribus application increases relative to otherwise
comparative schools for those top schools with legal RMJ sales.