What Can Be Learned From Growth Mindset Controversies?

David S. Yeager¹ and Carol S. Dweck²

¹Department of Psychology, University of Texas at Austin
²Department of Psychology, Stanford University
The growth mindset is the belief that intellectual ability can be developed. This article seeks to
answer recent questions about growth mindset, such as: Does a growth mindset predict student
outcomes? Do growth mindset interventions work, and work reliably? Are the effect sizes
meaningful enough to merit attention? And can teachers successfully instill a growth mindset in
students? After exploring the important lessons learned from these questions, the article concludes
that large-scale studies, including preregistered replications and studies conducted by third parties
(such as international governmental agencies), justify confidence in growth mindset research.
Mindset effects, however, are meaningfully heterogeneous across individuals and contexts. The
article describes three recent advances that have helped the field to learn from this heterogeneity:
standardized measures and interventions, studies designed specifically to identify where growth
mindset interventions do not work (and why), and a conceptual framework for anticipating and
interpreting moderation effects. The next generation of mindset research can build on these
advances, for example by beginning to understand and perhaps change classroom contexts in
ways that can make interventions more effective. Throughout, the authors reflect on lessons that
can enrich metascientific perspectives on replication and generalization.
Public Significance Statement
Research on growth mindset—the belief that intellectual ability can be developed—has found that
a growth mindset can lead to greater resilience and academic achievement among students facing
difficulties. The present article reviews the evidence and shows that high-quality studies and
independent analyses have supported the conclusion that growth mindset effects are replicable,
meaningful, and theoretically grounded, but interventions targeting teachers (rather than students)
have not yet been effective. The article concludes with a discussion of why it has been difficult to
change teachers or schools and why new research is needed on this topic.
Keywords: implicit theories, growth mindset, adolescence, educational psychology, metascience
Supplemental materials: https://doi.org/10.1037/amp0000794.supp
A growth mindset is the belief that personal characteristics, such as intellectual abilities, can be developed, and a
fixed mindset is the belief that these characteristics are fixed
and unchangeable (Dweck, 1999; Dweck & Leggett, 1988;
Yeager & Dweck, 2012). Research on these mindsets has
found that people who hold more of a growth mindset are
David S. Yeager: https://orcid.org/0000-0002-8522-9503
Carol S. Dweck: https://orcid.org/0000-0001-7741-0073
The writing of this essay was supported by the William T. Grant Foundation Scholars Award, the National Institutes of Health (R01HD084772
and P2CHD042849), and the National Science Foundation (1761179). The
content is solely the responsibility of the authors and does not necessarily
represent the official views of the funding agencies.
David S. Yeager received a 2020 APA Award for Distinguished Scientific Early Career Contributions to Psychology. In association with the
award, David S. Yeager was invited to submit a manuscript to American
Psychologist, which was peer reviewed. The article is published as part of
the journal’s annual Awards Issue.
David S. Yeager conceptualized the article, conducted data analysis,
produced data visualizations, and wrote the first draft of the article. Carol
S. Dweck collaborated on the conceptualization and writing of the article.
David S. Yeager served as lead for funding acquisition, project administration, visualization, and writing – original draft and contributed equally to
investigation. Carol S. Dweck served in a supporting role for writing –
original draft. David S. Yeager and Carol S. Dweck contributed to conceptualization equally. David S. Yeager and Carol S. Dweck contributed to
writing – review and editing equally.
Correspondence concerning this article should be addressed to David S.
Yeager, Department of Psychology, University of Texas at Austin, 305
East 23rd Street, Mail Stop G1800, Austin, TX 78712–1699, United States.
Email: dyeager@utexas.edu
American Psychologist
© 2020 American Psychological Association 2020, Vol. 2, No. 999, 000
ISSN: 0003-066X https://doi.org/10.1037/amp0000794
more likely to thrive in the face of difficulty and continue to
improve, while those who hold more of a fixed mindset may
shy away from challenges or fail to meet their potential (see
Dweck & Yeager, 2019). There has been considerable interest among researchers, policymakers, and educators in
the use of growth mindset research to improve educational
outcomes, in part because research on mindsets has yielded
effective, scalable interventions. For instance, the National
Study of Learning Mindsets (NSLM; Yeager, 2019) evaluated a short (1 hr), online growth mindset intervention in
a nationally representative sample of 9th graders in the
United States (N = 12,490). Compared with the control
condition, the intervention improved grades for lower-achieving students and improved the rate at which students
overall chose and stayed in harder math classes (Yeager et
al., 2019). These or similar effects have appeared in independent evaluations of the NSLM (Zhu et al., 2019) and
international replications (Rege et al., in press).
With increasing emphasis on replication and generalizability has come increased attention to questions of when, why, and under what conditions growth mindset associations and intervention effects can be expected to appear.
Large trans-disciplinary studies have yielded insights into contextual moderators of mindset intervention effects, such as the educational cultures created by peers or schools (Rege et al., in press; Yeager et al., 2019), and have begun to document the behaviors that can mediate intervention effects on achievement (Gopalan & Yeager, 2020; Qin et al.,
2020). At the same time, researchers and practitioners have
begun to test a variety of mindset measures, manipulations,
and interventions. These have sometimes failed to find
growth mindset effects and other times have replicated
mindset effects.
The trend toward testing “heterogeneity of effects” and
the boundary conditions of effects has coincided with the
rise of the metascience movement. This latter movement has
asked foundational questions about the replicability of important findings in the literature (see Nelson et al., 2018).
The emergence of these two trends in parallel—increased
examination of heterogeneity and increased focus on replicability—has made it difficult to distinguish between studies that call into question the foundations of a field and studies that are a part of the normal accumulation of knowledge about heterogeneity (about where effects are likely to
be stronger or weaker; see Kenny & Judd, 2019; Tipton et
al., 2019). Many scientists and educational practitioners
now wish to know: Which claims in the growth mindset
literature still stand? How has our understanding evolved?
And what will the future of mindset research look like?
In this article we examine different controversies about
the mindset literature and discuss the lessons that can be
learned from them.¹ To be clear, we think controversies are
often good. It can be useful to reexamine the foundations of
a field. Furthermore, controversies can lead to theoretical
advances and positive methodological reforms. However,
controversies can only benefit the field in the long run when
we identify useful lessons from them. This is what we seek
to do in the current article. We highlight why people's growth versus fixed mindsets, and why growth mindset interventions, should sometimes affect student outcomes, and what we can learn from the times when they do not.
Preview of Controversies
In this article we review several prominent growth mindset controversies:
1. Do mindsets predict student outcomes?
2. Do student mindset interventions work?
3. Are mindset intervention effect sizes too small to
be interesting?
4. Do teacher mindset interventions work?
As we discuss below, different authors can stitch together
different studies across these issues to create very different
narratives. One narrative can lead to the conclusion that
mindset effects are not real and that even the basic tenets of
mindset theory are unfounded (Li & Bates, 2019; Macnamara, 2018). Another selective narrative could lead to the
opposite conclusion. Although we are not disinterested parties, we hope to navigate the evidence in a way that is
illuminating for the reader—that is, in a way that looks at
the places where mindset effects are robust and where they
are weak or absent. Below, we first preview the central
concepts at issue in each controversy and the questions we
will explore. After that, we will review the evidence for
each controversy in turn.
Do Mindsets Predict Student Outcomes?
A fixed mindset, with its greater focus on validating one’s
ability and drawing negative ability inferences after struggle
or failure, has been found to predict lower achievement
(e.g., grades and test scores) among students facing academic challenges or difficulties, compared with a growth
mindset, with its greater focus on developing ability and on
questioning strategy or effort after failure (e.g., Blackwell et
al., 2007). However, some studies have not found this same
result (e.g., Li & Bates, 2019). Meta-analyses have shown overall significant associations in the direction expected by the theory, but the effects have been heterogeneous and thus call for a greater understanding of where these associations are likely and where they may be less likely (Burnette et al., 2013; Sisk et al., 2018).

¹ In the present article we focus on the intelligence mindsets, which form the core of much of the mindset literature. Of course, people can have mindsets about other characteristics as well (e.g., personality or social qualities; see Schroder et al., 2016), and similar mediators and mechanisms have often emerged in those domains as well.
Recently, Macnamara (2018) and her coauthors (Burgoyne et al., 2020; Sisk et al., 2018) and Bates and his
coauthor (Li & Bates, 2019) have looked at this pattern of
results and called into question the basic correlations between mindsets and outcomes. Below, we look at the differences across these correlational studies to try to make
sense of the discrepant results. We ask: Are the effects
found in the large, generalizable studies (i.e., random or
complete samples of defined populations), particularly
those conducted by independent organizations or researchers? And how can we know whether a given result is a
“failure to replicate” or a part of a pattern of decipherable
heterogeneity?
Do Student Mindset Interventions Work?
The effects of mindset interventions on student outcomes
have been replicated but they, too, are heterogeneous (Sisk
et al., 2018; Yeager et al., 2019). Some studies (and subgroups within studies) have shown noteworthy effects, but
other studies or subgroups have not. Macnamara (2018),
Bates (Li & Bates, 2019), and several commentators (Shpancer, 2020; Warne, 2020) have argued that this means mindset interventions do not work or are unimportant. Others have looked at the same data and concluded that the
effects are meaningful and replicable but they are likely to
be stronger or weaker for different people in different contexts (Bryan et al., 2020; Gelman, 2018). We ask whether
the heterogeneity is in fact informative and whether there is
an overarching framework that could explain (and predict in
advance) heterogeneous effects.
Are Mindset Intervention Effect Sizes Too Small to Be Interesting?
Macnamara and colleagues have argued that growth
mindset intervention effect sizes are too small to be worthy
of attention (Macnamara, 2018; Sisk et al., 2018). Others
have disagreed (Gelman, 2018; Kraft, 2020; The World
Bank, 2017). Here we ask: What are the appropriate benchmarks for the effect sizes of educational interventions? How
do mindset interventions compare? And how can an appreciation of the moderators of effects contribute to a more
nuanced discussion of “true” effect sizes for interventions
(also see Bryan et al., 2020; Vivalt, 2020)?
Do Teacher Mindset Interventions Work?
Correlational research has indicated a role for educators’
mindsets and mindset-related practices in student achievement (Canning et al., 2019; Leslie et al., 2015; Muenks et
al., 2020). Nevertheless, two recent mindset interventions,
delivered to and by classroom teachers, have not had any discernible effects on student achievement (Foliano et al., 2019; Rienzo et al., 2015). We examine the issue of teacher-focused interventions and ask: Why might it be so difficult
to coach teachers to instill or support a growth mindset?
How can the next generation of mindset research make
headway on this?
Controversies
Controversy 1: Do Mindsets Predict
Student Outcomes?
Background
What Is Mindset Theory? Let us briefly review the theoretical predictions before assessing the evidence. Mindset theory (Dweck, 1999; Dweck & Leggett, 1988; see
Dweck & Yeager, 2019) grows out of two traditions of
motivational research: attribution theory and achievement
goal theory. Attribution theory proposed that people's explanations for a success or a failure (their attributions) can
shape their reactions to that event (Weiner & Kukla, 1970),
with attributions of failure to lack of ability leading to less
persistent responses to setbacks than attributions to more
readily controllable factors, such as strategy or effort (see
Weiner, 1985). Research by Diener and Dweck (1978) and
Dweck and Reppucci (1973) suggested that students of
similar ability could differ in their tendency to show these
different attributions and responses. Later, achievement
goal theory was developed to answer the question of why
students with roughly equal ability might show different
attributions and responses to a failure situation (Elliott &
Dweck, 1988). This line of work suggested that students
who have the goal of validating their competence or avoiding looking incompetent (a performance goal) tend to show
more helpless reactions in terms of (ability-focused) attributions and behavior, relative to students who have the goal
of developing their ability (a learning goal; Elliott &
Dweck, 1988; Grant & Dweck, 2003).
The next question was: Why might students of equal
ability differ in their tendency toward helpless attributions
or performance goals? This is what mindset theory was
designed to illuminate. Mindset theory proposes that situational attributions and goals are not isolated ideas but instead are fostered by more situation-general mindsets
(Molden & Dweck, 2006). These more situation-general
mindset beliefs about intelligence—whether it is fixed or
can be developed—are thought to lead to differences in
achievement (e.g., grades and test scores) because of the
different goals and attributions in situations involving challenges and setbacks.
In summary, mindset theory is a theory about responses to
challenges or setbacks. It is not a theory about academic
achievement in general and does not purport to explain the
lion’s share of the variance in grades or test scores. The
theory predicts that mindsets should be associated with
achievement particularly among people who are facing
challenges.
How Are Mindsets Measured? Mindsets are typically assessed by gauging respondents' agreement or disagreement with statements such as "You have a certain amount of
intelligence, and you really can’t do much to change it”
(Dweck, 1999). Greater agreement with this statement corresponds to more of a fixed mindset, and greater disagreement corresponds to more of a growth mindset. Mindsets
are not all-or-nothing—they are conceptualized as being on
a continuum from fixed to growth, and people can be at
different parts of the continuum at different times. In studies
in which survey space has been plentiful, mindsets are often
measured with three to four items framed in the fixed
direction and three to four items framed in the growth
direction, with the former being reverse-scored (e.g., Blackwell et al., 2007). In recent large-scale studies in which
space is at a premium (e.g., Yeager et al., 2016, 2019), a
more streamlined measure of the two or three strongest
fixed-framed items is used for economy, simplicity, and
clarity. Recently, several policy-oriented research studies
have adapted the mindset measures for use in national
surveys (e.g., Claro & Loeb, 2019).
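The scoring convention described above can be sketched as follows. This is our own illustration with made-up item responses and a hypothetical 1–6 agreement scale, not an official instrument:

```python
# Hypothetical sketch of mindset-scale scoring (our illustration, not an
# official instrument): fixed-framed items are reverse-scored so that higher
# composite scores always mean more of a growth mindset. Assumes a 1-6 scale.

def reverse_score(response: int, lo: int = 1, hi: int = 6) -> int:
    """Flip an agreement rating so that agreement with a fixed-framed item
    counts toward the fixed end of the continuum."""
    return hi + lo - response

# Made-up responses: agreement with fixed-framed vs. growth-framed statements
fixed_framed = [5, 6, 4]   # e.g., agreement that intelligence can't change
growth_framed = [2, 3]     # e.g., agreement that intelligence can be developed

composite = (
    sum(reverse_score(r) for r in fixed_framed) + sum(growth_framed)
) / (len(fixed_framed) + len(growth_framed))

print(round(composite, 2))  # higher values = more of a growth mindset
```

Because mindsets are conceptualized as a continuum, the composite is usually treated as a continuous score rather than split into "fixed" versus "growth" groups.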
Controversy 1a: Do Measured Mindsets Predict
Academic Achievement?
Preliminary Evidence. An initial study (Blackwell et al., 2007, Study 1, N = 373) found that students reporting
more of a growth mindset showed increasing math grades
over the 2 years of a difficult middle school transition,
whereas those with more of a fixed mindset did not (even
though their prior achievement did not differ). Later, a
meta-analysis by Burnette et al. (2013) summarized data
from many different kinds of behavioral tasks and administrative records, in the laboratory and the field, and from
many different participant populations, and found that
mindsets were related to achievement and performance.
Large-Scale Studies. Many large studies conducted
by governments and nongovernmental organizations have
found that mindsets were correlated with achievement.
First, all 4th to 7th grade students in the largest districts of
California were surveyed as part of the state accountability system (the "CORE" districts; N = 300,629). Growth
mindset was associated with higher English/Language Arts
scores, r = .28, and higher math scores, r = .27 (Claro & Loeb, 2019, Table 2, Column 1; also see West et al., 2018).²
A follow-up analysis of the same dataset found that the
association between mindset and test scores was stronger
among students who were struggling the most (those who
were medium-to-low achieving students; Kanopka et al.,
2020), which is consistent with mindset theory. Next, three
large survey studies—including the NSLM and the U-say study in Norway (N = 23,446)—collectively showed a correlation of mindset with high school grades of r = .24
(see Figure 1). In Chile, all 10th grade public school students (N = 168,533) were asked to complete growth mindset questions in conjunction with the national standardized test. Mindsets were correlated with achievement test scores at r = .34, and correlations were larger among students with greater risk of low performance (those facing socioeconomic disadvantages; Claro et al., 2016). In another large
study, the Programme for International Student Assessment (PISA), conducted by the OECD (an international organization that is charged with promoting economic growth around the world through research and education), surveyed random samples of students from 74 developed nations (N = 555,458; see Figure 1) and showed that growth mindset was significantly and positively associated with test scores in 72 of those nations (OECD, 2019; the exceptions were China and Lebanon). (If the three separate regions of China are counted as different nations, the positive associations were significant in 72 out of 76 cases.) The OECD correlations for the United States and Chile approximated the previously noted data (see Figure 1).
Unsupportive Evidence. In a recent study of N = 433 Chinese students in 5th and 6th grade, Li and Bates (2019) found no significant correlation between students' reported mindsets and their grades or intellectual performance.³ Bahník and Vranka (2017) likewise found no association between mindsets and aptitude tests in a large sample of university applicants in the Czech Republic (N = 5,653).
There can be many reasons for these null effects. One may be that the China sample is a smaller convenience sample and so the results may be less informative. Another may be
the choice and translation of the mindset items. However,
the PISA data may provide another possible, perhaps more
interesting, explanation. Looking at Figure 1, we see that
there was no positive correlation in the PISA data for China
overall (and there was even a negative correlation for the
mainland China students); moreover, the Czech Republic
was 71st out of 74 OECD nations in the size of their
correlation. Thus, the PISA data may suggest that the Li and Bates (2019) and Bahník and Vranka (2017) correlational studies happened to be conducted in cultural contexts that have the weakest links between mindsets and achievement in the developed world. Of course, this does not mean that every sample from these countries would show a null effect or that every sample from the countries with the higher positive correlations would show a positive effect, but the cross-national results help us to see whether a given study fits a pattern or not.

² The Claro and Loeb (2019) study also did an analysis focused on learning (i.e., test score gains). Such analyses will reduce the estimated correlations with mindset because they would be controlling for any effect of mindset on prior achievement, but these analyses too were significant and in the direction expected by mindset theory.

³ Li and Bates (2019) also tested the effects of intelligence praise (not mindset) on students' motivation, similar to what was done by Mueller and Dweck (1998). The authors reported three experiments, one of which was a significant replication at p < .05 and one of which was a marginally significant replication at p < .10 (two-tailed hypothesis tests). However, each study was underpowered. The most straightforward analysis with individually underpowered replications is to aggregate the data. When this was done across the three studies, Li and Bates (2019) clearly replicated Mueller and Dweck (1998), p < .05.
A recent meta-analysis, presented as unsupportive of
mindset theory’s predictions, was, upon closer inspection,
consistent with the conclusions from the large-scale studies.
Macnamara and colleagues meta-analyzed many past studies' effect sizes, with data from different nations, age
groups, and achievement levels, and using many different
kinds of mindset survey items (Sisk et al., 2018, Study 1).
They found an overall significant association between
mindsets and academic performance. However, this overall
association was much smaller (r = .09) than the estimates from the generalizable samples presented above (r = .24 to r = .34). This puzzle can be solved when we realize that the
Sisk et al. (2018) results were heterogeneous (I² = 96.29%), and that this caused the random effects meta-analysis to give about the same weight in the average to the effect sizes of very small, outlying studies from convenience samples as
it gave to the generalizable studies. The data from three
generalizable data sets (the CORE data, the Chile data, and the U.S. National Study of Learning Mindsets), which contributed 296,828, or 81%, of all participant data in the Sisk et al. (2018) meta-analysis, were given roughly the same weight as correlations with sample sizes of n = 10 or fewer participants (of which there were 19).
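The weighting issue can be made concrete with a small sketch. Under a standard random-effects model, each study's weight is the reciprocal of its sampling variance plus the between-study variance τ²; when heterogeneity is high, τ² dominates and even enormous samples gain little extra weight. The sample sizes below come from the text, but the τ² value is a hypothetical round number chosen only for illustration, not an estimate from the Sisk et al. data:

```python
# Sketch (our illustration, not the Sisk et al. analysis) of why a
# random-effects meta-analysis can weight a tiny study almost as heavily as
# a huge one. Study i gets weight w_i = 1 / (v_i + tau2), where v_i is its
# sampling variance and tau2 is the between-study variance.

def sampling_variance_r(n: int) -> float:
    """Approximate sampling variance of a correlation near zero: 1 / (n - 3)."""
    return 1.0 / (n - 3)

def re_weight(n: int, tau2: float) -> float:
    """Random-effects inverse-variance weight for one study."""
    return 1.0 / (sampling_variance_r(n) + tau2)

n_small, n_large = 10, 300_629   # a tiny convenience sample vs. the CORE data
tau2 = 0.02                      # hypothetical between-study variance (high heterogeneity)

# Fixed-effect weights are proportional to n - 3, so the large study dominates:
fe_ratio = sampling_variance_r(n_small) / sampling_variance_r(n_large)  # roughly 43,000x

# Random-effects weights with a large tau2 are far flatter:
re_ratio = re_weight(n_large, tau2) / re_weight(n_small, tau2)

print(f"fixed-effect weight ratio (large/small): {fe_ratio:,.0f}x")
print(f"random-effects weight ratio (large/small): {re_ratio:.1f}x")
```

With these illustrative numbers, the large study's weight advantage collapses from tens of thousands of times to under an order of magnitude, which is the mechanism behind the flattened meta-analytic average described above.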
Conclusions. There is a replicable and generalizable
association between mindsets and achievement. A lesson
from this controversy is to underscore that this association
cannot be captured with a single, summary effect size.
Mindset associations with outcomes were expected to be,
and often were, stronger among people facing academic
difficulties or setbacks. Further, there was (and always will
be) some unexplained heterogeneity across cultures and
almost certainly within cultures as well. This unexplained
heterogeneity should be a starting point for future theory
development. To quote Kenny and Judd (2019), “rather than
seeking out the ‘true’ effect and demonstrating it once and
for all, one approaches science . . . with a goal of understanding all the complexities that underlie effects and how
those effects vary” (p. 10).
In line with this recommendation, we are deeply interested in cultures in which a growth mindset does not predict
higher achievement, such as mainland China. Multinational
probability sample studies are uniquely suited to investigate
this. On the PISA survey, students in mainland China reported spending 57 hr per week studying (OECD, 2019, Figure 1.4.5), the second-most in the world.

Figure 1
Correlations Between Mindset and Test Scores in 74 Nations Administering the Mindset Survey in the 2018 PISA (N = 555,458)
Note. Each dot is a raw correlation between the single-item mindset measure and PISA reading scores for each nation's 15-year-olds. Bars: 95% confidence intervals (CIs). Data source: OECD, PISA 2018 Database, Table 3.B1.14.5. Data were reported by the OECD in SD units; these were transformed to r using the standard "r from d" formulas. One nation contributed three different effect sizes according to the OECD (China: for Macao, for Hong Kong, and for four areas of the mainland); these three effect sizes were averaged here. Looking at the three China effects separately, each was near zero, and mainland China was negative and significant. See the online article for the color version of this figure.

Perhaps a
growth mindset cannot increase hours of studying or test
scores any further when there is already a cultural imperative to work that hard. However, that does not mean mindset
has no effect in such cultures. Mindset has an established
association with mental health and psychological distress
(Burnette et al., 2020; Schleider et al., 2015; Seo et al., in
press). Of all OECD nations, mainland China had the strongest association between fixed mindset and "fear of failure,"
a precursor to poor mental health (OECD, 2019, Table 3.2.).
This suggests that perhaps a growth mindset might yet have
a role to play in student well-being there.
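As an aside on the Figure 1 note: a standardized mean difference d can be converted to a correlation r with the standard textbook formula r = d/√(d² + 4) (the equal-group-size case). This is our sketch of that generic conversion, not a reproduction of the OECD's exact computation:

```python
import math

def r_from_d(d: float) -> float:
    """Standard textbook conversion of a standardized mean difference d to a
    point-biserial correlation r, assuming equal group sizes."""
    return d / math.sqrt(d * d + 4)

# A difference of 0.5 SD corresponds to a correlation of about .24
print(round(r_from_d(0.5), 2))
```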
Controversy 1b: Does the Mindset “Meaning System”
Replicate?
Burgoyne et al. (2020) have claimed that the evidence for
the mindset "meaning system" is not robust. First, we review the predictions and measures, and then we weigh the
evidence.
Meaning System Predictions. The two mindsets are
thought to orient students toward different goals and attributions in the face of challenges and setbacks. Research has
also found (e.g., Blackwell et al., 2007) that the two mindsets can orient students toward different interpretations of
effort. For this reason, we have suggested that each mindset
fosters its own "meaning system." More specifically, mindset theory expects that the belief in fixed ability should be
associated with a meaning system of performance goals
(perhaps particularly performance-avoidance goals, which
is the goal of avoiding situations that could reveal a lack of
ability), negative effort beliefs (e.g., the belief that the need
to put forth effort on a task reveals a lack of talent), and
helpless attributions in response to difficult situations (e.g.,
attributing poor performance to a stable flaw in the self,
such as being “dumb”). A growth mindset meaning system
is the opposite, involving learning goals, positive effort
beliefs, and resilient (e.g., strategy-focused) attributions
(see Dweck & Yeager, 2019). Thus, the prediction is that
measured mindsets should be associated with their allied
goals, effort beliefs, and attributions.
Meaning System Measures. In evaluating the robustness of the meaning system, one must carefully examine the
measures used to see whether they capture the spirit and the
psychology of the meaning system in question. Indeed, as a
meta-analysis by Hulleman points out, research on achievement goals has often used "the same label for different
constructs” (Hulleman et al., 2010, p. 422). There are two
very different types of performance goal measures: those
focused on appearance/ability (i.e., validating and demonstrating one's abilities) and those focused on living up to a
normative standard (e.g., doing well in school; also see
Grant & Dweck, 2003). These two different types of performance goal measures have different (even reversed) associations with achievement and with other motivational
constructs (see Table 13, p. 438, in Hulleman et al., 2010).
Mindset studies have mostly asked about appearance-focused achievement goals (e.g., "It's very important to me
that I don’t look stupid in [this math] class,” as in Blackwell
et al., 2007, Study 1; also see Table 1). However, correlations of mindset with normative achievement goals (the
desire to do well in school overall) are not expected. Next,
learning goals measures do not simply ask about the value
of learning in general but ask about it in the context of a
tradeoff (e.g., in Blackwell et al., 2007: “It’s much more
important for me to learn things in my classes than it is to
get the best grades;” also see Table 1). The reason is that
almost anyone, even students with more of a fixed mindset,
should be interested in learning, but mindsets should start to
matter when people might have to sacrifice their public
image as a high-ability person to learn. Finally, attributions
are typically assessed in response to a failure situation (see
Robins & Pals, 2002 and Table 1), which is where the two
mindsets part ways.
Initial Evidence. The first study to simultaneously test
multiple indicators of the mindset meaning system was
conducted by Robins and Pals (2002) with college students (N = 508). They found that a fixed mindset was associated
with performance goals and fixed ability attributions for
failure, as well as helpless behavioral responses to difficulty, rs = .31, .19, and .48, respectively (effort beliefs were
not assessed). Later, Blackwell et al. (2007, Study 1, N = 373) replicated these findings, adding a measure of effort beliefs.
Each meaning system hypothesis has also been tested
individually in experiments which directly manipulated
mindsets through persuasive articles (without mentioning
any of the meaning system constructs). In a study of mindsets and achievement goals, Cury et al. (2006, Study 2)
manipulated mindsets, without discussing goals, and
showed that this caused differences in performance versus
learning or mastery goals in the two mindset conditions. In
a study of mindsets and effort beliefs, Miele and Molden
(2010, Study 2) crossed a manipulation of mindsets (fixed
vs. growth) with a manipulation of how hard people had to
work to interpret a passage of text.⁴ Participants induced to
hold more of a fixed mindset lowered their competence
perceptions when they had to work hard to interpret the
passage, and increased their competence perceptions when
the passage was easy to interpret, relative to those induced
to hold a growth mindset (who did not differ in their
competence perceptions in the hard vs. easy condition). It is
also important to note that actual performance and effort were constant across the mindset conditions, but mindsets caused different appraisals of the meaning of the effort they expended. Finally, studies of mindsets and resilient versus helpless reactions to difficulty have manipulated mindsets without mentioning responses to failure. These have shown greater resilience among those induced with a growth mindset (e.g., attributing failure to effort and/or seeking out remediation) and greater helpless responses among those induced with a fixed mindset (Hong et al., 1999; Nussbaum & Dweck, 2008).

⁴ Miele and Molden (2010) note that “the two versions of the [mindset-inducing] article focused solely on whether intelligence was stable or malleable and did not include any information about the role of mental effort or processing fluency in comprehension and performance” (p. 541).
Meta-Analysis. A meta-analysis by Burnette et al.
(2013) synthesized the experimental and correlational evi-
dence on the meaning system hypotheses using data from
N = 28,217 participants. Consistent with mindset theory,
the effects of mindset on achievement goals (performance
vs. learning) and on responses to situations (helpless vs.
mastery) were replicated and were statistically significantly
stronger (by approximately 50% on average) when people
were facing a setback (what the authors called “ego threats”;
Burnette et al., 2013).
Large-Scale Studies Using Standardized Measures.
Recently, three different large studies with a total of over
23,000 participants have replicated the meaning system
correlations (see Figure 2 and Table 1), two of which were
generalizable to entire populations of students, either to the
United States overall (Yeager et al., 2019) or to entire parts
of Norway (Rege et al., in press).
These studies used the “Mindset Meaning-System Index” (MMI), a standardized measure composed of single items
for each construct. To create the MMI, from past research
(e.g., Blackwell et al., 2007) we chose the most prototypical
or paradigmatic items that loaded highly onto the common
factor for that construct and covered enough of the construct
that it could stand in for the whole variable. Meeting the
criteria set forth above, the performance goal item was
appearance/ability focused, the learning goal involved a
tradeoff (e.g., learning but risking your high grades vs. not
learning but maintaining a high-ability image), and the
attribution items involved a response to a specific failure
situation. Each item was then edited to be clear enough to be
administered widely, especially to those with different lev-
els of language proficiency.
Understanding an Unsupportive Study. In a study
with N = 438 undergraduates, Burgoyne et al. (2020) found
weak or absent correlations between their measure of mind-
sets and their measures of some of the meaning system
variables (performance goals, learning goals, and attribu-
tions), none of which met their r = .20 criterion for a
meaningful effect. (They did not include effort beliefs.)
The data from Burgoyne et al. (2020) are thought-
provoking. They differ from the data provided by Robins and Pals (2002), by Blackwell et al. (2007), and by the participants just mentioned. Indeed, Table 1 shows that 10 out of
15 of the correlations in the three large replications ex-
ceeded the r = .20 threshold set by Burgoyne et al., even
without adjusting for the unreliability in single-item mea-
sures that can attenuate the magnitude of associations.
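The attenuation mentioned here follows Spearman's classic correction: measurement error in each variable shrinks the observed correlation relative to the latent one. The sketch below is ours, for illustration only; the reliability values are hypothetical, chosen simply to show the direction of the adjustment.

```python
import math

def disattenuate(r_observed: float, rel_x: float, rel_y: float) -> float:
    """Spearman's correction for attenuation: estimate the correlation
    between true scores from an observed correlation and the reliability
    of each measure."""
    return r_observed / math.sqrt(rel_x * rel_y)

# Hypothetical example: an observed r of .21 between two single items with
# reliabilities of .60 each implies a latent correlation of .21 / .60 = .35.
r_latent = disattenuate(0.21, 0.60, 0.60)
```

On this logic, single-item correlations near the r = .20 threshold may understate the underlying associations.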
One solution to the puzzling discrepancy is that Burgoyne
et al. (2020) did not adhere closely to the measures used in
past studies. The nature of these deviations can be instruc-
tive for theory and future research. Tellingly, Burgoyne et
al. (2020) asked about normative-focused performance
goals (“I worry about the possibility of performing poorly”),
not appearance/ability-focused goals (cf. Grant & Dweck, 2003; Hulleman et al., 2010).

Table 1
Correlations of Fixed Mindsets With the “Mindset Meaning System” in Replication Studies

Correlation with fixed mindset         NSLM pilot,   NSLM,        U-say,
                                       N = 3,306     N = 14,894   N = 5,247
Meaning system aggregate index, r        .43           .40          .41
Individual meaning system items
  Effort beliefs, r                      .47           .32          .36
  Goals
    Performance-avoidance, r             .21           .21          .17
    Learning, r                         -.16          -.14         -.21
  Response (attributions)
    Helplessness, r                      .29           .28          .23
    Resilience, r                       -.14          -.15         -.25

Note. NSLM = National Study of Learning Mindsets. Correlations do not adjust for unreliability in the single items, so effect sizes are conservative relative to measures with less measurement error. Data sources: the NSLM (Yeager et al., 2019); the NSLM pilot study (Yeager et al., 2016); and a replication of the NSLM in Norway, the U-say study (Rege et al., in press). Effort beliefs: “When you have to try really hard in a subject in school, it means you can’t be good at that subject;” Performance-avoidance goals: “One of my main goals for the rest of the school year is to avoid looking dumb in my classes;” Learning goals: Student chose between “easy math problems that will not teach you anything new but will give you a high score vs. harder math problems that might give you a lower score but give you more knowledge”; Helpless responses to challenge: Getting a bad grade “means I’m probably not very smart at math;” Resilient responses to challenge: After a bad grade, saying “I can get a higher score next time if I find a better way to study.” All ps < .05.

Students with more of a
growth mindset may well be just as eager to do well in
school (and not do poorly) as those with more of a fixed
mindset (indeed, endorsement of their item was near ceil-
ing). Where they should differ is in whether they are highly
concerned with maintaining a public (or private) image of
being a “smart” not a “dumb” person. To begin to find out
if this solves the puzzle, we were able to include both
measures of performance goals in a study of mindset we
were conducting with N = 1,582 U.S. 8th to 12th graders
(preregistration: osf.io/3mtcz; see the online supplemental
materials). A Bayesian analysis using Stan (Gelman et al.,
2015) found that a fixed mindset was associated with our
ability-focused goals item, r = .25 [95% posterior density interval: .20, .29], and this was greater than Burgoyne’s criterion of r = .20 with 97% probability, while a fixed mindset was more weakly associated with the normative-focused goals composite, r = .08 [.03, .13], probability r < .20 = 99.9%. These data support our argument that at least
in some cases Burgoyne et al. (2020) used items that did not
truly test the meaning system hypotheses.⁵
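The posterior probabilities reported above can be roughly reconstructed from the published summaries alone. The sketch below is ours, not the authors' Stan code, and it assumes (for illustration only) a normal posterior whose spread is read off the reported 95% interval.

```python
import random

def prob_above(mean, lo95, hi95, threshold, n_draws=200_000, seed=1):
    """Monte Carlo estimate of P(r > threshold), treating the posterior
    as normal with an sd inferred from the 95% interval's width."""
    sd = (hi95 - lo95) / (2 * 1.96)
    rng = random.Random(seed)
    return sum(rng.gauss(mean, sd) > threshold for _ in range(n_draws)) / n_draws

# Ability-focused goals item, r = .25 [.20, .29], vs. the r = .20 criterion:
p_ability = prob_above(0.25, 0.20, 0.29, 0.20)    # well above .90
# Normative-focused composite, r = .08 [.03, .13]:
p_normative = prob_above(0.08, 0.03, 0.13, 0.20)  # essentially zero
```

Under this normal approximation the qualitative contrast is reproduced (a high probability of exceeding the criterion for the ability-focused item, a negligible one for the normative composite), though the exact 97% figure depends on the shape of the actual posterior.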
Conclusion. The Burgoyne et al. (2020) study was in-
structive because it provided an opportunity to articulate the
aspects of the constructs that are most central to mindset
theory. In addition, it showed the need for standardized
measures, not only of the mindsets, but also of their mean-
ing system mediators. Being short, the MMI is not designed
to maximize effect sizes in small studies, but rather is
designed to provide a standardized means for learning about
where (and perhaps for testing hypotheses about why) the
meaning system variables should be more strongly or
weakly related to mindsets, to each other, and to outcomes.
Controversy 2: Do Student Mindset
Interventions Work?
Based on their meta-analysis of mindset interventions,
Macnamara and colleagues (Macnamara, 2018; Sisk et al.,
2018) concluded that there is only weak evidence that
mindset interventions work or yield meaningful effects.
Here, we first clarify what is meant by a growth mindset
intervention. Then we ask: Do the effects replicate and hold
up in preregistered studies and independent analyses? And
are the effects meaningfully heterogeneous—that is, do the
moderation analyses reveal new insights, mechanisms, or areas for future research?

⁵ In addition, the Burgoyne et al. (2020) study included a single attributional item which did not ask about an interpretation of any failure event (it simply said “Talent alone—without effort—creates success”), unlike Robins and Pals (2002).

Figure 2
The Correlations Between Fixed Mindsets, the Meaning System, and Academic Outcomes Were Replicated in Three Large Studies of First-Year High School Students in Two Nations
Note. Statistics are zero-order correlations estimated by meta-analytically aggregating three data sets: the U.S. National Study of Learning Mindsets (NSLM) pilot (Yeager et al., 2016); the U.S. NSLM (Yeager et al., 2019); the Norway U-say experiment (Rege et al., in press). All ps < .001. Data from each of the items for each of the three studies are presented in Table 1. The meaning system paths to grades and course-taking were measured with the aggregate of the Mindset Meaning System items (see Table 1 and the online appendix for the questionnaire). The MMI measure is brief (five items) so that it could be incorporated into large-scale studies. Because these are single-item measures, magnitudes of correlations are likely to be underestimates of true associations.
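Meta-analytic aggregation of correlations across studies, as reported for Figure 2 and Table 1, is commonly done on Fisher's z scale with sample-size weights. The sketch below shows the standard fixed-effect estimator, not necessarily the authors' exact method; the example values are the effort-beliefs correlations from Table 1.

```python
import math

def pooled_r(studies):
    """Fixed-effect pooling of correlations: average Fisher r-to-z values
    weighted by n - 3 (the inverse variance of z), then transform back."""
    num = den = 0.0
    for r, n in studies:
        z = math.atanh(r)  # Fisher r-to-z transform
        w = n - 3          # inverse-variance weight for z
        num += w * z
        den += w
    return math.tanh(num / den)

# Pooling the three effort-beliefs correlations from Table 1:
r_pooled = pooled_r([(0.47, 3306), (0.32, 14894), (0.36, 5247)])
```

Because the weights are proportional to sample size, the largest study (the NSLM) dominates the pooled estimate.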
What Is a Growth Mindset Intervention?
The Intervention’s Core Content. A growth mindset
intervention teaches the idea that people’s academic and
intellectual abilities can be developed through actions they
take (e.g., effort, changing strategies, and asking for help;
Yeager & Dweck, 2012; Yeager, Romero, et al., 2016). The
intervention usually conveys information about neuroplas-
ticity through a memorable metaphor. The NSLM, for in-
stance, stated “the brain is like a muscle—it gets stronger
(and smarter) when you exercise it.” Merely defining and
illustrating the growth mindset with a metaphor could never
motivate sustained behavior change, however. The interven-
tion must also mention concrete actions people can take to
implement the growth mindset, such as “You exercise your
brain by working on material that makes you think hard in
school.” Students also heard stories from scientists, peers,
and notable figures who have used a growth mindset. The
intervention is not a passive experience but invites active
engagement. In the NSLM, for example, students wrote
short essays about times they have grown their abilities after
struggling and how they might use a growth mindset for
future goals. They also wrote a letter to encourage a future
student who had fixed mindset thoughts such as “I’m al-
ready smart, so I don’t have to work hard” or “I’m not smart
enough, and there’s nothing I can do about it.” This is called
a “saying-is-believing” exercise and it can lead students to
internalize the growth mindset in a short time, through
cognitive dissonance processes (Aronson et al., 2002).
Crucial Details. Several details are crucial to the inter-
vention’s effectiveness. A growth mindset is not simply the
idea that people can get higher scores if they try harder. To
count as a growth mindset intervention, it must make an
argument that ability itself has the potential to be developed.
Telling students that they succeeded because they tried hard,
for example, is an attribution manipulation, not a growth
mindset intervention (cf. Li & Bates, 2019). A growth
mindset intervention also does not require one to believe
that ability can be greatly or easily changed. It deals with the
potential for change, without making any claim or promise
about the magnitude or ease of that change. Nor does a
growth mindset intervention say that ability does not matter
or does not differ. Rather, it focuses on within-person com-
parisons—the idea that people can improve relative to prior
abilities.
Finally, growth mindset interventions can be poorly
crafted or well-crafted (see Yeager, Romero, et al., 2016, for
head-to-head comparisons of different interventions).
Poorly crafted interventions tell participants the definition
of growth mindset without suggesting how to put it into
practice. As noted above, a definition alone cannot motivate
behavior change. Well-crafted interventions ask students to
reflect on how they might develop a “stronger” (better-
connected) brain if they do challenging work, seek out new
learning strategies, or ask for advice when it is needed, but
they do not train students in the mediating behaviors or
meaning system constructs. That is, the interventions mention these learning-oriented behaviors at a high level, so that students do not just have a definition of a growth mindset but also know how to put it into action if they wish to do so. In
fact, in some mindset experiments, the control group gets skills
training (e.g., Blackwell et al., 2007; Yeager et al., 2013).
Well-crafted interventions are also autonomy-supportive and
are not didactic (see Yeager et al., 2018).
Crafting the Interventions. Recently, as growth mind-
set interventions have been evaluated in larger-scale exper-
iments, a years-long research and development process has
been followed, which included (a) focus groups with stu-
dents to identify arguments that will be persuasive to the
target population, (b) A/B tests to compare alternatives head
to head, and (c) pilot studies in preregistered replication
experiments (see Yeager, Romero, et al., 2016). When the
intervention reliably changed mindset beliefs and short-term
mindset-relevant behavior (such as challenge-seeking) con-
sistently across gender, racial, ethnic, and socioeconomic
groups, then it was considered ready for an evaluation study
testing its effects on academic outcomes in the target pop-
ulation (e.g., among 9th graders).
Intervention Target. Here we focus on growth mindset
interventions that target students and that deliver treatments
directly to students without using classroom teachers as
intermediaries. We consider teacher-focused interventions
below (see Controversy 4).
Do Student Growth Mindset Intervention
Effects Replicate?
Initial Evaluation Experiments. Blackwell and col-
leagues (2007, Study 2) evaluated an in-person growth
mindset intervention, delivered by highly trained facilitators
(not teachers) who were trained extensively before each
session and debriefed fully after each one. They found that
this initial intervention halted the downward trajectory of
math grades among U.S. 7th grade students who had pre-
viously been struggling in school (N = 99). Although prom-
ising, the in-person intervention was not scalable because it
required extensive training of facilitators and a great deal of
class time. Later, Paunesku and colleagues (2015) adminis-
tered a portion of that face-to-face intervention in an online
format to U.S. high school students (N = 1,594). It included
a scientific article conveying the growth mindset message,
followed by guided reading and writing exercises designed
to help students internalize the mindset. Using a double-
blind experimental design, Paunesku and colleagues (2015)
found effects of the intervention on lower-achieving stu-
dents’ grades months later, at the end of the term—a result
which mirrored the findings of many correlational studies
showing larger effects for more vulnerable groups (noted
above). The effect size was, not surprisingly, smaller in a
short, online self-administered intervention than in an in-
person intervention, but the conclusions were similar and
the online mode was more scalable.
Yeager, Walton and colleagues (2016, Study 2) replicated
the Paunesku et al. study (using almost identical materials),
but focusing on undergraduates (N = 7,418) and a different
outcome (full-time enrollment rather than grades). They
administered a version of the Paunesku et al. (2015) short
online growth mindset intervention to all entering under-
graduates at a large public university and found that it
increased full-time enrollment rates among vulnerable
groups. Broda and colleagues (Broda et al., 2018) carried
out a replication (N = 7,686) of the Yeager, Walton, et al.
(2016, Study 2) study and found a similar result: a growth
mindset intervention lifted achievement among a subgroup
of students that was at risk for poor grades.
First Preregistered Replication. The Paunesku et al.
study was replicated in a preregistered trial conducted with
high school students (Yeager, Romero, et al., 2016). That
study improved upon the intervention materials by updating
and refining the arguments (see Study 1, Yeager et al.,
2016) and more than doubled the sample size (3,676), which
matched recommendations for replication studies (Simon-
sohn, 2015). This Yeager et al. (2016) study found effects
for lower-achieving students. Of note, Pashler and de Ruiter
(2017) have argued that psychologists should reserve strong
claims for “class 1” findings—those that are replicated in
preregistered studies. With the 2016 study, growth mindset
met their “class 1” threshold.
A National, Preregistered Replication. In 2013 we
launched the NSLM (Yeager, 2019), which is another pre-
registered randomized trial, but now using a nationally
representative sample of U.S. public high schools. To
launch the NSLM, a large team of researchers collectively
developed and reviewed the intervention materials and pro-
cedures to ensure that the replication was faithful and high-
quality. Then, an independent firm specializing in nationally
representative surveys (ICF International) constructed the
sample, trained their staff, guided school personnel, and
collected and merged all data. Next, a different independent
firm, specializing in impact evaluation studies (MDRC),
constructed the analytic dataset, and wrote their own inde-
pendent evaluation report based on their own analyses of the
data. Our own analyses (Yeager et al., 2019) and the inde-
pendent report (Zhu et al., 2019) confirmed the conclusion
from prior research: there was a significant growth mindset
intervention effect on the grades of lower-achieving stu-
dents.
Effects on Course-Taking. Of note, an exploratory
analysis of the NSLM data found that the growth mindset
intervention increased challenge-seeking across achieve-
ment levels, as assessed by higher rates of taking advanced
math (Algebra II or above). This finding was later replicated
in the Norway U-say experiment (N = 6,541; Rege et al., in
press) with an identical overall effect size.
Summary of Supportive Replications. Taken to-
gether, these randomized trials established the effects of
growth mindset interventions with more than 40,000 partic-
ipants and answered the question of whether growth mind-
set interventions can, under some conditions, yield replica-
ble and scalable effects for vulnerable groups. However,
this does not mean that growth mindset interventions work
everywhere for all people. Indeed, there were sites within
the NSLM where the intervention did not yield enhanced
grades among lower achievers.
Unsupportive Evidence. Rienzo, Rolfe, and Wilkinson (2015) evaluated a face-to-face growth mindset intervention in a sample of 5th grade students (N = 286), resembling the face-to-face method in Blackwell et al. (2007). They showed a nonsignificant positive effect, namely, a 4-month gain in academic achievement, p = .07,
in the growth mindset group relative to the controls. The
estimated effect size (and pattern of moderation results by
student prior achievement reported by Rienzo et al., 2015)
was larger than the online growth mindset intervention
effects. Therefore, the Rienzo study is not exactly evidence
against mindset effects (see a discussion of this statistical
argument in McShane et al., 2019). But Rienzo et al. (2015) also reported a second, teacher-focused intervention with
null effects (see Controversy 4 below).
Several studies have evaluated direct-to-student growth
mindset manipulations (see Sisk et al., 2018 for a meta-
analysis; also see the appendix in Ganimian, 2020). Some of
these relied on nonrandomized (i.e., quasi-experimental)
research designs, and others were randomized trials. Some
were not growth mindset interventions (e.g., involving only
an email from a professor or sharing a story of a scientist who
overcame great struggle). These were combined in a meta-
analysis (Sisk et al., 2018). This meta-analysis yielded the
same conclusion as the NSLM: it found an overall significant mindset intervention effect that was larger among students at risk for poor performance. However, as
with their meta-analysis of the correlational studies, the Sisk
et al. (2018) meta-analysis yielded significant heterogene-
ity.
Is the Heterogeneity in Mindset Effects Meaningful?
Learning from the heterogeneity in any psychological phe-
nomenon, especially one involving real-world behavior and
policy-relevant outcomes, can be difficult (Tipton et al.,
2019). In particular, it is hard to understand the source of
heterogeneous findings when different studies involve dif-
ferent populations, different interventions, and different
contexts all at once. Because meta-analyses tend to assess
moderators at the study level (rather than analyzing within-
study interaction effects), and because each study tends to
change all three—the population, the intervention, and the
context—then moderators are confounded. For this reason,
metaregression is often poorly suited for understanding
moderators (see a discussion in Yeager et al., 2015).
It is far easier to find out if contextual heterogeneity is
meaningful if a study holds constant key study design
features (the intervention and the targeted group) and then
carries out the experiment in different contexts (Tipton et
al., 2019). If effects still varied, this would mean that at least
some of the heterogeneity we were seeing was systematic
and potentially interesting. Indeed, large, rigorous random-
ized trials have a long history of settling debates caused
by meta-analyses that aggregated mostly correlational or
quasi-experimental studies (see, e.g., the class size de-
bate; Krueger, 2003).
Heterogeneous Effects in the NSLM. The NSLM was
designed to provide exactly this kind of test, by uncon-
founding contextual heterogeneity from intervention design.
It revealed that growth mindset intervention effects were
systematically larger in some schools and classrooms than
others. In particular, the intervention improved lower-
achieving students’ grade point averages chiefly when peers
displayed a norm of challenge-seeking (a growth mindset
behavior; Yeager et al., 2019) and math grades across
achievement levels when math teachers endorsed more of a
growth mindset (Yeager et al., 2020). Thus, the intervention (which changed growth mindset beliefs homogeneously across these settings) was most effective at changing grades in
populations who were vulnerable to poor outcomes, and in
contexts where peers and teachers afforded students the
chance to act and continue acting on the mindset (see
Walton & Yeager, 2020).
A Proposed Framework for Understanding Heteroge-
neity: The Mindset Context Perspective. The NSLM
results led to a framework that can help to understand (and
predict, in advance) heterogeneous results. We call it the
Mindset Context perspective (see Figure 3; see Yeager et
al., 2019). According to this view, a growth mindset inter-
vention should have meaningful effects only when people
are actively facing challenges or setbacks (e.g., when they
are lower-achieving, or are facing a difficult course or
school transition) and when the context provides opportu-
nities for students to act on their mindsets (e.g., via teacher
practices that support challenge-seeking or risking mistakes;
Bryan et al., 2020; Walton & Yeager, 2020). Mindset
Context stands in contrast to the “mindset alone” perspec-
tive, which is the idea that if people are successfully taught
a growth mindset they will implement this mindset in al-
most any setting they find themselves in.
One important implication of Mindset Context theory
is that any intervention will likely need further customiza-
tion before it can be given to different populations (e.g., in
the workplace, for older or younger students, or in a new
domain). And even a well-crafted intervention will need to
be delivered with an understanding of the context factors
that could moderate its effects.
We note that research using a typical convenience sample
will have a hard time testing predictions of Mindset
Context Theory. Individual-level risk factors and unsup-
portive context factors are often positively correlated in
the population, but the two types of factors are expected
to have opposite moderation effects. This is why it is
useful to use either representative samples (like the
NSLM) or carefully constructed quota samples which
disentangle the two.
Conclusions. The NSLM results show why it can be
misleading to look at studies with null findings and con-
clude that something “doesn’t work” or isn’t “real” (also see
Kenny & Judd, 2019). In fact, if researchers had treated each of the 65 schools in the NSLM as its own separate randomized trial, there could be many published papers in the literature that might look like “failures to replicate.” This is why it is not best practice to count significant results (see Gelman, 2018, for a discussion of this topic). When, instead, we had specific hypotheses about meaningful sources of heterogeneity and a framework for interpreting them (see Figure 3), we found that the schools varied systematically in their effects. There is still much more to learn about heterogeneity in growth mindset interventions, of course, in particular with respect to classroom cultures and international contexts (cf. Ganimian, 2020).

Figure 3
The Mindset Context Perspective: A Decision Tree Depicts Questions to Ask About a Mindset Intervention, and What Kinds of Effect to Expect Depending on the Answer
Controversy 3: Are Mindset Effect Sizes Too
Small to Be Interesting?
What are the Right Benchmarks for Effect Sizes?
Benchmarks from Macnamara and Colleagues. Macna-
mara and colleagues stated that a “typical” effect size for an
educational intervention is .57 SD (Macnamara, 2018; Sisk et
al., 2018, p. 569). To translate this into a concrete effect, this
would mean that if GPA had a standard deviation of 1, as it
usually does, then a typical intervention should be expected to
increase GPA by .57 points, for instance from a 3.0 to a 3.57.
Macnamara and colleagues argued that “resources might be
better allocated elsewhere” because growth mindset effects
were much smaller than .57 SD (Sisk et al., 2018, p. 569).
Questioning Macnamara (2018), Gelman (2018) asked
“Do we really believe that .57? Maybe not.” The .57 SD
benchmark comes from a meta-analysis by Hattie et al.
(1996) of learning skills manipulations. Their meta-
analysis mostly aggregated effects on variables at imme-
diate posttest, almost all of which were researcher-
designed measures to assess whether students displayed
the skill they had just learned (almost like a manipulation
check). However, such immediate measures that are very
close to precisely what was taught are well-known to
show much larger effect sizes than multiply determined
outcomes that unfold over time in the real world, such as
grades or test scores (Cheung & Slavin, 2016). Indeed, in
the same Hattie meta-analysis cited by Macnamara
(2018), Hattie and colleagues stated: “There were 30
effect sizes that included follow-up evaluations, typically
of 108 days, and the effects sizes declined to an average
of .10” (Hattie et al., 1996, p. 112). Judging from the
meta-analysis cited by Macnamara (2018), the “typical”
effect size for a study looking at effects over time would
be .10 SD, not .57 SD. But even this is too optimistic
when we consider study quality.
Benchmarks From Field Experiments in Education.
The current standard for understanding effect sizes in
educational research does not look to a meta-analysis of
all possible studies regardless of quality but looks to
“best evidence syntheses.” This means examining syn-
theses of studies using the highest-quality research de-
signs that were aimed at changing real-world educational
outcomes (Slavin, 1986). This is important because
lower-quality research designs (nonexperimental, non-
randomized, or nonrepresentative) do not provide realis-
tic benchmarks against which interventions should be
evaluated (Cheung & Slavin, 2016). Through the lens of
the best evidence synthesis, about the best effect that can
be expected for an educational intervention with real-
world outcomes is .20 SD. For instance, the effect of
having an exceptionally good versus a below-average
teacher for an entire year is .16 SD (Chetty et al., 2014).
An entire year of math learning in high school is .22 SD
(Lipsey et al., 2012). The positive effect of drastically
reducing class-size for elementary school students was
.20 SD (Nye et al., 2000).
A typical effect of educational interventions is much
smaller. Kraft (2020) located all studies funded by the
federal government’s Investing in Innovation (I3) Fund,
which were studies of programs that had previously
shown promising results and won a competition to un-
dergo rigorous, third-party evaluation (Boulay et al.,
2018). This is relevant because researchers had to pre-
specify their primary outcome variable and hypotheses
and report the results regardless of their significance, so
there is no “file drawer” problem. These were studies
looking at objective real-world outcomes (e.g., grades,
test scores, or course completion rates) some time after
the educational interventions. The median effect size was
.03 SD and the 90th percentile was .23 SD (see Kraft,
2020, Table 1, rightmost column). In populations of
adolescents (the group targeted by our growth mindset
interventions), there were no effects over .23 SD (see an
analysis in the online supplement in Yeager et al., 2019).
Kraft (2020) concluded that “effects of 0.15 or even 0.10
SD should be considered large and impressive when they
arise from large-scale field experiments that are prereg-
istered and examine broad achievement measures” (p.
248).
It is interesting to note that the most highly touted
“nudge” interventions (among those whose effects unfold
over time) have effects that are in the same range (see
Benartzi et al., 2017). The effects of descriptive norms on
energy conservation (.03 SD), the effects of implemen-
tation intentions on flu vaccination rates (.12 SD), and the
effects of simplifying a financial aid form on college
enrollment for first-time college-goers (.16 SD; effect
sizes calculated from posted z statistics, https://osf.io/47f7j/) were never anywhere close to .57 SD overall.
Summary of Effect Sizes
We have presented this review to suggest that psychology’s
effect size benchmarks, which were based on expectations of
laboratory results and are often still used in our field, have led
the field astray (also see a discussion in Kraft, 2020). In the real
world, single variables do not have huge effects. Not even
relatively large, expensive, and years-long reforms do. If psy-
chological interventions can get a meaningful chunk of a .20
effect size on real-world outcomes in targeted groups, reliably,
cost-effectively, and at scale, that is impressive.
Comparison to the NSLM
In this context, growth mindset intervention effect sizes are
noteworthy. The NSLM showed an average effect on grades in
the preregistered group of lower-achieving students of .11 SD
(Yeager et al., 2019). Moreover, this occurred with a short,
low-cost self-administered intervention that required no further
time investment from the school (indeed, teachers were blind
to the purpose of the study). Further, these effects emerged
from a study conducted at scale.
Growth mindset effects were larger in contexts that were
specified in preregistered hypotheses. The effect size was .18
(for overall grades) and .25 (for math and science grades) in
schools that were not already high-achieving and provided a
supportive peer climate (Yeager et al., 2019, Figure 3). Of
course, we are aware that, in the past, these kinds of moderation
results might have emerged from post hoc exploratory analyses
and would therefore be hard to believe, making them tempting
to discount. But these patterns
emerged from a disciplined preanalysis plan that was carried
out by independent Bayesian statisticians who analyzed
blinded data using machine-learning methods (Yeager et al.,
2019), and the moderators were confirmed by an independent
research firm’s analyses, over which we had no influence (Zhu
et al., 2019).
Effects of this magnitude can have important consequences
for students’ educational trajectories. In the NSLM, the overall
average growth mindset effect on lower-achievers' rates of
poor course performance in 9th grade (that is, earning D or F
grades) was a reduction of 5.3 percentage points (Yeager
et al., 2019). Because there are over 3 million students per year
in 9th grade (and lower-achievers are defined as the bottom
half of the sample), this means that a scalable growth mindset
intervention could prevent 90,000 at-risk students per year
from failing to make adequate progress in the crucial first year
of high school. In summary, the effect sizes are meaningful—
impressive relative to current benchmarks and in terms of
potential societal impact—and the moderators meet a high bar
for scientific credibility.
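The back-of-envelope calculation above can be sketched as follows (the cohort size is our illustrative assumption, since the text says only "over 3 million" 9th graders per year; the 5.3-point reduction is from the NSLM):

```python
# Back-of-envelope sketch of the impact calculation in the text.
ninth_graders = 3_400_000            # assumed annual U.S. 9th-grade cohort
lower_achievers = ninth_graders / 2  # "lower-achievers" = bottom half
reduction = 0.053                    # 5.3-point drop in D/F rates (NSLM)

students_helped = lower_achievers * reduction
print(f"{students_helped:,.0f} students per year")  # roughly the 90,000 cited
```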
Controversy 4: Teacher Mindset Interventions
The final controversy concerns mindset interventions
aimed at or administered by teachers. So far, teacher-
focused growth mindset interventions have not worked (Fo-
liano et al., 2019; Rienzo et al., 2015), even though they
were developed with great care and were labor-intensive. A
likely reason is that the evidence base for teacher-focused
interventions is just beginning to emerge. Among other
things, the field will need to learn (a) precisely how to
address teachers’ mindsets about themselves and their stu-
dents, (b) which teacher practices feed into and maintain
students’ fixed and growth mindsets, (c) how to guide and
alter the teachers’ practices, and (d) how to do so in a way
that affects students’ perceptions and behaviors and that
enhances students’ outcomes. Moreover, changing teacher
behavior through professional development is known to be
exceptionally challenging (TNTP, 2015).
For this reason, it might be preferable to start by admin-
istering a direct-to-student program to teach students a
growth mindset, such as the (free-to-educators) program we
have developed (available at www.perts.net). Then the fo-
cus can be on helping teachers to support its effects. As the
field begins to tackle this challenge, it will not have to start
from scratch but can build on recent studies documenting
the role of teachers’ mindsets and mindset-related practices
in student achievement (e.g., Canning et al., 2019).
Summary of the Controversies
Three of the questions we have addressed so far—Does
growth mindset predict outcomes? Do growth mindset in-
tervention effects replicate? Are the effect sizes meaning-
ful?—have strong evidence in the affirmative. In each case,
critiques have spurred us to learn, for instance, about expected
effect sizes in educational field experiments or about designing
standardized measures and interventions. There is also evidence that speaks
to the meaningful heterogeneity of the effects. As we have
discussed, there are studies, or sites within studies, that do
not show predicted mindset effects, but the more we are
learning about the students and contexts at those sites the
more we can improve mindset measures and intervention
programs. The fourth controversy, about educational prac-
titioners, highlights one important limitation of the work to
date and points to future directions for research.
Conclusion
This review of the evidence showed that the foundations
of mindset theory are sound, the effects are replicable, and
the effect sizes are promising. Although we have learned
much from the research-related controversies, we might ask
at a more general level: Why should the idea that students
can develop their abilities be controversial? And why
should it be controversial that believing this can inspire
students, in supportive contexts, to learn more? In fact, do
not all children deserve to be in schools where people
believe in and are dedicated to the growth of their intellec-
tual abilities? The challenge of creating these supportive
contexts for all learners will be great, and we hope mindset
research will play a meaningful role in their creation.
13
GROWTH MINDSET CONTROVERSIES
References
Aronson, J. M., Fried, C. B., & Good, C. (2002). Reducing the effects of
stereotype threat on African American college students by shaping
theories of intelligence. Journal of Experimental Social Psychology,
38(2), 113–125. https://doi.org/10.1006/jesp.2001.1491
Bahník, Š., & Vranka, M. A. (2017). Growth mindset is not associated with
scholastic aptitude in a large sample of university applicants. Personality
and Individual Differences,117, 139–143. https://doi.org/10.1016/j.paid
.2017.05.046
Benartzi, S., Beshears, J., Milkman, K. L., Sunstein, C. R., Thaler, R. H.,
Shankar, M., Tucker-Ray, W., Congdon, W. J., & Galing, S. (2017).
Should governments invest more in nudging? Psychological Science,
28(8), 1041–1055. https://doi.org/10.1177/0956797617702501
Blackwell, L. S., Trzesniewski, K. H., & Dweck, C. S. (2007). Implicit
theories of intelligence predict achievement across an adolescent tran-
sition: A longitudinal study and an intervention. Child Development,
78(1), 246–263. https://doi.org/10.1111/j.1467-8624.2007.00995.x
Boulay, B., Goodson, B., Olsen, R., McCormick, R., Darrow, C., Frye, M.,
Gan, K., Harvill, E., & Sarna, M. (2018). The investing in innovation
fund: Summary of 67 evaluations (NCEE 2018 4013). National Center
for Education Evaluation and Regional Assistance, Institute of Educa-
tion Sciences, U.S. Department of Education.
Broda, M., Yun, J., Schneider, B., Yeager, D. S., Walton, G. M., & Diemer,
M. (2018). Reducing inequality in academic success for incoming col-
lege students: A randomized trial of growth mindset and belonging
interventions. Journal of Research on Educational Effectiveness,11(3),
317–338. https://doi.org/10.1080/19345747.2018.1429037
Bryan, C. J., Tipton, E., & Yeager, D. S. (2020). To change the world,
behavioral intervention research will need to get serious about hetero-
geneity [Unpublished manuscript]. University of Texas at Austin.
Burgoyne, A. P., Hambrick, D. Z., & Macnamara, B. N. (2020). How firm
are the foundations of mind-set theory? The claims appear stronger than
the evidence. Psychological Science,31(3), 258–267. https://doi.org/10
.1177/0956797619897588
Burnette, J. L., Knouse, L. E., Vavra, D. T., O’Boyle, E., & Brooks, M. A.
(2020). Growth mindsets and psychological distress: A meta-analysis.
Clinical Psychology Review,77, 101816. https://doi.org/10.1016/j.cpr
.2020.101816
Burnette, J. L., O’Boyle, E. H., VanEpps, E. M., Pollack, J. M., & Finkel,
E. J. (2013). Mind-sets matter: A meta-analytic review of implicit
theories and self-regulation. Psychological Bulletin,139(3), 655–701.
https://doi.org/10.1037/a0029531
Canning, E. A., Muenks, K., Green, D. J., & Murphy, M. C. (2019). STEM
faculty who believe ability is fixed have larger racial achievement gaps
and inspire less student motivation in their classes. Science Advances,
5(2), eaau4734. https://doi.org/10.1126/sciadv.aau4734
Chetty, R., Friedman, J. N., & Rockoff, J. E. (2014). Measuring the impacts
of teachers I: Evaluating bias in teacher value-added estimates. The
American Economic Review,104(9), 2593–2632. https://doi.org/10
.1257/aer.104.9.2593
Cheung, A. C. K., & Slavin, R. E. (2016). How methodological features
affect effect sizes in education. Educational Researcher,45(5), 283–
292. https://doi.org/10.3102/0013189X16656615
Claro, S., & Loeb, S. (2019). Students with growth mindset learn more in
school: Evidence from California’s CORE school districts. PACE. Re-
trieved from https://www.edpolicyinca.org/publications/self-management-
skills-and-student-achievement-gains-evidence-california-core-districts
Claro, S., Paunesku, D., & Dweck, C. S. (2016). Growth mindset tempers
the effects of poverty on academic achievement. Proceedings of the
National Academy of Sciences of the United States of America,113(31),
8664–8668. https://doi.org/10.1073/pnas.1608207113
Cury, F., Elliot, A. J., Da Fonseca, D., & Moller, A. C. (2006). The
social-cognitive model of achievement motivation and the 2 × 2
achievement goal framework. Journal of Personality and Social Psy-
chology, 90(4), 666–679. https://doi.org/10.1037/0022-3514.90.4.666
Diener, C. I., & Dweck, C. S. (1978). An analysis of learned helplessness:
Continuous changes in performance, strategy, and achievement cogni-
tions following failure. Journal of Personality and Social Psychology,
36(5), 451. https://doi.org/10.1037/0022-3514.36.5.451
Dweck, C. S. (1999). Self-theories: Their role in motivation, personality,
and development. Taylor and Francis/Psychology Press.
Dweck, C. S., & Leggett, E. L. (1988). A social-cognitive approach to
motivation and personality. Psychological Review,95(2), 256–273.
https://doi.org/10.1037/0033-295X.95.2.256
Dweck, C. S., & Reppucci, N. D. (1973). Learned helplessness and
reinforcement responsibility in children. Journal of Personality and
Social Psychology,25(1), 109–116. https://doi.org/10.1037/h0034248
Dweck, C. S., & Yeager, D. S. (2019). Mindsets: A view from two eras.
Perspectives on Psychological Science,14, 481–496. https://doi.org/10
.1177/1745691618804166
Elliott, E. S., & Dweck, C. S. (1988). Goals: An approach to motivation
and achievement. Journal of Personality and Social Psychology,54(1),
5–12. https://doi.org/10.1037/0022-3514.54.1.5
Foliano, F., Rolfe, H., Buzzeo, J., Runge, J., & Wilkinson, D. (2019).
Changing mindsets: Effectiveness trial. National Institute of Economic
and Social Research.
Ganimian, A. J. (2020). Growth-mindset interventions at scale: Experi-
mental evidence from Argentina. Educational Evaluation and Policy
Analysis,42(3), 417–438. https://doi.org/10.3102/0162373720938041
Gelman, A. (2018, September 13). Discussion of effects of growth mindset:
Let’s not demand unrealistic effect sizes. Retrieved from https://
statmodeling.stat.columbia.edu/2018/09/13/discussion-effects-growth-
mindset-lets-not-demand-unrealistic-effect-sizes/
Gelman, A., Lee, D., & Guo, J. (2015). Stan: A probabilistic programming
language for Bayesian inference and optimization. Journal of Educa-
tional and Behavioral Statistics,40(5), 530–543.
Gopalan, M., & Yeager, D. S. (2020). How does adopting a growth
mindset improve academic performance? Probing the underlying
mechanisms in a nationally representative sample [OSF Preprints].
https://doi.org/10.31219/osf.io/h2bzs
Grant, H., & Dweck, C. S. (2003). Clarifying achievement goals and their
impact. Journal of Personality and Social Psychology, 85(3), 541–553.
https://doi.org/10.1037/0022-3514.85.3.541
Hattie, J., Biggs, J., & Purdie, N. (1996). Effects of learning skills inter-
ventions on student learning: A meta-analysis. Review of Educational
Research,66(2), 99–136. https://doi.org/10.3102/00346543066002099
Hong, Y., Chiu, C., Dweck, C. S., Lin, D. M.-S., & Wan, W. (1999).
Implicit theories, attributions, and coping: A meaning system approach.
Journal of Personality and Social Psychology,77(3), 588–599. https://
doi.org/10.1037/0022-3514.77.3.588
Hulleman, C. S., Schrager, S. M., Bodmann, S. M., & Harackiewicz, J. M.
(2010). A meta-analytic review of achievement goal measures: Different
labels for the same constructs or different constructs with similar labels?
Psychological Bulletin,136(3), 422–449. https://doi.org/10.1037/
a0018947
Kanopka, K., Claro, S., Loeb, S., West, M., & Fricke, H. (2020). Changes
in social-emotional learning: Examining student development over time
[Working paper]. Policy Analysis for California Education, Stanford
University. Retrieved from https://www.edpolicyinca.org/sites/default/
files/2020-07/wp_kanopka_july2020.pdf
Kenny, D. A., & Judd, C. M. (2019). The unappreciated heterogeneity of
effect sizes: Implications for power, precision, planning of research, and
replication. Psychological Methods,24(5), 578–589. https://doi.org/10
.1037/met0000209
Kraft, M. A. (2020). Interpreting effect sizes of education interventions.
Educational Researcher,49(4), 241–253. https://doi.org/10.3102/
0013189X20912798
Krueger, A. B. (2003). Economic considerations and class size. Economic
Journal,113(485), F34–F63. https://doi.org/10.1111/1468-0297.00098
Leslie, S.-J., Cimpian, A., Meyer, M., & Freeland, E. (2015). Expectations
of brilliance underlie gender distributions across academic disciplines.
Science,347(6219), 262–265. https://doi.org/10.1126/science.1261375
Li, Y., & Bates, T. C. (2019). You can’t change your basic ability, but you
work at things, and that’s how we get hard things done: Testing the role
of growth mindset on response to setbacks, educational attainment, and
cognitive ability. Journal of Experimental Psychology: General,148(9),
1640–1655. https://doi.org/10.1037/xge0000669
Lipsey, M. W., Puzio, K., Yun, C., Hebert, M. A., Steinka-Fry, K., Cole,
M. W., Roberts, M., Anthony, K. S., & Busick, M. D. (2012). Trans-
lating the statistical representation of the effects of education interven-
tions into more readily interpretable forms (NCSER 2013–3000). Na-
tional Center for Special Education Research. Retrieved from https://ies
.ed.gov/ncser/pubs/20133000/
Macnamara, B. (2018). Schools are buying “growth mindset” interventions
despite scant evidence that they work well. The Conversation. Retrieved
from http://theconversation.com/schools-are-buying-growth-mindset-
interventions-despite-scant-evidence-that-they-work-well-96001
McShane, B. B., Tackett, J. L., Böckenholt, U., & Gelman, A. (2019).
Large-scale replication projects in contemporary psychological research.
The American Statistician,73(Suppl. 1), 99–105. https://doi.org/10
.1080/00031305.2018.1505655
Miele, D. B., & Molden, D. C. (2010). Naive theories of intelligence and
the role of processing fluency in perceived comprehension. Journal of
Experimental Psychology: General,139(3), 535–557. https://doi.org/10
.1037/a0019745
Molden, D. C., & Dweck, C. S. (2006). Finding “meaning” in psychology:
A lay theories approach to self-regulation, social perception, and social
development. American Psychologist,61(3), 192–203. https://doi.org/10
.1037/0003-066X.61.3.192
Mueller, C. M., & Dweck, C. S. (1998). Praise for intelligence can
undermine children’s motivation and performance. Journal of Person-
ality and Social Psychology,75(1), 33–52. https://doi.org/10.1037/0022-
3514.75.1.33
Muenks, K., Canning, E. A., LaCosse, J., Green, D. J., Zirkel, S., Garcia,
J. A., & Murphy, M. C. (2020). Does my professor think my ability can
change? Students’ perceptions of their STEM professors’ mindset be-
liefs predict their psychological vulnerability, engagement, and perfor-
mance in class. Journal of Experimental Psychology: General,149,
2119–2144. https://doi.org/10.1037/xge0000763
Nelson, L. D., Simmons, J., & Simonsohn, U. (2018). Psychology’s re-
naissance. Annual Review of Psychology,69(1), 511–534. https://doi
.org/10.1146/annurev-psych-122216-011836
Nussbaum, A. D., & Dweck, C. S. (2008). Defensiveness versus remedia-
tion: Self-theories and modes of self-esteem maintenance. Personality
and Social Psychology Bulletin, 34(5), 599–612. https://doi.org/10
.1177/0146167207312960
Nye, B., Hedges, L. V., & Konstantopoulos, S. (2000). The effects of small
classes on academic achievement: The results of the Tennessee class size
experiment. American Educational Research Journal,37(1), 123–151.
https://doi.org/10.3102/00028312037001123
OECD. (2019). PISA 2018 results (Volume III): What school life means for
students’ lives. PISA, OECD Publishing. https://doi.org/10.1787/
acd78851-en
Pashler, H., & de Ruiter, J. P. (2017). Taking responsibility for our field’s
reputation. APS Observer. Retrieved from https://www.psychological
science.org/observer/taking-responsibility-for-our-fields-reputation
Paunesku, D., Walton, G. M., Romero, C., Smith, E. N., Yeager, D. S., &
Dweck, C. S. (2015). Mind-set interventions are a scalable treatment for
academic underachievement. Psychological Science,26(6), 784–793.
https://doi.org/10.1177/0956797615571017
Qin, X., Wormington, S., Guzman-Alvarez, A., & Wang, M.-T. (2020).
Why does a growth mindset intervention impact achievement differently
across secondary schools? Unpacking the causal mediation mechanism
from a national multisite randomized experiment [OSF Preprints].
https://doi.org/10.31219/osf.io/7c94h
Rege, M., Hanselman, P., Ingeborg, F. S., Dweck, C. S., Ludvigsen, S.,
Bettinger, E., Muller, C., Walton, G. M., Duckworth, A. L., & Yeager,
D. S. (in press). How can we inspire nations of learners? Investigating
growth mindset and challenge-seeking in two countries. American Psy-
chologist.
Rienzo, C., Rolfe, H., & Wilkinson, D. (2015). Changing mindsets: Eval-
uation report and executive summary. Education Endowment Founda-
tion.
Robins, R. W., & Pals, J. L. (2002). Implicit self-theories in the academic
domain: Implications for goal orientation, attributions, affect, and self-
esteem change. Self and Identity,1(4), 313–336. https://doi.org/10.1080/
15298860290106805
Schleider, J. L., Abel, M. R., & Weisz, J. R. (2015). Implicit theories and
youth mental health problems: A random-effects meta-analysis. Clinical
Psychology Review,35, 1–9. https://doi.org/10.1016/j.cpr.2014.11.001
Schroder, H. S., Dawood, S., Yalch, M. M., Donnellan, M. B., & Moser,
J. S. (2016). Evaluating the domain specificity of mental health–related
mind-sets. Social Psychological and Personality Science, 7(6), 508–520.
https://doi.org/10.1177/1948550616644657
Seo, E., Lee, H. Y., Jamieson, J. P., Reis, H. T., Beevers, C. G., & Yeager,
D. S. (in press). Trait attributions and threat appraisals explain the rela-
tion between implicit theories of personality and internalizing symptoms
during adolescence. Development and Psychopathology.
Shpancer, N. (2020, July 29). Life cannot be hacked. Psychology Today.
Retrieved from https://www.psychologytoday.com/blog/insight-therapy/
202007/life-cannot-be-hacked
Simonsohn, U. (2015). Small telescopes: Detectability and the evaluation
of replication results. Psychological Science,26(5), 559–569. https://doi
.org/10.1177/0956797614567341
Sisk, V. F., Burgoyne, A. P., Sun, J., Butler, J. L., & Macnamara, B. N.
(2018). To what extent and under which circumstances are growth
mind-sets important to academic achievement? Two meta-analyses. Psy-
chological Science,29(4), 549–571. https://doi.org/10.1177/
0956797617739704
Slavin, R. E. (1986). Best-evidence synthesis: An alternative to meta-
analytic and traditional reviews. Educational Researcher,15(9), 5–11.
https://doi.org/10.3102/0013189X015009005
The World Bank. (2017, April 25). If you think you can get smarter, you
will. Retrieved from http://www.worldbank.org/en/results/2017/04/25/
peru-if-you-think-you-can-get-smarter-you-will
Tipton, E., Yeager, D. S., Iachan, R., & Schneider, B. (2019). Designing
probability samples to study treatment effect heterogeneity. In P. J.
Lavrakas (Ed.), Experimental methods in survey research: Techniques
that combine random sampling with random assignment (pp. 435–456).
Wiley. https://doi.org/10.1002/9781119083771.ch22
TNTP. (2015). The mirage: Confronting the hard truth about our quest for
teacher development. Retrieved from http://tntp.org/assets/documents/
TNTP-Mirage_2015.pdf
Vivalt, E. (2020). How much can we generalize from impact evaluations?
Journal of the European Economic Association. Advance online publi-
cation. https://doi.org/10.1093/jeea/jvaa019
Walton, G. M., & Yeager, D. S. (2020). Seed and soil: Psychological
affordances in contexts help to explain where wise interventions succeed
or fail. Current Directions in Psychological Science,29, 219–226.
https://doi.org/10.1177/0963721420904453
Warne, R. (2020, January 3). The one variable that makes growth mindset
interventions work. Retrieved from https://russellwarne.com/2020/01/
03/the-one-variable-that-makes-growth-mindset-interventions-work/
Weiner, B. (1985). An attributional theory of emotion and motivation.
Psychological Review,92(4), 548–573. https://doi.org/10.1037/0033-
295X.92.4.548
Weiner, B., & Kukla, A. (1970). An attributional analysis of achievement
motivation. Journal of Personality and Social Psychology,15(1), 1–20.
https://doi.org/10.1037/h0029211
West, M. R., Buckley, K., Krachman, S. B., & Bookman, N. (2018).
Development and implementation of student social-emotional surveys in
the CORE Districts. Journal of Applied Developmental Psychology,55,
119–129. https://doi.org/10.1016/j.appdev.2017.06.001
Yeager, D. S. (2019). The National Study of Learning Mindsets [United
States], 2015–2016. Inter-university Consortium for Political and
Social Research. https://doi.org/10.3886/ICPSR37353.v1
Yeager, D. S., Carroll, J. M., Buontempo, J., Cimpian, A., Woody, S.,
Crosnoe, R., Muller, C., Murray, J., Mhatre, P., Kersting, N., Hulleman,
C., Kudym, M., Murphy, M., Duckworth, A. L., Walton, G., & Dweck,
C. S. (2020). Teacher mindsets help explain where a growth mindset
intervention does and doesn’t work [Manuscript submitted for publica-
tion].
Yeager, D. S., Dahl, R. E., & Dweck, C. S. (2018). Why interventions to
influence adolescent behavior often fail but could succeed. Perspectives
on Psychological Science,13(1), 101–122. https://doi.org/10.1177/
1745691617722620
Yeager, D. S., & Dweck, C. S. (2012). Mindsets that promote resilience:
When students believe that personal characteristics can be developed.
Educational Psychologist,47(4), 302–314. https://doi.org/10.1080/
00461520.2012.722805
Yeager, D. S., Fong, C. J., Lee, H. Y., & Espelage, D. L. (2015). Declines
in efficacy of anti-bullying programs among older adolescents: Theory
and a three-level meta-analysis. Journal of Applied Developmental Psy-
chology,37, 36–51. https://doi.org/10.1016/j.appdev.2014.11.005
Yeager, D. S., Hanselman, P., Muller, C. L., & Crosnoe, R. (2019).
Mindset × Context theory: How agency and structure interact to shape
human development and social inequality [Working Paper]. University
of Texas at Austin.
Yeager, D. S., Hanselman, P., Walton, G. M., Murray, J. S., Crosnoe, R.,
Muller, C., Tipton, E., Schneider, B., Hulleman, C. S., Hinojosa, C. P.,
Paunesku, D., Romero, C., Flint, K., Roberts, A., Trott, J., Iachan, R.,
Buontempo, J., Yang, S. M., Carvalho, C. M., . . . Dweck, C. S. (2019).
A national experiment reveals where a growth mindset improves
achievement. Nature,573(7774), 364–369. https://doi.org/10.1038/
s41586-019-1466-y
Yeager, D. S., Romero, C., Paunesku, D., Hulleman, C. S., Schneider, B.,
Hinojosa, C., Lee, H. Y., O’Brien, J., Flint, K., Roberts, A., Trott, J.,
Greene, D., Walton, G. M., & Dweck, C. S. (2016). Using design
thinking to improve psychological interventions: The case of the growth
mindset during the transition to high school. Journal of Educational
Psychology,108(3), 374–391. https://doi.org/10.1037/edu0000098
Yeager, D. S., Trzesniewski, K. H., & Dweck, C. S. (2013). An implicit
theories of personality intervention reduces adolescent aggression in
response to victimization and exclusion. Child Development,84(3),
970–988. https://doi.org/10.1111/cdev.12003
Yeager, D. S., Walton, G. M., Brady, S. T., Akcinar, E. N., Paunesku, D.,
Keane, L., Kamentz, D., Ritter, G., Duckworth, A. L., Urstein, R., Gomez,
E. M., Markus, H. R., Cohen, G. L., & Dweck, C. S. (2016). Teaching a lay
theory before college narrows achievement gaps at scale. Proceed-
ings of the National Academy of Sciences of the United States of
America,113(24), E3341–E3348. https://doi.org/10.1073/pnas
.1524360113
Zhu, P., Garcia, I., & Alonzo, E. (2019). An independent evaluation of
growth mindset intervention. MDRC. Retrieved from https://files.eric
.ed.gov/fulltext/ED594493.pdf
Supplementary resource (1)

... Growth mindset is the belief that individuals have that their intelligence can develop through effort and is not fixed due to genetic factors . The existence of this belief has been proven by many researchers to make students more motivated and ultimately make them strive to achieve maximum learning outcomes (Barbouta et al., 2020;Cheng et al., 2021;Hacisalihoglu et al., 2020;Yeager & Dweck, 2020). In contrast to a growth mindset, a fixed mindset believes that an individual's abilities and intelligence remain unchanged and cannot be modified or developed over time. ...
... Since Carol S. Dweck first introduced the theory of growth mindset, numerous interventions in the realm of education have been implemented (Yeager & Dweck, 2020). Carol Dweck's study emphasizes the importance of a growth mindset for how learners learn and academic achievement (Prodyanatasari et al., 2023). ...
... Yeager is the researcher with the largest number of articles explaining growth mindset online intervention in the world of education. On the one hand, Carol S. Dweck, who is also the main researcher who coined the concept and theory of growth mindset, is the second author with the largest number of articles explaining growth mindset online intervention (Yeager & Dweck, 2020). Secondly, there is also research and publications carried out simultaneously in explaining online intervention-based growth mindset in the realm of education. ...
Article
The use of online media in the realm of education is a trend nowadays. Not only is it used to support the learning process, it also supports students' psychological conditions. One of them is creating a growth mindset. The aim of the research is to describe the development of online interventions for growth mindsets used in the educational domain. Bibliometric analysis was used in this study. Data pertaining to articles published on Google Scholar relevant to this research topic were gathered utilizing the Publish or Perish application. Furthermore, the data was analysed using the VOSviewer application. The results of the analysis revealed that the use of online-based interventions was developed and used to improve students' growth mindset in the current educational domain. Various online-based intervention designs have been developed and scientifically researched for their impact on growth mindset in the education domain. This research can serve as a basis for creating growth mindset-based media interventions for students in the context of education.
... Students can vary in their implicit theories, from more of a fixed theory of intelligence (fixed mindset), believing that their intelligence and abilities are unchangeable, to a more malleable theory of intelligence (growth mindset), believing that their intelligence and abilities can be developed over time. However, it is worth noting that growth and fixed mindsets are not separate concepts but rather opposite poles of the same concept of mindset regarding whether a certain attribute or ability is believed to be malleable (Yeager & Dweck, 2020). ...
... Research has consistently demonstrated that children who receive praise for effort are more inclined to adopt the belief that intelligence is malleable, thereby fostering a growth mindset (Yeager & Dweck, 2020). In contrast, children who are praised for their intelligence are more likely to develop a fixed mindset, believing their abilities are innate and unchangeable. ...
... These findings were further supported by Yeager and Dweck (2020) in a reanalysis of replication studies involving a sample of Chinese primary school children, reinforcing the idea that effort-based praise plays a significant role in promoting growth mindset beliefs. However, a study conducted with 108 Dutch students aged 17 (Glerum et al., 2020b) found that even when the procedure from Mueller and Dweck's original experiments was replicated, the students were not significantly influenced by the type of praise they received. ...
Article
Full-text available
Promoting a growth mindset may positively influence learner motivation and enhance learning outcomes among primary school children. Previous studies have predominantly focused on secondary and undergraduate students, investigating the efficacy of a reading and writing intervention related to intelligence. Introducing effort-based praise during the learning process also aligns with a growth mindset and may further facilitate the development of a growth mindset in primary education. To evaluate the effects of these intervention approaches, we conducted a two-by-two between-subjects experimental study with a sample of 161 Dutch primary school children aged 10 to 12 years. This study aimed to assess the effects of two interventions both independently (main effects) and in combination (interaction effect): a growth mindset reading and writing assignment (factor 1) and effort-based praise (factor 2) on mindset beliefs and learning performance, specifically in terms of retention and transfer within the probability calculation domain. While a positive effect on growth mindset beliefs was observed, neither the individual interventions nor their combination significantly influenced learning performance.
... Smith (1999) Delivered through structured mechanisms such as supervision, coaching, or other reflective frameworks, feedback provides a means to uncover ethical blind spots, refine professional practices, and strengthen relational competence through thoughtful and constructive critique. Yeager and Dweck's (2020) work on the growth mindset underscores its role in fostering receptivity to feedback, resilience, and adaptive learning, positioning it as integral to ongoing ethical and professional development. ...
... Coach educators play a pivotal role in fostering these capacities by creating environments that emphasise growth mindset, autonomy, and intentional learning (Yeager & Dweck, 2020). These environments encourage students to reflect on their practices, identify actionable insights, and test these insights through feedback loops and further reflection (Schön, 1983;Smith, 1994;Rajasinghe et al., 2022). ...
Article
Full-text available
This paper examines the primary theme of intentionality in the professional development of experienced coaches, building upon the research presented in ‘On Becoming a Coach’ (Rajasinghe, Garvey, Smith, et al., 2022). The original study explored how coaches navigated their developmental journeys as they evolved into seasoned practitioners. Key themes identified in that research included: Vehicles of Development, Narratives of Awareness, Narratives of Letting Go, Ethical Practice, and Narratives of Becoming a Coach. This paper extends those findings by focusing on intentional learning, a deliberate, conscious effort to align professional growth with intentionality and ethicality, fostering personal and professional wellbeing in the process. Drawing from my previous work on identity development (Smith, 2024), which examined the shifting nature of self-concept in becoming a coach, this paper highlights how intentional learning intersects with identity shifts and the evolution of ethicality. The study employs a qualitative approach, conducting in-depth interviews with experienced coaches to explore how intentionality shapes professional development. Engines and Anchors of Intentional Learning, Novel Learning Frontiers, and Thriving and Mastery through Ethicality. This research contributes to the emerging conversation on ethicality in coaching, a concept pioneered by this work. It demonstrates how intentional learning fosters professional growth and ethical maturity. It presents a practice model rooted in ethical responsibility and lifelong learning. The paper discusses practical implications for coach educators, emphasising the importance of creating environments that nurture intentionality to enhance coaching effectiveness and ethical practice.
... Relational support was a strong resilience-promoting factor, which is consistent with Henshall et al.'s [11] findings and is a robust factor well-recognised in the wider resilience literature [20]. Personal resources used by students included cognitive flexibility, problem-solving, and having a growth mindset, where they displayed a belief in their ability to continue learning and growing [47]. These are important resilience-promoting factors which can be learned and developed through wellbeing and resilience education. ...
Article
Full-text available
Background The COVID-19 pandemic and rapid shift to online learning have had ongoing impacts on nursing students’ wellbeing and resilience. We are yet to fully understand the implications for this emerging workforce in the post-pandemic era. The aims of this mixed methods study were to investigate wellbeing, coping and resilience of undergraduate nursing students in the pandemic; explore relationships between these variables and investigate predictors of wellbeing and coping, including differences between domestic and international students; explore how students experienced and managed adversity; understand how their mental distress and wellbeing were influenced by resilience resources used to deal with adversity; and identify implications for nurse wellbeing as they enter the workforce in the post-pandemic era. Methods A convergent mixed methods design was used. An online survey investigating wellbeing (COMPAS-W), psychological distress (DASS-21), coping (Brief COPE) and resilience (ARM-R) was completed by n = 175 undergraduate nursing students. Semi-structured interviews with n = 18 students explored how they navigated challenges. Descriptive, correlational, and regression analyses, and thematic analysis, were conducted. Mixed methods analysis was used to integrate both sets of findings. Results Students reported high levels of mental distress, yet also moderate levels of wellbeing and resilience. Key findings included domestic students reporting significantly greater stress than international students, and wellbeing being predicted by lower mental distress and increased problem-focused coping. Students coped with challenges by being proactive, drawing on a range of coping strategies, and seeking technical and emotional support. From a social-ecological resilience perspective, access to and engagement with a range of personal, environmental and relational resources served as protective factors for their wellbeing.
Conclusions This study provides valuable new insights into protective factors for nurses during a period of extraordinary challenge. In the post-pandemic era, there is a need to strengthen the wellbeing and retention of new graduates now entering the workforce from university. Implementation of targeted strategies to strengthen graduates’ peer relationships and sense of belonging at work, and wellbeing and resilience education, are needed. Longitudinal follow-up of graduates’ wellbeing is recommended.
Article
Growth mindset and academic grit are crucial for adolescents’ academic achievement. However, previous research that explored their relationship through variable-centered approaches cannot adequately address group heterogeneity. This study addressed the issue by conducting latent profile analysis and social network analysis with a sample of 1171 Chinese adolescents, examining the different subgroups of adolescents in terms of growth mindset and academic grit, and the distinct core characteristics of these subgroups. Latent profile analysis revealed four heterogeneous subgroups: (1) Fixed Vulnerability (C1, 15.0%): believing intelligence is difficult to change and lacking grit; (2) Composite (C2, 17.3%): believing intelligence is difficult to change but having strong grit; (3) Growth Grit (C3, 22.1%): believing intelligence can be improved through effort and having robust grit; and (4) Intermediate (C4, 40.9%): believing intelligence can be improved through effort and having moderate grit. Regression mixture modeling showed that older adolescents were more likely to fall into the Fixed Vulnerability subgroup. Social network analysis showed that subgroups C1 and C4 emphasized growth mindset, while C2 and C3 highlighted academic grit. The study indicates heterogeneity in growth mindset and academic grit among Chinese adolescents, suggesting the need for educational practitioners to apply more targeted intervention measures for various subgroups.
Article
Full-text available
Failure can be an effective tool for learning, but it comes with negative consequences. Educators and learners should practice strategies that leverage the benefits of failure while managing its negative consequences on learners’ motivation and persistence. Towards that goal, this paper examines the biological effects of failure on learning to (1) explain how failure primes the brain for learning and (2) propose behavioral strategies for coping with the negative consequences, focusing on postsecondary students. This conceptual literature review article draws upon neuroscience literature to explain biological mechanisms related to failure and education literature to explore connections to learning theory and environments. The paper is organized into two major sections: (1) the benefits of failure and (2) tools to deal with its negative effects. Within each section, the paper describes related neurochemicals and behavioral strategies to affect them that could be explored in educational settings. By understanding these biological effects, we can better design learning environments and support students through failure. Each section of the paper also describes non-invasive research tools that could be used to study the effects of interventions that aim to improve students’ experience of failure in education.
Chapter
This chapter examines the transformation of a Northeastern Thai flood-affected hamlet, Ban Tha Khaek in Nakhon Phanom, through a growth mindset—the belief that adversity such as chronic flooding can enable learning and adaptation. Villagers used to flee floods, relying on outside aid and lacking infrastructure. Community leaders, local administrations, and educational initiatives helped the community move from reactive to proactive flood risk management. Flood forecasting, flood-resistant agriculture, and collaborative water storage are among its non-structural flood adaptations. These methods have helped the community manage floods sustainably, reducing infrastructure costs and promoting ecological and social cohesion. Integrating traditional knowledge and local participation has built resilience. This case study shows how a growth mindset helped individuals perceive floods as opportunities for improvement rather than inevitable disasters. It promotes community engagement, education, and anticipatory planning to reduce the impacts of climate-induced floods.
Article
Full-text available
The growth mindset or the belief that intelligence is malleable has garnered significant attention for its positive association with academic success. Several recent randomized trials, including the National Study of Learning Mindsets (NSLM), have been conducted to understand why, for whom, and under what contexts a growth mindset intervention can promote beneficial achievement outcomes during critical educational transitions. Prior research suggests that the NSLM intervention was particularly effective in improving low-achieving 9th graders’ GPA, while the impact varied across schools. In this study, we investigated the underlying causal mediation mechanism that might explain this impact and how the mechanism varied across different types of schools. By extending a recently developed weighting method for multisite causal mediation analysis, the analysis enhances the external and internal validity of the results. We found that challenge-seeking behavior played a significant mediating role, only in medium-achieving schools, which may partly explain the reason why the intervention worked differently across schools. We conclude by discussing implications for designing interventions that not only promote students’ growth mindsets but also foster supportive learning environments under different school contexts.
Article
Full-text available
A growth-mindset intervention teaches the belief that intellectual abilities can be developed. Where does the intervention work best? Prior research examined school-level moderators using data from the National Study of Learning Mindsets (NSLM), which delivered a short growth-mindset intervention during the first year of high school. In the present research, we used data from the NSLM to examine moderation by teachers’ mindsets and answer a new question: Can students independently implement their growth mindsets in virtually any classroom culture, or must students’ growth mindsets be supported by their teacher’s own growth mindsets (i.e., the mindset-plus-supportive-context hypothesis)? The present analysis (9,167 student records matched with 223 math teachers) supported the latter hypothesis. This result stood up to potentially confounding teacher factors and to a conservative Bayesian analysis. Thus, sustaining growth-mindset effects may require contextual supports that allow the proffered beliefs to take root and flourish.
Article
Full-text available
Adolescents who hold an entity theory of personality—the belief that people cannot change—are more likely to report internalizing symptoms during the socially stressful transition to high school. It has been puzzling, however, why a cognitive belief about the potential for change predicts symptoms of an affective disorder. The present research integrated three models—implicit theories, hopelessness theories of depression, and the biopsychosocial model of challenge and threat—to shed light on this issue. Study 1 replicated the link between an entity theory and internalizing symptoms by synthesizing multiple datasets (N = 6,910). Study 2 examined potential mechanisms underlying this link using 8-month longitudinal data and 10-day diary reports during the stressful first year of high school (N = 533, 4,255 daily reports). The results showed that an entity theory of personality predicted increases in internalizing symptoms through tendencies to make fixed trait causal attributions about the self and maladaptive (i.e., “threat”) stress appraisals. The findings support an integrative model whereby situation-general beliefs accumulate negative consequences for psychopathology via situation-specific attributions and appraisals.
Article
Full-text available
Two experiments and two field studies examine how college students’ perceptions of their science, technology, engineering, and mathematics (STEM) professors’ mindset beliefs about the fixedness or malleability of intelligence predict students’ anticipated and actual psychological experiences and performance in their STEM classes, as well as their engagement and interest in STEM more broadly. In Studies 1 (N = 252) and 2 (N = 224), faculty mindset beliefs were experimentally manipulated and students were exposed to STEM professors who endorsed either fixed or growth mindset beliefs. In Studies 3 (N = 291) and 4 (N = 902), we examined students’ perceptions of their actual STEM professors’ mindset beliefs and used experience sampling methodology (ESM) to capture their in-the-moment psychological experiences in those professors’ classes. Across all studies, we find that students who perceive that their professor endorses more fixed mindset beliefs anticipate (Studies 1 and 2) and actually experience (Studies 3 and 4) more psychological vulnerability in those professors’ classes—specifically, they report less belonging in class, greater evaluative concerns, greater imposter feelings, and greater negative affect. We also find that in-the-moment experiences of psychological vulnerability have downstream consequences. Students who perceive that their STEM professors endorse more fixed mindset beliefs experience greater psychological vulnerability in those professors’ classes, which in turn predicts greater dropout intentions, lower class attendance, less class engagement, less end-of-semester interest in STEM, and lower grades. These findings contribute to our understanding of how students’ perceptions of professors’ mindsets can serve as a situational cue that affects students’ motivation, engagement, and performance in STEM.
Technical Report
Full-text available
The Changing Mindsets project aimed to improve attainment outcomes at the end of primary school by teaching Year 6 pupils that their brain potential was not a fixed entity but could grow and change through effort exerted. The programme, delivered by Portsmouth University, taught pupils about the malleability of intelligence through workshops. Teachers attended short professional development courses on approaches to developing a ‘growth mindset’ before delivering sessions to pupils weekly, over eight consecutive weeks. Teachers were encouraged to embed aspects of the growth mindset approach throughout their teaching—for example, when giving feedback outside of the sessions. They were also given access to digital classroom resources, such as a video case study of Darwin overcoming adversity in his own life, as a practical example of the importance of having a growth mindset. The project was a randomised controlled trial (RCT) and included 101 schools and 5018 pupils across England, assigned to either intervention or control groups. The trial ran from September 2016 to February 2017. The process evaluation involved interviews with teachers, focus groups with pupils receiving the intervention, lesson observations, and surveys of both treatment schools and control groups throughout the course of the intervention.
Article
Previous research provides evidence that developing a growth mindset—believing that one’s capabilities can improve—promotes academic achievement. Although this phenomenon has been studied in a representative sample of ninth graders in the United States, it has not been studied in representative samples of other grade levels or with standardized assessments of achievement rather than more subjective grades. Using a rich longitudinal data set of more than 200,000 students in Grades 4 through 7 in California whom we followed for a year until they were in Grades 5 through 8, this article describes growth mindset gaps across student groups and confirms, at a large scale, the predictive power of growth mindset for achievement gains. We estimate that a student with a growth mindset who is in the same school and grade level and has the same background and achievement characteristics as a student with a fixed mindset learns 0.066 SD more annually in English language arts, approximately 18% of the average annual growth, or 33 days of learning if learning is assumed uniform across the 180 days of the academic year. For mathematics, the corresponding estimates are 0.039 SD, approximately 17% of average annual growth, or 31 days of learning.
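The days-of-learning conversion used in the abstract above can be sketched as a short calculation. This is a minimal illustration, not the authors' code; the function name and the 180-day school year are assumptions, and the annual-growth figure is back-derived from the reported "18% of average annual growth":

```python
def sd_effect_to_days(effect_sd: float, annual_growth_sd: float,
                      school_days: int = 180) -> float:
    """Convert an effect size (in SD units) into equivalent days of learning,
    assuming learning accrues uniformly across the school year."""
    return effect_sd / annual_growth_sd * school_days

# ELA: a 0.066 SD gain described as ~18% of average annual growth
# implies annual growth of roughly 0.066 / 0.18 ≈ 0.37 SD.
ela_days = sd_effect_to_days(0.066, 0.066 / 0.18)
print(round(ela_days))  # ~32 days, close to the reported ~33
```

Small differences from the published figure (33 vs. 32 days) come from rounding in the reported percentage.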
Article
Impact evaluations can help to inform policy decisions, but they are rooted in particular contexts and to what extent they generalize is an open question. I exploit a new data set of impact evaluation results and find a large amount of effect heterogeneity. Effect sizes vary systematically with study characteristics, with government-implemented programs having smaller effect sizes than academic or non-governmental organization-implemented programs, even controlling for sample size. I show that treatment effect heterogeneity can be appreciably reduced by taking study characteristics into account.
Preprint
Students who are taught that intellectual abilities are not fixed but can be developed—a growth mindset of intelligence—show improvements in their academic outcomes. Specifically, the mindset treatment effects are strongest for low-achieving students and vary across school contexts, as demonstrated by several lab and field-experimental studies, including the largest randomized controlled trial (RCT) evaluation in a nationally representative sample of 9th graders in the US, the National Study of Learning Mindsets (NSLM). Yet, the mechanisms through which a growth mindset brings about positive academic outcomes remain unclear. Past research posits a role for challenge-seeking, learning-oriented behaviors exhibited by students exposed to a growth mindset intervention. However, research illuminating this pathway from mindset to academic outcomes through student behaviors is sparse. Leveraging the NSLM's multisite RCT design and its rich set of mediators, especially a mediator measured using a novel behavioral task that elicits learning-oriented behavior rather than relying on self-reported mindset measures alone, this research tests the complete mediational pathway for the first time and highlights a key mechanism through which growth mindset interventions may work.
Preprint
The growth mindset or the belief that intelligence is malleable has garnered significant attention for its positive association with academic success. Several recent randomized trials, including the National Study of Learning Mindsets (NSLM), have been conducted to understand why, for whom, and under what contexts a growth mindset intervention can promote beneficial achievement outcomes during critical educational transitions. Prior research suggests that the NSLM intervention was particularly effective in improving low-achieving 9th graders’ GPA, while the impact varied across schools. In this study, we investigated the underlying causal mediation mechanism that might explain this impact and how the mechanism varied across different types of schools. By applying an advanced analytic procedure developed under a causal framework, the analysis enhances the external and internal validity of the results. We found that challenge-seeking behavior played a significant mediating role, only in medium-achieving schools, which may partly explain the reason why the intervention worked differently across schools. We conclude by discussing implications for designing interventions that not only promote students’ growth mindsets but also foster supportive learning environments under different school contexts.
Article
This is one of the first evaluations of a “growth-mindset” intervention at scale in a developing country. I randomly assigned 202 public secondary schools in Salta, Argentina, to a treatment group in which Grade 12 students were asked to read about the malleability of intelligence, write a letter to a classmate, and post their letters in their classroom, or to a control group. The intervention was implemented as intended. Yet, I find no evidence that it affected students’ propensity to find tasks less intimidating, school climate, school performance, achievement, or post-secondary plans. I rule out small effects and find little evidence of heterogeneity. This study suggests that the intervention may be more challenging to replicate and scale than anticipated.