What Can Be Learned From Growth Mindset Controversies?
David S. Yeager
and Carol S. Dweck
Department of Psychology, University of Texas at Austin
Department of Psychology, Stanford University
The growth mindset is the belief that intellectual ability can be developed. This article seeks to
answer recent questions about growth mindset, such as: Does a growth mindset predict student
outcomes? Do growth mindset interventions work, and work reliably? Are the effect sizes
meaningful enough to merit attention? And can teachers successfully instill a growth mindset in
students? After exploring the important lessons learned from these questions, the article concludes
that large-scale studies, including preregistered replications and studies conducted by third parties
(such as international governmental agencies), justify confidence in growth mindset research.
Mindset effects, however, are meaningfully heterogeneous across individuals and contexts. The
article describes three recent advances that have helped the field to learn from this heterogeneity:
standardized measures and interventions, studies designed specifically to identify where growth
mindset interventions do not work (and why), and a conceptual framework for anticipating and
interpreting moderation effects. The next generation of mindset research can build on these
advances, for example by beginning to understand and perhaps change classroom contexts in
ways that can make interventions more effective. Throughout, the authors reflect on lessons that
can enrich metascientific perspectives on replication and generalization.
Public Significance Statement
Research on growth mindset—the belief that intellectual ability can be developed—has found that
a growth mindset can lead to greater resilience and academic achievement among students facing
difficulties. The present article reviews the evidence and shows that high-quality studies, and
independent analyses, have supported the conclusion that growth mindset effects are replicable,
meaningful, and theoretically grounded, but interventions targeting teachers (rather than students)
have not yet been effective. The article concludes with a discussion of why it has been difficult to
change teachers or schools and why new research is needed on this topic.
Keywords: implicit theories, growth mindset, adolescence, educational psychology, metascience
A growth mindset is the belief that personal characteris-
tics, such as intellectual abilities, can be developed, and a
fixed mindset is the belief that these characteristics are fixed
and unchangeable (Dweck, 1999; Dweck & Leggett, 1988;
Yeager & Dweck, 2012). Research on these mindsets has
found that people who hold more of a growth mindset are
The writing of this essay was supported by the William T. Grant Foun-
dation Scholars Award, the National Institutes of Health (R01HD084772
and P2CHD042849), and the National Science Foundation (1761179). The
content is solely the responsibility of the authors and does not necessarily
represent the official views of the funding agencies.
David S. Yeager received a 2020 APA Award for Distinguished Scien-
tific Early Career Contributions to Psychology. In association with the
award, David S. Yeager was invited to submit a manuscript to American
Psychologist, which was peer reviewed. The article is published as part of
the journal’s annual Awards Issue.
David S. Yeager conceptualized the article, conducted data analysis,
produced data visualizations, and wrote the first draft of the article. Carol
S. Dweck collaborated on the conceptualization and writing of the article.
David S. Yeager served as lead for funding acquisition, project adminis-
tration, visualization, and writing – original draft and contributed equally to
investigation. Carol S. Dweck served in a supporting role for writing –
original draft. David S. Yeager and Carol S. Dweck contributed to con-
ceptualization equally. David S. Yeager and Carol S. Dweck contributed to
writing – review and editing equally.
Correspondence concerning this article should be addressed to David S.
Yeager, Department of Psychology, University of Texas at Austin, 305
East 23rd Street, Mail Stop G1800, Austin, TX 78712–1699, United States.
American Psychologist
© 2020 American Psychological Association
ISSN: 0003-066X
more likely to thrive in the face of difficulty and continue to
improve, while those who hold more of a fixed mindset may
shy away from challenges or fail to meet their potential (see
Dweck & Yeager, 2019). There has been considerable in-
terest among researchers, policymakers, and educators in
the use of growth mindset research to improve educational
outcomes, in part because research on mindsets has yielded
effective, scalable interventions. For instance, the National
Study of Learning Mindsets (NSLM; Yeager, 2019) evalu-
ated a short (1 hr), online growth mindset intervention in
a nationally representative sample of 9th graders in the
United States (N = 12,490). Compared with the control
condition, the intervention improved grades for lower-
achieving students and improved the rate at which students
overall chose and stayed in harder math classes (Yeager et
al., 2019). These or similar effects have appeared in inde-
pendent evaluations of the NSLM (Zhu et al., 2019) and
international replications (Rege et al., in press).
With increasing emphasis on replication and generaliz-
ability has come an increased attention to questions of when,
why, and under what conditions growth mindset associa-
tions and intervention effects can be expected to appear.
Large trans-disciplinary studies have yielded insights into
contextual moderators of mindset intervention effects, such
as the educational cultures created by peers or schools
(Rege et al., in press; Yeager et al., 2019), and have begun to
document the behaviors that can mediate intervention ef-
fects on achievement (Gopalan & Yeager, 2020; Qin et al.,
2020). At the same time, researchers and practitioners have
begun to test a variety of mindset measures, manipulations,
and interventions. These have sometimes failed to find
growth mindset effects and other times have replicated
mindset effects.
The trend toward testing “heterogeneity of effects” and
the boundary conditions of effects has coincided with the
rise of the metascience movement. This latter movement has
asked foundational questions about the replicability of im-
portant findings in the literature (see Nelson et al., 2018).
The emergence of these two trends in parallel—increased
examination of heterogeneity and increased focus on repli-
cability—has made it difficult to distinguish between stud-
ies that call into question the foundations of a field and
studies that are a part of the normal accumulation of knowl-
edge about heterogeneity (about where effects are likely to
be stronger or weaker; see Kenny & Judd, 2019; Tipton et
al., 2019). Many scientists and educational practitioners
now wish to know: Which claims in the growth mindset
literature still stand? How has our understanding evolved?
And what will the future of mindset research look like?
In this article we examine different controversies about
the mindset literature and discuss the lessons that can be
learned from them.
To be clear, we think controversies are
often good. It can be useful to reexamine the foundations of
a field. Furthermore, controversies can lead to theoretical
advances and positive methodological reforms. However,
controversies can only benefit the field in the long run when
we identify useful lessons from them. This is what we seek
to do in the current article. We highlight why people’s
growth versus fixed mindsets, and why growth mindset inter-
ventions, should sometimes affect student outcomes, and
what we can learn from the times when they do not.
Preview of Controversies
In this article we review several prominent growth mind-
set controversies:
1. Do mindsets predict student outcomes?
2. Do student mindset interventions work?
3. Are mindset intervention effect sizes too small to
be interesting?
4. Do teacher mindset interventions work?
As we discuss below, different authors can stitch together
different studies across these issues to create very different
narratives. One narrative can lead to the conclusion that
mindset effects are not real and that even the basic tenets of
mindset theory are unfounded (Li & Bates, 2019; Macna-
mara, 2018). Another selective narrative could lead to the
opposite conclusion. Although we are not disinterested par-
ties, we hope to navigate the evidence in a way that is
illuminating for the reader—that is, in a way that looks at
the places where mindset effects are robust and where they
are weak or absent. Below, we first preview the central
concepts at issue in each controversy and the questions we
will explore. After that, we will review the evidence for
each controversy in turn.
Do Mindsets Predict Student Outcomes?
A fixed mindset, with its greater focus on validating one’s
ability and drawing negative ability inferences after struggle
or failure, has been found to predict lower achievement
(e.g., grades and test scores) among students facing aca-
demic challenges or difficulties, compared with a growth
mindset, with its greater focus on developing ability and on
questioning strategy or effort after failure (e.g., Blackwell et
al., 2007). However, some studies have not found this same
result (e.g., Li & Bates, 2019). Meta-analyses have shown
overall significant associations in the direction expected by
the theory, but the effects have been heterogeneous and thus
call for a greater understanding of where these associations
are likely and where they may be less likely (Burnette et al.,
2013; Sisk et al., 2018).

[Footnote: In the present article we focus on the intelligence mindsets, which form the core of much of the mindset literature. Of course, people can have mindsets about other characteristics as well (e.g., personality or social qualities; see Schroder et al., 2016) and similar mediators and mechanisms have often emerged in those domains as well.]
Recently, Macnamara (2018) and her coauthors (Bur-
goyne et al., 2020; Sisk et al., 2018) and Bates and his
coauthor (Li & Bates, 2019) have looked at this pattern of
results and called into question the basic correlations be-
tween mindsets and outcomes. Below, we look at the
differences across these correlational studies to try to make
sense of the discrepant results. We ask: Are the effects
found in the large, generalizable studies (i.e., random or
complete samples of defined populations), particularly
those conducted by independent organizations or research-
ers? And how can we know whether a given result is a
“failure to replicate” or a part of a pattern of decipherable heterogeneity?
Do Student Mindset Interventions Work?
The effects of mindset interventions on student outcomes
have been replicated but they, too, are heterogeneous (Sisk
et al., 2018; Yeager et al., 2019). Some studies (and sub-
groups within studies) have shown noteworthy effects, but
other studies or subgroups have not. Macnamara (2018),
Bates (Li & Bates, 2019) and several commentators
(Shpancer, 2020; Warne, 2020) have argued that this means
mindset interventions do not work or are unimportant. Oth-
ers have looked at the same data and concluded that the
effects are meaningful and replicable but they are likely to
be stronger or weaker for different people in different con-
texts (Bryan et al., 2020; Gelman, 2018). We ask whether
the heterogeneity is in fact informative and whether there is
an overarching framework that could explain (and predict in
advance) heterogeneous effects.
Are Mindset Intervention Effect Sizes Too Small
to Be Interesting?
Macnamara and colleagues have argued that growth
mindset intervention effect sizes are too small to be worthy
of attention (Macnamara, 2018; Sisk et al., 2018). Others
have disagreed (Gelman, 2018; Kraft, 2020; The World
Bank, 2017). Here we ask: What are the appropriate bench-
marks for the effect sizes of educational interventions? How
do mindset interventions compare? And how can an appre-
ciation of the moderators of effects contribute to a more
nuanced discussion of “true” effect sizes for interventions
(also see Bryan et al., 2020; Vivalt, 2020)?
Do Teacher Mindset Interventions Work?
Correlational research has indicated a role for educators’
mindsets and mindset-related practices in student achieve-
ment (Canning et al., 2019; Leslie et al., 2015; Muenks et
al., 2020). Nevertheless, two recent mindset interventions,
delivered to and by classroom teachers, have not had any
discernable effects on student achievement (Foliano et al.,
2019; Rienzo et al., 2015). We examine the issue of teacher-
focused interventions and ask: Why might it be so difficult
to coach teachers to instill or support a growth mindset?
How can the next generation of mindset research make
headway on this?
Controversy 1: Do Mindsets Predict
Student Outcomes?
What is Mindset Theory? Let us briefly review the
theoretical predictions before assessing the evidence. Mind-
set theory (Dweck, 1999; Dweck & Leggett, 1988; see
Dweck & Yeager, 2019) grows out of two traditions of
motivational research: attribution theory and achievement
goal theory. Attribution theory proposed that people’s ex-
planations for a success or a failure (their attributions) can
shape their reactions to that event (Weiner & Kukla, 1970),
with attributions of failure to lack of ability leading to less
persistent responses to setbacks than attributions to more
readily controllable factors, such as strategy or effort (see
Weiner, 1985). Research by Diener and Dweck (1978) and
Dweck and Reppucci (1973) suggested that students of
similar ability could differ in their tendency to show these
different attributions and responses. Later, achievement
goal theory was developed to answer the question of why
students with roughly equal ability might show different
attributions and responses to a failure situation (Elliott &
Dweck, 1988). This line of work suggested that students
who have the goal of validating their competence or avoid-
ing looking incompetent (a performance goal) tend to show
more helpless reactions in terms of (ability-focused) attri-
butions and behavior, relative to students who have the goal
of developing their ability (a learning goal; Elliott &
Dweck, 1988; Grant & Dweck, 2003).
The next question was: Why might students of equal
ability differ in their tendency toward helpless attributions
or performance goals? This is what mindset theory was
designed to illuminate. Mindset theory proposes that situa-
tional attributions and goals are not isolated ideas but in-
stead are fostered by more situation-general mindsets
(Molden & Dweck, 2006). These more situation-general
mindset beliefs about intelligence—whether it is fixed or
can be developed—are thought to lead to differences in
achievement (e.g., grades and test scores) because of the
different goals and attributions in situations involving chal-
lenges and setbacks.
In summary, mindset theory is a theory about responses to
challenges or setbacks. It is not a theory about academic
achievement in general and does not purport to explain the
lion’s share of the variance in grades or test scores. The
theory predicts that mindsets should be associated with
achievement particularly among people who are facing challenges or setbacks.
How Are Mindsets Measured? Mindsets are typically
assessed by gauging respondents’ agreement or disagree-
ment with statements such as “You have a certain amount of
intelligence, and you really can’t do much to change it”
(Dweck, 1999). Greater agreement with this statement cor-
responds to more of a fixed mindset, and greater disagree-
ment corresponds to more of a growth mindset. Mindsets
are not all-or-nothing—they are conceptualized as being on
a continuum from fixed to growth, and people can be at
different parts of the continuum at different times. In studies
in which survey space has been plentiful, mindsets are often
measured with three to four items framed in the fixed
direction and three to four items framed in the growth
direction, with the former being reverse-scored (e.g., Black-
well et al., 2007). In recent large-scale studies in which
space is at a premium (e.g., Yeager et al., 2016, 2019), a
more streamlined measure of the two or three strongest
fixed-framed items is used for economy, simplicity, and
clarity. Recently, several policy-oriented research studies
have adapted the mindset measures for use in national
surveys (e.g., Claro & Loeb, 2019).
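Because scoring conventions matter for interpreting the correlations reviewed below, here is a minimal sketch of how a composite mindset score might be computed from such items. The item names and the 1–6 response scale are illustrative assumptions, not the exact instruments cited above.

```python
# Hypothetical scoring sketch. Item names ("fixed_1", ...) and the
# 1-6 Likert response format are illustrative assumptions, not the
# exact instruments used in the studies cited above.

SCALE_MAX = 6  # assumed response scale: 1 (strongly agree) to 6

FIXED_FRAMED = ["fixed_1", "fixed_2", "fixed_3"]      # e.g., "You have a certain
GROWTH_FRAMED = ["growth_1", "growth_2", "growth_3"]  # amount of intelligence..."

def growth_mindset_score(responses: dict) -> float:
    """Average across items, reverse-scoring the fixed-framed items so
    that a higher composite indicates more of a growth mindset."""
    scores = [SCALE_MAX + 1 - responses[i] for i in FIXED_FRAMED]
    scores += [responses[i] for i in GROWTH_FRAMED]
    return sum(scores) / len(scores)

# A respondent who rejects every fixed statement (1) and endorses
# every growth statement (6) receives the maximum score of 6.0.
print(growth_mindset_score(
    {"fixed_1": 1, "fixed_2": 1, "fixed_3": 1,
     "growth_1": 6, "growth_2": 6, "growth_3": 6}))  # 6.0
```

The streamlined fixed-framed-only measures described above would simply drop the growth-framed list from this composite.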
Controversy 1a: Do Measured Mindsets Predict
Academic Achievement?
Preliminary Evidence. An initial study (Blackwell et
al., 2007, Study 1, N = 373) found that students reporting
more of a growth mindset showed increasing math grades
over the 2 years of a difficult middle school transition,
whereas those with more of a fixed mindset did not (even
though their prior achievement did not differ). Later, a
meta-analysis by Burnette et al. (2013) summarized data
from many different kinds of behavioral tasks and admin-
istrative records, in the laboratory and the field, and from
many different participant populations, and found that
mindsets were related to achievement and performance.
Large-Scale Studies. Many large studies conducted
by governments and nongovernmental organizations have
found that mindsets were correlated with achievement.
First, all 4th to 7th grade students in the largest districts of
California were surveyed as part of the state accountability
system (the “CORE” districts; N = 300,629). Growth
mindset was associated with higher English/Language Arts
scores, r = .28, and higher math scores, r = .27 (Claro &
Loeb, 2019, Table 2, Column 1; also see West et al., 2018).
A follow-up analysis of the same dataset found that the
association between mindset and test scores was stronger
among students who were struggling the most (those who
were medium-to-low achieving students; Kanopka et al.,
2020), which is consistent with mindset theory. Next, three
large survey studies—including the NSLM and the U-say
study in Norway (N = 23,446)—collectively showed a
correlation of mindset with high school grades of r = .24
(see Figure 1). In Chile, all 10th grade public school stu-
dents (N = 168,533) were asked to complete growth mind-
set questions in conjunction with the national standardized
test. Mindsets were correlated with achievement test scores
at r = .34, and correlations were larger among students with
greater risk of low performance (those facing socioeco-
nomic disadvantages; Claro et al., 2016). In another large
study, the Programme for International Student Assessment
(PISA), conducted by the OECD (an international organi-
zation that is charged with promoting economic growth
around the world through research and education), surveyed
random samples of students from 74 developed nations
(N = 555,458, see Figure 1) and showed that growth mind-
set was significantly and positively associated with test
scores in 72 of those nations (OECD, 2019; the exceptions
were China and Lebanon). (If the three separate regions of
China are counted as different nations, the positive associ-
ations were significant in 72 out of 76 cases.) The OECD
correlations for the United States and Chile approximated
the previously noted data (see Figure 1).
Unsupportive Evidence. In a recent study of N = 433
Chinese students in 5th and 6th grade, Li and Bates (2019)
found no significant correlation between students’ reported
mindsets and their grades or intellectual performance.
Bahník and Vranka (2017) likewise found no association be-
tween mindsets and aptitude tests in a large sample of
university applicants in the Czech Republic (N = 5,653).
There can be many reasons for these null effects. One can be
that the China sample is a smaller sample of convenience
and so the results may be less informative. Another can be
the choice and translation of the mindset items. However,
the PISA data may provide another possible, perhaps more
interesting, explanation. Looking at Figure 1, we see that
there was no positive correlation in the PISA data for China
overall (and there was even a negative correlation for the
mainland China students); moreover, the Czech Republic
was 71st out of 74 OECD nations in the size of their
correlation. Thus, the PISA data may suggest that the Li and
Bates (2019) and Bahník and Vranka (2017) correlational
studies happened to be conducted in cultural contexts that
have the weakest links between mindsets and achievement
in the developed world. Of course, this does not mean that
every sample from these countries would show a null effect
or that every sample from the countries with the higher
positive correlations would show a positive effect, but the
cross-national results help us to see whether a given study
fits a pattern or not.

[Footnote: The Claro and Loeb (2019) study also did an analysis focused on learning (i.e., test score gains). Such analyses will reduce the estimated correlations with mindset because they would be controlling for any effect of mindset on prior achievement, but these analyses too were significant and in the direction expected by mindset theory.]

[Footnote: Li and Bates (2019) also tested the effects of intelligence praise (not mindset) on students’ motivation, similar to what was done by Mueller and Dweck (1998). The authors reported three experiments, one of which was a significant replication at p < .05 and one of which was a marginally significant replication at p < .10 (two-tailed hypothesis tests). However, each study was under-powered. The most straightforward analysis with individually under-powered replications is to aggregate the data. When this was done across the three studies, Li and Bates (2019) clearly replicated Mueller and Dweck (1998), p < .05.]
A recent meta-analysis, presented as unsupportive of
mindset theory’s predictions, was, upon closer inspection,
consistent with the conclusions from the large-scale studies.
Macnamara and colleagues meta-analyzed many past stud-
ies’ effect sizes, with data from different nations, age
groups, and achievement levels, and using many different
kinds of mindset survey items (Sisk et al., 2018, Study 1).
They found an overall significant association between
mindsets and academic performance. However, this overall
association was much smaller (r = .09) than the estimates
from the generalizable samples presented above (r = .24 to
r = .34). This puzzle can be solved when we realize that the
Sisk et al. (2018) results were heterogeneous (as indexed by
a high I² statistic), and that this caused the random effects meta-analysis to
give about the same weight in the average to the effect sizes
of very small, outlying studies from convenience samples as
it gave to the generalizable studies. The data from three
generalizable data sets (the CORE data, the Chile data, and
the U.S. National Study of Learning Mindsets), which con-
tributed 296,828, or 81%, of all participant data in the Sisk
et al. (2018) meta-analysis, were given roughly the same
weight as correlations with sample sizes of n = 10 or fewer
participants (of which there were 19).
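The weighting argument above can be illustrated with a small sketch. Under a standard random-effects model, each study's weight is the inverse of its sampling variance plus the between-study variance (tau²); when tau² is large, it swamps the sampling variances and the weights converge. The numbers below are hypothetical, not the actual Sisk et al. (2018) estimates.

```python
# Hypothetical illustration of random-effects weighting (not the
# actual Sisk et al., 2018 data). Each study is weighted by
# 1 / (v_i + tau2), where v_i is the study's sampling variance and
# tau2 is the between-study variance. As tau2 grows, it dominates
# v_i, so a tiny convenience sample can count almost as much as a
# huge generalizable study in the average.

def sampling_variance(n: int) -> float:
    """Approximate sampling variance of a correlation near zero."""
    return 1.0 / (n - 3)

def random_effects_weights(sample_sizes, tau2):
    """Return normalized inverse-variance weights for a given tau2."""
    raw = [1.0 / (sampling_variance(n) + tau2) for n in sample_sizes]
    total = sum(raw)
    return [w / total for w in raw]

ns = [10, 300_000]  # a tiny study vs. a generalizable study
print(random_effects_weights(ns, tau2=0.0))   # homogeneous case: the big study dominates
print(random_effects_weights(ns, tau2=0.05))  # heterogeneous case: weights far closer
```

With tau² = 0 the large study receives nearly all the weight; with a sizable tau², the tiny study's share rises from near zero to a substantial fraction, which is the mechanism described in the text.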
Conclusions. There is a replicable and generalizable
association between mindsets and achievement. A lesson
from this controversy is to underscore that this association
cannot be captured with a single, summary effect size.
Mindset associations with outcomes were expected to be,
and often were, stronger among people facing academic
difficulties or setbacks. Further, there was (and always will
be) some unexplained heterogeneity across cultures and
almost certainly within cultures as well. This unexplained
heterogeneity should be a starting point for future theory
development. To quote Kenny and Judd (2019), “rather than
seeking out the ‘true’ effect and demonstrating it once and
for all, one approaches science . . . with a goal of under-
standing all the complexities that underlie effects and how
those effects vary” (p. 10).
In line with this recommendation, we are deeply inter-
ested in cultures in which a growth mindset does not predict
higher achievement, such as mainland China. Multinational
probability sample studies are uniquely suited to investigate
this. On the PISA survey, students in mainland China re-
ported spending 57 hr per week studying (OECD, 2019,
Figure 1.4.5), the second-most in the world. Perhaps a
growth mindset cannot increase hours of studying or test
scores any further when there is already a cultural impera-
tive to work that hard. However, that does not mean mindset
has no effect in such cultures. Mindset has an established
association with mental health and psychological distress
(Burnette et al., 2020; Schleider et al., 2015; Seo et al., in
press). Of all OECD nations, mainland China had the stron-
gest association between fixed mindset and “fear of failure,”
a precursor to poor mental health (OECD, 2019, Table 3.2.).
This suggests that perhaps a growth mindset might yet have
a role to play in student well-being there.

Figure 1
Correlations Between Mindset and Test Scores in 74 Nations Administering the Mindset Survey in the 2018 PISA (N = 555,458)
Note. Each dot is a raw correlation between the single-item mindset measure and PISA reading scores for each nation’s 15-year-olds. Bars: 95% confidence intervals (CIs). Data Source: OECD, PISA, 2018 Database, Table 3.B1.14.5. Data were reported by the OECD in SD units; these were transformed to r using the standard “r from d” formulas. One nation contributed three different effect sizes according to the OECD (China: for Macao, for Hong Kong, and for four areas of the mainland); these three effect sizes were averaged here. Looking at the three China effects separately, each was near zero, and mainland China was negative and significant. See the online article for the color version of this figure.
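The Figure 1 note mentions converting the OECD's SD-unit effects to correlations via the standard "r from d" formulas. A minimal sketch of the textbook conversion, assuming equal group sizes (the exact variant the authors applied is not specified here):

```python
import math

def r_from_d(d: float) -> float:
    """Convert a standardized mean difference (Cohen's d) to a
    point-biserial correlation r, assuming equal group sizes.
    Textbook formula: r = d / sqrt(d^2 + 4)."""
    return d / math.sqrt(d**2 + 4)

# Example: a 0.5 SD difference corresponds to roughly r = .24
print(round(r_from_d(0.5), 3))  # 0.243
```

With unequal group sizes, the constant 4 is replaced by a correction factor based on the group proportions; the equal-n form above is the common default.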
Controversy 1b: Does the Mindset “Meaning System” Replicate?
Burgoyne et al. (2020) have claimed that the evidence for
the mindset “meaning system” is not robust. First, we re-
view the predictions and measures, and then we weigh the evidence.
Meaning System Predictions. The two mindsets are
thought to orient students toward different goals and attri-
butions in the face of challenges and setbacks. Research has
also found (e.g., Blackwell et al., 2007) that the two mind-
sets can orient students toward different interpretations of
effort. For this reason, we have suggested that each mindset
fosters its own “meaning system.” More specifically, mind-
set theory expects that the belief in fixed ability should be
associated with a meaning system of performance goals
(perhaps particularly performance-avoidance goals, which
is the goal of avoiding situations that could reveal a lack of
ability), negative effort beliefs (e.g., the belief that the need
to put forth effort on a task reveals a lack of talent), and
helpless attributions in response to difficult situations (e.g.,
attributing poor performance to a stable flaw in the self,
such as being “dumb”). A growth mindset meaning system
is the opposite, involving learning goals, positive effort
beliefs, and resilient (e.g., strategy-focused) attributions
(see Dweck & Yeager, 2019). Thus, the prediction is that
measured mindsets should be associated with their allied
goals, effort beliefs, and attributions.
Meaning System Measures. In evaluating the robust-
ness of the meaning system, one must carefully examine the
measures used to see whether they capture the spirit and the
psychology of the meaning system in question. Indeed, as a
meta-analysis by Hulleman points out, research on achieve-
ment goals has often used “the same label for different
constructs” (Hulleman et al., 2010, p. 422). There are two
very different types of performance goal measures: those
focused on appearance/ability (i.e., validating and demon-
strating one’s abilities) and those focused on living up to a
normative standard (e.g., doing well in school; also see
Grant & Dweck, 2003). These two different types of per-
formance goal measures have different (even reversed) as-
sociations with achievement and with other motivational
constructs (see Table 13, p. 438, in Hulleman et al., 2010).
Mindset studies have mostly asked about appearance-
focused achievement goals (e.g., “It’s very important to me
that I don’t look stupid in [this math] class,” as in Blackwell
et al., 2007, Study 1; also see Table 1). However, correla-
tions of mindset with normative achievement goals (the
desire to do well in school overall) are not expected. Next,
learning goals measures do not simply ask about the value
of learning in general but ask about it in the context of a
tradeoff (e.g., in Blackwell et al., 2007: “It’s much more
important for me to learn things in my classes than it is to
get the best grades;” also see Table 1). The reason is that
almost anyone, even students with more of a fixed mindset,
should be interested in learning, but mindsets should start to
matter when people might have to sacrifice their public
image as a high-ability person to learn. Finally, attributions
are typically assessed in response to a failure situation (see
Robins & Pals, 2002 and Table 1), which is where the two
mindsets part ways.
Initial Evidence. The first study to simultaneously test
multiple indicators of the mindset meaning system was
conducted by Robins and Pals (2002) with college students
(N = 508). They found that a fixed mindset was associated
with performance goals and fixed ability attributions for
failure, as well as helpless behavioral responses to diffi-
culty, rs = .31, .19, and .48, respectively (effort beliefs were
not assessed). Later, Blackwell et al. (2007, Study 1, N =
373) replicated these findings, adding a measure of effort
beliefs.
Each meaning system hypothesis has also been tested
individually in experiments which directly manipulated
mindsets through persuasive articles (without mentioning
any of the meaning system constructs). In a study of mind-
sets and achievement goals, Cury et al. (2006, Study 2)
manipulated mindsets, without discussing goals, and
showed that this caused differences in performance versus
learning or mastery goals in the two mindset conditions. In
a study of mindsets and effort beliefs, Miele and Molden
(2010, Study 2) crossed a manipulation of mindsets (fixed
vs. growth) with a manipulation of how hard people had to
work to interpret a passage of text.
Participants induced to
hold more of a fixed mindset lowered their competence
perceptions when they had to work hard to interpret the
passage, and increased their competence perceptions when
the passage was easy to interpret, relative to those induced
to hold a growth mindset (who did not differ in their
competence perceptions in the hard vs. easy condition). It is
also important to note that actual performance and effort
were constant across the mindset conditions, but mindsets
caused different appraisals of the meaning of the effort they
expended. (Footnote: Miele and Molden, 2010, note that “the
two versions of the [mindset-inducing] article focused solely
on whether intelligence was stable or malleable and did not
include any information about the role of mental effort or
processing fluency in comprehension and performance”; p. 541.)
Finally, studies of mindsets and resilient versus
helpless reactions to difficulty have manipulated mindsets
without mentioning responses to failure. These have shown
greater resilience among those induced with a growth mind-
set (e.g., attributing failure to effort and/or seeking out
remediation) and greater helpless responses among those
induced with a fixed mindset (Hong et al., 1999; Nussbaum
& Dweck, 2008).
Meta-Analysis. A meta-analysis by Burnette et al.
(2013) synthesized the experimental and correlational evi-
dence on the meaning system hypotheses using data from
N = 28,217 participants. Consistent with mindset theory,
the effects of mindset on achievement goals (performance
vs. learning) and on responses to situations (helpless vs.
mastery) were replicated and were statistically significantly
stronger (by approximately 50% on average) when people
were facing a setback (what the authors called “ego threats”;
Burnette et al., 2013).
Large-Scale Studies Using Standardized Measures.
Recently, three different large studies with a total of over
23,000 participants have replicated the meaning system
correlations (see Figure 2 and Table 1), two of which were
generalizable to entire populations of students, either to the
United States overall (Yeager et al., 2019) or to entire parts
of Norway (Rege et al., in press).
These studies used the “Mindset Meaning-System Index”
(MMI), a standardized measure composed of single items
for each construct. To create the MMI, we chose from past
research (e.g., Blackwell et al., 2007) the most prototypical
or paradigmatic items that loaded highly onto the common
factor for that construct and covered enough of the construct
that it could stand in for the whole variable. Meeting the
criteria set forth above, the performance goal item was
appearance/ability focused, the learning goal involved a
tradeoff (e.g., learning but risking your high grades vs. not
learning but maintaining a high-ability image), and the
attribution items involved a response to a specific failure
situation. Each item was then edited to be clear enough to be
administered widely, especially to those with different lev-
els of language proficiency.
Understanding an Unsupportive Study. In a study
with N = 438 undergraduates, Burgoyne et al. (2020) found
weak or absent correlations between their measure of mindsets
and their measures of some of the meaning system
variables (performance goals, learning goals, and attributions),
none of which met their r = .20 criterion for a
meaningful effect. (They did not include effort beliefs.)
The data from Burgoyne et al. (2020) are thought-provoking.
They differ from the data provided by Robins
and Pals (2002), by Blackwell et al. (2007), and by the
participants just mentioned. Indeed, Table 1 shows that 10 out of
15 of the correlations in the three large replications
exceeded the r = .20 threshold set by Burgoyne et al., even
without adjusting for the unreliability in single-item measures
that can attenuate the magnitude of associations.
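The attenuation point can be illustrated with Spearman's classic correction formula, which divides an observed correlation by the square root of the product of the two measures' reliabilities. The sketch below is ours, and the reliability values in it are hypothetical, chosen only for illustration:

```python
import math

def disattenuate(r_obs, rel_x, rel_y):
    """Spearman's correction for attenuation: estimate the correlation
    between true scores from an observed correlation and the
    reliability of each measure."""
    return r_obs / math.sqrt(rel_x * rel_y)

# Hypothetical example: an observed r of .21 between two single-item
# measures, each with an assumed reliability of .70.
corrected = disattenuate(0.21, 0.70, 0.70)
print(round(corrected, 2))  # the corrected estimate is .30
```

With lower assumed reliabilities, the correction is larger, which is why correlations among single-item measures are conservative estimates of the associations among the underlying constructs.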
One solution to the puzzling discrepancy is that Burgoyne
et al. (2020) did not adhere closely to the measures used in
past studies. The nature of these deviations can be instruc-
tive for theory and future research. Tellingly, Burgoyne et
al. (2020) asked about normative-focused performance
goals (“I worry about the possibility of performing poorly”),
not appearance/ability-focused goals (cf. Grant & Dweck,
2003; Hulleman et al., 2010).

Table 1
Correlations of Fixed Mindsets With the “Mindset Meaning System” in Replication Studies

Correlation with fixed mindset          NSLM pilot,    NSLM,          U-say,
                                        N = 3,306      N = 14,894     N = 5,247
Meaning system aggregate index, r          .43            .40            .41
Individual meaning system items
  Effort beliefs, r                        .47            .32            .36
  Goals
    Performance-avoidance, r               .21            .21            .17
    Learning, r                           −.16           −.14           −.21
  Response (attributions)
    Helplessness, r                        .29            .28            .23
    Resilience, r                         −.14           −.15           −.25

Note. NSLM = National Study of Learning Mindsets. Correlations do not adjust for unreliability in the single
items, so effect sizes are conservative relative to measures with less measurement error. Data sources: the NSLM
(Yeager et al., 2019); the NSLM pilot study (Yeager et al., 2016); and a replication of the NSLM in Norway,
the U-say study (Rege et al., in press). Effort beliefs: “When you have to try really hard in a subject in school,
it means you can’t be good at that subject”; Performance-avoidance goals: “One of my main goals for the rest
of the school year is to avoid looking dumb in my classes”; Learning goals: Student chose between “easy math
problems that will not teach you anything new but will give you a high score vs. harder math problems that might
give you a lower score but give you more knowledge”; Helpless responses to challenge: Getting a bad grade
“means I’m probably not very smart at math”; Resilient responses to challenge: After a bad grade, saying “I can
get a higher score next time if I find a better way to study.” All ps < .05.

Students with more of a
growth mindset may well be just as eager to do well in
school (and not do poorly) as those with more of a fixed
mindset (indeed, endorsement of their item was near ceil-
ing). Where they should differ is in whether they are highly
concerned with maintaining a public (or private) image of
being a “smart” not a “dumb” person. To begin to find out
if this solves the puzzle, we were able to include both
measures of performance goals in a study of mindset we
were conducting with N = 1,582 U.S. 8th to 12th graders
(preregistration:; see the online supplemental
materials). A Bayesian analysis using Stan (Gelman et al.,
2015) found that a fixed mindset was associated with our
ability-focused goals item, r = .25 [95% posterior density
interval: .20, .29], and this was greater than Burgoyne’s
criterion of r = .20 with 97% probability, while a fixed
mindset was more weakly associated with the normative-focused
goals composite, r = .08 [.03, .13], with a 99.9%
probability that r < .20. These data support our argument that,
at least in some cases, Burgoyne et al. (2020) used items that did not
truly test the meaning system hypotheses.
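The kind of threshold comparison reported above can be roughly approximated without a full Bayesian model by working on the Fisher z scale of a correlation. This is our sketch (implicitly a flat prior; the study itself used Stan, so the numbers below are only illustrative):

```python
import math

def prob_true_r_exceeds(r_obs, n, threshold):
    """Approximate probability that the true correlation exceeds a
    threshold, given an observed r and sample size, using the normal
    approximation on the Fisher z scale."""
    z_obs = math.atanh(r_obs)        # Fisher z of the observed r
    z_thr = math.atanh(threshold)    # Fisher z of the threshold
    se = 1.0 / math.sqrt(n - 3)      # standard error on the z scale
    # Normal upper-tail probability P(true z > threshold z)
    return 0.5 * math.erfc((z_thr - z_obs) / (se * math.sqrt(2.0)))

# With r = .25 and N = 1,582, the probability that r exceeds .20 is high;
# with r = .08, the probability that r is BELOW .20 is near certainty.
print(prob_true_r_exceeds(0.25, 1582, 0.20))      # roughly .98
print(1 - prob_true_r_exceeds(0.08, 1582, 0.20))  # above .999
```

The approximation lands close to the reported 97% and 99.9% figures; a proper posterior with an informative prior would shift these slightly.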
Conclusion. The Burgoyne et al. (2020) study was in-
structive because it provided an opportunity to articulate the
aspects of the constructs that are most central to mindset
theory. In addition, it showed the need for standardized
measures, not only of the mindsets, but also of their mean-
ing system mediators. Being short, the MMI is not designed
to maximize effect sizes in small studies, but rather is
designed to provide a standardized means for learning about
where (and perhaps for testing hypotheses about why) the
meaning system variables should be more strongly or
weakly related to mindsets, to each other, and to outcomes.
Controversy 2: Do Student Mindset
Interventions Work?
Based on their meta-analysis of mindset interventions,
Macnamara and colleagues (Macnamara, 2018; Sisk et al.,
2018) concluded that there is only weak evidence that
mindset interventions work or yield meaningful effects.
Here, we first clarify what is meant by a growth mindset
intervention. Then we ask: Do the effects replicate and hold
up in preregistered studies and independent analyses? And
are the effects meaningfully heterogeneous—that is, do the
moderation analyses reveal new insights, mechanisms, or
areas for future research?

Footnote: In addition, the Burgoyne et al. (2020) study included a single
attributional item which did not ask about an interpretation of any failure event
(it simply said “Talent alone—without effort—creates success”), unlike
Robins and Pals (2002).

Figure 2
The Correlations Between Fixed Mindsets, the Meaning System, and Academic Outcomes Were Replicated in
Three Large Studies of First-Year High School Students in Two Nations

Note. Statistics are zero-order correlations estimated by meta-analytically aggregating three data sets: the U.S. National Study of
Learning Mindsets (NSLM) pilot (Yeager et al., 2016); the U.S. NSLM (Yeager et al., 2019); and the Norway U-say experiment (Rege
et al., in press). All ps < .001. Data from each of the items for each of the three studies are presented in Table 1 later in the article.
The meaning system paths to grades and course-taking were measured with the aggregate of the Mindset Meaning System items
(see Table 1 and the online appendix for the questionnaire). The MMI measure is brief (five items) so that it could be
incorporated into large-scale studies. Because these are single-item measures, magnitudes of correlations are likely to be
underestimates of true associations.
What Is a Growth Mindset Intervention?
The Intervention’s Core Content. A growth mindset
intervention teaches the idea that people’s academic and
intellectual abilities can be developed through actions they
take (e.g., effort, changing strategies, and asking for help;
Yeager & Dweck, 2012; Yeager, Romero, et al., 2016). The
intervention usually conveys information about neuroplas-
ticity through a memorable metaphor. The NSLM, for in-
stance, stated “the brain is like a muscle—it gets stronger
(and smarter) when you exercise it.” Merely defining and
illustrating the growth mindset with a metaphor could never
motivate sustained behavior change, however. The interven-
tion must also mention concrete actions people can take to
implement the growth mindset, such as “You exercise your
brain by working on material that makes you think hard in
school.” Students also heard stories from scientists, peers,
and notable figures who have used a growth mindset. The
intervention is not a passive experience but invites active
engagement. In the NSLM, for example, students wrote
short essays about times they have grown their abilities after
struggling and how they might use a growth mindset for
future goals. They also wrote a letter to encourage a future
student who had fixed mindset thoughts such as “I’m al-
ready smart, so I don’t have to work hard” or “I’m not smart
enough, and there’s nothing I can do about it.” This is called
a “saying-is-believing” exercise and it can lead students to
internalize the growth mindset in a short time, through
cognitive dissonance processes (Aronson et al., 2002).
Crucial Details. Several details are crucial to the inter-
vention’s effectiveness. A growth mindset is not simply the
idea that people can get higher scores if they try harder. To
count as a growth mindset intervention, it must make an
argument that ability itself has the potential to be developed.
Telling students that they succeeded because they tried hard,
for example, is an attribution manipulation, not a growth
mindset intervention (cf. Li & Bates, 2019). A growth
mindset intervention also does not require one to believe
that ability can be greatly or easily changed. It deals with the
potential for change, without making any claim or promise
about the magnitude or ease of that change. Nor does a
growth mindset intervention say that ability does not matter
or does not differ. Rather, it focuses on within-person
comparisons—the idea that people can improve relative to their
own prior performance.
Finally, growth mindset interventions can be poorly
crafted or well-crafted (see Yeager, Romero, et al., 2016, for
head-to-head comparisons of different interventions).
Poorly crafted interventions tell participants the definition
of growth mindset without suggesting how to put it into
practice. As noted above, a definition alone cannot motivate
behavior change. Well-crafted interventions ask students to
reflect on how they might develop a “stronger” (better-
connected) brain if they do challenging work, seek out new
learning strategies, or ask for advice when it is needed, but
they do not train students in the mediating behaviors or
meaning system constructs. That is, the interventions mention
these learning-oriented behaviors at a high level, so that
students do not just have a definition of a growth mindset
but also know how to put it into action if they wish to do so. In
fact, in some mindset experiments, the control group gets skills
training (e.g., Blackwell et al., 2007; Yeager et al., 2013).
Well-crafted interventions are also autonomy-supportive and
are not didactic (see Yeager et al., 2018).
Crafting the Interventions. Recently, as growth mind-
set interventions have been evaluated in larger-scale exper-
iments, a years-long research and development process has
been followed, which included (a) focus groups with stu-
dents to identify arguments that will be persuasive to the
target population, (b) A/B tests to compare alternatives head
to head, and (c) pilot studies in preregistered replication
experiments (see Yeager, Romero, et al., 2016). When the
intervention reliably changed mindset beliefs and short-term
mindset-relevant behavior (such as challenge-seeking) con-
sistently across gender, racial, ethnic, and socioeconomic
groups, then it was considered ready for an evaluation study
testing its effects on academic outcomes in the target pop-
ulation (e.g., among 9th graders).
Intervention Target. Here we focus on growth mindset
interventions that target students and that deliver treatments
directly to students without using classroom teachers as
intermediaries. We consider teacher-focused interventions
below (see Controversy 4).
Do Student Growth Mindset Intervention
Effects Replicate?
Initial Evaluation Experiments. Blackwell and colleagues
(2007, Study 2) evaluated an in-person growth
mindset intervention delivered by facilitators (not teachers)
who were trained extensively before each session and
debriefed fully after each one. They found that
this initial intervention halted the downward trajectory of
math grades among U.S. 7th grade students who had previously
been struggling in school (N = 99). Although promising,
the in-person intervention was not scalable because it
required extensive training of facilitators and a great deal of
class time. Later, Paunesku and colleagues (2015) adminis-
tered a portion of that face-to-face intervention in an online
format to U.S. high school students (N = 1,594). It included
a scientific article conveying the growth mindset message,
followed by guided reading and writing exercises designed
to help students internalize the mindset. Using a double-
blind experimental design, Paunesku and colleagues (2015)
found effects of the intervention on lower-achieving stu-
dents’ grades months later, at the end of the term—a result
which mirrored the findings of many correlational studies
showing larger effects for more vulnerable groups (noted
above). The effect size was, not surprisingly, smaller in a
short, online self-administered intervention than in an in-
person intervention, but the conclusions were similar and
the online mode was more scalable.
Yeager, Walton, and colleagues (2016, Study 2) replicated
the Paunesku et al. study (using almost identical materials),
but focused on undergraduates (N = 7,418) and a different
outcome (full-time enrollment rather than grades). They
administered a version of the Paunesku et al. (2015) short
online growth mindset intervention to all entering under-
graduates at a large public university and found that it
increased full-time enrollment rates among vulnerable
groups. Broda and colleagues (2018) carried
out a replication (N = 7,686) of the Yeager, Walton, et al.
(2016, Study 2) study and found a similar result: a growth
mindset intervention lifted achievement among a subgroup
of students that was at risk for poor grades.
First Preregistered Replication. The Paunesku et al.
study was replicated in a preregistered trial conducted with
high school students (Yeager, Romero, et al., 2016). That
study improved upon the intervention materials by updating
and refining the arguments (see Study 1, Yeager et al.,
2016) and more than doubled the sample size (3,676), which
matched recommendations for replication studies (Simon-
sohn, 2015). This Yeager et al. (2016) study found effects
for lower-achieving students. Of note, Pashler and de Ruiter
(2017) have argued that psychologists should reserve strong
claims for “class 1” findings—those that are replicated in
preregistered studies. With the 2016 study, growth mindset
met their “class 1” threshold.
A National, Preregistered Replication. In 2013 we
launched the NSLM (Yeager, 2019), which is another pre-
registered randomized trial, but now using a nationally
representative sample of U.S. public high schools. To
launch the NSLM, a large team of researchers collectively
developed and reviewed the intervention materials and pro-
cedures to ensure that the replication was faithful and high-
quality. Then, an independent firm specializing in nationally
representative surveys (ICF International) constructed the
sample, trained their staff, guided school personnel, and
collected and merged all data. Next, a different independent
firm, specializing in impact evaluation studies (MDRC),
constructed the analytic dataset, and wrote their own inde-
pendent evaluation report based on their own analyses of the
data. Our own analyses (Yeager et al., 2019) and the inde-
pendent report (Zhu et al., 2019) confirmed the conclusion
from prior research: there was a significant growth mindset
intervention effect on the grades of lower-achieving students.
Effects on Course-Taking. Of note, an exploratory
analysis of the NSLM data found that the growth mindset
intervention increased challenge-seeking across achieve-
ment levels, as assessed by higher rates of taking advanced
math (Algebra II or above). This finding was later replicated
in the Norway U-say experiment (N = 6,541; Rege et al., in
press) with an identical overall effect size.
Summary of Supportive Replications. Taken to-
gether, these randomized trials established the effects of
growth mindset interventions with more than 40,000 partic-
ipants and answered the question of whether growth mind-
set interventions can, under some conditions, yield replica-
ble and scalable effects for vulnerable groups. However,
this does not mean that growth mindset interventions work
everywhere for all people. Indeed, there were sites within
the NSLM where the intervention did not yield enhanced
grades among lower achievers.
Unsupportive Evidence. Rienzo, Rolfe, and Wilkinson
(2015) evaluated a face-to-face growth mindset intervention
in a sample of 5th grade students (N = 286),
resembling the face-to-face method in Blackwell et al.
(2007). They found a nonsignificant positive effect,
namely, a 4-month gain in academic achievement, p = .07,
in the growth mindset group relative to the controls. The
estimated effect size (and the pattern of moderation results by
student prior achievement reported by Rienzo et al., 2015)
was larger than the online growth mindset intervention
effects. Therefore, the Rienzo study is not exactly evidence
against mindset effects (see a discussion of this statistical
argument in McShane et al., 2019). But Rienzo et al. (2015)
also reported a second, teacher-focused intervention with
null effects (see Controversy 4 below).
Several studies have evaluated direct-to-student growth
mindset manipulations (see Sisk et al., 2018 for a meta-
analysis; also see the appendix in Ganimian, 2020). Some of
these relied on nonrandomized (i.e., quasi-experimental)
research designs, and others were randomized trials. Some
were not growth mindset interventions (e.g., involving only
an email from a professor or sharing a story of a scientist who
overcame great struggles). These were combined in a meta-analysis
(Sisk et al., 2018). This meta-analysis yielded the
same conclusion as the NSLM: an overall significant mindset
intervention effect that was larger among students at risk for
poor performance. However, as with their meta-analysis of the
correlational studies, the Sisk et al. (2018) meta-analysis
yielded significant heterogeneity.

Is the Heterogeneity in Mindset Effects Meaningful?
Learning from the heterogeneity in any psychological phe-
nomenon, especially one involving real-world behavior and
policy-relevant outcomes, can be difficult (Tipton et al.,
2019). In particular, it is hard to understand the source of
heterogeneous findings when different studies involve dif-
ferent populations, different interventions, and different
contexts all at once. Because meta-analyses tend to assess
moderators at the study level (rather than analyzing within-
study interaction effects), and because each study tends to
change all three—the population, the intervention, and the
context—the moderators are confounded. For this reason,
metaregression is often poorly suited for understanding
moderators (see a discussion in Yeager et al., 2015).
It is far easier to find out if contextual heterogeneity is
meaningful if a study holds constant key study design
features (the intervention and the targeted group) and then
carries out the experiment in different contexts (Tipton et
al., 2019). If effects still varied, this would mean that at least
some of the heterogeneity we were seeing was systematic
and potentially interesting. Indeed, large, rigorous random-
ized trials have a long history of settling debates caused
by meta-analyses that aggregated mostly correlational or
quasi-experimental studies (see, e.g., the class size de-
bate; Krueger, 2003).
Heterogeneous Effects in the NSLM. The NSLM was
designed to provide exactly this kind of test, by uncon-
founding contextual heterogeneity from intervention design.
It revealed that growth mindset intervention effects were
systematically larger in some schools and classrooms than
others. In particular, the intervention improved lower-
achieving students’ grade point averages chiefly when peers
displayed a norm of challenge-seeking (a growth mindset
behavior; Yeager et al., 2019) and math grades across
achievement levels when math teachers endorsed more of a
growth mindset (Yeager et al., 2020). Thus, the intervention
(which changed growth mindset beliefs homogeneously across
these settings) was most effective at changing grades in
populations who were vulnerable to poor outcomes, and in
contexts where peers and teachers afforded students the
chance to act and continue acting on the mindset (see
Walton & Yeager, 2020).
A Proposed Framework for Understanding Heterogeneity:
The Mindset × Context Perspective. The NSLM
results led to a framework that can help to understand (and
predict, in advance) heterogeneous results. We call it the
Mindset × Context perspective (see Figure 3; see Yeager et
al., 2019). According to this view, a growth mindset inter-
vention should have meaningful effects only when people
are actively facing challenges or setbacks (e.g., when they
are lower-achieving, or are facing a difficult course or
school transition) and when the context provides opportu-
nities for students to act on their mindsets (e.g., via teacher
practices that support challenge-seeking or risking mistakes;
Bryan et al., 2020; Walton & Yeager, 2020). Mindset ×
Context stands in contrast to the “mindset alone” perspective,
which is the idea that if people are successfully taught
a growth mindset they will implement this mindset in al-
most any setting they find themselves in.
One important implication of Mindset × Context theory
is that any intervention will likely need further customiza-
tion before it can be given to different populations (e.g., in
the workplace, for older or younger students, or in a new
domain). And even a well-crafted intervention will need to
be delivered with an understanding of the context factors
that could moderate its effects.
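The decision logic of this perspective can be summarized in a toy sketch (our simplification, not the authors' formal model): effects on outcomes are expected mainly when a person is actually facing a challenge and the context lets them act on the new belief.

```python
def expected_effect(facing_challenge: bool, supportive_context: bool) -> str:
    """Toy encoding of the Mindset x Context perspective: where should a
    growth mindset intervention move real-world outcomes (e.g., grades)?"""
    if not facing_challenge:
        # No setback to reinterpret, so little room for the belief to matter
        return "little or no effect on outcomes expected"
    if not supportive_context:
        # Beliefs may change, but peers/teachers give no chance to act on them
        return "belief change possible, but weak effect on outcomes"
    return "meaningful effect on outcomes expected"

print(expected_effect(facing_challenge=True, supportive_context=True))
```

The two branches correspond to the two moderators observed in the NSLM: students' vulnerability to poor outcomes and the peer/teacher climate.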
We note that research using a typical convenience sample
will have a hard time testing predictions of Mindset ×
Context theory. Individual-level risk factors and unsupportive
context factors are often positively correlated in
the population, but the two types of factors are expected
to have opposite moderation effects. This is why it is
useful to use either representative samples (like the
NSLM) or carefully constructed quota samples which
disentangle the two.
Conclusions. The NSLM results show why it can be
misleading to look at studies with null findings and con-
clude that something “doesn’t work” or isn’t “real” (also see
Kenny & Judd, 2019).

Figure 3
The Mindset × Context Perspective: A Decision Tree Depicts Questions to Ask
About a Mindset Intervention, and What Kinds of Effect to Expect Depending on
the Answer

In fact, if researchers had treated
each of the 65 schools in the NSLM as its own separate
randomized trial, there could be many published papers in
the literature that might look like “failures to replicate.”
This is why it is not best practice to count significant results
(see Gelman, 2018, for a discussion of this topic). When,
instead, we had specific hypotheses about meaningful
sources of heterogeneity and a framework for interpreting
them (see Figure 3) we found that the schools varied sys-
tematically in their effects. There is still much more to learn
about heterogeneity in growth mindset interventions, of
course, in particular with respect to classroom cultures and
international contexts (cf. Ganimian, 2020).
Controversy 3: Are Mindset Effect Sizes Too
Small to Be Interesting?
What Are the Right Benchmarks for Effect Sizes?
Benchmarks from Macnamara and Colleagues. Macna-
mara and colleagues stated that a “typical” effect size for an
educational intervention is .57 SD (Macnamara, 2018; Sisk et
al., 2018, p. 569). To translate this into a concrete effect, this
would mean that if GPA had a standard deviation of 1, as it
usually does, then a typical intervention should be expected to
increase GPA by .57 points, for instance from a 3.0 to a 3.57.
Macnamara and colleagues argued that “resources might be
better allocated elsewhere” because growth mindset effects
were much smaller than .57 SD (Sisk et al., 2018, p. 569).
Questioning Macnamara (2018), Gelman (2018) asked
“Do we really believe that .57? Maybe not.” The .57 SD
benchmark comes from a meta-analysis by Hattie et al.
(1996) of learning skills manipulations. Their meta-
analysis mostly aggregated effects on variables at imme-
diate posttest, almost all of which were researcher-
designed measures to assess whether students displayed
the skill they had just learned (almost like a manipulation
check). However, such immediate measures that are very
close to precisely what was taught are well-known to
show much larger effect sizes than multiply determined
outcomes that unfold over time in the real world, such as
grades or test scores (Cheung & Slavin, 2016). Indeed, in
the same Hattie meta-analysis cited by Macnamara
(2018), Hattie and colleagues stated: “There were 30
effect sizes that included follow-up evaluations, typically
of 108 days, and the effects sizes declined to an average
of .10” (Hattie et al., 1996, p. 112). Judging from the
meta-analysis cited by Macnamara (2018), the “typical”
effect size for a study looking at effects over time would
be .10 SD, not .57 SD. But even this is too optimistic
when we consider study quality.
Benchmarks From Field Experiments in Education.
The current standard for understanding effect sizes in
educational research does not look to a meta-analysis of
all possible studies regardless of quality but looks to
“best evidence syntheses.” This means examining syn-
theses of studies using the highest-quality research de-
signs that were aimed at changing real-world educational
outcomes (Slavin, 1986). This is important because
lower-quality research designs (nonexperimental, non-
randomized, or nonrepresentative) do not provide realis-
tic benchmarks against which interventions should be
evaluated (Cheung & Slavin, 2016). Through the lens of
the best evidence synthesis, about the best effect that can
be expected for an educational intervention with real-
world outcomes is .20 SD. For instance, the effect of
having an exceptionally good versus a below-average
teacher for an entire year is .16 SD (Chetty et al., 2014).
An entire year of math learning in high school is .22 SD
(Lipsey et al., 2012). The positive effect of drastically
reducing class-size for elementary school students was
.20 SD (Nye et al., 2000).
A typical effect of educational interventions is much
smaller. Kraft (2020) located all studies funded by the
federal government’s Investing in Innovation (I3) Fund,
which were studies of programs that had previously
shown promising results and won a competition to un-
dergo rigorous, third-party evaluation (Boulay et al.,
2018). This is relevant because researchers had to pre-
specify their primary outcome variable and hypotheses
and report the results regardless of their significance, so
there is no “file drawer” problem. These were studies
looking at objective real-world outcomes (e.g., grades,
test scores, or course completion rates) some time after
the educational interventions. The median effect size was
.03 SD and the 90th percentile was .23 SD (see Kraft,
2020, Table 1, rightmost column). In populations of
adolescents (the group targeted by our growth mindset
interventions), there were no effects over .23 SD (see an
analysis in the online supplement in Yeager et al., 2019).
Kraft (2020) concluded that “effects of 0.15 or even 0.10
SD should be considered large and impressive when they
arise from large-scale field experiments that are prereg-
istered and examine broad achievement measures” (p.
It is interesting to note that the most highly touted
“nudge” interventions (among those whose effects unfold
over time) have effects that are in the same range (see
Benartzi et al., 2017). The effects of descriptive norms on
energy conservation (.03 SD), the effects of implemen-
tation intentions on flu vaccination rates (.12 SD), and the
effects of simplifying a financial aid form on college
enrollment for first-time college-goers (.16 SD; effect
sizes calculated from posted z statistics,
47f7j/) were never anywhere close to .57 SD overall.
Summary of Effect Sizes
We have presented this review to suggest that psychology’s
effect size benchmarks, which were based on expectations of
laboratory results and are often still used in our field, have led
the field astray (also see a discussion in Kraft, 2020). In the real
world, single variables do not have huge effects. Not even
relatively large, expensive, and years-long reforms do. If psy-
chological interventions can get a meaningful chunk of a .20
effect size on real-world outcomes in targeted groups, reliably,
cost-effectively, and at scale, that is impressive.
Comparison to the NSLM
In this context, growth mindset intervention effect sizes are
noteworthy. The NSLM showed an average effect on grades in
the preregistered group of lower-achieving students of .11 SD
(Yeager et al., 2019). Moreover, this occurred with a short,
low-cost self-administered intervention that required no further
time investment from the school (indeed, teachers were blind
to the purpose of the study). And these effects were in a
scaled-up study.
Growth mindset effects were larger in contexts that were
specified in preregistered hypotheses. The effect size was .18
(for overall grades) and .25 (for math and science grades) in
schools that were not already high-achieving and provided a
supportive peer climate (Yeager et al., 2019, Figure 3). Of
course, we are aware that these kinds of moderation results
might, in the past, have emerged from a post hoc exploratory
analysis and would therefore be hard to believe, making it
tempting to discount them. But these patterns
emerged from a disciplined preanalysis plan that was carried
out by independent Bayesian statisticians who analyzed
blinded data using machine-learning methods (Yeager et al.,
2019), and the moderators were confirmed by an independent
research firm’s analyses, over which we had no influence (Zhu
et al., 2019).
Effects of this magnitude can have important consequences
for students’ educational trajectories. In the NSLM, the overall
average growth mindset effect on lower-achievers’ rate of poor
performance in their 9th-grade courses (that is, the earning of
D/F grades) was a reduction of 5.3 percentage points (Yeager
et al., 2019). Because there are over 3 million students per year
in 9th grade (and lower-achievers are defined as the bottom
half of the sample), this means that a scalable growth mindset
intervention could prevent 90,000 at-risk students per year
from failing to make adequate progress in the crucial first year
of high school. In summary, the effect sizes were meaning-
ful—they are impressive relative to the latest standards and in
terms of potential societal impact—and the moderators meet a
high bar for scientific credibility.
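The societal-impact arithmetic above can be checked directly. A minimal sketch in Python; the cohort size of roughly 3.4 million 9th graders is our illustrative assumption (consistent with the article's "over 3 million"), not a figure reported by the study:

```python
# Back-of-envelope estimate of at-risk students helped per year by a
# scaled growth mindset intervention, based on the NSLM effect on D/F rates.

ninth_graders_per_year = 3_400_000  # assumed cohort size ("over 3 million")
lower_achiever_share = 0.5          # lower-achievers defined as bottom half
df_rate_reduction = 0.053           # 5.3 percentage-point drop in D/F grades

students_helped = (ninth_graders_per_year
                   * lower_achiever_share
                   * df_rate_reduction)
print(round(students_helped))  # roughly 90,000 students per year
```

The estimate is sensitive to the assumed cohort size, but any value "over 3 million" yields on the order of 80,000–90,000 students per year.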
Controversy 4: Teacher Mindset Interventions
The final controversy concerns mindset interventions
aimed at or administered by teachers. So far, teacher-
focused growth mindset interventions have not worked (Foliano
et al., 2019; Rienzo et al., 2015), even though they
were developed with great care and were labor-intensive. A
likely reason is that the evidence base for teacher-focused
interventions is just beginning to emerge. Among other
things, the field will need to learn (a) precisely how to
address teachers’ mindsets about themselves and their stu-
dents, (b) which teacher practices feed into and maintain
students’ fixed and growth mindsets, (c) how to guide and
alter the teachers’ practices, and (d) how to do so in a way
that affects students’ perceptions and behaviors and that
enhances students’ outcomes. Moreover, changing teacher
behavior through professional development is known to be
exceptionally challenging (TNTP, 2015).
For this reason, it might be preferable to start by admin-
istering a direct-to-student program to teach students a
growth mindset, such as the (free-to-educators) program we
have developed (available at Then the fo-
cus can be on helping teachers to support its effects. As the
field begins to tackle this challenge, it will not have to start
from scratch but can build on recent studies documenting
the role of teachers’ mindsets and mindset-related practices
in student achievement (e.g., Canning et al., 2019).
Summary of the Controversies
Three of the questions we have addressed so far—Does
growth mindset predict outcomes? Do growth mindset in-
tervention effects replicate? Are the effect sizes meaning-
ful?—have strong evidence in the affirmative. In each case
we have been inspired to learn from critiques, for instance,
by learning more about the expected effect sizes in educa-
tional field experiments, or designing standardized mea-
sures and interventions. There is also evidence that speaks
to the meaningful heterogeneity of the effects. As we have
discussed, there are studies, or sites within studies, that do
not show predicted mindset effects, but the more we are
learning about the students and contexts at those sites the
more we can improve mindset measures and intervention
programs. The fourth controversy, about educational prac-
titioners, highlights one important limitation of the work to
date and points to future directions for research.
This review of the evidence showed that the foundations
of mindset theory are sound, the effects are replicable, and
the effect sizes are promising. Although we have learned
much from the research-related controversies, we might ask
at a more general level: Why should the idea that students
can develop their abilities be controversial? And why
should it be controversial that believing this can inspire
students, in supportive contexts, to learn more? In fact, do
not all children deserve to be in schools where people
believe in and are dedicated to the growth of their intellec-
tual abilities? The challenge of creating these supportive
contexts for all learners will be great, and we hope mindset
research will play a meaningful role in their creation.
References

Aronson, J. M., Fried, C. B., & Good, C. (2002). Reducing the effects of
stereotype threat on African American college students by shaping
theories of intelligence. Journal of Experimental Social Psychology,
38(2), 113–125.
Bahník, Š., & Vranka, M. A. (2017). Growth mindset is not associated with
scholastic aptitude in a large sample of university applicants. Personality
and Individual Differences,117, 139–143.
Benartzi, S., Beshears, J., Milkman, K. L., Sunstein, C. R., Thaler, R. H.,
Shankar, M., Tucker-Ray, W., Congdon, W. J., & Galing, S. (2017).
Should governments invest more in nudging? Psychological Science,
28(8), 1041–1055.
Blackwell, L. S., Trzesniewski, K. H., & Dweck, C. S. (2007). Implicit
theories of intelligence predict achievement across an adolescent tran-
sition: A longitudinal study and an intervention. Child Development,
78(1), 246–263.
Boulay, B., Goodson, B., Olsen, R., McCormick, R., Darrow, C., Frye, M.,
Gan, K., Harvill, E., & Sarna, M. (2018). The investing in innovation
fund: Summary of 67 evaluations (NCEE 2018-4013). National Center
for Education Evaluation and Regional Assistance, Institute of Educa-
tion Sciences, U.S. Department of Education.
Broda, M., Yun, J., Schneider, B., Yeager, D. S., Walton, G. M., & Diemer,
M. (2018). Reducing inequality in academic success for incoming col-
lege students: A randomized trial of growth mindset and belonging
interventions. Journal of Research on Educational Effectiveness,11(3),
Bryan, C. J., Tipton, E., & Yeager, D. S. (2020). To change the world,
behavioral intervention research will need to get serious about hetero-
geneity [Unpublished manuscript]. University of Texas at Austin.
Burgoyne, A. P., Hambrick, D. Z., & Macnamara, B. N. (2020). How firm
are the foundations of mind-set theory? The claims appear stronger than
the evidence. Psychological Science,31(3), 258–267.
Burnette, J. L., Knouse, L. E., Vavra, D. T., O’Boyle, E., & Brooks, M. A.
(2020). Growth mindsets and psychological distress: A meta-analysis.
Clinical Psychology Review,77, 101816.
Burnette, J. L., O’Boyle, E. H., VanEpps, E. M., Pollack, J. M., & Finkel,
E. J. (2013). Mind-sets matter: A meta-analytic review of implicit
theories and self-regulation. Psychological Bulletin,139(3), 655–701.
Canning, E. A., Muenks, K., Green, D. J., & Murphy, M. C. (2019). STEM
faculty who believe ability is fixed have larger racial achievement gaps
and inspire less student motivation in their classes. Science Advances,
5(2), eaau4734.
Chetty, R., Friedman, J. N., & Rockoff, J. E. (2014). Measuring the impacts
of teachers I: Evaluating bias in teacher value-added estimates. The
American Economic Review,104(9), 2593–2632.
Cheung, A. C. K., & Slavin, R. E. (2016). How methodological features
affect effect sizes in education. Educational Researcher,45(5), 283–
Claro, S., & Loeb, S. (2019). Students with growth mindset learn more in
school: Evidence from California’s CORE school districts. PACE. Retrieved from
Claro, S., Paunesku, D., & Dweck, C. S. (2016). Growth mindset tempers
the effects of poverty on academic achievement. Proceedings of the
National Academy of Sciences of the United States of America,113(31),
Cury, F., Elliot, A. J., Da Fonseca, D., & Moller, A. C. (2006). The
social-cognitive model of achievement motivation and the 2 × 2 achievement
goal framework. Journal of Personality and Social Psychology, 90(4), 666–679.
Diener, C. I., & Dweck, C. S. (1978). An analysis of learned helplessness:
Continuous changes in performance, strategy, and achievement cogni-
tions following failure. Journal of Personality and Social Psychology,
36(5), 451.
Dweck, C. S. (1999). Self-theories: Their role in motivation, personality,
and development. Taylor and Francis/Psychology Press.
Dweck, C. S., & Leggett, E. L. (1988). A social-cognitive approach to
motivation and personality. Psychological Review,95(2), 256–273.
Dweck, C. S., & Reppucci, N. D. (1973). Learned helplessness and
reinforcement responsibility in children. Journal of Personality and
Social Psychology,25(1), 109–116.
Dweck, C. S., & Yeager, D. S. (2019). Mindsets: A view from two eras.
Perspectives on Psychological Science,14, 481–496.
Elliott, E. S., & Dweck, C. S. (1988). Goals: An approach to motivation
and achievement. Journal of Personality and Social Psychology,54(1),
Foliano, F., Rolfe, H., Buzzeo, J., Runge, J., & Wilkinson, D. (2019).
Changing mindsets: Effectiveness trial. National Institute of Economic
and Social Research.
Ganimian, A. J. (2020). Growth-mindset interventions at scale: Experi-
mental evidence from Argentina. Educational Evaluation and Policy
Analysis,42(3), 417–438.
Gelman, A. (2018, September 13). Discussion of effects of growth mindset:
Let’s not demand unrealistic effect sizes. Retrieved from
Gelman, A., Lee, D., & Guo, J. (2015). Stan: A probabilistic programming
language for Bayesian inference and optimization. Journal of Educa-
tional and Behavioral Statistics,40(5), 530–543.
Gopalan, M., & Yeager, D. S. (2020). How does adopting a growth mindset
improve academic performance? Probing the underlying mechanisms in a
nationally representative sample [OSF Preprints].
Grant, H., & Dweck, C. S. (2003). Clarifying achievement goals and their
impact. Journal of Personality and Social Psychology, 85(3), 541–553.
Hattie, J., Biggs, J., & Purdie, N. (1996). Effects of learning skills inter-
ventions on student learning: A meta-analysis. Review of Educational
Research,66(2), 99–136.
Hong, Y., Chiu, C., Dweck, C. S., Lin, D. M.-S., & Wan, W. (1999).
Implicit theories, attributions, and coping: A meaning system approach.
Journal of Personality and Social Psychology, 77(3), 588–599.
Hulleman, C. S., Schrager, S. M., Bodmann, S. M., & Harackiewicz, J. M.
(2010). A meta-analytic review of achievement goal measures: Different
labels for the same constructs or different constructs with similar labels?
Psychological Bulletin,136(3), 422–449.
Kanopka, K., Claro, S., Loeb, S., West, M., & Fricke, H. (2020). Changes
in social-emotional learning: Examining student development over time
[Working Paper]. Policy Analysis for California Education, Stanford
University. Retrieved from
Kenny, D. A., & Judd, C. M. (2019). The unappreciated heterogeneity of
effect sizes: Implications for power, precision, planning of research, and
replication. Psychological Methods,24(5), 578–589.
Kraft, M. A. (2020). Interpreting effect sizes of education interventions.
Educational Researcher,49(4), 241–253.
Krueger, A. B. (2003). Economic considerations and class size. Economic
Journal,113(485), F34–F63.
Leslie, S.-J., Cimpian, A., Meyer, M., & Freeland, E. (2015). Expectations
of brilliance underlie gender distributions across academic disciplines.
Science,347(6219), 262–265.
Li, Y., & Bates, T. C. (2019). You can’t change your basic ability, but you
work at things, and that’s how we get hard things done: Testing the role
of growth mindset on response to setbacks, educational attainment, and
cognitive ability. Journal of Experimental Psychology: General,148(9),
Lipsey, M. W., Puzio, K., Yun, C., Hebert, M. A., Steinka-Fry, K., Cole,
M. W., Roberts, M., Anthony, K. S., & Busick, M. D. (2012). Trans-
lating the statistical representation of the effects of education interven-
tions into more readily interpretable forms (NCSER 2013–3000). Na-
tional Center for Special Education Research. Retrieved from https://ies
Macnamara, B. (2018). Schools are buying “growth mindset” interventions
despite scant evidence that they work well. The Conversation. Retrieved
McShane, B. B., Tackett, J. L., Böckenholt, U., & Gelman, A. (2019).
Large-scale replication projects in contemporary psychological research.
The American Statistician,73(Suppl. 1), 99–105.
Miele, D. B., & Molden, D. C. (2010). Naive theories of intelligence and
the role of processing fluency in perceived comprehension. Journal of
Experimental Psychology: General,139(3), 535–557.
Molden, D. C., & Dweck, C. S. (2006). Finding “meaning” in psychology:
A lay theories approach to self-regulation, social perception, and social
development. American Psychologist,61(3), 192–203.
Mueller, C. M., & Dweck, C. S. (1998). Praise for intelligence can
undermine children’s motivation and performance. Journal of Person-
ality and Social Psychology,75(1), 33–52.
Muenks, K., Canning, E. A., LaCosse, J., Green, D. J., Zirkel, S., Garcia,
J. A., & Murphy, M. C. (2020). Does my professor think my ability can
change? Students’ perceptions of their STEM professors’ mindset be-
liefs predict their psychological vulnerability, engagement, and perfor-
mance in class. Journal of Experimental Psychology: General,149,
Nelson, L. D., Simmons, J., & Simonsohn, U. (2018). Psychology’s
renaissance. Annual Review of Psychology, 69(1), 511–534.
Nussbaum, A. D., & Dweck, C. S. (2008). Defensiveness versus remediation:
Self-theories and modes of self-esteem maintenance. Personality and
Social Psychology Bulletin, 34(5), 599–612.
Nye, B., Hedges, L. V., & Konstantopoulos, S. (2000). The effects of small
classes on academic achievement: The results of the Tennessee class size
experiment. American Educational Research Journal,37(1), 123–151.
OECD. (2019). PISA 2018 results (Volume III): What school life means for
students’ lives. PISA, OECD Publishing.
Pashler, H., & de Ruiter, J. P. (2017). Taking responsibility for our field’s
reputation. APS Observer. Retrieved from https://www.psychological
Paunesku, D., Walton, G. M., Romero, C., Smith, E. N., Yeager, D. S., &
Dweck, C. S. (2015). Mind-set interventions are a scalable treatment for
academic underachievement. Psychological Science,26(6), 784–793.
Qin, X., Wormington, S., Guzman-Alvarez, A., & Wang, M.-T. (2020).
Why does a growth mindset intervention impact achievement differently
across secondary schools? Unpacking the causal mediation mechanism
from a national multisite randomized experiment [OSF Preprints].
Rege, M., Hanselman, P., Ingeborg, F. S., Dweck, C. S., Ludvigsen, S.,
Bettinger, E., Muller, C., Walton, G. M., Duckworth, A. L., & Yeager,
D. S. (in press). How can we inspire nations of learners? Investigating
growth mindset and challenge-seeking in two countries. American Psychologist.
Rienzo, C., Rolfe, H., & Wilkinson, D. (2015). Changing mindsets: Eval-
uation report and executive summary. Education Endowment Foundation.
Robins, R. W., & Pals, J. L. (2002). Implicit self-theories in the academic
domain: Implications for goal orientation, attributions, affect, and self-
esteem change. Self and Identity,1(4), 313–336.
Schleider, J. L., Abel, M. R., & Weisz, J. R. (2015). Implicit theories and
youth mental health problems: A random-effects meta-analysis. Clinical
Psychology Review,35, 1–9.
Schroder, H. S., Dawood, S., Yalch, M. M., Donnellan, M. B., & Moser,
J. S. (2016). Evaluating the domain specificity of mental health–related
mind-sets. Social Psychological and Personality Science,7(6), 508
Seo, E., Lee, H. Y., Jamieson, J. P., Reis, H. T., Beevers, C. G., & Yeager,
D. S. (in press).Trait attributions and threat appraisals explain the rela-
tion between implicit theories of personality and internalizing symptoms
during adolescence. Development and Psychopathology.
Shpancer, N. (2020, July 29). Life cannot be hacked. Psychology Today.
Retrieved from
Simonsohn, U. (2015). Small telescopes: Detectability and the evaluation
of replication results. Psychological Science, 26(5), 559–569.
Sisk, V. F., Burgoyne, A. P., Sun, J., Butler, J. L., & Macnamara, B. N.
(2018). To what extent and under which circumstances are growth
mind-sets important to academic achievement? Two meta-analyses. Psy-
chological Science,29(4), 549–571.
Slavin, R. E. (1986). Best-evidence synthesis: An alternative to meta-
analytic and traditional reviews. Educational Researcher,15(9), 5–11.
The World Bank. (2017, April 25). If you think you can get smarter, you
will. Retrieved from
Tipton, E., Yeager, D. S., Iachan, R., & Schneider, B. (2019). Designing
probability samples to study treatment effect heterogeneity. In P. J.
Lavrakas (Ed.), Experimental methods in survey research: Techniques
that combine random sampling with random assignment (pp. 435–456).
TNTP. (2015). The mirage: Confronting the hard truth about our quest for
teacher development. Retrieved from
Vivalt, E. (2020). How much can we generalize from impact evaluations?
Journal of the European Economic Association. Advance online publication.
Walton, G. M., & Yeager, D. S. (2020). Seed and soil: Psychological
affordances in contexts help to explain where wise interventions succeed
or fail. Current Directions in Psychological Science,29, 219–226.
Warne, R. (2020, January 3). The one variable that makes growth mindset
interventions work. Retrieved from
Weiner, B. (1985). An attributional theory of emotion and motivation.
Psychological Review,92(4), 548–573.
Weiner, B., & Kukla, A. (1970). An attributional analysis of achievement
motivation. Journal of Personality and Social Psychology,15(1), 1–20.
West, M. R., Buckley, K., Krachman, S. B., & Bookman, N. (2018).
Development and implementation of student social-emotional surveys in
the CORE Districts. Journal of Applied Developmental Psychology,55,
Yeager, D. S. (2019). The National Study of Learning Mindsets [United
States], 2015–2016. Inter-university Consortium for Political and
Social Research.
Yeager, D. S., Carroll, J. M., Buontempo, J., Cimpian, A., Woody, S.,
Crosnoe, R., Muller, C., Murray, J., Mhatre, P., Kersting, N., Hulleman,
C., Kudym, M., Murphy, M., Duckworth, A. L., Walton, G., & Dweck,
C. S. (2020). Teacher mindsets help explain where a growth mindset
intervention does and doesn’t work [Manuscript submitted for publication].
Yeager, D. S., Dahl, R. E., & Dweck, C. S. (2018). Why interventions to
influence adolescent behavior often fail but could succeed. Perspectives
on Psychological Science,13(1), 101–122.
Yeager, D. S., & Dweck, C. S. (2012). Mindsets that promote resilience:
When students believe that personal characteristics can be developed.
Educational Psychologist,47(4), 302–314.
Yeager, D. S., Fong, C. J., Lee, H. Y., & Espelage, D. L. (2015). Declines
in efficacy of anti-bullying programs among older adolescents: Theory
and a three-level meta-analysis. Journal of Applied Developmental Psy-
chology,37, 36–51.
Yeager, D. S., Hanselman, P., Muller, C. L., & Crosnoe, R. (2019).
Mindset x Context Theory: How agency and structure interact to shape
human development and social inequality [Working Paper]. University
of Texas at Austin.
Yeager, D. S., Hanselman, P., Walton, G. M., Murray, J. S., Crosnoe, R.,
Muller, C., Tipton, E., Schneider, B., Hulleman, C. S., Hinojosa, C. P.,
Paunesku, D., Romero, C., Flint, K., Roberts, A., Trott, J., Iachan, R.,
Buontempo, J., Yang, S. M., Carvalho, C. M., . . . Dweck, C. S. (2019).
A national experiment reveals where a growth mindset improves
achievement. Nature,573(7774), 364–369.
Yeager, D. S., Romero, C., Paunesku, D., Hulleman, C. S., Schneider, B.,
Hinojosa, C., Lee, H. Y., O’Brien, J., Flint, K., Roberts, A., Trott, J.,
Greene, D., Walton, G. M., & Dweck, C. S. (2016). Using design
thinking to improve psychological interventions: The case of the growth
mindset during the transition to high school. Journal of Educational
Psychology,108(3), 374–391.
Yeager, D. S., Trzesniewski, K. H., & Dweck, C. S. (2013). An implicit
theories of personality intervention reduces adolescent aggression in
response to victimization and exclusion. Child Development,84(3),
Yeager, D. S., Walton, G. M., Brady, S. T., Akcinar, E. N., Paunesku, D.,
Keane, L., Kamentz, D., Ritter, G., Duckworth, A. L., Urstein, R., Gomez,
E. M., Markus, H. R., Cohen, G. L., & Dweck, C. S. (2016). Teaching a lay
theory before college narrows achievement gaps at scale. Proceed-
ings of the National Academy of Sciences of the United States of
America,113(24), E3341–E3348.
Zhu, P., Garcia, I., & Alonzo, E. (2019). An independent evaluation of
growth mindset intervention. MDRC. Retrieved from https://files.eric

Supplementary resource (1)

... Students' growth mindsets-how malleable they perceive their abilities to be-play a pivotal role in their academic achievement (Dweck, 2006;Yeager and Dweck, 2020) and have been linked to multiple positive outcomes, including improved mental health (Schroder et al., 2017b), decreased stress (Burnette et al., 2020a) and even brain activity (Schroder et al., 2014). The extent to which and how growth mindset modulates one's achievement has been descriptively examined but rarely systematically explored (but see Cheng et al., 2021 for a review). ...
... Mindset theory (Dweck, 1946;Dweck and Leggett, 1988) was originally derived from attribution theory and achievement goal theory in motivational research. Specifically, mindset effects are meaningfully heterogeneous across persons and situations (see Yeager and Dweck, 2020 for a discussion). Mindsets are typically evaluated by gauging individuals' agreement or disagreement with some statements on his or her capability to change or thrive in the face of challenges. ...
... A growth mindset intervention explicitly teaches that individual's academic and intellectual abilities have the potential to develop and grow. A growth mindset intervention encompass the instructional delivery and/or discussion of potential concrete actions and strategies a learner can take to implement a growth mindset (Yeager and Dweck, 2020). ...
Full-text available
Students, staff, and faculty in higher education are facing unprecedented challenges due to the COVID-19 pandemic. Recent data revealed that a good number of academic activities and opportunities were disrupted as a result of the COVID-19 pandemic and its variants. While much uncertainty remains for the next academic year, how higher education institutions and their students might improve responses to the rapidly changing situation matters. This systematic review and framework proposal aim to update previous empirical work and examine the current evidence for the effectiveness of growth mindset interventions in young adults. To this end, a systematic search identified 20 empirical studies involving 5, 805 young adults. These studies examined growth mindset within ecologically valid educational contexts and various content areas. Generally, these findings showed that brief messages of growth mindset can improve underrepresented students' academic performance and facilitate other relevant psychological constructs. In addition, we argue, although growth mindset has been identified as a unitary concept, it is comprised of multiple interdependent skills, such as self-control, self-efficacy, and self-esteem. Understanding the nature of growth mindset may contribute to successful mindset implementation. Therefore, this article presents a practical framework to help educators in higher education rethink the multidimensionality of growth mindset and to provide their students with alternative routes to achieve their goals. Finally, additional articles were discussed to help evaluate growth mindset interventions in higher education.
... In general, people with a fixed mindset believe that intelligence is innate. They attribute success to being smart and tend to avoid challenges or often fail to meet their potential (Dweck, 2009(Dweck, , 2017Yeager and Dweck, 2020). Conversely, people with a growth mindset believe that intelligence is changeable (Dweck, 2009(Dweck, , 2017Yeager and Dweck, 2020). ...
... They attribute success to being smart and tend to avoid challenges or often fail to meet their potential (Dweck, 2009(Dweck, , 2017Yeager and Dweck, 2020). Conversely, people with a growth mindset believe that intelligence is changeable (Dweck, 2009(Dweck, , 2017Yeager and Dweck, 2020). Previous studies have demonstrated that students' intelligence mindsets are correlated with their math learning. ...
... In contrast, individuals with a growth mindset believe that ability can be changed through efforts, and failure is an opportunity to help them develop their abilities. Therefore, people with a growth mindset are more likely to view failure as enhancing their abilities (Dweck, 2009(Dweck, , 2017Yeager and Dweck, 2020). It can be seen that an individual's failure beliefs reflect the individual's intelligence mindset in a failure situation. ...
Full-text available
This study aimed to examine the relationship between parents' intelligence mindset and children's math anxiety and the mediating role of parents' failure mindset and evaluations of their child's math performance. A total of 419 Chinese students (196 boys and 223 girls) and their parents were recruited to complete a series of questionnaires on topics such as math anxiety, parent's failure mindset, parent's intelligence mindset, and parents' evaluations of their child's mathematical performance. The results revealed that parents' intelligence mindset was not correlated with children's math anxiety. However, parents' intelligence mindset indirectly predicted children's math anxiety through the chain-mediated role of parents' failure beliefs and parents' evaluations of their child's math performance. Further, sex differences were found through a multigroup analysis, which showed a chain-mediated effect between parents' intelligence mindset and girls' math anxiety.
... Based on these findings, growth mindset interventions have been developed in educational settings to boost students' cognitive abilities and learning outcomes (e.g., Yeager et al., 2019). These effects are particularly strong for students facing adversity, such as a lack of resources from low socioeconomic status, students struggling academically, and students of ethnic-racially marginalized groups (Blackwell et al., 2007;Yeager & Dweck, 2020;Yeager et al., 2019). ...
... Both level of education and retirement status were significant predictors of growth mindset level. The finding that individuals with higher levels of education had higher levels of growth mindset is consistent with past literature showing an association between growth mindset and educational attainment (Blackwell et al., 2007;Yeager & Dweck, 2020). It may be the case that the more one learns, the more one is likely to espouse a growth mindset. ...
... It might also be possible that stronger, continuous reinforcement of growth mindset is necessary for behavioral transfer, as other mindset interventions with children have included more frequent and intensive instruction (Blackwell et al., 2007;Yeager et al., 2019). In their recent review, Yeager and Dweck (2020) argue that teaching learners the definition of a growth mindset is not sufficient to stimulate behavior change. Instead, effective growth mindset interventions should teach students what it means to have a growth mindset while also providing autonomy supportive, detailed instruction on how to put their growth mindsets into practice (e.g., embracing challenging material, trying new strategies, and asking for help). ...
Growth mindset (belief in the malleability of intelligence) is a unique predictor of young learners' increased motivation and learning, and may have broader implications for cognitive functioning. Its role in learning in older adulthood is unclear. As part of a larger longitudinal study, we examined growth mindset and cognitive functioning in older adults engaged in a 3-month multi-skill learning intervention that included growth mindset discussions. Before, during, and after the intervention, participants reported on their growth mindset beliefs and completed a cognitive battery. Study 1 indicated that intervention participants, but not control participants, increased their growth mindset during the intervention. Study 2 replicated these results and found that older adults with higher preexisting growth mindsets showed larger cognitive gains at posttest compared to those with lower preexisting growth mindsets. Our findings highlight the potential role of growth mindset in supporting positive learning cycles for cognitive gains in older adulthood.
... Mindset theory, which originates from educational psychology, has attracted considerable interest of the researchers due to its positive effect on students' motivation and achievement (Yeager and Dweck, 2020;Xu et al., 2021). Dweck (2006) suggested that individual mindsets can be divided into two categories, namely, growth mindset, and fixed mindset. ...
... Dweck (2006) suggested that individual mindsets can be divided into two categories, namely, growth mindset, and fixed mindset. Individuals with a growth mindset believe that their attributes such as intelligence are malleable, whereas individuals endorsing a fixed mindset believe that their attributes are stable (Yeager and Dweck, 2020). Many studies have indicated that people with a growth mindset are more likely to learn from their mistakes and reach higher levels of learning performance and achievement than people with a fixed mindset (e.g., Asbury et al., 2016;Bostwick and Becker-Blease, 2018;Yeager and Dweck, 2020). ...
... Individuals with a growth mindset believe that their attributes such as intelligence are malleable, whereas individuals endorsing a fixed mindset believe that their attributes are stable (Yeager and Dweck, 2020). Many studies have indicated that people with a growth mindset are more likely to learn from their mistakes and reach higher levels of learning performance and achievement than people with a fixed mindset (e.g., Asbury et al., 2016;Bostwick and Becker-Blease, 2018;Yeager and Dweck, 2020). In addition, previous research has also found that employee growth mindset contributes to improved employee engagement (Keating and Heslin, 2015), task performance, job satisfaction, and organizational citizenship behavior (Han and Stieha, 2020). ...
Full-text available
This study aimed to investigate the relationship of employee growth mindset with innovative behavior and the mediating role of use of strength as well as the moderating role of strengths-based leadership in this relationship. Data with a sample of 244 employees working in diverse Chinese organizations were collected at two points in time. Results of bootstrapping analyses demonstrated that growth mindset is positively related to innovative behavior, employee strengths use partially mediates the positive relationship of growth mindset with innovative behavior, and strengths-based leadership strengthens the direct relationship between employee growth mindset and innovative behavior and the indirect relationship of employee growth mindset with innovative behavior via strengths use. This study advances growth mindset and innovative behavior theories and research.
... Second, the situational and dynamic nature of 26 NETWORK APPROACH IN SEVT SEVT suggests that future intervention studies should consider situations in their design, implementation, and analysis. Interventions from intelligence beliefs also suggest that a one-sizefits-all intervention is hard to achieve, and is probably not the most effective (Yeager & Dweck, 2020). Third, despite the situative nature, we also find some consistent patterns across situations. ...
Full-text available
In their recently renamed theory, Situated Expectancy-Value Theory (SEVT), Eccles and Wigfield (2020) emphasized the importance of situations in influencing individuals’ motivational beliefs and academic choices. Adopting a novel approach—network analysis, this study aimed to examine how situations may impact the associations among expectancies, subjective task values, and achievement from a holistic perspective. In this study, situations were operationalized as grade levels (i.e., 6th -9th grade), subject domains (i.e., language arts and math), and countries (i.e., Finland and Germany). Adolescents from Finland (N = 4,062) and Germany (N = 449) were included in the study. Results showed that, overall, the networks are mostly subject-bound, yearly-varied, and country-specific, supporting the situative nature of SEVT. We also found that expectancies were consistently the closest motivational beliefs to achievement, whereas utility values were the least close ones, implying that expectancies, not utility, might be the most desirable intervention targets for achievement improvement.
... Despite the controversy over whether the growth mindset influences students' achievement (as explained in the introduction), it is important to note that the growth mindset is a theory of responses to challenges (Yeager & Dweck, 2020). To illustrate, some students might fail the national exam to enter public universities, but with a growth mindset they will not easily give up. ...
Growth mindset is one of the determinants of a person's success in study and career, because people who have a growth mindset show perseverance, enjoy challenges, and believe that their abilities will improve if they practice persistently. This quantitative study explores the growth mindset of undergraduate students in education programs using descriptive analysis and compares the growth mindset of students from private and public universities using an independent-samples t-test. There were 452 undergraduate students from various universities in South Sulawesi participating: 215 students from private universities and 237 students from public universities. The results showed that the growth mindset of undergraduate students in education programs was high in both private and public universities. However, there was a difference between the two: the growth mindset of undergraduate students at public universities was higher than that at private universities.
Researchers investigated the impact of a professional learning intervention focused on teaching teachers to increase rigor, challenge, and engagement to reveal talent in low-income learners. The professional learning intervention’s goal, to improve teachers’ ability to recognize student ability and talent through use of proven high-level curricular and instructional strategies, focused teacher learning on culturally responsive teaching, fundamental principles of learning, and specific curriculum. Researchers hypothesized teacher perceptions of students’ abilities would shift from a deficit view to a strengths-based one. Results indicated that teacher beliefs were positively impacted after one year and that those impacts leveled off over time. Specifically, teachers’ perceptions of student potential and the importance of talent development improved. Engagement in professional learning predicted positive change in classroom support, organization, and instruction. Finally, the professional learning intervention positively impacted teacher efficacy related to engaging, instructing, and managing learners.
Students typically have varying beliefs regarding the changeability of their own abilities in mathematics learning. A growth mindset is the belief that mathematics abilities can be developed, whereas a fixed mindset is the belief that mathematics abilities are unchangeable. Recent studies have highlighted that mindset beliefs regarding mathematics learning vary across cultures. This review summarizes cultural variations in students’ self-reported mindset beliefs, and how culture influences the roles of mindset beliefs in mathematics-learning outcomes, the development of mindset beliefs, and the effects of mindset interventions. Finally, we propose that future research should consider culture-specific factors in the development and measurement of students’ growth mindsets.
A growth-mindset intervention teaches the belief that intellectual abilities can be developed. Where does the intervention work best? Prior research examined school-level moderators using data from the National Study of Learning Mindsets (NSLM), which delivered a short growth-mindset intervention during the first year of high school. In the present research, we used data from the NSLM to examine moderation by teachers’ mindsets and answer a new question: Can students independently implement their growth mindsets in virtually any classroom culture, or must students’ growth mindsets be supported by their teacher’s own growth mindsets (i.e., the mindset-plus-supportive-context hypothesis)? The present analysis (9,167 student records matched with 223 math teachers) supported the latter hypothesis. This result stood up to potentially confounding teacher factors and to a conservative Bayesian analysis. Thus, sustaining growth-mindset effects may require contextual supports that allow the proffered beliefs to take root and flourish.
Adolescents who hold an entity theory of personality—the belief that people cannot change—are more likely to report internalizing symptoms during the socially stressful transition to high school. It has been puzzling, however, why a cognitive belief about the potential for change predicts symptoms of an affective disorder. The present research integrated three models—implicit theories, hopelessness theories of depression, and the biopsychosocial model of challenge and threat—to shed light on this issue. Study 1 replicated the link between an entity theory and internalizing symptoms by synthesizing multiple datasets (N = 6,910). Study 2 examined potential mechanisms underlying this link using 8-month longitudinal data and 10-day diary reports during the stressful first year of high school (N = 533, 4,255 daily reports). The results showed that an entity theory of personality predicted increases in internalizing symptoms through tendencies to make fixed trait causal attributions about the self and maladaptive (i.e., “threat”) stress appraisals. The findings support an integrative model whereby situation-general beliefs accumulate negative consequences for psychopathology via situation-specific attributions and appraisals.
Two experiments and two field studies examine how college students' perceptions of their science, technology, engineering, and mathematics (STEM) professors' mindset beliefs about the fixedness or malleability of intelligence predict students' anticipated and actual psychological experiences and performance in their STEM classes, as well as their engagement and interest in STEM more broadly. In Studies 1 (N = 252) and 2 (N = 224), faculty mindset beliefs were experimentally manipulated and students were exposed to STEM professors who endorsed either fixed or growth mindset beliefs. In Studies 3 (N = 291) and 4 (N = 902), we examined students' perceptions of their actual STEM professors' mindset beliefs and used experience sampling methodology (ESM) to capture their in-the-moment psychological experiences in those professors' classes. Across all studies, we find that students who perceive that their professor endorses more fixed mindset beliefs anticipate (Studies 1 and 2) and actually experience (Studies 3 and 4) more psychological vulnerability in those professors' classes; specifically, they report less belonging in class, greater evaluative concerns, greater imposter feelings, and greater negative affect. We also find that in-the-moment experiences of psychological vulnerability have downstream consequences. Students who perceive that their STEM professors endorse more fixed mindset beliefs experience greater psychological vulnerability in those professors' classes, which in turn predicts greater dropout intentions, lower class attendance, less class engagement, less end-of-semester interest in STEM, and lower grades. These findings contribute to our understanding of how students' perceptions of professors' mindsets can serve as a situational cue that affects students' motivation, engagement, and performance in STEM.
The Changing Mindsets project aimed to improve attainment outcomes at the end of primary school by teaching Year 6 pupils that their brain potential was not a fixed entity but could grow and change through effort exerted. The programme, delivered by Portsmouth University, taught pupils about the malleability of intelligence through workshops. Teachers attended short professional development courses on approaches to developing a ‘growth mindset’ before delivering sessions to pupils weekly, over eight consecutive weeks. Teachers were encouraged to embed aspects of the growth mindset approach throughout their teaching—for example, when giving feedback outside of the sessions. They were also given access to digital classroom resources, such as a video case study of Darwin overcoming adversity in his own life, as a practical example of the importance of having a growth mindset. The project was a randomised controlled trial (RCT) and included 101 schools and 5018 pupils across England, assigned to either intervention or control groups. The trial ran from September 2016 to February 2017. The process evaluation involved interviews with teachers, focus groups with pupils receiving the intervention, lesson observations, and surveys of both treatment schools and control groups throughout the course of the intervention.
Mind-set refers to people's beliefs about whether attributes are malleable (growth mind-set) or unchangeable (fixed mind-set). Proponents of mind-set theory have made bold claims about mind-set's importance. For example, one's mind-set is described as having profound effects on one's motivation and achievements, creating different psychological worlds for people, and forming the core of people's meaning systems. We examined the evidentiary strength of six key premises of mind-set theory in 438 participants; we reasoned that strongly worded claims should be supported by equally strong evidence. However, no support was found for most premises. All associations (rs) were significantly weaker than .20. Other achievement-motivation constructs, such as self-efficacy and need for achievement, have been found to correlate much more strongly with presumed associates of mind-set. The strongest association with mind-set (r = −.12) was opposite from the predicted direction. The results suggest that the foundations of mind-set theory are not firm and that bold claims about mind-set appear to be overstated.
Here we evaluate the potential for growth mindset interventions (which teach students that intellectual abilities can be developed) to inspire adolescents to be “learners”—that is, to seek out challenging learning experiences. In a previous analysis, the U.S. National Study of Learning Mindsets (NSLM) showed that a growth mindset could improve the grades of lower-achieving adolescents, and, in an exploratory analysis, increase enrollment in advanced math courses across achievement levels. Yet the importance of being a “learner” in today’s global economy requires clarification and replication of potential challenge-seeking effects, as well as an investigation of the school affordances that make intervention effects on challenge-seeking possible. To this end, the present paper presents new analyses of the U.S. NSLM (N = 14,472) to (a) validate a standardized, behavioral measure of challenge-seeking (the “make-a-math worksheet” task), and (b) show that the growth mindset treatment increased challenge-seeking on this task. Second, a new experiment conducted with nearly all schools in two counties in Norway, the U-say experiment (N = 6,541), replicated the effects of the growth mindset intervention on the behavioral challenge-seeking task and on increased advanced math course-enrollment rates. Treated students took (and subsequently passed) advanced math at a higher rate. Critically, the U-say experiment provided the first direct evidence that a structural factor—school policies governing when and how students opt in to advanced math—can afford students the possibility of profiting from a growth mindset intervention or not. These results highlight the importance of motivational research that goes beyond grades or performance alone and focuses on challenge-seeking. The findings also call attention to the affordances of school contexts that interact with student motivation to promote better achievement and economic trajectories.
The growth mindset or the belief that intelligence is malleable has garnered significant attention for its positive association with academic success. Several recent randomized trials, including the National Study of Learning Mindsets (NSLM), have been conducted to understand why, for whom, and under what contexts a growth mindset intervention can promote beneficial achievement outcomes during critical educational transitions. Prior research suggests that the NSLM intervention was particularly effective in improving low-achieving 9th graders’ GPA, while the impact varied across schools. In this study, we investigated the underlying causal mediation mechanism that might explain this impact and how the mechanism varied across different types of schools. By extending a recently developed weighting method for multisite causal mediation analysis, the analysis enhances the external and internal validity of the results. We found that challenge-seeking behavior played a significant mediating role, only in medium-achieving schools, which may partly explain the reason why the intervention worked differently across schools. We conclude by discussing implications for designing interventions that not only promote students’ growth mindsets but also foster supportive learning environments under different school contexts.
Impact evaluations can help to inform policy decisions, but they are rooted in particular contexts and to what extent they generalize is an open question. I exploit a new data set of impact evaluation results and find a large amount of effect heterogeneity. Effect sizes vary systematically with study characteristics, with government-implemented programs having smaller effect sizes than academic or non-governmental organization-implemented programs, even controlling for sample size. I show that treatment effect heterogeneity can be appreciably reduced by taking study characteristics into account.
This is one of the first evaluations of a “growth-mindset” intervention at scale in a developing country. I randomly assigned 202 public secondary schools in Salta, Argentina, to a treatment group in which Grade 12 students were asked to read about the malleability of intelligence, write a letter to a classmate, and post their letters in their classroom, or to a control group. The intervention was implemented as intended. Yet, I find no evidence that it affected students’ propensity to find tasks less intimidating, school climate, school performance, achievement, or post-secondary plans. I rule out small effects and find little evidence of heterogeneity. This study suggests that the intervention may be more challenging to replicate and scale than anticipated.
We investigated whether growth mindsets, the belief in the malleable nature of human attributes, are negatively related to psychological distress and positively related to treatment value and active coping. In the meta-analysis, we included articles published between 1988 and 2019, written in English, that reported on mindsets as well as a qualifying dependent variable and included the information required to calculate an effect size. Using a random-effects approach, meta-analytic results (k = 72 samples, N = 17,692) demonstrated that mindsets relate, albeit with minimal effects, to distress, treatment, and coping. Specifically, there was a negative relation between growth mindsets and psychological distress (r = −0.220), a positive relation between growth mindsets and treatment value (r = 0.137), and a positive relation between growth mindsets and active coping (r = 0.207). Differences in mindset domain, assessment method of mindsets, and timing of assessments moderated effects. There were no differences based on the operationalization of the psychological distress outcome or on sample characteristics (i.e., developmental stage, diagnostic status, ethnicity). We discuss theoretical and practical applications of the findings.