Using Design Thinking to Make
Psychological Interventions Ready for Scaling:
The Case of the Growth Mindset During the Transition to High School
David Yeager
University of Texas at Austin
Carissa Romero
Dave Paunesku
PERTS and Stanford University
Christopher Hulleman
University of Virginia
Barbara Schneider
Michigan State University
Cintia Hinojosa
Hae Yeon Lee
Joseph O’Brien
University of Texas at Austin
Kate Flint
Alice Roberts
Jill Trott
ICF International
Daniel Greene
Gregory M. Walton
Carol Dweck
Stanford University
September 7, 2015
Author Note
The authors would like to thank the teachers, principals, administrators, and students who
participated in this research. This research was supported by generous funding from the Spencer
Foundation, the William T. Grant Foundation, the Bezos Foundation, the Houston Endowment,
the Character Lab, the President and Dean of Humanities and Social Sciences at Stanford
University, Angela Duckworth, a William T. Grant scholars award, and a fellowship from the
Center for Advanced Study in the Behavioral Sciences (CASBS) to the first and fourth authors.
The authors are grateful to Angela Duckworth, Elizabeth Tipton, Michael Weiss, Tim Wilson,
Robert Crosnoe, Chandra Muller, Ronald Ferguson, Ellen Konar, Elliott Whitney, Paul
Hanselman, Jeff Kosovich, Andrew Sokatch, Katharine Meyer, Patricia Chen, Chris Macrander,
Jacquie Beaubien, and Rachel Herter for their assistance. Please address correspondence to
David Yeager or Gregory Walton.
Design Thinking for Mindset Interventions
There are many promising psychological interventions on the horizon, but there is no clear
methodology for getting them ready to be scaled up. This not only limits their policy impact, but
also invites ineffective applications or misguided replications. Drawing on design thinking, the
present research formalizes a methodology for redesigning initial interventions so that they can
address common challenges for a defined group of individuals—here, fixed versus growth
mindsets during the transition to high school. Specifically, qualitative inquiry and rapid,
iterative, randomized “A/B” experiments were conducted with ~3,000 participants to inform
intervention revisions. Next, two experimental evaluations showed that the revised growth
mindset intervention was an improvement over previous versions in terms of short-term proxy
outcomes (Study 1, N=7,501), and it improved 9th grade core-course GPA when delivered under
routine conditions with >95% of students at 10 schools (Study 2, N=3,676). For instance, the
revised growth mindset intervention led to 4 percentage points fewer students earning D/F
averages compared to controls, and, for previously low-achieving students, 7 percentage points
fewer. This research provides insights into how to better teach a growth mindset; it shows that
psychological intervention effects are replicable in trials relevant to questions of educational
improvement; and it provides a model for how to iteratively improve and scale interventions to
address policy problems.
Keywords: motivation, psychological intervention, incremental theory of intelligence,
growth mindset, adolescence.
Using Design Thinking to Make
Psychological Interventions Ready for Scaling:
The Case of the Growth Mindset During the Transition to High School
One of the most promising developments in educational psychology in recent years has
been the finding that self-administered psychological interventions can initiate lasting
improvements in student achievement (Cohen & Sherman, 2014; Garcia & Cohen, 2012;
Walton, 2014; Wilson, 2012; Yeager & Walton, 2011). These interventions do not provide new
instructional materials or pedagogies. Instead, they emulate the practices of expert teachers (see
Lepper & Woolverton, 2001; Treisman, 1992) by addressing students’ subjective construals of
themselves and school—how students view their abilities, their experiences in school, their
relationships with peers and teachers, and their learning tasks (see Ross & Nisbett, 1991). Such
construals can powerfully shape motivation to learn.
For instance, when students are led to construe their learning situation as one in which
they have the potential to develop their abilities (Dweck, 1999; Dweck, 2006), in which they feel
psychologically safe and connected to others (Cohen et al., 2006; Stephens, Hamedani, & Destin,
2014; Walton & Cohen, 2007), and in which putting forth effort has meaning and value
(Hulleman & Harackiewicz, 2009; Yeager et al., 2014; see Eccles & Wigfield, 2002), they are
more likely to be motivated to learn (see Elliot & Dweck, 2005; Lepper, Woolverton, Mumme,
& Gurtner, 1993; Stipek, 2002). Such subjective construals—and interventions or teacher
practices that affect them—can affect behavior over time because they can become self-
confirming. When students doubt their capacities in school—for example, when they see a failed
math test as evidence that they are not a “math person”—they behave in ways that can make this
true (e.g., avoiding future math challenges they might learn from). By changing initial construals
and behaviors, psychological interventions can set in motion recursive processes that alter
students’ achievement into the future (see Cohen & Sherman, 2014; Garcia & Cohen, 2012;
Walton, 2014; Yeager & Walton, 2011).
Although promising, self-administered psychological interventions have not often been
tested in ways that are directly relevant for policy and practice. Rigorous randomized trials have
been conducted with samples of students within schools—usually, students who could be
conveniently recruited. These studies have been extremely useful for rejecting the null
hypothesis in tests of novel theoretical claims (e.g., Aronson et al., 2002; Blackwell et al., 2007;
Good et al., 2003). Some studies have subsequently taken a step toward scale by advancing
methods for delivering intervention materials to large samples via the Internet without requiring
professional development (e.g., Paunesku et al., 2015; Yeager et al., 2014). However, such tests
are limited in relevance for policy and practice because they did not attempt to improve the
outcomes of an entire student body or entire sub-groups of students.
There is not currently a methodology for adapting generic materials that were effective in
initial experiments so they can be used at scale for large populations of students who are facing
particular issues at a particular point in their academic lives. We seek to develop this
methodology here. To do so, we focus on students at diverse schools who are nonetheless at a similar
developmental point and, therefore, may encounter similar challenges and may benefit from
similar changes in construals.
We test whether the tradition of “design thinking,” combined with psychological
experimentation, can facilitate the development of improved intervention materials for a given
population.1 As explained later, the policy problem we address is core academic performance of
9th graders transitioning to public high schools in the United States.
1 Previous research has shown how design thinking can improve instructional materials but not
yet psychological interventions that target students’ construals of learning situations (see
Razzouk & Shute, 2012).
The specific intervention we re-design is the growth mindset of intelligence intervention
(Aronson et al., 2002; Blackwell et al., 2007; Good et al., 2003; Paunesku et al., 2015). The
growth mindset intervention counteracts the fixed mindset, which is the belief that intelligence is
a fixed entity that cannot be changed with experience. To do this, the intervention teaches
scientific facts about the malleability of the brain, to show how intelligence can be developed. It
then uses writing assignments to help students internalize the messages (see the pilot study
methods). The growth mindset intervention aims to change students’ mindsets about intelligence,
increase their desire to take on challenges, and enhance their persistence by forestalling
attributions that academic struggles and setbacks mean they are “not smart” (Blackwell et al.,
2007, Study 1; see Burnette et al., 2013; Yeager & Dweck, 2012). These changes can result in
academic resilience (Burnette et al., 2013).
Design Thinking and Psychological Interventions
Design thinking is “problem-centered.” That is, effective design seeks to solve
predictable problems for specified user groups (Kelley & Kelley, 2013; also see Bryk, 2009;
Razzouk & Shute, 2012). To develop psychological interventions that solve defined problems,
expertise in theory is crucial. But theory alone does not help a person discover how to connect
with students facing a particular set of motivational barriers—for instance, 9th graders making the
transition to high school. Doing that, we believe, requires combining theoretical expertise with a
design-based approach (Razzouk & Shute, 2012; also see Bryk, 2009; Yeager & Walton, 2011).
Our hypothesis is that this problem-specific customization can increase the likelihood that an
intervention will be widely effective for the intended, pre-defined population.
We adapt models from two design traditions, user-centered design and A/B testing, and
combine these with theoretical expertise in psychological processes. Theories of user-centered
design were pioneered by firms such as IDEO (Kelley & Kelley, 2013; see Razzouk & Shute,
2012). The design process privileges the user’s subjective perspective—in the present case, the
9th grade students. To do so, it often employs qualitative research methods such as ethnographic
observations of people’s mundane goal pursuit in their natural habitats (Kelley & Kelley, 2013).
User-centered design also has a bias toward action. Designers test minimally viable products
early in the design phase in an effort to learn from users how to improve them (see Ries, 2011).
Applied to psychological intervention, this can help prevent running a large, costly experiment
with a treatment that has foreseeable flaws. In sum, our aim was to acquire insights about the
barriers to adoption of a growth mindset during the transition to high school as quickly as
possible, without waiting for a full-scale, long-term evaluation.
User-centered design typically does not ask users for their theories about what they desire
or would find compelling. Individuals may not have access to that information (Wilson, 2002).
However, users may be excellent reporters on what they dislike or are confused by. Thus, user-
centered designers do not ask, “What kind of software would you like?” Rather, they often show
users prototypes and let them say what seems wrong or right. Similarly, we did not ask students,
“What would make you adopt a growth mindset?” But we did ask for positive and negative
reactions to prototypes of growth mindset materials, and then used those responses to formulate
changes for the next iteration.
Qualitative user-centered design can lead to novel insights, but how would one know if
they actually led to improvements? To assess this question, we drew on a second tradition, that
of “A/B testing” (see, e.g., Kohavi & Longbotham, 2015). The logic is simple. Because it is easy
to be wrong about what will be persuasive to a 9th grader, rather than guess, test. We used the
methodology of low-cost, short-term, large-sample, random-assignment experiments to test
revisions to intervention content. Although each experiment may not, on its own, offer a
theoretical advance or a definitive conclusion, in the aggregate they may contribute to
substantially improving the fit of an intervention for a population of interest.
To show that this design process has produced materials that may be ready for scaling, at
least two conditions must be met: (1) the re-designed intervention should be more
effective for the target population than the generic intervention when examining short-term
proxy outcomes (such as behaviors that eventually lead to learning), and (2) the re-designed
intervention should address the policy-relevant aim: namely, an improvement in student
achievement when delivered to censuses of students within schools. Interestingly, although a
great deal has been written about best practices in design, we do not know of any set of
experiments that has collected both types of evidence as we did (cf. Razzouk &
Shute, 2012).
Scope of the Present Investigation
As a first test case, we redesign a growth mindset of intelligence intervention for the
transition to high school (Aronson et al., 2002; Good et al., 2003; Blackwell et al., 2007;
Paunesku et al., 2015; see Dweck, 2006; Yeager & Dweck, 2012). This is an informative case
study because (a) previous research has found that growth mindsets can predict success across
educational transitions, and previous growth mindset interventions have shown some
effectiveness (Blackwell et al., 2007); (b) there is a clearly defined psychological process model
explaining how a growth mindset relates to student performance supported by a large amount of
correlational and laboratory experimental research (see a meta-analysis by Burnette et al., 2013) ,
which informs decisions about which “proxy” measures would be most useful for shorter-term
evaluations. The growth mindset is thus a good place to start.
Our defined user group was students making the transition to high school. New 9th
graders also represent a large population of students—approximately 4 million individuals each
year (Bauman & Davis, 2013). Although entering 9th graders’ experiences can vary, there are
some common challenges: high school freshmen often take more rigorous classes than
previously and their performance can affect their chances for college; they have to form
relationships with new teachers and school staff; and they have to think more seriously about
their goals in life. Students who do not successfully complete 9th grade core courses have a
dramatically lower rate of high school graduation, and much poorer life prospects (Allensworth
& Easton, 2005). Improving the transition to high school is therefore an important policy problem.
Previous growth mindset interventions may be improved for the transition to high school
in several ways. First, past growth mindset interventions were not tailored for the specific
challenges that occur in the transition to high school. Rather they were written to address
challenges in learning in any adolescent or adult learning context. Second, they were not written
for the vocabulary, conceptual sophistication, and interests of adolescents entering high school.
Third, they were not crafted around the arguments likely to be most relevant or persuasive for
14-15 year-olds.
Despite their generic content, prior growth mindset interventions have already been
effective in high schools. Extending earlier trials (Aronson et al., 2002; Blackwell et al., 2007;
Good et al., 2003), Paunesku et al. (2015) conducted a double-blind, randomized experimental
evaluation of a growth mindset intervention with over 1,500 high school students of all grade
levels via the Internet. They found a significant Treatment × Prior achievement interaction, such
that lower-performing students benefitted most from the growth mindset treatment in terms of
GPA. Lower-achievers may both have had more room to grow (for range-restriction reasons)
and also may have faced greater concerns about their academic ability (see analogous benefits
for lower-achievers in Cohen, Garcia, Purdie-Vaughns, Apfel, & Brzustoski, 2009; Hulleman &
Harackiewicz, 2009; Wilson & Linville, 1982; Yeager, Henderson, et al., 2014). Paunesku et al.
found that previously low-achieving treated students were also less likely to receive “D” and “F”
grades in core classes (e.g., English, math, science). We examine whether these results could be
replicated and improved upon.
The Present Research
The present research first involved design thinking, presented as a pilot study. Next,
Study 1 tested whether the design process described here produced growth mindset materials that
were an improvement over original materials when examining proxy outcomes, such as beliefs,
goals, attributions, and challenge-seeking behavior. The study required a great deal of power to
detect a significant treatment contrast because we expected that the “original” generic
intervention would also teach a growth mindset (as it had before; Paunesku et al., 2015).
Study 2 tested whether we had, in fact, developed a revised growth mindset intervention
that was effective at changing student achievement when delivered to a census (>95% of
students) in 10 different schools across the country. The focal research hypothesis was an effect
on core course GPA and D/F averages among previously low-performing students, which would
replicate the Treatment × Prior achievement interaction found by Paunesku et al. (2015).
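The focal moderation hypothesis corresponds to a regression with a Treatment × Prior achievement product term. The sketch below is purely illustrative: the variable names and data are invented here, and the pre-registered analysis may differ in covariates and estimation details.

```python
# Illustrative sketch of the Treatment x Prior-achievement interaction
# model: 9th grade GPA regressed on condition, prior achievement, and
# their product. All data here are simulated for demonstration only.
import numpy as np

rng = np.random.default_rng(1)
n = 3676
treat = rng.integers(0, 2, n)        # 1 = growth mindset condition
prior = rng.normal(0, 1, n)          # standardized prior achievement
# Simulated outcome: the treatment helps more at lower prior achievement
gpa = (2.8 + 0.5 * prior
       + treat * (0.10 - 0.08 * prior)
       + rng.normal(0, 0.5, n))

X = np.column_stack([np.ones(n), treat, prior, treat * prior])
beta, *_ = np.linalg.lstsq(X, gpa, rcond=None)
# beta[1]: treatment effect at average prior achievement
# beta[3]: Treatment x Prior interaction; a negative value means a
#          larger benefit for previously lower-achieving students
print(f"treatment: {beta[1]:.2f}, interaction: {beta[3]:.2f}")
```

A negative interaction coefficient in such a model is what "lower-achieving students benefitted most" means operationally.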
Study 2 was, to our knowledge, the first pre-registered replication of a psychological
intervention effect, the first to have data collected and cleaned by an independent research firm,
and the first to employ a census of students (>95% response rates). Hence, it was a rigorous test
of the hypothesis.
Pilot: Using Design Thinking to Improve a Growth Mindset Intervention
Data. During the design phase, the goal was to learn as much as possible, as rapidly as
possible, given data that were readily available. Thus, no intentional sampling was done and no
demographic data on participants were collected. For informal qualitative data—focus groups,
one-on-one interviews, and other types of feedback—high school students were contacted
through personal connections and invited to provide feedback on a new program for high school
students. Quantitative data—the rapid “A/B” experiments—were collected from college-aged
and older adults on Amazon’s Mechanical Turk platform.
Although adults are not an ideal data source for a transition to high school intervention,
the A/B experiments required great statistical power because we were testing minor variations;
data from Mechanical Turk were readily available and allowed us to iterate quickly. Therefore
all conclusions were tempered by consulting both the qualitative user tests with high school
students and the randomized experiments with adults.
The “original” mindset treatment. The “original” generic treatment was the starting
point for our revision. It involved three elements. First, participants read a scientific article
titled “You Can Grow Your Intelligence,” written by researchers, used in the Blackwell et al.
(2007) experiment, and slightly revised for the Paunesku et al. (2015) experiments. It described
the idea that the brain can get smarter the more it is challenged, like a muscle. As scientific
background for this idea, the article explained what neurons are and how they form a network in
the brain. It then provided summaries of studies showing that animals or people (e.g., rats,
babies, or London taxi drivers) who have greater experience or learning develop denser networks
of neurons in their brains.
After reading this four-page article, participants were asked to generate a personal
example of learning and getting smarter—a time when they used to not know something, but
then they practiced and got better at it. Finally, participants were asked to author a letter
encouraging a future student who might be struggling in school and may feel “dumb” (see
Aronson et al., 2002). This is a “saying-is-believing” exercise (Aronson, 1999; Walton & Cohen, 2011;
Walton, 2014).
“Saying-is-believing” is thought to be effective for several reasons. First, it is thought to
make the information (in this case, about the brain and its ability to grow) more self-relevant,
which may make it easier to recall (Bower & Gilligan, 1979; Hulleman & Harackiewicz, 2009;
Lord, 1980). Indeed, prior research has found that students can benefit more from social-
psychological intervention materials when they author reasons why the content is relevant, as
opposed to being told why it is relevant to their own lives (Godes, Hulleman, &
Harackiewicz, 2007). Second, by mentally rehearsing how one should respond when struggling,
it can be easier to enact those thoughts or behaviors later (Gollwitzer, 1999). Third, when
students are asked to communicate the message to someone else—and not directly asked to
believe it themselves—it can feel less controlling, it can avoid implying that students are
deficient, and it can lead students to convince themselves of the truth of the proposition via
cognitive dissonance processes (see research on self-persuasion, Aronson, 1999; also see Bem,
1965; Cooper & Fazio, 1984).
Procedures and Results: Revising the Mindset Treatment
Design methodology.
User-centered design. Theories of user-centered design (Kelley & Kelley, 2013)
informed our procedures. We (a) met one-on-one with 9th graders; (b) met with groups of 2 to 10
students; and (c) piloted with groups of 20-25. In these piloting sessions we first asked students
to go through a “minimally viable” version of the treatment (i.e., early draft revisions of the
generic materials) as if they were receiving it as new freshmen in high school. We then led
them in guided discussions of what they disliked, what they liked, and what was confusing. We also
asked students to summarize the content of the message back to us, under the assumption that a
person’s inaccurate summary of a message is an indication of where the clarity could be
improved. Note that these sessions were not designed to establish causality; the
informal feedback was suggestive, not definitive. However, a consistent message from the
students allowed the research team to identify, in advance, predictable failures of the message to
connect or instruct, and potential improvements worth testing.
Through this process we developed insights that might seem minor taken individually but, in the
aggregate, may be important. These were: to include quotes from admired adults and celebrities;
to include more, and more diverse, writing exercises; to weave purposes for why one should grow
one’s brain together with statements that one could grow one’s brain; to use bullet points instead
of paragraphs; to reduce the amount of information on each page; to show actual results of past
scientific research in figures; and to change examples that appear less relevant to high school
students (e.g., replacing a study about rats growing their brains with a summary of science about
teenagers’ brains), among others.
A/B testing. A series of randomized experiments tested small variations on the mindset
intervention. In each, we randomized participants to versions of a mindset treatment and assessed
changes from pre- to post-test in self-reported fixed mindsets (see measures below in Study 1;
also see Dweck, 1999). Mindset self-reports are an imperfect measure, as will be shown in the
two studies. Yet they are informative in the sense that if mindsets are not changed—or if they
were changed in the wrong direction—then it is reason for concern. Self-reports give the data “a
chance to object.”
Two studies involved a total of 7 factors, fully crossed, testing 5 research questions
across N=3,004 participants. These factors and their effects on self-reported fixed mindsets are
summarized in Table 1.
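The analytic logic of a single A/B comparison can be sketched as follows. This is a hypothetical illustration with simulated data, not the actual analysis; the condition labels, sample sizes, and effect sizes are invented.

```python
# Illustrative sketch of one A/B comparison: two versions of the
# mindset materials, with pre-to-post change in self-reported fixed
# mindset as the criterion. All data are simulated for demonstration.
import numpy as np

rng = np.random.default_rng(0)
n = 1500  # participants per version

pre_a = rng.normal(3.5, 1.0, n)   # pre-test fixed mindset, version A
pre_b = rng.normal(3.5, 1.0, n)   # pre-test fixed mindset, version B
# Simulated post-tests: version B lowers fixed mindset a bit more
post_a = pre_a - 0.30 + rng.normal(0, 0.8, n)
post_b = pre_b - 0.45 + rng.normal(0, 0.8, n)

change_a = (post_a - pre_a).mean()
change_b = (post_b - pre_b).mean()

# Two-sample comparison of the change scores
diff = change_b - change_a
se = np.sqrt((post_a - pre_a).var(ddof=1) / n
             + (post_b - pre_b).var(ddof=1) / n)
t = diff / se
print(f"change A: {change_a:.2f}, change B: {change_b:.2f}, t: {t:.1f}")
```

The version producing the larger pre-to-post reduction in fixed-mindset endorsement is retained for the next iteration; with the full factorial design, each factor's effect is estimated in one combined model rather than pairwise as here.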
One question tested in both A/B experiments was whether it was more effective to tell
research participants that the growth mindset treatment was designed to help them (a “direct”
framing), versus a framing in which participants were asked to summarize the content for future
9th grade students (an “indirect” framing). Research on the “saying-is-believing” tactic (E.
Aronson, 1999; J. Aronson et al., 2002; Walton & Cohen, 2011; Yeager & Walton, 2011) and
theory about “stealth” interventions more broadly (Robinson, 2010) suggests that the latter might
be more helpful, as noted above. Indeed, Table 1 shows that in both A/B Study 1 and A/B Study
2 the direct framing led to smaller changes in mindsets—corresponding to lower effectiveness—
than indirect framing (see rows 1 and 5 in Table 1). Thus, the indirect framing was used
throughout the revised intervention. To our knowledge this is the first experimental test of the
effectiveness of the indirect framing in psychological interventions, even though it is often
standard practice (J. Aronson et al., 2002; Walton & Cohen, 2011; Yeager & Walton, 2011).
The second question was: Is it more effective to present and refute the fixed mindset
view, or is it more effective to teach only evidence for the growth mindset view? On the one
hand, refuting the fixed mindset might more directly discredit the problematic belief. On the
other hand, it might give credence and voice to the fixed mindset message, for instance by
conveying that the fixed mindset is a reasonable perspective to hold (perhaps even the norm),
giving it an “illusion of truth” (Skurnik, Yoon, Park, & Schwarz, 2005). This might cause
participants who hold a fixed mindset to become entrenched in their beliefs. Consistent with the
latter possibility, in A/B Study 1, refuting the fixed mindset view led to smaller changes in
mindsets—corresponding to lower effectiveness—as compared to not doing so (see row 2 in
Table 1). Furthermore, the refutation manipulation caused participants who were more fixed
mindset at baseline to show an increase in fixed mindset post-message, main effect p = .003,
interaction effect p = .01. That is, refuting a fixed mindset seemed to exacerbate fixed mindset
beliefs for those who already held them.
Following this discovery, we wrote a different kind of refutation of fixed mindset
thinking and tested it in A/B Study 2. The revised content encouraged participants to replace
thoughts about between-person comparisons (that person is smarter than me) with within-person
comparisons (I can become even smarter tomorrow than I am today). This no longer caused
reduced effectiveness (see row 6 in Table 1). The final version emphasized within-person
comparisons as a means of discrediting between-person comparisons. Further, to avoid creating
an illusion of truth, the final materials never named or defined a “fixed mindset.”
We furthermore tested the impact of using well-known or successful adults as role
models of a growth mindset. For instance, the treatment conveyed the true story of Scott
Forstall, who, with his team, developed the first iPhone at Apple. Forstall used growth mindset
research to select team members who were not afraid of failure but were ready for a challenge. It
furthermore included an audio excerpt from a speech given by First Lady Michelle Obama, in
which she summarized the basic concepts of growth mindset research. These increased adoption
of a growth mindset compared to versions that did not include these endorsements (Table 1).
Other elements were tested as well (see Table 1).
Guided by theory. The re-design also relied on psychological theory. These changes,
which were subjected to A/B testing, are summarized here. Several theoretical elements were
relevant: (a) theories about how growth mindset beliefs affect student behavior in practice; (b)
theories of cultural values that may appear to be in contrast with growth mindset messages; and
(c) theories of how best to induce internalization of attitudes among adolescents.
Emphasizing “strategies,” not just “hard work.” We were concerned that the “original”
treatment too greatly emphasized “hard work” as the opposite of raw ability, and under-
emphasized the need to change strategies or ask adults for advice on improved strategies for
learning. This is because for many students, much of the time, just working harder with
ineffective strategies will not lead to increased learning. So, for example, the revised treatment
said, “Sometimes people want to learn something challenging, and they try hard. But they get
stuck. That’s when they need to try new strategies—new ways to approach the problem” (also
see Yeager & Dweck, 2012). Our goal was to remove any stigma of needing to ask for help or
having to switch one’s approach.
Addressing a culture of independence. We were concerned that the notion that you can
grow your intelligence would perhaps be perceived as too “independent,” and threaten the more
communal, interdependent values that many students might emphasize, especially students from
working class backgrounds and some racial/ethnic minority groups (Fryberg, Covarrubias, &
Burack, 2013; Stephens et al., 2014). Therefore we included more prosocial, beyond-the-self
motives for adopting and using a growth mindset (see Hulleman & Harackiewicz, 2009; Yeager
et al., 2014). For example, the new treatment said:
“People tell us that they are excited to learn about a growth mindset because it helps them
achieve the goals that matter to them and to people they care about. They use the mindset
to learn in school so they can give back to the community and make a difference in the
world later.”
Harnessing norms. Adolescents may be especially likely to conform to peers (Cohen &
Prinstein, 2006), and so we created a norm around the use of a growth mindset (Cialdini et al.,
1991). For instance, the end of the second session said “People everywhere are working to
become smarter. They are starting to understand that struggling and learning are what put them
on a path to where they want to go.” The intervention furthermore presented a series of stories
from older peers who endorsed the growth mindset concepts.
Harnessing reactance. We sought to use adolescent reactance, or the tendency to reject
mainstream or external exhortations to change personal choices (Brehm, 1966; Erikson, 1968;
Hasebe, Nucci & Nucci, 2004; Nucci, Killen, & Smetana, 1996), as an asset rather than as a
source of resistance to the message. We did this by initially framing the mindset message as a
reaction to adult control. For instance, at the very beginning of the intervention adolescents read
this story from an upper year student:
“I hate how people put you in a box and say ‘you’re smart at this’ or ‘not smart at that.’
After this program, I realized the truth about labels: they’re made up. … now I don’t let
other people box me in … it’s up to me to put in the work to strengthen my brain.”
Self-persuasion. The revised intervention increased the number of opportunities for
participants to write their own opinions and stories, which we expected to strengthen the
“saying-is-believing” benefits of the intervention.
Study 1: Does a Revised Growth Mindset Intervention Outperform an Existing Effective
Intervention?
Study 1 evaluated whether the re-design process led to materials that were an
improvement over the originals. As criteria, Study 1 used short-term measures of psychological
processes that are well-established to follow from a growth mindset: person (low ability) versus
process-focused (strategy, effort) attributions for difficulty, performance avoidance goals (over-
concern about making mistakes or looking incompetent), and challenge-seeking behavior
(Blackwell et al., 2007; Mueller & Dweck, 1998; see Burnette et al., 2013; Dweck & Leggett,
1988; Yeager & Dweck, 2012). Note that the goal of this research was not to test whether any
individual revision, by itself, caused greater efficacy, but rather to investigate all of the changes
in the aggregate, in comparison to the original intervention.
Data. A total of 69 high schools in the United States and Canada were recruited, and
7,501 9th grade students (predominately ages 14-15) provided data during the session when
dependent measures were collected (Time 2), although not all finished the session.2 Participants
were diverse: 17% were Hispanic/Latino, 6% were black/African-American, 3% were Native
American / American Indian, 48% were White, non-Hispanic, 5% were Asian / Asian-American,
and the rest were from another or multiple racial groups. Forty-eight percent were female, and
53% reported that their mothers had earned a Bachelor’s degree or greater.
School recruitment. Schools were recruited via advertisements in prominent educational
publications, through social media (e.g. Twitter), and through recruitment talks to school districts
2 This experiment involved a third condition of equal size that was developed to test other
questions. Results involving that condition will be presented in a future report.
or other school administrators. Schools were informed that they would have access to a free
growth mindset intervention for their students. Because of this recruitment strategy, all
participating students indeed received a version of a growth mindset intervention. The focal
comparison was between an “original” version of a growth mindset intervention (Paunesku et al.,
2015, which was adapted from materials in Blackwell et al., 2007), and a “revised” version,
created via the user-centered, rapid-prototyping design process summarized above. See
screenshots in Figure 1.
The original intervention. Minor revisions were made to the original intervention to
make it more parallel to the revised treatment, such as adding the option to have the text read aloud.
Survey sessions. In the winter of 2015, school coordinators brought their students to the
computer labs for two sessions, 1 to 4 weeks apart (researchers never visited the schools). The
Time 1 session involved baseline survey items, a randomized mindset treatment, some fidelity
measures, and brief demographics. Random assignment happened in real time and was
conducted by a web server, and so all staff were blind to condition. The Time 2 session involved
a second round of content for the revised mindset treatment, and control exercises for the original
mindset condition. After completing session two, students completed proxy outcome measures.
Measures. In order to minimize respondent burden and increase efficiency at scale, we
used or developed 1 to 3-item self-report measures of our focal constructs (see a discussion of
“practical measurement” in Yeager & Bryk, 2015). We also developed a brief behavioral task.
Fixed mindset. Three items at Time 1 and Time 2 assessed fixed mindsets: “You have a
certain amount of intelligence, and you really can’t do much to change it”, “Your intelligence is
something about you that you can’t change very much,” and “Being a ‘math person’ or not is
something that you really can’t change. Some people are good at math and other people aren’t.”
(Response options: 1 = Strongly disagree, 2 = Disagree, 3 = Mostly disagree, 4 = Mostly agree,
5 = Agree, 6 = Strongly agree). These were averaged into a single scale with higher values
corresponding to more fixed mindsets (α=.74).
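For readers who wish to reproduce this kind of scale construction, the averaging and the reliability computation can be sketched in Python (the responses below are hypothetical; only the three-item averaging and the Cronbach's alpha formula reflect the text):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) array of ratings."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical responses to the three fixed-mindset items (1-6 scale)
responses = np.array([
    [5, 5, 4],
    [2, 1, 2],
    [4, 4, 5],
    [1, 2, 1],
    [3, 3, 3],
])

fixed_mindset = responses.mean(axis=1)  # higher = more of a fixed mindset
alpha = cronbach_alpha(responses)
```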
Challenge-seeking: the “Make-a-Math-Worksheet” Task. We created a novel
behavioral task to assess a known behavioral consequence of a growth mindset: challenge-
seeking (Blackwell et al., 2007; Mueller & Dweck, 1998). It used Algebra and Geometry
problems obtained from the Khan Academy website and was designed to have more options than
previous measures (Mueller & Dweck, 1998) to produce a continuous measure of challenge
seeking. Participants read these instructions:
“We are interested in what kinds of problems high school math students prefer to work
on. On the next few pages, we would like you to create your own math worksheet. If
there is time, at the end of the survey you will have the opportunity to answer these math
problems. On the next few pages there are problems from 4 different math chapters.
Choose between 2 and 6 problems for each chapter.
You can choose from problems that are... Very challenging but you might learn a lot;
Somewhat challenging and you might learn a medium amount; or Not very challenging
and you probably won't learn very much. Do not try to answer the math problems. Just
click on the problems you'd like to try later if there's time.”
See Figure 2. There were three topic areas (Introduction to Algebra, Advanced Algebra, and
Geometry), and within each topic area there were four “chapters” (e.g. rational and irrational
numbers, quadratic equations, etc.), and within each chapter there were six problems, each
labeled “Not very challenging,” “Somewhat challenging,” or “Very challenging” (two per type).
Each page showed the six problems for a given chapter and, as noted, students were instructed to
select “at least 2 and up to 6 problems” on each page.
The total number of “Very challenging” (i.e. hard) problems chosen across the 12 pages
was calculated for each student (Range: 0-24) as was the total number of “Not very challenging”
(i.e. easy) problems (Range: 0-24). The final measure was the number of easy problems minus
the number of hard problems selected. Visually, the final measure approximated a normal
distribution.
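A minimal sketch of this scoring rule, with hypothetical choices (the labels and the 0-24 ranges follow the text; the function name and example data are ours):

```python
def challenge_avoidance_score(choices):
    """Easy-minus-hard worksheet score; higher values mean more challenge avoidance."""
    easy = sum(1 for c in choices if c == "easy")  # "Not very challenging"
    hard = sum(1 for c in choices if c == "hard")  # "Very challenging"
    assert 0 <= easy <= 24 and 0 <= hard <= 24     # 12 pages x 2 problems per type
    return easy - hard

# A hypothetical student who picked three easy, one hard, and two medium problems:
score = challenge_avoidance_score(["easy", "easy", "medium", "hard", "easy", "medium"])
```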
Challenge-seeking: Hypothetical scenario. Participants were presented with the
following scenario, based on a measure in Mueller and Dweck (1998):
“Imagine that, later today or tomorrow, your math teacher hands out two extra credit
assignments. You get to choose which one to do. You get the same number of points for
trying either one. One choice is an easy review—it has math problems you already know
how to solve, and you will probably get most of the answers right without having to think
very much. It takes 30 minutes. The other choice is a hard challenge—it has math
problems you don’t know how to solve, and you will probably get most of the problems
wrong, but you might learn something new. It also takes 30 minutes. If you had to pick
right now, which would you pick?”
Participants chose one of two options (1 = The easy math assignment where I would get most
problems right, 0 = The hard math assignment where I would possibly learn something new).
Higher values corresponded to the avoidance of challenge, and so this measure should be
positively correlated with fixed mindset and be reduced by the mindset treatment.
Fixed-trait attributions. Fixed mindset beliefs are known predictors of person-focused
versus process-focused attributional styles (e.g., Henderson & Dweck, 1990; Robins & Pals,
2002). We adapted prior measures (Blackwell et al., 2007) to develop a briefer assessment.
Participants read this scenario: “Pretend that, later today or tomorrow, you got a bad grade on a
very important math assignment. Honestly, if that happened, how likely would you be to think
these thoughts?” Participants then rated this fixed-trait, person-focused response "This means
I’m probably not very smart at math” and this malleable, process-focused response “I can get a
higher score next time if I find a better way to study (reverse-scored)” (response options: 1 = Not
at all likely, 2 = Slightly likely, 3 = Somewhat likely, 4 = Very likely, 5 = Extremely likely). The
two items were averaged into a single composite, with higher values corresponding to more
fixed-trait, person-focused attributional responses.
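The reverse-scoring and averaging step can be sketched as follows (illustrative only; the item wordings come from the text, but the function name and example ratings are ours):

```python
def fixed_trait_attribution(person_item, process_item, scale_min=1, scale_max=5):
    """Average the person-focused item with the reverse-scored process item (1-5 scale)."""
    process_reversed = (scale_max + scale_min) - process_item
    return (person_item + process_reversed) / 2

# "Not very smart at math" rated Very likely (4);
# "find a better way to study" rated Extremely likely (5), which reverses to 1:
composite = fixed_trait_attribution(4, 5)
```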
Performance avoidance goals. Because fixed mindset beliefs are known to predict the
goal of hiding one’s lack of knowledge (Dweck & Leggett, 1988), we measured performance-
avoidance goals with a single item (Elliot & McGregor, 2001; performance approach goals were
not measured). Participants read “What are your goals in school from now until the end of the
year? Below, say how much you agree or disagree with this statement. One of my main goals for
the rest of the school year is to avoid looking stupid in my classes” (Response options: 1 =
Strongly disagree, 2 = Disagree, 3 = Mostly disagree, 4 = Mostly agree, 5 = Agree, 6 = Strongly
agree). Higher values correspond to greater performance avoidance goals.
Fidelity measures. To examine fidelity of implementation across conditions, students
were asked to report on distraction in the classroom, both peers’ distraction (“Consider the
students around you... How many students would you say were working carefully and quietly on
this activity today?” Response options: 1= Fewer than half of students, 2 = About half of
students, 3 = Most students, 4 = Almost all students, with just a few exceptions, 5 = All students)
and one’s own distraction (“How distracted were you, personally, by other students in the room
as you completed this activity today?” Response options: 1 = Not distracted at all, 2 = Slightly
distracted, 3 = Somewhat distracted, 4 = Very distracted, 5 = Extremely distracted).
Next, participants in both conditions rated how interesting the materials were (“For you
personally, how interesting was the activity you completed in this period today?” Response
options: 1 = Not interesting at all, 2 = Slightly interesting, 3 = Somewhat interesting, 4 = Very
interesting, 5 = Extremely interesting), and how much they learned from the materials (“How
much do you feel that you learned from the activity you completed in this period today?”
Response options: 1 = Nothing at all, 2 = A little, 3 = A medium amount, 4 = A lot, 5 = An
extreme amount).
Prior achievement. School records were not available for this sample. Prior achievement
was indexed by a composite of self-reports of typical grades and expected grades. The two items
were “Thinking about this school year and the last school year, what grades do you usually get in
core classes? By core classes, we mean: English, math, and science. We don't mean electives,
like P.E. or art” (Response options: 1 = Mostly F's, 2 = Mostly D's, 3 = Mostly C's, 4 = Mostly
B's, 5 = Mostly A's) and “Thinking about your skills and the difficulty of your classes, how do
you think you’ll do in math in high school? (Response options: 1 = Extremely poorly, 2 = Very
poorly, 3 = Somewhat poorly, 4 = Neither well nor poorly, 5 = Somewhat well, 6 = Very well, 7 =
Extremely well). Items were z-scored within schools and then averaged, with higher values
corresponding to higher prior achievement (α=.74).
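Standardizing within schools before averaging, as described above, might look like this in pandas (the data and column names are hypothetical):

```python
import pandas as pd

df = pd.DataFrame({
    "school": ["A", "A", "A", "B", "B", "B"],
    "usual_grades": [5, 3, 4, 2, 4, 3],    # 1-5 scale
    "expected_math": [7, 4, 6, 3, 6, 3],   # 1-7 scale
})

# z-score each item within school, then average into a prior-achievement index
for col in ["usual_grades", "expected_math"]:
    df[col + "_z"] = df.groupby("school")[col].transform(
        lambda s: (s - s.mean()) / s.std(ddof=1)
    )
df["prior_achievement"] = df[["usual_grades_z", "expected_math_z"]].mean(axis=1)
```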
Attitudes to validate the “Make-a-Math-Worksheet” Task. Additional self-reports were
assessed at Time 1 to validate the Time 2 challenge-seeking behaviors. These were all expected
to be correlated with challenge-seeking behavior. These were the short grit scale (Duckworth &
Quinn, 2009), the academic self-control scale (Tsukayama, Duckworth, & Kim, 2013), a single
item of interest in math (“In your opinion, how interesting is the subject of math in high school?”
response options: 1 = Not at all interesting, 2 = Slightly interesting, 3 = Somewhat interesting, 4
= Very interesting, 5 = Extremely interesting), and a single item of math anxiety (“In general,
how much does the subject of math in high school make you feel nervous, worried, or full of
anxiety?” Response options: 1 = Not at all, 2 = A little, 3 = A medium amount, 4 = A lot, 5 = An
extreme amount).
Preliminary analyses. Throughout the manuscript, we tested for violations of
assumptions of linear models (e.g., outliers, non-linearity). Variables either did not violate these
assumptions, or, when outliers were transformed or dropped, the significance of results was unchanged.
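One common version of such a robustness check, winsorizing extreme values before re-estimating the model, can be sketched as follows (the percentile cutoffs are our assumption; the text does not specify the exact procedure used):

```python
import numpy as np

def winsorize(x, lower_pct=1, upper_pct=99):
    """Clip extreme values to the given percentiles before re-running the model."""
    lo, hi = np.percentile(x, [lower_pct, upper_pct])
    return np.clip(x, lo, hi)

x = np.array([1.0, 2.0, 2.5, 3.0, 100.0])  # one extreme outlier
x_w = winsorize(x)                          # interior values are left unchanged
```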
Correlational analyses. Before examining the impact of the treatment, we conducted
correlational tests to replicate the basic findings from prior research on fixed versus growth
mindsets. Table 2 shows that measured fixed mindset significantly predicted fixed-trait, person-
focused attributions, r(6636)=.28, performance avoidance goals, r(6636)=.23, and lower
hypothetical challenge-seeking, r(6636)=.12 (all ps < .001 due to very large sample sizes). The
sizes of these correlations correspond to the sizes in a meta-analysis of many past studies
(Burnette et al., 2013). Thus, a fixed mindset was associated with thinking that difficulty means
you are “not smart,” with having the goal of not looking “dumb,” and with avoiding hard
problems that you might get wrong, as in prior research (see Dweck, 2006; Dweck & Leggett,
1988; Yeager & Dweck, 2012).
Validating the “Make-a-math-worksheet” challenge-seeking task. As shown in Table 2,
the choice of a greater number of easy problems as compared to hard problems at Time 2 was
modestly correlated with grit, self-control, prior performance, interest in math, and math anxiety
measured at Time 1, 1 to 4 weeks earlier, all in the expected direction (all ps<.001, given the
large sample size; see Table 2). In addition, measured fixed mindset, fixed-trait attributions, and
performance avoidance goals predicted choices of more easy problems and fewer hard problems.
The worksheet task behavior correlated with the single dichotomous hypothetical choice, point-
biserial r(6636)=.29. Thus, participants’ choices on this task appear to reflect their individual
differences in challenge-seeking tendencies.
Random assignment. Random assignment to condition was effective. There were no
differences between conditions in terms of demographics (gender, race, ethnicity, special
education, parental education) or in terms of prior achievement (all ps>.1), despite 80% power to
detect effects as small as d=.06. Furthermore, as shown in Table 3, there were no pre-treatment
differences between conditions in terms of fixed mindset.
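Sensitivity claims of this kind ("80% power to detect effects as small as d=.06") can be approximated with a standard minimum-detectable-effect formula; the sketch below uses a normal approximation and an illustrative per-group n, not the study's exact counts:

```python
from math import sqrt

def mde(n_per_group, z_alpha=1.959964, z_power=0.841621):
    """Minimum detectable effect size d for a two-group comparison
    (normal approximation, two-sided alpha=.05, 80% power)."""
    return (z_alpha + z_power) * sqrt(2 / n_per_group)

# With roughly 3,500 students per condition:
d_min = mde(3500)
```

As expected, the minimum detectable effect shrinks as the per-group sample grows.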
Fidelity. Students in the revised and original conditions did not differ in terms of their
ratings of their peers’ distraction during the testing session, t(6454)=0.35, p=.72, or in terms of
their own personal distraction, t(6454)=1.92, p=.06. Distraction was low (at or below a 2 on a 5-
point scale).
Next, the revised treatment was rated as more interesting, t(6454)=4.44, p<.001, and
also as more likely to cause participants to feel as though they learned something, t(6454)=6.25,
p<.001. This means the design process was successful in making the new treatment engaging.
Yet in Study 2 it was important to ensure that the control condition was also interesting.
Self-reported fixed mindset. The revised mindset treatment was more effective at
reducing reports of a fixed mindset as compared to the original mindset treatment. See Table 3.
The original mindset group showed a change score of Δ =-0.22 scale points (out of 6), as
compared to Δ = -0.48 scale points (out of 6) for the revised mindset group. Both change scores
were significant at p<.001, and they were significantly different from one another (see Table 2).
In moderation analyses, students who already had more of a growth mindset at baseline
changed their beliefs less, Treatment × Pre-treatment fixed mindset interaction, t(6687)=-3.385,
p=.0007, β=.04, which is consistent with a ceiling effect among those who already held a growth
mindset. There was no Treatment × Prior achievement interaction, t(6687)=1.184, p=.24, β=.01,
suggesting that the treatment was effective in changing mindsets across all levels of prior
achievement.
Primary analyses.
The “make-a-math-worksheet” task. Our primary outcome of interest was challenge-
seeking behavior. Compared to the original growth mindset treatment, the revised growth
mindset treatment nearly cut in half the over-representation of easy versus hard math problems in
participants’ self-created worksheets, 4.62 vs. 2.42, a significant difference, t(6884)=8.03,
p<.001, d=.19. See Table 3. This treatment effect on behavior was not moderated by prior
achievement, t(6884)=-.65, p=.52, β=.01, or pre-treatment fixed mindset, t(6884)=.63, p=.53.
It was also possible to test whether the revised growth mindset treatment increased the
overall number of challenging problems, decreased the number of easy problems, or increased
the proportion of problems chosen that were challenging. Supplementary analyses showed that
all of these treatment contrasts were also significant, t(6884)=3.95, p<.001, t(6884)=8.60,
p<.001, t(6884)=7.60, p<.001, respectively.
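The standardized effect sizes reported here (e.g., d=.19) are pooled-SD mean differences; the computation can be sketched with toy data (not the study's):

```python
import numpy as np

def cohens_d(a, b):
    """Standardized mean difference between two groups, using the pooled SD."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

# Toy easy-minus-hard scores: original condition vs. revised condition
d = cohens_d([6, 4, 5, 4, 5], [3, 2, 2, 3, 2])  # positive d favors the revision
```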
Secondary analyses.
Hypothetical challenge-seeking scenario. Compared to the original mindset treatment,
the revised mindset treatment reduced the proportion of students saying they would choose the
“easy” math homework assignment versus the “hard” assignment from 60% to 51%, logistic
regression Z=7.951, p<.001, d=.19. See Table 3. The treatment effect was not significantly
moderated by prior achievement, Z=-1.82, p=.07, β=.04, or pre-treatment fixed mindset, Z=0.51,
p=.61, β=.01.
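The reported proportions can be sanity-checked against the reported effect size using Cox's conversion from a log odds ratio to a d-like metric (an approximation we supply for illustration; the paper's d was presumably computed differently):

```python
from math import log, pi, sqrt

p_original, p_revised = 0.60, 0.51  # % choosing the easy assignment, from the text

def odds(p):
    return p / (1 - p)

log_odds_ratio = log(odds(p_original)) - log(odds(p_revised))
d_approx = log_odds_ratio * sqrt(3) / pi  # Cox's rule of thumb: d ~ ln(OR) * sqrt(3) / pi
```

This back-of-the-envelope conversion lands near the reported d=.19, a useful consistency check.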
Attributions and goals. The revised treatment significantly reduced fixed-trait, person-
focused attributions as well as performance avoidance goals, compared to the “original”
treatment, ps<.01 (see Table 3). These effects were small, ds = .07 and .06, but recall that this is
the group difference between two growth mindset treatments. Neither of these outcomes showed
a significant Treatment × Pre-treatment fixed mindset interaction, t(6647)=-.62, p=.54, β=.01,
and t(6631)=-.72, p=.47, β=.01, for attributions and goals respectively. For attributions, there
was no Treatment × Prior achievement interaction, t(6647)=.32, p=.74, β=.001. For performance
avoidance goals, there was a small but significant Treatment × Prior achievement interaction, in
the direction that students with higher levels of prior achievement benefitted slightly more,
t(6631)=-2.23, p=.03, β=.03.
Study 2: Does a Re-Designed Intervention Improve Grades When Tested With a Census of
Students?
In Study 1, the revised growth mindset treatment outperformed its predecessor in terms of
changes in immediate self-reports and behavior. In Study 2 we examined whether this revised
treatment would improve actual grades among 9th graders just beginning high school and
replicate the effects of prior studies using an even more rigorous evaluation.
We carried out this experiment with a census (>95%) of students in 10 schools. A census
is defined as an attempt to reach all individuals in an organization, and is contrasted with a
sample. In addition, instead of conducting the experiment ourselves (as in Study 1 and prior
research), we contracted a third-party research firm to collect and clean all data. This firm
specializes in government-sponsored public health surveys in the United States.
These changes have scientific value. First, in prior research that served as the basis for
the present investigation (Paunesku et al., 2015), the average proportion of students per high
school who completed the Time 1 session was 17%. The goal of that research (and Study 1) was
to achieve sample size, not within-school representativeness.3 Thus, the present study may
include more of the kinds of students that may have been underrepresented in prior studies.
Next, achieving a census of students is informative for policy. As noted at the outset,
schools, districts, and states are often interested in raising the achievement for entire schools or
for entire defined sub-groups, not for groups of students who may have selected into the
treatment on the basis of their achievement.
Finally, ambiguity about seemingly mundane methodological choices is one important
source of the non-replication of psychological experiments (see, e.g., Schooler, 2014). Therefore
it is important for replication purposes to be able to train third-party researchers in study
procedures, which was done here.
Data. Participants were a maximum of 3,676 students from a national convenience
sample of ten schools in California, New York, Texas, Virginia, and North Carolina. One
3 In Study 1 we were not able to estimate the % of students within each school who participated
because schools did not provide any official data for the study. Yet using historical numbers
from the common core of data as a denominator for a sub-sample of schools that could be
matched, the median response rate was approximately 40%.
additional school was recruited, but validated student achievement records could not be obtained.
The schools were selected from a national sampling frame based on the Common Core of Data,
with these criteria: public high school, 9th grade enrollment between 100 and 600 students, within
the medium range for poverty indicators (e.g., free or reduced-price lunch %), and moderate
representation of students of color (Hispanic / Latino or Black / African American). Schools
selected from the sampling frame were then recruited by a third party firm. School characteristics
—e.g., demographics, achievement, and average fixed mindset score—are in Table 4.
Student participants were somewhat more diverse than Study 1: 29% were
Hispanic/Latino, 17% were black/African-American, 3% were Native American / American
Indian, 30% were White, non-Hispanic, 6% were Asian / Asian-American, and the rest were
from another or multiple racial groups. Forty-eight percent were female, and 52%
reported that their mothers had earned a Bachelor’s degree or greater.
The response rate for all eligible 9th grade students in the 10 participating schools for the
Time 1 session was 96%. A total of 183 students did not enter their names accurately and were
not matched at Time 2. All of these unmatched students received control exercises at Time 2. An
additional 291 students completed Time 1 materials but not Time 2 materials, and this did not
vary by condition (Control = 148, Treatment = 143). There were no data exclusions. Students
were retained as long as they began the Time 1 survey, regardless of Time 2 participation or
quality of responses—that is, we estimated “intent-to-treat” (ITT) effects. ITT effects are
conservative tests of the hypothesis, they afford greater internal validity (preventing the
possibility of differential attrition), and they are more policy-relevant because they demonstrate
the effect of offering a treatment.
Procedures. The mindset treatment, measures, and study procedures designed by the
research team were then implemented by a third-party firm. The firm collected all data directly
from the school partners, and cleaned and merged it, without influence of researchers. Before
the final dataset was delivered, the research team pre-registered the primary hypothesis and
analytic approach via the Open Science Framework (OSF). The pre-registered hypothesis, a
replication of Paunesku et al’s (2015) interaction effect, was that prior achievement, indexed by
a composite of GPA and test scores, would moderate mindset intervention effects on 9th grade
continuous GPA and D/F averages (see the OSF pre-registration; deviations from the pre-analysis
plan are disclosed in Appendix 1).
Intervention delivery. Student participation consisted of two one-period online sessions
conducted at the school, during regular class periods, in a school computer lab or classroom. The
sessions were 1-4 weeks apart, beginning in the first 10 weeks of the school year. Sessions
consisted of survey questions and the treatment or control intervention. Students were randomly
assigned by the software, in real time, to the treatment or control group. A script was read to
students by their teachers at the start of each computer session.
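One way such real-time, server-side assignment is commonly implemented, keeping staff blind to condition, is to derive the condition deterministically from a student identifier. This is a sketch under our own assumptions; the platform's actual mechanism is not described in the text:

```python
import random

def assign_condition(student_id, conditions=("treatment", "control")):
    """Server-side assignment at session start; staff never see or influence it."""
    rng = random.Random(student_id)  # seeding by ID keeps assignment stable across logins
    return rng.choice(conditions)

cond = assign_condition("student-0042")
```

Seeding by ID means a student who reloads the page or logs in twice receives the same condition both times.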
Mindset treatment. This was identical to the “revised” mindset treatment in Study 1.
Control activity. The control activity was designed to be parallel to the treatment activity.
It, too, was framed as providing helpful information about the transition to high school, and
participants were asked to read and retain this information so as to write their opinions and help
future students. Because the revised mindset treatment had been shown to be more interesting
than previous versions, great effort was made to make the control group at least as interesting as
the treatment.
The control intervention involved the same type of graphic art (e.g., images of the brain,
animations), as well as compelling stories (e.g., about Phineas Gage). It also provided stories
from upperclassmen, reporting their opinions about the content and about how it was related to
their transition to high school. The celebrity quotes in Time 2 were matched—e.g., Michelle
Obama, Scott Forstall—but they differed in content. For instance, in the control group Michelle
Obama talked about the White House’s BRAIN initiative, an investment in neuroscience.
Finally, as in the treatment, there were a number of opportunities for interactivity; students were
asked open-ended questions and they provided their reactions.
9th grade GPA. Final grades for the end of the first semester of 9th grade were collected.
Schools that provided grades on a 0-100 scale were asked which level of performance
corresponded to which letter grade (A+ to F), and letter grades were then converted to a 0 to 4.33
scale. Using full course names from transcripts, researchers, blind to students’ condition or
grades, coded the courses as science, math or English (i.e., core courses) or not. End of term
grades for the core subjects were averaged. When a student was enrolled in more than one course
in a given subject (e.g., both Algebra and Geometry), the student’s grades in both were averaged,
and then the composite was averaged into their final grade variable.
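The conversion and averaging rules can be sketched as follows (the letter-to-point mapping is a conventional 4.33 scale we assume; the text specifies only the 0-4.33 range and the subject-first averaging):

```python
# Assumed letter-to-points mapping on the 0-4.33 scale described in the text
LETTER_POINTS = {
    "A+": 4.33, "A": 4.0, "A-": 3.67, "B+": 3.33, "B": 3.0, "B-": 2.67,
    "C+": 2.33, "C": 2.0, "C-": 1.67, "D+": 1.33, "D": 1.0, "D-": 0.67, "F": 0.0,
}

def core_gpa(grades_by_subject):
    """Average within each core subject first, then across subjects."""
    subject_means = [
        sum(LETTER_POINTS[g] for g in grades) / len(grades)
        for grades in grades_by_subject.values()
    ]
    return sum(subject_means) / len(subject_means)

# A student taking both Algebra and Geometry: math grades are averaged first
student = {"math": ["B", "A"], "science": ["C"], "english": ["B+"]}
gpa = core_gpa(student)
poor_performance = int(gpa <= LETTER_POINTS["D+"])  # D+ average or below
```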
As a second measure using the same outcome data, we created a dichotomous variable to
indicate poor performance (1 = an average GPA of D+ or below, 0 = not); this dichotomization
cut-point was pre-registered.
Prior achievement. The 8th grade prior achievement variable was pre-registered and is a
standard measure in prior intervention experiments with incoming 9th graders (e.g. Yeager,
Johnson, et al., 2014). It was an unweighted average of 8th grade GPA and 8th grade state test
scores. The high schools in the present study were from different states and taught students from
different feeder schools. We z-scored 8th grade GPA and state test scores. Because this removes
the mean from each school, we later tested whether adding fixed effects for school to statistical
models changed results (it did not; see Table 8).4 A small proportion of students (fewer than
10%) was missing both prior achievement measures; they were assigned a value of zero, and we
then included a dummy variable indicating missing data, which increases transparency and is the
prevailing recommendation (Puma, Olsen, Bell, & Price, 2009).
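The missing-data handling described, zero imputation plus a missingness indicator, is a two-line operation in pandas (illustrative data and column names):

```python
import numpy as np
import pandas as pd

df = pd.DataFrame({"prior_achievement": [0.8, np.nan, -1.2, np.nan, 0.1]})

# Flag missingness first, then impute zero (the within-school mean after z-scoring)
df["prior_missing"] = df["prior_achievement"].isna().astype(int)
df["prior_achievement"] = df["prior_achievement"].fillna(0)
```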
Hypothetical challenge-seeking. This measure was identical to Study 1. The make-a-
worksheet task was not administered in this study because it was not yet developed when Study 2
was launched.
Fixed mindset, attributions, and performance goals. These measures were identical to
Study 1.
Fidelity measures. Measures of distraction, interest, and self-reported learning were the
same as Study 1.
Preliminary analyses.
Random assignment. Random assignment to condition was effective. There were no
differences between conditions in terms of demographics (gender, race, ethnicity, special
education, parental education) or in terms of prior achievement within any of the 10 schools or in
the full sample (all ps>.1). As shown in Table 3, there were no pre-treatment differences between
conditions in terms of fixed mindset.
4 Some prior research (Hulleman & Harackiewicz, 2009) used self-reported expectancies, not
prior GPA, as a baseline moderator. We too measured these (see Study 1 for measure). When
added to the composite, our moderation results were the same and slightly stronger. We did not
ultimately include this measure in our baseline composite in the main text because we did not
pre-register this variable for the composite.
Fidelity. Several measures suggest high fidelity of implementation. On average, 94% of
treated and control students answered the open-ended questions at both Time 1 and 2, and this
did not differ by condition or time. During the Time 1 session, both treated and control students
saw an average of 96% of the screens in the intervention. Among those who completed the Time
2 session, treated students saw 99% of screens, compared to 97% for control students. Thus,
both conditions saw and responded to their respective content to an equal (and high) extent.
Open-ended responses from students confirm that they were, in general, processing the
mindset message. Here are some examples of student responses to the final writing prompt at
Time 2, in which they were asked to list the next steps they could take on their growth mindset:
“I can always tell myself that mistakes are evidence of learning. I will always find
difficult courses to take them. I'll ask for help when I need it.”
“To get a positive growth in mindset, you should always ask questions and be curious.
Don't ever feel like there is a limit to your knowledge and when feeling stuck, take a few
deep breaths and relax. Nothing is easy.”
“Step 1: Erase the phrase ‘I give up’ and all similar phrases from your vocabulary. Step
2: Enter your hardest class of the day (math, for example). Step 3: When presented with
a brain-frying worksheet, ask questions about what you don't know.”
“When I grow up I want to be a dentist which involves science. I am not good at
science…yet. So I am going to have to pay more attention in class and be more focused
and pay attention and learn in class instead of fooling around.”
Next, students across the two conditions reported no differences in levels of distraction,
own distraction: t(3438)=0.15, p=.88; others’ distraction: t(3438)=0.37, p=.71. Finally, control
participants actually rated their content as more interesting, and said that they felt like they
learned more, as compared to treated participants, t(3438)=7.76 and 8.26, Cohen’s ds = .25
and .27, respectively, ps <.001. This was surprising but it does not threaten our primary
inferences. Instead it points to the conservative nature of the control group. Control students
received a positive, interesting, informative experience that held their attention and exposed them
to novel scientific information relevant to their high school biology classes (e.g., the brain) that
was endorsed by influential role models (e.g., Michelle Obama, Scott Forstall).
Self-reported fixed mindset. As an additional manipulation check, students reported their
fixed mindset beliefs. As shown in Table 3, both the treatment and control conditions changed in
the direction of a growth mindset between Times 1 and 2 (change score ps<.001). However,
those in the treatment condition changed much more (Control Δ = -.17 scale points out of 6,
Treatment Δ = -.55 scale points out of 6). These change scores differed significantly from each
other, p<.001. Table 5 reports regressions predicting post-treatment fixed mindset as a function
of condition, and shows that choices of covariates did not affect this result.
Moderator analyses in Table 5 show that previously higher-achieving students, and, to a
much lesser extent, students who held more of a fixed mindset at baseline, changed more in the
direction of a growth mindset.
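The change-score comparison above can be sketched as follows: a minimal illustration in Python using simulated data. The variable names and the simulated shifts (control ≈ -.17, treatment ≈ -.55, echoing the reported change scores) are hypothetical, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 1500  # hypothetical per-condition sample size

# Simulated 1-6 fixed-mindset ratings at Times 1 and 2; all values
# here are illustrative, not the study's dataset.
pre_control = rng.normal(3.5, 1.0, n)
post_control = pre_control - 0.17 + rng.normal(0, 0.8, n)
pre_treat = rng.normal(3.5, 1.0, n)
post_treat = pre_treat - 0.55 + rng.normal(0, 0.8, n)

delta_control = post_control - pre_control  # control change scores
delta_treat = post_treat - pre_treat        # treatment change scores

# Welch's t-test comparing change scores across conditions
t_stat, p_val = stats.ttest_ind(delta_treat, delta_control, equal_var=False)
print(f"control delta = {delta_control.mean():.2f}, "
      f"treatment delta = {delta_treat.mean():.2f}, p = {p_val:.3g}")
```

With a larger mean shift in the treatment group, the test on change scores mirrors the paper's report that the two conditions' pre-post changes differed significantly.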
Primary analyses.
9th grade GPA and poor performance rates. Our first pre-registered confirmatory
analysis was to examine the effects of the treatment on 9th grade GPA, moderated by prior
achievement. In the full sample, there was a significant Treatment × Prior Achievement
interaction, t(3419)=2.66, p=.007, β=-.05, replicating prior research (Paunesku et al., 2015;
Yeager et al., 2014; also see Wilson & Linville, 1982; 1985). Table 6 shows that the significance
of this result did not depend on the covariates selected. Tests at ±1SD of prior performance
showed an estimated treatment benefit of 0.13 grade points, t(3419)=2.90, p=.003, d=.10, for
those who were at -1SD of prior performance, and no effect among those at +1SD of prior
performance, b=-0.04 grade points, t(3419)=0.99, p=.33, d=.03. This may be because
higher achieving students have less room to improve, or because they may manifest their
increased growth mindset in challenge-seeking rather than in seeking easy A’s.
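The moderation analysis described above can be sketched with statsmodels: fit the Treatment × Prior Achievement interaction, then recover simple treatment effects at ±1SD of prior achievement by re-centering the moderator. The column names (gpa, treat, prior) and effect sizes are simulated for illustration, not the study's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 3000  # illustrative; the study analyzed roughly 3,400 students

# Simulated data: standardized prior achievement, a treatment dummy,
# and GPA with a benefit concentrated among low prior achievers.
df = pd.DataFrame({
    "prior": rng.normal(0, 1, n),
    "treat": rng.integers(0, 2, n),
})
df["gpa"] = (2.8 + 0.5 * df["prior"] + 0.05 * df["treat"]
             - 0.08 * df["treat"] * df["prior"] + rng.normal(0, 0.6, n))

# Interaction model: does the treatment effect depend on prior achievement?
m = smf.ols("gpa ~ treat * prior", data=df).fit()

# Simple effects: re-center prior at -1SD / +1SD so the 'treat'
# coefficient becomes the treatment effect at that level.
lo = smf.ols("gpa ~ treat * I(prior + 1)", data=df).fit()  # effect at -1SD
hi = smf.ols("gpa ~ treat * I(prior - 1)", data=df).fit()  # effect at +1SD
print(m.params["treat:prior"], lo.params["treat"], hi.params["treat"])
```

The re-centering trick avoids hand-computing simple slopes: in each re-fit model, the coefficient on treat is the estimated treatment effect at the chosen level of the moderator, with its own standard error and p-value.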
In a second pre-registered confirmatory analysis, we analyzed rates of poor performance
(D or F averages). This analysis mirrors past research (Cohen et al., 2009; Paunesku et al., 2015;
Wilson & Linville, 1982, 1985; Yeager, Purdie-Vaughns, et al., 2014), and helps test the
theoretically predicted finding that the intervention is beneficial by stopping a recursive process
by which poor performance begets worse performance over time (Cohen et al., 2009; see Cohen
& Sherman, 2014; Yeager & Walton, 2011).
There was a significant overall main effect of treatment, which reduced the rate of poor
performance by 4 percentage points, Z=2.95, p=.003, d=.10. Next, as with the full continuous
GPA metric, in a logistic regression predicting poor performance there was a significant
Treatment × Prior Achievement interaction, Z=2.45, p=.014, β=.05. At -1SD of prior
achievement the treatment effect was estimated to be 7 percentage points, Z=3.80, p<.001, d=.13,
while at +1SD there was a non-significant difference of 0.7 percentage points, Z=.42, p=.67.
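A parallel sketch for the poor-performance analysis: a logistic model with the same interaction, with the condition effect converted to percentage points by comparing predicted probabilities. Data and coefficients are simulated for illustration and do not reproduce the study's estimates.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 3000  # illustrative sample

df = pd.DataFrame({
    "prior": rng.normal(0, 1, n),
    "treat": rng.integers(0, 2, n),
})
# Simulated D/F-average indicator: poor performance is more likely at low
# prior achievement, and the treatment lowers it most for those students.
logit_p = (-1.0 - 1.2 * df["prior"] - 0.3 * df["treat"]
           + 0.25 * df["treat"] * df["prior"])
df["poor"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

m = smf.logit("poor ~ treat * prior", data=df).fit(disp=0)

# Express the treatment effect in percentage points at -1SD of prior
# achievement by comparing predicted probabilities for the two conditions.
at = pd.DataFrame({"treat": [0, 1], "prior": [-1.0, -1.0]})
p0, p1 = m.predict(at)
print(f"effect at -1SD: {100 * (p0 - p1):.1f} percentage points")
```

Reporting the contrast on the probability scale, rather than as a log-odds coefficient, is what allows a statement like "a 7-percentage-point reduction at -1SD."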
Secondary analyses.
Hypothetical challenge-seeking. The mindset treatment reduced the proportion of
students saying they would choose the “easy” math homework assignment (on which they
would likely get a high score) over the “hard” assignment (on which they might get a low
score) from 54% to 45% (see Table 3). The treatment effect on hypothetical challenge-seeking was larger for
previously higher-achieving students, Treatment × Prior Achievement interaction Z=2.63,
p=.008, β=.05, and was not moderated by pre-treatment fixed mindset, Z=1.45, p=.15, β=.02.
Thus, while lower achieving students were more likely than high achieving students to show
benefits in grades, higher achieving students were more likely to show an impact on their
challenge-seeking choices on the hypothetical task.
Attributions and goals. The growth mindset treatment reduced fixed-trait, person-
focused attributions, d=.13, and performance avoidance goals, d=.11, ps<.001 (see Table 3),
unlike some prior growth mindset intervention research (e.g., Blackwell et al., 2007). Thus, the
present study uses a field experiment to replicate much prior laboratory and correlational
research on the effects of a growth mindset on attributions and goals (Dweck, 1999; Dweck,
2006; Yeager & Dweck, 2012).
In sum, the revised mindset treatment improved grades and prevented poor
performance for low-prior-achieving students several months post-treatment. It also changed students’
reported fixed mindsets, challenge-seeking (particularly for high achievers), attributions, and
goals. All of this was accomplished when the intervention was administered by a third-party firm to a
census of students at ten schools.
General Discussion
We used “design thinking” to make psychological intervention materials broadly
applicable to students who may share concerns and construals because they are undergoing
similar challenges—in this case, 9th grade students entering high school. When this was done,
the revised intervention was more effective in changing proxy outcomes such as beliefs and
short-term behaviors than the previous, more generic intervention that did not go through this
process (Study 1). Furthermore, the intervention increased core course grades for previously low-
achieving students (Study 2).
On the whole, the present research provides direct evidence of an exciting possibility: a
scalable, two-session mindset program, developed through a rapid, iterative, user-centered design
process, may be administered to entire classes of 9th graders (>95% of students) and begin to
raise the grades of the lowest performing entering 9th graders, while increasing the learning-
oriented attitudes and beliefs of all 9th graders. Moreover, this approach provides a model for
taking growth mindset and other psychological interventions to scale, and for conducting
replication studies.
The efficacy of the growth mindset program highlights the importance of practices of
expert teachers who guide students’ construals in productive directions in their classrooms.
Indeed, research on expert tutors—who represent one of the most powerful educational
interventions, capable of raising struggling students’ achievement by two standard deviations
(Bloom, 1984)—highlights the intensity of their focus on students’ construals. Working hand-in-
hand with students, expert tutors build relationships of trust with students and then redirect their
construals of academic difficulty as challenges to be met, not evidence of fixed inability (see
Lepper, Woolverton, Mumme, & Gurtner, 1993; Treisman, 1992). Research on psychological
interventions distills a portion of the practices of expert teachers and embeds them in
brief, web-based modules. When this succeeds, it suggests that one potentially
under-utilized element in efforts to train novice teachers to become experts could be training in
how best to guide students’ construals of the learning situation.
Comparison to Previous Growth Mindset Interventions
It is reassuring that the treatment effect size on grades among previously low-achieving
students (.13 grade points) was consistent with that found in prior studies. Paunesku et al. (2015)
showed a treatment effect on core subject GPA of .13 grade points for the bottom third of
students. Yeager, Henderson et al. (2014) showed a treatment effect in the full sample of .10
grade points, and .20 grade points among the bottom third of prior achievement.
It is interesting that, although the revised mindset intervention was more effective at
changing short-term beliefs and behavior than the original intervention (Study 1), the effect sizes
for grades in Study 2 do not exceed those found in past studies (e.g. Paunesku et al., 2015;
Yeager, Henderson, et al., 2014). Neither of those past studies, however, attempted to reach a
census of students within a school, as the present study did. Perhaps the present study included
more students who were reluctant to participate, or who were resistant to the treatment
message, and therefore effects were no stronger despite a possibly stronger message.
Next, it was surprising that self-reported fixed mindsets of students in the control group
in Study 2 changed nearly as much from Time 1 to Time 2 as students in the original mindset
treatment group in Study 1 (see Table 2). There are at least three potential explanations for this.
One is trivial: perhaps people simply disagree more with these items when they respond to them
a second time. It may come to seem socially desirable to believe you can get smarter. Second,
perhaps the school was teaching a growth mindset to many students already; indeed, in a survey
of the math teachers in the schools in Study 2 (not reported here), 26% of teachers said they had
read the book Mindset (Dweck, 2006). Third, perhaps there was spillover from the treatment
group to the control group—i.e., perhaps treated students shared the message with control
students, or perhaps students who decided to work harder influenced their friends. In
either of the two latter scenarios, the treatment effects on grades would likely be
conservative estimates (because control students were receiving a version of the treatment
message through other means).
The Value of Replication
Replication efforts are important for cumulative science (Funder et al., 2014; Lehrer,
2010; Open Science Collaboration, 2015; Schooler, 2011, 2014; Schimmack, 2012; Simmons et
al., 2011; Pashler & Wagenmakers, 2012; also see Ioannidis, 2005). However, it would be easy
for replications to take a misguided “magic bullet” approach—that is, to assume that an
intervention that worked in one place for one group should work in another place for another
group (Yeager & Walton, 2011). This is why Yeager and Walton (2011) stated that
experimenters “should [not] hand out the original materials without considering whether they
would convey the intended meaning” for the group in question (p. 291; also see Wilson,
Aronson, & Carlsmith, 2010). The present research therefore followed a procedure of (a)
beginning with the original materials as a starting point; (b) using a design methodology to
increase the likelihood that they conveyed the intended meaning in the targeted population (9th
graders); and (c) conducting well-powered randomized experiments in advance to ensure that, in
fact, the materials were appropriate and effective in the target population.
Study 2 showed that, when this was done, the revised mindset intervention raised grades
for previously low-achieving students, even when all data were collected, aggregated, and
cleaned by an independent third-party, and when all involved were blind to experimental
condition. Furthermore, the focal tests were pre-registered and required only minor adjustment
once data arrived. The present research therefore provides a rigorous replication of Paunesku et
al. (2015). This addresses skepticism about whether a two-session, self-administered
psychological intervention can, under some conditions, measurably improve the grades of
previously low-performing students.
Nevertheless, the pre-registered hypothesis reported here does not advance our theory of
mechanisms for psychological intervention effects. For instance, an under-studied mechanism
involves what happens between learning the message and earning higher grades. Which
behaviors are different? Are students forming better relationships? Are teachers noticing their
improved performance? One way to study these mechanisms is to examine how contextual
factors moderate the growth mindset program’s effects on grades, a high priority for future
research. Understanding what student learning opportunities must be in place before a growth
mindset can increase achievement will also be crucial for developing policy advice and for
further improving the intervention program and its impact. That is, although the revised
intervention had a measurable impact for lower achieving students, it may be possible to have a
more substantial impact.
Divergence of Self-Reports and Behavior
One theme in the present study’s results—and a theme dating back to the original
psychological interventions in education (Wilson & Linville, 1982)—is a disconnect between
self-reports and actual behavior. On the basis of only the self-reported fixed mindset results, we
might have concluded that the intervention affected high achievers more, but on the basis of the
GPA results, we might conclude that the treatment “worked” better for students who were lower in prior
achievement. That is, self-reports and grades showed moderation in opposite directions:
lower achievers benefitted more in grades and less in self-reports.
What accounts for this? Two possibilities are germane. First, grades follow a non-normal
distribution that suffers from a range restriction at the top, in part due to grade inflation. Thus
higher-achievers may not have room to grow in their grades. But higher-achievers may also be
those who read material more carefully, absorb the messages they are taught, and show this when
answering the self-report questions. These patterns could explain smaller GPA effects but larger
manipulation-check effects for higher-achievers.
Another possibility is substantive. High-achieving students may have been inspired to
take on more challenging work that might not lead to higher grades but might teach them more
(indeed, this was our challenge-seeking measure). This is one reason why it could have been
informative to give students a norm-referenced exam at the end of the term as in some
intervention studies (Good et al., 2003; Hanselman, Bruch, Gamoran, & Borman, 2014). If
higher-achievers were choosing harder tasks, their grades may have suffered, but their learning
may have improved. In the present research, we could not administer our own test due to the
light-touch nature of the intervention, and state-administered exams only measure learning when
instruction is also held constant, which was not true in our diverse sample of schools. Future
research conducted where instruction is uniform could therefore use test scores as outcomes.
Finally, Study 1’s development of a brief behavioral measure of challenge seeking—the
“make-a-worksheet” task—was a methodological advance. It may avoid known problems with
using self-reports to evaluate behavioral interventions (see Duckworth & Yeager, 2015). This
may make it a practical and useful proxy outcome for evaluating brief, web-based psychological
interventions (see Duckworth & Yeager, 2015; Yeager & Bryk, 2015).
Limitations
The present research has many limitations. First, we only had access to grades at the end
of the first term of 9th grade—in some cases, a few weeks after the treatment. It would be
informative to collect grades over a longer period of time.
Second, we have not attempted here to understand heterogeneity in effect sizes (cf.
Hanselman et al. 2014). Building on the results of the present study, we are now poised to carry
out a systematic examination of student, teacher, and school factors that cause heterogeneous
treatment effects. Indeed, with further revision to the intervention materials, we are carrying out
a version of Study 2 with students in a national sample of randomly-selected U.S. public high
schools. The present study provides a basis for this.
Third, the mindset effect sizes may appear to be small. However, the effect sizes conform
to expectations, and are practically meaningful.5 They were obtained in highly heterogeneous
contexts, without oversight by researchers. With lower experimental control and a census of
students, there may be greater error variance, resulting in lower effect sizes compared to some
previous studies. Even with this error variance, the present study’s effect sizes represent a
meaningful improvement in core educational outcomes, from a policy perspective. Considering
that 9th grade course failure almost perfectly predicts eventual high school graduation rates
(Allensworth & Easton, 2005), if the growth mindset treatment reduces by 4 percentage points
the proportion of 9th graders who earn D/F averages, then a fully scaled and spread version of this
program could in theory prevent 100,000 high school dropouts in the U.S. per year—while
increasing the learning-oriented behavior of many other students.
5 The correlations with fixed mindset beliefs in Table 2 ranged from r=.12 to r=.40, which
matches a meta-analysis of implicit theories effects (range of r=.15 to r=.24; Burnette et al.,
2013). The present effects also match the average of social-psychological effects more
generally, r=.21 (Richard, Bond, & Stokes-Zoota, 2003).
This research began with a simple question: is it possible to take an existing, initially
effective psychological intervention and re-design it to improve the outcomes for a population of
students? Now that we have shown this is possible for a growth mindset intervention and
described a method for doing so, it will be exciting to see how this approach can be applied to
and improved for other interventions.
Furthermore, this particular intervention—the growth mindset during the transition to
high school—is now ready for even more rigorous evaluation in more heterogeneous contexts. If
effective, would it be possible to disseminate it in such a way that would actually reduce the
large number of high school dropouts each year? We look forward to answering this question.
References
Allensworth, E. M., & Easton, J. Q. (2005). The on-track indicator as a predictor of high school
graduation. Consortium on Chicago School Research, University of Chicago.
Aronson, E. (1999). The power of self-persuasion. American Psychologist, 54, 875–884.
Aronson, J., Fried, C., & Good, C. (2002). Reducing the effects of stereotype threat on African
American college students by shaping theories of intelligence. Journal of Experimental
Social Psychology, 38, 113–125.
Bem, D. J. (1965). An experimental analysis of self-persuasion. Journal of Experimental Social
Psychology, 1, 199-218.
Blackwell, L. S., Trzesniewski, K. H., & Dweck, C. S. (2007). Theories of intelligence and
achievement across the junior high school transition: A longitudinal study and an
intervention. Child Development, 78, 246–263.
Bloom, B. S. (1984). The 2 sigma problem: The search for methods of group instruction as
effective as one-to-one tutoring. Educational Researcher, 13(6), 4-16.
Bower, G. H., & Gilligan, S. G. (1979). Remembering information related to one's self. Journal
of Research in Personality, 13, 420-432.
Brehm, J. W. (1966). A theory of psychological reactance. New York, NY: Academic Press.
Bryk, A. S. (2009). Support a science of performance improvement. Phi Delta Kappan, 90(8).
Burnette, J. L., O'Boyle, E. H., VanEpps, E. M., Pollack, J. M., & Finkel, E. J. (2013). Mind-sets
matter: A meta-analytic review of implicit theories and self-regulation. Psychological
Bulletin, 139, 655.
Cialdini, R. B., Kallgren, C. A., & Reno, R. R. (1991). A focus theory of normative conduct: A
theoretical refinement and re-evaluation. Advances in Experimental Social Psychology,
24, 201–234.
Cohen, G. L., Garcia, J., Apfel, N., & Master, A. (2006). Reducing the racial achievement gap: A
social-psychological intervention. Science, 313, 1307–1310.
Cohen, G. L. & Prinstein, M. J. (2006). Peer contagion of aggression and health-risk behavior
among adolescent males: An experimental investigation of effects on public conduct and
private attitudes. Child Development, 77, 967-983.
Cohen, G. L., Garcia, J., Purdie-Vaughns, V., Apfel, N., & Brzustoski, P. (2009). Recursive
processes in self-affirmation: Intervening to close the minority achievement gap. Science,
324, 400–403.
Cohen, G. L., & Sherman, D. K. (2014). The psychology of change: Self-affirmation and social
psychological intervention. Annual Review of Psychology, 65, 333-371.
Cooper, J., & Fazio, R. H. (1984). A new look at dissonance theory. Advances in Experimental
Social Psychology, 17, 229-266.
Davis, J., & Bauman, K. (2013). School enrollment in the United States: 2011. United States
Census Bureau.
Duckworth, A. L., & Quinn, P. D. (2009). Development and validation of the Short Grit Scale
(GRIT–S). Journal of Personality Assessment, 91, 166-174.
Duckworth, A.L., & Yeager, D.S. (in press). Measurement matters: Assessing personal qualities
other than cognitive ability. Educational Researcher.
Dweck, C. (1999). Self-theories: Their role in motivation, personality, and development.
Philadelphia, PA: Psychology Press.
Dweck, C. S. (2006). Mindset. New York, NY: Random House.
Dweck, C. S., & Leggett, E. L. (1988). A social-cognitive approach to motivation and
personality. Psychological Review, 95, 256–273.
Eccles, J. S., & Wigfield, A. (2002). Motivational beliefs, values, and goals. Annual Review of
Psychology, 53, 109-132.
Elliot, A. J., & Dweck, C. S. (Eds.) (2005). Handbook of competence and motivation. New York,
NY: Guilford Press.
Erikson, E. H. (1968). Identity: Youth and Crisis. New York, NY: Norton.
Fryberg, S. A., Covarrubias, R., & Burack, J. (2013). Cultural models of education and academic
performance for Native American and European American students. School Psychology
International, 34, 439-452.
Funder, D. C., Levine, J. M., Mackie, D. M., Morf, C. C., Sansone, C., Vazire, S., & West, S. G.
(2014). Improving the dependability of research in personality and social psychology:
Recommendations for research and educational practice. Personality And Social
Psychology Review, 18, 3-12.
Garcia, J., & Cohen, G. L. (2012). Social psychology and educational intervention. In E. Shafir
(Ed.), Behavioral foundations of policy. New York, NY: Russell Sage.
Godes, O., Hulleman, C. S., & Harackiewicz, J. M. (2007). Boosting students’ interest in math
with utility value: Two experimental tests. Meeting of the American Educational
Research Association, Chicago, IL.
Gollwitzer, P. M. (1999). Implementation intentions: Strong effects of simple plans. American
Psychologist, 54, 493.
Good, C., Aronson, J., & Inzlicht, M. (2003). Improving adolescents’ standardized test
performance: An intervention to reduce the effects of stereotype threat. Journal of
Applied Developmental Psychology, 24, 645–662.
Hanselman, P., Bruch, S.K., Gamoran, A., & Borman, G.D. (2014). Threat in context: School
moderation of the impact of social identity threat on racial/ethnic achievement gaps.
Sociology of Education, 87, 106-124.
Hasebe, Y., Nucci, L., & Nucci, M. S. (2004). Parental control of the personal domain and
adolescent symptoms of psychopathology: A cross-national study in the United States
and Japan. Child Development, 75, 815-828.
Henderson, V. L., & Dweck, C. S. (1990). Motivation and achievement. In S. S. Feldman & G.
R. Elliott (Eds.), At the threshold: The developing adolescent (pp. 308-329). Cambridge,
MA: Harvard University Press.
Hulleman, C. S., & Harackiewicz, J. M. (2009). Making education relevant: Increasing interest
and performance in high school science classes. Science, 326, 1410–1412.
Ioannidis, J. P. (2005). Why most published research findings are false. PLoS Medicine, 2, e124.
Kelley, T., & Kelley, D. (2013). Creative confidence: Unleashing the creative potential within
us all.
Kohavi, R., & Longbotham, R. (2015). Online controlled experiments and A/B tests. In C.
Sammut and G. Webb (Eds.), Encyclopedia of Machine Learning and Data Mining. NY,
NY: Springer.
Kuncel, N. R., Crede, M., & Thomas, L. L. (2005). The validity of self-reported grade point
averages, class ranks, and test scores: A meta-analysis and review of the literature.
Review of Educational Research, 75, 63-82.
Lehrer, J. (2010). The truth wears off. The New Yorker, 13, 52.
Lepper, M. R., Woolverton, M., Mumme, D. L., & Gurtner, J. L. (1993). Motivational
techniques of expert human tutors: Lessons for the design of computer-based tutors. In S.
P. Lajoie & S. J. Derry (Eds.), Computers as cognitive tools: Technology in education.
Hillsdale, NJ: Erlbaum.
Lewin, K. (1952). Group decision and social change. In G. E. Swanson, T. M. Newcomb, & E.
L. Hartley (Eds.), Readings in social psychology (2nd ed., pp. 330–344). New York, NY:
Lord, C. G. (1980). Schemas and images as memory aids: Two modes of processing social
information. Journal of Personality and Social Psychology, 38, 257-269.
McCannon, C.J., Schall, M.W., & Perla, R.J. (2008). Planning for Scale: A Guide for Designing
Large-Scale Improvement Initiatives. IHI Innovation Series white paper. Cambridge,
Massachusetts: Institute for Healthcare Improvement.
Mueller, C. M., & Dweck, C. S. (1998). Praise for intelligence can undermine children’s
motivation and performance. Journal of Personality and Social Psychology, 75, 33–52.
Nucci, L. P., Killen, M., & Smetana, J. G. (1996). Autonomy and the personal: Negotiation and
social reciprocity in adult-child social exchanges. New Directions for Child and
Adolescent Development, 1996, 7-24.
Open Science Collaboration (2015). Estimating the reproducibility of psychological science.
Science, 349, aac4716.
Pashler, H., & Wagenmakers, E. J. (2012). Editors’ introduction to the special section on
replicability in psychological science: A crisis of confidence? Perspectives on
Psychological Science, 7, 528-530.
Paunesku, D., Walton, G.M., Romero, C.L., Smith, E.N., Yeager, D.S., & Dweck, C.S. (2015).
Mindset interventions are a scalable treatment for academic underachievement.
Psychological Science, 26, 784-793.
Puma, M. J., Olsen, R. B., Bell, S. H., & Price, C. (2009). What to do when data are missing in
group randomized controlled trials (NCEE 2009-0049). Washington, DC: National
Center for Education Evaluation and Regional Assistance, Institute of Education
Sciences, U.S. Department of Education.
Richard, F.D., Bond, C.F., & Stokes-Zoota, J.J. (2003). One hundred years of social psychology
quantitatively described. Review of General Psychology, 7, 331-363.
Ries, E. (2011). The lean startup: How today's entrepreneurs use continuous innovation to
create radically successful businesses. Random House LLC.
Robins, R. W., & Pals, J. L. (2002). Implicit self-theories in the academic domain: Implications
for goal orientation, attributions, affect, and self-esteem change. Self and Identity, 1, 313-336.
Ross, L., & Nisbett, R. E. (1991). The person and the situation: Perspectives of social
psychology. New York, NY: McGraw-Hill.
Schimmack, U. (2012). The ironic effect of significant results on the credibility of multiple-study
articles. Psychological Methods, 17, 551.
Schooler, J. (2011). Unpublished results hide the decline effect. Nature, 470, 437.
Schooler, J.W. (2014). Turning the lens of science on itself: Verbal overshadowing, replication,
and metascience. Perspectives on Psychological Science, 9, 579-584.
Simmons, J.P., Nelson, L.D., & Simonsohn, U. (2011). False-positive psychology: Undisclosed
flexibility in data collection and analysis allows presenting anything as significant.
Psychological Science, 22, 1359-1366.
Skurnik, I., Yoon, C., Park, D. C., & Schwarz, N. (2005). How warnings about false claims
become recommendations. Journal of Consumer Research, 31, 713-724.
Stephens, N.M., Hamedani, M.G., & Destin, M. (2014). Closing the social-class achievement
gap: A difference-education intervention improves first-generation students’ academic
performance and all students’ college transition. Psychological Science, 25, 943-953.
Stipek, D. (2002). Motivation to learn: From theory to practice (4th ed.). Needham Heights,
MA: Allyn & Bacon.
Tsukayama, E., Duckworth, A. L., & Kim, B. E. (2013). Domain-specific impulsivity in school-
age children. Developmental Science, 16, 879-893.
Walton, G. M. (2014). The new science of wise psychological interventions. Current Directions
in Psychological Science, 23, 73-82.
Walton, G. M., & Cohen, G. L. (2007). A question of belonging: Race, social fit, and
achievement. Journal of Personality and Social Psychology, 92, 82-96.
Walton, G. M., & Cohen, G. L. (2011). A brief social-belonging intervention improves academic
and health outcomes among minority students. Science, 331, 1447–1451.
Wilson, T. D. (2002). Strangers to ourselves: Discovering the adaptive unconscious. Cambridge,
MA: Harvard University Press.
Wilson, T. D., Aronson, E., & Carlsmith, K. (2010). Experimentation in social psychology.
Handbook of Social Psychology (pp. 51-81).
Wilson, T. D., & Linville, P. W. (1982). Improving the academic performance of college
freshmen: Attribution therapy revisited. Journal of Personality and Social Psychology,
42, 367–376.
Wilson, T. D., & Linville, P. W. (1985). Improving the performance of college freshmen with
attributional techniques. Journal of Personality and Social Psychology, 49, 287–293.
Yeager, D.S., & Bryk, A. (2015). Practical measurement. Unpublished Manuscript, University of
Texas at Austin.
Yeager, D.S. & Dweck, C.S. (2012). Mindsets that promote resilience: When students believe
that personal characteristics can be developed. Educational Psychologist, 47, 302-314.
Yeager, D.S., Henderson, M., Paunesku, D., Walton, G., Spitzer, B., D’Mello, S., & Duckworth,
A.L. (2014). Boring but important: A self-transcendent purpose for learning fosters
academic self-regulation. Journal of Personality and Social Psychology, 107, 559-580.
Yeager, D.S., Johnson, R., Spitzer, B., Trzesniewski, K., Powers, J., & Dweck, C.S. (2014). The
far-reaching effects of believing people can change: Implicit theories of personality shape
stress, health, and achievement during adolescence. Journal of Personality and Social
Psychology, 106, 867-884.
Yeager, D.S., Purdie-Vaughns, V., Garcia, J., Apfel, N., Pebley, P., Master, A., Hessert, W.,
Williams, M. & Cohen, G.L. (2014). Breaking the cycle of mistrust: Wise interventions
to provide critical feedback across the racial divide. Journal of Experimental Psychology:
General, 143, 804-824.
Yeager, D.S. & Walton, G. (2011). Social-psychological interventions in education: They’re not
magic. Review of Educational Research, 81, 267-301.
Yeager, D.S., Walton, G., & Cohen, G.L. (2013). Addressing achievement gaps with
psychological interventions. Phi Delta Kappan, 94(5), 62-65.
Table 1. Designing the Revised Mindset Treatment: Manipulations and Results from
Rapid, A/B Experiments of Growth Mindset Intervention Elements Conducted on MTurk.
Each contrast reports its effect on pre–post change (Δ) in fixed mindset.

Study 1 (total N = 1,851)

1. Direct ("this will help you") framing = 1 vs. indirect ("help others") framing = 0.
   β = -0.234, p = .056.
   Direct: "Would you like to be smarter? Being smarter helps teens become the person
   they want to be in life… In this program, we share the research on how people can
   get smarter." vs. Indirect: "Students often do a great job explaining ideas to their
   peers because they see the world in similar ways. On the following pages, you will
   read some scientific findings about the human brain. ... We would like your help to
   explain this information in more personal ways that students will be able to
   understand. We'll use what we learn to help us improve the way we talk about these
   ideas with students in the future."

2. Refutation of fixed mindset = 1 vs. not = 0. β = -0.402, p = .002.
   Refutation: "Some people seem to learn more quickly than others, for example, in
   math. You may think, 'Oh, they're just smarter.' But you don't realize that they may
   be working really hard at it (harder than you think)."

3. "Labeling and benefits of mindset" = 1 vs. not = 0. β = 0.357, p = .006.
   Benefits of mindset: "People with a growth mindset know that mistakes and setbacks
   are opportunities to learn. They know that their brains can grow the most when they
   do something difficult and make mistakes. We studied all the 10th graders in the
   nation of Chile. The students who had a growth mindset were 3 times more likely to
   score in the top 20% of their class. Those with a fixed mindset were more likely to
   score in the bottom 20%."

4. Rats/jugglers evidence = 1 vs. teenagers evidence = 0. β = -0.122, p = .352.
   Rats/Jugglers: "Scientists used a brain scanner (it's like a camera that looks into
   your brain) to compare the brains of the two groups of people. They found that the
   people who learned how to juggle actually grew the parts of their brains that control
   juggling skills." vs. Teenagers: "Many of the [teenagers] in the study showed large
   changes in their intelligence scores. And these same students also showed big
   changes in their brain. ... This shows that teenagers' brains can literally change and
   become smarter—if you know how to make it happen."

Study 2 (total N = 1,153)

1. Direct ("this will help you") framing = 1 vs. indirect ("help others") framing = 0.
   β = -0.319, p = .036.
   Direct: "Why does getting smarter matter? Because when people get smarter, they
   become more capable of doing the things they care about. Not only can they earn
   higher grades and get better jobs, they can have a bigger impact on the world and on
   the people they care about... In this program, you'll learn what science says about
   the brain and about making it smarter." vs. Indirect: (see above).

2. Refutation of fixed mindset = 1 vs. not = 0. β = 0.002, p = .990.
   Refutation: "Some people look around and say 'How come school is so easy for
   them, but I have to work hard? Are they smarter than me?' ... The key is not to focus
   on whether you're smarter than other people. Instead, focus on whether you're
   smarter today than you were yesterday and how you can get smarter tomorrow than
   you are today."

3. Celebrity endorsement = 1 vs. not = 0. β = 0.273, p = .073.
   Celebrity: Endorsements from Scott Forstall, LeBron James, and Michelle Obama.
Note: β= standardized regression coefficient for condition contrast in multiple linear regression.
Table 2. Correlations Among Measures in Study 1 Replicate Prior Growth Mindset Effects
and Validate the “Make-A-Worksheet” Task.

Variable | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9
1. Actual easy (minus hard) problems selected | | | | | | | | |
2. Grit | -.16 | | | | | | | |
3. Self-control | -.14 | .51 | | | | | | |
4. Fixed mindset (Time 2) | .13 | -.16 | -.16 | | | | | |
5. Prior performance | -.17 | .40 | .37 | -.25 | | | | |
6. Fixed trait attributions | .17 | -.27 | -.22 | .28 | -.30 | | | |
7. Performance-avoidance goals | .10 | -.13 | -.14 | .23 | -.14 | .21 | | |
8. Interest in math | -.23 | .27 | .29 | -.14 | .48 | -.27 | -.10 | |
9. Math anxiety | .11 | -.07 | -.08 | .12 | -.35 | .23 | .14 | -.29 |
10. Hypothetical willingness to select the easy (not hard) math problem | .29 | -.19 | -.16 | .12 | -.14 | .25 | .12 | -.26 | .12
Note: Ns range from 6,883 to 7,251; all ps < .01. Data are from both conditions; correlations did
not differ across experimental conditions.
Table 3. Effects of Condition on Fixed Mindset, Attributions, Performance-Avoidance
Goals, and Challenge-Seeking in Studies 1 and 2.

Measure | Study 1: Original M (SD) | Study 1: Revised M (SD) | t | Study 2: Control M (SD) | Study 2: Revised M (SD) | t
Pre-treatment (Time 1) fixed mindset | 3.20 (1.15) | 3.22 (1.15) | 1.18 | 3.07 (1.12) | 3.09 (1.14) | 0.30
Post-treatment (Time 2) fixed mindset | 2.98 (1.21) | 2.74 (1.21) | 9.32*** | 2.89 (1.14) | 2.54 (1.16) | 12.16***
Change from Time 1 to Time 2 | -0.22 | -0.48 | | -0.17 | -0.55 |
Fixed trait attributions | 2.14 (0.87) | 2.08 (0.82) | 2.71** | 2.12 (0.86) | 2.03 (0.82) | 3.58***
Performance-avoidance goals | 3.40 (1.51) | 3.31 (1.52) | 2.60** | 3.57 (1.55) | 3.42 (1.56) | 2.95**
Actual easy (minus hard) math problems selected at Time 2 | 4.62 (11.52) | 2.42 (11.38) | | — | — |
Hypothetical willingness to select the easy (not hard) math problem at Time 2 | 60.4% | 51.2% | 7.95*** | 54.4% | 45.3% | 5.71***
N (both Time 1 and 2) | 3,665 | 3,480 | | 1,646 | 1,630 |

Note: Mindset change scores from Time 1 to Time 2 significant at p < .001.
*** p < .001, ** p < .01.
Table 4. School Characteristics in Study 2.

School | School-year start | Modal start date for Time 1 | GreatSchools.org rating | Avg. GPA | % White | % Hispanic/Latino | % Black/African-American | % Asian | % living below poverty line (in district) | State
1 | 8/18/14 | 10/6/14 | 4 | 3.13 | 20 | 26 | 32 | 22 | 18.3 | CA
2 | 8/18/14 | 10/3/14 | 2 | 3.22 | 3 | 57 | 33 | 7 | 18.3 | CA
3 | 9/3/14 | 10/23/14 | 2 | 3.23 | 10 | 13 | 74 | 3 | 41.0 | NY
4 | 8/25/14 | 9/30/14 | 7 | 2.65 | 62 | 19 | 10 | 9 | 10.8 | NC
5 | 8/25/14 | 10/6/14 | 5 | 3.11 | 41 | 17 | 40 | 1 | 13.7 | NC
6 | 8/25/14 | 10/7/14 | 2 | 2.95 | 29 | 19 | 48 | 2 | 13.7 | NC
7 | 8/26/14 | 9/23/14 | 7 | 3.00 | 78 | 21 | 1 | 0 | 11.6 | TX
8 | 8/25/14 | 9/18/14 | 4 | 3.21 | 11 | 78 | 6 | 4 | 27.8 | TX
9 | 8/25/14 | 10/8/14 | 8 | 3.07 | 52 | 27 | 11 | 9 | 27.8 | TX
10 | 9/2/14 | 10/29/14 | 6 | 3.43 | 55 | 16 | 26 | 3 | 5.9 | VA
Table 5. Regressions Predicting Post-treatment (Time 2) Fixed Mindset, Study 2.

Predictor | Base model | + School fixed effects | + Demographics | + Pre-treatment mindset
Intercept | 2.959*** (0.028) | 2.933*** (0.079) | 2.943*** (0.086) | 2.989*** (0.071)
Revised mindset treatment | -0.389*** (0.039) | -0.394*** (0.039) | -0.397*** (0.039) | -0.391*** (0.032)
Prior achievement (z-scored, centered at 0) | -0.156*** (0.028) | -0.182*** (0.029) | -0.173*** (0.029) | -0.036 (0.025)
Treatment × Prior achievement (z-scored, centered at 0) | -0.181*** (0.040) | -0.177*** (0.039) | -0.176*** (0.039) | -0.183*** (0.033)
School 2 | | 0.111 (0.120) | 0.109 (0.121) | 0.040 (0.100)
School 3 | | 0.075 (0.137) | 0.041 (0.139) | -0.009 (0.115)
School 4 | | -0.561*** (0.115) | -0.569*** (0.116) | -0.284** (0.096)
School 5 | | -0.083 (0.087) | -0.119 (0.088) | -0.096 (0.073)
School 6 | | -0.126 (0.097) | -0.169 (0.099) | -0.028 (0.082)
School 7 | | 0.012 (0.108) | -0.003 (0.110) | -0.012 (0.091)
School 8 | | 0.218* (0.091) | 0.222* (0.094) | 0.091 (0.078)
School 9 | | 0.144 (0.094) | 0.129 (0.095) | 0.099 (0.079)
School 10 | | 0.214* (0.094) | 0.201* (0.094) | 0.037 (0.078)
Female | | | 0.020 (0.039) | -0.031 (0.032)
Asian | | | -0.115 (0.084) | -0.150* (0.069)
Hispanic/Latino | | | -0.024 (0.051) | -0.000 (0.043)
Black/African-American | | | 0.036 (0.058) | -0.008 (0.048)
Repeating freshman year | | | 0.356** (0.137) | 0.180 (0.114)
Pre-treatment fixed mindset (z-scored, centered at 0) | | | | 0.704***
Treatment × Pre-treatment fixed mindset (z-scored, centered at 0) | | | | -0.120***
Adjusted R² | .076 | .099 | .100 | .383
AIC | 10103.731 | 10030.747 | 10015.772 | 8759.446
N | 3279 | 3279 | 3274 | 3267

Note: Unstandardized coefficients with standard errors in parentheses.
* p < .05, ** p < .01, *** p < .001.
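The models in Tables 5 and 6 use a centering trick worth making explicit: when the (z-scored) prior-achievement moderator is re-centered at -1 SD, the coefficient on the treatment dummy becomes the estimated treatment effect for lower-achieving students, with school fixed effects absorbing between-school differences. A minimal sketch with simulated data (the variable names, effect sizes, and statsmodels formula API are illustrative assumptions, not the authors' code):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 3000
df = pd.DataFrame({
    "treat": rng.integers(0, 2, n),
    "prior": rng.normal(0.0, 1.0, n),   # z-scored prior achievement
    "school": rng.integers(1, 11, n),   # 10 schools, as in Study 2
})

# simulated GPA: the treatment helps most at low prior achievement
df["gpa"] = (2.0 + 0.65 * df["prior"]
             + 0.12 * df["treat"]                        # effect at prior = -1 SD
             - 0.08 * df["treat"] * (df["prior"] + 1.0)  # effect fades as prior rises
             + rng.normal(0.0, 0.8, n))

# center prior achievement at -1 SD: the `treat` coefficient now estimates the
# treatment effect for lower-achieving students, with school fixed effects
df["prior_c"] = df["prior"] + 1.0
fit = smf.ols("gpa ~ treat * prior_c + C(school)", data=df).fit()
effect_low = fit.params["treat"]
interaction = fit.params["treat:prior_c"]
```

The same fitted model yields the effect at any other achievement level simply by re-centering `prior_c`, which is why the tables can report the low-achiever effect and the interaction from one regression.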
Table 6. Regressions Predicting End-of-Term GPA in Math, Science, and English, Study 2.

Predictor | Base model | + School fixed effects | + Demographics | + Pre-treatment mindset
Intercept | 1.556*** (0.035) | 1.584*** (0.063) | 1.581*** (0.067) | 1.589*** (0.066)
Revised mindset treatment (among low prior-achievers, -1 SD) | 0.119* (0.049) | 0.125** (0.047) | 0.123** (0.045) | 0.135** (0.046)
Prior achievement (z-scored, centered at -1 SD) | 0.693*** (0.024) | 0.721*** (0.024) | 0.663*** (0.023) | 0.641*** (0.024)
Treatment × Prior achievement (z-scored, centered at -1 SD) | -0.079* (0.034) | -0.082* (0.033) | -0.080* (0.032) | -0.091** (0.032)
School 2 | | -0.358*** (0.089) | -0.233** (0.086) | -0.217* (0.086)
School 3 | | -0.748*** (0.105) | -0.656*** (0.102) | -0.643*** (0.101)
School 4 | | 0.895*** (0.090) | 0.838*** (0.087) | 0.804*** (0.087)
School 5 | | -0.037 (0.066) | 0.035 (0.065) | 0.031 (0.064)
School 6 | | -0.070 (0.075) | 0.069 (0.074) | 0.046 (0.073)
School 7 | | -0.308*** (0.086) | -0.325*** (0.084) | -0.313*** (0.084)
School 8 | | -0.322*** (0.070) | -0.182** (0.070) | -0.158* (0.070)
School 9 | | 0.005 (0.073) | 0.015 (0.071) | 0.029 (0.070)
School 10 | | 0.188* (0.074) | 0.191** (0.071) | 0.224** (0.071)
Female | | | 0.327*** (0.031) | 0.338*** (0.031)
Asian | | | 0.189** (0.067) | 0.190** (0.067)
Hispanic/Latino | | | -0.287*** (0.041) | -0.284*** (0.041)
Black/African-American | | | -0.322*** (0.046) | -0.309*** (0.046)
Repeating freshman year | | | -0.922*** (0.108) | -0.894*** (0.108)
Pre-treatment fixed mindset (z-scored, centered at 0) | | | | -0.110***
Treatment × Pre-treatment fixed mindset (z-scored, centered at 0) | | | | -0.018
Adjusted R² | .295 | .358 | .407 | .416
AIC | 9781.892 | 9467.187 | 9196.462 | 9112.995
N | 3448 | 3448 | 3448 | 3438

Note: Unstandardized coefficients with standard errors in parentheses.
* p < .05, ** p < .01, *** p < .001.

Figure 1. Screenshots of the “original” and “revised” mindset treatments.
[Screenshots: left panel, Original; right panel, Revised.]
Figure 2. Screenshots from the “Make-A-Worksheet” task.
Appendix 1: Deviations from Pre-Registered Analysis Plan
The study was pre-registered before the dataset was delivered to the researchers by the third-
party research firm. However, after viewing the data, but before testing for the effects of the
treatment variable, we discovered that some of the pre-registered methods could not be carried
out. Because there are not yet standards for pre-registration in psychological science, the
research team used its best judgment to make minor adjustments. In the end, we selected the
analyses that came closest to replicating the Treatment × Prior achievement interaction reported
by Paunesku et al. (2015). The deviations were:
- We initially expected to exclude students on an individualized special education plan, but
that variable was not delivered to us for any school.
- We pre-registered a second way of coding prior performance: including self-reported
prior grades in the composite. However, we later discovered a meta-analysis showing
that self-reported grades are inappropriate for tests of moderation, because lower-performing
students show greater bias and measurement error, even though self-reported grades are an
effective linear covariate (Kuncel, Credé, & Thomas, 2005).
- We hoped to include social studies grades as a core subject, but schools did not define
which classes were social studies and could not be contacted to clarify, so we classified
only math, science, and English as core classes.
- We pre-registered other analyses that will be the subject of other working papers: effects
on stress (described as exploratory in our pre-registered plan), and moderation of
treatment effects by race/ethnicity and gender (which were not found in Paunesku et al.,
2015, and so, strictly speaking, are not part of the replication reported here).
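The point about self-reported grades as a moderator can be illustrated with a small simulation (not from the article; the effect sizes and reliability value are illustrative assumptions): classical measurement error in a moderator attenuates the estimated Treatment × Moderator interaction toward zero, which is why an error-laden self-report can serve as a covariate but not as a moderator.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
prior = rng.normal(0.0, 1.0, n)            # true prior achievement (z-scored)
treat = rng.integers(0, 2, n).astype(float)

# outcome with a true Treatment x Prior achievement interaction of -0.20
y = 0.10 * treat + 0.50 * prior - 0.20 * treat * prior + rng.normal(0.0, 1.0, n)

def interaction_estimate(moderator):
    """OLS coefficient on the treat x moderator term."""
    X = np.column_stack([np.ones(n), treat, moderator, treat * moderator])
    coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coefs[3]

true_b = interaction_estimate(prior)

# self-report: true score plus measurement error (reliability ~ .50),
# standing in for biased self-reported grades
self_report = prior + rng.normal(0.0, 1.0, n)
attenuated_b = interaction_estimate(self_report)
```

With reliability around .50, the interaction estimated from the noisy self-report is roughly half the size of the one estimated from the true moderator, and differential error across achievement levels (as the meta-analysis documents) would distort it further.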