Why are people antiscience, and what can we do about it?
Aviva Philipp-Muller (a,1,2), Spike W. S. Lee (b,c), and Richard E. Petty (a)
Edited by Timothy Wilson, University of Virginia, Charlottesville, VA; received February 3, 2022; accepted May 19, 2022
From vaccination refusal to climate change denial, anti-
science views are threatening humanity. When different
individuals are provided with the same piece of scientific
evidence, why do some accept whereas others dismiss it?
Building on various emerging data and models that have
explored the psychology of being antiscience, we specify
four core bases of key principles driving antiscience atti-
tudes. These principles are grounded in decades of
research on attitudes, persuasion, social influence, social
identity, and information processing. They apply across
diverse domains of antiscience phenomena. Specifically,
antiscience attitudes are more likely to emerge when a
scientific message comes from sources perceived as lack-
ing credibility; when the recipients embrace the social
membership or identity of groups with antiscience atti-
tudes; when the scientific message itself contradicts what
recipients consider true, favorable, valuable, or moral; or
when there is a mismatch between the delivery of the sci-
entific message and the epistemic style of the recipient.
Politics triggers or amplifies many principles across all
four bases, making it a particularly potent force in anti-
science attitudes. Guided by the key principles, we
describe evidence-based counteractive strategies for
increasing public acceptance of science.
antiscience | attitudes | social identity | politics | science communication
From refusing to get vaccinated against COVID-19 (1) to
ignoring worsening climate change (2), rejection of scientific
information is costing lives now and will continue to do so
in the future. One need only look at recent polling data to
find concerning cases of public rejection of scientific evi-
dence or denial of solutions with high levels of consensus
among scientists. For example, a September 2021 poll
found that only 61% of Americans saw COVID-19 as a major
public health threat (3). Another recent poll found that 40%
of Americans do not think climate change is a major threat
(4). Additional examples abound around the world (5).
Dismissal of scientific evidence is not a new phenome-
non, however. When germ theory was proposed in the 19th
century, an anticontagionist movement rejected the notion
that disease could be spread through minuscule germs. Ear-
lier scientific discoveries, such as the heliocentric nature of
the solar system, were met with heavy opposition. But why?
From early to contemporary examples, what are the psy-
chological principles that account for people’s antiscience
views? That is, when different individuals are provided with
the same piece of scientific evidence, why do some go on to
accept and integrate it as a fact, whereas others dismiss it
as invalid or irrelevant?
Numerous scholars have pondered the antecedents of
antiscience views. Existing models have identified factors
that predict wariness of specific scientific innovations or
theories (6, 7) or antiscience attitudes overall [e.g., the atti-
tude roots and jiu jitsu models (8)]. These and other models
noted throughout our article offer important insights. But
one theoretical paradigm that has been largely ignored in
the antiscience literature, despite its substantive relevance,
is the classic perspective on attitudes and persuasion (9).
This is surprising, because antiscience views represent a cri-
sis of attitudes due to both effective persuasion by anti-
science sources and ineffective persuasion by scientific or
“proscience” sources. This is also a missed opportunity,
because classic work on persuasion has highlighted a num-
ber of explanatory processes and remediative strategies,
many of which are highly applicable to the problem of anti-
science attitudes. The goal of our article is to make these
connections explicit and constructive. We do so by connect-
ing contemporary findings and models in the antiscience
literature to key principles from decades of research on
attitudes, persuasion, social influence, social identity, and
acceptance versus rejection of information writ large. Draw-
ing these connections confers the dual scientific benefits of
organizing our understanding of antiscience phenomena
and delineating how classic components of persuasive pro-
cesses formulated in the 20th century are impacted by new
forms of social dynamics in the 21st century (e.g., vast and
fast social network effects in the spreading of misinforma-
tion on social media).
Why Are People Antiscience? An
Inclusive Framework
Distinct clusters of basic mental processes can explain
when and why people ignore, trivialize, deny, reject, or even
hate scientific information—a variety of responses that
might collectively be labeled as “being antiscience.” To orga-
nize these processes, we offer an inclusive framework that
specifies four core bases of antiscience attitudes (Table 1,
first column). In essence, people are prone to rejecting
Author affiliations: (a) Department of Psychology, The Ohio State University, Columbus, OH 43210; (b) Rotman School of Management, University of Toronto, Toronto, ON M5S 1A1, Canada; and (c) Department of Psychology, University of Toronto, Toronto, ON M5S 1A1, Canada.
Author contributions: A.P.-M., S.W.S.L., and R.E.P. wrote the paper.
The authors declare no competing interest.
This article is a PNAS Direct Submission.
Copyright © 2022 the Author(s). Published by PNAS. This article is distributed under Creative Commons Attribution-NonCommercial-NoDerivatives License 4.0 (CC BY-NC-ND).
(1) Present address: Beedie School of Business, Simon Fraser University, Burnaby, BC V5A 1S6, Canada.
(2) To whom correspondence may be addressed. Email: aviva_philipp-muller@sfu.ca.
This article contains supporting information online at http://www.pnas.org/lookup/suppl/doi:10.1073/pnas.2120755119/-/DCSupplemental.
Published July 12, 2022.
a scientific message when it comes from a source they do
not find credible (basis 1), when they, as the recipient of
the scientific message, identify with social groups that
hold antiscience attitudes (basis 2), when the scientific
message itself contradicts their related beliefs or atti-
tudes (basis 3), or when it is delivered in ways that mis-
match their motivational and cognitive approaches to
information processing (basis 4). Each of these bases
involves specific antecedents or predictors and elicits dif-
ferent nuances of psychological reaction (Table 1, second
column); they also point to different counteractive strate-
gies (Table 1, third column). Despite their differences in
focus, the four bases are unified in revealing ways in
which scientific information conflicts with people’s exist-
ing content or style of thought. Such conflicts are hard to
swallow and easy to disavow, rendering effective commu-
nication of scientific information a thorny problem—but
one that becomes more surmountable once its underlying
bases are elucidated.
In the following sections, we introduce each basis of
antiscience attitudes by highlighting key principles, identi-
fying relevant models, and reviewing illustrative findings
and real-world examples, from heavily studied domains
like vaccination to less studied ones like nanotechnology.
Next, through the conceptual lens of the four bases, we
explain why politics has particularly potent effects on anti-
science attitudes. Afterward, we present a variety of coun-
teractive strategies for increasing acceptance of science by
targeting the four bases. Finally, we conclude with theoreti-
cal contributions of our framework.
Basis 1: Source of the Scientific Message. Lay people do not
discover facts about reality in isolation, devoid of external
inputs. Instead, they rely on sources of scientific informa-
tion—scientists, or, more frequently for most people, jour-
nalists, health officials, politicians, or key opinion leaders—to
construct their understanding of the world. In general, the
more credible a source is perceived to be, the more likely
people are to accept its information and be persuaded by it.
Unfortunately, many people perceive scientists, who are
supposed to be the original source of scientific information,
as lacking credibility (10). Why?
Source credibility is composed of three pillars: expertise
(i.e., possessing specialized skills and knowledge), trustwor-
thiness (i.e., being honest), and objectivity (i.e., having unbi-
ased perspectives on reality) (11). All three are necessary.
When scientists (or anyone conveying scientific information)
are perceived as inexpert, untrustworthy, or biased, their
credibility is tainted, and they lose effectiveness at convey-
ing scientific information and changing opinions.
Although scientists are generally perceived as high in
competence and expertise (12), this perception is facing
mounting challenges. Concerns about the truth value and
robustness of scientific findings in multiple fields, from
medical to social sciences (13, 14), have received media
attention (15). Lay perception of scientists’ credibility can
even be undermined by features central to the very mis-
sion of science: Legitimate debates happen within scientific
fields, with different scientists championing different,
sometimes contradictory, perspectives, theories, hypothe-
ses, findings, and recommendations. [As a current example
Table 1. Key principles driving antiscience attitudes and counteractive strategies for addressing them

Basis 1. Source of the scientific message
Key principles driving antiscience attitudes: When sources of scientific information (e.g., scientists) are perceived as 1) inexpert, 2) untrustworthy, or 3) biased, they lack credibility, and their messages are ignored or rejected.
Counteractive strategies: 1, i) Improving perceived and actual validity of scientists’ work; 1, ii) legitimizing substantive scientific debate; 2) conveying warmth and prosocial goals in science communication and using accessible language; 3) conveying that the source is not antagonistic to the recipient, such as by providing two-sided messages that clearly state the side for which there is stronger evidence.

Basis 2. Recipient of the scientific message
Key principles driving antiscience attitudes: When scientific information activates one’s social identity as a member of a group 1) that holds antiscience attitudes or 2) that has been underrepresented in science or exploited in scientific work, it triggers ingroup favoritism and outgroup antipathy.
Counteractive strategies: 1) Activation of shared or superordinate identity; 2) engaging and collaborating with marginalized communities.

Basis 3. The scientific message itself
Key principles driving antiscience attitudes: When scientific information contradicts what people 1) believe to be true, 2) evaluate as favorable, or 3) moralize, they experience cognitive dissonance, which is more easily resolved by rejecting the scientific information than by changing existing beliefs, attitudes, or values.
Counteractive strategies: 1, i) Training in scientific reasoning; 1, ii) prebunking; 2, i) strong arguments; 2, ii) self-affirmation; 3, i) moral reframing; 3, ii) increasing the perceived naturalness and moral purity of scientific innovations.

Basis 4. Mismatch between the delivery of the scientific message and the recipient’s epistemic style
Key principles driving antiscience attitudes: When scientific information is delivered in ways that mismatch one’s 1) construal level, 2) regulatory focus, 3) need for closure, or 4) need for cognition, it tends to be rejected.
Counteractive strategies: 1–4) Matching the delivery of scientific information with the recipient’s epistemic style (e.g., framing messages as approaching gains for promotion-focused recipients but as avoiding losses for prevention-focused recipients).
at the time of our writing, scientists differ in their recom-
mendations about whether and when to roll out the sec-
ond booster shot for COVID-19 (16).] In principle, these can
be signs of a healthy scientific ecosystem. In practice, con-
tradictions between scientists, especially against the back-
drop of replicability concerns, threaten lay perceptions of
scientists’ credibility (17).
Scientists’ trustworthiness is also threatened by multiple
social forces. Distrust of elites (i.e., those with societal
influence) is on the rise (18), and scientists whose voices
are broadcast in the public sphere are often employed by
elite media and institutions. Distrust of government organ-
izations is on the rise too (19), which predicts distrust in
scientists who recommend innovations that would require
greater governmental regulation (20). Furthermore, scien-
tists have been stereotyped as cold and unfeeling in char-
acter (12, 21), which undermines the public’s willingness to
trust them (21).
Scientists’ objectivity has also been called into question.
Scientists in certain fields are portrayed and perceived as
exhibiting biased perspectives against Christian (22) and
conservative (23) values. Indeed, many religious individuals
reject science, in part, due to the perception that scientists
are atheistic (24). More generally, when scientists are
thought to have a vested interest (e.g., monetary incentives)
in persuading their audience, they are perceived as both
biased and untrustworthy (25). During the COVID-19 pan-
demic, widespread misinformation characterized prominent
public health officials as promoting the vaccine because of
their financial investment in various pharmaceutical compa-
nies (26). In short, scientists can be perceived as inexpert,
untrustworthy, or biased, which threatens their credibility in
the public eye.
Basis 2: Recipient of the Scientific Message. People vary in
how interested and willing they are to listen to different
types of information (27, 28). A powerful force that shapes
the types of information individuals expose themselves to
or actively seek out is their social identities. Substantial
research on social identity theory has found that the social
groups to which individuals belong or feel a connection
exert strong influences on their response to information
perceived to be identity relevant (29). For example, young
adults are more likely to seek out positive (vs. negative)
information about young adults (their ingroup), and older
individuals are more likely to seek out negative informa-
tion about young adults (their outgroup) (30).
Social identities play a role in antiscience attitudes and
behaviors. Those who have been underrepresented in sci-
ence or who have historically been exploited in scientific
experiments [e.g., Black and Indigenous individuals (31)]
are more skeptical of science (32). In addition to demo-
graphic groups, people can identify with interest groups
that shape antiscience attitudes. For example, those who
strongly identify as video gamers are more likely to reject
scientific evidence regarding the harms of playing video
games (33). These findings are broadly consistent with
research and models in science communication that
describe how people tend to reject scientific information
incompatible with their identities. Work on cultural cogni-
tion has highlighted how people contort scientific findings
to fit with values that matter to their cultural identities (34,
35). Relatedly, work on identity-protective cognition shows
that people selectively dismiss scientifically determined
risk assessments that threaten their identity (36), as when
White men are more likely than other racial and gender
groups to dismiss data regarding the riskiness of guns,
because guns are a more integral part of their cultural
identity (37).
Beyond the effects of identifying with specific demo-
graphic or cultural groups that can conflict with specific sci-
entific findings, some individuals identify with groups that
altogether ignore and shut down scientific thought, recom-
mendations, and evidence, in general (38, 39). This sort of
identity is often tied to other personally meaningful identi-
ties, particularly, political ones [and religious ones (39)], a
theme we elaborate on shortly. An important nuance and
caveat, however, is that, although scientists might charac-
terize some social groups as antiscience, the individuals
who identify with these groups might not think of them-
selves as explicitly or consciously disavowing science. They
might even think of themselves as proscience, in that they
believe their own views are more scientifically sound than
those of mainstream scientists (40). In what sense, then,
are they antiscience? In the sense that, if they reject the
preponderance of scientific evidence and instead favor
positions with scant or pseudoscientific support, then they
are de facto acting in opposition to how science works—
they are against the scientific approach to knowledge crea-
tion and the knowledge created by it.
In addition to being against scientific information, indi-
viduals can be against the people providing or promoting
the scientific information. This is, unfortunately, a common
aspect of social identity, namely, antipathy toward those who
do not share that identity and are thus part of the outgroup
(41). For example, those who identify as climate change
skeptics harbor hostile feelings toward climate change
believers (42). For individuals who embrace an identity
associated with antiscience attitudes, scientists are mem-
bers of the outgroup. People tend to reject what outgroup
members have to say, sometimes to the point of violence,
which can arise even in the absence of substantive rea-
sons for rejecting the outgroup member’s message other
than that it comes from the outgroup (43). These forces of
social identity reflect why many individuals who strongly
identify with antiscience groups seem to vehemently reject
scientific messages and frequently approach scientists
with hostility, even threatening their lives (44).
Similar dynamics are evident in the marked rise in con-
spiracy theories related to COVID-19 (e.g., the pandemic was
a hoax, or the vaccines contained microchips). These con-
spiracy theories often coalesce around particular social
groups and are most vehemently promoted by those who
feel highly identified with their pseudoscientific community
(45). In recent years, conspiracy theories have led to highly
visible behavior such as antimask and antivaccine protests.
Due to social media, antiscience groups can now mobilize
activists and followers more swiftly than in previous eras.
Beyond the context of COVID-19, social groups that reject
mainstream science have emerged surrounding unvalidated
treatments for Lyme disease (46) and opposition to getting
oneself or one’s children immunized in general (47).
Basis 3: The Scientific Message Itself. People do not always
think and behave in line with what science suggests. One
reason is that they are unaware of the scientific evidence
[i.e., the deficit model (48)]. Sometimes, when people sim-
ply learn about the scientific consensus, their thoughts
and feelings follow suit [i.e., the gateway belief model
(49)]. Other times, however, when scientific information
contradicts people’s existing beliefs about what is factually
true, they can reject even the strongest scientific evidence,
because harboring conflicting cognitions is aversive. This
phenomenon is known as cognitive dissonance (50), which
arises when a person is exposed to information that
conflicts with their existing beliefs, attitudes, or behaviors.
Dissonance elicits discomfort. Given this aversive feeling,
people are motivated to resolve the contradiction and
eliminate the discomfort in a number of ways, such as
rejecting the new information, trivializing the topic, ratio-
nalizing that there is no contradiction, or revising their
existing thought (51).
Critically, people tend to resolve dissonance using the
path of least resistance. To a person who has been smoking
their entire life, it is far easier to reject or trivialize scientific
evidence about the health risks of smoking than to alter
their ingrained habit. With dissonance, the intransigence of
existing beliefs resembles the stickiness of existing behav-
iors: It is easier to reject a piece of scientific information
than to revise an entire system of existing beliefs one has
accumulated and integrated into a worldview over the years,
often reinforced by social consensus. One’s existing beliefs
can be based on valid scientific information, previously
accepted but now outdated scientific information, or scien-
tific misinformation. As an example of dissonance arising
from believing outdated scientific information, for thousands
of years, it was a widespread belief that Earth was the center
of the universe and that the sun orbited Earth (52). To a per-
son who had always believed the sun revolved around Earth,
it was far easier to reject the notion of Copernican heliocen-
trism than to overhaul the geocentric model of the universe,
which was previously accepted and felt subjectively coherent
enough, and thus in no obvious need for revision.
In addition to rejecting new information from scientific
progress and updates, individuals might possess beliefs
that contradict scientific evidence due to the spread of
misinformation. The last few years have witnessed a prolif-
eration of fake news (53), catalyzed by social media, which
facilitates the rapid spread of information regardless of
whether it is true. Sadly, fake news spreads “significantly
farther, faster, deeper, and more broadly” than true news
on social media platforms, because fake news stories often
evoke stronger emotional reactions and come across as
more novel than true ones, which are attributes that
increase sharing behavior (54). Although some individuals
might be sharing misinformation merely because of inatten-
tion to veracity (not because of endorsement of content)
(55), extensive sharing of fake news among one’s ingroup
makes it likely to be accepted, due to the dynamics of social
identity outlined earlier, which can result in rapid accep-
tance of pseudoscientific or antiscientific beliefs.
Once misinformation has spread, it is difficult to correct
(56), and there is often a continued influence of the misin-
formation even after it has been retracted. Corrections
issued by media sources are typically ineffective at reduc-
ing belief in the misinformation. In fact, corrections can
sometimes reinforce the belief by making it more salient
(56). Unfortunately, misinformation on many scientific
topics has been widely disseminated, such as exaggerated
and unfounded risks of vaccines (including pre-COVID
times), denial of climate change, and dismissal of evidence
for evolution (57).
Scientific misinformation is especially difficult to correct
when it provides a causal explanation for a phenomenon.
Correcting the misinformation would leave a gap in peo-
ple’s mental model of why an event or a situation has
occurred (58) and would cause discomfort (59). People
often refill that gap with misinformation to make sense of
the issue at hand. Circling back to the example of heliocen-
trism, telling a geocentricist that Earth is actually not the
center of the universe would leave a gap in their mental
model of why the sun clearly appears to be revolving
around Earth, a gap that is easy to refill by reaffirming
their existing causal belief. Similar cognitive dynamics have
long been observed in pseudoscience (60) and continue to
result in rejection of scientific information today.
Not only do people possess beliefs about whether
things are true or false, they also evaluate things as desir-
able or undesirable (attitudes) (9), important or unimpor-
tant (values) (61), and right or wrong (morals) (62). Some
moral views are at odds with particular kinds of scientific
information, resulting in morally fueled rejection. For
example, people who endorse the moral significance of
naturalness and purity are prone to resisting scientific
technologies and innovations seen as tampering with
nature. Vaccines (63) and genetically modified food (64),
despite their documented benefits, are often rejected due
to perceptions that they are unnatural. This cluster of
moral intuitions about naturalness and purity is highly
related to individual differences in aversion to “playing
God,” an aversion that predicts lower willingness to fund
the NSF and less monetary donation to organizations sup-
porting novel scientific procedures (65).
Attitudes rooted in one’s notions of right and wrong
(e.g., not eating meat as a moral issue rather than as a
taste preference) are particularly strong (66) and tend to
be more extreme, persistent over time, resistant to
change, and predictive of behavior (67). For example, peo-
ple with moralized attitudes toward recycling are more
resistant to counterattitudinal information regarding the
efficacy of recycling (68). To resolve dissonance from con-
flicting information, rejecting the novel scientific informa-
tion is often the path of lesser resistance than revising
one’s existing moralized attitudes. Likewise, when misin-
formation is consistent with one’s existing attitudes, it is
difficult to correct (69). To people who love driving high-
horsepower but gas-guzzling vehicles, misinformation such
as “climate change is a hoax” would be attitude consistent,
whereas scientific correction of this misinformation would
be attitude inconsistent and thus prone to rejection.
Basis 4: Mismatch between the Delivery of the Scientific
Message and the Recipient’s Epistemic Style. Even when sci-
entific information does not conflict with an individual’s
beliefs or attitudes, it can still be rejected for reasons
beyond the content of the message. In particular, when
scientific information is delivered in ways that are at
odds with a person’s style of thinking about the topic at
hand or their general approach to information process-
ing, it is less likely to be processed and more likely to be
rejected (70).
For example, when people construe an issue in
abstract/high-level (vs. concrete/low-level) terms, concrete
(vs. abstract) scientific information about the issue mis-
matches their construal level and tends to be rejected.
People typically construe the issue of climate change in
abstract/high-level terms (e.g., global environmental degra-
dation), because the consequences of climate change are
seen as psychologically distant (71), and distance promotes
abstract construal (72). Thus, when ecofriendly products
are described in concrete/low-level terms (e.g., fine details
about the product’s carbon savings), despite making a
compelling case, they tend to be rejected (71). Evaluation
and choice of sustainable products are also undermined
when the products are described in concrete terms of self-
interested economic savings to consumers who think
abstractly about sustainability (73).
Even holding the level of abstractness/concreteness cons-
tant, scientific information can be presented in a gain frame
or a loss frame. Describing a vaccine as 90% effective (gain
frame) is technically equivalent to describing it as 10% inef-
fective (loss frame), but with dissimilar psychological effects,
because the frame can be at odds with people’s regulatory
focus (74). Promotion focus orients people to eagerly attain-
ing gains; prevention focus orients people to cautiously
preventing losses. When scientific information is framed as
promoting gains (vs. preventing losses), it tends to be
rejected by people who are prevention focused (vs. promo-
tion focused) (74). Such mismatch effects have been found
to result in rejection of climate change (75) and health mes-
sages (e.g., vaccination and smoking cessation) (76).
Framing of scientific information also varies in how cer-
tain and decisive it seems. Even when there is a high
degree of scientific consensus, scientific information is
often disseminated in terms that signal uncertainty. Such
terminology, while technically accurate, leads people with
high need for closure (i.e., low tolerance of epistemic
uncertainty) (77) to reject it. For example, when people
receive mixed scientific information about vaccines, those
with high need for closure are particularly likely to become
entrenched in their existing views and reject the mixed
information (78). More generally, people with high need
for closure are more likely to reject novel information that
challenges their currently held conclusions or assumptions
(77). This poses a challenge for scientists, who are trained
to hedge their findings and avoid overclaiming certainty,
as they try to communicate the preliminary, inconclusive,
nuanced, or evolving nature of scientific evidence.
Finally, scientific information varies in its quality. Intui-
tively, high-quality arguments are more persuasive than
low-quality ones (79). But this is often not true for people
with low need for cognition (i.e., people who do not enjoy
thinking), for whom low-quality arguments can be just as
persuasive as high-quality ones if positive peripheral cues
(e.g., a likable source) are present (80). Therefore, while
good-quality scientific evidence is, overall, more likely to
be accepted than bad-quality evidence (81), people who do
not enjoy thinking are less likely to appreciate such quality
distinctions. They are less likely to process complex infor-
mation, as comprehending it requires active thinking (79).
They are also less likely to choose to read nuanced science
blog posts (82) and less likely to accept evidence for cli-
mate change and evolution (83).
Construal level, regulatory focus, need for closure, and
need for cognition are different dimensions of epistemic
style. On any of these dimensions, a mismatch between
how scientific information is delivered and how the recipi-
ent thinks will increase the probability of rejection. More
generally, source–recipient mismatches (basis 4), content
conflicts (basis 3), social identity (basis 2), and sources lack-
ing in credibility (basis 1) all contribute to antiscience atti-
tudes. They also point to why politics is a particularly
potent driver of these attitudes.
How Politics Drives Antiscience Attitudes. Acceptance of sci-
entific information is now sharply divided along political
lines, with individuals in different camps holding, even
enshrining, vastly different views (84). Conservatives are
more likely than liberals to reject scientific evidence sup-
porting evolution (85) and the existence of anthropogenic
climate change (86), and have lower intentions to get vacci-
nated against COVID-19 (87). Although liberals, overall, are
more inclined to accept scientific evidence (86–88), there
are specific topics about which they are more likely to be
skeptical, such as the risk of nanotechnology (35). How do
we make sense of these political divides?
The literature on antiscience attitudes has found that
rejection of scientific information by members of different
political camps is often based on motivational factors (89).
Building on these insights, we argue that politics can trig-
ger or amplify basic mental processes across all four bases
of antiscience attitudes, thereby making it a particularly
potent force. Because the mental processes are not mutu-
ally exclusive, many of the political influences described
below are likely to occur in conjunction with each other.
Politics impacts people’s perception of scientists’ credibil-
ity (the source) via perceived expertise and trustworthiness
(90). In general, people see others with similar political
views as more expert and knowledgeable. Both liberals and
conservatives are less trusting of scientists whose work con-
tradicts their ideological viewpoint (91), and recent expo-
sures to such contradictory information reduce trust in
the entire scientific community (92). Because liberals and
conservatives find different sources credible (e.g., CNN
vs. Fox News), they expose themselves to different scien-
tific information (93) and misinformation (94), often rein-
forced by cues from trusted political elites (95), further
entrenching them in siloed networks. In the era of social
media and algorithmically customized news feeds, even
what appears to be the same source (e.g., Facebook) can
provide highly varied information to different users (96),
exacerbating the division of communities along political
lines.
For many, politics is more than just a set of beliefs or
ideals; it is a core part of their identity (97), which can have a
large impact on how they, as a recipient, react to different
pieces of scientific evidence, policy proposals, and legislation.
Those who identify strongly as a Democrat or a Republican
tend to show different responses to various pieces of sci-
entific information, with each group rejecting proposals
attributed to the outgroup, even when doing so goes
against their own best interest. For example,
when carbon taxes are framed as being a predominantly
Republican (vs. Democrat) policy, those who identify as
Democrat (vs. Republican) are more likely to oppose the
policy (96). This opposition to anything proposed by the
outgroup is mediated by the perception that the outgroup
is a threat to society (99), and threats reliably trigger out-
group antipathy (100). Such antipathy is prevalent in the
political sectarianism of our time (101), which leads many
individuals to selectively expose themselves to congenial
scientific information (28).
Indeed, people have a strong tendency to seek out
information (the message) that reinforces their existing
beliefs (93), a phenomenon intensified by online platforms,
which heighten the speed and scope of exposure to infor-
mation and misinformation in homogenous and polarized
echo chambers (102). Much of the misinformation online is
politically charged, covering diverse topics from elections
to climate change (57). Research on values-based messag-
ing has found that, when a political message evokes values
discordant with people’s existing values, it tends to be
rejected (103). Indeed, when scientific information contra-
dicts people’s beliefs shaped by political forces, it tends to
be rejected outright as simply untrue, a tendency exhibited
by both liberals and conservatives (104). Worse still, the
more extreme or morally charged people’s political views,
the stronger their sense of belief superiority, regardless of
accuracy (105), further amplifying the rejection of belief-
contradictory scientific information.
Alongside content differences (the types of messages
liberals and conservatives seek out and accept), liberals
and conservatives also differ in how they approach infor-
mation (epistemic styles). Conservatives are, on average,
more prevention focused, and liberals are more promotion
focused (106). According to this logic, conservatives would
be more likely to reject scientific information framed as
approaching gains, and liberals would be more likely to
reject scientific information framed as avoiding losses.
Conservatives also have a stronger need for closure (107),
which is linked to stronger beliefs in a variety of conspiracy
theories with no scientific basis (108).
Altogether, politics is a particularly potent force in rejec-
tion of scientific information because it strikes all four
bases of antiscience attitudes, at times amplifying them.
Acute increases in political partisanship and sectarianism
(101) in recent years have only accentuated the potency
and toxicity of such political influences.
What Can We Do About Antiscience Attitudes?
By specifying the key principles underlying antiscience atti-
tudes, our framework suggests counteractive strategies for
increasing acceptance of scientific information by targeting
each of the four bases (Table 1, third column). Obviously,
no single strategy is perfect or universal, and the current
era is replete with unique challenges, such as the spread
of misinformation on social media, but specific strategies
can still be effective in their respective contexts, for specific
goals. We outline a number of these strategies briefly.
Targeting Basis 1: Increasing Perception of Scientific
Information Sources as Credible. Scientists lack credibility
when they are perceived as inexpert, untrustworthy, or
biased. To tackle emerging concerns about the quality of
scientists’ work and their perceived expertise, trustworthi-
ness, and objectivity, scientists need to improve the validity
of their research (109) and establish the replicability and
reproducibility of their findings. Scientists also need to
communicate to the public that substantive debate and
disagreement are inherent to the scientific process and
signal a healthy scientific landscape, a point often missed
by lay people who expect a given scientific finding to be
absolute (17). To maximize effectiveness, scientists and
science organizations need to recruit journalists, health
officials, politicians, or key opinion leaders to join these
communicative efforts, as they are often the sources con-
veying scientific information directly to the public or the
sources that the public already trusts.
To reduce distrust in scientists due to their perceived
coldness (12), when scientists communicate their findings
and recommendations, they should ameliorate the unfa-
vorable impressions by intentionally conveying interper-
sonal warmth and highlighting the communal nature of
science, a tactic that has proven effective for a different
but related goal—recruiting girls and women into STEM
training programs and careers (12). Another strategy that
is related to but distinct from conveying warmth is for sci-
entists to communicate that they are pursuing prosocial
goals in their work. When people perceive scientists as
prosocial, they have greater trust in science (110).
Scientists also often use excessively complex language
when communicating science to the general public (111). To
mitigate the negative perception from jargon-laden wording
that conceals the meaning of the science from lay people,
scientists should use language that conveys their message
clearly and precisely while still being accessible to a general
audience. One specific suggestion in this vein, which most
journals have yet to adopt, is for published articles to
include “lay summaries” along with the more jargon-laden
abstracts, so that interested lay people can better glean the
information in terms that they can understand (112).
To reduce perceived bias, scientists should attempt to
communicate in a balanced manner whenever possible.
When communicators offer a nuanced, multifaceted per-
spective, especially if they change positions in the face of
new evidence, they are perceived as less biased and more
persuasive (113). When a communicator expresses open-
ness to alternative views, especially when of high status,
this can increase openness in committed recipients (114).
For example, those who saw the issue of wearing masks in
the COVID-19 pandemic as a moral impingement on their
rights were more open to wearing masks when a commu-
nicator acknowledged the recipient’s view but explained
why the promask position was preferable (115). Impor-
tantly, we are not suggesting that communicators adopt a
position of false neutrality or “both sidesism.” Instead, we
are suggesting that they honestly acknowledge any draw-
backs of their position while ultimately explaining in clear
and compelling terms why their position is still the most
supported or more justifiable one.
Targeting Basis 2: Decreasing Recipients’ Identification with
Antiscience Groups. To reduce the salience or strength of
recipients’ identification with groups that embrace anti-
science views, science communicators should invoke
meaningful and important shared social identities between
themselves and the recipients of scientific messages (116).
For groups in conflict, finding a common or superordinate
identity often helps the two groups minimize their conflict
and approach intergroup harmony (117). If those viewing
scientists as outgroup members can see themselves as
sharing a common identity with scientists, antiscience sen-
timent and the derogation of scientists can be reduced.
For example, when scientists offer their recycled water pol-
icy suggestions to a hostile audience, finding common
ground via a superordinate identity successfully increases
audience receptivity (118). One way to legitimately claim a
shared identity between scientists and antiscience commu-
nity members is by bringing together different stakehold-
ers to form one group (e.g., a committee) that is working
toward shared goals, while still preserving the original sub-
groups within the superordinate identity (98).
Science communicators should also seek to earn the
trust of groups that have been historically exploited or
excluded by the science community (119, 120). This can be
done by directly engaging with the target groups in the pro-
cess of conducting the research (121). For example, rather
than treating racialized or historically underrepresented
groups as the objects of study, scientists can collaborate
with members of these communities and build cultural com-
petencies (122). Scientific funding agencies’ requirement of
active Indigenous participation in any research that might
impact or involve Indigenous communities (123) offers
another step toward reconciliation. Programs that train mar-
ginalized individuals to be the scientists working within their
own communities also help to earn trust from racialized
communities, as when a program that trains Indigenous
genome researchers increases trust in science (124). Many
of these efforts are still rather nascent, however, and, unlike
the other counteractive strategies outlined in our article,
their efficacy has not yet been rigorously assessed. We
encourage proper quantitative assessment of these efforts’
effectiveness. If useful, they can be scaled up to help rebuild
or strengthen the rapport between scientists and diverse
communities.
Targeting Basis 3: Increasing Acceptance of Scientific
Information Even When It Contradicts One’s Beliefs and
Attitudes. To tackle rejection of scientific information that
contradicts an audience’s beliefs, prevention is better than
cure: Whenever possible, minimize the formation of ill-
informed beliefs in the first place. One preventive strategy
is to train people in scientific reasoning (i.e., the ability to
evaluate the quality of scientific information). People
equipped with scientific reasoning skills are more likely to
accept high-quality scientific evidence (84). This strategy is
especially apt for combatting the rise of fake news [which
is another major problem that requires societal-level
changes in digital infrastructure (125)]. Arming media con-
sumers with the skills to differentiate between true and
false scientific information leads them to become more
discerning regarding which beliefs to adopt (125). Critically,
this strategy pertains to conveying the correct scientific
information prior to any misinformation being adopted.
An additional caveat is that, although encouraging critical
reasoning decreases belief in scientific misinformation,
simply telling people that they should trust science more
can actually increase belief in and dissemination of misin-
formation framed as being scientific (compared with misin-
formation not framed as being scientific) (126).
Related to the broader notion of training in scientific
reasoning, a specific strategy is called prebunking. Derived
from the logic of disease inoculation (127), it involves fore-
warning people that they will be receiving misinformation,
then giving them a small dose of misinformation (the
“vaccine”) and refuting it so that they will be better able to
resist misinformation when they encounter it in the wild
(the “disease”). Data from a field experiment among older
adults have found this strategy to be effective for minimiz-
ing the impact of disinformation on people’s intention to
receive a COVID-19 vaccine (128).
Another preventive strategy, which sounds intuitive but
turns out to be ineffective for enhancing acceptance of scien-
tific information, is increasing a population’s general scien-
tific literacy. Unlike specialized scientific knowledge, general
scientific literacy does not involve a deep dive into why a sci-
entific phenomenon occurs (89). Unlike scientific reasoning
skills, general scientific literacy does not teach people how to
parse scientific information (84). Instead, it merely entails
imparting an unelaborated list of scientific information (89).
Why is it ineffective for enhancing acceptance of scientific
information? Because people with more scientific literacy are
simply more sophisticated at bolstering their existing beliefs
by cherry-picking ideas and information to defend their
worldview (84). Higher levels of scientific literacy, instead of
leading people to coalesce around scientific truths, can
increase polarization of beliefs (84). Similarly, greater cogni-
tive sophistication (e.g., stronger analytic thinking) does not
necessarily reduce antiscience views, as the most cognitively
sophisticated and educated people can also be the most
polarized (129), although the evidence for and interpretation
of this pattern have been subject to debate (130).
When preventive strategies are implausible, curative
ones are necessary. Simply learning information is often
uncorrelated with attitude change (48, 131). What matters
more than whether people learn or remember the infor-
mation they have been told is how they react to that infor-
mation. If people have positive reactions to a message,
they are more likely to change their attitudes to be in line
with that message (132). By implication, merely informing
the public of scientific information is insufficient; one must
also persuade them. Strong, well-reasoned, and well-
substantiated arguments, implemented by skilled science
communicators, have been found effective for altering
even entrenched attitudes, such as toward climate change
(133) and the safety of electronic health records (134).
But, for the particularly intransigent, additional strategies
should be utilized to supplement persuasive arguments. As
noted earlier, a fundamental mechanism that leads people
to reject scientific information contradictory to their beliefs
is cognitive dissonance. This aversive state has been found
to be reduced by a procedure called self-affirmation, which
involves prompting people to conjure and affirm values
that matter to them (e.g., caring for one’s family) in ways
unrelated to the cognitive conflict at hand (135). Why does
self-affirmation reduce dissonance? Because it increases
one’s sense of self-integrity and security, which reduces the
threatening effect of dissonance to the self. Self-affirmation
interventions have been used successfully to reduce defen-
siveness and increase acceptance of scientific information
regarding health behaviors (136) and climate change (137).
Sometimes, scientific messages not only conflict with a
person’s beliefs and attitudes but also with their particular
moral concerns. To manage this, an effective strategy is to
identify the specific morals the recipient endorses and
reframe the scientific message to accord with them. Con-
servatives, who endorse the moral foundation of ingroup
loyalty, are more persuaded by messages about climate
change framed as a matter of loyalty to one’s country. Lib-
erals, who endorse the moral foundation of intentional
care, are more persuaded by messages about climate
change framed as a matter of care for innocent creatures
(138). Moral reframing has also been found effective for
minimizing morally based opposition to vaccines and stem
cell technology (138). Similarly, for recipients who think
about public health in more (vs. less) moral terms, mes-
sages that use moral arguments such as engaging in physi-
cal distancing during the COVID-19 pandemic to benefit
others (vs. oneself) are more persuasive (139).
To increase acceptance of scientific evidence among
those who have strong moral intuitions about naturalness/
purity, science communicators can specifically reframe sci-
entific innovations as confluent with nature. For example,
increasing the perceived naturalness of geoengineering
has been found to increase people’s acceptance of it as a
strategy to combat climate change (140). Overall, these
findings suggest that science communicators can create
multiple moral frames when communicating their scientific
information to distinct audiences (e.g., liberals vs. conser-
vatives, religious vs. nonreligious) who are likely to have
different moral intuitions or views.
Targeting Basis 4: Matching the Delivery of the Scientific
Message with the Recipient’s Epistemic Style. People tend to
reject scientific information when it is delivered in ways
that mismatch their epistemic styles. This basic principle
has theoretically straightforward implications for what
counteractive strategies to use: Identify the recipient’s
style, and match it. To implement a matching strategy,
regional demographic data (e.g., on political leanings) can
aid in developing psychographically targeted communica-
tions at the aggregate level. Given the vast amounts of
fine-grained, person-specific data that various technology
companies collect on people’s online activity (if they have
not opted out), targeting may even be done at the individ-
ual level, which has been found effective for changing
behavior (141). Consumer researchers have long been seg-
menting and targeting consumers based on rich psycho-
graphic and behavioral data. Other public interest groups
could adopt similar strategies and use the logic of targeted
advertising to more precisely position their scientific com-
munications with different audiences in mind. The essence
of this strategy is to craft different messages or different
delivery approaches for different audiences. For recipients
who think abstractly (vs. concretely), scientific messages
delivered in an abstract (vs. concrete) manner increase
their acceptance of the scientific information as true (142).
For recipients who are promotion focused (vs. prevention
focused), messages about health behavior framed as
approaching gains (vs. avoiding losses) are better accepted
(76), and so forth, as explained earlier.
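To make this matching logic concrete, consider a minimal illustrative sketch (ours, not drawn from the cited studies): segment labels, message wordings, and the function name below are hypothetical, standing in for whatever pre-written variants and audience data a communicator actually has. The idea is simply to keep several framings of the same scientific content and deliver the variant that matches an audience segment’s inferred epistemic style.

```python
# Hypothetical sketch of epistemic-style matching (basis 4).
# Segment labels and message texts are illustrative placeholders only.

MESSAGE_VARIANTS = {
    # (regulatory focus, construal level) -> framing of the same scientific content
    ("promotion", "abstract"): "Vaccination moves the whole community toward lasting protection.",
    ("promotion", "concrete"): "Getting this shot gives you about 90% protection.",
    ("prevention", "abstract"): "Vaccination keeps the community from sliding into preventable outbreaks.",
    ("prevention", "concrete"): "Skipping this shot forgoes about 90% protection.",
}

DEFAULT_MESSAGE = "Vaccines are a safe and effective way to protect yourself and others."


def select_message(regulatory_focus: str, construal_level: str) -> str:
    """Return the message variant matching a segment's inferred epistemic style."""
    # Fall back to a neutral default when the segment's style is unknown.
    return MESSAGE_VARIANTS.get((regulatory_focus, construal_level), DEFAULT_MESSAGE)


# Example: a prevention-focused audience that thinks about health in concrete terms
print(select_message("prevention", "concrete"))
```

The same lookup logic could, in principle, key on need for closure or need for cognition as well, pairing more decisive versus more hedged, or simpler versus more elaborated, wordings of the same finding with the audiences most receptive to them.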
Concluding Remarks
By offering an inclusive framework of key principles under-
lying antiscience attitudes, we aim to advance theory and
research on several fronts: Our framework highlights basic
principles applicable to antiscience phenomena across
multiple domains of science. It predicts situational and
personal variables (e.g., moralization, attitude strength,
and need for closure) that amplify people’s likelihood and
intensity of being antiscience. It unpacks why politics is
such a potent force with multiple aspects of influence on
antiscience attitudes. And it suggests a range of counterac-
tive strategies that target each of the four bases. Beyond
explaining, predicting, and addressing antiscience views,
our framework raises unresolved questions for future
research (SI Appendix).
With the prevalence of antiscience attitudes, scientists
and science communicators face strong headwinds in gain-
ing and sustaining public trust and in conveying scientific
information in ways that will be accepted and integrated
into public understanding. It is a multifaceted problem
that ranges from erosions in the credibility of scientists to
conflicts with the identities, beliefs, attitudes, values,
morals, and epistemic styles of different portions of the
population, exacerbated by the toxic ecosystem of the poli-
tics of our time. Scientific information can be difficult to
swallow, and many individuals would sooner reject the evi-
dence than accept information that suggests they might
have been wrong. This inclination is wholly understand-
able, and scientists should be poised to empathize. After
all, we are in the business of being proven wrong, but that
must not stop us from helping people get things right.
Data Availability. There are no data underlying this work.
ACKNOWLEDGMENTS. We thank Rebecca Walker Reczek, Laura Wallace, Tim
Broom, Javier Granados Samoyoa, the Attitudes and Persuasion Lab, and the
Mind and Body Lab for feedback.
1. J. W. V. Thangaraj et al., Predominance of delta variant among the COVID-19 vaccinated and unvaccinated individuals, India, May 2021. J. Infect. 84,94–118 (2022).
2. World Health Organization, Climate change and health. (Fact Sheet, World Health Organization, 2021). https://www.who.int/news-room/fact-sheets/detail/climate-change-and-health/. Accessed 6 July 2022.
3. A. Tyson, C. Funk, B. Kennedy, C. Johnson, Majority in U.S. says public health benefits of COVID-19 restrictions worth the costs, even as large shares also see downsides. Pew Research Center, (2021). https://
www.pewresearch.org/science/2021/09/15/majority-in-u-s-says-publich-health-benefits-of-covid-19-restrictions-worth-the-costs-even-as-large-shares-also-see-downsides/. Accessed 30 March 2022.
4. B. Kennedy, U.S. concern about climate change is rising, but mainly among Democrats. Pew Research Center, (2020). https://www.pewresearch.org/fact-tank/2020/04/16/u-s-concern-about-climate-change-is-
rising-but-mainly-among-democrats/. Accessed 28 February 2021.
5. B. T. Rutjens et al., Science skepticism across 24 countries. Soc. Psychol. Personal. Sci. 13, 102–117 (2022).
6. M. J. Hornsey, Why facts are not enough: Understanding and managing the motivated rejection of science. Curr. Dir. Psychol. Sci. 29, 583–591 (2020).
7. B. T. Rutjens, S. J. Heine, R. M. Sutton, F. van Harreveld, “Attitudes towards science”in Advances in Experimental Social Psychology, J. M. Olson, Ed. (Academic, 2018), pp. 125–165.
8. M. J. Hornsey, K. S. Fielding, Attitude roots and Jiu Jitsu persuasion: Understanding and overcoming the motivated rejection of science. Am. Psychol. 72, 459–473 (2017).
9. W. J. McGuire, “The nature of attitudes and attitude change”in The Handbook of Social Psychology, G. Lindzey, E. Aronson, Eds. (Addison-Wesley, ed. 2, 1969), pp. 136–314.
10. C. Funk, M. Hefferon, B. Kennedy, C. Johnson, Trust and mistrust in Americans’ views of scientific experts. Pew Research Center, (2019). https://pewresearch.org/science/2019/08/02/trust-and-mistrust-in-
americans-views-of-scientific-experts/. Accessed 17 March 2022.
11. L. E. Wallace, D. T. Wegener, R. E. Petty, When sources honestly provide their biased opinion: Bias as a distinct source perception with independent effects on credibility and persuasion. Pers. Soc. Psychol. Bull.
46, 439–453 (2020).
12. A. B. Diekman, E. K. Clark, A. M. Johnston, E. R. Brown, M. Steinberg, Malleability in communal goals and beliefs influences attraction to stem careers: Evidence for a goal congruity perspective. J. Pers. Soc.
Psychol. 101, 902–918 (2011).
13. T. M. Errington et al., Investigating the replicability of preclinical cancer biology. eLife 10, e71601 (2021).
14. B. A. Nosek et al., Replicability, robustness, and reproducibility in psychological science. Annu. Rev. Psychol. 73, 719–748 (2022).
15. E. Yong, Psychology’s replication crisis is running out of excuses. The Atlantic, 19 November 2018. https://www.theatlantic.com/science/archive/2018/11/psychologys-replication-crisis-real/576223/. Accessed 5
April 2022.
16. M. Heid, Opinion | Why experts can’t seem to agree on boosters. N. Y. Times, 13 April (2022). https://www.nytimes.com/2022/04/13/opinion/covid-booster-shot.html. Accessed 20 April 2022.
17. D. Flemming, I. Feinkohl, U. Cress, J. Kimmerle, Individual uncertainty and the uncertainty of science: The impact of perceived conflict and general self-efficacy on the perception of tentativeness and credibility
of scientific information. Front. Psychol. 6, 1859 (2015).
18. J. Kennedy, Populist politics and vaccine hesitancy in Western Europe: An analysis of national-level data. Eur. J. Public Health 29, 512–516 (2019).
19. C. Lee, K. Whetten, S. Omer, W. Pan, D. Salmon, Hurdles to herd immunity: Distrust of government and vaccine refusal in the US, 2002-2003. Vaccine 34, 3972–3978 (2016).
20. E. Pechar, T. Bernauer, F. Mayer, Beyond political ideology: The impact of attitudes towards government and corporations on trust in science. Sci. Commun. 40, 291–313 (2018).
21. S. T. Fiske, C. Dupree, Gaining trust as well as respect in communicating to motivated audiences about science topics. Proc. Natl. Acad. Sci. U.S.A. 111, 13593–13597 (2014).
22. M. E. Barnes, J. M. Truong, D. Z. Grunspan, S. E. Brownell, Are scientists biased against Christians? Exploring real and perceived bias against Christians in academic biology. PLoS One 15, e0226826 (2020).
23. J. L. Duarte et al., Political diversity will improve social psychological science. Behav. Brain Sci. 38, e130 (2015).
24. A. Simpson, K. Rios, Is science for atheists? Perceived threat to religious cultural authority explains U.S. Christians’ distrust in secularized science. Public Underst. Sci. 28, 740–758 (2019).
25. S. Hilton, M. Petticrew, K. Hunt, Parents’ champions vs. vested interests: Who do parents believe about MMR? A qualitative study. BMC Public Health 7, 42 (2007).
26. D. Funke, Fact-check: Does Anthony Fauci have millions invested in a coronavirus vaccine? Austin American-Statesman, (2020). https://www.statesman.com/story/news/politics/elections/2020/04/16/fact-check-
does-anthony-fauci-have-millions-invested-in-coronavirus-vaccine/984125007/. Accessed 28 February 2021.
27. I. M. Handley, E. R. Brown, C. A. Moss-Racusin, J. L. Smith, Quality of evidence revealing subtle gender biases in science is in the eye of the beholder. Proc. Natl. Acad. Sci. U.S.A. 112, 13201–13206 (2015).
28. W. Hart, K. Richardson, G. K. Tortoriello, A. Earl, ‘You Are What You Read:’ Is selective exposure a way people tell us who they are? Br. J. Psychol. 111, 417–442 (2020).
29. M. A. Hogg, K. D. Williams, From I to we: Social identity and the collective self. Group Dyn. Theory Res. Pract. 4, 81–97 (2000).
30. S. Knobloch-Westerwick, M. R. Hastall, Please your self: Social identity effects on selective exposure to news about in- and out-groups. J. Commun. 60, 515–535 (2010).
31. H. A. Washington, Medical Apartheid: The Dark History of Medical Experimentation on Black Americans from Colonial Times to the Present (Doubleday, 2006).
32. R. C. Warren, L. Forrow, D. A. Hodge Sr., R. D. Truog, Trustworthiness before trust—Covid-19 vaccine trials and the black community. N. Engl. J. Med. 383, e121 (2020).
33. P. Nauroth, M. Gollwitzer, J. Bender, T. Rothmund, Gamers against science: The case of the violent video games debate. Eur. J. Soc. Psychol. 44, 104–116 (2014).
34. D. M. Kahan, H. Jenkins-Smith, D. Braman, Cultural cognition of scientific consensus. J. Risk Res. 14, 147–174 (2011).
35. D. M. Kahan, D. Braman, P. Slovic, J. Gastil, G. Cohen, Cultural cognition of the risks and benefits of nanotechnology. Nat. Nanotechnol. 4, 87–90 (2009).
36. D. M. Kahan, Misconceptions, misinformation, and the logic of identity-protective cognition. SSRN [Preprint] (2017). https://doi.org/10.2139/ssrn.2973067. Accessed 17 March 2022.
37. D. M. Kahan, D. Braman, J. Gastil, P. Slovic, C. K. Mertz, Culture and identity-protective cognition: Explaining the white-male effect in risk perception. J. Empir. Leg. Stud. 4, 465–505 (2007).
38. A. Fasce, J. Adrián-Ventura, S. Lewandowsky, S. van der Linden, Science through a tribal lens: A group-based account of polarization over scientific facts. Group Process. Intergroup Relat., 10.1177/13684302211050323 (2021).
39. B. T. Rutjens, S. van der Linden, R. van der Lee, N. Zarzeczna, A group processes approach to antiscience beliefs and endorsement of “alternative facts.” Group Process. Intergroup Relat. 24, 513–517 (2021).
40. P. A. Offit, C. A. Moser, The problem with Dr Bob’s alternative vaccine schedule. Pediatrics 123, e164–e169 (2009).
41. I. McGregor, R. Haji, S.-J. Kang, Can ingroup affirmation relieve outgroup derogation? J. Exp. Soc. Psychol. 44, 1395–1401 (2008).
42. A.-M. Bliuc et al., Public division about climate change rooted in conflicting socio-political identities. Nat. Clim. Chang. 5, 226–229 (2015).
43. N. R. Branscombe, D. L. Wann, Collective self-esteem consequences of outgroup derogation when a valued social identity is on trial. Eur. J. Soc. Psychol. 24, 641–657 (1994).
44. M. J. Hornsey, A. Imani, Criticizing groups from the inside and the outside: An identity perspective on the intergroup sensitivity effect. Pers. Soc. Psychol. Bull. 30, 365–383 (2004).
45. B. Nogrady, ‘I hope you die’: How the COVID pandemic unleashed attacks on scientists. Nature 598, 250–253 (2021).
46. K. M. Douglas, COVID-19 conspiracy theories. Group Process. Intergroup Relat. 24, 270–275 (2021).
47. P. G. Auwaerter et al., Antiscience and ethical concerns associated with advocacy of Lyme disease. Lancet Infect. Dis. 11, 713–719 (2011).
48. M. Motta, T. Callaghan, S. Sylvester, K. Lunz-Trujillo, Identifying the prevalence, correlates, and policy consequences of anti-vaccine social identity. Polit. Groups Identities,
10.1080/21565503.2021.1932528 (2021).
49. P. Sturgis, N. Allum, Science in society: Re-evaluating the deficit model of public attitudes. Public Underst. Sci. 13, 55–74 (2004).
50. S. van der Linden, The Gateway Belief Model (GBM): A review and research agenda for communicating the scientific consensus on climate change. Curr. Opin. Psychol. 42, 7–12 (2021).
51. L. Festinger, A Theory of Cognitive Dissonance (Stanford University Press, 1957).
52. J. Cooper, Cognitive Dissonance: Fifty Years of a Classic Theory (SAGE, 2007).
53. J. Hannam, God’s Philosophers: How the Medieval World Laid the Foundations of Modern Science (Icon, 2009).
54. T. Lee, The global rise of “fake news”and the threat to democratic elections in the USA. Public Adm. Policy 22,15–24 (2019).
55. S. Vosoughi, D. Roy, S. Aral, The spread of true and false news online. Science 359, 1146–1151 (2018).
56. G. Pennycook, D. G. Rand, The psychology of fake news. Trends Cogn. Sci. 25, 388–402 (2021).
57. R. Greifeneder, M. E. Jaffé, E. J. Newman, N. Schwarz, The Psychology of Fake News: Accepting, Sharing, and Correcting Misinformation (Routledge, 2021).
58. D. A. Scheufele, N. M. Krause, Science audiences, misinformation, and fake news. Proc. Natl. Acad. Sci. U.S.A. 116, 7662–7669 (2019).
59. U. K. Ecker, S. Lewandowsky, O. Fenton, K. Martin, Do people keep believing because they want to? Preexisting attitudes and the continued influence of misinformation. Mem. Cognit. 42, 292–304 (2014).
60. M. W. Susmann, D. T. Wegener, The role of discomfort in the continued influence effect of misinformation. Mem. Cognit. 50, 435–448 (2022).
61. P. R. Thagard, “Why astrology is a pseudoscience” in PSA: Proceedings of the Biennial Meeting of the Philosophy of Science Association, D. Hull, M. Forbes, and R. M. Burian, Eds. (Philosophy of Science
Association, 1978), vol. 1978, 223–234.
62. S. H. Schwartz, “Universals in the content and structure of values: Theoretical advances and empirical tests in 20 countries” in Advances in Experimental Social Psychology, M. P. Zanna, Ed. (Academic, 1992), pp. 1–65.
63. L. J. Skitka, B. E. Hanson, G. S. Morgan, D. C. Wisneski, The psychology of moral conviction. Annu. Rev. Psychol. 72, 347–366 (2021).
64. M. d. Dibonaventura, G. B. Chapman, Do decision biases predict bad decisions? Omission bias, naturalness bias, and influenza vaccination. Med. Decis. Making 28, 532–539 (2008).
65. S. E. Scott, Y. Inbar, P. Rozin, Evidence for absolute moral opposition to genetically modified food in the United States. Perspect. Psychol. Sci. 11, 315–324 (2016).
66. A. Waytz, L. Young, Aversion to playing God and moral condemnation of technology and science. Philos. Trans. R. Soc. Lond. B Biol. Sci. 374, 20180041 (2019).
67. R. E. Petty, J. A. Krosnick, Eds., Attitude Strength: Antecedents and Consequences (Lawrence Erlbaum Associates, 1995).
68. A. Luttrell, V. Sawicki, Attitude strength: Distinguishing predictors versus defining features. Soc. Personal. Psychol. Compass 14, e12555 (2020).
69. A. Luttrell, R. E. Petty, P. Briñol, B. C. Wagner, Making it moral: Merely labeling an attitude as moral increases its strength. J. Exp. Soc. Psychol. 65, 82–93 (2016).
70. S. Lewandowsky, U. K. H. Ecker, C. M. Seifert, N. Schwarz, J. Cook, Misinformation and its correction: Continued influence and successful debiasing. Psychol. Sci. Public Interest 13, 106–131 (2012).
71. J. D. Teeny, J. J. Siev, P. Briñol, R. E. Petty, A review and conceptual framework for understanding personalized matching effects in persuasion. J. Consum. Psychol. 31, 382–414 (2021).
72. R. W. Reczek, R. Trudel, K. White, Focusing on the forest or the trees: How abstract versus concrete construal level predicts responses to eco-friendly products. J. Environ. Psychol. 57,87–98 (2018).
73. Y. Trope, N. Liberman, Construal-level theory of psychological distance. Psychol. Rev. 117, 440–463 (2010).
74. K. Goldsmith, G. E. Newman, R. Dhar, Mental representation changes the evaluation of green product benefits. Nat. Clim. Chang. 6, 847–850 (2016).
75. J. Cesario, H. Grant, E. T. Higgins, Regulatory fit and persuasion: Transfer from “feeling right.” J. Pers. Soc. Psychol. 86, 388–404 (2004).
76. M. Bertolotti, P. Catellani, Effects of message framing in policy communication on climate change: Framing in communication on climate change. Eur. J. Soc. Psychol. 44, 474–486 (2014).
77. R. Ludolph, P. J. Schulz, Does regulatory fit lead to more effective health communication? A systematic review. Soc. Sci. Med. 128, 142–150 (2015).
78. D. M. Webster, A. W. Kruglanski, Individual differences in need for cognitive closure. J. Pers. Soc. Psychol. 67, 1049–1062 (1994).
79. X. Nan, K. Daily, Biased assimilation and need for closure: Examining the effects of mixed blogs on vaccine-related beliefs. J. Health Commun. 20, 462–471 (2015).
80. J. T. Cacioppo, R. E. Petty, K. J. Morris, Effects of need for cognition on message evaluation, recall, and persuasion. J. Pers. Soc. Psychol. 45, 805–818 (1983).
81. R. E. Petty, J. T. Cacioppo, “The elaboration likelihood model of persuasion” in Advances in Experimental Social Psychology, M. P. Zanna, Ed. (Academic, 1996), pp. 123–205.
82. A. Bhattacherjee, C. Sanford, Influence processes for information technology acceptance: An elaboration likelihood model. Manage. Inf. Syst. Q. 30, 805–825 (2006).
83. S. Winter, N. C. Krämer, Selecting science information in Web 2.0: How source cues, message sidedness, and need for cognition influence users’ exposure to blog posts. J. Comput. Mediat. Commun. 18, 80–96 (2012).
84. J. Kudrna, M. Shore, D. Wassenberg, Considering the role of “need for cognition” in students’ acceptance of climate change & evolution. Am. Biol. Teach. 77, 250–257 (2015).
85. C. Drummond, B. Fischhoff, Individuals with greater science literacy and education have more polarized beliefs on controversial science topics. Proc. Natl. Acad. Sci. U.S.A. 114, 9587–9592 (2017).
86. S. Lewandowsky, J. K. Woike, K. Oberauer, Genesis or evolution of gender differences? Worldview-based dilemmas in the processing of scientific information. J. Cogn. 3, 9 (2020).
87. D. B. Kirby, The impact of abstinence and comprehensive sex and STD/HIV education programs on adolescent sexual behavior. Sex. Res. Soc. Policy 5, 18–27 (2008).
88. L. C. Hamilton, J. Hartter, M. Lemcke-Stampone, D. W. Moore, T. G. Safford, Tracking public beliefs about anthropogenic climate change. PLoS One 10, e0138208 (2015).
89. S. Baker, Axios-Ipsos poll: More Americans want the vaccine. Axios, 12 January (2021). https://www.axios.com/2021/01/12/axios-ipsos-coronavirus-index-americans-want-vaccine. Accessed 28 February 2021.
90. S. Lewandowsky, K. Oberauer, Motivated rejection of science. Curr. Dir. Psychol. Sci. 25, 217–222 (2016).
91. K. D. Landreville, C. Niles, “And that’s a fact!”: The roles of political ideology, PSRs, and perceived source credibility in estimating factual content in partisan news. J. Broadcast. Electron. Media 63,
177–194 (2019).
92. A. M. McCright, K. Dentzman, M. Charters, T. Dietz, The influence of political ideology on trust in science. Environ. Res. Lett. 8, 044029 (2013).
93. E. C. Nisbet, K. E. Cooper, R. K. Garrett, The partisan brain: How dissonant science messages lead conservatives and liberals to (dis)trust science. Ann. Am. Acad. Pol. Soc. Sci. 658, 36–66 (2015).
94. N. J. Stroud, Polarization and partisan selective exposure. J. Commun. 60, 556–576 (2010).
95. C. S. Traberg, S. van der Linden, Birds of a feather are persuaded together: Perceived source credibility mediates the effect of political bias on misinformation susceptibility. Pers. Individ. Dif. 185,
111269 (2022).
96. K. S. Fielding, M. J. Hornsey, H. A. Thai, L. L. Toh, Using ingroup messengers and ingroup values to promote climate change policy. Clim. Change 158, 181–199 (2020).
97. E. Bakshy, S. Messing, L. A. Adamic, Political science. Exposure to ideologically diverse news and opinion on Facebook. Science 348, 1130–1132 (2015).
98. E. A. West, S. Iyengar, Partisanship as a social identity: Implications for polarization. Polit. Behav. 44, 807–838 (2020).
99. K. S. Fielding, M. J. Hornsey, A social identity analysis of climate change and environmental attitudes and behaviors: Insights and opportunities. Front. Psychol. 7, 121 (2016).
100. M. R. Hoffarth, G. Hodson, Green on the outside, red on the inside: Perceived environmentalist threat as a factor explaining political polarization of climate change. J. Environ. Psychol. 45, 40–49 (2016).
101. E. J. Finkel et al., Political sectarianism in America. Science 370, 533–536 (2020).
102. M. Del Vicario et al., The spreading of misinformation online. Proc. Natl. Acad. Sci. U.S.A. 113, 554–559 (2016).
103. T. E. Nelson, J. Garst, Values-based political messages and persuasion: Relationships among speaker, recipient, and evoked values. Polit. Psychol. 26, 489–516 (2005).
104. A. N. Washburn, L. J. Skitka, Science denial across the political divide: Liberals and conservatives are similarly motivated to deny attitude-inconsistent science. Soc. Psychol. Personal. Sci. 9, 972–980 (2018).
105. K. Toner, M. R. Leary, M. W. Asher, K. P. Jongman-Sereno, Feeling superior is a bipartisan issue: Extremity (not direction) of political views predicts perceived belief superiority. Psychol. Sci. 24, 2454–2462 (2013).
106. R. Janoff-Bulman, To provide or protect: Motivational bases of political liberalism and conservatism. Psychol. Inq. 20, 120–128 (2009).
107. A. Chirumbolo, The relationship between need for cognitive closure and political orientation: The mediating role of authoritarianism. Pers. Individ. Dif. 32, 603–610 (2002).
108. K. M. Douglas, R. M. Sutton, A. Cichocka, The psychology of conspiracy theories. Curr. Dir. Psychol. Sci. 26, 538–542 (2017).
109. L. R. Fabrigar, D. T. Wegener, R. E. Petty, A validity-based framework for understanding replication in psychology. Pers. Soc. Psychol. Rev. 24, 316–344 (2020).
110. T. M. Benson-Greenwald, A. Trujillo, A. D. White, A. B. Diekman, Science for others or the self? Presumed motives for science shape public trust in science. Pers. Soc. Psychol. Bull., 10.1177/01461672211064456 (2021).
111. C. Blue, Precision is the enemy of public understanding. APS Obs. 34, 73 (2021).
112. L. M. Kuehne, J. D. Olden, Opinion: Lay summaries needed to enhance science communication. Proc. Natl. Acad. Sci. U.S.A. 112, 3585–3586 (2015).
113. L. E. Wallace, D. T. Wegener, R. E. Petty, Influences of source bias that differ from source untrustworthiness: When flip-flopping is more and less surprising. J. Pers. Soc. Psychol. 118, 603–616 (2020).
114. M. A. Hussein, Z. L. Tormala, Undermining your case to enhance your impact: A framework for understanding the effects of acts of receptiveness in persuasion. Pers. Soc. Psychol. Rev. 25, 229–250 (2021).
115. M. Xu, R. E. Petty, Two-sided messages promote openness for morally based attitudes. Pers. Soc. Psychol. Bull., 10.1177/0146167220988371 (2021).
116. J. J. Van Bavel, D. J. Packer, The Power of Us: Harnessing Our Shared Identities to Improve Performance, Increase Cooperation, and Promote Social Harmony (Little, Brown Spark, ed. 1, 2021).
117. S. L. Gaertner, J. F. Dovidio, P. A. Anastasio, B. A. Bachman, M. C. Rust, The common ingroup identity model: Recategorization and the reduction of intergroup bias. Eur. Rev. Soc. Psychol. 4, 1–26 (1993).
118. T. Schultz, K. Fielding, The common in-group identity model enhances communication about recycled water. J. Environ. Psychol. 40, 296–305 (2014).
119. G. Corbie-Smith, S. B. Thomas, M. V. Williams, S. Moody-Ayers, Attitudes and beliefs of African Americans toward participation in medical research. J. Gen. Intern. Med. 14, 537–546 (1999).
120. E. Portacolone et al., Earning the trust of African American communities to increase representation in dementia research. Ethn. Dis. 30, 719–734 (2020).
121. A. Wilkinson, M. Parker, F. Martineau, M. Leach, Engaging ‘communities’: Anthropological insights from the West African Ebola epidemic. Philos. Trans. R. Soc. Lond. B Biol. Sci. 372, 20160305 (2017).
122. K. G. Claw et al.; Summer internship for INdigenous peoples in Genomics (SING) Consortium, A framework for enhancing ethical genomic research with Indigenous communities. Nat. Commun. 9, 2957 (2018).
123. S. M. Sidik, Weaving Indigenous knowledge into the scientific method. Nature 601, 285–287 (2022).
124. L. Wade, To overcome decades of mistrust, a workshop aims to train Indigenous researchers to be their own genome experts. Science, (2018). https://doi.org/10.1126/science.aav5286.
125. D. M. J. Lazer et al., The science of fake news. Science 359, 1094–1096 (2018).
126. T. C. O’Brien, R. Palmer, D. Albarracin, Misplaced trust: When trust in science fosters belief in pseudoscience and the benefits of critical evaluation. J. Exp. Soc. Psychol. 96, 104184 (2021).
127. W. J. McGuire, D. Papageorgis, The relative efficacy of various types of prior belief-defense in producing immunity against persuasion. J. Abnorm. Soc. Psychol. 62, 327–337 (1961).
128. M. Vivion et al., Prebunking messaging to inoculate against COVID-19 vaccine misinformation: An effective strategy for public health. J. Commun. Healthc., 10.1080/17538068.2022.2044606 (2022).
129. T. Bolsen, J. N. Druckman, F. L. Cook, Citizens’, scientists’, and policy advisors’ beliefs about global warming. Ann. Am. Acad. Pol. Soc. Sci. 658, 271–295 (2015).
130. B. M. Tappin, G. Pennycook, D. G. Rand, Rethinking the link between cognitive sophistication and politically motivated reasoning. J. Exp. Psychol. Gen. 150, 1095–1114 (2021).
131. N. Miller, D. T. Campbell, Recency and primacy in persuasion as a function of the timing of speeches and measurements. J. Abnorm. Psychol. 59, 1–9 (1959).
132. R. E. Petty, D. W. Schumann, S. A. Richman, A. J. Strathman, Positive mood and persuasion: Different roles for affect under high- and low-elaboration conditions. J. Pers. Soc. Psychol. 64, 5–20 (1993).
133. B. Nerlich, N. Koteyko, B. Brown, Theory and language of climate change communication. Wiley Interdiscip. Rev. Clim. Change 1, 97–110 (2010).
134. C. M. Angst, R. Agarwal, Adoption of electronic health records in the presence of privacy concerns: The elaboration likelihood model and individual persuasion. Manage. Inf. Syst. Q. 33, 339–370 (2009).
135. C. M. Steele, T. J. Liu, Dissonance processes as self-affirmation. J. Pers. Soc. Psychol. 45, 5–19 (1983).
136. T. Epton, P. R. Harris, Self-affirmation promotes health behavior change. Health Psychol. 27, 746–752 (2008).
137. P. Sparks, D. C. Jessop, J. Chapman, K. Holmes, Pro-environmental actions, climate change, and defensiveness: Do self-affirmations make a difference to people’s motives and beliefs about making a difference? Br. J. Soc. Psychol. 49, 553–568 (2010).
138. M. Feinberg, R. Willer, Moral reframing: A technique for effective and persuasive communication across political divides. Soc. Personal. Psychol. Compass 13, e12501 (2019).
139. A. Luttrell, R. E. Petty, Evaluations of self-focused versus other-focused arguments for social distancing: An extension of moral matching effects. Soc. Psychol. Personal. Sci. 12, 946–954 (2021).
140. A. Corner, N. Pidgeon, Like artificial trees? The effect of framing by natural analogy on public perceptions of geoengineering. Clim. Change 130, 425–438 (2015).
141. C. A. Summers, R. W. Smith, R. W. Reczek, An audience of one: Behaviorally targeted ads as implied social labels. J. Consum. Res. 43, 156–178 (2016).
142. J. Hansen, M. Wänke, Truth from language and truth from fit: The impact of linguistic concreteness and level of construal on subjective truth. Pers. Soc. Psychol. Bull. 36, 1576–1588 (2010).