Why are people antiscience, and what can we do about it?
Aviva Philipp-Muller (a,1,2), Spike W. S. Lee (b,c), and Richard E. Petty (a)

Author affiliations: (a) Department of Psychology, The Ohio State University, Columbus, OH 43210; (b) Rotman School of Management, University of Toronto, Toronto, ON M5S 1A1, Canada; (c) Department of Psychology, University of Toronto, Toronto, ON M5S 1A1, Canada.

(1) Present address: Beedie School of Business, Simon Fraser University, Burnaby, BC V5A 1S6, Canada. (2) To whom correspondence may be addressed. Email: aviva_philipp-muller@sfu.ca.

Author contributions: A.P.-M., S.W.S.L., and R.E.P. wrote the paper. The authors declare no competing interest. This article is a PNAS Direct Submission.

Edited by Timothy Wilson, University of Virginia, Charlottesville, VA; received February 3, 2022; accepted May 19, 2022; published July 12, 2022.

Copyright © 2022 the Author(s). Published by PNAS. This article is distributed under Creative Commons Attribution-NonCommercial-NoDerivatives License 4.0 (CC BY-NC-ND). This article contains supporting information online at http://www.pnas.org/lookup/suppl/doi:10.1073/pnas.2120755119/-/DCSupplemental.

PNAS 2022 Vol. 119 No. 30 e2120755119. https://doi.org/10.1073/pnas.2120755119
From vaccination refusal to climate change denial, antiscience views are threatening humanity. When different individuals are provided with the same piece of scientific evidence, why do some accept whereas others dismiss it? Building on various emerging data and models that have explored the psychology of being antiscience, we specify four core bases of key principles driving antiscience attitudes. These principles are grounded in decades of research on attitudes, persuasion, social influence, social identity, and information processing. They apply across diverse domains of antiscience phenomena. Specifically, antiscience attitudes are more likely to emerge when a scientific message comes from sources perceived as lacking credibility; when the recipients embrace the social membership or identity of groups with antiscience attitudes; when the scientific message itself contradicts what recipients consider true, favorable, valuable, or moral; or when there is a mismatch between the delivery of the scientific message and the epistemic style of the recipient. Politics triggers or amplifies many principles across all four bases, making it a particularly potent force in antiscience attitudes. Guided by the key principles, we describe evidence-based counteractive strategies for increasing public acceptance of science.
antiscience | attitudes | social identity | politics | science communication
From refusing to get vaccinated against COVID-19 (1) to ignoring worsening climate change (2), rejection of scientific information is costing lives now and will continue to do so in the future. One need only look at recent polling data to find concerning cases of public rejection of scientific evidence or denial of solutions with high levels of consensus among scientists. For example, a September 2021 poll found that only 61% of Americans saw COVID-19 as a major public health threat (3). Another recent poll found that 40% of Americans do not think climate change is a major threat (4). Additional examples abound around the world (5).

Dismissal of scientific evidence is not a new phenomenon, however. When germ theory was proposed in the 19th century, an anticontagionist movement rejected the notion that disease could be spread through minuscule germs. Earlier scientific discoveries, such as the heliocentric nature of the solar system, were met with heavy opposition. But why? From early to contemporary examples, what are the psychological principles that account for people's antiscience views? That is, when different individuals are provided with the same piece of scientific evidence, why do some go on to accept and integrate it as a fact, whereas others dismiss it as invalid or irrelevant?
Numerous scholars have pondered the antecedents of antiscience views. Existing models have identified factors that predict wariness of specific scientific innovations or theories (6, 7) or antiscience attitudes overall [e.g., the attitude roots and jiu jitsu models (8)]. These and other models noted throughout our article offer important insights. But one theoretical paradigm that has been largely ignored in the antiscience literature, despite its substantive relevance, is the classic perspective on attitudes and persuasion (9). This is surprising, because antiscience views represent a crisis of attitudes due to both effective persuasion by antiscience sources and ineffective persuasion by scientific or "proscience" sources. This is also a missed opportunity, because classic work on persuasion has highlighted a number of explanatory processes and remediative strategies, many of which are highly applicable to the problem of antiscience attitudes. The goal of our article is to make these connections explicit and constructive. We do so by connecting contemporary findings and models in the antiscience literature to key principles from decades of research on attitudes, persuasion, social influence, social identity, and acceptance versus rejection of information writ large. Drawing these connections confers the dual scientific benefits of organizing our understanding of antiscience phenomena and delineating how classic components of persuasive processes formulated in the 20th century are impacted by new forms of social dynamics in the 21st century (e.g., vast and fast social network effects in the spreading of misinformation on social media).
Why Are People Antiscience? An Inclusive Framework
Distinct clusters of basic mental processes can explain when and why people ignore, trivialize, deny, reject, or even hate scientific information, a variety of responses that might collectively be labeled as being "antiscience." To organize these processes, we offer an inclusive framework that specifies four core bases of antiscience attitudes (Table 1, first column). In essence, people are prone to rejecting a scientific message when it comes from a source they do not find credible (basis 1), when they, as the recipient of the scientific message, identify with social groups that hold antiscience attitudes (basis 2), when the scientific message itself contradicts their related beliefs or attitudes (basis 3), or when it is delivered in ways that mismatch their motivational and cognitive approaches to information processing (basis 4). Each of these bases involves specific antecedents or predictors and elicits different nuances of psychological reaction (Table 1, second column); they also point to different counteractive strategies (Table 1, third column). Despite their differences in focus, the four bases are unified in revealing ways in which scientific information conflicts with people's existing content or style of thought. Such conflicts are hard to swallow and easy to disavow, rendering effective communication of scientific information a thorny problem, but one that becomes more surmountable once its underlying bases are elucidated.
In the following sections, we introduce each basis of antiscience attitudes by highlighting key principles, identifying relevant models, and reviewing illustrative findings and real-world examples, from heavily studied domains like vaccination to less studied ones like nanotechnology. Next, through the conceptual lens of the four bases, we explain why politics has particularly potent effects on antiscience attitudes. Afterward, we present a variety of counteractive strategies for increasing acceptance of science by targeting the four bases. Finally, we conclude with theoretical contributions of our framework.
Basis 1: Source of the Scientific Message. Lay people do not discover facts about reality in isolation, devoid of external inputs. Instead, they rely on sources of scientific information (scientists or, more frequently for most people, journalists, health officials, politicians, or key opinion leaders) to construct their understanding of the world. In general, the more credible a source is perceived to be, the more likely people are to accept its information and be persuaded by it. Unfortunately, many people perceive scientists, who are supposed to be the original source of scientific information, as lacking credibility (10). Why?

Source credibility is composed of three pillars: expertise (i.e., possessing specialized skills and knowledge), trustworthiness (i.e., being honest), and objectivity (i.e., having unbiased perspectives on reality) (11). All three are necessary. When scientists (or anyone conveying scientific information) are perceived as inexpert, untrustworthy, or biased, their credibility is tainted, and they lose effectiveness at conveying scientific information and changing opinions.
Although scientists are generally perceived as high in competence and expertise (12), this perception is facing mounting challenges. Concerns about the truth value and robustness of scientific findings in multiple fields, from medical to social sciences (13, 14), have received media attention (15). Lay perception of scientists' credibility can even be undermined by features central to the very mission of science: Legitimate debates happen within scientific fields, with different scientists championing different, sometimes contradictory, perspectives, theories, hypotheses, findings, and recommendations. [As a current example at the time of our writing, scientists differ in their recommendations about whether and when to roll out the second booster shot for COVID-19 (16).] In principle, these can be signs of a healthy scientific ecosystem. In practice, contradictions between scientists, especially against the backdrop of replicability concerns, threaten lay perceptions of scientists' credibility (17).

Table 1. Key principles driving antiscience attitudes and counteractive strategies for addressing them

Basis 1. Source of the scientific message
Key principles: When sources of scientific information (e.g., scientists) are perceived as 1) inexpert, 2) untrustworthy, or 3) biased, they lack credibility, and their messages are ignored or rejected.
Counteractive strategies: 1, i) improving perceived and actual validity of scientists' work; 1, ii) legitimizing substantive scientific debate; 2) conveying warmth and prosocial goals in science communication and using accessible language; 3) conveying that the source is not antagonistic to the recipient, such as by providing two-sided messages that clearly state the side for which there is stronger evidence.

Basis 2. Recipient of the scientific message
Key principles: When scientific information activates one's social identity as a member of a group 1) that holds antiscience attitudes or 2) that has been underrepresented in science or exploited in scientific work, it triggers ingroup favoritism and outgroup antipathy.
Counteractive strategies: 1) activation of shared or superordinate identity; 2) engaging and collaborating with marginalized communities.

Basis 3. The scientific message itself
Key principles: When scientific information contradicts what people 1) believe to be true, 2) evaluate as favorable, or 3) moralize, they experience cognitive dissonance, which is more easily resolved by rejecting the scientific information than by changing existing beliefs, attitudes, or values.
Counteractive strategies: 1, i) training in scientific reasoning; 1, ii) prebunking; 2, i) strong arguments; 2, ii) self-affirmation; 3, i) moral reframing; 3, ii) increasing the perceived naturalness and moral purity of scientific innovations.

Basis 4. Mismatch between the delivery of the scientific message and the recipient's epistemic style
Key principles: When scientific information is delivered in ways that mismatch one's 1) construal level, 2) regulatory focus, 3) need for closure, or 4) need for cognition, it tends to be rejected.
Counteractive strategies: 1-4) matching the delivery of scientific information with the recipient's epistemic style (e.g., framing messages as approaching gains for promotion-focused recipients but as avoiding losses for prevention-focused recipients).
Scientists' trustworthiness is also threatened by multiple social forces. Distrust of elites (i.e., those with societal influence) is on the rise (18), and scientists whose voices are broadcast in the public sphere are often employed by elite media and institutions. Distrust of government organizations is on the rise too (19), which predicts distrust in scientists who recommend innovations that would require greater governmental regulation (20). Furthermore, scientists have been stereotyped as cold and unfeeling in character (12, 21), which undermines the public's willingness to trust them (21).

Scientists' objectivity has also been called into question. Scientists in certain fields are portrayed and perceived as exhibiting biased perspectives against Christian (22) and conservative (23) values. Indeed, many religious individuals reject science, in part, due to the perception that scientists are atheistic (24). More generally, when scientists are thought to have a vested interest (e.g., monetary incentives) in persuading their audience, they are perceived as both biased and untrustworthy (25). During the COVID-19 pandemic, widespread misinformation characterized prominent public health officials as promoting the vaccine because of their financial investment in various pharmaceutical companies (26). In short, scientists can be perceived as inexpert, untrustworthy, or biased, which threatens their credibility in the public eye.
Basis 2: Recipient of the Scientific Message. People vary in how interested and willing they are to listen to different types of information (27, 28). A powerful force that shapes the types of information individuals expose themselves to or actively seek out is their social identities. Substantial research on social identity theory has found that the social groups to which individuals belong or feel a connection exert strong influences on their response to information perceived to be identity relevant (29). For example, young adults are more likely to seek out positive (vs. negative) information about young adults (their ingroup), and older individuals are more likely to seek out negative information about young adults (their outgroup) (30).

Social identities play a role in antiscience attitudes and behaviors. Those who have been underrepresented in science or who have historically been exploited in scientific experiments [e.g., Black and Indigenous individuals (31)] are more skeptical of science (32). In addition to demographic groups, people can identify with interest groups that shape antiscience attitudes. For example, those who strongly identify as video gamers are more likely to reject scientific evidence regarding the harms of playing video games (33). These findings are broadly consistent with research and models in science communication that describe how people tend to reject scientific information incompatible with their identities. Work on cultural cognition has highlighted how people contort scientific findings to fit with values that matter to their cultural identities (34, 35). Relatedly, work on identity-protective cognition shows that people selectively dismiss scientifically determined risk assessments that threaten their identity (36), as when White men are more likely than other racial and gender groups to dismiss data regarding the riskiness of guns, because guns are a more integral part of their cultural identity (37).
Beyond the effects of identifying with specific demographic or cultural groups that can conflict with specific scientific findings, some individuals identify with groups that altogether ignore and shut down scientific thought, recommendations, and evidence, in general (38, 39). This sort of identity is often tied to other personally meaningful identities, particularly political ones [and religious ones (39)], a theme we elaborate on shortly. An important nuance and caveat, however, is that, although scientists might characterize some social groups as antiscience, the individuals who identify with these groups might not think of themselves as explicitly or consciously disavowing science. They might even think of themselves as proscience, in that they believe their own views are more scientifically sound than those of mainstream scientists (40). In what sense, then, are they antiscience? In the sense that, if they reject the preponderance of scientific evidence and instead favor positions with scant or pseudoscientific support, then they are de facto acting in opposition to how science works: they are against the scientific approach to knowledge creation and the knowledge created by it.
In addition to being against scientific information, individuals can be against the people providing or promoting the scientific information. This is, unfortunately, a common aspect of social identity, namely, antipathy toward those who do not share that identity and are thus part of the outgroup (41). For example, those who identify as climate change skeptics harbor hostile feelings toward climate change believers (42). For individuals who embrace an identity associated with antiscience attitudes, scientists are members of the outgroup. People tend to reject what outgroup members have to say, sometimes to the point of violence, which can arise even in the absence of substantive reasons for rejecting the outgroup member's message other than that it comes from the outgroup (43). These forces of social identity reflect why many individuals who strongly identify with antiscience groups seem to vehemently reject scientific messages and frequently approach scientists with hostility, even threatening their lives (44).
Similar dynamics are evident in the marked rise in conspiracy theories related to COVID-19 (e.g., the pandemic was a hoax, or the vaccines contained microchips). These conspiracy theories often coalesce around particular social groups and are most vehemently promoted by those who feel highly identified with their pseudoscientific community (45). In recent years, conspiracy theories have led to highly visible behavior such as antimask and antivaccine protests. Due to social media, antiscience groups can now mobilize activists and followers more swiftly than in previous eras. Beyond the context of COVID-19, social groups that reject mainstream science have emerged surrounding unvalidated treatments for Lyme disease (46) and opposition to getting oneself or one's children immunized in general (47).
Basis 3: The Scientific Message Itself. People do not always think and behave in line with what science suggests. One reason is that they are unaware of the scientific evidence [i.e., the deficit model (48)]. Sometimes, when people simply learn about the scientific consensus, their thoughts and feelings follow suit [i.e., the gateway belief model (49)]. Other times, however, when scientific information contradicts people's existing beliefs about what is factually true, they can reject even the strongest scientific evidence, because harboring conflicting cognitions is aversive. This phenomenon is known as cognitive dissonance (50), which arises when a person is exposed to information that conflicts with their existing beliefs, attitudes, or behaviors. Dissonance elicits discomfort. Given this aversive feeling, people are motivated to resolve the contradiction and eliminate the discomfort in a number of ways, such as rejecting the new information, trivializing the topic, rationalizing that there is no contradiction, or revising their existing thought (51).

Critically, people tend to resolve dissonance using the path of least resistance. To a person who has been smoking their entire life, it is far easier to reject or trivialize scientific evidence about the health risks of smoking than to alter their ingrained habit. With dissonance, the intransigence of existing beliefs resembles the stickiness of existing behaviors: It is easier to reject a piece of scientific information than to revise an entire system of existing beliefs one has accumulated and integrated into a worldview over the years, often reinforced by social consensus. One's existing beliefs can be based on valid scientific information, previously accepted but now outdated scientific information, or scientific misinformation. As an example of dissonance arising from believing outdated scientific information, for thousands of years, it was a widespread belief that Earth was the center of the universe and that the sun orbited Earth (52). To a person who had always believed the sun revolved around Earth, it was far easier to reject the notion of Copernican heliocentrism than to overhaul the geocentric model of the universe, which was previously accepted and felt subjectively coherent enough, and thus in no obvious need for revision.
In addition to rejecting new information from scientific progress and updates, individuals might possess beliefs that contradict scientific evidence due to the spread of misinformation. The last few years have witnessed a proliferation of fake news (53), catalyzed by social media, which facilitates the rapid spread of information regardless of whether it is true. Sadly, fake news spreads significantly "farther, faster, deeper, and more broadly" than true news on social media platforms, because fake news stories often evoke stronger emotional reactions and come across as more novel than true ones, which are attributes that increase sharing behavior (54). Although some individuals might be sharing misinformation merely because of inattention to veracity (not because of endorsement of content) (55), extensive sharing of fake news among one's ingroup makes it likely to be accepted, due to the dynamics of social identity outlined earlier, which can result in rapid acceptance of pseudoscientific or antiscientific beliefs.

Once misinformation has spread, it is difficult to correct (56), and there is often a continued influence of the misinformation even after it has been retracted. Corrections issued by media sources are typically ineffective at reducing belief in the misinformation. In fact, corrections can sometimes reinforce the belief by making it more salient (56). Unfortunately, misinformation on many scientific topics has been widely disseminated, such as exaggerated and unfounded risks of vaccines (including pre-COVID times), denial of climate change, and dismissal of evidence for evolution (57).
Scientific misinformation is especially difficult to correct when it provides a causal explanation for a phenomenon. Correcting the misinformation would leave a gap in people's mental model of why an event or a situation has occurred (58) and would cause discomfort (59). People often refill that gap with misinformation to make sense of the issue at hand. Circling back to the example of heliocentrism, telling a geocentrist that Earth is actually not the center of the universe would leave a gap in their mental model of why the sun clearly appears to be revolving around Earth, a gap that is easy to refill by reaffirming their existing causal belief. Similar cognitive dynamics have long been observed in pseudoscience (60) and continue to result in rejection of scientific information today.
Not only do people possess beliefs about whether things are true or false, they also evaluate things as desirable or undesirable (attitudes) (9), important or unimportant (values) (61), and right or wrong (morals) (62). Some moral views are at odds with particular kinds of scientific information, resulting in morally fueled rejection. For example, people who endorse the moral significance of naturalness and purity are prone to resisting scientific technologies and innovations seen as tampering with nature. Vaccines (63) and genetically modified food (64), despite their documented benefits, are often rejected due to perceptions that they are unnatural. This cluster of moral intuitions about naturalness and purity is highly related to individual differences in aversion to "playing God," an aversion that predicts lower willingness to fund the NSF and less monetary donation to organizations supporting novel scientific procedures (65).
Attitudes rooted in one's notions of right and wrong (e.g., not eating meat as a moral issue rather than as a taste preference) are particularly strong (66) and tend to be more extreme, persistent over time, resistant to change, and predictive of behavior (67). For example, people with moralized attitudes toward recycling are more resistant to counterattitudinal information regarding the efficacy of recycling (68). To resolve dissonance from conflicting information, rejecting the novel scientific information is often the path of lesser resistance than revising one's existing moralized attitudes. Likewise, when misinformation is consistent with one's existing attitudes, it is difficult to correct (69). To people who love driving high-horsepower but gas-guzzling vehicles, misinformation such as "climate change is a hoax" would be attitude consistent, whereas scientific correction of this misinformation would be attitude inconsistent and thus prone to rejection.
Basis 4: Mismatch between the Delivery of the Scientific Message and the Recipient's Epistemic Style. Even when scientific information does not conflict with an individual's beliefs or attitudes, it can still be rejected for reasons beyond the content of the message. In particular, when scientific information is delivered in ways that are at odds with a person's style of thinking about the topic at hand or their general approach to information processing, it is less likely to be processed and more likely to be rejected (70).
For example, when people construe an issue in abstract/high-level (vs. concrete/low-level) terms, concrete (vs. abstract) scientific information about the issue mismatches their construal level and tends to be rejected. People typically construe the issue of climate change in abstract/high-level terms (e.g., global environmental degradation), because the consequences of climate change are seen as psychologically distant (71), and distance promotes abstract construal (72). Thus, when ecofriendly products are described in concrete/low-level terms (e.g., fine details about the product's carbon savings), despite making a compelling case, they tend to be rejected (71). Evaluation and choice of sustainable products are also undermined when the products are described in concrete terms of self-interested economic savings to consumers who think abstractly about sustainability (73).
Even holding the level of abstractness/concreteness constant, scientific information can be presented in a gain frame or a loss frame. Describing a vaccine as 90% effective (gain frame) is technically equivalent to describing it as 10% ineffective (loss frame), but with dissimilar psychological effects, because the frame can be at odds with people's regulatory focus (74). Promotion focus orients people to eagerly attaining gains; prevention focus orients people to cautiously preventing losses. When scientific information is framed as promoting gains (vs. preventing losses), it tends to be rejected by people who are prevention focused (vs. promotion focused) (74). Such mismatch effects have been found to result in rejection of climate change (75) and health messages (e.g., vaccination and smoking cessation) (76).
Framing of scientific information also varies in how certain and decisive it seems. Even when there is a high degree of scientific consensus, scientific information is often disseminated in terms that signal uncertainty. Such terminology, while technically accurate, leads people with high need for closure (i.e., low tolerance of epistemic uncertainty) (77) to reject it. For example, when people receive mixed scientific information about vaccines, those with high need for closure are particularly likely to become entrenched in their existing views and reject the mixed information (78). More generally, people with high need for closure are more likely to reject novel information that challenges their currently held conclusions or assumptions (77). This poses a challenge for scientists, who are trained to hedge their findings and avoid overclaiming certainty, as they try to communicate the preliminary, inconclusive, nuanced, or evolving nature of scientific evidence.
Finally, scientific information varies in its quality. Intuitively, high-quality arguments are more persuasive than low-quality ones (79). But this is often not true for people with low need for cognition (i.e., people who do not enjoy thinking), for whom low-quality arguments can be just as persuasive as high-quality ones if positive peripheral cues (e.g., a likable source) are present (80). Therefore, while good-quality scientific evidence is, overall, more likely to be accepted than bad-quality evidence (81), people who do not enjoy thinking are less likely to appreciate such quality distinctions. They are less likely to process complex information, as comprehending it requires active thinking (79). They are also less likely to choose to read nuanced science blog posts (82) and less likely to accept evidence for climate change and evolution (83).
Construal level, regulatory focus, need for closure, and need for cognition are different dimensions of epistemic style. On any of these dimensions, a mismatch between how scientific information is delivered and how the recipient thinks will increase the probability of rejection. More generally, mismatches between message delivery and recipient style (basis 4), content conflicts (basis 3), social identity (basis 2), and sources lacking in credibility (basis 1) all contribute to antiscience attitudes. They also point to why politics is a particularly potent driver of these attitudes.
How Politics Drives Antiscience Attitudes. Acceptance of scientific information is now sharply divided along political lines, with individuals in different camps holding, even enshrining, vastly different views (84). Conservatives are more likely than liberals to reject scientific evidence supporting evolution (85) and the existence of anthropogenic climate change (86), and have lower intentions to get vaccinated against COVID-19 (87). Although liberals, overall, are more inclined to accept scientific evidence (86–88), there are specific topics about which they are more likely to be skeptical, such as the risk of nanotechnology (35). How do we make sense of these political divides?
The literature on antiscience attitudes has found that rejection of scientific information by members of different political camps is often based on motivational factors (89). Building on these insights, we argue that politics can trigger or amplify basic mental processes across all four bases of antiscience attitudes, thereby making it a particularly potent force. Because the mental processes are not mutually exclusive, many of the political influences described below are likely to occur in conjunction with each other.
Politics impacts people's perception of scientists' credibility (the source) via perceived expertise and trustworthiness (90). In general, people see others with similar political views as more expert and knowledgeable. Both liberals and conservatives are less trusting of scientists whose work contradicts their ideological viewpoint (91), and recent exposure to such contradictory information reduces trust in the entire scientific community (92). Because liberals and conservatives find different sources credible (e.g., CNN vs. Fox News), they expose themselves to different scientific information (93) and misinformation (94), often reinforced by cues from trusted political elites (95), further entrenching them in siloed networks. In the era of social media and algorithmically customized news feeds, even what appears to be the same source (e.g., Facebook) can provide highly varied information to different users (96), exacerbating the division of communities along political lines.
For many, politics is more than just a set of beliefs or ideals; it is a core part of their identity (97), which can have a large impact on how they, as a recipient, react to different pieces of scientific evidence, policy proposals, and legislation. Those who identify strongly as a Democrat or a Republican tend to show different responses to various pieces of scientific information, with each group rejecting proposals purportedly put forward by the outgroup, even when doing so goes against their own best interest. For example, when carbon taxes are framed as being a predominantly Republican (vs. Democrat) policy, those who identify as Democrat (vs. Republican) are more likely to oppose the policy (96). This opposition to anything proposed by the outgroup is mediated by the perception that the outgroup is a threat to society (99), and threats reliably trigger outgroup antipathy (100). Such antipathy is prevalent in the political sectarianism of our time (101), which leads many individuals to selectively expose themselves to congenial scientific information (28).
Indeed, people have a strong tendency to seek out information (the message) that reinforces their existing beliefs (93), a phenomenon intensified by online platforms, which heighten the speed and scope of exposure to information and misinformation in homogenous and polarized echo chambers (102). Much of the misinformation online is politically charged, covering diverse topics from elections to climate change (57). Research on values-based messaging has found that, when a political message evokes values discordant with people's existing values, it tends to be rejected (103). Indeed, when scientific information contradicts people's beliefs shaped by political forces, it tends to be rejected outright as simply untrue, a tendency exhibited by both liberals and conservatives (104). Worse still, the more extreme or morally charged people's political views, the stronger their sense of belief superiority, regardless of accuracy (105), further amplifying the rejection of belief-contradictory scientific information.
Alongside content differences (the types of messages liberals and conservatives seek out and accept), liberals and conservatives also differ in how they approach information (epistemic styles). Conservatives are, on average, more prevention focused, and liberals are more promotion focused (106). According to this logic, conservatives would be more likely to reject scientific information framed as approaching gains, and liberals would be more likely to reject scientific information framed as avoiding losses. Conservatives also have a stronger need for closure (107), which is linked to stronger beliefs in a variety of conspiracy theories with no scientific basis (108).
Altogether, politics is a particularly potent force in rejection of scientific information because it strikes all four bases of antiscience attitudes, at times amplifying them. Acute increases in political partisanship and sectarianism (101) in recent years have only accentuated the potency and toxicity of such political influences.
What Can We Do About Antiscience Attitudes?
By specifying the key principles underlying antiscience attitudes, our framework suggests counteractive strategies for increasing acceptance of scientific information by targeting each of the four bases (Table 1, third column). Obviously, no single strategy is perfect or universal, and the current era is replete with unique challenges, such as the spread of misinformation on social media, but specific strategies can still be effective in their respective contexts, for specific goals. We outline a number of these strategies briefly.
Targeting Basis 1: Increasing Perception of Scientific Information Sources as Credible. Scientists lack credibility when they are perceived as inexpert, untrustworthy, or biased. To tackle emerging concerns about the quality of scientists' work and their perceived expertise, trustworthiness, and objectivity, scientists need to improve the validity of their research (109) and establish the replicability and reproducibility of their findings. Scientists also need to communicate to the public that substantive debate and disagreement are inherent to the scientific process and signal a healthy scientific landscape, a point often missed by lay people who expect a given scientific finding to be absolute (17). To maximize effectiveness, scientists and science organizations need to recruit journalists, health officials, politicians, or key opinion leaders to join these communicative efforts, as they are often the sources conveying scientific information directly to the public or the sources that the public already trusts.
To reduce distrust in scientists due to their perceived coldness (12), when scientists communicate their findings and recommendations, they should ameliorate the unfavorable impressions by intentionally conveying interpersonal warmth and highlighting the communal nature of science, a tactic that has proven effective for a different but related goal: recruiting girls and women into STEM training programs and careers (12). Another strategy that is related to but distinct from conveying warmth is for scientists to communicate that they are pursuing prosocial goals in their work. When people perceive scientists as prosocial, they have greater trust in science (110).
Scientists also often use excessively complex language when communicating science to the general public (111). To mitigate the negative perception from jargon-laden wording that conceals the meaning of the science from lay people, scientists should use language that conveys their message clearly and precisely while still being accessible to a general audience. One specific suggestion in this vein, which most journals have yet to adopt, is for published articles to include "lay summaries" along with the more jargon-laden abstracts, so that interested lay people can better glean the information in terms that they can understand (112).
To reduce perceived bias, scientists should attempt to communicate in a balanced manner whenever possible. When communicators offer a nuanced, multifaceted perspective, especially if they change positions in the face of new evidence, they are perceived as less biased and more persuasive (113). When a communicator expresses openness to alternative views, especially when of high status, this can increase openness in committed recipients (114). For example, those who saw the issue of wearing masks in the COVID-19 pandemic as a moral impingement on their rights were more open to wearing masks when a communicator acknowledged the recipients' view but explained why the promask position was preferable (115). Importantly, we are not suggesting that communicators adopt a position of false neutrality or "both sidesism." Instead, we are suggesting that they honestly acknowledge any drawbacks of their position while ultimately explaining in clear and compelling terms why their position is still the most supported or more justifiable one.
Targeting Basis 2: Decreasing Recipients' Identification with Antiscience Groups. To reduce the salience or strength of recipients' identification with groups that embrace antiscience views, science communicators should invoke meaningful and important shared social identities between themselves and the recipients of scientific messages (116). For groups in conflict, finding a common or superordinate identity often helps the two groups minimize their conflict and approach intergroup harmony (117). If those viewing scientists as outgroup members can see themselves as sharing a common identity with scientists, antiscience sentiment and the derogation of scientists can be reduced. For example, when scientists offer their recycled water policy suggestions to a hostile audience, finding common ground via a superordinate identity successfully increases audience receptivity (118). One way to legitimately claim a shared identity between scientists and antiscience community members is by bringing together different stakeholders to form one group (e.g., a committee) that is working toward shared goals, while still preserving the original subgroups within the superordinate identity (98).
Science communicators should also seek to earn the trust of groups that have been historically exploited or excluded by the science community (119, 120). This can be done by directly engaging with the target groups in the process of conducting the research (121). For example, rather than treating racialized or historically underrepresented groups as the objects of study, scientists can collaborate with members of these communities and build cultural competencies (122). Scientific funding agencies' requirement of active Indigenous participation in any research that might impact or involve Indigenous communities (123) offers another step toward reconciliation. Programs that train marginalized individuals to be the scientists working within their own communities also help to earn trust from racialized communities, as when a program that trains Indigenous genome researchers increases trust in science (124). Many of these efforts are still rather nascent, however, and, unlike the other counteractive strategies outlined in our article, their efficacy has not yet been rigorously assessed. We encourage proper quantitative assessment of these efforts' effectiveness. If useful, they can be scaled up to help rebuild or strengthen the rapport between scientists and diverse communities.
Targeting Basis 3: Increasing Acceptance of Scientific Information Even When It Contradicts One's Beliefs and Attitudes. To tackle rejection of scientific information that contradicts an audience's beliefs, prevention is better than cure: Whenever possible, minimize the formation of ill-informed beliefs in the first place. One preventive strategy is to train people in scientific reasoning (i.e., the ability to evaluate the quality of scientific information). People equipped with scientific reasoning skills are more likely to accept high-quality scientific evidence (84). This strategy is especially apt for combatting the rise of fake news [which is another major problem that requires societal-level changes in digital infrastructure (125)]. Arming media consumers with the skills to differentiate between true and false scientific information leads them to become more discerning regarding which beliefs to adopt (125). Critically, this strategy pertains to conveying the correct scientific information prior to any misinformation being adopted. An additional caveat is that, although encouraging critical reasoning decreases belief in scientific misinformation, simply telling people that they should trust science more can actually increase belief in and dissemination of misinformation framed as being scientific (compared with misinformation not framed as being scientific) (126).
Related to the broader notion of training in scientific reasoning, a specific strategy is called prebunking. Derived from the logic of disease inoculation (127), it involves forewarning people that they will be receiving misinformation, then giving them a small dose of misinformation (the "vaccine") and refuting it so that they will be better able to resist misinformation when they encounter it in the wild (the "disease"). Data from a field experiment among older adults have found this strategy to be effective for minimizing the impact of disinformation on people's intention to receive a COVID-19 vaccine (128).
Another preventive strategy, which sounds intuitive but turns out to be ineffective for enhancing acceptance of scientific information, is increasing a population's general scientific literacy. Unlike specialized scientific knowledge, general scientific literacy does not involve a deep dive into why a scientific phenomenon occurs (89). Unlike scientific reasoning skills, general scientific literacy does not teach people how to parse scientific information (84). Instead, it merely entails imparting an unelaborated list of scientific information (89). Why is it ineffective for enhancing acceptance of scientific information? Because people with more scientific literacy are simply more sophisticated at bolstering their existing beliefs by cherry-picking ideas and information to defend their worldview (84). Higher levels of scientific literacy, instead of leading people to coalesce around scientific "truths," can increase polarization of beliefs (84). Similarly, greater cognitive sophistication (e.g., stronger analytic thinking) does not necessarily reduce antiscience views, as the most cognitively sophisticated and educated people can also be the most polarized (129), although the evidence for and interpretation of this pattern have been subject to debate (130).
When preventive strategies are implausible, curative ones are necessary. Simply learning information is often uncorrelated with attitude change (48, 131). What matters more than whether people learn or remember the information they have been told is how they react to that information. If people have positive reactions to a message, they are more likely to change their attitudes to be in line with that message (132). By implication, merely informing the public of scientific information is insufficient; one must also persuade them. Strong, well-reasoned, and well-substantiated arguments, implemented by skilled science communicators, have been found effective for altering even entrenched attitudes, such as toward climate change (133) and the safety of electronic health records (134).
But, for the particularly intransigent, additional strategies should be utilized to supplement persuasive arguments. As noted earlier, a fundamental mechanism that leads people to reject scientific information contradictory to their beliefs is cognitive dissonance. This aversive state has been found to be reduced by a procedure called self-affirmation, which involves prompting people to conjure and affirm values that matter to them (e.g., caring for one's family) in ways unrelated to the cognitive conflict at hand (135). Why does self-affirmation reduce dissonance? Because it increases one's sense of self-integrity and security, which reduces the threatening effect of dissonance to the self. Self-affirmation interventions have been used successfully to reduce defensiveness and increase acceptance of scientific information regarding health behaviors (136) and climate change (137).
Sometimes, scientific messages conflict not only with a person's beliefs and attitudes but also with their particular moral concerns. To manage this, an effective strategy is to identify the specific morals the recipient endorses and reframe the scientific message to accord with them. Conservatives, who endorse the moral foundation of ingroup loyalty, are more persuaded by messages about climate change framed as a matter of loyalty to one's country. Liberals, who endorse the moral foundation of intentional care, are more persuaded by messages about climate change framed as a matter of care for innocent creatures (138). Moral reframing has also been found effective for minimizing morally based opposition to vaccines and stem cell technology (138). Similarly, for recipients who think about public health in more (vs. less) moral terms, messages that use moral arguments such as engaging in physical distancing during the COVID-19 pandemic to benefit others (vs. oneself) are more persuasive (139).
To increase acceptance of scientific evidence among those who have strong moral intuitions about naturalness/purity, science communicators can specifically reframe scientific innovations as confluent with nature. For example, increasing the perceived naturalness of geoengineering has been found to increase people's acceptance of it as a strategy to combat climate change (140). Overall, these findings suggest that science communicators can create multiple moral frames when communicating their scientific information to distinct audiences (e.g., liberals vs. conservatives, religious vs. nonreligious) who are likely to have different moral intuitions or views.
Targeting Basis 4: Matching the Delivery of the Scientific Message with the Recipient's Epistemic Style. People tend to reject scientific information when it is delivered in ways that mismatch their epistemic styles. This basic principle has theoretically straightforward implications for what counteractive strategies to use: Identify the recipient's style, and match it. To implement a matching strategy, regional demographic data (e.g., on political leanings) can aid in developing psychographically targeted communications at the aggregate level. Given the vast amounts of fine-grained, person-specific data that various technology companies collect on people's online activity (if they have not opted out), targeting may even be done at the individual level, which has been found effective for changing behavior (141). Consumer researchers have long been segmenting and targeting consumers based on rich psychographic and behavioral data. Other public interest groups could adopt similar strategies and use the logic of targeted advertising to more precisely position their scientific communications with different audiences in mind. The essence of this strategy is to craft different messages or different delivery approaches for different audiences. For recipients who think abstractly (vs. concretely), scientific messages delivered in an abstract (vs. concrete) manner increase their acceptance of the scientific information as true (142). For recipients who are promotion focused (vs. prevention focused), messages about health behavior framed as approaching gains (vs. avoiding losses) are better accepted (76), and so forth, as explained earlier.
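To make the matching logic concrete, here is a minimal sketch of how a communicator's pipeline might select among prewritten framings of a single claim based on a recipient's epistemic profile. The profile fields, segment labels, and message wording are hypothetical illustrations of the matching principle on two of the four dimensions (construal level and regulatory focus), not materials or measures from the studies cited above; the gain/loss variants reuse the 90%-effective vs. 10%-ineffective equivalence discussed under basis 4.

```python
# Hypothetical sketch of epistemic-style matching; all wording is illustrative.
from dataclasses import dataclass


@dataclass(frozen=True)
class EpistemicProfile:
    """Two of the four epistemic-style dimensions discussed under basis 4."""
    construal: str         # "abstract" (high-level) or "concrete" (low-level)
    regulatory_focus: str  # "promotion" (approach gains) or "prevention" (avoid losses)


# One scientific claim, prewritten in four framings. Gain frames emphasize
# attaining protection (90% effective); loss frames emphasize avoiding the
# equivalent residual risk (10% ineffective).
MESSAGE_VARIANTS = {
    ("abstract", "promotion"):
        "Vaccination lets communities gain lasting protection: the vaccine is 90% effective.",
    ("abstract", "prevention"):
        "Vaccination keeps communities from losing hard-won protection: only 10% of risk remains.",
    ("concrete", "promotion"):
        "Two doses at your local clinic give you 90% protection against infection.",
    ("concrete", "prevention"):
        "Skipping your two doses leaves you exposed; with them, only 10% of the risk gets through.",
}


def select_message(profile: EpistemicProfile) -> str:
    """Return the framing that matches, rather than mismatches, the recipient."""
    return MESSAGE_VARIANTS[(profile.construal, profile.regulatory_focus)]


if __name__ == "__main__":
    # Aggregate-level targeting: each audience segment gets a matched framing.
    segments = [
        EpistemicProfile("abstract", "promotion"),
        EpistemicProfile("concrete", "prevention"),
    ]
    for segment in segments:
        print(f"{segment.construal}/{segment.regulatory_focus}: {select_message(segment)}")
```

The design point is simply that the claim stays constant while its delivery varies; in practice the profile would come from the aggregate or individual-level data sources described above, and need for closure or need for cognition could be added as further keys.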
Concluding Remarks

By offering an inclusive framework of key principles underlying antiscience attitudes, we aim to advance theory and research on several fronts: Our framework highlights basic principles applicable to antiscience phenomena across multiple domains of science. It predicts situational and personal variables (e.g., moralization, attitude strength, and need for closure) that amplify people's likelihood and intensity of being antiscience. It unpacks why politics is such a potent force with multiple aspects of influence on antiscience attitudes. And it suggests a range of counteractive strategies that target each of the four bases. Beyond explaining, predicting, and addressing antiscience views, our framework raises unresolved questions for future research (SI Appendix).
With the prevalence of antiscience attitudes, scientists and science communicators face strong headwinds in gaining and sustaining public trust and in conveying scientific information in ways that will be accepted and integrated into public understanding. It is a multifaceted problem that ranges from erosions in the credibility of scientists to conflicts with the identities, beliefs, attitudes, values, morals, and epistemic styles of different portions of the population, exacerbated by the toxic ecosystem of the politics of our time. Scientific information can be difficult to swallow, and many individuals would sooner reject the evidence than accept information that suggests they might have been wrong. This inclination is wholly understandable, and scientists should be poised to empathize. After all, we are in the business of being proven wrong, but that must not stop us from helping people get things right.
Data Availability. There are no data underlying this work.
ACKNOWLEDGMENTS. We thank Rebecca Walker Reczek, Laura Wallace, Tim Broom, Javier Granados Samoyoa, the Attitudes and Persuasion Lab, and the Mind and Body Lab for feedback.
1. J. W. V. Thangaraj et al., Predominance of delta variant among the COVID-19 vaccinated and unvaccinated individuals, India, May 2021. J. Infect. 84,94118 (2022).
2. World Health Organization, Climate change and health. (Fact Sheet, World Health Organization, 2021). https://www.who.int/news-room/fact-sheets/detail/climate-change-and-health/. Accessed 6 July 2022.
3. A. Tyson, C. Funk, B. Kennedy, C. Johnson, Majority in U.S. says public health benets of COVID-19 restrictions worth the costs, even as large shares also see downsides. Pew Research Center, (2021). https://
www.pewresearch.org/science/2021/09/15/majority-in-u-s-says-publich-health-benets-of-covid-19-restrictions-worth-the-costs-even-as-large-shares-also-see-downsides/. Accessed 30 March 2022.
4. B. Kennedy, U.S. concern about climate change is rising, but mainly among Democrats. Pew Research Center, (2020). https://www.pewresearch.org/fact-tank/2020/04/16/u-s-concern-about-climate-change-is-
rising-but-mainly-among-democrats/. Accessed 28 February 2021.
5. B. T. Rutjens et al., Science skepticism across 24 countries. Soc. Psychol. Personal. Sci. 13, 102117 (2022).
8of10 https://doi.org/10.1073/pnas.2120755119 pnas.org
Downloaded from https://www.pnas.org by 185.203.219.196 on July 12, 2022 from IP address 185.203.219.196.
6. M. J. Hornsey, Why facts are not enough: Understanding and managing the motivated rejection of science. Curr. Dir. Psychol. Sci. 29, 583–591 (2020).
7. B. T. Rutjens, S. J. Heine, R. M. Sutton, F. van Harreveld, “Attitudes towards science” in Advances in Experimental Social Psychology, J. M. Olson, Ed. (Academic, 2018), pp. 125–165.
8. M. J. Hornsey, K. S. Fielding, Attitude roots and Jiu Jitsu persuasion: Understanding and overcoming the motivated rejection of science. Am. Psychol. 72, 459–473 (2017).
9. W. J. McGuire, “The nature of attitudes and attitude change” in The Handbook of Social Psychology, G. Lindzey, E. Aronson, Eds. (Addison-Wesley, ed. 2, 1969), pp. 136–314.
10. C. Funk, M. Hefferon, B. Kennedy, C. Johnson, Trust and mistrust in Americans’ views of scientific experts. Pew Research Center, (2019). https://pewresearch.org/science/2019/08/02/trust-and-mistrust-in-americans-views-of-scientific-experts/. Accessed 17 March 2022.
11. L. E. Wallace, D. T. Wegener, R. E. Petty, When sources honestly provide their biased opinion: Bias as a distinct source perception with independent effects on credibility and persuasion. Pers. Soc. Psychol. Bull. 46, 439–453 (2020).
12. A. B. Diekman, E. K. Clark, A. M. Johnston, E. R. Brown, M. Steinberg, Malleability in communal goals and beliefs influences attraction to STEM careers: Evidence for a goal congruity perspective. J. Pers. Soc. Psychol. 101, 902–918 (2011).
13. T. M. Errington et al., Investigating the replicability of preclinical cancer biology. eLife 10, e71601 (2021).
14. B. A. Nosek et al., Replicability, robustness, and reproducibility in psychological science. Annu. Rev. Psychol. 73, 719–748 (2022).
15. E. Yong, Psychology’s replication crisis is running out of excuses. The Atlantic, 19 November 2018. https://www.theatlantic.com/science/archive/2018/11/psychologys-replication-crisis-real/576223/. Accessed 5 April 2022.
16. M. Heid, Opinion | Why experts can’t seem to agree on boosters. N. Y. Times, 13 April (2022). https://www.nytimes.com/2022/04/13/opinion/covid-booster-shot.html. Accessed 20 April 2022.
17. D. Flemming, I. Feinkohl, U. Cress, J. Kimmerle, Individual uncertainty and the uncertainty of science: The impact of perceived conflict and general self-efficacy on the perception of tentativeness and credibility of scientific information. Front. Psychol. 6, 1859 (2015).
18. J. Kennedy, Populist politics and vaccine hesitancy in Western Europe: An analysis of national-level data. Eur. J. Public Health 29, 512–516 (2019).
19. C. Lee, K. Whetten, S. Omer, W. Pan, D. Salmon, Hurdles to herd immunity: Distrust of government and vaccine refusal in the US, 2002–2003. Vaccine 34, 3972–3978 (2016).
20. E. Pechar, T. Bernauer, F. Mayer, Beyond political ideology: The impact of attitudes towards government and corporations on trust in science. Sci. Commun. 40, 291–313 (2018).
21. S. T. Fiske, C. Dupree, Gaining trust as well as respect in communicating to motivated audiences about science topics. Proc. Natl. Acad. Sci. U.S.A. 111, 13593–13597 (2014).
22. M. E. Barnes, J. M. Truong, D. Z. Grunspan, S. E. Brownell, Are scientists biased against Christians? Exploring real and perceived bias against Christians in academic biology. PLoS One 15, e0226826 (2020).
23. J. L. Duarte et al., Political diversity will improve social psychological science. Behav. Brain Sci. 38, e130 (2015).
24. A. Simpson, K. Rios, Is science for atheists? Perceived threat to religious cultural authority explains U.S. Christians’ distrust in secularized science. Public Underst. Sci. 28, 740–758 (2019).
25. S. Hilton, M. Petticrew, K. Hunt, Parents’ champions vs. vested interests: Who do parents believe about MMR? A qualitative study. BMC Public Health 7, 42 (2007).
26. D. Funke, Fact-check: Does Anthony Fauci have millions invested in a coronavirus vaccine? Austin American-Statesman, (2020). https://www.statesman.com/story/news/politics/elections/2020/04/16/fact-check-does-anthony-fauci-have-millions-invested-in-coronavirus-vaccine/984125007/. Accessed 28 February 2021.
27. I. M. Handley, E. R. Brown, C. A. Moss-Racusin, J. L. Smith, Quality of evidence revealing subtle gender biases in science is in the eye of the beholder. Proc. Natl. Acad. Sci. U.S.A. 112, 13201–13206 (2015).
28. W. Hart, K. Richardson, G. K. Tortoriello, A. Earl, “You Are What You Read”: Is selective exposure a way people tell us who they are? Br. J. Psychol. 111, 417–442 (2020).
29. M. A. Hogg, K. D. Williams, From I to we: Social identity and the collective self. Group Dyn. Theory Res. Pract. 4, 81–97 (2000).
30. S. Knobloch-Westerwick, M. R. Hastall, Please your self: Social identity effects on selective exposure to news about in- and out-groups. J. Commun. 60, 515–535 (2010).
31. H. A. Washington, Medical Apartheid: The Dark History of Medical Experimentation on Black Americans from Colonial Times to the Present (Doubleday, 2006).
32. R. C. Warren, L. Forrow, D. A. Hodge Sr., R. D. Truog, Trustworthiness before trust – Covid-19 vaccine trials and the Black community. N. Engl. J. Med. 383, e121 (2020).
33. P. Nauroth, M. Gollwitzer, J. Bender, T. Rothmund, Gamers against science: The case of the violent video games debate. Eur. J. Soc. Psychol. 44, 104–116 (2014).
34. D. M. Kahan, H. Jenkins-Smith, D. Braman, Cultural cognition of scientific consensus. J. Risk Res. 14, 147–174 (2011).
35. D. M. Kahan, D. Braman, P. Slovic, J. Gastil, G. Cohen, Cultural cognition of the risks and benefits of nanotechnology. Nat. Nanotechnol. 4, 87–90 (2009).
36. D. M. Kahan, Misconceptions, misinformation, and the logic of identity-protective cognition. SSRN [Preprint] (2017). https://doi.org/10.2139/ssrn.2973067. Accessed 17 March 2022.
37. D. M. Kahan, D. Braman, J. Gastil, P. Slovic, C. K. Mertz, Culture and identity-protective cognition: Explaining the white-male effect in risk perception. J. Empir. Leg. Stud. 4, 465–505 (2007).
38. A. Fasce, J. Adrián-Ventura, S. Lewandowsky, S. van der Linden, Science through a tribal lens: A group-based account of polarization over scientific facts. Group Process. Intergroup Relat., 10.1177/13684302211050323 (2021).
39. B. T. Rutjens, S. van der Linden, R. van der Lee, N. Zarzeczna, A group processes approach to antiscience beliefs and endorsement of “alternative facts.” Group Process. Intergroup Relat. 24, 513–517 (2021).
40. P. A. Offit, C. A. Moser, The problem with Dr Bob’s alternative vaccine schedule. Pediatrics 123, e164–e169 (2009).
41. I. McGregor, R. Haji, S.-J. Kang, Can ingroup affirmation relieve outgroup derogation? J. Exp. Soc. Psychol. 44, 1395–1401 (2008).
42. A.-M. Bliuc et al., Public division about climate change rooted in conflicting socio-political identities. Nat. Clim. Chang. 5, 226–229 (2015).
43. N. R. Branscombe, D. L. Wann, Collective self-esteem consequences of outgroup derogation when a valued social identity is on trial. Eur. J. Soc. Psychol. 24, 641–657 (1994).
44. M. J. Hornsey, A. Imani, Criticizing groups from the inside and the outside: An identity perspective on the intergroup sensitivity effect. Pers. Soc. Psychol. Bull. 30, 365–383 (2004).
45. B. Nogrady, “I hope you die”: How the COVID pandemic unleashed attacks on scientists. Nature 598, 250–253 (2021).
46. K. M. Douglas, COVID-19 conspiracy theories. Group Process. Intergroup Relat. 24, 270–275 (2021).
47. P. G. Auwaerter et al., Antiscience and ethical concerns associated with advocacy of Lyme disease. Lancet Infect. Dis. 11, 713–719 (2011).
48. M. Motta, T. Callaghan, S. Sylvester, K. Lunz-Trujillo, Identifying the prevalence, correlates, and policy consequences of anti-vaccine social identity. Polit. Groups Identities, 10.1080/21565503.2021.1932528 (2021).
49. P. Sturgis, N. Allum, Science in society: Re-evaluating the deficit model of public attitudes. Public Underst. Sci. 13, 55–74 (2004).
50. S. van der Linden, The Gateway Belief Model (GBM): A review and research agenda for communicating the scientific consensus on climate change. Curr. Opin. Psychol. 42, 7–12 (2021).
51. L. Festinger, A Theory of Cognitive Dissonance (Stanford University Press, 1957).
52. J. Cooper, Cognitive Dissonance: Fifty Years of a Classic Theory (SAGE, 2007).
53. J. Hannam, God’s Philosophers: How the Medieval World Laid the Foundations of Modern Science (Icon, 2009).
54. T. Lee, The global rise of “fake news” and the threat to democratic elections in the USA. Public Adm. Policy 22, 15–24 (2019).
55. S. Vosoughi, D. Roy, S. Aral, The spread of true and false news online. Science 359, 1146–1151 (2018).
56. G. Pennycook, D. G. Rand, The psychology of fake news. Trends Cogn. Sci. 25, 388–402 (2021).
57. R. Greifeneder, M. E. Jaffé, E. J. Newman, N. Schwarz, The Psychology of Fake News: Accepting, Sharing, and Correcting Misinformation (Routledge, 2021).
58. D. A. Scheufele, N. M. Krause, Science audiences, misinformation, and fake news. Proc. Natl. Acad. Sci. U.S.A. 116, 7662–7669 (2019).
59. U. K. Ecker, S. Lewandowsky, O. Fenton, K. Martin, Do people keep believing because they want to? Preexisting attitudes and the continued influence of misinformation. Mem. Cognit. 42, 292–304 (2014).
60. M. W. Susmann, D. T. Wegener, The role of discomfort in the continued influence effect of misinformation. Mem. Cognit. 50, 435–448 (2022).
61. P. R. Thagard, “Why astrology is a pseudoscience” in PSA: Proceedings of the Biennial Meeting of the Philosophy of Science Association, D. Hull, M. Forbes, R. M. Burian, Eds. (Philosophy of Science Association, 1978), vol. 1978, pp. 223–234.
62. S. H. Schwartz, “Universals in the content and structure of values: Theoretical advances and empirical tests in 20 countries” in Advances in Experimental Social Psychology, M. P. Zanna, Ed. (Academic, 1992), pp. 1–65.
63. L. J. Skitka, B. E. Hanson, G. S. Morgan, D. C. Wisneski, The psychology of moral conviction. Annu. Rev. Psychol. 72, 347–366 (2021).
64. M. d. Dibonaventura, G. B. Chapman, Do decision biases predict bad decisions? Omission bias, naturalness bias, and influenza vaccination. Med. Decis. Making 28, 532–539 (2008).
65. S. E. Scott, Y. Inbar, P. Rozin, Evidence for absolute moral opposition to genetically modified food in the United States. Perspect. Psychol. Sci. 11, 315–324 (2016).
66. A. Waytz, L. Young, Aversion to playing God and moral condemnation of technology and science. Philos. Trans. R. Soc. Lond. B Biol. Sci. 374, 20180041 (2019).
67. R. E. Petty, J. A. Krosnick, Eds., Attitude Strength: Antecedents and Consequences (Lawrence Erlbaum Associates, 1995).
68. A. Luttrell, V. Sawicki, Attitude strength: Distinguishing predictors versus defining features. Soc. Personal. Psychol. Compass 14, e12555 (2020).
69. A. Luttrell, R. E. Petty, P. Briñol, B. C. Wagner, Making it moral: Merely labeling an attitude as moral increases its strength. J. Exp. Soc. Psychol. 65, 82–93 (2016).
70. S. Lewandowsky, U. K. H. Ecker, C. M. Seifert, N. Schwarz, J. Cook, Misinformation and its correction: Continued influence and successful debiasing. Psychol. Sci. Public Interest 13, 106–131 (2012).
71. J. D. Teeny, J. J. Siev, P. Briñol, R. E. Petty, A review and conceptual framework for understanding personalized matching effects in persuasion. J. Consum. Psychol. 31, 382–414 (2021).
72. R. W. Reczek, R. Trudel, K. White, Focusing on the forest or the trees: How abstract versus concrete construal level predicts responses to eco-friendly products. J. Environ. Psychol. 57, 87–98 (2018).
73. Y. Trope, N. Liberman, Construal-level theory of psychological distance. Psychol. Rev. 117, 440–463 (2010).
74. K. Goldsmith, G. E. Newman, R. Dhar, Mental representation changes the evaluation of green product benefits. Nat. Clim. Chang. 6, 847–850 (2016).
75. J. Cesario, H. Grant, E. T. Higgins, Regulatory fit and persuasion: Transfer from “feeling right.” J. Pers. Soc. Psychol. 86, 388–404 (2004).
76. M. Bertolotti, P. Catellani, Effects of message framing in policy communication on climate change: Framing in communication on climate change. Eur. J. Soc. Psychol. 44, 474–486 (2014).
77. R. Ludolph, P. J. Schulz, Does regulatory fit lead to more effective health communication? A systematic review. Soc. Sci. Med. 128, 142–150 (2015).
78. D. M. Webster, A. W. Kruglanski, Individual differences in need for cognitive closure. J. Pers. Soc. Psychol. 67, 1049–1062 (1994).
79. X. Nan, K. Daily, Biased assimilation and need for closure: Examining the effects of mixed blogs on vaccine-related beliefs. J. Health Commun. 20, 462–471 (2015).
80. J. T. Cacioppo, R. E. Petty, K. J. Morris, Effects of need for cognition on message evaluation, recall, and persuasion. J. Pers. Soc. Psychol. 45, 805–818 (1983).
81. R. E. Petty, J. T. Cacioppo, “The elaboration likelihood model of persuasion” in Advances in Experimental Social Psychology, M. P. Zanna, Ed. (Academic, 1996), pp. 123–205.
82. A. Bhattacherjee, C. Sanford, Influence processes for information technology acceptance: An elaboration likelihood model. Manage. Inf. Syst. Q. 30, 805–825 (2006).
83. S. Winter, N. C. Krämer, Selecting science information in Web 2.0: How source cues, message sidedness, and need for cognition influence users’ exposure to blog posts. J. Comput. Mediat. Commun. 18, 80–96 (2012).
84. J. Kudrna, M. Shore, D. Wassenberg, Considering the role of “need for cognition” in students’ acceptance of climate change & evolution. Am. Biol. Teach. 77, 250–257 (2015).
85. C. Drummond, B. Fischhoff, Individuals with greater science literacy and education have more polarized beliefs on controversial science topics. Proc. Natl. Acad. Sci. U.S.A. 114, 9587–9592 (2017).
86. S. Lewandowsky, J. K. Woike, K. Oberauer, Genesis or evolution of gender differences? Worldview-based dilemmas in the processing of scientific information. J. Cogn. 3, 9 (2020).
87. D. B. Kirby, The impact of abstinence and comprehensive sex and STD/HIV education programs on adolescent sexual behavior. Sex. Res. Soc. Policy 5, 18–27 (2008).
88. L. C. Hamilton, J. Hartter, M. Lemcke-Stampone, D. W. Moore, T. G. Safford, Tracking public beliefs about anthropogenic climate change. PLoS One 10, e0138208 (2015).
89. S. Baker, Axios-Ipsos poll: More Americans want the vaccine. Axios, 12 January (2021). https://www.axios.com/2021/01/12/axios-ipsos-coronavirus-index-americans-want-vaccine. Accessed 28 February 2021.
90. S. Lewandowsky, K. Oberauer, Motivated rejection of science. Curr. Dir. Psychol. Sci. 25, 217–222 (2016).
91. K. D. Landreville, C. Niles, “And that’s a fact!”: The roles of political ideology, PSRs, and perceived source credibility in estimating factual content in partisan news. J. Broadcast. Electron. Media 63, 177–194 (2019).
92. A. M. McCright, K. Dentzman, M. Charters, T. Dietz, The influence of political ideology on trust in science. Environ. Res. Lett. 8, 044029 (2013).
93. E. C. Nisbet, K. E. Cooper, R. K. Garrett, The partisan brain: How dissonant science messages lead conservatives and liberals to (dis)trust science. Ann. Am. Acad. Pol. Soc. Sci. 658, 36–66 (2015).
94. N. J. Stroud, Polarization and partisan selective exposure. J. Commun. 60, 556–576 (2010).
95. C. S. Traberg, S. van der Linden, Birds of a feather are persuaded together: Perceived source credibility mediates the effect of political bias on misinformation susceptibility. Pers. Individ. Dif. 185, 111269 (2022).
96. K. S. Fielding, M. J. Hornsey, H. A. Thai, L. L. Toh, Using ingroup messengers and ingroup values to promote climate change policy. Clim. Change 158, 181–199 (2020).
97. E. Bakshy, S. Messing, L. A. Adamic, Exposure to ideologically diverse news and opinion on Facebook. Science 348, 1130–1132 (2015).
98. E. A. West, S. Iyengar, Partisanship as a social identity: Implications for polarization. Polit. Behav. 44, 807–838 (2020).
99. K. S. Fielding, M. J. Hornsey, A social identity analysis of climate change and environmental attitudes and behaviors: Insights and opportunities. Front. Psychol. 7, 121 (2016).
100. M. R. Hoffarth, G. Hodson, Green on the outside, red on the inside: Perceived environmentalist threat as a factor explaining political polarization of climate change. J. Environ. Psychol. 45, 40–49 (2016).
101. E. J. Finkel et al., Political sectarianism in America. Science 370, 533–536 (2020).
102. M. Del Vicario et al., The spreading of misinformation online. Proc. Natl. Acad. Sci. U.S.A. 113, 554–559 (2016).
103. T. E. Nelson, J. Garst, Values-based political messages and persuasion: Relationships among speaker, recipient, and evoked values. Polit. Psychol. 26, 489–516 (2005).
104. A. N. Washburn, L. J. Skitka, Science denial across the political divide: Liberals and conservatives are similarly motivated to deny attitude-inconsistent science. Soc. Psychol. Personal. Sci. 9, 972–980 (2018).
105. K. Toner, M. R. Leary, M. W. Asher, K. P. Jongman-Sereno, Feeling superior is a bipartisan issue: Extremity (not direction) of political views predicts perceived belief superiority. Psychol. Sci. 24, 2454–2462 (2013).
106. R. Janoff-Bulman, To provide or protect: Motivational bases of political liberalism and conservatism. Psychol. Inq. 20, 120–128 (2009).
107. A. Chirumbolo, The relationship between need for cognitive closure and political orientation: The mediating role of authoritarianism. Pers. Individ. Dif. 32, 603–610 (2002).
108. K. M. Douglas, R. M. Sutton, A. Cichocka, The psychology of conspiracy theories. Curr. Dir. Psychol. Sci. 26, 538–542 (2017).
109. L. R. Fabrigar, D. T. Wegener, R. E. Petty, A validity-based framework for understanding replication in psychology. Pers. Soc. Psychol. Rev. 24, 316–344 (2020).
110. T. M. Benson-Greenwald, A. Trujillo, A. D. White, A. B. Diekman, Science for others or the self? Presumed motives for science shape public trust in science. Pers. Soc. Psychol. Bull., 10.1177/01461672211064456 (2021).
111. C. Blue, Precision is the enemy of public understanding. APS Obs. 34, 73 (2021).
112. L. M. Kuehne, J. D. Olden, Opinion: Lay summaries needed to enhance science communication. Proc. Natl. Acad. Sci. U.S.A. 112, 3585–3586 (2015).
113. L. E. Wallace, D. T. Wegener, R. E. Petty, Influences of source bias that differ from source untrustworthiness: When flip-flopping is more and less surprising. J. Pers. Soc. Psychol. 118, 603–616 (2020).
114. M. A. Hussein, Z. L. Tormala, Undermining your case to enhance your impact: A framework for understanding the effects of acts of receptiveness in persuasion. Pers. Soc. Psychol. Rev. 25, 229–250 (2021).
115. M. Xu, R. E. Petty, Two-sided messages promote openness for morally based attitudes. Pers. Soc. Psychol. Bull., 10.1177/0146167220988371 (2021).
116. J. J. Van Bavel, D. J. Packer, The Power of Us: Harnessing Our Shared Identities to Improve Performance, Increase Cooperation, and Promote Social Harmony (Little, Brown Spark, ed. 1, 2021).
117. S. L. Gaertner, J. F. Dovidio, P. A. Anastasio, B. A. Bachman, M. C. Rust, The common ingroup identity model: Recategorization and the reduction of intergroup bias. Eur. Rev. Soc. Psychol. 4, 1–26 (1993).
118. T. Schultz, K. Fielding, The common in-group identity model enhances communication about recycled water. J. Environ. Psychol. 40, 296–305 (2014).
119. G. Corbie-Smith, S. B. Thomas, M. V. Williams, S. Moody-Ayers, Attitudes and beliefs of African Americans toward participation in medical research. J. Gen. Intern. Med. 14, 537–546 (1999).
120. E. Portacolone et al., Earning the trust of African American communities to increase representation in dementia research. Ethn. Dis. 30, 719–734 (2020).
121. A. Wilkinson, M. Parker, F. Martineau, M. Leach, Engaging communities: Anthropological insights from the West African Ebola epidemic. Philos. Trans. R. Soc. Lond. B Biol. Sci. 372, 20160305 (2017).
122. K. G. Claw et al.; Summer internship for INdigenous peoples in Genomics (SING) Consortium, A framework for enhancing ethical genomic research with Indigenous communities. Nat. Commun. 9, 2957 (2018).
123. S. M. Sidik, Weaving Indigenous knowledge into the scientific method. Nature 601, 285–287 (2022).
124. L. Wade, To overcome decades of mistrust, a workshop aims to train Indigenous researchers to be their own genome experts. Science, (2018). https://doi.org/10.1126/science.aav5286.
125. D. M. J. Lazer et al., The science of fake news. Science 359, 1094–1096 (2018).
126. T. C. O’Brien, R. Palmer, D. Albarracin, Misplaced trust: When trust in science fosters belief in pseudoscience and the benefits of critical evaluation. J. Exp. Soc. Psychol. 96, 104184 (2021).
127. W. J. McGuire, D. Papageorgis, The relative efficacy of various types of prior belief-defense in producing immunity against persuasion. J. Abnorm. Soc. Psychol. 62, 327–337 (1961).
128. M. Vivion et al., Prebunking messaging to inoculate against COVID-19 vaccine misinformation: An effective strategy for public health. J. Commun. Healthc., 10.1080/17538068.2022.2044606 (2022).
129. T. Bolsen, J. N. Druckman, F. L. Cook, Citizens’, scientists’, and policy advisors’ beliefs about global warming. Ann. Am. Acad. Pol. Soc. Sci. 658, 271–295 (2015).
130. B. M. Tappin, G. Pennycook, D. G. Rand, Rethinking the link between cognitive sophistication and politically motivated reasoning. J. Exp. Psychol. Gen. 150, 1095–1114 (2021).
131. N. Miller, D. T. Campbell, Recency and primacy in persuasion as a function of the timing of speeches and measurements. J. Abnorm. Psychol. 59, 1–9 (1959).
132. R. E. Petty, D. W. Schumann, S. A. Richman, A. J. Strathman, Positive mood and persuasion: Different roles for affect under high- and low-elaboration conditions. J. Pers. Soc. Psychol. 64, 5–20 (1993).
133. B. Nerlich, N. Koteyko, B. Brown, Theory and language of climate change communication. Wiley Interdiscip. Rev. Clim. Change 1, 97–110 (2010).
134. C. M. Angst, R. Agarwal, Adoption of electronic health records in the presence of privacy concerns: The elaboration likelihood model and individual persuasion. Manage. Inf. Syst. Q. 33, 339–370 (2009).
135. C. M. Steele, T. J. Liu, Dissonance processes as self-affirmation. J. Pers. Soc. Psychol. 45, 5–19 (1983).
136. T. Epton, P. R. Harris, Self-affirmation promotes health behavior change. Health Psychol. 27, 746–752 (2008).
137. P. Sparks, D. C. Jessop, J. Chapman, K. Holmes, Pro-environmental actions, climate change, and defensiveness: Do self-affirmations make a difference to people’s motives and beliefs about making a difference? Br. J. Soc. Psychol. 49, 553–568 (2010).
138. M. Feinberg, R. Willer, Moral reframing: A technique for effective and persuasive communication across political divides. Soc. Personal. Psychol. Compass 13, e12501 (2019).
139. A. Luttrell, R. E. Petty, Evaluations of self-focused versus other-focused arguments for social distancing: An extension of moral matching effects. Soc. Psychol. Personal. Sci. 12, 946–954 (2021).
140. A. Corner, N. Pidgeon, Like artificial trees? The effect of framing by natural analogy on public perceptions of geoengineering. Clim. Change 130, 425–438 (2015).
141. C. A. Summers, R. W. Smith, R. W. Reczek, An audience of one: Behaviorally targeted ads as implied social labels. J. Consum. Res. 43, 156–178 (2016).
142. J. Hansen, M. Wänke, Truth from language and truth from fit: The impact of linguistic concreteness and level of construal on subjective truth. Pers. Soc. Psychol. Bull. 36, 1576–1588 (2010).