Making the truth stick & the myths fade:
Lessons from cognitive psychology
Norbert Schwarz, Eryn Newman, & William Leach

Schwarz, N., Newman, E., & Leach, W. (2016). Making the truth stick & the myths fade: Lessons from cognitive psychology. Behavioral Science & Policy, 2(1), 85–95.
Summary. Erroneous beliefs are dicult to correct. Worse, popular
correction strategies, such as the myth-versus-fact article format, may
backfire because they subtly reinforce the myths through repetition and
further increase the spread and acceptance of misinformation. Here we
identify five key criteria people employ as they evaluate the truth of a
statement: They assess general acceptance by others, gauge the amount
of supporting evidence, determine its compatibility with their beliefs, assess
the general coherence of the statement, and judge the credibility of the
source of the information. In assessing these five criteria, people can actively
seek additional information (an eortful analytic strategy) or attend to the
subjective experience of easy mental processing—what psychologists call
fluent processing—and simply draw conclusions on the basis of what feels
right (a less eortful intuitive strategy). Throughout this truth-evaluation
eort, fluent processing can facilitate acceptance of the statement: When
thoughts flow smoothly, people nod along. Unfortunately, many correction
strategies inadvertently make the false information more easily acceptable
by, for example, repeating it or illustrating it with anecdotes and pictures.
This, ironically, increases the likelihood that the false information the
communicator wanted to debunk will be believed later. A more promising
correction strategy is to focus on making the true information as easy to
process as possible. We review recent research and oer recommendations
for more eective presentation and correction strategies.
Back in 2000, flesh-eating bananas were on the loose and wreaking havoc, according to trending Internet reports. The story claimed that exported bananas contained necrotizing bacteria that could infect consumers after they had eaten the fruit. It was a hoax, but one with such legs of believability that the Centers for Disease Control and Prevention (CDC) set up a hotline to counter the misinformation and assure concerned fruit lovers that bananas were perfectly safe. The Los Angeles Times even ran an article explaining the
origin of the myth, noting that the hoax gained traction
because a secretary from the University of California,
Riverside’s agricultural college forwarded the story to
friends in an e-mail, seemingly giving it the imprimatur
of the college. Paradoxically, the eorts by the CDC
and the Los Angeles Times to dispel the myth actually
increased some people’s acceptance of it, presumably
because these trustworthy sources had taken the time
and eort to address the “problem.” These correc-
tions likely made the myth more familiar and prob-
ably helped the myth and its variants to persist for the
entiredecade.1
No one doubts that the Internet can spread misinfor-
mation, but when such falsehoods go beyond banana
hoaxes and into the health care realm, they have the
potential to do serious harm. For example, websites
abound that mischaracterize the scientific evidence and
misstate the safety of vaccines, such as that they cause
infection that can be passed on;2 that falsely claim a
certain kind of diet can beat back cancer, such as claims
that drinking red wine can prevent breast cancer;3 and
that overstate preliminary associations between certain
foods and healthful outcomes, such as that eating
grapefruit burns fat.4 These erroneous statements can
cause people to modify their behaviors—perhaps in a
detrimental fashion—aecting what they eat and how
they seek medical care.
The persistence of the necrotizing banana myth
shows that correcting false beliefs is dicult and that
correction attempts often fail because addressing
misinformation actually gives it more airtime, increasing
its familiarity and making it seem even more believable.5
For instance, one of the most frequently used correc-
tion strategies, the myth-versus-fact format, can back-
fire because of repetition of the myth, leaving people
all the more convinced that their erroneous beliefs are
correct.6 The simple repetition of a falsehood, even by a
questionable source, can lead people to actually believe
the lie. The psychological research showing how people
determine whether something is likely to be true has
important implications for health communication strat-
egies and can help point to more ecient approaches
to disseminating well-established truths in general.
Overall, behavioral research shows that often the best
strategy in the fight against misinformation is to paint a
vivid and easily understood summation of the truthful
message one wishes to impart instead of drawing
further attention to false information.
The Big Five Questions We Ask to Evaluate Truth
When people encounter a claim, they tend to evaluate
its truth by focusing on a limited number of criteria.7
Most of the time, they ask themselves at least one of five
questions (see Table 1).

Table 1. Five criteria people use for judging truth

Criteria | Analytic evaluation | Intuitive evaluation
Social consensus: Do others believe it? | Search databases, look for supporting statistics, or poll a group or audience. | Does it feel familiar?
Support: Is there much supporting evidence? | Look for corroborating evidence in peer-reviewed scientific articles or news reports, or use one’s own memory. | Is the evidence easy to generate or recall?
Consistency: Is it compatible with what I believe? | Recall one’s own general knowledge and assess the match or mismatch with new information. | Does it make me stumble? Is it difficult to process, or does it feel right?
Coherence: Does it tell a good story? | Do the elements of the story logically fit together? | Does the story flow smoothly?
Credibility: Does it come from a credible source? | Is the source an expert? Does the source have a competing interest? | Does this source seem familiar and trustworthy?
1. Social Consensus: Do Others Believe It?
In 1954, the American social psychologist Leon Fest-
inger theorized that when the truth is unclear, people
often turn to social consensus as a gauge for what is
likely to be correct.8 After all, if many people believe
a claim, then there is probably something to it. A fun
example of this is played out on the popular TV show
Who Wants to Be a Millionaire? where, when stumped
for the correct answer to a question, the contestant may
poll the audience to see if there is a consensus answer.
Overall, people are more confident in their beliefs
if others share them,9,10 trust their memories more
if others remember an event the same way,11,12 and
are more inclined to believe scientific theories if a
consensus among scientists exists.13
To verify a statement’s social consensus, people
may turn to opinion polls, databases, or other external
resources. Alternatively, they may simply ask themselves
how often they have heard this belief. Chances are that
a person is more frequently exposed to widely shared
beliefs than to beliefs that are held by few others, so
frequency of exposure should be a good gauge for a
belief’s popularity. Unfortunately, people are bad at
tracking how often they have heard something and
from whom; instead, people rely on whether a message
feels familiar. This reliance gives small but vocal groups
a great advantage: The more often they repeat their
message, the more familiar it feels, leaving the impres-
sion that many people share the opinion.
For example, Kimberlee Weaver of Virginia Poly-
technic Institute and her colleagues showed study
participants a group discussion regarding public space.14
The discussion presented the opinion that open spaces
are desirable because they provide the community with
opportunities for outdoor recreation. Participants heard
the opinion either once or thrice, with a crucial differ-
ence: In one condition, three different people offered
the opinion, whereas in the other condition, the same
person repeated the opinion three times. Not surpris-
ingly, participants thought that the opinion had broader
support when three speakers oered it than when only
one speaker did. But hearing the same statement three
times from the same person was almost as influential
as hearing it from three separate speakers, showing that
a single repetitive voice can sound like a chorus.14,15
These findings also suggest that the frequent repetition
of the same sound bite in TV news or ads may give the
message a familiarity that makes viewers overestimate
its popularity. This is also the case on social media,
where the same message keeps showing up as friends
and friends of friends like it and repost it, resulting in
many exposures within a network.
2. Support: Is There Much Evidence to Substantiate It?
When a large body of evidence supports a position,
people are likely to trust it and believe that it is true.
They can find this evidence through a deliberate search
by looking for evidence in peer-reviewed scientific
articles, reading substantiated news reports, or even
combing their own memories. But people can also take
a less taxing, speedier approach by making a judgment
on the basis of how easy it is to retrieve or obtain some
pieces of evidence. After all, the more evidence exists,
the easier it should be to think of some. Indeed, when
recalling evidence feels dicult, people conclude that
there is less of it, regardless of how much information
they actually remember. In one 1993 study,16 Fritz Strack
and Sabine Stepper, then of the University of Mannheim
in Germany, asked participants to recall five instances in
which they behaved very assertively. To induce a feeling
of diculty, some were asked to furrow their eyebrows,
an expression often associated with dicult tasks. When
later asked how assertive they are, those who had to
furrow their eyebrows judged themselves to be less
assertive than did those who did not have to furrow their
brows. Even though both groups recalled five examples
of their own assertive behavior, they arrived at different
conclusions when recall felt difficult.
In fact, the feeling of diculty can even override
the implications of coming up with a larger number of
examples. In another study,17 participants recalled just a
few or many examples of their own assertive behavior.
Whereas participants reported that recalling a few
examples was easy, they reported that recalling many
examples was dicult. As a result, those who remem-
bered more examples of their own assertiveness subse-
quently judged themselves to be less assertive than did
those who had to recall only a few examples. The dif-
ficulty of bringing many examples to mind undermined
the examples’ influence.
These findings have important implications for
correction strategies. From a rational perspective,
thinking of many examples or arguments should be
more persuasive than thinking of only a few. Hence,
correction strategies often encourage people to think
of reasons why an erroneous or potentially erro-
neous belief may not hold.18 But the more people try
to do so, the harder it feels, leaving them all the more
convinced that their belief is correct.6 For example, in
a study described in an article published in the Journal
of Experimental Psychology: Learning, Memory, and
Cognition, participants read a short description of a
historic battle in Nepal.19 Some read that the British army
won the battle, and others read that the Nepal Gurkhas
won the battle. Next, they had to think about how the
battle could have resulted in a dierent outcome. Some
had to list only two reasons for a dierent outcome,
whereas others had to list 10. Although participants in
the latter group came up with many more reasons than
did those in the former group for why the battle could
have had a dierent result, they nevertheless thought
that an alternative outcome was less likely. Such findings
illustrate why people are unlikely to believe evidence
that they find dicult to retrieve or generate: A couple
of arguments that readily pop into the head are more
compelling than many arguments that were hard to
think of. As a result, simple and memorable claims have
an advantage over considerations of a more compli-
cated notion or reality.
3. Consistency: Is It Compatible with What I Believe?
People are inclined to believe things that are consis-
tent with their own beliefs and knowledge.20 –22 One
obvious way to assess belief consistency would be to
recall general knowledge and assess its match with new
information. For example, if you heard someone claim
that vaccinations cause autism, you may check that
claim against what you already know about vaccina-
tions. But again, reliance on one’s feelings while thinking
about the new information provides an easier route to
assessing consistency. When something is inconsistent
with existing beliefs, people tend to stumble—they take
longer to read it and have trouble processing it.23–25
Moreover, information that is inconsistent with one’s
beliefs produces a negative aective response, as shown
in research on cognitive consistency since the 1950s.26,27
Either of these experiences can signal that something
does not feel right, which may prompt more critical
thought and analysis.
In contrast, when the new information matches
one’s beliefs, processing is easy, and people tend to
nod along. As an example, suppose you are asked,
“How many animals of each kind did Moses take on the
ark?” Most people answer “two” despite knowing that
the biblical actor was Noah, not Moses28—the biblically
themed question feels familiar, and people focus on
what they are asked about (how many?) rather than
the background details (who). But when the question
is printed in a dicult-to-read font that impedes easy
processing, the words do not flow as smoothly. Now
something seems to feel wrong, and more people
notice the error embedded in the question.29
4. Coherence: Does It Tell a Good Story?
When details are presented as part of a narrative and
its individual elements fit together in a coherent frame,
people are more likely to think the account is true.30,31 For instance,
in a 1992 article about juror decision making, Nancy
Pennington and Reid Hastie of the University of Colo-
rado described experiments in which they asked
volunteers to render verdicts after reading transcripts
of cases consisting of several witness statements. The
researchers varied the way information was presented:
Either evidence was blocked so that all of the evidence
(across several witnesses) regarding motive appeared
as a summary, or it was presented more like a story, as
witness narratives. The researchers found that people
tended to believe the witnesses more when the same
evidence was presented in the format of a coherentstory.
In fact, when asked to remember a story, people often
remember it in ways that make it more coherent, even
filling in gaps and changing elements.32 Maryanne Garry
of Victoria University of Wellington in New Zealand and her
colleagues had volunteers watch a video of a woman
making a sandwich. Although participants probably
thought they saw the whole video, certain parts of the
sandwich-making process were not shown. In a later
memory test, participants confidently but falsely remem-
bered events they had never witnessed in thevideo.
When a story feels coherent, people think that it
makes more sense, and they enjoy reading it more.33,34
Coherent stories flow more smoothly and are easier to
process than incoherent stories with internal contra-
dictions are.30 There are several ways to increase the
chances that readers will feel as though they are reading
a coherent story. For example, in one line of studies,
Jonathan Leavitt and Nicholas Christenfeld of the
University of California, San Diego, gave some partici-
pants summary information that enabled them to antic-
ipate a story’s ending before they began to read it. After
reading, those who had the extra information said they
enjoyed the story more—having some prior context lent
the story more coherence and made it easier to follow.
5. Credibility: Does It Come from a Credible Source?
Not surprisingly, people are more likely to accept infor-
mation from a credible source than from a less credible
one.35,36 People evaluate the credibility of a source in
many ways, such as by looking at the source’s expertise,
past statements, and likely motives. Alternatively, people
can again consult their feelings about the source. When
they do so, the apparent familiarity of the source looms
large. Repeatedly seeing a face is enough to increase
perceptions of honesty, sincerity, and general agree-
ment with what that person says.37,38 Even the ease of
pronouncing the speaker’s name influences credibility:
When a person’s name is easy to say, people are more
likely to believe what they hear from the person.39 Thus,
a source can seem credible simply because the person
feels familiar.
An exception to this rule is when people realize that
the person seems familiar for a bad reason. For example,
although the name Adolf Hitler is familiar and easy to
pronounce, it does not lend credibility. Similarly, famil-
iarity is unlikely to enhance the credibility of a source
that is closely identified with a view that one strongly
opposes, as might happen if the source is a politi-
cian from an opposing party. (See the sidebar Political
Messages from the Other Side.) In these cases, familiarity
with the source comes with additional information that
serves as a warning signal and prompts closer scrutiny.

Political Messages from the Other Side

Messages from the other side of a political debate rarely change partisan minds. The five truth tests discussed in the main text shed some light on why. To begin with, a message from a political opponent comes from a source that one has already identified as being associated with other interests, thus limiting its credibility. Moreover, its content is likely to be at odds with several of one’s beliefs. Accordingly, thinking of many arguments that support a message from the other side is difficult, but coming up with many counterarguments is easy. In addition, opposing beliefs interfere with the processing of the information, so arguments will not seem to flow smoothly. This limits the perceived coherence of the message—it is just not a good story. Finally, one’s own social network is unlikely to agree with other-side messages, thus limiting perceived social consensus as well.

As a result, messages that contradict a person’s worldview and advocate opposing positions are unlikely to feel true and compelling to that person. This effect is not just evidence of partisan stubbornness but is inherent in how people gauge truth: The dominant truth criteria inherently place beliefs of the other side at a disadvantage.

However, the other side’s messages may gain in acceptance as time passes. For example, election campaigns expose all citizens to messages that are closely linked to partisan sources. Yet, as time goes by, the specific source will be forgotten, but the message may feel fluent and familiar when it is encountered after the campaign is over. That is, although one may reject a message from the other side at first, the message itself may seem more plausible later on, when the original source cannot be remembered. At that point, it may receive less scrutiny, and people may nod along because of the fluency resulting from previous encounters.
A source also seems more credible when the
message is easy to process. For example, people are
more likely to believe statements when they are made
in a familiar and easy-to-understand accent rather
than a dicult-to-understand one. In a 2010 study, for
instance, Shiri Lev-Ari and Boaz Keysar of the University
of Chicago asked native speakers of American English to
rate the veracity of trivia statements (such as “A giraffe
can go longer without water than a camel can”). Volun-
teers rated statements recited by native English speakers
as more truthful than statements recited by speakers of
accented English (whose native tongues included Polish,
Turkish, Italian, and Korean).40
Summary of Truth Evaluation
Regardless of which truth criteria people draw on, easily
processed information enjoys an advantage over infor-
mation that is dicult to process: It feels more familiar,
widely held, internally consistent, compatible with one’s
beliefs, and likely to have come from a credible source.
In short, easy processing gives people an intuitive feeling
of believability and helps a message pass the Big Five truth
criteria outlined above.7 Put simply, when thoughts
flow smoothly, people tend to accept them without
analyzing them too closely.
Alternatively, information that is dicult to process,
feels unfamiliar, and makes people stumble is more
likely to trigger critical analysis. When something feels
wrong, people pay closer attention, look for more rele-
vant information, and are willing to invest more eort
into figuring out what is likely to be true. People are
also more likely to notice misleading questions and to
critically examine their own beliefs.7,29,41 If their crit-
ical analysis reveals something faulty, they will reject
the message. But if the arguments hold up to scrutiny,
a message that initially felt wrong may end up being
persuasive. Nevertheless, in most cases, recipients
will conclude that a message that feels wrong is not
compelling. After all, at first glance, it did not meet the
Big Five truth criteria discussed above.
Repeating False Information: A Bad Idea
The reviewed research sheds light on why some correc-
tion strategies may unintentionally cement the ideas
they are trying to correct: When a correction attempt
increases the ease with which the false claim can be
processed, it also increases the odds that the false
claimfeels true when it is encountered again at a later
point in time.
Repetition Increases Acceptance
The popular strategy of juxtaposing myths and facts
necessarily involves a repetition of the false claims
(or myths) in order to confront them with the facts. A
growing number of studies show that this strategy can
have unintended consequences: increasing the accep-
tance of false beliefs, spreading them to new segments
of the population, and creating the perception that
the false beliefs are widely shared. For example, in a
2005 study,42 Ian Skurnik of the University of Toronto
and his colleagues had participants view health-related
statements. They told them which ones were true
and which were false. When participants were tested
immediately, they were able to recall this information
from memory and could distinguish fact from fiction.
But 3 days later, after their memories had a chance to
fade, participants were more likely to think that any
statement they had seen was true, whether it had been
presented as true or false. Moreover, the acceptance
of false statements increased with the number of
warnings: Participants who had been told thrice that
a statement was false were more likely to accept it as
true than were those who had only been told once.
Older participants were particularly vulnerable to this
bias, presumably because their poorer memory made
it harder to remember the details of what they had
heardearlier.
Fluency: When It Is Easy, It Seems Familiar, and Familiar Feels True
Any mental act, from reading and hearing to remembering and evaluating, can feel easy or difficult. Material that is easy
to process feels fluent, in contrast to material that is difficult to process, which may make the reader stumble. People are
sensitive to these feelings but not to where they come from. For example, familiar material is easier to read than unfamiliar
material is, but not everything that is easy to read is also familiar.
Many things can influence the feeling of fluency. Influences include presentation characteristics, such as print font, color
contrast, or a speaker’s accent, and content characteristics, such as the complexity and flow of an argument. They also include
the receiver’s expertise and history with the material, such as how often one has seen it before and how long ago one saw it.
When any of these factors make processing easy, they increase the likelihood that a message is accepted as true. Hence,
people are more likely to consider a statement true when it is presented, for example, in high color contrast, in a simpler
font, or in rhyming form.A,B
More likely to be judged true: | Less likely to be judged true:
Osorno is a city in Chile [high color contrast] | Osorno is a city in Chile [low color contrast]
Osorno is a city in Chile [simple font] | Osorno is a city in Chile [difficult-to-read font]
Woes unite foes | Woes unite enemies
A. Reber, R., & Schwarz, N. (1999). Effects of perceptual fluency on judgments of truth. Consciousness and Cognition, 8, 338–342.
B. McGlone, M. S., & Tofighbakhsh, J. (2000). Birds of a feather flock conjointly (?): Rhyme as reason in aphorisms. Psychological Science, 11,
424–428.
Startlingly, it takes neither 3 days nor old age for such
a paradoxical eect to occur. When undergraduates
viewed a myths-and-facts flyer about the flu taken from
the CDC website, they remembered some myths as
facts after only 30 minutes.6 Moreover, despite the flyer’s
promotion of the flu vaccine for their age group, partic-
ipants who had read the myths-and-facts flyer reported
lower intentions to get a flu vaccination than did partic-
ipants who read only the facts. Worse, their reported
intentions to get vaccinated were even lower than those
of control participants who had not been exposed to
any message about the flu. Apparently, realizing there
might be some controversy about the issue was su-
cient to undermine healthy intentions.
Repetition Spreads Misinformation to New Audiences
Myths typically take root in a small segment of the
population, yet sometimes a myth breaks free and
spreads to larger audiences. Ironically, the cause of the
spread may be education campaigns. Although one
may hope that the clear juxtaposition of myth and fact
teaches the new audience what’s right and wrong and
inoculates them against later misinformation, this is not
always the case. Instead, a well-intentioned information
campaign may have the unfortunate eect of spreading
false beliefs to a broader population.
The flesh-eating bananas rumor is an example. It
moved from the fringes of the Internet to mainstream
media after the CDC published its correction, which
was picked up by the Los Angeles Times. After a while,
people misremembered the sources of the correction
as the sources of the false information itself, resulting
in the impression that flesh-eating bananas are a real
problem.43 This retrospective attribution of a myth to a
more credible source goes beyond the more common
observation that messages initially seen as unconvincing
because they come from an untrustworthy source can
exert an influence later on, once their source is forgotten
(a phenomenon known as the sleeper eect).44,45
Myth-Busting Can Convey Controversy
The popular myth-versus-fact formats also convey the
impression that a significant number of people hold a
dierent position or positions on an issue, or else there
would be no reason to juxtapose myths and facts. So
although the myth-versus-fact format may increase
readership and engagement, it also can make a topic
seem controversial and render the truth unclear. It tells
people that either side could be right and can make a
vocal minority seem larger than it is. People with limited
expertise in an area are therefore likely to defer judg-
ment and hesitate to take sides. This is particularly likely
in scientific controversies, where the facts are dicult
for the public to evaluate, as is the case with certain
dietary approaches or health treatments4 as well as for
climate change.13,46 The strategy of emphasizing contro-
versy to engage readers is problematic when the actual
facts have been well demonstrated, because it under-
mines the credibility of the facts and facilitates overesti-
mates of the disagreement.
Anecdotes and Photographs Reinforce the Message
Anecdotes and photos serve several communicative
goals—they capture attention, boost comprehension,
and enhance the readability of associated text.47–49 This
makes the content easier to imagine, which can artifi-
cially boost its perceived truth.50
Anecdotes promote understanding because they
link new information with prior knowledge and evoke
vivid pictures in people’s minds. For these reasons, they
can have powerful eects on people’s beliefs, leading
them to ignore available statistics and scientific facts
and use feelings and intuition as measures by which to
evaluate information. In 2005, Angela Fagerlin, now at
the University of Michigan, and her colleagues asked
study volunteers to read a scenario about angina and
to choose between bypass surgery and balloon angio-
plasty. They tended to overlook statistical data about the
cure rates and instead chose the option that included
anecdotes of those who underwent that procedure.51
Photos can produce similar eects, even when
they have no probative value for the claim with which
they are paired. In one experiment conducted by Eryn
Newman of the University of Southern California and
colleagues,50 participants in New Zealand were shown
a picture of Nick Cave, a musician with the Australian
band the Bad Seeds. When the photo accompanied the
claim “Nick Cave is alive,” people were more likely to
agree that he is, indeed, alive than when no photo was
presented. But the same photo also made people more
likely to think that Nick Cave is dead when the photo
accompanied the claim “Nick Cave is dead.” (For the
record, Nick Cave is alive as of this writing.)
Other more superficial communication approaches
can produce similar eects. For example, rhyme
can enhance memory for material by serving as a
mnemonic device. But rhyme can also enhance the
credibility of a message, even if it does not add any
supporting evidence, by making words flow smoothly. In
2000, Mathew McGlone and Jessica Tofighbakhsh, then
of Lafayette College, asked study participants to eval-
uate sayings about human behavior and rate the truth
of each saying. When the sayings rhymed (for example,
“Woes unite foes”), people were more likely to think they
were true representations of human conduct than when
the sayings did not (“Woes unite enemies”).52
In sum, anecdotes, pictures, and rhymes that contain
little informational value are usually oered to engage
readers. But they can nevertheless influence outcomes
because they scaold mental imagery, increase the ease
with which a message is processed, produce a feeling of
remembering, and systematically bias people to believe
information whether it is true or false. For that reason,
these communication devices can thwart the intended
educational eect when they are presented with false
information; we therefore discourage their use when
written content contains myths or retractions.
Key Communication Strategies for
Making Truths Stick and Myths Fade
So how can one correct false beliefs and increase public
knowledge without propagating misinformation? The
available research indicates that information is more
likely to stick the more easily it can be processed and
the more familiar it feels. Accordingly, the overarching
goal for any communication strategy is to increase the
fluency and familiarity of correct information and to
decrease the fluency and familiarity of misinformation.
Attempts at correcting misinformation—for example,
using the myth-versus-fact setup—often fail because
they center on the false information and unintention-
ally increase the ease with which false information can
be processed when it is encountered again. Increasing
the fluency and familiarity of true information can be
achieved in three key ways.
The first way is through repetition—specifically,
repetition of the correct information, not the misinfor-
mation one wants to undermine. For this reason, it is
usually better to ignore false information than to repeat
it. The popular myth-versus-fact format unwittingly
reinforces the myths by repeating them, which makes
them more influential once memory for the less familiar
(and often more complex) facts fades. Focus rather on
the facts, making them easy to understand and easy
to remember. Instead of repeating various vaccina-
tion myths, for example, a more eective strategy is to
document why vaccinations are safe and to emphasize
the scientific evidence that vaccines promote health
and not harm.

Photographs and Truthiness

Messages or claims that appear with photos catch the eye and generally are more easily understood and remembered. But adding a photo to claims can also add authority: People are more likely to think claims are true when they appear with a photo. Photos have this influence even when they provide no probative evidence about whether the claim is correct. For instance, people are more likely to believe the claim “Magnesium is the liquid metal inside a thermometer” when they see a photo of a thermometer, even one that provides no information regarding what metal can be found inside. (Most household glass thermometers use alcohol with red dye.) One reason why photos bring about this truthiness effect is that they make it easy for the reader to understand and imagine the claim. As a result, the claim feels fluent, familiar, and true.

Want to convince people that Nick Cave is dead or Nick Cave is alive? Easy. Just add his picture to either claim and voila! People believe. (For more information on the experiment that investigated this scenario, see “Nonprobative Photographs (or Words) Inflate Truthiness,” by E. J. Newman, M. Garry, D. M. Bernstein, J. Kantner, and D. S. Lindsay, 2012, Psychonomic Bulletin & Review, 19, 969–974.)
Sometimes there are legal requirements to repeat
false information in the context of a correction. In
such cases, it is important to provide a fluent and
coherent account of why the false information was
presented to begin with. Consider the myth that autism
is caused by childhood vaccines. A straightforward,
easy-to-comprehend account of how the discovery of
an alleged autism–vaccine link was completely made
up and based on fraudulent data that cost the principal
author his professional license will be more eective
in addressing the misinformation than simply labeling
the original myth discredited, as many news outlets
routinely do.
Second, true information needs to be made as acces-
sible as possible. Unfortunately, the truth is often more
complicated than the myth, which usually involves
considerable simplification. This puts the truth at a
disadvantage because it is harder to process, under-
stand, and remember. Presenting true information
in ways that make processing it as easy as possible is
therefore important. This requires clear, step-by-step
exposition and the avoidance of jargon. Other more
cosmetic changes can also make the truth easily digest-
ible—choosing an easy-to-read font and ensuring the
speaker’s pronunciation is easy to understand can
increase the fluency of a message. It also helps when
the true information is accompanied by pictures that
make the information easy to imagine or when key parts
of the repeated message rhyme.
Finally, at the individual level, one of the most
powerful strategies for avoiding misinformation is to
know it is coming.5 In one study, Stephan Lewandowsky
of the University of Bristol and his colleagues asked
participants to read a short description about a bus
accident. After reading the passage, participants were
told that some of the information was wrong. Despite
the retractions, many participants held on to the inaccu-
rate details that they learned from the initial description
of the bus accident. That is, once the story was told, it
was dicult to cleave out inaccuracies.
Two strategies can eectively prevent such miscon-
ceptions. One is to provide accurate details that present
an alternative account of the misinformation, increasing
the chances of people remembering the true informa-
tion and allowing the false details to fade away. The
second is to warn people before they read the passage
about the influence of misinformation. Pre-exposure
warnings can alert people to carefully scrutinize the
content of information and ward off false details.53–56
Although research shows that warnings are more
ecient when they are received prior to the false infor-
mation, this is not where they are commonly placed.
In the health domain, the law requires that labels
claiming unsubstantiated health benefits must include
a disclaimer: “This product is not intended to diagnose,
treat, cure, or prevent any disease.”57 Such disclaimers
commonly follow the unsubstantiated claims. Moving
them to the top of a label or the beginning of a radio
advertisement is likely to enhance their impact.
In sum, the available research shows that highlighting
false information and then attempting to unwind its
eects is usually a bad idea. More promising communi-
cation strategies focus on the truth, making it easier to
process and more handily remembered, which increases
the chance that the correct message sticks.
author aliation
Schwarz, Department of Psychology, University of
Southern California; Newman, Dornsife Mind and
Society Center, University of Southern California; Leach,
Sol Price School of Public Policy, University of Southern
California. Corresponding author’s e-mail: norbert.schwarz@usc.edu
References
1. Fragale, A. R., & Heath, C. (2004). Evolving informational
credentials: The (mis)attribution of believable facts to credible
sources. Personality and Social Psychology Bulletin, 30,
225–236.
2. Kata, A. (2010). A postmodern Pandora’s box: Antivaccination
misinformation on the Internet. Vaccine, 28, 1709–1716.
3. Goldacre, B. (2009). Media misinformation and health
behaviours. Lancet Oncology, 10, 848.
4. Ayoob, K. T., Duy, R. L., & Quagliani, D. (2002). Position
of the American Dietetic Association: Food and nutrition
misinformation. Journal of the American Dietetic Association,
102, 260–266.
5. Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., &
Cook, J. (2012). Misinformation and its correction: Continued
influence and successful debiasing. Psychological Science in
the Public Interest, 13(3), 106–131.
6. Schwarz, N., Sanna, L. J., Skurnik, I., & Yoon, C. (2007).
Metacognitive experiences and the intricacies of setting people
straight: Implications for debiasing and public information
campaigns. Advances in Experimental Social Psychology, 39,
127–161.
7. Schwarz, N. (2015). Metacognition. In M. Mikulincer, P. R.
Shaver, E. Borgida, & J. A. Bargh (Eds.), APA handbook of
personality and social psychology: Attitudes and social
cognition (Vol. 1, pp. 203–229). Washington, DC: American
Psychological Association.
8. Festinger, L. (1954). A theory of social comparison processes.
Human Relations, 7, 117–140.
9. Newcomb, T. M. (1943). Personality and social change: Attitude
formation in a student community. Fort Worth, TX: Dryden
Press.
10. Visser, P. S., & Mirabile, R. R. (2004). Attitudes in the social
context: The impact of social network composition on
individual-level attitude strength. Journal of Personality and
Social Psychology, 87, 779–795.
11. Harris, A. J. L., & Hahn, U. (2009). Bayesian rationality in
evaluating multiple testimonies: Incorporating the role of
coherence. Journal of Experimental Psychology: Learning,
Memory, and Cognition, 35, 1366–1372.
12. Ross, M., Buehler, R., & Karr, J. W. (1998). Assessing the
accuracy of conflicting autobiographical memories. Memory &
Cognition, 26, 1233–1244.
13. Lewandowsky, S., Gignac, G. E., & Vaughan, S. (2013). The
pivotal role of perceived scientific consensus in acceptance of
science. Nature Climate Change, 3, 399–404.
14. Weaver, K., Garcia, S. M., Schwarz, N., & Miller, D. T. (2007).
Inferring the popularity of an opinion from its familiarity: A
repetitive voice can sound like a chorus. Journal of Personality
and Social Psychology, 92, 821–833.
15. Foster, J. L., Huthwaite, T., Yesberg, J. A., Garry, M., & Loftus,
E. F. (2012). Repetition, not number of sources, increases
both susceptibility to misinformation and confidence in the
accuracy of eyewitnesses. Acta Psychologica, 139, 320–326.
16. Stepper, S., & Strack, F. (1993). Proprioceptive determinants of
emotional and nonemotional feelings. Journal of Personality
and Social Psychology, 64, 211–220.
17. Schwarz, N., Bless, H., Strack, F., Klumpp, G., Rittenauer-
Schatka, H., & Simons, A. (1991). Ease of retrieval as
information: Another look at the availability heuristic. Journal
of Personality and Social Psychology, 61, 195 –202.
18. Larrick, R. P. (2004). Debiasing. In D. J. Koehler & N. Harvey
(Eds.), Blackwell handbook of judgment and decision making
(pp. 316–337). Oxford, United Kingdom: Blackwell.
19. Sanna, L. J., Schwarz, N., & Stocker, S. L. (2002). When
debiasing backfires: Accessible content and accessibility
experiences in debiasing hindsight. Journal of Experimental
Psychology: Learning, Memory, and Cognition, 28, 497–502.
20. Abelson, R. P. (1968). Theories of cognitive consistency: A
sourcebook. Chicago, IL: Rand McNally.
21. McGuire, W. J. (1972). Attitude change: An information
processing paradigm. In C. G. McClintock (Ed.), Experimental
social psychology (pp. 108–141). New York, NY: Holt, Rinehart
and Winston.
22. Wyer, R. S. (1974). Cognitive organization and change: An
information processing approach. Potomac, MD: Erlbaum.
23. Edwards, K., & Smith, E. E. (1996). A disconfirmation bias in
the evaluation of arguments. Journal of Personality and Social
Psychology, 71, 5–24.
24. Taber, C. S., & Lodge, M. (2006). Motivated skepticism in the
evaluation of political beliefs. American Journal of Political
Science, 50, 755–769.
25. Winkielman, P., Huber, D. E., Kavanagh, L., & Schwarz, N.
(2012). Fluency of consistency: When thoughts fit nicely and
flow smoothly. In B. Gawronski & F. Strack (Eds.), Cognitive
consistency: A fundamental principle in social cognition (pp.
89–111). New York, NY: Guilford Press.
26. Festinger, L. (1957). A theory of cognitive dissonance. Stanford,
CA: Stanford University Press.
27. Gawronski, B., & Strack, F. (2012). Cognitive consistency: A
fundamental principle in social cognition. New York, NY:
Guilford Press.
28. Erickson, T. D., & Mattson, M. E. (1981). From words to
meaning: A semantic illusion. Journal of Verbal Learning and
Verbal Behavior, 20, 540–551.
29. Song, H., & Schwarz, N. (2008). Fluency and the detection
of distortions: Low processing fluency attenuates the Moses
illusion. Social Cognition, 26, 791–799.
30. Johnson-Laird, P. N. (2012). Inference with mental models. In
K. Holyoak & R. G. Morrison (Eds.), The Oxford handbook of
thinking and reasoning (pp. 134–145). New York, NY: Oxford
University Press.
31. Pennington, N., & Hastie, R. (1992). Explaining the evidence:
Tests of the story model for juror decision making. Journal of
Personality and Social Psychology, 62, 189–206.
32. Gerrie, M. P., Belcher, L. E., & Garry, M. (2006). ‘Mind the
gap’: False memories for missing aspects of an event. Applied
Cognitive Psychology, 20, 689–696.
33. Bransford, J. D., & Johnson, M. K. (1972). Contextual
prerequisites for understanding: Some investigations of
comprehension and recall. Journal of Verbal Learning and
Verbal Behavior, 11, 717–726.
34. Leavitt, J., & Christenfeld, N. J. (2013). The fluency of spoilers:
Why giving away endings improves stories. Scientific Study of
Literature, 3, 93–104.
35. Eagly, A. H., & Chaiken, S. (1993). The psychology of attitudes.
Orlando, FL: Harcourt Brace Jovanovich College.
36. Petty, R. E., & Cacioppo, J. T. (1986). Communication and
persuasion: Central and peripheral routes to attitude change.
New York, NY: Springer.
37. Brown, A. S., Brown, L. A., & Zoccoli, S. L. (2001). Repetition-
based credibility enhancement of unfamiliar faces. The
American Journal of Psychology, 115, 199–209.
38. Weisbuch, M., & Mackie, D. (2009). False fame, perceptual
clarity, or persuasion? Flexible fluency attribution in
spokesperson familiarity eects. Journal of Consumer
Psychology, 19, 62–72.
39. Newman, E. J., Sanson, M., Miller, E. K., Quigley-McBride, A.,
Foster, J. L., Bernstein, D. M., & Garry, M. (2014). People with
easier to pronounce names promote truthiness of claims. PLoS
ONE, 9(2), Article e88671. doi:10.1371/journal.pone.0088671
40. Lev-Ari, S., & Keysar, B. (2010). Why don’t we believe
non-native speakers? The influence of accent on credibility.
Journal of Experimental Social Psychology, 46, 1093–1096.
41. Lee, D. S., Kim, E., & Schwarz, N. (2015). Something smells
fishy: Olfactory suspicion cues improve performance on
the Moses illusion and Wason rule discovery task. Journal of
Experimental Social Psychology, 59, 47–50.
42. Skurnik, I., Yoon, C., Park, D. C., & Schwarz, N. (2005). How
warnings about false claims become recommendations.
Journal of Consumer Research, 31, 713–724.
43. Emery, D. (2000). The great Internet banana scare of 2000:
“Killer flesh-eating bananas” rumor floods Internet. Retrieved
August 2, 2016, from: http://urbanlegends.about.com/od/
fooddrink/a/killer_bananas.htm
44. Hovland, C. I., & Weiss, W. (1951). The influence of source
credibility on communication eectiveness. Public Opinion
Quarterly, 15, 635–650. doi:10.1086/266350
45. Pratkanis, A. R., Greenwald, A. G., Leippe, M. R., &
Baumgardner, M. H. (1988). In search of reliable persuasion
eects: III. The sleeper eect is dead: Long live the sleeper
BSP_vol2no1_Interior_v4.indd 94 10/11/16 3:37 PM
a publication of the behavioral science & policy association 95
eect. Journal of Personality and Social Psychology, 54,
203–218. doi:10.1037/0022-3514.54.2.203
46. Lewandowsky, S., Oreskes, N., Risbey, J. S., Newell, B. R., &
Smithson, M. (2015). Seepage: Climate change denial and its
eect on the scientific community. Global Environmental
Change, 33, 1–13.
47. Houts, P. S., Doak, C. C., Doak, L. G., & Loscalzo, M. J. (2006).
The role of pictures in improving health communication: A
review of research on attention, comprehension, recall, and
adherence. Patient Education and Counseling, 61, 1 73–190.
48. Marcus, N., Cooper, M., & Sweller, J. (1996). Understanding
instructions. Journal of Educational Psychology, 88, 49–63.
49. Mayer, R. E. (2008). Applying the science of learning:
Evidence-based principles for the design of multimedia
instruction. American Psychologist, 63, 760–769.
50. Newman, E. J., Garry, M., Bernstein, D. M., Kantner, J., &
Lindsay, D. S. (2012). Nonprobative photographs (or words)
inflate truthiness. Psychonomic Bulletin & Review, 19,
969–974.
51. Fagerlin, A., Wang, C., & Ubel, P. A. (2005). Reducing the
influence of anecdotal reasoning on people’s health care
decisions: Is a picture worth a thousand statistics? Medical
Decision Making, 25, 398–405.
52. McGlone, M. S., & Tofighbakhsh, J. (2000). Birds of a
feather flock conjointly (?): Rhyme as reason in aphorisms.
Psychological Science, 11, 424–428.
53. Blank, H., & Launay, C. (2014). How to protect eyewitness
memory against the misinformation eect: A meta-analysis of
post-warning studies. Journal of Applied Research in Memory
and Cognition, 3, 77–88.
54. Butler, A. C., Zaromb, F. M., Lyle, K. B., & Roediger, H. L., III.
(2009). Using popular films to enhance classroom learning:
The good, the bad, and the interesting. Psychological Science,
20, 1161–1168.
55. Ecker, U. K., Lewandowsky, S., & Tang, D. T. (2010). Explicit
warnings reduce but do not eliminate the continued influence
of misinformation. Memory & Cognition, 38, 1087–1100.
56. Tousignant, J. P., Hall, D., & Loftus, E. F. (1986). Discrepancy
detection and vulnerability to misleading postevent
information. Memory & Cognition, 14, 329–338.
57. Certain Types of Statements for Dietary Supplements, 21 C.F.R.
§ 101.93 (2015).
... Since the accessibility of earlier-studied misinformation should determine how often corrections are detected as such, manipulations of misinformation memorability should lead to differences in subsequent memory effects associated with retrieval practice of fake news when detecting corrections. One way to influence misinformation accessibility is to vary the congruence of participant and peer beliefs in the veracity of such information (Schwarz et al., 2016). From one perspective, when evaluating the veracity of information, people incorporate peer beliefs into their evaluations, especially when the information has ambiguous veracity (Gabbert et al., 2007) and social contacts endorse the belief (Galesic et al., 2021). ...
... From one perspective, when evaluating the veracity of information, people incorporate peer beliefs into their evaluations, especially when the information has ambiguous veracity (Gabbert et al., 2007) and social contacts endorse the belief (Galesic et al., 2021). In addition, when new information matches prior beliefs, encoding is more fluent (Schwarz et al., 2016), leading to stronger memory representations (Levine & Murphy, 1943), possibly by integrating information with schemas (Pratkanis, 1989). Accordingly, misinformation that both participants and peers believe would be more accessible, leading to better detection of contradictory details enabled by misinformation retrieval. ...
... Although the reason for the absence of a belief congruence effect is unclear, one possibility is that belief congruence and incongruence both improved memory via different routes. For example, belief congruence could have enhanced encoding by improving encoding fluency (Schwarz et al., 2016), memory representations (Levine & Murphy, 1943), and integration with schemas (Pratkanis, 1989). Another possibility is that the values we selected for peer beliefs may have not been extreme enough to induce social influence (cf. ...
Article
Full-text available
Fake news can impair memory leading to societal controversies such as COVID-19 vaccine efficacy. The pernicious influence of fake news is clear when ineffective corrections leave memories outdated. A key theoretical issue is whether people should recall fake news while reading corrections with contradictory details. The familiarity backfire view proposes that recalling fake news increases its familiarity, leading to interference. However, the integrative encoding view proposes that recalling fake news promotes co-activation and binding of contradictory details, leading to facilitation. Two experiments examined if one theory better accounts for memory updating after participants recalled actual fake news details when reading headlines that corrected misinformation. In Phase 1, participants read real and fake news headlines of unclear veracity taken from various internet sources. In Phase 2, participants read real news headlines that reaffirmed real news and corrected fake news from Phase 1. When they detected that Phase 2 real news corrected fake news, they attempted to recall Phase 1 fake news. In Phase 3, participants first recalled real news details. When they remembered that those details were corrections from Phase 2, they attempted to recall fake news from Phase 1. Recalling fake news when noticing corrections in Phase 2 led to better memory for real news in Phase 3 when fake news was recalled again and worse memory for real news in Phase 3 when fake news was not recalled again. Both views explain part of the memory differences associated with recalling fake news during corrections, but only when considering whether people recollected that fake news had been corrected.
... Other concepts relate to the role of individual bias, with people being less sceptical if the information presented aligns with their political beliefs (Gampa et al. 2019). Of notable mention are the five criteria for which Schwarz, Newman, and Leach (2016) argue individuals subject a piece of information to before deeming it as true. For instance, compatibility with other known information, the credibility of the source, whether it is believed by others, whether it is internally consistent with their views or beliefs, and whether there is supporting evidence (Apuke and Omar 2021). ...
... On many occasions throughout the pandemic, this discourse has been peddled by senior politicians, governments, medical professionals and mainstream media. Schwarz, Newman, and Leach (2016) highlight how the credibility of the source plays a role in the dissemination and acceptance of information. In addition to this, research looking at authority and misinformation found that the misinformation effect only occurred in the high authority conditions (Skagerberg and Wright 2008). ...
Article
Full-text available
This paper looks at the profiles of those who engaged in Islamophobic language/extremist behaviour on Twitter during the COVID-19 pandemic. This two-part analysis takes into account factors such as anonymity, membership length and postage frequency on language use, and the differences in sentiment expressed between pro-social and anti-social tweets. Analysis includes comparisons between low, moderate and high levels of anonymity, postage frequency and membership length, allowing for differences in keyword use to be explored. Our findings suggest that increased anonymity is not associated with an increase in Islamophobic language and misinformation. The sentiment analysis indicated that emotions such as anger, disgust, fear, sadness and trust were significantly more associated with pro-social Twitter users whereas sentiments such as anticipation, joy and surprise were significantly more associated with anti-social Twitter users. In some cases, evidence for joy in the suffering of others as a result of the pandemic was expressed.
... The present findings add to this literature by suggesting that reminder-based and veracity-labeled corrections can promote integrative encoding to the extent that they trigger retrieval of fake news during real news corrections. The present results are also somewhat incompatible with the familiarity backfire prediction that repeating misinformation with corrections should lead misinformation to be more familiar and believable 36,39 . However, familiarity backfire was likely present in our results when corrections were not recollected. ...
Article
Full-text available
Fake news exposure can negatively affect memory and beliefs, thus sparking debate about whether to repeat misinformation during corrections. The once-prevailing view was that repeating misinformation increases its believability and should thus be avoided. However, misinformation reminders have more recently been shown to enhance memory and belief accuracy. We replicated such reminder benefits in two experiments using news headlines and compared those benefits against the effects of veracity labeling. Specifically, we examined the effects of labeling real news corrections to enhance conflict salience (Experiment 1) and of labeling fake news on its debut to encourage intentional forgetting (Experiment 2). Participants first viewed real and fake news headlines, with some fake news labeled as false. Participants then saw labeled and unlabeled real news corrections; labeled corrections appeared alone or after fake news reminders. Reminders promoted the best memory and belief accuracy, whereas veracity labels had selective effects. Correction labels led to intermediate memory and belief accuracy, whereas fake news labels improved accuracy for beliefs more than for memory. The extent to which real and fake news details were recalled together correlated with overall memory and belief differences across conditions, implicating a critical role for integrative encoding, which was promoted most by fake news reminders.
... However, the misinformation should be prefaced with a warning [99,148] and repeated only once in order not to boost its familiarity unnecessarily [104]. It is also good to conclude by repeating and emphasizing the accurate information to reinforce the correction [185]. ...
Poster
Full-text available
Critical thinking for sustainable development therefore focuses on the soft skills of positive values and attitudes while at the same time embracing social, economic, political, and environmental transformation for the good of everyone irrespective of age, gender, ethnicity, or status in society. Green marketing is the development and sale of environmentally friendly goods or services. It helps improve credibility, reach new audience segments, and stand out among competitors as more and more people become environmentally conscious. Examples include using eco-friendly paper and inks for print marketing materials, skipping printed materials altogether in favor of electronic marketing, maintaining a recycling program with responsible waste-disposal practices, and using eco-friendly product packaging. Critical thinking also helps people better understand themselves, their motivations, and their goals. When you can distill information to find its most important parts and apply those to your life, you can change your situation and promote personal growth and overall happiness. The reason innovation benefits from critical thinking is simple: critical thinking is used when judgment is needed to produce a desired set of valued outcomes. That is why the majority of innovation outcomes reflect incremental improvements built on a foundation of critically thought-out solutions. The results indicate that four factors effectively influence the fulfillment of green marketing: green labeling, compatibility, product value, and green advertising. A green mission statement becomes the foundation of a company's sustainability efforts, providing the organization and its stakeholders with an understanding of what is most important and what the company can do to protect the natural world and be more socially responsible.
... False beliefs are not replaced with well-founded knowledge simply by providing true information; doing so can instead result in distortion of the new information to make it fit with preconceived ideas (Hughes, Lyddy, & Lambe, 2013; Schwarz et al., 2016; Swami et al., 2015). All of this current knowledge about the mechanisms by which myths propagate should be available to teachers so that they can design appropriate classroom methodologies to deal with false beliefs and promote students' critical thinking (Hughes, Lyddy, & Lambe, 2013; McAfee & Hoffman, 2021). ...
Article
Background: The study of myths in psychology has conceptual and educational relevance: how should the teaching of psychology be adapted to confront myths with grounded knowledge? A first step is to know which myths prevail and how they relate to training in psychology. Objective: To explore the prevalence of myths among Spanish first-year university students of Social Sciences (SS) and Engineering Sciences (ES) (Study 1) and among people with different levels of expertise in psychology (Study 2). Method: A questionnaire including 21 myths. Study 1: 175 first-year SS and ES undergraduates. Study 2: 102 lay, semi-expert and expert participants in psychology. Results: The prevalence of myths was lower among Spanish students than in other countries (approx. 37% vs. 60%), with SS students performing better than ES students. Experts performed significantly better (14% of myths endorsed) than lay students (33%), but not better than semi-experts (19%). Conclusions: The lower prevalence of myths compared with other countries may be due to methodological and sociocultural aspects. University training in psychology helps people identify myths but does not eradicate them. Teaching Implications: There is a need to reflect on the limited progress beyond a medium level of expertise. Teachers and students must identify their own myths and work on them in the classroom, promoting critical thinking.
... There is, however, a whole dimension to consider in how these goods and services are presented to this audience: variables that are not critical to the content of the proposition but that affect how the messages are processed. "When messages are difficult to process, people think they are less compelling (Schwarz, 2015; Schwarz, Newman, & Leach, 2016)." ...
Thesis
In today's globalized world of information, where the population's attention span is shrinking and becoming an increasingly prized currency, it is worthwhile to research and explore the tools at marketers' disposal to further maximize the return on their investment and keep their audience attached to the screen until their value proposition has been delivered. Building on previous research on the fluency of a message derived from incidental factors, this paper sheds light on the effect that "Cutting on the Beat" (COTB), a particular style of synchronization-based audiovisual editing, has on retention and on certain subjective perceptions of the message. A conventional laboratory experiment conducted in a virtual environment suggested that COTB does not influence retention or message believability, but it lays the groundwork for further research on post-production styles that purport to do so.
... They are influenced, at least in part, by media consumption and a polarized media landscape. The novelty of COVID-19, together with disparate government responses and overwhelmed health systems, exacerbates the factors that contribute to belief in conspiracy theories, including feelings of powerlessness, a desire to cope with uncertainty or threats, and validation of perceived victimization (Schwarz et al., 2016). ...
Article
The Transfer-appropriate Processing (TAP) framework has demonstrated enhanced recognition memory when the processing operations engaged at encoding and at test match. Our research applied TAP to the study of the illusory truth effect (ITE). We investigated whether the match or mismatch of evaluative goals at encoding and at test affects the ITE. At encoding, participants saw target words (Experiments 1-3) or full trivia claims (Experiments 4-5) and completed an evaluative goal: an imagery task or vowel counting. At test, participants saw target words embedded in trivia claims that were old or new and completed either the same (matching) or a different (mismatching) evaluative goal than at encoding, before making truth or memory ratings. We found a typical TAP effect for memory judgments when people saw words at encoding, but no TAP effect when people saw claims at encoding. We also found an ITE when people saw claims at encoding, but no ITE when people saw words at encoding (no evidence of TAP moderating truth judgments). Together, these results extend both the TAP and ITE literatures, suggesting boundary conditions for TAP and clarifying the conditions under which the ITE emerges.
Article
This research examined the hypothesis that people judge as true those claims aligned with the normative content of their salient social identities. In Experiment 1a, participants' social identities were manipulated by assigning them to 'inductive-thinker' and 'intuitive-thinker' groups. Participants subsequently made truth judgements about aphorisms randomly associated with 'science' and 'popular wisdom'. Those with salient inductive-thinker social identities judged science-based claims as more truthful than popular wisdom-based claims, and did so to a greater extent than those with salient intuitive-thinker social identities. Experiment 1b was a preregistered replication, with additional conditions eliminating an alternative semantic-priming explanation. In Experiment 2, American Conservatives and Liberals judged claims associated with the ideological content of their social identities as more true. This difference was attenuated by a manipulation that framed participants as more moderate than they had originally indicated. Overall, these experiments suggest an identity-truth malleability, such that making specific social identities salient can lead to related perceptions of truth normatively aligned with those identities.
Article
Full-text available
Vested interests and political agents have long opposed political or regulatory action in response to climate change by appealing to scientific uncertainty. Here we examine the effect of such contrarian talking points on the scientific community itself. We show that although scientists are trained in dealing with uncertainty, there are several psychological reasons why scientists may nevertheless be susceptible to uncertainty-based argumentation, even when scientists recognize those arguments as false and are actively rebutting them. Specifically, we show that prolonged stereotype threat, pluralistic ignorance, and a form of projection (the third-person effect) may cause scientists to take positions that they would be less likely to take in the absence of outspoken public opposition. We illustrate the consequences of seepage from public debate into the scientific process with a case study involving the interpretation of temperature trends from the last 15 years. We offer ways in which the scientific community can detect and avoid such inadvertent seepage.
Article
Thinking and reasoning, long the academic province of philosophy, have emerged over the past century as core topics of empirical investigation and theoretical analysis in the modern fields of cognitive psychology, cognitive science, and cognitive neuroscience. Formerly seen as too complicated and amorphous to be included in early textbooks on the science of cognition, the study of thinking and reasoning has since taken off, branching in a distinct direction from the field from which it originated. This comprehensive publication covers all the core topics of the field of thinking and reasoning. Written by the foremost experts from cognitive psychology, cognitive science, and cognitive neuroscience, the individual articles summarize basic concepts and findings for a major topic, sketch its history, and give a sense of the directions in which research is currently heading. The authors provide introductions to foundational issues and methods of study in the field, as well as treatment of specific types of thinking and reasoning and their application in a broad range of fields, including business, education, law, medicine, music, and science.
Article
Two studies demonstrated that attempts to debias hindsight by thinking about alternative outcomes may backfire and traced this to the influence of subjective accessibility experiences. Participants listed either few (2) or many (10) thoughts about how an event might have turned out otherwise. Listing many counterfactual thoughts was experienced as difficult and consistently increased the hindsight bias, presumably because the experienced difficulty suggested that there were not many ways in which the event might have turned out otherwise. No significant hindsight effects were obtained when participants listed only a few counterfactual thoughts, a task subjectively experienced as easy. The interplay of accessible content and subjective accessibility experiences in the hindsight bias is discussed.
Article
This chapter outlines the theory of mental models. The theory accounts for the deductive reasoning of individuals untrained in logic, and the chapter marshals corroboratory evidence. It goes on to describe how the theory applies to reasoning about probabilities based on the alternative possibilities in which an event occurs ("extensional" reasoning). It also outlines the theory's application to inductions, including those that depend on the intuitive system of reasoning (System 1) and those that depend on deliberation (System 2). Inductions include the automatic use of knowledge to modulate the interpretation of assertions. Models appear to underlie both the detection of inconsistencies among propositions and abductions that create explanations, including those that resolve inconsistencies. The model theory is supposed to apply to all thinking about propositions, and the chapter concludes with some of the main gaps in its account.