
Abstract

Reviews research into intuitions of truth and discusses its implications for fake news, social media, and the correction of misinformation. -- The published version (open access) is here: http://www.apa.org/science/about/psa/2017/08/gut-truth.aspx
8/25/2017 How does the gut know truth?
http://www.apa.org/printthis.aspx
Psychological Science Agenda | June 2017
SCIENCE BRIEF
How does the gut know truth?
The psychology of “truthiness.”
By Norbert Schwarz, PhD, and Eryn J. Newman, PhD
Norbert Schwarz, PhD, (University of Mannheim, Germany, 1980) is provost professor
of psychology and marketing at the University of Southern California (USC) and co-
founder of the USC Dornsife Mind & Society Center. He previously served on the
faculty of the University of Michigan and the University of Heidelberg, Germany. His
research addresses the interplay of feeling and thinking, the context-sensitive and
embodied nature of judgment and decision making, and the implications of basic cognitive and
communicative processes for public opinion, consumer behavior and social science research. He has
been elected to the American Academy of Arts and Sciences and German National Academy of
Sciences and has received distinguished scientific contribution awards from behavioral science
associations in Europe and the United States. Author website: https://dornsife.usc.edu/norbert-schwarz/.
Eryn J. Newman, PhD, (Victoria University of Wellington, New Zealand, 2013) is a
postdoctoral research associate at the University of Southern California’s USC
Dornsife Mind & Society Center. Her research addresses distortions of cognition and
memory and ways to correct them. She currently focuses on how anecdotes,
illustrative photographs and other tangential, nonprobative information can influence
judgments of truth and boost people’s confidence in the veracity of false claims. Newman’s work has
appeared in leading journals of the field and has been funded by a Fulbright Fellowship. Author
website: http://dornsife.usc.edu/eryn-newman/.
The comedian Stephen Colbert (2005) introduced the term truthiness to describe “truth that comes from the gut, not the
book.” Following Brexit and the rise of Donald Trump, the Oxford Dictionaries (2016) selected post-
truth as the word of the year 2016 to denote “circumstances in which objective facts are less
influential in shaping public opinion than appeals to emotion and personal belief.” Indeed, current
public discourse — on topics ranging from politics to vaccines, from genetically modified food to
human-caused climate change — suggests that knowledge from the gut may often override
knowledge from the book, that is, established facts based on scientific evidence. But how does the
gut know what’s true? What makes a claim feel right?
Intuitions of truth
When people consider whether something is true, they usually ask themselves one or more of the
questions in Table 1. Each of these questions can be answered analytically or intuitively (Schwarz,
2015; Schwarz, Newman & Leach, 2016). Analytic answers, akin to knowledge from the book, draw on
relevant knowledge and may involve extensive information search, which is taxing and requires
cognitive resources. Intuitive answers, akin to knowledge from the gut, are less demanding and rely
on feelings of fluency and familiarity. The easier a claim is to process and the more familiar it feels, the
more likely it is judged “true.” When thoughts flow smoothly, people nod along.
Table 1. Criteria for judging truth
When knowledge is uncertain, people turn to social consensus to gauge what is likely to be correct
(Festinger, 1954) — if many people believe it, there’s probably something to it. Hence, people are
more confident in their beliefs if others share them (Visser & Mirabile, 2004) and more inclined to
believe scientific theories when there is consensus among scientists (Lewandowsky, Gignac &
Vaughan, 2013). But determining the extent of consensus can be difficult and familiarity offers a
plausible shortcut — if many people think so, one should have heard it a few times, making it familiar.
This gives small but vocal groups a great advantage — the more often they repeat their message, the
more familiar it feels and the more people infer that many others agree, even if every repetition
comes from the same source. Likewise, a text can feel more familiar merely because it was repeated
several times on a page, even when due to a printing error (Weaver, Garcia, Schwarz & Miller, 2007).
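This shortcut is easy to sketch in code. The toy model below is our illustration, not a model from the literature: it contrasts a reader who infers consensus from raw exposure counts (the familiarity heuristic) with the actual number of distinct sources behind a claim.

```python
def estimated_consensus(exposures):
    # Toy familiarity heuristic: the reader tracks only how often the claim
    # was encountered, not who said it, so repetition inflates the estimate.
    return len(exposures)

def actual_consensus(exposures):
    # Ground truth: the number of distinct people who voiced the claim.
    return len(set(exposures))

vocal_minority = ["alice"] * 5                            # one source, five repetitions
broad_support = ["alice", "bob", "carol", "dan", "erin"]  # five sources, once each

# Both message streams feel equally familiar...
print(estimated_consensus(vocal_minority), estimated_consensus(broad_support))  # 5 5
# ...but the actual consensus differs sharply.
print(actual_consensus(vocal_minority), actual_consensus(broad_support))        # 1 5
```

In the sketch, five repetitions from a single source feel identical to five independent voices, which is precisely the inference error that gives small but vocal groups their advantage.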
People are more likely to accept a claim that is compatible with their own beliefs than one that is not
(Abelson et al., 1968; Wyer, 1974). Compatibility can be assessed analytically by checking the claim
against other knowledge or intuitively by attending to one’s subjective experiences during exposure.
Information that is inconsistent with one’s beliefs elicits negative feelings (e.g., Festinger, 1957) and is
processed less fluently than information that is consistent with one’s beliefs (for a review, see
Winkielman, Huber, Kavanagh & Schwarz, 2012). These subjective experiences serve as problem
signals that trigger more careful assessments of the veracity of a statement. For example, when asked
“How many animals of each kind did Moses take on the Ark?” most people answer “two” despite
knowing that the biblical actor was Noah, not Moses (Erickson & Mattson, 1981). The biblically themed
question feels familiar and people focus on what they are asked about (how many?) rather than a
background detail (who). But when the question is printed in a difficult-to-read font, thus making it
harder to process, thoughts flow less smoothly and people are more likely to notice the misleading
supposition embedded in the question (Song & Schwarz, 2008).
Table 2. Ease of reading influences detection of misleading question (Song & Schwarz, 2008, Experiment 1)
Claims are also more likely to be accepted as true when they are compatible with how one feels. Kim,
Park and Schwarz (2010) induced peaceful or excited feelings before their participants read an
advertisement that promised a serene or an adventurous vacation. Participants who felt excited were
more likely to think that the adventurous vacation would deliver what the advertisement promised,
whereas those who felt peaceful were more likely to think that the serene vacation would live up to its
promises. Similarly, angry people may find angry messages credible, even while acknowledging that
substantive details are flaky.
Claims are also more likely to be accepted when they form a coherent and plausible story (Johnson-
Laird, 2012; Pennington & Hastie, 1992). Coherent stories are easier to process than incoherent
stories with internal contradictions. The ease with which stories can be processed serves as an
experiential marker for how well things hang together (Topolinski, 2012) — as long as thoughts flow
smoothly, the story seems to make sense.
Not surprisingly, information is more likely to be accepted when it comes from a credible source.
Source credibility can be evaluated by drawing on the source’s expertise, affiliation, past statements
and so on (for a review, see Eagly & Chaiken, 1993). It can also be evaluated intuitively, in which case
feelings of fluency and familiarity loom large. Repeatedly seeing a face is sufficient to increase
perceived honesty and sincerity as well as agreement (Brown, Brown & Zoccoli, 2001; Weisbuch &
Mackie, 2009). People are also more likely to believe statements when they are made in a familiar
and easy-to-understand accent (Lev-Ari & Keysar, 2010) and when the speaker’s name is easy rather
than difficult to pronounce (Newman, Sanson, Miller, Quigley-McBride, Foster, Bernstein & Garry,
2014).
Finally, a claim is more likely to be accepted when it has a large body of supporting evidence.
Evidence can be assessed analytically by consulting relevant literature or one’s own knowledge. But
it can also be gauged by how easy it is to bring some evidence to mind — the more evidence exists,
the easier it should be to think of some (Schwarz et al., 1991; Tversky & Kahneman, 1973). Hence,
people who are asked to list two supporting arguments are more persuaded by a claim than people
asked to list six supporting arguments. Even when people can list many arguments, doing so is more
difficult than listing only a few and highlights that good support is difficult to come by (for a review,
see Schwarz, Sanna, Skurnik & Yoon, 2007).
A given claim is also more likely to be accepted when it appears with a photo — even when the
photo has no probative value (Newman, Garry, Bernstein, Kantner & Lindsay, 2012). For example,
people are more likely to believe “Magnesium is the liquid metal inside a thermometer” when they
see a photo of a thermometer (Figure 1), even one that provides no information about the liquid
inside. Photos exert this influence because they are perceived as offering evidence and make it
easier for the reader to understand and imagine the claim. As a result, the claim feels fluent, familiar
and true.
Figure 1. Nonprobative photos increase acceptance
In sum, easy processing gives an affirmative intuitive answer to each of the major truth criteria. This
reflects reliance on generally correct lay theories of mental processes: Familiar information is indeed
easier to process; information that is coherent and compatible with one’s knowledge is indeed more
likely to be correct; and supporting arguments are easier to generate when there are many of them.
But people are more sensitive to their feelings than to where their feelings come from (Schwarz,
2012). They miss that their experienced ease or difficulty of processing may result from influences that
are completely unrelated to a claim’s veracity, such as number of repetitions, ease of reading (e.g.,
due to color contrast and print font), listening (e.g., due to accent) and pronunciation (for a review of
variables that influence fluency, see Alter & Oppenheimer, 2009). Indeed, the same statement is more
likely to be accepted as true when the color contrast makes it easier to read, as illustrated in Figure 2.
Figure 2. Color contrast and truth (based on Reber & Schwarz, 1999)
Echo chambers
Intuitive truth tests foster the acceptance of information on social media. On Facebook, one’s friends
(a credible source) post a message that is liked and reposted by other friends (social consensus),
resulting in multiple exposures to the same message. With each exposure, processing becomes
easier and perceptions of social consensus, coherence and compatibility increase. Comments and
related posts provide additional supporting evidence and further enhance familiarity. At the same
time, the filtering mechanism of the feed makes exposure to opposing information less likely, as
illustrated by the Wall Street Journal’s “Blue Feed/Red Feed” site (http://www.wsj.com/video/red-feed-blue-feed-liberal-vs-conservative-facebook/0678AF47-7C53-4CDF-8457-F6A16A46CDAF.html). Even outside of social media, the personalization of internet offerings facilitates a similar
narrowing of one’s information diet (Pariser, 2011).
Going beyond the mere acceptance of information, these processes are also likely to leave people
with a high sense of expertise and confidence — not only does the information seem true, it has been
seen without much opposing evidence. This enhances the familiar phenomenon of naïve realism (Ross
& Ward, 1996) — the world is the way I see it and whoever disagrees is either ill-informed (which
motivates persuasion efforts) or ill-intentioned (if persuasion fails).
Correcting misinformation
False information is notoriously difficult to correct (for comprehensive reviews, see Lewandowsky,
Ecker, Seifert, Schwarz & Cook, 2012; Schwarz, Sanna, Skurnik & Yoon, 2007). While suspicion or
warnings prior to exposure reduce the acceptance of false (as well as correct) information, corrections
after exposure are often futile. Most correction attempts confront misleading statements with facts.
This works as long as the facts are highly accessible, but backfires after a delay because it ignores
the downstream consequences for intuitive truth assessments. Extensive thought about the
misinformation at the correction phase increases fluent processing when the misinformation is re-
encountered at a later time. If the correct facts do not easily come to mind at that moment, the false
information will feel all the more fluent and familiar, fostering its endorsement as true. For example,
telling people multiple times that a health claim is false reduced acceptance of the claim when people
were tested immediately, but increased acceptance three days later, when the details were forgotten
but the claim felt familiar (Skurnik, Yoon, Park & Schwarz, 2005). Older adults are particularly
vulnerable to such backfire effects because memory for details declines faster with age than the
global feeling of familiarity when one re-encounters previously seen information (Skurnik et al., 2005).
To be successful, correction attempts should avoid the repetition of false information and instead
focus on making the truth as fluent and familiar as possible. Unfortunately, the truth is often more
complex than false stories, putting it at a disadvantage. Overcoming this disadvantage requires that
the truth be articulated clearly and repeated frequently, in formats that are easy to process. Photos,
illustrations and anecdotes should highlight what is true, without facilitating images of what is false.
This is particularly challenging when the false information itself is the key piece of news, for example,
when a highly visible public figure posts a false claim on Twitter. Media coverage of such “news
events” will inevitably repeat the false claim and spread it to a wider audience. In such cases, it is not
sufficient to note that the claim is “unsubstantiated” — a detail that will fade faster than the vivid claim.
Instead, media coverage needs to make the truth the primary focus, highlighting in vivid and concrete
ways how the false claim deviates from it.
References
Abelson, R.P., Aronson, E., McGuire, W.J., Newcomb, T.M., Rosenberg, M.J., & Tannenbaum, P.H. (Eds.) (1968).
Theories of cognitive consistency: A sourcebook. Chicago: Rand-McNally.
Alter, A.L., & Oppenheimer, D.M. (2009). Uniting the tribes of fluency to form a metacognitive nation. Personality
and Social Psychology Review, 13, 219-235.
Brown, A.S., Brown, L.A., & Zoccoli, S.L. (2001). Repetition based credibility enhancement of unfamiliar faces. The
American Journal of Psychology, 115, 199–209.
Colbert, S. (2005). Truthiness. The Colbert Report, 17 Oct 2005. Retrieved from http://www.cc.com/video-clips/63ite2/the-colbert-report-the-word---truthiness
Eagly, A.H., & Chaiken, S. (1993). The psychology of attitudes. Orlando, Florida: Harcourt Brace Jovanovich
College.
Erickson, T.A., & Mattson, M.E. (1981). From words to meaning: A semantic illusion. Journal of Verbal Learning and
Verbal Behavior, 20, 540-552.
Festinger, L. (1954). A theory of social comparison processes. Human Relations, 7, 123-146.
Festinger, L. (1957). A theory of cognitive dissonance. Stanford, California: Stanford University Press.
Johnson-Laird, P.N. (2012). Mental models and consistency. In B. Gawronski & F. Strack (Eds.), Cognitive
consistency: A fundamental principle in social cognition (pp. 225-243). New York: Guilford Press.
Kim, H., Park, K., & Schwarz, N. (2010). Will this trip be really exciting? The role of incidental emotions in product
evaluation. Journal of Consumer Research, 36, 983-991.
Lev-Ari, S., & Keysar, B. (2010). Why don’t we believe non-native speakers? The influence of accent on credibility.
Journal of Experimental Social Psychology, 46, 1093–1096.
Lewandowsky, S., Ecker, U.K.H., Seifert, C., Schwarz, N., & Cook, J. (2012). Misinformation and its correction:
Continued influence and successful debiasing. Psychological Science in the Public Interest, 13, 106-131.
Lewandowsky, S., Gignac, G.E., & Vaughan, S. (2013). The pivotal role of perceived scientific consensus in
acceptance of science. Nature Climate Change, 3, 399–404.
McGlone, M.S., & Tofighbakhsh, J. (2000). Birds of a feather flock conjointly (?): Rhyme as reason in aphorisms.
Psychological Science, 11, 424–428.
Newman, E.J., Garry, M., Bernstein, D.M., Kantner, J., & Lindsay, D.S. (2012). Nonprobative photographs (or words)
inflate truthiness. Psychonomic Bulletin & Review, 19, 969–974.
Newman, E.J., Sanson, M., Miller, E.K., Quigley-McBride, A., Foster, J.L., Bernstein, D.M., & Garry, M. (2014). People
with easier to pronounce names promote truthiness of claims. PLoS ONE, 9(2), Article e88671.
Oxford Dictionaries (2016). Post-truth. Retrieved from https://www.oxforddictionaries.com/press/news/2016/12/11/WOTY-16
Pariser, E. (2011). Beware of online “filter bubbles”. TED talk. Retrieved from http://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles
Pennington, N., & Hastie, R. (1992). Explaining the evidence: Tests of the story model for juror decision making.
Journal of Personality and Social Psychology, 62, 189–206.
Reber, R., & Schwarz, N. (1999). Effects of perceptual fluency on judgments of truth. Consciousness and
Cognition, 8, 338-342.
Ross, L., & Ward, A. (1996). Naive realism in everyday life: Implications for social conflict and misunderstanding. In
E.S. Reed, E. Turiel, & T. Brown (Eds.), Values and knowledge (pp. 103–135). Hillsdale, NJ: Lawrence Erlbaum.
Schwarz, N. (2012). Feelings-as-information theory. In P.A.M. Van Lange, A. Kruglanski, & E.T. Higgins (eds.),
Handbook of theories of social psychology (pp. 289-308). Thousand Oaks, California: Sage.
Schwarz, N. (2015). Metacognition. In M. Mikulincer, P.R. Shaver, E. Borgida, & J.A. Bargh (Eds.), APA Handbook of
Personality and Social Psychology: Attitudes and Social Cognition (pp. 203-229). Washington, DC: APA.
Schwarz, N., Bless, H., Strack, F., Klumpp, G., Rittenauer-Schatka, H., & Simons, A. (1991). Ease of retrieval as
information: Another look at the availability heuristic. Journal of Personality and Social Psychology, 61, 195–202.
Schwarz, N., Newman, E., & Leach, W. (2016). Making the truth stick and the myths fade: Lessons from cognitive
psychology. Behavioral Science & Policy, 2(1), 85-95.
Schwarz, N., Sanna, L.J., Skurnik, I., & Yoon, C. (2007). Metacognitive experiences and the intricacies of setting
people straight: Implications for debiasing and public information campaigns. Advances in Experimental Social
Psychology, 39, 127–161.
Skurnik, I., Yoon, C., Park, D.C., & Schwarz, N. (2005). How warnings about false claims become
recommendations. Journal of Consumer Research, 31, 713–724.
Song, H., & Schwarz, N. (2008). Fluency and the detection of distortions: Low processing fluency attenuates the
Moses illusion. Social Cognition, 26, 791–799.
Topolinski, S. (2012). Nonpropositional consistency. In B. Gawronski & F. Strack (Eds.), Cognitive consistency: A
fundamental principle in social cognition (pp. 112-131). New York: Guilford Press.
Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive
Psychology, 5, 207-232.
Visser, P.S., & Mirabile, R.R. (2004). Attitudes in the social context: The impact of social network composition on
individual-level attitude strength. Journal of Personality and Social Psychology, 87, 779-795.
Wall Street Journal (2016). Blue feed, red feed. Retrieved from http://www.wsj.com/video/red-feed-blue-feed-liberal-vs-conservative-facebook/0678AF47-7C53-4CDF-8457-F6A16A46CDAF.html
Weaver, K., Garcia, S.M., Schwarz, N., & Miller, D.T. (2007). Inferring the popularity of an opinion from its familiarity:
A repetitive voice can sound like a chorus. Journal of Personality and Social Psychology, 92, 821–833.
Weisbuch, M., & Mackie, D. (2009). False fame, perceptual clarity, or persuasion? Flexible fluency attribution in
spokesperson familiarity effects. Journal of Consumer Psychology, 19, 62–72.
Winkielman, P., Huber, D.E., Kavanagh, L., & Schwarz, N. (2012). Fluency of consistency: When thoughts fit nicely
and flow smoothly. In B. Gawronski & F. Strack (Eds.), Cognitive consistency: A fundamental principle in social
cognition (pp. 89–111). New York, NY: Guilford Press.
Wyer, R.S. (1974). Cognitive organization and change: An information processing approach. Potomac, Maryland:
Erlbaum.
The views expressed in this article are those of the author and do not reflect the opinions or policies of APA.

Supplementary resource (1)

... The existing empirical research suggests that truthiness occurs throughout a variety of judgment contexts because non-probative but related photos make claims feel subjectively easier to process (i.e., increase processing fluency) compared to when no photo is present (e.g., Jacoby & Dallas, 1981;Whittlesea, 1993). According to fluency-based accounts, a photo may help people picture and process a claim more easily (Cardwell et al., 2017;Schwarz & Newman, 2017). Subsequently, truthiness occurs when people use this fluency as a heuristic (mental shortcut) for truth (see also Tversky & Kahneman, 1973). ...
... Subsequently, truthiness occurs when people use this fluency as a heuristic (mental shortcut) for truth (see also Tversky & Kahneman, 1973). Specifically, people can mistake the ease with which they process the trivia claim as evidence that the claim is true rather than correctly attributing that ease to the presence of the non-probative but related photograph (Newman et al., 2015;Schwarz & Newman, 2017; see also fluency misattribution, Jacoby & Whitehouse, 1989). ...
... For example, a picture of a cheetah may help us visualize a cheetah running quickly, which makes the claim easier to imagine and process. This can produce a feeling of ease while processing the question or claim that is often associated with truth (Cardwell et al., 2017;Schwarz & Newman, 2017). An increasing body of research shows that easy processing is often interpreted as a cue to truth but is more generally taken as a cue to criteria that are related to truth such as social consensus, coherence, credibility and compatibility (Schwarz et al., 2016). ...
Article
Full-text available
When semantically-related photos appear with true-or-false trivia claims, people more often rate the claims as true compared to when photos are absent-truthiness. This occurs even when the photos lack information useful for assessing veracity. We tested whether truthiness changed in magnitude as a function of participants' age in a diverse sample using materials appropriate for all ages. We tested participants (N = 414; Age range = 3-87 years) in two culturally diverse environments: a community science center (First language: English (61.4%); Mandarin/Cantonese (11.6%); Spanish (6%), other (21%); ethnicity: unreported) and a psychology lab (First language: English (64.4%); Punjabi (9.8%); Mandarin/Cantonese (7.4%); other (18.4%); ethnicity: Caucasian (38%); South Asian (30.7%); Asian (22.7%); other/unreported (8.6%). Participants rated trivia claims as true or false. Half the claims appeared with a semantically related photo, and half appeared without a photo. Results showed that participants of all ages more often rated claims as true when claims appeared with a photo; however, this truthiness effect was stable across the lifespan. If truthiness age differences exist, they are likely negligible in the general population. (PsycInfo Database Record (c) 2022 APA, all rights reserved).
... İnternet aynı zamanda her konuda olağanüstü boyutlarda uydurma bilgiyi içermekte ve yüzyıllardır var olan yalan ve aldatmaya ivme kazandırmaktadır (Keyes, 2017, s. 265). Schwarz ve Newman (2017) bir bilginin doğruluğuna karar verirken kullandığımız kriterlerden bir tanesini bu bilgi ile ilgili kanıt bulunabilmesi olarak tanımlamaktadır. İnsanlar günümüzde bir bilginin doğruluğunu kanıtlamak için akademik yayınları taramak yerine popüler arama motorlarından birinde arama yapmakta ve inanmak istedikleri bilgi için çoğu zaman rahatlıkla bir kanıt bulabilmektedir. ...
... Maalesef gerçek, çoğu zaman aktarılan versiyonlarından çok daha karmaşıktır ve bu nedenle birçok kişiye, anlamak için özel bir çaba göstermedikleri taktirde karmaşık ve tutarsız gelebilmektedir. Bu nedenle internet üzerinden kolaylıkla ulaşılabilen, gerçeğin doğruluğu yüksek olmayan versiyonları, başka bir deyişle anlatılar, bireylere çok daha doğru ve anlaşılır izlenimi bırakabilmektedir (Schwarz ve Newman, 2017). ...
Article
Full-text available
p.p1 {margin: 0.0px 0.0px 0.0px 0.0px; font: 12.0px 'Minion Pro'; min-height: 14.0px} p.p2 {margin: 0.0px 0.0px 0.0px 0.0px; text-align: justify; line-height: 10.1px; font: 10.0px 'Minion Pro'; color: #2d2829} span.s1 {font: 12.0px 'Minion Pro'; color: #000000} Bu çalışmada, nesnel gerçeklerin belirli bir konu üzerinde kamuoyunu belirlemede duygulardan ve kişisel kanaatlerden, daha az etkili olması ve bireylerin nesnel gerçeklikleri görmezden gelerek belirli bir öznel gerçekliğe inatla bağlı kalması olarak tanımlanan gerçek ötesi kavramı teorik olarak incelenmektedir. Bu inceleme, siyasal iletişim bağlamında ve McLuhan tarafından geliştirilen teknolojik belirleyicilik kuramı, Klapper tarafından geliştirilen minimal etki kuramı ve Festinger tarafından geliştirilen bilişsel çelişki kuramı temel alınarak yapılmaktadır. Teknolojik belirleyicilik kuramı, günümüz iletişim teknolojileri ile değişen insan hayatının, insani değer ve pratiklerin, internet ve sosyal medya ile evrim geçiren bilgiye ulaşma, haber alma, kendini ifade etme ve iletişim kurma biçimlerinin nesnel gerçekliğin önemini azaltarak öznel gerçekliği vurgulaması ve gerçek ötesine ivme kazandırmasını açıklamak için çerçeve olarak kullanılmaktadır. Klapper’ın minimal etki kuramı, insanların sadece kendi fikir, inanç ve tutumları ile uyuşan mesajları arama, kabul etme ve hatırlama eğilimini, bugünün medya çeşitliliği ortamında ve bireylerin gelişmiş haber kaynaklarını kişiselleştirebilme becerilerinin etkilerini dikkate alarak incelemek ve gerçek ötesi kavramı ile ilişkilendirmek için çalışmaya dâhil edilmiştir. Festinger’ın bilişsel çelişki kuramı ise insanların neden gerçek ötesine meylettikleri konusunda bir anlayış geliştirmek için kullanılmıştır
... Consistent with the continued influence of misinformation account, 4 participants' agreement with the news position was not attenuated by the explicit post hoc correction (H2a). Considering that information is judged as truth when it meets intuitive evaluation criteria (e.g., familiarity, compatibility with existing knowledge), 13 once false information is received and integrated into one's knowledge system, debunking its falsehood may not be sufficient to undo it. Interestingly, even though people did not discount the news after learning that it was judged to be false, they rated social media to be a less credible news channel (H4). ...
... Second, research found some post hoc corrections are more efficacious than others, such as repeating retractions or providing readily accessible factual alternatives to minimize the impact of misinformation. 8,13,15 The current null effect of debunking information on the agreement with the news position may be attributable to the use of an ineffective correction strategy, demanding future research employing various debunking strategies. ...
Article
A web-based experiment (n = 960) examined how debunking of publicly shared news on social media affects viewers' attitudes toward the source who shared the fake news, their agreement with the news position, and perceived credibility of social media as a news platform. Exposure to debunking information did not lower participants' agreement with the news position, but led them to derogate (1) the source who shared the misinformation and (2) social media as a news platform. However, participants who initially favored the source were less likely to attribute the sharing of fake news to the source's dispositions, rather than situational factors, thereby maintaining their positive attitudes toward the source.
... In addition to mapping the media environment, there is a need to be proactive in bringing together findings from across disciplines. Social psychology provides valuable research in decision-making, particularly how we justify choices even when we are aware they are wrong, who is most likely to overestimate competence (Kruger & Dunning, 1999;Johansson, et al., 2005), and how our minds prefer intuitive "gut feelings" over analytic thinking (Schwarz & Newman, 2017). Political science work in how we justify partisan positions, motivated reasoning (Kahne & Bowyer, 2017), how our unconscious reactions to visual cues make us judgmental of those who hold different opinions (Dodd, Hibbing, & Smith, 2016), and how rumors spread and become part of our values and beliefs (Berinksky, 2015) offer insights into mechanisms driving choices and promising points of intervention. ...
... For example, a picture may help us visualize and imagine a question or statement. This can lead to an expected feeling of ease while processing the question or statement that is often associated with truth (Cardwell, Lindsay, Förster, & Garry, 2017;Schwarz & Newman, 2017). An unrelated photo could cause people to visualize conflicting information, making processing the statement more difficult than expected. ...
Article
Non‐probative but related photos can increase the perceived truth value of statements relative to when no photo is presented (truthiness ). In 2 experiments, we tested whether truthiness generalizes to credibility judgements in a forensic context. Participants read short vignettes in which a witness viewed an offence. The vignettes were presented with or without a non‐probative, but related photo. In both experiments, participants gave higher witness credibility ratings to photo‐present vignettes compared to photo‐absent vignettes. In Experiment 2, half the vignettes included additional non‐probative information in the form of text. We replicated the photo presence effect in Experiment 2, but the non‐probative text did not significantly alter witness credibility. The results suggest that non‐probative photos can increase the perceived credibility of witnesses in legal contexts. This article is protected by copyright. All rights reserved.
Article
Full-text available
** Note: This post includes the text accepted for publication, which was subsequently highly copy-edited to fit the magazine format of the journal. ** Erroneous beliefs are difficult to correct. Worse, popular correction strategies may backfire and further increase the spread and acceptance of misinformation. People evaluate the truth of a statement by assessing its compatibility with other things they believe, its internal consistency, amount of supporting evidence, acceptance by others, and the credibility of the source. To do so, they can draw on relevant details (an effortful analytic strategy) or attend to the subjective experience of processing fluency (a less effortful intuitive strategy). Throughout, fluent processing facilitates acceptance of the statement – when thoughts flow smoothly, people nod along. Correction strategies that make false information more fluent (e.g., through repetition or pictures) can therefore increase its later acceptance. We review recent research and offer recommendations for more effective correction strategies.,
Chapter
Feelings-as-information theory conceptualizes the role of subjective experiences – including moods, emotions, metacognitive experiences, and bodily sensations – in judgment. It assumes that people attend to their feelings as a source of information, with different feelings providing different types of information. Whereas feelings elicited by the target of judgment provide valid information, feelings that are due to an unrelated influence can lead us astray. The use of feelings as a source of information follows the same principles as the use of any other information. Most important, people do not rely on their feelings when they (correctly or incorrectly) attribute them to another source, thus undermining their informational value for the task at hand. What people conclude from a given feeling depends on the epistemic question on which they bring it to bear; hence, inferences from feelings are context-sensitive and malleable. In addition to serving as a basis of judgment, feelings inform us about the nature of our current situation, and our thought processes are tuned to meet situational requirements. The chapter reviews the development of the theory, its core propositions, and representative findings.
Article
When people make judgments about the truth of a claim, related but nonprobative information rapidly leads them to believe the claim, an effect called "truthiness" [1]. Would the pronounceability of others' names also influence the truthiness of claims attributed to them? We replicated previous work by asking subjects to evaluate people's names on a positive dimension, and extended that work by asking subjects to rate those names on negative dimensions. Then we addressed a novel theoretical issue by asking subjects to read that same list of names and judge the truth of claims attributed to them. Across all experiments, easily pronounced names trumped difficult names. Moreover, the effect of pronounceability produced truthiness for claims attributed to those names. Our findings are a new instantiation of truthiness, and they extend research on the truth effect as well as persuasion by showing that subjective, tangential properties such as ease of processing can matter when people evaluate information attributed to a source.
Article
Although most experts agree that CO2 emissions are causing anthropogenic global warming (AGW), public concern has been declining. One reason for this decline is the 'manufacture of doubt' by political and vested interests, which often challenge the existence of the scientific consensus. The role of perceived consensus in shaping public opinion is therefore of considerable interest: in particular, it is unknown whether consensus determines people's beliefs causally. It is also unclear whether perception of consensus can override people's 'worldviews', which are known to foster rejection of AGW. Study 1 shows that acceptance of several scientific propositions--from HIV/AIDS to AGW--is captured by a common factor that is correlated with another factor that captures perceived scientific consensus. Study 2 reveals a causal role of perceived consensus by showing that acceptance of AGW increases when consensus is highlighted. Consensus information also neutralizes the effect of worldview.
Article
The widespread prevalence and persistence of misinformation in contemporary societies, such as the false belief that there is a link between childhood vaccinations and autism, is a matter of public concern. For example, the myths surrounding vaccinations, which prompted some parents to withhold immunization from their children, have led to a marked increase in vaccine-preventable disease, as well as unnecessary public expenditure on research and public-information campaigns aimed at rectifying the situation. We first examine the mechanisms by which such misinformation is disseminated in society, both inadvertently and purposely. Misinformation can originate from rumors but also from works of fiction, governments and politicians, and vested interests. Moreover, changes in the media landscape, including the arrival of the Internet, have fundamentally influenced the ways in which information is communicated and misinformation is spread. We next move to misinformation at the level of the individual and review the cognitive factors that often render misinformation resistant to correction. We consider how people assess the truth of statements and what makes people believe certain things but not others. We look at people's memory for misinformation and answer the questions of why retractions of misinformation are so ineffective in memory updating and why efforts to retract misinformation can even backfire and, ironically, increase misbelief. Though ideology and personal worldviews can be major obstacles for debiasing, there nonetheless are a number of effective techniques for reducing the impact of misinformation, and we pay special attention to these factors that aid in debiasing. We conclude by providing specific recommendations for the debunking of misinformation. These recommendations pertain to the ways in which corrections should be designed, structured, and applied in order to maximize their impact. Grounded in cognitive psychological theory, these recommendations may help practitioners – including journalists, health professionals, educators, and science communicators – design effective misinformation retractions, educational tools, and public-information campaigns.