Running head: GETTING PAST POST TRUTH
Letting the Gorilla Emerge From the Mist: Getting Past Post-Truth
Stephan Lewandowsky
University of Bristol and University of Western Australia
Ullrich K. H. Ecker
University of Western Australia
John Cook
George Mason University
Word count: XXXX excluding references (approximate count due to use of LaTeX)
School of Experimental Psychology and Cabot Institute
University of Bristol
12a Priory Road
Bristol BS8 1TU, United Kingdom
Abstract
We welcome the nine constructive and insightful commentaries on our target article. The commentaries proposed a number of creative, evidence-based applications of the principles we put forward. Here we identify common themes among the commentaries, including one relating to the political intentionality underlying much disinformation that we had only partially addressed and that had thus remained shrouded in mist. We synthesize the suggestions from the commentaries into a proposal that may help overcome the post-truth malaise, provided a final obstacle can be overcome. This obstacle is the gorilla in the room: Policy making in the United States is largely independent of the public's wishes and instead serves the interests of economic elites.
Letting the Gorilla Emerge From the Mist: Getting Past Post-Truth
At the time of this writing, questions about whether a hypothetical new strain of avian flu is contagious to humans would be resolved by medical research. Although the reliance on expertise and science in such matters appears obvious, it cannot be taken for granted: Our target article (Lewandowsky, Ecker, & Cook, 2017; LEC from here on) raised
the spectre of a dystopian “post-truth” future in which questions about viruses or the
laws of physics are resolved not by “elitist” experts but by an opinion market on Twitter.
Although this possible future is still ﬁctional, we argued that we already live in a
“post-truth” era in which people’s misconceptions can no longer be considered an isolated
failure of individual cognition that can be corrected with appropriate communication
tools. Instead, we argued that any response to the post-truth era must recognize the
presence of widespread alternative epistemological communities that defy conventional
standards of evidence. In those communities, climate change is seen as a hoax perpetrated
by corrupt scientists, the Democratic Party traffics children for sex out of the basement of a pizza parlor in Washington, D.C., and NASA is operating a slave colony on Mars.
Because those alternative epistemologies arguably arose as a consequence of societal
mega-trends, such as growing inequality or the decline of social capital, we suggested that
solutions to the post-truth crisis must also look beyond individual cognition. We proposed
one avenue forward, based on the blending of insights from cognitive science with
technology, an approach we called “technocognition.”
Table 1 summarizes the nine commentaries on LEC and identifies the codes that we use
to refer to individual contributions from here on. We structure our response around the
main themes that emerged from the commentaries.
Cognition by the People and of the People
Virtually all commentaries support our contention that the “post-truth” world is
best understood as a phenomenon that goes beyond individual cognition and instead
requires some form of collective analysis and understanding. Seifert put this elegantly:
“The problem of misinformation ‘in the head,’ where individuals struggled to maintain
inconsistent facts in memory, has been replaced by a problem of misinformation 'in the world,' where inconsistent information exists across individuals, cultures, and societies.
Now, misinformation can be so ‘good,’ it is presented simply because it should be true
. . . ” (S, p. x).
There were, however, dissenting voices. At the most divergent end, RD focused entirely on individual-level cognition and made several helpful suggestions about how individual-level processes can be harnessed for corrective efforts. We return to those suggestions later, but like
the remaining commentators, we believe that the full social context must be considered
before we can tackle processes based on individual cognition.
VB endorse our contextual approach but caution that the picture about
politically-asymmetric susceptibility to misinformation is far from clear. VB acknowledge
that there are some studies—which we cited; for example Pfattheicher and Schindler
(2016)—that suggest that conservatives are more susceptible to being misled than liberals.
However, they note that other studies show the opposite (e.g., Bakshy, Messing, &
Adamic, 2015). We agree that the issue is not fully settled. For example, there are some
potential inconsistencies between the ﬁnding that conservatives exhibit greater credulity
for information about hazards than liberals (Fessler, Pisor, & Holbrook, 2017) on the one
hand, and the well-established “white male” eﬀect, which shows that white men (and in
particular conservatives) downplay a number of risks (D. M. Kahan, Braman, Gastil,
Slovic, & Mertz, 2007), on the other. We are, however, quite confident that the rejection of scientific
ﬁndings is mainly focused on the political right: the preponderance of survey and public
opinion data supports this conclusion (Lewandowsky & Oberauer, 2016).¹ We are equally
conﬁdent that overall, there is asymmetry between left and right on a multitude of
cognitive variables (Jost, 2017), although it remains to be seen which of those variables
are most pertinent to the post-truth world.
Filter Bubbles or People Filtering?
Commentators generally saw our proposal for "technocognition", that is, the cognitively inspired design of information architectures that are more resilient to the spread of misinformation, as a useful contribution, although some expressed skepticism that it is sufficient to solve the crisis. For example, MD fear that
technocognition would be “insuﬃcient in countering systemic lies in the US” (MD, p. x),
and HJ are concerned that triggering people’s worldview defenses via technocognition
(e.g., automated fact checking) may be counter-productive. Perhaps the most strident
criticism was oﬀered by Garrett, who disagreed with our uncritical acceptance of the ideas
of echo chambers and ﬁlter bubbles (Pariser, 2011), and with the idea that
techno-cognitive approaches could serve to break down those echo chambers and broaden
ﬁlter bubbles. Garrett cites evidence that news audience fragmentation is, arguably, not as
great as is often assumed (Flaxman, Goel, & Rao, 2016). Indeed, some researchers argue
that face-to-face interactions are more segregated now than online news consumption
(Gentzkow & Shapiro, 2011). To the extent that there is online segregation, it is said to
be driven more by people's personal choices than by algorithms (Bakshy et al., 2015).
We accept that if exposure is used as a metric, the fractionation of the information
landscape may be less severe than some critics have feared. However, in line with
Garrett’s further comments, we believe that the crucial metric is engagement with
content. The appearance of an item in one’s Facebook news feed is of little consequence if
it is ignored—what matters is whether it is read and processed. When engagement rather
than exposure is considered, Garrett seems to be in agreement with us that the evidence
for echo chambers is robust (Schmidt et al., 2017; Zollo et al., 2017).
The debate about whether exposure or engagement is the correct metric with which
to approach echo chambers is not a mere intellectual curiosity. As Garrett notes, if
exposure deﬁned echo chambers, then a diﬀerent form of technocognition would be needed
to dilute them than if fractionation arose from engagement instead. He proceeds to
propose a technocognitive approach aimed at engagement, namely modiﬁcations to
Google's search algorithm so that it is sensitive not just to popularity but also to the
accuracy of information. We agree, and we take up Garrett’s suggestion in our concluding
section that synthesizes the commentaries into a way forward.
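To make the proposal concrete, the sketch below illustrates one way an accuracy signal could be blended into result ranking. It is our illustration rather than Garrett's (or Google's) design; the scoring function, the weight, and the normalized popularity and accuracy values are all hypothetical assumptions.

```python
# Minimal sketch of accuracy-aware re-ranking (all scores and weights are hypothetical).
from dataclasses import dataclass

@dataclass
class Result:
    url: str
    popularity: float  # normalized engagement/popularity signal, 0..1
    accuracy: float    # normalized score from an automated fact-checking tool, 0..1

def rerank(results, accuracy_weight=0.5):
    """Order results by a weighted blend of popularity and estimated accuracy."""
    def score(r):
        return (1 - accuracy_weight) * r.popularity + accuracy_weight * r.accuracy
    return sorted(results, key=score, reverse=True)

if __name__ == "__main__":
    results = [
        Result("https://example.org/viral-rumour", popularity=0.9, accuracy=0.1),
        Result("https://example.org/fact-checked-report", popularity=0.4, accuracy=0.95),
    ]
    for r in rerank(results):
        print(r.url)
    # With accuracy_weight=0.5 the fact-checked report outranks the viral rumour.
```

Whether such a weight should exist at all, and who gets to set it, is precisely the kind of democratic design choice taken up below (Bozdag & van den Hoven, 2015).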
Demisting the Gorilla
The focus of LEC was on identifying candidate societal trends that may have
contributed to the emergence of a "post-truth" world.² We identified six trends: (1) The
decline of social capital, such as trust in institutions and civic engagement (e.g., Aldrich &
Meyer, 2015), accompanied by increasing social isolation (Sander & Putnam, 2010). (2)
Growing inequality (Sommeiller, Price, & Wazeter, 2016) with its manifold adverse
consequences (Wilkinson & Pickett, 2009), including political polarization (Garand, 2010)
and widespread discontent (Alesina & Perotti, 1996). (3) Increasing political polarization,
with levels of mutual animosity that can now exceed aﬀective polarization over race
(Iyengar, Van den Bulte, & Valente, 2011). (4) Declining trust in science among
conservatives (but not liberals; Gauchat, 2012). (5) An asymmetric credulity for
misinformation that is greater among people on the political right than the left (e.g.,
Pfattheicher & Schindler, 2016). (6) The increasing fractionation of the media landscape
and the opportunity for "echo chambers" and "filter bubbles" it affords (e.g., Pariser, 2011).
We acknowledged that this selection was largely arbitrary. We therefore do not
insist that our selection of trends is exhaustive—on the contrary, we merely wanted to
stimulate a growing conversation about the importance of the larger context in which
individual cognition, such as responding to misinformation, takes place. In this spirit, we
now oﬀer a further mega-trend that we believe to be important and that we only became
aware of after LEC was written. Santos, Varnum, and Grossmann (2017) provided
comprehensive cross-national evidence for a global increase in individualism, measured in
both attitudes and practices, during the last 30-40 years. Individualism is “a view of the
self as self-directed, autonomous, and separate from others”, and it stands in contrast to
collectivism, which refers to an “interconnected view of the self that overlaps with close
others, with individuals’ thoughts, feelings, and behaviors embedded in social contexts”
(Santos et al., 2017, p. 3). There is much evidence that individualism is a principal driver
of the rejection of several well-established scientiﬁc propositions, foremost among them
climate change (see, e.g., D. Kahan, 2016). The increasing global individualism might
therefore be reason for concern, not because individualism is inherently “wrong”, but
because it provides a fertile milieu for the rejection of at least some evidence-based propositions.

The enumeration of societal trends, however, can only take us so far towards
understanding the emergence of the post-truth world. This limitation was recognized by
several commentators who did not hesitate to highlight the instrumental background to
the post-truth malaise that we only hinted at: The post-truth crisis is not some random
natural (or societal) calamity, but it has been carefully curated and stoked by political
operatives and vested interests. MD are perhaps most outspoken when they point to the
“intentional promotion of misinformation in the powerful conservative echo chamber,
ranging from the conspiracy theories of Infowars and Rush Limbaugh to the consistent lies
and exaggerations about liberal politicians and Democratic candidates spread on Fox
News, Breitbart, and talk radio” (MD, p. x). The importance of disinformation was also
echoed by Garrett, Seifert, HJ, and WJ.
We agree entirely with those comments, notwithstanding their inevitable political
overtones. As we noted in LEC, “science sometimes cannot help but be political: for
example, the potential political fallout must not deter medical researchers from
determining, and then publicly articulating, that smoking causes lung cancer”
(Lewandowsky et al., 2017, p. x). Indeed, in the fast-moving world in which we are
writing, we would now consider it ethically problematic to ignore or withhold relevant
evidence about political developments. For example, we must not ignore the evidence that
the "Tea Party" in the U.S. was not a spontaneous eruption of "grassroots" opposition to former President Obama's healthcare initiative but the result of long-standing efforts by
libertarian and conservative “think tanks” and political operatives (Mayer, 2016). We
must not be blind to the fact that Donald Trump learned his trade from Sen. Joe
McCarthy's chief counsel, who was the brains behind the paranoid hunt for communist
inﬁltrators in the 1950s (O’Harrow & Boburg, 2016). We must recognize that the
xenophobia and Islamophobia (Swami, Barron, Weis, & Furnham, 2017) that contributed to the Brexit vote were fostered by the U.K. tabloid press. Four outlets, The Sun, Daily Mail, Daily Express, and Daily Star, ran more than 8,000 stories about asylum
seekers—many of them inﬂammatory—in the period from 2000–2006 alone, with more
than 1,400 of those articles using the terms “immigrant” and “asylum-seeker”
interchangeably (Colville, 2016).
We therefore conclude with a synthesis of the recommendations in the commentaries that does not shy away from recognizing the political context in which we live.
We agree with MD that it is essential to diﬀerentiate between diﬀerent types of
misinformation. We ﬁnd their classiﬁcation of misinformation into four distinct categories
helpful, although we are less convinced by MD’s attempts to arrange those four categories
within the two axes of a two-dimensional space. For example, it is diﬃcult to see why
“truthiness”, which MD deﬁne as an “emotional, non-cognitive form of radical
constructivism” (MD, p. x), is located at the “strong realism” end of an axis whose
opposite pole is “strong constructivism”.
We also propose that MD’s classiﬁcation would beneﬁt from the addition of a
further category, which we call “paltering”. Paltering is a technique of deception that
stops short of literal falsity (Schauer & Zeckhauser, 2009). For example, rhetorical claims
made to deny climate change (e.g., “sea levels have fallen in the last 2 years”) are often
literally true but nonetheless highly deceptive (because sea level rise continues unabated
notwithstanding small ﬂuctuations about the trend line). The deceptions are readily
revealed when the full context is provided (Lewandowsky, Ballard, Oberauer, & Benestad,
2016). Our proposed paltering category largely overlaps with the concern raised by WJ
about subtle, slanted misinformation that deﬁes identiﬁcation by automated tools.
Table 2 summarizes the resulting five categories of misinformation and how they might be
met. The putative countermeasures in the table draw on proposals from all commentaries.
Those countermeasures are to be understood as additions to conventional fact checking
and public corrections, which must continue—if only for obvious ethical
reasons—notwithstanding their limited eﬃcacy.
BC’s commentary focused on the importance of the elites—that is, politicians,
media organizations and opinion leaders, think tanks, and so on—in creating the
post-truth problem. In support, Brulle, Carmichael, and Jenkins (2012) found that shifts
in U.S. public attitudes on climate change (from 2002 to 2010) were largely driven by elite
cues, in particular the Republican leadership’s withdrawal from the scientiﬁc consensus.
Fortunately, there is evidence that elites are sensitive to being held to account. In an
elegant experiment involving state legislators, Nyhan and Reiﬂer (2015) showed that
legislators are sensitive to the reputational consequences of questionable public
statements. A randomly selected group of legislators were sent letters about the risks to
their reputation and electoral chances if their public statements were identiﬁed as being
questionable. This group was substantially less likely to make inaccurate public
statements in an ensuing election than legislators in a control condition who were not
threatened with the consequences of fact checking. Holding elites to account—or the mere
threat of such accountability—therefore demonstrably works in at least some
circumstances. However, this technique is unlikely to be eﬀective in situations where the
intent is to disrupt or to create an alternative epistemological community (Shock & chaos
and Truthiness in Table 2).
Garrett suggests that the solution to post-truth must involve a new set of gatekeepers that can replace the editorial control that used to be exercised by newspaper
editors. We agree that such automated gate-keeping—e.g., the introduction of a
factual-accuracy component to Google searches—may be necessary in order to deal with
many forms of misinformation, foremost among them Bullshit and Shock & Chaos.
However, the design of any such automated tool must be sensitive to the conception of democracy it entails. In a thoughtful analysis, Bozdag and van den Hoven (2015) showed
how diﬀerent technological tools are tacitly built on diﬀerent conceptions of democracy.
For example, tools that allow users control over incoming information and ﬁlters are
tacitly built on the idea of a liberal democracy, whereas tools that seek to increase the
epistemic quality and breadth of information are endorsing a deliberative conception of
democracy. Seifert’s list of sites that can help users step outside their ﬁlter bubble (e.g.,
“Escape Your Bubble”; https://www.escapeyourbubble.com/, “Red Feed, Blue Feed”;
https://flipfeed.media.mit.edu/, or “PolitEcho”; http://politecho.org/) is
therefore steeped in a deliberative view of democracy.
It is likely that automated gate-keeping and other techno-cognitive tools will be
particularly challenged by Paltering and Truthiness, and to a lesser extent, by Systemic
lies: Common to those forms of misinformation is their ostensible commitment to realism.
Climate denial, for example, typically masquerades as “pro-science” skepticism and paints
the actual science of climate change as being “corrupt” or “post-modern.” It is possible
that those carefully-crafted forms of misinformation will require continued human
debunking as well as increased media literacy.
The idea of media and information literacy was central to the commentaries by VB
and MY. We agree that media literacy, that is, the public's ability to discern reliable from
unreliable information, should be foregrounded in education. Encouraging results have
been obtained in classroom settings (e.g., Walton & Hepworth, 2011). It is also
encouraging that greater knowledge about the news media has been found to be associated
with a reduced propensity to endorse various conspiracy theories, even when those theories
were aligned with participants’ political worldviews (Craft, Ashley, & Maksl, 2017).
However, information literacy is a nuanced concept and is unlikely to be a panacea.
MY make the valuable point that literacy skills must extend beyond merely evaluating
source credibility. MY correctly assert that “typical cues for credibility have been
hijacked, making source evaluation increasingly diﬃcult” (MY, p. x), as exempliﬁed by
middle school students’ inability to identify an item as advertising when it was presented
as “sponsored content.” Nonetheless, several recent studies have shown that people can be
“inoculated” against misinformation if they are informed of speciﬁc disinformation
techniques ahead of time (Cook, Lewandowsky, & Ecker, 2017; van der Linden,
Leiserowitz, Rosenthal, & Maibach, 2017).
Enhancing media and information literacy is not without its own problems,
however. If media literacy becomes focused on “critical thinking” alone, it may
inadvertently make people more cynical and less trustful of media and institutions overall
(Mihailidis & Viotty, 2017). We therefore embrace the suggestion by MY that people
ought to be assisted in the recognition of weak arguments, irrespective of a source’s
credibility. One particularly weak form of argumentation rests on incoherence. We have
recently shown that the rhetoric of climate denial is inherently incoherent (Lewandowsky,
Cook, & Lloyd, 2016). It remains to be seen how readily people can be taught to
recognize incoherence (e.g., “Global temperature cannot be measured with any degree of
accuracy. It is quite clear that the Earth hasn’t warmed in the last 5 years.”). It also
remains to be seen whether detection of incoherence, in turn, has any eﬀect on people’s
susceptibility to denialist arguments. There are signs that such interventions may be
successful. RD pointed to studies showing that corrections can be effective when people
become aware that information is implausible (Hinze, Slaten, Horton, Jenkins, & Rapp,
2014), or when pithy explanations are presented to refute one view and aﬃrm another
(Ranney & Clark, 2016).
Information literacy may, however, be insuﬃcient to overcome the strong eﬀects of
worldview on people’s responses to misinformation and its correction. HJ recognize the
importance of worldview in information processing and in particular in the resistance to
correction of misconceptions (e.g., Nyhan & Reiﬂer, 2010). HJ also suggest that “we must
address both the misinformation and the worldviews leading to the acceptance of that
misinformation.” (HJ, p. x). This recommendation deserves to be carefully unpacked
because we doubt it would be advisable or permissible to exhort someone to be “less
conservative” or “more open-minded.” Political worldviews are an individual’s prerogative
and should not be targeted by corrective eﬀorts. However, when it comes to worldviews of
"not [valuing] empirical evidence and [...] not [trusting] scientists and other experts", we
believe that it is acceptable to educate people about the problems inherent in those views.
Science education and science engagement activities may change these worldviews and
increase trust in science.
In seeming opposition to our view, HJ “doubt it will be possible to sneak true but
inconsistent information past someone’s activated worldview” (HJ, p. x). However, this
mischaracterizes our position, which is that worldview and factual information need to be
decoupled from each other, for example by reframing an issue. HJ go on to propose that
worldviews are multi-faceted and contextualized (Oyserman & Schwarz, 2017; Unsworth &
Fielding, 2014), arguing that people “may activate diﬀerent identities depending on the
current situation, changing the way they interact with the world” (HJ, p. x). This aligns
quite well with our preferred approach to reframe an issue so diﬀerent aspects of a
worldview are triggered. In the case of climate change this might involve highlighting the
eﬀects of climate change (and mitigation) on public health (Maibach, Nisbet, Baldwin,
Akerlof, & Diao, 2010) or highlighting that the economic cost of mitigation is small
compared to projected future wealth increases (Hurlstone, Lewandowsky, Newell, & Sewell, 2014).

Finally, WJ advocate "the building of close relationships between science and
society, scientists and citizens, in order to produce outcomes of innovation that align with
societal goals and values.” This idea is meritorious, and indeed there are successful cases
of knowledge co-production between citizens and scientists (e.g., Whatmore & Landström, 2011), but the idea that tech giants such as Facebook would substantively amend their
algorithms to prioritize societal beneﬁt over proﬁts as a result of dialogue with citizens
appears overly optimistic to us. On the contrary, Facebook has recently expressed
opposition to German laws aimed at forcing tech giants to remove hate speech within 24
hours or face substantial fines (Shead, 2017), which brings us to the gorilla in the room.
What determines the outcome of political debates in the United States? A
quantitative analysis of the underlying variables was provided by Gilens and Page (2014)
based on an analysis of 1,779 policy issues decided by the U.S. government between 1981
and 2002. Figure 1 shows their results. The dashed gray line shows the likelihood of a
policy measure being adopted as a function of public opinion, and the solid black line
shows the likelihood of adoption as a function of economic elite opinion. If the public is
nearly united against a policy, it has a probability of adoption of around 30%. If the
public is nearly united in supporting a policy, that probability is also around 30%. By
contrast, if the economic elites oppose a measure, the probability of its adoption is near
zero, and it rises to 60% or more when the elites strongly support it. Gilens and Page
(2014) dryly conclude that “The results provide substantial support for theories of
Economic-Elite Domination and for theories of Biased Pluralism, but not for theories of
Majoritarian Electoral Democracy or Majoritarian Pluralism” (p. 564).
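As a back-of-the-envelope illustration of that asymmetry, the sketch below uses rough values read off the figure as described in the text; they are approximations for illustration, not Gilens and Page's underlying regression estimates.

```python
# Rough illustration of the asymmetry described in the text, using approximate
# probabilities read off Gilens and Page's (2014) figure (not their raw data).
approx_adoption_probability = {
    # (group, stance): approximate predicted probability of policy adoption
    ("public", "nearly united against"): 0.30,
    ("public", "nearly united in favour"): 0.30,
    ("economic elites", "strongly opposed"): 0.02,    # "near zero"
    ("economic elites", "strongly in favour"): 0.60,  # "60% or more"
}

def swing(group, low_key, high_key):
    """Change in adoption probability when a group moves from opposition to support."""
    return (approx_adoption_probability[(group, high_key)]
            - approx_adoption_probability[(group, low_key)])

print(swing("public", "nearly united against", "nearly united in favour"))  # ~0.0
print(swing("economic elites", "strongly opposed", "strongly in favour"))   # ~0.58
```

The contrast between the two printed values is the point: shifting public opinion barely moves the predicted outcome, whereas shifting elite opinion moves it substantially.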
In the present context, those data imply that any response to the post-truth malaise
that involves legislation or policy—as many techno-cognitive initiatives might—is unlikely
to be successful unless it is supported by economic forces. We are not aware of any support
for corrective legislation among the tech giants, such as Apple, Amazon, Facebook, and
Google. On the contrary, at the time of this writing, Donald Trump has co-opted the term
“fake news” in his attacks on the media, going so far as to claim that he invented the term
himself (Cillizza, 2017). This suggests that widespread public and academic concern about
the “post-truth” world is unlikely to result in any legislative change.
We are left with a better understanding of the post-truth world and how we got
here. We can also imagine solutions that are inspired by cognitive research and can be
combined with new technologies. All we need to do now is deal with the 800-pound
gorilla in the room—namely, a political system that is driven by the interests of economic
elites rather than the people.
References
Aldrich, D. P., & Meyer, M. A. (2015). Social capital and community resilience. American
Behavioral Scientist,59 , 254–269. doi: 10.1177/0002764214550299
Alesina, A., & Perotti, R. (1996). Income distribution, political instability, and investment. European Economic Review, 40, 1203–1228.
Bakshy, E., Messing, S., & Adamic, L. (2015). Exposure to ideologically diverse news and
opinion on Facebook. Science. doi: 10.1126/science.aaa1160
Boussalis, C., & Coan, T. G. (2017). Elite polarization and correcting misinformation in
the “post-truth era”. Journal of Applied Research in Memory and Cognition.
Bozdag, E., & van den Hoven, J. (2015). Breaking the filter bubble: Democracy and design. Ethics and Information Technology, 17, 249–265.
Brulle, R. J., Carmichael, J., & Jenkins, J. C. (2012). Shifting public opinion on climate change: An empirical assessment of factors influencing concern over climate change in the U.S., 2002–2010. Climatic Change, 114, 169–188.
Cillizza, C. (2017). Donald Trump just claimed he invented 'fake news'. CNN.
Colville, R. (2016). Words and images. UNHCR Refugees(142), 14–17.
Cook, J., Lewandowsky, S., & Ecker, U. K. H. (2017). Neutralizing misinformation
through inoculation: Exposing misleading argumentation techniques reduces their
inﬂuence. PLOS ONE,12 , e0175799. doi: 10.1371/journal.pone.0175799
Craft, S., Ashley, S., & Maksl, A. (2017). News media literacy and conspiracy theory
endorsement. Communication and the Public. doi: 10.1177/2057047317725539
Fessler, D. M. T., Pisor, A. C., & Holbrook, C. (2017). Political orientation predicts
credulity regarding putative hazards. Psychological Science,28 , 651–660.
Flaxman, S., Goel, S., & Rao, J. M. (2016). Filter bubbles, echo chambers, and online
news consumption. Public Opinion Quarterly ,80 , 298 – 320.
Garand, J. C. (2010). Income inequality, party polarization, and roll-call voting in the U.S.
Senate. The Journal of Politics,72 , 1109-1128. doi: 10.1017/S0022381610000563
Garrett, R. K. (2017). The “echo chamber” distraction: Disinformation campaigns are the
problem, not audience fragmentation. Journal of Applied Research in Memory and Cognition.
Gauchat, G. (2012). Politicization of science in the public sphere: A study of public trust
in the United States, 1974 to 2010. American Sociological Review,77 , 167–187.
Gentzkow, M., & Shapiro, J. M. (2011). Ideological segregation online and oﬄine.
Quarterly Journal of Economics,126 , 1799–1839. doi: 10.1093/qje/qjr044
Gilens, M., & Page, B. I. (2014). Testing theories of American politics: Elites, interest
groups, and average citizens. Perspectives on Politics,12 , 564–581.
Hinze, S. R., Slaten, D. G., Horton, W. S., Jenkins, R., & Rapp, D. N. (2014). Pilgrims
sailing the Titanic: Plausibility eﬀects on memory for misinformation. Memory &
Cognition,42 , 1–20. doi: 10.3758/s13421-013-0359-9
Hurlstone, M. J., Lewandowsky, S., Newell, B. R., & Sewell, B. (2014). The eﬀect of
framing and normative messages in building support for climate policies. PLOS
ONE,9, e114335. doi: 10.1371/journal.pone.0114335
Hyman, I. E., & Jalbert, M. C. (2017). Misinformation and worldviews in the post-truth information age: Commentary on Lewandowsky, Ecker, and Cook. Journal of Applied Research in Memory and Cognition.
Iyengar, R., Van den Bulte, C., & Valente, T. W. (2011). Opinion leadership and social contagion in new product diffusion. Marketing Science, 30, 195–212.
Jost, J. T. (2017). Ideological asymmetries and the essence of political psychology.
Political Psychology,38 , 167–208. doi: 10.1111/pops.12407
Kahan, D. (2016). The politically motivated reasoning paradigm. Emerging Trends in
Social & Behavioral Sciences, in press.
Kahan, D. M., Braman, D., Gastil, J., Slovic, P., & Mertz, C. K. (2007). Culture and
identity-protective cognition: Explaining the white-male eﬀect in risk perception.
Journal of Empirical Legal Studies,4, 465-505.
Kloor, K. (2012, September). GMO opponents are the climate skeptics of the left. Slate. Retrieved from http://www.slate.com/articles/health_and_science/science/2012/09/are_gmo_foods_safe_opponents_are_skewing_the_science_to_scare_people_.html (Accessed 29 September 2012)
Lewandowsky, S., Ballard, T., Oberauer, K., & Benestad, R. (2016). A blind expert test
of contrarian claims about climate data. Global Environmental Change,39 , 91–97.
Lewandowsky, S., Cook, J., & Lloyd, E. (2016). The ‘Alice in Wonderland’ mechanics of
the rejection of (climate) science: simulating coherence by conspiracism. Synthese.
Lewandowsky, S., Ecker, U. K. H., & Cook, J. (2017). Beyond misinformation:
Understanding and coping with the post-truth era. Journal of Applied Research in
Memory and Cognition, in press.
Lewandowsky, S., & Oberauer, K. (2016). Motivated rejection of science. Current
Directions in Psychological Science,25 , 217–222.
Maibach, E., Nisbet, M., Baldwin, P., Akerlof, K., & Diao, G. (2010). Reframing climate
change as a public health issue: an exploratory study of public reactions. BMC
Public Health,10 , 299.
Marsh, E. J., & Yang, B. W. (2017). A call to think broadly about information literacy: Comment on "Beyond misinformation: Understanding and coping with the post-truth era" by Lewandowsky, Ecker, & Cook. Journal of Applied Research in Memory and Cognition.
Mayer, J. (2016). Dark money: The hidden history of the billionaires behind the rise of
the radical right. Scribe Publications.
McCright, A. M., & Dunlap, R. E. (2017). Combatting misinformation requires
recognizing its types and the factors that facilitate its spread and resonance. Journal
of Applied Research in Memory and Cognition.
Mihailidis, P., & Viotty, S. (2017). Spreadable spectacle in digital culture: Civic
expression, fake news, and the role of media literacies in post-fact society. American
Behavioral Scientist,61 , 441–454. doi: 10.1177/0002764217701217
Mooney, C. (2011, June). The science of why we don't believe science. Mother Jones. (Accessed 21 December 2011)
Nyhan, B., & Reiﬂer, J. (2010). When corrections fail: The persistence of political
misperceptions. Political Behavior,32 , 303–330.
Nyhan, B., & Reifler, J. (2015). The effect of fact-checking on elites: A field experiment on U.S. state legislators. American Journal of Political Science, 59, 628–640.
O'Harrow, R. O., & Boburg, S. (2016). The man who showed Donald Trump how to exploit power and instill fear. The Washington Post.
Oyserman, D., & Schwarz, N. (2017). Conservatism as a situated identity: Implications for consumer behavior. Journal of Consumer Psychology, 27, 532–536.
Pariser, E. (2011). The filter bubble: What the internet is hiding from you. New York: Penguin Press.
Pfattheicher, S., & Schindler, S. (2016). Misperceiving bullshit as profound is associated
with favorable views of Cruz, Rubio, Trump and conservatism. PLoS ONE,11 ,
e0153419. doi: 10.1371/journal.pone.0153419
Ranney, M. A., & Clark, D. (2016). Climate change conceptual change: Scientific information can transform attitudes. Topics in Cognitive Science, 8, 49–75.
Rapp, D. N., & Donovan, A. M. (2017). Routine processes of cognition result in routine influences of inaccurate content. Journal of Applied Research in Memory and Cognition.
Sander, T. H., & Putnam, R. D. (2010). Still bowling alone?: The post-9/11 split.
Journal of Democracy,21 , 9–16.
Santos, H. C., Varnum, M. E., & Grossmann, I. (2017). Global increases in individualism. Psychological Science.
Schauer, F., & Zeckhauser, R. (2009). Paltering. In B. Harrington (Ed.), Deception: From ancient empires to internet dating (pp. 38–54). Stanford, CA: Stanford University Press.
Schmidt, A. L., Zollo, F., Del Vicario, M., Bessi, A., Scala, A., Caldarelli, G., . . . Quattrociocchi, W. (2017). Anatomy of news consumption on Facebook. Proceedings of the National Academy of Sciences, 114, 3035–3039.
Seifert, C. M. (2017). The distributed inﬂuence of misinformation. Journal of Applied
Research in Memory and Cognition.
Shead, S. (2017). Facebook said Germany's plan to tackle fake news would make social media companies delete legal content. Business Insider.
Shermer, M. (2013, January). The liberals' war on science. Scientific American. Retrieved from http://www.scientificamerican.com/article.cfm?id=the-liberals-war-on-science (Accessed 28 January 2013)
Sommeiller, E., Price, M., & Wazeter, E. (2016). Income inequality in the US by state,
metropolitan area, and county (Tech. Rep.). Economic Policy Institute.
Swami, V., Barron, D., Weis, L., & Furnham, A. (2017). To Brexit or not to Brexit: The
roles of Islamophobia, conspiracist beliefs, and integrated threat in voting intentions
for the United Kingdom European Union membership referendum. British Journal
of Psychology. doi: 10.1111/bjop.12252
Unsworth, K. L., & Fielding, K. S. (2014). It’s political: How the salience of one’s
political identity changes climate change beliefs and policy support. Global
Environmental Change,27 , 131–137.
van der Linden, S., Leiserowitz, A., Rosenthal, S., & Maibach, E. (2017). Inoculating the
public against misinformation about climate change. Global Challenges,1, 1600008.
Vraga, E. K., & Bode, L. (2017). Leveraging institutions, educators, and networks to correct misinformation: A commentary on Lewandowsky, Ecker, and Cook. Journal of Applied Research in Memory and Cognition.
Walton, G., & Hepworth, M. (2011). A longitudinal study of changes in learners’ cognitive
states during and following an information literacy teaching intervention. Journal of
Documentation,67 , 449–479.
Webb, H., & Jirotka, M. (2017). Commentary on "Beyond misinformation: Understanding and coping with the post-truth era". Journal of Applied Research in Memory and Cognition.
Whatmore, S. J., & Landström, C. (2011). Flood apprentices: An exercise in making things public. Economy and Society, 40, 582–610.
Wilkinson, R., & Pickett, K. (2009). The spirit level: Why more equal societies almost always do better. London: Allen Lane.
Zollo, F., Bessi, A., Del Vicario, M., Scala, A., Caldarelli, G., Shekhtman, L., . . .
Quattrociocchi, W. (2017). Debunking in a world of tribes. PLOS ONE ,12 ,
e0181821. doi: 10.1371/journal.pone.0181821
Author Note
Preparation of this paper was facilitated by a Wolfson Research Merit Award from
the Royal Society to the ﬁrst author and a Discovery Grant from the Australian Research
Council to the third and ﬁrst author. Address correspondence to the ﬁrst author at the
Department of Experimental Psychology and Cabot Institute, University of Bristol, 12a
Priory Road, Bristol BS8 1TU, United Kingdom. email:
firstname.lastname@example.org. Personal web page: http://www.cogsciwa.com.
Footnotes
¹ VB suggest that the selection of scientific issues we put forward was biased against
conservatives. This is not the case. At least two of the issues, vaccinations and
genetically-modiﬁed organisms, had been anecdotally—but erroneously—thought to be
subject to denial by the political left (Kloor, 2012; Mooney, 2011; Shermer, 2013).
² Our analysis was mainly focused on the United States and it remains to be seen
how many of those trends are also manifest elsewhere.
Table 1
Summary of commentaries on the target article by Lewandowsky, Ecker, & Cook (2017)

Citation | Code | Synopsis
Boussalis & Coan (2017) | BC | The role of elites was crucial in creating the problem but will also be crucial in solving it.
Garrett (2017) | G | Disinformation campaigns, not echo chambers, are the problem.
Hyman & Jalbert (2017) | HJ | We must address the worldviews that lead to the acceptance of misinformation.
Marsh & Yang (2017) | MY | We must foster information literacy.
McCright & Dunlap (2017) | MD | Misinformation is intentionally promoted by a powerful conservative echo chamber.
Rapp & Donovan (2017) | RD | Drawing attention to implausibility or providing refutation-based explanations can correct misinformation.
Seifert (2017) | S | The problem of misinformation used to be "in the head" but it is now "in the world."
Vraga & Bode (2017) | VB | Media literacy training must be taken out of the classroom.
Webb & Jirotka (2017) | WJ | We need to understand the different types of misinformation.
Table 2
Categories of misinformation (1–4 are taken from MD's classification) and potential countermeasures

Category | Synopsis | Countermeasure
Truthiness | Emotional constructivism (e.g., Sean ...) | Dilution of echo chambers
Bullshit | Cavalier disrespect for reason (e.g., ...) | Dilution of echo chambers, ...
Systemic lies | Carefully curated campaigns in support of an agenda (e.g., climate denial) | Automatic detection tools, ...
Shock & chaos | Misinformation intended to destabilize social relations and societal institutions (e.g., Russian Twitter bots) | Automatic detection tools, ...
Paltering | Deception that avoids falsity (e.g., climate denial, politicians) | ...
Figure 1. Predicted probability of policy adoption in the United States as a function of
opinion among citizens (dashed line) and economic elites (solid line). Data from Gilens &
Page (2014). See text for details.
[Figure 1: predicted probability of adoption (y-axis) as a function of the percent favoring the proposed policy change (x-axis), with separate curves for average citizens' preferences and economic elites' preferences.]