Journal of Applied Research in Memory and Cognition 6 (2017) 389–396
Commentary
Combatting Misinformation Requires Recognizing Its Types and the Factors That Facilitate Its Spread and Resonance
Aaron M. McCright, Michigan State University, United States∗
Riley E. Dunlap, Oklahoma State University, United States

Author Note: For their respect for facts, dedication to truth, and pursuit of justice, we thank Eric Schneiderman and Robert Mueller, as well as Mueller's all-star team of Zainab Ahmad, Greg Andres, Rush Atkinson, Peter Carr, Michael Dreeben, Kyle Freeny, Andrew D. Goldstein, Adam Jed, Lisa C. Page, Elizabeth Barchas Prelogar, James L. Quarles III, Jeannie Rhee, Brandon Van Grack, Andrew Weissmann, Aaron Zebley, and Aaron Zelinsky.

∗Correspondence concerning this article should be addressed to Aaron M. McCright, Michigan State University, United States. Contact: mccright@msu.edu
As sociologists who have studied organized climate change denial and the political polarization on anthropogenic climate change that it has produced in the US since the late 1990s (Dunlap, McCright, & Yarosh, 2016; McCright & Dunlap, 2000), we have closely followed the work of Lewandowsky and his collaborators over the years. Like them, we have observed how the "climate change denial countermovement" (Dunlap & McCright, 2015)[1] has employed the strategy of manufacturing uncertainty—long used by industry to undermine scientific evidence of the harmful effects of products ranging from asbestos to DDT and especially tobacco smoke (Michaels, 2008; Oreskes & Conway, 2010)—to turn human-caused climate change into a controversial issue in contemporary American society. And, like Lewandowsky, Ecker, and Cook (2017) in "Beyond Misinformation," we view these past efforts as key contributors to the present situation in which pervasive misinformation has generated "alternative facts," pseudoscience claims, and real "fake news"—a "post-truth era" indeed.[2]

[1] We also refer to this as the "climate change denial machine" (Dunlap & McCright, 2011).

[2] It is no coincidence that the Oxford Dictionaries named "post-truth" as their 2016 Word of the Year.
The current state of affairs has provoked much consternation among academics and journalists in the US and beyond. For example, scholars have organized initiatives (e.g., the University of Sydney's Post-Truth Initiative), conferences (e.g., "The Press and the Presidency in the Post-Truth Era" at the University of Nebraska, Lincoln), workshops (e.g., "Seeking Truth in a 'Post-Truth' World" at Bethel University), speaker series (e.g., "Media and Politics in a Post-Truth Era" at Furman University), and courses (e.g., "Calling Bullshit" at the University of Washington) to interrogate misinformation in the present era. Several recent books (e.g., Tom Nichols's The Death of Expertise in 2017) try to explain the routine disrespect of facts and declining authority of science across society. Popular periodicals feature cover stories or special issues devoted to the conspiracy theories and alternative facts that contribute to America's "post-truth moment" (e.g., September 10, 2016 issue of The Economist and September 2017 issue of The Atlantic). And, of course, journalists struggle daily to make sense of a sitting US President routinely dismissing stories, journalists, and entire outlets as "fake news."
Within this context, we widen our scope beyond climate change denial to discuss misinformation more generally and, in doing so, offer a sociological response to Lewandowsky et al. (2017), aimed at complementing and extending their analysis.
We broadly agree with most of what Lewandowsky et al. (2017, p. 4) write, and as sociologists are especially pleased to see them emphasize the "larger political, technological, and societal context" in which misinformation has evolved and must be addressed. Nonetheless, we are skeptical of the efficacy of their "technocognition" approach for combating the growth of misinformation in the US, as will become clear.
Before we briefly identify the factors that amplify some types of misinformation more than others later in our essay, we devote most of our attention to presenting a conceptual framework for describing key types of misinformation by selected characteristics of their messengers. While all misinformation can be problematic, some types seem more consequential and may be more challenging to combat than others. That is, strategies that may prove effective at countering or neutralizing one type of misinformation may not work against others (and may even backfire)—especially when we consider the particular combinations of social, political, and economic factors that facilitate the differing types of misinformation. Given this, and mindful of space limitations, our final section offers a few suggestions for future work.
Key Types of Misinformation
We can arrange the conceptual space of misinformation along two dimensions. One is a messenger's ontological position on truth and facts, which ranges from strong realism (i.e., acceptance that truths exist external to your mind and a respect for facts) to strong constructivism (i.e., agnosticism about or even disbelief in the existence of external truths and a disrespect of facts). The other dimension is a messenger's typical rhetorical style and primary audience, which ranges from an informal, conversational style directed toward people's daily lives (i.e., lifeworlds) to a formal, persuasive style aimed at institutions and systems. Combining these two dimensions produces four ideal-types of misinformation: truthiness, bullshit, systemic lies, and shock-and-chaos (see Figure 1). As with all models and frameworks, ours simplifies reality for the purpose of presentation and interpretation. In actuality, the boundaries between quadrants are porous, and some messages may feature multiple types of misinformation simultaneously—depending on the audience, context, and life course of the message.
Truthiness occupies the top left quadrant in Figure 1. The intellectual foundations of truthiness are found in popular (mis)readings of the works of French postmodern philosophers and British science and technology studies (STS) scholars since the 1960s.[3] Briefly, these academics aimed to challenge and "deconstruct" the hegemonic power structure of Western science, which has supported patriarchal capitalism and white supremacist colonialism since the Enlightenment. In pursuing a Leftist political agenda of critiquing the political and moral authority of Western science, empowering historically marginalized peoples, and legitimizing indigenous knowledge, they argued that what we consider as scientific facts and knowledge are essentially the result of ongoing social processes of negotiation among many claims-makers, none of whom have privileged access to complete truth. Yet, popular (mis)readings of their works, which have mobilized troops in the so-called "science wars," have led many people to (mis)interpret their key argument as promoting an extreme relativism whereby all actors' claims are equally valid and accepted "facts" are the outcomes of "power and epistemic procedures."[4]
[3] The former include the likes of Jean Baudrillard, Jacques Derrida, Michel Foucault, and Jean-François Lyotard, while the latter include such scholars as David Bloor, Barry Barnes, Harry Collins, Steve Fuller, and Trevor Pinch.

[4] Over a decade ago Latour (2004), a prominent STS scholar, acknowledged the field as having contributed to a situation in which powerful interests exploit strong relativism to deny and/or avoid responsibility for problems like climate change. And quite recently, the emergence of the post-truth era has stimulated debate among STS scholars over the field's role in contributing to it (e.g., Collins et al., 2017b; Fuller, 2017; Sismondo, 2017a, 2017b).
Occurring in parallel was the rise of identity politics in the US, initially on the Left since the mid-1960s but perhaps just as prominently on the Right in recent years. Combining identity politics and the postmodern arguments above helps produce truthiness, in which "facts" and "reality" are things some people feel—rather than know—to be true. Nowhere is this more poignantly described than in the October 17, 2005 pilot episode of The Colbert Report. As his conservative character (a parody of Fox News's Bill O'Reilly), Stephen Colbert coined the term "truthiness"[5] as something you just feel to be true:
Anybody who knows me knows that I'm no fan of dictionaries or reference books. They're elitist, constantly telling us what is or isn't true or what did or didn't happen. . . . I don't trust books. They're all fact and no heart. And that's exactly what's pulling our country apart today. 'Cause face it, folks, we are a divided nation. . . . We are divided between those who think with their head and those who know with their heart.
Thus, truthiness is an emotional, non-cognitive form of radical constructivism; it is simply feeling something to be true without the need for reasoned argument or rigorously collected and analyzed empirical evidence. Popular purveyors of truthiness include such famous conservative media personalities as Sean Hannity, Bill O'Reilly, and Glenn Beck. Even as they spread their messages across society, they aim for a highly personal connection with their audience. For some media organizations like Fox News and the Sinclair Broadcasting Group, truthiness is the coin of the realm.

[5] Merriam-Webster named "truthiness" as their 2006 Word of the Year.
The top right quadrant is the domain of "bullshit" (BS), for which we turn to Harry Frankfurt's (2005, p. 61) famous definition: "The liar cares about the truth and attempts to hide it; the bullshitter doesn't care if what they say is true or false, but rather only cares whether or not their listener is persuaded." BSing, then, is a rather personal and typically self-serving form of dishonesty, with its purveyors treating information so cavalierly that they seem to have a fundamental disrespect for reason and rules of evidence.[6] Prevalent here are the kinds of conspiracy theories that thrive on the internet and are peddled by outlets like Alex Jones's Infowars and Steve Bannon's Breitbart News. Self-professed "truthers" seem to have turned BSing into a vocation (e.g., Kay, 2011; Leibovich, 2015): 9/11 truthers, Sandy Hook truthers, and citizenship truthers (aka, "birthers" who challenge the established fact that President Obama is a natural born citizen).

[6] Male politicians figure prominently in this quadrant, especially when managing personal scandals (e.g., John Edwards, Mark Sanford, and Anthony Weiner).
Figure 1. Key types of misinformation with illustrative examples of selected messengers.

Perhaps the most infamous BSer of our age is President Donald Trump, who spreads it so frequently and effortlessly that observers are challenged to keep up. He enlists a revolving door of personnel whose primary role is to justify his spoken and tweeted BS to the American public. Trump is an exemplary BSer because he is driven much more by self-serving narcissism than by any ideologically coherent political agenda. Journalists continually document the growing list of untruths that illustrate Trump's willingness to BS about almost anything, big or small (e.g., Kelly, Kessler, & Lee, 2017; Leonhardt & Thompson, 2017). Indeed, it was White House Press Secretary Sean Spicer's repeated, impassioned defense of Trump's BS claims about having the largest inauguration crowd ever that prompted Senior White House aide Kellyanne Conway to unleash the now infamous "alternative facts" on the world a mere two days into the Trump Administration (Moore, 2017).
Systemic lies, which occupy the lower left quadrant of Figure 1, are perhaps the most pernicious type of misinformation. These are carefully constructed fabrications or obfuscations intended to protect and promote material or ideological interests with a coherent agenda. As such, purveyors of this type of misinformation target organizations, movements, and institutions they perceive as threatening their interests. Systemic lies align closely with what we have termed "anti-reflexivity," or the defense of the industrial capitalist system from the claims of scientists and social movements that the regular functioning of the system produces serious problems requiring governmental intervention (e.g., McCright & Dunlap, 2010). We may characterize anti-reflexivity as speaking power to truth using any one of the set of nondecision-making strategies in the "anti-reflexivity playbook" (see Table 2 in McCright & Dunlap, 2010).
The US-based climate change denial countermovement's messages, which employ some of the language and trappings of science and authoritative expertise, may be the most successful systemic lies of the last few decades. Briefly, this countermovement uses money and resources from industry and conservative foundations to mobilize an array of conservative think tanks, lobbying organizations, media outlets, front groups, and Republican politicians to ignore, suppress, obfuscate, and cherry-pick scientists and their research to deny the reality and seriousness of climate change (e.g., Brulle, 2014; Dunlap & McCright, 2011, 2015; Farrell, 2016a, 2016b; McCright & Dunlap, 2010).
The success of the climate change denial countermovement owes much to the Right's superior effectiveness in framing and redirecting public discourse toward advancing their ideological interests. Indeed, the Right seems especially adept at using Orwellian language to promote their ideological and material interests via what we would argue are systemic lies:
• the George W. Bush Administration's Clear Skies Initiative and Healthy Forests Initiative of 2003, which would have weakened existing air quality and forest conservation protections, respectively;
• "right-to-work" laws that further weaken labor unions and the very mechanisms (e.g., collective bargaining) that earned workers hard-fought rights in the first place;
• "religious liberty" bills designed to legalize discrimination against the LGBTQ community, based on a narrow, fundamentalist interpretation of the Christian Bible;
• the Trump Administration's 2017 Presidential Advisory Commission on Election Integrity, which appears designed to reduce election integrity by suppressing likely Democratic voters; and
• countless focus-group generated terms that conservative activists have infiltrated into our public discourse: e.g., "family values," "junk science," "partial birth abortion," "death panels," "death tax," "job creator," and, most recently, "fake news."
Finally, a type of misinformation we call "shock and chaos" occupies the lower right quadrant in Figure 1. It is misinformation intended to destabilize social relations and societal institutions so that its proponents may consolidate power and force unpopular decisions on a confused and/or distracted public (e.g., Paul & Matthews, 2016). As such, it is a mix of the "shock doctrine" strongly critiqued by Naomi Klein (2008) and postmodern authoritarianism championed by Vladimir Putin's key advisors, Vladislav Surkov (e.g., Pomerantsev, 2014) and Aleksandr Dugin (e.g., Ratner, 2016).[7] Briefly, shock and chaos, which is most commonly seen within nations like Russia, North Korea, and Iran, involves weaponizing misinformation to secure the allegiance of followers and to root out and suppress potential dissidents (e.g., Hains, 2016; Shore, 2017).

[7] While Surkov, a businessman and politician, has focused primarily on domestic politics and Dugin, a sociologist and political analyst, has focused mostly on international relations, both have long aimed to promote Russian dominance in the global geopolitical order. Surkov often gets credit as the key architect of Putinism, a new style of authoritarianism cloaked in democratic rhetoric (Pomerantsev, 2014). Surkov is described as a master of postmodern theater (Pomerantsev, 2011, p. 5): "[C]ontemporary Russia . . . is a dictatorship in the morning, a democracy at lunch, an oligarchy by suppertime, while, backstage, oil companies are expropriated, journalists killed, billions siphoned away. Surkov is at the center of the show, sponsoring nationalist skinheads one moment, backing human rights groups the next. It's a strategy of power based on keeping any opposition there may be constantly confused, a ceaseless shape-shifting that is unstoppable because it's indefinable." Dugin's (1997) The Foundations of Geopolitics—one of many works espousing his "fourth political theory"—lays out the blueprint for Russian ascendance via neutralizing or even defeating the US. Dugin (1997, p. 367) makes clear the importance of misinformation campaigns and attacks on US civil society: "It is especially important to introduce geopolitical disorder into internal American activity, encouraging all kinds of separatism and ethnic, social, and racial conflicts, actively supporting all dissident movements—extremist, racist, and sectarian groups, thus destabilizing internal political processes in the US. It would also make sense simultaneously to support isolationist tendencies in American politics."
Russia seems especially active in employing shock and chaos misinformation against Western democracies: (a) hacking campaigns, elections, or referendum votes in France (e.g., Border, 2017), Germany (e.g., Brennan, 2017), the United Kingdom (e.g., Syal, 2017), and the US (CNN Staff, 2017; The New York Times Staff, 2017); (b) infiltrating news and social media platforms (e.g., Calabresi, 2017b; Lee, 2017); and (c) spreading propaganda supporting or opposing different civil society groups (e.g., Orr, 2017; Syeed, 2017).
While shock and chaos misinformation has been largely absent from the US historically, many observers express concern about its increasing prevalence in recent years due largely to Russian efforts. We are only beginning to discover the extent to which Russian forces are helping to spread shock and chaos misinformation in the US, partly because it is often hard to detect in a timely manner. Nevertheless, there is compelling evidence that
• Russian forces hacked (and publicly leaked data from) the Democratic National Committee computer networks (e.g., Calabresi, 2017a);
• while the social media accounts characterized as "Bernie Bros" (ostensibly white male supporters of Senator Bernie Sanders) were probably not Russian bots themselves, they nevertheless seemed adept at spreading Russian-generated misinformation (e.g., Ferguson, 2017);
• Russian Twitter bots are promoting right-wing groups and causes (e.g., Dorell, 2017); and
• on the campaign trail (e.g., Naylor, 2016) and in the first few months of his Presidential Administration (e.g., Clifton, 2017b), Donald Trump or his allies routinely spread misinformation generated by Russian forces.
Indeed, more than a few analysts suggest that Donald Trump's Presidential campaign and resulting administration is closely following the shock and chaos misinformation strategies of the Putin regime (e.g., Clifton, 2017a; Mariani, 2017).[8]

[8] A brief comparison of the personal Twitter accounts of President Donald Trump (@realDonaldTrump) and President Barack Obama (@BarackObama) is revealing. As of September 7, 2017, about 17.7% (16,680,211) of Obama's 94,238,483 Twitter followers but approximately 51.2% (19,335,714) of Trump's 37,765,067 followers were classified as "fake" by the service Twitter Audit. Indeed, between September 5th (when we first checked for this note) and the 7th, Trump's total number of followers remained relatively constant (37,589,213 and 37,765,067, respectively). Yet, Twitter Audit increased its estimation of the fake ones by nearly three million in a mere two days: from 16,614,432 fake followers on September 5th to 19,335,714 fake followers on the 7th. Many have observed that the number of Trump's Twitter followers varies dramatically over the course of days and even hours, with most of the variation coming in fake followers or bots (e.g., Ingram, 2017; Riotta, 2017).
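To make the figures in note [8] concrete, the short sketch below recomputes the reported fake-follower shares and the two-day jump in Twitter Audit's estimate. This is our illustration, not part of the original article: the counts are those reported in the note, and the "fake" classifications are Twitter Audit's own.

# Quick arithmetic check (ours, purely illustrative) of the Twitter Audit
# figures reported in note [8]: shares of followers classified as "fake"
# and the jump in Trump's estimated fake followers over two days.

def fake_share(fake: int, total: int) -> float:
    """Return the percentage of followers classified as fake."""
    return 100 * fake / total

# Counts as reported in note [8] (as of September 7, 2017).
obama_fake, obama_total = 16_680_211, 94_238_483
trump_fake, trump_total = 19_335_714, 37_765_067

print(f"Obama: {fake_share(obama_fake, obama_total):.1f}% fake")  # Obama: 17.7% fake
print(f"Trump: {fake_share(trump_fake, trump_total):.1f}% fake")  # Trump: 51.2% fake

# Growth in Trump's estimated fake followers, September 5 to September 7:
print(f"{19_335_714 - 16_614_432:,} added in two days")  # 2,721,282 added in two days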
Key Factors that Facilitate the Spread and Resonance of Misinformation in the US
Lewandowsky et al.'s (2017) proposed "technocognition" approach, combining psychological principles and technological innovation to combat the growth of misinformation, is laudable. Yet, countering misinformation may be more challenging than Lewandowsky et al. (2017) imply, especially in the US where several factors amplify the resilience and potency of misinformation within large sectors of society (e.g., Maza, 2017). Further, effectively combating misinformation will likely require understanding the characteristics and dynamics of the different types of misinformation we have outlined. More importantly, we must understand how combinations of these types of misinformation synergize to have even more complex and recalcitrant impacts.

We are skeptical of the success of the technocognition approach in the US, largely because of the megatrends that Lewandowsky et al. (2017) identify as giving rise to our current post-truth era. In our mind, the foremost barrier to combatting misinformation in the US is the intense political polarization that, of course, is related intimately to decreasing social capital, rising inequality, declining trust in science, and an increasingly fractionated media landscape.
While still debating the sources of polarization, political scientists agree that it has reached unprecedented levels and stems more from Republicans moving far to the right than Democrats moving to the left (Mann & Ornstein, 2016). The rise of "negative partisanship," in particular, has created a situation in which Republicans and Democrats are likely to regard the opposing party as a threat to the nation and view its followers in highly negative terms (Pew Research Center, 2016).[9] In this context, Republicans' skepticism about Russian meddling in the last election and especially their increasingly favorable views of Russia and Putin may not be so surprising; anything and anyone keeping Democrats out of office is acceptable (Riley, 2017).

[9] Dunlap et al. (2016) review the polarization literature in the context of climate change.
This intense political polarization in the US is abetted by three factors largely beyond the scope of Lewandowsky et al. (2017). First is the intentional promotion of misinformation in the powerful conservative echo chamber, ranging from the conspiracy theories of Infowars and Rush Limbaugh to the consistent lies and exaggerations about liberal politicians and Democratic candidates spread on Fox News, Breitbart, and talk radio (Benkler, Faris, Roberts, & Zuckerman, 2017).
Second is the utility of misinformation (especially systemic lies but also, increasingly, shock and chaos) to powerful political and economic interests (e.g., the Koch Brothers and fossil fuels corporations) and their consequent and unrelenting support for it, which was only briefly touched on by Lewandowsky et al. (2017). Third is the institutionalization of "false equivalence" in so-called mainstream media, evidenced by never-ending efforts to equate a major falsehood from the Right with a far less consequential one from the Left (Alterman, 2016).
Likewise, major venues like CNN not only allow much misinformation to go unchallenged, as when interviewing Trump surrogates, but provide a platform for it by consistently including the same people on its TV panels—all in the interest of trying to appear "unbiased."[10] The combination of the above factors in our highly polarized nation creates major barriers to combatting misinformation.

[10] The fact that Trump and his followers now regard CNN as a purveyor of "fake news" underscores the futility of utilizing false equivalence.
Going Forward: Suppositions to Address in Future Work
As mentioned above, a better understanding of the different types of misinformation and how they may synergize when combined is needed to effectively combat misinformation with solutions tailored to contexts and audiences. We end with some initial suppositions intended to spur work in this area.
Whether or not conservatives are "innately" more prone to accept and promote misinformation than are liberals, the US media landscape nevertheless has far more avenues for the former than for the latter to send and receive misinformation: broadcast networks (e.g., Fox News, The Blaze, One America News Network), popular websites (e.g., Infowars, Breitbart, Daily Caller, The Drudge Report), and other outlets (especially conservative talk radio). Thus, in our politically polarized society, combating emotionally resonant truthiness and bullshit within this cohesive, insulated, and near-impenetrable conservative echo chamber may not be feasible (Benkler et al., 2017).
With the rise of social media and the absence of the Fairness Doctrine to prohibit broadcast licensees from having a consistent political slant (e.g., Hershey, 1987), penetrating this echo chamber with reasoned arguments supported by science and facts may be a quixotic task.
In addition, the production and consumption of systemic lies, epitomized by organized climate change denial, has aligned with political ideology in the US over the last few decades (e.g., Dunlap & McCright, 2015; Dunlap et al., 2016; Farrell, 2016a, 2016b; McCright & Dunlap, 2000, 2010, 2011). Conservative citizens are more likely than are liberals or moderates to believe misinformation about climate change (e.g., McCright, Charters, Dentzman, & Dietz, 2016). This is likely because political ideology is correlated strongly with support for protecting and maintaining the existing political-economic system, with conservatives reporting much stronger support than liberals (e.g., Jost, Federico, & Napier, 2009; Jost, Nosek, & Gosling, 2008). As such, we expect that conservatives are more likely to produce and consume systemic lies about similar problems such as poverty, racism, gender inequality, and US imperialism.
We speculate that Lewandowsky et al.'s (2017) technocognition approach will be insufficient in countering systemic lies in the US, in large part, because the latter

• are ideologically motivated, deeply held, and mutually reinforcing;
• are promoted within cohesive networks of industries, foundations, think tanks, activists, pundits, and politicians that promote ideologically coherent messages; and
• are often bolstered by like-minded truthiness and BS in an echo chamber.
Lastly, events over the past year suggest that US media companies and key institutions seem quite ill-equipped to recognize and combat shock and chaos misinformation in a timely manner. The mélange of anti-intellectual appeals, conspiratorial thinking, pseudoscientific claims, and sheer propaganda circulating within American society seems unrelenting. Further, the more widely and rapidly that shock and chaos misinformation spreads, the more citizens (especially conservatives) will lose trust in our core institutions and begin to question the truthfulness of nearly anything that comes out of most news outlets (Pew Research Center, 2017). The resulting cynicism that such processes engender may likely outpace the capacity of a technocognition approach to neutralize—let alone counter—shock and chaos misinformation.
As we have hinted above, what may be more challenging than any one of these types of misinformation are the toxic impacts from their synergies. When also considering the high degree of political polarization in our society (which, again, is largely due to the rightward shift of Republicans), we assert that whichever technocognition strategies are employed are unlikely to prove effective among conservatives. Rather, scientists, journalists, and other communicators may be better served in directing their technocognition strategies toward political moderates and liberals—combatting misinformation designed to reduce these citizens' motivations to vote and participate in governance more generally.
In closing, we also note that the technological architecture of the internet, and social media sites specifically, may be outmatched against the perseverance, resources, and hacking skills of those who seek to promote systemic lies and shock-and-chaos misinformation—especially when they are backed by influential economic interests or powerful state actors, both domestic and foreign. Fact-checking sites themselves may become politicized as misinformation about them infiltrates public consciousness. Also, social media sites such as Facebook may fail to adequately employ technocognition strategies out of fear of being labeled as biased—especially by conservative groups more likely to produce and consume misinformation (e.g., Heath, 2016; Nunez, 2016).
This is not inconsequential, since more and more evidence is being discovered that Russian-backed shock-and-chaos misinformation spread strategically via Facebook—and bolstered by truthiness, BS, and systemic lies promoted by the Right—may have influenced the outcome of the 2016 US Presidential election (e.g., Castillo, 2017; Leonnig, Hamburger, & Helderman, 2017; Shane, 2017).[11]

[11] Indeed, it appears that the election management company Cambridge Analytica helped the Trump campaign staff to precisely identify particular types of potential voters using psychographic data from millions of American adults (Grassegger & Krogerus, 2017). This was data hacked by Russian operatives (e.g., Berzon & Barry, 2017) and/or harvested by Trump campaign associates from sources such as private Facebook data (e.g., Schwartz, 2017). The Trump campaign's digital content staff then helped feed these potential voters misinformation that had been personally tailored for them (e.g., Collins et al., 2017a; Overby, 2017).
Conflict of Interest Statement

The authors declare no conflict of interest.

Author Contributions

AMM and RED wrote the manuscript.

Keywords: Misinformation, Propaganda, Polarization, Systemic Lies
References
Alterman, E. (2016). 'Both sides do it': How false equivalence is distorting the 2016 election coverage. The Nation, 302(June 20/27), 18–22.

Benkler, Y., Faris, R., Roberts, H., & Zuckerman, E. (2017). Study: Breitbart-led right-wing media ecosystem altered broader media agenda. Columbia Journalism Review. Retrieved from https://www.cjr.org/analysis/breitbart-media-trump-harvard-study.php

Berzon, A., & Barry, R. (2017). How alleged Russian hacker teamed up with Florida GOP operative. The Wall Street Journal. Retrieved from https://www.wsj.com/articles/how-alleged-russian-hacker-teamed-up-with-florida-gop-operative-1495724787

Border, J. (2017). US official says France warned about Russian hacking before Macron leak. The Guardian. Retrieved from https://www.theguardian.com/technology/2017/may/09/us-russians-hacking-france-election-macron-leak

Brennan, C. (2017). German politician says party suffered Russian cyberattacks. New York Daily News. Retrieved from http://www.nydailynews.com/news/world/german-politicians-party-suffered-russian-cyberattacks-article-1.3468560

Brulle, R. J. (2014). Institutionalizing delay: Foundation funding and the creation of U.S. climate change counter-movement organizations. Climatic Change, 122, 681–694.

Calabresi, M. (2017a). Election hackers altered voter rolls, stole private data, officials say. Time. Retrieved from http://time.com/4828306/russian-hacking-election-widespread-private-data/

Calabresi, M. (2017b). Inside Russia's social media war on America. Time. Retrieved from http://time.com/4783932/inside-russia-social-media-war-america/

Castillo, M. (2017). Facebook found fake accounts leaking stolen info to sway presidential election. CNBC. Retrieved from https://www.cnbc.com/2017/04/27/facebook-found-efforts-to-sway-presidential-election-elect-trump.html

Clifton, D. (2017a). A chilling theory on Trump's nonstop lies. Mother Jones. Retrieved from http://www.motherjones.com/politics/2017/08/trump-nonstop-lies/

Clifton, D. (2017b). How Trump and his allies have run with Russian propaganda. Mother Jones. Retrieved from http://www.motherjones.com/politics/2017/06/russian-active-measures-trump-propaganda-conspiracy-theories/
CNN Staff. (2017). 2016 Presidential campaign fast facts. CNN. Retrieved from http://www.cnn.com/2016/12/26/us/2016-presidential-campaign-hacking-fast-facts/index.html

Collins, B., Poulsen, K., & Ackerman, S. (2017). Russia's Facebook fake news could have reached 70 million Americans. The Daily Beast. Retrieved from http://www.thedailybeast.com/russias-facebook-fake-news-could-have-reached-70-million-americans

Collins, H., Evans, R. E., & Weinel, M. (2017). STS as science or politics? Social Studies of Science, 47(4), 580–586.

Dorell, O. (2017). Breitbart, other 'alt-right' websites are the darlings of Russian propaganda effort. USA Today. Retrieved from https://www.usatoday.com/story/news/world/2017/08/24/breitbart-other-alt-right-websites-darlings-russian-propaganda-effort/598258001/

Dugin, A. (1997). The foundations of geopolitics: The geopolitical future of Russia. Moscow: Arktogeya.

Dunlap, R. E., & McCright, A. M. (2011). Organized climate change denial. In J. Dryzek, R. Norgaard, & D. Schlosberg (Eds.), Oxford handbook of climate change and society (pp. 144–160). Oxford: Oxford University Press.

Dunlap, R. E., & McCright, A. M. (2015). Challenging climate change: The denial countermovement. In R. E. Dunlap & R. J. Brulle (Eds.), Climate change and society: Sociological perspectives (pp. 300–322). New York: Oxford University Press.

Dunlap, R. E., McCright, A. M., & Yarosh, J. (2016). The political divide on climate change: Partisan polarization in the U.S. public widens. Environment, 58(5), 4–22.

Farrell, J. (2016a). Corporate funding and ideological polarization about climate change. Proceedings of the National Academy of Sciences of the United States of America, 113(1), 92–97.

Farrell, J. (2016b). Network structure and influence of the climate change counter-movement. Nature Climate Change, 6, 370–374.

Ferguson, D. (2017). Russians used 'Bernie Bros' as 'unwitting agents' in disinformation campaign: Senate intel witness. Raw Story. Retrieved from http://www.rawstory.com/2017/03/russians-used-bernie-bros-as-unwitting-agents-in-disinformation-campaign-senate-intel-witness/

Frankfurt, H. G. (2005). On bullshit. Princeton, NJ: Princeton University Press.

Fuller, S. (2017). Is STS all talk and no walk? EASST Review, 36(1), 21–22.

Grassegger, H., & Krogerus, M. (2017). The data that turned the world upside down. Vice Motherboard. Retrieved from https://motherboard.vice.com/en_us/article/mg9vvn/how-our-likes-helped-trump-win

Hains, T. (2016). BBC's Adam Curtis: How propaganda turned Russian politics into theater. Real Clear Politics. Retrieved from https://www.realclearpolitics.com/video/2016/10/12/bbcs_adam_curtis_how_propaganda_turned_russian_politics_into_a_circus.html

Heath, A. (2016). Facebook didn't block fake news because it was reportedly scared of suppressing right-wing sites. Business Insider. Retrieved from http://www.businessinsider.com/facebook-didnt-block-fake-news-because-of-conservative-right-wing-sites-report-2016-11

Hershey, R. D., Jr. (1987). F.C.C. votes down Fairness Doctrine in a 4-0 decision. The New York Times. Retrieved from http://www.nytimes.com/1987/08/05/arts/fcc-votes-down-fairness-doctrine-in-a-4-0-decision.html

Ingram, M. (2017). Trump's fake Twitter following climbs, sparking fears of a bot war. Fortune. Retrieved from http://fortune.com/2017/05/31/trump-twitter-bot-war/

Jost, J. T., Federico, C. M., & Napier, J. L. (2009). Political ideology: Its structure, functions, and elective affinities. Annual Review of Psychology, 60, 307–337.

Jost, J. T., Nosek, B. A., & Gosling, S. D. (2008). Ideology: Its resurgence in social, personality, and political psychology. Perspectives on Psychological Science, 3(2), 126–136.

Kay, J. (2011). Among the truthers: A journey through America's growing conspiracist underground. New York: Harper.

Kelly, M., Kessler, G., & Lee, M. Y. H. (2017). President Trump's list of false and misleading claims tops 1,000. The Washington Post. Retrieved from https://www.washingtonpost.com/news/fact-checker/wp/2017/08/22/president-trumps-list-of-false-and-misleading-claims-tops-1000/

Klein, N. (2008). The shock doctrine: The rise of disaster capitalism. New York: Picador.

Latour, B. (2004). Why has critique run out of steam? From matters of fact to matters of concern. Critical Inquiry, 30, 225–248.

Leibovich, M. (2015). The weaponization of 'truther'. The New York Times Magazine. Retrieved from https://www.nytimes.com/2015/11/08/magazine/the-weaponization-of-truther.html

Leonhardt, D., & Thompson, S. A. (2017). Trump's lies. The New York Times. Retrieved from https://www.nytimes.com/interactive/2017/06/23/opinion/trumps-lies.html

Leonnig, C. D., Hamburger, T., & Helderman, R. S. (2017). Russian firm tied to pro-Kremlin propaganda advertised on Facebook during election. The Washington Post. Retrieved from https://www.washingtonpost.com/politics/facebook-says-it-sold-political-ads-to-russian-company-during-2016-election/2017/09/06/32f01fd2-931e-11e7-89fa-bb822a46da5b_story.html

Lewandowsky, S., Ecker, U. K. H., & Cook, J. (2017). Beyond misinformation: Understanding and coping with the post-truth era. Journal of Applied Research in Memory and Cognition (in press).

Mann, T. E., & Ornstein, N. J. (2016). It's even worse than it was. New York: Basic Books.

Mariani, M. (2017). Is Trump's chaos tornado a move from the Kremlin's playbook? Vanity Fair. Retrieved from https://www.vanityfair.com/news/2017/03/is-trumps-chaos-a-move-from-the-kremlins-playbook

Maza, C. (2017). Why fact-checking can't stop Trump's lies. Vox. Retrieved from https://www.vox.com/videos/2017/8/30/16228480/fact-checking-cant-debunk-trump-lies

McCright, A. M., Charters, M., Dentzman, K., & Dietz, T. (2016). Examining the effectiveness of climate change frames in the face of a denialist counter-frame. Topics in Cognitive Science, 8(1), 76–97.

McCright, A. M., & Dunlap, R. E. (2000). Challenging global warming as a social problem: An analysis of the conservative movement's counter claims. Social Problems, 47(4), 499–522.

McCright, A. M., & Dunlap, R. E. (2010). Anti-reflexivity: The American conservative movement's success in undermining climate science and policy. Theory, Culture, and Society, 27(2–3), 100–133.

McCright, A. M., & Dunlap, R. E. (2011). The politicization of climate change and polarization in the American public's views of global warming, 2001–2010. The Sociological Quarterly, 52, 155–194.

Michaels, D. (2008). Doubt is their product. New York: Oxford University Press.

Moore, M. (2017). Conway: Trump spokesman gave 'alternative facts'. New York Post. Retrieved from http://nypost.com/2017/01/22/conway-trump-spokesman-gave-alternative-facts-on-inauguration-crowd/