Resilience to Online Disinformation: A Framework for Cross-
National Comparative Research
Edda Humprecht, University of Zurich
Frank Esser, University of Zurich
Peter Van Aelst, University of Antwerp
Abstract
Online disinformation is considered a major challenge for modern democracies. It is
widely understood as misleading content produced to generate profits, pursue political goals,
or maliciously deceive. Our starting point is the assumption that some countries are more
resilient to online disinformation than others. To understand what conditions influence this
resilience, we choose a comparative cross-national approach. In the first step, we develop a
theoretical framework that presents these country conditions as theoretical dimensions. In the
second step, we translate the dimensions into quantifiable indicators that allow us to measure
their significance on a comparative cross-country basis. In the third part of the study, we
empirically examine 18 Western democracies. A cluster analysis yields three country groups:
one group with high resilience to online disinformation (including the Northern European
systems, for instance) and two country groups with low resilience (including the polarized
Southern European countries and the United States). In the final part, we discuss the heuristic
value of the framework for comparative political communication research in the age of
information pollution.
Keywords: online disinformation, theoretical framework, resilience, cross-national
comparison, cluster analysis
Introduction
RESILIENCE TO ONLINE DISINFORMATION 2
The campaigns for the 2016 U.S. presidential election and the U.K. vote to leave the
European Union (“Brexit”) have intensified the discussion about the potential influence of
content disseminated to mislead recipients. Several authors argue that the phenomenon of
online disinformation has gained more influence through social media but that the discussion
around it is politicized and in need of clarity (Allcott & Gentzkow, 2017; Vargo, Guo, &
Amazeen, 2017). A U.S. post-election study by Allcott and Gentzkow (2017) found that
heavy users of social media were not well equipped to identify false information. This finding
caused some concern, given that social media is an important source of news consumption
(Newman, Fletcher, Kalogeropoulos, & Nielsen, 2019). However, empirical evidence
regarding the rise of online disinformation and its effects on society is inconclusive, and little
is known about the situation outside the U.S. Cross-national research can help understand the
influences of the political, economic and media environment on online disinformation. To
encourage comparative research on the topic, we propose a theoretical framework that
identifies the conditions promoting or inhibiting the influence of disinformation, and we suggest measurable indicators to empirically examine the role of these conditions.
Our study argues that certain countries are better equipped to face the problems of the
digital era, demonstrating resilience to manipulation attempts such as online
disinformation. Based on a thorough literature review, we identify macro-level characteristics
that help explain cross-national differences regarding the exposure to and the diffusion of
online disinformation. We suggest empirical dimensions and indicators for the study of online
disinformation, measure country differences and identify clusters of countries with different
levels of resilience to online disinformation.
Literature Review
Traditionally, social scientists have been concerned with low levels of political
knowledge among the electorate. Citizens need information about candidates, parties and
current issues to be able to make reasonable choices and to participate in democratic life
(Carpini & Keeter, 1996). In recent decades, however, the concern has shifted. Survey
research (mainly in the U.S.) has shown that a growing group of people—who are not
uninformed but rather disinformed—hold inaccurate factual beliefs and use incorrect
information to form their preferences (Kuklinski et al., 2014). As a consequence, the
production, consumption, and dissemination of online disinformation are of growing interest
among scholars from different disciplines such as communications, political science, and
psychology (Ciampaglia, 2017; Guess, Nyhan, & Reifler, 2018; Lewandowsky, Ecker, &
Cook, 2017; Pennycook & Rand, 2017; Tandoc, Lim, & Ling, 2017).
Disinformation is widely understood as content produced to generate profits, pursue
political goals, or maliciously mislead, such as in the form of hoaxes (Nielsen & Graves,
2017). Wardle and Derakhshan (2017) argue that different types of information must be
distinguished, namely, misinformation, disinformation, and malinformation. According to
those authors, misinformation refers to the unintentional publication of false or misleading
information; disinformation means that false information is strategically shared to cause harm;
and malinformation occurs when genuine information is shared to cause harm, for example,
by disclosing private information to the public. In this article, we expand the understanding of
disinformation by adding further aspects, such as lack of context that leads to false
interpretations, disinformed opinions shared publicly on social media, and manipulated
comments often published by bots (see Figure 1). Following Wardle and Derakhshan (2017),
we argue that different elements should be separately examined, namely, the agent, messages
and interpreters. Moreover, misinformation, disinformation, and malinformation overlap, for example when online users unintentionally share false information.
This paper focuses on the aspect of disinformation because it is strategically used to
influence audiences and is likely to be harmful to democracy (Benkler, Faris, & Roberts,
2018; Marwick & Lewis, 2017).
[Figure 1 about here]
Understanding the diffusion and consumption of online disinformation
Several authors have argued that the diffusion and consumption of disinformation is
driven by mechanisms such as “confirmation bias” and “motivated reasoning”, leading people
to believe information that confirms their own worldviews (Nickerson, 1998; Robison &
Mullinix, 2015; Shin, Jian, Driscoll, & Bar, 2017). Furthermore, people tend to believe that
the only accurate perception of reality is their own, a phenomenon called “naïve realism”
(Ross & Ward, 1996). From this perspective, people who voice different opinions are
suspected of being biased or uninformed, and content that includes opposing views is labeled
“fake” (Prior, Sood, & Khanna, 2015). People with a strong confirmation bias toward their own beliefs are also less likely to trust interventions by fact-checkers (Brandtzaeg & Følstad, 2017).
Against this background, many studies are concerned about the consequences of
disinformation for the functioning of democracy and the potential risks of strategic
manipulation. Recent events, such as the 2016 U.S. presidential election and the 2016 Brexit
referendum in the U.K., have demonstrated how quickly disinformation can spread on social
media. Social media has been found to be a problematic source of information because it
often provides highly selective and even biased views of public opinion (Guess et al., 2018;
Shin & Thorson, 2017). Certain groups of actors are overrepresented in the social media
environment. Studies show that advocacy groups—that is, groups of activists ranging from
large unions and lobbying organizations to small citizen groups—actively use Twitter and
Facebook to reach a broader audience (Chalmers & Shotton, 2016). Moreover, “undefined”
actors or so-called social bots can also influence the distribution of political information, thus
contributing to a skewed representation of viewpoints encountered online. As Bradshaw &
Howard (2017, p. 11) have described, “bots can amplify marginal voices and ideas by
inflating the number of likes, shares and retweets they receive, creating an artificial sense of
popularity, momentum or relevance”. For example, during the 2016 U.S. presidential election,
diverse forms of “computational propaganda” flourished (Howard, Bolsover, Kollanyi,
Bradshaw, & Neudert, 2017). Elections in Europe have also experienced the invasion of bots
and the spread of false information by strategic actors (Wardle & Derakhshan, 2017). At least
in the U.S. case, there is proof that fake accounts and false information influenced the agenda
of partisan media outlets (Vargo et al., 2017).
The combination of the massive diffusion of manipulated information created by different actors, techniques for amplifying content, new platforms hosting and producing disinformation, and the speed with which information travels, especially via social media, has been labeled
“information pollution” (Wardle & Derakhshan, 2017). As more people turn to social
networks as a primary news source, the “polluted” online environment could become a major
challenge to political communication in democracies. Moreover, the recent discussion about
“fake news” and the politicized use of the term have alienated citizens. The Pew Research
Center found that many Americans are confused about the nature of facts in general (Barthel,
Mitchell, & Holcomb, 2017). A survey showed that most Americans suspect that
disinformation had an impact on the 2016 U.S. elections. Nearly one-quarter of respondents
said that they themselves had shared “fake news”. Of those who shared disinformation, 14
percent knew at the time that the story was made up, and 16 percent realized later that the
information was false (Barthel et al., 2017). In the U.K., two-thirds of the respondents in a
recent study admitted sharing mis- and disinformation on social media (Chadwick, Vaccari, &
O’Loughlin, 2018). Further, in a study based on focus group discussions, Nielsen and Graves
(2017) found that the difference between “fake news” and news is not perceived as a clear
distinction but rather as one of degree. Respondents were able to identify poor journalism,
propaganda (lying politicians and hyper-partisan content), and certain kinds of advertising
more easily than invented stories. The authors argued that the new confusion is driven by a
combination of news providers publishing disinformation, political actors contributing to its
spread, and platforms disseminating it further (Nielsen & Graves, 2017).
The research focus on the U.S. and the U.K., following the 2016 elections and Brexit,
has created the impression that online disinformation has become a global problem.
Comparative data from the Digital News Report (2018) confirm this impression to a certain
extent. The survey data show that not only in the U.S. but also in countries such as Spain and
Greece citizens indicate that they are frequently exposed to online disinformation. At the
same time, however, the data show great country variation with citizens in many Western and
Northern European countries (e.g., Germany, Denmark, and the Netherlands) reporting low
levels of exposure to online disinformation. Moreover, citizens in those countries are less willing to disseminate disinformation on social networks (Neudert, Howard, &
Kollanyi, 2019).
Against this background, the question arises as to which framework conditions in different
environments foster the diffusion and consumption of disinformation. Understanding the basic
conditions can help researchers understand why disinformation spreads to different degrees
across Western democracies and what the effects are on individuals and on democratic society
as a whole.
A Framework for the Study of Online Disinformation
Based on a review of international research literature, we identify seven macro-level
conditions that can weaken the resilience of countries to problems of online disinformation.
We conceive resilience as a collective characteristic that transcends the individual level.
Resilience is generally understood as “the capacity of groups of people bound together in a
[…] community or nation to sustain and advance their well-being in the face of challenges to
it” (Hall & Lamont, 2013, p. 2). Such “challenges” are more likely in highly developed
societies due to their greater complexity. The causes of these stress experiences usually come
less from unforeseeable shock events than from fractures preceded over a longer period of time by structural aberrations (Adger, 2000).
For this reason, our study also focuses on structural factors. According to Benkler,
Faris and Roberts (2018, pp. 348–387), media systems that are resilient to online
disinformation are characterized by distinct structural features, such as a low degree of
polarization and fragmentation; a low level of distrust in truth-seeking institutions that operate
on reason and evidence (science, law, professionalism); a public health approach toward
media regulation; and public funding for reliable truth-seeking media and an educated public.
Based on the U.S. experience, Benkler et al. (2018) argue that a resilient media system does two things. First, it prevents the emergence of a large audience that no longer expects true reporting from its preferred ideological media but primarily identity-confirming news and opinions, regardless of their truth content. Second, it maintains a strong infrastructure of professional media that apply the principle of accountable verification to all information circulating in the old and new channels of the media environment.
More generally, resilience refers to a structural context in which disinformation does
not reach a large number of citizens. At the same time, we argue that resilience is not only a
consequence of simply not being exposed to disinformation. In countries that can be seen as
resilient, people might also come across forms of disinformation. In those circumstances,
people will be less inclined to support or further distribute such low quality information and,
in some cases, they will be more able to counter that information.
In sum, we argue that resilience to online disinformation can be linked to structural
factors related to different political, media, and economic environments. We propose a
framework that will help scholars understand how the diffusion and consumption of online
disinformation differ across national information environments and which constellations of
contextual conditions make national information environments more vulnerable or more
resilient to the spread and use of online disinformation. We also suggest measurable
indicators that allow us to rank countries according to individual dimensions or, more
importantly, classify countries according to more comprehensive types.
Factors of the Political Environment Limiting Resilience
Polarization of society
Several authors have argued that increasing polarization is an important driver for the
deliberate dissemination and production of online disinformation (Allcott & Gentzkow, 2017;
Shin & Thorson, 2017). Polarization is difficult to measure and has been conceptualized in
different ways. Many political scientists understand political polarization as the separation of
partisans or elites on issues or policy spectrums (Dalton, 2008; Hetherington, 2001). In
general, majoritarian systems with only two parties and a winner-takes-all system are seen as
a breeding ground for party polarization and camp formation (Layman, Carsey, & Horowitz,
2006; Prior, 2013). However, Southern European countries with a multi-party system and
deep historical partisan divisions are also often considered strongly ideologically polarized
(Hallin & Mancini, 2004). More recently, Iyengar, Sood, and Lelkes (2012) have introduced
the concept of affective polarization arguing that citizens’ ties to the political world are often
emotional rather than ideological. Allcott and Gentzkow (2017) argue that partisans hold
strong negative feelings towards the opposite side of the ideological spectrum and are
therefore more likely to believe only stories reflecting their own viewpoints. Muddiman and
Stroud (2017) have found that partisanship increases the sharing of and commenting on
political content. Moreover, partisans tend to share only content that is favorable to candidates
from their own political party and neglect fact-checking messages supporting the opposing
party (Pennycook & Rand, 2017; Shin & Thorson, 2017). Along this line, other studies have
shown that partisans also distrust fact-checking websites and accuse them of being biased
(Young, Jamieson, Poulsen, & Goldring, 2018).
In polarized environments, citizens are confronted with divergent representations of reality, and it therefore becomes increasingly difficult for them to distinguish between false and correct information (Craft, Ashley, & Maksl, 2017; Swire,
Berinsky, Lewandowsky, & Ecker, 2017). Thus, societal polarization can be assumed to
decrease resilience to online disinformation.
Populist communication
Several scholars have argued that the phenomena of partisan disinformation and populism are
linked (Bennett & Livingston, 2018; Marwick & Lewis, 2017). Both concepts share several
key psychological underpinnings (Hameleers, 2018). First, populism and disinformation both
relate to the spread of partisan information that supports one particular party’s attitudinal
stance whilst discrediting information from the other party. Second, similar to the social
identification process underlying populism, partisan disinformation constructs an all-encompassing moral and causal divide between two camps: “we” are right and truthful and
“they” are wrong and fake. In this vein, populist actors also frequently blame the news media
for spreading “fake news” that allegedly mislead ordinary people (Ross & Rivers, 2018;
Schulz, Wirth, & Müller, 2018). Third, populists claim that evil-doers in politics use
misinformation to conspire against the ordinary public. However, populist actors disseminate
misinformation themselves if it helps to strengthen their in-group/out-group narratives.
Studies among citizens have found that belief in conspiracy theories correlates highly with
being susceptible to populist politics. In sum, both populism and partisan disinformation share
a binary Manichaean worldview, anti-elitism, mistrust of expert knowledge, and conspiracy
theories (Bergman, 2018). As a consequence of these combined influences, citizens can obtain
inaccurate perceptions of reality (Pennycook & Rand, 2018). Thus, in environments with high
levels of populist communication, online users are exposed to more disinformation.
Factors of the Media Environment Limiting Resilience
Low trust in news
Previous research suggests that media trust plays a crucial role in how citizens and
stakeholders perceive information and how aware they are of certain problems (Curran, Coen,
Aalberg, & Iyengar, 2012; Van Aelst et al., 2017). Research has established that low levels of
trust in news media stem from a general political malaise (Jones, 2004; Ladd, 2010). For
example, conservative Republicans in the U.S. in particular distrust the news media and tend
to perceive a “liberal bias” in news content (Jones, 2004). Furthermore, distrust in
professional news media can lead to selective exposure because source credibility affects the
interpretation of information (Chung, Nam, & Stefanone, 2012; Swire et al., 2017; Turcotte,
York, Irving, Scholl, & Pingree, 2015). Distrust in news media also increases the use of
alternative sources, such as online platforms that distribute disinformation (Tsfati & Cappella,
2003). In other words, in environments in which distrust in news media is higher, people are
less likely to be exposed to different sources of political information and to critically evaluate
them (Benkler et al., 2018). Based on this reasoning, it can be assumed that resilience to
disinformation is lower in societies where distrust in professional news media is high.
Weak public service media
Studies have shown that information environments influence what citizens know about
socially relevant topics. Aalberg et al. (2013) have demonstrated a positive relationship
between the amount of hard news coverage available in a country and the citizens’ level of
public affairs knowledge. These authors, along with other studies, have found the highest
levels of hard news and public affairs knowledge in countries with strong public service
broadcasting (Aalberg & Curran, 2012; Curran et al., 2009). More important is the ecological
effect of public service media on commercial media through a mechanism called “market
conditioning”: Comparative research indicates that public service content encourages rivals
who compete for the same audience to spend more on original content; this “race to the top”
increases overall quality and engenders informed citizenship (Aalberg & Cushion, 2016; Van
der Wurff, 2005). The higher level of knowledge that people gain is likely to play an
important role when confronted with online disinformation. Research has shown that
knowledge is an important factor in the manner in which people deal with information (Prior
et al., 2015). As people become more knowledgeable about a certain topic, their perception is
less likely to be guided by confirmation bias and naïve realism (Ross & Ward, 1996).
Therefore, it can be assumed that environments with weak public service media are less
resilient to online disinformation.
More fragmented, less overlapping audiences
It has been argued that digitalization has led to a general increase in media
products (Webster & Ksiazek, 2012). In addition, the supply of niche or partisan media has
increased in some countries due to rising demand, which has led to more fragmented
audiences (Fletcher & Nielsen, 2017). This means that users who are confronted with
disinformation in partisan or alternative media are less likely to encounter information
correcting or challenging false claims (Shin et al., 2017). Societies in which the users of news
are distributed across a large number of media, some of which are peripheral, offer more entry
points for disinformation than societies in which universally recognized news media can unite
large audiences in their online and offline offerings, for example because of their high
reputation and quality (Fletcher & Nielsen, 2017). Thus, it can be assumed that if the overlap
in news consumption is large, users are less likely to be exclusively confronted with false
information.
Factors of the Economic Environment Limiting Resilience
Large ad market size
False social media content is often produced in pursuit of advertising revenue, as was
the case with the Macedonian “fake news factories” during the 2016 U.S. presidential election
(Nielsen & Graves, 2017; Subramanian, 2017). In a British government report on
Disinformation and “Fake News”, the role of advertising revenues in the production of online
disinformation was highlighted (House of Commons UK, 2019). The report expressed the
concern that changes in the selling and placing of advertising have encouraged the growth of
disinformation. The business model of social media platforms such as Google and Facebook
is to charge advertisers commission for every view and click. When content producers work
with advertising networks, their content is simultaneously published on numerous platforms
that maximize views, clicks, and revenue. Against this background, Tambini (2017) argues
that the online advertising ecosystem “enables smaller publishers to thrive outside the ethical
and self-regulatory constraints which in the past tightly reinforced an ethics of truth-seeking”.
Moreover, because disinformation often contains sensationalist and emotionalized aspects, it
is likely to attract users’ attention. It is therefore appealing for producers to publish this kind
of content—especially if the potential readership is large. Thus, large-size advertising markets
with a high number of potential users are less resistant to disinformation than smaller-size
markets (Faris et al., 2017; Van Herpen, 2015).
High social media use
Social media is considered an amplifier of disinformation (Meraz & Papacharissi,
2013; Shin et al., 2017; Singer, 2014). Disinformation is particularly prevalent on social
media (Fletcher, Cornia, Graves, & Nielsen, 2018), and in countries with large numbers of social media users it is easier for rumor spreaders to build partisan follower networks.
research has argued that social media is more often used for entertainment purposes than for
seeking news (Newman, Fletcher, Kalogeropoulos, Levy, & Nielsen, 2017). With this
motivation, people are likely to share information without verifying it (Shin & Thorson,
2017). Moreover, it has been found that a media diet mainly consisting of news from social
media limits political learning and leads to less knowledge of public affairs compared to other
media sources (Shehata & Strömbäck, 2018). From this, it can be concluded that societies
with a high rate of social media users are more vulnerable (hence less resilient) to online
disinformation spreading rapidly than other societies.
In sum, we argue that low levels of populist communication, low levels of societal
polarization, high levels of trust in news media, strong public service broadcasting (PSB),
high levels of shared media use, small-size media markets, and lower levels of social media
use provide better conditions for resilience and—at the same time—less favorable conditions
for the dissemination of and exposure to online disinformation (see Table 1).
[Table 1 about here]
Data and Operationalization
To illustrate country differences in relation to our theoretical framework, we collected
data for 18 Western democracies. For comparative reasons, we selected the countries used by
Hallin & Mancini (2004) in their book on models of media systems. Since a large part of
comparative research in the field of news media and political communication is based on their
typology, Hallin & Mancini’s (2004) selection of countries was a good starting point for us.
However, we emphasize that it is necessary for future research to widen the scope and include
a broader sample of countries.
Our data sources include the Digital News Report (2018), the Varieties of
Democracies Project (Coppedge et al., 2019; Pemstein et al., 2019), data on populist parties
from the Timbro Authoritarian Populism Index (2019), the Global Populism Database
(Hawkins et al., 2019), Aalberg et al. (2016) and Van Kessel (2015); data on the strength of
public service broadcasting from Brüggemann et al. (2014); and World Bank Data (2017) on
the size of population and number of online users.
The Digital News Report presents annual data on more than 74,000 online media users
from 37 countries and their news consumption habits. We used their representative country
data on trust (general trust in media, and trust in those media that respondents use
themselves), social media (social media for news consumption and for sharing of news), and
on exposure to dis- and misinformation. Our measure of shared media reflects the proportion
of the most used news source per country, based on the Digital News Report (2019).
The Varieties of Democracies (VDem) project draws on theoretical and
methodological expertise from academics who act as expert coders to answer 400 questions
related to the state of democracy in their country (Coppedge & Teorell, 2016). To build our
polarization index, we used two of their indicators: polarization of society and online media
fractionalization. To measure the polarization of society, experts were asked how they
characterize the differences of opinions on major political issues in their society (response
options ranged from no polarization to serious polarization). For online media
fractionalization, experts were asked whether domestic online media outlets give a similar
presentation of major political news, with response options ranging from opposing
presentation of major events to similar presentations of major events.
Further, we collected data on the percentage of votes of populist parties during the
most recent national election and the difference in vote shares between 2008 and 2018 from
the Timbro Authoritarian Populism Index (Timbro, 2019). Data for Canada and the U.S. were
collected based on lists of populist parties from Aalberg et al. (2016) and Van Kessel (2015).
In addition, we used content analysis data on the levels of populism in speeches of political
leaders from the Global Populism Database (Hawkins et al., 2019).
Finally, we used Brüggemann et al.’s (2014) index of the strength of public service
broadcasting (PSB). To construct the index, the authors used data from the European
Audiovisual Observatory on the market share of public TV and its funding.
To allow for cross-national comparison, we merged the individual measures into
average indices; they showed sufficient internal consistency (Cronbach’s α ranging from .71 to .96). We
prefer average to additive indices because they are less sensitive to missing values, which, despite complementary datasets, could not be completely avoided. Prior to data analysis, the
dimension indices were also z-standardized. Further, we inverted some indices so that all
indices pointed in the same direction. In other words, high values of our indicators reflect
high resilience to online disinformation and vice versa. This step made the results easier to
interpret.
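The index construction described in this section can be sketched in a few lines of code. The following is a minimal illustration under our own assumptions, not the code used in the study; the function names and toy data are hypothetical, and `cronbach_alpha` implements the standard variance-based formula.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal consistency of an index; rows = countries, columns = item measures."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

def build_index(items: np.ndarray, invert: bool = False) -> np.ndarray:
    """Average (not additive) index, z-standardized; optionally inverted so that
    high values always indicate high resilience."""
    idx = np.nanmean(items, axis=1)  # averaging tolerates occasional missing values
    z = (idx - np.nanmean(idx)) / np.nanstd(idx, ddof=1)
    return -z if invert else z
```

Averaging with `np.nanmean` makes the indices tolerant of occasional missing item values, which is the stated reason for preferring average over additive indices.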
Findings
Our aim was to operationalize and measure the theoretical dimensions for the study of online
disinformation. Our theoretical framework consists of seven dimensions that have been
operationalized and merged into seven indices (see Table 2).
[Table 2 about here]
Figure 2 shows that substantial country differences exist with regard to our indices.
Northern and Western European countries, such as Finland, Denmark, and the Netherlands, received high values on most indices, suggesting greater resilience to online disinformation. In
contrast, countries such as Spain, Italy, Greece, and the U.S. obtained low index values. Thus,
these countries have conditions that favor an easier dissemination of and exposure to online
disinformation.
[Figure 2 about here]
To examine the relationships between our framework indicators and to identify
potential sub-indicators, we conducted a principal components factor analysis with varimax
rotation. The analysis yielded two factors with eigenvalues greater than 1.0, together explaining 69
percent of the variance. The strength of PSB was the only variable that loaded on both factors
(loadings of .58 on factor 1 and .55 on factor 2). Factor 1 comprises social media use, media
trust, polarization, populism and strength of PSB, explaining 49.7 percent of the variance.
Factor 2 comprises market size, fragmentation of media consumption, and strength of PSB
and explains 19.1 percent of the variance in the analysis. The variables included in factor 1
are related to political communication and media use in a country, whereas factor 2 consists
of variables related to the size of the media market and media organizations.
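A minimal numpy sketch of this procedure (principal components of the correlation matrix, the Kaiser criterion of eigenvalues greater than 1.0, and a varimax rotation of the retained loadings) might look as follows. The random input is a stand-in for the 18-by-7 country-by-index matrix; it is not the study's data:

```python
import numpy as np

def pca_varimax(X, iters=100, tol=1e-6):
    """PCA on the correlation matrix, keeping components with
    eigenvalues > 1 (Kaiser criterion), then varimax-rotating loadings."""
    Z = (X - X.mean(0)) / X.std(0)              # standardize variables
    R = np.corrcoef(Z, rowvar=False)
    eigval, eigvec = np.linalg.eigh(R)
    order = np.argsort(eigval)[::-1]            # sort descending
    eigval, eigvec = eigval[order], eigvec[:, order]
    keep = eigval > 1.0
    loadings = eigvec[:, keep] * np.sqrt(eigval[keep])
    explained = eigval[keep] / eigval.sum()     # share of total variance

    # varimax rotation: classic SVD-based iterative update
    L, k = loadings, loadings.shape[1]
    rot, d = np.eye(k), 0.0
    for _ in range(iters):
        Lam = L @ rot
        u, s, vt = np.linalg.svd(
            L.T @ (Lam**3 - Lam @ np.diag((Lam**2).sum(0)) / L.shape[0]))
        rot, d_old = u @ vt, d
        d = s.sum()
        if d < d_old * (1 + tol):
            break
    return L @ rot, explained

rng = np.random.default_rng(0)
X = rng.normal(size=(18, 7))                    # stand-in for the country data
rotated, explained = pca_varimax(X)
```

With real data, variables loading above a conventional threshold (e.g., .50) on a rotated factor would be grouped together, as the paper does for its two factors.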
In the next step, we tried to group the countries with respect to their resilience towards
online disinformation. We used the seven z-standardized indices to carry out a two-stage
cluster analysis of the 18 countries. To identify the number of clusters, we performed a
hierarchical cluster analysis using Ward’s algorithm and the squared Euclidean distance as a
heterogeneity measure. We chose a three-cluster solution for three reasons. First, merging
clusters beyond the three-cluster solution would have produced overly heterogeneous groups;
when the sum of squared distances is displayed as a scree plot, this is reflected in a pronounced
elbow at the third cluster. Second, the dendrogram for the three-cluster solution is very clear and highly
interpretable. Third, we checked the clarity and interpretability of alternative solutions and
found that they could not compete with the three-cluster solution. Figure 3 visualizes the
country means for each cluster.
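The clustering steps above can be sketched with scipy's hierarchical clustering tools. The data below are synthetic stand-ins for the 18-by-7 matrix of z-standardized indices (three well-separated groups of "countries"), not the study's values:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Illustrative stand-in for the countries-by-indices matrix: three
# well-separated groups of six "countries" in a 7-dimensional space.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(c, 0.1, size=(6, 7)) for c in (-3.0, 0.0, 3.0)])

# Ward's algorithm on Euclidean distances builds the hierarchy; the
# linkage matrix Z can also feed scipy's dendrogram() for inspection.
Z = linkage(X, method="ward")

# Cut the tree into three clusters, mirroring the three-cluster solution.
labels = fcluster(Z, t=3, criterion="maxclust")

# Cluster means per dimension (cf. the cluster profiles in Figure 3)
profiles = np.array([X[labels == c].mean(axis=0) for c in np.unique(labels)])
```

In practice, the number of clusters would be chosen by inspecting the dendrogram and the scree plot of merge distances, as described in the text, rather than fixed in advance.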
Cluster 1 consists of Northern and Western European countries, plus Canada (Austria,
Belgium, Canada, Denmark, Finland, Germany, Ireland, the Netherlands, Norway, Sweden,
Switzerland, and the U.K.). The bulk of these countries have been described as democratic-
corporatist media systems, whereas Canada, Ireland and the U.K. have many features of
liberal media systems (Hallin & Mancini, 2004). However, several authors have stressed that
the three Anglo-Saxon countries in many ways resemble the corporatist European systems, for
instance, with respect to welfare expenditure, support for public broadcasting, and regulations
of media ownership, advertising and electoral coverage (Büchel, Humprecht, Castro-Herrero,
Engesser, & Brüggemann, 2016; Lawlor, 2015; Simpson, Puppis, & Van den Bulck, 2016).
Because we still find a minimum of political and public support for the public service ethos in
these countries as well as a comparatively high level of media trust, supplemented by
comparatively low audience fragmentation and polarization, this cluster can be described—
with all due caution—as media-supportive and more consensual. The conditions relevant for
online disinformation that we find in the countries of this cluster indicate a high level of
resilience due to the consistently high values of our seven indices (Figure 3).
Cluster 2 includes Greece, Italy, Portugal, and Spain. All these countries, without
exception, have polarized pluralist media systems (Brüggemann et al., 2014; Hallin &
Mancini, 2004). The political history of these countries is characterized by late
democratization, patterns of polarized conflict, a strong role of political parties and dirigiste
state interventions. The history of the media in these countries is characterized by a
commentary oriented, often partisan and less professionalized journalism. In our empirical
analysis, this cluster is distinguished by comparatively high levels of societal polarization,
populist communication, and social media use for news consumption. Countries in this cluster
typically have lower levels of trust in media and shared media use. We describe this cluster as
polarized since this label reflects its main characteristics and shows the similarities to Hallin
& Mancini’s (2004) polarized-pluralist media system model.
Finally, the last cluster only comprises the U.S. This finding reflects the exceptional
role of the U.S. in the context of online disinformation. The country stands out because of its
large advertising market, its weak public service media and its comparatively fragmented
news consumption. The enormous size of its market—and its competitive and commercial
culture—makes the U.S. attractive for producers of disinformation targeting social media
users. Moreover, the country is characterized by high levels of populist communication,
polarization, and low levels of trust in the news media. Nechushtai (2018) recently aptly
described the changed conditions of the U.S. media system as a low trust, politicized and
fragmented environment. Based on the contextual conditions shown by our empirical analysis
here, the U.S. must be considered the most vulnerable country regarding the spread of online
disinformation. The cluster profiles are displayed in Figure 3.
[Figure 3 about here]
To test the relationships between our framework indicators and the phenomenon of
online disinformation, we ran a linear regression (OLS). As an outcome, we used data from
the Digital News Report (2018) on exposure to disinformation1. To increase interpretability,
all indicators have been inverted to meet our theoretical assumptions. This means that we
expect negative relationships between all factors and the outcome. Table 3 shows that media
trust, social media use, and market size significantly predict perceived exposure to online
disinformation: F(7, 10) = 11.996, p = .001, n = 18. It is noteworthy that 83 percent of the
variance in the level of exposure to disinformation is explained by the independent variables
in the model. This underlines the great empirical significance that the theoretical dimensions
presented in this study have for the understanding of this socially relevant problem.
[Table 3 about here]
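The regression step can be sketched with a small ordinary-least-squares helper; the helper's name and the sample data are illustrative, not the study's estimates:

```python
import numpy as np

def ols_fit(X, y):
    """Ordinary least squares with an intercept; returns (coefficients, R^2).

    X: n-by-k matrix of country-level indicators (hypothetical data);
    y: outcome, e.g. perceived exposure to disinformation.
    """
    X, y = np.asarray(X, float), np.asarray(y, float)
    A = np.column_stack([np.ones(len(y)), X])    # prepend intercept column
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    r2 = 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
    return beta, r2

# Illustrative check on noiseless data: y = 2 - 3x should be recovered exactly
x = np.array([[0.0], [1.0], [2.0], [3.0]])
beta, r2 = ols_fit(x, 2 - 3 * x[:, 0])
```

Because the indicators are inverted so that high values mean high resilience, negative coefficients on the resilience indices correspond to the expected direction in the paper's model.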
Discussion and Conclusion
Recent research has shown that some countries have stood out as being stable,
adaptive and resilient in times of social and technological transformation (Baldersheim &
Keating, 2016). In those countries, online disinformation can be considered a minor problem
at present. However, there is an urgent need to better understand the conditions that create,
sustain and reproduce social resilience and, simultaneously, to uncover factors that render
societies vulnerable to phenomena such as online disinformation.
We aimed to fill this gap by suggesting a theoretical framework with measurable
indicators that helps explain why disinformation is more or less prevalent in a country. The
empirical analysis confirmed our assumption that resilience to disinformation differs
systematically, depending on conditions that are more or less pronounced in a given country.
Our indicators proved to be highly effective in explaining cross-national differences in
people’s reported exposure to online disinformation.
1 In the survey, respondents were asked the following: “In the last week, which of the following have you
personally come across? Stories that are completely made-up for political or commercial reasons.”
The cluster analysis resulted in three groups of countries. The media-supportive, more
consensual cluster is composed of Western European democracies and Canada. Most
countries in this cluster have been described as countries with consensus political systems,
strong welfare states, and pronounced democratic corporatism (Hallin & Mancini, 2004).
These countries are likely to demonstrate high resilience to online disinformation: they are
marked by low levels of polarization and populist communication, high levels of media trust
and shared news consumption, and a strong PSB. Those countries seem to be well equipped to
face the challenges of the digital information age because they have stable, trusted institutions
that enable citizens to obtain independent information and uncover manipulation attempts.
The countries in this cluster are not yet affected to a large extent by the problem of online
disinformation. However, it is possible that this will change in the future and that online
disinformation will become a greater threat. A case in point is the U.K. Although the country
has a long democratic tradition and the BBC is a wide-reaching and greatly trusted media
organization, disinformation was a major problem during the Brexit campaign (Howard &
Kollanyi, 2017). In the politicized, heated public debate that led to the referendum,
disinformation was easily disseminated (Bennett & Livingston, 2018). After the referendum
the Brexit debate continued and led to further polarization (for or against Brexit) and created
the potential for new political players to transform the political landscape, e.g., Nigel
Farage’s “Brexit Party”, which constantly attacked the BBC (Engesser, Ernst, Esser, & Büchel,
2016). This example illustrates the influence of the political and media environment on the
possibilities to disseminate online disinformation. Against this background, we conclude that
some countries in this cluster are potentially at risk of facing wide-reaching disinformation
campaigns in the context of polarized debates.
The polarized cluster consists of Southern European countries that have a long history
of stark partisan or ideological divides (Brüggemann et al., 2014; Büchel et al., 2016; Hallin
& Mancini, 2004). The conditions found in our analysis fit this description: high levels of
polarization, populist communication, social media news use, and low levels of trust and
shared media consumption are key features of the information environments in this cluster.
Countries in the polarized cluster are thus the most likely to be vulnerable to online
disinformation.
The third cluster features the low trust, politicized and fragmented environment of the
U.S. The political and media environment of the country has also become more polarized and
has created another fertile ground for the spread of disinformation today. Political
communication in the U.S. is characterized by populist rhetoric while media coverage has
become more partisan and trust in the media has decreased as a consequence. In addition, the
large market makes it attractive to produce attention-triggering content for U.S. audiences. In
the current political and media environments, political disinformation that discredits a
particular party can widely attract attention. Our results show that the U.S. is particularly
susceptible to disinformation campaigns—and its peculiar contextual conditions make it a
unique case. Although the exceptionality of this case might be influenced by the prominence
of Donald Trump, who according to the Washington Post has made over 10,000 false or
misleading claims since entering office, we believe the structural characteristics of the U.S.
case go beyond the current president, as the problem of disinformation and false beliefs dates
back before the rise of Trump (Kuklinski et al., 2014). Against this background, scholars may
want to be aware that findings on the problem of disinformation in this country are limited to
specific scope conditions that cannot be easily transferred to other Western countries. It is
therefore unlikely that, for instance, European countries will experience the same problems
with disinformation that the U.S. faced in the 2016 election.
To test the influence of relevant country-level factors, we ran a linear regression
predicting perceived exposure to disinformation with seven indicators. Although the results
show that our indicators explain a large part of the variance in the model, not all indicators in
the model predict the outcome to the same extent. While the majority perform as predicted,
two of them run in the direction opposite to what we predicted. Such fluctuations in explanatory power
are not unexpected and can be easily explained. For instance, the role of some indicators
might be hampered by our dependent variable of self-reported experience with
disinformation. Such measurements of reported exposure to disinformation are inherently
distorted as they only reflect personal perceptions. It can be assumed that people who believe
in disinformation do not recognize it and will therefore not report it in a survey. If we use
another dependent variable to measure online disinformation that is not based on self-reports
by media users but on expert assessments by academics, the two predictors, polarization and
shared media use, point in the expected direction. To be more precise, if the VDem2 measurement
is used as an outcome in our regression model (analysis not shown for space reasons), our set
of indicators also shows relatively high proportions of explained variance (40%) and most of
the correlations we expect are evident. However, only the variables populism and social
media use significantly predict the outcome. Such fluctuations with existing secondary data
underline the need to collect better primary data for future studies of online disinformation;
this data must be collected in consistent and standardized ways across country contexts and
time points in order to obtain comparative findings that are more robust than ours. Despite
these limitations, it must be said that the data we use are the best available—especially in
international comparison—and that our analysis is the best approximation to the ideal that was
possible for us. The fact that not all predictors in Table 3 are significant does not detract from
the theoretical relevance of the dimensions that we present and the heuristic value of our
framework. Against this background, we stand by our indicators, but would like to urge
researchers to validate our framework with better data and supplement it with further
dimensions.
2 Questions used by VDem refer to “domestic disinformation dissemination by the government” and “domestic
disinformation dissemination by political parties” (Coppedge et al., 2019).
The goal of this study was to provide researchers with a conceptual map for cross-
national research on online disinformation. However, some aspects should be considered that
are both theoretical and empirical in nature. First, some of our indicators are correlated with
each other. This reflects connections among the seven dimensions of our framework. The
factor analysis showed that these belong to two overarching factors, one comprises variables
related to the political environment and to news consumption, and the second one comprises
variables related to the market size and the size and importance of media organizations. This
is an important finding that shows two things. On the one hand, the political environment and
media use are closely connected. For example, in countries where populist politicians often
attack journalists, public trust in traditional media suffers while the use of social media
increases. Lower trust in traditional media and higher use of social media presents populists
with improved opportunities to spread their messages about who is allegedly conspiring
against the common people. On the other hand, the market is related to the strength of media
organizations. In smaller markets, there is often a strong PSB that also acts as a link between
society and the media. Often, the news broadcasts of the PSB are the most-used programs
across the entire population and thus mitigate the fragmentation of media use.
This brings us to several limitations of this study. First, we argue that the size of the
advertising market is important because it is more attractive for producers of disinformation to
generate content for large audiences. However, we have only included one large market in our
study, namely the U.S. The U.S. is an exception in many respects, including but not limited to
its size. To examine the influence of market size more broadly, future studies should include
other large countries. The results of such comparisons can further show to what extent the
U.S. resembles other large countries.
A second limitation is rooted in the nature and number of indicators used in the
framework. Some of the indicators are volatile, for example vote shares of political parties.
We tried to account for this by combining these indicators with more stable ones, e.g.,
content analysis data. Third, we identified seven indicators related to resilience to online
disinformation based on an extensive literature review. However, there might be other important
drivers of the dissemination of online disinformation that we did not discuss, such as socio-
economic inequality. We hope that our framework will inspire researchers from different
disciplines to think about such drivers and generate further ideas and, hopefully, measurable
indicators. On the upside, our findings can guide case selection in future cross-national
research on the topic. Studies examining similar countries can identify national specificities
and clarify the role of single indicators.
A fourth limitation concerns the sample used in this study. We relied on Hallin &
Mancini’s (2004) typology and followed their country selection. However, online
disinformation is currently a problem in many countries and especially in those with low
levels of media freedom or with Internet censorship. Broadening the spectrum of countries is
therefore an important step that has to be taken in future work. Moreover, taking into account
countries beyond the Western world will likely require a broader set of empirical indicators.
A fifth limitation might result from the different data sources that we used in this
study. Although other scholars have successfully worked with the same data sources because
they enjoy high credibility, the sources capture country differences only in an aggregated
form. An additional challenge is that situations in countries change over time and that our
sources only provide a snapshot of the current situation. Many of our data sources are fairly
new, and their repeated use in the future will also make it possible to observe trends.
The initial goal of this study was to develop a theoretical framework to enable and
stimulate cross-national research on the topic of online disinformation. Although scholarship
on disinformation has increased substantially since 2016 (especially in the U.S.), there is a
lack of work comparing these findings with the situation in other countries. Moreover, recent
studies exploring the phenomenon of online disinformation have primarily focused on the
individual level; however, the literature also emphasizes the importance of macro-level
factors (Allcott & Gentzkow, 2017; Graves, Nyhan, & Reifler, 2016; Vargo et al., 2017). By
suggesting a theoretical framework for the study of online disinformation we want to help
scholars to understand which contextual and individual factors foster the dissemination and
consumption of online disinformation and with what effects. The consequences of
technology-driven developments are often prematurely generalized, but our comparative
analysis shows that they can have different effects in different countries. Future research in
the field of political communication should focus on the relationship between structural
conditions at the macro level and individual-level predispositions at the micro level. To
understand when and why a person is willing to believe or share disinformation, we need to
know more about how personal characteristics and attitudes interact with the structural
context in which people receive and consume this kind of low-quality or even false
information. We hope that our focus on resilience might inspire researchers and policy makers
to think not only about disinformation as a problem but also about structural factors as a
means to counter it.
References
Aalberg, T., & Curran, J. (2012). How Media Inform Democracy: A Comparative Approach.
London and New York: Routledge.
Aalberg, T., & Cushion, S. (2016). Public Service Broadcasting, Hard News, and Citizens’
Knowledge of Current Affairs. In Oxford Research Encyclopedia of Politics.
Aalberg, T., Esser, F., Reinemann, C., Strömbäck, J., & De Vreese, C. (2016). Populist
political communication in Europe. Routledge.
Aalberg, T., Papathanassopoulos, S., Soroka, S., Curran, J., Hayashi, K., Iyengar, S., …
Tiffen, R. (2013). International TV News, Foreign Affairs Interest and Public
Knowledge. Journalism Studies, 14(3), 387–406.
https://doi.org/10.1080/1461670X.2013.765636
Adger, W. N. (2000). Social and ecological resilience: are they related? Progress in Human
Geography, 24(3), 347–364.
Allcott, H., & Gentzkow, M. (2017). Social Media and Fake News in the 2016 Election.
Journal of Economic Perspectives, 31(2), 211–236.
Baldersheim, H., & Keating, M. (Eds.). (2016). Small States in the Modern World:
Vulnerabilities and Opportunities. Cheltenham: Edward Elgar Publishing Ltd.
Barthel, M., Mitchell, A., & Holcomb, J. (2017). Many Americans Believe Fake News is
Sowing Confusion. Pew Research Center.
Benkler, Y., Faris, R., & Roberts, H. (2018). Network propaganda: Manipulation,
disinformation, and radicalization in American politics. Oxford, UK: Oxford University
Press.
Bennett, W. L., & Livingston, S. (2018). The disinformation order: Disruptive communication
and the decline of democratic institutions. European Journal of Communication, 33(2),
122–139. https://doi.org/10.1177/0267323118760317
Bradshaw, S., & Howard, P. N. (2017). Troops, Trolls and Troublemakers: A Global
Inventory of Organized Social Media Manipulation (Computational Propaganda
Research Project No. 2017.12). (S. Woolley & P. N. Howard, Eds.). Oxford, UK.
Retrieved from http://comprop.oii.ox.ac.uk/
Brandtzaeg, P. B., & Følstad, A. (2017). Trust and distrust in online fact-checking services.
Communications of the ACM, 60(9), 65–71. https://doi.org/10.1145/3122803
Brüggemann, M., Engesser, S., Büchel, F., Humprecht, E., & Castro, L. (2014). Hallin and
Mancini Revisited: Four Empirical Types of Western Media Systems. Journal of
Communication, 64(6), 1037–1065. https://doi.org/10.1111/jcom.12127
Büchel, F., Humprecht, E., Castro-Herrero, L., Engesser, S., & Brüggemann, M. (2016).
Building Empirical Typologies with QCA: Toward a Classification of Media Systems.
International Journal of Press/Politics, 21(2).
https://doi.org/10.1177/1940161215626567
Carpini, M. X. D., & Keeter, S. (1996). What Americans Know about Politics and Why It
Matters. Yale University Press. Retrieved from http://www.jstor.org/stable/j.ctt1cc2kv1
Chadwick, A., Vaccari, C., & O’Loughlin, B. (2018). Do tabloids poison the well of social
media? Explaining democratically dysfunctional news sharing. New Media and Society,
20(11), 4255–4274. https://doi.org/10.1177/1461444818769689
Chalmers, A. W., & Shotton, P. A. (2016). Changing the Face of Advocacy? Explaining
Interest Organizations’ Use of Social Media Strategies. Political Communication, 33(3),
374–391. https://doi.org/10.1080/10584609.2015.1043477
Chung, C. J., Nam, Y., & Stefanone, M. A. (2012). Exploring Online News Credibility: The
Relative Influence of Traditional and Technological Factors. Journal of Computer-
Mediated Communication, 17(2), 171–186. https://doi.org/10.1111/j.1083-
6101.2011.01565.x
Ciampaglia, G. L. (2017). Fighting fake news: a role for computational social science in the
fight against digital misinformation. Journal of Computational Social Science, 1(1),
147–153. https://doi.org/10.1007/s42001-017-0005-6
Coppedge, M., Gerring, J., Knutsen, C. H., Lindberg, S. I., Teorell, J., Altman, D., … Ziblatt,
D. (2019). V-Dem Dataset v9.
Coppedge, M., & Teorell, J. (2016). Measuring high level democratic principles using the V-
Dem data. International Political Science Review.
https://doi.org/10.1177/0192512115622046
Craft, S., Ashley, S., & Maksl, A. (2017). News media literacy and conspiracy theory
endorsement. Communication and the Public, 205704731772553.
https://doi.org/10.1177/2057047317725539
Curran, J., Coen, S., Aalberg, T., & Iyengar, S. (2012). News Content, Media Consumption,
and Current Affairs Knowledge. In T. Aalberg & J. Curran (Eds.), How media inform
democracy: A comparative approach (Vol. 1, pp. 81–97). New York and London:
Routledge.
Curran, J., Iyengar, S., Brink Lund, A., & Salovaara-Moring, I. (2009). Media System, Public
Knowledge and Democracy: A Comparative Study. European Journal of Communication,
24(1), 5–26.
https://doi.org/10.1177/0267323108098943
Dalton, R. J. (2008). The Quantity and the Quality of Party Systems: Party System
Polarization, Its Measurement, and Its Consequences. Comparative Political Studies,
41(7), 1–22. https://doi.org/10.1177/0010414008315860
Engesser, S., Ernst, N., Esser, F., & Büchel, F. (2016). Populism and social media: how
politicians spread a fragmented ideology. Information, Communication & Society, 0(0),
1–18. https://doi.org/10.1080/1369118X.2016.1207697
Faris, R., Roberts, H., Etling, B., Bourassa, N., Zuckerman, E., & Benkler, Y. (2017).
Partisanship, Propaganda, and Disinformation: Online Media and the 2016 U.S.
Presidential Election (Berkman Klein Center Research Publication No. 2017-6).
Cambridge, Massachusetts. Retrieved from http://nrs.harvard.edu/urn-3:HUL.InstRepos:33759251
Fletcher, R., Cornia, A., Graves, L., & Nielsen, R. K. (2018). Measuring the reach of “fake
news” and online disinformation in Europe (Factsheet).
Fletcher, R., & Nielsen, R. K. (2017). Fragmentation and Duplication: A Cross-national
Comparative Analysis of Cross-platform News Audiences. Journal of Communication,
67(2012), 476–498. https://doi.org/10.1111/jcom.12315
Graves, L., Nyhan, B., & Reifler, J. (2016). Understanding Innovations in Journalistic
Practice: A Field Experiment Examining Motivations for Fact-Checking. Journal of
Communication, 66(1), 102–138. https://doi.org/10.1111/jcom.12198
Guess, A., Nyhan, B., & Reifler, J. (2018). Selective Exposure to Misinformation: Evidence
from the consumption of fake news during the 2016 U.S. presidential campaign.
Hall, P. A., & Lamont, M. (Eds.). (2013). Social resilience in the neoliberal era. New York:
Cambridge University Press.
Hallin, D. C., & Mancini, P. (2004). Comparing Media Systems. Three Models of Media and
Politics. Cambridge: Cambridge University Press.
Hameleers, M. (2018). Populism all Around? Exploring the Intersections of Populism,
Polarization and Partisan Mis- and Disinformation. Paper presented to the Workshop
“Media Populism and European Democracy", 8-9 November 2018, University of
Copenhagen.
Hawkins, K. A., Aguilar, R., Castanho Silva, B., Jenne, E. K., Kocijan, B., & Rovira
Kaltwasser, C. (2019). Global Populism Database, v1. Harvard Dataverse.
https://doi.org/doi:10.7910/DVN/LFTQEZ
Hetherington, M. J. (2001). Resurgent Mass Partisanship : The Role of Elite Polarization. The
American Political Science Review, 95(3), 619–631.
House of Commons UK. (2019). Disinformation and ‘fake news’: Final Report. London,
England. Retrieved from
https://publications.parliament.uk/pa/cm201719/cmselect/cmcumeds/1791/1791.pdf
Howard, P., Bolsover, G., Kollanyi, B., Bradshaw, S., & Neudert, L.-M. (2017). Junk news
and bots during the US election: What were Michigan voters sharing over Twitter?
Oxford, UK. Retrieved from http://comprop.oii.ox.ac.uk/2017/03/26/junknews-and-bots-during-the-u-s-election-what-were-michigan-voters-sharingover-twitter/
Howard, P., & Kollanyi, B. (2017). Bots, #Strongerin, and #Brexit: Computational
Propaganda During the UK-EU Referendum. SSRN Electronic Journal.
https://doi.org/10.2139/ssrn.2798311
Jones, D. A. (2004). Why Americans Don’t Trust the Media: A Preliminary Analysis. The
Harvard International Journal of Press/Politics, 9(2), 60–75.
https://doi.org/10.1177/1081180X04263461
Kuklinski, J. H., Quirk, P. J., Jerit, J., Schwieder, D., & Rich, R. F. (2014). Misinformation
and the Currency of Democratic Citizenship. The Journal of Politics, 62(3), 790–816.
Ladd, J. M. (2010). The Role of Media Distrust in Partisan Voting. Political Behavior, 32(4),
567–585. https://doi.org/10.1007/s11109-010-9123-z
Lawlor, A. (2015). Framing Immigration in the Canadian and British News Media. Canadian
Journal of Political Science, 48(02), 329–355.
https://doi.org/10.1017/s0008423915000499
Layman, G. C., Carsey, T. M., & Horowitz, J. M. (2006). Party Polarization in American
Politics: Characteristics, Causes, and Consequences. Annual Review of Political Science,
9(1), 83–110. https://doi.org/10.1146/annurev.polisci.9.070204.105138
Lewandowsky, S., Ecker, U. K. H., & Cook, J. (2017). Beyond Misinformation:
Understanding and Coping with the “Post-Truth” Era. Journal of Applied Research in
Memory and Cognition, 6(4), 353–369. https://doi.org/10.1016/j.jarmac.2017.07.008
Marwick, A., & Lewis, R. (2017). Media Manipulation and Disinformation Online. New
York, NY.
Meraz, S., & Papacharissi, Z. (2013). Networked Gatekeeping and Networked Framing on
#Egypt. International Journal of Press/Politics, 18(2), 138–166.
https://doi.org/10.1177/1940161212474472
Muddiman, A., & Stroud, N. J. (2017). News Values, Cognitive Biases, and Partisan Incivility
in Comment Sections. Journal of Communication, 67, 586–609.
https://doi.org/10.1111/jcom.12312
Nechushtai, E. (2018). From Liberal to Polarized Liberal? Contemporary U.S. News in Hallin
and Mancini’s Typology of News Systems. International Journal of Press/Politics,
23(2), 183–201. https://doi.org/10.1177/1940161218771902
Neudert, L.-M., Howard, P., & Kollanyi, B. (2019). Sourcing and Automation of Political
News and Information During Three European Elections. Social Media + Society, 5(3),
205630511986314. https://doi.org/10.1177/2056305119863147
Newman, N., Fletcher, R., Kalogeropoulos, A., Levy, D. A. L., & Nielsen, R. K. (2017).
Reuters Institute Digital News Report 2017. Oxford, UK.
https://doi.org/10.1080/21670811.2012.744561
Newman, N., Fletcher, R., Kalogeropoulos, A., & Nielsen, R. K. (2019). Reuters Institute
Digital News Report 2019. Oxford, UK. https://doi.org/10.2139/ssrn.2619576
Newman, N., Levy, D., & Nielsen, R. K. (2018). Reuters Institute Digital News Report 2018.
https://doi.org/10.2139/ssrn.2619576
Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises.
Review of General Psychology, 2(2), 175–220. https://doi.org/10.1037/1089-
2680.2.2.175
Nielsen, R. K., & Graves, L. (2017). “News you don’t believe”: Audience perspectives on
fake news. Oxford. Retrieved from
https://reutersinstitute.politics.ox.ac.uk/sites/default/files/2017-
10/Nielsen%26Graves_factsheet_1710v3_FINAL_download.pdf
Pemstein, D., Marquardt, K. L., Tzelgov, E., Wang, Y., Medzihorsky, J., Krusell, J., …
Römer, J. von. (2019). V-Dem Measurement Model: Latent Variable Analysis for Cross-
National and Cross-Temporal Expert-Coded Data (V-Dem Working Paper No. 21,
4th ed.).
Pennycook, G., & Rand, D. G. (2017). Who falls for fake news? The roles of analytic
thinking, motivated reasoning, political ideology, and bullshit receptivity. SSRN
Electronic Journal, September. https://doi.org/10.2139/ssrn.3023545
Pennycook, G., & Rand, D. G. (2018). Lazy, not biased: Susceptibility to partisan fake news
is better explained by lack of reasoning than by motivated reasoning. Cognition.
https://doi.org/10.1016/j.cognition.2018.06.011
Prior, M. (2013). Media and Political Polarization. Annual Review of Political Science,
16, 101–127. https://doi.org/10.1146/annurev-polisci-100711-135242
Prior, M., Sood, G., & Khanna, K. (2015). You Cannot be Serious: The Impact of Accuracy
Incentives on Partisan Bias in Reports of Economic Perceptions. Quarterly Journal of
Political Science, 10(4), 489–518. https://doi.org/10.1561/100.00014127
Robison, J., & Mullinix, K. J. (2015). Elite polarization and public opinion: How polarization
is communicated and its effects. Political Communication. Advance online publication.
https://doi.org/10.1080/10584609.2015.1055526
Ross, A., & Rivers, D. (2018). Discursive Deflection: Accusation of “Fake News” and the
Spread of Mis- and Disinformation in the Tweets of President Trump. Social Media +
Society, 4(2). https://doi.org/10.1177/2056305118776010
Ross, L., & Ward, A. (1996). Naive realism in everyday life: Implications for social conflict
and misunderstanding. In E. S. Reed, E. Turiel, & T. Brown (Eds.), Values and
Knowledge (pp. 103–135). Mahwah, New Jersey: Lawrence Erlbaum Associates,
Publishers.
Schulz, A., Wirth, W., & Müller, P. (2018). We Are the People and You Are Fake News: A
Social Identity Approach to Populist Citizens’ False Consensus and Hostile Media
Perceptions. Communication Research. https://doi.org/10.1177/0093650218794854
Shehata, A., & Strömbäck, J. (2018). Learning Political News From Social Media: Network
Media Logic and Current Affairs News Learning in a High-Choice Media Environment.
Communication Research. Advance online publication.
https://doi.org/10.1177/0093650217749354
Shin, J., Jian, L., Driscoll, K., & Bar, F. (2017). Political rumoring on Twitter during the 2012
US presidential election: Rumor diffusion and correction. New Media & Society, 19(8),
1214–1235. https://doi.org/10.1177/1461444816634054
Shin, J., & Thorson, K. (2017). Partisan Selective Sharing: The Biased Diffusion of Fact-
Checking Messages on Social Media. Journal of Communication, 67(2), 233–255.
https://doi.org/10.1111/jcom.12284
Simpson, S., Puppis, M., & Van den Bulck, H. (2016). European Media Policy for the
Twenty-first Century: Assessing the Past, Setting Agendas for the Future. Routledge.
Singer, J. B. (2014). User-generated visibility: Secondary gatekeeping in a shared media
space. New Media & Society, 16(1). https://doi.org/10.1177/1461444813477833
Subramanian, S. (2017). Inside the Macedonian Fake-News Complex. Wired, February 15.
Swire, B., Berinsky, A. J., Lewandowsky, S., & Ecker, U. K. H. (2017). Processing political
misinformation: comprehending the Trump phenomenon. Royal Society Open Science,
4(3), 1–21. https://doi.org/10.1098/rsos.160802
Tambini, D. (2017). How advertising fuels fake news. Retrieved from
http://blogs.lse.ac.uk/mediapolicyproject/2017/02/24/how-advertising-fuels-fake-news/
Tandoc, E. C. J., Lim, Z. W., & Ling, R. (2017). Defining “fake news”: A typology of
scholarly definitions. Digital Journalism. Advance online publication.
https://doi.org/10.1080/21670811.2017.1360143
Timbro. (2019). Timbro Authoritarian Populism Index. Stockholm, Sweden. Retrieved from
https://populismindex.com/
Tsfati, Y., & Cappella, J. N. (2003). Do People Watch what they Do Not Trust?: Exploring
the Association between News Media Skepticism and Exposure. Communication
Research, 30(5), 504–529. https://doi.org/10.1177/0093650203253371
Turcotte, J., York, C., Irving, J., Scholl, R. M., & Pingree, R. J. (2015). News
Recommendations from Social Media Opinion Leaders: Effects on Media Trust and
Information Seeking. Journal of Computer-Mediated Communication, 20(5), 520–535.
https://doi.org/10.1111/jcc4.12127
Van Aelst, P., Strömbäck, J., Aalberg, T., Esser, F., de Vreese, C. H., Matthes, J., … Stanyer,
J. (2017). Political communication in a high-choice media environment: a challenge for
democracy? Annals of the International Communication Association, 41(1), 3–27.
https://doi.org/10.1080/23808985.2017.1288551
Van der Wurff, R. (2005). Competition, Concentration and Diversity in European Television
Markets. Journal of Cultural Economics, 29(4), 249–275.
Van Herpen, M. H. (2015). Putin’s Propaganda Machine: Soft Power and Russian Foreign
Policy. Rowman & Littlefield.
van Kessel, S. (2015). Populist Parties in Europe. London: Palgrave Macmillan UK.
https://doi.org/10.1057/9781137414113
Vargo, C. J., Guo, L., & Amazeen, M. A. (2017). The agenda-setting power of fake news: A
big data analysis of the online media landscape from 2014 to 2016. New Media &
Society. Advance online publication. https://doi.org/10.1177/1461444817712086
Wardle, C., & Derakhshan, H. (2017). Information Disorder: Toward an interdisciplinary
framework for research and policy making (Council of Europe report No. DGI(2017)09).
Strasbourg: Council of Europe.
Webster, J. G., & Ksiazek, T. B. (2012). The Dynamics of Audience Fragmentation: Public
Attention in an Age of Digital Media. Journal of Communication, 62(1), 39–56.
https://doi.org/10.1111/j.1460-2466.2011.01616.x
World Bank. (2017). World Development Indicators. Retrieved from
https://data.worldbank.org/indicator
Young, D. G., Jamieson, K. H., Poulsen, S., & Goldring, A. (2018). Fact-Checking
Effectiveness as a Function of Format and Tone: Evaluating FactCheck.org and
FlackCheck.org. Journalism and Mass Communication Quarterly, 95(1), 49–75.
https://doi.org/10.1177/1077699017710453
Figures
Figure 1: Types of Information in the Social Media Environment
Source: Partly adapted from Wardle & Derakhshan (2017, p. 5)
Figure 2: Country Values of Framework Indicators
Note: Bars show added index values per country (z-standardized). Higher values indicate
greater resilience to online disinformation; lower values indicate less resilience.
[Figure: stacked bars of added index values per country (y-axis from −15 to 10) for seven
indicators: Polarization index (inverted), Populism index (inverted), Media trust index,
Shared media, Strength of PSB, Social media index (inverted), and Market size (inverted).]
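The added index values plotted in Figure 2 are obtained by z-standardizing each indicator across countries and summing the standardized scores per country. A minimal Python sketch of that aggregation step, using hypothetical indicator values rather than the paper's actual data:

```python
from statistics import mean, stdev

# Hypothetical country values for two indicators (illustrative only;
# not the values used in the paper).
indicators = {
    "media_trust":     {"SWE": 0.55, "ITA": 0.35, "USA": 0.30},
    "strength_of_psb": {"SWE": 0.80, "ITA": 0.40, "USA": 0.05},
}

def z_scores(values):
    """z-standardize a {country: value} mapping across countries."""
    m, s = mean(values.values()), stdev(values.values())
    return {c: (v - m) / s for c, v in values.items()}

# Added index per country: sum of z-standardized indicator scores,
# analogous to the stacked bars in Figure 2.
countries = ["SWE", "ITA", "USA"]
standardized = {name: z_scores(vals) for name, vals in indicators.items()}
resilience = {c: sum(standardized[n][c] for n in indicators)
              for c in countries}
```

Because each indicator's z-scores sum to zero across countries, the added indices sum to zero as well; countries differ only in their relative position.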
Figure 3: Cluster Country Means
Note: Clusters represent country means of different framework indicators.
[Figure: cluster means plotted across the seven indicators (Polarization index (inverted),
Populism index (inverted), Media trust index, Shared media, Strength of PSB, Social media
index (inverted), Market size (inverted)) for three clusters: media-supportive (AUT, BEL,
CAN, DEN, FIN, GER, IRE, NLD, NOR, SUI, SWE, UK), polarized (GRE, ITA, POR, SPA), and the
USA.]
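The grouping behind Figure 3 comes from a cluster analysis of the country-level indicators. As a rough illustration of the grouping step only, the sketch below runs a one-dimensional k-means on a single hypothetical overall score per country; the paper clusters on the full indicator profiles, and the scores here are invented, not its data:

```python
from statistics import mean

# Hypothetical overall resilience scores (illustrative only).
scores = {
    "SWE": 6.0, "NOR": 5.5, "GER": 4.0,   # media-supportive
    "ITA": -3.0, "GRE": -4.0,             # polarized
    "USA": -9.0,                          # stand-alone cluster
}

def kmeans_1d(points, centers, iters=20):
    """Tiny 1-D k-means: assign each point to the nearest center,
    then move each center to the mean of its members."""
    for _ in range(iters):
        clusters = {i: [] for i in range(len(centers))}
        for name, x in points.items():
            nearest = min(range(len(centers)),
                          key=lambda i: abs(x - centers[i]))
            clusters[nearest].append(name)
        centers = [mean(points[n] for n in members) if members else centers[i]
                   for i, members in clusters.items()]
    return clusters

clusters = kmeans_1d(scores, centers=[6.0, -3.0, -9.0])
```

With these toy scores, the three clusters mirror the structure of Figure 3: a high-scoring group, a low-scoring group, and the USA on its own.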
Tables
Table 1: Framework with Theoretical Dimensions, Measurable Indicators, and Data
Sources
DIMENSION / MEASURABLE INDICATOR / DATA SOURCE

POLITICAL ENVIRONMENT
  Populist communication
    Vote share of populist parties 2018; change in vote share 2008–2018
      Sources: Timbro Authoritarian Populism Index (2019); Aalberg et al. (2016);
      van Kessel (2015)
    Speeches of political leaders
      Source: Global Populism Database (2019)
  Societal polarization
    Polarization of society; online media fractionalization
      Source: V-Dem (2019)

MEDIA ENVIRONMENT
  Trust in news media
    Overall trust in news media; trust in news that I use
      Source: Digital News Report (2018)
  Strength of PSB
    Market share of public TV; public revenue (license fee)
      Source: Brüggemann et al. (2014)
  Shared media
    Share of most used media outlets/programs
      Source: Digital News Report (2019)

ECONOMIC ENVIRONMENT
  Size of online media market
    Number of online users per country
      Source: World Bank Data (2017)
  Social media news consumption
    Social media use for news; sharing news on social media
      Source: Digital News Report (2018)

OUTCOME
  Exposure to online disinformation
    Reported exposure to dis- and misinformation
      Source: Digital News Report (2018)
RESILIENCE TO ONLINE DISINFORMATION 36
Table 2: Correlations between Indices
                                   (1)    (2)    (3)    (4)    (5)    (6)    (7)
(1) Populism index (inverted)       1    .42    .54*   .35    .45    .33    .31
(2) Polarization index (inverted)         1    .66**  .26    .48    .32    .43
(3) Media trust index                           1     .31    .42    .36    .31
(4) Shared media                                       1     .54*   .06    .62**
(5) Strength of PSB                                           1     .45    .58*
(6) Social media index (inverted)                                    1    -.15
(7) Market size (inverted)                                                  1
Note: N = 17; values are Pearson's correlation coefficients; marked values are
statistically significant (*p < .05, **p < .01).
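The entries in Table 2 are Pearson correlation coefficients between pairs of country-level indices. A self-contained sketch of the computation, with hypothetical index values standing in for the paper's data:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson's r: covariance divided by the product of the
    (unnormalized) standard deviations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical country-level index values (not the paper's data):
media_trust = [0.9, 0.7, 0.4, 0.2]
polarization_inv = [0.8, 0.9, 0.3, 0.1]
r = pearson(media_trust, polarization_inv)
```

Repeating this for every pair of indices fills the upper triangle of a matrix like Table 2; the diagonal is 1 by construction.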
Table 3: Framework Indicators Predicting Exposure to Disinformation
                                   B      SE     β
Constant                         -.19    .13
Populism index (inverted)        -.12    .15   -.10
Polarization index (inverted)     .36    .18    .30
Media trust index                -.58    .16   -.56***
Shared media                      .35    .17    .32
Strength of PSB                  -.16    .18   -.16
Social media index (inverted)    -.62    .15   -.62**
Market size (inverted)           -.41    .18   -.41*
R²                                .83
Note: Ordinary least squares (OLS) regression. Entries are unstandardized coefficients,
standard errors (SEs) and betas. ***p < .001; **p < .01; *p < .05.
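Table 3 reports unstandardized coefficients (B) and standardized betas from an OLS regression of reported exposure on the indices. The paper fits all seven predictors jointly; the bivariate sketch below, with hypothetical values, only illustrates the mechanics of the fit and the B-to-beta conversion:

```python
from math import sqrt

def ols(xs, ys):
    """Bivariate ordinary least squares: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical country-level values (not the paper's data).
media_trust = [0.9, 0.7, 0.4, 0.2]   # predictor
exposure    = [0.1, 0.2, 0.5, 0.6]   # outcome: reported exposure

slope, intercept = ols(media_trust, exposure)

# Standardized beta = slope * sd(x) / sd(y); with a single predictor
# this equals the Pearson correlation between x and y.
mx, my = sum(media_trust) / 4, sum(exposure) / 4
sx = sqrt(sum((x - mx) ** 2 for x in media_trust))
sy = sqrt(sum((y - my) ** 2 for y in exposure))
beta = slope * sx / sy
```

A negative slope here corresponds to the pattern in Table 3, where higher media trust predicts lower reported exposure to disinformation.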
Why do people believe blatantly inaccurate news headlines ("fake news")? Do we use our reasoning abilities to convince ourselves that statements that align with our ideology are true, or does reasoning allow us to effectively differentiate fake from real regardless of political ideology? Here we test these competing accounts in two studies (total N = 3446 Mechanical Turk workers) by using the Cognitive Reflection Test (CRT) as a measure of the propensity to engage in analytical reasoning. We find that CRT performance is negatively correlated with the perceived accuracy of fake news, and positively correlated with the ability to discern fake news from real news - even for headlines that align with individuals' political ideology. Moreover, overall discernment was actually better for ideologically aligned headlines than for misaligned headlines. Finally, a headline-level analysis finds that CRT is negatively correlated with perceived accuracy of relatively implausible (primarily fake) headlines, and positively correlated with perceived accuracy of relatively plausible (primarily real) headlines. In contrast, the correlation between CRT and perceived accuracy is unrelated to how closely the headline aligns with the participant's ideology. Thus, we conclude that analytic thinking is used to assess the plausibility of headlines, regardless of whether the stories are consistent or inconsistent with one's political ideology. Our findings therefore suggest that susceptibility to fake news is driven more by lazy thinking than it is by partisan bias per se - a finding that opens potential avenues for fighting fake news.