Mapping the Narrative Ecosystem of Conspiracy Theories in
Online Anti-vaccination Discussions
Joshua Introne
School of Information Studies, Syracuse University,
Syracuse, NY, USA
jeintron@syr.edu
Ania Korsunska
School of Information Studies, Syracuse University,
Syracuse, NY, USA
akorsuns@syr.edu
Leni Krsova
School of Information Studies, Syracuse University,
Syracuse, NY, USA
lkrsova@syr.edu
Zefeng Zhang
School of Information Studies, Syracuse University,
Syracuse, NY, USA
zzhan208@syr.edu
ABSTRACT
Recent research on conspiracy theories labels conspiracism as a distinct and deficient epistemic process. However, the tendency to pathologize conspiracism obscures the fact that it is a diverse and dynamic collective sensemaking process, transacted in public on the web. Here, we adopt a narrative framework to introduce a new analytical approach for examining online conspiracism. Narrative plays an important role because it is central to human cognition as well as being domain agnostic, and so can serve as a bridge between conspiracism and other modes of knowledge production. To illustrate the utility of our approach, we use it to analyze conspiracy theories identified in conversations across three different anti-vaccination discussion forums. Our approach enables us to capture more abstract categories without hiding the underlying diversity of the raw data. We find that there are dominant narrative themes across sites, but that there is also a tremendous amount of diversity within these themes. Our initial observations raise the possibility that different communities play different roles in the collective construction of conspiracy theories online. This offers one potential route for understanding not only cross-sectional differentiation, but the longitudinal dynamics of the narrative in future work. In particular, we are interested to examine how activity within the framework of the narrative shifts in response to news events and social media platforms' nascent efforts to control different types of misinformation. Such analysis will help us to better understand how collectively constructed conspiracy narratives adapt in a shifting media ecosystem.
CCS CONCEPTS
• Human-centered computing → Collaborative and social computing.
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.
SMSociety '20, July 22–24, 2020, Toronto, ON, Canada
© 2020 Association for Computing Machinery.
ACM ISBN 978-1-4503-7688-4/20/07...$15.00
https://doi.org/10.1145/3400806.3400828
KEYWORDS
Narratives, Conspiracy Theories, Social Media, Misinformation
ACM Reference Format:
Joshua Introne, Ania Korsunska, Leni Krsova, and Zefeng Zhang. 2020. Mapping the Narrative Ecosystem of Conspiracy Theories in Online Anti-vaccination Discussions. In International Conference on Social Media and Society (SMSociety '20), July 22–24, 2020, Toronto, ON, Canada. ACM, New York, NY, USA, 9 pages. https://doi.org/10.1145/3400806.3400828
1 INTRODUCTION
Conspiracy theories (CTheories) have long been a topic of interest in scholarly literature and popular press. Their prevalence in online media and recent appearances in public announcements from the highest levels of the US government have re-invigorated this interest. CTheories are thought to be problematic for various reasons: they provide a natural habitat for misinformation because they undercut trust in our traditional gatekeepers [2, 28], they help motivate extremist belief systems and acts [3, 38], they motivate disengagement with conventional modes of political activity [20], and they may help sustain online echo-chambers [10].
It is not surprising that many modern treatments pathologize both CTheories and the people who promote them. There are several potential pitfalls with this tendency. Attempts to build an understanding of conspiracism as a deficient alternative to other ways of knowing obscure the fact that the process of conspiracy theorizing is, like other forms of knowledge construction, a diverse and dynamic search for a collective "truth" that explains the observed world. CTheories may not draw their support from conventional sources of evidence, and theorists may not follow the scientific method, but that does not mean that the construction of CTheories is without its own logic. Some CTheories are more compelling than others for a reason, and it is unlikely that these reasons are completely unrelated to the social processes underlying the construction of non-conspiracy theories. Understanding the logic that governs the development of CTheories might not only help us understand how to intervene in those that are dangerous; it might help us arrive at a better understanding of how people collectively construct knowledge in general.
Another potential pitfall in the tendency for scholars and popular media to pathologize CTheories and their supporters is that this can become part of a broader cultural process of stigmatization, leading to bias and further balkanization. Belief in a CTheory does not mean that a theorist is crazy or even necessarily wrong. Indeed, if this were the case, over half of Americans [30] could be so classified. Widespread stigmatization is likely to create powerful in- and out-group dynamics that could have more serious social consequences.
In this paper we consider CTheories as components of a broader, shifting narrative [12, 41], and then examine the collective production of this narrative across three different online anti-vaccination discussion communities. The most significant contributions of this paper are methodological: we present a rigorous approach for identifying online conspiracy theories and mapping them using a narrative framework that was previously developed to examine online pseudo-knowledge [18, 19]. Our approach allows us to organize and analyze a diverse variety of connected conspiracy theories across a set of topic-specific discussion communities at different levels of abstraction. We find that conspiracy theories make up a relatively small percentage of user-generated discussion posts (from 1 to 10 percent) in the forums we examined, although there are clear differences in their prevalence across sites. We also find evidence that some narrative elements and topics are more popular than others, but there are community differences here as well. Finally, and perhaps most importantly, we observe that despite the existence of common themes, there is an enormous amount of diversity amongst stories that become part of an overarching conspiracy narrative. Our analysis sets the stage for future, longitudinal analyses that examine how collectively maintained conspiracy theories develop over time across disparate communities and respond to external events.
2 BACKGROUND
Though diverse, CTheories tend to share a similar underlying structure. Samory and Mitra [33] reviewed numerous definitions and identified three common elements across them: actors, who are generally thought to be powerful and highly coordinated, motivated by a desire for power or some other type of gain; actions, which are performed in secret or with an intent to deceive; and targets, which are outcomes that enrich the conspirators while causing some harm to a victimized group.
Scholarly work on CTheories builds on an established tradition that portrays conspiracy theorization as antithetical to rational, scientific thought [17, 31]. Carrying on this tradition, studies within the last several decades have found that people who maintain CTheories feel like they are disempowered and lack control, are poorly educated, ideologically eccentric, and do not think analytically [11, 32, 40, 42]. As a social phenomenon, it is thought that CTheories may be a collective response to the threat of outsiders and other threatening circumstances [23]. They are amplified in echo chambers on the internet, and weaponized by those seeking political advantage [4]. Against this backdrop, there are efforts to produce taxonomies of CTheories [1, 5] and distinguish CTheory thinking from "normal" thinking [39]. Some scholars have suggested that conspiracist ideation is a kind of monological thought process, because people who believe in one tend to believe in others as well, even if they seem logically incompatible [11, 25]. Harvard Law School professor Cass Sunstein has gone so far as to suggest that conspiracy theorists suffer from a kind of "crippled epistemology" [37]. Across this literature, belief in CTheories is often interpreted as a kind of mental disorder, and this interpretation is carried forth more broadly in Western culture [24]. As such, Lantian et al. [24] argue that this perspective motivates a fraught process of social stigmatization. On one hand, stigmatization paves the way for many injustices to potentially be visited upon the stigmatized population [26]. At the same time, those who are stigmatized develop a group identity around the stigmatized characteristic [14], and draw much needed social support from others who are part of this group [26]. This is particularly problematic when conspiracy theories become part of belief systems that motivate the rejection of scientifically justified behaviors that are important for the collective well-being of the broader population, like vaccination or reduced reliance on fossil fuels. Existing power structures might consider more authoritarian means for controlling this population (e.g., mandatory vaccination), but such policies reduce the autonomy of these individuals and, paradoxically, provide further evidence for their CTheories.
Some recent scholarship has sought to move beyond the tendency to pathologize CTheories. Several have argued that CTheories should not be considered to be irrational, because some are, in fact, true [9]. Philosopher Kurtis Hagen argues that apparent findings that conspiracy theorists adopt a monological thought process are not supported by the actual empirical data [15]. Others have suggested that conspiracy theories might better be understood as a socio-political response to distant and elite power structures that control the media and scientific knowledge production [16, 27]. Harambam and Aupers [16:466] characterize the struggle between CTheories and conventional knowledge as a "complex battle for epistemic authority in a broader field of knowledge contestation." In other words, CTheories can be understood as an alternative to conventional and privileged ways of knowing, returning power to those who are excluded.
Considering CTheories and their construction as an alternative means of knowledge production rather than a pathological affliction of the mentally imbalanced helps move us away from the path to stigmatization and brings to light questions of just how CTheories are constructed over time. Legal scholar Mark Fenster [12] views CTheories as a type of narrative that can help bridge conspiracism and conventional epistemologies. In Fenster's words, "[c]onspiracy theory . . . tells stories about the past, present, and future, and it presents an argument in narrative form for a contemporary audience about how power works" [12:121]. Narratives play a central role in all human cognition [6, 13, 34], and cognitive psychologist Jerome Bruner argued that narrative and argumentation are two primary and irreducible forms of cognition [6, 7]. Whereas the former guides everyday thought, the latter is used when carefully weighing evidence. Conspiracism might thus be understood as a type of narrative-centric cognition, whereas the production of scientific knowledge is heavily constrained by argumentative processes. However, while the rules of argumentation are well-understood, those that govern the production of narratives are not [6]. Examining online conspiracy theories might thus be understood as an investigation of the rules that govern the construction of good stories.
Adopting such a framing, Introne et al. [18] introduced a narrative approach to examine the development of pseudo-knowledge in an online discussion. They applied a story grammar that was originally developed by Stein and Glenn [36] to capture elements of a pseudoscientific theory about ancient aliens. The story grammar centered upon the intentional actions performed by actors and their consequences, and it is clear that this organization bears much similarity to the definition of CTheories offered by Samory and Mitra [33]. That is, a conspiracy theory may be understood as a story about the intentionally deceitful actions of a coordinated group, which have the consequence of harming or otherwise disenfranchising a victim.
Drawing on the preceding observations, we apply this narrative lens to examine online conspiracy theories in order to develop empirical data about conspiracism as a form of collective narrative construction. As context for our analysis, we examine a set of online anti-vaccination discussion groups. This domain is of particular interest for several reasons. The recent growth of anti-vaccine sentiment in the United States has led to a resurgence of diseases that had previously been well-controlled, and it is thought that social media has had a significant role in this [22, 29]. Others have documented the prevalence of CTheories in online anti-vaccination groups [21, 22], and there is evidence that vaccine hesitant individuals face increasing stigma [8]. Because centralized efforts to boost vaccination rates are likely to further inflame anti-vaccination sentiments, there is a distinct need for new approaches to address the problem.
Cognizant of this need, the main goal of our paper is to provide the scientific community with a new set of analytical tools for examining the development of CTheories. In the following, we introduce our methods, and then apply them to investigate three questions:

1. How prevalent are conspiracy theories in online anti-vaccination discussions, and how does the level of conspiracism vary across different communities?
2. How common are narrative elements across different anti-vaccination communities?
3. What can we say about the content of the narrative that is collectively constructed by people within and across different communities?
3 METHODS AND DATA
3.1 Data
Data for our final analysis was drawn from three anti-vaccination discussion forums identified in Kata's [22] analysis of social media anti-vaccination sentiments: Natural News [43], Mothering [44], and Above Top Secret [45]. We chose these sites in particular because they hosted a large set of publicly accessible discussions about vaccines, and we were interested in the organic process of conversation-based conspiracy theorization rather than analyzing curated sets of conspiracy theories. Because this data is public and does not contain personally identifying information, it was not considered human subjects' data by the authors' Institutional Review Board. Additionally, while all of these sites host anti-vaccination discussions, they are also quite diverse. Natural News is a blog that focuses on healthy living and alternative medicine, Mothering is a forum for mothers to share and discuss information, and Above Top Secret is an "alternative topics" discussion forum that is explicitly focused on conspiracy theories.
We scraped discussion data from these sites in October of 2019 using custom scrapers (the full R code is available at https://github.com/c4-lab/SocialMediaSociety2020). The scraped data included all publicly available information associated with each post, including the username, date/time, reply structure, and text of the post. On Natural News we scraped all user-generated discussion threads that were associated with blog posts containing the word "vaccine," resulting in a set of 85,683 individual posts, spanning roughly six years. On Above Top Secret we scraped all discussion threads with the word "vaccine" appearing in the first post of a discussion, resulting in a set of 11,655 individual posts spanning about 17 years. For Mothering, we scraped all discussions in those forums that were categorized in the site's thread index as being about vaccines, resulting in a set of 344,805 posts spanning a period of about 16 years.
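To make the filtering step concrete, the following R sketch shows how the per-site keyword filters described above might be expressed. The data frames and column names (`thread_id`, `post_index`, `text`) are illustrative assumptions, not the schema of the released scraping code.

```r
# Sketch of the per-site filters; frames and column names are assumed.
library(dplyr)

# Above Top Secret: keep threads whose opening post mentions "vaccine".
ats_vaccine <- ats_posts %>%
  group_by(thread_id) %>%
  filter(any(post_index == 1 & grepl("vaccine", text, ignore.case = TRUE))) %>%
  ungroup()

# Combine the three filtered sets, tagging each post with its site.
posts <- bind_rows(
  mutate(nn_vaccine,  site = "Natural News"),
  mutate(ats_vaccine, site = "Above Top Secret"),
  mutate(m_vaccine,   site = "Mothering")
)
```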
3.2 Identifying Conspiracy Theories
To perform our analysis, we developed a repeatable strategy for identifying conspiracy theories. To develop a codebook, we worked iteratively with a team of coders and pilot data that was distinct from the data used for the analysis presented here. We began with a working definition developed from prior work [e.g. 33], and asked a team of six coders (including the authors and several graduate students) to search for an initial set of CTheories in anti-vaccination sites discussed in prior literature [21, 22]. The team met on a weekly basis to discuss their findings and challenges, revising the definition of conspiracy theory as necessary. During this process, we encountered a large number of posts that strongly implied a conspiracist point of view (e.g., by using loaded terms like "New World Order") but did not themselves contain identifiable CTheories, and so we expanded the codebook to include a conspiratorial thinking (CThinking) category as well. We continued with this process until we had definitions for both categories that led to a high degree of intercoder agreement. Our final definitions blend prior work on CTheories and the story grammar previously discussed by Introne et al. [19]:
A conspiracy theory is a narrative explaining an *event or series of events* that involve *deceptive, coordinated actors* working together to achieve *a goal* through *an action or series of actions* that have *consequences* that intentionally *disenfranchise or harm an individual or population*.
Denitions for the six emphasized terms (which we summarize
as events, actors, goal, actions, consequences, and target) were
further elaborated in our full codebook
2
. To code an instance of
text as a CTheory, we required that coders identify four required
categories: actors, actions, consequences, and a victim. To code a
post as CThinking, coders were required to identify at least one
of the six categories, and further determine that the post implied
support for a CTheory.
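The decision rule is simple enough to state as code. The following R sketch applies it to the set of categories a coder identified in a post; the function and argument names are illustrative, not part of the codebook.

```r
# Minimal sketch of the coding decision rule described above: a post
# codes as a CTheory only when all four required categories are present;
# as CThinking when at least one category is present and the coder
# judges the post to imply support for a CTheory.
classify_post <- function(categories, implies_support) {
  required <- c("actors", "actions", "consequences", "target")
  if (all(required %in% categories)) {
    "CTheory"
  } else if (length(categories) >= 1 && implies_support) {
    "CThinking"
  } else {
    "None"
  }
}

classify_post(c("actors", "actions", "consequences", "target"), TRUE)  # "CTheory"
classify_post(c("actors"), TRUE)                                       # "CThinking"
```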
To address our research questions, we randomly selected 1,800 posts from the total set of 440k posts, split evenly across the forums. In this exploratory study, we were interested both in testing our methods and obtaining a rough indicator of the prevalence and content of conspiracy theories in open online discussions, and our only criterion for inclusion was that these posts could be generated by any self-registered (or possibly anonymous) visitor. Thus, while the three sites have different affordances, these differences were not germane to the current inquiry. Moreover, because we were interested in getting a sense of the true proportions of CTheories and CThinking, we did not exclude empty or one-word posts from our set (we note, however, that there were very few such posts).
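In R, the even split across forums might look like the following sketch, assuming the combined `posts` frame from Section 3.1; the seed is illustrative, as the paper does not report one.

```r
# Sketch of the sampling step: 600 posts drawn uniformly at random from
# each forum (1,800 in total), with no length-based exclusions.
library(dplyr)

set.seed(42)  # illustrative seed
sampled <- posts %>%
  group_by(site) %>%
  slice_sample(n = 600) %>%
  ungroup()

table(sampled$site)  # 600 posts per forum
```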
To validate the reliability of our coding approach, we first selected 90 posts that were longer than 10 words from the set of 1,800 posts, with even representation across our three sites. All of the authors and four additional coders who had helped to develop the codebook coded this set. Evaluating the final category assignments, we found that the coders achieved Krippendorff's α = .88 for this set of posts, which is generally considered to be an excellent level of inter-rater reliability. Any disagreements were subsequently resolved through discussion, and these results were retained for our final analysis. Finally, the team of coders worked individually to code the remaining posts, discussing any problematic posts as a group during weekly meetings.
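For reference, a computation along these lines can be performed with R's irr package, whose kripp.alpha() function expects a coders-by-units matrix; the long-format `reliability_set` frame and its columns below are our assumptions, not the paper's released code.

```r
# Sketch: reliability over the 90-post validation set.
# `reliability_set` is assumed to have columns coder, post_id, and code
# (code in {"CTheory", "CThinking", "None"}).
library(irr)

wide <- reshape(reliability_set, idvar = "coder",
                timevar = "post_id", direction = "wide")
ratings <- as.matrix(wide[, -1])  # one row per coder, one column per post

kripp.alpha(ratings, method = "nominal")  # the paper reports alpha = .88
```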
3.3 Mapping the narrative
Figure 1: Template for mapping conspiracy narratives

After coding the data, we assembled it into a map using the mind-mapping software SimpleMind [46] in order to help us visualize the data and identify repeating categories of items. The mapping process was led by the second author, with assistance from the other authors. Figure 1 presents the template we used to guide the mapping process. As posts were added to the map, it grew as a tree. Elements drawn from the actual posts were first attached to relevant categories, referencing the unique identifier associated with the post. During this process, it became apparent that many items were instances of common categories based on how they were used and shared attributes (e.g., "Merck" and "corrupt medical system" both belong to a common class we labeled as "Big Pharma"). Therefore, following a narrow type of taxonomic analysis [35], whenever we encountered two items that were both "a kind of" a common class (subjectively assessed by the second author), a parent node was introduced. As the map was constructed, the broader team reviewed these category assignments, which were adjusted as necessary. Whenever we encountered an item that we had already seen, we simply added the identifier of the post to that item. Moving through the posts in this manner, we occasionally encountered posts that we felt were problematic. We discussed these as a team and updated the codes as necessary. Thus, the mapping process served as a final check on the quality of the final coding. Both CTheories and CThinking were added to the map.
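Although the map itself was assembled by hand in SimpleMind, the bookkeeping it implements can be sketched in R; the data structures and identifiers below are purely illustrative.

```r
# Sketch of the two mapping rules described above: re-encountered items
# accumulate post identifiers, and "a kind of" matches are grouped under
# a shared parent node.
map <- new.env()

add_item <- function(category, item, post_id, parent = NA) {
  key <- paste(category, item, sep = " / ")
  node <- if (exists(key, envir = map, inherits = FALSE)) {
    get(key, envir = map)
  } else {
    list(parent = parent, posts = character(0))
  }
  node$posts <- union(node$posts, post_id)
  assign(key, node, envir = map)
}

# "Merck" and "corrupt medical system" are both "a kind of" Big Pharma.
add_item("Actor", "Merck", "post-0042", parent = "Big Pharma")
add_item("Actor", "corrupt medical system", "post-0107", parent = "Big Pharma")
```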
4 FINDINGS
To help illustrate the kinds of CTheories and instances of CThinking our coders found, we have included examples of each category of post below. In each case, we have fixed minor spelling and grammatical errors, and marked identified story categories with asterisks:
Conspiracy Theory: I'm not saying it's not dangerous, I think that would be a given since they are *rolling it out with practically no testing*. All I was saying is it has nothing to do with FEMA camps, mass graves/lots of coffins or depopulation. It's *big pharma* trying to *make money* by *creating a pandemic* and *raking in the cash* for *making a vaccine*. And I have already seen what goes into vaccines from various sources. (AboveTopSecret.com; the *general population* is the implied target in "rolling it out")
Conspiratorial Thinking: Quote: Public health specialists and manufacturers are working frantically to develop vaccines, drugs, strategies . . . Oh, I'll just bet they are. The SARS "epidemic" fizzled, so they need to focus on something else. These *scare tactics* benefit the *pharmaceutical industry*, including *the doctors who pimp for them*. (Mothering.com; implication that epidemics are deliberate, but unclear consequences or target)
None: I've never had a flu vaccine, nor will I ever. Back in 1957 I was told I got the Asian flu. No fun, that's for sure, but I sweated it out and haven't had the flu since then. I did have a nasty cold in 2000, but other than that.... It's horrible what's going on in the world of *Big Pharma and their lemming-like medical docs and patients*. (NaturalNews.com; unclear that this is describing a conspiracy theory)

Table 1: Coding results.

Site | Word Count Distribution | Number CTheories (%) | Number CThinking (%) | Total (%)
Mothering | 1/33/74/136/1666 | 1 (0.17%) | 4 (0.67%) | 5 (0.83%)
Natural News | 0/14/35/75/1267 | 15 (2.5%) | 32 (5.3%) | 47 (7.8%)
Above Top Secret | 5/49/99/188/1112 | 27 (4.5%) | 34 (5.7%) | 61 (10.2%)
Total | | 43 (2.4%) | 70 (3.9%) | 113 (6.3%)

Figure 2: Categories of story elements used by site
Table 1 presents overall statistics from our coding process. Addressing our first research question, we find that complete conspiracy theories are relatively rare, instances of conspiratorial thinking are roughly three times as prevalent, and that there are clear differences amongst the different sites, with Mothering exhibiting the least prevalence of conspiracism and Above Top Secret exhibiting the most. Table 1 also includes summary statistics for the number of words in each post, suggesting that the prevalence of conspiracy theories was not a simple function of the general verbosity of the forum. The word count distribution reflects the number of words per post and reports (Min/1st Q/Median/3rd Q/Max).
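A computation along the lines of Table 1 might look like the following R sketch, assuming a `coded` frame with `site`, `text`, and `code` columns (these names are our assumptions).

```r
# Sketch of the Table 1 summaries over the coded sample.
library(dplyr)

coded %>%
  mutate(words = lengths(strsplit(text, "\\s+"))) %>%
  group_by(site) %>%
  summarise(
    word_dist = paste(fivenum(words), collapse = "/"),  # Min/Q1/Median/Q3/Max
    ctheory   = sum(code == "CTheory"),
    cthinking = sum(code == "CThinking"),
    pct_total = round(100 * mean(code != "None"), 1)
  )
```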
We note that these numbers only reflect the quantity of observable conspiracism in posts that are taken out of context, and that our method does not capture the number of posters who voice agreement or disagreement with such posts. In the course of our coding we observed many such posts across all sites.
Addressing our second research question, we first examined the overall proportions of the story roles present in just the CTheory posts. As illustrated in Figure 2, the pattern of category use is relatively consistent across the sites, even given the relatively small number of CTheory posts on Mothering. In general, conspiracist posts seem to dwell primarily on actors, followed by actions, and then the other categories, with events being the least prevalent. The prevalence of actors and actions is consistent with the finding reported in Samory and Mitra [33]. However, we also note that Above Top Secret seems to emphasize actions and targets more than the others. A chi-squared test revealed that these differences were weakly significant in the "Target" category (p < .05), but with our limited sample, it is unclear how meaningful this trend is, and we return to this observation in the discussion.
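A test consistent with this description might be set up as follows in R; the `has_target` column is an assumption about how the codes were tabulated, not a documented variable.

```r
# Sketch of the Target comparison: a contingency table of whether each
# CTheory post names a target, by site.
ctheory <- subset(coded, code == "CTheory")
tab <- table(ctheory$site, ctheory$has_target)
chisq.test(tab)
# With cell counts this small, a simulated p-value is more robust:
chisq.test(tab, simulate.p.value = TRUE, B = 10000)
```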
After calculating these general statistics, we turned our attention to the narrative itself. To help summarize the content of the map, we calculated the number of instances of each subcategory of the six top-level story categories. These results are summarized in Table 2. For those instances in which either higher-level categories are not yet apparent, or no specific examples were provided, the example column is left blank. As is apparent in the table, there are several dominant categories found across all conspiracy theories. Following our narrative template, we might summarize the collected CTheories as the following: Big Pharma and other powerful institutions distribute dangerous and toxic vaccines to the general public and other disadvantaged or powerless groups in order to make money or gain power, and these vaccines cause harm, suffering, and possibly death.
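These tallies could be reproduced from a long-format export of the map along the following lines; `map_items` and its columns are assumptions for illustration.

```r
# Sketch of the Table 2 tallies: one row per (category, topic, site,
# post_id) in the assumed export, counted and spread by site.
library(dplyr)
library(tidyr)

map_items %>%
  distinct(category, topic, site, post_id) %>%
  count(category, topic, site) %>%
  pivot_wider(names_from = site, values_from = n, values_fill = 0) %>%
  mutate(Total = rowSums(across(where(is.numeric))))
```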
However, the value of the map as an analytical tool lies in its ability to make apparent the extraordinary diversity of the individual stories and story elements that sustain this overall narrative. The complete map contains 171 nodes, and it is too large to include in this manuscript in a legible format. Figure 3 includes an overview of the map, and a more detailed view is shown in Figure 4. The complete map is available to view in detail online at https://c4-lab.github.io/assets/files/map-CTs-SMSociety2020.pdf.
Table 2: Summary of topics across each of the story categories and forums analyzed

Category | Topic | Examples | ATS | NN | M | Total
Event | Cover up | Outbreaks of disease in vaccinated populations; industry whistleblowers | 4 | 0 | 0 | 4
Event | Agreement/Legislation | Passage of the NVCIP; industrial mergers | 2 | 1 | 0 | 3
Event | Outbreak/Epidemic | Flu outbreak; anticipated pandemic | 1 | 2 | 1 | 4
Event | Court Settlement | Awards through the NVCIP | 2 | 0 | 0 | 2
Event | Inscriptions on the Georgia Guidestones | - | 0 | 1 | 0 | 1
Event | Sterilization hormones in vaccines | - | 2 | 0 | 0 | 2
Event | Vaccinations given at a specific point in time | Salk polio vaccine; vaccinations in school | 2 | 0 | 0 | 2
Actor | Big Pharma | Merck; corrupt medical system | 15 | 6 | 1 | 22
Actor | Government, Institutions, Universities | US Government; CDC; Rotary Club; Rockefeller Foundation | 6 | 5 | 1 | 12
Actor | Individual People | Brian Deer; Bill Gates; Obama | 4 | 1 | 0 | 5
Actor | Media | Jimmy Kimmel; corporate media | 1 | 3 | 0 | 4
Actor | Non-specific Nefarious Conspirator | The eugenicists; evil rich people; the powers that be; deepstate nwo guys | 1 | 3 | 0 | 4
Actor | Unethical doctor | Jonas Salk; Paul Offit | 2 | 0 | 0 | 2
Goal | Make Money | Sell vaccines; medical dependency | 10 | 6 | 0 | 16
Goal | Cause Harm | Depopulation; cause autism | 5 | 4 | 0 | 9
Goal | Save money | - | 2 | 0 | 0 | 2
Action | Distribution of bad quality/harmful vaccine | Vaccine adjuvants; vaccine microchips; hormones; other diseases | 19 | 9 | 1 | 29
Action | Pushing an agenda | Brainwashing; stop competitors; keep up the CDC vaccine schedule | 0 | 6 | 1 | 7
Action | Cover Up/Withholding of information | Research fraud; criminal collusion; media fairy tales about illness | 4 | 1 | 0 | 5
Consequence | Health-related Suffering | Autism; other illnesses; mental infirmity; fertility issues | 15 | 10 | 0 | 25
Consequence | Death | - | 7 | 7 | 0 | 14
Consequence | Loss of freedom | Chip implant; unable to protect oneself | 3 | 1 | 0 | 4
Consequence | Professional Harm | Ruin the reputations of good scientists (Wakefield) | 2 | 0 | 0 | 2
Target | The general population | Sheeple; brain damaged gullible masses | 10 | 4 | 1 | 15
Target | Children | Infants and specific individuals | 5 | 3 | 2 | 10
Target | Nationality/Citizens | Africans; Third-world countries | 2 | 2 | 0 | 4
Target | Individual People | John Noveske | 2 | 0 | 0 | 2
Target | Scientists | - | 2 | 0 | 0 | 2

5 DISCUSSION AND CONCLUSIONS
We have presented an approach for analyzing online CTheory content through a narrative lens. Our aim is to shed light on the collective processes through which CTheories are developed without
pathologizing conspiracism. Using narrative as a vehicle for this purpose is valuable because the process of narrative construction is a central and ubiquitous activity across many domains of human knowledge. Thus, while our analysis has focused on conspiracy theories in online anti-vaccination discussions, our approach could be applied more broadly across other kinds of stories as well. This raises the possibility of comparative analyses that contrast CTheories with other kinds of knowledge. Our initial results suggest that, while the production of complete CTheories is relatively rare across online discussion posts, there are both dominant themes and a striking amount of diversity within them. An important question is whether and how these patterns differ across other domains, for instance, within the narratives that develop around political events or scientific theories. Building on Bruner's work [7], we have proposed that conspiracism is a type of story construction that is liberated from slower, more deliberate verification processes. Thus, we anticipate that the level of diversity we have observed here is a unique property of collective conspiracism. This may have important implications for how quickly a conspiracy narrative can adapt over time.
Our data also suggest that there may be differences amongst anti-vaccination discussion communities with respect to the types of story elements they focus on and the topics they emphasize. For example, posts on Above Top Secret focus relatively more attention on the targets of conspiracy theories (Figure 2) in general, as well as on actions related to "pushing an agenda" (Table 2). We cannot make any strong claims yet, but these initial observations raise the possibility that different communities play different roles in the collective construction of conspiracy theories online. If future work examines the longitudinal dynamics of the narrative, a dimension that our analysis has not covered, it might shed light on another aspect of how these ecosystems work in concert to narrate conspiracy theories.

Figure 3: General view of the full map
Figure 4: A detailed view of the "Action" section of the map
We believe our initial results offer one potential route for understanding not only cross-sectional differentiation, but the longitudinal dynamics of the narrative. In particular, we are interested in examining how activity within the framework of the narrative shifts in response to news events and social media platforms' nascent efforts to control different types of misinformation. Such analysis will help us to better understand how collectively constructed conspiracy narratives adapt in a shifting media ecosystem.
ACKNOWLEDGMENTS
This research is supported by the National Science Foundation under grant no. 1908407. We thank Sehrish Ahmed, Jingxian Sun, Zimo Xu, Yankun Wang, and Mingkang Zhang for their help with coding and data scraping. We also thank our reviewers for their valuable comments.
REFERENCES
[1] Michael Barkun. 2013. A Culture of Conspiracy: Apocalyptic Visions in Contemporary America. Univ of California Press.
[2] Michael Barkun. 2016. Conspiracy Theories as Stigmatized Knowledge. Diogenes (October 2016). DOI:https://doi.org/10.1177/0392192116669288
[3] Jamie Bartlett and Carl Miller. 2010. The Power of Unreason. Demos, London.
[4] David A. Broniatowski, Amelia M. Jamison, SiHua Qi, Lulwah AlKulaib, Tao Chen, Adrian Benton, Sandra C. Quinn, and Mark Dredze. 2018. Weaponized Health Communication: Twitter Bots and Russian Trolls Amplify the Vaccine Debate. Am J Public Health 108, 10 (August 2018), 1378–1384. DOI:https://doi.org/10.2105/AJPH.2018.304567
[5] Robert Brotherton, Christopher C. French, and Alan D. Pickering. 2013. Measuring Belief in Conspiracy Theories: The Generic Conspiracist Beliefs Scale. Frontiers in Psychology 4 (2013). DOI:https://doi.org/10.3389/fpsyg.2013.00279
[6] Jerome Bruner. 1991. The Narrative Construction of Reality. Critical Inquiry 18, 1 (1991), 1–22. DOI:https://doi.org/10.1086/448619
[7] Jerome S. Bruner. 1986. Actual Minds, Possible Worlds. Harvard University Press.
[8] Richard M. Carpiano and Nicholas S. Fitz. 2017. Public attitudes toward child undervaccination: A randomized experiment on evaluations, stigmatizing orientations, and support for policies. Social Science & Medicine 185 (July 2017), 127–136. DOI:https://doi.org/10.1016/j.socscimed.2017.05.014
[9] David Coady. 2012. What to Believe Now: Applying Epistemology to Contemporary Issues. John Wiley & Sons.
[10] Michela Del Vicario, Gianna Vivaldo, Alessandro Bessi, Fabiana Zollo, Antonio Scala, Guido Caldarelli, and Walter Quattrociocchi. 2016. Echo Chambers: Emotional Contagion and Group Polarization on Facebook. Scientific Reports 6, 1 (December 2016). DOI:https://doi.org/10.1038/srep37825
[11] Karen M. Douglas, Robbie M. Sutton, Mitchell J. Callan, Rael J. Dawtry, and Annelie J. Harvey. 2016. Someone is pulling the strings: hypersensitive agency detection and belief in conspiracy theories. Thinking & Reasoning 22, 1 (January 2016), 57–77. DOI:https://doi.org/10.1080/13546783.2015.1051586
[12] Mark Fenster. 2008. Conspiracy Theories: Secrecy and Power in American Culture (Rev. and updated ed.). University of Minnesota Press, Minneapolis.
[13] Walter R. Fisher. 1989. Human Communication as Narration: Toward a Philosophy of Reason, Value, and Action. University of South Carolina Press.
[14] Erving Goffman. 2009. Stigma: Notes on the Management of Spoiled Identity. Simon and Schuster.
[15] Kurtis Hagen. 2018. Conspiracy Theorists and Monological Belief Systems. In Taking Conspiracy Theories Seriously. Rowman & Littlefield Publishers, Lanham.
[16] Jaron Harambam and Stef Aupers. 2015. Contesting epistemic authority: Conspiracy theories on the boundaries of science. Public Understanding of Science 24, 4 (May 2015), 466–480. DOI:https://doi.org/10.1177/0963662514559891
[17] Richard Hofstadter. 2012. The Paranoid Style in American Politics. Vintage.
[18] Joshua Introne, Irem Gokce Yildirim, Luca Iandoli, Julia DeCook, and Shaima Elzeini. 2018. How People Weave Online Information Into Pseudoknowledge. Social Media + Society 4, 3 (July 2018). DOI:https://doi.org/10.1177/2056305118785639
[19] Joshua Introne, Luca Iandoli, Julia DeCook, Irem Gokce Yildirim, and Shaima Elzeini. 2017. The Collaborative Construction and Evolution of Pseudo-knowledge in Online Conversations. In Proceedings of the 8th International Conference on Social Media & Society (#SMSociety17), ACM Press, Toronto, ON, Canada, 1–10. DOI:https://doi.org/10.1145/3097286.3097297
[20] Daniel Jolley and Karen M. Douglas. 2014. The social consequences of conspiracism: Exposure to conspiracy theories decreases intentions to engage in politics and to reduce one's carbon footprint. British Journal of Psychology 105, 1 (February 2014), 35–56. DOI:https://doi.org/10.1111/bjop.12018
[21] Anna Kata. 2010. A postmodern Pandora's box: Anti-vaccination misinformation on the Internet. Vaccine 28, 7 (February 2010), 1709–1716. DOI:https://doi.org/10.1016/j.vaccine.2009.12.022
[22] Anna Kata. 2012. Anti-vaccine activists, Web 2.0, and the postmodern paradigm – An overview of tactics and tropes used online by the anti-vaccination movement. Vaccine 30, 25 (May 2012), 3778–3789. DOI:https://doi.org/10.1016/j.vaccine.2011.11.112
[23] Péter Krekó. 2015. Conspiracy theory as collective motivated cognition. In The Psychology of Conspiracy. Routledge/Taylor & Francis Group, New York, NY, US, 62–75.
[24] Anthony Lantian, Dominique Muller, Cécile Nurra, Olivier Klein, Sophie Berjot, and Myrto Pantazi. 2018. Stigmatized beliefs: Conspiracy theories, anticipated negative evaluation of the self, and fear of social exclusion. European Journal of Social Psychology 48, 7 (2018), 939–954. DOI:https://doi.org/10.1002/ejsp.2498
[25] Stephan Lewandowsky, Klaus Oberauer, and Gilles E. Gignac. 2013. NASA Faked the Moon Landing—Therefore, (Climate) Science Is a Hoax: An Anatomy of the Motivated Rejection of Science. Psychological Science 24, 5 (May 2013), 622–633. DOI:https://doi.org/10.1177/0956797612457686
[26] Bruce G. Link and Jo C. Phelan. 2001. Conceptualizing Stigma. Annual Review of Sociology 27, 1 (August 2001), 363–385. DOI:https://doi.org/10.1146/annurev.soc.27.1.363
[27] Stephen M. E. Marmura. 2014. Likely and Unlikely Stories: Conspiracy Theories in an Age of Propaganda. International Journal of Communication 8 (September 2014), 19.
[28] Paul Mihailidis and Samantha Viotty. 2017. Spreadable Spectacle in Digital Culture: Civic Expression, Fake News, and the Role of Media Literacies in "Post-Fact" Society. American Behavioral Scientist 61, 4 (April 2017), 441–454. DOI:https://doi.org/10.1177/0002764217701217
[29] Brendan Nyhan, Jason Reifler, and Sean Richey. 2012. The Role of Social Networks in Influenza Vaccine Attitudes and Intentions Among College Students in the Southeastern United States. Journal of Adolescent Health 51, 3 (September 2012), 302–304. DOI:https://doi.org/10.1016/j.jadohealth.2012.02.014
[30] J. Eric Oliver and Thomas J. Wood. 2014. Conspiracy Theories and the Paranoid Style(s) of Mass Opinion. American Journal of Political Science 58, 4 (2014), 952–966. DOI:https://doi.org/10.1111/ajps.12084
[31] Karl Popper. 2012. The Open Society and Its Enemies. Routledge. DOI:https://doi.org/10.4324/9780203820377
[32] Jan-Willem van Prooijen and Nils B. Jostmann. 2013. Belief in conspiracy theories: The influence of uncertainty and perceived morality. European Journal of Social Psychology 43, 1 (February 2013), 109–115. DOI:https://doi.org/10.1002/ejsp.1922
[33] Mattia Samory and Tanushree Mitra. 2018. "The Government Spies Using Our Webcams": The Language of Conspiracy Theories in Online Discussions. Proceedings of the ACM on Human-Computer Interaction 2, CSCW (November 2018), 1–24. DOI:https://doi.org/10.1145/3274421
[34] Roger C. Schank and Robert P. Abelson. 1995. Knowledge and memory: The real story. In Knowledge and Memory: The Real Story. Lawrence Erlbaum Associates, Inc, Hillsdale, NJ, US, 1–85.
[35] James P. Spradley. 2016. The Ethnographic Interview. Waveland Press.
[36] N. L. Stein and C. G. Glenn. 1979. An analysis of story comprehension in elementary school children. Vol. 2. Ablex, Norwood, NJ.
[37] Cass R. Sunstein. 2014. Conspiracy Theories and Other Dangerous Ideas. Simon and Schuster.
[38] Cass R. Sunstein and Adrian Vermeule. 2009. Conspiracy Theories: Causes and Cures. Journal of Political Philosophy 17, 2 (June 2009), 202–227. DOI:https://doi.org/10.1111/j.1467-9760.2008.00325.x
[39] Viren Swami, David Barron, Laura Weis, Martin Voracek, Stefan Stieger, and Adrian Furnham. 2017. An examination of the factorial and convergent validity of four measures of conspiracist ideation, with recommendations for researchers. PLOS ONE 12, 2 (February 2017), e0172617. DOI:https://doi.org/10.1371/journal.pone.0172617
[40] Viren Swami, Martin Voracek, Stefan Stieger, Ulrich S. Tran, and Adrian Furnham. 2014. Analytic thinking reduces belief in conspiracy theories. Cognition 133, 3 (December 2014), 572–585. DOI:https://doi.org/10.1016/j.cognition.2014.08.006
[41] Katharina Thalmann. 2019. The Stigmatization of Conspiracy Theory since the 1950s: "A Plot to Make Us Look Foolish." Routledge. DOI:https://doi.org/10.4324/9780429020353
[42] J. A. Whitson and A. D. Galinsky. 2008. Lacking Control Increases Illusory Pattern Perception. Science 322, 5898 (October 2008), 115–117. DOI:https://doi.org/10.1126/science.1159845
[43] Natural News. Retrieved from https://naturalnews.com
[44] Mothering. Retrieved from https://mothering.com
[45] AboveTopSecret. Retrieved from https://abovetopsecret.com
[46] SimpleMind. ModelMaker Tools BV - SimpleApps. Retrieved from https://simplemind.eu/
... For example, a straightforward approach that considers all content from CT-related forums to be conspiratorial (Phadke, Samory, and Mitra 2021a;Klein, Clutton, and Dunn 2019;Bessi et al. 2015) can produce a large number of false positives, whereas a keyword-driven method that uses predetermined keywords to extract conspiracy theories on a specific topic (Kim and Kim 2023;Hoseini et al. 2023) may overlook the false negatives and is frequently limited in scope and generalizability. Other methods, such as pattern matching (Kou et al. 2017;Introne et al. 2020), which match textual messages with predefined syntactical elements, can be labor-intensive and may miss narratives that lack exact matching. Recent works have also employed machine learning techniques to automate the classification of CTs, but they are frequently limited to specific topics or lack clear and consistent criteria for identifying conspiratorial content (Platt, Brown, and Venske 2022). ...
... Elements of conspiracy theories. Several scholarly authors (Introne et al. 2020;Zonis and Joseph 1994;Mompelat et al. 2022;Wood and Douglas 2015) have examined the components of conspiracy theories. For example, Introne et al. (2020) highlighted six terms contained within a conspiracy theory: 1) events, 2) actors, 3) goal, 4) actions, 5) consequences, and 6) target. ...
... Several scholarly authors (Introne et al. 2020;Zonis and Joseph 1994;Mompelat et al. 2022;Wood and Douglas 2015) have examined the components of conspiracy theories. For example, Introne et al. (2020) highlighted six terms contained within a conspiracy theory: 1) events, 2) actors, 3) goal, 4) actions, 5) consequences, and 6) target. In contrast, Zonis and Joseph (1994) argued that the tenet consists of four rather than six basic components: 1) a number of actors joining together, 2) in a secret agreement, 3) to achieve a hidden goal, and 4) which is perceived to be unlawful or malevolent. ...
Article
Online discussions frequently involve conspiracy theories, which can contribute to the proliferation of belief in them. However, not all discussions surrounding conspiracy theories promote them, as some are intended to debunk them. Existing research has relied on simple proxies or focused on a constrained set of signals to identify conspiracy theories, which limits our understanding of conspiratorial discussions across different topics and online communities. This work establishes a general scheme for classifying discussions related to conspiracy theories based on authors' perspectives on the conspiracy belief, which can be expressed explicitly through narrative elements, such as the agent, action, or objective, or implicitly through references to known theories, such as chemtrails or the New World Order. We leverage human-labeled ground truth to train a BERT-based model for classifying online CTs, which we then compared to the Generative Pre-trained Transformer machine (GPT) for detecting online conspiratorial content. Despite GPT's known strengths in its expressiveness and contextual understanding, our study revealed significant flaws in its logical reasoning, while also demonstrating comparable strengths from our classifiers. We present the first large-scale classification study using posts from the most active conspiracy-related Reddit forums and find that only one-third of the posts are classified as positive. This research sheds light on the potential applications of large language models in tasks demanding nuanced contextual comprehension.
... While other authors also focused on methods that analyze conspiratorial text at the span level [Holur et al., 2022, Samory and Mitra, 2018, Introne et al., 2020, the main contribution of our work is that we take into account the importance of intergroup conflict. Holur et al. [2022] recognize the relevance of IGC: "conspiracy theories and their constituent threat narratives share a signature semantic structure: an implicitly accepted Insider group; a diverse group of threatening Outsiders". ...
... In contrast, we use a flexible span definition that allows for different types of actors and other narrative elements. Introne et al. [2020] propose a span-level scheme of six categories (event, actor, goal, action, consequence, target), and use it to analyze 236 messages from anti-vaccination forums. They distinguish between conspiracy theories and conspiratorial thinking, a category that implies only passive support. ...
Article
Full-text available
The current prevalence of conspiracy theories on the internet is a significant issue, tackled by many computational approaches. However, these approaches fail to recognize the relevance of distinguishing between texts which contain a conspiracy theory and texts which are simply critical and oppose mainstream narratives. Furthermore, little attention is usually paid to the role of inter‐group conflict in oppositional narratives. We contribute by proposing a novel topic‐agnostic annotation scheme that differentiates between conspiracies and critical texts, and that defines span‐level categories of inter‐group conflict. We also contribute with the multilingual XAI‐DisInfodemics corpus (English and Spanish), which contains a high‐quality annotation of Telegram messages related to COVID‐19 (5000 messages per language). We also demonstrate the feasibility of an NLP‐based automatization by performing a range of experiments that yield strong baseline solutions. Finally, we perform an analysis which demonstrates that the promotion of intergroup conflict and the presence of violence and anger are key aspects to distinguish between the two types of oppositional narratives, that is, conspiracy versus critical.
... Langguth et al. (2023) study stances toward 12 prevalent COVID-19 narratives on Twitter, emphasizing the role of social media in shaping public opinion. Pogorelov et al. (2021) delve into conspiracy theories, focusing on COVID-19 related 5G claims and evaluating whether tweets expressed Introne et al. (2020) explore anti-vaccination narratives in online discussion forums, offering insights into their evolution and influence. A comprehensive codebook for COVID-19 narratives for a period of three years was proposed by Kotseva et al. (2023). ...
Preprint
Full-text available
Misleading narratives play a crucial role in shaping public opinion during elections, as they can influence how voters perceive candidates and political parties. This entails the need to detect these narratives accurately. To address this, we introduce the first taxonomy of common misleading narratives that circulated during recent elections in Europe. Based on this taxonomy, we construct and analyse UKElectionNarratives: the first dataset of human-annotated misleading narratives which circulated during the UK General Elections in 2019 and 2024. We also benchmark Pre-trained and Large Language Models (focusing on GPT-4o), studying their effectiveness in detecting election-related misleading narratives. Finally, we discuss potential use cases and make recommendations for future research directions using the proposed codebook and dataset.
... Betsch et al. (Betsch et al. 2012) discuss how anti-vaccine messaging is easy to encounter online but information about true risk probability is almost always absent, the combination of which likely increases perceptions of risk and ultimately erosion in trust of medical institutions. Introne et al. (Introne et al. 2020) argue that narratives in social media environments around anti-vaccine efforts are persuasive in their construction. ...
Article
Full-text available
Anti-vaccine advocates have long engaged in a harmful and baseless campaign that ties infertility to vaccines. Preliminary analysis of a large, real-time data collection of Twitter posts that filtered for COVID-19 vaccine and fertility content revealed a surprising moment when, after the vaccines rolled out, anti-vaccine advocates made a new tie to menstrual health not previously seen in anti-vaccine advocacy. We traced the tie to a co-opting of reports commencing soon after vaccines were distributed when people began to report menstrual anomalies—anomalies that would be alarming unless they could be confirmed to be a temporary side effect. These reports, which were attempts to cross-check experiences with others, soon became new sites of “convergence” where information exchange became disordered. Discourse analysis describes how a politicized debate around vaccines, reproductive health, the medical establishment, and the validity of lived experiences was differently argued. This disordered information space can be further characterized by three waves of “social convergence” of people with varying intentions and knowledge. Furthermore, a verification analysis of international popular press articles revealed a long delay before medical groups addressed the link anywhere in the world. Meanwhile, worrisome links between vaccines and other health matters—myo- and pericarditis—were rapidly confirmed with their attendant fears managed. One year after vaccines were distributed, insufficient messaging to women and menstruators persisted; meanwhile, disinforming behaviors were fueled by worry and the ready presence of anti-vaccine activists, illustrating how marginalized topics provide “fertile ground” for information disorder. Results contribute theoretically to our understanding of information disorder, with additional lessons for public health communication and disinformation mitigation.
... In relation to this topic, a framework for mapping conspiracy theory narratives within the framework of anti-vaccine discourses is proposed in [15]. The authors characterise the narratives with a graph containing events, actors, goals, actions, consequences and targets. ...
Article
Full-text available
One of the fundamental components of understanding online discourse in social networks is the identification of narratives. For example, the analysis of disinformation campaigns requires some inference about their communication goals that, in turn, requires the identification of the narratives that they promote. The research in this task involves a number of challenges such as the limited availability of labelled datasets, the subjectivity of the annotators and the time cost of annotation. This article present a definition of the Narrative Identification task, proposes an evaluation framework for Narrative Identification, and a methodology for the creation and annotation of Narrative Identification datasets taking into account the subjectivity of the task. Keeping in mind the goal of comparing systems performance, we explore how to reduce the annotation time while maintaining the reliability of the evaluation. Following this methodology, a set of eight tasks for narrative identification in the political domain has been developed in Spanish and English. Finally, we validated the evaluation framework by analysing its application to DIPROMATS 2024 shared task, together with the performance analysis of baseline and participant systems.
... Other related research focuses on the personality traits of the user; e.g., Lobato et al. (2020) show that individuals with personality traits high in traditionalism and low in social dominance were more willing to share misinformation about COVID-19, and a meta-analysis of COVID-19 misinformation by van Mulukom et al. (2020) finds that biases, group identity, and distrust in institutions contribute to misinformation sharing. There are also investigations into how hyperpartisan news and misinformation spreads online (Haber et al. 2021;Introne et al. 2020), as well as possible interventions for minimizing users' engagement with such content (Bak-Coleman et al. 2022;Bhuiyan et al. 2021;Masrour et al. 2020;Pennycook et al. 2021;Aslett et al. 2022;Nyhan 2021). Yet, beyond an examination of how users navigate YouTube to access more extreme content (Ribeiro et al. 2020), there is little research on userlevel long-term trends in news engagement. ...
Article
Understanding how political news consumption changes over time can provide insights into issues such as hyperpartisanship, filter bubbles, and misinformation. To investigate long-term trends of news consumption, we curate a collection of over 60M tweets from politically engaged users over seven years, annotating ~10% with mentions of news outlets and their political leaning. We then train a neural network to forecast the political lean of news articles Twitter users will engage with, considering both past news engagements as well as tweet content. Using the learned representation of this model, we cluster users to discover salient patterns of long-term news engagement. Our findings include the following: (1) hyperpartisan users are more engaged with news; (2) right-leaning users engage with contra-partisan sources more than left-leaning users; (3) topics such as immigration, COVID-19, Islamaphobia, and gun control are salient indicators of engagement with low quality news sources.
Article
Full-text available
This paper examines syntactic negation in conspiracy theories from a discourse and constructionist perspective. A corpus study provides an overview of the use of negation in different conspiracy theories, where it can serve to deny the official version, i.e., the visible plot, in order to present the conspiracy theory (the invisible plot) as more believable. The corpus linguistic approach reveals a syntactic pattern of negation, which can be considered a form-meaning pair in the sense of Construction Grammar and which is analyzed in more detail in two selected conspiracy theories (attack on the Berlin Christmas market 2016 and "climate change lie"): [es V geben Neg kein X NP ] (e.g., es gibt keine Verletzten [engl. there are no injured people], es gibt keinen Zusammenhang zwischen CO2 und Lufttemperatur [engl. there is no correlation between CO2 and air temperature). Based on 229 instances, the paper describes the form, meaning, and especially the discursive function of this so-called inexistence construction. For instance, the construction occurs in the context of a specific pattern of argumentation used to deny evidence of the official version. Overall, the article argues for a more intensive study of grammatical phenomena in discourse linguistics and for a stronger combination of Discourse Grammar and Construction Grammar.
Article
Internet memes have emerged as the de facto language of the internet, where standardized memetic templates and characters distill and communicate narratives in simple, shareable formats. While prior research has highlighted their broad appeal as they traverse diverse audiences, their cultural function within online communities has received less attention. To investigate this function, we draw on cognitive anthropological conceptualizations of culture and theorize internet memes as “cultural representations.” We analyze 544 memes shared across two interconnected conspiratorial subreddits about COVID-19 between 2020 and 2022, employing a combination of content and thematic analysis. In doing so, we demonstrate that community members selectively engage with standardized memetic elements that resonate with their “conspiracist worldview.” Specifically, elements conveying the enduring “cultural themes” of Deception, Delusion, and Superiority function as “cultural resources” that stabilize the community’s culture. As such, we make three contributions. First, by theorizing internet memes as cultural representations, we demonstrate their stabilizing cultural function. Second, by showing how internet memes are used in online conspiratorial communities, we highlight their role in maintaining group cohesion and alleviating contention. Finally, we advance a revised methodological approach for the study of memetic communication.
Article
This paper outlines a multidisciplinary framework (Digital Rhetorical Ecosystem, or DRE3) for scaling up qualitative analyses of image memes. First, we make a case for applying rhetorical theory to examine image memes as quasi‐arguments that promote claims on a variety of political and social issues. Next, we argue for integrating rhetorical analysis of image memes into an ecological framework to trace interaction and evolution of memetic claims as they coalesce into evidence ecosystems that inform public narratives. Finally, we apply a computational framework to address the particular problem of claim identification in memes at large scales. Our integrated framework answers the recent call in information studies to highlight the social, political, and cultural attributes of information phenomena, and bridges the divide between small‐scale qualitative analyses and large‐scale computational analyses of image memes. We present this theoretical framework to guide the development of research questions, processes, and computational architecture to study the widespread and powerful influence of image memes in shaping consequential public beliefs and sentiments.
Article
Full-text available
Recent findings showed that users on Facebook tend to select information that adheres to their system of beliefs and to form polarized groups, i.e., echo chambers. Such a tendency dominates information cascades and might affect public debates on socially relevant issues. In this work we explore the structural evolution of communities of interest by accounting for users' emotions and engagement. Focusing on the Facebook pages reporting on scientific and conspiracy content, we characterize the evolution of the size of the two communities by fitting daily resolution data with three growth models: the Gompertz model, the Logistic model, and the Log-logistic model. Although all the models appropriately describe the data structure, the Logistic one shows the best fit. Then, we explore the interplay between the emotional state and engagement of users in the group dynamics. Our findings show that communities' emotional behavior is affected by the users' involvement inside the echo chamber. Indeed, higher involvement corresponds to a more negative approach. Moreover, we observe that, on average, more active users shift toward negativity faster than less active ones. Misinformation has traditionally represented a political, social, and economic risk. The digital age, in which new ways of communication arose, has exacerbated its extent, and mitigation strategies are even more uncertain. However, according to the World Economic Forum, massive digital misinformation remains one of the main threats to our society [1]. The diffusion of social media caused a paradigm shift in the creation and consumption of information. We passed from a mediated (e.g., by journalists) to a more disintermediated selection process. Such disintermediation elicits users' tendencies to (a) select information adhering to their system of beliefs, i.e., confirmation bias, and (b) form groups of like-minded people where they polarize their opinion, i.e., echo chambers [2-7]. Under these settings, discussion among like-minded people seems to negatively influence users' emotions and to reinforce group polarization [8,9]. What's more, experimental evidence shows that confirmatory information gets accepted even if it contains deliberately false claims [10-14], while dissenting information is mainly ignored or might even increase group polarization [15]. Current solutions, such as debunking efforts or algorithmic solutions based on the reputation of the source, seem to be ineffective [16,17]. To make things more complicated, users on social media aim at maximizing the number of likes (Attention Bulimia), and information, concepts, and debate often get flattened and oversimplified. In such a disintermediated environment, public opinion contends with a large amount of misleading information that might influence important decisions. Computational social science [18] seems to be a powerful tool for better understanding the cognitive and social dynamics behind misinformation spreading [1]. Along this path, in the present work we address the evolution of online echo chambers by performing a comparative analysis of two distinct polarized communities on the Italian Facebook, i.e., science and conspiracy. The sizes of both communities are first analyzed in terms of their temporal evolution and fitted with classical population growth models derived from biology and medicine. The behavior of users turns out to be similar for both categories, irrespective of the contents: both science and conspiracy communities reach a threshold value in their sizes, after an almost exponential growth, in agreement with classical growth models.
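For concreteness, a minimal sketch of the kind of growth-model fitting the abstract describes, using the logistic and Gompertz forms on synthetic daily size data; the parameter values and data are invented for illustration.

# Minimal sketch: fitting logistic and Gompertz growth models to daily
# community-size data. The data here are synthetic, for illustration only.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Logistic growth: carrying capacity K, rate r, midpoint t0."""
    return K / (1.0 + np.exp(-r * (t - t0)))

def gompertz(t, K, b, c):
    """Gompertz growth: K * exp(-b * exp(-c * t))."""
    return K * np.exp(-b * np.exp(-c * t))

t = np.arange(0, 200, dtype=float)            # days since community creation
true = logistic(t, K=10000, r=0.08, t0=90)    # synthetic "community size"
y = true + np.random.default_rng(1).normal(scale=150, size=t.size)

# Compare fits by residual sum of squares; the better model has lower RSS.
for name, model, p0 in [("logistic", logistic, (8000, 0.05, 80)),
                        ("gompertz", gompertz, (8000, 5.0, 0.05))]:
    params, _ = curve_fit(model, t, y, p0=p0, maxfev=10000)
    rss = np.sum((y - model(t, *params)) ** 2)
    print(f"{name}: params={np.round(params, 3)}, RSS={rss:.0f}")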
Article
Full-text available
Objectives: To understand how Twitter bots and trolls ("bots") promote online health content. Methods: We compared bots' rates of vaccine-relevant messages, collected online from July 2014 through September 2017, with those of average users. We estimated the likelihood that users were bots, comparing proportions of polarized and antivaccine tweets across user types. We conducted a content analysis of a Twitter hashtag associated with Russian troll activity. Results: Compared with average users, Russian trolls (χ²(1) = 102.0; P < .001), sophisticated bots (χ²(1) = 28.6; P < .001), and "content polluters" (χ²(1) = 7.0; P < .001) tweeted about vaccination at higher rates. Whereas content polluters posted more antivaccine content (χ²(1) = 11.18; P < .001), Russian trolls amplified both sides. Unidentifiable accounts were more polarized (χ²(1) = 12.1; P < .001) and antivaccine (χ²(1) = 35.9; P < .001). Analysis of the Russian troll hashtag showed that its messages were more political and divisive. Conclusions: Whereas bots that spread malware and unsolicited content disseminated antivaccine messages, Russian trolls promoted discord. Accounts masquerading as legitimate users create false equivalency, eroding public consensus on vaccination. Public Health Implications: Directly confronting vaccine skeptics enables bots to legitimize the vaccine debate. More research is needed to determine how best to combat bot-driven content. (Am J Public Health. Published online ahead of print August 23, 2018: e1-e7. doi:10.2105/AJPH.2018.304567).
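A hedged sketch of the statistical comparison reported above: a chi-square test on a 2x2 contingency table of vaccine-relevant vs. other tweets by user type. The counts are fabricated placeholders, not the study's data.

# Minimal sketch: chi-square test comparing rates of vaccine-relevant
# tweets across user types. Counts are invented placeholders.
from scipy.stats import chi2_contingency

#                vaccine-relevant   other
table = [[420, 9580],    # suspected bot/troll accounts
         [150, 9850]]    # average users

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2({dof}) = {chi2:.1f}, p = {p:.2e}")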
Article
Conspiracy theories are omnipresent in online discussions, whether to explain a late-breaking event that still lacks an official report or to give voice to political dissent. Conspiracy theories evolve, multiply, and interconnect, further complicating efforts to understand them and to limit their propagation. It is therefore crucial to develop scalable methods to examine the nature of conspiratorial discussions in online communities. What do users talk about when they discuss conspiracy theories online? What are the recurring elements in their discussions? What do these elements tell us about the way users think? This work answers these questions by analyzing over ten years of discussions in r/conspiracy, an online community on Reddit dedicated to conspiratorial discussions. We focus on the key elements of a conspiracy theory: the conspiratorial agents, the actions they perform, and their targets. By computationally detecting agent-action-target triplets in conspiratorial statements, and grouping them into semantically coherent clusters, we develop a notion of narrative-motif to detect recurring patterns of triplets. For example, a narrative-motif such as "governmental agency-controls-communications" represents the various ways in which multiple conspiratorial statements denote how governmental agencies control information. Thus, narrative-motifs expose commonalities between multiple conspiracy theories even when they refer to different events or circumstances. In the process, these representations help us understand how users talk about conspiracy theories and offer us a means to interpret what they talk about. Our approach enables a population-scale study of conspiracy theories in alternative news and social media, with implications for understanding their adoption and combating their spread.
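As an illustrative sketch, not the authors' actual pipeline, agent-action-target triplets can be pulled from parsed sentences with a dependency parser such as spaCy. The example sentences and the choice of the en_core_web_sm model are our assumptions.

# Minimal sketch: extracting (agent, action, target) triplets from
# conspiratorial statements with spaCy's dependency parser.
# The sentences are invented examples; the study's pipeline may differ.
import spacy

nlp = spacy.load("en_core_web_sm")

def triplets(text):
    """Yield (subject, verb, object) for each verb with both arguments."""
    doc = nlp(text)
    for token in doc:
        if token.pos_ == "VERB":
            subjects = [c for c in token.children if c.dep_ == "nsubj"]
            objects = [c for c in token.children if c.dep_ in ("dobj", "obj")]
            for s in subjects:
                for o in objects:
                    yield (s.text.lower(), token.lemma_, o.text.lower())

for sent in ["The agency controls the media.",
             "Corporations suppress the evidence."]:
    print(list(triplets(sent)))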
Article
Misinformation has found a new natural habitat in the digital age. Thousands of forums, blogs, and alternative news sources amplify fake news and inaccurate information to such a degree that it impacts our collective intelligence. Researchers and policy makers are troubled by misinformation because it is presumed to energize or even carry false narratives that can motivate poor decision-making and dangerous behaviors. Yet, while a growing body of research has focused on how viral misinformation spreads, little work has examined how false narratives are in fact constructed. In this study, we move beyond contagion-inspired approaches to examine how people construct a false narrative. We apply prior work in cognitive science on narrative understanding to illustrate how the narrative changes over time and in response to social dynamics, and examine how forum participants draw upon a diverse set of online sources to substantiate the narrative. We find that the narrative is based primarily on reinterpretations of conventional and scholarly sources, and is then used to provide an alternate account of unfolding events. We conclude that the link between misinformation, conventional knowledge, and false narratives is more complex than is often presumed, and advocate for a more direct study of this relationship.
Article
Can conspiracy theories be a source of social stigma? If so, people may expect to be socially excluded when they express endorsement of conspiracy theories. This effect should be partially explained by knowledge of the negative perceptions associated with conspiracy theories. In Study 1, inducing French internet users to write a text endorsing (vs. criticizing) conspiracy theories about the Charlie Hebdo shooting led them to anticipate fear of social exclusion. This effect was mediated by anticipated negative evaluation of the self. In Study 2, inducing French internet users to imagine defending (vs. criticizing) conspiracy theories about the Charlie Hebdo shooting in front of an audience led them to anticipate fear of social exclusion. The effect was again mediated by anticipated negative evaluation of the self. To conclude, our findings demonstrate that conspiracy theories can be viewed as a source of social stigma.