Mapping the Narrative Ecosystem of Conspiracy Theories in
Online Anti-vaccination Discussions
Joshua Introne
School of Information Studies, Syracuse University,
Syracuse, NY, USA
jeintron@syr.edu
Ania Korsunska
School of Information Studies, Syracuse University,
Syracuse, NY, USA
akorsuns@syr.edu
Leni Krsova
School of Information Studies, Syracuse University,
Syracuse, NY, USA
lkrsova@syr.edu
Zefeng Zhang
School of Information Studies, Syracuse University,
Syracuse, NY, USA
zzhan208@syr.edu
ABSTRACT
Recent research on conspiracy theories labels conspiracism as a distinct and deficient epistemic process. However, the tendency to pathologize conspiracism obscures the fact that it is a diverse and dynamic collective sensemaking process, transacted in public on the web. Here, we adopt a narrative framework to introduce a new analytical approach for examining online conspiracism. Narrative plays an important role because it is central to human cognition as well as being domain agnostic, and so can serve as a bridge between conspiracism and other modes of knowledge production. To illustrate the utility of our approach, we use it to analyze conspiracy theories identified in conversations across three different anti-vaccination discussion forums. Our approach enables us to capture more abstract categories without hiding the underlying diversity of the raw data. We find that there are dominant narrative themes across sites, but that there is also a tremendous amount of diversity within these themes. Our initial observations raise the possibility that different communities play different roles in the collective construction of conspiracy theories online. This offers one potential route for understanding not only cross-sectional differentiation, but the longitudinal dynamics of the narrative in future work. In particular, we are interested to examine how activity within the framework of the narrative shifts in response to news events and social media platforms' nascent efforts to control different types of misinformation. Such analysis will help us to better understand how collectively constructed conspiracy narratives adapt in a shifting media ecosystem.
CCS CONCEPTS
• Human-centered computing → Collaborative and social computing.
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.
SMSociety ’20, July 22–24, 2020, Toronto, ON, Canada
©2020 Association for Computing Machinery.
ACM ISBN 978-1-4503-7688-4/20/07. . . $15.00
https://doi.org/10.1145/3400806.3400828
KEYWORDS
Narratives, Conspiracy Theories, Social Media, Misinformation
ACM Reference Format:
Joshua Introne, Ania Korsunska, Leni Krsova, and Zefeng Zhang. 2020.
Mapping the Narrative Ecosystem of Conspiracy Theories in Online Anti-
vaccination Discussions. In International Conference on Social Media and
Society (SMSociety ’20), July 22–24, 2020, Toronto, ON, Canada. ACM, New
York, NY, USA, 9 pages. https://doi.org/10.1145/3400806.3400828
1 INTRODUCTION
Conspiracy theories (CTheories) have long been a topic of interest in scholarly literature and popular press. Their prevalence in online media and recent appearances in public announcements from the highest levels of the US government have re-invigorated this interest. CTheories are thought to be problematic for various reasons: they provide a natural habitat for misinformation because they undercut trust in our traditional gatekeepers [2, 28], they help motivate extremist belief systems and acts [3, 38], they motivate disengagement with conventional modes of political activity [20], and they may help sustain online echo-chambers [10].
It is not surprising that many modern treatments pathologize both CTheories and the people who promote them. There are several potential pitfalls with this tendency. Attempts to build an understanding of conspiracism as a deficient alternative to other ways of knowing obscure the fact that the process of conspiracy theorizing is, like other forms of knowledge construction, a diverse and dynamic search for a collective "truth" that explains the observed world. CTheories may not draw their support from conventional sources of evidence, and theorists may not follow the scientific method, but that does not mean that the construction of CTheories is without its own logic. Some CTheories are more compelling than others for a reason, and it is unlikely that these reasons are completely unrelated to the social processes underlying the construction of non-conspiracy theories. Understanding the logic that governs the development of CTheories might not only help us understand how to intervene in those that are dangerous; it might help us arrive at a better understanding of how people collectively construct knowledge in general.
Another potential pitfall in the tendency for scholars and popular media to pathologize CTheories and their supporters is that this can become part of a broader cultural process of stigmatization, leading to bias and further balkanization. Belief in a CTheory does not
mean that a theorist is crazy or even necessarily wrong. Indeed, if this were the case, over half of Americans [30] could be so classified. Widespread stigmatization is likely to create powerful in- and out-group dynamics that could have more serious social consequences.
In this paper we consider CTheories as components of a broader, shifting narrative [12, 41], and then examine the collective production of this narrative across three different online anti-vaccination discussion communities. The most significant contributions of this paper are methodological: we present a rigorous approach for identifying online conspiracy theories and mapping them using a narrative framework that was previously developed to examine online pseudo-knowledge [18, 19]. Our approach allows us to organize and analyze a diverse variety of connected conspiracy theories across a set of topic-specific discussion communities at different levels of abstraction. We find that conspiracy theories make up a relatively small percentage of user-generated discussion posts (from 1 to 10 percent) in the forums we examined, although there are clear differences in their prevalence across sites. We also find evidence that some narrative elements and topics are more popular than others, but there are community differences here as well. Finally, and perhaps most importantly, we observe that despite the existence of common themes, there is an enormous amount of diversity amongst stories that become part of an overarching conspiracy narrative. Our analysis sets the stage for future, longitudinal analyses that examine how collectively maintained conspiracy theories develop over time across disparate communities and respond to external events.
2 BACKGROUND
Though diverse, CTheories tend to share a similar underlying structure. Samory and Mitra [33] reviewed numerous definitions and identified three common elements across them: actors, who are generally thought to be powerful and highly coordinated, motivated by a desire for power or some other type of gain; actions, which are performed in secret or with an intent to deceive; and targets, which are outcomes that enrich the conspirators while causing some harm to a victimized group.
Scholarly work on CTheories builds on an established tradition that portrays conspiracy theorization as antithetical to rational, scientific thought [17, 31]. Carrying on this tradition, studies within the last several decades have found that people who maintain CTheories feel like they are disempowered and lack control, are poorly educated, ideologically eccentric, and do not think analytically [11, 32, 40, 42]. As a social phenomenon, it is thought that CTheories may be a collective response to the threat of outsiders and other threatening circumstances [23]. They are amplified in echo chambers on the internet, and weaponized by those seeking political advantage [4]. Against this backdrop, there are efforts to produce taxonomies of CTheories [1, 5] and distinguish CTheory thinking from "normal" thinking [39]. Some scholars have suggested that conspiracist ideation is a kind of monological thought process, because people who believe in one tend to believe in others as well, even if they seem logically incompatible [11, 25]. Harvard Law School professor Cass Sunstein has gone so far as to suggest that conspiracy theorists suffer from a kind of "crippled epistemology" [37]. Across this literature, belief in CTheories is often interpreted as a kind of mental disorder, and this interpretation is carried forth more broadly in Western culture [24]. As such, Lantian et al. [24] argue that this perspective motivates a fraught process of social stigmatization. On one hand, stigmatization paves the way for many injustices to potentially be visited upon the stigmatized population [26]. At the same time, those who are stigmatized develop a group identity around the stigmatized characteristic [14], and draw much needed social support from others who are part of this group [26]. This is particularly problematic when conspiracy theories become part of belief systems that motivate the rejection of scientifically justified behaviors that are important for the collective well-being of the broader population, like vaccination or reduced reliance on fossil fuels. Existing power structures might consider more authoritarian means for controlling this population (e.g., mandatory vaccination), but such policies reduce the autonomy of these individuals and, paradoxically, provide further evidence for their CTheories.
Some recent scholarship has sought to move beyond the tendency to pathologize CTheories. Several have argued that CTheories should not be considered to be irrational, because some are, in fact, true [9]. Philosopher Kurtis Hagen argues that apparent findings that conspiracy theorists adopt a monological thought process are not supported by the actual empirical data [15]. Others have suggested that conspiracy theories might better be understood as a socio-political response to distant and elite power structures that control the media and scientific knowledge production [16, 27]. Harambam and Aupers [16:466] characterize the struggle between CTheories and conventional knowledge as a "complex battle for epistemic authority in a broader field of knowledge contestation." In other words, CTheories can be understood as an alternative to conventional and privileged ways of knowing, returning power to those who are excluded.
Considering CTheories and their construction as an alternative means of knowledge production rather than a pathological affliction of the mentally imbalanced helps move us away from the path to stigmatization and brings to light questions of just how CTheories are constructed over time. Legal scholar Mark Fenster [12] views CTheories as a type of narrative that can help bridge conspiracism and conventional epistemologies. In Fenster's words, "[c]onspiracy theory . . . tells stories about the past, present, and future, and it presents an argument in narrative form for a contemporary audience about how power works" [12:121]. Narratives play a central role in all human cognition [6, 13, 34], and cognitive psychologist Jerome Bruner argued that narrative and argumentation are two primary and irreducible forms of cognition [6, 7]. Whereas the former guides everyday thought, the latter is used when carefully weighing evidence. Conspiracism might thus be understood as a type of narrative-centric cognition, whereas the production of scientific knowledge is heavily constrained by argumentative processes. However, while the rules of argumentation are well-understood, those that govern the production of narratives are not [6]. Examining online conspiracy theories might thus be understood as an investigation of the rules that govern the construction of good stories.
Adopting such a framing, Introne et al. [18] introduced a narrative approach to examine the development of pseudo-knowledge in an online discussion. They applied a story grammar that was originally developed by Stein and Glenn [36] to capture elements of a pseudoscientific theory about ancient aliens. The story grammar centered upon the intentional actions performed by actors and their consequences, and it is clear that this organization bears much similarity to the definition of CTheories offered by Samory and Mitra [33]. That is, a conspiracy theory may be understood as a story about the intentionally deceitful actions of a coordinated group, which have the consequence of harming or otherwise disenfranchising a victim.
Drawing on the preceding observations, we apply this narrative lens to examine online conspiracy theories in order to develop empirical data about conspiracism as a form of collective narrative construction. As context for our analysis, we examine a set of online anti-vaccination discussion groups. This domain is of particular interest for several reasons. The recent growth of anti-vaccine sentiment in the United States has led to a resurgence of diseases that had previously been well-controlled, and it is thought that social media has had a significant role in this [22, 29]. Others have documented the prevalence of CTheories in online anti-vaccination groups [21, 22], and there is evidence that vaccine hesitant individuals face increasing stigma [8]. Because centralized efforts to boost vaccination rates are likely to further inflame anti-vaccination sentiments, there is a distinct need for new approaches to address the problem.
Cognizant of this need, the main goal of our paper is to provide the scientific community with a new set of analytical tools for examining the development of CTheories. In the following, we introduce our methods, and then apply them to investigate three questions:
1. How prevalent are conspiracy theories in online anti-vaccination discussions, and how does the level of conspiracism vary across different communities?
2. How common are narrative elements across different anti-vaccination communities?
3. What can we say about the content of the narrative that is collectively constructed by people within and across different communities?
3 METHODS AND DATA
3.1 Data
Data for our final analysis was drawn from three anti-vaccination discussion forums identified in Kata's [22] analysis of social media anti-vaccination sentiments: Natural News [43], Mothering [44], and Above Top Secret [45]. We chose these sites in particular because they hosted a large set of publicly accessible discussions about vaccines, and we were interested in the organic process of conversation-based conspiracy theorization rather than analyzing curated sets of conspiracy theories. Because this data is public and does not contain personally identifying information, it was not considered human subjects data by the authors' Institutional Review Board. Additionally, while all of these sites host anti-vaccination discussions, they are also quite diverse. Natural News is a blog that focuses on healthy living and alternative medicine, Mothering is a forum for mothers to share and discuss information, and Above Top Secret is an "alternative topics" discussion forum that is explicitly focused on conspiracy theories.
We scraped discussion data from these sites using custom scrapers in October of 2019 (full R code is available at https://github.com/c4-lab/SocialMediaSociety2020). The scraped data included all publicly available information associated with each post, including the username, date/time, reply structure, and text of the post. On Natural News we scraped all user-generated discussion threads that were associated with blog posts containing the word "vaccine," resulting in a set of 85,683 individual posts spanning roughly six years. On Above Top Secret we scraped all discussion threads with the word "vaccine" appearing in the first post of a discussion, resulting in a set of 11,655 individual posts spanning about 17 years. For Mothering, we scraped all discussions in those forums that were categorized in the site's thread index as being about vaccines, resulting in a set of 344,805 posts spanning a period of about 16 years.
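As a rough illustration of the per-thread scraping step, the sketch below shows how one might pull the retained fields from a single discussion page and apply the Above Top Secret inclusion rule. It is only a sketch: the URL handling, CSS selectors, and attribute names are hypothetical placeholders, and the actual custom scrapers are those in the repository linked above.

```python
# Sketch of the per-thread scraping step (hypothetical selectors and attributes;
# the real custom scrapers are in the repository linked above).
import requests
from bs4 import BeautifulSoup


def scrape_thread(thread_url):
    """Collect the publicly available fields we retained for each post."""
    html = requests.get(thread_url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    posts = []
    for node in soup.select("div.post"):  # hypothetical post container
        posts.append({
            "username": node.select_one(".author").get_text(strip=True),
            "timestamp": node.select_one(".date").get_text(strip=True),
            "reply_to": node.get("data-reply-to"),  # hypothetical reply attribute
            "text": node.select_one(".body").get_text(" ", strip=True),
        })
    return posts


def thread_is_about_vaccines(posts):
    """Above Top Secret rule: keep threads whose first post mentions 'vaccine'."""
    return bool(posts) and "vaccine" in posts[0]["text"].lower()
```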
3.2 Identifying Conspiracy Theories
To perform our analysis, we developed a repeatable strategy for identifying conspiracy theories. To develop a codebook, we worked iteratively with a team of coders and pilot data that was distinct from the data used for the analysis presented here. We began with a working definition developed from prior work [e.g., 33], and asked a team of six coders (including the authors and several graduate students) to search for an initial set of CTheories in anti-vaccination sites discussed in prior literature [21, 22]. The team met on a weekly basis to discuss their findings and challenges, revising the definition of conspiracy theory as necessary. During this process, we encountered a large number of posts that strongly implied a conspiracist point of view (e.g., by using loaded terms like "New World Order") but did not themselves contain identifiable CTheories, and so we expanded the codebook to include a conspiratorial thinking (CThinking) category as well. We continued with this process until we had definitions for both categories that led to a high degree of intercoder agreement. Our final definitions blend prior work on CTheories and the story grammar previously discussed by Introne et al. [19]:
A conspiracy theory is a narrative explaining an event or series of events that involve deceptive, coordinated actors working together to achieve a goal through an action or series of actions that have consequences that intentionally disenfranchise or harm an individual or population.
Definitions for the six emphasized terms (which we summarize as events, actors, goal, actions, consequences, and target) were further elaborated in our full codebook, available online at https://c4-lab.github.io/assets/files/codebook-CT-narratives_SMSociety2020.pdf. To code an instance of text as a CTheory, we required that coders identify four required categories: actors, actions, consequences, and a victim. To code a post as CThinking, coders were required to identify at least one of the six categories, and further determine that the post implied support for a CTheory.
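To make the labeling rule concrete, the sketch below expresses it as a small data structure. All category judgments were made by human coders; the class and field names here are illustrative and are not part of the released codebook.

```python
# Sketch of the codebook's decision rule; this only formalizes how the final
# label follows from the category judgments made by human coders.
from dataclasses import dataclass, field

STORY_CATEGORIES = {"event", "actors", "goal", "actions", "consequences", "target"}
CTHEORY_REQUIRED = {"actors", "actions", "consequences", "target"}


@dataclass
class CodedPost:
    post_id: str
    categories: set = field(default_factory=set)  # subset of STORY_CATEGORIES
    implies_support: bool = False                 # coder's judgment for CThinking


def label(post):
    """Assign CTheory / CThinking / None according to the codebook rule."""
    if CTHEORY_REQUIRED <= post.categories:       # all four required roles present
        return "CTheory"
    if post.categories and post.implies_support:  # at least one role + implied support
        return "CThinking"
    return "None"
```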
Figure 1: Template for mapping conspiracy narratives
To address our research questions, we randomly selected 1,800 posts from the total set of 440k posts, split evenly across the forums. In this exploratory study, we were interested both in testing our methods and obtaining a rough indicator of the prevalence and content of conspiracy theories in open online discussions, and our only criterion for inclusion was that these posts could be generated by any self-registered (or possibly anonymous) visitor. Thus, while the three sites have different affordances, these differences were not germane to the current inquiry. Moreover, because we were interested in getting a sense of the true proportions of CTheories and CThinking, we did not exclude empty or one-word posts from our set (we note, however, that there were very few such posts).
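The random draw itself is straightforward; a minimal sketch is shown below, assuming the scraped posts sit in a pandas DataFrame with a site column. The file names, column names, and seed are illustrative, not those used in our pipeline.

```python
# Sketch of the evenly split random sample (600 posts per forum, 1,800 total).
import pandas as pd

posts = pd.read_csv("all_posts.csv")  # one row per scraped post, with a 'site' column
sample = (
    posts.groupby("site", group_keys=False)
         .apply(lambda g: g.sample(n=600, random_state=42))
)
assert len(sample) == 1800
sample.to_csv("coding_sample.csv", index=False)
```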
To validate the reliability of our coding approach, we rst se-
lected 90 posts that were longer than 10 words from the set of 1,800
posts, with even representation across our three sites. All of the
authors and four additional coders who had helped to develop the
codebook coded this set. Evaluating the nal category assignments,
we found that the coders achieved Kripendor’s
α=
.88 for this
set of posts, which is generally considered to be an excellent level
of inter-rater reliability. Any disagreements were subsequently re-
solved through discussion, and these results retained for our nal
analysis. Finally, the teams of coders worked individually to code
the remaining posts, discussing any problematic posts as a group
during weekly meetings.
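For reference, an agreement statistic of this kind can be reproduced with the open-source krippendorff Python package (one of several available implementations; our own computation may have used different tooling). The ratings matrix below is a toy example for illustration only.

```python
# Sketch of a Krippendorff's alpha computation for nominal labels
# (rows = coders, columns = posts). Toy data, not our validation set.
import numpy as np
import krippendorff  # pip install krippendorff

LABEL_CODES = {"None": 0, "CThinking": 1, "CTheory": 2}

coder_labels = [
    ["None", "CTheory", "CThinking", "None", "CTheory"],
    ["None", "CTheory", "CThinking", "None", "CThinking"],
    ["None", "CTheory", "None",      "None", "CTheory"],
]
ratings = np.array([[LABEL_CODES[lbl] for lbl in row] for row in coder_labels],
                   dtype=float)

alpha = krippendorff.alpha(reliability_data=ratings,
                           level_of_measurement="nominal")
print(f"Krippendorff's alpha = {alpha:.2f}")  # we observed .88 on the real 90-post set
```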
3.3 Mapping the narrative
After coding the data, we assembled it into a map using the mind-mapping software SimpleMind [46] in order to help us visualize the data and identify repeating categories of items. The mapping process was led by the second author, with assistance from the other authors. Figure 1 presents the template we used to guide the mapping process. As posts were added to the map, it grew as a tree. Elements drawn from the actual posts were first attached to relevant categories, referencing the unique identifier associated with the post. During this process, it became apparent that many items were instances of common categories based on how they were used and shared attributes (e.g., "Merck" and "corrupt medical system" both belong to a common class we labeled as "Big Pharma"). Therefore, following a narrow type of taxonomic analysis [35], whenever we encountered two items that were both "a kind of" a common class (subjectively assessed by the second author), a parent node was introduced. As the map was constructed, the broader team reviewed these category assignments, which were adjusted as necessary. Whenever we encountered an item that we had already seen, we simply added the identifier of the post to that item. Moving through the posts in this manner, we occasionally encountered posts that we felt were problematic. We discussed these as a team and updated the codes as necessary. Thus, the mapping process served as a final check on the quality of the final coding. Both CTheories and CThinking were added to the map.
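The mapping itself was done by hand in SimpleMind, but the underlying structure is simply a tree with post identifiers attached to items. The sketch below captures that structure under our own illustrative names (including the post id in the example call), not an export format of the actual map.

```python
# Sketch of the map's structure: items attach under one of the six story
# categories, and a shared parent node is introduced when two items are judged
# to be "a kind of" the same class (e.g., "Merck" under "Big Pharma").
class Node:
    def __init__(self, label, parent=None):
        self.label = label
        self.children = []
        self.post_ids = set()          # posts in which this item appeared
        if parent is not None:
            parent.children.append(self)


root = Node("Conspiracy narrative")
CATEGORIES = {name: Node(name, root) for name in
              ["Event", "Actor", "Goal", "Action", "Consequence", "Target"]}


def find_or_create(parent, label):
    for child in parent.children:
        if child.label == label:
            return child
    return Node(label, parent)


def add_item(category, item_label, post_id, parent_label=None):
    """Attach an item (optionally under a 'kind of' parent) and record the post."""
    parent = CATEGORIES[category]
    if parent_label is not None:
        parent = find_or_create(parent, parent_label)
    find_or_create(parent, item_label).post_ids.add(post_id)


# e.g., with a hypothetical post identifier from Above Top Secret:
add_item("Actor", "Merck", post_id="ATS-0042", parent_label="Big Pharma")
```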
4 FINDINGS
To help illustrate the kinds of CTheories and instances of CThinking our coders found, we have included examples of each category of post below. In each case, we have fixed minor spelling and grammatical errors, and highlighted identified story categories:
Conspiracy Theory: I'm not saying it's not dangerous, I think that would be a given since they are rolling it out with practically no testing. All I was saying is it has nothing to do with FEMA camps, mass graves/lots of coffins or depopulation. It's big pharma trying to make money by creating a pandemic and raking in the cash for making a vaccine. And I have already seen what goes into vaccines from various sources. (AboveTopSecret.com; the general population is the implied target in "rolling it out")
Conspiratorial Thinking: Quote: Public health specialists and manufacturers are working frantically to develop vaccines, drugs, strategies . . . Oh, I'll just bet they are. The SARS "epidemic" fizzled, so they need to focus on something else. These scare tactics benefit the pharmaceutical industry, including the doctors who pimp for them. (Mothering.com; implication that epidemics are deliberate, but unclear consequences or target)
None: I've never had a flu vaccine, nor will I ever. Back in 1957 I was told I got the Asian flu. No fun, that's for sure, but I sweated it out and haven't had the flu since then. I did have a nasty cold in 2000, but other than that.... It's horrible what's going on in the world of Big Pharma and their lemming-like medical docs and patients. (NaturalNews.com; unclear that this is describing a conspiracy theory)
Table 1: Coding results.
Site | Word Count Distribution | Number CTheories (%) | Number CThinking (%) | Total (%)
Mothering | 1/33/74/136/1666 | 1 (0.17%) | 4 (0.67%) | 5 (0.83%)
Natural News | 0/14/35/75/1267 | 15 (2.5%) | 32 (5.3%) | 47 (7.8%)
Above Top Secret | 5/49/99/188/1112 | 27 (4.5%) | 34 (5.7%) | 61 (10.2%)
Total | – | 43 (2.4%) | 70 (3.9%) | 113 (6.3%)
Figure 2: Categories of story elements used by site
Table 1 presents overall statistics from our coding process. Addressing our first research question, we find that complete conspiracy theories are relatively rare, that instances of conspiratorial thinking are roughly three times as prevalent, and that there are clear differences amongst the sites, with Mothering exhibiting the least prevalence of conspiracism and Above Top Secret exhibiting the most. Table 1 also includes summary statistics for the number of words in each post, suggesting that the prevalence of conspiracy theories was not a simple function of the general verbosity of the forum. The word count distribution column reports the number of words per post as Min / 1st Quartile / Median / 3rd Quartile / Max.
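The summaries in Table 1 are simple descriptive statistics over the coded sample; one way such summaries can be computed is sketched below. The file name and column names are illustrative assumptions, not those of our pipeline.

```python
# Sketch of the Table 1 summaries, assuming a DataFrame of coded posts with
# 'site', 'text', and 'label' columns (label in {"CTheory", "CThinking", "None"}).
import pandas as pd

coded = pd.read_csv("coded_sample.csv")  # illustrative file name
coded["n_words"] = coded["text"].fillna("").str.split().str.len()

# Min / 1st quartile / median / 3rd quartile / max words per post, by site
word_dist = coded.groupby("site")["n_words"].describe()[["min", "25%", "50%", "75%", "max"]]

# Percentage of posts labeled CTheory or CThinking on each site
prevalence = (coded.groupby("site")["label"]
                   .value_counts(normalize=True)
                   .unstack(fill_value=0) * 100).round(1)

print(word_dist)
print(prevalence)
```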
We note that these numbers only reflect the quantity of observable conspiracism in posts that are taken out of context, and that our method does not capture the quantity of posters that voice agreement or disagreement with posts. In the course of our coding we observed many such posts across all sites.
Addressing our second research question, we first examined the overall proportions of the story roles present in just the CTheory posts. As illustrated in Figure 2, the pattern of category use is relatively consistent across the sites, even given the relatively small number of CTheory posts on Mothering. In general, conspiracist posts seem to dwell primarily on actors, followed by actions, and then the other categories, with events being the least prevalent. The prevalence of actors and actions is consistent with the finding reported in Samory and Mitra [33]. However, we also note that Above Top Secret seems to emphasize actions and targets more than the others. A chi-squared test revealed that these differences were weakly significant in the "Target" category (p < .05), but with our limited sample, it is unclear how meaningful this trend is, and we return to this observation in the discussion.
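One plausible reading of this test is an ordinary contingency-table chi-squared comparing, for a given story category, how many CTheory posts on each site do or do not use that element. The sketch below follows that reading with placeholder counts, not our actual tallies.

```python
# Sketch of a per-category chi-squared test with scipy; the counts below are
# placeholders for illustration, not the tallies from our coded sample.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: CTheory posts that use / do not use the "Target" element.
# Columns: Above Top Secret, Natural News, Mothering.
table = np.array([[20, 7, 1],
                  [ 7, 8, 0]])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3f}")
```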
After calculating these general statistics, we turned our attention to the narrative itself. To help summarize the content of the map, we calculated the number of instances of each subcategory of the six top-level story categories. These results are summarized in Table 2. Where either a higher-level category was not yet apparent or no specific examples were provided, the Examples column is left blank. As is apparent in the table, there are several dominant categories found across all conspiracy theories. Following our narrative template, we might summarize the collected CTheories as the following: Big Pharma and other powerful institutions distribute dangerous and toxic vaccines to the general public and other disadvantaged or powerless groups in order to make money or gain power, and these vaccines cause harm, suffering, and possibly death.
However, the value of the map as an analytical tool lies in its ability to make apparent the extraordinary diversity of the individual stories and story elements that sustain this overall narrative. The complete map contains 171 nodes, and it is too large to include in this manuscript in a legible format. Figure 3 includes an overview of the map, and a more detailed view is shown in Figure 4. The complete map is available to view in detail online at https://c4-lab.github.io/assets/files/map-CTs-SMSociety2020.pdf.
Figure 3: General view of the full map
Figure 4: A detailed view of the "Action" section of the map
Table 2: Summary of topics across each of the story categories and forums analyzed
Category | Topic | Examples | ATS | NN | M | Total
Event | Cover up | Outbreaks of disease in vaccinated populations; industry whistleblowers | 4 | 0 | 0 | 4
Event | Agreement/Legislation | Passage of the NVCIP; Industrial mergers | 2 | 1 | 0 | 3
Event | Outbreak/Epidemic | Flu outbreak; Anticipated pandemic | 1 | 2 | 1 | 4
Event | Court Settlement | Awards through the NVCIP | 2 | 0 | 0 | 2
Event | Inscriptions on the Georgia Guidestones | – | 0 | 1 | 0 | 1
Event | Sterilization hormones in vaccines | – | 2 | 0 | 0 | 2
Event | Vaccinations given at a specific point in time | Salk polio vaccine; vaccinations in school | 2 | 0 | 0 | 2
Actor | Big Pharma | Merck; corrupt medical system | 15 | 6 | 1 | 22
Actor | Government, Institutions, Universities | US Government; CDC; Rotary Club; Rockefeller Foundation | 6 | 5 | 1 | 12
Actor | Individual People | Brian Deer, Bill Gates, Obama | 4 | 1 | 0 | 5
Actor | Media | Jimmy Kimmel; Corporate media | 1 | 3 | 0 | 4
Actor | Non-specific Nefarious Conspirator | The eugenicists; evil rich people; the powers that be; deepstate nwo guys | 1 | 3 | 0 | 4
Actor | Unethical doctor | Jonas Salk; Paul Offit | 2 | 0 | 0 | 2
Goal | Make Money | Sell vaccines; medical dependency | 10 | 6 | 0 | 16
Goal | Cause Harm | Depopulation; cause autism | 5 | 4 | 0 | 9
Goal | Save money | – | 2 | 0 | 0 | 2
Action | Distribution of bad quality/harmful vaccine | Vaccine adjuvants; Vaccine microchips; Hormones; Other diseases | 19 | 9 | 1 | 29
Action | Pushing an agenda | Brainwashing; Stop competitors; Keep up the CDC vaccine schedule | 0 | 6 | 1 | 7
Action | Cover Up/Withholding of information | Research fraud; Criminal collusion; Media fairy tales about illness | 4 | 1 | 0 | 5
Consequence | Health-related Suffering | Autism; Other illnesses; Mental infirmity; Fertility issues | 15 | 10 | 0 | 25
Consequence | Death | – | 7 | 7 | 0 | 14
Consequence | Loss of freedom | Chip implant; Unable to protect oneself | 3 | 1 | 0 | 4
Consequence | Professional Harm | Ruin the reputations of good scientists (Wakefield) | 2 | 0 | 0 | 2
Target | The general population | Sheeple, brain damaged gullible masses | 10 | 4 | 1 | 15
Target | Children | Infants and specific individuals | 5 | 3 | 2 | 10
Target | Nationality/Citizens | Africans, Third-world countries | 2 | 2 | 0 | 4
Target | Individual People | John Noveske | 2 | 0 | 0 | 2
Target | Scientists | – | 2 | 0 | 0 | 2
5 DISCUSSION AND CONCLUSIONS
We have presented an approach for analyzing online CTheory content through a narrative lens. Our aim is to shed light on the collective processes through which CTheories are developed without
pathologizing conspiracism. Using narrative as a vehicle for this purpose is valuable because the process of narrative construction is a central and ubiquitous activity across many domains of human knowledge. Thus, while our analysis has focused on conspiracy theories in online anti-vaccination discussions, our approach could be applied more broadly across other kinds of stories as well. This raises the possibility of comparative analyses that contrast CTheories with other kinds of knowledge. Our initial results suggest that, while the production of complete CTheories is relatively rare across online discussion posts, there are both dominant themes and a striking amount of diversity within them. An important question is whether and how these patterns differ across other domains, for instance, within the narratives that develop around political events or scientific theories. Building on Bruner's work [7], we have proposed that conspiracism is a type of story construction that is liberated from slower, more deliberate verification processes. Thus, we anticipate that the level of diversity we have observed here is a unique property of collective conspiracism. This may have important implications for how quickly a conspiracy narrative can adapt over time.
Our data also suggest that there may be differences amongst anti-vaccination discussion communities with respect to the types of story elements they focus on and the topics they emphasize. For example, posts on Above Top Secret focus relatively more attention on the targets of conspiracy theories (Figure 2) in general, as well as on actions related to "pushing an agenda" (Table 2). We cannot make any strong claims yet, but these initial observations raise the possibility that different communities play different roles in the collective construction of conspiracy theories online. If future work examines the longitudinal dynamics of the narrative, a dimension not covered in our analysis, it might shed light on another aspect of how these ecosystems work in concert to narrate conspiracy theories.
We believe our initial results offer one potential route for understanding not only cross-sectional differentiation, but also the longitudinal dynamics of the narrative. In particular, we are interested in examining how activity within the framework of the narrative shifts in response to news events and social media platforms' nascent efforts to control different types of misinformation. Such analysis will help us to better understand how collectively constructed conspiracy narratives adapt in a shifting media ecosystem.
ACKNOWLEDGMENTS
This research is supported by the National Science Foundation under grant no. 1908407. We thank Sehrish Ahmed, Jingxian Sun, Zimo Xu, Yankun Wang, and Mingkang Zhang for their help with coding and data scraping. We also thank our reviewers for their valuable comments.
REFERENCES
[1] Michael Barkun. 2013. A Culture of Conspiracy: Apocalyptic Visions in Contemporary America. Univ of California Press.
[2] Michael Barkun. 2016. Conspiracy Theories as Stigmatized Knowledge. Diogenes (October 2016). DOI:https://doi.org/10.1177/0392192116669288
[3] Jamie Bartlett and Carl Miller. 2010. The Power of Unreason. Demos, London.
[4] David A. Broniatowski, Amelia M. Jamison, SiHua Qi, Lulwah AlKulaib, Tao Chen, Adrian Benton, Sandra C. Quinn, and Mark Dredze. 2018. Weaponized Health Communication: Twitter Bots and Russian Trolls Amplify the Vaccine Debate. Am J Public Health 108, 10 (August 2018), 1378–1384. DOI:https://doi.org/10.2105/AJPH.2018.304567
[5] Robert Brotherton, Christopher C. French, and Alan D. Pickering. 2013. Measuring Belief in Conspiracy Theories: The Generic Conspiracist Beliefs Scale. Frontiers in Psychology 4 (2013). DOI:https://doi.org/10.3389/fpsyg.2013.00279
[6] Jerome Bruner. 1991. The Narrative Construction of Reality. Critical Inquiry 18, 1 (1991), 1–22. DOI:https://doi.org/10.1086/448619
[7] Jerome S. Bruner. 1986. Actual Minds, Possible Worlds. Harvard University Press.
[8] Richard M. Carpiano and Nicholas S. Fitz. 2017. Public attitudes toward child undervaccination: A randomized experiment on evaluations, stigmatizing orientations, and support for policies. Social Science & Medicine 185 (July 2017), 127–136. DOI:https://doi.org/10.1016/j.socscimed.2017.05.014
[9] David Coady. 2012. What to Believe Now: Applying Epistemology to Contemporary Issues. John Wiley & Sons.
[10] Michela Del Vicario, Gianna Vivaldo, Alessandro Bessi, Fabiana Zollo, Antonio Scala, Guido Caldarelli, and Walter Quattrociocchi. 2016. Echo Chambers: Emotional Contagion and Group Polarization on Facebook. Scientific Reports 6, 1 (December 2016). DOI:https://doi.org/10.1038/srep37825
[11] Karen M. Douglas, Robbie M. Sutton, Mitchell J. Callan, Rael J. Dawtry, and Annelie J. Harvey. 2016. Someone is pulling the strings: hypersensitive agency detection and belief in conspiracy theories. Thinking & Reasoning 22, 1 (January 2016), 57–77. DOI:https://doi.org/10.1080/13546783.2015.1051586
[12] Mark Fenster. 2008. Conspiracy theories: secrecy and power in American culture (Rev. and updated ed.). University of Minnesota Press, Minneapolis.
[13] Walter R. Fisher. 1989. Human communication as narration: Toward a philosophy of reason, value, and action. University of South Carolina Press.
[14] Erving Goffman. 2009. Stigma: Notes on the Management of Spoiled Identity. Simon and Schuster.
[15] Kurtis Hagen. 2018. Conspiracy Theorists and Monological Belief Systems. In Taking Conspiracy Theories Seriously. Rowman & Littlefield Publishers, Lanham.
[16] Jaron Harambam and Stef Aupers. 2015. Contesting epistemic authority: Conspiracy theories on the boundaries of science. Public Understanding of Science 24, 4 (May 2015), 466–480. DOI:https://doi.org/10.1177/0963662514559891
[17] Richard Hofstadter. 2012. The paranoid style in American politics. Vintage.
[18] Joshua Introne, Irem Gokce Yildirim, Luca Iandoli, Julia DeCook, and Shaima Elzeini. 2018. How People Weave Online Information Into Pseudoknowledge. Social Media + Society 4, 3 (July 2018), 205630511878563. DOI:https://doi.org/10.1177/2056305118785639
[19] Joshua Introne, Luca Iandoli, Julia DeCook, Irem Gokce Yildirim, and Shaima Elzeini. 2017. The Collaborative Construction and Evolution of Pseudo-knowledge in Online Conversations. In Proceedings of the 8th International Conference on Social Media & Society (#SMSociety17), ACM Press, Toronto, ON, Canada, 1–10. DOI:https://doi.org/10.1145/3097286.3097297
[20] Daniel Jolley and Karen M. Douglas. 2014. The social consequences of conspiracism: Exposure to conspiracy theories decreases intentions to engage in politics and to reduce one's carbon footprint. British Journal of Psychology 105, 1 (February 2014), 35–56. DOI:https://doi.org/10.1111/bjop.12018
[21] Anna Kata. 2010. A postmodern Pandora's box: Anti-vaccination misinformation on the Internet. Vaccine 28, 7 (February 2010), 1709–1716. DOI:https://doi.org/10.1016/j.vaccine.2009.12.022
[22] Anna Kata. 2012. Anti-vaccine activists, Web 2.0, and the postmodern paradigm – An overview of tactics and tropes used online by the anti-vaccination movement. Vaccine 30, 25 (May 2012), 3778–3789. DOI:https://doi.org/10.1016/j.vaccine.2011.11.112
[23] Péter Krekó. 2015. Conspiracy theory as collective motivated cognition. In The Psychology of Conspiracy. Routledge/Taylor & Francis Group, New York, NY, US, 62–75.
[24] Anthony Lantian, Dominique Muller, Cécile Nurra, Olivier Klein, Sophie Berjot, and Myrto Pantazi. 2018. Stigmatized beliefs: Conspiracy theories, anticipated negative evaluation of the self, and fear of social exclusion. European Journal of Social Psychology 48, 7 (2018), 939–954. DOI:https://doi.org/10.1002/ejsp.2498
[25] Stephan Lewandowsky, Klaus Oberauer, and Gilles E. Gignac. 2013. NASA Faked the Moon Landing—Therefore, (Climate) Science Is a Hoax: An Anatomy of the Motivated Rejection of Science. Psychological Science 24, 5 (May 2013), 622–633. DOI:https://doi.org/10.1177/0956797612457686
[26] Bruce G. Link and Jo C. Phelan. 2001. Conceptualizing Stigma. Annual Review of Sociology 27, 1 (August 2001), 363–385. DOI:https://doi.org/10.1146/annurev.soc.27.1.363
[27] Stephen M. E. Marmura. 2014. Likely and Unlikely Stories: Conspiracy Theories in an Age of Propaganda. International Journal of Communication 8 (September 2014), 19.
[28] Paul Mihailidis and Samantha Viotty. 2017. Spreadable Spectacle in Digital Culture: Civic Expression, Fake News, and the Role of Media Literacies in "Post-Fact" Society. American Behavioral Scientist 61, 4 (April 2017), 441–454. DOI:https://doi.org/10.1177/0002764217701217
[29] Brendan Nyhan, Jason Reifler, and Sean Richey. 2012. The Role of Social Networks in Influenza Vaccine Attitudes and Intentions Among College Students in the Southeastern United States. Journal of Adolescent Health 51, 3 (September 2012), 302–304. DOI:https://doi.org/10.1016/j.jadohealth.2012.02.014
[30] J. Eric Oliver and Thomas J. Wood. 2014. Conspiracy Theories and the Paranoid Style(s) of Mass Opinion. American Journal of Political Science 58, 4 (2014), 952–966. DOI:https://doi.org/10.1111/ajps.12084
[31] Karl Popper. 2012. The Open Society and its Enemies. Routledge. DOI:https://doi.org/10.4324/9780203820377
[32] Jan-Willem van Prooijen and Nils B. Jostmann. 2013. Belief in conspiracy theories: The influence of uncertainty and perceived morality. European Journal of Social Psychology 43, 1 (February 2013), 109–115. DOI:https://doi.org/10.1002/ejsp.1922
[33] Mattia Samory and Tanushree Mitra. 2018. "The Government Spies Using Our Webcams": The Language of Conspiracy Theories in Online Discussions. Proceedings of the ACM on Human-Computer Interaction 2, CSCW (November 2018), 1–24. DOI:https://doi.org/10.1145/3274421
[34] Roger C. Schank and Robert P. Abelson. 1995. Knowledge and memory: The real story. In Knowledge and Memory: The Real Story. Lawrence Erlbaum Associates, Inc., Hillsdale, NJ, US, 1–85.
[35] James P. Spradley. 2016. The Ethnographic Interview. Waveland Press.
[36] N. L. Stein and C. G. Glenn. 1979. An analysis of story comprehension in elementary school children. In New Directions in Discourse Processing, Vol. 2. Ablex, Norwood, NJ.
[37] Cass R. Sunstein. 2014. Conspiracy Theories and Other Dangerous Ideas. Simon and Schuster.
[38] Cass R. Sunstein and Adrian Vermeule. 2009. Conspiracy Theories: Causes and Cures. Journal of Political Philosophy 17, 2 (June 2009), 202–227. DOI:https://doi.org/10.1111/j.1467-9760.2008.00325.x
[39] Viren Swami, David Barron, Laura Weis, Martin Voracek, Stefan Stieger, and Adrian Furnham. 2017. An examination of the factorial and convergent validity of four measures of conspiracist ideation, with recommendations for researchers. PLOS ONE 12, 2 (February 2017), e0172617. DOI:https://doi.org/10.1371/journal.pone.0172617
[40] Viren Swami, Martin Voracek, Stefan Stieger, Ulrich S. Tran, and Adrian Furnham. 2014. Analytic thinking reduces belief in conspiracy theories. Cognition 133, 3 (December 2014), 572–585. DOI:https://doi.org/10.1016/j.cognition.2014.08.006
[41] Katharina Thalmann. 2019. The Stigmatization of Conspiracy Theory since the 1950s: "A Plot to Make Us Look Foolish." Routledge. DOI:https://doi.org/10.4324/9780429020353
[42] J. A. Whitson and A. D. Galinsky. 2008. Lacking Control Increases Illusory Pattern Perception. Science 322, 5898 (October 2008), 115–117. DOI:https://doi.org/10.1126/science.1159845
[43] Natural News. Retrieved from https://naturalnews.com
[44] Mothering. Retrieved from https://mothering.com
[45] AboveTopSecret. Retrieved from https://abovetopsecret.com
[46] SimpleMind. ModelMaker Tools BV - SimpleApps. Retrieved from https://simplemind.eu/