Upvoting extremism: Collective identity formation and the extreme right on Reddit
Tiana Gaudette
School of Criminology
Simon Fraser University
8888 University Drive
Burnaby, BC V5A 1S6, Canada
Email: gaudette.tiana@gmail.com
Ryan Scrivens
School of Criminal Justice
Michigan State University
Garth Davies
School of Criminology
Simon Fraser University
Richard Frank
School of Criminology
Simon Fraser University
Author Biographies
Tiana Gaudette is a Research Associate at the International CyberCrime Research Centre
(ICCRC) at Simon Fraser University (SFU). She recently earned an MA in criminology from
SFU.
Ryan Scrivens is an Assistant Professor in the School of Criminal Justice (SCJ) at Michigan State
University (MSU). He is also an Associate Director at the International CyberCrime Research
Centre (ICCRC) at Simon Fraser University (SFU) and a Research Fellow at the VOX-Pol
Network of Excellence.
Garth Davies is an Associate Professor in the School of Criminology at Simon Fraser University
(SFU) and is the Associate Director of the Institute on Violence, Extremism, and Terrorism at
SFU.
Richard Frank is an Associate Professor in the School of Criminology at Simon Fraser
University (SFU) and is the Director of the International CyberCrime Research Centre (ICCRC)
at SFU.
Acknowledgements
The authors would like to thank Maura Conway for her valuable feedback on earlier versions of
this article. The authors would also like to thank the two anonymous reviewers for their valuable
and constructive feedback.
Abstract
Since the advent of the Internet, right-wing extremists and those who subscribe to extreme right
views have exploited online platforms to build a collective identity among the like-minded.
Research in this area has largely focused on extremists’ use of websites, forums, and mainstream
social media sites, but overlooked in this research has been an exploration of the popular social
news aggregation site Reddit. The current study explores the role of Reddit’s unique voting
algorithm in facilitating ‘othering’ discourse and, by extension, collective identity formation
among members of a notoriously hateful subreddit community, r/The_Donald. The results of the
thematic analysis indicate that those who post extreme-right content on r/The_Donald use
Reddit’s voting algorithm as a tool to mobilize like-minded members by promoting extreme
discourses against two prominent out-groups: Muslims and the Left. Overall, r/The_Donald’s
‘sense of community’ facilitates identity work among its members by creating an environment
wherein extreme right views are continuously validated.
Keywords
Collective identity; extreme right discourse; r/The_Donald; social movement theory; social news
aggregators
Right-wing extremist movements are increasingly expanding beyond national borders, and the
Internet is a fundamental medium that has facilitated such movements’ increasingly
‘transnational’ nature (Caiani and Kröll, 2014; Froio and Ganesh, 2019; Perry and Scrivens,
2016). The plethora of online platforms plays a significant role in the ‘transnationalisation’ of the
extreme right by enabling such movements to attract new members and disseminate information
to a global audience (see Caiani and Kröll, 2014). The Internet, however, functions as more than a
‘tool’ to be used by movements. It also functions as a space of important identity work; the
interactive nature of the Internet’s various platforms allows for the exchange of ideas and enables
the active construction of collective identities (Perry and Scrivens, 2016). Here the Internet has
facilitated a transnational ‘locale’ where right-wing extremists can exchange radical ideas and
negotiate a common sense of ‘we’ – or a collective identity (Bowman-Grieve, 2009; Futrell and
Simi, 2004; Perry and Scrivens, 2016).
Researchers who have explored right-wing extremists’ use of the Internet have typically
focused their attention on dedicated hate sites and forums (e.g. Adams and Roscigno, 2005;
Bliuc et al., 2019; Bowman-Grieve, 2009; Burris, Smith, and Strahm 2000; Futrell and Simi,
2004; Perry and Scrivens, 2016; Scrivens, Davies, and Frank, 2018; Simi and Futrell 2015). As a
result, the nature of right-wing extremists’ use of online platforms outside the purview of
mainstream digital platforms – which include Facebook (e.g. Ekman, 2018; Nouri and
Lorenzo-Dus, 2019; Stier et al., 2017), Twitter (e.g. Berger, 2016; Berger and Strathearn, 2013;
Burnap and Williams, 2015; Graham, 2016), and YouTube (e.g. Ekman, 2014; O’Callaghan et
al., 2014) – has gone mostly unexplored by researchers, despite the fact that lesser-researched
platforms (e.g., 4chan, 8chan, Reddit, etc.) have provided adherents with spaces to anonymously
discuss and develop ‘taboo’ or ‘anti-moral’ ideologies (see Nagle, 2017). Scholars who have
recognized this lacuna have called for further exploration of extremists’ use of under-researched
platforms. Conway (2017: 85), for example, noted that ‘different social media platforms have
different functionalities’ and posed the question: how do extremists exploit the different functions
of these platforms? In response, the current study begins to bridge the abovementioned gap by
exploring one platform that has received relatively little research attention, Reddit, and the
functional role of its voting algorithm (i.e., its upvoting and downvoting functions) in facilitating
collective identity formation among members of a notoriously hateful subreddit community,
r/The_Donald.
The collective identity of the extreme right online
The notion of ‘collective identity’ lies at the heart of the social movement theory framework and
has provided valuable insight into the impact of right-wing extremists’ online discussions in
shaping their identity (e.g. Futrell and Simi, 2004; Perry and Scrivens, 2016; Scrivens et al.,
2018). The social movement literature posits that collective identity is actively produced (e.g.
Hunt and Benford, 1994; Snow, 2001) and constructed through what Melucci (1995: 43)
describes as ‘interaction, negotiation and the opposition of different orientations.’ In addition,
according to the social constructionist paradigm, collective identity is ‘invented, created,
reconstituted, or cobbled together rather than being biologically preordained’ (Snow, 2001: 5).
As activists share ideas among other members of their in-group, these exchanges actively
produce a shared sense of ‘we’ and, by extension, a collective identity (Bowman-Grieve, 2009;
Snow, 2001; Melucci, 1995; Futrell and Simi, 2004). Additionally, a collective identity is further
developed through the construction of an ‘us’ versus ‘them’ binary (Polletta and Jasper, 2001).
By ‘othering,’ or identifying and targeting the group’s perceived enemies, such interactions
further define the borders of the in-group (Perry and Scrivens, 2016).
Indeed, these same processes of collective identity formation occur within – and are
facilitated by – online communications. Similar to how ‘face-to-face’ interactions in the ‘real
world’ form collective identities, so too do the ‘many-to-many’ interactions in cyberspace
(Crisafi, 2005). As a result, the Internet’s many online platforms have facilitated extreme right-
wing identity work (Futrell and Simi, 2004; Perry and Scrivens, 2016). As Perry and Scrivens
(2016: 69) explained it:
[a] collective identity provides an alternative frame for understanding and expressing
grievances; it shapes the discursive “other” along with the borders that separate “us” from
“them”; it affirms and reaffirms identity formation and maintenance; and it provides the
basis for strategic action.
Each of these elements enhances our understanding of the Internet’s role in developing a
collective identity among extreme right-wing adherents. But among these virtual spaces, Reddit
in particular has received criticism in recent years for the proliferation of extreme right-wing
content it hosts.
‘The front page of the Internet’: Reddit
While the social movement framework has been useful in advancing our understanding of right-
wing extremists’ use of the Internet – particularly their use of traditional websites, forums, and
mainstream social media platforms to build a collective identity – it remains unclear how the
unique voting features and content policies of one under-researched virtual platform, Reddit,
shape the formation of a collective identity among members of right-wing extremist movements
who use the platform.
Created in 2005, Reddit has become the fifth-most popular website in the United States
(U.S.) (Reddit Press, n.d.) and the 20th-most popular website in the world (Reddit.com
Competitive Analysis, n.d.). As a ‘social news aggregation’ site, Reddit describes itself as ‘a
platform for communities to discuss, connect, and share in an open environment, home to some
of the most authentic content anywhere online’ (Reddit Content Policy, n.d.). Here users may
participate in, contribute to, and even develop like-minded communities whose members share a
common interest. A unique component of the platform is its voting algorithm, which provides its
users with the means to promote and spread content within a particular subreddit. To illustrate,
Reddit users retrieve interesting content from across the Internet and post that content to niche
topic-based communities called ‘subreddits’, and those who share similar views (or not) can
convey their interest – or disinterest – by giving an ‘upvote’ or ‘downvote’ to the posted content
in the subreddit (Wildman, 2020). ‘Upvoting’ a post, for example, increases its visibility and user
engagement within a subreddit (see Carman et al., 2018), and as a result, the most popular posts
are featured on Reddit’s front page for millions of users to see (Wildman, 2020). In short, using
the upvoting (i.e., increasing content visibility) and downvoting (i.e., decreasing content
visibility) function on Reddit, users shape the discourse and build a collective identity within
their subreddit community. Previous research that explored bidirectional voting on a related site,
Imgur, similarly found that users’ upvoting and downvoting practices help
reinforce social identity (see Hale, 2017).
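Reddit’s production ranking weighs additional factors beyond raw votes (such as recency), but the core mechanic just described – a vote score of upvotes minus downvotes that determines how prominently content is displayed – can be illustrated with a minimal sketch. The Comment class and sample values below are hypothetical illustrations, not Reddit’s implementation:

```python
from dataclasses import dataclass

@dataclass
class Comment:
    text: str
    upvotes: int = 0
    downvotes: int = 0

    @property
    def score(self) -> int:
        # Vote score: upvotes minus downvotes (the figure reported
        # alongside each quoted comment later in this article).
        return self.upvotes - self.downvotes

def rank_by_score(comments: list) -> list:
    # Higher-scoring comments are displayed more prominently, so content
    # the community upvotes gains visibility while downvoted dissent sinks.
    return sorted(comments, key=lambda c: c.score, reverse=True)

thread = [
    Comment("dominant in-group view", upvotes=1500, downvotes=80),
    Comment("dissenting view", upvotes=3, downvotes=15),
]
for comment in rank_by_score(thread):
    print(comment.score, comment.text)
```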
While Reddit is designed to foster a sense of community among like-minded users,
historically its laissez-faire content policy has failed to prevent users from posting and spreading
hate speech on the platform (Gibbs, 2018) and has even facilitated the development of what
Massanari (2017) describes as ‘toxic technocultures’. On the one hand, its content policy defines
‘unwelcome content’ as material that incites violence or that threatens, harasses, or bullies other
users (Reddit Content Policy, n.d.). On the other hand, the policy admits that it ‘provides a lot of
leeway in what content is acceptable’ (Reddit Content Policy, n.d.). This
hands-off approach is largely meant to facilitate an open dialogue among members of Reddit’s
numerous communities, wherein users may express their views on controversial topics without
fear of being banned from the site or harassed by those with opposing views (Reddit Content
Policy, n.d). As a result, Reddit tends to stop short of preventing users from posting controversial
ideas. In fact, until recently, Reddit condoned racism and hate speech on its platform, according
to a recent statement by the site’s CEO, Steve Huffman (Gibbs, 2018).[1]
It comes as little surprise, then, that Reddit has witnessed an influx of extremist
communities (Feinberg, 2018). In response, Reddit launched a ‘quarantine’ feature in 2015 that
would hide a number of communities that regularly produce offensive content. Here users must
explicitly opt-in to view quarantined communities as a mechanism to prevent users from
inadvertently coming across extremely offensive content (Account and Community Restrictions,
n.d.-b). One previously quarantined and recently banned subreddit community, r/The_Donald,
received particular criticism for its extreme right-wing content.
‘America First!’: r/The_Donald
r/The_Donald was a subreddit community where members could discuss a variety of matters
relating to Trump’s U.S. presidency. From its inception in 2015, Trump’s most fervent
supporters assembled on r/The_Donald, with numbers surpassing 700,000 subscribers
(Romano, 2017). Some members of r/The_Donald represented a variety of extreme right-wing
movements and ideologies, including the ‘manosphere’, the ‘alt-right’, and racists more broadly,
though all members were united in their unwavering support for the U.S. president (Martin,
2017).
Although r/The_Donald was quarantined in June 2019 after members were accused of
breaking Reddit’s rules by inciting violence (see Stewart, 2019), hateful sentiment continued to
be a common occurrence there (Gibbs, 2018), until it was eventually banned (Tiffany, 2020). To
illustrate, while the ‘rules’ of r/The_Donald made it clear that ‘racism and Anti-Semitism will
not be tolerated,’ they also specified that ‘Muslim and illegal immigrant are not races’
(r/The_Donald Wiki, n.d.). These rules suggest, then, that moderators would most likely not
remove anti-Muslim or anti-immigrant discussions within the community.
Not surprisingly, r/The_Donald continued to experience an influx of extreme right-wing content,
including discussions surrounding ‘white genocide’, anti-Muslim sentiment, and anti-Black
discourse (Ward, 2018). Racist memes also appeared to be increasing in popularity on
r/The_Donald (see Ward, 2018; see also Zannettou et al., 2018). Despite these concerns,
however, research on platforms such as Reddit remains limited. So too does our understanding of
how these sites function to facilitate hateful content and collective identity formation.
Before proceeding, it is worth noting that, following J.M. Berger (2018), we take
the view that right-wing extremists—like all extremists—structure their beliefs on the basis that
the success and survival of the in-group is inseparable from the negative acts of an out-group
and, in turn, they are willing to assume both an offensive and defensive stance in the name of the
success and survival of the in-group. We thus conceptualize right-wing extremism as a racially,
ethnically, and/or sexually defined nationalism, which is typically framed in terms of white
power and/or white identity (i.e., the in-group) that is grounded in xenophobic and exclusionary
understandings of the perceived threats posed by some combination of non-whites, Jews,
Muslims, immigrants, refugees, members of the LGBTQI+ community, and feminists (i.e., the
out-group(s)) (Conway, Scrivens, and Macnair, 2019).
Data and methods
This research is guided by the following research question: How does Reddit’s unique voting
algorithm (i.e., its upvoting and downvoting functions) facilitate ‘othering’ discourse and, by
extension, collective identity formation on r/The_Donald following Trump’s presidential election
victory? To answer this question, data were collected from a website that made Reddit data
publicly available for research and analysis.[2]
We extracted all of the data posted to the r/The_Donald subreddit in 2017, as it marked the first
year of Trump’s presidency and a time when his far-right political views encouraged his
supporters to preach and practice racist hate against out-groups, both on- and offline (see
Anti-Defamation League, 2018). Research has similarly
identified a link between Trump’s election victory and a subsequent spike in hatred, both online
(e.g., Zannettou et al., 2018) and offline (e.g., Edwards and Rushin, 2018), in the year following
his victory.
We then identified the 1,000 most highly-upvoted user-submitted comments in the data
(hereafter referred to as the ‘highly-upvoted sample’).[3]
A second sample, comprising
1,000 user-submitted comments, was randomly drawn from the data (hereafter referred to as
the ‘random sample’) to serve as a comparison to the highly-upvoted sample. The purpose of this
was to explore what makes highly-upvoted content unique in comparison to a random sample of
non-highly-upvoted comments.
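As a rough illustration of this two-sample design, the following sketch (not the authors’ actual code) selects both samples from a monthly comment dump in the Pushshift archive referenced in note [2], assuming the file has been decompressed to newline-delimited JSON; the file name and random seed are hypothetical:

```python
import json
import random

# Collect all r/The_Donald comments from one decompressed Pushshift
# monthly dump (one JSON object per line); repeat for each 2017 month.
comments = []
with open("RC_2017-01.json", encoding="utf-8") as f:
    for line in f:
        comment = json.loads(line)
        if comment.get("subreddit") == "The_Donald":
            comments.append(comment)

# 'Highly-upvoted sample': the 1,000 comments with the highest vote scores.
highly_upvoted = sorted(comments, key=lambda c: c["score"], reverse=True)[:1000]

# 'Random sample': 1,000 comments drawn at random for comparison.
random.seed(42)  # arbitrary seed, for reproducibility only
random_sample = random.sample(comments, 1000)
```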
The data were analyzed using thematic analysis (Braun and Clarke, 2006). In particular,
the data were categorized using descriptive coding, which was conducted with NVivo qualitative
analysis software. After we reviewed each user comment in the highly-upvoted sample and
random sample, codes were assigned to the sections of text that related to the research question.
This descriptive coding technique proceeded in a sequential, line-by-line manner (see Maher et
al., 2018), which was a suitable choice for the current study because it allowed us to organize
a vast amount of textual data into manageable word/topic-based clusters. Given that there are no
‘hard and fast’ rules for searching for themes using thematic analysis (see Maguire and Delahunt,
2017), significant themes emerged as we coded, categorized, and reflected on the data (Saldaña,
2009).
Findings
The findings are organized into two sections, one for each key theme that was identified in the
data. Each includes an explanation of the patterns that emerged in support of each theme, both
within and across the highly-upvoted sample and the random sample. Here we draw largely from
the words of the users in the samples (i.e., quotes). Descriptive information pertaining to the
samples is also provided.
Descriptive information of the samples
Several macro-level patterns emerged across the sample groups, as is illustrated in Table 1. First
was the relatively similar number of unique authors in the highly-upvoted sample and the
random sample (852 and 929, respectively). This suggests that the highly-upvoted sample, and
perhaps the highly-upvoted content on r/The_Donald in general, is not dominated by a small
group of authors whose comments make up a large proportion of the highly-upvoted comments. Second,
each sample consisted of a variety of thread-starting comments and response comments, but the
thread-starter comments comprised a large proportion of the highly-upvoted sample (89.3%)
while the random sample contained fewer thread-starter comments (46.5%). Lastly, a
comparison of the measures of central tendency across the two samples revealed that the
comment vote scores varied substantially. This is expected because the highly-upvoted sample
consisted of the most highly-upvoted comments found in r/The_Donald in 2017.
Table 1. Descriptive information of the samples.

                                      Unique           Thread-Starting    Comment Vote Score
                                      Authors (%)      Comments (%)       Min      Max      Mean
Highly Upvoted Sample (N = 1,000)     852 (85.2%)      893 (89.3%)        1320     7328     1915
Random Sample (N = 1,000)             929 (92.9%)      465 (46.5%)        -12      487      10
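Continuing the sketch above, the descriptives reported in Table 1 could be computed along the following lines. Field names follow the Pushshift comment schema ('author', 'parent_id', 'score'), and thread-starting comments are identifiable because their parent_id carries the 't3_' prefix of a submission rather than the 't1_' prefix of another comment:

```python
def describe(sample: list) -> dict:
    # Summarize one sample of Pushshift comment records.
    scores = [c["score"] for c in sample]
    return {
        "unique_authors": len({c["author"] for c in sample}),
        "thread_starting": sum(1 for c in sample
                               if c["parent_id"].startswith("t3_")),
        "score_min": min(scores),
        "score_max": max(scores),
        "score_mean": round(sum(scores) / len(scores)),
    }

for name, sample in [("highly-upvoted", highly_upvoted),
                     ("random", random_sample)]:
    print(name, describe(sample))
```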
The external threat
Across the 1,000 most highly-upvoted comments were various talking points, including
discussions about Reddit administrators, a recent solar eclipse, users on 4chan, and United States
Senator Ted Cruz, among numerous other discussions. But a large proportion of the comments
(i.e., 11.6%) in the highly-upvoted sample were hateful comments about Muslim communities,
with claims that Muslim immigration is a persistent danger to Western nations, particularly to the
U.S., and that Muslim-perpetrated violence is to be expected as a result of the increasing levels
of Muslim immigration. In particular, contributors generally agreed that increased Muslim
immigration will result in what one user – who posted one of the most highly-upvoted comments
in the sample – described as ‘chaos and oppression and rape and murder’[4] (UserID: 977, 4143[5]).
Here the highly-upvoted comments not only suggested that ‘my Muslim neighbors’ want to ‘rape
my wife and kill my dogs’, as one user explained (UserID: 272, 1532), but also that ‘pedos
[pedophiles] are rampant inside this mind fuck ideology, as their [religious] ‘book’ allows this
shit to happen’ (UserID: 309, 2300). In addition, among the most highly-upvoted comments was
a link to a video of an ‘ISLAMIST MOB PUSHING A TEEN OFF A ROOF THEN BEATING
HIM TO DEATH’ (UserID: 600, 1526), which was used by members to showcase the violent
nature of Islam as well as to make the claim that such a ‘barbaric’ religion is indisputably
violent. Interestingly, while highly-upvoted content reflected such comments as ‘we know
muslims are NOT peaceful and will wreck shit’ (UserID: 588, 1388), much of the content
oftentimes included the use of sarcasm to express frustration about a wider ‘politically correct’
culture that, as one user best summed up, ‘[does] not allow us to report crimes perpetrated by
Muslims because it’s racist’ (UserID: 555, 1548).
Frequently discussed within the highly-upvoted sample was a link between the increase
in Muslim migration to the U.S., and the West more generally, and the increased threat of
terrorism there. Although racism is against r/The_Donald subreddit’s rules, racist sentiments
against Muslims were generally overlooked in the context of discussions around the ‘Muslim
terror threat’. For instance, one member’s highly-upvoted comment described the terrorist threat
as follows: ‘MY NAME IS JAFAR I COME FROM AFAR THERES A BOMB IN MY CAR
ALLAHU AKBAR’ (UserID: 689, 1400). Notably, although racist remarks such as these should
warrant removal by r/The_Donald moderators, they were overlooked and, as a result, were
allowed to remain featured within the community to draw further attention to the external threat.
As a result, among the most highly-upvoted content were suggestions that Muslims largely
supported terrorism in the U.S., with some users also suggesting ‘NOT ALL MUSLIMS yeah,
but damn near most of them it seems’ (UserID: 353, 1565) support terrorist attacks against the
West. Consequently, commonly expressed in the highly-upvoted comments were references to what
was described as a key method to counter this ‘terror threat’: preventing the so-called external
threat from entering the Western world through immigration. Here, contributors oftentimes
argued that it is imperative to, as one user who posted a highly-upvoted comment put it, ‘halt our
disasterous immigration policies’ (UserID: 612, 1942).
Taken together, extreme right-wing sentiments emerged in the highly-upvoted content when they
related to this external threat. Consistently uncovered within this sample were discussions about
Muslims as ‘invaders’ of the West and an increasing threat to the white population there. As but
one more notable example, in a highly-upvoted comment to a post titled ‘Muslim population in
Europe to triple by 2050’, one user expressed their grievances about the rising number of
Muslims in Europe and the subsequent threat to the white race there:
In 2050 your gonna have a lot of younger Europeans who completely resent their elders
for giving away there future for a false sense of moral superiority. I almost already feel it
with my parents. My family is white and both of my white liberal parents talk as if whites
are the worst people on the planet. I love them but holy shit they have no self
preservation instincts. It’s embarrassing. (UserID: 114, 1604)
Similarly uncovered in the highly-upvoted comments were members who described an emerging
sense of desperation that the survival of their in-group is being threatened by the Muslim out-
group. As one user put it: ‘Come on? We’re getting slaughtered here, the previous car that
ploughed though people on the bridge, Ariana Grande concert and now this? Our country has
been infiltrated, we’re fucked’ (UserID: 355, 2189). Here the threat of Muslim-perpetrated
violence and terrorism was described within the broader context of an Islamic ‘invasion’ of
Western countries. To illustrate, a large proportion of the content within the highly-upvoted
messages interpreted Muslim immigration into Western nations as a sign of Muslims
‘want[ing] to rule the earth’ (UserID: 309, 2300). One of the highly-upvoted comments
described this so-called Muslim ‘end game’ as a means to destroy other nations to further their
own goals:
Muslims have been trying to conquer us since the beginning of time. They used to
steal our children and then train them to kill their own people. Didn't work then,
and will not work now. There’s not a single romanian out there in this whole
world who doesn’t know what the Muslim end game is. (UserID: 761, 1465)
As a result of this external threat, members in the subreddit oftentimes discussed the ways that
they should defend themselves and members of their in-group since, as one user put it, ‘[w]hy
should we give up our lifestyle for their tyranny?’ (UserID: 460, 1653).
Evidence of the above-discussed external threat of the Muslim out-group, such as
gruesome images of recent terror attack victims found in the highly-upvoted content, further
invoked a particularly emotional response from members of the in-group on r/The_Donald; when
Muslim-perpetrated terrorist attacks occurred, the fear and anger that members of r/The_Donald
expressed toward the external threat was reflected in their most highly-upvoted comments. For
example, a highly-upvoted comment to an image of a terror attack victim noted: ‘[i]t’s
heartbreaking and bloodily gruesome but this corpse was once a person. How dare we deny their
pain? We need to defend and avenge’ (UserID: 989, 1468). In effect, each member that upvoted
this comment agreed with its basic underlying assumptions: that the in-group’s very survival
depends on some form of defensive action against the out-groups.
While the most highly-upvoted comments consisted largely of references to the Muslim
threat as an imminent threat and the subsequent necessity of a ‘defensive stance’, these highly-
upvoted comments failed to specify how members should respond to this threat. For instance,
while contributors argued that the threat may, in part, be alleviated by striking down the
‘disastrous’ immigration policies that allow Muslims to ‘flood’ to the U.S. and the West, the
highly-upvoted comments provided no guidelines about how to defend themselves against
Muslims who already reside within the country. It would appear, then, that members’ highly-
upvoted discussions around defending themselves from Muslims were meant to be generalized
and, as such, were careful not to overtly incite violence against a particular person or group.
An assessment of the random sample of comments revealed that the highly-upvoted anti-
Muslim comments discussed above went unchallenged, at least visibly. This is largely because
comments that ran counter to the most highly-upvoted narrative typically did not receive any
upvotes from other members and, as a result, were less visible than their highly-upvoted
counterparts. For instance, one user who suggested that ‘the basic ideals of Islam’ include such
things as ‘faith, generosity, self-restraint, upright conduct’ (UserID: 1928, 1) did not have their
comment upvoted. Moreover, comments that opposed the more prominent anti-Muslim
narratives, such as ‘Terrorist come in all colors. So do Americans’ (UserID: 1680, 2), were far
from the most highly-upvoted comments in the community. Similarly, users arguing that
Muslims who commit violent acts have ‘psychological problems so it’s not a terror related
attack’ (UserID: 1860, 1) or that ‘there are a lot of good muslim people, but theyre bad muslims’
(UserID: 1733, 4) were largely unpopular within the community.
While there were a small number of particularly anti-Muslim comments in the random
sample, they were less frequent (i.e., 1.6% of the random sample) and, for the most part, less
extreme than those in the highly-upvoted sample. To illustrate, one member in the random
sample wondered ‘why do countries like Sweden let Muslim war criminals into their country
with no repercussion?’ (UserID: 1788, 2). Another user here argued ‘if they make Europe more
Muslim then the West will stop bombing the Middle East’ (UserID: 1609, 30). There were,
however, a number of comments in the random sample that reflected the more extreme anti-
Muslim rhetoric that characterized the highly-upvoted sample, but these comments received
much lower vote scores than the highly-upvoted
messages – most likely because references to Muslims as an external threat were relatively rare
among the comments in the random sample.
The internal threat
Commonly expressed in the most highly-upvoted comments was a second threat: the Left.
Sentiment against this out-group comprised 13.4% of the most highly-upvoted comments in the
sample. An assessment of the most highly-upvoted messages suggested that the Leftist out-group
was perceived as violent, particularly against the extreme right-wing in-group. Here a widely
held belief was that, as two users described, ‘the Left are not as peaceful as they pretend to be’
(UserID: 392, 1982) and they are ‘growing increasingly unhinged every passing day’ (UserID:
880, 2065). Members who posted highly-upvoted content oftentimes noted that the Left is, as
one user put it, sending ‘death threats to people who have different political views than [them]’
(UserID: 880, 2065). Users who posted highly-upvoted comments in the r/The_Donald community
even argued that the Left was willing to go so far as to physically attack their right-wing political
opponents or, as one user put it, ‘smash a Trump supporter over the head with a bike lock
because of his intolerance to non-liberal viewpoints’ (UserID: 619, 1640). Yet even as fellow
members were depicted as the victims of the Left’s violence, an analysis of the highly-upvoted
content revealed that members in r/The_Donald encouraged each other not to react to the Left
because, as one user explained it, ‘[v]iolent backlash from the right is exactly what they’re
looking for so they can start point fingers and calling us the violent ones’ (UserID: 1000, 2173).
Also uncovered in the highly-upvoted comments were discussions about the Left seeking
to weaken the West in general and the U.S. in particular through their crusade against the white
race. Specifically, users who posted highly-upvoted content oftentimes expressed frustration
about how the Left has constantly undermined the many societal contributions made by the white
race, which – as one user best summed up – include ‘economies that actually work, philosophy
as a whole, anti-biotics and healthcare in general, printed press, technology in general, etc.’
(UserID: 357, 1688). Further added here were discussions about how the Left would rather focus
on what white people have done wrong and ‘shit all over white people for what they did in the
past’ (UserID: 357, 1688), as one user put it, rather than acknowledge the contributions that the
white race has made to Western society. Together, the general sentiment was that the Left was
seeking to deter whites – and by extension members of the r/The_Donald community – from feeling
any sense of pride in their white ancestry, as evidenced by the following highly-upvoted
comment:
You’re a white male kid. You’ve grown up being told you’re the reason for all the
world’s ills. You want to speak out. You have to go to your high school in the
dark, slouched, with your face hidden to simply post a sign that says It’s okay to
be white. Leave. School hands security footage over to the media. The media
broadcasts the picture nationwide and says they are investigating. Just step back
and take in how crazy this is. (UserID: 77, 2099)
As a result, members of r/The_Donald concluded not only that Leftists ‘hate white people’
(UserID: 479, 1582), but that, as one user explained it, ‘our enemies sure want whites to be a
minority or even dead’ (UserID: 197, 1356), as evidenced by their support for the immigration of
‘undesirables’ into the West in general and the U.S. in particular.
Also apparent in the highly-upvoted content were discussions about the Left’s support for
immigration – especially Muslim immigration into the West from dangerous countries.
According to users who posted highly-upvoted comments, this was another key component of
the Left’s so-called anti-white crusade. Members noted that although Muslim immigrants
increase the risk of violence and terrorism in the U.S., such a threat does not prevent the Left
from encouraging them to migrate to the U.S. Perhaps this is because, as one user who posted a
highly-upvoted comment emphasized, the Left ‘do[es]n’t view ISIS or terrorism as a threat to
this country at all... [and the Left is] more scared of Trump’s policies than ISIS’ (UserID: 720,
1361). Still, users who posted the most highly-upvoted comments in r/The_Donald noted that
when ‘dangerous Muslim immigrants’ inevitably commit violent acts against the West, members
believe that these instances of Muslim-perpetrated violence are simply overlooked and excused
by the Left. As one user explained: ‘Liberals will defend these sick fucks because they come
from a differnt culture and don’t understand western society’ (UserID: 799, 4229). The general
sentiment expressed in this regard was that the Left favors Muslim immigrants over the well-
being of Americans – sentiment that was oftentimes met with frustration, as expressed by one
user: ‘the left's love affair with islam is so cringey and infuriating’ (UserID: 687, 1653). Linked
to these discussions were conspiracy theories about the left-wing media playing a major role in
distorting public perceptions about the ‘Muslim threat’. One highly-upvoted comment, for
example, formed an anti-Muslim acronym, based on the media’s distortion of Muslim-
perpetrated terrorism: ‘Just another random incident. Impossible to discern a motive. Has
nothing to do with religion. Always believe the media. Don’t jump to any conclusions’ (UserID:
486, 1461; emphasis in original).
An examination of the random sample of comments revealed few discussions about the
so-called Leftist threat (i.e., 5.7% of the random sample). Not only was the Left mentioned less
often in the random sample than in the highly-upvoted sample, but it was mentioned in less
extreme ways as well. Instead, attacks against the Left appeared intended to be more humorous than
serious. To illustrate, one user criticized the Left, noting that, ‘in addition to being unable to
meme, leftists and ANTIFA can’t shower either’ (UserID: 1891, 17). Similarly, another user
admonished the ‘[t]ypical leftist hypocrites. That’s why we can’t play fair with them. They never
reciprocate the gesture of good faith in anything’ (UserID: 1719, 9). Another less popular
comment in the random sample suggested that the Left seeks to ‘protect free speech by banning
it. Leftist logic’ (UserID: 1751, 1). This, however, was the extent to which the Left was referred
to in the random sample of comments.
Discussion
The results of the current study suggest that a particularly effective form of collective identity
building was prevalent throughout the r/The_Donald community – one that was riddled with
hateful sentiment against members’ perceived enemies: Muslims and the Left. The thematic
analysis of the highly-upvoted content indicates that members most often agreed with (i.e.,
upvoted) extreme views toward two of their key adversaries to mobilize their social movement
around these so-called threats. In particular, r/The_Donald’s rules effectively condoned anti-
Muslim content, which in large part explains why such content was so prevalent among the most
highly-upvoted messages. Additionally, Donald Trump’s own anti-Muslim rhetoric, which has
emboldened right-wing extremists to commit hateful acts against Muslims (see Müller and
Schwarz, 2019), may explain why his supporters on r/The_Donald were so eager to vilify the
external threat. Known as the ‘Trump Effect’ (Müller and Schwarz, 2019), the results of the
current study suggest that Trump’s anti-Muslim rhetoric emboldened his fervent supporters on
r/The_Donald to spread anti-Muslim content via Reddit’s upvoting algorithm. Indeed, anti-
Muslim hate speech is increasingly becoming an ‘accepted’ form of racism across the globe and
extreme right movements have capitalized on this trend to help mobilize an international
audience of right-wing extremists (see Hafez, 2014; see also Froio and Ganesh, 2019). For
instance, the othering of Muslims, specifically, is commonly used to strengthen in-group ties
among far-right extremists against a common enemy (Hafez, 2014). By describing Muslims as
the violent perpetrators and themselves as those who must defend themselves from ‘them’, for
example, members in r/The_Donald framed themselves as victims rather than perpetrators – an
othering tactic commonly used by right-wing extremists, among other extremist movements (see
Meddaugh and Kay, 2009). This suggests that, by endorsing anti-Muslim sentiment,
r/The_Donald offered extremists a virtual community wherein they could bond with like-minded
peers around a common enemy. Worth highlighting, though, is that, while a large proportion of
the highly-upvoted content in the current study framed Muslims as a serious and imminent
threat, users tended not to provide specific strategies to respond to the so-called threat. Such
behavioral monitoring may have been done in an effort to safeguard the community from being
banned for breaking Reddit’s sitewide policy against inciting violence.[6] Regardless of these
efforts, recent amendments to Reddit’s content policy that explicitly target hate speech led to the
ban of r/The_Donald (see Tiffany, 2020).
There are a number of reasons that might explain why the Left was the target of othering
discourse in the highly-upvoted content on r/The_Donald. For instance, such sentiment most
likely reflects the increasing political divide in the U.S. and other parts of the Western world,
where the ‘right’ often accuse the ‘left’ of threatening the wellbeing of the nation (see Dimock et
al., 2014). However, the results of the current study suggest that some users on r/The_Donald
took this narrative to the extreme. To illustrate, the anti-Left discourse uncovered in the
highly-upvoted content mirrors the ‘paradoxical’ identity politics expressed by the alt-right
movement, which accuses the Left of authoritarian censorship yet positions the far right as the
harbinger of ‘peaceful social change’ (see Phillips and Yi, 2018). On r/The_Donald,
the Left (i.e., the out-group) was construed as the opposite of peaceful and even as a violent
and physical threat to members of the in-group. Framing the out-group as a physical threat to
members of the in-group served to further delineate the borders between ‘them’ and ‘us’ and to
solidify bonds between members of the in-group who, together, face a common enemy – a
community-building tactic that is commonly discussed in the social movement literature (see
Futrell and Simi, 2004; see also Perry and Scrivens, 2016). Notably, however, in the face of this
so-called physical threat, users in the highly-upvoted comments cautioned others against
retaliating against ‘Leftist-perpetrated violence’. It is probable that members wanted to
avoid discussions of offline violence for fear of being banned from Reddit for overstepping its
content policy against inciting violence. If members instead were to encourage one another to react
with violence against their perceived enemies, they would risk having their community banned
for violating Reddit’s content policy. Similar tactics have been reported in empirical studies
which found that extreme right adherents in online discussion forums will deliberately present
their radical views in a subtler manner in fear that they will be banned from the virtual
community (see Daniels, 2009; Meddaugh and Kay, 2009).
Comparing the content in the highly-upvoted sample with the random sample, it is clear
that there was a complete absence of dissenting views among the most visible content in the
highly-upvoted comment sample. This suggests that Reddit’s voting algorithm facilitated an
‘echo chamber’ effect in r/The_Donald – one which promoted anti-Muslim and anti-Left
sentiment. In particular, rather than encouraging a variety of perspectives within discussions on
r/The_Donald, Reddit’s voting algorithm seemed to allow members to create an echo chamber
by shaping the discourse within their communities in ways that reflected a specific set of values,
norms, and attitudes about the in-group. For example, as evidenced by an analysis of
r/The_Donald’s most highly-upvoted comments, members of the subreddit community tended to
only support, or upvote, extreme views about Muslims and the Left. That is, among the
community’s most highly-upvoted content was an absence of comments which offered an
alternative perspective or even a dissenting view to the community’s dominant extremist
narrative. In some cases, members even used humor and sarcasm – which is commonly used by
commenters on Reddit (see Mueller, 2016) and related sites like Imgur (see Mikal et al., 2014)
regardless of political ideology – to express otherwise ‘taboo’ anti-Muslim and anti-Left views,
reflecting a longstanding tactic used by the extreme right to ‘say the unsayable’ (see Billig,
2001).
Overall, our study’s findings suggest that Reddit’s upvoting and downvoting features
played a central role in facilitating collective identity formation among those who post extreme
right-wing content on r/The_Donald. Reddit’s upvoting feature functioned to promote and
normalize otherwise unacceptable views against the out-groups to produce a one-sided narrative
that served to reinforce members’ extremist views, thereby strengthening bonds between
members of the in-group. On the other hand, Reddit’s downvoting feature functioned to ensure
that members were not exposed to content that challenged their extreme right-wing beliefs,
which in turn created an echo chamber for hate and may also have functioned to correct the
behavior of dissenting members – a possibility gleaned from the results of previous research on
the effect of Imgur’s bidirectional voting features on social identification (Hale, 2017). Seen
through the lens of social movement theory, the extreme views against Muslims and the Left that
characterized the ‘othering’ discourses among the most highly-upvoted comments may have
produced a stronger reaction from r/The_Donald’s extreme right-wing in-group and, as a result,
made its members more likely to mobilize around these two threats.
Limitations and future research
The current study includes a number of limitations that may inform future research. First, our
sampling procedure may have implications for the study’s findings, as the highly-upvoted sample
included a larger proportion of thread-starting comments than the random sample (i.e., 89.3%
and 46.5%, respectively). As a result, fewer instances of right-wing extremist content found in
the random sample could be the result of the greater proportion of response comments (i.e.,
comments that may be less visible than thread-starters, or support extremist content rather than
feature it themselves) when compared to the predominantly thread-starting comments of the highly-
upvoted sample. Future work should explore this possibility by creating a matched control
sample in which the number of thread-starting and response comments is equal to that of the
experimental sample. Second, we did not identify an important indicator of user engagement: the
number of replies to comments in the highly-upvoted sample or the random sample. Exploring
the number of replies to comments featuring anti-Muslim and anti-Left content versus those
without such content may provide additional insight into user engagement. Third, we drew from
social movement theory to explore the role of Reddit’s unique voting algorithm in promoting
‘othering’ discourses that in turn facilitated collective identity formation on r/The_Donald, but
clearly there are other theoretical frameworks that could be applied in this context as well.
Bandura's (2002) theory of moral disengagement, for example, offers interesting insights into the
themes uncovered in the current study, including detrimental actions, injurious effects, and
victimization. These aspects of Bandura’s theory function to enhance the likelihood of moral
disengagement and are bound by notions of ‘us’ vs ‘them’, all of which should be explored
further.
In closing, the results of the study highlight at least four additional paths for further
inquiry. The first involves temporal change. Temporal analyses may be able to effectively
capture how the anti-Muslim and anti-Left sentiment uncovered in the current study evolves over
time. This, in turn, may offer new insight into the evolution of extremists’ collective identities in
virtual settings (see Scrivens et al. 2018) and provide more insight into whether and how the
nature of r/The_Donald’s echo chamber dynamics change over time. Second, the data for the
current study was gathered from only the period after Trump’s U.S. presential election victory,
and not before. Future studies could, for example, incorporate an interrupted time series analysis
into the research design, assessing whether Trump’s election had a significant impact on the
volume and severity of hateful speech on r/The_Donald. Since offline events can significantly
impact the sentiment in far-right online communities (see Bliuc et al., 2019), determining the
impact of offline events, such as Donald Trump’s election, may shed light on whether and how
they emboldened right-wing extremist sentiment within the community. Other external events that
may have influenced the most highly-upvoted comments on r/The_Donald during the 2017
study period include Islamist terror attacks such as the New York City truck attack and the London
Bridge attack, or other galvanizing events such as the Unite the Right rally in Charlottesville.
Future research should therefore attempt to account for these central events and subsequent
interaction effects during the analysis stage. Research is also needed to assess how the banning
of r/The_Donald impacted its online community in general and collective identity formation in
particular, such as whether its users migrated to other political subreddits (e.g., r/Conservative),
backup or mimic sites (e.g., TheDonald.win), or alternative platforms with more lenient content
policies (e.g., Gab or Voat) in an effort to express their extremist views. Lastly, future research
could compare the discourse within and across a number of other extreme subreddits such as, for
example, the recently quarantined r/Imgoingtohellforthis, which is known to promote ‘racist
nationalism’ (see Topinka, 2018). This may provide insight into how collective identities form in
extreme online communities in general, and how Reddit’s unique features facilitate collective
identity formation, in particular.
References
Account and Community Restrictions (n.d.-a). Promoting Hate Based on Identity or
Vulnerability. Available at: https://www.reddithelp.com/en/categories/rules-
reporting/account-and-community-restrictions/promoting-hate-based-identity-or.
(accessed 30 June 2020).
Account and Community Restrictions (n.d.-b). Quarantined Subreddits. Available at:
https://www.reddithelp.com/en/categories/rules-reporting/account-and-community-
restrictions/quarantined-subreddits. (accessed 6 January 2020).
Adams J and Roscigno V (2005). White supremacists, oppositional culture and the World Wide
Web. Social Forces 84(2): 759—778.
Anti-Defamation League (2018). New hate and old: The changing face of American white
supremacy. Anti-Defamation League.
Bandura A (2002). Selective moral disengagement in the exercise of moral agency. Journal of
Moral Education 31(2): 101—119.
Berger JM (2018). Extremism. Cambridge: The MIT Press.
Berger JM (2016). Nazis vs. ISIS on Twitter: A comparative study of white nationalist and
ISIS online social media networks. The George Washington University Program on
Extremism. Available at:
https://cchs.gwu.edu/files/downloads/Nazis%2520v.%2520ISIS%2520Final_0.pdf.
(accessed 8 January 2020).
Berger JM and Strathearn B (2013). Who matters online: Measuring influence, evaluating
content and countering violent extremism in online social networks. The International
Centre for the Study of Radicalisation and Political Violence. Available at:
http://icsr.info/wp-content/uploads/2013/03/ICSR_Berger-and-Strathearn.pdf. (accessed
8 January 2020).
Billig M (2001). Humour and hatred: The racist jokes of the Ku Klux Klan. Discourse & Society
12(3): 267—289.
Bliuc AM, Betts J, Vergani M, Iqbal M and Dunn K (2019). Collective identity changes in far-
right online communities: The role of offline intergroup conflict. New Media & Society
21(8): 1770–1786.
Bowman-Grieve L (2009). Exploring “Stormfront”: A virtual community of the radical right.
Studies in Conflict & Terrorism 32(11): 989—1007.
Braun V and Clarke V (2006). Using thematic analysis in psychology. Qualitative Research in
Psychology 3(2): 77—101.
Burnap P and Williams M L (2015). Cyber hate speech on Twitter: An application of machine
classification and statistical modeling for policy and decision. Policy and Internet 7(2):
223—242.
Burris V, Smith E and Strahm A (2000). White supremacist networks on the Internet.
Sociological Focus 33(2): 215—235.
Carman M, Koerber M, Li J, Choo KKR and Ashman H (2018) Manipulating visibility of
political and apolitical threads on reddit via score boosting. In 17th IEEE International
Conference on Trust, Security and Privacy in Computing and Communications/12th
IEEE International Conference on Big Data Science and Engineering, pp. 184—190,
IEEE.
Caiani M and Kröll P (2014). The transnationalization of the extreme right and the use of the
Internet. International Journal of Comparative and Applied Criminal Justice 39(4):
331—351.
Conway M (2017). Determining the role of the Internet in violent extremism and terrorism: Six
suggestions for progressing research. Studies in Conflict & Terrorism 40(1): 77—98.
Conway M, Scrivens R and Macnair L (2019). Right-wing extremists’ persistent online
presence: History and contemporary trends. The International Centre for Counter-
Terrorism – The Hague 10: 1—24.
Crisafi A (2005). The Seduction of Evil: An Examination of the Media Culture of the Current
White Supremacist Movement. In Ni Fhlainn, S., and Myers, W. A. (eds.), The Wicked
Heart: Studies in the Phenomena of Evil (pp. 29—44). Oxford: Inter-Disciplinary Press.
Daniels J (2009). Cyber racism: White supremacy online and the new attack on civil rights.
Lanham, MD: Rowman and Littlefield Publishers.
Diani M (1992). The concept of social movement. Sociological Review 40(1): 1—25.
Dimock M, Doherty C, and Kiley J (2014). Political polarization in the American public: How
increasing ideological uniformity and partisan antipathy affect politics, compromise and
everyday life. Pew Research Center, 12.
Edwards GS and Rushin S (2018). The effect of President Trump’s election on hate crimes.
Available at: https://ssrn.com/abstract=3102652. (accessed 8 January 2020).
Ekman M (2014). The dark side of online activism: Swedish right-wing extremist video activism
on YouTube. Journal of Media and Communication Research 30(56): 79—99.
Ekman M (2018). Anti-refugee mobilization in social media: The case of Soldiers of Odin.
Social Media + Society 4(1): 1—11.
Feinberg A (2018). Conde Nast sibling reddit says banning hate speech is just too hard. The
Huffington Post, 9 July. Available at: https://www.huffpost.com/entry/reddit-ceo-ban-
hate-speech-hard_n_5b437fa9e4b07aea75429355. (accessed 8 January 2020).
Froio C and Ganesh B (2019). The transnationalisation of far-right discourse on Twitter: Issues
and actors that cross borders in Western European democracies. European Societies
21(4): 519—539.
Futrell R and Simi P (2004). Free spaces, collective identity, and the persistence of US white
power activism. Social Problems 51(1): 16—42.
Gibbs S (2018). Open racism and slurs are fine to post on reddit, says CEO. The Guardian, 12
April. Available at: https://www.theguardian.com/technology/2018/apr/12/racism-slurs-
reddit-post-ceo-steve-huffman. (accessed 8 January 2020).
Graham R (2016). Inter-ideological mingling: White extremist ideology entering the
mainstream on Twitter. Sociological Spectrum 36(1): 24—36.
Hafez F (2014). Shifting borders: Islamophobia as common ground for building pan-European
right-wing unity. Patterns of Prejudice 48(5): 479—499.
Hale BJ (2017). “+1 for Imgur”: A content analysis of SIDE theory and common voice effects on
a hierarchical bidirectionally-voted commenting system. Computers in Human Behavior
77(1): 220—229.
Hauser C (2017). Reddit bans ‘incel’ group for inciting violence against women. The New York
Times, 9 November. Available at:
https://www.nytimes.com/2017/11/09/technology/incels-reddit-banned.html. (accessed 8
January 2020).
Hunt SA and Benford RD (1994). Identity talk in the peace and justice movement. Journal of
Contemporary Ethnography 22(4): 488—517.
Maguire M and Delahunt B (2017). Doing a thematic analysis: A practical, step-by-step guide
for learning and teaching scholars. AISHE-J: The All Ireland Journal of Teaching and
Learning in Higher Education, 9(3).
Maher C, Hadfield M and Hutchings M (2018). Ensuring rigor in qualitative data analysis: A
design research approach to coding combining NVIVO with traditional material methods.
International Journal of Qualitative Methods 17(1): 1—13.
Martin T (2017). Dissecting trump’s most rabid online following. Five Thirty Eight, 23 March.
Available at: https://fivethirtyeight.com/features/dissecting-trumps-most-rabid-online-
following/. (accessed 8 January 2020).
Massanari A (2017). #Gamergate and the fappening: How Reddit’s algorithm, governance, and
culture support toxic technocultures. New Media & Society 19(3): 329—346.
Meddaugh PM and Kay J (2009). Hate speech or “reasonable racism?” The other in Stormfront.
Journal of Mass Media Ethics 24(4): 251—268.
Melucci A (1995). The process of collective identity. In Johnston H and Klandermans B (eds)
Social Movements and Culture. Minneapolis: University of Minnesota Press, pp. 41-63.
Mikal JP, Rice RE, Kent RG, and Uchino BN (2014). Common voice: Analysis of behavior
modification and content convergence in a popular online community. Computers in
Human Behavior 35: 506—515.
Mueller C (2016). Positive feedback loops: Sarcasm and the pseudo-argument in Reddit
communities. Working Papers in TESOL & Applied Linguistics 16(2): 84—97.
Müller K and Schwarz C (2019). From hashtags to hate crimes. Available at:
http://dx.doi.org/10.2139/ssrn.3149103. (accessed 8 January 2020).
Nagle A (2017). Kill all normies: The online culture wars from Tumblr and 4chan to the alt-
right and Trump. Alresford: John Hunt Publishing.
Nouri L and Lorenzo-Dus N (2019). Investigating Reclaim Australia and Britain First’s use of
social media: Developing a new model of imagined political communities online. Journal
for Deradicalization 18: 1—37.
O’Callaghan D, Greene D, Conway M, Carthy J and Cunningham P (2014). Down the
(white) rabbit hole: The extreme right and online recommender systems. Social Science
Computer Review 33(4): 1—20.
Perry B and Scrivens R (2016). White pride worldwide: Constructing global identities online. In:
Schweppe J and Walters M (eds) The Globalization of Hate: Internationalizing Hate
Crime. Oxford University Press., pp. 65—78.
Phillips J and Yi J (2018). Charlottesville paradox: The ‘liberalizing’ alt right, ‘authoritarian’
left, and politics of dialogue. Society 55(3): 221—228.
Polletta F and Jasper JM (2001). Collective identity and social movements. Annual Review of
Sociology 27(1): 283—305.
r/The_Donald Wiki (n.d.). Can you explain your rules? Available at:
https://www.reddit.com/r/The_Donald/wiki/index. (accessed 6 January 2020).
Reddit.com Competitive Analysis (n.d.). Retrieved from
https://www.alexa.com/siteinfo/reddit.com (accessed 30 June 2020).
Reddit Content Policy (n.d.) Retrieved from https://www.redditinc.com/policies/content-policy.
(accessed 6 January 2020).
Reddit Press (n.d.) Reddit by the numbers. Available at: https://www.redditinc.com/press.
(accessed 6 January 2020).
Romano A (2017). Reddit just banned one of its most toxic forums. But it won’t touch
The_Donald. Vox, 13 November. Available at:
https://www.vox.com/culture/2017/11/13/16624688/reddit-bans-incels-the-donald-
controversy. (accessed 8 January 2020).
Saldaña J (2009). The coding manual for qualitative researchers. Thousand Oaks, CA: Sage.
Scrivens R, Davies G and Frank R (2018). Measuring the evolution of radical right-wing posting
behaviors online. Deviant Behavior 41(2): 216—232.
Simi P and Futrell R (2015). American Swastika: Inside the White Power Movement’s Hidden
Spaces of Hate (2nd Ed). Lanham, MD: Rowman and Littlefield.
Snow D (2001). Collective Identity and Expressive Forms. University of California, Irvine
eScholarship Repository. Available at: http://repositories.cdlib.org/csd/01-07. (accessed 8
January 2020).
Stewart E (2019). Reddit restricts its biggest pro-Trump board over violent threats. Vox, 26 June.
Available at: https://www.vox.com/recode/2019/6/26/18760288/reddit-the-donald-trump-
message-board-quarantine-ban. (accessed 8 January 2020).
Stier S, Posch L, Bleier A and Strohmaier M (2017). When populists become popular:
Comparing Facebook use by the right-wing movement Pegida and German political
parties. Information, Communication and Society 20(9): 1365—1388.
Tiffany K (2020). Reddit is done pretending The Donald is fine. The Atlantic, 29 June.
Available at: https://www.theatlantic.com/technology/archive/2020/06/reddit-ban-the-
donald-chapo-content-policy/613639. (accessed 29 June 2020).
Topinka RJ (2018). Politically incorrect participatory media: Racist nationalism on
r/ImGoingToHellForThis. New Media & Society 20(5): 2050—2069.
Ward J (2018). Day of the trope: White nationalist memes thrive on Reddit’s r/the_donald.
Southern Poverty Law Center. Available at:
https://www.splcenter.org/hatewatch/2018/04/19/day-trope-white-nationalist-memes-
thrive-reddits-rthedonald. (accessed 8 January 2020).
Wildman J (2020). What is Reddit? Digital Trends, 1 July. Available at:
https://www.digitaltrends.com/web/what-is-reddit. (accessed 10 July 2020).
Zadrozny B and Collins B (2018). Reddit bans qanon subreddits after months of violent threats.
NBC News, 12 September. Available at: https://www.nbcnews.com/tech/tech-
news/reddit-bans-qanon-subreddits-after-months-violent-threats-n909061. (accessed 8
January 2020).
Zannettou S, Bradlyn B, De Cristofaro E, Kwak H, Sirivianos M, Stringhini G and Blackburn J
(2018). What is Gab: A bastion of free speech or an alt-right echo chamber. In WWW ’18:
Companion Proceedings of The Web Conference 2018.
Notes

[1] Reddit has since amended its content policy to discourage ‘promoting hate based on identity
or vulnerability’ (see Account and Community Restrictions, n.d.-a).

[2] For more on the data, see https://files.pushshift.io/reddit/comments.

[3] Note that a Reddit ‘comment’ is a user-submitted piece of text that is created in response to a
particular ‘post’. For instance, when a r/The_Donald member finds an interesting link to a news
article, they may create a ‘post’ which links this content to r/The_Donald, to which other
members may comment.

[4] All author names were assigned pseudonyms to protect user anonymity. All online comments
were quoted verbatim.

[5] This number represents the comment’s ‘vote score’, or the number of upvotes given to a
comment minus downvotes.

[6] Reddit has used this policy to ban popular extremist subreddits in the past, including the anti-
women community r/Incels in 2017 (Hauser, 2017) and the conspiracy theory community
r/QAnon in 2018 (Zadrozny and Collins, 2018).