Examining Online Indicators of Extremism among Violent and Non-Violent Right-Wing
Extremists
Ryan Scrivens
School of Criminal Justice, Michigan State University
East Lansing, Michigan, USA
rscriv@msu.edu
Disclosure Statement
No potential conflict of interest was reported by the author.
Acknowledgement
Thank you to Tiana Gaudette and Brenna Helm for their assistance in coding data for the current
study. Thanks also to Steven Chermak for his valuable feedback on earlier versions of this study
and to Richard Frank for his data collection efforts on the project. Lastly, thank you to the two
anonymous reviewers and editors Beatrice de Graaf and Max Taylor for their valuable and
constructive feedback.
Examining Online Indicators of Extremism Among Violent and Non-Violent Right-Wing
Extremists
Abstract:
Although there is an ongoing need for law enforcement and intelligence agencies to identify and
assess the online activities of violent extremists prior to their engagement in violence offline,
little is empirically known about their online posting patterns generally or differences in their
online patterns compared to non-violent extremists who share similar ideological beliefs
particularly. Even less is empirically known about how their online patterns compare to those
who post in extremist spaces in general. This study addresses this gap through a content analysis
of postings from a unique sample of violent and non-violent right-wing extremists as well as
from a sample of postings within a sub-forum of the largest white supremacy web-forum,
Stormfront. Here the existence of extremist ideologies, personal grievances, and violent
extremist mobilization efforts was quantified within each of the three sample groups. Several
notable differences in posting patterns were observed across samples, many of which may inform
future risk factor frameworks used by law enforcement and intelligence agencies to identify
credible threats online. This study concludes with a discussion of the implications of the analysis,
its limitations, and avenues for future research.
Purpose
This study examines extremist ideologies, personal grievances, and violent extremist mobilization efforts expressed by a sample of violent and non-violent right-wing extremists (RWEs) within a sub-forum of one of the largest and best-known white supremacy web-forums, Stormfront. I investigate whether violent RWEs are differentially active in an online space of the extreme right compared to non-violent RWEs, with the goal of quantifying the extent to which online indicators of extremism are found within the content posted by each sample group. A sample of postings from the RWE sub-forum was also analyzed for comparative purposes, as little is empirically known about whether the posting patterns of RWE users who post online in general differ from those of violent and non-violent RWEs. This study represents an original contribution to the academic literature on violent online political extremism on three fronts.
First, many researchers, practitioners, and policymakers continue to raise concerns about the role of the Internet in facilitating violent extremism and terrorism.1 Questions often surround the impact of offenders’ consumption of and networking around violent extremist online content on their acceptance of extremist ideology and/or their decision to engage in violent extremism and terrorism.2 Understandably, increased attention has been given to identifying violent extremists online prior to their engagement in offline violence and to scrutinizing their online presence.3 A growing interest in these issues notwithstanding, few empirically grounded analyses have identified which online users have engaged in violent extremism offline and then examined their digital footprints. What little evidence does exist in this regard suggests that practitioners and policymakers should not conceive of violent extremism as an offline versus online dichotomy. To illustrate, Gill and Corner4 looked at the behavioral underpinnings of a sample of 119 lone-actor terrorists within the United States (U.S.) and Europe and found that, while the number of lone-actor terrorist plots remained stable over time, the growth of the Internet altered their means of radicalization and attack learning. Building on Gill and Corner,5 Gill and colleagues6 examined the online behaviors of 223 convicted United Kingdom (U.K.)-based terrorists and found that those who learned online were more likely than those who did not learn online to have experienced network activity and place interaction in the offline world. Holbrook and Taylor7 focused on the pre-arrest media usage of five U.K.-based terrorists and found that a belief pathway precipitated any operational action, with subjects consuming a diverse range of media across several platforms and interacting online in chatrooms as well as offline by copying compact discs of extremist content for one another. Lastly, Gaudette and colleagues8 conducted in-depth interviews with Canadian former violent RWEs on their use of the Internet during their involvement in violent extremism and identified an important interaction between their on- and offline behaviors, which were intertwined with extremist activities, identities, and a need for security. Despite these foundational studies, there remains a need to closely examine the online activities of violent extremists to inform future risk factor frameworks used by law enforcement and intelligence agencies to identify credible threats online.9
Second, researchers, practitioners, and policymakers have paid close attention to the presence of violent extremists and terrorists online in recent years, with a particular emphasis on the digital patterns and behaviors of the extreme right.10 This is largely the result of ongoing reports that some violent RWEs and terrorists were active online prior to their attacks,11 with one of the most recent examples being 28-year-old Australian Brenton Tarrant who, before killing 51 people in two Christchurch, New Zealand mosques in 2019 and live-streaming his attack, announced his intentions on 8chan and produced a ‘manifesto’ linked on the website.12 It should come as little surprise that researchers have focused on the activities of RWEs on various platforms, including websites and discussion forums,13 mainstream social media sites such as Facebook,14 Twitter,15 and YouTube,16 fringe platforms including 4chan17 and Gab,18 and digital applications such as TikTok19 and Telegram.20 But most studies, similar to research on the causes of violent extremism and terrorism in general, lack comparison groups, despite a significant need to focus on comparative analyses and consider how violent extremists differ from non-violent extremists.21 In other words, empirical research has overlooked differences in the online patterns of those who share extreme ideological beliefs but are violent or non-violent offline.22 An exhaustive search produced just three empirical studies on the online patterns and behaviors of violent and non-violent extremists. Holt and colleagues23 examined the underlying theoretical assumptions evident in radicalization models through a case-study analysis of violent and non-violent extremists. Wolfowicz and colleagues24 used a matched case-control design to differentiate between terrorists and non-violent radicals based on their Facebook profiles. Lastly, Scrivens and colleagues25 examined the posting behaviors of violent and non-violent RWEs and whether specific posting behaviors were characteristic of users’ violence status. The current study adds to this area of research by examining the posting behavioral patterns of violent and non-violent extremists while also accounting for the posting patterns of the general extremist community in which the violent and non-violent extremists participate.
Third, there is a growing body of research in terrorism and extremism studies attempting to develop indicators of violent extremist mobilization and related work exploring risk26 and protective27 factors critical to processes of violent radicalization. Much of the emphasis in these studies is on an individual’s expressed grievance(s) as an important risk factor for determining whether someone is preparing to engage in violent extremism or terrorism.28 A growing body of research has also explored various online mobilization efforts by RWE movements (among other extremist movements), including – but not limited to – empirical studies exploring the mobilization efforts of RWE groups and movements on violent extremist forums,29 social media,30 and digital applications,31 as well as the impact of trigger events on online mobilization, such as riots,32 rallies,33 terrorist attacks,34 and presidential election results.35 Law enforcement and intelligence agencies likewise recognize the need to better understand extremist mobilization efforts online,36 as empirical research suggests that the Internet plays an important role in facilitating and mobilizing individuals to extremist violence.37 The acknowledgement of the role of the Internet in mobilizing individuals to violent extremism has motivated multiple federal law enforcement agencies (e.g., the Federal Bureau of Investigation, the U.S. Department of Homeland Security, and the National Counterterrorism Center) to develop laundry lists of precursor behaviors linked to violent extremist outcomes.38 Among the concerns in this regard are a range of online activities such as engaging others in ideological discussions, demonstrating ideological outbursts, and seeking to join an extremist group that promotes violence to address perceived grievances, among many other indicators.39 But regardless of the ongoing efforts by law enforcement and intelligence agencies to develop a meaningful set of violent extremist mobilization indicators to identify credible threats, there is little empirical evidence about the presence of these indicators in online extremist communities.40 Most of the research in this area relies on anecdotal accounts or case studies to better understand mobilization efforts in online communities. Further, this research has yet to identify differences in the mobilization efforts of violent and non-violent extremists who post online or to assess whether their mobilization efforts differ from those of individuals who post in the same online space generally.
Current study
Data and sample
Online postings from the open access sections of Stormfront Canada, a Canadian-themed sub-forum of the largest white supremacy discussion forum Stormfront, were analyzed for the current study. Stormfront is the oldest racial hate site and discussion forum used by members of the RWE movement. Stormfront is also one of the most influential RWE forums in the world41 and includes an array of sub-sections addressing a variety of topics, including an ‘International’ section composed of a range of geographically and linguistically bounded sub-forums (e.g., Stormfront Europe, Stormfront Downunder, and Stormfront Canada). Stormfront has also served as a “funnel site” for the RWE movement, wherein forum users have been targeted by other RWE users for the purpose of recruiting new members into violent offline groups, including online forums hosted by the Hammerskins, Blood & Honour, and various Ku Klux Klan branches.42 Today, Stormfront has approximately 368,000 ‘members’ and contains over 13.9 million posts.
Understandably, Stormfront has been the focus of much research attention since its inception, including an assessment of recruitment efforts by forum users,43 the formation of a virtual community44 and collective identity there,45 the extent to which Stormfront is connected to other racial hate sites,46 and how Stormfront discourse is less virulent and more palatable to readers.47 Although a number of emerging digital spaces have been adopted by the extreme right in recent years, Stormfront continues to be a valuable online space for researchers to assess behavioral posting patterns. To illustrate, recent efforts have been made to examine RWE posting behaviors found on the platform,48 the development of user activity and extremist language there,49 the impact of presidential election results on Stormfront posting behaviors,50 and the ways in which the collective identity of the extreme right takes shape over time51 and is affected by offline intergroup conflict on the forum.52 Some exploratory work has also compared the developmental posting behaviors of violent and non-violent users on Stormfront.53 However, research in this space has yet to compare the posting patterns of Stormfront users who share extreme ideological beliefs but are violent or non-violent offline with those who post on Stormfront generally.
Data collection and sampling efforts proceeded in three stages. First, open source content on Stormfront Canada was captured using a custom-written computer program that was designed to collect vast amounts of information online.54 In total, the web-crawler extracted approximately 125,000 sub-forum posts made by approximately 7,000 authors between September 12, 2001 and October 29, 2017.55
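
To make this first stage concrete, the following is a minimal sketch of the general shape of a paginated forum crawl of the kind described above. It is not the custom crawler used in this study; the URL pattern, CSS selectors, page range, and field names are hypothetical placeholders for illustration only.

# Minimal sketch of a paginated forum crawl; the study's custom-written
# crawler is not public, so the URL pattern and selectors below are assumed.
import time
import requests
from bs4 import BeautifulSoup

BASE_URL = "https://forum.example.com/threads/12345/page-{page}"  # hypothetical

def crawl_thread(max_pages: int = 5) -> list[dict]:
    posts = []
    for page in range(1, max_pages + 1):
        response = requests.get(BASE_URL.format(page=page), timeout=30)
        if response.status_code != 200:
            break
        soup = BeautifulSoup(response.text, "html.parser")
        # Each post block is assumed to carry an author name and a message body.
        for block in soup.select("div.post"):            # hypothetical selector
            author = block.select_one(".author")          # hypothetical selector
            body = block.select_one(".message-body")      # hypothetical selector
            if author and body:
                posts.append({"author": author.get_text(strip=True),
                              "text": body.get_text(" ", strip=True)})
        time.sleep(1)  # be polite to the server between page requests
    return posts

if __name__ == "__main__":
    print(f"Collected {len(crawl_thread())} posts")
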
Second, to identify users in the sub-forum who were violent RWEs offline and those who were non-violent RWEs offline, a former violent extremist56 who was actively involved in the North American RWE movement and a prominent figure there for more than ten years – both in recruitment and leadership roles, primarily in violent racist skinhead groups – voluntarily reviewed a list of users who posted in the sub-forum and then selected those who matched one of the two user types.57 This identification process was done under my supervision, wherein each time the former identified a user of interest, they were asked to explain in as much detail as possible why the user was identified as a violent or non-violent RWE as well as their association with or connection to each identified user. This was done in an effort to verify the authenticity of each user identified by the former. A total of 49 violent and 50 non-violent RWEs were identified from a list of approximately 7,000 usernames. Each of these identified usernames represented a unique forum user and there was no evidence of extremists in the sample using multiple usernames.58 This was confirmed by the former, whose connections were strong enough to be able to link individuals to usernames. The online content for each user was then identified in the sub-forum data: 12,617 posts from the violent users and 17,659 posts from the non-violent users.59 This data included 30,276 posts, with the first post made on September 1, 2004, and the last post made on October 29, 2017.
RWEs who were identified for the current study were actively involved in right-wing extremism. In particular, they were those who – like all extremists – structure their beliefs on the basis that the success and survival of the in-group is inseparable from the negative acts of an out-group and, in turn, are willing to assume both an offensive and defensive stance in the name of the success and survival of the in-group.60 Right-wing extremism in the current study was thus characterized as a racially, ethnically, and/or sexually defined nationalism, typically framed in terms of white power and/or white identity (i.e., the in-group) and grounded in xenophobic and exclusionary understandings of the perceived threats posed by some combination of non-whites, Jews, Muslims, immigrants, refugees, members of the LGBTQ community, and feminists (i.e., the out-group(s)).61 However, violent RWEs who were identified for the current study were those who, according to the former extremist, committed several acts of known physical violence against persons, including violent attacks against minorities and anti-racist groups. Violence in this regard aligned closely with Bjørgo and Ravndal’s62 understanding of RWE violence, which they describe as “violent attacks whose target selection is based on extreme-right beliefs and corresponding enemy categories—immigrants, minorities, political opponents, or governments [...] [or] spontaneous violence.” Conversely, non-violent RWEs identified for the current study were those who were actively involved in RWE movements and activities offline, including – but not limited to – rallies, marches, protests, postering and flyering campaigns, and group meetings and gatherings, but did not engage in physical violence against persons in any capacity known to the former extremist who identified them as non-violent. I do, however, acknowledge that the former extremist may be more familiar with the histories of some of the identified individuals than others, which may bias the sample. The informant categorized individuals as violent when he personally witnessed the violent activities and/or had direct first-hand knowledge of them. I thus have strong confidence in the violent categorization. But I have somewhat less confidence in the non-violent categorization, since it was possible that an individual committed a violent act outside the informant’s presence and the informant never obtained first-hand knowledge of it.
Third, for comparison purposes, a sample of 1,000 postings was randomly retrieved from the content posted by each of the following groups: the violent group, the non-violent group, and the broader group of sub-forum posters (herein referred to as “the comparison group”) (n = 3,000 posts). Posts that were randomly sampled from the comparison group did not include posts made by those in the violent and non-violent groups. As a result, each user was unique to each sample group, meaning that to the best of my knowledge there were no overlapping users across groups.
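
As a rough illustration of this third stage, the sketch below draws 1,000 posts per group without replacement and excludes the identified violent and non-violent users from the comparison pool so that no author appears in more than one sample. The field names and data structures are assumptions for illustration, not the study's actual sampling pipeline.

# Illustrative sketch of the group-wise random sampling, assuming each post is
# a dict with "author" and "text" keys and the user lists are sets of usernames.
import random

def sample_groups(all_posts, violent_users, nonviolent_users, n=1000, seed=42):
    rng = random.Random(seed)  # fixed seed so the draw is reproducible
    violent_pool = [p for p in all_posts if p["author"] in violent_users]
    nonviolent_pool = [p for p in all_posts if p["author"] in nonviolent_users]
    # The comparison pool excludes any post written by an identified violent or
    # non-violent user, so the three samples never share authors.
    comparison_pool = [p for p in all_posts
                       if p["author"] not in violent_users | nonviolent_users]
    return {
        "violent": rng.sample(violent_pool, n),
        "non_violent": rng.sample(nonviolent_pool, n),
        "comparison": rng.sample(comparison_pool, n),
    }
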
Coding schema
The focus here was on coding for RWE ideologies, personal grievances, and violent mobilization efforts. To quantify the appearance of these online discussions for each of the three sample groups, content analysis techniques were used to code the data. One coder analyzed a series of posts from the sample of 3,000 messages, with the unit of analysis being the content of each post relative to the indicator categories. Importantly, each observation could receive multiple codes. For example, an online post that contained anti-immigrant and anti-Semitic language was coded for both. Note also that the base for each indicator category was 1,000 posts per group, with the maximum for anti-Semitic posts, for example, being 1,000/1,000.
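
To illustrate how a single post can receive multiple binary codes and how the per-group percentages reported below follow from the 1,000-post base, consider the following sketch. The indicator names echo the codebooks described below, but the keyword matching is only a toy stand-in for the manual content analysis, and the example counts are illustrative.

# Illustrative sketch of multi-label binary coding against a 1,000-post base;
# keyword matching here is a toy proxy for the study's manual coding.
from collections import Counter

IDEOLOGY_CODES = ["anti_immigrant", "anti_semitic", "anti_black", "anti_government"]

def code_post(text: str) -> dict[str, int]:
    """Return a 0/1 value for each indicator; a post may score 1 on several."""
    text = text.lower()
    return {
        "anti_immigrant": int("immigrant" in text or "border" in text),
        "anti_semitic": int("jew" in text or "zog" in text),
        "anti_black": int("black" in text),
        "anti_government": int("government" in text),
    }

def group_percentages(posts: list[str], base: int = 1000) -> dict[str, float]:
    counts = Counter()
    for post in posts:
        for code, value in code_post(post).items():
            counts[code] += value
    # e.g., 240 anti-Semitic posts out of a 1,000-post group sample -> 24.0%
    return {code: 100 * counts[code] / base for code in IDEOLOGY_CODES}
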
To ensure reliability in the coding, two additional coders received systematic training on the inclusion criteria for each code and then analyzed 10% of the sample independently.63 This resulted in an inter-rater reliability of α = 0.92, which is well above the generally accepted level of inter-rater agreement (i.e., α = 0.80).64
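
A reliability check of this kind can be reproduced with standard tools. The α notation above is consistent with Krippendorff's alpha; assuming that measure, the sketch below computes alpha for two coders on a binary (nominal) code using the open-source krippendorff Python package, with made-up ratings standing in for the study's actual reliability data.

# Sketch of an inter-rater reliability check with Krippendorff's alpha, using
# the open-source `krippendorff` package (pip install krippendorff). The two
# rating vectors below are fabricated placeholders, not the study's data.
import krippendorff

# Rows = coders; columns = the same 12 posts, each rated 0/1 on one binary code.
coder_1 = [1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0]
coder_2 = [1, 0, 0, 1, 1, 0, 1, 0, 1, 1, 1, 0]

alpha = krippendorff.alpha(reliability_data=[coder_1, coder_2],
                           level_of_measurement="nominal")
print(f"Krippendorff's alpha = {alpha:.2f}")  # values of 0.80 or above are typically treated as acceptable
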
Ideology posts were coded (0 = no, 1 = yes) based on the use of comments that detailed or referenced a known RWE ideology. The codebook for expressed ideologies was developed by drawing from the Southern Poverty Law Center’s (SPLC)65 identified RWE ideologies, which included anti-immigrant, anti-LGBTQ, anti-Muslim, anti-government, Christian Identity, anti-Semitic, male supremacy, and neo-folkish ideologies.66 In addition, anti-Black ideologies were included in the codebook, as research suggests that Black communities have been primary opponents of the RWE movement67 and anti-Black ideologies are widely discussed in RWE spaces online.68 Lastly, conspiratorial posts were included in the codebook, as research indicates that RWE ideologies are oftentimes steeped in conspiracy theories, which typically involve a grave threat to national sovereignty and/or personal liberty, among other concerns.69 The codebook for expressed ideologies included 10 binary codes.
Grievance posts were coded (0 = no, 1 = yes) based on the posters’ use of language that was expressive of their personal grievances. This codebook was derived from a codebook used by Gill and colleagues70 and Clemmow and colleagues71 to examine motivations and antecedent behaviors of lone-actor terrorists. The codebook for expressed personal grievances included 14 binary codes.
Violent extremist mobilization posts were coded (0 = no, 1 = yes) on the basis of language suggesting that posters were preparing to engage in extremist violence and/or made efforts to mobilize others to extremist violence. This codebook was derived from the FBI’s ‘Homegrown Violent Extremist Mobilization Indicators.’72 The codebook for violent extremist mobilization indicators included 23 binary codes. For more on the codebook, see the Appendix.
Results
The results are divided into three sections, one for each indicator category (i.e., ideologies, grievances, and violent extremist mobilization efforts), comparing the proportion of posts that fall into each category both within and across sample groups (i.e., the violent, non-violent, and comparison groups).73 Quotes from the data are presented where appropriate to demonstrate the tenor of postings. All online comments were quoted verbatim.
Expressed ideologies
While a large proportion of ideological posts were identified in the content posted by all three
sample groups, the non-violent group contained a much larger proportion of ideological content
(1,328 posts) than those observed in the violent group and comparison group (748 posts and 857
posts, respectively), which are described in Table 1.
Table 1. Presentation of ideological posts across sample group (non-violent / violent / comparison).
Anti-immigrant: 179 / 66 / 129
Anti-Semitic: 240 / 190 / 154
Anti-Black: 104 / 49 / 79
Anti-LGBTQ: 50 / 42 / 25
Anti-Muslim: 79 / 34 / 44
Anti-government: 344 / 135 / 211
Christian Identity: 17 / 9 / 13
Male supremacy: 20 / 6 / 9
Neo-folkish: 7 / 16 / 10
Conspiratorial: 288 / 201 / 183
Total: 1,328 / 748 / 857
Specifically, anti-Semitic discourse was among the most frequently observed across all three
sample groups, appearing in 24% (n = 240) of all posts in the non-violent group as well as in
19% (n = 190) in the violent group, and 15.40% (n = 154) in the comparison group. The
following are but two examples of the anti-Semitic posts that made up a large proportion of the
ideological content in the data:
The Jews have used a very sinister form of eugenics for thousands of years to
build an extraordinarily cunning race. You only have to think of Einstein to
understand how powerful this cunning is. […] They suck a country dry of all
opportunities. Every country in history has found it necessary to take drastic
action against them. (Non-violent group post)
You are right on that one. The one universal thing about the Jew, he is despised by
everyone! (Comparison group post)
Additionally, anti-Black sentiment made up a similar proportion of posts targeting adversary groups in the non-violent group and comparison group (10.40% and 7.90%, respectively), both higher than the anti-Black sentiment expressed by users in the violent group (4.90%). Anti-Muslim sentiment, on the other hand, was slightly more prevalent in the non-violent group (7.90%) compared to the violent group (3.40%) and comparison group (4.40%), while anti-LGBTQ sentiment was similarly moderate in the non-violent and violent groups (5.00% and 4.20%, respectively) compared to users in the comparison group (2.50%). However, anti-immigrant sentiment – which is expressed below – was far more common in the non-violent group and comparison group (17.90% and 12.90%, respectively) than in the violent group (6.60%).
I want the borders shut to third world people. […] It wouldn’t break my heart if
the existing strain on us was removed by removing existing third worlders from
our midst. Let’s face it, we have a problem with immigration. Let’s worry about
and take care of our own first. [...] We have to start thinking and insisting one
what’s good for us -- the European founder/settler people of this country. (Non-
violent group post)
Christian Identity posts made up a relatively small proportion of the ideological posts across the three sample groups, with 1.70% (n = 17) of all posts in the non-violent group, 0.90% (n = 9) of the posts in the violent group, and 1.30% (n = 13) of the comparison group posts containing Christian Identity language. Similarly, there was a relatively small proportion of posts related to male supremacy in the non-violent, violent, and comparison groups (2.00%, 0.60%, and 0.90%, respectively), with this content reflecting the following sentiment: “Feminists most often want special privileges and promote inequality rather than equal rights” (comparison group post). Neo-folkish posts were also infrequent in all three groups, but a slightly higher proportion of these posts was found in the violent group (16 posts, 1.60%) compared to the non-violent group (7 posts, 0.70%) and comparison group (10 posts, 1.00%). As but one example of this discourse:
At this time of year, we should all reflect on the things that have brought us
together as families and as a people, in an unbroken chain stretching back beyond
the Christian era, to our pre-Christian Pagan roots. In spite of efforts to high
jack this special time of year on the part of Interlopers, this is none the less OUR
season -- OUR holiday. So lets fortify ourselves for the challenges of the year
ahead by enjoying the good cheer of family and friends. (Violent group post)
Notably, anti-government posts were also among the most frequently observed ideological posts across all three samples, particularly in the non-violent group (344 posts, 34.00%) and comparison group (211 posts, 21.10%) and, to a lesser extent, in the violent group (135 posts, 13.50%). The following are two examples that summarize these discussions:
Our government has no Moral views! this crap is one of the reasons why family values is
falling apart. (Violent group post)
The problem is the Government does not care about this, in fact it helps them. More
crime means more police needed. More People despite race means more of a tax base and
more money for them. They like money, and care about nothing else. It does not matter if
most of these people end up on welfare, the rest will work for slave wages at their
corporate buddies businesses meaning more campaign contributions. It is ONLY about
$$$ to these people. This is why I advocate attacking that money supply. Stop paying
Income Taxes, get out of the system, cancel bank accounts, stop buying newspapers, stop
shopping at the big box stores and shop locally where you know the owner. Anything
else will simply hasten the downfall of all of us, and you will pay for your own
destruction. (Comparison group post)
Lastly, conspiratorial posts were among the most frequently observed in all three sample
groups, appearing in 28.80% (n = 288) of all posts in the non-violent group as well as in 20.10%
(n = 201) of the violent group and 18.30% (n = 183) of the comparison group. Much of this
sentiment was cemented in anti-Semitic discourse, wherein Jews were labeled as “the source of
all evil”, “the spawn of the Devil himself”, conspiring to extinguish the white race and breeding
them out of existence – through “Jew-controlled” government, financial institutions, and media
(i.e., Zionist Occupation Government (ZOG) conspiracy). As one poster described it:
‘Zionism’ is Jewish Nationalism. In principle, there’s no reason we should complain that
another group wants to preserve itself just as we do. The difference is that while we’re
nationalists, seekers after an independant nation, we are being suppressed by
Supremacists - those who do not merely seek their own nation and independance, but
who additionally seek to dominate others. That’s why I usually say JOG instead of ZOG
– it’s the JS, not the Jewish nationalists, who are the problem. A purist Zionist would
simply pack up and head for Israel. A JS will stay here and strive to maintain power over
us and everyone else. (Violent group post)
Expressed grievances
A relatively small proportion of personal grievance posts were observed in all three sample
groups (see Table 2), with fewer grievance posts observed in the violent group (48 posts)
compared to those in the non-violent group (87 posts) and comparison group (108 posts).
Table 2. Presentation of grievance posts across sample group (non-violent / violent / comparison).
Loss of employment: 5 / 2 / 6
Work performance: 2 / 0 / 1
Intimate relationships: 1 / 0 / 3
Personal relationships: 0 / 0 / 0
Death in the family: 1 / 0 / 1
Death of a friend: 0 / 2 / 1
Health: 0 / 0 / 0
Educational system: 15 / 7 / 16
Criminal justice system: 18 / 13 / 7
Target of an act of prejudice or unfairness: 37 / 19 / 65
Victim of verbal assault: 0 / 2 / 1
Victim of physical assault: 3 / 3 / 4
Victim of sexual assault: 0 / 0 / 0
Socially isolated: 5 / 0 / 3
Total: 87 / 48 / 108
Furthermore, the types of personal grievance posts observed differed across groups. Users in the violent group, for example, expressed a narrower range of grievances compared to the observed grievance posts in the non-violent group and comparison group. There was also no mention of seven of the personal grievance categories in the violent group (i.e., work performance, intimate relationships, personal relationships, death in the family, health, victim of sexual assault, and social isolation). Instead, for a relatively large proportion of the grievance categories, there were just a few posts containing a grievance in the violent group, with grievances in this regard including loss of employment, death of a friend, being a victim of verbal assault, and being a victim of physical assault.
A relatively small proportion of personal grievance posts about the educational system was also identified in the violent group (7 posts, 0.70%), reflected in the following post: “I recall a speech given at my middle school, where a rabbi uttered something to the similar effect of: Judaism is pro-humanity. If you’re against Judaism, you’re against humanity. […] It’s obscene, it symbolizes the genocide and domination that Judaea has exerted upon the rest of the world…” Lastly, grievance posts about the criminal justice system as well as being the target of an act of prejudice or unfairness were the most frequently observed in the violent group, with the former appearing in 1.30% (n = 13) of all posts and the latter appearing in 1.90% (n = 19) of all posts in the violent group. The following are two examples that summarize each of these discussions:
Be careful because they’ll try and get you for anything. I’m still under
investigation [by the police] since they STOLE all my stuff back in May…
(Violent group post)
…it was the ARA [Anti-Racist Action] that attacked my site, they defaced it and
called me names and stuff, i guess they don’t like my views... […] weird how my
email and stormfront info and accounts all got taken by someone. (Violent group
post)
In contrast, not only were more personal grievance posts identified in the non-violent group (87 posts) and comparison group (108 posts) than in the violent group (48 posts), but there was also a wider range of grievance posts in the non-violent group and comparison group. To illustrate, among the most frequently observed grievance posts in the non-violent group and comparison group were those related to being the target of an act of prejudice (37 posts and 65 posts, respectively) as well as the education system (15 posts and 16 posts, respectively), all of which were observed at higher rates than in the violent group. Grievance posts about being the target of an act of prejudice in the non-violent group and comparison group ranged from being denied employment opportunities to being targeted by anti-racists on- and offline. As one user explained it:
I am job hunting right now and I had a couple of places have on their
questionnaires asking if I was ethnic […] Which thus means I’m still job hunting.
(Comparison group post)
Personal grievance posts in the non-violent group and comparison group about the educational
system oftentimes involved anti-Semitic conspiracy theories, which is reflected in the following
discourse:
It is blatant racism to only teach the official Jewish holocaust story in our schools. If the
education system feels this type of thing is necessary to teach, the Ukrainian (10 million
killed) and Armenian (2 million?) holocausts should be taught in detail. Maybe the
Bolshevic Revolution as well. How can they teach that multiculturalism is great, when
history suggests the opposite? (Comparison group post)
It is also interesting that personal grievance posts about the criminal justice system (n = 18, 1.80%) were mentioned with some frequency in the non-violent group, a proportion similar to that expressed in the violent group (1.30%) and higher than that in the comparison group (0.70%). Here, discussions about the criminal justice system among the non-violent users tended to reflect the following sentiment:
Please take a look at the link below concerning the […] case against me. […] We
should be able to speak up as our ancestors did […]. We shouldn’t have to hide
behind screen names. […] Sadly, with live in a tin-pot dictatorship where freedom
of speech is seen as a serious threat to the powers that be and to the minority
masters who pull the strings. […] I’ve read court statements by rabid police
officers and opinions by judges cheerfully seeking to suppress criticism of
privileged minorities lest it cause strife and problems or hurt feelings. (Non-
violent group post)
Violent extremist mobilization efforts
Table 3 presents the violent mobilization indicators by sample group and shows interesting variations in the types of mobilization efforts observed both within and across groups. Just as the numbers of grievance and ideological posts captured in the non-violent and comparison groups were greater than those in the violent group, more mobilization posts were observed in the former two groups (277 posts and 189 posts, respectively) than in the latter group (157 posts). Interestingly, though, slightly more mobilization indicators were observed in the violent group than in the non-violent group and comparison group: 11 of 23 mobilization indicators were identified in the postings from the violent group (15.70%) compared to 9 of 23 indicators in the non-violent (27.70%) and comparison group posts (19.90%).
Nonetheless, the volume of these mobilization posts was greater in the non-violent group than in the violent group and comparison group, and so too was the number of posts in each mobilization category for the non-violent group. Specifically, the number of posts in the non-violent group about a specific mobilization effort was greater in 8 of the 23 (34.78%) categories compared to the violent group and was also greater in 6 of the 23 (26.09%) categories compared to the comparison group. For example, 46 of the violent group posts included efforts to radicalize others and push them to action, compared to 91 posts in the comparison group and 150 posts in the non-violent group. Postings in this regard reflected the following sentiment:
We desperately need to get something going soon. Bitching and complaining is not
getting us anywhere. The future for White youth is quite grim […] and we are runnning
out of supportive votes daily. Let’s get the ball rolling, before it’s too late.... That is
exactly the point. Let’s get off our comfy chairs and do something. […] Let us rise to
leadership and do it. (Non-violent group post)
In addition, just nine of the posts in the violent group were seeking to recruit others to mobilize
compared to 20 posts in the comparison group and 49 posts in the non-violent group. Such
mobilizing efforts were best captured in the following post:
Caucasians in this country are becoming concerened about immigration and our
way of life. I’m hearing that from people more and more. I think we have the
support, but how do we get the ball rolling? Spread the word on chat forums.
[…] Contact local WNs [White Nationalists] - arrange to meet, maybe have a
drink or two, distribute as many [racist] flyers as you can, celebrate the hard work
at the end of the night with a drink or two. It’s safe; it’s legal; it’s fun; it lets you
go to sleep knowing you’ve help make the world a better place. (Non-violent
group post)
Similarly, far fewer mobilization posts that advocated and encouraged violence were observed in
the violent group and comparison group (10 posts and 14 posts, respectively) compared to those
identified in the non-violent group (28 posts).
Table 3. Presentation of violent extremist mobilization posts across sample group (non-violent / violent / comparison).
End of life preparations: 0 / 0 / 0
Seeking help to travel abroad: 0 / 0 / 0
Planning a trip abroad: 0 / 0 / 0
Seeking permission to engage in violence: 0 / 0 / 0
Seeking to recruit others to mobilize: 49 / 9 / 20
Asking to purchase/how to obtain illegal material: 0 / 0 / 0
Includes terrorist icons/flags/prominent figures/symbols/slogans: 21 / 61 / 26
Expressing goodbyes: 0 / 0 / 0
Acceptance of violence as a necessary means to achieve ideological goals: 0 / 3 / 0
Attempting to radicalize others/pushing others to action: 150 / 46 / 91
Involved in a group that promotes violence to rectify grievances: 0 / 12 / 3
Provides virtual simulations of an attack/assault: 0 / 0 / 0
Discusses behavioral change: 0 / 1 / 0
Linguistic expressions that reflect new sense of purpose: 0 / 0 / 0
Advocates/encourages violence: 28 / 10 / 14
Asks for information about specific targets: 0 / 0 / 0
Asks for technical expertise: 0 / 3 / 0
Contains a violent, ideologically motivated outburst: 4 / 3 / 16
Blames external factors for failure in school, career, or relationships: 1 / 0 / 0
Display an unstable mental state (e.g., mental health, derailing): 0 / 0 / 3
Discusses operational security and asks about ways to evade law enforcement: 4 / 0 / 7
Praises past successful/attempted attacks: 3 / 1 / 0
Inappropriate use of what an individual perceives as doctrine to manipulate others: 17 / 8 / 9
Total: 277 / 157 / 189
It is also worth noting the most frequently observed mobilization indicator in each of the three sample groups. The top mobilization indicator in both the non-violent group and comparison group was efforts to radicalize others and push them to action (150 posts and 91 posts, respectively). Interestingly, mobilization posts that attempted to radicalize others and push them to action were substantially more common in the non-violent group and, to a lesser extent, in the comparison group than in the violent group. As an example of this discourse:
I sympathize with you. However, it is time to stop complaining. Join a resistance
movement or if one does not exist form one. Use your imagination and be innovative
when it comes to resisting these invaders as your imagination is often only the starting
point.... (Comparison group post)
The other top mobilization indicators observed in the non-violent group included seeking to
recruit others to mobilize (49 posts, 4.90%) and advocating/encouraging violence (28 posts,
2.80%). By comparison, top mobilization indicators in the comparison group included the use of
terrorist icons/flags/prominent figures/symbols/slogans in the postings (26 posts, 2.60%) and
seeking to recruit others to mobilize (20 posts, 2.0%).
On the other hand, the top mobilization indicator observed in the violent group was the
use of terrorist icons/flags/prominent figures/symbols/slogans in the postings (61 posts, 6.10%),
especially popular RWE slogans. The following are a few examples that best capture the sentiment expressed here:
This is what happens when you let jews run the country , but remember , the jew will
never let racism end. they need it. the jew feeds off of hate. if theres no hate the jews
have no power! with hate , they keep the power! 88 [Heil Hitler] (Violent group post)
We must become active because bitching on the pc [personal computer] will not get us
anywhere. 14 words [a reference to a popular RWE slogan: “We must secure the
existence of our people and a future for white children.”] (Violent group post)
Drug addicts and un-educated Leftists who really dont know their asses from holes in the
ground! R.O.A. [race over all] (Violent group post)
Lastly, the other top mobilization indicators observed in the violent group included efforts to
radicalize others and push them to action (46 posts, 4.60%) and discussions of involvement in a
group that promotes violence to rectify grievances (12 posts, 1.20%).
Discussion
This study quantified the existence of extremist ideologies, personal grievances, and violent
extremist mobilization efforts expressed by a sample of violent and non-violent RWEs as well as
a sample of postings within the open-access sections of a sub-forum of Stormfront (n = 3,000).
Several conclusions can be drawn from this comparative analysis, all of which may lead to
methods of identifying credible threats online (i.e., potentially violent individuals).
First, a large proportion of ideological posts targeting the out-group were observed in the violent, non-violent, and comparison groups, which comes as little surprise, given that previous research has found that Stormfront74 – and indeed other platforms used by the extreme right in general75 – contain a sizable amount of explicit and overt white supremacist activity targeting the in-group’s perceived adversaries. The results of the current study also suggest that anti-Semitic conspiracy theories were among the most frequently observed ideological discourse across all three sample groups, which aligns with empirical research suggesting that anti-Semitic conspiracy discussions are rooted in RWE ideologies76 and in much of the RWE rhetoric expressed online, including in RWE discussion forums,77 social media sites,78 and fringe platforms.79 Interestingly, though, anti-government posts were among the most frequently observed ideological posts across all sample groups – a finding that contrasts with recent work which found that anti-government sentiment makes up a very small proportion of postings found on several RWE discussion forums.80 However, this finding aligns with other empirical research which suggests that RWEs oftentimes endorse anti-government sentiment and that it contributes to their underlying belief system.81 But perhaps most notable is that the non-violent group contained a much larger proportion of ideological posts than the violent group and, to a lesser extent, the comparison group. Such a finding mirrors empirical research which similarly found that non-violent RWEs tend to be more active online than their violent counterparts in general.82 It may be the case that those in the non-violent group perceive their role in the RWE movement, and by extension engage with it, as ideologues, thus providing “conceptual tools” that can be taken up by others involved in RWE violence, as has been reported in empirical research.83 Regardless, this evidence base remains in its infancy and requires further exploration.
Second, while few personal grievance posts were observed in the sample groups compared to ideological posts, a larger proportion of personal grievances were observed in the non-violent and comparison groups than in the violent group. There was also some variation in the scope of the personal grievances expressed across sample groups, with more grievance types identified in the non-violent and comparison groups than in the violent group. Yet the most prominent personal grievances observed across the three groups were similar: (1) being the target of an act of prejudice, (2) the criminal justice system, and (3) the educational system. Most importantly, there was an especially small proportion of personal grievances observed in all three groups. This is a noteworthy finding because it suggests that RWE posters, whether violent or non-violent, tend not to express their personal grievances in generic RWE forums (such as Stormfront), as has been found to be the case in similar RWE forums.84 This finding does, however, align with research which similarly found a small proportion of personal grievance posts on violent RWE forums.85 Perhaps it is the case that online communities of extremist movements in general – and especially those that are open access and can be publicly viewed – are not spaces to air personal grievances, perhaps out of fear of being monitored by law enforcement and anti-racists, but instead serve as communities to express ideologically based, in-group grievances about wrongs (e.g., ZOG conspiracies) attributed to perceived out-groups in general – as has been documented in previous research.86 Interestingly, though, notable exceptions in this regard include personal grievances about treatment by the criminal justice system and the educational system. Perhaps some of the users in the current study who posted about this content believe that they are already being monitored by “the state” (i.e., law enforcement and the educational system, among others) and in turn see it as their duty to warn other RWE adherents as a result of their negative experiences with the system. Nonetheless, the extent to which personal grievances are expressed in RWE online spaces requires further exploration.
Third, and perhaps most notable, was the extent to which violent extremist mobilization efforts were observed in all three sample groups in general and in the non-violent group in particular. Although less frequent than ideological posts, much of the sentiment observed in the data, especially in the non-violent group, suggested that posters were preparing to engage in extremist violence or were making efforts to mobilize others to extremist violence. Interestingly, attempting to radicalize others and pushing others to action was the top mobilization indicator in both the non-violent and comparison groups, while the use of terrorist icons/flags/prominent figures/symbols/slogans was the top indicator for the violent group. These are indicators that law enforcement and intelligence agencies should look into further as they examine the posting patterns of users in extremist online communities – especially in spaces that contain such a high frequency of ideological posts, as was the case in the current study.
Together, the results of the current study suggest that the non-violent group, not the violent group, contains the more potentially threatening posting patterns that law enforcement and intelligence agencies may – on the surface at least – deem worthy of further investigation. Yet from a policy perspective, the results of the study suggest that analysts who are searching for signs of violent extremists online should perhaps be less concerned about investigating users who post messages that include numerous extremist indicators and instead be more concerned about those who post messages with fewer indicators. This recommendation is supported by recent work on the online behaviors of RWEs which found that those who are actively involved in violent RWE activities offline tend to be concerned that law enforcement officials and anti-racist groups are monitoring their online activities and may modify their posting activities to avoid detection.87 Empirical research similarly suggests that violent members of RWE movements are largely clandestine, often paranoid because of the violence they engage in, and for this reason are concerned about revealing their identities.88 With this in mind, it may be the case that the violent RWEs in the current study were concerned that, by posting in an online space that can be publicly viewed, they were putting themselves in a vulnerable position and could become the subject of an investigation by anti-hate watch organizations or even law enforcement. This most likely had an impact on the content that they posted on the site and on the results of the study in general. Nonetheless, in identifying violent extremists online based on their posting patterns, the results of the current study suggest that a useful starting place is to scrutinize the online activities of those who tend to incorporate terrorist icons/flags/prominent figures/symbols/slogans into their postings, particularly those that include popular RWE slogans (e.g., 14/88, HH). Research has similarly found a link between violent extremism and RWE slogans in the offline world, wherein it is common for such markers to be embedded in the practices of violent RWE groups.89 But little is empirically known about this link in the online world, which certainly requires further exploration.
Limitations and future research
This study offers a first step in quantifying the presence of extremist ideologies, personal
grievances, and violent mobilization efforts found within a sample of online postings from
violent RWEs, non-violent RWEs, and a comparison group, all derived from one RWE
community. Nonetheless, there are several limitations that may inform future research.
In addition to the relatively small sample of violent and non-violent RWE users who were known to one former RWE, paired with the validity of these data resting on one key former RWE informant,90 five additional points are worth discussing. First, although an array of indicators of extremism – including ideological indicators – were consistently captured across sample groups, the study did not include an assessment of users’ self-declared ideologies. Instead, the extent to which users discussed extremist ideologies on the RWE platform was quantified, rather than whether they subscribed to a particular RWE ideology. Having said that, the extent to which forum users in the current study adhered to a particular extremist ideology, versus merely discussed one, remains unclear. Future research is therefore needed to unpack which forum users identify with specific extremist ideologies. Research should also assess the online content that violent and non-violent extremists post when they are not discussing ideologies, personal grievances, and mobilization efforts, as doing so will provide additional insight into posting differentials (or similarities) across groups.
A second limitation of this study pertains to the sample. The analysis was limited to the content posted in just one RWE sub-forum. Although the sample was sizable, future research should quantify extremist indicators across a range of platform types. This could include a comparison of violent RWE forums with generic (non-violent) RWE forums, or a comparison with mainstream social media sites, fringe platforms, and digital applications. Such an analysis would not only provide practitioners and policymakers with much needed insight into the extent to which indicators of extremism are unique to specific sample groups (i.e., violent versus non-violent), but also into whether such indicators span online spaces that facilitate extremism more generally, or whether certain platforms have unique functions for facilitating extremism.
Third, future research would benefit from the inclusion of a temporal component, wherein temporal analyses may be able to effectively capture an escalation in expressed extremist ideologies, grievances, and mobilization efforts both within and across sample groups. This in turn may provide law enforcement and intelligence agencies with insight into the evolution of extremist indicators online and whether specific patterns in posting activity warrant future investigation. This may also put practitioners and policymakers in a better position to identify online discussions or user networks that are credible threats (i.e., those who engage in violence offline), thus informing future risk factor frameworks. Officials may also gain valuable intelligence, informed by online trends, that puts them on higher alert to particular threats and allows them to intervene where necessary.
Fourth, given that the validity of the data used in the current study rests on one key former RWE informant, future research should further verify the authenticity of each RWE user by identifying the birth and given name of each individual in the samples. These names could then be triangulated with open-source intelligence (e.g., media reports, court documents, terrorism databases, and social media accounts on each user) or with the Extremist Crime Database (ECDB), which currently includes over 500 data points on nearly 1,000 violent and 1,500 non-violent RWEs.91
Lastly, there remain many unanswered questions about the characteristics of those identified as violent or non-violent RWEs in the current study. Unlike the growing body of research that has drawn distinctions between the offline activities and behaviors of violent and non-violent extremists,92 the data for the current study do not include information on key characteristics identified in this literature such as an individual’s employment status, criminal record, history of mental illness, extremist/radicalized peers, and types of grievances, among many others. Future research is therefore needed to assess whether these characteristics (and others) mirror those of the current study’s sample of violent and non-violent extremists, as well as whether certain characteristics drive differential posting patterns. This could be done in a manner similar to the sampling procedure used for the current study, wherein a former extremist would identify users who fit into the violent and non-violent extremist categories, but sub-categories could be built out to capture the abovementioned key characteristics of each identified extremist as well as to develop a scheme to further identify differences in posting groups, such as their violent and non-violent criminal behaviors (e.g., threats of violence, physical damage to property, and so on). Future research should also pinpoint exact moments in time when individuals engaged in violence offline as well as identify, among other things, whether the attack was planned or unplanned, the victims or targets and the motive of the attack, and then assess users’ posting behaviors both before and after the act of violence. Doing so may shed light on whether specific patterns in the online world escalate to violent actions in the offline world as well as assist researchers and policymakers in developing methods to identify potentially violent individuals.
Notes
1. See Maura Conway, “Determining the Role of the Internet in Violent Extremism and
Terrorism: Six Suggestions for Progressing Research,” Studies in Conflict & Terrorism 40, no. 1
(2017): 77-98.
2. See Ryan Scrivens, Paul Gill, and Maura Conway, “The Role of the Internet in Facilitating
Violent Extremism and Terrorism: Suggestions for Progressing Research,” in Thomas J. Holt
and Adam Bossler, Eds., The Palgrave Handbook of International Cybercrime and
Cyberdeviance (London, UK: Palgrave, 2020), pp. 1-22.
3. Joel Brynielsson, Andreas Horndahl, Fredik Johansson, Lisa Kaati, Christian Mårtenson, and
Pontus Svenson, “Analysis of Weak Signals for Detecting Lone Wolf Terrorists,” Security
Informatics 2, no. 11 (2013): 1-15; Katie Cohen, Fredik Johansson, Lisa Kaati, and Jonas C.
Mork, “Detecting Linguistic Markers for Radical Violence in Social Media,” Terrorism and
Political Violence 26, no. 1 (2014): 246-256; Lisa Kaati, Amendra Shrestha, and Katie Cohen,
“Linguistic Analysis of Lone Offender Manifestos,” Proceedings of the 2016 IEEE International
Conference on Cybercrime and Computer Forensics, Vancouver, BC, Canada.
4
Paul Gill and Emily Corner, “Lone-Actor Terrorist Use of the Internet and Behavioural
Correlates,” in Lee Jarvis, Stuart Macdonald, and Thomas M. Chen, Eds., Terrorism Online:
Politics, Law, Technology and Unconventional Violence (London, UK: Routledge, 2015), pp.
35-53.
5
Ibid.
6
Paul Gill, Emily Corner, Maura Conway, Amy Thornton, Mia Bloom, and John Horgan,
“Terrorist Use of the Internet by the Numbers: Quantifying Behaviors, Patterns, and Processes,”
Criminology and Public Policy 16, no. 1 (2017): 99-117.
7
Donald Holbrook and Max Taylor, “Terrorism as Process Narratives: A Study of Pre-Arrest
Media Usage and the Emergence of Pathways to Engagement,” Terrorism and Political Violence
31, no. 6 (2019): 1307-1326.
8
Tiana Gaudette, Ryan Scrivens, and Vivek Venkatesh, “The Role of the Internet in Facilitating
Violent Extremism: Insights from Former Right-Wing Extremists,” Terrorism and Political
Violence. Ahead of Print, 1-18.
9
Ryan Scrivens, Thomas W. Wojciechowski, Joshua D. Freilich, Steven M. Chermak, and
Richard Frank, “Comparing the Online Posting Behaviors of Violent and Non-Violent Right-
Wing Extremists,” Terrorism and Political Violence. Ahead of print, 1-19.
10
Maura Conway, Ryan Scrivens and Logan Macnair, “Right-Wing Extremists’ Persistent
Online Presence: History and Contemporary Trends,” The International Centre for Counter-
Terrorism – The Hague 10 (2019): 1-24; Thomas J. Holt, Joshua D. Freilich, and Steven M.
Chermak, “Examining the Online Expression of Ideology Among Far-Right Extremist Forum
Users,” Terrorism and Political Violence. Ahead of Print, 1-21.
11
See, for example, Southern Poverty Law Center, “White Homicide Worldwide,” 1 April 2014.
Available at: https://www.splcenter.org/20140331/white-homicide-worldwide (accessed 29
January 2022).
12
See Conway et al., “Right-Wing Extremists’ Persistent Online Presence.”
13
Les Back, “Aryans Reading Adorno: Cyber-Culture and Twenty-First Century Racism,”
Ethnic and Racial Studies 25, no. 4 (2002): 628-651; Ana-Maria Bliuc, John Betts, Matteo
Vergani, Muhammad Iqbal, and Kevin Dunn, “Collective Identity Changes in Far-Right Online
Communities: The Role of Offline Intergroup Conflict,” New Media and Society 21, no. 8
(2019): 1770–1786; Val Burris, Emery Smith, and Ann Strahm, “White Supremacist
Networks on the Internet,” Sociological Focus 33, no. 2 (2000): 215-235; Willem De Koster and
Dick Houtman, “‘Stormfront is Like a Second Home to Me,’” Information, Communication and
Society 11, no. 8 (2008): 1155-1176; Robert Futrell and Pete Simi, “Free Spaces, Collective
Identity, and the Persistence of U.S. White Power Activism,” Social Problems 51, no. 1 (2004):
16-42; Holt et al., “Examining the Online Expression of Ideology Among Far-Right Extremist
Forum Users”; Ryan Scrivens, Garth Davies, and Richard Frank, “Measuring the Evolution of
Radical Right-Wing Posting Behaviors Online,” Deviant Behavior 41, no. 2 (2020): 216-232;
Ryan Scrivens, “Exploring Radical Right-Wing Posting Behaviors Online,” Deviant Behavior.
Ahead of Print, 1-15; Magdalena Wojcieszak, “‘Don’t Talk to Me’: Effects of Ideological
Homogenous Online Groups and Politically Dissimilar Offline Ties on Extremism,” New Media
and Society 12, no. 4 (2010): 637-655.
14
Mattias Ekman, “Anti-Refugee Mobilization in Social Media: The Case of Soldiers of Odin,”
Social Media + Society 4, no. 1 (2018): 1-11; Jade Hutchinson, Amarnath Amarasingam, Ryan
Scrivens, and Brian Ballsun-Stanton, “Mobilizing Extremism Online: Comparing Australian and
Canadian Right-Wing Extremist Groups on Facebook,” Behavioral Sciences of Terrorism and
Political Aggression. Ahead of Print, 1-31; Lella Nouri and Nuria Lorenzo-Dus, “Investigating
Reclaim Australia and Britain First’s Use of Social Media: Developing a New Model of
Imagined Political Communities Online,” Journal for Deradicalization 18 (2019): 1-37; Ryan
Scrivens and Amarnath Amarasingam, “Haters Gonna “Like”: Exploring Canadian Far-Right
Extremism on Facebook,” in Digital Extremisms: Readings in Violence, Radicalisation and
Extremism in the Online Space, ed. Mark Littler and Benjamin Lee (London: Palgrave 2020),
63–89; Sebastian Stier, Lisa Posch, Arnim Bleier, and Markus Strohmaier, “When Populists
Become Popular: Comparing Facebook Use by the Right-Wing Movement Pegida and German
Political Parties,” Information, Communication and Society 20, no. 9 (2017): 1365-1388.
15
J. M. Berger, Nazis vs. ISIS on Twitter: A Comparative Study of White Nationalist and ISIS
Online Social Media Networks (Washington, DC: The George Washington University Program
on Extremism, 2016); J. M. Berger and Bill Strathearn, Who Matters Online: Measuring
Influence, Evaluating Content and Countering Violent Extremism in Online Social Networks
(London, UK: The International Centre for the Study of Radicalisation and Political Violence,
2013); Pete Burnap and Matthew L. Williams, “Cyber Hate Speech on Twitter: An Application
of Machine Classification and Statistical Modeling for Policy and Decision,” Policy and Internet
7, no. 2 (2015): 223-242; Roderick Graham, “Inter-Ideological Mingling: White Extremist
Ideology Entering the Mainstream on Twitter,” Sociological Spectrum 36, no. 1 (2016): 24-36.
16
Mattias Ekman, “The Dark Side of Online Activism: Swedish Right-Wing Extremist Video
Activism on YouTube,” MedieKultur: Journal of Media and Communication Research 30, no. 56
(2014): 21-34; Derek O’Callaghan, Derek Greene, Maura Conway, Joe Carthy, and Pádraig
Cunningham, “Down the (White) Rabbit Hole: The Extreme Right and Online Recommender
Systems,” Social Science Computer Review 33, no. 4 (2014): 1-20.
17
Joel Finkelstein, Savvas Zannettou, Barry Bradlyn, and Jeremy Blackburn, “A Quantitative
Approach to Understanding Online Antisemitism,” arXiv:1809.01644, 2018; Antonis Papasavva,
Savvas Zannettou, Emiliano De Cristofaro, Gianluca Stringhini, and Jeremy Blackburn, “Raiders
of the Lost Kek: 3.5 Years of Augmented 4chan Posts from the Politically Incorrect Board,”
arXiv:2001.07487, 2020.
18
Savvas Zannettou, Barry Bradlyn, Emiliano De Cristofaro, Haewoon Kwak, Michael
Sirivianos, Gianluca Stringhini, and Jeremy Blackburn, “What is Gab: A Bastion of Free Speech
or an Alt-Right Echo Chamber,” Proceedings of the WWW ’18: Companion Proceedings of The
Web Conference 2018, Lyon, France; Yuchen Zhou, Mark Dredze, David A. Broniatowski, and
William D. Adler, “Elites and Foreign Actors Among the Alt-Right: The Gab Social Media
Platform,” First Monday 24, no. 9 (2019).
19
Gabriel Weimann and Natalie Masri, “Research Note: Spreading Hate on TikTok,” Studies in
Conflict & Terrorism. Ahead of Print, 1-14.
20
Jakob Guhl and Jacob Davey, A Safe Space to Hate: White Supremacist Mobilisation on
Telegram (London, UK: Institute for Strategic Dialogue, 2020); Aleksandra Urman and Stefan
Katz, “What they do in the Shadows: Examining the Far-right Networks on Telegram”,
Information, Communication & Society. Ahead of Print, 1-20.
21
Michael H. Becker, “When Extremists Become Violent: Examining the Association
Between Social Control, Social Learning, and Engagement in Violent Extremism,” Studies in
Conflict & Terrorism. Ahead of Print, 1-21; Steven Chermak, Joshua Freilich, and Michael
Suttmoeller, “The Organizational Dynamics of Far-Right Hate Groups in the United States:
Comparing Violent to Nonviolent Organizations,” Studies in Conflict & Terrorism 36, no. 3
(2013): 193-218; Joshua D. Freilich and Gary LaFree, “Criminology Theory and Terrorism:
Introduction to the Special Issue,” Terrorism and Political Violence 27, no. 1 (2015): 1-15;
Joshua D. Freilich, Steven M. Chermak, and Jeff Gruenewald, “The Future of Terrorism
Research: A Review Essay,” International Journal of Comparative and Applied Criminal Justice
39, no. 4 (2015): 353-369; John Horgan, Neil Shortland, Suzzette Abbasciano, and Shaun Walsh,
“Actions Speak Louder Than Words: A Behavioral Analysis of 183 Individuals Convicted for
Terrorist Offenses in the United States from 1995 to 2012,” Journal of Forensic Sciences 61, no.
5 (2016): 1228-1237; Katarzyna Jasko, Gary LaFree, and Arie Kruglanski, “Quest for
Significance and Violent Extremism: The Case of Domestic Radicalization,” Political
Psychology 38, no. 5 (2017): 815-831; Sarah Knight, David Keatley, and Katie Woodward,
“Comparing the Different Behavioral Outcomes of Extremism: A Comparison of Violent and
Non-Violent Extremists, Acting Alone or as Part of a Group,” Studies in Conflict and Terrorism.
Ahead of print, 1-22; Gary LaFree, Michael A. Jensen, Patrick A. James, and Aaron Safer-
Lichtenstein, “Correlates of Violent Political Extremism in the United States,” Criminology 56,
no. 2 (2018): 233-268.
22
Scrivens et al., “Comparing the Online Posting Behaviors of Violent and Non-Violent Right-
Wing Extremists.”
23
Thomas J. Holt, Joshua D. Freilich, Steven M. Chermak, and Gary LaFree, “Examining the
Utility of Social Control and Social learning in the Radicalization of Violent and Nonviolent
Extremists,” Dynamics of Asymmetric Conflict 11, no. 3 (2018): 125-148.
24
Michael Wolfowicz, Simon Perry, Badi Hasisi, and David Weisburd, “Faces of Radicalism:
Differentiating Between Violent and Non-Violent Radicals by Their Social Media Profiles,”
Computers in Human Behavior. Ahead of Print, 1-10.
25
Scrivens et al., “Comparing the Online Posting Behaviors of Violent and Non-Violent Right-
Wing Extremists.”
26
For example, see J. Reid Meloy, Karoline Roshdi, Justine Glaz-Ocik, and Jens Hoffmann,
“Investigating the Individual Terrorist in Europe,” Journal of Threat Assessment and
Management 2, no. 3-4 (2015): 140-152; see also John Monahan, “The Individual Risk
Assessment of Terrorism,” Psychology, Public Policy, and Law 18, no. 2 (2012): 167-205; Kiran
M. Sarma, “Risk Assessment and the Prevention of Radicalization from Nonviolence into
Terrorism,” American Psychologist 72, no. 3 (2017): 278-288.
27
See Michael Wolfowicz, Yael Litmanovitz, David Weisburd, and Badi Hasisi, “A Field-Wide
Systematic Review and Meta-Analysis of Putative Risk and Protective Factors for Radicalization
Outcomes,” Journal of Quantitative Criminology 36 (2020): 407-447.
28
Caitlin Clemmow, Paul Gill, Noémie Bouhana, James Silver, and John Horgan,
“Disaggregating Lone-actor Grievance-Fuelled Violence: Comparing Lone-Actor Terrorists and
Mass Murderers,” Terrorism and Political Violence. Ahead of Print, 1-25; Sarah L. Desmarais,
Joseph Simons-Rudolph, Christine S. Brugh, Eileen Schilling, and Chad Hoggan, “The State of
Scientific Knowledge Regarding Factors Associated with Terrorism,” Journal of Threat
Assessment and Management 4, no. 4 (2017): 180-209; Paul Gill, John Horgan, and Paige
Deckert, “Bombing Alone: Tracing the Motivations and Antecedent Behaviours of Lone‐Actor
Terrorists,” Journal of Forensic Sciences 59, no. 2 (2014): 425-435.
29
Ryan Scrivens, Amanda Isabel Osuna, Steven M. Chermak, Michael A. Whitney, and Richard
Frank, “Examining Online Indicators of Extremism in Violent Right-Wing Extremist Forums,”
Studies in Conflict & Terrorism. Ahead of Print, 1-25.
30
Mattias Ekman, “Anti-refugee Mobilization in Social Media: The Case of Soldiers of Odin”,
Social Media + Society 4, no. 1 (2018): 1-11; Tiana Gaudette, Ryan Scrivens, Garth Davies and
Richard Frank, “Upvoting Extremism: Collective Identity Formation and the Extreme Right on
Reddit,” New Media and Society. Ahead of Print, 1-18; Imogen Richards, “A Philosophical and
Historical Analysis of “Generation Identity”: Fascism, Online Media, and the European New
Right,” Terrorism and Political Violence. Ahead of Print, 1-20; Ryan Scrivens and Amarnath
Amarasingam, “Haters Gonna “Like”: Exploring Canadian Far-Right Extremism on Facebook,”
in Mark Littler and Benjamin Lee, Eds., Digital Extremisms: Readings in Violence,
Radicalisation and Extremism in the Online Space (London, UK, Palgrave 2020), pp. 63-89;
Mattias Wahlström and Anton Törnberg, “Social Media Mechanisms for Right-Wing Political
Violence in the 21st Century: Discursive Opportunities, Group Dynamics, and Co-Ordination,”
Terrorism and Political Violence. Ahead of Print, 1-22.
31
Guhl and Davey, A Safe Space to Hate.
32
Bliuc et al., “Collective Identity Changes in Far-Right Online Communities.”
33
Isabelle van der Vegt, Maximilian Mozes, Paul Gill and Bennett Kleinberg, “Online Influence,
Offline Violence: Language use on YouTube Surrounding the ‘Unite the Right’ Rally,” Journal
of Computational Social Science. Ahead of Print, 1-22.
34
Markus Kaakinen, Atte Oksanen, and Pekka Räsänen, “Did the Risk of Exposure to Online
Hate Increase After the November 2015 Paris Attacks? A Group Relations Approach,”
Computers in Human Behavior 78 (2018): 90-97; Matthew L. Williams and Pete Burnap,
“Cyberhate on Social Media in the Aftermath of Woolwich: A Case Study in Computational
Criminology and Big Data,” British Journal of Criminology 56, no. 2 (2015): 211-238.
35
Ryan Scrivens, George W. Burruss, Thomas J. Holt, Steven M. Chermak, Joshua D. Freilich,
and Richard Frank, “Triggered by Defeat or Victory? Assessing the Impact of Presidential
Election Results on Extreme Right-Wing Mobilization Online,” Deviant Behavior. Ahead of
Print, 1-16; Shlomi Sela, Tsvi Kuflik and Gustavo S. Mesch, “Changes in the Discourse of
Online Hate Blogs: The Effect of Barack Obama’s Election in 2008,” First Monday 17, no. 11
(2012); Karsten Müller and Carlo Schwarz, “Making America Hate Again? Twitter and Hate
Crime Under Trump,” 2018. doi: 10.2139/ssrn.3149103.
36
See, for example, the Federal Bureau of Investigation, Homegrown Violent Extremist
Mobilization Indicators 2019 Edition (Washington, DC: Office of the Director of National
Intelligence, 2019).
37
See Paul Gill, Emily Corner, Maura Conway, Amy Thornton, Mia Bloom, and John Horgan,
“Terrorist Use of the Internet by the Numbers: Quantifying Behaviors, Patterns, and Processes,”
Criminology and Public Policy 16, no. 1 (2017): 99-117; see also Gaudette et al., “The Role of
the Internet in Facilitating Violent Extremism.”
38
A most recent example includes the Federal Bureau of Investigation, Homegrown Violent
Extremist Mobilization Indicators 2019 Edition.
39
See Ibid.
40
Scrivens et al., “Examining Online Indicators of Extremism in Violent Right-Wing Extremist
Forums.”
41
Bliuc et al., “Collective Identity Changes in Far-Right Online Communities”; Pete Simi and
Robert Futrell, American Swastika: Inside the White Power Movement’s Hidden Spaces of Hate
(Second Edition) (Lanham, MD: Rowman and Littlefield Publishers, 2015).
42
Bradley Galloway and Ryan Scrivens, “The Hidden Face of Hate Groups Online: An Insider’s
Perspective,” VOX-Pol Network of Excellence Blog, January 3, 2018.
https://www.voxpol.eu/hidden-face-hate-groups-online-formers-perspective (accessed 29
January 2022).
43
See W. Chris Hale, “Extremism on the World Wide Web: A Research Review,” Criminal
Justice Studies 25, no. 4 (2010): 343-356; see also Christopher J. Lennings, Krestina L. Amon,
Heidi Brummert, and Nicholas J. Lennings, “Grooming for Terror: The Internet and Young
People,” Psychiatry, Psychology and Law 17, no. 3 (2010): 424-437; Meghan Wong, Richard
Frank, and Russell Allsup, “The Supremacy of Online White Supremacists: An Analysis of
Online Discussions of White Supremacists,” Information and Communications Technology Law
24, no. 1 (2015): 41-73.
44
See Back, “Aryans Reading Adorno”; see also Lorraine Bowman-Grieve, “Exploring
“Stormfront:” A Virtual Community of the Radical Right,” Studies in Conflict and Terrorism 32,
no. 11 (2009): 989-1007; see also Willem De Koster and Dick Houtman, “‘Stormfront is Like a
Second Home to Me:’ On Virtual Community Formation by Right-Wing Extremists,”
Information, Communication and Society 11, no. 8 (2008): 1155-1176.
45
See Futrell and Simi, “Free Spaces, Collective Identity, and the Persistence of U.S. White
Power Activism;” see also Barbara Perry and Ryan Scrivens, “White Pride Worldwide:
Constructing Global Identities Online,” in Jennifer Schweppe and Mark Walters, Eds., The
Globalisation of Hate: Internationalising Hate Crime (New York, NY: Oxford University Press,
2016), pp. 65-78.
46
See Burris et al., “White Supremacist Networks on the Internet;” see also Phyllis B.
Gerstenfeld, Diana R. Grant, and Chau-Pu Chiang, “Hate Online: A Content Analysis of
Extremist Internet Sites,” Analyses of Social Issues and Public Policy 3, no. 1 (2003): 29-44.
47
See Daniels, Cyber Racism; see also Priscilla M. Meddaugh and Jack Kay, “Hate Speech or
‘Reasonable Racism?’ The Other in Stormfront,” Journal of Mass Media Ethics 24, no. 4 (2009):
251-268.
48
Scrivens, “Exploring Radical Right-Wing Posting Behaviors Online.”
49
Kleinberg et al., “The Temporal Evolution of a Far‑Right Forum.”
50
Ryan Scrivens, George W. Burruss, Thomas J. Holt, Steven M. Chermak, Joshua D. Freilich,
and Richard Frank, “Triggered by Defeat or Victory? Assessing the Impact of Presidential
Election Results on Extreme Right-Wing Mobilization Online,” Deviant Behavior. Ahead of
Print, 1-16.
51
Scrivens et al., “Measuring the Evolution of Radical Right-Wing Posting Behaviors Online.”
52
Bliuc et al., “Collective Identity Changes in Far-Right Online Communities.”
53
Scrivens et al., “Comparing the Online Posting Behaviors of Violent and Non-Violent Right-
Wing Extremists.”
54
For more information on the web-crawler, see Ryan Scrivens, Tiana Gaudette, Garth Davies,
and Richard Frank, “Searching for Extremist Content Online Using The Dark Crawler and
Sentiment Analysis,” in Mathieu Deflem and Derek M. D. Silva, Eds., Methods of Criminology
and Criminal Justice Research (Bingley, UK: Emerald Publishing, 2019), pp. 179-194.
55
September 12, 2001 was simply the date that the sub-forum went live online. Based on my
assessment of the first messages posted on the sub-forum, it would appear as though Stormfront
Canada was not launched in response to the 9/11 terror attacks.
56
By former violent extremists, we refer to individuals who, at one time in their lives, subscribed
to and/or perpetuated violence in the name of a particular extremist ideology and have since
publicly and/or privately denounced violence in the name of a particular extremist ideology. In
short, they no longer identify themselves as adherents of a particular extremist ideology or are
affiliated with an extremist group or movement.
57
Data collection efforts followed the proper ethical procedures for conducting research
involving human participants. Here the former extremist was informed that their participation in
the study was entirely voluntary. They were also informed that they had the right to decline to
answer questions or to end the interview/withdraw from the study at any time. In addition, the
former was informed that they would not be identified by name in any publication, and that all
data collected from the interview would be de-identified for the purpose of ensuring participant
anonymity. One in-person interview was conducted with the former in June 2017 and was
approximately 10 hours in length. The interview was audio-recorded and transcribed.
58
The former extremist reviewed a list of all sub-forum usernames and, next to each, recorded the name of the individual they identified, thus documenting that each username was distinct. In other words, the former linked each of the 99 forum users to a specific individual, indicating there was no overlap among them.
59
This study was not an indictment of this sub-forum itself. The sub-forum was selected because it was an online space in which the former extremist actively participated during their involvement in violent RWE, meaning that they were familiar with the users who posted there and could identify individuals whom they knew to be violent or non-violent RWEs in the offline world.
60
See J. M. Berger, Extremism (Cambridge, MA: The MIT Press, 2018).
61
See Conway et al., “Right-Wing Extremists’ Persistent Online Presence.”
62
Tore Bjørgo and Jacob Aasland Ravndal, “Extreme-Right Violence and Terrorism: Concepts,
Patterns, and Responses,” The International Centre for Counter-Terrorism – The Hague
10 (2019): 5.
63
A 10% sub-sample is commonly used by terrorism and extremism researchers when assessing
inter-rater reliability. A recent example includes Steven Windisch, Michael K. Logan, and Gina
Scott Ligon, “Headhunting Among Extremist Organizations: An Empirical Assessment of Talent
Spotting,” Perspectives on Terrorism 12, no. 2 (2018): 44-62.
64
Patrick E. Shrout and Joseph L. Fleiss, “Intraclass Correlations: Uses in Assessing Rater
Reliability,” Psychological Bulletin 86, no. 2 (1979): 420-428.
65
Southern Poverty Law Center, “Ideologies,” 19 November 2020. Available at:
https://www.splcenter.org/fighting-hate/extremist-files/ideology (accessed 29 January 2022).
66
A select number of ideology codes were drawn from the SPLC ideologies list to make up the ideology codebook for the current study – ideologies that I believe applied to the data and did not overlap with other ideology categories outlined by the SPLC. For example, SPLC’s ‘neo-Nazi’, ‘racist skinhead’, and ‘Ku Klux Klan’ categories were omitted from the current study because their descriptions were too similar to code in a meaningful way. Other ideologies from SPLC’s list that were omitted from my codebook included Black separatist, Phineas Priesthood, hate music, and alt-right, given that they did not adequately represent the data under investigation.
67
Barbara Perry, and Randy Blazak, “Places for Races: The White Supremacist Movement
Imagines US Geography,” Journal of Hate Studies 8, no. 1 (2009): 29-51; Pete Simi, “Why
Study White Supremacist Terror? A Research Note,” Deviant Behavior 31, no. 3 (2010): 251-
273.
68
Lorraine Bowman-Grieve, “‘Exploring ‘Stormfront’: A Virtual Community of the Radical
Right,” Studies in Conflict & Terrorism 32, no. 11 (2009): 989-1007; Jessie Daniels, Cyber
Racism: White Supremacy Online and the New Attack on Civil Rights (Lanham, MD: Rowman
and Littlefield Publishers, 2009).
69
Barkun, “Millenarian Aspects of ‘White Supremacist’ Movements”; Dobratz and Shanks-
Meile, “White Power, White Pride!”; Kaplan, “Right Wing Violence in North America;” Kaplan,
“Leaderless Resistance.”
70
Gill et al., “Bombing Alone.”
71
Clemmow et al., “Disaggregating Lone-actor Grievance-Fuelled Violence.”
72
Federal Bureau of Investigation, Homegrown Violent Extremist Mobilization Indicators 2019
Edition (Washington, DC: Office of the Director of National Intelligence, 2019).
73
Analyses were conducted using Microsoft Excel Version 16.51.
74
See Bowman-Grieve, “Exploring “Stormfront;” see also De Koster and Houtman,
“‘Stormfront is Like a Second Home to Me.’”
75
See Conway et al., “Right-Wing Extremists’ Persistent Online Presence;” see also Daniels,
Cyber Racism.
76
Michael Barkun, “Millenarian Aspects of ‘White Supremacist’ Movements,” Terrorism and
Political Violence 1, no. 4 (1989): 409-34; Betty A. Dobratz, and Stephanie L. Shanks-Meile,
“White Power, White Pride!”: The White Separatist Movement in the United States
(Woodbridge, CT: Twayne Pub, 1997); Raphael S. Ezekiel, The Racist Mind: Portraits of
American Neo-Nazis and Klansmen (New York: Viking Penguin, 1995); Jeffrey Kaplan, “Right
Wing Violence in North America,” Terrorism and Political Violence 7, no. 1 (1995): 44-95;
Jeffrey Kaplan, “Leaderless Resistance,” Terrorism and Political Violence 9, no. 3 (1997): 80-
95.
77
Bowman-Grieve, “‘Exploring ‘Stormfront’”; Daniels, Cyber Racism; Holt et al., “Examining
the Online Expression of Ideology Among Far-Right Extremist Forum Users”; Scrivens et al.,
“Measuring the Evolution of Radical Right-Wing Posting Behaviors Online”; Scrivens,
“Exploring Radical Right-Wing Posting Behaviors Online;” Scrivens et al., “Examining Online
Indicators of Extremism in Violent Right-Wing Extremist Forums.”
78
Davey et al., An Online Environmental Scan of Right-Wing Extremism in Canada.
79
Davey and Ebner, The Fringe Insurgency; Finkelstein et al., “A Quantitative Approach to
Understanding Online Antisemitism.”
80
Holt et al., “Examining the Online Expression of Ideology Among Far-Right Extremist Forum
Users.”
81
Barkun, “Millenarian Aspects of ‘White Supremacist’ Movements”; Kaplan, “Right Wing
Violence in North America;” Simi, “Why Study White Supremacist Terror?”
82
Scrivens et al., “Comparing the Online Posting Behaviors of Violent and Non-Violent Right-
Wing Extremists.”
83
See, for example, Barbara Perry and Ryan Scrivens, Right-Wing Extremism in Canada (Cham,
Switzerland: Palgrave, 2019).
84
See Bowman-Grieve, “‘Exploring ‘Stormfront’”; see also Daniels, Cyber Racism; De Koster
and Houtman, “‘Stormfront is Like a Second Home to Me’;” Wojcieszak, “‘Don’t Talk to Me’.”
85
Scrivens et al., “Examining Online Indicators of Extremism in Violent Right-Wing Extremist
Forums.”
86
See Daniels, Cyber Racism; see also Berger, Extremism.
87
Gaudette et al., “The Role of the Internet in Facilitating Violent Extremism.”
88
Kathleen M. Blee and Kimberly A. Creasap, “Conservative and Right-Wing Movements,”
Annual Review of Sociology 36 (2010): 269-286; Jeff Gruenewald, Joshua D. Freilich, and
Steven M. Chermak, “An Overview of the Domestic Far-Right and Its Criminal Activities,” in
Barbara Perry and Randy Blazak, Eds., Hate Crime: Issues and Perspectives, Vol. 4 Offenders
(New York, NY: Praeger, 2009), pp. 1-22; Mark S. Hamm, American Skinheads: The
Criminology and Control of Hate Crime (Westport, CT: Praeger, 1993); Perry and Scrivens,
Right-Wing Extremism in Canada; Simi and Futrell, American Swastika.
89
Gerstenfeld et al., “Hate Online;” Perry and Scrivens, Right-Wing Extremism in Canada; Simi
and Futrell, American Swastika.
90
For more on these limitations, see Scrivens et al., “Comparing the Online Posting Behaviors of
Violent and Non-Violent Right-Wing Extremists.”
91
For more information on the ECDB, see Joshua D. Freilich, Steven M. Chermak, Jeff
Gruenewald, William S. Parkin, and Brent R. Klein, “Patterns of Fatal Extreme-Right Crime in
the United States”, Perspectives on Terrorism 12, no. 6 (2018): 38-51.
92
Horgan et al., “Actions Speak Louder Than Words”; Jasko et al., “Quest for Significance and
Violent Extremism”; Knight et al., “Comparing the Different Behavioral Outcomes of
Extremism”; LaFree et al., “Correlates of Violent Political Extremism in the United States.”