Examining the Developmental Pathways of Online Posting Behavior in Violent Right-Wing
Extremist Forums
Ryan Scrivens
School of Criminal Justice, Michigan State University
East Lansing, Michigan, USA
rscriv@msu.edu
Thomas W. Wojciechowski
School of Criminal Justice, Michigan State University
East Lansing, Michigan, USA
Richard Frank
School of Criminology, Simon Fraser University
Burnaby, British Columbia, Canada
Disclosure Statement
No potential conflict of interest was reported by the authors.
Examining the Developmental Pathways of Online Posting Behavior in Violent Right-Wing
Extremist Forums
Abstract:
Many researchers, practitioners, and policymakers are concerned about online communities that
are known to facilitate violent right-wing extremism, but little is empirically known about these
digital spaces in general and the developmental posting behaviors that make up these spaces in
particular. In this study, group-based trajectory modeling – derived from a criminal career
paradigm – was used to identify posting trajectories found in the open access sections of the Iron
March and Fascist Forge forums, both of which have gained notoriety for their members’ online
advocacy of violence and acts of violence carried out by them. Multinomial logistic regression
and analysis of variance were then used to assess whether posters’ time of entry into the violent
forums predicted trajectory group assignment. Overall, the results highlight a number of
similarities and differences in posting behaviors within and across platforms, many of which may
inform future risk factor frameworks used by law enforcement and intelligence agencies to
identify credible threats online. We conclude with a discussion of the implications of this
analysis, followed by a discussion of study limitations and avenues for future research.
Purpose
This study examines pathways in posting behavior found within the open access sections of two
of the most conspicuous right-wing extremist forums known for facilitating violent extremism,
Iron March and Fascist Forge. The goal is to identify patterns of developmental posting behavior
that could be characterized as distinct groups (or trajectories) using a criminal career paradigm.
We also examine whether users’ time of entry in these digital spaces predicts their posting
behavior. This study represents an original contribution to the academic literature on violent
online political extremism on four fronts.
First, researchers, practitioners, and policymakers have paid close attention to the presence of terrorists and extremists online in recent years, with a particular emphasis on digital communities that facilitate extreme right-wing ideologies.[1] This is largely the result of ongoing reports that some violent right-wing extremists and terrorists were active online prior to their attacks,[2] with one of the most recent examples being 28-year-old Australian Brenton Tarrant who, before killing 50 people in two Christchurch, New Zealand mosques in 2019 and live-streaming his attack, announced his intentions on 8chan and produced a 'manifesto' linked on the website.[3]
It should come as little surprise, then, that researchers have focused on the presence of right-wing extremists on various platforms, including websites and discussion forums,[4] mainstream social media sites such as Facebook,[5] Twitter,[6] and YouTube,[7] fringe platforms including 4chan[8] and Gab,[9] and digital applications such as TikTok[10] and Telegram.[11]
Despite these important contributions, an empirical study of some of the most violent extremist discussion forums, such as Iron March and Fascist Forge, has been overlooked. An exhaustive search using dedicated academic research databases produced only two studies on the presence of the extreme right on Iron March or Fascist Forge. Davey and colleagues,[12] for example, identified and reported on the geo-locations of Canadian users on these platforms. Singer-Emery and Bray III[13] explored the posting activities of those on Iron March, including a profile of an "average" Iron March user, the activities that drove discourse on the forum, and the networks that formed on the platform. Notwithstanding these studies, what we generally know about these two spaces is more journalistic description[14] than academic analysis. In other words, we have little scholarship on the presence of the extreme right on these violent platforms and fewer efforts to methodologically and systematically analyze users' posting behavior in these spaces. There can be little doubt, then, that an assessment is needed.
Second, researchers have found that right-wing extremists, among other extremists, have maintained a presence on multiple online platforms[15] and, in turn, there have been recent calls in terrorism and extremism studies for more comparative research across platforms.[16] Many practitioners and policymakers are likewise concerned with (1) how extremists operate across platforms, (2) whether the presence of extremists on various platforms is similar or distinct, and (3) whether certain platforms serve unique functions for extremists.[17] While there has been a recent surge of research on the presence of the extreme right on various online platforms,[18] little is empirically known about their presence across platforms.[19] Of the limited cross-platform studies that have been conducted in this regard, Davey and Ebner[20] explored the connectivity and convergence of the 'new' extreme right in Europe and the United States (U.S.) on 4chan, 8chan, Voat, Gab, and Discord. Davey and colleagues[21] also assessed the scale and scope of Canadian right-wing extremist activity within and across various platforms, including Facebook, Twitter, YouTube, 4chan, Gab, Fascist Forge, and Iron March. Zannettou and colleagues[22] measured the spread of anti-Semitic content across 4chan and Gab. Lastly, Holt and colleagues[23] examined the ideological sentiments expressed across several right-wing extremist forums. Together, these studies highlight the connected nature of right-wing extremist movements as well as distinctions in their online communications, both within and across platforms. Regardless, this emerging evidence base remains in its infancy and requires further exploration.
Third, it has become increasingly common in the academic literature on violent online political extremism, particularly in studies using machine learning algorithms, to include a temporal component in the analyses, including evaluations of how extreme discourse evolves over time online[24] or assessments of the levels of, or propensity toward, violent radicalization online.[25] But far less is empirically known about the general posting behaviors that make up these extremist spaces[26] or how the posting behaviors that underpin extremist online spaces in general[27] or violent extremist spaces in particular develop over time. What is known about the temporal patterns in posting behaviors found in right-wing extremist spaces, for example, comes from Kleinberg and colleagues,[28] who measured the development of user activity and extremist language on Stormfront.org, the largest collective right-wing extremist forum for a general audience.[29] The authors found an increase in posting behavior followed by a subsequent decrease, as well as a group of 'super users' who accounted for the vast majority of the posts and extremist language. Scrivens and colleagues[30] similarly explored how radical right-wing posting behaviors develop on Stormfront, but they found that users generally increased their posting activity, which suggests that increased participation in the forum polarizes users' radical opinions, a finding supported by studies conceptualizing digital platforms of the extreme right as spaces that polarize members' opinions over time.[31] Scrivens and colleagues[32] concluded that a criminal career paradigm provides a useful perspective through which to study developmental posting behaviors in online domains of extremist movements.
Of particular interest here, the criminal career paradigm set out by Blumstein and colleagues[33] argued that an offender's criminal career could be measured at its beginning (e.g., age of onset), middle (e.g., change in frequency, severity, and seriousness of crime), and end (e.g., desistance). Drawing from this paradigm, Moffitt[34] provided perhaps the most prominent framework for understanding crime and criminal careers from a developmental perspective with the dual taxonomy of offending. In short, this framework delineates offenders into two groups: adolescence-limited and life-course persistent. The former demonstrate later initiation into offending and early desistance (i.e., a complete halt in offending for the remainder of an individual's life), with the vast majority predicted to follow this trajectory during life. The latter demonstrate early initiation into offending, progressing to a more serious and chronic course of offending. Chronic in this regard is understood as continuity of engagement in offending across the life-course even when most same-age individuals have stopped offending. While more recent criminological research has indicated that developmental subgroups of offending are more complicated than a two-group model,[35] this framework continues to serve as a useful heuristic tool in understanding developmental pathways of behavior. Such an approach to understanding offending pathways has also sparked some interest among those studying issues related to terrorism and extremism. DeLisi and Piquero,[36] for example, noted in their review of criminal career research from 2000 to 2011 that much more research is needed to study the longitudinal pathways of politically motivated criminals and terrorists. The criminology of terrorism has begun to take up DeLisi and Piquero's[37] challenge.
In 1993, Nagin and Land[38] introduced group-based trajectory modelling (GBTM), a technique ideally suited to analyzing criminal careers and the developmental paths of individual offending. Designed to identify latent groups of cases that share similar patterns of behavior over time, the initial purpose of this analytical technique was to describe the temporal trends of individual offending[39] and anti-social behavior among youths.[40] Guided by this developmental framework, criminologists have begun to study trends in terrorist activities.[41] A select number of criminologists have also incorporated a developmental life-course framework into their understanding of onset to violent extremism, from qualitative[42] and quantitative perspectives.[43] Criminologists have likewise used this framework to study the processes of de-radicalization and disengagement.[44] One area, though, to which this framework has rarely been applied is the online domains of extremist movements.[45]
Fourth and finally, a growing body of research in terrorism and extremism studies has explored the longitudinal impact of individual embeddedness in violent extremist movements, including, but by no means limited to, empirical studies exploring the evolution of terrorist networks,[46] trajectories and mechanisms of violent radicalization,[47] and pathways to desisting from violent extremist movements[48] and terrorist organizations.[49] The emerging evidence base is quite clear: levels of involvement in a violent extremist movement influence adherents' roles there. Research on violent online political extremism has similarly been concerned with the extent to which individuals are immersed in violent extremism online, with a particular focus on the impact of extremist online content on violent radicalization.[50] But overlooked in this regard has been an assessment of how immersion and participation in online extremist spaces influence users' posting behaviors there. What the limited empirical research does suggest is that those who post for long periods of time and are embedded in online extremist spaces, particularly those of the extreme right, tend to show a high level of commitment to the online community and, by extension, the radical cause.[51] They also attempt to engage with and motivate others to participate in fanatical discussions and build a sense of community there,[52] and they are oftentimes the 'super contributors' (i.e., those who dominate the discussions) and those whose first post dates back significantly longer than the average user's.[53] Regardless, much more research is needed to assess the link between users' time of entry into violent extremist platforms and their online posting behavior.
Before proceeding, it is necessary to outline how we conceptualize right-wing extremism in the current study. Following Berger,[54] we are guided by the view that right-wing extremists, like all extremists, structure their beliefs on the basis that the success and survival of the in-group is inseparable from the negative acts of an out-group and, in turn, they are willing to assume both an offensive and defensive stance in the name of the success and survival of the in-group.[55] Right-wing extremism is thus defined as a racially, ethnically, and/or sexually defined nationalism, typically framed in terms of white power and/or white identity (i.e., the in-group), that is grounded in xenophobic and exclusionary understandings of the perceived threats posed by some combination of non-whites, Jews, Muslims, immigrants, refugees, members of the LGBTQ community, and feminists (i.e., the out-group(s)).[56]
Current study
Data
All open source content on Iron March and Fascist Forge was analyzed for the current study. The Iron March data included 195,212 posts made by 1,448 authors between September 2011 and October 2017,[57] while the Fascist Forge data included 2,933 posts made by 405 authors between May 2018 and November 2019. Data were captured using a custom-written computer program designed to collect vast amounts of information online.[58] It is worth highlighting here that Fascist Forge's 'multi-stage application process' to vet its users before they were given access to post or to visit and participate in private channels there (discussed below) did not affect the data collection process for the current study, as only the postings from the open access section of the forum were extracted. In other words, those who were not vetted could view the open access sections of the forum and, by extension, extract the open source content from the platform.
Iron March and Fascist Forge served as online hubs for neo-Nazis, neo-fascists, and white supremacists to connect with the like-minded and to engage in real-world violence.[59] Iron March, for example, appeared online in 2011 and functioned as a key online discussion forum for individuals to connect and communicate with others who subscribed to violent extreme right-wing ideologies.[60] The forum gained notoriety for its members' online advocacy of violence and acts of violence carried out by them.[61] To illustrate, Iron March hosted a number of members whose goal was to recruit from the U.S. military for the purpose of "destroying liberal democracy through a fascist paramilitary insurgency."[62] Most notably, they later formed the Atomwaffen Division, a far-right terrorist group linked to a series of murders in the U.S.,[63] with Iron March being crucial to the formation of the group.[64] Anti-hate watch organizations also uncovered that Iron March was linked to several other violent right-wing extremist groups, including Patriot Front and Vanguard America in the U.S. and National Action in the United Kingdom (U.K.), among others.[65] While the forum went offline in 2017 for undisclosed reasons, two years later personal details about Iron March users were publicly leaked online, including forum usernames with e-mail addresses, IP addresses, and forum posts.[66]
In 2018, the Fascist Forge discussion forum went online to "fill the void by the takedown of Iron March", according to one of the site founders, who posted the message shortly after the forum went online.[67] Anti-hate watch groups have similarly described the forum as a remodeled version of Iron March for a small community of hardline neo-Nazis to advocate terrorism.[68] Such a small community is likely the result of what Davey and colleagues[69] described as a 'multi-stage application process', wherein forum administrators vetted new users before giving them access to post on the platform and visit private channels there. This was done to ensure that only those who were committed to the extremist cause were participating in the platform and, most likely, to weed out potential infiltrators or non-conformists.[70] As part of this vetting process, new members were first required to introduce themselves, outline why they registered, and discuss what they hoped to accomplish by entering the forum, among other things.[71] New members were then encouraged to visit the 'learning center' and read a list of texts that were explicitly supremacist, including writing from right-wing extremist ideologues David Lane, Adolf Hitler, and George Lincoln Rockwell, all in preparation for an 'entrance exam.'[72] Those who gained additional access to the platform were then subject to strong content moderation, where users could be kicked off the platform for various reasons, including 'support for Semitic desert religions.'[73] Regardless of the platform's differences from Iron March, Fascist Forge, like its predecessor, facilitated violent discussions that were steeped in extreme right-wing ideologies, including calls for direct action such as a race war and targeting public infrastructure.[74] Within this digital space was also a library for would-be extreme right-wing terrorist cells, which featured a guide on militaristic tactics for ethnic cleansing, manuals for making homemade weapons, and instructions on how to dispose of a body.[75] Also included in this space were descriptions of the most effective weapons to use during urban combat as well as discussions about how to pull off an assassination.[76] Most importantly, a number of violent extremists were linked to Fascist Forge, similar to participants in Iron March, including a 16-year-old contributor to the forum who was convicted of planning a terror attack against synagogues in the U.K. to trigger a "race war".[77] Two other British members of Fascist Forge who were recently charged with a combined 25 terrorism offences are currently going through the youth court system.[78] In March 2018, the web service companies hosting Fascist Forge took the site offline, in part because of the platform's link to violent extremism, but the site came back online using Cloudflare.[79] Today, the site is offline for reasons that are unknown, a situation similar to Iron March.
Measures
Posting frequency
The dependent variable used for the current study for each of the two discussion forums was
posting frequency per month by each forum user. Here we examined online postings found on
two violent extremist forums, and not violent posts specifically. The act of posting on these
forums was not considered terrorism or violent extremism and we did not consider those who
posted in the two forums as violent extremists or terrorists, nor did we assume that the posted
content related to violence specifically. In addition, we did not examine the posting activity of
the same users across forums but instead we compared – in aggregate – the behaviors of users on
one forum to the behaviors of users on a second forum where they may or may not have been the
same.
Individual posts were tracked, coded, and analyzed across a time period of nine months for each user, beginning at the time of their first post in the forum's measurable lifespan.[80] The number of messages that each user posted in a forum was tallied for each month, and the frequency value was then assigned to the given month variable. If a user posted zero messages in the forum for a given month, then a score of "0" was assigned to that month.
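The tallying procedure above can be sketched with pandas. This is a minimal illustration, not the authors' actual pipeline: the usernames and dates below are invented, and the only logic shown is counting posts per user per month relative to each user's first post, with zero-filled months.

```python
import pandas as pd

# Hypothetical post log: one row per post (users and dates are invented).
posts = pd.DataFrame({
    "user": ["a", "a", "a", "b", "b"],
    "posted_at": pd.to_datetime(
        ["2018-05-03", "2018-05-20", "2018-07-11", "2018-06-02", "2019-04-14"]
    ),
})

# Month index of each post relative to the user's first post (0 = first month).
first = posts.groupby("user")["posted_at"].transform("min")
month = (
    (posts["posted_at"].dt.year - first.dt.year) * 12
    + (posts["posted_at"].dt.month - first.dt.month)
)

# Tally posts per user per relative month, keep the first nine months only,
# and assign 0 to months in which a user posted no messages.
counts = (
    posts.assign(month=month)
    .query("month < 9")
    .groupby(["user", "month"]).size()
    .unstack(fill_value=0)
    .reindex(columns=range(9), fill_value=0)
)
```

Here user "b"'s second post falls outside their first nine months and is therefore dropped, mirroring the nine-month observation window described above.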
A time period of nine months was chosen as the time span for several reasons. First, preliminary analyses of the Fascist Forge data indicated that nine months was the mean number of months for which forum users provided complete data. This is because several users posted one message at the onset of their posting career and did not post again until much later in the forum's lifespan, while the initial posting career of other users developed late into the forum's lifespan (i.e., the latter portion of the study period). Second, coding all individual posts for every user in the Iron March data resulted in a high proportion of missing data because of the long lifespan of the forum. Here a relatively large proportion of users in the sample posted once at the onset of their posting career and then posted again much later in the forum. This resulted in a number of issues with the interpretability of the data and with visual descriptions of the trajectories. Together, coding the first nine months of posting activity for each user provided the best capacity for comparing longitudinal patterns observed in both forums.[81] Table 1 provides descriptive information on the posting behaviors found in the nine-month and full datasets.
Table 1. Descriptive information of the posting behaviors in the nine-month and full dataset.

                                                     Iron March            Fascist Forge
Average number of posts per month for the
  nine-month dataset                                 8.142                 0.939
Proportion of users with complete data for the
  nine-month dataset                                 78.936% (n = 1,143)   47.901% (n = 194)
Proportion of users with more than nine months
  of complete data in the full dataset               75.691% (n = 1,096)   44.691% (n = 181)
In addition, preliminary analyses indicated that Stata/MP 16.1 had difficulties modeling the Iron March data due to the large range between minimum and maximum values. Consistent with an approach conducted by Wojciechowski,[82] a ceiling category was created at 80 posts per user. All data greater than 80 posts per user were coded as "80."[83] This ceiling was chosen because it maintained the maximum amount of data in its original form while allowing Stata to estimate the model effectively. To illustrate, the average proportion of users in the sample who were censored at 80 posts for each monthly posting period was 2.16%.[84]
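The ceiling recode is a simple top-censoring step. A sketch under invented values (these monthly counts are not from the study data):

```python
import pandas as pd

# Hypothetical monthly post counts (invented for illustration).
monthly_posts = pd.Series([3, 95, 120, 41, 80, 7])

# Apply the ceiling used in the study: values above 80 are recoded to 80,
# leaving all other observations in their original form.
censored = monthly_posts.clip(upper=80)

# Share of observations censored at the ceiling (here 2 of 6).
share_censored = (monthly_posts > 80).mean()
```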
Initial posting month
The independent variable for the analysis was the month in which each user made their first post in the forum, which we also refer to as posters' time of entry into the forums. In short, this was a measure of when a user became active in the forum in relation to the forum's lifespan (i.e., a forum-level variable) rather than an assessment of the individual characteristics of a poster.

For the Iron March data, there were approximately 75 months in the forum's lifespan, and based on the month in which each user began posting in the forum, they were assigned a posting score ranging from 1 to 75 (1 = September 2011; 75 = October 2017). For the Fascist Forge data, there were approximately 18 total months included in the study period. Posters were therefore assigned a score ranging from 1 to 18, again based on the month in which they began posting in the forum (1 = May 2018; 18 = November 2019). As a result, the independent variable for each forum was coded as an interval variable.
Analytic strategy
The analyses proceeded in two phases. In the first phase, GBTM was used to identify heterogeneity in the developmental patterns of posting behavior by users in the Iron March data and in the Fascist Forge data. The GBTM used for this study assumes that the population of interest (in this case, posters on Iron March and Fascist Forge) consists of a discrete number of unobserved classes, each with its own distinct temporal pattern and polynomial growth curve.[85] More specifically, GBTM was used to determine if there were patterns of posting behavior over time that could be characterized as distinct groups (or trajectories). Two separate analyses were conducted, wherein GBTM was used to identify posting trajectories in Iron March and in Fascist Forge.
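The GBTM assumption above can be illustrated directly: each latent group has its own polynomial growth curve for expected posting frequency, typically specified on the log scale of a count-distribution mean. The coefficients below are invented for illustration, and the sketch omits the zero-inflation component the fitted models actually used.

```python
import numpy as np

# Nine monthly measurement points, as in the study's observation window.
months = np.arange(9)

def expected_posts(beta):
    """Quadratic growth curve on the log scale (mean of a Poisson count)."""
    return np.exp(beta[0] + beta[1] * months + beta[2] * months ** 2)

# Hypothetical coefficients for a chronic group (high, slowly declining)
# and a desister group (rapid drop toward zero posts).
chronic = expected_posts([3.0, -0.02, 0.0])
desister = expected_posts([2.3, -1.0, 0.02])
```

Estimation then amounts to finding the number of such curves, and their coefficients, that best fit the observed monthly counts.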
To identify group-level trajectory patterns in each forum dataset, users' posting frequencies (dependent variable) were calculated for each month (independent variable). A zero-inflated Poisson (ZIP) distribution was determined to be the best model for posting frequencies over months in each forum. Model selection determined the optimal number of latent groups through an iterative process and relied on several indicators of "best fit." First, the Bayesian Information Criterion (BIC), which is recognized as one of the most reliable statistics available for determining model fit,[86] was evaluated for each model. The BIC indicates how closely the model matches the data, wherein a BIC closer to zero indicates a better model fit.[87] Second, average posterior probabilities (AvePP) were calculated to measure the likelihood that individuals were assigned to the "correct group." Nagin[88] suggests that the optimal value of the AvePP is 1.0, but an AvePP value equal to or greater than 0.7 is acceptable. Third, a more conservative test of model fit, the odds of correct classification (OCC), was assessed. This measure indicates the accuracy of group assignment, and an OCC value equal to or greater than 5 for all groups meets the threshold and suggests high accuracy. Finally, the 95% confidence interval extension developed by Jones and Nagin[89] was used to examine whether each trajectory was distinct from the others as well as whether trajectories overlapped at any of the measurement points in each sample. The 95% confidence intervals should also be tightly bound around each distinct trajectory group.[90]
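The AvePP and OCC diagnostics can be computed directly from a fitted model's posterior probability matrix. The sketch below uses an invented matrix for illustration; the OCC formula follows Nagin's formulation, dividing the odds implied by each group's AvePP by the odds implied by its estimated membership proportion.

```python
import numpy as np

# Hypothetical posterior probability matrix from a fitted trajectory model:
# rows are posters, columns are latent groups (values are invented).
post = np.array([
    [0.95, 0.04, 0.01],
    [0.90, 0.08, 0.02],
    [0.10, 0.85, 0.05],
    [0.05, 0.90, 0.05],
    [0.02, 0.08, 0.90],
    [0.01, 0.09, 0.90],
])

assigned = post.argmax(axis=1)   # maximum-probability group assignment
pi = post.mean(axis=0)           # estimated group membership proportions

# Average posterior probability (AvePP) per group: mean posterior probability
# of group j among posters assigned to group j (rule of thumb: >= 0.7).
avepp = np.array([post[assigned == j, j].mean() for j in range(post.shape[1])])

# Odds of correct classification (OCC) per group (rule of thumb: >= 5).
occ = (avepp / (1 - avepp)) / (pi / (1 - pi))
```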
For the second phase of the analysis, multinomial logistic regression and analysis of variance (ANOVA) were used to examine how the time of a user's first post in the lifespan of the forum predicted assignment to trajectory groups. ANOVA was used to model within- and between-class variance to determine if there were significant differences in the mean quantitative measure of the timing of becoming active in each forum. Multinomial logistic regression was used to examine how this same timing measure influenced the risk of being assigned to each group in the model, relative to an omitted reference group. For this reason, regression coefficients are described below in the form of relative risk ratios (RRR).
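Because the timing measure is coded in months, an RRR applies multiplicatively per month of later entry. As a worked interpretation (using the LOED-versus-HOC estimate of 1.052 reported in the results as an example; the function is ours, for illustration only):

```python
# An RRR of 1.052 per month means each additional month of later entry
# multiplies the relative risk of LOED (vs. HOC) membership by 1.052.
rrr_per_month = 1.052

def relative_risk(months_later: float, rrr: float = rrr_per_month) -> float:
    """Cumulative relative risk for entering `months_later` months later."""
    return rrr ** months_later

one_year = relative_risk(12)  # roughly 1.84x the relative risk of LOED vs. HOC
```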
Results
Iron March
The posting trajectories in the Iron March forum data were best captured by a four-group model,[91] as illustrated in Figure 1. The trajectory groups were labeled low-onset early desister (LOED), high-onset moderate desister (HOMD), moderate-onset chronic (MOC), and high-onset chronic (HOC). The LOED group comprised 64.4% of the sample (n = 956). The posting behavior of users in this group followed a quadratic trajectory trend, decreasing at a fast rate throughout the users' measurable posting careers. In particular, at the onset of the group's posting career, LOED posting frequencies were low (a first-month intercept of around 10 posts) relative to the other trajectory groups in the Iron March sample, and their posting activity sharply declined to zero posts the very next month and throughout the remainder of the group's measurable posting careers.[92]
Figure 1. Iron March trajectories for the four-group model.
Note: MOC = moderate-onset chronic, HOMD = high-onset moderate desister, LOED = low-onset early desister,
HOC = high-onset chronic.
Comprising 13.2% of the sample (n = 196), the HOMD group's posting behavior followed a quadratic trajectory trend, whereby users' posting frequency decreased at a moderate rate across their measured posting careers. At the onset of the group's posting career, HOMD posting frequencies were high (a first-month intercept of around 40 posts) compared to the LOED and MOC groups in the sample, and their posting frequency decreased at a sharp rate from month one to month two and then at a moderate and steady rate until approximately month five, at which point it reached near zero posts for the remainder of the study period.[93]
The third trajectory, the MOC group, comprised 13.8% of the sample (n = 205). The posting trajectory of this group followed a quadratic trend line, one which decreased at a very slow rate through the group's measurable posting career. More specifically, at the onset of the group's posting career, their posting frequency was moderate (a first-month intercept of around 20 posts) relative to the other groups in the sample, but their posting frequency decreased at a slow rate and did not completely desist like the LOED and HOMD groups in the sample. At the last point at which the data were measured, the MOC group posted approximately 10 messages per month in the forum.[94]
HOCs, the final trajectory group in the Iron March data, comprised 8.6% of the sample (n = 126). The posting behavior of users in this group followed a linear trend, decreasing slightly throughout their measurable posting careers. Specifically, at the onset of the group's posting career, their posting frequency was high (a first-month intercept of around 60 posts) relative to all of the other trajectory groups in the sample, and their posting behavior declined only slightly throughout the remainder of the study period. At the last point at which the data were measured, the HOC group posted approximately 40 messages per month in the Iron March sample.[95]
Lastly, in determining how users' initial month of posting activity was associated with trajectory group membership, ANOVA results indicated that significant differences existed between trajectory groups in the average time of becoming active in the forum. In particular, those who were assigned to the LOED group demonstrated the latest average initial posting month (54.487). The HOC group, on the other hand, demonstrated the earliest initial posting month on average (27.303), whereas the HOMD and MOC groups reported similar average initial posting months that fell between the other two groups (39.760 and 35.707, respectively). These results indicated significant between-group differences at the p ≤ .01 level. Next, the HOC group was omitted as the reference group for the multinomial logistic regression modeling phase of the analysis. The results indicated that those who became active later in the forum's lifespan were more likely to be assigned to the HOMD, MOC, and LOED groups, relative to the HOC group (HOMD RRR = 1.015; MOC RRR = 1.016; LOED RRR = 1.052). Table 2 provides a full description of these multinomial logistic regression results.
Table 2. Multinomial logistic regression effects of initial active month on Iron March trajectory
group assignment.
Group   Relative risk ratio   p-value   95% confidence interval
LOED    1.052                 <.001     1.042, 1.061
HOMD    1.015                 .003      1.005, 1.026
MOC     1.016                 .001      1.007, 1.026
HOC     -- (omitted)          --        --
Note: LOED = low-onset early desister, HOMD = high-onset moderate desister, MOC = moderate-onset chronic,
HOC = high-onset chronic.
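Because these relative risk ratios are per-month effects, they compound multiplicatively with
each additional month of delayed entry. The arithmetic below is purely illustrative, using only
the published LOED-versus-HOC ratio of 1.052, not the underlying forum data:

```python
# Illustrative only: compound a per-month relative risk ratio (RRR) over k months.
# An RRR of 1.052 for LOED vs. HOC means each additional month of delayed entry
# multiplies the relative risk of LOED (vs. HOC) assignment by 1.052.
def compounded_rrr(rrr_per_month: float, months: int) -> float:
    return rrr_per_month ** months

# A user who first posts 12 months later than another:
print(round(compounded_rrr(1.052, 12), 2))  # ~1.84x the relative risk
```

Under this reading, a user who first posts a year later than another has roughly 1.84 times the
relative risk of following the low-onset early desister path rather than the high-onset chronic
path.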
Fascist Forge
The Fascist Forge posting trajectories presented in Figure 2 were best modeled via three
groups.96
The functional forms of the Fascist Forge trajectories were quite distinct in comparison
to the posting trajectories in the Iron March data – with the exception of one trajectory group.
That is, the first group, the Fascist Forge low-onset early desisters (LOED), comprised 85.2% of
the sample (n = 345). This group’s posting behavior followed a cubic trajectory trend, whereby
at the onset of the group’s posting career, their posting frequency was low (a first month
intercept of around 2.5 posts) relative to the other trajectory groups in the sample, and their
posting frequency sharply declined to zero posts the very next month and remained there
throughout the remainder of the study period. This group’s posting behavior was very similar to
that of the LOED group in the Iron March data.97
Figure 2. Fascist Forge trajectories for the three-group model.
Note: MOMD = moderate-onset moderate desister, LOED = low-onset early desister, EOMD = explosive-onset
moderate desister.
In comparison, the second group in the Fascist Forge data, the moderate-onset moderate
desisters (MOMD), comprised 12.4% of the sample (n = 50). The posting behavior of this group
followed a cubic trend which, like the other two groups in the sample, underwent two directional
changes across their observed posting careers. In particular, the group’s posting frequency was
moderate at the onset of their posting career (a first month intercept of about 8 posts) relative to
the other two groups in the sample; their posting frequency declined at a relatively quick rate
the next month and then desisted at a moderate rate over the next several months, stopping at
month five and remaining at zero for the remainder of the study period.98
The final group in the sample, the explosive-onset moderate desisters (EOMD),
comprised 2.4% of the sample (n = 10). For the authors in this group, their posting behavior
followed a cubic trajectory trend during which their posting behavior underwent two directional
changes across their measurable posting careers. Specifically, at the onset of the group’s posting
career, their posting frequency was high (a first month intercept of about 17.5 posts) relative to
the other groups in the sample. Their posting frequency increased at a relatively high rate until
month two, at which point it decreased at a moderate rate, plateauing at about five posts around
the sixth month and remaining near this level for the remainder of the study period.99
The second phase of analyses utilized ANOVA and multinomial logistic regression to
understand differences in initial month of posting between trajectory groups. The results
indicated that the LOED group demonstrated the latest average initiation of posting behavior
(11.620). The EOMD group (8.600) demonstrated the earliest average onset of posting behavior
during the study period, and the MOMD group reported initiation between these two groups
(9.840). The results indicated that these between-group differences were significant at the p≤.01
level. Next, the EOMD group was omitted as the reference group for the multinomial logistic
regression. The results indicated that becoming active in the forum later was
associated with significantly greater risk of assignment to the LOED group, relative to
assignment to the EOMD group (RRR=1.168). Timing of becoming active in the forum did not
significantly differentiate risk of assignment to the MOMD group compared to the EOMD group.
Table 3 includes a description of the multinomial logistic regression results in their entirety.
Table 3. Multinomial logistic regression effects of initial active month on Fascist Forge
trajectory group assignment.
Group   Relative risk ratio   p-value   95% confidence interval
LOED    1.168                 .040      1.007, 1.354
MOMD    1.067                 .420      .911, 1.250
EOMD    -- (omitted)          --        --
Note: LOED = low-onset early desister, MOMD = moderate-onset moderate desister, EOMD = explosive-onset
moderate desister.
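The ANOVA step used for both forums reduces to a one-way F test on mean first-active month
per trajectory group. The sketch below recomputes that statistic from scratch; the onset months
are toy values for three hypothetical groups, not the study’s data:

```python
# A minimal one-way ANOVA F statistic, as used here to compare mean
# first-active month across trajectory groups.
def anova_f(groups):
    all_vals = [x for g in groups for x in g]
    n, k = len(all_vals), len(groups)
    grand_mean = sum(all_vals) / n
    # Between-group sum of squares (df = k - 1)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares (df = n - k)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Toy first-active months for three hypothetical trajectory groups:
loed, momd, eomd = [11, 12, 13, 12], [9, 10, 10, 11], [8, 9, 8, 9]
f = anova_f([loed, momd, eomd])
```

A large F relative to its critical value, as reported here at the p≤.01 level, indicates that the
groups’ average onset months differ more than within-group noise would predict.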
Discussion
This study examined the developmental pathways of online posting behavior found within the
open access sections of two of the most notoriously violent right-wing extremist forums: Iron
March and Fascist Forge. Multinomial logistic regression and ANOVA were then used to assess
whether the time of an author’s first post in the forums predicted assignment to trajectory groups.
Several conclusions can be drawn from this study.
First, most of the authors in both discussion forums were low-onset early desisters. This
is a noteworthy finding because it suggests that users who initially post in violent right-wing
extremist forums generally do not increase in their posting frequency over time. Instead, at the
onset of their posting careers, they post a low volume of messages and then stop posting as soon
as one month after initial participation. This finding aligns with empirical research suggesting
that the majority of those who post in right-wing extremist forums, particularly the collective
extremist forums for a general audience, tend to tail off their posting behavior.100 Our study
findings, however, suggest that most authors in violent right-wing extremist forums desist in
their posting activity at much faster rates than those in the generic (or non-violent) right-wing
extremist forums.101 Our study findings also suggest that users in violent right-wing extremist
spaces generally do not become more active in the platforms over time, which is in contrast to
empirical research that found a steady increase in user participation in a right-wing extremist
forum.102
However, our findings may be a symptom of the violent forums that were assessed for
the current study. It is reasonable to assume that authors who posted messages in these violent
open access spaces were wary of precisely that violence: perhaps it was the violent nature of the
discourse that turned users away from posting there, or perhaps they were concerned that, by
posting content in such open access spaces, they would become the subject of an investigation
by anti-hate watch-organizations, journalists, or even law enforcement. Furthermore, our
findings may be a symptom of the nine-month posting period that was explored in the current
study, unlike previous work that explored posting behaviors over longer periods of time.103
Regardless, this finding requires further exploration.
Second, the Iron March forum, compared to the Fascist Forge forum, contains a higher
number of posting behaviors that law enforcement and intelligence agencies may deem worthy
of future investigation. To illustrate, Fascist Forge users desisted in posting activity across all
posting trajectories, while two of the Iron March trajectories did not desist in activity and
remained active in the violent platform – a finding supported by empirical research suggesting
that posting content in right-wing extremist forums over extensive periods of time indicates a
long-term commitment to the online community and the extremist cause.104
Similarly, the proportion of users assigned to the low-onset early desister group – most likely
the least high-risk trajectory group in the sample – was substantially larger in the Fascist Forge
model than in the Iron March model. And while a potentially threatening developmental posting
behavior was identified in the Fascist Forge forum (i.e., the explosive-onset moderate desister
group), the most potentially high-risk developmental posting behavior in the Iron March forum
(i.e., the high-onset chronic group) made up a substantially larger proportion of users in the
forum than did Fascist Forge’s most potentially high-risk trajectory group. These findings may
be a symptom of the forums that were assessed for the current study. Iron March, for example,
was a relatively well-established forum that remained online for five-plus years and, until the
site went offline for unknown reasons, it received little attention from anti-hate
watch-organizations or investigative journalists for its connection to extremist violence.105
Iron March also operated during a time when public scrutiny and doxing efforts (i.e., the release
of information designed to expose an individual in question to public scrutiny) were of little
concern to those using the platform. Previous research supports this claim, in that users on Iron
March did not appear to be overly concerned that their online activities, whether recruitment
efforts or discussions about violent fascist beliefs, were subject to investigation from journalists
or law enforcement.106
Fascist Forge, which formed in response to Iron March going offline, had a
lifespan of approximately one year until it was subjected to scrutiny from journalists and anti-
hate watch-organizations for its ties to violent extremists.107 The forum also operated during a
time when those with extreme right-wing views were concerned that they might be doxed.108
Furthermore, Fascist Forge’s forum administrators vetted new users before giving them access to
post on the platform and visit private channels to ensure that only the committed could
participate. Posters on Fascist Forge were also subject to relatively strong content moderation on
the site,109 which may have impacted their posting behaviors there and the results of the current
study. Users on Fascist Forge, for example, may have been concerned with being banned from
the platform if the content that they posted did not align with the views of those moderating the
space and, in turn, their posting behaviors may have been influenced by the moderators – a
concern raised by researchers exploring the impact of content moderators on right-wing
extremist discussion forums.110
Together, it may be the case that users on Fascist Forge were
cautious – and perhaps thoughtful – about the content that they posted in the open access section
of the platform while users on Iron March were less fearful of the materials they posted. The
more potentially high-risk posting trajectories uncovered in Iron March in comparison to those in
Fascist Forge may also be explained by the fleeting and, in some ways, restrictive nature of
Fascist Forge.
Lastly, early participation in both violent right-wing extremist forums was associated
with longer patterns of high frequency posting behavior and chronic posting behavior. In other
words, becoming an active user earlier in a forum’s lifespan is associated with greater risk of
following a more chronic, potentially high-risk posting trajectory. Interestingly, it appears that
these highly embedded users, who may be characterized as ‘founding fathers’ of the forums, are
driving the high posting activity in each forum, while those who initially post later in a forum’s
lifespan are more likely to be fleeting in their posting behavior. The developmental posting
behaviors uncovered in the current study are supported by previous work that identified ‘super
contributors’ in online extremist networks,111 or two user types – a small group of ‘super users’
and a large group of ‘regular users’ – found in a right-wing extremist forum.112 We similarly
found that the super users in Iron March and Fascist Forge posted at high rates, with these users
desisting in posting activity in Fascist Forge in particular. But in contrast to previous research on
the temporal evolution of users in a generic right-wing extremist forum,113 our ‘super users’ in
Iron March did not desist in posting activity over time, a pattern also found in Singer-Emery and
Bray III’s114 assessment of posting activity in Iron March. Indeed, this trajectory group is
worthy of future investigation by law enforcement officials and intelligence agencies.
Limitations and future research
While this study offers a first step in understanding developmental patterns of posting behavior
in the Iron March and Fascist Forge forums that law enforcement officials and intelligence agencies
may consider investigating further, paired with a broader understanding of the violent forums
themselves, there are several limitations that may inform future research. The first pertains to the
GBTM method used to identify trajectory groups of posting behaviors in the forums. The
trajectory groups in these models should not be understood as concrete entities that all forum
users assigned to them follow in lockstep together. Rather, they should be understood as
approximate patterns of longitudinal change and continuity, as GBTM is used to sort individuals
with relatively similar patterns together into groups. While GBTM provides one of the more
powerful tools for managing otherwise intractable longitudinal data in a parsimonious manner,
future studies could consider random effects models such as a growth curve modelling approach
that would allow for a single overall trend with individual-specific deviations rather than
assuming that distinct groups of posters exist in the data, as is the case with GBTM.
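To make the mixture assumption concrete, the sketch below fits a schematic finite mixture of
Poisson posting trajectories by EM. It is an illustration only: Nagin-style GBTM constrains each
group’s log-rate to a polynomial in time and is typically fit with specialized software (e.g., the
Stata traj plugin), whereas this toy version leaves one free rate per group per month so the
M-step stays closed-form, and all data are simulated.

```python
import numpy as np

# Schematic GBTM analogue: a two-group finite mixture of Poisson posting
# trajectories over nine months, fit by expectation-maximization.
rng = np.random.default_rng(0)
T, K = 9, 2  # nine observed months, two latent groups

# Simulate 200 posters: a large early-desisting group and a small chronic group.
true_rates = np.array([[3, 1, 0.5, 0.3, 0.2, 0.1, 0.1, 0.1, 0.1],
                       [20, 18, 17, 16, 15, 15, 14, 14, 13]], dtype=float)
z = rng.random(200) < 0.15                  # ~15% of posters are chronic
counts = rng.poisson(true_rates[z.astype(int)])  # shape (200, 9)

pi = np.full(K, 1 / K)                      # group membership probabilities
lam = np.array([[1.0] * T, [10.0] * T])     # initial per-month Poisson rates

for _ in range(50):
    # E-step: Poisson log-likelihood of each poster under each group
    # (dropping the count-factorial constant, which cancels when normalizing).
    ll = counts @ np.log(lam).T - lam.sum(axis=1)
    ll += np.log(pi)
    resp = np.exp(ll - ll.max(axis=1, keepdims=True))
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: weighted mean count per month is each group's Poisson rate.
    pi = resp.mean(axis=0)
    lam = (resp.T @ counts) / resp.sum(axis=0)[:, None] + 1e-9

chronic = int(np.argmax(lam[:, 0]))  # group with the higher first-month rate
```

With well-separated simulated rates, the recovered first-month intercepts and group proportions
approximate the generating values, which is the sense in which trajectory groups summarize, but
do not literally partition, the data.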
A second limitation of the current study pertains to the wide variance in the proportion of
users who provided complete data for the full nine-month study period between forums. This
may have driven some of the identified developmental subgroups and observed differences in
model composition – even with both forums generally demonstrating trends toward deceleration
across all posting groups. Regardless, future research is needed to assess whether variance in the
proportion of users who provide complete data across forums does, in fact, influence predicted
trajectory group assignment.
A third limitation pertains to the modeling of only the first nine months that users were
active in the forums. While this modeling decision was made for interpretability and
comparability purposes, our analyses did not account for the posting behaviors of those who
were active in the forums beyond nine months or for extensive periods of time. Future research
should attempt to study the entire lifespans of the Iron March and Fascist Forge forums using
methods that are able to leverage all of the available data in order to assess posting behavior that
was outside the scope of the current study.
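Operationally, the nine-month window described above amounts to aligning each poster’s
activity to their own first active month and discarding anything later. A minimal sketch, with
hypothetical users and calendar months:

```python
# Align each poster's activity to their own first active month and keep only
# months 1-9 of their posting career. `posts` maps a (hypothetical) user to the
# calendar months of their individual posts.
def first_nine_months(posts: dict[str, list[int]], window: int = 9) -> dict[str, list[int]]:
    trimmed = {}
    for user, months in posts.items():
        start = min(months)
        # Re-index so each user's first active month becomes month 1.
        trimmed[user] = [m - start + 1 for m in months if m - start < window]
    return trimmed

posts = {"userA": [3, 3, 4, 15], "userB": [10, 11, 12]}
print(first_nine_months(posts))
# userA's month-15 post falls outside their first nine months and is dropped.
```

This per-user alignment is what makes posting careers comparable across users who joined the
forums at different calendar times.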
A final limitation pertains to the data itself. While this type of data is novel in its capacity
to provide details about developments of posting behavior in violent right-wing extremist
discussion forums, it is limited in the information provided. There remain many unanswered
questions about characteristics of individual posters in these forums and how heterogeneity in
these characteristics may drive differential development. Findings indicating that becoming
active earlier in the lifespan of the forums predicts more chronic posting are useful, but other
characteristics may provide
additional utility for understanding how and why users become more embedded in the forums.
While characteristics like gender and race are likely to be relatively homogeneous,
characteristics like age, country of origin, religion, and socioeconomic status may also be drivers
of development. Integrating data like these would allow for more rigorous multivariate analyses
that would provide a clearer understanding of how relevant each of these individual
characteristics are for predicting patterns of posting behavior in violent extremist forums.
Similarly, future research should include a mixed-methods approach to identify key themes that
emerge within and across patterns of posting behavior to get a sense of the content being posted.
From a policy perspective, perhaps the most relevant relationship that additional data would help
to better understand here is whether or not developmental patterns of posting behavior are
associated with perpetration of violence in the offline world. Future research should attempt to
assess this important on- and offline dynamic. Researchers, for example, could combine the
online data with offline data of a sample of known violent extremists in an effort to triangulate
their offline experiences with their online presentation of self, language, and behavior.115 This,
among other research strategies, would provide researchers, practitioners, and policymakers with
new insight into the online behaviors and actions that can spill over into the offline realm.
Notes
1. Maura Conway, Ryan Scrivens and Logan Macnair, “Right-Wing Extremists’ Persistent Online
Presence: History and Contemporary Trends,” The International Centre for Counter-Terrorism –
The Hague 10(2019): 1-24; Thomas J. Holt, Joshua D. Freilich, and Steven M. Chermak,
“Examining the Online Expression of Ideology among Far-Right Extremist Forum Users,”
Terrorism and Political Violence. Ahead of Print, 1-21.
2. See, for example, Southern Poverty Law Center, “White Homicide Worldwide,” 1 April 2014.
Available at: https://www.splcenter.org/20140401/white-homicide-worldwide (accessed 12 May,
2020).
3. See Conway et al., Right-Wing Extremists’ Persistent Online Presence.
4. Les Back, “Aryans Reading Adorno: Cyber-Culture and Twenty-First Century Racism,” Ethnic
and Racial Studies 25(2002): 628-651; Ana-Maria Bliuc, John Betts, Matteo Vergani,
Muhammad Iqbal, and Kevin Dunn, “Collective Identity Changes in Far-Right Online
Communities: The Role of Offline Intergroup Conflict,” New Media and Society 21(2019): 1770-
1786; Val Burris, Emery Smith, and Ann Strahm, “White Supremacist Networks on the
Internet,” Sociological Focus 33(2000): 215-235; Willem De Koster and Dick Houtman,
“‘Stormfront is Like a Second Home to Me,’” Information, Communication and Society
11(2008): 1155-1176; Robert Futrell and Pete Simi, “Free Spaces, Collective Identity, and the
Persistence of U.S. White Power Activism,” Social Problems 51(2004): 16-42; Holt et al.,
“Examining the Online Expression of Ideology among Far-Right Extremist Forum Users”; Ryan
Scrivens, Garth Davies, and Richard Frank, “Measuring the Evolution of Radical Right-Wing
Posting Behaviors Online,” Deviant Behavior 41(2018): 216-232; Ryan Scrivens, “Exploring
Radical Right-Wing Posting Behaviors Online,” Deviant Behavior. Ahead of Print, 1-15;
Magdalena Wojcieszak, “‘Don’t Talk to Me’: Effects of Ideologically Homogeneous Online Groups
and Politically Dissimilar Offline Ties on Extremism,” New Media and Society 12(2010): 637-
655.
5. Mattias Ekman, “Anti-Refugee Mobilization in Social Media: The Case of Soldiers of Odin,”
Social Media and Society 4(2018): 1-11; Lella Nouri and Nuria Lorenzo-Dus, “Investigating
Reclaim Australia and Britain First’s Use of Social Media: Developing a New Model of
Imagined Political Communities Online,” Journal for Deradicalization 18(2019): 1-37;
Sebastian Stier, Lisa Posch, Arnim Bleier, and Markus Strohmaier, “When Populists Become
Popular: Comparing Facebook Use by the Right-Wing Movement Pegida and German Political
Parties,” Information, Communication and Society 20(2017): 1365-1388.
6. J. M. Berger, Nazis vs. ISIS on Twitter: A Comparative Study of White Nationalist and ISIS
Online Social Media Networks (Washington, DC: The George Washington University Program
on Extremism, 2016); J. M. Berger and Bill Strathearn, Who Matters Online: Measuring
Influence, Evaluating Content and Countering Violent Extremism in Online Social Networks
(London, UK: The International Centre for the Study of Radicalisation and Political Violence,
2013); Pete Burnap and Matthew L. Williams, “Cyber Hate Speech on Twitter: An Application
of Machine Classification and Statistical Modeling for Policy and Decision,” Policy and Internet
7(2015): 223-242; Roderick Graham, “Inter-Ideological Mingling: White Extremist Ideology
Entering the Mainstream on Twitter,” Sociological Spectrum 36(2016): 24-36.
7. Mattias Ekman, “The Dark Side of Online Activism: Swedish Right-Wing Extremist Video
Activism on YouTube,” MedieKultur: Journal of Media and Communication Research
30(2014): 21-34; Derek O’Callaghan, Derek Greene, Maura Conway, Joe Carthy, and Pádraig
Cunningham, “Down the (White) Rabbit Hole: The Extreme Right and Online Recommender
Systems,” Social Science Computer Review 33(2014): 1-20.
8. Joel Finkelstein, Savvas Zannettou, Barry Bradlyn, and Jeremy Blackburn, “A Quantitative
Approach to Understanding Online Antisemitism,” arXiv:1809.01644, 2018; Antonis Papasavva,
Savvas Zannettou, Emiliano De Cristofaro, Gianluca Stringhini, and Jeremy Blackburn, “Raiders
of the Lost Kek: 3.5 Years of Augmented 4chan Posts from the Politically Incorrect Board,”
arXiv:2001.07487, 2020.
9. Savvas Zannettou, Barry Bradlyn, Emiliano De Cristofaro, Haewoon Kwak, Michael
Sirivianos, Gianluca Stringhini, and Jeremy Blackburn, “What is Gab: A Bastion of Free Speech
or an Alt-Right Echo Chamber,” Proceedings of the WWW ’18: Companion Proceedings of The
Web Conference 2018; Yuchen Zhou, Mark Dredze, David A. Broniatowski, and William D.
Adler, “Elites and Foreign Actors Among the Alt-Right: The Gab Social Media Platform,” First
Monday 24(2019).
10. Gabriel Weimann and Natalie Masri, “Research Note: Spreading Hate on TikTok,” Studies in
Conflict & Terrorism. Ahead of Print, 1-14.
11. Jakob Guhl and Jacob Davey, A Safe Space to Hate: White Supremacist Mobilisation on
Telegram (London, UK: Institute for Strategic Dialogue, 2020); Aleksandra Urman and Stefan
Katz, “What they do in the Shadows: Examining the Far-Right Networks on Telegram,”
Information, Communication & Society. Ahead of Print, 1-20.
12. Jacob Davey, Mackenzie Hart, and Cécile Guerin, An Online Environmental Scan of Right-
Wing Extremism in Canada (London, UK: Institute for Strategic Dialogue, 2020).
13. Jacques Singer-Emery and Rex Bray III, “The Iron March Data Dump Provides a Window
Into How White Supremacists Communicate and Recruit,” Lawfare, 27 February 2020.
Available at: https://www.lawfareblog.com/iron-march-data-dump-provides-window-how-white-
supremacists-communicate-and-recruit (accessed 2 July 2020).
14. Examples include Mack Lamoureux, “Fascist Forge, the Online Neo-Nazi Recruitment Forum,
is Down,” Vice News, 15 February 2019. Available at
https://www.vice.com/en_ca/article/43zn8j/fascist-forge-the-online-neo-nazi-recruitment-forum-
is-down (accessed 24 June 2020); Mack Lamoureux and Ben Makuch, “Online Neo-Nazis are
Increasingly Embracing Terror Tactics,” Vice News, 28 January 2019. Available at
https://www.vice.com/en_ca/article/8xynq4/online-neo-nazis-are-increasingly-embracing-terror-
tactics (accessed 24 June 2020); Team Ross, “Transnational White Terror: Exposing
Atomwaffen and the Iron March Networks,” Bellingcat, 19 December 2019. Available at
https://www.bellingcat.com/news/2019/12/19/transnational-white-terror-exposing-atomwaffen-
and-the-iron-march-networks (accessed 24 June 2020); Jason Wilson, “Leak from Neo-Nazi Site
Could Identify Hundreds of Extremists Worldwide,” The Guardian, 7 November 2019.
Available at https://www.theguardian.com/us-news/2019/nov/07/neo-nazi-site-iron-march-
materials-leak (accessed 24 June 2020).
15. See Conway et al., Right-Wing Extremists’ Persistent Online Presence; see also Holt et al.,
“Examining the Online Expression of Ideology among Far-Right Extremist Forum Users.”
16. Maura Conway, “Determining the Role of the Internet in Violent Extremism and Terrorism:
Six Suggestions for Progressing Research,” Studies in Conflict & Terrorism 40(2016): 77-98;
Bennett Kleinberg, Isabelle van der Vegt, and Paul Gill, “The Temporal Evolution of a Far‑Right
Forum,” Journal of Computational Social Science. Ahead of Print, 1-23.
17. Ibid.
18. See Conway et al., Right-Wing Extremists’ Persistent Online Presence.
19. Conway, “Determining the Role of the Internet in Violent Extremism and Terrorism.”
20. Jacob Davey and Julia Ebner, The Fringe Insurgency: Connectivity, Convergence and
Mainstreaming of the Extreme Right (London, UK: Institute for Strategic Dialogue, 2017).
21. Davey et al., An Online Environmental Scan of Right-Wing Extremism in Canada.
22. Savvas Zannettou, Joel Finkelstein, Barry Bradlyn, and Jeremy Blackburn, “A Quantitative
Approach to Understanding Online Antisemitism,” 2019, arXiv:1809.01644.
23. Holt et al., “Examining the Online Expression of Ideology among Far-Right Extremist Forum
Users.”
24. Bliuc et al., “Collective Identity Changes in Far-Right Online Communities”; Leo Figea, Lisa
Kaati, and Ryan Scrivens, “Measuring Online Affects in a White Supremacy Forum,”
Proceedings of the 2016 IEEE International Conference on Intelligence and Security
Informatics, Tucson, Arizona, USA; Philippa Levey and Martin Bouchard, “The Emergence of
Violent Narratives in the Life-Course Trajectories of Online Forum Participants,” Journal of
Qualitative Criminal Justice and Criminology 7(2019): 95-121; Logan Macnair and Richard
Frank, “Changes and Stabilities in the Language of Islamic State Magazines: A Sentiment
Analysis,” Dynamics of Asymmetric Conflict 11(2018): 109-120; Andrew J. Park, Brian Beck,
Darrick Fletcher, Patrick Lam, and Herbert H. Tsang, “Temporal Analysis of Radical Dark Web
Forum Users,” Proceedings of the 2016 IEEE/ACM International Conference on Advances in
Social Networks Analysis and Mining, San Francisco, CA, USA; Matteo Vergani and Ana-Maria
Bliuc, “The Evolution of the ISIS’ Language: A Quantitative Analysis of the Language of the
First Year of Dabiq Magazine,” Sicurezza, Terrorismo e Società 2(2015): 7-20; Scrivens et al.,
“Measuring the Evolution of Radical Right-Wing Posting Behaviors Online.”
25. Swati Agarwal and Ashish Sureka, “Using KNN and SVM Based One-Class Classifier for
Detecting Online Radicalization on Twitter,” Proceedings of the International Conference on
Distributed Computing and Internet Technology, Bhubaneswar, India; Adam Bermingham,
Maura Conway, Lisa McInerney, Neil O’Hare, and Alan F. Smeaton, “Combining Social
Network Analysis and Sentiment Analysis to Explore the Potential for Online Radicalisation,”
Proceedings of the 2009 International Conference on Advances in Social Network Analysis and
Mining, Athens, Greece; Hsinchun Chen, “Sentiment and Affect Analysis of Dark Web Forums:
Measuring Radicalization on the Internet,” Proceedings of the 2008 IEEE International
Conference on Intelligence and Security Informatics, Taipei, Taiwan; Emilio Ferrara,
“Contagion Dynamics of Extremist Propaganda in Social Networks,” Information Sciences
418-419(2017): 1-12; Emilio Ferrara, Wen-Qiang Wang, Onur Varol, Alessandro Flammini, and
Aram Galstyan, “Predicting Online Extremism, Content Adopters, and Interaction Reciprocity,”
Proceedings of the International Conference on Social Informatics, Berlin, Germany; Ted
Grover and Gloria Mark, “Detecting Potential Warning Behaviors of Ideological Radicalization
in an Alt-Right Subreddit,” Proceedings of the Thirteenth International AAAI Conference on
Web and Social Media, Munich, Germany, 2019; Benjamin W. K. Hung, Anura P. Jayasumana,
and Vidarshana W. Bandara, “Detecting Radicalization Trajectories Using Graph Pattern
Matching Algorithms,” Proceedings of the 2016 IEEE International Conference on Intelligence
and Security Informatics, Tucson, Arizona, USA; Benjamin W. K. Hung, Anura P. Jayasumana,
and Vidarshana W. Bandara, “Pattern Matching Trajectories for Investigative Graph Searches,”
Proceedings of the 2016 IEEE International Conference on Data Science and Advanced
Analytics, Montreal, Canada.
26. Scrivens, “Exploring Radical Right-Wing Posting Behaviors Online.”
27. A notable exception includes Kleinberg et al., “The Temporal Evolution of a Far‑Right
Forum.”
28. Ibid.
29. Conway et al., Right-Wing Extremists’ Persistent Online Presence.
30. Scrivens et al., “Measuring the Evolution of Radical Right-Wing Posting Behaviors Online.”
31. Manuela Caiani and Patricia Kröll, “The Transnationalization of the Extreme Right and the
Use of the Internet,” International Journal of Comparative and Applied Criminal Justice
39(2014): 331-351; Futrell and Simi, “Free Spaces, Collective Identity, and the Persistence of
U.S. White Power Activism”; Wojcieszak, “‘Don’t Talk to Me’.”
32. Scrivens et al., “Measuring the Evolution of Radical Right-Wing Posting Behaviors Online.”
33. Alfred Blumstein, Jacqueline Cohen, Jeffrey A. Roth, and Christy A. Visher, Criminal
Careers and ‘Career Criminals’ (Washington, DC: National Academy Press, 1986).
34. Terrie E. Moffitt, “Adolescence-Limited and Life-Course-Persistent Antisocial Behavior: A
Developmental Taxonomy,” Psychological Review 100(1993): 674-701.
35. See Sytske Besemer, Johan Axelsson, and Jerzy Sarnecki, “Intergenerational Transmission of
Trajectories of Offending Over Three Generations,” Journal of Developmental and Life-Course
Criminology 2(2016): 417-441; see also Abdullah Cihan, Jonathan Sorensen, and Kimberly A.
Chism, “Analyzing the Offending Activity of Inmates: Trajectories of Offense Seriousness,
Escalation, and De-Escalation,” Journal of Criminal Justice 50(2017): 12-18.
36. Matt DeLisi and Alex R. Piquero, “New Frontiers in Criminal Career Research, 2000-2011: A
State-of-the-Art Review,” Journal of Criminal Justice 39(2011): 289-301.
37. Ibid.
38. Daniel S. Nagin and Kenneth C. Land, “Age, Criminal Careers, and Population Heterogeneity:
Specification and Estimation of a Nonparametric, Mixed Poisson Model,” Criminology 31(1993):
327-362.
39. See Ibid.
40. See Daniel S. Nagin and Richard E. Tremblay, “Trajectories of Boys’ Physical Aggression,
Opposition, and Hyperactivity on the Path to Physically Violent and Nonviolent Juvenile
Delinquency,” Child Development 70(1999): 1181-1196.
41
Gary LaFree, Nancy A. Morris, Laura Dugan, and Susan Fahey, “Identifying Cross-National
Global Terrorist Hot Spots,” in Jeffrey Victoroff, Ed., in Tangled Roots: Social and
Psychological Factors in the Genesis of Terrorism (Amsterdam: IOS Press, 2006), pp. 98-114;
Gary LaFree, Sue-Ming Yang, and Martha Crenshaw, “Trajectories of Terrorism: Attack
Patterns of Foreign Groups that have Targeted the United States, 1970-2004,” Criminology and
Public Policy 8(2009): 445-473; Gary LaFree, Nancy A. Morris, and Laura Dugan, “Cross-
National Patterns of Terrorism: Comparing Trajectories for Total, Attributed and Fatal Attacks,
1970-2006,” British Journal of Criminology 50(2010): 622-649; Nancy A. Morris, Lee Ann and
Slocum, “Estimating Country-Level Terrorism Trends using Group Based Trajectory Analysis:
Latent Glass Growth Analysis and General Mixture Modeling,” Journal of Quantitative
Criminology 28(2012): 103-139.
42
Mark S. Hamm, The Spectacular Few: Prisoner Radicalization and the Evolving Terrorist
Threat (New York: New York University Press, 2013); Joseph A. Schafer, Christopher W.
Mullin, and Stephanie Box, “Awakening: The Emergence of White Supremacist Ideologies,”
Deviant Behavior 35(2014): 173-196.
43
Ashmini G. Kerodal, Joshua D. Freilich, Steven M. Chermak, and Michael J. Suttmoeller, “A
Test of Sprinzak’s Split Delegitimization’s Theory of the Life Course of Far-Right
Organizational Behavior,” International Journal of Comparative and Applied Criminal Justice
39(2014): 307-329.
44
Tore Bjørgo, “Strategies for Preventing Terrorism (New York: Palgrave, 2013); Bryan F.
Bubolz and Pete Simi, “Leaving the World of Hate: Life-Course Transitions and Self-Change,”
American Behavioral Scientist 59(2015): 1588-1608.
45
A notable exception includes Scrivens et al., “Measuring the Evolution of Radical Right-Wing
Posting Behaviors Online.”
46
Scott Helfstein and Dominick Wright, “Covert or Convenient? Evolution of Terror Attack
Networks,” Journal of Conflict Resolution 55(2011): 785-813; Marie Ouellet and Martin
Bouchard, “The 40 Members of the Toronto 18: Group Boundaries and the Analysis of Illicit
Networks,” Deviant Behavior 39(2018): 1467-1482.
47
Akhlaq Ahmad, “The Ties that Bind and Blind: Embeddedness and Radicalisation of Youth in
One Islamist Organisation in Pakistan,” The Journal of Development Studies 52(2016): 5-21;
Emily Corner, Noémie Bouhana, and Paul Gill, “The Multifinality of Vulnerability Indicators in
Lone-Actor Terrorism,” Psychology, Crime & Law 25(2019): 111-132; Lasse
Lindekilde, Stefan Malthaner, and Francis O’Connor, “Peripheral and Embedded: Relational
Patterns of Lone-Actor Terrorist Radicalization,” Dynamics of Asymmetric Conflict 12(2019):
20-41.
48
Froukje Demant, Marieke Slootman, Frank Buijs, and Jean Tillie, Decline and
Disengagement: An Analysis of Processes of Deradicalisation (Amsterdam: Institute for
Migration and Ethnic Studies, 2008).
49
Mary Beth Altier, Emma Leonard Boyle, and John Horgan, “Terrorist Transformations: The
Link Between Terrorist Roles and Terrorist Disengagement,” Studies in Conflict & Terrorism,
Ahead of Print; Tore Bjørgo and John Horgan, Eds., Leaving Terrorism Behind: Individual and
Collective Perspectives (London: Routledge, 2009).
50
Ryan Scrivens, Paul Gill, and Maura Conway, “The Role of the Internet in Facilitating Violent
Extremism and Terrorism: Suggestions for Progressing Research,” in Thomas J. Holt and Adam
Bossler, Eds., The Palgrave Handbook of International Cybercrime and Cyberdeviance
(London, UK: Palgrave, 2020), pp. 1-22.
51
See Scrivens, “Exploring Radical Right-Wing Posting Behaviors Online.”
52
See Ibid.
53
Kleinberg et al., “The Temporal Evolution of a Far‑Right Forum.”
54
J. M. Berger, Extremism (Cambridge, MA: The MIT Press, 2018).
55
See Ibid.
56
See Conway et al., Right-Wing Extremists’ Persistent Online Presence.
57
A total of 3,447 posts could not be attributed to a poster and were thus coded as missing data
(1.77% of all posts in the Iron March data).
58
For more information on the web-crawler, see Ryan Scrivens, Tiana Gaudette, Garth Davies,
and Richard Frank, “Searching for Extremist Content Online Using The Dark Crawler and
Sentiment Analysis,” in Mathieu Deflem and Derek M. D. Silva, Eds., Methods of Criminology
and Criminal Justice Research (Bingley, UK: Emerald Publishing, 2019), pp. 179-194.
59
Anti-Defamation League, “Fascist Forge: A New Forum for Hate,” Anti-Defamation League,
15 January 2019. Available at https://www.adl.org/blog/fascist-forge-a-new-forum-for-hate
(accessed 24 June 2020); Florence Keen, “How the Global Far-Right Makes Use of Social
Networking,” Global Network on Extremism and Technology Insights, 31 December 2019.
Available at https://gnet-research.org/2019/12/31/from-iron-march-to-fascist-forge-how-the-
global-far-right-makes-use-of-social-networking (accessed 24 June 2020).
60
Lamoureux, “Fascist Forge, the Online Neo-Nazi Recruitment Forum, is Down”; Wilson,
“Leak from Neo-Nazi Site Could Identify Hundreds of Extremists Worldwide.”
61
James Poulter, “The Obscure Neo-Nazi Forum Linked to a Wave of Terror,” Vice, 12 March
2018. Available at https://www.vice.com/en_uk/article/437pkd/the-obscure-neo-nazi-forum-
linked-to-a-wave-of-terror (accessed 24 June 2020).
62
Team Ross, “Transnational White Terror.”
63
Jacob Ware, “Siege: The Atomwaffen Division and Rising Far-Right Terrorism in the United
States,” The International Centre for Counter-Terrorism – The Hague 7(2019): 1-20.
64
Anti-Defamation League, “Fascist Forge”; Lamoureux, “Fascist Forge, the Online Neo-Nazi
Recruitment Forum, is Down.”
65
Michael Edison Hayden, “Visions of Chaos: Weighing the Violent Legacy of Iron March,”
Southern Poverty Law Center, 15 February 2019. Available at
https://www.splcenter.org/hatewatch/2019/02/15/visions-chaos-weighing-violent-legacy-iron-
march (accessed 24 June 2020).
66
Wilson, “Leak from Neo-Nazi Site Could Identify Hundreds of Extremists Worldwide.”
67
Lamoureux, “Fascist Forge, the Online Neo-Nazi Recruitment Forum, is Down.”
68
Anti-Defamation League, “Fascist Forge”; Michael Edison Hayden, “Mysterious Neo-Nazi
Advocated Terrorism for Six Years Before Disappearance,” Southern Poverty Law Center, 21
May 2019. Available at https://www.splcenter.org/hatewatch/2019/05/21/mysterious-neo-nazi-
advocated-terrorism-six-years-disappearance (accessed 24 June 2020).
69
Davey et al., An Online Environmental Scan of Right-Wing Extremism in Canada.
70
Ibid.
71
Ibid.
72
Lamoureux and Makuch, “Online Neo-Nazis are Increasingly Embracing Terror Tactics.”
73
Davey et al., An Online Environmental Scan of Right-Wing Extremism in Canada.
74
Lamoureux and Makuch, “Online Neo-Nazis are Increasingly Embracing Terror Tactics.”
75
Lamoureux, “Fascist Forge, the Online Neo-Nazi Recruitment Forum, is Down.”
76
Ibid.; Lamoureux and Makuch, “Online Neo-Nazis are Increasingly Embracing Terror
Tactics.”
77
Lizzie Dearden, “Teenage Neo-Nazi Convicted of Planning Terror Attack Targeting
Synagogues as Part of ‘Race War’,” Independent, 20 November 2019. Available at
https://www.independent.co.uk/news/uk/crime/synagogue-attack-durham-terror-neo-nazi-race-
war-antisemitism-a9210856.html (accessed 24 June 2020).
78
Daniel De Simone and Ali Winston, “Neo-Nazi Militant Group Grooms Teenagers,” BBC
News, 22 June 2020. Available at https://www.bbc.com/news/uk-53128169 (accessed 24 June
2020).
79
Counter Extremism Project, “Extremist Content Online: U.K. Neo-Nazi Active on Online
Forum Fascist Forge Arrested,” 25 November 2019. Available at
https://www.counterextremism.com/press/extremist-content-online-uk-neo-nazi-active-online-
forum-fascist-forge-arrested (accessed 24 June 2020).
80
Participants who were not active in a forum for a total of nine months during their posting
careers were coded as missing for any months for which they could not have provided data.
81
To address potential concerns regarding imputation of data for users who did not provide data
for all nine months examined in the analyses, sensitivity analyses were conducted wherein all
users who had any missing data within their first nine months of activity in a forum were
excluded from the analyses. Results were analogous to those found in the main analyses, thus
supporting the robustness of the observed findings.
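The exclusion rule behind this sensitivity analysis can be sketched as follows (a minimal illustration with hypothetical field names, not the authors' code):

```python
# Sketch of the sensitivity check described above: re-run analyses only on
# users with complete data for their first nine months of forum activity.
# The "monthly_posts" field name is hypothetical; None marks a missing month.

def complete_cases(users, months=9):
    """Keep users whose first `months` observations contain no missing values."""
    return [u for u in users
            if all(m is not None for m in u["monthly_posts"][:months])]

users = [
    {"id": "a", "monthly_posts": [1, 2, 0, 4, 5, 1, 0, 2, 3]},
    {"id": "b", "monthly_posts": [1, None, 0, 4, 5, 1, 0, 2, 3]},
]
retained = complete_cases(users)  # only user "a" is kept
```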
82
Thomas W. Wojciechowski, “PTSD as a Risk Factor for the Development of Violence among
Juvenile Offenders: A Group-Based Trajectory Modeling Approach,” Journal of Interpersonal
Violence 35(2017): 2511-2535.
83
The top-coding of the posting month variable resulted in 331 monthly post counts above 80
being re-assigned to the “80” category, comprising 2.13% of the total post data across all time
points. Including data that were already coded as “80” resulted in a ceiling category with an N of 342, which
comprised 2.24% of the total post data. The maximum value in the raw data was a single
instance of one user who posted 421 times in a single month.
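The top-coding step described in this note can be sketched as follows (a minimal illustration using the ceiling of 80 reported in the note, not the authors' code):

```python
# Top-code (right-censor) monthly post counts at a ceiling of 80:
# any count above 80 is re-assigned to 80.

def top_code(monthly_counts, ceiling=80):
    """Censor each monthly post count at the ceiling value."""
    return [min(count, ceiling) for count in monthly_counts]

# Hypothetical nine-month posting series; 421 mirrors the maximum raw
# value reported in the note.
raw = [3, 0, 12, 421, 95, 80, 7, 0, 1]
capped = top_code(raw)  # counts of 421 and 95 become 80; 80 itself is unchanged
```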
84
For each of the nine months, the following is the average proportion of users in the sample that
were censored at 80 posts per month: month 1 = 3.30%; month 2 = 3.14%; month 3 = 2.30%;
month 4 = 2.37%; month 5 = 2.45%; month 6 = 1.82%; month 7 = 2.38%; month 8 = 1.64%;
month 9 = 1.36%.
85
Nagin and Land, “Age, Criminal Careers, and Population Heterogeneity.”
86
Daniel S. Nagin, Group-Based Modeling of Development (Cambridge, MA: Harvard
University Press, 2005).
87
Ibid.
88
Nagin, Group-Based Modeling of Development.
89
Bobby L. Jones and Daniel S. Nagin, “Advances in Group-Based Trajectory Modeling and a
SAS Procedure for Estimating Them,” Sociological Methods and Research 35(2007): 542-571.
90
Wide confidence intervals for a group indicate that there is substantially more variance within
that group, which raises questions about the degree to which the individual trajectories of
participants assigned to that group resemble the path of the trajectory group.
91
The zero-inflated Poisson model was chosen as the probability distribution for modeling the
data because of the heavy right-skew of the pooled data. This model comprised one group
characterized by a linear growth function and three groups that were characterized by a quadratic
growth function. This model also provided better fit according to the obtained BIC statistics than
did the two-group and three-group models (i.e., two-group BIC = -116,311.03; three-group BIC =
-56,105.24; four-group BIC = -50,949.58). While a five-group model indicated better fit to the
data based on the BIC, the addition of a fifth group added no further nuance to the model. The
fifth group was nearly identical to the third group in the model, as it demonstrated a low
first-month intercept and an almost immediate decline to zero. For this reason, the four-group model
was chosen as the best fitting model. Finally, the chosen model also met all additional criteria
that allowed for it to be the best fitting model.
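The BIC comparison in this model-selection procedure can be illustrated with a small sketch (BIC values are from this note; in group-based trajectory modeling the model with the highest, i.e. least negative, BIC is preferred):

```python
# Pick the best-fitting trajectory model by BIC, following the GBTM
# convention that the highest (least negative) BIC indicates better fit.

def best_by_bic(bic_by_model):
    """Return the label of the model with the highest BIC."""
    return max(bic_by_model, key=bic_by_model.get)

# BIC values reported for the Iron March model comparison.
bics = {
    "two-group": -116_311.03,
    "three-group": -56_105.24,
    "four-group": -50_949.58,
}
best = best_by_bic(bics)  # "four-group"
```

As the note explains, a purely numeric BIC rule would have favored a five-group model here; the authors overrode it because the fifth group duplicated an existing trajectory, so this sketch captures only the BIC comparison, not the full set of selection criteria.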
92
This group also met additional criteria related to the AvePP and OCC (AvePP = .997; OCC =
1,956.894).
93
This group also met additional criteria related to the AvePP and OCC (AvePP = .987; OCC =
404.923).
94
This group also met additional criteria related to the AvePP and OCC (AvePP = .995; OCC =
1,061.333).
95
This group also met additional criteria related to the AvePP and OCC (AvePP = .999; OCC =
213,328.00).
96
The iterative GBTM process resulted in the identification of a three-group model, as it
provided the best fit to the longitudinal posting data; the zero-inflated Poisson model was
chosen as the probability distribution because of the heavy right-skew of the data. This
three-group model provided better fit based on the BIC than the one-group, two-group, and
four-group models (i.e., one-group BIC = -4,933.24; two-group BIC = -3,017.74; three-group
BIC = -2,637.81; four-group BIC = -2,648.46). The chosen model comprised three groups that
were best represented by a cubic polynomial function. This model also met all additional fit
criteria that allowed for it to be chosen as the best fitting model.
97
This group also met additional criteria related to the AvePP and OCC (AvePP = .997; OCC =
1525.865).
98
This group also met additional criteria related to AvePP and OCC (AvePP = .975; OCC =
179.063).
99
This group also met additional criteria related to AvePP and OCC (AvePP = .984; OCC =
282.369).
100
Examples include Kleinberg et al., “The Temporal Evolution of a Far‑Right Forum”; Scrivens
et al., “Measuring the Evolution of Radical Right-Wing Posting Behaviors Online.”
101
Ibid.
102
Scrivens et al., “Measuring the Evolution of Radical Right-Wing Posting Behaviors Online.”
103
Kleinberg et al., “The Temporal Evolution of a Far‑Right Forum”; Scrivens et al., “Measuring
the Evolution of Radical Right-Wing Posting Behaviors Online.”
104
Scrivens, “Exploring Radical Right-Wing Posting Behaviors Online.”
105
Notable exceptions include Kyle Scott Clauss, “Neo-Nazi Posters Spotted on Boston
University’s Campus,” Boston Magazine, 2 May 2016. Available at
https://www.bostonmagazine.com/news/2016/05/02/neo-nazi-posters-boston-university
(accessed 1 July 2020); David Neiwert, “Illinois Woman with Neo-Nazi Leanings Charged in
Canadian Mass Murder Plot,” Southern Poverty Law Center, 18 February 2015. Available at
https://www.splcenter.org/hatewatch/2015/02/18/illinois-woman-neo-nazi-leanings-charged-
canadian-mass-murder-plot (accessed 1 July 2020).
106
See Singer-Emery and Bray III, “The Iron March Data Dump Provides a Window Into How
White Supremacists Communicate and Recruit.”
107
Lamoureux, “Fascist Forge, the Online Neo-Nazi Recruitment Forum, is Down”; Lamoureux
and Makuch, “Online Neo-Nazis are Increasingly Embracing Terror Tactics.”
108
For more on doxing the extreme right, see Steven Blum, “Doxxing White Supremacists is
Making Them Terrified,” Vice News, 15 August 2017. Available at
https://www.vice.com/en_us/article/7xxbez/doxxing-white-supremacists-is-making-them-
terrified (accessed 2 July 2020); see also Vegas Tenold, “To Doxx a Racist,” The New Republic,
26 July 2018. Available at https://newrepublic.com/article/150159/doxx-racist (accessed 2 July
2020).
109
Davey et al., An Online Environmental Scan of Right-Wing Extremism in Canada.
110
See, for example, Jessie Daniels, Cyber Racism: White Supremacy Online and the New Attack
on Civil Rights (Lanham, MD: Rowman and Littlefield Publishers, 2009); see also Priscilla M.
Meddaugh and Jack Kay, “Hate Speech or ‘Reasonable Racism?’ The Other in Stormfront,”
Journal of Mass Media Ethics 24(2009): 251-268.
111
J.M. Berger and Jonathon Morgan, The ISIS Twitter Census (Washington, DC: Brookings
Institution, 2015); Benjamin Ducol, “Uncovering the French-Speaking Jihadisphere: An
Exploratory Analysis,” Media, War and Conflict 5(2012): 51-70.
112
Kleinberg et al., “The Temporal Evolution of a Far‑Right Forum.”
113
Ibid.
114
Singer-Emery and Bray III, “The Iron March Data Dump Provides a Window Into How
White Supremacists Communicate and Recruit.”
115
Scrivens et al., “The Role of the Internet in Facilitating Violent Extremism and Terrorism.”
... In addition, research on violent online political extremism has been concerned about the extent to which individuals are immersed in violent extremism online, with a particular focus on the relationship between the impact of extremist online content and violent radicalization (see Scrivens et al., 2020a). But overlooked in this regard has been an assessment of how immersion and participation in online extremist spaces influence users' posting behaviors (Scrivens et al., 2020c). ...
... They also attempt to engage with and motivate others to participate in fanatical discussions and build a sense of community (Scrivens, 2020), and they are often the 'super contributors' (i.e., those who dominate the discussions) and those whose first post dates back significantly further than the average user, both in forums (Kleinberg et al., 2020) and sub-forums (Scrivens, 2020). Research similarly suggests that early participation in violent RWE forums is associated with longer patterns of high frequency and chronic posting behavior (see Scrivens et al., 2020c). However, this literature does not differentiate posting behaviors by users' violence status (i.e., violent versus non-violent extremist), nor does it disaggregate and compare posting behaviors at a sub-forum and forum level or examine whether time of initial activity in the lifespan of a sub-forum influence users' sub-forum and forum posting behaviors. ...
Article
Full-text available
There is an ongoing need for researchers, practitioners, and policymakers to detect and assess online posting behaviors of violent extremists prior to their engagement in violence offline, but little is empirically known about their online behaviors generally or the differences in their behaviors compared with nonviolent extremists who share similar ideological beliefs particularly. In this study, we drew from a unique sample of violent and nonviolent right-wing extremists to compare their posting behaviors in the largest White supremacy web-forum. We used logistic regression and sensitivity analysis to explore how users’ time of entry into the lifespan of an extremist sub-forum and their cumulative posting activity predicted their violence status. We found a number of significant differences in the posting behaviors of violent and nonviolent extremists which may inform future risk factor frameworks used by law enforcement and intelligence agencies to identify credible threats online.
... Further, the recent attack on the US Capitol and protests in Charlottesville that resulted in multiple deaths are both known to have been planned in online communities including large platforms such as Twitter and Parler (Prabhu et al. 2021). The importance of timely interventions on toxic content has been further highlighted by Scrivens et al. (Scrivens, Wojciechowski, and Frank 2020) who showed that there existed a gradual increase in the approval of toxic content in response to consistent toxic posting by community members. These results are in line with other studies showing how communities can become more extreme over time (Simi and Futrell 2015;Wojcieszak 2010;Caiani and Kröll 2015;Wright, Trott, and Jones;Ribeiro et al. 2020). ...
Preprint
Full-text available
Most platforms, including Reddit, face a dilemma when applying interventions such as subreddit bans to toxic communities -- do they risk angering their user base by proactively enforcing stricter controls on discourse or do they defer interventions at the risk of eventually triggering negative media reactions which might impact their advertising revenue? In this paper, we analyze Reddit's previous administrative interventions to understand one aspect of this dilemma: the relationship between the media and administrative interventions. More specifically, we make two primary contributions. First, using a mediation analysis framework, we find evidence that Reddit's interventions for violating their content policy for toxic content occur because of media pressure. Second, using interrupted time series analysis, we show that media attention on communities with toxic content only increases the problematic behavior associated with that community (both within the community itself and across the platform). However, we find no significant difference in the impact of administrative interventions on subreddits with and without media pressure. Taken all together, this study provides evidence of a media-driven moderation strategy at Reddit and also suggests that such a strategy may not have a significantly different impact than a more proactive strategy.
... While AWD's violent plans were mostly thwarted by a combination of law enforcement intervention and their own operational incompetence, their propaganda operations both on and offline represent the kind of virtual/physical hybridity that typifies contemporary white nationalist propaganda efforts (see Reid & Valasik, 2020;Perliger, 2020;Miller-Idriss, 2020 for related discussions). Blending an awareness of internet culture with traditional white nationalist rhetorical and visual tropes, AWD's propaganda efforts found success among an online cohort organized on Iron March, a now-defunct fascist internet forum (also see Scrivens, Wojciechowski, & Frank, 2020). Despite their organizational failings, AWD's aesthetic contributions to the white nationalist movement continue to be seen in the propaganda efforts of groups like the Order of the Nine Angels (O9A), the National Socialist Order (NSO), and AWD affiliates in over a dozen countries (The Soufan Center, 2020). ...
Article
Atomwaffen Division (AWD) was an American neo-nazi extremist organization active between 2015 and 2020. Even after its disbanding, AWD’s influence can be felt in the symbols and media produced by other organizations in the white power space. This paper contributes to the ongoing study of violent extremism and media by categorizing, defining, and analysing AWD’s official video releases and visual/audio style. First, we explore the history and significance of AWD’s video media output, noting how AWD took advantage of existing internet aesthetics to connect their messaging to its target audience. Then, we analyse AWD’s official video releases, showing how they express white power ideology in ways which have continued to be influential after the group’s dissolution, along with ongoing implications.
Article
Full-text available
This paper examines patterns of posting behaviour on extremist online forums in order to empirically identify and define classes of highly active ‘super-posters'. Using a unique dataset of 8 far-right, 7 Salafi-jihadist, and 2 Incel forums, totalling 12,569,639 unique posts, the study operates a three-dimensional analysis of super-posters (Gini coefficient, Fisher-Jenks algorithm, network analysis) that sheds light on the type of influence at play in these online spaces. Our study shows that extremist forums consistently display four statistically distinguishable classes of posters from the least active ‘hypo-posters' to the most active ‘hyper-posters', as well as demonstrating that, while hyper-posters’ activity is remarkable, they are not necessarily the most central or connected members of extremist forums. These findings, which suggest that extremist forums are places where both minority and majority influences occur, not only advance our understanding of a key locus of online radicalisation; they also pave the way for sounder interventions to monitor and disrupt the phenomenon. © 2022 The Author(s). Published by Informa UK Limited, trading as Taylor & Francis Group.
Article
Full-text available
Although many researchers, practitioners, and policymakers are concerned about identifying and characterizing online posting patterns of violent extremists prior to their engagement in violence offline, little is empirically known about their online patterns generally or differences in their patterns compared to their non-violent counterpart particularly. In this study, we drew from a unique sample of violent and non-violent right-wing extremists to develop and compare their online posting typologies (i.e., super-posters, committed, engaged, dabblers, and non-posters) in the largest white supremacy web-forum. We identified several noteworthy posting patterns that may assist law enforcement and intelligence agencies in identifying credible threats online.
Chapter
Full-text available
This chapter summarizes the emerging trends in the empirical literature on right-wing extremists’ use of the Internet. These trends are organized into five core uses identified by (Conway, 2006): information provision, networking, recruitment, financing, and information gathering. Highlighted throughout this chapter are key gaps in the empirical literature and suggestions for progressing research in this regard.
Book
Full-text available
Antisemitism on Social Media is a book for all who want to understand this phenomenon. Researchers interested in the matter will find innovative methodologies (CrowdTangle or Voyant Tools mixed with discourse analysis) and new concepts (tertiary antisemitism, antisemitic escalation) that should become standard in research on antisemitism on social media. It is also an invitation to students and up-and-coming and established scholars to study this phenomenon further. This interdisciplinary volume addresses how social media with its technology and business model has revolutionized the dissemination of antisemitism and how this impacts not only victims of antisemitic hate speech but also society at large. The book gives insight into case studies on different platforms such as Twitter, Facebook, TikTok, YouTube, and Telegram. It also demonstrates how social media is weaponized through the dissemination of antisemitic content by political actors from the right, the left, and the extreme fringe, and critically assesses existing counter-strategies. People working for social media companies, policy makers, practitioners, and journalists will benefit from the questions raised, the findings, and the recommendations. Educators who teach courses on antisemitism, hate speech, extremism, conspiracies, and Holocaust denial but also those who teach future leaders in computer technology will find this volume an important resource.
Article
Recent years have witnessed increasing academic, media, and political attention to the threat of far-right terrorism. In this article, I argue that scholarship on this threat has suffered from two limitations, each with antecedents in terrorism research more broadly. First, is an essentialist approach to this phenomenon as an extra-discursive object of knowledge to be defined, explained, catalogued, risk assessed, and (ultimately) resolved. Second, is a temptation to emphasise, even accentuate, the scale of this threat. These limitations are evident, I argue, within scholarship motivated by a problem-solving aspiration for policy relevance. They are evident too, though, within critical interventions in which a focus on far-right terrorism is seen as an important corrective to established biases and blind spots within (counter-)terrorism research and practice. In response, I argue for an approach rooted in the problematisation and desecuritisation of the far-right threat. This, I suggest, facilitates important new reflection on the far-right’s production within and beyond terrorism research, as well as on the purposes and politics of critique therein.
Article
Full-text available
Although many law enforcement and intelligence agencies are concerned about online communities known to facilitate violent right-wing extremism, little is empirically known about the presence of extremist ideologies, expressed grievances, or violent mobilization efforts that make up these spaces. In this study, we conducted a content analysis of a sample of postings from two of the most conspicuous right-wing extremist forums known for facilitating violent extremism, Iron March and Fascist Forge. We identified a number of noteworthy posting patterns within and across forums which may assist law enforcement and intelligence agencies in identifying credible threats online.
Article
Full-text available
TikTok is the fastest-growing application today, attracting a huge audience of 1.5 billion active users, mostly children and teenagers. Recently, the growing presence of extremist’s groups on social media platforms became more prominent and massive. Yet, while most of the scholarly attention focused on leading platforms like Twitter, Facebook or Instagram, the extremist immigration to other platforms like TikTok went unnoticed. This study is a first attempt to find the Far-right’s use of TikTok: it is a descriptive analysis based on a systematic content analysis of TikTok videos, posted in early 2020. Our findings reveal the disturbing presence of Far-right extremism in videos, commentary, symbols and pictures included in TikTok’s postings. While similar concerns were with regard to other social platforms, TikTok has unique features to make it more troublesome. First, unlike all other social media TikTok’ s users are almost all young children, who are more naïve and gullible when it comes to malicious contents. Second, TikTok is the youngest platform thus severely lagging behind its rivals, who have had more time to grapple with how to protect their users from disturbing and harmful contents. Yet, TikTok should have learned from these other platforms’ experiences and apply TikTok’s own Terms of Service that does not allow postings that are deliberately designed to provoke or antagonize people, or are intended to harass, harm, hurt, scare, distress, embarrass or upset people or include threats of physical violence.
Article
Full-text available
The increased threat of right-wing extremist violence necessitates a better understanding of online extremism. Radical message boards, small-scale social media platforms, and other internet fringes have been reported to fuel hatred. The current paper examines data from the right-wing forum Stormfront between 2001 and 2015. We specifically aim to understand the development of user activity and the use of extremist language. Various time-series models depict posting frequency and the prevalence and intensity of extremist language. Individual user analyses examine whether some super users dominate the forum. The results suggest that structural break models capture the forum evolution better than stationary or linear change models. We observed an increase of forum engagement followed by a decrease towards the end of the time range. However, the proportion of extremist language on the forum increased in a step-wise matter until the early summer of 2011, followed by a decrease. This temporal development suggests that forum rhetoric did not necessarily become more extreme over time. Individual user analysis revealed that super forum users accounted for the vast majority of posts and of extremist language. These users differed from normal users in their evolution of forum engagement.
Article
Full-text available
Research pays little attention to the diverse roles individuals hold within terrorism. This limits our understanding of the varied experiences of the terrorist and their implications. This study examines how a terrorist’s role(s) influence the likelihood of and reasons for disengagement. Using data from autobiographies and in-person interviews with former terrorists, we find that role conflict and role strain increase the probability of disengagement. We show those in certain roles, especially leadership and violent roles, incur greater sunk costs and possess fewer alternatives making exit less likely. Finally, certain roles are associated with the experience of different push/pull factors for disengagement.
Chapter
Full-text available
Many researchers, practitioners, and policy-makers continue to raise questions about the role of the Internet in facilitating violent extremism and terrorism. A surge in research on this issue notwithstanding, relatively few empirically-grounded analyses are yet available. This chapter provides researchers with five key suggestions for progressing knowledge on the role of the Internet in facilitating violent extremism and terrorism so that we may be better placed to determine the significance of online content and activity in the latter going forward. These five suggestions relate to: (1) collecting primary data across multiple types of populations; (2) making archives of violent extremist online content accessible for use by researchers and on user-friendly platforms; (3) outreaching beyond terrorism studies to become acquainted with, for example, the Internet studies literature and engaging in interdisciplinary research with, for example, computer scientists; (4) including former extremists in research projects, either as study participants or project collaborators; and (5) drawing connections between the on-and offline worlds of violent extremists.
Article
Full-text available
This policy brief traces how Western right-wing extremists have exploited the power of the internet from early dial-up bulletin board systems to contemporary social media and messaging apps. It demonstrates how the extreme right has been quick to adopt a variety of emerging online tools, not only to connect with the like-minded, but to radicalise some audiences while intimidating others, and ultimately to recruit new members, some of whom have engaged in hate crimes and/or terrorism. Highlighted throughout is the fast pace of change of both the internet and its associated platforms and technologies, on the one hand, and the extreme right, on the other, as well as how these have interacted and evolved over time. Underlined too is the persistence, despite these changes, of right- wing extremists’ online presence, which poses challenges for effectively responding to this activity moving forward.
Chapter
Full-text available
Purpose – This chapter examines how sentiment analysis and web-crawling technology can be used to conduct large-scale data analyses of extremist content online. Methods/approach – The authors describe a customized web-crawler that was developed for the purpose of collecting, classifying, and interpreting extremist content online and on a large scale, followed by an overview of a relatively novel machine learning tool, sentiment analysis, which has sparked the interest of some researchers in the field of terrorism and extremism studies. The authors conclude with a discussion of what they believe is the future applicability of sentiment analysis within the online political violence research domain. Findings – In order to gain a broader understanding of online extremism, or to improve the means by which researchers and practitioners “search for a needle in a haystack,” the authors recommend that social scientists continue to collaborate with computer scientists, combining sentiment analysis software with other classification tools and research methods, as well as validate sentiment analysis programs and adapt sentiment analysis software to new and evolving radical online spaces.
Article
Online discussion forums have been identified as an online social milieu that may facilitate the radicalization process, or the development of violent narratives for a minority of participants, notably youth. Yet, very little is known about the nature of the conversations youth have online, the emotions they convey, and whether or how the sentiments expressed in online narratives may change over time. Using Life Course Theory (LCT) and General Strain Theory (GST) as theoretical guidance, this article seeks to address the development of negative emotions in an online context, specifically whether certain turning points (such as entry into adulthood) are associated with a change in the nature of sentiments expressed online. A mixed methods approach is used, in which the content of posts from a sample of 96 individuals participating in three online discussion forums focused on Islamic issues is analyzed quantitatively and qualitatively to assess the nature and evolution of negative emotions. The results show that 1) minors express a wider range of sentiments than adults, 2) adults are more negative overall when compared to minors, and 3) both groups tended to become more negative over time. However, the most negative users in the sample did not show as much change as the others, remaining consistent in their narratives from the beginning to the end of the study period.
Article
Over the last decade, there has been an increased focus among researchers on the role of the Internet among actors and groups across the political and ideological spectrum. There has been particular emphasis on the ways that far-right extremists utilize forums and social media to express ideological beliefs through sites affiliated with real-world extremist groups and unaffiliated websites. The majority of research has used qualitative assessments or quantitative analyses of keywords to assess the extent of specific messages. Few have considered the breadth of extremist ideologies expressed among participants so as to quantify the proportion of beliefs espoused by participants. This study addressed this gap in the literature through a content analysis of over 18,000 posts from eight far-right extremist forums operating online. The findings demonstrated that the most prevalent ideological sentiments expressed in users’ posts involved anti-minority comments, though they represent a small proportion of all posts made in the sample. Additionally, users expressed associations to far-right extremist ideologies through their usernames, signatures, and images associated with their accounts. The implications of this analysis for policy and practice to disrupt extremist movements were discussed in detail.
Article
Content regulation and censorship of social media platforms is increasingly discussed by governments and the platforms themselves. To date, there has been little data-driven analysis of the effects of regulated content deemed inappropriate on online user behavior. We therefore compared Twitter — a popular social media platform that occasionally removes content in violation of its Terms of Service — to Gab — a platform that markets itself as completely unregulated. Launched in mid-2016, Gab is, in practice, dominated by individuals who associate with the “alt-right” political movement in the United States. Despite its billing as “The Free Speech Social Network,” Gab users display more extreme social hierarchy and elitism when compared to Twitter. Although the framing of the site welcomes all people, Gab users’ content is more homogeneous, preferentially sharing material from sites traditionally associated with the extremes of American political discourse, especially the far right. Furthermore, many of these sites are associated with state-sponsored propaganda from foreign governments. Finally, we discovered a significant presence of German language posts on Gab, with several topics focusing on German domestic politics, yet sharing significant amounts of content from U.S. and Russian sources. These results indicate possible emergent linkages between domestic politics in European and American far right political movements. Implications for regulation of social media platforms are discussed.