ISSN: 1795-6889
www.humantechnology.jyu.fi Volume 15(3), November 2019, 326–346
TOWARD A FORMAL SOCIOLOGY OF ONLINE HARASSMENT
Abstract: Online harassment and various subcategories of it, like doxing and swatting, have
attracted enormous interest for several years now—and particular interest in the world of
game studies in the wake of 2014’s GamerGate harassment campaign. Such protracted,
crowdsourced campaigns remain undertheorized, however. Using contemporary research
to modify Georg Simmel’s formal sociological method, I provide researchers and the wider
public with an easily visualized structure that most harassment campaigns follow: the form
of an inverted pyramid bearing down on an individual target, stratified into three orders of
harassment, each defined by level of severity and invasiveness. With this form, the public can
more clearly visualize online harassment as a structural rather than individual phenomenon.
This form makes a significant contribution to social media research by providing an
approachable theoretical framework for future studies and can also frame design
interventions on harassment.
Keywords: online harassment, formal sociology, game studies, GamerGate, Internet studies.
©2019 Katherine Cross and the Open Science Centre, University of Jyväskylä
DOI: https://doi.org/10.17011/ht/urn.201911265023
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
Katherine Cross
University of Washington
Seattle, WA
USA
INTRODUCTION
Online harassment—purposefully abusive, unwanted, often repeated interactions with another
person online—has come of age. Although it has existed since the earliest days of the Internet, it
has now become pervasive in many societies, reaching the highest levels of mainstream politics.
From debates in the U.K. House of Commons about whether the Prime Minister’s rhetoric
inspires online threats against his opponents to the U.S. President’s penchant for tweeting bilious
comments at or about anyone who displeases him, examples of online harassment’s commonplace
banality abound. But this newfound normality is not online harassment’s only novel feature. With
the dawn of the 2010s, a phase-change in online abuse occurred that shifted it from a largely
episodic, individual phenomenon to an indefinite, campaign-based affair. Historically, online
harassment depended on crowdsourcing—the use of the Internet to gather attention and incite
action from a distributed group of users—but, by the 2010s, two new features became increasingly
common in episodes of online harassment: organization and longevity.
By 2014, with the onset of the GamerGate harassment campaign (see Table 1 for description
of the events and organizations mentioned in this paper), the evolution from individual
harassment to long-term, collectivized abuse was complete. The GamerGate campaign, which
began with the harassment of game developer Zoe Quinn over false allegations that they1 had
traded sex with journalists for positive reviews2 (Perreault & Vos, 2018), quickly metastasized
into a reactionary movement against any hint of feminist or progressive activism in the world of
video gaming (Mortensen, 2018). GamerGate, a movement made up of a small but enthusiastic
group of video gamers and fans, targeted women, queer people, and people of color in gaming
with protracted campaigns of online harassment that included a great deal of bigotry. From
engineering boycotts of news outlets whose female writers wrote critically about GamerGate to driving
developers and critics from their homes and issuing bomb threats against feminist speakers,
GamerGate sustained an assault against many targets for a protracted period of time (Frank, 2014;
Wingfield, 2014). It was organized from platforms such as Reddit and the 4chan imageboard,
where targets were monitored and rationales developed for strategically targeting them, their
families, and their workplaces with harassing behavior.
Video gaming’s seasonal storms of harassment—which, in the past, had targeted figures
like former BioWare writer Jennifer Hepler or Ubisoft producer Jade Raymond3—had
consolidated into a whole new climate of abuse. Harassment neither begins nor ends with
gaming, of course, but it fell largely to gaming fans, journalists, critics, and academics to
analyze this sudden shift toward campaign-based harassment and its significance. How can
scholars take the hard lessons learned in the world of video gaming over the last 5 years and
synthesize them into useful knowledge with general applicability?
In previous work about the causes of online harassment, I focused on how the Internet’s
cultural conceit of unreality—the popular idea that the Internet is unreal and distinct from “real
life”—licenses this behavior by making it easy to ignore the humanity of one’s interlocutors
(Cross, 2014). In the wake of GamerGate, and with the aid of much of the academic work that
has been done on online harassment since, it is now possible to extend this analysis through
identifying a formal structure of behavior common to most online harassment campaigns.
The formal sociological method pioneered by Georg Simmel (1917/1950a, 1908/1959)
provides researchers with a way of seeing a general social pattern in online harassment, and more
recent research on specific cases of campaign-based harassment sheds even more light on the matter.
Table 1. Definitions of Terms, Groups, and Events Involved in Online Harassment or Abuse
Discussed in this Research.
4chan and 8chan 4chan is an anonymous imageboard (i.e., a forum in which users post images that
can then be commented on) infamous for its loosely moderated and freewheeling
subforums. It is also well-known for producing memes that have influenced popular
culture and for directing harassment against individuals. 8chan is an especially
virulent offshoot, populated by people who find even 4chan’s loose moderation to
be too restrictive. Multiple mass shooters have posted about their plans on 8chan.
Anonymous An activist group that grew out of 4chan, using anonymous in-person protests and
online hacktivism to pursue political goals. It was born out of an anti-Scientology
protest in 2008.
ComicsGate A recent movement in the world of comic book fandom that sought to oppose the
presence of women in the comics industry and the promotion of diversity in comic
book storylines.
Doxing The practice of publishing an individual’s private information online before an
audience hostile to that individual; usually a “dox” will include a home address, place
of employment, telephone numbers, and, in some cases, social insurance numbers.
The practice is sometimes described as “dropping dox” on or about someone.
GamerGate An online harassment campaign directed at feminists and progressives in game
development and games journalism. Its targets were loosely defined but primarily
women, LGBT people, and people of color. It began in 2014 and continued
until mid-2015, though remnants still congregate in online forums like Reddit.
KiwiFarms A forum devoted to stalking and harassing people from groups despised by the site’s
denizens. Their targets are chiefly transgender people, people with disabilities
(especially neuroatypical people), furries, and Otherkin.4
QAnon A sprawling online conspiracy theory that emerged in the wake of Donald Trump’s
election to the U.S. presidency. Named for “QAnon,” supposedly an American
intelligence operative with Q-level clearance who has gained a following of
conspiracy theorists through sites like 4chan and Twitter, the participants believe,
among other things, that Donald Trump is secretly breaking up an international
pedophile ring run by the world’s elite liberals.
Sad and Rabid Puppies A right-wing movement within science fiction literary fandom that sought to oppose
the diversification of the medium and to elevate the work of likeminded reactionary
authors, while reacting against the literary awards won by authors like N.K. Jemisin
and Ann Leckie.
Swatting The practice of making a false report to an emergency hotline (such as 911 in the
US/Canada or 000 in Australia) that precipitates an armed police response against
a target’s home. For instance, using information revealed in a dox, a harasser may
call the police alleging that the target has taken hostages or is running an illegal
drug laboratory. The purpose is to allege something so egregious that a SWAT team
or its local equivalent will barge into the target’s home, potentially injuring or even
killing the target.
A formal sociological approach, with its emphasis on finding “patterns that transcend their
specific instantiations” (Zerubavel, 2007, p. 133), reveals that most online harassment campaigns
converge upon a single target through employing three different modalities that can be
categorized as distinct types of harassment. I call these modalities the three orders of online
harassment and describe each in detail using illustrative case studies drawn from across the
Internet. The orders form a useful model of online harassment that can improve both public and
academic understanding of the phenomenon. Although GamerGate was a clear case of a
harassment campaign, I show how other less well-known incidents of online abuse illustrate the
emergence of a form for harassment that unites these disparate cases.
Of note is the fact that this method of harassment is employed by diverse groups of people:
from Anonymous and 4chan to faceless mobs of sexual harassers, progressive activists, and
older social conservatives. The argument here, however, is not intended to suggest that there is a
broad equity in who gets harassed or that the harassment always has the same sociostructural
impact. Rather, I mean to highlight in this paper the fact that many of the people who are harassed
are often antagonized online through a process that is so rote as to nearly be scripted. Thus, I
argue that this script is best understood as a form in the Simmelian sense.
There is, for instance, isomorphism in how Internet users will “call targets,” how a sense
of outraged injustice lends moral fire to the attacks, and how the use of methods such as DDOS
(distributed-denial-of-service) attacks or swatting can inspire less committed participants to
continue with low-investment, individuated harassment on sites like Twitter or Facebook. I
will address these issues and explain, in detail, the practices that make up the form of online
harassment I have identified.
To quickly sketch the form, it can be best understood as an inverted pyramid with an
individual deemed to be an acceptable target of harassment at its nadir (see Figure 1). The
pyramid is stratified into three broad categories of harassing behavior, or what I term orders,
that constitute both the harassment itself and that which fuels the behaviors.
Starting from the nadir and moving up the pyramid is first-order harassment, which involves
the most severe attacks against the target’s Web presence and physical life, like doxing, swatting,
DDOS attacks, and stalking; here, the harassers do something to the target. Second-order
harassment is the most commonly known variety: person-to-person online abuse through social
media or email; here, harassers say something to the target. Third-order harassment sees the crowd
talking to each other about the target, justifying, excusing, or minimizing the harassment and
creating a social climate that makes abuse more likely.

Figure 1. The Sociological Form of Online Harassment. The inverted pyramid depicts the distance,
relationship, and activities of the abusers, from most distant toward the target of the abuse. From
top to bottom, its strata are: Third-Order Harassment (indirect enabling/apologism/memetics,
organization); Second-Order Harassment (person-to-person online abuse; slurs, vitriol); First-Order
Harassment (hacking, doxing, swatting); and, at the nadir, the Target.
I begin with a review of the relevant literature on online harassment and the culture of structural
prejudice that often surrounds it. From there I explain my methodology, briefly introducing the case
studies I examined, before reviewing some literature on the formal sociological method itself. I then
elaborate the form of online harassment, explain its constituent parts, and explore the case studies
that show the diversity of applications for the sociological form of harassment that I identify here.
I conclude with brief thoughts on research and policy implications.
ONLINE HARASSMENT, IDENTITY, AND PREJUDICE: A REVIEW
The corpus of literature related to the myriad issues involved in online harassment reflects a
wide variety of disciplines and approaches. However, feminist scholarship in the social
sciences and humanities has shone perhaps the brightest and most consistent light on online
harassment. Such scholars have advanced useful theories exploring the gendered and racist
dimensions of online vituperation. Their work demonstrates online harassment’s remarkable
parallelism with patterns of patriarchy and white supremacy in the physical world—whether it
be Orientalist racism in online games showing how race remains a powerful touchstone for
Internet users (Nakamura, 1995, 2000) or how anti-Black racism on Xbox Live shapes player
experiences in a space where whiteness and masculinity are unmarked norms (Gray, 2012,
2014). These scholars have ensured that structural prejudice is a part of the discussion when
these cases are brought into forums of academic and public opinion. They emphasize the sexist,
racist, and homo/transphobic nature common to much online harassment, arguing that the
dehumanizing effects of the Internet are sharply amplified by the latent othering of women and
minorities in the physical world (Ballard & Lineberger, 1999; Jenkins, 1998; Kuznekoff &
Rose, 2013; Meyer & Cukier, 2006; Nardi, 2010; Norris, 2004). In different ways, this body of
work shows how online abuse, fueled by a culture of prejudice, has a disparate impact on
certain groups. In Nardi’s ethnography of World of Warcraft, for instance, she observed that
women, including herself, were subject to harassing sexual inquiries over voice chat that men
did not face (Nardi, 2010, p. 153); Kuznekoff and Rose’s study (2013), with experimental
controls, empirically demonstrated that Nardi was correct.
The online harassment research that has emerged in response to GamerGate has extended
many of the conversations that were begun by this earlier work, which had correctly identified
race and gender as particularly vexed sites of online abuse. Sexism, racism, and homo/transphobia
remain especially common avenues of online attack, both in video gaming and elsewhere. But
beyond merely identifying the existence of prejudice, more recent scholarship led by women and
people of color has explored how harassment works in greater detail. Gray (2012, 2014), Chess
and Shaw (2015), Gray, Buyukozturk, and Hill (2017), Massanari (2017), and Vossen (2018),
among others, have been crucial figures in identifying the unique features of GamerGate early on.
Vossen (2018) explored how movements such as GamerGate make online spaces culturally
inaccessible for marginalized people. In Massanari’s (2017) term, it became a “toxic
technoculture” aided by the gamified systems in place on social media platforms that incentivize
harassment, allowing it to make the leap from individual to structural. Expanding on this, the
research curated by Gray and Leonard (2018) demonstrated what that structure’s effects can be:
It is regarded as a system and thus must be met by systemic solutions that grapple with the
collectivized nature of prejudice, be it through community management (Cote, 2018), building
queer spaces (Skardzius, 2018), or confronting the reality of systemic prejudice in the gaming
industry itself (Vysotsky & Allaway, 2018). Meanwhile, Mortensen (2018) dubbed GamerGate a
“long event,” capturing one of the critical dimensions of the campaign form: its longevity. This
work pointed to an understanding of a structural phenomenon that is defined in large measure by
that longevity, as well as crowdsourcing and organization.
Though writing before GamerGate’s advent, Nardi (2010) also illustrated how abusive
behavior in games is often encouraged by the game’s ruleset and code, whereby a competitive
online game had “rules…in place to promote a certain style of play,” a “world encoded into the
rules” (2010, p. 71), underscoring the “power of designers to shape activity” (p. 70). This dovetails
with later research like Massanari’s (2017), which demonstrated how the design of social media
platforms gives rise to harassing practices.
Although anonymity is heavily emphasized as a cause of harassment in popular culture (e.g.,
Holkins, 2004), as well as journalistic and academic investigations of the matter (see Fost, 2007;
Levmore, 2010; Nussbaum, 2010), I proposed an alternate theory in past work (Cross, 2014). My
theory draws on scholarship in sociology and psychology and focuses on what John Suler (2004)
called the “dissociative imagination” that allows online users to distance themselves from the
consequences and implications of their behavior on the Internet. I argued that a “mobius strip of
reality and unreality” prevails online, where Internet users behave as if their actions are not real
and lack consequences while justifying their behavior through appeals to concrete causes and
moralities (Cross, 2014, p. 4). I extend that argument here by elaborating on the shape of the
behavior this entails, for nearly all the harassment I describe here is defended by people who
suggest that it is not real or, at least, not consequential in any meaningful way.
METHODS
To illustrate the relevance of a modernized formal sociological approach, I examine several
recent online controversies that resulted in online harassment of one or more targets. They were
chosen because of their relative popularity, each reaching the critical mass of virality required
for a true harassment campaign to begin. As I explain each order of harassment in detail, I will
illustrate that order’s existence and relevance through these cases, which include viral
controversies, specific harassment campaigns, and a sample of YouTube comments. Although
GamerGate provided a point of departure for this paper, it will be instructive to explore less
well-known incidents to see how GamerGate’s structure, albeit novel in its particulars, nevertheless
has much in common with other forms of harassment. The formal sociological method, modified
by the insights explicated in the literature review, will illustrate these similarities.
Formal Sociology
Simmel’s (1907/1971a, 1908/1971b) formal sociological method is, in brief, a sociology of recurrent
patterns or relationships that exist across a wide variety of social contexts; it seeks to find the patterns
endemic to social organization. Simmel argued that “detach[ing] by analysis the forms of interaction
or sociation from their contents” formed the foundation “of a special science of society as such”
(1908/1959, p. 316), in short, sociology itself. “There remains for a sociology in the strictest sense
... nothing but the treatment of [those] forms [of interaction or sociation]” (p. 319). It is in this sense
that the word form is used throughout the paper to describe a form, a shape, or a model of a type of
social interaction, in this case, online harassment.
Simmel’s approach similarly identified common types of social interaction, such as
“exchange” or “domination,” that demonstrate what each phenomenon had in common. Eviatar
Zerubavel described formal sociology as a “research strategy that guides… sociologists in their
efforts to distill generic social patterns from their specific cultural, situational, and historical
contexts” (Zerubavel, 2007, p. 131). Formal sociology constitutes a kind of social geometry that
represented Simmel’s (1907/1971a, 1908/1971b) attempt at rationalizing the then-young
discipline of sociology into a field that could stand on an equal footing with the physical
sciences. As Zerubavel noted, formal sociology in Simmel’s conception “almost resembles
physics in its quest for transhistorical as well as transcultural generality” (2007, p. 132; emphasis
in original). The method seeks the common pattern to a class of phenomena that transcends
particular manifestations of the phenomenon.
Simmel’s (1907/1971a, 1908/1971b) arguments about formal sociology were sometimes
contradictory but can be reduced to a few key points. First and foremost, he was not attributing
essences to social interaction but rather recognizing that there were patterns of social behavior
that were common to groups across obvious superficial boundaries. In short, similarities matter
as much as differences (Simmel, 1917/1950a). In “The Stranger,” Simmel illustrated this through
the eponymous figure, defined by a social location relative to larger collectives that were “near
and far at the same time” (Simmel, 1908/1950b, p. 407, emphasis in original). What he called
“the sociological form” of this figure was determined by a commonality that transcended several
distinct social classes and phenomena, drawing links between the social characteristics of
nomads, traders, and European Jewry. While some research would emphasize the differences
between these groups, formal sociology looks for their commonalities, what Zerubavel called
their “cross-contextual similarity” (2007, p. 136).
Formal sociology does not render context irrelevant or unimportant, but it instead
emphasizes the value of understanding how and why similar patterns of behavior may develop
across distinct fields of interaction. Far from being essentializing, it renders another dimension
of social interaction visible and amenable to study. The formal sociological method suggests that
commonality within difference is identifiable and that this commonality matters because it can
furnish answers to specific inquiries that either particularism or a grand theory might miss (Mills,
1959; Zerubavel, 2007).
This balance is best illustrated by the contemporary uses of formal sociology. Arlie Russell
Hochschild’s The Managed Heart (1983) was a pathbreaking study that identified the concept of
“emotion work” as a potential site of alienation in labor. Her study focused extensively on the
gendered dimensions of women’s emotion work, particularly through analyzing the experience
of flight attendants. But she also used the formal method to compare their experience to that of
men in male-dominated, emotion-work-intensive professions such as bill collection. Similarly,
Lynn Chancer’s Sadomasochism in Everyday Life (1992) analogized workplace relations and
heterosexual partnerships using her formal frame of sadomasochistic relationships as a metaphor.
Zerubavel (2003) himself examined concepts like time and memory, while sociologist Thomas
DeGloma explored the formal similarities among personal awakenings, such as religious
conversions and coming out as queer (DeGloma, 2014). None of these studies erase particularity,
nor deny its importance, but each also uses the forms they identify to generalize across a variety
of subjective frontiers. These studies inform how the formal method will be employed here.
Formal Sociology in the Present Study
The analysis set forth in this paper provides a useful example with which to explore the merits
of the formal sociological method. In the case of online harassment, it is important to
understand that harassment is not distributed evenly across the population of Internet users, nor
does it have identical effects on those who are targeted. In other words, people belonging to
groups underrepresented or subject to discrimination in certain virtual spaces will be hurt more
severely—and in more directly material ways—than those belonging to majority or privileged
groups. Such harassment is terroristic; the disproportionate vituperation of some women online
(Kuznekoff & Rose, 2013) has often had a chilling effect on the speech or participation of other
women (Yee, 2008, pp. 93–95; see also Citron, 2009, 2014; Nardi, 2010; Shaw, 2014).
Although this overview is brief, it raises the very real and important differences between
episodes of harassment directed at men and women, to say nothing of the myriad distinctions
brought forward by antiqueer and antitrans harassment, where antagonism toward the existence
of trans- and nonbinary people is increasingly becoming a theme in online abuse. These
distinctions matter and have important implications for gender equality in tech spaces, to name
but one example.
Nevertheless, based on the cases of online harassment episodes explored later in this paper,
it becomes clear that, when people online organize or coalesce into groups that antagonize a
single target, they do so in broadly similar ways. The method employed in this paper, therefore,
will deliberately avoid dwelling on the contextual differences between the different episodes
of harassment described in favor of finding a pattern common to them all. As Zerubavel put it,
“Social pattern analysts are… purposefully oblivious to the idiosyncratic features of the
communities, events, or situations they study, looking for general patterns that transcend their
specific instantiations” (Zerubavel, 2007, p. 133).
A social pattern manifesting in different contexts of online harassment, then, is useful data,
and formal sociology can help researchers see it. In this paper, I embrace some aspects of
Simmel’s (1917/1950a) “social geometry” while leaving others aside. For instance, I do not
share Simmel’s belief that this method is purely objective or that it constitutes a purer form of
sociology, with the details of context best left to other disciplines (Simmel, 1917/1950a, pp.
21–22). But social geometry is valued as both metaphor and visualization in sociology.
Sociological subject matter, after all, often assumes a shape. Hierarchy conjures images of
pyramids or skyscrapers, social networks engender webs, and geographic sociologies depend
on understanding the role of space in making or breaking social phenomena—the way a public
park’s design may encourage or discourage certain behaviors.
The preeminent shape in the analysis of harassment I advance here will be the inverted
pyramid. True to the formal method, the shape is determined by the phenomenon it describes.
The inversion of the pyramid attempts to illustrate the way that harassment converges, like a
titanic spearpoint, onto a single target. The method of harassment described here is, at heart, an
individualist phenomenon. Whatever “problem” justifies the harassment, the harassers see the
assault of one individual—often representing the embodiment of some larger idea, social
problem, or undesirable group—as the solution. The harassers then become a wave of individuals
revenging themselves upon a lone target.
THE FORM ITSELF
The method of harassment described here is distinguished by its convergence on a single target
(see Figure 1). The term convergence, used throughout this paper, is a considered one. It
suggests, at a stroke, both process and shape. As one moves down the pyramid, one gets closer
to the target—moving from indirect chatter about the target down to performing physical
actions against the target’s person and/or property. Each level of harassment is distinguished
by distance from the target. Relation to and distance from the single target of the harassment is
what orders the scale in the form depicted in Figure 1. In the case of GamerGate, the target
would be a single person, with each order of harassment describing a layer of the campaign
against them. An individual becomes an effigy, a symbol for everything opposed by the
harassers. This meant rigorous ideological policing of targets who did not conform to
GamerGate’s reactionary, corporatist ideals (Gray & Leonard, 2018; Quinn, 2017). Some
popular commentators have already noted that GamerGate presaged larger harassment
campaigns with similar modes of operation (e.g., Marcotte, 2016), but the form I have
identified makes the structure of those similarities easier to see.
The Lone Target
Online harassment is, invariably, directed at an individual person. The reasons for this may vary.
Perhaps that person did something the harasser disapproved of, or belongs to a minority group
despised by the harasser, or was simply in the wrong place at the wrong time. But harassment falls
upon a specific target, whether the harassment consists of a hostile meme, hacking someone’s
website, swatting their home address, or sending them a death threat via email.
What of situations where the target is attacked as a representative of a larger group? In this
case, the logic of the form still applies. If the person is targeted because of, say, their religious
affiliation or work for a specific news outlet, the result is the same. A lone target must bear the
psychic (and occasionally physical) burden of being harassed through the Internet. In
particular, the impact is the same when one is describing a campaign like GamerGate that
attacks multiple targets simultaneously because they share broad characteristics. An abridged
list of GamerGate targets would include Zoe Quinn, journalist Leigh Alexander, media critic Anita
Sarkeesian, game developer Brianna Wu, and sociologist Adrienne Shaw, among others; all are
individuals who dealt with all three orders of harassment bearing down on them at once.
The use of a lone target as a focal point should not be seen to deny the sociostructural reality
of online harassment, especially those varieties inspired by structural prejudice. Instead, my
model uses targets’ personal experiences as a focal point for understanding the personal impact
caused by each of the three orders of harassment. In addition, an individual can often become a
symbol for something that harassers perceive as a wider problem. Consider the commonplace cri
de coeur, “You are everything wrong with ‘X’!” That mentality, in turn, invokes a punishment
imperative, a way of avenging oneself against a much larger problem.
In the case of Zoe Quinn, the acts were meant as punishment toward Quinn for being a
representative of a supposed bias towards feminism in the video gaming industry. This imperative
to punish, along with the design of social media, dictates the shape of online harassment. Social
media corrals a mass of people, but the constraints of online communication focus their words,
causing them to converge upon a single target. The Internet at large also foments a dissociative
imagination that distances people from the consequences of their behavior; bearing down on a
single target therefore becomes the path of least resistance online. One cannot yell at a system or
an abstract idea, but one can yell at individual people or send them bomb threats.
The punishment imperative, which impels attacking lone targets who must be punished for
their sins, is vital to nearly all harassment campaigns. Those who use the methods of Anonymous,
for instance, pride themselves on bringing justice to those that the so-called elite appear to have
forgotten. As anthropologist Gabriella Coleman observed, “Anonymous’s activities, however
disparate and paradoxical on their surface, have tapped into a deep disenchantment with the
political status quo, without positing a utopian vision…in response” (2012, paragraph 10).
Coleman (2012) argued that Anonymous’ excesses represent a “primitive rebellion,” à la
Hobsbawm (1965), that is still finding its political footing but will one day mature into a powerful
political bloc. That optimistic vision has not been borne out to date. The progeny of Anonymous,
like GamerGate and QAnon, have had overarching political agendas, but they were broadly
reactionary and premised less on hope than on fear. Moreover, members of these movements see
themselves as vigilantes who punish the guilty, invariably an individual who can be readily
researched and targeted using online tools.
First-Order Harassment
First-order harassment involves attacks that go beyond the virtual and attempt to materially hurt or ruin the target, both online and off. Some types of first-order harassment assail a person’s Web
presence, including hacking, DDOS attacks on a blog or website, or making malicious edits to
the target’s Wikipedia page if he/she is a notable enough figure to have one. Such attacks can
affect a person’s reputation or income and are carried out with the intent to do just that. A unique
variant on this kind of attack is the way in which Anita Sarkeesian’s5 harassers, en masse, have
reported as terrorism every YouTube video she uploads. This special reporting mechanism,
intended as a response to the posting of grotesque executions by terrorist groups on YouTube,
automatically removes a video from public view, at least temporarily.
Other attacks are more overt still, reaching into the physical world to inflict bodily harm or
visceral terror upon the target. Such attacks include swatting or stalking (as happened to Anita
Sarkeesian, where harassers took candid photos of her in public in an attempt to intimidate her), or
even leaving dead animals in the target’s mailbox (as happened to Zoe Quinn). Meanwhile,
developer Brianna Wu found that photographs of her house were put online, and a YouTube video
shot by a man who claimed to be driving there to kill her was soon posted (Merlan, 2015).
Sarkeesian also faced bomb threats and was driven from her home by credible threats of violence
against her and her family (U. S. Federal Bureau of Investigation, 2017). These kinds of attacks are
classed as first-order because they represent the greatest possible intrusion into the life of the target.
Still other forms of similarly intrusive first-order harassment include doxing and swatting
(defined in Table 1). Swatting can be deadly, especially in the United States, where police are
more likely to use deadly force (Lartey, 2015; Lopez, 2018). A case from December 2017 made
this clear, where a man was shot and killed by a Kansas SWAT team that had been sent to his
house by an irate Call of Duty player, Tyler Barriss, who wanted to swat another gamer he had
been arguing with. Barriss attempted to dox this other player but found an incorrect address,
which he sent to the police. It was there that the victim, Andrew Finch, an innocent bystander
in every sense, was slain by police (Australian Broadcasting Corporation, 2019).
Instigating these acts requires a higher level of investment—both emotionally and
temporally. In the case of Anita Sarkeesian, for instance, one must not only disagree vehemently
with her, but also have enough antipathy toward her and her project to attempt to sabotage it and
her in every way possible. Such antagonism also requires time; crashing her website or
researching her personal information and history, as some have done, takes considerably more
effort than simply sending her a nasty comment on Twitter. Such invasive and deadly harassment
is often what targets of massive harassment campaigns fear. But as the tragedy of Andrew Finch
shows, it can occur absent any campaign, or even any involvement by the victim. He was, instead, collateral damage
inflicted by an abusive practice that remains common in gaming communities, where harassment
is used as a tool to punish people one disagrees with or otherwise dislikes. It remains comparatively
rare, but it is the product of other forms of abuse that, as already implied here, are so commonplace
as to constitute a culture unto themselves.
Second-Order Harassment
If first-order harassment does something to the target, second-order harassment is a degree less
invasive in that it merely says something to the target. This can take the form of an abusive
tweet sent to the target or a rape threat emailed to the target. Such harassment is the most
common and popularly understood variety, and what most people tend to think of when they
hear the term online harassment. The following are examples of tweets sent to several targets,
including myself, that might be classed as second-order harassment:
“good go get raped you cunt, get fucking lynched you deserve it.”
“Please kill yourself. I hope you die of breast cancer and AIDS combined, you chink.”
“I hope u get raped and then hit by a bus.”
“You’re a Bolshevik feminist jewess that hates white people... and you expect to be taken
seriously when you're ‘critique-ing’ video games? Fucking ovendodger.”
“ohgod you don’t even pass you look like a man in drag, god I’d rather put my dick in a
blender than you.”6
Such comments are certainly startling and qualify as abusive. They are also, invariably,
unwanted. In a harassment campaign, a tweet like any of the above is but one of many that a
target will receive and thus constitutes repetitive abuse. For instance, the “ovendodger”
comment was part of a wave of harassment directed at feminist media critic Anita Sarkeesian.
Thousands of similar comments were made against her, often tagging her in social media.7
The case of Elan Gale (2013), a film producer whose fraudulent tweets about a cruel airline
passenger went viral in 2013, is equally instructive here. Gale claimed that a woman named
Diane, a fellow passenger on his flight, was acidly rude to the flight attendants, prompting him
to live-tweet the episode to his followers. He claimed to have sent lewd notes on napkins to the
mystery woman, including one signed “Eat my dick. Love, Elan,” a theme repeated in future
notes (D’Addario, 2013). It would later be revealed that the whole affair was a hoax, and that
Diane never existed (Dewey, 2013).8
One woman, Liz Dwyer (2013), who wrote a blog post suggesting that Elan’s in-flight
behavior would not have been celebrated if he were nonwhite or Muslim, received the
following response from a commenter:
Anonymous (2013): To the writer of this article. You are a joke. The problem with society
as a whole is that here are not enough Elans out here being honest. ... Freedom of speech?
‘Eat my dick”. If this comment + Elan being a white male made this somehow more
offensive… do the world a favor and go kill yourself please. #TeamElan.
This comment is a prime example of second-order harassment, remarkable chiefly because
it was inspired entirely by a hoax, even imitating Gale’s language. Gale’s pantomime abuse
against a figment of his imagination inspired actual abuse against real people. The harassers
spawned by Gale’s viral thread took it upon themselves to punish those who they believed were
“problems” in society because such people enabled the “Dianes of the world,” who needed to
“shut up” (Gale, 2013).
Third-Order Harassment
Third-order harassment is the least direct but the most social. It is best described as the crowd
talking to each other about the target, rather than saying or doing anything to the target. It is
also the most structural part of the harassment campaign form because it involves the highest
level of social organization (as opposed to individual initiative). A harassment campaign
requires both crowdsourcing and organization, and both occur at the third order of harassment.
The target is at his/her most abstract here: a thing to be talked about, debated, and picked over like so much carrion. Nothing that occurs at the third order is explicitly meant to be viewed by the target. Instead, what is said and done here inspires the more direct behaviors of the first and second orders. It is the base of the inverted pyramid, the thing that presses the other two orders
down with unbearable weight.
The best examples of this order of harassment are the forums most commonly associated
with online abuse: 4chan, 8chan, Reddit, and KiwiFarms. Harassment can, of course, target
individual users of each site, but the third order of harassment focuses on how such forums can
organize abuse. KiwiFarms, for example, maintains a forum where every thread is dedicated
to a different target. Such threads can contain hundreds of posts, spanning months or even
years, devoted to tracking the target’s every online move and coordinating attacks against
him/her. One recent post in Zoe Quinn’s thread shows a KiwiFarms user saying that someone
should “just get in a car with a weapon… drive out to where [Quinn] lives, kidnap her ass and
either kill her or abandon her ass in the desert with no money or ID or hell even clothing.”
Another user responds, “then go ahead and do it, pussy.”
None of this was said to Quinn; unless they were to follow the thread, they would have no way of knowing what was being said. But it incites abuse against them that ultimately does
appear in Quinn’s social media feeds; their tweets and professional enterprises are frequently
linked on KiwiFarms to facilitate targeting. Any KiwiFarms user is then just one click away from
being able to harass Quinn directly. The greater harm lies in exhortations like “do it, pussy,” a
theme that appears in several KiwiFarms posts where users muse about committing violent
crimes against their targets. There is a formal similarity with “do it, faggot!” posts on 4chan and
8chan, where anonymous users sometimes discussed a channer’s fervent wish to commit a mass
shooting or murder their own parents or partners. As recent events in New Zealand, the United
States, and Germany have shown, some people do, in fact, go out and “do it.”9
Most such threads on KiwiFarms begin with a post cataloguing the target’s perceived
transgressions—justifying the forthcoming abuse—and linking the hostile audience directly to
the target’s online presence. One such post, targeting a man who advocated for progressive
causes, listed his website, Twitter, Google+, MySpace, and even archives of his defunct old
blogs and Facebook accounts. The post also contained false allegations (in this case, that the
man had called in a bomb threat) meant to further enrage KiwiFarms users. More prominent
targets have entire subforums (containing multiple threads) dedicated to them.
This practice is a form of target calling; on sites like KiwiFarms, it is hidden from public
view. But there are others who do such things far more openly. Such target callers, that is,
people of some status within distributed communities (i.e., individuals with large followings
or fan bases), may not directly harass an individual but nevertheless identify targets to their
audiences, a certain percentage of whom will then go on to harass that individual. A target
caller may move between the third and second orders, having a certain hybridity as a result. A
notable example is Milo Yiannopoulos, a far right-wing celebrity with ties to white nationalism,
whose time writing for Breitbart, a similarly minded syndicated opinion website, won him
admiration from much of the so-called alt-right. He was able to raise his profile by
editorializing on behalf of GamerGate; he often identified individuals for targeting and
sometimes participated in the harassment directly. Indeed, it was for doing the latter that he
was finally banned from Twitter when he attacked actress Leslie Jones with a barrage of racist
epithets. As such a hybrid case illustrates, third-order harassment provides the swamp out of
which more direct kinds of attacks (i.e., second and first order) may originate and flourish.
When researcher Michael Salter concluded that GamerGate’s “lifeworld of geek
masculinity and its correspondence with online technological architecture and administration
…formed the condition…for the abuse campaign that followed” (2018, p. 259), he might well
have been speaking about GamerGate’s third-order harassment. The lifeworld or culture that
nurtures such abuse consists of apologism, hatred, toxic masculinity, and the architecture of
the Web itself. All are essential for sustaining a campaign that, by definition, has to last for a
protracted period.
The third order of harassment is where a harassment campaign is most fully fused with cultural
and societal concerns. It is here that a campaign loses the ability to hide behind individual
indiscretions. One person’s abusive tweet can be dismissed as the product of a wicked personality,
but third-order harassment requires collusion, culture, and collectivity. Everyone involved is
complicit, and forces larger than any one person impel their actions. A site like 8chan could not
have existed without collective action, nor as anything other than a culture.
DISCUSSION
So far as formal sociology is concerned, there are some who might argue that it is being used
too conservatively here. Among many other such examples, Zerubavel argued for formal
sociology by pointing out how it reveals “that setting statutes of limitations is actually not that
different from enacting bankruptcy laws or letting bygones be bygones” (2007, p. 142). The
focus of the present study is narrower; the equivalent of examining only what unites, say, the
various instances of “letting bygones be bygones.” But there is value in this nevertheless, for it
lays groundwork for future research that might find formal similarities between, say, online
harassment and Amish shunning or abusive uses of the telegraph. Or, perhaps more urgently,
the similarities between harassment campaigns and online disinformation campaigns.
What is most sociologically critical about the form is the third order. In identifying the
behavior of people who are not directly speaking to or acting upon the target as a type of
harassment, I hope to reinforce an academic understanding of this phenomenon as something
greater than the sum of individual pathologies. Online harassment is not simply the result of
one antisocial individual deciding to leave cruel messages on another person’s social media
profile. Rather, such behavior grows out of a social climate created by indirect abusive
behavior; the processes of target identification, legitimation, and apologism all create a
propitious climate for harassment.
The cases I examined in this paper all share those critical processes. Defending Elan Gale’s
behavior was a moral crusade to some Internet users, justifying abusive rhetoric against anyone
who questioned his actions. KiwiFarmers and GamerGaters both use moral justifications for
abuse against their targets, conspiratorially accusing many of criminality—or of simply being
“annoying.” A thread on KiwiFarms often begins with a thoroughgoing exercise in
legitimation. The male KiwiFarms target I examined earlier was accused of phoning in a bomb
threat to a convention and attacking a cancer victim, for instance. Despite a lack of evidence
for these charges, the claims preface a listing of the man’s social media and Web presences.
Third-order harassment is the foundation of a campaign, then. It is the most truly collectivizing
force in any harassment campaign; third-order harassment is a process of steeling the will of
those joining a campaign through legitimizing the abuse. But third-order harassment also
causes a campaign to grow.
Social media have allowed for harassment to be crowdsourced, drawing in an escalating
number of people from across the Internet to bear down on a target. This is reflected in how GamerGate built upon 4chan’s culture of coordinated and sustained attacks to crowdsource
and organize membership from well beyond the chan boards. This innovation, more than
anything else, was what heralded the shift in harassment I described previously. GamerGate
developed a wide base of third-order harassment, which in turn gave it the energy and motivation
to sustain months and even years of attacks in some cases. In so doing, it intensified the second-
and first-order harassment directed at specific targets, as the amplification of attacks on Anita
Sarkeesian clearly demonstrated.
The orders I have identified can be read as discrete categories; they form a taxonomy of abuse.
However, abuse against a target may not always follow a discrete path through the orders—in short,
abuse does not always begin at the third order. Consider the following harassment campaign, led
by a group of Islamophobic extremists, described in a recent report from the British government:
The protesters …intimidated those who opposed them. One local Muslim resident told us
about how his personal details were publicised as punishment for organising a counter
protest. Photographs of him at the march and his personal information appeared on social
media alongside unfounded allegations that he was a “paedo” and a “rape enabler”. He
received numerous threats and his business was boycotted. (Khan, 2019, p. 71)
The abuse described here exploded in several directions at once. The man was doxed and
experienced both financial and reputational damage (first order), was directly harassed through
email and social media platforms (second order), and had all of this directed at him due to a
crowd of strangers talking about him on social media (third order). But in which of the three
orders did the campaign begin? That remains difficult to say. Publicizing a photograph alongside inciting libel could be categorized as first-order harassment because of its extreme invasiveness and its likelihood of causing harm to a person’s life. But it gained the power to do so because of discourse among online extremists who were inflaming each other’s emotions about the man, and about Muslims in general, which might be considered third-order harassment.
Further, a single harasser can participate in every order of harassment. In this case, it is
conceivable that the same person could have spread lies about the man to other Internet users,
spread his picture around, published his personal information, and sent threats to his email
address. The three orders categorize the abuse, not necessarily the abuser.
Many other productive avenues in this vein await exploration, such as the question of whether
online harassment is becoming increasingly gamified, which is to ask whether it is taking on game-
like incentive structures. After all, Twitter has been likened to a massively multiplayer online game
(Brooker, 2013) and online interactions continue to be treated as playful unreality. This, I argue,
makes the all-important third order of harassment that much wider and more significant by making abuse
easy to justify (“It’s just a game”; “It’s not real”) and increasing its social support.
But was there anything specific to the world of video gaming that ensured it was the first
place to see harassment so thoroughly transformed, that is, from episodic to campaign-based,
from a seasonal storm to a wholly changed climate? I believe that had this transformation not
occurred in the gaming world, it would have occurred elsewhere. But nevertheless, two key
features set apart the world of video gaming. First, this mutation was likely to occur in fan
communities, as they often are linked to the chan boards that had practiced small-scale organized
harassment for years. Second, gaming communities are seen as bastions of play, and thus
unreality. Their implicit lack of seriousness makes them easier for activists and policymakers to
ignore, thus providing ample cover for the organization of harassment campaigns, as well as the
use of gaming spaces for white nationalist and neo-Nazi recruitment. Gaming was just unlucky
to be the first community from which this new mode of harassment burst forth, but it was
replicated quickly in other fandoms. The 2014 Sad and Rabid Puppies campaigns in the world of
literary science fiction and the ComicsGate campaign of 2018 (see Table 1) both took GamerGate
as a model, the latter lacking even the fig leaf of ethics that garlanded the original 2014 campaign.
ComicsGate, from the start, openly positioned itself opposed to diversity and to the presence of
women in the world of comics writing and artistry.
Such cases make clear that even though the form is context-agnostic and can be reasonably
said to describe a campaign of any motivation—be it fans of One Direction harassing Justin
Bieber fans, or enthusiasts of U.S. Senator Bernie Sanders harassing supporters of his political
opponents—it does not imply moral symmetry between any of these campaigns. Rather, it
merely underscores that the decision to employ any of the methods entailed by the three orders
is a moral choice in the sense of choosing between competing values (Ossowska, 1970).
Determining the precise merits of those values is beyond the scope of this paper.
CONCLUSION
I have used Simmel’s (1917/1950a; 1908/1950b) formal sociological method, modified by the
insights of contemporary gaming and technology scholars, to identify cross-contextual
similarities in social practices that encompass forms of online harassment that transcend
boundaries of gender, race, class, or political affiliation. Common trends and behaviors link
online harassment across these boundaries and determine the geometry an observer may ascribe
to it. Much of this harassment converges on a single target, and the harassment behaviors then
can be subdivided into three orders—these can be understood as forming layers of an inverted
pyramid converging on the target. The first order involves invasive abuse, such as DDOS
attacks or doxing, and can include direct attacks on the person in the physical world. The second
order is direct, person-to-person harassment online (e.g., a harassing tweet or email). The third,
foundational order is the communal apologism for the harassment by those not necessarily directly involved, which provides moral justification for the actual attackers. The third
order of harassment sees the crowd talking to each other about the target; the second order is
where they say things to the target; the first is where they do things to the target. These key
features are shared by most online harassment campaigns. Through understanding how they
form an easily visualized social structure (Figure 1), researchers can better make sense of this
new frontier in online social organization.
IMPLICATIONS FOR RESEARCH AND POLICY
This formal sociological approach can frame research design and policy interventions on major
social media platforms, online games, and websites. The pyramidal taxonomy described here
makes abundantly clear that harassment campaigns are rarely the product of individual bad
behavior; rather, they have cultural roots that must be addressed in order to properly engage
and remedy the problem. Further, the emphasis on collective action here is useful for guiding
design interventions at social media companies; it suggests that the most productive site for
intervention is not necessarily the individual harasser, but those aspects of the platform that
facilitate what I have called third-order harassment. Structures such as Twitter’s threading
function or its lack of granular control of responses (one can either lock his/her account or be
open to the entire platform with minimal, heavy-handed filtering) or Facebook’s unwillingness
to moderate groups that promote ethnic or gender-based hatred facilitate third-order
harassment. Such systems and such groups provide the foundation for harassment that targets
individual users both online and in the physical world. In addition, this theory may guide
interventions and collaborations in other spheres as well; there is ample scope for follow-up
research, particularly using network analysis to shed more empirical light on this theory. The
formal approach, in the meantime, can clearly identify and illustrate online harassment and
gives those who must advocate for themselves a much-needed tool with which they can
demonstrate the structural nature of the abuse they are facing.
ENDNOTES
1. In the soon-to-be-implemented 7th edition of the Publication Manual of the American Psychological Association, APA style states that they can be employed when the gender of the sentence’s antecedent is unknown or unclear (or as a gender-neutral alternative to him/her), or when the person to whom the pronoun refers prefers it.
2. This was the genesis of the phrase “Actually, it’s about ethics in games journalism,” which became the
movement’s (widely mocked) motto. GamerGate harassers alleged Quinn had benefitted from corrupt
gaming journalism, and that this justified any ongoing harassment. In truth, the movement’s members
despised Quinn for making a simple, widely praised game about depression and were furious that it (which they regarded as “not a real game”) was receiving acclaim. Meanwhile, Quinn’s ex-boyfriend wrote
a blog post falsely alleging that Quinn had traded sex for positive press. Thus a fig-leaf justification for
GamerGate’s actions was born, despite the fact that GamerGate largely ignored or downplayed genuinely
serious issues in gaming journalism, such as major studios retaliating against publications that gave their
games bad reviews. Indeed, GamerGate called for a studio to do that very thing to Polygon, a gaming
news site, because their reviewer used feminist rhetoric in his review of Bayonetta 2.
3. Figures like Hepler and Raymond became targets due to specific, localized outrages—for instance, a
2006 interview with Hepler resurfaced in which she stated that she would like to see video games make
combat sequences skippable for a player who wanted to experience only a game’s story. When it
resurfaced in 2012, it touched off a firestorm of outrage from some self-identified “hardcore” gamers
who wanted her fired from BioWare. Meanwhile, Raymond’s only “crime” was that some male gamers
alleged that she “slept her way to the top” at Ubisoft because they found her physically attractive. These
episodes of harassment have become periodic and expected in the world of video gaming. In Raymond’s
case, the abuse came in a torrent and then slowly subsided. GamerGate differed by becoming a campaign
that attacked multiple targets while sustaining itself over a prolonged period.
4. It began life as an extension of a 4chan harassment campaign against Christine Chandler, an autistic artist
and fan fiction writer who is more commonly known as Chris-chan. The site is named for how her
harassers imagine Chandler, who has a speech impediment, would have pronounced “CWCki forums,”
which was the original name of the site. That original name was drawn from Chandler’s initials. The
harassment against her began in 2007 and continues to this day. Her subforum on KiwiFarms has nearly
500,000 mostly negative posts.
5. Anita Sarkeesian is a feminist filmmaker who, for several years, hosted on YouTube the Feminist
Frequency video series, which took a 101-level approach to feminist media criticism. After using
Kickstarter to fundraise for a new miniseries that explored the use of sexist tropes in portraying female
video game characters, she was subjected to waves of sustained harassment from mostly male gamers who
were furious at her implication that their favorite games might have misogynistic content.
6. These tweets, and all other similar examples of harassment on social media, are reproduced verbatim. All errors and
idiosyncrasies are in the original unless otherwise noted.
7. Sarkeesian kept a detailed record of her harassment, a good sampling of which can be found on her
website: http://www.feministfrequency.com/tag/harassment/ (the link will show all posts she has tagged
as displaying harassment). She also posted one week’s worth of Twitter abuse, sampled from the height
of GamerGate: https://femfreq.tumblr.com/post/109319269825/one-week-of-harassment-on-twitter
8. The popularity of this episode is part of the reason I took it as a case study: It illustrates the sheer
popularity of this sort of behavior. As Slate’s Dave Weigel noted at the time, “The Internet loved it,
especially BuzzFeed, whose Rachel Zarrell aggregated Gale’s tweets and photos. Her post, on one of the
year’s slowest news cycles, got nearly 1.4 million reads” (Weigel, 2013, para. 1).
9. 8chan has been directly linked to three 2019 mass shootings: the Poway, California, synagogue shooting;
the Christchurch, New Zealand, mosque massacre; and the Latino-targeted shooting at an El Paso, Texas,
Wal-Mart. The Isla Vista, California, massacre (2014); the Pittsburgh, Pennsylvania, synagogue shooting
(2018); the Parkland, Florida, school shooting (2018); and the recent killing of people outside a
synagogue in Halle, Germany (2019) were all celebrated on similar websites where language and
postings like those found on KiwiFarms are quite common. It is, thus, reasonable to assume that any post
like the one cited in this paper constitutes a credible threat. The formal sociological method shines here
because it allows researchers to recognize the homology in posts—and their meanings and impacts—
across several disparate sites within idiosyncratic subcultures.
REFERENCES
Anonymous (2013, November 30, 8:26 a.m.). Re: Since when is telling a woman to eat your dick standing up for
service workers? [comment]. Retrieved from http://www.losangelista.com/2013/11/since-when-is-telling-
woman-to-eat-your.html
Australian Broadcasting Corporation. (2019, March 30). “Serial swatter” whose bogus calls led to a man’s death
gets 20 years in jail [Web log post]. Retrieved October 9, 2019, from https://www.abc.net.au/news/2019-03-
30/man-behind-hoax-call-that-led-to-fatal-shooting-jailed/10955674
Ballard, M. E., & Lineberger, R. (1999). Video game violence and confederate gender: Effects on reward and
punishment given by college males. Sex Roles, 41(7), 541–558.
Brooker, C. (2013). Twitter is the #1 MMORPG [Video file]. Retrieved October 8, 2019,
from https://www.youtube.com/watch?v=fjGh_lE6EGY
Chancer, L. (1992). Sadomasochism in everyday life: The dynamics of power and powerlessness. New Brunswick,
NJ, USA: Rutgers University Press.
Chess, S., & Shaw, A. (2015). A conspiracy of fishes, or, how we learned to stop worrying about #GamerGate
and embrace hegemonic masculinity. Journal of Broadcasting and Electronic Media, 59(1), 208–220.
Citron, D. K. (2009). Law’s expressive value in combating cyber gender harassment. Michigan Law Review, 108,
373–415.
Citron, D. K. (2014). Hate crimes in cyberspace. Cambridge, MA, USA: Harvard University Press.
Coleman, G. (2012). Our weirdness is free. Triple Canopy, 15, [online]. Retrieved on May 21, 2014, from
http://canopycanopycanopy.com/issues/15/contents/our_weirdness_is_free
Cote, A. C. (2018). Curate your culture: A call for social justice-oriented game development and community
management. In K. L. Gray & D. J. Leonard (Eds.), Woke gaming: Digital challenges to oppression and
social injustice (pp. 193–212). Seattle, WA, USA: University of Washington Press.
Cross, K. (2014). Ethics for cyborgs: On real harassment in an “unreal” place. Loading: The Journal of the Canadian
Games Studies Association, 8(13), 4–21.
D’Addario, D. (2013, December 2). Elan Gale, ‘Diane’ and the age of public Twitter-shaming [Web log post].
Retrieved November 11, 2018, from
https://www.salon.com/2013/12/02/elan_gale_diane_and_the_age_of_public_twitter_shaming/
DeGloma, T. (2014). Seeing the light: The logic of personal discovery. Chicago, IL, USA: University of Chicago Press.
Dewey, C. (2013, December 3). That hoax isn’t funny anymore: Elan Gale and the problem of reality online. The
Washington Post [online]. Retrieved November 11, 2018, from https://www.washingtonpost.com/news/arts-and-
entertainment/wp/2013/12/03/that-hoax-isnt-funny-anymore-elan-gale-and-the-problem-of-reality-online/
Dwyer, L. (2013, November 30). Since when is telling a woman to eat your dick standing up for service workers?
[Web log post]. Retrieved May 20, 2014, from http://www.losangelista.com/2013/11/since-when-is-telling-
woman-to-eat-your.html
Fost, D. (2007, March 27). The attack on Kathy Sierra [Web log post]. Retrieved May 20, 2014, from
http://blog.sfgate.com/techchron/2007/03/27/the-attack-on-kathy-sierra/
Frank, J. (2014, September 1). How to attack a woman who works in video gaming. The Guardian [online]. Retrieved
October 6, 2019, from http://www.theguardian.com/technology/2014/sep/01/how-to-attack-a-woman-who-
works-in-video-games
Gale, E. (2013, November 28). Diane [Web log post]. Retrieved on October 8, 2019, from
https://web.archive.org/web/20131129115133/http://theyearofelan.tumblr.com/post/68424148654/diane
Gray, K. L. (2012). Intersecting oppressions and online communities: Examining the experiences of women of
color in Xbox Live. Information, Communication & Society, 15(3), 411–428.
https://doi.org/10.1080/1369118X.2011.642401
Gray, K. L. (2014). Race, gender, and deviance in Xbox Live: Theoretical perspectives from the virtual margins.
Waltham, MA, USA: Anderson Publishing.
Gray, K. L., Buyukozturk, B., & Hill, Z. G. (2017). Blurring the boundaries: Using Gamergate to examine “real”
and symbolic violence against women in contemporary gaming culture. Sociology Compass, 11(3), e12458.
https://doi.org/10.1111/soc4.12458
Gray, K. L., & Leonard, D. J. (Eds.). (2018). Woke gaming: Digital challenges to oppression and social injustice.
Seattle, WA, USA: University of Washington Press.
Hobsbawm, E. J. (1965). Primitive rebels: Studies in archaic forms of social movement in the 19th and 20th centuries.
New York, NY, USA: W. W. Norton & Company.
Hochschild, A. R. (1983). The managed heart: The commercialization of human feeling. Berkeley, CA, USA:
University of California Press.
Holkins, K. G. (2004, March 19). Green blackboards (and other anomalies) [Web log post]. Retrieved March 9,
2019, from http://www.penny-arcade.com/comic/2004/03/19
Jenkins, H. (1998). Voices from the combat zone: Game grrlz talk back. In J. Cassell & H. Jenkins (Eds.), From
Barbie to Mortal Kombat: Gender and computer games (pp. 328–341). Cambridge, MA, USA: Massachusetts
Institute of Technology Press.
Khan, S. (2019). Challenging hateful extremism: Commission for Countering Extremism report on extremism in
England and Wales. London, UK. Retrieved October 11, 2019, from
https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/836538/
Challenging_Hateful_Extremism_report.pdf
Kuznekoff, J. H., & Rose, L. M. (2013). Communication in multiplayer gaming: Examining player responses to
gender cues. New Media & Society, 15(4), 541–556.
Lartey, J. (2015, June 9). By the numbers: US police kill more in days than other countries do in years [Web log
post]. Retrieved October 7, 2019, from https://www.theguardian.com/us-news/2015/jun/09/the-counted-
police-killings-us-vs-other-countries
Levmore, S. (2010). The Internet’s anonymity problem. In S. Levmore & M. C. Nussbaum (Eds.), The offensive
Internet: Speech, privacy, and reputation (pp. 50–67). Cambridge, MA, USA: Harvard University Press.
Lopez, G. (2018). American police shoot and kill far more people than their peers in other countries [Web log
post]. Retrieved October 8, 2019, from https://www.vox.com/identities/2016/8/13/17938170/us-police-
shootings-gun-violence-homicides
Marcotte, A. (2016, September 15). Donald Trump’s campaign really is Gamergate being played out on a national
scale. Salon. Retrieved on March 8, 2019, from www.salon.com/2016/09/15/gamergater
Massanari, A. (2017). #Gamergate and The Fappening: How Reddit’s algorithm, governance, and culture support
toxic technocultures. New Media & Society, 19(3), 329–346. https://doi.org/10.1177/1461444815608807
Merlan, A. (2015, February 2). A man is making bizarre, terrifying YouTube videos about Brianna Wu [Web log
post]. Retrieved October 8, 2019, from https://jezebel.com/a-man-is-making-bizarre-terrifying-youtube-
videos-abou-1683221832
Meyer, R., & Cukier, M. (2006). Assessing the attack threat due to IRC channels. In Proceedings of the International
Conference on Dependable Systems and Networks (pp. 467–472). Washington, D.C., USA: IEEE Computer Society.
Mills, C. W. (1959). The sociological imagination. New York, NY, USA: Oxford University Press.
Mortensen, T. E. (2018). Anger, fear, and games: The long event of #GamerGate. Games and Culture, 13(8),
787–806. https://doi.org/10.1177/1555412016640408
Nakamura, L. (1995). Race in/for cyberspace: Identity tourism and racial passing on the Internet. Retrieved March
1, 2018, from https://pdfs.semanticscholar.org/3531/da9329d2b7158bd697e1aa8ef073f78de6fb.pdf
Nakamura, L. (2000). Cybertypes: Race, ethnicity and identity on the Internet. London, UK: Routledge.
Nardi, B. (2010). My life as a Night Elf Priest: An anthropological account of World of Warcraft. Ann Arbor, MI,
USA: University of Michigan Press.
Norris, K. O. (2004). Gender stereotypes, aggression, and computer games: An online survey of women.
CyberPsychology & Behavior, 7(6), 714–727.
Nussbaum, M. C. (2010). Objectification and Internet misogyny. In S. Levmore & M. C. Nussbaum (Eds.), The
offensive Internet: Speech, privacy, and reputation (pp. 68–87). Cambridge, MA, USA: Harvard University Press.
Ossowska, M. (1970). Social determinants of moral ideas. Philadelphia, PA, USA: University of Pennsylvania Press.
Perreault, G. P., & Vos, T. P. (2018). The GamerGate controversy and journalistic paradigm maintenance.
Journalism, 19(4), 553–569. https://doi.org/10.1177/1464884916670932
Quinn, Z. (2017). Crash override: How Gamergate (nearly) destroyed my life, and how we can win the fight
against online hate. New York, NY, USA: Public Affairs.
Salter, M. (2018). From geek masculinity to Gamergate: The technological rationality of online abuse. Crime,
Media, Culture, 14(2), 247–264. https://doi.org/10.1177/1741659017690893
Shaw, A. (2014). Gaming at the edge: Sexuality and gender at the margins of gamer culture. Minneapolis, MN,
USA: University of Minnesota Press.
Simmel, G. (1950a). The field of sociology. In K. H. Wolff (Ed.), The sociology of Georg Simmel (pp. 3–25).
New York, NY, USA: The Free Press. (Original work published 1917)
Simmel, G. (1950b). The stranger. In K. H. Wolff (Ed.), The sociology of Georg Simmel (pp. 402–408). New
York, NY, USA: The Free Press. (Original work published 1908)
Simmel, G. (1959). The problem of sociology. In K. H. Wolff (Ed.), Georg Simmel 1858–1918: A collection of
essays with translations and bibliography (pp. 310–335). Columbus, OH, USA: Ohio State University Press.
(Original work published 1908)
Simmel, G. (1971a). Exchange. In D. N. Levine (Ed.), Georg Simmel: On individuality and social forms (pp. 43–
69). Chicago, IL, USA: University of Chicago Press. (Original work published 1907)
Simmel, G. (1971b). Domination. In D. N. Levine (Ed.), Georg Simmel: On individuality and social forms (pp.
96–120). Chicago, IL, USA: University of Chicago Press. (Original work published 1908)
Skardzius, K. (2018). Playing with pride: Claiming space through community building in World of Warcraft. In
K. L. Gray & D. J. Leonard (Eds.), Woke gaming: Digital challenges to oppression and social injustice (pp.
175–192). Seattle, WA, USA: University of Washington Press.
Suler, J. (2004). The online disinhibition effect. CyberPsychology & Behavior, 7(3), 321–326.
U.S. Federal Bureau of Investigation. (2017). Gamergate investigation records. Retrieved on August 9, 2019, from
https://vault.fbi.gov/gamergate
Vossen, E. (2018). On the cultural inaccessibility of gaming: Invading, creating, and reclaiming the cultural
clubhouse. UWSpace [University of Waterloo, Canada, institutional repository]. Retrieved on March 31,
2019, from https://uwspace.uwaterloo.ca/handle/10012/13649
Vysotsky, S., & Allaway, J. (2018). The sobering reality of sexism in the video game industry. In K. L. Gray &
D. J. Leonard (Eds.), Woke gaming: Digital challenges to oppression and social injustice (pp. 101–118).
Seattle, WA, USA: University of Washington Press.
Weigel, D. (2013, December 3). If you want reporters to check their stories before they publish, you’re a hater
[Web log post]. Retrieved March 1, 2019, from
http://www.slate.com/blogs/weigel/2013/12/03/buzzfeed_and_elan_gale_s_internet_hoax_too_good_to_ch
eck.html
Wingfield, N. (2014, October 15). Feminist critics of video games facing threats in ‘‘GamerGate’’ campaign. New
York Times [online]. Retrieved October 8, 2019, from
http://www.nytimes.com/2014/10/16/technology/gamergate-women-video-game-threats-anita-
sarkeesian.html
Yee, N. (2008). Maps of digital desires: Exploring the topography of gender and play in online games. In Y. B.
Kafai, C. Heeter, J. Denner, & J. Y. Sun (Eds.), Beyond Barbie & Mortal Kombat: New perspectives on
gender and gaming (pp. 83–96). Cambridge, MA, USA: Massachusetts Institute of Technology Press.
Zerubavel, E. (2003). Time maps: Collective memory and the social shape of the past. Chicago, IL, USA:
University of Chicago Press.
Zerubavel, E. (2007). Generally speaking: The logic and mechanics of social pattern analysis. Sociological Forum,
22(2), 131–145.
Author’s Note
All correspondence should be addressed to
Katherine Cross
University of Washington, School of Information
1851 N.E. Grant Lane
Seattle, WA 98105, United States
kcross1@uw.edu