Article

Deplatforming: Following extreme Internet celebrities to Telegram and alternative social media

Richard Rogers
Media Studies, University of Amsterdam
Abstract
Extreme, anti-establishment actors are being characterized increasingly as ‘dangerous
individuals’ by the social media platforms that once aided in making them into ‘Internet
celebrities’. These individuals (and sometimes groups) are being ‘deplatformed’ by the
leading social media companies such as Facebook, Instagram, Twitter and YouTube for
such offences as ‘organised hate’. Deplatforming has prompted debate about ‘liberal big
tech’ silencing free speech and taking on the role of editors, but also about the questions
of whether it is effective and for whom. The research reported here follows certain of
these Internet celebrities to Telegram as well as to a larger alternative social media
ecology. It enquires empirically into some of the arguments made concerning whether
deplatforming ‘works’ and how the deplatformed use Telegram. It discusses the effects of
deplatforming for extreme Internet celebrities, alternative and mainstream social media
platforms and the Internet at large. It also touches upon how social media companies’
deplatforming is affecting critical social media research, both into the substance of
extreme speech as well as its audiences on mainstream as well as alternative platforms.
Keywords
Deplatforming, Social Media, Digital Methods, Telegram, Extreme Speech
Corresponding author:
Richard Rogers, Media Studies, University of Amsterdam, Turfdraagsterpad 9, 1012 XT Amsterdam, the
Netherlands.
Email: R.A.Rogers@uva.nl
European Journal of Communication 0(0) 1–17
© The Author(s) 2020
Article reuse guidelines: sagepub.com/journals-permissions
DOI: 10.1177/0267323120922066
journals.sagepub.com/home/ejc
Introduction: Deplatforming on social media
Deplatforming, or the removal of one’s account on social media for breaking
platform rules, has recently been on the rise. It is gaining attention as an antidote
to the so-called toxicity of online communities and the mainstreaming of extreme
speech, or “vitriolic exchange on Internet-enabled media” that “push the boundaries of acceptable norms of public culture” (Pohjonen and Udupa, 2017). It is also
stirring a discussion about the ‘liberal bias’ of US tech giants implementing the
bans (Bilton, 2019; Mulhall, 2019). In the past few years, Facebook, Instagram,
YouTube, Twitter and other platforms have all suspended and removed a variety
of individuals and groups, comprising, according to one accounting, ‘white nationalists’, ‘anti-semites’, ‘alt-right’ adherents, ‘neo-nazis’, ‘hate groups’ and others
(Kraus, 2018). Many of those who have been deplatformed are on the far right
of the ideological spectrum, and certain of them could be described as extreme
Internet celebrities, such as Milo Yiannopoulos and Alex Jones, whose removals
have had a significant impact on their visibility, the maintenance of their fan bases
and the flow of their income streams. Yiannopoulos has claimed to have become
bankrupt by deplatforming, which has included cancellations of a book deal and
college campus appearances (Beauchamp, 2018; Maurice, 2019). Jones has seen the
view counts and seemingly the impact of his posts and videos decline (Wong,
2018).
Deplatformings have been widely reported in the tech news and beyond
(Martineau, 2019). When Yiannopoulos, Jones, Laura Loomer and Paul Joseph
Watson were removed from Facebook and Instagram in 2019 for being ‘dangerous
individuals’ engaged or involved in ‘organised hate’ and/or ‘organized violence’
(Facebook, 2019), it drew widespread reaction, including the story of how
Facebook announced the ban some hours prior to its implementation, allowing
the deplatformed individuals to post notices on their pages, redirecting their audience to other platforms (Martineau, 2019). Laura Loomer, for one, announced her
Telegram channel; Alex Jones pointed to his websites. The migration from main-
stream to alternative social media platforms was underway.
At the same time, protests from these individuals and their followers have been staged on the platforms that have removed them. Loomer, the ‘white nationalist’ banned from Twitter for a ‘racist attack’ on a Muslim US congresswoman, handcuffed herself to the front door of the corporation’s office in New York City, livestreaming her plight and her views on the suppression of ‘conservative’ viewpoints on a supporter’s Periscope account (itself a Twitter service). Other banned users have switched to platforms friendly to their politics, such as Gab, a Twitter alternative with Reddit-style upvoting and downvoting of posts. It has become known as a ‘haven for white supremacists’ and for its defence of free speech (Ohlheiser and Shapira, 2018; Zannettou et al., 2018). It also positions itself as distinct from the ‘left-leaning Big Social monopoly’ (Coaston, 2018).
When deplatformed social media celebrities migrate to alternative platforms,
these sites are given a boost through media attention and increases in user counts.
Milo Yiannopoulos initially turned to Gab after his account was removed from Twitter (Benson, 2016), and around the same time Alex Jones joined it ‘with great fanfare’ (Ohlheiser, 2016). Indeed, when Twitter conducted a so-called ‘purge’ of alt-right accounts in 2016, Gab gained tens of thousands of users in a short time. It is continually described as a favoured platform of expression for extremism, including for the shooter in the Pittsburgh synagogue in 2018, who announced his intended acts there (Nguyen, 2018). Gab drew over a million hits after it became known that the shooter had posted his manifesto there (Coaston, 2018).
But mainstream social media drives more traffic to extreme content than alternative social media platforms or other websites do, at least in the case of Alex Jones, who was banned from Facebook and YouTube, as mentioned earlier. His InfoWars posts, now only available on his websites (and a sprinkling of alternative social media platforms, as we come to), saw a decline in traffic by one-half (Nicas, 2018).
When deplatforming leads to such declines in attention, questions arise about its
effectiveness. Is it indeed a viable means to detoxify mainstream social media and
the Internet more broadly, and/or does it prompt the individuals to migrate to other
platforms with more welcoming and ‘oxygen-giving’ extreme publics?
Effectiveness of deplatforming
There has been some scholarly attention paid to the effectiveness of shutting down
particularly offensive online communities, such as the subreddits r/fatpeoplehate
and r/coontown, banned by Reddit in 2015 for violating its harassment policies. It
was found that the shutdowns worked, in that a proportion of offending users appeared to leave the platform (for Voat, an alternative to Reddit), and the subreddits that inherited the migrating users did not see a significant increase in extreme speech (Chandrasekharan et al., 2017). The closing of those communities was indeed beneficial for Reddit, but less research has been performed on the effectiveness of such bans for the health of social media or the Internet at
large. The Reddit study’s authors reported that not only did Reddit make these
users ‘someone else’s problem’, but also perhaps pushed them to ‘darker corners of
the Internet’ (Chandrasekharan et al., 2017).
The debate concerning the effectiveness of deplatforming has arguments lined
up on both sides. For those arguing that it does not work, deplatforming is said to
draw attention to suppressed materials (Streisand effect), harden the conviction of
the followers, and put social media companies in the position of an arbiter of
speech. For those arguing that deplatforming is effective, it is said that it detoxes
both subspaces (such as subreddits) as well as platforms more generally, produces
a decline in audience and drives extreme voices to spaces that have less oxygen-giving capacity, thereby containing their impact. The Reddit study indeed found
that both the subreddits and the platform more generally saw a decline in the type
of harassment found on r/fatpeoplehate and r/coontown, but less is known about
the alternative platforms to which extreme users may turn.
Telegram as ‘dark corner of the Internet’
Apart from Gab and perhaps Voat (to which deplatformed Pizzagate, incel and QAnon subreddit users are said to have migrated), Telegram is another of those so-called darker corners of the Internet (Wikipedia Contributors, 2019). It is an instant messaging app, founded in 2013 by the same Internet entrepreneurs who launched VKontakte, the social media platform popular in Russia. Telegram has a reputation, whether or not well founded, for highly secure messaging, having notoriously been listed by ISIS as ‘safe’ and having championed privacy from its founding, which coincided with the US state spying revelations by Edward Snowden (Weimann, 2016). Indeed, the founders started Telegram so communications could not be monitored by governments, including the Russian authorities, who pursued the founder on charges of tax avoidance until he fled the country (Cook, 2018). The Russian state later accused Telegram of enabling terrorists because it would not turn over users’ encrypted messages, leading to a ban of the application in Russia. The founders and their programming team are themselves exemplary privacy-enablers, for they require secure communication and have moved from location to location to elude what the founder calls ‘unnecessary influence’ (Thornhill, 2015). As I come to, encrypted communication is one affordance that makes Telegram attractive to certain user groups.
How does Telegram appeal to its users, including those who have been deplatformed for violating platform rules? Telegram not only has the reputation but also
the affordances that would be attractive to those seeking something similar to
‘social privacy’, or the capacity to retain control over what is known about oneself
while still participating (and becoming popular) on social media (Raynes-Goldie,
2010). On platforms such as Facebook, such a user is public-facing at the outset and subsequently makes deft use of aliases, privacy settings, and account and timeline grooming. That is how social privacy is performed. Telegram, however, is
something of a hybrid system, and in contradistinction to Facebook, it leads with
protected messaging, and follows with the social. That is, it is in the first place a
messaging app, where one has an account, and can message others and join groups,
first private ones (by default) but also public ones. It also has some elements of
social media, whereby one may create a channel (public by default) and have
others subscribe to it.
The apt deployment of Telegram would seem to invert social privacy conceptually. The app offers protected communication, appealing to the private user rather than to the public-facing user seeking publicity (De Zeeuw and Tuters, forthcoming). ‘Private sociality’ may be a term that captures operating in private chats and private group chats, both for the private user (not necessarily seeking publicity) as well as for the masked user (who may seek attention). One can operate in private mode and still participate.
Second, beyond private spaces in which to organize, recruit, chat and so forth, the Telegram user may still seek publicity. Here the use of a channel is significant, for it allows the building of a following. Similar to YouTube, one can broadcast to subscribers.
Telegram thereby could be said to reconcile dual desires of protection and
publicity by offering private messaging and broadcasting. It thereby appears to
go some way towards resolving the ‘online extremists’ dilemma’, a variation on the
‘terrorist’s dilemma’, which concerns balancing ‘operational security and public
outreach’ (Clifford and Powell, 2019; Shapiro, 2013).
Telegram, as mentioned earlier, has had other extremist users who are often
discussed together with the Internet celebrities in journalistic pieces (Robins-Early,
2019). In a description of its use by ISIS, the app’s combination of features is
described as follows: ‘Telegram’s public-facing “channels” and private messaging
“chats” make it a “dual-use” weapon [...]’ (Counter-Extremism Project, 2017).
The groups broadcast to followers on channels and recruit and organize through
one-to-one chats, which are secure. One need not maintain a telephone number to use the service; to sign up, one can create an account with a temporary ‘burner phone’ or an Internet proxy phone number, validating the account through a one-time SMS message (Yayla and Speckhard, 2017). Once set up, ISIS accounts or those of other such groups are not removed with the rapidity seen on Twitter or Facebook; many stay up for months or much longer (Shehabat et al.,
2017). Content also endures. On Telegram, there is the capacity to upload large
video files; once uploaded, the videos are linked from channels, and those links
persist unless the channel is closed or the user deletes the file, making it a reliable
source for recruiting materials, but also a space for archiving content that may
have been deplatformed elsewhere.
The deplatformed and Telegram
For the deplatformed, Telegram’s reputation may be appealing. It affords ‘protected speech’ by being permissive of extreme content. It also keeps content up and available, thereby allaying threats of deletion, a key concern for those who have been deplatformed and have seen their content removed. Telegram also offers means to build a following and broadcast to large numbers of users (as on YouTube and Twitter). Like other messaging apps, including WhatsApp, Telegram has groups, though it does not limit their size as much. Groups can have up to 200,000 members (compared with 256 for WhatsApp), and channels can have an unlimited number of subscribers. Telegram also enables large clusters of groups, in that one can forward a message to an unlimited number of groups, as opposed to the smaller number on WhatsApp, a restriction that received attention in the wake of misinformation campaigns around elections in India (and Brazil). (On WhatsApp, India has a special limit of five forwards, compared with 20 globally.) In all, Telegram can compete for deplatformed users by offering the kinds of features sought by those seeking protected speech, archiving as well as a following. It is considered both a ‘protected space’ as well as a ‘publicity space’ (Nagy and Neff, 2015).
This study empirically examines the use of the platform by the extreme Internet celebrities who have migrated to it, inquiring into certain arguments made concerning whether deplatforming ‘works’ and how those who have been deplatformed from mainstream social media use an alternative, Telegram (DeCook, 2019). It discusses the effects of deplatforming for extreme Internet celebrities, alternative and mainstream social media platforms, the Internet at large as well as researchers. It does so first by mapping the alternative social media ecology, creating a bi-partite graph of select extreme Internet celebrities and the platforms they use. Subsequently, the focus is on Telegram, for which a scraper is built and deployed to collect posts made on public channels by select extreme Internet celebrities who have migrated to it. The Telegram analysis utilizes platform data (such as view counts), extreme language detection (hatebase.org) as well as textual analysis (keywords in context). It does so to begin to examine empirically some of the arguments made about the effectiveness of deplatforming.
From the point of view of celebrities newly migrated to Telegram, is it an
effective alternative to mainstream platforms? Do audiences remain robust, or
thin? Are the celebrities as active as previously? Do they become more extreme
in the language they use? We found active celebrities but thinning audiences; we
also determined that their language was mellowing.
Mainstream platforms that deplatform ‘dangerous individuals’ may lose a particular audience consuming extreme speech. Do the audiences also depart the platforms, or do they remain, consuming other materials? One may begin to answer
the question by examining the celebrity discussion about mainstream platforms on
Telegram. One may also examine the hyperlinks from Telegram to the platforms.
Which mainstream platforms remain of relevance to the celebrities, and which fade
from interest? We found that Facebook and Instagram have been successful in that
they have become unattractive to extreme celebrities, whereas YouTube and
Twitter remain significant.
The research also maps out the alternative platform network to which deplatformed social media celebrities have migrated, and how they discuss certain of the destinations. We found that most alternative platforms are used ‘instrumentally’, in the sense that they are merely pointed to rather than described for their particular affordances. The exception is Telegram, which appears to be a refuge, where multiple extreme celebrities have found a soft landing.
Mapping an alternative social media ecosystem
In order to map an alternative social media ecosystem, the research project compiled a list of deplatformed social media celebrities, mainly from the United States and the United Kingdom.[1] At the same time, a list of alternative social media platforms was also built, initially from those mentioned in right-wing public groups on Telegram and supplemented through a so-called associative query snowballing technique, where search engines are queried for pairs of alternative social media platform names (Rogers, 2018). Finally, each celebrity was sought on the
alternative platforms. The connections between celebrities and platforms were
subsequently visualized in a network graph using Gephi, and interpreted through
visual network analysis (Venturini et al., 2015). Three findings stand out (see
Figure 1).
The map shows the centrality of BitChute (alternative to YouTube), Minds
(alternative to Facebook), Gab (alternative to Twitter) as well as Telegram (the
hybrid messaging and broadcasting platform), in the sense that a majority of
the celebrities maintain presences there. Second, one may observe the revival
of the web as a potential destination for extreme celebrity content. Whether by
pointing to personal websites or new subscription services such as freespeech.tv,
the web comes (back) into view for these users. Some years on from the platformization of the web, or the mass migration of activity from websites to social media,
extreme users are turning their attention there anew (Rogers, 2013). Third, there
are two mainstream platforms that remain central, since a number of the users
under study have active accounts there, having as yet not been deplatformed. Even
as their replacements are central in the alternative social media network, YouTube
and Twitter remain of relevance as a destination for extreme content and their
audiences.
Finally, it should be pointed out that the alternative social media network comprises more than ‘the social’, including for example donation and payment processing as well as merchandise destinations. The relevance of those types of alternatives became clear initially when mapping which platforms deplatformed the celebrities (see Figure 2). As PayPal and other payment sites remove particular individuals, alternatives have emerged, though both the scope of this deplatforming and the migration to the alternatives appear still marginal.
The overall question of the robustness and longevity of the alternative network
remains an open one, but at least for Telegram, the research was able to make
some initial determinations concerning the question of whether the extreme voices
shrivel or thrive there despite having been deplatformed by mainstream social
media.
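To illustrate the mapping step, what follows is a minimal sketch of how such a bi-partite celebrity-platform graph can be assembled and exported for visual network analysis in Gephi. The presence data are illustrative placeholders rather than the study’s dataset, and Python with the networkx library is assumed.

```python
# Sketch: assemble a bi-partite celebrity-platform graph and export it for Gephi.
# The presence data below are illustrative placeholders, not the study's dataset.
import networkx as nx

# Each (placeholder) celebrity is mapped to the platforms where an account was found.
presences = {
    "Celebrity A": ["Telegram", "Gab", "BitChute", "personal website"],
    "Celebrity B": ["Telegram", "Minds", "YouTube"],
    "Celebrity C": ["Gab", "BitChute", "Twitter"],
}

G = nx.Graph()
for celebrity, platforms in presences.items():
    G.add_node(celebrity, type="celebrity")    # one node set
    for platform in platforms:
        G.add_node(platform, type="platform")  # the other node set
        G.add_edge(celebrity, platform)        # an account presence becomes an edge

# Gephi reads GEXF; the 'type' attribute allows colouring the two node sets differently.
nx.write_gexf(G, "celebrity_platform_network.gexf")
```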
Scraping Telegram for extreme celebrity activity
A Telegram scraper was built and employed to extract, through the platform’s
API, the contents of celebrity public channels.[2] Here the analysis concerns the
extremity of the language employed, posting activity measures and post view
counts, longitudinally, since the so-called ‘purge’ of ‘dangerous individuals’ from
Facebook and Instagram (in particular) in May 2019, a 6-month time frame in all.
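The scraper itself is not documented in the article. As a minimal sketch of how such collection might look, assuming the Telethon Python library and one’s own Telegram API credentials (the credentials and channel name below are placeholders, not the study’s actual tool), public channel posts with dates and view counts can be retrieved as follows.

```python
# Sketch: retrieve posts (with dates and view counts) from a public Telegram
# channel. Assumes the Telethon library and API credentials from my.telegram.org;
# the credentials and channel name are placeholders, not the study's actual tool.
from telethon.sync import TelegramClient

api_id = 12345                # placeholder
api_hash = "your_api_hash"    # placeholder
channel = "examplechannel"    # placeholder public channel name

posts = []
with TelegramClient("session", api_id, api_hash) as client:
    for message in client.iter_messages(channel, limit=1000):
        posts.append({
            "date": message.date,         # for the longitudinal analysis
            "text": message.text or "",   # post content (may be empty for media-only posts)
            "views": message.views or 0,  # view count, where the platform provides it
        })
print(f"Collected {len(posts)} posts from {channel}")
```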
The determination of extreme speech benefits from the deployment of hatebase.org, the repository of (so-called ambiguous and unambiguous) extreme and hateful speech (Davidson et al., 2017). In the celebrity posts, outlinks to other platforms
are counted, in order to gain an indication of which mainstream platforms are still
considered relevant destinations. Apart from these counts and scores, the project
also produced an analysis of celebrity discussions about mainstream and alternative platforms (with keyword-in-context word trees). We thematized these discussions, finding how deplatforming and replatforming (or migrating to alternative platforms) are framed.

Figure 1. Extreme internet celebrities and the platforms where they remain or to which they have migrated. Bi-partite graph of extreme internet celebrities and the platforms for which they have accounts. October 2019.

Figure 2. Extreme internet celebrities and the platforms that deplatformed them. Bi-partite graph of extreme internet celebrities and the platforms for which they no longer have accounts (or have been ‘demonetized’ by them). October 2019.
When the extreme Internet celebrities migrated to Telegram (and other alternative platforms), one question concerned whether they would be able to sustain themselves there, given that the audiences are smaller (as we found) and, thus, the interaction with the fan base or following less intense. Quantitatively, the audiences on the new platforms have thinned (see Figure 3). Of interest is the consistency of the celebrities in their posting, despite the decline in overall audience strength.
Indeed, for most, it appears the ‘permanent updating culture’ has not been affected
by platform migration (Jerslev, 2016). There have been some changes in behaviour,
however. After a flurry of extreme language posted on alternative platforms in the
immediate aftermath of the May ‘purge’, its usage has flagged (see Figure 4).
Analytically deploying the ‘unambiguous’ extreme and hateful speech in the hatebase repository, we found increasingly fewer mentions of such terms, generally, by most celebrities.[3] Whether one would expect a decrease in the usage of extreme speech, especially given what could be thought of as a less moderated space, is not clear, though in the Reddit study mentioned earlier those users who were forced to migrate to other spaces also appeared to use milder language.
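As a rough sketch of this longitudinal extremity measure, posts can be matched against a term list and bucketed by month. The two-entry lexicon here is a stand-in for the hatebase.org vocabulary, and the posts are assumed to come from the scraper sketched above.

```python
# Sketch: count lexicon matches per month to trace extreme-language usage over
# time. The lexicon is a stand-in for hatebase.org's 'unambiguous' terms, and
# `posts` is assumed to come from the channel scraper sketched earlier.
import re
from collections import Counter

lexicon = {"slur_a", "slur_b"}  # placeholder terms, not the actual vocabulary

monthly_hits = Counter()
for post in posts:
    tokens = re.findall(r"[a-z']+", post["text"].lower())
    hits = sum(1 for token in tokens if token in lexicon)
    monthly_hits[post["date"].strftime("%Y-%m")] += hits

for month, count in sorted(monthly_hits.items()):
    print(month, count)  # e.g. 2019-05 onwards
```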
Which platforms benefit from deplatforming?
The research on how the extreme Internet celebrities describe how they have been
affected by deplatforming proceeded by querying both mainstream and alternative
platform names in the celebrities’ posts, and displaying the platforms as keywords
in context, using word trees. The hyperlinks in celebrity posts also were extracted,
in order to gain an indication of which platforms have content that is being
recommended or at least pointed to. Generally, it was found that Facebook and Instagram are routinely critiqued as sites that do not grant freedom of speech and whose use is generally not in the interests of extreme actors, whereas Twitter and YouTube remain of interest, in the sense that the celebrities make appeals for supporters to use the platforms to spread the word or, in the case of YouTube, to invite them onto their shows (see Figure 5).
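Word trees aggregate keyword-in-context (KWIC) snippets. A minimal sketch of extracting such contexts for a platform name, again assuming the posts collected by the scraper sketched above:

```python
# Sketch: extract keyword-in-context (KWIC) windows for a platform name, the
# raw material for a word tree. `posts` is assumed from the scraper sketch above.
import re

def kwic(texts, keyword, window=5):
    """Yield (left context, keyword, right context) for each keyword occurrence."""
    for text in texts:
        tokens = re.findall(r"\w+", text)
        for i, token in enumerate(tokens):
            if token.lower() == keyword:
                left = " ".join(tokens[max(0, i - window):i])
                right = " ".join(tokens[i + 1:i + 1 + window])
                yield left, token, right

for left, kw, right in kwic((p["text"] for p in posts), "facebook"):
    print(f"{left:>40} | {kw} | {right}")
```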
The research also examined the hyperlinks made from celebrity Telegram posts,
finding that by far the personal websites were most pointed to, followed by Twitter
and YouTube (see Figure 6). Instagram and Facebook draw far fewer links. The link analysis provides an indication of which platforms are still relevant, and which less so, to the celebrities and presumably also to the audiences for whom the celebrities select content. In keeping with the earlier finding of the revival
of the web in the map of the alternative social media ecology, here the web emerges
again as the most linked-to destination, when the links to the personal websites as
well as newly formed subscription platforms are taken collectively. It also shows
that Facebook and Instagram are benefitting from the deplatforming activities in
the sense that celebrity interest in them has declined, while the other mainstream platforms, Twitter and YouTube, have not seen such a concomitant slump. Indeed, as the discursive analysis of Twitter and YouTube indicates, they both continue to be viewed as resources, either for spreading the word or for broadcasting content (see Figure 7).

Figure 3. Platform audiences of extreme internet celebrities before and after migration, depicted as a Sankey diagram. Data per October 2019.
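The outlink tally behind such an analysis can be approximated by extracting URLs from the collected posts and counting destination hosts. A sketch, with the posts again assumed from the scraper above:

```python
# Sketch: extract outlinks from posts and tally destination hosts, as an
# indication of which platforms are still pointed to. `posts` as above.
import re
from collections import Counter
from urllib.parse import urlparse

url_pattern = re.compile(r"https?://\S+")

host_counts = Counter()
for post in posts:
    for url in url_pattern.findall(post["text"]):
        host = urlparse(url).netloc.lower()
        if host.startswith("www."):
            host = host[len("www."):]  # fold www. variants together
        if host:
            host_counts[host] += 1

for host, count in host_counts.most_common(10):
    print(host, count)
```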
Conclusions: Deplatforming effects and researcher migration
This study examines Telegram as a destination for the deplatformed, asking what it has to offer not only to extreme internet celebrities but also to researchers, who in a rather different sense are also being deplatformed, having seen their access to data from mainstream platforms, especially Facebook and Instagram, diminished or removed (Bruns et al., 2018). Should researchers follow the deplatformed to alternative social media, and/or continue to research the mainstream platforms where the data streams have been winnowed and replaced by company-curated data sets? As has been demonstrated, one is able to study the reception of the mainstream platforms through the posts on alternative ones, and gain indications of the extent to which deplatforming has ‘worked’ for the mainstream platforms. Such work does not replace examinations of them, but it shows that analysing one social media site need not result only in ‘single platform studies’ (Rogers, 2019).

Figure 4. Post counts, post views and extremity of language used by all extreme internet celebrities under study with an account on Telegram, May–October 2019.

Figure 5. Facebook discussed by extreme internet celebrities who were deplatformed by the social media service. Word trees showing keyword in context from all Telegram posts, May–October 2019.

Figure 6. Web outlinks from extreme internet celebrity posts on Telegram, arrayed as circle packing diagram, May–October 2019.

Figure 7. Twitter discussed by extreme internet celebrities who were deplatformed by the social media service. Word trees showing keyword in context from all Telegram channel posts, May–October 2019.
Unlike Facebook and Instagram, for researchers Telegram is not ‘locked’.
Apart from certain rate limiting, the platform allows widespread probing of its
public parts, both groups as well as channels.[4] Not only is it possible to study how
other platforms are discussed and linked to, but also the extreme voices may be
studied, where questions may be posed concerning (at least) two widespread views
that have been circulated about the effects of channelling them into an alternative
set of platforms friendly to them. Does the content become only more and more
extreme? There is also the question of the thinning of audiences, both in gross terms and over time, and whether the decline in audience strength provides less
oxygen and thus decreases activity. As reported, audiences have thinned, activity
has remained steady, and the language employed has become milder.
While difficult to determine with certainty, Telegram does not appear to be used by these extreme Internet celebrities for private recruitment, content archiving and the other affordances that would appear to make it attractive to those having been deplatformed. Judging from the hyperlink and discursive analyses, Telegram rather is deployed more like Twitter, Instagram, YouTube or Facebook, in a broadcasting mode, with short (and frequent) posts put out on public channels.
‘Cancel culture’ is a contemporary term that describes the larger phenomenon of public vilification for offensive speech or action, largely on social media but also through other forms of deplatforming such as the calling off of a speaking engagement (Bromwich, 2018; McDermott, 2019). One is cancelled by powerful media forces (Coates, 2019), much as a television show may be cancelled despite thriving with niche content, its audience not as sizable as desired. The term also has been deployed by extreme Internet celebrities to express victimhood or victimization. Indeed, when researching how these users describe the mainstream platforms that deplatformed them, there are expressions of having been wronged, and also of having been asymmetrically (and unfairly) treated, when banned. Being cancelled by Facebook, Instagram, Twitter and/or YouTube has stark consequences for the maintenance of a fan base, following and revenue stream, as has been reported. Migrating to alternative social media may not offer as much. Platform cancelling at the same time also demonstrates a shift in what is considered acceptable on social media. Unacceptable content and individuals are removed.
Here it is important to return to the question of the ‘locked’ platforms, asking
how to research the delistings, deletions and other cancellations when the data to
do so on mainstream platforms become unavailable. Following the extreme internet celebrities to alternative social media platforms provides one manner of working on the question of the effects and effectiveness of deplatforming, albeit from celebrity points of view as well as from their posting and linking behaviour. The alternative social media ecology also could be studied for the materials no longer available on the mainstream platforms, having been moved and archived there.
While there are these and other research opportunities, following the extreme
internet celebrities to their new media does not substitute for critical mainstream
platform monitoring, and the study of what is deemed worthy of cancellation.
Acknowledgements
The project was undertaken by Lucia Bainotti, Gabriele Colombo, Veronica Moretti, Stijn
Peeters, Leonardo Sanna, Silvia Semenzin, Noemi Schiavi and the author at the Institute of
Advanced Studies, University of Bologna, with thanks to our host, Giovanna Cosenza.
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: The research has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 732942, project ODYCCEUS.
Notes
1. During the research period in October 2019, a series of Italian individuals and groups were deplatformed by Facebook and other social media sites, though they did not immediately migrate, or at least announce their movements, to alternative platforms.
2. The capacity to scrape public groups is also a feature of the tool.
3. The extreme Internet celebrities may have mellowed in their explicit utterances, according
to the hatebase analysis, but they may still trigger hateful reactions through the use of
implicit language. Moreover, the analysis here relies only on the celebrities’ speech rather
than the replies and comments.
4. To subscribe to a private channel or access a private group, the researcher needs an
invitation (or invite link) from the administrator. Researchers have studied such
groups, for example, in an analysis of the culture of revenge porn sharing (Semenzin
and Bainotti, 2020) as well as ISIS lone wolf recruitment (Shehabat et al., 2017). There it
is argued that encrypted communication enables the sharing of more violent content.
References
Beauchamp Z (2018) Milo Yiannopoulos’s collapse shows that no-platforming can work. Vox, 5 December. https://www.vox.com/policy-and-politics/2018/12/5/18125507/milo-yiannopoulos-debt-no-platform
Benson T (2016) Inside the ‘Twitter for racists’: Gab – The site where Milo Yiannopoulos goes to troll now. Salon, 5 November. https://www.salon.com/2016/11/05/inside-the-twitter-for-racists-gab-the-site-where-milo-yiannopoulos-goes-to-troll-now/
Bilton N (2019) The downfall of Alex Jones shows how the Internet can be saved. Vanity Fair, 5 April. https://www.vanityfair.com/news/2019/04/the-downfall-of-alex-jones-shows-how-the-internet-can-be-saved
Bromwich JE (2018) Everyone is cancelled. New York Times, 28 June.
Bruns A, Bechmann A, Burgess J, et al. (2018) Facebook shuts the gate after the horse has
bolted and hurts real research in the process. Internet Policy Review, 25 April.
Chandrasekharan E, Pavalanathan U, Srinivasan A, et al. (2017) You can’t stay here: The efficacy of Reddit’s 2015 ban examined through hate speech. Proceedings of the ACM on Human-Computer Interaction 1(2): art. 31.
Clifford B and Powell HC (2019) De-platforming and the online extremist’s dilemma. Lawfare Blog, 6 June. https://www.lawfareblog.com/de-platforming-and-online-extremists-dilemma
Coaston J (2018) Gab, the social media platform favored by the alleged Pittsburgh shooter, explained. Vox, 29 October. https://www.vox.com/policy-and-politics/2018/10/29/18033006/gab-social-media-anti-semitism-neo-nazis-twitter-facebook
Coates T-N (2019) The cancellation of Colin Kaepernick. New York Times, 22 November.
Cook J (2018) The incredible life of Pavel Durov – ‘Russia’s Mark Zuckerberg’ who is raising $2 billion for his messaging app. Business Insider, 8 February. https://www.businessinsider.nl/the-incredible-life-of-pavel-durov-the-entrepreneur-known-as-the-mark-zuckerberg-of-russia-2016-3/
Counter-Extremism Project (2017) Terrorists on Telegram. New York: Counter-Extremism
Project.
Davidson T, Warmsley D, Macy M, et al. (2017) Automated hate speech detection and the problem of offensive language. In: Proceedings of the Eleventh International AAAI Conference on Web and Social Media, Montreal, Canada, 15–18 May.
De Zeeuw D and Tuters M (forthcoming) The Internet is serious business: On the deep
vernacular web and its discontents. Cultural Politics.
DeCook JR (2019) Curating the future: The sustainability practices of online hate groups.
PhD Dissertation, Michigan State University, East Lansing, MI.
Facebook (2019) Community Standards: Dangerous individuals and organizations. Menlo Park, CA: Facebook. https://www.facebook.com/communitystandards/dangerous_individuals_organizations
Jerslev A (2016) In the time of the microcelebrity: Celebrification and the YouTuber Zoella.
International Journal of Communication 10: 5233–5251.
Kraus R (2018) 2018 was the year we (sort of) cleaned up the internet. Mashable, 26 December. https://mashable.com/article/deplatforming-alex-jones-2018/?europe=true
McDermott J (2019) Those people we tried to cancel? They’re all hanging out together. New
York Times, 2 November.
Martineau P (2019) Facebook bans Alex Jones, other extremists – But not as planned.
Wired, 2 May. https://www.wired.com/story/facebook-bans-alex-jones-extremists/
Maurice EP (2019) Milo Yiannopoulos ‘can’t put food on the table’ and might ‘retire from social media entirely’. PinkNews, 9 September. https://www.pinknews.co.uk/2019/09/09/milo-yiannopoulos-cant-put-food-on-table-might-retire-social-media-entirely/comments/
Mulhall J (2019) Deplatforming works: Let’s get on with it. Ctrl Alt-Right Delete Newsletter. London: Hope Not Hate. https://www.hopenothate.org.uk/2019/10/04/deplatforming-works-lets-get-on-with-it/
Nagy P and Neff G (2015) Imagined affordance: Reconstructing a keyword for communication theory. Social Media + Society 1–9. DOI: 10.1177/2056305115603385.
Nguyen T (2018) Gab’s demise is just the beginning of a horrific new era of far-right extremism. Vanity Fair, 29 October. https://www.vanityfair.com/news/2018/10/gab-alt-right-platform-tree-of-life-synagogue-shooting
Nicas J (2018) Alex Jones said bans would strengthen him. He was wrong. New York Times,
4 September.
Ohlheiser A (2016) Banned from Twitter? This site promises you can say whatever you want.
Washington Post, 29 November.
Ohlheiser A and Shapira I (2018) Gab, the white supremacist sanctuary linked to the
Pittsburgh suspect, goes offline (for now). Washington Post, 29 October.
Pohjonen M and Udupa S (2017) Extreme speech online: An anthropological critique of
hate speech debates. International Journal of Communication 11: 1173–1191.
Raynes-Goldie K (2010) Aliases, creeping, and wall cleaning: Understanding privacy in the
age of Facebook. First Monday 15(1–4). https://firstmonday.org/article/view/2775/2432.
Robins-Early N (2019) Like ISIS before them, far-right extremists are migrating to
Telegram. Huffington Post, 10 May.
Rogers R (2013) Right-Wing Formations in Europe and Their Countermeasures: An Online Mapping. Amsterdam: Govcom.org Foundation.
Rogers R (2018) Issuecrawling: Building lists of URLs and mapping website networks. In:
Lury C, Clough PT, Chung U, et al. (eds) Routledge Handbook of Interdisciplinary
Research Methods. London: Routledge, pp. 169–175.
Rogers R (2019) Doing Digital Methods. Los Angeles, CA: SAGE.
Semenzin S and Bainotti L (2020) The use of Telegram for the non-consensual dissemination of intimate images: Gendered affordances and the construction of masculinities. SocArXiv. DOI: 10.31235/osf.io/v4f63.
Shapiro JN (2013) The Terrorist’s Dilemma: Managing Violent Covert Organizations.
Princeton, NJ: Princeton University Press.
Shehabat A, Mitew T and Alzoubi Y (2017) Encrypted jihad: Investigating the role of the Telegram app in lone wolf attacks in the West. Journal of Strategic Security 10(3): 27–53.
Thornhill J (2015) Lunch with the FT: Pavel Durov. Financial Times, 3 July.
Venturini T, Jacomy M and Pereira D (2015) Visual network analysis: The example of the Rio+20 online debate. Working Paper, January. Sciences Po Medialab, Paris.
Weimann G (2016) Terrorist migration to the dark Web. Perspectives on Terrorism 10(3):
40–44.
Wikipedia Contributors (2019) Voat. Wikipedia, The Free Encyclopedia, 16 August.
Available at: https://en.wikipedia.org/w/index.php?title=Voat&oldid=906229670.
Wong JC (2018) Don’t give Facebook and YouTube credit for shrinking Alex Jones’ audi-
ence. The Guardian, 5 September.
Yayla AS and Speckhard A (2017) Telegram: The mighty application that ISIS loves. Brief report, International Center for the Study of Violent Extremism. https://www.icsve.org/telegram-the-mighty-application-that-isis-loves/
Zannettou S, Kwak H, De Cristofaro E, et al. (2018) What is Gab? A bastion of free speech or an alt-right echo chamber? In: WWW ’18 Conference Companion. New York: ACM. DOI: 10.1145/3184558.3191531.
Rogers 17
... The trust on social media information sharing is important for the social media users because it can influence their overall understanding (Rogers, 2020). However, the social media users are recommended to communicate a good piece of knowledge with one another that can develop our comprehensive understanding and trust (Zarzycka et al., 2021). ...
... Social media users are advised to develop effective tactics, and marketers should seek expert consultation to enhance communication through social media platforms. According to Rogers (2020), it is advisable to enhance social media communication through a strategic approach that can significantly impact audience comprehension. In summary, high-quality information is essential for enhancing the efficiency of social media platforms, which is vital for achieving impactful social media communication. ...
Article
Full-text available
The study is conducted to determine the impact of social media posts (SMP), information quality, accessibility to social media (ASM), social media user trust (SMUT) and technological literacy (TL) on effectiveness of social media communication (ESMC). A survey based data collection approach was used and a sample of 386 library users in Shanghai and Beijing was collected using survey random sampling method. The study used IBM SPSS 26 for determining correlations, model summary, analysis of variance (ANOVA) and regression analysis. It was found that there is no impact of SMP on ESMC. The study determined that information quality, ASM, SMUT, and TL have a significant impact on ESMC. The study addresses the gaps in knowledge and practitioners and policymakers are recommended for significant actions to improve the ESMC.
... The Covid-19 pandemic has provided fertile ground for the spread of disinformation and conspiracy theories. In response, social media platforms have implemented stringent content moderation measures, including the deletion of problematic content and deplatforming of offending accounts, compelling conspiracy theorists to migrate towards the messaging app Telegram (Rogers 2020). Such actors face 'deplatforming', in which users are ejected "from a specific technology platform by closing their accounts, banning them, or blocking them from using the platform or its services" (Radsch, 2021, p.109). ...
Article
The Covid-19 pandemic has significantly amplified the dissemination of conspiracy theories. In response, social media platforms have intensified their efforts in content moderation, prompting conspiracy theorists to seek refuge on platforms like Telegram. Our study examines conspiracy theorists' perceptions of and misconceptions about content moderation practices on different social media platforms, as discussed in Telegram channels of the “Querdenken” community. Different narratives emerge in relation to the platforms discussed (Telegram, Facebook, and YouTube). Our findings provide insight into how conspiracy theorists construe disruptions in their communication channels, intertwining notions of platform power with their conspiratorial worldview.
... As platforms increasingly mediate social life, platform policy functions as social policy and ICT firms as judge, jury, and executioner. For example, major social networking sites engage in "deplatforming" or the suspension of accounts violating their Terms of Service, such as the 2021 removal of Donald Trump from Twitter following January 6 th (Rogers 2020;Floridi 2021). Deplatforming is plagued by scale due to its focus on individual figureheads. ...
Article
Over the past decade, service providers have taken on larger roles in the regulation of violent digital networks. As reactionary platforms like Gab and Parler emerge to cater to far-right users estranged from mainstream social networking sites, their stack suppliers – domain registrars, web hosts, app stores, etc. – may enforce their Terms of Service, revoking access to services, in what José Van Dijck et al. term “deplatformization.” Deplatformization has proven circumstantially effective as in the 2021 destruction of Parler by Amazon Web Services. However, reactionary platforms have responded by building their own service providers, marketing them as “cancel-proof,” and seeding a cottage industry of far-right infrastructure firms immune to deplatformization. In this trend, which I term “replatformization,” far-right networks scale their capacity to fracture and terrorize online publics. This paper investigates replatformization through a diachronic analysis of the stack of Kiwi Farms, a text forum known for its deadly transmisogynistic harassment campaigns, its history as “victim” of service denial by multiple firms including Cloudflare, and its in-house solutions such as DDoS mitigator “KiwiFlare” and web host “Final Solutions LLC.” Anchored by materialist critical theory, I argue that the replatformization of far-right platforms reifies the role of reactionary digital networks within the social reproduction of racial capitalism. I trace an emerging reactionary Stack-consciousness as KF users become a vanguard in digital service provision. Efforts to regulate online hate culture must attend to the fact that the private ownership of the means of mediation empowers reactionary webmasters with inordinate leverage.
... In Spring of 2020, Israeli Telegram became a staple venue of civic-minded news consumption, as the platform served as the Ministry of Health's primary official output in updating about new COVID-19 patients and measures (Nossek, 2022). The MoH channel, set up in 2018, was until then used primarily for the circulation of press releases, utilizing the platform's "unsend everything" affordances (Urman & Katz, 2020;Rogers, 2020) in order to correct and edit such releases over time. But as contagion numbers rose, a new, less professionalized audience joined in: civic-minded audiences looking to be instantly informed about the trajectory of what will become the first waves of the pandemic in Israel. ...
Article
During the coronavirus pandemic, Israeli Telegram became a staple venue of pandemic news, as the platform served as the Ministry of Health’s primary official output. Over time, Telegram also became home to a community of ‘Guerilla Analysts’ who pooled together their analytical resources to create a ‘National Graph Headquarters’. Utilizing the same data as the MoH channel, this ragtag crew of data-loving laypeople created a space in which data-literacies are shared and developed. Informed by conceptualizations of messaging apps as “news meso-space(s)”, digital collaborative information-making practices, and the notion that visualizations can promote civic deliberation and political empowerment, this paper explores the rhetorical and discursive framework that enabled informational migration and evolution, and the co-production of data-oriented pandemic narratives across both groups, resulting in an exceptionally empowering informational eco-system. It does so by applying qualitative rhetorical visualization analysis to 100 pairs of institutional and guerilla visualizations of the same data, prior to qualitative discourse analysis of 30 related comment threads. Findings reveal that while the ‘Institutional Voice’ remained focused on conservative visualizations of the present and past, guerilla analysts often re-visualize MoH visualizations into predictive narratives, and take bolder swings in proposing actionable implications. While primarily focused on sociable elucidation of complex analyses, comment discourse also challenges and re-visualizes ‘Guerilla’ narratives, further extending informational empowerment. Predominantly conceptualized as a subversive venue, I highlight Telegram as a civic platform and propose a conceptualization of messaging apps as productive and empowering meso-spaces, utilizing common social and analytical resources towards a common good.
... By "controversial questions", on the other hand, I mean questions that tend to emerge from more permissible spaces that are not always modelled in red teaming environments. Despite having suspended or "deplatformed" (Rogers, 2020) several controversial subreddits, Reddit can be considered one of such places. Reddit continues to be a place where users find semihidden subreddits specifically dedicated for transgressive and uncomfortable questions about a variety of "harms", including sex, violence, anatomy, politics, culture or contested histories. ...
Article
For the past year, using large language models (LLMs) for content moderation appears to have solved some of the perennial issues of online speech governance. Developers have promised 'revolutionary' improvements (Weng, Goel and Vallone, 2023), with large language models considered capable of bypassing some of the semantic and ideological ambiguities of human language that hinder moderation at scale (Wang et al., 2023). For this purpose, LLMs are trained to generate “moderate speech” – that is, not to utter offensive language; to provide neutral, balanced and reliable prompt outputs; and to demonstrate an appreciation for complexity and relativity when asked about controversial topics. But the search for optimal content moderation obscures broader questions about what kind of speech is being generated. How does generative AI speak “moderately”? That is: under what norms, training data and larger institutions does it produce “moderate speech”? By examining the regulatory frameworks AI labs, comparing responses to moderation prompts across three LLMs and scrutinising their training datasets, this paper seeks to shed light on the norms, techniques and regulatory cultures around the generation of “moderate speech” in LLM chat completion models.
... Subreddit moderators can intervene at the content level, which, in addition to removing content, includes approving content, managing users, changing settings (e.g., rules), or editing flairs / labels (Li, Hecht, and Chancellor 2022). Moderation can occur at the user-level (e.g., deplatforming, shadow-banning) (Rogers 2020;Myers West 2018) or community-level (Trujillo and Cresci 2022;Chandrasekharan et al. 2022) (e.g., quarantining). We choose to focus on content removal because it is one of the most common moderation interventions and has been shown to influence community members' participation (Li, Hecht, and Chancellor 2022). ...
Preprint
Full-text available
We often treat social media as a lens onto society. How might that lens be distorting the actual popularity of political and social viewpoints? In this paper, we examine the difference between the viewpoints publicly posted in a community and the privately surveyed viewpoints of community members, contributing a measurement of a theory called the "spiral of silence." This theory observes that people are less likely to voice their opinion when they believe they are in the minority--leading to a spiral where minority opinions are less likely to be shared, so they appear even further in the minority, and become even less likely to be shared. We surveyed active members of politically oriented Reddit communities to gauge their willingness to post on contentious topics, yielding 627 responses from 108 participants about 11 topics and 33 subreddits. We find that 72.6% of participants who perceive themselves in the minority remain silent, and are only half as likely to post their viewpoint compared to those who believe their opinion is in the majority. Communities perceived as being more inclusive reduce the magnitude of this effect. These results emphasize how far out of step the opinions we see online may be with the population they purport to represent.
... Once hate speech is identified, it is deleted. Alternatively, quarantine and deplatforming reduce the supply of online hate speech by blocking detected content temporarily (Ullmann & Tomalin, 2020) or by removing the accounts that have disseminated the material (Rogers, 2020; see also Copland, 2020;Jhaver et al., 2021). Social media platforms' efforts to detect and remove or quarantine hate speech online are, to some extent, the result of regulatory pressures (Chetty & Alathur, 2018). ...
Article
Full-text available
Counter-speech is considered a promising tool to address hate speech online, notably, by promoting bystander reactions that could attenuate the prevalence or further dissemination of hate. However, it remains unclear which types of counter-speech are most effective in attaining these goals and which might backfire. Advancing the literature, we examined the effect of four types of counter-speech (i.e., educating the perpetrator, calling on others to intervene, diverting the conversation, and abusing the perpetrator) on a range of bystander behavioral intentions in an experimental study (N = 250, UK-based adults). Overall, counter-speech did not affect bystanders’ subsequent responses to hate speech. Having said this, as expected, diversionary counter-speech increased intentions to ignore hate speech, which suggests unintended consequences. The study illustrates that counter-speech may not be sufficiently impactful in regulating bystanders’ reactions to hate speech online.
... Alt-tech platforms tend to mimic affordances of mainstream platforms (e.g., YouTube or Reddit) while advertising minimal content moderation policies and free speech absolutism, attracting organizations and users banned from mainstream sites like Facebook and X (de Keulenaar, 2023;Rogers, 2020;Zeng and Schäfer, 2021). The combination of familiar affordances of social media platforms, explicit lack of content moderation, and a grievance-based identity of perceived oppression by "Big Tech censors" create ideal conditions for conspiracy narratives to thrive and turn to more extreme, and even violence-legitimating, ideas (Cinelli et al., 2022;Jasser et al., 2021;Nouri et al., 2021). ...
Article
The mainstreaming of conspiracy narratives has been associated with a rise in violent offline harms, from harassment, vandalism of communications infrastructure, assault, and in its most extreme form, terrorist attacks. Group-level emotions of anger, contempt, and disgust have been proposed as a pathway to legitimizing violence. Here, we examine expressions of anger, contempt, and disgust as well as violence, threat, hate, planning, grievance, and paranoia within various conspiracy narratives on Parler. We found significant differences between conspiracy narratives for all measures and narratives associated with higher levels of offline violence showing greater levels of expression.
... They found that these accounts become more active and toxic but gain fewer followers. Other studies (Rauchfleisch & Kaiser, 2021;Rogers, 2020) use a similar approach. ...
Article
Full-text available
Content moderation decisions can have variable impacts on the events and discourses they aim to regulate. This study analyzes Twitter data from before and after the removal of key Arizona Election Audit Twitter accounts in March of 2021. After collecting tweets that refer to the election audit in Arizona in this designated timeframe, a before/after comparison examines the structure of the networks, the volume of the participating population, and the themes of their discourse. Several significant changes are observed, including a drop in participation from accounts that were not deplatformed and a de-centralization of the Twitter network. Conspiracy theories remain in the discourse, but their themes become more diffuse, and their calls to action more abstract. Recruiting calls to join in on promoting and publicizing the audit mostly come to an end. The decision by Twitter to deplatform key election audit accounts appears to have greatly disrupted the hub structure at the center of the emergent network that formed as a response to the election audit. By intervening in the network, moderators successfully defused much of the Twitter-based participation in the Arizona Election Review of 2021. This instance demonstrates the efficacy of network-driven interventions in platform moderation, specifically for events or accounts that use social media to organize or encourage bad-faith attacks on civic instituions.
Article
This paper contributes to the critical study of the platformisation and deplatformisation of software development, and of how networked infrastructures commodify, configure and challenge relations between code, coders, communities, technologies, investors and industries. It explores the political-economic and cultural dimensions of the platformisation of software development in the news industry through a case study of GitHub. It then examines how resistance to platformisation and counter-mobilisations in the context of free software, art and activism surface alternative arrangements for socialising software development imbued with other logics. The paper proposes the concept of "connective coding" to characterise GitHub's dominant modes of configuring and capitalising on public repositories and profiles, and the power relations that underpin them, whereby public software and project development work become assets in the platform economy that can be variously capitalised by the platform and its associated third-party ecosystem. This institutional perspective is complemented by an analysis of newsroom industry practices mediated by the platform and of how GitHub modulates visibility in this space. The second part of the paper reflects on efforts to deplatformise software development by examining various responses and counter-mobilisations partly prompted by Microsoft's controversial purchase of GitHub. It reviews the development of alternative coding spaces such as Gogs, Gitea, Radicle and Forgejo, and the values, practices, concerns and communities associated with them. In doing so, it contributes to conference themes pertaining to political economy, labour and resistance in digital industries.
Article
In this essay, we reconstruct a keyword for communication: affordance. Affordance, adopted from ecological psychology, is now widely used in technology studies, yet the term lacks a clear definition. This is especially problematic for scholars grappling with how to theorize the relationship between technology and sociality in complex socio-technical systems such as machine-learning algorithms, pervasive computing, the Internet of Things, and other "smart" innovations. Within technology studies, emerging theories of materiality, affect, and mediation all necessitate a richer and more nuanced definition of affordance than the field currently uses. To address this, we develop the concept of imagined affordance. Imagined affordances emerge between users' perceptions, attitudes, and expectations; between the materiality and functionality of technologies; and between the intentions and perceptions of designers. We use imagined affordance to evoke the importance of imagination in affordances: expectations for technology that are not fully realized in conscious, rational knowledge. We also use it to distinguish our process-oriented, socio-technical definition of affordance from the "imagined" consensus of the field around a flimsier use of the term, and to better capture the importance of mediation, materiality, and affect. We suggest that imagined affordance helps to theorize the duality of materiality and communication technology: namely, that people shape their media environments, perceive them, and have agency within them because of imagined affordances.
Book
Teaching the concrete methods needed to use digital devices, search engines and social media platforms to study some of the most urgent social issues of our time, this is the essential guide to the state of the art in researching the natively digital. With explanation of context and techniques and a rich set of case studies, Richard Rogers teaches you how to:
· Build a URL list to discover internet censorship
· Transform Google into a research machine to detect source bias
· Make Twitter API outputs comprehensible and tell stories
· Research Instagram to locate 'hashtag publics'
· Extract and fruitfully analyze Facebook posts, images and video
· And much, much more
Designed with a suite of video tutorials and online tools as well as the Digital Methods Manual Interactive eBook, this is the guide to doing digital methods you have been waiting for.
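As a minimal illustration of the first technique in the list above, one can fetch each URL in a list and record response codes as a crude first signal of blocking. The URLs below are placeholders, and this is only a sketch: actual censorship research compares results across vantage points inside and outside a country.

```python
# Sketch: probe a URL list and record HTTP outcomes as a rough
# first-pass signal of blocking. URLs are illustrative placeholders.
import urllib.request
import urllib.error

url_list = ["https://example.org", "https://example.com/blocked-page"]

for url in url_list:
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            print(f"{url}: HTTP {response.status}")
    except urllib.error.HTTPError as e:
        print(f"{url}: HTTP {e.code}")
    except urllib.error.URLError as e:
        print(f"{url}: unreachable ({e.reason})")
```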
Chapter
The 'how to' research protocol that follows describes how to build lists of URLs to seed link-crawling software and, ultimately, make link maps of right-wing extremism and 'new right' populism in particular European countries. The maps show links between websites, or online networks of websites, that can be analysed according to a series of technical characteristics, though here a substantive analysis is also undertaken. These methods may be situated alongside reading party manifestos and favoured literature, going native by embedding oneself in the groups, interviewing imprisoned or former group members, and other qualitative techniques for distilling significant content. The online mapping method of issue crawling can thus be considered either as an exploratory step that provides leads for further in-depth analysis, or as a means to create country reports in broad strokes.
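The core mechanical step, from seed URL list to link map, can be sketched as follows. The seed URLs are placeholders, and this is a simplification of actual issue-crawling software, which also iterates (snowballs) over discovered sites and applies co-link analysis.

```python
# Sketch: fetch each seed URL, extract outgoing hyperlinks, and build
# a host-level directed link network. Seeds are placeholders.
import re
import urllib.request
import networkx as nx
from urllib.parse import urlparse

seeds = ["https://example.org", "https://example.net"]
graph = nx.DiGraph()

for seed in seeds:
    try:
        html = urllib.request.urlopen(seed, timeout=10).read().decode("utf-8", "ignore")
    except OSError:
        continue  # skip unreachable seeds
    for href in re.findall(r'href="(http[^"]+)"', html):
        source = urlparse(seed).netloc
        target = urlparse(href).netloc
        if source and target and source != target:
            graph.add_edge(source, target)  # outlink between hosts

print(f"{graph.number_of_nodes()} hosts, {graph.number_of_edges()} links")
```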
Conference Paper
Over the past few years, a number of new "fringe" communities, like 4chan or certain subreddits, have gained traction on the Web at a rapid pace. However, more often than not, little is known about how they evolve or what kind of activities they attract, even though recent research has shown that they influence how false information reaches mainstream communities. This motivates the need to monitor these communities and analyze their impact on the Web's information ecosystem. In August 2016, a new social network called Gab was created as an alternative to Twitter. It positions itself as putting "people and free speech first" and welcomes users banned or suspended from other social networks. In this paper, we provide, to the best of our knowledge, the first characterization of Gab. We collect and analyze 22M posts produced by 336K users between August 2016 and January 2018, finding that Gab is predominantly used for the dissemination and discussion of news and world events, and that it attracts alt-right users, conspiracy theorists, and other trolls. We also measure the prevalence of hate speech on the platform, finding it to be much higher than on Twitter, but lower than on 4chan's Politically Incorrect board.
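A prevalence measure of this kind is typically the share of posts containing at least one term from a hate-speech lexicon, compared across platforms. The sketch below shows that calculation with toy placeholders; the paper itself works over a far larger corpus with an established word list.

```python
# Sketch: share of posts per platform containing at least one
# lexicon term. Lexicon tokens and posts are harmless placeholders.
import re

hate_lexicon = {"slur1", "slur2"}  # placeholder tokens, not real slurs

def contains_hate_term(post: str) -> bool:
    tokens = set(re.findall(r"[a-z0-9']+", post.lower()))
    return bool(tokens & hate_lexicon)

corpora = {
    "gab":     ["free speech first", "slur1 everywhere", "news of the day"],
    "twitter": ["breaking news", "sports update", "slur2 mention"],
}
for platform, posts in corpora.items():
    prevalence = sum(map(contains_hate_term, posts)) / len(posts)
    print(f"{platform}: {prevalence:.1%} of posts contain a lexicon term")
```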
Article
Exploring the cases of India and Ethiopia, this article develops the concept of "extreme speech" to critically analyze the cultures of vitriolic exchange on Internet-enabled media. While online abuse is largely understood as "hate speech," we make two interventions to problematize the presuppositions of this widely invoked concept. First, extreme speech emphasizes the need to contextualize online debate with attention to user practices and particular histories of speech cultures. Second, related to context, is the ambiguity of online vitriol, which defies a simple antonymous conception of hate speech versus acceptable speech. The article advances this analysis using the approach of "comparative practice," which, we suggest, complicates the discourse of Internet "risk" increasingly invoked to legitimate online speech restrictions.
Article
The study aims to capture links between the use of the encrypted communication channel Telegram and the lone-wolf attacks that occurred in Europe between 2015 and 2016. To understand the threads of ISIS communication on Telegram, we used a digital ethnography approach consisting of self-observation of information flows on four of ISIS's most celebrated Telegram channels. We draw on public sphere theory and coin the term "terror socio-sphere 3.0" as the theoretical background of this study. The collected data are presented as screenshots to provide visual evidence of ISIS communication threads. The study shows that ISIS Telegram channels play a critical role in personal communication with potential recruits and in the dissemination of propaganda that encourages "lone wolves" to carry out attacks in the world at large. The study was limited to a number of widely celebrated channels.
Article
ISIS has been the most successful terrorist organization in history at using social media and the Internet to distribute its propaganda, disseminate its news and, more importantly, communicate. The frequency and quality of ISIS posts on the Internet, including its videos, memes and online journals, are such as to make many professional editors and producers envious, and they receive much attention. Telegram has become the choice of ISIS due to its specifications: it provides secure encrypted communications and allows users to share large files and operate their accounts with impunity. While Telegram administrators claim they favor speech free of interference, it is time for the owners of Telegram to thoroughly consider the presence and activities of ISIS on their digital platform. Telegram has become the ultimate tool for the bloodiest terrorist organization in history, carrying and spreading its terrorist ideology around the world, recruiting and even directing cadres to carry out attacks globally. Recently, the families of the San Bernardino shooting victims sued Facebook, Google, and Twitter, claiming that these social media companies permitted ISIS to flourish on their platforms. It may soon happen that Telegram will also have to deal with legal actions as ISIS cadres continue to utilize the application for their terror operations and communications. Reference: Yayla, Ahmet S. & Speckhard, Anne (May 9, 2017) 'Telegram: The Mighty Application that ISIS Loves', ICSVE Brief Reports, http://www.icsve.org/brief-reports/telegram-the-mighty-application-that-isis-loves/
Preprint
This paper analyzes the role of Telegram affordances in orienting, amplifying and normalizing the non-consensual diffusion of intimate images (NCII). We focus on the sense of anonymity, the platform's weak regulation and the possibility of creating large male communities, arguing that these affordances are 'gendered affordances' (Schwartz & Neff, 2019), as they orient male participants' harassing behaviours and, in concert with an established misogynist culture, contribute to the reinstatement of hegemonic masculinity (Connell, 2005). The research draws on data collected through a covert online ethnography of Italian Telegram channels and groups.
Article
In 2015, Reddit closed several subreddits, foremost among them r/fatpeoplehate and r/CoonTown, due to violations of Reddit's anti-harassment policy. However, the effectiveness of banning as a moderation approach remains unclear: banning might diminish hateful behavior, or it may relocate such behavior to different parts of the site. We study the ban of r/fatpeoplehate and r/CoonTown in terms of its effect on both participating users and affected subreddits. Working from over 100M Reddit posts and comments, we generate hate speech lexicons to examine variations in hate speech usage via causal inference methods. We find that the ban worked for Reddit. More accounts than expected discontinued using the site; those that stayed drastically decreased their hate speech usage, by at least 80%. Though many subreddits saw an influx of r/fatpeoplehate and r/CoonTown "migrants," those subreddits saw no significant changes in hate speech usage. In other words, other subreddits did not inherit the problem. We conclude by reflecting on the apparent success of the ban, discussing implications for online moderation, Reddit and internet communities more broadly.
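The logic behind such a finding can be sketched as a difference-in-differences comparison: the hate-lexicon usage rate of treated users (those active in the banned subreddits) against a control cohort, before and after the ban. The counts below are toy values, and this is a simplification of the paper's causal inference over matched users and 100M+ posts.

```python
# Sketch: pre/post hate-lexicon usage rates for treated vs. control
# cohorts, combined into a difference-in-differences estimate.
# All counts are invented placeholders.
def usage_rate(hate_word_count: int, total_word_count: int) -> float:
    return hate_word_count / total_word_count

cohorts = {
    #          (pre_hate, pre_total, post_hate, post_total)
    "treated": (5_000, 1_000_000, 900, 1_000_000),
    "control": (1_000, 1_000_000, 950, 1_000_000),
}
deltas = {}
for name, (pre_h, pre_t, post_h, post_t) in cohorts.items():
    deltas[name] = usage_rate(post_h, post_t) - usage_rate(pre_h, pre_t)
    print(f"{name}: {usage_rate(pre_h, pre_t):.4%} -> {usage_rate(post_h, post_t):.4%}")

did = deltas["treated"] - deltas["control"]
print(f"difference-in-differences estimate: {did:.4%} (negative = reduced usage)")
```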