Article

Online hatred of women in the Incels.me Forum: Linguistic analysis and automatic detection

Abstract

This paper presents a study of the (now suspended) online discussion forum Incels.me and its users, involuntary celibates or incels, a virtual community of isolated men without a sexual life, who see women as the cause of their problems and often use the forum for misogynistic hate speech and other forms of incitement. Involuntary celibates have attracted media attention and concern after a killing spree in April 2018 in Toronto, Canada. The aim of this study is to shed light on the group dynamics of the incel community by applying mixed-methods quantitative and qualitative approaches to analyze how the users of the forum create in-group identity and how they construct major out-groups, particularly women. We investigate the vernacular used by incels, apply automatic profiling techniques to determine who they are, discuss the hate speech posted in the forum, and propose a Deep Learning system that is able to detect instances of misogyny, homophobia, and racism, with approximately 95% accuracy.
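The paper's classifier itself is not reproduced on this page, but the task it addresses, flagging hateful posts, can be illustrated with a deliberately minimal sketch. The pure-Python baseline below is a hypothetical lexicon matcher, not the authors' Deep Learning system; the example terms, posts, and threshold are all invented for illustration.

```python
# Minimal lexicon-based baseline for flagging hateful posts.
# This is NOT the deep learning system from the paper, only an
# illustration of the classification task it addresses. The lexicon
# and threshold below are hypothetical placeholders.

HATE_LEXICON = {"hate", "worthless", "subhuman"}  # hypothetical terms

def tokenize(text):
    """Lowercase a post and split it into punctuation-stripped tokens."""
    return [w.strip(".,!?\"'") for w in text.lower().split()]

def flag_post(text, lexicon=HATE_LEXICON, threshold=1):
    """Return True if a post contains at least `threshold` lexicon hits."""
    hits = sum(1 for tok in tokenize(text) if tok in lexicon)
    return hits >= threshold

posts = [
    "I hate everyone, they are all worthless.",
    "Anyone here play chess online?",
]
flags = [flag_post(p) for p in posts]  # [True, False]
```

A trained neural classifier over labelled forum posts, as the paper describes, would replace this lookup entirely; the baseline only makes the input/output shape of the task concrete.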


... Prior to his violent rampage, Minassian paid homage to fellow incel (a portmanteau of involuntary celibate), Elliot Rodger, who killed six people and himself during a series of violent attacks in Isla Vista, California in 2014 (Duke, 2014). Rodger has been heralded as a martyr and the "supreme gentleman" of the incel community for the attack and the dissemination of his manifesto, in which he framed his violence as "retribution" for women having denied him the opportunity to have sex (Byerly, 2020; Jaki et al., 2019; Mandel, 2018). ...
... While the news coverage of incel-perpetrated violence is largely sensational, Byerly's (2020) analysis of 70 news articles, which spanned 29 outlets in six countries, suggests that the content is an accurate portrayal of incel behaviours, ideologies, and the community as a whole. Specifically, the central themes of violence, misogyny, and the role of the online community in promoting these that were identified by Byerly (2020) are also reflected in content analyses of multiple incel forums (Baele et al., 2019; Jaki et al., 2019; Maxwell et al., 2020; O'Malley et al., 2020). ...
... A recent textual analysis of 100 discussion threads on the incels.me forum identified loneliness as one of the top 1,000 keywords, with the authors concluding that this constitutes a core aspect of inceldom (Jaki et al., 2019). This may help explain why only 18% of incels in Sparks et al. (2022) reported having pictures with friends in their dating application profile compared to 52% of non-incel men. ...
Preprint
Full-text available
Incels, a ragtag collection of young males who have rallied around their shared experience of romantic rejection, have slowly emerged as an online group of interest to researchers, no doubt as a result of several high-profile attacks. Much of this work has centered around incels' dating experiences, sexual attitudes, and online forums. However, it is possible that their moniker, short for involuntary celibate, has resulted in an overemphasis on their sexual exclusion and frustration. Recent work has identified social isolation as a key aspect of inceldom, which may help explain why incels have responded negatively to romantic rejection. The present study thus sought to examine the role of social support and loneliness in experiences of rejection in a sample of incel (n = 67) and non-incel (n = 103) men. Results provided some support for the recently developed Incel Traits scale, in its first known use in a sample of incels. Incels also reported more feelings of loneliness and less social support than non-incel men, which were associated with multiple mental and relational health issues. Further, incels reported using more solitary and problematic coping mechanisms. These results suggest that incels may be missing a key buffer in sheltering them from the adverse effects of romantic rejection. It also extends previous findings highlighting the importance of attachment styles in differentiating incels from non-incels, which may perpetuate feelings of isolation. Implications for how this may relate to incel discourse and clinical interventions are discussed.
... Because Chads monopolize sexual relations with women, they leave other men empty-handed. Men who do not fit the description of a Chad are destined for a life of loneliness, never to have a willing sexual partner or a relationship, Incels believe (Jaki et al., 2019). ...
... Another study examined narratives that emerge from Incel online posts, revealing perceived discrimination by physical appearance and masculine entitlement over women (Jones, 2020). Studies that used quantitative and qualitative textual analysis and automatic profiling techniques observed misogyny, homophobia, and racism in Incels' online postings in discussion forums (Jaki et al., 2019; Baele et al., 2019). Incel online discourse has also revealed experiences of depression, hopelessness, suicidal ideation (Jones, 2020), and violent fantasies about rape (Scaptura & Boyle, 2020). ...
... However, it is not clear whether highly radicalized online discourse characterizes the Incel community as a whole or is instead a salient representation of a small radical subgroup of Incels. Indeed, Jaki et al. (2019) found that only a small subset of Incel forum users, about 10 percent, were responsible for most of the hateful content. Similarly, Baele et al. (2019) also found that an extremely active small group of users were responsible for a disproportionate amount of output. ...
Article
Full-text available
Incels (involuntarily celibates) are an online community of men who feel disenfranchised because they are unable to find a romantic and sexual partner. Incels tend to blame society for placing too much value in physical appearance and for endowing women with too much power in mate selection, a grievance that sometimes translates into violent misogyny. Mass-casualty Incel attacks have led the security services in the U.S., Canada, and the U.K. to classify Incels as a violent extremist threat. However, little empirical research is available to inform the understanding of Incels, or to qualify their potential danger to the public. Filling this gap, this study presents an important empirical datum by reaching beyond media headlines and online activity, to assess Incel ideology, mental health, and radical intentions through in-depth surveys of 274 active Incels. Most Incels in our study reported mental health problems and psychological trauma of bullying or persecution. Incel ideology was only weakly correlated with radicalization, and ideology and radicalization were differentially correlated with mental health measures. Most Incels in the study rejected violence. The discussion considers implications of these findings for detection, policing, and non-criminal interventions focused on the Incel community.
... Other researchers (e.g., [28,40,45]) have opted to use chronology-based inclusion criteria, capturing threads that were posted during a set time period. While advantageous, it also runs the risk of being skewed by real-world events that take place, such as the Toronto Van Attack, which resulted in a huge uptick in activity during Jaki and colleagues' [45] timeframe. As O'Donnell and Shor [22] note, much of this dialogue was in support of Minassian's self-described "rebellion." ...
... 1864). Jaki et al. [45] also identified social isolation as a core characteristic of inceldom. The lack of social connections may help explain why so few incels (18%) reported having pictures with others on their dating app profiles relative to non-incel men (52%), the only picture category where such a sizeable difference emerged [49]. ...
Preprint
Full-text available
Purpose of review: Incels (involuntary celibates) have recently garnered media attention for seemingly random attacks of violence. Much attention has centered around the misogynistic and violent discourse that has taken place in online incel forums as well as manifestos written by incels who have perpetrated deadly attacks. Such work overlooks the experiences and issues faced by incels themselves, the majority of whom have not engaged in any violent behaviour. Recent findings: A small number of studies have recruited incels. Results from these studies highlight the nuanced nature of the incel identity. It is also apparent that incels suffer from high levels of romantic rejection and a greater degree of depressive and anxious symptoms, insecure attachment, fear of being single, and loneliness. Summary: Incels report significant issues pertaining to their mental, social, and relational well-being and may seek support from forums that often feature misogynistic and violent content. Introduction: Incels, Violence, & the Media. In the evening hours of August 12, 2021, a media firestorm took place in Plymouth, England, when news emerged that a young man had shot and killed six individuals, including himself [1]. The attack was newsworthy for multiple reasons: it was the first deadly mass shooting in the United Kingdom in over a decade [2]; the act was carried out by the owner of a legally owned firearm in a country with strict gun control regulations; and the killer, Jake Davison, first shot his mother before shooting people at random in the street, including a 6-year-old girl. Left without a clear motive, much attention circulated around the fact that Davison was a self-proclaimed incel, a portmanteau of involuntary celibate [3, 4]. Such incels have been responsible for a number of high-profile attacks that have resulted in the deaths of over fifty individuals [5].
This includes what has been dubbed the Toronto Van Attack, where 25-year-old Alek Minassian mowed down pedestrians on a busy Toronto street with his rental vehicle in one of Canada's deadliest attacks [6]. In the leadup to the attack, Minassian publicly praised "supreme gentleman" Elliot Rodger on his social media accounts, referring to an individual who went on a shooting rampage in Isla Vista, California, killing six before committing suicide. Rodger has become a celebrated figure and martyr among the incel community for committing the first known incel-based attack and is frequently cited by other incels as an inspiration for their own acts of violence [7, 8, 9, 10].
... Our findings add to a growing body of literature seeking to map the incel community (e.g., Ging, 2017;Hoffman et al., 2020;O'Malley et al., 2020). In particular, our qualitative analysis allows for a richer understanding of some of the trends identified by broad quantitative and mixed method overviews of incel discussions (Baele et al., 2019;Jaki et al., 2019). Our analysis underscores the connection between the broader misogynist incel community and mass violence committed by self-declared incel individuals, highlighting misogyny as a basis for terrorism and violent extremism (Aslam, 2012;Bairner, 1999;Ferber & Kimmel, 2008;Haider, 2016;Kimmel, 2018). ...
... According to Cottee (2020), incels are also exclusively heterosexual. Jaki et al. (2019), who employed a quantitative text-profiling technique that deduces demographic traits from writing style, concluded that slightly more than half of incels are likely under 25 years old and have low levels of education. ...
... To express their grievances, incels have developed an original vocabulary. Young, attractive, white women are referred to as "Stacys," and incels believe that these women are only attracted to successful and hyperattractive men, whom they call "Chads" (Baele et al., 2019; Jaki et al., 2019). In sharing their emotional responses to personal experiences of rejection, incels embrace self-deprecating labels like "betafag" (Ging, 2017). ...
Article
How do members of extremist groups think about violence conducted by individual members on the group's behalf? We examine the link between extremism-motivated violence and extremist groups through a case study of misogynist incels, a primarily online community of men who lament their lack of sexual success with women. To learn how misogynist incels talk about mass violence committed by members of their group, we conduct a qualitative content analysis of 3,658 comments relating to the 2018 Toronto van attack, in which self-declared incel Alek Minassian drove a van into pedestrians, killing 10 and injuring 16. We find overwhelming support among self-proclaimed incels for the attack and violence more generally. Incels viewed mass violence as instrumental, serving the following four main purposes: garnering increased attention, exacting revenge, reinforcing masculinity, and generating political change. Our findings indicate the need to examine misogynist incels as a potential terrorist group and male supremacism as a basis for terrorism.
... Incel communities in online environments are an excellent example of the potential ways in which individuals may radicalize to shared grievances due to engagement with others (Ribeiro et al. 2020a, 2020b). Self-identified incels coalesce online to discuss shared experiences of social isolation, loneliness, and difficulties attaining sexual relationships due to (actual or perceived) social and physical shortcomings (Ging 2019; Jaki et al. 2019). Incels stand out within the broader manosphere due to their blackpill philosophy, a fatalistic worldview that rests on the notion that to be an incel is to fall into a naturally occurring category within society. ...
... Incels are able to bond over a shared desire for relationships and sexual opportunities as well as other facets of their lives that may contribute to feelings of hopelessness. For instance, participants often speak of a lack of educational or employment opportunities (e.g., 'NEET'), financial difficulties (Jelodar and Fran 2021), or struggles with mental health and self-care (Glace, Dover, and Zatkin 2021; Jaki et al. 2019). Incels have been found to attribute blame for their marginalization onto factors such as neuro-atypicality, physical stature, and other characteristics that lend themselves towards traditional masculine performance, collectively contributing to a sense of strained masculinity (Daly and Reed 2022). ...
... The extant literature points towards a progression in extremist beliefs and radical sentiment across incel platforms (Farrell et al. 2019, 2020; Jaki et al. 2019; Ribeiro et al. 2020a, 2020b), as well as the potential for users to not only become violent but to also be martyred by incel peers after the fact (Baele, Brace, and Coan 2019; Jaki et al. 2019; Witt 2020). Since participants often encourage one another to commit acts of violence (Jaki et al. 2019; Robertson 2019), the communities have been banned on some mainstream social platforms (see Bell 2017). ...
Article
Full-text available
The online presence of incels, or involuntary celibates, has been an increasing security concern for researchers, practitioners, and policymakers in recent years, given that self-identified incels – including Alek Minassian and Elliot Rodger – used the Internet to disseminate incel ideology and manifestos prior to committing acts of violence. However, little is empirically known about the incel movement in general or their online communities in particular. The present study draws from a set of comments from r/Incels, a now defunct but once popular subreddit dedicated to the incel community, and compares the most highly-upvoted comments (n = 500) to a random set of other comments (n = 500) in the subreddit. This qualitative analysis focuses on identifying subcultural discourse that is widely supported and engaged with by members of the online community and the extent to which incels utilize this online space to reaffirm deviant behavior. Our study underscores the importance, as well as the difficulties, of drawing from online sources like web-forums to generate new knowledge on deviant communities and behaviors. We conclude with a discussion of the implications of this analysis, its limitations, and avenues for future research.
... Although incels only make up a small percentage of our population (0.2%, according to Greaves et al., 2017) the community has been complicit in or supportive of violent acts that have had broad societal consequences (Jaki et al., 2019). Given that the incel community has largely been founded on mutual experiences of rejection by women, or restrictions to a sexual "market," ...
... In exploring what words most frequently co-occurred with the word women in the incels.me forum, Jaki et al. (2019) identified a number of word pairings that represented either similar maliciousness or deviousness, including that they are "entitled," "parasites," "whores," ...
... In other words, the antifeminism that many attribute to incels may be a manifestation of insecure attachment rather than exclusively misogynistic intent. Indeed, insecure attachment has been linked to male violence in intimate partner settings (Buck et al., 2012) and Jaki et al. (2019) found no shortage of violent word pairings in discussions of women (including allusions to being caged, beaten, controlled, and gassed). ...
Preprint
Full-text available
Incel refers to an online group of young males who feel frustration and despair at being repeatedly neglected on the dating market. Despite several high-profile attacks, the majority of incel research is comprised of qualitative analyses of their forums. This provides a good contextual overview of the incel community but does not capture the experiences of incels or identify how and why this group responds so strongly to rejection. A total of 38 incels and 107 non-incel males participated in the present study, completing questionnaires pertaining to their dating app experiences and their mental and relational well-being. Large differences between incels and non-incels were found, with the former reporting greater depressive symptoms, rejection sensitivity, relationship status influence, and insecure attachment. These were all associated with perceived popularity, which incels scored lower on. Incels also adopted more liberal dating app strategies, yet reported fewer matches, conversations, and in-person outcomes. The pattern of results reported sheds new light on the role that dating apps may play in incels' efforts to attract mates and how these frustrations manifest. This is integral both to understanding the broader incel discourse as well as any efforts to develop treatment strategies with self-identified incels who seek counselling.
... Rodger's "war against women" was the most prominent of a series of attacks by men linked to the Incel community (Gecco, 2019). Incels, an abbreviation of involuntary celibates, are members of an online community whose main stated grievance is that they desire to have romantic and sexual relationships but are unable to do so (Jaki et al., 2019). Although most Incels are non-violent (Speckhard et al., 2021), discussions in Incel forums and websites are often characterized by sexist, misogynistic and anti-feminist sentiments (Jaki et al., 2019). ...
... Respondents indicated their degree of unwanted celibacy on 12 items developed using a 1 (does not describe me) to 5 (describes me extremely well) scale. The items, constructed based on the existing qualitative studies on people's experiences with involuntary celibacy (Donnelly et al., 2001;Donnelly & Burgess, 2008) and Incels' description of their own experiences (Daly & Reed, 2021;Jaki et al., 2019), tap into two interrelated themes: (1) desiring to have romantic or sexual partners, but being unable to secure one because of unattractiveness, rejection, failure, or lack of willing partners (e.g., "I want to date, but nobody wants to date me."); (2) expressions of grievance elicited by negatively comparing oneself with others who are able to have romantic or sexual experiences (e.g., "Other men/women are enjoying the pleasure of having romantic/sexual experiences, but not me."). ...
Article
Full-text available
In recent years, involuntary celibates who identify as "Incels" have received considerable public attention because of their misogynistic online discourse and their tie to a string of violent acts motivated by hatred of women. Yet, surprisingly, no prior quantitative research has examined whether unwanted celibacy, a subjective psychological experience characteristic of, but not exclusive to, Incels, is associated with misogynistic attitudes among men. The current study (N = 349 men) collected self-report data from a convenience sample of Incel and non-Incel men to investigate whether the degree of unwanted celibacy is associated with misogynistic attitudes. Unwanted celibacy was positively associated with hostile attitudes towards women, sexual objectification and rape myths, even after controlling for personality traits such as agreeableness. These novel quantitative results indicate that unwanted celibacy is an important psychological risk factor for misogynistic attitudes.
... For incels, the salient issue, the lack of physical and emotional intimacy, has endured. The way this disenfranchisement from sex is expressed, coupled with the voiced blame attribution to women and feminism, as well as the frequent utterances of misogyny (see Jaki et al., 2019) and toxic language (see Horta Ribeiro et al., 2021), points to an extensive and unambiguous conflict between incels (the deprived) and women (the blamed). The "Black Pill philosophy" (Blackpill, 2022) is a dominant interpretative framework the incel community proposes. ...
... Hoffman et al. (2020) summarise that although incels largely agree on the description of the roots of their inceldom, they are rather divided on the prescription to solve their problem. In addition, Jaki et al. (2019) arrived at the conclusion that there is no consensus within incels regarding the solution to their problem. ...
... Posts in incel forums often carry sarcasm, irony, outrage, and blame; it is therefore difficult to precisely measure the level of exaggeration or threat the community poses (Hoffman et al., 2020). Nevertheless, Jaki et al. (2019) articulate that forums like Incels.is (Incels.me in 2019) reinforce toxic language, regardless of the benign or malicious intent of the poster. ...
Thesis
Full-text available
Technological advancements and affordability enable voicing of social injustice, feelings of deprivation, and oppression. Spatial barriers no longer pose obstacles to connecting with like-minded (or dissimilar) others to define and refine ingroup and outgroup. Some scholars anticipate that the internet liberates the discussion of opinions; others claim social networking platforms play a role in the polarisation of the public by creating echo chambers. However, it is recognised that ideas, ideologies, and social movements spread across the internet at an unprecedented pace. Connecting with others with whom one shares deprivation in a support network offers a sense of belonging. Broad scholarly literature addresses opinion polarisation and potential radicalisation in online social media platforms. However, quantifying radicalisation trajectories in fringe online communities like the misogynist incels remains to be done. In this thesis I study the online presence of the incel community. Incels are mostly young men who feel stigmatised and need to hide their incel existence. Incels voice their feelings of deprivation of a relationship and sex with a willing partner. This unfulfilled masculinity and sense of entitlement to sex cause frustration and anger, which are vented in online forums blaming primarily women and feminism. Calls for action towards social change, even for violence, are common. However, incels do not unanimously consider violence a solution; many demonstrate the tame side of the so-called blackpilled mindset, the acceptance of powerlessness, and nihilism. Regardless, some scholars view the community as potentially dangerous to society, labelling them as terrorists.
This study investigates whether participating registered users of the Incels.is website display increasing tendency toward expressing utterances with the themes of misogyny, harassment, nihilism, and moral outrage in their posted messages, and whether users gradually become more aligned with the general perception of incels in previous scholarly work. In other words, this work tests whether active participation increases the frequency of utterances of misogyny, harassment, and moral outrage, thus demonstrating a radicalisation tendency or increased nihilism. To answer the research question, I first scraped the Incels.is website, and retained ~5.38M posts published over 4 years for analysis. Next, a subset of posts was manually labelled to train a supervised text classification model (BERT). Finally, the results of the classification task were complemented with Ordinary Least Squares regression (n = 4623). The analyses uncover temporal user-level radicalisation trajectories, and increased nihilism. More specifically, the duration of active participation (in days) and the number of posted messages positively predict the count of moral outrage, misogynistic, harassing, and nihilistic content.
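The regression step described above, predicting counts of flagged content from participation measures, can be sketched with a closed-form simple OLS fit. This is an illustration only: the data below are invented, and the thesis's actual model uses multiple predictors over n = 4623 users rather than one toy variable.

```python
# Simple ordinary-least-squares fit, illustrating the kind of
# user-level regression described in the thesis (predicting counts of
# flagged content from days of active participation). The data are
# invented; this is not the thesis's model.

def ols_fit(x, y):
    """Closed-form simple linear regression: return (intercept, slope)."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    cov = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    var = sum((xi - mean_x) ** 2 for xi in x)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return intercept, slope

# Hypothetical data: days of active participation vs. count of
# flagged (e.g., misogynistic or nihilistic) posts per user.
days_active = [10, 50, 120, 300, 600]
flagged_counts = [1, 4, 9, 25, 48]
intercept, slope = ols_fit(days_active, flagged_counts)
# A positive slope would indicate that longer participation is
# associated with more flagged content, as the thesis reports.
```

In practice one would use a statistics library (e.g., statsmodels) for standard errors and significance tests; the closed-form version only makes the shape of the analysis concrete.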
... Recent years have seen growing concerns about potential threats of violence stemming from the incel community (Hoffman & Ware, 2020). A significant minority of incels (~10%) engage in misogynistic online hostility (Ging, 2019; Jaki et al., 2019), and rare individual cases have seen incels lash out in violent rage. Most notable is the notorious case of Elliot Rodger, who in 2014 killed six people and injured 14 others before killing himself, referring in his manifesto to a "day of retribution" when he would kill those who he most envied (Allely & Faccini, 2017). ...
... The incel community operates almost exclusively online, providing an outlet to express misogynistic hostility, frustration, and blame toward society for a perceived failure to include them (Speckhard et al., 2021). There is a need to know more about incels' experiences, grievances, and mental health outcomes, yet there is a dearth of primary data collected from inquiries made to incels themselves, with most academic research focusing on incel misogyny, using online linguistic analysis (Jaki et al., 2019; O'Malley et al., 2020). It is unclear how much of incel rhetoric is performatively antagonistic; however, online misogyny can be used to predict domestic violence (Blake et al., 2021), and there is evidence that internet "trolls" who are hostile online are similarly hostile offline and may be attracted to evolutionarily novel online worlds, where aggression-based strategies can be pursued without risking real-world retaliation (Bor & Petersen, 2019). ...
... In recent years, specific research on the incel community has grown, examining topics ranging from misogynistic online rhetoric (Byerly, 2020; Jaki et al., 2019) and Big Five personality traits (Bieselt, 2020) to incel pornography use (Stickel, 2020). However, Speckhard et al. (2021) note that almost all academic studies which include primary responses from incels used the same limited data set, an online survey of incels (n = 28) from the University of Twente in the Netherlands. ...
Article
Full-text available
Incels (involuntary celibates) are a subculture community of men who build their identity around their perceived inability to form sexual or romantic relationships. To address the dearth of primary data collected from incels, this study compared a sample (n = 151) of self-identified male incels with similarly aged non-incel males (n = 378) across a range of measures related to mental well-being. We also examined the role of sociosexuality and tendency for interpersonal victimhood as potential moderators of incel status and its links with mental health. Compared to non-incels, incels were found to have a greater tendency for interpersonal victimhood, higher levels of depression, anxiety and loneliness, and lower levels of life satisfaction. As predicted, incels also scored higher on levels of sociosexual desire, but this did not appear to moderate the relationship between incel status and mental well-being. Tendency for interpersonal victimhood only moderated the relationship between incel self-identification and loneliness, yet not in the predicted manner. These novel findings are some of the earliest data based on primary responses from self-identified incels and suggest that incels represent a newly identified "at-risk" group to target for mental health interventions, possibly informed by evolutionary psychology. Potential applications of the findings for mental health professionals as well as directions for future research are discussed.
... At the same time, the research community has mostly studied the Incel community and the broader Manosphere on Reddit, 4chan, and other online discussion forums like Incels.me [30,31]. However, the fact that YouTube has been repeatedly accused of user radicalization prompts the need to study the extent to which Incels are exploiting YouTube to spread their views. ...
... The research community has mostly studied the Incel community and the broader Manosphere on Reddit, 4chan, and online discussion forums like Incels.me or Incels.co [30,225,31,110,226,109]. However, the fact that YouTube has been repeatedly accused of user radicalization and promoting offensive and inappropriate content [3,6,227,172] prompts the need to study the extent to which Incels are exploiting the YouTube platform to spread their views. ...
Preprint
Full-text available
YouTube has revolutionized the way people discover and consume video. Although YouTube facilitates easy access to hundreds of well-produced and trustworthy videos, abhorrent, misinformative, and mistargeted content is also common. The platform is plagued by various types of problematic content: 1) disturbing videos targeting young children; 2) hateful and misogynistic content; and 3) pseudoscientific misinformation. While YouTube's recommendation algorithm plays a vital role in increasing user engagement and YouTube's monetization, its role in unwittingly promoting problematic content is not entirely understood. In this thesis, we shed some light on the degree of problematic content on YouTube and the role of the recommendation algorithm in the dissemination of such content. Following a data-driven quantitative approach, we analyze thousands of videos on YouTube, to shed light on: 1) the risks of YouTube media consumption by young children; 2) the role of the recommendation algorithm in the dissemination of misogynistic content, by focusing on the Involuntary Celibates (Incels) community; and 3) user exposure to pseudoscientific content on various parts of the platform and how this exposure changes based on the user's watch history. Our analysis reveals that young children are likely to encounter disturbing content when they randomly browse the platform. By analyzing the Incel community on YouTube, we find that Incel activity is increasing over time and that platforms may play an active role in steering users towards extreme content. Finally, when studying pseudoscientific misinformation, we find that YouTube suggests more pseudoscientific content regarding traditional pseudoscientific topics (e.g., flat earth) than for emerging ones (like COVID-19) and that these recommendations are more common on the search results page than on a user's homepage or the video recommendations section.
... The research community has mostly studied the Incel community and the broader Manosphere on Reddit, 4chan, and online discussion forums like Incels.me or Incels.co [23,38,50,54,63,64]. However, the fact that YouTube has been repeatedly accused of user radicalization and promoting offensive and inappropriate content [40,53,56,65,68] prompts the need to study the extent to which Incels are exploiting the YouTube platform to spread their views. ...
... They create nine lexicons of misogynistic terms to investigate how misogynistic language is used in 6M posts from Manosphere-related subreddits. Jaki et al. [38] study misogyny on the Incels.me forum, analyzing users' language and detecting instances of misogyny, homophobia, and racism with a deep learning classifier that achieves up to 95% accuracy. ...
Article
Full-text available
YouTube is by far the largest host of user-generated video content worldwide. Alas, the platform has also come under fire for hosting inappropriate, toxic, and hateful content. One community that has often been linked to sharing and publishing hateful and misogynistic content is the Involuntary Celibates (Incels), a loosely defined movement ostensibly focusing on men's issues. In this paper, we set out to analyze the Incel community on YouTube by focusing on this community's evolution over the last decade and understanding whether YouTube's recommendation algorithm steers users towards Incel-related videos. We collect videos shared on Incel communities within Reddit and perform a data-driven characterization of the content posted on YouTube. Among other things, we find that the Incel community on YouTube is getting traction and that, during the last decade, the number of Incel-related videos and comments rose substantially. We also find that users have a 6.3% chance of being suggested an Incel-related video by YouTube's recommendation algorithm within five hops when starting from a non-Incel-related video. Overall, our findings paint an alarming picture of online radicalization: not only is Incel activity increasing over time, but platforms may also play an active role in steering users towards such extreme content.
... For instance, the "Red Pill philosophy" employs the movie "The Matrix" as a metaphor to claim that only a few men can see the alleged truth that women are not oppressed within society (Shawn P. Van Valkenburgh 2018). Incels are "involuntary celibates," men who blame women for being rejected, as was the aforementioned case of Rodger (Sylvia Jaki, et al. 2019). Pick-up Artists portray themselves as dating coaches, and in so doing often dehumanize and abuse women, while Men Going Their Own Way choose to reduce their relationships with women to the bare minimum (Alan Grant 2019). ...
... Such articles and comments also mirror the incel culture within the manosphere, because they blame women for their sexual unavailability (Jaki et al. 2019). Describing #MeToo as exaggerating the nature of sexual violence and trying to destroy men's lives, narratives on ROK and AVfM imply that it is also a ploy of women to reject men. ...
... The level of hate speech on the forum incels.co is investigated in Jaki et al. (2019). The authors use a dictionary-based approach to identify posts with keywords that constitute misogyny and racism. ...
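The dictionary-based step described here can be pictured as a simple keyword matcher over posts. The sketch below is a minimal illustration using neutral placeholder terms (`slur_a`, etc.), not the authors' actual lexicon:

```python
import re

# Illustrative placeholder lexicon; the actual keyword lists used by
# Jaki et al. (2019) are not reproduced here.
LEXICON = {
    "misogyny": {"slur_a", "slur_b"},
    "racism": {"slur_c"},
}

def flag_post(text):
    """Return the set of categories whose keywords occur in the post."""
    tokens = set(re.findall(r"[a-z_]+", text.lower()))
    return {cat for cat, words in LEXICON.items() if tokens & words}
```

For example, `flag_post("a post containing slur_a")` returns `{"misogyny"}`, while a post with no lexicon hits yields an empty set.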
... An interesting aspect is the nature of this toxicity. While the forums do contain much racist hatred, incels are nevertheless an ethnically heterogeneous group (Jaki et al. 2019). Many incels with an ethnicity other than "white" believe that women do not even look at them precisely because of their ethnicity. ...
Article
Full-text available
The internet-based incel subculture has evolved over the past decade on a number of different platforms. The subculture is known to be toxic and has become associated with several high-profile cases of lethal violence. In this paper, we study the level of toxic language and its targets on three large incel forums: incels.co, lookism.net and looksmax.me. These three forums are the most well-known and active online platforms where incels meet and discuss. Our results show that even though usage of toxic language is pervasive on all three forums, they exhibit significant differences in the composition of their toxicity. These differences correspond to different groups or philosophies within the incel communities.
... 27,28,55,56 Third, sexism existed in the forum's accounts where women were denoted as sex objects or animals but blamed for men's body dissatisfaction. Previous research has also found that men talking online 57,58 or about their body dissatisfaction 27,28,55,59 often perpetuate sexism too. ...
... Many forums have explicit policies against hate speech but fail to implement them. 58 Helpfully, an automatic content moderator that is 95% accurate at detecting misogynist speech now exists. Thus, the sexism on the forums is critical to challenge, not only because it contributes to the oppression of women, but also because not challenging it fails to help men themselves (e.g., by detracting from the real drivers of baldness and the potential for effective remedy). ...
Article
Full-text available
Introduction: Head hair comprises a critical part of the male appearance ideal, which itself is a crucial signifier of a man's masculinity. However, difficulties in recruitment have meant that research has not yet fully explored how men construct the loss of head hair (baldness), perhaps because it is considered "feminine" to disclose body dissatisfaction experiences to a researcher or other people. Methods and Design: Online forums provide an opportunity for the anonymous discussion of body dissatisfaction that may overcome this obstacle. The first 260 forum posts from the two most popular baldness forums were thematically analysed. Ethics Statement: Institutional ethics approval was granted. Results and Discussion: We identified three themes titled: (1) Baldness is an ugly and demasculinising condition, (2) Baldness is stigmatised by a superficial society and superficial women and (3) Resistance to baldness despair. Our findings show that baldness distress and stigma exist, though so does resistance, which can be comforting to men experiencing baldness or any form of body dissatisfaction. Conclusion and Implications: Online forums are a salient resource to enhance our understanding of men's balding concerns and disclosure barriers. Independent, professional and effective baldness support that unpacks baldness's masculinised and medicalised framing is recommended.
... Incels tend to blame their disenfranchisement on lookism, or women's choice of sexual partners based solely on physical features (Halpin 2021). Another culprit often mentioned on Incel online forums is biological determinism that "results in a sexual hierarchy that dominates Western civilization" (Beauchamp 2018) and predestines certain men to never find a mate (Kelly, DiBranco, and DeCook 2021), thus condemning Incels to a life of loneliness (Jaki et al. 2019). Incels who gained notoriety because of their violent acts (Elliot Rodger, Chris Harper-Mercer, and Alek Minassian) have shared these ideas. ...
... These same individuals tended to express beliefs that the broader public underestimates the danger from the Incel community. This finding is consistent with existing research on Incel online activity, which found that a small, vocal subset of the larger group of users was responsible for most activity and posted most of the hateful content (Jaki et al. 2019; Baele, Brace, and Coan 2019). However, it is important to keep these data in perspective. ...
Article
Full-text available
Incels (involuntary celibates) are an online-based identity group of mostly males who feel disenfranchised because of what they see as an unfair advantage society gives to muscular, confident males, taking away Incels' chances of securing a sexual relationship. Several Incels have left behind ideological manifestos before carrying out acts of mass murder, alerting the public and security services to the danger Incels can present. Nonetheless, data about radicalization (defined as support for illegal or violent political action) of the larger Incel population remain scarce. This study aimed to expand this knowledge by conducting a survey of 54 self-identified Incels. Results demonstrated high rates of self-reported depression (91%), anxiety (85%), PTSD (40%), autism spectrum disorders (53%), and a history of bullying (91%). A new measure of radicalization specific to the Incel community (Incel Radicalization Scale) demonstrated high internal consistency and construct validity, making it a useful tool for identification and early prevention of radicalization among Incels. Radicalization was not correlated with Incel ideological commitment. A small proportion of respondents (17%) scored above the midpoint on the Incel Radicalization Scale, demonstrating high radicalization. Discussion focuses on implications for preventing and countering violent extremism efforts, including prioritizing mental health and trauma-informed care.
... Previous data-driven studies on incel-related communities have aimed to identify key words, to make inferences about incel demographics, to characterize activity on various platforms, and to automate detection of related communities [37][38][39][40]. A study by Jaki et al. ...
... A study by Jaki et al. used statistical tests on word counts to identify key words in incel corpora, and visualized their results as a word cloud [37]. Our approach differs from the approach used by Jaki et al. ...
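One minimal way to run such key-word statistics is a smoothed log-odds comparison of word frequencies between a target corpus and a reference corpus; the resulting scores can weight a word cloud. This is a generic sketch of the technique, not necessarily the exact statistical test used in the cited study:

```python
import math
from collections import Counter

def keyness(target, reference):
    """Smoothed log-odds ratio: positive scores mark words that are
    over-represented in the target corpus relative to the reference."""
    nt, nr = sum(target.values()), sum(reference.values())
    vocab = set(target) | set(reference)
    v = len(vocab)  # add-one smoothing over the shared vocabulary
    return {
        w: math.log((target[w] + 1) / (nt + v))
           - math.log((reference[w] + 1) / (nr + v))
        for w in vocab
    }
```

Sorting the returned scores in descending order yields the corpus's most distinctive terms first.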
Preprint
Evolving out of a gender-neutral framing of an involuntary celibate identity, the concept of 'incels' has come to refer to an online community of men who bear antipathy towards themselves, women, and society-at-large for their perceived inability to find and maintain sexual relationships. By exploring incel language use on Reddit, a global online message board, we contextualize the incel community's online expressions of misogyny and real-world acts of violence perpetrated against women. After assembling around three million comments from incel-themed Reddit channels, we analyze the temporal dynamics of a data-driven rank ordering of the glossary of phrases belonging to an emergent incel lexicon. Our study reveals the generation and normalization of an extensive coded misogynist vocabulary in service of the group's identity.
... That violence and hatred characterize the Incel and Red Pill virtual communities is a well-established fact in the international literature (Ganesh 2018; Jaki et al. 2019; Sang and Stanton 2020; Waśniewska 2020), to the point that in 2017 Reddit, the US web company that runs a social news and forum site, shut down a section dedicated to the Incel community for violating its corporate policies on hate speech. This has not stopped such groups from moving to more accommodating platforms. ...
Article
Full-text available
Online groups that share violent language and hate speech against women have been spreading and multiplying. Incel and Red Pill groups are part of the manosphere and are active and increasing in Italy too. The aim of the study upon which this article is based, and whose preliminary results are presented here, is to analyse interactions, representations and discourses connected to gender identities and models of masculinity, presented and co-produced within the Incel and Red Pill Italian community. Through the analysis of empirical material, web articles, posts and comments on closed groups active on social media, we identified two interconnected themes: (1) aesthetics, frustration and a narrative of self-victimization; (2) reification, dehumanization and violence against women. The results show that violence is immanent and integrated into the lexicon and imaginary used by this virtual community, and is at the core of the new models of masculinity produced and reinforced by these groups. Keywords: gender models, masculinity, incel, red pill, violence.
... The manosphere overlaps with far-right groups in membership, with growing evidence of their members repeatedly being associated with online harassment, real-life violence and terror attacks [14]. A number of researchers have been exploring manosphere-related forums, subreddits, and YouTube data to examine the characteristics and the evolution of the manosphere [14,44], studying incels [36] and online hatred of women [21]. ...
Preprint
Full-text available
Online extremism is a growing and pernicious problem, and increasingly linked to real-world violence. We introduce a new resource to help research and understand it: ExtremeBB is a structured textual dataset containing nearly 44M posts made by more than 300K registered members on 12 different online extremist forums, enabling both qualitative and quantitative large-scale analyses of historical trends going back two decades. It enables us to trace the evolution of different strands of extremist ideology; to measure levels of toxicity while exploring and developing the tools to do so better; to track the relationships between online subcultures and external political movements such as MAGA and to explore links with misogyny and violence, including radicalisation and recruitment. To illustrate a few potential uses, we apply statistical and data-mining techniques to analyse the online extremist landscape in a variety of ways, from posting patterns through topic modelling to toxicity and the membership overlap across different communities. A picture emerges of communities working as support networks, with complex discussions over a wide variety of topics. The discussions of many topics show a level of disagreement which challenges the perception of homogeneity among these groups. These two features of mutual support and a wide range of attitudes lead us to suggest a more nuanced policy approach than simply shutting down these websites. Censorship might remove the support that lonely and troubled individuals are receiving, and fuel paranoid perceptions that the world is against them, though this must be balanced with other benefits of de-platforming. ExtremeBB can help develop a better understanding of these sub-cultures which may lead to more effective interventions; it also opens up the prospect of research to monitor the effectiveness of any interventions that are undertaken.
... It should perhaps not be surprising, in light of the above, that far-right movements attract so many highly alienated and sexually frustrated men (the 'red pill' community), as well as so many anti-sexual moralizers, such as are found among many religious fundamentalists. In the society of the selfie, Incels have been attracting the attention of many researchers devoted to the relation between sexual repression, personal virtual exhibition, social media and political extremism (Jaki et al. 2019; Hoiland 2019; Hoffman et al. 2020; Maxwell et al. 2020). Moreover, concern with other people's sexual deviance was one of the criteria for authoritarianism in Adorno's F-scale (Adorno et al. 2019 [1950]). ...
Book
Full-text available
This book explores how the Internet is connected to the global crisis of liberal democracy. Today, self-promotion is at the heart of many human relationships. The selfie is not just a social media gesture people love to hate. It is also a symbol of social reality in the age of the Internet. Through social media people have new ways of rating and judging themselves and one another, via metrics such as likes, shares, followers and friends. There are new thirsts for authenticity, outlets for verbal aggression, and social problems. Social media culture and neoliberalism dovetail and amplify one another, feeding social estrangement. With neoliberalism, psychosocial wounds are agitated and authoritarianism is provoked. Yet this new sociality also inspires resistance and political mobilisation. Illustrating ideas and trends with examples from news and popular culture, the book outlines and applies theories from Debord, Foucault, Fromm, Goffman, and Giddens, among others. Topics covered include the global history of communication technologies, personal branding, echo chamber effects, alienation and fear of abnormality. Information technologies provide channels for public engagement where extreme ideas reach farther and faster than ever before, and political differences are widened and inflamed. They also provide new opportunities for protest and resistance.
... In particular, the identification of offensive content, most notably hate speech, has been a growing research area. Within this broad area, various related phenomena have been addressed in isolation, such as cyber-bullying, misogyny, aggression, and abuse [8,9,10], while some recent work has focused on modeling multiple types of offensive content at once [11,12]. ...
Preprint
Full-text available
The widespread presence of offensive content online, such as hate speech, poses a growing societal problem. AI tools are necessary for supporting the moderation process at online platforms. For the evaluation of these identification tools, continuous experimentation with data sets in different languages is necessary. The HASOC track (Hate Speech and Offensive Content Identification) is dedicated to developing benchmark data for this purpose. This paper presents the HASOC subtrack for English, Hindi, and Marathi. The data set was assembled from Twitter. This subtrack has two sub-tasks. Task A is a binary classification problem (Hate and Not Offensive) offered for all three languages. Task B is a fine-grained classification problem for three classes, HATE (hate speech), OFFENSIVE and PROFANITY, offered for English and Hindi. Overall, 652 runs were submitted by 65 teams. The best classification algorithms for task A achieve F1 measures of 0.91, 0.78 and 0.83 for Marathi, Hindi and English, respectively. This overview presents the tasks and the data development as well as the detailed results. The systems submitted to the competition applied a variety of technologies. The best performing algorithms were mainly variants of transformer architectures.
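The macro F1 scores used to rank such shared-task runs are the unweighted mean of per-class F1, so rare classes count as much as common ones. A small reference implementation:

```python
def macro_f1(gold, pred):
    """Unweighted mean of per-class F1 over all labels seen in
    either the gold standard or the predictions."""
    f1s = []
    for c in set(gold) | set(pred):
        tp = sum(g == c and p == c for g, p in zip(gold, pred))
        fp = sum(g != c and p == c for g, p in zip(gold, pred))
        fn = sum(g == c and p != c for g, p in zip(gold, pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return sum(f1s) / len(f1s)
```

For instance, with gold labels `["HOF", "NOT", "NOT", "HOF"]` and predictions `["HOF", "NOT", "HOF", "HOF"]`, the per-class F1 values are 0.8 (HOF) and 2/3 (NOT), giving a macro F1 of 11/15.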
... However, as a result, social media platforms can also be used to spread hateful comments and posts by certain individuals or groups, which can cause anxiety, mental illness and severe stress in people who consume that hateful content [2]. It becomes necessary to detect such activities at the earliest to stop them from spreading, thereby making social media a healthy place to interact and share views without fear of receiving hateful comments [3]. Hateful posts can be insulting or racist, or discriminate on the basis of a particular gender, religion, nationality, age bracket, or ethnicity. ...
Preprint
Full-text available
The growing number of social media users has led to many people misusing these platforms to spread offensive content and hate speech. Manually tracking the vast number of posts is impractical, so it is necessary to devise automated methods to identify them quickly. Large language models are trained on large amounts of data and make use of contextual embeddings. We fine-tune large language models to help in our task. The data is also quite unbalanced, so we used a modified cross-entropy loss to tackle the issue. We observed that a model fine-tuned on Hindi corpora performs better. Our team (HNLP) achieved macro F1-scores of 0.808 and 0.639 in English Subtask A and English Subtask B, respectively. For Hindi Subtask A and Hindi Subtask B, our team achieved macro F1-scores of 0.737 and 0.443, respectively, in HASOC 2021.
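A "modified cross-entropy loss" for unbalanced data typically amounts to re-weighting each example's loss by its class. A toy single-example sketch, with assumed inverse-frequency weights for an illustrative 80/20 class split (not the paper's actual values):

```python
import math

def weighted_ce(probs, label, class_weights):
    """Class-weighted cross-entropy for one example: losses on the
    rare (hateful) class are scaled up so it is not drowned out."""
    return -class_weights[label] * math.log(probs[label])

# Assumed 80/20 class split -> inverse-frequency weights (illustrative).
weights = [1 / 0.8, 1 / 0.2]
```

With equal predicted probabilities, an error on the minority class then costs four times as much as one on the majority class, pushing the model to pay attention to rare hateful posts.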
... One specific kind of hate speech comes from "incels" (involuntary celibates) who regard women as the cause of problems in their lives, particularly with respect to romantic attachment. Some incels use social media platforms to post misogynistic hate speech and other problematic content [3]. ...
Article
Full-text available
The ubiquity of online media services helps to promote free speech but also provides opportunities for the spread of problematic content such as hate speech. A group of individuals known as "incels" (involuntary celibates) sometimes use online media services to publish hateful content. To expose and condemn incel hate speech, other individuals, sometimes called "incel hunters" create online communities where they critique screenshots of content posted by incels. In this paper, using 18,187 posts collected from a subreddit, we explore the potential for transforming screenshots of incel hate speech and oppositional statements into training data that could be used as the basis for automated or semi-automated content moderation tools.
... To identify misogynistic talk, Jaki et al. [28] conducted a study on the Incels.me forum (which has since been suspended and is no longer accessible). ...
Conference Paper
Full-text available
Incels, which stands for involuntary celibates, refers to members of online communities who identify themselves as individuals to whom women are not attracted. They are usually involved in misogynistic and hateful conversations on social networks, which have been linked to several terrorist attacks in recent years, sometimes called the incel rebellion. In order to stop terrorist acts like this, the first step is to detect incel members in social networks. To this end, user-generated data can give us insights. In previous attempts at identifying incels in social media, users' likes and fuzzy likes data were considered. However, another piece of information that can be helpful to identify such social network members is users' comments. In this study, for the first time, we have considered users' comments to identify incels in social networks. Accordingly, an algorithm using sentiment analysis was proposed. Study results show that by implementing the proposed method on social media users' comments, incel members can be identified in social networks with an accuracy of 78.8%, which outperforms the previous work in this field by 10.05%.
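A sentiment-based user classifier of the kind described can be sketched as lexicon scoring plus a threshold. The word lists and threshold below are illustrative assumptions, not the paper's actual resources or method details:

```python
# Toy sentiment-threshold classifier over a user's comments.
# Lexicons and threshold are illustrative placeholders only.
NEGATIVE = {"hate", "ugly", "rejected", "worthless"}
POSITIVE = {"love", "happy", "grateful"}

def sentiment_score(comments):
    """Net (positive - negative) lexicon hits, normalized by length."""
    words = [w for c in comments for w in c.lower().split()]
    if not words:
        return 0.0
    return (sum(w in POSITIVE for w in words)
            - sum(w in NEGATIVE for w in words)) / len(words)

def flag_user(comments, threshold=-0.1):
    """Flag users whose comments are, on balance, strongly negative."""
    return sentiment_score(comments) < threshold
```

A real system would of course use a trained sentiment model rather than a hand-written lexicon; the point is only the pipeline shape, from per-comment scores to a per-user decision.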
... Problematic content and in particular hate speech has been a growing research area. Linguists have analysed and described various forms of hate speech [4]. Political scientists and legal experts search for ways to regulate platforms and to handle problematic content without oppressing free speech [5]. ...
Preprint
Full-text available
With the growth of social media, the spread of hate speech is also increasing rapidly. Social media are widely used in many countries, and hate speech is spreading in these countries as well. This brings a need for multilingual hate speech detection algorithms. Much research in this area is dedicated to English at the moment. The HASOC track intends to provide a platform to develop and optimize hate speech detection algorithms for Hindi, German and English. The dataset is collected from a Twitter archive and pre-classified by a machine learning system. HASOC has two sub-tasks for all three languages: task A is a binary classification problem (Hate and Not Offensive) while task B is a fine-grained classification problem for three classes, HATE (hate speech), OFFENSIVE and PROFANITY. Overall, 252 runs were submitted by 40 teams. The best classification algorithms for task A achieve F1 measures of 0.51, 0.53 and 0.52 for English, Hindi, and German, respectively. For task B, the best classification algorithms achieved F1 measures of 0.26, 0.33 and 0.29 for English, Hindi, and German, respectively. This article presents the tasks and the data development as well as the results. The best performing algorithms were mainly variants of the transformer architecture BERT. However, other systems were also applied with good success.
... Incels abide by "The Black Pill," the belief that unattractive men would be doomed to romantic loneliness and unhappiness. Previous work has studied the community's links with other masculinist communities [36], as well as its relationship with terrorist attacks [15] and the production of misogynistic content online [17]. ...
Preprint
Online platforms face pressure to keep their communities civil and respectful. Thus, the bannings of problematic online communities from mainstream platforms like Reddit and Facebook are often met with enthusiastic public reactions. However, this policy can lead users to migrate to alternative fringe platforms with lower moderation standards and where antisocial behaviors like trolling and harassment are widely accepted. As users of these communities often remain co-active across mainstream and fringe platforms, antisocial behaviors may spill over onto the mainstream platform. We study this possible spillover by analyzing around 70,000 users from three banned communities that migrated to fringe platforms: r/The_Donald, r/GenderCritical, and r/Incels. Using a difference-in-differences design, we contrast co-active users with matched counterparts to estimate the causal effect of fringe platform participation on users' antisocial behavior on Reddit. Our results show that participating in the fringe communities increases users' toxicity on Reddit (as measured by Perspective API) and involvement with subreddits similar to the banned community, which often also breach platform norms. The effect intensifies with time and exposure to the fringe platform. In short, we find evidence for a spillover of antisocial behavior from fringe platforms onto Reddit via co-participation.
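The difference-in-differences estimate underlying such results is simple arithmetic: the migrating users' pre-to-post change in toxicity minus the matched controls' change over the same period. A sketch with hypothetical toxicity means (the numbers are illustrative, not from the paper):

```python
def did_estimate(treated_pre, treated_post, control_pre, control_post):
    """Difference-in-differences: the treated group's pre-to-post
    change minus the matched control group's change, which nets out
    trends shared by both groups."""
    return (treated_post - treated_pre) - (control_post - control_pre)

# Hypothetical mean toxicity scores (illustrative numbers only):
effect = did_estimate(0.10, 0.25, 0.10, 0.15)
```

Here the treated group's toxicity rose by 0.15 and the controls' by 0.05, so the estimated effect of fringe-platform participation is 0.10.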
... This has motivated researchers to put significant efforts into creating datasets and designing intelligent algorithms for hate speech detection. Linguists, in addition, have analyzed contents and defined different categories of hate speech data [12]. Reference [13] introduced the hate speech-offensive (HSO) dataset. ...
Article
Full-text available
We investigate the performance of a state-of-the-art (SoTA) architecture, T5 (available on the SuperGLUE), and compare it with 3 other previous SoTA architectures across 5 different tasks from 2 relatively diverse datasets. The datasets are diverse in terms of the number and types of tasks they have. To improve performance, we augment the training data by using an autoregressive model. We achieve near-SoTA results on a couple of the tasks: macro F1 scores of 81.66% for task A of the OLID 2019 dataset and 82.54% for task A of the hate speech and offensive content (HASOC) 2021 dataset, where SoTA are 82.9% and 83.05%, respectively. We perform error analysis and explain why one of the models (Bi-LSTM) makes the predictions it does by using a publicly available algorithm: Integrated Gradient (IG). This is because explainable artificial intelligence (XAI) is essential for earning the trust of users. The main contributions of this work are the implementation method of T5, which is discussed; the data augmentation using a new conversational AI model checkpoint, which brought performance improvements; and the revelation of the shortcomings of the HASOC 2021 dataset. It reveals the difficulties of poor data annotation by using a small set of examples where the T5 model made the correct predictions, even when the ground truth of the test set was incorrect (in our opinion). We also provide our model checkpoints on the HuggingFace hub to foster transparency.
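Integrated Gradients attributes a prediction by accumulating gradients along the straight path from a baseline to the input. A midpoint-rule sketch for a scalar function (real IG is applied per input feature of a model such as the Bi-LSTM above):

```python
def integrated_gradients(f, x, baseline=0.0, steps=100, eps=1e-6):
    """Midpoint-rule approximation of Integrated Gradients for a
    scalar function f, using central-difference gradients."""
    total = 0.0
    for k in range(steps):
        # Midpoint of the k-th segment on the path baseline -> x.
        point = baseline + (k + 0.5) / steps * (x - baseline)
        total += (f(point + eps) - f(point - eps)) / (2 * eps)
    return (x - baseline) * total / steps
```

By the completeness axiom, the attribution approximates f(x) - f(baseline); e.g. for f(t) = t² with x = 2 and baseline 0 the result is close to 4.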
... This assumption is supported by a discourse analysis of the now banned 'Incels.me' forum (Jaki et al., 2019). Note also the rules on the 'r/Incels' message board, which has also been shut down: 'Most can agree that women can be incel in some rare situations such as extreme disfigurement, but their numbers do not come close to male incels' (qtd. ...
Chapter
Full-text available
When we encounter extremist rhetoric, we often find it dumbfounding, incredible, or straightforwardly unintelligible. For this reason, it can be tempting to dismiss or ignore it, at least where it is safe to do so. The problem discussed in this paper is that such dismissals may be, at least in certain circumstances, epistemically unjust. Specifically, it appears that recent work on the phenomenon of hermeneutical injustice compels us to accept two unpalatable conclusions: first, that this failure of intelligibility when we encounter extremist rhetoric may be a manifestation of a hermeneutical injustice; and second, that remedying this injustice requires that we ought to become more engaged with and receptive of extremist worldviews. Whilst some theorists might interpret this as a reductio of this framework of epistemic in/justice, we push back against this conclusion. First, we argue that with a suitably amended conception of hermeneutical justice—one that is sensitive to the contextual nature of our hermeneutical responsibilities, and to the difference between understanding a worldview and accepting it—we can bite the bullet and accept that certain extremists are subject to hermeneutical injustice, but without committing ourselves to any unpalatable conclusions about how we ought to remedy these injustices. Second, we argue that bringing the framework of hermeneutical in/justice to bear upon the experience of certain extremists actually provides a new and useful perspective on one of the causes of extremism, and how it might be undermined.
... And while the vast majority of incels do not commit mass murder and some do not support violence (Daly & Reed, 2022;Pantucci & Ong, 2020;Speckhard et al., 2021;Voroshilova & Pesterev, 2021), many participants in incel forums may contribute to a social climate in which mass killings and attacks against women become increasingly likely (Caruso et al., 2021;Cottee, 2021;Hoffman et al., 2020;O'Malley et al., 2020;Scaptura & Boyle, 2020;Speckhard et al., 2021). The most problematic incel perspectives blame women for men's sexual frustration and contain themes of misogyny, victimhood, and fatalism (Cottee, 2021;Ging, 2019;Jaki et al., 2019;O'Malley et al., 2020;Tranchese & Sugiura, 2021;Voroshilova & Pesterev, 2021). ...
Article
Although several mass killings by incels have received much attention, the overall phenomenon of sexually frustrated offenders seems even larger. This study drew from a recently developed sexual frustration theory to closely examine public mass shooters in the United States from 1966 to 2021 (n = 178). Results showed that some sexually frustrated perpetrators just wanted sex, while others lusted after unavailable partners or had illegal urges that were difficult to satisfy. Quantitative analyses indicated that compared to other mass shooters, sexually frustrated perpetrators were more frequently young, male, unmarried, childless, and unemployed. They were also more likely to be misogynistic, sex offenders, and fame-seekers, and their attacks killed significantly more female victims. Concerted efforts to reduce toxic masculinity and provide better guidance to young men could help reduce this threat.
... These experiences may in turn shape their ideology. For example, subscribers to the involuntary celibate (Incel) movement believe they have been rejected by women and strive to achieve revenge for their perceived victimhood by committing violence against any/all women (Baele et al., 2021), and a linguistic analysis of their online posts suggests they frequently discuss enacting this desired revenge (Jaki et al., 2019). This is exemplified in the case of Elliot Rodger who in 2014 killed six people and injured fourteen because "[he] wanted to make everyone else suffer just as they made [him] suffer. ...
Article
Full-text available
In recent years, researchers of various disciplines have developed many theories to understand the radicalization process. One key factor that may promote radicalization is social exclusion, the state of being kept apart from others. Indeed, experimental studies have provided initial evidence for a relation between exclusion and radicalism. The current review outlines and builds upon these research programs, arguing that social exclusion has been shown (a) to increase the willingness to fight-and-die, (b) to promote the approval for extreme, even violent, political parties and actions, and (c) to push the willingness to engage in illegal and violent action for a political cause. We close with an agenda for future research and critically discuss implications of this work for social policy.
... Radicalizing individuals generate a sense of in-group loyalty and a positive outlook towards the in-group while also being hostile and negative towards the out-group. Research into the Manosphere has shown that its members have a shared identity and have constructed a major out-group (women) as the cause of their grievances [65] and a target for violence [66]. ...
Preprint
Full-text available
The algorithms and the interactions facilitated by online platforms have been used by radical groups to recruit vulnerable individuals to their cause. This has resulted in the sharp growth of violent events and deteriorating online discourse. The Manosphere, a collection of radical anti-feminist communities, is one such group which has attracted attention due to their rapid growth and increasingly violent real world outbursts. In this paper, we examine the social engagements between Reddit users who have participated in feminist discourse and the Manosphere communities on Reddit to understand the process of development of traits associated with the adoption of extremist ideologies. By using existing research on the psychology of radicalization we track how specific types of social engagement with the Manosphere influence the development of traits associated with radicalization. Our findings show that: (1) participation, even by the simple act of joining the Manosphere, has a significant influence on the language and outlook traits of a user, (2) Manosphere elites are extremely effective propagators of radical traits and cause their increase even outside the Manosphere, and (3) community perception can heavily influence a user's behavior. Finally, we examine how our findings can help draft community and platform moderation policies to help mitigate the problem of online radicalization.
... This has motivated researchers to put significant efforts into creating datasets and designing intelligent algorithms for hate speech detection. Linguists, in addition, have analyzed contents and defined different categories of hate speech data [12]. Reference [13] introduced the hate speech-offensive (HSO) dataset. ...
Preprint
Full-text available
We investigate the performance of a state-of-the-art (SoTA) architecture T5 (available on the SuperGLUE) and compare it with 3 previous SoTA architectures across 5 different tasks from 2 relatively diverse datasets. The datasets are diverse in terms of the number and types of tasks they have. To improve performance, we augment the training data by using an autoregressive model. We achieve near-SoTA results on a couple of the tasks - macro F1 scores of 81.66% for task A of the OLID 2019 dataset and 82.54% for task A of the hate speech and offensive content (HASOC) 2021 dataset, where SoTA are 82.9% and 83.05%, respectively. We perform error analysis and explain why one of the models (Bi-LSTM) makes the predictions it does by using a publicly available algorithm: Integrated Gradient (IG). This is because explainable artificial intelligence (XAI) is essential for earning the trust of users. The main contributions of this work are the implementation method of T5, which is discussed; the data augmentation using a new conversational AI model checkpoint, which brought performance improvements; and the revelation of the shortcomings of the HASOC 2021 dataset. It reveals the difficulties of poor data annotation by using a small set of examples where the T5 model made the correct predictions, even when the ground truth of the test set was incorrect (in our opinion). We also provide our model checkpoints on the HuggingFace hub to foster transparency.
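Macro F1, the metric the abstract above reports, averages per-class F1 scores so that each class counts equally regardless of its size. A minimal pure-Python sketch of the metric (illustrative only, not the paper's evaluation code):

```python
def macro_f1(y_true, y_pred):
    """Average of per-class F1 scores, each class weighted equally."""
    classes = sorted(set(y_true) | set(y_pred))
    f1s = []
    for c in classes:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return sum(f1s) / len(f1s)
```

Because every class contributes equally to the average, macro F1 penalises poor performance on a rare class (e.g. hateful posts) that raw accuracy would hide.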
... A distinct aspect of incel language that has received far less attention, however, is to do with race. The incel community is racially diverse, and active in discussions of race and constructions of specific in-group racism (Jaki et al., 2019). Overlaps between incel and racist agendas have been increasingly noted in research and commentary (e.g. ...
Article
Full-text available
It is often observed that in modern English no political movement has created an internet jargon with the speed and range of the alt-right. Recently, however, we are seeing a specifically misogynist strand of this jargon shoot up, coming from the growing online anti-feminist network known as the Manosphere, and specifically its popularly best known outpost of ‘incels’. The neologisms being produced by incels have come to form a true cryptolect, developing at a rate that almost escapes linguistic description; at the same time, the elements of this cryptolect are quickly infiltrating broader popular culture and global vernacular contexts, from social media to Urban Dictionary (Ging, Lynn & Rosati, 2020).
... As mentioned above, in some cases, incels' attitudes become so extreme that incel-created websites have to be blocked or taken down due to the breaches in violent content rules. These violations are mostly due to extreme misogynist views often accompanied by hate speech (Byerly 2020;Jaki et al. 2019; Pereira e Silva 2021). This radicalisation is reflected in and further strengthened by the language that incels use in their forums. ...
Article
Dialogue plays a vital role in interpersonal relations, creating and strengthening social cohesion. Conversely, the lack of dialogue – and more tellingly a deliberate resistance to it – leads to social friction and animosity. In this paper I focus on a strategy used to intentionally disable the possibility of a meaningful dialogue and to deny any voice to the "other". Dehumanising the "other" by linguistically representing them as animals or machines exempts the perpetrator from any obligation towards the "other", including the obligation to respect their rights. I adopt Haslam's model of dehumanisation (2006), which shows how, by means of metaphorical language, women are dehumanised and denied the possibility to participate in a meaningful dialogue in the so-called manosphere.
... Considering this, mental illness appears to be a major concern among the community. Indeed, a group of incels refer to themselves as 'mentalcels', a type of incel who attributes their inceldom to mental illness (Jaki 2019). This suggests the potential for subgroups of incels who have different contributing factors to their inceldom, but there is no available research suggesting how mental illness influences how they attribute blame for their lack of sexual relationships. ...
Article
Full-text available
In recent years, mass violence associated with men who identify as involuntary celibates (incels) has been of increasing concern. Incels engage in an online community where misogyny and incitements to violence against women are prevalent, often owing to the belief that women are denying them a ‘right’ to sex. Indeed, inceldom can be considered a form of extremism. Information released about the perpetrators of incel-associated violence consistently suggests that mental disorder is a contributory factor and may increase vulnerability to engaging with the incel community. Depression, autism and personality disorder are particularly relevant. To date, there has been little research into the mental health of incels and how, in some, this contributes to violence. This article considers the associations between mental disorder and inceldom, including the risk factors for incel-related violence, and makes recommendations for best practice in risk assessment and clinical intervention.
... This is one reason why challenges to traditional structures, including changes in abortion laws, have led to anxiety among men who feel their way of life disappearing (Kelly, 2017). In some cases, the associated backlash has taken the shape of ultra-traditional, nostalgic and often militant and toxic white masculinity (Burns, 2017; Jaki et al., 2019). Women also participate in the resistant backlash, as evident in the rise of far-right female 'TradWives' (Love, 2020), hyper-feminine and submissive personalities who recruit males to the white-nationalist cause by framing racist politics as heroic and romantic (Tebaldi, 2021), and female men's rights activists (FeMRA). ...
Article
According to the Great Replacement conspiracy theory, nonwhites, globalists and elites are plotting to eliminate the white race and its dominance through anti-white policies and increased immigration. In that context, abortion among white women is perceived by white nationalists (WN) as a betrayal of their “biological” and “traditional” gender role: procreation of white babies. While WN condemn abortion among white women as a murderous sin, at times they encourage the practice among nonwhites to solve demographic threats to white dominance. In this study, we use mixed methods, combining unsupervised machine learning with close textual analysis of 30,725 posts including the term “abortion” published on the WN website Stormfront between 2001 and 2017. We identify three broad themes: White genocide, focused on the conspiracy theory and detailing the active actors in its alleged execution; political, focused on political agendas and laws; and WN reproductive reasoning, articulating and justifying the contradiction between supporting abortion for nonwhites but not for whites via politics of difference that emphasize nonwhites’ supposed inferior morality. We discuss WN’s unique and explicitly racist discourse around a medical topic like abortion, a staple of the conservative and religious right for decades, and how it is used to alleviate their cognitive dissonance resulting from their dual-stance on abortion. Such discourse could be harnessed to recruit members into the movement and normalize extreme, racist ideologies.
Chapter
Scholars have begun to identify the links between incidents of mass murder and misogynistic behaviors. From the 2014 Isla Vista campus shootings in California through to an incident of mass violence with a van in Toronto in 2018, identifying as an “incel” has been cited as a motivating factor in the perpetrators' pre-attack writings. “Incel” stands for “involuntary celibates,” an online subculture of males displaying rage at females, expressing fandom for mass shooters, and fantasizing about violence. Further complicating matters is the frequent overlap between intimate partner violence and/or stalking with acts of mass violence. In this chapter, suggestions are advanced for ways to effectively assess the risk of mass violence when misogynistic behaviors are present. The potential use of risk assessment instruments is discussed, in addition to ways to devise an effective threat assessment system.
Article
An emerging body of criminological research has sought to investigate the ‘incels’ movement. This article explores the construction of masculinities, gender and violence in a popular incel online forum. Here, we discuss results from a digital ethnography incorporating qualitative analysis of user posts, with a focus on three key overarching themes, namely: biological determinism, masculine humiliation and hierarchical gender relations. Our analysis draws on concepts such as hegemonic and hybrid masculinities (Demetriou 2001; Connell 2005; Bridges and Pascoe 2014), as well as Kimmel’s (2013) notion of ‘aggrieved entitlement’, to further develop understandings of the complexity of users’ framings of gender and masculine identity within the forum. Implications for future research in the field are then discussed.
Article
This article represents the largest ever primary data-based study of involuntary celibates (incels), previously studied nearly exclusively through analysis of online postings. The incel movement has been characterized by some as a radical ideology, with mass murderers such as Elliot Rodger, Alek Minassian, and Chris Harper Mercer being portrayed as prototypical of the movement. However, there is a dearth of research through direct questioning of incels and therefore very little nuanced understanding of the community, its shared grievances, and its opinions regarding violence in its name. The present study of over 250 self-identified incels demonstrates that although the majority of incels are non-violent and do not approve of violence, those who consider themselves to be staunch misogynists are likely to endorse a desire to commit violence and are also likely to become more misogynistic through participation on incel web forums, which validate their views. The study also finds that while many incels report experiencing a variety of psychological symptoms, they are loath to seek help from mental health professionals. This implies that the threat of violence from a subset of incels should not be ignored, but promotion of compassionate and understanding psychological support may be more broadly beneficial to the community.
Conference Paper
Full-text available
We target the complementary binary tasks of identifying whether a tweet is misogynous and, if that is the case, whether it is also aggressive. We compare two ways to address these problems: one multi-class model that discriminates between all the classes at once: not misogynous, non aggressive-misogynous and aggressive-misogynous; as well as a cascaded approach where the binary classification is carried out separately (misogynous vs non-misogynous and aggressive vs non-aggressive) and then joined together. For the latter, two training and three testing scenarios are considered. Our models are built on top of AlBERTo and are evaluated on the framework of Evalita's 2020 shared task on automatic misogyny and aggressiveness identification in Italian tweets. Our cascaded models-including the strong naïve baseline-outperform significantly the top submissions to Evalita, reaching state-of-the-art performance without relying on any external information.
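In the cascaded setup described above, two binary decisions are joined into the three final classes; the aggressiveness stage only matters for tweets already flagged as misogynous. A minimal sketch of that joining logic (illustrative only; the class names are paraphrased and this is not the paper's AlBERTo pipeline):

```python
def join_cascade(misogynous: bool, aggressive: bool) -> str:
    """Combine the two binary stages into the three final classes.
    A non-misogynous tweet is final regardless of the second stage."""
    if not misogynous:
        return "not_misogynous"
    return "aggressive_misogynous" if aggressive else "non_aggressive_misogynous"
```

The multi-class alternative would instead predict one of the three labels in a single step; the cascade lets each binary model specialise on its own distinction.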
Chapter
Systematic case studies have identified risk factors for targeted violence. The sequence of thinking and behavior on a “pathway to violence” is often discernable, and campus attacks potentially preventable. Threat assessment, described in this chapter, is an evidence-based, information-driven, and collaborative undertaking to address the wide variety of potentially violent scenarios that may occur on campuses. Steps in case assessment and intervention principles are outlined and risk inquiry questions provided.
Article
In this article, we investigate, through corpus linguistics and qualitative approaches, YouTube responses to an advert which attempts to bring to the fore detrimental masculine toxic behaviours. With the affordances proper to the medium - anonymity, disinhibition, and de-individuation - our investigation focuses on three gendered terms representing subordinate masculinities (Messerschmidt, 2018): soy, cuck and beta, challenging ‘masculine’ attributes such as toughness, power, heterosexuality, competitiveness, and authority. These are thought to deviate from alpha masculinity between traditional opposite points: alpha men and women. The findings show that compound identities are also constructed, and deviant and subordinate masculinities are seen to be associated with political or social movements. Furthermore, comments suggest that traditional masculinity is threatened by groups of men who are considered socially inferior, provoking a (white) male sense of nostalgic entitlement. This online platform becomes a mediated space for discrimination, a softer manosphere, where anti-feminist sentiments are implicit. This article contributes to the literature on discourse and masculinities, and constructions of gender in online hostile spaces.
Article
Individuals identifying as involuntarily celibate (i.e., "incels"), who are unable to have sex despite wanting to, often later no longer wish to identify as incels. This transition requires former incels to perform facework to reconfigure these otherwise rigid identity categories. Guided by performative face theory, this study analyzed 77 narratives posted to Reddit by users who at one time identified as incels in which they discuss the transition away from inceldom. This analysis elucidates how former incels employ meta-facework strategies online to reimagine their identities and become deradicalized. Findings indicated seven factors that contributed to the initial "doing" of inceldom, two major categories of inciting incidents that catalyzed the "troubling" of performed incel identities, and two sedimenting and three subversive meta-facework strategies employed by Reddit users to facilitate the "undoing" of inceldom. Theoretical and practical implications are offered.
Article
Full-text available
Online radicalization is among the most vexing challenges the world faces today. Here, we demonstrate that homogeneity in moral concerns results in increased levels of radical intentions. In Study 1, we find that in Gab—a right-wing extremist network—the degree of moral convergence within a cluster predicts the number of hate-speech messages members post. In Study 2, we replicate this observation in another extremist network, Incels. In Studies 3 to 5 ( N = 1,431), we demonstrate that experimentally leading people to believe that others in their hypothetical or real group share their moral views increases their radical intentions as well as willingness to fight and die for the group. Our findings highlight the role of moral convergence in radicalization, emphasizing the need for diversity of moral worldviews within social networks.
Article
The term “incel” is a portmanteau of the words “involuntary” and “celibate,” and incels as a group represent a new emergent Internet subculture. Often, they’ve been connected to viewpoints and language that promote toxic masculinity, while encouraging violence against women and minorities. Previous research has often highlighted incels’ views on women and the language used to perpetuate their antifeminist viewpoints and ideology. However, little research has been produced on how incels view themselves and other men. This research uses popular conversations from various incel message boards to qualitatively analyze the discourses incels use to convey the stratification of doing gender within their subculture. Incels use sex as a central metaphor to convey ideas about how power and resources should be distributed in society. Findings suggest that incels use a complex “interpretive repertoire” to describe a gendered spectrum of political agency.
Article
Following a series of deadly attacks, and increasingly in recent years, incels have entered not only the public lexicon but also piqued scholarly interest, especially in terrorism research and programmes aimed at countering violent extremism (CVE). However, much of the current analyses largely interpret incel communities as homogenous, and in doing so ignore the complex and often contradictory nature of incel communities. CVE recommendations made by these scholars are often founded on misconceptions of incel identity and community. Through a critical feminist lens, in this article we argue that the focus on incels should seek to understand the role of male supremacy, antifeminism, and misogyny in society. Additionally, we argue against the trend of attempting to classify and securitise incels as a unique form of misogynistic violence, and identify the dangers of a lack of focus on male supremacy.
Article
Hate speech is any kind of communication that attacks a person or a group based on their characteristics, such as gender, religion and race. Due to the availability of online platforms where people can express their (hateful) opinions, the amount of hate speech is steadily increasing, often leading to offline hate crimes. This paper focuses on understanding and detecting hate speech in underground hacking and extremist forums where cybercriminals and extremists, respectively, communicate with each other, and some of them are associated with criminal activity. Moreover, due to the lengthy posts, it would be beneficial to identify the specific span of text containing hateful content in order to assist site moderators with the removal of hate speech. This paper describes a hate speech dataset composed of posts extracted from HackForums, an online hacking forum, and Stormfront and Incels.co, two extremist forums. We combined our dataset with a Twitter hate speech dataset to train a multi-platform classifier. Our evaluation shows that a classifier trained on multiple sources of data does not always improve the performance compared to a mono-platform classifier. Finally, this is the first work on extracting hate speech spans from longer texts. The paper fine-tunes BERT (Bidirectional Encoder Representations from Transformers) and adopts two approaches – span prediction and sequence labelling. Both approaches successfully extract hateful spans and achieve an F1-score of at least 69%.
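In the sequence-labelling approach mentioned above, each token receives a tag and contiguous hateful tags are then merged back into spans. A dependency-free sketch of decoding a BIO tag sequence into token spans (an assumed tagging scheme for illustration, not the paper's code):

```python
def bio_to_spans(tags):
    """Decode a BIO tag sequence into (start, end) token spans, end exclusive."""
    spans, start = [], None
    for i, tag in enumerate(tags):
        if tag == "B":            # a new span begins, closing any open one
            if start is not None:
                spans.append((start, i))
            start = i
        elif tag == "I":
            if start is None:     # tolerate an I without a preceding B
                start = i
        else:                     # "O" closes any open span
            if start is not None:
                spans.append((start, i))
                start = None
    if start is not None:         # span running to the end of the sequence
        spans.append((start, len(tags)))
    return spans
```

Span-level F1 (as opposed to token-level) would then be computed by comparing the decoded spans against gold spans.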
Article
Full-text available
Across time, in a variety of forms and spaces - from homes and workplaces to digital domains of social media - women have become victims of male dominance. Other vulnerable groups likewise suffer multi-layered abuse and endure sexual harassment on social media. Yet, this phenomenon is insufficiently explored. Therefore, this article argues that social media spaces have become domains for sexual harassment and subjugation of women. This article examines gender-trolling on Twitter as a form of sexual violence against women. Employing qualitative analyses of the Twitter conversations on Indian journalists, namely Barkha Dutt, Sagarika Ghose, and Rana Ayyub, it exposes the nature and form of sexual violence against women on the micro-blogging space, and argues that social media platforms constitute convenient havens of harassment against assertive women. Keywords: Twitter, social media, women journalists, sexual harassment, misogyny
Article
Full-text available
Incel, shorthand for 'involuntarily celibate,' is a violent political ideology based on a new wave of misogyny and white supremacy. These (mostly) young men are frustrated at a world they see as denying them power and sexual control over women's bodies. In their eyes, they are victims of oppressive feminism, an ideology which must be overthrown, often through violence. Incel ideology presents a mythologized view that prior to the sexual revolution in the '60s, every man had access to a female partner; subsequent to the women's empowerment movement, fewer and fewer men have access to a partner. They frame this shift as a profound injustice to men who cannot find a sexual partner, suggesting that society has failed to give men what they are entitled to (access to women's bodies) and that the only recourse is violent insurrection.
Chapter
Full-text available
The rampage of Elliot Rodger adjacent to the campus of the University of California, Santa Barbara in May 2014 is used as a case study of how a sexually and socially frustrated and isolated young man from a highly privileged background became a rampage shooter. He followed a route typical of post‐Columbine rampage shooters by socializing himself to the role of rampager through reading books that informed and reinforced his worldview, playing first‐person shooter online games, visiting hate group websites, purchasing weaponry, and planning his rampage. The social‐political perspective is advanced as an alternative to the more popular mental illness approach because it addresses three fundamental questions related to rampage shootings: Why are nearly all rampage shooters male? Why are they mostly American? And why have they emerged as a serious social problem since the 1980s? The chapter ends with suggestions about what needs to be done to reduce the number of rampage shootings in the United States.
Article
Full-text available
We have developed a system that automatically detects online jihadist hate speech with over 80% accuracy, by using techniques from Natural Language Processing and Machine Learning. The system is trained on a corpus of 45,000 subversive Twitter messages collected from October 2014 to December 2016. We present a qualitative and quantitative analysis of the jihadist rhetoric in the corpus, examine the network of Twitter users, outline the technical procedure used to train the system, and discuss examples of use.
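The system above combines NLP features with a machine-learning classifier trained on labelled tweets. As a toy illustration of that kind of pipeline (a bag-of-words multinomial Naive Bayes, a standard baseline for text classification, not the authors' system, and with invented example data):

```python
import math
from collections import Counter, defaultdict

class NaiveBayesText:
    """Toy multinomial Naive Bayes over whitespace tokens, add-one smoothing."""

    def fit(self, texts, labels):
        self.word_counts = defaultdict(Counter)   # per-label token counts
        self.label_counts = Counter(labels)       # document counts per label
        for text, label in zip(texts, labels):
            self.word_counts[label].update(text.lower().split())
        self.vocab = {w for c in self.word_counts.values() for w in c}
        return self

    def predict(self, text):
        scores = {}
        for label, n_docs in self.label_counts.items():
            total = sum(self.word_counts[label].values())
            # log prior + smoothed log likelihood of each token
            score = math.log(n_docs / sum(self.label_counts.values()))
            for w in text.lower().split():
                score += math.log((self.word_counts[label][w] + 1)
                                  / (total + len(self.vocab)))
            scores[label] = score
        return max(scores, key=scores.get)
```

A real system at this scale would use richer features (n-grams, embeddings) and a held-out evaluation split, but the train/predict structure is the same.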
Article
Full-text available
Misogyny online in forms such as explicit rape threats has become so prevalent and rhetorically distinctive it resembles a new dialect or language. Much of this ‘Rapeglish’ is produced by members of an informal alliance of men’s groups online dubbed the ‘Manosphere’. As both a cyberhate researcher and cyberhate target, I have studied as well as contributed to feminist responses to Rapeglish. In 2016, for instance, I helped build a Random Rape Threat Generator (RRTG) – a computer program that splices, shuffles around, and re-stitches in novel combinations fragments of real-life Rapeglish to illustrate the formulaic, machine-like, and impersonal nature of misogynist discourse online. This article uses Yuri Lotman’s ideas about intra- and inter-cultural conflict involving something akin to the translation of a foreign language to frame the RRTG as one example of the way women are ‘talking back’ both to and with Rapeglish (the latter involving appropriations and subversions of the original discourse).
Article
Full-text available
This Research Paper examines how the white supremacist movement Christian Identity emerged from a non-extremist forerunner known as British Israelism. By examining ideological shifts over the course of nearly a century, the paper seeks to identify key pivot points in the movement’s shift toward extremism and explain the process through which extremist ideologues construct and define in-group and out-group identities. Based on these findings, the paper proposes a new framework for analysing and understanding the behaviour and emergence of extremist groups. The proposed framework can be leveraged to design strategic counter-terrorism communications programmes using a linkage-based approach that deconstructs the process of extremist in-group and out-group definition. Future publications will continue this study, seeking to refine the framework and operationalise messaging recommendations.
Article
Full-text available
Since the emergence of Web 2.0 and social media, a particularly toxic brand of antifeminism has become evident across a range of online networks and platforms. Despite multiple internal conflicts and contradictions, these diverse assemblages are generally united in their adherence to Red Pill “philosophy,” which purports to liberate men from a life of feminist delusion. This loose confederacy of interest groups, broadly known as the manosphere, has become the dominant arena for the communication of men’s rights in Western culture. This article identifies the key categories and features of the manosphere and subsequently seeks to theorize the masculinities that characterize this discursive space. The analysis reveals that, while there are some continuities with older variants of antifeminism, many of these new toxic assemblages appear to complicate the orthodox alignment of power and dominance with hegemonic masculinity by operationalizing tropes of victimhood, “beta masculinity,” and involuntary celibacy (incels). These new hybrid masculinities provoke important questions about the different functioning of male hegemony off- and online and indicate that the technological affordances of social media are especially well suited to the amplification of new articulations of aggrieved manhood.
Article
Full-text available
The issue of hate speech has received significant attention from legal scholars and philosophers alike. But the vast majority of this attention has been focused on presenting and critically evaluating arguments for and against hate speech bans as opposed to the prior task of conceptually analysing the term ‘hate speech’ itself. This two-part article aims to put right that imbalance. It goes beyond legal texts and judgements and beyond the legal concept hate speech in an attempt to understand the general concept hate speech. And it does so using a range of well-known methods of conceptual analysis that are distinctive of analytic philosophy. One of its main aims is to explode the myth that emotions, feelings, or attitudes of hate or hatred are part of the essential nature of hate speech. It also argues that hate speech is best conceived as a family resemblances concept. One important implication is that when looking at the full range of ways of combating hate speech, including but not limited to the use of criminal law, there is every reason to embrace an understanding of hate speech as a heterogeneous collection of expressive phenomena. Another is that it would be unsound to reject hate speech laws on the premise that they are effectively in the business of criminalising emotions, feelings, or attitudes of hate or hatred.
Chapter
Full-text available
This chapter examines language aggression against women in public online deliberation regarding crimes of violence against women. To do so, we draw upon a corpus of 460 unsolicited digital comments sent in response to four public service advertisements against women abuse posted on YouTube. Our analysis reveals that three patriarchal strategies of abuse — namely, minimize the abuse, deny its existence, and blame women — are enacted in the online discourse under scrutiny and shows how, at the micro-level of interaction, these strategies relate to social identity and gender ideology through complex processes of positive in-group description and negative out-group presentation. We also argue that despite the few comments that explicitly support abuse, this situation changes at implicit, indirect levels of discourse.
Article
Full-text available
School shooters present a challenge to both forensic psychiatry and law enforcement agencies. The relatively small number of school shooters, their various characteristics, and the lack of in-depth analysis of all of the shooters prior to the shooting add complexity to our understanding of this problem. In this short paper, we introduce a new methodology for automatically profiling school shooters. The methodology involves automatic analysis of texts and the production of several measures relevant for the identification of the shooters. Comparing texts written by 6 school shooters to 6056 texts written by a comparison group of male subjects, we found that the shooters' texts scored significantly higher on the Narcissistic Personality dimension as well as on the Humiliated and Revengeful dimensions. Using a ranking/prioritization procedure, similar to the one used for the automatic identification of sexual predators, we provide support for the validity and relevance of the proposed methodology.
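The methodology above scores texts on psychological dimensions and then ranks authors for prioritisation. A minimal sketch of that idea, scoring texts against a word list and ranking authors by mean score (the word list and data here are hypothetical; the paper's actual dimension measures are not lexicon lookups this simple):

```python
def dimension_score(text, lexicon):
    """Fraction of tokens in the text that hit the dimension's word list."""
    tokens = text.lower().split()
    return sum(t in lexicon for t in tokens) / len(tokens) if tokens else 0.0

def rank_authors(texts_by_author, lexicon):
    """Rank authors by their mean score on the dimension, highest first."""
    means = {a: sum(dimension_score(t, lexicon) for t in ts) / len(ts)
             for a, ts in texts_by_author.items()}
    return sorted(means, key=means.get, reverse=True)
```

The ranking step matters operationally: rather than a hard threshold, analysts review the highest-scoring authors first.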
Article
Full-text available
As video games have attracted more critical attention and theoretical discourse and games play a more visible part in our media landscape, the modern video game community impacts the wider world of online culture and warrants more detailed study. Using the case of the Dickwolves incident from Penny-Arcade.com, the authors address issues of hypermasculinity and sexism within the gaming community and how this lens brings to light issues with a hostile response to the expression of a female identity or femininity. The authors argue that this case highlights how the hypermasculine discourse encourages the overt privileging of masculinity over femininity and discourages women from engaging in gendered discourse within the community.
Article
Full-text available
Pattern is a package for Python 2.4+ with functionality for web mining (Google + Twitter + Wikipedia, web spider, HTML DOM parser), natural language processing (tagger/chunker, n-gram search, sentiment analysis, WordNet), machine learning (vector space model, k-means clustering, Naive Bayes + k-NN + SVM classifiers) and network analysis (graph centrality and visualization). It is well documented and bundled with 30+ examples and 350+ unit tests. The source code is licensed under BSD and available from http://www.clips.ua.ac.be/pages/pattern.
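As a rough illustration of one capability Pattern bundles (lexicon-based sentiment analysis), here is a toy re-implementation in plain Python. This is not Pattern's actual API, and the lexicon words and scores are invented; it only sketches the underlying idea of averaging word polarities.

```python
# Invented mini-lexicon: polarity scores in [-1, 1], as a sentiment
# lexicon like the one shipped with Pattern would provide.
LEXICON = {"great": 0.8, "good": 0.7, "awful": -0.8, "bad": -0.7}

def polarity(text):
    """Average polarity of the lexicon words found in `text`."""
    words = text.lower().split()
    scores = [LEXICON[w] for w in words if w in LEXICON]
    return sum(scores) / len(scores) if scores else 0.0

print(round(polarity("a great movie with a good plot"), 2))  # 0.75
print(round(polarity("an awful ending"), 2))                 # -0.8
```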
Article
Full-text available
This paper investigates political homophily on Twitter. Using a combination of machine learning and social network analysis we classify users as Democrats or as Republicans based on the political content shared. We then investigate political homophily both in the network of reciprocated and nonreciprocated ties. We find that structures of political homophily differ strongly between Democrats and Republicans. In general, Democrats exhibit higher levels of political homophily. But Republicans who follow official Republican accounts exhibit higher levels of homophily than Democrats. In addition, levels of homophily are higher in the network of reciprocated followers than in the nonreciprocated network. We suggest that research on political homophily on the Internet should take the political culture and practices of users seriously.
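A minimal sketch of the homophily measure the abstract describes: for each group, the share of its ties that stay within the group. The labels and edges below are invented toy data, not the study's network.

```python
# Invented toy network: user -> party label, plus follower ties.
labels = {"a": "D", "b": "D", "c": "R", "d": "R", "e": "D"}
edges = [("a", "b"), ("a", "e"), ("b", "e"), ("c", "d"), ("a", "c")]

def homophily(group):
    """Share of a group's ties that connect two members of that group."""
    ties = [(u, v) for u, v in edges if group in (labels[u], labels[v])]
    same = [(u, v) for u, v in ties if labels[u] == labels[v] == group]
    return len(same) / len(ties)

print(homophily("D"))  # 3 of the 4 ties touching a D are D-D -> 0.75
print(homophily("R"))  # 1 of the 2 ties touching an R is R-R -> 0.5
```

Comparing this ratio across groups, and between reciprocated and nonreciprocated tie networks, is the kind of contrast the study reports.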
Article
Full-text available
Differences in the ways that men and women use language have long been of interest in the study of discourse. Despite extensive theorizing, actual empirical investigations have yet to converge on a coherent picture of gender differences in language. A significant reason is the lack of agreement over the best way to analyze language. In this research, gender differences in language use were examined using standardized categories to analyze a database of over 14,000 text files from 70 separate studies. Women used more words related to psychological and social processes. Men referred more to object properties and impersonal topics. Although these effects were largely consistent across different contexts, the pattern of variation suggests that gender differences are larger on tasks that place fewer constraints on language use.
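The standardized-category approach the abstract relies on can be sketched as counting what share of a text's tokens fall into predefined word categories. The two categories and their tiny word lists below are invented stand-ins for the study's full category scheme.

```python
# Invented mini word-category scheme (the real schemes contain
# dozens of categories with hundreds of words each).
CATEGORIES = {
    "social": {"friend", "family", "talk", "we"},
    "object": {"car", "engine", "tool", "machine"},
}

def category_rates(text):
    """Fraction of tokens belonging to each word category."""
    tokens = text.lower().split()
    return {cat: sum(t in words for t in tokens) / len(tokens)
            for cat, words in CATEGORIES.items()}

rates = category_rates("we talk about family and the car engine")
# 8 tokens: 3 social words and 2 object words.
```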
Article
Full-text available
This article discusses how the study of metaphoric and more generally, figurative language use contributes to critical discourse analysis (CDA). It shows how cognitive linguists’ recognition of metaphor as a fundamental means of concept- and argument-building can add to CDA's account of meaning constitution in the social context. It then discusses discrepancies between the early model of conceptual metaphor theory and empirical data and argues that discursive-pragmatic factors as well as sociolinguistic variation have to be taken into account in order to make cognitive analyses more empirically and socially relevant. In conclusion, we sketch a modified cognitive approach informed by Relevance Theory within CDA.
Article
Full-text available
The authorship profiling problem is of growing importance in the global information environment and can help police identify characteristics of the perpetrator of a crime when there are specific suspects to consider. The approach applies machine learning to text categorization, using a corpus of training documents, each labeled according to its category along a particular profiling dimension. The study outlines the kinds of text features that are most useful for authorship profiling. The two basic types are content-based features and style-based features, reflecting the fact that different populations tend to write about different topics as well as to express themselves differently about the same topic. The experimental setup covers four profiling problems: determining the author's gender, age, native language, and neuroticism level. The right combination of linguistic features and machine learning methods enables an automated system to effectively determine such aspects of an anonymous author.
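The two feature families the abstract distinguishes can be sketched as follows; the particular features and word lists below are illustrative choices, not the paper's exact feature set.

```python
# Invented function-word list; style features are built from such words
# because they are largely topic-independent.
FUNCTION_WORDS = {"the", "a", "of", "and", "to", "in"}

def profile_features(text):
    """Extract a tiny content + style feature vector from a text."""
    tokens = text.lower().split()
    n = len(tokens)
    return {
        # style-based: HOW the author writes
        "avg_word_len": sum(len(t) for t in tokens) / n,
        "function_word_rate": sum(t in FUNCTION_WORDS for t in tokens) / n,
        # content-based: WHAT the author writes about (one toy topic cue)
        "mentions_sports": any(t in {"game", "team", "score"} for t in tokens),
    }

feats = profile_features("the team won the game")
```

Vectors like this would then feed a standard classifier trained separately for each profiling dimension (gender, age, and so on).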
Article
Full-text available
While online, some people self-disclose or act out more frequently or intensely than they would in person. This article explores six factors that interact with each other in creating this online disinhibition effect: dissociative anonymity, invisibility, asynchronicity, solipsistic introjection, dissociative imagination, and minimization of authority. Personality variables also will influence the extent of this disinhibition. Rather than thinking of disinhibition as the revealing of an underlying "true self," we can conceptualize it as a shift to a constellation within self-structure, involving clusters of affect and cognition that differ from the in-person constellation.
Article
This article undertakes the first known qualitative study focusing on The Red Pill, an online forum wherein heterosexual men attempt to improve their seduction skills by discussing evolutionary psychology and economic theories. My content analysis of twenty-six documents (130,000 words) designated by the community as central to its purpose and ideology shows that The Red Pill is not just an expression of hegemonic masculinity but also explicitly integrates neoliberal and scientific discourses into its seduction strategies. I theorize that the resulting philosophy superficially resolves a contradiction between hegemonic masculinity’s prescriptive emotional walls and an inherent desire for connection by constructing women as exchangeable commodities.
Article
In understanding the influence of virtual communities on its members, examined in this chapter is the role of identity — the member’s conscious knowledge of belonging and the emotional and evaluative significance attached to the membership. Drawing from research and analyses across different disciplines, we present an integrative framework considering and elaborating on the motivational antecedents, constituents, and consequents of virtual community identity. We also discuss its implications for virtual community organizers and highlight promising research opportunities in this area.
Article
The very notion of community in an online context can begin a hot debate. Those who would keep the term 'community' for the imagined ideal of cooperation and joint sharing of land, resources, and goals ask: How can community exist without physical co-location and a geographic touchstone? How can the leanness of computer-mediated communication support the richness inherent in a community? This article revisits the debate about community and online community, and offers a means of conceptualizing and investigating online community using a social network perspective that frees it from its former geographical constraints. It begins with a look at the challenges to community that have fed into arguments against online community, and with a section on the discovery of community online. The article then addresses the case for a network view of community, starting with how the social network approach has been applied to offline communities and how this lays the groundwork for unbundling community from face-to-face interaction and geographic co-location. Following a brief section on the basics of social-network terminology, it returns to the main topic of community, with a focus on the network-level aspects of community, showing how patterns of interpersonal ties can build a network with outcomes greater than the sum of the pairwise connections. The final sections explore variants on the theme of community, first by revisiting the online and offline dichotomy and addressing the advantages, and indeed the inevitability, of considering community from both online and offline sides.
Article
This study investigates the communicative practices in English and German online discussion fora as exemplified by two thematically related sample threads. Combining first- and second-order approaches to (im-)politeness, the paper focuses on the question of how participants use intergroup rudeness as a means of in- and outgroup construction and examines how intergroup rudeness is metapragmatically negotiated as the discussions unfold. The results show that intergroup rudeness as well as metapragmatic comments are handled very differently in the two communities explored. Suggesting cultural preferences, there is a much higher degree of interactivity and a clear preference for negotiation at an interpersonal level in the German discussion group; its English counterpart favours negotiation at an intergroup level. Both threads provide metapragmatic evidence that the frequent use of rudeness tokens does not automatically make rudeness an accepted norm.
Conference Paper
In Chap. 9, we studied the extraction of structured data from Web pages. The Web also contains a huge amount of information in unstructured texts. Analyzing these texts is of great importance as well and perhaps even more important than extracting structured data because of the sheer volume of valuable information of almost any imaginable type contained in text. In this chapter, we only focus on mining opinions which indicate positive or negative sentiments. The task is technically challenging and practically very useful. For example, businesses always want to find public or consumer opinions on their products and services. Potential customers also want to know the opinions of existing users before they use a service or purchase a product.
Article
We report on a series of experiments with convolutional neural networks (CNN) trained on top of pre-trained word vectors for sentence-level classification tasks. We first show that a simple CNN with little hyperparameter tuning and static vectors achieves excellent results on multiple benchmarks. Learning task-specific vectors through fine-tuning offers further gains in performance. We additionally propose a simple modification to the architecture to allow for the use of both task-specific and static word vectors. The CNN models discussed herein improve upon the state-of-the-art on 4 out of 7 tasks, which include sentiment analysis and question classification.
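The core operation of the CNN described above can be sketched numerically: slide a filter over windows of concatenated word vectors, then take the maximum over time to get one feature per filter. The embeddings and filter weights below are made-up toy values; a real model learns many filters and stacks a softmax classifier on top.

```python
# Toy 2-dimensional "pre-trained" embeddings for a tiny vocabulary.
EMB = {"not": [1.0, 0.0], "bad": [0.0, 1.0], "movie": [0.5, 0.5]}

def conv_max_pool(tokens, filt, width=2):
    """Apply one filter over all `width`-word windows, then max-pool."""
    feats = []
    for i in range(len(tokens) - width + 1):
        # Concatenate the word vectors in this window...
        window = [x for t in tokens[i:i + width] for x in EMB[t]]
        # ...and take the dot product with the filter weights.
        feats.append(sum(w * x for w, x in zip(filt, window)))
    return max(feats)  # max-over-time pooling -> one feature per filter

# One filter covering a 2-word window (2 words x 2 dims = 4 weights).
print(conv_max_pool(["not", "bad", "movie"], [1.0, -1.0, 0.5, 0.5]))  # 1.5
```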
Article
In recent years, the mainstream media has identified on-line vitriol as a worsening problem which is silencing women in public discourse, and is having a deleterious effect on the civility of the public cybersphere. This article examines the disconnect between representations of “e-bile” in media texts, and representations of e-bile in academic literature. An exhaustive review of thirty years of academic work on “flaming” shows that many theorists have routinely trivialized the experiences of flame targets, while downplaying, defending, and/or celebrating the discourse circulated by flame producers. Much contemporary scholarship, meanwhile, ignores e-bile completely. My argument is that this constitutes a form of chauvinism (in that it disregards women's experiences in on-line environments) and represents a failure of both theoretical acuity and nerve (given that it evades such a pervasive aspect of contemporary culture). The aim of this paper is not only to help establish the importance of on-line vitriol as a topic for interdisciplinary scholarly research, but to assist in establishing a theoretical problematic where what is seen is barely regarded as a problem. Overall, my argument is that—far from being a technology-related moral panic—e-bile constitutes a field of inquiry with a pressing need for recalibrated scholarly intervention.
Chapter
The proliferation of online hate groups over the past few years has brought two main issues into focus. First, legal and political scholars have questioned the extent to which such hate speech should be regulated. Second, and perhaps more importantly, there is a great deal of concern about the effects of hate expressed online - specifically, if it incites violence and hostility between groups in the physical world. Understanding cyberhate therefore provides an important challenge for psychologists. Specifically, it is important to understand why online hate is so widespread and the content of online hate sites often so insulting and aggressive, given that the physical activities of hate groups are much more covert. This article attempts to provide a psychological perspective on the nature and purpose of online hate groups and their underlying motivations, their strategies, psychological theories and research that provide insight into disinhibited online behaviour, the actions being taken to combat cyberhate, and some challenges for future research.
Article
This article explores the signal characteristics of gendered vitriol on the Internet – a type of discourse marked by graphic threats of sexual violence, explicit ad hominem invective and unapologetic misogyny. Such ‘e-bile’ is proliferating in the cybersphere and is currently the subject of widespread international media coverage. Yet it receives little attention in scholarship. This is likely related to the fact that discourse of this type is metaphorically ‘unspeakable’, in that its hyperbolic profanity locates it well outside the norms of what is regarded as ‘civil’ discourse. My case, however, is that – despite the risk of causing offence – this discourse must not only be spoken of, but must be spoken of in its unexpurgated entirety. There is, I argue, no other way to adequately assay the nature of a communication mode whose misogynistic hostility has serious ethical and material implications, not least because it has become a lingua franca in many sectors of the cybersphere. Proceeding via unexpurgated ostension is also the best – arguably the only – way to begin mapping the blurry parameters of the discursive field of e-bile, and from there to conduct further inquiry into the ethical appraisal of putative online hostility, and the consideration of possible remedies.
Article
The use of stylometry, authorship recognition through purely linguistic means, has contributed to literary, historical, and criminal investigation breakthroughs. Existing stylometry research assumes that authors have not attempted to disguise their linguistic writing style. We challenge this basic assumption of existing stylometry methodologies and present a new area of research: adversarial stylometry. Adversaries have a devastating effect on the robustness of existing classification methods. Our work presents a framework for creating adversarial passages including obfuscation, where a subject attempts to hide her identity; imitation, where a subject attempts to frame another subject by imitating his writing style; and translation, where original passages are obfuscated with machine translation services. This research demonstrates that manual circumvention methods work very well while automated translation methods are not effective. The obfuscation method reduces the techniques' effectiveness to the level of random guessing and the imitation attempts succeed up to 67% of the time depending on the stylometry technique used. These results are more significant given the fact that experimental subjects were unfamiliar with stylometry, were not professional writers, and spent little time on the attacks. This article also contributes to the field by using human subjects to empirically validate the claim of high accuracy for four current techniques (without adversaries). We have also compiled and released two corpora of adversarial stylometry texts, with a total of 57 unique authors, to promote research in this field. We argue that this field is important to a multidisciplinary approach to privacy, security, and anonymity.
Article
Anonymity is thought to be an important means for ensuring a free exchange of ideas by encouraging the expression of minority viewpoints. However, we suggest that anonymity’s reduction in awareness of others potentially affects the expression and interpretation of comments that are made during a discussion. In particular, anonymity will increase the likelihood that comments will be made that are contrary to the majority opinion while at the same time decreasing the effect that those contrary arguments have on other group members’ opinions. This paper reports experimental results showing that anonymity led to more overall participation in discussions of ethical scenarios. However, equality of member participation did not differ between anonymous and member-identified groups, and anonymous groups had significantly higher awareness-related comments. This leads to the conclusion that additional participation in anonymous groups accommodates reduced awareness rather than reflecting the increased participation of normally reticent group members. In addition, anonymity led to more arguments in support of questionable behavior, suggesting that the freeing effects of anonymity apply to the social desirability of arguments. Finally, there was less change in opinion under conditions of anonymity than when comments were identified, suggesting that anonymous arguments have less influence on opinions than identified comments.
Article
Using a life course perspective, we explored the development and maintenance of involuntary celibacy for 82 respondents recruited over the Internet. Data were collected using an open-ended electronic questionnaire. Modified grounded theory analysis yielded three groups of involuntary celibates, persons desiring to have sex but unable to find partners. Virgins were those who had never had sex, singles had sex in the past but were unable to establish current sexual relationships, and partnereds were currently in sexless relationships. These groups differed on dating experiences, the circumstances surrounding their celibacy, barriers to sexual activity, and the perceived likelihood of becoming sexually active. They were similar, however, in their negative reactions to celibacy. Pervasive in our respondents’ accounts was the theme of becoming and remaining off time in making normative sexual transitions, which in turn perpetuated a celibate life course or trajectory.
Article
This study explores whether the attributes listed in the literature on flaming in email are considered characteristic of flaming by actual email users. Through the creation of a semantic differential scale—called the Message Invectives Scale—the study took eight concepts found in more than 20 research articles on flaming and examined email users' responses to a set of 20 messages in relation to those eight characteristics. Findings indicate that in each of the 20 cases, six of the original eight concepts relate to each other to form a common set, which also correlates positively with perceptions of flaming. Some of the messages that scored high for flaming contained profanity, all capital letters, excessive exclamation points or question marks, indicating that these attributes also relate to flaming. Based on these findings, recommendations are advanced as to how email should be used to avoid negative attributes that can lead to organizational conflict.
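The surface attributes the study links to perceived flaming (all-caps words, runs of exclamation or question marks) are easy to flag mechanically. The sketch below is illustrative only; the patterns and thresholds are invented, not the study's instrument.

```python
import re

def flaming_cues(message):
    """Count two surface cues associated with perceived flaming."""
    return {
        # words of 3+ consecutive capital letters, e.g. "WHY"
        "all_caps_words": len(re.findall(r"\b[A-Z]{3,}\b", message)),
        # runs of 2+ exclamation/question marks, e.g. "?!?!"
        "excess_punct": len(re.findall(r"[!?]{2,}", message)),
    }

cues = flaming_cues("WHY would you send THIS to everyone?!?!")
# -> two all-caps words and one punctuation run
```

Counts like these could feed a rule or a classifier, but they are only cues: the study's point is that perception of flaming correlates with a cluster of such attributes, not any single one.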
Article
We describe new algorithms for training tagging models, as an alternative to maximum-entropy models or conditional random fields (CRFs). The algorithms rely on Viterbi decoding of training examples, combined with simple additive updates. We describe theory justifying the algorithms through a modification of the proof of convergence of the perceptron algorithm for classification problems. We give experimental results on part-of-speech tagging and base noun phrase chunking, in both cases showing improvements over results for a maximum-entropy tagger.
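The additive update at the heart of the algorithm above can be sketched compactly: decode the training example with the current weights, then add 1 to the features of the gold tags and subtract 1 from the features of the predicted tags. For brevity this sketch uses greedy per-word decoding and word-identity features rather than full Viterbi decoding over tag sequences, and the tag set and data are toy inventions.

```python
from collections import defaultdict

TAGS = ["NOUN", "VERB"]
weights = defaultdict(float)  # (feature, tag) -> weight

def predict(word):
    """Greedy stand-in for Viterbi: pick the highest-scoring tag."""
    return max(TAGS, key=lambda t: weights[(word, t)])

def perceptron_update(words, gold_tags):
    """One pass of the additive perceptron update over a sentence."""
    for word, gold in zip(words, gold_tags):
        pred = predict(word)
        if pred != gold:  # update only on mistakes
            weights[(word, gold)] += 1.0
            weights[(word, pred)] -= 1.0

for _ in range(3):  # a few passes over one toy training example
    perceptron_update(["dogs", "bark"], ["NOUN", "VERB"])

print(predict("bark"))  # VERB
```

In the full algorithm, features score tag transitions as well as words, so the decoding step genuinely requires Viterbi; the update rule, however, is exactly this add/subtract scheme.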
Cybermobbing aus sprachwissenschaftlicher Perspektive
  • Marx
Marx, Konstanze. 2018. "Cybermobbing aus sprachwissenschaftlicher Perspektive." Sprachreport 1/18: 1-9. https://ids-pub.bsz-bw.de/frontdoor/index/index/docId/7209
Reluctant Virginity: The Relationship between Sexual Status and Self-esteem