Article

Successful trolling on Reddit: A comparison across subreddits in entertainment, health, politics, and religion


Abstract

Reddit can be perceived as a haven for trolls, but how much of the site actually contains trolling? How successful are these troll comments on subreddits, and does the type of subreddit affect their success? Based on an analysis of 16,516 comments from 208 posts across 13 subreddits in four categories (entertainment, health, politics, and religion), we found that trolling comments account for less than 1% of all comments, and most of these trolling comments were not successful, regardless of category. While one in five trolling comments was successful, only two in a thousand comments overall were successful trolling. We also found that politics subreddits attract more trolling than subreddits in the other categories, and health subreddits are less susceptible to successful trolling than other subreddits.
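
The three proportions above are mutually consistent; as a back-of-envelope check in Python (using the rounded figures stated in the abstract, not the authors' exact counts):

# Rough consistency check of the abstract's figures (rounded values assumed).
total_comments = 16516
trolling_share = 0.01                 # "less than 1%" of all comments, taken as an upper bound
trolling_comments = total_comments * trolling_share          # about 165 comments
success_rate = 1 / 5                  # "one in five trolling comments was successful"
successful_trolling = trolling_comments * success_rate       # about 33 comments
print(f"successful trolling per 1,000 comments: {successful_trolling / total_comments * 1000:.1f}")
# prints roughly 2.0, matching "two in a thousand comments"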


... Trolling has been defined as "Internet users who aim to disrupt online interactions" (Brubaker et al., 2021). Trolling comments are thought to constitute about 1% of Reddit posts (Fichman and Sharp, 2020) but are more common on subreddits that solicit high-affect posts, use sexual vulgarities, and include long posts (Tsantarliotis et al., 2017). The NoFap moderator described trolls on his forum as in the "millions," which may prevent the environment from providing support. ...
Article
"Reboot," especially NoFap, promotes abstinence from masturbation and/or pornography to treat "pornography addiction," an unrecognized diagnosis. While the intention of Reboot/NoFap is to decrease distress, qualitative studies have consistently suggested that "Reboots" paradoxically cause more distress. The distress appears to occur in response to (1) the abstinence goal, which recasts common sexual behaviors as personal "failures," and (2) problematic and inaccurate Reboot/NoFap forum messaging regarding sexuality and addiction. This preregistered survey asked men about their experience with perceived "relapse" and NoFap forums. Participants reported that their most recent relapse was followed by feeling shameful, worthless, sad, a desire to commit suicide, and other negative emotions. A novel predictor of identifying as a pornography addict in this lower religiosity sample was higher narcissism. Participants reported that NoFap forums contained posts that were misogynist (73.7% of participants), bullying (49.1%), anti-LGBT (42.9%), antisemitic (32.0%), instructing followers to harm or kill themselves (23.5%), or threats to hurt someone else (21.1%). More engagement in NoFap online forums was associated with worse symptoms of erectile dysfunction, depression, anxiety, and more sex negativity. Results support and expand previously documented harms and problems with Reboot/NoFap claims of treating pornography addiction from qualitative research.
Article
Lurking behaviour is very common in online communities, especially in large-scale informal online communities. Lurkers, who make up the majority of community members, are often overlooked by researchers. The aim of this investigation was to explore both lurkers' and posters' participation and sense of community in informal online education-related communities. An explanatory sequential mixed-methods design was applied. A total of 82 participants completed the survey and 9 of them were interviewed. Results of the investigation showed that although there was no significant difference between lurkers and posters in their sense of community, they reported different perceptions of community during the interviews. Lurkers tended to be more negative about the technical features supporting communities, while posters tended to be more negative about people who hindered community building through disrespectful or unkind contributions. In addition, while participation between those who post and those who lurk was different in the communities studied, interviews suggest that both lurkers and posters had different reasons for lurking and may change their participation behaviours depending on how they perceive the community within different online groups. Practical implications for online community managers and educators in terms of reconsidering design and management, and implications for future studies, are discussed.
Article
Objectives To investigate whether sentiment analysis and topic modelling can be used to monitor the sentiment and opinions of junior doctors. Design Retrospective observational study based on comments on a social media website. Setting Every publicly available comment in r/JuniorDoctorsUK on Reddit from 1 January 2018 to 31 December 2021. Participants 7707 Reddit users who commented in the r/JuniorDoctorsUK subreddit. Main outcome measure Sentiment (scored −1 to +1) of comments compared with results of surveys conducted by the General Medical Council. Results Average comment sentiment was positive but varied significantly during the study period. Fourteen topics of discussion were identified, each associated with a different pattern of sentiment. The topic with the highest proportion of negative comments was the role of a doctor (38%), and the topic with the most positive sentiment was hospital reviews (72%). Conclusion Some topics discussed in social media are comparable to those queried in traditional questionnaires, whereas other topics are distinctive and offer insight into what themes junior doctors care about. Events during the coronavirus pandemic may explain the sentiment trends in the junior doctor community. Natural language processing shows significant potential in generating insights into junior doctors’ opinions and sentiment.
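
The abstract does not name the sentiment tooling; a minimal sketch of comparable comment-level scoring on a -1 to +1 scale, using NLTK's VADER analyzer as a stand-in (an assumption, not necessarily the study's pipeline):

# Score free-text comments on a compound sentiment scale from -1 to +1.
# VADER is used here only for illustration; the study's actual tooling is not stated.
from nltk.sentiment import SentimentIntensityAnalyzer   # requires nltk.download("vader_lexicon")

comments = [
    "The consultant actually listened and fixed my rota request.",          # hypothetical example
    "Another thirteen-hour shift with no break. I am done with this job.",  # hypothetical example
]

analyzer = SentimentIntensityAnalyzer()
for text in comments:
    score = analyzer.polarity_scores(text)["compound"]  # compound score in [-1, +1]
    print(f"{score:+.2f}  {text}")
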
Article
As the prevalence of online misinformation grows increasingly apparent, our need to understand its spread becomes more essential. Trolling, in particular, may aggravate the spread of misinformation online. While many studies have investigated the negative impact of trolling and misinformation on social media, less attention has been devoted to the relationships between the two and their manifestation on social question and answer (SQA) sites. We examine the extent of and relationships between trolling and misinformation on SQA sites. Through content analysis of 8,401 posts (159 questions and 8,242 answers) from the Yahoo! Answers Politics & Government and Society & Culture categories, we identified levels of and relationships between misinformation and trolling. We find that trolling and misinformation tend to reinforce themselves and each other and that trolling and misinformation are more common in the Politics & Government category than in the Society & Culture category. Our study is among the first to consider the prevalence of and relationship between misinformation and trolling on SQA sites.
Chapter
Trolling and misinformation are ubiquitous on social media platforms, such as Yahoo! Answers. Yet, little is known about the impact of question type and topic on the extent of trolling and misinformation in answers on these platforms. We address this gap by analyzing 120 transactions with 2000 answers from two Yahoo! Answers categories: Politics & Government and Society & Culture. We found that trolling and misinformation are widespread on Yahoo! Answers. In most cases, trolling in questions was echoed by more trolling in answers, and misinformation in questions with more misinformation in answers. We also found that 1) more misinformation and more trolling were found in answers to conversational questions than to informational questions; 2) more misinformation occurred in answers to questions in politics than answers to questions in culture; and 3) trolling significantly differed between politics and culture.
Conference Paper
The phenomenon of trolling has emerged as a widespread form of abuse on news sites, online social networks, and other types of social media. In this paper, we study a particular type of trolling, performed by asking a provocative question on a community question-answering website. By combining user reports with subsequent moderator deletions, we identify a set of over 400,000 troll questions on Yahoo Answers, i.e., questions aimed to inflame, upset, and draw attention from others on the community. This set of troll questions spans a lengthy period of time and a diverse set of topical categories. Our analysis reveals unique characteristics of troll questions when compared to "regular" questions, with regards to their metadata, text, and askers. A classifier built upon these features reaches an accuracy of 85% over a balanced dataset. The answers' text and metadata, reflecting the community's response to the question, are found particularly productive for the classification task.
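
The paper's feature set (question text, metadata, asker history, and answer signals) is not reproduced here; a minimal sketch of a plain bag-of-words baseline with scikit-learn, trained on hypothetical toy examples rather than the study's data:

# Toy troll-question classifier: TF-IDF features plus logistic regression.
# The examples and labels below are invented placeholders, not Yahoo Answers data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

questions = [
    "Why are fans of that team such brainless sheep?",
    "Isn't everyone who disagrees with me just an idiot?",
    "Admit it, people who like this genre have no taste, right?",
    "Why do you all keep falling for such obvious lies?",
    "What is a good beginner recipe for sourdough bread?",
    "How do I renew a passport that expired last year?",
    "Which laptop suits a first-year computer science student?",
    "What are some tips for training a rescue dog?",
]
labels = [1, 1, 1, 1, 0, 0, 0, 0]  # 1 = troll-style question, 0 = regular question

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(questions, labels)
print(model.predict(["Why is everyone here too stupid to see the truth?"]))
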
Article
Whilst computer-mediated communication (CMC) can benefit users by providing quick and easy communication between those separated by time and space, it can also provide varying degrees of anonymity that may encourage a sense of impunity and freedom from being held accountable for inappropriate online behaviour. As such, CMC is a fertile ground for studying impoliteness, whether it occurs in response to perceived threat (flaming), or as an end in its own right (trolling). Currently, first and second-order definitions of terms such as im/politeness (Brown and Levinson 1987; Bousfield 2008; Culpeper 2008; Terkourafi 2008), incivility (Lakoff 2005), rudeness (Beebe 1995; Kienpointner 1997, 2008), and etiquette (Coulmas 1992), are subject to much discussion and debate, yet the CMC phenomenon of trolling is not adequately captured by any of these terms. Following Bousfield (in press), Culpeper (2010) and others, this paper suggests that a definition of trolling should be informed first and foremost by user discussions. Taking examples from a 172-million-word, asynchronous CMC corpus, four interrelated conditions of aggression, deception, disruption, and success are discussed. Finally, a working definition of trolling is presented.