“I am not a YouTuber who can make whatever video I want. I have
to keep appeasing algorithms”: Bureaucracy of Creator
Moderation on YouTube
Renkai Ma∗
The Pennsylvania State University
renkai@psu.edu
Yubo Kou
The Pennsylvania State University
yubokou@psu.edu
ABSTRACT
Recent HCI studies have recognized an analogy between bureau-
cracy and algorithmic systems; given platformization of content
creators, video sharing platforms like YouTube and TikTok practice
creator moderation, i.e., an assemblage of algorithms that manage
not only creators’ content but also their income, visibility, iden-
tities, and more. However, it has not been fully understood as to
how bureaucracy manifests in creator moderation. In this poster,
we present an interview study with 28 YouTubers (i.e., video con-
tent creators) to analyze the bureaucracy of creator moderation
from their moderation experiences. We found participants wres-
tled with bureaucracy as multiple obstructions in re-examining
moderation decisions, coercion to appease dierent algorithms in
creator moderation, and the platform’s indierence to participants’
labor. We discuss and contribute a conceptual understanding of how
algorithmic and organizational bureaucracy intertwine in creator
moderation, laying a solid ground for our future study.
CCS CONCEPTS
• Human-centered computing; • Collaborative and social computing; • Empirical studies in collaborative and social computing;
KEYWORDS
creator moderation, content moderation, algorithmic moderation,
YouTuber
ACM Reference Format:
Renkai Ma and Yubo Kou. 2022. “I am not a YouTuber who can make what-
ever video I want. I have to keep appeasing algorithms”: Bureaucracy of
Creator Moderation on YouTube. In Companion Computer Supported Co-
operative Work and Social Computing (CSCW’22 Companion), November
08–22, 2022, Virtual Event, Taiwan. ACM, New York, NY, USA, 6 pages.
https://doi.org/10.1145/3500868.3559445
1 INTRODUCTION
Recent HCI studies have shown bureaucratic traits of algorith-
mic systems. Rooted in organization science, the notion of bureau-
cracy refers to rule-governed procedures that are inflexible to adapt
∗This work is partially supported by NSF grant No. 2006854
Permission to make digital or hard copies of part or all of this work for personal or
classroom use is granted without fee provided that copies are not made or distributed
for profit or commercial advantage and that copies bear this notice and the full citation
on the first page. Copyrights for third-party components of this work must be honored.
For all other uses, contact the owner/author(s).
CSCW’22 Companion, November 08–22, 2022, Virtual Event, Taiwan
©2022 Copyright held by the owner/author(s).
ACM ISBN 978-1-4503-9190-0/22/11.
https://doi.org/10.1145/3500868.3559445
decision-making processes to novel cases [9]. HCI researchers have
demonstrated how bureaucracy is embedded in algorithms that
support or automate decision-making. For example, on Amazon Mechanical Turk, quality assessment algorithms would not assign new tasks to workers whose past work did not match the gold-standard solutions in the algorithms' training dataset [2, 17].
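The kind of gold-standard quality gate described in [2, 17] can be illustrated with a minimal sketch; the threshold, question identifiers, and data shapes below are hypothetical, not the actual Mechanical Turk mechanism.

    def passes_quality_gate(worker_answers, gold_answers, min_accuracy=0.9):
        """Hypothetical gold-standard check: a worker keeps receiving tasks
        only if their answers on seeded gold questions match the known
        solutions at least min_accuracy of the time."""
        matches = sum(1 for q, a in gold_answers.items() if worker_answers.get(q) == a)
        return matches / len(gold_answers) >= min_accuracy

    # A worker who misses one of five gold questions (80%) is cut off
    # under a 90% threshold, regardless of context.
    print(passes_quality_gate({"q1": "A", "q2": "B", "q3": "C", "q4": "D", "q5": "X"},
                              {"q1": "A", "q2": "B", "q3": "C", "q4": "D", "q5": "E"}))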
This analogy between algorithmic systems and bureaucracy also exists in creator moderation. Beyond the purpose of content moderation that regulates the appropriateness of creative content, creator moderation consists of multiple governance mechanisms managing content creators' visibility [3, 4, 8], identity [4], revenue [5, 7], labor [7], and more. Given the platformization and monetization of creative labor [10, 19, 27], video sharing platforms like YouTube and TikTok tend to practice creator moderation through an assemblage of various algorithms (e.g., monetization, content moderation, recommendation algorithms, and more) [23]. Creators may correspondingly experience moderation such as demonetization [7, 22] or shadowbans [3, 29]. However, as interest in the CSCW community grows in understanding creators' moderation experiences (e.g., [22, 23]), relatively little attention has been paid to unpacking or elaborating on the bureaucratic traits of creator moderation, so we ask: How do content creators navigate bureaucracy in creator moderation? In this poster, we refer to the algorithms that conduct creator moderation holistically as moderation algorithms when not specifying particular algorithms such as recommendation algorithms.
To answer the research question, we interviewed 28 YouTubers who had experienced creator moderation. Through an inductive qualitative analysis [21, 31], we identified three primary ways that YouTubers wrestled with bureaucracy: (1) multiple obstructions prevented our participants from appealing moderation decisions; (2) participants felt they were coerced to appease different types of algorithms; (3) participants thought their labor was undervalued by the platform.
We discuss how bureaucracy is embedded in creator moderation procedures on YouTube, which share similar traits with not only algorithmic bureaucracy but also the organizational bureaucracy of online community governance. Organizational bureaucracy refers to moderation decision-making that privileges some users over others. One such example appears on Wikipedia regarding content exclusion for editors' contributions. It entails role-based goals of maintaining an encyclopedia by assigning hierarchical roles to users such as site owners, sichter (i.e., established users), voluntary users, or unregistered ones [6, 25], where established members are exempt from review when contributing content [20]. We show that our participants first experienced algorithmic bureaucracy from moderation decisions and then organizational bureaucracy when appealing the decisions.
This exploratory study aims to contribute to the HCI and CSCW
Table 1: Participant profiles. Subscription # (fanbase) was collected on the date of the interviews. Status was identified by YouTubers themselves based on the time spent on creating videos. Career refers to how long YouTubers have consistently made videos for their primary channel. Category refers to content category, which is defined by YouTube. “N/A” means the participant chose not to disclose that information.
# Sub # Age Status Nationality Race Gender Career Category
P1 ∼25.8k 18 part-time US White Male 5 months Games
P2 ∼21.3k 23 full-time US White Male 5 years Games
P3 ∼6.6k 40 part-time England White Male 3 years Travel
P4 ∼52k 28 part-time US Black Female 6 years People
P5 ∼4.33k 19 part-time England White Male 5 years Technology
P6 ∼268k 29 full-time US White Male 9 years Animation
P7 ∼84.7K 29 full-time US White Male 3 years Games
P8 ∼177k 32 part-time US White Male 3.5 years History
P9 ∼365k 28 full-time Germany White Male 2 years Entertainment
P10 ∼23.1k 38 part-time Mexico Hispanic Female 2.5 years Education
P11 ∼292k 29 part-time Brazil White Female 12 years Entertainment
P12 ∼2.02k 21 part-time England White Male 2.5 years Education
P13 ∼124k 19 full-time US Hispanic Male 4 years Entertainment
P14 ∼88.6k 28 part-time Colombia Hispanic Male 2 years Education
P15 ∼12.6k 29 part-time Mexico Hispanic Male 6 years Education
P16 ∼35.5k 29 part-time Mexico Hispanic Female 4 years Technology
P17 ∼5.7k 21 part-time US N/A Male 8 years Entertainment
P18 ∼26.8k 29 part-time Mexico Hispanic Female 3 years Education
P19 ∼53.9k 32 part-time Mexico Hispanic Female 3 years Technology
P20 ∼8.8k 18 part-time US White Male 2 years Games
P21 ∼497k 25 full-time US N/A Male 2 years Entertainment
P22 ∼230k 22 part-time US White Male 7 years Animation
P23 ∼31.3k 48 part-time Colombia Latino Male 5 years Education
P24 ∼63.2k 31 part-time US White Male 5 years History
P25 ∼52k 48 part-time US White Male 3 years Film
P26 ∼5.51k 27 part-time US Asian Female 1 year Entertainment
P27 ∼60.6k 55 full-time US White Male 2 years Technology
P28 ∼21.4k 23 full-time Denmark Mixed Male 6 months Entertainment
research with a conceptual understanding of how algorithmic and
organizational bureaucracy intertwine in creator moderation.
2 METHODS: DATA COLLECTION AND
ANALYSIS
After approval from our institution's Institutional Review Board (IRB), we interviewed 28 YouTubers (see Table 1) to answer our research question. We used purposeful sampling [31] with the recruitment criteria that participants must be over 18 years old and have experienced creator moderation (e.g., “limited ads” [35], copyright claims [5, 18], or other types). We shared the recruitment information on Twitter, Facebook, and Reddit. From January to October 2021, we interviewed 28 YouTubers. Every participant was compensated with a $20 gift card; three proactively refused the compensation.
All interviews were conducted via Zoom with a semi-structured interview protocol involving two sections. After we received verbal consent from each participant, we asked (1) warm-up questions, such as demographics and YouTube usage (e.g., frequency of publishing videos), and then (2) questions about moderation experiences, such as what moderation decisions and explanations they received and how they were impacted, with probes (i.e., follow-up questions).
We used an inductive thematic analysis on our interview dataset [21]. This procedure unfolded in three steps. First, two coders separately ran ‘open coding,’ screening the data and assigning codes to excerpts that could answer our research question. Then, the coders conducted ‘axial coding’ by combining open codes into themes. Finally, the data analysis ended with ‘selective coding,’ where relevant themes were grouped into overarching themes.
3 FINDINGS
We found three aspects of bureaucracy in creator moderation: (1) multiple obstacles to appealing moderation decisions, (2) coercion to appease different algorithms, and (3) the platform's indifference to participants' individual labor.
3.1 Obstruction in Layers
Obstruction in layers means the multiple obstacles that our participants encountered when appealing moderation decisions. After the issuance of such decisions, YouTubers could request human reviewers' re-examination of them (e.g., contacting creator support [36], requesting a review [37]). This appeared to be a straightforward procedure, but our participants experienced it as indirect and layered. They observed a hierarchy in terms of who could actually reach human reviewers. Then, they were instructed to take lengthy and confusing steps to achieve their goals.
3.1.1 Hierarchical Communication. Hierarchical communication means that the communication channels between YouTubers and the creator moderation system have varying qualities and efficiencies. For instance, P24 said:
In terms of tangible support for it (video deletion),
there wasn’t really much. It was more about the lead-
ership of the Slack community (WeCreateEdu) [who]
basically reached out to their contacts on YouTube,
which [was] fairly high up in the content moderation
area, and then it was just a matter of a waiting game.
[P24]
WeCreateEdu is a group of YouTubers who create educational con-
tent on YouTube, and they had built an online community on Slack,
a business communication platform. In the above case, P24 ac-
knowledged a hierarchical feature of moderation where YouTube
maintained closed contacts with organizers of a specic YouTuber
community. Because existing appeal procedures did not help re-
examine P24’s video deletion punishment, he turned to leverage the
community’s contacts for better solutions. However, P24 himself
did not have such access.
Some participants had successfully contacted human reviewers through third-party platforms (e.g., Twitter), yet they witnessed disproportionate conversations compared with other YouTubers. For instance, P2 said:
(. . .) or a big YouTuber like [YouTuber A]; he's having
a bunch of issues on his channel, and it has over 10
million subscribers. So, he went on back and forth, and
you can probably look that (chatting history) up with
the team YouTube. But if you follow team YouTube
on Twitter, you can see all kinds of conversations
that people [are] yelling at them because they don’t
do anything. They’re really there for the elite people.
[P2]
@TeamYouTube is the YouTube team's official Twitter account offering help to YouTubers. P2 observed that it offered conversations of disparate lengths to a YouTuber with a larger fanbase (i.e., YouTuber A) versus small YouTubers. He thus assumed that fanbase size was critical to receiving better responses from YouTube to requests for solving moderation issues.
3.1.2 Inefficient Process. Inefficient process describes how participants experienced appeals as complex and lengthy, resulting in undue delays. After moderation algorithms issued decisions, some participants chose to appeal through YouTube's platform support (e.g., creator support [36]). However, this did not mean participants could effectively solve moderation issues as they wished. For example, P7, a YouTuber who experienced channel demonetization, said:
They (YouTube) want you to go through the forms
and the formulas. They had sent me one link, and it
said if you believe it was an error, you can reapply to
the YouTube partnership program in 30 days as long
as you’re not breaking any rules. But what rules am I
breaking in videos? There’s no information; it’s like a
wall. [P7]
Channel demonetization means a YouTuber's whole channel becomes ineligible to earn ad income from videos. In the above case, P7 needed to wait a certain period of time to appeal such moderation. However, without knowing what content rules he had violated, his appeal might never be initiated if he were deemed to violate content rules again. Thus, extending prior findings that users encountered the opacity of appeals on Reddit [16] or Instagram [11], we found that on YouTube, the opacity of moderation decision-making further became an obstacle to effectively initiating appeals.
We also found that time was an important factor in doing so. P3
shared his experiences of appealing ‘limited ads’: You can appeal
demonetization, and then it’s amazing [they] will take 27 or 28 days,
and then they reject you. Or you’re about to get to the 28th day, and
you win because they haven’t found [any]thing [problematic]. [P3]
“Limited ads” means a video is deemed not suitable for most advertisers, so its ad income is decreased or removed. YouTube's content rules state that human review of ‘limited ads’ takes up to 7 days [37]. However, P3 waited much longer than that for moderation systems to process his appeals. His negative tone further expressed a desire for better efficiency from the moderation system, since time matters for monetization. As P19, who experienced “limited ads,” elaborated, “We lost money because I need to wait for a response back from YouTube.”
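The cost of these waits can be made concrete with a small, purely illustrative calculation. It assumes the 7-day review window stated in YouTube's documentation [37], the roughly 28-day turnaround P3 reported, and a hypothetical steady daily ad income; none of the figures are platform data.

    # Illustrative only: revenue forgone while a "limited ads" decision is under appeal,
    # assuming a hypothetical steady daily ad income for the affected video.
    def revenue_lost_during_appeal(daily_ad_income_usd: float, days_waited: int,
                                   retained_share: float = 0.0) -> float:
        """Income forgone while a video sits in 'limited ads' awaiting review.
        retained_share models the fraction of normal ad income the video still
        earns while limited (0.0 = effectively no ad income during the wait)."""
        return daily_ad_income_usd * days_waited * (1.0 - retained_share)

    # Documented review window (up to 7 days [37]) vs. P3's reported ~28-day wait,
    # at an assumed $10/day for one video:
    print(revenue_lost_during_appeal(10.0, 7))   # 70.0
    print(revenue_lost_during_appeal(10.0, 28))  # 280.0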
3.2 Appeasing Algorithms under Coercion
Appeasing algorithms under coercion refers to situations where participants felt coerced into appeasing different types of algorithms. Not only did participants experience the content moderation that prior work has primarily investigated (e.g., [13–15, 33]), but they also felt forced to negotiate for their content's existence, income, visibility, and more. Some participants unconditionally followed whatever decisions algorithms issued, such as what P7 said: It (self-certification) will say congratulations, you're rating your videos accurately, and in future uploads, we will automatically monetize you depending on your own answers [P7]. Self-certification is an algorithmic function for a YouTuber to self-report whether a new video complies with different sections of the advertiser-friendly content guidelines before publishing it. P7's case showed he actively contributed his uncompensated labor to help the platform train self-certification algorithms/functions better. For another example, after a community guideline strike (i.e., a content warning), P13 described how he appeased monetization and recommendation algorithms:
I kind of do feel like I was targeted, especially after
the false strike. (. . .) It hurt my motivation very much
because it takes a while for my videos to get pushed
CSCW’22 Companion, November 08–22, 2022, Virtual Event, Taiwan Renkai Ma and Yubo Kou
(by recommendation algorithms), and I have to make
longer videos that are more watch time intensive be-
cause that’s what the algorithms like. [P13]
After the strike, P13 observed YouTube's recommendation algorithms did not actively promote his videos, so his ad income decreased sharply. He assumed that longer videos could be more recommendable to the algorithms because they are more profitable: more mid-roll ads can be placed on a video that is longer than eight minutes [38]. Thus, P13 created longer videos to appease recommendation and monetization algorithms for more ad income.
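The incentive P13 described can be illustrated with a minimal sketch. The eight-minute mid-roll threshold comes from YouTube's documentation [38]; the assumption of one pre-roll ad and roughly one mid-roll slot per threshold-length segment is a hypothetical simplification, not a platform specification.

    def estimated_ad_slots(duration_minutes: float, midroll_threshold: float = 8.0) -> int:
        """Count ad slots a creator might expect for a video.
        Assumptions (illustrative only): every monetized video carries one
        pre-roll ad; videos at or above the mid-roll threshold (8 minutes per
        YouTube's documentation [38]) may add roughly one mid-roll slot per
        threshold-length segment."""
        slots = 1  # pre-roll
        if duration_minutes >= midroll_threshold:
            slots += int(duration_minutes // midroll_threshold)
        return slots

    # A 7-minute video yields 1 slot; a 17-minute video yields 3, which is the
    # economic pull toward longer, "watch time intensive" videos P13 described.
    print(estimated_ad_slots(7), estimated_ad_slots(17))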
However, some participants chose not to appease YouTube and experienced negative consequences for their channel performance. For instance, P8, a YouTuber who created history content and experienced ‘limited ads,’ said:
They have rendered certain topics off limits. (. . .) I tend to be a little brazen, but that hurts my channel to grow and to be seen. (. . .) YouTube is fostering a climate in which talking about the Holocaust is impossible. That's a problem. (. . .) These things (videos) were mostly derived from a lecture that I actually gave to students, and yet I am heavily discouraged from doing it on YouTube. [P8]
Advertiser-friendly content guidelines allow violent content “in a news, educational, artistic, or documentary context” [39]. P8 felt the moderation he received was unreasonable and believed that there were latent boundaries on discussing specific topics beyond the existing content rules. This case showed how the algorithmic systems of creator moderation on YouTube failed to adapt decision-making processes to novel cases. P8's insistence on creating Holocaust history content that he considered completely acceptable on YouTube showed that he refused to appease YouTube. However, this decision negatively impacted his channel's monetization, visibility, and growth.
3.3 Indifference to Individual Labor
Indifference to individual labor refers to our participants' complaints that YouTube showed little care for the value of their individual labor. Extending prior work that discusses labor structures on YouTube [7, 27], we further showed how YouTubers subjectively sensed YouTube's indifferent attitude toward their labor. For example, P2 said:
It (suspension) actually deleted all my money or mon-
etization for April, which was over 700,000 views for
my other videos; it was like a couple of hundred dol-
lars. (. . .) Because my channel got deleted on May 1,
and I hadn’t processed it by the 12th, I never saw that
money. And they never told me what happened to
that money. [P2]
YouTube processes monetization (e.g., ad income) from the 7th to the 12th of each month for profits generated in the previous month. Due to the account suspension, P2 did not receive the ad income generated from videos whose monetization statuses were all normal. This inequality in profit distribution indicated platform indifference to P2's labor.
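A minimal sketch of the timing P2 described, assuming the payout window above (the 7th to the 12th of the following month) and treating a suspension before that window as forfeiting accrued earnings; the dates, cutoff logic, and amounts are illustrative, not YouTube's actual accounting.

    from datetime import date

    PAYOUT_WINDOW = (7, 12)  # day-of-month range when last month's earnings are processed

    def earnings_paid_out(earning_month_end: date, suspension_date: date) -> bool:
        """Return True if earnings for a month survive an account suspension.
        Assumption (illustrative): earnings accrued in month M are only paid if the
        channel is still active through the payout window (7th-12th) of month M+1;
        a suspension before that window forfeits them."""
        payout_start = date(
            earning_month_end.year + (earning_month_end.month // 12),
            earning_month_end.month % 12 + 1,
            PAYOUT_WINDOW[0],
        )
        return suspension_date >= payout_start

    # P2's situation: April earnings, channel deleted on May 1, i.e., before the
    # May 7-12 processing window, so the income is lost.
    print(earnings_paid_out(date(2021, 4, 30), date(2021, 5, 1)))  # False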
Furthermore, participants complained that demonetization revealed YouTube's lack of intent to help them monetize videos in common ways. For example, P25, a YouTuber who created horror movie compilation videos, said:
It says limited ads, but a video like mine is tamer than
an episode of the walking dead. The Walking Dead
is on AMC, that's a basic cable network; they find
many people to advertise on that show. So, I don't
believe that YouTube can't find advertisers on horror
film content or whatever [for me]. I think they just
don’t try. [P25]
P25's ad income was decreased because the moderation system deemed that his videos depicted violence and were not suitable for advertisers. P25 believed that his content could be friendly to certain categories of ads, so he complained about YouTube's limited endeavors to find matched advertisers for him. Although YouTube and AMC might have different standards in their content rules, P25's case showed his impression that YouTube cared little about helping him re-monetize.
4 DISCUSSION & FUTURE WORK
Extending the prior work on experiences of content moderation (e.g., [14, 22, 24, 30, 33]), we showed how YouTubers experienced the bureaucracy of creator moderation. That is, bureaucracy is embedded in moderation procedures, sharing similar traits with both algorithmic bureaucracy and the organizational bureaucracy of online community governance.
Similar to how prior work discusses algorithmic bureaucracy [2, 26], this study found that moderation algorithms failed to contextualize our participants' video content for moderation decision-making. YouTube largely automates moderation decision-making through algorithms (e.g., machine learning) [1, 12, 28]. It remains questionable, however, how moderation algorithms can ensure that qualitative content rules (e.g., those requiring judgment or discourse) are perfectly translated into identifying user content as unacceptable, especially given our participants' complaints about latent content rules.
Algorithmic bureaucracy on YouTube keeps expanding its scope and complexity, sometimes involving more uncompensated labor from YouTubers. An example is YouTube's “self-certification” function to improve algorithmic accuracy. Consequently, YouTubers like P7 willingly provided much uncompensated, voluntary work so that moderation algorithms, including monetization, recommendation, and more, could be trained better. This means, instead of making algorithms more flexible for YouTubers, the platform makes YouTubers adapt to the inflexible algorithms.
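One way to read the self-certification arrangement is as creators supplying labeled training data for free. The sketch below is a hypothetical illustration of that reading, not YouTube's implementation; the guideline section names, data structures, and label values are assumptions.

    from dataclasses import dataclass, field

    # Hypothetical advertiser-friendly guideline sections a creator self-rates against.
    GUIDELINE_SECTIONS = ["violence", "adult_content", "harmful_acts", "sensitive_events"]

    @dataclass
    class SelfCertification:
        video_id: str
        # Creator's own answers per section, e.g., "none" / "mild" / "graphic"
        answers: dict = field(default_factory=dict)

    def to_training_example(cert: SelfCertification, later_review_outcome: str):
        """Pair a creator's unpaid self-report with the platform's eventual review
        outcome, yielding a labeled example an ad-suitability classifier could be
        trained on (illustrative only)."""
        features = [cert.answers.get(s, "none") for s in GUIDELINE_SECTIONS]
        return {"features": features, "label": later_review_outcome}

    cert = SelfCertification("vid123", {"violence": "mild"})
    print(to_training_example(cert, "green_icon"))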
Organizational bureaucracy [20, 25] in creator moderation on YouTube lies partly in its invisible hierarchy. Prior work has discussed the hierarchical ordering of users in moderation decision-making for articles on Wikipedia [25]. Similarly, our findings showed the hierarchy of appeal processes. Some YouTubers can access certain human reviewers for privileged interpretational interactions and solutions to moderation issues. While hierarchical moderation decision-making on Wikipedia is somewhat visible to all community members, the hierarchy in creator moderation is largely invisible. Thus, YouTubers face a longer learning curve to figure out these complexities and intricacies.
Content policies serve to ratify such organizational bureaucracy. Researchers have criticized that existing content rules rarely consider the context of user content [32, 34], easily leading to algorithmic bureaucracy. We further showed that content rules acted as an obstacle: they provided excuses, such as the 30-day rule, to justify the perceived low efficiency of processing appeals, and they prevented YouTubers from returning to earn ad income from time-sensitive content for a prolonged period.
Thus, the bureaucracy of creator moderation is both algorithmic and organizational: YouTubers would first experience algorithmic bureaucracy upon receiving moderation decisions and then organizational bureaucracy when initiating appeals. Organizational bureaucracy works to exacerbate the repercussions of algorithmic bureaucracy. Facing lengthy, undue procedures, our participants could only wait for algorithmic decisions to be re-examined by human reviewers. With limited power in such an unbalanced labor relation, participants like P13 acquired a mindset of “appeasing algorithms” or sacrificed personal preferences and interests to meet demands that they believed to be unreasonable.
This poster's late-breaking findings, how algorithmic bureaucracy and organizational bureaucracy intersect in creator moderation, will inform our future studies. Building on these findings,
we plan to study the complex relationships between bureaucratic
moderation systems and user communities, how user agency plays
a role in such relationships, and how platform policies could be
designed to prevent bureaucracy and support users in constructive
ways.
REFERENCES
[1] Julia Alexander. 2019. YouTube moderation bots punish videos tagged as ‘gay’ or ‘lesbian,’ study finds. The Verge. Retrieved from https://www.theverge.com/2019/9/30/20887614/youtube-moderation-lgbtq-demonetization-terms-words-nerd-city-investigation
[2] Ali Alkhatib and Michael Bernstein. 2019. Street-level algorithms: A theory at the gaps between policy and decisions. In CHI Conference on Human Factors in Computing Systems Proceedings (CHI 2019), Association for Computing Machinery, New York, NY, USA, 1–13. DOI: https://doi.org/10.1145/3290605.3300760
[3] Carolina Are. 2021. The Shadowban Cycle: an autoethnography of pole dancing, nudity and censorship on Instagram. Fem. Media Stud. (2021). DOI: https://doi.org/10.1080/14680777.2021.1928259
[4] Sophie Bishop. 2018. Anxiety, panic and self-optimization: Inequalities and the YouTube algorithm. Converg. Int. J. Res. into New Media Technol. 24, 1 (2018), 69–84. DOI: https://doi.org/10.1177/1354856517736978
[5] Ragnhild Brøvig-Hanssen and Ellis Jones. 2021. Remix’s retreat? Content moderation, copyright law and mashup music. New Media Soc. (June 2021). DOI: https://doi.org/10.1177/14614448211026059
[6] Brian Butler, Elisabeth Joyce, and Jacqueline Pike. 2008. Don’t look now, but we’ve created a bureaucracy: The nature and roles of policies and rules in Wikipedia. Conf. Hum. Factors Comput. Syst. - Proc. (2008), 1101–1110. DOI: https://doi.org/10.1145/1357054.1357227
[7] Robyn Caplan and Tarleton Gillespie. 2020. Tiered Governance and Demonetization: The Shifting Terms of Labor and Compensation in the Platform Economy. Soc. Media + Soc. 6, 2 (2020). DOI: https://doi.org/10.1177/2056305120936636
[8] Kelley Cotter. 2021. “Shadowbanning is not a thing”: black box gaslighting and the power to independently know and credibly critique algorithms. Information, Commun. Soc. (2021). DOI: https://doi.org/10.1080/1369118X.2021.1994624
[9] Michel Crozier and Erhard Friedberg. 1964. The Bureaucratic Phenomenon. Routledge. DOI: https://doi.org/10.4324/9781315131092
[10] Brooke Erin Duffy. 2020. Algorithmic precarity in cultural work. Commun. Public 5, 3–4 (September 2020), 103–107. DOI: https://doi.org/10.1177/2057047320959855
[11] Jessica L. Feuston, Alex S. Taylor, and Anne Marie Piper. 2020. Conformity of Eating Disorders through Content Moderation. Proc. ACM Human-Computer Interact. 4, CSCW1 (May 2020). DOI: https://doi.org/10.1145/3392845
[12] Robert Gorwa, Reuben Binns, and Christian Katzenbach. 2020. Algorithmic content moderation: Technical and political challenges in the automation of platform governance. Big Data Soc. 7, 1 (January 2020). DOI: https://doi.org/10.1177/2053951719897945
[13] Oliver L. Haimson, Daniel Delmonaco, Peipei Nie, and Andrea Wegner. 2021. Disproportionate Removals and Differing Content Moderation Experiences for Conservative, Transgender, and Black Social Media Users: Marginalization and Moderation Gray Areas. Proc. ACM Human-Computer Interact. 5, CSCW2 (October 2021). DOI: https://doi.org/10.1145/3479610
[14] Shagun Jhaver, Darren Scott Appling, Eric Gilbert, and Amy Bruckman. 2019. “Did you suspect the post would be removed?”: Understanding user reactions to content removals on reddit. Proc. ACM Human-Computer Interact. 3, CSCW (November 2019), 1–33. DOI: https://doi.org/10.1145/3359294
[15] Shagun Jhaver, Amy Bruckman, and Eric Gilbert. 2019. Does transparency in moderation really matter?: User behavior after content removal explanations on reddit. Proc. ACM Human-Computer Interact. 3, CSCW (2019). DOI: https://doi.org/10.1145/3359252
[16] Prerna Juneja, Deepika Rama Subramanian, and Tanushree Mitra. 2020. Through the looking glass: Study of transparency in Reddit’s moderation practices. Proc. ACM Human-Computer Interact. 4, GROUP (January 2020), 1–35. DOI: https://doi.org/10.1145/3375197
[17] Sanjay Kairam and Jeffrey Heer. 2016. Parting Crowds: Characterizing divergent interpretations in crowdsourced annotation tasks. Proc. ACM Conf. Comput. Support. Coop. Work. CSCW (February 2016), 1637–1648. DOI: https://doi.org/10.1145/2818048.2820016
[18] D. Bondy Valdovinos Kaye and Joanne E. Gray. 2021. Copyright Gossip: Exploring Copyright Opinions, Theories, and Strategies on YouTube. Soc. Media + Soc. 7, 3 (August 2021). DOI: https://doi.org/10.1177/20563051211036940
[19] Susanne Kopf. 2020. “Rewarding Good Creators”: Corporate Social Media Discourse on Monetization Schemes for Content Creators. Soc. Media + Soc. 6, 4 (October 2020). DOI: https://doi.org/10.1177/2056305120969877
[20] Paul B. de Laat. 2012. Coercion or empowerment? Moderation of content in Wikipedia as ‘essentially contested’ bureaucratic rules. Ethics Inf. Technol. 14, 2 (February 2012), 123–135. DOI: https://doi.org/10.1007/s10676-012-9289-7
[21] Ralph LaRossa. 2005. Grounded Theory Methods and Qualitative Family Research. J. Marriage Fam. 67, 4 (November 2005), 837–857. DOI: https://doi.org/10.1111/j.1741-3737.2005.00179.x
[22] Renkai Ma and Yubo Kou. 2021. “How advertiser-friendly is my video?”: YouTuber’s Socioeconomic Interactions with Algorithmic Content Moderation. Proc. ACM Human-Computer Interact. 5, CSCW2 (2021), 1–26. DOI: https://doi.org/10.1145/3479573
[23] Renkai Ma and Yubo Kou. 2022. “I’m not sure what difference is between their content and mine, other than the person itself”: A Study of Fairness Perception of Content Moderation on YouTube. Proc. ACM Human-Computer Interact. 6, CSCW2 (2022), 28. DOI: https://doi.org/10.1145/3555150
[24] Sarah Myers West. 2018. Censored, suspended, shadowbanned: User interpretations of content moderation on social media platforms. New Media Soc. 20, 11 (2018), 4366–4383. DOI: https://doi.org/10.1177/1461444818773059
[25] Sabine Niederer and José van Dijck. 2010. Wisdom of the crowd or technicity of content? Wikipedia as a sociotechnical system. New Media Soc. 12, 8 (July 2010), 1368–1387. DOI: https://doi.org/10.1177/1461444810365297
[26] Juho Pääkkönen, Matti Nelimarkka, Jesse Haapoja, and Airi Lampinen. 2020. Bureaucracy as a Lens for Analyzing and Designing Algorithmic Systems. Conf. Hum. Factors Comput. Syst. - Proc. (April 2020). DOI: https://doi.org/10.1145/3313831.3376780
[27] Hector Postigo. 2016. The socio-technical architecture of digital labor: Converting play into YouTube money. New Media Soc. 18, 2 (2016), 332–349. DOI: https://doi.org/10.1177/1461444814541527
[28] Aja Romano. 2019. YouTubers claim the site systematically demonetizes LGBTQ content. Vox. Retrieved from https://www.vox.com/culture/2019/10/10/20893258/youtube-lgbtq-censorship-demonetization-nerd-city-algorithm-report
[29] Laura Savolainen. 2022. The shadow banning controversy: perceived governance and algorithmic folklore. Media, Cult. Soc. (March 2022). DOI: https://doi.org/10.1177/01634437221077174
[30] Nicolas P. Suzor, Sarah Myers West, Andrew Quodling, and Jillian York. 2019. What Do We Mean When We Talk About Transparency? Toward Meaningful Transparency in Commercial Content Moderation. Int. J. Commun. 13 (2019). Retrieved from https://ijoc.org/index.php/ijoc/article/view/9736
[31] Sarah J. Tracy. 2013. Qualitative Research Methods: Collecting Evidence, Crafting Analysis.
[32] Rebecca Tushnet. 2019. Content Moderation in an Age of Extremes. Case West. Reserv. J. Law, Technol. Internet 10 (2019). Retrieved from https://heinonline.org/HOL/Page?handle=hein.journals/caswestres10&id=83&div=5&collection=journals
[33] Kristen Vaccaro, Christian Sandvig, and Karrie Karahalios. 2020. “At the End of the Day Facebook Does What It Wants”: How Users Experience Contesting Algorithmic Content Moderation. In Proceedings of the ACM on Human-Computer Interaction, Association for Computing Machinery, 1–22. DOI: https://doi.org/10.1145/3415238
[34] Richard Ashby Wilson and Molly K. Land. 2021. Hate Speech on Social Media: Content Moderation in Context. Conn. Law Rev. (2021). Retrieved from https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3690616
[35] “Limited or no ads” explained. YouTube Help. Retrieved from https://support.google.com/youtube/answer/9269824?hl=en
[36] Get in touch with the YouTube Creator Support team. YouTube Help. Retrieved from https://support.google.com/youtube/answer/3545535?hl=en&co=GENIE.Platform%3DDesktop&oco=0#zippy=%2Cemail
[37] Request human review of videos marked “Not suitable for most advertisers.” YouTube Help. Retrieved from https://support.google.com/youtube/answer/7083671?hl=en#zippy=%2Chow-monetization-status-is-applied
[38] Manage mid-roll ad breaks in long videos. YouTube Help. Retrieved from https://support.google.com/youtube/answer/6175006?hl=en#zippy=%2Cfrequently-asked-questions
[39] Advertiser-friendly content guidelines. YouTube Help. Retrieved from https://support.google.com/youtube/answer/6162278?hl=en#Adult&zippy=%2Cguide-to-self-certification