March 2022
As public debate moves to private content platforms online, content moderation by the judiciary is no longer the most common, the most efficient, or the most legitimate way of separating acceptable from unacceptable content. A new model of self-regulation of personality rights is being implemented by these platforms, in which enforcement, and sometimes review decisions, are performed by code, while users act as reviewers as much as content producers and consumers. Platforms are free to design and implement decentralized moderation systems, but the rules set in their architecture and community guidelines must comply with procedural boundaries established by the efficacy of fundamental rights between private parties. Judicial review plays a role not in evaluating the merits of content posted or shared on these platforms, but in course-correcting the procedural rules that ensure self-regulation does not disproportionately restrict informational personality rights such as speech, honor, and privacy.