Guest post by Elisabeth Gschösser

Influencer Law Clinic series

12/3/2021 · 3 min read

A handful of Big Tech companies pull the strings in deciding what can and cannot be said in some of the most important modern forums of speech. Content moderation allows platforms to screen and monitor user-generated content based on platform-specific guidelines.[1] With the release of the Facebook Files, concern has grown about the opacity and arbitrariness with which Facebook exercises this power.

According to the Wall Street Journal, Facebook installed an internal system exempting 5.8 million users from having to follow the rules on its platform.[2] Enforcement actions applying to high-profile users deemed “newsworthy”, “influential”, “popular” or “PR risky” differ from those applying to ordinary users. Prominent figures like former US President Donald Trump and soccer star Neymar are covered by the system, known as “XCheck” or “cross check”, which blocks content moderators from removing posts that are not in line with the platform’s guidelines.[3]

In 2019, Neymar published on his Facebook and Instagram accounts screenshots of his WhatsApp chats with a woman who had accused him of rape. The screenshots included the woman’s name and nude pictures of her.[4] Facebook’s community guidelines would have required the post to be deleted, as it depicted non-consensual intimate imagery.[5] However, since Neymar was whitelisted by XCheck, the post was seen by 56 million users and shared over 6,000 times before it was removed.[6]

The Facebook Files illustrate the far-reaching consequences content moderation can have for both individuals and societies. Applying different free-speech rules to different people is troubling. This raises the pressing question of how platforms’ decisions can be scrutinized. To what extent can the existing body of human rights law offer a solution?

Part of the challenge in finding a regulatory response to social media platforms like Facebook stems from their dual role in the public and private spheres. While platforms take on a public role in adjudicating conflicts between users, they remain companies governed mainly by their commercial interests. As private actors, platforms are bound by human rights law only where these standards have been translated into national law.[7]

Nevertheless, there has been growing convergence around the idea that social media platforms should use human rights law as the basis for content moderation rules.[8] A human rights approach to content moderation comes with several benefits. Firstly, human rights are often argued to be universal, meaning that they apply everywhere and to everyone. As platforms operate beyond national borders, human rights could act as a single set of global rules.[9] Secondly, human rights provide companies with an effective defence against illegitimate state censorship. When a government demands that content be taken down, it is more convincing to argue that removal would breach human rights than to point to the company’s internal rules, since governments are obliged to uphold the human rights their citizens enjoy.[10] Thirdly, human rights as the basis of content moderation rules could increase legitimacy, as they offer a recognized framework for explaining content moderation decisions. Even though human rights do not dictate a specific substantive outcome, they provide a common baseline for argumentation. Furthermore, where substantive gaps remain, human rights impose procedural rules which enhance accountability and transparency.[11]

On the other hand, a human rights approach to content moderation raises numerous challenges. Human rights are highly contested and indeterminate, as the rules stem from various sources, and understandings and interpretations of human rights differ between states.[12] Additionally, platforms cannot easily be compared to state actors: while state courts can take several months to decide a case, platforms need to act quickly. Furthermore, platforms lack the legitimacy and competence to determine whether a particular restriction on freedom of expression is necessary or proportionate.[13]

In conclusion, it is debatable whether the existing body of human rights law offers a way to hold platforms accountable in such cases. However, the Facebook Files call for a regulatory response to content moderation on social media platforms like Facebook. Social media platforms should respect and uphold the rights of their users, regardless of their status as private profit-maximizing businesses.

[1] https://computationalsocialmedia.tech/index.php/2021/03/08/social-media-sanctions-the-new-procedural-justice/

[2] https://www.theguardian.com/technology/2021/sep/21/facebook-xcheck-system-oversight-board-review

[3] https://www.businessinsider.com/facebook-content-moderation-58-million-users-xcheck-2021-9?international=true&r=US&IR=T

[4] https://www.computerbild.de/artikel/cb-News-Internet-Facebook-Ausnahmeregeln-fuer-Promis-durch-XCheck-30772083.html

[5] https://transparency.fb.com/de-de/policies/community-standards/sexual-exploitation-adults/

[6] https://www.computerbild.de/artikel/cb-News-Internet-Facebook-Ausnahmeregeln-fuer-Promis-durch-XCheck-30772083.html

[7] MacKenzie F. Common, ‘Rule of Law and Human Rights Issues in Social Media Content Moderation’ (DPhil thesis, London School of Economics 2020) http://etheses.lse.ac.uk/4219/1/Common__Rule-law-human-rights.pdf accessed 23 October 2021

[8] OHCHR, ‘Guiding Principles on Business and Human Rights’ (2011) HR/PUB/11/04

[9] Evelyn Douek, ‘The Limits of International Law in Content Moderation’ (2021) 6 UC Irvine J Int’l Transnat’l & Comp L 37

[10] https://www.ohchr.org/EN/NewsEvents/Pages/Online-content-regulation.aspx

[11] Evelyn Douek, ‘The Limits of International Law in Content Moderation’ (2021) 6 UC Irvine J Int’l Transnat’l & Comp L 37

[12] Barrie Sander, ‘Freedom of expression in the age of online platforms: the promise and pitfalls of a human rights-based approach to content moderation’ (2020) 43 Fordham International Law Journal 939

[13] https://berkleycenter.georgetown.edu/responses/a-human-rights-based-approach-to-social-media-platforms