*Primary research and drafting by: Amanda Miller, J.D. Candidate (Class of 2022), UC Irvine School of Law
I. Introduction
In rendering decisions on content moderation actions, the Facebook Oversight Board must scrutinize whether restrictions on speech pass muster under international human rights law. This analysis involves an assessment of Facebook’s written, publicly available policies governing content moderation on the company’s platforms (its “Community Standards”).[1] Indeed, any Board decision affirming the removal of content or an account suspension must necessarily include a finding that the relevant Community Standards comport with, inter alia, the right to freedom of expression set forth in Article 19 of the International Covenant on Civil and Political Rights (the “ICCPR”). Yet, as this paper explains, the substance and presentation of these Standards raise important questions about their adherence to the legality requirement under Article 19.
In 2020, the Leibniz Institute for Media Research conducted a study of Facebook’s process for developing the Standards.[2] The study found that Facebook employs a defined, multi-step process in drafting and continuously modifying its Standards.[3] Although the company makes unilateral decisions regarding the Standards, it does consult with stakeholder groups based on “a non-systematic but representativity-oriented selection process.”[4] The study was inconclusive as to the extent to which economic interests directly influence the Standards, noting that “[o]f course, it may be argued that all decisions are influenced by the long-term goal of ensuring the continued attractiveness of Facebook as a social space.”[5] The overarching conclusion of the study was that “Facebook has been constructing a prima facie autonomous and private normative order for public communication that seeks to reconcile interests within that order and is conceived largely without reference to state law or international human rights standards.”[6] In essence, as one Facebook representative explained, “We are making rules up.”[7]
The Leibniz study demonstrates the complexity, and even awkwardness, of applying international human rights law, a body of law originally intended to apply only to State actors, to private companies. Social media platforms create policies that shape users’ freedom of expression and may negatively affect the human rights of billions of users (as well as non-users[8]). Despite Facebook’s arguably State-like functions, “its legal status as a private service provider affords it the freedom to design, conduct, and govern this public sphere on the basis of commercial priorities rather than public interest.”[9]
So how can ICCPR Article 19’s legality requirement be applied to a company’s autonomous set of rules? As scholar Michael Lwin has pointed out, “Unlike countries, social media companies cannot pass laws because they are not states imbued with a legislative function.”[10] And once the requirement is applied, how do Facebook’s Standards measure up to the criteria for legality under Article 19? This paper explores these novel questions raised by the Board’s human rights review of Facebook’s content restrictions.
II. Applicable Human Rights Standards
Although States are indeed the primary subjects of international human rights law, the UN Guiding Principles on Business and Human Rights (the “UNGPs”) establish that companies also have human rights responsibilities and provide a framework applicable to all types of business enterprises, including social media companies.[11] The UNGPs make clear that companies should respect the full range of human rights, including those recognized by the ICCPR.[12] Facebook has expressly agreed to adhere to the UNGPs, even though they are not binding law.[13] As the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression (the “Special Rapporteur”) has observed, the application of the UNGPs to social media companies is particularly compelling given that they “have become central platforms for discussion and debate, information access, commerce and human development.”[14]
Article 19 of the ICCPR sets forth the right to freedom of expression, which includes the freedom to seek, receive, and impart information and ideas of all kinds, regardless of frontiers and through any media.[15] Article 19(3) permits restrictions on this right only if the restriction is “provided by law,” pursues a legitimate aim, and is “necessary” and “proportionate” for achieving that aim.[16] If a restriction does not satisfy each of these requirements, then the restriction is unlawful under Article 19.
The “provided by law” prong, referred to as the legality requirement, seeks to ensure that restrictions on expression are established by formal law, such as legislation enacted by a legislature or rules of contempt of court.[17] This requirement affirms democratic values and “allows minority groups to express their disagreement, propose different initiatives, participate in the shaping of the political will, or influence public opinion.”[18] Arguably, State law-making processes may be analogized to the policy development processes within social media companies.
To satisfy the legality requirement, a restriction must meet several criteria. First, the restriction must not confer excessive discretion on those who enforce it; rather, it should “provide sufficient guidance to those charged with [its] execution to enable them to ascertain what sorts of expressions are properly restricted and what sorts are not.”[19] Thus, it cannot be overly broad. Second, the restriction must provide adequate notice to those whose speech is being restricted. Accordingly, the law must be clear, precise, and publicly accessible in order to provide individuals with adequate guidance.[20] Third, the restriction must be compatible with human rights standards; for example, it cannot be discriminatory or arbitrary.[21] Fourth, the restriction should be subject to procedural safeguards and independent review, particularly by courts or tribunals.[22]
III. The Legality of Facebook’s Community Standards
Social media companies do not make “laws” per se, and they do not create and modify their content moderation policies pursuant to democratic processes, as the legality requirement envisions. However, these policies may be assessed pursuant to the contours of the legality requirement in order to align the companies with their human rights responsibilities under the UNGPs.[23] Indeed, the Special Rapporteur has asserted that “[c]ompanies should incorporate directly into their terms of service and ‘community standards’ relevant principles of human rights law that ensure content-related actions will be guided by the same standards of legality . . . that bind State regulation of expression.”[24] And Board member Evelyn Aswad has posited (in her independent scholarship) that “the legality . . . prong[] of ICCPR Article 19(3)’s tripartite test can be adapted to the corporate context.”[25] Yet Facebook’s Community Standards may not satisfy the criteria of the legality requirement, once applied.[26]
First, the Standards may confer excessive discretion on Facebook and its content moderators. As the Leibniz study found, Facebook enjoys autonomy in drafting and modifying its Standards, although it does consult with selected stakeholders.[27] Further, the non-governmental organization ARTICLE 19 has found that the Standards “remain very broad in scope, leaving significant discretion to Facebook in their implementation.”[28] Restrictions that are overly broad confer excessive discretion on content moderators and thus do not satisfy the first legality criterion. As Lwin has observed, “Facebook currently has ‘unbounded discretion’ in coming up with and implementing the Community Standards.”[29]
Second, Facebook’s Standards may not provide its users with adequate notice of restrictions on their expression. As scholar Evelyn Douek has observed, users are often in the dark about what they might be doing wrong “[w]hether because Facebook’s policies are not clear or lack detail or are scattered around different websites, or because users are not given an adequate explanation for which rule has been applied in their specific case.”[30] Indeed, several of the Board’s case decisions have found the Standards to be unclear, vague and even difficult to locate. For example, in its January 2021 decision regarding COVID-19 misinformation in France, the Board found that the relevant policies were scattered across Facebook’s website and inappropriately vague, and therefore failed to provide adequate notice.[31] In February 2021, Facebook responded to the Board’s findings by stating it had already taken action by consolidating its policies into a Help Center blog post.[32] However, according to at least one commentator, this step “does not solve the underlying issue of consolidating and clarifying existing rules in one place: Facebook’s policies on health misinformation stretch across blog posts, different sections within the Community Standards, and now in its Help Center.”[33] The Standards’ lack of clarity is exacerbated by the fact that they change on a continuous basis.
Third, Facebook’s policies may not be fully compatible with human rights standards. The Leibniz study found that the Standards were not consistent with any particular legal system and were “conceived largely without reference to state law or international human rights standards.”[34] This finding aligns with human rights analyses of the Standards conducted by ARTICLE 19 and other organizations.[35] Notably, in the aforementioned case, the Board recommended that Facebook “conduct a human rights impact assessment with relevant stakeholders as a part of its rule modification process.”[36]
Fourth, Facebook’s Standards are subject to some procedural safeguards to protect rights; the Oversight Board itself partially fulfills this role. Although the Board has criticized the Standards in many of its case decisions, it has yet to confront directly the issue of how they can be brought into full compliance with the legality requirement.[37]
As a general matter, Facebook’s Standards may not satisfy the legality requirement. Lwin concluded as much in 2020,[38] and, more recently, the Board has highlighted this issue across its decisions. Faiza Patel and Laura Hecht-Felella have summarized the Board’s decisions as saying, in effect: “Facebook’s content-moderation rules and its enforcement of them are a mess and the company needs to clean up its act.”[39]
IV. Conclusion
Whether Facebook’s Community Standards satisfy the criteria of the legality requirement is fundamental to the Board’s decision-making: if they do not, then Facebook may not restrict content consistently with human rights standards. The Leibniz study reveals the company’s autonomous process for developing its content moderation policies, a process that underlies the deficiencies in the Standards identified above. It may be that unless and until this underlying process is made publicly accountable (and, thus, more democratic) with reference to human rights law, Facebook will not be able to bring its Standards into compliance with Article 19.
[1] See Facebook, Community Standards (modified regularly). References to “Facebook” include the company’s Facebook and Instagram platforms. References to “Community Standards” apply to the content moderation policies of both platforms.
[2] Matthias C. Kettemann & Wolfgang Schulz, Leibniz Institute for Media Research, Setting Rules for 2.7 Billion: A (First) Look into Facebook’s Norm-Making System: Results of a Pilot Study, at 15, 28 (2020) [hereinafter “Leibniz Study”].
[3] Id. at 21.
[4] Id. at 28.
[5] Id. at 31.
[6] Id. at 30.
[7] Id. at 28.
[8] For a discussion of the impact of Facebook’s content moderation on non-users, please see Alice Doyle, Out of “Site,” Out of Mind: The Facebook Oversight Board’s Exclusion of Non-User Rightsholders, UCI International Justice Clinic (Mar. 5, 2021).
[9] R.F. Jørgensen & L. Zuleta, Private Governance of Freedom of Expression on Social Media Platforms: EU content regulation through the lens of human rights standards, 41:1 Nordicom Review 51, 62 (2020).
[10] Michael Lwin, Applying International Human Rights Law for Use by Facebook, 4 Yale Journal on Regulation Online Bulletin 53, 68 (2020).
[11] United Nations Guiding Principles on Business and Human Rights, A/HRC/17/31 (Mar. 21, 2011).
[12] Id. Principles 11 and 12.
[13] Facebook, Corporate Human Rights Policy (2021) (“We are committed to respecting human rights as set out in the United Nations Guiding Principles on Business and Human Rights (UNGPs).”).
[14] See Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, A/HRC/38/35 (Apr. 6, 2018) ¶ 9 [hereinafter “UNSR 2018 Report”]. “While the Guiding Principles are non-binding, the companies’ overwhelming role in public life globally argues strongly for their adoption and implementation.” Id. ¶ 10.
[15] International Covenant on Civil and Political Rights, art. 19(2).
[16] Id. art. 19(3); UN Human Rights Committee, General Comment No. 34 (Sept. 12, 2011) ¶ 22 [hereinafter “GC 34”].
[17] Id. ¶ 24 (citing Gauthier v. Canada and Dissanayake v. Sri Lanka).
[18] Inter-American Court of Human Rights, Advisory Opinion OC-6/86 (May 9, 1986) ¶ 22.
[19] GC 34 ¶ 25.
[20] Id.; see also Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, A/HRC/14/23 (Apr. 20, 2010) ¶ 79(d) (“Laws imposing restrictions or limitations must be accessible, concrete, clear and unambiguous, such that they can be understood by everyone and applied to everyone.”); Inter-American Commission on Human Rights, Inter-American Legal Framework regarding the Right to Freedom of Expression, OEA/Ser.L/V/II CIDH/RELE/INF. 2/09 (Dec. 30, 2009) ¶ 71 (“vague, broad or open-ended laws, by their mere existence, discourage the dissemination of information and opinions out of fear of punishment”).
[21] See GC 34 ¶ 26; A/HRC/14/23, supra note 20 ¶ 79(f).
[22] Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, A/74/486 (Oct. 9, 2019) ¶ 6(a).
[23] Communication to Facebook, Inc. from the Mandate of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, OL OTH 24/2019 (May 1, 2019), at 3.
[24] UNSR 2018 Report ¶ 45.
[25] Evelyn Mary Aswad, The Future of Freedom of Expression Online, 17 Duke Law & Technology Review 26, 56 (2018).
[26] Although this paper is focused on Facebook’s Community Standards, the legality requirement is also applicable to the algorithms Facebook uses to determine the virality of content.
[27] Leibniz Study, at 28, 30.
[28] ARTICLE 19, Facebook Community Standards: Analysis Against International Standards on Freedom of Expression (July 30, 2018).
[29] Lwin, supra note 10, at 69.
[30] Evelyn Douek, The Facebook Oversight Board’s First Decisions: Ambitious, and Perhaps Impractical, Lawfare (Jan. 28, 2021). “Company rules routinely lack the clarity and specificity that would enable users to predict with reasonable certainty what content places them on the wrong side of the line.” UNSR 2018 Report ¶ 46.
[31] Oversight Board, Case Decision 2020-006-FB-FBR (Jan. 28, 2021) (“Given this patchwork of rules and policies that appear on different parts of Facebook’s website, the lack of definition of key terms such as ‘misinformation,’ and the differing standards relating to whether the post ‘could contribute’ or actually contributes to imminent harm, it is difficult for users to understand what content is prohibited. The Board finds the rule applied in this case was inappropriately vague. The legality test is therefore not met.”).
[32] Facebook’s Detailed Response to Oversight Board (Feb. 2021).
[33] Carly Miller, Facebook, It’s Time to Put the Rules in One Place, Lawfare (Mar. 5, 2021).
[34] Leibniz Study, at 32.
[35] See, e.g., ARTICLE 19, supra note 28.
[36] Oversight Board, supra note 31.
[37] This issue could be the subject of a Request for Policy Guidance, under Article 2 of the Board’s Bylaws.
[38] Lwin, supra note 10, at 69.
[39] Faiza Patel & Laura Hecht-Felella, Oversight Board’s First Rulings Show Facebook’s Rules Are a Mess, Just Security (Feb. 19, 2021).