Lost in Translation: How the Facebook Oversight Board’s Limited Language Capabilities Undermine Human Rights*

*Primary research and drafting by: Alice Doyle, J.D. Candidate (Class of 2022), UC Irvine School of Law

I.         Introduction

The Facebook Oversight Board purports to be accessible to and capable of meaningfully evaluating cases referred by users across the globe. Yet the Board’s limited language capabilities, even compared to those of the Facebook platform, undermine equitable access to the Board and obstruct its ability to gauge the full context of content under review. With only 20 members (eventually 40), the Board cannot feasibly cover every language spoken by Facebook’s 2.7 billion users.[1] Pursuant to applicable human rights standards, however, the Board should expand its language capabilities and further involve experts and stakeholders to provide context for the content at issue.

II.        Factual Background

Facebook is a global platform, with users located all around the world.[2] The Facebook platform supports 111 languages,[3] and another 31 languages are widely spoken on the platform but not officially supported.[4] Some of the 111 languages supported by the platform are quite rare; for instance, in 2016 Facebook added Corsican, a language used by roughly 200,000 Mediterranean islanders and listed on UNESCO’s Atlas of the World’s Languages in Danger.[5]

As of April 2019, Facebook’s community standards are available in only 41 of these languages, leaving countless users unable to access the rules of the platform.[6] Additionally, the standards are more effectively implemented in some languages than others. At Facebook, the content moderation workforce as a whole speaks only about 50 languages, and Facebook’s artificial intelligence tools are able to moderate an even smaller set of languages, with tools that can identify hate speech in about 30 languages and “terrorist propaganda” in 19 languages.[7] While Facebook flags around 70% of misinformation in English with warning labels, only around 30% of analogous misinformation in Spanish is flagged.[8] These trends are most easily seen on the Facebook platform, but they hold true for Instagram as well, over which the Oversight Board also has jurisdiction: of the 51 languages supported by Instagram, the platform’s community guidelines are available in only 30.[9]

The Board’s Bylaws state that English will be its “working language.”[10] The Board’s website, including its appeals submission portal, is available in 28 languages (following the addition of 10 languages in February 2021).[11] Limiting the availability of the Board’s website to those 28 languages means that its governing documents, important information about the appeals process, and the actual portal through which users can submit appeals are accessible to only a limited subset of Facebook’s users. Users can submit appeals to the Board in any language, and the Board will translate their submissions into English for review.[12]

According to the Board’s governing documents, panels are not required to consult external sources regarding contextual information—it is at the panel’s discretion whether to seek out these sources for each case. The Board’s Bylaws state that before deliberation panels may request and receive information from a “global pool of outside subject-matter experts, including academics, linguists, and researchers” to learn more about specific issues within each case.[13] Panels may also request “issue briefs from advocacy or public interest organizations that reflect a range of perspectives.”[14] The Board’s Rulebook states that the administration will provide “research on case context (e.g., cultural, linguistic, political)” which “may be sought through external partners” when the panel requests it.[15] Panels may also open a public comment period for experts, civil society, and the general public to submit briefs for the case at hand.[16]

III.      Applicable International Human Rights Standards

The UN Guiding Principles on Business and Human Rights (UNGPs) encourage businesses to carry out human rights due diligence “[i]n order to identify, prevent, mitigate and account for how they address their adverse human rights impacts,” including those caused by content moderation decisions.[17] As part of this due diligence, both Facebook and the Oversight Board “should seek to understand the concerns of potentially affected stakeholders by consulting them directly in a manner that takes into account languages and other potential barriers to effective engagement.”[18]

Principle 31 of the UNGPs identifies eight “effectiveness criteria” for grievance mechanisms like the Board.[19] A few of these criteria are particularly relevant to the Bylaws’ treatment of language and translation. The accessibility criterion calls upon the Board to “provid[e] adequate assistance for those who may face particular barriers to access,” including lack of awareness of the mechanism and language barriers.[20] The equitability criterion seeks to ensure that users can “engage in a grievance process on fair, informed and respectful terms,” taking into account the often-present imbalance in access to information and resources.[21] And the rights-compatibility criterion requires that all of the Board’s decisions “accord with internationally recognized human rights,” including those set forth in the International Covenant on Civil and Political Rights (ICCPR) and the International Covenant on Economic, Social and Cultural Rights (ICESCR).[22]

Among those human rights implicated by UNGP 31(f) and contained in the ICCPR and ICESCR are the freedom of expression, the right to non-discrimination, the right to culture, and the rights of minorities. The right to freedom of expression “protects all forms of expression and the means of their dissemination” and must necessarily include the right to express oneself online in the language of one’s choice.[23] The right to non-discrimination protects against discrimination based on “race, colour, sex, language, religion, political or other opinion, national or social origin, property, birth or other status.”[24] The right to culture covers all aspects of cultural life, including “express[ing] oneself in the language of one’s choice.”[25] And the rights of minorities ensure that minority groups are free “to enjoy their own culture, profess their own religion, or use their own language.”[26]

As a related matter, the Rabat Plan of Action lays out a six-factor test for evaluating whether expression constitutes incitement. This test emphasizes the importance of the context of the expression, along with the speaker’s intent, the status of the speaker, the content and form of the expression, its reach, and the likelihood or imminence of harm posed by the expression.[27]

IV.       Language as a Barrier to Equitable Access

The language in which a user chooses to engage online can be significantly tied to identity. Demonstrating this point, the move to translate Facebook into Corsican was driven by native speakers to whom the endangered language was deeply important and who desired to bring it into the future.[28] Indeed, languages have cultural significance, including through connection to religious observance and practice.[29] To respect the right to non-discrimination and freedom to express oneself in one’s own language, Facebook should ensure that any user, regardless of language, has access to the Oversight Board’s review.[30]

Publishing the Board’s website in only 28 languages unreasonably bars access to those who speak another of the 111 languages supported by Facebook’s platform and is contrary to the accessibility criterion of UNGP 31. Although the Bylaws do allow case referrals in any language,[31] users must have some level of comfort with one of the Board’s 28 official languages in order to navigate the review process. To comply with the equitability criterion of UNGP 31, the Board should ensure that all users can meaningfully and fairly engage with the review process.[32] Accordingly, the Board should expand its language capabilities to include at least all languages supported by Facebook’s platforms[33] and provide language assistance for vulnerable and marginalized users.[34]

V.        Importance of Linguistic and Cultural Context

Content moderation at Facebook happens on a massive scale, posing challenges to meaningful examination of cultural context.[35] Artificial intelligence tools used to flag content for review have difficulty assessing the nuanced meaning of content due to the variations of natural language.[36] And Facebook’s human content moderators are exposed to a “never-ending flow” of flagged content and directed “to decide on the basis of guidance and not to get hung up on context.”[37] However, content moderation must involve more than an assessment of whether a post contains certain language.[38]

At the Oversight Board level, essential context risks being lost when posts (and user submissions) are translated into English for the Board’s review. The Rabat Plan of Action lists context as one of six factors in a test for assessing the severity of incitement to hatred.[39] Assessment of social and political context must be informed by the communities impacted by the content: “people who can understand the ‘code’ that language sometimes deploys to hide incitement to violence, evaluate the speaker’s intent, consider the nature of the speaker and audience and evaluate the environment in which hate speech can lead to violent acts.”[40]

One of the Board’s recent decisions, concerning alleged hate speech in Burmese, illustrates the importance of linguistic and cultural context in evaluating expression.[41] The post at issue included a statement that Facebook translated as “[there is] something wrong with Muslims psychologically.”[42] The user explained that the post was a sarcastic comment regarding extremist religious responses, pointing out that Facebook’s content moderation process was “not able to differentiate between sarcasm and serious discussion in the Burmese language and its context.”[43] The Board’s internal translators found the correct translation to be “[t]hose male Muslims have something wrong in their mindset.”[44] Consistent with the Rabat Plan—yet not required by the Board’s governing documents—the Board sought contextual understanding from an independent research institute and country experts, who supported the user’s assertion and understood the post as a sarcastic quip.[45] Accordingly, the Board found that while the post might be construed as “pejorative or offensive towards Muslims, it did not advocate hatred or intentionally incite any form of imminent harm.”[46] However, some experts posit that if the Board “had taken a wider view of context, it could well have reached the opposite conclusion” given that “Facebook’s failure to control anti-Muslim hate speech in Myanmar has been linked to the genocide of Rohingya Muslims in the country, violence that continues to this day.”[47] This case underscores the importance of seeking out civil society input regarding context.

VI.       Conclusion

Pursuant to applicable international human rights standards, the Oversight Board is obligated to expand both its language capabilities and its engagement with civil society. Doing so will help remove barriers to access and provide socio-linguistic context for the content at issue. The Board has taken steps in the right direction, including its recent addition of 10 new website languages and its proactive engagement with civil society regarding context in deciding its initial cases. Yet more remains to be done for the Board to fulfill its promise of accessibility and meaningful review. Options for improvement include expanding the Board’s language coverage to include at least the languages supported by Facebook’s platforms and amending the Board’s governing documents to require (not merely permit) consultation with civil society in assessing the context of expression.


[1] Centro de Estudios en Libertad de Expresión y Acceso a la Información (CELE), Considering Facebook Oversight Board: Turning On Expectations 5 (May 2019).

[2] Siva Vaidhyanathan, Facebook and the Folly of Self-Regulation, Wired (May 9, 2020). In terms of geographic spread, 9% of users are in the US and Canada, 15% in Europe, 43% in Asia, and 33% in “the rest of [the] world.” Mansoor Iqbal, Facebook Revenue and Usage Statistics, Business of Apps (May 24, 2021).

[3] Select Language, Facebook.

[4] Maggie Fick & Paresh Dave, Facebook’s Flood of Languages Leave It Struggling to Monitor Content, Reuters (Apr. 23, 2019).

[5] Jessica Guynn, Does Facebook Speak Your Language?, USA Today (Sept. 30, 2016); Atlas of the World’s Languages in Danger, UNESCO.

[6] Fick & Dave, supra note 4.

[7] Id.; see also Billy Perrigo, Facebook Says It’s Removing More Hate Speech Than Ever Before. But There’s a Catch, Time (Nov. 27, 2019).

[8] Kari Paul, ‘Facebook Has a Blind Spot’: Why Spanish-Language Misinformation Is Flourishing, The Guardian (Mar. 3, 2021).

[9] Fick & Dave, supra note 4.

[10] Oversight Board, Bylaws art. 1, § 4.3 (Jan. 2020) [hereinafter “Bylaws”].

[11] Id. The website initially covered only 18 languages; in February 2021, the Board announced that it is now available in 10 additional languages. Oversight Board (@OversightBoard), Twitter (Feb. 16, 2021, 7:12 AM), https://twitter.com/OversightBoard/status/1361694918519963649.

[12] Bylaws, art. 1 § 4.3.

[13] Id. art. 1 § 3.1.4.

[14] Id.

[15] Oversight Board, Rulebook for Case Review and Policy Guidance 9 (Nov. 2020).

[16] Id.

[17] United Nations Guiding Principles on Business and Human Rights, A/HRC/17/31 (Mar. 21, 2011), Principle 17, Commentary [hereinafter “UNGPs”].

[18] UNGPs, Principle 18, Commentary (emphasis added).

[19] Id., Principle 31.

[20] Id., Principle 31(b), Commentary.

[21] Id., Principle 31(d).

[22] Id., Principle 31(f).

[23] UN Human Rights Comm., General Comment No. 34, ¶ 12 (Sept. 12, 2011); see also ICCPR art. 19(2).

[24] ICCPR arts. 2(1), 26; ICESCR art. 2(2).

[25] UN Comm. on Econ., Soc. and Cultural Rights, General Comment No. 21, ¶ 15(a) (Dec. 21, 2009).

[26] ICCPR art. 27; see also UN Human Rights Comm., General Comment No. 24, ¶ 8 (Nov. 11, 1994).

[27] UN High Comm’r for Human Rights, Report on the Expert Workshops on the Prohibition of Incitement to National, Racial or Religious Hatred, ¶ 29(a), UN Doc. A/HRC/22/17/Add.4 (Jan. 11, 2013) [hereinafter “Rabat Plan of Action”].

[28] Guynn, supra note 5.

[29] UN Human Rights Comm., General Comment No. 22, ¶ 4 (1993).

[30] For a discussion of why the Board should provide channels of access to non-users, please see Alice Doyle, Out of “Site,” Out of Mind: The Facebook Oversight Board’s Exclusion of Non-User Rightsholders, UCI International Justice Clinic (Mar. 5, 2021).

[31] Bylaws, art. 1 § 4.3.

[32] UNGPs, Principle 31(d).

[33] See BSR, Human Rights Review: Facebook Oversight Board 52 (Dec. 2019); see also BSR, Progress Report: Human Rights and the Facebook Oversight Board 13 (Dec. 2020).

[34] BSR, Human Rights Review, supra note 33, at 53; see also Research Report by the Mandate of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, with the support of the International Justice Clinic at the University of California, Irvine School of Law: Freedom of Expression and Oversight of Online Content Moderation, ¶ 56 (July 2020).

[35] Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression on content moderation, ¶ 29, UN Doc. A/HRC/38/35 (Apr. 6, 2018).

[36] David Kaye, Speech Police: The Global Struggle to Govern the Internet 63 (2019); see also Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression on Hate Speech, ¶ 50, UN Doc. A/74/486 (Oct. 9, 2019); Anna Schmidt & Michael Wiegand, A Survey on Hate Speech Detection using Natural Language Processing, Proceedings of the Fifth International Workshop on Natural Language Processing for Social Media 1, 4 (Apr. 2017).

[37] Kaye, Speech Police, supra note 32, at 61; see also Thomas Davidson, Dana Warmsley, Michael Macy & Ingmar Weber, International Conference on Web and Social Media, Automated Hate Speech Detection and the Problem of Offensive Language (2017).

[38] A/74/486, supra note 36, ¶ 50.

[39] Rabat Plan of Action, ¶ 19(a).

[40] A/74/486, supra note 36, ¶ 50; see also Schmidt & Wiegand, supra note 36, at 8 (“[H]ate speech may have strong cultural implications, that is, depending on one’s particular cultural background, an utterance may be perceived as offensive or not.”); Vasu Reddy, Perverts and Sodomites: Homophobia as Hate Speech in Africa, 20 S. African Linguistics & Applied Language Stud. 163, 172 (2002).

[41] Oversight Board, Case Decision 2020-002-FB-UA (Jan. 28, 2021).

[42] Id.

[43] Id.

[44] Id.

[45] Id.; see also Jacob Schulz, What Do the Facebook Oversight Board’s First Decisions Actually Say?, Lawfare (Jan. 28, 2021).

[46] Oversight Board, Case Decision 2020-002-FB-UA (Jan. 28, 2021).

[47] Faiza Patel & Laura Hecht-Felella, Oversight Board’s First Rulings Show Facebook’s Rules Are a Mess, Just Security (Feb. 19, 2021).
