Out of “Site,” Out of Mind: The Facebook Oversight Board’s Exclusion of Non-User Rightsholders*

*Primary research and drafting by: Alice Doyle, J.D. Candidate (Class of 2022), UC Irvine School of Law


The Facebook Oversight Board’s lack of access for non-user rightsholders forecloses the possibility of a remedy for many individuals adversely impacted by Facebook’s content moderation decisions. Much of the current discourse regarding the Board’s remedial scope takes aim at limitations on the types of content the Board can review—permitting users to refer only cases in which Facebook has removed organic content for violating its internal policies.[1] An overlooked limitation, however, is that, other than Facebook itself, only users of the Facebook or Instagram platforms may refer cases to the Board.[2] The harms suffered by non-users can be profound and tragic, as illustrated by the 2017 genocidal attacks on Rohingya communities (many members of which were non-users) fueled by hate speech on the platform.[3]

The Board’s distinction between Facebook users and non-users finds no basis in international human rights law. Rather, as the following analysis details, the exclusion of non-users is antithetical to human rights: it disproportionately shuts out individuals and communities already at heightened risk of violence or marginalization, preventing them from accessing this pathway to remediation.

I. Structure and limitations of the Oversight Board

As a recent report from the Office of the U.N. High Commissioner for Human Rights noted, “[t]he remedies that may be obtained from non-State-based grievance mechanisms are usually partial at best, in many cases due to limitations placed on the mechanism’s mandate, available resources, or both.”[4] This observation applies to the Board’s remedial scope, which limits access to just two categories of stakeholders: Facebook users and the company itself.

Indeed, the Board’s Bylaws state that “[i]n order to request a review by the board, a person must have an active Facebook or Instagram account.”[5] Moreover, Facebook’s internal reporting and appeals process—which must be exhausted before a user may request review by the Board—requires that the person making the report have an active account.[6] These admissibility prerequisites make clear that the review process serves only users, not non-users whose rights may be impacted.

Facebook, meanwhile, has the power to refer matters to the Board for review.[7] This pathway allows for greater discretion regarding what types of cases may be referred, as Facebook is not subject to the same admissibility restrictions as users. With Facebook’s broad authority to submit “significant and difficult”[8] cases for review, many important matters that would otherwise fall outside the Board’s narrow remedial scope could come before the Board as referrals from Facebook. Furthermore, Facebook may request advisory policy guidance from the Board, presumably on any matter it chooses (although the Board’s guidance on such matters does not bind the company).[9] Facebook thus has the ability to refer matters directly affecting non-user rightsholders but no obligation to do so.

II. International human rights standards applicable to the Oversight Board

Acknowledging that “even with the best policies and practices, a business enterprise may cause or contribute to an adverse human rights impact,”[10] the U.N. Guiding Principles on Business and Human Rights (“UNGPs”) recommend that businesses establish or participate in operational-level grievance mechanisms “for individuals and communities who may be adversely impacted.”[11] Pursuant to UNGP 31, such mechanisms should be legitimate, predictable, equitable, transparent, a source of continuous learning, based on engagement and dialogue, and—most importantly for this discussion—accessible and rights-compatible.[12]

The UNGPs direct business grievance mechanisms to ensure accessibility and to comply with internationally recognized human rights. The accessibility criterion identified in UNGP 31 requires that grievance mechanisms be known to all rightsholders for whom they are intended, and that adequate assistance be provided for those who may face particular barriers to access.[13] Similarly, UNGP 29 establishes that grievance mechanisms should identify and address “any legitimate concerns”; otherwise, concerns “may over time escalate into more major disputes and human rights abuses.”[14] Additionally, grievance mechanisms should be rights-compatible, “ensuring that outcomes and remedies accord with internationally recognized human rights.”[15] Such rights include the right to be free from discrimination[16] and the right to a remedy for all abuses[17]—two cornerstones of human rights protection.

Compliance with the UNGPs thus requires that operational-level grievance mechanisms, inter alia, be accessible and provide remedies, in a non-discriminatory manner, to all those who may be adversely impacted by the business’s practices. The Board does not meet these standards.

III. The Board’s exclusion of non-users in contravention of human rights standards

The categorical exclusion of non-users by the Oversight Board is inconsistent with applicable international human rights standards, including those set forth by the UNGPs. This exclusion has a discriminatory impact and denies billions of rightsholders access to a remedy.

The existence of a digital divide along lines of race, gender, socioeconomic status and other identities has been well documented.[18] Facebook CEO Mark Zuckerberg himself recognized that, as of 2015, two-thirds of the world’s population, particularly in the Global South, did not have access to the internet.[19] Despite improvements in recent years, inequalities with regard to internet access persist. For example, online access in Pakistan depends on characteristics such as age, gender, race/ethnicity, education and income.[20] Even in well-connected States like the U.S., low-income households and communities of color are more likely to lack internet access.[21] The digital divide has compounded existing societal inequalities and engendered “new and complex forms of exclusion affecting those already marginalized and disempowered.”[22] And the COVID-19 pandemic has only deepened the divide, while making internet access more essential than ever.[23]

The Board’s inability to review and provide a remedy to non-users exacerbates these discriminatory impacts. Members of marginalized communities are the most likely to lack Facebook accounts, yet they are often harmed by content on the company’s platforms.[24] For example, non-users are often victims of hateful content, “seemingly spurred on by a business model that values attention and virality.”[25] The online environment allows for anonymous speakers, coordination, and mob attacks.[26] The U.N. Committee on the Elimination of Racial Discrimination has suggested that racist hate speech and incitement to racial discrimination can be particularly dangerous when conducted on online platforms like Facebook, as they allow speakers to reach vast audiences and foment hostility towards ethnic and racial groups through repetition.[27] Activity on social media platforms thus places marginalized communities at heightened risk of violence or further marginalization—triggering the need for particular attention to their rights and needs.[28]

Despite their heightened risk, such rightsholders are deprived of access to a remedy by the Board’s exclusion of non-users. It is a basic tenet of human rights law that “[w]hen business-related human rights abuses occur, those affected must have access to effective remedy.”[29] Indeed, the UNGPs call on businesses to “avoid infringing on the rights of others and to address adverse impacts with which they are involved,”[30] which requires meaningful engagement with those whose human rights may be adversely impacted by Facebook’s practices.[31] The corporate responsibility to respect human rights is not limited to the rights of a business’s own employees, customers, or users. Rather, grievance mechanisms should be made directly accessible to all individuals and communities who may be adversely impacted.[32]

Users with Facebook or Instagram accounts are not the only rightsholders who may be adversely impacted by Facebook’s content moderation decisions. As indicated above, online hate speech and incitement to violence have far-reaching consequences that are not limited to the Facebook community. Non-users are often “offline victims of online content,” and Board review should be made available to them.[33]

IV. Conclusion

The UNGPs recognize the corporate responsibility to respect human rights and to provide full access to remedies—in a non-discriminatory manner—to those whose rights have been adversely impacted. To meet this standard, Facebook should recognize the discriminatory impacts of the Oversight Board’s exclusion of non-users and address the disparity by facilitating meaningful access to review for all those who may be harmed by content moderation decisions, users and non-users alike. This access could be made possible by, for example, creating new pathways to review for civil society organizations to raise issues on behalf of affected non-users,[34] or by requiring Facebook to consult with non-user rightsholders in formulating its referrals to the Board.[35] The form this pathway to review takes matters less than the outcome: that space be held for the voices of those who would otherwise go unheard on the platform.


[1] See Oversight Board, Bylaws (Jan. 2020), art. 3 § 1.1.1 [hereinafter “Bylaws”]; Research Report by the Mandate of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, with the support of the International Justice Clinic at the University of California, Irvine School of Law: Freedom of Expression and Oversight of Online Content Moderation (July 2020), ¶¶ 42–53; Evelyn Douek, How Much Power Did Facebook Give Its Oversight Board?, Lawfare (Sept. 25, 2019).

[2] See Bylaws, art. 3 § 1.1.

[3] The independent fact-finding mission on Myanmar, established by the U.N. Human Rights Council, implicated Facebook in these atrocities, stating: “The role of social media is significant. Facebook has been a useful instrument for those seeking to spread hate, in a context where, for most users, Facebook is the Internet.” Report of the Independent International Fact-finding Mission on Myanmar, U.N. Doc. A/HRC/39/64 (Sept. 12, 2018), ¶ 74.

[4] U.N. High Commissioner for Human Rights, Improving Accountability and Access to Remedy for Victims of Business-Related Human Rights Abuse Through Non-State-based Grievance Mechanisms, A/HRC/44/32 (May 19, 2020).

[5] Bylaws, art. 3 § 1.1.

[6] Facebook’s Help Center page states that, should a non-user want to report something that violates the platform’s Community Standards, they “may need to ask a friend to help [them].” Help Center: Don’t Have an Account?, Facebook.

[7] See Bylaws, art. 2 § 2.

[8] Id. § 2.1.1.

[9] Id. §§ 2.1, 2.3.

[10] United Nations Guiding Principles on Business and Human Rights, A/HRC/17/31 (Mar. 21, 2011), Principle 22, Commentary [hereinafter “UNGPs”].

[11] Id., Principle 29.

[12] Id., Principle 31.

[13] Id., Principle 31(b), Commentary; see also BSR, Human Rights Review: Facebook Oversight Board 52 (Dec. 2019).

[14] UNGPs, Principle 29, Commentary (emphasis supplied).

[15] Id., Principle 31(f). Moreover, in all contexts, businesses should both “respect” and “seek ways to honour the principles of internationally recognized human rights.” Id., Principle 23(a)–(b). This responsibility encompasses, “at a minimum,” those rights expressed in the International Bill of Human Rights (comprising the Universal Declaration of Human Rights, the International Covenant on Civil and Political Rights (“ICCPR”) and the International Covenant on Economic, Social and Cultural Rights (“ICESCR”)) and the International Labour Organization core conventions. Id., Principle 12, Commentary.

[16] See, e.g., U.N. Human Rights Committee, General Comment No. 18 (Nov. 10, 1989), ¶ 1. The UNGPs make clear that they “should be implemented in a non-discriminatory manner, with particular attention to the rights and needs of, as well as the challenges faced by, individuals from groups or populations that may be at heightened risk of becoming vulnerable or marginalized.” UNGPs, General Principles.

[17] See, e.g., ICCPR, art. 2.

[18] See, e.g., Bridgette Wessels, The Reproduction and Reconfiguration of Inequality, in The Digital Divide: The Internet and Social Inequality in International Perspective 17 (Massimo Ragnedda & Glenn W. Muschert eds., 2013). According to the International Telecommunication Union, as of 2019 the percentage of individuals using the internet still varies drastically across global regions, from 82.2% in Europe to only 28.2% in Africa. Statistics, ITU; Jennifer Brody, Eric Null & Isedua Oribhabor, A Digital Rights Agenda for 2021 and Beyond, Access Now (Aug. 18, 2020).

[19] Arzak Khan, Internet.org Risks the Web’s Future in Pakistan, Al Jazeera America (June 22, 2015). In fact, Zuckerberg has launched a series of initiatives to increase connectivity by providing a free, “stripped-down” web service (offering access to basic sites and, of course, Facebook) to communities that would otherwise not have internet access. See Jessi Hempel, What Happened to Facebook’s Grand Plan to Wire the World?, Wired (May 17, 2018). Such initiatives have been widely criticized by civil society organizations, as well as by States, for violating net neutrality principles and effectively perpetrating “digital colonialism” with “mostly western corporate content.” Olivia Solon, ‘It’s digital colonialism’: how Facebook’s free internet service has failed its users, The Guardian (July 27, 2017) (quoting Ellery Biddle, advocacy director of Global Voices) (internal quotation marks omitted).

[20] Arzak Khan & Jason Whalley, How Connected are Pakistanis?, 5th Communication Policy Research Conference (Dec. 14, 2010), at 14–15.

[21] Monica Anderson & Madhumitha Kumar, Digital Divide Persists even as Lower-Income Americans Make Gains in Tech Adoption, Pew Research Center (May 7, 2020); Andrew Perrin & Erica Turner, Smartphones Help Blacks, Hispanics Bridge Some—But Not All—Digital Gaps with Whites, Pew Research Center (Aug. 20, 2019).

[22] Massimo Ragnedda, Tackling Digital Exclusion, in Global Agenda for Social Justice: Volume One 151, 151 (Glenn W. Muschert et al. eds., 2018).

[23] See, e.g., Mehreen Zahra-Malik, The Coronavirus Effect on Pakistan’s Digital Divide, BBC News (July 13, 2020).

[24] Under international human rights standards, discrimination is determined by discriminatory impact, irrespective of intent. See, e.g., International Convention on the Elimination of Racial Discrimination (“ICERD”), art. 1 ¶ 1 (defining the term “racial discrimination” as any “exclusion . . . based on race, colour, descent, or national or ethnic origin which has the purpose or effect of nullifying or impairing the recognition, enjoyment or exercise, on an equal footing, of human rights and fundamental freedoms” (emphasis supplied)); U.N. Human Rights Committee, supra note 16, ¶¶ 6–7 (applying the ICERD definition of discrimination to the ICCPR).

[25] Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression on hate speech, U.N. Doc. A/74/486 (Oct. 9, 2019), ¶ 40.

[26] Id.

[27] U.N. Committee on the Elimination of Racial Discrimination, General Recommendation No. 35 (Sept. 26, 2013), ¶ 14; see also A/74/486, supra note 25, ¶ 16.

[28] UNGPs, General Principles.

[29] A/HRC/44/32, supra note 4, ¶ 5.

[30] UNGPs, Introduction to the Guiding Principles, ¶ 6.

[31] See BSR, Human Rights Review, supra note 13, at 14.

[32] See, e.g., UNGPs, Principle 29, Commentary.

[33] BSR, Human Rights Review, supra note 13, at 53; see also id. at 21 (“[T]he Board’s scope . . . does not encompass rightsholders who may have been impacted by content on Facebook or Instagram, but who themselves are not Facebook or Instagram users.”).

[34] See Report of the Working Group on the Issue of Human Rights and Transnational Corporations and Other Business Enterprises, U.N. Doc. A/72/162 (July 18, 2017), ¶ 72 (“Civil society organizations and human rights defenders have a critical role to play in facilitating access to effective remedies. They are often ‘justice enablers’ for the victims of corporate human rights abuses.”).

[35] See id., ¶ 74 (“[B]usinesses should play their part in creating a safe operating environment for civil society organizations.”); see also UNGPs, Principle 31(h) (“Operational-level mechanisms should also be: Based on engagement and dialogue: consulting the stakeholder groups for whose use they are intended on their design and performance, and focusing on dialogue as the means to address and resolve grievances.”).
