Supreme Court Hearing Major Internet Case: A Preview

by Virginia Kennedy and Grace Palcic

On Tuesday, February 21st, the Supreme Court will hear oral arguments in Gonzalez v. Google, a case with potentially monumental implications for the future of online speech in the United States, and perhaps worldwide. The case concerns the scope of Section 230 of the Communications Decency Act of 1996—a federal law that is widely understood to have been essential to the growth of the internet. Section 230 enables “interactive computer services” to avoid liability for content shared by their users. In other words, the law allows platforms like YouTube to host and organize individuals’ videos without being liable for their content, in essence allowing internet platforms to host the speech of others as a digital public forum. The case is being considered alongside Twitter v. Taamneh, which will also be argued next week. In Twitter v. Taamneh, the Court is expected to consider the extent to which the Anti-Terrorism Act requires an internet host to have knowledge of illegal user-generated speech in order to be held liable for it.

In Gonzalez v. Google, the Court is faced with the question of whether Section 230 immunizes a platform when it recommends third-party content on its site. The case originated after the murder of Nohemi Gonzalez (petitioner’s daughter) in the 2015 ISIS attacks in Paris. The Gonzalez family sued Google under the Anti-Terrorism Act (18 U.S.C. § 2333), arguing that Google, through YouTube’s recommendation systems, pushed ISIS videos to users and is therefore partially responsible for Nohemi’s death. The petitioner argues that Section 230 does not shield internet platforms from liability for content that is promoted through automated recommendation systems. At least one Justice appears to have the appetite to modify Section 230. Justice Thomas, in his concurrence in Biden v. Knight First Amendment Institute and his statement respecting the denial of certiorari in Malwarebytes, Inc. v. Enigma Software Group USA, LLC, has taken aim at Section 230 and stated the need to address its reach, especially as it applies to recommendation systems.

Petitioner’s brief argues that an internet platform becomes a publisher of user-generated content when it uses a recommendation system, because such a system allows the platform to disseminate information that the user did not seek out themselves. Google’s brief in response argues that “interactive computer services,” like YouTube, are not publishers or speakers, and therefore should not be held liable for user-generated content on their platforms. Google grounds this argument in both Section 230 and the First Amendment. Further, Google argues that algorithmic recommendation systems should be protected under Section 230 because “virtually everyone depends on tailored online results, [and] Section 230 is the Atlas propping up the modern internet—just as Congress envisioned in 1996.” The internet we have today, Google argues, would not exist without recommendation systems.

UCI Law’s International Justice Clinic, in partnership with the global NGO ARTICLE 19, submitted a brief calling on the Supreme Court to protect free expression, uphold U.S. courts’ settled approach to Section 230, and defer to Congress in its consideration of how to address automated recommendation systems. We argue that the Court’s application of Section 230 should be guided by the free speech principles of the First Amendment and international human rights law, which is in line with Congress’s intent. We highlight that the petitioners’ claim effectively turns on the illegality of the ISIS-related content posted by users. Without immunity for the algorithmic recommendation systems that promoted this content, internet platforms are likely to over-remove content to avoid liability, resulting in significant removals of lawful content. This over-removal could harm the ability of individuals to easily navigate the internet, find desired information, share content, and develop religious, social, and cultural communities online. Further, these harms are likely to fall disproportionately on certain identity-based groups. The risks associated with restricting or eliminating Section 230 bear directly on the rights protected under the International Covenant on Civil and Political Rights (ICCPR) and domestic constitutional law. While there is good reason to pursue rights-respecting regulation of recommendation systems, legislative action—not judicial decision—is the appropriate avenue for such rulemaking.

In total, nearly eighty amicus briefs were filed in support of the petitioner, the respondent, or neither party. Those filed on behalf of, or in keeping with the arguments of, the Petitioner support a reading of Section 230 that would limit its immunity shield in the context of recommendation systems. For instance, the brief for the State of Texas argues that the text of Section 230 provides no protection for recommendations and that Congress never intended it to do so. Instead, it asserts that the congressional intent was “to encourage Internet platforms to remove pornography and similar content.” The Counter Extremism Project argues that Google’s algorithms are not “neutral tools” and therefore should not be protected by Section 230. Many of the briefs touched on the societal harms that come from some online speech. The National Police Association and the Giffords Law Center to Prevent Gun Violence drew on evidence of violence and crime facilitated online to argue for limits on Section 230. Other briefs, such as those of the Anti-Defamation League (in support of neither party) and the Children’s Advocacy Institute at the University of San Diego School of Law, address how recommendation systems can contribute to extremism, sex trafficking, harms to teen mental health, and social media addiction. Overall, the briefs submitted in support of the Petitioner follow arguments similar to those in the Petitioner’s brief.

The Clinic’s brief was one of over forty amicus briefs filed in support of the Respondent, each with a unique perspective on why Section 230 should remain untouched by the Court. The amici ranged widely in experience and included academics, NGOs, large corporations, and politicians, among them Ron Wyden, now a senator from Oregon and formerly the representative who led the initiative to enact Section 230. These briefs highlight the array of important reasons why the Court should refrain from altering Section 230. Many touch on Congress’s intent when passing Section 230, urging the Court to maintain the original purpose of the statute. The Bipartisan Policy Center notes that Section 230 is the result of bipartisan legislation and received overwhelming support from both sides of the aisle. Some amici, such as the ACLU and EFF, emphasize that Congress enacted Section 230 in order to promote a “forum for a true diversity of political discourse, unique opportunities for cultural development, and myriad avenues for intellectual activity.” Others, like the Internet Society, highlight the intention to protect innovation and “to promote the continued development of the Internet and other interactive computer services and other interactive media.” Perhaps the most compelling is the brief authored by the co-authors of Section 230, Senator Wyden and former Representative Chris Cox. Their brief sets out that the intention of the statute was to protect Internet platforms’ ability to publish and present user-generated content in real time, and to encourage them to screen and remove illegal or offensive content.

Many briefs underscore the importance of Section 230 for maintaining diversity online and promoting access to information. The briefs filed by Scholars of Civil Rights and Social Justice and by TechFreedom show how Section 230 has empowered marginalized communities and minority voices to influence politics and culture. The Chamber of Progress makes a similar argument, stating that eliminating Section 230 would especially harm speakers expressing dissent and that platforms would more often foreclose discussion of controversial matters. The Reddit Moderators offer a distinctive perspective on Section 230’s protections. Their brief points out that on Reddit, the users themselves (as volunteer moderators) decide most of the content moderation rules for each subreddit, and that Section 230 protects those user choices, allowing each subreddit to maintain its own unique community. Similarly, the brief written by the Reporters Committee for Freedom of the Press illustrates how Section 230 promotes press freedom and access to information. Professor Eric Goldman argues that Section 230 is a speech-enhancing statute that adds substantive and procedural due process protections that protect and advance First Amendment free speech online.

Some amici also set out that, despite current rhetoric around the statute, Section 230 does not only protect “Big Tech” but is necessary to ensure that small platforms can thrive on the internet; indeed, small platforms often rely on Section 230’s protections more than large ones. Wikimedia explains how Section 230 is foundational to the success of many smaller companies and nonprofit organizations, specifically highlighting that Wikipedia has been made possible only by the existence of Section 230. The Internet Infrastructure Coalition first explains how Section 230 protects smaller internet services beyond “Big Tech” and social media, and then describes how individuals also receive protections from Section 230. Further, economists Ginger Zhe Jin, Steven Tadelis, Liad Wagman, and Joshua D. Wright set out how Section 230 has enabled businesses to flourish online and describe how restricting Section 230 could have negative effects on the economy.

Lastly, many of the amici point out that Petitioner’s interpretation of Section 230, and its distinction between publishing and distributing, is unworkable. Microsoft points out that Petitioner’s acknowledgment that some recommendations could be protected by Section 230 (for example, search engines, since they require input from the user) does not yield as easy a distinction as Petitioner purports. In fact, search engines do exactly what Petitioner argues would foreclose immunity: a search engine shows results “based upon what [it] thinks the user would be interested in.” Wikimedia explains how Petitioner’s interpretation of Section 230 is illogical and would result in unnecessary uncertainty for platforms. As an example, the brief asks whether the way a website orders pages on its homepage constitutes a “recommendation” under Petitioner’s interpretation.

The amici described above explain just some of the impacts that a limitation on Section 230 may have. Advocates also recognize, however, that there is room for improvement in current law, and that such improvement should come through avenues that can properly consider the impact of any changes. For example, the Center for Democracy and Technology urges the Court to look to legal solutions beyond just Section 230 and traditional technology laws. Similar to our own argument, Microsoft asserts that Congress and the President are in a better position than the Court to address the shortcomings of current law. Meta points out that algorithms are in fact a critical part of its anti-terrorism policies.

The decisions in Gonzalez v. Google and Twitter v. Taamneh have the potential to substantially change the digital landscape in the United States. The Court should respect the United States’ obligations under international human rights law, including the ICCPR, by protecting access to and the exchange of information online. To do this, we urge the Court to refrain from limiting Section 230 and, instead, to defer to Congress to make any necessary legislative changes after thorough consideration. Regardless of the outcome of these cases, the International Justice Clinic will continue to advocate for maintaining and strengthening human rights in the digital realm.
