The Center for Democracy & Technology and six technologists with expertise in online recommendation systems filed an amicus brief today in Gonzalez v. Google. The brief urges the U.S. Supreme Court to hold that Section 230’s liability shield applies to claims against interactive computer service providers based on their recommendation of third-party content, because those claims treat providers as publishers.
At issue in Gonzalez is whether Section 230 shields Google from liability for YouTube's alleged recommendation of ISIS content to its users. (CDT filed an amicus brief in a companion case, Twitter v. Taamneh, in which the Court will interpret the Anti-Terrorism Act (ATA) to determine when an online intermediary can be held liable for aiding and abetting an act of international terrorism.) Petitioners in this case argue that Section 230(c)(1), which shields intermediaries from liability for “publishing” third-party content, applies only to claims based on the “display” of content, not the “recommendation” of content.
The amicus brief shows that Section 230(c)(1) shields intermediaries from claims based on the recommendation of content because those claims treat them as publishers. The brief explains that Petitioners’ proposed distinction between display and recommendation is technologically arbitrary and unworkable. Recommendation is functionally indistinguishable from selecting and ranking items for display, something every online service provider must do to manage the often overwhelming amount of content available on its service. The brief also explains that ranking systems for recommending content are used across the Internet, so a holding in favor of Petitioners could have broad impacts on many online services beyond social media, including search engines, news aggregators, and music-hosting services.
In addition, the amicus brief demonstrates that if the Court holds that claims based on recommendations are not shielded by Section 230, providers will be discouraged from using novel ranking algorithms to help users find useful content and to moderate content. It also explains that such a holding would create strong incentives for providers to limit speech by over-removing content, for fear of potential liability.
Finally, the brief explains that online service providers can still be held accountable for unlawful practices without resort to Petitioners’ technically baseless interpretation of the term “publisher” under Section 230. Section 230(c)(1)’s liability shield does not apply if the provider even partially creates or develops the information that gives rise to a legal violation. In addition, claims under comprehensive privacy legislation and antitrust law fall outside the scope of Section 230 and could be used to hold providers liable for their actions.
The coalition that joined the brief includes: CDT; Robin Burke, Professor of Information Science at the University of Colorado, Boulder; Matt Cutts, former Administrator of the United States Digital Service and former Distinguished Engineer at Google; Dean Eckles, Associate Professor at the MIT Sloan School of Management, where he is also affiliated with the Schwarzman College of Computing; Michael Ekstrand, Associate Professor of Computer Science at Boise State University; Brandie Nonnecke, Associate Research Professor at the Goldman School of Public Policy, UC Berkeley and Director of the CITRIS Policy Lab; and Jonathan Stray, Senior Scientist at the Center for Human-Compatible AI, UC Berkeley.