CDT to Court: When it Comes to Illegal Online Content, Punish the Poster, Not the Platform
This post is part of our ‘Shielding the Messengers’ series, which examines issues related to intermediary liability protections, both in the U.S. and globally. Without these protections, the Internet as we know it today, a platform where diverse content and free expression thrive, would not exist.
Section 230 of the Communications Act may be the most important statute for free expression and online innovation you’ve never heard of. It prevents online service providers from being subject to lawsuits over third-party content, that is, the content that people post to a website without direction from the operator. Without Section 230, many websites that form a key part of many people’s online lives, such as YouTube, Facebook, Twitter, and Yelp, might never have existed. Section 230 is essential to keeping the Internet a useful, generative medium for free expression, democratic participation, and economic growth.
As part of our constant effort to preserve Section 230’s vitality, this week CDT joined the Electronic Frontier Foundation on a friend-of-the-court brief in the Washington state case J.S., S.L., and L.C. v. Village Voice Holdings. In that case, the Plaintiffs asked the court to deviate from both the text of Section 230 and the case law interpreting it by holding the alternative classified site Backpage.com responsible for illegal content that third parties posted to the website. The trial court incorrectly denied Backpage’s motion to dismiss the case based on Section 230’s immunity. Our brief supporting Backpage’s appeal argues that it is both good law and good policy to dismiss the case.
The case should have been dismissed because Section 230 broadly shields intermediaries from liability based on their users’ content. Extensive case law supports the position that online service providers receive immunity as long as they do not contribute to the development of the content they host. There can be situations where immunity will not apply, such as where the structure of a site makes it impossible for contributors not to violate the law. But the allegations against Backpage.com do not come close to that line. The lower court appears to have concluded that Section 230 immunity should not apply because Backpage posts and attempts to enforce content guidelines about what material is allowed on the site, and because Backpage has general knowledge that illegal content is sometimes posted on the site. Yet those two facts are true of practically every user-generated content site on the Web. Because all the allegations seek to treat Backpage as the speaker or publisher of third-party illegal content that it did not develop – precisely what Section 230 prohibits – the suit should have been dismissed. Any other result would threaten every online platform.
The power to dispense with such claims early in the litigation process is a critical feature of Section 230. In enacting that law, Congress aimed to encourage the unfettered and unregulated development of free speech on the Internet, and to promote the development of Internet-based commerce. To this end, Section 230 was intended not only to protect intermediaries from liability but from lawsuits themselves. Social media and other user-generated content websites carry the public speech of extremely large numbers of individuals. If lawsuits like Village Voice Holdings were regularly allowed to go forward, service providers would face a constant barrage of costly litigation, and many would choose to avoid business models that allow people to freely contribute independent content. This would result in diminished opportunities for expression and reduced investment in the Internet economy.
The Plaintiffs seek recompense for child sex trafficking that was advertised on Backpage.com. Such trafficking is undoubtedly one of the most terrible crimes imaginable. However, as CDT President/CEO Leslie Harris stated in response to a recent congressional hearing: “Child sex trafficking is a horrific crime, but the right way to address criminal acts is through aggressive law enforcement, not by making online content platforms liable for the inappropriate or illegal actions of the users of those services.” A holding for the Plaintiffs on these facts could have broad repercussions for online expression and e-commerce. A solution tailored to the crimes the Plaintiffs suffered would be a superior path forward. We ought not hamstring platforms for free expression and e-commerce when there are alternative options for preventing and punishing illegal acts.