Holding Platforms Liable for Terrorist Tweets Would Open Door to Mass Censorship

Written by Emma Llansó

When online content hosts face a risk of litigation over user-generated content, they will respond with overbroad censorship measures that limit individuals’ rights to post and access lawful, constitutionally protected speech.  This is true whether the litigation risk comes from the government (as it would under the proposed online censorship bill we’re fighting in Germany) or from private actors.

In an amicus brief filed jointly with the Electronic Frontier Foundation in Fields v. Twitter, we urged the Ninth Circuit to recognize the threat that civil litigation against intermediaries poses to everyone’s free speech online.

In this case, Tamara Fields and other family members of victims of terrorist attacks seek to hold Twitter liable for the victims’ deaths, due to the general use of Twitter by members of terrorist organizations such as ISIS. Plaintiffs argue that Twitter violated the federal law prohibiting “providing material support to terrorists” (18 U.S.C. § 2339A) and that the families should thus be able to recover civil damages against the company, as provided by federal statute (18 U.S.C. § 2333).

There are a number of issues with plaintiffs’ claims — for example, plaintiffs don’t establish proximate causation, i.e., that the use of Twitter led to or facilitated the attacks in question — but our brief focuses on the consequences for free expression that would stem from exposing social media platforms and other website operators to this kind of expansive claim of liability.

If an intermediary could face liability merely by offering a platform for user-generated content to the public, while knowing that some of that public includes people affiliated with terrorist organizations, then every website operator and online service provider would face the risk of lawsuits over their “role,” however tenuous it might be, in the commission of terrorist attacks. The messaging apps, ISPs, video hosts, search engines, and social media sites that ISIS decides to use would all be vulnerable to similar claims that they provide “support” to the terrorist organization. And those claims are very likely to materialize — there have already been multiple lawsuits in the US against social media companies making arguments similar to Fields’ claims.

In response to this broad threat of liability, these intermediaries would clamp down on the speech that flows across their networks and is hosted on their servers. Any discussion of terrorism — including news reporting, public policy discussion, religious debate, and advocacy — would likely be viewed as a potential source of a lawsuit and taken down. This would lead to the censorship of huge amounts of constitutionally protected speech, and a withering of our ability to engage in essential dialogue and access information online.

Fortunately, federal law already bars these kinds of civil claims against hosts of third-party content. Section 230 of the Communications Act protects intermediaries from liability based on publishing and distributing third-party content, and it also protects intermediaries’ ability to exercise editorial discretion over the material on their services. Courts across the country have applied Section 230 to bar civil claims based on intermediaries’ role in publishing third-party speech, and this case should be no different. The district court twice rejected plaintiffs’ claims on Section 230 grounds, and we urge the Ninth Circuit to follow suit.

Plaintiffs attempt to circumvent Section 230’s protections by arguing that when Twitter provides accounts to users, it is doing something other than acting as a publisher or editor.  The district court rightly rejected this argument, finding that a decision to deny a person an account “would be a publishing decision to prohibit the public dissemination of [the user’s] ideas.”  In other words, providing an account to someone is a platform’s decision to enable that person to post third-party content — precisely the kind of decision by an intermediary that Section 230 protects.

Shielding online intermediaries from liability for third-party content is essential to the protection and promotion of free expression online.  All of our online communication traverses multiple intermediaries, and laws like Section 230 ensure that we can continue to use the powerful platforms and services that host our speech and enable access to information.  The Ninth Circuit should follow the considerable precedent around this law and maintain these strong protections, for the benefit of all speakers online.
