

Chilling Effects on Content Moderation Threaten Freedom of Expression for Everyone

Online hosts of user-generated content make millions of decisions a day about what content or accounts to allow or block, amplify or downplay, or otherwise manage. These editorial judgments are critical to online free expression because they create the distinctive speech environment of each service, shaping what users can say and what information they receive. 

But if the government disagrees with how a social media company makes these decisions, should it be able to threaten and coerce the company into changing its mind?

This issue is at the heart of Twitter’s case against Texas Attorney General Ken Paxton, which challenges Paxton’s issuance of a civil investigatory demand (CID) to Twitter in retaliation for Twitter’s decision to bar President Trump from its service following the attack on the Capitol on January 6, 2021. The CID demands that Twitter turn over internal documents about how Twitter moderates content, as part of an “investigation” into whether Twitter’s representations and practices about what can be posted on its platform violate the Texas Deceptive Trade Practices-Consumer Protection Act. As AG Paxton’s public statements make clear, the true motivation is to retaliate against Twitter for exercising its First Amendment right to engage in content moderation.

At issue before the Ninth Circuit Court of Appeals is the somewhat technical question of whether Twitter’s claim of retaliation is “ripe,” i.e., whether Twitter can bring suit now to challenge the CID as a violation of Twitter’s First Amendment rights or whether it must wait until Paxton moves to enforce the CID in a separate proceeding. But this is no dry procedural question; the Court’s decision could be of great importance to social media users and the general public, who rely on hosts to make important and timely moderation decisions regarding disinformation, hate speech, and harassment.

The ripeness of a claim depends in part on whether the party bringing suit has suffered an injury. Twitter argues that its claim is ripe because the CID chills its ability “to freely exercise its constitutionally-guaranteed right to make editorial decisions about the speech it carries.” CDT and a coalition of five other civil society organizations agree: in an amicus brief in support of Twitter, we argue that a state AG’s investigation and CID targeting a host in retaliation for its content moderation practices will pressure the host into self-censoring, even before the CID is enforced.

As our amicus brief explains, when facing a retaliatory investigation and CID by a state AG who has been critical of a host’s decision to remove certain content, a host may forgo making certain moderation decisions or updates to its content policies because it believes that the AG will treat it more leniently or drop an investigation entirely if it does so. (Depending on the nature of the investigation, an outstanding CID could also make a provider more likely to remove certain speech.) It may also be chilled from engaging in internal discussions and debate about content moderation that, ideally, help to ensure its policies and decisions respect free expression and other human rights, because it is afraid of having to reveal internal documents reflecting those deliberations to the government.

These chilling effects harm users of these services and the public. Users and the general public rely on hosts to regularly update their content policies and to moderate content that users do not want and that can harm society at large, such as misinformation about COVID-19 or elections, harassment and abuse, and promotion of suicide and self-harm. Moderation is also often what distinguishes services from one another; the types of abuse a provider has to grapple with can vary significantly across services, and the way providers respond to issues like disinformation and harassment can have a real impact on who feels welcome or able to speak. When a retaliatory investigation and CID chill hosts from engaging in these important efforts, it is users and the public that lose.

In addition, CDT and other civil society organizations have not hesitated to point out that hosts can already struggle with making moderation decisions in a manner that respects human rights. A retaliatory investigation and CID inhibit a host from taking the steps necessary to ensure its content moderation practices respect free expression—such as thoughtful and frank exchange of views among the host’s employees, significant research, and consultation with experts when developing and applying content moderation policies—to the detriment of users and the public. 

A retaliatory investigation and CID also harm the public interest because they can decrease competition and diversity in the market for hosting user-generated content. Smaller and startup companies, in particular, may be unable to afford the often considerable cost of responding to an investigation and CID. Startup companies with an outstanding CID may also struggle to attract necessary funding while under the cloud of an investigation. It is critical that courts allow companies to challenge retaliatory CIDs before they are enforced, so that government officials cannot wield them to drive smaller and newer companies out of business, ultimately to the detriment of the public.

Once Paxton’s investigation was announced and the CID was issued, Twitter’s exercise of its First Amendment right to engage in content moderation was inevitably chilled, as it understood that similar moderation decisions going forward risked adding fuel to the investigatory fire, that ongoing internal deliberations about moderation rules or decisions would be subject to discovery under the CID and second-guessing by Paxton, and that it could minimize its legal, reputational, and financial risks by engaging in self-censorship. This injury to Twitter also harms users and the public. The Ninth Circuit should hold that Twitter’s First Amendment challenge to Paxton’s retaliatory CID is ripe. 

Read the full brief here.