Senate Inquiry into Backpage’s Content Moderation Practices Would Set Dangerous Precedent for Free Speech Online
Written by Emma Llansó
When the government makes invasive demands for information about a website operator’s content moderation practices, it threatens everyone’s free speech online. CDT, joined by the Electronic Frontier Foundation, delivered this message in an amicus brief to the Court of Appeals for the D.C. Circuit last week, in the case of Carl Ferrer v. the Senate Permanent Subcommittee on Investigations.
Last year, the Subcommittee issued a subpoena to Carl Ferrer, CEO of Backpage.com, demanding extensive information about how Backpage moderates user-generated content and screens classified ads submitted to its Adult Services section. This subpoena is the latest move in a long series of efforts aimed at harassing Backpage off the web that have included threats to Backpage’s credit card processors and multiple unsuccessful attempts to hold the website liable for content provided by its users.
Ferrer challenged the subpoena on First Amendment grounds, arguing that the subpoena intrudes into the website’s editorial decisionmaking and that the Subcommittee had failed to justify the need for this intrusion. The district court, however, ruled that the burden of proof rested on Ferrer to show – document by document – that the subpoena’s demand for information invaded his First Amendment rights.
This standard places a significant burden on Ferrer’s First Amendment-protected activity, and jeopardizes the environment for free speech online. The Supreme Court has made clear that, under the First Amendment, the government may not tell a private publisher what to print or punish a publisher for the editorial decisions it makes.
Backpage’s content moderation decisions are editorial judgments equally protected by the First Amendment. Hosts of user-generated content make editorial judgments when they develop their content policies, decide what to include and exclude from their websites, and enforce those policies. Website operators create and enforce their content policies as a way to communicate a message, whether it’s Facebook’s desire for “people to feel safe when using Facebook” or Reddit’s interest in being “home to some of the most authentic content anywhere online.” Curating a message in this way is similar to selecting the participants in a parade or protest. Both practices are inherently expressive and combine a variety of voices to convey a message to an audience.
One of the risks to free speech that we highlight for the court is the trend of government officials seeking the removal of third-party content by targeting the websites and other platforms that host it. In May 2016, after allegations that Facebook’s Trending Topics section was biased against conservative viewpoints, Senator Thune sent a letter of inquiry to Facebook demanding details of how Facebook decides which content will appear in its trending news feed.
Targeting the editorial practices of website operators for government scrutiny creates a chilling effect. Operators become unwilling to host controversial but wholly protected speech out of fear that doing so could subject them to similar scrutiny. As we discuss in our brief, individuals’ ability to use the internet for their own expression and access to information depends on multiple intermediaries, including access providers and content hosts. These entities become potential chokepoints for speech and must be safeguarded from government pressure to censor. Without such protections, they have strong incentives to take down risky but lawful speech in order to avoid litigation. With Backpage’s legal fees for responding to the Subcommittee’s subpoena already reaching $2.8 million, it’s easy to see how even a mere “investigation” into a website’s practices could be sufficient to bankrupt the site into silence.
CDT has long urged website operators to be transparent, fair, and consistent in developing and implementing their content policies, and we often have recommendations for how companies can improve their practices. Under the UN Guiding Principles on Business and Human Rights, companies have an obligation to respect the human rights of their users, and transparency about their content moderation activities is an integral part of that. But when the government subjects a website’s lawful decisions to host constitutionally protected speech to intense scrutiny, it creates a backdoor to censorship that threatens user speech across the internet. For all of these reasons, we urge the D.C. Circuit to reverse the district court’s ruling and acknowledge the important First Amendment interests at stake.