SAVE Act Would Chill Online Speech and Innovation

The Internet has become a powerful platform for individuals to access information and exchange opinions and ideas of all kinds. It also contributes billions of dollars annually to the US economy. This is due in significant part to a legal structure that protects the intermediaries that make up the Internet – including third-party content hosts, user-generated content platforms, and advertising networks – from being held legally liable for the content that their users post. 47 USC § 230 provides these intermediaries with crucial certainty that they will not be taken to court over material that their users choose to publish through their services. Without such legal protection, the risk of potential liability would deter content hosts from offering people the ability to share information, opinions, and creative expression online.

The SAVE Act, introduced by Senators Kirk and Feinstein, would radically undermine this certainty by creating potential federal criminal liability for intermediaries who host user-provided content designed to facilitate a criminal child-trafficking venture, and by placing burdensome recordkeeping requirements on any intermediary that hosts a wide range of “adult” content.

The new offense, described in section (b) of the bill, raises a number of concerns:

  • The definition of “adult advertising” is broad, encompassing lawful advertising for lawful adult-oriented services as well as communications that are not explicitly advertisements but are designed in part to induce lawful commercial exchange. Intermediaries who host users’ content will face a wide and difficult-to-define range of content that may pull them within the bill’s criminal liability provisions.
  • The offense is aimed at operators of websites[1] that knowingly host adult advertising and “recklessly disregard” the fact that it is designed to facilitate federal child-trafficking or state child-exploitation crimes. A “reckless disregard” standard means that a website operator could be held criminally liable for content created by a third party, even if the operator did not know that the content was intended to further a crime. This new potential for liability would be a radical change for operators and would have a number of unintended consequences:
    • Operators of online classified ad sites would have a strong disincentive to create categories for “adult” ads or to do any sort of pre-screening of user-uploaded content, in order to avoid obtaining knowledge that they are hosting adult advertising.
    • Operators of any type of commercially oriented site – not only online advertising websites, but also marketplaces such as eBay and Amazon – would be likely to take down any content that is reported to them as an “adult advertisement” rather than risk prosecution for continuing to host flagged content with “reckless disregard” of its potentially unlawful nature. This would result in a de facto notice-and-takedown regime for adult content, and would create a potentially powerful heckler’s veto mechanism for individuals seeking to suppress other users’ speech.

In addition to the new federal criminal offense, the SAVE Act would introduce a set of recordkeeping requirements for an incredibly broad range of entities, posing significant problems under the First Amendment. Although styled in paragraph (b)(3)(B) as a safe harbor from liability for the new offense, section (c) is in fact a broad and burdensome regulation of protected speech: it applies to a wide range of content and intermediaries, and it imposes a significant minimum penalty for failure to comply.

  • Section (c) requires any person who maintains or is paid to distribute or publish an adult advertisement to collect government-issued photo identification from any person who places an adult advertisement, to keep those records for a minimum of seven years, and to provide them to federal and state officials upon request.
  • There is essentially no limit to what could constitute an adult advertisement and trigger the recordkeeping requirement. For example, if an adult performer were to use a social networking site to promote his services, the operator of the site would be required to collect his identifying information, regardless of whether the content appeared as a paid advertisement or through regular use of the service, such as in a tweet or status update.
  • Importantly, there is no mens rea requirement for the maintenance or distribution of adult advertisements, meaning that virtually any user-generated content platform, social media site, or advertising service that accepts content developed by third parties could qualify as a maintainer or distributor of adult advertising, whether or not it intends to be.
  • The bill’s penalty for failure to maintain the required identification records is a minimum fine of $250,000 or up to five years in prison. This creates a heavy incentive for content platforms and third-party hosts to collect identifying information from every individual whose content they host. Even if a platform has no intent to host adult advertising content, the penalty for error is steep, and could prove ruinous to smaller publishers, website operators, and other content hosts.
  • Even if the identification requirements applied only to individuals uploading clearly identified adult advertising content, they would still threaten to chill adults’ protected speech. The bill’s definition of adult advertising includes lawful ads for lawful services, and covers communications that are only in part intended to induce commercial exchange (i.e., communications not wholly devoted to advertising). The bill would thus require significant amounts of personal information from adults engaged in constitutionally protected communications; similar identification requirements in the Child Online Protection Act contributed to that law being struck down, in part because of the burden they placed on speakers, listeners, and hosts of protected speech. ACLU v. Mukasey, 534 F.3d 181 (3d Cir. 2008), cert. denied, 555 U.S. 1137 (2009).

Subsection (c)(5) also directs the Attorney General to promulgate a number of additional regulations that would burden First Amendment rights:

  • (c)(5)(A) requires all operators of user-generated content platforms and third-party content hosts to manually review all content before it is posted. This is a burden that few, if any, operators and hosts could survive, and attempts to comply would dramatically reduce the opportunities for speakers to find willing hosts and platforms for their speech online.
  • (c)(5)(B) directs the Attorney General to develop a blacklist of words that individuals would be prohibited from using even in lawful advertisements for lawful goods and services. Such a regulation would fail even the most lenient level of scrutiny, let alone the heightened scrutiny the First Amendment requires.
  • (c)(5)(C) would require operators of user-generated content sites to actively monitor all content – a time- and resource-intensive process – to ensure that previously removed content is not reposted. This would require operators to maintain thorough records of all removed material and their reasons for removing it, significantly adding to the costs of operating a user-generated content site.
  • Further, (c)(5)(C) would require operators to adopt policies prohibiting people who have previously posted “inappropriate” content from posting content in the future, essentially deputizing website operators to impose unconstitutional prior restraints on individuals’ speech.

[1] The bill makes it unlawful for a person to “knowingly sell . . . or maintain an adult ad . . . in a medium whose predominant purpose or use is to facilitate commercial transactions.”  For the purposes of this post, we read “in a medium” to refer to a website or online service not exempted in (3)(A), but the term “medium” is undefined.