Amendments to EARN IT Act Can’t Fix the Bill’s Fundamental Flaws
The EARN IT Act, which has been widely criticized as threatening the privacy, security, and free expression rights of Internet users around the world, is headed to markup at the Senate Judiciary Committee tomorrow, Thursday, July 2nd. Last night, the bill’s co-sponsors, Senators Graham and Blumenthal, released a manager’s amendment that changes some significant elements of the bill, but that still fails to address the EARN IT Act’s threats to fundamental rights.
Expanded Liability Threatens Free Expression Online
The Manager’s Amendment (MA) significantly expands the liability risk that online intermediaries of all kinds will face for hosting, linking to, or otherwise enabling third-party content on their services. It revokes Section 230’s liability shield for federal civil claims and for state criminal and civil claims relating to child sexual abuse material (CSAM), making it far riskier for intermediaries to host and moderate user-generated content. This expanded liability will result in over-broad censorship by online services of all manner of constitutionally protected speech, including crucial health and safety information for teens and other vulnerable groups, content created by LGBTQ individuals, and services designed for minors and young adults, from social media apps to video game sites. With greatly expanded criminal and civil liability for CSAM, both sexually oriented content and content depicting or aimed at minors will become much riskier for intermediaries to host.
In the MA, there is no longer any way for intermediaries to “earn” a liability shield—they are simply exposed to dramatically increased sources of liability under 50+ state legal systems. The MA would enable state criminal and civil claims “under State law regarding the advertisement, promotion, presentation, distribution, or solicitation of child sexual abuse material.” This is a vague and broad list of potential offenses that will encompass a wide variety of state laws that apply different legal standards to the same conduct.
For example, Wisconsin’s felony prohibition on knowingly distributing CSAM applies if the person “knows or reasonably should have known” that the material depicted a minor. (Wis. Stat. § 948.05(1m)) Texas makes it a felony to “knowingly or intentionally promote” CSAM. (Tex. Penal Code Ann. § 43.25) Minnesota, by contrast, defines the crime as committed by “a person who, knowing or with reason to know its content and character, disseminates [CSAM] for profit.” (Minn. Stat. § 617.246) Each of these knowledge standards differs from the others and from the current federal standard. Intermediaries will lack certainty about what activity could expose them to legal risk, leading them to take the most risk-averse approaches. That could mean significantly restricting children’s ability to use their services, or blocking any content (including perfectly lawful, constitutionally protected speech) that depicts nudity or sexual themes. Or it could, perversely, discourage intermediaries from proactively identifying and removing CSAM from their services, lest they obtain actual knowledge or be told by a court that, because they do some scanning for CSAM, they “had reason to know” there was a risk they were distributing it.
The language of the MA’s state-law carve-out to Section 230’s protection is arguably even broader than the changes to federal criminal law wrought by SESTA-FOSTA (which is currently being challenged on First Amendment grounds; CDT’s amicus brief is available here). SESTA-FOSTA has led many websites to censor users and forums, and has had a devastating impact on the health and safety of the sex worker community, as documented in this report by Hacking//Hustling. The clear lesson from the SESTA-FOSTA debacle: When content hosts, website operators, social media services, search engines, and other intermediaries face increased legal risk for user-generated content, it is the users who end up suffering, a cost often borne by the most vulnerable.
End-to-End Encryption Still Under Threat
The MA also continues to threaten intermediaries’ ability to provide end-to-end encrypted services to their users, thus denying vital safety protection to users like journalists and domestic abuse survivors. Under the approach adopted in the MA, intermediaries will be exposed to an enormous number of lawsuits in which they will potentially face liability for their choice to protect users’ privacy and security through end-to-end encryption (E2EE). Prosecutors and civil litigants will point to the encrypted status of an intermediary’s services as a relevant consideration in their claims, even for criminal and civil provisions with a “knowingly” mens rea. Even if an intermediary successfully defends against a particular claim, the constant threat of litigation challenging its decision to employ encryption will be a strong disincentive against providing E2EE, since the intermediary would have to defend that decision in court again and again.
This disincentive against providing E2EE is exactly what the EARN IT Act has always been designed to do. The role of the Commission has been altered in the MA, which drops the ability for intermediaries to retain Section 230 protections by certifying that they abide by the Commission’s best practices. CDT and many others had criticized this previous structure as extremely coercive, since intermediaries typically cannot afford the risk of handling user-generated content without a strong liability shield like Section 230.
But, while it removes any hope of a safe harbor for intermediaries, the MA retains the same membership and substantive focus of the previous iteration of the Commission, and instructs the Attorney General (AG) to publish the voluntary practices on the DOJ website and in the Federal Register. The Commission, headed by the Attorney General, will undoubtedly consider encryption as part of its “best practices,” and the AG has been extremely clear that he considers intermediaries who employ encryption to be “someone who buries his or her head in the sand” and “blinds others who want to help.” These best practices will serve as a handbook for prosecutors across the country, and states may decide to pass laws requiring intermediaries to comply with the best practices or face liability under state law. The AG-led best-practices process remains a bully pulpit from which AG Barr is certain to continue his campaign against encryption.
In short, the Manager’s Amendment to the EARN IT Act changes some aspects of the bill, but the rotten core of it remains. Threatening intermediaries with vague and expansive liability for user-generated content is not the right way to fight the sexual exploitation of children, and is a surefire way to discourage encryption and censor an incredible amount of constitutionally protected speech. We urge members of the Senate Judiciary Committee to reject this flawed bill.