

The New EARN IT Act Still Threatens Encryption and Child Exploitation Prosecutions

This blog post is the second part of a two-part series analyzing the latest version of the EARN IT Act that passed out of the Senate Judiciary Committee. The first post introduced the changes made to the structure of the bill and discussed how those changes fail to alleviate the free expression concerns the bill poses. This post focuses on how the bill continues to threaten strong encryption and potentially violates the Fourth Amendment.

A new version of the widely panned S. 3398, the Eliminating Abusive and Rampant Neglect of Interactive Technologies Act of 2020 (EARN IT), recently passed out of the Senate Judiciary Committee. EARN IT seeks to deal with the scourge of online child exploitation by coercing service providers to more aggressively police such content on their platforms. It’s an important issue to address, but the bill’s approach is problematic. While we agree that more should be done about online child exploitation, we have voiced strong opposition to the EARN IT Act because of the significant risks the bill poses to free expression online, to the ability of providers to offer and consumers to use end-to-end encrypted services, and to prosecutions of those who harm children online. We have not been alone in our opposition.

The day before the committee debate, the authors of the EARN IT Act introduced an amendment that significantly changed the bill but failed to resolve its fundamental flaws. Senator Leahy (D-VT) also introduced and secured an amendment that seeks to prevent EARN IT from precluding a provider from offering encrypted services. With little time for members to consider the changes made to the bill, and on the day before a holiday, the amended version passed out of the Senate Judiciary Committee and now awaits consideration by the full Senate (potentially through unanimous consent, which Senator Wyden (D-OR) has pledged to block, an act we applaud). Despite the changes made, we remain staunchly opposed to the bill. It would still significantly burden the free expression rights of users of interactive computer services, particularly members of marginalized communities; it still jeopardizes the ability of service providers to offer encrypted services; and it creates a new set of Fourth Amendment problems. We discuss the latter two issues in greater detail below.

Encryption Continues to Be a Live Issue

In short, encryption remains under threat in EARN IT because, in order for providers to identify and take down CSAM on their services, the bill requires either that a third party report such content or that the provider be able to view each piece of content passing through its service. This expanded liability for CSAM effectively deters a provider from offering strong end-to-end encryption (e2ee), since it could be held responsible for content it cannot see on its platform.

By contrast, the Leahy amendment was intended to provide an encryption safe harbor, deliberately creating a carveout in the bill that seemingly would allow providers to offer strong encryption on their services without facing liability for doing so. The amendment states that a provider may not be held liable under state or federal law because it “utilizes full end-to-end encrypted messaging services, device encryption, or other encryption services”; “does not possess the information necessary to decrypt a communication”; or “fails to take an action that would otherwise undermine the ability of the provider to offer full end-to-end encrypted messaging services, device encryption, or other encryption services.”

Unfortunately, while we applaud Senator Leahy’s intentions, this amendment still leaves social media providers, and thus encryption, vulnerable on multiple grounds.

EARN IT Act Structure

The first, and most insurmountable, issue is that under the new structure there is no longer any constructive way for companies to “earn” the liability shield provided by Section 230 when it comes to CSAM. Consequently, companies that choose to use strong e2ee on their products would still be susceptible to lengthy and expensive court proceedings to prove that they were not liable for content they could not have known about because of their choice to protect their users.

Further exacerbating these concerns, the law enforcement-heavy commission from the original EARN IT Act still exists, now in an advisory capacity to produce recommendations. Given the commission’s makeup, it is still very likely that these recommendations could include provisions for exceptional access, whether or not they explicitly mention e2ee. Such a recommendation could influence judges’ rulings in the lawsuits, or be used to apply reputational pressure to companies. Some large and well-resourced companies may be capable of absorbing these litigation and reputational risks in service of their users’ security, but for most it will be easier to simply forgo e2ee altogether, further weakening security across the internet. It is unlikely that any amendment that could plausibly pass Congress will fix the threat to encryption posed by the new version of EARN IT; the structure of the bill itself is a disincentive to building e2ee systems, leaving users of technology more vulnerable to cyber threats.

Confidentiality

Another threat is that companies could be held liable for failing to implement technologies that undermine the confidentiality guarantees e2ee systems are intended to confer. Such compulsory technologies might include client-side scanning, where the system takes a “fingerprint” (also called a hash) of each of a user’s images and videos before they are sent as messages, to compare against a database of known CSAM. If the hash matches something in the CSAM database, the system prevents the message from being sent (and could also report that message to some authority). Theoretically, a system that uses client-side scanning could still send messages encrypted end to end, so the Leahy amendment would offer no protection, yet many of the same confidentiality concerns as with backdoored “e2ee” systems would continue to apply.
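As a rough illustration, the fingerprint-matching step described above can be sketched in a few lines of Python. This is a deliberately simplified, hypothetical example: real systems use a perceptual hash such as PhotoDNA (which is proprietary and tolerant of small image changes), whereas the cryptographic SHA-256 hash used here only matches byte-identical files. All names and data in the sketch are invented for illustration.

```python
import hashlib

# Hypothetical database of fingerprints of known prohibited images.
# In a real deployment this would hold perceptual hashes (e.g. PhotoDNA),
# not SHA-256 digests, and would live largely outside the device.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint (hash) of an outgoing image."""
    return hashlib.sha256(image_bytes).hexdigest()

def screen_before_send(image_bytes: bytes) -> bool:
    """Return True if the message may be sent, False if it is blocked.

    A production system might also report blocked matches to an authority,
    which is exactly the confidentiality concern discussed above.
    """
    return fingerprint(image_bytes) not in KNOWN_HASHES
```

Note that even in this toy version, the interesting question is where `KNOWN_HASHES` lives and who updates it; as the next sections discuss, in practice that database cannot reside entirely on the user’s device.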

It is impossible to guarantee that a scanning system will only ever be used to detect CSAM, and not expanded to attempt to detect terrorist or other extremist content. Additionally, client-side scanning still runs afoul of the fundamental principle of encryption: that only the people involved in the conversation can access the content of the message. That’s because in this case, “client-side” is a misnomer.

“Client”-side

“Client-side” generally means that all the data involved stays on the user’s device, i.e. the client’s side. But in the case of CSAM scanning, it is not possible to do the work entirely on the client side; the process will likely involve computation and/or sharing of metadata outside of the device, such as with a server or database. There are two main reasons for that:

  • First, the databases of known CSAM fingerprints are too large to be stored on user devices like smartphones, even in compressed form.
  • Second, the primary algorithm used for scanning images and videos is PhotoDNA, which has not been publicly released (raising a whole other set of abuse concerns), most likely because it is fragile and thus susceptible to bad actors reverse engineering and circumventing it. It is therefore likely not possible to deploy PhotoDNA on consumer devices without risking that it will become ineffective everywhere it is deployed, or creating a slew of additional security concerns.

Ultimately, the Leahy amendment is a powerful acknowledgement of the importance of end-to-end encryption, but the structure of the EARN IT act is intrinsically antithetical to a robust e2ee ecosystem. Companies will be disincentivized from building and maintaining e2ee systems out of fear of lawsuits, and will still likely be forced to handle their users’ messages in ways those users would consider a violation of trust and confidentiality.

New Fourth Amendment State Actor Problems

In addition to its myriad of encryption problems, the EARN IT Act transforms social media providers into “agents of the government” for purposes of the Fourth Amendment, with potentially disastrous consequences for CSAM prosecutions. 

A private entity can be transformed into an agent of the government “[w]hen a statute or regulation compels a private party to conduct a search” (Skinner v. Railway Labor Executives’ Association, 489 U.S. 602, 614 (1989)). When the private party is not directly instructed by the government to conduct a search, the court assesses the context for evidence of “the Government’s encouragement, endorsement, and participation” in the search. Id. “Even when a search is not required by law, however, if a statute or regulation so strongly encourages a private party to conduct a search that the search is not ‘primarily the result of private initiative,’ then the Fourth Amendment applies” (United States v. Stevenson, 727 F.3d 826, 829 (8th Cir. 2013)).

Such transformation is legally significant: if a private party becomes an agent of the government who conducts searches for the government, the Fourth Amendment applies to those searches. When a search is conducted without a warrant required by the Fourth Amendment, the information seized in that unconstitutional search is properly excluded from evidence, and when that information is the basis for a CSAM prosecution, the prosecution can be dismissed. And in the case of EARN IT, any mandate for a company to search through a consumer’s content would violate the warrant requirement, and the results would not be usable in court.

Currently, providers voluntarily filter to identify CSAM or alleged CSAM on their services. When they identify what they believe is CSAM, they report it to the National Center for Missing and Exploited Children (NCMEC), which then passes it on to law enforcement. The initial search is done voluntarily, and the private search doctrine affords providers the leeway they need to execute it. They are obliged to report CSAM to NCMEC when they have knowledge of it, but they are not obliged to search for it. Courts consistently reject defendants’ challenges to these warrantless, voluntary searches by providers. Courts have repeatedly held that “[a] reporting requirement, standing alone,” does not transform a provider into a state actor, and that internet service providers act to further their own business interests when conducting these searches.

When legislating in this space, Congress must be very careful not to hand defendants an opportunity to exclude evidence of their abuse of children. Indeed, past Congressional action resulted in the Tenth Circuit’s determination that NCMEC itself, a nonprofit to which Congress granted certain privileges, was either a government entity or an agent of the government (United States v. Ackerman, 831 F.3d 1302 (10th Cir. 2016)). The EARN IT Act, as amended by the Senate Judiciary Committee, subjects providers to state laws that have the practical effect of compelling them to conduct searches for CSAM on their services, turning them into state actors for purposes of the Fourth Amendment.

EARN IT subjects social media providers to civil and criminal liability under state CSAM laws and does not require these laws to match federal law. In other words, though federal law would hold providers responsible only for “knowingly” facilitating the distribution of CSAM, state equivalents could hold them liable for “recklessly” or “negligently” facilitating distribution. There are dozens of state CSAM laws, some of which rely on a “reckless” standard, and others of which use a “knowledge” standard but define knowledge in a way that actually means reckless disregard. For example, under Florida statute 847.0137, a person is criminally responsible if they knowingly transmit child pornography. “Knowingly” is defined broadly as “having the general knowledge of, reason to know, or a belief or ground for belief which warrants further inspection or inquiry of both: (a) The character and content of any material described in this section which is reasonably susceptible of examination by the defendant; and (b) The age of the minor.”

Having a “ground to believe” something is not “knowing” it. Opening a social media platform up to liability when it merely has a ground to believe that child pornography is carried on the platform may “so strongly encourage[]” the provider to search for it that the search is no longer “primarily the result of private initiative” (United States v. Stevenson). Indeed, pressuring providers to conduct such searches seems to be the very purpose of lifting a provider’s liability shield. Such a change would give purveyors of CSAM a colorable argument that the provider’s search for CSAM is an unconstitutional warrantless search conducted by an agent of the government, the fruits of which must be suppressed.

Additionally, at least two state laws impose criminal liability on providers who fail to inspect the content shared on their platforms for obscene materials. In Illinois, “[a] person commits obscenity when, with knowledge of the nature or content thereof, or recklessly failing to exercise reasonable inspection which would have disclosed the nature or content thereof” they “publish or make available anything obscene.” And in South Carolina, “[i]t is unlawful for any person knowingly to disseminate obscenity,” where “knowingly” is defined as “having general knowledge of the content of the subject material or performance, or failing after reasonable opportunity to exercise reasonable inspection which would have disclosed the character of the material or performance.” By removing the liability shield that providers now enjoy, the EARN IT Act would effectively compel providers to inspect their users’ content, coercing them to conduct on behalf of the government the very warrantless searches that the Fourth Amendment proscribes.

***

The EARN IT Act continues to be plagued with significant free expression, encryption, and Fourth Amendment problems. We urge Congress to reject this proposal. Much can and should be done to protect children online, and we encourage members to turn instead to the ideas put forth in Senator Wyden’s legislation, the Invest in Child Safety Act. This legislation would protect children online by directing more resources toward prosecution of cases.