Cybersecurity & Standards, European Policy, Free Expression, Government Surveillance, Privacy & Data

To the UK: An Encrypted System That Detects Content Isn’t End-to-End Encrypted

The UK Safety Tech Challenge Fund is funding five different projects, each with the goal of developing tools for preventing and detecting child sexual abuse material (CSAM) within end-to-end encrypted (E2EE) environments. REPHRAIN, a third-party auditor, is conducting an independent evaluation of those tools against that goal, using specific criteria to determine their success. REPHRAIN recently invited public comments on its evaluation criteria, structured around four questions: 1) “What did you like about the evaluation criteria?; 2) what is missing from the evaluation criteria and why?; 3) should anything be removed from the evaluation criteria and why?; and 4) how could the evaluation criteria be improved?”

The Center for Democracy & Technology (CDT) responded to the consultation, highlighting both the gaps in the evaluation and the faulty assumptions underlying the framework of the project. We raised two critical points: 1) the evaluation does not consider how to safeguard children — and other intersectionally marginalized groups — within end-to-end encrypted environments; and 2) the evaluation falsely assumes that CSAM prevention and detection systems are compatible with end-to-end encryption. We strongly urge REPHRAIN to reconsider the premises of the evaluation criteria to include a technically accurate understanding of end-to-end encryption.

Of course, we welcome any and all people- and public interest-centered discussions about encryption, and about the ways in which improving end-to-end encrypted platforms strengthens their privacy and confidentiality guarantees. It is unacceptable, however, for an evaluation focused on human-centered privacy and security not to state clearly the disproportionate risks that insecure online environments pose to children and youth, and to people marginalised along the lines of socioeconomic status, sexual identity, race, and gender.

Further, the assessment must be technically accurate. We noted that REPHRAIN’s evaluation framework incorrectly describes CSAM prevention and detection systems as operating “within end-to-end encryption,” phrasing that assumes end-to-end encryption is compatible with these technologies. Under the technical community’s definition of end-to-end encryption, content detection is incompatible with end-to-end encryption, particularly with respect to the confidentiality and integrity guarantees of such environments. This makes the premise of the Safety Tech Challenge Fund, which is to award “innovative ways in which sexually explicit images or videos of children can be detected and addressed within end-to-end encrypted environments,” an impossibility. Any future documentation of this evaluation should expressly acknowledge this inaccuracy. Furthermore, the evaluation should reframe the “UK Safety Tech Challenge” as narrowly addressing content moderation on social media platforms, which is what each project does.
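The incompatibility can be made concrete with a minimal sketch. The toy cipher below is purely illustrative (a one-time XOR pad, not a real protocol, and not secure for reuse); the point it demonstrates is structural: in an end-to-end encrypted system only the endpoints hold the key, so a server in the middle relays ciphertext it cannot inspect, and any content-detection function running there has nothing meaningful to examine.

```python
# Illustrative sketch only: a toy XOR "cipher" standing in for real E2EE.
# The structural point: the server never holds the key, so it cannot run
# any content matcher over the plaintext.
import secrets

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR each byte with the key; with a fresh random key per message,
    # this mimics the property that ciphertext reveals nothing.
    return bytes(p ^ k for p, k in zip(plaintext, key))

toy_decrypt = toy_encrypt  # XOR is its own inverse

def server_side_detection(ciphertext: bytes) -> bool:
    # The server sees only ciphertext. Without the key, no detection
    # logic can recover or match the underlying content.
    return False

message = b"any content at all"
key = secrets.token_bytes(len(message))  # held only by the endpoints

ciphertext = toy_encrypt(key, message)
assert ciphertext != message                      # server sees gibberish
assert server_side_detection(ciphertext) is False # nothing to detect
assert toy_decrypt(key, ciphertext) == message    # endpoints recover content
```

Proposals to detect content anyway must therefore move the scanning onto an endpoint before encryption or after decryption (client-side scanning), which is precisely the step that breaks the end-to-end confidentiality guarantee the system claims to provide.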

We suggest that the evaluation documentation shift to cover methods for moderating CSAM in general, and omit the false premise that CSAM prevention and detection can be done in end-to-end encrypted environments. Each of the five projects could still be evaluated against the questions already outlined in REPHRAIN’s criteria. We also recommend adding one evaluation criterion comprising two questions: 1) in a system that uses end-to-end encryption, would the project’s tool be able to prevent and detect CSAM content?; 2) if any such system were designed around this project’s tool, would that system fit the industry-standard definition of an end-to-end encrypted system?

REPHRAIN raises an important discussion about improving end-to-end encrypted platforms to better guarantee their privacy and confidentiality. CDT strongly encourages such public-centered discussions, and urges REPHRAIN to make our proposed updates.

To review, we suggest explicitly recognizing the benefits that end-to-end encryption brings to marginalized groups, specifically children; correcting the definitions of end-to-end encryption to ensure the technical accuracy of the assessment; and making content moderation the main focus of future evaluation documentation, thus removing the false assumption that content detection could be used in end-to-end encrypted environments. A reconsideration along these lines would yield a stronger, more technically and socially informed framework for the projects under REPHRAIN.