Also authored by CDT’s Rachele Ceraulo
A leading children’s rights coalition has concluded that there should be no generalised ban on encryption in the services children use, because such a ban would do more harm than good to children. It also determined that the international human rights law framework of ‘legality, necessity and proportionality’ should be applied to any interventions that would impact encryption. Both the UN High Commissioner for Human Rights and the European Data Protection Supervisor and Board have concluded that the EU’s proposal on child sexual abuse material fails this test. These conclusions appear in a comprehensive, thoughtful report that the Child Rights International Network (CRIN) and Defend Digital Me released last week. The report sets out principles for an approach to encryption that recognises and respects the full range of children’s rights.
The report draws on a literature review, semi-structured interviews and background conversations with experts working on this topic, and written answers provided in response to a questionnaire. CDT Europe’s Director Iverna McGowan was one of the experts interviewed and is quoted frequently in the report. A key trend identified in the report is a challenge to the idea that technology can be a silver bullet for complex societal problems. Experts interviewed for the report warned against claims that technology is capable of identifying grooming: such technology, unfortunately, “doesn’t exist yet”, and may never exist in a form accurate and protective enough of children to roll out at scale.
The report also highlights the dangers of technology-enabled police surveillance of disadvantaged communities and marginalised children, who are more likely to have negative experiences with policing, including racism. Allowing for large-scale police surveillance could further entrench injustice and contribute to a climate of impunity. This comes in the context of a global trend of “a creep of power for law enforcement and a dilution of checks and balances on that power.”
The report also weighs the arguments surrounding client-side scanning. Client-side scanning involves surveilling the content and behavioural data generated on a device before the data is in transit. The report concludes that client-side scanning is akin to breaking end-to-end encryption and carries all of the same concerns regarding proportionality, as such indiscriminate surveillance would have a chilling effect, with people limiting the way they communicate in the knowledge that they are being monitored.
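To make concrete why client-side scanning is said to sit "before the data is in transit", here is a minimal, hypothetical sketch in Python. It is not any real proposal's implementation: actual schemes use perceptual hashes (such as PhotoDNA) rather than exact cryptographic hashes, and the function and blocklist names here are invented for illustration. The point it shows is structural: the inspection happens on the user's device, against a provider-supplied list, before any encryption is applied.

```python
import hashlib

# Hypothetical, simplified illustration of client-side scanning.
# A provider ships a list of hashes of known flagged content to the
# device; the device checks each attachment against that list BEFORE
# the message is encrypted and sent. Names here are illustrative.
BLOCKLIST = {hashlib.sha256(b"known-flagged-bytes").hexdigest()}

def scan_before_send(attachment: bytes) -> bool:
    """Return True if the attachment matches the blocklist.

    The content is inspected in the clear, on-device, pre-encryption --
    which is why the report treats this as functionally equivalent to
    breaking end-to-end encryption.
    """
    return hashlib.sha256(attachment).hexdigest() in BLOCKLIST

print(scan_before_send(b"known-flagged-bytes"))   # True: flagged before transit
print(scan_before_send(b"private family photo"))  # False
```

Because the check runs on every message from every user regardless of suspicion, the surveillance is indiscriminate by construction, which is the proportionality concern the report raises.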
The report also considers homomorphic encryption. Homomorphic encryption was designed primarily for use in cloud computing, to protect the privacy of user data while still allowing computations to be performed on that data. This is different from brute-force breaking of encryption using super- or quantum-computing power. However, by allowing private user data to be scanned via direct access by servers and their providers, the privacy expectations of users of end-to-end encrypted communication systems are nonetheless broken. The problem persists even if end-user devices could perform homomorphic computations themselves, both because that could exacerbate existing inequalities in users’ access to powerful devices and because human review would inevitably be required in the case of false positives.
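To illustrate what "computing on data while it stays encrypted" means, here is a toy sketch in Python of the textbook Paillier cryptosystem, an additively homomorphic scheme. This is an assumption-laden teaching example, not anything proposed in the report or usable in practice: the primes are far too small to be secure, and real deployments rely on vetted cryptographic libraries.

```python
import math
import random

# Toy Paillier cryptosystem (illustrative only -- these parameters are
# wildly insecure; real keys are thousands of bits long).
p, q = 10007, 10009                 # demo primes, chosen for illustration
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = math.lcm(p - 1, q - 1)

def L(x: int) -> int:
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)  # modular inverse (Python 3.8+)

def encrypt(m: int) -> int:
    r = random.randrange(2, n)       # random blinding factor
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return (L(pow(c, lam, n2)) * mu) % n

# Additive homomorphism: multiplying two ciphertexts yields an
# encryption of the SUM of the plaintexts, so a server can compute
# on data it cannot read.
a, b = encrypt(42), encrypt(58)
total = (a * b) % n2
print(decrypt(total))  # 100
```

Note that the server performing the multiplication never sees 42 or 58. The report’s concern is what happens around this property: if providers are given direct access to run such computations over private messages, or if flagged results must go to human reviewers, the privacy guarantee users expect from end-to-end encryption is broken anyway.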
The use of encryption does not leave authorities without tools to address the spread of CSAM and other harms. For example, user reporting is both an effective and underused tool. CDT has advocated for a more sophisticated system of user reporting to be put in place to better protect children online.
The CRIN and Defend Digital Me report provides powerful illustrative cases detailing how encryption actually works to protect the rights of children. For example, encryption can protect children and young people involved in protests who face unlawful police crackdowns, or a young person who is LGBTQ+ but lives in a country that criminalises homosexuality. Another scenario details a child facing physical abuse at home who wanted to take photos of bruises to share with a trusted friend or other family member; without encryption, and with parental controls in place, this could actually lead to the child’s information being flagged to the abusive parent, with dire consequences for that child.
This important report ultimately concludes that there should be no generalised ban on encryption, as doing so would leave children ‘vulnerable to a wide range of exploitation and abuse’. Importantly, the report also stresses that the international human rights law framework of ‘legality, necessity and proportionality’ must apply to any interventions that would impact encryption, a test that, in the view of both the UN High Commissioner for Human Rights and the European Data Protection Supervisor and Board, the EU’s proposal on child sexual abuse material fails.