(WASHINGTON) – A coalition of more than 90 U.S. and international organizations dedicated to civil rights, digital rights, and human rights today sent a letter to Apple CEO Tim Cook urging Apple to abandon its plans to build surveillance capabilities into its iPhones, iPads, and other products.
“Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material (CSAM), we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,” said the letter, whose signatories include the Center for Democracy & Technology (CDT), the ACLU, PEN America, the Electronic Frontier Foundation (EFF), Access Now, Privacy International, Derechos Digitales, Global Voices, Global Partners Digital, and groups from across Europe, Latin America, Africa, Asia, and Australia.
The letter noted that the algorithms Apple intends to deploy to detect explicit images in Messages are “notoriously unreliable…and prone to mistakenly flag art, health information, educational resources, advocacy messages, and other imagery.”
“Children’s rights to send and receive such information are protected in the U.N. Convention on the Rights of the Child. Moreover, the system Apple has developed assumes that the ‘parent’ and ‘child’ accounts involved actually belong to an adult who is the parent of a child, and that those individuals have a healthy relationship. This may not always be the case; an abusive adult may be the organiser of the account, and the consequences of parental notification could threaten the child’s safety and wellbeing. LGBTQ+ youths on family accounts with unsympathetic parents are particularly at risk.”
As a result of Apple’s changes, “iMessages will no longer provide confidentiality and privacy to… users through an end-to-end encrypted messaging system in which only the sender and intended recipients have access to the information sent. Once this backdoor feature is built in, governments could compel Apple to extend notification to other accounts, and to detect images that are objectionable for reasons other than being sexually explicit,” the letter said.
Apple is also introducing a second capability that will scan every photo that users upload to iCloud against a hash database of CSAM images provided by child safety organizations and, when a threshold number of matches is met, disable the account and report the user and those images to authorities.
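Stripped of its cryptographic protections (Apple’s actual design relies on a perceptual “NeuralHash” and private set intersection, neither of which is reproduced here), the match-and-threshold logic described above can be sketched in a few lines of Python. The hash function, database contents, and threshold value below are illustrative assumptions, not Apple’s implementation:

```python
import hashlib

def image_hash(data: bytes) -> str:
    """Stand-in for a perceptual image hash.

    SHA-256 is used purely for illustration; a real system would use a
    perceptual hash so that near-duplicate images still match.
    """
    return hashlib.sha256(data).hexdigest()

def scan_uploads(photos, known_hashes, threshold):
    """Compare each uploaded photo's hash against a database of known
    hashes and report whether the account crosses the match threshold.

    Returns (account_flagged, matching_photos).
    """
    matches = [p for p in photos if image_hash(p) in known_hashes]
    return len(matches) >= threshold, matches
```

In this simplified sketch, an account would be flagged (and, per Apple’s announcement, disabled and reported) only once the number of matches reaches the threshold; below it, individual matches produce no report. The letter’s concern is that nothing in this architecture limits `known_hashes` to CSAM: whoever supplies the database controls what gets flagged.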
Sharon Bradford Franklin, Co-Director of the CDT Security & Surveillance Project, said the coalition is concerned that Apple will face pressure from governments around the globe to scan for images other than CSAM that they find objectionable.
“We can expect governments will take advantage of the surveillance capability Apple is building into iPhones, iPads and computers. They will demand that Apple scan for and block images of human rights abuses, political protests, and other content that should be protected as free expression, which forms the backbone of a free and democratic society,” she said.
The letter concludes:
“We support efforts to protect children and stand firmly against the proliferation of CSAM. But the changes that Apple has announced put children and its other users at risk both now and in the future. We urge Apple to abandon those changes and to reaffirm the company’s commitment to protecting its users with end-to-end encryption.”
CDT is a 25-year-old 501(c)(3) nonpartisan nonprofit that works to strengthen individual rights and freedoms by defining, promoting, and influencing technology policy and the architecture of the internet that impacts our daily lives.