We now have an authoritative assessment that strong end-to-end encryption (E2EE) enables human rights on social media platforms used by billions of people around the world.
In 2019, Facebook asked Business for Social Responsibility (BSR) to conduct a thorough human rights impact assessment (HRIA) of its plans to roll out end-to-end encryption across all of its messaging services. Last week, after an initial leak, BSR published its report, along with Meta/WhatsApp’s response to the 45 recommendations.
The Center for Democracy & Technology (CDT) welcomes this report from BSR, a leader in human rights impact assessments, and is encouraged by Meta’s response to the recommendations. This huge project has been underway for years because Meta and its related messaging services operate fully “at scale.” It is the first HRIA of its kind to examine the impact of end-to-end encrypted systems on user trust and safety, including children’s rights, under the presumption that the encryption, once implemented, would remain strong and not be weakened. It is also notable that Meta agreed to transparency by openly sharing the results of this voluntary HRIA.
HRIAs are important. We encourage more tech platforms and internet infrastructure companies to conduct HRIAs on their operations, labor and business practices, and user services. Other E2EE services and companies should likewise follow suit with comprehensive HRIAs and audits, from protocol implementation to policy development, whenever they plan to make significant changes to a service, offer a new service, or enter or exit a market where human rights are known to be at risk.
BSR reaffirmed in its assessment that Meta’s overall approach treats end-to-end encryption as fundamental to human rights. The HRIA indicates that encryption protects not only privacy but also free expression and the rights of marginalized communities, such as the LGBTQ+ community. Furthermore, Meta has taken a welcome, strong stance that client-side scanning is fundamentally incompatible with an E2EE messaging service. We came to the same conclusion in the Global Encryption Coalition’s assessment of proposals to weaken encryption, Breaking Encryption Myths, which CDT joined. It is critical that Meta join CDT and other civil liberties groups in making crystal clear to law enforcement agencies (LEAs) and others that client-side scanning is incompatible with end-to-end encryption and would create more opportunities for mass surveillance, as pointed out in the report Bugs in our Pockets.
However, there are additional BSR recommendations, and Meta’s responses to them, that CDT wants to highlight in an effort to hold this massive platform accountable for protecting the human rights and civil liberties of all. We will continue to monitor these issues, which relate to real name policies, child safety, and metadata.
Recommendation 1F says, “Invest in processes to ensure that users who have violated platform policies cannot return.” While recidivism is a problem, CDT takes this opportunity to remind Facebook of the dangers of its real name policy and of increased tracking of individuals, which can be imprecise and have knock-on effects for other users of the same device or home Wi-Fi. It is better to create policies and enforcement that target behavior, not people.
Later, BSR suggests that Messenger Kids ought to implement end-to-end encryption to protect its users, who are children and therefore at risk, yet its recommendation emphasizes that parental controls should remain unchanged (“Only implement end-to-end encryption on Messenger Kids and Instagram for Kids if it is possible to retain the same amount of parental control that is currently available”). CDT is concerned that this could effectively require an encryption backdoor for parents. When Apple announced similar child safety features, they drew significant outcry from human rights advocates. Since Apple has backed off of those plans, we hope Facebook and Instagram will not set the same dangerous precedent of allowing one account to essentially conduct surveillance of another. Instead, existing parental controls should be carefully evaluated, and some of them modified or removed if they would in fact weaken an encrypted version of those same features.
Lastly, it is worth noting that Meta has a metadata problem. It collects too much of it, threatening the privacy of its users. WhatsApp, the Meta-owned messaging service that is already encrypted end-to-end, collects more metadata than other E2EE platforms do. Some of this metadata may not be useful in detecting CSAM and other objectionable content, and need not be collected. As Meta extends E2EE to its other platforms, and as it implements BSR recommendations 2(a) (using metadata analysis to address harms), 3(h) (reporting on problematic activity detected and on accounts suspended), and 3(j) (accounting for uncertainty about the extent to which behavioral signals “prove” a violation of content standards), we recommend that it reduce its collection of metadata that is not useful in detecting objectionable content. Meta should also endeavor to reduce, not expand, its data sharing with LEAs as E2EE technology improves. While it is important that Meta has made broad commitments to trust and safety, including a commitment to LEA collaboration, a careful balance is needed. In general, we hope that Meta continues to work with civil society partners, including the most affected groups and public interest defenders, to strike a balance between metadata collection that puts users at risk and law enforcement access in E2EE environments.
CDT strongly agrees with the HRIA’s finding that E2EE enables human rights. We look forward to working with E2EE platforms and other stakeholders on the guidance and recommendations contained in the HRIA, including how to address challenges for content moderation in E2EE environments as reflected in our recent report, Outside Looking In.