CDT Joins Letter Urging Social Media Platforms to Take Concrete Actions on Content Moderation Policies in Response to Violent Conflicts
CDT joined 30 civil society organizations from around the world in an open letter calling on social media platforms to take concrete actions on their content moderation and other policies in response to violent conflicts.
The response of online platforms to the renewed Russian invasion of Ukraine reveals flaws in how these companies operate globally, including in other countries experiencing war or armed conflict. The statement calls on social media platforms to take seven steps to address structural inequalities in how they treat different countries, markets, and regions:
- Real human rights due diligence: Platforms should engage in ongoing and meaningful human rights due diligence globally, prioritizing for immediate review their operations in those countries and regions whose inhabitants are at risk of mass killings or grave human rights violations.
- Equitable investment: Platform investments in policy, safety, and integrity must be determined by the level of risk their operations pose to human rights, not just by the commercial value of a particular country or whether it is a jurisdiction with enforceable regulatory powers.
- Meaningful engagement: Platforms must build meaningful relationships with civil society globally that are based not merely on extracting information to improve products, but that also provide civil society with genuine opportunities to shape platform tools and policies.
- Linguistic equity in content moderation: Platforms must hire adequate numbers of content moderators and staff for every language in which they provide services, fully translate all of their policies into all the languages in which they operate, invest in developing more accurate automated content moderation tools for languages other than English, and limit fully automated moderation to situations in which they have high confidence in the accuracy of those systems.
- Increased transparency: Platforms should increase transparency and accountability in their content moderation practices. The Santa Clara Principles, which were updated and elaborated in 2021, provide concrete guidance for doing so.
- Clarity about so-called “Terrorist and Violent Extremist Content” (TVEC): Platforms should be fully transparent regarding any content guidelines or rules related to the classification and moderation of “terrorism” and “extremism,” including how they define TVEC, exceptions to those rules, and how the company determines when to make such exceptions. Platforms should push back against attempts by governments to use the TVEC label to silence dissent and independent reporting, and should be clear about how their TVEC policies relate to other policies, such as those on incitement to violence.
- Multi-Stakeholder Debriefs: When platforms take extraordinary actions or are forced to mount a “surge response” to emergencies, they must also take stock afterwards to evaluate and share what they have learned.