Today, the Center for Democracy & Technology (CDT) and a coalition of civil society and academic representatives are releasing “The Santa Clara Principles”, a statement outlining minimum standards for the information that tech companies should share about their content moderation practices. CDT and the coalition urge companies to adopt these principles and offer meaningful transparency when reporting on content removals, account suspensions, appeals, and other practices that impact users’ free expression.
The Santa Clara Principles provide guidance for reporting on three categories of information:
- Numbers (of posts removed, accounts suspended);
- Notice (to users about content removals and account suspensions); and
- Appeals (for users impacted by content removals or account suspensions).
The principles came out of the February Content Moderation and Removal at Scale conference at Santa Clara University School of Law, and are published alongside today’s Content Moderation at Scale (COMO) summit in DC.
The following statement can be attributed to Emma Llansó, Director of CDT’s Free Expression Project:
“For years, CDT and many other advocates and NGOs around the world have called for online content platforms to be more accountable and transparent about how they develop and enforce their content policies. The decisions they make have major consequences for individuals’ rights to freedom of expression and everyone’s ability to access information.
“We’ve seen some positive developments from several platforms, including publishing hard numbers about content enforcement decisions, giving users better notice about why their speech has been restricted, and providing better opportunities to appeal content takedowns. But there’s still a lot more to be done.
“As policymakers around the world focus on how online platforms shape our information environments, it’s crucial to know more about how content moderation happens in practice. We hope that the Santa Clara Principles can prompt broader discussions among advocacy groups, experts, and platforms across the globe and lead to more transparent and accountable content moderation.”