Content Moderation

CDT has worked on content moderation and related issues, including intermediary liability and the role of artificial intelligence and automated decision-making systems, since its inception. In recent years, as interest in these topics has grown, CDT has continued to be a leader in both domestic and global conversations about the intersection of content moderation with transparency, accountability, and human rights. We collaborate with a variety of partners, including academics and advocates, through work such as the Santa Clara Principles. We also engage directly with tech companies and policymakers; through these formal and informal engagements, we provide input, shape policy, and successfully push for greater transparency around platform practices that affect user rights. We are also researching the mechanics of content moderation at different platforms and developing educational programming that brings the realities and challenges of content moderation to a range of audiences.

The Thorny Problem of Content Moderation and Bias

Three Lessons in Content Moderation from New Zealand and Other High-Profile Tragedies

Companies Finally Shine a Light into Content Moderation Practices
