Content Moderation

CDT has worked on content moderation and related issues, including intermediary liability and the role of artificial intelligence and automated decision-making systems, since its inception. In recent years, as interest in these topics has grown, CDT has continued to lead both domestic and global conversations about the intersection of content moderation with transparency, accountability, and human rights. We collaborate with a range of partners, including academics and advocates, through efforts such as the Santa Clara Principles, and we engage directly with tech companies and policymakers. Through these formal and informal engagements, we provide input, shape policy, and successfully push for greater transparency around platform practices that affect user rights. We are also researching the mechanics of content moderation across platforms and developing educational opportunities that bring its realities and challenges to new audiences.


How can we apply human rights due diligence to content moderation? Focus on the EU Digital Services Act – Event Summary

Tech Talk: Apple Announcement — Talking Tech W/ Greg Nojeim and Mallory Knodel

Outside Looking In: Approaches to Content Moderation in End-to-End Encrypted Systems
