Content Moderation

CDT has worked on content moderation and related issues, including intermediary liability and the role of artificial intelligence and automated decision-making systems, since its inception. In recent years, as interest in these topics has grown, CDT has continued to lead both domestic and global conversations about the intersection of content moderation with transparency, accountability, and human rights. We collaborate with a variety of partners, including academics and advocates, through work such as the Santa Clara Principles, and we engage directly with tech companies and policymakers. Through these formal and informal engagements, we provide input, shape policy, and push successfully for greater transparency around platform practices that affect user rights. We are also researching the mechanics of content moderation at different platforms and developing educational programs that bring the realities and challenges of content moderation to new audiences.

Amendments to EARN IT Act Can’t Fix the Bill’s Fundamental Flaws

EU Tech Policy Brief: May 2020 Recap

CDT Joins Twitch Safety Advisory Council to Advocate for Users’ Rights, Transparency and Accountability in Content Moderation
