Content Moderation

CDT has worked on content moderation and related issues, including intermediary liability and the role of artificial intelligence and automated decision-making systems, since its inception. In recent years, as interest in these topics has grown, CDT has continued to lead both domestic and global conversations about the intersection of content moderation and transparency, accountability, and human rights. We collaborate with a variety of partners, including academics and advocates, through efforts such as the Santa Clara Principles. We also engage directly with tech companies and policymakers. Through these formal and informal engagements, we provide input, shape policy, and push for greater transparency around platform practices that affect user rights. We are also conducting research on the mechanics of content moderation at different platforms, and developing educational opportunities that bring the realities and challenges of content moderation to a range of audiences.

Section 230 and Competition Policy are Inextricably Intertwined

CDT Urges Oversight Board to Hold Political Leaders to Higher Standards on Incitement to Violence

CDT’s Comments to Facebook Oversight Board on 2021-001-FB-FBR (Case Regarding Suspension of Trump’s Account)