Transparency & Accountability

Content moderation does not and cannot work in a vacuum. Without transparent rules for which online platforms can be held accountable, many communities that host user-generated content would simply fail. CDT works to ensure that policies are accessible to users, that companies and governments protect and respect user rights, and that platforms offer meaningful opportunities for appeal and redress. Along with partners in civil society and academia, we drafted the Santa Clara Principles on Transparency and Accountability in Content Moderation, a set of best practices for online platforms that host user-generated content. In the short time since they were published, the Santa Clara Principles, along with CDT's larger body of related work, have become central to conversations about content moderation on the Hill, in Silicon Valley, and in Europe.

Recent Content

EU Tech Policy Brief: April 2025

Civil Society Responds to DSA Risk Assessment Reports: An Initial Feedback Brief

The Kids Are Online: Research-Driven Insights on Child Safety Policy

First Amendment Tech Transparency Roadmap

EU Tech Policy Brief: January 2025

CDT Europe Response to the Consultation on Data Access in the DSA
