2022 Annual Report: Promoting Platform Governance to Support Users’ Rights

Graphic for feature story in CDT's 2022 annual report, focused on platform governance and users' rights. Left to right: two people looking at a computer; illustration of a computer mouse clicking through an interface and into code; a girl in a hoodie at a computer.

As governments around the world dial up scrutiny, the legal and ethical responsibilities tech companies bear are increasing. As a core part of our work, CDT pushes the companies that own and govern online platforms to implement policies that protect democratic values. We urge them to assess the significant impacts their decisions may have on human rights, to put transparent and rights-respecting processes in place, and to more meaningfully engage with communities affected by their services. Before policymakers and the courts, we advocate for legal frameworks that respect users’ rights and enable them to access a wide variety of platforms for their speech.

To promote this vision of equitable platform governance, we engage directly with companies, offer information on best practices, share our research on trends across platforms, and suggest the best ways to address varying online harms.

In 2022, CDT produced original research on how platforms work in practice. Examining the opaque content moderation practice known as “shadowbanning,” we found that it significantly colors users’ perceptions of fair participation in online forums. Our nationally representative survey indicated that 1 in 10 social media users reported being shadowbanned. We called on platforms to be transparent about the circumstances in which they use shadowbanning, and to provide researchers with data to help understand user experiences and identify any harms that arise from the practice.

Our research team also worked with social media users to examine what aspects of algorithmically recommended content would benefit from greater transparency and how platforms could address those priorities. Turning research into action, we developed several design prototypes that platforms could use to make key social media experiences more transparent to users. 

CDT is especially focused on ensuring equitable participation in online platforms and fighting online harassment and abuse. Our original research examining the online abuse of women of color running for political office found that they were at least four times as likely as white candidates to be targeted with violent online abuse. We also began new research on how automated content moderation systems fall short in languages other than English — an issue that carries significant consequences both in the U.S. and abroad.

Because research is key to understanding how platforms’ design and governance choices affect society, CDT advocates strongly for platforms to make data more readily available to outside researchers. We have been at the forefront of discussions on how to achieve this goal while also protecting user privacy — helping policymakers to understand researchers’ needs, engaging directly with companies about their efforts to increase transparency and researcher access, and analyzing how effectively legislative proposals in the U.S. and EU address the issue. As the EU implements new requirements for providing researchers with data access under its Digital Services Act, and the U.S. considers legislation, CDT will continue to help guide governments and companies.

In 2022, several court cases threatened to upend years of work by CDT and other advocates to improve social media companies’ responses to harassment, disinformation, and other undesirable content on their services. In Gonzalez v. Google, Twitter v. Taamneh, NetChoice v. Paxton, and NetChoice v. Moody, CDT repeatedly urged U.S. courts to uphold constitutional and statutory protections for online services to make editorial judgments about what user-generated content they will host, which enables them to moderate without fear of liability.

Again and again, CDT urged platforms and governments to protect marginalized communities. We partnered with over 100 other advocacy organizations to fight the Kids’ Online Safety Act, which would effectively instruct online services to limit minors’ access to content through overzealous takedowns and imprecise content filtering tools — with disproportionate effects on LGBTQ+ youth. We also joined with numerous other civil society organizations to continue the battle against the EARN IT Act, which would cause online intermediaries to over-remove even lawful content and disincentivize them from offering encrypted services, to the detriment of all internet users.

Protecting internet users is core to CDT’s mission. In 2022, we were a consistent, trusted, and outspoken voice as the year presented numerous high-stakes challenges to users’ free expression rights.