
2023 Annual Report: Supporting a Vibrant Digital Public Square

Graphic for feature story in CDT’s 2023 annual report, focused on online speech: a man and a woman seated facing a computer screen, with rows of other people seated at computer screens in the background.

At a time when the landscape for online trust and safety is rapidly shifting, civil society organizations, scholars, policymakers, businesses, and members of the public are more loudly voicing concerns about various aspects of online life — including the prevalence of hate speech and harassment, the effects of mis- and disinformation, the impact of social media on kids and teens, and the ability of governments and companies to track so many aspects of our digital lives. As regulators respond, CDT is working to protect people’s rights to express themselves, access information, and find community, while increasing users’ privacy and sense of safety, agency, and control.

2023 was a blockbuster year for online speech in the U.S. courts, and CDT weighed in with friend-of-the-court briefs to articulate the public interests at stake. During oral argument at the Supreme Court in Gonzalez v. Google, CDT’s brief was cited by name for the detailed history it provided of “recommendation algorithms” in content moderation. In that case, CDT argued that the Court should avoid a broad ruling making social media companies responsible for all “recommendations,” since online service providers that receive, sort, and display millions of uploads in a single day inevitably must organize information in ways that could be swept into such a rule.

Later in the year, the Supreme Court took up Moody v. NetChoice and NetChoice v. Paxton, cases challenging state laws from Florida and Texas, respectively, that would require companies to host content they otherwise would not, potentially including hate speech, election disinformation, and more. As we had in the lower courts, CDT advised the Court to find that Florida’s and Texas’ laws impermissibly infringed on platforms’ First Amendment right to exercise editorial judgment in content moderation. We were clear about the stakes: if these laws took effect, platforms would likely scale back their voluntary moderation of content like racist invective or public health misinformation, and might ban discussion of some controversial topics entirely to keep their moderation choices from becoming the subject of litigation. In our view, platforms need flexibility to moderate content so that users can choose among online environments that work for them.


The Supreme Court took up a further major speech case in Murthy v. Missouri, where CDT argued that government actors should be able to share information about online threats with social media companies to support a healthier and more reliable information environment, but that the Court should provide clear guidance to the government to ensure its information-sharing does not cross the line into unconstitutional coercion of social media services to censor protected speech.

A key issue in 2023 was children’s online safety. Legislators proposed numerous bills, including the U.S. Senate’s Kids Online Safety Act (KOSA), the Protecting Kids on Social Media Act, and the STOP CSAM Act. We flagged that these bills could limit young people’s agency and prevent them from participating in important, nuanced discussions about what information and venues are appropriate for them. With that priority in mind, we fought government efforts to exert undue, and likely unconstitutional, influence that could limit youth engagement with important topics like reproductive care, racial justice, and LGBTQ+ issues. We also fought to ensure that efforts to protect kids from unwanted content did not undermine other essential online protections, such as everyone’s ability to use encrypted messaging services that safeguard users’ privacy and security.

One major takeaway from CDT research in 2023 was that, rather than face content restrictions or open up their private messages to greater surveillance, young people want to be empowered with better tools to protect themselves. Drawing on diary studies and interviews with U.S. teenagers and young adults, an original CDT research report, More Tools, More Control, described several tools platforms should offer to help users better assess and address unwanted interactions.

In other CDT research and advocacy, we focused on the growing use of student activity monitoring and content filtering in K-12 schools. In a nationwide survey, we found that 50% of teachers think content filtering and blocking software stifles students’ growth, and 66% knew students who got in trouble as a result of AI-powered student activity monitoring. A shocking 38% of teachers reported knowing a student who had been contacted by law enforcement because of monitoring of their online activities. Much as we encouraged regulators to consider the full spectrum of rights and needs of the young people they intend to protect, we urged schools to approach the procurement and implementation of safety technologies with students’ privacy, civil rights, and civil liberties as top priorities.

In the online advertising ecosystem, too, children and adults alike face ubiquitous tracking and surveillance. Regulatory and public pressure in the U.S. and EU, combined with emerging privacy-protective changes to advertising’s technical infrastructure, creates a ripe moment to imagine a new system. CDT has a vision and a plan for this new era: in 2023, we launched our Future of Online Advertising Project, through which we’ll work toward a competitive online advertising ecosystem that respects human rights, supports independent media, and enables content creation and availability. Going into 2024, we look forward to engaging with civil society, academia, and the ad industry on diagnosing the friction points that have slowed privacy-forward solutions for online advertising, evaluating new proposals, and charting a proactive agenda for advertising solutions that advance human rights and democracy.