In 2020, the public, policymakers, and researchers increasingly focused on the harms flowing from misinformation, disinformation, and the undermining of facts about key topics in our society — particularly concerning the COVID-19 pandemic and national elections in the U.S.
This year, a foundational grant from the Knight Foundation allowed us to build an in-house team at CDT to conduct original research on the future of digital discourse. In September, this new research team brought together over 30 experts from a variety of disciplines to focus specifically on the intersection of disinformation, race, and gender. We identified high-stakes, unresolved research questions about the impacts of online disinformation on women, LGBTQIA+ communities, people of color, and other voices that are less prominent in mainstream U.S. political discourse; these questions will inform the development of CDT’s future research and advocacy agenda.
In the run-up to November’s U.S. presidential election, online disinformation and conspiracy campaigns drew scrutiny to social media’s role in public discourse. We called on platforms to increase transparency about their moderation practices and to expand third-party researchers’ access, so that disinformation campaigns and platforms’ responses to them can be better identified and assessed. We also provided training for state and local election officials on how to counter disinformation, and produced our own public service announcements about disinformation and the security of mail-in voting.
In June, we filed a lawsuit against President Trump’s unconstitutional “Executive Order on Preventing Online Censorship,” which threatened to revoke liability protections for social media services in order to deter them from fighting misinformation, voter suppression, and the stoking of violence on their platforms. We also led a coalition of civil rights and technology groups in opposing the Online Content Policy Modernization Act, which would interfere with social media services’ ability to combat the spread of mis- and disinformation on their sites.
In Europe, our team weighed in on the development of the European Commission’s Democracy Action Plan, which focused in part on how European societies can handle the challenge posed by disinformation. We advocated for transparency as a key element enabling watchdogs, such as public authorities mandated to uphold electoral law and enforce electoral safeguards. We also stressed the importance of enforcing the General Data Protection Regulation (GDPR) to prevent the micro-targeting of individual voters, and cautioned that any new regulations on online campaigning should not unduly restrict civic space.
As social media platforms took steps to promote reliable information around the COVID-19 pandemic, countering disinformation became important not only for the health of democracies, but also for the physical health of people worldwide. The problem of mis- and disinformation is a growing challenge around the globe, and we expect this work to continue long into the future.