Seismic Shifts: How Economic, Technological, and Political Trends are Challenging Independent Counter-Election-Disinformation Initiatives in the United States

Graphic for CDT report, entitled “Seismic Shifts: How Economic, Technological, and Political Trends are Challenging Independent Counter-Election-Disinformation Initiatives in the United States.” A menacing crack in the ground separates a voter from a voting booth.

This report is co-authored by Dean W. Jackson and Danielle Dougall.

Executive Summary

Efforts to protect the integrity of information during elections are threatened by seismic economic, technological, and political shifts.

  • Tech sector downsizing has shrunk trust and safety teams, while new platforms and new technologies like generative AI are making counter-disinformation work more challenging.
  • A coordinated political assault on election integrity threatens the capacity and, in some cases, the safety of independent counter-disinformation researchers and advocates—those who are unaffiliated with either platforms or government.
  • This report draws on interviews with 31 individuals (including current and former tech company employees and representatives from independent research and advocacy initiatives) about their experience responding to election disinformation and the growing challenges they face.

The election integrity initiatives examined for this report reflect a variety of approaches.

  • Some consider themselves primarily researchers, while others do research to inform advocacy—though almost all include an element of rapid response through methods like counter-messaging to voters or cooperation with election officials.
  • Their interaction with government agencies varies from routine meetings to strict policies of non-communication.
  • Their relationships with social media platforms similarly range from formal partnerships to distanced criticism.

For independent counter-election-disinformation initiatives, partnerships with platforms provide important benefits but also raise concerns about sustainability and extractive labor.

  • Initiatives sometimes bring cultural and linguistic fluency that platform staff lack, and they can track harmful narratives that might otherwise go unnoticed or unaddressed by platforms.
  • Platform staff are also keenly aware that input from outside experts helps legitimize decisions about content and integrity.
  • But some independent professionals are wary of providing “free labor to multi-billion dollar corporations,” calling it an “extractive” but “unfortunately necessary way to reduce harm.”

The 2020 election, tech sector layoffs, and other recent events have called into question the ability of counter-election-disinformation initiatives to influence platform content moderation.

  • Interview subjects reported that platforms were frustratingly inconsistent and sometimes unresponsive even before widespread tech sector layoffs; the situation is worse now.
  • Likewise, independent researchers have limited insight into digital threats and trends because most platforms are opaque and offer little access to data crucial to answering key questions.
  • Generative artificial intelligence poses new potential risks related to election disinformation. Rather than jumping to conclusions, stakeholders should methodically consider the highest potential dangers and most appropriate responses.

As prominent politicians and a significant portion of the electorate continue to deny the outcome of the 2020 election, disinformation researchers have found themselves under attack.

  • Independent researchers increasingly face hostile campaigns from partisan media and legal, digital, and sometimes physical harassment—illustrated by Congressional subpoenas to leading figures in the field.
  • The chilling effect alone may drive young professionals from the field, make it more difficult to secure funding, and dissuade government officials from engaging with counter-disinformation efforts—especially with the possibility that a pending court case, Missouri v. Biden, will result in permanent restraints on government communications with platforms or researchers.

In this environment, independent counter-election-disinformation initiatives are reconsidering their approaches.

  • Many initiatives are leaning harder into other strategies, such as counter-messaging, assistance to targeted election officials, and policy advocacy.
  • Meanwhile, many online trust and safety outcomes are as bad as, if not worse than, they were in 2016. The 2024 election is likely to be the most vulnerable environment for political disinformation that the United States has seen in eight years.

Recommendations

Independent counter-disinformation initiatives should take steps in the short-to-medium term to weather the storm and mitigate harm.

  • Funders, research institutions, and nonprofits should create shared resources and practices for researchers under attack. These might include pools for legal defense, cybersecurity assistance, and proactively developed communications plans for responding to coordinated attacks.
  • Counter-election-disinformation initiatives should pivot to year-round harm reduction strategies like pre-bunking, training for election officials, and advocacy efforts. False narratives about election fraud persistently affect voting rights between election cycles, so this work should receive consistent support.
  • Advocates should focus less on individual pieces of content and more on mitigating the impact of disinformation “superspreaders.” These are a proven force multiplier for mis- and disinformation—relatively few individuals are responsible for a great deal of false, viral content.

In the medium-to-long term, election integrity initiatives should widen the aperture for advocacy, relying less on unstable partnerships with platforms and the federal government and turning more attention to other stakeholders and to non-digital threats to elections.

  • Researchers, donors, and advocates should treat election disinformation as part of a larger, institutional problem by supporting reforms to the electoral process and law. Electoral systems like ranked choice voting and primary reform may reduce incentives for disinformation.
  • Advocates and their donors should devote more resources to advocacy directed at select state governments on relevant issues like security for election workers and researcher access to data.

Government can also take steps to promote public confidence in election integrity and counter-disinformation efforts and, in conjunction with other stakeholders, do more to promote online trust and safety while respecting freedom of expression.

  • Government and other institutions should make use of former trust and safety staffers’ talent by hiring them and by encouraging the profession to develop norms, standards, and field-building opportunities comparable to those of related industries like cybersecurity.
  • Governments should clarify and be more transparent about their role in responding to election disinformation—especially in the aftermath of the injunction issued in the case of Missouri v. Biden. They could explicitly set boundaries and transparency requirements around federal government communications with social media platforms and independent researchers.

Platforms should improve both capacity and process for protecting elections from digital disinformation.

  • Platforms should reinvest in trust and safety teams as soon as possible, focusing especially on civil rights specialists who can shape content moderation policy and practice.
  • Platforms should recommit to policies and practices that combat election disinformation, and respond to claims of censorship and bias by adhering to principles such as the Santa Clara Principles on Transparency and Accountability in Content Moderation.
  • Platforms should designate consistent points of contact for civil society. The departure of key personnel shows the limits of personalized relationships and has been a persistent problem for independent researchers.
  • Platforms should proactively increase transparency around their communications with government agencies.
  • Platforms should expand researcher access to platform data—and lawmakers should consider supporting that expansion through legislation like the Platform Accountability and Transparency Act. The public deserves to know more about the impact of social media on society.

Read the full report here.

Read the press release here.

Watch the launch event livestream here.