
Contractor, Third-Party Election Disinformation Research

The Center for Democracy & Technology (CDT) is seeking a short-term contractor to conduct independent research and writing for a report, or series of reports, analyzing third-party partnerships that have been created to monitor and mitigate election-related disinformation. The work will take place from January – May 2023, in partnership with CDT’s Elections & Democracy team. A stipend between $12,000 and $20,000 is available for the work. The stipend will be determined based on contractor experience and will be set early in the process.

Background
After 2016, American election observers recognized the serious problems posed by foreign election disinformation campaigns (e.g., about candidates and/or how to vote). The 2020 election and the January 6, 2021, insurrection at the Capitol focused additional attention on domestic disinformation about how elections are conducted and about the legitimacy of election results. To tackle this problem, social media platforms have formulated “civic integrity policies” that may, to use the policy Twitter adopted in 2021 as an example, prevent the use of the platform “for the purpose of manipulating or interfering in elections or other civic processes.”

But no matter how well-written and comprehensive these policies are, there are signs that the platforms are doing an insufficient job of responding to election disinformation. After the 2022 midterms, for example, enforcement of content moderation policies was criticized as “erratic and inconsistent.” On the other hand, some users feel that the platforms have been overly aggressive and opaque in removing content, such as users who believe they were “shadowbanned” for election-related posts, or that Twitter inappropriately suppressed news stories during the 2020 election.

What might a fair, transparent, and democratic strategy for addressing election disinformation look like?

In recent elections, a novel approach has emerged: third-party partnerships (between, e.g., civil society organizations, academic researchers, journalists, fact-checkers, election and other government officials, cybersecurity companies, social media platforms, and citizens) intended to monitor and mitigate election disinformation online. 

The most prominent example of such a partnership is the Election Integrity Partnership, which has operated a ticketing system for false or misleading election-related narratives in the last two federal elections. It allowed participants, primarily academic researchers, to track and respond to specific narratives and related content, streamlining efforts to determine how narratives evolved, whether they were indeed false or misleading, and how to respond. Another model is Common Cause’s Stopping Cyber Suppression program, which is driven by a large group of nonpartisan volunteers rather than by academic researchers.

The goals of such a partnership may be academic: to learn more about how disinformation spreads. Or they may be more practical: to respond to disinformation as swiftly as possible without unduly preventing free expression online. This project is specifically concerned with this latter goal of rapid response.

Project Objective
The main objective of this project is to examine the role that third-party election disinformation partnerships do or can play in mitigating the harmful effects of election-related disinformation. The main output is a published research report that highlights a selection of partnerships and details how the people who operate these groups, as well as social media companies, lawmakers, election officials, and funders, can best support them and improve their effectiveness. The researcher contracted for this role will be responsible for expanding upon research already conducted by CDT’s Elections & Democracy team and collaborating with CDT staff to prepare an in-depth report.

Methodology
The researcher contracted for this role will conduct qualitative research using publicly reported information and interviews with relevant actors (some of which have already been conducted and recorded by CDT staff). In examining the role of election disinformation partnerships, the researcher should consider whether these partnerships can be properly conceptualized as:

  • A way of scaling up the application of civic integrity policies to compensate for possibly lackluster capabilities (or motivations) among the platforms;
  • A way of crowdsourcing and democratizing content moderation — recognizing the role of entities who are not tech platforms in determining how expression should be moderated on the internet;
  • A way of making content moderation more culturally and linguistically fluent, by involving individuals who possess fluencies not well-represented by global platform moderation teams;
  • A mechanism for providing oversight over whether platforms are sufficiently and fairly enforcing their civic integrity policies.

The researcher should also investigate a set of questions determined in collaboration with CDT staff, which may include:

  • What are some examples of ways in which these partnerships succeed and fail?
  • What are the general demographics and characteristics of organizations participating in these partnerships?
  • Can such partnerships scale their responses sufficiently to match the scale of the problem?
  • What are some distinctions between the models and approaches adopted by the various partnerships that exist?
  • Do such partnerships have sufficient visibility into what is happening on the internet?
  • How do such partnerships communicate with platforms and services when content appears to violate civic integrity policies?
  • How involved should the government be in these partnerships? In the U.S., what First Amendment concerns are raised by government involvement? How might these concerns be mitigated?
  • How are these projects funded? What is the ideal funding model?
  • How readily can learnings about these partnerships be applied to other countries? What learnings can these partnerships take from experiences in other countries?

Deliverables
Write a research memorandum, or series of memoranda, that will: 

  1. Describe several election disinformation monitoring-and-mitigation partnerships, highlighting their similarities, differences, comparative strengths, and roles in the election information ecosystem.
  2. Describe obstacles to the effectiveness of these partnerships, leveraging insights gleaned from the research, from interviews with those who participate in these partnerships, and from discussions and writings of other observers.
  3. Propose recommendations for how social media companies, lawmakers, election officials, and funders might help these groups operate more effectively while protecting the right to free expression online.

The researcher will be eligible for an authorship credit on the resulting CDT work product, if desired.

Timeline
The report should be completed in approximately 15 weeks, with an additional week of availability from the researcher to respond to any follow-up questions from CDT, for a total of 16 weeks. The research project includes the following tasks:

  • Kick-off meeting with CDT;
  • Qualitative research;
  • Three check-in meetings with CDT, in weeks 2, 4, and 6;
  • Partial draft of research memo provided to CDT after approximately 8 weeks of research; CDT will provide feedback within 2 weeks;
  • Two more check-in meetings with CDT, in weeks 11 and 13;
  • Final version of research memo provided to CDT after approximately 15 weeks of research;
  • Additional research or meetings based on follow-up questions from CDT in week 16.

Our ideal timeline for this project is for the research to span from late January to mid-May, with delivery of the final memo in late May and one additional week of availability for follow-up questions, with the contract ending in late May or early June. These dates are flexible upon request.

Expressions of Interest

Interested candidates should send a cover letter, CV, and 1–3 writing samples to CDT Senior Technologist in Elections & Democracy Will Adler, at [email protected]. Preference will be given to applicants with relevant experience in disinformation research and content moderation policy and practice. Applications will be considered on a rolling basis. 

The Center for Democracy & Technology is an equal opportunity and inclusive employer. CDT does not discriminate on the basis of race, color, religion, gender, gender expression, age, national origin, disability, marital status, or sexual orientation in any of its activities or operations. We believe that a diverse staff enables us to do better and more impactful work. Women, people of color, disabled people, and members of low-income, LGBTQIA+, and other marginalized communities are strongly encouraged to apply.