2021 Annual Report: Fighting Discrimination and Promoting Equity

Photo collage - on the top left, a teacher wearing glasses talks to a young student carrying books in a busy school hallway. On the bottom left, a group of employees in safety vests gather around a laptop in a warehouse for a discussion. On the top right, two family members use a tablet to see each other and communicate. On the bottom right, a masked passenger on an airplane reads while other masked passengers look on.

In 2021, CDT expanded our fight against uses of data and technology that discriminate against people based on race, gender, sexual orientation, disability, and other protected characteristics. This work is part of our longstanding commitment to addressing the most pressing threats to civil rights and civil liberties in the digital world.

To ensure that technology serves all people, we partnered with allies in the Civil Rights, Privacy, and Technology Table to present an oversight agenda focused on technology and civil rights for the 117th Congress. As the Biden Administration came to power, we called on it to fight algorithm-driven discrimination — particularly in areas crucial to socioeconomic outcomes, such as hiring, credit, and housing. We were pleased to see our efforts bear fruit when the Office of Science and Technology Policy announced its plans for an Artificial Intelligence Bill of Rights, and the Equal Employment Opportunity Commission launched an initiative on employment and AI.

CDT worked closely with policymakers and companies to outline approaches to responsible AI. We participated in the National Institute of Standards and Technology’s effort to develop an AI risk management framework, testified before the UK All-Party Parliamentary Committee on AI, commented on the EU’s Artificial Intelligence Act, and engaged in the National Telecommunications and Information Administration’s process on personal data, privacy, equity, and civil rights.

The problem of discriminatory technologies is wide in scope, and requires governments to dedicate resources to mitigating its biggest harms. We advocated for the Federal Trade Commission, Consumer Financial Protection Bureau, and civil rights agencies to play a greater role, particularly in deterring and fighting data abuses that disproportionately affect marginalized communities. We also asked Congress to fill gaps in public accommodation and civil rights laws to account for modern data practices, and undo efforts to undermine the “disparate impact” standard, which has been key to measuring and thereby challenging discrimination. One key focus was continuing our longstanding advocacy for federal privacy legislation, and arguing that any such bill should include strong civil rights protections.

Throughout our work to fight for equity in tech policy, we placed special focus on the rights of people with disabilities, against whom hiring technologies and other automated decision-making tools unfairly discriminate. As the New York City Council worked to draft first-of-its-kind legislation that would mandate audits of automated decision-making tools used in hiring and employment, we urged them to create a model for other jurisdictions by meeting a high bar and protecting people with disabilities from the outset. We also laid groundwork on the international level for better protections for people with disabilities, illustrating in comments cited by the UN Special Rapporteur on the Rights of Persons with Disabilities how automated decision-making technologies may violate the protections of the Convention on the Rights of Persons with Disabilities across social spheres.

Certain technologies can, by their very nature, create significant risks, particularly for marginalized groups. In 2021, CDT took a principled stand against the use of facial recognition technology, which exhibits strong biases against women and people of color, and entrenches the disproportionate harms faced by Black and Brown communities who are already subject to overpolicing. To fight the numerous and often irreversible risks the technology poses, we encouraged the Department of Homeland Security to suspend its use on travelers, and urged Congress to enact a moratorium on its use for law enforcement and immigration enforcement purposes until legislators enact a comprehensive set of rules to mitigate the threats to human rights. 

Our increasing capacity for original research allowed CDT to further explore and provide policymakers with information about how to mitigate potentially discriminatory effects of emerging technologies. One CDT report identified key research questions about the impacts of mis- and disinformation on communities of color and across gender identity. Another explored the use of student monitoring technology by schools, finding that because school-issued devices tend to more intensively track student activity than personal devices, students in higher-poverty districts are subjected to more pervasive monitoring. Unsurprisingly, these findings highlight the need for strong privacy protections that are rooted in mitigation of digital inequities.

To further help legislators, civil servants, advocates, and affected communities make informed decisions about technology, we launched the Equity in Civic Technology Project, a new program through which we advocate for just and responsible use of data and technology in the delivery of government services. The project tackled limited access to the internet, which can severely stymie education, economic participation, and more. We also encouraged the FCC to work to close the “digital divide” by helping students and families establish reliable, affordable internet connections without sacrificing their data privacy. In a sign of the project’s future direction, we urged the U.S. Department of Education to support the ethical, responsible use of data belonging to transgender and gender non-conforming students, students of color, and students with disabilities.

As technology continues to shape all facets of modern life, the design, use, and deployment of technology must account for its effects on differently situated communities. At CDT, we’re committed to research and advocacy that highlights these issues and fights to mitigate potential discriminatory effects.