
Press Release: Shining a Light on How Surveillance Tech Discriminates Against Disabled People

(WASHINGTON) — A new report from the Center for Democracy & Technology (CDT) examines four major areas where algorithmic surveillance technologies risk disproportionate harm to disabled people or outright discriminate against them. These areas include the education sector, the criminal legal system, healthcare, and the workplace.

Lydia X. Z. Brown, Policy Counsel for CDT's Privacy and Data Project and lead author of the report, says:

“Disabled people are simultaneously hyper-surveilled and often erased in society. On one hand, we are treated as potentially dangerous, deceptive, or wasteful, and thus subject to technologies that seek to flag noncompliance, cheating, or fraud.

But on the other hand, we are treated as functionally invisible, and thus left to contend with the damaging and discriminatory impact of technologies that seek to drive efficiency, maximize production, and promote wellness without considering their harmful effect on the physical and mental health of disabled people caught in their wake.”