Shining a Light on How Surveillance Tech Discriminates Against Disabled People

(WASHINGTON) — A new report from the Center for Democracy & Technology (CDT) examines four major areas where algorithmic surveillance technologies risk disproportionately harming disabled people or outright discriminating against them: education, the criminal legal system, healthcare, and the workplace.

Lydia X. Z. Brown, CDT Policy Counsel for the Privacy and Data Project and lead author of the report, says:

“Disabled people are simultaneously hyper-surveilled and often erased in society. On one hand, we are treated as potentially dangerous, deceptive, or wasteful, and thus subject to technologies that seek to flag noncompliance, cheating, or fraud.

“But on the other hand, we are treated as functionally invisible, and thus left to contend with the damaging and discriminatory impact of technologies that seek to drive efficiency, maximize production, and promote wellness without considering their harmful effect on the physical and mental health of disabled people caught in their wake.

“The ableist impact of algorithmic surveillance technologies demands critical attention from researchers, developers, and policymakers. COVID-19 is not over, but even if and when society transitions to a point where COVID-19 is no longer a significant force shaping our world, these and likely more surveillance technologies will be here to stay.

“Those technologies that exacerbate and deepen existing harms, and that threaten people’s safety, health, and freedom, demand urgent intervention to limit the damage they can do.”

In each sector, the new report examines the ways in which disabled people can experience harm from the use of automated, algorithmic, and artificially intelligent technologies. For example, increased monitoring on school-issued devices has been shown to negatively affect the school environment for the most marginalized students, while risking further profiling of and discrimination against disabled students.

In the criminal legal system, cities and counties have continued to adopt predictive policing software and algorithmic risk assessment systems that disproportionately impact disabled people, low-income people, and people of color, who are already overrepresented in arrests and in jails and prisons. In the healthcare industry, disabled people worry about the increased development of medications and devices that technologically track compliance and transmit that data not only to their doctors, but often to insurance companies and, potentially, third-party vendors.

And in the workplace, the sharp increase in the use of “bossware” across sectors means disabled workers risk having their disabilities and chronic illnesses exacerbated by intrusive surveillance software, and risk retaliation if they request disability accommodations related to it.

Go deeper and read the FULL REPORT.
See Lydia X. Z. Brown’s analysis of some of the key findings.

###

CDT is a 27-year-old 501(c)(3) nonpartisan nonprofit organization that fights to put democracy and human rights at the center of the digital revolution. It works to promote democratic values by shaping technology policy and architecture, with a focus on equity and justice.