

Protecting Disabled People’s Privacy is a Civil Rights Issue: Lydia X. Z. Brown’s Remarks Before the NTIA’s Listening Session on Civil Rights, Privacy, and Data

CDT’s Lydia X. Z. Brown was invited to give remarks for the National Telecommunications and Information Administration’s (NTIA) first listening session on civil rights, privacy, and data. Their remarks were part of a panel with Michele Gilman, Bertram Lee, David Brody, and Aaron Konopasky, moderated by Lauren Didiuk, which followed keynote addresses by Kristen Clarke and Rebecca Slaughter. (PDF version)

Thank you for the opportunity to speak on today’s panel, and to Bert, David, Aaron, and Michele for your comments as well. My name is Lydia X. Z. Brown, they/them pronouns, and I am a policy counsel for the privacy and data project at the Center for Democracy and Technology. My work examines the ways in which algorithmic decision-making systems discriminate against and harm people with disabilities, particularly disabled people who are also marginalized because of race, gender, sexuality, or class.

We know that tech innovation often enables access for disabled people and those in other marginalized communities. But technology can also cause and exacerbate ableist discrimination, through both baseline inaccessibility and more complex forms of discrimination by data. For disabled people, this can happen when companies use algorithmic systems that infer information about their disability status, or that tend to make adverse decisions that correlate with disability status. And certain people with disabilities are more likely to rely on biometrics and connected devices that require sharing significant amounts of very personal, intimate data, putting them at heightened risk of discrimination and exploitation in the absence of privacy laws that could protect people using data-driven technologies.

In the employment context, more employers are turning to algorithmic tools for recruitment and hiring. These tools might include resume screening software that looks for trends found in the resumes of existing employees – but may unfairly penalize disabled people because of gaps in their resumes, or for lacking indicators like having played on a sports team. Other tools might include automated testing or video and audio analysis that assess applicants’ characteristics or personality “fit” for an employer – characteristics that may be unrelated to the job, and that may result in negative strikes against disabled candidates who have involuntary motor tics, don’t make eye contact, or experience anxiety or depression. Still other employers use algorithmic management tools to surveil employees and track productivity, putting disabled employees with a range of conditions at risk of heightened scrutiny and workplace discipline for disability-related reasons. The ADA prohibits tests of fit unrelated to essential functions of the job, but enforcement is inconsistent and wanting. Job candidates may not always know when they have been discriminated against, especially by automated systems. But even when candidates have a sense that they’ve experienced discrimination, the Equal Employment Opportunity Commission’s Uniform Guidelines on Employee Selection Procedures don’t reference disability at all. The EEOC needs to issue updated guidance addressing algorithmic tools and explicitly including disability.

Landlords and property management companies are also using algorithmic tools more than ever to conduct tenant background checks and screening during the rental application process. These types of algorithmic tools, like those used in credit decisions, can result in denied applications based on characteristics associated with disability or disproportionately present among disabled people. Software that automates rejection based on police contacts from domestic violence-related calls, past arrest records, or lack of sufficient rental history can potentially violate the Fair Housing Act’s nondiscrimination protections and the Fair Credit Reporting Act’s requirements for accuracy in information used for credit purposes (including data related to character or reputational characteristics) and disclosure of adverse determinations. It also contradicts HUD’s guidance against making housing decisions based on arrest records. These systems can disproportionately reject disabled people, who are more likely to experience domestic violence or be arrested, and less likely to have stable housing.

And especially during the pandemic, more people have used and relied on health apps to track physical fitness, make progress on health goals, and manage chronic conditions – including certain disabilities. These health apps may collect incredibly sensitive data, much of it directly related to disability status, whether or not a particular app’s user interface is meaningfully accessible to all disabled people. But use of data by health apps is not generally governed by the very narrow provisions of the Health Insurance Portability and Accountability Act, nor may it be covered by Title III of the Americans with Disabilities Act pertaining to places of public accommodation, because many jurisdictions don’t apply Title III to cases where people are receiving a business’s services from an online-only location. This is yet another reason why we need a comprehensive federal privacy law.

Finally, disabled people are more likely to experience surveillance in a variety of contexts, often fueled by proprietary algorithmic decision-making systems. For instance, discriminatory data can – and does – result in disabled students being more likely to be flagged for suspicious behavior by automated proctoring tools, and in disabled people being identified as less likely to meet conditions of pretrial release on bail by risk assessment algorithms. 

While a small handful of companies have worked to develop safeguards to protect consumer data, we need federal regulations mandating data safeguards, including restrictions on data collection, retention, and third-party sharing, as well as disclosure to the consumer. Additionally, such regulations must be accompanied by Congressional clarification of the regulatory and enforcement authority of the various agencies of jurisdiction, and by sufficient resources to enable meaningful monitoring and enforcement of both private and public use.

NTIA also has an opportunity to help disabled communities by addressing our disproportionate lack of access to affordable broadband infrastructure and accessible devices and programs. I look forward to the panel discussion, and welcome any questions. Thank you.


More on the NTIA’s listening session here.

Here’s a PDF of Lydia’s remarks before the NTIA listening session.