AI Policy & Governance, Equity in Civic Technology, Privacy & Data
Comments from CDT’s Ridhi Shetty to OSTP Listening Session on Biometric Technologies
On November 18, the White House Office of Science and Technology Policy (OSTP) held the first of two listening sessions on the use of biometric technologies. CDT joined the conversation, offering examples focused on the impact of biometric systems on disabled people, including multiply marginalized disabled people, in the public safety, education, and employment contexts.
OSTP’s listening sessions are part of its effort to develop a Bill of Rights for AI and automated technologies. CDT will expand on the following remarks as it continues advocating for more effective regulation of, and rigorous enforcement against, harms arising from public- and private-sector uses of AI.
***
Thank you for the opportunity to comment on behalf of the Center for Democracy & Technology today. We are heartened by OSTP’s attention to the impact of biometric systems, which have in many ways created and worsened inequities across social spheres. While CDT works on many issues around biometrics, today I will focus my remarks on the impact on disabled people and multiply marginalized disabled people. Research has well established that biometric systems vary significantly in their ability to accurately recognize people depending on skin color and gender identity. Disability, too, can affect the biometric data these systems rely on, such as gestures, eye contact, voice and speech, or physical appearance. These glaring flaws converge in several areas where biometric systems are used.
In the public safety context, biometric systems have been used to detect unauthorized intruders or suspicious behavior. These systems have led to wrongful arrests of Black and brown people who are misidentified. They can also prompt law enforcement to treat mental health crises as threats rather than as circumstances requiring mental health support, with fatal outcomes for disabled Black people and members of other communities of color.
Schools have monitored students’ behavior to detect the need for student safety interventions. But biometric systems can flag marginalized students for these purposes even more disproportionately than they are already flagged. Now, at a time of increased remote proctoring, even the keystrokes and speech captured when a disabled student uses assistive technology can cause that student to be wrongly penalized for cheating.
Employers have used biometric systems that involve facial and voice analysis to decide which candidates to hire, which can disqualify disabled candidates for reasons unrelated to the needs of the job. In the workplace, employers have analyzed employees’ perceived emotional characteristics during customer interactions, and they have used sensors and other activity-tracking tools that collect data on workers’ movements and health to measure productivity and administer wellness programs.
These are only a few examples of how biometric systems can harm marginalized communities, including people with disabilities. We look forward to continued engagement with OSTP to ensure that biometric systems are no longer used to perpetuate a history of exploitative data collection and use.