Report – Algorithm-driven Hiring Tools: Innovative Recruitment or Expedited Disability Discrimination?
Algorithm-driven hiring tools have grown increasingly prevalent in recent years. Thousands of job-seekers across the United States are now asked to record videos that employers mine for facial and vocal cues. They complete online tests or games that purport to evaluate their “personal stability,” optimism, or attention span. They submit resumes through online platforms that may reject them because of time gaps in their work histories, such as those resulting from cancer treatment.
Employers using these tools seek a fast and efficient way to process job applications in large numbers. They may also believe that algorithm-driven software will identify characteristics of successful employees that human recruiters would not spot on their own. But as adoption of these algorithms has spread, so, too, has the risk of discrimination written invisibly into their code. For people with disabilities, those risks can be profound.
The Americans with Disabilities Act (ADA) explicitly prohibits hiring processes that discriminate on the basis of disability. First, the ADA requires that employment tests be provided in an accessible format, and if the format is not accessible, that reasonable accommodations be made available without prejudicing the candidate. For example, a test that requires spoken answers is not accessible to an applicant who does not speak because of paralysis or deafness. If an employer uses such a test, it must evaluate disabled job-seekers in an alternative way that reasonably accommodates their disabilities.
Second, the ADA presumptively disfavors hiring selection criteria that “screen out, or tend to screen out” disabled candidates. For example, a personality test may screen out some candidates with depression or anxiety; a game-based test may screen out a candidate because of their ADHD. If an employer uses selection criteria that screen out disabled candidates, the criteria must be “job-related” and “consistent with business necessity.” This means that employment tests must evaluate candidates on factors that are directly relevant to (and necessary for) the essential functions of the job.
Many algorithm-driven hiring tools fall far short of these standards. These tools typically assess candidates by comparing their performance on a given test to that of a model set of successful employees. Employers may be tempted to use such tools without stopping to consider what exactly they are testing for, or why – specifically, what traits an online game really measures, and whether those traits are actually necessary to perform the essential functions of the job.
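To make that mechanism concrete, here is a deliberately simplified sketch – not any vendor's actual algorithm, and all feature names and numbers are hypothetical. It scores applicants by their similarity to the average profile of current "successful employees." Because none of the model-set employees happens to have a long employment gap, the screener treats "no gap" as part of the pattern of success and rejects an equally qualified applicant whose gap stems from, say, cancer treatment:

```python
# Hypothetical illustration of model-set screening. Feature names,
# weights, and thresholds are assumptions, not a real product's design.
from dataclasses import dataclass
from math import sqrt

@dataclass
class Candidate:
    name: str
    skills_score: float  # job-related: performance on a skills test
    gap_months: float    # NOT job-related: gap in work history

# Profiles of current "successful employees" used as the model set.
model_set = [
    Candidate("emp_a", skills_score=0.90, gap_months=0),
    Candidate("emp_b", skills_score=0.85, gap_months=2),
    Candidate("emp_c", skills_score=0.88, gap_months=1),
]

def centroid(cands):
    """Average feature vector of the model set."""
    n = len(cands)
    return (
        sum(c.skills_score for c in cands) / n,
        sum(c.gap_months for c in cands) / n,
    )

def distance(c, center):
    """Euclidean distance from the model-set centroid; gap_months is
    scaled by 24 so both features fall roughly in [0, 1]."""
    return sqrt(
        (c.skills_score - center[0]) ** 2
        + ((c.gap_months - center[1]) / 24) ** 2
    )

def screen(candidates, threshold=0.3):
    """Pass candidates whose profile is 'close enough' to the model set."""
    center = centroid(model_set)
    return [(c.name, distance(c, center) <= threshold) for c in candidates]

applicants = [
    # Equally strong skills, but an 18-month gap (e.g., cancer treatment).
    Candidate("applicant_1", skills_score=0.90, gap_months=18),
    Candidate("applicant_2", skills_score=0.90, gap_months=0),
]

print(screen(applicants))
# [('applicant_1', False), ('applicant_2', True)]
# The gap alone screens out an equally qualified candidate, even though
# it has no bearing on the essential functions of the job.
```

Nothing in this sketch evaluates whether gap length is job-related or consistent with business necessity; the criterion is simply inherited from whatever the current workforce looks like. Real tools are far more complex, but the underlying risk is the same.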
Employers, vendors who create these hiring tools, regulators, job-seekers, and advocates need to better understand the risks of using algorithm-driven tools in hiring, and to take concrete steps to avoid these harms. This paper seeks to highlight how hiring tools may affect people with disabilities, the legal liability employers may face for using such tools, and steps employers and vendors can take to mitigate the most significant areas of concern. We hope it will serve as a resource for advocates, for regulators, and – above all – for those deciding whether to develop or use these tools, prompting them to consider the risks of discrimination and ultimately to ask whether the tools are appropriate for use at all.
Find the full report here.
Find the plain language version here.
Find our recent report on algorithm-driven decision-making in benefits determinations here.