Faster is Not Always Better – Disability Discrimination in Algorithm-driven Hiring Tools

As employers push harder than ever to make their work processes faster and more efficient, we have witnessed major changes in hiring methods. Algorithm-driven hiring tools in particular have become increasingly popular. These tools come in many forms, including recorded video interviews, personality tests, resume mining, and gamified aptitude tests. Vendors market them as an efficient way to identify desirable skills, aptitudes, and “fit” with workplace culture in ways that human reviewers cannot.

While algorithm-driven hiring tools may offer quick solutions, faster does not always mean better. This is especially true when these tools reinforce disability-based discrimination. Disability bias in these tools is not only unfair; it can also be illegal. Our new report, Algorithm-driven Hiring Tools: Innovative Recruitment or Expedited Disability Discrimination?, explores employers’ potential liability under the Americans with Disabilities Act (ADA) when they use algorithm-driven hiring tools.

Our report focuses on two obligations the ADA places on employers that use algorithm-driven hiring tests:

  • Employers must provide hiring tests in formats that are accessible to job applicants with disabilities. Algorithm-driven hiring tools often fail this requirement: they may require applicants to use a screen, mouse, or other equipment in ways that exclude people with certain disabilities from the hiring process. When using these tools, employers must provide reasonable accommodations.
  • Employers must use only selection criteria that are relevant and necessary to essential job functions. Algorithm-driven hiring tools often measure how applicants perform in a particular game or answer specific questions, then compare the results to the performance of the employer’s existing employees. Employers must ask themselves whether these kinds of tests reliably measure skills that are actually relevant to the essential functions of the job.

There are already serious hiring disparities for disabled people. For example, about 37% of people with disabilities are employed, compared to 79% of people without disabilities. This problem is amplified when employers use algorithm-driven hiring tools that do not accurately and fairly assess people with disabilities, many of whom face employment barriers compounded by other, intersecting forms of marginalization.

So how do we prevent these tools from increasing hiring inequities? Our report recognizes that algorithmic discrimination based on disability is uniquely difficult to quantify or to mitigate. Nevertheless, there are steps that stakeholders can take:

  • Employers and vendors: Make your tools accessible, examine whether they actually assess an applicant’s ability to do the job, and provide reasonable accommodations that help applicants succeed.
  • Policymakers: The ADA and other civil rights laws prohibit hiring practices that disparately impact protected classes (a sketch of how disparate impact is conventionally measured appears after this list). Enforce these laws through existing regulations and new guidance to hold private and government employers accountable for the tools they use.
  • Job-seekers: Document what made a test difficult for you to take, whether you received accommodations, and whether the hiring decision was explained to you.
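
Our report observes that algorithmic disability discrimination resists easy quantification. For context, the conventional legal screen for disparate impact is the EEOC’s “four-fifths rule” (29 C.F.R. § 1607.4(D)): if one group’s selection rate falls below 80% of the highest group’s rate, that is generally treated as evidence of adverse impact. The following minimal Python sketch, using entirely hypothetical applicant counts, shows the arithmetic and hints at why it is hard to apply to disability, since “people with disabilities” is not a single group with a single selection rate.

    # A minimal sketch of the EEOC "four-fifths rule" screen for adverse
    # impact (29 C.F.R. § 1607.4(D)). All applicant counts below are
    # hypothetical, for illustration only.

    def selection_rate(hired: int, applied: int) -> float:
        """Fraction of a group's applicants who were selected."""
        return hired / applied

    # Hypothetical outcomes from one algorithm-driven screening round.
    rate_disabled = selection_rate(hired=6, applied=40)       # 0.15
    rate_nondisabled = selection_rate(hired=30, applied=100)  # 0.30

    # A group's selection rate below 4/5 (80%) of the highest group's rate
    # is generally regarded as evidence of adverse impact.
    impact_ratio = rate_disabled / rate_nondisabled           # 0.50
    if impact_ratio < 0.8:
        print(f"Impact ratio {impact_ratio:.2f} is below 0.80: potential adverse impact")

    # The catch for disability: disabilities are highly diverse, so there is
    # no single, well-defined "disabled applicants" selection rate, which is
    # part of why this kind of bias is so hard to quantify.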

Because of the diversity of disability experiences, it is especially important to consider how algorithm-driven hiring tools may harm disabled applicants. We hope our report guides stakeholders to ensure that these tools do not push people with disabilities further out of an already biased and ever-evolving job market.

Find the Algorithm-driven Hiring Tools report here.

Find our recent report on algorithm-driven decision-making in benefits determinations here.