
Five Key Takeaways from New EEOC and DOJ Guidance on Disability Discrimination in Algorithm-Driven Hiring

On May 12, the Equal Employment Opportunity Commission (EEOC) and Department of Justice (DOJ) became the first federal agencies to publish guidance explicitly addressing disability discrimination in automated hiring assessments, which are increasingly used by major employers. CDT commended the agencies’ work as a significant step in an area where it has long pushed for greater accountability.

CDT hopes that this move will be merely the first salvo in a robust federal response to the discrimination risks that automated technologies pose for disabled workers, who often face significant employment barriers. Here, we highlight and analyze five major takeaways from the guidance:

  1. The employer is responsible for its hiring tools, even if they were developed by an outside vendor 

Employers who use vendor-provided hiring tools frequently deflect responsibility to vendors to avoid difficult questions about the tools’ fairness. The agencies’ guidance attempts to short-circuit such deflections by stating that the employer is still obligated to ensure its selection procedures are not discriminatory, even when using a tool “designed or administered by another entity, such as a software vendor.” Similarly, the guidance states that employers generally retain responsibility for ensuring that disabled applicants receive the reasonable accommodations required by the Americans with Disabilities Act (ADA): even if a candidate makes an accommodation request to an external vendor rather than to the employer, the employer bears ultimate responsibility.

To avoid running afoul of the ADA, the guidance encourages employers to:

  • When deciding whether to purchase a hiring tool, probe how the vendor considered disability issues during development, and
  • Provide candidates “with as much information about the tool as possible” so that they can identify tools that may negatively impact their assessment results or screen them out.
  2. Requesting accommodations is easier and more streamlined

Workers with disabilities often do not request accommodations even when they need them, sometimes to avoid stigma or discriminatory treatment, and sometimes because they are unaware that they might need, or are legally entitled to, a reasonable accommodation. The guidance addresses this problem by advising that a worker who informs an employer that their disability may make a tool’s assessment more difficult, or may make the tool’s results less acceptable, is deemed to have requested an accommodation. The employer should therefore affirmatively provide accommodations even if the candidate has not made a formal request through existing channels.

The guidance also notes that employers must take a proactive approach to helping candidates determine whether they might need accommodations and, if so, which ones. To reduce the likelihood of an algorithmic tool screening out candidates on the basis of disability, the guidance advises that employers should: 

  • Explain to workers what steps and methods an evaluation process includes, and how different types of disabilities might affect the tool’s outcomes;
  • Ask workers whether they will need reasonable accommodations to complete the process, clearly indicate the availability of reasonable accommodations, and provide instructions for requesting them; and
  • Request supporting medical documentation if it is not obvious or known to the employer that the worker requesting an accommodation has a disability (though, as discussed below, requiring medical documentation can create its own barriers).

The agencies recognize that following these recommendations will enable disabled workers to make more informed decisions about whether to disclose their disability status and how to exercise their right to reasonable accommodations. Further, the guidance recommends that employers train staff in recognizing and processing requests for reasonable accommodations, as well as in evaluating workers through alternative means when the usual process would disadvantage a worker because of their disability. If requests for reasonable accommodations are made to a third party that administers the tool, this still constitutes a request for which employers are responsible.

One gap that further guidance could address: even if employers take the steps outlined above, the reasons workers avoid requesting accommodations would still apply. A worker must still decide whether they feel safe disclosing their disability status. The worker also may not be able to provide medical documentation: discrimination in health care based on disability, race, sex, gender identity, class, immigration status, and other characteristics affects whether a person can afford to be seen by a healthcare professional, and how accurately that person’s needs are assessed, which can put the requisite documentation out of reach. Given these barriers to disclosure, employers should explore reasonable accommodations and alternative means of evaluation that can be made generally available to all workers, so that workers with disabilities do not need to disclose their status to be fairly evaluated. Employers should also accept other forms of documentation, so that discrimination in health care does not compound disability discrimination in employment.

  3. Testing for disability discrimination and bias requires a proactive and nuanced approach

The agencies recognize the limitations of claims that an algorithmic tool is “bias free,” explaining that such claims mainly pertain to bias on the basis of race, sex, national origin, color, or religion. It is comparatively simple to measure a tool’s outcomes for Black workers versus white workers, or for the men versus the women in an applicant pool, although hiring tools can still pose subtler risks of discrimination in those cases. Disability bias is more complex: simply comparing the numbers or percentages of workers with and without disabilities will not give employers an accurate sense of how, for instance, a tool interacts with mobility- or pain-related disabilities differently than with cognitive or mental health disabilities. While detecting and correcting disability bias can be challenging, the agencies make clear that such challenges do not entitle employers to ignore disability bias when implementing an algorithmic decision-making tool.
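
To make the measurement problem concrete, here is a minimal, hypothetical sketch in Python (the numbers, labels, and grouping scheme are invented for illustration and are not drawn from the guidance) of how an aggregate disabled/non-disabled comparison can show perfect parity while masking a severe disparity for one disability subgroup:

  from collections import defaultdict

  def records(dtype, hired, rejected):
      # Build hypothetical applicant records: (disability type or None, selected?)
      return [(dtype, True)] * hired + [(dtype, False)] * rejected

  # Invented counts, chosen so the aggregate rates match exactly.
  applicants = (
      records(None, 50, 50)            # non-disabled: 50% selected
      + records("mobility", 85, 15)    # mobility-related disability: 85% selected
      + records("cognitive", 15, 85)   # cognitive disability: 15% selected
  )

  def selection_rate(group):
      return sum(selected for _, selected in group) / len(group)

  nondisabled = [r for r in applicants if r[0] is None]
  disabled = [r for r in applicants if r[0] is not None]

  # Aggregate comparison: both groups show a 50% selection rate, so a simple
  # disabled-vs-nondisabled check would flag no disparity at all.
  print(f"non-disabled: {selection_rate(nondisabled):.0%}")       # 50%
  print(f"disabled (aggregate): {selection_rate(disabled):.0%}")  # 50%

  # Disaggregating by disability type reveals where the tool actually fails:
  # candidates with cognitive disabilities are selected at under a third of
  # the non-disabled rate.
  by_type = defaultdict(list)
  for record in disabled:
      by_type[record[0]].append(record)
  for dtype, group in sorted(by_type.items()):
      print(f"{dtype}: {selection_rate(group):.0%}")

Even this sketch understates the problem: real disability categories are far more numerous and fluid, and per-category sample sizes are often too small for reliable statistics. But it shows why a single aggregate comparison cannot substitute for subgroup analysis.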

One note of caution: The guidance recommends relying on experts when developing tools, offering as an example “psychologists, including neurocognitive psychologists.” While their expertise has its place, experts in these disciplines have often advocated for the use of gamified assessments and personality tests despite the high risk of discrimination those selection procedures pose. The agencies should go beyond psychologists and emphasize the importance of relying on experts in disability bias and systemic ableism, who will be well-positioned to recognize overlooked causes and risks of disability discrimination.

  4. Automated assessments must be narrowly tailored to measuring essential job functions

The guidance emphasizes that employers cannot deploy tools that screen out disabled candidates on any basis other than the candidate’s ability to perform the essential functions of the position in question. Ensuring ADA compliance thus means more than simply ensuring that the tool is “valid” under the Uniform Guidelines for Employee Selection Procedures, because validity can be based on indirect measures of job performance. The guidance therefore recommends that employers measure “necessary abilities or qualifications…directly, rather than by way of characteristics or scores that are correlated with those abilities or qualifications.”

Moreover, the ADA requires employers to consider a candidate’s ability to perform the essential functions of a job with reasonable accommodations, if they are needed. As the guidance notes, this requirement presents significant issues for many algorithmic tools, which “are often designed to predict whether applicants can do a job under typical working conditions” – that is, conditions without accommodations. For disabled workers entitled to on-the-job accommodations, such a limited approach to assessing candidates’ capabilities risks violating the ADA.

  5. Screening out even one disabled candidate based on their disability is enough to violate the ADA

Finally, showing that a tool validly measures most candidates’ ability to perform essential job functions is not enough to ensure ADA compliance. Under the ADA, it is unlawful for an employer to use a tool that screens out any disabled worker due to that worker’s disability, so long as that worker would otherwise have been able to perform the essential functions of the job. Consequently, a company cannot demonstrate ADA compliance simply by showing a lack of statistical adverse impact on disabled workers, or by demonstrating that a tool validly measures most candidates’ ability to perform the job.

To illustrate the point, the guidance offers a hypothetical example in which a hiring tool uses a video game to assess candidates’ memory. The memory test may well validly measure memory for most candidates, but its design likely means it cannot, for example, assess the memories of blind candidates. In such cases, using the tool to screen candidates for a particular position is likely unlawful, unless the employer provides an adequate accommodation or an alternative method of assessment that allows every disabled candidate to demonstrate their abilities and compete on equal terms with nondisabled candidates who can complete the gamified assessment.

Conclusion

Currently, in the face of rapid advances in assessment technologies, all anti-discrimination statutes suffer from gaps in regulation and agency guidance. The EEOC and DOJ have begun filling those gaps by highlighting employers’ ADA obligations when using automated and data-driven assessment methods. These issues can be further addressed through the Hiring Initiative to Reimagine Equity, the EEOC’s collaboration with the Department of Labor’s Office of Federal Contract Compliance Programs, which aims to “identify strategies to remove unnecessary barriers to hiring” and “promote equity in the use of tech-based hiring systems.” 

CDT hopes that these agencies will continue to publish similar guidance on automated discrimination on the basis of other protected characteristics, including race, sex, national origin, age, and gender identity. In doing so, the agencies could further analyze the impacts of these technologies on multiply-marginalized disabled people — those who are disabled and are also members of another protected class. Employers and vendors contribute to systemic employment barriers when they do not consider how a given tool’s impacts will vary depending on the types of disability a worker has, the supports they currently have and were previously able to access, and the relationship between their disability and their other identities.

That said, implementing the guidance — by ensuring that assessments are tied only to essential job functions, thinking about fairness during the design and development process, and providing candidates with more information about the nature of assessments — will also help reduce other forms of discrimination against marginalized workers. Encouraging employers to more closely scrutinize tool design and the availability of reasonable accommodations within tools when contracting with third parties will also help make vendors more accountable. For these reasons, the guidance is a welcome step toward helping ensure that employers and vendors alike consider the risk of discrimination against disabled and other marginalized workers when developing and deploying automated assessments.