Oversight of Algorithmic Hiring Bias is a High Priority in California

Vendors claim that algorithm-driven tools are a better way to hire talent, but when these tools embed biases into employment practices, marginalized workers pay the price. On April 29, 2021, the Fair Employment and Housing Council of the California Department of Fair Employment and Housing held a hearing focused on this issue. CDT’s Lydia X. Z. Brown joined Upturn’s Aaron Rieke and Washington University law professor Pauline Kim to discuss ways to ensure that California’s antidiscrimination laws keep up with changing technology.

California is actively pursuing efforts to address algorithmic bias, leading the way for other jurisdictions. 

Last year, California legislators introduced a state bill that would have eased the bias auditing standards employers must meet for employment assessment technologies. The bill did not pass; instead, the Fair Employment and Housing Council (“Council”) amended its regulations to indicate that antidiscrimination protections apply to the use of “online application technology.”

By adopting these amendments, California rightfully put employers on notice that their algorithm-driven hiring tools are not exempt from antidiscrimination protections. New York City followed with its own attempt to better regulate private employers, proposing that employment decision technologies be audited for compliance with antidiscrimination laws before they are sold.

This year, California legislators took a new tack, introducing a bill to regulate government procurement and use of AI-driven decision-making systems. Washington state, Massachusetts, and Maryland soon introduced similar efforts to regulate government use.

Policy experts explain how hiring technologies embed bias and how to better protect workers. 

Witnesses at the April 29 hearing explained how employment decision technologies work and identified regulatory and enforcement gaps that must be filled to address discrimination. Upturn’s Aaron Rieke described how hiring technologies of varying levels of sophistication are used at different stages of the job application cycle – from job ads to the selection of candidates – without meaningful transparency or oversight. Professor Pauline Kim explained the barriers to improving and regulating these technologies, including biased training data, uncertainty about developers’ and vendors’ liability, and uncertainty about when potential job applicants count as “workers” protected by antidiscrimination laws.

CDT’s Lydia X. Z. Brown drew on CDT’s work examining the potential discriminatory effects of hiring technologies on job candidates with disabilities. They testified that the federal Americans with Disabilities Act prohibits disability discrimination in ways that apply to algorithm-driven hiring, and that California’s state law goes even further. California law clearly puts the onus on employers to make reasonable accommodations when they know of an applicant’s disability. And before employers adopt selection criteria that are job-related but nonetheless have discriminatory effects, they have an affirmative duty to search for alternative selection procedures or criteria that would not discriminate against disabled workers.

The California Department of Fair Employment and Housing (“Department”) enforces employment discrimination protections, while the Council issues regulations. CDT argued that, together, they can strengthen the state’s protections for disabled people. The Council’s regulations should require employers to: 

  • Proactively investigate their tools’ design and identify all potential risks of disparate impact against all marginalized groups, including disabled workers; 
  • Only use the tools after proactively implementing potential accommodations, modifications, or alternative selection procedures that effectively mitigate each potential source of discrimination; and
  • Make audits of the tools’ design and impact easily available for public or government review.

Additionally, the Department and Council should educate the public and employers about how algorithm-driven hiring tools can discriminate, informing workers about how to recognize potential discrimination in the hiring process, and giving employers a chance to do right by workers. And when the Department learns that an employer is using a possibly discriminatory tool, it should use its authority to file Director’s complaints to independently investigate.

Policymakers must avoid allowing employers to default to minimal and ineffective auditing standards.

During the hearing, Council members were interested in understanding how effective self-regulation and quantitative auditing are at reducing disparate impact. As Lydia explained, these steps alone are not effective accountability mechanisms for reducing discrimination: 

  • Self-regulation by employers and vendors involves closed-door, opaque processes that keep workers at an information disadvantage.

    Accountability requires transparency to the public about how tools are audited, and transparency to applicants about how tools work and will affect hiring decisions. But self-regulation allows employers and vendors to essentially vouch for themselves.
  • Quantitative auditing uses statistical analysis – an approach that excludes significant data about groups that are harder to compare. 

    As CDT has explained in our prior writing on bias auditing, statistical analysis cannot account for the diversity of disability and how it intersects with other marginalized identities. In contrast, qualitative auditing investigates the types and uses of collected data, the data’s job-relatedness, and the potential sources of risk to correct, and is far preferable. (The sketch after this list illustrates why purely statistical auditing breaks down for small, diverse groups.)
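
To make the contrast concrete, here is a minimal, hypothetical sketch in Python of the kind of calculation a purely quantitative audit relies on: selection rates per group, compared against a baseline using the EEOC’s four-fifths guideline. Every group label, figure, and threshold below is invented for illustration; none of it is drawn from a real audit or from CDT’s methodology.

    # Hypothetical illustration only: a bare-bones quantitative audit computes
    # selection rates per group and flags adverse impact ratios that fall below
    # the EEOC's four-fifths guideline. All figures are invented.

    # group -> (number of applicants, number selected)
    applicants = {
        "no disability reported": (4000, 800),
        "mobility disability": (60, 9),
        "blind / low vision": (7, 1),
        "autistic and Deaf": (2, 0),  # intersectional subgroup, near-empty cell
    }

    baseline_total, baseline_selected = applicants["no disability reported"]
    baseline_rate = baseline_selected / baseline_total

    for group, (total, selected) in applicants.items():
        rate = selected / total
        ratio = rate / baseline_rate
        flag = "below four-fifths threshold" if ratio < 0.8 else "passes"
        print(f"{group:24s} n={total:5d} rate={rate:6.1%} impact ratio={ratio:.2f} ({flag})")

    # With only 7 or 2 applicants in a group, the impact ratio swings wildly on a
    # single hire, so a purely statistical audit either drops these groups or
    # reports numbers that mean little. That is the gap qualitative review fills.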

Policymakers should take note: the public is paying attention to algorithmic hiring bias.

Members of the public in the audience raised concerns during the hearing about additional barriers to algorithmic accountability, specifically intellectual property and vendor liability. Citing intellectual property protection, employers and vendors have imposed nondisclosure agreements and impeded access to audit results, leaving workers without the information they need to legally challenge biased tools. 

Additionally, open questions remain about whether companies that make hiring technologies can be held liable for resulting discrimination – for example, under theories that they have aided or abetted employment discrimination by an employer who uses their tool. These public concerns are well founded, and any regulation going forward should also consider the role these hurdles play in cementing a hiring system that perpetuates discrimination. 

As algorithm-driven hiring tools and other hiring technologies become a mainstay of the employment landscape, the Council will need to modernize regulations to protect all marginalized workers. The public is counting on it.