In the past two weeks, the Biden Administration continued its welcome efforts to take a closer look at the potentially harmful effects of electronic surveillance, artificial intelligence, and automated decision-making systems in the workplace.
On April 25, four federal agencies — the Equal Employment Opportunity Commission (EEOC), Department of Justice Civil Rights Division (CRD), Federal Trade Commission (FTC), and Consumer Financial Protection Bureau (CFPB) — released a joint statement on “enforcement efforts against discrimination and bias in automated systems.” The statement does three things:
- Discusses the four agencies’ respective spheres of enforcement authority and how they can be applied to automated systems;
- Notes the agencies’ recent actions and publications addressing the risks and harms of automated systems; and
- Describes some of the ways in which automated systems can result in unlawful discrimination, unfair competition, or other violations of federal law.
While it does not announce any particular regulatory action or seek public comment, the statement concludes by pledging that the agencies will “…vigorously use our collective authorities to protect individuals’ rights regardless of whether legal violations occur through traditional means or advanced technologies.” This cross-agency focus could prove particularly fruitful in addressing discrimination in employment decisions, since the four agencies have overlapping and complementary authority over both the vendors of automated employment decision tools and the employers who use them.
The following week, on International Workers’ Day (aka May Day), the White House Office of Science and Technology Policy (OSTP) released a public request for information (RFI) on “automated tools used by employers to surveil, monitor, evaluate, and manage workers.” It notes that emerging research suggests that these systems “may undermine the quality of work; workers’ rights to a safe and healthy workplace; compensation for time worked; labor market competition; and workers’ ability to organize and work collectively with their coworkers to improve working conditions, including through labor unions.” The background section of the RFI lists a number of sectors and workplaces where intrusive electronic surveillance systems have been deployed and cites a number of reports and other sources (including CDT’s 2021 report on the health and safety risks of bossware) for evidence of the risk that these systems pose to workers.
CDT recently partnered with Governing for Impact and a coalition of other civil society organizations on a series of memoranda to the Occupational Safety and Health Administration (OSHA) and the National Institute for Occupational Safety and Health (NIOSH) urging them to take concrete action to address the threat that automated surveillance and management systems pose to workers’ physical and mental health. The OSTP RFI provides a much-needed opportunity for civil society groups to continue to educate policymakers on how they can address these and other risks associated with these increasingly prevalent and intrusive systems.
These executive branch actions build on other administration efforts over the past year signaling increasing scrutiny of potentially harmful uses of electronic surveillance and automated systems. These efforts include the OSTP’s Blueprint for an AI Bill of Rights; the FTC’s Advance Notice of Proposed Rulemaking on commercial surveillance practices (for which CDT submitted extensive comments); and the EEOC’s recent hearing on automated systems in employment (at which CDT testified).
The administration’s sustained and expanding focus on the risks these emerging technologies pose to workers is welcome, but more concrete agency actions will ultimately be needed to truly mitigate the harms that workers, particularly those from marginalized and historically disadvantaged groups, are already experiencing. CDT will submit comments in response to OSTP’s workplace surveillance RFI and will continue its efforts to raise awareness among both policymakers and the general public of the risks associated with these systems.