CDT CEO Alexandra Givens’ Testimony on Precarious Work and the Digital Economy Before the UK All Party Parliamentary Group on AI

CDT President & CEO Alexandra Givens testified before the UK All Party Parliamentary Group (APPG) on AI, discussing self-employment, the gig economy, and workers’ rights. Her remarks are pasted below, and available in PDF form.

***

Lord Clement-Jones, Mr. Metcalfe, officers and members of the APPG, and members of the public, it is an honor to appear before you today. I am Alexandra Reeve Givens, President and CEO of the Center for Democracy & Technology. We are a twenty-five-year-old nonpartisan, nonprofit organization based in Washington and Brussels, focused on protecting human rights and democratic values in the digital age.

One issue we prioritize is the impact of technology on workers’ rights — including workers’ privacy, their health and wellbeing in the workplace, their ability to organize, and the risk of discrimination.

As this group has heard over the course of your inquiry, employers are increasingly turning to technological tools to monitor and evaluate workers’ activities. This is happening across industries: in factories, delivery services, commercial workplaces, and home offices. The tech advocacy world often describes these types of tools using the umbrella term, “bossware”.

Today, bossware tracks the individual movements of warehouse workers as they scan and box items for delivery. It logs employees’ keystrokes on the computer, time spent using a particular software program, and the websites workers visit. It takes photos through workers’ webcams to check if they were at their computer at a given time. It analyzes staff emails and social media accounts. 

In addition to worker tracking, companies are also using bossware to evaluate and manage their workers. Algorithmic systems can assess workers’ progress against productivity targets. In call centers, AI analyzes workers’ conversations and provides real-time feedback. Software identifies workers who are “flight risks”, combining behavioral analytics with economic and geographic data. Some monitoring tools integrate directly with timekeeping and payroll systems, allowing employers to automatically dock workers’ pay for time spent away from the computer.

My organization analyzed the impact of these technologies in a 2021 report authored by CDT’s Senior Counsel for Technology & Workers’ Rights, Matt Scherer, titled “Warning: Bossware May Be Hazardous to Your Health.” I will submit this for the record along with a 2020 report from the Electronic Frontier Foundation that first popularized the term “bossware,” titled “Inside the Invasive, Secretive ‘Bossware’ Tracking Workers.” Both explain these technologies in more detail.

Bossware can pose several distinct risks. An obvious one is the risk to privacy. While important protections are afforded by the UK GDPR and other laws, the existence of tools that can monitor workers in such extraordinary depth continues to raise deep concerns about the information employers can access, and what they do with that information. Such access can undermine workers’ personal privacy and increase the risk of discrimination based on information employers learn or infer about their workers’ lives. In addition, monitoring of workers’ email, social media, and workplace break rooms can impede workers’ ability to question company practices and to organize with other workers.

Our advocacy also underscores the risk bossware poses to workers’ health. Increasingly, technology tools guide every action a worker takes in the name of “optimizing” workflow. News reports have documented the safety violations experienced by delivery drivers who are rushing to stay on task. Workers have reported being unable to go to the bathroom or take other small personal breaks. As one factory worker said in an article reported by The Verge, “it’s the automatically enforced pace of work, rather than the physical difficulty of the work itself, that makes the job so grueling. Any slack is perpetually being optimized out of the system, and with it any opportunity to rest or recover.”

In warehouses and factories, this may take the form of a signal that guides a worker to the next item to be boxed – speeding up their workflow, but in so doing, removing even that moment of down time when they looked for the item on the shelf. In a home office, this may take the form of a program that photographs a worker through their webcam at periodic intervals to make sure they are at their computer — in so doing, punishing the worker for something as innocuous as going to the bathroom. These measures contribute directly to “job strain,” which occurs when workers face high job demands but have little control over their work. This level of surveillance is stressful, demoralizing, and damages workers’ dignity. When it prevents workers from going to the bathroom or taking small breaks in physical labor, it can cause both psychological and physical harm.

This management approach can also discriminate against people with disabilities in particular, because it requires them to complete tasks in a specific way or at a specific pace rather than in an alternative manner that accommodates their needs.

Tools used for algorithmic management can also undermine workers’ agency and legal rights. For example, some employers in the retail sector are evaluating workers’ performance based on scores derived from observed data or analysis of their customer interactions. These opaque systems can punish workers for factors beyond their control: a waiter may receive low ratings because of the quality of a restaurant’s food, not their service, or because of customers’ racial, religious, or other biases. Delivery drivers may be flagged for delays caused by external factors like traffic or the location of the delivery, yet in some cases they are required to recover the “lost time” over the course of their shift.

In other settings, companies are using predictive analytics to identify which workers are potential flight risks. Such analysis may cause workers to receive differential treatment based on the company’s inferences about them, drawn from opaque factors beyond the individual’s control.

Members of Parliament have an important opportunity to raise awareness about these technologies and mitigate their risks. Employers must be challenged to directly consider the impacts of algorithmic systems, and to do so with workers – especially those from historically excluded groups – present at the table and empowered to shape those conversations in a meaningful way. Thank you for your attention to these important issues. I will be happy to elaborate on these points in the Q&A.

Alexandra Givens

President & CEO

Center for Democracy & Technology