

CDT Experts Provide Testimony Before FTC’s Commercial Surveillance and Data Security Public Forum

On September 8th, 2022, three CDT team members gave statements on various privacy and data issues as part of the public remarks section of the Federal Trade Commission’s (FTC) Commercial Surveillance and Data Security Public Forum.

The Public Forum was organized in tandem with the FTC’s Advance Notice of Proposed Rulemaking (ANPRM) on commercial surveillance and data security practices that harm consumers and competition.

A livestream of the Public Forum, including the public remarks, can be seen here.

Summaries of and links to the full statements are below.


Impact of Algorithmic Discrimination on Persons with Disabilities / LGBTQ+ Communities [PDF] – Lydia X. Z. Brown, Policy Counsel, Privacy and Data Project

The FTC’s ANPRM requested information about algorithmic discrimination based on protected classes but omitted any mention of disability or the LGBTQ community. CDT has extensively analyzed and documented various forms of algorithmic discrimination disproportionately and uniquely affecting disabled people and LGBTQ people. Examples of such discriminatory impact include student monitoring software, automated hiring tools, and predictive policing practices. Any rulemaking must consider the specific risks of harm to these communities and explicitly provide for protections against data-driven discrimination.

Necessary Limits on Sensitive Data [PDF] – Andrew Crawford, Senior Counsel, Privacy & Data Project

Certain types of data are inherently sensitive and can reveal intimate details about our health, personal lives, associations, and interactions. Geolocation data can reveal where we pray, where we seek medical treatment, and even our political affiliations. Data about our health is also inherently sensitive, as it can reveal intimate and personal physical and mental health conditions. Unfortunately, these types of sensitive data continue to be misused by the entities that collect, share, retain, and use them. It is time to place real limits on how sensitive information is collected, shared, and used. To that end, CDT called on the FTC to embrace:

  1. Limits on how sensitive data is collected, shared, and used.
  2. Expanded use of existing unfairness and deceptive practice enforcement authorities.
  3. Greater transparency into the data broker economy.

Prioritize Harms to Marginalized Communities [PDF] – Ridhi Shetty, Policy Counsel, Privacy & Data Project

The FTC’s rulemaking process is especially critical for addressing prevalent data practices, including targeted advertising and data-driven decision-making, that can disproportionately harm marginalized and multiply marginalized communities. Consumers struggle to challenge these practices under existing legal authorities because their opacity prevents consumers from ascertaining and demonstrating that they have been discriminated against. A further issue is the lack of consensus about whether platforms that increasingly perform the functions of entities covered under civil rights laws are themselves covered under those laws. These are gaps that the FTC’s authority against unfair or deceptive acts or practices can fill. CDT urges the FTC to consider these gaps and prioritize preventing harms to marginalized consumers.