

Americans Deserve a Law Protecting Their Digital Privacy – Here’s Our Proposal

For more information on our efforts to create comprehensive federal privacy legislation, check out our Federal Privacy Legislation campaign.

Privacy is a fundamental human right. Physical safety, free expression, access to justice, and economic security depend on it. For too long, Americans’ digital privacy has varied widely, hinging on the technologies and services we use, on the companies that provide those services, and on our capacity to navigate confusing notices and settings. It’s time for Congress to pass legislation providing comprehensive protections for personal information that can’t be signed away.

Civil society, industry, and policymakers across the aisle and at every level of government have called for privacy legislation. But designing meaningful, workable privacy protections is no easy task. Existing privacy regimes rely too heavily on the concept of notice and consent, placing an untenable burden on consumers and failing to rein in harmful data practices. For legislation to be more than a band-aid, we have to rethink the relationship between businesses and the people whose data they hold. We need to establish sensible limits on data collection, use, and sharing, so that people can entrust their data to companies without accepting unreasonable risk.

To advance this dialogue, CDT has put forth a draft federal privacy bill for discussion. We hope this draft will inspire feedback and collaboration from all stakeholders and serve as a resource for decision makers who seek to rebalance our privacy ecosystem in favor of users. This post outlines the objectives behind CDT’s discussion draft and presents some questions to help kickstart discussion.

Requiring fair data practices, not just notice and choice

Legal regimes and industry self-regulation have long relied on notice-and-choice or user control as a proxy for respecting individuals’ privacy. These frameworks simply require companies to provide notice of their data practices and get some kind of consent—whether implied or express—or provide users with an array of options and settings. This status quo burdens individuals with navigating every notice, data policy, and setting, trying to make informed choices that align with their personal privacy interests. With hundreds of devices, thousands of apps, and as many different policies and settings, no individual has the time or the capacity to manage their privacy in this way. Even if we had the time, it’s nearly impossible to project the future risk of harm from each data transaction. Moreover, people can be harmed by data processors with whom they have no direct relationship, making meaningful control impossible.

Instead of relying on notice and choice, CDT’s draft prohibits data processing that is presumptively unfair. In describing unfair processing, we focused on practices that the average person is unlikely to expect, that are hard for consumers to avoid, and/or that are difficult to carry out with appropriate privacy safeguards. Most of these practices involve repurposing data—using it for purposes other than providing a service that a user has affirmatively requested. The draft also prohibits deceptive practices, such as dark patterns designed to coerce or confuse users into providing their consent.

Another weakness of notice-and-choice models is their inability to address discriminatory uses of data. Commercial data can be used in ways that systematically discriminate based on minority or protected characteristics such as race, age, gender, LGBTQ status, disability, or financial status. Data-driven discrimination can be completely opaque to the affected person and often goes unnoticed even by the discriminating party. This problem is vast and demands multiple legal and policy approaches. CDT’s draft attempts to address discriminatory ad targeting by giving the Federal Trade Commission (FTC) the authority to make rules prohibiting unfair targeted advertising practices, without having to go through the burdensome process currently required for FTC rulemaking.

Affirmative obligations to protect data

Entities that collect, use, and share data have a responsibility to safeguard it and prevent misuse. CDT’s discussion draft would require covered entities to adopt reasonable data security practices and engage in reasonable oversight of third parties with whom they share personal information. These obligations recognize the reality that participating in modern society often means ceding control of one’s personal information. The entities we trust with our data should handle it with care.

The draft also requires covered entities to publish detailed disclosures of their data practices in a standardized, machine-readable format that can be scrutinized by regulators and advocates. Some have argued, understandably, that privacy policies should be shorter and easier for users to understand. However, simplifying privacy policies can unintentionally double down on the idea of privacy self-management while allowing companies to hide the details of their data processing. Our draft prioritizes detail and standardization over simplicity so that regulators, consumer advocates, and privacy researchers can effectively scrutinize covered entities on behalf of consumers.
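To make the idea concrete, here is a minimal, purely hypothetical sketch of what a standardized, machine-readable disclosure could look like. The discussion draft does not prescribe a schema; every field name below (DataPracticeDisclosure, dataCategory, retentionPeriod, and so on) is our own illustration, not language from the bill.

```typescript
// Hypothetical schema for a standardized, machine-readable privacy
// disclosure. Illustrative only: the discussion draft does not specify
// a format, and these field names are invented for this sketch.

interface ProcessingPractice {
  dataCategory: string;          // e.g., "precise geolocation"
  purpose: string;               // e.g., "providing the requested service"
  retentionPeriod: string;       // e.g., "P30D" (ISO 8601 duration)
  sharedWithThirdParties: boolean;
  recipientCategories: string[]; // e.g., ["analytics providers"]
}

interface DataPracticeDisclosure {
  entity: string;                // legal name of the covered entity
  lastUpdated: string;           // ISO 8601 date of the last revision
  practices: ProcessingPractice[];
}

// Because every covered entity would publish the same structure, a
// regulator or researcher could mechanically compare disclosures across
// companies, or flag when a retention period quietly grows.
const example: DataPracticeDisclosure = {
  entity: "Example Corp",
  lastUpdated: "2018-12-05",
  practices: [
    {
      dataCategory: "precise geolocation",
      purpose: "providing the requested navigation service",
      retentionPeriod: "P30D",
      sharedWithThirdParties: false,
      recipientCategories: [],
    },
  ],
};
```

The value of standardization is comparability: free-text privacy policies defeat automated analysis, while a common structure lets watchdogs audit practices at scale.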

Individual rights to access, correct, delete, and port data

The rights to access, correct, and delete personal information are basic requirements of any federal privacy law; many companies already provide these rights under the EU General Data Protection Regulation. However, individual rights on their own are insufficient to protect privacy. CDT’s draft provides broad individual rights, with tailored exceptions to account for technical feasibility, legitimate needs such as fraud detection and public interest research, and free expression rights. The individual rights apply not only to information directly disclosed to a covered entity, but also to information inferred by the covered entity, since inferences can often be more sensitive and more opaque to users (e.g., inferring a medical condition from someone’s non-medical purchase history).

Strong enforcement and meaningful penalties

Perhaps the biggest shortfall of existing consumer privacy protections is the lack of a strong enforcement mechanism and significant fining authority. As CDT has written, the FTC today is stuck with a broken consent decree model, accomplishing all of its privacy enforcement through negotiated settlements, which usually amount to a slap on the wrist. The agency cannot fine a company for a first-time violation of our federal prohibition against deceptive or unfair trade practices; it can only levy a fine after a company has violated its own consent decree. Our draft includes the authority to levy fines for a first-time violation of the law, at levels we think are fair but meaningful.

State legislatures and attorneys general have played an important role in protecting privacy and data security in the absence of federal action, and state AGs should continue to have enforcement authority under a federal law. However, a strong federal privacy law should also provide some regulatory and compliance certainty for covered entities. We have attempted to draft a carefully scoped state preemption provision that would provide that certainty without preempting other protections such as state civil rights laws, criminal laws, and privacy torts. Getting this language right is critical and will be one of the most difficult challenges legislators face.

What’s next?

We have taken our best shot at addressing the numerous difficult challenges that drafting privacy legislation raises. Here are some of the hardest policy and drafting questions we faced that will warrant thoughtful discussion in the new Congress.

  • Individual Rights: Our bill includes individual rights that in many ways mirror new EU and California laws. We felt strongly, though, that the rights and exceptions must be clear and definitive for the benefit of consumers and covered entities alike. Are these rights fair, clear, and operational?
  • Portability: What is the proper role of data portability rules in federal privacy legislation, and how can legislative language reflect existing technical reality?
  • Tech and Business Model Neutrality: Our bill applies across all unregulated sectors and sweeps in many different types of services, companies, and business models. How successful was this approach at avoiding unintended consequences?
  • Civil Rights: Our draft addresses unfair targeted advertising practices, particularly those that discriminate based on a protected or vulnerable class, through an FTC rulemaking. Are there additional measures we should be considering to address discriminatory data processing practices?
  • Free Expression: Have we appropriately tailored the privacy protections, including the individual rights (such as the right to delete personal information), to minimize the burden on First Amendment protected activities, such as publishing lawfully obtained information?
  • Collection, Use, and Sharing Limits: Our bill relies heavily on purpose limitations to constrain the most sensitive uses of data. Are there additional use cases that should be presumptively unfair? Are our definitions for health information and other categories accurate?
  • Preemption: We have attempted to draft a provision that preempts state laws addressing the types of commercial data processing addressed by this law, but preserves state-level requirements that may involve data processing but are outside the scope of this bill. Is this tailoring appropriate? Have we missed any categories of state laws that should not be preempted?  
  • Disclosures: Nearly everyone agrees that “transparency” provides limited privacy protection on its own but is a necessary component of privacy legislation. Have we struck the right balance between providing individuals with meaningful information and providing regulators and advocates with more detailed information about corporate privacy practices?

Want to see more of our work toward comprehensive federal privacy legislation? Check out our Federal Privacy Legislation Campaign.