States are Letting Us Down on Privacy

On this International Privacy Day, it’s Time to Acknowledge That States are Enacting Industry-Friendly Privacy Legislation

States have been on a tear passing privacy legislation over the past several years, motivated at least in part by the lack of federal privacy protections. At least 13 states now have laws in place that provide some protections for online data. That’s good — states should be active in protecting privacy. States are the laboratories of democracy, and when we can, we should let them experiment with different policy approaches. States are adopting some beneficial provisions in their privacy bills, like data broker registries and user rights to access, correct, and delete data.

Generally, however, each state is simply adopting a minor variation of prior state laws, and those laws so far lack one of the most meaningful privacy protections: data minimization. Data minimization ensures that companies collect only the data necessary to provide the product or service an individual requested. The strictest definition would disallow collecting any data beyond what is absolutely critical. More appropriate definitions allow data collection for other specified, allowable purposes, such as authenticating users, protecting against spam, performing system maintenance, or complying with other legal obligations.

Data minimization requirements place the privacy-protecting burden primarily on the companies that collect and exploit data, rather than on already overburdened consumers. U.S. privacy law has developed largely through the Federal Trade Commission’s authority to prevent “deceptive” practices, which protects people from being misled but, in practice, requires little more than that companies disclose their data practices and then abide by those disclosures: the notice-and-consent model. For years, however, most people have agreed that notice-and-consent has failed, in large part because people do not read or understand laborious, labyrinthine privacy policies.

Narrowing the categories of data that companies can collect is important because merely collecting and hoarding massive amounts of data produces a variety of privacy harms: the company becomes a larger target for hackers and other unauthorized access; breaches of that data lead to further downstream harms like identity theft; and the data is put to unknown or secretive subsequent uses, such as being sold to third parties that compile detailed individual profiles and use the data (particularly sensitive data) for targeted advertising, among other harms.

Reducing the data collected also protects against another significant harm: law enforcement access. Any data a company has access to, law enforcement can also access. The Supreme Court’s decision in Dobbs v. Jackson Women’s Health Organization raised the salience of this concern, as people realized that any data that could be used to identify whether a person sought or received an abortion (location data and communications data, among many others) could be obtained by law enforcement.

States have been letting us down on data minimization. The concept has been co-opted at the state level to mean little more than that companies cannot collect data for purposes they have not disclosed to the consumer. For instance, Section 6(a)(1)-(2) of Connecticut’s privacy law states that a company shall “limit the collection of personal data to what is adequate, relevant[,] and reasonably necessary in relation to the purposes for which such data is processed, as disclosed to the consumer,” and “not process personal data for purposes that are neither reasonably necessary to, nor compatible with, the disclosed purposes for which such personal data is processed, as disclosed to the consumer, unless the [company] obtains the consumer’s consent.” Connecticut’s minimization requirement is not effective because it allows companies to continue collecting data for essentially any purpose stated in a privacy policy. That is already the law under the FTC Act’s deception standard, and, as noted above, most consumers do not read privacy policies. Virginia’s (Section 59.1-578(A)(1)) and Texas’s (Section 541.101(a)(1), (b)(1)) privacy laws use nearly identical language, as does New Jersey’s law (Section 9(a)(1)-(2)), which just passed earlier this month. While some states require opt-in consent for processing sensitive data, those provisions are also insufficient because states often define sensitive data very narrowly (Virginia’s definition, for example, is limited to children’s data, demographic data, location, and biometrics).

The California Consumer Privacy Act has seemingly similar language in Section 1798.100(c), requiring that a company’s collection be “reasonably necessary and proportionate to achieve the purposes for which the personal information was collected or processed, or for another disclosed purpose that is compatible with the context in which the personal information was collected, and not further processed in a manner that is incompatible with those purposes.” However, the California Privacy Protection Agency subsequently passed rules stating (Section 7002(d)(1)) that companies should seek to collect the minimum information necessary to achieve the identified purpose. For instance, to mail a product and send an email confirmation, the only information needed is a physical address and an email address. Companies must also take into account (Section 7002(d)(2)) potential negative impacts of collecting data, including that precise geolocation data may reveal sensitive information, such as visits to sensitive locations like health care providers. Colorado’s privacy regulations, in rule 6.07(A), include a similar requirement that companies “determine the minimum [p]ersonal [d]ata that is necessary, adequate, and relevant for the express purpose.”

Despite California’s more detailed rules, most states have enacted language similar to the Connecticut law, which ultimately has little impact on company data practices; it is merely a continuation of the failed notice-and-consent regime. The language not only allows, but bakes into state law and policy, the privacy status quo that so many people disfavor. It also places essentially no burden on companies to curtail their data practices. In most of these states with “comprehensive privacy” laws, if a company wants to build profiles of all its customers, sell all the data it collects to third parties to increase its revenue, or hoover up every data point it can to train its internal artificial intelligence systems, the only thing stopping it is stating that purpose in a privacy policy.

People should not be satisfied with these laws. At the federal level, significant resources and discussion have gone into finding a reasonable approach to data minimization with the American Data Privacy and Protection Act (ADPPA). The bipartisan legislation included a strong minimization requirement: companies could collect sensitive data (defined broadly, unlike in the state laws, to include health, communications, contacts, financial information, biometric data, and other types of data) only to the extent strictly necessary to provide the product or service requested, or strictly necessary for one of several other specified allowable purposes. This requirement would have placed significant privacy obligations on companies themselves, forcing them to justify their data collection rather than continuing to place the burden on the shoulders of individuals.

Some states have or have had pending legislation that would provide similarly strong minimization requirements. Massachusetts is considering its own state-level ADPPA called the Massachusetts Data Privacy Protection Act, which CDT supports. Maine legislators have proposed a similar bill. 

Luckily, existing state laws can be fixed. Laws can be amended and updated to reflect current practices and technologies. States that have already adopted privacy legislation should strengthen those laws by revising their minimization language to more closely reflect the ADPPA or the Massachusetts and Maine bills. States that have not yet passed legislation should ensure that any future privacy law includes similar language. If they don’t, states will only continue letting us down on privacy.