On This International Privacy Day, It’s Time to Acknowledge That States Are Enacting Industry-Friendly Privacy Legislation
States have been on a tear passing privacy legislation over the past several years, motivated at least in part by the lack of federal privacy protections. At least 13 states now have laws in place that provide some protections for online data. That’s good — states should be active in protecting privacy. States are the laboratories of democracy, and when we can, we should let them experiment with different policy approaches. States are adopting some beneficial provisions in their privacy bills, like data broker registries and user rights to access, correct, and delete data.
Generally, however, each state is simply adopting a minor variation of prior state laws, and those laws so far lack one of the most meaningful privacy protections: data minimization. Data minimization ensures that companies collect only the data necessary to provide the product or service an individual requested. The strictest definition would disallow collecting any data beyond what is absolutely critical. More appropriate definitions allow data collection for other specified, allowable purposes, such as authenticating users, protecting against spam, performing system maintenance, or complying with other legal obligations.
Data minimization requirements place the privacy-protecting burden primarily on the companies that collect and exploit the data, rather than on the already overburdened consumer. U.S. privacy law has developed largely through the Federal Trade Commission’s authority to prevent “deceptive” practices, which has produced a notice-and-consent regime that mainly protects people against being misled. For years, however, there has been broad agreement that notice-and-consent has failed, in large part because people do not read or understand laborious, labyrinthine privacy policies.
Narrowing the categories of data that companies can collect matters because simply collecting and hoarding massive amounts of data creates a cascade of privacy harms: the company becomes a larger target for hackers and unauthorized access; breaches of that data lead to further downstream harms like identity theft; and the data may be put to unknown or secretive subsequent uses, such as being sold to third parties that compile detailed individual profiles and use that data (particularly sensitive data) for targeted advertising.
Reducing the data collected also protects against another significant harm: law enforcement access to data. Any data a company holds, law enforcement can potentially obtain. The Supreme Court’s decision in Dobbs v. Jackson Women’s Health Organization raised the salience of this concern, as people realized that any data that could identify whether a person sought or received an abortion (location data and communications data, among many others) could be accessed by law enforcement.
The California Consumer Privacy Act contains language that resembles a data minimization requirement: under Section 1798.100(c), a company’s collection must be “reasonably necessary and proportionate to achieve the purposes for which the personal information was collected or processed, or for another disclosed purpose that is compatible with the context in which the personal information was collected, and not further processed in a manner that is incompatible with those purposes.” The California Privacy Protection Agency subsequently adopted rules (Section 7002(d)(1)) stating that companies should seek to collect the minimum information necessary to achieve the identified purpose. For instance, to mail a product and send an email confirmation, the only information needed is a physical address and an email address. Companies must also take into account (Section 7002(d)(2)) the potential negative impacts of collecting data, including that precise geolocation data may reveal sensitive information, such as visits to sensitive locations like health care providers. Colorado’s privacy regulations, in rule 6.07(A), include a similar requirement that companies “determine the minimum [p]ersonal [d]ata that is necessary, adequate, and relevant for the express purpose.”
People should not be satisfied with these laws. At the federal level, significant resources and discussion have gone into finding a reasonable approach to data minimization with the American Data Privacy and Protection Act (ADPPA). The bipartisan legislation included a strong minimization requirement: companies could collect sensitive data (defined broadly, unlike in the state laws, to include health, communications, contacts, financial information, biometric data, and other types of data) only to the extent strictly necessary to provide the product or service requested, or strictly necessary for one of several other specified allowable purposes. This requirement would have placed significant privacy obligations on companies themselves, forcing them to justify their data collection rather than continuing to place the burden on the shoulders of individuals.
Some states have pending, or have previously introduced, legislation that would provide similarly strong minimization requirements. Massachusetts is considering its own state-level version of the ADPPA, the Massachusetts Data Privacy Protection Act, which CDT supports. Maine legislators have proposed a similar bill.
Luckily, existing state laws can be fixed. Laws can be amended and updated to reflect current practices and technologies. States that have already adopted privacy legislation should strengthen those laws by revising their minimization language to more closely reflect the ADPPA or the Massachusetts and Maine bills. States that have not yet passed legislation should ensure that any future privacy law includes similar language. If they don’t, states will only continue letting us down on privacy.