CDT Signs Onto Principles for Privacy Legislation, Calls On NTIA to Promote Robust Privacy Law in Congress
With the midterm election now largely settled, all signs point to consumer privacy law as being one area where Congress and the Trump administration can work together to advance rules that will force companies to safeguard and use our information responsibly. In that spirit, today CDT joined with 34 other civil rights, consumer, and privacy advocacy organizations in releasing public interest principles for privacy legislation. Together, we have called for Congress to enact a law that promotes fairness, prevents discrimination, and advances equal opportunity wherever and whenever data is collected, used, or shared. We have also filed comments with the Trump administration calling for the same.
The unfortunate reality is that many commercial practices today are simply not fair to individual consumers. Data brokers resist transparency requirements, even as no one knows who they are. Other companies collect detailed behavioral data about their own customers “just because.” Companies promise control over precise geolocation information, yet collect and sell location data with abandon. Often, companies insist these activities are ultimately about providing better advertising, but this ignores how data is used to discriminate in advertising, excluding older workers from even seeing job ads or targeting “abortion-minded women” visiting health clinics.
The Trump administration must also work with Congress to enact meaningful privacy legislation that addresses these harms. The coalition’s legislative principles follow last Friday’s comment deadline set by the National Telecommunications & Information Administration (NTIA), which is working to develop an administration-wide approach to privacy. In our comments, CDT called on the NTIA to put forward a concrete legislative proposal. We appreciate the NTIA’s recognition that companies must embrace longstanding Fair Information Practice Principles (FIPPs), as well as internal accountability and risk management efforts, but voluntary frameworks and internal corporate processes are insufficient to protect our privacy.
CDT’s comments underscore our concerns with relying on staffing, privacy-by-design procedures, or internal privacy risk assessments as the primary basis of privacy protection. While shifting the responsibility for protecting data away from individuals and onto companies is a laudable goal, accountability and risk management rely on the internal privacy values of businesses. Absent a firm set of legislative rules, risk management still gives companies considerable discretion to decide what risks individuals should assume.
We also lack a shared consensus on what privacy risk even is. To the extent that risk management becomes part of the administration’s proposal, CDT recommends adopting the set of risks compiled by the National Institute of Standards and Technology (NIST). NIST acknowledges that privacy risks extend beyond economic loss and include diminished capacity for autonomy and self-determination, discrimination (legal or otherwise), and a generalized loss of trust. An even more extensive framing of risk appears in a legislative discussion draft from Intel, which includes (1) psychological harm, (2) significant inconvenience and loss of time, (3) adverse eligibility determinations, (4) stigmatization and reputational harm, (5) unwanted commercial communications, (6) price discrimination, and (7) other effects that alter experiences, limit choices, or influence individuals, in addition to expected financial or physical harms.
Developing meaningful privacy protection requires addressing broader equity and fairness concerns raised by new technologies. A bigger challenge for a federal privacy framework is how to address the risks from opaque and discriminatory algorithms. Applications of artificial intelligence and machine learning present a difficult test for privacy and are an extensive focus of the EU General Data Protection Regulation (GDPR).
Privacy debates must resolve how ubiquitous data flows generate information and power asymmetries that benefit companies at individuals’ expense. This may also require the NTIA to consider how privacy norms are shaped by user experience and user interface design, as well as so-called “dark patterns.” Privacy management now goes beyond what individuals can reasonably control, with spillover effects that impact the public at large. At a minimum, we recommend the NTIA solicit additional perspectives from civil rights organizations to ensure it crafts privacy protections that address the concerns of marginalized and vulnerable communities.
Personal information is an evolving concept, and the NTIA must clarify the scope of information covered under its approach to privacy. A broad definition of personal information is appropriate in today’s digital environment to protect consumers and capture evolving business practices that undermine privacy. CDT endorses the test adopted by the Federal Trade Commission, as well as the Office of Management & Budget, which considers information to be personal data where it can be linked, or made reasonably linkable, to an individual.
We also urge careful consideration of what exemptions should apply to de-identified and other types of anonymized data under a federal privacy law. De-identification is a valuable process for protecting privacy, but CDT suggests it is time to reassess what reasonable de-identification measures should entail and to acknowledge the growing misuse of aggregated and traditionally non-personal data.
In the end, the United States needs specific rights created through legislation in Congress. Federal law must go beyond giving individuals more notices and choices about their privacy; it is time for legislators to flip the privacy presumption and declare some data practices unfair. As CDT recently testified before Congress, we advocate a legislative solution that (1) grants affirmative individual rights to data, including access, correction, deletion, and portability; (2) requires reasonable security practices and transparency from companies; (3) prevents advertising discrimination against protected classes; and (4) presumptively prohibits certain unfair data practices. These protections must be backed by robust enforcement.
CDT hopes the administration champions this approach, and as the public interest privacy legislation principles demonstrate, there are many organizations that stand ready to work with the NTIA and Congress to propose concrete language to these ends.
CDT’s comments to the NTIA are available here, and you can read the Public Interest Privacy Legislation Principles here.