

Analysis of the Consumer Privacy Bill of Rights Act

Late last Friday afternoon, President Obama released a long-awaited draft of the Consumer Privacy Bill of Rights Act.  Industry and advocates have been waiting for this language since the White House announced support for comprehensive commercial privacy legislation in its February 2012 privacy report. The United States is one of only two developed nations without privacy protections for all personal data (Turkey is the other one); instead, we have a handful of sector-specific laws that apply to narrow categories of personal information, and general-purpose consumer protection law enforced by the FTC that maps imperfectly to privacy rights.

We’re glad to see the administration continue to provide needed leadership on this issue.  Consumers feel uneasy about how information about them is collected and used, and with the advent of increasingly sophisticated technologies, there is a pervasive sense that we’ve lost control of our privacy.  So the administration’s concrete legislative proposal is an incredibly important step.  The bill does have some significant flaws, but the administration seems to admit as much by explicitly labeling the Consumer Privacy Bill of Rights a “discussion draft.”  A number of elements need to be improved if this bill is going to offer consumers comprehensive protection. As it is, there are too many loopholes, and enforcement is noticeably lacking.  CDT is committed to working with the administration and Congress on strengthening this bill to ultimately deliver the privacy rights and protections that we deserve.

As a first step in that process, we’ve put together our initial summary of some of the bill’s most important provisions, along with tentative suggestions for changes.  Like the administration, we recognize that our analysis may not take everything into account, so we intend to iterate on this guidance over time.  With that caveat, let’s dive right in.

The bill’s substantive protections track the Fair Information Practice Principles.  The linchpin of the bill and its single best feature is that its requirements are based on the decades-old notion of the Fair Information Practice Principles. Each of these foundational notions is reflected in the bill:  generally speaking, the bill requires Transparency (§101), Individual Control (§102), Respect for Context (§103), Focused Collection and Responsible Use (§104), Security (§105), Access and Accuracy (§106), and Accountability (§107).  (While data minimization was not a dedicated principle in the President’s version of the FIPPs, §104 on Focused Collection provides for limitations on data collection and retention.)

Some have argued in recent years that privacy protection should focus almost exclusively on responsible use at the expense of the other traditional protections; indeed, the President’s Council of Advisors on Science and Technology argued as much this summer in its report on Big Data (the White House’s own Big Data report was more measured).  We’re pleased to see that the legislation includes a full range of protections beyond use limitations: minimizing the role of individual control, transparency, and collection limitations is inconsistent with consumer desires and expectations, and unreasonably presumes that nothing bad can happen to stored data.  As we have seen in recent years, data security can be compromised, and governments can access commercial data sets, often without reasonable due process requirements.  Moreover, companies and individuals might not always see eye-to-eye on what constitutes reasonable use.  The FTC recently rejected the notion that use limitations are sufficient to safeguard privacy; we’re glad to see the administration do the same.

However, most of the bill’s protections are predicated on a risk of harm.  One very important element of the bill is that many of the FIPPs-based protections described above only apply in proportion to the “privacy risk” the data poses (the word “risk” appears 27 times in the 24-page bill).  The bill defines “privacy risk” fairly narrowly — it is the potential that data could “cause emotional distress, or physical, financial, professional or other harm to an individual.” §4(g).  This is a significant shift away from a rights-based formulation adopted in other privacy legislation.  While an assessment of risk makes sense for some protections — such as what level of security or internal accountability to mandate — predicating rights such as individual control (§102) and access-and-correction (§106) on an evaluation of risk is questionable.  Predicating rights on this narrow formulation would leave many if not most data sets — such as information collected for marketing purposes — unaddressed.  Consumers might reasonably want to limit what gets collected about them — or even know what’s been collected — absent a significant risk of harm or emotional distress.  Given that one significant rationale for this legislation is to demonstrate internationally — particularly to Europe — that the United States is serious about reforming its data protection regime, conditioning so much of the bill’s protections on a narrow assessment of harm is counterproductive (the EU’s Article 29 Working Party previously rejected the notion of conditioning rights on an assessment of privacy risk).

The bill also exempts a number of business records from many substantive protections.  Similar to personal data that doesn’t pose a risk of privacy harm, many business records are also exempted from protections such as individual control (§102(d)) and data minimization (§104(c)(1)).  Certainly, many data records should be exempt from individual control — you can’t opt out of a company’s security database that identifies you as a previous shoplifter. However, the categories of exceptions are extremely broad in this bill (including a tautological definition of “customary business records” in §4(j)), and information collected doesn’t need to be necessary for these purposes, only related.  At the very least, reasonable minimization requirements — so that covered entities collect only the data that is necessary for (rather than merely related to) these purposes — would be an improvement.

The bill applies to companies and non-profits alike.  Today, the Federal Trade Commission only has the authority to bring cases against for-profit commercial enterprises.  This bill — in addition to expanding the substance of privacy requirements — would also expand the range of institutions to which these requirements apply to include non-profit entities (meaning CDT would be subject to broad privacy requirements for the first time).  §4(b).  This is a sensible change, as non-profits — including political campaigns — have the capacity to collect a great deal of potentially sensitive personal information.  However, the bill does have an exception for entities that collect information about fewer than 10,000 people in a year or that have five or fewer employees.  §4(b)(1)(D).  Given that small developers have the capacity to collect vast amounts of personal information, we’re not entirely convinced that this exception is justified.

The bill has a very broad definition of personal information.  The bill applies generally to a wide range of information — considerably beyond the traditional definition of “personally identifiable information” (PII) addressed in some existing sectoral legislation.  §4(a).  As privacy commentators have recognized for years now, consumers have an interest in pseudonymous identifiers such as device IDs and cookies because (1) those identifiers can be used to dramatically alter individual experiences and (2) information attributed to unique identifiers could — quite easily in many cases — be tied to real-world identity.  Identifiers for devices that aren’t associated with a particular individual or family — say, a streetlight that is connected to the internet in order to be remotely turned on and off — are reasonably excluded from the definition of personal information.

Out-of-context data collection requires opt-in consent — or approval by a Privacy Review Board.  So here’s where the bill gets rather complicated.  As noted above, consumers get certain rights if the data collected about them poses a “privacy risk.”  However, if data collection is “out of context,” then consumers must be given “Heightened Transparency and Control” over that data (basically, they must provide affirmative consent in response to a clear and prominent prompt).  §103(b)(1).  Determining when data processing is “unreasonable in light of context” is somewhat difficult — there’s an eleven-factor test for what constitutes “context” in the definitions, including consideration of the nature of the institution’s relationship with the consumer and the types of data collected.  §4(k).  Previous privacy bills such as Representative Rush’s BEST PRACTICES Act and the Kerry-McCain Commercial Privacy Bill of Rights have adopted a simpler approach: opt-out controls for most data, opt-in controls for “sensitive” data.  That approach is at the very least more predictable and certainly easier to implement.  Alternatively, an obligation to provide just-in-time notice and opt-in controls could be tied to reasonable expectations or the likelihood that a consumer would be surprised by certain practices — arguably, the FTC has used this as a benchmark in previous enforcement actions. One of the key exercises going forward will be to evaluate this creative new idea of context and compare its strengths and weaknesses against existing approaches.

Notably, the bill does provide for an exception to the requirement for Heightened Transparency and Control if an FTC-approved “Privacy Review Board” (think of a medical research institution’s Institutional Review Board) finds that opt-in control is impractical, and the privacy risks have been minimized and are outweighed by “substantial benefits” that don’t just accrue to the covered entity (this idea is tied to an expectation that big data will have a value to consumers and society).  Even if a PRB waived a requirement of consent, the company would still have to provide transparency and individual control under §§101 and 102 of the bill, though general data minimization requirements are curiously overridden (§104(c)(3)).  Moreover, whether out-of-context data processing is approved by an individual or a PRB, a disparate impact analysis must be conducted to ensure that the data processing wouldn’t result in unintended discrimination against various protected classes.  §103(d).  We are wary about transferring too much individual autonomy to obscure and unproven PRBs, but we appreciate the effort to provide a creative solution to the problem of how to encourage societally beneficial but unanticipated secondary uses of data.  The idea merits greater exploration.
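
For readers who want this flow in one place, here is a rough sketch of how we read the interaction between §103’s consent requirement and the PRB waiver.  This is a simplification for illustration only: all names are ours, and it deliberately glosses over the eleven-factor context test and the PRB approval criteria.

```python
# A deliberately simplified sketch of the bill's consent flow as we read it.
# All names here are hypothetical; the actual "context" analysis (§4(k))
# and Privacy Review Board criteria are far more nuanced than booleans.

from dataclasses import dataclass

@dataclass
class Processing:
    out_of_context: bool            # "unreasonable in light of context" (§103)
    consumer_opted_in: bool         # affirmative consent to a clear, prominent prompt
    prb_approved: bool              # waiver from an FTC-approved Privacy Review Board
    disparate_impact_checked: bool  # §103(d) analysis for unintended discrimination

def may_process(p: Processing) -> bool:
    """Roughly: may the covered entity proceed with this data processing?"""
    if not p.out_of_context:
        # In-context processing: the ordinary FIPPs duties (§§101-107) apply,
        # but no Heightened Transparency and Control is triggered.
        return True
    # Out-of-context processing requires either opt-in consent or a PRB waiver,
    # and either way the §103(d) disparate impact analysis must be conducted.
    # (After a PRB waiver, §§101-102 transparency and control still apply,
    # though §104(c)(3) curiously overrides general minimization duties.)
    return (p.consumer_opted_in or p.prb_approved) and p.disparate_impact_checked
```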

Retroactive revisions to privacy policies no longer need permission.  Under existing law, the Federal Trade Commission has brought enforcement actions against companies that tried to retroactively revise privacy policies to change the rules for previously collected data (that is, a company made certain material representations in its policy at the time of collection, and then retroactively changed the policy to remove those conditions).  This of course makes sense — you can’t promise someone at the time of data collection that you won’t sell their email address to a data broker, and then change your mind after the fact.  Surprisingly, the President’s bill reverses course on this principle.  Under §102(e), companies that change their policies simply need to post information about the change in advance, and then offer “compensating controls designed to mitigate privacy risks” from the change.  This weakening of existing law is a bad idea, and this provision should be removed.

The bill has weak enforcement provisions.  In order to effectively deter privacy violations, a privacy law needs to have robust enforcement — anemic enforcement powers have been blamed by many for uneven compliance with the Data Protection Directive in Europe.  However, the Consumer Privacy Bill of Rights Act is hamstrung by strangely weak enforcement powers.  Most notably, privacy fines are calculated not by the number of affected individuals, but by the number of days during which a violation occurs.  Thus, if a multibillion dollar company decided to sell millions of records in one day in violation of its promises, under this bill it would face a maximum fine of $35,000 — an incredibly perverse result.  State Attorneys General and citizens themselves are not empowered to obtain penalties at all.  And new companies get a free pass from penalties for the first 18 months of their existence (consider that Instagram had over a million users within two months of launch — and 150 million pictures and 10 million users in less than a year).  If this bill is to be effective in deterring bad privacy practices, the consequences for behaving illegally need to be significant.
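
To make the arithmetic concrete, here is a quick back-of-the-envelope comparison, assuming (as described above) a penalty capped at $35,000 per day of violation with no multiplier for the number of affected individuals:

```python
# Hypothetical illustration of the per-day penalty cap described above.
# The $35,000/day figure reflects our reading of the discussion draft;
# the point is simply that the cap ignores how many people are affected.

MAX_PENALTY_PER_DAY = 35_000  # dollars

def max_penalty(days_of_violation: int) -> int:
    """Maximum fine under a purely per-day cap."""
    return MAX_PENALTY_PER_DAY * days_of_violation

# Selling millions of records in a single day:
print(max_penalty(1))    # 35000 -- $35,000 total, regardless of scale

# A trivial notice defect left unfixed for a year:
print(max_penalty(365))  # 12775000 -- $12,775,000 total
```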

The FTC can approve specific Codes of Conduct as “safe harbors.”  Given the vagueness surrounding many of the bill’s provisions (including how both risk and context are evaluated), covered entities will be eager for some sort of certainty that a specific set of practices will comply with the law.  While the draft bill does not allow for FTC rulemaking to provide more in-depth guidance, it does allow the FTC to bless certain industry-specific codes as compliant with the law.  These codes can be developed through the existing National Telecommunications & Information Administration’s (NTIA) multistakeholder process that has delivered mixed results (at best) so far at generating consensus codes of conduct (though presumably institutions will have more incentive to generate enforceable codes if there are substantive obligations from which they want to establish a safe harbor).  Codes can also be developed outside of the NTIA process, potentially entirely by covered entities without multistakeholder input; however, these codes do not enjoy the presumption of validity and streamlined approval process that NTIA codes get.  CDT believes that FTC-approved codes can play a helpful role in translating high-level legal obligations into industry-specific practices — however, the FTC needs to be given the necessary resources to evaluate the potential influx of proposed codes.

With limited exceptions, the bill preserves existing federal law, but preempts current and future state privacy and security laws.  Although the bill purports to create a universal privacy standard, it largely preserves existing privacy statutes — including the Communications Act’s more stringent protections for certain data collected by ISPs.  §401.  (The similarly strict Video Privacy Protection Act does not fare as well — it is completely overridden.)  State privacy and data security laws are by and large preempted, though common law (such as privacy torts), laws about children, laws on health and financial data, and state data breach notification laws are preserved (this bill doesn’t provide for breach notification — the President previously proposed a different bill to address that issue, and that bill would broadly preempt state notification laws).  Importantly, both federal and state general-purpose consumer protection laws — the main vehicle through which privacy is protected today — are not preempted, though where this bill is more specific (such as on retroactive privacy policy changes), more limiting interpretations of current law may be scaled back.

CDT could support broad preemption provisions in a federal bill to create a common national standard, but the underlying substantive standard would need to be quite strong.  Unfortunately, the President’s bill is not there at this point — too much personal information is exempted from too many protections, and the enforcement mechanisms won’t be an effective deterrent.

We are committed to working toward a common, strong privacy standard that would apply to all of our personal information.  The President’s and the FTC’s calls for comprehensive privacy legislation were important steps toward that goal, and this draft bill is another step along the way.