FTC Once Again Says Privacy Self-Regulation Isn’t Enough
Yesterday, the Federal Trade Commission released the final version of its long-awaited privacy report, calling on Congress to consider baseline privacy legislation and consolidating the principles behind its recent privacy enforcement actions. While the final report largely tracks the preliminary report that the FTC released in December 2010, it does offer a number of changes and clarifications, as well as some important – and timely – reminders about hot-button issues like Do Not Track.
The publication of this report, which follows the February release of the Administration’s “Consumer Privacy Bill of Rights,” comes as Europe is knee-deep in its own reconceptualization of its privacy-protection regime. The EU is currently kicking around a draft regulation that creates strong protections for consumers but also includes some pretty daunting obligations for businesses. The US Department of Commerce has been heavily lobbying for a more nuanced approach to privacy, but has found itself facing a common European refrain: If the US can’t convince its own global companies to respect (European) users’ privacy, then Europe has no choice but to step in and do the job itself. And European regulation will be far tougher on US companies than any privacy law passed here would ever be.
In other words, the fight for privacy protections here in the US is a fight that should engage not just consumers but also American companies.
The Big Picture
In calling for Congress to “consider enacting baseline privacy legislation,” the FTC takes its assertion that “self-regulation has not gone far enough” to the next level, offering actionable suggestions for how to address the failures of self-regulation: privacy legislation and, in its absence, enforcement via the multi-stakeholder codes of conduct being developed through the Department of Commerce.
The FTC also demonstrates an admirable commitment to a technology-neutral approach designed to stay evergreen. In its report, the Commission not only refuses to distinguish between offline and online data, but also takes care to construct a framework that will accommodate new, unforeseen technologies.
However, while the FTC does call for shorter, easier-to-understand privacy policies, it continues to emphasize the use of “public commitments” – as opposed to substantively unfair or deceptive practices – as its enforcement hook (that is, when companies violate their public commitments, they have engaged in a deceptive practice). We are disappointed that the Commission emphasized neither its power to bring cases against unfair actions nor the reasoning it developed in its 2009 settlement with Sears Holdings. In that settlement, the Commission held that Sears had committed a deceptive practice not by violating a “public commitment” but rather by burying a truthful description of its highly problematic privacy practices deep in a lengthy license agreement.
The Commission’s focus on “public commitments” will do little to change an incentive structure that has long encouraged companies to be evasive and vague in their notices to users: Companies are most likely to get called out for violating explicit promises to users, so many realize that the less they promise, the less trouble they court. It’s not a paradigm that gives consumers much useful information about the privacy practices of the products and services they use.
Into the Weeds
Changes to scope
The FTC has tweaked the applicable scope of its framework, exempting companies that collect only non-sensitive data from fewer than 5,000 consumers a year and do not share that data with third parties. This narrow exclusion matches those found in past privacy bills (e.g., the BEST PRACTICES Act) and is one that CDT has advocated for.
Don’t cry for PII
The FTC acknowledges in its report that the categorical distinction between so-called “personally identifiable information” and other information is becoming less important each day. Accordingly, the FTC sets forth the following standard to determine whether a company may properly say it “de-identifies” its customers’ data: A company must take “reasonable measures” to ensure data is de-identified (an evolving standard), must make FTC-enforceable commitments that it will not try to re-identify the data, and must pledge to contractually prohibit other companies with which it shares the data from attempting re-identification. We believe this is a reasonable approach that balances the protection of a broader range of personal data with the need for companies to be able to innovate using data that won’t be tied back to individual users.
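To make the flavor of those “reasonable measures” concrete, here is a minimal, hypothetical sketch (in Python) of one common first step: stripping direct identifiers and replacing the user ID with a keyed pseudonym before sharing data. The field names, the HMAC approach, and the key handling are our own illustration; nothing in the FTC report prescribes a particular technique, and technical measures alone don’t satisfy the standard without the enforceable commitments described above.

```python
import hmac
import hashlib

# Illustrative only: the secret key stays with the original data holder
# and is never shared alongside the "de-identified" dataset.
SECRET_KEY = b"example-key-kept-private"

# Hypothetical direct identifiers to strip before sharing.
DIRECT_IDENTIFIERS = {"name", "email", "street_address", "phone"}

def de_identify(record: dict) -> dict:
    """Return a copy of `record` with direct identifiers removed and the
    user ID replaced by an HMAC-SHA256 pseudonym."""
    cleaned = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "user_id" in cleaned:
        cleaned["user_id"] = hmac.new(
            SECRET_KEY, str(cleaned["user_id"]).encode(), hashlib.sha256
        ).hexdigest()
    return cleaned

print(de_identify({
    "user_id": 12345,
    "name": "Alice",
    "email": "alice@example.com",
    "page_views": 42,
}))
```

Note that keyed pseudonymization like this reduces, but does not eliminate, re-identification risk – which is exactly why the FTC pairs the technical requirement with enforceable commitments not to re-identify.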
Major change in choice/”commonly accepted practices”
Much of the FTC’s privacy report hinges on the idea of “choice.” Consumers need to have meaningful choice over the extent to which their data is collected and used. In both its preliminary report and now this final report, the Commission appropriately sets out situations in which it makes little sense to offer consumers “choices.” For example, when a consumer orders a product online, it does not make sense to give her a choice about whether or not her address can be shared with the company that will be shipping the product to her doorstep.
In the preliminary privacy report, the FTC put forth a set of “commonly accepted practices”: Situations in which companies should not have to worry about offering consumers choice. In our response comments to the Commission, CDT expressed concern that the scope of some of these exceptions was too broad. In the final privacy report, the FTC has changed this formulation to one that matches the Department of Commerce’s “respect for context” principle: When the data collection/use is reasonable given the context of the interaction, companies don’t need to offer consumers choice about this collection/use. While this approach makes some sense, the FTC will need to make sure it is providing appropriate guidance to companies while not allowing them to claim any and all data collection/uses as “appropriate” for a given context.
Privacy concerns raised by “large platforms”
The FTC also calls out the strong privacy concerns presented by “large platforms” (e.g., ISPs, browsers, and operating systems) that have a front-row view of consumers’ online activities. Any tracking across sites should be subject to choice, but this is especially important when a platform has access to all or substantially all of a consumer’s activities. Today, an increasingly diverse set of “platforms” has access to such information, so we are pleased that the FTC has not limited its discussion of such platforms to ISPs alone. CDT has long been concerned about monitoring by such entities, given the comprehensive scope of the surveillance involved, and we are glad to see this emphasis in the final report.
Don’t change the rules in the middle of the game: Affirmative express consent for retroactive changes to privacy policy
Consolidating its recent “case law” of voluntary consent orders, the FTC reaffirms in its report that if a company makes a promise to users in its privacy policy about how it will treat their data, it can’t renege on that promise without first getting “affirmative express consent” for these changes. Offering consumers the chance to opt out is simply not sufficient and does not qualify as consent. While not surprising – this has effectively been the law of the land for a while now – we’re glad that the FTC reinforced its intention to hold companies to those few promises that they do make to users.
Do Not Track means just that
The FTC also offers a timely reminder that its expectations for the Do Not Track standards that are currently in development are high. A mechanism that merely allows consumers to avoid targeted ads (but still be tracked) would be insufficient. The Commission writes: “an effective Do Not Track system should go beyond simply opting consumers out of receiving targeted advertisements; it should opt them out of collection of behavioral data for all purposes other than those that would be consistent with the context of the interaction (e.g., preventing click-fraud or collecting de-identified data for analytics purposes).”
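For readers unfamiliar with the mechanics: Do Not Track is expressed as a simple HTTP request header (“DNT: 1”) sent by the browser. As a purely illustrative sketch of what honoring it in the Commission’s sense might look like server-side, consider the following Python example (the Flask framework, route, and helper functions are our own hypothetical choices, not anything specified by the standard or the report):

```python
from flask import Flask, request

app = Flask(__name__)

def log_behavioral_event(req):
    """Hypothetical stand-in for cross-site behavioral data collection."""

def record_click_fraud_signal(req):
    """Hypothetical stand-in for collection consistent with the context
    of the interaction (e.g., click-fraud prevention)."""

@app.route("/article")
def article():
    # The browser signals the user's preference via the DNT request header.
    if request.headers.get("DNT") != "1":
        log_behavioral_event(request)  # behavioral collection only without DNT
    # Per the FTC's framing, context-consistent collection may continue
    # regardless of the DNT signal.
    record_click_fraud_signal(request)
    return "article body"
```

The point of the sketch is the one the FTC makes: under DNT, the collection branch is skipped entirely, not merely the display of a targeted ad.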