State Attorneys General: Evading Privacy Settings is Illegal
Earlier this week, 37 state attorneys general announced a $17 million settlement with Google for placing third-party cookies on Safari browsers in violation of Safari’s privacy settings. These allegations aren’t new — privacy researcher Jonathan Mayer uncovered this practice over two years ago, and the Federal Trade Commission has already reached its own $22.5 million settlement with the company for the same behavior. Usually, I don’t like seeing states expend time and effort to replicate cases that the FTC has already prosecuted (and vice versa). Regulators have limited resources and need to manage their caseload to maximize the impact that their cases will have on the ecosystem. Just replicating a previous federal settlement (even if it results in a substantial fine) doesn’t typically benefit consumers or provide additional guidance to companies on how to model their behavior. There’s no shortage of potential privacy investigations out there; why retread the same ground?
This instance, however, is different. The state AGs’ settlement agreement is considerably more expansive than the FTC’s, and potentially establishes a new precedent for companies: evading privacy controls — even default privacy controls — is per se deceptive. If it’s illegal for companies to try to get around privacy controls, that’s a big deal for consumers.
The FTC’s case against Google was very narrow: it was predicated entirely on promises the company made that it wouldn’t drop cookies on Safari users. Google has said that it never intended to cookie Safari browsers, and its behavior seems to back that up: its help page explicitly said that because Safari blocks third-party cookies, Safari users are effectively opted out of behavioral ads. (Google claims a bug introduced by Google Plus integration caused the company to start dropping the cookies.) The FTC’s complaint alleges only that Google’s help center language (and other similar statements) was deceptive once the company’s conduct contradicted it; the evasion of Safari’s privacy settings itself was not charged as deceptive. The only injunctive relief the FTC obtained (in addition to the $22.5 million) was that Google had to try to erase the cookies it had inadvertently dropped. There was no prohibition on using similar techniques in the future. (The FTC exhibited similar reluctance to find the vile practice of browser history sniffing to be inherently deceptive in its settlement with Epic Marketplace. We blogged about that case here.)
The states’ settlement acknowledges the deceptive statements, but also finds the underlying practice — placing cookies in contravention of a browser’s privacy settings — inherently deceptive as well. It’s a somewhat aggressive interpretation of deceptive practices law, but it makes sense: the code was designed to circumvent the browser’s privacy settings by tricking the browser into believing that Google was a first party (for a more detailed technical write-up of how Google managed to drop cookies in Safari browsers, refer to Mayer’s original blog post). The law prohibits deceptive practices, but it doesn’t mandate that users themselves be deceived — in this case, it was Apple’s Safari browser that was deceived. The FTC has taken action against similar tactics before: it brought a case against the data scraping company ReverseAuction for deceiving eBay into granting it access to its database of consumer information. But while the FTC was reluctant to find intrinsic deception here, the states forged ahead: their settlement prohibits Google from using the same technique to place cookies in the future unless the user consents.
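To make the "deceiving the browser" point concrete, here is a toy Python model of a cookie policy like the one Safari shipped at the time: third-party cookies are refused unless the browser believes the user has interacted with the third-party domain — a belief that an invisible, script-submitted form can manufacture. This is an illustrative sketch, not Safari's actual implementation or Google's actual code, and the domain names are hypothetical.

```python
class ToyBrowser:
    """A toy model of a third-party cookie policy, not real browser code."""

    def __init__(self):
        self.cookies = {}        # cookie domain -> cookie value
        self.interacted = set()  # domains the browser believes the user engaged with

    def set_cookie(self, page_domain, cookie_domain, value):
        """Accept a cookie only if it is first-party, or if the cookie's
        domain has (apparently) been interacted with by the user."""
        if cookie_domain == page_domain or cookie_domain in self.interacted:
            self.cookies[cookie_domain] = value
            return True
        return False

    def submit_form(self, target_domain):
        # The policy treats a form submission as a signal of user engagement.
        # A hidden form auto-submitted by script produces the same signal,
        # which is the sense in which the *browser* is being deceived.
        self.interacted.add(target_domain)


browser = ToyBrowser()

# A tracker embedded on publisher.example normally cannot set its cookie:
assert not browser.set_cookie("publisher.example", "tracker.example", "id=123")

# But if embedded script auto-submits an invisible form to tracker.example...
browser.submit_form("tracker.example")

# ...the third-party cookie now slips through the policy:
assert browser.set_cookie("publisher.example", "tracker.example", "id=123")
```

The point of the sketch is that no statement is ever shown to the user; the "deception" operates entirely on the browser's policy logic.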
The states could have made a clearer statement by going after one of the other companies Mayer’s blog post identified as dropping cookies in Safari, rather than Google. The Google case was made easier by the existence of clearly inaccurate statements, but the company was in a sense less culpable in that it never intended to place the cookies. Google has already paid $22.5 million to the FTC and deleted the cookies it inadvertently set. The other companies mentioned — Vibrant Media, PointRoll, and the Media Innovation Group — presumably knew what they were doing, and to date, they’ve gotten off scot-free (a class action against PointRoll was recently dismissed). If the states had gone after one of those companies, it would have sent a stronger message that the behavior alone was sufficient to support a deceptive practices claim.
Those cases could still be coming. In the meantime, it will be interesting to see how this precedent plays out with regard to other software privacy protections. Take Do Not Track, for example. It’s a somewhat different beast from a technical limitation on cookie setting, because it’s just supposed to express a user preference — and one that should clearly reflect a deliberate choice. However, if browsers were to try to enforce the standard by limiting access for companies that don’t honor the setting, efforts to get around that enforcement could be deemed deceptive.
Also, both Apple and Google have introduced privacy protections into their mobile operating systems to limit advertisers’ ability to access certain device identifiers used to track users across applications. If an advertiser or app were to access those identifiers in violation of Apple’s or Google’s policies, it would now risk a claim that the access itself was illegal. Increasingly, intermediaries like browsers and operating systems are responding to user demands for greater privacy. If it’s illegal for companies to try to get around these added protections, that dramatically levels the playing field for consumers.
In any event, it’s heartening to see states increasingly take action to protect consumer privacy. While I have some reservations about states enacting disparate regulations to govern online behavior, extra enforcers of one common standard (here, the prohibition on deceptive business practices) are welcome. I think the FTC has done a pretty good job with the limited privacy authority it has — though I wish it had been a bit more aggressive in this case — but given the proliferation of tracking technologies and growing consumer unease about whether their privacy is being respected, more cops working the beat can only help.