Progress Made Revising EU Data Privacy Laws
For the past few years, European policymaking institutions have been working to revise the European Union’s data privacy laws. Last month that effort took a significant step forward as the Civil Liberties, Justice and Home Affairs (LIBE) Committee of the European Parliament voted on and passed new legislative text on a General Data Protection Regulation (GDPR). Developing consensus text was by itself a Herculean task, as members and committees across the Parliament had proposed thousands of amendments. The bill still has a number of hurdles to clear before it becomes law, but this vote was an important (and positive) step. Here are our key takeaways from the LIBE vote:
The LIBE amendments got a lot of things right
Most of the Compromise Amendments are considerable improvements to the previous text authored by the European Commission over a year and a half ago: they maintain robust data protections while undoing some previously unworkable provisions. Last year, CDT had urged a number of improvements to the Commission text, and we’re pleased to see that a lot of those concepts were mirrored in the new legislation.
One of the most significant and controversial elements of the revised text is the retention of the “legitimate interest” basis for data processing — that is, if a company has a “legitimate interest” in collecting and using your data, it may legally do so under the law. This provision has drawn the ire of privacy advocates because of its extremely broad scope. The revised legislation retains, though narrows, the “legitimate interest” basis, but also strengthens a user’s ability to object to such processing (effectively, the ability to opt out). A previous broad prohibition on profiling without consent has also been limited, and now profiling can occur on an opt-out basis except where the profiling has “legal effects” on the user, or similarly affects a user’s interests, rights, or freedoms. (This is admittedly a somewhat vague and confusing test, though it seems to roughly correspond to the heightened protections that Americans enjoy under the Fair Credit Reporting Act for data used for credit or employment decisions.)
Practically speaking, these (and other changes) seem primarily designed to let marketing (and similar collection and usage) occur on an opt-out basis. We have always argued that there should be a continuum of user control over personal information depending on the sensitivity of the data and the purposes for which it is being used, and this new framework seems like a reasonable articulation of such a sliding scale. Users absolutely need control over how their information is collected and used for marketing, analytics and related purposes, but in many cases that can be fairly done on an opt-out basis. This approach also aligns with efforts within the W3C to standardize a global opt-out for behavioral marketing data collection (which is itself implicitly referenced in Article 19(2b) of the new legislative language).
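The W3C effort referenced above is the Do Not Track (“DNT”) header, a global opt-out signal a browser can send with every request. As a rough illustration of how a server honors that preference (this sketch is ours, not part of the legislative text; only the `DNT` header name and its `1`/`0` semantics come from the W3C draft):

```python
def tracking_allowed(headers):
    """Return False when the user has expressed a Do Not Track preference.

    Under the W3C Tracking Preference Expression draft, "DNT: 1" means the
    user opts out of tracking, "DNT: 0" means they consent, and an absent
    header expresses no preference at all.
    """
    dnt = headers.get("DNT")
    if dnt == "1":
        # User opted out: skip behavioral-marketing data collection.
        return False
    # "0" or no header: no opt-out has been expressed.
    return True


# A request carrying the opt-out signal versus one without it:
print(tracking_allowed({"DNT": "1"}))  # False
print(tracking_allowed({}))            # True
```

The point of the sliding scale is visible even in this toy: marketing collection proceeds by default, but a single user-set signal switches it off, with no per-site consent dialog required.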
Relatedly, the LIBE amendments also (as expected) introduce the concept of “pseudonymous” data as a category of data that is harder to tie to individuals and thus might be subject to a less stringent standard of care (think cookies and IP addresses, as opposed to real names or home addresses). We’ve supported such a tiered approach to data privacy, but worried that an overly broad exception for pseudonymous data could swallow the overall rule. The pseudonymous standard in the new LIBE text is somewhat weak — pseudonymous data can still be tied to an individual; the company simply promises not to do so. On the other hand, the law’s protections are not substantially lessened for pseudonymous data. Mostly they serve to reinforce the idea that marketing can often be done on an opt-out basis. Importantly, pseudonymous data is still recognized as personal data protected by the GDPR (and the European Convention on Human Rights as well), and most of the law’s protections still apply.
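The distinction above can be made concrete. Pseudonymization typically replaces a direct identifier with a stable token that the data holder could re-link to the person, which is exactly why the data remains personal data. A minimal sketch, assuming keyed hashing with Python’s standard library (the key name and identifiers here are hypothetical):

```python
import hashlib
import hmac

# Illustrative only: the controller retains this key, so it *could*
# re-link a token to the person. That retained linkability is why the
# GDPR still treats pseudonymous data as personal data.
SECRET_KEY = b"held-by-the-data-controller"  # hypothetical key


def pseudonymize(identifier: str) -> str:
    """Derive a stable pseudonym from a direct identifier via HMAC-SHA256."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()


# The same person always maps to the same token, so analytics and
# frequency capping still work without storing the raw identifier.
token = pseudonymize("alice@example.com")
print(token == pseudonymize("alice@example.com"))  # True
print("alice" in token)                            # False
```

The design choice worth noticing is that the raw identifier never needs to be stored alongside the behavioral data; only whoever holds the key can undo the mapping, which is the “promise not to re-identify” the text describes.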
The LIBE amendments also expanded the legislation’s protections in a number of areas. Enforcement of privacy law (long a weakness under the existing Data Protection Directive) was strengthened: already robust penalty provisions were expanded for small and medium enterprises, and state data protection authorities were given greater powers over companies established in other EU jurisdictions. We had previously been concerned that the regulation gave too much authority to companies to forum shop for more sympathetic regulators, but the new structure rightly gives all data protection authorities more capacity to challenge practices that impinge on the privacy rights of their citizens.
The amendments also significantly curtail the European Commission’s rulemaking authority to issue regulations interpreting privacy law. Instead, the Commission will mostly just issue non-binding guidance, as it already does today. While we think the Commission should have the authority to issue interpreting regulations on some limited aspects of the regulation to improve clarity, the former version of the bill envisioned overly ambitious rulemaking on nearly every conceivable topic, including the mandate of specific technological solutions.
Finally, it’s also worthwhile to consider the strong privacy protections that the LIBE amendments maintained in response to intense lobbying: Consent still has to be explicit under the law (though what you can do on an opt-out basis has been somewhat expanded). The regulation still applies broadly to extra-European companies that collect and use data about European citizens. Data minimization is also still required as a core privacy principle (despite being increasingly under attack in the era of Big Data).
There are still elements of the text that we would change: it seems silly to make privacy impact assessments publicly available (guaranteeing that companies won’t take them seriously as soul-searching exercises), processors still in many cases have strict liability over how their services are used, and the amendments introduce a very prescriptive icon-based short-form notice regime (admirable in spirit, but we’d want to see serious user testing of these icons, certainly before putting them into law). All in all though, the LIBE Committee did a commendable job, not just in sorting through the mountain of proposed amendments, but also in generating reasonable compromise solutions to difficult data protection issues.
The Right to Erasure still has problems
From the beginning of the European Union’s effort to revise its privacy laws, we’ve been skeptical about a broad Right to be Forgotten that gives citizens a legal right to prevent others from speaking truthful public facts about them. We had argued that a narrower legal obligation to delete material that you had hosted online could make sense in the context of the GDPR, but that right shouldn’t extend to what others say about you. Fortunately, the LIBE Committee has dialed back somewhat on this provision, right down to the name — it’s now called the Right to Erasure.
Unfortunately, even the pared-back provision is flawed and still poses a significant threat to free expression. Platforms that share content on behalf of users have an obligation to delete the content they host on behalf of a user (which makes sense), but they also have a legal obligation to “obtain from third parties the erasure of any links to, or copy or replication of that data” (which doesn’t). Even under the revised language, if I retweet or post a screenshot of something that another person says, that person has a right to suppress my use of their previously public statements. Politicians who get caught making offensive or controversial statements might try to use such a right to make embarrassing gaffes disappear into the æther. The provision allows Member States to pass laws that exempt certain free speech activities. But those laws don’t exist yet, and may well be divergent or weak by the letter of the law or in practice. There’s certainly no reason to believe that those laws will be backed up by the same strong penalties that the GDPR has, especially considering that many member states have embraced efforts by public citizens to eliminate unwanted truths from the public record.
Platforms like Google and Facebook are rarely in the best position to evaluate users’ competing rights, but if an intermediary believes it may be subject to millions in fines by failing to adhere to an erasure request, it will likely not even attempt such an evaluation. Thus, the Right to Erasure provision will operate in practice much like the safe harbor program for copyright takedown requests under the Digital Millennium Copyright Act in the United States: companies, facing potential legal liability for failure to comply with takedown requests, will simply remove content as a matter of course, potentially violating users’ free press and free expression rights.
CDT has previously suggested that this problem can be solved by giving users a limited right to erase only what they themselves post online. Platforms would have an obligation to pass along (and demand compliance with) an erasure request to service providers processing the content on their behalf. But anyone else, including search engines and online archiving sites, who has repurposed the previously public data for their own benefit shouldn’t have any obligation to remove it, and the hosting platform certainly shouldn’t have a legal responsibility to track down what others might do with public data. We hope that the Council will press for further changes to the Right to Erasure.
The Anti-FISA provision will force companies to choose whether to break US or EU law
The biggest single change in the GDPR is the reintroduction of the “anti-FISA clause” that would require notice and local data protection authority approval of any foreign government requests for EU citizens’ data. This language was originally included in a leaked Commission draft in December 2011 but was removed after lobbying from the United States government. NSA surveillance has been front-of-mind in Europe since the Snowden leaks began earlier this year. When we testified before the LIBE Committee earlier this year on American surveillance, members of Parliament across the political spectrum were united in outrage about the unrestrained NSA surveillance programs that place little or no value on the intrinsic privacy rights of our allies’ citizens.
We share these concerns and very much want to reform NSA’s unconstitutional and illegal surveillance programs, both for Americans and foreign citizens. That said, we are not sure this new provision in the GDPR will actually achieve much in practice. Failure to adhere to American demands for data would likely subject companies to criminal liability in the United States; providing for a civil fine in Europe for such compliance would likely not outweigh that consideration. For that reason, privacy and surveillance expert Caspar Bowden has said that “it is doubtful that this measure would be effective,” though he also argues for increasing those fines to at least put companies in a bind when considering compliance with intelligence demands.
Fundamentally, the only ways to solve the problem of extraterritorial surveillance are technological improvements that make surveillance impossible, or at least difficult to scale, and an international accord to limit warrantless mass surveillance. Given how far down the rabbit hole we have fallen, citizens need a sea change in either the technology or the legal structure governing how we communicate with each other.
Now that the GDPR has cleared its Parliamentary committee, it must also pass the Council. The Council is made up of delegations from each of the 28 member state governments (as opposed to Parliament, whose members are directly elected), and is likely to be less harsh on state surveillance, especially as many of the member states have similarly broad surveillance laws, even if they are not as all-encompassing in practice. Parliament and the European Commission very much want the legislation to be enacted by the spring, but some member states (like the United Kingdom) have been pushing to delay consideration until 2015. We urge the Council to take this legislation up as soon as possible, as EU citizens — and indeed, Americans too — are ill-served by further delays in reforming Europe’s data protection framework.