

Seven Key Issues for EU Justice Ministers on the Right To Be Forgotten

Few cases before the Court of Justice of the European Union (CJEU) have provoked more, or more heated, debate than the 13 May ruling on the “right to be forgotten.” The ruling interprets existing EU data protection law to include a right for individuals to demand that search engines refrain from linking to specified search results under certain conditions. CDT and many other commentators have been critical of the ruling, primarily because we do not think the CJEU appropriately considered the impact of its ruling on the free expression rights of Internet users in Europe and around the world.

While the ruling is focused on search engines, it comes in the midst of EU Member States’ long-running discussions on a data protection reform package, the General Data Protection Regulation (GDPR), and were the ruling’s logic to be ported into new legislation, its implications would be much broader. With this background, Justice Ministers from across the Union will, on 10 October, engage in a political debate on the ruling. They will take on some fundamental and difficult questions: how the right should be balanced with freedom of expression, what the proper scope of the right is, on what grounds it can be exercised, and what obligations and responsibilities should be placed on data controllers.

CDT offers its views ahead of this debate. We argue that removal or deletion requests should be made to publishers rather than intermediaries. We highlight the need for far more clarity and guidance as to when such requests should be honoured. We argue that publishers should be transparent about when and why information is suppressed or deleted, and that they should not be responsible for ensuring that third parties also comply. Finally, we make the fundamental point that freedom of expression and the right to privacy are equal rights and must enjoy equal protection. The Court got this balance wrong, and its error should not be replicated in the future.

1. Erasure Requests Should In Most Cases Be Made to the Original Publisher

One crucial question regarding the scope of the right is whether it can be invoked against data controllers who do not themselves host or maintain the personal information. Clearly, in the Google Spain case, the CJEU articulated a right for individuals to demand that a search engine cease linking to public, lawfully posted information in response to queries based on a data subject’s name, even while recognizing that search engines are not the hosts or sources of the data. However, this sort of obligation puts search engines and other non-host controllers in the position of having to determine when it is potentially unlawful to link or otherwise refer to information lawfully hosted by others. This requires these entities to engage in a balancing of competing rights and interests that they are ill-equipped to perform, as we discuss in more detail below. At the very least, requests to erase data should be made to the entities that host that data. Asking non-hosting intermediaries to second-guess the lawfulness of information that is publicly available online should only occur as a matter of last resort.

2. The GDPR Should Avoid Requiring Companies to Engage in a Balancing of Individuals’ Fundamental Rights

One of the biggest challenges presented by the CJEU opinion is the way it shifts the burden of balancing fundamental rights to search engines, which are intermediaries among at least three diverse sets of rights: the right of the web page publisher to impart information, the right of individuals using the search engine to receive information, and the privacy and data protection rights of the person who is the subject of that information. Private companies are simply not the appropriate actors to engage in this sort of balancing, and any formulation of the “right to be forgotten” that conceives of data controllers engaging in extensive and frequent balancing exercises will neither succeed in protecting individuals’ fundamental rights nor create a workable data protection regime.

As the Presidency notes in its briefing paper, Member States have implemented a range of regulations that do the work of balancing privacy and freedom of expression at the national level. This is the appropriate role of states and should not be shifted to private companies. Another question is how existing and future rules are applied by data protection authorities (DPAs) in 28 different Member States. Unless DPAs can agree on clear and common guidelines, there will be uncertainty about how to apply potentially conflicting or contradictory national regulations addressing privacy and freedom of expression. For example, if a data subject in one Member State makes a request to have information delinked or suppressed, and that information is hosted in a Member State that protects the author’s right to publish that information, it should not be the responsibility of the intermediary to determine which Member State’s approach to balancing freedom of expression and privacy will prevail. The problem of conflicting jurisdictions will also exist with regard to information published and hosted outside of the European Union. Can the right be invoked against a publisher or controller in the United States or Australia, and how will it be balanced against that publisher’s free expression rights under its own country’s laws?

3. The GDPR Should Limit the Liabilities that Article 17 Places on Intermediaries

A “right to be forgotten” that allows a data subject to make demands of data controllers regarding information provided by third parties essentially creates a notice-and-takedown regime for personal information. This is true whether the right applies to non-host controllers such as search engines or to controllers who host user-generated content.  Notice-and-takedown systems are notoriously vulnerable to abuse. Fraudulent or bad-faith notices, issued for improper purposes such as to silence critics or to suppress relevant but unflattering information, can result in the removal of lawful content. Users whose content has wrongly been deleted may have little recourse or few resources to challenge the takedown and seek reposting of their content.

Meanwhile, intermediaries may have little incentive to question or refuse a takedown request, particularly if the potential legal consequences for refusing to remove information remain unclear. Controllers who face the task of balancing the rights set out in the GDPR with 28 Member States’ national rules, traditions, and practices on free expression may reasonably and in good faith come to different conclusions about whether certain information should be removed. If the GDPR is to require controllers to make this difficult assessment, it must also protect controllers from potential liability for making an “incorrect” decision. The Presidency notes that “the decision taken by the controller on the basis of a direct request by the data subject cannot be different in nature or purpose than the one taken by the data protection or judicial authority in a similar case.” But it is not clear how this is to be enforced. If controllers who are responding to data subjects’ requests over information provided or hosted by third parties may potentially face legal liability for making an “incorrect” or “inconsistent” decision, the certain outcome is that these controllers will take down information at data subjects’ demand, without engaging in any balancing inquiry at all (unless a judicial authority has clearly rejected a takedown request on free expression grounds in a comparable case). Notice-and-takedown regimes should be coupled with limitations on liability for intermediaries, as in the notice-and-action framework of the E-Commerce Directive.

4. The Grounds on which the Right Can Be Exercised Should Be Narrowed

We certainly acknowledge that there are cases in which an individual should have the right to seek erasure of others’ online speech about them. Unlawful content, such as defamation, and the unlawful posting of especially sensitive information are obvious cases to consider. On the other hand, data subjects should not have the right to seek erasure or delinking of others’ speech about them merely because they dislike it or dispute its value. The “right to be forgotten” primarily raises free expression concerns when it gives the data subject the ability to prevent others from speaking or accessing true, lawfully published information about her. Narrowing the applicability of the right, when it concerns personal information about an individual that has been provided or published by others, will provide individuals with meaningful protections while avoiding burdens on journalism, historical reporting, academic research, quotation, commentary, and other dimensions of the right to freedom of expression.

Unfortunately, the criteria provided by the CJEU – that information about a person that is “inadequate, irrelevant or excessive” may be delinked – are woefully inadequate for judging removal requests, and the GDPR must provide more clarity for the broader removal right articulated in Article 17. We encourage EU legislators to consider the specific scenarios in which data subjects should be able to silence or suppress others’ speech about them, and to delineate precise criteria for controllers to evaluate in considering removal requests.

The Presidency proposes a recital 53a to address the implications of search-result delinking for individuals’ right to access information and to call for a balancing between that right and the data subject’s fundamental right to privacy and data protection.  Specifically, the Presidency notes that the “balance may in specific cases depend on the nature of the information in question and its sensitivity for the data subject’s private life and on the interest of the public in having access to that information, an interest which may vary, in particular, according to the role played by the data subject in public life.”

This type of more precise guidance would be very useful for data controllers, but these are important considerations for any data controller receiving a “right to be forgotten” request, not only search engines. It is important to give individuals control over the personal data that they provide to a controller, but when a controller is processing information obtained not directly from the individual in question (e.g., a search engine indexing publicly posted information, a news reporter citing public records, or a social media platform hosting a user’s commentary, critique, or quoting of another individual), the free expression implications of giving the data subject a blanket right to halt the processing of that information are significant. It would make more sense to bifurcate this broad “right to be forgotten” concept into two: a strong right, with few or no exceptions, to instruct a controller to stop processing personal information that the data subject provided directly, and (if necessary) a right to request erasure from a controller that is processing the data subject’s information obtained from a third party. This latter right to request erasure would necessarily be heavily constrained by free expression exceptions, including the right of others to express their opinion (news reporting) and to access information that is in the public interest.

5. Freedom of Expression Should Not be Deemed a Secondary Right

We strongly caution against enshrining in the GDPR the language from the CJEU ruling that “the data subject’s rights protected by [Articles 7 and 8 of the Charter] should override, as a general rule, the interest of Internet users.” The so-called “interest” of Internet users in access to true, lawfully published information is in fact a right protected by Article 11 of the Charter, which guarantees that everyone shall have the freedom “to receive and impart information and ideas without interference by public authorities.” While the CJEU opinion presents a quite constrained view of the issue, the question of the accessibility of information via search queries implicates at least four sets of rights and interests: the privacy and data protection rights of the data subject, the right of the information source (e.g., the journalist who authored the article) to impart information, the right of Internet users to receive lawful information, and the interest of search engines in providing their indexing and search functions. These potentially conflicting rights of the data subject, information provider, and information seeker must be balanced, both in practice and in the text of the GDPR.

6. The GDPR Should Not Place Unworkable Obligations on Controllers to Notify Other Controllers of Erasure Requests

In a draft addition to Recital 54, the Presidency suggests elaborating on the proposed obligation of a controller who receives an erasure request to notify other controllers who are processing that data: “This information should also allow other controllers to assess whether the erasure would be contrary to the public interest in the availability of the data for reasons of freedom of expression, freedom of the press, historical, statistical or scientific purposes.”

As we discuss below, a right for intermediaries and platforms that receive deletion or delinking requests to notify the original publishers is certainly reasonable. In the fact pattern of the Google Spain case, for example, an obligation for Google to inform the newspaper La Vanguardia that it had been instructed to delink the original article would serve important transparency purposes. But a broad obligation on all controllers to identify other controllers who are or may be processing the data at issue is a heavy and unreasonable burden. For example, had La Vanguardia been required to remove Mr. Costeja González’s personal information, an obligation to also inform every possible other controller of that data, including all search engines as well as news archives, private databases, and other newspapers that had cited the story, would be a considerable burden on the newspaper. Moreover, it would be a significant infringement of free expression if deletion requests were binding on new speakers of the information, who possess their own rights to publicize it. An individual should have a right to delete his embarrassing posts from Twitter, but if someone takes a screenshot of that tweet for posterity, the original poster should not have a secondary right to prohibit others from drawing attention to his bad decisions. We believe that mandatory notification requirements should only extend to data processors working on behalf of the data controller who receives the request.

7. The GDPR Should Allow Public Transparency About Deletion Requests

The Presidency is correct to note that it is important for information hosts to know when a non-hosting controller has taken steps, at a third party’s instruction, to make information provided by that host less accessible. Removing third-party content or delinking search results that point to true, lawfully posted information that is part of the public record necessarily interferes with both the author’s right to impart information and the public’s right to receive it. As with any government-imposed limitation on freedom of expression, this interference must occur only in a way that is predictable and transparent to all. Further, transparency as to the implementation of removal requests is crucial to avoid abuse of the system and to allow an appeals process to be established. Individuals seeking to have information suppressed have strong incentives to understand what information about them is, and is no longer, available online. The general public, or individual searchers, however, are much less likely to be able to determine that a web page has been altered or that a set of search results they view has been manipulated at a third party’s request. It is essential to have clear reporting on the ways in which individuals’ removal requests are affecting the information available to journalists, academics, researchers, artists, and other members of the public.

Because of the significant potential for abuse of a right to erasure – and the perverse incentives for publishers, and especially intermediaries, to favor privacy at the expense of legitimate free expression – we believe that transparency regarding delinking and erasure demands will be crucial to preventing abuse of the system. The GDPR should be amended to acknowledge the right of intermediaries to notify publishers of delinking or removal requests, and the right of all controllers to disclose publicly when they have received such requests.