EU’s “Right to Be Forgotten” Policy Sets Bad Precedent for Free Expression Worldwide

Written by Jens-Henrik Jeppesen, Emma Llansó

In the latest development in the debate over the “right to be forgotten” in Europe, Google has decided to begin suppressing links to URLs not only for searches on EU country-level domains, but also for searches conducted from within EU countries on its global .com site. In 2014, the Court of Justice of the European Union (CJEU) found that, under the Data Protection Directive, people in the EU have a right to demand that search engines de-list URLs linking to information that is “inadequate, irrelevant or no longer relevant, or excessive.”
CDT has been very critical of the reasoning behind the CJEU ruling in Google Spain v AEPD and Mario Costeja González, which established the principle that, to protect a person’s privacy, search engines can be required to suppress links to true, public, lawfully posted information in searches for that person’s name. We are sympathetic to people distressed by some of the information about them available in the public domain, we understand the desire in certain contexts to suppress such information, and we support targeted and proportionate policies to protect individuals’ right to privacy. But our overriding concern with the Costeja ruling is that it enables broad restriction of access to lawful, public information and, as such, inevitably curbs free expression.

Further, the court’s guidance in that case is so vague that it leaves much room for interpretation about which types of removal requests should be granted and which should not. This places a heavy responsibility on the companies affected by the ruling to strike a careful and difficult balance between one person’s privacy rights and the rights of others to receive and impart information. Companies face pressure to minimise costs and maximise revenues, so there is a powerful incentive to accommodate too many requests, removing too much content, rather than taking on costly and risky lawsuits and legal challenges. And there is every indication that these problems will persist under the new Data Protection Regulation.

The available data about implementation of the Costeja ruling, provided by companies in their transparency reports, suggests that companies have thus far made a serious effort to comply with the ruling in a sensible way. Rather than granting 100% of requests, they appear to resist claims for removal of information that is clearly relevant to the public while accommodating requests concerning information that is not, although there are many examples of debatable decisions and outcomes.

On balance, CDT continues to believe that the “right to be forgotten” concept, and the de-listing right found in the Costeja case, are flawed, and that they cannot avoid restricting access to information that should be available to the public. The broader the geographical scope of implementation, the greater the negative impact on access to information and free expression. The implementation model Google initially adopted, de-listing results only on EU member states’ country-code top-level domains (e.g. google.fr, google.de), was the least bad option.

Google has faced increasing pressure from the French Data Protection Authority (CNIL) and other DPAs in Europe, who have insisted on global implementation of the Costeja ruling and asked that de-listing happen on the .com domain in addition to the domain of the country in which the request was granted and other EU member state domains. Google’s recent concession fortunately does not go that far, but it is significant in that it extends de-listing to search queries made on the .com domain from a location within the country where the de-listing request was granted. So, someone in France searching on google.com for information about a French citizen who has requested that links be suppressed will see the modified search results, though someone searching .com for that French citizen from Italy (or the U.S.) will not.
Provided that Google and others take great care in evaluating de-listing requests and systematically refuse those involving information of genuine interest to the public, the direct impact of the concession may be limited. However, while Google and the DPAs have both stressed that the aim of the de-listing process is not to suppress political speech or important information about individuals’ political, commercial, or criminal activities, we have already seen the de-listing policy used to restrict access not only to information about an individual, but also to news articles about implementation of the Costeja ruling itself. We do not know what information has been successfully suppressed under the ruling, and this fundamental paradox – how to provide transparency about information that has been made less accessible on privacy grounds – means it is difficult to fully assess the Costeja ruling’s impact.

Further, we fear that in countries that engage in more severe online censorship and routinely restrict access to information, governments will demand that their censorship laws be applied to global domains when accessed from within their borders. This would be to the detriment of people in those countries, for whom the Internet still offers ways to find information that is otherwise censored, and a serious step back for dissidents and others who seek to promote human rights and democracy. If this approach becomes standard practice for Internet companies, it will also serve as a barrier to new entrants in the online search and content-hosting business, as individual speakers and small businesses would struggle to implement geo-targeted availability of content based on several hundred countries’ laws.

Unfortunately, authoritarian regimes can now point to the arguments put forward by CNIL and other DPAs and use them to legitimise their own demands. This is surely not what the DPAs had intended, but it may well be a consequence of the line they have taken.
