
Advocate General Opinion in Austrian Defamation Case Raises Troubling Prospects for Access to Information

Courts across Europe are grappling with difficult issues concerning content posted online. So far, their decisions have been inconsistent. In one case, a court ordered a social media company to refrain from removing content that the company considered in violation of its terms of service. In the present case, a court ordered a social media company to remove, and monitor for, content that the court considered to violate legal limits on speech, but that the company did not consider to violate its terms of service, let alone the law. These cases break new ground, which is why national courts refer questions about the interpretation of EU law to the Court of Justice of the European Union (CJEU). The answers the CJEU returns are of enormous consequence for free expression: they set precedent for future legal disputes across the European Union and will ultimately determine companies’ willingness to host users’ speech.

A recent opinion from the Advocate General (AG) of the CJEU in case C-18/18 raises troubling prospects for free expression and access to information online. The AG opinion is not a final ruling, but the Court often follows the direction the AG sets. In this instance, the Court should think twice.

The case

An Austrian politician, Ms. Glawischnig-Piesczek, objected to disparaging comments posted on Facebook. A user referred to her as a “lousy traitor of the people”, a “corrupt oaf” and a member of a “fascist party”. She sought to have that content, which she considered defamatory, removed on a worldwide basis. At the same time, she asked for an injunction against potential future statements with identical wording or equivalent meaning, no matter who might post them. 

The dispute was eventually referred to the Austrian Supreme Court, which sought guidance from the CJEU on the obligations that can be imposed on a content host under the 2000 E-Commerce Directive. The court asked whether a content host can be ordered to remove content that is identical or equivalent to information deemed illegal, and whether removal should be limited to the Member State in question or could apply worldwide.

European courts may order content removal worldwide 

The most striking, and troubling, of the AG’s conclusions is his view that, in principle, nothing prevents a European court from ordering removal of online content on a worldwide basis. As CDT has argued on several occasions, the notion that one country’s authorities can demand that its restrictions on free expression be applied worldwide is dangerous for free expression and access to information on the internet. Were this principle generally applied, Pakistan could seek to impose its blasphemy law everywhere, and Thailand might want to extend its ban on criticism of its royal family beyond its borders. To take a European example, a German court might order that material violating Germany’s rules on blasphemy be made inaccessible not only in Germany but in other European countries, including those where blasphemy is no longer an offense. These are dramatic, but illustrative, examples of where application of this principle would lead. It should be obvious that the result is undesirable.

The AG bases his view that removal orders can apply worldwide on the observation that the relevant law (defamation) has not been harmonized at the EU level. It is surprising that he did not conclude that the reach of a national court’s decision should stop at the border, and it seems inconsistent with views he has expressed in other cases. Global application of obligations not only to remove content, but to police and prevent future sharing of identical or equivalent speech, would be an extremely problematic outcome. The AG does counsel courts to take a cautious approach and to respect principles of international comity when ruling in such cases. We would hope, however, that the CJEU takes a stronger line on this point and explicitly instructs courts to limit any speech restrictions to their own country, or possibly to the EU.

Filtering and monitoring for identical and equivalent content: specific or general?

What obligations may a court impose on a content host, under the E-Commerce Directive, to monitor for particular illegal content (and for content deemed identical or equivalent to it)? The AG bases his analysis of these questions on the L’Oréal v eBay case (C-324/09) and uses it to draw a distinction between mandates for general monitoring, which are not allowed, and mandates for specific monitoring, which are. The AG concludes that a content host may be ordered to find and restrict content that is identical to the original illegal content, whether shared by the original user or by other users. The proactive obligation to find and restrict equivalent content, however, extends only to content posted by the original user. Further, the AG proposes that monitoring obligations be duly limited in time and scope. He argues that monitoring for identical content is straightforward and can be done with technical tools such as filters, whereas recognizing content that is different from, but equivalent to, the original post requires human intervention.
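To make concrete what the AG appears to envision, below is a minimal, hypothetical sketch (in Python, with the function name and example posts invented for illustration) of an exact-match filter for “identical” content. Even this toy version exhibits, in miniature, the problems discussed in the next section: it flags the disputed words when they appear in legitimate contexts, and it misses trivially altered variants.

```python
# Hypothetical sketch of an exact-match "identical content" filter.
# The rule, function name, and example posts are invented for illustration.

RULED_ILLEGAL = "lousy traitor of the people"

def flags_post(post: str) -> bool:
    """Flag any post containing the exact string a court ruled illegal."""
    return RULED_ILLEGAL in post.lower()

# Over-blocking: a string match is blind to context, so news reporting
# that quotes the ruling is flagged along with fresh defamation.
news_report = 'A court held that calling her a "lousy traitor of the people" was defamatory.'
assert flags_post(news_report)

# Under-blocking: a trivial edit evades the exact match entirely,
# even though a reader would consider the meaning equivalent.
variant = "lousy traitor of her people"
assert not flags_post(variant)
```

Production systems are of course more sophisticated than this sketch, but the underlying limitation it illustrates, that matching text says nothing about the context or intent of a post, persists at any scale.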

Technical tools cannot judge what is “identical”, let alone “equivalent”, information

The AG’s reasoning is troubling in a couple of ways. It applies the logic of a case involving intellectual property infringement (trademarks) to one that involves the limits of freedom of expression when criticizing a political figure. It is difficult enough for a content host to take measures to stop fraudsters trying to sell fake branded goods on its website, even when the trademark holder insists that no use of its mark on the site is legitimate or licensed. But it is another thing entirely to monitor a social network for statements that are “identical” to one that has been ruled illegal. Context and intent can completely change the meaning of the original information so that it is no longer “identical”. For example, users might circulate the disputed statement to call on others to support the person allegedly defamed. It might be shared as sarcasm or as criticism of the person who made the statement, and it might be shared as news reporting (or, indeed, in blog posts discussing legal opinions about the statements).

On the whole, the distinction between identical and equivalent does not seem in line with the reality of moderating content online. The AG seems to suggest that simply filtering content for matching keywords can do this job. However, as research shows, even the most advanced technologies are incapable of understanding with any precision the context, intent, and meaning of posts. This matters for photos, video, and sound, and it matters even more for text. The AG opinion could amount to an expansion of the notion of “specific monitoring” in a way that poses risks to free expression and access to information online. These are matters the CJEU should consider carefully in its deliberations on a final ruling.