This week, the European Court of Human Rights (ECtHR) released its long-anticipated final decision in the case of Delfi AS v. Estonia. The case concerned the liability of Delfi, an online news publisher, for defamatory comments left by users in the comment section of one of its articles. In a major blow for legal protections for free expression online, the ECtHR upheld its earlier decision that, as an “active” host of user-generated content, Delfi should be considered the “publisher” of its users’ comments, and thus be legally responsible for them.
Holding content hosts liable for their users’ speech is a shortcut to censorship for governments and private litigants who cannot easily identify an anonymous speaker or seek a judgment against her. The threat of liability creates strong incentives for content hosts to preview and approve all user comments – and to censor with a broad brush, limit access to their services, and restrict users’ ability to communicate freely over their platforms. In a world where all online speech is intermediated by web servers, news portals, social media platforms, search engines, and ISPs, the collateral consequences of intermediary liability are potentially enormous.
When a host is not a host (but is a publisher)
The Estonian courts decided to treat Delfi as a traditional publisher and not to extend to it the liability protections for online content hosts found in Estonia’s law that implements the EU’s E-Commerce Directive. The ECtHR’s Grand Chamber, finding this decision to be “foreseeable” to Delfi, held that the fine the Estonian court levied on Delfi, based on the defamatory nature of user comments, was not a disproportionate burden on Delfi’s right to freedom of expression under Article 10 of the European Convention on Human Rights.
The holding of this case, while complex, sets a troubling precedent. The ECtHR essentially held that some types of entities that host third-party content should expect to be considered traditional publishers (at least in Estonia), and thus should anticipate their potential for traditional publisher liability when their users upload “clearly unlawful” content – even if the entity has not been notified of the existence of this content on its servers.
Article 14 of the E-Commerce Directive (ECD), by contrast, limits a host’s liability for information provided by a third party as long as it a) does not have “actual knowledge of illegal activity or information” and b) “acts expeditiously to remove or to disable access to the information” upon obtaining such knowledge. The Estonian courts declined to treat Delfi as a host protected by Article 14 on the theory that it only concerns activity “of a mere technical, automatic and passive nature, which implies that the [host] has neither knowledge of nor control over the information which is transmitted or stored.” This is a confusing interpretation of the ECD, because Article 14(b) clearly anticipates that a host will be able to control the information stored on its servers. The ECtHR noted that “it is not its task to take the place of domestic courts” and thus declined to review the Estonian court’s decision not to treat Delfi as a host protected by Estonia’s intermediary liability laws.
The Court describes its holding, that certain types of “active” hosts can be held liable, without notice, for the content of user comments, as narrowly applying only to “a large professionally managed Internet news portal” that publishes its own news articles and invites readers to comment on them, and not to “other fora on the Internet where third-party comments can be disseminated.” It remains to be seen how EU Member State courts will apply this decision. But the Court’s “narrow” reading will provide little comfort to hosts of third-party speech, as the line the Court draws between types of content hosts – some of which can be treated as traditional publishers, some of which might not – is fuzzy at best.
The Court gives examples of fora not addressed by this decision, including discussion forums “where users can freely set out their ideas on any topics without the discussion being channelled by any input from the forum’s manager,” “a social media platform where the platform provider does not offer any content,” and cases in which the host is “a private person running the website or a blog as a hobby.”
It’s clear that the Court considers purely passive hosting to merit liability protection (as long as the entity hosting that content is not also a content provider in its own right). But any content host that, like Delfi, is also a content provider (i.e. all major social media platforms); any forum host that, like Delfi, exercises some form of content moderation (i.e. the vast majority of fora); or any host that, like Delfi, runs its portal “for an economic purpose” (i.e. any ad-supported website) now faces a much murkier prospect of being held liable as a publisher under the laws of EU Member States.
Best practices in content moderation are “insufficient”
The Court’s opinion likewise gives unclear instruction to content hosts on how to stay on the right side of the law, compounding the uncertainty for intermediaries created by this ruling. In the opening paragraphs of the decision, the Court describes the many steps taken by Delfi to moderate the content that appears in its comment section:
“[T]here was a system of notice-and-take-down in place: any reader could mark a comment as leim (an Estonian word for an insulting or mocking message or a message inciting hatred on the Internet) and the comment was removed expeditiously. Furthermore, there was a system of automatic deletion of comments that included certain stems of obscene words. In addition, a victim of a defamatory comment could directly notify the applicant company, in which case the comment was removed immediately.”
To a reasonable observer, this sounds like a description of some of the key best practices in online content moderation, a challenging task that typically includes a combination of automated and human review of user-generated content. But the ECtHR was unimpressed, finding these efforts “insufficient” to meet Delfi’s obligations as a publisher of those comments. Instead, the Court found that Delfi had “an obligation to remove from its website, without delay after publication, comments that amounted to hate speech and incitements to violence, and were thus clearly unlawful on their face.” The Court concluded that States may impose liability on “Internet news portals” like Delfi “if they fail to take measures to remove clearly unlawful comments without delay, even without notice from the alleged victim or from third parties.” [Emphasis added.]
This is a dangerous precedent and essentially creates proactive filtering and monitoring obligations for “active” content hosts. Under the Court’s logic, and regardless of any pre-screening efforts or prompt removal upon notice, if an intermediary like Delfi unknowingly allows even one comment containing unlawful speech to be posted on its site, it can face full liability for that comment. The sensible response from content hosts will be to tighten pre-screening measures, or to do away with user-upload features altogether. As the dissenting judges Sajó and Tsotsoria note, “[A]ctive intermediaries and blog operators will have considerable incentives to discontinue offering a comments feature, and the fear of liability may lead to additional self-censorship by operators.” As hosting user-generated content becomes an increasingly risky business in Europe, the online spaces in which users can share their opinions and expression can be expected, correspondingly, to shrink.
“Every ram has his Michaelmas” – and other manifestly unlawful speech
The Estonian Supreme Court found that 20 comments posted under a Delfi article violated the country’s defamation law. “The contents of the comments were unlawful; they were linguistically inappropriate. Value judgments [that prejudice and denigrate a person’s honor and good name] . . . are inappropriate if it is obvious to a sensible reader that their meaning is vulgar and intended to degrade human dignity and ridicule a person.” The ECtHR agreed, finding that “this characterization and analysis of the unlawful nature of the comments in question . . . is obviously based on the fact that the majority of the comments are, viewed on their face, tantamount to incitement to hatred or to violence against [the plaintiff]. . . . [T]he establishment of their unlawful nature did not require any linguistic or legal analysis since the remarks were on their face manifestly unlawful.” The Court finds that Delfi should have foreseen that it could “be held liable under domestic law for the uploading of clearly unlawful comments” by its users.
So what was this content that was so clearly outside the bounds of Article 10 protection that the content host should have known to take it down? Some of the comments at issue (as cited in the ECtHR opinion) read:
“Proposal – let’s do as in 1905, let’s go to [K]uressaare with sticks and put [Plaintiff] and [Plaintiff 2] in a bag.”
“rascal!!!” [in Russian]
“inhabitants of Saaremaa and Hiiumaa islands, do 1:0 to this dope”
“this [Plaintiff 3] will one day get hit with a cake by me. damn, as soon as you put a cauldron on the fire and there is smoke rising from the chimney of the sauna, the crows from Saaremaa are there – thinking that . . . a pig is going to be slaughtered. no way”
“. . . every [Plaintiff] has his Michaelmas . . . and this cannot at all be compared to a ram’s Michaelmas. Actually sorry for [Plaintiff] – a human after all . . . 😀 😀 :D”
It may well be the case that each of these comments would read as “linguistically inappropriate” defamation to the sensible Estonian reader. But they also illustrate one of the most important reasons for protecting content hosts from liability for user speech: whether a particular comment is defamatory, a threat, genuine incitement to imminent lawless action, or otherwise unlawful is highly dependent on the specific context of that statement. Different cultures and sub-cultures not only have different legal and societal standards about what is and isn’t inherently ridiculous or degrading, but they also express those ideas in different languages and idioms that won’t necessarily translate. Requiring local website operators to screen all user comments for “linguistically inappropriate” statements by Estonian speakers, such as these apparently distasteful references to “1:0” or Michaelmas rituals, is onerous enough. Requiring hosts of user content, who open their servers to speakers from every corner of the world, to anticipate and remove what is “obviously” insulting or defamatory across potentially every language and culture is a wholly untenable burden.
“Unregistered” users unwelcome?
A final, intensely concerning aspect of the Delfi decision is the Court’s implication that Delfi brought these liability woes on itself by allowing anonymous comments. When the ECtHR first reviewed the case, it found that “It had been [Delfi’s] choice to allow comments by non-registered users, and by doing so it had to be considered to have assumed a certain responsibility for such comments.” The Grand Chamber was somewhat less direct, citing “the uncertain effectiveness of measures allowing the identity of the authors of the comments to be established,” and “the lack of instruments put in place” by Delfi to enable the target of a comment to bring a claim against its author. But these factors, which, to the Court, demonstrated a failure by Delfi “to ensure a realistic prospect of the authors of such comments being held liable,” contributed to the Court’s finding that it was appropriate to impose liability on Delfi directly.
This is a potentially very damaging precedent for the free expression and privacy rights of users in Europe and around the world. By punishing platforms for hosting anonymous user comments, the decision creates strong incentives for intermediaries to demand identity information from all users as a condition of service. This would obviously directly limit individuals’ opportunities to engage in anonymous expression. As the UN Special Rapporteur on the Freedom of Expression, David Kaye, noted in a report to the UN Human Rights Council this week, the ability to speak anonymously online is critical to the freedoms of opinion and expression.
Further, requirements to “register” as the user of a site or service would have a chilling effect on people’s willingness to access information about sensitive topics, knowing that it would be easy to link that information back to their offline identity. This chilling effect would be particularly acute for online news sources, which function both as sources of information and as town halls in which people exchange diverse and divergent views about that information. And, as the Court of Justice of the European Union found when it struck down the EU’s data retention mandate last year, retention of personal information by online services for long periods of time poses a significant threat to Internet users’ privacy rights. The EU should not embrace liability standards for content hosts that ultimately erode the privacy and free expression rights of Europeans.
Publisher – or “platform” – liability is not the answer
The ECtHR decision creates massive uncertainty for Internet intermediaries operating in Europe and could undermine the foundations of access to information and free expression online. When the high court in Thailand convicted Chiranuch Premchaiporn for failing to remove, quickly enough, user comments that criticized the king (which were unlawful under the country’s lèse-majesté laws), its decision was rightly criticized as threatening freedom of expression in that country. The ECtHR ruling, unfortunately, is no different.
Protecting intermediaries from liability for user-generated content is widely recognized to be an essential element to fostering freedom of expression online. It also supports the development of innovative new platforms for sharing information, exchanging opinions, and conducting business across borders. Europe’s own E-Commerce Directive is designed to protect conduits, caches, and hosts from liability, and to prohibit monitoring obligations that incentivize private censorship. That framework was designed to encourage the development of a robust information society, and should have been applied in this case.
Currently, EU policymakers are undertaking a significant and important effort to harmonize Member States’ approaches to regulation in the Digital Single Market Strategy (DSM). While there are a number of sensible proposals in the DSM, CDT has previously raised concerns about the proposals to create a new “duty of care” for intermediaries to police unlawful content and to develop a new category of “essential platforms” that will face heightened regulatory scrutiny. The E-Commerce Directive’s framework has supported a blossoming of free expression online in Europe; creating new liabilities for intermediaries, whether through court decisions or regulation, could nip this in the bud. We urge EU policymakers to use initiatives around the DSM to harmonize European legal regimes in ways that create incentives for innovation in Europe and encourage intermediaries to keep their platforms open to user commentary and debate.