
The European Commission’s draft regulation on ‘terrorist content’ requires significant revision

On 12 September 2018, the European Commission published a proposal for a Regulation on preventing the dissemination of terrorist content online.

The Regulation is the latest and most far-reaching of a long series of European Union initiatives to regulate and restrict various types of online content, both legal and illegal. It follows a public consultation in June 2018 on the Commission’s strategy to tackle illegal content. In our consultation response we cautioned against new legislative measures in the absence of compelling evidence to justify them. We also laid out some core principles that any policy measure should respect. The Commission has disregarded this advice, and proceeded to legislate at speed, most likely under pressure from influential Member State governments.

Politics aside, the evidence brought forward to justify the Regulation is less than compelling, and does not demonstrate convincingly that the potentially far-reaching restrictions and obligations it imposes are compatible with fundamental rights standards, and are necessary and proportionate to achieve the intended objectives.

The Regulation undermines the E-Commerce Directive

For the first time, the European Commission has issued draft legislation that explicitly breaks with the limited liability principles set out in the 2000 E-Commerce Directive (ECD). (The proposed Audiovisual Media Services Directive and the DSM Copyright Directive have similar consequences, but the Commission claims those texts are in conformity with the ECD, a claim leading academics dispute.)

The ECD has underpinned the growth and proliferation of all manner of online services that enable users to post and share content. Until now, the fundamental principle has been that companies can host users’ content without assuming liability for the legality of that content. However, once notified about illegal content, the host is required to take appropriate action. The Directive also prohibits Member States from imposing general monitoring obligations on content hosts; they cannot be forced to systematically scan and filter uploaded content. These principles have been invaluable in enabling free expression, creativity, innovation and education, among other things. However, it is clear that the ECD principles are under constant challenge, and are unlikely to stay in place for the long term.

Duties of care: proactive measures, referrals and content removal orders

The proposed Regulation imposes ‘duties of care’ on an almost unlimited range of internet hosts to prevent the dissemination of terrorist content. Its scope covers everything from global social media companies, trading platforms, and cloud and storage services to newspaper sites and privately run blogs with comment sections. All of these services, regardless of turnover, size or reach, will be required to take ‘proactive measures’ to prevent the dissemination of ‘terrorist content’ (as defined in the recently adopted Directive on Combating Terrorism). ‘Competent Authorities’ – to be designated by Member States – can order any content host to take any proactive measure the Authority finds proportionate and necessary, including the use of automated tools, to restrict content. Authorities can also instruct hosts on how to design their terms of service (their contracts with users and customers).

Furthermore, content hosts must execute binding removal orders for content deemed illegal by Authorities, in some cases within one hour, although the text is ambiguous on this point. They must rapidly process and review referrals from those Authorities for content that the Authority considers to violate terms of service (a continuation and broadening of the practice established by the Europol Internet Referral Unit). They must also report on the effectiveness and speed with which they prevent, detect and remove targeted content. Failure to comply with obligations under the Regulation will carry open-ended penalties; it is not clear whether 4% of global turnover is an upper limit.

Data retention and proactive reporting

Troublingly, the Regulation obliges content hosts to report proactively on ‘terrorist offences’ they become aware of. The definitions in the Terrorism Directive are less than clear, and so far, only 15 Member States have implemented the Directive. Uncertainty about what content falls within the definitions is almost certain to result in significant over-reporting of controversial political statements rather than actual incitement. This flood of low-value information would be ineffective and would chill online speech.

The Regulation also introduces retention obligations for content that is removed or disabled following removal orders, referrals or the use of automated tools. These retention obligations are extremely broad, and are unlikely to be in conformity with the jurisprudence of the Court of Justice of the European Union (CJEU) or with fundamental rights.

Some – but not all – content calls for an urgent response

The Regulation’s definitions do not distinguish between the many kinds of content that may be deemed to fall within them. On the one hand, there is content intended to incite immediate, concrete acts of violence. On the other, there are statements of general sympathy or understanding for past acts, for certain ideologies, or for terrorist groups (however designated). Nor do the definitions take into account the context in which content is shared or statements are made. The same content may carry very different risks depending on who posts it, with whom it is shared, and so on. Many factors determine the appropriate response to a piece of content shared online, and any content moderation policy – mandated by law or not – should take this into account.

Competent Authorities with extremely broad powers

The Regulation sets no conditions whatsoever for the designation of Competent Authorities, nor does it limit how many a Member State can designate. This is unsatisfactory given the extremely broad powers these Authorities would be granted. Notably, there is no mention of any judicial oversight. Given the current controversies over some Member State governments’ rule-of-law records, it is reasonable to fear that such open-ended powers could be misused to suppress legitimate political speech.

This is not a full analysis of the proposal, but these are among the most important concerns that we believe will need to be addressed in the upcoming legislative process. Other parts of the proposal, notably on safeguards, transparency and accountability, include sensible ideas.

Is the proposal justified, necessary and proportionate?

A fundamental question is whether the Regulation is justified at all, given the numerous legislative and non-legislative counter-terrorism and security initiatives the European Commission and EU Member States have already taken. These include extensive public-private cooperation with internet companies in the EU Internet Forum, with Member State law enforcement agencies, with Europol, and with global organisations like the United Nations.

The Commission argues that while the most widely used platforms are taking effective measures to counter the use of their services by terrorist groups, smaller ones are not. It contends that, without this Regulation, smaller services cannot be motivated to deal responsibly with terrorist content.

It is a valid point that, as mainstream platforms enhance their measures against the content in question, those who disseminate it will look for alternatives. But if other, smaller services systematically carry large quantities of material that causes people in Europe to commit terrorist offences, no evidence of this has been included in the Impact Assessment. In any case, without access to mainstream services, the reach of such propaganda will be limited.

It is also fair to ask whether the proposal will further strengthen the most powerful platforms relative to would-be challengers. Requiring start-up companies to comply with standards set by the large, global players will likely result in even higher barriers to entry for new platforms and services.

Finally, it is important to note that even if it were possible to scrub every website and platform worldwide of terrorist propaganda – and do it without capturing legitimate speech – threats of terrorist activity would persist. While online content has undoubtedly played a role in terrorist groups’ propaganda efforts, it is a limited one. Leading researchers argue convincingly that while online content can be effective in radicalising and influencing the mindset of some viewers, those who actually carry out attacks have nearly always been recruited through human, face-to-face contact.

CDT will provide further analysis and commentary as the legislative review process gets underway.