
Terrorist content online: Parliament must take time to address the issues Member States did not

On 6 December, the Justice and Home Affairs (JHA) Ministers rushed through an agreement on a negotiating position on the proposal for a Regulation to prevent the dissemination of terrorist content online. This enables the Council to begin discussions with the European Parliament once parliamentarians have agreed on their own position.

When the European Commission published its proposal in September, we called out two main problems: (1) the lack of evidence to justify its necessity in light of the multiple counter-terrorism initiatives already taken, notably Directive 2017/541 on combating terrorism, and (2) its incompatibility with fundamental rights standards. And we are not alone. CDT and some 30 digital and human rights organisations and experts raised these concerns in an open letter addressed to the JHA Ministers ahead of the meeting. Academic research by Dr. Aleksandra Kuczerawy likewise concludes that the proposal “poses a serious risk to fundamental rights protected by the EU Charter, in particular the right to freedom of expression and access to information”. Member States unfortunately did not heed these warnings. While the Council’s text attempts to address some of our concerns, more work is required. It will fall to the European Parliament to solve these problems.

Further clarification of the “terrorist content” definition is required

A major problem in the Commission’s proposal is the vague and unclear definition of “terrorist content” and the risk that it leads to the repression of legitimate speech. The Council’s text attempts to align the definition more closely with the definition of “terrorist offences” under Directive 2017/541 on combating terrorism, referring to “material which may contribute to the commission of the intentional acts, as listed in Article 3(1)(a) to (i) of Directive 2017/541”. Introducing an element of intent into the assessment of whether a piece of content is considered “terrorist” may help limit the impact on free expression. However, it must be noted that the language of the Directive itself is already broad: it criminalises “glorification”, a vague term that captures a wide range of expression.

Making reference to “fundamental rights” does not guarantee their protection

The Council’s text makes more references to the importance of freedom of expression and information, as well as the freedom of the press and plurality of the media (Recitals 7 and 9), as elements to safeguard when adopting measures in this area. Dr. Aleksandra Kuczerawy notes that “[t]he change aims to take into account the journalistic standards established by press and media regulation, but it is doubtful that this change will actually facilitate the assessment process” when deciding on content removals. Moreover, given the limitations and accuracy rates of automated tools in analysing context, combined with the broad definition of “terrorist content”, the risk of erroneous removal of legal content persists. This is not a theoretical risk. Human rights groups like WITNESS, whose mission is to document human rights abuses, testify that online videos and other material providing crucially important evidence are routinely blocked and deleted by automated systems and human reviewers.

Scope of services covered should be narrowed considerably

The Commission’s proposal covers a wide range of services, from global social media companies, trading platforms, and cloud and storage services to newspaper sites and privately run blogs with comment sections. The Commission reasons that what is deemed “terrorist content”, having been purged from mainstream platforms, is moving to smaller ones. In our view, the focus should be on services that are demonstrably and systematically used to disseminate terrorist content with the intent of inciting violence. The definition of “hosting service provider” should be amended to cover only services that are designed and used for broad dissemination of user-uploaded content to the general public. The Council’s text may be an attempt to narrow the scope: Recital 10 excludes “other services provided in other layers of the internet infrastructure” and adds that “only those services for which the content provider is the direct recipient are in scope”. However, these changes appear only in a recital, and their effect is confusing. The scope should explicitly exclude cloud and storage services as well as infrastructure providers.

Broad powers of undefined competent authorities to impose “proactive measures”

The Council’s text sets no conditions for the designation of competent authorities, nor does it specify how many a Member State may designate. There is no requirement that the decisions they take be subject to judicial review. This is unsatisfactory, particularly given the rule of law issues in certain Member States and the fact that competent authorities would be given broad powers to impose proactive measures on service providers. The Council clarifies that these measures should be taken “depending on the risk and level of exposure to terrorist content”. However, it also leaves it up to the competent authority to “decide on the nature and the scope of the proactive measures”. Any authority with the power to impose proactive measures, including the use of automated tools, would need to be fully aware of the technical limitations of available technology. The proposal retains the provision that safeguards should consist of “human oversight and verifications where appropriate”. However, human oversight does not always address the issue of erroneous removals, particularly when content is taken down under a platform’s own terms of service.

Compromise text remains in conflict with the E-Commerce Directive

The Council’s position still calls for proactive measures to “effectively address the reappearance of content which has previously been removed”. While this may be slightly softer wording than the Commission’s (“preventing the re-upload of content”), a general monitoring obligation may still be inferred, particularly when read together with the explicit derogation from the E-Commerce Directive in Recital 19, which the Council has left untouched. Dr. Aleksandra Kuczerawy highlights that the E-Commerce Directive “does not foresee any exemptions to the prohibition in Article 15 […] the proposed measures therefore, contradict the EU acquis”.

All in all, it is clear that speed trumped thorough review in the Council proceedings. The Commission insists that the Regulation must pass before the 2019 European Parliament elections, but, as noted, it does not provide convincing evidence to support this claim. We urge Members of the European Parliament to take the time necessary to consult extensively with business and civil society stakeholders, the Parliament’s internal policy services, free expression experts, and press and media organisations. CDT will offer constructive and meaningful input for Parliament’s work.