European Parliament Vote Addresses Core CDT Concerns with the Proposed Regulation on Terrorist Content Online

On 17 April 2019, the European Parliament adopted its Report on the proposed Regulation on terrorist content online by 308 votes in favour to 204 against, with 70 abstentions.

The text improves significantly on the European Commission's initial proposal, released in September 2018. We are pleased that many of our concerns have been addressed, although several troubling elements remain in the Parliament's version of the legislation.

Improvements

The original draft legislation was sweeping in scope, applying to a broad range of speech and online intermediaries. The Parliament's amendments narrow the potential scope of the Regulation. The definition of "terrorist content" has been tightened, and explicitly does not apply to "content which is disseminated for educational, artistic, journalistic or research purposes, or for awareness raising purposes against terrorist activity, nor to content which represents an expression of polemic or controversial views in the course of public debate." Combined with amendments that refer consistently to the relevant articles of Directive (EU) 2017/541 on combating terrorism, this change is likely to help reduce the risk of censorship. The definition acknowledges that there are many contexts in which it is entirely legitimate to engage with material that would, in other contexts, be deemed illegal "terrorist content."

As we recommended, the definition of "hosting service provider" no longer covers cloud and infrastructure services or electronic communication services. We have argued that the focus of the Regulation should be "on those services being demonstrably and systematically used to disseminate terrorist content with the intent of inciting violence" to the public, and we are pleased to see that the committee has narrowed the scope of the Regulation accordingly.

Another major concern with the original draft was the authority it gave a broad and ill-defined category of "competent authorities" in each Member State to issue legally binding removal orders. In the text adopted by Parliament, the competent authority that can order the removal of terrorist content is more narrowly defined as "a single designated judicial authority or functionally independent administrative authority in the Member State." This definition more appropriately limits which entities can issue these orders, which should help ensure that removal orders meet the procedural and substantive standards required of any government effort to restrict speech.

Moreover, removal orders shall contain "easily understandable information about redress available to the hosting service provider and to the content provider, including redress with the competent authority as well as recourse to a court as well as deadlines for appeal." This amendment will help make both users and hosting service providers aware of how to challenge unlawful removal orders in court. The Parliament text also includes a new Article 8a that sets out transparency reporting obligations for competent authorities, requiring them to regularly disclose the number of removal orders they issue and the number of investigations they have conducted into alleged unlawful content. This is a very welcome development: transparency from governments about their efforts to target content for removal has been noticeably lacking for many years.

The article on "referrals" has also been removed from the text. The Commission's original draft would have permitted a competent authority to ask a hosting service provider to remove content on the basis of the provider's terms and conditions. This would have allowed content to be removed without necessarily respecting all the safeguards that must accompany any law restricting speech, and the Parliament has done well to reject it.

Finally, in extremely important amendments also adopted by the LIBE Committee, "proactive measures" are no longer in the legislative text. This reaffirms the E-Commerce Directive's prohibition on general monitoring obligations and confirms that filtering mandates should not be the means of tackling illegal content online. Fortunately, the European Parliament rejected several amendments tabled in the Plenary that would have undone this significant achievement of the LIBE Committee.

The one-hour deadline remains

Unfortunately, after a very close vote (300 in favour to 297 against), the one-hour deadline to remove content following a removal order remains in the final report text. We have repeatedly called for amendments to eliminate this deadline, arguing instead that hosting service providers should be required to remove content "as soon as possible." As we pointed out in a letter co-signed with other civil society and industry organisations, small and medium-sized enterprises cannot afford to be available around the clock to comply with the Regulation. The narrow margin shows, however, that nearly half of the European Parliament shares our concerns on this crucial point.

Next steps

We recognise the substantial work Parliament has done to produce this result under a very tight deadline and heavy political pressure. The final report makes key improvements to the Commission's proposal, and these must be carried through to the trilogue negotiations that will begin after the European elections this May. We urge the institutions to use the trilogues to build on the improvements adopted by Parliament and to address the portions of the legislation that continue to threaten fundamental rights.