The European Parliament is progressing its work on the draft Regulation to prevent the dissemination of terrorist content online. Parliament is working to meet the European Commission’s tight timeframe and aiming to complete its review ahead of the European Parliament elections. Issue leaders in two committees (MEP Julia Reda in the Internal Market [IMCO] Committee and MEP Julie Ward in the Culture and Education [CULT] Committee) have proposed constructive and thoughtful amendments that respond effectively to the concerns we raised in our writings on the Regulation and in meetings with MEPs. The Lead Rapporteur in the Civil Liberties Committee (LIBE), MEP Daniel Dalton, has published a much more cautious draft report that stays closer to the Commission’s text. Overall, it is very encouraging to see MEPs setting out to address some of the most problematic aspects of the proposal. This is the right approach and bodes well for the European Parliament negotiations that lie ahead. Below, we highlight some of the core problems the MEPs are addressing.
1. The definition of illegal terrorist content: In a detailed submission from December 2018, three UN Special Rapporteurs proposed using model language on terrorist content developed by the UN Human Rights Committee. This would be an excellent solution. MEPs suggest using the definition set out in the Directive on Combating Terrorism from 2017. While this definition is somewhat problematic (because it includes indirect advocacy and glorification – vague terms that could lead to overly broad restrictions on speech), using the definition from the Directive would help create consistency with existing EU law and ensure the legislation targets content that is clearly illegal.
IMCO: AMs 41-45 [Article 2(1)(5)]; AM 10 [Recital 9]
CULT: AMs 20-23 [Article 2(1)(5)]; AM 4 [Recital 9]
2. Hosting services covered by the Regulation: We have argued that the scope of providers covered in the Commission’s draft is far too broad, capturing an almost unlimited array of internet services, including enterprise cloud services and cloud infrastructure providers, which would be disproportionate. We advocate limiting the scope to services that are designed and used for sharing content with the general public – hosting services that have direct control of uploaded content, can remove discrete pieces of content deemed to be illegal, and have direct relationships with those who upload content. The amendments proposed by MEPs aim to clarify this.
IMCO: AM 40 [Article 2(1)]; AM 11 [Recital 10]
CULT: AM 18 [Article 2(1)]; AM 5 [Recital 10]
3. Competent Authorities: In our view, and in the view of MEPs, Member States should designate independent, judicial authorities to exercise powers under this Regulation, and they should preferably designate only one such authority per country. This is important because the draft Regulation gives Competent Authorities broad powers to issue legally binding removal orders determining whether content is illegal. Such determinations should be made by courts, or independent agencies under judicial oversight, in line with European and international human rights principles.
IMCO: AM 88 [Article 17]; AM 30 [Recital 37]; AM 13 [Recital 13]
CULT: AM 43 [Article 17]
4. Referrals: The Commission has argued that content referrals made by Europol and by Member State law enforcement agencies have been an effective tool to limit online content that may contribute to radicalisation and extremism. Referral of content does not imply that it is illegal, but that it may violate the terms of service of the hosting provider. The provider makes the decision about whether or not to remove the content. MEP Julia Reda proposes to remove the concept of referrals from the Regulation entirely. This is a logical and reasonable position, given that the legislation should be focused on illegal content. Companies that remove content subsequent to referrals by law enforcement do so voluntarily, and this cooperation will no doubt continue irrespective of this draft Regulation. It is also clear that the discussion about the referral system would benefit from greater transparency about its effectiveness and the extent to which it causes legitimate content to be restricted.
IMCO: AM 47 [Article 2(1)(8)]; AM 60 [Article 5]; AM 64 [Article 7]; AM 14 [Recital 15]
5. Removal orders and one-hour takedown: We argue that removal orders issued by Competent Authorities should be dealt with expeditiously and diligently, but that the one-hour requirement is disproportionate and somewhat arbitrary. MEPs have introduced sensible and reasonable amendments that require removal orders to be dealt with without undue delay.
IMCO: AM 52 [Article 4(2)]; AM 13 [Recital 13]
CULT: AM 30 [Article 4(2)]; AM 7 [Recital 13]
6. Proactive measures: The concept of proactive measures is one of the most controversial issues in the draft Regulation. We oppose the concept because it is hard to see how imposing proactive measures on providers does not amount to a general monitoring obligation within the meaning of Article 15 of Directive 2000/31/EC (the e-Commerce Directive), and to pervasive upload filtering. Research shows that available filtering solutions are unable to make sophisticated assessments of context to determine whether text, video, audio or photos can be considered illegal. Mandating providers to install such technologies, explicitly or implicitly, would entail a serious risk that legitimate speech is restricted. It is therefore right that MEPs propose to remove the concept of proactive measures from the Regulation.
Many providers use automated tools (often in combination with human verification) to moderate content for a variety of purposes, including to prevent dissemination of illegal and harmful content. When they do so, they have an important responsibility to ensure full transparency and accountability of their moderation policies and enforcement.
It is possible that the upcoming negotiations on the draft Regulation will result in provisions that enable Competent Authorities to order providers to take additional measures to limit illegal terrorist content. If so, it is essential that any such measures be narrowly targeted at services that have demonstrably and to a significant extent been used for the dissemination of illegal terrorist content – that is, services that have received and complied with a considerable number of justified removal orders.
IMCO: AM 6 [Recital 5]; AM 15 [Recital 16]; AM 16 [Recital 17]; AM 17 [Recital 18]; AM 18 [Recital 19]; AM 19 [Recital 21]; AM 24 [Recital 28]; AM 59 [Article 4]; AM 62 [Article 7]; AM 68 [Article 8]; AM 72 [Article 8]; AM 74 [Article 9]
CULT: AM 36 [Article 6]; AM 12 [Recital 19]
CDT welcomes the approach MEPs have taken in their draft Opinions, and we will work with Parliament to find solutions and compromises where necessary. It is unfortunate that the Commission has issued its proposal under such time pressure; many of the issues in the draft Regulation would benefit from more thorough discussion than the intended timeframe allows. At this stage, we encourage MEPs to support the proposals made by the IMCO and CULT leaders on the core issues discussed above.