
EU Tech Policy Brief, April 2019 Recap

This is the April 2019 recap issue of CDT’s monthly EU Tech Policy Brief. It highlights some of the most pressing technology and internet policy issues under debate in Europe, the U.S., and internationally, and gives CDT’s perspective on them.

Terrorist Content Online Regulation: Parliament improves text but retains one-hour deadline

On 17 April, the European Parliament adopted its Report on the proposed Regulation on Terrorist Content Online. We are pleased that many of our concerns were addressed, although troubling elements remain in the Parliament’s version of the legislation. Among other things, definitions were tightened to exclude reporting, research, and educational content. The scope of covered service providers was narrowed to exclude infrastructure services, and the language on ‘competent authorities’ now ensures independence and judicial oversight. The article on referrals was removed, and the wording on ‘proactive measures’ was improved so that it no longer implies upload filters. Unfortunately, amendments to the one-hour takedown deadline were rejected by a margin of just three votes. We urge the institutions to retain these improvements when ‘trilogue’ negotiations begin.

DSM Copyright Directive: Council adopts over objections of six Member States

On 15 April, the Council of Ministers confirmed adoption of the DSM Copyright Directive. Six Member States voted against, and three abstained. Member States now have two years to transpose the Directive into their national legislation. Outgoing European Commission President Jean-Claude Juncker told the German press that he favours an outcome that avoids upload filters. In the same vein, the German government stated that “the aim must be to make upload filters largely unnecessary,” adding that if implementation leads to restrictions on freedom of expression, it will “work to ensure that the shortcomings are corrected.” The French government has no such qualms: it began work on implementation this month and has said it wants pervasive deployment of content recognition technology as a matter of priority. CDT will engage actively in the implementation process to limit the impact on online expression.

Intermediary Liability: UK Government will create new internet regulator to crack down on ‘harmful content’

On 10 April, the UK Government released its White Paper on ‘online harms’, aimed at tackling illegal and harmful content and activity online. It proposes a regulatory architecture based on ‘duties of care’ enforced by a new regulatory body. This regulator would oversee the drafting and enforcement of ‘codes of practice’ for content hosts, and would have powers to fine companies it finds in breach of them. Many reasonable concerns and good intentions are cited as motivation for the policies set out in the White Paper, including the intention to promote free expression, access to information, and internet-based innovation. However, the White Paper sets out concerns about a sweeping range of content that may (or may not) be illegal, or is seen (by some) to be harmful. It describes this ‘harmful’ content in such vague terms that it is impossible to know what content might be censored by the future regulator. The result may be drastic administrative limitations on legitimate content online, without due process. CDT will address these rule of law issues and many other serious concerns in the White Paper when we respond to the public consultation.

AI: European Commission ‘Trustworthy AI’ guidelines released

On 8 April, the European Commission’s High-Level Expert Group on AI (HLEG) published its report, “Ethics Guidelines for Trustworthy AI”. CDT participated in a previous consultation on a draft version of the guidelines. The guidelines, which aim to set global standards for responsible development and deployment of AI and machine learning technology, cover a wide variety of topics. These include human agency and oversight, technical robustness and safety, privacy and data governance, transparency, diversity and inclusion, societal well-being, and accountability. CDT welcomes the European Commission’s initiative, but cautions that high-level guidelines may be too general to provide concrete, actionable guidance that can be applied in real-world scenarios. In our comments on the guidelines, we encourage the HLEG to continue to refine and strengthen the text.

CDT Urges European Court of Human Rights to Outlaw or Limit Bulk Collection

On 26 April, CDT filed a brief in the Grand Chamber of the European Court of Human Rights (ECtHR) in the joined petitions of Big Brother Watch and Others v. United Kingdom, Bureau of Investigative Journalism and Alice Ross v. the United Kingdom, and 10 Human Rights Organisations v. the United Kingdom (collectively called “Big Brother Watch”). In September 2018, the First Section of the ECtHR ruled in Big Brother Watch that, while the UK’s surveillance programme violated the right to privacy under Article 8 of the European Convention on Human Rights (ECHR), bulk collection regimes were not unlawful per se. CDT argues that the lack of transparency, of safeguards for targets, and of adequate judicial review are grounds to declare the surveillance scheme unlawful under the ECHR. Since the programmes of the U.S. National Security Agency lack these crucial elements, the exchange of intelligence with the UK cannot be considered ECHR-compliant. Should the Grand Chamber rule differently, CDT asks that it limit data sharing with third countries, require that data be collected only where there is reasonable suspicion, and require improved notice to surveillance targets.

The European Court of Human Rights rules against Russia in online free expression case

On 30 April, in the case Kablis v. Russia, the European Court of Human Rights (ECtHR) held that Russia violated Article 10 (freedom of expression), Article 11 (freedom of assembly), and Article 13 (right to an effective remedy) of the European Convention on Human Rights (ECHR). Mr. Kablis, an opposition advocate, used social media and his blog to draw attention to a decision by authorities to ban a planned demonstration, and called on supporters to attend a protest in a different location. The authorities issued orders to block access to Mr. Kablis’ blog and his social media account. The ECtHR ruled that this action constituted ‘prior restraint’ because it was carried out by a prosecutor without a court having declared the content in question illegal. The orders, and the interference with Mr. Kablis’ right to free expression, could not be justified by any particular circumstances, and therefore constituted a breach of Article 10.