EU Tech Policy Brief: November-December 2018

Written by Jens-Henrik Jeppesen, Laura Blanco

This is the November-December 2018 recap issue of CDT’s monthly EU Tech Policy Brief. It highlights some of the most pressing technology and internet policy issues under debate in Europe, the U.S., and internationally, and gives CDT’s perspective on them.

CDT Participates in CEPS Event on Germany’s NetzDG and the EC Terrorist Content Regulation

The Centre for European Policy Studies (CEPS) and the Counter Extremism Project recently published a report examining the impact of Germany’s NetzDG law on free expression, and its effectiveness in fighting hate speech, nine months after it entered into force. At the launch event for the report, CDT’s European Affairs Director, Jens-Henrik Jeppesen, highlighted CDT’s longstanding concerns with the law. In particular, it is impossible to determine whether content removed under the law is in fact illegal, because removed content is never reviewed by a court. Disagreement about what is and is not illegal speech is reflected in the very low takedown rates (between 10 and 28%). This underscores the need for court decisions in free speech cases, and illustrates that flaggers, reviewers, and automated tools cannot be trusted to make these determinations.

E-Evidence: Council Adopts “General Approach” Despite Concerns from Civil Society and Various Member States

During October’s Justice and Home Affairs (JHA) Council, several Member States noted the lack of review of European Production Orders by any authority other than the one issuing them. We expressed this concern in our analysis, and the point was also raised by the European Data Protection Board in its Opinion. Nevertheless, the JHA Council agreed on a “general approach” on 7 December, over the objections of several Member States and over concerns raised by several civil society groups in a joint letter to Ministers. The adopted text does not satisfactorily address these concerns. Meanwhile, the Civil Liberties (LIBE) Committee of the European Parliament held a hearing on 27 November to gather feedback on the proposal from a broad range of stakeholders. Many speakers, such as the German Judges’ Association, questioned the necessity of the proposed legislation and suggested amending the European Investigation Order as an alternative. Moreover, the rapporteur and shadow rapporteurs in LIBE have identified several areas, including scope and safeguards, that need further clarification to reach a sound text. Parliamentary scrutiny of the proposal is likely to continue after the May 2019 elections.

Terrorist Content Online: Human Rights Groups Caution Against the EC’s Proposal

On 6 December, the Austrian Presidency rushed through a compromise text for adoption by the JHA Council. Ministers agreed on a very problematic “general approach”, but several Member States share the concerns we identified. A joint letter we submitted to Ministers with many other digital policy groups questions whether the Regulation is necessary in view of the numerous counter-terrorism initiatives already taken, and highlights serious free expression concerns. It is now up to the European Parliament to take the time to address the issues that Member States did not.

DSM Copyright Directive: No Compromise in Sight

Negotiations on Article 11 (press publishers’ right) and Article 13 (upload filtering obligation) of the Copyright Directive proposal continue to prove difficult. Ahead of the 23 November COREPER 1 meeting, we joined human rights, privacy, civil rights, and media freedom organisations, software developers, creators, journalists, radio stations, higher education institutions, and research institutions in signing an open letter highlighting ongoing concerns with the proposal. “Both the Council and the Parliament texts risk creating severe impediments to the functioning of the Internet and the freedom of expression of all,” reads the letter. Earlier, in October, CDT proposed a series of amendments and recommendations for Articles 11 and 13 aimed at preserving the open nature of the internet. In particular, we call for liability mitigation measures to be included in Article 13; for hyperlinks and insubstantial parts of text to be excluded from the scope of Article 11; and for a broad, mandatory text and data mining exception in Article 3. These contentious points were left undecided during what was supposed to be the ‘final’ trilogue meeting on 13 December. The next trilogue will take place during the first plenary of 2019.

Controversial French Surveillance Regulation to be Scrutinised by CJEU

France’s highest administrative court, the Conseil d’État, decided to refer France’s legislation enabling expanded electronic surveillance to the Court of Justice of the European Union (CJEU) for a preliminary ruling. The decision follows two lawsuits brought by French Data Network, La Quadrature du Net, and the Fédération FDN (a federation of non-profit internet access providers), joined by Privacy International and CDT. We have opposed this legislation since its inception in 2015, and have called for action at the EU level to protect human rights in the surveillance context. On 4 December, CDT filed a brief (in French) with the CJEU challenging French surveillance and data retention laws.

French Parliament Adopts ‘Fake News’ Law & the European Commission Publishes ‘Disinformation’ Action Plan

On 20 November, the French National Assembly adopted new legislation to combat ‘fake news’, particularly in the context of elections. The new law calls for increased scrutiny of online platforms in the months preceding elections, and empowers judges to order takedowns of content they deem ‘fake news’ during election campaigns. We continue to worry that any such law could be used to censor free speech, particularly given the vague definition of what content counts as ‘fake’. The European Commission has also identified tackling disinformation, particularly ahead of next year’s European elections, as a top priority. It recently outlined an action plan to step up efforts to counter disinformation in Europe. One key area of the plan focuses on collaboration with online platforms and industry in the context of the self-regulatory ‘Code of Practice’ adopted earlier this year. The Code delivers commitments in five main areas: scrutiny of ad placements, political and issue-based advertising, integrity of services, empowering consumers, and empowering the research community. While we believe such commitments are necessary, we remain concerned that the overall process is oriented toward pressuring platforms to remove or suppress content and accounts without meaningful safeguards.
