EU Tech Policy Brief: September 2018 Recap
Written by Jens-Henrik Jeppesen, Laura Blanco
This is the September recap issue of CDT’s monthly EU Tech Policy Brief. It highlights some of the most pressing technology and internet policy issues under debate in Europe, the U.S., and internationally, and gives CDT’s perspective on them.
E-Evidence: Parliament and Member States Proceed Slowly on the Commission’s Proposals
The Civil Liberties (LIBE) committee of the European Parliament held the first discussion on the E-Evidence proposals this month. Rapporteur MEP Birgit Sippel and shadow rapporteurs raised several issues that we discussed in our initial observations and most recent recommendations on the proposals. MEP Sippel agreed that the proposals could help speed up criminal investigations, but she also argued that they create more legal uncertainty and put users’ fundamental rights at risk. She noted that the proposal burdens small service providers with verifying the authenticity and validity of judicial authorities’ requests to access electronic data; that its deadlines are unreasonable for smaller service providers; and that it does not fully guarantee the rights to information and to effective remedies. A new study commissioned by the European Parliament’s Policy Department for Citizens’ Rights and Constitutional Affairs, at the request of the LIBE Committee, also points out a number of shortcomings in the Commission’s proposals. On the Member State side, the Austrian Presidency is expected to publish a discussion paper ahead of the next meeting of Justice and Home Affairs Ministers on 11-12 October.
Copyright: European Parliament Misses Opportunity to Adopt Balanced Position
On 12 September, the European Parliament missed the opportunity to redraft the most problematic parts of the Copyright Directive text adopted by the Legal Affairs (JURI) committee last June. The outcome of the plenary vote largely mirrors the European Commission’s proposal. Our main concerns with Article 13 (upload filters), Article 11 (press publishers’ right) and Article 3 (text and data mining exception) of the proposal remain. These provisions would have disastrous consequences for European citizens’ ability to communicate freely and share and access information online. Before the vote, we called on MEPs to support the amendments tabled by a cross-party coalition of parliamentarians. These amendments addressed the most serious concerns for free expression and access to information. Now, the European Commission, Parliament, and Member States will begin trilogue negotiations. The final vote in Parliament is expected in January 2019. CDT will continue to follow the institutional discussions closely and advocate for a progressive, innovation-friendly, and flexible copyright regime in the EU that safeguards internet users’ rights and freedoms.
Terrorist Content: EC Proposal Needs Significant Revision
On 12 September, the European Commission published a proposal for a Regulation on preventing the dissemination of terrorist content online. In our view, this is the most far-reaching of the latest initiatives to regulate and restrict various types of online content. In our consultation response earlier this year, we cautioned against new legislative measures in the absence of compelling evidence to justify them. We also laid out some core principles that any policy measure should respect.
The proposal’s most worrying element is that, for the first time, draft legislation explicitly breaks with the limited liability principles set out in the 2000 E-Commerce Directive (ECD). The draft regulation also relies on a relatively new and untested definition of “terrorist content”; enables law enforcement or other “competent authorities” to issue binding removal orders to content hosts, which must be implemented within one hour; and requires content hosts to include in their Terms of Service “provisions to prevent the dissemination of terrorist content”. In addition, the draft regulation allows the same “competent authorities” to refer possible terrorist content to hosts for “voluntary” review against their Terms of Service; enables those authorities to instruct hosts to use “proactive measures”, such as upload filtering, to restrict terrorist content; empowers those authorities to impose penalties on content hosts; and requires hosts to report “any evidence of terrorist offenses” to national law enforcement and/or Europol.
Tackling Disinformation: Proposed EU Code of Practice Poses Free Expression Risks
In April 2018, the European Commission published a Communication on “Tackling Online Disinformation: A European Approach”. Among the Communication’s short-term outcomes is the creation of “an ambitious Code of Practice” intended to establish a new set of standards for fighting disinformation online. The Code builds on the Key Principles proposed by the High Level Expert Group. In response to the Communication and the public consultation, we cautioned against the potential free expression risks of this self-regulatory initiative.
Last July, the Working Group of the multi-stakeholder forum on online disinformation — composed of online platforms, leading social networks, advertisers and the advertising industry — delivered a draft Code of Practice. This month, the final Code was released. The Code delivers commitments on five main areas: scrutiny of ad placements, political and issue-based advertising, integrity of services, empowering consumers, and empowering the research community. While many of the specific commitments in the Code are benign, or even positive, we remain concerned that the overall process is oriented toward pressuring platforms to remove or suppress content and accounts without meaningful safeguards.
CDT’s Nuala O’Connor to Speak at ICDPPC 2018
The International Conference of Data Protection and Privacy Commissioners (ICDPPC) will take place in Brussels this year under the theme of “Debating Ethics: Dignity and Respect in Data Driven Life”. CDT’s President & CEO Nuala O’Connor will lead a discussion on Wednesday, 24 October, on how digital technology is changing the way we behave and interact. She will be joined on the panel by Tristan Harris of the Center for Humane Technology, journalist Julia Angwin, Malavika Jayaram of Asia Digital Hub, and others. We will also host a side event on the capabilities and limits of content moderation tools on Tuesday, 23 October, at 4:10pm. Come join us!
ECtHR Rules That UK’s Bulk Surveillance Regime Violates Fundamental Rights
On 13 September, the European Court of Human Rights (ECtHR) ruled that the UK’s bulk surveillance regime violates the rights to privacy and freedom of expression set out in the European Convention on Human Rights (ECHR). In particular, the Court highlighted inadequate oversight of selection criteria and selector choices in searching intercepted communications, a lack of safeguards in the selection of bearers for interception, a lack of protections afforded to metadata, and a lack of safeguards for journalists’ confidential materials. However, it is disappointing to see the Court determine, for the second time this year, that bulk interception regimes can in principle be ECHR-compliant. This conclusion will hinder efforts to end mass surveillance.