European Policy, Open Internet

EU Tech Policy Brief: March 2018 Recap

This is the March recap issue of CDT’s monthly EU Tech Policy Brief. It highlights some of the most pressing technology and internet policy issues under debate in Europe, the U.S., and internationally, and gives CDT’s perspective on them.

New EC Recommendation Demands One-Hour Takedown for ‘Terrorist’ Content

In October of last year, the European Commission published its Communication on tackling illegal content online. We criticised it for pushing private companies to police, on governments’ behalf, content that may be considered harmful or illegal. A few months later, a new European Commission Recommendation doubled down on this approach. It relies heavily on mechanisms such as trusted flaggers with government backing, Internet Referral Units, and hash databases to speed up takedowns. Safeguards and appeal processes are mentioned, but only as voluntary measures. No role is foreseen for courts or judicial review, and the public is left with little recourse to hold authorities accountable for declaring that an individual’s speech violates the law. Before the Recommendation was published, we submitted a joint letter urging the Commission to reflect further on its approach to tackling illegal content online. A public consultation is forthcoming, and in our response to the preliminary Inception Impact Assessment, we call on the Commission to collect and publish comprehensive data about the nature and volume of the content it targets. Without such analysis, it is premature to propose any legislative option. Moreover, when the Commission assesses ‘progress’ in tackling various types of content, progress cannot be measured simply in terms of faster takedown of more content. The Commission must also consider the extent to which its policy results in de facto censorship of legal political speech.

Copyright DSM: Rapporteur Puts Forward (Un)compromised Amendments

The copyright debates in both the European Parliament and the Council seem to be moving in the wrong direction. Parliament rapporteur MEP Axel Voss finally presented ‘compromise’ amendments, which unfortunately do not balance the views of other political groups or stakeholders. Mr Voss doubled down on the most problematic elements of Article 11 (ancillary copyright), expanding its scope of beneficiaries to cover not only press publishers but also news agencies. This puts the right to access information at risk, given that simple facts and compilations of basic information could also be protected under the expanded scope. In addition, press publishers and news agencies would be granted an “inalienable right to obtain a fair and proportionate remuneration for such uses”, which would in effect require publishers to demand payment from news aggregators. Unfortunately, Mr Voss’ amendments do not include the proposal put forward last year by former rapporteur MEP Therese Comodini, which would instead have given press publishers a presumptive right of representation in court.

The discussions around Article 13 (upload filter obligation) are somewhat more advanced, in both the Parliament and the Council. While more carve-outs are being added, the provision’s scope remains too broad, and it continues to risk considerable damage to the digital economy and unintended harm to users’ right to free expression. As we’ve said on various occasions, and as the Council of Europe agrees, mandating upload filters is unjustifiable.

‘Fake News’: High-Level Expert Group Puts Forward Comprehensive Set of Recommendations

In February 2018, CDT responded to the European Commission’s public consultation on tackling ‘fake news’, which Commissioner Mariya Gabriel announced last year alongside the creation of a High-Level Expert Group. In our response, we emphasised that the lack of European data in this area is a poor basis for policymaking. We also highlighted the misuse of the term ‘fake news’ and the need for a consolidated, targeted definition of the phenomenon, to avoid responses that result in censorship. In this respect, we are pleased that the expert group’s report published in March avoids the term ‘fake news’ and instead frames the problem as ‘disinformation’. The report also promotes media and information literacy, tools to empower users and journalists, and greater transparency of online news, among other recommendations. A somewhat unsettling element of the report is its suggestion to establish a Coalition of all stakeholders, which would support internet platforms in down-listing disinformation and promoting ‘real news’ on their platforms. The proposal includes periodic assessment of progress by the Commission; if results are deemed unsatisfactory, there is a risk of “fact-finding and/or additional policy initiatives, using any relevant instrument, including competition instruments or other mechanisms”. In our view, the suggested structure sounds too similar to the Hate Speech Code of Conduct, which we continue to criticise for promoting privatised law enforcement at significant risk to free speech online. The aim of demoting disinformation should be balanced against the need to preserve an open and free internet. The Commission is due to publish a Recommendation on this issue at the end of April, which is expected to take due account of the expert group’s recommendations.

ePrivacy: Council Makes Progress in Discussions on Controversial Provisions

On 28 March, the Council Telecoms Working Party met to discuss the ePrivacy proposal on the basis of the latest Bulgarian Presidency discussion paper, which focuses on Articles 8, 10, 15, 16, and the related recitals. The paper seems to follow the structure of the European Commission’s proposal, with a general prohibition followed by a list of exceptions. On Article 8 (protection of end-users’ terminal equipment information), the latest Presidency document includes provisions allowing for security updates, an area where the Working Party seemed closer to consensus. We have already highlighted the need for this proposal to focus on the confidentiality of communications and on information security. On Article 10 (privacy settings), Member State delegations appeared more divided, particularly on whether users should be given a choice of privacy settings in their browsers. As currently drafted, the Presidency text would require that end users be informed about the possibility of choosing a privacy setting, but it would not compel them to agree to a setting upon installation or first use.