European Commission Online Platform Proposals Put Onus on Companies
Written by Jens-Henrik Jeppesen
On 25 May, the European Commission published a set of proposals and documents under the umbrella of its Digital Single Market strategy. Among them is a Communication on ‘Online Platforms and the Digital Single Market: Opportunities and Challenges for Europe’. This sets out the Commission’s conclusions and proposed actions based on its Platforms Consultation, to which CDT responded in December 2015.
There are positive messages in the document, but also some problematic ones. CDT has consistently pushed back on proposals that would endanger the internet as an enabler of free expression, public debate, and access to information. That was the main theme in our response to the Platforms Consultation, and in our March 2015 paper on the importance of the internet for free expression in Europe.
The good news first. The Commission confirms that it does not intend to re-open the E-Commerce Directive, which sets out intermediary liability protections for caching, hosting, and mere conduit. We argued that these protections have been absolutely fundamental in spurring the growth of a wide range of online services that have created space for free expression, creativity, and debate, as well as untold innovation, economic growth, new services, and entrepreneurship. We welcome the confirmation that the Commission shares this view.
This decision was not a foregone conclusion. In its May 2015 Communication announcing the DSM strategy, the Commission talked about introducing a ‘duty of care’ or monitoring obligations on intermediaries. These ideas, had they moved forward, would have had dramatic consequences for citizens, consumers, and entrepreneurs building online businesses.
On to the more problematic elements. The Commission confirms it intends to continue to push internet companies to ‘do more’ to combat various forms of illegal, harmful – or perhaps merely undesirable – content online. As an example of successful initiatives, the Commission refers to its on-going cooperation with online platforms on ‘terrorism content’, and a similar dialogue on a code of conduct on ‘illegal hate speech’.
Curbing hate speech and content related to terrorism are not new policy priorities, and the Commission is by no means the only government body pushing for private sector action to enforce often vaguely defined speech laws. Across large parts of Europe, governments are intensely focused on Islamic extremism and concerns that online and offline efforts to radicalise young people are creating a fertile recruitment ground for groups such as the Islamic State. In response, they are running extensive counter-terrorism and anti-radicalisation policies with both offline and online components. Some, such as Europol’s Internet Referral Unit, are coordinated at EU-level while others are under Member State control. Curbing hate speech is also increasingly viewed as a means of dealing with the controversial and sometimes toxic debate provoked by the on-going migrant crisis.
It’s not surprising that the Commission reflects these priorities in its Communication. These are significant problems for European societies, and how they can be solved is not obvious. However, the Commission must ensure that its policies are focused on a legitimate aim – for example, preventing the commission of violent acts – and not on the suppression of extreme perspectives or opinions, which are protected as a matter of fundamental rights. Further, enlisting private sector companies in efforts to curb expression considered illegal, harmful, or undesirable is inherently problematic. It is a fundamental principle of the rule of law that courts, not companies, determine what speech is and is not lawful. When companies are compelled to purge such content from their services, with little or no transparency and a lack of due process, there is considerable risk that the result will be to stifle legitimate debate.
It is not easy to tell at this point how industry-government cooperation on hate speech and content linked to terrorism and extremism is working. The Commission seems enthusiastic, but it must ensure full transparency, clear guidelines, and proper mechanisms for redress and appeal. It must ensure that only illegal content is targeted, and that there is clarity about how, and by whom, those determinations are made. Past experience with voluntary schemes and notice-and-takedown for copyright infringement (another priority for the Commission) suggests that incentives are often skewed towards excessive content removal at the expense of the presumption of innocence.
Finally, on copyright and enforcement, judgment must be reserved. The Communication restates the Commission’s view that revenue streams from the distribution of copyrighted content online are unfair in some circumstances, but does not say how it may seek to address this. Some of the ideas the Commission has previously aired are worrying, for reasons we have discussed in the past. These include defining new ancillary rights, and potentially narrowing the category of intermediaries that benefit from liability protections under the E-Commerce Directive. CDT will continue to caution against these types of policies as the Commission considers its future copyright and enforcement strategy.