CDT’s Response to EC ‘Fake News’ Consultation: How to Tackle the Issue and Protect Free Expression?
Written by Jens-Henrik Jeppesen
On 23 February, CDT filed its response to the European Commission’s Consultation on Tackling ‘Fake News’. Commissioner Mariya Gabriel announced the consultation and the creation of an expert group in November 2017.
Commissioner Gabriel should be commended for launching this initiative, and we are hopeful that it will contribute solid European data and analysis, without which it is impossible to make sound policy recommendations. However, CDT worries that the group largely lacks participation from NGOs and experts focused on protecting free expression. This is unfortunate, and we urge the group to take in views and evidence from such experts and to discuss them in an open and transparent way. The group’s objectives are to assess the scope of the problem, evaluate measures already taken, and propose possible future actions. The Commission’s approach is preferable to that taken by some governments, especially if it proceeds in that order: gathering evidence before recommending action. For example, French President Emmanuel Macron has announced plans for a new law to tackle ‘fake news’ in France, raising clear free expression concerns. In other parts of the world, governments in countries with weaker democratic traditions and institutions are doing the same, and such laws pose even greater risks of abuse to repress political opponents and dissenting voices.
In our comments we make a number of observations. First, we point out that there is no consensus on a definition of the concept of fake news, which the consultation document acknowledges. Indeed, politicians in several countries have misused the concept to discredit journalists and undermine the credibility of the press. It is essential to define the concept in a way that does not capture good-faith mistakes, satire, biased or non-neutral news reporting (i.e. most journalism) and editorial decisions that prioritise certain issues over others, misleading headlines, or the omission of important and relevant information. A workable definition was contributed by PEN America, describing what they prefer to call ‘fraudulent news’: “Demonstrably false information that is being presented as a factual news report with the intention to deceive the public”. These three criteria must be met simultaneously for an item to fit the definition. Agreeing on a similarly clear and concise definition of the concept is the first step for the expert group.
Second, there is a need to gather credible evidence about what material exists that fits the description, and what impact, if any, it has. There is very little European data available, and none that enables systematic comparisons across European countries. Overwhelmingly, available studies are based on data from the 2016 U.S. election. Many of the questions in the consultation questionnaire unfortunately ignore this reality and solicit views on questions for which no data exist, opening the way for speculation and conjecture. For example, the Commission’s questionnaire refers several times to the ‘increasing amount of fake news’. While it is possible that the volume of ‘fake news’ (however defined) is increasing, there is no data that allows us to ascertain this. It is also likely, and suggested by the limited European data at hand, that the ‘fake news’ phenomenon differs significantly from Member State to Member State, and will no doubt on examination turn out to differ from the situation in the U.S. Each country has a different political system, different ideological and social tensions, a different media landscape, and differences in the way people interact with print, broadcast, and online news sources, and with social media platforms.
Third, one essential piece is missing from the Commission’s consultation document: the acknowledgement that the ‘fake news’ issue is now top of the political agenda due to the Russian regime’s deliberate, sustained and increasingly well-documented strategy to interfere with the 2016 U.S. elections. More evidence about the scale and scope of these efforts, both offline and online, is gradually coming to light in ongoing investigations. Were it not for the Russian regime’s actions in the U.S., and similar actions taken against some EU Member States, we would not be having this discussion. CDT does not engage in foreign and security policy, but it seems obvious that when a foreign and hostile regime deliberately tries to undermine and subvert democratic institutions in liberal democracies, governments must respond forcefully. The Commission seems to sidestep this issue entirely in its statements and consultation.
In addition to this fundamental point, our advice for governments and public authorities would be to minimise and, where possible, eliminate existing restrictions on free expression, and provide for a diverse media landscape with a broad range of news outlets from across the political spectrum. Most European countries provide arms-length public funding for quality and professional news production. Public financing should ideally be distributed across many, rather than a few, outlets to avoid political bias. It is particularly important to enable, rather than restrict, open and robust public debate on the most controversial issues that have potential to create polarisation.
Enabling the broadest possible public square in the online environment also means safeguarding the fundamental principle of limitation of liability for intermediaries, as laid down in the E-Commerce Directive. The EU and its Member States should refrain from imposing mandates on intermediaries to use any type of automated filtering or censorship system to combat ‘fake news’ or other types of objectionable content (such as that proposed in Article 13 of the draft DSM Copyright Directive). Laws such as the German ‘NetzDG’, while well-intentioned, are a move in the wrong direction. They create massive censorship risks and may well drive controversial political speech underground, which risks reinforcing the narrative among some groups that their opinions and speech are being unfairly repressed, and fuelling distrust in mainstream media.
News organisations should report the facts as objectively and accurately as possible. That means providing robust coverage of issues the public wants information on, particularly when those issues are divisive and polarising. If news media under-report or even suppress information about problems or issues that people find important, they fail in their mission, and they open up space for misinformation, conspiracy theories, and extremely biased reporting by fringe media. An important theme in the ‘fake news’ and related debates will be the challenges that professional news organisations face in the online environment. Some news organisations have managed to maintain and grow subscriber bases and secure new revenues through innovative use of digital tools. More should follow their lead. No doubt, some interest groups will use the expert group process to lobby for a ‘publishers’ right’ as proposed in the draft DSM Copyright Directive. This would be a bad use of the group’s time. Where it was tried, this idea failed to secure funding for publishers, and had unintended consequences for access to information.
The consultation document acknowledges that propaganda and misinformation are well-known phenomena and have been deployed in a myriad of ways throughout history. At the same time, the ‘fake news’ debate takes place in light of the increasing importance of social media as platforms for the circulation of news and for public debate. It has been documented that the Russian regime’s efforts to influence the U.S. election included a significant social media element.
This brings up broader questions about the way social media perform these functions. The major social media platforms are advertising-funded, and operate business models that seek to maximise engagement and time spent by users. This model favours content that is emotionally compelling, whether negatively or positively. The evidence disclosed thus far about Russian efforts to influence the U.S. election indicates that those behind them succeeded in targeting various groups with messages designed to intensify political and cultural polarisation. Some of these messages included content that would fit the ‘fake news’ definition.
Social media companies have reported on measures they are taking and have taken to counter such use of their platforms. Some questions in the consultation seek input on the effectiveness of those efforts, but we are not aware of publicly available data that can answer these questions conclusively. Social media companies can and should provide comprehensive data and allow independent researchers to analyse it to provide these answers.
Finally, the questionnaire seeks input on what civil society should do to meet the challenges discussed in the consultation. First and foremost, it is essential to continue defending the internet as a space that enables freedom of expression and open access to information. Second, improved media literacy could help people evaluate the veracity and reliability of news and content, and discourage them from reflexively spreading and sharing content without critical examination. Third, there is probably work to do to encourage people to inform themselves through a variety of professional, paid news sources. This is not a new insight. People have always tended to prefer information that confirms existing and cherished opinions, and social media can no doubt make it easier to satisfy personal bias. Seeking out information that challenges one’s views is hard and requires effort, but makes for a better-informed citizen.