As government leaders, policymakers, and technology companies continue to navigate the global coronavirus pandemic, CDT is actively monitoring the latest responses and working to ensure they are grounded in civil rights and liberties. Our policy teams aim to help leaders craft solutions that balance the unique needs of the moment, while still respecting and upholding individual human rights. Find more of our work at cdt.org/coronavirus.
On April 24, 2020, the Center for Democracy & Technology hosted an online roundtable on “COVID-19, content moderation and the Digital Services Act”.
The COVID-19 pandemic has created new challenges in online content moderation. Social media platforms have proved essential for people to connect and communicate with family, friends and broader communities, and for authorities to provide citizens with relevant and necessary information about the outbreak. E-commerce services have enabled distribution of important goods during the lockdown. At the same time, various actors have sought to spread disinformation about the crisis, such as baseless claims about how the virus came about, and have attempted to sell ineffective or even dangerous remedies. Companies have stepped up their efforts to combat both disinformation and the sale of fraudulent products.
The COVID-19 crisis comes at the same time as the European Union institutions begin deliberations on the Digital Services Act (DSA). The DSA will set the future legal framework for online content governance, and define new responsibilities for companies hosting third-party content. Our discussion focused on the experiences from the COVID-19 situation, what conclusions can be drawn, and how these lessons might inform the ongoing and upcoming Digital Services Act discussions. We heard from the European Parliament, the European Commission, industry, civil society and academia:
- Alex Agius Saliba MEP
Group of the Progressive Alliance of Socialists and Democrats in the European Parliament
- Prabhat Agarwal
European Commission, Deputy Head of Unit, Online Platforms & eCommerce
- Siada El Ramly
EDiMA, Director General
- Emma Llansó
CDT, Director, Free Expression Project
- Joris van Hoboken
Vrije Universiteit Brussel (VUB), Professor of Law, appointed to the Chair ‘Fundamental Rights and Digital Transformation’
- Eliška Pírková
Access Now, Europe Policy Analyst
- Moderated by Jens-Henrik Jeppesen
CDT, Director, European Affairs
Alex Agius Saliba MEP
The panel kicked off with the contribution of Alex Agius Saliba MEP, who emphasized how digital infrastructure helped mitigate the impact of the crisis. Online services have played a crucial role, allowing European citizens to work and continue their daily lives, and the EU institutions to continue their activities remotely. However, the pandemic has also exposed shortcomings in the existing regulatory framework. “Unfortunately the COVID-19 pandemic has also shown how vulnerable EU consumers can be to misleading trading practices of selling fake or illegal products (…) and other unfair conditions (…),” Saliba said.
As one of three rapporteurs of the European Parliament’s own-initiative reports on the topic, MEP Saliba presented his recently published draft report, ‘Digital Services Act: Improving the functioning of the Single Market’, for the Internal Market and Consumer Protection Committee. He highlighted the main elements of his text, which calls for a comprehensive reform of the e-Commerce Directive while preserving some of its core pillars: the prohibition on imposing general monitoring obligations, the limited liability regime, and the country-of-origin principle, among others, are to be maintained in the upcoming legislation. Saliba’s view of the DSA takes a horizontal approach and goes beyond mere platform regulation. Various business models, as well as competitiveness in the market, will have to be considered. Businesses not established in the EU but capable of directly affecting the market and consumers will also need to be covered. Special attention must be paid to consumer protection, user safety, and respect for European rights and values. The underlying principle is that “what is illegal offline is also illegal online”.
Prabhat Agarwal
Prabhat Agarwal, Acting Head of Unit in DG CNECT, emphasized the high quality of MEP Saliba’s report and noted that on several points it aligns with the European Commission’s preliminary work on its legislative proposal for the DSA. He then made several key observations concerning the ongoing pandemic:
- Digital platforms have stepped up to their responsibilities in managing the crisis. The DSA will need to examine options for facilitating the voluntary and proactive measures platforms are taking.
- Users’ safety is a critical issue, especially in terms of exposure to illegal content and unsafe goods.
- The use of automated content moderation tools will be an important area to examine from a user safety and fundamental rights perspective.
- Users must be empowered to make their own decisions. They should have sufficient choices and options in how content is presented to them, according to their preferences.
In his final remarks, Agarwal stated that the political green light has now been given for DSA work to continue. “The machine is restarting for the public consultation”, which could be launched in the second half of May. The Commission’s draft proposal(s), originally expected in Q4 of this year, may arrive in Q4/2020 or Q1/2021; the timing will be decided soon.
Siada El Ramly
Reflecting on content moderation efforts to tackle COVID-19 disinformation, El Ramly stressed that one of the main takeaways from the crisis is the importance of strong cooperation between stakeholders, given how interlinked they are and the responsibility each one carries. “The reason why the companies were able to act to a large part (…) in trying to demote content that could be deemed disinformation was because there were good contacts with e.g. the WHO,” she said. In general terms, no single sector of the content distribution chain can tackle the issue alone.
Echoing Agarwal’s positive assessment of the voluntary and proactive measures taken by industry, El Ramly said EDiMA supports self-regulation and companies’ ability to act on their own initiative as one of the cornerstones of the upcoming DSA framework, but recognises that these efforts need oversight by an independent body. New legislation should be as pragmatic as possible while providing legal certainty where necessary, e.g. for notice-and-action procedures. El Ramly introduced EDiMA’s most recent position paper on the DSA, the ‘Online Responsibility Framework’.
Emma Llansó
Emma Llansó, Director of CDT’s Free Expression Project, focused her contribution on companies’ shift towards automation in content moderation during the pandemic, highlighting that:
- Automated content moderation has increased error rates compared with traditional human moderation.
- These measures have also limited companies’ ability to give users a right to appeal inaccurate assessments of their content.
- Restricting fundamental human rights must only be seen as a temporary measure. “Whatever the companies have to do in this situation, it needs to be understood as a kind of emergency procedure (…). This is not the new status quo, it is something that is happening because of the particular constraints the global pandemic is putting on these systems.”
Llansó further referred to the recently published COVID-19 Content Moderation Research Letter (coordinated by CDT, the Committee to Protect Journalists, and WITNESS) urging companies to preserve data on content blocking and removal during the pandemic, so as to build a more comprehensive picture of how these automated measures perform. Transparency and accountability in companies’ content moderation systems should be one of the DSA’s main areas of focus. There is no one-size-fits-all approach to transparency reporting – the various sizes and types of digital services, as well as the different audiences for the reports, have to be considered. Llansó also stressed the overriding importance of the ‘human in the loop’ principle in every content moderation system as a guarantee of users’ privacy and freedom of expression. A basic framework for policymakers could be the Santa Clara Principles, which promote users’ right to fair and unbiased due process.
Joris van Hoboken
Joris van Hoboken, bringing in the academic perspective, stressed that the coronavirus has accelerated the use of automated content governance and demonstrated the need for better transparency and accountability in platforms’ systems. The crisis also shows that we have to take seriously the human labor involved in content moderation, and ensure the protection of those workers’ rights and well-being. He referred policymakers to the extensive research of the Transatlantic Working Group.
A core challenge in drafting the new DSA proposal will be to bring together all the legal frameworks that already exist in the EU law. On top of digital market regulation and consumer law as outlined in MEP Saliba’s report, the debate will also have to consider the rich case law of the European Court of Justice in the area of fundamental human rights issues, privacy and data protection, and non-discrimination. “Designing a new or amended regulatory framework for digital services will have to involve proper understanding and articulation of fundamental rights and their implementation into actual safeguards,” van Hoboken emphasized. Additionally, media law and a wide variety of regulatory approaches towards illegal or harmful content will have to be brought into perspective.
Eliška Pírková
Eliška Pírková from Access Now reiterated and supported several points made by previous speakers, including the need to carefully monitor the increasing use of automated tools and the emphasis on better transparency and accountability (Access Now co-signed the above-mentioned letter). Human review should remain a key part of content moderation.
Pírková also stressed the crucial but ambiguous role of governments in tackling online disinformation relating to COVID-19. “Government (…) measures to tackle disinformation will determine how the post-COVID situation will look like”, she said. On the one hand, responsible governments have provided essential official information; on the other, some governments and individual politicians have spread or amplified highly problematic misinformation. She emphasized that new measures should not lead to legislation that criminalises speech based on vague definitions such as ‘fake news’ and ‘propaganda’. Even if human rights can be restricted in an emergency, they remain essential and must continue to apply. Pírková introduced Access Now’s official position on the DSA, its ‘26 recommendations on content governance’, which offer human rights-centred guidelines that can serve as a baseline for content governance that safeguards human rights.