COVID-19, Content Moderation and the EU Digital Services Act: Key Takeaways from CDT Roundtable

As government leaders, policymakers, and technology companies continue to navigate the global coronavirus pandemic, CDT is actively monitoring the latest responses and working to ensure they are grounded in civil rights and liberties. Our policy teams aim to help leaders craft solutions that balance the unique needs of the moment while still respecting and upholding individual human rights. Find more of our work at cdt.org/coronavirus.

On April 24, 2020, the Center for Democracy & Technology hosted an online roundtable on "COVID-19, Content Moderation and the Digital Services Act."

The context for the discussion was the challenge the COVID-19 pandemic has created for online content moderation. The crisis comes just as the European Union institutions begin deliberations on the Digital Services Act (DSA), which will set the future legal framework for online content governance and define new responsibilities for companies hosting third-party content. Our discussion focused on the experience of the COVID-19 situation: what conclusions can be drawn, and how those lessons might inform the ongoing and upcoming Digital Services Act discussions.

We heard from the European Parliament, the European Commission, industry, civil society, and academia. Alex Agius Saliba MEP presented his thorough and thoughtful draft report on the DSA for the European Parliament's Internal Market Committee. Prabhat Agarwal, Acting Head of Unit in the European Commission's DG CNECT, commented on MEP Agius Saliba's report and laid out the Commission's plans for the DSA. Siada El Ramly, Director-General of EDiMA, provided an industry perspective on the DSA and on the experiences of leading online platforms. Emma Llansó, Director of CDT's Free Expression Project, discussed the lessons of the pandemic for company practice and public policy, drawing on, among other things, her work in the Transatlantic Working Group on Content Moderation and Free Expression. Professor Joris van Hoboken of the Vrije Universiteit Brussel (VUB) put the DSA policy issues in context, noting their relationship with other areas of law and policy: consumer protection, data protection, media, and competition. Eliška Pírková of Access Now reinforced the need to protect fundamental rights both during and after the pandemic, and presented Access Now's principles for online content governance. The event was moderated by Jens-Henrik Jeppesen, CDT's Director of European Affairs.

An important takeaway from the discussion was the conclusion that online services have played an essential role in helping European societies respond to the crisis. Online communications services have enabled people in lockdown to work, stay in touch with family and friends, and continue their education. Online services have helped authorities provide essential healthcare information to citizens, and many have taken robust measures to counter COVID-19-related misinformation, scams, and fraudulent behaviour. In this sense, the pandemic has demonstrated how much modern societies rely on online services, which in turn shows the legitimate public interest in setting the right framework for how they operate – the objective of the Digital Services Act. CDT's goal is to ensure the new framework is conducive to online free expression and innovation.

Another conclusion: the very tough measures taken to stop the spread of health misinformation and defective products that could put lives at risk are amply justified by the deep crisis our societies are facing. Law and policy should be flexible enough to enable content hosts to take responsible, voluntary action to counter these risks. However, regulation should not mandate such strict content moderation in normal circumstances, or for the many types of speech and content that could fall under a broad categorization of disinformation. It is important for companies to act quickly to stop the spread of inaccurate health information that can lead to injury and death, but the same measures cannot and should not be applied to other forms of content, such as political speech. Some observers have called for companies to crack down just as forcefully on content they consider disinformation; such content, however, is typically protected speech, and moderating it requires careful assessment. Emergency measures that companies put in place during the pandemic need to be time-limited, wound down as the crisis abates, and not applied broadly to all types of content.

Third, the COVID-19 pandemic has accelerated and intensified the use of automated tools for content moderation, because human reviewers have been unable to work as usual. This experience shows both the necessity and the limitations of automated tools for at-scale moderation. It confirms that increasing reliance on automated tools carries costs for free expression and access to information, in the form of overbroad restriction of content with limited opportunity to appeal. It demonstrates that human review remains essential for making the difficult judgments that rights-respecting moderation requires. And it shows the need to ensure that responsible content moderation is conducted with transparency, accountability, and fairness, including rights of appeal. These are important lessons for the Digital Services Act, which is likely to propose oversight of content moderation and processes for notice-and-action.

Here’s a full summary of this webinar, with detailed breakdowns of each speaker’s contribution.