

A Series on the EU Digital Services Act: Tackling Illegal Content Online

A blog series by the CDT Europe team on the EU Digital Services Act.

With the CDT Europe “Digital Services Act Series”, we take a deep dive into the recently adopted EU Digital Services Act, the bloc’s flagship online platform governance Regulation. We break down key obligations, reflect on the potential impact of the legislation, and offer recommendations on the next steps in bringing the legislation to life. We hope that this series provides a more comprehensive understanding of the DSA and its implications, and that it will support future analysis as the European Commission takes steps to supplement and implement the legislation.

(Have a look at our second and third editions in the series, on Due Diligence in Content Moderation and Ensuring Effective Enforcement.)

Introduction

After three years of intense negotiations, the EU institutions have agreed on a final text for the Digital Services Act (DSA); the potential gold standard for online content governance has arrived. For human rights advocates, there is much to celebrate, as the DSA maintains, and in some areas strengthens, central tenets of a free and open internet. However, rule of law concerns remain, as the Regulation expands the powers of government authorities to tackle illegal content.

In this first post of our blog series, we break down the obligations the DSA imposes for tackling illegal content, identify where these provisions will need to be strengthened with additional safeguards, and reflect on the potential global impact of the regulation.

Addressing Illegal Content Online & Intermediary Liability

The DSA adopts a two-pronged approach: On the one hand, it sets concrete obligations for how digital services must tackle illegal content, with those obligations differentiated according to the size of the service. On the other hand, it introduces novel due diligence obligations to address societal risks posed by the provision of services and the dissemination of content (which we will explore in our next post).

The core provisions on the liability of providers of intermediary services (Articles 3-7) reinforce existing aspects of the E-Commerce Directive, the legislation which the DSA replaces. This includes maintaining the conditional liability regime, under which hosting services remain exempt from liability for illegal content until they have ‘actual knowledge’ of it, at which point they must act diligently to take appropriate measures to address it. Similarly, the DSA is explicit that it imposes no general monitoring obligation. Both safeguards are essential to avoid a situation in which platforms would censor or surveil users’ speech, including legal speech. Even so, these provisions are far from perfect.

Notice & Action Mechanism

Many will have heard that the DSA has successfully established that “what is illegal offline is now illegal online.” However, it is important to clarify that this was already the case in Europe. What has in fact been achieved is a better harmonisation of the mechanisms and obligations for tackling illegal content online. Most notably, the Notice and Action Mechanism (Article 14) establishes the procedure that providers of hosting services must put in place to allow individuals to notify them of potentially illegal content on their platforms or services.

This provision has been controversial throughout the development of the final DSA text. It is through this notice mechanism that ‘actual knowledge’, and therefore potential liability for the dissemination of illegal content, can arise (actual knowledge can also be gained through a hosting service’s voluntary investigations, or through the receipt of court orders). The original proposal would have inappropriately required online platforms to make determinations on the illegality of content or speech, and would have created a significant incentive for intermediaries to remove any content notified to them in order to avoid the risk of liability. This sort of notice-and-takedown regime is vulnerable to abuse by individuals filing notices that target speech they disagree with and wish to see taken offline.

The EU co-legislators made positive headway in improving this provision from the initial draft by introducing several safeguards. The Notice and Action Mechanism in the final DSA still allows any individual or entity to submit a notice alerting the platform to the presence of allegedly illegal content. These notices, however, must be sufficiently precise and substantiated that a provider can determine illegality without conducting a detailed legal examination. In short, only if the flagged content is manifestly illegal can such notices give rise to ‘actual knowledge’. Any actions taken in response to a notice must be strictly targeted and taken with due regard to the EU Charter of Fundamental Rights, particularly freedom of expression, protection of personal data and non-discrimination. These improvements show how guidance from advocates on strengthening the provision from a rights protection perspective was embedded into the text.

Other key recommendations from digital rights advocates were also included in the final text. For example, the Notice and Action Mechanism does not mandate short response and takedown times, and platforms will also need to provide a detailed statement of reasons for any restrictions they may, or may not, impose in response to a notice (Article 15). Some concerns about the provision on Trusted Flaggers (Article 19), a status awarded by Digital Services Coordinators to entities whose notices are to be given priority, were also addressed in the final text. Trusted Flaggers are now obligated to provide proof of independence and are subject to annual reporting requirements. Avenues to counteract abuses of this system have also been put in place, such as the option for providers to report Trusted Flaggers who routinely provide inadequate or inaccurate notices.

It is important to note, however, that not all threats to fundamental rights arising from this mechanism have been addressed. In particular, law enforcement agencies are explicitly listed among the entities that can become Trusted Flaggers, even though their notices merely allege that content is illegal. As detailed further below, notifications made by law enforcement acting as Trusted Flaggers would be tantamount to an order to remove the content, given the significant legal risk services would face in failing to act, and are therefore ripe for abuse.

Orders Against Illegal Content and for Information

While the improvements to the notice mechanism are to be welcomed, the same cannot be said for the changes made during the negotiations to the provisions concerning orders against illegal content (Article 8) and orders for information (Article 9). In short, under these provisions, providers of hosting services must, upon receipt of an order from the relevant authority, act against one or more specific items of illegal content or provide specific information about one or more specific recipients of the service.

Where these provisions get it right is in requiring such orders to meet harmonised minimum standards, including a reference to the legal basis for issuing them. Orders must similarly be accompanied by a detailed statement of reasons and a clear identification of the issuing authority. This theme of transparency extends to a requirement that information on orders be shared among the Digital Services Coordinators, and intermediary services are required to publish annual reports that include the number of orders they receive and which Member State issued them.

However, the provisions concerning orders against illegal content and orders for information unfortunately raise significant concerns for the rule of law. The competent authorities who can issue orders are not limited to judicial bodies but include administrative and law enforcement authorities, a significant empowerment of these entities. In other words, Article 8 enables administrative authorities, including law enforcement, to send orders with which intermediaries must comply. Combined with the ability of law enforcement to submit notifications of alleged illegal content as Trusted Flaggers, these aspects of the DSA lack the necessary procedural and fundamental rights safeguards.

As lawmakers develop the implementation strategies for these extensive provisions, deep consideration will need to be given to how they will resolve these contradictions and preserve the DSA’s human rights-centred aims.

Transparency & Mechanisms for Redress

The provisions that form the basis of how the DSA tackles illegal content are supported by additional transparency and redress mechanisms, including requirements for providers of online platforms to maintain an internal complaint-handling system (Article 17) and to engage with newly established Out-Of-Court Dispute Settlement bodies (Article 18).

The internal complaint-handling system is available to users, including those who have submitted notices, to lodge complaints against restrictive decisions taken by an online platform as a result of a notice. Such decisions include restricting the visibility of content, suspending or terminating provision of the service, or demonetising content. The Out-Of-Court Dispute Settlement bodies are an additional avenue users can take to resolve disputes that could not be settled through the internal complaint-handling system. These bodies must also be independent of industry influence. Importantly, they will not be empowered to impose binding decisions, thereby removing any concern that they may act as an extra-judicial body; individuals will retain their right to initiate proceedings before a court if they wish to contest decisions made by providers. The DSA also foresees measures against misuse (Article 20), which allow online platforms to suspend individuals who misuse the notice and action mechanism and/or complaint-handling systems, as long as the decision to suspend is assessed on a case-by-case basis.

Importantly, all of the aforementioned provisions are subject to extensive reporting requirements on how well they are operating, for platforms, individual entities such as Trusted Flaggers and Out-of-Court Dispute Settlement bodies, and regulators alike. Intermediary services will need to report on the number of orders they have received from Member State authorities; Trusted Flaggers and Out-of-Court Dispute Settlement bodies will be subject to their own transparency reporting requirements; and European and national regulatory bodies will need to conduct regular assessments.

In sum, the DSA is certainly detailed in its aim to create a harmonised framework for addressing illegal content online, and rights advocates can celebrate the places where clear consideration of fundamental rights has been built into these provisions. However, international human rights law is clear that courts alone should be the arbiters of what constitutes illegal speech, and that the highest levels of protection must be afforded to fundamental rights when tackling illegal content online. Consequently, despite efforts to include human rights safeguards in the DSA, gaps in fundamental rights protections persist and rule of law concerns remain pertinent.

The success of this framework will therefore ultimately depend on a high level of coordination, and on a non-politicised, multistakeholder implementation and oversight strategy that facilitates the development of the additional rights safeguards needed to address outstanding concerns.

What Will Be the DSA’s Global Impact?

Having dissected the DSA’s illegal content regime in more detail, it is evident that the regulation will be influential. Indeed, the DSA is set to be a legislative driving force, with the Brussels Effect in full stride. However, this level of influence is also precisely why additional safeguards must be put in place to address the gaps we have identified. Even before the DSA comes into force, the prospect that certain of its provisions could be operationalised in other jurisdictions, particularly by authoritarian regimes, is a cause for concern.

As highlighted above, the ability of administrative and law enforcement authorities to issue orders to remove content, even where the legal basis for such an order is provided, is problematic, because the order is neither issued nor reviewed by a judicial authority. Similarly, the empowerment of law enforcement agencies not only as Trusted Flaggers but also as entities that can issue orders for information about individuals or groups of people could be turned against human rights groups and advocates.

Other aspects of the DSA could also be weaponised in authoritarian contexts. For example, providers of intermediary services are required to designate a legal representative for compliance and enforcement purposes (Article 11); this individual can be held liable for the provider’s non-compliance with the Regulation. At first glance, this seems a reasonable measure given the extensive requirements of the Regulation; however, the provision mirrors measures already adopted, and abused, by authoritarian regimes. The threat of prosecution or imprisonment of employees or representatives when platforms do not comply with government demands, even unjustified ones, as seen during the recent Russian federal elections and in cases in Brazil, poses a significant risk to human rights and freedom of access to information.

The requirement for providers of hosting services to notify law enforcement of any suspicion that a criminal offence involving a threat to the life or safety of a person or persons has taken place or is likely to take place (Article 15a) also raises concerns. This provision is meant to respond to the atrocities of extremist violence that have been shared on user-generated content platforms. However, the final text lacks legal certainty and adequate fundamental rights safeguards: what would be the threshold for a hosting provider “becoming aware of any information” about such activity that is “likely to take place”? This provision raises concerns about over-reporting of users to law enforcement within the EU. Moreover, governments in places where democracy is non-existent, weak, or under pressure may use these provisions, coupled with the liability of legal representatives, to place pressure on platforms, which in turn could lead to over-reporting of users. The right balance between countering radicalisation and preserving individual rights under international law and the rule of law has not been struck, and this is problematic both within the EU and globally.

It is therefore imperative that closing rule of law gaps, developing more extensive safeguards, and considering how the DSA may be influential beyond the EU’s borders are prioritised as delegated acts and implementation strategies are developed.

Conclusion

The DSA proves ambitious in its aims, provides some clarity, and brings a level of harmonisation by introducing minimum standards across key provisions. In combination with the extensive transparency and due diligence obligations, which will be explored in depth in the next blog of our series, the two-pronged approach set by the DSA could prove groundbreaking at a global scale. Its success will lie in bringing these provisions to life, and this requires addressing the remaining contradictions and rights concerns evident in the text.

Even with the text finalised, the opportunity to bolster the DSA remains. Delegated acts and implementation plans must be developed in consultation with experts and civil society; assessments of the effectiveness of the Regulation will need to be conducted periodically; and opportunities remain to make improvements once gaps have been evidenced. The EU co-legislators voiced their commitment to placing fundamental rights at the heart of the Regulation, and this responsibility does not dissipate now that the text has been agreed. Indeed, upholding this duty is more important now than ever.