

A Series on the EU Digital Services Act: Due Diligence in Content Moderation

A blog series by the CDT Europe team on the EU Digital Services Act. A green/blue background, with a slightly visible circle of stars from the EU flag. Text in yellow and white.

With the CDT Europe “Digital Services Act Series”, we take a deep dive into the recently adopted EU Digital Services Act, the bloc’s flagship Regulation on online platform governance. We break down key obligations, reflect on the potential impact of the legislation, and offer recommendations on next steps in bringing the legislation to life. We hope that this series provides a more comprehensive understanding of the DSA and its implications, and that it will support future analysis as the European Commission takes steps to supplement and implement the legislation.

(Have a look at our first and third editions in the series, on Tackling Illegal Content Online and Ensuring Effective Enforcement.)

Introduction

As we continue our deep dive into the EU Digital Services Act (DSA), we take stock of the Regulation’s prominent due diligence chapter. In this section of the legislation, the EU has put in place an extensive array of obligations aimed at creating a safer, more transparent online environment; advocates fought diligently to strengthen this section throughout the negotiations on the text. If effectively implemented, these provisions have the potential to set important standards for tackling some of the most pervasive harms of the digital ecosystem.

With this second entry in our blog series, we take a closer look at these novel obligations, reflect upon how practical implementation will need to be approached, and consider how influential these requirements will be at the global level.

Protecting Fundamental Rights Online

The DSA’s focus on due diligence for tech companies has certainly proven timely; the latest annual report of the UN High Commissioner for Human Rights focuses on the practical application of the UN Guiding Principles on Business and Human Rights to the activities of technology companies, and the DSA references the Guiding Principles in its preamble. The DSA reiterates in numerous instances the requirement for providers of intermediary services to pay due regard to the protection of human rights, and the co-legislators have put in place varying obligations from this perspective. Adapted to the size and type of service, these provisions aim both to address the dissemination of illegal content and to tackle issues that pose a threat to the rights protected under the EU Charter of Fundamental Rights.

The chapter includes some provisions we touched upon in our previous post, such as the Notice and Action Mechanism (Article 14), the Internal Complaint Handling System (Article 17) and Trusted Flaggers (Article 19). It also includes a range of extensive reporting requirements and mandates transparency around recommender systems (Articles 23 & 24a), as well as increased personal data protection in online advertising for all online platforms (Article 24). Importantly, these due diligence obligations are independent of the question of intermediary liability and will therefore be assessed separately, typically by the Digital Services Coordinators and the European Commission, along with the Board.

The most expansive of these provisions are the additional obligations for very large online platforms (VLOPs, defined in Article 25) and very large online search engines (VLOSEs, defined in Article 33a). These additional requirements include, but are not limited to, mandatory risk assessments of the impacts of the intermediary services, and independent audits. Rights defenders advocated diligently throughout the negotiation process to ensure that these provisions in particular were strengthened, so they would be as impactful as possible for all users.

Taking a closer look at the risk assessment provision (Article 26): VLOPs/VLOSEs will need to identify, on an annual basis or at the time of deployment of a new relevant functionality, any systemic risks stemming from the design and provision of their services. Crucially, these assessments will need to identify risks related to all fundamental rights under the Charter, with particular consideration given to risks to free expression, civic discourse and electoral processes, and to issues such as gender-based violence, the protection of minors and public health. This focus on all rights, and not merely a list of current concerns, will help ensure that risk assessments remain agile and able to thoughtfully consider a changing threat landscape as well as future developments in technology, which may affect different rights in different ways. These assessments will need to be detailed and accompanied by the adoption of mitigation measures (Article 27), which in turn will need to be proportionate, effective and, most importantly, take into account impacts on fundamental rights.

Alongside this, vetted researchers, a category that can include civil society organisations, will be given the opportunity to access relevant data (Article 31) in order to conduct research that contributes to the detection, identification and understanding of systemic risks, and to assess the adequacy and impacts of the risk mitigation measures adopted. Lastly, VLOPs/VLOSEs will be subject to independent audits (Article 28), which will assess compliance with the Chapter III due diligence obligations and with commitments undertaken with regard to the voluntary codes of conduct (Articles 35 & 36) and crisis protocol measures (Article 37). The third-party organisations conducting the audits will need to prove their independence from industry and meet ethical and professional standards.

Overall, these extensive obligations could facilitate a high level of meaningful transparency for users and provide concrete avenues to better assess and mitigate online abuse. Many of these provisions are novel in the context of EU regulation, however, and the scale of what will be required makes the task ahead colossal.

Challenges of Putting Due Diligence into Practice

Due diligence provisions are certainly welcome, yet their scope and scale are vast; the DSA will essentially act as a data-gathering machine, and how that data will effectively be put to use remains unclear. A large number of stakeholders will need to cooperate seamlessly, in an environment of trust and transparency, for these obligations to have a meaningful impact for users.

Article 26 on Risk Assessments provides possibly the most tangible example of how challenging operationalising some of these obligations is going to be. The first aspect to consider is that VLOPs and VLOSEs will, at their own expense, undergo what are essentially human rights impact assessments (HRIAs). Developing meaningful HRIAs is an extensive process that should be founded upon core principles, such as customising the assessment to a company’s specific business model or risk profile, or ensuring the evaluation gives a comprehensive view of all human rights whilst focusing on specific pertinent issues. Though recent industry efforts indicate a growing willingness to undergo these processes, the practice of conducting HRIAs is far from uniform, in either form or quality. Article 26 will require a substantial increase in internal prioritisation and transparency within platforms, as well as the development of standards for what can be considered an adequate assessment. This will also require meaningful, non-tokenistic engagement with civil society organisations who can inform the development of assessments, which, to date, has been difficult.

Ensuring compliance and ultimately analysing these assessments will be the purview of the Board, an entirely new enforcement entity constituted of all the Digital Services Coordinators and chaired by the European Commission. Jointly, the Board and the Commission will hold responsibility for annually evaluating the most prominent systemic risks identified in the assessments, and will lead in outlining best practices for mitigation measures. This is a daunting task and will require dedicated professional teams of qualified human rights experts who can evaluate the adequacy of the assessments and understand the potential impacts of the identified risks from an intersectional perspective. How else will the Commission be able to ascertain whether the assessments have been adequate or the mitigation measures appropriate? As it stands, the European Commission will need a significant commitment of new resources and restructuring just to enforce these provisions effectively.

Beyond the issue of financial and human resourcing, the European Commission will also need to grapple with new responsibilities for which it currently has limited expertise. For example, the Commission is tasked with outlining the rules on procedural steps, auditing methodologies and reporting templates for the independent audits, a task that will be quite novel for the body in the context of online platform governance. In sum, without adequate enforcement capacity in place across the board, these potentially transformative provisions risk becoming perfunctory exercises with little to no meaning.

These practical challenges certainly raise questions, but other provisions in the due diligence chapter of the DSA also raise rule of law concerns. The European Commission has already confirmed that it expects the development of several new Codes of Conduct to support the DSA, focused on the protection of minors, disinformation and gender-based violence. These Codes of Conduct are voluntary frameworks intended to contribute to the proper application of the Regulation, and have been used by the European Commission across various policy areas in which regulation or enforcement is limited.

Existing Codes of Conduct, however, have been heavily criticised by advocates and UN experts, most prominently the Code of Conduct on Countering Illegal Hate Speech Online, as these codes essentially circumvent rule of law safeguards on censorship by allowing authorities to pressure companies to take down content without proper judicial review. Moreover, within the Enhanced Supervision System for addressing infringements of the obligations laid down in Section 4 of Chapter III, the DSA empowers the Commission to adopt measures (Article 59a) that include a commitment for VLOPs/VLOSEs to participate in a relevant Code of Conduct, thereby raising questions about their strictly ‘voluntary’ nature. (We will discuss this dynamic in greater detail in our forthcoming post on enforcement mechanisms within the DSA.) Co-legislators did bring more transparency into how these codes are concluded, providing more opportunities for consultation with civil society, but this does not fully alleviate the concerns about their use.

The Crisis Response Mechanism (Article 27a) is another pertinent example. Introduced at the eleventh hour, the mechanism empowers the European Commission, upon recommendation of the Board, to adopt a decision requiring VLOPs and VLOSEs to take certain measures in light of a crisis, such as a situation posing a threat to public health or security. These measures can include adapting content moderation processes, promoting trusted information and adapting the design of their online interfaces. Though the European Commission cannot unilaterally declare a crisis, the body will monitor the application of the measures taken by VLOPs/VLOSEs and can require providers to take additional measures. Most importantly, the three-month “sunset clause” outlined in the text can be extended by the Commission, unilaterally, at any time; if we take the war in Ukraine or the COVID-19 pandemic as examples of crises to which such a mechanism would apply, this time frame could effectively become indefinite.

It is once again clear that, in order to make these provisions a reality in ways that do not violate human rights, regulators will need to employ a multistakeholder approach that draws upon the expertise of the broader human rights community, and to address specific challenges in practical implementation. The European Commission will need to be innovative and far more collaborative than the institution may be accustomed to; trust between national regulators will need to be nurtured and consistent engagement with civil society will be paramount. 

What Will Be the DSA’s Global Impact on Human Rights Due Diligence?

The EU’s decision to embed concrete provisions on due diligence into the DSA is an important step forward. One of the most significant global impacts of the DSA could be the harmonisation of efforts to embed international human rights principles into content moderation: efforts founded upon existing international frameworks and able to operate coherently in a transnational context.

For example, in operationalising the provisions on risk assessments and mitigation measures, regulators could reinforce prevailing standards and the guidelines of entities already conducting such assessments. The DSA is an opportunity to bring together expert organisations already engaged in this area, to create a global blueprint for the development of meaningful risk assessments. Making use of this expertise will not only aid the European Commission in addressing some of the implementation challenges noted above, but also set a precedent for how such assessments could be conducted across jurisdictions.

Access to data for vetted researchers could also set a global standard, particularly given the data protection and privacy requirements established by the GDPR that such access must fulfil. Multiple discussions on the best methods for facilitating researcher access to data are currently taking place, and the conclusion of the DSA is a concrete opportunity for stakeholders to build upon the framework for access that has been proposed. This will be particularly relevant for the development of comparable research analysing industry’s compliance with its due diligence obligations under international standards across regions. This is why it is so imperative that these well-intended provisions are unambiguously and effectively put into place: a legal framework is only as effective as its implementation and enforcement.

The European Commission must take a robust approach to implementing the due diligence obligations, and must recognise the need to formalise its engagement with civil society, if the impacts of the legislation are to be tangible. Similarly, the blurring of the lines between voluntary mechanisms and legal compliance must be addressed, to ensure that similar provisions are not weaponised in other contexts to pressure industry into taking extra measures against content. Moreover, clarity on how these provisions will be operationalised separately from, yet coherently with, the obligations on illegal content and intermediary liability will help the DSA’s intended two-pronged approach to be more effectively achieved.

Conclusion 

Advocates can point to some wins in the uphill struggle to strengthen the due diligence chapter of the DSA. However, more consideration needs to be given to the additional safeguards and the adequate resourcing that will be required for effective enforcement. A broad range of civil society organisations is already taking up the mantle of advising regulators on how best to bring these provisions to life, and this expertise must continue to inform how the DSA can be consistently improved.

The DSA is not set to apply in full until 2024, and it would be a gravely missed opportunity if relevant stakeholders, in particular the European Commission and national regulators, were to let the DSA’s potential impact for human rights due diligence diminish at this stage. Now is the time to facilitate extensive expert consultations, to develop comprehensive and inclusive implementation strategies and, most importantly, to conceptualise these aims from a human-centric approach.