In the early hours of April 23, after 16 hours of intense negotiations, the European institutions reached a political agreement on the EU’s flagship platform and content governance regulation, the Digital Services Act (DSA). Key aspects of the new rules for providers of digital services in Europe have emerged, and CDT welcomes this significant step forward in increasing platform accountability and in protecting users’ fundamental rights online.
But the devil will be in the details. Though the provisional agreement was reached in record time, the text will now be finalised over the next few weeks at the technical level, meaning several open questions remain as to how the political agreement will be translated into the definitive version of the legislation. With the regulation set to come fully into force in early 2024, attention will now turn to the practical implications of certain provisions and to how the broad horizontal framework will be enforced across the bloc.
Key Aspects of the Proposal
Even from the initial draft proposal by the European Commission, the extensive transparency, increased accountability and risk mitigation obligations for providers of online services clearly set the DSA apart. Throughout the negotiations, these provisions were heavily debated, and initial indications are that the final political agreement does include some strengthened provisions that will prove important for the protection of free expression.
An important aspect of the text is the set of extensive due diligence obligations, which include the assessment and mitigation of systemic risks online. These provisions were substantially updated to include four categories of systemic risks that Very Large Online Platforms (VLOPs) must consider and take measures against, with a last-minute addition to the list of “gender based cyber-violence” as a compromise on the European Parliament’s proposed article to combat the non-consensual sharing of images on platforms hosting user-generated pornographic content. Importantly, these assessment and mitigation measures will be subject to scrutiny by independent assessors, and vetted researchers, including civil society organisations, will be given access to relevant data.
A top priority for the European Parliament’s lead Rapporteur, Christel Schaldemose, was the set of provisions concerning targeted advertising, which were extensively scrutinised throughout the negotiations. It is difficult to discern precisely what was agreed during the final negotiations, but indications are that the DSA includes a ban on targeted advertising to minors and a prohibition on presenting advertising to recipients of a service based on profiling within the meaning of the existing GDPR. Importantly, the agreement also requires that the obligation to cease targeted advertising to minors must not lead to further personal data collection (for example, of users’ age or date-of-birth information). Similarly, platforms will have to provide users with clear information about their recommender systems, along with the option to use a recommender system that is not based on profiling, as recommended by the European Data Protection Supervisor.
Though these additions are significant, it is gravely unfortunate that essential protections introduced in the European Parliament’s mandate were dropped during negotiations: namely, the privacy safeguards in Articles 7 and 7a, which included the right to use services anonymously, protection for the use of end-to-end encrypted services, and a prohibition on legally mandated upload filters. Similarly, the provision on “Dark Patterns” appears to have been substantially weakened, and the option for users to have a “do not track” function within their browsers was also removed.
The European Council, for its part, had its own priorities, one of which was the addition of online search engines to the scope of the DSA as a new category of “intermediary”. Late in the negotiations, the European Commission proposed a compromise to establish this new categorisation and to further subject these services to the Article 14 Notice and Action regime, which would have created an obligation to delist search results, or possibly entire webpages, flagged as containing illegal content. CDT and civil society partners raised significant concerns about this proposal and subsequent amendments, as such an obligation would not be technically feasible without imposing a general monitoring obligation, and would pose a considerable threat to free expression. The provisional agreement seems to have reached a compromise: online search engines are included in the list of services covered by the regulation but, most importantly, are not subject to strict removal obligations. Instead, the compromise provides scope for “a case-by-case assessment of the responsibilities of Google and the likes for illegal content, which is left to be clarified by a legal review”.
Developments Throughout the Negotiations
As alluded to, several aspects of the aforementioned provisions (and many others for which details are yet to emerge) were proposed in the midst of the speedy trilogue process, which raised several concerns for civil society and democratic observers. These amendments, and the wide variation between the compromise proposals put forward by each of the co-legislators, highlight the fact that the broad scope of the DSA, and the issues it attempts to address, are incredibly complex and nuanced.
One such example is the all-important Notice and Action Mechanism in Article 14, which will only be clarified at the technical level, despite raising fundamental questions about the conditional liability regime established by the DSA. CDT welcomes that several previously identified issues with the article were addressed: there is now more clarity about when the knowledge-based liability provision would kick in, and obligations that would have led to platforms determining the legality of content on the basis of notices provided by any individual were removed. However, a Council amendment to a recital, added late in the process, includes a commitment to process a majority of valid notifications for the removal of illegal hate speech in less than 24 hours, implying that providers will be expected to process Article 14 notices very quickly and potentially without adequate evaluation of the human rights implications of the content or its removal. Despite calls from civil society to remove this language, negotiators appear to have agreed to retain the wording of the recital on the condition that the reference is non-binding, which unfortunately does not fully address concerns about the legal uncertainty this reference now creates.
Another late introduction into the negotiation process was the establishment of a “Crisis Response Mechanism”, which raised considerable concern for civil society due to its implications for free expression, free access to information, and the Rule of Law. Initial reports from the negotiations indicate that the mechanism has been amended to ensure the European Commission can only enact such a measure on the recommendation of the Committee of National Coordinators of Digital Services. The Commission would similarly be required to report to the European Parliament and Council on the measures taken, and the mechanism would automatically expire after three months, though it could be renewed for a further three-month period. CDT welcomes these changes and hopes these amendments concretely address civil society’s concerns.
What Comes Next?
Though the institutions have reached a provisional political agreement, many elements remain uncertain as the text will be further elaborated in technical meetings over the coming weeks. The opportunity for the DSA to be more ambitious may have been missed with this political agreement; however, a window remains to correct course during these technical meetings and later during the required review by the institutions’ legal services. Particularly as we move towards implementation and enforcement, it will be vital to address any shortcomings to ensure the DSA upholds foundational EU values, works in practice, and does not hinder existing and new EU legislation.
As a matter of priority, negotiators and EU legal services teams will have to address the outstanding Rule of Law concerns across the text. The EU still has the opportunity to reconsider holding the designated legal representative personally liable for non-compliance under Article 11, and to remove law enforcement entities from the list of proposed Trusted Flaggers under Article 19. These revisions are essential given the global precedent the DSA is likely to set: provisions which would grant government authorities enhanced censorship powers, or which resemble those favoured by authoritarian governments, must be corrected, especially in light of the war in Ukraine. This also applies to the articles on orders from judicial or administrative authorities to act against illegal content and to provide information. Though it is not clear how these articles fared in the final negotiations, it is vital that the clarity on scope, effective remedy, and procedural safeguards against potential fundamental rights violations proposed by the European Parliament is evident in the final regulation.
Practically speaking, the political agreement now also raises questions about the implications of the DSA for other EU legislative frameworks. For example, the provisional agreement on targeted advertising brings into question the applicability of the draft Regulation on the Transparency and Targeting of Political Advertising, whilst the inclusion of cyber violence as a systemic risk raises the question of how this will work in tandem with the proposed criminalisation of specific forms of online gender-based violence in the EU’s draft Directive on Combating Violence against Women and Domestic Violence.
The duty to fine-tune the DSA is now in the hands of the technical negotiators. Though this leaves room for remaining issues to be addressed, the scope for radical change has narrowed now that political agreement has been reached, limiting how substantial any revisions can be at this stage. Focus therefore turns to the implementation and enforcement of the flagship Regulation. Will the proposed, more centralised enforcement regime work? Has the EU missed a crucial opportunity to truly set a gold standard for online content governance?
Only time will tell, but it is essential that EU institutions now work closely with civil society to rectify any outstanding issues and to ensure the effective implementation of hard-won provisions aimed at protecting fundamental rights online. Especially given that civil society and observers were unable to access the opaque and fast-moving inter-institutional negotiations, whatever next steps come must, as a matter of basic democratic accountability, be participatory and focused on realising our collective ambition for a truly transformative regulation that establishes a safe and rights-respecting digital ecosystem.