European Parliament IMCO Committee Adopts DSA Report: Significant Steps Forward, Leaps Still to Be Made

Almost a year to the day since the European Commission published its long-awaited draft regulation, the Digital Services Act, the European Parliament’s lead IMCO committee voted on its position on the proposal, after an intense period of amendments and negotiation. This amended text will now be submitted in plenary to the full house of the European Parliament, where additional amendments can still be tabled; once agreed and voted upon, however, this position will form the basis for the Parliament’s mandate in the upcoming negotiations. Just a couple of weeks prior, the Council of the European Union similarly adopted its general approach, thus setting the stage for the inter-institutional negotiations, commonly known as the trilogues.

CDT Europe, alongside many civil society organisations, has been following the process and working with the EU institutions to ensure the development of the regulation is steered in the right direction: one that places the upholding of fundamental rights at its core and is truly impactful in creating safer, more vibrant online spaces.

Important Improvements Achieved, Many Still to Come

In light of our analysis of the European Commission’s draft regulation, CDT welcomes the commitment and perseverance of many within the European Parliament in reinforcing fundamental rights protections within the DSA. Central tenets of rights-protecting online content regulation have been maintained, such as the continued prohibition of general monitoring obligations and clarifications of definitions, including those for mere conduit and caching services. It is similarly reassuring to see the removal of previously tabled amendments that would have mandated automated content moderation and short timeframes for takedown obligations, which would have resulted in the over-removal of content and created a chilling effect on free expression.

Another notable aspect of the Parliament’s position is the clear objective to protect the right to anonymity and the use of end-to-end encrypted services. This is a significant addition to the text, and provides much-needed reassurance that these tools, which are vital in particular to human rights defenders and journalists operating in an increasingly hostile and shrinking civic space, will be preserved. Transparency obligations have also been extensively revised, which is a productive step forward, especially as meaningful transparency is essential to accountability. Whether it is to shed light on government takedown orders, or to understand how online platforms themselves influence our information ecosystem, we need to know what is happening online before we can move to remedy any potential rights violations.

These welcome additions are, however, accompanied by an underlying theme: the Parliament, and indeed all the EU institutions, should venture much further in their ambitions for the DSA, and must do more to strike the right balance between tackling illegal content and improving due diligence obligations, without blurring the boundaries between the two. This theme is evident in several areas of the parliamentary position.

For example, significant additions have been made to the provisions on risk assessments and mitigation of risks. The additional layers of transparency obligations, such as the need for consultation in the development of these assessments, may indeed facilitate increased public scrutiny. They would certainly provide scope for civil society, rights-holders and experts to provide necessary input, all of which will help improve the understanding of the impacts of content moderation. Indeed, much of this steers in the direction of establishing human rights impact assessments, but only brings us to the cusp.

CDT strongly encourages all institutions to take this definitive step as the negotiations move forward: the clear incorporation of a human rights lens into these assessments is a vital and necessary fix. In alignment with the UN Guiding Principles on Business and Human Rights, online platforms must focus their efforts on assessing the human rights impacts of their products and services and embed respect for human rights across the value chain. The Commission/Board should supplement this by focusing on providing best practice recommendations, in order to expand capacity and improve performance of risk assessment and mitigation across the industry.

The parliamentary position also provides for stronger transparency obligations over algorithms and recommender systems, a long-standing call from civil society, and improvements here are a step in the right direction. Increased user choice and control, in clear and unambiguous terms, can be useful in reducing the amplification of campaigns that target vulnerable groups and the civil society organisations that work to protect them. Once again, EU legislators can afford to go further, more specifically by ensuring that recommender systems are not based on profiling by default, in line with the recommendation of the European Data Protection Supervisor.

A similar line of increased transparency for users has been adopted in the provisions on online advertising. The related article outlines obligations to provide more precise information, such as labelling, and requires platforms to give users the ability to make a more informed choice when consenting to the processing of their personal data. Once again, however, the opportunity to challenge the very idea of surveillance-based advertising and begin phasing out advertising based on pervasive tracking has been missed. The protections offered by the GDPR, which have been incorporated into the amended text, simply will not hold up unless enforcement of the EU’s flagship data protection and privacy regulation is also made more robust.

Relatedly, the substantial amendments proposed to Article 31 (Data Access & Scrutiny) are a strong example of civil society and expert calls being clearly incorporated into the draft regulation. More in-depth research into areas such as recommender systems and algorithmic amplification is essential to better inform policy. Data access and scrutiny by experts is an important part of this, and CDT welcomes the amendments proposed here by the European Parliament to include civil society experts within the scope of the provision. It will be important, however, for all the EU negotiators to maintain this commitment and to guarantee that full respect for existing data protection and privacy regulation is the very foundation of this provision, and that the rule of law is upheld.

Priority Amendments for the Plenary 

Alongside these steps forward, there are still areas where leaps and bounds are required. It is gravely unfortunate that some areas of the draft regulation that still present significant fundamental rights concerns have not been meaningfully amended.

Notably, within the amendments to the Articles on orders for removals and for information, it is truly concerning to see non-judicial authorities empowered to request sensitive user information without strong and clear safeguards. The provision may also set an unnecessary and easily abused precedent by listing gag orders among the minimum criteria for an acceptable notice. Gag orders should not become commonplace, and their inclusion raises pertinent questions about the applicability of the well-intended additional Article 9(a) on effective remedies.

Article 11 on Legal Representatives, though well intended to ensure accountability within the Union, similarly and unfortunately mirrors proposals made by authoritarian governments. A pertinent example is the recent elections in Russia, in which the “Smart Voting” app, supported by opposition leader Alexei Navalny, was removed from app stores after Russian authorities threatened to prosecute local employees. The EU must ensure that providers of intermediary services establish a point of contact and legal representative in the Union without creating a risk of personal liability for that representative, and bring this Article into alignment with the requirements of GDPR Article 27.

The notice-and-action system also still requires changes to ensure it doesn’t inadvertently undermine the knowledge-based intermediary liability principle and is legally sound. It is crucial that notifications regarding illegal content be submitted through dedicated channels by judicial authorities; procedures for notifying providers of illegal content should not be merged with procedures that allow other entities to report content that potentially violates Terms of Service. Furthermore, allowing anyone to give notice of presumed illegality that defeats the intermediary’s safe harbor opens up the possibility for abuse of this system to gang up on or target dissenting voices online. We can imagine how consequential this would be in the context of an election, as well as for human rights defenders or advocates operating in challenging civic space environments.

These threats to the rule of law are unfortunately exacerbated by a distinct lack of clarity regarding the independence and impartiality of the proposed governance structures. Though the parliamentary position has sought to bring some increased public scrutiny to key aspects, such as the development of the proposed Codes of Conduct and transparency over the assignment of Trusted Flaggers, these measures simply do not go far enough. This is especially concerning given the sensitivity of the matters at hand and in light of the rule of law being an essential EU value; correcting these elements must be prioritised within the inter-institutional negotiations.

Looking Forward

It is clear that the DSA is ambitious; developing a broad framework that takes a horizontal approach over such a complex area as platform governance is no easy feat. As evidenced by the complexity of the amendments process throughout the year, the sheer number of amendments tabled, and the diversity of perspectives expressed, a comprehensive and nuanced approach is the most appropriate. As we close 2021 with more clarity on the positions each EU institution will adopt as they enter the negotiations, a critical message from CDT to legislators is to commit to this holistic, future-looking, nuanced approach. 

CDT stresses that the EU institutions must reflect comprehensively on the implications of the entire draft regulation for practical implementation and enforcement, particularly because the current proposal presents a confused and unworkable enforcement framework. Deep consideration will need to be given to the significant resources, and the entirely new structural mechanisms, that will be required to effectively enforce the regulation as outlined. In line with recent political commentary, effective implementation of the DSA will also rely on more robust enforcement of the GDPR. The miscalculations the EU made in developing and resourcing the GDPR’s enforcement mechanisms, which have arguably led to significant systemic failures, must not be repeated; now is the opportunity to ensure this does not happen.

As the next stage of the ordinary legislative process begins in earnest, it is essential that civil society is consistently given the opportunity to provide input into the discussions. This has been a challenge in some ways, given the limitations the pandemic has placed on us all, but the EU has a commitment to uphold its values and now has the opportunity to set a global precedent for platform governance. Throughout this process, there is no question about the added value civil society brings through our insight, expertise and dedication to ensuring that the final regulation upholds all fundamental rights and freedoms.

Steps in the right direction have certainly been taken but much more can and must be done in the coming months. All EU institutions must now work diligently to realise our collective ambition to formulate a regulation for a digital ecosystem that is safe, democratic and rights-respecting for all.