
Digital Services Act Approved by Key EU Committee: Still a Long Road Ahead

(BRUSSELS) — After weeks of intense negotiation, the European Parliament’s Committee on the Internal Market and Consumer Protection (IMCO) voted in favour of the consolidated text of the Digital Services Act (DSA), paving the way for the full European Parliament to adopt the legislation’s final text in July.

Asha Allen, Centre for Democracy & Technology (CDT) Advocacy Director for Europe, Online Expression & Civic Space, said:

“We welcome this historic step towards a platform governance model that could protect human rights and democracy in Europe and beyond.

The negotiators made clear headway by bolstering obligations that increase transparency and public scrutiny of content moderation practices, give users greater control and protection against systemic risks to their rights, and centre the preservation of fundamental rights. If implemented well, these provisions could make the DSA a true game-changer for human rights online.

Serious questions remain, however, about how the DSA will be effectively and practically implemented and enforced.

Effective implementation of the DSA will require cohesive effort from the European Commission and the Digital Services Coordinators.

The best way for the EU to ensure seamless collaboration, and to avoid the challenges that have hampered enforcement of the General Data Protection Regulation (GDPR), is to establish a formal mechanism for civil society and relevant experts to consistently assess enforcement efforts and make recommendations for improving them.”

We are especially pleased to see that the text maintains central tenets of rights-protecting online content regulation that CDT Europe advocated for throughout the negotiations, including a prohibition against general monitoring obligations for online platforms, and the liability framework established in the original E-Commerce Directive.

The text also ensures that platforms are not made the arbiters of the legality of speech, and that they are not obligated to remove content within short timeframes. Most notably, it requires platforms to assess and mitigate systemic risks for users, opening the door for them to develop human rights impact assessments supported by independent audits and researcher access to data.

We remain concerned, however, that the final text allows for enforcement overreach and potential challenges to the rule of law. For example, the text allows nationally appointed Digital Services Coordinators (DSCs) to designate law enforcement agencies as “Trusted Flaggers”. Law enforcement agencies can also issue orders against illegal content and request information on recipients of a service, actions that should remain the sole purview of the judiciary. Similarly, although the controversial Crisis Response Mechanism has been improved by removing the European Commission’s unilateral power to declare an EU-wide state of emergency and impose restrictive measures that may limit freedom of access to information, the provision remains a concern because of the limited democratic oversight over its functioning.

The text empowers the European Commission to act as the primary regulator for very large online platforms and search engines. The Commission’s independence in this role is not fully assured, however, and to enforce the DSA thoroughly it will also need to acquire significant new resources. Further, the national Digital Services Coordinators will need to operate seamlessly across borders, a requirement for cross-border coordination that has already complicated enforcement of the GDPR.

We look forward to the final adoption of the legislation, and to aiding in its implementation.