
Civil Society Responds to DSA Risk Assessment Reports: An Initial Feedback Brief

The DSA Civil Society Coordination Group, in collaboration with the Recommender Systems Taskforce and People vs Big Tech, has released an initial analysis of the first Risk Assessment Reports (RA Reports) published by major platforms under Article 42 of the DSA. The analysis identifies both promising practices and critical gaps, and offers recommendations to improve future iterations of these reports and to ensure meaningful compliance with the DSA.

The Digital Services Act (DSA) represents a landmark effort to create a safer and more transparent online environment. Central to this framework are the annual risk assessments required under Articles 34 and 35, which oblige Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) to identify, assess, and mitigate the systemic risks posed by their services.

Identifying Useful Practices

The first round of RA Reports showcased varying approaches to risk identification and mitigation, as well as different formats for presenting information. While reports will inevitably differ somewhat across platforms and services, identifying the practices from each platform that were most conducive to meaningful transparency helps set a baseline for future iterations. To illustrate this, we zoom in on key topics such as the online protection of minors, media pluralism, and online gender-based violence, and explore features of the different reporting formats that we found compelling.

The Crucial Role of Platform Design

A recurring theme in the analysis of the RA Reports is the underrepresentation of design-related risks. While platforms occasionally acknowledged the role of their systems — such as recommender algorithms — in amplifying harmful content, these references were often indirect or insufficiently explored. Design choices, particularly those driven by engagement metrics, can significantly contribute to systemic risks, including mental health issues, political polarisation, and the spread of harmful content. Despite this, many reports focused primarily on content moderation rather than addressing how platform design itself might be a root cause of harm. Future RA Reports must prioritise assessing design-related risks, ensuring that mitigation measures target not only user-generated risks but also the systemic risks embedded in platform architecture. By doing so, platforms can better align with the DSA’s objectives and create safer digital environments for all users.

Transparency Builds Trust

Platforms can only build trust with users and regulators through transparency. Many RA Reports lacked verifiable data to substantiate claims about the effectiveness of mitigation measures. For instance, a number of reports referenced existing policies and data without providing new, DSA-specific assessments. Platforms must disclose quantitative and qualitative data, such as metrics on exposure to harmful content and on user engagement with control tools, to demonstrate compliance and build trust. The brief includes a detailed table setting out the minimum level of disclosure necessary to assess the effectiveness of mitigation measures, which we believe could be made public without putting trade secrets at risk.

The Need for Meaningful Stakeholder Engagement

Finally, meaningful consultation with civil society, researchers, and impacted communities is essential to identifying and mitigating systemic risks. Yet none of the RA Reports analysed details how external expertise was incorporated into the assessments. Platforms must engage stakeholders systematically and reflect their insights in risk assessments and mitigation strategies. This approach not only ensures compliance with DSA Recital 90, but also strengthens the credibility of the reports.

Recommendations

The first round of RA Reports under the DSA marks an important step toward greater accountability. However, significant gaps remain. To advance user safety and foster trust, platforms must:

  1. Focus on design-related risks, particularly those tied to recommender systems.
  2. Enhance transparency by providing verifiable data on mitigation measures.
  3. Engage meaningfully with stakeholders to ensure risk assessments reflect real-world harms.

By addressing these gaps, VLOPs and VLOSEs can align with the DSA’s objectives, contribute to a safer digital environment, and rebuild trust with users and regulators. Civil society remains committed to supporting this process through ongoing analysis and collaboration. Together, we can ensure that the DSA’s promise of a safer online space becomes a reality.

Read the full report.

The DSA CSO Coordination Group, convened and coordinated by CDT Europe, is an informal coalition of civil society organisations, academics, and public interest technologists that advocates for the protection of human rights in the implementation and enforcement of the EU Digital Services Act.