
Overview of Transparency Obligations for Digital Services in the DSA


The draft Digital Services Act (DSA) Regulation presents multiple layers of transparency and accountability obligations that differ depending on the type of service concerned. A few obligations are imposed on all intermediary services, i.e., mere conduit, caching, and hosting services.¹ Additional requirements then apply specifically to hosting services. Online platforms and very large online platforms (VLOPs), as subcategories of hosting services, are subject to the largest number of obligations.

Table 1: Transparency Obligations Grid, Divided by Type of Service

| Obligation | Intermediary services | Hosting services | Online platforms | VLOPs |
| --- | --- | --- | --- | --- |
| Terms and conditions (Art. 12) | X | X | X | X |
| Transparency reporting obligations (Art. 13) | X | X | X | X |
| Statement of reasons (Art. 15) | | X | X | X |
| Trusted flaggers (Art. 19) | | | X | X |
| Measures and protection against misuse (Art. 20) | | | X | X |
| Traceability of traders (Art. 22) | | | X | X |
| Transparency reporting obligations for providers of online platforms (Art. 23) | | | X | X |
| Online advertising transparency (Art. 24) | | | X | X |
| Risk assessment (Art. 26) | | | | X |
| Mitigation of risks (Art. 27) | | | | X |
| Independent audit (Art. 28) | | | | X |
| Recommender systems (Art. 29) | | | | X |
| Additional online advertising transparency (Art. 30) | | | | X |
| Data access and scrutiny (Art. 31) | | | | X |
| Transparency reporting obligations for VLOPs (Art. 33) | | | | X |
| Right to be heard and access to the file (Art. 63) | | | | X |

The above provisions and obligations can be bundled into the following categories. First, the draft legislation brings forward a number of rules governing the terms of use of digital services. These include requirements for companies to explain and publish their terms of service contracts, as well as additional conditions governing the relationship between companies and their users (in both B2C and B2B environments). Second, the draft proposes multiple layers of obligations on transparency reporting, covering companies’ content moderation activities, information about misuse of their services, the number of monthly active users, and more. As with the previous category, these provisions are strongly differentiated by the type and size of the service.

Specifically for online platforms, there are also additional requirements on transparency about advertising. Finally, VLOPs in particular would also be obliged to perform an assessment of the risks that arise on their services, and to undergo independent audits evaluating their compliance with the Regulation. There are also provisions intended to allow authorities to access and scrutinize VLOPs’ data.

Table 2: Categories of Transparency Obligations

| Category | Intermediary services | Hosting services | Online platforms | VLOPs |
| --- | --- | --- | --- | --- |
| Terms of use | Art. 12 | Art. 12 + 15 | Art. 12 + 15 + 20 + 22 | Art. 12 + 15 + 20 + 22 + 29 |
| Transparency reporting obligations | Art. 13 | Art. 13 | Art. 13 + 19 + 23 | Art. 13 + 19 + 23 + 33 |
| Advertising transparency | – | – | Art. 24 | Art. 24 + 30 |
| Risk assessment and auditing | – | – | – | Art. 26 + 28 |
| Access to data | – | – | – | Art. 31 + 63 |

1.1 – Terms of Use

The draft regulation places an obligation on all intermediary services to explain and publish their Terms of Service (ToS) (Art. 12). These shall include information on any policies used for the purpose of content moderation, including algorithmic decision-making and human review procedures. The draft further states that, when enforcing their ToS, intermediaries shall take into consideration the rights and legitimate interests of all parties involved, including fundamental rights as enshrined in the EU Charter of Fundamental Rights (dignity, freedoms, equality, solidarity, citizens’ rights, justice).

As a subcategory of intermediary services, all hosting services (including online platforms and VLOPs) will also need to provide a statement of reasons (SoR) (Art. 15) to users whose content they decide to remove or disable. Such a statement shall contain the relevant facts and circumstances of the decision, including whether the content was restricted for violating the law or the company’s ToS, and information on the redress possibilities available to the user. The SoRs shall be published in a database managed by the Commission.

In addition, all online platforms will need to include in their ToS their policies regarding misuse of their services (Art. 20). These cover actions against individuals or entities that frequently provide manifestly illegal content, or that submit notices and complaints that are manifestly unfounded.

There are also additional obligations for online platforms that allow users to conclude contracts with traders (Art. 22), e.g., online marketplaces. Before initiating their exchange with consumers, traders will need to provide platforms with identifying data, including their name, bank account number, trade register number, etc. Platforms are responsible for collecting the data, assessing its reliability, and asking the trader to correct it where necessary. If the trader fails to correct the data, the platform shall suspend its service to the trader. The trader’s data (excluding the bank account number and ID copy) shall be made available to users.

Where applicable, very large online platforms (VLOPs) shall additionally expand their ToS with information about the main parameters of the recommender systems (Art. 29) they use, as well as any options for users to modify those parameters, including at least one option that is not based on profiling.²

1.2 – Transparency Reporting Obligations

Under the draft legislation, all intermediary services (excluding micro³ and small⁴ enterprises) are obliged to publish, at least once a year, transparency reports (Art. 13) on their content moderation activities. These reports shall cover orders received from Member States’ authorities, notices submitted via the notice and action mechanism, content moderation undertaken at the provider’s own initiative, and complaints received through the internal complaint-handling system.

Further, online platforms in particular are subject to additional transparency reporting obligations (Art. 23). Together with the data required under Art. 13, reports of online platforms shall include information about disputes submitted to out-of-court dispute settlement bodies, suspensions of accounts imposed as measures against misuse under Art. 20, and the use of automated means in content moderation. Once every six months, online platforms will also have to publish the average number of monthly active users in each Member State. The Digital Services Coordinator of establishment can request an updated number at any time.

Online platforms will also have a reporting obligation with regard to trusted flaggers (TFs) (Art. 19) – entities whose notices submitted through the notice and action mechanism should be treated with priority. Where an online platform has information indicating that a TF has submitted a significant number of insufficiently substantiated notices, it shall communicate that information to the Digital Services Coordinator (DSC) that awarded the TF its status.

Finally, extra obligations apply to the transparency reporting of VLOPs (Art. 33). Unlike regular-sized online platforms and other types of intermediaries, VLOPs shall publish the data required under Art. 13 and 23 every six months (instead of yearly). In addition, VLOPs shall provide reports about their risk assessments (Art. 26) and related risk-mitigating measures (Art. 27), as well as their audit reports and audit implementation reports (Art. 28). Where a VLOP considers that publication of this information may result in the disclosure of confidential information, it may remove such information from the public reports and transmit the complete data only to the European Commission and the DSC of establishment.

1.3 – Advertising Transparency

Online platforms that display advertising (Art. 24) shall make clear to users that they are being shown an ad, identify the natural or legal person on whose behalf the ad is displayed, and provide meaningful information about the parameters used to determine why the ad is shown to them. The draft legislation does not further specify what these parameters might be.

VLOPs that display ads are also subject to additional online advertising transparency requirements (Art. 30), under which they shall create a publicly available repository containing the content of each ad, the natural or legal person on whose behalf it was displayed, the main parameters used for targeting, and the number of users reached.

1.4 – Risk Assessment and Auditing

Finally, a substantial set of additional obligations is established for VLOPs. As part of risk assessment (RA) (Art. 26), VLOPs shall analyse, at least once a year, any significant systemic risks stemming from the functioning of their services. The RA shall cover risks linked to the dissemination of illegal content, negative effects on fundamental rights, and intentional manipulation of the service. When conducting an RA, VLOPs shall take into account, in particular, how their content moderation systems, recommender systems, and advertising systems influence any of these three systemic risks. The RA should be designed and conducted with the involvement of users, representatives of groups potentially impacted by the service, independent experts, and civil society organisations (rec. 59).

Once a year, VLOPs are also subject to an independent audit (Art. 28). The audit shall assess compliance with the obligations in Chapter III of the Regulation (Art. 10–37), which covers a range of provisions such as the notice and action systems and redress mechanisms, transparency reporting obligations, and risk-mitigating measures including the use of codes of conduct. The audit may have one of three outcomes: positive, positive with comments, or negative. Where the outcome is not positive, the VLOP shall adopt, within one month, an audit implementation report setting out the measures it will take to achieve compliance with the Regulation.

1.5 – Access to Data

As part of the obligations on data access and scrutiny (Art. 31), VLOPs shall provide, upon the request of the European Commission or the DSC of establishment, data that are necessary to monitor and assess compliance with the Regulation. The Commission or the DSC of establishment may also request that access to data be given to vetted researchers, for the purpose of conducting research that contributes to the identification of systemic risks (Art. 26). Within 15 days of receiving such a request, the VLOP concerned may ask for it to be amended where the VLOP does not have access to the data or considers that disclosure would harm its security.

Lastly, VLOPs are granted the right to be heard and access to the file (Art. 63) before the Commission adopts a decision on non-compliance (Art. 58), fines (Art. 59), or periodic penalty payments (Art. 60). VLOPs shall be heard on the Commission’s preliminary findings and the measures it intends to take.

****

¹ These three types of intermediary services were already defined in the e-Commerce Directive in 2000 and remain applicable. The DSA further expands on the definition of hosting services, and creates new subcategories: online platforms and VLOPs.

² Under the GDPR, ‘profiling’ means any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements.

³ Microenterprise: fewer than 10 employees; annual turnover/balance sheet total ≤ EUR 2 million.

⁴ Small enterprise: fewer than 50 employees; annual turnover/balance sheet total ≤ EUR 10 million. Both definitions are set out in the Annex to Recommendation 2003/361/EC.