The DSA Introduces Important Transparency Obligations for Digital Services, but Key Questions Remain
CDT and other civil society organisations have long called for meaningful and robust transparency mechanisms to shed light on the content moderation practices of online platforms. Transparency is essential to accountability. Whether the goal is to reveal the requests that governments make to companies to remove user-generated content, or to understand how online platforms themselves shape our information ecosystem, we need to know what is happening online before we can remedy any potential rights violations. It is therefore welcome that the Digital Services Act proposal (DSA), published by the European Commission in December 2020, introduces a new framework of transparency obligations for digital services. If the framework is established as drafted, companies will be required to produce and disclose large amounts of information, potentially allowing oversight bodies, researchers, and the public to understand their internal processes. A key question remains, however, as to whether the proposed framework can be implemented in practice and, more pressingly, whether it actually addresses the key public policy problems that rights advocates seek to fix.
Transparency About Terms of Use
The DSA proposal contains a number of provisions that govern the terms of use of digital services. These include requirements for companies to explain and publish their terms of service contracts, provide statements of reasons for their content moderation decisions, and apply additional rules governing their relationships with users (in both business-to-consumer and business-to-business contexts). Overall, these provisions matter because they allow users to operate in a predictable environment and to know the possibilities and limits of their behaviour online.
However, there are a number of questions that policymakers will need to grapple with throughout the legislative process. In Article 12, which governs terms and conditions (ToS), it is not apparent what will count as ‘clear and unambiguous’ language for setting out ToS, or what level of transparency will be appropriate for the policies used for content moderation, including algorithmic decision-making. Since this article applies to all intermediaries, the answers to these questions may differ significantly depending on the type of company concerned, which opens space for legal uncertainty and potential litigation. Article 12(2) further requires intermediaries to apply their ToS with due regard to the rights of users as enshrined in the EU Charter of Fundamental Rights. What will be the consequences for online platforms that, for example, restrict the depiction of nudity, which in many contexts is protected under free speech rights?
Article 15 then requires companies to produce a statement of reasons for each decision to remove or disable access to content, and these statements are to be made publicly available in a database managed by the European Commission. This is a laudable effort to make content moderation more transparent. However, it is not clear how such a database should be organised or how to ensure its usefulness. The provision currently requires a statement of reasons for every decision to restrict content and does not exempt categories such as spam, phishing, or malware removals. Logging these statements in the Commission’s database would flood it with millions of entries of questionable informative value. There is also a risk that overly detailed explanation requirements could add considerable time to logging each moderation decision, meaning that smaller hosting services, given their resource constraints, may be unable to address as much abusive content.
At the same time, Article 15(4) stipulates that these statements shall not contain personal data. It is absolutely essential that individuals’ privacy be protected in any effort to build a publicly viewable database of content removal decisions, and this requirement is in line with the European Data Protection Supervisor’s (EDPS) Opinion on the DSA. However, it may also mean that much of the content at issue cannot be published, either because it has been removed pursuant to a legal order or because it includes personal information about the user or other individuals. It is not clear what the utility of such a database will be without the underlying content or information about the users who posted it.
It is also unclear how entries to this database should be counted. Daphne Keller, Director of the Program on Platform Regulation at Stanford’s Cyber Policy Center, rightly points out that a host of logistical and operational questions arise when building a transparency report. Take, for example, a photo-hosting site that removes a file for copyright infringement and, in doing so, also removes 14 duplicate copies of the file using automated tools. Does that count as one removal decision, or 15?
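To make these counting and categorisation questions concrete, below is a minimal sketch, in Python, of one hypothetical way a statement-of-reasons record could be structured so that a single decision covering many duplicates is logged once, and so that high-volume categories such as spam could be aggregated rather than logged entry by entry. All field names and categories are illustrative assumptions, not anything prescribed by the proposal.

```python
# A minimal sketch, not drawn from the DSA text: one way a statement-of-reasons
# record could be structured so that a single moderation decision covering many
# duplicate files is logged once, while still recording how many items it affected.
# All field names and categories here are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class StatementOfReasons:
    decision_id: str           # identifier for the moderation decision
    ground: str                # e.g. "illegal content" or "terms of service violation"
    category: str              # hypothetical harmonised category, e.g. "copyright", "spam"
    automated_detection: bool  # whether the content was surfaced by automated tools
    automated_decision: bool   # whether the removal decision itself was automated
    items_affected: int        # duplicates removed under the same decision

# The photo-hosting example: one decision, 15 files (the original plus 14 duplicates).
records = [
    StatementOfReasons("d-001", "illegal content", "copyright", True, True, 15),
    StatementOfReasons("d-002", "terms of service violation", "spam", True, True, 1),
]

# Categories that could arguably be aggregated rather than logged entry by entry,
# to avoid flooding a public database with low-value entries.
BULK_CATEGORIES = {"spam", "phishing", "malware"}

detailed = [r for r in records if r.category not in BULK_CATEGORIES]
print(f"{len(detailed)} detailed entries covering "
      f"{sum(r.items_affected for r in detailed)} items")
```

Whether this kind of aggregation would be acceptable under Article 15 as drafted is exactly the sort of question the legislative process will need to resolve.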
Transparency Reporting Obligations
The draft also proposes multiple layers of obligations on transparency reporting about intermediaries’ content moderation activities. These reports must cover, among other things, government orders, notices submitted through the notice and action mechanism, and actions taken at the service’s own initiative; where the service is an online platform, they must also cover measures against misuse and the use of automated means. This will help researchers, oversight bodies, and the general public to better understand how content moderation decisions are made, who submits notices, what types of allegedly illegal content are most frequently notified, and on what grounds content is taken down.
Again, some detailed requirements will need to be clarified. For example, the requirement in Article 13.1.a. for categorising legal orders by ‘type of illegal content’ will create challenges given the different legal regimes across Member States. Will intermediaries be required to report by specific Member State laws, or will there be some categorisation of laws? The former would make reporting very detailed and costly, potentially creating difficulties for cross-comparison between statutes and/or companies. The latter would require some kind of grouping of types of illegal content, and possibly acknowledgement that speech may violate more than one law. This kind of clarification will be necessary anywhere Article 13 requires reporting on a ‘type of’ content or order.
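As a purely illustrative sketch of the two reporting approaches, the snippet below contrasts counting orders by individual national provisions with grouping them into harmonised categories, and shows how an order that invokes several categories complicates the totals. The statute labels, categories, and counts are invented for illustration only.

```python
# Purely illustrative: contrasting the two reporting approaches discussed above.
# Statute labels, categories, and counts are invented; the DSA proposal defines
# no such mapping.

# Approach 1: report per specific (hypothetical) Member State provision.
orders_by_statute = {
    "Member State A, statute 1": 120,
    "Member State B, statute 2": 85,
    "Member State C, statute 3": 40,
}

# Approach 2: report by a harmonised grouping, acknowledging that a single
# order may invoke more than one category of illegal content.
statute_to_categories = {
    "Member State A, statute 1": ["hate speech", "defamation"],
    "Member State B, statute 2": ["hate speech"],
    "Member State C, statute 3": ["hate speech", "harassment"],
}

orders_by_category: dict[str, int] = {}
for statute, count in orders_by_statute.items():
    for category in statute_to_categories[statute]:
        orders_by_category[category] = orders_by_category.get(category, 0) + count

# The category totals sum to more than the number of orders (245), which is one
# of the cross-comparison difficulties mentioned above.
print(orders_by_category)
```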
In addition, extremely prescriptive requirements for the content and format of transparency reports could have the unintended consequence of constraining services’ ability to respond effectively to abusive content. For example, requiring services to report the length of time it takes to respond to notifications and government orders, as stipulated in Article 13.1.b, will exert strong pressure on services to shorten that time, which will likely decrease the quality of the review they conduct.
In other cases, more detail in the provisions of the DSA may be beneficial. For example, Article 23, which requires online platforms to report on their use of automated tools, is rather general and may not generate the most useful information. To really understand the impact of automated decisions in content moderation, it would be useful for reports to include specific information about how content is surfaced for review by automated tools, what proportion of decisions to remove content are made through automated processes, and the rate of appeals against enforcement decisions involving automation.
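As an illustration of what more specific reporting could look like, here is a minimal sketch that computes those three metrics from a hypothetical internal moderation log; the log format and field names are assumptions made for the example, not anything the DSA defines.

```python
# A minimal sketch, assuming a hypothetical internal moderation log, of metrics
# that would make Article 23 reporting more informative: how content was
# surfaced, how many removal decisions were automated, and how often automated
# decisions were appealed. The log format is an assumption.
from dataclasses import dataclass

@dataclass
class ModerationAction:
    surfaced_by: str   # "automated_tool" or "user_report"
    decided_by: str    # "automated" or "human_review"
    appealed: bool

log = [
    ModerationAction("automated_tool", "automated", True),
    ModerationAction("automated_tool", "human_review", False),
    ModerationAction("user_report", "human_review", False),
    ModerationAction("automated_tool", "automated", False),
]

total = len(log)
auto_surfaced = sum(a.surfaced_by == "automated_tool" for a in log)
auto_decided = [a for a in log if a.decided_by == "automated"]
appealed_auto = sum(a.appealed for a in auto_decided)

print(f"Surfaced by automated tools: {auto_surfaced / total:.0%}")
print(f"Decisions taken automatically: {len(auto_decided) / total:.0%}")
print(f"Appeal rate for automated decisions: {appealed_auto / len(auto_decided):.0%}")
```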
Article 23 also raises the concern that the requirement to report on actions taken against ‘manifestly illegal content’ creates pressure on companies to demonstrate strong and decisive action against such content. In effect, this is another way of requiring platforms to determine the illegality of speech, a role that should be reserved for judicial authorities.
Advertising Transparency
Specifically for online platforms, there are also separate transparency requirements on advertising. Under the draft regulation, platforms displaying advertising shall ensure that their users can identify any individual ad, and have access to information about the targeting parameters and the person on whose behalf the ad is displayed. Very large online platforms (VLOPs) are subject to additional obligations, such as the creation of publicly available repositories with data about ads.
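For illustration only, a record in such a repository might look something like the sketch below. The field names, and the inclusion of a display period and an aggregate reach figure, are assumptions made for the example rather than the DSA’s own schema.

```python
# For illustration only: the kind of record a public ad repository might expose.
# Field names (and the display-period and aggregate-reach fields) are assumptions,
# not the DSA's own schema; no personal data about individual recipients appears.
ad_repository_entry = {
    "ad_id": "ad-2021-000123",
    "advertiser": "Example Advertiser Ltd",   # the person on whose behalf the ad is displayed
    "ad_content_url": "https://example.com/ads/000123",
    "displayed_from": "2021-03-01",
    "displayed_until": "2021-03-14",
    "main_targeting_parameters": {
        "age_range": "25-44",
        "location": "DE",
        "interests": ["fitness", "running"],
    },
    "recipients_reached": 48200,              # aggregate figure only
}
```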
These requirements are a welcome effort to allow users and the wider public to better understand the often rampant advertising practices taking place on online platforms. However, they alone might not reach the core of the problem. Currently, one of the primary public policy concerns is the practice of targeting ads based on profiles built from users’ personal data, which can exploit people’s prejudices and in turn drive hateful content or disinformation. The adverse effects of this model have also been demonstrated in the context of election interference and microtargeting. Merely increasing ad transparency will probably not solve the core issue without a stronger provision aiming to phase out advertising based on pervasive tracking.
As CDT points out in its response to the European Commission’s consultation on the DSA, transparency is not an end in itself; it needs to have a specific purpose, be tailored to specific audiences, and it will look different across different services. Users, independent researchers, and oversight bodies will all benefit from different types of information and levels of detail. The proposed framework should be flexible where necessary to account for variation in content moderation across services, and fixed where specific types of information are required and beneficial to their recipients. The transparency provisions in the DSA are a good first step towards shedding more light on the often opaque operations and content moderation practices of online platforms. However, it is apparent that several areas still need to be clarified to make the proposed framework workable, effective, and useful.
For more information, see CDT’s full overview of the transparency provisions as proposed by the DSA.