
Positive Intent Protections: Incorporating a Good Samaritan principle in the EU Digital Services Act

Authored by Joan Barata,
For the Center for Democracy & Technology

Executive Summary

The “Good Samaritan” principle ensures that online intermediaries are not penalized for good faith measures taken against illegal or otherwise inappropriate content. The rule applies to specific types of intermediaries, particularly those providing hosting services. By granting intermediaries immunity for the content they handle, the principle in fact incentivizes the adoption and implementation of private policies addressing both illegal content and content that is lawful but offensive or undesirable.

The principle finds one of its earliest and most acknowledged embodiments in Section 230(c) of the Communications Act of 1934 (as amended by the Telecommunications Act of 1996). Section 230 has played a fundamental role in the development of the Internet as we know it. Under the protections established by U.S. law, intermediaries have an incentive to operate and expand their businesses under a predictable legal regime, to moderate the content they host, and specifically to address certain forms of objectionable speech.

At the European level, the e-Commerce Directive (ECD) contains the general intermediary liability regime applicable to hosting services and establishes a series of provisions regarding the imposition of possible monitoring obligations on intermediaries. Intermediaries enjoy liability immunities insofar as they perform a role of a mere technical, automatic, and passive nature. This requirement of “passivity” is compatible with certain activities identified in the case law of the CJEU. However, intermediaries become liable when they fail to act expeditiously to remove or disable access to illegal content upon obtaining knowledge or awareness of it, or when, in the course of implementing voluntary and proactive monitoring measures, they are shown to have overlooked a particular illegality in a way that creates actual or constructive knowledge and strips them of immunity.

This legal framework, however, does not adequately promote the adoption of voluntary and proactive content moderation policies by private intermediaries, but rather discourages it. The more active a role intermediaries play in monitoring the content they host, the more likely they are to encounter potentially illegal content. In this context, the chances of overlooking a particular illegality, and therefore the risk of liability, grow significantly.

In order to incentivize content moderation under the Good Samaritan principle, and thereby enable intermediaries to address problematic but lawful content on their services, the paper sets out a number of recommendations for the Digital Services Act (DSA). Given the importance of a strong liability framework to promote freedom of expression, access to information, and innovation online, the future DSA needs to retain the liability protections already present in the ECD. At the same time, it also needs to create additional clarity about the scope and requirements of notice-and-action systems. In general, intermediaries should not be required to make determinations of the illegality of third-party content; that is the function of courts. Uploaders of content should have the right to issue a counter-notice, and the framework should include penalties for notices sent in bad faith, among other safeguards. Exceptions to these general rules should be limited and narrowly defined.

Moreover, intermediaries should be transparent about the impact of their content moderation systems and develop mechanisms to evaluate their effectiveness. Reporting mechanisms for content alleged to be illegal and for content reported as violating the service’s own policies should be kept distinct, so that it is clear whether an allegation of illegality is being made. Liability penalties should not arise from notifications of violations of content policies or Terms of Service. Should intermediaries be subject to duties aimed at preventing and tackling the dissemination of illegal content, these duties need to be commercially reasonable, transparent, proportionate, and generally flexible. Such obligations should not focus on the outcomes of content moderation processes, so as to avoid over-removal of lawful speech. Recognizing that there is no one-size-fits-all approach, and maintaining flexibility for different content moderation practices, can then enable effective Good Samaritan moderation of harmful, but not illegal, content.

Read the full report here.