

German Social Media Law Creates Strong Incentives for Censorship

Social media companies and other hosts of third-party content will soon face potential fines of up to €50 million in Germany if they fail to promptly censor speech that may violate German law. Last week, the German parliament approved the NetzDG legislation, which goes into effect on 1 October and will require social media sites and other hosts of user-generated content to remove "obviously illegal" speech within 24 hours of being notified of it.

CDT was critical of this bill when it was first introduced, and we're deeply concerned that the German parliament has now adopted it. This is one of the most extreme online censorship laws that we have seen from a liberal democracy to date. A number of questions about the law's intended and actual scope and function remain unanswered (we discuss some of them below), but, in short, the law puts internet companies in the inappropriate position of interpreting and applying national law, and it creates strong incentives for companies to err on the side of taking down speech. The German government has focused solely on pressuring internet companies to take down more speech, faster; it has not addressed the suppression of lawful speech and speakers that will inevitably result from this law.

Strong Incentives for Taking Down Speech

Under the new law, a host of third-party content won't necessarily face an immediate fine for allowing a single allegedly unlawful post to remain on its service. Rather, the potential fines are largely tied to a provider's failure to comply with various procedural requirements, such as failing to provide a complaints mechanism or to publish a report detailing its responses to the notifications it has received.

But this only obfuscates, rather than eliminates, the pressure this law will exert on providers to censor. One of the law's key provisions (Section 3(4)) requires providers to review their notification-handling process monthly and to "immediately" fix any "organizational deficiencies" in that process. Failure to do so could trigger the full €50 million fine, yet the law is silent on what will constitute evidence of an "organizational deficiency" in handling notifications.

The risk is that the German government will continue to press providers to show "improvements" in their processes in the form of ever-higher numbers of posts and accounts taken down. Over the past year, since Facebook, YouTube, Twitter, and Microsoft agreed to the European Commission's Code of Conduct on Illegal Hate Speech, the Commission and Jugendschutz, an NGO funded by the German government, have each published several reports tracking the takedown rates for the notifications they have sent to the platforms. Commissioner Jourová has sharply criticized companies whose takedown rates are, in her view, too low, while Jugendschutz has published a study showing that it can drive takedown rates higher through repeated demands and direct pressure on the companies. Each of these studies is premised on the notion that companies should demonstrate extremely high, if not 100%, rates of takedown in response to notifications.

Under the Code, as will also be the case under the new German law, notifications of allegedly illegal speech come from NGOs and private citizens, not from courts. These private actors may be misinformed, or simply mistaken, about the lawfulness of some of the speech they flag, and it is absolutely essential that social media companies push back on notifications that seek to censor lawful speech.

It remains to be seen how the German government will interpret "organizational deficiencies" in a provider's approach to evaluating notifications. It is clear, though, that the government should not set a 100% compliance rate with takedown demands as its goal. Such an approach would be highly vulnerable to abuse by people seeking to silence those they disagree with, and it would leave no room for evaluating context or correcting errors.

Institution of Regulated Self-Regulation

The law also contains other incentives for erring on the side of takedown. Before passage, the bill was amended to give providers a new compliance option: they may now send notices to an "institution of regulated self-regulation" (Einrichtung der Regulierten Selbstregulierung) and receive that institution's determination of whether the flagged speech violates the law. The institution, which the law says must be provided with "proper equipment" by providers but staffed by independent experts, must be approved by the Federal Office of Justice. The Office of Justice can also revoke an institution's approval (Section 3(8)) or order a provider not to send notices to a particular institution (Section 3(9)).

Possibly, these "independent" institutions are intended to function as something of a safe harbor program for providers, who could point to the fact that they sent notices to these entities and complied with their determinations as a shield against potential liability. Outsourcing determinations of content legality to one of these institutions may be the easiest path for providers, especially smaller platforms. But the institutions depend on staying in the Office of Justice's good graces to maintain their approval. This framework, in which an institution receives funding and equipment from providers and approval to operate from the government, could create strong incentives for the institutions to interpret the law harshly and to issue takedown determinations at a very high rate.

Moreover, this framework does not address the fundamental flaw of the law: it continues to rely on private parties (providers, or the experts at the new institutions) to make final determinations that flagged speech violates the law. The law requires providers to interpret more than 20 provisions of the German criminal code, including prohibitions on hate speech, dissemination of propaganda for unlawful organizations, "treasonous forgery," defamation of religion, depictions of violence, and non-consensual explicit imagery.

Typically, a person whose speech is challenged as unlawful would have an opportunity to defend herself before a court. The new German law, however, provides no remedy or appeal for individuals whose speech is removed. It does not require providers to build an appeal process into their evaluation of notifications, and it describes no process by which an individual could appeal a determination to either the provider or the government. And if a provider does decide that a post was removed improperly and should be restored, there is no limit on the provider's potential liability for hosting that content: the provider assumes all of the risk of restoring a post.

In sum, the elements of this law, from its scope and reporting requirements to its procedures and lack of remedy, all weigh on the side of censorship. The amendments introduced since the first draft was published do not remedy these problems. The law authorizes privatized law enforcement, in which "trusted flaggers" and private companies will decide the extent of, and limits to, free expression under the law. In a liberal democracy, courts should make these determinations. This law creates an inherently flawed system that other jurisdictions should not follow.