Overview of the NetzDG Network Enforcement Law

What is the law?

  • The German parliament passed a law on 30 June 2017 that subjects social media companies and other providers hosting third-party content to fines of up to €50 million if they fail to remove “obviously illegal” speech within 24 hours of its being reported.

Who does it apply to?

  • The law is described as applying to social media companies, but it defines that term very broadly, to include all profit-making internet platforms that are intended to allow users to share content with other users or make it publicly available.
  • The law includes an exemption for platforms offering their own editorial content (though it’s not clear whether this exemption extends to commenting systems on those sites). The exemption also extends to “platforms intended for individual communication or the dissemination of specific content,” but it is not clear what distinction the law aims to draw between “individual communication” and “sharing with other users”.
  • The law also exempts providers who have fewer than 2 million registered users in Germany.

What does the law require?

  • Providers must maintain a procedure for handling complaints about purportedly unlawful content. The procedure must ensure that the provider takes immediate note of each complaint, and providers are obligated to remove or block access to “manifestly unlawful” content within 24 hours of receiving a complaint.
  • For content that is unlawful but not “manifestly unlawful,” providers have a seven-day deadline to remove or block access to the unlawful content.
  • Providers that receive more than 100 complaints about unlawful content per calendar year must publish semi-annual German-language reports detailing the mechanisms in place for reporting unlawful content, the criteria used to evaluate reported content, how the provider handled the complaints, and the number of complaints received.
  • Providers must also make monthly reviews of their processes for handling notices and “immediately” rectify any “inadequacies” in the process.
  • The law also allows platforms to transfer decision-making power over the lawfulness of content to an “institution of regulated self-regulation,” funded by the social networks and approved by the Federal Office of Justice. The Office of Justice may revoke the institution’s approval or order providers not to refer notices to it.
  • Providers face fines of up to €50 million for, among other things, failing to produce the semi-annual reports, failing to establish a procedure for receiving and evaluating notices, failing to conduct the monthly reviews of their processes, or failing to eliminate “inadequacies” in the procedure.

What speech is at issue?

  • The law defines as “unlawful” content that violates any of nearly two dozen sections of the German Criminal Code.
  • Those sections of the Code include “public incitement to crime,” “violation of intimate privacy by taking photographs,” defamation, “treasonous forgery”, forming criminal or terrorist organizations, and “dissemination of depictions of violence.”

What are the policy consequences of the law?

  • The law puts heavy pressure on hosts of third-party content to censor speech.
  • The European Commission and the German NGO Jugendschutz have criticized the rates at which Facebook, YouTube, and Twitter take down content under the Commission’s Code of Conduct on Illegal Hate Speech. That pressure to remove reported content will likely only intensify under the new law.
  • Under the new law, notices will come from NGOs and private citizens alleging that speech is unlawful. Judicial review of the speech itself will occur only if the government brings an action arguing that the content was unlawful, and it is not clear whether that requirement applies to fines imposed for failures to report, to conduct reviews, or to remedy “inadequacies” in a provider’s process.
