The Center for Democracy & Technology submitted the following letter to Commissioner Jourová, Commissioner for Justice, Consumers and Gender Equality in the European Commission.
Dear Commissioner Jourová,
We write to you today regarding the recently announced Code of Conduct on Countering Illegal Hate Speech Online. As a civil society organization dedicated to promoting human rights online, the Center for Democracy & Technology is concerned that the practices enshrined in this Code of Conduct may be insufficient to ensure that the free expression rights of Internet users are protected and respected. We include in this letter a number of questions about the planned implementation of the Code of Conduct and the steps the Commission will take to provide accountability for decisions to remove citizens’ speech from the Internet and to prosecute them under applicable laws.
The Code of Conduct is focused on “illegal hate speech”, as defined in Framework Decision 2008/913/JHA of 28 November 2008. However, as the Framework Decision itself recognises (e.g. Recital 5), there is significant divergence between Member States as to what constitutes illegal, and therefore punishable, speech. It is thus not clear what the scope of material targeted for removal under this Code will be, and that scope will undoubtedly vary from Member State to Member State. Further, the process that Member States will follow for determining that targeted speech is, as a matter of law, “illegal” is not clear to us. As the United Nations Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression noted in 2012:
[A]ny restriction [on the freedom of expression] imposed must be applied by a body that is independent of political, commercial or other unwarranted influences in a manner that is neither arbitrary nor discriminatory, and with adequate safeguards against abuse, including the right of access to an independent court or tribunal. Indeed, the risks that legal provisions prohibiting hate speech may be interpreted loosely and applied selectively by authorities underline the importance of having unambiguous language and of devising effective safeguards against abuses of the law. 
It is therefore deeply concerning that there is no indication in the Code that a court or other independent arbiter will be involved in making these determinations of illegality. Rather, the Code refers to notifications about allegedly “illegal hate speech” coming from “national contact points” and “law enforcement agencies” within Member States, and indicates that IT Companies may review removal requests against the Framework Decision definition “where necessary” after examining requests under their own privately developed content policies. This raises a key question: Will notices from government officials also target speech that generally appears to violate an IT Company’s content policy, or will government notifications only be made for allegedly illegal speech? Overall, it is unclear what standards will actually be used by government officials to generate notices or by IT Companies to determine whether challenged content will be removed. The Code is also silent as to when, if ever, an assessment will be made by a court or judge that the targeted speech is truly “illegal hate speech.”
Furthermore, we are concerned that the Code of Conduct does not include any specific commitments from the Commission, Member States, or IT Companies to provide access to an appeal mechanism or other remedy for Internet users whose speech has been censored under this regime. The uncertain scope of proscribed content, especially when combined with the IT Companies’ commitment to act on the majority of notices within 24 hours, will lead to erroneous removal of speech that violates neither the law nor the companies’ own content policies.
The Code also provides no safeguards against misuse of the notice procedure, even though it is well understood that a notice-and-takedown regime can be vulnerable to abuse by those seeking to silence diverse views. The risks to free expression of vague and overbroad efforts to restrict online content were the focus of a recent comparative study on the blocking and removal of illegal content on the Internet, published by the Council of Europe on 1 June 2016. Thorbjørn Jagland, CoE Secretary General, notes in the press release accompanying the study:
“[…] I am concerned that some states are not clearly defining what constitutes illegal content. Decisions are often delegated to authorities who are given a wide margin for interpreting content, potentially to the detriment of freedom of expression.”
“[…] In a number of cases content removal is done on the basis of co-operation arrangements between law enforcement and Internet companies. In practice, this means that the decision on what constitutes illegal content is often delegated to private entities, which in order to avoid being held liable for transmission of illegal content may exercise excessive control over information accessible on the Internet.”
This type of delegation, embodied in the Code of Conduct, can leave individuals without adequate access to a remedy. If a user’s speech is removed or her account deactivated under this Code, she has a right to appeal the government’s determination of the illegality of her speech and should be able to seek reinstatement of her speech by the IT Company. It is a surprising deviation from fundamental due process principles that the Code includes no commitments to either of these remedies. Further, the lack of commitment to any specific efforts to provide transparency into the operation of this Code means that it is unclear how (if at all) citizens will be able to hold the Commission, Member States, or IT Companies accountable for their activity under this Code.
In reading the Code of Conduct, we are confronted by a number of questions about its intended operation:
- What are the criteria for a valid notice under this regime? The Code describes a valid notice as “not … insufficiently precise or inadequately substantiated.” How will such notices compare to notices issued under Member States’ implementation of the E-Commerce Directive? What information are these notices required to include?
- What are the guidelines or criteria that will be used by government officials in reviewing and referring content? How do these compare to the guidelines that will be used by participating IT Companies, CSOs, or others? Will these guidelines be made publicly available? Is the Code focused solely on “illegal” content (as is stated), or will actions taken under the Code seek to capture a broader category of content that may merely be alleged to violate IT Companies’ content policies (as is implied)?
- What opportunities for remedy will affected speakers have? When will they be notified of the restriction of their speech under this code? What opportunity will they have to appeal the legality of their speech before an independent arbiter? What remedies do participating IT Companies plan to provide?
- What are the Commission and Member States’ plans to report on the implementation of this Code and the speech and speakers that have been restricted under it? What commitments have the IT Companies made to report on the implementation of this Code? Will a public record be maintained of notifications that lead to content removal and possibly prosecution? Will such a record also include notifications that did not result in removal? How will implementation of the Code of Conduct take account of different thresholds for legality in different Member States?
To our knowledge, the European Commission has not sought advice or input from civil society organisations or academic experts in the field of free expression in the formulation of this Code. We would encourage the Commission to conduct open and inclusive consultations on this Code of Conduct and any implementation plans it may develop. We also encourage the Commission to consider the questions raised in this letter and look forward to hearing your views on them in due course.
Representative and Director for European Affairs
Emma J. Llansó
Director, Free Expression Project
Center for Democracy & Technology
Frank La Rue, Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, A/67/357, ¶ 42 (7 September 2012).

See, e.g., Comments of Center for Democracy & Technology and R Street Institute, United States Copyright Office Section 512 Study, pp. 11-12 (1 April 2016) (documenting abuses of notice-and-takedown procedures for copyright claims under US law), available at https://cdt.org/wp-content/uploads/2016/04/CDT-R-Street-512-Copyright-comments-20160401.pdf.

Council of Europe, “Secretary General concerned about Internet censorship” (1 June 2016), available at http://bit.ly/24iqmLQ.