
Account Deactivation and Content Removal: Guiding Principles and Practices for Companies and Users

From the role of Facebook during protests in the Middle East and North Africa, to the use of YouTube, Twitter, and other tools in the wake of the earthquakes in Haiti and Japan and the wildfires in Russia, platforms that host user-generated content (UGC) are increasingly being used by a range of civic actors in innovative ways: to amplify their voices, organize campaigns and emergency services, and advocate around issues of common concern. However, while these platforms and services may be perceived as public, their users are subject to the rules and controls that private companies create and implement. Intentionally or not, private entities thus assume a primary role in providing and controlling access to the ‘networked public sphere,’ which has in part supplanted the traditional town square as an open and dynamic online space for social and political debate and activism, where citizens around the world increasingly exercise their rights to free expression, opinion, assembly, and association. Platform operators are often faced with challenging decisions regarding content removal or account deactivation, which, even when valid or reasonable, may have costly implications for the rights of users, especially activists.

This report explores these dilemmas, and recommends principles, strategies, and tools that both UGC platforms and users can adopt to mitigate the negative effects of account deactivation and content removal. We use select examples to highlight good company practices, including efforts to balance complex and often competing considerations—the enforcement of site guidelines, responses to government pressure, the free expression and privacy rights of users, and the potential risks faced by activists—in consistent, transparent, and accountable ways. Importantly, this report does not put forth a one-size-fits-all solution for the complex set of challenges raised by Terms of Use (ToU) enforcement. Platforms vary in terms of history, mission, content hosted, size, and user base, and no single set of practices will be an appropriate fit in every case. Moreover, while the examples in this report focus on platforms that host social media, the recommendations are broadly applicable to companies that host different types of user-generated content.