
Report Provides Guidelines for Dilemmas of Account Deactivation and Content Removal

Washington – A report released today by the Center for Democracy & Technology and the Berkman Center for Internet & Society highlights the dilemmas companies and users face when enforcement of a website's Terms of Use policy results in deactivation of user accounts or removal of user-generated content. The report recommends principles, strategies, and tools that both companies and users can adopt to lessen the negative effects of account deactivation and content removal.

The report, "Account Deactivation and Content Removal: Guiding Principles and Practices for Companies and Users," outlines select examples of good company practices. Such practices feature rules and enforcement policies that are sensitive to users' free expression and privacy rights and to the potential risks faced by human rights activists, who are increasingly using social media tools in their work.

"This report offers guidance for those companies that are often forced to make delicate decisions about when to remove content or shut off users' access to their accounts," said Cynthia Wong, Director of CDT's Project on Global Internet Freedom. "It is important to note that this is not a 'one-size-fits-all' set of recommendations for addressing the complex set of challenges raised by Terms of Use enforcement, but rather a toolkit of possible strategies," Wong said.  

"The report highlights a number of areas where companies can play an active role in mitigating the negative consequences of account deactivation and content removal for all users, but especially activists, by integrating human rights concerns into their user policies and being transparent and consistent in how they communicate and enforce those processes," said Caroline Nolan, Senior Project Manager at the Berkman Center for Internet & Society.  

The report's recommendations focus on how platforms can transparently and consistently communicate with users, how escalation processes can be implemented in a rights-protective way, and how platforms can craft appeals processes that balance their own needs with those of their users. The report also notes that users must educate themselves about the platforms they use.

The report can be found online here: