CDT Joins Twitch Safety Advisory Council to Advocate for Users’ Rights, Transparency and Accountability in Content Moderation

Today, the video streaming platform Twitch announced the creation of its Safety Advisory Council (SAC), a group of experts and Twitch users who will provide input and feedback to Twitch about content moderation policy and user safety issues. This is a welcome move toward broader consultation on company policy by the growing platform, which has over 17 million daily active users, and CDT is pleased to participate as one of the inaugural members of the Twitch SAC. As a member of the SAC, CDT will continue to advocate for transparency and accountability in content moderation, and will focus on the implications of the company’s policies for users’ freedom of expression, privacy, and other human rights.

As debates about content moderation and platform accountability proliferate across the globe, more companies are creating formalized structures for consulting with outside stakeholders about the scope and effects of their policies for user-generated content. Some of these are massive efforts, like the Facebook Oversight Board, which has been in development for 18 months and whose first 20 members were announced last week. The Oversight Board is unique in that it is empowered to render independent judgments on content moderation decisions, and Facebook has committed to implementing those judgments.

Other efforts, including Twitch’s, involve bringing on outside stakeholders in an advisory capacity to expand the range of perspectives that inform and (potentially) influence the company’s decisions. The video-centric app TikTok recently announced a Content Advisory Council after facing many questions about the app’s approach to content moderation. And Twitter recently revamped its Trust & Safety Council, which CDT joined in 2016 when it first launched.

In general, CDT encourages companies to engage in dialogue with a wide variety of stakeholders as they develop their content policies. This kind of consultation is a key part of performing human rights due diligence around company activity. Content moderation at scale is extremely challenging, and the same policy will be understood and experienced very differently by different users and communities. By consulting with third-party experts, advocacy groups, human rights defenders, and users themselves, a service provider can better anticipate the risks and unintended consequences of its policies. It can also develop mitigation strategies and make changes that bring a policy’s actual effects in line with its intended scope. Advisory boards create a formal mechanism for this type of consultation, which can help ensure that it happens regularly, and they can improve the transparency of companies’ decision-making processes by requiring companies to articulate their principles and to identify the groups and individuals whose advice they seek.

Just as there’s no one-size-fits-all approach to content moderation, there’s plenty of room for experimentation in creating advisory boards, too. One intriguing feature of Twitch’s SAC is that four of the eight members are Twitch streamers themselves, who run channels with anywhere from tens of thousands to over a million followers. This is one of the most direct forms of user participation in a company’s policy deliberations that we have seen in the current wave of advisory bodies, and it should ensure that the real-world consequences of Twitch policy decisions are well-represented in discussions.

As a platform predominantly used for video game livestreaming, Twitch faces particular content moderation challenges: It must grapple with issues of sexist and racist harassment, bullying, and threats that have plagued online gaming environments since well before Gamergate, while maintaining an open environment that allows millions of gamers to build fun, creative, and supportive communities. A major part of the appeal of live video is its unpredictable and often unscripted nature, and it can be hard to tell at first glance exactly what is going on. Both channel mods and platform moderators have to strike a balance between, on the one hand, responding promptly to threatening or abusive content and, on the other hand, evaluating content in the broader context of the stream and channel. 

And content policies that are developed to be enforceable at scale can lack important nuance and can be confusing to users. Twitch is no stranger to this problem: For years, Twitch’s policy on nudity had drawn criticism from streamers for being too vague and subjective, leaving the site’s users unsure of what was allowed and opening the door to selective enforcement against certain individuals and groups. Twitch recently updated its Nudity & Attire policy to modify and more clearly describe its rules, and to articulate specific exceptions, such as breastfeeding and nudity that appears in the background of videos shot at public venues such as beaches, concerts, and festivals. Still, users raised questions and concerns about the consequences of the new policy, leading Twitch to issue further updates and clarifications about its scope. (The SAC was not consulted on the Nudity & Attire policy.)

The new Nudity & Attire policy still does not satisfy every user, with some pointing out that the distinction it draws between exposed male nipples (OK) and exposed female nipples (not allowed) reflects the mores of a sexist culture, and will likely enable the targeting of female streamers for harassment. But these various perspectives and critiques demonstrate why it’s crucial for content policies to be developed iteratively and through continued consultation with users, advocates, and experts from diverse disciplines and perspectives. No one set of advisors will cover every interest or perspective, and consultation with an advisory group should be one part of a broader strategy for engaging with users and other stakeholders about policy and safety issues. Nonetheless, a formal advisory body can create another vector for accountability and transparency around company policies, and CDT is looking forward to engaging with Twitch through the SAC to do just that.