>> Good morning, good afternoon, good evening, wherever you might be. Welcome to the fifth annual Future of Speech Online. My name is Emma Llansó. I'm the Director of the Free Expression Project at the Center for Democracy & Technology. First up today, we are honored to have a keynote address from a Member of the European Parliament: a member of the European Pirate Party and of the Greens/European Free Alliance group, and a rapporteur on the Digital Services Act, the European legislation on online content governance. He unfortunately was not able to join us live, but we will go now to a recording that he was kind enough to send in.

>> Making transparency meaningful: that's a challenge we all face. Being a member of the Pirate Party, I am a big proponent of transparency and accountability, because it empowers us and the public. And knowledge is power. In fact, I'm currently suing the Commission over its refusal to disclose documents on unethical and illegal research into video lie detector technology, and the ruling is expected on Wednesday. And you know, scientists can only deliberate on these questionable technologies, their accuracy and their discriminatory impact, if the technology and the results are out in the open. The media and parliaments, too, can only deliberate on this kind of intrusive technology if we know about it and have all the information that is necessary. So that's why I hope that this will set a precedent and allow us to hold the EU accountable for the research it funds.

But knowing about problems does not yet stop them. Transparency to me is a tool to change and improve regulation. So we will need to come to an exclusion of unethical technologies from the EU's research funding programme. We will need to reform the ethics review process. This is what transparency is about: it's a tool, it's a vehicle. In a similar vein, the proposed Digital Services Act of the European Union has been drafted by liberal commissioners and has been welcomed by industry, and it's all about transparency and self-auditing and co-regulation. Transparency seems to be the liberal answer to addressing the lack of respect for fundamental rights online.

So let me look into where the proposed Digital Services Act's transparency provisions can support fundamental rights. There is a provision according to which information should be provided in the terms and conditions on any restrictions imposed in relation to the use of the service, and on the parameters of recommender systems. I think this will be the rather general language we usually see in the terms. And as long as users are not given a choice, or, for example, interoperability of recommender systems, it is of limited value. What we are pushing for is that users would have a right to see their timeline in chronological order, and to use recommender systems of their choice, such as ones developed by the community. I think that would go a long way in addressing the problems these algorithms create. There is also a provision in the Digital Services Act for a yearly transparency report by companies, so we'll see statistics on takedown and data requests.
That's a standard procedure that's already in place for many companies, and it will now become mandatory. We'll also see a yearly report on upload filters, with indicators of their accuracy and any safeguards applied. This goes in the right direction. However, it's not exactly what Frances Haugen suggested. She said that you really need to break down the accuracy statistics country by country, because the filters work even less well for non-English-language content and regularly suppress valuable content. She mentioned counter-terrorism content, but the same applies to media content, LGBTQI content and other content, because these filters can't tell right from wrong. So the information provided for in the Commission's proposal may go some way, but it can only be the start of the discussion.

Companies will also need to notify and give reasons for removing content, so the affected person will be given the reasons and information needed to seek redress. That is useful and important, because you can only challenge decisions if you know the reasons why they were taken.

Users will be able to identify ads, as well as the advertiser, and will have a right to know the targeting parameters. There will also be a public database. This supports the public discussion that has repeatedly resulted in reforms and in corporations removing targeting criteria. But that can be circumvented, and it's not really sufficient to address the problems that come with the personalized targeting of content. It's a tool that, if used in elections and referendums, can be used to manipulate democracy, can be used for interference, if messages are designed in a way that appeals to the personality of the person reading them. It's not just something that a person can decide on individually. Many people may not understand the implications of this kind of manipulation and the power of the data that can be mined from comprehensive online user profiles. Transparency will also not prevent the manipulation of minors, or the targeting of very sensitive groups, based for example on a person's sexuality or religion, et cetera.

Then there is an obligation that they publish a yearly report on the systemic risks reported by platforms, as well as best practices to address them. Now, that sounds like a good thing, because it can be part of a fundamental rights assessment. However, legislation that mandates mitigating risks can make things worse. I am worried about the collateral damage that companies may cause if they try themselves to address so-called systemic risks. Note that the legislation is about risks; it's not about actual or adverse impacts. It is mere risks they want to address. I'm concerned that there is no democratic accountability in terms of these mitigation measures, and there is no judicial oversight. That is a kind of Pandora's box that we are opening here.

The key transparency provision that has been hailed a lot is a data access provision that would give access to vetted researchers in order to research systemic risks that the platforms are responsible for.
The European Parliament, in its position that has just been adopted, wants to add not-for-profit NGOs that should be given access, and wants to mandate the publication of the researchers' findings, which is an important improvement. And here again, publications on the algorithms that platforms use have repeatedly made them adjust and correct their algorithms, so this is of value. But again, it doesn't mean that users will be given any choice when it comes to their timeline. It may make the platforms adjust and improve their algorithms; I'm not sure they are aware of their own algorithms all the time, as it can be very complicated.

So to sum it up, I think Frances Haugen's leaks prove that a company like Facebook, or Meta, keeps consistently prioritizing profits over the public good, but keeps believing that it can sort of fix problems by itself. Transparency has a valuable role in safeguarding our rights online. But it's not sufficient if we want the Digital Services Act to be a game changer when it comes to the surveillance capitalism model, when it comes to censorship algorithms, or when it comes to the platform lock-in that we are encountering. So that's why, in the run-up to the plenary vote on the European Parliament's position on the Digital Services Act, I'm calling for improvements, and we'll also see amendments tabled that go beyond transparency and enter into the regulation of these platforms. What we need is the political will to put users and democratic institutions in control of the digital age, rather than trusting corporations to shape the digital era we and our children are living in. That said, I wish you a fruitful and hopefully successful discussion on how to make transparency meaningful. Thank you.

>> Thank you. That was Patrick Breyer, the rapporteur on the Digital Services Act in the European Parliament, on what to expect next on all of these issues from the European Union. Before we kick into our first panel of the day, just a few housekeeping notes.

This is day two of our three days of The Future of Speech Online. Yesterday we heard a great overview of the many different kinds of transparency that are part of the ongoing tech policy conversation. CDT has released a guide for understanding how issues like transparency reporting, third-party assessment, and researcher access to data fit under the umbrella of transparency. And we heard from advocates about how to focus on different groups and communities of users to understand what information is necessary. If you missed yesterday's panels, the video is available on CDT's YouTube page. We also note that today's session is being live streamed and recorded, and will be available on the CDT page going forward; we can drop the link in the chat.

If you have any questions for the organizers, or any trouble with Zoom features, you can contact events@cdt.org. Folks are welcome to submit questions via the Q&A feature, or, if you're on the live stream, you can send an email to questions@cdt.org or tweet at CDT. Our first panel will be all about reflecting on the Santa Clara Principles.
We will explain what they are. We will then have a very short break, starting just past the hour, while we change over our speakers for the next panel and move to the next session of the day, Building a Better Transparency Report, which will run through the second hour of the programming. If you're planning to attend both, sit tight when the first panel concludes.

So without further ado, I would like to invite the first set of panelists. We have a great group of experts here to talk with you today. I'll do quick introductions. First up is David Greene, Senior Staff Attorney and Civil Liberties Director at the Electronic Frontier Foundation, who has significant experience litigating First Amendment issues in state and federal trial and appellate courts. David was a founding member of the Internet Free Expression Alliance.

Next we have Connie Chung. Connie is Senior Director of Global Policy for Trust and Safety at Twitch, where she leads the team setting global standards for healthy communities to grow and thrive. Connie has more than a decade of experience in online safety, including work at Twitter, Yahoo, Netsafe New Zealand, and Common Sense Media. Connie was also one of the chief architects of the Safety Advisory Council that gives advice to Twitch; CDT is a member of the Twitch advisory council.

Then we'll hear from Julie Owono, the Executive Director of the Content Policy & Society Lab (CPSL) and a fellow of the Program on Democracy and the Internet (PDI) at Stanford University. She is also the Executive Director of the digital rights organization Internet Sans Frontières, one of the inaugural members of the Facebook Oversight Board, and an affiliate at the Berkman Klein Center at Harvard University.

And finally we have Evelyn Douek, an S.J.D. candidate at Harvard Law School, Senior Research Fellow at the Knight First Amendment Institute at Columbia University, Affiliate at the Berkman Klein Center for Internet & Society, and Visiting Fellow at the Yale Information Society Project at Yale Law School. She is also the cohost of the Arbiters of Truth podcast.

Today we're talking about the Santa Clara Principles for content moderation, and more generally about the continually developing law and policy conversation around transparency and user notice and appeal: what that frame really accomplishes in content governance and user empowerment, and what it leaves out.

Yesterday we had a great overview, courtesy of Rebecca, giving us a quick history of transparency in the tech sector, looking back to issues with companies operating in China in the mid-2000s, where Yahoo, Google, and Microsoft faced scrutiny about turning over their users' data to the Chinese government. Google released the first transparency report in 2010, really starting a decade-plus of discussion about transparency in content moderation. Over the years after Google launched the first report, a few companies joined in, but many more joined the fray of transparency reporting after the Snowden revelations about US government surveillance.
There was also a big push from civil society asking companies to do more reporting, not just on government demands for user data, but also on how companies themselves moderate content under their terms of service and community guidelines.

In 2018, a small group of NGOs from the US got together to develop the original Santa Clara Principles. This was alongside a Content Moderation at Scale conference organized by Professor Eric Goldman at Santa Clara University. There were three of those conferences that year, 2018, all available online and a great thing to look at. They were interesting in themselves: for the first time bringing together over a dozen different companies to talk very publicly and on the record about their content moderation policies and practices, how they address different issues, and giving people the opportunity to compare and contrast across different companies.

So this small coalition, including CDT, launched the principles in 2018 and laid out strong recommendations around the kinds of numbers, notice, and appeal opportunities that civil society was calling on companies to provide. We really tried to use these as an advocacy document, a rallying cry for civil society folks around the world to say: these are things we want to see companies doing to improve people's awareness of what's going on on services, and to respond to what we're hearing from many users around the world about a lack of clarity and understanding of how they can control what's happening to their own speech on different services.

And then just last week, we launched a revised set of the Santa Clara Principles at the Internet Governance Forum. This is where I want to turn to David Greene. So David, you've been deeply involved in what was a years-long process to revise the Santa Clara Principles. Can you tell us a little bit about what that entailed? What were the major issues and challenges that came up when you were revising the Santa Clara Principles?

>> About two years ago, some of us in the coalition had been using the Santa Clara Principles as a touchstone in advocacy, as some basis for assessing human rights... I don't want to use the word compliance, but to what extent is there a human rights dimension to content moderation? One of the concerns we had was that the original principles were written as a spontaneous thing, when this meeting happened adjacent to the conference. It wasn't done with the purpose of producing a set of principles, but rather arose out of the discussion. And it wasn't planned to be an inclusive group; people just happened to be in the room. I wasn't among them.

But as we were doing advocacy, we started to question a few things. One: are these effective? We as advocates found them useful, but are they effective beyond the circle of advocates? Two: these things were such a product of the US and Europe; do they have relevance for the rest of the world? So one of the things we started to do was really to find out whether the Santa Clara Principles were useful, what they were missing, and how the world at large felt about them.
And so we embarked on an open consultation process that really sought to get feedback from as many, dare I say, stakeholders as possible: people who are just interested, people who are affected by content moderation, from around the world. Best-laid plans, as with everything in 2020, got skewed a little bit when the pandemic started shutting down, or moving to the virtual world, a lot of the conferences and events where we had planned in-person sessions. But still, we felt it was a successful process. In addition to having individuals and organizations directly submit comments, we were able to have consultations happen by way of partners around the world: group consultations in Latin America, Africa, India, and the Americas more broadly. Really just people sitting around in a room for a few hours. We gave them a set of 13 questions and an interactive conversation. All told, we had about 40 submissions in total, representing, I believe, 25 different countries. A really valuable and useful source of information. All of us who read them learned a lot from the process. And we did try to summarize a lot of what was submitted in the report that we published, which is available on the Santa Clara Principles website at santaclaraprinciples.org.

As for the lessons we learned, there were overarching themes that emerged. The top message was that the Santa Clara Principles, or something like them, have the ability to be very relevant. I think the other thing we learned was that there are lots of different ways people think they can be relevant. I really saw the original Santa Clara Principles as being user-rights focused: very much focused on, as an individual user, what information do I need to know in order to decide whether I want to use this service, and how to use it in a way that I thought served me best. For example, the notice provisions were designed for individual users; they're focused on due process for that user. They weren't particularly focused on getting data for research or regulatory purposes. I think they were a self-regulatory document, not a regulatory document. One of the comments we saw, or at least the question being asked, was: are these most valuable as user-focused, or is there a larger frame? Do we want to look at transparency broadly, as in what do we want to learn about this ecosystem in general, and do the Santa Clara Principles have a role in bringing light to that type of information?

>> That's really interesting, David. And I want to dig in more on that question of what the principles were intended to be, what they have morphed into, and how that feeds into the feedback we were getting about what people hoped to use them for. But I do want to bring in Connie next. Connie, you've worked on transparency issues at a couple of user-generated content tech companies, including in your current role at Twitch. An open question to you: David was talking about wanting to assess the principles. Are civil society advocacy efforts like the Santa Clara Principles useful to you in your role, working at a company and thinking about these issues?
What has been useful about them, and what kinds of things do they seem to miss?

>> Yeah, thanks. Happy to chat about this. They are definitely useful, but there are also definitely limits. On the positive side, what I've seen in my professional life is that having principles and frameworks can be useful for advocating internally, and useful for my team internally as a check on our work. I've been on super tight, smaller teams where you're expected to support the company's trust and safety through your work. And sometimes, when you're in a smaller group, having frameworks from trusted places means better support for your work, because you can say: there are other groups that also support these principles; it's important for us to consider them as well.

I'm a big fan of The Checklist Manifesto, the book written by a surgeon who advocates for using checklists on routine things. There is efficiency and safety when the routine things are written down. For policy, it's helpful to have lists and principles to act as a sanity check, to make sure explainability and clarity are things we're thinking about, rather than relying on every single person on my team thinking about the same things all the time. That means building our own lists and values, and using principles from outsiders that we trust and that have been tested before. This frees up creativity and deeper thinking in the work we do.

As far as limits, I think it depends on whether the guidelines fit the operating principles of the company. Twitch, for example: we are live streaming and interactive. You can stream, and there's a back and forth between content creators and the chat community on the stream. For those who aren't familiar, we have a layered approach on Twitch. The first, foundational layer is the community guidelines. My team writes those rules; if you break them, you might get kicked off or suspended from the service. On top of that, you have a ton of tools for creating your community, because we hear that communities have different needs. You can designate moderators who can control your chat. You can time people out, and you can block people from participating. And with all these tools, you don't need justification or notice or process. You get to do what is tailored for your community and do what's best.

What ends up happening is interesting, because you've got some creators who are super heavy-handed with how they want to manage their community: they're like, I want to turn every feature up to the highest setting, I don't want to deal with the people I find unpleasant. And you have people who take the time to listen, who even if someone is rude say, I want to spend time in my community to have them become a productive member. They use our tools less. And so the part that works in community guidelines, procedural justice, can have limits, particularly for smaller groups. Different companies operate at different sizes and with different models, and some are much more community focused than ours. People want the flexibility to be able to make nuanced decisions, and they don't necessarily need transparency rules. They might just be like, I don't like your vibe.
These are the things we need to balance.

>> That's great. It's an interesting perspective, too, because I'm sure there were discussions about different approaches to content moderation. I myself didn't think of the principles as necessarily guidelines for community-involved moderators, the people who are right there, though I could make an argument that they can apply across the board. You brought up a point about how there's a variety of ways that content moderation happens online, and any set of principles is going to fit some models better and others less well.

I want now to bring in Julie, reflecting another potential audience for things like the Santa Clara Principles, although one that didn't exist when the principles were first drafted. Julie, you've worn many hats, including in civil society and academia, and now as a member of the Facebook, or Meta, Oversight Board. From your perspective on the board, do the Santa Clara Principles make an impact? Do they register with the board as you do your work? How do you see them intersecting with the transparency goals that the board has itself?

>> Thank you very much, Emma. It's a pleasure to be part of this conversation and this whole event. Well, to respond very frankly to your question: yes, initiatives by civil society organizations around transparency and accountability in the tech industry, particularly in the content moderation space, are extremely important. Why? Because, and I'll try to link my different hats here, I truly believe that the solutions we want to bring to the challenges we're facing on the internet in general have to mirror the network itself. The internet is a network of networks, and there is a lot of interdependency; there's no way one point of the network will find a solution for the whole network. I try to apply those principles to multistakeholderism: we can work together and include different publics in making the policy that will apply to us all in those spaces.

As a side note, the oversight board is an interesting try in that space: how can you make binding decisions that will be imposed on a company like Meta, and particularly on two of its platforms, Facebook and Instagram? How do you make binding decisions that are rights-based and also informed by inputs from different stakeholders? And the Santa Clara Principles around transparency have been very helpful to many of our discussions.

But I also would like to speak to the importance of these principles existing in the first place and starting a conversation. A conversation that is necessary to, I would say, try to fill the power imbalance, really. When you're a company like Facebook, or Meta rather now, when you're a company like Meta or Twitter or Alphabet, you hold so much information, and you're trying to say, okay, we will be more transparent, but you're talking to users who don't even know how your companies are organized. I am a civil society leader.
It was only when I came to Silicon Valley that I understood what trust and safety was and learned about the existence of such a profession. It was not until I came here that I knew the importance of talking to product teams, not only public policy teams. There is a real power imbalance. I'd like to see initiatives bring the information that is lacking to organizations and users that cannot be in the Valley, that do not have access to the companies and all this new knowledge.

We have to remember, social media are available in societies where even the concept of transparent governance does not exist. So if you want to talk transparency, you have to make sure that you are all talking the same language. I'm comforted by the fact that version two of the Santa Clara Principles went out there to the world, and it's something that I hope the other side can learn from as well: how can we learn more and understand more about the challenges faced by communities that we don't know about yet, but that are facing the future of content moderation problems? In most cases, they are not located in Europe or the United States. So I wanted to really congratulate the conveners of these principles for paving the way and leading by example.

So how do these principles intersect with the transparency work that the oversight board tries to do? We make binding decisions on individual cases. For instance, if your content was taken down by Facebook, you can appeal to the oversight board. But beyond these individual cases, we make a lot of recommendations to Meta; according to Meta, we make too many recommendations. What is important to know is that many of these recommendations are informed by the interactions that the oversight board has with external actors, including civil society organizations and academic institutions. We inform our recommendations and deliberations through these interactions, but also through our reading: we read a lot of reports published by civil society actors.

One example, obviously, is the Santa Clara Principles, but even beyond that, on the issue of transparency, one of the recommendations the oversight board made recently was around government requests for content takedowns. Most of that information was not public until now. I'm thinking particularly of a case that we worked on which involved a publication by Al Jazeera that was taken down by Facebook, a publication concerning the conflict in Israel and Palestine. What we learned as the board during our deliberations, through a report by an organization called 7amleh, was about the existence of a government unit that had direct access to Facebook's content teams and could request content takedowns. While making our decision, we insisted and recommended that Meta and Facebook publish that information. And Facebook has agreed to include that information in its transparency efforts.
All this to say: yes to these initiatives, yes to having more conversations about them and explaining more of these concepts, which are very new. More of these initiatives are important, and they help organizations and institutions do their work better. Thank you.

>> Thank you so much, Julie. And now I want to bring in Evelyn. Evelyn, your scholarship focuses on online speech regulation and platform governance, and you've written a lot on transparency, including on content cartels and on what platforms report about information. How do you see things like the Santa Clara Principles fitting into the broader policy debates around content governance?

>> I want to take the opportunity to thank you for your work advocating for the things we're talking about today; I think it's important. And thanks to everyone who worked on the Santa Clara Principles. I think they are a really fantastic document, and it's exciting to see the development of nuance and greater detail in building out the original document. As David said, the greater focus on systems and access, and things beyond the individual notice and appeal requirements centered on what the individual user needs, is really important. And the commentary throughout that these are not a model for regulation is really important.

But in some sense, the Santa Clara Principles became kind of the gold standard, a baseline of what people should want from content moderation and platforms when they're taking down content. Even though they weren't intended to be, they get picked up by regulators a lot. And I think there are a number of reasons for that. I think it's because they're quite easy to legislate. There are a lot of difficult trade-offs in this context, but it's easy to say: if you want to take down a piece of content, here are the reasons, and you have the right to appeal. It's easy to draft, and easy to hold platforms to account for not doing it. It's also intuitive. When we look at human rights or constitutional First Amendment law, those are the things we prioritize when the government is taking down speech, so it feels right to port them online. But I'm not necessarily sure that it's a good fit. The speed and scale of content moderation online distinguish it, and I think we need to engage closely with the trade-offs that are involved there.

I think it's been really important progress to get transparency from these companies. But I think we're starting to learn that these transparency reports that get produced quarterly or biannually can obscure as much as they reveal. They say: we're doing so well, we're taking down all the fake stuff. But if you dig in, you don't know the accuracy rates; maybe there's just more bad content online rather than the company being more diligent. It creates an incentive to take down more content, and there's no really vocal voice countering that. Governments say: yes, make them take down more content. What gets lost is that that's not necessarily a good thing.

I think it really incentivizes the idea that if there's something wrong, we can fix it by adding more due process. More process is better.
We see this in the oversight board opinions as well, and with regulators as well: if there are errors, we can fix that by giving users more explanation and appeal rights, and making sure every piece of content goes to a human. And we'll think it's legitimate and trust it more if we keep piling on process. I think that's the tenor of these debates. It's not necessarily clear to me that a user thinks a decision against them is more legitimate, or feels better about it, if it takes an extra two days to get their decision because it needed to go through all of this extra process.

So I really would love to see testing of those assumptions. And like I said at the top, the focus on moving to the systems level and thinking about system design, ex ante system design, is valuable. I think maybe that's where we should be focusing our priorities and resources and capital: both the business and private capital, but also the political capital. We are probably only going to get one bite at the apple. What is going to make a difference in this context? I've been invited to this panel to play the grinch, so: I'm skeptical about that.

>> Oh, boy. One of the friendliest and most cheerful grinches out there. I think that's an excellent segue. Just a reminder to our attendees: we have a Q&A function. But Evelyn, I think you tee up a reflection on this. David framed the Santa Clara Principles from the individual user rights and empowerment perspective. I have a general question for folks, open to anyone, on what you actually think is necessary for real user empowerment here. Some information is part of that, but is that enough? Probably not. What else do we need to look at? Take either that question, and/or this other one, about how to balance that focus on the traditional user rights framework versus other ways of thinking about platform accountability: what are the systems in play here?

What are the kinds of things, whether it's, Julie, from your role as part of the emerging structure of platform governance, or Connie, when you hear calls for more transparency about the systems in play, what does that sound like from a company perspective? I'll open it up to anybody who would like to jump in. Please do. Yeah, Julie.

>> Yes, thank you, Emma. I think the two questions are not exclusive... sorry, my lighting is terrible. I'm here, I'm talking. So the two questions are not mutually exclusive. What I mean by that is: what is needed for transparency to be more useful to users, or what else is needed? As an introduction, I would like to stress embedding content moderation, and content governance as a whole, in language informed by the rule of law and democracy. Right now my biggest frustration, and I apply it to myself too, is that we have been talking in very expert jargon. But I think it's also important to translate that into a language that many around the world understand, and that is the rule of law.
When two companies disagree on their policies, for instance, or let's say one group is considered dangerous on one platform and not considered dangerous on another platform, what does that mean for the ecosystem as a whole? What does that mean from a rule of law perspective? How can we resolve this? Is that a problem? Those are the questions we should be asking. That's related to transparency. But for that, we need to make sure we're all talking the same language. And right now, linking to my earlier remarks, I'm not sure all users around the world really understand that what they're facing is a content moderation problem, and that transparency, or even what that is, can be helpful. There is a role for us as organizations to play in disseminating that message and starting the conversation, going beyond the digital rights realm: speaking to children's rights organizations, women's rights organizations, disability rights organizations, everyone who uses social media platforms.

And to your second question, which was around whether we should do that or the more accountability-focused work: both are necessary. If we are moving towards a world where content governance is becoming, or is, a democratic issue and should be governed by rule of law principles, then it needs checks and balances, which are important for accountability. That is one example; there are many others. And disclosure and transparency are also extremely important. So to respond to your question: there is a lot of education needed, we need to translate from the rule of law when we talk about content moderation and regulation specifically, and both transparency and accountability are important in those conversations.

>> I'll try jumping off what Julie said. I view the Santa Clara Principles and similar documents as, I don't know about necessary, but useful and not sufficient. A user-rights-focused approach really only works in a model where users can respond to the information they get. If they don't have a choice of another service, then all the information they have doesn't do them much good. So I think the larger ecosystem also needs to really seriously think about how users are better served by a system with greater competition and interoperability, and things like that, to be truly user focused. In some ways, the Santa Clara Principles, as a user-focused document, would work better in an ecosystem where there's real user choice. In terms of the larger systemic question, what transparency teaches us about this ecosystem generally and what's the best way to operate within it, I think it can help us negotiate a system that is dominated by a few large players. But again, it doesn't get us the whole way there. So I'll go with: useful, but not sufficient in and of themselves.

>> Just one thing I'd probably add: it's important for us to hold moderation systems accountable, and for users to be able to advocate for themselves. That means, if you're making content, giving you controls over who can participate and interact with you, and on the viewer side, maybe limiting what you see.
If you don't want to see certain things, you don't have to see certain things. A lot of that is safety and tolerance: the more you give people the tools, the less you need interventions and the less companies have to step in. We see that a lot at Twitch.

>> Just to say, I think user empowerment, letting users control their content moderation experience, could be an effective way forward. Going back to users can be really important, and I think the focus on user empowerment is great. But empowering individual users to vindicate their own rights isn't necessarily going to fix what we think of as systemic problems. For example, it doesn't reach all the people who aren't on the platforms but are fundamentally affected by them. It also creates funny incentives. One of my favorite stats from the reports is the number of appeals that come from the people whose content gets taken down. And you can see that that's a very reasonable thing to happen, because someone's interest in their own content is going to motivate them to try to get it put back up. But for thinking about systemic issues, the effects on society and culture, we can't rely on individual users to create the system we want. There's always, always a trade-off in what you're focusing on.

>> Yeah, and that's really interesting, because it leads us into the question of who then makes the standards for the systemic side, the evaluation of systems. One nice thing about the user empowerment and user-centric focus is that it's on the individual to advocate on their own behalf if they feel there's an injustice, versus evaluating a system as a whole against, I would say, human rights principles. I'm not sure where that points us for evaluating the systemic application of a company's rules across all the content on their platform, or whether the notice and takedown process is working sufficiently well. I think that's probably something that's going to have to get developed quickly, as countries around the world are looking at putting in place regulation that takes this systemic focus.

I want to ask a question from an audience member, which raises a good point about who the audience for transparency reports is. They ask: do you think that companies are publishing transparency reports for governments and not users, because of how the data in the reports show the increase in content takedowns over time and don't focus on incidents and how things happen? It's a point in the form of a question. I'd love to get any reactions on the question of audience for transparency reports.

>> It's a good question. I was waiting to see if Connie would unmute first. I don't know if they're particularly useful to individual users, especially given the jargon they're written in; I don't know that the individual user gets a whole lot out of them. As a civil society member and advocate, I find I learn something from them, but I hope that researchers and academics learn more from them and can translate them. [Laughter]. I sort of hope they're not written for governments.
Like, I really hope that's not the audience. And I especially hope that's not the audience in the governments that are perhaps less accountable. I don't want to say non-democratic governments; I don't think it relates to democracy necessarily. Whatever the word is for governments that are able to regulate with the interests of their people in mind. I don't think we have a good result if that is the audience. I'm interested, from the companies' perspective, in what they think their particular audience is.

>> I can speak to that. First and foremost, for Twitch, people are really, really into transparency. This is one of the first things I realized when I joined. A lot of that is because we are part of the creator economy: creators make money full-time or part-time from their participation on Twitch. As soon as I started, I started getting questions: can I do this? Is this against the rules? I think it's because, if you get caught up in a policy, it has a far-reaching effect on your life. So we treat it seriously.

For one of the first policy launches we worked on, we spent a lot of time researching whether people felt it was transparent enough and comprehensive. We talked to a number of creators, and the theme over and over again wasn't, oh, this is where you need to draw the line; it was, can you please explain where the line is, and we'll go from there. That has been a super big motivation to be as transparent as possible. We started writing in more examples and more clarity. That is a motivation for all our transparency. And our creators are super interested in the numbers we take down and how they're affected by these policies, because it affects their livelihood as well.

>> I see we're coming up on the end of the hour, and one of our panelists has to drop off soon. Unfortunately we're not going to get to all the questions in the Q&A, but I want to ask our panelists for a final reflection. Transparency was a hot issue in 2021, and it is going to be so in 2022. What are you looking forward to, what could be accomplished in the new year, or what would you like policymakers to consider seriously as the big tech policymaking machine turns on in the next year? Julie, I'll come to you first.

>> Thank you very much, Emma. I guess one thing that I hope we'll be able to do more of in 2022, and specifically companies, is talk more to the users: explain more about who you are, how you work, how you function, what your content moderation processes are. This is important, especially at a time when companies want to regain trust from their user base. So that would be it: speaking more to users and explaining more about what you're doing and what you're not doing.

>> Connie?

>> Yeah. There's this one episode of Ted Lasso, I was trying to think of a summary, where Ted is talking to the coaches, and he's like, oh, therapy didn't work for me. And his assistant coaches are like, all people are different people. I feel the same way as I've been thinking about this panel.
All communities are different and companies are different. And it's going to take a lot of effort, and it's going to be hard. Standardizing and coming up with the best principles is important, and it's going to take a lot of work from all the spaces. We're better off appreciating the diversity.

>> Evelyn? How about you, what are you looking forward to in 2022?

>> There are a number of proposals in legislation to get researcher access: to create safe harbors for companies that want to provide independent researchers with access to underlying data, and for using that data. We can't fix problems we don't understand. It's fantastic to talk in the abstract about these questions of trade-offs between speed and accuracy and appeal rates and all those things. But we can't answer those questions without actually knowing what's going on. I think the big thing the revelations showed, or confirmed, is that there's more information inside than outside. The lowest-hanging fruit, possibly, is to say: let's crack these companies open and see if we can get the empirical basis for smarter regulation.

>> Great. And David, last thoughts from you.

>> Yeah, I'm actually not looking forward to regulation. Let me say, I'm actually looking forward to maybe seeing the foundational principles set out in the new Santa Clara Principles and how they get implemented by the companies. This was one of the big changes: we separated out foundational principles from the old operational principles. Human rights and due process; understandable rules and policies; cultural competence; more focus on information about state involvement, and concerns about state involvement, in content moderation; and integrity and explainability. I would like to see those principles reflected. So my hope is that the first transparency report I read in 2022 will at least show some consideration of those ideas.

>> Yeah, and thank you for the foundational principles, David. I think we could do an entire two-hour-long panel on what cultural competence looks like in content moderation. Maybe mark your calendars for next year. But thank you all so much for sharing your thoughts and everything with us today. We are now going to take a very short, one-or-two-minute break as we welcome our next panelists to the virtual stage. To all of our attendees, sit tight and join us for the second panel, which will be starting soon. Thank you.

>> Hello, everyone. I'm Caitlin Vogus, Deputy Director of CDT's Free Expression Project. I'm excited to welcome you to our second panel of The Future of Speech Online. This panel is called Building a Better Transparency Report: talking about the trends in reporting and the challenges of providing transparency reports as well.

A few housekeeping notes. I want to remind our audience that we'll save time for questions at the end of the panel. If you have questions for the panelists, feel free to use the Q&A function if you're joining us through the Zoom webinar, or tweet questions using the hashtag CDT questions, or email us at questions@cdt.org.
I'm going to introduce our speakers for today. We'll hear from Guy Berger, who is a director at UNESCO, where he works at the intersection of digital issues and free expression. This includes work on countering disinformation, including the study 'Balancing Act: Countering Digital Disinformation while Respecting Freedom of Expression'; the development of indicators for UNESCO's internet universality framework; reports on world trends in media development; and a recent issue brief on transparency and accountability of internet platform companies.

We're also joined today by Charlotte Willner, who is the Executive Director of the Trust & Safety Professional Association (TSPA), a forum for trust and safety professionals to connect with a network of peers, find resources for career development, and exchange best practices for navigating challenges unique to the profession. She also leads the Trust & Safety Foundation Project (TSF). Charlotte began her tech career at Facebook, where she led international user support and then built out their first safety operations teams, before building trust and safety at Pinterest.

And finally we'll be hearing from Richard Wingfield. Richard is Head of Legal at GPD, where he oversees the organisation's legal, policy and research functions, building the organisation's understanding of the application of international law to internet and digital policy, developing its policy positions, and monitoring trends and developments across the world. Richard also oversees GPD's engagement in key legislative and legal processes at the national, regional and global levels, as well as its engagement with the tech sector.

And before we dive in, I want to have a quick poll on how you see transparency reporting and what improvements and changes might come in the future. I've put out a poll to the audience. The first question is: how often do you read tech company transparency reports? The second question: do you agree or disagree with the statement, "Transparency reports provide information that helps shed light on how tech companies operate"? And third: which of the below topics that aren't covered in current transparency reports would you like to see included in future transparency reports? There are three choices. (Reading.)

And this is a way for our audience to weigh in and talk about it with our experts on the panel as well. I'll leave a few more seconds for folks to add their responses to the poll. Don't be shy. Okay, I'm going to close it up in two more seconds. Last chance.

>> Okay, I'll share the results. All right. On the first question, how often do you read tech company transparency reports: we have a fairly engaged audience, which should be no surprise given the topic of today's session; about 50 percent read them sometimes and 30 percent often. Agree or disagree: most of the audience agrees that they shed light on how tech companies operate. And on the final question, what else you want to see covered in transparency reports: there is a large focus on government content removal demands, and a few of you would like to see something else.
If that was you and you want to share in the chat what you would like to see, feel free to do that. Great.

Well, I'm going to turn now to our panel to get them to weigh in on this really interesting and important topic of transparency reporting. I'm going to ask Richard to maybe give us an overview of the current transparency reporting landscape. Can you tell us, Richard, what transparency reports are out there, who is publishing them, and who is involved in efforts to improve transparency reports and those types of processes as well?

>> Thank you so much, Caitlin. On the first part of your question, looking at the transparency landscape: Access Now has a really helpful set of indicators which tracks the number of companies producing some kind of transparency report. I think it's shy of 100, between 80 and 90 companies, mostly based in the US or Europe. Predominantly we see large online platforms, social media companies, and telecommunications companies, as well as other companies that collect data or receive requests for data from governments in some way, cloud services for example.

The reports fall into two main categories. The first, for social media companies, looks at content removal: how many pieces of content are removed under a particular policy, where the request may come from users, governments, or court orders. The second relates to requests for personal data. Law enforcement and other security and government agencies often want access to the data the companies hold, particularly about people based in other jurisdictions who are not within the country themselves. Often these transparency reports will give the numbers of requests from government agencies. We have also seen a trend toward more qualitative data: you see Facebook, or Meta, talking about policies and how they were enforced during particular situations. And you're seeing data broken down at a more granular level, every three or six months, broken down by country.

On the second part of the question, what initiatives we're seeing: obviously, companies themselves are looking to refine their transparency reports each year. We're also seeing regulation now coming in, in the EU and the UK, that will demand greater transparency from companies, with a focus on qualitative information, systems, processes, and algorithms rather than raw numbers of content removals. The Santa Clara Principles, which were mentioned in the previous session, bring up to date the set of expectations from civil society and civil rights groups. And we're seeing focused multi-stakeholder work on particular issues: for example, the OECD has developed a framework for transparency reporting on terrorist and extremist content, and there is similar work going on in the GIFCT's transparency working group, with a focus on transparency around terrorism and extremism. There is a lot going on.

>> Absolutely. Thanks, Richard, for providing that background. I want to go now to Guy, to ask you, Guy, about the work on transparency reporting that UNESCO launched with its recent report, Letting the Sun Shine In.
Can you tell us why UNESCO is interested in transparency reporting and how you're going to carry it forward?

>> Sure. Thanks for inviting us. I just posted in the chat the link to this report, which is called 'Letting the Sunshine In.' It comes from UNESCO, which, people will know, is part of the UN, with 193 member states. Why does the UN get interested in these issues? Well, particularly at UNESCO, we have this question of culture and communications and what's happening to freedom of expression and access to information, and at the same time the UN is a sort of custodian of human rights, and of the view that the private sector also has to respect human rights. Part of human rights, of course, is access to information. In the realm of governments, what we do at UNESCO is promote that governments should practice proactive disclosure but also be responsive to requests. And you could say that this also applies to the private sector. Within that bigger picture, transparency reports are part of the game. The bigger picture is access to data: there is some data in transparency reports, but there is the bigger question of access to data and APIs and so on. Ultimately, I think the interest is in making it redundant for people to have to ask for information and data, because it's in the transparency reports and/or in APIs which are available to certain actors.

The reason we got involved in this report is that in the UN there's a kind of attempt to set standards. The UN adopted the Universal Declaration of Human Rights; that's a standard. And our ambition here is to have a standard for transparency. The idea is really that this could be a standard for self-regulation, and it could be a standard for governmental regulation as well. What's in it is high-level principles, which could be something that gives guidance both to companies and also to--

>> I think--

>> The type of process we had in developing these high-level principles. What does it mean? Well, quickly, just to tell you what you would expect from this standard: we've put in these high-level principles that there should be disclosure about third-party assessments of human rights compliance, so that this information does not have to be leaked. There should be transparency about what risk assessments have been conducted around elections, for example. And there should also be disclosure about terms and conditions, and about the amounts of money spent on research, education, lobbying and advocacy. I think that's all useful for understanding, and making more accountable, the use of private sector power in this particular space. And if, at a normative level, this report is followed by specific indicators, as opposed to just high-level principles, then I think we've got instruments that people can use, and that companies themselves can use, to structure the way they practice transparency.

>> Thanks so much, Guy. And thanks for sharing the link as well. I recommend people check out the report.
It also gives a thorough background on transparency reporting, and talks about some of the history and the issues that have arisen in transparency reporting. It's extremely informative. I know we'll be talking later about the issues of standardization and voluntary frameworks versus government regulation. Lots to dig into.

Charlotte, I would like to bring you into the conversation. What are the challenges companies confront when they think about transparency reporting?

>> This is my favorite question; thank you so much for the invitation to be here. It's wonderful to see all these names in the attendee list. It's good to see you here. A great group to be talking with. There are a lot of challenges and a lot of opportunities from the tech company perspective. I had the opportunity to work at a company, actually two companies, that started small and ended up fairly large. There are some challenges that I think are universal regardless of the size of the company or how long it's been around. I know we'll be talking about standardization a bit, or a lot. It's one of the challenges simply because, as you mentioned earlier, Richard, there may be 80 to 100 companies right now who produce these reports. And each one of those companies has a different definition of when a piece of content is removed, acted upon, taken down. Companies haven't agreed on whether they report the number of removal requests they receive. And what is removal, what does it even mean? Maybe it's content, maybe it's behavior. There are a lot of fundamental storytelling challenges that we have as an industry. Fundamentally, we want these reports to be useful, not just for our audiences but actually for ourselves as practitioners. We would like a clear understanding of what's happening in this ecosystem. But the numbers don't tell the story alone. So there are a lot of really interesting challenges in how we define what's meaningful as a metric and how we tell that story to ourselves and to broader audiences. As Connie mentioned earlier, all companies are different, and I think that's just very true in this particular realm.

The other thing we think a lot about is operational cost. Transparency reports are not automatic. It would be amazing if they were. In a lot of places, perhaps most companies, these numbers are tracked in Google Docs and compiled manually, so there is real operational cost just in gathering the data. Then there is the operational cost associated with interpreting the data and putting it out in a form that is interpretable by the public. Even a question like one an earlier speaker raised, how many pieces of content were removed for X policy, sounds straightforward. But often the way an action gets logged depends on whether something was taken down in this report queue or that report queue. Something could be reported as hate speech; it doesn't violate our hate speech policy, but there's nudity, so we'll remove it. That can get logged as a hate speech removal rather than a nudity removal. Making sure each report is logged correctly is a challenge in itself.
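Charlotte's queue example is easy to make concrete. The sketch below is purely illustrative and not any company's real schema; the field names (`reported_as`, `removed_for`) and the sample records are invented. It shows how the same two takedowns produce different report tables depending on whether you count by the queue a report arrived in or by the policy the reviewer actually applied.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class ModerationAction:
    content_id: str
    reported_as: str   # the queue/policy the user report came in under
    removed_for: str   # the policy the reviewer actually applied

actions = [
    # Reported as hate speech, but actually removed for nudity.
    ModerationAction("post-1", reported_as="hate_speech", removed_for="nudity"),
    ModerationAction("post-2", reported_as="hate_speech", removed_for="hate_speech"),
]

# The same log yields two different "removals by policy" tables.
by_queue = Counter(a.reported_as for a in actions)    # {'hate_speech': 2}
by_reason = Counter(a.removed_for for a in actions)   # {'hate_speech': 1, 'nudity': 1}
print("Counted by report queue: ", dict(by_queue))
print("Counted by removal reason:", dict(by_reason))
```

Neither table is wrong; they answer different questions, which is exactly why the logging decision has to be made deliberately and disclosed.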
It's a lot of day-to-day machinery and mechanics that have a deep influence on the way the reports are then populated and assembled. We talk about that at TSPA: how do you structure yourself from the beginning so as to make these decisions easier and the story easier to tell?

>> And your example reminds me of yesterday, when Daphne urged some humility about transparency and linked to a Google Doc counting the ways these types of issues get handled differently when you do transparency reporting on them. If you get several reports about the same piece of content being removed, do you count it one time or multiple times?
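That counting question fits in a few lines of code. A minimal sketch with invented data: `reports` is a hypothetical list of (reporting user, content id) pairs, and both totals below are defensible answers to "how many?"

```python
# Hypothetical user reports: several users may flag the same piece of content.
reports = [
    ("alice", "video-42"),
    ("bob",   "video-42"),
    ("carol", "video-42"),
    ("dave",  "photo-7"),
]

per_report = len(reports)                               # 4: one per report received
per_content = len({content for _, content in reports})  # 2: one per distinct item
print(f"Counted per report: {per_report}; per content item: {per_content}")
```

A report that doesn't say which convention it uses can differ from a peer's by a large factor while describing identical activity.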
>> You also mentioned, Charlotte, that you want these reports to be useful. That raises the question: useful for whom? That is something that came up on the previous panel, and I think some of the panelists expressed skepticism that users find transparency reports useful. So for any of the panelists who want to tackle this question: do you think the information in transparency reports is useful for that audience? And how do you change it for different audiences? Anyone want to tackle that?

>> I'd be happy to jump in.

>> Sure.

>> I think what's important is: who is it useful to? It's useful in principle, even if nobody reads the report. The fact that people are putting out their wares in a way, even if they instrumentalize it for public relations purposes in many cases, I'm sure, is nevertheless in itself an exercise in public accountability, even if it's not taken up. Of course, ideally one wants to make this useful for users. And by users I don't only mean individual users; I include regulators, for example. They need to know what's happening in terms of electoral advertising, for instance. That's really important. For UN bodies it's really important to know, for example, that momentum is building, that certain dog whistles are being blown, and that these can culminate in really dangerous speech. And the same with disinformation and so on. That's why I say: transparency, yes, but transparency for whom, and access to other mechanisms such as APIs. Because this stuff is really important for people who have to deal with the consequences and who, instead of having to pick up the pieces or clean up the mess afterwards, with greater transparency could be proactive at the start of the process.

>> Charlotte, would you like to share your thoughts?

>> I was going to add that from our perspective, one of the big audiences for these reports is employees at other tech companies, though that is not the intended or main audience when people are writing them. We're still very much in the nascent stage. A number of companies have been doing this for a number of years, but there are a lot of companies who are like, ooh, I just put out my first one; I want to see what else I should be thinking about. That's true not just from a reporting perspective. It's also about understanding how other people do their trust and safety business, right? Watching a company say, we're getting these kinds of removal requests and that's how we respond. That can be informative for safety professionals in other places, who can say, it looks like they've been able to manage this in a certain way; let's learn from that approach. It's hard to walk up and say, hey, excuse me, do you have this problem? We do, too. The transparency report is really a big part of that. You can get a little peek behind the curtain and say, all right, let's see if we can learn to improve our own work here.

>> That's fascinating. I hadn't thought about other tech companies being an audience for transparency reports, but that makes sense. Richard, Charlotte talked about the trade-offs that get discussed inside the tech companies. I know you've been involved in voluntary processes like the Santa Clara Principles and the OECD work. I'm wondering, when those groups are talking about transparency reports and making them better, what trade-offs are they weighing in the recommendations they come up with?

>> There are a couple of really big themes that emerge, in different forms. One, which we touched upon a little bit: is the aim of transparency to get the largest companies to do transparency better, because they have the largest market share, or do we want to have more companies do transparency, particularly small and medium-sized companies? The more demanding the standard, the more it pushes the bigger companies to do better, but it excludes smaller companies because the bar is too high. Are you trying to achieve transparency from smaller companies or larger companies? A second trade-off that we've seen from time to time relates to how standardized you want to get. You can have a broad set of principles that apply to all companies within the scope, say ICT companies that allow users to generate content. But if they're very generalized and you give a huge amount of discretion, it's difficult to compare and contrast, to be able to see how companies are doing relative to each other. The more specific and granular you get, the more precisely you prescribe what transparency should look like; but all these companies are doing things differently, and with success at different levels. So there are trade-offs there as well. And there is a balance between transparency and the effectiveness of addressing the problems causing concern. You might say, we're going to publish transparent data about the organizations and individuals we're monitoring and the content we prioritize; but then you're going to alert them to what you're doing. There are unexpected trade-offs between transparency and effectiveness in dealing with terrorism or hate speech, or, as a government or regulator, in trying to stop them.
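The standardization trade-off Richard describes can be seen in data terms. A purely illustrative sketch, not any real reporting framework: the broad schema is one almost any company can fill in but that supports no comparison, while the granular one invites cross-company, per-country comparison at the cost of assuming everyone logs the same categories.

```python
# A broad schema: easy for any company to satisfy, hard to compare across firms.
broad_report = {
    "company": "ExampleCo",   # hypothetical company
    "period": "2021-H2",
    "items_removed": 12345,   # no shared definition of "item" or "removed"
}

# A granular schema: comparable, but it presumes shared categories and logging.
granular_report = {
    "company": "ExampleCo",
    "period": "2021-H2",
    "removals": [
        {"policy": "hate_speech", "source": "user_report", "country": "DE", "count": 310},
        {"policy": "hate_speech", "source": "automated",   "country": "DE", "count": 95},
        {"policy": "nudity",      "source": "court_order", "country": "FR", "count": 12},
    ],
}

# Only the granular form supports the per-country, per-policy questions
# regulators and researchers keep asking; the broad form is one opaque number.
total = sum(row["count"] for row in granular_report["removals"])
print(f"Granular rows roll up to {total} removals; the broad report is one number.")
```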
>> Yeah. We've been talking about trade-offs within companies and trade-offs in civil society efforts. I wanted to turn to thinking about governments and regulators. Guy, you mentioned the UNESCO report, 'Letting the Sunshine In.' It could be useful to regulators who are thinking of mandating transparency reports, and we have seen efforts around the world to require transparency reporting. From your perspective, what are the most important things governments and regulators need to understand when they're thinking about transparency reports?

>> This is, of course, a complex thing, because it comes down to, you know, the power of government, and government respecting human rights and so on. One certainly sees governments wanting, in many cases, to regulate content. And in a way, regulating for transparency is an alternative for them, because regulating for transparency gives insight into what's happening in terms of actors and behavior on the companies' services and how the companies are dealing with these issues. Between the extremes of laissez-faire, where at the moment nothing happens, and overregulation of content, if you put this in the middle and say, well, regulate for transparency, I think this is a way that could avoid some of the pitfalls of doing nothing and of doing too much. And in a way it's also, I think, an incentive for companies: knowing that they could be scrutinized, they will take greater care about their operations, how they spend their money and so on and so forth. I think that can benefit a lot more actors than government alone.

For example, and this relates to the question of transparency and trying to get to more specific things: you have law, you have policy, you have regulation, getting more and more specific as you go down the pipe. But ultimately you come to indicators as well. And indicators for whom? So you might have particular policy, law, regulation and indicators for elections, for example, or for health, or whatever the particular concern is. Or you can have it at a broader level, with a broader kind of instrument. What's important is that there's something that can be a foundation for further work, both by stakeholders and by companies, to respond to specifics. I'll give one quick example here. One of the things we consider at UNESCO is the safety of journalists. At the moment, there's nothing available. What we would like to do, and next year we'll do consultations on this, is ask: what does it take to get indicators on that? How can this build on principles, on existing laws, existing regulations, existing transparency reports and existing company policy? You have to get down to the granular level if you want to understand this problem, and then propose ways to address it. So I think governments have an important part in this ecosystem, contributing elements that other people can build on.

>> Yeah, if I can follow up on that. This wasn't a question we talked about beforehand, so sorry if this is a surprise. In the poll at the beginning, many of the audience members said they would be interested in greater transparency reporting from governments: transparency reporting on the content removal that governments are instigating or requiring from platforms. I'm wondering if any of the panelists are seeing trends in that direction, or lawmakers interested in that.
It seems there's a focus on transparency from the companies, which is appropriate and makes sense, but perhaps governments should also be looking to themselves and to what sort of transparency reporting they should do. Have any of the panelists worked on government transparency reporting with respect to content removals or takedown demands?

>> In some of the multi-stakeholder initiatives that brought companies and governments together, I haven't seen a huge amount in terms of governments, though there has been acknowledgment by governments that they themselves ought to do transparency reporting. You have demands that come from courts, and court orders don't get published. Or from law enforcement agencies at different levels of the state, security services, advertising regulators, all of them making different demands, and none of it centralized within government at all. In some cases the platform is better at coordinating the data on this than the governments are, even though the governments themselves are making the demands they want transparency about.

>> It's like what Charlotte was describing within companies: if you don't track the data, you can't report it. Maybe governments need the Google Doc that Charlotte was describing.

>> I was thinking what an incredibly interesting parallel that is, Richard. It illustrates the problem of centralization. As companies get larger and larger they become, whether intentionally or not, decentralized. It's one thing if you're one company with one product, but quite another if you're one company with many products, or a company with subsidiaries. An amazing example of that is actually any government, especially these larger, decentralized governments. That's an interesting parallel that I hope can prompt some compassion, organization to organization, in our society. Once things get complicated, it's hard to make them simple again.

>> If I can jump in, Caitlin. This is an inspiring question, and people might see some potential in one system that does exist at the moment, though it's not, I think, comprehensive in scope. I'm sure people have heard of the Sustainable Development Goals, agreed by all the governments at the UN. One of their targets, SDG 16.10, covers public access to information and fundamental freedoms. States are supposed to report on this; it's voluntary reporting: do they have a guarantee of access to information, and do they implement it? And the other indicator is about the safety of journalists. This gives scope to develop indicators that states can implement. And in a mirror of what Charlotte was saying: at UNESCO, we encourage states, because it's all voluntary and political and you're trying to build up momentum, to send us data on this access to information story: do they have a law, and how do they implement it? And very often, it's not that they don't want to respond; they don't have the data on how they're implementing the right to information. They're not counting it, so they can't put it into their report. It's evident that you need capacity building and measurable indicators that are clear and have a consistent meaning that all the different actors can use. So it's very similar. And I think it could be something that people could try to build up in the future: developing standards for governments to report, inasmuch as this concerns access to information and fundamental freedoms. It's about what they're doing to protect freedom of expression. That means, when they're asking for takedowns: on what basis are they doing this, what kind of content is it, what volume of content are they asking to be taken down, and how do they characterize it? It's something that should be looked at, because it's a complement to what we're looking at from the companies.
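Guy's "they're not counting it" point is the whole problem in miniature: a government can only report the takedown demands it records at the moment they are made. A minimal sketch, with invented agency names and records, of the kind of central register Richard notes mostly does not exist:

```python
import csv
import io
from collections import Counter

# Hypothetical demands from different arms of one government: courts, police,
# regulators, each of which today tends to keep (or not keep) its own records.
demands = [
    {"agency": "court",        "basis": "defamation ruling", "platform": "ExampleNet"},
    {"agency": "police",       "basis": "threat statute",    "platform": "ExampleNet"},
    {"agency": "ad_regulator", "basis": "misleading ad",     "platform": "OtherSite"},
]

# With a single register, the yearly government report is a trivial aggregation...
print(Counter(d["agency"] for d in demands))

# ...and can be published in machine-readable form alongside the prose report.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["agency", "basis", "platform"])
writer.writeheader()
writer.writerows(demands)
print(buf.getvalue())
```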
>> There we go. We've brainstormed another multi-stakeholder process that can take place in 2022. I'm sure we'll solve it in 2022. I want to talk about the company side of things. Charlotte, you talked about the issues facing larger companies in the collection of data. For smaller companies, what do you see as the difference there? And what advice would you give to smaller companies who are thinking of publishing their first transparency report?

>> My advice is maybe a little controversial for this audience. My first thing is: log everything, because you might need it some day. Or you might decide, we might be able to spin up more process, and this is the data we need to do that. Log everything. And then there's a version of this world where, actually, we don't want to be collecting every piece of data about users, do we? Right? There is this natural tension between being able to tell a complete story and being able to not collect and hold all of this user data. That, I feel, is something we as a human society could probably use a lot more conversation on; getting those tensions, perhaps not resolved, but getting more people aware of the fact that the tension exists.

I do actually, in my job now, give advice, make suggestions or recommendations, to companies in that position. What I usually say is, first of all, look around. Look at what other companies are doing. That is often a very good guide, because a company can see what's practical and realize, we can do this. The other thing is really to be thinking from the beginning about what your values are as a company or a product, and how to tell the story of your product in relation to those values, and to be sure you're able to do that with the data you collect and how you present that data. Transparency reports are some of the most powerful storytelling a company can do. Framed in that way, a report helps you achieve a lot of your business goals and reputational goals. But it needs to be thought of from the beginning rather than as an afterthought. That's an advantage newer companies have. A lot of companies were established before the transparency report existed, and now they have to back into it: okay, here's what we now want to say. If you're starting today, that's fresh. You can actually make decisions about it, incorporating that reality, in a way that I think is very powerful.
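One way the tension Charlotte names sometimes gets softened in practice, sketched here with entirely hypothetical names and retention rules: keep the detailed event only as long as an appeal window plausibly requires, while keeping the aggregate counters, which are all a transparency report needs, indefinitely.

```python
import datetime
from collections import Counter

APPEAL_WINDOW = datetime.timedelta(days=90)  # assumed retention period

detailed_log = []            # full events, needed for appeals; purged on schedule
report_counters = Counter()  # policy -> count; tiny, and holds no user data

def record_removal(user_id: str, content_id: str, policy: str,
                   now: datetime.datetime) -> None:
    """Log the full event for appeals and bump the aggregate used for reporting."""
    detailed_log.append({"user": user_id, "content": content_id,
                         "policy": policy, "at": now})
    report_counters[policy] += 1

def purge_expired(now: datetime.datetime) -> None:
    """Drop detailed events past the appeal window; the counters survive."""
    global detailed_log
    detailed_log = [e for e in detailed_log if now - e["at"] < APPEAL_WINDOW]

now = datetime.datetime(2021, 12, 1)
record_removal("u1", "post-9", "hate_speech", now)
purge_expired(now + datetime.timedelta(days=120))
print(len(detailed_log), dict(report_counters))  # 0 detailed events, counts intact
```

This is only a sketch of the trade-off, not a compliance recipe; the right window and the right granularity of the surviving aggregates are exactly the judgment calls the panel is describing.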
>> Yeah, those are great points, and there's definitely a trade-off between the need to collect the data in order to report the data, and wanting to maybe not keep that data. Where policymakers draft transparency proposals, many have interesting ideas for information that could be reported. The societal benefit is clear a lot of the time, but companies would have to collect sensitive data that maybe we don't want collected.

>> I want to jump in. Someone in the host-and-panelist chat said: have someone who knows pivot tables. That is the best practical advice anyone new to transparency reports can take. Make sure someone on your team is comfortable with pivot tables. Thank you to that attendee for making that point.
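The pivot-table advice is as literal as it sounds. A minimal pandas sketch with made-up numbers: one pivot turns a raw removal log into the per-country, per-policy table a report actually publishes.

```python
import pandas as pd

# A made-up removal log: one row per enforcement action.
log = pd.DataFrame({
    "country": ["DE", "DE", "FR", "FR", "FR"],
    "policy":  ["hate_speech", "nudity", "hate_speech", "hate_speech", "nudity"],
})

# Pivot the raw rows into the summary table a transparency report would show.
table = log.pivot_table(index="country", columns="policy",
                        aggfunc="size", fill_value=0)
print(table)
```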
>> That's a good reminder, also, that if any of our attendees would like to pose questions to our panelists, feel free to use the Q&A function, tweet us, or email your questions to CDT. Another issue is the criticisms of transparency reports. We've talked about ways to improve them and about the benefits of transparency reporting, but some people are skeptical of transparency reporting overall as a process. I think Evelyn Douek on our last panel said transparency reports can sometimes obscure information more than illuminate it. I wonder if any of our panelists agree with that, in whole or in part, or how you would address it in the transparency reporting process. Would anyone like to take the floor on that question? It's an easy one, I know.

>> Yeah, a couple of thoughts. I'm going to be on the fence. On the one hand, I don't think we want to be overwhelming in demanding transparency. A lot of companies are producing more, and it's getting better, more detailed and more granular, because of the demands from civil society for this information, and I don't want to downplay how much work goes into producing it and making it accessible. At the same time, as the saying goes, there are lies and there are statistics. Through transparency you can of course manipulate what you want to show. A company can say, we remove 99 percent of the hate speech we identify on our platform, which sounds great. But how much are they identifying? So I do think we need to be constructively critical. We need to consistently scrutinize and ask for improvements in transparency. But many companies are trying to do the right thing here and trying to be as helpful as possible, particularly in companies where other parts of the business are asking, why are you doing this and being so transparent? I think we should give them credit.
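The "99 percent" example comes down to the denominator. A toy illustration with invented figures: the proactive removal rate can look excellent while most violating content is never identified at all, which is why critics such as Frances Haugen push for prevalence numbers rather than enforcement numbers.

```python
# Invented figures, for illustration only.
identified = 1_000     # violating items the platform's systems found
removed = 990          # of those, how many were taken down
prevalence = 20_000    # violating items actually on the platform (usually unknown)

removal_rate = removed / identified    # 99% -- the headline statistic
coverage = identified / prevalence     # 5%  -- the statistic rarely reported

print(f"Removed {removal_rate:.0%} of identified items, "
      f"but identified only {coverage:.0%} of violating content.")
```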
>> Guy?

>> I think this is a good point. And I think, at the same time, if companies want to establish their bona fides and convince people that they are accountable for human rights, then something increasingly valuable is a meta-report on the progress they're making, because reporting should never be static. You want to roll in data on new topics. This continues to evolve. What's important is for companies to be convincing about what steps they're taking for future proactive disclosure, or reactive disclosure for researchers, and what's available. And maybe it's not stuff in the particular report, but ways people can get information from the companies through other mechanisms. So it's a kind of a guide. In the end, a transparency report could be so long that it's basically unmanageable. But I think it could be quite useful to keep showing that companies are really committed to respecting privacy, law enforcement obligations and everything else, while keeping information available to stakeholders. Some people can game the system, but on the whole, that's not an excuse to hold everything close to your chest when you've got such a need for accountability, or when you're under pressure. It doesn't seem that a lot of people have a lot of faith in the companies to do the right thing, especially if it's going to cost the companies a bit more money.

>> I think that's an interesting point, Guy. And I like the idea of companies showing consideration of, we used the report this way and now this is how we've changed. At TSPA we did a curriculum on transparency reporting this last year. I encourage people to check it out, and not just because I want to show off that fantastic volunteer-written curriculum chapter; it shows how companies have done it. I encourage people to look at the transparency reporting trajectory that we've seen from Google, of course, Twitter, now Meta, and also from Cloudflare. Because what you can see in the way that they produce their reports is that sense of change and progress. They all talk about: this is something we're making different this year; we're doing it a new way this year, and here's why. And they explain what that story is. Those have been interesting to read year over year, because you're able to get that sense of evolving thinking.

>> If I can add another quick point: in these UNESCO high-level principles, I think there's one that says companies should disclose what they're doing on what we call media and information literacy. This is an important thing, because if we're speaking about transparency, we need to speak about empowering the users of transparency. Everybody knows the term surveillance capitalism because the business model was kept under wraps. Now it's become transparent, though not necessarily through the efforts or the disclosures of the companies. What's super important is that companies do take some responsibility to make sure that users fully understand what the business model is, what the risks are, how to engage, and how to protect yourself as need be. And that is part of transparency: being able to show people that you do care about educating and empowering people concerning the use of your service.

>> Absolutely. And I want to turn to the questions from the audience now in our last few minutes. We have one here in the chat from Tim O'Brien, who asked: what can we learn from an adjacent domain, Google's transparency reports on requests for data? Google publishes a detailed report every year, and says Google is not required by law to publish the transparency report but does so as a matter of company policy, and it takes all the criticism while lawmakers get a free pass. So the question is: is there a disincentive for companies publishing transparency reports, because a lot of times they're criticized about government demands for data or about their content moderation practices? Maybe, Charlotte, I'll take that to you first, since you have a perspective from inside a tech company.

>> I'm going to need you to repeat the question, because I was looking at the Q&A box.

>> The question is whether there's maybe a disincentive for companies publishing transparency reports, because sometimes they are criticized after they publish them for not taking down enough content, or for giving governments too much data.

>> Yes, and also companies get criticized for anything, all the time, always. This goes back to my earlier point: the company has to understand what its values are and why it is producing the report. Yes, there absolutely is; the world outside tech companies provides incentives and disincentives. It doesn't mean you shouldn't put out a transparency report. A lot of the operational cost in producing a report is about that positioning. It's trying to minimize criticism and help people understand what's going on in a meaningful way. I'm not aware of any companies saying, we're not going to do one because it's going to raise more questions; it's more of a problem for us. Some of that is peer-pressure based; I think it's considered more standard now. But a lot of it is because the people who choose these jobs do them because they want to help, right? There are very few people who are like, I'm here to have a great time looking at crazy stuff. It's because this work is seen as an important way to help our human society grow. And in that lens, there's always going to be an incentive toward transparency. It just really requires society to ask good questions. And when there are critiques, as someone was saying earlier, make sure they're good and meaningful critiques rather than just trash talk.

>> Great, thank you, Charlotte. We have a few minutes left. I'll take another question from the audience, which is the last question I wanted to ask anyway. If you could get all tech companies to produce one new exhibit or metric as part of their transparency efforts, what would it be, and why? Or, if you want to answer it generally, what would you prioritize in the next generation of transparency reports? I would love it if each of you gives an answer. Richard?

>> The one I'm interested in, which raises a number of broader human rights issues, is: who are the people, the humans, making these decisions? We know there are tens of thousands of people out there looking at the worst images and videos in society. We don't know where they are. We know there are mental health and working-condition concerns there. This is part of the larger companies. We need to think more about the well-being of the people doing this difficult work, alongside all of the data that goes alongside it as well.
That's something we've never seen in transparency reports before, and I would love to start seeing it.

>> I agree with that. I would add to the well-being information: information about the training moderators receive and the languages that they speak and are operating in. There's been a lot reported recently, from the Haugen leaks, about where content moderation is lacking on Facebook, and it's lacking on other platforms as well. That would be an interesting metric. Charlotte, what would you like to see?

>> I would love to see that metric, and I simultaneously wonder whether a metric can capture it in a meaningful way. That's often what I would contribute here. I think this is, again, sort of an operations question. I think it would be interesting to know the amount of time people spend producing their transparency report, actually. I think that would be a very illustrative question.

>> You think that would cause companies to-- do you want a big number or a small number?

>> Isn't that the question, right? I think it's a good meta-example of when people think they're asking for something. We have this principle in trust and safety, which is the "no good number." There is not a good number; it's either too many minutes or not enough minutes. What is it we're actually looking for as consumers of this information? That should prompt us to reframe the question we're asking of that report.

>> Guy, I'll give you the last word. What would you like to see?

>> I would possibly ask people to make suggestions to us, because we're interested in this question of media viability. And of course, you know, this is partially an issue of what's amplified, what is discoverable, the money issues, and we all know the disputes, both in the EU and Australia and so on, over what deals are made with which publishers to give them what revenues. So we want to do a consultation next year on what the indicators would be for transparency that would respect commercial secrecy, but which would enable more informed and evidence-based discussion and negotiation about what changes could happen in the ecosystem to support independent journalism, which we know is a casualty of the success of the platforms. So please be in touch with me if you've got any ideas about how we can build up a system of metrics for the companies and the publishers in the media.

>> That's a topic near and dear to my heart. With that, I would like to thank my three wonderful panelists, Richard, Guy and Charlotte. I'm going to turn it back to my colleague Emma Llanso to close it out. Thank you.

>> Thank you so much, Caitlin, and thank you to Richard, Guy, and Charlotte. That was a great discussion, and I think we all have a lot more ideas about how we can improve transparency reporting going forward. So yeah, thank you all for joining us for today's Future of Speech Online. We'll be back tomorrow for our final day, with sessions on lessons beyond social media and on whether transparency can be mandated by law, looking at that question from legal systems around the world. You can register for the event or join us on the live stream at CDT.org. We hope to see you again.