>> CAPTIONER: Standing by for event to begin.

>> EMMA LLANSÓ: Good morning, good afternoon, good evening wherever you might be, and welcome to the fifth annual Future of Speech Online event. My name is Emma Llansó, I'm the director of the Free Expression Project at the Center for Democracy & Technology, and I will be your MC for this year's event. We have a great three days of programming planned for you this year, all focused around the theme of making transparency meaningful.

As we grapple with difficult questions of online content governance and accountability, everything from fighting disinformation and hate speech online, to combating terrorist abuse of online services while respecting human rights, to enabling individuals to make choices about the sites and services that they use, it is very common to reach for transparency as a key tool in the policy toolbox. But as with any big idea in policy, the more people talk about it, the less clear it can be exactly what we are talking about. So over the next three days we will hear from a variety of different speakers and experts on different aspects of transparency, and hear some different ideas about what different forms of transparency can really help us achieve in terms of empowering people and holding online services and governments accountable for how they affect people's human rights.

Now, before we dive in, just a few housekeeping notes. This event is being livestreamed and recorded and will be accessible via CDT's YouTube page. We will also be dropping links to the YouTube livestream in the chat. Anyone who has questions for the organizers, trouble with any of the Zoom features, or any other technical issues can contact events@CDT.org. And during our panel discussions, folks are welcome to submit questions via Zoom's Q&A feature, by email to questions@cdt.org, or on Twitter using the event hashtag.

For today's agenda we have a lot of great speakers for you. Our first panel, "What do we mean by transparency?", will run to just past the hour. Then we will have a very short break as we change over our speakers and move on to our second session of the day, empowering users through transparency. That will run to the very end of our second hour of programming today. That second panel will be right here at the same link that you used to join the first session, so if you're planning to attend both, just sit tight when the first panel concludes and the second will start shortly.

And finally, I would really like to give a big thank you to our partners for this event, the Charles Koch Institute. CKI has sponsored Future of Speech Online ever since our first event in 2017, and CDT is really grateful for all of the support and partnership from CKI over the years. And now I am delighted to introduce today's opening speaker, Neil Chilson, who is the Senior Research Fellow for Technology and Innovation at the Charles Koch Institute. Neil, take it away.

>> NEIL CHILSON: Thanks so much, Emma. It's great to be back here at FOSO. First, a little bit about the Charles Koch Institute.
We partner with social entrepreneurs like CDT to explore how to break down barriers so that each and every individual can realize their unique potential. We do this through grantmaking, convening, education, and events like today's. In pursuit of that mission, we are thrilled to continue our partnership with the Center for Democracy and Technology and to support this year's Future of Speech Online. As Emma mentioned, the theme this year is workable transparency, and this topic could not be more timely.

Emma mentioned that everybody is talking about transparency when it comes to speech online. Why is talk about transparency so appealing? I think politically we can easily understand the appeal. One of the big challenges with the policies surrounding speech online is that people fundamentally cannot agree on what they want the companies to do. In many cases it comes down to more fancy versions of leave up speech that I like and take down speech that I don't like. And in a nation that is focused on polarization, these likes and dislikes can often conflict. In that kind of political impasse, transparency appears to offer a path to doing something, because regardless of what the companies do, many people agree that the companies should at least be transparent about what they do. And we have seen legislative proposals in that vein.

But as Emma mentioned, and as I think the many panels and experts at this event are going to illustrate, like all things in speech, transparency is complex. There are significant practical challenges with transparency. What kinds of transparency? What information should be shared? Who should it be shared with? And which companies should practice it? I believe these and many other questions stem from a fundamental challenge that applies any time an outsider seeks to understand and govern a system.

Using transparency to improve our understanding of social media platforms, with the goal of improving the governance of those platforms, is a version of what Yale anthropologist James C. Scott called legibility. The inner workings of a complex system are illegible, they are unreadable, they are nontransparent to people who are trying to govern that system, so they engage in efforts to make the system legible, to make it transparent. And the effects of such efforts differ greatly depending on the type of system. Now I would like to offer two different types of systems that we should ask whether social media platforms resemble, and those are forests and factories.

Starting with factories: in the early 20th century, Frederick Taylor sought to transform how factories were operated. He wanted to impose legibility; he wanted to make the smallest activities within the factory transparent to management, who could then measure and adjust those activities to serve the single purpose of the factory, which is the maximization of efficient output. This scientific approach to management, which came to be known as Taylorism, resulted in factories with strict goals, routines, and measurements.
And these measures improved transparency and often did improve productivity significantly, although Taylorism had its limits, particularly in that it treated humans as essentially machines doing very repetitive tasks, and it struggled to keep up in more creative or dynamic industries. Still, transparency within the factory did tend to advance the limited, the single, purpose of the factory.

Now, let's compare the factory to the forest. Back to James C. Scott. He tells the story of early German forestry. In the late 18th century, the Prussian and Saxon kingdoms viewed their wild, complex forests with a very narrow lens. They saw them as revenue producers through timber. But the forests of course had many different purposes, including to the common people, who used the mix of many different trees and the abundant non-timber plant and animal life for medicine, hunting, even firewood. But these many different trees, these complex wild forests, were largely nontransparent to the rulers. They were illegible, to use Scott's language.

And to increase the transparency and the legibility of the forest, the rulers started with surveys to help them understand the number of lumber-producing trees, and then they formed complex formulas and tables to estimate the constant lumber yield of the forest. This was a very scientific approach to forestry, and it was groundbreaking at the time. But with this focus on a single purpose, the production of lumber, the next logical step was to shape the forest in the direction that served that single purpose. So now that they could measure lumber output, forestry management turned into moving the forest toward that purpose. And the end result, which came just a few decades later, was a forest transformed from a wild, diverse ecosystem to a very orderly, aesthetically pleasing monoculture of lumber-producing trees that served the rulers' purposes well: it was easy to measure, it was easy to harvest for timber. But it fundamentally eliminated many of the other uses of the forest by others, such as the resources for the common people that I already mentioned: medicine, hunting, and firewood.

In the short term, this rationalized forest greatly increased lumber yield. It served the purposes of the kingdom. But ironically, in the longer run, due to the ecosystem disruption, the monoculture forest actually was more vulnerable to disease, and lumber production dropped significantly. And German forestry since then has been a series of patches and approaches to try to accommodate the goals of lumber production while keeping the benefits of a dynamic, complex ecosystem.

Social media platforms have parts that look like forests and parts that look like factories, and I think to make transparency workable we will have to identify those parts that are like factories and those that are like forests. My thesis, and I think we will be exploring this throughout the rest of the agenda, is that increasing transparency for the factory-like portions of social media will be straightforward, although not necessarily simple.
And it will tend to have fewer unintended side effects, because those systems can be understood better. But increasing transparency of the forest-like portions, the portions that serve many different purposes, will be very difficult. And if transparency is successful in that space for the purposes of the regulators, it may neglect or potentially degrade the usefulness of the platforms for other people's purposes. And as I already mentioned, it is not even clear that regulators have a single purpose in this. So these forest-like portions, I think, are going to be the real challenge in the area of social media governance, and where transparency will face a lot of difficulties.

So I think the program that is coming up is extremely well-positioned to dig deep into this really difficult challenge, and I am very glad, and the Charles Koch Institute is very glad, to support CDT in this effort. I'm really glad that you as the audience are here to join us in this effort toward workable transparency. Thank you very much.

>> EMMA LLANSÓ: Great, thank you so much, Neil. Those are some great metaphors for us to be thinking about over the discussions that we have in the next few days. Now we are going to begin our first panel discussion on that key question of what do we mean by transparency. So my speakers for panel one are welcome to turn on their video and come join us on the virtual stage.

Now, as a little bit of background on this question of what we mean by transparency: at CDT we have long been advocates for various forms of transparency, whether it is engaging with companies to ask them to provide regular information and reporting about the demands that they get from governments around the world for access to user data or for restricting content online. We helped to found the Global Network Initiative back in 2008, 2009, as a way to actually begin enabling a multistakeholder group to conduct assessments, drawing on information, including very private, confidential information from companies, to really try to evaluate what impact they were having on users' rights worldwide. And we've also done things like work with other nonprofit organizations to launch the Santa Clara Principles on transparency and accountability in content moderation, including a revision of the principles that was just launched at the Internet Governance Forum last week. Transparency is a really big issue. It is close to our heart at CDT, and it is also increasingly part of the public policy discussions around online content governance. We see proposals in the United States, proposals in the European Union, and laws being passed and put into practice in countries around the world. And as we think about what is going to happen with transparency in the tech policy field, at CDT we think it is important to really try to be clear about what we mean when we talk about that. So today we have actually published a framework for policymakers with the same title as this event series, Making Transparency Meaningful, that covers four key areas that we see in transparency policy.
That includes transparency reports, that sort of regular reporting out from companies about their practices and their policies and how they respond to government demands in particular. Things like user-focused notice: what are the content policies, the terms and conditions, the information that users get when their content or accounts are restricted; what can they actually know about what is happening to their speech on different platforms. A third category of transparency is assessments or audits or other analysis of company practice. And the fourth topic, and one that's getting a lot of attention lately, is the whole question of researcher access to data: how do you enable researchers to gain access to data held by social media companies and other kinds of tech companies so that they can perform independent research on what is going on on these services.

We will share a link in the chat to the framework that we have laid out, but our goal there is to really articulate the state of play for some of these different areas, highlight key questions that each of these kinds of transparency is aimed to answer, and help policymakers consider the trade-offs for each of these issues. If you answer a question in a certain way, what are really the pros and cons of that, and the consequences?

So in all of the conversations that we are having during this event we are hopefully going to dig into a lot of those key questions. What sorts of transparency are we talking about? What problems are we trying to solve? Who is our audience for this kind of transparency? And many more. This can all sound abstract, but I promise you we will get concrete, and to help me do that for our first panel we are joined by four very distinguished speakers, who I will introduce right now before turning the floor over to them.

First we will be hearing from Rebecca MacKinnon, who is the Vice President for Global Advocacy at the Wikimedia Foundation. Rebecca has had a storied career in human rights and Internet policy advocacy, having launched many initiatives focused on users' rights and corporate accountability, including being a cofounder of the Global Network Initiative and of the citizen media network Global Voices, and also being the founding director of the Ranking Digital Rights project.

Kara Frederick, our second speaker, is a research fellow in technology policy at the Heritage Foundation. Kara's research focuses on Big Tech and emerging technology policy, and among her many previous roles she helped create and lead Facebook's global security counterterrorism analysis program.

Then we will hear from Dunstan Allison-Hope, a vice president at Business for Social Responsibility, where he has worked for nearly 18 years on issues of human rights and corporate responsibility, including conducting human rights impact assessments with major tech companies, helping to develop standards for reporting in different industries, and helping to facilitate the launch of the Global Network Initiative.
And finally, we have Rebekah Tromble, who is Associate Professor in the School of Media and Public Affairs at George Washington University, where she is also the director of the Institute for Data, Democracy and Politics. In addition to leading research projects, including investigating how to measure the quality of online news content and the health of online conversations, Rebekah has also been deeply involved in the policy discussions around how to better enable independent researchers to have access to data held by private companies. So as you can see, we have a wealth of expertise here to help us unpack all of these different kinds of transparency.

First I think I will turn to Rebecca MacKinnon. Rebecca, as we have been laying out so far in the session, transparency is a really hot topic in 2021, and it looks like it will continue to be in 2022, but you have been advocating for increased transparency from technology companies for many, many years. Can you tell us a little bit about how this issue of transparency in the tech sector has developed over the years?

>> REBECCA MACKINNON: Thanks so much, Emma. I am thrilled to be here with this particular group of panelists today. I have been calling for transparency in various roles since 2005, working with activists and researchers on this question, and the discussion about transparency that I was involved with initially was really spurred by debates and reactions to the entry of a number of U.S. tech companies into China, one of the most repressive Internet environments in the world. In 2004, 2005, Yahoo came under fire for having handed over email information about dissidents and journalists to Chinese authorities, resulting in people being jailed. Google was under fire for going into China and agreeing to censor its search engine and remove search results. And Microsoft was also being criticized for deleting bloggers' content, bloggers' writings, on a blogging platform they were offering in China at the time, in response to phone calls from authorities.

So there were a number of congressional hearings and there was a lot of discourse around this at the time: should these companies be in China, and if so, under what circumstances? And you mentioned the Global Network Initiative; there were a number of conversations that started around that time about what core principles companies operating in challenging environments like China need to adhere to in order to respect and protect their users' rights. And that of course led to a broader conversation, not just about China but about the fact that a lot of governments are making demands of companies to either remove content or hand over user data in ways that violate people's human rights to freedom of expression and privacy, and what are companies' obligations? And a very core element that everyone agreed on was transparency. People need to understand who is manipulating their information environment, who is manipulating what they can and cannot know, what they can and cannot say.
Under what authority, for what reason. Otherwise you don't know what you don't know. In the case of search results being removed in China, if you don't know that some results have been removed at government request, then you don't even realize you are being censored, and that is the most insidious thing. Similarly, if you don't know what authorities have access to your data and under what circumstances, you can't protect yourself; you cannot make informed decisions about how to use the service. So the point being, there was of course the debate about whether companies should be in environments like China at all, but if they are, how do you make sure that your users understand what risks they face when using your services, so that they can make informed decisions? And also understand how their information environment is being manipulated, whether it is by government censorship or by other actors, so that they again can understand whether they have a full picture or not, so that they can, if they want, take action to find information elsewhere, so they are not living in a bubble without even realizing it.

So that was kind of the basic principle: that in order for people to have agency and exercise their rights, they need to understand how power is being exercised over their information and over access to their data and information about them.

So as the discussions that led to the formation of the Global Network Initiative proceeded, one of the core principles was transparency and the obligation of companies, the need to commit to be as transparent with users as possible, not only about government demands but also about their processes for dealing with various external demands that would affect users' privacy and freedom of expression. So what that led to: in 2010, Google released its first transparency report, with actual data. Before that we were not actually calling for transparency reports because we hadn't conceived of the thing. Google created the thing known as the transparency report, and that created the standard in terms of data about government demands and so on.

And I went back and looked at some things I wrote about 10 years ago. As of 2012 -- starting in 2010 Google released its first transparency report and started doing them on a regular basis -- by 2012, Twitter, LinkedIn, Dropbox, and the Internet service provider Sonic.net were the only other companies issuing transparency reports. And Wikimedia actually started issuing a transparency report around 2012, 2013, starting with 2012 data. So those were the early days. And many of us were calling on other companies to issue transparency reports. You can come up with a list of some of the big-name companies we were bothering about their need to do this, and there was a lot of pushback: it is not clear whether it is useful, or serves a real purpose, or whatever.
And then in 2013 the Snowden revelations happened, and all of a sudden a lot of companies started issuing transparency reports about government demands for user data, because there was a trust deficit there, because of the lack of transparency about government surveillance, particularly through U.S. companies.

But advocates kept pushing for companies to do more than just report data related to privacy and surveillance: to be more transparent about content takedowns, not just in response to government demands but in response to other actors who want content taken down, whether it is copyright holders or people suing for libel, et cetera, et cetera, and also for transparency about content moderation. That was pretty slow going. Ranking Digital Rights issued its first Corporate Accountability Index in 2015, and there was one indicator that looked at whether companies published any data about the volume and nature of content restricted to enforce their terms of service. It was flatlined, zero, nobody reported anything.

Then Cambridge Analytica happened in 2016, and lo and behold, people realized there was a real trust deficit about how content was being manipulated online, not just directly by governments but by various actors, and suddenly we started seeing more transparency reporting around content, and we had the Santa Clara Principles released, which also helped move the ball there. We also started to see more and more concern about algorithmic amplification and about targeted advertising, and how both of those factors are manipulating people's information environments so that they don't understand the picture they are receiving, who is manipulating it, toward what end, and under what authority, so there was a push for more transparency there. The last Ranking Digital Rights index, when I was there, was looking for transparency both in targeted advertising and advertising practices generally, as well as greater transparency around how algorithms were recommending and amplifying content. And Twitter sort of came out on top in the last index in terms of being more open with the public about various factors that influence what people see.

But there is a very long way to go. And the point being, not that transparency is the be-all and end-all, but again: if you don't know who has access to your data, for what purposes, and how it is going to be used, and if you don't have clarity about how your information environment is being manipulated, what you know, why you are learning about it, whose agenda it is advancing, how can you control any of that? What choices do you have? If you don't have choice, if you can't hold abuse accountable because you don't know enough about what is going on, then we are not going to have freedom or self-determination as human beings. So that, in a nutshell, is 2005 to the present, and I will hand it over to the rest to talk more about the future.
>> EMMA LLANSÓ: Thank you so much, Rebecca. It's really helpful to get that whole perspective on the evolution of this, and it is something where I think that history will be familiar to some of the folks in attendance, but I think it is really not that familiar to the much wider group of people who are now talking about transparency and looking at things like transparency reports and just expecting them as an industry standard, because they have become that. It's really interesting to hear how we didn't even know to ask for transparency reports until the idea and the model was invented, and I think it's a great reminder, too, that we should be trying to be as inventive as we can as we think about what else we want in transparency, or what other kinds of information we want. But now I would like to pull our other speakers into the conversation. I will go next to Kara. You have worked on national security and counterterrorism in a variety of capacities, including for the Department of Defense and Facebook, and I would love to know: when you think about tech company transparency, what is the kind of information that you are looking for, and what are the kinds of problems that you are looking to solve?

>> KARA FREDERICK: Thanks. I think Rebecca's rundown was absolutely critical, because I think we need to remind ourselves of the model. Why are we asking these questions in the first place? Why are we even here in the first place? And that is because of companies operating in China and their draconian practices and their attempts to impinge upon America, our culture of free speech, how we build these platforms with that culture ingrained initially, and then we go and try and are all of a sudden confronted with a completely different culture surrounding some of our norms, et cetera. So adhering to that model, remembering that model, being reminded of that model, I think, is the best thing that could come out of this panel today.

But I do want to play off of Rebecca's comments as well. So yes, clearly I think American private companies should establish principles; the one principle to coalesce around, the transparency that Rebecca highlighted: perfect. But I think that American companies should also now start to get extremely concrete. They should establish a rule set to establish this transparency when dealing with authoritarian countries. We all know about the problems with free expression in authoritarian countries, and even in what we would consider global swing states, those states with elements of authoritarian government and some elements of democratic government as well; I would say Brazil is one of these, India, those sorts of places. But now you look at encroachment on freedom of expression occurring outside of authoritarian countries because of these authoritarian values being impressed upon our private companies. And I would say YouTube is an example of this, when they took down videos of testimonies from Xinjiang from a human rights channel, and Zoom acquiescing to Beijing last year when it came to the account of a Chinese human rights activist.
Bing censoring images, and Bing is Microsoft-owned, this month -- sorry, not this month but back in the summer. TikTok censoring Hong Kong protests, Tiananmen Square, et cetera. I don't know if anyone watched the testimony of the TikTok representative sitting there in front of Congress, but he basically said TikTok does not censor now. I think that tells you all you need to know about some of the reporting that was done about TikTok's practices, especially before February 2019, when we were unsure if they shared U.S. user data with China or stored it there, although they now profess to store it only in the U.S. and Singapore. There's a lot of good open-source detective work being done to pick apart those proclamations by TikTok as well, so stay tuned for more on that from others. And then you have things like Disney+ leaving that Simpsons episode out in Hong Kong. So not only do we see these platforms, when they are operating outside of U.S. borders, acquiescing to China's rules of the road, but now we are seeing it on American platforms, American privately owned companies.

So I think we need to develop a framework to form a foundation for this rule set for private companies to implement. I think this can be helped by the interagency, the State Department, the Commerce Department; they have done a lot of work on the use of technologies and how these technologies are used for ill in the world, so they can have a voice, given some of the know-how they have developed over the past multiple years, in dealing with some of this issue.

So I think as a starting point this framework should account for -- actually, I think everyone understands that. So I think there's also something to it when it comes to other platforms like TikTok. TikTok is owned by ByteDance, a China-based company; their parent company is headquartered in China. So if these specific companies want to enter into the U.S. marketplace, and they are already here, then I think we need to retrofit some protections. We should have a rule set on transparency for foreign-owned companies like that seeking entry into the U.S. market, in particular when it comes to these digital platforms that operate in the information environment, because a company carries the atmosphere of its home jurisdiction. We know that the relationship between the private sector and the CCP in China is very different than the relationship between the government and the private sector here. So I think that needs to fit into the calculus of these frameworks and these rule sets.

Such a rule set should also encompass the type of data these applications collect: is it collecting biometric data, behavioral data? What is the size of the American user base of this platform that wants to operate in America? What does the scale look like, and what is the overall security of this platform? So, making sure that we account for a changing information environment that really does depend on the legal atmosphere of where these companies exist and come from.
I think that is important. And you have to be able to articulate that. You can't say just because it comes from China we don't want it; you have to break it down under a specific set of criteria, and I think that makes sense.

So in terms of transparency, what are we looking for? Details on law enforcement requests from officials: yes, that clearly makes sense. Rebecca alluded to this as well; especially requests for data and information from authoritarian regimes, huge. When Airbnb gives up user data to an authoritarian regime, or decides to expand knowing it will be giving up user data to an authoritarian regime, the public should know about that. I think also, from a more domestic free expression standpoint, I would like to see more details in the voluntary transparency reports that these companies are publicizing -- not all of them do, in fact. So they report the law enforcement request numbers and data, they talk about their enforcement of community standards, et cetera, but I would like to see more on their content moderation practices, their behaviors. What does this actually look like? Did they make mistakes? Why? What have they done to rectify those mistakes, especially when it comes to political speech? I think reporting on their methodology and their decisions, reporting to enforcement mechanisms within the United States government itself with a public availability component, is one idea. I think it is huge. Again, the public should know; as Rebecca said, the public should be able to hold these tech companies accountable for their behavior, and they should be able to control not only their experience but also their data on these platforms. So that accountability factor that Rebecca highlighted, the user control issue, I think that is huge as well.

And then finally, Rebecca scooped me here, but a lot of people talk about algorithmic transparency, and this can be different than the proprietary design of the algorithms, the lifeblood of these companies and how they make money, essentially. Instead, let's report maybe on how these algorithms operate and how they impact and affect users. Some more transparency on that in particular, I think, would go a long way in helping to hold these tech companies accountable, and keeping them frankly accountable to the people and the markets that they serve.

>> EMMA LLANSÓ: Thank you so much, Kara. That was a fantastic overview of a lot of the conversations going on in transparency, especially looking at the role of regulation in transparency and what that might actually be able to accomplish. I hope we get to dig more into that during this conversation, but we will definitely be talking about it in a few other panels this week, so if folks are interested in really diving into that conversation, I hope you'll join us for those. Now I want to continue our survey of the big general categories that we might mean when we are talking about transparency. I want to turn next to Dunstan.
So Dunstan, your work at BSR covers a whole range of independent assessments, including human rights impact assessments and other ways of facilitating regular review of, and reporting about, company practices. Can you give us a sense of the current state of play in reporting standards in the tech industry and how this fits into the broader transparency conversation?

>> DUNSTAN ALLISON-HOPE: Thank you, Emma. Thank you first of all for the invitation to speak on a topic that is very close to my heart. It is close to my heart because long before I found myself in the technology and human rights field, I participated in the world of sustainability disclosure standards, much of which predates today's interest in transparency in the tech industry.

There's an organization called the Global Reporting Initiative, GRI, a slightly different acronym, which has become one of the main standard-setting organizations for how companies in all industries report on their social and environmental impacts. And that turns 25 next year, which makes it one year older than Google, and not as old as Neil's factories and forests, which have been around for over a century. Now, much more recently, I have been involved in working with technology companies to undertake the human rights assessments that you mentioned, Emma. And here this question of whether to publish the assessment, and how much of the assessment to publish, has become a very hot topic. I have probably been involved in something like 75 or 100 different human rights assessments of different shapes and sizes, on different topics, at technology companies, and I would say today only about a dozen or so of those are public in some shape or form.

So you've got these two different worlds: sustainability reporting standards, and human rights assessments being undertaken at technology companies. But these are two worlds about to come to a head, and I would like to apologize for the excessive jargon in what follows, but I am going to run with it because there are some things happening that I think set important context.

So first of all, the UN Guiding Principles on Business and Human Rights, which many of you will be very familiar with, and for others may be new, but which is the core document for the world of business and human rights, has recently been integrated into the Global Reporting Initiative standards that I mentioned. We are also expecting something very similar with mandatory European Union social and environmental disclosure standards for all companies doing business in Europe. So there will be a European standard; all American companies doing business in Europe will have to comply with these emerging European standards, which will incorporate, I am assuming, the UN Guiding Principles on Business and Human Rights. So that means things like disclosing a company's human rights impacts, disclosing how those human rights impacts have been identified, and disclosing how they are being addressed. So there is a lot of relevant activity going on in the European Union, and in international standards, that we should think about.
And there is another organization, I told you there would be a bit of jargon, the Sustainability Accounting Standards Board, which has a lot of influence in the accounting world, and for those of you who are interested in the accounting world, there's a lot going on today in how to create new nonfinancial reporting standards for companies. They have a workstream on content governance and content governance disclosures by social media companies.

So we have these big developments around standard-setting that are not unique to the technology industry. They apply to all companies of all sizes, in all industries and all locations. How the technology industry responds to those developments, I think, is an open question. And in fact the uptake of the standards that I have described, on a voluntary basis, is generally lower in the technology industry than it is in other industries, and we can perhaps have a discussion on that.

So the good news is there's a lot of good work underway to increase transparency from technology companies. But for this discussion I think there are three questions that will be useful to think about. First of all, what disclosure from companies is useful and will enable all of us to effectively evaluate what approaches companies are taking to respecting human rights? As I mentioned, there's a lot of focus on the human rights assessment and making the human rights assessment public. For me that is just one part of a much bigger picture and a much bigger discussion we need to have. The second is, how do we apply reporting best practices to disclosures so that reporting is reliable, accurate, and comparable across companies and over time? Rebecca described how we didn't know to ask for a transparency report until one was created. Well, how do we make sure that they are created in a way that adheres to best practices in disclosure that predate the tech and human rights dialogue? And then third, how do we make sure that the emerging standards and regulations that are coming our way result in meaningful disclosures and are not reduced to checklist-based approaches? When I work with companies and talk about these emerging standards, these emerging regulations, they will say this is a shame, because this becomes a checklist approach; this becomes something we just do because we have to. And I always think there has got to be a way to do both: how do we have something that is required, in such a way that you get meaningful disclosure across all companies that is comparable, standardized, something that we can all expect from every company, but that is also entrepreneurial, that evolves as time goes by as we create reports, so the disclosures are useful for those of us who are reading them?

So that is an attempt to sketch the broader standard-setting world. I haven't gotten into topics like access to data for researchers, which I know we are going to get into in a moment. There is a lot in that transparency box.
My key message is that I think the technology industry should be thinking about the future of transparency partly in the context of these broader standard-setting processes that are underway, and thinking about how they should be applied in this industry.

>> EMMA LLANSÓ: Thank you so much, Dunstan. I love those framing questions that you have shared, in part because I think they immediately start drawing our attention to the tensions here. The desire, on the one hand, for transparency and information coming out of companies to be comparable across different companies and over time may be in tension with the need for things to be flexible and iterative, and all of those great jargon words from the tech industry of wanting to continually reinvent products and services and approaches to things, partly because that can really help improve things or create innovation, create new ways of tackling problems, but also potentially making it so that every report or every bit of information coming out of a company exists in an untethered netherworld of "here is a data point" and you don't have anything else to compare it to. So I would love to dig into that, and I also want to remind our attendees that you are very welcome to ask questions. We will have a little bit of time at the end for your questions, so just use the Q&A, send an email, or tweet at CDT's staff, and they will be able to feature questions.

Now I want to bring in our last speaker, Rebekah Tromble. Rebekah, the issue of researcher access to data has taken on even greater prominence, it feels like, in just the past few months since the Frances Haugen leaks, looking at this whole question of what is the kind of information that tech companies have, that they do research on internally, and that outside researchers are kind of desperate to get their hands on. And then just today the Internal Market Committee of the European Parliament voted out a draft of the Digital Services Act, Europe's big revision to its intermediary liability law, that actually includes a researcher access to data provision. So legislation on this topic is coming soon, if not in the U.S. then definitely in Europe. Can you give us an overview of some of the key challenges and roadblocks that have been coming up in this question of facilitating researcher access to data?

>> REBEKAH TROMBLE: Absolutely. And as all of my predecessors on the panel have said, first and foremost, thank you so much for having me today. It is a real pleasure to be in conversation with all of you here. There are all sorts of exciting developments happening both in Europe and the U.S., and I should let those in the audience know, who aren't aware already, that I have actually been doing the vast majority of my work on this question of researcher data -- researcher access to data, excuse me -- in Europe first, because what we have seen is that most of the initial movement has actually been coming from the European Union in this area, as well as on questions about platform transparency much more broadly. Europe has been the first mover in this space.
Much of that work has been with a working group on access to platform data for the purposes of social scientific research that is developing a code of conduct under the GDPR. And this points to what I think is probably the first and most fundamental tension: the relationship between transparency and data access on one hand, and on the other hand the recognition of users' rights to privacy and users' rights in general. I think the key takeaway when we are considering this tension is that yes, indeed, there is a tension, but it is not one that is impossible to overcome. So I want to walk through a few key considerations, things we have been discussing in our working group, with the NSF, and in conversations with policymakers in the U.S. as well, that we need to keep in mind as we are trying to work out the logistics of access to data for researchers.

Let me say at the outset that I think it is quite ironic that the platforms use this argument about user privacy quite effectively as a shield: that they conveniently take the most privacy-protecting position only when transparency is the effort in question. So when we hear these arguments coming from the platforms about user privacy in these debates, we need to keep that in mind, while at the same time recognizing, from our position as those who seek transparency, that we really do have to take user privacy seriously. It has to be at the forefront of our considerations.

So I think to achieve that core goal, the starting point needs to be a commitment to data minimization. As we are laying out the sorts of research questions that we want to investigate, we have to make a commitment to using as little data as is needed to appropriately answer those questions. We can't go into the research with this sort of grand vision that we want to get our hands on everything. That simply isn't respectful of users' privacy and users' rights. Ultimately, we want to anonymize data when possible, but I will say that when we are dealing with big data, which in most cases we are, anonymization simply isn't feasible, because in practice, even when we take the directly personally identifiable markers and features off of the data, that data can frequently still be reidentified, still be traced back to individual users. So for most practical purposes anonymization is not feasible, and I think we need to move beyond anonymization as the be-all and end-all goal when we are talking about user privacy.

To add to that, we have to keep in mind that some of the most important and pressing questions of the day require the analysis not just of PII but of sensitive information and sensitive data. We can't answer the most pressing questions that are affecting our democratic societies today without looking at some of that sensitive data.
So what that means is that, on top of making a commitment to data minimization, we also need to make a commitment to achieving informed consent whenever possible, and finally to implementing a series of procedures that ensure that the researchers who are getting access to the data are doing so under very high standards of ethical practice and of security. So in some instances this means using things like data clean rooms, where researchers actually cannot take the data points away themselves, cannot download them. They have to analyze the data in a sanitized environment and effectively be monitored themselves, and face liability if they misuse the data and violate users' rights in any way. So that is one part of this.

Two more, very quickly; I will run through these much faster, but I want to add them to the list. The second is this big question about how we handle the governance of this: how we identify what are the appropriate research questions that should be interrogated, what are the appropriate datasets, how those datasets are delivered, and how enforcement of any potential violations on either side is handled. We are looking right now, should the Coons-Portman bill that was just introduced in the Senate move forward, at a sort of hybrid relationship proposed in it between the NSF and the FTC. I think this is a really good starting place for conversation. But there are even bigger questions that we need to take on there, including things like: how do we improve the ethics review of research plans from the researchers themselves? For academics, IRBs are in place, but if we want researchers outside of academia to have access to this data, IRBs cannot be the fallback, plus we know that IRBs are not actually ideal. So there is a lot of work to do there in order to make that much more robust.

Finally, I want to highlight that at the end of all of this, if we are thinking about transparency being the final goal, we actually have to be able to trust the data that we are receiving access to from the platforms. There is a real trust deficit between the platforms and most other outside actors, so simply believing them, taking it at face value when they supply data to researchers, is, I think, going to be a real problem. So at the end of this process, in order to complete the circle and to make transparency work, I think we ultimately need what I refer to as data pipeline audits. In very simple terms, that combines three types of approaches to understanding what data has been provided. It includes traditional interview-based auditing mechanisms, it includes code review, and crucially it includes the ability for researchers or auditors to inject synthetic data into the data pipelines and review the outputs.
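To make that synthetic data injection idea a bit more concrete, here is a minimal sketch of what that third component of a data pipeline audit might look like in code. Everything in it is an assumption for illustration: the field names and the run_export function standing in for a platform's export pipeline are hypothetical, and the checks shown are the general idea Rebekah describes, planting auditor-controlled records and verifying they come out the other end intact, not any platform's actual audit tooling.

```python
# Hypothetical sketch of the synthetic-data-injection step of a data pipeline
# audit. An auditor plants records they fully control, runs the platform's
# export pipeline over them, and checks the exported output for completeness
# and fidelity. All names and fields here are illustrative assumptions.
import uuid

def make_synthetic_records(n):
    """Create clearly labeled fake posts that the auditor controls end to end."""
    return [
        {"post_id": f"synthetic-{uuid.uuid4()}",
         "text": f"audit marker {i}",
         "is_synthetic": True}
        for i in range(n)
    ]

def audit_export(run_export, source_records, synthetic_records):
    """Mix synthetic records into the source data, run the export pipeline,
    and report any synthetic records that come back missing or altered."""
    exported = run_export(source_records + synthetic_records)
    exported_by_id = {r["post_id"]: r for r in exported}
    missing = [r for r in synthetic_records
               if r["post_id"] not in exported_by_id]
    altered = [r for r in synthetic_records
               if r["post_id"] in exported_by_id
               and exported_by_id[r["post_id"]].get("text") != r["text"]]
    return {"missing": missing, "altered": altered}
```

Because the auditor knows exactly what went in, any synthetic record that comes out missing, altered, or silently filtered is direct evidence about how the pipeline treats data, without having to take the platform's word for it.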
So to sum it all up, we are making a ton of progress, and I am particularly excited to see where things are going in Europe right now. But we still have a lot of difficult questions, particularly around that area of governance, that we need to address in order to do this right, to achieve both transparency and protect users' rights in the process.

>> EMMA LLANSÓ: Thank you, Rebekah, that was a really excellent introduction to the topic, and to all of our panelists, thank you so much for giving us these wonderful overviews that really help underscore the point that there are a lot of different questions to be asking when we talk about wanting more transparency from technology companies. We are running a little bit short on time, so I'm actually going to hop us right to final statements, final comments, from our panelists, so we can make sure we get to the next panel on time. But I would like to hear from everybody, and we can just go right through in the same order we took for opening comments. Since it's the end of the year, everyone is looking forward to 2022, after hopefully getting to take some nice vacation at the end of the year. What are you looking for next in transparency? What are the things you have your eye on happening in 2022? Rebecca, to you first.

>> REBECCA MACKINNON: Thanks. Speaking from my current role as Vice President for Global Advocacy at the Wikimedia Foundation: more and more, particularly in Europe but also elsewhere, there are both regulations that expect transparency and stakeholder expectations for transparency. One priority of mine and of my colleagues is to make sure we don't forget that the Internet and Internet platforms are not only comprised of commercial entities with centralized content moderation and data collection and use models. We need to make sure that we think about the vision for the Internet we want to have, in which citizens can create, build, operate, and govern their own open knowledge projects and online communities, and that the right to do so is protected, and that we are not requiring forms of transparency that only work for Facebook or Twitter or Google. We need to ground our expectations around transparency in the ultimate purpose, which is human rights, which is really the freedom and self-determination of people and communities around the world.

So for example, transparency around content moderation looks one way when you are thinking about a centralized platform that employs content moderators. When you have a community model of setting rules for different language versions and different projects, different knowledge contexts, with community enforcement, where you can already go on the website and see every single edit that was ever made and every single debate about whether a piece of content goes up or down, your expectations around transparency, or the requirements, might be different. And there is also balancing the need for transparency around who is speaking online with privacy.
12:58:32 And security 12:58:36 of members of vulnerable communities who face reprisals 12:58:40 for participating in online discourse, which brings to the 12:58:44 data minimization principal for platforms generally, not just 12:58:50 for researchers, which is transparency is fine and good but let's be careful about 12:58:56 what we are collecting in the first place and why. And also 12:59:00 just then what we are sharing 12:59:04 online with whom. And the best framework to ensure 12:59:09 that transparency remained grounded in human rights 12:59:12 is not just for companies to have clear commitments to human rights principles 12:59:16 and how they 12:59:20 think about their accountability to the public. But also researchers 12:59:23 and nonprofit communities 12:59:27 that we are designing our activities and our 12:59:30 work in accordance 12:59:33 with human rights principles. 12:59:38 So that is maybe longer than it should've been picked but 12:59:41 that is my final word. >> : A lot to look forward 12:59:44 to in 2020. Cara, same question. >> : Rebecca scooped me 12:59:47 yet again but I was going to focus on the fact that I hope for 12:59:53 2020 we don't have to convene groups like us and talk about the need for transference because I hope that more 12:59:58 companies and as Rebecca number two said 13:00:04 researchers the data minimization factor but mostly the privacy 13:00:07 by design factor. So I hope that these companies will explore in earnest and I know 13:00:10 some are right now 13:00:14 and government related laboratories are as well, but I want those options for privacy 13:00:18 [indiscernible] and enhancing technologies 13:00:21 to be pervasive across the industry. Hopefully it will be industry-standard 13:00:26 to conduct differential privacy 13:00:29 federated models of machine learning there is different Internet protocols that can be used as well to protect 13:00:34 not just user data but how we access data 13:00:37 and what that looks like, privacy preserving computation on sensitive private data 13:00:42 , fantastic. So in 2022 I hope that that is the name 13:00:45 of the game, privacy by design, privacy preserving 13:00:49 technologies and that we don't necessarily have to keep him from the 13:00:52 rooftops about transference because we will build it at the outset. 13:00:56 >> : I love to hear it. 13:00:59 Yes. Dunstan, how about you? >> : Thanks, Emma. Two 13:01:02 points and both I think an appeal 13:01:05 to see where tech fits in the broader, wider context 13:01:09 . The first point is the transparency about the 13:01:12 impact of technology is not just about transparency by technology companies. 13:01:17 What transparency do we expect from companies and other industries that are 13:01:20 deploying technology? What about transparency for financial services companies? Retailers 13:01:26 , healthcare companies? There all deploying technology making really important decisions when they do so 13:01:29 but we've got some he focus on 13:01:32 transference from tech companies I think we risk taking the eye off the 13:01:35 ball with non-tech companies. The second point is there is this broader trend around 13:01:40 environmental social governance disclosures. 13:01:44 And a lot of momentum and investment in what they should be. I fear a world where 13:01:48 those of us in the human rights field are not have enough influence 13:01:52 on environmental social governance disclosures because we are not participating enough in that world. 
13:01:55 So how do we make sure 13:01:59 that the emerging frameworks for ESD -- ESG disclosure are consistent 13:02:03 with a human rights-based approach? 13:02:06 >> : Opening even more 13:02:10 frontiers for all of us to be working on and focused on in the coming year. Finally, Rebekah 13:02:13 Tromble 13:02:15 , over to you. >> : Thanks so much. I am 13:02:18 going to be looking in the next year 13:02:23 at fee regulatory steps that are taken first in Europe 13:02:27 and second in the U.S. I think what we have seen out of the European Parliament today 13:02:32 is that there is definitely going to be a commitment and we can expect 13:02:37 by early 2023 for the DSA to be an act and to enforce 13:02:41 and to be actually having some impacts in the world of transparency. 13:02:47 The U.S. is clearly going to be further behind 13:02:50 and I desperately want privacy by design 13:02:54 exactly as Cara is a describing. I haven't seen a lot of movement 13:02:58 that 13:03:01 leaves me terribly encouraged that we are moving in the right direction. 13:03:05 But ultimately what I hope that the transparency measures that we can't get in place will achieve 13:03:09 beyond 2022 and 13:03:12 2023 I as a researcher looking 13:03:16 at online harms to users 13:03:21 that I am out of a job, that I no longer have 13:03:24 this area of inquiry because we have privacy by design 13:03:28 , because we have the platforms 13:03:33 and the basis of research and on the basis of transparent measures 13:03:36 doing things to protect their users from the very beginning of the design process 13:03:43 and I have very little to research and to investigate from there on out. So 13:03:46 that is my dream. That I work myself out of a job. 13:03:47 >> : Showing a key 13:03:50 distinction may be between academic researchers and lawyers. 13:03:55 That would be a wonderful future. 13:03:59 Thank you all so much 13:04:03 for this really excellent overview. We really appreciate it and 13:04:06 for all of our attendees now we are going to take a very short one or two minute break 13:04:12 our speakers and welcome our next panelist to the virtual space 13:04:16 . So please sit tight and join us for the 13:04:19 second panel user centric approaches to transparency which will be starting soon. 13:04:25 [ 13:04:29 break 13:04:33 ] 13:06:06 >> : 13:06:09 Today's panel is about how transparency can help users empower them 13:06:15 and taking control of their experiences online especially users from 13:06:18 communities who face increased errors and challenges when they are participating online 13:06:25 of things like abuse, harassment, and disinformation. 13:06:29 We have four wonderful speakers today that I will introduce shortly but I just 13:06:32 wanted to tell the audience that we are taking questions for the end of the panel 13:06:36 so if you have any questions please feel free to use the Q&A function 13:06:39 in Zoom or send us questions via email 13:06:44 at questions@CDT.org. So onto our panelists. 13:06:48 Our first panelist is 13:06:51 Matt Bailey, he has penned America's digital freedom Program Director focusing on issues ranging from 13:06:56 surveillance and 13:06:59 disinformation to digital inclusion that affect 13:07:02 airless and writers around the world. Match previously served as a Senior Advisor for the national Democratic 13:07:07 Institute as a civil servant in the office of the United States chief information 13:07:12 officer and as a first director of technology innovation for the city of Washington, DC. 
13:07:15 Next we have 13:07:19 Nora Benavides was a civil rights attorney and she works at 13:07:23 the intersection of law tech and 13:07:26 democracy. She serves as a senior counsel and director for 13:07:29 digital justice and civil rights and free press where 13:07:33 she leads the organizations efforts to protect against digital threats to 13:07:36 democracy and to push for media and platform accountability. 13:07:39 And Nora previously served as the director of Penn America 13:07:45 [indiscernible] next we have Daphne Keller, Stephanie's work focuses on platform regulation and 13:07:48 Internet users rights. 13:07:52 She has testified before legislatures, courts, and regulatory bodies around 13:07:55 the world and she has published both academically and in the popular press 13:08:00 on topics including platform content moderation practices, constitutional and human rights 13:08:03 law, copyright data protection, and many other issues. 13:08:08 Her recent work focuses on legal protections for users of free expression rights in state 13:08:11 and 13:08:14 private power intersect particular through platforms and enforcement of terms 13:08:18 of service or use of algorithmic thinking and recommendation. 13:08:21 And last, but not least, we have Dr. Nicol Turner Lee who 13:08:24 is a senior fellow in government meant studies and the director of the Center for technology innovation 13:08:29 and she serves as co-editor in chief of tech tank 13:08:34 . Dr. Turnley researches public 13:08:38 [indiscernible] echoed with access to technology across the U.S. and 13:08:41 to harness its power to create change in communities across 13:08:44 the world. She has a forthcoming book on the U.S. digital divide that is entitled 13:08:46 digitally invisible, how the Internet is creating the new underclass. 13:08:52 I am really excited to talk with each of our speakers today about 13:08:55 empowering users to transparency and I would like to start first with Daphne 13:09:02 who has written extensively about tech company transparency and also advise companies and lawmakers on tech 13:09:08 transference and. So Daphne, can you set the stage 13:09:11 for us a little bit by describing 13:09:15 how users specifically benefit from transparency and the kind of transparency that is 13:09:17 most helpful to users as opposed to maybe experts or regulators? 13:09:20 >> : Sure and I think 13:09:23 it's really useful that you are gray nearly focusing on users that way because as we heard in the last panel 13:09:28 we are in this moment of political opportunity were civil rules for transparency are going to get 13:09:35 finalized and to the people who are writing them don't necessarily know what they want 13:09:39 or for what purpose so being granular and saying here is what users want and here's what researchers want et cetera 13:09:44 I think it's really important right now. 13:09:48 So for users the simplest thing is a consumer protection interest. 13:09:51 When you sign up for a platform what are the rules. 13:09:54 If you have had an action taken against her content 13:09:58 do you get notified and you get an exultation of why that happened? Those are 13:10:01 all are most 13:10:04 valuable in a world with competition in 13:10:09 other platforms to switch to if you don't like the rules. 13:10:12 In the platform you're currently signing up for. But in any case 13:10:17 even if you are trapped within one platform you want to know what the rules are. 
13:10:22 Secondly, users specifically 13:10:25 need to get the information that protects them from abuse by people who influence platforms. 13:10:29 So if your government 13:10:34 told Facebook to take down your post as Palestinian activists that happen 13:10:37 to them in Israel should know, you should get a notification 13:10:41 that specifies that is what happened or if your commercial rival 13:10:45 was the source of a notice, which happens 13:10:48 all the time, or your political rival, knowing he requested and on what basis 13:10:51 is the predicate for being able to defend your rights if you are being abused. 13:10:54 Then there is user interest 13:10:59 that need interpreters. 13:11:04 So if we care as users we care about 13:11:07 patterns of dissemination of misinformation may be 13:11:11 targeting vulnerable communities. You need a big 13:11:14 data set to spot that. No one user is going to see it from the notices that they are going to use. 13:11:18 To see. So that is a disclosure to researchers 13:11:21 or to 13:11:25 governance is in the users interest. Similarly 13:11:28 when content moderation is done by algorithm and it's not an action against the specific piece of content 13:11:32 that affects everyone whose content was demoted or promoted 13:11:37 or it affects what we are seeing 13:11:40 but you can't spot that without a big data set or without disclosures [indiscernible] 13:11:44 algorithmic details that again no one user 13:11:47 is uniquely affected by. 13:11:53 I do want to flag there is a tension 13:11:56 about transparency about 13:12:00 how speech rules are enforced in that on the one hand I think 13:12:03 we want a diversity of platforms 13:12:07 with a diversity of speech rules and to be valuable those differences in speech rules are going to be pretty 13:12:13 granular and nuanced. What does Facebook is to sexual versus what doesn't read it 13:12:19 or medium think is to sexual or to violent? Do you prohibit 13:12:23 offensive slurs or racial epithets completely or do you 13:12:29 permit them in an artistic context like [indiscernible] or historical context like educational images 13:12:34 of Nazi propaganda. Things get really granular in ways 13:12:37 that are important but which also make it very hard to do apples to apples comparison if you are 13:12:41 asking 13:12:44 how many posted this platform removed for violence or for nudity or for sexuality 13:12:45 compared to another one. 13:12:52 So as we ask for the kind of transparency that users and 13:12:55 the general public and lawmakers want I think we need to be really careful to keep in mind 13:12:59 that trade-off 13:13:03 and my preference is to avoid that trade-off, to say we really 13:13:06 do value having this diversity of speech rules even if it means 13:13:10 we can't get perfect insight into exactly how the enforcement compares across platforms. 13:13:13 >> : Thanks, Daphne pricked 13:13:16 thanks for that overview and a little bit about the trade-offs, too. I want to turn from our other payments 13:13:21 to hear from them about how different communities are effective 13:13:25 when they engage online. So Nora, starting with 13:13:28 you your work at free press has really highlighted the way that communities of color 13:13:31 and also non-English speaking communities have been targeted by disinformation online. 13:13:38 Can you describe what you found about 13:13:39 targeted online disinformation and some examples of how disinformation impacts those communities 13:13:42 specifically? >> : Sure. 
13:13:47 First, it is great to be here with all of you. This is a great discussion 13:13:52 and I think it centering users as Daphne said at the top is really important 13:13:55 as a distinction from a -- some of 13:13:58 the other ways we might think about transparency into the need for. I 13:14:01 guess 13:14:06 at the risk of going to deepen the weeds I think it 13:14:09 is often useful for people to remember some of the most concrete examples 13:14:16 where maligned operatives and content online has very real-world effects. 13:14:20 Not to do a history lesson of any sort into finding the problem 13:14:25 had not seen him but partly because it can feel like an abstract concept when we are not really aware 13:14:26 of who's being harmed and how. 13:14:32 So some of the smattering of examples 13:14:35 are both newer and older 13:14:39 but one is I think it's important to know a little bit about what is happening regarding COVID. 13:14:45 And what free press has been looking at for some time 13:14:48 is the ways that Spanish-language and non-English content have very specific 13:14:51 effects 13:14:56 for communities of color and for immigrant and non-English speaking community spirit for example, 13:15:00 research from the markup has shown that 13:15:04 communities of color, black, let 13:15:07 next, Native American, Middle Eastern communities have been shown and given 13:15:11 by Facebook less credible information from the World Health Organization about the COVID vaccine. 13:15:14 13:15:17 An interesting finding, which I think speaks to the 13:15:20 upstream promise we will get into today. Another example that many of us are aware of is the way that 13:15:26 Facebook searched discriminatory ads to users in violation of the fair housing act which prompted 13:15:31 HUD a couple years ago to bring a lawsuit 13:15:34 . At that time documents showed that Facebook 13:15:37 was allowing advertisers to target users based on location and other identity markers 13:15:41 , which really could be 13:15:44 processes for protected categories and resulted in a black users sing less or no ads 13:15:48 for certain housing. 13:15:52 Facebook has continued to run these types of ads and even a year plus after the lawsuit settled 13:15:57 those practices continued. With respect to 13:16:00 election issues we have seen that there are very specific ways 13:16:05 that let next and Native American populations 13:16:08 in various states on the West Coast around Arizona, New Mexico, and others with 13:16:15 certain types of content that specifically played on distrust and fears around immigration 13:16:19 and rates and immigration status to 13:16:22 imply that they were going to be people at election and pulling locations with guns 13:16:27 , that there would be Homeland security 13:16:30 officials waiting to pick people up. Irrespective of people status. But really playing on the ways that people 13:16:32 might actually send thoughts come out and vote. 13:16:37 So across these, 13:16:40 I don't want to take too much time into scrubbing more examples, but from COVID to some of the 13:16:43 ways that the systemic and third-party 13:16:46 issues 13:16:50 that play on identity and the way identity has been crafted by air platforms 13:16:54 to them the electoral democracy related content, 13:16:57 all of this I think speaks to a few things 13:17:00 . 
One is we still don't actually have enough information about the harms and the 13:17:03 impact Caitlin 13:17:07 it to your question and it's partly why think transparency is the 13:17:10 first step in meaningful reform picture can't be everything but part of what we 13:17:13 need are two different types of transparency. And I want to be 13:17:16 really clear in the ways that free press has 13:17:20 been defining this. One is dated transparency around what are the kinds of ways that 13:17:25 data brokers are gathering incredibly private and sensitive information about people and users. 13:17:28 And then 13:17:31 what are the other ways that we might call for transparency about the impact of business models by 13:17:37 platforms. So in thinking about how specific communities are harmed I think we often gloss over 13:17:43 what the solutions can be and just saying groups are harmed and 13:17:47 there is that. And I just want to elevate here that when communities are effected 13:17:52 there is no monolith for them. It is not that black 13:17:55 voters are hard in the same way, it is that these are 13:17:59 hyper local specific examples that play on our identities and on our online 13:18:02 behaviors and what I think Air Force that needs to be in reform is really understanding and 13:18:05 wrapping her arms around what 13:18:08 the problem looks like so that whatever reforms come after transparency 13:18:12 we have a better understanding of how we employ regulatory policy and other types of platform 13:18:17 reforms and changes. >> : Thanks, nor. 13:18:23 Those are some really disturbing examples and I know that unfortunately could fill a whole panel with more 13:18:28 examples so thank you for Lang that out for us. Max, I want to turn you to you because earlier this 13:18:31 year Penn America established a fantastic 13:18:36 report called no excuse for abuse and in that report you 13:18:39 read about how writers and journalists and especially those identified 13:18:43 as women, as people of color, 13:18:46 as members of the LGBTQ community are members of religious or ethnic communities 13:18:50 all of those groups face increased online abuse and harassment. 13:18:53 So can you tell us what your research has stone 13:18:57 about the harassment that these righteous face and how it impacts them and the 13:19:00 beers that they face in trying to prevent or respond to a? 13:19:02 >> : Thank you. I want to 13:19:05 underscore the report 13:19:08 that Nora made is that our data is wildly incomplete 13:19:12 and because the issues are both endemic ADL did a good study that showed 13:19:19 that roughly half of Americans report having experienced online abuse 13:19:22 so that is what I mean by endemic but there are also intersectional which means 13:19:28 if you are an American but you are a journalist and you are a woman of color or gender nonconforming 13:19:34 individual the rates of which you are expressing these issues sore dramatically. 13:19:39 See PJ found that black women 13:19:46 journalists experienced roughly 34 percent increased rate of up on right 13:19:49 harassment or abuse relative to white women you compare that relative to the general public. 
13:19:53 13:19:56 So it is both dramatically more frequent on an intersectional grounds but it is also something 13:20:03 where the texture of the abuse, the ways that is 13:20:06 being targeted I think this and again calls back to Northpoint are specific and 13:20:09 they also are rapidly evolving 13:20:13 in the context of the platforms in the context of our political environment. 13:20:18 In our direct interactions 13:20:22 with newsrooms and activists, with community leaders we are finding if all of this is true 13:20:26 and the more that you interact with community leaders or with journalists or any been on the front lines the 13:20:29 more you find 13:20:34 that things that they are experiencing are not well reflected either in the aggregate data 13:20:38 or in the general purchase we are taking to try to create transparency or try to 13:20:38 create accountability on the platforms. 13:20:43 So far I think most of our discussion has been largely U.S. 13:20:47 focused but just to Zoom out you think about issues of content 13:20:50 moderation and cultural competency in places like 13:20:55 let me pick a couple that I have done some work personally 13:20:58 North Macedonia or in Uganda 13:21:02 , these are places where not only is the data even 13:21:06 more geometrically less complete but the content 13:21:09 moderation that is taking place 13:21:14 is less targeted as well whether that is AI or human it led moderation. 13:21:18 So there is a call for just a need for foundational data, what the heck is going on 13:21:23 for the approaches to retrieving that data 13:21:28 to be respective of that intersectional nature of the abuse 13:21:32 and also I think 13:21:37 to worry about not just [indiscernible] statistical data but also rich qualitative and testimonial 13:21:41 data from individuals who are experiencing its. 13:21:44 >> : Thanks, Matt. I agree 13:21:48 about that point of intersectionality is so important and I really appreciate you highlighting that 13:21:51 for our conversation. Nicole, I would like to take it to you and if you could tell us a little bit about 13:21:58 your research focusing on algorithmic bias and specifically 13:22:01 how biases in the algorithmic targeting a recommendation of content 13:22:04 including ads and things like that impact communities of color online. 13:22:09 >> : Thank you, Caitlin. I am glad to be here with distinguished colleagues and friends. 13:22:14 At the end of the day I think what you hear from all of the panelists 13:22:17 is that discrimination and bias is normative and his particular effect on communities of colors 13:22:22 and on vulnerable populations who are already experiencing 13:22:26 the type of systemic inequalities that technology has just worsened. 
13:22:29 So a couple things I would just like to do to respond your question 13:22:33 to level set is we have to understand that this is normative now and 13:22:36 we need to get into the stage of coming up with some pragmatic solutions that allow us 13:22:40 to stop the type of impacts 13:22:45 that we are executing both intended and unintended or uncertain populations [indiscernible] having 13:22:48 this conversation 13:22:52 when I say that bias is normative I actually mean that because what we are finding 13:22:57 is that computers do not discriminate, people do pure people bring their 13:23:00 values and norms and [indiscernible] to the table into my research 13:23:04 that is what I am arguing that these computer models just don't show up and all 13:23:07 of a son engage in explicit and implicit biases, they actually come from a place 13:23:12 where the designers have particular assumptions and profiles of particular 13:23:16 subjects that they are looking at. With that being the case 13:23:20 that is where we start to see Caitlin what you're trying to ask is the civil and human rights arms 13:23:25 because technology not only can directly identify someone 13:23:29 by the information and data trust that people actually behind but they also have the ability to use 13:23:34 inferential data, online proxy that 13:23:38 amass and hide obscure quote unquote the characteristics of 13:23:43 certain populations [indiscernible] tells us that, she tells us that people with 13:23:48 [indiscernible] Hispanic sounding names tend to get offered credit cards 13:23:51 at higher interest and predatory we are finding that out in some recent 13:23:55 work they will 13:23:58 [indiscernible] when you move away from what you think are the 13:24:02 proxies that defined affinity groups of color 13:24:06 but really they offer a greater precision of discrimination 13:24:09 because you can actually exclude those groups. Without being the 13:24:12 case what did you do about that? Obviously as an advocate 13:24:15 and a scholar we have to figure out ways to peel that back 13:24:19 and part of the challenge of peeling that back is that we haven't 13:24:23 had a discussion and this is where my research comes into play, 13:24:26 you look at bias as being defined as 13:24:30 what similar situated people places and objects receive either differential treatment 13:24:34 so I like black dresses and I'm okay with 13:24:38 absence and be black dresses but I don't want disparate impact 13:24:43 with that black dress serves as a marker as my data trail to identify me as a black woman who then receives 13:24:47 predatory ads or 13:24:52 find myself being surveilled and profiled as a black woman. 13:24:55 That is where the challenge is when we look at my work on bias 13:24:58 because first and foremost we don't have enough people like me at the table who are actually working on 13:25:03 these issues and who have lived 13:25:06 experiences of the groups of which the technologies are 13:25:09 prioritized and tech slice that's the first thing. The second thing is 13:25:13 [indiscernible] because we live in an opaque technological sphere 13:25:17 were many of these algorithms live underground and they are not identifiable. 13:25:21 It is hard to prove disparate impact because two people are never the same when they are 13:25:25 engaging and our data trails due to the 13:25:30 essence of privacy legislation are just open season and people can could collect 13:25:33 whatever they want about his. 
The third thing I would say to answer your question the challenge 13:25:38 defined biases we don't have a civil rights regime that is agile 13:25:41 enough for us to actually go ahead and do due process around civil rights and human rights 13:25:44 violations. What experiences I 13:25:47 have have not yet triggered the ability to say 13:25:53 that there are some civil rights violation. Nora talked about the Facebook case with regard to HUD 13:25:57 there was some remediation but we 13:26:00 need to go back and look at what public accommodations law really looks like 13:26:04 in the online space. It is not the [indiscernible] they were dealing with 13:26:07 here folks. We are dealing with something that's much more opaque, below the surface, 13:26:11 that continues to persistent 13:26:15 aesthetic quality effects on populations that we talked about today. 13:26:21 So as a sociologist the research that I've tried to do is a social 13:26:24 technical, it moves us away from the technical cadence are looking at the technologies in the space of a 13:26:28 lab and it is action like many of us on this call can 13:26:32 conceptualize in it and providing [indiscernible] implications of how we should see 13:26:37 these technologies actually behave in the real world 13:26:40 and what the end attendant consequences are. I will 13:26:43 talk later about some additional pragmatic solutions but I have to tell you I'm so glad 13:26:46 we're having this conversation we had to get to the point where we actually make this better 13:26:50 because not only is the reputation [indiscernible] 13:26:54 but there is also unintended harm 13:26:58 that forecloses and I think we have heard it so far 13:27:00 on the economic and political or social opportunities that are available to certain segments of 13:27:05 society. >> : Thanks, Nicole. I 13:27:08 think many of the panelists here are working on some of those pragmatic solutions 13:27:14 which can include transparency, at least as a four step as nursing. I would 13:27:17 like to turn the conversation that way and Nora back to 13:27:21 you. I want to talk about the disinformation defensively policy 13:27:25 platform that Free Press recently launched along with other digital and civil rights groups. Can you tell us 13:27:30 what the disinformation link is 13:27:33 and what its policy platform is and the role that transparency 13:27:35 plays in it when it's talking about combating online disinformation? 13:27:38 >> : Of course. And 13:27:41 because it is a mouthful we can absolutely shorten this 13:27:44 to DDL pair but for the record that is the [indiscernible] of 13:27:50 defense leak it's a network of over 200 organizations 13:27:54 that are either led by or focused on centering communities of color 13:27:59 in the fight against disinformation. 13:28:02 So many of these are community-based organizations 13:28:06 that I have I think in their daily work as grassroots organizations 13:28:09 very much felt and seen what the local 13:28:10 contextual effects are of disinformation. 13:28:17 Whether that is in Arizona, in Texas, in Pennsylvania, 13:28:21 hyper, hyper local narratives that these individuals and organizers started picking up on and they were 13:28:25 like weight this is very clearly affecting the work we are doing 13:28:30 in and ask area, over there away from text policy 13:28:33 , away from frankly anything we are talking about today. 
13:28:39 So these organizations knew that Blacks, let next, Asian 13:28:42 American, other communities of color were being affected by an 13:28:46 oftentimes the liberally targeted by disinformation. 13:28:50 So they hope in air creating DDL policy platform which Free Press led as a process 13:28:55 was to take a step back and say if there is a very 13:29:01 wide world of solutions we should consider inviting disinformation one of 13:29:04 them must be policy and regulatory Reform. It doesn't need to be the only and I think there is so much 13:29:08 strong debunking media literacy 13:29:14 . The kind of defense of work organizing, 13:29:18 local trusted networks which I fully endorse and we could have it all of 13:29:19 the discussion about those. 13:29:22 But in trying to advance a policy agenda what we sought to do was 13:29:25 really elevate and Nicole 13:29:29 I am just so happy you are here. I love 13:29:32 that you describe the cadence of this and how we change that because that is what 13:29:35 we are really trying to do with the DDL policy platform. 13:29:39 Is spring in this civil rights lens where we say 13:29:42 we are often focused on the [indiscernible] what does it mean to center the victims 13:29:47 or the targets, the users online. And that is part of why I was excited about this 13:29:50 discussion is thinking how can we talk about 13:29:55 I don't know data portability, the agency that an individual could have to move their data across platforms 13:29:58 and to know how 13:30:00 they are being targeted our how their identity is been crafted. 13:30:05 So we spent many months consulting with partners across 13:30:08 various organizations 13:30:11 whether access now, cybersecurity for democracy, fight for the future 13:30:16 , I am trying to name everyone, New Georgia Project, 13:30:20 I'm not going to get everyone. We had over three dozen organizations supporting the platform 13:30:25 and we really are trained to do a few things. I will keep it very brief because I know we are short on time. 13:30:30 But one is we are trying to really elevate 13:30:34 that oftentimes communities of color get left out of policy debates on this. 13:30:38 And the effect of 13:30:42 and consequence I think it is to Nicole's point, 13:30:46 that the normative discrimination here means we actually have to find more innovative ways of talking about 13:30:49 the problem and pivoting quickly so that groups that are left out 13:30:55 and traditionally left out of these debates actually feel empowered to come to 13:30:58 Congress, to come to an FTC open meeting, and talk about how their local 13:30:59 grassroots organization is harmed. 13:31:06 So we are trying to get people that kind of confidence, the talking points, and there are 13:31:10 three big buckets we are pushing in this platform. One are policies that would elevate 13:31:15 privacy and civil rights. And I am happy to talk about that in more detail. 13:31:19 Second is the kind of regulatory avenues we need 13:31:23 so how can the White House, how can the FTC, DOJ, other 13:31:26 dilatory agency be using 13:31:30 [indiscernible] the FTC is one example has existing authority to take on 13:31:35 [indiscernible] and other investigations into extractive data practices 13:31:38 which I think would be a precursor to 13:31:40 getting more transparency into these issues. 
13:31:43 And the third piece is a kind of international focus and I'm 13:31:47 glad that this has not been brought up a couple times by Matt, Nicole, 13:31:50 I feel like we need to make sure 13:31:53 we are situating this away from a U.S. centric solutions conversation 13:31:57 knowing that the problem is global and frankly I think it is racist. 13:32:01 The way that we do not see investment in non-English content on social media platforms, 13:32:06 the way that we do not talk about as an ecosystem 13:32:09 of experts 13:32:13 the global elections that are happening, we can't only think of this as the lead up to 13:32:16 the 22 13:32:19 midterms in the U.S. There's dozens of elections happening next year so the 13:32:24 way that this platform elevates that global context we are also pushing for the UN to take on 13:32:28 a set of solutions and give a roadmap to state governments 13:32:33 so that we are able to provide is not just a kind of piecemeal 13:32:36 or sort of ad hoc solution framework. We are really trying to create a space 13:32:42 where we say people need to be centered in the conversation, they need 13:32:45 to center communities of color, we need to know more about how our date 13:32:49 is being used, and we need to see that this is a global scope. 13:32:52 >> : Met, in the no excuse 13:32:55 for abuse report 13:32:59 10 American late-summer conditions to platforms around user design and interfaces that you recommended 13:33:05 to decrease harassment and abuse. Can you walk us through 13:33:07 some of those recommendations and specifically the ones that are transparency focused? 13:33:11 >> : 13:33:16 I will pull a Noor here in terms of trying to 13:33:19 call out the framing of the conversation here. It is so interesting 13:33:22 to put the word empowerment and they were to transparency together 13:33:25 because traditionally transparency goes with the word accountability. 13:33:30 The question is sort of who is holding him accountable, 13:33:34 who is that feedback loop between the transparency come the data, or the information 13:33:37 is being made available and whose voice is being provided as a feedback loop into 13:33:41 the systems that are producing the data. 13:33:44 And I guess this calls back to Daphne's original point about consumer protection or consumer 13:33:47 empowerment. A lot 13:33:50 of what we are trying to do in no excuse for abuse is to center the conversation directly on 13:33:54 journalists, particularly journalists of color or journalists 13:34:01 with other identities that are exposed to greater risk of harm online 13:34:04 and say what would be empowering for those individuals both in the moment 13:34:09 as terrible things are coming at them, and also in 13:34:10 raising their voices to provide that feedback into the system. 13:34:13 13:34:17 So walking through each of the individual recommendations except to say 13:34:20 they are almost entirely focused on what we 13:34:23 think of product improvements or user experience UX improvements. 13:34:29 Things that anybody who has experienced these patterns of abuse or accordingly to online attacks 13:34:33 are recognized as useful. Things like how about a panic button if all of a sudden 13:34:37 the shit hits the fan for you and you haven't army of trolls and bots coming at 13:34:40 you that allows you to essentially lock 13:34:43 it down, make it stop, preserve the evidence 13:34:47 and figure out what you want to do next. Better tools for delegating account access. 
13:34:53 Other than just sharing your password so that when that same phenomenon is going on 13:34:56 to have the ability to calling your friends, your colleagues, your family numbers to help content with what is going on. 13:35:01 Better forensic preservation of data and approaches 13:35:05 to splitting 13:35:07 the difference between getting the information off-line versus preserving what you need to take further action 13:35:10 on going forward. Spam filter like 13:35:13 approaches to sifting and sorting through 13:35:18 the sludge that a lot of people are dealing with 13:35:22 -- the word sludge is 13:35:25 [indiscernible] the traumatic and violative content that people are encountering and tilling with day in 13:35:28 and day out online. Not so much 13:35:33 to censor over holistically remove the content but to enable individuals to 13:35:36 have control of 13:35:38 when and how they encounter it and what amounts they encounter. 13:35:45 And the other thing we have in there and I think this 13:35:49 ties back to that overall question of transparency and accountability 13:35:54 is a suite of recommendations that are related to distant centralizing or creating 13:35:59 a more thoughtful pause before people engage in what might be abusive or violative 13:36:02 conduct online. 13:36:06 So something like two thirds of the report is focused on 13:36:10 those tools for empowering and arming and raising the forces of targets of abuse 13:36:13 and the back third is focused on creating 13:36:16 moments 13:36:20 of pause before people click the Tweet button 13:36:24 or click the amplify button 13:36:28 on online vitriol and I think this harkens back 13:36:31 -- I think we are all agreeing with each other in very 13:36:34 elaborate and [indiscernible] ways on this panel but one of the other panelist is mentioning this idea of 13:36:42 connecting peoples interactions online 13:36:45 and what the rules of the road are entreating some transparency there. So 13:36:48 that is really a lot of what we are focused on is if you go today and you try to 13:36:51 understand what are the rules on Twitter 13:36:54 there's a separate website, not an app, not in the app, not a 13:36:57 second section in the outcome of an actually 13:37:01 separate website you have to go and somehow find. I find myself 13:37:04 either digging through my terrible mess of bookmarks are having to literally Google for every time. 13:37:09 The Facebook community standards are the same thing. How do you 13:37:13 find it and how do you relate that directly to your experience in the platform 13:37:17 ? So we provide a number of recommendations for servicing what the rules of the road are 13:37:22 as well as providing nudges or design friction 13:37:26 for people as they interact when they get into spaces where they might be bumping up 13:37:29 against those rules and they may want to think twice either on moral ethical 13:37:33 and [indiscernible] 13:37:36 not to harm people online or because of the consequences they could face as they engage in 13:37:39 trolling and abuse. >> : Thanks, Matt. 13:37:45 I am [indiscernible] to hear you say you Google Twitter rose from the standards. 13:37:49 It's such an important point for 13:37:52 user decide if users can't find the rules easily they are not at their fingertips there going to be less 13:37:55 inclined to follow them. 
Nicole, 13:37:59 speaking of consumer protection 13:38:03 I know that policymakers around the world are focusing on how the algorithmic transparency 13:38:08 and I am wondering from your perspective what do you think the best approach is forgiving 13:38:12 users more transparency about the use of algorithms online. 13:38:17 Algorithms are complicated and how can we let 13:38:21 regular non-technical people know about the risks that 13:38:24 they impose and devices that they can perpetuate? 13:38:27 >> : I love that question because I often think and some people might find that it's offensive that 13:38:30 they were transparency is often overrated 13:38:34 and I will tell you why. At the end of the day policymakers 13:38:38 are really concerned about the predicted decisions 13:38:42 [indiscernible] that come out of these models. The type of 13:38:46 continuums of harm whether it is voter suppression and exploitation to whether or not they should 13:38:51 [indiscernible] misidentify people because of their skin color, all of that is really 13:38:56 where policymakers are trying to identify and mitigate and we say transparency 13:39:00 I would suggest that we don't have the research expertise within the 13:39:02 circles to be able to do that in a formative way. 13:39:06 So I will just start there. But the question is 13:39:09 how you consumers actually recognize that these biases are happening, that is tough because 13:39:13 one consumers actually trust 13:39:16 in the society that we live in today, which does not often although 13:39:22 the last five or 10 years has actually 13:39:24 increased [indiscernible] discrimination and outright live violence [indiscernible] but we expect 13:39:27 that the heirs code to contact when it comes to human rights 13:39:31 so I see one of the ways we exit have to start in terms of giving consumers 13:39:34 a leg up is to set this 13:39:37 floor around civil rights protect companies should not be given the permission 13:39:41 to break things and apologize later. Permission 13:39:44 [indiscernible] allow for that but we have to move away from 13:39:48 permission to forgiveness so that we follow the kits of civil rights laws 13:39:51 for people who already litigated and mitigated those 13:39:54 consequences. I will stick 13:39:58 to that is something that I too believe and it's a project we are 13:40:01 working at at Brookings and will put some stuff out on this but you have to really get 13:40:04 how the civil rights a loss apply to the 13:40:07 new digital economy. With that being said 13:40:10 there's some new research that I'm working on that ironic to share 13:40:14 which is how to give agencies back to consumers in this technical space? 13:40:17 One thing I've been writing about and will come out soon and a new 13:40:20 book is around and generally star rating and I came up with that 13:40:24 because I went to buy dishwasher two years ago and I went into a 13:40:27 big box store and I look for the yellow sticker that basically showed me 13:40:31 how much water the dishwasher was going to consume, how much electricity 13:40:35 the dishwasher was going to consume, how durable it was going to be 13:40:38 . In doing further research I found that this is a collaboration between 13:40:42 a variety of federal agencies that have come 13:40:45 up with some standards. Not necessarily enforcement standards but standards 13:40:51 of what consumers should expect when purchasing appliances. 
I think we need to do something like that 13:40:56 with algorithms perkier seen that done in Germany end of the places where we 13:40:59 are seeing these labeling systems but the argument that I'm making is that not 13:41:02 so much are just the way that this format looks but 13:41:06 what is behind it. First and foremost we need as everybody has said technical cadence that 13:41:09 is fair. 13:41:13 We need to get beyond the elusive [indiscernible] these models bring together [indiscernible] workgroups 13:41:20 that actually puts out those sensitive use cases where we know there's going to be a consequence 13:41:23 and we need to come up with ways to have better 13:41:27 data quality. Right now many of these models do not work because 13:41:30 the data quality is over represents people of color 13:41:34 or other fundable populations in the Cuba community or it under represents an. And that has an 13:41:37 impact 13:41:40 because that is not divorced from our society. So adding that technical 13:41:43 cadence in this framework I have been working on I think it's really important clue that data quality. 13:41:48 It's also important that policymakers commit on the side of 13:41:51 what are the predictions coming out of these models and where should 13:41:54 they find ways to legislate or regulate. In my view civil rights is 13:41:58 one but what about disclosure? Most consumers do not 13:42:01 know that decisions are being 13:42:04 made about them using algorithms. There is not any type of 13:42:08 redefined line that says this decision for you to buy a car because you think your credit is actually looks like 13:42:14 this, this is an algorithm, and we need to be very transparent that people have algorithms that are 13:42:17 positioning 13:42:21 as exposing information to them around polling data, et cetera 13:42:24 , because I think most of us don't understand what is in front of the [indiscernible]. I think what is most 13:42:32 important [indiscernible] we need consumer feedback loops. How do people come in and appeal the decision 13:42:36 when a technology fails to optimize 13:42:40 their physical characteristics or their social circumstances quick I will give you great example. 13:42:44 I belong on many panels in a conflict talk about facial recognition 13:42:47 not recognize my face when I change my hair, 13:42:50 what's lighting him and. First you need to disclose that it's not going to be optimize for me 13:42:56 but second you need to hear that from me and you need to hear that 13:42:59 before it hits market and become something that Miss Identifies me 13:43:03 , puts me in a prison or law enforcement 13:43:06 office like the man in Detroit if you much ago. I sit there for six hours because the technology 13:43:12 offers a great image and it's not very well in terms of its accuracy. 13:43:16 We need to have these embedded consumer feedback loops so 13:43:19 consumers can also be part of the agency 13:43:22 in making these algorithms much more accurate 13:43:26 and in some cases we may need policymakers 13:43:29 to work alongside civil society organizations as has been suggested to ensure that we are getting into help 13:43:34 housing education employment financial services 13:43:38 , again, that have been litigated and mitigated for 13:43:42 fairness that we are ensuring that these models are not [indiscernible] 13:43:45 our power [indiscernible] and essentially making its products 13:43:48 versus producers in this economy. 
As you can tell 13:43:52 I am like a black Baptist preacher, this is really important to me. 13:43:55 Because I really think that people don't understand what is behind the veil. 13:43:58 We have been sitting home for two years trying to figure out how to shop online 13:44:01 , how to see our doctors online and that has been 13:44:06 been -- debt has been collected we have opened by Farson for more information used against us in ways 13:44:12 that serve our market, our preferences, but also allow an entryway 13:44:16 as Nora has suggested from misinformation and disinformation. 13:44:20 Until we get this right and there is reputational 13:44:23 risk for companies that generate bad algorithms I cannot put that label on their product 13:44:27 and there is consumer harm 13:44:30 that comes when we foreclose on opportunities for people to have first-class 13:44:35 citizenship we will be back your next are talking about the same problem 13:44:39 and the four of us, five of us, will keep that same report. We really need 13:44:43 some type of conversation 13:44:47 from Congress, civil society, technology, sociologist, philosophers 13:45:01 >> : Speaking of Congress we 13:45:04 have seen so many proposals coming out of Congress around transparency 13:45:07 and we have also seen them from other policymakers around the world 13:45:11 and not just in United States. I am wondering, 13:45:15 Daphne, you speak to lawmakers and policymakers. Are they aware of the types of harms 13:45:19 facing different user communities that we have been talking about today and are they 13:45:22 thinking about the role that transparency can play in helping combat those times? 13:45:26 >> : I would say only very 13:45:32 superficially. They are not deep in their thinking about what data we want 13:45:36 [indiscernible] as I talked about before but they are also not deep in their thinking about 13:45:40 who is being harmed and how 13:45:44 within specific communities. It is the same problem 13:45:47 on Capitol Hill or inside of tech companies. You don't know what you don't know. 13:45:52 This is why you need diverse teams within platforms content moderation to reduce your blind spots 13:45:58 . As Nicole points out it's also why you need diversity -- I don't want to put 13:46:03 words in your mouth but the engineers billing the algorithm that depends 13:46:07 , depends -- it's important you they are [indiscernible] spotting. 13:46:15 On Capitol Hill and in tech companies talking about transparency 13:46:19 what voices are or are not heard really shapes outcomes and not that many voices get heard 13:46:29 . The governments question about getting information to platform so they can do a better 13:46:33 job has a lot of analogs to 13:46:36 the questions we've had about real governance on Capitol Hill or in state 13:46:39 legislatures or in agencies all along and how to get better information in front of them 13:46:42 from a greater diversity perspective. 13:46:49 What is hardest to spot I think 13:46:54 [indiscernible] least represented groups and a conspicuous example 13:46:57 is language. This is one of the things that I think was most important from 13:47:02 Francis Huggins revelations is how speakers of nonmajor market linkages 13:47:08 not just are getting worse content moderation by human moderators but also how 13:47:13 totally different algorithms because the data set 13:47:16 to train the algorithms isn't big enough to do it right in Chechen 13:47:20 or [indiscernible] or smaller 13:47:23 language groups. 
13:47:26 So how do you correct for that? Harness the power of people who can see it, 13:47:30 bring in the local NGOs who can see the fine grains 13:47:37 , really concrete problems that are happening, and explain them. You will never have enough 13:47:41 tech employees inside the companies or staffers inside of governments 13:47:45 to have that knowledge themselves so you need a channel 13:47:50 to reach out of those capillaries of the system of who is 13:47:53 seeing the more fine-grained stuff. 13:47:56 Here is where maybe there is a 13:48:00 bright spot that Internet technologies are a [indiscernible] Nora talked about 13:48:03 bringing more local groups into the FTC hearings and so forth. 13:48:06 That used to mean flying to Washington, DC. 13:48:11 More and more we have ways to interact in ways 13:48:15 that are faster and cheaper and just lighter weight over Zoom. [indiscernible] 13:48:20 the; is that correct over Zoom this summer. That is a plus. That is digital technology 13:48:25 doing what we all hoped it would do 20 years ago in diversifying who gets 13:48:29 to speak truth to power or have therefore stirred 13:48:32 . So doubling down 13:48:35 on that in harnessing those positives I think is 13:48:38 one of the important things here perk >> : Face, Daphne. We are 13:48:41 running short on time and I want to get to audience question so I'm 13:48:44 going to ask one more question to the panelists and if you are in the audience 13:48:47 and have a question please feel free to use the Q&A button in your resume box. 13:48:51 So to any of our 13:48:55 panelists that want to talk about it what you 13:48:58 see some of the limits of transparency and empowering users and on the other side 13:49:01 of things what you see as some of the promisor some of the potential 13:49:04 or creative solutions you have seen tried and you want to see more of? 13:49:10 Matt, do you want to go first? >> : 13:49:15 This question of how do you create 13:49:21 sort of empowered users 13:49:24 who can then a speak in the context of the data that is perhaps available 13:49:28 in some beautiful future world to them to advocate on behalf of themselves and 13:49:31 I think there are two jargon he terms 13:49:36 that I always come back to on this. The first is from the world of cider 13:49:41 cybersecurity through obscurity. The idea that you can hide away 13:49:46 the thing that you hold most dear as long as people don't know what 13:49:49 it is there or they cannot figure out how to get to it, it's a 13:49:52 . I think there's a lot of security trips 30 of the businessmen of these platforms. The 13:49:56 second is weapon eyes born. I think I got this from the podcast 13:50:00 99 percent invisible in terms of service the 13:50:03 ideas if you can make esoteric enough like in terms 13:50:06 of service that will click through [indiscernible] install a new Apperson up for home loan 13:50:10 or anything these days 13:50:13 if you can make it boring enough and timely enough and burdensome enough on 13:50:16 Sunday's life they want to take the time or be able to navigate 13:50:21 through to understand and decide if they agree with it or not and I think that is exactly what is happening 13:50:26 with the things we are calling algorithms these days. 
13:50:30 The challenge that is in front of us is not just how do you create a new law that requires 13:50:34 Facebook or Twitter 13:50:37 or whoever you want to pick to release all of their internal memos 13:50:41 and all of their internal AI recipes and all of these things. 13:50:45 The question is how do you create a public and a particular 13:50:50 in these most effective and most vulnerable and under served communities with regards to platforms 13:50:53 how to get those people to be the most informed people 13:50:58 about how these mechanics work in order to advocate for their needs. 13:51:02 The traditional answer here is media literacy or civics 13:51:06 . So it may be that we are talking about here is a combination of a need for 13:51:13 aggressive regulatory intervention and renorming in terms of what we expect from products 13:51:17 and spend aggressive consumer education and when I say consumer education, I mean, it like 13:51:21 K-12. Would not be a thing if we are teaching our middle schoolers how 13:51:26 algorithms works and what algorithm bias is. That would be great. 13:51:29 But we are not going to be able to get simply by publishing were data. 13:51:34 So I agree wholeheartedly with the point that Nicole was making on that front. 13:51:39 And I think the challenge isn't just antitrust, it isn't just transparency and it isn't just 13:51:43 making platforms that have 13:51:47 safe colors or buttons on them it's all those three things together. 13:51:51 >> : Thanks, Matt. I do 13:51:54 want to turn to a question from the audience of sorts for the rest of the post 13:51:57 . We do have a question coming in from Twitter 13:52:02 that is about anecdotal evidence versus actual evidence of systemic human rights violations. 13:52:07 And the questioner asks, how can transparency help us understand 13:52:12 the actual impacts of technology on human rights beyond the anecdote? 13:52:15 What kinds of transparency? Does anyone want to address that? 13:52:23 >> : I will go work I think 13:52:27 -- I like the question because I was just testifying 13:52:30 for the mayor which I think is the national AI research initiative 13:52:33 that is coming out of the White House. I think first and foremost what we are seeing in the White House in terms 13:52:36 of the recent release of a 13:52:39 rights-based framework when it comes to the models 13:52:43 or the collection or the development and collection of research that is much more inclusive and designed 13:52:48 to look at these human rights and civil rights violations. 13:52:51 Most important we have to define a. And I think 13:52:54 the person who asked the question is completely right. A lot of us 13:52:57 get up and we talk about these anecdotal situation but because of the [indiscernible] 13:53:01 of the Internet it becomes very difficult for us to look at the Internet in these cases 13:53:06 where we know that there is a causal relationship between the two. Whether it is misinformation 13:53:10 and voter interference. 13:53:15 We still debate that 2016 data and the extent to which it had a huge 13:53:18 effect or not a huge effect on this. 13:53:22 There are researchers that I just met some recently that are trying to look at this weaponization 13:53:27 of data and algorithms 13:53:32 to trigger these types of disruptions of human rights but I would just [indiscernible] we 13:53:35 just need more research. 
We need research that is not necessarily the regression [indiscernible] 13:53:39 or peer-reviewed; 13:53:44 we need both peer-reviewed research and pragmatic research 13:53:47 for people to understand these correlations, and we need to 13:53:51 open up the space to more social scientists [indiscernible] technologists, to 13:53:55 better understand these human and civil rights violations. As a sociologist, some people ask how did 13:53:58 you get into this debate [indiscernible] 13:54:02 like many people on this event call, because we care about human and civil 13:54:05 rights, but we have all had to carve out a space in this domain in ways 13:54:11 that I think have come very late in the process, as these models have become more [indiscernible]. 13:54:15 So with regard to the question, we just need to have this research done, 13:54:21 and if companies are not going to do it, 13:54:24 [indiscernible] going back to your question, Congress 13:54:29 should be allocating money to places like the National Science Foundation 13:54:32 and other places so that we can engage in this research and not be passed by 13:54:35 when it comes to these types of questions. Because right now 13:54:39 there is not an equal distribution of money, research funding, to help 13:54:44 build better data quality, more inclusive data quality, in addition to looking at these issues straight on. 13:54:49 >> : I would add, I think 13:54:52 maybe there is this continuum between anecdotal accounts on the one hand 13:54:57 and then the perfect data dump of everything from the 13:55:00 platforms on the other, and there are interesting things in between, like the project that [indiscernible] 13:55:06 and some others were running for a while, gathering accounts by 13:55:09 people who thought they had been mistreated on platforms and trying to synthesize that into reports, 13:55:14 or I am thinking about the NYU Mozilla extension 13:55:18 that scrapes people's 13:55:22 Facebook pages to gather information about electoral disinformation and ads. 13:55:26 It is self-selecting [indiscernible] so it's not a perfect 13:55:30 set, but it's better than anecdotal, and this conversation makes me think 13:55:34 how analogous that is to projects 13:55:40 putting apps on people's phones to report police abuse or to 13:55:43 otherwise try to quantify and track things that are going on in the world. Again, in a way that is 13:55:48 not perfectly scientific, because it is self-selecting, 13:55:51 but it's better than anecdotal, and I feel like there's a lot of promise 13:55:56 in more modest and achievable and interesting data-gathering 13:55:57 experiments in that middle range. >> : But that involves, Daphne, 13:56:06 a methodological approach to this which is a lot different 13:56:10 than when the space opened up, which was heavily [indiscernible] research on technology. 13:56:14 So that type of qualitative analysis is now seeping into these spaces, 13:56:18 in the last five or 10 years, 13:56:21 in a way that we haven't seen before. So hopefully we will continue on that trajectory. 13:56:25 >> : I love those examples, 13:56:30 Daphne, because they are almost like user-generated transparency. So empowered 13:56:34 [indiscernible] using transparency that is [indiscernible] 13:56:39 another potential avenue to explore.
We are out of time, but I would like to thank 13:56:43 Matt, Nora, Daphne, and Nicole so much for sharing 13:56:46 their expertise with us and for having this conversation today, and also to thank all of the audience for attending 13:56:49 and watching, 13:56:53 and please tune in tomorrow for more. Thank you so much, everyone. [ End of meeting ] 13:57:22