SPEAKER: Good morning, good afternoon, good evening wherever you might be, and welcome to the fifth annual Future of Speech Online event. My name is Emma and I'm the director of the Free Expression Project at the Center for Democracy & Technology, and I'll be your master of ceremonies today. A few housekeeping notes. This is the third and final day of our Future of Speech Online event, all focused around the theme of making transparency meaningful. On day one we had a great overview of the many different kinds of transparency that are part of the tech policy conversation. CDT has just released a guide for policymakers laying out a framework for understanding how issues like transparency reporting and third-party assessments all fit under that broad umbrella. And we heard from experts on how to keep focused on different groups and communities to understand what kinds of information are necessary to actually empower users to navigate their information environments. And yesterday we had a robust discussion of the Santa Clara Principles on Transparency and Accountability and their focus on user rights and user empowerment. We talked about the progress and opportunities for increasing things like notice to users and appeals mechanisms, but we also explored how that doesn't help us solve all the challenges. And we explored what it would take to build a better transparency report, to make sure that company disclosures are more relevant and more trustworthy. If you missed any of our earlier panels, the video from the event is available and we can drop a link to that in the chat. Today's discussion is also being live streamed and recorded, and the video will be archived on CDT's YouTube page, so feel free to share that around. If you have any questions for the organizers or any trouble with Zoom features, please contact events@cdt.org, and during our panel discussions folks are welcome to submit questions via Zoom's Q&A feature, by email to questions@cdt.org, or on Twitter using the hashtag #CDTquestions. Today we have a great agenda for you. Our first panel is going to be discussing one of the hottest topics in transparency, which is enabling independent researchers to access data held by private companies in order to support a better understanding of our information environment. We'll hear some lessons from other industries that can help us think about how to address the question for social media in particular. Our final panel will address the question of whether transparency can actually be mandated in law. We'll have legal experts from around the world sharing their insights and analysis of how different transparency mandates will hold up to scrutiny in courts. First up today, to set the stage about researcher access to data, I am delighted to welcome John Sands and Alexandra Givens. Alex and John, over to you.

SPEAKER: Great, thank you so much Emma. It's wonderful to see the past three days of programming, and thanks so much to all of you for being with us.
John, thank you for being generous with your time for this fire-starter conversation before the rest of the programming kicks off. I have to say I am glad that we invited you to talk about researcher access to data today, because the regulatory landscape is heating up on this issue even more than in the discussions of previous weeks. In addition to bipartisan legislation that was recently introduced by Senators Coons and Portman, just this week the European Parliament's IMCO committee voted out the Digital Services Act with a provision on researcher access to data in Article 31. We only have a short time for our conversation today, but what I want to do is help situate these efforts in broader questions about platform accountability and the type of research you're trying to fund through the Knight network. I'd love it if you could start off just by explaining the Knight Foundation's investments in this space: the type of work you're funding and what the goals are.

SPEAKER: Super, yeah. Well, thanks Alex. It's always great to be with you, and we're so proud of our association with CDT and the work that you do. I also want to say thanks to the Charles Koch Institute for sponsoring the really important conversations over the past couple of days. To your question: as you know, Knight has been supporting a new field of research, an emerging field that's interdisciplinary in scope and that really seeks to understand and proactively inform responses to the growing role of digital media in our society. We made these investments to ensure that policymakers and key stakeholders in industry and civil society are informed by independent, nonpartisan research, and that new legal and policy frameworks protect the public interest and advance fundamental democratic values on topics like content moderation, free expression online, and intermediary liability, all things that are very much in CDT's wheelhouse. To date we have invested more than $60 million in the space, helping to stand up five new research centers at universities around the country, and we're supporting a host of new scholarship and policy work at more than 50 other institutions, including at CDT. It's one of the largest targeted investments in Knight's history and is also, I think, one of our most consequential. At a high level, that's kind of what we're after, what we're doing.

SPEAKER: Yeah, it's an incredible investment, and one of the things we value the most is being part of a network that connects all of these research centers, the Knight Research Network, and the chatter on that thread is loud about this question of researcher access to data. You're on it too, you see those conversations, you talk to your grantees. I would love for you to give a sense of what type of data people are trying to access and why, and can you touch on some of the barriers they're encountering as they try to do it?

SPEAKER: Sure, but maybe let's start by emphasizing why data access is so important. The reason is simply that we don't know.
In its early years, this field has, I think, been really amazing at laying out the range of things that we in civil society, in government, and in the general public simply don't know. For example, we know companies moderate content, but what are the consequences of those moderation policies for free speech in different communities and in different parts of the world? CDT is obviously doing some amazing work in that space. How prevalent is misinformation online and what are its effects? Another huge question. To what extent are social media and algorithmically curated content contributing to societal fracture and the decline in institutions that we know about because social scientists are able to measure it? Another might be: to what extent are bad actors able to manipulate algorithms and social networks to increase the distribution of content that manipulates political behavior, or the political violence we saw on January 6th? The Frances Haugen disclosures, I think, point toward the fact that at least one social media company knows more about the answers to these questions than we do, and than our elected government does. And that's a big problem. The near-term consequences of our collective lack of knowledge are evident and frankly really quite terrifying for the institutionalists among us. But the long-term prospects are just absolutely untenable. Our legislators and regulators can't act in the public interest if they're only working with a limited fact base, and we can't solely make decisions based on an industry's limited and often discredited public disclosures. So with few exceptions, like Twitter and Reddit, most of the tech companies have stonewalled attempts by scholars to do independent research with their data. In the absence of more robust data access, these researchers, many of whom Knight has been supporting, have developed some ways to approximate what might be held by those companies, but only approximate. Many folks may also remember that a few years ago, along with several other foundations, Knight supported a joint effort between Facebook and academic researchers that tested a promising model for gleaning insights from user-level data held by these companies, or one of the companies. But this ultimately failed because of privacy concerns, and privacy and liability issues around privacy continue to be the key hurdles to get around, and the risk for all of us, including the companies, is growing day by day. And that's why at Knight we've been supporting not only the basic and empirical research side of this equation but also the legal and normative work that can imagine better governance and better regulatory frameworks that promote independent research and transparency while also protecting speech and privacy and, ultimately, the public interest.

SPEAKER: Yeah, I mean, you frame that really articulately, because it is a hard trade-off to strike there.
There's no easy, you know, Bewitched-style wave of the magic wand to get it fixed; these are hard questions to answer. But at the same time it's so clear that progress needs to be made here, and I think one of the things we've valued seeing in the network is the rich conversation around what reasonable access could look like, right? How it could be done in a responsible way to mitigate those concerns about privacy and security, while still bringing down the barriers so you can have this independent check on the system. I want to ask you: this summer the president of the Knight Foundation issued a pretty remarkable statement with other foundation leaders. You had Darren Walker, John from the MacArthur Foundation, a whole host of others, condemning in particular Facebook's shutting down of the NYU Ad Observatory accounts. Can you talk a little about that one specific fact pattern, and what to make of that moment, to have so many voices across civil society choosing to weigh in together in that letter to try to push this issue forward?

SPEAKER: Well, I mean, moments of consensus are really rare today, as you know. So it was particularly meaningful to me to see the presidents of so many foundations speak with one voice on an issue that Knight has been behind for some years now. That letter urged government and industry to ensure data access for researchers who are working in the public interest. In the wake of the Haugen disclosures it's only clearer that legislation is required, because the Facebook papers and the tremendous journalism around them have only shown how much daylight there is between what these firms likely know and what they make public. And I want to be clear that I'm not suggesting Knight supports any sort of specific legislation, but last week I was really, really interested to see that a bipartisan group, Coons and Portman, finally proposed legislation that would accomplish this. The approach that they outlined was largely based on research by Professor Nate Persily at Stanford. His proposal has three components, and I'm going to paraphrase, so I apologize to Nate and to the senators, but I encourage everybody to check it out. The first pillar of this proposal is that Congress needs to empower a federal agency to compel private platform companies to share data. The second feature is that researchers need to be vetted by that agency, or by some combination of federal agencies and civil society actors. And the third pillar is that there should be a clearly defined process for publishing the research in a way that guarantees privacy is protected along the way. It's taken years and years of research to get this framework in writing, and it's happening at a time when there's real public concern and momentum for solutions, when something could get done, so it's really quite exciting.
I'd credit this outcome to the hard work and tenacity and brilliance of folks like Nate, but I'm glad Knight and other foundations are aligned around transparency and a tech industry that serves the public interest and advances democratic ideals above all else.

SPEAKER: Yeah, well, that's great. I'm going to start to wrap us up because we have a panel that's going to go into more detail on these questions, but we too are excited to see the legislative energy in Congress. There's this parallel effort in the EU, and I think good forums for EU-US conversation through things like the Trade and Technology Council that's being created to help share ideas. CDT is focused on these issues. Of course, we're just deeply grateful to you for your support, not just of CDT but of the field. It really has been amazing to see. So thanks for joining us today, and Emma, I'm going to pass it off to you to kick off the panel.

SPEAKER: Great, thank you so much John and Alex. To keep the conversation going, I'm very happy to introduce my colleague Gabe, who is a research fellow with us and who will be moderating our next panel. Gabe, take it away.

SPEAKER: Thank you very much, Emma, and thank you Alex and John for setting the table for this conversation so nicely. I would like to invite our three panelists at this point to come on camera. So, as many folks have said before, we are here to talk about researcher access to data. And with social media in particular, researchers use social media data to learn, as John said, about the many ways it affects society, from issues around misinformation to the way it affects mental health to many new issues that we haven't discovered yet. Research can serve a clear public good, but many platforms are cagey about giving researchers access to data, both to protect their own corporate interests and to protect the privacy of end users. It's difficult to figure out how to balance these trade-offs, but fortunately there is some precedent, because lots of industries and sectors, including health care, smart city technology, education, even gambling, make certain data available to researchers so that they can do work that serves the public good. So today we have three fantastic panelists from different sectors who are going to discuss how their industries manage these trade-offs and potentially offer some lessons for social media to do the same. CDT will also have a report coming out in the spring about a similar topic. For this panel, if you want to ask questions, feel free to use the Q&A function, to tweet questions using the hashtag #CDTquestions, or to email us at questions@cdt.org. Without further ado, let me introduce our panelists. First we have Elizabeth Hansen Shapiro, who is a senior research fellow at the Tow Center at Columbia Journalism School.
She has written extensively about media policy and platforms, and in particular has a fantastic paper with Ethan Zuckerman about how social media platforms do and do not make data available to researchers. For anyone interested in this topic, I highly recommend checking that out. We also have Beatriz, who is an assistant professor of law at Sciences Po and an affiliate at Harvard University. She focuses on data governance in smart city projects as well as the sharing economy and COVID contact tracing apps, and she has lots of experience around data governance design. Finally, we have Matthew Herder, who is the director of Dalhousie University's Health Law Institute and an associate professor in the Faculty of Medicine with a cross-appointment to the Schulich School of Law. He studies biomedical innovation policy, including data governance models for clinical trial data in the US and Canada. So welcome to all three of our panelists. Elizabeth, I'm going to start with you. John gave us a good start on what kind of data researchers are looking for, but would you give us a bit of a lay of the land about how researchers are trying to access social media data and what roadblocks they're running into from social media platforms?

SPEAKER: Sure, and thank you CDT for having me today and for hosting that fantastic and very timely discussion. It's so exciting to, you know, see policy prescriptions actually make it into draft legislation, so I'm really pleased to see the movement on that front. In this paper with Ethan Zuckerman and colleagues, we really wanted to take a step back and survey the landscape and say, okay, what is currently happening, what are the roadblocks, and what are some potential policy solutions. And I do encourage you, if you have an interest in this topic, to read the paper, because we went through quite an exercise of being very structured about types and approaches, so I think it's a nice map of the landscape. I'm going to run through, at a very basic level, some of the categories that we saw in terms of data collection efforts, and then also the roadblocks. We categorized the current range of data collection approaches as falling into two types. One is researchers accessing platform data with platform cooperation. That's one type. And the other is what we called more adversarial data collection techniques, techniques researchers are undertaking without platform cooperation, and some of those are more risky and some of them less risky. So under the platform cooperation banner, researchers are absolutely using open APIs, like the Twitter API, for example. We are also putting in that bucket new and interesting approaches to differential privacy. This was a technique that was used in the Social Science One initiative, and it's really a way to add noise to a data set so that you can protect user privacy in any platform data.
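As a rough illustration of the noise-adding idea described here, below is a minimal sketch of the Laplace mechanism applied to a simple counting query. The function name, parameters, and numbers are illustrative assumptions for this example only, not the actual tooling used by Social Science One or any platform.

```python
import numpy as np

def dp_count(true_count: int, epsilon: float, rng=None) -> float:
    """Release a count with epsilon-differential privacy via the Laplace mechanism.

    A counting query has sensitivity 1 (adding or removing one user changes
    the count by at most 1), so Laplace noise with scale 1/epsilon suffices.
    Smaller epsilon means more noise and stronger privacy protection.
    """
    rng = rng or np.random.default_rng()
    noise = rng.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Hypothetical example: number of users who saw a given post,
# released with a privacy budget of epsilon = 0.5.
noisy_count = dp_count(true_count=12_483, epsilon=0.5)
```

Real deployments add layers this sketch omits, such as tracking a total privacy budget across many queries and calibrating sensitivity for more complex statistics.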
We also heard researchers talk about the importance of contract-based data sharing. Now, that can be problematic from a social-good, general-knowledge perspective, because oftentimes data that's shared under contract comes with restrictions around what can and can't be published. But that kind of data access is happening. And then finally we also heard about researchers and institutions beginning to think about how to set up controlled environments where there's a high degree of cooperation between platforms and researchers to really protect user privacy and put some strong guardrails around the ways researchers can access, analyze, and then publish data, but which allow researchers access to data of higher sensitivity than just through an API, for example. So that's the range of platform cooperation techniques. On the adversarial set of techniques, at the most basic level you have researchers just doing logged-in manual data collection. There are some ethical issues around that: does the researcher disclose their research, and when, where, how? It can also sometimes run into terms of service violations, but it's a pretty common method. We're also seeing, and this is really more from the civil society and journalism front, experimentation with panel data. The Markup's Citizen Browser project, for example, pays a representative panel of users to be able to access their feeds. There's data donation, like what Mozilla has experimented with, where a user can opt in to sharing their data directly. And then there are self-reports, which we've seen a lot of, but which can be problematic in terms of user recall; still, that's a way to get around some of the legal risks. We've seen a whole bunch of great work come out of journalists again, like The Markup, around algorithmic auditing: doing a series of keyword searches to understand how an algorithm works and what results come of that. And then we have the whole set of scraping-based methodologies. Straight-up web scraping, which started with the open web but has been implemented on social media platforms. App scraping, which is a bit more difficult to do on mobile but also possible. And then finally these synthetic research panels, which are networks of dummy accounts that you can configure with different characteristics to see how those accounts behave. Those latter three, web scraping, app scraping, and synthetic panels, are where we really start to see the increasing risk of terms of service violations or CFAA violations. Those researchers have decided they're willing to take on that risk, and obviously we've seen the results of some of that. So in terms of roadblocks, there are absolutely legal roadblocks: the CFAA. We also talk in the paper about conceptual roadblocks.
We have a huge issue around what we call the denominator problem, which is that it's easy to identify problematic content, but it's much harder to understand what the denominator is: how many users are actually seeing this, what's the volume of overall activity. And we've seen exchanges between platform employees and the public around what platforms see internally, given that they have a sense of the denominator, versus what researchers point out externally as the volume of problematic content. We also talk about the issue of unknown unknowns, and this is why I think the Facebook papers were so fascinating. External researchers think we have a sense of what the problems are, but we don't really know what the problems are, because these platforms' reach and audiences are so vast and complicated. So I'd say those are the conceptual roadblocks. Finally, we just heard some general social media data access issues. The type of institution you're at and the resources that institution has make a huge difference for the kind of data access you can get. Mobile access, as I said, can be difficult. The quality of commercial and paid data can be higher than what researchers have access to, and that aggravates those unequal access issues. And then finally, some of these platforms change so often that it can be difficult to design longitudinal studies, so that's another issue we heard in general from the researchers we spoke to. So that was a very fast overview of a very complicated topic, but that's how we see the field currently.

SPEAKER: Very fast but very impressive and thorough overview, so thank you, Elizabeth. Matt, I see some similarities between what Elizabeth is talking about and the world of clinical trials. This is another area where data sharing, probably even more than with social media, could create huge opportunities for public good. But at the same time, there's still this inherent tension with corporate trade secrecy. Would you mind taking us through some of the ways that pharmaceutical and medical companies share data with the public and with one another, and how they end up striking that balance with corporate secrecy?

SPEAKER: Sure, I'll try, and like others I'll just echo the thanks for the invitation to be part of this; it's always fun to get a little bit outside your lane and learn from others' experiences. The first thing to say is that transparency and sharing of clinical trial data is still very much a work in progress. And while there are companies experimenting with so-called open science, putting more data out there or trying to learn from data that's out there to develop a drug of their own, for instance, that's still a kind of fringe approach. The dominant model has resisted the transparency that we have, to varying degrees, in different places.
So there's still a lot of work to be done, and frankly one of the basic insights I've come to understand in this field, given the size of the companies we're talking about in some cases, is that you really only get meaningful transparency if you require it by law and you put the resources into making it happen and making it pretty easily accessible, frankly. So there's not a lot of magic to it in some ways, but the way the system works at present is that there are multiple points in the process of figuring out whether a given intervention works, whether it's safe, and so on, at which that data has to be shared. And that's a result of legal reforms that have been hard-won over decades, really. This has been a long-known problem: the lack of transparency, and the ways in which the published literature may be a distortion of what the actual evidence looks like in its totality. So when a clinical trial starts, you have to register your trial if humans are involved and share the design of that trial. When results come in, you're supposed to report them within a certain time frame, and thankfully that system has been in place for some time now. ClinicalTrials.gov is the source of a huge amount of research and independent validation of trial designs and the results of studies: are they the same as they were reported in peer-reviewed publications? Often they're not. So that's helpful. And the other big milestone at which you can get a lot of data is when a product is approved, or when a decision is made to approve or not approve it in some cases. In Canada they actually put data out there even if they reject the drug; that's not true in the United States. So at that point of regulatory approval or decision making, you can learn a lot about the reviews that were conducted by scientific reviewers within an agency like the Food and Drug Administration. And in some cases, in Europe and in Canada, less so in the United States, you can actually get the underlying trial data itself. Not individual data, that's the fault line around privacy, but a lot of the summary-level data, which is still quite interesting. As I said, it varies a fair bit. In Canada we already have data on the different COVID vaccines, for instance, that's out there, and people are trying to make sense of it independently of what the regulator did, what the companies did, and the research that's involved. You might have seen the op-ed in the Washington Post in the last couple of days talking about how the same request made to the FDA, which doesn't proactively process the same level of data, is reportedly going to take 55 years, according to the agency, to put online. Somehow little old Canada has managed that in a matter of months. So there are different approaches and parameters around it, but they are these discrete points in time, and what we've seen during the pandemic in particular is the need for more dynamic data sharing.
And even if you get the data, it's great to independently validate someone's findings, but to actually use it in the context of developing your own drug, there are other barriers in the form of patent law and other kinds of IP. You mentioned trade secrecy. Some data still is not shared, like your manufacturing process for a drug or a vaccine; that is still hived off from the results of a clinical trial. So these boundaries are still being contested, and for good reason from my perspective in the context of COVID, given the shortfall in vaccine supply in other parts of the world. So we still have some work to do, and getting companies to comply with those data sharing requirements requires resources as well. Work in progress. But compared to even five years ago, we have a lot more data out there, and hopefully the norms are shifting. And as other regulators put stuff out there, it may embarrass the FDA into action. So there's this way in which all boats will be lifted if the tide of transparency actually comes in. I can say a lot more about drugs, but I'll keep it there for now.

SPEAKER: Thank you, Matthew. Very interesting to hear about the similarities and differences there with social media. And I really liked that phrase you used, the fault line around privacy. Just a great way to put it, and I think a lot of us in different industries feel like we have to navigate those privacy problems. On that privacy note, on Tuesday at FOSO we heard Rebekah Tromble talk about how platforms use privacy as a shield from transparency when it is convenient. Beatriz, I wanted to ask you about this in particular in the context of smart cities, and maybe Airbnb and Uber. Would you talk about that and whether you've seen that kind of dynamic play out, of companies using privacy as a transparency shield?

SPEAKER: Thank you, Gabriel, and before I start I'd also like to thank you and the rest of the organizing team for having us. It's been a great event, and it's great to follow Matthew and Elizabeth, from very interesting and also completely different fields than mine, so it's very interesting to see the parallels and how some of these issues show up in different settings. I've actually seen those privacy arguments being raised by platforms in what is a rather contested relationship between what we call urban tech and cities, since the beginning. Almost ever since platforms arrived, cities have been struggling with the disruption of urban spaces and what Uber means, or what Airbnb means for housing, or the e-scooters, and if you talk to city government officials it always feels like the newest tech is déjà vu of whatever the former tech was. One of the interesting things is that many of these platforms, though not all of them, have tended to have strong legal strategies to avoid local regulation.
So we've seen, for example, heavy lobbying in the United States where Uber managed to classify itself as something very different from taxis, so most cities in the United States lost the legal capacity to regulate Uber. Airbnb has had a similar strategy. As part of this legal strategy of framing themselves differently to avoid regulation, one of the things I've seen, and I've written about this, is that they raise privacy arguments to avoid sharing data. The way it goes is that a city tries to regulate a platform, and as part of the regulatory ordinance there is a data sharing provision in which the city wants, for example, to know which citizens are listing their properties on Airbnb, or what the routes of Uber trips are, or how long an Uber driver has been working, or where scooters are being parked. And the argument goes two ways. One is to argue that the information is part of the company's business records and therefore it is confidential and should not be shared with third parties. The other, which I find more interesting, is that the platforms take on the role of privacy advocates for their users and say that sharing goes against the privacy interests of their users. And the way they frame it legally, for the lawyers in the room, is that it violates the Fourth Amendment of the United States Constitution and other privacy provisions like the Stored Communications Act or the CCPA, for example. What's interesting, I think, is the platforms' strategy when they raise these arguments. For example, once Airbnb largely lost the battle to not be regulated, they typically dropped their lawsuits and decided: if I'm going to be regulated by local governments anyway, I might as well collaborate and give away some of that information. So I think there's some strategizing there that is worth paying attention to. Another thing that I think is important to say is that there are legitimate privacy concerns here, which are sometimes hard to balance against the strategizing of the platforms, and especially in contexts in which privacy regulation stops protecting users once we consent to privacy terms, one of the things we should start looking at is not only what data can or can't be shared but what can be done with it once it's shared. An interesting example comes from San Diego, where footage was used by the police department to surveil protesters. Nothing major happened beyond a big scandal, but that's a use that should have been forbidden from the beginning and wasn't. So tailoring those mechanisms, ordinances, or frameworks that shape how we govern data that is being shared is something that is very important.
And the other thing, which I think is interesting and which might be very different when it comes to the regulator's access to data versus the researcher's, is that I'm actually not sure, and I would be interested in having a conversation on this, whether the unknown unknowns logic applies, because we also want to limit the power of government. I am a firm believer in the many advantages of some regulator access to data. For planning purposes, there's a super interesting example in New York City, where they used Uber data to create the first minimum wage for drivers, for example, and that was possible thanks to having access to data that allowed the regulator to understand how the workers worked. But the key in that case is that the regulator knew what they were looking for; that's what they wanted to do, so they made a very specific request for what they needed to know. I'm not so sure that, if we're thinking about the regulator, we should open up data entirely, because of that. I mean, power is always a little bit dangerous and we want to limit it too. So some intentionality about what data is being requested might be something to look at. And yeah, I'm going to leave it there because I'm looking forward to the discussion. Thank you so much.

SPEAKER: Thank you very much. I feel like that sort of déjà vu you were talking about, it's funny to see it happening across industries here. People act like these problems have never come up before, but clearly they have, and in many industries too. I just want to remind the audience that this is a good chance to get in your Q&A questions; we will shortly be looking at those and trying to answer some of them. But I want to dig in a bit on this thing that's come up twice now, and also in John Sands' comments, about learning about these unknown unknowns. Right? That seems to be a benefit of giving researchers access to data, at least when people talk about it. Is that the kind of thing that just sort of naturally happens? Is it that you give researchers access to data and a sort of 'if you build it, they will come' situation happens? In other words, is all that needs to happen that researchers are given access to data and good things will follow, or are there guidelines that need to exist to actually get some of those positive public-good outcomes we expect to see from giving researchers access to data?

SPEAKER: That's a really good question. I would say one debate that we had in the research team for this paper, which didn't make it into the paper but really informed how we chewed over the issues, is this: academic researchers, as anybody who's been in an academic institution knows, have their research agenda, and they have their oftentimes fairly narrow set of questions that they're trying to answer within a very particular genre of knowledge generation.
And so we wanted to widen the lens in the paper to talk about activists and journalists, because oftentimes it's those actors that I think have more curiosity and latitude around the unknown unknowns than the academics. I think it will be interesting to see, if we do have this new regulatory body, what the assessment and vetting mechanism is, because it can be very difficult to define a researcher. It can be very difficult to define a journalist. It can be very difficult to define an activist. So I think you get more of that interesting unknown-unknowns work when you widen beyond the academic frame, but you run into real definitional issues around who is a legitimate user of this data and for what purpose. So I don't think it's as simple as 'if you build it, they will come.' But I'm curious what my fellow panelists think.

MALE SPEAKER: Can I piggyback on that? I really couldn't agree more. Thinking of the portal I mentioned in Canada, and other sources like it that preceded it: despite a huge literature showing there's data in the hands of regulators and data that's been published, the uptake among academic researchers is pretty low. Shockingly low, despite being free and easy to obtain, at least in some cases, and I think that really speaks to the norms you were hitting on around their own research agendas. Re-examining someone else's work is not rewarded very much at all. Validation is not innovation in the way it's understood in academic and industrial circles. So even though that is exactly the work we need if we're to reap the benefits of pharmaceutical innovation at a public health level, we need people to replicate the science, that norm is rehearsed but not practiced, in my view. I couldn't agree more that a lot of the interesting questions and key pieces of work that have made things a little more transparent over time have come from civil society and journalists pushing it. But then there's this issue of legitimacy, and actually, in my space, that's the area that makes the regulators the least comfortable, right? They have trouble imagining that those folks could critically appraise this data; they make all kinds of assumptions about capacities and about what's driving their desire to do that, and I would flip that question around and ask the same of the actors they are more comfortable with. So it's a huge normative shift, a culture shift, that we really need to encourage, and for folks who see transparency and data access only as a form of disclosure, that's really problematic. It has to be about setting a platform for more participatory forms of governance that are diverse by definition.

SPEAKER: That's not necessarily a north star of the academy, I would say. Not built for that. Go ahead, sorry.

SPEAKER: There's some evidence that open data portals are used by a variety of actors.
But in some instances in which they were supposed to be drivers of local innovation or things like that, it's unclear that that's the result they've had. So it goes back to the discussion of how we might open it up to more actors, because a greater variety of actors will lead to a greater variety of questions, maybe, and that's already a benefit in itself: doing that work of finding new questions and understanding better what's going on.

SPEAKER: Go ahead, Elizabeth.

SPEAKER: I was just going to say, just to not completely rag on the academic profession, I do think there's been all kinds of interesting exploratory innovation in the field, and there are for sure new journals, new subfields, new methodologies that have grown up alongside social media, so I think those folks are absolutely eager for this. But the sociologist in me says all this stuff institutionalizes and regresses to the mean over time anyway, so even there I'm not totally sure that researchers in the academic frame are the only check we need for accountability.

SPEAKER: Just going off that, Elizabeth, you bring up the endless definitional problem of what makes up a researcher, and whether we want to give access to folks beyond just researchers. And one of the 'solutions,' quote unquote, to this problem that we end up hearing about is IRBs: institutional review boards that can make the necessary ethical evaluations and give a thumbs up or thumbs down. They deal with the privacy trade-offs, and so on. This is also starting to come up in proposed legislation. What do you all think about IRBs as a tool for being this kind of ethical check and balance around researcher access to data?

SPEAKER: I'm curious about Matt's take on that, because IRBs were sort of built for your field.

SPEAKER: We call them research ethics boards in Canada. They're underpinned not by law but by policy, and there are real concerns about their ability to enforce standards. But putting those issues aside, my anecdotal experience and understanding of the literature about IRBs and REBs and how well they do their work is that they are significantly under-resourced, and there's a lot of up-front review and a lot of contestation around where ethics stops and methodology begins, and what is okay for an IRB to be looking at. And I think the privacy issue actually brings it very close to the nexus between the two. But beyond that, following up to see what people are doing with research data they have access to, all kinds of secondary uses of that data, which is of course so much more possible when you're talking about the kinds of data that not me but other folks engage with, my impression is that they are not great at maintaining oversight throughout that longer research process. So I think they are part of the equation and have an important role to play.
Do I think they're going to be the linchpin to solving those difficult questions? Probably not. Especially given the diversity of institutions we're talking about, right? So it's very challenging, especially with the growth of private IRBs as well, which I'm seeing get things done a lot faster but which also have a sort of structural conflict of interest that needs to be taken very seriously.

SPEAKER: Yeah, I would agree. I think they're a critical piece, but they're not the whole, and that follow-on use and the life cycle of monitoring beyond the design stage, I think those are really critical questions. I do think the privacy and consent issues raised by social media data access in the US context, in the context of the Belmont Report, do beg for a rethink in general. And certainly, as a qualitative researcher, in my training I ran up against IRB issues; there are all kinds of ways in which it's sort of an ill-fitting suit for the moment. And I think deep structural change at that level would help. I don't think it'll solve everything, but it could make some things easier.

SPEAKER: Yeah. I want to give Beatriz a chance to jump in, but just quickly, IRBs tend to say, okay, well, it's been anonymized or de-identified to a certain extent so all bets are kind of off, and we all know that's a flawed, or at least risky, assumption. And on the other hand, there have been some instances where you really need data sharing and data sets to be combined from multiple sources, and IRB approval isn't enough to get the data holders to actually share that data; they still cite privacy reasons. So when there's real value in combining these data sets, I've found that institution-based IRBs are not sufficient to actually get that data-set sharing to occur across institutions. So again, they're a key part of this, but I'm not sure they're going to be a solution per se.

SPEAKER: I actually think I have little to add, except that I do think they provide an interesting precedent for how some of this data sharing might be shaped in the future at all sorts of institutional levels, for journalists and also policymakers; there are things there. But as Matthew and Elizabeth were saying, the different stages of review are important to look at, and how to combine data. It was very interesting to listen to you.

SPEAKER: So, looking at one of the questions we have in the Q&A here, Chip is bringing up this point about technical solutions to this problem, right? A lot of what people talk about: they'll talk about differential privacy, they'll talk about synthetic data, they'll talk about adding noise to the data. Do you feel like these are more promising angles than IRBs or coming up with new institutions, that is,
just having a base layer of technical interventions that we assume, and that becomes our norm for when it's okay to allow researcher access to data? Or does that end up glossing over too much?

SPEAKER: I can try to go first. I think it's a mixture of both. We should be at a stage in which we try to imagine a combination of code and law when we think about this type of data. I think there are some technical solutions that are very interesting. Some are very complicated, like differential privacy or some form of decentralized structure for data sharing, which is what's common in some contact tracing apps, for example. And some are very easy, such as just asking for aggregated data instead of very granular data, if the question you're asking allows you to answer it with less detailed data. But I think we can't rely only on the technical solutions; the institutional solutions will also be important, especially as, for example, some of the assumptions about differential privacy or aggregated data might not hold if we are aggregating a bunch of data sets. There are also trade-offs involved when we choose these technical solutions, and they're often portrayed as if there weren't any. One of the interesting discussions around contact tracing apps, for example, was that if the apps are very privacy-preserving, meaning that the data never really leaves the phone, then health authorities were not going to have access to the data about who was, I guess, sick, or who tested positive, and that curtailed some of the public health objectives. So there are trade-offs there as well, and I think we need to start thinking about technical standards and institutional mechanisms as something that needs to go together, and about how we're framing the problem we're tackling. What is the public issue we're trying to address? Depending on that, different institutional and technical arrangements might lead us there.

SPEAKER: I'll bite as well. I think they can be helpful, but I worry about them for two very different reasons. On the one hand, I think with transparency, at least with pharmaceuticals, when you open one thing up, something else closes down. Instead of writing an email, it's a phone call. In the same way, there's this move to aggregate data and share that data because of privacy. But, you know, and there are probably only a few dozen people in the world who are interested in doing this, looking at the individual-level patient data in a clinical trial is a huge task. For those folks, the really big purists around evidence-based medicine and prescribing decisions, that is the work and they want to do it, and so you can't take that away from them, and we all benefit at the end of the day, most of the time, when they're given the case report forms for the people in the trial. That's not even shared with regulators anymore.
It's happened in lockstep with making summary data available; they don't even ask for case report forms unless someone dies. So we have to be mindful of whether it's sort of moving where the real action is, and if people are denied access to that, it's a worry, if that's what a technical solution might mean. The other reason I worry is that, while putting technical fixes in place and getting more data out there can help start to shift the norms, at bottom these seem to me like ultimately structural, political problems around who has control over knowledge and how it's produced, and what questions are asked. And so a technical solution can assist with starting the kind of culture-change work that I see as necessary, but you ultimately need to shift control over the knowledge production process, and to achieve that I'm not sure technical solutions are enough. It's a larger political project. So, helpful, but again, we need to do more.

SPEAKER: I agree with both of you. I think it's both technical and institutional, and I think it can be compelling to believe that technical solutions are institutional fixes when in fact they just kind of reinforce the status quo. So I totally agree about trying to figure out where that power should be shared and who has the power to generate knowledge. Just narrowly on differential privacy, I'll say that in the course of our research it was surprising to me how undeveloped the methodology itself is, and there are plenty of researchers who said this could be helpful but sometimes there are mixed results, and it kind of depends on the quality of the underlying data. It can be implemented poorly. So I think there's more work to be done on that particular tool to sharpen it, and that has to be part of the agenda as well: getting the tools to work better.

SPEAKER: Great. So we have really only two or three minutes, so I want to ask a sort of surprise lightning-round question for you all. Which is: in just a sentence or two, what do you see as the structural or political reasons why these data holders in your industry don't want to share data? Just very briefly, why are they so hesitant to share this data in the first place?

SPEAKER: I think it's a combination of mixed incentives, honestly. I think privacy regulations create mixed incentives. I don't think transparency is aligned with their business models, and I think organizationally they're not set up to execute on the transparency that some of us would like to see. A lot of the time, the teams working on the patchwork of transparency are spread throughout the organization; there's nobody at the top who's responsible for it. So in this sense, just mechanistically, it can be difficult for these organizations to produce the kind of transparency we might want to see.

SPEAKER: I'll say two things.
12:57:13 One pharmaceutical companies don't want to because 12:57:16 they think they're fundamentally sceptical. Same one the 12:57:19 regulators have is ambivalence or doubt about someone 12:57:22 who is a researcher and can they legit 12:57:26 imately an lies what we've done it's so complicated 12:57:29 so there's a scepticism. When it comes to other kinds of data 12:57:32 that they keep even more confidential such as the cost of 12:57:35 developing a drug or the different prices and different 12:57:39 jurisdictions that's a very financial reason. So 12:57:42 it depends on the data you're talking about in the industry the 12:57:45 motivators are certainly different. 12:57:48 SPEAKER: Yeah I think the 12:57:53 urban platform world is also about financial incentive 12:57:57 s and theirs about model has often benefited 12:58:01 benefited by not being very transparent. I think there 12:58:04 are a bunch of institutional intern 12:58:07 al incentives. I really respect the privacy team of 12:58:11 uber even though I don't necessarily agree with the arguments 12:58:14 they're making but I think the institutional setting and the legal frame 12:58:17 work does create the arguments for them to be 12:58:20 worried about their user privacy and 12:58:23 I don't think the individuals behind uber are necessarily act 12:58:26 ing in bad faith or uber whatever when 12:58:29 they're raising these arguments so I think there 12:58:33 are institutional and financial incentives that make 12:58:36 it hard for them to want to share that data. 12:58:40 SPEAKER: Great. Really excellent job answering them in lightening 12:58:44 time. Because that brings us to the close of this 12:58:47 panel. So thank you so much Elizabeth, Matthew 12:58:50 &Beatri 12:58:53 Matthew&Beatriz for an excellent panel. Elizabeth I think I put some 12:58:56 of your work in the chat but I'm about to put some of Matthew 12:59:02 and Beatrices as well. All worth checking out and thank for you participating 12:59:05 in this panel I am now going to hand it back 12:59:08 to you Emma. SPEAKER: Thank you. 12:59:10 SPEAKER: Great thanks gabe and thank 12:59:13 you so much 12:59:16 to Beatrice, Matthew, and 12:59:19 , excuse me Elizabeth. We 12:59:22 are going to take a quick break I'm going to get a drink 12:59:25 of water and we will be right back in a couple of 12:59:28 minutes with our final panel of the session, thank 12:59:33 you. 13:01:32 Welcome back. We will be discussing the question can transparency 13:01:35 be mandated by law. A lot 13:01:38 of the early discussion around transparency in the tech sector 13:01:42 has been about voluntary company 13:01:45 practice. As we herds a couple of days ago in our first 13:01:48 panel sort of laying out an over view of the transparency 13:01:52 issues in the tech industry early 13:01:55 transparency reports from companies like Google launching 13:01:58 their first report in 202010 about 13:02:01 government demand for access to user data or content 13:02:04 restrictions were really done on a voluntary basis not through 13:02:07 legal mandate. They were in response to a lot of 13:02:11 public criticism, publish pressure and internal decision making 13:02:14 but there was no law compelling the production of these 13:02:17 reports. 
Over the years we've had a lot of advocacy coming from civil society and academics about wanting to see more information about terms of service enforcement and community guidelines enforcement, including through things like the Santa Clara Principles, and in general a push for companies to make their policies clearer to their users and to experiment with how to explain to users what the rules are for a particular platform or forum or other place where they might be communicating. And as we just heard in the previous panel, there's been a lot of work on voluntary sharing of data with researchers and independent experts to analyze what's going on inside companies or just to understand the online communication environment better. We've also seen a number of instances of researchers engaging in self-help and simply getting access to data that is technically available to them, even without the particular cooperation of the company. So all of these efforts at transparency aligned pretty well with the relatively hands-off approach to online services that many countries around the world have taken since the late 90s.

But it's 2021, and around the world most governments are reconsidering their legal frameworks around online intermediaries, whether that's liability for potentially illegal content, or trying to craft regulations about what to do with harmful or so-called lawful-but-awful content, and in these discussions transparency ends up being a big part of the conversation. So we see right now, in countries around the world, proposals that would actually require companies to produce transparency reports or to make certain information about content removal decisions available to users, including notifying users of exactly why their content was removed from any given service. We've also seen, as the previous panel was just discussing, a lot of work on removing legal barriers to the ability of researchers to access data held by companies, or even on requiring that companies provide data to vetted researchers through different mechanisms. And we've seen real development of the question of third-party assessments or audits of company practices, either against the risks that those products and services might pose to human rights, or against more nebulous ideas about whether a company's services might cause harm that is not illegal in a given country. So all of these different kinds of transparency are very much in the mix, but every country or region is approaching them differently, with different legal frameworks and different attitudes toward the question. Yet because this is still, for the most part, a globally interconnected network, the way one country answers these questions is probably going to have ramifications for people worldwide. So to help us unpack this topic of what could be mandated in law about transparency, and to explore what the ramifications of different kinds of laws might be, we've got a really stellar set of panelists.
First up we'll have Eric Goldman, who is a professor of law at Santa Clara University, where he is also the associate dean of research and the co-director of the High Tech Law Institute. You may also know him from his fantastic, long-running Technology and Marketing Law Blog; I don't think I got the title quite right, I usually just call it Eric Goldman's blog. Next we have Agustina, who is the director of the Center for Studies on Freedom of Expression and Access to Information in Argentina. She is also the vice chair of the Global Network Initiative. Barbara is the senior director for law and policy at Article 19, the global freedom of expression advocacy organization. She has had a distinguished career as a human rights litigator and has initiated around 50 cases before the European Court of Human Rights. And finally we'll be joined by Mishi, who is the founder of the Software Freedom Law Center, the premier nonprofit organization representing the rights of Internet users in India. Mishi practices law in India and in the United States. To get us started, I'm going to come first to Eric. Eric, as you know, the US Congress is considering a variety of legislative proposals that would mandate some aspects of transparency, and several US states have enacted state laws around transparency. Based on your research, could transparency be mandated in the United States consistent with the First Amendment?

SPEAKER: Yeah, thank you Emma, and thank you to CDT and the Charles Koch Foundation for putting this event together and convening such a great conversation. It's a great honor to be a part of it. I have been researching what the First Amendment says about the ability to impose what I call editorial transparency, which can really take a variety of forms, as Emma was indicating. I divide it into four categories. The first category is disclosure about the editorial policies that the service is following. These can be things like: tell us your rules, tell us your house rules, tell us your content moderation guidelines, and mandating that those are disclosed. Many companies already disclose them, but the regulators probably want more. The second category is what Emma was referring to as explanations: individual decisions being explained, why they're being made. You might have the broad policies, but then somebody might want an explanation of why this particular decision was made. Third is the category of statistical reports, things like the classic transparency reports: tell us the numbers about various categories of activities. And there's a fourth one, one that I think gets overlooked and really is the crux of the matter, what I call source data disclosure. This is disclosure about what the service saw that allowed it to make the judgments that led to any of the other three disclosures. Or it could just be a compulsion to disclose it even if there's no other disclosure requirement.
It's basically saying: if you're going to tell us these are your policies, give us the data that you're seeing so we can see if that's actually your policy. Or tell us what data you saw so we can understand why you chose that explanation. Or tell us what data you saw so we can double-check your statistical calculations. And it's this latter piece that I think people are glossing over. How will any of these disclosures be enforced by government regulators, to make sure they're accurate or validated? It puts the regulators in the shoes of the Internet services to see what they're doing, to double-check their work and really to second-guess their work, as well as to weaponize the pieces they think are most extreme and to cherry-pick a story they want to tell from the data. That offends the First Amendment in a number of ways, and I think it really gets down to this notion of entanglement with the editorial process of the Internet service. It puts the government regulators in the shoes of the editor and says: you're going to double-check every single editorial decision and see if you would have agreed. I don't think the First Amendment allows that. Now, for those of you outside the United States, you might be saying, okay, that's the US's wacky First Amendment, we don't have that here. But I encourage you to think very carefully about these source data disclosure reports, how they are going to be validated, and think about how that could be weaponized in your country and abused by the regulators who will be doing the enforcement work.

SPEAKER: Thank you Eric. That was, yeah, I think a really great way to focus in on some of the questions we have about what these mandates might look like. I want to bring in Agustina to talk with us a little bit. So in Argentina and Latin America, how have transparency requirements already been incorporated into law, and along the lines of Eric's question, do you think those kinds of requirements could be mandated for content moderation activity?

SPEAKER: Hi Emma, hi Eric, hi everyone. Thank you so much, it's been such an interesting three days of this conference, and a lot of very useful information. So thank you very much for the honor of being here. The question Eric ended on is the one I kept thinking about after our prep call last week, and there are a number of things that we thought of based on that question. What kinds of information, and where do our sources for information mandates come from? There are very different sources and foundations for demanding transparency in our law. For one, there's the inter-American system, which in Article 1.1 establishes a duty to protect,
and to prevent, a duty to guarantee human rights and prevent violations of human rights, and that has been interpreted as an obligation on the state to organize its entire legal structure in order to prevent abuse of those human rights and to guarantee the effectiveness and enjoyment of those rights. Then there's Article 13 of the American Convention on Human Rights, which also prevents states from allowing private or indirect restrictions on freedom of expression, and it includes a couple of examples. The examples are of course dated by now, but they do include monopolies, for example: the establishment of monopolies by law, or the inability to fight monopolies with law. Those could also be interpreted as violations of Article 13. So based on those duties to guarantee and to prevent, you can draw certain abilities for the authorities to ask for and demand transparency.

On a constitutional level, here in Argentina and comparatively, I think, in the entire region, though I'd ask anyone who has different information to supply it, the application of constitutional law, at least in most Latin American countries, is more horizontal than vertical, unlike in the US. So human rights obligations, or civil rights obligations under constitutional law, are mandatory not only vis-a-vis the state but also between private citizens. The obligations of nondiscrimination, for example, are applicable to a number of different actors, not only the state. Following that same logic, when the state hears arguments alleging freedom of speech violations via a private actor, the state does have an ability to demand information to substantiate that claim.

A third one is access to information law. There are a number of access to information laws across the region, and some of them include access to information held by private companies when those companies are performing a public interest function, which is interesting. This has mostly been applied to companies that have some participation of the state among their shareholders and whatnot, but the language in a lot of those access to information laws could very well justify access to information requirements for companies if the companies are deemed to be performing a public interest function. Finally, consumer protection laws: terms of service, how those terms of service are applied, and the processes within the companies to enforce those terms of service could very much fit within a lot of the consumer protection laws that we have in Latin America, that many of the different jurisdictions have.

Having said that, I think there's a radical difference that needs to be acknowledged. You talk about editorial freedoms, and I think that's great. I don't think anyone in Latin America would speak of companies as having editorial freedoms.
They could have commercial freedoms, and they could make other kinds of arguments, but we have never seen a company argue its editorial freedoms in Latin America. I think the reason behind that is that, as you probably know, in Latin America none of our countries has a specific intermediary liability regime. We are still under the traditional civil law responsibility mechanism, and in the interpretation the courts have made of that regime, they have always viewed search engines and social media and whatnot not as editors but as hosts and organizers of third-party information, and that's why we have the intermediary liability jurisprudence that we have. Had they seen them as editors, we probably would have a lot of decisions finding companies liable for everything that is illegal and harmful on their websites.

SPEAKER: That's good, thank you so much. Those are really interesting distinctions between the US and the general Latin American approach to these issues, especially the absence of a specific liability framework, but also the conclusion the courts have come to across Latin America of trying not to hold the intermediaries liable for everything, but doing it by categorizing them differently from editors as a way to get around the editorial discretion or liability questions that, in the US for example, Section 230 answers by statute. That's fascinating; I'd love to dig into it more. Barbara, in Europe transparency is obviously a major aspect of the Digital Services Act, the revision to the intermediary framework being negotiated right now. We heard yesterday from a member of the European Parliament about some of the transparency provisions in the DSA, and countries including Germany have already enacted various transparency obligations. So can you tell us, how is the legitimacy of transparency mandates being discussed in Europe, or is it being discussed in Europe at all?

SPEAKER: Sorry everyone, hello, I was muted. Great to see you all. Already the discussion shows how important it is to have different perspectives and consider different contexts and legislative frameworks. Going back to your question, to be honest I don't think we have a lot of discussion about the legitimacy of these transparency mandates, because I think in Europe there is pretty much agreement that self-regulatory transparency frameworks are ineffective, incomplete, unmethodical, and unreliable. And I think there is a general acceptance that government-mandated transparency or accountability regulation will happen in the near future or has already happened.
This is also in the context of how transparent governments themselves are in their own conduct, what they are doing with transparency around the number of reports or referrals, while, as you said, they are pushing for increasing sector transparency in several legislative proposals. You mentioned the DSA, but there's also the proposed regulation on online terrorist content; in the UK we have the Online Safety Bill, previously called Online Harms; in Germany you mentioned the NetzDG; and in France there was also a proposal. All of these mandate transparency reporting, and this legislation, whether being put forward or already adopted, is widely criticized on freedom of expression grounds, but even the harshest critics of the legislation support the transparency provisions. In those laws we have also seen, in Germany, some quite important transparency: the reporting gave a lot of detail about the operational procedures of those companies, from staffing and training to the number of employees and all kinds of things, how many contractors they employ or what sort of support services they have, but also a breakdown of content removals by criminal code provision, and a detailed breakdown of the time taken for those content removals, and so on. So this has been recognized. On the other hand, I need to echo what Eric said in his introduction: there are very great limitations to what this data provides, and there are a lot of complaints that the transparency reports don't provide sufficient detail to be useful. There is also recognition that these transparency disclosures lack auditing, and that they lack credibility and accuracy, because a subsequent judgment about the nature of an organizational procedure is hard to verify.

But again, back to your question: I think that in Europe the discussion has moved, and this is my assessment, from the abstract concept to a focus on tailored transparency initiatives in specific areas, and I'm happy to give some examples of what those specific areas are, because this is also among Article 19's proposals for what these new regulatory schemes governing platforms should include and how obligations should be clearly defined. We have argued that the types of measures for transparency requirements should include obligations around the curation of content: companies should provide essential information and explain to the public how their algorithms are used, how they are used to present, promote, and demote content, with some granularity. They should also explain how they target users with promotional content, whether on their own initiative or on behalf of third parties. The second area is transparency around terms of service and community standards, not just publishing those standards but doing so in a way that is easy to understand, and giving case-law-style examples of how they are applied.
This is also proposed in some of those country legislations which I mentioned. The third area those transparency requirements would cover is the human and technological resources used to ensure compliance. This is information about issues such as trusted flagger schemes, how flaggers are selected, and whether they have any privileges attached to their status; also how the algorithms operate to detect illegal or allegedly harmful content under the community standards, and the false negatives and false positives, and so on. Another area is decision making: companies should notify affected parties of how their decisions affect them and give reasons for their actions, and so on. Then there is the degree of transparency in reports: what the companies are obligated to publish and in what detail, and how they should distinguish between removal requests from governments and from third parties, and so on. We can go into greater detail on what these proposals include in terms of types of transparency reports. And then an important area, which is what Eric mentioned and we can discuss further, is algorithmic transparency, audits, and verification of this transparency reporting: how access to data sets should be given to regulators, but also to independent researchers, and in some ways journalists or academics or others, so that they can verify that company systems are operating the way the company claims. What can also be considered are publicly accessible archives of political advertisement, where legislators are increasingly asking for such archiving; for example in France and in the UK these are being discussed. Those are some examples of the requirements put forward in those legislations and the EU proposals for transparency requirements. Obviously the issue is whether any of this can be audited, and we can speak about those questions in a later discussion.

SPEAKER: Great, thank you so much Barbara. Now I also want to bring in Mishi to talk with us a little about the transparency obligations that feature in, and I'm going to get the full name of this wrong, the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules that were enacted earlier this year in India. Can you tell us a little bit about how the intermediary rules address transparency, and how they fit into the broader debates around online content regulation in India?

SPEAKER: Thank you for having me, and thank you for bringing this conversation together. We don't have to go through the basics, but everybody's having a great time learning all these details, which rarely happens. I am going to resist putting on my US lawyer hat and not comment on any of the great stuff that is happening in Florida, where I just came from, or Texas; we'll just talk about India.
I thought you liked tongue twisters, but you did get the name right. For brevity's sake we just call them the IT Rules; we've seen so many iterations of them since 2011, so we stick with that. They are a kitchen sink of everything: online news agencies are now also swept together under these rules. There's no suspense, because there is a transparency requirement under these rules. Rule 4(1)(d) applies to significant social media intermediaries, which is a defined term: if you have above a certain number of users you become significant enough, and all the large platforms fall under that, because India is a country where the numbers boggle everybody's mind and even a small percentage of the population makes you significant. So they are required to publish monthly compliance reports. In these reports they are required to mention the details of the complaints received, the action taken thereon, and the number of specific communication links or parts of communications that the platform has removed or disabled access to through proactive monitoring.

I am going to say a few things. I understand this is more granular, and I would love to talk about specific aspects, but I think all these issues are not independent of the political economy in which they operate, and if we forget that part of it, then we forget how each of these countries is actually looking at this legislation and trying to come up with various mechanisms, ostensibly to protect the user. Because everybody's out there to protect the user, but in this ultimate struggle the loser is always the user, that is, the people who we are. And that's an important similarity wherever this territorial struggle between the platforms and the governments is happening. In Parliament, after a lot of -- the final report was presented today, the JPC report on the data protection bill. You would think it's only about data protection, but it has a lot of stuff about content moderation as well. That's why I say the political context of all of this becomes really, really important here. There are some distinctive features to be observed in India's case, but they are also very important, because social media companies are coming to the end of their honeymoon with the world's democratic governments. That's why we're seeing a big push in this direction. And we've also seen that no matter how hard governments bear down, no matter how much platforms spend on lawyers and PR to wear the states down, in the end, faced with the final offer, companies always fold. Not since Google in China in 2010, a decision it has always regretted, has any platform been willing to forgo a national or supranational market.
It is in that context that the IT Rules arrive, and the idea is: we don't know much about what you're doing, so why don't you start publishing monthly reports. We've seen some reports. Twitter, and it's characteristic, we cannot decide whether it wants to comply with the law or doesn't want to comply with the law, or what its actual strategy is, but maybe with an Indian CEO at the helm of affairs we'll have some more compliance; we have some numbers from Twitter. Google, on the other hand, did release a report, with a caveat saying, this is what you require of us, your transparency requirements are only numerical, and because you're asking us to do it every month we can't manage it, because the data processing and validation just won't work in that time, so there will be a two-month lag in reporting. That also tells you how the compliance requirements for transparency are written. And I'm only concentrating on the statistical reports, not the other ones Eric laid out, about telling the users and so on, which are already mandated. What we learned from those reports, basically, was that, as always, the movie industry is the most powerful set of people all around the world. We can all talk about our own free speech rights, but Disney gets its way. We learned that 98 percent of the content removed at that time was about copyright, then trademark, and then counterfeiting. The interesting bit is that Google also came up with its own voluntary report, which tells us how many times the government of India asked them to remove some content. These are "please take down" requests. There are also information requests: the police just say, okay, we're investigating something and we don't like some person or some people here, give us some information. These come in the form of something similar to a subpoena but actually a little different, and the requests, instead of just asking for information, also say we want you to block these handles or we want you to take down this content. Facebook also came up with a report, and that also tells us a little bit about what's going on. The voluntary disclosures told us that the Indian government issued 43,300 orders for government takedowns. Facebook and Instagram released a combined compliance report, and WhatsApp also declared that only 345 complaints were received through the grievance officer appointed under these rules, and that it took action on 63 of those complaints. But other data told us that at least two million accounts were banned in a span of 30 days. We have no information about why or how that was done. But overall, the larger picture I'm going to point to is that there is already a mandate in India on these transparency rules under the IT Act.
The free speech and expression debate, just like in the US, is one the platform companies have tried many times to co-opt, trying to get into the intricacies of how they also have rights. But it's mostly just the state, and we are operating in a gray area somewhere, where the companies sometimes want to claim the benefits of, say, proper intermediary liability, and at other times want to pretend and say, oh, we are editorial, we are like newspapers, so protect us, just like the briefs they filed in Florida and Texas. But because in India it's pretty clear that if you have safe harbor you will have to take down content if mandated by the state, they err on that side and remove content. The third thing is, at least in India, and we've seen this in many parts of the world, bad ideas get copied very quickly. India is a very popular place for people looking at how to emulate what India is doing, across a big part of the rest of the global south as well, where perhaps US- and European-style judicial systems don't exist in the same way. And why I say that is because I think the governments are deliberately gerrymandering the laws and how they operate: they include certain things and leave out others. In the way the IT Rules are written, the government has cleverly said, just give us the numbers, and you never have to say when we are the ones asking you to take down content or we are the ones asking you to reveal information. So overall, there's not much discussion about why transparency, what we are trying to accomplish here, and what harms we are trying to prevent by imposing all these disclosure obligations. I think the motivation is: platforms are wreaking havoc with our societies, we need to hold them accountable. But, at least in India, it is cleverly only "hold them accountable so we can pretend," as long as the political parties and the leaders are left out of it; that's how we want to play the entire game. And companies are more than happy to play this game. That's why I believe that no matter how big a market is, and I don't need to quote numbers to people about how important the Indian market is, they're leaving the user completely out. So the discussions, when they happen in my other life, about the First Amendment and user rights and everything, seem so far away from the kind of behavior the companies exhibit in our part of the world that sometimes one has a very difficult time defending the companies' intermediary liability position as well. The rules are currently under challenge in various parts of India on various grounds, but I don't see the transparency requirement going away. How effective it will be, and what we will do with that data, is something we can discuss later.
SPEAKER: That was great, thank you Mishi, and such a great illustration of how bound up these questions of transparency are in all of these other questions around intermediary regulation. I know one thing we've seen: CDT has filed briefs in the challenges to the Florida and Texas laws here in the United States, which place various obligations on intermediaries, including things like essentially must-carry provisions, or provisions saying online service providers must host the speech of politicians or political candidates, some of which seem pretty clearly unconstitutional under First Amendment law. But a lot of it comes back to this question of how courts are looking at the intermediaries. Are they like newspapers? That's a lot of the framing we're seeing in the US. Are they like newspapers exercising editorial discretion over a letters-to-the-editor section, or are they mere conduits, something like the telephone system that must carry the speech of all comers, or are they somewhere in between? So it was great to hear from all of you about the perspectives you're taking, Eric clearly with the editorial discretion frame, Agustina talking about the other kinds of obligations to support the expression of individuals. I'm curious, though, where you come down on these questions of just how much editorial discretion intermediaries are exercising when they do things like content moderation, and whether you see any risk that different kinds of transparency mandates could constrain that editorial discretion, or ultimately put a thumb on the scale of how companies decide what content to take down and what user speech to leave up. Barbara, I see you're off mute, would you like to comment on that?

SPEAKER: I'm not sure I understand what exactly the question is.

SPEAKER: I can clarify. I'm really trying to look at what I think is one of the risks we've heard about transparency mandates, which is that the mandates themselves, the requirements to disclose information about how a company is doing content moderation, will affect the kinds of content moderation decisions they make, including making them more likely to take down certain speech, so that, for example, they can post bigger numbers for the amount of hate speech they've removed, or for their responsiveness to government demands for content removal. I'm curious whether this seems like a threat, whether this seems like an issue, or whether it's just not something that you think is part of the consideration here.

SPEAKER: So you mean, sorry, that the companies, given the transparency mandates, will be more able to game the system?

SPEAKER: Yeah, either that companies will be able to game the system, or that in the interest of getting a good grade on their transparency report, of complying with the obligation under law, it will affect what speech they allow on their services.
SPEAKER: Yeah, to be honest I haven't come across this particular concern, but I think there are concerns that there is, to a greater extent, not a misplaced but an overestimated expectation that transparency around some of these practices will lead to additional accountability measures, and that disclosure of some sort of practice alone will subsequently lead to well-intended action to resolve the issue or the risk identified. And there is recognition that there will be limitations to the transparency. There might even be unintended consequences such as you are describing, or a misplaced emphasis on one risk of harm rather than on other areas, or questions about whether the link between a transparency initiative and a specific outcome can be attributed and measured. But to be honest, I haven't come across the particular concern you just described. The concerns we are discussing are slightly different, but maybe I just haven't come across it.

SPEAKER: Could I come in? Thank you. I want to make two points about what's taking place in the US that I think are responsive to what Emma is saying about transparency being used to shape editorial decisions by the Internet services. Some of you may be familiar with the litigation taking place in California, where Twitter is suing the Texas attorney general over an open investigation that Paxton launched after Twitter deplatformed Trump. Paxton made it very clear that he wanted to punish Twitter for its deplatforming decision, so he opened an investigation that said, basically, tell me everything about your content moderation policies, and he also made it, quote, of a continuing nature. What that means is it's an ongoing operation: as Twitter continues to make new decisions, in theory they're supposed to send a copy over to Paxton so he can take a look as well. Now, Paxton has made his retaliatory intent clear. He wants to punish Twitter for having deplatformed Trump. So he's going to use all the information that's gathered to try to find the most punishable material, and if he can't find anything to punish, he's going to use other means, like shaming or publicity, to try to change Twitter's behavior, and Twitter has said in its briefings that they're getting the message. They know that if they want to get Paxton to stop breathing down their neck, they should just do what he wants. We can actually see how these ongoing investigations, where they're demanding access to the good stuff, are really a form of control. And I think that's a preview of what we're going to see: with a lot of these transparency reports, governments are going to feast on the data disclosures and use them for the most illegitimate purposes a government can engage in. That's just the way they work. The other piece I'll mention is that we're seeing some litigation now here in the United States over explanations.
So a service will provide an explanation for its decision, and then the user will say, no, that's not right, now I'm going to sue you; give me discovery so I can look at all the other decisions you've made, and I'm going to second-guess all of them, because clearly that could not have been the right explanation. And they're framing claims like defamation: if you're explaining it this way, you're actually defaming me by harming my reputation. I'd like to think those lawsuits are going to fail, but there's no doubt the services know that (a) we'd prefer not to provide explanations, (b) if we're required to provide them, we're going to provide them in the most generic way possible, or (c) we're not going to make the decision at all, so that we know we won't get sued over the explanation we'd have to provide. So we are absolutely seeing this shaping and bending of the editorial policies of the services through mechanisms where transparency is the hook that creates liability. Agustina, did you want to comment?

SPEAKER: Yeah, I did. I certainly understand where Eric is coming from, and I agree there is room for abuse. But I also think, at this point, there's a legitimate concern over the power of companies to limit freedom of expression and public discourse, and not just for the Trumps of this world, but for everyday people, for vulnerable groups that depend on these platforms to communicate and to get their message out there. I mean, they are exercising a great amount of power, and they have a very impressive machinery to design and enforce their terms of service, and that's all fine. They are private companies and they can have their terms of service, but those terms of service should comply with the law; they cannot be outside the law. And there should be a means to ascertain whether those terms of service are within the scope of the law. Like I said, in Latin America at least, you cannot have terms of service that are manifestly unconstitutional. That should not happen; it gives anyone a cause of action under constitutional law. And you would not be able to verify that if the terms of service were not publicly available. I mean, I wouldn't necessarily frame the power of companies as editorial decisions. Again, I know this is a very US way of framing it, and I understand it, but I think by framing it as editorial decisions we might be putting in the same bowl a number of things that, in my mind at least, are and should remain very separate and very different. As far as I understand, terms of service, at least under Latin American comparative constitutional law, are more like commercial speech than First Amendment-protected speech, if that makes sense. And under civil liability regimes, I would like it to stay that way.

SPEAKER: Sorry, Mishi, would you like to go ahead?

SPEAKER: Sure.
I think this is an interesting example of the larger thing going on these days, in which platforms claim rights for themselves in ways that are extraordinarily powerful as tools against any government regulation, but which have nothing to do with the speech rights of individuals at all. I am in agreement with Eric that when you see, whether it's Greg Abbott of Texas or Ron DeSantis, notorious for many other important things beyond this, such as the current pandemic, saying that these state laws are intended to stop the companies from silencing conservative viewpoints and ideas, one sees that these laws reflect that kind of intent. And this is especially true, as I said, of Florida, where at least the assumption is that there are companies who harbor liberal sympathies, and then Disney is excluded completely because it has all these great properties in Florida, and that's how the law is written. So if that's the example, and because we're all lawyers and we love specifics, and each regulation has to be dealt with in its specifics, then the companies are correct that these laws discriminate against certain platforms based on their political views. But what's concerning is that these companies are now using that to go much further and to say, oh my god, you're using transparency, you're asking us all of these things, and claiming very broad protections because they exercise some editorial judgment, however minimally. You can't have your cake and eat it too. You can't both say "we are never exercising editorial judgment, we need intermediary liability and safe harbors" and "give us the protection you give newspapers." And while they may sometimes make certain decisions, they are different from newspapers in important ways and similar in some ways, because platforms are carriers of somebody else's speech rather than their own, and they can't exercise the same kind of curatorial control over everything they publish, because they don't want to take responsibility for all of it. And so I would say that companies making this argument will make it almost impossible for any legislature to enact even a carefully drawn law, because anything can be argued on one side as "you're taking away my rights here." And in countries like ours, I think there's now enough data to say that no matter what claims they make, there are not enough resources actually being deployed to countries that are not the US or the European Union. That's just the truth of the matter. Whether it's what's happening on WhatsApp in India, the companies are like, yeah, not our problem. Or, unlike here where we concentrate so much on Trump, in India there's now enough information in the public domain to see that when it comes to the political party in power, the platforms just don't want to do anything. They don't want to take down content.
So, and I began by saying that the Hollywood industry is so much more powerful, because over the years we have understood that copyright, trademark, counterfeiting, and so on we can define easily, but these other things, hate speech and misinformation, are not so easy to define, and that's why we've played into the hands of the argument that platforms shouldn't be adjudicators and decision makers about what should and shouldn't be allowed. In terms of transparency, in the US, just maintaining records of what you removed: nobody, at least I haven't seen many, objects to that. I know the Digital Services Act has made certain proposals, but I think a lot of it is about numerical data. Automobile junkyards keep a record of parts and automobiles in the United States, and we have specialized warrants that are possible there too, and there we don't claim, oh my god, that we are robbing the junkyard of its rights of free speech and expression. There could be certain things, and I know every law has to be examined on its specifics, and that's fine; I am somewhat in agreement about the Texas cases, but I don't think that carries everywhere, to say transparency is going to be used in a certain way, because the companies don't come with clean hands here, and neither do the governments in many parts of the world, including the government of my birth country.

SPEAKER: And it does seem like a lot of this conversation is somewhat related to that tension of how to keep both governments and companies in check. Eric gave some illustrations of ways we've seen, in the US at least, governments using transparency efforts to potentially circumvent the limitations the First Amendment places on their own activity. I could keep going down this rabbit hole, but I do want to get to at least one of the questions from our audience Q&A. We have a very practical question from David Sullivan about what sort of information you think could be mandated to be disclosed to regulators, and just regulators, rather than the public, to meaningfully explain content policies, practices, and decisions. If you're familiar with how this works under the GDPR, do data protection laws about disclosure to authorities, and not necessarily the public, offer a model worth considering? Would anyone like to have a crack at that one? Yeah, Agustina.

SPEAKER: So this is the one distinction I asked the first panel about, on Tuesday I think it was, or Wednesday. I think there are two concepts we should bear in mind. One is access to the information that's on the platforms for research, for example research on the social issues we're concerned about the platforms dealing with: what exactly is the phenomenon we're seeing online, understanding that issue, understanding hate speech as a whole.
That requires access to information within the platform. The other is transparency around the actions of the company, not around the information contained on the platform. And I think those two are a lot of the time being mushed together in a number of conversations. On transparency about the companies, I think there are a number of models that could be followed. There's data protection. Financial institutions have had mandated transparency for the longest time. Pharmaceuticals have had mandated transparency for a long time as well. There are different industries that already have certain transparency mandates, and environmental transparency obligations are also there as a model we could use. I think there are two key questions here. One is that I'm not sure we know what information we need, as opposed to the financial sector, the environmental sector, the banking sector, or the data protection sector; I'm not sure we have our questions as clear as those other industries do. The other is a point Eric mentioned at the beginning: how are we going to supervise this thing? What kind of supervisory mechanisms do we need? We don't want the government second-guessing the companies, as Eric said, but then again, what kind of oversight are we thinking of imposing to follow up on these transparency obligations? I think those are the two questions. And on supervision, I don't think the data protection model is a good example.

SPEAKER: Barbara, did you want to come in really quickly on that?

SPEAKER: Yeah, just a very quick one. On what the regulator can require from the companies, I think a very interesting experiment was done by France: they had a group from the regulatory agency embedded in Facebook for three months, and they were supposed to observe the practices and then come up with a regulation proposal for France, and after three months they actually came back and said they had no idea what to ask for. I mean, they didn't say it in those very words, but basically they said that even after being embedded within Facebook for three months, they still didn't have clarity on what the regulator could require as compared to what should be in the public reports. So I don't think we can settle it in the two minutes we have left. But the UK Online Safety Bill, while problematic, has transparency requirements, and the joint committee that reported early this week actually came up with quite an interesting list of the requirements that could be disclosed to regulators as compared to public transparency reports; you can look into that. But overall they say we need strong data protection legislation, and the GDPR is not perfect but better than nothing at the moment. So a combination of those would be useful.

SPEAKER: Do I have one minute to contribute here?
SPEAKER: Please do, yeah.

SPEAKER: So my head was in a completely different place than the questioner's, because for me the government is a major vector of attack. Thinking about disclosing information only to the government and not to the public actually misapprehends the threat. We know governments already view services as honeypots of information, and things like the Twitter versus Paxton case show how governments have weaponized the ability to get access in order to achieve their censorial goals. So for me, I would actually invert the question and ask what information we can make sure the government doesn't get, knowing that they are going to be the vector of attack we need to be potentially most concerned about. And then, at that point, if the information is available, why shouldn't it be available to the public at large, so we can help monitor the government's activity?

SPEAKER: And Mishi, this has turned into our closing thoughts round, so I do want to make sure that you have time for a last word.

SPEAKER: I wish this were longer, because there are elements where I agree with all three of them and elements where I can't, but I'm going to take what Eric said. India's federal government is like the Texas or Florida government right now, and I will say that really does make a difference. I'll give a recent example, though it involves privileged information I can't really disclose. One of India's state police forces is acting basically against citizens over a certain thing. An organization is based in the United States, incorporated here, everything it does has to do with the US, and the Indian police sends an information demand and a content takedown request to Twitter here, and Twitter just says, hey, you know what, the Indian police is asking for this, we're just going to disclose it because we're obligated by law to do that. No, you're not. There is no extraterritorial jurisdiction such that people in the US have now lost the right to speak here because the Indian police decided something. That's just an example of what we're asking about: what information we don't want to give governments, because they already get enough in terms of power over content takedowns, arm twisting, and the censorship by proxy they already practice. But I will also say these problems are hard, and the ways of thinking we are so used to, in terms of publishers and newspapers, are not going to give us the right solutions. The platforms are not like the paper, and they're not like what we thought they were when this entire thing was starting. So to expect that solutions will emerge from some template, because we think in common law terms or because that's how we've thought about it for so long, is not going to work. And each country is going to be very different.
I get really bothered by the dominant First Amendment thinking in all of these discussions, as much as I love the First Amendment, but I think that point is correct. The co-opting of the First Amendment, and also, for us as civil society members, having played into it for so long, saying they're taking away our speech so we'll defend your action: that approach has to stop. The charade always ends up with the government and the companies making some kind of deal, and then we all call for pictures at the end of it, and that has to stop. These issues are very complicated, and there are very smart people working on them, so there will be solutions, but this simplistic way of thinking about it, you banned Trump so you're not going to do this, or we're going to come up with a law and now we don't want any of the other information, like the example Barbara gave: it's hard, but we also look into the black hole and try to get something out when we can't even decide what we want. So to begin with, having some kind of numbers transparency I still think is a good idea.

SPEAKER: Well, thank you all so much. I also wish we had booked this session for at least 90 minutes; there's so much more we could dig into. For one, I am thinking very much about government transparency in all of this, whether that's in what governments are already doing to demand content removal, either under law or extralegally, but especially, as we see transparency mandates come into play in different countries around the world, what transparency into the government side of that activity we can get as well, because that seems like it will be a really necessary safeguard for these different systems being put in place around the world. I also really wanted to get everyone's thoughts on what we do if and when we have multiple different transparency mandates at once, but maybe that's just going to be a problem for the companies to solve; I think they will probably need and want some suggestions on how to approach that too. So I want to thank Agustina, Mishi, Barbara, and Eric for this excellent discussion, and thank you to all the speakers who have joined us for the Future of Speech Online these past three days. To all the attendees who have joined, and some of you I think have been to all six panels, you get the gold star. A big thank you to our sponsor, the Charles Koch Institute, and thank you to all my CDT colleagues for making this event an interesting and fun tour through the many, many different things that transparency can mean. We thank you all so much. I am sure we will be seeing you again at a Future of Speech Online next year, maybe even in person, who knows. But until then, I hope you all have a really wonderful end of the year. Happy holidays to everybody, and thank you all once again.