Jonathan Strickland: Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are ya? A couple of years ago Facebook, which of course later became Meta, announced its intention to create an Oversight Board for the purpose of reviewing Facebook's content moderation practices. The company was, and still is, under a ton of pressure to respond to various issues, ranging from disinformation campaigns to the proliferation of hate speech. Unfortunately, Facebook's responses were frequently criticized as insufficient, as were the company's policies, which had noticeable gaps and transparency issues. Thus the creation of the Oversight Board. And the Board, while deriving funds from a trust that was created by Facebook, is independent of Meta itself. I had the chance to speak with Dex Hunter-Torricke, who heads up the communications department at the Oversight Board. We talked about the Board's mission, we talked about its history, and we talked about how it tackles the challenge of picking and deciding cases that have the potential to impact millions or even billions of users. What follows is my conversation with him. Enjoy.

Dex, welcome to TechStuff. It is a pleasure to have you on the show. I'm really excited to talk to you, because the Facebook Oversight Board is something I've talked about a few times since it first began to be a thing a few years ago. But I feel like a lot of people, and I include myself here, don't have a full understanding of what the board actually does, why it's there, what necessitated its creation, and the process it goes through while determining the outcome of various cases. So I'm glad to have you here to pull back the veil of mystery, because I think it's very easy for those of us in the media to, even unconsciously, misrepresent what's happening.
Dex Hunter-Torricke: Sure. When you hear about the board, you think of a group of experts who are focused on things like content moderation standards and Facebook's community standards, the rules that govern the platforms. It's something that I think a lot of people skip past. It's almost like when you're installing an app on your iPhone: you get those disclaimers, and nobody reads them, you just click yes. It's something that's incredibly complex, and it sounds quite dry, and it is. And because of that, we naturally gravitate to the more exciting questions: how should social media be regulated, what are the structural issues within the industry, things we can have lots of points of view on. The truth is, those rules that Facebook, and now Meta of course, have constructed for platforms like Facebook and Instagram have an enormous impact on free speech and human rights around the world. They govern almost every aspect of online discourse today on any given topic. So we really see the value of the Board as an institution that is focused on scrutinizing those rules, understanding their impact on people's lives, and then working out how to improve them so that we better defend free expression and human rights. That is an incredibly valuable mission.

I think the other thing to think about with the Board is that we are only a small part of the solution to the much larger challenges occurring all across the tech industry, across social media, and in particular with Meta. Today, when we have such short attention spans and the world is in such crisis, we often want a quick solution to all of these problems. The board is not a quick solution to those problems, and it will only ever be a small part of the overall solutions that we need.
But even though it's only part of the solution, I think it can still be incredibly valuable. So that's how I go about describing the impact of the board and its purpose in the world.

Jonathan: Yeah, and as you mentioned, the world is in crisis. There are crises all over the world, and we often see social networks being pulled into them, whether you're talking about Myanmar, or Russia and Ukraine, or India, or even here in the United States. There are famous cases all around the world where you have this delicate balancing act: how do you best serve the customers of Facebook who are actually using the product, or, depending on your point of view, the product of Facebook? And then how do you also make sure that you're not violating any laws by upholding a certain policy? That, to me, is probably the most complicated part of the picture, because I think a lot of people can have a knee-jerk reaction when they see a piece of content being removed, or a piece of content staying up when it seems controversial. Without a full understanding of the context, you might not have an appreciation of why that move was made. And then, of course, the Oversight Board can step in when people have appealed an action that Facebook has taken, to make a judgment over whether or not that was the right call. Am I more or less getting it right?

Dex: Absolutely, and I think you hit the nail on the head. These are incredibly challenging issues where there isn't always an easy right answer, and without the context you certainly can't make an informed decision about them. So I'll give you a very specific example.
One of the early cases the board took on when we started operating in twenty twenty-one was around speech relating to COVID misinformation. There was a piece of content that had been taken down by then-Facebook from their platform under their COVID misinformation rules, and when the board looked at it, we actually found it was perfectly legitimate speech. It wasn't something we thought had the potential of leading to imminent harm. It was a legitimate debate about public health policy and how authorities should be responding to COVID in France. These are extraordinarily thorny questions. If you have something that seems like it might be in the COVID misinformation space, given how sensitive that topic is and given the enormity of the public health emergency, you might as a platform just err on the side of caution and take it down. But if you do that, and if you do that a lot, and you have that cumulative impact on freedom of expression, we are muzzling the ability of communities to have informed debates about these issues, and we need to be able to have those. It is perfectly legitimate to have debates about how, as a society, we should respond to things such as COVID.

Another example, just from the last few months: we had a case that involved someone in Sweden who had posted about sexual abuse of minors. This is obviously extraordinarily sensitive, and this piece of content was taken down. It was found to be very graphic and deeply unpleasant. But at the same time as being deeply unpleasant and very hard to read, it was something that raised legitimate questions about sentencing and how people who are involved in these kinds of crimes should be treated as a society.
So again, this is something that is important for people to be able to have discussions about, in spite of how difficult some of these things are to read. The board is constantly looking to navigate these issues, and we judge each piece of content against several things. First, how does the content interact with Meta's own rules and standards, the stated rules that apply on the platforms: is the content compatible with them? Second, we look at the values that the company says it abides by. Sometimes there's tension between the values the company says it abides by and the actual rules it has come up with, and we look to clarify that tension. The third piece, which I think is the most interesting and important angle for the board, is how the content, and the decisions made about that content, relate to international human rights standards. There have been decades of work by experts and leaders across the ecosystem to codify human rights standards and norms: how we think it is acceptable to treat people and communities in a way that protects their freedom of expression and their human rights. The board is constantly going back to that body of knowledge to try and make decisions in a way that is principled and ultimately puts those people first.
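To make that three-part review concrete, here is a minimal sketch in Python of how such a rubric could be modeled. The class, field names, and checks are illustrative assumptions, not the board's actual criteria document or tooling.

```python
from dataclasses import dataclass

# Hypothetical model of the three-part review described above:
# (1) Meta's written rules, (2) Meta's stated values,
# (3) international human rights standards.
# All names and checks here are illustrative, not the board's real process.

@dataclass
class ContentCase:
    description: str
    violates_written_rules: bool    # (1) does it break Meta's community standards?
    consistent_with_values: bool    # (2) is it in line with Meta's stated values?
    removal_burdens_rights: bool    # (3) would removal unduly limit expression?

def review(case: ContentCase) -> str:
    """Walk the three criteria and return a toy recommendation."""
    reasons_to_restore = []
    if not case.violates_written_rules:
        reasons_to_restore.append("compatible with the written rules")
    if case.consistent_with_values:
        reasons_to_restore.append("aligned with the company's stated values")
    if case.removal_burdens_rights:
        reasons_to_restore.append("removal would burden free expression")
    # If any criterion cuts in favor of the speech, lean toward restoring it.
    return "restore" if reasons_to_restore else "uphold removal"

# Example: the French COVID policy-debate case, as characterized above.
covid_case = ContentCase(
    description="debate about public health policy in France",
    violates_written_rules=False,
    consistent_with_values=True,
    removal_burdens_rights=True,
)
print(review(covid_case))  # -> restore
```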
Jonathan: It boggles my mind how complicated this is. See, I love TechStuff. I love doing the technology side, because, as I've said many times, the beauty of technology is that either it works or it doesn't. Either it's wired properly or it's not. Whereas when we start getting into the application of technology in the real world, gosh, people just make it so much more complicated.

Dex: Yes, really. I used to be at SpaceX, and my favorite thing about going there from where I was previously, which was working at Facebook, was that ultimately the rocket launches or it doesn't. There's no spinning it. Either it puts a satellite up there or it doesn't, and everyone knows when it doesn't. But this isn't like that, and that's what makes it so challenging, and sometimes unsatisfying, for a lot of people. There are relatively few opportunities for the board to say, ah, we have absolutely nailed it and everyone else is wrong about this, there is the rocket launching. Sometimes there's a lot of debate about whether that's a rocket or not.

Jonathan: Dex and I will have more to say about the Facebook Oversight Board after these messages.

When you start talking about things like digital rights, there have been plenty of cases, even well before the Facebook Oversight Board was formed, some really thorny cases, where the question was who is ultimately in the wrong, or more in the wrong, when it became a question of freedom of speech and digital rights versus someone's right to privacy. These are very complicated matters, and I certainly wouldn't want to be the person who has to adjudicate which is more important. Let's backtrack a little bit, though, and talk about the actual formation of the board itself. Can you talk about how that came to be?

Dex: Yeah, absolutely. The board was created by Facebook back when it was still called Facebook, and the first vision for the board was announced by Mark Zuckerberg back in twenty nineteen. He wrote a long note, which he posted on his Facebook page, outlining the vision for an independent body that could make decisions about some of the most consequential content issues that the company was facing.
And the elevator pitch for why you would want to create this sort of body was very simple: the world is of enormous and growing complexity, and there are fewer and fewer decisions that the company can just go and make successfully on its own. So having an independent oversight body filled with leaders and experts from a lot of different backgrounds was a way of expanding the diversity of perspectives the company had available to it, and at the same time building a structure that gave people trust in the rules, and in the way those rules were enforced, on these platforms that have become such a big part of our lives. The design of the board was incredibly complex, something that took quite a long time, because I think the company, to its credit, wanted to get this right. They went out and consulted an enormous number of different stakeholders around the world, more than two thousand people in many, many countries, and they spent about a year traveling around having these consultations with civil society, with experts, with lawyers, with policy makers, until they came up with a structure which I think is really strong, and which is the one we now use.

So they built this institution with three components. There's the board, which is what everyone immediately thinks of when they think of the Oversight Board. Right now we've got twenty members, who come from a lot of different backgrounds and perspectives; ultimately the board might grow to a maximum of forty people. We then have a trust, and the trust has a set of trustees who operate, again, independently from Meta. Their job is to oversee the fiduciary duties that are needed to run an organization like this, and the real value is to protect the independence of the board. So Meta does fund the board. They put money into that trust.
It's an irrevocable trust that's overseen by those trustees, and the trustees are there basically to ensure that all the messy conversations about resourcing and so on take place at arm's length from where the decisions are happening. The trustees, of course, don't have any role in the decision making or the substantive policy work of the board. Then you have the third piece of the board, which is where I sit: the full-time team of staff. Behind those board members you have a full-time team of other experts and people who help keep the institution running and who help steer cases through their life cycle at the board.

The board was announced in May of twenty twenty; that's when the first twenty members were announced. We started accepting cases towards the end of that year, and the first decisions came out in January of twenty twenty-one. So we're just a little over a year into the life of the board now, and we've delivered over twenty case decisions and over a hundred detailed policy recommendations to the company. Some of these cases have been very significant; others have been very significant but have received less attention. But we're starting to see the impact of the board on the company, starting to see tangible outcomes in terms of how those rules are structured at Meta and how they're being enforced. And we've got years left to run for the board. So I think it's an extraordinarily exciting experiment which Meta committed to, but one that is becoming more real by the day, and less like an experiment and more like an enduring institution that I think will be around for some time.

Jonathan: I like that word, experiment, because I feel like that's how a lot of people viewed the board when it was first announced.
It was really a new kind of concept, this idea of an outside entity, independent of the platform, making judgments about whether or not the platform was actually enforcing its own rules, whether those rules were legal in some cases, I'm sure, and, again, the context of each case. I think a lot of people were kind of confused by it. It was surprising; it was not something they would have thought of. But then you also have to consider the fact that Meta has such a global reach. It's very easy, I think, especially for people in my country, the United States, to be so US-centric that we forget the reach and how deep the penetration is in other parts of the world. And so we don't even start to consider the fact that we need a different set of rules to guide those organizations, because we're used to everything being US. So that, I think, was part of the confusion.

Dex: I think you raise a really important point, Jonathan. So much of the discourse, you're absolutely right, is driven by folks in the United States, and if not in the United States, in Western Europe. And we are the parts of the world which are super connected. Our experience of connectivity is absolutely phenomenal compared to the experience that a lot of people in other communities have, and I think people often take connectivity for granted in those markets. We've just seen, in the last few days and weeks, what it means to be connected in places like Ukraine or Russia, where people are struggling to have their voices heard.
And the extraordinary impact that social media can have in allowing people to mobilize, and communities to organize and to fight for things like freedom, is a struggle that a lot of people in the West had, until fairly recently, sort of taken for granted. So I do think the fact that the Board operates and thinks very globally is a huge asset. Over half of our cases so far have been from the Global South, and the Board has been very deliberate about going after issues in places like India, Myanmar, and Latin America, because we think those areas have been neglected for too long when it comes to looking at content moderation and the discourse about protecting speech online.

Jonathan: Well, I think another thing that really nails home how important this is for me: if we set aside things like ethics or morality and look at things from a purely business standpoint, from Meta's point of view, clearly the company gets the majority of its revenue from places like the United States and Western Europe, and less so from other places where you have this limited connectivity. From that business perspective, again not taking humanity into account, you start to understand: oh, this is where they're going to dedicate their resources, because this is where the revenue comes from. That's where an independent board really becomes important, because it doesn't have that connection to "this is where the revenue comes from, so this is where we need to focus." It's divorced from that, so it can say: no, let's really look at these areas where traditionally the company itself may not have dedicated a lot of its focus, and make certain that the decisions being made are really the right ones and are not neglecting a population simply because it's not driving ad revenue the way other parts of the world are.
Dex: Yeah, absolutely. The board doesn't look at things like Meta's public relations strategy, or the impact of decisions on advertisers, or Meta's ad revenues. That is absolutely not a rubric that we're interested in. We're focused on advancing free speech and human rights, and very much on ensuring that the company treats its users better. And I think the way the board was constructed, bringing together leaders who have the kinds of backgrounds and perspectives and expertise that allow us to focus on that, who aren't being sidetracked into considering those other issues, is a huge strength for the board. It's an extraordinarily principled set of leaders, and they really want to focus on doing good for the users of Meta's products, but also for wider society. It's not just about the users of Facebook and Instagram, because we know that those platforms, and the billions of people who use them, also have a deep impact on broader society and the world we all live in.

Jonathan: Well then, let's talk a little bit about the process. Can you talk a bit about how the board decides which cases to consider? I assume it receives far more than it actually goes through.

Dex: Yeah. So since we started accepting cases towards the end of twenty twenty, we've now received over a million appeals from users of Facebook and Instagram. That's the primary route by which appeals come to the board. Meta can also refer questions to the board, but we've naturally wanted to focus as much as possible on hearing the appeals coming up from the users of those platforms, so we've taken a lot more cases from users. The way we sift through that avalanche of incoming appeals is based partly on how we categorize the appeals as they come in.
So we have systems and processes that are designed to sift us down to a smaller number, which is still quite large, which people then go through in a lot of detail. The big criteria we use as those cases come in are these. First, is this a case that has the potential to impact a lot of users of Meta's platforms? With the limited resources of the board, we obviously want to take on the cases which have the biggest impact for people, so we're looking at things that have the potential to affect thousands or millions of people around the world. Second, we're looking for things that raise important questions about Meta's policies. A single case, maybe something about COVID misinformation in France, or something about hate speech in India, might raise big questions more generally about the policies and the rules and how they're enforced on Meta's platforms. The third is: is this something that's going to have a big impact on freedom of expression, and does it raise big questions about public discourse online? So with each of those cases, really, we're looking for emblematic cases, things that represent something much larger. We aren't designed, and we've never conceived of ourselves, as a sort of general customer-service infrastructure. With twenty board members and about fifty staff, there's no way we could possibly hear every single one of those million-plus appeals; it just wouldn't be sustainable. But what we can do is pick up a set of cases every year which have a much wider area of impact on the company, and that will ultimately be felt in potentially millions of other situations that are playing out every day on Facebook and Instagram.
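As a rough illustration of that triage, here is a minimal sketch in Python of filtering appeals against the three selection criteria just described. The field names, scoring, and threshold are assumptions made for illustration, not the board's actual system.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical triage pass over incoming appeals, modeled on the three
# criteria described above: breadth of impact, important policy questions,
# and significance for freedom of expression. All names and weights are
# illustrative assumptions, not the board's real pipeline.

@dataclass
class Appeal:
    case_id: str
    estimated_users_affected: int   # criterion 1: breadth of impact
    raises_policy_question: bool    # criterion 2: implicates Meta's policies
    expression_significance: int    # criterion 3: 0-10 free-expression stakes

def shortlist(appeals: List[Appeal], max_cases: int) -> List[Appeal]:
    """Score each appeal on the three criteria and keep the emblematic few."""
    def score(a: Appeal) -> float:
        return (
            min(a.estimated_users_affected / 1_000_000, 1.0)  # cap breadth at 1M
            + (1.0 if a.raises_policy_question else 0.0)
            + a.expression_significance / 10
        )
    return sorted(appeals, key=score, reverse=True)[:max_cases]

# Example: two appeals competing for one slot on the docket.
docket = shortlist(
    [
        Appeal("fr-covid-001", estimated_users_affected=2_000_000,
               raises_policy_question=True, expression_significance=9),
        Appeal("spam-report-042", estimated_users_affected=120,
               raises_policy_question=False, expression_significance=1),
    ],
    max_cases=1,
)
print([a.case_id for a in docket])  # -> ['fr-covid-001']
```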
Jonathan: So it's almost like setting precedent, where you can say: all right, we take this one case, which is very specific, but we can generalize the decision, whether we uphold or overturn whatever Facebook's action was, and that in turn sets the precedent that similar cases that follow should go along the same general pathway. I see the real value of that, especially, again, for a company as large as Meta. I mean, everyone knows it's impossible for that company to monitor everything that's posted; that's just not practical at all. But being able to set these rules, and to say, yes, this is a valid application of your policy, sends a message to the company so that it can continue doing that in the knowledge that it's doing the right thing according to the rules that were set up. Or, as has happened more often than not in the cases that have been heard, we've actually seen an overturning of Facebook's decision, where it's really sending the message of: you don't have this right on this one; you need to re-examine.

Dex: That's right. Well more than half of the cases we've taken on, we've ended up overturning the company. And on the recommendations, going back to those, well more than half of the recommendations we've given to the company, they've agreed to, and they've committed to doing them or they've already implemented them. I think that is a powerful sign that the board is working, that we are starting to have an impact on those bigger issues. When we talk about cases, people often gravitate naturally to the part where we overturned the company or we upheld them, the binding aspect of the decision itself on a specific piece of content.
I think a lot of the really interesting, impactful work of the board comes in terms of our recommendations. There are more than a hundred detailed pieces of guidance that we've given to the company, because that's where you have the chance to shape those broader standards and how they're enforced, and to deliver a lot of very detailed, practical guidance to the company. Of course, a recommendation doesn't have to be followed, but the company does have to commit to studying it for real and to communicate transparently what they're going to do with the recommendations we've given them, and the company has been pretty good about doing that up till now.

Jonathan: Well, and again, to think back on the way tech companies grow: from my perspective, generally I see probably a seventy-thirty split, with engineers really pushing an idea, which ends up blossoming and perhaps even reaching the beloved unicorn status, and then marketing really pushing the hype. But that whole process is all about growth, growth, growth, and you eventually hit a point where you have grown larger than what you are easily able to manage, whether it's because you've expanded into other markets, like going into Europe with the GDPR considerations you have to deal with. These are all things that I think a lot of people just don't take into account early on. And so I definitely see that value coming in again, because reaching out and creating this organization means you have representatives from all over the world. I mean, I know a lot of people have harped on the fact that the United States has the most representatives on the board as it currently stands, but the board is made up of people from all over, right?

Dex: That's right, absolutely.
I think about one quarter of our board currently comes from the United States. I think that partly reflects global inequities, the fact that you just have a disproportionate number of experts, and of institutions that generate expertise, located in the West. But every member of the board has extraordinary skills and expertise, and certainly in the next round of members we're looking to continue expanding the diversity of the board.

Jonathan: Well, and that's nice too, because in tech companies in particular, a lack of diversity is an issue we have seen come up again and again. So making diversity a priority is really refreshing: being able to break free of that very narrow view of the world that some companies can develop, due just to a lack of perspectives. It's not consciously ignoring things; it's a function of the actual individual backgrounds. I also wanted to talk about the process of considering a case. How does the board do that? As I understand it, there's a focus group, essentially, that looks at a case with great scrutiny.

Dex: That's right. So when a case comes into the board, we convene a panel, and it's five members. The membership of these panels rotates regularly, and you have a cross-section of expertise and different backgrounds represented within those panels. You always have at least one member of the panel who comes from the region that the content comes from or implicates. That panel takes the initial review of the case. They spend a decent amount of time studying these things in depth and trying to reach a provisional decision.
The decision that they come up with on a case then goes before the entire board. That's another check within the board, to ensure that we are really studying these things from a three-sixty-degree perspective. Every decision is then voted on by the entire board, and decisions have to receive a majority of support from the board; otherwise they can be sent back, and we can convene a new panel to look at them. But I think another very important part of this process, which I'll call out, is the public comment process. Unusually, I think, for an entity that's been empowered to take on this role, we also want to ensure that we aren't just limiting ourselves to our own expertise. We do, obviously, provide independent oversight of the company, but there are many, many more points of view in the world that we thought it was important to reflect in the decision-making process. So with every case and every policy that we're working to review, we go out to civil society, to academics, to regular users of these platforms, and say: if you have a point of view, share it with us. We call this the public comment process, and we get really valuable comments submitted by people all over the world. For cases from the Middle East, we've received really valuable input from grassroots organizations from countries all over the region. When we had the big Trump case, looking at whether President Trump's access to Facebook and Instagram should be restored, we had over nine thousand comments delivered from around the world. We had everyone from members of the public to people in the U.S. Congress submitting very detailed guidance on what they thought the implications were for free speech and human rights. So I think the board has a process that's designed to reflect our own expertise and bring in that diversity internally, but we also think about diversity in terms of the external world.
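As a sketch of the case life cycle just described, a rotating five-member panel with at least one member from the affected region, a provisional decision, and then a majority vote of the full board, here is a minimal Python model. The structure and names are illustrative assumptions, not the board's actual procedure or software.

```python
import random
from dataclasses import dataclass
from typing import List

# Hypothetical model of the deliberation flow described above: a rotating
# five-member panel (with at least one member from the affected region)
# drafts a provisional decision, which then needs majority support from
# the full board. Names and structure are illustrative assumptions only.

@dataclass
class Member:
    name: str
    region: str

def convene_panel(board: List[Member], case_region: str, size: int = 5) -> List[Member]:
    """Rotate membership randomly while guaranteeing regional representation."""
    regional = [m for m in board if m.region == case_region]
    anchor = random.choice(regional)              # at least one regional member
    others = [m for m in board if m is not anchor]
    return [anchor] + random.sample(others, size - 1)

def full_board_approves(board_size: int, votes_for: int) -> bool:
    """A provisional decision stands only with a majority of the whole board."""
    return votes_for > board_size // 2

# Example with a toy twenty-member board and a case from Latin America.
regions = ["North America", "Europe", "Latin America", "Africa", "Asia"]
board = [Member(f"member-{i}", regions[i % len(regions)]) for i in range(20)]
panel = convene_panel(board, "Latin America")
print(sorted({m.region for m in panel}))          # always includes Latin America
print(full_board_approves(board_size=20, votes_for=12))  # -> True
```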
Jonathan: We've got a bit more to talk about with the Facebook Oversight Board after the following break.

One of the big problems Meta had was that it failed to have a policy in place that would allow for an indefinite ban. What Meta really needed to do was either set a firm limit on what the ban was, or just go ahead and call it a lifelong ban; it could not exist in this sort of nebulous, vague banning condition. And I remember when that came out, I saw a lot of knee-jerk reactions, in various ways. But to me, it was really nailing home for the first time, in my experience, what the Oversight Board was doing in terms of Meta's policies. It's saying: there are cases that are falling outside of your rules, and you have to craft the rules to cover these cases, because otherwise there's no way to say whether a decision is fair or not.

Dex: That's exactly right. There was a very important principle at stake here, which is that the rules should apply to everyone, and "everyone" also includes the company. The rules exist to govern the way users get to use those platforms, but they also govern the way Meta should be serving its users. So the fact that the company didn't have clear, transparent, defined standards on how to navigate an issue like this was a huge gap in the systems that are designed to serve their users.
Ultimately, it pushed 576 00:32:35,400 --> 00:32:37,400 Speaker 1: the company to go and rewrite those rules and to 577 00:32:37,480 --> 00:32:40,400 Speaker 1: create new processes and standards, which I 578 00:32:40,440 --> 00:32:43,960 Speaker 1: think will serve them much better in future situations where 579 00:32:43,960 --> 00:32:47,240 Speaker 1: they have to navigate these very, very thorny issues. Ultimately, 580 00:32:47,240 --> 00:32:50,160 Speaker 1: if you don't write down the rules where everyone can 581 00:32:50,200 --> 00:32:52,760 Speaker 1: see them, and see them transparently for what 582 00:32:52,800 --> 00:32:55,040 Speaker 1: they are, it makes it much more likely that a 583 00:32:55,120 --> 00:32:57,720 Speaker 1: big company like Meta is going to be able to 584 00:32:57,720 --> 00:33:00,240 Speaker 1: get away with not treating their users fairly. I do 585 00:33:00,280 --> 00:33:02,880 Speaker 1: think it was an important point to really defend. 586 00:33:02,920 --> 00:33:06,160 Speaker 1: The board also said that the suspension 587 00:33:06,200 --> 00:33:08,840 Speaker 1: was correct, so it was the correct 588 00:33:08,880 --> 00:33:11,600 Speaker 1: move to go and quickly suspend his 589 00:33:11,680 --> 00:33:15,239 Speaker 1: access to Facebook and Instagram. But despite that, there 590 00:33:15,320 --> 00:33:17,520 Speaker 1: were still bigger principles at stake, 591 00:33:17,560 --> 00:33:19,920 Speaker 1: and the company didn't think through the long-term consequences 592 00:33:19,960 --> 00:33:22,920 Speaker 1: of how they did it. Right. Yeah, it makes me 593 00:33:22,960 --> 00:33:25,160 Speaker 1: think, and you may not get this reference, actually, a 594 00:33:25,200 --> 00:33:27,240 Speaker 1: lot of my listeners might not get this reference, but 595 00:33:27,280 --> 00:33:29,840 Speaker 1: it makes me think of Calvinball, which was 596 00:33:29,880 --> 00:33:32,800 Speaker 1: a thing in the comic strip Calvin and Hobbes, and 597 00:33:32,840 --> 00:33:35,120 Speaker 1: it was a game that this little boy named Calvin 598 00:33:35,160 --> 00:33:38,080 Speaker 1: would play where he literally would make up rules as 599 00:33:38,120 --> 00:33:40,440 Speaker 1: the game was being played, and there was no way 600 00:33:40,440 --> 00:33:43,720 Speaker 1: to know how to win the game because he was 601 00:33:43,760 --> 00:33:46,480 Speaker 1: the one making the rules in real time as the 602 00:33:46,480 --> 00:33:48,800 Speaker 1: game's being played out. And I have a feeling that 603 00:33:48,800 --> 00:33:52,560 Speaker 1: that was kind of how we were seeing Meta with 604 00:33:52,640 --> 00:33:56,000 Speaker 1: some of these policies too, and that there was obviously 605 00:33:56,040 --> 00:33:58,959 Speaker 1: this huge external pressure on the company to take action. 606 00:33:59,480 --> 00:34:03,000 Speaker 1: And the company obviously was already in a 607 00:34:03,080 --> 00:34:06,960 Speaker 1: space where this external pressure had been building for 608 00:34:07,080 --> 00:34:09,520 Speaker 1: quite some time, particularly here in the United States. I mean, 609 00:34:09,600 --> 00:34:12,560 Speaker 1: Zuckerberg had to appear before the Senate a couple of 610 00:34:12,560 --> 00:34:16,400 Speaker 1: times leading up to this, and so there obviously was 611 00:34:16,440 --> 00:34:20,239 Speaker 1: this call to action.
But again, until you have those 612 00:34:20,280 --> 00:34:27,000 Speaker 1: transparent rules that everyone understands and that apply to everyone, taking 613 00:34:27,040 --> 00:34:30,239 Speaker 1: action on its own isn't enough. It's something that 614 00:34:30,280 --> 00:34:34,360 Speaker 1: could easily be reversed, because if things swing a 615 00:34:34,360 --> 00:34:38,839 Speaker 1: different way, then the company can stand accused of unfairly 616 00:34:38,880 --> 00:34:42,440 Speaker 1: applying rules to certain people. In fact, we've seen that 617 00:34:42,560 --> 00:34:46,440 Speaker 1: discourse rise as well. Absolutely. I'm going to use that 618 00:34:46,520 --> 00:34:50,000 Speaker 1: Calvinball analogy now. I mean, that's fantastic. We're 619 00:34:50,080 --> 00:34:53,960 Speaker 1: basically designed to be the anti-Calvinball mechanism. 620 00:34:54,040 --> 00:34:55,840 Speaker 1: We do not think Calvinball is a good 621 00:34:55,840 --> 00:34:58,640 Speaker 1: game to play when it comes to speech for 622 00:34:58,680 --> 00:35:02,359 Speaker 1: billions of people online. I do confront this narrative all 623 00:35:02,400 --> 00:35:06,440 Speaker 1: the time, which is: is the board a distraction somehow 624 00:35:06,840 --> 00:35:10,600 Speaker 1: from regulation? Like, is this something that is getting in 625 00:35:10,640 --> 00:35:13,359 Speaker 1: the way of the bigger fix that we need as 626 00:35:13,360 --> 00:35:17,200 Speaker 1: a society? And I just think that's so 627 00:35:17,280 --> 00:35:20,320 Speaker 1: misplaced as a concern, and simply the evidence doesn't 628 00:35:20,320 --> 00:35:22,400 Speaker 1: back it up. Policymakers 629 00:35:22,400 --> 00:35:24,919 Speaker 1: have been looking at these issues for years, and they've 630 00:35:24,920 --> 00:35:28,480 Speaker 1: been moving at an incredibly glacial pace. And part of 631 00:35:28,480 --> 00:35:29,840 Speaker 1: the reason they have been moving at that pace, I 632 00:35:29,880 --> 00:35:32,280 Speaker 1: don't think it's necessarily due to any ill intent. 633 00:35:32,480 --> 00:35:36,239 Speaker 1: It's because these are massive, deeply complex issues, and it 634 00:35:36,280 --> 00:35:39,319 Speaker 1: takes years to align the interests, and then 635 00:35:39,360 --> 00:35:42,080 Speaker 1: to codify legislation and to 636 00:35:42,120 --> 00:35:45,680 Speaker 1: build a coalition to support that legislation. The board is 637 00:35:45,760 --> 00:35:48,480 Speaker 1: designed to be something that solves one small part of 638 00:35:48,520 --> 00:35:51,960 Speaker 1: this overall challenge with social media and to move faster. 639 00:35:52,480 --> 00:35:54,200 Speaker 1: So I mean, we've gone ahead and we've built this 640 00:35:54,239 --> 00:35:58,040 Speaker 1: institution which is starting to serve the users of Facebook 641 00:35:58,040 --> 00:36:00,840 Speaker 1: and Instagram and the communities who are 642 00:36:00,880 --> 00:36:03,759 Speaker 1: impacted by those platforms. Nothing we're doing is getting in 643 00:36:03,800 --> 00:36:06,960 Speaker 1: the way of policymakers also getting stuck in and 644 00:36:07,040 --> 00:36:09,799 Speaker 1: having a broader sweep at the 645 00:36:10,080 --> 00:36:13,640 Speaker 1: systemic and structural issues that also impact social media.
646 00:36:13,920 --> 00:36:16,080 Speaker 1: So we really don't see ourselves in any way as 647 00:36:16,080 --> 00:36:18,480 Speaker 1: a substitute for all the effort of policy 648 00:36:18,480 --> 00:36:20,080 Speaker 1: makers and all the work that needs to be done. 649 00:36:20,360 --> 00:36:23,040 Speaker 1: We're a complement. We're a very small complement, and 650 00:36:23,440 --> 00:36:25,200 Speaker 1: we're looking forward to seeing what people come 651 00:36:25,280 --> 00:36:27,440 Speaker 1: up with in terms of other proposals to 652 00:36:27,560 --> 00:36:30,840 Speaker 1: improve social media and deliver a healthier social media environment. 653 00:36:30,920 --> 00:36:33,840 Speaker 1: And yet even though the Board is this small slice, 654 00:36:34,800 --> 00:36:37,880 Speaker 1: because of its global nature, it is, I would argue, 655 00:36:38,840 --> 00:36:43,480 Speaker 1: better positioned to tackle its particular mission than 656 00:36:43,560 --> 00:36:48,200 Speaker 1: any regional government would be, simply because again you're trying 657 00:36:48,200 --> 00:36:52,560 Speaker 1: to legislate something that has a global reach, but you 658 00:36:52,600 --> 00:36:56,880 Speaker 1: don't have global jurisdiction. So you've got these various nations 659 00:36:56,880 --> 00:37:01,800 Speaker 1: around the world all grappling with similar issues. But unless 660 00:37:01,800 --> 00:37:06,600 Speaker 1: there's some sort of broad agreement across the world as 661 00:37:06,640 --> 00:37:10,040 Speaker 1: to how to go about doing this, it's 662 00:37:10,080 --> 00:37:12,319 Speaker 1: just gonna be very messy for a long time. So 663 00:37:12,320 --> 00:37:14,279 Speaker 1: that's another reason I think it does take a long 664 00:37:14,320 --> 00:37:17,960 Speaker 1: time to see things on the regulatory side take shape, 665 00:37:18,440 --> 00:37:20,440 Speaker 1: not to mention in places like here in the 666 00:37:20,520 --> 00:37:24,680 Speaker 1: United States, whenever there's a change in administration, there's a 667 00:37:24,680 --> 00:37:29,640 Speaker 1: massive change in the position on regulations. So you can 668 00:37:29,680 --> 00:37:35,480 Speaker 1: have a regulatory body that completely changes shape from 669 00:37:35,480 --> 00:37:39,440 Speaker 1: one administration to another, and then any progress you 670 00:37:39,480 --> 00:37:42,279 Speaker 1: were making on any particular issue might get reset. So 671 00:37:43,320 --> 00:37:46,200 Speaker 1: I think that having something that's at least addressing part 672 00:37:46,200 --> 00:37:48,000 Speaker 1: of it, and a very important part, even if it 673 00:37:48,080 --> 00:37:53,960 Speaker 1: is a small one, is fantastic. At least it's encouraging, 674 00:37:54,360 --> 00:37:57,680 Speaker 1: because I think otherwise the narrative tends to be that 675 00:37:57,760 --> 00:38:01,640 Speaker 1: we're kind of stuck in a quagmire waiting for a solution to 676 00:38:02,400 --> 00:38:05,359 Speaker 1: something that I think everyone recognizes is a problem. Although 677 00:38:05,400 --> 00:38:09,319 Speaker 1: they may disagree on what that problem specifically is, they 678 00:38:09,320 --> 00:38:12,799 Speaker 1: do recognize that there is a problem. Yeah, I mean, 679 00:38:12,800 --> 00:38:15,440 Speaker 1: it's interesting, Jonathan.
This is 680 00:38:15,440 --> 00:38:19,359 Speaker 1: where there's a real consequence of social media and all 681 00:38:19,360 --> 00:38:22,160 Speaker 1: the years we've lived in the digital era in how 682 00:38:22,200 --> 00:38:24,960 Speaker 1: it impacts our thinking, right? All of our attention 683 00:38:25,000 --> 00:38:28,280 Speaker 1: spans have become way shorter. I learned something interesting 684 00:38:28,320 --> 00:38:30,640 Speaker 1: the other day. The average attention span for a goldfish 685 00:38:30,719 --> 00:38:33,799 Speaker 1: is fifteen seconds, and the average attention span for a 686 00:38:33,880 --> 00:38:38,479 Speaker 1: person today is about nine seconds. So we 687 00:38:38,480 --> 00:38:41,120 Speaker 1: literally have these incredibly short attention spans. 688 00:38:41,160 --> 00:38:44,160 Speaker 1: And because of that, everyone's looking for really simple solutions 689 00:38:44,160 --> 00:38:46,759 Speaker 1: to complex problems. And I think of this as the 690 00:38:46,800 --> 00:38:49,920 Speaker 1: sort of BuzzFeed approach to solving problems. We all 691 00:38:49,960 --> 00:38:53,080 Speaker 1: want this one weird trick to solve content moderation, and 692 00:38:53,080 --> 00:38:55,520 Speaker 1: you're hoping there's that one weird trick that will 693 00:38:55,560 --> 00:38:58,520 Speaker 1: do everything for you. And I think the experience 694 00:38:58,520 --> 00:39:00,520 Speaker 1: of the last decade has very clearly shown us 695 00:39:00,560 --> 00:39:03,359 Speaker 1: that's not the case at all. There is no 696 00:39:03,400 --> 00:39:06,480 Speaker 1: special piece of regulation. There is no 697 00:39:06,520 --> 00:39:09,880 Speaker 1: incredibly well-crafted institution, including the oversight board, 698 00:39:10,000 --> 00:39:12,640 Speaker 1: that's going to solve all your problems in their totality, 699 00:39:12,680 --> 00:39:15,040 Speaker 1: and anyone who's claiming that, either from government 700 00:39:15,080 --> 00:39:17,839 Speaker 1: or from the tech industry, is just lying. We 701 00:39:17,880 --> 00:39:20,719 Speaker 1: can all solve bits of that problem, and together we add 702 00:39:20,760 --> 00:39:23,000 Speaker 1: up to something that's very, very meaningful, and we'll protect 703 00:39:23,280 --> 00:39:26,240 Speaker 1: users and communities. But we should all be realistic 704 00:39:26,239 --> 00:39:28,040 Speaker 1: about what we can do and recognize the scale of 705 00:39:28,040 --> 00:39:31,680 Speaker 1: the challenge. Very well said. I could not agree more. 706 00:39:33,040 --> 00:39:37,040 Speaker 1: I think it's incredibly beneficial for listeners out there: if 707 00:39:37,120 --> 00:39:39,680 Speaker 1: you really want to get an understanding of 708 00:39:40,040 --> 00:39:44,520 Speaker 1: how complex this is, simply go and review one 709 00:39:44,520 --> 00:39:49,400 Speaker 1: of the cases that the board tackled and really read 710 00:39:49,520 --> 00:39:54,680 Speaker 1: up on all the different factors about it. Because 711 00:39:54,719 --> 00:39:58,799 Speaker 1: you start to really get past that knee-jerk reaction. Right?
712 00:39:58,880 --> 00:40:02,640 Speaker 1: You might initially have a feeling that this should 713 00:40:02,680 --> 00:40:04,680 Speaker 1: go this way or that way. But 714 00:40:04,800 --> 00:40:07,080 Speaker 1: as you really dive into it and you start to 715 00:40:07,160 --> 00:40:11,200 Speaker 1: pull back and look at the larger implications, then you 716 00:40:11,280 --> 00:40:15,560 Speaker 1: might start to understand there isn't a simple on-off switch. 717 00:40:15,600 --> 00:40:19,000 Speaker 1: The world is not a binary system. We cannot treat 718 00:40:19,040 --> 00:40:21,759 Speaker 1: it that way. We have to take 719 00:40:21,760 --> 00:40:25,799 Speaker 1: into consideration all the complexities, including where there may be 720 00:40:25,920 --> 00:40:29,920 Speaker 1: gaps in policy. And that's why there's this issue, 721 00:40:30,040 --> 00:40:34,000 Speaker 1: because we can't definitively say this is against the rules 722 00:40:34,080 --> 00:40:36,279 Speaker 1: if there are no rules that govern it. So I 723 00:40:36,320 --> 00:40:39,879 Speaker 1: think that if people do take that time, which 724 00:40:40,080 --> 00:40:42,279 Speaker 1: obviously with the nine-second attention span is going to 725 00:40:42,280 --> 00:40:45,479 Speaker 1: be challenging, but if they do take that time, they're 726 00:40:45,480 --> 00:40:49,719 Speaker 1: going to gain that greater appreciation. Because, I mean, I'm 727 00:40:49,800 --> 00:40:53,200 Speaker 1: guilty of this too. Like, I'll see a story pop 728 00:40:53,280 --> 00:40:56,640 Speaker 1: up and very quickly make a judgment, and it's only 729 00:40:56,800 --> 00:41:01,279 Speaker 1: by resisting that urge and engaging in critical thinking and 730 00:41:01,320 --> 00:41:04,239 Speaker 1: taking those further steps that I can get past that. 731 00:41:04,320 --> 00:41:06,480 Speaker 1: And I don't do it all the time. I say 732 00:41:06,520 --> 00:41:09,600 Speaker 1: all the time on the show, the two things I 733 00:41:09,640 --> 00:41:12,759 Speaker 1: advocate for the strongest are compassion and critical thinking. I 734 00:41:12,760 --> 00:41:15,480 Speaker 1: think the two of them together are absolutely necessary if 735 00:41:15,520 --> 00:41:20,560 Speaker 1: you want a better world. But even 736 00:41:20,640 --> 00:41:23,520 Speaker 1: though I advocate strongly for it, I also admit fully 737 00:41:23,560 --> 00:41:28,280 Speaker 1: that I am not a perfect 738 00:41:28,560 --> 00:41:32,320 Speaker 1: steward of that approach. Sometimes I fall victim to it too. 739 00:41:32,520 --> 00:41:34,959 Speaker 1: And I really think that by diving in a little 740 00:41:35,000 --> 00:41:37,799 Speaker 1: further and reading up on these things, people can get 741 00:41:37,840 --> 00:41:41,920 Speaker 1: a greater appreciation for the complexities that are involved. 742 00:41:42,200 --> 00:41:45,920 Speaker 1: I certainly do not envy anyone in the board's position. 743 00:41:46,400 --> 00:41:52,239 Speaker 1: I am fascinated by the process, but I cannot imagine 744 00:41:53,040 --> 00:41:56,000 Speaker 1: really dedicating the kind of time and attention to sometimes 745 00:41:56,160 --> 00:42:02,000 Speaker 1: incredibly emotionally charged situations and determining whether or not a 746 00:42:02,080 --> 00:42:05,279 Speaker 1: company's decision on that matter was in line with its 747 00:42:05,280 --> 00:42:08,800 Speaker 1: policy or not.
Absolutely. And this is the challenge, 748 00:42:08,880 --> 00:42:11,440 Speaker 1: right? All the board members are human beings. They're all 749 00:42:11,480 --> 00:42:13,920 Speaker 1: looking at the same headlines we're all looking at. 750 00:42:14,000 --> 00:42:18,759 Speaker 1: These are issues which can be extraordinarily emotive, and there 751 00:42:18,880 --> 00:42:22,279 Speaker 1: is always an urge today for people to 752 00:42:22,320 --> 00:42:24,880 Speaker 1: make those snap decisions about these things. And the board 753 00:42:24,920 --> 00:42:28,640 Speaker 1: has been designed, not just in the extraordinarily impressive people 754 00:42:28,680 --> 00:42:30,480 Speaker 1: who have been added to the board, but in the 755 00:42:30,520 --> 00:42:33,600 Speaker 1: mechanisms that go into making that institution. It's been designed 756 00:42:33,680 --> 00:42:36,120 Speaker 1: so that we don't make those snap decisions. 757 00:42:36,160 --> 00:42:40,160 Speaker 1: We provide that thoughtful, measured review of the 758 00:42:40,200 --> 00:42:43,480 Speaker 1: decisions that Meta has made. They have a responsibility to 759 00:42:43,520 --> 00:42:45,799 Speaker 1: act first and to act fast on a lot of 760 00:42:45,840 --> 00:42:47,680 Speaker 1: these issues. But then we're going to come in and 761 00:42:47,680 --> 00:42:50,120 Speaker 1: take a closer look and say, have you thought about this 762 00:42:50,320 --> 00:42:52,840 Speaker 1: implication for your users? There used to 763 00:42:52,840 --> 00:42:56,840 Speaker 1: be that slogan, which became very popularized and then maligned, 764 00:42:57,080 --> 00:43:00,320 Speaker 1: from Facebook: move fast and break things. And I always 765 00:43:00,320 --> 00:43:02,880 Speaker 1: tell my team, half jokingly, only half jokingly, that 766 00:43:02,960 --> 00:43:05,400 Speaker 1: our slogan, if there was one, is move thoughtfully and 767 00:43:05,400 --> 00:43:07,840 Speaker 1: defend human rights, because that's a much better way of 768 00:43:07,880 --> 00:43:11,239 Speaker 1: actually going about doing that. Well, and again it's one 769 00:43:11,280 --> 00:43:13,600 Speaker 1: of those things where you go from the technical aspect 770 00:43:13,640 --> 00:43:17,640 Speaker 1: of let's make something that's really cool to the application 771 00:43:17,719 --> 00:43:20,560 Speaker 1: aspect of how does this interact with the real world. 772 00:43:21,320 --> 00:43:23,800 Speaker 1: I'd like to wrap this up by sort of asking: 773 00:43:24,200 --> 00:43:26,960 Speaker 1: are there plans? Like, right now the 774 00:43:26,960 --> 00:43:32,040 Speaker 1: oversight board is focused primarily upon content moderation policies. 775 00:43:32,239 --> 00:43:35,440 Speaker 1: Are there plans for that to expand beyond Facebook's content 776 00:43:35,480 --> 00:43:37,560 Speaker 1: moderation, or is that just going to be the primary 777 00:43:37,560 --> 00:43:40,000 Speaker 1: focus of the board from here on out? Yeah, you 778 00:43:40,040 --> 00:43:43,200 Speaker 1: made a very important point, Jonathan, which was about how 779 00:43:43,239 --> 00:43:45,839 Speaker 1: people can lose focus as they scale, and 780 00:43:45,880 --> 00:43:48,640 Speaker 1: so we think of ourselves in many ways as an organization, 781 00:43:48,880 --> 00:43:51,520 Speaker 1: as a startup.
We're a very small group 782 00:43:51,719 --> 00:43:54,640 Speaker 1: taking on a very big mission. We've evolved 783 00:43:54,719 --> 00:43:58,080 Speaker 1: enormously over the last couple of years to navigate 784 00:43:58,280 --> 00:44:02,040 Speaker 1: the sort of organizational and strategic challenges we face as an organization. 785 00:44:02,280 --> 00:44:04,080 Speaker 1: We think the right thing to do is to stay focused. 786 00:44:04,160 --> 00:44:07,920 Speaker 1: Right now, Meta is a huge challenge. Just getting our 787 00:44:08,040 --> 00:44:11,040 Speaker 1: arms around the mission of the board today is something 788 00:44:11,080 --> 00:44:13,080 Speaker 1: that takes up an enormous amount of work, 789 00:44:13,200 --> 00:44:16,200 Speaker 1: and we want to make sure that we're delivering the 790 00:44:16,320 --> 00:44:19,200 Speaker 1: maximum impact in terms of that original mission before we 791 00:44:19,239 --> 00:44:21,640 Speaker 1: look to expand further. Having said that, we 792 00:44:21,680 --> 00:44:24,840 Speaker 1: absolutely recognize that this is a shared challenge across the industry. 793 00:44:25,040 --> 00:44:27,520 Speaker 1: All the problems that Meta is dealing with are problems 794 00:44:27,560 --> 00:44:30,239 Speaker 1: that manifest in different ways for other platforms. I mean, 795 00:44:30,239 --> 00:44:32,799 Speaker 1: look at Spotify or Netflix over the last few months 796 00:44:32,800 --> 00:44:35,759 Speaker 1: and all the various controversies and problems that 797 00:44:35,760 --> 00:44:38,799 Speaker 1: they've been experiencing. So for now, the way we 798 00:44:38,840 --> 00:44:41,680 Speaker 1: think about it is: focus on getting that core mission done, 799 00:44:42,239 --> 00:44:44,759 Speaker 1: share as many of our learnings as widely and 800 00:44:44,760 --> 00:44:46,719 Speaker 1: as transparently as possible in a way that can be 801 00:44:46,760 --> 00:44:48,920 Speaker 1: helpful to other companies, and down the road, 802 00:44:48,920 --> 00:44:50,920 Speaker 1: over the coming years, I think 803 00:44:50,960 --> 00:44:52,920 Speaker 1: we'll be looking to explore whether there's something else we 804 00:44:52,960 --> 00:44:56,160 Speaker 1: can deliver for companies. And as the Board 805 00:44:56,200 --> 00:44:59,680 Speaker 1: evolves itself and we look to expand the things that 806 00:44:59,719 --> 00:45:02,320 Speaker 1: we're looking at within Facebook and Instagram, there may be 807 00:45:02,400 --> 00:45:04,600 Speaker 1: other things that then come into focus for 808 00:45:04,640 --> 00:45:06,600 Speaker 1: the rest of the industry, and they'll say, hey, actually, 809 00:45:06,600 --> 00:45:08,600 Speaker 1: maybe the Oversight Board can be helpful for us as 810 00:45:08,600 --> 00:45:13,399 Speaker 1: we develop our plans. Fascinating. Dex, thank you so 811 00:45:13,520 --> 00:45:17,160 Speaker 1: much for joining the show and giving us more information 812 00:45:17,200 --> 00:45:20,160 Speaker 1: about the Oversight Board. I have a much greater appreciation 813 00:45:20,640 --> 00:45:23,880 Speaker 1: for what it does now than I did before we 814 00:45:23,920 --> 00:45:28,000 Speaker 1: even started chatting. It was really informative and educational. I 815 00:45:28,040 --> 00:45:31,840 Speaker 1: hope my listeners enjoyed it as well. Great to be here, Jonathan.
816 00:45:32,880 --> 00:45:35,560 Speaker 1: Thanks again to Dex for joining the show. It was 817 00:45:35,600 --> 00:45:38,680 Speaker 1: really interesting to hear about the board's mission directly from 818 00:45:38,719 --> 00:45:42,600 Speaker 1: someone who works with the organization, and it gave me 819 00:45:42,640 --> 00:45:46,280 Speaker 1: a greater appreciation for the scope of the board's job 820 00:45:46,640 --> 00:45:49,719 Speaker 1: as well as the potential impact of its decisions. And 821 00:45:49,760 --> 00:45:52,440 Speaker 1: it really does highlight the necessity to seek out a 822 00:45:52,480 --> 00:45:57,160 Speaker 1: diverse array of perspectives as companies scale up. In many ways, 823 00:45:57,239 --> 00:46:00,319 Speaker 1: I totally understand how Facebook could find itself in such 824 00:46:00,360 --> 00:46:03,680 Speaker 1: a complex situation that it required the creation of an 825 00:46:03,680 --> 00:46:08,040 Speaker 1: external entity. I mean, come on, let's really be 826 00:46:08,160 --> 00:46:12,120 Speaker 1: real here, cards on the table. Facebook evolved out of 827 00:46:12,160 --> 00:46:15,400 Speaker 1: a tool that was meant to allow male Harvard students 828 00:46:15,400 --> 00:46:19,640 Speaker 1: to rate the attractiveness of female Harvard students. That's what 829 00:46:19,640 --> 00:46:23,439 Speaker 1: the predecessor to Facebook was all about. So it wasn't 830 00:46:23,480 --> 00:46:28,520 Speaker 1: exactly aiming to become a nexus of global communications. And 831 00:46:28,600 --> 00:46:33,359 Speaker 1: the process of growth and scaling and expansion is one 832 00:46:33,400 --> 00:46:36,920 Speaker 1: that happened so quickly that it's not a surprise to 833 00:46:37,000 --> 00:46:40,200 Speaker 1: me that people at the company didn't necessarily realize they 834 00:46:40,280 --> 00:46:44,120 Speaker 1: needed a robust set of policies until problems began to 835 00:46:44,160 --> 00:46:47,800 Speaker 1: pop up. And in many ways, Facebook's path could serve 836 00:46:47,880 --> 00:46:51,319 Speaker 1: as a lesson to other platforms, either to create their 837 00:46:51,360 --> 00:46:55,640 Speaker 1: own independent oversight boards or to incorporate departments that are 838 00:46:55,719 --> 00:46:59,600 Speaker 1: dedicated to the formation and execution of policies, and to 839 00:47:00,000 --> 00:47:04,160 Speaker 1: really staff those with a diverse group of perspectives so 840 00:47:04,200 --> 00:47:07,239 Speaker 1: that you can best serve your users. Right? If your 841 00:47:07,320 --> 00:47:10,080 Speaker 1: users are all over the world, then you darn well 842 00:47:10,200 --> 00:47:13,759 Speaker 1: need to have that diversity of perspective in order to 843 00:47:14,160 --> 00:47:17,560 Speaker 1: serve them properly. And I'm sure there will be cases 844 00:47:18,120 --> 00:47:20,600 Speaker 1: where the Oversight Board will make a decision that I 845 00:47:20,600 --> 00:47:23,920 Speaker 1: will have trouble understanding. There may well be cases 846 00:47:24,080 --> 00:47:27,759 Speaker 1: where I have a fundamental disagreement with the board's conclusion. 847 00:47:28,080 --> 00:47:31,399 Speaker 1: But at the same time I have to account for 848 00:47:31,400 --> 00:47:35,480 Speaker 1: several facts.
Namely, while I feel strongly about human rights issues, 849 00:47:35,920 --> 00:47:39,160 Speaker 1: I am by no means an expert, right? I do 850 00:47:39,280 --> 00:47:42,279 Speaker 1: not spend the same amount of time and energy researching 851 00:47:42,320 --> 00:47:45,400 Speaker 1: these cases, nor do I have the background in human 852 00:47:45,480 --> 00:47:48,879 Speaker 1: rights and digital rights that the board members have. And 853 00:47:48,960 --> 00:47:52,000 Speaker 1: the conclusion that the Board comes to could be nested 854 00:47:52,040 --> 00:47:54,719 Speaker 1: in a much deeper problem, one where 855 00:47:54,760 --> 00:47:58,720 Speaker 1: Facebook itself lacks the framework to issue a clear decision 856 00:47:58,760 --> 00:48:01,560 Speaker 1: on the matter. And it might be that the Board 857 00:48:01,600 --> 00:48:04,920 Speaker 1: comes to its decision not because of specific matters with 858 00:48:04,960 --> 00:48:08,799 Speaker 1: the case, but because there are no actual rules that 859 00:48:09,280 --> 00:48:13,359 Speaker 1: govern what Facebook does. And therefore, you know, if there 860 00:48:13,360 --> 00:48:16,160 Speaker 1: are no rules that say Facebook can do this, that's 861 00:48:16,200 --> 00:48:20,000 Speaker 1: a problem. As Dex indicated, reality is a complicated and 862 00:48:20,120 --> 00:48:23,640 Speaker 1: messy matter. In the end, I am glad there's an 863 00:48:23,640 --> 00:48:27,440 Speaker 1: independent group holding Facebook accountable, and one that can compel 864 00:48:27,560 --> 00:48:33,279 Speaker 1: Facebook to reverse decisions that on close examination do not 865 00:48:33,480 --> 00:48:37,560 Speaker 1: appear to follow Facebook's stated rules and goals. I 866 00:48:37,640 --> 00:48:40,799 Speaker 1: do hope to see a broader application of those 867 00:48:40,800 --> 00:48:45,040 Speaker 1: principles across the web and the tech industry in general. 868 00:48:45,080 --> 00:48:50,000 Speaker 1: That kind of consistency is really important. And again, I 869 00:48:50,080 --> 00:48:53,920 Speaker 1: don't anticipate agreeing with every single one of those decisions, 870 00:48:53,920 --> 00:48:57,360 Speaker 1: but at least I can be confident that the decisions 871 00:48:57,360 --> 00:49:02,680 Speaker 1: were made by people who were taking incredible care and 872 00:49:02,800 --> 00:49:07,040 Speaker 1: consideration when judging the matter, and not just be 873 00:49:07,239 --> 00:49:10,840 Speaker 1: something where, you know, it's a moderator who's under intense 874 00:49:10,880 --> 00:49:14,600 Speaker 1: pressure to look through as many comments or posts 875 00:49:14,680 --> 00:49:17,560 Speaker 1: as they possibly can, and they're just hitting 876 00:49:18,239 --> 00:49:22,000 Speaker 1: delete, or leaving them alone, one after the other, 877 00:49:22,040 --> 00:49:25,920 Speaker 1: in order to get through an incredible backlog. Right? Like, 878 00:49:26,800 --> 00:49:29,080 Speaker 1: my heart really goes out to moderators too. We've heard 879 00:49:29,120 --> 00:49:35,040 Speaker 1: some terrible stories about the emotional impact that moderating can 880 00:49:35,080 --> 00:49:37,879 Speaker 1: have on folks who have to go through all these 881 00:49:37,920 --> 00:49:42,160 Speaker 1: different types of posts on Facebook that get reported. Anyway, 882 00:49:42,200 --> 00:49:45,200 Speaker 1: that wraps up this episode.
If you have suggestions for 883 00:49:45,239 --> 00:49:48,239 Speaker 1: people I should have on the show or topics I 884 00:49:48,239 --> 00:49:50,480 Speaker 1: should cover, please reach out to me. The best way 885 00:49:50,520 --> 00:49:52,920 Speaker 1: to do that is on Twitter. The handle for the 886 00:49:52,960 --> 00:49:56,480 Speaker 1: show is TechStuffHSW, and I'll talk to 887 00:49:56,480 --> 00:50:05,319 Speaker 1: you again really soon. TechStuff is an 888 00:50:05,440 --> 00:50:08,960 Speaker 1: iHeartRadio production. For more podcasts from iHeartRadio, 889 00:50:09,280 --> 00:50:12,480 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever 890 00:50:12,560 --> 00:50:14,080 Speaker 1: you listen to your favorite shows.