Speaker 1: Pushkin. From Pushkin Industries, this is Deep Background, the show where we explore the stories behind the stories in the news. One of the recurring stories is Facebook, and particularly criticisms of Facebook that have arisen from a large body of material leaked by a whistleblower to the press and to Congress. This series of leaks has, of course, major implications for the company, but it also has implications for the Oversight Board associated with Facebook, the independent institution that has the job of reviewing the company's most controversial content decisions and correcting them if it believes the company has gotten them wrong. Regular listeners of the podcast know that I'm interested in the Oversight Board, but I would take it a bit further. I am personally interested in the Oversight Board, having been one of the people who helped come up with the idea for it in the first place, and who worked with Facebook to bring it into reality. For full disclosure, I also want listeners to know, if you don't already, that I continue to advise Facebook on free expression related issues. In order to discuss the business of the Oversight Board, however, I am not the right person to do the talking, because I am not on the Oversight Board, and the Oversight Board is its own independent institution. For that purpose, we're very fortunate to be joined on today's show by Jamal Greene. Jamal is the Dwight Professor of Law at Columbia Law School. He's an expert on constitutional decision making and the author of an important book called How Rights Went Wrong. He's one of the co-chairs of the Oversight Board, in which position he has a crucial insider's view of the work of the board, its purposes, its functions, its limitations, and the challenges it faces. Jamal, thank you so much for being here.
Speaker 1: Let's start with the Oversight Board, of which you're one of the chairs, and tell me how you decided to take that job when it was offered to you.
Speaker 2: Oh gosh. When the team at Facebook that was making the initial choices reached out and kind of explained the idea, it sounded like an opportunity both to try to make the world slightly better and also something that aligned with my own personal interests and my professional interests. One of the things I do in my day job is think a lot about how to balance and optimize different kinds of rights and how to think about how rights interact with different kinds of institutions. So it had a professional interest for me, and given how important what Facebook does in terms of content moderation is, it seemed like an opportunity to contribute to making the world a little bit better. So that's why I decided to do it.
Speaker 1: Any regrets?
Speaker 2: No, no regrets. Challenges, of course, but I knew there would be challenges, so I wouldn't say there are any regrets.
Speaker 1: Good. I'm happy to hear that, since my own involvement with the Oversight Board went back to the point where there weren't any chairs. In fact, there was only the idea for the thing. I want to talk about your work on rights and its relationship to some of the themes that the Facebook Oversight Board faces. But I want to first start with transparency, which is, on the one hand, one of the core rationales for the Oversight Board in the first place. It exists in some sense to push Facebook to be more transparent in its decision making and to make its own decisions in a way that is transparent in the sense of revealing reasoning and logic.
Transparency, though, has also been a great challenge for Facebook, to put it somewhat overly politely. I mean, the company has been very badly buffeted by a whole series of leaks which have gone to major newspapers, to Congress, to probably the Federal Trade Commission, all of which seem primarily structured, not solely, but primarily structured around this idea that the company hasn't been transparent enough in aspects of its decision making. So I'm wondering what you think the Oversight Board can do to push Facebook towards greater transparency that hasn't already been done by this leaking process.
Speaker 2: Well, I think there's a couple of different ways in which one can talk about transparency, and Facebook has that issue along several dimensions. The first thing to say is, you know, we care about transparency to the degree that the particular activity touches on lots and lots of people, in a way that one thinks a company or institution needs to be held accountable in some way. Right? So if Facebook had no reach at all, if it was just a sort of private company making its own decisions, we wouldn't care that much about transparency, just as we don't care that much about transparency for, you know, the person who makes our kids' toys or something. But it's because Facebook's reach is so broad, and now in a global sense so broad, and touches on basic social policy, when it comes to vaccines, or when it comes to political elections and all sorts of things that Facebook policies can have an impact on, that's where the demand for transparency comes in. It's in a sense proportional to the reach of the company. And so when we say something like Facebook has not been transparent enough, I think the right way to understand that is to say that its transparency is not on par with its reach. And I think that's a legitimate problem for the company.
When we think about leaks, right, a lot of the leaking has been related to internal research that Facebook has conducted and the degree to which it responded to that internal research. And that's a certain kind of transparency problem that I think the board is not centrally focused on, although the board is certainly interested in what the company's doing. The way in which what the board does can be most directly relevant to Facebook's transparency is that the company makes lots and lots of decisions without fully explaining why it's making those decisions. Right? So when it comes to content, people's content gets taken down, other content gets left up. There's not much of an opinion written about it. Facebook barely gives reasons for why it does what it does. It seems like it's acting inconsistently in a variety of ways. And so what the board can do is open up that decision making process to show what kinds of trade-offs are being made, to make recommendations, and in some cases binding decisions, on how those trade-offs should be made, and to do it in a way where we write opinions, we publish those opinions, we tell people exactly what our sources of information are, what exactly we're weighing, what the kind of governing, I'll put this in quotes, "law" is, so what are the resources we're relying upon to make those decisions, in a way the company just hasn't. And I think that goes directly to how fairly the company treats its users. Again, there's a broader question of the transparency of any institution that's wielding this much power. What the board does is one element of that.
Speaker 1: So I both want to explore the distinction that you're offering, Jamal, between two different kinds of transparency and then also maybe ask you whether they might have more in common than you're suggesting.
So I hear you saying that what the Oversight Board does best, and I agree with this, is focusing on, let's call it, reason-giving transparency. Right? This is the idea that when the company makes a decision about some piece of content, it needs to follow principles and it needs to reveal transparently what its reasoning process was, and often they don't. And since the Oversight Board itself exercises that kind of transparency in making its own decisions, because you guys tell everybody, here are the principles that we invoked, here are the facts we relied on, here are the moral ideas that were relevant to our analysis, and here's the kind of conclusion that we reached after weighing these factors, you can urge Facebook to do something similar. So I get that side of the distinction. And then I hear you suggesting that there's also a kind of transparency of, you know, how much does a company disclose about its internal research about the consequences of its actions? So first let's just see if that is the distinction you're pushing here.
Speaker 2: So, yes, that is the distinction I'm pushing. I'll be curious what the follow-up question is, because I do think that there is a relation between them, but I'll let you...
Speaker 1: Yeah, I mean, to be collaborative, I think that those are in some sense different. But if you think about, you know, Facebook having access to internal research that suggested maybe some of its policies were having a negative effect on users, I think the public call is not merely that that fact should have been known, but that people want to know that Facebook's decision making process, where it decided to continue these services, or decided to tweak them in certain ways rather than eliminate them altogether, was a reasoning process that they can hear.
So, I mean, in that sense, the decision making is always: you take certain facts, you have certain values, and you try to bring those together and make a decision, and then, if you're being transparent about your decision making, you tell people why you did that. And so I actually, in the end, think that that's more similar to the kind of transparency in decision making that you're talking about. I don't think it's just that people were saying, well, they should have told us about this research. I think they're saying they should have taken this research into account in making their decisions. And when Facebook says, which is sort of all they've said so far, oh, don't worry, we did, people's response to that is to say, well, how do we know you did that? Right? And had they been more transparent about their decision making process, I think they could have said, well, this is how we took that into account, or this is how we didn't take it into account.
Speaker 2: I do think it's the case, certainly, that part of what the board does when it sees a decision that's made by Facebook, or, increasingly, as the board has weighed in and will weigh in on actual policies that Facebook is implementing and considering, is to ask why the mistake happened. And when we say a mistake, we use a few different resources in figuring out what counts as a mistake. Right? So we think about whether Facebook has applied its community standards accurately. We think about whether Facebook is acting consistently with how it understands its values. And we think about international human rights law and the norms associated with the international human rights system, to which Facebook has committed itself. It may well be that there are decisions that Facebook makes that are inconsistent with those values and norms. And one of the things that the Board tries to do, and has been trying to do in its decisions, is not just make an up or down decision on a piece of content,
but to say why this mistake was made. And in order to do that, it may sometimes be the case, right, that we need to know more about what inputs there were into particular kinds of content decisions. And I think the Board, in asking Facebook questions about that, in exposing what answers it gives to us and when it refuses to give answers to us, is pushing Facebook to be more clear about what influences there are on its decision making. Right? So one of the recommendations the Board has made that Facebook has taken up is to be more clear about when governments make requests for Facebook to remove content. Right? So that's relevant to transparency as well. So there are things the Board can do that get at that deeper transparency. What I meant to say is that there are some very deep issues about who governs us for which, at the moment, the Board's jurisdiction and scope are limited. There's no question about that. There's very important progress that the Board can and does make on trying to get Facebook to treat its users better or more consistently, and on starting to get at what's influencing its decisions. But there is always going to be a deeper question about private companies engaged in far-reaching activities, and I think those are questions for sort of all of society, lots of different kinds of institutions, the Board, but also whistleblowers and journalists and researchers and civil society organizations that are equally well, if not better, situated to get at them than the Board itself.
Speaker 1: Let me ask you about one point where there was some overlap between what the Board said in its recent transparency reports and what the whistleblower's materials disclosed, and that was the program sometimes called cross-check, through which Facebook initially was trying to address just a relatively small number of distinctive users, with respect to the newsworthiness of what was being posted on the platform, but that extended to cover really a very large number of users, many more than Facebook had acknowledged, and, it seems, many more than Facebook told the Oversight Board when the Oversight Board asked point blank about this in the course of the Trump deplatforming decision. What can you say about that, and what did the Oversight Board say about that recently?
Speaker 2: So I can say a bit about the sort of background here, but I'll note that Facebook has given the board a policy advisory request on how to structure its cross-check program, and I wouldn't want to say too much, in advance of deliberating about that and getting more information about it, about exactly what problems there might be with cross-check or what the right way to resolve those problems might be. But cross-check is this system that Facebook has in place, or had in place, in which it exposes certain users to additional layers of review, ostensibly on the theory that they don't want mistakes to be made with respect to certain users. And the board asked in the Trump decision about this program, and Facebook said that it was only used for a small number of users, and it turns out to be a few million. Facebook said to the board later that that was a small number in relation to the number of people on Facebook, which is of course true. But it's true in a kind of lawyerly way. So what the board said in its recent...
Speaker 1: I can say, since we're both law professors, I can say that you're using the word lawyerly in the negative sense of that term.
Speaker 2: That's right.
So, if you're dealing with someone in an adversarial posture, as lawyers often are, right, sometimes if they ask you a question, you answer it in the most narrow possible way. You might still be truthful, but in some ways it's misleading if you're being very narrow about it. But if you're, you know, talking to a friend of yours and they ask you about some piece of information, it would be strange to be excessively narrow about that. And what I would say is Facebook, in not being, I think, fully forthcoming, treated this in too adversarial a sort of way. Right? So when we ask them for information, we think that they should give us the full context and try to be as helpful as they can in providing the board with information. And we told them as much. Right? So, going forward, Facebook has promised that it's going to be more contextual in the way in which it responds to information requests, and that's going to be, I think, very helpful for the board in trying to do its job better, because you don't always know what you don't know, right, and so understanding better exactly how the cross-check program works can be helpful in deciding whether it's being applied fairly in a particular case.
Speaker 1: Jamal, I want to turn to the core of the Oversight Board's job, which is decision making, and here I'll be really curious to hear from you about how your distinctive approach to decision making plays out. You published an amazing book this year called How Rights Went Wrong, and that book, in turn, drew on some of your earlier scholarship, which I and others read in the course of trying to think about how the Oversight Board should make its decisions in the first place. So some of your approaches, I think, were maybe already baked in before you got there.
But I wonder if you would start by just saying something about your distinctive view of how courts, or bodies that are sort of like courts, like the Oversight Board, should decide cases where there are reasonable arguments on both sides.
Speaker 2: Sure, and I'm happy to talk about this with a couple of caveats, right? One being that it's not just my approach, right? I think I have a particular angle on it, but it is a pushback against the way in which US courts, and often US thinkers about rights, tend to think about rights, and less of a pushback against some global standards. And the second thing I'll say is we are a collaborative board, right? And so if I were writing all of the opinions just by myself, they might look a little bit different than writing opinions when twenty members have to more or less agree on them. The general point is that when we're talking about rights conflicts, rights conflicts are very often, not always, but very often, conflicts in which people have reasonable disagreement about how to apply a set of more or less shared values at a high level of generality, but they disagree on how to apply them in the particular case. When we're in that situation, it's not that helpful to pick out just a few rights that we think are important and essentialize them so that they are applied kind of absolutely whenever they're invoked, because that will tend to silence one or the other side of these rights conflicts. And I think in the US we tend to do this with the First Amendment. So the moment someone invokes speech, there's a battle over who has the speech right, or whether there's a speech right or not. And we know the stakes of that battle are extremely high, because you just sort of win if you get to say that you have a speech right. And what I try to urge in the book is that in freedom of speech cases, as in many others, there's tremendous institutional variation in the ways in which speech might be affected.
Right? So a purge of all people who are opposed to the government, in which you put them in prison, is extremely different from, let's say, a university deciding how to regulate the speech of its students, or a platform deciding who is going to be able to amplify their content and spread it around the world. These are all forms of regulation of speech, but they're in very, very different contexts, and in some of those contexts we have to think more carefully about the various other values that we think are important than we do in others of those contexts. Right? So we care about national security, but we don't care about it so much that we allow purges of our political enemies. But the fact that we care about hate speech, or we care about amplification of misinformation, those are values that we might think are sufficiently important to put some kinds of restrictions on who has access to certain kinds of platforms. And so even though they're both speech cases, they're all speech cases, we have to think carefully about what's on the other side of the balance, depending on the institutional context.
Speaker 1: Can I ask you a philosophical question around that, Jamal, that I find myself struggling with very much right now? And I don't think there's a simple answer to it. It has to do with the boundary between misinformation on matters that you and I would probably agree have a fact associated with them, and misinformation on matters that have a fact associated with them but have become so politicized that they become a stand-in for somebody's political beliefs and values. Maybe climate change would be a good example. I think you and I both think that there is a science of it, and the scientists are doing their best to get at it. They might be right, they might be wrong, but they achieve consensus, they follow their process.
But when people argue about climate change and whether it's man-made, a lot of people are using that argument, which is nominally an argument about facts, as a kind of stand-in for their political points of view. And once that happens, the differences in what people are saying could be put into the box of misinformation, if we're confident that we know what the science says, and in that case maybe it's not so important to preserve different points of view. Or they could be put into the category of political argument, about political identity and about what should be done in the future, and that's really important and would probably deserve a lot more protection. So I deliberately didn't choose COVID because it's too close to home and too controversial, but climate change is still pretty darn important. So I guess I'm wondering what you think about that. And again, it's not that I think there's a particularly right answer to it. I just think it's a hard problem.
Speaker 2: Yeah, I think it's definitely a hard problem. I don't think it has a sort of abstract answer. What I would say is a couple of things. So one is that the bare fact that something is false perhaps should not engage or enrage or excite us as much as whether something that is false is leading people to do something that is more materially harmful. So that just goes back to, sort of, COVID misinformation might be in a slightly different category than something like climate change, where the latter might be influencing policy in some way, but in a somewhat indirect way. Whereas if someone really does think that they should inject bleach into their veins, or if they take some off-label drug that might hurt them, that's in a different kind of category in terms of the immediacy of harm.
And that's important, right, because it's very hard, as you say, for these philosophical reasons, to sort of adjudicate these things in the abstract. But when we are able to connect them to more concrete harms, that affects how we feel about regulating them, even if we can't resolve the philosophical issue that you just raised. Right? So there are various forms of misinformation and ways in which we mislead each other. There's a long spectrum from pure truth to pure lie, and we're often somewhere in the middle of that in our political discourse. So I tend to think that really the only productive way that a regulator of some kind can respond to that is to try to focus on direct and concrete harms.
Speaker 1: We'll be right back. Jamal, on your Oversight Board, one of your co-chairs, Michael McConnell, is a retired federal judge and also a law professor. Do people like him, or people who come out of one particular system, find it relatively simple and seamless to shift to a more overtly rights-recognizing approach, in your view, or is there a kind of sense of cultural clash or cultural difference behind the scenes?
Speaker 2: There are cultural differences. The Board is a very diverse institution along many dimensions, including the legal traditions that people are associated with, or whether they're associated with legal traditions at all. I think that's a strength of the Board, in that it doesn't become sort of overly lawyerized. The personal deliberative model that I'm most familiar with is, you know, the faculty meeting, or maybe the law school workshop, which is a particular kind of culture, sort of bouncing ideas off each other, challenging people fairly directly. And I do think we all take that into the deliberation room, which turns out to be a Zoom room, when we talk about cases. And again, I think that's a strength. Everyone who's joined the board has joined it knowing that this is a collaborative enterprise.
You're there for a reason, and what you're bringing is valuable to the room. But we're also trying to reach a decision and trying to reach a certain degree of consensus. And I've certainly seen cases where people lodge strong objections and then they say, okay, we had a discussion, my position lost, and now I'm on board. I think that that's been very healthy, very active, and I actually want us to be able to try to model that for people who aren't on the board, right, that when you disagree about things, you hash it out, you have respectful disagreement, and you reach a decision. You move on to the next fight.
Speaker 1: What's been, Jamal, the most surprising thing that you've experienced while working on the Oversight Board?
Speaker 2: Gosh, that's a hard question. What's the most surprising? Because there have been a few surprising things, I think.
Speaker 1: Well, give me several. I mean, actually, one of the reasons, again, full disclosure, one of the reasons I'm asking you is I have a kind of nose-pressed-against-the-glass feeling sometimes about the Oversight Board, you know, like having dreamed the thing up, pushed for it, and then decided that I was so close to the company through the process of building it that I shouldn't serve on it. I sort of hoped that people whom I hugely respect and trust, like you, would go off and do it. But I don't have a feeling for the minute to minute of what it's like from the inside, and it sort of kills me. So I'm actually really curious to get at it. What have been various things that weren't what you would have expected?
Speaker 2: So I'll name two things. One is that the work of the board is not just the work of the board members. Right? So we have a staff. The staff is excellent. Thomas Hughes is the director.
I hadn't thought very carefully about the staff, because I sort of had this idealized vision of: you get a case, and then you sit in your office and you think carefully about it, and then you just come to a view. Right? But the day-to-day operation of the board, the amount of research that has to go into particular cases, the complexity of writing these opinions and making sure we get them right, a million other things having to do with how we work with Facebook to try to implement decisions, how you actually set it up technologically in terms of security and privacy and the legal aspects of it, and just the size and quality of the staff, I think, is one thing that I had not anticipated, or hadn't thought carefully about before I took the job, but it's completely essential to what we do. The other is a point about Facebook, which is just the complexity of the company, which I think I hadn't fully grasped. It's not just that sometimes there's a right hand and a left hand and they're doing different things; it's twenty-five different hands, right, and they're all doing different things. And there's a lot of internal diversity at Facebook in terms of whether people think the company is doing the right thing or the wrong thing, its power structure, its platform. And I think there's a perception of the company that it's just sort of Mark Zuckerberg sitting on a throne making decisions for everyone. It's a complicated place.
Speaker 1: Speaking of it being a complicated place, Facebook as we know it just changed its overall name to Meta, or Meta Platforms, and that means this company is going to do many, many more things in the, broadly speaking, virtual reality space. Is the Oversight Board's charter written to give the Oversight Board supervisory power or authority in those kinds of undertakings beyond the product called Facebook?
Speaker 2: So I'd have to go and take another look at the charter.
But my belief as I sit here is that the charter is very much connected to the platform as it exists today, and that if at some point in the future the Board and Facebook were going to decide that the Board was going to extend into other Facebook products, that would require some change to the governing documents of the Board. As I sit here today, I think all of us are waiting to see what exactly the company means by its entry into the virtual space. But at the moment, there's not much for the Board to say about that.
Speaker 1: What do you think would be a good measure? You know, we're now a year into the life of the Oversight Board. What would be a good measure in another year or two, in your mind, as to whether the work you were doing was having the kind of impact you hope it will have?
Speaker 2: I think the Board has already had some significant impact on Facebook. I think the culture of the company now knows that it has to justify a number of the kinds of decisions it makes relating to users to the board. But in terms of how one would measure it from the outside, I think engagement with the board. Right? Part of the board's challenge has been that the board is an independent entity. It's structured to be an independent entity. Facebook doesn't control the board, right, but Facebook created the board. And as a matter of public perception, I think, just as people have children and then the children become their own independent people, the board as a new institution is still working towards, and has to work towards, making clearer the degree to which it's truly an independent entity. And that means engagement with lots of people and institutions that are not necessarily connected to, or in league with, or sympathetic with Facebook. For example, the board is talking to former Facebook employees, including Frances Haugen, because we want to learn more about how to do our jobs better.
The Wall Street Journal reporting, I certainly, and I think we've done this officially too, celebrate that, which is to say, we celebrate learning more about the company through other institutions as well. Right? So the board is collaborative with other modes of accountability. It's complementary to other modes of accountability, including government. Right? I mean, the Board doesn't have a position on whether or to what degree the government should regulate social media platforms. It's just to say that we are part of a larger ecosystem in which we're all trying to make very powerful, very important companies as responsible as we can make them. And so, again, the measure of that is who's engaging with us, who trusts us, who relies on us, who reads our opinions, who contributes to our policy recommendations, and who gives public comments. Do people trust this as an institution that can make a difference, that's listening to them, that's open to engaging with them in a transparent way? I think we're on the road to doing that, but there's more work to be done.
Speaker 1: Jamal, what should I be asking you about the operation or future or reality of the Oversight Board that I haven't asked you?
Speaker 2: Well, I think a question that's important for a new institution is: what's the biggest challenge going forward? I think implementation is one of the biggest, just in the sense that we make binding decisions on individual cases. We've made seventeen of them. We've had five hundred thousand appeals. There are a billion pieces of content on Facebook every day, and we've made seventeen decisions. Right? So in some sense, right, how can you make an impact with just seventeen decisions? Well, of course, you pick the cases that you're going to make decisions about based on whether they can have a more long-ranging impact. But then, once you make a decision on that individual piece of content, the question is how can that decision spread across the company?
And that often will require engineering changes, changes in the way the community standards are implemented by both machines and by humans. It might require changes in specific policies that the company has. We have to have a certain degree of independence from the company in making the decisions we make, but then, in how to actually implement them on the platform, there's no way around having to collaborate with Facebook in some sense. And so that requires, you know, a tricky balance of not just figuring out what the right answer is in terms of implementing decisions, but also, you know, figuring out what Facebook really can and really can't do. Since we're not engineers, you know, how much do you take "this is really expensive and really complicated" as an answer? That's hard, right? And there's no way around that being hard, other than you build expertise over time, you build relationships over time, you bring expertise on board, onto our staff and onto our board, so that we're not just relying on Facebook's technical know-how. And that's going to be something that is an ongoing issue.
Speaker 1: And if you think about it, just to underscore your point about it taking time, you know, the Supreme Court of the United States has ordered the federal government, it's ordered states, it's ordered institutions within states to do very complicated things in its history, and sometimes they pull it off. Sometimes they don't. Sometimes they're dragging their feet because they have bad intentions. But sometimes it really is just instrumentally very difficult to operationalize the commands of the court. And so that creates a kind of over-time dialogue where both sides gain expertise and develop some degree of trust, even alongside the possibility of occasionally being adversarial to each other.
And I would say that's normal 585 00:32:58,356 --> 00:33:01,796 Speaker 1: for any entity that's engaged in oversight and the body 586 00:33:01,796 --> 00:33:03,676 Speaker 1: that it's overseeing, in the same way that, in a sense, 587 00:33:03,676 --> 00:33:06,076 Speaker 1: the US Supreme Court exercises a kind of constitutional 588 00:33:06,156 --> 00:33:08,996 Speaker 1: oversight over the rest of our government. So that's to say 589 00:33:09,156 --> 00:33:11,796 Speaker 1: that you're off to the right start, and as you say, 590 00:33:11,916 --> 00:33:15,196 Speaker 1: it's going to take time and effort to get it right. Yeah, 591 00:33:15,396 --> 00:33:17,516 Speaker 1: I think if there's a final point I would 592 00:33:17,516 --> 00:33:21,356 Speaker 1: emphasize about the board and measuring its impact, it's that it 593 00:33:21,356 --> 00:33:24,396 Speaker 1: will take time. We live in a culture, certainly, in 594 00:33:24,436 --> 00:33:27,476 Speaker 1: which the news cycle lasts a day or lasts two 595 00:33:27,516 --> 00:33:29,636 Speaker 1: days, or a week at most if it's a really 596 00:33:29,676 --> 00:33:34,236 Speaker 1: important story. But building an institution, you know, 597 00:33:34,596 --> 00:33:36,836 Speaker 1: we talk about moving the ship of state, but you 598 00:33:36,876 --> 00:33:40,356 Speaker 1: know, the ship of Facebook. Moving that ship is going 599 00:33:40,436 --> 00:33:43,196 Speaker 1: to take time. That doesn't mean that you get an 600 00:33:43,196 --> 00:33:46,076 Speaker 1: infinite leash on moving that ship, right, and so you 601 00:33:46,156 --> 00:33:48,476 Speaker 1: work as quickly as you can. There are a lot 602 00:33:48,476 --> 00:33:53,756 Speaker 1: of issues with Facebook, and figuring out what to prioritize, 603 00:33:53,916 --> 00:33:56,436 Speaker 1: what's going to take six months versus what's going to 604 00:33:56,516 --> 00:33:58,956 Speaker 1: take five years, is part of the challenge. And as 605 00:33:58,956 --> 00:34:00,956 Speaker 1: you say, we're off to the right start and there's 606 00:34:00,956 --> 00:34:03,996 Speaker 1: a lot of work to be done. Jamal, I want 607 00:34:03,996 --> 00:34:06,956 Speaker 1: to thank you for this very wonderful and frank conversation. 608 00:34:07,036 --> 00:34:08,676 Speaker 1: I also want to thank you for your academic work, 609 00:34:08,756 --> 00:34:11,316 Speaker 1: which has taught me a lot. And I want to 610 00:34:11,316 --> 00:34:14,356 Speaker 1: thank you specifically for taking on the challenge of being 611 00:34:14,636 --> 00:34:17,196 Speaker 1: one of the chairs of the Oversight Board. It would 612 00:34:17,236 --> 00:34:20,196 Speaker 1: not be the same Oversight Board without you. And the 613 00:34:20,236 --> 00:34:22,076 Speaker 1: reason that I think it has a chance to make 614 00:34:22,276 --> 00:34:25,916 Speaker 1: meaningful contributions is precisely because people like you have agreed 615 00:34:25,996 --> 00:34:29,116 Speaker 1: to take on its work. So thank you. Thank you, Noah. 616 00:34:30,476 --> 00:34:42,556 Speaker 1: We'll be right back. There are twenty members of the 617 00:34:42,556 --> 00:34:45,556 Speaker 1: Oversight Board, but the reason I particularly wanted to speak 618 00:34:45,596 --> 00:34:49,956 Speaker 1: to Jamal is that his academic expertise is precisely in 619 00:34:50,076 --> 00:34:53,996 Speaker 1: how hard decisions should be made by bodies like the 620 00:34:54,036 --> 00:34:57,516 Speaker 1: Oversight Board.
What's more, the chairs of the Oversight Board, 621 00:34:57,516 --> 00:35:01,836 Speaker 1: of whom Jamal is one, exercise disproportionate power relative to 622 00:35:01,876 --> 00:35:04,636 Speaker 1: the other members when it comes to setting the agenda 623 00:35:04,676 --> 00:35:07,876 Speaker 1: for the Board and deciding what kinds of important decisions 624 00:35:08,076 --> 00:35:12,156 Speaker 1: it ought to make. Speaking to Professor Jamal Green about 625 00:35:12,196 --> 00:35:15,276 Speaker 1: the Oversight Board was a kind of split screen experience 626 00:35:15,316 --> 00:35:18,356 Speaker 1: for me. On the one hand, I had the pleasure 627 00:35:18,516 --> 00:35:23,476 Speaker 1: of hearing a true expert on decision making talk about 628 00:35:23,476 --> 00:35:26,756 Speaker 1: how he makes decisions and how the institution that he 629 00:35:26,876 --> 00:35:31,316 Speaker 1: is helping to shape thinks about those decisions. Similarly, in 630 00:35:31,356 --> 00:35:34,116 Speaker 1: that same screen, I was hearing the perspective of one 631 00:35:34,156 --> 00:35:37,516 Speaker 1: of the chairs of the Oversight Board talk about what 632 00:35:37,716 --> 00:35:41,036 Speaker 1: its job is and what it needs to be, about 633 00:35:41,076 --> 00:35:43,436 Speaker 1: what it's done well so far, and the challenges that 634 00:35:43,476 --> 00:35:46,956 Speaker 1: it faces in the future. Over on the other screen 635 00:35:47,396 --> 00:35:50,836 Speaker 1: was my sense of wonderment, shock, and to be honest, 636 00:35:51,036 --> 00:35:54,076 Speaker 1: a slight feeling of the surreal to realize that an 637 00:35:54,076 --> 00:35:58,476 Speaker 1: institution that I helped imagine and create is actually up 638 00:35:58,476 --> 00:36:01,676 Speaker 1: and running, and that what it does has absolutely nothing 639 00:36:01,716 --> 00:36:04,476 Speaker 1: to do with me or anything I say about it. 640 00:36:05,036 --> 00:36:07,916 Speaker 1: In that respect, I am thrilled by what the Oversight 641 00:36:07,956 --> 00:36:11,876 Speaker 1: Board is doing, but also nervous on its behalf, sort 642 00:36:11,876 --> 00:36:14,676 Speaker 1: of in the way you would be nervous for anything 643 00:36:14,716 --> 00:36:17,916 Speaker 1: that you participated in creating that goes off on its own. 644 00:36:18,596 --> 00:36:21,556 Speaker 1: You want the institution to do well and be independent, 645 00:36:21,916 --> 00:36:24,796 Speaker 1: but of course you also wish it would do exactly 646 00:36:24,836 --> 00:36:27,756 Speaker 1: what you wanted it to do in every context and in 647 00:36:27,796 --> 00:36:31,596 Speaker 1: every element. My ultimate takeaway from the conversation with Jamal 648 00:36:31,716 --> 00:36:34,196 Speaker 1: is that the Oversight Board is going to go its 649 00:36:34,236 --> 00:36:38,516 Speaker 1: own way. It is going to continue to assert authority 650 00:36:38,556 --> 00:36:41,836 Speaker 1: over Facebook's decisions. It is going to continue to press 651 00:36:41,996 --> 00:36:45,116 Speaker 1: Facebook to try to be more transparent, but it's also 652 00:36:45,196 --> 00:36:47,676 Speaker 1: going to have to grapple with the limitations of its 653 00:36:47,716 --> 00:36:51,476 Speaker 1: own design.
As an oversight board that can give guidance 654 00:36:51,516 --> 00:36:54,236 Speaker 1: and advice to Facebook on a case by case basis, 655 00:36:54,556 --> 00:36:56,916 Speaker 1: and can tell Facebook what to do specifically when 656 00:36:56,956 --> 00:37:00,876 Speaker 1: Facebook asks, it is not designed to, nor has the 657 00:37:00,916 --> 00:37:05,956 Speaker 1: capacity to, fundamentally transform the way the company does business. 658 00:37:06,716 --> 00:37:10,756 Speaker 1: For that kind of transformation, Facebook, like other companies, will 659 00:37:10,796 --> 00:37:13,516 Speaker 1: have to act on its own and on the basis 660 00:37:13,516 --> 00:37:16,876 Speaker 1: of its own conception of the public interest and the 661 00:37:16,956 --> 00:37:20,876 Speaker 1: interests of itself and its shareholders. Until the next time 662 00:37:20,916 --> 00:37:24,516 Speaker 1: I speak to you here on Deep Background, breathe deep, think 663 00:37:24,516 --> 00:37:28,956 Speaker 1: deep thoughts, and try to have a little fun. If 664 00:37:28,996 --> 00:37:32,076 Speaker 1: you're a regular listener, you know I love communicating with 665 00:37:32,116 --> 00:37:35,316 Speaker 1: you here on Deep Background. I also really want that 666 00:37:35,356 --> 00:37:38,436 Speaker 1: communication to run both ways. I want to know what 667 00:37:38,516 --> 00:37:41,116 Speaker 1: you think are the most important stories of the moment, 668 00:37:41,476 --> 00:37:43,196 Speaker 1: and what kinds of guests you think it would 669 00:37:43,236 --> 00:37:46,316 Speaker 1: be useful to hear from more. So I'm opening a 670 00:37:46,316 --> 00:37:49,876 Speaker 1: new channel of communication. To access it, just go to 671 00:37:49,916 --> 00:37:53,436 Speaker 1: my website, Noah dash Feldman dot com. You can sign up 672 00:37:53,436 --> 00:37:56,756 Speaker 1: for my newsletter and you can tell me exactly what's 673 00:37:56,756 --> 00:38:00,036 Speaker 1: on your mind, something that would be really valuable to 674 00:38:00,076 --> 00:38:04,796 Speaker 1: me and, I hope, to you too. Deep Background is 675 00:38:04,796 --> 00:38:08,076 Speaker 1: brought to you by Pushkin Industries. Our producer is Mo LaBorde, 676 00:38:08,156 --> 00:38:12,156 Speaker 1: our engineer is Ben Taliday, and our showrunner is Sophie 677 00:38:12,156 --> 00:38:17,036 Speaker 1: Crane McKibbon. Editorial support from Noam Osband. Theme music by 678 00:38:17,116 --> 00:38:21,996 Speaker 1: Luis Skara at Pushkin. Thanks to Mia Lobell, Julia Barton, Lydia 679 00:38:22,076 --> 00:38:26,756 Speaker 1: Jean Coott, Heather Faine, Carlie Mcgliori, Maggie Taylor, Eric Sandler, 680 00:38:26,756 --> 00:38:29,756 Speaker 1: and Jacob Weisberg. You can find me on Twitter at 681 00:38:29,836 --> 00:38:33,196 Speaker 1: Noah R. Feldman. I also write a column for Bloomberg Opinion, 682 00:38:33,356 --> 00:38:36,516 Speaker 1: which you can find at Bloomberg dot com slash Feldman. 683 00:38:36,956 --> 00:38:40,796 Speaker 1: To discover Bloomberg's original slate of podcasts, go to Bloomberg 684 00:38:40,876 --> 00:38:44,436 Speaker 1: dot com slash podcasts, and if you liked what you 685 00:38:44,516 --> 00:38:47,316 Speaker 1: heard today, please write a review or tell a friend. 686 00:38:48,596 --> 00:38:50,236 Speaker 1: This is Deep Background.