Speaker 1: Pushkin. From Pushkin Industries, this is Deep Background, the show where we explore the stories behind the stories in the news. I'm Noah Feldman. This is a special bonus episode, a mini episode about some breaking news. This week, Facebook's Oversight Board decided the most important case in its short life: what to do about Donald Trump's temporary suspension from the platform, which had been announced by Facebook in the aftermath of the January sixth attack on the Capitol. The story mattered to me because, as some listeners will know, I've been deeply involved with the Oversight Board, proposing it to Facebook in the first place and advising the company on its creation. In fact, I still advise Facebook on free speech and free expression related issues. So when it comes to the Oversight Board, I'm the very opposite of an objective observer. I am an observer who's deeply bound up in the institution and the process, and I care a lot about this decision. And let me tell you, it was fascinating and strange to see the decision of that institution plastered on the front pages of the newspapers. After consultation with my terrific team of producers here at Deep Background, we decided that it might be useful to do a special mini episode on the Oversight Board decision. And I'm going to tell you, just from my own perspective, three different aspects of what you should think, or what you might wish to think, about the Oversight Board decision. What I'm going to do is break my comments into three parts. First, what did the Oversight Board actually do? And as you'll hear, the answer is pretty different from what the headlines have said. Second, what is likely to happen next in the coming months? And last, but very much not least, why this matters, or may matter, in the big picture.

Speaker 1: First, what did the Oversight Board actually do? There is some confusion around this because the very first thing the Oversight Board said in its opinion was the slightest little bit misleading.
Speaker 1: The Oversight Board began by saying that it was upholding Facebook's decision, in the aftermath of the January sixth attack on the Capitol, to take Donald Trump off the service. And yet when you went on to read the fine print, the Oversight Board went on to say that Facebook's subsequent deplatforming of Donald Trump for an indefinite length of time was wrong, standardless, and unjustified. As a consequence, the first thing the newspapers reported was "Oversight Board upholds Facebook." Yet they could just as easily have said as their headline, "The Oversight Board told Facebook that it was not justified in suspending Trump from its service."

Speaker 1: So what was the Oversight Board in fact saying when you drill down? Well, what it said is that the decision to block the content that Trump posted during, and in the process of, the attack on the Capitol was the right thing for Facebook to do, because Donald Trump's words, the Oversight Board believed, were contributing to ongoing harm, including violence, with respect to the attack on the Capitol. Therefore, said the Oversight Board, it was appropriate to take down that content. But the board then went on to say that when Facebook chooses to take down content, it doesn't ordinarily go on to remove the user from the platform. Instead, Facebook has a range of things that it can do, which include just taking down the content, or temporarily freezing the account of the person who has posted that content, or, under some circumstances, actually deplatforming the person. What Facebook had never done before, according to the Oversight Board, was announce an indefinite suspension, which was neither labeled as a mechanism to prevent future harm, nor as a punishment for explicit violations by Trump of rules of the platform that can get you deplatformed. In essence, what the board was saying was that Facebook needs to go back to the drawing board.
Speaker 1: It needs to clarify and specify what its rules are going to be going forward for taking people off the platform, and then to see if those rules, which it has to state, explain, and announce, would apply to Donald Trump. Once it reaches that conclusion, if its clearly stated rules don't apply to Trump, Trump has to be put back on the platform. If it says that its rules do justify the permanent removal of Trump, then it could take Trump off the platform. And Trump, of course, would then have the opportunity to go back to the Oversight Board and ask for it to review the issue again. Whether it would listen to his case or not is uncertain, but it seems probable that it would, given the great importance of the issue.

Speaker 1: You probably noticed that a lot of this decision therefore depends on what Facebook does in the next six months. And you might also be wondering, and the truth is, I'm wondering about this a little bit too, how did the Oversight Board decide to give Facebook six months to figure out what it was going to do next?

Speaker 1: So let's turn to that six-month period, and here's why that six-month period matters so much. Some observers of this decision have said that the Oversight Board punted the question of what to do about Donald Trump back to Facebook, and in a sense that is correct. Acting in a manner not unlike what many actual supreme courts or constitutional courts would do, the Oversight Board declined to say: here, Facebook, are the rules which you must follow when the time comes to decide whether to kick somebody off the platform. The Oversight Board saw its role as doing oversight, not as specifying policy. So there is a punt, or a return of this issue back to Facebook, insofar as the Oversight Board was telling Facebook: you have to write the policy; we're not going to do it for you. That said, the Oversight Board gave substantial guidance to Facebook with respect to what that new policy should look like.
Speaker 1: When Facebook now goes to rewrite its policies, it will have to go into the details of what the board suggested. And although the board did not say that Facebook had to listen to these principles, the strong implication was that if Facebook made a decision that violated the principles that the board laid out, the board might well overturn Facebook's policies the next time around.

Speaker 1: What was good for Trump is that the Oversight Board made it very clear that Facebook, in deciding whether someone like Trump can be permanently deplatformed, has to look at whether his presence on the platform would cause significant imminent, that means immediate, harm. Here's the money quote: "Facebook must assess whether reinstating Mr. Trump's accounts would pose a serious risk of inciting imminent discrimination, violence, or other lawless action." In other words, Facebook can't just say, we don't like Donald Trump, we think Donald Trump's lousy, or even, we think Donald Trump is in general dangerous. They have to create rules according to which a removal of Trump would be conditioned on this serious risk of inciting discrimination, violence, or lawless action. That's good for Trump: now that he's no longer president of the United States, and now that he's not commanding a mob that's about to attack the Capitol, it would not be that easy for Facebook to show that putting him back on the platform would incite imminent violence or lawlessness.

Speaker 1: What's less good for Trump is that, in describing what Facebook should do over the next six months, the Oversight Board also seemed to suggest that Facebook should require Trump to back down from some of the spurious claims about election fraud he has been making. Here's the money quote here:
Speaker 1: "Facebook should, for example, be satisfied that Mr. Trump has ceased making unfounded claims about election fraud in the manner that justified suspension on January 6." And in the real world, we all know it doesn't seem very likely that Donald Trump, who responded to the Oversight Board decision with a loud statement of rejection in which he referred to himself as the president of the United States, will take steps like that.

Speaker 1: In any case, what Facebook is now going to have to do is engage in an internal process of figuring out how to state rules that will be designed to justify and explain whatever they decide to do about Trump. That internal process will involve those people within Facebook who make content policy rules, and they will have to figure out how to apply those rules in a public way. The rules will not only cover Donald Trump, but will also cover anybody else whom they wish to take off the service. The Oversight Board made it very clear in its decision that Facebook cannot have one rule for Trump and another rule for every other government leader. It also strongly implied that Facebook should not have different rules for public figures who influence a lot of people than it does for regular users. Regardless, the Oversight Board was very concerned that Facebook pay attention to the potential dangers and harms posed by users and explain the connection between those harms and any decision to deplatform the person. We may not know much publicly about how Facebook carries out this process right away, but the good news is, under the board's guidance and oversight, Facebook will have to explain clearly and publicly what its rules are, and will have to show how those rules operate.

Speaker 1: That brings us to the grand question of whether any of this matters. It may not surprise you to hear that I think it matters a lot, and for several reasons. First is the fact that the Oversight Board actually did its job.
Speaker 1: That is to say, it operated in such a way as to render a decision that neither rubber-stamped what Facebook had done nor fully reversed what it had done. Instead, the Oversight Board did oversight. That is, it held Facebook to account by saying that Facebook had an obligation to follow rules and principles that would be made public in the realm of free expression. On its own, Facebook had not clarified publicly exactly why Trump was removed. It had acted in a somewhat "let's figure out what to do under these circumstances" ad hoc manner, and the Oversight Board told Facebook it just couldn't get away with that. Yet the Oversight Board was also unwilling to shoulder all of the responsibility for telling Facebook exactly what it should do in the future. It wanted Facebook to take on board its own responsibility for getting it right, and that seems to be exactly what oversight should be about.

Speaker 1: Second, the Oversight Board decision was treated by news organizations throughout the world the way a decision by an actual supreme court would probably be treated. It wasn't just discussed; it was analyzed, pored over, evaluated, argued about, and indeed also much anticipated when it came down. The fact that the world seems to have treated the Oversight Board's decision as a real decision suggests that the institution may have passed its first major test of legitimacy. Sure, it will be criticized, and indeed criticized harshly, by supporters of Donald Trump, and it may also be criticized by people who think that the board didn't go far enough in telling Facebook exactly what to do. But those are the kinds of criticisms to which real-world courts are subject all the time. It's therefore very important that this decision was made, was discussed, was analyzed, because it suggests that a possible future direction for the way important decisions like this are going to be made is in dialogue between Facebook and its Oversight Board.
Speaker 1: Some people might prefer that there not be a dialogue, that the Oversight Board just speak and the conversation be finished, but that's not how real-world courts operate, and that's probably not how the Oversight Board is going to operate, for now. Instead, to engage in oversight, it's going to have to participate in an ongoing process of dialogue.

Speaker 1: Last, but not least, one of the crucial reasons for the creation of the Oversight Board in the first place was the sense that the most important decisions about free expression on social media are too big to be made solely by the people who run the company. The Oversight Board told Facebook's leadership: we don't like how you made this decision; go back and do it again. Facebook will then have to make a new decision, and that decision, too, is subject to being reviewed, finally, by the board. In other words, there will be a sharing of ultimate responsibility for decision making. That sharing is, at least in my view, a step in the right direction, away from a world where the most important decisions about free expression are made by the CEOs of platforms, with no option for recourse and no independent review by any third-party body.

Speaker 1: Everything that I've just said to you is subject to revision and review as time develops and as the story continues. And just to remind you, none of it comes from my objective analysis. It all comes from my own connection to, and care about, this nascent institution. That said, I will say I'm pretty proud today of what the Oversight Board did. I don't know that I would have written the opinion the way the Oversight Board did. I don't know that I would have given Facebook six months in order to make this decision. I might have thought it could do it in a substantially shorter amount of time.
Speaker 1: I might have explained why six months was the amount of time that was being chosen, as opposed to just suggesting it as a reasonable amount of time in which Facebook could act. But those are nothing but little quibbles. In the end, this institution acted as an oversight body and gave feedback to Facebook, and Facebook is going to have to listen. And that, for once, seems to be a small step forward in the world of regulation and ethics in the context of big tech.

Speaker 1: I'll be back to you soon with a full episode. In the meantime, have a terrific week, stay safe, and be well.

Speaker 1: Deep Background is brought to you by Pushkin Industries. Our producer is Mo LaBorde, our engineer is Martin Gonzalez, and our showrunner is Sophie Crane McKibben. Editorial support from Noam Osband. Theme music by Luis Guerra at Pushkin. Thanks to Mia Lobell, Julia Barton, Lydia Jean Kott, Heather Fain, Carly Migliori, Maggie Taylor, Eric Sandler, and Jacob Weisberg. You can find me on Twitter at Noah R. Feldman. I also write a column for Bloomberg Opinion, which you can find at bloomberg dot com slash Feldman. To discover Bloomberg's original slate of podcasts, go to bloomberg dot com slash podcasts. And if you liked what you heard today, please write a review or tell a friend. This is Deep Background.