Pushkin. From Pushkin Industries, this is Deep Background, the show where we explore the stories behind the stories in the news. I'm Noah Feldman.

This week, former President Donald Trump's second impeachment trial is taking place in the Senate. At the core of the trial is Donald Trump's speech. What did the President say and when did he say it? From the standpoint of the House managers, Donald Trump incited violence on January sixth when he gave the speech on the Ellipse, encouraging his followers to march on the Capitol, which they then occupied. From the standpoint of Donald Trump's defense, there was no intimate connection, no causal connection, no connection at all, they say, between what the President had to say at the time and what the rioters did.

But impeachment was not the only, and possibly not even the most serious, consequence to Donald Trump of his speech on January sixth. In the aftermath of the attack on the Capitol, Twitter suspended Donald Trump permanently and Facebook suspended Trump indefinitely. Those two suspensions effectively blocked Donald Trump from social media, which had been the oxygen for his campaign and the main method that he used to communicate to his public during his presidency. The consequences for the question of free expression and social media could not be greater. And indeed, after Joe Biden's inauguration, Facebook decided to refer the question of Trump's suspension to its newly created Facebook Oversight Board, a group of twenty-plus experts from all over the world who are independent of Facebook and have the authority to decide whether its decisions conform with its stated values and with Facebook's rules. Right now, the question of Trump's suspension is pending before the Oversight Board, and Facebook has pledged that it will follow the conclusion that the Oversight Board reaches. These cases raise profound and difficult questions about what speech should be permitted and what speech should be restrained.
To discuss these pressing questions and the state of play in social media content governance more generally, I'm joined today by the Vice President of Content Policy at Facebook, Monica Bickert. Monica's job is to run all decisions about content policy at Facebook. She was intimately involved in the Trump process, the Trump decision, and the decision to pass along the Trump case to the Facebook Oversight Board.

Before we start this conversation, I want to begin by telling my listeners that when it comes to the Facebook Oversight Board, I am the very opposite of a disinterested observer. I helped come up with the idea for the Oversight Board in the first place. I worked as a paid consultant to Facebook during the entire three-year process of getting the board going, and I still advise Facebook now on questions of free expression. Indeed, that's how I met Monica, during the process of working on the Oversight Board, and we became friends and subsequently taught a class together on social media and the law at Harvard Law School. With that relevant background in mind, I'm excited to turn to our conversation. Monica, welcome to Deep Background.

Monica, let's start with the biggest-ticket issue in the universe of content moderation right upfront, which is the suspension of Donald Trump from your platform, as well as from Twitter, in the wake of the January sixth attacks on the Capitol. And I guess I just want to begin by asking you, what did your internal process look like? How, you know, in the whole compliance ecosystem that you're in charge of, of content policy at Facebook, did you make your way towards this really historic decision?

The first thing that happened was my attention was brought to a couple of posts by then President Trump. My team flagged for me, hey, a video has been posted by the President. We're reviewing it now to see if it violates any of our content policies, but it's something that you need to look at.
One was a video and one was a text post, and they happened during the attack on the Capitol. We saw in the President's video that he said I love you all and thank you, or words to that effect, which to us constituted praise. So that was a violation of our policies. And then shortly thereafter there was a text post that had some of the same language. It called those who had reached the Capitol great patriots. So they flagged that video. I reviewed it along with some of my colleagues, and what we saw was what rose to a violation of our policy against celebrations of violence. And this is a policy that you may have heard about before, in the context of us saying we don't allow anybody to praise terror acts or acts of violence. And you can think of that as, if there's a bombing somewhere and somebody says, oh, I'm glad that bombing happened, we would remove that as praise of a terror act. But we remove any praise of violent acts where a person is likely to be injured. And here, the Capitol attack we knew was a violent act. And this was sort of a normal part of our process: throughout the run-up to the election and then the run-up to the inauguration, we had a twenty-four-hour operations center where we were flagging and looking at content that potentially violated our policies. And of course that included content from any one of the billions of people using our services, but it also did include looking at content that was posted by high-profile accounts, including the President's account.

So, Monica, I want to ask first a question about that twenty-four-hour operations center and how it was functioning in this instance. How was that working? Was it literally that there's someone there in the op center looking to see what the President would do next?

Well, there are always three ways that content can get flagged for our attention, and one is a user report.
Two is, as you mentioned, we use technology to try to identify likely violations, and three is we work with partners outside the company, and that could include, depending on where you are in the world, safety groups or media groups that might want to flag something for us. And when it came to the US election, we were working with a number of partners, including election officials and safety groups, voters' rights groups, etc. And then we also had our teams who were looking at high-profile accounts, not just because content that violates our policies from those accounts would be really important to be on top of, but also because sometimes you'll see people who have high-profile accounts being the subject of attempted hacks or attempted abuse in comments, and so those are things that our teams do watch.

So let me turn now to the remarkable observation that you first reacted to the Trump posts because they were celebrating violence. That's a tiny bit different from what the House of Representatives alleged in its article of impeachment against Trump, not that he was celebrating violence, but that he incited violence. And the difference, I guess, is that celebration happens when something violent has already happened. You're celebrating the fact that the violent thing is happening, whereas with incitement, the violent thing hasn't happened yet, and you're doing something that is encouraging it to come about, and that involves a lot of prediction on the part of whoever's judging that, in this instance by the House and then eventually by the Senate. So do you see those things as in some way distinct, or was it just the case that when Trump gave his initial speech to the rally on January sixth, that that didn't ring any immediate bells because no violence had happened yet?
Because if so, that's kind of interesting, really, for the question of impeachment, right? If it didn't look like incitement when he said it, and you were all sitting there in your twenty-four-hour op center looking at it, that's sort of not a terrible defense for Trump to raise when he says I didn't incite any violence. Not a defense to raise with you, but a defense for Trump to raise in his impeachment trial.

Well, one of the things that I would emphasize is that our celebration of violence policy is ultimately about preventing further violence. So, for instance, as I mentioned earlier, if somebody says, oh, I'm glad that bomb went off in that city and killed all those people, the reason we remove that is not because it's distasteful, though it certainly is. It's because we think that people praising and celebrating violent acts glorifies that and can lead to further violence. So, whether or not you want to call that additional incitement or call it, as we do in our policies, the celebration of violence, I think the point is the same, which is, we thought there was a risk of additional violence, and we thought the President's remarks contributed to that.

But is it, in fact, the case that your own decision to indefinitely suspend President Trump from the platform was driven not by the theory that his speech to the crowd on January sixth led to the violence, but rather on the basis of comments he made after that violence had already begun?

Yes, that is right. What we removed was commentary after the violence had begun. It was a video and a text post. But the reason that we have that celebration of violence policy is because we think that kind of commentary can indeed stoke further violence. And in fact, here, not only did we remove the content, but we extended the twenty-four-hour ban that was called for by our policies.
We extended that indefinitely because we thought the risk of violence on the ground was still very present and likely would be throughout the transition to power.

That's really interesting too, the indefinite ban through the, as it were, transition until Joe Biden was eventually sworn into office, at which point, and we'll come to this later, Facebook turned this issue over to its Oversight Board. Was that, then, because you were worried about violence, or you were worried that somehow the transition itself was in jeopardy, through democracy itself being in danger, or are those basically the same thing?

For us, it's about the risk of violence. What we're looking at is through the lens of our values around allowing speech but also promoting safety and removing what we think could reasonably contribute to a risk of physical harm to somebody. And here we had actual physical harm happening on the ground. We thought there was a continued risk of that, and we did not want the president at that time, who had a high number of followers, a really big microphone, and a pattern of celebrating violence, to be able to further stoke violence.

The question that a lot of critics of Facebook inevitably are asking, once you did do it, is why not sooner? Right? You say the President was glorifying violence. Well, what was it when, after the Charlottesville violence, which included a death, the President said there were fine people on both sides? You know, why wasn't that a celebration of violence? Why weren't other comments that Trump has made over the course of his presidency comparable violations of policy, such that only now, when there was an attack on the Capitol, was he actually de-platformed?

Well, first, I would say this isn't the first time we removed content from the President.
Any time that we have a controversial post by any world leader, and this has happened a number of times, including with President Trump and other high-profile leaders in the United States, any time we have a post that is close to the line, we have to look at what the most natural reading of that post is. When you look at that video and he's saying I love you and thank you, one could argue that he was addressing the protesters generally and not those who had breached the Capitol. We thought the most natural reading, since he was also saying, okay, go home peacefully, we thought the most natural reading was that he was referring to those who had engaged in the violence.

That gets you to the taking down of the specific content. But this decision was different, because this was an indefinite suspension, what at least colloquially one would call a de-platforming, which is a bigger deal than taking down content that occurred in the past. Was there a specific rule that you could point to that merited the de-platforming, the indefinite suspension, rather than the taking down of the content?

Yes. So basically, the first time that somebody violates one of our policies, unless it's a really severe violation, for instance, if somebody posts child sexual abuse material, then we would immediately take down the account. But for violations that fall into a general category, such as a bullying violation or a celebration of violence violation, the first time is usually just a warning. But if there is a second violation within a period of time, then the consequence is generally a twenty-four-hour ban on that person posting on our services. So your account is still there, you just can't post anything. And if you recall that day, we came out and we said we've removed the President's content and he is banned from posting on our services for twenty-four hours. So that was just a straight application of the policy.
But what we then did the next day was say, we're going to suspend that privilege to post indefinitely because of the fear of further violence, and that we would at least have that in place through the transition of power. So that part, the extension from twenty-four hours to the indefinite suspension, was based on circumstances on the ground and not just routine application of our policies. I will say, of course, that with the consequences, you know, do we take down somebody's account for a certain number of strikes or do we ban them, sometimes we do exercise some judgment. We'll look at somebody's account, for instance, and say, well, in this case, we think this post was very borderline, and this person didn't get notice of his or her violation, so we're actually not going to remove the page, we'll give them a final warning. That sort of thing is fairly routine. Here it went in the other direction. We said, we think we need to extend the ban at least through the transition of power and indefinitely after that.

What would you say to a skeptic who said, okay, I accept that you have to exercise judgment. But why is it that through the entirety of Donald Trump's presidency that judgment did not involve his being suspended? And then after Congress stayed up all night and voted, ultimately, that the election was over and that he had lost, then suddenly the exercise of judgment went against Trump. You know, when he posed less of a threat to the company.

Well, like I said, we had removed content from the President's account before. He had not hit the threshold that would trigger the twenty-four-hour ban. So that's just the application of our policies. I will say that one of the questions we've gotten in the wake of this decision is, what about other world leaders? What about world leaders who are seen by the international community, or the human rights community, as real bad guys, and why don't you remove them from your service?
And what I can say there is, again, we remove content when it violates our policies. We have removed content from other world leaders, and that includes praise of violence. It also includes sharing misinformation about COVID nineteen. So we do remove that content, but we only impose those additional consequences when it's called for under our policies.

You mentioned the other world leaders. This goes to one of the principles that's in your statement of values, and the first of them does acknowledge that sometimes, because there's a preference that you have for freedom of expression, especially on political topics, sometimes elected officials will say or do things that would otherwise violate your policies and you don't take them down, because you think that those things serve positive news value. How does that interact with the fact that somebody is a world leader? I mean, is that basically a reason to be more permissive with respect to statements by world leaders under the policy?

The newsworthiness policy is a little bit different than that, and actually rarely do we use it with world leaders or politicians. Basically, in our community standards, we say here's what's prohibited, and then we say, if we think the value for the public in seeing something outweighs the safety risk, because of the item's newsworthiness, then we may leave it up even if it violates our policies. And we do apply that newsworthiness policy regularly, I think probably most often in the context of, say, there's a nude art exhibit, or there's an image of graphic violence in the context of somebody raising awareness about a war, and it shows a nude child or something like that, where we would say this is newsworthy, so we're going to leave it up. We think that the risk to safety is far outweighed by the value of people seeing this content. In a small handful of cases, we have used that policy to leave up content posted by world leaders or politicians, but that's fairly rare.
But generally it would include something where we think there's no real safety risk, and we think that people should be able to see that this politician engaged in this speech, which is likely distasteful, or there's probably something about it that's problematic, but not unsafe.

I want to turn now to the Oversight Board, which I helped advise on and you helped construct and build. And in fact that's how we met, when I came out to Menlo Park, back when people still traveled places, at the very, very early stages, to think and talk about potential Oversight Board directions. And sure enough, the baby's all grown up. And I mean, so far the Oversight Board has decided, I guess, six and a half cases... yeah, five and a half cases... five and a half cases, none of them on the scale of this decision. This is a huge decision, huge for the company, huge for the Oversight Board. Possibly not insignificant for politics in the United States, given that more than seventy million people voted for Donald Trump and lots of Republicans seem to believe that he still has a big influence within his party. So I guess the first question I have is, do you think they're ready for it?

I do. I do think they are, and I think the decisions they just put out show that. So basically, as you know, but people listening may not know, the Oversight Board was constructed to be an independent check on the decisions that my team is making, that Facebook is making, on removing people's content. And when they issued their most recent decisions, their first decisions, the slate of them, there are five opinions that they put out, and then there was one decision they couldn't make because the post was actually removed by the person who had posted it. But in their five decisions, they really explained their thinking.
They demonstrated real seriousness and sophistication, and I think... I'm really excited about how they approached these first cases and the potential for them to decide really important cases in the future.

Starting with the decision to indefinitely suspend and remove content from President Trump's account: in most of the cases that they heard in this first tranche, the Oversight Board flipped the decision that Facebook had made. How are you going to feel if they flip you on this one too?

Well, we referred it to them because we think they should get to make this decision. So, you know, we're looking forward to that. And I will say, the criteria for us... So the way the board can get a case: a user could appeal to them, or Facebook could say, boy, this is really hard and really significant, and we think that somebody else should be making this decision. And in this case, the Trump decision, we decided to refer this to the Oversight Board. The criteria we use are, is it a significant decision? It clearly is. And is it a difficult decision? And here, the fact that we have had people, some people, saying, why wasn't it a permanent ban, why didn't you do it sooner, and we've had other people saying, I can't believe Facebook would remove a sitting president's ability to post, really shows how difficult this is. So I think they are the right group to decide it. And we didn't just ask them, tell us whether or not we were right to remove this particular video and impose this indefinite suspension. We said, tell us how we should think about removing content from, or indefinitely suspending, world leaders or those in positions of power in countries around the world. So this is something that really does have a global implication.

We'll be back in a moment.
One of the things that's in process right now is that Donald Trump is being tried for impeachment in front of the US Senate, and counting noses, it seems much more probable than not that he will not be convicted by the Senate. And according to a norm that is in place, despite the fact that I don't like it very much, when a president is not convicted by the Senate because there's no two-thirds vote to convict him, that president usually says, I was acquitted by the Senate. I'm not sure I love the word acquitted in that context, because it's nothing like acquittal in front of a jury, as you know as a former prosecutor. But the president is likely to say, if the Senate doesn't convict him, I was acquitted. And whether it's too late for him to say it to the Oversight Board or not, in public I would think that Trump or his supporters are likely to say, listen, Facebook, who are you to second-guess the Senate of the United States? You know, the impeachment of a president is a bit like an indictment, then he gets a trial. I was tried, I was acquitted, I'm not guilty of incitement, and therefore you should reinstate me. At least that's what I would say if I were supporting Donald Trump in this effort. Should the Oversight Board care about that? Should it matter at all that there's been a public political process prescribed by the Constitution, and if at the end of that process it turns out that Trump is not removed, does that matter?

I think it's really for the Oversight Board. I mean, that's why we have them. So I actually won't give an opinion on that.

Because you don't want to unduly influence them? Or... it's not because you don't want to take a stand on it?

Well, I just don't think... I don't think it's really my role. I think the reason we have them is because we think they should be able to make that sort of decision.
Monica, tell me about how, and maybe it's a little soon to say this, but how does your job, and the job of your whole team that works on content policy, change in a world where there's now this oversight board to review the decisions that you guys made? How does that affect you when you go to work in the mornings?

Well, I can just tell you my personal reaction to the first slate of decisions, which was, I was very happy and felt like we got clear direction from them. And this is not about reinstating three posts that we had removed, that we ended up reinstating after their decisions. It wasn't about that so much as it was about the other guidance they gave us about why they thought we had to reinstate these posts. And so, for instance, things like, you need to provide more granular information about your COVID nineteen misinformation policies, or, you need to... it was operational advice about what we need to tell people about whether review is automated or using human beings. And it was process advice about ensuring that people have the ability to be reheard, to appeal our decisions. That's the kind of guidance that can help us know where to invest from an operational standpoint or a product standpoint.

And how do you and the company view those recommendations? They're kind of a subtle area, right? I mean, the board is empowered to give you non-binding recommendations, but it's also true that if the board makes something necessary to its decision in a case, then arguably that would be binding. So how do you figure out what it is in a given situation?

Well, I should say, we're going to take... we have thirty days, under the process that we've devised, thirty days to digest the decisions and respond to them publicly, and we'll respond to them in newsroom posts. So we'll have to look into the specifics of each of their recommendations before we have an answer to give on that.
But more generally, the process we have is, if they tell us that a specific piece of content should be up or down, we will honor that and we will implement that right away. And we've done that with the decisions that they gave us. If there is other content that is identical in terms of what it's saying, and basically it's in a parallel context, it's being used the same way, then we will try to find that and make the same decisions. So, for instance, if they tell us, hey, this particular meme that you... this is, I'm making this up. But if they said, this meme that you removed for hate speech does not violate and you should reinstate it, we might look for other instances where we had removed that same meme and say, okay, if it was shared without a caption and it was shared in the same way, we're going to reinstate that right away. So that's part of implementing the binding part of their decisions. The policy guidance stuff, including them saying, for instance, you know, you should look at the comments under a post in your evaluation, or you should have an automatic right of appeal to a human being to re-review content, that is not binding on us.

You mentioned COVID misinformation as a currently very important question, one that the Oversight Board has already referred to, and obviously it takes up a lot of your own thinking and time. How, broadly speaking, has the company decided to think about COVID misinformation? And I'm thinking now especially about vaccine-related misinformation, as we head into a period where, for the moment, there's still a question of getting enough vaccines to the people who want them. But at some point, with any luck, there'll be a shift and we'll start wondering about what they call vaccine hesitancy, which does seem to me like a major, major, major euphemism.
People who don't want to be vaccinated, and if they don't want to be vaccinated, that may be on the basis of a view of the world that from a scientific perspective might be counted as misinformation. So how is the company thinking about that?

Oh, this is such a... this is a) so difficult and b) so important. And we've been focused on this since last January, I mean since the pandemic first began, and we've been working closely with health authorities, most notably probably the World Health Organization and the CDC in the US, to get their guidance on how we should be thinking about and responding to COVID misinformation. By the way, misinformation is just part of it. We have a number of COVID-specific policies. One that's interesting, that maybe we could talk through sometime, is what to do with commerce offers to sell masks or COVID test kits, especially when there are shortages or when things aren't necessarily reliably certified. So there are a lot of COVID-specific policies, but specifically in the area of misinformation, we've developed really a two-pronged approach. One is removing, or down-ranking and labeling, content that is misleading or inaccurate, and then the other prong is really aggressively promoting accurate information about the vaccine, about treatments, about the virus itself. And so, I mean, just to... I think we put up the COVID Information Center, I think we got that going, in March or maybe earlier of twenty twenty, and we have had hundreds of millions of people visit that, which is encouraging, and part of that is because we're blasting notifications trying to direct people to that center. In fact, in just December twenty twenty, we had more than one hundred and thirty million people visit that information center. So that's one thing we're doing.
But as you say, actually responding in the moment to misinformation that's shared on the platform is also really, really important. And so we are removing, and these are criteria that we've worked out with health organizations, we're removing content that falls into a number of categories where, if somebody believed it, it would contribute to the risk that that person would get COVID or would spread the virus. So, for instance, false statements, false claims about the disease being no worse than the flu, or the virus doesn't really exist, or poor people are immune from it, they can't get it, or five G causes COVID, all of them.

What about people who say... what about someone who says, I just don't believe these vaccines will work, and I believe they'll do harm, and I'm not going to take them?

That's something we would allow. If somebody says "I personally," and then they're giving their own experience, we would generally allow that. If they are saying something as a statement, so something like, you know, I've looked at it, the vaccines don't work, or the vaccines cause infertility, or did you see this study or this article, and then they're citing something that is inaccurate, we would remove that. What gets really tricky, and this is sort of where your comment goes, what gets tricky is statements of personal questioning or personal testimonials. So let's say somebody says, I just got the first vaccine shot. I've never been this sick. It's really, really horrible. If I had it to do all over again, I don't think I would have gotten it. What do you do with that? That's just a person stating his or her own opinion. What about somebody stating facts, like, my sister got the vaccine on Monday. On Wednesday, she was diagnosed with pancreatic cancer.
With everybody, with the high number of people getting vaccinated, some people are going to have heart attacks the next day, not because of the vaccine, but because they were always going to have a heart attack on Tuesday.

And so it goes further than that. There will be some people... I don't know about further, but there'll be cases of people who get vaccinated and then the next day have COVID, because they were infected before they got the vaccine. I mean, that's going to happen to some people, right?

So that's one of the tricky questions for us, how do we deal with this sort of testimonial content.

And what's your approach been?

With personal testimonials, we generally allow it. I mean, if we see a case where it looks like somebody is intentionally trying to skirt the policies, maybe this is a financially motivated actor, or maybe this is somebody who is generally sharing conspiracy theories and there's something more going on there, we might take a different approach. But if this is just a regular person who is sharing a personal experience, our general policy is to allow it. There's an in-between approach too. So we remove content where we think it can contribute to the spread of the virus, and there are all different kinds of claims that fall into that, but it's generally stuff about diminishing the seriousness of the virus, or saying that there are cures that there aren't, or discrediting the vaccines. But then there is also content that we demote, meaning it won't get the same distribution on Facebook, and we put labels on it that direct people to that COVID Information Center, and that will include content like, the vaccine is man-made, this is all a big conspiracy. And there, there's not so much a safety risk, but we do want to make sure that people are actually getting accurate information about the virus.
So just to basically put some numbers on it: since March, and I think these numbers go through maybe October, we removed about twelve million posts for COVID misinformation, and I think since then, in December, we've removed maybe just over four hundred thousand such posts. So that kind of gives you the general idea of scale. And then in terms of demoting and labeling content where there's not a safety risk but it's still widely debunked misinformation, it's more than one hundred and sixty million posts in that same timeframe that we've labeled. So as a proportion, it's almost... it's more than ten times as much down-ranking or labeling compared to removing content.

I want to ask you about the future of content moderation in that way. I mean, is that characteristic of where you see your whole field going? I mean, is there going to be more and more down-ranking and labeling rather than removal, or in addition to removal, or do you think those numbers are likely to remain relatively stable going forward in terms of the ratio?

When it comes to misinformation, I think labeling will become more and more important. Now, Facebook already does it quite broadly. We've done this since twenty seventeen. We work with more than eighty fact-checking partners, not just on COVID misinformation, this is on misinformation about any topic. If it's going viral and a fact-checking agency that we work with wants to fact-check it, then they can label it and we will down-rank it and we will apply the label to it. That's something that we already do quite broadly. But I think this is an approach that you're starting to see some of the other platforms get into. In fact, you saw it in the run-up to the election, and I wouldn't be surprised if it's something that maybe increases in its importance as an important tool.
592 00:34:30,396 --> 00:34:33,836 Speaker 1: One of the criticisms that I've heard a lot, sometimes 593 00:34:34,156 --> 00:34:37,196 Speaker 1: directed at the oversight board, but more broadly directed against 594 00:34:37,236 --> 00:34:41,436 Speaker 1: content moderation, is that in a sense, it's all very 595 00:34:41,436 --> 00:34:43,396 Speaker 1: well and good. Everyone says, it's good that you're doing that, 596 00:34:43,436 --> 00:34:46,076 Speaker 1: it's nice that Facebook wants to do that. But 597 00:34:46,876 --> 00:34:51,396 Speaker 1: if the biggest, deepest social cost associated with Facebook is 598 00:34:52,236 --> 00:34:57,476 Speaker 1: people finding themselves in algorithmic bubbles where they mostly hear 599 00:34:57,716 --> 00:35:00,356 Speaker 1: what is referred to them by their friends, their family, 600 00:35:01,076 --> 00:35:04,156 Speaker 1: people of like mind, and if that drives polarization. 601 00:35:04,156 --> 00:35:07,156 Speaker 1: And these are very controversial claims, but I'm ventriloquizing what 602 00:35:07,436 --> 00:35:10,676 Speaker 1: critics often say. Then they say, you know, isn't it 603 00:35:10,676 --> 00:35:12,516 Speaker 1: just sort of a band aid to say we're taking 604 00:35:12,516 --> 00:35:15,716 Speaker 1: down the worst content or we're down ranking content that 605 00:35:15,756 --> 00:35:19,436 Speaker 1: we don't especially like. The strong form of the criticism 606 00:35:19,476 --> 00:35:22,236 Speaker 1: would say, all of the tools that you've placed in 607 00:35:22,236 --> 00:35:25,196 Speaker 1: content moderation or in content policy, those should go 608 00:35:25,356 --> 00:35:28,276 Speaker 1: to the very fundamental question of what the company allows 609 00:35:28,316 --> 00:35:30,276 Speaker 1: to be seen in the first place. You know, maybe 610 00:35:30,356 --> 00:35:34,236 Speaker 1: the news feed that Facebook produces should come under the 611 00:35:34,276 --> 00:35:38,356 Speaker 1: auspices of content policy, you know, should be similarly not 612 00:35:38,396 --> 00:35:40,876 Speaker 1: just checked for misinformation, which it is, of course, 613 00:35:40,916 --> 00:35:44,076 Speaker 1: but more broadly, should be part of a process of 614 00:35:44,116 --> 00:35:49,236 Speaker 1: trying to curate material in a way that minimizes polarization. 615 00:35:49,636 --> 00:35:51,316 Speaker 1: And obviously that's not the world that we live in now, 616 00:35:51,356 --> 00:35:54,596 Speaker 1: but it's a normative vision of how things could evolve 617 00:35:54,676 --> 00:35:56,876 Speaker 1: or develop in the future. When you hear that kind 618 00:35:56,916 --> 00:36:00,516 Speaker 1: of criticism, how do you tend to react to it? Well, 619 00:36:00,516 --> 00:36:02,676 Speaker 1: you know, one of the things I think that points 620 00:36:02,676 --> 00:36:08,236 Speaker 1: to is the power of us directing people to authoritative information, which, 621 00:36:08,276 --> 00:36:10,756 Speaker 1: like I said, the numbers on that actually are very good. 622 00:36:10,996 --> 00:36:13,156 Speaker 1: We have a COVID information center, we had a voting 623 00:36:13,196 --> 00:36:17,156 Speaker 1: information center before the US election.
We're building other information centers, 624 00:36:17,196 --> 00:36:19,796 Speaker 1: and what we're seeing is people actually do visit these 625 00:36:19,836 --> 00:36:22,036 Speaker 1: when they are directed to them in the moment, and 626 00:36:22,076 --> 00:36:25,956 Speaker 1: so that's something that I think can be effective against polarization. 627 00:36:26,396 --> 00:36:29,556 Speaker 1: For instance, in the run up to the election, we 628 00:36:29,556 --> 00:36:33,076 Speaker 1: were directing people very broadly, not just when there was 629 00:36:33,116 --> 00:36:35,956 Speaker 1: something false, but when people were discussing election related topics. 630 00:36:35,996 --> 00:36:38,556 Speaker 1: We were saying, get the facts here and pointing them 631 00:36:38,596 --> 00:36:43,156 Speaker 1: to a bipartisan, accurate set of resources. So I do 632 00:36:43,236 --> 00:36:45,356 Speaker 1: think that's important. The other thing I would say is, 633 00:36:46,436 --> 00:36:48,956 Speaker 1: because of the headlines and because of the understandable focus 634 00:36:48,956 --> 00:36:51,516 Speaker 1: on the election recently, I think there's a misperception that 635 00:36:52,236 --> 00:36:57,036 Speaker 1: Facebook is all about news or politics, and in fact, 636 00:36:57,276 --> 00:37:00,476 Speaker 1: the news content, the percentage of Facebook content that is 637 00:37:00,556 --> 00:37:02,556 Speaker 1: related to the news, is very, very small. I think 638 00:37:02,596 --> 00:37:05,956 Speaker 1: it's less than five percent. And so when you think 639 00:37:05,996 --> 00:37:09,036 Speaker 1: about polarization going up, and I think there are some 640 00:37:09,196 --> 00:37:12,356 Speaker 1: studies out there that show this, polarization has been increasing 641 00:37:12,396 --> 00:37:16,076 Speaker 1: politically in the United States for decades, and there are 642 00:37:16,156 --> 00:37:20,156 Speaker 1: many reasons for that. So it will not be enough 643 00:37:20,836 --> 00:37:24,276 Speaker 1: for social media companies alone to say, well, we're going 644 00:37:24,276 --> 00:37:25,956 Speaker 1: to take this one approach. This is something that we 645 00:37:25,996 --> 00:37:29,916 Speaker 1: have to work on as a broader society. A last question, Monica, 646 00:37:29,916 --> 00:37:33,276 Speaker 1: and again this comes from skeptics. They'll say, well, look, 647 00:37:33,636 --> 00:37:35,356 Speaker 1: the oversight board is great, but it's only going to 648 00:37:35,396 --> 00:37:38,316 Speaker 1: hear a handful of cases. What about all the other 649 00:37:38,396 --> 00:37:43,116 Speaker 1: cases where every day Facebook is making decisions about content 650 00:37:43,196 --> 00:37:47,156 Speaker 1: posted by users who get a lot of engagement, including 651 00:37:47,596 --> 00:37:52,956 Speaker 1: people whose values and views might threaten the content policy standards? 652 00:37:53,436 --> 00:37:56,556 Speaker 1: How do you assure, or try to assure, the general 653 00:37:56,596 --> 00:38:01,476 Speaker 1: public that the company's profit incentive, which goes alongside engagement, 654 00:38:02,396 --> 00:38:08,716 Speaker 1: is not enough to overcome the counteracting principle of enforcement 655 00:38:09,196 --> 00:38:11,836 Speaker 1: that you and your team are charged with implementing?
656 00:38:12,996 --> 00:38:14,916 Speaker 1: I guess the skeptical way of putting it would be, 657 00:38:15,676 --> 00:38:17,596 Speaker 1: it's nice that the oversight board will oversee you some 658 00:38:17,676 --> 00:38:20,196 Speaker 1: of the time, but why should the rest of the 659 00:38:20,236 --> 00:38:23,276 Speaker 1: world trust you when the oversight board isn't looking? I 660 00:38:23,276 --> 00:38:25,196 Speaker 1: guess I have two answers to that, and one is 661 00:38:25,236 --> 00:38:27,996 Speaker 1: sort of a personal perspective, which is I've been in 662 00:38:27,996 --> 00:38:32,716 Speaker 1: this job now for eight years or so, and what 663 00:38:32,756 --> 00:38:36,996 Speaker 1: it looks like is my team of a couple hundred 664 00:38:36,996 --> 00:38:41,756 Speaker 1: people coming together with experts on speech and safety from 665 00:38:41,836 --> 00:38:44,796 Speaker 1: around the world and crafting a set of standards, which 666 00:38:44,796 --> 00:38:48,756 Speaker 1: we then apply with thousands of content moderators that use 667 00:38:48,796 --> 00:38:51,276 Speaker 1: the rules and the guidance that we give them. It 668 00:38:51,436 --> 00:38:56,716 Speaker 1: is not dictated by concerns about revenue. You know, 669 00:38:56,756 --> 00:38:59,876 Speaker 1: for instance, when we're designing our policies, 670 00:38:59,916 --> 00:39:02,436 Speaker 1: we don't talk to people on the sales 671 00:39:02,436 --> 00:39:04,356 Speaker 1: team about how that would affect revenue. That's not part 672 00:39:04,356 --> 00:39:07,996 Speaker 1: of what goes into this, and so that's one 673 00:39:08,076 --> 00:39:10,916 Speaker 1: personal assurance I would give. But in terms of the 674 00:39:10,956 --> 00:39:14,196 Speaker 1: Oversight Board's role on this, I think, yes, it's true, 675 00:39:14,196 --> 00:39:16,396 Speaker 1: they're only going to hear a small number of cases. 676 00:39:16,716 --> 00:39:19,436 Speaker 1: And even if we double the size of the oversight board, 677 00:39:20,756 --> 00:39:24,116 Speaker 1: you know, we make millions of decisions every week, so 678 00:39:24,276 --> 00:39:25,956 Speaker 1: the oversight board is not going to be able to 679 00:39:26,476 --> 00:39:31,156 Speaker 1: hear a significant percentage of those cases. But the decisions 680 00:39:31,196 --> 00:39:37,276 Speaker 1: that we saw them make have an effect in flagging 681 00:39:37,356 --> 00:39:40,236 Speaker 1: for us the broader concerns. It's not just about reinstating 682 00:39:40,276 --> 00:39:42,276 Speaker 1: a piece of content. Like I said, their guidance was 683 00:39:42,356 --> 00:39:45,316 Speaker 1: much more around what kind of notice has to be provided, 684 00:39:45,316 --> 00:39:47,956 Speaker 1: what kind of process has to be provided, and 685 00:39:48,196 --> 00:39:51,276 Speaker 1: that's the sort of guidance that will indeed lead to 686 00:39:51,356 --> 00:39:53,796 Speaker 1: us thinking about bigger questions that will affect all of 687 00:39:53,796 --> 00:39:57,316 Speaker 1: our users. Monica, thank you for taking us under the hood. 688 00:39:57,956 --> 00:40:01,076 Speaker 1: It's a complicated engine in there. We will look forward 689 00:40:01,156 --> 00:40:03,356 Speaker 1: to seeing what the oversight board does in the Trump 690 00:40:03,356 --> 00:40:06,436 Speaker 1: suspension case.
I myself will be looking forward to it with bated breath. 691 00:40:06,476 --> 00:40:09,036 Speaker 1: I mean, I am about as far from the capacity 692 00:40:09,116 --> 00:40:11,236 Speaker 1: to be objective about the oversight board as it's possible 693 00:40:11,276 --> 00:40:13,636 Speaker 1: for me to be about anything, apart maybe from my 694 00:40:13,676 --> 00:40:16,676 Speaker 1: actual children. But on the other hand, the oversight board 695 00:40:16,756 --> 00:40:19,076 Speaker 1: is in fact totally independent now, and independent not only 696 00:40:19,116 --> 00:40:21,996 Speaker 1: of Facebook, but certainly independent of me, and so I 697 00:40:22,116 --> 00:40:25,996 Speaker 1: myself am watching with fascination and not a little terror 698 00:40:25,996 --> 00:40:28,316 Speaker 1: to see how it all comes out. Well, 699 00:40:28,316 --> 00:40:37,156 Speaker 1: thank you so much, and thanks for the conversation. I 700 00:40:37,316 --> 00:40:42,236 Speaker 1: always learn so much when I'm speaking to Monica. The 701 00:40:42,276 --> 00:40:46,356 Speaker 1: truth is that we never really ask what happens behind 702 00:40:46,436 --> 00:40:50,396 Speaker 1: the scenes at the big social media platforms when speech, 703 00:40:50,516 --> 00:40:52,916 Speaker 1: whether that of an ordinary person or of Donald Trump, 704 00:40:53,196 --> 00:40:57,036 Speaker 1: is left up or taken down. Where Monica lives professionally 705 00:40:57,236 --> 00:41:01,196 Speaker 1: is an epicenter of a new form of power. It's 706 00:41:01,236 --> 00:41:03,756 Speaker 1: the power to decide who is heard. It's also the 707 00:41:03,796 --> 00:41:08,676 Speaker 1: power to amplify or decelerate the trajectory of information as 708 00:41:08,716 --> 00:41:13,236 Speaker 1: it heads through the world. This is a crucial historical 709 00:41:13,636 --> 00:41:18,076 Speaker 1: moment for the governance of content on social media, for 710 00:41:18,196 --> 00:41:21,236 Speaker 1: what content is allowed to remain and what content is 711 00:41:21,276 --> 00:41:26,676 Speaker 1: taken down. We are witnessing a deep interpenetration of how 712 00:41:26,796 --> 00:41:29,796 Speaker 1: the president's words and speech play out in the realm 713 00:41:29,956 --> 00:41:33,476 Speaker 1: of government, as in the impeachment, and how they play 714 00:41:33,516 --> 00:41:36,796 Speaker 1: out in the realm of communication across social media, as 715 00:41:36,796 --> 00:41:41,476 Speaker 1: we see with respect to Trump's suspensions. The outcomes of 716 00:41:41,596 --> 00:41:44,596 Speaker 1: each are going to matter for the way we think 717 00:41:44,636 --> 00:41:47,876 Speaker 1: about free expression in the United States and the world. 718 00:41:48,836 --> 00:41:52,356 Speaker 1: And when the Oversight Board reaches its decision about Donald Trump, 719 00:41:52,796 --> 00:41:55,116 Speaker 1: I will come back to you here on Deep Background 720 00:41:55,356 --> 00:42:00,796 Speaker 1: with the possibility of further discussion and conversation. In the meantime, 721 00:42:01,196 --> 00:42:04,716 Speaker 1: I'm watching the impeachment trial as closely as I know 722 00:42:04,796 --> 00:42:07,516 Speaker 1: how. I'd better be, because I'm on TV almost every 723 00:42:07,596 --> 00:42:09,676 Speaker 1: night this week trying to offer an opinion about it.
724 00:42:10,116 --> 00:42:13,316 Speaker 1: I'm sparing my Deep Background listeners those comments for the moment, 725 00:42:13,636 --> 00:42:16,796 Speaker 1: but as the trial develops, if important things come up 726 00:42:16,836 --> 00:42:19,436 Speaker 1: that we think are relevant to our listeners, I promise 727 00:42:19,516 --> 00:42:21,596 Speaker 1: to come back to them in the very near future. 728 00:42:23,196 --> 00:42:25,796 Speaker 1: Until the next time I speak to you, be careful, 729 00:42:26,196 --> 00:42:30,156 Speaker 1: be safe, and be well. Deep Background is brought to 730 00:42:30,196 --> 00:42:34,116 Speaker 1: you by Pushkin Industries. Our producer is Mo LaBorde, our 731 00:42:34,156 --> 00:42:37,316 Speaker 1: engineer is Martin Gonzalez, and our showrunner is Sophie 732 00:42:37,316 --> 00:42:41,836 Speaker 1: Crane McKibbon. Editorial support from Noam Osband. Theme music by 733 00:42:41,876 --> 00:42:45,756 Speaker 1: Luis Guerra at Pushkin. Thanks to Mia Lobell, Julia Barton, 734 00:42:46,036 --> 00:42:50,956 Speaker 1: Lydia Jean Cott, Heather Faine, Carly Migliori, Maggie Taylor, Eric Sandler, 735 00:42:51,036 --> 00:42:53,716 Speaker 1: and Jacob Weisberg. You can find me on Twitter at 736 00:42:53,756 --> 00:42:57,156 Speaker 1: Noah R. Feldman. I also write a column for Bloomberg Opinion, 737 00:42:57,276 --> 00:43:00,316 Speaker 1: which you can find at Bloomberg dot com slash Feldman. 738 00:43:00,796 --> 00:43:04,156 Speaker 1: To discover Bloomberg's original slate of podcasts, go to Bloomberg 739 00:43:04,196 --> 00:43:06,956 Speaker 1: dot com slash podcasts, and if you liked what you 740 00:43:07,036 --> 00:43:10,596 Speaker 1: heard today, please write a review, tell a friend. This 741 00:43:11,156 --> 00:43:12,036 Speaker 1: is Deep Background.