1 00:00:03,920 --> 00:00:09,400 Speaker 1: Uhmerica, I don't know why I opened the show this way. 2 00:00:09,760 --> 00:00:12,440 Speaker 1: I really, I really don't know why you did. I 3 00:00:12,480 --> 00:00:17,920 Speaker 1: don't know why I did that. Um, Jamie, Jamie, Jamie, Hi, Hi, 4 00:00:19,800 --> 00:00:23,919 Speaker 1: is this a podcast? Yeah, it's like, after all these years, 5 00:00:24,480 --> 00:00:27,479 Speaker 1: it's a podcast. It's a podcast. It's a podcast. Sometimes 6 00:00:27,560 --> 00:00:30,880 Speaker 1: podcast I get on the phone to podcast and I 7 00:00:30,920 --> 00:00:32,720 Speaker 1: think that I'm just talking to my friends, But then 8 00:00:32,760 --> 00:00:36,320 Speaker 1: I remember that every relationship in my life is dominated 9 00:00:36,600 --> 00:00:39,280 Speaker 1: by podcasting, and I don't even know how to interact 10 00:00:39,360 --> 00:00:42,000 Speaker 1: with people outside of the filter of a zoom call anymore. 11 00:00:42,040 --> 00:00:49,440 Speaker 1: That it is incredibly accurately. I really, that's actually the 12 00:00:49,720 --> 00:00:53,000 Speaker 1: saddest thing I've ever heard of my entire life. Thank you, Jamie. 13 00:00:53,080 --> 00:00:56,160 Speaker 1: How are you doing it to me? Like? I'm gonna agree? 14 00:00:56,440 --> 00:01:00,920 Speaker 1: I was? I was terrible, and then I was great, 15 00:01:00,920 --> 00:01:03,960 Speaker 1: and now I'm fine. Excellent. That sounds like a solid trajectory. 16 00:01:03,960 --> 00:01:06,240 Speaker 1: That's a good little hero's journey. That's like the that's 17 00:01:06,280 --> 00:01:08,640 Speaker 1: like The Green Night. More or less, I didn't see 18 00:01:08,640 --> 00:01:11,120 Speaker 1: that movie. It looked long. It is it is, but 19 00:01:11,160 --> 00:01:14,000 Speaker 1: it's quite a film. I would not see a movie 20 00:01:14,040 --> 00:01:18,400 Speaker 1: over an hour and forty minutes anymore. That's marred out. 21 00:01:18,520 --> 00:01:21,280 Speaker 1: You know what, I don't recommend for short movies as 22 00:01:21,319 --> 00:01:24,440 Speaker 1: Herbert West Reanimator or The Reanimator. Sorry, it was before 23 00:01:24,480 --> 00:01:27,120 Speaker 1: the Herbert West one, which was a Halloween movie. I 24 00:01:27,160 --> 00:01:30,720 Speaker 1: loved and then rewatched this Halloween and it was really 25 00:01:30,760 --> 00:01:34,480 Speaker 1: great up until one one really horrifying scene that I 26 00:01:34,520 --> 00:01:37,959 Speaker 1: had kind of forgotten Reanimator before. It's got some amazing 27 00:01:38,000 --> 00:01:39,840 Speaker 1: stuff in it if you're a horror movie fan, and 28 00:01:39,880 --> 00:01:44,440 Speaker 1: then there is an incredibly uncomfortable sexual assault scene. Um 29 00:01:44,640 --> 00:01:50,160 Speaker 1: that is like really bad, like really bad. Wow, thank you. 30 00:01:50,200 --> 00:01:52,680 Speaker 1: I think we're opening this episode in a really strong, 31 00:01:53,040 --> 00:01:56,280 Speaker 1: powerful than you focused way. I want to recommend a 32 00:01:56,280 --> 00:02:00,400 Speaker 1: horror movie to you. If it's Midsummer, I'm not going 33 00:02:00,440 --> 00:02:04,320 Speaker 1: to listen. It's not. It's not. I only watch Midsummer 34 00:02:04,360 --> 00:02:07,480 Speaker 1: to get all horny for Will Poulter, and then I 35 00:02:07,520 --> 00:02:11,920 Speaker 1: turned it off Shifter he dies. But that's a reasonable 36 00:02:11,960 --> 00:02:14,840 Speaker 1: thing for a person to say. 
Absolutely, when the person 37 00:02:14,919 --> 00:02:16,799 Speaker 1: that I want to have sex with dies, I turn 38 00:02:16,840 --> 00:02:19,600 Speaker 1: off the movie because it's boring. After that, it's a 39 00:02:19,720 --> 00:02:22,480 Speaker 1: very healthy way to go through life. Thank you so much. Okay, 40 00:02:22,560 --> 00:02:24,960 Speaker 1: So the movie, which I will show you at some 41 00:02:25,000 --> 00:02:29,960 Speaker 1: point is called Pin heard of Pin so Pin Pin 42 00:02:30,360 --> 00:02:35,080 Speaker 1: is a It's about a pediatrician who has like a 43 00:02:35,120 --> 00:02:39,400 Speaker 1: life size medical dummy sitting in his office. Oh, this 44 00:02:39,440 --> 00:02:42,680 Speaker 1: doesn't seem like it's going a good place. The pediatrician 45 00:02:43,000 --> 00:02:46,280 Speaker 1: the only way. It's like a psychological thriller. It's not 46 00:02:46,400 --> 00:02:49,840 Speaker 1: really that gory. But the pediatrician the only way that 47 00:02:49,880 --> 00:02:53,720 Speaker 1: he can communicate with emotional honesty with his children is 48 00:02:53,760 --> 00:02:58,080 Speaker 1: by making a little ventriloquist ventriloquist voice for the dummy. 49 00:02:58,160 --> 00:03:02,720 Speaker 1: And so the ventriloquist dummy gives the kids sex head 50 00:03:02,840 --> 00:03:07,680 Speaker 1: the ventrala because dummy does all this stuff, and then um, 51 00:03:07,760 --> 00:03:10,040 Speaker 1: and then one of the kids thinks that the ventrals 52 00:03:10,160 --> 00:03:13,760 Speaker 1: quest dummy whose name is Pin, is real. And then 53 00:03:13,840 --> 00:03:16,960 Speaker 1: Pin starts to like control his thoughts and actions. And 54 00:03:17,000 --> 00:03:20,560 Speaker 1: then there's a scene where a nurse has sex with Pin, 55 00:03:21,080 --> 00:03:24,480 Speaker 1: but he's just a dummy. Okay, it's the greatest movie 56 00:03:24,480 --> 00:03:27,040 Speaker 1: I've ever seen. I'm not doing it justice that that 57 00:03:27,040 --> 00:03:29,800 Speaker 1: that sounds like quite a film, Jamie. That sounds like 58 00:03:29,840 --> 00:03:33,640 Speaker 1: quite a film. Mars has sex with Pin, Robert and 59 00:03:34,720 --> 00:03:36,840 Speaker 1: I don't. Yeah, I mean that definitely does sound like 60 00:03:36,880 --> 00:03:41,480 Speaker 1: a scene that would make me have very specific physical reaction. 61 00:03:41,520 --> 00:03:43,240 Speaker 1: And the best part about that scene is that the 62 00:03:43,320 --> 00:03:45,720 Speaker 1: nurse never comes back and it's never addressed in the 63 00:03:45,760 --> 00:03:49,320 Speaker 1: movie again. And that now that now you're speaking the language, 64 00:03:49,480 --> 00:03:54,000 Speaker 1: the language of shoddy filmmaking, something horrible happens and then 65 00:03:54,400 --> 00:03:57,160 Speaker 1: you just canonically have to forget in order to watch 66 00:03:57,240 --> 00:03:59,800 Speaker 1: the rest of the movie. No, I'm on board with that. 67 00:04:00,040 --> 00:04:02,440 Speaker 1: And you know, Jamie, now that you mentioned quitting a 68 00:04:02,440 --> 00:04:03,920 Speaker 1: movie as soon as the person you want to have 69 00:04:03,960 --> 00:04:06,480 Speaker 1: sex with dies, that may explain where I've never made 70 00:04:06,520 --> 00:04:09,080 Speaker 1: it past the halfway point in the first Star Wars movie. 71 00:04:09,400 --> 00:04:12,200 Speaker 1: Really wow, Yeah, once Alec Guinness is out of the picture, 72 00:04:12,240 --> 00:04:15,160 Speaker 1: why even keep watching? 
You know? See? Well, yeah, you're like, well, 73 00:04:15,200 --> 00:04:17,479 Speaker 1: the hottest person is gone. I don't want not a 74 00:04:17,560 --> 00:04:20,960 Speaker 1: single fuckable face. And the rest of that film a 75 00:04:21,040 --> 00:04:24,719 Speaker 1: bunch of ug goes Jamie. So you know who else 76 00:04:24,800 --> 00:04:28,400 Speaker 1: is a bunch of ug? Gosh that we did it. 77 00:04:28,560 --> 00:04:35,600 Speaker 1: It's so perfect. The people who run Facebook, it is 78 00:04:35,680 --> 00:04:38,680 Speaker 1: it is. We're talking today about the Facebook papers, which 79 00:04:38,760 --> 00:04:40,960 Speaker 1: is we'll talk about this a little bit more in detailed, 80 00:04:40,960 --> 00:04:43,839 Speaker 1: but an enormous cash of internal Facebook documents that just 81 00:04:43,880 --> 00:04:46,159 Speaker 1: got leaked revealing a tremendous amount of funked up shit. 82 00:04:46,839 --> 00:04:49,800 Speaker 1: And I think we have to start with the uncomfortable 83 00:04:50,000 --> 00:04:54,240 Speaker 1: situation that is everybody talking shit about how Mark Zuckerberg 84 00:04:54,240 --> 00:04:56,840 Speaker 1: looks like an android. I feel so mixed about it 85 00:04:56,880 --> 00:05:00,960 Speaker 1: because on one hand, yes, the thing that's bad about 86 00:05:01,040 --> 00:05:04,080 Speaker 1: him is not his appearance, but also yes, he does. 87 00:05:04,600 --> 00:05:07,800 Speaker 1: He does hit the uncanny valley. There's something missing in 88 00:05:07,839 --> 00:05:10,599 Speaker 1: his eyes. Look, there's something and and that's ivy league 89 00:05:10,600 --> 00:05:13,800 Speaker 1: boy syndrome, right, Like, that's not just him, that's anyone 90 00:05:13,839 --> 00:05:20,880 Speaker 1: who who graduates screen photos. I truly. I mean, even 91 00:05:20,920 --> 00:05:22,960 Speaker 1: though we did just refer to the entire cast of 92 00:05:23,040 --> 00:05:26,520 Speaker 1: Star Wars one as a bunch of ugos hideous, do 93 00:05:26,760 --> 00:05:31,360 Speaker 1: feel that it's like the worst, the most lazy thing 94 00:05:31,440 --> 00:05:35,000 Speaker 1: you could do is go after how someone looks when 95 00:05:35,000 --> 00:05:37,240 Speaker 1: there are so many other evil facets of him. I 96 00:05:37,279 --> 00:05:39,400 Speaker 1: will agree that there is no light in his eyes. 97 00:05:39,760 --> 00:05:42,599 Speaker 1: There's certainly no light in his eyes. There's nothing like 98 00:05:42,640 --> 00:05:46,400 Speaker 1: the pupils are very There's not that little thing that's 99 00:05:46,400 --> 00:05:48,440 Speaker 1: supposed to be there is not there. Yeah, he could 100 00:05:48,440 --> 00:05:50,279 Speaker 1: watch a kitten get murdered and it would just be 101 00:05:50,320 --> 00:05:53,400 Speaker 1: a dial tone inside his soul. He looks like a 102 00:05:53,520 --> 00:05:57,520 Speaker 1: character from the Polar Express. He looks he looks like 103 00:05:57,600 --> 00:06:00,920 Speaker 1: Herbert West from the movie Rhianna of Matter, but less 104 00:06:00,920 --> 00:06:05,200 Speaker 1: cares to check that he looks like Pin. Look up Pen. 105 00:06:05,680 --> 00:06:08,839 Speaker 1: I'm not going to look up Pen. At this point 106 00:06:08,880 --> 00:06:11,919 Speaker 1: in the history of the show, Jamie, we've recorded a 107 00:06:11,920 --> 00:06:14,520 Speaker 1: lot of podcast episodes about Facebook. You have been there 108 00:06:14,560 --> 00:06:17,960 Speaker 1: for what three of them? Um? Yeah, I forget. 
I 109 00:06:18,000 --> 00:06:20,800 Speaker 1: forget entirely how many episodes we did on Facebook. We 110 00:06:20,839 --> 00:06:23,359 Speaker 1: did three episodes on like the creation of Facebook, and 111 00:06:23,400 --> 00:06:25,720 Speaker 1: it's kind of a brief list of its crimes. I 112 00:06:25,720 --> 00:06:27,839 Speaker 1: think we did at least one follow up, maybe two 113 00:06:27,839 --> 00:06:32,120 Speaker 1: follow ups. Um. And then we've mentioned Facebook foppery fakery 114 00:06:32,320 --> 00:06:34,400 Speaker 1: in episodes of Like It could happen here in Worst 115 00:06:34,440 --> 00:06:36,599 Speaker 1: Year Ever. Facebook is personal to me for a couple 116 00:06:36,640 --> 00:06:39,800 Speaker 1: of reasons. Number one, number of people who helped raise 117 00:06:39,880 --> 00:06:42,240 Speaker 1: me have slowly lost their grip on reality and the 118 00:06:42,240 --> 00:06:46,160 Speaker 1: face of viral propaganda spread be a via Facebook's engagement algorithm, 119 00:06:46,240 --> 00:06:49,200 Speaker 1: and that's kind of bumped me out. Um yeah. And 120 00:06:49,360 --> 00:06:51,520 Speaker 1: number two, my friends and I all lost our jobs 121 00:06:51,520 --> 00:06:53,599 Speaker 1: in the company we built for a decade due to 122 00:06:53,720 --> 00:06:56,719 Speaker 1: the fact that Facebook told criminal lies about video metrics 123 00:06:57,000 --> 00:06:59,320 Speaker 1: that they have currently been find forty million dollars for, 124 00:07:00,279 --> 00:07:03,000 Speaker 1: which also frustrating. Actually over it, it sounds like you're 125 00:07:03,040 --> 00:07:05,720 Speaker 1: really over it. And you know, I I we've talked 126 00:07:05,760 --> 00:07:07,920 Speaker 1: about this. I don't know if it was on micro net, 127 00:07:07,960 --> 00:07:10,320 Speaker 1: but yeah, I also asked my job to that. I 128 00:07:10,360 --> 00:07:13,760 Speaker 1: mean we worked at the same place. Yeah, worked it 129 00:07:13,880 --> 00:07:17,360 Speaker 1: like yeah, I mean they all the whole industry went 130 00:07:17,440 --> 00:07:20,840 Speaker 1: up in flames. It's so sace. I'm still I'm still 131 00:07:20,880 --> 00:07:23,120 Speaker 1: mad about it. Yeah, I am mad about it, even 132 00:07:23,120 --> 00:07:25,560 Speaker 1: though like things have been going fine, great for me 133 00:07:25,680 --> 00:07:29,280 Speaker 1: career wise. Um, it's just like kind of bullshit. It's 134 00:07:29,360 --> 00:07:32,080 Speaker 1: kind of frustrating. It used to be. Did you catch 135 00:07:32,080 --> 00:07:35,640 Speaker 1: that Robert's career is going great? Yea? In my head, 136 00:07:35,680 --> 00:07:39,520 Speaker 1: I went, You're welcome. I'm a human rocket ship. I'm 137 00:07:39,560 --> 00:07:43,600 Speaker 1: the iggy pop of talking about Facebook Welcome. It's kind 138 00:07:43,640 --> 00:07:46,720 Speaker 1: of nice that we all just had to pivot to, like, Okay, 139 00:07:47,040 --> 00:07:49,480 Speaker 1: you can still talk about what you're passionate about, but 140 00:07:49,600 --> 00:07:51,400 Speaker 1: just no one has to look at you anymore. And 141 00:07:51,400 --> 00:07:53,880 Speaker 1: I'm like, that's actually not the worst thing. That's that ideal. 142 00:07:54,360 --> 00:07:56,520 Speaker 1: And in my case, it was like you don't have 143 00:07:56,600 --> 00:08:01,120 Speaker 1: to write articles. You just have to tell on a microphone, 144 00:08:01,240 --> 00:08:05,720 Speaker 1: which involves writing an article. But I don't know, they're easier. 
145 00:08:06,520 --> 00:08:09,520 Speaker 1: It's true. You can, like you can really have a 146 00:08:09,560 --> 00:08:11,720 Speaker 1: series of bullet points and be like, well, you just 147 00:08:11,760 --> 00:08:15,720 Speaker 1: don't have to edit anymore. That's that's what that's what 148 00:08:15,840 --> 00:08:17,840 Speaker 1: we that's what we got rid of in this pivot 149 00:08:17,880 --> 00:08:21,800 Speaker 1: to podcasting. Editors. We're all, we're all Greenwalding a 150 00:08:21,800 --> 00:08:24,440 Speaker 1: little bit. Yeah, it all worked out, but you know, 151 00:08:24,520 --> 00:08:29,240 Speaker 1: but also QAnon stealing the fact that there 152 00:08:29,240 --> 00:08:31,440 Speaker 1: will soon be death squads roving many of our neighborhoods. 153 00:08:31,480 --> 00:08:34,160 Speaker 1: Like there's there's downsides to it too, you know. Um, 154 00:08:34,640 --> 00:08:37,200 Speaker 1: but then also Meta, you know, and then yea Meta, 155 00:08:37,240 --> 00:08:39,280 Speaker 1: thank god, we're getting Meta. We'll be talking about Meta 156 00:08:39,320 --> 00:08:42,880 Speaker 1: at some point. Um. But yeah, like it's it's Facebook's bad. 157 00:08:42,920 --> 00:08:45,440 Speaker 1: I don't like Facebook. But in one of those episodes, 158 00:08:45,480 --> 00:08:47,480 Speaker 1: and I forget which of those episodes, I said something 159 00:08:47,480 --> 00:08:49,760 Speaker 1: along the lines of, at this point, there's no moral 160 00:08:49,840 --> 00:08:54,240 Speaker 1: case for remaining employed by Facebook. Um. And earlier this year, 161 00:08:54,280 --> 00:08:57,200 Speaker 1: a Facebook employee named Frances Haugen came to the same 162 00:08:57,240 --> 00:08:59,880 Speaker 1: conclusion on her own. Rather than jump out with her 163 00:09:00,040 --> 00:09:02,480 Speaker 1: stock options or whatever perks she accrued, and then get 164 00:09:02,480 --> 00:09:04,600 Speaker 1: another tech industry job, which is what a lot of 165 00:09:04,600 --> 00:09:06,960 Speaker 1: people do. I know people who have done this when 166 00:09:07,000 --> 00:09:09,880 Speaker 1: they were like, Facebook's kind of fucked up. I'm just 167 00:09:09,920 --> 00:09:12,920 Speaker 1: gonna go hop to another company and make even 168 00:09:12,920 --> 00:09:16,200 Speaker 1: more money. Uh. Instead of doing that, Frances spent weeks 169 00:09:16,200 --> 00:09:21,640 Speaker 1: painstakingly photographing internal... Facebook has its own internal communications network 170 00:09:21,720 --> 00:09:24,160 Speaker 1: that is patterned off of Facebook, but it's for like 171 00:09:24,200 --> 00:09:29,000 Speaker 1: the corporate the the employees to use. Um well, I 172 00:09:29,000 --> 00:09:32,160 Speaker 1: mean it's it's like it's like Slack, but probably more 173 00:09:32,200 --> 00:09:36,240 Speaker 1: all consuming and soul destroying. Um yeah, but she is. 174 00:09:36,679 --> 00:09:40,640 Speaker 1: There's like nothing worse. I mean, personality wise, I can't 175 00:09:40,679 --> 00:09:44,640 Speaker 1: stand someone who's like really killing it on Slack. That's 176 00:09:44,640 --> 00:09:47,760 Speaker 1: one of my least favorite traits in a person, as 177 00:09:47,800 --> 00:09:51,160 Speaker 1: if they're they're really giving it a percent on Slack. 178 00:09:51,280 --> 00:09:53,959 Speaker 1: I'm like, just I'm I'm asleep. I hate it.
No, 179 00:09:54,200 --> 00:09:56,640 Speaker 1: I mean, Sophie barely hears from me when we need 180 00:09:56,679 --> 00:09:58,920 Speaker 1: to work, let alone, when we don't need to work 181 00:10:01,760 --> 00:10:09,720 Speaker 1: Hundred percent not true. Good stuff. So rather than you know, 182 00:10:10,440 --> 00:10:12,839 Speaker 1: take the money and run, so she gets in this 183 00:10:12,920 --> 00:10:15,920 Speaker 1: like internal communications app that Facebook has and because they 184 00:10:15,920 --> 00:10:19,360 Speaker 1: have protections about like because like they know that this 185 00:10:19,440 --> 00:10:21,720 Speaker 1: is a risk, people leaking internal documents is a risk, 186 00:10:21,720 --> 00:10:23,840 Speaker 1: they have like security things set up and to get 187 00:10:23,840 --> 00:10:26,280 Speaker 1: past them, she just photographs a bunch of internal documents 188 00:10:26,280 --> 00:10:29,360 Speaker 1: on her phone. Um, a huge... Like she gets a 189 00:10:29,360 --> 00:10:31,920 Speaker 1: lot of shit. Uh, And then she leaks those 190 00:10:31,920 --> 00:10:34,840 Speaker 1: files to several news agencies and our good friends at 191 00:10:34,840 --> 00:10:37,880 Speaker 1: the SEC. UH. This week, we're going to go through 192 00:10:37,960 --> 00:10:40,120 Speaker 1: some of those documents and all of the damning shit 193 00:10:40,160 --> 00:10:43,600 Speaker 1: they reveal. The first and perhaps most important revelation is 194 00:10:43,640 --> 00:10:47,440 Speaker 1: this: many Facebook employees understand that their company is a 195 00:10:47,480 --> 00:10:49,800 Speaker 1: force for evil. Some of them have vowed to fix 196 00:10:49,840 --> 00:10:52,400 Speaker 1: it from the inside. Others are convinced the evil is 197 00:10:52,440 --> 00:10:55,720 Speaker 1: outweighed by some nebulous good. But at the core of it, 198 00:10:56,040 --> 00:10:59,080 Speaker 1: they know that what they were a part of is problematic, 199 00:10:59,160 --> 00:11:00,840 Speaker 1: and a lot of them hate themselves for it. You 200 00:11:00,840 --> 00:11:04,520 Speaker 1: can really see that coming across in some of these conversations, 201 00:11:05,760 --> 00:11:10,000 Speaker 1: evidence of the... Yeah, it's good stuff. Don't you love? Yeah, 202 00:11:10,000 --> 00:11:13,559 Speaker 1: when human beings compromise the very nature of their soul 203 00:11:13,920 --> 00:11:16,920 Speaker 1: uh in the in seeking profit? Yeah. And then and 204 00:11:16,960 --> 00:11:19,560 Speaker 1: then you know, you watch the you watch the light 205 00:11:19,640 --> 00:11:21,440 Speaker 1: leave their eyes, and then you're supposed to feel bad 206 00:11:21,480 --> 00:11:25,240 Speaker 1: for them. Uh. Evidence of the struggle over the soul 207 00:11:25,280 --> 00:11:27,920 Speaker 1: of Facebook can be found in the reactions of employees 208 00:11:27,960 --> 00:11:30,120 Speaker 1: to the growth of the Black Lives Matter movement. After 209 00:11:30,160 --> 00:11:33,959 Speaker 1: the murder of George Floyd by a cop that June, 210 00:11:33,960 --> 00:11:37,280 Speaker 1: as protests reached their height, a Facebook employee posted a 211 00:11:37,320 --> 00:11:41,560 Speaker 1: message to the company's racial justice chat boards stating, get Breit 212 00:11:41,640 --> 00:11:44,080 Speaker 1: bart out of News Tab.
He was enraged at the 213 00:11:44,080 --> 00:11:46,559 Speaker 1: fact that the far right publisher was pushing disinformation 214 00:11:46,600 --> 00:11:49,680 Speaker 1: about violence at protests, and included screen grabs of Breit 215 00:11:49,720 --> 00:11:54,199 Speaker 1: bart articles with titles like Minneapolis Mayhem, massive looting, buildings 216 00:11:54,240 --> 00:11:59,760 Speaker 1: in flames, bonfires exclamation point, and BLM protesters pummel police cars. 217 00:12:00,080 --> 00:12:02,680 Speaker 1: Wonder how much more attention they paid the police cars 218 00:12:02,679 --> 00:12:05,400 Speaker 1: than the man who was choked to death by a cop. Anyway, 219 00:12:05,520 --> 00:12:09,960 Speaker 1: good stuff, good journalism, nailing it. This employee claimed that 220 00:12:09,960 --> 00:12:12,680 Speaker 1: these articles were part of a concerted effort by Breitbart 221 00:12:12,720 --> 00:12:15,200 Speaker 1: and other right wing media sites to quote paint Black 222 00:12:15,240 --> 00:12:18,440 Speaker 1: Americans and Black-led movements in a negative way. He 223 00:12:18,559 --> 00:12:21,320 Speaker 1: argued that none of those hyperpartisan sites deserved to be 224 00:12:21,400 --> 00:12:25,959 Speaker 1: highlighted by the Facebook news tab. So Facebook's News Tab 225 00:12:26,160 --> 00:12:29,520 Speaker 1: consists of two tiers of content providers. Right, It's just 226 00:12:29,559 --> 00:12:30,960 Speaker 1: like the tab that tells you what's going on in 227 00:12:30,960 --> 00:12:33,400 Speaker 1: the world, and all of the people whose stories get 228 00:12:33,400 --> 00:12:36,119 Speaker 1: in there have been vetted to some extent by Facebook. 229 00:12:36,400 --> 00:12:39,040 Speaker 1: So there's a first tier of big publishers like the 230 00:12:39,040 --> 00:12:42,280 Speaker 1: New York Times, the Wall Street Journal, like the big dogs, 231 00:12:42,280 --> 00:12:44,319 Speaker 1: and they get paid. Facebook gives them money to be 232 00:12:44,360 --> 00:12:46,319 Speaker 1: a part of the News Tab. And then there is 233 00:12:46,360 --> 00:12:48,760 Speaker 1: a second tier of news sites who are not paid 234 00:12:48,800 --> 00:12:51,440 Speaker 1: but did have to get vetted as a reliable news 235 00:12:51,480 --> 00:12:53,600 Speaker 1: source for Facebook to put them on their News Tab. 236 00:12:53,960 --> 00:12:57,200 Speaker 1: And Breitbart is in that latter tier, which means 237 00:12:57,200 --> 00:13:00,480 Speaker 1: Facebook isn't giving them money directly, but is institutionally pumping 238 00:13:00,480 --> 00:13:03,880 Speaker 1: a shitload of of like traffic towards their propaganda and 239 00:13:03,920 --> 00:13:06,440 Speaker 1: throwing a lot of their propaganda out into people's news feeds. 240 00:13:06,840 --> 00:13:10,160 Speaker 1: In their public facing statements, Facebook claims to only include 241 00:13:10,160 --> 00:13:13,280 Speaker 1: sites on their News Tab who do quality news reporting. Sites 242 00:13:13,320 --> 00:13:17,320 Speaker 1: that repeatedly share disinformation, it claims, are banned. This functions 243 00:13:17,320 --> 00:13:21,120 Speaker 1: on a strike system. In July, President Trump tweeted a 244 00:13:21,160 --> 00:13:24,440 Speaker 1: Breitbart video claiming you don't need a mask to protect 245 00:13:24,440 --> 00:13:28,960 Speaker 1: against COVID nineteen. The video also spread misinformation about hydroxychloroquine.
246 00:13:29,280 --> 00:13:32,880 Speaker 1: Despite the fact that this video clearly violated Facebook's stated standards, 247 00:13:32,880 --> 00:13:34,800 Speaker 1: it was able to reach millions of people through the 248 00:13:34,800 --> 00:13:38,080 Speaker 1: News Tab before Facebook took it down. From the Wall Street Journal: 249 00:13:38,960 --> 00:13:41,720 Speaker 1: According to Facebook's fact checking rules, pages can be punished 250 00:13:41,720 --> 00:13:44,320 Speaker 1: if they acquire too many strikes, meaning they published content 251 00:13:44,400 --> 00:13:47,679 Speaker 1: deemed false by third party fact checkers. It requires two 252 00:13:47,720 --> 00:13:50,319 Speaker 1: strikes within ninety days to be deemed a repeat offender, 253 00:13:50,440 --> 00:13:53,240 Speaker 1: which can result in a user being suspended from posting content. 254 00:13:53,520 --> 00:13:56,840 Speaker 1: More strikes can lead to reductions in distribution and advertising revenue. 255 00:13:57,000 --> 00:13:59,840 Speaker 1: In a town hall, Mr. Zuckerberg said Breitbart wasn't punished 256 00:13:59,840 --> 00:14:02,200 Speaker 1: for the video because that was its only infraction in a 257 00:14:02,280 --> 00:14:05,720 Speaker 1: ninety day period, according to internal chats describing 258 00:14:05,760 --> 00:14:09,480 Speaker 1: the meeting. Yeah, now that seems wrong, right, knowing Breitbart, that 259 00:14:09,520 --> 00:14:12,800 Speaker 1: they would have one strike and that's... so I mean, 260 00:14:13,080 --> 00:14:16,160 Speaker 1: what was the reason that that video reached so many 261 00:14:16,160 --> 00:14:17,960 Speaker 1: people before it was taken down? Was that just 262 00:14:18,040 --> 00:14:21,120 Speaker 1: like a delay in fact checking, does that mean that 263 00:14:21,600 --> 00:14:24,320 Speaker 1: a certain number of people need to like? No? I mean? 264 00:14:24,320 --> 00:14:26,800 Speaker 1: Trump tweeted it and it spread and Facebook didn't want 265 00:14:26,800 --> 00:14:29,160 Speaker 1: to take it down until it had it had already 266 00:14:29,360 --> 00:14:31,640 Speaker 1: kind of made them some money, I think. I also 267 00:14:31,720 --> 00:14:33,200 Speaker 1: think it's just like they don't put a lot of 268 00:14:33,240 --> 00:14:35,400 Speaker 1: work into checking on this stuff. They don't. They don't 269 00:14:35,400 --> 00:14:36,840 Speaker 1: want to piss... We're talking about all this, but 270 00:14:36,880 --> 00:14:39,920 Speaker 1: like they also just don't want to piss any conservatives off. 271 00:14:40,000 --> 00:14:41,880 Speaker 1: Like there's a lot of things going into why this 272 00:14:41,920 --> 00:14:46,280 Speaker 1: stuff is not enforced to any degree. Now, you 273 00:14:46,480 --> 00:14:49,200 Speaker 1: expressed surprise at the fact that that Breitbart only had 274 00:14:49,200 --> 00:14:53,160 Speaker 1: one strike in ninety days. Let's talk about why. Yeah, So, 275 00:14:53,320 --> 00:14:56,160 Speaker 1: thanks to Frances Haugen's leaked documents, we now know that 276 00:14:56,200 --> 00:15:00,440 Speaker 1: Breitbart was one of the news sites Facebook considered managed partners. 277 00:15:00,480 --> 00:15:03,000 Speaker 1: These sites are part of a program whereby the social 278 00:15:03,040 --> 00:15:06,840 Speaker 1: network pairs handlers who work at Facebook with the website.
279 00:15:07,120 --> 00:15:09,920 Speaker 1: These handlers give Facebook a back channel to sites that 280 00:15:10,000 --> 00:15:13,360 Speaker 1: spread disinformation, which allows them to have content removed or 281 00:15:13,400 --> 00:15:16,720 Speaker 1: altered without giving the content maker a strike. So, in 282 00:15:16,720 --> 00:15:18,840 Speaker 1: other words, they put out the content, it gets viewed 283 00:15:18,880 --> 00:15:22,720 Speaker 1: millions of times. Facebook, one employee, like, messages an editor 284 00:15:22,760 --> 00:15:24,840 Speaker 1: and says like, hey, you need to change this now. 285 00:15:24,880 --> 00:15:27,240 Speaker 1: It gets changed after it's spread around, and they avoid 286 00:15:27,280 --> 00:15:29,280 Speaker 1: a strike and thus stay on the news tab. 287 00:15:30,200 --> 00:15:33,640 Speaker 1: That's a good way to do a back channel. That's 288 00:15:33,800 --> 00:15:36,800 Speaker 1: so dark, Okay, I mean that. I I guess if 289 00:15:36,800 --> 00:15:39,320 Speaker 1: you're looking for a way to keep misinformation up, that 290 00:15:39,520 --> 00:15:42,760 Speaker 1: is a logical way to go about it. Yeah. Yeah, 291 00:15:43,200 --> 00:15:46,120 Speaker 1: they do a perfect job. So it's actually, you're saying 292 00:15:46,160 --> 00:15:50,680 Speaker 1: that Breitbart is accurate. Yes, perfectly accurate. That is 293 00:15:50,720 --> 00:15:53,600 Speaker 1: what we always say about Breitbart and Andrew Breitbart, a 294 00:15:53,640 --> 00:15:56,320 Speaker 1: man who did not do cocaine until he died. 295 00:15:56,960 --> 00:15:59,840 Speaker 1: Um, talk about someone who's who's got light in his eyes. 296 00:16:00,920 --> 00:16:05,240 Speaker 1: Not anymore, it doesn't. So actual strikes were automatically escalated 297 00:16:05,240 --> 00:16:08,200 Speaker 1: for review by senior Facebook executives, who could decide to 298 00:16:08,240 --> 00:16:11,600 Speaker 1: overturn the punishment and remove a strike. Through these methods, 299 00:16:11,640 --> 00:16:15,400 Speaker 1: Facebook's strike system for spreading disinformation actually proved to be 300 00:16:15,480 --> 00:16:18,320 Speaker 1: nothing at all, and any sufficiently large right wing website 301 00:16:18,360 --> 00:16:22,040 Speaker 1: was given numerous opportunities to avoid strikes without being delisted. 302 00:16:22,360 --> 00:16:24,800 Speaker 1: This was a problem that went further than Breitbart, as 303 00:16:24,800 --> 00:16:27,680 Speaker 1: The Wall Street Journal reports. In an internal memo, the 304 00:16:27,720 --> 00:16:29,840 Speaker 1: engineer said that he based his assessment in part on 305 00:16:29,840 --> 00:16:32,640 Speaker 1: a queue of three dozen escalations that he had stumbled onto, 306 00:16:32,720 --> 00:16:35,120 Speaker 1: the vast majority of which were on behalf of conservative 307 00:16:35,160 --> 00:16:38,280 Speaker 1: content producers. A summary of the engineer's findings was posted 308 00:16:38,280 --> 00:16:40,960 Speaker 1: to an internal message board. One case he cited regarding 309 00:16:41,000 --> 00:16:44,120 Speaker 1: pro Trump influencers Diamond and Silk: third party fact checkers 310 00:16:44,200 --> 00:16:47,120 Speaker 1: rated as false a post on their page that said... Wait, 311 00:16:47,160 --> 00:16:50,200 Speaker 1: like porn stars? Wait, Diamond and Silk? Oh do you not 312 00:16:50,240 --> 00:16:53,600 Speaker 1: know about Diamond and Silk? Oh? They are? Are they a band?
313 00:16:54,120 --> 00:16:56,720 Speaker 1: They're not great? No, no, but they're bad. They they're 314 00:16:56,800 --> 00:17:00,640 Speaker 1: not not nice people, not good people. Um so they 315 00:17:00,680 --> 00:17:04,040 Speaker 1: got yeah, it's a fact Checkers rated false a post 316 00:17:04,119 --> 00:17:06,280 Speaker 1: a post that Diamond and Silk made stating how the 317 00:17:06,280 --> 00:17:08,840 Speaker 1: hell is allocating million dollars in order to give a 318 00:17:08,960 --> 00:17:10,880 Speaker 1: raise to House members that don't give a damn about 319 00:17:10,880 --> 00:17:15,399 Speaker 1: Americans going to help stimulate America's economy. When fact checkers 320 00:17:15,480 --> 00:17:18,119 Speaker 1: rated that post false, a Facebook staff are involved in 321 00:17:18,119 --> 00:17:20,360 Speaker 1: the partner program, argued that there should be no punishment, 322 00:17:20,400 --> 00:17:23,159 Speaker 1: noting the publisher has not hesitated going public about their 323 00:17:23,160 --> 00:17:26,720 Speaker 1: concerns around alleged anti conservative bias on Facebook. So this 324 00:17:26,800 --> 00:17:28,439 Speaker 1: is a pretty minor case, but it shows what's going 325 00:17:28,480 --> 00:17:30,640 Speaker 1: on there. They post something that's not accurate. This raise 326 00:17:30,720 --> 00:17:34,200 Speaker 1: is not something that's like going through UM and fact 327 00:17:34,280 --> 00:17:36,920 Speaker 1: checkers flag it is inaccurate, which should mean it gets removed. 328 00:17:37,119 --> 00:17:39,240 Speaker 1: But then someone at Facebook is like, if we remove it, 329 00:17:39,240 --> 00:17:41,520 Speaker 1: they're gonna yell about us removing their post and it's 330 00:17:41,560 --> 00:17:43,120 Speaker 1: going to be a pain in the asked for us. 331 00:17:43,240 --> 00:17:46,120 Speaker 1: So just like fuck it. Yeah, this is I feel 332 00:17:46,119 --> 00:17:49,440 Speaker 1: like this is always the route that Facebook does. It's 333 00:17:49,520 --> 00:17:56,240 Speaker 1: just like this big, gigantic, bureaucratic style operation that people 334 00:17:56,760 --> 00:18:00,879 Speaker 1: uh do shitty things so that they're not convenience or 335 00:18:00,960 --> 00:18:04,639 Speaker 1: yelled at by someone else. Like it's also insidious and 336 00:18:04,760 --> 00:18:09,200 Speaker 1: also so boring at the same time. Yeah, it's it's 337 00:18:09,280 --> 00:18:13,840 Speaker 1: the consequences that aren't boring. Um. And to some extent, 338 00:18:13,960 --> 00:18:16,320 Speaker 1: this is true of a lot of the worst things 339 00:18:16,400 --> 00:18:21,199 Speaker 1: in history. There were an awful lot of men intotalitarian 340 00:18:21,320 --> 00:18:24,960 Speaker 1: societies who signed effectively or literally the death warrants of 341 00:18:25,000 --> 00:18:28,000 Speaker 1: their fellow man because like, well, otherwise it's going to 342 00:18:28,080 --> 00:18:30,520 Speaker 1: be a real pain in the ass for me, My 343 00:18:31,080 --> 00:18:33,440 Speaker 1: day at the office is going to be terror I 344 00:18:33,480 --> 00:18:35,720 Speaker 1: don't want to take this to the boss. I don't 345 00:18:35,720 --> 00:18:40,240 Speaker 1: want to escalate this yet just kill them. Yeah, yeah, 346 00:18:40,320 --> 00:18:42,800 Speaker 1: I mean it is so I like the most evil 347 00:18:42,840 --> 00:18:45,160 Speaker 1: stuff is done in a very slow and boring way. 
348 00:18:45,200 --> 00:18:47,960 Speaker 1: I feel like it's just because if you can get 349 00:18:47,960 --> 00:18:50,280 Speaker 1: people to, you know, fall asleep, you can get away 350 00:18:50,280 --> 00:18:54,800 Speaker 1: with fucking murder. Like literally, Yep, it's good stuff. Loving 351 00:18:54,840 --> 00:18:58,800 Speaker 1: these papers, Robert, Yeah, they're very fun. So Diamond and 352 00:18:58,840 --> 00:19:01,440 Speaker 1: Silk were able to push the third party fact checker 353 00:19:01,520 --> 00:19:04,440 Speaker 1: into changing the rating on their article down from false to 354 00:19:04,680 --> 00:19:07,639 Speaker 1: partly false, and with the help of the managed partner 355 00:19:07,760 --> 00:19:11,320 Speaker 1: escalation process, all of their strikes were removed. Um, the 356 00:19:11,400 --> 00:19:14,479 Speaker 1: chat conversations that the Journal reviewed showed that inside the company, 357 00:19:14,480 --> 00:19:19,040 Speaker 1: Facebook employees repeatedly demanded that higher ups explain the allegations. Quote, 358 00:19:19,240 --> 00:19:22,520 Speaker 1: we are apparently providing hate speech policy consulting and consequence 359 00:19:22,600 --> 00:19:26,560 Speaker 1: mitigation services to select partners, wrote one employee. Leadership is 360 00:19:26,600 --> 00:19:29,960 Speaker 1: scared of being accused of bias, wrote another. So that 361 00:19:30,080 --> 00:19:33,560 Speaker 1: seems bad. That seems good. Now, that seems like the 362 00:19:33,600 --> 00:19:35,320 Speaker 1: root of a lot of problems we've been having as 363 00:19:35,359 --> 00:19:38,600 Speaker 1: a society. Then like, well, conservatives are loud when they're angry, 364 00:19:38,640 --> 00:19:41,200 Speaker 1: so let's just let them lie and try to get 365 00:19:41,200 --> 00:19:45,040 Speaker 1: people killed. Diamond and Silk wasn't doing that in that case, 366 00:19:45,040 --> 00:19:47,360 Speaker 1: but that that's a thing in right wing media. Now when 367 00:19:47,359 --> 00:19:49,080 Speaker 1: you're saying, I don't know what to picture when you 368 00:19:49,119 --> 00:19:51,480 Speaker 1: say Diamond and Silk. So at first I was picturing 369 00:19:52,000 --> 00:19:56,960 Speaker 1: porn stars. Then I was picturing a hair metal band. They looked 370 00:19:56,960 --> 00:20:00,440 Speaker 1: more like gospel singers. I was picturing two springer spaniels. 371 00:20:00,520 --> 00:20:03,119 Speaker 1: Most recently, I don't think I'll stay there. Yeah, no, 372 00:20:03,280 --> 00:20:07,359 Speaker 1: I wouldn't. I wouldn't. You should? You should look them up. UM, Yeah, 373 00:20:07,400 --> 00:20:10,640 Speaker 1: they're they're they're. There are two musicians who, like, posed 374 00:20:10,640 --> 00:20:13,560 Speaker 1: with Trump and have like a I think they're on TikTok. 375 00:20:13,640 --> 00:20:18,000 Speaker 1: They're just like right wing media influencers, and they're they're 376 00:20:18,160 --> 00:20:21,240 Speaker 1: they're not they're not great people. UM. In a farewell 377 00:20:21,280 --> 00:20:24,840 Speaker 1: memo to colleagues in late, one member of Facebook's Integrity 378 00:20:24,880 --> 00:20:27,280 Speaker 1: team UM, and the Integrity team, their job is to 379 00:20:27,359 --> 00:20:31,240 Speaker 1: reduce harmful behavior on the platform, UH complained that Facebook's 380 00:20:31,280 --> 00:20:35,040 Speaker 1: tolerance for Breitbart stopped them from effectively fighting hate speech. Quote.
381 00:20:35,280 --> 00:20:37,919 Speaker 1: We make special exceptions to our written policies for them, 382 00:20:37,960 --> 00:20:40,720 Speaker 1: and we even explicitly endorse them by including them as trusted 383 00:20:40,800 --> 00:20:45,720 Speaker 1: partners in our core products. Yeah, it's it's bad, and 384 00:20:45,880 --> 00:20:48,480 Speaker 1: you can see, like there's this constant... what the Facebook 385 00:20:48,480 --> 00:20:52,159 Speaker 1: Papers revealed, there's this constant seesaw of aggression between the 386 00:20:52,160 --> 00:20:56,080 Speaker 1: integrity team, the people whose job is to reduce the 387 00:20:56,160 --> 00:20:59,600 Speaker 1: harm the site does, and everyone else whose only real 388 00:20:59,720 --> 00:21:02,119 Speaker 1: job is to increase engagement on the site. Right. That 389 00:21:02,240 --> 00:21:03,880 Speaker 1: is how you get your bonus, That is how you 390 00:21:04,320 --> 00:21:06,520 Speaker 1: get the kudos from the boss: keeping people on 391 00:21:06,560 --> 00:21:09,560 Speaker 1: the site for longer. So most of Facebook, that is 392 00:21:09,600 --> 00:21:11,600 Speaker 1: their job, and a small number of people, their job 393 00:21:11,640 --> 00:21:13,560 Speaker 1: is to try and make sure the site doesn't contribute 394 00:21:13,560 --> 00:21:16,560 Speaker 1: to an ethnic cleansing, and the ethnic cleansing people, like, 395 00:21:16,600 --> 00:21:19,000 Speaker 1: the people trying to stop that, the best way to 396 00:21:19,000 --> 00:21:20,640 Speaker 1: do that is always going to be to do things 397 00:21:20,640 --> 00:21:23,359 Speaker 1: that cut down on engagement with the site, and so 398 00:21:23,400 --> 00:21:27,879 Speaker 1: they nearly always lose the fights they have with everybody else. Um, 399 00:21:28,000 --> 00:21:32,679 Speaker 1: Jesus Christ. Yeah, it's great, Okay, okay, Yeah, that is 400 00:21:32,720 --> 00:21:36,160 Speaker 1: the scariest extension of that logic, yep, yeah. One thing 401 00:21:36,200 --> 00:21:38,040 Speaker 1: we know thanks to the Facebook Papers is that the 402 00:21:38,080 --> 00:21:40,800 Speaker 1: social network launched a study in two thousand nineteen to 403 00:21:40,880 --> 00:21:43,840 Speaker 1: determine what level of trust its users had in different 404 00:21:43,840 --> 00:21:47,280 Speaker 1: media organizations. Out of dozens of websites across the US 405 00:21:47,320 --> 00:21:51,080 Speaker 1: and UK, Breitbart was dead last. Facebook themselves rated 406 00:21:51,119 --> 00:21:53,720 Speaker 1: it as low quality, which, again, based on the company's 407 00:21:53,760 --> 00:21:55,800 Speaker 1: own claims about how they decide who to include in 408 00:21:55,800 --> 00:21:59,200 Speaker 1: the news tab, would disqualify Breitbart. And guess what, 409 00:21:59,480 --> 00:22:02,320 Speaker 1: Breitbart is still a trusted Facebook partner. Oh hey, 410 00:22:02,440 --> 00:22:04,639 Speaker 1: what's this unrelated news clip from a November 411 00:22:05,359 --> 00:22:08,639 Speaker 1: two thousand twenty one Washington Post article doing in my script? Quote: Breit 412 00:22:08,680 --> 00:22:11,399 Speaker 1: bart is the most influential producer of climate change denial 413 00:22:11,440 --> 00:22:14,040 Speaker 1: posts on Facebook, according to a report released Tuesday that 414 00:22:14,080 --> 00:22:16,840 Speaker 1: suggests a small number of publishers play an outsize role 415 00:22:16,880 --> 00:22:20,560 Speaker 1: in creating content that undermines climate science.
Good shit. Right, 416 00:22:20,920 --> 00:22:24,680 Speaker 1: that's still number one after all these years. Good. Isn't 417 00:22:24,680 --> 00:22:26,760 Speaker 1: that a good thing? Isn't that a good thing? So 418 00:22:26,840 --> 00:22:29,760 Speaker 1: they haven't said two inaccurate things in the last ninety days, 419 00:22:29,800 --> 00:22:35,360 Speaker 1: which I find... Never. Facebook's terror at the thought of offending conservatives 420 00:22:35,359 --> 00:22:38,760 Speaker 1: by cracking down on hate speech and rampant disinformation started. 421 00:22:38,840 --> 00:22:40,879 Speaker 1: I don't know if it started, but it really, it 422 00:22:40,960 --> 00:22:44,520 Speaker 1: really hit the ground running in two thousand sixteen, during 423 00:22:44,560 --> 00:22:47,160 Speaker 1: the only election that was even worse than this last election. 424 00:22:47,680 --> 00:22:50,720 Speaker 1: In May of that year, Gizmodo wrote an article reporting 425 00:22:50,720 --> 00:22:55,080 Speaker 1: that Facebook's trending topics lists suppressed conservative views. A handful 426 00:22:55,119 --> 00:22:58,280 Speaker 1: of ex employees made claims that seemed to back these allegations 427 00:22:58,359 --> 00:23:01,080 Speaker 1: up. Now, reporting later in the year by NPR 428 00:23:01,119 --> 00:23:04,280 Speaker 1: made it clear that this was bullshit. Quote: NPR called 429 00:23:04,320 --> 00:23:07,520 Speaker 1: up half a dozen technology experts, including data scientists who 430 00:23:07,560 --> 00:23:11,120 Speaker 1: have special access to Facebook's internal metrics. The consensus: there 431 00:23:11,160 --> 00:23:14,080 Speaker 1: is no statistical evidence to support the argument that Facebook 432 00:23:14,080 --> 00:23:16,840 Speaker 1: does not give conservative views a fair shake. But truth 433 00:23:16,960 --> 00:23:20,320 Speaker 1: never matters when you're arguing with conservatives. They needed a 434 00:23:20,320 --> 00:23:23,480 Speaker 1: reason to threaten Facebook with regulation, et cetera. When Trump, 435 00:23:23,560 --> 00:23:26,439 Speaker 1: and when Trump won later that year, the social network 436 00:23:26,480 --> 00:23:28,720 Speaker 1: decides these threats may have teeth, and so we're going 437 00:23:28,800 --> 00:23:31,080 Speaker 1: to spend the next four years allowing them to say 438 00:23:31,119 --> 00:23:33,359 Speaker 1: whatever the fuck they want, no matter how racist, no 439 00:23:33,359 --> 00:23:36,240 Speaker 1: matter how conspiratorial, no matter how many shootings it may 440 00:23:36,440 --> 00:23:38,560 Speaker 1: help to inspire, no matter how no matter how many 441 00:23:38,560 --> 00:23:41,240 Speaker 1: shootings may be live streamed on the platform, like the 442 00:23:41,320 --> 00:23:44,240 Speaker 1: Christchurch shooting. Um, we're gonna let it all in, 443 00:23:45,040 --> 00:23:51,879 Speaker 1: all because, yeah, money. Well, because otherwise they get yelled at 444 00:23:51,920 --> 00:23:56,320 Speaker 1: and maybe regulated. The conservatives are gonna get angry. I don't 445 00:23:56,320 --> 00:23:59,639 Speaker 1: want the conservatives to get angry. The funny thing is 446 00:23:59,760 --> 00:24:02,320 Speaker 1: there's no stopping them from getting angry. Right, you know 447 00:24:02,560 --> 00:24:05,600 Speaker 1: how this works. I know how this works. They're gonna 448 00:24:05,600 --> 00:24:07,600 Speaker 1: be angry, and they're going to claim bias no matter what, 449 00:24:07,680 --> 00:24:10,240 Speaker 1: which is what they do.
And so as Facebook gives 450 00:24:10,280 --> 00:24:13,199 Speaker 1: them a free pass and their content is consistently the 451 00:24:13,200 --> 00:24:16,919 Speaker 1: most influential stuff on the entire site, allegations of anti 452 00:24:17,000 --> 00:24:20,080 Speaker 1: right wing bias continue to spread, even though again like 453 00:24:20,119 --> 00:24:21,920 Speaker 1: eight of to nine out of the tin top shared 454 00:24:21,960 --> 00:24:24,080 Speaker 1: post on any even day are from right wing media. 455 00:24:24,280 --> 00:24:26,800 Speaker 1: But you know what's not from right wing media? Jamie? 456 00:24:27,160 --> 00:24:29,840 Speaker 1: What all these products and services that you're not at 457 00:24:29,840 --> 00:24:33,000 Speaker 1: all sure? You can't at all, not at all. We 458 00:24:33,240 --> 00:24:34,920 Speaker 1: have a different we have a different brand of brain 459 00:24:34,960 --> 00:24:37,760 Speaker 1: pills than the ones Alex Jones sells, and ours have 460 00:24:37,920 --> 00:24:41,520 Speaker 1: less than half brain pills, less than half the lead. 461 00:24:41,760 --> 00:24:44,240 Speaker 1: That is that. That is a promise, Jamie, every much 462 00:24:44,280 --> 00:24:47,200 Speaker 1: lad you think a post should have it's less than that. 463 00:24:47,400 --> 00:24:51,399 Speaker 1: Because we care, I'll take your sick little centrist brain pill. 464 00:24:51,640 --> 00:24:55,119 Speaker 1: See if I care, which I can start watching MSNBC 465 00:24:55,240 --> 00:25:01,560 Speaker 1: at any moment. Okay, take brain cooked, get brain cooked. Uh, 466 00:25:02,840 --> 00:25:11,479 Speaker 1: here's some other products. All right, so we're back. Okay, 467 00:25:11,600 --> 00:25:16,080 Speaker 1: So we're back, So we're back. Are think it's gonna 468 00:25:16,080 --> 00:25:18,320 Speaker 1: get happy. Are think he's gonna get happy? I think 469 00:25:18,320 --> 00:25:22,120 Speaker 1: it's gonna get funny. Not really, okay, just checking, Yeah, 470 00:25:22,200 --> 00:25:25,639 Speaker 1: that's not really. I mean Mark Zuckerberg will like, I 471 00:25:25,680 --> 00:25:29,000 Speaker 1: don't know, fall down a man hole someday. Maybe we're lucky. 472 00:25:29,440 --> 00:25:32,120 Speaker 1: That would be funny. That would be funny. In two 473 00:25:32,160 --> 00:25:35,560 Speaker 1: thousand eighteen, if Facebook engineer claimed on an internal message 474 00:25:35,560 --> 00:25:38,800 Speaker 1: board that the company was intolerant of his beliefs, the 475 00:25:38,880 --> 00:25:41,320 Speaker 1: reality is almost certainly that his co workers found him 476 00:25:41,320 --> 00:25:43,520 Speaker 1: to be an obnoxious bigot. I say this because he 477 00:25:43,600 --> 00:25:46,199 Speaker 1: left the company shortly thereafter and hit the grifting circuit, 478 00:25:46,240 --> 00:25:48,879 Speaker 1: showing up on Tucker Carlson's show. He just he does 479 00:25:48,920 --> 00:25:51,280 Speaker 1: the thing that like you remember two eighteen nineteen, a 480 00:25:51,320 --> 00:25:53,600 Speaker 1: bunch of these guys were like leaving big tech companies 481 00:25:53,600 --> 00:25:56,280 Speaker 1: and like going on the Alex Jones show. 
There was 482 00:25:56,320 --> 00:25:58,840 Speaker 1: one guy left Google and claimed, like, brought a bunch 483 00:25:58,880 --> 00:26:00,679 Speaker 1: of leaks, but they weren't ever a thing, because it 484 00:26:00,720 --> 00:26:02,760 Speaker 1: was never anything. People being like, this guy kind 485 00:26:02,800 --> 00:26:07,680 Speaker 1: of seems like he sucks. It's very funny. Those press 486 00:26:07,720 --> 00:26:10,320 Speaker 1: tours were... Yeah, that was truly, that feels like it 487 00:26:10,359 --> 00:26:13,119 Speaker 1: was ten years ago, but yeah, it was. It was 488 00:26:13,160 --> 00:26:15,480 Speaker 1: funny because like I think the first one of these 489 00:26:15,600 --> 00:26:18,840 Speaker 1: dudes did all right money wise, but after that, like 490 00:26:19,320 --> 00:26:21,080 Speaker 1: this spigot dried up and so they were just like 491 00:26:21,200 --> 00:26:25,280 Speaker 1: detonating their careers in the tech industry for nothing, going 492 00:26:25,320 --> 00:26:32,760 Speaker 1: to work for Gab afterwards. To watch... After the election, 493 00:26:32,800 --> 00:26:34,840 Speaker 1: and I apologize for the rate that we're jumping around 494 00:26:34,880 --> 00:26:37,679 Speaker 1: here on the timeline, but it's unavoidable. Facebook became the 495 00:26:37,720 --> 00:26:40,119 Speaker 1: subject of bad PR from the left as well. The 496 00:26:40,160 --> 00:26:42,520 Speaker 1: Cambridge Analytica scandal broke, and the outrage in the 497 00:26:42,520 --> 00:26:45,000 Speaker 1: wake of Trump's election meant that Facebook was being pressured 498 00:26:45,040 --> 00:26:48,760 Speaker 1: to do something about bigotry and disinformation spreading on their platform. 499 00:26:48,960 --> 00:26:51,280 Speaker 1: At the same time, the Republicans are in charge now, 500 00:26:51,320 --> 00:26:54,760 Speaker 1: so they can't actually do anything otherwise they'll be attacked 501 00:26:54,800 --> 00:26:58,000 Speaker 1: for being biased and maybe regulated. So they tested a 502 00:26:58,000 --> 00:27:02,040 Speaker 1: couple of different changes. One was a tool called Sparing Sharing, 503 00:27:02,320 --> 00:27:05,399 Speaker 1: which sought to reduce the reach of hyper posters. Hyper 504 00:27:05,440 --> 00:27:07,760 Speaker 1: posters are exactly what they sound like. These are users 505 00:27:07,760 --> 00:27:11,040 Speaker 1: that have been shown to mostly share false and hateful information, 506 00:27:11,400 --> 00:27:14,560 Speaker 1: and so reducing their virality was seen as potentially helpful. 507 00:27:14,920 --> 00:27:18,119 Speaker 1: This seems like a sensible change, right. Oh, these people 508 00:27:18,160 --> 00:27:21,640 Speaker 1: are are sharing at an incredible rate and it's all 509 00:27:21,840 --> 00:27:27,360 Speaker 1: violent trash. Let's reduce the amount of stuff. I guess 510 00:27:27,359 --> 00:27:30,080 Speaker 1: that's like, that's a that's a real band aid, just 511 00:27:30,200 --> 00:27:32,240 Speaker 1: be like, Okay, we're gonna have them. They can still 512 00:27:32,280 --> 00:27:36,480 Speaker 1: share stuff, but just less hateful stuff. Yeah, and it's 513 00:27:36,480 --> 00:27:39,160 Speaker 1: not less garbage.
It's not even a shadow band because 514 00:27:39,160 --> 00:27:41,840 Speaker 1: the shadow ban would imply that, like you are actually reducing, 515 00:27:42,480 --> 00:27:46,680 Speaker 1: like artificially the spread, You're no longer artificially inflating their 516 00:27:46,680 --> 00:27:50,159 Speaker 1: reach because their stuff gets great engagement, right, because it 517 00:27:50,200 --> 00:27:53,919 Speaker 1: pisses people off even though it's untrue, and the algorithms 518 00:27:54,160 --> 00:27:57,800 Speaker 1: the default is, oh, this pisses people off. Let's let 519 00:27:57,800 --> 00:28:00,600 Speaker 1: everybody see what this asshole saying. And they're just being like, well, 520 00:28:00,640 --> 00:28:03,120 Speaker 1: let's not do that for these specific assholes, right, that's 521 00:28:03,119 --> 00:28:05,840 Speaker 1: all they're doing. Um, it's not a band, it's a 522 00:28:06,160 --> 00:28:08,920 Speaker 1: we're going to stop inflating these people's reached to the 523 00:28:09,000 --> 00:28:11,960 Speaker 1: same extent that we were. Seems like a sensible change. 524 00:28:12,480 --> 00:28:16,200 Speaker 1: You know who disagreed with that, Jamie Loftus, who Robert Evans, 525 00:28:16,320 --> 00:28:19,639 Speaker 1: Joel Kaplan, former deputy chief of staff to George W. 526 00:28:19,800 --> 00:28:24,240 Speaker 1: Bush and Facebook head of Public Policy UM famous right 527 00:28:24,320 --> 00:28:27,560 Speaker 1: wing ship had Joel Kaplan, who is huge at Facebook 528 00:28:27,760 --> 00:28:31,600 Speaker 1: UM and is a major driving force behind don't piss 529 00:28:31,600 --> 00:28:35,120 Speaker 1: off Conservatives. That's that's the guy that he is. That's 530 00:28:35,119 --> 00:28:37,360 Speaker 1: his whole job. How are we supposed to work together 531 00:28:37,440 --> 00:28:43,600 Speaker 1: if we're pissing on the conservatives? It actually at rising tide. Yeah, 532 00:28:43,720 --> 00:28:47,120 Speaker 1: so Kaplan's like, most of these hyper posters are conservatives. 533 00:28:47,160 --> 00:28:50,360 Speaker 1: This is this is you know, unfair, and he convinces 534 00:28:50,400 --> 00:28:53,480 Speaker 1: Zuckerberg to weaken have his engineers weakened the tools so 535 00:28:53,520 --> 00:28:55,959 Speaker 1: that they do kind of reduce the influence that these 536 00:28:56,040 --> 00:28:58,640 Speaker 1: hyper posters have, but not by as much as they 537 00:28:58,680 --> 00:29:01,160 Speaker 1: wanted to, and it doesn't really seem to have a 538 00:29:01,280 --> 00:29:03,280 Speaker 1: much of an impact. As we will talk about later, 539 00:29:03,760 --> 00:29:06,760 Speaker 1: this is still the way Facebook works. So however, to 540 00:29:06,760 --> 00:29:09,640 Speaker 1: whatever extent they did reduce the harm, it was not 541 00:29:09,720 --> 00:29:13,440 Speaker 1: by much. Another attempt is also like way too cool 542 00:29:13,760 --> 00:29:17,360 Speaker 1: of a word to describe what that is, which is 543 00:29:17,600 --> 00:29:20,200 Speaker 1: spreaders of hate speech. Why give them a cool name 544 00:29:20,240 --> 00:29:22,280 Speaker 1: like that? Yeah, why give him a cool name like that? 545 00:29:23,160 --> 00:29:25,480 Speaker 1: Although I don't know that might have that sounds like 546 00:29:25,520 --> 00:29:28,360 Speaker 1: something we might have said as like an insult to 547 00:29:29,360 --> 00:29:32,080 Speaker 1: people when I was young and on the internet. You're 548 00:29:32,080 --> 00:29:34,600 Speaker 1: a hyper poster. 
I don't know, dude, you're like hyper 549 00:29:34,680 --> 00:29:36,600 Speaker 1: posting right now. You need to chill the fuck out. 550 00:29:36,880 --> 00:29:40,640 Speaker 1: I'm picturing someone sitting at their filthy keyboard in 551 00:29:40,680 --> 00:29:44,760 Speaker 1: a Power Ranger suit. I am. I am imagining a 552 00:29:44,840 --> 00:29:47,760 Speaker 1: filthy and the filthy Power Rangers suit, Jamie. Oh, it's 553 00:29:47,920 --> 00:29:51,960 Speaker 1: really dirty and it doesn't fit either way too big 554 00:29:52,040 --> 00:29:55,520 Speaker 1: or way too small. Yeah. They they have soiled themselves 555 00:29:55,520 --> 00:29:58,120 Speaker 1: in it on more than one occasion because they can't 556 00:29:58,120 --> 00:30:01,520 Speaker 1: stop posting, Robert, because they post too much. It was 557 00:30:01,600 --> 00:30:04,000 Speaker 1: It was not an accident. It was a choice they 558 00:30:04,040 --> 00:30:06,760 Speaker 1: made in order to finish an argument. I'm going to 559 00:30:06,920 --> 00:30:13,200 Speaker 1: make an oil painting of that exact image, Jamie. I 560 00:30:13,240 --> 00:30:15,080 Speaker 1: swear to you, I will put that up in my 561 00:30:15,160 --> 00:30:17,240 Speaker 1: living room. I will put it on my roof like 562 00:30:17,280 --> 00:30:23,680 Speaker 1: the Sistine Chapel. Really? Absolutely, don't underestimate how much free 563 00:30:23,680 --> 00:30:26,680 Speaker 1: time I have, Robert, I would never do that. You 564 00:30:26,720 --> 00:30:30,200 Speaker 1: work for the Internet. So another attempted tool to make 565 00:30:30,240 --> 00:30:34,000 Speaker 1: Facebook better was called Informed Engagement. This was supposed to 566 00:30:34,000 --> 00:30:37,280 Speaker 1: reduce the reach of posts that Facebook determined were more 567 00:30:37,320 --> 00:30:40,320 Speaker 1: likely to be shared by people who had not read them. 568 00:30:40,360 --> 00:30:43,680 Speaker 1: This rule was instituted, and over time, Facebook noticed a 569 00:30:43,720 --> 00:30:47,840 Speaker 1: significant decline in disinformation and toxic content. They commissioned a study, 570 00:30:47,880 --> 00:30:50,760 Speaker 1: which is where the problems started. From the Wall Street Journal: 571 00:30:51,640 --> 00:30:55,920 Speaker 1: The study, dubbed a political ideology analysis, suggested the company 572 00:30:55,960 --> 00:30:58,680 Speaker 1: had been suppressing the traffic of major far right publishers, 573 00:30:58,800 --> 00:31:01,520 Speaker 1: even though that wasn't its intent. According to the documents, 574 00:31:01,960 --> 00:31:04,719 Speaker 1: very conservative sites, it found, would benefit the most if 575 00:31:04,760 --> 00:31:07,600 Speaker 1: the tools were removed, with Breitbart's traffic increasing an 576 00:31:07,680 --> 00:31:13,080 Speaker 1: estimated... Washington Times by eighteen percent, Western Journal sixteen percent, 577 00:31:13,200 --> 00:31:16,320 Speaker 1: and Epoch Times by eleven percent. According to the documents, 578 00:31:16,320 --> 00:31:21,760 Speaker 1: the study was designed... That's why you never conduct a study, Robert. Yeah. Yeah.
579 00:31:22,040 --> 00:31:25,040 Speaker 1: They find basically, hey, if you don't let people, if 580 00:31:25,080 --> 00:31:27,920 Speaker 1: you reduce the amount by which people share, like, by 581 00:31:28,040 --> 00:31:30,280 Speaker 1: by which posts shared by people who have not read 582 00:31:30,320 --> 00:31:32,840 Speaker 1: the article, if you if you make those spread less 583 00:31:32,960 --> 00:31:37,600 Speaker 1: bright parts, traffic drops. I still I still think that 584 00:31:37,600 --> 00:31:40,920 Speaker 1: that those like um, the tools that have developed over 585 00:31:41,000 --> 00:31:44,400 Speaker 1: time to be like, are you sure to read the article? 586 00:31:44,600 --> 00:31:48,360 Speaker 1: Our soul? Goofy? I I do like when Twitter, um, 587 00:31:48,640 --> 00:31:50,960 Speaker 1: I feel like they're like I just picturely a little 588 00:31:50,960 --> 00:31:53,840 Speaker 1: shaking person. Are you Are you sure you don't want 589 00:31:53,880 --> 00:31:57,600 Speaker 1: to read the article before you retweet it? What do 590 00:31:57,600 --> 00:32:01,360 Speaker 1: you think? I was like No, I felt so bad 591 00:32:01,400 --> 00:32:04,360 Speaker 1: because there's been times when I've like retweeted, like shared 592 00:32:04,400 --> 00:32:07,400 Speaker 1: my own articles that I've written, and because like I 593 00:32:07,920 --> 00:32:10,959 Speaker 1: wrote them, I didn't necessarily click the link before sharing. 594 00:32:11,000 --> 00:32:12,680 Speaker 1: I just like woke up and I like shared it, 595 00:32:12,680 --> 00:32:14,200 Speaker 1: and it's like, are you sure you don't want to 596 00:32:14,200 --> 00:32:16,960 Speaker 1: read this? And I just click to share because it's 597 00:32:16,960 --> 00:32:18,960 Speaker 1: like nine or ten in the morning and I haven't 598 00:32:18,960 --> 00:32:21,120 Speaker 1: had my coffee yet. But I feel bad even though 599 00:32:21,120 --> 00:32:23,520 Speaker 1: it's like, well, I wrote the motherfucker, like I know 600 00:32:23,680 --> 00:32:26,400 Speaker 1: what's in there. Usually by the time I share something, 601 00:32:26,440 --> 00:32:29,040 Speaker 1: I have already read it. But I do I think 602 00:32:29,040 --> 00:32:32,040 Speaker 1: that that function. I think it has a good purpose 603 00:32:32,120 --> 00:32:33,960 Speaker 1: and it like pains something in your brain that is 604 00:32:34,040 --> 00:32:35,800 Speaker 1: you know, yeah, I think it is a good thing. 605 00:32:36,240 --> 00:32:38,840 Speaker 1: It's just funny. It's this little Oliver twist that appears 606 00:32:38,840 --> 00:32:40,120 Speaker 1: in front of you and it's like, are you sure 607 00:32:40,160 --> 00:32:42,160 Speaker 1: you don't want to read the article for your share? 608 00:32:42,240 --> 00:32:46,040 Speaker 1: We're following like like I'm good I can read it's 609 00:32:46,040 --> 00:32:48,120 Speaker 1: all good. Would you like to maybe read the article 610 00:32:48,200 --> 00:32:52,760 Speaker 1: before suggesting that an ethnic group be slotted for their crimes? Right, 611 00:32:52,840 --> 00:32:56,560 Speaker 1: And that's where he really comes in. Handy, isn't those Yeah, yeah, 612 00:32:56,920 --> 00:33:05,280 Speaker 1: that guy shouldn't be British. Um. So, so the study, like, 613 00:33:05,360 --> 00:33:07,920 Speaker 1: the reason they conduct this study is in the hopes 614 00:33:08,000 --> 00:33:09,880 Speaker 1: that it will like allow them to say that it's 615 00:33:09,920 --> 00:33:12,520 Speaker 1: not biased. 
But then it turns out that, like, I 616 00:33:12,520 --> 00:33:16,360 Speaker 1: wouldn't call it biased. But this change, which is unequivocally 617 00:33:16,400 --> 00:33:20,640 Speaker 1: a good thing, impacts conservative sites, which are lower quality 618 00:33:20,680 --> 00:33:22,640 Speaker 1: and more often shared by people who haven't read the 619 00:33:22,720 --> 00:33:25,880 Speaker 1: articles but are incensed by a shitty, aggressive headline like 620 00:33:25,880 --> 00:33:28,240 Speaker 1: the Breitbart ones we just read. Those get shared a 621 00:33:28,280 --> 00:33:30,560 Speaker 1: lot and they don't read the article, and that's great 622 00:33:30,600 --> 00:33:34,560 Speaker 1: for Breitbart, um. But they decided, like, oh shit, actually, 623 00:33:35,080 --> 00:33:37,640 Speaker 1: this study, the results of this study were absolutely going 624 00:33:37,680 --> 00:33:40,040 Speaker 1: to be called out for bias. One of the researchers 625 00:33:40,080 --> 00:33:42,640 Speaker 1: wrote in an internal memo, we could face significant backlash 626 00:33:42,640 --> 00:33:46,760 Speaker 1: for having experimented with distribution at the expense of conservative publishers. 627 00:33:46,800 --> 00:33:49,600 Speaker 1: So the company dumps this plan, they kill it. It 628 00:33:49,760 --> 00:33:51,680 Speaker 1: is bad for Breitbart. Good for the world if it's 629 00:33:51,720 --> 00:33:54,000 Speaker 1: bad for Breitbart. If it's bad for the Bart, 630 00:33:54,800 --> 00:33:56,880 Speaker 1: we gotta, we gotta can the plan. Bad for the Bart? 631 00:33:57,160 --> 00:34:00,200 Speaker 1: You have always said that at Facebook? Yeah, I say 632 00:34:01,080 --> 00:34:03,360 Speaker 1: you are Cheryl Sandberg. Actually not a lot of people 633 00:34:03,400 --> 00:34:09,200 Speaker 1: know that. Um, yeah, I, you know... listen, off of 634 00:34:09,320 --> 00:34:14,520 Speaker 1: making fun of Cheryl's very share eff. So the good 635 00:34:14,520 --> 00:34:17,480 Speaker 1: news is that Facebook didn't just make craven decisions when 636 00:34:17,520 --> 00:34:19,920 Speaker 1: threatened with the possibility of being called out for bias. 637 00:34:20,120 --> 00:34:22,960 Speaker 1: They were also craven whenever a feature promised to improve 638 00:34:23,000 --> 00:34:25,759 Speaker 1: the safety of their network at the cost of absolutely 639 00:34:25,800 --> 00:34:29,799 Speaker 1: any profitability. In two thousand nineteen, Facebook researchers opened a 640 00:34:29,840 --> 00:34:32,319 Speaker 1: study into the influence of the like button, which is 641 00:34:32,560 --> 00:34:34,480 Speaker 1: one of the most basic and central features of the 642 00:34:34,600 --> 00:34:39,240 Speaker 1: entire platform. Unfortunately, as we'll discuss in more detail later, likes 643 00:34:39,280 --> 00:34:42,480 Speaker 1: are one of the most toxic things about Facebook. Researchers 644 00:34:42,480 --> 00:34:45,719 Speaker 1: found that removing the like button, along with removing emoji 645 00:34:45,760 --> 00:34:49,920 Speaker 1: reactions from posts on Instagram, reduced quote stress and anxiety 646 00:34:50,040 --> 00:34:52,640 Speaker 1: felt by the youngest users on the platform, who all 647 00:34:52,680 --> 00:34:56,120 Speaker 1: reported significant anxiety due to the feature. But Facebook also 648 00:34:56,120 --> 00:34:58,799 Speaker 1: found that hiding the like button reduced the rate at 649 00:34:58,800 --> 00:35:02,799 Speaker 1: which users interacted with posts and clicked on ads.
So 650 00:35:02,840 --> 00:35:05,880 Speaker 1: now this is more of yeah, yeah, no, this is 651 00:35:05,920 --> 00:35:07,200 Speaker 1: I will say, more of a mixed bag than the 652 00:35:07,239 --> 00:35:11,479 Speaker 1: last thing because removing the like didn't like, it made 653 00:35:11,600 --> 00:35:14,319 Speaker 1: one group of young people feel better, um, but not 654 00:35:14,360 --> 00:35:16,560 Speaker 1: other groups of young people like. It didn't reduce it 655 00:35:16,600 --> 00:35:19,400 Speaker 1: reduced like kids social anxiety, but it didn't have as 656 00:35:19,440 --> 00:35:21,799 Speaker 1: much of an impact on teens really, so it's not 657 00:35:21,960 --> 00:35:24,799 Speaker 1: as clear cut as the last one, but still a 658 00:35:24,880 --> 00:35:28,040 Speaker 1: protective effect had been found among the most vulnerable people 659 00:35:28,040 --> 00:35:31,960 Speaker 1: on Instagram in particular. Um, but they don't they don't 660 00:35:32,000 --> 00:35:35,920 Speaker 1: do anything about it because you know, that's so frustrated. 661 00:35:36,360 --> 00:35:41,880 Speaker 1: That is genuinely, very very valuable interesting information where I 662 00:35:41,880 --> 00:35:43,920 Speaker 1: don't know, I mean, I I feel like it probably 663 00:35:44,000 --> 00:35:47,719 Speaker 1: didn't affect teenagers because by that point it's like, I mean, 664 00:35:47,800 --> 00:35:50,759 Speaker 1: you don't want to say like too late, but by 665 00:35:50,760 --> 00:35:53,200 Speaker 1: that point you're so hardwired to be like, well, I 666 00:35:53,239 --> 00:35:56,280 Speaker 1: can tell what is important or like what is worth 667 00:35:56,360 --> 00:35:59,920 Speaker 1: discussing based on likes, and once that that's just such 668 00:35:59,920 --> 00:36:03,600 Speaker 1: a sticky I mean, I still feel that way, even 669 00:36:03,600 --> 00:36:05,600 Speaker 1: though it's like you can objectively know it's not true, 670 00:36:05,680 --> 00:36:08,399 Speaker 1: but once you've been kind of pilled in that way. 671 00:36:08,480 --> 00:36:13,200 Speaker 1: It's it's very hard to undo. Yeah, it's Um, I 672 00:36:13,239 --> 00:36:16,239 Speaker 1: don't know what it is, Jamie, it's not great. Um 673 00:36:16,400 --> 00:36:22,520 Speaker 1: upsetting that. Yeah, it's not great. Uh So, as time 674 00:36:22,520 --> 00:36:25,240 Speaker 1: went on, research made it increasingly clear that core features 675 00:36:25,239 --> 00:36:29,439 Speaker 1: of Facebook products were fundamentally harmful. From BuzzFeed quote, time 676 00:36:29,480 --> 00:36:32,239 Speaker 1: and again, they determine that people misused key features or 677 00:36:32,280 --> 00:36:35,840 Speaker 1: that those features amplified toxic content, among other effects. In 678 00:36:35,880 --> 00:36:39,440 Speaker 1: August and in August two, nineteen internal memos, several researchers 679 00:36:39,480 --> 00:36:42,400 Speaker 1: said it was Facebook's core product mechanics, meaning the basics 680 00:36:42,400 --> 00:36:44,920 Speaker 1: of how the product functioned, that had let misinformation and 681 00:36:44,920 --> 00:36:47,440 Speaker 1: hate speech flourish on the site. The mechanics of our 682 00:36:47,440 --> 00:36:52,319 Speaker 1: platform are not neutral, they concluded. 
So there's Facebook employees 683 00:36:52,600 --> 00:36:56,120 Speaker 1: recognize internally, we are making decisions that are allowing hatred 684 00:36:56,160 --> 00:36:59,439 Speaker 1: and other and just unhealthy toxic content to spread, and 685 00:37:00,080 --> 00:37:02,640 Speaker 1: we're not. The bias is not in us fighting it. 686 00:37:02,800 --> 00:37:05,200 Speaker 1: Our biases in refusing to fight it, like we are 687 00:37:05,280 --> 00:37:08,320 Speaker 1: not being neutral because we're letting this spread. The people 688 00:37:08,480 --> 00:37:11,279 Speaker 1: making the site work recognize this. They talk about it 689 00:37:11,280 --> 00:37:14,600 Speaker 1: to each other. Um, it makes it. They feel guilt 690 00:37:14,640 --> 00:37:18,439 Speaker 1: over it. They talk about it. You know, we know this. Yeah, yeah, 691 00:37:18,440 --> 00:37:21,440 Speaker 1: I mean we've discussed that before. Of just like the 692 00:37:21,480 --> 00:37:24,960 Speaker 1: existential stress of working at Facebook. Not the most empathetic 693 00:37:25,080 --> 00:37:27,719 Speaker 1: problem in the world, but not at all. A lot 694 00:37:27,760 --> 00:37:31,120 Speaker 1: of them are making bank but clear paper trail though 695 00:37:31,280 --> 00:37:38,480 Speaker 1: of deep existential guilt and distress. Now, uh, it's it's 696 00:37:38,520 --> 00:37:41,719 Speaker 1: pretty cool. Yeah, it's pretty cool. Pretty cool. So, rather 697 00:37:41,760 --> 00:37:45,080 Speaker 1: than expanding their tests on the impact of removing the 698 00:37:45,080 --> 00:37:47,640 Speaker 1: like button on a broader scale, Mark Zuckerberg and other 699 00:37:47,640 --> 00:37:50,480 Speaker 1: executives agreed to allow testing only on a much more 700 00:37:50,520 --> 00:37:53,600 Speaker 1: limited scale, not to reduce harm, but to quote build 701 00:37:53,640 --> 00:37:57,200 Speaker 1: a positive press narrative around Instagram. So not not to 702 00:37:57,280 --> 00:38:00,640 Speaker 1: actually help human beings, but to have us something to 703 00:38:00,680 --> 00:38:04,320 Speaker 1: brag about. Right, for him to be like, we're so nice, 704 00:38:04,680 --> 00:38:07,480 Speaker 1: We're so cool, We're gonna have fucking rad we are 705 00:38:07,840 --> 00:38:12,000 Speaker 1: I'm the guy site to stop the dated slang. He's 706 00:38:12,040 --> 00:38:14,319 Speaker 1: going to be like, yeah, this is gonna be uh, 707 00:38:14,360 --> 00:38:17,680 Speaker 1: we gotta get some lit press around Instagram, know what 708 00:38:17,719 --> 00:38:20,480 Speaker 1: I mean. Yeah, in three years, someone's going to tell 709 00:38:20,520 --> 00:38:22,640 Speaker 1: him the word poggers and then he's gonna say it 710 00:38:22,719 --> 00:38:26,640 Speaker 1: like thirty times in a week to all of his 711 00:38:26,719 --> 00:38:33,960 Speaker 1: imaginary alien friends on meta'd like that's bro sucker, I 712 00:38:34,000 --> 00:38:38,239 Speaker 1: hate that, screaming into avoid. In September of Facebook rolled 713 00:38:38,239 --> 00:38:40,600 Speaker 1: out a study of the share button. This came in 714 00:38:40,640 --> 00:38:43,719 Speaker 1: the wake of a summer of unprecedented political violence, much 715 00:38:43,719 --> 00:38:47,520 Speaker 1: of its stoked via viral Facebook posts and Facebook groups. Uh, 716 00:38:47,520 --> 00:38:49,880 Speaker 1: the ship at Kenosha started on Facebook in a lot 717 00:38:49,920 --> 00:38:52,560 Speaker 1: of ways. Uh, we're tracking it that night. 
A hell 718 00:38:52,600 --> 00:38:54,520 Speaker 1: of a lot of that shit got started on Facebook. 719 00:38:54,520 --> 00:38:56,400 Speaker 1: A lot of the, like, let's get a militia together 720 00:38:56,400 --> 00:38:59,880 Speaker 1: and protect businesses. You know, it's good stuff. Company research 721 00:39:00,160 --> 00:39:04,480 Speaker 1: identified reshare aggregation units, which were automatically generated groups of 722 00:39:04,520 --> 00:39:08,040 Speaker 1: posts that... So they identify one of the problems 723 00:39:08,120 --> 00:39:10,280 Speaker 1: leading to all of this is what they called reshare 724 00:39:10,280 --> 00:39:13,880 Speaker 1: aggregation units. And these were automatically generated groups of posts 725 00:39:13,920 --> 00:39:17,279 Speaker 1: that were sent to you, that were posts your friends liked, right, 726 00:39:17,560 --> 00:39:19,040 Speaker 1: and they recognize this is how a lot of this 727 00:39:19,080 --> 00:39:22,680 Speaker 1: bad shit is spreading. So right, right, that's creating a 728 00:39:22,760 --> 00:39:26,440 Speaker 1: feedback loop on purpose. Yes, in part because users are 729 00:39:26,520 --> 00:39:29,200 Speaker 1: much more likely to share posts that have already been 730 00:39:29,239 --> 00:39:31,600 Speaker 1: liked by other people they know, even if those posts 731 00:39:31,640 --> 00:39:34,880 Speaker 1: are hateful, bigoted, bullying, or contain inaccurate information. So 732 00:39:34,920 --> 00:39:37,040 Speaker 1: if somebody gets the same post in two different ways, 733 00:39:37,080 --> 00:39:40,359 Speaker 1: if they just like see a bigoted article pop up 734 00:39:40,360 --> 00:39:43,440 Speaker 1: on their, on their Facebook feed, but they're not 735 00:39:43,480 --> 00:39:45,480 Speaker 1: being informed that other people they know have liked it 736 00:39:45,600 --> 00:39:47,799 Speaker 1: or shared it, they're less likely to share it than 737 00:39:47,840 --> 00:39:49,520 Speaker 1: if like, well, my buddy shared it. So maybe now 738 00:39:49,600 --> 00:39:52,040 Speaker 1: I have permission, right, and you can think about how 739 00:39:52,080 --> 00:39:54,399 Speaker 1: this happens on like a societal level, how this has 740 00:39:54,400 --> 00:39:57,120 Speaker 1: contributed to everything we're dealing with right now. Um, so 741 00:39:57,160 --> 00:40:00,520 Speaker 1: I feel like everyone knows someone who probably was 742 00:40:00,760 --> 00:40:03,879 Speaker 1: very, very influenced by that exact function. That's, yeah, God, 743 00:40:03,920 --> 00:40:08,200 Speaker 1: that's awful. So company researchers in September are like, these reshare 744 00:40:08,280 --> 00:40:10,160 Speaker 1: aggregation units, the fact is, we don't have to do it 745 00:40:10,200 --> 00:40:12,600 Speaker 1: this way, right? We can only show people the 746 00:40:12,800 --> 00:40:15,080 Speaker 1: articles that their friends comment on at the very least 747 00:40:15,120 --> 00:40:17,960 Speaker 1: as opposed to just like or just share without much commentary. Like, 748 00:40:18,160 --> 00:40:20,160 Speaker 1: they have a number of options here that could at 749 00:40:20,160 --> 00:40:24,799 Speaker 1: the very least reduce, um, harmful content, because people are, 750 00:40:24,800 --> 00:40:27,480 Speaker 1: I think the number is, like, three times as 751 00:40:27,560 --> 00:40:30,240 Speaker 1: likely to share content that's presented to them in this way.
752 00:40:30,520 --> 00:40:33,400 Speaker 1: So in May of that year, um, while you know, 753 00:40:34,400 --> 00:40:37,759 Speaker 1: so this is actually months before Facebook researchers find this out. 754 00:40:38,080 --> 00:40:41,440 Speaker 1: Myself and a journalist named Jason Wilson published what I 755 00:40:41,480 --> 00:40:44,080 Speaker 1: still think is the first proper forensic investigation into the 756 00:40:44,120 --> 00:40:46,840 Speaker 1: Boogaloo movement. Um. It noted how the spread of the 757 00:40:46,880 --> 00:40:49,680 Speaker 1: movement and its crucial early days was enabled entirely by 758 00:40:49,680 --> 00:40:53,240 Speaker 1: Facebook's group recommendation algorithm, which was often spread to people 759 00:40:53,280 --> 00:40:56,240 Speaker 1: by these reshare aggregation units. You see, oh my buddy 760 00:40:56,320 --> 00:40:59,040 Speaker 1: joined this group where everybody's sharing these like Hawaiian shirt 761 00:40:59,040 --> 00:41:01,719 Speaker 1: photos and pictures of maybe I'll hop in there, and 762 00:41:01,840 --> 00:41:04,520 Speaker 1: you know, the cycle goes on from there. When you've 763 00:41:04,560 --> 00:41:06,799 Speaker 1: joined one group, it sends you advice like, hey, check 764 00:41:06,840 --> 00:41:08,520 Speaker 1: out this other group, check out this other group, and 765 00:41:08,560 --> 00:41:12,000 Speaker 1: it starts with like we're sharing memes about like Hawaiian 766 00:41:12,120 --> 00:41:15,560 Speaker 1: shirts and and you know, the boogaloo. And then five 767 00:41:15,680 --> 00:41:18,520 Speaker 1: or six groups down the line, you're making serious plans 768 00:41:18,560 --> 00:41:23,239 Speaker 1: to assassinate federal agents or kidnap a governor. You know, yeah, 769 00:41:23,320 --> 00:41:27,640 Speaker 1: I mean that he's uh, you know, I remember where 770 00:41:27,680 --> 00:41:30,040 Speaker 1: I was when I read it, because the how steep 771 00:41:30,120 --> 00:41:34,399 Speaker 1: the escalation is and how quickly it like, it's not 772 00:41:34,520 --> 00:41:36,600 Speaker 1: I guess not completely surprising, but at the time I 773 00:41:36,640 --> 00:41:39,320 Speaker 1: was like, oh, that that is a very short timeline 774 00:41:39,480 --> 00:41:44,160 Speaker 1: from yeah, so I don't like it. It's not good. 775 00:41:44,280 --> 00:41:48,080 Speaker 1: It's not good, and in fairness, um, there are actually 776 00:41:48,120 --> 00:41:51,959 Speaker 1: some face like it kind of becomes accepted the stuff 777 00:41:52,000 --> 00:41:54,400 Speaker 1: that Jason and I were writing about in May by 778 00:41:54,440 --> 00:41:57,000 Speaker 1: a lot of Facebook researchers around September of that year. 779 00:41:57,000 --> 00:41:59,560 Speaker 1: But there were people within Facebook who actually tried to 780 00:41:59,600 --> 00:42:02,720 Speaker 1: blow the it on this earlier um in fact, significantly earlier. 781 00:42:03,120 --> 00:42:06,200 Speaker 1: In mid two thousand nineteen, an internal researcher created an 782 00:42:06,280 --> 00:42:09,880 Speaker 1: experimental Facebook account, which is something that like certain researchers 783 00:42:09,880 --> 00:42:11,319 Speaker 1: would do from time to time to see what the 784 00:42:11,320 --> 00:42:15,120 Speaker 1: algorithm is feeding people. 
This experimental account account was a 785 00:42:15,160 --> 00:42:18,319 Speaker 1: fake conservative mom, and they made this acount because they 786 00:42:18,320 --> 00:42:20,800 Speaker 1: wanted to see what the recommendation algorithm would feed this account. 787 00:42:20,960 --> 00:42:23,560 Speaker 1: And I'm gonna read from BuzzFeed again here. The internal research, 788 00:42:23,640 --> 00:42:26,600 Speaker 1: titled Carol's Journey to Q and On, detailed how the 789 00:42:26,600 --> 00:42:29,400 Speaker 1: Facebook account for an imaginary woman named Carol Smith had 790 00:42:29,440 --> 00:42:33,160 Speaker 1: followed pages for Fox News and Sinclair Broadcasting. Within days, 791 00:42:33,200 --> 00:42:36,200 Speaker 1: Facebook had recommended pages and groups related to Q and On, 792 00:42:36,360 --> 00:42:39,440 Speaker 1: the conspiracy theory that falsely claimed Mr Trump was facing 793 00:42:39,440 --> 00:42:42,560 Speaker 1: down a shadowy cabal of democratic pedophiles. By the end 794 00:42:42,600 --> 00:42:45,320 Speaker 1: of three weeks, Carol Smith's Facebook feed had devolved further. 795 00:42:45,560 --> 00:42:48,279 Speaker 1: It became a constant flow of misleading, polarizing, and low 796 00:42:48,360 --> 00:42:53,120 Speaker 1: quality content. The researcher wrote, Yeah, how did so so 797 00:42:53,239 --> 00:42:56,000 Speaker 1: some some some jerk was like, let's call it Carol, 798 00:42:56,440 --> 00:43:02,160 Speaker 1: like they Carol stereotype. Statistically not unlikely that it could 799 00:43:02,360 --> 00:43:04,439 Speaker 1: It could have been a Carol that is. I mean 800 00:43:04,840 --> 00:43:08,240 Speaker 1: that that's interesting that we also all know a Carol. 801 00:43:09,719 --> 00:43:14,040 Speaker 1: A Carol. Yeah, unfortunately. So they're in They're in the 802 00:43:14,040 --> 00:43:17,479 Speaker 1: they're the Duncan Donuts drive through. I I walk among them. Yeah, 803 00:43:18,480 --> 00:43:22,200 Speaker 1: they live among us. They are dating you walk among them. 804 00:43:22,320 --> 00:43:28,080 Speaker 1: That's so funny. You eat hot dogs next to all people. Yeah, 805 00:43:28,120 --> 00:43:31,359 Speaker 1: we're in mine. I think that hot dog eaters are 806 00:43:31,400 --> 00:43:36,560 Speaker 1: maybe more politically astute bunch. But the Dunkan Donuts line 807 00:43:36,680 --> 00:43:41,320 Speaker 1: is just absolutely unmitigated chaos. There could be the politics 808 00:43:41,320 --> 00:43:43,480 Speaker 1: of the Dunkey Donuts line are all over the place. 809 00:43:43,800 --> 00:43:46,319 Speaker 1: They are they are anarchists. In the line. You have 810 00:43:46,719 --> 00:43:48,600 Speaker 1: you have the scariest people you've ever met in the line. 811 00:43:49,000 --> 00:43:51,200 Speaker 1: You have been Affleck in the line looking like his 812 00:43:51,400 --> 00:43:54,640 Speaker 1: entire family just died in a bus crash, but Afflex 813 00:43:54,960 --> 00:43:58,240 Speaker 1: in the line and you can see his little dragon 814 00:43:58,280 --> 00:44:01,960 Speaker 1: back piece. Oh my god. So Phoenix, Jamie, come on, 815 00:44:02,280 --> 00:44:05,239 Speaker 1: come on, sorry, that was disrespectful. That was disrespectful. And 816 00:44:05,280 --> 00:44:09,000 Speaker 1: you're right and you're right, thank you, thank you. So 817 00:44:10,160 --> 00:44:15,000 Speaker 1: this this study with this fake account that immediately gets radicalized. 
818 00:44:15,400 --> 00:44:18,800 Speaker 1: This study, uh, we... it comes out in the Facebook 819 00:44:18,800 --> 00:44:20,719 Speaker 1: Papers, right, this year, but it was done in two 820 00:44:20,760 --> 00:44:23,960 Speaker 1: thousand nineteen. And when, this year, like, information of this 821 00:44:24,040 --> 00:44:27,359 Speaker 1: dropped and journalists start writing about it, um, they do 822 00:44:27,400 --> 00:44:29,680 Speaker 1: what journalists do, which is you, you, you put together 823 00:44:29,719 --> 00:44:31,840 Speaker 1: your article and then you go for comment. Right. And 824 00:44:31,920 --> 00:44:35,520 Speaker 1: so the comment that Facebook made about this experiment that 825 00:44:35,560 --> 00:44:38,320 Speaker 1: this researcher did in two thousand nineteen was, well, this was 826 00:44:38,360 --> 00:44:40,600 Speaker 1: a study of one hypothetical user. It is a perfect 827 00:44:40,640 --> 00:44:43,040 Speaker 1: example of research the company does to improve our 828 00:44:43,080 --> 00:44:45,520 Speaker 1: systems and helped inform our decision to remove QAnon 829 00:44:45,600 --> 00:44:48,840 Speaker 1: from the platform. That did not happen until January 830 00:44:48,880 --> 00:44:53,359 Speaker 1: of this year. They didn't do shit for two years after this. 831 00:44:53,880 --> 00:44:56,280 Speaker 1: They only did shit because people stormed the fucking Capitol 832 00:44:56,280 --> 00:45:00,480 Speaker 1: waving QAnon banners. You motherfuckers. Sounds like them, 833 00:45:01,160 --> 00:45:03,160 Speaker 1: sounds like that. They're like, Oh, let's wait until things 834 00:45:03,239 --> 00:45:06,680 Speaker 1: get so desperately bad that the company will be, you know, 835 00:45:06,760 --> 00:45:10,600 Speaker 1: severely impacted if we don't do something. A huge amount 836 00:45:10,680 --> 00:45:13,680 Speaker 1: of the radicalization: QAnon gets supercharged 837 00:45:13,719 --> 00:45:16,040 Speaker 1: by the lockdowns, right, because suddenly all these people, like 838 00:45:16,120 --> 00:45:18,239 Speaker 1: a lot of them, are in financial distress, they're locked 839 00:45:18,239 --> 00:45:20,680 Speaker 1: in their fucking houses, they're online all the goddamn time. 840 00:45:21,120 --> 00:45:23,880 Speaker 1: And they knew they could have dealt with this problem 841 00:45:24,000 --> 00:45:27,360 Speaker 1: and reduced massively the number of people who got fed 842 00:45:27,400 --> 00:45:30,120 Speaker 1: this poison during the lockdown, if they'd done a goddamn 843 00:45:30,160 --> 00:45:31,920 Speaker 1: thing in two thousand nineteen. They had the option, 844 00:45:32,280 --> 00:45:38,000 Speaker 1: they did not. Yeah, okay, so no surprises here, that 845 00:45:38,160 --> 00:45:42,640 Speaker 1: researcher... Nothing bad happened, right? It did not. Name 846 00:45:42,680 --> 00:45:47,200 Speaker 1: one thing... nothing happened? Well, it all happened actually, and was 847 00:45:47,200 --> 00:45:50,920 Speaker 1: pretty heavily tied to this. Uh. In August of two thousand twenty, that 848 00:45:51,000 --> 00:45:54,040 Speaker 1: researcher left the company. She wrote an exit note where 849 00:45:54,040 --> 00:45:58,360 Speaker 1: she accused the company, Facebook, of, quote, knowingly exposing users 850 00:45:58,360 --> 00:46:01,080 Speaker 1: to harm.
We've known for over a year now that 851 00:46:01,120 --> 00:46:04,120 Speaker 1: our recommendation systems can very quickly lead users down a 852 00:46:04,160 --> 00:46:06,920 Speaker 1: path to conspiracy theories and groups. In the meantime, the 853 00:46:06,960 --> 00:46:09,719 Speaker 1: fringe group slash set of beliefs has grown to 854 00:46:09,840 --> 00:46:12,960 Speaker 1: national prominence, with QAnon congressional candidates and QAnon 855 00:46:13,000 --> 00:46:16,760 Speaker 1: hashtags and groups trending on the mainstream. Out of fears, 856 00:46:17,200 --> 00:46:20,839 Speaker 1: out of fears over potential public and policy stakeholder responses, 857 00:46:21,080 --> 00:46:24,600 Speaker 1: we are knowingly exposing users to risks of integrity harms. 858 00:46:24,960 --> 00:46:27,359 Speaker 1: During the time that we've hesitated, I've seen folks from 859 00:46:27,360 --> 00:46:29,759 Speaker 1: my hometown go further and further down the rabbit hole. 860 00:46:29,960 --> 00:46:35,040 Speaker 1: It has been painful to observe. Okay, okay, and I 861 00:46:35,040 --> 00:46:39,920 Speaker 1: mean, it is, I mean, no arguments there. It is 862 00:46:40,040 --> 00:46:43,399 Speaker 1: very painful to observe that happen to people who are 863 00:46:43,440 --> 00:46:46,960 Speaker 1: not... Yeah, that's a Facebook employee who's not going to 864 00:46:47,040 --> 00:46:50,360 Speaker 1: get any shit from me. Um. She identified the problem, 865 00:46:50,400 --> 00:46:53,839 Speaker 1: she did good research to try to make clear what 866 00:46:53,880 --> 00:46:58,160 Speaker 1: the problem was, and when she realized that her company 867 00:46:58,239 --> 00:46:59,920 Speaker 1: was never going to take action on this because it 868 00:47:00,040 --> 00:47:02,880 Speaker 1: would reduce their profits, she fucking leaves. And she, she 869 00:47:03,520 --> 00:47:06,320 Speaker 1: does everything she can to let people know how unacceptable 870 00:47:06,360 --> 00:47:10,160 Speaker 1: the situation is within the company. You know, I mean 871 00:47:10,160 --> 00:47:12,960 Speaker 1: that is good. That's, that is the minimum. That 872 00:47:13,120 --> 00:47:15,520 Speaker 1: is the minimum, right, Yeah, I mean it is a 873 00:47:15,520 --> 00:47:18,320 Speaker 1: little silly to be like, and I just recently realized 874 00:47:18,360 --> 00:47:21,600 Speaker 1: that I don't think Facebook is ethical. You're like, like, 875 00:47:21,840 --> 00:47:23,719 Speaker 1: shut up, no way, I don't. I don't know when 876 00:47:23,719 --> 00:47:26,959 Speaker 1: this person started, but like, yeah, she got 877 00:47:27,040 --> 00:47:31,880 Speaker 1: there and she's clearly horrified by what she like realized 878 00:47:31,920 --> 00:47:34,600 Speaker 1: the company was doing. Like again, we've all been where 879 00:47:34,600 --> 00:47:36,319 Speaker 1: she is. Where you just see these people you grew 880 00:47:36,400 --> 00:47:41,359 Speaker 1: up with lose their goddamn minds and it's bad. It's bad. 881 00:47:41,400 --> 00:47:43,680 Speaker 1: And you're like, oh, and I'm, I'm... and I work 882 00:47:43,719 --> 00:47:46,960 Speaker 1: at the nucleus of the problem. Interesting. Yeah, 883 00:47:47,160 --> 00:47:48,960 Speaker 1: I mean that's why I had to quit working at 884 00:47:48,960 --> 00:47:53,239 Speaker 1: Purdue Pharmaceuticals. Yeah. I do miss the freedom, la.
I know, 885 00:47:53,440 --> 00:47:55,360 Speaker 1: I know, I know, I know all those I was 886 00:47:55,400 --> 00:48:01,600 Speaker 1: a great salesman. You're so good plays a produced pharmaceuticals salesman. 887 00:48:02,360 --> 00:48:05,880 Speaker 1: Or no, maybe it's that Will Poulter. Oh yeah, okay. 888 00:48:06,200 --> 00:48:08,160 Speaker 1: You know who didn't is Alec Ginnis. Because he never 889 00:48:08,200 --> 00:48:10,799 Speaker 1: loved He never lived to see opiate's become what they 890 00:48:10,840 --> 00:48:15,440 Speaker 1: are today. Tragic. He never lived to taste the sweet 891 00:48:15,480 --> 00:48:18,319 Speaker 1: taste of tramadol or delatted and we don't talk about 892 00:48:18,320 --> 00:48:20,720 Speaker 1: that enough. We don't talk about that enough. What a shame? 893 00:48:21,320 --> 00:48:25,440 Speaker 1: What heartbreaks? Say I will what if Alec Ginnis had 894 00:48:25,440 --> 00:48:30,920 Speaker 1: access to high quality, pharmaceutical grade painkillers an essay? I 895 00:48:30,960 --> 00:48:33,560 Speaker 1: think it would have been sweet. I'm pitching it. I'm 896 00:48:33,600 --> 00:48:39,120 Speaker 1: pitching it. Okay, Um, someone who's better at movies than 897 00:48:39,160 --> 00:48:41,600 Speaker 1: I could have made a train spotting joke there because 898 00:48:41,640 --> 00:48:45,160 Speaker 1: he and McGregor in the Heroin movie than he played on. 899 00:48:45,440 --> 00:48:47,239 Speaker 1: I don't know, there's some joke there, but I didn't 900 00:48:47,239 --> 00:48:49,239 Speaker 1: come up with it. Okay. Someone Yeah, someone figure that 901 00:48:49,239 --> 00:48:51,239 Speaker 1: out in the reddit, and then don't tell us about it. 902 00:48:51,360 --> 00:48:53,440 Speaker 1: Do not tell us because I've never seen train spotting. 903 00:48:53,440 --> 00:48:57,600 Speaker 1: I'm just aware that woggin. Yeah, I haven't seen it, 904 00:48:57,680 --> 00:49:00,480 Speaker 1: but like, you know what it's about. I'm so really yeah, 905 00:49:00,480 --> 00:49:03,560 Speaker 1: I know what it's about. I also get so stressed 906 00:49:03,560 --> 00:49:06,799 Speaker 1: out when I anytime it's not often, but anytime I've 907 00:49:06,800 --> 00:49:09,680 Speaker 1: had to say even McGregor's name, I say, it's so weird. 908 00:49:09,920 --> 00:49:12,480 Speaker 1: It's the worst thing in the world. Saying when McGregor's 909 00:49:12,560 --> 00:49:14,759 Speaker 1: name is the most frightening experience a human can have. 910 00:49:16,000 --> 00:49:17,960 Speaker 1: I can't. I can't make my mouth make that shape. 911 00:49:17,960 --> 00:49:20,399 Speaker 1: It's embarrassing. I think what he has to live with. 912 00:49:20,560 --> 00:49:23,279 Speaker 1: Thankfully he's gorgeous. That must make it easier. Yeah, I 913 00:49:23,280 --> 00:49:29,400 Speaker 1: mean being sexy has to help. It has to help. Yeah. 914 00:49:29,600 --> 00:49:32,400 Speaker 1: You know who does know how to pronounce Ewan McGregor's 915 00:49:32,480 --> 00:49:35,080 Speaker 1: name and never feels any anxiety over it because their 916 00:49:35,120 --> 00:49:37,279 Speaker 1: friends they hang out on the weekend? Oh nice? Who 917 00:49:37,400 --> 00:49:40,480 Speaker 1: who's at the products and services that support this podcast 918 00:49:40,760 --> 00:49:44,319 Speaker 1: are all good buddies with Ewan McGregor hangs out with 919 00:49:44,320 --> 00:50:01,720 Speaker 1: the highway patrol. Ah, we are back. 
As the election 920 00:50:01,760 --> 00:50:05,000 Speaker 1: grew nearer, disinformation continued to proliferate on Facebook, and the 921 00:50:05,040 --> 00:50:08,600 Speaker 1: political temperature in the United States rose ever higher. Facebook 922 00:50:08,640 --> 00:50:11,839 Speaker 1: employees grew concerned about the wide variety of worst case 923 00:50:11,840 --> 00:50:14,840 Speaker 1: scenarios that might result if something went wrong with the election. 924 00:50:15,239 --> 00:50:18,760 Speaker 1: They put together a series of emergency break glass measures. 925 00:50:19,040 --> 00:50:21,640 Speaker 1: These would allow them to automatically slow or stop the 926 00:50:21,680 --> 00:50:24,759 Speaker 1: formation of new Facebook groups if the election was contested. 927 00:50:25,200 --> 00:50:27,560 Speaker 1: This was never stated, but you get the feeling. They 928 00:50:27,600 --> 00:50:30,040 Speaker 1: were looking at Kenosha and how Facebook groups had led 929 00:50:30,080 --> 00:50:33,000 Speaker 1: to spontaneous and deadly militia groups to form up due 930 00:50:33,040 --> 00:50:36,080 Speaker 1: to viral news stories. My interpretation is that they were 931 00:50:36,160 --> 00:50:38,160 Speaker 1: terrified of the same sort of phenomenon that it would 932 00:50:38,200 --> 00:50:40,799 Speaker 1: lead to Facebook like fueling a civil war. Like I 933 00:50:40,800 --> 00:50:43,040 Speaker 1: think that we're literally worried that like something will go wrong, 934 00:50:43,320 --> 00:50:45,680 Speaker 1: people will form militias on Facebook and there will be 935 00:50:45,680 --> 00:50:47,520 Speaker 1: a gun fight that a shipload of people die, and 936 00:50:47,560 --> 00:50:49,400 Speaker 1: that escalates to something worse, and everyone will say it 937 00:50:49,440 --> 00:50:52,120 Speaker 1: started on Facebook because that happened in Kenosha, Like, this 938 00:50:52,160 --> 00:50:54,640 Speaker 1: is not a thing. It happened with the Boololoo stuff 939 00:50:54,640 --> 00:50:57,640 Speaker 1: after Yeah, it happened several times last year at a 940 00:50:57,760 --> 00:51:01,040 Speaker 1: fair anxiety. We've seen it happen. Yeah, it was never 941 00:51:01,120 --> 00:51:05,000 Speaker 1: a massive exchange of gunfire, thank fucking god. Um, but 942 00:51:05,040 --> 00:51:07,239 Speaker 1: they were they thought they saw that possibility. This is 943 00:51:07,280 --> 00:51:11,360 Speaker 1: the thing. I was worried about the entirety of um um, 944 00:51:11,400 --> 00:51:13,680 Speaker 1: and we got really close to it several times. I 945 00:51:13,719 --> 00:51:16,640 Speaker 1: was there for a few of them. It sucked. UM. 946 00:51:16,680 --> 00:51:18,719 Speaker 1: So they're worried about this and they start coming. They 947 00:51:18,760 --> 00:51:21,400 Speaker 1: try to figure out like break like emergency measures they 948 00:51:21,400 --> 00:51:24,279 Speaker 1: can take to basically like shut all that ship down, 949 00:51:24,400 --> 00:51:27,000 Speaker 1: like stop people from joining and making new Facebook groups. 950 00:51:27,000 --> 00:51:30,040 Speaker 1: If they have to write if like it becomes obvious 951 00:51:30,080 --> 00:51:33,640 Speaker 1: that something needs to be done. Um and yeah, so 952 00:51:33,920 --> 00:51:37,800 Speaker 1: they they uh yeah. 
In September, Mark Zuckerberg wrote on 953 00:51:37,840 --> 00:51:40,840 Speaker 1: an internal company post that his company had, quote, an 954 00:51:41,040 --> 00:51:45,960 Speaker 1: obligation, had, quote, a responsibility to protect our democracy. He 955 00:51:46,000 --> 00:51:49,120 Speaker 1: bragged about a voter registration campaign the social network had funded, 956 00:51:49,160 --> 00:51:52,440 Speaker 1: and claimed they'd taken vigorous steps to eliminate voter misinformation 957 00:51:52,480 --> 00:51:55,319 Speaker 1: and block political ads, all with the stated goal of 958 00:51:55,360 --> 00:51:59,399 Speaker 1: reducing the chances of violence and unrest. The election came, 959 00:51:59,600 --> 00:52:02,200 Speaker 1: and it went all right. From Facebook's perspective, the whole 960 00:52:02,239 --> 00:52:05,080 Speaker 1: situation was too fluid and confusing in those early days 961 00:52:05,080 --> 00:52:07,000 Speaker 1: after the election. You know, we're getting the counts in. 962 00:52:07,400 --> 00:52:09,680 Speaker 1: Everything's down to the fucking wire. It was all too 963 00:52:09,719 --> 00:52:11,319 Speaker 1: messy for there to be much in the way of 964 00:52:11,400 --> 00:52:14,399 Speaker 1: violent on-the-ground action to occur while that 965 00:52:14,520 --> 00:52:17,000 Speaker 1: was getting sorted out, because people just didn't know where 966 00:52:17,000 --> 00:52:19,759 Speaker 1: it was going to land. Um, so they think, oh, we 967 00:52:19,880 --> 00:52:25,200 Speaker 1: dodged a bullet. Everything was fine because they're dumb. Oh baby. Yeah. 968 00:52:25,719 --> 00:52:28,800 Speaker 1: The reality, of course, is that misinformation about election integrity 969 00:52:28,840 --> 00:52:33,280 Speaker 1: spread immediately like wildfire. On November five, one Facebook employee 970 00:52:33,280 --> 00:52:36,319 Speaker 1: warned colleagues that disinformation was filling the comments section of 971 00:52:36,360 --> 00:52:39,760 Speaker 1: news posts about the election. The most incendiary and dangerous 972 00:52:39,760 --> 00:52:43,520 Speaker 1: comments were being amplified by Facebook's algorithm to appear as 973 00:52:43,520 --> 00:52:47,040 Speaker 1: the top comment on popular threads. On November nine, a 974 00:52:47,120 --> 00:52:50,399 Speaker 1: Facebook data scientist reported in an internal study that one 975 00:52:50,400 --> 00:52:53,200 Speaker 1: out of every fifty views on Facebook in the United States, 976 00:52:53,360 --> 00:52:56,040 Speaker 1: and fully ten percent of all views of political information 977 00:52:56,080 --> 00:52:59,200 Speaker 1: on the site, was content claiming the election had been stolen. 978 00:52:59,480 --> 00:53:03,680 Speaker 1: Ten percent of Facebook's political posts are the election was stolen? Yeah, 979 00:53:03,920 --> 00:53:06,319 Speaker 1: one out of fifty views on Facebook is someone saying 980 00:53:06,360 --> 00:53:11,080 Speaker 1: the election was stolen. This shit is out of control. Presumably, everyone 981 00:53:11,360 --> 00:53:15,399 Speaker 1: engaging to agree? Wow. Okay, I honestly, honestly, I would 982 00:53:15,440 --> 00:53:17,839 Speaker 1: have guessed that it would have been higher. But, but 983 00:53:17,960 --> 00:53:21,359 Speaker 1: one in ten is still... there's a lot going on on Facebook. Yeah. 984 00:53:21,480 --> 00:53:25,360 Speaker 1: The researcher noted there was also a fringe incitement to violence.
985 00:53:25,440 --> 00:53:27,719 Speaker 1: And I would quibble over the word fringe because I 986 00:53:27,760 --> 00:53:31,000 Speaker 1: don't think it was very fringe. Um, But otherwise the 987 00:53:31,040 --> 00:53:33,960 Speaker 1: data is interesting, you know. Like, ten percent is a lot of 988 00:53:34,000 --> 00:53:37,040 Speaker 1: people. It's not the fringe. He's saying the incitement 989 00:53:37,080 --> 00:53:39,560 Speaker 1: to violence was a fringe of the posts claiming the 990 00:53:39,560 --> 00:53:44,000 Speaker 1: election had been stolen. I disagree with his interpretation of 991 00:53:44,040 --> 00:53:46,480 Speaker 1: that based on my own amount of time that I 992 00:53:46,520 --> 00:53:49,160 Speaker 1: spent looking through these same posts. But whatever, maybe we're 993 00:53:49,160 --> 00:53:51,359 Speaker 1: looking at different posts. You know, there's a lot going 994 00:53:51,400 --> 00:53:55,080 Speaker 1: on on Facebook in those days. Facebook did not... Yeah, 995 00:53:55,280 --> 00:53:57,760 Speaker 1: Facebook did not blow the whistle or sound the alarm 996 00:53:57,880 --> 00:54:00,880 Speaker 1: or do anything but start canceling its emergency procedures. 997 00:54:00,880 --> 00:54:03,560 Speaker 1: They were like, we get it. The critical period's over. 998 00:54:03,600 --> 00:54:08,319 Speaker 1: Everything's gonna be fine and dandy, baby. Um. They thought 999 00:54:08,360 --> 00:54:10,600 Speaker 1: the danger of post election violence was over. And most 1000 00:54:10,680 --> 00:54:12,279 Speaker 1: of all, they thought that if they took action to 1001 00:54:12,320 --> 00:54:15,960 Speaker 1: stop the reach of far right propagandists, then conservatives 1002 00:54:16,000 --> 00:54:19,520 Speaker 1: would complain. As we now know, the most consequential species 1003 00:54:19,520 --> 00:54:22,440 Speaker 1: of disinformation after November two thousand twenty would be the Stop the 1004 00:54:22,480 --> 00:54:25,680 Speaker 1: Steal movement. The idea behind the campaign had its origins 1005 00:54:25,680 --> 00:54:30,080 Speaker 1: in the two thousand sixteen election, as essentially a fundraising grift from Roger Stone. 1006 00:54:30,160 --> 00:54:34,239 Speaker 1: Ali Alexander, who is a shithead, adapted it in 1007 00:54:34,280 --> 00:54:36,680 Speaker 1: the wake of the election, and it wound up being 1008 00:54:36,719 --> 00:54:40,200 Speaker 1: a major inspiration for the January sixth Capitol riot. As 1009 00:54:40,239 --> 00:54:42,920 Speaker 1: we now know from a secret internal report, Facebook was 1010 00:54:42,960 --> 00:54:45,280 Speaker 1: aware of the Stop the Steal movement from the beginning. 1011 00:54:45,440 --> 00:54:48,920 Speaker 1: Quote from Facebook. The first Stop the Steal group emerged 1012 00:54:48,920 --> 00:54:51,239 Speaker 1: on election night. It was flagged for escalation because it 1013 00:54:51,280 --> 00:54:54,000 Speaker 1: contained high levels of hate and violence and incitement, V 1014 00:54:54,160 --> 00:54:56,800 Speaker 1: and I, in the comments. The group was disabled and 1015 00:54:56,840 --> 00:54:59,440 Speaker 1: an investigation was kicked off looking for early signs of 1016 00:54:59,440 --> 00:55:02,240 Speaker 1: coordination and harm across the new Stop the Steal groups 1017 00:55:02,360 --> 00:55:04,799 Speaker 1: that were quickly sprouting up to replace it.
With our 1018 00:55:04,840 --> 00:55:07,800 Speaker 1: early signals, it was unclear that coordination was taking place 1019 00:55:07,920 --> 00:55:10,760 Speaker 1: or that there was enough harm to constitute designating the term. 1020 00:55:10,800 --> 00:55:13,120 Speaker 1: It wasn't until later that it became clear just how 1021 00:55:13,200 --> 00:55:15,480 Speaker 1: much of a focal point the catchphrase would be, and 1022 00:55:15,520 --> 00:55:17,560 Speaker 1: that it would serve as a rallying point around which 1023 00:55:17,600 --> 00:55:22,440 Speaker 1: a movement of violent election delegitimization could coalesce. From the 1024 00:55:22,440 --> 00:55:24,719 Speaker 1: earliest groups, we saw high levels of hate, V and 1025 00:55:24,800 --> 00:55:28,480 Speaker 1: I, and delegitimization combined with meteoric growth rates. Almost 1026 00:55:28,520 --> 00:55:31,080 Speaker 1: all of the fastest growing Facebook groups were Stop the 1027 00:55:31,080 --> 00:55:33,719 Speaker 1: Steal during their peak growth. Because we were looking at 1028 00:55:33,719 --> 00:55:36,879 Speaker 1: each entity individually rather than as a cohesive movement, we 1029 00:55:36,880 --> 00:55:39,440 Speaker 1: were only able to take down individual groups and pages 1030 00:55:39,480 --> 00:55:42,600 Speaker 1: once they exceeded a violation threshold. We were not able 1031 00:55:42,640 --> 00:55:45,399 Speaker 1: to act on simple objects like posts and comments because 1032 00:55:45,440 --> 00:55:48,279 Speaker 1: they individually tended not to violate even if they were 1033 00:55:48,320 --> 00:55:52,880 Speaker 1: surrounded by hate, violence and misinformation. After the Capitol insurrection... Yeah, 1034 00:55:53,120 --> 00:55:55,920 Speaker 1: that is such garbage. I mean, it's like, and I 1035 00:55:56,000 --> 00:56:01,240 wait
I'm going to 1052 00:56:49,760 --> 00:56:52,879 Speaker 1: explain why it's not a good excuse. Starting in December two 1053 00:56:53,120 --> 00:56:56,880 Speaker 1: thousand nineteen and going until May, the Boogaloo movement expanded rapidly 1054 00:56:56,960 --> 00:57:00,920 Speaker 1: in Facebook groups. Incitements to violence semi-regularly got groups nuked, 1055 00:57:00,960 --> 00:57:03,680 Speaker 1: and members adopted new terms in order to avoid getting 1056 00:57:03,719 --> 00:57:06,640 Speaker 1: deplatformed. It became gradually obvious that a number of 1057 00:57:06,640 --> 00:57:09,440 Speaker 1: these groups were cohesive and connected, and this was revealed 1058 00:57:09,440 --> 00:57:11,719 Speaker 1: throughout the year in a string of terrorist attacks by 1059 00:57:11,719 --> 00:57:15,160 Speaker 1: Boogaloo types in multiple states. When these attacks began, Facebook 1060 00:57:15,160 --> 00:57:17,760 Speaker 1: engaged in a much more cohesive and effective campaign to 1061 00:57:17,800 --> 00:57:21,240 Speaker 1: ban Boogaloo groups. The Boogaloo movement and Stop the Steal 1062 00:57:21,280 --> 00:57:24,080 Speaker 1: are of course not one to one analogs, but the 1063 00:57:24,080 --> 00:57:26,720 Speaker 1: fact that this occurred earlier in the same year, resulting 1064 00:57:26,720 --> 00:57:29,840 Speaker 1: in deaths and widespread violence, shows that Facebook fucking 1065 00:57:29,920 --> 00:57:32,760 Speaker 1: knew the stakes. They could have recognized what was going 1066 00:57:32,800 --> 00:57:35,080 Speaker 1: on with the Stop the Steal movement earlier, and they 1067 00:57:35,120 --> 00:57:37,680 Speaker 1: could have recognized that it was likely much more cohesive 1068 00:57:37,720 --> 00:57:40,360 Speaker 1: than it may have seemed. A decision was made not 1069 00:57:40,480 --> 00:57:42,240 Speaker 1: to do this, not to act on what they had 1070 00:57:42,280 --> 00:57:45,200 Speaker 1: learned earlier that year, and I would argue, based on 1071 00:57:45,240 --> 00:57:47,640 Speaker 1: everything else, we know that the reason why this decision 1072 00:57:47,680 --> 00:57:50,720 Speaker 1: was made was primarily political, like they didn't want to 1073 00:57:50,720 --> 00:57:54,360 Speaker 1: piss off conservatives, you know. Yeah, I mean, and that's, 1074 00:57:54,400 --> 00:57:58,800 Speaker 1: that is like a criminal level of negligence. I would argue, 1075 00:57:59,280 --> 00:58:01,480 Speaker 1: that's leaving a loaded gun with a six year old, 1076 00:58:01,520 --> 00:58:04,520 Speaker 1: you know, yeah, and being like, well, I was pretty 1077 00:58:04,520 --> 00:58:07,520 Speaker 1: sure it had a safety. Yeah. I just, I'm like, 1078 00:58:07,680 --> 00:58:11,440 Speaker 1: they like, God, there's just... I miss when they were 1079 00:58:11,520 --> 00:58:18,600 Speaker 1: radicalized by FarmVille. Yeah. White supremacy. Yeah. So my 1080 00:58:18,720 --> 00:58:21,480 Speaker 1: critiques aside, this internal report does provide us with a 1081 00:58:21,480 --> 00:58:23,960 Speaker 1: lot of really useful information, info that would have been 1082 00:58:24,040 --> 00:58:26,480 Speaker 1: very helpful to experts seeking to stop the spread of 1083 00:58:26,520 --> 00:58:29,479 Speaker 1: violent extremism online if they had had it back when 1084 00:58:29,520 --> 00:58:32,840 Speaker 1: Facebook found it out. So it's rad that Facebook absolutely 1085 00:58:32,880 --> 00:58:35,760 Speaker 1: never intended to release any of this. Isn't that cool?
1086 00:58:36,480 --> 00:58:39,320 Speaker 1: And that they were never going to put any of 1087 00:58:39,320 --> 00:58:41,520 Speaker 1: this out? There's like really useful data. I have a 1088 00:58:41,600 --> 00:58:43,120 Speaker 1: quote in here. I don't think I'll read it because 1089 00:58:43,120 --> 00:58:45,120 Speaker 1: it's a bunch of numbers and it's only really interesting 1090 00:58:45,120 --> 00:58:47,720 Speaker 1: to nerds about this, but about like how many of 1091 00:58:47,760 --> 00:58:49,680 Speaker 1: the people who get into Stop the Steal groups 1092 00:58:49,680 --> 00:58:52,000 Speaker 1: come in through like invites, and like how many people 1093 00:58:52,000 --> 00:58:54,320 Speaker 1: are actually responsible for the invites. What the average number 1094 00:58:54,320 --> 00:58:57,880 Speaker 1: of invites for a person is? Like, it's really interesting stuff. 1095 00:58:58,200 --> 00:58:59,480 Speaker 1: I'll have the links for it in there. You can 1096 00:58:59,520 --> 00:59:02,440 Speaker 1: read these internal Facebook posts. But like, you 1097 00:59:02,440 --> 00:59:04,640 Speaker 1: know what, I'll read the quote. Stop the Steal was 1098 00:59:04,680 --> 00:59:08,440 Speaker 1: able to grow rapidly through coordinated group invites. A majority of 1099 00:59:08,480 --> 00:59:11,840 Speaker 1: Stop the Steal joins came through invites. Moreover, these invites 1100 00:59:11,880 --> 00:59:15,560 Speaker 1: were dominated by a handful of super-inviters; a large share of invites came 1101 00:59:15,600 --> 00:59:18,640 Speaker 1: from just three point three percent of inviters. Inviters also 1102 00:59:18,720 --> 00:59:21,600 Speaker 1: tended to be connected to one another through interactions. They 1103 00:59:21,600 --> 00:59:24,360 Speaker 1: comment on, tag, and share one another's content. Out of 1104 00:59:24,400 --> 00:59:27,520 Speaker 1: six thousand, four hundred and fifty high engagers, four thousand 1105 00:59:27,600 --> 00:59:31,200 Speaker 1: and twenty five of them were directly connected to one another, 1106 00:59:31,440 --> 00:59:34,320 Speaker 1: meaning they interacted with one another's content or messaged one 1107 00:59:34,320 --> 00:59:38,440 Speaker 1: another. When using the full information, around seventy percent connected 1108 00:59:38,480 --> 00:59:40,520 Speaker 1: to one another. This suggests that the bulk of the 1109 00:59:40,560 --> 00:59:42,600 Speaker 1: Stop the Steal amplification was happening as part of a 1110 00:59:42,640 --> 00:59:45,080 Speaker 1: cohesive movement. This would have been great data to have 1111 00:59:45,240 --> 00:59:49,600 Speaker 1: in January, right? That would have been really good to know. Yeah. 1112 00:59:49,960 --> 00:59:53,720 Speaker 1: That is, speaking as a guy who does this professionally, 1113 00:59:53,760 --> 00:59:56,360 Speaker 1: that would have been great to have. But this is 1114 00:59:56,400 --> 01:00:00,280 Speaker 1: all just internal. Like, okay, so we know, so we 1115 01:00:00,400 --> 01:00:02,919 Speaker 1: know this, let us never speak of it again. 1116 01:00:02,960 --> 01:00:06,800 Speaker 1: Problem is, yeah, now we'll deny it to anyone who 1117 01:00:06,840 --> 01:00:09,600 Speaker 1: alleges this while we have this excellent data that we 1118 01:00:09,640 --> 01:00:13,680 Speaker 1: will not hand out because we're pieces of shit. Yeah yeah. 1119 01:00:14,160 --> 01:00:17,400 Speaker 1: Come January sixth,
Facebook employees were as horrified as anyone 1120 01:00:17,400 --> 01:00:20,160 Speaker 1: else by what happened in the Capitol. This horror was 1121 01:00:20,160 --> 01:00:22,960 Speaker 1: tweaked up several degrees by the undeniable fact that their 1122 01:00:23,000 --> 01:00:25,160 Speaker 1: work for the social network had helped to enable it. 1123 01:00:25,240 --> 01:00:29,360 Speaker 1: Wait, that it was their fault? Yeah, in that... Yeah, 1124 01:00:29,440 --> 01:00:31,640 Speaker 1: in the same way that, like, when I finish having 1125 01:00:31,680 --> 01:00:34,520 Speaker 1: a gasoline and match fight in my neighbor's house and 1126 01:00:34,560 --> 01:00:39,200 Speaker 1: then an inexplicable tragedy ensues, I can't help but feel somewhat responsible, 1127 01:00:39,280 --> 01:00:42,200 Speaker 1: you know. Wait, hold on, I'm feeling this vague melancholy, 1128 01:00:42,360 --> 01:00:44,080 Speaker 1: and I know it's not my fault. Don't worry. I 1129 01:00:44,120 --> 01:00:47,960 Speaker 1: know it's not my fault. Um, but I feel that way. No, 1130 01:00:48,120 --> 01:00:49,680 Speaker 1: you just lit a match. I mean, I think we 1131 01:00:49,720 --> 01:00:53,440 Speaker 1: can hold the match accountable. The match was responsible. The match, 1132 01:00:53,560 --> 01:00:55,760 Speaker 1: the neighbor for having a house. A lot of people 1133 01:00:55,760 --> 01:00:59,880 Speaker 1: are to blame here. Them. That's on them exactly. So 1134 01:01:00,280 --> 01:01:03,560 Speaker 1: they're all horrified. Everybody's horrified. Much of the riot itself 1135 01:01:03,600 --> 01:01:06,680 Speaker 1: was broadcast in a series of Facebook live streams as 1136 01:01:06,720 --> 01:01:09,800 Speaker 1: Mike Pence's Secret Service detail scrambled to extricate him from 1137 01:01:09,800 --> 01:01:12,840 Speaker 1: the Capitol grounds. Facebook employees tried to enact their break 1138 01:01:12,880 --> 01:01:16,600 Speaker 1: the glass emergency measures, originally conceived for the immediate 1139 01:01:16,640 --> 01:01:20,240 Speaker 1: post-election period. This was too little, too late. That evening, 1140 01:01:20,320 --> 01:01:23,680 Speaker 1: Mark Zuckerberg posted a message on Facebook's internal messaging system 1141 01:01:23,720 --> 01:01:26,880 Speaker 1: with the title Employee FYI. He claimed to 1142 01:01:26,880 --> 01:01:29,440 Speaker 1: be horrified about what had happened and reiterated his company's 1143 01:01:29,440 --> 01:01:33,120 Speaker 1: commitment to democracy. Chief technology officer Mike Schroepfer, one of 1144 01:01:33,160 --> 01:01:36,800 Speaker 1: the most internally respected members of Facebook's leadership, also made 1145 01:01:36,800 --> 01:01:39,400 Speaker 1: a post asking employees to hang in there while the 1146 01:01:39,440 --> 01:01:43,520 Speaker 1: company decided on its next steps. The theme from the 1147 01:01:43,560 --> 01:01:49,120 Speaker 1: Trolls song, like when Jeffrey Katzenberg fired the entire... Hang 1148 01:01:49,200 --> 01:01:52,320 Speaker 1: in there, folks, hang in there, you're hanging in there. 1149 01:01:52,400 --> 01:01:57,120 Speaker 1: Here's an amazing song by Miss Anna Kendrick. That's right, 1150 01:01:57,160 --> 01:02:00,200 Speaker 1: that's what he did. Uh. So he tells them, 1151 01:02:00,440 --> 01:02:02,640 Speaker 1: uh, and an employee responds, we have been hanging in 1152 01:02:02,680 --> 01:02:04,920 Speaker 1: there for years.
We must demand more action from our 1153 01:02:05,000 --> 01:02:08,400 Speaker 1: leaders. At this point, faith alone is not sufficient. Another 1154 01:02:08,440 --> 01:02:11,440 Speaker 1: message was more pointed: all due respect, but haven't we 1155 01:02:11,480 --> 01:02:13,920 Speaker 1: had enough time to figure out how to manage discourse 1156 01:02:13,960 --> 01:02:16,680 Speaker 1: without enabling violence? We've been fueling this fire for a 1157 01:02:16,720 --> 01:02:18,920 Speaker 1: long time, and we shouldn't be surprised it's now out 1158 01:02:18,960 --> 01:02:24,280 Speaker 1: of control. The Atlantic. Yeah, fair. The Atlantic, the poster 1159 01:02:24,440 --> 01:02:26,480 Speaker 1: with a little kitten hanging from the branch that says 1160 01:02:26,560 --> 01:02:29,800 Speaker 1: we have been hanging in there for years? Yeah? Or 1161 01:02:29,800 --> 01:02:31,800 Speaker 1: how about, we have been fueling this fire for a 1162 01:02:31,840 --> 01:02:34,200 Speaker 1: long time and shouldn't be surprised it's now out 1163 01:02:34,200 --> 01:02:36,080 Speaker 1: of control. We could do that with the this is 1164 01:02:36,120 --> 01:02:40,520 Speaker 1: fine meme, guy sitting in the fire. This is on us. 1165 01:02:40,640 --> 01:02:44,760 Speaker 1: You know, we shouldn't be, this isn't surprising. We had 1166 01:02:44,800 --> 01:02:47,880 Speaker 1: ample warning of the fire. But hang in there. Hang 1167 01:02:47,920 --> 01:02:50,880 Speaker 1: in there, kiddos. So I think The Atlantic has done 1168 01:02:50,880 --> 01:02:53,560 Speaker 1: some of the best reporting I found on this particular 1169 01:02:53,640 --> 01:02:56,320 Speaker 1: moment when Mark and Schroepfer get up and, like, say, 1170 01:02:56,560 --> 01:02:59,200 Speaker 1: don't we like hanging in there? We love democracy. And, like, 1171 01:02:59,400 --> 01:03:03,440 Speaker 1: people go off on them. January 6, like, there are 1172 01:03:03,440 --> 01:03:06,320 Speaker 1: people who were, like, a little more open about the frustration 1173 01:03:06,360 --> 01:03:08,760 Speaker 1: they felt about all this stuff then, and then they 1174 01:03:08,760 --> 01:03:12,680 Speaker 1: stopped being that open. Um. It's frustrating, and everybody's treating 1175 01:03:12,720 --> 01:03:15,439 Speaker 1: these people with kid gloves, whatever. The Atlantic has done 1176 01:03:15,440 --> 01:03:18,480 Speaker 1: really good reporting on this, on this exact moment, which 1177 01:03:18,480 --> 01:03:20,560 Speaker 1: seems to have been kind of a dam-breaking situation 1178 01:03:20,640 --> 01:03:23,760 Speaker 1: for unrest within the company. One employee wrote, what are 1179 01:03:23,800 --> 01:03:26,240 Speaker 1: we going to do differently to make the future better? 1180 01:03:26,320 --> 01:03:28,880 Speaker 1: If the answer is nothing, then we're just that apocryphal 1181 01:03:28,880 --> 01:03:32,520 Speaker 1: Einstein quote on the definition of insanity. Another added, to 1182 01:03:32,600 --> 01:03:35,280 Speaker 1: Mike Schroepfer: please understand, I think you are a 1183 01:03:35,320 --> 01:03:37,520 Speaker 1: great person and genuinely one of the good people in 1184 01:03:37,600 --> 01:03:40,040 Speaker 1: leadership that keeps me here. But we cannot deal with 1185 01:03:40,080 --> 01:03:43,720 Speaker 1: fundamentally bad faith actors by using good faith solutions.
Um, 1186 01:03:44,600 --> 01:03:46,439 Speaker 1: that's a good way to put it. But yeah, yeah, I would 1187 01:03:46,480 --> 01:03:50,080 Speaker 1: also like Democratic leadership to know that. But, well, let's 1188 01:03:50,080 --> 01:03:51,840 Speaker 1: not set the bar too high. They're, they're not going 1189 01:03:51,880 --> 01:03:54,480 Speaker 1: to figure that out. Now, in the wake of January 6, 1190 01:03:54,600 --> 01:03:58,280 Speaker 1: an awful lot of people, me included, exhaustively documented Facebook's 1191 01:03:58,280 --> 01:04:01,400 Speaker 1: contribution to the attack and criticized the company for enabling political 1192 01:04:01,440 --> 01:04:04,400 Speaker 1: violence on a grand scale. The company responded the way 1193 01:04:04,440 --> 01:04:08,040 Speaker 1: they always respond: with lies. Mark Zuckerberg told Congress in 1194 01:04:08,080 --> 01:04:10,520 Speaker 1: March that Facebook quote did our part to secure the 1195 01:04:10,560 --> 01:04:13,840 Speaker 1: integrity of our election. Sheryl Sandberg, boss girl and 1196 01:04:13,960 --> 01:04:17,120 Speaker 1: chief operating officer for the company, claimed in mid-January 1197 01:04:17,160 --> 01:04:20,640 Speaker 1: that the Capitol, yeah, claimed in mid-January that the 1198 01:04:20,680 --> 01:04:23,919 Speaker 1: Capitol riot was quote largely organized on platforms that don't 1199 01:04:23,960 --> 01:04:27,919 Speaker 1: have our abilities to stop hate. I mean, Robert, first 1200 01:04:27,920 --> 01:04:31,000 Speaker 1: of all, as you know, as a big Sandberg advocate, 1201 01:04:31,400 --> 01:04:33,600 Speaker 1: you can't talk about her that way, because she told 1202 01:04:33,920 --> 01:04:38,480 Speaker 1: women that they should negotiate their own salary. You fucking loser. 1203 01:04:39,000 --> 01:04:42,640 Speaker 1: Did you ever think about negotiating your own salary? Fucking 1204 01:04:42,720 --> 01:04:48,080 Speaker 1: dweeb. Dollars. And I love that. I do love that too. 1205 01:04:48,160 --> 01:04:51,320 Speaker 1: That's my favorite thing that she did. So Sandberg is 1206 01:04:51,320 --> 01:04:53,920 Speaker 1: a lot smarter than Mark Zuckerberg, and her statement was 1207 01:04:53,960 --> 01:04:56,400 Speaker 1: the very clever sort of not-technically-a-lie that 1208 01:04:56,440 --> 01:04:59,919 Speaker 1: spreads a lot more disinformation than just a normal lie. Whatever, man. 1209 01:05:00,400 --> 01:05:03,240 Speaker 1: Because it's technically true that more actual organizing for the 1210 01:05:03,360 --> 01:05:05,600 Speaker 1: riot was done in places like Parler as well as 1211 01:05:05,600 --> 01:05:08,479 Speaker 1: more private messaging apps. But it's also a lie, because 1212 01:05:08,520 --> 01:05:11,160 Speaker 1: the organizing of the actual movement that spawned the riot, 1213 01:05:11,240 --> 01:05:14,480 Speaker 1: the Stop the Steal shit, that was almost entirely Facebook. 1214 01:05:14,520 --> 01:05:17,600 Speaker 1: So like, yeah, people didn't like go and open Facebook 1215 01:05:17,600 --> 01:05:19,680 Speaker 1: groups and do, like, most of the people didn't, like, 1216 01:05:19,720 --> 01:05:21,400 Speaker 1: go in there and be like, okay, we're doing a caravan, 1217 01:05:21,440 --> 01:05:25,000 Speaker 1: although some people did, and we have quotes of that.
Um, 1218 01:05:25,040 --> 01:05:27,000 Speaker 1: a lot of that happened in other apps, but like, 1219 01:05:27,280 --> 01:05:30,720 Speaker 1: the overall movement wouldn't have been in those other apps if they had not first 1220 01:05:30,760 --> 01:05:34,000 Speaker 1: been on Facebook. No, yeah, that's what I'm saying. Like, 1221 01:05:34,040 --> 01:05:36,960 Speaker 1: that's why she's smarter than him, because Mark Zuckerberg is 1222 01:05:37,000 --> 01:05:41,520 Speaker 1: just lying to Congress, um, because they didn't... Sheryl Sandberg 1223 01:05:41,600 --> 01:05:46,040 Speaker 1: is being very intelligent, um, and also kind of backhandedly 1224 01:05:46,120 --> 01:05:51,520 Speaker 1: complimenting Facebook in its hour of most blatant failure within 1225 01:05:51,560 --> 01:05:54,960 Speaker 1: the United States, at least. Not most blatant failure overall. That 1226 01:05:54,960 --> 01:05:57,640 Speaker 1: would be Myanmar. I mean, check the date there. Listen, 1227 01:05:57,640 --> 01:05:59,520 Speaker 1: you know, check the date we recorded this at. We may 1228 01:05:59,560 --> 01:06:01,880 Speaker 1: have had an ethnic cleansing enabled by Facebook by the 1229 01:06:01,920 --> 01:06:06,920 Speaker 1: time this episode drops. Yeah. As of this recording, of 1230 01:06:06,960 --> 01:06:09,840 Speaker 1: the non-ethnic cleansings, and also the things that have 1231 01:06:09,880 --> 01:06:11,720 Speaker 1: been done in the United States, the worst things that 1232 01:06:11,760 --> 01:06:14,760 Speaker 1: Facebook did, this tops the list. It does not top the 1233 01:06:14,760 --> 01:06:16,920 Speaker 1: list of their overall crimes, which include the deaths of 1234 01:06:16,920 --> 01:06:21,960 Speaker 1: tens of thousands of people. True. Yeah, good stuff, good stuff. 1235 01:06:22,080 --> 01:06:24,840 Speaker 1: I just, I felt like the need to celebrate. I do. 1236 01:06:25,040 --> 01:06:29,000 Speaker 1: Oh, those were nice. Those are nice. My crush used 1237 01:06:29,000 --> 01:06:30,880 Speaker 1: to send me those in high school. And now that 1238 01:06:30,920 --> 01:06:33,600 Speaker 1: website is responsible for the deaths of tens of thousands 1239 01:06:33,600 --> 01:06:37,520 Speaker 1: of people. Yeah, it's, it's good stuff. So what I 1240 01:06:37,560 --> 01:06:40,320 Speaker 1: find so utterly fascinating about the leaks we have of 1241 01:06:40,400 --> 01:06:43,040 Speaker 1: Facebook employees responding to their bosses on the evening of 1242 01:06:43,120 --> 01:06:46,520 Speaker 1: January 6 is that it makes it irrevocably, undeniably clear 1243 01:06:46,560 --> 01:06:49,760 Speaker 1: that Zuckerberg and Sandberg and every other Facebook mouthpiece lied 1244 01:06:49,920 --> 01:06:52,200 Speaker 1: when they claimed the company had no responsibility for the 1245 01:06:52,240 --> 01:06:55,800 Speaker 1: violence on January 6. The people who worked for Facebook 1246 01:06:56,000 --> 01:06:59,640 Speaker 1: on the day it happened immediately blamed their own company 1247 01:06:59,640 --> 01:07:03,720 Speaker 1: for the carnage. Quote: really do appreciate this response, and 1248 01:07:03,800 --> 01:07:05,960 Speaker 1: I can imagine the stress leadership is under, but this 1249 01:07:06,000 --> 01:07:08,400 Speaker 1: feels like the only avenue where we opt to compare 1250 01:07:08,400 --> 01:07:10,919 Speaker 1: ourselves to other companies, rather than taking our own lead.
1251 01:07:11,160 --> 01:07:14,240 Speaker 1: If our headsets shocked, if our headsets shocked someone, would 1252 01:07:14,240 --> 01:07:16,720 Speaker 1: we say, well, it's still much better than PlayStation VR 1253 01:07:16,880 --> 01:07:20,200 Speaker 1: and it's unprecedented technology? I feel otherwise, but it's 1254 01:07:20,200 --> 01:07:22,520 Speaker 1: simply not enough to say we're adapting, because we should 1255 01:07:22,520 --> 01:07:25,760 Speaker 1: have adapted already long ago. The atrophy occurs when people 1256 01:07:25,800 --> 01:07:28,280 Speaker 1: know how to circumvent our policies and we're too reactive 1257 01:07:28,320 --> 01:07:30,920 Speaker 1: to stay ahead. There were dozens of Stop the Steal 1258 01:07:30,920 --> 01:07:33,760 Speaker 1: groups active until recently, and I doubt they minced words 1259 01:07:33,760 --> 01:07:38,160 Speaker 1: about their intentions. Again, hugely appreciate your response and 1260 01:07:38,200 --> 01:07:40,680 Speaker 1: the dialogue, but I'm simply exhausted by the weight here. 1261 01:07:40,920 --> 01:07:44,640 Speaker 1: We're at Facebook, not some naive startup. With the unprecedented 1262 01:07:44,640 --> 01:07:47,640 Speaker 1: resources we have, we should do better. Yeah, but I've... 1263 01:07:47,920 --> 01:07:50,640 Speaker 1: that's like part of the Zuckerberg ethos, is to continue 1264 01:07:50,680 --> 01:07:56,840 Speaker 1: to behave like he's fast and... yeah, or, well, or 1265 01:07:57,000 --> 01:08:01,040 Speaker 1: more iconically, um, what, what is the quote that's 1266 01:08:01,040 --> 01:08:09,080 Speaker 1: on my shirt? Um? Oh, and that's how I live 1267 01:08:09,160 --> 01:08:16,639 Speaker 1: my life, ha ha. The key, and he's still like, yeah, 1268 01:08:16,760 --> 01:08:19,400 Speaker 1: I mean, and I feel like that is, whatever, I'm 1269 01:08:19,439 --> 01:08:21,840 Speaker 1: sure that does say so much about him, because, like, 1270 01:08:22,000 --> 01:08:24,400 Speaker 1: a good person can say, you can break the law 1271 01:08:24,439 --> 01:08:26,559 Speaker 1: and not be unethical, and that's how I live my life, 1272 01:08:26,640 --> 01:08:30,680 Speaker 1: and that's fine, because the law's generally trash. Zuckerberg is 1273 01:08:30,720 --> 01:08:32,960 Speaker 1: specifically saying, I get to be a piece of shit 1274 01:08:33,000 --> 01:08:34,960 Speaker 1: as long as I don't technically break the law. And 1275 01:08:34,960 --> 01:08:37,599 Speaker 1: because I have money, I'm never technically breaking the law. 1276 01:08:37,960 --> 01:08:41,600 Speaker 1: Is that not sweet? And then don't forget: ha ha 1277 01:08:40,840 --> 01:08:44,960 Speaker 1: ha ha. You can be unethical and still be legal. 1278 01:08:45,080 --> 01:08:46,720 Speaker 1: That's the way I live my life. Ha ha. I'll 1279 01:08:46,720 --> 01:08:49,479 Speaker 1: never forget, I'll never, never, never forget. I'm getting that 1280 01:08:49,520 --> 01:08:53,960 Speaker 1: tattooed right above my come back with a warrant tramp stamp. 1281 01:08:54,479 --> 01:08:57,639 Speaker 1: Yeah, and, yeah, and then, and I would also recommend 1282 01:08:57,680 --> 01:08:59,720 Speaker 1: getting it on, on the other side, right next to 1283 01:08:59,800 --> 01:09:02,320 Speaker 1: your phoenix that I know you're, you're planning out. Oh, 1284 01:09:02,400 --> 01:09:07,080 Speaker 1: full back. It's actually gonna be a perfect replica of 1285 01:09:07,240 --> 01:09:09,559 Speaker 1: the tattoo that Ben Affleck has.
And then over my 1286 01:09:09,640 --> 01:09:13,639 Speaker 1: chest a perfect photorealistic tattoo of Ben Affleck picking 1287 01:09:13,720 --> 01:09:16,200 Speaker 1: up Dunkin' Donuts and looking like he's just watched his 1288 01:09:16,240 --> 01:09:21,080 Speaker 1: dog shoot itself. No, I don't... I, like, appreciate his 1289 01:09:21,120 --> 01:09:24,200 Speaker 1: devotion to Dunkin' Donuts. I don't know how he, I mean, 1290 01:09:24,240 --> 01:09:26,160 Speaker 1: I guess he's just tired, because I'm like, I don't 1291 01:09:26,160 --> 01:09:28,120 Speaker 1: look that way at Dunkin' Donuts. Every picture I've seen of 1292 01:09:28,160 --> 01:09:30,679 Speaker 1: myself at Dunkin' Donuts, I look so happy to be there. 1293 01:09:30,760 --> 01:09:32,840 Speaker 1: How could you not be thrilled? I don't know. I 1294 01:09:32,880 --> 01:09:34,800 Speaker 1: don't know. I mean, I'd say it's Boston, but you're 1295 01:09:34,840 --> 01:09:37,280 Speaker 1: from Boston, right? I'm from Boston and I'm so happy 1296 01:09:37,320 --> 01:09:39,519 Speaker 1: to be there. Yeah. People keep saying, no, he doesn't 1297 01:09:39,520 --> 01:09:42,160 Speaker 1: look miserable, he just looks like he's from Boston, and 1298 01:09:42,320 --> 01:09:46,640 Speaker 1: I think he just, as to that, I don't go 1299 01:09:46,680 --> 01:09:49,280 Speaker 1: to other girls, Robert. I don't know if you know that. 1300 01:09:49,360 --> 01:09:51,800 Speaker 1: I'm not like other girls. So I'm, I'm happy at 1301 01:09:51,840 --> 01:09:56,000 Speaker 1: the Dunkin' Donuts. Okay, okay, fair enough. Um, so yeah, 1302 01:09:56,200 --> 01:09:58,200 Speaker 1: I don't know. At the beginning, I talked about the 1303 01:09:58,200 --> 01:10:00,639 Speaker 1: fact that I have said I think working for Facebook 1304 01:10:00,680 --> 01:10:04,160 Speaker 1: is an amoral activity today, given what's known. That said, 1305 01:10:04,280 --> 01:10:06,799 Speaker 1: there were some points made during this employee bitch session 1306 01:10:06,880 --> 01:10:09,400 Speaker 1: that do make me kind of hesitant to suggest employees 1307 01:10:09,479 --> 01:10:13,040 Speaker 1: just bounce from the company en masse. Quote: please continue 1308 01:10:13,080 --> 01:10:15,679 Speaker 1: to fight for us, Schrep. This is the person talking 1309 01:10:15,680 --> 01:10:22,320 Speaker 1: to the, I'm sorry, you can hear, you're hearing... I know, 1310 01:10:22,479 --> 01:10:25,920 Speaker 1: I know, I know. Facebook Engineering needs you representing our 1311 01:10:25,960 --> 01:10:28,559 Speaker 1: ethical standards in the highest levels of leadership, lest 1312 01:10:28,600 --> 01:10:31,519 Speaker 1: Zuck's products be built by a cast of mercenaries and ghouls. 1313 01:10:31,560 --> 01:10:34,479 Speaker 1: We need to employ thousands of thoughtful, caring engineers, and 1314 01:10:34,479 --> 01:10:36,639 Speaker 1: it will be difficult to continue to hire and retain 1315 01:10:36,680 --> 01:10:40,920 Speaker 1: them on our present course. That's not a terrible point. Okay, 1316 01:10:42,080 --> 01:10:45,800 Speaker 1: that feels like a half step. Yeah. Yeah. It's one 1317 01:10:45,840 --> 01:10:48,040 Speaker 1: of those things where, like, on one hand, it is 1318 01:10:48,160 --> 01:10:50,760 Speaker 1: bad to work for a company whose entire job is 1319 01:10:50,800 --> 01:10:53,120 Speaker 1: to do harm at scale, which is what Facebook does.
1320 01:10:53,160 --> 01:10:55,280 Speaker 1: On the other hand, if they are replaced by people 1321 01:10:55,320 --> 01:10:58,679 Speaker 1: who don't have any ethical standards at all, that also 1322 01:10:59,000 --> 01:11:03,800 Speaker 1: probably isn't great. Now, I don't agree with that. Yeah, 1323 01:11:04,040 --> 01:11:07,599 Speaker 1: it's, it's complicated, because, like, I guess you could argue that, like, well, 1324 01:11:07,640 --> 01:11:09,680 Speaker 1: if all of the good engineers leave and they have 1325 01:11:09,760 --> 01:11:13,200 Speaker 1: to hire the ghouls, like, it'll fall apart eventually. And 1326 01:11:13,200 --> 01:11:14,800 Speaker 1: I guess it's the question of, like, when does the 1327 01:11:15,400 --> 01:11:19,919 Speaker 1: damage done by Facebook, like, fading in popularity hopefully eclipse 1328 01:11:19,960 --> 01:11:22,360 Speaker 1: the damage done by the fact that everyone working there 1329 01:11:22,520 --> 01:11:25,920 Speaker 1: is the Blackwater equivalent of a guy coding a fucking algorithm. 1330 01:11:25,960 --> 01:11:29,840 Speaker 1: Like, I don't know, it's, yeah, it's whatever, it's a, 1331 01:11:30,240 --> 01:11:34,400 Speaker 1: it's just something to think about, I guess. Um, yeah, yeah, 1332 01:11:34,760 --> 01:11:37,240 Speaker 1: Facebook is having a hiring crisis right now. I think 1333 01:11:37,280 --> 01:11:39,400 Speaker 1: it's gotten a little better recently. Um, but they've had 1334 01:11:39,479 --> 01:11:42,240 Speaker 1: massive shortages of engineers and failed to meet their hiring 1335 01:11:42,280 --> 01:11:44,760 Speaker 1: goals in two thousand nineteen and twenty twenty. I don't know if 1336 01:11:44,760 --> 01:11:46,680 Speaker 1: they're gonna, if they have, if it's gotten better this 1337 01:11:46,760 --> 01:11:49,599 Speaker 1: year or not. Um, I don't know how much any 1338 01:11:49,640 --> 01:11:52,559 Speaker 1: of that's gonna, like, help, uh, matters. I don't know. 1339 01:11:52,680 --> 01:11:55,559 Speaker 1: It seems unlikely that anything will get better anytime soon. I mean, yeah, 1340 01:11:55,600 --> 01:11:57,519 Speaker 1: I think the real solution is to make the company 1341 01:11:58,080 --> 01:12:02,640 Speaker 1: run by even worse people who are less qualified. That, I 1342 01:12:02,640 --> 01:12:05,640 Speaker 1: don't know, that doesn't, that doesn't sound great either. I 1343 01:12:05,680 --> 01:12:08,719 Speaker 1: don't know. I think there are volcanoes, and that probably 1344 01:12:08,760 --> 01:12:11,320 Speaker 1: has part of the solution to the Facebook problem in it. 1345 01:12:11,360 --> 01:12:15,240 Speaker 1: There you go, cast their servers into the fires. So 1346 01:12:15,479 --> 01:12:18,280 Speaker 1: Mark Zuckerberg and his fellow nerd horsemen of the apocalypse 1347 01:12:18,320 --> 01:12:21,000 Speaker 1: have basically built a gun that fires automatically into a 1348 01:12:21,000 --> 01:12:24,240 Speaker 1: crowd called society every couple of seconds. The engineers are 1349 01:12:24,240 --> 01:12:29,640 Speaker 1: the people who keep the gun loaded. It's good. Yes. 1350 01:12:29,680 --> 01:12:32,519 Speaker 1: So the engineers, they keep the gun loaded, but also 1351 01:12:32,640 --> 01:12:34,880 Speaker 1: sometimes they jerk it away from shooting a child in 1352 01:12:34,920 --> 01:12:37,160 Speaker 1: the face, and if they leave, the gun 1353 01:12:37,240 --> 01:12:38,800 Speaker 1: might run out of bullets.
But it's just going to 1354 01:12:38,920 --> 01:12:41,640 Speaker 1: keep shooting straight into the crowd until that point. So 1355 01:12:41,680 --> 01:12:46,120 Speaker 1: maybe it's better to have engineers. Yeah, I had to 1356 01:12:46,200 --> 01:12:49,520 Speaker 1: end with a metaphor, um. I don't know, it's complicated, 1357 01:12:49,560 --> 01:12:52,320 Speaker 1: whatever. I want to end this episode... Yeah, it's, it's 1358 01:12:52,320 --> 01:12:54,080 Speaker 1: just a mess. It's a messy thing to think about. 1359 01:12:54,200 --> 01:12:56,479 Speaker 1: We should never have let it get this far. We've 1360 01:12:56,479 --> 01:12:58,200 Speaker 1: been putting fuel on the fire for a while. We 1361 01:12:58,200 --> 01:13:01,600 Speaker 1: shouldn't be surprised that it's burning everything. Truly, it 1362 01:13:01,640 --> 01:13:04,240 Speaker 1: has become, it feels like it is slowly becoming, just 1363 01:13:04,280 --> 01:13:08,400 Speaker 1: like an annual document drop of, like, yeah, things have 1364 01:13:09,320 --> 01:13:14,280 Speaker 1: steadily gotten worse. Yeah, the Hell Company is pretty shitty here. Yeah, 1365 01:13:14,400 --> 01:13:21,960 Speaker 1: Hell Company's bad. Stuff at Nightmare Corp: bad. It is 1366 01:13:22,040 --> 01:13:25,080 Speaker 1: really bad. It's funny how bad it is. I want 1367 01:13:25,080 --> 01:13:28,040 Speaker 1: to end this episode with my very favorite response from 1368 01:13:28,080 --> 01:13:30,759 Speaker 1: a Facebook employee to that message from CTO Schroepfer. 1369 01:13:31,400 --> 01:13:34,320 Speaker 1: Facebook employee, if you happen to be listening to this episode, 1370 01:13:34,360 --> 01:13:36,680 Speaker 1: please hit me up, because I love you. Here it 1371 01:13:36,760 --> 01:13:39,799 Speaker 1: is: never forget the day Trump rode down the escalator 1372 01:13:39,800 --> 01:13:42,160 Speaker 1: in two thousand fifteen and called for a ban on Muslims 1373 01:13:42,240 --> 01:13:45,200 Speaker 1: entering the US, and we determined that it violated our policies, 1374 01:13:45,240 --> 01:13:48,280 Speaker 1: and yet we explicitly overrode the policy and didn't take 1375 01:13:48,320 --> 01:13:50,680 Speaker 1: the video down. There is a straight line that can 1376 01:13:50,720 --> 01:13:52,880 Speaker 1: be drawn from that day to today, one of the 1377 01:13:52,960 --> 01:13:55,720 Speaker 1: darkest days in the history of democracy and self-governance. 1378 01:13:55,920 --> 01:13:57,760 Speaker 1: Would it have made a difference in the end? We 1379 01:13:57,800 --> 01:14:02,320 Speaker 1: can never know, but history will not judge us kindly. Wow. Yeah, yeah, 1380 01:14:02,400 --> 01:14:06,280 Speaker 1: they know what they're doing. They know exactly what they're doing, 1381 01:14:06,360 --> 01:14:10,240 Speaker 1: and we get an annual reminder, we get an annual 1382 01:14:10,360 --> 01:14:15,439 Speaker 1: little pelican summit drop of documents saying that, no, they 1383 01:14:15,560 --> 01:14:17,800 Speaker 1: still know what they're doing. They have not forgotten what 1384 01:14:17,840 --> 01:14:21,439 Speaker 1: they're doing. I might suggest that overtakes are we the 1385 01:14:21,479 --> 01:14:24,240 Speaker 1: baddies as, like, the moment you know things need to change. 1386 01:14:24,240 --> 01:14:26,960 Speaker 1: When, when you're like, boy, I think history might judge 1387 01:14:26,960 --> 01:14:30,120 Speaker 1: me for my employment decisions.
I think I may be 1388 01:14:30,400 --> 01:14:33,599 Speaker 1: damned by, like, the historians of the future when they 1389 01:14:33,640 --> 01:14:36,760 Speaker 1: analyze my role in society. Right. And it's like, if 1390 01:14:36,800 --> 01:14:42,599 Speaker 1: you're at that place, uh, that's not good. We've 1391 01:14:42,640 --> 01:14:45,000 Speaker 1: passed the point of no return, you know, like five 1392 01:14:45,080 --> 01:14:48,560 Speaker 1: years ago, with, with Facebook. It's just, good Lord. I 1393 01:14:48,600 --> 01:14:50,560 Speaker 1: mean, yeah, and I, and I do, like, applaud the 1394 01:14:50,560 --> 01:14:53,879 Speaker 1: whistleblowers and the people who are continuing to drop documents. 1395 01:14:53,920 --> 01:14:56,080 Speaker 1: And at this point it also does just feel like, 1396 01:14:57,280 --> 01:15:00,000 Speaker 1: you know, getting punched in the face repeatedly, because it's like, well, 1397 01:15:00,560 --> 01:15:02,920 Speaker 1: I'm glad that there is the paper trail, I'm glad 1398 01:15:02,960 --> 01:15:05,120 Speaker 1: that there's the evidence, but it's, who at this point 1399 01:15:05,160 --> 01:15:10,160 Speaker 1: is surprised? Like, there's, like, when people... it's whatever. I mean, technically, 1400 01:15:10,160 --> 01:15:12,720 Speaker 1: I think by the definition of the word, these are revelations, 1401 01:15:12,720 --> 01:15:15,800 Speaker 1: but they're also very much not. They're just confirmations of 1402 01:15:15,840 --> 01:15:18,800 Speaker 1: things that appeared very obvious based on the conduct of 1403 01:15:18,840 --> 01:15:24,519 Speaker 1: this company already. Yep. Yeah. Jamie, do you have any 1404 01:15:25,080 --> 01:15:27,920 Speaker 1: plugables for us? Do you, do you perhaps have a 1405 01:15:28,080 --> 01:15:33,400 Speaker 1: Facebook-owned Instagram meta thing? Eh, we don't, we don't need 1406 01:15:33,439 --> 01:15:35,920 Speaker 1: to call it Meta. I, I think no one needs 1407 01:15:35,960 --> 01:15:37,960 Speaker 1: to call it that. There's only one Meta, and it's 1408 01:15:37,960 --> 01:15:45,880 Speaker 1: Metta World Peace. Wow. Basketball player, basketball player Ron 1409 01:15:45,880 --> 01:15:48,000 Speaker 1: Artest changed his name to Metta World Peace, like, many 1410 01:15:48,080 --> 01:15:51,599 Speaker 1: years ago, so he did the Meta first. Okay, alright, 1411 01:15:51,600 --> 01:15:55,559 Speaker 1: the metaverse was already taken. I... you can, you can 1412 01:15:55,960 --> 01:15:58,519 Speaker 1: listen to my podcasts. I got a bunch. You can listen 1413 01:15:58,560 --> 01:16:01,240 Speaker 1: to The Bechdel Cast, My Year in Mensa, Aack Cast, 1414 01:16:01,360 --> 01:16:05,439 Speaker 1: Lolita Podcast, or none. I won't know. I'll know... 1415 01:16:05,720 --> 01:16:08,840 Speaker 1: well, actually, Sophie will know. So maybe you'd better 1416 01:16:08,880 --> 01:16:12,040 Speaker 1: listen to them, or I'll lose my livelihood, which would 1417 01:16:12,080 --> 01:16:15,920 Speaker 1: be, which would be interesting. Uh, you could follow me 1418 01:16:15,960 --> 01:16:21,799 Speaker 1: on Twitter at Jamie Loftus Help, or Instagram at Jamie Christ Superstar, 1419 01:16:21,960 --> 01:16:27,960 Speaker 1: where I bravely continue to use, uh, Zuckerberg's tools of... 1420 01:16:28,000 --> 01:16:30,640 Speaker 1: Yeah, isn't it funny, Jamie, that we call it 1421 01:16:30,680 --> 01:16:33,679 Speaker 1: our livelihood, which is just a nicer way of saying 1422 01:16:33,680 --> 01:16:38,320 Speaker 1: our ability to not starve to death in the street.
Yeah. Yeah, whoever, 1423 01:16:38,640 --> 01:16:43,799 Speaker 1: whoever back in the day rebranded survival to livelihood, really, 1424 01:16:43,840 --> 01:16:48,000 Speaker 1: real genius sleight of hand. Incredible. This is why I 1425 01:16:48,040 --> 01:16:51,080 Speaker 1: think we should have a program in schools where we 1426 01:16:51,200 --> 01:16:53,719 Speaker 1: determine which kids are going to be good at marketing, 1427 01:16:54,160 --> 01:16:55,960 Speaker 1: and then we ship them to an island, and we 1428 01:16:56,040 --> 01:16:58,960 Speaker 1: don't, we don't think about it after that point. Onto 1429 01:16:59,000 --> 01:17:01,760 Speaker 1: the island they go. It's a nice island, like 1430 01:17:01,800 --> 01:17:04,879 Speaker 1: a good one, like a solid island. Marketing the island, 1431 01:17:05,000 --> 01:17:07,680 Speaker 1: you're marketing to the island. That's good. I think we 1432 01:17:08,080 --> 01:17:09,840 Speaker 1: put them on an island and we divert all of 1433 01:17:09,880 --> 01:17:12,639 Speaker 1: the world's military forces to making sure that nothing gets 1434 01:17:12,640 --> 01:17:14,920 Speaker 1: to that island that isn't a marketer, or leaves it, 1435 01:17:15,120 --> 01:17:17,280 Speaker 1: and then make sure that, and, and make fucking sure 1436 01:17:17,320 --> 01:17:19,920 Speaker 1: there's no WiFi signal on the island. Oh, good god. No, 1437 01:17:20,240 --> 01:17:25,040 Speaker 1: absolutely not, absolutely none. Once a day they can 1438 01:17:25,080 --> 01:17:28,200 Speaker 1: watch Shark Tank together, but that's all they're getting. Oh, 1439 01:17:28,320 --> 01:17:30,559 Speaker 1: that sounds, that actually kind of sounds nice, living on 1440 01:17:30,600 --> 01:17:33,320 Speaker 1: an island and watching one episode of Shark Tank a day. 1441 01:17:33,800 --> 01:17:38,080 Speaker 1: That's like, that's my, like, dream lobotomy. That's kind of nice. Yeah, 1442 01:17:38,160 --> 01:17:43,040 Speaker 1: well, that's the episode, y'all. Robert, Jamie, how are you doing? 1443 01:17:43,360 --> 01:17:44,960 Speaker 1: How are you feeling? I feel like, you know, I 1444 01:17:45,439 --> 01:17:49,040 Speaker 1: am feeling just kind of a low thrumming of despair, 1445 01:17:49,560 --> 01:17:51,920 Speaker 1: but, but I'm good. Yeah, that's right, that's how 1446 01:17:51,960 --> 01:17:54,439 Speaker 1: you're supposed to feel. I have felt worse at the 1447 01:17:54,520 --> 01:17:56,360 Speaker 1: end of this show, which I don't know if that 1448 01:17:56,600 --> 01:17:59,880 Speaker 1: says more about, like, my threshold for despair or, you know, 1449 01:18:00,040 --> 01:18:03,679 Speaker 1: happens to be a coincidence, but, you know, I'll say 1450 01:18:03,960 --> 01:18:06,360 Speaker 1: I'm hanging in there. But also I've been hanging in 1451 01:18:06,400 --> 01:18:09,040 Speaker 1: there for years. Yeah, that's all you can do is 1452 01:18:09,080 --> 01:18:11,599 Speaker 1: hang in there. Yeah, keep hanging in. Are you hanging 1453 01:18:11,600 --> 01:18:15,320 Speaker 1: in there, Robert? Allegedly, yeah. I mean, by all accounts, 1454 01:18:15,320 --> 01:18:18,040 Speaker 1: you're hanging in there. But, like, internally, there's no, who 1455 01:18:18,040 --> 01:18:21,240 Speaker 1: could say. No. No, I'm, I'm, I'm...
I'm as unmoored 1456 01:18:21,240 --> 01:18:23,200 Speaker 1: and adrift as the rest of us are in these 1457 01:18:23,360 --> 01:18:25,680 Speaker 1: I'm gonna when I visit, I'm going to show you 1458 01:18:25,720 --> 01:18:29,040 Speaker 1: pin and I really think you're gonna like it. Oh good, Okay, Well, 1459 01:18:29,360 --> 01:18:29,920 Speaker 1: how's