00:00:03,520 Speaker 1: Let's do it. Start the podcast podcast. All right, well, let's have that be what starts the podcast. What we just said: let's start the podcast podcast. Let's start the podcast. Well, I'm Robert Evans. Yep. I'm Sophie Lichterman. I never introduced myself. I'm Jamie. Who are you, Jamie Loftus? Anything more we need to say? Are we done with the episode? Anderson's here? No, I think that... yeah, yeah, Anderson is here. Uh, sure.

Well, you know what's happening in the world. Facebook is happening to the world, and it's not great, Jamie. It's not great. Sophie's not a fan of the Facebook. We left off having gone through some of the Facebook Papers, particularly employees attacking their bosses after January 6, when it became clear that the company they were working for was completely morally indefensible. And attack they did. Yeah, they already knew. I wouldn't call it attacking either. I would call it... they were pretty... I mean, there's the quote there, there was the guy who was like, history won't judge us kindly, and the guy who was like, we didn't ban Trump back in twenty fifteen, that's what caused the Capitol riot. I mean, facts are facts. Is that really attacking if you're just like, well... I think that... yeah, I think stating facts can be an attack. Well, okay, put it on a T-shirt. I mean, for people like this, you know. Um, yeah, I think stating facts can be an attack.

And we ended Part One by sharing some of the blistering criticisms Facebook employees leveled against, you know, management and the service itself. So as we start Part Two, it's only proper that we cover how Facebook responded to all of this internal criticism. As I stated last episode, Facebook is in the midst of a years-long drought of capable engineers and other technical employees. They are having a lot of trouble hiring all of the people that they need for all of the things they're trying to do. Um.
So, one of the things is, for a lot of these employees, when they say things that are deeply critical, Facebook can't just dismiss their concerns outright, because, like, if they were to do that, these people would get angry, and they need them, right? Facebook's not in the strongest position when it comes to the people who are good engineers. They have to walk a little bit of a tightrope. However, if they were to actually do anything about the actual meat of the concerns, it would reduce profitability and in some cases destroy Facebook as it currently exists. So they're not going to do anything, um, which has meant that they've had to get kind of creative with how they respond. So Mark and his fellow bosses pivoted and argued that the damning critiques... like, yeah, old Zucky Zuck. So when this all comes out and people are like, boy, it sure seems like all of your employees know that they're working for the fucking Death Star, um, Zuckerberg and his, like, mouthpieces made a statement that, like, all of these damning critiques from people inside the company were actually evidence of the very open culture inside Facebook, which encouraged workers to share their opinions with management. Um, that's exactly what a company spokesperson told The Atlantic when they asked about comments like history will not judge us kindly. The fact that they're saying we'll be damned by historians means that we really have a healthy office culture. Um. Hashtag Death Star Proud. Death Star Proud, everybody. Yeah. Yeah, it's like, the fact that we removed the stigma of working for the devil, right? I mean, come on, the devil I would be proud to work for, because he's done some cool stuff. Like, have you ever been to Vegas? Nice town. Oh, I've been to Vegas. I've seen... I saw the Backstreet Boys in Vegas right before two of them were revealed to be in QAnon.
So I really caught the end of that. Wow, I did not realize that a sizeable percentage of the Backstreet Boys had gotten into QAnon. That makes total sense. The Backstreet Boys, they're from Florida. They're ultimately five men from Florida. So what can you do?

As the author of that Atlantic article noted, this stance allows Facebook to claim transparency while ignoring the substance of the complaints, and the implication of the complaints, that many of Facebook's employees believe their company operates without a moral compass. All over America, people used Facebook to organize convoys to D.C. and to fill the buses they rented for their trips, and this was indeed done in groups like the Lebanon Maine Truth Seekers, where Kyle Fitzsimons posted the following, quote: this election was stolen, and we're being slow-walked towards Chinese ownership by an establishment that is treasonous and all too willing to gaslight the public into believing the theft was somehow the will of the people. Would there be an interest locally in organizing a caravan to Washington, D.C. for the Electoral College vote count on January 6? Um, yeah. And Kyle recently pleaded not guilty to eight federal charges, including assault on a police officer. Mark Zuckerberg would argue that, like, Facebook didn't play a significant role in organizing January 6, and couldn't have played a significant role in radicalizing this guy and many other people. But the reality is that part of what led Kyle Fitzsimons to go assault people on January 6 was the fact that he had been radicalized by a social network that for years made the conscious choice to amplify angry content and encourage anger, because it kept people on the site more, right? Like, all of the anger that boiled up on January 6 came from a number of places, but one of those places was social media, because social media profited, and specifically, Facebook knowingly profited, from making people angry.
That was the business, and of course it blew up in the real world. I have a question, um, just out of your own experience and observation, which is: if you're doing a side-by-side case study of how Facebook responded to events like this versus how, like, YouTube slash Google responded to radicalization, are there significant differences? Did anyone do better? Twitter has done better than probably most of them. YouTube... I mean, and again, I'm not saying that Twitter has done well, um, or that YouTube has done well. They've both done, particularly with coronavirus disinformation, a bit better than Facebook. Um, and they were better in general on... not really YouTube as much, but, like, Twitter has been the most responsible of the social networks around this stuff. Um. It did seem like for a while there the various networks were kind of, like, duking it out to see who could do the absolute worst and damage the most lives, and it seems like Facebook won. Yes, I would say Facebook. But, and again, Twitter chose to do a lot of the same toxic things Facebook did, so did YouTube, and they did it all for profit. A number of the things we've criticized Facebook for, you can critique YouTube and Twitter for. I would argue Twitter certainly has done more, and more effectively, than Facebook, not enough that they're not being irresponsible, because I would argue that Twitter has actually been extremely irresponsible, and knowingly so. Um. But I think, in my analysis, Facebook has been the worst. Although I haven't studied as much about, like, TikTok yet, so we'll see. But that's my analysis. You've got to get on TikTok, pivot out of podcasting and into TikTok dances. Yeah, I mean, it's not the dances that concern me on TikTok.
It's the minute-long conspiracy theory videos that have convinced a number of people that the Kardashians are Armenian witches and had something to do with the crowd collapse at Astroworld, or the deaths at the Astroworld festival. My concern there is the dances that go over those conspiracy videos and really marry the worst of both worlds. Because I have seen dancing on there. I have seen conspiracy videos that involve dancing. Incredible. And skincare routines. Have you ever seen a conspiracy video where someone's also doing their skincare routine? Because that is a thriving... I'm sure it is, yeah. So, that is just a thing on many platforms. These media companies are willfully bad at stopping radicalization, because making people angry and frightened is good for all of their bottom lines, so they all knowingly participate in this. I think Facebook has been the least responsible about it, but that shouldn't be taken as praise of anybody. Saying Twitter has done the best is saying, like, well, we were all drunk driving, but John could actually walk most of a straight line before vomiting, so he was the least irresponsible of us who drunk drove that night. Just to put it in terms that I understand. It sounds like Twitter is the Backstreet Boy that's like, look, I don't believe in QAnon, but I see their points. That's kind of, um, the vibe I'm getting. Fair enough.

Um. So, when deciding which posts should show up more often in the feeds of other users, Facebook's algorithm weighs a number of factors. The end goal is always the same: to get the most people to spend the most time interacting with the site. For years, this was done by calculating the different reactions a post got and weighting it based on what responses people had to it. Again, for years, the reaction that carried the most weight was anger.
The little snarling smiley face icon you could click under a post: it was at one point being weighted five times more than just, like, a like. Really. Like, again, when I'm saying this was all intentional, they were like, people who respond angrily to posts, that keeps them on the site more. They spend the most time engaging with things that make them angry. So, when it comes to determining by which method we choose to have the algorithm present people with posts, the posts that are making people angriest are the posts our algorithm will send to the most people. That's a conscious choice. That's a conscious... Yeah. It's so funny, how... I mean, not funny, it's tragic and upsetting, but just how specific the Facebook audience is, that it's like, you would have to be the kind of person who would be like, I'd better react as specifically angry as possible in my feedback to this post. Which is FarmVille moms. And yeah, it's boomers, it's boomers. And yeah, um, and yeah, they just kind of knowingly set off a bomb in a lot of people's fucking brains. Um, they're addicted to telling on themselves. Yeah.

So, Facebook has something called the Integrity Department, and these are the people with the unenviable task of trying to fight misinformation and radicalization on the platform. They noted in July... That is so embarrassing. Yeah, just going in on the first day and being like, I work for the Facebook Integrity Department. Like, yeah, good fucking luck. Yeah. My job is to go door to door and apologize to people after we bomb them. We have gift baskets for the survivors, you know. Like, that's the gig, really. Um, yeah, I send edible arrangements to people who have been drone struck. Like, oh Jesus. Awful.
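The weighting being described above amounts to a reaction-weighted ranking score. Here is a minimal Python sketch of that idea; the reaction names, weights, and scoring function are illustrative assumptions, and only the roughly five-times weighting of the angry reaction relative to a like comes from the episode. Facebook's actual News Feed ranking is far more complex and not public.

# Illustrative sketch only: hypothetical reaction weights and a toy scoring function.
# The 5x "angry" weight is the detail cited in the episode; everything else here
# is an assumption, not Facebook's real ranking system.
REACTION_WEIGHTS = {"like": 1.0, "love": 1.0, "haha": 1.0, "wow": 1.0, "sad": 1.0, "angry": 5.0}

def engagement_score(reactions):
    # Sum each reaction count multiplied by its weight.
    return sum(REACTION_WEIGHTS.get(name, 1.0) * count for name, count in reactions.items())

def rank_feed(posts):
    # Order candidate posts by weighted engagement, highest first.
    return sorted(posts, key=lambda p: engagement_score(p["reactions"]), reverse=True)

posts = [
    {"id": "calm_post", "reactions": {"like": 900, "angry": 10}},   # score 950
    {"id": "rage_bait", "reactions": {"like": 200, "angry": 300}},  # score 1700 with 5x anger, 500 without
]
print([p["id"] for p in rank_feed(posts)])  # rage_bait outranks calm_post only because anger is weighted 5x

Setting the angry weight back to 1.0, which is roughly what the integrity team later proposed, flips the ordering in this toy example.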
There's... one of my favorite follows on Twitter is Brooke Binkowski, um, who used to work for Facebook and was, like, one of the people early on who was trying to warn them about disinformation and radicalization on the platform years ago, and left because, like, it was clear they didn't actually give a shit. Um. And a lot of the Integrity Department people are actually, like, really good people who are a little bit optimistic and kind of young and come in like, okay, it's my job to make this huge and important thing a lot safer. Um. And these people get chewed up and spit out very, very quickly. Um. And the members of the Integrity Team were kind of analyzing the impact of weighting angry content so much, and some of them noted in July that the extra weight given to the anger reaction was a huge problem. They recommended the company stop weighting it extra in order to stop the spread of harmful content. Their own tests showed that dialing the weight of anger back to zero, so it was no more influential than any other reaction, would stop rage-inducing content from being shared and spread nearly as widely. This led to a five percent reduction in hate speech, misinformation, bullying, and posts with violent threats. And when you consider how many billions of Facebook posts there are, that's a lot less nasty shit, um, some of which is going to translate into real-world violence. And again, this was kind of a limited study, so who knows how it would have actually affected things in the long run. So Facebook made this... well, less money, question mark? Yeah, this actually was kind of a win for them. Um, Facebook did make this change. They pushed it out in September. Um, and the employees responsible deserve real credit. Again, there's people within Facebook who did things that actually were good. Like, changing this probably made the world a bit healthier.
That said, the fact that it had been weighted this way for years, you don't undo that just by dialing it back now. For one thing, anger has become such an aspect of the culture of Facebook that even without weighting the anger emoji, most of the content that goes viral is still stuff that pisses people off, because that's just become what Facebook is, because that's what they selected for for years. Also, like, who knows, if they'd done this years ago, if they'd never weighted anger more, it might be a very different platform with, like, a very different impact on the brains of, for example, our aunts and uncles. Um, I think that that's really interesting too, because that timeline lines up pretty closely, or pretty exactly, with where it feels like a lot of younger people were leaving that platform and the platform became associated with older people. Because I feel like I don't think I was using Facebook consistently after twenty seventeen. I want to say that was maybe my last Facebook year. Yeah, I stopped. I mean, I stopped visiting it super regularly a while back. Um, yeah, maybe around then.

So in April of twenty twenty, Facebook employees came up with another recommendation. This one wouldn't be as successful as, you know, changing the algorithm's weighting of the angry reaction. Spurred by the lockdown and the sudden surge of QAnon, Boogaloo, and anti-lockdown groups urging real-world violence, it was suggested by internal employees that the News Feed algorithm deprioritize the posting of content based on the behavior of people's Facebook friends. So the basic idea is this, um: normally, the way you'd think it would work, right, is that, like, your friends post something and you see that in your news feed, right? Like, the posts of the people that you've chosen to follow and say are your friends, right? That's how you would want it to work. Is that how it worked?
At one point, they made a change a few years back where they started sending you things not because someone you followed had said something, but because they'd liked a thing, um, or they'd commented. Like, not even commented, just liked a thing. Like, if they'd reacted to a thing, you would get that sent to your news feed. Um. And members of the Integrity Team start to recognize, like, this has some problems in it. For one thing, it results in a lot of people getting exposed to dangerous bullshit. Um. So they start looking into, like, the impact of this, and how just sharing the kind of things your friends are reacting to influences what you see and what that does to you on Facebook. The Integrity Team experimented with how changing this might work, and their early experiments found that fixing this would reduce the spread of violence-inciting content. Um. For one thing, what they found is that, like, normally, if you hadn't seen someone like a post about something that was maybe violent or aggressive or conspiratorial, like a flat-earth post, or a post urging the execution of an elected leader... if you hadn't seen anyone that you knew react to that post, even if you saw it, you wouldn't comment on it or share it. But they found that, like, if you just saw that a friend had liked it, you were more likely to share it, which increases exponentially the spread of this kind of violent content. And it's this idea, like how people stopped being afraid to be racist at a certain point, as much as they had been earlier, and it led to this surge in real-world violence. It's kind of the same thing: by seeing their friends react to this, people felt permission to react to it too,
in a way where maybe before they would have been like, well, I don't want to... like, maybe I'm interested in flat-earth shit, but I'm just going to ignore this because, like, I don't want to seem like a kook. That is so fucking upsetting, and fascinating, in the way that it affects your mind. Yeah. There was a time where, if you were, you know, racist, misogynist, homophobic, whatever you were, you just didn't talk about it. But then all of a sudden, there's this confirmation that, like, hey, this person you know and see all the time feels the same fucking way you do. So why be quiet about it? Let's discuss. Like, it's just... that's so dark. It's really dark.

And so the Integrity Team sees this and they're like, we should change this. Um, we shouldn't be showing people just, like, the reactions their friends have had to content, because it seems to be bad for everybody. Um. And they do find, in some of their... you know, because when they experiment, they're like, we'll take this country or this city and we'll roll this change out in this limited geographic location to, like, try and see how it might work at scale. And they do this, and they see that, like, oh, changing this significantly reduces the spread of specifically violence-inciting content. So they're like, hey, we should roll this out service-wide. Zuckerberg himself steps in, according to Frances Haugen, the whistleblower, and, quote, rejected this intervention that could have reduced the risk of violence in the election. From The Atlantic, quote: an internal message characterizing Zuckerberg's reasoning says he wanted to avoid new features that would get in the way of meaningful social interactions. But according to Facebook's definition, its employees say, engagement is considered meaningful even if it entails bullying, hate speech, and reshares of harmful content.
The episode, like Facebook's response to the incitement that proliferated between the election and January 6, reveals a fundamental problem with the platform. Facebook's mega-scale allows the company to influence the speech and thought patterns of billions of people. What the world is seeing now, through the window provided by reams of internal documents, is that Facebook catalogs and studies the harm it inflicts on people, and then it keeps harming anyway. See, that is... that's always, um, so interesting to hear. And by interesting, I mean, you know, psychologically harmful. Because it's like, yes, that is a fundamental flaw of the platform, but that's also very entrenched in, like, what the DNA of the platform always was, which was based on harshly judging other people. Like, that's why Mark Zuckerberg created Facebook, was to harshly judge women in his community. So it's like, I know that it is, you know, on a bajillion scale at this point, but I'm always kind of stunned at how people are like, oh, it's so weird that this went, you know, the way that it did. It's like, well, to an extent, um, it was always like that, and maybe it was, like, cosplaying as not being like that, and for certain people there were eras of Facebook where your user experience wouldn't be like that. But, you know, this goes back almost twenty years at this point, of this being in the DNA of this, um, this shit show. Yeah, and it's really bleak. Um. It's just really bleak. And it also goes to show, like... it was one of the things Zuckerberg will say repeatedly when he talks about, when he does admit, like, yes, there are problems and there have been, like, negatives associated with the site, and we're aware of that, they're humbling, but, like, you know, you also have to include all the good that we're doing, all of the meaning... and the way he always phrases this is, like, all of the meaningful social interactions that wouldn't have happened otherwise.
And then you realize, every time he says that, the meaningful social interactions that have taken place on Facebook, what he's including, as these internal documents show, is bullying, and people, like, making death threats and, like, talking about their desire to murder people. Like, that's a meaningful interaction. People getting angry and trying to incite violence together is a meaningful social interaction. Which, I guess... yeah, I mean, it's not meaningless. It has meaning. Klan meetings were meaningful social interactions, you know. Um, you gotta give the KKK that. Uh, the Nuremberg rally was meaningful interacting. The last meaningful interaction I had on Twitter led to, like, a rebound I was dating coming to my grandma's funeral blackout drunk. So, you know. Just... man, it's been too long since I've shown up at a funeral just too drunk to stand. It's still one of my favorite memories with my family to this day. They're like, who is this guy? And I'm like, I don't really know. He's drunk. He came on the Megabus. He did the Megabus, getting drunk from a CamelBak on the Megabus. Yeah, that would be, when I used to do a lot of bus trips, like when I was traveling and stuff, that would be one of the tactics: you fill, like, a thermos or a CamelBak with, like, cranberry juice and liquor and just... oh, I mean, I get that. I'm not above getting fucked up on a Megabus. You know, on your way to my grandma's funeral. That was a move. Me and my friends got, like, wasted in San Francisco one day, just, like, going shopping in broad daylight with a CamelBak, where we would get a bottle of orange-flavored Trader Joe's Patron tequila, and we would get a half dozen lime popsicles, and you just throw the popsicles in with the Patron in the CamelBak, and throughout the day it melts and you just have constant cold margarita. It's actually fucking amazing, that fucking margarita.
Yeah, I recommend it heavily. You will get trashed, and people don't notice a dude walking around with a CamelBak in San Francisco. Nobody gives a shit. Oh my god, you're basically camouflaged. Yeah. You know who else is camouflaged? The products and services that support this podcast, camouflaged to be more likable to you by being wrapped in a package of the three of us. That's how ads work. I thought you were saying that you were taking ads from the U.S. Army recruitment center again. I mean, it's entirely possible. Um, but at the moment, we're just camouflaging. I don't know, whoever comes on next, whoever comes on next, you'll feel more positively about because of our presence here. That's how ads work. It's good stuff.

Oh, we're back. My goodness, what a good time we're all having today. How are you doing? You making it okay? You made it sound sarcastic. I'm having... I am having a good time. I'm glad. I'm happy that you're having a good time. That's my only goal, for this show and for you. That's a good time. See, now you're doubling down on it, and I'm getting insecure. Doubling down, and I'm also talking more and more like an NPR talking head as I get quieter by the beat. Now I'm going to start having a panic attack. I've never heard you talk like this. I know. This is how I talk to my cats when I'm mad at them. Honestly, I feel like we do have that dynamic. I feel like I'm a cat that you get angry at sometimes. Yeah, because you jump on my desk and knock over my Zevia. It's for attention, I know. But I've got to work to keep you in expensive cat food. I only feed my cats the nice wet food. I would rather have your attention than really nice food. That's not what my cats say. Um.
So, there's just a shitload to say about how Facebook negatively impacts the increasingly violent political discourse in the United States and how they helped make January 6 happen. But I think the way I'd like to illustrate the harm of Facebook next is a bit less political. Um, it also occurs in a different Facebook product. I've been talking about Facebook the company generally, and about Facebook itself. But now we're going to talk about Instagram. In Part One, I mentioned that young people felt that removing likes from Instagram temporarily corresponded with a decrease in social anxiety. The impact of Instagram specifically on the mental health of kids and teens can be incredibly significant. One of the other Facebook internal studies that was released as part of the Facebook Papers was conducted by researchers on Instagram. Uh, the study, which again almost certainly would never have seen the light of day if a whistleblower hadn't released it, found that thirty-two percent of teen girls reported Instagram made them feel worse about their body. Twenty-two million teenagers in the United States log onto Instagram on, like, a daily basis. So that's millions of teen girls feeling worse about their body because of Instagram. I've never been less surprised. Well, good news, it gets worse. Like, no fucking kidding. These researchers released their findings internally in March of twenty twenty, noting that comparisons on Instagram can change how young women view and describe themselves. Again, not surprising. So company researchers have been investigating the way that Instagram works for quite a while, um, about three years that they've been doing this seriously, and their previous findings all back up the same central issues. Photo sharing in particular is harmful to teen girls. A twenty nineteen report concluded, quote, we make body image issues worse for one in three teen girls. Its findings included this damning line: teens blame Instagram for increases in anxiety and depression.
This reaction was unprompted and consistent across all groups. So, like, they almost always mentioned that this app specifically makes them feel worse about their body, and we don't have to prompt them at all. Like, this just comes up when they talk about Instagram. I mean, that's... truly. Sophie, I don't know how you feel. I mean, I truly think that... because I've been on Instagram since, what, like... Earlier, I think. I got on it earlier. It was around when we were in high school. I truly think my life and my relationship to my body would be very different had I not been on that app for the better part of a decade. Yeah, I mean, especially when they introduced filters. Yeah, we're about to talk about that.

So here's the kicker, and by kicker, I mean the bleakest part. In teens who reported suicidal thoughts, thirteen percent of teens in the UK and six percent of the teens in the United States claim their desire to kill themselves started on Instagram. That's fucking disgusting and terrifying. That's pretty bleak. I just... I wish I were more surprised. Yeah, but it's good to have this data. Um. The data shows that more than forty percent of Instagram users are less than twenty-two years old, um, which means you've got twenty-two million teens logging onto the service in the US every day. Six percent of those people becoming suicidal as the result of Instagram is one point three two million children who at one point started wanting to kill themselves while using Instagram.

Hey, everybody, Robert Evans here, and I actually screwed up the math that I just cited, which is often the case when I do math. So anytime I do math of my own in an episode, you're right to question me. Um, I was calculating six percent of twenty-two million, basically. Um.
But as the study noted, it's six percent of kids who are suicidal who say that their suicidal feelings started on Instagram. So I wanted to recalculate that. Um, about seventy to seventy-six percent, kind of depending on the source, of American teens use Instagram. Um, there are about forty-two million teenagers in the United States, um, so I calculated from that. And about eighteen to nineteen percent of high school students, of, like, teenagers, um, seriously considered attempting suicide. So, if we're just counting people who seriously considered attempting suicide, that's, um, five million, seven hundred forty-five thousand, six hundred teens who seriously considered suicide. If six percent of those kids had their suicidal feelings start on Instagram, that's three hundred and forty-four thousand, seven hundred and thirty-six children in the United States whose suicidal feelings started on Instagram. Um, and I've furthermore found that about nine percent of kids who seriously consider suicide attempt it. So of those three hundred and forty-four thousand, seven hundred and thirty-six American teens whose suicidal feelings started on Instagram, about thirty-one thousand and twenty-six kids attempt suicide. Um, so about thirty-one thousand kids in the United States on an annual basis attempt suicide because of suicidal feelings that started on Instagram. So that is the more accurate look at the data, and I apologize, as always, for the error.

But what's interesting is that these studies do document that, like, Facebook is as physically harmful at scale as, like, a wide variety of narcotics. Like, most narcotics probably are less harmful at scale, physically, than Instagram. Um, I think weed certainly is. Um, so, my god, if every teenager was smoking weed instead of doomscrolling Instagram... The world would just be... it would be so funny if they were chain-smoking cigars instead of being on Instagram. It's so weird, because I think about, like, how... I don't know, whatever.
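For anyone checking the corrected arithmetic from a moment ago, here it is worked through as a quick Python script. The inputs are the figures cited in the episode; one consistent pick from the ranges given (seventy-six percent and eighteen percent) reproduces the spelled-out totals exactly, and the variable names are just for illustration.

# Quick check of the corrected math read out above. Inputs are the figures
# cited in the episode; the totals follow from them.
teens_in_us = 42_000_000
instagram_share = 0.76          # "about seventy to seventy-six percent" of American teens use Instagram
considered_suicide_rate = 0.18  # "about eighteen to nineteen percent" seriously considered suicide
started_on_instagram = 0.06     # six percent trace those feelings to Instagram, per the internal study
attempt_rate = 0.09             # "about nine percent" of those who seriously consider it attempt it

seriously_considered = teens_in_us * instagram_share * considered_suicide_rate  # 5,745,600
traced_to_instagram = seriously_considered * started_on_instagram               # 344,736
attempts = traced_to_instagram * attempt_rate                                    # about 31,026

print(int(seriously_considered), int(traced_to_instagram), round(attempts))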
Like, I'm in my late twenties, so I feel like I have, like, a little bit of memory of what life was like before you were constantly being encouraged to compare yourself to every single person you've ever met in your life, regardless of whether you know who they are, how they are, whatever. Um, and I just... call me nostalgic, but, uh, I liked how I felt better. Yeah. Like, it's so absurd how much I know about people I don't give a shit about, and how bad it makes me feel to know about the curated lives of people that I don't give a shit about, and how I let that actively affect my daily life. And it's just, yeah, it's just fucking miserable. It is. It's horrible. It's horrible. Um, that said, I like flirting on the application, so, you know. Sure.

Now, here's why, despite the documented harm that Instagram does, nothing's ever going to change. Um. As I stated, twenty-two million US teens use Instagram daily; only five million log on to Facebook. So Instagram is almost five times as popular among teenagers as Facebook, where kids are leaving in droves. So Facebook, Mark Zuckerberg's invention, is now definitively just the terrain of the olds. And Facebook knows that kids are never going to come back, because that's not how being a kid works. You don't get them back. They're going to continue to do new shit. Eventually they'll leave Instagram for something else. You know, that's just the way it fucking goes. Unless the thirty-year nostalgia cycle is like, Facebook is actually back now. And I don't think it gave anybody a good enough experience to have that. It's not the fucking Teenage Mutant Ninja Turtles. Yeah. No one's getting dopamine hits. That's a good... Yeah, it's like Flamin' Hot Cheetos. Nobody's thinking fondly back to scrolling Facebook when they were seven. They're thinking back to, I don't know, SpongeBob SquarePants.
Um oh, but at 569 00:31:07,880 --> 00:31:10,840 Speaker 1: the moment, Instagram is very popular with teens, and Facebook 570 00:31:10,840 --> 00:31:12,479 Speaker 1: knows that if they're going to continue to grow and 571 00:31:12,520 --> 00:31:15,600 Speaker 1: maintain their cultural dominance, they have to keep bringing in 572 00:31:15,640 --> 00:31:18,640 Speaker 1: the teens. They have to keep Instagram as profitable and 573 00:31:18,720 --> 00:31:22,360 Speaker 1: as as addictive as it currently is. Um. And that's 574 00:31:22,400 --> 00:31:24,160 Speaker 1: why they bought Instagram in the first place. They only 575 00:31:24,160 --> 00:31:25,880 Speaker 1: paid like a billion dollars for it. It It was an 576 00:31:25,880 --> 00:31:31,720 Speaker 1: incredible investment um. And they spend in it in a day. Yeah, 577 00:31:31,800 --> 00:31:34,800 Speaker 1: that's that's cheap as hell for something as influential and 578 00:31:34,880 --> 00:31:38,680 Speaker 1: huge as Instagram. Yeah. I wonder do you know what 579 00:31:38,720 --> 00:31:41,680 Speaker 1: it's worth now? I would guess significantly more than a 580 00:31:41,680 --> 00:31:44,479 Speaker 1: billion dollars um, But I don't entirely know a lot 581 00:31:44,520 --> 00:31:47,920 Speaker 1: of value. But Facebook's like a trillion dollar company now. Yeah, 582 00:31:47,960 --> 00:31:51,200 Speaker 1: they're very sucks and it's well, but face but that 583 00:31:51,240 --> 00:31:55,080 Speaker 1: includes Instagram, you know. Okay, so yeah, and among you know, 584 00:31:55,160 --> 00:31:57,880 Speaker 1: teens are one of the most valuable demographics to have 585 00:31:58,000 --> 00:32:02,400 Speaker 1: for advertisers, and Instagram is where quarantine. The number it's 586 00:32:02,560 --> 00:32:07,960 Speaker 1: estimated value is two billion. Yeah, that's a good investmentsment, 587 00:32:08,440 --> 00:32:13,040 Speaker 1: good investment if money you gotta yeah. Um So. The 588 00:32:13,080 --> 00:32:15,360 Speaker 1: fact that so much is at stake with Instagram, the 589 00:32:15,360 --> 00:32:17,640 Speaker 1: fact that it's such a central part of company having 590 00:32:17,680 --> 00:32:19,480 Speaker 1: any kind of future, is part of why Mark and 591 00:32:19,600 --> 00:32:22,480 Speaker 1: company have been so compelled to lie about it. None 592 00:32:22,520 --> 00:32:24,720 Speaker 1: of this stuff that we've been talking about was released 593 00:32:24,760 --> 00:32:27,400 Speaker 1: when Facebook researchers got it, of course not. They wouldn't 594 00:32:27,400 --> 00:32:30,920 Speaker 1: want anyone to know this ship. In March, Mark took 595 00:32:30,960 --> 00:32:33,200 Speaker 1: to Congress, where he was criticized for his plans to 596 00:32:33,200 --> 00:32:36,400 Speaker 1: create a new Instagram service for children under thirteen. He 597 00:32:36,440 --> 00:32:40,280 Speaker 1: was asked if he'd studied how Instagram infects affects children, 598 00:32:40,360 --> 00:32:43,120 Speaker 1: and he said, I believe the answer is yes. So 599 00:32:43,240 --> 00:32:46,880 Speaker 1: not yes, I think we've studied that. He told them. Then, 600 00:32:47,240 --> 00:32:49,560 Speaker 1: the research we've seen is that using social apps to 601 00:32:49,560 --> 00:32:52,360 Speaker 1: connect with other people can have positive mental health benefits. 
602 00:32:52,880 --> 00:32:57,480 Speaker 1: And I'm sure there's something that he's gotten paid researchers 603 00:32:57,520 --> 00:32:59,360 Speaker 1: to come up with that he can make that case 604 00:32:59,400 --> 00:33:02,280 Speaker 1: off of. Um, I'm sure in certain situations it may 605 00:33:02,320 --> 00:33:04,560 Speaker 1: even be true. There are ways you can use social 606 00:33:04,560 --> 00:33:07,200 Speaker 1: media that are good. I mean, I've legitimately smiled or 607 00:33:07,240 --> 00:33:09,640 Speaker 1: had my heart warmed by things that happened on social media. 608 00:33:09,640 --> 00:33:12,720 Speaker 1: It doesn't not happen. Um, And I do think that 609 00:33:12,760 --> 00:33:14,840 Speaker 1: there is a case for like I mean, and it's 610 00:33:15,160 --> 00:33:17,520 Speaker 1: you can't credit Mark Zuckerberg with it, but just I 611 00:33:17,560 --> 00:33:20,080 Speaker 1: mean going back to fucking like live journal days of 612 00:33:20,240 --> 00:33:23,640 Speaker 1: just like friendships that have deepened as a result of 613 00:33:24,480 --> 00:33:28,200 Speaker 1: social media. That's definitely a thing. But the costs outweigh 614 00:33:28,320 --> 00:33:32,440 Speaker 1: the benefits there by quite a bit. Yeah, Um, 615 00:33:32,440 --> 00:33:35,880 Speaker 1: it's it's great. So um. So Mark goes on to say, 616 00:33:36,200 --> 00:33:38,640 Speaker 1: you know, I I think we've got research that shows 617 00:33:38,640 --> 00:33:40,680 Speaker 1: that can have positive mental health effects. You know, I 618 00:33:40,720 --> 00:33:44,080 Speaker 1: think we've studied whether or not how it affects children. Um. 619 00:33:44,120 --> 00:33:46,520 Speaker 1: But he doesn't talk about He leaves out all the 620 00:33:46,560 --> 00:33:49,200 Speaker 1: stuff, all the statistics, like about all the kids 621 00:33:49,240 --> 00:33:52,920 Speaker 1: whose suicidal ideation starts on Instagram. They had that data 622 00:33:52,960 --> 00:33:55,400 Speaker 1: when he went before Congress. He just didn't mention it. 623 00:33:55,440 --> 00:33:58,120 Speaker 1: They hadn't told anyone that shit, Like, he didn't say 624 00:33:58,160 --> 00:34:00,560 Speaker 1: a goddamn word about it. Yeah, he was like, yeah, 625 00:34:00,560 --> 00:34:01,840 Speaker 1: I think we've looked into it, and you know, there's 626 00:34:01,840 --> 00:34:03,760 Speaker 1: some ways in which it can be healthy. Not: and 627 00:34:03,880 --> 00:34:07,000 Speaker 1: also one point three million American kids became suicidal because 628 00:34:07,040 --> 00:34:10,480 Speaker 1: of our app. Like he did not throw that in anywhere, 629 00:34:10,719 --> 00:34:13,560 Speaker 1: Like did he throw that in? I mean truly, I'm like 630 00:34:13,680 --> 00:34:15,640 Speaker 1: up in the air of like did he not say 631 00:34:15,680 --> 00:34:18,920 Speaker 1: that because he didn't want people to know? Or did 632 00:34:18,960 --> 00:34:20,480 Speaker 1: he just say that because he heard it and he 633 00:34:20,520 --> 00:34:23,319 Speaker 1: didn't care and he forgot, Like you just don't know 634 00:34:23,400 --> 00:34:26,480 Speaker 1: with that guy. That is so fucking evil. Woah, it's 635 00:34:26,480 --> 00:34:29,080 Speaker 1: pretty great. Um, and we'll talk more about that later.
636 00:34:29,360 --> 00:34:33,400 Speaker 1: In May, Instagram boss Adam Mosseri told reporters that he 637 00:34:33,400 --> 00:34:36,279 Speaker 1: thought any impact on teen well being by Instagram was 638 00:34:36,360 --> 00:34:39,759 Speaker 1: likely quote quite small, based on the internal research he'd seen. Again, 639 00:34:39,880 --> 00:34:42,320 Speaker 1: he hasn't released this research. He's saying, Oh, we have research, 640 00:34:42,360 --> 00:34:44,080 Speaker 1: and it says that any kind of impact on well 641 00:34:44,120 --> 00:34:47,520 Speaker 1: being is pretty small. And again, the actual research by 642 00:34:47,560 --> 00:34:50,480 Speaker 1: this point showed thirteen percent of kids in the UK and six 643 00:34:50,520 --> 00:34:52,400 Speaker 1: percent of kids in the United States were moved to 644 00:34:52,400 --> 00:34:54,920 Speaker 1: thoughts of suicide by Instagram, which I would not call small. 645 00:34:55,480 --> 00:34:57,480 Speaker 1: I would not call I wouldn't I wouldn't necessarily say 646 00:34:57,480 --> 00:35:00,400 Speaker 1: it's huge. But that is not a small impact. Um, No, 647 00:35:00,719 --> 00:35:06,920 Speaker 1: that is like thousands and thousands, and yeah, that's significant. 648 00:35:07,480 --> 00:35:10,080 Speaker 1: The Wall Street Journal caught up with Mosseri after the 649 00:35:10,080 --> 00:35:12,799 Speaker 1: Facebook papers leaked, so they were able to like drill 650 00:35:12,880 --> 00:35:15,240 Speaker 1: him on this a bit, and he said a bit more. Quote, 651 00:35:15,560 --> 00:35:17,520 Speaker 1: in no way do I mean to diminish these issues. 652 00:35:17,560 --> 00:35:20,120 Speaker 1: Some of the issues mentioned in this story aren't necessarily widespread, 653 00:35:20,160 --> 00:35:23,160 Speaker 1: but their impact on people may be huge, which is like, 654 00:35:23,600 --> 00:35:27,680 Speaker 1: again a perfect nonstatement. That's right. They're like, but what 655 00:35:27,719 --> 00:35:30,600 Speaker 1: about the thing we couldn't possibly gauge at all versus 656 00:35:30,640 --> 00:35:33,480 Speaker 1: the thing that we did and we're actively distancing ourselves from. 657 00:35:33,520 --> 00:35:36,399 Speaker 1: I mean those statistics that's like at least one kid 658 00:35:36,440 --> 00:35:40,480 Speaker 1: in every classroom, Like that is gigantic. And when you 659 00:35:40,520 --> 00:35:43,200 Speaker 1: read the responses of guys like Mosseri and compare them 660 00:35:43,200 --> 00:35:45,520 Speaker 1: to the responses of people like Mark Zuckerberg and official 661 00:35:45,520 --> 00:35:47,680 Speaker 1: corporate spokespeople, it's it's very clear that they're working 662 00:35:47,719 --> 00:35:51,120 Speaker 1: from the same playbook, that they're very disciplined in their responses. 663 00:35:51,440 --> 00:35:54,359 Speaker 1: Because Mosseri does try to tell the Journal um that 664 00:35:54,440 --> 00:35:57,160 Speaker 1: he he thinks Facebook was late to realizing there were 665 00:35:57,200 --> 00:35:59,600 Speaker 1: drawbacks in connecting people in such large numbers. But then 666 00:35:59,600 --> 00:36:01,360 Speaker 1: he says, I've been pushing very hard for us to 667 00:36:01,360 --> 00:36:05,319 Speaker 1: embrace our responsibilities more broadly, which again says nothing.
He 668 00:36:05,400 --> 00:36:08,560 Speaker 1: then pivots from that to stating that he's actually really 669 00:36:08,600 --> 00:36:10,600 Speaker 1: proud of the research they've done on the mental health 670 00:36:10,600 --> 00:36:13,000 Speaker 1: effects on teens, which again they didn't share with anybody, 671 00:36:13,040 --> 00:36:15,520 Speaker 1: and I would argue lied about by omission in front 672 00:36:15,560 --> 00:36:18,839 Speaker 1: of Congress. Um, He's proud of this because he says 673 00:36:18,880 --> 00:36:21,840 Speaker 1: that shows Facebook employees are asking tough questions about the platform. 674 00:36:22,080 --> 00:36:24,440 Speaker 1: Quote for me, this isn't dirty laundry. I'm actually very 675 00:36:24,480 --> 00:36:26,840 Speaker 1: proud of this research, which is the same thing Zuckerberg 676 00:36:26,880 --> 00:36:30,200 Speaker 1: said about his own employees damning the service after Jan six. Right, 677 00:36:30,200 --> 00:36:32,840 Speaker 1: I'm gonna say that's the same exact thing as the 678 00:36:33,080 --> 00:36:36,440 Speaker 1: as the as the like actually bad work, you know, 679 00:36:36,480 --> 00:36:38,759 Speaker 1: talking about how that working for the Death Star is bad, 680 00:36:39,000 --> 00:36:42,520 Speaker 1: um is like evidence of Oh, the Death Star actually 681 00:36:42,560 --> 00:36:46,520 Speaker 1: has a really open work culture. Like No, I don't know, 682 00:36:46,960 --> 00:36:49,400 Speaker 1: I feel like there are a few. There are not 683 00:36:49,480 --> 00:36:54,160 Speaker 1: many CEOs that are good at flipping a narrative. But 684 00:36:54,480 --> 00:36:58,759 Speaker 1: Mark Zuckerberg is particularly bad at it. Yeah, and it's 685 00:36:58,800 --> 00:37:00,480 Speaker 1: I mean, part of why they can be bad at 686 00:37:00,480 --> 00:37:03,160 Speaker 1: it is it doesn't really matter um, or at least 687 00:37:03,200 --> 00:37:06,440 Speaker 1: it hasn't fucking so far. Um. But the patterns I 688 00:37:06,480 --> 00:37:09,360 Speaker 1: mean not enough to get a better figurehead like, Yeah, 689 00:37:09,920 --> 00:37:13,879 Speaker 1: the pattern's pretty clear here. Um. When a scandal comes out, 690 00:37:13,960 --> 00:37:17,279 Speaker 1: deny it until the information that can't be denied leaks out, 691 00:37:17,600 --> 00:37:20,600 Speaker 1: and then claim that whatever is happening at the site, 692 00:37:20,640 --> 00:37:23,680 Speaker 1: whatever information you had about how harmful it is, 693 00:37:23,680 --> 00:37:25,479 Speaker 1: is a positive because it means that you were trying 694 00:37:25,520 --> 00:37:28,040 Speaker 1: to do stuff about it, even if you actually rejected 695 00:37:28,120 --> 00:37:30,320 Speaker 1: taking action based on the data you had and refused 696 00:37:30,360 --> 00:37:33,680 Speaker 1: to share it with anybody else. Mosseri and Zuckerberg were 697 00:37:33,680 --> 00:37:36,400 Speaker 1: also careful to reiterate that any harms from Instagram had 698 00:37:36,400 --> 00:37:38,560 Speaker 1: to be weighed against its benefits, which I haven't found 699 00:37:38,600 --> 00:37:40,919 Speaker 1: a ton of documentation on. In fact, as the Wall 700 00:37:40,920 --> 00:37:44,480 Speaker 1: Street Journal writes, In five presentations over eighteen months to 701 00:37:44,560 --> 00:37:47,719 Speaker 1: this spring, Facebook researchers conducted what they called 702 00:37:47,760 --> 00:37:50,240 Speaker 1: a teen mental health deep dive and follow up studies.
703 00:37:50,480 --> 00:37:52,440 Speaker 1: They came to the conclusion that some of the problems 704 00:37:52,440 --> 00:37:55,280 Speaker 1: were specific to Instagram and not social media more broadly. 705 00:37:55,560 --> 00:37:58,640 Speaker 1: This is especially true concerning so called social comparison, which 706 00:37:58,640 --> 00:38:00,560 Speaker 1: is when people assess their own value in relation 707 00:38:00,600 --> 00:38:03,920 Speaker 1: to the attractiveness, wealth, and success of others. Social comparison 708 00:38:04,040 --> 00:38:07,120 Speaker 1: is worse on Instagram, states Facebook's deep dive into teen 709 00:38:07,160 --> 00:38:10,479 Speaker 1: girl body image issues, noting that TikTok, a short 710 00:38:10,600 --> 00:38:13,520 Speaker 1: video app, is grounded in performance, while users on Snapchat, 711 00:38:13,600 --> 00:38:16,279 Speaker 1: a rival photo and video sharing app, are sheltered by 712 00:38:16,360 --> 00:38:20,160 Speaker 1: jokey features that keep the focus on the face. In contrast, 713 00:38:20,200 --> 00:38:23,520 Speaker 1: Instagram focuses more heavily on the body and lifestyle, 714 00:38:23,880 --> 00:38:27,439 Speaker 1: March internal research states. It warns that the Explore page, 715 00:38:27,480 --> 00:38:30,840 Speaker 1: which serves users photos and videos curated by an algorithm, 716 00:38:31,040 --> 00:38:33,440 Speaker 1: can send users deep into content that can be harmful. 717 00:38:33,800 --> 00:38:37,080 Speaker 1: Aspects of Instagram exacerbate each other to create a perfect storm, 718 00:38:37,160 --> 00:38:40,719 Speaker 1: the research states. Yeah, I mean, again, not a shocking 719 00:38:40,760 --> 00:38:46,960 Speaker 1: revelation over here there. It is. I mean, I and 720 00:38:47,000 --> 00:38:50,480 Speaker 1: I do think that that lets TikTok and Snapchat get 721 00:38:50,480 --> 00:38:55,839 Speaker 1: off easy. They're like, there's certainly, there is absolutely uh 722 00:38:56,080 --> 00:38:58,719 Speaker 1: toxic body image culture on there, And I feel like 723 00:38:58,800 --> 00:39:02,720 Speaker 1: thinspo will thrive on any platform it fucking gloms itself onto. 724 00:39:02,800 --> 00:39:06,520 Speaker 1: But Instagram is particularly bad because it's like where so 725 00:39:06,560 --> 00:39:11,080 Speaker 1: many lifestyle people have launched and and there's so many 726 00:39:11,200 --> 00:39:14,520 Speaker 1: headless women on Instagram. It's it is shocking. There's so 727 00:39:14,560 --> 00:39:18,319 Speaker 1: many like not like um, not like you macheted my 728 00:39:18,360 --> 00:39:21,879 Speaker 1: head off, but like you're not encouraged to show your 729 00:39:21,920 --> 00:39:26,680 Speaker 1: head by the algorithm, which sounds weird, but it is true. 730 00:39:26,800 --> 00:39:29,879 Speaker 1: Like, it is just very focused on how 731 00:39:29,960 --> 00:39:33,400 Speaker 1: you physically look.
And then there's also this tendency to 732 00:39:33,480 --> 00:39:37,480 Speaker 1: like tear people apart if they have edited their body 733 00:39:37,520 --> 00:39:39,400 Speaker 1: to look a certain way, when it's like, well that 734 00:39:39,680 --> 00:39:42,400 Speaker 1: the algorithm rewards editing your body to look a certain 735 00:39:42,440 --> 00:39:45,319 Speaker 1: way and to do all this, and it's you do 736 00:39:45,440 --> 00:39:47,600 Speaker 1: bring up a good point where it's like going, it's 737 00:39:47,640 --> 00:39:52,960 Speaker 1: frustrating that it's important to critique Facebook in relation to 738 00:39:53,000 --> 00:39:57,160 Speaker 1: its competitors like TikTok and snapchat. Um. That can lead 739 00:39:57,200 --> 00:40:00,160 Speaker 1: to the uncomfortable situation of like seeming to praise them 740 00:40:00,160 --> 00:40:02,160 Speaker 1: when they haven't done a good job. They just haven't 741 00:40:02,200 --> 00:40:05,360 Speaker 1: been as irresponsible. It's kind of like attacking like Chevron. 742 00:40:05,880 --> 00:40:08,280 Speaker 1: If you look at all of the overall harms, including 743 00:40:08,320 --> 00:40:11,160 Speaker 1: like their impact and like covering up climate change. Maybe 744 00:40:11,200 --> 00:40:13,000 Speaker 1: the worst of the big oil and gas companies. I 745 00:40:13,000 --> 00:40:14,839 Speaker 1: don't know, it's debatable, but it's like if you're if 746 00:40:14,840 --> 00:40:18,320 Speaker 1: you're criticizing Chevron specifically, it doesn't you're not saying that 747 00:40:18,600 --> 00:40:20,960 Speaker 1: BP is great. You're just being like, well, these are 748 00:40:20,960 --> 00:40:23,400 Speaker 1: the guys specifically that did this bad thing, and they 749 00:40:23,440 --> 00:40:26,560 Speaker 1: were the leaders in this specific terrible thing. Other bad 750 00:40:26,600 --> 00:40:28,960 Speaker 1: things are going on, but we can't, like, the episode 751 00:40:28,960 --> 00:40:31,240 Speaker 1: can't be about how bad everyone. We're talking about Facebook 752 00:40:31,320 --> 00:40:33,399 Speaker 1: right now, we have these documents from inside Facebook. I'm 753 00:40:33,400 --> 00:40:36,279 Speaker 1: sure versions of this are happening everywhere else. Listeners, in 754 00:40:36,320 --> 00:40:41,520 Speaker 1: your everyday life, just don't use Facebook as a yardstick 755 00:40:41,800 --> 00:40:44,760 Speaker 1: for morality, you know, because they just end up letting 756 00:40:44,760 --> 00:40:47,080 Speaker 1: a lot of people off for a lot of funked 757 00:40:47,120 --> 00:40:50,200 Speaker 1: up stuff. I would say in your regular life, don't 758 00:40:50,280 --> 00:40:53,959 Speaker 1: use Facebook. Is all the sentence we need it there. Um. Wow, 759 00:40:55,920 --> 00:40:58,359 Speaker 1: So you asked me you were talking earlier about like 760 00:40:58,840 --> 00:41:01,160 Speaker 1: because Mark went up and of Congress and was like, yeah, 761 00:41:01,200 --> 00:41:03,160 Speaker 1: I think we've got research on this, and I've definitely 762 00:41:03,160 --> 00:41:06,480 Speaker 1: seen research that says it's good for kids. We know everything. 763 00:41:06,520 --> 00:41:08,759 Speaker 1: I just stated that quote. I just read everything like 764 00:41:08,840 --> 00:41:12,080 Speaker 1: that's that's in those internal studies. Um, we know that 765 00:41:12,120 --> 00:41:14,000 Speaker 1: Mark saw this. 
We know that it was viewed by 766 00:41:14,000 --> 00:41:17,360 Speaker 1: top Facebook leaders because it was mentioned in a presentation 767 00:41:17,400 --> 00:41:20,480 Speaker 1: that was given to Mark Zuckerberg himself. We know that 768 00:41:20,560 --> 00:41:24,960 Speaker 1: when, in August, Senators Richard Blumenthal and Marsha Blackburn sent 769 00:41:25,000 --> 00:41:27,160 Speaker 1: a letter to Mark Zuckerberg asking him to release his 770 00:41:27,200 --> 00:41:30,440 Speaker 1: internal research on how platforms impact child mental health, we 771 00:41:30,560 --> 00:41:32,600 Speaker 1: know that he sent back a six page letter that 772 00:41:32,640 --> 00:41:35,880 Speaker 1: included none of the studies we've just mentioned. Instead, the 773 00:41:35,920 --> 00:41:38,040 Speaker 1: letter said that it was hard to conduct research on 774 00:41:38,080 --> 00:41:40,399 Speaker 1: Instagram and that there was no consensus about how much 775 00:41:40,440 --> 00:41:43,200 Speaker 1: screen time is too much. Meanwhile, their own data show 776 00:41:43,280 --> 00:41:46,680 Speaker 1: that Instagram users who reported feeling unattractive said that the 777 00:41:46,719 --> 00:41:50,200 Speaker 1: feeling began while they were on Instagram. Facebook's own internal 778 00:41:50,200 --> 00:41:52,759 Speaker 1: reports showed that their users reported wanting to spend less 779 00:41:52,760 --> 00:41:55,680 Speaker 1: time on Instagram but couldn't make themselves. And here's a 780 00:41:55,719 --> 00:41:58,719 Speaker 1: quote that makes it sound like heroin. Teens told us 781 00:41:58,719 --> 00:42:00,560 Speaker 1: they don't like the amount of time they spend on 782 00:42:00,600 --> 00:42:02,360 Speaker 1: the app, but feel like they have to be present. 783 00:42:02,440 --> 00:42:04,640 Speaker 1: They often feel addicted and know that what they're seeing 784 00:42:04,719 --> 00:42:06,640 Speaker 1: is bad for their mental health but feel unable to 785 00:42:06,680 --> 00:42:11,080 Speaker 1: stop themselves. That's Facebook writing about Instagram. Like that's that's 786 00:42:11,200 --> 00:42:13,480 Speaker 1: their own people saying this, like this is not some 787 00:42:13,640 --> 00:42:16,880 Speaker 1: activists getting in here, you know. That's so I mean, 788 00:42:17,080 --> 00:42:20,280 Speaker 1: it's I guess, good on them, regardless of the level 789 00:42:20,320 --> 00:42:24,239 Speaker 1: of self awareness going on there. Um, that's I mean. 790 00:42:24,239 --> 00:42:26,399 Speaker 1: And what I was thinking about earlier when it comes 791 00:42:26,440 --> 00:42:29,719 Speaker 1: to any time Zuckerberg is in front of Congress or 792 00:42:29,760 --> 00:42:32,080 Speaker 1: in front of political officials, I feel like for a 793 00:42:32,200 --> 00:42:36,680 Speaker 1: lot of people, the takeaway and the thing that gets 794 00:42:36,719 --> 00:42:41,760 Speaker 1: trending is how little political officials and members of Congress 795 00:42:42,000 --> 00:42:45,680 Speaker 1: understand about how the Internet works.
And that's like the 796 00:42:45,719 --> 00:42:49,640 Speaker 1: funny story is like, oh, Mark Zuckerberg talked about an algorithm, 797 00:42:49,680 --> 00:42:51,839 Speaker 1: and they and you know, like this is this comes 798 00:42:51,920 --> 00:42:54,399 Speaker 1: up all the time, It comes up on v became 799 00:42:54,440 --> 00:42:57,520 Speaker 1: up on succession of just like how not internet literate 800 00:42:57,560 --> 00:43:00,840 Speaker 1: the majority of people who just i'd how the Internet 801 00:43:00,840 --> 00:43:04,239 Speaker 1: works are And it's like it almost becomes like a 802 00:43:04,320 --> 00:43:07,120 Speaker 1: he he ha ha, old guy doesn't know how algorithm works. 803 00:43:07,120 --> 00:43:09,560 Speaker 1: But it's like, well, the consequence of that is that 804 00:43:09,600 --> 00:43:12,560 Speaker 1: it ends up making Mark Zuckerberg look way cooler than 805 00:43:12,600 --> 00:43:14,920 Speaker 1: he is and it also doesn't address the problem at 806 00:43:14,920 --> 00:43:19,799 Speaker 1: all of like, no, Mark Zuckerberg is omitting something gigantic here, 807 00:43:20,040 --> 00:43:23,320 Speaker 1: and the majority of our you know, lawmakers in Congress 808 00:43:23,680 --> 00:43:27,560 Speaker 1: don't have the fucking, you know, cultural vocabulary to even 809 00:43:27,719 --> 00:43:31,919 Speaker 1: understand that. And that is like and and I guess 810 00:43:31,960 --> 00:43:34,160 Speaker 1: it's like it makes for a couple of memes, but 811 00:43:34,239 --> 00:43:38,280 Speaker 1: it's just like, no, this is bad. Can you commit 812 00:43:38,320 --> 00:43:44,799 Speaker 1: to cancel Finsta? Do you remember that who that was? Sad? Right? 813 00:43:44,840 --> 00:43:46,640 Speaker 1: Cancel fits? I mean that I think that's the most 814 00:43:46,760 --> 00:43:50,000 Speaker 1: recent one where it's like, okay, yeah, that is you know, 815 00:43:50,040 --> 00:43:55,360 Speaker 1: objectively funny. But but like the consequence of that is 816 00:43:55,480 --> 00:43:57,919 Speaker 1: I mean, that's ultimately a win for Instagram. And that's 817 00:43:57,920 --> 00:44:00,640 Speaker 1: a win for Facebook because it makes them look like 818 00:44:00,680 --> 00:44:03,719 Speaker 1: they're operating on a level of the fucking government doesn't understand. 819 00:44:04,000 --> 00:44:06,800 Speaker 1: And meanwhile, you know, one kid in every classroom is 820 00:44:06,840 --> 00:44:11,040 Speaker 1: suicidal as a result of the inability of law of 821 00:44:11,320 --> 00:44:15,240 Speaker 1: like law making officials to understand the effect that this has. 822 00:44:15,280 --> 00:44:18,799 Speaker 1: It's just it makes me real mad, Robert. And one 823 00:44:18,800 --> 00:44:21,480 Speaker 1: of the nune things about this is that while these 824 00:44:21,560 --> 00:44:26,080 Speaker 1: lawmakers don't understand and sound like idiots talking to Mark 825 00:44:26,160 --> 00:44:29,400 Speaker 1: Zuckerberg his own employees. These researchers who are part of 826 00:44:29,400 --> 00:44:32,480 Speaker 1: the Integrity team, these researchers studying the impact of Instagram 827 00:44:32,480 --> 00:44:35,680 Speaker 1: on teens know exactly how harmful it is, and they 828 00:44:35,719 --> 00:44:38,239 Speaker 1: are grappling in real time with like the damage their 829 00:44:38,239 --> 00:44:41,520 Speaker 1: product is doing two children. 
Members of these teams reported 830 00:44:41,560 --> 00:44:44,120 Speaker 1: frustration at the fact that their colleagues often refused to 831 00:44:44,120 --> 00:44:47,399 Speaker 1: take their findings seriously. One former researcher told The Wall 832 00:44:47,400 --> 00:44:50,640 Speaker 1: Street Journal that we're standing directly between people and their 833 00:44:50,640 --> 00:44:54,080 Speaker 1: bonuses when they try to reduce the harmful aspects of Instagram, 834 00:44:54,120 --> 00:44:57,120 Speaker 1: because like anything that reduces the harm is going to 835 00:44:57,239 --> 00:45:00,319 Speaker 1: reduce its popularity. It's going to cut down on time site, 836 00:45:00,320 --> 00:45:03,560 Speaker 1: it's going to cut down on users, and so everyone 837 00:45:03,600 --> 00:45:05,920 Speaker 1: else at Facebook. But the Integrity team gets paid by 838 00:45:05,960 --> 00:45:09,160 Speaker 1: how much they increase engagement on the site. That's where 839 00:45:09,160 --> 00:45:11,799 Speaker 1: the bonuses come from. So if you're trying to make 840 00:45:11,800 --> 00:45:15,239 Speaker 1: Facebook less dangerous, a huge number of the most influential 841 00:45:15,239 --> 00:45:19,120 Speaker 1: employees have a financially invested financial interest in you failing. 842 00:45:19,239 --> 00:45:21,400 Speaker 1: So that's yeah, And what that boils down to is 843 00:45:21,440 --> 00:45:25,040 Speaker 1: like there is a financial incentive for you convincing a 844 00:45:25,120 --> 00:45:28,360 Speaker 1: thirteen year old girl that she should get fucking fillers 845 00:45:28,440 --> 00:45:33,719 Speaker 1: in her, Like, what the fuck? It's just now all 846 00:45:33,719 --> 00:45:35,520 Speaker 1: I'm going to try to sell the teens because it's 847 00:45:35,560 --> 00:45:39,120 Speaker 1: ad break time, Jamie, and I would like to Okay, okay, 848 00:45:39,360 --> 00:45:43,279 Speaker 1: if you're if you're a teenager, you don't need Instagram. 849 00:45:43,280 --> 00:45:45,480 Speaker 1: That ship is bad for you. You certainly don't need Facebook. 850 00:45:45,560 --> 00:45:48,279 Speaker 1: That's where the olds are. But you know what's come 851 00:45:48,320 --> 00:45:50,480 Speaker 1: back around to being cool and what I think your 852 00:45:50,520 --> 00:45:55,440 Speaker 1: generation needs to get on board with smoking huge comical cigars. 853 00:45:55,480 --> 00:45:57,799 Speaker 1: I'm talking the ones that are like nine inches long 854 00:45:57,880 --> 00:46:01,600 Speaker 1: and like three inches white. Massive teens listening right now, 855 00:46:01,920 --> 00:46:04,520 Speaker 1: go out to your local cigar shop. Do not dribe 856 00:46:04,719 --> 00:46:08,319 Speaker 1: a sad ador. The worst looking you find outside the 857 00:46:08,320 --> 00:46:13,399 Speaker 1: cigar shall pay him for huge, cheap cigars and new 858 00:46:13,520 --> 00:46:19,200 Speaker 1: Instagram smoking horrible cigars. This is this is well, this 859 00:46:19,239 --> 00:46:22,960 Speaker 1: is what's going to bring buying teenagers hook zoomers, zoomers. 860 00:46:23,360 --> 00:46:25,799 Speaker 1: The air is not getting any cleaner. Right, You're all 861 00:46:25,800 --> 00:46:28,600 Speaker 1: gonna we're all gonna choke to death on wildfires. You 862 00:46:28,680 --> 00:46:31,760 Speaker 1: might as well burn down a big fat Macanudoh, Robert, 863 00:46:31,760 --> 00:46:33,680 Speaker 1: can I tell you I bought a teenager a white 864 00:46:33,680 --> 00:46:37,319 Speaker 1: claw the other. 
Good for you, Thank you. I felt good. 865 00:46:37,360 --> 00:46:39,560 Speaker 1: I felt like I did a public service. Yeah, teens, 866 00:46:39,719 --> 00:46:45,240 Speaker 1: go buy those big, fat, ridiculous lunatic cigars. Bribe, bribe 867 00:46:45,400 --> 00:46:49,040 Speaker 1: for it. You're just preparing yourself for climate change. All right, 868 00:46:49,160 --> 00:46:52,560 Speaker 1: here's the other white claw teens. Okay, I mean white 869 00:46:52,560 --> 00:46:55,040 Speaker 1: Cloud goes great with a huge shitty cigar, Jamie, No, 870 00:46:55,200 --> 00:47:00,160 Speaker 1: it did. Absolutely smoking is bad for you, Andy, as 871 00:47:00,160 --> 00:47:03,120 Speaker 1: a white smoking a cigar, you puff it, so it's 872 00:47:03,120 --> 00:47:15,480 Speaker 1: healthy you. Alright, here's some ads. Alright, we're back. We 873 00:47:15,560 --> 00:47:18,440 Speaker 1: are we all just we all just enjoyed a couple 874 00:47:18,480 --> 00:47:22,920 Speaker 1: of really comically large cigars. Um, we did not have 875 00:47:23,280 --> 00:47:27,399 Speaker 1: those ridiculous long asylum cigars. It was great. Why why 876 00:47:27,440 --> 00:47:30,680 Speaker 1: are you fixated on this what is happening? Because I 877 00:47:30,680 --> 00:47:32,520 Speaker 1: I find that sketch from I think you should leave 878 00:47:32,560 --> 00:47:35,279 Speaker 1: while the little girls are talking about smoking five maccanodos 879 00:47:35,280 --> 00:47:38,080 Speaker 1: to unwind at the end of the day. Actually, I 880 00:47:38,120 --> 00:47:42,520 Speaker 1: mean yeah, but like I love when you're reveal yourself 881 00:47:42,520 --> 00:47:46,000 Speaker 1: to be a basic bitch. I am a basic bittix 882 00:47:46,560 --> 00:47:49,160 Speaker 1: A right, That's why I'm thinking about cigar. I love that. 883 00:47:49,280 --> 00:47:51,399 Speaker 1: I love that we're in the middle of a podcast, 884 00:47:51,640 --> 00:47:55,360 Speaker 1: and uh, you can't get off that well. I also 885 00:47:55,400 --> 00:47:58,640 Speaker 1: think making children do things that's bad for them as funny, 886 00:47:58,760 --> 00:48:03,560 Speaker 1: but not this way, not the face flashes send dan flashes. 887 00:48:03,680 --> 00:48:06,120 Speaker 1: I mean they've also I think the teens are rejecting 888 00:48:06,120 --> 00:48:11,520 Speaker 1: in f t s pretty widely, Jamie. So when Facebook 889 00:48:11,520 --> 00:48:13,840 Speaker 1: does try to make the case that their products are benign, 890 00:48:13,960 --> 00:48:17,080 Speaker 1: they like to bring up studies from the Oxford Internet Institute, 891 00:48:17,080 --> 00:48:20,320 Speaker 1: which is a project of Oxford University, which show minimal 892 00:48:20,440 --> 00:48:23,920 Speaker 1: or no correlation between social media use and depression. Uh. 893 00:48:23,960 --> 00:48:26,240 Speaker 1: The Wall Street Journal actually reached out to the Oxford 894 00:48:26,239 --> 00:48:29,200 Speaker 1: researcher responsible for some of these studies, who right away 895 00:48:29,320 --> 00:48:33,359 Speaker 1: was like wasn't like, oh, yes, they're right, everything's fine. 
Um, 896 00:48:33,400 --> 00:48:35,920 Speaker 1: he was like, actually, Facebook needs to be much more 897 00:48:35,960 --> 00:48:38,200 Speaker 1: open with the research that they're doing because they have 898 00:48:38,280 --> 00:48:41,319 Speaker 1: better data than than we can get, than researchers can get, 899 00:48:41,560 --> 00:48:44,120 Speaker 1: and so our actual information that they're citing is hampered 900 00:48:44,120 --> 00:48:46,200 Speaker 1: by the fact that they're not sharing what they're finding. 901 00:48:46,600 --> 00:48:48,920 Speaker 1: And who knows how things can change and our conclusions 902 00:48:48,920 --> 00:48:52,440 Speaker 1: could change if we had access to all of that data. Um. 903 00:48:52,480 --> 00:48:54,799 Speaker 1: He even told the Wall spre Street Journal. People talk 904 00:48:54,800 --> 00:48:57,400 Speaker 1: about Instagram like it's a drug, but we can't study 905 00:48:57,440 --> 00:49:00,360 Speaker 1: the active ingredient, which you'll notice is not him saying 906 00:49:00,760 --> 00:49:03,279 Speaker 1: it's fine, it's him being like, yeah, I really wish 907 00:49:03,280 --> 00:49:07,000 Speaker 1: we could actually study this better. Um, it's difficult, right. 908 00:49:07,080 --> 00:49:10,279 Speaker 1: And also he's referring to it like drugs, which is 909 00:49:10,719 --> 00:49:14,719 Speaker 1: the comparable scale of how it manifesting. Okay, Yeah, he's 910 00:49:14,719 --> 00:49:18,000 Speaker 1: certainly not being like everything's fine. Um, I think that's clear. 911 00:49:18,719 --> 00:49:22,520 Speaker 1: He's truly like constantly, Mr Policeman. I gave you all 912 00:49:22,520 --> 00:49:25,680 Speaker 1: the closing the situation and just no one gives a ship. 913 00:49:25,719 --> 00:49:29,719 Speaker 1: It is very funny and like that that movie, And 914 00:49:29,760 --> 00:49:31,920 Speaker 1: that's what I was trying to say, that it's hilarious. Ye. 915 00:49:32,120 --> 00:49:34,440 Speaker 1: So we focused a lot on these episodes about how 916 00:49:34,440 --> 00:49:37,520 Speaker 1: Facebook has harmed people and institutions in the United States, 917 00:49:37,719 --> 00:49:40,280 Speaker 1: but as we've covered in past episodes, the social network 918 00:49:40,280 --> 00:49:42,799 Speaker 1: has been responsible for helping to incite ethnic cleansings and 919 00:49:42,840 --> 00:49:46,279 Speaker 1: mass racial violence in places like Myanmar and India. Mob 920 00:49:46,360 --> 00:49:50,240 Speaker 1: violence against Muslims in India and cited by viral Facebook misinformation, 921 00:49:50,560 --> 00:49:53,319 Speaker 1: led one researcher in February of two thousand nineteen to 922 00:49:53,400 --> 00:49:56,080 Speaker 1: create yet another fake account to try and experience social 923 00:49:56,080 --> 00:50:00,000 Speaker 1: media as a person in Kerala, India, might um from 924 00:50:00,040 --> 00:50:02,440 Speaker 1: New York Times quote. For the next three weeks, the 925 00:50:02,440 --> 00:50:05,600 Speaker 1: account operated by a simple rule follow all the recommendations 926 00:50:05,640 --> 00:50:08,600 Speaker 1: generated by Facebook's algorithm to join groups, watch videos, and 927 00:50:08,640 --> 00:50:11,120 Speaker 1: explore new pages on the site. 
The result was an 928 00:50:11,120 --> 00:50:14,440 Speaker 1: inundation of hate speech, misinformation, and celebrations of violence, which 929 00:50:14,480 --> 00:50:17,600 Speaker 1: were documented in an internal Facebook report published later that month. 930 00:50:18,640 --> 00:50:21,640 Speaker 1: And this is from the Facebook researcher following this test 931 00:50:21,760 --> 00:50:24,360 Speaker 1: user's news feed. I've seen more images of dead people 932 00:50:24,400 --> 00:50:26,040 Speaker 1: in the past three weeks than I've seen in my 933 00:50:26,200 --> 00:50:31,880 Speaker 1: entire life total. What a great site Mark built. Facebook's 934 00:50:31,880 --> 00:50:38,560 Speaker 1: new tagline, the place for corpses. Oh yeah, my goodness. 935 00:50:38,600 --> 00:50:40,960 Speaker 1: I mean and so I know that we we have 936 00:50:41,520 --> 00:50:46,960 Speaker 1: discussed Facebook's role in in uh super charging ethnic cleansings, 937 00:50:47,040 --> 00:50:50,880 Speaker 1: but that is just that is so. Yeah, it's not great, Jamie. 938 00:50:51,000 --> 00:50:54,080 Speaker 1: Someone wrote that down, Robert. Someone wrote that down 939 00:50:54,120 --> 00:50:57,600 Speaker 1: and hit publish. It's not great. Uh, because India is 940 00:50:57,640 --> 00:51:02,520 Speaker 1: Facebook's biggest customer, three hundred and forty million Indians use one or more 941 00:51:02,560 --> 00:51:06,840 Speaker 1: Facebook products. Um, that's that's a shit of people. Yeah, 942 00:51:06,960 --> 00:51:10,600 Speaker 1: three hundred forty million. Um. That is something that I think is 943 00:51:10,640 --> 00:51:14,239 Speaker 1: important to remember and something that I lose sight of 944 00:51:14,320 --> 00:51:19,920 Speaker 1: sometimes, is like, Facebook is not a super popular platform 945 00:51:20,080 --> 00:51:24,520 Speaker 1: for people of all ages in North America, but elsewhere that's 946 00:51:24,560 --> 00:51:26,960 Speaker 1: not the case. Yeah, and it is just it is 947 00:51:27,000 --> 00:51:29,920 Speaker 1: the Internet for a lot of these people. Um, Like 948 00:51:30,040 --> 00:51:31,840 Speaker 1: that is the way that that is the whole of 949 00:51:31,880 --> 00:51:34,160 Speaker 1: how they consume the Internet in a lot of cases. 950 00:51:34,200 --> 00:51:36,320 Speaker 1: I mean maybe with like YouTube or something mixed in, 951 00:51:36,320 --> 00:51:37,919 Speaker 1: but they're probably getting a lot of their YouTube links 952 00:51:37,920 --> 00:51:41,239 Speaker 1: from their Facebook feed. Um. Now, the fact the fact 953 00:51:41,239 --> 00:51:43,759 Speaker 1: that India is the number one customer in terms of 954 00:51:43,760 --> 00:51:45,920 Speaker 1: like number of people for Facebook, I'm sure the United 955 00:51:45,960 --> 00:51:48,360 Speaker 1: States is still more profitable just because of like differences 956 00:51:48,400 --> 00:51:51,879 Speaker 1: in income and whatnot. Um, but this is a huge 957 00:51:51,920 --> 00:51:54,400 Speaker 1: part of their business. But despite that fact, they have 958 00:51:54,440 --> 00:51:57,239 Speaker 1: failed to invest very much in terms of meaningful resources 959 00:51:57,239 --> 00:52:00,480 Speaker 1: into having employees who speak the language, or, what is more 960 00:52:00,520 --> 00:52:05,680 Speaker 1: the problem, the languages of India. See, India is a super mixed 961 00:52:05,840 --> 00:52:08,239 Speaker 1: country, right, in terms of different like ethnic groups and 962 00:52:08,239 --> 00:52:11,920 Speaker 1: religious groups.
They have twenty two officially recognized languages in 963 00:52:11,960 --> 00:52:13,920 Speaker 1: the country, and there's way more languages than that in 964 00:52:13,920 --> 00:52:16,279 Speaker 1: India that significant numbers of people speak. There's twenty two 965 00:52:16,280 --> 00:52:19,359 Speaker 1: officially recognized languages. Anyone who has traveled there, and I've 966 00:52:19,360 --> 00:52:20,840 Speaker 1: spent a lot of time in India, can tell you 967 00:52:20,840 --> 00:52:23,160 Speaker 1: that being able to effectively say hello and ask basic 968 00:52:23,239 --> 00:52:25,319 Speaker 1: questions of people can require a lot of research if 969 00:52:25,360 --> 00:52:28,800 Speaker 1: you're traveling a decent amount. But Facebook aren't twenty something 970 00:52:28,840 --> 00:52:31,440 Speaker 1: tourists on the prowl for good tandoori and bhang lassis. 971 00:52:31,680 --> 00:52:34,560 Speaker 1: They have effectively taken control of the primary method of 972 00:52:34,560 --> 00:52:37,680 Speaker 1: communication and information distribution for hundreds of millions of people, 973 00:52:37,880 --> 00:52:40,239 Speaker 1: and they've failed to hire folks who might know 974 00:52:40,280 --> 00:52:43,320 Speaker 1: if some of those people are deliberately inciting genocide against 975 00:52:43,320 --> 00:52:47,040 Speaker 1: other people in the country. Eighty seven percent of Facebook's global 976 00:52:47,040 --> 00:52:50,480 Speaker 1: budget for identifying misinformation is spent on the United States. 977 00:52:50,920 --> 00:52:55,040 Speaker 1: The rest of the planet shares the remaining thirteen percent of their misinformation budget. 978 00:52:55,200 --> 00:52:57,560 Speaker 1: You want to guess what percentage of Facebook users North 979 00:52:57,600 --> 00:53:03,400 Speaker 1: Americans make up? Ten percent. So eighty seven percent of their budget goes on 980 00:53:03,480 --> 00:53:07,239 Speaker 1: ten percent of their users. Of like dealing with disinformation 981 00:53:08,160 --> 00:53:12,680 Speaker 1: or something else? Of dealing with disinformation specifically. Now, when this 982 00:53:12,760 --> 00:53:15,440 Speaker 1: leaked out, Facebook's response was that the information cited 983 00:53:15,560 --> 00:53:18,520 Speaker 1: was incomplete and did not include third party fact checkers. 984 00:53:18,520 --> 00:53:20,280 Speaker 1: They're like, well, this doesn't include all of the people, 985 00:53:20,440 --> 00:53:22,799 Speaker 1: the third party companies, we hire. Except the data 986 00:53:22,840 --> 00:53:25,400 Speaker 1: they do show suggests that the majority of the effort and 987 00:53:25,480 --> 00:53:27,600 Speaker 1: money spent on third party fact checkers is for fact 988 00:53:27,680 --> 00:53:30,359 Speaker 1: checking stuff in the United States. Um. And of course 989 00:53:30,360 --> 00:53:32,960 Speaker 1: they did not elaborate on how including this information might 990 00:53:33,000 --> 00:53:35,000 Speaker 1: have changed the overall numbers, So my guess is not 991 00:53:35,040 --> 00:53:38,120 Speaker 1: by much, if at all. Internal documents do show that 992 00:53:38,160 --> 00:53:41,040 Speaker 1: Facebook attempted to create changes to their platform to stop 993 00:53:41,040 --> 00:53:43,799 Speaker 1: the spread of the disinformation during the November election.
In 994 00:53:43,840 --> 00:53:46,800 Speaker 1: Myanmar, those changes also halted the spread of 995 00:53:46,840 --> 00:53:49,160 Speaker 1: disinformation put out by the military, which was a big, 996 00:53:49,200 --> 00:53:51,719 Speaker 1: like, it was the military inciting ethnic cleansings and 997 00:53:51,760 --> 00:53:53,759 Speaker 1: like and trying to incite violence in order to like 998 00:53:54,280 --> 00:53:58,160 Speaker 1: lock down political power ahead of this election. Um, so they 999 00:53:58,280 --> 00:54:00,680 Speaker 1: cut this significantly prior to the election. They see it 1000 00:54:00,680 --> 00:54:03,280 Speaker 1: as a problem. They institute changes similar to the changes 1001 00:54:03,320 --> 00:54:05,040 Speaker 1: they talked about putting up in the U. S. if 1002 00:54:05,040 --> 00:54:07,200 Speaker 1: things went badly with the election. And these worked. It 1003 00:54:07,320 --> 00:54:11,200 Speaker 1: dropped dramatically. Yeah and again and it's it's that is 1004 00:54:11,360 --> 00:54:13,920 Speaker 1: that is good. I'm glad that was done. But they 1005 00:54:13,960 --> 00:54:17,360 Speaker 1: only respond to give me a second, exclusively, give me 1006 00:54:17,400 --> 00:54:20,360 Speaker 1: a second, Jamie, because prior to the election, they institute 1007 00:54:20,360 --> 00:54:22,799 Speaker 1: these changes, which are significant. It reduces the number of 1008 00:54:22,800 --> 00:54:26,160 Speaker 1: inflammatory posts by twenty five point one percent and reduces 1009 00:54:26,200 --> 00:54:29,319 Speaker 1: the spread of photo posts containing disinformation by forty eight 1010 00:54:29,360 --> 00:54:34,279 Speaker 1: point five percent. This is huge. That's that's that's really significant. UM. 1011 00:54:34,320 --> 00:54:37,240 Speaker 1: As soon as the election was done, Facebook reversed those changes, 1012 00:54:37,239 --> 00:54:40,479 Speaker 1: presumably because they were bad for money. Three months after 1013 00:54:40,520 --> 00:54:43,680 Speaker 1: the election, the Myanmar military launched a vicious coup. Violence 1014 00:54:43,719 --> 00:54:46,440 Speaker 1: there continues to this moment. In response, Facebook created a 1015 00:54:46,440 --> 00:54:49,520 Speaker 1: special policy to stop people from praising violence in the country, 1016 00:54:49,840 --> 00:54:52,960 Speaker 1: one which presumably reduces the spread of content by freedom 1017 00:54:53,000 --> 00:54:55,800 Speaker 1: fighters resisting the military as much as it reduces content 1018 00:54:55,840 --> 00:54:59,279 Speaker 1: spread by the military. It's obviously too much to say 1019 00:54:59,280 --> 00:55:01,799 Speaker 1: that Facebook caused the coup in Myanmar. Shit's been, 1020 00:55:01,840 --> 00:55:03,360 Speaker 1: I mean, there's a lot going on there. I'm not 1021 00:55:03,400 --> 00:55:06,200 Speaker 1: gonna I'm not pretending that this is like it's just Facebook, 1022 00:55:06,239 --> 00:55:10,520 Speaker 1: but it was a major contributing factor. It wasn't insignificant. UM 1023 00:55:10,560 --> 00:55:13,680 Speaker 1: and the fact that they knew how much their policies 1024 00:55:13,719 --> 00:55:17,120 Speaker 1: were helping and reversed them after the election, reversing this 1025 00:55:17,160 --> 00:55:20,400 Speaker 1: effect and leading to an increase in inflammatory content because 1026 00:55:20,440 --> 00:55:23,799 Speaker 1: it profited them more, is damning, right, That's the thing 1027 00:55:23,840 --> 00:55:28,200 Speaker 1: that's damning.
UM. Around the world, Facebook's contribution to violence 1028 00:55:28,200 --> 00:55:30,640 Speaker 1: may be greatest in places where the company has huge reach 1029 00:55:30,680 --> 00:55:33,560 Speaker 1: but pays little attention. In Sri Lanka, people were able 1030 00:55:33,600 --> 00:55:36,360 Speaker 1: to automatically add hundreds of thousands of users to Facebook 1031 00:55:36,360 --> 00:55:40,160 Speaker 1: groups that spread violent content. In Ethiopia, a nationalist militia 1032 00:55:40,440 --> 00:55:43,760 Speaker 1: coordinated calls for violence openly on the app. The company 1033 00:55:43,760 --> 00:55:45,760 Speaker 1: claims that it has reduced the amount of hate speech 1034 00:55:45,760 --> 00:55:48,160 Speaker 1: people see globally by half this year. But even if 1035 00:55:48,200 --> 00:55:50,400 Speaker 1: that is true, how much hate was spread during the 1036 00:55:50,480 --> 00:55:52,600 Speaker 1: years where they ignored the rest of the world? How 1037 00:55:52,640 --> 00:55:55,400 Speaker 1: many killings, how many militant groups seeded with new recruits, 1038 00:55:55,480 --> 00:55:58,760 Speaker 1: how many pieces of exterminationist propaganda spread while Facebook 1039 00:55:58,800 --> 00:56:03,120 Speaker 1: just wasn't paying attention? The actual answer is likely incalculable. 1040 00:56:03,160 --> 00:56:05,080 Speaker 1: But here's The New York Times again reporting on that 1041 00:56:05,120 --> 00:56:09,680 Speaker 1: test account in Kerala, India. Perfect turn of phrase. Yeah yeah. 1042 00:56:09,800 --> 00:56:12,160 Speaker 1: Ten days after the researcher opened the fake account to 1043 00:56:12,160 --> 00:56:15,560 Speaker 1: study misinformation, a suicide bombing in the disputed border region 1044 00:56:15,560 --> 00:56:17,360 Speaker 1: of Kashmir set off a round of violence and a 1045 00:56:17,400 --> 00:56:21,960 Speaker 1: spike in accusations, misinformation, and conspiracies between Indian and Pakistani nationals. 1046 00:56:22,320 --> 00:56:25,239 Speaker 1: After the attack, anti Pakistan content began to circulate in 1047 00:56:25,239 --> 00:56:28,160 Speaker 1: the Facebook recommendation groups that the researcher had joined. Many 1048 00:56:28,200 --> 00:56:30,520 Speaker 1: of the groups, she noted, had tens of thousands of followers. 1049 00:56:30,640 --> 00:56:33,600 Speaker 1: A different report by Facebook, published in December two thousand nineteen, 1050 00:56:33,600 --> 00:56:36,719 Speaker 1: found Indian Facebook users tended to join large groups, with 1051 00:56:36,760 --> 00:56:39,080 Speaker 1: the company's median group size at a hundred and forty 1052 00:56:39,080 --> 00:56:42,360 Speaker 1: thousand members. In a separate report produced after the elections, 1053 00:56:42,360 --> 00:56:46,160 Speaker 1: Facebook found that over forty percent of top views or impressions 1054 00:56:46,160 --> 00:56:48,440 Speaker 1: in the Indian state of West Bengal were fake or 1055 00:56:48,480 --> 00:56:51,640 Speaker 1: inauthentic, with one inauthentic account having amassed more than 1056 00:56:51,680 --> 00:56:55,000 Speaker 1: thirty million impressions. A report in March showed that many 1057 00:56:55,040 --> 00:56:57,960 Speaker 1: of the problems cited during the two thousand nineteen elections persisted.
1058 00:56:58,239 --> 00:57:01,720 Speaker 1: In the internal document called Adversarial Harmful Networks: India 1059 00:57:01,800 --> 00:57:04,400 Speaker 1: Case Study, a Facebook researcher wrote that there were groups 1060 00:57:04,400 --> 00:57:08,120 Speaker 1: and pages replete with inflammatory and misleading anti Muslim content 1061 00:57:08,200 --> 00:57:10,560 Speaker 1: on Facebook. The report said that there were a number 1062 00:57:10,560 --> 00:57:14,080 Speaker 1: of dehumanizing posts comparing Muslims to pigs and dogs, and 1063 00:57:14,120 --> 00:57:16,720 Speaker 1: misinformation claiming that the Koran, the Holy Book of Islam, 1064 00:57:16,800 --> 00:57:19,520 Speaker 1: calls for men to rape their female family members, so 1065 00:57:20,120 --> 00:57:23,640 Speaker 1: that's significant. Like the scale at which this shit spreads 1066 00:57:23,880 --> 00:57:29,600 Speaker 1: is huge, and and I I mean, I don't even 1067 00:57:29,720 --> 00:57:33,360 Speaker 1: I mean I feel like I know the answer if 1068 00:57:33,400 --> 00:57:37,640 Speaker 1: if the hate is existing on that scale unmitigated. But 1069 00:57:39,440 --> 00:57:42,520 Speaker 1: who is working to, like, how many people does Facebook 1070 00:57:42,560 --> 00:57:46,760 Speaker 1: have working on it? Is there is there an integrity team 1071 00:57:47,120 --> 00:57:51,600 Speaker 1: for this region? Like technically yes, the question is how 1072 00:57:51,600 --> 00:57:54,000 Speaker 1: many of them and how many of the languages there 1073 00:57:54,000 --> 00:57:57,240 Speaker 1: are represented by the team. And it's not many exactly, 1074 00:57:57,320 --> 00:57:59,880 Speaker 1: Like it's not many. You can't have a global company and 1075 00:58:00,040 --> 00:58:03,560 Speaker 1: not have global representation, or shit like this is going 1076 00:58:03,640 --> 00:58:06,520 Speaker 1: to happen, Like it's just it's actually, you know what 1077 00:58:06,600 --> 00:58:08,760 Speaker 1: it kind of reminds me of, Jamie. I was looking 1078 00:58:08,760 --> 00:58:10,360 Speaker 1: at this, and I was thinking about the East India 1079 00:58:10,400 --> 00:58:14,280 Speaker 1: Trading Company UM. When the East India Company took over 1080 00:58:14,400 --> 00:58:17,360 Speaker 1: large chunks of India UM. They took it over from 1081 00:58:17,360 --> 00:58:21,520 Speaker 1: a regime. The government, the monarchical government that had been 1082 00:58:21,520 --> 00:58:24,040 Speaker 1: in charge in that area prior, was not a good government, 1083 00:58:24,080 --> 00:58:25,920 Speaker 1: right, because they number one, they lost that war. But 1084 00:58:25,960 --> 00:58:28,440 Speaker 1: like they weren't a very good government, they were a government, 1085 00:58:28,480 --> 00:58:31,320 Speaker 1: so they did do things like provide aid in famines 1086 00:58:31,360 --> 00:58:34,080 Speaker 1: and disasters and have people whose job it was to 1087 00:58:34,120 --> 00:58:36,680 Speaker 1: like handle stuff like that and like handle like make 1088 00:58:36,720 --> 00:58:38,920 Speaker 1: sure that like stuff was getting where it 1089 00:58:38,960 --> 00:58:42,040 Speaker 1: needed to go during like calamities and whatnot, and doing 1090 00:58:42,080 --> 00:58:44,640 Speaker 1: things specifically that helped people but were not, were not 1091 00:58:44,720 --> 00:58:46,720 Speaker 1: profitable, because a big chunk of what a government does 1092 00:58:46,880 --> 00:58:50,040 Speaker 1: isn't directly profitable.
It's just helping to like keep people 1093 00:58:50,040 --> 00:58:54,160 Speaker 1: alive and keep the roads open and whatnot, right sustain humanity. 1094 00:58:54,520 --> 00:58:57,400 Speaker 1: When the East India Company took over, they were governing 1095 00:58:57,440 --> 00:58:59,640 Speaker 1: and in control of this region and this has actually 1096 00:58:59,680 --> 00:59:01,800 Speaker 1: been all I think is their first place. But they 1097 00:59:01,840 --> 00:59:04,080 Speaker 1: don't have any responsibility. They don't have teams who are 1098 00:59:04,080 --> 00:59:06,080 Speaker 1: dedicated to making sure people aren't starving. They don't have 1099 00:59:06,080 --> 00:59:08,439 Speaker 1: people who are dedicated to actually keeping the roads open 1100 00:59:08,440 --> 00:59:11,280 Speaker 1: in any way that isn't necessary for directly the trade 1101 00:59:11,280 --> 00:59:14,280 Speaker 1: that profits them. They don't do those things because they're 1102 00:59:14,320 --> 00:59:16,760 Speaker 1: not They're governing effectively, but they're not a government. And 1103 00:59:16,800 --> 00:59:20,040 Speaker 1: there's been a lot of talk about how Facebook is 1104 00:59:20,440 --> 00:59:23,640 Speaker 1: a is effectively like a nation, a digital nation of 1105 00:59:23,680 --> 00:59:26,040 Speaker 1: like three billion people, and Mark Zuckerberg has the power 1106 00:59:26,080 --> 00:59:28,400 Speaker 1: of a dictator. And one of the problems with that 1107 00:59:28,520 --> 00:59:31,480 Speaker 1: is that for all of their faults. Governments have a 1108 00:59:31,560 --> 00:59:35,320 Speaker 1: responsibility to do things for people that are like necessary 1109 00:59:35,360 --> 00:59:38,840 Speaker 1: to stop them, like to deal with like calamities and whatnot. 1110 00:59:39,240 --> 00:59:42,360 Speaker 1: Facebook has no such responsibility, and so when people were 1111 00:59:42,400 --> 00:59:44,760 Speaker 1: not paying attention to Sri Lanka to West Bengal to 1112 00:59:44,960 --> 00:59:48,640 Speaker 1: um Um to Myanmar, they didn't do anything. Um And 1113 00:59:48,680 --> 00:59:50,520 Speaker 1: as we know, like forty in the in a in 1114 00:59:50,560 --> 00:59:53,120 Speaker 1: a region where there are millions and millions of people, 1115 00:59:53,520 --> 00:59:57,800 Speaker 1: forty percent of the views were fake and authentic content, 1116 00:59:58,280 --> 01:00:01,480 Speaker 1: you know, like because don't give a ship what's spreading 1117 01:00:01,520 --> 01:00:03,800 Speaker 1: because they don't have to, because they don't have to 1118 01:00:03,800 --> 01:00:06,720 Speaker 1: deal with the consequences unless it pisses people off, as 1119 01:00:06,720 --> 01:00:09,560 Speaker 1: opposed to a government where it's like, well, yeah, we 1120 01:00:09,640 --> 01:00:11,840 Speaker 1: are made up of the people who live here and 1121 01:00:11,880 --> 01:00:14,440 Speaker 1: if things go badly enough, it can't not affect us. 1122 01:00:14,680 --> 01:00:16,840 Speaker 1: I'm not trying to be again, not like with TikTok, 1123 01:00:16,840 --> 01:00:19,400 Speaker 1: I'm not trying to praise the concept of governance. 
But 1124 01:00:19,480 --> 01:00:23,560 Speaker 1: it is better than what Facebook's doing right right, It's 1125 01:00:23,640 --> 01:00:27,040 Speaker 1: it's I think that that is like a very I'd 1126 01:00:27,080 --> 01:00:29,680 Speaker 1: never considered looking at it that way, but but viewing 1127 01:00:29,680 --> 01:00:33,760 Speaker 1: it as this kind of digital dictatorship, as a colonial dictatorship. 1128 01:00:33,800 --> 01:00:38,960 Speaker 1: It's colonized people's information, um, really, like, information streams, it's 1129 01:00:39,000 --> 01:00:42,280 Speaker 1: colonized the way people communicate, but it has no responsibility 1130 01:00:42,320 --> 01:00:44,720 Speaker 1: to them if they aren't white and wealthy. Well yeah, 1131 01:00:44,760 --> 01:00:47,840 Speaker 1: and and yeah, and it marginalizes people in the same ways 1132 01:00:47,840 --> 01:00:51,920 Speaker 1: that actual dictatorships do in terms of how much attention 1133 01:00:52,000 --> 01:00:55,120 Speaker 1: is being given, are people being hired to 1134 01:00:55,800 --> 01:00:58,520 Speaker 1: support and represent this area? And of course the answer 1135 01:00:58,760 --> 01:01:04,960 Speaker 1: is no. And of course the result of that is extreme human 1136 01:01:05,440 --> 01:01:09,360 Speaker 1: consequence and harm. And it's so and it's like, 1137 01:01:09,840 --> 01:01:12,360 Speaker 1: it's just so striking to me that it still feels 1138 01:01:12,400 --> 01:01:18,600 Speaker 1: like, in terms of the laws that exist that control, 1139 01:01:18,680 --> 01:01:22,160 Speaker 1: I mean, that even attempt to address the amount 1140 01:01:22,200 --> 01:01:25,920 Speaker 1: of influence and control that a gigantic digital network like 1141 01:01:25,920 --> 01:01:31,120 Speaker 1: like Facebook has. Um, you know that Facebook, I mean 1142 01:01:31,320 --> 01:01:33,960 Speaker 1: unless people are yelling at them, and unless unless their 1143 01:01:34,000 --> 01:01:36,440 Speaker 1: bottom line is threatened, they're never going to respond to 1144 01:01:36,480 --> 01:01:40,400 Speaker 1: stuff like this. Like, that's that's been made clear 1145 01:01:40,560 --> 01:01:44,280 Speaker 1: for decades at this point. It's great. I love it 1146 01:01:45,920 --> 01:01:49,080 Speaker 1: so. Well, I'm all worked up. Yeah. A great deal 1147 01:01:49,240 --> 01:01:52,720 Speaker 1: of the disinformation that goes throughout India on Facebook comes 1148 01:01:52,760 --> 01:01:56,200 Speaker 1: from the RSS, which is an Indian fascist organization closely 1149 01:01:56,240 --> 01:01:58,440 Speaker 1: tied to the BJP, which is the current 1150 01:01:58,520 --> 01:02:00,760 Speaker 1: ruling right wing party. And when I say fascist, I 1151 01:02:00,760 --> 01:02:02,640 Speaker 1: mean like some of the founders of the RSS 1152 01:02:02,840 --> 01:02:06,120 Speaker 1: were actual, like, friends with Nazis and they were 1153 01:02:06,120 --> 01:02:09,200 Speaker 1: heavily influenced by that shit in, like, the thirties. Both 1154 01:02:09,440 --> 01:02:12,880 Speaker 1: organizations are profoundly anti Muslim and the RSS's 1155 01:02:13,000 --> 01:02:16,200 Speaker 1: propaganda has been tied to numerous acts of violence. 1156 01:02:16,600 --> 01:02:19,600 Speaker 1: Facebook refuses to designate them a dangerous organization because of 1157 01:02:19,680 --> 01:02:22,600 Speaker 1: quote political sensitivities that might harm their ability to make 1158 01:02:22,640 --> 01:02:25,800 Speaker 1: money in India.
Facebook is the best friend many far 1159 01:02:25,920 --> 01:02:28,680 Speaker 1: right and fascist political parties have ever had. Take the 1160 01:02:28,760 --> 01:02:33,160 Speaker 1: Polish Confederation Party. They're your standard right wing extremists, anti immigrant, 1161 01:02:33,200 --> 01:02:36,440 Speaker 1: anti lockdown, anti vaccine, anti LGBT. The head of their 1162 01:02:36,440 --> 01:02:41,200 Speaker 1: social media team, Thomas garbage check, uh, sorry, Tomas, told 1163 01:02:41,240 --> 01:02:44,400 Speaker 1: The Washington Post that Facebook's hate algorithm, in his words, 1164 01:02:44,600 --> 01:02:47,000 Speaker 1: had been a huge boon to their digital efforts. Like 1165 01:02:47,040 --> 01:02:49,640 Speaker 1: he calls it a hate algorithm and says this is great 1166 01:02:49,680 --> 01:02:52,600 Speaker 1: for us, um, expanding. Like, I think, we're good 1167 01:02:52,600 --> 01:02:56,000 Speaker 1: with emotional messages and thus their shit spreads well on Facebook. 1168 01:02:56,520 --> 01:03:00,000 Speaker 1: Quote from The Washington Post. In one April 2019 1169 01:03:00,080 --> 01:03:02,560 Speaker 1: document detailing a research trip to the European Union, a 1170 01:03:02,560 --> 01:03:05,840 Speaker 1: Facebook team reported feedback from European politicians that an algorithm 1171 01:03:05,920 --> 01:03:08,600 Speaker 1: change the previous year, billed by Facebook chief executive 1172 01:03:08,600 --> 01:03:11,640 Speaker 1: Mark Zuckerberg as an effort to foster more meaningful interactions on 1173 01:03:11,680 --> 01:03:15,520 Speaker 1: the platform, had changed politics for the worse. This change, 1174 01:03:15,560 --> 01:03:17,960 Speaker 1: Mark claimed, was meant to make interactions more meaningful, but 1175 01:03:18,000 --> 01:03:19,840 Speaker 1: it was really just a tweak to the algorithm that 1176 01:03:19,880 --> 01:03:22,640 Speaker 1: made comments that provoked anger and argument even more viral. 1177 01:03:23,320 --> 01:03:25,600 Speaker 1: Um and I'm gonna quote from the Post again here. 1178 01:03:25,720 --> 01:03:27,800 Speaker 1: In two thousand and eighteen, Facebook made a big change 1179 01:03:27,800 --> 01:03:31,280 Speaker 1: to that formula to promote meaningful social interactions. These changes 1180 01:03:31,280 --> 01:03:33,120 Speaker 1: were billed as designed to make the news feed 1181 01:03:33,120 --> 01:03:35,160 Speaker 1: more focused on posts from family and friends and less 1182 01:03:35,160 --> 01:03:38,120 Speaker 1: from brands, businesses, and the media. The process weighted the 1183 01:03:38,160 --> 01:03:40,960 Speaker 1: probability that a post would produce an interaction such 1184 01:03:40,960 --> 01:03:43,680 Speaker 1: as a like, emoji or comment, more heavily than other factors, 1185 01:03:44,000 --> 01:03:46,640 Speaker 1: but that appeared to backfire. Haugen, who this week took 1186 01:03:46,640 --> 01:03:49,160 Speaker 1: her campaign against her former employer to Europe, voiced a 1187 01:03:49,160 --> 01:03:52,720 Speaker 1: concern that Facebook's algorithm amplifies the extreme. Anger and hate 1188 01:03:52,760 --> 01:03:54,800 Speaker 1: is the easiest way to grow on Facebook, she told 1189 01:03:54,800 --> 01:03:57,640 Speaker 1: British lawmakers, many of whom have their jobs because of 1190 01:03:57,680 --> 01:03:59,920 Speaker 1: how easy it is to make people's shit go 1191 01:04:00,080 --> 01:04:06,120 Speaker 1: viral when it comes.
I mean, that that's not 1192 01:04:06,160 --> 01:04:09,960 Speaker 1: true for... yes. Um, again, we're we're focusing on 1193 01:04:09,960 --> 01:04:11,720 Speaker 1: Facebook here in part because I do think it's more 1194 01:04:11,760 --> 01:04:13,280 Speaker 1: severe in a lot of ways there, but also just 1195 01:04:13,320 --> 01:04:15,760 Speaker 1: because like they're the ones who had a big leak, 1196 01:04:15,800 --> 01:04:18,640 Speaker 1: and so we have this data. So we're not just saying, yeah, 1197 01:04:18,760 --> 01:04:21,240 Speaker 1: look at Facebook, obviously hate is spreading. We're saying no, no no, 1198 01:04:21,400 --> 01:04:24,800 Speaker 1: we have numbers. We have their numbers about how fucking 1199 01:04:24,880 --> 01:04:28,760 Speaker 1: bad the problem is. I guess that that is the difference. Yeah, yeah, 1200 01:04:29,080 --> 01:04:32,440 Speaker 1: and and yeah, we have evidence that the system is 1201 01:04:32,440 --> 01:04:34,480 Speaker 1: well aware of the problem. Yeah. I would love to be 1202 01:04:34,520 --> 01:04:37,640 Speaker 1: talking about Twitter too. It's just maybe Twitter just never 1203 01:04:37,680 --> 01:04:40,920 Speaker 1: bothered to get those kind of numbers. Who knows. Um. 1204 01:04:40,920 --> 01:04:43,520 Speaker 1: This caused what experts describe as a social civil war 1205 01:04:43,600 --> 01:04:47,120 Speaker 1: in Poland. Like, this change, one internal report concluded: we 1206 01:04:47,120 --> 01:04:49,640 Speaker 1: can choose to be idle and keep feeding users fast food, 1207 01:04:49,680 --> 01:04:51,560 Speaker 1: but that only works for so long, and they've already 1208 01:04:51,560 --> 01:04:53,240 Speaker 1: caught onto the fact that fast food is linked to 1209 01:04:53,280 --> 01:04:55,680 Speaker 1: obesity, and therefore its short term value is not 1210 01:04:55,720 --> 01:04:59,000 Speaker 1: worth the long term cost. So he's being like, we're 1211 01:04:59,040 --> 01:05:03,880 Speaker 1: poisoning people and it's addictive, like you know McDonald's, but 1212 01:05:03,960 --> 01:05:06,040 Speaker 1: like people are going to give it up in the 1213 01:05:06,040 --> 01:05:09,760 Speaker 1: same way that McDonald's started to suffer a couple of 1214 01:05:09,840 --> 01:05:12,440 Speaker 1: years back, because like, they don't they don't like the 1215 01:05:12,440 --> 01:05:15,400 Speaker 1: way this makes them feel. Actually, it's fun for a moment. 1216 01:05:15,440 --> 01:05:19,240 Speaker 1: We just got to get a Morgan Spurlock for Facebook. Baby, 1217 01:05:19,480 --> 01:05:22,640 Speaker 1: we just got to get... where's the Super Size Me for Facebook? 1218 01:05:23,000 --> 01:05:28,400 Speaker 1: Entire society is the Morgan Spurlock for Facebook. January sixth is more. Yeah, 1219 01:05:29,760 --> 01:05:31,080 Speaker 1: I was gonna say it was like I feel like 1220 01:05:31,160 --> 01:05:37,640 Speaker 1: it's I mean, whatever, not to say that McDonald's isn't 1221 01:05:37,840 --> 01:05:41,680 Speaker 1: a hell of a drug, but like this is not 1222 01:05:42,000 --> 01:05:45,800 Speaker 1: the same. I mean, I it's stronger because it's your 1223 01:05:46,560 --> 01:05:50,160 Speaker 1: fucking brain and self image and the view of yourself.
1224 01:05:50,200 --> 01:05:54,360 Speaker 1: And I feel like that is the strongest manipulation 1225 01:05:54,440 --> 01:05:57,480 Speaker 1: that any given system, person, whatever, can have on you, 1226 01:05:57,520 --> 01:06:00,360 Speaker 1: is controlling the way that you see yourself. It's not 1227 01:06:01,600 --> 01:06:05,959 Speaker 1: the same in terms of, like, involuntariness. I feel 1228 01:06:05,960 --> 01:06:08,960 Speaker 1: like it's something that you very much participate in. Yeah, 1229 01:06:09,440 --> 01:06:13,840 Speaker 1: um it's bad. Yeah, um it's good. I think it's good. 1230 01:06:14,160 --> 01:06:17,520 Speaker 1: That's what I think, Jamie. Facebook has been aggressive. You've 1231 01:06:17,560 --> 01:06:20,280 Speaker 1: called me today to say it's good, actually, to read all 1232 01:06:20,320 --> 01:06:22,680 Speaker 1: this and then say, so that's fine, let's never talk 1233 01:06:22,720 --> 01:06:26,640 Speaker 1: of it again. Um. Anyway, Facebook has been aggressive at 1234 01:06:26,640 --> 01:06:30,040 Speaker 1: rebutting the allegations that their product leads to polarization. Their 1235 01:06:30,040 --> 01:06:32,560 Speaker 1: spokeswoman brought up a study which she said shows that 1236 01:06:32,640 --> 01:06:36,160 Speaker 1: academic research doesn't support the idea that Facebook or social 1237 01:06:36,160 --> 01:06:39,440 Speaker 1: media more generally is the primary cause of polarization. Let's 1238 01:06:39,520 --> 01:06:42,440 Speaker 1: ignore for the moment that not the primary cause doesn't 1239 01:06:42,480 --> 01:06:45,520 Speaker 1: mean isn't a significant cause. And let's look at this study. 1240 01:06:45,800 --> 01:06:49,640 Speaker 1: The spokeswoman was referencing Cross-Country Trends in Affective Polarization, 1241 01:06:49,920 --> 01:06:54,280 Speaker 1: an August study from researchers at Stanford and Brown University. 1242 01:06:54,360 --> 01:06:57,000 Speaker 1: This study opens by noting it includes data for only 1243 01:06:57,040 --> 01:06:59,960 Speaker 1: twelve countries and that all but Britain and Germany exhibited 1244 01:07:00,000 --> 01:07:04,120 Speaker 1: a positive trend towards more polarization. So right off 1245 01:07:04,120 --> 01:07:06,720 Speaker 1: the bat, there's some things to question about this study. Um, 1246 01:07:06,760 --> 01:07:09,000 Speaker 1: which is, number one, they're saying that, like, oh, Britain 1247 01:07:09,040 --> 01:07:11,320 Speaker 1: hasn't gotten more polarized, which is like, have you, have 1248 01:07:11,400 --> 01:07:15,080 Speaker 1: you been there? Have you talked to... but, I, yeah, 1249 01:07:15,200 --> 01:07:18,040 Speaker 1: not the, not the... I don't live there, but that's not what 1250 01:07:18,160 --> 01:07:21,480 Speaker 1: I've been hearing. And here's the thing: when you look 1251 01:07:21,520 --> 01:07:24,360 Speaker 1: at it, Facebook is basically citing this as, like, 1252 01:07:24,400 --> 01:07:26,920 Speaker 1: evidence that like, look, we're fine, social media is not the problem. 1253 01:07:27,040 --> 01:07:29,439 Speaker 1: This very credible study says that we're 1254 01:07:29,440 --> 01:07:33,560 Speaker 1: not the cause of polarization, so everything's good. The study 1255 01:07:33,800 --> 01:07:37,000 Speaker 1: doesn't quite back them up on this, um, right off 1256 01:07:37,040 --> 01:07:40,680 Speaker 1: the bat.
One of the authors, like, notes this, 1257 01:07:40,960 --> 01:07:42,280 Speaker 1: and this is from a write-up by one of 1258 01:07:42,280 --> 01:07:43,880 Speaker 1: the authors of the study on a website called tech 1259 01:07:43,880 --> 01:07:46,040 Speaker 1: policy dot com, where he's talking about the study and 1260 01:07:46,040 --> 01:07:49,040 Speaker 1: what it says. A flat or declining trend 1261 01:07:49,160 --> 01:07:51,160 Speaker 1: over the forty years of our sample does not rule 1262 01:07:51,160 --> 01:07:53,920 Speaker 1: out the possibility that countries have seen rising polarization in 1263 01:07:53,960 --> 01:07:56,840 Speaker 1: the most recent years. Britain, for example, shows a slight 1264 01:07:56,920 --> 01:07:59,760 Speaker 1: overall decline, but a clear increasing trend post-2000 1265 01:07:59,760 --> 01:08:02,640 Speaker 1: and Brexit. So he's saying that, like, we don't have 1266 01:08:02,720 --> 01:08:05,720 Speaker 1: as much data from like more recent polarization, and that 1267 01:08:05,800 --> 01:08:07,680 Speaker 1: may be a reason why this study is less accurate, 1268 01:08:07,720 --> 01:08:10,040 Speaker 1: why some of our statements do not 1269 01:08:10,160 --> 01:08:12,920 Speaker 1: conform with, like, what people have observed. He goes on 1270 01:08:13,000 --> 01:08:15,560 Speaker 1: to note the data do not provide much support for 1271 01:08:15,560 --> 01:08:18,360 Speaker 1: the hypothesis that digital technology is the central driver of 1272 01:08:18,400 --> 01:08:21,640 Speaker 1: affective polarization. The Internet has diffused widely in all the 1273 01:08:21,680 --> 01:08:24,240 Speaker 1: countries we looked at, and under simple stories where this 1274 01:08:24,320 --> 01:08:26,559 Speaker 1: is the key driver, we would have expected polarization to 1275 01:08:26,560 --> 01:08:29,559 Speaker 1: have risen everywhere as well. In our data, neither diffusion 1276 01:08:29,800 --> 01:08:32,839 Speaker 1: of internet nor penetration of digital news are significantly correlated 1277 01:08:32,880 --> 01:08:36,679 Speaker 1: with increasing polarization. Similarly, we found little association with changes 1278 01:08:36,720 --> 01:08:40,000 Speaker 1: in inequality or trade. One explanatory factor that looks more 1279 01:08:40,040 --> 01:08:42,800 Speaker 1: promising is increasing racial diversity. The non white share of 1280 01:08:42,800 --> 01:08:45,040 Speaker 1: the population has increased faster in the US than in 1281 01:08:45,080 --> 01:08:47,559 Speaker 1: almost any other country in our sample, and other countries 1282 01:08:47,600 --> 01:08:49,679 Speaker 1: like New Zealand and Canada, where it has risen sharply, 1283 01:08:49,680 --> 01:08:53,559 Speaker 1: have seen rising polarization as well. So I have 1284 01:08:53,640 --> 01:08:56,200 Speaker 1: some significant arguments with him here, including the fact that, 1285 01:08:56,600 --> 01:09:00,439 Speaker 1: as he notes here, his study only looks at Western nations. 1286 01:09:00,439 --> 01:09:02,840 Speaker 1: With the exception of Japan, all of the nations in 1287 01:09:02,840 --> 01:09:06,160 Speaker 1: the study are European, or the United States and Canada. 1288 01:09:06,640 --> 01:09:08,840 Speaker 1: Um, and so they all have had, prior to 1289 01:09:08,880 --> 01:09:11,960 Speaker 1: 2000, higher penetrations of the Internet and non-Internet 1290 01:09:11,960 --> 01:09:14,280 Speaker 1: mass media.
Like, outside of this, if you're trying to 1291 01:09:14,320 --> 01:09:17,240 Speaker 1: determine the impact of social media: elements of what social 1292 01:09:17,280 --> 01:09:19,320 Speaker 1: media has done were present in places like Fox 1293 01:09:19,360 --> 01:09:23,200 Speaker 1: News in the United States years before Facebook ever existed. Um, 1294 01:09:23,240 --> 01:09:25,080 Speaker 1: and that was not the case in places like Myanmar 1295 01:09:25,160 --> 01:09:26,840 Speaker 1: and India, which are not a part of 1296 01:09:26,840 --> 01:09:29,600 Speaker 1: this study. So right off the bat, it's problematic to 1297 01:09:29,640 --> 01:09:32,440 Speaker 1: try and study the impact of social media on polarization 1298 01:09:32,800 --> 01:09:35,840 Speaker 1: only in countries that already had robust mass media before 1299 01:09:35,880 --> 01:09:37,560 Speaker 1: social media came along. Which is not to 1300 01:09:37,600 --> 01:09:40,280 Speaker 1: say that I agree with their conclusion, because I think 1301 01:09:40,280 --> 01:09:43,160 Speaker 1: there's other flaws with this study, but one of 1302 01:09:43,160 --> 01:09:45,320 Speaker 1: the flaws is just that, like, hundreds of millions of 1303 01:09:45,360 --> 01:09:47,920 Speaker 1: their users exist in countries where the 1304 01:09:47,960 --> 01:09:50,360 Speaker 1: study was not done, um, where they were not looking 1305 01:09:50,479 --> 01:09:52,519 Speaker 1: at these places, which is a flaw, which is just 1306 01:09:52,640 --> 01:09:56,240 Speaker 1: like dependent on, yeah, and that's dependent on most readers 1307 01:09:56,360 --> 01:09:59,120 Speaker 1: just conflating, you know, North America and Europe with 1308 01:09:59,200 --> 01:10:00,960 Speaker 1: the center of the fucking... And again I have 1309 01:10:01,080 --> 01:10:03,759 Speaker 1: issues about like, okay, well you're saying that racial diversity 1310 01:10:03,800 --> 01:10:05,639 Speaker 1: is more of a thing. But like, where is the, 1311 01:10:05,800 --> 01:10:09,200 Speaker 1: where is the propaganda, where is the hate speech about 1312 01:10:09,240 --> 01:10:12,639 Speaker 1: racial diversity spreading? Is it spreading on social media? Like, yes, 1313 01:10:12,680 --> 01:10:14,960 Speaker 1: it is. I can say that as an expert. Um. 1314 01:10:15,000 --> 01:10:17,720 Speaker 1: It's also just like, again, not that this study is 1315 01:10:17,760 --> 01:10:21,080 Speaker 1: even bad or not useful. It is one study, um. 1316 01:10:21,240 --> 01:10:24,879 Speaker 1: And again we have internal Facebook studies that make claims 1317 01:10:24,920 --> 01:10:28,080 Speaker 1: that I would say throw some of this into question. 1318 01:10:28,560 --> 01:10:30,920 Speaker 1: But again, this is just how a corporation is going 1319 01:10:30,960 --> 01:10:32,720 Speaker 1: to react. They're going to find a study that they 1320 01:10:32,720 --> 01:10:36,639 Speaker 1: can simplify in such a way that 1321 01:10:36,640 --> 01:10:39,320 Speaker 1: they can claim there's not a problem, because nobody, 1322 01:10:40,479 --> 01:10:41,880 Speaker 1: none of the people who they're going to be arguing 1323 01:10:41,880 --> 01:10:43,839 Speaker 1: with on Capitol Hill, and precious few of the journalists, 1324 01:10:43,840 --> 01:10:45,720 Speaker 1: are going to actually drill into this and then talk 1325 01:10:45,760 --> 01:10:47,439 Speaker 1: to other experts.
You've gotta reach out to members of 1326 01:10:47,479 --> 01:10:50,040 Speaker 1: that study and be like, how fair is this phrasing? 1327 01:10:50,040 --> 01:10:52,040 Speaker 1: How does it deal with this information 1328 01:10:52,040 --> 01:10:54,160 Speaker 1: and this information? As we saw earlier with the last study, 1329 01:10:54,160 --> 01:10:56,200 Speaker 1: when the Wall Street Journal, to 1330 01:10:56,240 --> 01:10:58,639 Speaker 1: their credit, reached out to that scientist, he was like, well, 1331 01:10:58,680 --> 01:11:00,559 Speaker 1: actually they have better data than me, and I'd love 1332 01:11:00,640 --> 01:11:04,799 Speaker 1: to see it because maybe that'll change our conclusions. Um anyway, 1333 01:11:04,960 --> 01:11:08,759 Speaker 1: yeah, uh. Mark Zuckerberg has been consistent in his argument 1334 01:11:08,760 --> 01:11:11,599 Speaker 1: that deliberately pushing divisive and violent content would be bad 1335 01:11:11,600 --> 01:11:14,400 Speaker 1: for Facebook. Quote. We make money from ads, and advertisers 1336 01:11:14,439 --> 01:11:16,360 Speaker 1: consistently tell us they don't want their ads next to 1337 01:11:16,400 --> 01:11:19,439 Speaker 1: harmful or angry content. While I was writing this article, 1338 01:11:19,479 --> 01:11:21,920 Speaker 1: I browsed over to one of my test Facebook accounts. 1339 01:11:22,080 --> 01:11:23,800 Speaker 1: The third ad on my feed was for a device 1340 01:11:23,840 --> 01:11:27,759 Speaker 1: to illegally turn a Glock handgun into a fully automatic weapon. Um, 1341 01:11:28,360 --> 01:11:32,080 Speaker 1: just, just a heads up. Yeah, one of my... I 1342 01:11:32,080 --> 01:11:34,280 Speaker 1: have a couple of test feeds and it was like, hey, 1343 01:11:34,280 --> 01:11:37,280 Speaker 1: this button will turn your Glock automatic, which is so 1344 01:11:37,320 --> 01:11:40,080 Speaker 1: many felonies, Jamie. If you even have that thing and 1345 01:11:40,120 --> 01:11:42,439 Speaker 1: a Glock in your home, the FBI can put you 1346 01:11:42,479 --> 01:11:45,719 Speaker 1: away forever. I have to laugh. I have to laugh 1347 01:11:45,760 --> 01:11:48,400 Speaker 1: because that is really really scary. But yeah, it is 1348 01:11:48,439 --> 01:11:50,840 Speaker 1: like Mark being like, look, oh, no advertiser wants this 1349 01:11:50,880 --> 01:11:53,800 Speaker 1: to be a violent place. Buy a machine gun on Facebook, 1350 01:11:53,840 --> 01:11:56,360 Speaker 1: you know, next to ads that are like t shirts 1351 01:11:56,360 --> 01:11:59,960 Speaker 1: about telling liberals and stuff. Like, a machine gun 1352 01:12:00,000 --> 01:12:04,120 Speaker 1: advertiser maybe would be one that wouldn't take issue with that. 1353 01:12:04,720 --> 01:12:08,680 Speaker 1: Holy... I got fucking hang the media shirts advertised to me on 1354 01:12:08,720 --> 01:12:12,439 Speaker 1: Facebook, like, my god. Like, fuck you, 1355 01:12:12,640 --> 01:12:15,680 Speaker 1: Mark. Um, I used to... well, when I 1356 01:12:15,760 --> 01:12:17,800 Speaker 1: quit Facebook a couple of years ago, I was, I 1357 01:12:18,000 --> 01:12:21,920 Speaker 1: was getting normie advertisements. I was getting, good for you, 1358 01:12:22,080 --> 01:12:25,439 Speaker 1: those really scary ones, like those custom T 1359 01:12:25,520 --> 01:12:29,760 Speaker 1: shirts that say it's a Jamie Loftus thing, you wouldn't understand. 1360 01:12:31,400 --> 01:12:34,360 Speaker 1: And you wouldn't. I would not know.
And you know, 1361 01:12:34,840 --> 01:12:37,360 Speaker 1: the only time Facebook I can think of recently actually 1362 01:12:37,400 --> 01:12:40,360 Speaker 1: anticipated something I wanted is they keep showing me, on 1363 01:12:40,400 --> 01:12:43,600 Speaker 1: all of the accounts that I've, I've used, videos of 1364 01:12:43,640 --> 01:12:46,759 Speaker 1: hydraulic presses crushing things, and I do love those videos. 1365 01:12:47,120 --> 01:12:50,519 Speaker 1: Those, those are, those are pretty, pretty fun. And that's 1366 01:12:50,560 --> 01:12:55,000 Speaker 1: the meaningful social interactions that Mr Mark Zuckerberg was talking about, 1367 01:12:55,200 --> 01:12:58,160 Speaker 1: was the hydraulic press videos, and those are very comforting. 1368 01:12:58,240 --> 01:13:01,559 Speaker 1: On the good old Internet, which also wasn't all that great, 1369 01:13:01,600 --> 01:13:03,920 Speaker 1: but on the old Internet, which was a 1370 01:13:04,000 --> 01:13:05,439 Speaker 1: lot more fun, there would have just been a 1371 01:13:05,439 --> 01:13:07,679 Speaker 1: whole website that was just like, here's all the videos 1372 01:13:07,720 --> 01:13:11,600 Speaker 1: of hydraulic presses crushing things. Come watch this shit. There 1373 01:13:11,600 --> 01:13:14,639 Speaker 1: wouldn't have been any algorithm necessary. You could just scroll 1374 01:13:14,720 --> 01:13:20,719 Speaker 1: through videos. There's no friend function, it's just hydraulic press shit. Yeah, 1375 01:13:20,760 --> 01:13:24,920 Speaker 1: that's all I need, baby. So back to 1376 01:13:24,960 --> 01:13:28,200 Speaker 1: the point, it is undeniable that any service on the 1377 01:13:28,240 --> 01:13:30,559 Speaker 1: scale of Facebook, again like three billion users, is going 1378 01:13:30,600 --> 01:13:32,600 Speaker 1: to face some tough choices when it comes to the 1379 01:13:32,600 --> 01:13:35,479 Speaker 1: problem of regulating the speech of political movements and thinkers. 1380 01:13:35,800 --> 01:13:37,960 Speaker 1: As one employee wrote in an internal message, I am 1381 01:13:38,000 --> 01:13:40,840 Speaker 1: not comfortable making judgments about some parties being less good 1382 01:13:40,840 --> 01:13:43,360 Speaker 1: for society and less worthy of distribution based on where 1383 01:13:43,360 --> 01:13:47,280 Speaker 1: they fall in the ideological spectrum. That's true. Um, this 1384 01:13:47,360 --> 01:13:49,960 Speaker 1: is again part of the problem of not regulating them 1385 01:13:50,000 --> 01:13:53,040 Speaker 1: like a media company, like a newspaper or something. Um, 1386 01:13:53,120 --> 01:13:56,400 Speaker 1: because by not making any choices, they're making an editorial choice, 1387 01:13:56,400 --> 01:13:59,800 Speaker 1: which is to allow this stuff to spread, presumably. Actually, 1388 01:14:00,080 --> 01:14:01,760 Speaker 1: if you were being held to some kind of 1389 01:14:01,840 --> 01:14:04,519 Speaker 1: legal standard, that again most of our media isn't anymore, 1390 01:14:04,720 --> 01:14:07,040 Speaker 1: you would at least have to be like, well, let's 1391 01:14:07,080 --> 01:14:10,400 Speaker 1: evaluate the truthfulness of some of these basic statements before 1392 01:14:10,439 --> 01:14:12,439 Speaker 1: presenting them. And I would say that's where the judgment 1393 01:14:12,439 --> 01:14:15,280 Speaker 1: should come in. But that's expensive.
If Facebook is saying, 1394 01:14:15,520 --> 01:14:17,559 Speaker 1: we won't judge based on politics, but we will 1395 01:14:17,640 --> 01:14:19,680 Speaker 1: judge based on whether or not something is 1396 01:14:19,760 --> 01:14:23,479 Speaker 1: counterfactual, that I think is morally defensible. But that's 1397 01:14:23,479 --> 01:14:26,160 Speaker 1: expensive as shit and they're never going to do that. 1398 01:14:26,439 --> 01:14:30,800 Speaker 1: Look, spending more... Moral decisions are famously not cheap, and 1399 01:14:30,880 --> 01:14:33,120 Speaker 1: that is a lot of the reason why people do 1400 01:14:33,200 --> 01:14:36,439 Speaker 1: not do them. Yeah, um, it is true. That is 1401 01:14:36,479 --> 01:14:41,400 Speaker 1: not a profitable venture. Yeah, no, of course not, um. 1402 01:14:41,439 --> 01:14:44,240 Speaker 1: And the other thing that's true is that Facebook already 1403 01:14:44,280 --> 01:14:46,759 Speaker 1: makes a lot of decisions about which politicians and parties 1404 01:14:46,800 --> 01:14:49,360 Speaker 1: are worthy of speech, and they make that decision based 1405 01:14:49,400 --> 01:14:51,439 Speaker 1: mostly on whether or not said public figures get a 1406 01:14:51,439 --> 01:14:55,080 Speaker 1: lot of engagement. Midway through last year, they deleted, 1407 01:14:55,640 --> 01:14:58,720 Speaker 1: like, all of the different anarchist media groups 1408 01:14:58,720 --> 01:15:00,679 Speaker 1: and a lot of anti fascist groups that had accounts 1409 01:15:00,680 --> 01:15:02,759 Speaker 1: on Facebook. Just across the board, they deleted CrimethInc 1410 01:15:02,840 --> 01:15:06,920 Speaker 1: and, uh, um, they kicked off It's Going Down. 1411 01:15:07,080 --> 01:15:10,840 Speaker 1: Like a rapper, I know. So, like, yeah, they, I mean, 1412 01:15:11,040 --> 01:15:13,519 Speaker 1: I mean, nobody ever complains when bad shit happens 1413 01:15:13,560 --> 01:15:16,040 Speaker 1: to anarchists except for anarchists. But yeah, they nuked a 1414 01:15:16,080 --> 01:15:19,599 Speaker 1: bunch of anarchist content, um, just kind of blanket saying 1415 01:15:19,600 --> 01:15:21,280 Speaker 1: it was dangerous. And I think it was because they 1416 01:15:21,320 --> 01:15:23,080 Speaker 1: had just nuked the Proud Boys and they had to be 1417 01:15:23,120 --> 01:15:25,760 Speaker 1: shown to be fair. Um. But it has now come 1418 01:15:25,760 --> 01:15:28,320 Speaker 1: out that they have a whole program called XCheck, 1419 01:15:28,560 --> 01:15:31,960 Speaker 1: or cross check, which is where they decide which political 1420 01:15:32,000 --> 01:15:37,759 Speaker 1: figures get to spread violent and false content without getting banned. Um, 1421 01:15:37,800 --> 01:15:40,960 Speaker 1: they had a... yeah, based on engagement. They've claimed for 1422 01:15:41,040 --> 01:15:43,840 Speaker 1: years that everybody's accountable to site rules. But again the 1423 01:15:43,840 --> 01:15:47,200 Speaker 1: Facebook Papers have revealed that, like, that's explicitly a lie, 1424 01:15:47,240 --> 01:15:49,320 Speaker 1: and it's a lie Facebook has told other people at 1425 01:15:49,400 --> 01:15:51,479 Speaker 1: high levels of Facebook. And I'm gonna quote from the 1426 01:15:51,479 --> 01:15:53,880 Speaker 1: Wall Street Journal here.
The program, known as cross check or 1427 01:15:54,040 --> 01:15:56,719 Speaker 1: XCheck, was initially intended as a quality control measure 1428 01:15:56,920 --> 01:16:00,520 Speaker 1: for actions taken against high profile accounts, including celebrities, politicians, 1429 01:16:00,560 --> 01:16:03,600 Speaker 1: and journalists. Today, it shields millions of VIP 1430 01:16:03,840 --> 01:16:07,040 Speaker 1: users from the company's normal enforcement process. The documents show 1431 01:16:07,320 --> 01:16:10,360 Speaker 1: some users are whitelisted, rendered immune from enforcement actions, 1432 01:16:10,360 --> 01:16:13,160 Speaker 1: while others are allowed to post rule violating material pending 1433 01:16:13,160 --> 01:16:16,280 Speaker 1: Facebook employee reviews that often never come. At times, the 1434 01:16:16,320 --> 01:16:19,320 Speaker 1: documents show, XCheck has protected public figures whose posts 1435 01:16:19,320 --> 01:16:22,719 Speaker 1: contain harassment or incitement to violence, violations that would typically 1436 01:16:22,760 --> 01:16:25,799 Speaker 1: lead to sanctions for regular users. In 2019, 1437 01:16:25,840 --> 01:16:28,559 Speaker 1: it allowed international soccer star Neymar to show nude 1438 01:16:28,560 --> 01:16:30,479 Speaker 1: photos of a woman who had accused him of rape 1439 01:16:30,520 --> 01:16:32,519 Speaker 1: to tens of millions of his fans before the content 1440 01:16:32,600 --> 01:16:36,280 Speaker 1: was removed by Facebook. Whitelisted accounts shared inflammatory claims 1441 01:16:36,280 --> 01:16:39,479 Speaker 1: that Facebook's fact checkers deemed false, including that vaccines 1442 01:16:39,520 --> 01:16:42,439 Speaker 1: are deadly, that Hillary Clinton had covered up pedophile rings, 1443 01:16:42,640 --> 01:16:45,040 Speaker 1: and that then-President Donald Trump had called all refugees 1444 01:16:45,040 --> 01:16:48,880 Speaker 1: seeking asylum animals. According to the documents, a 2019 1445 01:16:48,960 --> 01:16:52,320 Speaker 1: review of Facebook's whitelisting procedures, marked attorney-client 1446 01:16:52,400 --> 01:16:55,920 Speaker 1: privileged, found favoritism to those users to be both widespread 1447 01:16:55,960 --> 01:16:59,360 Speaker 1: and not publicly defensible. We are not actually doing what 1448 01:16:59,400 --> 01:17:01,960 Speaker 1: we say we do publicly, said the confidential review. It 1449 01:17:02,040 --> 01:17:05,280 Speaker 1: called the company's actions a breach of trust, and added, 1450 01:17:05,360 --> 01:17:07,800 Speaker 1: unlike the rest of our community, these people violate our 1451 01:17:07,840 --> 01:17:09,960 Speaker 1: standards without any consequence. And they lied to, like, their 1452 01:17:09,960 --> 01:17:12,000 Speaker 1: board members about this. Well, they didn't lie 1453 01:17:12,000 --> 01:17:13,679 Speaker 1: about whether or not it was a thing. They said 1454 01:17:13,720 --> 01:17:15,360 Speaker 1: it was very small, and, just, really, I think the 1455 01:17:15,400 --> 01:17:17,360 Speaker 1: initial claim was, like, we have to have something like 1456 01:17:17,400 --> 01:17:19,120 Speaker 1: this in place for people like President Trump.
But it's 1457 01:17:19,160 --> 01:17:22,280 Speaker 1: a tiny number of people, and it's because they occupy 1458 01:17:22,360 --> 01:17:26,080 Speaker 1: some political position where we can't just as easily, you know, 1459 01:17:26,160 --> 01:17:29,200 Speaker 1: delete their account, because it creates other problems, because they're 1460 01:17:29,240 --> 01:17:30,960 Speaker 1: not as fringe as they need to be for this 1461 01:17:31,720 --> 01:17:36,240 Speaker 1: conduct to be... That was their justification. One sec, Jamie. 1462 01:17:36,320 --> 01:17:39,160 Speaker 1: That was their justification on the level of, you can 1463 01:17:39,240 --> 01:17:43,360 Speaker 1: be unethical and still be legal. And well, here's the 1464 01:17:43,400 --> 01:17:46,439 Speaker 1: thing: they told their board they only did this for 1465 01:17:46,479 --> 01:17:48,160 Speaker 1: a small number of users. You want to guess 1466 01:17:48,160 --> 01:17:50,439 Speaker 1: what that small number was? Oh, I love when 1467 01:17:50,479 --> 01:17:53,120 Speaker 1: Facebook says there's a small number. What is it? What 1468 01:17:53,280 --> 01:18:00,479 Speaker 1: is it? Eight million? That's so many. Yeah, oh dear, 1469 01:18:01,560 --> 01:18:04,960 Speaker 1: it's very funny. It's very funny. It's all good. I... 1470 01:18:07,600 --> 01:18:11,439 Speaker 1: that is so... I mean, yeah, they're just, they're 1471 01:18:11,520 --> 01:18:15,160 Speaker 1: just... can I say something controversial? Please. I don't like 1472 01:18:15,240 --> 01:18:18,519 Speaker 1: this company one bit. You don't? Well, I feel like 1473 01:18:18,600 --> 01:18:22,080 Speaker 1: that's going a bit, uh, that's going a bit far. Sorry. 1474 01:18:22,560 --> 01:18:26,800 Speaker 1: And I'm famously, you know, I don't like making harsh 1475 01:18:26,880 --> 01:18:29,240 Speaker 1: judgments on others, but I'm starting to think that they 1476 01:18:29,320 --> 01:18:32,599 Speaker 1: might be doing some bad stuff over there. Mm hmm, yeah. 1477 01:18:32,760 --> 01:18:35,920 Speaker 1: I would, you know... I don't like these people. I 1478 01:18:35,960 --> 01:18:38,240 Speaker 1: don't like these people at all. You know what I 1479 01:18:38,280 --> 01:18:43,479 Speaker 1: do like, Jamie? Ending podcast episodes. Mm. Oh, I 1480 01:18:43,520 --> 01:18:46,080 Speaker 1: actually do like that. Yeah, that's the thing I'm best at. 1481 01:18:46,479 --> 01:18:50,880 Speaker 1: Do you want to plug your pluggables, Jamie? Um? Yeah, 1482 01:18:50,880 --> 01:18:55,799 Speaker 1: sure. You can find... I'm gonna just open by plugging 1483 01:18:55,800 --> 01:19:01,280 Speaker 1: my Instagram account, a famously healthy platform that I'm addicted to, 1484 01:19:01,439 --> 01:19:03,720 Speaker 1: and I don't really have any concerns about it. I 1485 01:19:03,720 --> 01:19:06,679 Speaker 1: don't really think it's affecting my mental health at all. Um. 1486 01:19:06,720 --> 01:19:10,040 Speaker 1: So I'm over there and that's Jamie Christ Superstar. I'm 1487 01:19:10,040 --> 01:19:12,559 Speaker 1: also on Twitter, which Robert can't stop saying is the 1488 01:19:12,560 --> 01:19:17,280 Speaker 1: healthiest of the platforms.
Of all of the people who 1489 01:19:17,320 --> 01:19:22,120 Speaker 1: are drunk driving through intersections filled with children, Twitter has 1490 01:19:22,160 --> 01:19:25,559 Speaker 1: the least amount of human blood and gore underneath the 1491 01:19:25,600 --> 01:19:29,080 Speaker 1: grill of the car. Like, Robert is saying, for all 1492 01:19:29,120 --> 01:19:32,080 Speaker 1: you Backstreet Boys heads, he's saying that Twitter is the 1493 01:19:32,160 --> 01:19:35,360 Speaker 1: Kevin Richardson of social media. I'm there as well. I'm 1494 01:19:35,439 --> 01:19:39,400 Speaker 1: saying the drunk driving Twitter car made it a 1495 01:19:39,439 --> 01:19:42,920 Speaker 1: full fifteen feet further than the Facebook car before the 1496 01:19:42,920 --> 01:19:45,479 Speaker 1: sheer amount of blood being churned up into the engine 1497 01:19:45,520 --> 01:19:48,200 Speaker 1: flooded the air intakes. But at the end of 1498 01:19:48,200 --> 01:19:51,160 Speaker 1: the day, we're all fucked. I'm on Twitter as well, 1499 01:19:51,400 --> 01:19:57,000 Speaker 1: at Jamie. Listen to my podcast. Yeah, you know, you 1500 01:19:57,000 --> 01:20:00,559 Speaker 1: can listen to my podcast, The Bechdel Cast. You can listen 1501 01:20:00,600 --> 01:20:03,400 Speaker 1: to Aack Cast, that's about the Cathy comics. You can 1502 01:20:03,439 --> 01:20:05,519 Speaker 1: listen to My Year in Mensa. You can listen to 1503 01:20:05,560 --> 01:20:08,040 Speaker 1: the Lolita Podcast. You can listen to... you 1504 01:20:08,040 --> 01:20:11,599 Speaker 1: know what never led to a genocide in any country, 1505 01:20:11,640 --> 01:20:14,880 Speaker 1: as far as I'm aware of, Jamie? Uh, the Cathy comics. 1506 01:20:14,960 --> 01:20:17,400 Speaker 1: Well, see, then you haven't listened to the whole series. Really? 1507 01:20:17,560 --> 01:20:20,040 Speaker 1: Is it? Oh? You know what? You know? Yeah, that's, 1508 01:20:20,120 --> 01:20:23,000 Speaker 1: that's why the last, the last episode is a 1509 01:20:23,040 --> 01:20:28,400 Speaker 1: live report from Sarajevo in episode eleven. Yeah, Irving, 1510 01:20:28,479 --> 01:20:31,280 Speaker 1: really, his, his politics were not good. Yeah, he was, 1511 01:20:31,400 --> 01:20:35,919 Speaker 1: he was, he was like weirdly into the Serbian nationalism. Um. 1512 01:20:36,520 --> 01:20:39,599 Speaker 1: Irving is, like, for the Cathy comics, he's like... okay, 1513 01:20:39,640 --> 01:20:43,760 Speaker 1: I'm about to make a wild parallel. But Irving is 1514 01:20:43,800 --> 01:20:48,000 Speaker 1: like the Barefoot Contessa's husband in that he looks so innocent, 1515 01:20:48,120 --> 01:20:50,679 Speaker 1: but then when you google him, you're like, wait a second. 1516 01:20:51,040 --> 01:20:53,600 Speaker 1: This man is running on dark money. This guy, 1517 01:20:53,640 --> 01:20:55,559 Speaker 1: like, was on Wall Street in the eighties. This is 1518 01:20:55,600 --> 01:21:00,000 Speaker 1: a bad man. He's basically like Jeffrey, the Barefoot 1519 01:21:00,000 --> 01:21:03,320 Speaker 1: Contessa's husband. The Barefoot Contessa is run on dark money. 1520 01:21:03,640 --> 01:21:06,000 Speaker 1: I know people don't like to hear it. They love her, 1521 01:21:06,600 --> 01:21:09,160 Speaker 1: but it's just true. It's objectively true, and that's what 1522 01:21:09,200 --> 01:21:11,559 Speaker 1: I would like to say at the end of the episode.
1523 01:21:11,760 --> 01:21:14,439 Speaker 1: I've never heard of the Barefoot Contessa and I don't 1524 01:21:14,479 --> 01:21:17,800 Speaker 1: know what you're talking about. I am not even one 1525 01:21:17,840 --> 01:21:20,760 Speaker 1: percent surprised, but that's okay. But you know what I 1526 01:21:20,800 --> 01:21:24,400 Speaker 1: do know about? I know about podcasts, and this 1527 01:21:24,439 --> 01:21:26,240 Speaker 1: one is done. Great ending