Speaker 1: The heartbreaking videos and images from the ongoing fires in Los Angeles are filling up social media. A number of them are claiming to show the iconic Hollywood Sign on fire last night.
Speaker 2: Earlier this year, a bunch of videos went viral showing the Hollywood Sign on fire.
Speaker 3: Literally like the Hollywood Sign, it was burning, it was burning yesterday, burning, like what is going on?
Speaker 4: It looked like the apocalypse.
Speaker 2: They were posted in January when fires were raging across LA, where I live. This was a scary time, and those videos only made things worse.
Speaker 4: These viral images that we see of the Hollywood Sign on fire actually started to spread on social media as Los Angeles battles one of its worst fires in all of its history.
Speaker 2: There were different versions. Some of the videos just showed fire on the hill below the letters, and in other videos the letters themselves were burning.
Speaker 3: When we saw the posts on Instagram, we were devastated. We lived here all our lives, and when we saw that on Instagram, that broke our hearts. So we wanted to come and see for ourselves, and we're really shocked that there's no fires. There's no fires, and they're faking it, basically.
Speaker 2: So those videos were all fake. They were generated by AI. If you saw these videos when they first came out, maybe you already knew they were fake. But what you might not know is how much money these videos can make for their creators. And if the online course that teaches you how to make these videos is to be believed, you can make over five thousand dollars a month. Videos like the Hollywood Sign burning are part of a bigger phenomenon that is probably filling up your feed right now. There's a word for this: slop.
Speaker 5: I actually don't know where it came from.
Like I had been calling it AI spam, I was writing about it before it was called AI slop, and then someone on Twitter, I think, just started calling it slop, and I feel like slop is a better term for it. There's a lot of different types of AI slop at this point, but basically it is an AI generated image or video that is designed to go viral on social media.
Speaker 2: Jason Koebler is my friend and former colleague from back when we both worked at Vice. Since then, he co-founded a site called 404 Media, and pretty much right out the gate this stuff turned into one of his beats. You're one of the first people who was really documenting AI slop before it really, truly blew up.
Speaker 5: Yeah, I mean, I'm obsessed with it. I've been obsessed with it since the get go. I've just found it to be, like, very fascinating to cover, because I can kind of, like, watch in real time as people lose touch with reality. To be honest with you.
Speaker 2: I don't know if I'm quite as obsessed with this stuff as Jason is. But then again, he was looking at golden Jesus helicopters and I'm here looking at fake fires. But when I started investigating, I realized that the stuff that he was seeing and the stuff that I was watching were all part of the same machine. And I managed to talk to one of the people behind it. From Kaleidoscope and iHeart Podcasts, this is Kill Switch. I'm Dexter Thomas.
Speaker 2: When did you start writing about AI slop?
Speaker 5: Yeah, it was December twenty twenty three. One of our readers posted this image of a guy in the United Kingdom who carves dogs out of wood, like, with a chainsaw, and this reader had noticed that there were all these copycats of this one guy's dog wood carvings. Basically, there were these Facebook pages that were posting dozens and dozens of different versions of the original photos that he was uploading.
And so in some images, it's like, the guy would have a goatee. In some images, the guy would look Latino. In some images, the guy would be a woman. In some images, the dog would be a German shepherd instead of, like, you know, a golden retriever. And it was very obvious that they had been modified by AI.
Speaker 2: This stuff sounds pretty innocuous, and I guess it was. Back in twenty twenty three, image-to-image generators like DALL-E and Microsoft's Bing Image Generator were still kind of novel, and if someone wanted to use that to make variations on existing images of chainsaw wood carvings, I mean, that's not really my thing, but hey, if people are into it, whatever. It was just kind of a...
Speaker 5: A lot at first. This meant a lot of people were using these AI image generators and they were creating Facebook pages and then posting, I don't know, twenty, thirty, forty times a day, and each page would have a theme. So one of the themes would be like beautiful log cabins. So a person would generate one hundred log cabins using these AI image generators and then they would post them on Facebook.
Speaker 2: But as AI tools got better and these Facebook pages became more viral, the AI slop became more targeted and weirder.
Speaker 5: The moment that this became really mainstream, in my opinion, was this moment of shrimp Jesus. And shrimp Jesus, oh, we go right into shrimp Jesus? I mean, I think so. So for a while, there were all of these images of Jesus, this sort of stereotypical white Jesus, long hair, and people were just posting hundreds of images of Jesus in different situations, and then the caption would say, please give me an amen. They would be asking people to comment on this, which would boost it in the algorithm, and none of it was real. It was all AI generated. There were a lot of images of Jesus sand sculptures, and then the caption would be like, only true believers will love my art, or something like that.
And then there was this one image where Jesus was floating above the ocean, and he had the face of Jesus but the arms of shrimp. It was basically like an arachnid Jesus with, like, six arms, and they were all made of shrimp. And it was just, like, obviously insane, like an insane image, something that a human, I don't think, would probably ever think up on their own. And this image had millions of likes on Facebook, and there were a bunch of different versions of it. It just went incredibly viral, and weirdly, like, a lot of the comments were not even about the fact that Jesus was a shrimp. It was just like, amen, I love God, things like that. And so, I mean, I don't think people thought it was real, but it clearly didn't matter to a lot of people that this was just, like, an absurd image.
Speaker 2: It sounds like shrimp Jesus was kind of the turning point for AI slop. I don't want to say the genesis necessarily, I don't want to get too biblical with this, but when I read your article on shrimp Jesus, I think that is when I really, truly realized, okay, something really, really strange is happening here.
Speaker 5: Yeah, it was very interesting that this was happening on Facebook first. Like, Facebook is a really old social network at this point. It's not cool anymore. Like, I don't know a lot of people who use Facebook in the same way that they might have in, like, two thousand and eight, and the people that I do know who use it are generally older and therefore, I mean, perhaps more susceptible to fake AI images. And then I wrote that article, and there were also some different meme accounts on Instagram and on Twitter who took screenshots of shrimp Jesus, and they started saying Facebook is cooked. "Facebook is cooked" became a meme, meaning, look at what's happening over on Facebook. All the old people are getting tricked into liking this absurd stuff. And so that is a moment where AI slop escaped Facebook.
Speaker 2: Shrimp Jesus went viral in April of twenty twenty four. At the time, I remember thinking it was pretty funny. I actually interviewed Jason about it and we basically spent half an hour just laughing. But now, just a year later, AI slop isn't only on Facebook tricking our parents. It's everywhere, and it's not just bizarre or obviously fake stuff like shrimp Jesus anymore. Some of those fake LA fire videos looked real, and they fooled a lot of people.
Speaker 3: So we came right here just to check out the fires, and there's nothing burning in Hollywood.
Speaker 5: The sign, like, it was shown on TV last night.
Speaker 3: You guys should help each other out instead of trying to create false narratives and trying to promote BS, you know. Help each other out. It's not, it's not, it's not right.
Speaker 2: This is the latest iteration of AI content: more realistic, more targeted, and tied to real world events.
Speaker 5: That's the scary thing about it, right, is that when I first started writing about this, there was almost nothing that was political or tied to the news in any way. It was almost all like, look at this crazy food, look at this amazing futuristic sci-fi scenescape, look at shrimp Jesus, look at this sand sculpture. And then what has happened is the people making this stuff have realized that they can get more attention if they start tying it to the news in some way.
Speaker 2: So one night, as the fires were still burning in LA, I opened up Instagram, and this was kind of everyone's way of communicating, and it was also kind of a lifeline. When you look at someone's stories, you never knew if it was going to be them saying, hey, I'm okay, I'm safe, or if it was going to be them showing their house burning down. So I'm scrolling through and I see this video of animals that are in a burning forest and there's firefighters saving the baby animals.
There's a firefighter holding these two baby bear cubs and carrying them out of the flames. Pretty quickly, I realized, wait a second, this is fake. And then I look at the view count. It's in the millions, and there's thousands of comments. I get curious and I look at the username: Future Rider US. That's kind of weird. So I look at the post history, and most of the posts were of these AI motorcycles that were riding through futuristic landscapes, so the name makes sense for those. But I scroll up and I see that as soon as the fires started, they posted a fake Hollywood Sign burning video that got them a million views, and that was their biggest success to that point, and they just kept posting fire content after that.
Speaker 5: The thing that's happened alongside of that is the AI has gotten a lot better. The AI has gotten a lot more convincing.
Speaker 2: Like I said, I was able to tell this was fake, but it took me a second. And the comments are kind of mixed. There's some people who are completely unaware that this stuff is fake and saying things like, wow, look at these brave firefighters, I'm so glad they're helping the animals. But there's other people who are posting angry comments like, how dare you make fake content about this disaster?
Speaker 5: And so now you have a lot of people who are making stuff that is designed to hit the news cycle in some way, to upset people, to enrage people, or to trick them. It's not just on Facebook. It's on YouTube, it's on TikTok, it's on Instagram. You know, I've even seen it on Pinterest and LinkedIn. So it's really become a strategy for people who want to make money on the internet at this point.
Speaker 2: Yeah, it's weird, because it seemed like everything trickled down to Facebook in the past few years. Good features from somewhere else eventually make it over to Facebook.
Information that first happens on Twitter, because it's really real time, eventually makes it over to Facebook. Now it's like this really weird thing that started on Facebook has spread out to the rest of the world.
Speaker 5: It's the first time Facebook's been relevant in like a decade.
Speaker 2: This is the innovation that's happening on Facebook. No joke. Yeah, it's AI slop. But when was it that you first started seeing stuff that was realistic and was directly tied to actual events that were happening in the real world?
Speaker 5: There was this moment where, in Israel's invasion of Gaza, there was that moment where they went into Rafah, and there was an AI generated image called All Eyes on Rafah.
Speaker 2: Can you describe the image?
Speaker 5: Yeah. So it was huge font and it said All Eyes on Rafah. There was like a bunch of tents and what looked to be like a refugee camp. And this was shared like tens of millions of times on Instagram. A lot of celebrities shared it. There was a bunch of people writing about it. So I started looking for where it came from, and I found this group on Facebook for AI creators in Malaysia, and they were testing all of these different versions of AI generated images about the war in Gaza, and the All Eyes on Rafah image came from that Facebook group. So that was the first time that I ever saw AI spam intersect directly with the news, and it was interesting because in that group they were talking about the All Eyes on Rafah image and how viral it went, and were like, maybe we can reverse engineer this, maybe we can do this over and over and over again. A month after that, I figured out how people were making money with these images and why, and the whole strategy behind it.
Speaker 2: Who's behind these posts, and how do they make money? We'll get into that after the break.
One of the disconnects that I've been seeing just in my own reporting is that I think there is an awareness growing among some people that AI generated images are a thing, and that it's not always obvious if something is AI. But I think a lot of people don't necessarily realize why this is being made, what motivation there might be.
Speaker 5: For a very long time, I, and also the experts that I spoke to, one thing that we couldn't figure out was like, why are they doing this? Like, what is the scam here? Because there was no obvious way that it was being monetized. For months, I thought that people were doing this and trying to make money off of Facebook's platform in some way. I thought they were trying to hack people, I thought they were trying to push them off the website to steal their credit card information or something like that. But all along, they were just getting paid directly by Facebook because these images were going viral.
Speaker 2: These creators don't need to scam anybody. Having a viral video is enough to make a lot of money.
Speaker 5: Facebook has this thing called the Creator Bonus Program, and it basically pays people a sliver of ad revenue for posts that get engagement.
Speaker 2: How much money can you make? Meta doesn't publish those numbers, but I asked a few influencers, and recently the rate seemed to be around one hundred to one hundred and twenty dollars per million views.
Speaker 5: It's become a job, to be totally honest with you. It's pretty crazy, because there's like this one example of a guy who quit his job in India. He was like working in the medical industry and he wasn't making a lot of money and he was supporting his family, and he was like, I used to work like sixteen hour days, and then I learned about spamming Facebook. And he posted this one image that was a train made out of leaves. It's like just a passenger train, but the train was made entirely out of leaves.
It was an AI image, and he made four hundred dollars from that image. And he was like, I was only making two hundred dollars a month at my other job. Wow. And so I just do this now.
Speaker 2: Wow. And there's the incentive. You can make pretty serious money here, which brings us back to Future Rider US. It's not really possible to know exactly how much money they're making, but if we go back to that estimate of about one hundred bucks per million views, let's compare that to their most successful day. In a roughly twenty-four hour stretch starting on January tenth, they posted seven videos which collectively got about ninety-four million views. Theoretically, that could work out to nine thousand, four hundred dollars in one day. And that's not the only way the account makes money. They also sell a guide so that you yourself can start your own AI slop business, for just nineteen ninety-nine.
Speaker 6: Want to grow your Instagram, TikTok, YouTube fast? Get thousands of followers, create viral reels, and earn five thousand dollars plus every month. With the Viral Reels Guide, it's easier than ever. Step by step instructions, proven strategies, perfect for beginners, no experience needed. Ready to start? Grab the guide today for just nineteen dollars and ninety-nine cents.
Speaker 2: This is a key part of the new AI slop economy.
Speaker 5: So there is this entire network of influencers. Almost all of them are in India, Vietnam, Pakistan, a couple in a few African countries and Central America, but for the most part they are in developing countries, who have YouTube channels, who are like, here is how you can make AI spam, and here's how to put it on Facebook, and here's how to monetize it.
Speaker 2: So as I went further down this Future Rider US rabbit hole, I got so curious that I realized I needed to know what was in this course they were selling. So, for journalism's sake, I bought it. It's a ZIP file with two files.
The first one is this really short PDF which is pretty clearly ChatGPT generated, and the summary of it is, it basically gives you these instructions. First, look online for what's already trending at the moment. Second, type that into Sora.ai to generate a video about that thing, and add some music. And three, post the video online. Then just repeat that multiple times a day. The second file is just a seven minute iPhone screen recording of them generating a prompt and then uploading that online.
Speaker 5: It's almost like a pyramid scheme, where it's like, well, this one person made a lot of money and here's a guide to doing it, so I'm gonna try as well. You just have like tens of thousands of people who are trying this, and the end result is the entire platform gets spammed.
Speaker 2: So who's the audience for this stuff?
Speaker 5: I mean, there's definitely some strategic thinking about, will people in the United States, the United Kingdom or Canada care about this stuff? That's the strategy. And I know that's the strategy because a lot of the videos about how to make this stuff talk about, here's what people in the United States care about. Like, I watched one video that basically was like, Americans love babies and they love pets, and so make AI images about babies and pets. I've seen guides that are like, here's what you need to know about Jesus, because you may be Hindu and you don't know, so like, if you're making AI spam about Jesus, here are words to type in that will get you Jesus. Because a lot of Americans are Christian. You want people in the United States and Canada to look at these, because the way that online advertising works is, if you are in a richer country, the ad rates are higher, page views from developed countries...
Speaker 7: Are worth more.
Speaker 5: Yeah, but I think the audience is an algorithm.
I think that the audience is, like, what works well in the algorithm, because the goal is not to create an amazing image or an amazing video that people are going to resonate with. The goal is to make people linger on any given image or video long enough to send a signal to the algorithm that this is something that a human being spent time looking at, spent time engaging with. You know, if it's something totally absurd, you're going to get people in the comments saying, hey, this isn't real, and then you're going to have fights back and forth, and all of those signals tell the algorithm that this is something that's worth surfacing in someone else's feed. And so I really don't think that the audience for this is real human beings.
Speaker 2: You might have heard that last bit and thought of something called the dead Internet theory. It's, well, you could kind of call it a conspiracy theory, but basically it refers to the idea that the Internet has so many bots and so much algorithmically generated content that human interaction is actually a minority of the traffic online. The majority of traffic is just bots spamming each other back and forth.
Speaker 5: And the way that you would apply this to, say, shrimp Jesus, for example, is like, well, there's a bot that posted this AI image, and then maybe all the people liking it are bots, and maybe all the people commenting on it are bots, and therefore none of it is real.
Speaker 2: But Jason has a slightly different theory.
Speaker 5: I think that's too reductive. I just, I don't think that that's what's happening, and I know that's not what's happening, because there are human beings in the loop. I'm certain of it. Like, there are human beings who are prompting these AI images and are posting them. They may not be monitoring the accounts very closely, but they're making it and they're posting these images. And then there are definitely bots who are liking and commenting on some of these images.
But these images and these photos and these, you know, videos are showing up in people's feeds. So I've called this the zombie Internet, where it's like a mix of human beings and bots, where you have human beings arguing in the comments with bots. You have human beings in the comments arguing with other human beings. You have bots in the comments arguing with other bots. And to me, that's even worse than the dead Internet, where everyone is a bot, because you have all of these real humans who can't tell that they're commenting on an AI generated image, and they're arguing about it, and they're spending tons of time, like tons of, like, you know, human hours, engaging with it.
Speaker 2: Yeah, it's like we humans are moving amongst the dead, and you end up with something that's only kind of half alive and half dead and somewhere in the middle there. So, while humans might not be the target audience for these images and videos, real people are being fed this content, and you'd think that that would piss them off, but we're starting to find out that that's not necessarily the case.
Speaker 5: So there was Hurricane Helene that hit, you know, North Carolina, Georgia, the southeast, and in the aftermath there were a lot of horrible images coming out of it, as with any natural disaster, and there was an AI generated image of a three year old girl crying, and I think she was like sitting on a raft, or, you know, there was like a flood around her. And this image is very clearly AI generated, but it went really viral and it was shared by a few Republican politicians, and they were sort of talking about how this is what FEMA and Joe Biden's poor response to the hurricane has done to people, right? And there was a moment where everyone was like, this is fake, you're stupid. And the response from the politicians was like, I don't care if it's real or not, something like this is happening there.
Speaker 2: Yeah, I mean, I'll even read it here.
This is what Amy Kremer, the national committeewoman for the Republican National Committee, said, and I'm reading this verbatim: "There are people going through much worse than what is shown in this pic. So I'm leaving it because it is emblematic of the trauma and pain people are living through right now." That is something that I've seen a lot when I look at the comment sections of fake AI generated images that are related to something that's actually happening in the real world. You'll see people in the comments who are, I think, in their own way, trying to make the Internet a better place by telling people, hey, this is fake, and trying to educate people. And then some of those responses, though, are, well, I don't care if this is fake. It doesn't matter, because I'm sure something like this is happening. This is just a depiction of it. Yeah, that I find really interesting.
Speaker 5: It's really interesting, and I'll tell you, like, I used to be more optimistic that social media can be fixed, and, like, our information ecosystems can be fixed, and things like that. And a lot of my first articles about this were about people can't tell that they're not real, and therefore it's bad that people can't tell that it's not real. And now it's a mix of people not being able to tell that these things are not real and people not caring that they're not real. And the second one is almost worse, where you know it's fake, but you share it anyway because it captures some sort of vibe. Like, if it verifies their worldview, then it's useful to them in some way. Making it very unclear what is real and what is fake is part of the point of that entire project, where the truth is unknowable.
Speaker 2: What are social media platforms doing about this? And is there anything that we can do about it? That's after the break. So this is where I have to ask about what the platforms are doing, because this sounds like it would make the experience really not fun.
You get on Facebook, you get on Instagram, you get on TikTok, and everything is fake. Platforms can't possibly want this. So you've talked to platforms. What's been the response from places, especially like Facebook?
Speaker 5: Yeah, they don't care. Like, they like it, and I know that they like it because Mark Zuckerberg has talked about it in quarterly earnings reports, Q3 of last year.
Speaker 7: Another part that I haven't talked about quite as much yet is the opportunity for AI to help people create content. And I think we're going to add a whole new category of content, which is AI generated or AI summarized content. And I think that that's going to be just very exciting for Facebook and Instagram and maybe Threads or other kind of feed experiences over time.
Speaker 5: And I've also talked to Facebook comms people and said, like, do you want this type of stuff on your platform? Are you going to delete it? And, you know, they will delete some of the really grotesque things if it violates other parts of their content policies, but they will not delete anything just because it is AI generated or because it's spam. And at the same time, Meta is developing its own artificial intelligence. You can make AI slop, for lack of a better term, using Meta's own tools and then post it to Facebook. And then most recently they said, you know, we're going to hopefully create tools that will allow users to make their own AI generated profiles of fake people. And we imagine a future where, like, a lot of the content on these platforms is generated by AI.
Speaker 2: Jason, why? Like, this seems like a bad... all this stuff seems bad.
Speaker 5: Yeah, I've tried to figure out exactly why this is happening. Like, I've tried to put myself in Mark Zuckerberg's shoes and be like, okay, what is, like, the bullish argument here?
And so, one, it's like they're spending billions and billions of dollars on AI data centers, so they need to push this on people, because they think that artificial intelligence is the future of work, it's the future of the Internet, it's the future of humanity, and so they are desperate to find use cases for this in some way. The other thing that I've been thinking about is that the way that all of Meta's platforms work is, it tries to learn as much about you as possible so that you stay on their platforms as long as possible, so that they can deliver targeted ads to you. And right now they have billions of people posting on their platforms all sorts of different types of content, and then they need to rely on their algorithm to categorize that content in some way and deliver it to people who they think will like it. And what I think is happening here is they want to use artificial intelligence to create hyper specific types of content so that you get on their platforms and you don't stop scrolling. It's like, if you are really into, like, a specific type of sports car and you only want to see, like, amazing videos of that sports car driving around, there might not be human beings creating enough of that content to, like, satisfy you and keep you on that platform long enough. Whereas with artificial intelligence, they can just make millions of different variations of the specific thing that you're into, feed it to you over and over and over again, and then, you know, target you even more closely with ads and keep you on the platform longer. They want to trap people into, like, even more specific algorithmic silos where hyper specific, artificially intelligent content is fed to you endlessly.
Speaker 2: Which brings us back to our guy, or gal, I actually don't know: Future Rider US.
I'd seen all these angry comments on their posts, and I decided I just needed to see what the person behind the account was thinking, so I DMed them. They told me they were Russian, and at first they were mostly bragging about how many views they were getting. But when I pointed out that a lot of people were either angry about their posts or had no idea that it was fake, they started getting defensive. Future Rider US said that they didn't see the problem because they'd added an AI tag to their videos. And, well, they have a point. Instagram does have a feature that allows uploaders to voluntarily tag their posts as AI generated. The trouble is that this label doesn't show up when you're watching the video normally. You only see it if you look at the bottom, and there's a "see more" tag, and you tap that. And even then, space is prioritized for the song title, so sometimes that tag is pushed off the screen, and all it says is "AI info" in small text in the bottom right. And if you tap that, it shows you some more information. AI info. Not AI warning, not AI caution, just AI info. Why would you ever tap that? Meta has a page on their site that makes a big deal about the introduction of this tag, and they primarily show what it looks like in the grid view. The issue is that it's even more imperceptible there. So when you first look at a post in the grid, you see the location tag, then you see the music title, and that scrolls into view for a few moments. Only after that does the text "AI info" appear. By that point you're watching the video; you're not looking at tiny text that rolls in the upper corner of your screen. And also, Meta never notified users that they were rolling this feature out. Why would anyone know that this exists?
And by the way, I did reach out to Meta when I was first reporting on this, and I asked them about their policy on AI on their platform, specifically if there's any obligation for a user to more clearly label a post as AI other than that small badge, or why the desktop doesn't show the badge at all. They never responded. And this is where I have to kind of agree with Future Rider US. They said that if people don't notice the interface's AI tag, it's not the poster's fault, it's Instagram's, and it's the platform's responsibility to make those tags bigger or more noticeable. That doesn't seem to be happening, and since both the AI slop posters and the platforms are making money, there's no real incentive for anyone to stop. But it does get weird when this stuff is happening in a place that you live. You've been reporting on AI slop a lot, and I'm sure that when you saw the Hollywood Sign thing on fire, you knew exactly what that was. Was there anything about the AI generated slop that was coming from the LA fires that was surprising to you?
Speaker 5: I wasn't surprised that it happened. Like, I wasn't surprised that people were making slop about it. I was surprised that it upset me. Like, I was surprised, as someone who lives in Los Angeles, who knows people who lost their homes, who saw the fires, like, you know, the first day as they were growing. I was surprised that I was upset that this was happening and that people were making money off of it. And I think I was also upset because there were some really brave journalists from the LA Times, the New York Times, a lot of local spots, who were going there to take photos, taking videos, risking their lives to share this stuff. And then many of the images and videos that were going really viral were just, like, the Hollywood Sign is on fire.
And 597 00:37:14,360 --> 00:37:16,479 Speaker 5: I was surprised because it was like the first time 598 00:37:16,520 --> 00:37:21,600 Speaker 5: that AI slop had impacted anything that had any personal meaning 599 00:37:21,640 --> 00:37:24,320 Speaker 5: to me, and I found it to be like pretty upsetting. 600 00:37:24,960 --> 00:37:27,640 Speaker 2: It's interesting that you say that about people risking their 601 00:37:27,640 --> 00:37:32,319 Speaker 2: lives, because the creator, when people started calling them out 602 00:37:32,680 --> 00:37:36,920 Speaker 2: a lot on their most viral reel, they posted a 603 00:37:36,920 --> 00:37:40,640 Speaker 2: response, and they say, and I'm gonna read it here: In this video, 604 00:37:40,719 --> 00:37:43,240 Speaker 2: I aimed to shed light on the reality of what's happening. 605 00:37:43,640 --> 00:37:47,400 Speaker 2: The problems are very real. Animals are dying, homes are 606 00:37:47,440 --> 00:37:51,280 Speaker 2: being destroyed, and firefighters are risking their lives to save others. 607 00:37:51,680 --> 00:37:54,799 Speaker 2: They don't have the time to produce visually stunning and 608 00:37:54,840 --> 00:37:58,359 Speaker 2: powerful footage to raise awareness about these issues. That's why 609 00:37:58,360 --> 00:38:00,400 Speaker 2: I took the initiative to create something that could help 610 00:38:00,440 --> 00:38:04,320 Speaker 2: people see and truly think about these tragedies. So basically, 611 00:38:04,600 --> 00:38:06,400 Speaker 2: people are risking their lives, they don't have time to 612 00:38:06,440 --> 00:38:09,120 Speaker 2: make these really well produced things, so I made this 613 00:38:09,320 --> 00:38:13,120 Speaker 2: for you. Which, of course, all breaks down when, 614 00:38:13,239 --> 00:38:15,479 Speaker 2: right after the fires stop being in the news as much, 615 00:38:15,880 --> 00:38:21,120 Speaker 2: they go on to making completely unrelated AI generated content. 616 00:38:21,600 --> 00:38:23,919 Speaker 5: I find it to be offensive, more or less. It's 617 00:38:23,960 --> 00:38:26,920 Speaker 5: just like, I think it's kind of tasteless. I 618 00:38:26,960 --> 00:38:31,919 Speaker 5: don't think that that matters anymore on the Internet. But 619 00:38:32,000 --> 00:38:34,319 Speaker 5: you know, there were plenty of images coming out of 620 00:38:35,800 --> 00:38:41,840 Speaker 5: the LA fires, and we don't need an unlimited, infinite 621 00:38:42,360 --> 00:38:45,799 Speaker 5: visual gallery of everything that is happening. And I think 622 00:38:45,880 --> 00:38:51,240 Speaker 5: that's one of the biggest problems with AI slop more generally, 623 00:38:51,440 --> 00:38:55,239 Speaker 5: is that you can find any news that you want, 624 00:38:55,719 --> 00:39:01,000 Speaker 5: anything that confirms your worldview, because of 625 00:39:01,040 --> 00:39:04,000 Speaker 5: social media. So it's like, if you think that the 626 00:39:04,160 --> 00:39:07,640 Speaker 5: LA fires were an act of God punishing gay people 627 00:39:07,800 --> 00:39:12,319 Speaker 5: in Los Angeles, which I've seen AI videos where that's 628 00:39:12,880 --> 00:39:16,720 Speaker 5: what they're about, like God is striking back against Hollywood's 629 00:39:16,760 --> 00:39:19,560 Speaker 5: gay people, there's a video for you. If you think 630 00:39:19,600 --> 00:39:22,280 Speaker 5: that it was like a space laser, there's a video 631 00:39:22,400 --> 00:39:24,640 Speaker 5: for you.
Like if you think that it was just 632 00:39:24,760 --> 00:39:27,719 Speaker 5: like climate change, there's a video for you. If you 633 00:39:27,800 --> 00:39:30,920 Speaker 5: only care about the animals that lost their homes, like, 634 00:39:30,960 --> 00:39:31,880 Speaker 5: there's a video for you. 635 00:39:32,520 --> 00:39:35,080 Speaker 2: Yeah, it just feels like something we're gonna see more 636 00:39:35,440 --> 00:39:38,400 Speaker 2: and more. You know, every time there's some new advancement 637 00:39:38,480 --> 00:39:41,080 Speaker 2: or some new engine has dropped, some new, you know, 638 00:39:41,120 --> 00:39:45,120 Speaker 2: AI technology is dropped, people look at it, and the 639 00:39:45,680 --> 00:39:47,799 Speaker 2: really excited people will post, oh my gosh, look at 640 00:39:47,800 --> 00:39:52,280 Speaker 2: this, and just think, we're able to accomplish this today, 641 00:39:52,360 --> 00:39:55,439 Speaker 2: but this is the worst it'll ever be. What does 642 00:39:55,560 --> 00:40:00,040 Speaker 2: that mean for the future of AI slop and the 643 00:40:00,040 --> 00:40:03,960 Speaker 2: future of how we're going to experience reality on the internet? 644 00:40:05,160 --> 00:40:08,440 Speaker 5: Yeah. So, two things. One, that's absolutely correct. It's like, 645 00:40:08,760 --> 00:40:12,720 Speaker 5: I've watched this stuff evolve in real time, and AI 646 00:40:12,760 --> 00:40:16,520 Speaker 5: generated videos especially have gotten far more realistic in the 647 00:40:16,600 --> 00:40:20,520 Speaker 5: last month and way way way better than they were 648 00:40:20,520 --> 00:40:24,440 Speaker 5: a year ago. And two, the people who were saying, like, 649 00:40:25,120 --> 00:40:31,160 Speaker 5: we can use artificial intelligence to improve special effects for Hollywood, 650 00:40:31,280 --> 00:40:36,520 Speaker 5: to improve productivity for, you know, writers or whatever like that, 651 00:40:36,600 --> 00:40:38,839 Speaker 5: all the productivity gains that are going to happen in 652 00:40:38,920 --> 00:40:42,400 Speaker 5: sort of like a best case use of artificial intelligence, 653 00:40:43,520 --> 00:40:47,120 Speaker 5: that very well might be true. But alongside of that, 654 00:40:47,960 --> 00:40:50,520 Speaker 5: you're going to have so many more people using these 655 00:40:50,520 --> 00:40:55,279 Speaker 5: tools to spam the Internet, to abuse people, to make 656 00:40:55,320 --> 00:40:59,799 Speaker 5: deep fakes of, you know, celebrities and people that they 657 00:40:59,840 --> 00:41:02,879 Speaker 5: knew in high school and things like that. And when 658 00:41:03,000 --> 00:41:05,840 Speaker 5: you're just like using the Internet as a consumer like that, 659 00:41:05,840 --> 00:41:08,360 Speaker 5: that's a lot of what you're going to experience, because 660 00:41:09,360 --> 00:41:12,319 Speaker 5: there's a lot more people trying to make a few 661 00:41:12,360 --> 00:41:15,520 Speaker 5: bucks on Facebook than there are Hollywood studios that are 662 00:41:15,560 --> 00:41:18,400 Speaker 5: trying to make like a triple A film. 663 00:41:19,880 --> 00:41:22,680 Speaker 2: So one last thing before I let Jason go: I 664 00:41:22,680 --> 00:41:25,439 Speaker 2: asked if he could read me this one paragraph from 665 00:41:25,440 --> 00:41:29,080 Speaker 2: his shrimp Jesus article, and hopefully it'll make sense why I asked.
666 00:41:29,040 --> 00:41:34,040 Speaker 5: There are AI generated pages full of AI deformed women 667 00:41:34,120 --> 00:41:38,480 Speaker 5: breastfeeding tiny cows, celebrities with amputations that they do not 668 00:41:38,600 --> 00:41:41,480 Speaker 5: have in real life. Jesus as a shrimp, Jesus as 669 00:41:41,480 --> 00:41:45,640 Speaker 5: a collection of Fanta bottles, Jesus as a sand sculpture, Jesus 670 00:41:45,719 --> 00:41:49,000 Speaker 5: as a series of ramen noodles, Jesus as a shrimp 671 00:41:49,080 --> 00:41:52,400 Speaker 5: mixed with Sprite bottles and ramen noodles. Jesus made of 672 00:41:52,480 --> 00:41:56,920 Speaker 5: plastic bottles and posing with large breasted AI generated female soldiers. 673 00:41:57,480 --> 00:42:00,800 Speaker 5: Jesus on a plane with AI generated sexy female flight attendants, 674 00:42:01,440 --> 00:42:06,640 Speaker 5: giant gold Jesus being evacuated from a river, golden helicopter 675 00:42:06,800 --> 00:42:11,439 Speaker 5: Jesus, banana Jesus, coffee Jesus, goldfish Jesus, rice Jesus, any 676 00:42:11,520 --> 00:42:14,719 Speaker 5: number of AI generated female soldiers on a page called 677 00:42:14,880 --> 00:42:19,320 Speaker 5: Beautiful Military, a page called Everything Skull, which is exactly 678 00:42:19,320 --> 00:42:24,920 Speaker 5: what it sounds like. Malnourished dogs, indigenous identity pages, beautiful landscapes, 679 00:42:25,080 --> 00:42:31,080 Speaker 5: flower arrangements, weird cakes, et cetera. I should write like 680 00:42:31,120 --> 00:42:33,680 Speaker 5: that more often, where it's just like, here's a bunch 681 00:42:33,680 --> 00:42:34,439 Speaker 5: of shit I saw. 682 00:42:36,600 --> 00:42:44,000 Speaker 2: I mean, just the breadth and the absolute unhinged quality 683 00:42:44,080 --> 00:42:49,319 Speaker 2: of it. I think it gives... it's a wild snapshot 684 00:42:49,400 --> 00:42:52,120 Speaker 2: of where we were. What is this, last year? That 685 00:42:52,560 --> 00:42:56,000 Speaker 2: was the state of AI back then. Slop was 686 00:42:56,120 --> 00:43:00,920 Speaker 2: kind of funny, and now, not that long after this, 687 00:43:01,080 --> 00:43:05,840 Speaker 2: we're in the zone where AI slop is being used 688 00:43:06,000 --> 00:43:10,360 Speaker 2: to make people afraid. It's being used to make people 689 00:43:10,480 --> 00:43:13,360 Speaker 2: think that things are happening that are obviously not happening. 690 00:43:13,400 --> 00:43:16,359 Speaker 2: And this is affecting people who otherwise think that they're 691 00:43:16,440 --> 00:43:20,440 Speaker 2: very smart and informed and engaged and knowledgeable about the world. 692 00:43:20,640 --> 00:43:25,080 Speaker 2: So we've made a jump, a really quick jump, from hahaha, 693 00:43:25,120 --> 00:43:27,560 Speaker 2: isn't this funny, all of us smart nerds on the 694 00:43:27,560 --> 00:43:31,400 Speaker 2: internet get to laugh at our parents' generation, to wait 695 00:43:31,400 --> 00:43:35,520 Speaker 2: a second, this is affecting stuff that's affecting our perception 696 00:43:35,719 --> 00:43:39,400 Speaker 2: of a literal natural disaster that happened where you and 697 00:43:39,440 --> 00:43:39,799 Speaker 2: I live. 698 00:43:43,960 --> 00:43:47,800 Speaker 5: Yeah, I mean almost everything is trying to scare people. 699 00:43:48,400 --> 00:43:51,400 Speaker 5: It's trying to talk about the news, it's trying to 700 00:43:51,480 --> 00:43:56,319 Speaker 5: confuse people.
And yeah, there's definitely a part of me 701 00:43:56,360 --> 00:43:59,239 Speaker 5: where I'm like, can we go back to Jesus with 702 00:43:59,280 --> 00:44:02,720 Speaker 5: the hot flight attendants? Like, what are we doing here? 703 00:44:03,920 --> 00:44:04,520 Speaker 2: And here we are. 704 00:44:05,320 --> 00:44:08,360 Speaker 5: Yeah. And like, I feel like I can still generally 705 00:44:08,400 --> 00:44:11,560 Speaker 5: tell what's AI generated and what's not, but I'm using 706 00:44:11,600 --> 00:44:14,279 Speaker 5: like way more of my brain power, like, to try 707 00:44:14,320 --> 00:44:16,920 Speaker 5: to decipher what's real and what's not, because it is 708 00:44:16,960 --> 00:44:17,840 Speaker 5: getting way better. 709 00:44:21,960 --> 00:44:24,440 Speaker 2: Thank you so much for listening to kill Switch, and 710 00:44:24,719 --> 00:44:26,600 Speaker 2: let us know what you think. If there's something you 711 00:44:26,680 --> 00:44:29,040 Speaker 2: want us to cover, let us know that too. You 712 00:44:29,080 --> 00:44:32,400 Speaker 2: can hit us at kill Switch at Kaleidoscope dot NYC, 713 00:44:32,960 --> 00:44:35,080 Speaker 2: or you can find me at dexdigi on the 714 00:44:35,160 --> 00:44:39,880 Speaker 2: Gram or Blue Sky if that's more your flavor. And 715 00:44:40,040 --> 00:44:42,160 Speaker 2: also, if you can, leave us a review. It helps 716 00:44:42,200 --> 00:44:44,560 Speaker 2: other people find the show, which helps us keep doing 717 00:44:44,600 --> 00:44:48,560 Speaker 2: our thing. This show is hosted by me, Dexter Thomas. 718 00:44:49,960 --> 00:44:54,160 Speaker 2: It's produced by Seno Ozaki, Darluk Potts, and Kate Osborne. 719 00:44:54,440 --> 00:44:57,120 Speaker 2: The theme song is by Kyle Murdoch, who also mixed 720 00:44:57,160 --> 00:45:01,160 Speaker 2: the show. From Kaleidoscope, our executive producers are Oz 721 00:45:01,239 --> 00:45:05,360 Speaker 2: Woloshyn, Mangesh Hattikudur, and Kate Osborne. From iHeart, 722 00:45:05,480 --> 00:45:09,719 Speaker 2: our executive producers are Katrina Norvell and Nikki Ettore. 723 00:45:09,760 --> 00:45:12,840 Speaker 2: See you all in the next one.