1 00:00:09,600 --> 00:00:12,680 Speaker 1: I learned about the story from a reporter named Elizabeth 2 00:00:12,680 --> 00:00:16,040 Speaker 1: Hernandez at the Denver Post. She wrote this piece in 3 00:00:16,120 --> 00:00:23,079 Speaker 1: August about a murder that supposedly happened in Littleton, Colorado. 4 00:00:23,000 --> 00:00:26,520 Speaker 2: On July four, twenty fourteen, just before noon, the Littleton 5 00:00:26,560 --> 00:00:30,400 Speaker 2: Police Department made a decisive move. Harrison, twenty one, a 6 00:00:30,480 --> 00:00:33,600 Speaker 2: college student and part time bartender, had been living a 7 00:00:33,640 --> 00:00:37,080 Speaker 2: life of turmoil and confusion. Harrison admitted to the police 8 00:00:37,080 --> 00:00:39,640 Speaker 2: that he had been in a secret sexual relationship with 9 00:00:39,720 --> 00:00:41,520 Speaker 2: his stepfather for the past two years. 10 00:00:42,640 --> 00:00:45,360 Speaker 1: There was this YouTube channel that made this video and 11 00:00:45,640 --> 00:00:48,400 Speaker 1: it was a pretty wild case. 12 00:00:48,840 --> 00:00:51,960 Speaker 2: Welcome to True Crime Case Files. Today we uncover the 13 00:00:52,000 --> 00:00:56,000 Speaker 2: tragic and complex story of Richard Engelbert, a successful real 14 00:00:56,120 --> 00:00:59,080 Speaker 2: estate agent whose hidden life and secrets led to his 15 00:00:59,160 --> 00:00:59,960 Speaker 2: brutal murder. 16 00:01:00,680 --> 00:01:04,840 Speaker 1: And Elizabeth Hernandez, her editor, told her about these emails 17 00:01:04,840 --> 00:01:08,120 Speaker 1: that the newsroom had been receiving, all linking to this video, 18 00:01:08,400 --> 00:01:12,520 Speaker 1: all talking about this weird murder. And her editor said, 19 00:01:12,640 --> 00:01:15,319 Speaker 1: this seems like it's a story made for you.
And 20 00:01:15,600 --> 00:01:17,360 Speaker 1: she read these emails and there were some that just 21 00:01:17,400 --> 00:01:19,479 Speaker 1: said you should check out this video. This is weird. 22 00:01:19,720 --> 00:01:22,280 Speaker 1: But there were others that were mad at the Denver 23 00:01:22,319 --> 00:01:25,479 Speaker 1: Post that were saying, why aren't you covering this? How 24 00:01:25,520 --> 00:01:28,679 Speaker 1: could you miss such a big, horrific case? 25 00:01:30,680 --> 00:01:33,320 Speaker 3: It makes sense that people would be upset. This video 26 00:01:33,400 --> 00:01:36,760 Speaker 3: had been seen almost two million times, and frankly, it 27 00:01:36,800 --> 00:01:40,480 Speaker 3: was embarrassing that the story of Richard Engelbert, this grisly 28 00:01:40,560 --> 00:01:43,880 Speaker 3: murder, hadn't been covered by the local paper. How is 29 00:01:43,880 --> 00:01:46,640 Speaker 3: it that an independent YouTube channel was doing a better 30 00:01:46,720 --> 00:01:51,840 Speaker 3: job of doing an investigation than professional reporters? Well, there 31 00:01:51,920 --> 00:01:52,400 Speaker 3: was a reason. 32 00:01:53,080 --> 00:01:56,960 Speaker 1: It was pretty quickly clear that the story was made up, 33 00:01:57,040 --> 00:02:00,760 Speaker 1: that it was invented using AI, and that no such murder 34 00:02:00,960 --> 00:02:01,760 Speaker 1: had ever happened. 35 00:02:02,120 --> 00:02:04,800 Speaker 3: I talked to Henry Larson, who wrote about this not 36 00:02:04,880 --> 00:02:08,080 Speaker 3: so true true crime phenomenon for 404 Media. 37 00:02:08,280 --> 00:02:11,400 Speaker 1: My name is Henry Larson. I'm a reporter and I 38 00:02:11,520 --> 00:02:15,600 Speaker 1: cover typically criminal justice. This is not that at all. 39 00:02:15,840 --> 00:02:18,880 Speaker 1: It's a much weirder story about fake crimes.
40 00:02:19,760 --> 00:02:22,520 Speaker 3: The channel was called True Crime Case Files and it 41 00:02:22,520 --> 00:02:26,240 Speaker 3: had about one hundred thousand subscribers and tons of videos. 42 00:02:26,840 --> 00:02:29,840 Speaker 1: The channel itself seemed to have been making dozens and 43 00:02:29,880 --> 00:02:34,880 Speaker 1: dozens of similar fake crimes, pumping them out, and it 44 00:02:34,919 --> 00:02:38,680 Speaker 1: seemed like people were buying into these fake crimes thinking 45 00:02:38,720 --> 00:02:39,240 Speaker 1: they were real. 46 00:02:45,680 --> 00:02:55,440 Speaker 3: From Kaleidoscope and iHeart Podcasts, this is Kill Switch. 47 00:02:56,040 --> 00:02:58,720 Speaker 3: I'm Dexter Thomas. 48 00:03:22,320 --> 00:03:22,840 Speaker 4: Goodbye. 49 00:03:26,480 --> 00:03:30,400 Speaker 3: Henry started looking at the YouTube channel and he noticed something. 50 00:03:31,240 --> 00:03:36,640 Speaker 1: A lot of videos on the channel were pretty perverse, sexualized. 51 00:03:37,360 --> 00:03:40,160 Speaker 3: Then again, so are a lot of true crime stories. 52 00:03:40,600 --> 00:03:44,480 Speaker 3: These ones just always had a particular extra element of drama. 53 00:03:44,560 --> 00:03:46,520 Speaker 1: That would usually have something to do with someone in 54 00:03:46,560 --> 00:03:51,080 Speaker 1: a position of power murdering or taking advantage of someone 55 00:03:51,400 --> 00:03:53,960 Speaker 1: who had less power than them, like a sheriff and 56 00:03:54,000 --> 00:03:56,760 Speaker 1: the secretary, or a teacher and a student, a parent 57 00:03:56,800 --> 00:03:57,880 Speaker 1: and their stepchild. 58 00:03:58,480 --> 00:04:01,440 Speaker 3: There were dozens of stories with that same setup, and 59 00:04:01,680 --> 00:04:03,520 Speaker 3: just as an example, let me read you some of 60 00:04:03,560 --> 00:04:07,600 Speaker 3: these titles here.
So one of them goes judge beats 61 00:04:07,680 --> 00:04:11,680 Speaker 3: college student to death after secret gay affair ends in scandal. 62 00:04:12,400 --> 00:04:16,640 Speaker 3: Then couple's wife swap experiment ends in obsession and brutal murders, 63 00:04:17,560 --> 00:04:21,359 Speaker 3: or cheating husband murders loyal wife and claims he was 64 00:04:21,400 --> 00:04:24,640 Speaker 3: on an acid trip. And then of course there's the 65 00:04:24,640 --> 00:04:27,040 Speaker 3: one the reporter at the Denver Post got the emails 66 00:04:27,040 --> 00:04:31,960 Speaker 3: about: husband's secret gay love affair with stepson ends 67 00:04:32,120 --> 00:04:35,760 Speaker 3: in grisly murder. And whatever you're imagining the story is like, 68 00:04:36,720 --> 00:04:37,839 Speaker 3: you're probably right. 69 00:04:39,560 --> 00:04:43,920 Speaker 1: I will say that they got progressively more scandalous, click 70 00:04:44,000 --> 00:04:48,200 Speaker 1: baity, sexual over time. When this channel first started making 71 00:04:48,240 --> 00:04:51,160 Speaker 1: these videos, they were a little tamer. 72 00:04:51,720 --> 00:04:54,200 Speaker 3: The video that went viral and alerted the Denver 73 00:04:54,279 --> 00:04:57,080 Speaker 3: Post to this fake channel was definitely from the era 74 00:04:57,240 --> 00:05:00,559 Speaker 3: after the channel had gotten much more scandalous. It told 75 00:05:00,640 --> 00:05:04,560 Speaker 3: the story of Richard Engelbert, a real estate agent who 76 00:05:04,600 --> 00:05:07,520 Speaker 3: lived a seemingly perfect life but was having a secret 77 00:05:07,600 --> 00:05:09,400 Speaker 3: sexual affair with his stepson. 78 00:05:10,080 --> 00:05:13,320 Speaker 2: Richard kept this relationship hidden from everyone, fearing it would 79 00:05:13,360 --> 00:05:16,919 Speaker 2: ruin his reputation and career.
In addition to this, he 80 00:05:17,000 --> 00:05:19,440 Speaker 2: continued to meet other men for sex through his work, 81 00:05:19,839 --> 00:05:22,480 Speaker 2: using the homes he showed as secret meeting places. 82 00:05:22,920 --> 00:05:26,080 Speaker 3: The whole crime took place in Littleton, Colorado, and the 83 00:05:26,160 --> 00:05:30,400 Speaker 3: video showed pictures of these generic looking rows of suburban homes, 84 00:05:30,920 --> 00:05:33,719 Speaker 3: and for anyone who actually lived there, it was obvious 85 00:05:33,800 --> 00:05:36,919 Speaker 3: that this was not Littleton, but for people who don't 86 00:05:36,960 --> 00:05:40,760 Speaker 3: live there, it might have been convincing. All these videos 87 00:05:40,839 --> 00:05:43,040 Speaker 3: look kind of like a lower budget version of something 88 00:05:43,080 --> 00:05:45,919 Speaker 3: you might see on the Hallmark channel. They're usually around 89 00:05:45,920 --> 00:05:48,839 Speaker 3: twenty five minutes long. They're narrated by a host with 90 00:05:48,960 --> 00:05:53,600 Speaker 3: this authoritative radio voice, and there's photos of suburban living rooms, 91 00:05:53,839 --> 00:05:57,359 Speaker 3: and then there's the setup. It introduces the victim or 92 00:05:57,400 --> 00:06:01,560 Speaker 3: the witnesses or the perpetrator with a plausible looking photograph. 93 00:06:02,000 --> 00:06:04,680 Speaker 3: You'll see a smiling businessman with a blue suit and 94 00:06:04,760 --> 00:06:07,840 Speaker 3: bleached white teeth, or an office worker with blonde hair 95 00:06:07,920 --> 00:06:10,839 Speaker 3: and dangling earrings, and then it'll cut to a photo 96 00:06:10,839 --> 00:06:14,520 Speaker 3: of police cars parked outside the crime scene. And the stories 97 00:06:14,600 --> 00:06:16,680 Speaker 3: are on par with a lot of true crime stuff 98 00:06:16,760 --> 00:06:20,120 Speaker 3: you might see on TV.
Was it obvious to you 99 00:06:20,320 --> 00:06:22,280 Speaker 3: that this stuff was AI generated? 100 00:06:23,080 --> 00:06:26,000 Speaker 1: It was pretty clear to me that the narrator was 101 00:06:26,040 --> 00:06:29,680 Speaker 1: an AI generated voice, and that the photos were very weird. 102 00:06:30,279 --> 00:06:34,800 Speaker 1: Everyone had impeccable veneers in the headshots that were generated 103 00:06:35,120 --> 00:06:39,159 Speaker 1: and looked very glossy and kind of plasticky, and so 104 00:06:39,240 --> 00:06:43,120 Speaker 1: that made me initially think, oh, okay, this is pretty obvious, right, 105 00:06:43,200 --> 00:06:45,640 Speaker 1: this has been AI generated. But then when you go 106 00:06:45,680 --> 00:06:48,560 Speaker 1: in the comments, there might be a couple people who 107 00:06:48,560 --> 00:06:50,960 Speaker 1: are raising a red flag saying I don't know about this. 108 00:06:51,120 --> 00:06:54,039 Speaker 1: I can't find any information about this online, but they 109 00:06:54,080 --> 00:06:58,839 Speaker 1: were drowned out by people finding specific details from the 110 00:06:58,920 --> 00:07:02,640 Speaker 1: videos and saying, I can't believe that the police missed this, 111 00:07:03,040 --> 00:07:06,760 Speaker 1: or I think that there is this theory that was overlooked, 112 00:07:06,839 --> 00:07:09,200 Speaker 1: or here's this other comment I have about like a 113 00:07:09,240 --> 00:07:12,640 Speaker 1: material fact that was presented in the video, and that 114 00:07:12,920 --> 00:07:15,680 Speaker 1: at least to me, showed that people were buying into this, 115 00:07:16,000 --> 00:07:18,560 Speaker 1: and combined with the people who are mad at the 116 00:07:18,600 --> 00:07:22,520 Speaker 1: Denver Post for not reporting that story, makes me think 117 00:07:22,520 --> 00:07:25,960 Speaker 1: that this was compelling to at least a fair amount 118 00:07:26,120 --> 00:07:31,280 Speaker 1: of this channel's viewers.
Generally, people were interacting with each 119 00:07:31,280 --> 00:07:34,240 Speaker 1: other talking about the details of this fake case as 120 00:07:34,240 --> 00:07:34,840 Speaker 1: if it was real. 121 00:07:35,720 --> 00:07:39,800 Speaker 3: Okay, just to recap. The images look AI generated. The 122 00:07:39,920 --> 00:07:44,640 Speaker 3: voice is also AI generated. Okay, it's fake. And, huh, 123 00:07:44,960 --> 00:07:47,600 Speaker 3: people are falling for it. Not surprising. This is the 124 00:07:47,640 --> 00:07:51,080 Speaker 3: Internet and that is where most of us would stop. 125 00:07:51,280 --> 00:07:53,720 Speaker 3: We'd close the window and go back to our day. 126 00:07:54,440 --> 00:07:57,600 Speaker 3: But Henry didn't do that. He kept digging. 127 00:07:57,880 --> 00:08:02,480 Speaker 1: I got really intrigued about the kind of person that 128 00:08:02,560 --> 00:08:06,360 Speaker 1: would make this video. So I found the contact email 129 00:08:06,400 --> 00:08:09,160 Speaker 1: associated with the YouTube channel and I reached out. 130 00:08:09,200 --> 00:08:12,280 Speaker 3: So hold on. This person. They had just left their 131 00:08:12,320 --> 00:08:14,720 Speaker 3: contact email just on their YouTube page. 132 00:08:14,880 --> 00:08:16,640 Speaker 1: Yeah, they had a contact email. 133 00:08:16,920 --> 00:08:20,720 Speaker 3: Not every YouTuber does that. That's actually kind of unusual. 134 00:08:20,800 --> 00:08:22,960 Speaker 3: Why do you think they did that? I don't know. 135 00:08:23,200 --> 00:08:25,920 Speaker 1: I think, well, I guess I have a theory. I 136 00:08:25,960 --> 00:08:28,680 Speaker 1: think a lot of the reason why these videos were 137 00:08:28,720 --> 00:08:31,680 Speaker 1: being made by this person was because they wanted to 138 00:08:31,760 --> 00:08:33,959 Speaker 1: be a filmmaker and get some attention.
Now, just 140 00:08:38,679 --> 00:08:40,440 Speaker 3: a note in case you want to go and read 141 00:08:40,480 --> 00:08:43,560 Speaker 3: about this. The guy behind the channel, his name is 142 00:08:43,600 --> 00:08:45,080 Speaker 3: out there now if you want to look for it. 143 00:08:45,400 --> 00:08:47,760 Speaker 3: But back when Henry wrote the article and when we 144 00:08:47,800 --> 00:08:50,840 Speaker 3: did the interview, he was referring to him anonymously with 145 00:08:50,920 --> 00:08:53,240 Speaker 3: the name Paul. So that's the name that you'll hear 146 00:08:53,240 --> 00:08:57,160 Speaker 3: in this episode. So, yeah, what's his deal? Why did 147 00:08:57,240 --> 00:08:58,720 Speaker 3: he start all this up? What did he tell you? 148 00:08:59,600 --> 00:09:04,160 Speaker 1: He graduated from college right before the pandemic, and when 149 00:09:04,200 --> 00:09:08,680 Speaker 1: the pandemic hit, he was living with his parents. And together, 150 00:09:09,520 --> 00:09:11,680 Speaker 1: you know, a lot of families did all sorts of 151 00:09:11,679 --> 00:09:14,760 Speaker 1: traditions around this time. Paul and his family decided they 152 00:09:14,760 --> 00:09:18,400 Speaker 1: were going to watch Dateline together, okay, and they watched 153 00:09:18,440 --> 00:09:21,760 Speaker 1: a lot of Dateline. He told me he really didn't 154 00:09:22,200 --> 00:09:24,960 Speaker 1: like the show all that much, but he taught himself 155 00:09:24,960 --> 00:09:28,000 Speaker 1: the formula. He taught himself the process of the procedural 156 00:09:28,320 --> 00:09:32,840 Speaker 1: true crime genre. Here's the characters, here's the grisly murder, 157 00:09:33,000 --> 00:09:36,679 Speaker 1: and this lengthy investigation presenting each of the suspects in turn, 158 00:09:37,320 --> 00:09:44,000 Speaker 1: and then eventually a trial and some resolution for the victims.
159 00:09:44,480 --> 00:09:47,320 Speaker 1: This is also around the time that ChatGPT is 160 00:09:47,360 --> 00:09:50,840 Speaker 1: really rolling out publicly, and he watches, there's like a 161 00:09:50,960 --> 00:09:55,720 Speaker 1: Twitch live stream of AI generated Seinfeld episodes. 162 00:09:57,920 --> 00:10:01,839 Speaker 3: Hey, Yvonne, did you hear about that new restaurant around 163 00:10:01,880 --> 00:10:07,600 Speaker 3: the corner? I remember that. Yeah, they're supposed to have 164 00:10:07,640 --> 00:10:11,160 Speaker 3: the best food in town. Yeah, I forgot all about that. 165 00:10:11,679 --> 00:10:13,880 Speaker 4: I heard they just opened up, and I'm dying to 166 00:10:13,920 --> 00:10:18,240 Speaker 4: try it, but it looks so expensive. Maybe we can 167 00:10:18,280 --> 00:10:20,439 Speaker 4: make a deal with the owner, you know, trade them 168 00:10:20,480 --> 00:10:22,120 Speaker 4: some of our jokes for a free meal. 169 00:10:24,840 --> 00:10:29,160 Speaker 1: He loved it. He got really interested. He was like, oh, 170 00:10:29,200 --> 00:10:32,360 Speaker 1: this is the thing. This is weird and new and 171 00:10:32,520 --> 00:10:34,480 Speaker 1: probably cost nothing to make. 172 00:10:35,679 --> 00:10:39,000 Speaker 3: So Paul decides to try his hand at AI generated content, 173 00:10:39,600 --> 00:10:42,360 Speaker 3: but before he lands on true crime, he tries a 174 00:10:42,400 --> 00:10:43,120 Speaker 3: different genre. 175 00:10:43,160 --> 00:10:43,480 Speaker 5: First. 176 00:10:43,800 --> 00:10:47,000 Speaker 1: He comes up with the bones of a plot for 177 00:10:47,040 --> 00:10:51,800 Speaker 1: a four to five minute Hallmark style rom com. Crucially, 178 00:10:51,840 --> 00:10:54,960 Speaker 1: he labels them as AI generated. He calls this channel 179 00:10:55,040 --> 00:10:58,280 Speaker 1: AI Film Studio. He thinks they're very good. We disagree 180 00:10:58,320 --> 00:10:59,880 Speaker 1: on that.
I didn't think they were very good when 181 00:10:59,880 --> 00:11:01,440 Speaker 1: I watched them. 182 00:11:02,080 --> 00:11:02,920 Speaker 3: Did you tell him that? 183 00:11:03,240 --> 00:11:04,800 Speaker 1: I asked him if he thought they were good, and 184 00:11:04,880 --> 00:11:08,760 Speaker 1: he said yes, and I said okay. And they bomb. 185 00:11:09,040 --> 00:11:12,480 Speaker 1: The videos do so terribly that none of them cross over 186 00:11:12,520 --> 00:11:15,880 Speaker 1: one hundred views, and so for some that might sort 187 00:11:15,920 --> 00:11:19,200 Speaker 1: of have been part of a lesson that this technology isn't ready. 188 00:11:19,280 --> 00:11:22,520 Speaker 1: There's something weird about this format. You need some more 189 00:11:22,559 --> 00:11:25,880 Speaker 1: production value, even if you want to use AI in media. 190 00:11:26,559 --> 00:11:29,000 Speaker 1: The lesson he learns is, don't tell people it's made 191 00:11:29,000 --> 00:11:30,640 Speaker 1: from AI. Don't tell people, is the thing. 192 00:11:31,480 --> 00:11:34,400 Speaker 3: The other decision he makes is to focus on true crime, 193 00:11:35,000 --> 00:11:37,880 Speaker 3: and in January of twenty twenty four, he starts the 194 00:11:37,920 --> 00:11:41,920 Speaker 3: True Crime Case Files channel and he tests this idea 195 00:11:42,000 --> 00:11:44,160 Speaker 3: he had that maybe one of the things that was 196 00:11:44,160 --> 00:11:47,080 Speaker 3: holding him back was that he was telling people that 197 00:11:47,160 --> 00:11:50,360 Speaker 3: these videos were AI generated. So he stops doing that. 198 00:11:51,320 --> 00:11:54,760 Speaker 3: I mean, it turns out that maybe he wasn't wrong, 199 00:11:55,000 --> 00:11:56,800 Speaker 3: because his videos started to take off after that. 200 00:11:57,480 --> 00:12:01,040 Speaker 1: Yeah, in all honesty, he was a success, at least for 201 00:12:01,280 --> 00:12:04,480 Speaker 1: a series of months. His channel was working.
It was 202 00:12:04,520 --> 00:12:08,160 Speaker 1: working so well that other people noticed and started copying 203 00:12:08,400 --> 00:12:12,040 Speaker 1: his style, his formats, even the exact titles of some 204 00:12:12,120 --> 00:12:12,840 Speaker 1: of his videos. 205 00:12:13,160 --> 00:12:15,560 Speaker 3: And now you can find dozens of channels that are 206 00:12:15,640 --> 00:12:19,880 Speaker 3: all posting similar fake true crime content. There's variations in 207 00:12:19,920 --> 00:12:22,800 Speaker 3: different niches here and there, but it's all basically the 208 00:12:22,840 --> 00:12:27,880 Speaker 3: same format, AI generated images accompanied by AI generated voices 209 00:12:28,400 --> 00:12:31,640 Speaker 3: reading stories that are also probably AI generated. 210 00:12:32,440 --> 00:12:36,160 Speaker 1: He saw himself at the forefront of this gold rush, 211 00:12:36,720 --> 00:12:40,679 Speaker 1: this new medium of entertainment, and here he is experimenting, 212 00:12:40,760 --> 00:12:43,560 Speaker 1: trying something, and so I think part of the allure 213 00:12:43,600 --> 00:12:47,480 Speaker 1: of making AI films for him was the fact that 214 00:12:47,480 --> 00:12:49,200 Speaker 1: it was AI, was the fact that this was a 215 00:12:49,280 --> 00:12:52,360 Speaker 1: new technology. He told me he does consider himself to 216 00:12:52,400 --> 00:12:57,559 Speaker 1: be a filmmaker, one without a studio or expensive production costs. 217 00:12:56,920 --> 00:12:58,920 Speaker 3: Or in this case, I guess a camera. 218 00:12:59,040 --> 00:13:01,280 Speaker 1: Or a camera, yeah, or a microphone. 219 00:13:02,640 --> 00:13:05,199 Speaker 3: Okay, So how much of a gold rush are we talking? 220 00:13:05,280 --> 00:13:05,439 Speaker 5: Though? 221 00:13:06,320 --> 00:13:09,439 Speaker 3: It's a little hard to say. 
A few times Henry 222 00:13:09,480 --> 00:13:12,400 Speaker 3: did try asking him how much he was making, but 223 00:13:12,600 --> 00:13:13,840 Speaker 3: he couldn't get a straight answer. 224 00:13:14,400 --> 00:13:16,760 Speaker 1: He never told me exactly how much money he made. 225 00:13:16,800 --> 00:13:18,640 Speaker 1: I did learn that this was the only thing he 226 00:13:18,720 --> 00:13:22,040 Speaker 1: was working on, and as far as I know, it 227 00:13:22,160 --> 00:13:24,000 Speaker 1: was his only source of income. 228 00:13:25,080 --> 00:13:27,760 Speaker 3: All right, let's just stick with the facts for a second. 229 00:13:28,480 --> 00:13:31,880 Speaker 3: This is all fake and nobody's denying that the stories 230 00:13:31,920 --> 00:13:37,280 Speaker 3: are made up. The voice isn't real, the images aren't real. Yes, people, 231 00:13:37,720 --> 00:13:41,160 Speaker 3: a lot of people actually are being fooled. But does 232 00:13:41,160 --> 00:13:45,360 Speaker 3: fake mean bad? Is there anything wrong with what he's 233 00:13:45,360 --> 00:14:00,920 Speaker 3: doing here? We'll get into that after the break. So 234 00:14:01,559 --> 00:14:05,320 Speaker 3: true crime. Listen, I don't know how you feel about 235 00:14:05,320 --> 00:14:08,880 Speaker 3: true crime. I've got some feelings. I think a lot 236 00:14:08,920 --> 00:14:10,960 Speaker 3: of people have some feelings about true crime. But let's 237 00:14:10,960 --> 00:14:15,240 Speaker 3: be real, it is an extremely popular genre of, I'm 238 00:14:15,240 --> 00:14:18,760 Speaker 3: gonna say, content. I'm gonna be as neutral as I 239 00:14:18,800 --> 00:14:24,120 Speaker 3: can here. Not a fan personally. But why did he 240 00:14:24,240 --> 00:14:27,040 Speaker 3: pick true crime, do you think? 241 00:14:26,920 --> 00:14:30,320 Speaker 1: He picked true crime because he knew the format.
But Paul 242 00:14:30,480 --> 00:14:33,280 Speaker 1: has a lot of criticisms of true crime, and this 243 00:14:33,400 --> 00:14:36,600 Speaker 1: was actually one of the reasons why he justified his work. 244 00:14:36,800 --> 00:14:40,440 Speaker 1: He said, what I'm doing making these AI videos is 245 00:14:40,480 --> 00:14:43,480 Speaker 1: actually better than real true crime. 246 00:14:43,720 --> 00:14:44,280 Speaker 3: It's better. 247 00:14:44,520 --> 00:14:45,080 Speaker 2: It's better. 248 00:14:45,200 --> 00:14:49,600 Speaker 1: It's better because there's no real victims involved. He got 249 00:14:49,640 --> 00:14:53,240 Speaker 1: to make his videos and his money and no one suffered. 250 00:14:53,920 --> 00:14:54,960 Speaker 3: How did you feel about that? 251 00:14:55,480 --> 00:14:57,480 Speaker 1: I thought he was wrong. I still think he's wrong, 252 00:14:57,520 --> 00:15:00,720 Speaker 1: and I told him that. I mean, there's the reporter line, 253 00:15:00,760 --> 00:15:05,160 Speaker 1: which is just injecting fake information into the world is bad. 254 00:15:05,640 --> 00:15:09,800 Speaker 1: But there's also the component that true crime as a 255 00:15:09,840 --> 00:15:13,680 Speaker 1: medium has plenty of flaws that are not just about 256 00:15:13,720 --> 00:15:16,479 Speaker 1: the specific victim of a crime, but also our societal 257 00:15:16,560 --> 00:15:20,080 Speaker 1: understandings of criminal justice in general. We turn on the 258 00:15:20,280 --> 00:15:24,400 Speaker 1: evening news on our local TV broadcaster and it leads 259 00:15:24,400 --> 00:15:26,880 Speaker 1: with a murder in a neighborhood near us, and we 260 00:15:26,920 --> 00:15:30,240 Speaker 1: think that crime is going up, And we listen to 261 00:15:30,280 --> 00:15:34,040 Speaker 1: a true crime podcast about a serial killer, and we're 262 00:15:34,160 --> 00:15:38,600 Speaker 1: a little more nervous around our neighbors. 
And there's a 263 00:15:38,640 --> 00:15:41,880 Speaker 1: real societal impact that a lot of researchers have looked 264 00:15:41,880 --> 00:15:46,560 Speaker 1: into and analyzed about crime media impacting our perceptions of 265 00:15:46,680 --> 00:15:47,600 Speaker 1: actual crime. 266 00:15:47,880 --> 00:15:48,040 Speaker 5: Right. 267 00:15:48,280 --> 00:15:51,320 Speaker 1: So, I think his work in many ways was committing 268 00:15:51,440 --> 00:15:54,320 Speaker 1: some of the same sins as the true crime genre 269 00:15:54,400 --> 00:15:54,840 Speaker 1: in general. 270 00:15:55,400 --> 00:15:55,680 Speaker 2: Right. 271 00:15:55,880 --> 00:15:59,800 Speaker 3: What you're talking about here is research, essentially, that shows 272 00:15:59,800 --> 00:16:03,000 Speaker 3: that this stuff also makes us feel like your neighborhood 273 00:16:03,040 --> 00:16:05,960 Speaker 3: is not safe, which is... probably your neighborhood's pretty safe. 274 00:16:06,480 --> 00:16:09,640 Speaker 3: You live in the suburbs, you're fine, it's gonna be okay. 275 00:16:09,760 --> 00:16:12,600 Speaker 3: But we watch a lot of this stuff, and if 276 00:16:12,640 --> 00:16:15,440 Speaker 3: it's maybe fun to watch, you know, when you just 277 00:16:15,520 --> 00:16:16,800 Speaker 3: want to turn your brain off at the end of 278 00:16:16,840 --> 00:16:18,160 Speaker 3: the day, which I think is what a lot of 279 00:16:18,160 --> 00:16:20,640 Speaker 3: people do for true crime. Look, I get it, but 280 00:16:21,320 --> 00:16:23,120 Speaker 3: in the back of your mind, it also makes you 281 00:16:23,200 --> 00:16:27,200 Speaker 3: think that the world around you is more dangerous than 282 00:16:27,320 --> 00:16:28,920 Speaker 3: it actually is, and what does that do to you? 283 00:16:30,360 --> 00:16:33,880 Speaker 3: But Paul, the AI true crime creator, he had a 284 00:16:33,880 --> 00:16:35,680 Speaker 3: different point of view on what he was doing.
285 00:16:36,000 --> 00:16:37,760 Speaker 1: He said, what he was making is a form of 286 00:16:38,320 --> 00:16:42,840 Speaker 1: abstract art. He really liked his structural touches that he 287 00:16:42,840 --> 00:16:46,440 Speaker 1: would introduce into these videos, and at several points he 288 00:16:46,480 --> 00:16:49,760 Speaker 1: basically said, you know, I make these stories so ludicrous, 289 00:16:50,120 --> 00:16:54,560 Speaker 1: so insane that people should just assume that they're fake, 290 00:16:54,720 --> 00:16:58,640 Speaker 1: that they didn't really happen, and if they don't get it, 291 00:16:59,400 --> 00:16:59,840 Speaker 1: that's on them. 292 00:17:00,680 --> 00:17:03,680 Speaker 3: Really? So it was like an absurdist art form? 293 00:17:03,920 --> 00:17:06,000 Speaker 1: Yeah, he said what he's doing is absurdist art, and 294 00:17:06,080 --> 00:17:07,600 Speaker 1: he doesn't regret any of it. 295 00:17:07,680 --> 00:17:11,639 Speaker 5: Yeah, I mean like his larger artistic message is lost 296 00:17:11,680 --> 00:17:14,080 Speaker 5: on me because it sounds like complete BS. 297 00:17:14,240 --> 00:17:16,439 Speaker 6: I was gonna say, I'm gonna call BS on the 298 00:17:16,480 --> 00:17:17,840 Speaker 6: AI excuse. 299 00:17:18,440 --> 00:17:21,800 Speaker 3: You might have been wondering what actual true crime podcasters 300 00:17:21,800 --> 00:17:24,960 Speaker 3: think about all this stuff. Well you just heard from 301 00:17:24,960 --> 00:17:27,639 Speaker 3: two of them. Hi, I am Bob Motta. And my 302 00:17:27,760 --> 00:17:30,560 Speaker 3: name is Lauren Bright Pacheco. Bob used to be a 303 00:17:30,600 --> 00:17:34,040 Speaker 3: defense attorney. Then he made a podcast telling the story 304 00:17:34,080 --> 00:17:37,240 Speaker 3: of how his father defended the notorious serial killer John 305 00:17:37,240 --> 00:17:40,920 Speaker 3: Wayne Gacy, and since then he's been a true crime podcaster.
306 00:17:41,840 --> 00:17:44,919 Speaker 3: Lauren is a former television producer, but now she's focused 307 00:17:44,960 --> 00:17:48,439 Speaker 3: on true crime audio. She produced the podcast Happy Face, 308 00:17:48,480 --> 00:17:51,640 Speaker 3: which was about another serial killer, and that's been adapted 309 00:17:51,680 --> 00:17:54,239 Speaker 3: into a show on Paramount Plus. The two of them 310 00:17:54,240 --> 00:17:57,000 Speaker 3: work together on a true crime podcast called Murder on 311 00:17:57,119 --> 00:18:01,000 Speaker 3: Songbird Road. But I think what may be happening here, and 312 00:18:01,000 --> 00:18:02,879 Speaker 3: I'm curious to hear what you think about this, is 313 00:18:02,920 --> 00:18:07,639 Speaker 3: this may be exposing something about the audience for true crime. 314 00:18:07,800 --> 00:18:09,640 Speaker 3: And I think this is one of the things that 315 00:18:09,800 --> 00:18:12,560 Speaker 3: this person who we're calling Paul is, you know, 316 00:18:12,680 --> 00:18:15,159 Speaker 3: this point he's trying to make, where he's saying, well, 317 00:18:15,560 --> 00:18:18,639 Speaker 3: the stuff that he's making, this AI generated stuff, is 318 00:18:18,680 --> 00:18:24,040 Speaker 3: actually better than real true crime because no actual victims 319 00:18:24,040 --> 00:18:24,919 Speaker 3: are being exploited. 320 00:18:25,560 --> 00:18:28,920 Speaker 6: That would work with the assumption that the intention of 321 00:18:29,600 --> 00:18:34,440 Speaker 6: all true crime creators is to exploit victims. And that's 322 00:18:34,520 --> 00:18:39,560 Speaker 6: the antithesis of my intention, of Bob's intention. And I 323 00:18:39,600 --> 00:18:46,720 Speaker 6: would ask Paul, since he has wrapped this very, I 324 00:18:46,760 --> 00:18:53,080 Speaker 6: think, disingenuous scam up with the I'm-teaching-everybody-a-lesson bow, 325 00:18:53,280 --> 00:18:55,280 Speaker 6: what he's doing with the profits?
326 00:18:55,760 --> 00:18:58,919 Speaker 5: Mm hmm. Like, are you sending 327 00:18:58,960 --> 00:19:02,280 Speaker 5: all the proceeds, like, to victims' organizations? Like, are you doing good 328 00:19:02,320 --> 00:19:04,639 Speaker 5: with it to teach your lesson? Or are you just 329 00:19:04,760 --> 00:19:07,440 Speaker 5: using that as a convenient excuse as to why you're 330 00:19:07,440 --> 00:19:10,080 Speaker 5: creating this stuff under the guise that it's actually real. 331 00:19:10,520 --> 00:19:14,320 Speaker 5: But, and look, I'm going to devil's advocate on Paul's 332 00:19:14,320 --> 00:19:18,520 Speaker 5: behalf here, true criminal defense attorney here. Yeah right, I mean, 333 00:19:18,560 --> 00:19:21,600 Speaker 5: I can't help it. So in terms of, I mean, 334 00:19:21,680 --> 00:19:26,280 Speaker 5: there is a large chunk of creators out there in 335 00:19:26,320 --> 00:19:29,440 Speaker 5: the true crime realm that are just 336 00:19:29,480 --> 00:19:33,280 Speaker 5: out there peddling the violence of the crimes. They are 337 00:19:33,880 --> 00:19:38,240 Speaker 5: not taking into consideration the victims themselves, their families that 338 00:19:38,320 --> 00:19:41,480 Speaker 5: have to live with these tragedies for generations beyond when 339 00:19:41,520 --> 00:19:44,760 Speaker 5: the crime took place. They're merely just retelling a story 340 00:19:44,800 --> 00:19:47,800 Speaker 5: that's not their story to tell, you know, and really 341 00:19:47,960 --> 00:19:51,800 Speaker 5: in order to just make money. And look, I 342 00:19:51,840 --> 00:19:54,159 Speaker 5: have a lot of friends that do it, you know, 343 00:19:54,240 --> 00:19:56,680 Speaker 5: that aren't deep divers, that really just 344 00:19:56,760 --> 00:19:59,280 Speaker 5: kind of sit there and reread.
They'll watch a Discovery 345 00:19:59,320 --> 00:20:01,880 Speaker 5: ID thing and they'll go write up an episode about 346 00:20:01,880 --> 00:20:02,440 Speaker 5: it and just. 347 00:20:02,480 --> 00:20:04,479 Speaker 6: Speak about it with authority. 348 00:20:04,240 --> 00:20:07,520 Speaker 5: Right, you know. So it's like, I don't want to 349 00:20:07,560 --> 00:20:09,440 Speaker 5: offend any of the people that do that out there, 350 00:20:09,440 --> 00:20:12,280 Speaker 5: but I think that Paul might have a point as 351 00:20:12,320 --> 00:20:15,760 Speaker 5: to those types of creators, because really, at 352 00:20:15,760 --> 00:20:17,080 Speaker 5: the end of the day, what are they bringing to 353 00:20:17,119 --> 00:20:17,560 Speaker 5: the table? 354 00:20:18,040 --> 00:20:21,879 Speaker 3: I'm hearing you two really kind of distinguishing yourselves from a 355 00:20:21,920 --> 00:20:25,520 Speaker 3: lot of other people who make true crime podcasts, even 356 00:20:25,560 --> 00:20:28,399 Speaker 3: though you all may, you know, be situated in the 357 00:20:28,400 --> 00:20:31,800 Speaker 3: same podcast listing, right, even in the same genre. 358 00:20:32,160 --> 00:20:33,919 Speaker 3: In this way, we could maybe think of what you 359 00:20:34,000 --> 00:20:40,360 Speaker 3: two do as creating really carefully prepared organic meals. But 360 00:20:40,400 --> 00:20:42,080 Speaker 3: a lot of people are very happy with a 361 00:20:42,119 --> 00:20:44,280 Speaker 3: bag of Cheetos. 362 00:20:45,240 --> 00:20:45,640 Speaker 6: Right. 363 00:20:45,760 --> 00:20:48,000 Speaker 5: I mean, like a lot of people just do like 364 00:20:48,160 --> 00:20:51,960 Speaker 5: kind of that pulp fiction quick hitter. I want to 365 00:20:52,000 --> 00:20:54,040 Speaker 5: rip open that bag of Cheetos. I want to dive in. 366 00:20:54,600 --> 00:20:57,280 Speaker 5: And it's arguably a much bigger market.
367 00:20:59,240 --> 00:21:02,560 Speaker 3: But remember, Paul talked about his work as holding 368 00:21:02,600 --> 00:21:05,560 Speaker 3: up a mirror to the industry of true crime, which 369 00:21:05,920 --> 00:21:08,520 Speaker 3: I think is actually pretty interesting as a concept. You know, 370 00:21:08,680 --> 00:21:11,520 Speaker 3: show the audience truly what it is that they're looking at. 371 00:21:12,400 --> 00:21:14,560 Speaker 3: But that made me wonder about the response of that 372 00:21:14,680 --> 00:21:20,720 Speaker 3: audience, when he was talking about his work as absurdist art. 373 00:21:21,480 --> 00:21:25,120 Speaker 3: He's got to be looking at the comments. He's seen 374 00:21:25,160 --> 00:21:29,480 Speaker 3: the same comments you're seeing. A lot of people, the majority, 375 00:21:29,600 --> 00:21:32,960 Speaker 3: the bulk of the comments clearly are people who do 376 00:21:33,000 --> 00:21:34,520 Speaker 3: not understand that this is fake. 377 00:21:35,160 --> 00:21:37,879 Speaker 1: Yeah, in large part also because he would moderate his 378 00:21:37,880 --> 00:21:41,120 Speaker 1: own comments and delete comments that would call him out 379 00:21:41,200 --> 00:21:41,879 Speaker 1: for his lies. 380 00:21:42,359 --> 00:21:43,159 Speaker 3: Did he tell you that? 381 00:21:43,560 --> 00:21:45,640 Speaker 1: Yeah, he said that he would go in and cut 382 00:21:45,760 --> 00:21:49,119 Speaker 1: comments that were negative or said that what he was 383 00:21:49,160 --> 00:21:51,560 Speaker 1: making was fake. He didn't get all of them, but 384 00:21:51,880 --> 00:21:53,240 Speaker 1: he said that he would try and get as many 385 00:21:53,280 --> 00:21:53,720 Speaker 1: as he could. 386 00:21:54,119 --> 00:21:57,800 Speaker 3: Yo, hold on. Okay, I'm sorry, man. So at that point, 387 00:21:58,280 --> 00:22:00,800 Speaker 3: you can't say it's on the audience.
If people don't get it, you 388 00:22:00,840 --> 00:22:05,000 Speaker 3: are manufacturing something that's fake, and you're also manufacturing an 389 00:22:05,040 --> 00:22:09,879 Speaker 3: echo chamber of other people, of basically social proof. You know, 390 00:22:09,920 --> 00:22:12,520 Speaker 3: it's like walking into a room and everybody says, wow, 391 00:22:12,600 --> 00:22:15,400 Speaker 3: look at this amazing thing. You think, well, everybody else 392 00:22:15,440 --> 00:22:19,320 Speaker 3: thinks this thing is amazing. I suppose if I think 393 00:22:19,359 --> 00:22:21,399 Speaker 3: it's not amazing, something's probably wrong with me. So if 394 00:22:21,440 --> 00:22:24,840 Speaker 3: you watch the video and look at the comments, and everybody else 395 00:22:24,840 --> 00:22:27,520 Speaker 3: seems to think it's real, you're gonna feel kind of 396 00:22:27,520 --> 00:22:31,480 Speaker 3: weird if you don't also go along with that. Also, like, 397 00:22:31,640 --> 00:22:35,200 Speaker 3: maybe it's just me. Everybody else seems to think it's real. Yeah, 398 00:22:35,240 --> 00:22:38,639 Speaker 3: that's incredible. So I don't know about you, but maybe 399 00:22:38,680 --> 00:22:41,159 Speaker 3: this does change things a little. Are we trying to 400 00:22:41,200 --> 00:22:44,119 Speaker 3: alert people to the danger of harmful entertainment, or are we 401 00:22:44,280 --> 00:22:47,480 Speaker 3: just trying to make money, or are those two things 402 00:22:47,600 --> 00:22:51,760 Speaker 3: totally compatible? Maybe it's just recognizing that people wanted Cheetos, 403 00:22:52,040 --> 00:22:54,399 Speaker 3: and Paul figured out that he could provide people with 404 00:22:54,440 --> 00:22:59,600 Speaker 3: those Cheetos really easily, really quickly, and in massive quantities. 405 00:23:00,200 --> 00:23:02,320 Speaker 3: He started picking up the pace, and he was publishing 406 00:23:02,400 --> 00:23:05,600 Speaker 3: videos every day or so.
Before long, he put out 407 00:23:05,600 --> 00:23:08,560 Speaker 3: over one hundred and fifty of these things, and people 408 00:23:08,640 --> 00:23:12,320 Speaker 3: were watching them. Some people were being fooled. But what 409 00:23:12,440 --> 00:23:15,320 Speaker 3: if some people don't care? What if people just want 410 00:23:15,400 --> 00:23:17,960 Speaker 3: Paul's Cheetos? Is that so bad? 411 00:23:18,520 --> 00:23:21,760 Speaker 6: I'll go further with that metaphor. You are what you eat. 412 00:23:22,200 --> 00:23:27,080 Speaker 6: And are you putting ideas out there into the world? 413 00:23:27,680 --> 00:23:33,280 Speaker 6: Are you having an overspill into real life in which 414 00:23:33,960 --> 00:23:36,520 Speaker 6: these crimes could become real? 415 00:23:37,280 --> 00:23:39,520 Speaker 3: I mean, if I'm picking up what you're saying here, 416 00:23:39,560 --> 00:23:43,760 Speaker 3: do you think that there's some worry that this AI 417 00:23:43,960 --> 00:23:46,800 Speaker 3: generated stuff is becoming more prevalent, it's getting more and 418 00:23:46,880 --> 00:23:49,639 Speaker 3: more salacious, the details get more and more outlandish, 419 00:23:49,640 --> 00:23:52,080 Speaker 3: but they also seem to get more listens, 420 00:23:52,119 --> 00:23:58,200 Speaker 3: more views, more clicks. That this could start to affect 421 00:23:58,200 --> 00:24:03,200 Speaker 3: how people perceive actual cases, or even just the news, 422 00:24:03,920 --> 00:24:05,360 Speaker 3: and give them ideas. 423 00:24:05,880 --> 00:24:09,919 Speaker 5: Yeah, I mean, what's to say that AI created, you know, 424 00:24:10,080 --> 00:24:13,399 Speaker 5: true crime isn't going to come up with concepts of 425 00:24:13,480 --> 00:24:16,919 Speaker 5: ways to really effectively evade. 426 00:24:17,040 --> 00:24:21,120 Speaker 6: You know, I'll tell you another issue that I have 427 00:24:21,240 --> 00:24:27,479 Speaker 6: with it.
If you're leaning heavily into sensationalizing what you 428 00:24:27,760 --> 00:24:33,679 Speaker 6: are claiming is real trauma suffered by real people, but 429 00:24:33,760 --> 00:24:37,800 Speaker 6: it's all fake and all made up, you are pulling 430 00:24:37,920 --> 00:24:43,040 Speaker 6: out emotion and concern from real people. And when you 431 00:24:43,080 --> 00:24:47,359 Speaker 6: are exhausting that reservoir, they're going to have a lot 432 00:24:47,560 --> 00:24:55,240 Speaker 6: less capacity to care about real trauma and crime for 433 00:24:55,359 --> 00:24:55,959 Speaker 6: other people. 434 00:24:57,080 --> 00:24:58,520 Speaker 3: So that's a real thing. 435 00:24:59,000 --> 00:25:01,280 Speaker 5: That's a real thing. I mean, we see it especially 436 00:25:01,359 --> 00:25:05,480 Speaker 5: in what we do, Dexter, in terms of people becoming 437 00:25:05,760 --> 00:25:08,480 Speaker 5: emotionally invested in cases. 438 00:25:08,760 --> 00:25:10,960 Speaker 6: And I think it's kind of like one of the 439 00:25:11,119 --> 00:25:17,760 Speaker 6: unforeseen side effects of Botox that has been studied extensively, 440 00:25:18,119 --> 00:25:21,720 Speaker 6: which is that when people have been doing it for a 441 00:25:21,760 --> 00:25:25,200 Speaker 6: period of time, they lose their ability to be empathetic, 442 00:25:25,880 --> 00:25:29,439 Speaker 6: because when you're listening to somebody in real time, we 443 00:25:29,520 --> 00:25:35,400 Speaker 6: don't realize how much our face is mimicking that person's emotions, 444 00:25:35,640 --> 00:25:37,320 Speaker 6: and that gives us the empathy. 445 00:25:37,840 --> 00:25:39,080 Speaker 3: That is so wild. 446 00:25:39,880 --> 00:25:40,760 Speaker 5: I had not heard that. 447 00:25:40,880 --> 00:25:42,000 Speaker 3: It makes sense though. 448 00:25:41,840 --> 00:25:47,440 Speaker 6: But now just take that and put that over AI 449 00:25:47,560 --> 00:25:51,440 Speaker 6: true crime.
It will ultimately be the same thing. We'll 450 00:25:51,480 --> 00:25:54,160 Speaker 6: stop caring about these cases. 451 00:25:56,320 --> 00:25:59,760 Speaker 3: So what now? Have we become too addicted to junk food? 452 00:25:59,800 --> 00:25:59,960 Speaker 7: Kinda. 453 00:26:00,800 --> 00:26:03,840 Speaker 3: Can we, or can the platforms, do anything about all this? 454 00:26:04,720 --> 00:26:18,720 Speaker 3: That's after the break. The channel had around one hundred 455 00:26:18,720 --> 00:26:23,200 Speaker 3: thousand subscribers and, across the videos, millions of views. It 456 00:26:23,240 --> 00:26:26,720 Speaker 3: made Henry wonder, he might not like this, but was 457 00:26:26,760 --> 00:26:32,040 Speaker 3: Paul actually doing anything against YouTube's policies? So he contacted 458 00:26:32,080 --> 00:26:37,439 Speaker 3: YouTube and asked them. So you hit up YouTube, and 459 00:26:38,480 --> 00:26:41,560 Speaker 3: YouTube nuked his channel, essentially. Yeah. 460 00:26:41,240 --> 00:26:43,840 Speaker 1: Not only that channel, but another three or four that 461 00:26:43,920 --> 00:26:44,600 Speaker 1: he also had. 462 00:26:45,119 --> 00:26:48,160 Speaker 3: What was the reason that they gave for pulling these 463 00:26:48,240 --> 00:26:48,880 Speaker 3: channels down? 464 00:26:49,320 --> 00:26:52,359 Speaker 1: YouTube told me in an email, they said that Paul's 465 00:26:52,400 --> 00:26:56,400 Speaker 1: videos had violated YouTube's policies around child safety, particularly their 466 00:26:56,440 --> 00:26:58,480 Speaker 1: policy on child sexual exploitation. 467 00:26:59,160 --> 00:26:59,960 Speaker 3: So not about AI. 468 00:27:00,600 --> 00:27:03,760 Speaker 1: No, no, he wasn't violating any of YouTube's policies around AI. 469 00:27:04,320 --> 00:27:06,800 Speaker 3: And even though this channel was taken off of YouTube, 470 00:27:07,240 --> 00:27:10,440 Speaker 3: you can still find it on other platforms.
He's still 471 00:27:10,480 --> 00:27:13,480 Speaker 3: making this stuff, clearly, because it's on Spotify. I found 472 00:27:13,480 --> 00:27:16,280 Speaker 3: it on Amazon Music, all the platforms that I 473 00:27:16,359 --> 00:27:18,160 Speaker 3: looked for, it's on there too. 474 00:27:18,520 --> 00:27:21,719 Speaker 1: Yeah, he has an RSS feed and a podcast player 475 00:27:21,760 --> 00:27:25,920 Speaker 1: and he's still generating these true crime stories. Also, they're 476 00:27:25,960 --> 00:27:29,359 Speaker 1: selling ads on all of these, yeah, for hummus 477 00:27:29,440 --> 00:27:33,600 Speaker 1: and universities and all sorts of weird stuff. So he's 478 00:27:33,600 --> 00:27:34,840 Speaker 1: still clearly making money. 479 00:27:35,320 --> 00:27:38,560 Speaker 3: So again, YouTube takes down the channel because they say 480 00:27:38,560 --> 00:27:42,280 Speaker 3: it violated child safety policies, not because it was fake, 481 00:27:43,040 --> 00:27:45,639 Speaker 3: and Spotify doesn't really seem to care that it's fake 482 00:27:45,960 --> 00:27:49,480 Speaker 3: or that it's explicit. So for now, there's not really 483 00:27:49,520 --> 00:27:51,240 Speaker 3: any reason for this stuff to stop. 484 00:27:51,600 --> 00:27:55,000 Speaker 1: True crime is really popular and people love it, and 485 00:27:55,040 --> 00:27:58,680 Speaker 1: so of course there are going to be rip-offs and parodies 486 00:27:58,760 --> 00:28:01,960 Speaker 1: and scams associated with it. It's just the world we 487 00:28:02,320 --> 00:28:03,359 Speaker 1: live in now, I think. 488 00:28:03,760 --> 00:28:05,240 Speaker 7: So if you want to know how to make a 489 00:28:05,280 --> 00:28:08,080 Speaker 7: true crime case story video that will go viral, then 490 00:28:08,119 --> 00:28:10,239 Speaker 7: you are in luck. Because hey there, and how are 491 00:28:10,280 --> 00:28:10,640 Speaker 7: you doing?
492 00:28:10,880 --> 00:28:13,600 Speaker 3: And now there are tutorials on how to make this stuff. 493 00:28:13,640 --> 00:28:15,360 Speaker 7: In this video, I am going to show you how 494 00:28:15,359 --> 00:28:18,200 Speaker 7: you can create your own true crime story video and 495 00:28:18,240 --> 00:28:22,080 Speaker 7: go viral. So are you excited? Well, let's dive right in. 496 00:28:22,640 --> 00:28:24,600 Speaker 1: I mean, of course there are tutorials, right, of course. 497 00:28:24,760 --> 00:28:28,240 Speaker 1: Pandora's box is now open, and there's really nothing to 498 00:28:28,640 --> 00:28:33,359 Speaker 1: do except hope for a solar flare, right? That's what 499 00:28:33,480 --> 00:28:35,439 Speaker 1: my fingers are crossed for. 500 00:28:36,800 --> 00:28:38,680 Speaker 3: I mean, at this point, just bring the asteroid, man, 501 00:28:38,720 --> 00:28:39,760 Speaker 3: because I don't know what we're gonna do. 502 00:28:39,840 --> 00:28:42,680 Speaker 1: Yeah, let's just clear all the satellites from orbit and 503 00:28:42,720 --> 00:28:45,960 Speaker 1: start fresh. And I don't know, maybe someone will cut 504 00:28:46,000 --> 00:28:49,240 Speaker 1: all the undersea telecom cables and we'll be good. 505 00:28:49,400 --> 00:28:53,120 Speaker 3: I mean, that might be it. At this point, I'm 506 00:28:53,160 --> 00:28:55,480 Speaker 3: going to refrain from saying kill switch. I'm going to 507 00:28:55,520 --> 00:28:57,960 Speaker 3: refrain from saying that, because that would be too corny. 508 00:28:59,360 --> 00:29:03,400 Speaker 3: So you spent months reporting on this. What is your 509 00:29:03,440 --> 00:29:04,120 Speaker 3: takeaway from this? 510 00:29:05,040 --> 00:29:08,760 Speaker 1: There's a couple. I think the very boring reporter in 511 00:29:08,840 --> 00:29:11,239 Speaker 1: me is like, you know, truth matters more than 512 00:29:11,240 --> 00:29:15,200 Speaker 1: ever, right, and misinformation is bad.
But of course we 513 00:29:15,280 --> 00:29:18,320 Speaker 1: all know that. I think we have to ask ourselves 514 00:29:18,400 --> 00:29:21,840 Speaker 1: why we get interested in the media we consume. I 515 00:29:21,840 --> 00:29:24,480 Speaker 1: think that's a really important part of being a consumer 516 00:29:24,520 --> 00:29:27,960 Speaker 1: in an age where we have so much to choose from. 517 00:29:28,200 --> 00:29:32,000 Speaker 1: There's so much more of our media diet that's completely 518 00:29:32,000 --> 00:29:34,479 Speaker 1: in our own hands, and we can totally screw ourselves 519 00:29:34,560 --> 00:29:36,160 Speaker 1: over if we let ourselves. 520 00:29:37,160 --> 00:29:39,040 Speaker 3: I would imagine that there are some people who will 521 00:29:39,080 --> 00:29:42,880 Speaker 3: continue to listen to Paul's podcasts who know that it's 522 00:29:42,920 --> 00:29:46,440 Speaker 3: fake and who don't care, because it's good enough for them. 523 00:29:46,480 --> 00:29:48,480 Speaker 3: And not only is it good enough for them, it's 524 00:29:48,680 --> 00:29:51,960 Speaker 3: escalated to something, to a level, that Netflix is not 525 00:29:52,000 --> 00:29:55,520 Speaker 3: going to give them, because the source material doesn't exist. 526 00:29:56,200 --> 00:29:59,840 Speaker 3: There isn't a husband who is secretly gay who kills 527 00:30:00,320 --> 00:30:02,760 Speaker 3: his wife on a cruise, that happens every week. There 528 00:30:02,840 --> 00:30:06,800 Speaker 3: is not a trans person who crosses state lines to 529 00:30:06,840 --> 00:30:09,840 Speaker 3: participate in some drug ring or something like that every 530 00:30:09,840 --> 00:30:14,520 Speaker 3: single week. It's not possible, but AI does make it possible. 531 00:30:15,080 --> 00:30:17,720 Speaker 3: And maybe that's where we are.
Maybe we've gotten so 532 00:30:17,760 --> 00:30:20,240 Speaker 3: addicted to true crime that there are people 533 00:30:20,280 --> 00:30:24,960 Speaker 3: who want the fake stuff. This is something Bob also 534 00:30:25,040 --> 00:30:25,520 Speaker 3: brought up. 535 00:30:25,800 --> 00:30:28,440 Speaker 5: I mean, I think at some point people are gonna 536 00:30:28,440 --> 00:30:31,480 Speaker 5: come to like it. Like, more and more people are listening 537 00:30:31,480 --> 00:30:33,160 Speaker 5: to it, and they're going in under the 538 00:30:33,160 --> 00:30:35,840 Speaker 5: guise that it happened. And then when they come 539 00:30:35,840 --> 00:30:39,400 Speaker 5: to realize that it didn't, they're going to be upset. 540 00:30:39,800 --> 00:30:43,080 Speaker 5: But then the question becomes, the next time it comes up, 541 00:30:43,120 --> 00:30:46,040 Speaker 5: they'll be like, well, I know now that it's fake, 542 00:30:46,120 --> 00:30:47,200 Speaker 5: but it was still pretty good. 543 00:30:47,920 --> 00:30:48,200 Speaker 7: Again. 544 00:30:49,040 --> 00:30:50,920 Speaker 3: Look, when I open a bag of Cheetos, 545 00:30:50,960 --> 00:30:53,320 Speaker 3: I know what I'm in for, right? And Bob and 546 00:30:53,400 --> 00:30:57,000 Speaker 3: Lauren are also worried about some other real life effects.
547 00:30:57,040 --> 00:30:59,760 Speaker 5: I've got a distinct fear about how it's going to 548 00:30:59,800 --> 00:31:02,680 Speaker 5: actually move into the criminal justice system, because it's 549 00:31:02,720 --> 00:31:05,600 Speaker 5: only a matter of time before they're able to use 550 00:31:05,640 --> 00:31:09,320 Speaker 5: these same types of programs, in terms of creating imagery, 551 00:31:09,560 --> 00:31:11,800 Speaker 5: where they're going to be able to bring false evidence 552 00:31:11,840 --> 00:31:15,480 Speaker 5: into cases. Like, there is nothing stopping somebody from saying, yeah, 553 00:31:15,520 --> 00:31:17,760 Speaker 5: I have a recording of a phone call wherein this 554 00:31:17,800 --> 00:31:22,320 Speaker 5: person just confessed. I mean, the thought of what could 555 00:31:22,360 --> 00:31:26,120 Speaker 5: be in the very, very near future is terrifying on 556 00:31:26,400 --> 00:31:29,800 Speaker 5: an entirely different level, in terms of just the 557 00:31:29,840 --> 00:31:32,920 Speaker 5: fact that it's able to create out of whole cloth 558 00:31:33,080 --> 00:31:35,640 Speaker 5: things that don't even exist and make it seem as 559 00:31:35,640 --> 00:31:36,280 Speaker 5: if they do. 560 00:31:37,160 --> 00:31:40,800 Speaker 6: You know, we're seeing it on the political stage right now, 561 00:31:41,240 --> 00:31:47,000 Speaker 6: and so we are gradually becoming more and more comfortable 562 00:31:47,360 --> 00:31:49,680 Speaker 6: with accepting AI. 563 00:31:52,640 --> 00:31:57,280 Speaker 3: AI generated evidence being used in a courtroom.
I hope 564 00:31:57,320 --> 00:31:59,400 Speaker 3: we never have to do an episode about that, but honestly, 565 00:31:59,480 --> 00:32:02,240 Speaker 3: at this point, man, give it a couple months. But 566 00:32:02,400 --> 00:32:04,600 Speaker 3: I think what we're starting to learn is that there 567 00:32:04,680 --> 00:32:07,760 Speaker 3: is a segment of the population that knows what they're 568 00:32:07,760 --> 00:32:11,600 Speaker 3: getting is fake and they don't care. Maybe you personally 569 00:32:11,640 --> 00:32:14,560 Speaker 3: are okay with the ChatGPT filters that can redraw 570 00:32:14,600 --> 00:32:17,880 Speaker 3: your picture as a Simpsons character, or you're cool with 571 00:32:17,960 --> 00:32:20,920 Speaker 3: the audio generation engines that can turn your lyrics into 572 00:32:20,920 --> 00:32:23,560 Speaker 3: a pop song or a mariachi song or a rap song. 573 00:32:24,440 --> 00:32:28,400 Speaker 3: Maybe you personally draw the line at simulated approximations of 574 00:32:28,480 --> 00:32:31,960 Speaker 3: people being murdered, but we should acknowledge that these are 575 00:32:32,000 --> 00:32:39,800 Speaker 3: all uses for the same technology. Thank you so much 576 00:32:39,840 --> 00:32:42,120 Speaker 3: for listening to Kill Switch. You can hit us up 577 00:32:42,160 --> 00:32:45,240 Speaker 3: at kill switch at kaleidoscope dot NYC with any of 578 00:32:45,240 --> 00:32:48,000 Speaker 3: your thoughts, or you can hit me personally at dex 579 00:32:48,080 --> 00:32:52,000 Speaker 3: digi, that's d-e-x-d-i-g-i, on Instagram 580 00:32:52,120 --> 00:32:54,320 Speaker 3: or Bluesky if that's more your thing. And if 581 00:32:54,320 --> 00:32:57,160 Speaker 3: you liked the episode, hopefully you did, if you're on 582 00:32:57,240 --> 00:33:00,200 Speaker 3: Apple Podcasts or Spotify, you know, take that phone out 583 00:33:00,240 --> 00:33:03,120 Speaker 3: of that pocket and leave us a review.
It really 584 00:33:03,200 --> 00:33:05,640 Speaker 3: does help people find the show, which in turn helps 585 00:33:05,720 --> 00:33:08,920 Speaker 3: us keep doing our thing. Kill Switch is hosted by me, 586 00:33:09,200 --> 00:33:13,320 Speaker 3: Dexter Thomas. It's produced by Shena Ozaki, Darl of Potts, 587 00:33:13,440 --> 00:33:16,280 Speaker 3: and Kate Osborne. The theme song was written by me 588 00:33:16,560 --> 00:33:19,959 Speaker 3: and Kyle Murdoch, and Kyle also mixed the show. From 589 00:33:20,080 --> 00:33:24,959 Speaker 3: Kaleidoscope, our executive producers are Oz Woloshyn, Mangesh Hattikudur, and 590 00:33:25,080 --> 00:33:29,320 Speaker 3: Kate Osborne. From iHeart, our executive producers are Katrina Norvell 591 00:33:29,560 --> 00:33:44,880 Speaker 3: and Nikki Ettore. We'll catch you on the next one.