1 00:00:00,920 --> 00:00:04,360 Speaker 1: A quick note. This is episode six of a six 2 00:00:04,440 --> 00:00:08,560 Speaker 1: part series. If you haven't heard the prior episodes, we 3 00:00:08,680 --> 00:00:12,800 Speaker 1: recommend going back and starting there. It should also be 4 00:00:12,920 --> 00:00:19,160 Speaker 1: noted that this series explores sexualized imagery involving minors and violence. 5 00:00:19,920 --> 00:00:33,519 Speaker 1: Please take care when listening. Levittown, the place, was 6 00:00:33,560 --> 00:00:38,839 Speaker 1: itself an innovation. William J. Levitt of Levitt and Sons, 7 00:00:39,240 --> 00:00:44,239 Speaker 1: inspired by the production assembly line, broke down mass-produced 8 00:00:44,240 --> 00:00:49,920 Speaker 1: housing into just twenty six steps, with construction workers building 9 00:00:49,960 --> 00:00:55,120 Speaker 1: a house in just one day: quick, cost-saving production 10 00:00:55,440 --> 00:01:01,360 Speaker 1: of identical tract housing in mere months. In nineteen forty seven, 11 00:01:02,240 --> 00:01:07,800 Speaker 1: thousands of houses were built, and white families only were 12 00:01:07,800 --> 00:01:14,720 Speaker 1: moving in, restricted by racial covenant. The innovation was for some, 13 00:01:16,959 --> 00:01:22,200 Speaker 1: but it was a kind of progress. And yet today 14 00:01:23,120 --> 00:01:27,000 Speaker 1: the place that was once the pinnacle of innovation is 15 00:01:27,520 --> 00:01:31,360 Speaker 1: like everywhere else now, struggling to keep up with the 16 00:01:31,440 --> 00:01:35,880 Speaker 1: incredible force of technological change. 17 00:01:37,080 --> 00:01:38,880 Speaker 2: When I first got here, we were working off of 18 00:01:38,959 --> 00:01:39,759 Speaker 2: VCR tapes. 19 00:01:40,280 --> 00:01:44,120 Speaker 1: Lieutenant Detective Devon Ross has been at the Nassau County 20 00:01:44,200 --> 00:01:46,679 Speaker 1: Police Department since the late nineties. 21 00:01:47,240 --> 00:01:51,440 Speaker 2: Tape-fed camcorders and eight millimeter cassette tapes for audio, 22 00:01:52,200 --> 00:01:54,920 Speaker 2: and now we're in a world that the people who 23 00:01:54,920 --> 00:01:58,560 Speaker 2: worked with me back then wouldn't even recognize today. 24 00:01:59,280 --> 00:02:03,800 Speaker 1: Ross is the head of the Cyber Crimes Unit, investigating crimes 25 00:02:03,880 --> 00:02:08,880 Speaker 1: like cyber extortion, online fraud, and deep fakes. 26 00:02:10,040 --> 00:02:14,800 Speaker 2: They call it an arms race between law enforcement slash 27 00:02:14,880 --> 00:02:19,600 Speaker 2: legislation and technology, and it's just one we're losing. We 28 00:02:19,680 --> 00:02:23,000 Speaker 2: are absolutely losing. The things that we're trying to think 29 00:02:23,000 --> 00:02:26,200 Speaker 2: about solving that happened six weeks ago? They are already 30 00:02:26,760 --> 00:02:29,600 Speaker 2: ten months ahead of us, and it's a shame. 31 00:02:33,520 --> 00:02:37,400 Speaker 1: The Patrick Carey case is sort of a perfect example. 32 00:02:38,200 --> 00:02:43,120 Speaker 1: In twenty nineteen, he was essentially using glorified Photoshop to 33 00:02:43,200 --> 00:02:47,680 Speaker 1: make the images.
Basically, he was creating a digital collage: 34 00:02:48,000 --> 00:02:51,520 Speaker 1: a face here pasted on a body there, with some 35 00:02:51,680 --> 00:02:56,520 Speaker 1: rudimentary AI to help blend the pieces, and presto, a 36 00:02:56,560 --> 00:03:01,639 Speaker 1: whole new picture. But now, would-be Patrick Careys can 37 00:03:01,760 --> 00:03:05,360 Speaker 1: use an app or a website, and a few swipes 38 00:03:05,919 --> 00:03:12,399 Speaker 1: and a few seconds later, boom, something new that looks real. 39 00:03:14,200 --> 00:03:18,560 Speaker 3: So I've opened up my laptop and I've got a 40 00:03:18,600 --> 00:03:23,799 Speaker 3: few windows open of several of the nudifying apps which 41 00:03:23,840 --> 00:03:26,480 Speaker 3: we're on. So Olivia and I decided to try it 42 00:03:26,520 --> 00:03:30,200 Speaker 3: for ourselves, this tech that in many ways brought us 43 00:03:30,280 --> 00:03:35,840 Speaker 3: to the story. So Draw Nudes is the app we're 44 00:03:35,880 --> 00:03:36,520 Speaker 3: going to go visit. 45 00:03:36,760 --> 00:03:37,560 Speaker 1: I'd love to see it. 46 00:03:37,960 --> 00:03:40,920 Speaker 3: We were doing some reporting in California together and we 47 00:03:41,040 --> 00:03:43,360 Speaker 3: decided to do a test run of some of the 48 00:03:43,440 --> 00:03:48,000 Speaker 3: latest tools using artificial intelligence to modify pictures, which make 49 00:03:48,080 --> 00:03:52,320 Speaker 3: it so people who were dressed now appear naked. They're 50 00:03:52,360 --> 00:03:57,360 Speaker 3: called undressing or nudify websites and apps. For our experiment, 51 00:03:57,680 --> 00:04:01,240 Speaker 3: we used one of the most popular websites, Draw Nudes, 52 00:04:01,520 --> 00:04:04,360 Speaker 3: which had more than twelve million visits in the first 53 00:04:04,400 --> 00:04:07,680 Speaker 3: half of twenty twenty four. These sites let you get 54 00:04:07,720 --> 00:04:10,760 Speaker 3: started quickly. You don't need to enter an email or 55 00:04:10,800 --> 00:04:13,080 Speaker 3: a name to use them, and you can make your 56 00:04:13,080 --> 00:04:16,360 Speaker 3: first few fakes for free. So it's saying, have someone 57 00:04:16,400 --> 00:04:20,719 Speaker 3: to undress, upload a photo. So I'm going to upload 58 00:04:20,760 --> 00:04:25,120 Speaker 3: a photo from my 59 00:04:26,760 --> 00:04:27,240 Speaker 4: library. 60 00:04:27,360 --> 00:04:29,479 Speaker 3: So this is a photo of me. I am in 61 00:04:30,080 --> 00:04:33,719 Speaker 3: a desert in Nevada. Oh, it looks so cool and 62 00:04:34,320 --> 00:04:36,920 Speaker 3: I have been sort of going for a jog. 63 00:04:37,680 --> 00:04:40,240 Speaker 1: Okay, in this photo, I can see Margie. She looks 64 00:04:40,279 --> 00:04:43,400 Speaker 1: really happy. She's got her arms stretched out, she's standing. 65 00:04:43,440 --> 00:04:47,200 Speaker 1: It looks like you're wearing running shoes, maybe a 66 00:04:47,279 --> 00:04:51,520 Speaker 1: casual pair of shorts, a sports bra and a bucket 67 00:04:51,560 --> 00:04:54,840 Speaker 1: hat and sunnies, and you're in the middle of the desert. 68 00:04:55,720 --> 00:05:00,520 Speaker 3: So, done. I'm going to pick auto, so the AI paints 69 00:05:00,560 --> 00:05:06,880 Speaker 3: over the clothes on its own. Okay, so you just paint over 70 00:05:07,640 --> 00:05:12,559 Speaker 3: where your clothes are. So basically, on my phone,
71 00:05:12,560 --> 00:05:14,640 Speaker 3: I'm just basically tapping on my sports bra and my 72 00:05:14,680 --> 00:05:23,919 Speaker 3: shorts, and then it's highlighted in orange. It says processing photos. 73 00:05:23,640 --> 00:05:29,839 Speaker 1: Almost there. Do I need to close the page? Oh 74 00:05:29,880 --> 00:05:38,119 Speaker 1: my god, Oh, oh my god, it looks so real. 75 00:05:39,680 --> 00:05:45,480 Speaker 3: Yeah, I look completely naked. I kind of couldn't believe it. 76 00:05:46,200 --> 00:05:49,120 Speaker 3: In a matter of seconds, this website did what it 77 00:05:49,240 --> 00:05:53,440 Speaker 3: likely took Patrick Carey and others hours to accomplish. 78 00:05:53,480 --> 00:05:58,239 Speaker 1: And it's proportionally correct to your body. Oh my god. 79 00:06:02,160 --> 00:06:05,960 Speaker 3: Welcome to the future, where you really can't believe your eyes. 80 00:06:06,880 --> 00:06:12,280 Speaker 3: From iHeart Podcasts, Bloomberg and Kaleidoscope, this is Levittown. I'm 81 00:06:12,279 --> 00:06:22,000 Speaker 3: Margie Murphy, and I'm Olivia Carville. Nothing can quite prepare 82 00:06:22,040 --> 00:06:26,520 Speaker 3: you for this. One moment, a smiling vacation picture, the 83 00:06:26,560 --> 00:06:31,159 Speaker 3: next you're naked. It's not really your body and it's 84 00:06:31,240 --> 00:06:35,719 Speaker 3: no longer your photo. It's completely surreal to see it happen, 85 00:06:36,680 --> 00:06:39,240 Speaker 3: and a little weird to share it with a coworker. 86 00:06:40,000 --> 00:06:41,960 Speaker 3: I found it hard to look Olivia in the eye 87 00:06:41,960 --> 00:06:45,280 Speaker 3: for a moment afterward. I can't imagine what it would 88 00:06:45,279 --> 00:06:47,719 Speaker 3: be like to have your classmates looking at something that 89 00:06:47,760 --> 00:06:53,440 Speaker 3: feels so intimate. However, I created that particular image. I 90 00:06:53,520 --> 00:06:55,760 Speaker 3: might not have known exactly what I would look like 91 00:06:55,880 --> 00:06:59,400 Speaker 3: once the tech undressed me, but I knew basically what 92 00:06:59,600 --> 00:07:04,680 Speaker 3: was coming. That's not often the case. Most people who 93 00:07:04,720 --> 00:07:08,800 Speaker 3: make deep fake pornography haven't asked their subjects whether they're 94 00:07:08,839 --> 00:07:12,520 Speaker 3: okay with it. They just do it. It's pretty much 95 00:07:12,600 --> 00:07:16,520 Speaker 3: the reason these apps exist. So when a person learns 96 00:07:16,520 --> 00:07:19,520 Speaker 3: that their image has been altered this way for someone 97 00:07:19,520 --> 00:07:24,400 Speaker 3: else's pleasure, it can feel like a violation. For many 98 00:07:24,440 --> 00:07:27,640 Speaker 3: of the women we spoke to, it was a violation, 99 00:07:28,560 --> 00:07:32,800 Speaker 3: a new kind of violence. As a society, we haven't 100 00:07:32,840 --> 00:07:37,320 Speaker 3: even begun to grapple with this. Currently, it isn't a 101 00:07:37,360 --> 00:07:40,120 Speaker 3: crime under federal law to create and share deep fake 102 00:07:40,160 --> 00:07:43,760 Speaker 3: pornography based on adults. When it comes to fake nudes 103 00:07:43,760 --> 00:07:46,760 Speaker 3: of children, the law is narrow and pertains only 104 00:07:46,800 --> 00:07:50,120 Speaker 3: to cases where children are being abused.
In March of 105 00:07:50,200 --> 00:07:53,720 Speaker 3: twenty twenty four, the FBI did issue a public announcement 106 00:07:53,960 --> 00:07:57,680 Speaker 3: saying child sexual abuse material, even if created by AI, 107 00:07:58,280 --> 00:08:02,440 Speaker 3: is illegal, but again, the law they are referring to 108 00:08:02,760 --> 00:08:07,040 Speaker 3: is quite narrow. So far, at least twenty five states 109 00:08:07,080 --> 00:08:09,880 Speaker 3: have tried to address this with their own laws and regulations, 110 00:08:10,600 --> 00:08:13,400 Speaker 3: but it should be noted that federal law protects the 111 00:08:13,440 --> 00:08:17,400 Speaker 3: hosting websites, whether it's a forum, social media platform, or 112 00:08:17,440 --> 00:08:21,760 Speaker 3: internet provider, from being held liable for any content, even 113 00:08:21,760 --> 00:08:24,240 Speaker 3: if it's illegal, posted to their sites. 114 00:08:26,760 --> 00:08:31,240 Speaker 1: Meanwhile, app developers are racing to meet a growing demand 115 00:08:31,600 --> 00:08:36,320 Speaker 1: to make better nudes faster, and many of the people 116 00:08:36,360 --> 00:08:40,200 Speaker 1: who are using this tech aren't adults. They're teens 117 00:08:40,679 --> 00:08:44,559 Speaker 1: wielding this technology against other teens. 118 00:08:45,000 --> 00:08:48,560 Speaker 5: Deep fake porn crushing high school girls in Westfield, New Jersey. 119 00:08:48,640 --> 00:08:52,080 Speaker 5: Two students are in some serious trouble after being accused of 120 00:08:52,200 --> 00:08:54,760 Speaker 5: using AI to create inappropriate images. 121 00:08:54,840 --> 00:08:56,000 Speaker 6: Beverly Hills Middle 122 00:08:55,760 --> 00:08:59,080 Speaker 4: School students are in trouble for using AI to create 123 00:08:59,200 --> 00:09:03,040 Speaker 4: and share the fake nude images of their classmates. 124 00:09:04,240 --> 00:09:07,240 Speaker 1: Parents are at a loss and so are school officials. 125 00:09:07,720 --> 00:09:10,600 Speaker 1: In so many places, there are no guardrails to 126 00:09:10,679 --> 00:09:13,880 Speaker 1: make sure what happened in their school, in their town, 127 00:09:14,840 --> 00:09:19,280 Speaker 1: doesn't happen again. Since I started reporting on Levittown in 128 00:09:19,320 --> 00:09:23,120 Speaker 1: twenty twenty three, I've followed as cases like these have 129 00:09:23,320 --> 00:09:29,280 Speaker 1: mushroomed across the country, but one in particular stood out. 130 00:09:30,040 --> 00:09:32,360 Speaker 4: I didn't actually realize that it was such a big 131 00:09:32,640 --> 00:09:33,880 Speaker 4: thing across the country, 132 00:09:33,920 --> 00:09:34,160 Speaker 7: I think. 133 00:09:34,240 --> 00:09:37,679 Speaker 1: Detective Laurel Beer is one of two investigators in a 134 00:09:37,720 --> 00:09:42,240 Speaker 1: small police department in Lancaster County, Pennsylvania. 135 00:09:42,360 --> 00:09:45,240 Speaker 4: That's probably what surprised me the most, was that there's 136 00:09:45,280 --> 00:09:48,320 Speaker 4: been so many cases that haven't ever gone to prosecution 137 00:09:48,640 --> 00:09:51,679 Speaker 4: simply because they didn't know how to use the laws 138 00:09:51,720 --> 00:09:53,880 Speaker 4: to their advantage, or they 139 00:09:53,920 --> 00:09:57,439 Speaker 4: didn't have a law that they could use to press 140 00:09:57,480 --> 00:09:58,800 Speaker 4: charges for something like this.
141 00:10:00,000 --> 00:10:04,360 Speaker 1: Lancaster is a place synonymous with farmland and the Amish, 142 00:10:04,800 --> 00:10:07,319 Speaker 1: where it's pretty normal to see a horse and buggy 143 00:10:07,440 --> 00:10:08,720 Speaker 1: clopping down the street. 144 00:10:09,040 --> 00:10:11,000 Speaker 4: Because we only have two of us, we pretty much 145 00:10:11,040 --> 00:10:14,400 Speaker 4: do every kind of investigation that comes into our office, 146 00:10:14,760 --> 00:10:18,760 Speaker 4: from your property crimes like burglary or trespassing, all the 147 00:10:18,760 --> 00:10:22,200 Speaker 4: way up to homicide, if we were to have it. But 148 00:10:22,760 --> 00:10:25,960 Speaker 4: pornography cases, child pornography cases, stuff like that, 149 00:10:25,960 --> 00:10:28,319 Speaker 7: that's something that we handle between the two of us. 150 00:10:28,320 --> 00:10:29,640 Speaker 7: We handle several a year. 151 00:10:31,520 --> 00:10:34,920 Speaker 1: In early twenty twenty four, Laurel got a call from 152 00:10:34,920 --> 00:10:40,439 Speaker 1: a patrol officer. Parents whose children attended a nearby private 153 00:10:40,520 --> 00:10:45,360 Speaker 1: school reported that one of the students had accidentally shared 154 00:10:45,400 --> 00:10:49,920 Speaker 1: a photograph in a group text messaging thread. That photo 155 00:10:50,320 --> 00:10:53,920 Speaker 1: was pulled from the social media account of a female classmate, 156 00:10:54,520 --> 00:10:58,720 Speaker 1: but she had been undressed with technology. It was a 157 00:10:58,760 --> 00:11:03,160 Speaker 1: deep fake. The officer didn't know whether they could arrest 158 00:11:03,280 --> 00:11:07,120 Speaker 1: the student, but Laurel had worked a similar case the 159 00:11:07,200 --> 00:11:12,880 Speaker 1: year before. She knew that in Pennsylvania, any quote, depiction 160 00:11:13,120 --> 00:11:16,600 Speaker 1: of a child in a sexual manner, created by AI 161 00:11:16,920 --> 00:11:21,320 Speaker 1: or not, can lead to a child abuse charge. 162 00:11:22,120 --> 00:11:23,960 Speaker 4: It really helped move things along, I think, at a 163 00:11:24,000 --> 00:11:26,319 Speaker 4: much faster pace, because we knew where to go. 164 00:11:27,080 --> 00:11:29,920 Speaker 3: The pictures in this case weren't being traded on a 165 00:11:29,920 --> 00:11:33,760 Speaker 3: tributing website like they were in Levittown. This time, they 166 00:11:33,800 --> 00:11:37,520 Speaker 3: were being shared between two classmates over a messaging platform 167 00:11:37,720 --> 00:11:41,120 Speaker 3: called Discord, until one of them messed up and sent 168 00:11:41,120 --> 00:11:44,480 Speaker 3: a deep fake to a larger group thread. But because 169 00:11:44,480 --> 00:11:48,160 Speaker 3: the rest of the pictures were being privately exchanged, no 170 00:11:48,160 --> 00:11:53,120 Speaker 3: one knew exactly who'd been deep faked. So students and 171 00:11:53,160 --> 00:11:56,960 Speaker 3: families were panicking, and the detective wanted to provide answers. 172 00:11:58,840 --> 00:12:02,199 Speaker 3: Like the police and prosecutors in Levittown, Laurel got 173 00:12:02,200 --> 00:12:06,080 Speaker 3: a search warrant for online records and chat histories. It 174 00:12:06,240 --> 00:12:12,000 Speaker 3: was a painstaking process to identify each victim.
Laurel created spreadsheets, 175 00:12:12,200 --> 00:12:16,079 Speaker 3: collected and cross checked the pictures against yearbooks, interviewed witnesses, 176 00:12:16,480 --> 00:12:20,479 Speaker 3: and contacted families. In all, she was able to identify 177 00:12:20,559 --> 00:12:24,839 Speaker 3: sixty victims, fifty nine of whom were minors. 178 00:12:25,480 --> 00:12:27,839 Speaker 4: I was surprised over the course of the investigation to 179 00:12:27,880 --> 00:12:32,960 Speaker 4: find out exactly how large scale this case actually was. 180 00:12:33,520 --> 00:12:36,120 Speaker 3: It's one of the largest cases of deep fake 181 00:12:36,200 --> 00:12:39,880 Speaker 3: pornography in the US that we know about. And just 182 00:12:39,920 --> 00:12:44,680 Speaker 3: like in Levittown, these young women knew the alleged suspects. 183 00:12:44,360 --> 00:12:46,760 Speaker 7: Which seems to be the case in most of the 184 00:12:46,760 --> 00:12:49,520 Speaker 7: ones that I've researched and seen across the country, is 185 00:12:49,559 --> 00:12:52,040 Speaker 7: that most of the time it's not somebody that these 186 00:12:52,120 --> 00:12:55,160 Speaker 7: kids don't know. Sadly, it's someone that they do know. 187 00:12:56,360 --> 00:12:59,520 Speaker 4: Just knowing that day in and day out, this person 188 00:13:00,440 --> 00:13:03,079 Speaker 4: was among them in their day to day life. While 189 00:13:03,120 --> 00:13:04,960 Speaker 4: they ate lunch, while they went to class, while they 190 00:13:04,960 --> 00:13:06,959 Speaker 4: played sports, you know, whatever it was. They might 191 00:13:07,000 --> 00:13:08,760 Speaker 4: have had a class with one of these students. 192 00:13:11,080 --> 00:13:14,240 Speaker 3: School officials in Lancaster first learned about the deep fake 193 00:13:14,280 --> 00:13:17,319 Speaker 3: images several months before Laurel got involved. 194 00:13:18,320 --> 00:13:22,600 Speaker 1: In November twenty twenty three, an anonymous tipster told school 195 00:13:22,640 --> 00:13:28,320 Speaker 1: officials that a student had created AI generated nude images 196 00:13:28,760 --> 00:13:33,800 Speaker 1: of their classmates. The school launched an internal investigation, but 197 00:13:33,920 --> 00:13:39,120 Speaker 1: couldn't corroborate the tip. They did not go to the police. 198 00:13:39,160 --> 00:13:44,800 Speaker 1: Months later, after another anonymous tip, they opened a new investigation. 199 00:13:45,840 --> 00:13:50,320 Speaker 1: After that, one of the suspects was identified and withdrew 200 00:13:50,360 --> 00:13:54,920 Speaker 1: from the school. The incident was reported by the school 201 00:13:55,200 --> 00:13:59,880 Speaker 1: to the state agency responsible for child abuse reporting. The 202 00:14:00,160 --> 00:14:04,200 Speaker 1: police had no idea this was happening until spring twenty 203 00:14:04,240 --> 00:14:06,679 Speaker 1: twenty four, when a mother told them. 204 00:14:06,640 --> 00:14:09,880 Speaker 4: Lancaster Country Day High School students staged a walkout 205 00:14:09,880 --> 00:14:13,520 Speaker 4: this morning in protest of the administration's handling of the 206 00:14:13,559 --> 00:14:16,760 Speaker 4: private school's ongoing AI nude photo scandal.
207 00:14:16,520 --> 00:14:20,400 Speaker 1: In November twenty twenty four, hundreds of 208 00:14:20,520 --> 00:14:25,120 Speaker 1: students and faculty walked out in protest of how the 209 00:14:25,120 --> 00:14:30,000 Speaker 1: school had handled the situation. Classes were canceled. One student 210 00:14:30,040 --> 00:14:33,120 Speaker 1: said she thought the action sent a strong message to the school, that 211 00:14:33,240 --> 00:14:35,840 Speaker 1: students are upset with how the adults handled the AI 212 00:14:35,920 --> 00:14:40,760 Speaker 1: situation and that girls don't feel safe at school. Parents 213 00:14:40,840 --> 00:14:44,960 Speaker 1: were outraged about what they believed was negligence on the 214 00:14:45,000 --> 00:14:48,560 Speaker 1: part of the school, and they sued. As of the 215 00:14:48,680 --> 00:14:52,880 Speaker 1: end of January, the case is ongoing. Officials from the school 216 00:14:53,080 --> 00:14:56,080 Speaker 1: told us they cannot discuss the details of the matter. 217 00:14:56,480 --> 00:15:00,000 Speaker 1: They said they are working on caring for their students 218 00:15:00,200 --> 00:15:05,760 Speaker 1: and are in the process of reviewing school policies. They also added 219 00:15:05,840 --> 00:15:09,680 Speaker 1: that they've begun additional staff training on child protection and 220 00:15:09,800 --> 00:15:16,800 Speaker 1: mandatory reporting. In the meantime, the suspects in the Lancaster 221 00:15:16,920 --> 00:15:21,920 Speaker 1: case have been charged with felony offenses, including sexual abuse 222 00:15:22,000 --> 00:15:26,640 Speaker 1: of children and the dissemination of child sexual abuse materials. 223 00:15:27,520 --> 00:15:32,640 Speaker 1: They're due to enter pleas soon. Police haven't shared their ages, 224 00:15:33,160 --> 00:15:36,800 Speaker 1: but according to local news reports, at least one of 225 00:15:36,840 --> 00:15:41,120 Speaker 1: them was in ninth grade at the time. They will 226 00:15:41,160 --> 00:15:46,400 Speaker 1: be tried as minors. In Laurel's experience, that could mean 227 00:15:46,520 --> 00:15:51,240 Speaker 1: jail time or being sent to a rehabilitation facility. 228 00:15:51,240 --> 00:15:55,000 Speaker 4: A facility that helps with the sexualized behaviors. It's 229 00:15:55,080 --> 00:15:57,800 Speaker 4: more of a treatment than a punishment, I guess you 230 00:15:57,800 --> 00:15:58,280 Speaker 4: could say. 231 00:15:58,720 --> 00:15:59,960 Speaker 7: So that's one possibility. 232 00:16:00,080 --> 00:16:02,600 Speaker 4: The other possibility is just a detention center like you 233 00:16:02,640 --> 00:16:06,920 Speaker 4: would expect there to be. But it's nice that we 234 00:16:07,000 --> 00:16:11,160 Speaker 4: have that option. We have done that before, where 235 00:16:11,160 --> 00:16:14,400 Speaker 4: you send them to a treatment facility to, again, fix 236 00:16:14,480 --> 00:16:17,800 Speaker 4: the behavior so it doesn't continue into adulthood. 237 00:16:18,680 --> 00:16:22,680 Speaker 3: Because that's the other issue in these cases that have 238 00:16:22,720 --> 00:16:25,680 Speaker 3: popped up across the country. The people who are behind 239 00:16:25,720 --> 00:16:30,600 Speaker 3: deep fakes, non consensual pornography and the harassment are often 240 00:16:30,640 --> 00:16:34,520 Speaker 3: the same age as their victims.
As deep fake technology 241 00:16:34,560 --> 00:16:39,080 Speaker 3: becomes ubiquitous, intuitive, and easier to use, the users become 242 00:16:39,240 --> 00:16:43,000 Speaker 3: younger and younger, coming to these sites right when their 243 00:16:43,040 --> 00:16:47,440 Speaker 3: desires and their concept of sexuality are starting to be hardwired. 244 00:16:48,600 --> 00:16:52,280 Speaker 3: And while they're exploring this part of themselves, instead of 245 00:16:52,280 --> 00:16:55,520 Speaker 3: grabbing a Playboy from underneath Dad's bed or sneaking a 246 00:16:55,520 --> 00:16:59,120 Speaker 3: glance at Skinemax, they can now make and trade pictures 247 00:16:59,200 --> 00:17:03,120 Speaker 3: of people that they actually know, without necessarily understanding the 248 00:17:03,160 --> 00:17:07,000 Speaker 3: consequences, both psychologically and legally. 249 00:17:08,119 --> 00:17:10,120 Speaker 7: These kids can do it in the privacy of their bedroom. 250 00:17:10,119 --> 00:17:11,720 Speaker 4: No one knows that they're doing it, and that just 251 00:17:11,800 --> 00:17:15,280 Speaker 4: creates this start down the line of deviancy, where now 252 00:17:15,320 --> 00:17:17,920 Speaker 4: it's, okay, well, if I can look at that, then 253 00:17:18,119 --> 00:17:20,479 Speaker 4: maybe I should try girls my age, which is what 254 00:17:20,520 --> 00:17:23,760 Speaker 4: we see a lot. They're 255 00:17:23,800 --> 00:17:26,960 Speaker 4: looking at pornographic images of kids their age, which for 256 00:17:27,040 --> 00:17:32,000 Speaker 4: them doesn't seem wrong because it's their age, but it 257 00:17:32,080 --> 00:17:34,080 Speaker 4: is still child pornography. 258 00:17:35,080 --> 00:17:38,280 Speaker 1: Of course. And I guess if they can't find pictures 259 00:17:38,320 --> 00:17:40,720 Speaker 1: on the Internet of girls their age, now there are 260 00:17:40,760 --> 00:17:58,200 Speaker 1: apps allowing them to create them themselves. Yes, at this point, 261 00:17:58,600 --> 00:18:02,639 Speaker 1: it isn't a crime under federal law to create and share 262 00:18:02,880 --> 00:18:07,560 Speaker 1: deep fake pornography, and the Supreme Court hasn't ruled on 263 00:18:07,640 --> 00:18:11,040 Speaker 1: what can be deep faked legally and what can't be. 264 00:18:11,960 --> 00:18:16,439 Speaker 1: There is a constitutional prerogative to protect free speech, but 265 00:18:17,040 --> 00:18:21,040 Speaker 1: stopping non consensual deep fake pornography is one of the 266 00:18:21,119 --> 00:18:27,199 Speaker 1: few things that nearly everyone in Congress openly supports. In fact, 267 00:18:27,640 --> 00:18:32,280 Speaker 1: in twenty twenty four, the Senate unanimously passed a bipartisan 268 00:18:32,320 --> 00:18:36,359 Speaker 1: measure that would regulate deep fake pornography, making it a 269 00:18:36,440 --> 00:18:42,000 Speaker 1: federal offense to make and distribute non consensual deep fake porn, 270 00:18:42,880 --> 00:18:46,760 Speaker 1: and it would require social media companies to remove the 271 00:18:46,840 --> 00:18:54,000 Speaker 1: images within forty eight hours of being reported. Companies like Meta, Google, 272 00:18:54,160 --> 00:18:58,920 Speaker 1: and TikTok all came out in support.
The bill died 273 00:18:58,960 --> 00:19:01,280 Speaker 1: in the House at the end of twenty twenty four, 274 00:19:01,720 --> 00:19:04,520 Speaker 1: but its backers say it's a priority for them to 275 00:19:04,640 --> 00:19:08,120 Speaker 1: reintroduce and pass it into law in the new session. 276 00:19:10,160 --> 00:19:13,119 Speaker 1: And while states are trying to pass their own laws, 277 00:19:13,640 --> 00:19:16,720 Speaker 1: many are written so that only people under the age 278 00:19:16,720 --> 00:19:22,000 Speaker 1: of eighteen can be considered victims of deep fake pornography. Still, 279 00:19:22,080 --> 00:19:27,040 Speaker 1: other states, like New York, have now expanded existing revenge 280 00:19:27,080 --> 00:19:32,280 Speaker 1: porn statutes to include deep fakes, but this can mean 281 00:19:32,440 --> 00:19:36,400 Speaker 1: that a victim has to prove intent, that the person 282 00:19:36,480 --> 00:19:40,560 Speaker 1: who made and posted the image actually wanted to do 283 00:19:40,720 --> 00:19:47,439 Speaker 1: them harm. In Pennsylvania, law enforcement officials like Laurel are 284 00:19:47,560 --> 00:19:52,080 Speaker 1: using child sexual abuse statutes to charge minors in non 285 00:19:52,119 --> 00:19:57,960 Speaker 1: consensual pornography cases. Late last year, Pennsylvania passed a bill 286 00:19:58,119 --> 00:20:03,600 Speaker 1: expanding the definition of the quote unlawful dissemination of an 287 00:20:03,600 --> 00:20:10,760 Speaker 1: intimate image to include deep fake pornography. So whatever legal 288 00:20:10,800 --> 00:20:20,760 Speaker 1: protections do exist are patchwork, uneven and untested. But there 289 00:20:20,840 --> 00:20:26,840 Speaker 1: are other officials going after the people profiting from this technology. 290 00:20:27,280 --> 00:20:30,160 Speaker 1: If you could just start by introducing yourself, that would 291 00:20:30,160 --> 00:20:31,080 Speaker 1: be perfect. 292 00:20:31,720 --> 00:20:35,800 Speaker 6: Sure. I'm David Chiu. I'm the City Attorney of San Francisco. 293 00:20:36,640 --> 00:20:39,560 Speaker 3: David Chiu had first heard about nudify apps through the 294 00:20:39,600 --> 00:20:43,400 Speaker 3: women in his office, who were both lawyers and mothers 295 00:20:43,440 --> 00:20:47,440 Speaker 3: of teenage girls. He said he realized how dangerous these 296 00:20:47,440 --> 00:20:50,679 Speaker 3: websites and apps could be, and he decided to act. 297 00:20:51,320 --> 00:20:53,919 Speaker 6: We have brought a first of its kind lawsuit against 298 00:20:53,960 --> 00:21:00,159 Speaker 6: the world's largest websites engaged in AI facilitated exploitation. We 299 00:21:00,200 --> 00:21:04,560 Speaker 6: have sued sixteen of the most visited websites that create 300 00:21:04,640 --> 00:21:08,960 Speaker 6: and distribute AI generated non consensual pornography. 301 00:21:09,600 --> 00:21:12,640 Speaker 3: Chiu's office filed the suit in August twenty twenty four, 302 00:21:13,480 --> 00:21:16,960 Speaker 3: arguing the nudify sites not only violate federal and 303 00:21:17,040 --> 00:21:21,760 Speaker 3: state criminal statutes against revenge pornography and child pornography, but 304 00:21:22,000 --> 00:21:26,000 Speaker 3: also California's unfair competition law, which regulates how companies 305 00:21:26,040 --> 00:21:27,320 Speaker 3: can advertise online.
306 00:21:27,720 --> 00:21:31,320 Speaker 6: Collectively, these websites have been visited over two hundred million 307 00:21:31,440 --> 00:21:34,320 Speaker 6: times in the first six months of this year, 308 00:21:34,359 --> 00:21:38,240 Speaker 6: twenty twenty four. These operators are profiting from this content. 309 00:21:38,359 --> 00:21:42,280 Speaker 6: They require users to subscribe or pay to generate images, 310 00:21:42,840 --> 00:21:45,320 Speaker 6: and they know exactly what they're doing. They know that 311 00:21:45,400 --> 00:21:49,760 Speaker 6: this is non consensual. These images are incredibly realistic. There's 312 00:21:49,800 --> 00:21:53,800 Speaker 6: one website that specifically promoted the non consensual nature of this, 313 00:21:54,320 --> 00:21:58,000 Speaker 6: and I quote, imagine wasting time taking her out on 314 00:21:58,119 --> 00:22:01,520 Speaker 6: dates when you can just use website xxx to get 315 00:22:01,560 --> 00:22:05,320 Speaker 6: her nudes. And so our lawsuit, among other things, is 316 00:22:05,359 --> 00:22:06,800 Speaker 6: seeking to shut them down. 317 00:22:09,080 --> 00:22:11,280 Speaker 1: Have you heard from any of the companies named in 318 00:22:11,320 --> 00:22:12,000 Speaker 1: the lawsuit? 319 00:22:13,040 --> 00:22:16,440 Speaker 6: It will be interesting to see if they scurry out 320 00:22:16,480 --> 00:22:18,600 Speaker 6: of the darkest corners of the Internet to defend this. 321 00:22:22,600 --> 00:22:24,879 Speaker 3: Chiu says his office hasn't heard from many of the 322 00:22:24,920 --> 00:22:28,000 Speaker 3: companies named in the lawsuit, and his office might be 323 00:22:28,000 --> 00:22:31,840 Speaker 3: waiting a while, because these sixteen websites and apps are 324 00:22:31,880 --> 00:22:35,679 Speaker 3: being developed all over the world. One of these websites 325 00:22:36,040 --> 00:22:38,800 Speaker 3: is Draw Nudes, the same one Olivia and I used 326 00:22:38,800 --> 00:22:41,040 Speaker 3: to make a deep fake of me. You can find 327 00:22:41,040 --> 00:22:45,880 Speaker 3: it easily, but finding out who runs the app, that's trickier. 328 00:22:46,680 --> 00:22:48,560 Speaker 3: We tried following the money when it came to the 329 00:22:48,640 --> 00:22:52,600 Speaker 3: Draw Nudes app. We paid eight dollars for their premium service, 330 00:22:53,359 --> 00:22:56,159 Speaker 3: which allowed us a window into how they collect the cash. 331 00:22:56,640 --> 00:22:58,960 Speaker 3: Making a payment isn't as easy as typing in your 332 00:22:58,960 --> 00:23:02,120 Speaker 3: credit card. So when I was trying to pay, it 333 00:23:02,160 --> 00:23:05,440 Speaker 3: takes you off the app and you talk to these dealers. 334 00:23:05,880 --> 00:23:09,920 Speaker 3: It's a live chat, and you'll say to them, hey, 335 00:23:10,480 --> 00:23:14,000 Speaker 3: you have a specific link that corresponds to your account, 336 00:23:14,560 --> 00:23:17,560 Speaker 3: and they'll say, okay, here's a PayPal link. Go to 337 00:23:17,600 --> 00:23:20,640 Speaker 3: that PayPal account, give me the transaction ID, and then 338 00:23:20,720 --> 00:23:24,159 Speaker 3: I will go and kind of put those coins that 339 00:23:24,200 --> 00:23:28,160 Speaker 3: you get, the eight dollars worth of coins, into your app. 340 00:23:28,720 --> 00:23:31,080 Speaker 1: Why are there so many steps to get coins to 341 00:23:31,440 --> 00:23:35,600 Speaker 3: do this? I think there's a couple of reasons.
It 342 00:23:35,640 --> 00:23:38,160 Speaker 3: could be that they are just trying to hide their tracks. 343 00:23:39,119 --> 00:23:41,960 Speaker 3: If you are paying people to use their PayPal accounts, 344 00:23:42,160 --> 00:23:44,959 Speaker 3: there's no link back to you, so there's no paper record. 345 00:23:45,960 --> 00:23:49,040 Speaker 3: The other reason that you know that it's not just 346 00:23:49,040 --> 00:23:51,600 Speaker 3: a simple use-your-credit-card thing is because a 347 00:23:51,640 --> 00:23:56,040 Speaker 3: lot of companies, payment processing companies, are now banning these 348 00:23:56,119 --> 00:24:01,280 Speaker 3: kinds of apps. The truth is, it's not really complicated 349 00:24:01,320 --> 00:24:04,719 Speaker 3: for the consumer, but those extra steps make it harder 350 00:24:04,840 --> 00:24:08,760 Speaker 3: to determine the identity and the finances of the developers. 351 00:24:09,800 --> 00:24:13,439 Speaker 3: And that's exactly who we wanted to find. So we 352 00:24:13,480 --> 00:24:15,879 Speaker 3: went looking for the guy whose name appeared to be 353 00:24:15,920 --> 00:24:18,920 Speaker 3: connected to Draw Nudes, to ask him about a lot 354 00:24:18,960 --> 00:24:28,720 Speaker 3: of things, including the lawsuit. I was looking for a 355 00:24:28,760 --> 00:24:31,720 Speaker 3: guy named Cyril. What I know about him is he 356 00:24:31,760 --> 00:24:34,960 Speaker 3: appears to be in his early thirties and he's from Ukraine. 357 00:24:35,680 --> 00:24:38,040 Speaker 3: His hair is long enough to tuck behind both ears. 358 00:24:38,520 --> 00:24:40,639 Speaker 3: He looks more like a DJ than a tech mogul. 359 00:24:41,160 --> 00:24:44,199 Speaker 3: He has an active social media presence and appears to 360 00:24:44,200 --> 00:24:47,200 Speaker 3: have a model girlfriend. He moved to the United States 361 00:24:47,240 --> 00:24:50,639 Speaker 3: before the war, in twenty twenty one, and lives somewhere 362 00:24:50,680 --> 00:24:56,360 Speaker 3: in the US. Cyril has several businesses. He's talked publicly 363 00:24:56,400 --> 00:24:58,919 Speaker 3: about one of them, which acts as a sort of 364 00:24:58,920 --> 00:25:03,600 Speaker 3: broker for internet ads. For a fee, Cyril's company helps a 365 00:25:03,680 --> 00:25:06,840 Speaker 3: business advertise on the web. Here he is at 366 00:25:06,840 --> 00:25:10,920 Speaker 3: a conference in Thailand in December, talking about marketing supplements. 367 00:25:11,800 --> 00:25:15,400 Speaker 3: This one promises everything from weight loss to bigger muscles 368 00:25:15,680 --> 00:25:16,840 Speaker 3: and better erections. 369 00:25:17,240 --> 00:25:19,959 Speaker 8: I run the agency and we help, like, we 370 00:25:19,960 --> 00:25:23,720 Speaker 8: helped some nutra guys, like, to scale, or 371 00:25:23,760 --> 00:25:27,320 Speaker 8: like to build the nutra offers or nutra white-hat brands. 372 00:25:28,640 --> 00:25:31,600 Speaker 3: But it was another one of Cyril's companies that caught 373 00:25:31,640 --> 00:25:35,520 Speaker 3: our attention: Sole Ecom, which was listed on the 374 00:25:35,520 --> 00:25:38,440 Speaker 3: bottom of the Draw Nudes homepage and in the David 375 00:25:38,520 --> 00:25:42,760 Speaker 3: Chiu lawsuit. Sole Ecom has a registered address in Los Angeles. 376 00:25:43,560 --> 00:25:46,600 Speaker 3: I spent weeks trying to contact Cyril and anyone who 377 00:25:46,720 --> 00:25:49,919 Speaker 3: might know him.
That's when we tried knocking on the 378 00:25:49,920 --> 00:25:54,720 Speaker 3: place listed as Sole Ecom's office. Hi, my name is Margie. 379 00:25:55,240 --> 00:25:57,880 Speaker 7: Are you Pilo? Oh? 380 00:25:57,960 --> 00:26:00,000 Speaker 2: Okay, so I'm a reporter from... 381 00:25:59,720 --> 00:26:02,359 Speaker 3: The person who answered said they'd never heard of 382 00:26:02,400 --> 00:26:05,680 Speaker 3: his name or the business. But more than a month 383 00:26:05,720 --> 00:26:09,520 Speaker 3: after I started looking for him, Cyril finally picked up 384 00:26:09,560 --> 00:26:14,000 Speaker 3: the phone. We talked for maybe half an hour. During 385 00:26:14,040 --> 00:26:17,520 Speaker 3: the conversation, Cyril was adamant that Sole Ecom and Draw 386 00:26:17,560 --> 00:26:22,120 Speaker 3: Nudes were separate companies. He said that, like his other clients, 387 00:26:22,480 --> 00:26:25,200 Speaker 3: the developers of the Draw Nudes app had come to him 388 00:26:25,440 --> 00:26:28,960 Speaker 3: for help marketing their product. He also said he had 389 00:26:29,000 --> 00:26:34,080 Speaker 3: actually met the app's owners, two men, he says, one 390 00:26:34,400 --> 00:26:36,879 Speaker 3: a Russian living in Cyprus who was big in the 391 00:26:36,920 --> 00:26:41,600 Speaker 3: porn industry, the other a Belarusian who lived in the States. 392 00:26:42,560 --> 00:26:45,840 Speaker 3: Cyril said that after reading several negative articles about the 393 00:26:45,920 --> 00:26:49,600 Speaker 3: Draw Nudes app, he told the developers he didn't want 394 00:26:49,600 --> 00:26:52,720 Speaker 3: to work with them anymore. Cyril told me that he 395 00:26:52,800 --> 00:26:55,680 Speaker 3: and Sole Ecom had not received any form of payment 396 00:26:55,720 --> 00:27:00,159 Speaker 3: from Draw Nudes or its developers. We spoke several times, but 397 00:27:00,280 --> 00:27:04,520 Speaker 3: after our last conversation, promising to speak again, we hung 398 00:27:04,640 --> 00:27:09,600 Speaker 3: up, and then he ghosted me. I emailed, rang, sent 399 00:27:09,680 --> 00:27:14,159 Speaker 3: Telegram messages, booked time on his Calendly. All the while, 400 00:27:14,560 --> 00:27:18,160 Speaker 3: I watched Cyril attend a tech conference in Thailand, travel 401 00:27:18,240 --> 00:27:21,760 Speaker 3: through Italy, and spend Christmas on a luxury ski vacation 402 00:27:22,080 --> 00:27:26,800 Speaker 3: in France. As Cyril was off avoiding me and living 403 00:27:26,840 --> 00:27:29,879 Speaker 3: his best life, twenty twenty four came to a close, 404 00:27:30,560 --> 00:27:35,679 Speaker 3: and with it, the Draw Nudes app and website went offline. Meanwhile, 405 00:27:35,880 --> 00:27:38,000 Speaker 3: I was looking at other nudify apps named in the 406 00:27:38,040 --> 00:27:43,400 Speaker 3: California lawsuit, one of which is called Clothoff. It's even 407 00:27:43,400 --> 00:27:47,160 Speaker 3: bigger than Draw Nudes, with almost twenty seven million visitors 408 00:27:47,240 --> 00:27:50,879 Speaker 3: last year alone. As I clicked around the site and 409 00:27:50,920 --> 00:27:54,840 Speaker 3: the app, I was struck by how similar Clothoff, 410 00:27:55,160 --> 00:27:59,480 Speaker 3: Draw Nudes, and another site called Undress App were. They 411 00:27:59,520 --> 00:28:04,640 Speaker 3: all have the same interface, same graphics, same text, same design, 412 00:28:05,600 --> 00:28:11,399 Speaker 3: the same support assistant infrastructure.
In fact, many of the 413 00:28:11,440 --> 00:28:15,200 Speaker 3: top nudifying apps seemed to be shell websites that hyperlinked 414 00:28:15,280 --> 00:28:18,720 Speaker 3: to each other, moving Internet traffic around to get people 415 00:28:18,760 --> 00:28:21,720 Speaker 3: to pay up for a smaller group of services that 416 00:28:21,840 --> 00:28:25,879 Speaker 3: all worked in exactly the same way. And when I 417 00:28:25,960 --> 00:28:29,000 Speaker 3: was using Draw Nudes and had a question for customer service, 418 00:28:29,640 --> 00:28:34,200 Speaker 3: up popped the Clothoff name and logo. Same customer service, 419 00:28:34,920 --> 00:28:39,239 Speaker 3: same company. I contacted customer service again to ask if 420 00:28:39,320 --> 00:28:42,120 Speaker 3: Clothoff or Draw Nudes had been bought or sold recently. 421 00:28:43,080 --> 00:28:46,320 Speaker 3: I was assured that the owners hadn't changed, but that the 422 00:28:46,360 --> 00:28:49,480 Speaker 3: Clothoff and Draw Nudes websites had moved to a 423 00:28:49,520 --> 00:28:55,040 Speaker 3: different domain for quote, greater online recognition and to speed 424 00:28:55,160 --> 00:29:00,640 Speaker 3: up its performance, which may very well be true. But 425 00:29:00,760 --> 00:29:04,040 Speaker 3: another effect of switching domains is to make it really 426 00:29:04,080 --> 00:29:07,640 Speaker 3: difficult to trace the owners and hold them legally accountable. 427 00:29:11,240 --> 00:29:16,560 Speaker 3: All signs point to one likely explanation. Clothoff, Draw Nudes, 428 00:29:16,800 --> 00:29:20,120 Speaker 3: and the Undress app, all sites named in the California 429 00:29:20,160 --> 00:29:23,880 Speaker 3: suit, may be the work of the same people. We 430 00:29:23,920 --> 00:29:27,400 Speaker 3: attempted to contact the sites via their service email addresses, 431 00:29:27,600 --> 00:29:32,160 Speaker 3: but got no response. Again, a handful of people may 432 00:29:32,160 --> 00:29:36,200 Speaker 3: be behind three of the most widely used nudify apps 433 00:29:36,240 --> 00:29:40,040 Speaker 3: in the country, and the impact of their product is 434 00:29:40,160 --> 00:29:44,400 Speaker 3: at best terrorizing young women and girls, and at worst 435 00:29:45,040 --> 00:29:49,120 Speaker 3: changing young men and women. It's changing how they relate 436 00:29:49,160 --> 00:29:53,640 Speaker 3: to the world and each other. While Apple and Google 437 00:29:53,800 --> 00:29:57,240 Speaker 3: have banned a number of nudify products, apps, and websites 438 00:29:57,320 --> 00:30:00,360 Speaker 3: from their app stores, the sites can readily be found 439 00:30:00,440 --> 00:30:05,200 Speaker 3: on Google or Safari. Developers keep putting them out, and anyone, 440 00:30:05,960 --> 00:30:09,840 Speaker 3: anyone, can use them. If this lawsuit fails to stop 441 00:30:09,880 --> 00:30:13,440 Speaker 3: the influx of these apps, David Chiu from California says 442 00:30:13,680 --> 00:30:16,120 Speaker 3: there are other ways to defang them. 443 00:30:16,680 --> 00:30:23,040 Speaker 6: There is a universe of other technology players and technology 444 00:30:23,040 --> 00:30:26,560 Speaker 6: infrastructure in and around these websites. 445 00:30:27,080 --> 00:30:32,000 Speaker 3: He's talking about Internet service providers, search engines, login authenticators, 446 00:30:32,200 --> 00:30:36,440 Speaker 3: or payment processors. It sounds boring, but this is how 447 00:30:36,480 --> 00:30:41,640 Speaker 3: tech works.
Software working in tandem with online businesses to 448 00:30:41,720 --> 00:30:45,840 Speaker 3: make even the most basic sites run. After being called 449 00:30:45,880 --> 00:30:48,840 Speaker 3: out for their relationships with the nudify apps, some of 450 00:30:48,880 --> 00:30:51,720 Speaker 3: these tech companies have already said they're going to stop 451 00:30:51,760 --> 00:30:55,360 Speaker 3: working with them. 452 00:30:55,400 --> 00:30:59,840 Speaker 1: But there are also richly capitalized tech companies like 453 00:31:00,000 --> 00:31:04,520 Speaker 1: Stability AI and Midjourney that are creating powerful generative 454 00:31:04,560 --> 00:31:09,400 Speaker 1: AI tools. These companies say they've put guardrails in place 455 00:31:09,800 --> 00:31:13,280 Speaker 1: to prevent anyone from using the tech to create deep 456 00:31:13,320 --> 00:31:17,760 Speaker 1: fake porn, but there are ways to get around them, 457 00:31:18,200 --> 00:31:21,680 Speaker 1: by using the older open source versions of the tech 458 00:31:22,160 --> 00:31:27,560 Speaker 1: or using special prompts to circumvent the word filters. Stability 459 00:31:27,640 --> 00:31:31,080 Speaker 1: AI declined to be interviewed and we didn't hear back 460 00:31:31,080 --> 00:31:32,120 Speaker 1: from Midjourney. 461 00:31:33,240 --> 00:31:37,760 Speaker 3: And nudify apps aren't being pushed into the shadows. Giants 462 00:31:37,880 --> 00:31:41,680 Speaker 3: like Meta are allowing these companies to buy ads and 463 00:31:41,840 --> 00:31:46,560 Speaker 3: market to consumers on their massive platforms. By some reports, 464 00:31:46,960 --> 00:31:50,560 Speaker 3: nearly ninety percent of traffic to certain nudify apps is 465 00:31:50,640 --> 00:31:55,240 Speaker 3: driven by ads on Facebook or Instagram. We emailed their 466 00:31:55,360 --> 00:31:59,040 Speaker 3: owner, Meta, which told us these apps and these ads 467 00:31:59,400 --> 00:32:02,480 Speaker 3: break their rules. They remove the ads when they become 468 00:32:02,520 --> 00:32:06,240 Speaker 3: aware of them, disable the accounts responsible, and block links 469 00:32:06,240 --> 00:32:10,200 Speaker 3: to the host websites. But the spokesperson noted this is 470 00:32:10,240 --> 00:32:14,080 Speaker 3: an ongoing battle. The people behind these apps are constantly 471 00:32:14,120 --> 00:32:17,920 Speaker 3: evolving their tactics to stay ahead of the game. If blocked, 472 00:32:18,200 --> 00:32:20,720 Speaker 3: they just pop back up again under a new domain name. 473 00:32:21,840 --> 00:32:26,920 Speaker 6: Our hope is, with more public attention to what's happening here, 474 00:32:26,960 --> 00:32:29,400 Speaker 6: as well as letting people know about the lawsuit that 475 00:32:29,400 --> 00:32:33,000 Speaker 6: we're involved in, that other players will step up to 476 00:32:33,080 --> 00:32:36,280 Speaker 6: do their part to help all of us crack down 477 00:32:36,280 --> 00:32:40,920 Speaker 6: on these bad actors and their use of AI exploitative 478 00:32:41,160 --> 00:32:43,320 Speaker 6: technologies to abuse real people. 479 00:32:47,680 --> 00:32:52,920 Speaker 1: Artificial intelligence is part of our reality now. It's everywhere, 480 00:32:53,520 --> 00:32:57,840 Speaker 1: and it touches every single part of our lives. But 481 00:32:58,080 --> 00:33:02,480 Speaker 1: just because it exists doesn't mean it can't be controlled. 482 00:33:03,360 --> 00:33:08,920 Speaker 1: This isn't science fiction.
Robots aren't taking over, at least 483 00:33:09,520 --> 00:33:10,000 Speaker 1: not yet. 484 00:33:11,040 --> 00:33:13,440 Speaker 6: I have to say, one of the personal reasons why 485 00:33:13,760 --> 00:33:16,959 Speaker 6: I am so motivated for our team to get to 486 00:33:17,000 --> 00:33:18,680 Speaker 6: the bottom of this is I have an eight year 487 00:33:18,680 --> 00:33:21,320 Speaker 6: old son, and I'm worried in a few years that 488 00:33:21,440 --> 00:33:24,920 Speaker 6: if we don't get a handle on deep fake pornography, 489 00:33:24,960 --> 00:33:29,600 Speaker 6: that his generation is going to be impacted in ways 490 00:33:29,640 --> 00:33:33,480 Speaker 6: that we cannot anticipate today. Not just the impact on 491 00:33:33,680 --> 00:33:37,640 Speaker 6: his female classmates, but what this will do to boys 492 00:33:37,760 --> 00:33:42,240 Speaker 6: and teenagers as they grow up, learning about different norms 493 00:33:42,360 --> 00:33:47,000 Speaker 6: in how to engage with their classmates, their friends, people 494 00:33:47,040 --> 00:33:51,400 Speaker 6: they're going to have relationships with. This is horrifying, and unfortunately, 495 00:33:52,040 --> 00:33:56,560 Speaker 6: this AI facilitated exploitation, I'm worried, has opened up a 496 00:33:56,600 --> 00:34:00,600 Speaker 6: Pandora's box, and if we don't act quickly, it could 497 00:34:00,640 --> 00:34:03,560 Speaker 6: warp an entire generation of young people. 498 00:34:12,520 --> 00:34:15,839 Speaker 1: The Levittown case showed us that the stakes are high, 499 00:34:17,040 --> 00:34:21,680 Speaker 1: much higher than many of us realize. While their story 500 00:34:21,920 --> 00:34:25,200 Speaker 1: was one of the first, it took us beyond deep 501 00:34:25,280 --> 00:34:31,200 Speaker 1: fakes and tributing websites, beyond the hurt and fear experienced 502 00:34:31,239 --> 00:34:36,520 Speaker 1: by a couple dozen school kids. Even now, it stands 503 00:34:36,640 --> 00:34:37,280 Speaker 1: as a warning. 504 00:34:38,480 --> 00:34:43,360 Speaker 9: I'd hope that after this case, and his conviction, something's 505 00:34:43,480 --> 00:34:45,960 Speaker 9: changed in the way that they approach these kinds of 506 00:34:47,960 --> 00:34:51,680 Speaker 9: oversteppings of moral boundaries. 507 00:34:51,800 --> 00:34:56,040 Speaker 1: You know, I asked Cecilia, one of the Levittown women, 508 00:34:56,520 --> 00:34:59,960 Speaker 1: what she thought may help other people who were 509 00:35:00,040 --> 00:35:01,680 Speaker 1: experiencing what she did. 510 00:35:02,160 --> 00:35:08,920 Speaker 9: I'd hope that maybe after this we have more of 511 00:35:08,960 --> 00:35:12,280 Speaker 9: an idea of what to do or how to approach 512 00:35:12,320 --> 00:35:17,560 Speaker 9: it when someone does use AI or technology in a 513 00:35:17,600 --> 00:35:20,560 Speaker 9: way to harm others. You know, it was one of 514 00:35:20,640 --> 00:35:24,800 Speaker 9: the worst things I've ever been through, and not everyone 515 00:35:24,920 --> 00:35:28,960 Speaker 9: is either healed enough or lucky enough to have an 516 00:35:28,960 --> 00:35:35,120 Speaker 9: outlet to talk about that. But man, I really hope 517 00:35:35,160 --> 00:35:41,440 Speaker 9: that something can be done in terms of regulation so 518 00:35:41,480 --> 00:35:46,920 Speaker 9: that other people don't have to experience any aspect of 519 00:35:46,960 --> 00:35:49,080 Speaker 9: how it feels to have that done to you.
520 00:35:52,560 --> 00:35:55,560 Speaker 1: All the women we spoke to agreed to share these 521 00:35:55,600 --> 00:36:00,000 Speaker 1: stories because they recognized that there are still very few 522 00:36:00,239 --> 00:36:05,680 Speaker 1: safeguards in place to protect them or others, even now. 523 00:36:08,080 --> 00:36:12,080 Speaker 1: Patrick Carey was released from prison early for good behavior 524 00:36:12,600 --> 00:36:17,280 Speaker 1: in August twenty twenty three. He spent four months behind bars. 525 00:36:18,960 --> 00:36:22,759 Speaker 1: Cecilia found out the hard way. She drove past him 526 00:36:22,960 --> 00:36:25,719 Speaker 1: walking down the street one day and nearly had a 527 00:36:25,760 --> 00:36:31,719 Speaker 1: panic attack. Kayla has also seen Patrick just around the 528 00:36:31,760 --> 00:36:35,919 Speaker 1: corner from her house. She has become, in a way, 529 00:36:36,800 --> 00:36:40,880 Speaker 1: a de facto spokesperson, a role she may not have 530 00:36:40,960 --> 00:36:44,960 Speaker 1: asked for, but because of her experience, she says, she 531 00:36:45,080 --> 00:36:48,839 Speaker 1: can't stay silent. How do you feel that there are 532 00:36:48,880 --> 00:36:51,920 Speaker 1: still no federal laws preventing this today? 533 00:36:52,360 --> 00:36:56,920 Speaker 5: It's very frustrating, very frustrating. I don't understand any of it. 534 00:36:57,320 --> 00:36:59,680 Speaker 5: I really don't understand how hard it is for them 535 00:36:59,760 --> 00:37:03,440 Speaker 5: not to see that. There's plenty of cases you can 536 00:37:03,480 --> 00:37:06,279 Speaker 5: pull up and be like, this, this is this. Like, 537 00:37:07,120 --> 00:37:09,520 Speaker 5: how could you not make a change there? A change 538 00:37:09,600 --> 00:37:13,759 Speaker 5: needs to be made. It's unfortunate that we have to 539 00:37:13,760 --> 00:37:17,080 Speaker 5: go through this, but that doesn't mean you can't stop it. 540 00:37:18,239 --> 00:37:23,480 Speaker 5: There's something you can do. I mean, I don't know 541 00:37:23,520 --> 00:37:26,799 Speaker 5: how hard it is, but think about how hard it 542 00:37:26,840 --> 00:37:29,520 Speaker 5: is for me to speak out on this. I mean, 543 00:37:29,560 --> 00:37:32,720 Speaker 5: I shouldn't have to do this, but I am because 544 00:37:32,760 --> 00:37:35,720 Speaker 5: it's worth it, because it's worth getting the word out there, 545 00:37:36,280 --> 00:37:44,040 Speaker 5: because something needs to change. 546 00:37:44,880 --> 00:37:48,399 Speaker 3: This series is reported and hosted by Olivia Carville and 547 00:37:48,440 --> 00:37:53,000 Speaker 3: me, Margie Murphy. Produced by Kaleidoscope, led by Julia Nutta, 548 00:37:53,400 --> 00:37:57,759 Speaker 3: edited by Netta to Luis Semnani, produced by Darren Luckpott, 549 00:37:58,280 --> 00:38:03,000 Speaker 3: executive produced by Kate Osby. Original composition and mixing by 550 00:38:03,040 --> 00:38:07,520 Speaker 3: Steve Bone. Fact checking by Lena Chang. Our Bloomberg editors 551 00:38:07,640 --> 00:38:11,560 Speaker 3: are Caitlin Kenney and Jeff Grocott. Some additional reporting by 552 00:38:11,600 --> 00:38:16,080 Speaker 3: Samantha Stewart. Sage Bauman is Bloomberg's executive producer and head 553 00:38:16,120 --> 00:38:22,120 Speaker 3: of podcasting. Kristin Powers is our senior executive editor. Levittown 554 00:38:22,239 --> 00:38:27,440 Speaker 3: archival clips provided by Screen Ocean. Clips and footage from iHeart.
555 00:38:27,440 --> 00:38:30,799 Speaker 3: Our executive producers are Tyler Klang and Nikki 556 00:38:30,640 --> 00:38:35,920 Speaker 1: Ettore. Special thanks to Robert Friedman, Andrew Martin, Gilda Decale, 557 00:38:36,320 --> 00:38:41,440 Speaker 1: Andrew Ressio, Victor Evez and Sean Wenn at Bloomberg. Thanks 558 00:38:41,480 --> 00:38:46,640 Speaker 1: also to Alex Sarmas, Henry Ida, Matt Torre, Paul Manton, 559 00:38:47,000 --> 00:38:51,960 Speaker 1: and the Levittown Historical Society, Nassau County District Attorney Anne Donnelly, 560 00:38:52,400 --> 00:38:57,439 Speaker 1: and Nicole Turso. And thanks to all the young women 561 00:38:57,560 --> 00:39:00,719 Speaker 1: and parents in Levittown who spoke to us as we 562 00:39:00,840 --> 00:39:19,120 Speaker 1: reported the story.