1 00:00:01,240 --> 00:00:04,200 Speaker 1: Welcome to Tech Stuff. I'm Oz Woloshyn, and this is the 2 00:00:04,280 --> 00:00:07,960 Speaker 1: story this week. Something quite prescient. Have you ever been 3 00:00:08,039 --> 00:00:11,000 Speaker 1: deep faked? Or maybe this is just a new fear: 4 00:00:11,320 --> 00:00:14,080 Speaker 1: that photos of you end up online that are you, 5 00:00:14,480 --> 00:00:18,920 Speaker 1: but not really you? What do you do? For an 6 00:00:18,920 --> 00:00:23,520 Speaker 1: increasing number of people, especially women, this is becoming a reality, 7 00:00:24,160 --> 00:00:26,279 Speaker 1: so much so that a recent bill in Congress called 8 00:00:26,320 --> 00:00:29,000 Speaker 1: the Take It Down Act has found some incredibly rare 9 00:00:29,040 --> 00:00:33,199 Speaker 1: bipartisan support. The bill, sponsored by Republican Senator Ted 10 00:00:33,280 --> 00:00:37,800 Speaker 1: Cruz and Democratic Senator Amy Klobuchar, would make it illegal to 11 00:00:37,840 --> 00:00:41,280 Speaker 1: post explicit deep fakes. Melania Trump has also been a 12 00:00:41,320 --> 00:00:45,240 Speaker 1: vocal supporter. But the thing is, it isn't law yet, 13 00:00:45,640 --> 00:00:49,200 Speaker 1: and it might not be enough. A new podcast called 14 00:00:49,280 --> 00:00:54,360 Speaker 1: Levittown, from Bloomberg News and our company Kaleidoscope, takes listeners 15 00:00:54,360 --> 00:00:57,600 Speaker 1: on a sort of cyber thriller for the AI age. 16 00:00:57,640 --> 00:00:59,760 Speaker 1: It's the story of a group of young women in 17 00:00:59,800 --> 00:01:03,600 Speaker 1: the suburbs of Long Island who find naked fakes of 18 00:01:03,640 --> 00:01:07,080 Speaker 1: themselves online, and, when told there's nothing they can do 19 00:01:07,160 --> 00:01:11,360 Speaker 1: about it, set out to catch the perpetrator.
This ends 20 00:01:11,440 --> 00:01:14,360 Speaker 1: up connecting them to a web of online vigilantes and 21 00:01:14,400 --> 00:01:18,520 Speaker 1: cyber criminals taking advantage of a justice system not ready 22 00:01:18,600 --> 00:01:21,880 Speaker 1: for the reality of AI. This week on Tech Stuff, 23 00:01:22,160 --> 00:01:24,720 Speaker 1: we bring you the first episode in the series. If 24 00:01:24,800 --> 00:01:27,240 Speaker 1: you like what you hear, jump over to the Levittown 25 00:01:27,280 --> 00:01:29,920 Speaker 1: podcast feed and listen to the series in full. 26 00:01:31,319 --> 00:01:37,240 Speaker 2: The series explores sexualized imagery involving minors and violence. Please 27 00:01:37,280 --> 00:01:50,760 Speaker 2: take care when listening. Can we start from the beginning? 28 00:01:51,400 --> 00:01:52,680 Speaker 2: How did this all start for you? 29 00:01:55,560 --> 00:02:04,040 Speaker 3: So I can go back into it. I remember somehow all 30 00:02:04,120 --> 00:02:07,160 Speaker 3: these pictures from my Instagram and VSCO were all on 31 00:02:07,200 --> 00:02:14,040 Speaker 3: this website. There was one picture of me in a 32 00:02:14,080 --> 00:02:18,480 Speaker 3: bathing suit, well, it was me in a bathing suit 33 00:02:19,680 --> 00:02:26,520 Speaker 3: in one of my friends' backyards, and I didn't have 34 00:02:26,520 --> 00:02:33,160 Speaker 3: a bathing suit on anymore. And it was just me naked, 35 00:02:34,760 --> 00:02:38,959 Speaker 3: well, not me, but me with someone else's body parts 36 00:02:39,000 --> 00:02:49,880 Speaker 3: on my body parts. It was just boobs and body 37 00:02:49,880 --> 00:02:56,080 Speaker 3: parts, completely nude, that looked so much like my own body. 38 00:02:58,680 --> 00:03:02,480 Speaker 3: There were so many different pictures from my social media 39 00:03:02,680 --> 00:03:10,440 Speaker 3: on essentially a porn website.
I was just trying to 40 00:03:10,560 --> 00:03:13,920 Speaker 3: figure out exactly, like, where is this coming from? Like, 41 00:03:14,040 --> 00:03:17,800 Speaker 3: how are they finding my stuff? I didn't let people 42 00:03:18,360 --> 00:03:22,560 Speaker 3: follow me that I didn't know, so it was always 43 00:03:22,560 --> 00:03:24,280 Speaker 3: in the back of my head like, oh, it's someone 44 00:03:24,280 --> 00:03:27,120 Speaker 3: that I know. But how do you find out who 45 00:03:27,200 --> 00:03:33,160 Speaker 3: that someone is when you know so many people from school, soccer, 46 00:03:34,000 --> 00:03:34,800 Speaker 3: all these things? 47 00:03:37,960 --> 00:03:40,760 Speaker 2: Did you ever have any suspicions as to who it 48 00:03:41,000 --> 00:03:41,560 Speaker 2: was back then? 49 00:03:43,160 --> 00:03:46,000 Speaker 4: No, not at all. 50 00:03:46,680 --> 00:03:49,000 Speaker 3: It was always like, why is this happening? 51 00:03:49,080 --> 00:03:49,840 Speaker 4: Who is this? 52 00:04:09,880 --> 00:04:12,680 Speaker 2: Kayla was twenty years old and living at home with 53 00:04:12,720 --> 00:04:16,360 Speaker 2: her parents in a town that was once the picture 54 00:04:16,720 --> 00:04:21,480 Speaker 2: of the American dream, a Long Island suburb called Levittown. 55 00:04:23,080 --> 00:04:27,120 Speaker 2: We're driving around Levittown. A lot of white picket fences, 56 00:04:27,279 --> 00:04:31,880 Speaker 2: a lot of American flags. It has the feel of 57 00:04:32,000 --> 00:04:33,080 Speaker 2: going back in time. 58 00:04:33,800 --> 00:04:37,480 Speaker 5: This is Levittown, one of the most remarkable housing developments 59 00:04:37,520 --> 00:04:38,240 Speaker 5: ever conceived. 60 00:04:38,920 --> 00:04:41,520 Speaker 2: It's the kind of America that makes you think of 61 00:04:41,600 --> 00:04:47,640 Speaker 2: the nineteen fifties: cookie cutter single family houses.
Like someone 62 00:04:47,720 --> 00:04:52,160 Speaker 2: hit control C on the ideal suburban house and then 63 00:04:52,240 --> 00:04:57,200 Speaker 2: pasted it over and over on street after street. 64 00:04:57,720 --> 00:04:59,960 Speaker 5: The idea came to a man named Bill Levitt: 65 00:05:00,240 --> 00:05:03,159 Speaker 5: why not apply to the building of houses the 66 00:05:03,160 --> 00:05:05,760 Speaker 5: same principles that have brought other American industries to their 67 00:05:05,760 --> 00:05:07,760 Speaker 5: unexcelled peaks of efficiency and service? 68 00:05:08,160 --> 00:05:11,960 Speaker 2: Levittown was built for veterans returning from World War II 69 00:05:12,800 --> 00:05:18,280 Speaker 2: as the picture of white suburbia. Perfect houses, manicured 70 00:05:17,680 --> 00:05:20,960 Speaker 5: lawns. Scattered here and there throughout the huge area are 71 00:05:21,040 --> 00:05:24,400 Speaker 5: shopping centers where every type of product or service is 72 00:05:24,440 --> 00:05:28,960 Speaker 5: readily available: food stores, five and dimes, department stores. 73 00:05:33,640 --> 00:05:38,760 Speaker 2: Today we often talk about the future as digital, algorithms 74 00:05:38,800 --> 00:05:45,120 Speaker 2: and codes as the architecture of our lives. But back then, 75 00:05:45,560 --> 00:05:51,040 Speaker 2: Levittown was the imagined future. It just feels quite 76 00:05:51,760 --> 00:05:56,520 Speaker 2: suburban and safe, you know. It feels like the kind 77 00:05:56,560 --> 00:06:01,320 Speaker 2: of place where nothing really happens. And yet, 78 00:06:01,600 --> 00:06:06,239 Speaker 2: underneath the facade of perfect order, it's here I found 79 00:06:06,240 --> 00:06:10,279 Speaker 2: a story unfolding of a group of women faced with 80 00:06:10,360 --> 00:06:15,440 Speaker 2: a new reality none of them wanted, a technological reality 81 00:06:16,160 --> 00:06:22,320 Speaker 2: spinning out of control.
It's like, this is not the 82 00:06:22,440 --> 00:06:26,080 Speaker 2: setting for a horror movie, and yet that's exactly what played out. 83 00:06:27,240 --> 00:06:31,159 Speaker 2: But this isn't your average horror story, the kind where 84 00:06:31,160 --> 00:06:34,520 Speaker 2: a group of teens are hunted down until there's only 85 00:06:34,600 --> 00:06:39,560 Speaker 2: one left standing. In this story, they flip the script, 86 00:06:40,000 --> 00:06:46,680 Speaker 2: band together and fight back alongside some unexpected global allies, 87 00:06:47,520 --> 00:06:53,120 Speaker 2: forcing law enforcement and tech companies to sit up and 88 00:06:53,279 --> 00:07:08,919 Speaker 2: take notice. From iHeart Podcasts, Bloomberg and Kaleidoscope, this is Levittown. 89 00:07:09,320 --> 00:07:23,280 Speaker 2: I'm Olivia Carville. Over the summer, our producer Julia Nutter 90 00:07:23,360 --> 00:07:26,160 Speaker 2: and I made the drive out to Long Island to 91 00:07:26,240 --> 00:07:28,480 Speaker 2: meet Kayla at her family home. 92 00:07:31,600 --> 00:07:32,560 Speaker 4: I can hear music. 93 00:07:37,280 --> 00:07:38,120 Speaker 6: How are you? 94 00:07:38,880 --> 00:07:40,880 Speaker 4: Nice to see you too. How are you feeling? 95 00:07:42,480 --> 00:07:43,800 Speaker 7: Let me just do a quick levels check. 96 00:07:44,000 --> 00:07:45,520 Speaker 3: Oh, do you want me to put the bird in 97 00:07:45,560 --> 00:07:46,200 Speaker 3: another room? 98 00:07:46,480 --> 00:07:48,040 Speaker 7: I don't even hear the bird. 99 00:07:48,080 --> 00:07:48,480 Speaker 6: The bird. 100 00:07:48,760 --> 00:07:53,120 Speaker 2: Yeah, Kayla is like a firecracker. She's small but loud, 101 00:07:53,440 --> 00:07:56,880 Speaker 2: with brightly colored hair. Every time I've seen her, her 102 00:07:56,960 --> 00:08:01,440 Speaker 2: hair is a different color. She's covered in tattoos and piercings.
103 00:08:01,800 --> 00:08:03,680 Speaker 4: I was like, oh, yeah, what kind of bird is that? 104 00:08:03,800 --> 00:08:06,480 Speaker 4: A parakeet? So he likes to talk to himself a lot. 105 00:08:10,680 --> 00:08:13,680 Speaker 2: Do you feel comfortable showing us where you were when 106 00:08:13,720 --> 00:08:18,000 Speaker 2: you discovered the website? When you were showing it: yeah, 107 00:08:18,040 --> 00:08:21,760 Speaker 2: it's a little messy. Kayla's twenty-four now. When we 108 00:08:21,880 --> 00:08:24,600 Speaker 2: visited her, she was living in the same house she 109 00:08:24,680 --> 00:08:25,200 Speaker 2: grew up in. 110 00:08:26,120 --> 00:08:29,960 Speaker 3: Well, I see all your plants and it's great, as 111 00:08:30,000 --> 00:08:30,520 Speaker 3: you could tell. 112 00:08:30,560 --> 00:08:33,720 Speaker 2: I love my books. In high school, Kayla was an 113 00:08:33,760 --> 00:08:38,160 Speaker 2: elite athlete and a girl who moved easily between friend groups. 114 00:08:38,679 --> 00:08:41,439 Speaker 3: People make it seem like there's either a popular group 115 00:08:41,600 --> 00:08:43,800 Speaker 3: or, you know, so-called nerds. 116 00:08:45,160 --> 00:08:46,160 Speaker 4: I would like to say I was 117 00:08:46,160 --> 00:08:48,680 Speaker 3: in the middle of it, because I hung out with 118 00:08:48,720 --> 00:08:51,280 Speaker 3: a lot of the sports people who were considered popular. 119 00:08:51,640 --> 00:08:54,480 Speaker 3: But I had my friends who we would just sit 120 00:08:54,559 --> 00:08:58,560 Speaker 3: around and play video games, hang out, do funny things, 121 00:08:58,640 --> 00:09:09,240 Speaker 3: and make videos of us doing stupid little things. I 122 00:09:09,280 --> 00:09:12,000 Speaker 3: say this to my friends actually all the time: I 123 00:09:12,080 --> 00:09:14,719 Speaker 3: may not dress like it, but my inner person is 124 00:09:14,800 --> 00:09:18,360 Speaker 3: kind of emo. I listen to a lot of dark music.
125 00:09:18,480 --> 00:09:18,760 Speaker 6: Now. 126 00:09:18,920 --> 00:09:21,280 Speaker 4: I have the tattoos 127 00:09:20,679 --> 00:09:24,760 Speaker 3: all over, I have piercings all over, and I truly 128 00:09:24,920 --> 00:09:27,400 Speaker 3: just like it. It's not like I just like the color black. 129 00:09:27,600 --> 00:09:31,720 Speaker 4: I changed a lot. I have a 130 00:09:31,760 --> 00:09:33,560 Speaker 4: really big book tattoo on my leg. 131 00:09:34,120 --> 00:09:38,920 Speaker 3: I have a big wraparound of plants, because my whole 132 00:09:39,000 --> 00:09:42,200 Speaker 3: leg is going to be things I love, and plants 133 00:09:42,280 --> 00:09:43,240 Speaker 3: are another thing for me. 134 00:09:44,440 --> 00:09:48,520 Speaker 4: This one, it's one of my favorites. It's Tell Me 135 00:09:48,559 --> 00:09:49,400 Speaker 4: About Tomorrow. 136 00:09:50,679 --> 00:09:53,200 Speaker 3: Tell Me About Tomorrow is one of my favorite songs, 137 00:09:53,200 --> 00:09:56,160 Speaker 3: because whenever I'm in a depressive state or just not 138 00:09:56,280 --> 00:09:59,360 Speaker 3: doing good mentally, I play that song and it tells 139 00:09:59,360 --> 00:10:01,880 Speaker 3: you to tell me about tomorrow, tell me about the 140 00:10:01,920 --> 00:10:03,679 Speaker 3: next day that's going to go on in your life. 141 00:10:04,679 --> 00:10:09,760 Speaker 3: When I was going through this situation, I was not 142 00:10:09,880 --> 00:10:14,200 Speaker 3: mentally doing good at all, and it told me you 143 00:10:14,240 --> 00:10:18,439 Speaker 3: will have tomorrow, and you can have tomorrow. 144 00:10:20,480 --> 00:10:24,840 Speaker 2: It sounds like a lot of the tattoos are important 145 00:10:24,840 --> 00:10:28,800 Speaker 2: to you because they represent you moving forward, or healing, 146 00:10:29,000 --> 00:10:32,480 Speaker 2: or getting over this situation, as you described it.
Do 147 00:10:32,520 --> 00:10:34,960 Speaker 2: you feel comfortable getting into that now? 148 00:10:35,360 --> 00:10:35,680 Speaker 4: Yes. 149 00:10:39,480 --> 00:10:43,120 Speaker 3: I remember being in my bed, just laying down. It 150 00:10:43,200 --> 00:10:43,880 Speaker 3: was late at night. 151 00:10:44,840 --> 00:10:47,720 Speaker 2: This was in March twenty twenty, at the onset of 152 00:10:47,800 --> 00:10:51,960 Speaker 2: the COVID pandemic. She'd recently graduated from high school and 153 00:10:52,120 --> 00:10:53,840 Speaker 2: was stuck at home with her parents. 154 00:10:54,440 --> 00:10:58,520 Speaker 4: I was in that corner. My bed was up against 155 00:10:58,559 --> 00:11:02,360 Speaker 4: the wall. I was just laying in the bed. 156 00:11:03,080 --> 00:11:05,920 Speaker 3: I'm pretty sure I was on my phone, just scrolling 157 00:11:05,960 --> 00:11:08,160 Speaker 3: through Instagram or something, or on Snapchat. 158 00:11:09,160 --> 00:11:12,560 Speaker 2: She heard her dad walking up the stairs from the living 159 00:11:12,360 --> 00:11:14,320 Speaker 3: room. I can always hear when someone comes up 160 00:11:14,360 --> 00:11:15,240 Speaker 3: the stairs. And he knocked. 161 00:11:15,280 --> 00:11:17,840 Speaker 4: He always knocks on the door. So he knocked and 162 00:11:19,080 --> 00:11:20,080 Speaker 4: just like walked right in. 163 00:11:21,320 --> 00:11:24,440 Speaker 3: I thought automatically like I was in trouble or something. 164 00:11:24,920 --> 00:11:26,880 Speaker 3: But then I could just like tell like he was 165 00:11:27,000 --> 00:11:31,520 Speaker 3: just confused, and he was like a little hesitant to 166 00:11:31,600 --> 00:11:35,680 Speaker 3: like come up to me. I could tell, and he 167 00:11:35,800 --> 00:11:37,240 Speaker 3: was just like, what is this? 168 00:11:39,640 --> 00:11:41,640 Speaker 2: What is this? What was he showing you? 169 00:11:42,840 --> 00:11:45,719 Speaker 4: I think it was the website.
170 00:11:46,000 --> 00:11:49,439 Speaker 2: It was the faked picture of her naked, and next 171 00:11:49,480 --> 00:11:53,520 Speaker 2: to it a poll asking the site's users to rank 172 00:11:53,640 --> 00:11:58,160 Speaker 2: what they wanted to do to her sexually. This next 173 00:11:58,160 --> 00:12:01,400 Speaker 2: part is graphic, and it might be difficult to hear. 174 00:12:03,120 --> 00:12:11,400 Speaker 4: It was. I always had trouble saying these parts. 175 00:12:12,600 --> 00:12:13,080 Speaker 6: It was. 176 00:12:14,760 --> 00:12:21,320 Speaker 3: Drink her piss, milk her, have her drink my piss, 177 00:12:21,880 --> 00:12:24,920 Speaker 3: and there was something else, and I can't entirely remember 178 00:12:24,920 --> 00:12:25,760 Speaker 3: what the other one was. 179 00:12:26,800 --> 00:12:29,200 Speaker 2: So they were ranking what they wanted to do to 180 00:12:29,240 --> 00:12:29,880 Speaker 2: you sexually. 181 00:12:30,160 --> 00:12:33,400 Speaker 4: Yes, yeah. 182 00:12:33,520 --> 00:12:36,160 Speaker 2: She kept scrolling and it got worse. 183 00:12:36,840 --> 00:12:39,319 Speaker 3: You would see what they posted, like their nude pictures 184 00:12:39,440 --> 00:12:44,440 Speaker 3: and jerking off and coming on our pictures, even like 185 00:12:44,520 --> 00:12:50,439 Speaker 3: pictures of our pictures with their dicks there and the 186 00:12:50,520 --> 00:12:54,760 Speaker 3: ejaculation there, and then there was like some like 187 00:12:56,760 --> 00:13:01,120 Speaker 4: writings of, like, "Reaper." 188 00:13:07,520 --> 00:13:10,679 Speaker 3: My dad was very confused. He asked me what this 189 00:13:10,960 --> 00:13:15,400 Speaker 3: was and how did it get up there? And I 190 00:13:15,440 --> 00:13:19,200 Speaker 3: did not know, because I'd never seen it before, which 191 00:13:19,240 --> 00:13:25,320 Speaker 3: I obviously told him. 192 00:13:25,360 --> 00:13:28,120 Speaker 4: So that's how we found out about the website.
193 00:13:39,679 --> 00:13:42,880 Speaker 2: Kayla's dad is a big guy, big personality 194 00:13:43,080 --> 00:13:47,600 Speaker 2: like her, loud laugh. Back then, he was a police officer, 195 00:13:48,120 --> 00:13:51,600 Speaker 2: a beat cop really, for the Nassau County Police Department. 196 00:13:52,320 --> 00:13:54,600 Speaker 2: He didn't want to talk to us for this podcast, 197 00:13:55,200 --> 00:13:59,800 Speaker 2: but he's really protective of his kids, which means he 198 00:14:00,040 --> 00:14:04,720 Speaker 2: regularly googles their names, searching the Internet for anything related 199 00:14:04,760 --> 00:14:05,120 Speaker 2: to them. 200 00:14:05,840 --> 00:14:11,280 Speaker 3: For me, specifically, I was playing soccer competitively, so 201 00:14:13,040 --> 00:14:16,000 Speaker 3: when I was, you know, going to college and playing soccer, 202 00:14:16,040 --> 00:14:19,840 Speaker 3: then he just got nervous when they search things. He 203 00:14:19,880 --> 00:14:24,440 Speaker 3: doesn't want them to see things that are wrong or, you know, 204 00:14:24,800 --> 00:14:27,200 Speaker 3: like exactly what we found. 205 00:14:28,360 --> 00:14:31,040 Speaker 2: When he shows you the phone, what is the photograph 206 00:14:31,760 --> 00:14:34,640 Speaker 2: on the phone that you're looking at? The specific photo 207 00:14:34,680 --> 00:14:36,240 Speaker 2: that you first see. 208 00:14:36,480 --> 00:14:37,520 Speaker 4: I'm almost positive 209 00:14:37,600 --> 00:14:40,480 Speaker 8: it was the one with the bikini taken off my 210 00:14:40,520 --> 00:14:45,360 Speaker 8: body and it was just me naked, well, not me, 211 00:14:45,600 --> 00:14:49,880 Speaker 8: but me with someone else's body parts on my body 212 00:14:49,880 --> 00:14:56,640 Speaker 8: parts that looked exactly like my own.
213 00:14:57,400 --> 00:15:00,920 Speaker 2: Was there ever a moment, even a split-second 214 00:15:00,960 --> 00:15:03,920 Speaker 2: moment, that you thought it was you naked? Yes. 215 00:15:04,520 --> 00:15:06,880 Speaker 3: I definitely, like at first, like, I was like, oh 216 00:15:06,880 --> 00:15:08,600 Speaker 3: my god, that's me, like, how did they get that? 217 00:15:08,680 --> 00:15:13,200 Speaker 3: But then I was like, no, that's in my friend's backyard. 218 00:15:13,400 --> 00:15:16,720 Speaker 3: Like, I was obviously wearing a bikini, and I remember 219 00:15:16,720 --> 00:15:19,680 Speaker 3: the bikini. It was a blue one and it was 220 00:15:19,800 --> 00:15:22,480 Speaker 3: mesh a little bit. So that's why I would think 221 00:15:22,520 --> 00:15:26,560 Speaker 3: it was so easy for them to completely alter the picture. 222 00:15:27,400 --> 00:15:33,200 Speaker 3: But just, like, instantly I was like, whoa, how, 223 00:15:33,320 --> 00:15:37,080 Speaker 3: how did they get that picture of me? Like, I 224 00:15:37,120 --> 00:15:41,359 Speaker 3: never took a picture like that. It was frightening how 225 00:15:41,880 --> 00:15:42,960 Speaker 3: exact it looked. 226 00:15:44,840 --> 00:15:48,120 Speaker 2: The website it was on was called cum on printed 227 00:15:48,200 --> 00:15:52,480 Speaker 2: pics dot com, and it encouraged its members to do 228 00:15:52,640 --> 00:15:57,800 Speaker 2: exactly what that title implies: to print out photos of 229 00:15:57,880 --> 00:16:02,280 Speaker 2: girls and young women and masturbate to them, and then 230 00:16:03,000 --> 00:16:06,760 Speaker 2: to post a photo of the user's erection or ejaculation 231 00:16:07,480 --> 00:16:11,160 Speaker 2: on top of that image. This is a practice they 232 00:16:11,280 --> 00:16:12,600 Speaker 2: called tributing.
233 00:16:14,200 --> 00:16:17,320 Speaker 3: To have to see my dad find all that, and, 234 00:16:17,440 --> 00:16:23,360 Speaker 3: like, I don't know, like, just not understanding what's going on. 235 00:16:23,520 --> 00:16:25,400 Speaker 3: I didn't even know anything was going on. And then 236 00:16:25,440 --> 00:16:27,560 Speaker 3: my dad just coming in and seeing all that. It's 237 00:16:27,640 --> 00:16:31,640 Speaker 3: like, nobody wants to see that, but like to have 238 00:16:31,760 --> 00:16:36,960 Speaker 3: your dad see that, that's like not even uncomfortable, like 239 00:16:37,000 --> 00:16:41,440 Speaker 3: that's not even the word for it, but just like unimaginable. 240 00:16:46,680 --> 00:16:49,320 Speaker 2: I wish I could tell you that this website was 241 00:16:49,400 --> 00:16:52,720 Speaker 2: tucked away in the dark web, where users needed to 242 00:16:52,840 --> 00:16:56,720 Speaker 2: download a special browser or slink through some other secret 243 00:16:56,840 --> 00:17:01,320 Speaker 2: cyber door. But it wasn't. It was just out in 244 00:17:01,360 --> 00:17:07,119 Speaker 2: the open. Kayla was out in the open too. Postings 245 00:17:07,160 --> 00:17:10,480 Speaker 2: on the site included her real name, which is how 246 00:17:10,520 --> 00:17:14,520 Speaker 2: her dad found it. Sometimes the images posted to the 247 00:17:14,560 --> 00:17:20,199 Speaker 2: website were actual unaltered photos, revenge porn type stuff, but 248 00:17:20,359 --> 00:17:24,760 Speaker 2: often the original photos were taken from innocent social media 249 00:17:24,840 --> 00:17:29,920 Speaker 2: posts like Kayla's, and then the site's users turned those 250 00:17:29,960 --> 00:17:39,120 Speaker 2: images into non-consensual pornography, often with underage subjects. This 251 00:17:39,200 --> 00:17:42,800 Speaker 2: type of photo manipulation was new to Kayla, but it 252 00:17:42,880 --> 00:17:46,120 Speaker 2: wasn't new to me.
I've been starting to see more 253 00:17:46,160 --> 00:17:49,600 Speaker 2: of it over the past few years. I've been reporting 254 00:17:49,680 --> 00:17:53,679 Speaker 2: on big tech and social media platforms, what they're doing 255 00:17:53,880 --> 00:17:57,520 Speaker 2: or not doing to prevent harm to kids, what guardrails 256 00:17:57,560 --> 00:18:00,920 Speaker 2: they're putting up to stop exactly this kind of thing 257 00:18:01,040 --> 00:18:05,399 Speaker 2: from happening, and images like these were starting to multiply 258 00:18:05,600 --> 00:18:11,760 Speaker 2: with shocking speed alongside the rise of artificial intelligence. I 259 00:18:11,880 --> 00:18:15,040 Speaker 2: kept hearing more and more about the dark side of 260 00:18:15,080 --> 00:18:21,000 Speaker 2: this new technology, about deep fakes, altered videos online known 261 00:18:21,040 --> 00:18:21,920 Speaker 2: as deep 262 00:18:21,680 --> 00:18:23,440 Speaker 6: fakes. The rise of deep fakes. 263 00:18:23,240 --> 00:18:25,560 Speaker 7: Computer generated videos known as deep fakes. 264 00:18:25,600 --> 00:18:28,840 Speaker 9: Deep fake videos make people appear to say things they 265 00:18:28,880 --> 00:18:30,600 Speaker 9: never did or never would. 266 00:18:30,880 --> 00:18:32,680 Speaker 10: It's getting harder and harder to trust our 267 00:18:32,640 --> 00:18:33,680 Speaker 6: eyes and ears. 268 00:18:34,480 --> 00:18:37,560 Speaker 2: When I first heard about the Levittown case, I knew 269 00:18:37,600 --> 00:18:40,359 Speaker 2: the reporting was going to take me to some uncomfortable 270 00:18:40,880 --> 00:18:45,520 Speaker 2: but important places, from the rise of generative AI to 271 00:18:45,600 --> 00:18:51,600 Speaker 2: the explosion of deep fakes to child sexual abuse material online.
272 00:18:51,880 --> 00:18:53,960 Speaker 2: I also knew that it was going to be really 273 00:18:54,000 --> 00:18:57,480 Speaker 2: hard to find the victims and dive into this website 274 00:18:57,720 --> 00:19:01,439 Speaker 2: on my own, so I turned to my colleague for help: 275 00:19:02,200 --> 00:19:05,480 Speaker 2: Margi Murphy. She's a technology reporter at Bloomberg. 276 00:19:06,920 --> 00:19:09,280 Speaker 7: At the time, I remember I was reporting on this 277 00:19:09,440 --> 00:19:12,399 Speaker 7: new generative AI tool that people were using to create 278 00:19:12,480 --> 00:19:16,479 Speaker 7: child sexual abuse material. I typically write about people who 279 00:19:16,600 --> 00:19:20,560 Speaker 7: hide on the fringes of the Internet, like teenage hackers 280 00:19:20,720 --> 00:19:25,200 Speaker 7: extorting global businesses, or cyber criminals using AI to scam people. 281 00:19:26,080 --> 00:19:28,520 Speaker 7: Not that long ago, when I started on this beat, 282 00:19:28,920 --> 00:19:31,679 Speaker 7: deep fakes weren't that easy to make, and you could kind of 283 00:19:31,720 --> 00:19:34,320 Speaker 7: always tell. If you looked at them long 284 00:19:34,440 --> 00:19:37,640 Speaker 7: enough you would notice something was off, yet they caught 285 00:19:37,680 --> 00:19:38,200 Speaker 7: your attention. 286 00:19:38,480 --> 00:19:42,159 Speaker 6: With AI, there has been so much innovation, but in the 287 00:19:42,200 --> 00:19:46,040 Speaker 7: last few years there have been rapid breakthroughs in machine learning. 288 00:19:46,080 --> 00:19:49,840 Speaker 10: Amazon investing up to four billion dollars in startup Anthropic. 289 00:19:49,760 --> 00:19:53,639 Speaker 1: Microsoft reportedly investing ten billion dollars in OpenAI. 290 00:19:53,800 --> 00:19:56,600 Speaker 7: And truly, hundreds of millions of dollars have 291 00:19:56,640 --> 00:20:00,040 Speaker 7: flowed into image generating software powered by AI.
292 00:20:00,600 --> 00:20:04,600 Speaker 6: Microsoft, Amazon, you name it. They are all actively investing 293 00:20:05,000 --> 00:20:07,840 Speaker 6: in young AI startups because they're trying 294 00:20:08,080 --> 00:20:10,560 Speaker 7: That has made it a lot cheaper and easier to 295 00:20:10,600 --> 00:20:14,520 Speaker 7: make convincing photos or videos of pretty much anything you 296 00:20:14,520 --> 00:20:19,080 Speaker 7: can think of. Now, there are billions upon billions of 297 00:20:19,119 --> 00:20:23,160 Speaker 7: deep fake images out there, and this probably won't surprise you, 298 00:20:23,840 --> 00:20:27,479 Speaker 7: but the vast, vast majority, by one estimate more than 299 00:20:27,560 --> 00:20:33,119 Speaker 7: ninety percent of all deep fake imagery, is pornographic. Typically, 300 00:20:33,320 --> 00:20:35,800 Speaker 7: people who have had their image turned into porn have 301 00:20:35,960 --> 00:20:39,879 Speaker 7: not consented to having their faces, bodies, or voices morphed 302 00:20:39,880 --> 00:20:43,080 Speaker 7: in this way. A lot of people don't even know 303 00:20:43,200 --> 00:20:46,879 Speaker 7: that these images of themselves exist. But as Olivia and 304 00:20:46,920 --> 00:20:50,960 Speaker 7: I learned, Kayla's dad had found this website in a 305 00:20:51,000 --> 00:20:54,680 Speaker 7: regular search of her name, and now he and Kayla 306 00:20:54,800 --> 00:20:55,880 Speaker 7: knew it was there. 307 00:20:57,240 --> 00:21:00,880 Speaker 3: Seeing what people had to say about my body, as 308 00:21:01,119 --> 00:21:05,760 Speaker 3: the pictures were, like, when I'm what, thirteen, fourteen, I 309 00:21:05,800 --> 00:21:09,720 Speaker 3: think I instantly cried. I was just like, I think 310 00:21:09,760 --> 00:21:13,360 Speaker 3: I remember, kind of like stating more to myself than 311 00:21:13,400 --> 00:21:16,159 Speaker 3: anybody else, like, why is this happening to me?
Like 312 00:21:16,359 --> 00:21:19,639 Speaker 3: what is going on? Like, is this even real life? 313 00:21:20,840 --> 00:21:26,600 Speaker 3: I felt betrayed by my body, because it felt so real, 314 00:21:27,600 --> 00:21:30,480 Speaker 3: like the pictures just seemed so real. 315 00:21:31,760 --> 00:21:34,320 Speaker 7: The website's users also encouraged each other to 316 00:21:34,359 --> 00:21:39,200 Speaker 7: comment on the images and even harass the person being pictured. 317 00:21:39,680 --> 00:21:44,520 Speaker 3: It's unimaginable to think that people can even say, how gross, 318 00:21:44,560 --> 00:21:46,640 Speaker 3: like, to tell you to rape 319 00:21:46,359 --> 00:21:48,440 Speaker 4: me or milk me. 320 00:21:49,760 --> 00:21:53,280 Speaker 3: I remember a lot of mine was with my braces, 321 00:21:53,320 --> 00:21:56,040 Speaker 3: because I had braces until I was in eleventh grade, 322 00:21:56,840 --> 00:22:01,600 Speaker 3: and then I had some pimples, and I'd see people 323 00:22:02,080 --> 00:22:07,800 Speaker 3: saying that they loved my brace face and my pimples on 324 00:22:07,840 --> 00:22:09,400 Speaker 3: my face, that they like them younger. 325 00:22:13,160 --> 00:22:17,160 Speaker 2: Did you actively search through the website for all photos 326 00:22:17,200 --> 00:22:19,920 Speaker 2: of you and look at other photos on there and 327 00:22:20,280 --> 00:22:22,480 Speaker 2: what the people were saying, or did you just, could 328 00:22:22,480 --> 00:22:23,560 Speaker 2: you just not look at it at all? 329 00:22:24,520 --> 00:22:24,600 Speaker 5: No. 330 00:22:24,760 --> 00:22:26,360 Speaker 4: I constantly looked at it. 331 00:22:26,520 --> 00:22:30,920 Speaker 3: I was searching to see if there were new pictures 332 00:22:30,920 --> 00:22:31,920 Speaker 3: that would be uploaded. 333 00:22:33,320 --> 00:22:38,480 Speaker 2: After learning about the website, Kayla's biggest question was: who 334 00:22:38,520 --> 00:22:43,880 Speaker 2: did this?
Who went through the trouble of scraping her 335 00:22:43,920 --> 00:22:48,120 Speaker 2: pictures from social media, editing them to make her look naked, 336 00:22:48,840 --> 00:22:52,560 Speaker 2: and then posting them on the site asking others to 337 00:22:52,640 --> 00:22:57,520 Speaker 2: harass her? She wanted it to stop, so she took 338 00:22:57,520 --> 00:23:00,240 Speaker 2: a break from VSCO, a photo sharing app. 339 00:23:00,760 --> 00:23:03,119 Speaker 3: Because I started to notice that was the main place 340 00:23:03,200 --> 00:23:05,040 Speaker 3: that they took my pictures from. 341 00:23:05,720 --> 00:23:09,280 Speaker 2: I mean, for you, you can't unsee that that's your 342 00:23:09,320 --> 00:23:12,640 Speaker 2: body and that's someone putting it out there naked on 343 00:23:12,680 --> 00:23:16,439 Speaker 2: this website. How do you survive and how do you cope? 344 00:23:18,040 --> 00:23:21,879 Speaker 3: I quite literally just put it in the back of my 345 00:23:21,960 --> 00:23:26,040 Speaker 3: head and moved on and just did not think about it, 346 00:23:26,240 --> 00:23:31,639 Speaker 3: tried not to think about it. We were still on lockdown, 347 00:23:31,760 --> 00:23:34,199 Speaker 3: so I was stuck with my thoughts a lot. But 348 00:23:34,240 --> 00:23:37,320 Speaker 3: I was also stuck with my family. We all kind 349 00:23:37,320 --> 00:23:39,159 Speaker 3: of just put it in the back of our heads 350 00:23:39,200 --> 00:23:43,720 Speaker 3: and essentially, basically pretended that it never happened. 351 00:23:46,080 --> 00:23:49,320 Speaker 7: Kayla didn't tell anyone about it, not even her friends, 352 00:23:50,200 --> 00:23:52,280 Speaker 7: though her father reached out to his colleagues at the 353 00:23:52,320 --> 00:23:57,120 Speaker 7: Nassau County Police Department. He wanted to know what 354 00:23:57,160 --> 00:23:59,480 Speaker 7: they could do to help Kayla.
355 00:24:00,000 --> 00:24:02,679 Speaker 3: My dad did seek advice from his buddies at work. 356 00:24:03,000 --> 00:24:05,840 Speaker 3: He did ask if there was anything that they could do, 357 00:24:05,960 --> 00:24:09,560 Speaker 3: but his buddies really didn't know how to go about 358 00:24:09,560 --> 00:24:13,760 Speaker 3: it because it's so hard to trace, like those types 359 00:24:13,800 --> 00:24:18,520 Speaker 3: of websites when they're all anonymous and they're essentially a 360 00:24:18,560 --> 00:24:21,400 Speaker 3: porn website, so it's really hard to trace those back 361 00:24:21,440 --> 00:24:23,800 Speaker 3: to the original owner or anything like that. 362 00:24:24,840 --> 00:24:28,080 Speaker 7: Bear in mind, this was happening in twenty twenty, when 363 00:24:28,080 --> 00:24:31,680 Speaker 7: the term deep fakes wasn't as widespread or recognized as 364 00:24:31,680 --> 00:24:34,840 Speaker 7: it is now, and there weren't any laws, federal or 365 00:24:34,880 --> 00:24:37,960 Speaker 7: in New York State, at the time to prevent faked 366 00:24:38,000 --> 00:24:40,080 Speaker 7: pornographic images of real people. 367 00:24:40,960 --> 00:24:43,959 Speaker 3: I didn't really know what there was to do. After 368 00:24:44,320 --> 00:24:47,560 Speaker 3: my dad already tried to contact the police department and 369 00:24:47,640 --> 00:24:51,200 Speaker 3: his buddies and them just telling us that, like, there 370 00:24:51,200 --> 00:24:55,360 Speaker 3: really was not much that they could do. Going through 371 00:24:55,359 --> 00:24:59,760 Speaker 3: a complaint and trying to go through all the steps 372 00:24:59,760 --> 00:25:00,359 Speaker 3: that took, 373 00:25:00,400 --> 00:25:03,920 Speaker 4: I just don't think I was strong enough at that moment. 374 00:25:07,240 --> 00:25:11,800 Speaker 2: And as more time passed, can you describe how your 375 00:25:11,880 --> 00:25:15,040 Speaker 2: feelings about that website and those photos changed?
376 00:25:16,240 --> 00:25:16,920 Speaker 4: I think. 377 00:25:19,240 --> 00:25:22,560 Speaker 3: After some time I stopped looking at the website, put it 378 00:25:22,600 --> 00:25:25,720 Speaker 3: in the back of my head, but it completely altered 379 00:25:25,760 --> 00:25:29,480 Speaker 3: the way that I thought about my body. And there's 380 00:25:29,480 --> 00:25:33,120 Speaker 3: something called body dysmorphia, where your brain alters the way 381 00:25:33,160 --> 00:25:36,800 Speaker 3: that you look at your body, and I completely went 382 00:25:36,840 --> 00:25:37,200 Speaker 3: through that. 383 00:25:38,240 --> 00:25:40,560 Speaker 4: I used to feel so confident about it because I 384 00:25:40,600 --> 00:25:41,280 Speaker 4: was an athlete. 385 00:25:41,359 --> 00:25:44,600 Speaker 3: I felt so in tune with my body, and then 386 00:25:44,640 --> 00:25:53,120 Speaker 3: I completely changed. I started having eating issues and wasn't able 387 00:25:52,920 --> 00:25:54,280 Speaker 4: to eat full meals. 388 00:25:54,480 --> 00:25:59,000 Speaker 3: I would feel sick after eating. I couldn't even eat 389 00:25:59,040 --> 00:26:01,080 Speaker 3: three bites of food. I would eat two bites and 390 00:26:01,119 --> 00:26:05,119 Speaker 3: feel absolutely disgusted by what I was eating and disgusted with 391 00:26:05,119 --> 00:26:09,680 Speaker 3: my body for wanting food. It completely changed just who 392 00:26:09,720 --> 00:26:12,680 Speaker 3: I was as a person. I completely got depressed and 393 00:26:13,400 --> 00:26:14,720 Speaker 3: just wasn't myself. 394 00:26:16,600 --> 00:26:19,440 Speaker 7: And that's where the story could have ended. With Kayla's 395 00:26:19,440 --> 00:26:22,879 Speaker 7: faked pictures still online, and her and her dad not 396 00:26:22,960 --> 00:26:27,320 Speaker 7: able to do anything about it. Her small, safe, suburban 397 00:26:27,359 --> 00:26:32,600 Speaker 7: world turned upside down. But ten months later, Kayla's phone rang.
398 00:26:33,320 --> 00:26:34,840 Speaker 3: I got a call when I was working at my 399 00:26:34,920 --> 00:26:39,320 Speaker 3: old job at a clothing store. It was around New 400 00:26:39,400 --> 00:26:45,480 Speaker 3: Year's Eve. I remember it was like seven forty and 401 00:26:45,640 --> 00:26:49,359 Speaker 3: I was behind the register. It was very quiet in the store, 402 00:26:49,440 --> 00:26:51,840 Speaker 3: so I was just on my phone. We weren't really 403 00:26:51,840 --> 00:26:53,719 Speaker 3: supposed to be on our phones, but I just remember 404 00:26:53,800 --> 00:26:57,760 Speaker 3: being on my phone and I see one of my classmates. 405 00:26:58,080 --> 00:27:00,359 Speaker 3: It was her dad's name that popped up on my phone, 406 00:27:00,640 --> 00:27:03,280 Speaker 3: which was so weird to me because I was like, 407 00:27:03,320 --> 00:27:04,520 Speaker 3: I don't even have this number. 408 00:27:05,760 --> 00:27:07,359 Speaker 4: Why are they calling me? 409 00:27:07,400 --> 00:27:10,159 Speaker 3: And I just remember the last name, and I answered 410 00:27:10,160 --> 00:27:13,520 Speaker 3: and I was like hello, and she's like, Hi, this 411 00:27:13,720 --> 00:27:20,480 Speaker 3: is blank. She goes, I saw you on the website. 412 00:27:20,640 --> 00:27:22,960 Speaker 3: I just want to let you know I'm also on 413 00:27:23,000 --> 00:27:26,400 Speaker 3: this website. And I remember just like walking away from 414 00:27:26,400 --> 00:27:30,480 Speaker 3: the register and I was just like okay, like how 415 00:27:30,480 --> 00:27:34,240 Speaker 3: many other people are on there? Who do you think 416 00:27:34,240 --> 00:27:39,600 Speaker 3: it is? And that's when she said I figured out 417 00:27:39,600 --> 00:27:40,119 Speaker 3: who it was. 418 00:27:40,160 --> 00:27:55,200 Speaker 4: I think. 419 00:27:50,080 --> 00:27:54,640 Speaker 2: In an instant, Kayla's story got so much bigger.
From there, 420 00:27:54,800 --> 00:27:58,800 Speaker 2: it would stretch beyond her bedroom, past the high school, 421 00:27:58,960 --> 00:28:03,600 Speaker 2: out of Levittown, across the country, and around the world. 422 00:28:04,760 --> 00:28:08,080 Speaker 7: What she didn't and couldn't know, but what Olivia and 423 00:28:08,119 --> 00:28:11,080 Speaker 7: I would learn through our investigation was that there was 424 00:28:11,119 --> 00:28:15,000 Speaker 7: a global web of cyber criminals taking advantage of a 425 00:28:15,119 --> 00:28:20,560 Speaker 7: plodding justice system, being pursued by online vigilantes and hackers 426 00:28:21,240 --> 00:28:25,879 Speaker 7: willing to take risks the cops and prosecutors wouldn't. The 427 00:28:26,000 --> 00:28:29,600 Speaker 7: story took us from New York to small town New Zealand, 428 00:28:30,280 --> 00:28:34,880 Speaker 7: through the darkest corners of the Internet, back into bedrooms 429 00:28:35,240 --> 00:28:57,680 Speaker 7: of mid century homes lined up in neat little rows. 430 00:29:08,480 --> 00:29:10,160 Speaker 7: This season on Levittown. 431 00:29:10,720 --> 00:29:14,440 Speaker 9: There are things you don't want to believe about your friends. 432 00:29:15,440 --> 00:29:17,960 Speaker 9: You kind of think that the people in your circle 433 00:29:18,000 --> 00:29:20,520 Speaker 9: are like you. You put them on a pedestal. 434 00:29:22,200 --> 00:29:25,680 Speaker 10: This individual was just out to ruin their lives for 435 00:29:25,720 --> 00:29:30,160 Speaker 10: no reason. It was a very new type of crime 436 00:29:30,200 --> 00:29:33,800 Speaker 10: where there was a gray area, so to say. 437 00:29:33,800 --> 00:29:36,720 Speaker 8: You have certain police officers that will wait for calls that 438 00:29:36,840 --> 00:29:39,200 Speaker 8: come in and then go and respond to those calls. 439 00:29:39,800 --> 00:29:41,680 Speaker 8: I'm the person that will go and hunt people.
440 00:29:43,000 --> 00:29:47,520 Speaker 5: They call it an arms race between law enforcement and technology, 441 00:29:48,080 --> 00:29:50,000 Speaker 5: and it's one where we're losing. 442 00:29:50,640 --> 00:29:51,920 Speaker 4: We are absolutely losing. 443 00:29:56,920 --> 00:30:01,040 Speaker 2: This series is reported and hosted by Margi and me, 444 00:30:01,440 --> 00:30:06,560 Speaker 2: Olivia Carville. Produced by Kaleidoscope, led by Julia Nutter. Edited 445 00:30:06,560 --> 00:30:11,600 Speaker 2: by Neda Toloui-Semnani. Production by Dara Luck Potts. Executive 446 00:30:11,640 --> 00:30:16,560 Speaker 2: produced by Kate Osborne. Original composition and mixing by Steve 447 00:30:16,640 --> 00:30:22,760 Speaker 2: Bone. Levittown archival clips provided by Screen Ocean Clips and Footage. 448 00:30:23,160 --> 00:30:27,480 Speaker 2: Our Bloomberg editors are Caitlin Kenney and Jeff Grocott. Additional 449 00:30:27,560 --> 00:30:32,400 Speaker 2: reporting by Samantha Stewart. Sage Bowman is Bloomberg's executive producer 450 00:30:32,560 --> 00:30:37,040 Speaker 2: and head of podcasting. Kristin Powers is our senior executive editor. 451 00:30:37,640 --> 00:30:42,200 Speaker 2: From iHeart, our executive producers are Tyler Klang and Nikki Ettore. 452 00:30:42,640 --> 00:30:47,720 Speaker 2: Levittown is a production of Bloomberg, Kaleidoscope and iHeart Podcasts. 453 00:30:48,480 --> 00:30:51,040 Speaker 2: If you liked this show, give us a follow and 454 00:30:51,120 --> 00:30:51,960 Speaker 2: tell your friends. 455 00:30:58,200 --> 00:31:01,760 Speaker 1: That was the first episode of the six-part series Levittown. 456 00:31:02,440 --> 00:31:05,120 Speaker 1: Go listen and subscribe to the series wherever you get 457 00:31:05,120 --> 00:31:10,040 Speaker 1: your podcasts. It's simply called Levittown. Thanks for listening. We'll 458 00:31:10,080 --> 00:31:12,160 Speaker 1: be back on Friday.
And back in the studio with 459 00:31:12,240 --> 00:31:14,600 Speaker 1: my co-host Kara with the week in tech.