1 00:00:01,040 --> 00:00:04,600 Speaker 1: A quick note, this is episode three of a six 2 00:00:04,640 --> 00:00:08,760 Speaker 1: part series. If you haven't heard the prior episodes, we 3 00:00:08,920 --> 00:00:13,280 Speaker 1: recommend going back and starting there. It should also be 4 00:00:13,360 --> 00:00:19,520 Speaker 1: noted that this series explores sexualized imagery involving minors and violence. 5 00:00:20,040 --> 00:00:22,160 Speaker 1: Please take care when listening. 6 00:00:25,920 --> 00:00:27,840 Speaker 2: With Late. 7 00:00:27,840 --> 00:00:31,120 Speaker 1: One night, after a day of reporting in Levittown, I 8 00:00:31,160 --> 00:00:34,280 Speaker 1: called Margie from my drive home and she told me 9 00:00:34,400 --> 00:00:36,280 Speaker 1: she'd turned up something big. 10 00:00:36,840 --> 00:00:39,400 Speaker 2: I told Olivia I'd found some people who were looking 11 00:00:39,479 --> 00:00:43,120 Speaker 2: into the same website where the Levittown girls had appeared. They 12 00:00:43,159 --> 00:00:46,559 Speaker 2: were halfway across the world in New Zealand. 13 00:00:47,880 --> 00:00:52,279 Speaker 1: I was like, what? In my many years of covering 14 00:00:52,400 --> 00:00:56,360 Speaker 1: tech for Bloomberg, I've had very few stories take me 15 00:00:56,480 --> 00:01:00,000 Speaker 1: back home. To me, New Zealand is a little slice 16 00:01:00,240 --> 00:01:03,440 Speaker 1: of paradise at the bottom of the world, a place 17 00:01:03,480 --> 00:01:07,120 Speaker 1: where traffic jams and missing dogs make the nightly news. 18 00:01:08,560 --> 00:01:11,959 Speaker 1: Now Margie was telling me that women in New Zealand 19 00:01:12,160 --> 00:01:16,160 Speaker 1: were also being harassed on the very same website where 20 00:01:16,160 --> 00:01:21,120 Speaker 1: the Levittown girls were posted, and in New Zealand the 21 00:01:21,200 --> 00:01:30,520 Speaker 1: local cops were taking notice.
22 00:01:32,319 --> 00:01:36,800 Speaker 3: So my name is Doug, preferred Doug, but Douglas. 23 00:01:36,480 --> 00:01:40,320 Speaker 2: Nukou. One of my sources introduced me to Doug, who's 24 00:01:40,319 --> 00:01:44,560 Speaker 2: in his early fifties, tall, shaved head. He's a detective 25 00:01:44,560 --> 00:01:47,040 Speaker 2: in New Zealand, stationed at the top of the South 26 00:01:47,080 --> 00:01:49,760 Speaker 2: Island in a small town called Blenheim. 27 00:01:50,360 --> 00:01:54,640 Speaker 3: No traffic lights in Blenheim, no traffic jams, no congestion 28 00:01:54,760 --> 00:01:56,520 Speaker 3: of people or commercial buildings. 29 00:01:56,600 --> 00:02:02,840 Speaker 2: So is it as I'm just imagining? Unless I'm there, not necessarily. 30 00:02:04,440 --> 00:02:07,440 Speaker 2: Doug is brusque and no nonsense. He looks like he 31 00:02:07,520 --> 00:02:09,880 Speaker 2: was born to play the role of cop, and in 32 00:02:09,919 --> 00:02:11,119 Speaker 2: fact he sort 33 00:02:10,880 --> 00:02:15,040 Speaker 4: of was. So I'll put it out there. At nine 34 00:02:15,120 --> 00:02:19,240 Speaker 4: years old, my father left. Once my father left, basically 35 00:02:19,360 --> 00:02:24,440 Speaker 4: our world fell apart in terms of structure and discipline 36 00:02:24,480 --> 00:02:25,120 Speaker 4: and behavior. 37 00:02:26,360 --> 00:02:27,520 Speaker 3: My brothers fell hard. 38 00:02:28,040 --> 00:02:31,080 Speaker 5: My brothers were into gangs and gang members, and so 39 00:02:31,160 --> 00:02:35,240 Speaker 5: they fell straight into trouble, straight into prison. I went 40 00:02:35,280 --> 00:02:37,680 Speaker 5: and saw them in prison once, and the first thing 41 00:02:37,760 --> 00:02:39,800 Speaker 5: they said to me is, don't you ever come back here? 42 00:02:40,680 --> 00:02:44,080 Speaker 4: Don't you ever come and see us. And I basically 43 00:02:44,200 --> 00:02:46,040 Speaker 4: left from there and that was the end of it.
44 00:02:47,000 --> 00:02:50,799 Speaker 2: But Doug went the other way, into law enforcement. He's 45 00:02:50,840 --> 00:02:53,079 Speaker 2: been on the police force for over twenty years now, 46 00:02:53,520 --> 00:03:00,000 Speaker 2: working mostly on drug related crimes. One day in twenty nineteen, 47 00:03:00,360 --> 00:03:02,680 Speaker 2: he got a call that changed his career. 48 00:03:03,200 --> 00:03:07,360 Speaker 4: So one day a colleague of mine phoned me and said, basically, 49 00:03:07,400 --> 00:03:10,760 Speaker 4: I've got this friend of mine who's been bombarded with 50 00:03:11,680 --> 00:03:17,520 Speaker 4: horrific photographs for several years, images of her face taken 51 00:03:17,560 --> 00:03:21,280 Speaker 4: from social media sites, and on those images there were 52 00:03:21,320 --> 00:03:25,280 Speaker 4: often captions detailing how pretty she was and what would 53 00:03:25,280 --> 00:03:28,440 Speaker 4: you like to do to her in terms of sexual 54 00:03:28,600 --> 00:03:32,359 Speaker 4: preferences and violations. There were also images of her face 55 00:03:32,520 --> 00:03:38,800 Speaker 4: with males' erect genitalia or males ejaculating on her image 56 00:03:38,800 --> 00:03:42,440 Speaker 4: and then reposting it back online. She had been to 57 00:03:42,520 --> 00:03:45,280 Speaker 4: another police station to make a complaint, but due to 58 00:03:45,320 --> 00:03:48,720 Speaker 4: capability and understanding, they didn't 59 00:03:48,440 --> 00:03:49,360 Speaker 3: know how to deal with it. 60 00:03:50,320 --> 00:03:53,200 Speaker 4: And she's at wits' end, to the point where she 61 00:03:53,360 --> 00:03:56,920 Speaker 4: was potentially suicidal at one stage. Is there anything you 62 00:03:56,920 --> 00:03:59,200 Speaker 4: can do? 63 00:03:59,200 --> 00:04:02,640 Speaker 2: Doug hadn't seen this level of online abuse, stalking and 64 00:04:02,720 --> 00:04:06,680 Speaker 2: harassment before.
He needed to hear what happened firsthand. 65 00:04:07,320 --> 00:04:11,160 Speaker 4: So the first time I met the victim, I was 66 00:04:11,200 --> 00:04:11,920 Speaker 4: obviously at work. 67 00:04:12,000 --> 00:04:12,560 Speaker 3: It was late. 68 00:04:12,800 --> 00:04:15,280 Speaker 4: Typically I'd set up a nice soft interview room where 69 00:04:15,600 --> 00:04:17,320 Speaker 4: she sat on a nice soft couch, get a cup 70 00:04:17,360 --> 00:04:20,240 Speaker 4: of tea. But I would start off, try and get 71 00:04:20,240 --> 00:04:22,320 Speaker 4: to know the person, try and build a nice, easy 72 00:04:22,400 --> 00:04:24,960 Speaker 4: rapport really quickly. You want to make things as easy 73 00:04:24,960 --> 00:04:30,040 Speaker 4: as possible. She was pretty apprehensive. It was a physical 74 00:04:30,080 --> 00:04:31,640 Speaker 4: toll as well as a mental toll, and you could 75 00:04:31,680 --> 00:04:33,400 Speaker 4: see it in her face as she was talking to me. 76 00:04:35,480 --> 00:04:38,039 Speaker 2: The woman, who was not willing to be interviewed and 77 00:04:38,080 --> 00:04:42,040 Speaker 2: will remain anonymous in our series, recounted how her email 78 00:04:42,040 --> 00:04:46,640 Speaker 2: inbox was constantly flooded with explicit pictures and violent threats. 79 00:04:47,200 --> 00:04:50,920 Speaker 4: She was able to provide some screenshots and information about 80 00:04:51,040 --> 00:04:51,920 Speaker 4: what's been going on. 81 00:04:53,040 --> 00:04:55,960 Speaker 2: Every time she got these messages, she'd block the sender 82 00:04:56,279 --> 00:04:58,520 Speaker 2: and take a screenshot, and she'd add it to her 83 00:04:58,520 --> 00:05:02,440 Speaker 2: growing pile of digital evidence. Had you seen anything 84 00:05:02,560 --> 00:05:03,799 Speaker 2: like that before? 85 00:05:04,600 --> 00:05:04,800 Speaker 3: No. 86 00:05:06,800 --> 00:05:09,320 Speaker 2: What went through your mind when she showed you those pictures?
87 00:05:10,520 --> 00:05:11,520 Speaker 3: Well, I'm a hunter. 88 00:05:12,720 --> 00:05:16,479 Speaker 4: You have certain police officers there that will wait for 89 00:05:16,520 --> 00:05:18,760 Speaker 4: calls that come in and then go and respond 90 00:05:18,760 --> 00:05:21,840 Speaker 4: to those calls. I like to identify myself as a 91 00:05:21,839 --> 00:05:24,040 Speaker 4: person that will go and hunt people before they get 92 00:05:24,040 --> 00:05:27,640 Speaker 4: a chance to do that sort of thing. So, as 93 00:05:27,680 --> 00:05:32,000 Speaker 4: a hunter, if you're given prey, you want to go 94 00:05:32,040 --> 00:05:32,560 Speaker 4: and get it. 95 00:05:38,360 --> 00:05:44,159 Speaker 2: From iHeart Podcasts, Bloomberg and Kaleidoscope. This is Levittown. I'm Margie 96 00:05:43,920 --> 00:05:45,799 Speaker 1: Murphy, and I'm Olivia Carvill. 97 00:05:54,120 --> 00:05:58,800 Speaker 2: Doug sifted through image after image, email after email, attempting 98 00:05:58,839 --> 00:06:02,520 Speaker 2: to trace the sender. There were so many emails and 99 00:06:02,560 --> 00:06:05,719 Speaker 2: they were sent from different addresses, making it hard to 100 00:06:05,760 --> 00:06:09,960 Speaker 2: identify the sender. Many of the messages were also attempting 101 00:06:10,080 --> 00:06:12,520 Speaker 2: to get her to click on links to porn websites. 102 00:06:13,320 --> 00:06:17,320 Speaker 2: Doug wondered if the sender had posted these explicit images online, 103 00:06:17,680 --> 00:06:20,560 Speaker 2: which is a crime in New Zealand. So, in addition 104 00:06:20,640 --> 00:06:23,080 Speaker 2: to the threats of blackmail that the person was sending, 105 00:06:23,960 --> 00:06:27,360 Speaker 2: if these images were disseminated online and Doug could 106 00:06:27,400 --> 00:06:30,480 Speaker 2: catch that person, it would mean years in prison.
107 00:06:31,279 --> 00:06:35,279 Speaker 4: Although I'm a hunter, I'm still bound by rules, by laws, 108 00:06:35,960 --> 00:06:39,760 Speaker 4: and by a line that I cannot cross. The reason 109 00:06:39,800 --> 00:06:42,520 Speaker 4: why Will got involved was because there were lines that 110 00:06:42,640 --> 00:06:43,520 Speaker 4: needed to be crossed. 111 00:06:44,480 --> 00:06:47,680 Speaker 2: Doug had heard about Will from another officer who said 112 00:06:47,720 --> 00:06:50,480 Speaker 2: Will was known for doing whatever had to be done 113 00:06:50,760 --> 00:06:52,600 Speaker 2: to solve tricky cases like these. 114 00:06:53,040 --> 00:06:56,760 Speaker 4: He is like a little terrier, so as soon as 115 00:06:56,760 --> 00:07:00,320 Speaker 4: he gets a rope to pull on he'll pull until 116 00:07:00,320 --> 00:07:01,200 Speaker 4: he gets the whole rope. 117 00:07:01,680 --> 00:07:03,080 Speaker 3: He would go to extremes. 118 00:07:07,920 --> 00:07:12,280 Speaker 6: My name is Will Wallace. I am thirty seven years old, 119 00:07:12,520 --> 00:07:16,360 Speaker 6: live in New Zealand. I have a young family. 120 00:07:16,800 --> 00:07:19,480 Speaker 2: Will is really nothing like a little terrier, and he's 121 00:07:19,560 --> 00:07:22,400 Speaker 2: much more comfortable behind the computer, in that space. 122 00:07:22,480 --> 00:07:25,760 Speaker 6: I found it a lot easier just to be a nobody, 123 00:07:25,920 --> 00:07:27,840 Speaker 6: you know, not worry about all the other stuff that 124 00:07:27,920 --> 00:07:30,200 Speaker 6: kind of comes along with it, you know, this whole 125 00:07:30,240 --> 00:07:31,560 Speaker 6: podcast being one of them. 126 00:07:32,720 --> 00:07:35,240 Speaker 3: Yeah. 127 00:07:35,480 --> 00:07:38,040 Speaker 2: Will grew up on the South Island. He served in 128 00:07:38,040 --> 00:07:41,560 Speaker 2: New Zealand's military and as a police officer.
He then 129 00:07:41,680 --> 00:07:45,360 Speaker 2: became a private investigator, which is when Doug called him 130 00:07:45,760 --> 00:07:48,320 Speaker 2: and told him about this woman who was the target 131 00:07:48,400 --> 00:07:50,360 Speaker 2: of an online abuse campaign. 132 00:07:50,760 --> 00:07:53,480 Speaker 6: To be perfectly honest, I was given very little in 133 00:07:53,600 --> 00:07:57,200 Speaker 6: terms of information. I was basically just given: internet harassment. 134 00:07:57,400 --> 00:08:01,680 Speaker 6: It's been happening for three or four years. The victim 135 00:08:01,760 --> 00:08:05,440 Speaker 6: is at their wits' end. Would you be interested in 136 00:08:05,480 --> 00:08:07,800 Speaker 6: having a look. So I was like a moth to 137 00:08:07,840 --> 00:08:08,280 Speaker 6: a flame. 138 00:08:10,200 --> 00:08:13,760 Speaker 2: First, he reviewed the evidence the victim gave him, essentially 139 00:08:13,760 --> 00:08:17,600 Speaker 2: a giant PDF of abusive pictures and comments. The victim 140 00:08:17,640 --> 00:08:20,480 Speaker 2: had received all this by email, but Will suspected the 141 00:08:20,520 --> 00:08:22,400 Speaker 2: content also lived somewhere else. 142 00:08:22,760 --> 00:08:24,960 Speaker 6: I was like, well, there's got to be a website 143 00:08:25,400 --> 00:08:28,400 Speaker 6: or there's got to be somewhere where the images are 144 00:08:28,400 --> 00:08:31,760 Speaker 6: being posted. So I was like, the first step is 145 00:08:31,800 --> 00:08:35,040 Speaker 6: to find out where the photos are being posted. 146 00:08:35,920 --> 00:08:39,120 Speaker 2: And that's when he found it with a simple reverse 147 00:08:39,200 --> 00:08:39,880 Speaker 2: image search. 148 00:08:40,640 --> 00:08:46,320 Speaker 6: This website came up: Printed Picks.
149 00:08:44,679 --> 00:08:47,440 Speaker 2: The very same website where the Levittown girls had 150 00:08:47,440 --> 00:08:50,920 Speaker 2: found their images altered, posted and defiled. 151 00:08:52,040 --> 00:08:54,400 Speaker 6: The first time I saw it, it's kind of like 152 00:08:54,520 --> 00:08:58,400 Speaker 6: disbelief that a website like that exists on the Internet. 153 00:08:58,920 --> 00:09:03,160 Speaker 6: It's like an online sewer, right. So yeah, seeing it 154 00:09:03,160 --> 00:09:05,480 Speaker 6: in front of me, I just said, what the fuck 155 00:09:05,840 --> 00:09:10,200 Speaker 6: is this place? You can tell by the photos 156 00:09:10,200 --> 00:09:12,880 Speaker 6: on there that the people on there have not consented 157 00:09:12,920 --> 00:09:16,520 Speaker 6: to being on there in the first place. I would 158 00:09:16,559 --> 00:09:21,000 Speaker 6: describe it as more of a severe Internet harassment site 159 00:09:21,200 --> 00:09:23,239 Speaker 6: than a pornography website. 160 00:09:26,760 --> 00:09:29,480 Speaker 2: To figure out how many users were going after the victim, 161 00:09:30,080 --> 00:09:33,040 Speaker 2: Will had to wade through every single post made about her. 162 00:09:34,120 --> 00:09:36,520 Speaker 2: Only then could he figure out who was going after 163 00:09:36,559 --> 00:09:40,079 Speaker 2: her the most. Once he did that, he was able 164 00:09:40,120 --> 00:09:44,360 Speaker 2: to determine who was organizing the harassment, coordinating it, and 165 00:09:44,400 --> 00:09:45,880 Speaker 2: goading others to join in. 166 00:09:46,720 --> 00:09:49,320 Speaker 6: So if you can imagine this has been happening over 167 00:09:49,480 --> 00:09:52,520 Speaker 6: like a period of three or four years, then there 168 00:09:52,600 --> 00:09:55,640 Speaker 6: is a lot of content that that person has posted. 169 00:09:56,720 --> 00:10:02,480 Speaker 2: The question remained, who is this person?
That's where having 170 00:10:02,480 --> 00:10:06,679 Speaker 2: a civilian like Will involved in the investigation helped. As 171 00:10:06,720 --> 00:10:09,920 Speaker 2: a private citizen, he could move through the web freely 172 00:10:10,640 --> 00:10:13,440 Speaker 2: and even pay for information he wanted on the dark web, 173 00:10:14,120 --> 00:10:16,560 Speaker 2: and Will knew how to do this with tools he'd 174 00:10:16,600 --> 00:10:20,280 Speaker 2: first used as a police officer tracking drug dealers online. 175 00:10:20,559 --> 00:10:22,040 Speaker 3: The Facebook graph tool. 176 00:10:22,720 --> 00:10:27,400 Speaker 6: You could pretty much punch anyone's Facebook profile into it and 177 00:10:28,480 --> 00:10:32,280 Speaker 6: export a lot more information than they would otherwise be 178 00:10:32,360 --> 00:10:37,160 Speaker 6: willing to show, like via interaction with other people's photos, 179 00:10:37,360 --> 00:10:41,080 Speaker 6: all of their comments, all of their photos, the groups 180 00:10:41,120 --> 00:10:41,640 Speaker 6: they were in. 181 00:10:43,040 --> 00:10:46,439 Speaker 2: There's a name for this, it's called open source intelligence, 182 00:10:46,840 --> 00:10:49,440 Speaker 2: and as Will explained, it's basically 183 00:10:49,320 --> 00:10:53,679 Speaker 6: hoovering up information all over the place, generally the Internet, 184 00:10:54,080 --> 00:10:57,040 Speaker 6: and then turning it into a product that can be actioned. 185 00:10:58,559 --> 00:11:00,439 Speaker 2: This is the kind of thing that takes a lot 186 00:11:00,480 --> 00:11:03,640 Speaker 2: of skill and a lot of time, but Will was 187 00:11:03,679 --> 00:11:07,960 Speaker 2: good at it. After months of digging, Will started following 188 00:11:08,000 --> 00:11:12,000 Speaker 2: the harasser from pseudonym to pseudonym, finding accounts he owned 189 00:11:12,040 --> 00:11:16,600 Speaker 2: on different social media platforms, and Will discovered something else.
190 00:11:17,920 --> 00:11:22,280 Speaker 2: Like in Levittown, this one harasser had many victims. 191 00:11:34,080 --> 00:11:38,800 Speaker 1: As Will Wallace followed the suspected harasser around the internet, 192 00:11:39,480 --> 00:11:45,200 Speaker 1: he found other victims, including a young woman named Lucy, 193 00:11:45,640 --> 00:11:49,040 Speaker 1: who had been harassed for more than half a decade, 194 00:11:49,240 --> 00:11:52,520 Speaker 1: starting when she lived not far from the first victim. 195 00:11:53,960 --> 00:11:56,600 Speaker 7: I'm really still trying to untangle in my adult life 196 00:11:56,800 --> 00:11:59,839 Speaker 7: how it has affected certain things that happened to me 197 00:12:00,120 --> 00:12:01,120 Speaker 8: already as of high school. 198 00:12:01,960 --> 00:12:05,840 Speaker 1: The first time Lucy was harassed online in twenty fifteen, 199 00:12:06,400 --> 00:12:09,480 Speaker 1: she was going to an all girls high school, similar 200 00:12:09,520 --> 00:12:11,640 Speaker 1: to the school I went to when I was growing 201 00:12:11,720 --> 00:12:15,319 Speaker 1: up. In New Zealand, all girls' schools are the norm. 202 00:12:16,000 --> 00:12:19,080 Speaker 1: We wore a uniform every day. We even had a 203 00:12:19,120 --> 00:12:24,080 Speaker 1: special cream colored outfit just for church. Lucy also had 204 00:12:24,080 --> 00:12:25,040 Speaker 1: to wear a uniform. 205 00:12:26,080 --> 00:12:30,240 Speaker 7: We were fed talking points like being only surrounded by 206 00:12:30,240 --> 00:12:35,520 Speaker 7: other girls is better for your academic standard because you're 207 00:12:35,520 --> 00:12:38,240 Speaker 7: not distracted by boys, and we were just trained to 208 00:12:38,400 --> 00:12:43,400 Speaker 7: be good girls.
I think it reiterated that we were 209 00:12:43,520 --> 00:12:50,480 Speaker 7: lesser than boys, so we were constantly, constantly thinking of 210 00:12:50,520 --> 00:12:53,880 Speaker 7: how to appear more attractive or more cool to the boys. 211 00:12:55,640 --> 00:12:59,960 Speaker 1: As Lucy learned, whether it's in Levittown, Long Island, 212 00:13:00,040 --> 00:13:03,920 Speaker 1: or a beach town in New Zealand, just identifying as 213 00:13:03,960 --> 00:13:08,080 Speaker 1: a girl online can make you a target, and in 214 00:13:08,200 --> 00:13:12,080 Speaker 1: Lucy's case, she learned she was a target from some 215 00:13:12,320 --> 00:13:16,000 Speaker 1: friends who saw her face on the dating app Tinder. 216 00:13:16,920 --> 00:13:19,360 Speaker 7: I was under the age of eighteen, so I could 217 00:13:19,400 --> 00:13:22,319 Speaker 7: not have a Tinder, but some guys that I 218 00:13:22,400 --> 00:13:24,120 Speaker 7: knew who were maybe one or two years older than 219 00:13:24,120 --> 00:13:27,560 Speaker 7: me sent me screenshots of a Tinder account with all 220 00:13:27,600 --> 00:13:29,640 Speaker 7: of my photos on it and were asking if it 221 00:13:29,679 --> 00:13:29,920 Speaker 7: was me. 222 00:13:30,760 --> 00:13:34,520 Speaker 1: The photos had been taken from her Instagram account. Some 223 00:13:34,600 --> 00:13:38,720 Speaker 1: of them featured her in her high school uniform. At first, 224 00:13:38,880 --> 00:13:41,840 Speaker 1: Lucy thought that was kind of funny, maybe a joke, 225 00:13:42,960 --> 00:13:45,320 Speaker 1: but then she learned that there was more to it.
226 00:13:46,920 --> 00:13:50,280 Speaker 7: When this person matched with a boy on Tinder, they 227 00:13:50,280 --> 00:13:55,240 Speaker 7: would send them a Snapchat account, and through that Snapchat 228 00:13:55,240 --> 00:14:00,239 Speaker 7: account would send like very explicit nude and sexual content, 229 00:14:00,440 --> 00:14:01,240 Speaker 7: but without a face. 230 00:14:02,679 --> 00:14:06,760 Speaker 1: The photos that were sent back weren't altered images of Lucy. 231 00:14:07,320 --> 00:14:10,240 Speaker 1: They were nude photos that were cropped so you couldn't 232 00:14:10,280 --> 00:14:14,520 Speaker 1: see a face, but whoever had created the fake account 233 00:14:15,000 --> 00:14:18,520 Speaker 1: clearly wanted to make it look like Lucy was sending 234 00:14:18,600 --> 00:14:24,320 Speaker 1: back nude and sexual content. She decided she would report 235 00:14:24,360 --> 00:14:27,520 Speaker 1: it to the police in Nelson, her hometown in New Zealand. 236 00:14:28,600 --> 00:14:31,200 Speaker 7: I went after school one day in my school uniform 237 00:14:31,320 --> 00:14:35,800 Speaker 7: to the police station and I knocked on the little 238 00:14:36,320 --> 00:14:40,960 Speaker 7: window thingy and a middle aged man came out, and 239 00:14:41,400 --> 00:14:45,000 Speaker 7: I said I maybe needed some help. But as soon 240 00:14:45,040 --> 00:14:47,600 Speaker 7: as he stepped up to the window, I remember getting 241 00:14:47,640 --> 00:14:53,320 Speaker 7: really nervous and suddenly feeling so silly, because it's actually 242 00:14:53,400 --> 00:14:55,040 Speaker 7: rather public when you go to the police. You have 243 00:14:55,080 --> 00:14:58,160 Speaker 7: to kind of plead your case in the waiting room, 244 00:14:58,360 --> 00:15:03,040 Speaker 7: which is quite echoey and loud.
So this random middle aged 245 00:15:03,040 --> 00:15:07,720 Speaker 7: man police officer came up to the window and asked 246 00:15:07,720 --> 00:15:10,920 Speaker 7: me what was wrong, and I kind of feebly told 247 00:15:10,960 --> 00:15:13,360 Speaker 7: him that I think someone had stolen my pictures and 248 00:15:13,520 --> 00:15:19,200 Speaker 7: was faking my identity on Tinder, and he just stared 249 00:15:19,200 --> 00:15:19,920 Speaker 7: at me blankly. 250 00:15:20,960 --> 00:15:24,600 Speaker 1: She said she felt so weird being in this public 251 00:15:24,640 --> 00:15:30,640 Speaker 1: space trying to explain this bizarre, upsetting and deeply personal 252 00:15:30,720 --> 00:15:35,040 Speaker 1: situation to this man who was completely clueless. 253 00:15:35,320 --> 00:15:37,560 Speaker 7: And he just had no idea what I was 254 00:15:37,560 --> 00:15:43,120 Speaker 7: talking about. I remember him just saying, oh, Snapchat, what's that? 255 00:15:45,200 --> 00:15:48,720 Speaker 8: He just had really no idea what I was 256 00:15:48,720 --> 00:15:49,240 Speaker 8: talking about. 257 00:15:50,000 --> 00:15:53,920 Speaker 1: To be fair, this was around twenty fifteen. Snapchat and 258 00:15:54,000 --> 00:15:56,440 Speaker 1: Tinder were only three or four years old.
259 00:15:57,560 --> 00:15:59,640 Speaker 7: I think I just went in with confidence at a ten, 260 00:15:59,800 --> 00:16:02,800 Speaker 7: and every minute that I was in the waiting room, 261 00:16:02,920 --> 00:16:06,160 Speaker 7: it just depleted by one point. By the time he 262 00:16:06,200 --> 00:16:09,600 Speaker 7: stared at me blankly, and then realizing that, 263 00:16:10,320 --> 00:16:12,480 Speaker 7: you know, other people could hear me speaking, and then 264 00:16:12,680 --> 00:16:15,120 Speaker 7: I got quieter, and then I got embarrassed, and then 265 00:16:15,960 --> 00:16:18,160 Speaker 7: you know, I just didn't have anything left to give 266 00:16:18,200 --> 00:16:19,200 Speaker 7: after a few minutes, so 267 00:16:21,080 --> 00:16:23,560 Speaker 8: I just apologized for wasting his time, and I left. 268 00:16:26,960 --> 00:16:29,960 Speaker 2: Like the women in Levittown, Lucy tried to put the 269 00:16:30,000 --> 00:16:32,600 Speaker 2: pictures and the police visit to the back of her mind. 270 00:16:33,520 --> 00:16:36,440 Speaker 2: She moved on with her life, graduated from high school, 271 00:16:37,440 --> 00:16:42,040 Speaker 2: but the next year, twenty sixteen, she started receiving messages. 272 00:16:43,040 --> 00:16:46,800 Speaker 7: I am not quite sure how it began, but it 273 00:16:46,920 --> 00:16:50,280 Speaker 7: began at some point that year that I started receiving 274 00:16:50,560 --> 00:16:55,880 Speaker 7: quite a number of message requests from anonymous profiles, either 275 00:16:55,880 --> 00:17:03,080 Speaker 7: on Facebook Messenger or via Instagram, of some disturbing content. 276 00:17:04,840 --> 00:17:08,800 Speaker 2: The messages began as verbal abuse and threats of violence. 277 00:17:08,520 --> 00:17:12,120 Speaker 8: And other times there was photos. 278 00:17:11,880 --> 00:17:14,520 Speaker 2: And then there was an onslaught of a different
279 00:17:14,240 --> 00:17:19,800 Speaker 7: abuse, somebody holding their penis over printed out photos 280 00:17:19,400 --> 00:17:23,560 Speaker 2: of me. One after the other after the other, so 281 00:17:23,720 --> 00:17:26,800 Speaker 2: many that at some point Lucy stopped counting. 282 00:17:26,640 --> 00:17:29,840 Speaker 7: Many different photos strewn across the carpet or on 283 00:17:29,880 --> 00:17:30,320 Speaker 7: a bed. 284 00:17:31,680 --> 00:17:34,359 Speaker 2: At that point, it wasn't clear to Lucy whether this 285 00:17:34,520 --> 00:17:37,639 Speaker 2: was illegal. She figured it was something that she just 286 00:17:37,680 --> 00:17:38,480 Speaker 2: had to deal with. 287 00:17:44,840 --> 00:17:49,080 Speaker 7: It sounds crazy, but I have received so so many 288 00:17:49,160 --> 00:17:56,680 Speaker 7: photos of penises over pictures of me over five years. 289 00:17:57,640 --> 00:17:59,920 Speaker 7: I cannot remember how it feels to be shocked by it. 290 00:18:00,840 --> 00:18:08,159 Speaker 7: I have received more than I cared to remember. I 291 00:18:08,200 --> 00:18:12,960 Speaker 7: have become so accustomed to that image. I cannot even 292 00:18:13,000 --> 00:18:16,200 Speaker 7: remember being shocked by it. And I'm sorry to say 293 00:18:16,240 --> 00:18:22,399 Speaker 7: that just seems like same shit, different day now. 294 00:18:24,040 --> 00:18:26,679 Speaker 2: As soon as she saw the messages, she blocked the 295 00:18:26,720 --> 00:18:33,000 Speaker 2: sender and deleted the content. Ping, block, delete. Ping, block, delete. 296 00:18:33,800 --> 00:18:38,199 Speaker 2: Ping, block, delete. It became the constant humming in the 297 00:18:38,240 --> 00:18:42,280 Speaker 2: background of her life.
By this time, she had moved 298 00:18:42,280 --> 00:18:45,760 Speaker 2: to Europe and had begun studying to become a social scientist, 299 00:18:46,680 --> 00:18:48,560 Speaker 2: and still she had to deal with the dread of 300 00:18:48,600 --> 00:18:49,479 Speaker 2: this cyber abuse. 301 00:18:50,880 --> 00:18:55,000 Speaker 7: From twenty eighteen, I had been living in Europe for 302 00:18:55,320 --> 00:18:58,439 Speaker 7: about a year and a half and some of the 303 00:18:58,520 --> 00:19:01,360 Speaker 7: harassment had slowed down, like it was just a sort 304 00:19:01,400 --> 00:19:05,640 Speaker 7: of small hangover and remnant from high school. But unfortunately 305 00:19:05,840 --> 00:19:09,200 Speaker 7: my dad got very sick, so I had to return 306 00:19:09,240 --> 00:19:13,760 Speaker 7: to New Zealand, and almost as soon as I got back, 307 00:19:13,920 --> 00:19:18,880 Speaker 7: I started receiving these messages again, but now they were 308 00:19:18,920 --> 00:19:19,640 Speaker 7: more pointed. 309 00:19:20,160 --> 00:19:23,320 Speaker 2: This time, someone had threatened to contact her parents and 310 00:19:23,359 --> 00:19:26,000 Speaker 2: send graphic photos if she didn't respond to them in 311 00:19:26,040 --> 00:19:27,280 Speaker 2: a certain number of days. 312 00:19:27,600 --> 00:19:31,080 Speaker 7: They also had my parents' email addresses already in the 313 00:19:31,119 --> 00:19:34,120 Speaker 7: email address bar, to prove to me that they knew 314 00:19:34,119 --> 00:19:37,840 Speaker 7: who my parents were and where they worked. The idea 315 00:19:37,960 --> 00:19:42,359 Speaker 7: that just someone out there was obsessed with hurting me 316 00:19:43,320 --> 00:19:48,080 Speaker 7: definitely began to take over my life. I remember being 317 00:19:48,520 --> 00:19:51,879 Speaker 7: really scared to go out 318 00:19:51,720 --> 00:19:52,560 Speaker 8: at night on my own.
319 00:19:52,880 --> 00:19:56,760 Speaker 7: I was terrified of ever sleeping somewhere where you could 320 00:19:56,760 --> 00:19:59,159 Speaker 7: see from the outside in, because I always had this 321 00:19:59,240 --> 00:20:03,000 Speaker 7: vision that someone was following me at night and would 322 00:20:03,080 --> 00:20:10,240 Speaker 7: look into my room. It was unbelievable how much the 323 00:20:11,640 --> 00:20:21,160 Speaker 7: fear consumed me. I felt sometimes overcome with suspicion, and 324 00:20:21,200 --> 00:20:23,439 Speaker 7: it really undermined a lot of relationships I had with 325 00:20:23,520 --> 00:20:27,120 Speaker 7: men or people that were in the life of my mom, 326 00:20:27,280 --> 00:20:28,760 Speaker 7: or men that were in the life of my friends. 327 00:20:29,280 --> 00:20:34,720 Speaker 7: I, yeah, I just always had in the back of 328 00:20:34,720 --> 00:20:42,160 Speaker 7: my mind that any of them could be the one. 329 00:20:42,800 --> 00:20:45,160 Speaker 2: Lucy decided she would try to go to the police again. 330 00:20:46,080 --> 00:20:48,920 Speaker 7: At this point, I felt, now I'm an adult, so like, 331 00:20:48,960 --> 00:20:50,240 Speaker 7: they're definitely gonna take me seriously. 332 00:20:50,240 --> 00:20:51,880 Speaker 8: I'm not wearing my school uniform this time. 333 00:20:52,440 --> 00:20:58,200 Speaker 7: I am a very old twenty year old and I 334 00:20:58,240 --> 00:21:01,000 Speaker 7: walked in. It was winter, I remember, so it was 335 00:21:01,119 --> 00:21:07,280 Speaker 7: cold and dark already, at like six pm, and I 336 00:21:07,320 --> 00:21:10,040 Speaker 7: said I would like to speak to someone because I'm 337 00:21:10,040 --> 00:21:15,840 Speaker 7: being harassed, and the exact same thing happened. Despite my 338 00:21:15,840 --> 00:21:20,200 Speaker 7: best intentions, they just had no idea what I was 339 00:21:20,240 --> 00:21:20,720 Speaker 7: talking about.
340 00:21:21,800 --> 00:21:23,920 Speaker 8: I remember it was a very short interaction, 341 00:21:24,040 --> 00:21:29,359 Speaker 7: and I left and sat crying in my car. I 342 00:21:29,440 --> 00:21:34,040 Speaker 7: didn't have the language to express what was illegal about 343 00:21:34,080 --> 00:21:36,400 Speaker 7: what was going on, and they didn't have the language 344 00:21:36,800 --> 00:21:39,760 Speaker 7: or the training to understand what was significant or illegal 345 00:21:39,800 --> 00:21:43,360 Speaker 7: about what was going on. So, you know, I probably 346 00:21:43,400 --> 00:21:48,359 Speaker 7: said something like, oh, I'm being bothered on Instagram or 347 00:21:48,600 --> 00:21:54,320 Speaker 7: I'm receiving some unwanted content or something along those lines. 348 00:21:54,320 --> 00:22:00,320 Speaker 7: But we didn't have enough, like, yeah, digital literacy or 349 00:22:00,359 --> 00:22:04,240 Speaker 7: social media literacy from a legal or criminal standpoint at 350 00:22:04,240 --> 00:22:10,960 Speaker 7: that time to deal with these things. I feel like 351 00:22:11,080 --> 00:22:15,480 Speaker 7: neither of us could communicate to each other what was important. 352 00:22:16,119 --> 00:22:18,520 Speaker 7: I felt deeply like at that point the onus was 353 00:22:18,560 --> 00:22:22,000 Speaker 7: on me to prove to them why they should care, 354 00:22:22,320 --> 00:22:26,800 Speaker 7: and then because I couldn't do that, I panicked and left. 355 00:22:31,680 --> 00:22:36,800 Speaker 7: Two years later, she got an email. In twenty twenty, 356 00:22:37,160 --> 00:22:42,160 Speaker 7: I got a very unexpected email from the New Zealand 357 00:22:42,200 --> 00:22:46,080 Speaker 7: Police asking to have a phone call with me.
And 358 00:22:46,960 --> 00:22:52,520 Speaker 7: I was so suspicious at this point of online harassment 359 00:22:52,680 --> 00:22:56,880 Speaker 7: and the ability to fake anything online that my initial 360 00:22:56,920 --> 00:22:59,439 Speaker 7: response was that this was my harasser pretending to be 361 00:22:59,440 --> 00:23:02,000 Speaker 7: the police, trying to get in touch with me. 362 00:23:03,119 --> 00:23:06,359 Speaker 2: Lucy soon registered that this was real and that she 363 00:23:06,560 --> 00:23:08,240 Speaker 2: was part of something much bigger. 364 00:23:08,640 --> 00:23:12,440 Speaker 7: I realized that there was a wider case going on 365 00:23:13,560 --> 00:23:16,280 Speaker 7: revolving around another woman who was being harassed in a 366 00:23:16,320 --> 00:23:23,400 Speaker 7: similar way, where they had basically reverse engineered some 367 00:23:23,480 --> 00:23:26,280 Speaker 7: evidence that led them to me. 368 00:23:27,720 --> 00:23:31,680 Speaker 2: In addition to Lucy, Doug Nukou and Will Wallace had 369 00:23:31,720 --> 00:23:35,640 Speaker 2: identified twelve other victims in New Zealand, and they were 370 00:23:35,640 --> 00:23:39,560 Speaker 2: getting closer to a perpetrator. Will had been following a 371 00:23:39,600 --> 00:23:42,960 Speaker 2: suspect around the Internet for months, getting to know his 372 00:23:43,080 --> 00:23:47,159 Speaker 2: favorite usernames, watching him post the same victims on different 373 00:23:47,200 --> 00:23:51,280 Speaker 2: pornographic websites, and even watching him buy and sell secondhand 374 00:23:51,280 --> 00:23:56,520 Speaker 2: goods on Facebook Marketplace and elsewhere. It quickly became clear 375 00:23:56,840 --> 00:24:00,080 Speaker 2: that the person who was responsible for the harassment was 376 00:24:00,080 --> 00:24:02,920 Speaker 2: skilled at what he was doing, and he knew how 377 00:24:02,920 --> 00:24:03,359 Speaker 2: to hide.
378 00:24:03,920 --> 00:24:08,119 Speaker 4: They were using crypto accounts that were from overseas sites, 379 00:24:08,320 --> 00:24:14,560 Speaker 4: for instance America, Proton Mail, Switzerland, European sites. It was 380 00:24:14,600 --> 00:24:18,320 Speaker 4: also obvious at the time that our offender used a VPN. 381 00:24:19,040 --> 00:24:22,879 Speaker 2: A VPN, or virtual private network, is like wearing 382 00:24:22,920 --> 00:24:26,840 Speaker 2: an invisibility cloak while you glide around the Internet. No 383 00:24:26,920 --> 00:24:29,280 Speaker 2: one will know where in the world you actually are. 384 00:24:30,320 --> 00:24:33,919 Speaker 2: So Will didn't know exactly where this perp lived, but 385 00:24:34,000 --> 00:24:37,120 Speaker 2: he kept watching him, waiting for a chance to pick 386 00:24:37,200 --> 00:24:42,159 Speaker 2: up some detail somewhere about where he lived. Will watched 387 00:24:42,200 --> 00:24:46,160 Speaker 2: as the suspect bought girls' underwear in bulk. Another time 388 00:24:46,359 --> 00:24:49,560 Speaker 2: he saw him post an electric guitar for sale. When 389 00:24:49,600 --> 00:24:53,159 Speaker 2: he looked closely, the guitar had been laid out on 390 00:24:53,240 --> 00:24:58,439 Speaker 2: a floral comforter. Will realized he'd seen that comforter somewhere before, 391 00:24:59,200 --> 00:25:02,440 Speaker 2: on a picture posted to a pornographic website called 392 00:25:02,480 --> 00:25:07,040 Speaker 2: xHamster that he'd already linked to the suspect. Doug was 393 00:25:07,080 --> 00:25:10,840 Speaker 2: able to subpoena that xHamster account. It turned out 394 00:25:10,960 --> 00:25:13,960 Speaker 2: the suspect hadn't always masked his traffic on the site. 395 00:25:14,680 --> 00:25:16,919 Speaker 2: This one time he hadn't.
396 00:25:18,000 --> 00:25:22,480 Speaker 4: My team found one IP address that hadn't been masked, 397 00:25:23,080 --> 00:25:26,080 Speaker 4: and that was an IP address that we could attribute 398 00:25:26,840 --> 00:25:27,520 Speaker 4: to a person. 399 00:25:28,480 --> 00:25:32,360 Speaker 2: Once again, like in Levittown, the background of a photo 400 00:25:32,680 --> 00:25:37,000 Speaker 2: sealed the deal. Will knew the matching bedspread meant they 401 00:25:37,040 --> 00:25:40,320 Speaker 2: had the right guy. Now it was up to the 402 00:25:40,320 --> 00:25:42,120 police. 403 00:25:42,160 --> 00:25:44,320 Speaker 4: When we knocked on the door, I would say, personally, for myself, that was one 404 00:25:44,359 --> 00:25:50,200 Speaker 4: of the most satisfying and highlight days of my career. 405 00:25:50,359 --> 00:25:54,800 Speaker 4: You can imagine Americans in the helicopters abseiling down and 406 00:25:54,800 --> 00:25:55,760 Speaker 4: smashing through windows. 407 00:25:55,760 --> 00:25:56,560 Speaker 3: That's what I wanted. 408 00:25:57,040 --> 00:25:59,800 Speaker 4: The boss wouldn't give me that. So the next best thing 409 00:26:00,000 --> 00:26:03,640 Speaker 4: was kicking in this guy's door, and exposing him 410 00:26:03,680 --> 00:26:04,600 Speaker 4: was a highlight. 411 00:26:05,320 --> 00:26:07,320 Speaker 3: I've not had a better day in the police, honestly. 412 00:26:09,000 --> 00:26:12,600 Speaker 2: His name is Finn Cotton. He lived in Nelson, the 413 00:26:12,680 --> 00:26:16,879 Speaker 2: same town Lucy's from. Speaker 7: I remember the distinct sensation of 414 00:26:16,920 --> 00:26:20,960 Speaker 7: time slowing down, and then they just said a name. 415 00:26:21,560 --> 00:26:26,200 Speaker 7: And I don't know what I expected, but they said 416 00:26:26,520 --> 00:26:29,159 Speaker 7: a name that was very familiar to me. 417 00:26:30,160 --> 00:26:32,600 Speaker 8: It was the name of a.
418 00:26:32,560 --> 00:26:36,560 Speaker 7: Boy that I went to school with when I was twelve, 419 00:26:37,800 --> 00:26:43,160 Speaker 7: and it just suddenly felt all. 420 00:26:43,320 --> 00:26:46,160 Speaker 2: She knew Finn when they were kids, but hadn't seen 421 00:26:46,240 --> 00:26:47,560 Speaker 2: him since she went to high school. 422 00:26:48,359 --> 00:26:51,239 Speaker 7: I felt such a wave of relief that it was 423 00:26:52,359 --> 00:26:56,480 Speaker 7: a person, because I guess when someone lives in anonymity, 424 00:26:56,640 --> 00:26:59,160 Speaker 7: you think they're a monster, and when you hear a name, 425 00:26:59,240 --> 00:27:00,560 Speaker 7: they're suddenly just a person. 426 00:27:01,760 --> 00:27:04,679 Speaker 2: But she also said it's scary that someone she barely 427 00:27:04,760 --> 00:27:08,560 Speaker 2: knew could launch such a vicious, years long campaign against her. 428 00:27:10,200 --> 00:27:14,040 Speaker 2: In October twenty twenty four, Finn pleaded guilty to blackmail, 429 00:27:14,560 --> 00:27:18,920 Speaker 2: causing harm by posting digital communication and possession of child 430 00:27:19,000 --> 00:27:24,320 Speaker 2: sexual abuse materials. His online abuse campaign lasted nearly a decade, 431 00:27:24,840 --> 00:27:28,199 Speaker 2: and he targeted over a dozen people. What would you 432 00:27:28,320 --> 00:27:30,879 Speaker 2: do or say to Finn if you saw him again? 433 00:27:31,800 --> 00:27:33,000 Speaker 8: What would I say to Finn? 434 00:27:36,200 --> 00:27:36,480 Speaker 9: Ah? 435 00:27:36,520 --> 00:27:37,240 Speaker 8: Such a good question. 436 00:27:41,240 --> 00:27:42,800 Speaker 7: Oh, it's so bad, but I just want to ask 437 00:27:42,840 --> 00:27:48,720 Speaker 7: him why. Okay, there is so much anger coming from him. 438 00:27:48,760 --> 00:27:49,160 Speaker 8: It's just. 439 00:27:51,240 --> 00:27:53,840 Speaker 7: Someone has to be so hurt and so angry to 440 00:27:54,760 --> 00:27:57,560 Speaker 7: want to hurt other people in this way, because 441 00:27:57,600 --> 00:28:01,200 Speaker 7: I guess, even though he's done horrible things 442 00:28:01,240 --> 00:28:05,800 Speaker 7: to me, we are from the same place. And it's 443 00:28:05,840 --> 00:28:07,560 Speaker 7: not that I don't think he has to answer for 444 00:28:07,560 --> 00:28:14,520 Speaker 7: what he's done, but I feel that this is someone 445 00:28:14,560 --> 00:28:22,520 Speaker 7: who's been failed. I see so much anger towards women, 446 00:28:23,600 --> 00:28:26,280 Speaker 7: and that frustrates me and makes me so angry that, 447 00:28:27,240 --> 00:28:30,520 Speaker 7: you know, a young man can feel so entitled, so 448 00:28:30,520 --> 00:28:34,160 Speaker 7: so entitled to women's affection and attention. 449 00:28:36,200 --> 00:28:40,680 Speaker 8: And now there's, you know, a queue of 450 00:28:43,120 --> 00:28:49,640 Speaker 9: equally angry professional men in police uniforms or in, uh, 451 00:28:49,840 --> 00:28:53,600 Speaker 9: you know, lawyers' offices willing to bring down the full 452 00:28:53,640 --> 00:28:54,600 Speaker 9: force of the law on him. 453 00:28:54,800 --> 00:28:57,040 Speaker 8: And what is going to come of this? He's going 454 00:28:57,080 --> 00:29:00,440 Speaker 8: to go to prison, then what? 455 00:29:05,160 --> 00:29:08,360 Speaker 2: We have attempted to reach Finn Cotton for comment but haven't 456 00:29:08,400 --> 00:29:12,240 Speaker 2: heard back. In November twenty twenty four, he was sentenced 457 00:29:12,280 --> 00:29:20,160 Speaker 2: to seven years behind bars. As for Will, he's been 458 00:29:20,240 --> 00:29:23,360 Speaker 2: changed by the cases too, but in a different way. 459 00:29:24,240 --> 00:29:27,640 Speaker 2: He desperately wanted someone to go after what he believes 460 00:29:27,800 --> 00:29:31,440 Speaker 2: is the root of the problem.
A website where so 461 00:29:31,600 --> 00:29:35,520 Speaker 2: many perpetrators he was helping to catch were still posting. 462 00:29:38,120 --> 00:29:41,840 Speaker 6: At the top of the website, on the left hand side, 463 00:29:41,880 --> 00:29:47,160 Speaker 6: there is a cartoon picture of a girl who looks underage. 464 00:29:47,880 --> 00:29:51,640 Speaker 6: There is a scroll bar that scrolls from sort of 465 00:29:51,720 --> 00:29:57,880 Speaker 6: right to left, and it shows the most recently updated topics. 466 00:29:58,560 --> 00:30:02,400 Speaker 6: I think when you see that scroll bar, that's 467 00:30:02,480 --> 00:30:05,280 Speaker 6: when you get an understanding of just how vile the 468 00:30:05,320 --> 00:30:08,719 Speaker 6: website is and just the type of horrible things that 469 00:30:08,760 --> 00:30:11,880 Speaker 6: get put on it. And that is anything from the 470 00:30:12,400 --> 00:30:15,800 Speaker 6: harassment that we've just talked about through to pictures of children, 471 00:30:16,760 --> 00:30:21,160 Speaker 6: pictures of social media details, addresses. 472 00:30:22,400 --> 00:30:26,040 Speaker 2: And he noticed something else all over this website: deep 473 00:30:26,080 --> 00:30:29,880 Speaker 2: fakes, images that made people look like they were naked 474 00:30:30,520 --> 00:30:34,400 Speaker 2: or in sexual positions. Some images were generated to make 475 00:30:34,440 --> 00:30:37,960 Speaker 2: it look like the women had been abused, covered in bruises. 476 00:30:38,960 --> 00:30:41,880 Speaker 2: At first, Will thought of it as just a new 477 00:30:41,960 --> 00:30:43,440 Speaker 2: mechanism of harassment. 478 00:30:44,040 --> 00:30:46,920 Speaker 6: That's kind of how I viewed it, probably a little 479 00:30:46,920 --> 00:30:50,560 Speaker 6: bit naive at that point, just not understanding that it's 480 00:30:50,600 --> 00:30:57,240 Speaker 6: the start of a massive problem.
481 00:30:54,840 --> 00:30:58,920 Speaker 2: A massive problem that would consume the next two years 482 00:30:58,960 --> 00:31:02,360 Speaker 2: of Will Wallace's life and bring him together with an 483 00:31:02,480 --> 00:31:06,680 Speaker 2: unlikely international coalition to take aim at the site itself. 484 00:31:11,560 --> 00:31:16,040 Speaker 1: At the same time, halfway across the world, law enforcement 485 00:31:16,120 --> 00:31:22,280 Speaker 1: officials in Levittown, New York, were also investigating the same website. 486 00:31:23,080 --> 00:31:26,680 Speaker 1: But back then in New York, there was no guidebook, 487 00:31:27,280 --> 00:31:32,560 Speaker 1: no law that made posting deep fake pornography a crime. 488 00:31:33,120 --> 00:31:34,760 Speaker 3: It is a crime that you look at and you say, 489 00:31:34,800 --> 00:31:36,400 Speaker 3: this is awful, this should be illegal. 490 00:31:36,440 --> 00:31:38,160 Speaker 7: But what is this? 491 00:31:38,680 --> 00:31:39,720 Speaker 8: What is this crime? 492 00:31:41,040 --> 00:31:45,840 Speaker 1: It was up to them to figure that out, and fast. 493 00:31:52,880 --> 00:31:56,880 Speaker 1: This series is reported and hosted by Margie Murphy and 494 00:31:56,920 --> 00:32:01,840 Speaker 1: me, Olivia Carvill. Produced by Kaleidoscope, led by Julia Nutter, 495 00:32:02,520 --> 00:32:07,000 Speaker 1: edited by Neda Toloui-Semnani, produced by Dara Luck Potts, 496 00:32:07,480 --> 00:32:12,640 Speaker 1: executive produced by Kate Osborne. Original composition and mixing by 497 00:32:12,680 --> 00:32:17,520 Speaker 1: Steve Bone. Our Bloomberg editors are Caitlin Kenney and Jeff Grocott. 498 00:32:17,960 --> 00:32:23,080 Speaker 1: Additional reporting by Samantha Stewart. Sage Bauman is Bloomberg's executive 499 00:32:23,120 --> 00:32:27,280 Speaker 1: producer and head of podcasting. Kristin Powers is our senior 500 00:32:27,360 --> 00:32:32,400 Speaker 1: executive editor.
From iHeart, our executive producers are Tyler Klang 501 00:32:32,640 --> 00:32:37,400 Speaker 1: and Nikki Ettore. Levittown is a production of Bloomberg, Kaleidoscope 502 00:32:37,640 --> 00:32:41,720 Speaker 1: and iHeart Podcasts. If you liked this show, give us 503 00:32:41,720 --> 00:32:49,360 Speaker 1: a follow and tell your friends.