Speaker 1: The views and opinions expressed in this podcast are solely those of the authors and participants and do not necessarily represent those of iHeartMedia, Tenderfoot TV, or their employees. This series contains discussions of violence and sexual violence. Listener discretion is advised.

Speaker 1: Last time on Algorithm: Lorie Townsend mourned the death of her daughter, Africa Hardy. Don't think that you know everything about your child, because there's something that they're not telling you. After moving to Chicago when she was nineteen, Africa started escorting. In October, detectives found her strangled in the bathtub of an Indiana motel. I just think about it a lot, and I think it could have been prevented. Years earlier, journalist Thomas Hargrove had learned about the concept of linkage blindness. Most connected murders go unrecognized, and I kept that in the back of my mind. And then, when Hargrove discovered an FBI database that tracked homicides, it gave him an idea: could we teach a computer to identify connected cases, to find serial killings?
Speaker 1: From iHeartMedia and Tenderfoot TV, this is Algorithm. I'm Ben Kuebrich. I mentioned in the last episode that Africa's case flipped what I knew about crime on its head. This episode, you'll see why. Let's jump back to my conversation with Thomas Hargrove. So, at what point, um... you were saying that we shouldn't ignore... Hargrove told me that back in two thousand four, when he was working as an investigative reporter in Washington, D.C., he had come across a database with information about homicides all across the US. He wondered whether an algorithm could find patterns within that data and detect serial killers. Maybe it could even be used to detect active serial killers who had not yet been caught. I did not know that such an algorithm was possible. I was going on faith. I did believe that it was important, and I did believe that it could save lives, but I didn't know that it would work. I begged my editors: let me try this. Let's do a project looking at murder, with the understanding that the actual goal is to see if we could create a computer program that would identify serial murder.
Speaker 1: But remember, this was back in two thousand four, years before people were talking about things like machine learning or the power of big data. Because Hargrove was on the cutting edge of these ideas, it was sometimes difficult to get other people to understand him or to take him seriously. My editors recognized that this could be something very cool, that this could make news, but it's hard to commit to a project when you don't know up front whether it's possible. Ultimately, his editors told him that trying to design an algorithm sounded too risky. It's normal for story pitches, especially ambitious ones like this, to get rejected. But what's unusual is that Hargrove didn't give up. For the next six years, he kept pitching the story. Six years. And he says that his editors always considered it, but would end up assigning him to another, safer story. Then, in two thousand ten, Hargrove got his big break. He had just published a provocative piece called Saving Babies, exposing sudden infant death.
Speaker 1: Hargrove had analyzed data from coroners' offices all across the US and found that many times, when infant deaths were listed as SIDS, sudden infant death syndrome, the evidence actually showed that the infants had died of accidental suffocation. Most of the time, babies don't die mysteriously. They die from avoidable accidents and unsafe sleeping conditions. They got covered up because of what was intended to be a very kindly diagnosis called SIDS. That was meant to be a kindness to parents. That was a mistake. This project prompted a national conference to re-examine infant death. It caused the creation of a sudden infant death monitoring system by the CDC. I mean, it was tremendously successful. It got rave attention. Finally, after six years, my stock was high enough in the newsroom that I was able to get them to look at murder. My editors said, okay, Tom, you've got a year. I could do a national reporting project looking at unsolved murder, with the understanding that what we were really about was to try to develop a statistical means to identify overlooked serial murder. Could we teach a computer to find serial killings?
Speaker 1: After years of dreaming about making this algorithm, Hargrove now had the chance to try to make it work. Hargrove started to design a computer program, one that could comb through the five hundred thousand supplemental homicide reports and find patterns between victims, patterns that might suggest the work of a serial killer. So what is an algorithm? These days, algorithms dictate much of our digital lives. They determine what TV shows get suggested to us, what posts we see on Facebook, what route our GPS takes us on. Often these algorithms are black boxes. We're not sure what data is being fed into them or how they're deciding what to spit back out. But at their core, algorithms aren't mysterious. They're not even really that complicated. Algorithms are just sets of instructions used to accomplish a specific task. You could say a cake recipe is an algorithm for baking a cake. You wouldn't say that, but you could. But while following an algorithm might be simple, designing a whole new algorithm is trickier. It wasn't like Hargrove was just baking a cake from scratch.
Speaker 1: It was more like he was trying to invent a brand new dessert. What we were doing with the algorithm was literally a process of trial and error. The ability to experiment with data was critical. If you're trying to come up with a new dessert, you might experiment with ingredients until you come up with a combination that tastes good. Hargrove knew what ingredients he had to play with. That was the data from the supplemental homicide report, which listed the weapon that was used (he was shot with a handgun), the time and location of the murder (January), and the age, race, and sex of the victim (victim is a Black male, eighteen years old). But as he experimented with how an algorithm might sort through that data, Hargrove needed a way to check if he was on the right track. So he decided to test each new prototype of his algorithm to see if it could detect a known serial killer. We used as our test bed the forty-eight victims of serial killer Gary Ridgway, the so-called Green River Killer. He was convicted in a court of law of murdering forty-eight girls and women in the Seattle area.
Speaker 1: The question was, could we teach a computer a process that would tell us that something god-awful had happened in Seattle? At the time, Gary Ridgway was thought to be the most prolific serial killer in America. Hargrove figured that if he could create an algorithm that could detect Ridgway, maybe it would also detect other serial killers who had gone unrecognized. To motivate himself, Hargrove stuck up a picture of Gary Ridgway in his office. It was one of his booking photographs. He was glowering, and it looked like he was glowering at me. Under that picture, I typed the headline: what do serial victims look like, statistically? For over two decades, Ridgway had killed women and dumped their bodies along the Green River outside Seattle. In most cases, by the time anyone discovered those corpses, they were already skeletal. Ridgway had snuffed out dozens of lives. He'd killed women and girls with their own hopes and dreams. But Hargrove needed to approach their deaths like the algorithm would, looking at their cases by the numbers, using only the data from the FBI's supplemental homicide reports.
Speaker 1: So what did Ridgway's victims look like, statistically? And more broadly, what do a serial killer's victims look like? To find out, I reached out to one of Hargrove's collaborators, Dr. Mike Aamodt. Aamodt is a forensic psychology professor at Radford University, and he curates the world's largest database of serial killers and their victims. If you look at how serial killers are portrayed in movies or on TV, the stereotype's not really very consistent with the facts. Are these also stereotypes that law enforcement might have when they're trying to decide whether a murder could possibly be a serial murder? Well, law enforcement certainly has the same stereotypes. When we've done presentations to law enforcement groups, and even clinical psychologists who are police psychologists, they're very surprised at the results. I'm talking to Professor Mike Aamodt about what serial murders look like statistically, because statistical differences between serial killings and other homicides are the kind of signatures an algorithm could use to detect a serial killer.
Speaker 1: Professor Aamodt has built a database with information on thousands of serial killers, and he told me that there are many misconceptions about these killers, even within law enforcement and the true crime community. The stereotype about the profile of a serial killer, or the profile of the victims, is not really very consistent with the facts. If we want to catch serial killers, we need to know who they really are and the reality of their crimes. I want you to stop for a moment and try to conjure up a mental image of a typical serial killer's victim. How old are they? What do they look like? Now, imagine all of the victims of a single killer. What would you guess they have in common with one another? If we look at the victims of serial killers, fifty-one percent of the victims are women and forty-nine percent are men, so there's not really much of a difference there. One of the stereotypes about serial killers is that they have a type, you know, a kind of victim that they're going to kill. The most consistent profile we can have is really in the age category of who they're killing.
Speaker 1: About ... are going to kill people that are in the same age category, so it's children, or it's teens, or it's adults, or the elderly. Serial killers are less consistent when it comes to other attributes of their victims. For example, only six percent kill victims that are all male or all female. And then if you look at race, for example, ... of serial killers only kill somebody of the same race. And while the stereotypical victim of a serial killer is white, actually a third of victims are people of color. For Gary Ridgway, the serial killer that Hargrove picked as his test case, all his known victims were young women, but they were diverse in terms of race. In an interview with an FBI agent, Ridgway said he targeted prostitutes out of convenience. And studies show that there's actually been a dramatic rise in serial killers targeting sex workers. In the seventies, prostitutes are thought to have made up around six percent of the female victims of serial killers. By the two thousands, more than two thirds of women murdered by serial killers were sex workers. Many serial killings are sexual in nature.
Speaker 1: About one third of serial killers rape at least one of their victims. Ridgway told detectives he'd have sex with his victims before he strangled them. And when asked why he chose to choke all his victims, Ridgway replied, because that was more personal and more rewarding than shooting them. And compared to typical murderers, serial killers are more likely to use methods that are up close and personal, like strangulation or bludgeoning. So if we're looking at victims in the US, about ... were shot, ... were strangled, ... were stabbed, and ten percent were bludgeoned. And serial killers tend to be consistent in the method they use to kill. Aamodt says that two thirds of serial killers use only a single means to kill their victims. So what are the takeaways from Aamodt's data? Serial killers show some, but not perfect, consistency in terms of their victims' age, race, and sex, and also in their method of killing. But it turns out that there's one more thing that serial homicides tend to have in common, a property that would be crucial for Hargrove's algorithm.
Speaker 1: We spent the summer of two thousand and ten finding at least a hundred procedures that crashed and burned. Does the presence of a serial killer increase the rate at which women are murdered? No. Does it increase the rate at which women are murdered through unusual means? No. And did you get any indications that you were getting closer? Yeah. As we were progressing through the hundred and one things that don't work, we were starting to get closer. So, um, the last thing was to look at what the clearance rate was for particular types of weapons. That term, clearance rate, refers to the percentage of cases in which police end up arresting someone for the crime. Hargrove realized that Ridgway had gotten away with his murders for so long that they'd been listed as unsolved in the database. So if you looked at killings in Seattle that matched his method, the number of cases police had cleared was much lower than expected. The presence of an active serial killer often destroys the batting average for the local police department.
Speaker 1: They're able to solve most murders, but not that type of murder, because there's a serial killer who's avoiding arrest. Hargrove was making progress. He'd picked up the hint of a signal from Ridgway in Seattle, but it wasn't enough to make the algorithm useful as a tool. He racked his brain trying to come up with a way to improve it. One day, near the end of the summer, he looked up at Ridgway's mug shot and asked himself again: what do a serial killer's victims look like, statistically? As he talked through the project one last time with his research assistant, Liz Lucas, an idea struck him. As I was taking Liz to the airport, because she had to go home to defend her master's thesis, I told her what might work, and what we're going to try next: a kind of cluster analysis. Instead of querying all of the data, we tried to assemble the data into smaller clusters according to the county where the murder occurred, the gender of the victim, the method of killing, and, at that time, age group.
Speaker 1: Then we calculated what the clearance rate was for each cluster, how many of those murders were solved. Up until this point, he'd tried looking at the data for all the murders in Seattle. His new idea was to further break down the data: have the algorithm split up the homicides for each county into buckets of victims, and then rank those buckets by the percentage of cases that were solved. And Hargrove wasn't just doing this for Seattle. He was doing it for data all across the US, and he was hoping that out of the thousands of clusters, Ridgway's Seattle killings would show up somewhere near the top of the list. Hargrove ran his new algorithm and waited anxiously for the result. When we did that, the Green River killings jumped out plain as day, came in third place. Most of the time, seventy-seven percent of the time, an arrest is made when a woman is killed. In the cluster where the Green River killings were grouped, the solution rate was less than ... and that was our key. We're looking for groups of similar murders that have very low clearance rates.
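Hargrove's actual code isn't shown in the episode, but the clustering idea he describes, bucketing homicides by county, victim sex, age group, and killing method, then ranking the buckets by clearance rate, can be sketched in a few lines of Python. The record field names (`county`, `sex`, `age_group`, `method`, `solved`) are illustrative stand-ins of mine, not the real fields of the FBI's supplemental homicide file:

```python
from collections import defaultdict

def suspicious_clusters(homicides, min_size=10):
    """Group homicide records into clusters by county, victim sex,
    victim age group, and killing method, then rank the clusters by
    clearance rate (the share of cases in which an arrest was made).
    Clusters with unusually low clearance rates come first; those are
    the "suspicious" ones."""
    clusters = defaultdict(list)
    for case in homicides:
        key = (case["county"], case["sex"], case["age_group"], case["method"])
        clusters[key].append(case)

    ranked = []
    for key, cases in clusters.items():
        if len(cases) < min_size:  # too few cases to say anything
            continue
        cleared = sum(1 for c in cases if c["solved"])
        ranked.append((key, len(cases), cleared / len(cases)))

    # Lowest clearance rate first = most suspicious.
    return sorted(ranked, key=lambda r: r[2])
```

On real data, a cluster like (King County, female, adult, strangulation) with a near-zero clearance rate would float to the top of this list, which is essentially how the Green River cluster surfaced in the account above.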
Speaker 1: We had hundreds of results all over the country, highly suspicious clusters, and we started investigating them. Hargrove was ecstatic. It seemed like his algorithm might be working. It had detected Gary Ridgway, the Green River Killer. But at the same time, he'd designed the algorithm to detect Ridgway, and he'd kept tweaking it until it did. So maybe the algorithm had just detected Ridgway by luck, and it wouldn't generalize to other killers. In statistics, this is a problem called overfitting, and it's a common problem when scientists try to make algorithms that predict things. To get around this problem of overfitting, people often train their algorithms with one set of data and then validate the algorithm by testing it with new data. So, since Hargrove had been using homicides in Seattle to train his algorithm, he needed to look at other cities to see if it was really working. There were two larger suspicious clusters, and in first place was Los Angeles. There was a large group of almost entirely African American women who were killed by handguns, and those murders had a very low solution rate.
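The train-then-validate discipline just described can be sketched the same way: tune a detector on one city's cases, then score it only on cities it has never seen. This is a generic illustration of out-of-sample validation under my own toy data layout, not the project's actual test procedure:

```python
def validate(detector, labeled_cities):
    """Score a tuned detector on held-out cities.

    labeled_cities maps a city name to (cases, truth), where truth is
    True if that city is known to have had an active serial killer.
    The detector must not have been tuned on any of these cities,
    otherwise the score says nothing about generalization."""
    correct = sum(
        1 for cases, truth in labeled_cities.values()
        if detector(cases) == truth
    )
    return correct / len(labeled_cities)

def low_clearance_detector(cases, threshold=0.5):
    """Toy detector in the spirit of the episode: flag a city when the
    share of solved cases (the clearance rate) is suspiciously low."""
    cleared = sum(1 for c in cases if c["solved"])
    return cleared / len(cases) < threshold
```

A detector that scores well only on Seattle has been memorized, not learned; one that also flags Los Angeles and Gary, cities it never saw during tuning, is actually picking up the underlying signal.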
Speaker 1: So I assembled a spreadsheet of those murders and emailed it to the public relations department of the LAPD. I got one of the representatives on the phone: is there a chance that any of these could be serial murders? And he spent a minute or so looking through the files, and then he came back and said, what, are you kidding? They're all serial killings. I said, what? In fact, back in the eighties, the LAPD established what they hoped would be a secret task force. They didn't want to alarm the public, but they were exploring the possibility that there could be a serial killer active. They called it the South Side Slayer Task Force. Well, it turned out that the task force was misnamed. It should have been called the South Side Slayers Task Force, because they had five. They were all quite independent of each other. They didn't know each other, but they were all killing women over a period of twenty years. You heard that right. During the eighties and nineties, Los Angeles had at least five different serial killers who were shooting, strangling, and sexually assaulting women in the area.
Speaker 1: This was happening at the height of the crack epidemic, and at a time when homicides across the US were peaking. The large number of overall homicides probably helped cover up the fact that these serial killers were operating in LA. But it's also likely that the police didn't give these murders enough attention due to a phenomenon that's sometimes called victim discounting: the tendency to ignore crimes targeting marginalized groups. But when Hargrove found out about these LA serial killers, he took it as another sign that the algorithm seemed to be working. It had detected a group of confirmed serial killers that he hadn't even been aware of. Hargrove continued down his list of top clusters that the algorithm had identified. He called up the police department in Youngstown, Ohio, and left a long message in which he tried to explain the algorithm and how it had detected a cluster of murders in Youngstown. Thirty minutes later, the phone rings. It's the chief of detectives, and you've got to give him credit, answering a voicemail like that. He's a young guy.
He called back 318 00:22:00,440 --> 00:22:03,320 Speaker 1: and said, that was the damnedest message I ever had 319 00:22:03,359 --> 00:22:05,560 Speaker 1: on my phone. And so I went back and I 320 00:22:05,640 --> 00:22:10,080 Speaker 1: interviewed my senior detectives and they told me something I 321 00:22:10,160 --> 00:22:13,240 Speaker 1: did not know. We thought we had a serial killer 322 00:22:13,280 --> 00:22:15,840 Speaker 1: in the nineties. We definitely thought we had one, and 323 00:22:15,880 --> 00:22:19,359 Speaker 1: we never got him, and so we started a new investigation. 324 00:22:19,920 --> 00:22:24,119 Speaker 1: He was attempting to locate the rape kits from those cases, 325 00:22:24,760 --> 00:22:27,760 Speaker 1: to try to DNA type all of the rape kits 326 00:22:27,800 --> 00:22:32,040 Speaker 1: they could find. Unfortunately, and this was very embarrassing, the 327 00:22:32,119 --> 00:22:35,719 Speaker 1: rape kits had all been destroyed, the property rooms had 328 00:22:35,760 --> 00:22:39,400 Speaker 1: been cleaned out, and he was not able to get 329 00:22:39,440 --> 00:22:44,679 Speaker 1: any of the kits from his cases or from surrounding jurisdictions. 330 00:22:45,320 --> 00:22:49,520 Speaker 1: There were other similar murders in neighboring jurisdictions, but they 331 00:22:49,560 --> 00:22:52,600 Speaker 1: too did not retain the rape kits. So it was 332 00:22:52,640 --> 00:22:54,600 Speaker 1: very sad, but they gave it the old 333 00:22:54,640 --> 00:22:59,159 Speaker 1: college try. Again, the algorithm had identified a cluster of 334 00:22:59,240 --> 00:23:01,399 Speaker 1: murders that seemed like it was the work of a 335 00:23:01,520 --> 00:23:05,560 Speaker 1: serial killer.
Police couldn't prove that the killings were connected, 336 00:23:06,119 --> 00:23:09,720 Speaker 1: but the algorithm's findings lined up with what police suspected, 337 00:23:10,440 --> 00:23:14,400 Speaker 1: and the algorithm had inspired a reinvestigation of those cold cases. 338 00:23:15,480 --> 00:23:18,159 Speaker 1: Hargrove felt like he was on the right track and 339 00:23:18,240 --> 00:23:21,119 Speaker 1: that maybe the algorithm could be a useful tool for 340 00:23:21,240 --> 00:23:25,600 Speaker 1: law enforcement. We selected ten major cities that appeared to 341 00:23:25,640 --> 00:23:30,960 Speaker 1: have a suspicious number of algorithm-identified murders. Gary was 342 00:23:31,000 --> 00:23:35,480 Speaker 1: one of those ten. Gary, Indiana, the city right next 343 00:23:35,480 --> 00:23:39,120 Speaker 1: door to where Africa Hardy would be strangled four years later. 344 00:23:39,920 --> 00:23:44,639 Speaker 1: The algorithm flagged fifteen unsolved murders in the Gary, 345 00:23:44,680 --> 00:23:48,600 Speaker 1: Indiana area. They were all women who were strangled. Not 346 00:23:48,680 --> 00:23:53,240 Speaker 1: one of the cases was solved, which is unusual. Seventy 347 00:23:53,280 --> 00:23:57,400 Speaker 1: seven percent of female murders get cleared, but not one 348 00:23:57,520 --> 00:24:02,480 Speaker 1: of these fifteen strangulation murders in Gary was cleared. I 349 00:24:02,600 --> 00:24:06,560 Speaker 1: called the public relations officer for the Gary Police Department, 350 00:24:06,800 --> 00:24:09,560 Speaker 1: gave him my name, told him what we had found, 351 00:24:10,640 --> 00:24:13,160 Speaker 1: and said, is there a chance that you're dealing with 352 00:24:13,240 --> 00:24:17,280 Speaker 1: a serial killer?
The next day the phone rings. It's, 353 00:24:17,400 --> 00:24:21,200 Speaker 1: as I recall, a Captain Roberts, who said, 354 00:24:21,240 --> 00:24:24,760 Speaker 1: I've checked with our detectives and I can tell you 355 00:24:24,920 --> 00:24:30,200 Speaker 1: definitively that there are no unsolved serial murders in Gary, Indiana, 356 00:24:30,960 --> 00:24:35,679 Speaker 1: which is by definition an impossible statement to make. Unless 357 00:24:35,720 --> 00:24:39,239 Speaker 1: you have no unsolved murders, you cannot claim to be 358 00:24:39,280 --> 00:24:44,240 Speaker 1: definitively certain that there are no unsolved serial murders. Hargrove 359 00:24:44,320 --> 00:24:47,000 Speaker 1: felt like the police were just blowing him off, so 360 00:24:47,040 --> 00:24:51,679 Speaker 1: he started investigating the fifteen murders himself. Hargrove wanted to 361 00:24:51,720 --> 00:24:55,040 Speaker 1: know whether the algorithm was working, and he thought that 362 00:24:55,119 --> 00:24:58,879 Speaker 1: if he found evidence suggesting the murders were connected, maybe 363 00:24:59,000 --> 00:25:02,919 Speaker 1: the police would start taking him seriously. But looking up 364 00:25:02,920 --> 00:25:07,280 Speaker 1: the information about the cases wasn't easy. The FBI supplemental 365 00:25:07,320 --> 00:25:10,720 Speaker 1: homicide report didn't include the victims' names or even the 366 00:25:10,760 --> 00:25:13,960 Speaker 1: exact date of their death. It just listed a month 367 00:25:14,000 --> 00:25:17,560 Speaker 1: and a year. So to find more details about the cases, 368 00:25:18,040 --> 00:25:21,919 Speaker 1: Hargrove had to meticulously dig through old issues of local papers, 369 00:25:22,400 --> 00:25:24,679 Speaker 1: and as he started to piece things together, he was 370 00:25:24,760 --> 00:25:28,280 Speaker 1: unsettled by what he found.
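A quick aside on the arithmetic behind that red flag: if seventy-seven percent of female-victim murders are normally cleared, fifteen similar murders with zero clearances is wildly improbable by chance. Here's a minimal sketch of that kind of test in Python. This is my own illustration, not Hargrove's actual code; the binomial test and the one-percent cutoff are assumptions, and only the clearance figures come from the episode.

```python
# Illustration of the red flag described above: given a 77% national
# clearance rate, how unlikely is a cluster of 15 similar murders with
# zero clearances? A simple binomial tail probability gives the answer.
# (A sketch, not Hargrove's actual algorithm; the 1% cutoff is invented.)
from math import comb

NATIONAL_CLEARANCE = 0.77  # share of female-victim murders usually cleared


def prob_at_most(cleared: int, total: int, p: float) -> float:
    """P(X <= cleared) for X ~ Binomial(total, p)."""
    return sum(
        comb(total, k) * p**k * (1 - p) ** (total - k)
        for k in range(cleared + 1)
    )


def looks_suspicious(cleared: int, total: int, alpha: float = 0.01) -> bool:
    """Flag a cluster whose clearance count is improbably low."""
    return prob_at_most(cleared, total, NATIONAL_CLEARANCE) < alpha


# The Gary cluster: fifteen strangulations of women, none cleared.
tail = prob_at_most(0, 15, NATIONAL_CLEARANCE)  # 0.23**15, roughly 3e-10
print(tail, looks_suspicious(0, 15))
```

Under those assumptions, fifteen uncleared cases in a row has odds on the order of one in a few billion, which is the statistical sense in which a city could "appear to have a suspicious number" of algorithm-identified murders.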
He was disturbed not just 371 00:25:28,320 --> 00:25:31,840 Speaker 1: by how grotesque these murders were, but by the patterns 372 00:25:31,880 --> 00:25:35,000 Speaker 1: that seemed to link them, and he became convinced that 373 00:25:35,080 --> 00:25:38,600 Speaker 1: Northwest Indiana had a serial killer on the loose, a 374 00:25:38,720 --> 00:25:43,280 Speaker 1: strangler who was targeting young women, women just like Africa Hardy. 375 00:25:44,440 --> 00:25:46,960 Speaker 1: He tried to talk to Gary's chief of police, but 376 00:25:47,040 --> 00:25:50,280 Speaker 1: he couldn't get through. I continued to suggest, have you 377 00:25:50,359 --> 00:25:54,720 Speaker 1: really looked at these cases? Soon the police stopped returning 378 00:25:54,720 --> 00:25:59,040 Speaker 1: his calls altogether, and Hargrove needed a response. He wasn't 379 00:25:59,080 --> 00:26:02,400 Speaker 1: just playing armchair detective. We were about to publish 380 00:26:02,400 --> 00:26:05,680 Speaker 1: a story saying that Gary, Indiana has a serial killer 381 00:26:06,320 --> 00:26:10,600 Speaker 1: and the police would not talk about it. We were 382 00:26:10,640 --> 00:26:13,439 Speaker 1: afraid that the reason they weren't talking to us was 383 00:26:13,480 --> 00:26:15,719 Speaker 1: because they were hot on the heels of solving it. 384 00:26:16,080 --> 00:26:17,920 Speaker 1: That they had a suspect, that they were trying to reel 385 00:26:18,000 --> 00:26:19,879 Speaker 1: him in, and we were going to screw that up. 386 00:26:21,000 --> 00:26:25,200 Speaker 1: We needn't have worried, but that was our fear.
387 00:26:25,920 --> 00:26:29,080 Speaker 1: I even sent registered letters to the chief of police 388 00:26:29,119 --> 00:26:31,440 Speaker 1: and to the mayor saying what we were about to do, 389 00:26:32,040 --> 00:26:35,600 Speaker 1: and if there's any issue they have, or any conversation 390 00:26:35,640 --> 00:26:39,520 Speaker 1: they want to have, for heaven's sakes, call me. This 391 00:26:39,600 --> 00:26:42,480 Speaker 1: is the letter that I wrote to the chief of 392 00:26:42,520 --> 00:26:48,760 Speaker 1: police and the mayor in Gary, Indiana, and it goes: Dear Chief 393 00:26:48,840 --> 00:26:52,560 Speaker 1: Carter, Scripps Howard News Service, based in Washington, D.C., 394 00:26:52,840 --> 00:26:58,720 Speaker 1: is conducting a national reporting project looking into the thousands of 395 00:26:58,800 --> 00:27:03,120 Speaker 1: unsolved homicides committed in the United States since nineteen eighty. As 396 00:27:03,160 --> 00:27:06,600 Speaker 1: part of this project, we are investigating whether it's possible 397 00:27:06,640 --> 00:27:11,600 Speaker 1: to spot victims of serial murder among these unsolved killings 398 00:27:11,760 --> 00:27:17,720 Speaker 1: using the FBI's Supplementary Homicide Report. Using multivariate analysis, we've 399 00:27:17,760 --> 00:27:21,920 Speaker 1: determined that Gary, Indiana has an elevated number of unsolved 400 00:27:22,000 --> 00:27:25,600 Speaker 1: murders of women who were strangled in recent years. The 401 00:27:25,720 --> 00:27:29,080 Speaker 1: data that your department reported to the FBI are consistent 402 00:27:29,600 --> 00:27:33,720 Speaker 1: with the possibility that multiple-victim killers have operated in 403 00:27:33,800 --> 00:27:39,439 Speaker 1: northwestern Indiana. Broadly, we see two possible patterns. In recent years, 404 00:27:40,000 --> 00:27:43,760 Speaker 1: several women have been strangled in their homes.
In at 405 00:27:43,840 --> 00:27:47,040 Speaker 1: least two cases, a fire was set after the women 406 00:27:47,080 --> 00:27:51,640 Speaker 1: were killed. Also, starting in the nineteen nineties, we've seen 407 00:27:51,720 --> 00:27:55,800 Speaker 1: several women who were found strangled in or near abandoned buildings. 408 00:27:56,600 --> 00:28:00,399 Speaker 1: We doubt these killings are the result of convicted serial 409 00:28:00,480 --> 00:28:04,399 Speaker 1: killer Eugene V. Britt, who admitted to killing eight people. 410 00:28:06,760 --> 00:28:11,040 Speaker 1: Please note the attached list of homicide victims. We'd be 411 00:28:11,080 --> 00:28:15,160 Speaker 1: grateful if your detectives would review these cases to determine 412 00:28:15,160 --> 00:28:19,240 Speaker 1: if any might have a common perpetrator. The US Justice 413 00:28:19,280 --> 00:28:22,840 Speaker 1: Department defines a serial killer simply as anyone who kills 414 00:28:22,920 --> 00:28:28,280 Speaker 1: two or more people in separate incidents. Experienced homicide investigators 415 00:28:28,320 --> 00:28:32,440 Speaker 1: tell us it's extremely difficult to spot a serial murderer. 416 00:28:33,200 --> 00:28:36,639 Speaker 1: There have been enough unsolved killings of women in Gary, 417 00:28:36,680 --> 00:28:40,680 Speaker 1: Indiana that your metropolitan area made our top ten list. 418 00:28:41,360 --> 00:28:44,920 Speaker 1: We are contacting authorities in all ten areas. Police in 419 00:28:45,000 --> 00:28:48,560 Speaker 1: five cities have already confirmed that cases we've cited contain 420 00:28:48,600 --> 00:28:52,360 Speaker 1: proven or suspected victims of serial murder. We are also 421 00:28:52,400 --> 00:28:55,720 Speaker 1: making a similar request to the Lake County Coroner's office. 422 00:28:56,320 --> 00:28:59,320 Speaker 1: Thank you for your time and consideration.
Please help us 423 00:28:59,360 --> 00:29:05,440 Speaker 1: explore whether national crime data can assist local law enforcement. Sincerely, 424 00:29:05,480 --> 00:29:11,040 Speaker 1: Thomas Hargrove, Scripps Howard News Service. And we had total 425 00:29:11,120 --> 00:29:16,280 Speaker 1: radio silence from those people. The Gary police chief and 426 00:29:16,400 --> 00:29:19,400 Speaker 1: mayor didn't respond to Hargrove's letter or to his follow 427 00:29:19,440 --> 00:29:22,440 Speaker 1: up phone calls, but Hargrove had also sent the letter 428 00:29:22,560 --> 00:29:26,400 Speaker 1: to the county coroner. Coroners are the public officials who 429 00:29:26,480 --> 00:29:30,880 Speaker 1: oversee autopsies and determine causes of death. They work with 430 00:29:30,920 --> 00:29:34,080 Speaker 1: the police, but they're an entirely separate entity. When I 431 00:29:34,120 --> 00:29:37,920 Speaker 1: called the Lake County Coroner's Office, identified myself, I'm Tom 432 00:29:37,960 --> 00:29:40,920 Speaker 1: Hargrove calling from Washington, D.C. Oh, just a minute, 433 00:29:41,000 --> 00:29:44,360 Speaker 1: Mr. Hargrove, the senior deputy coroner wants to talk to you. 434 00:29:44,880 --> 00:29:47,520 Speaker 1: He came on and said, Mr. Hargrove, we got your 435 00:29:47,560 --> 00:29:50,160 Speaker 1: packet of information. Thank you very much for sending it. 436 00:29:50,320 --> 00:29:53,480 Speaker 1: I'm assigning it to one of our assistant coroners, a 437 00:29:53,560 --> 00:29:57,960 Speaker 1: lady named Jackie. We're going to have her look into it. 438 00:29:58,480 --> 00:30:01,720 Speaker 1: An entirely different reception than the one I got in Gary. 439 00:30:01,880 --> 00:30:05,040 Speaker 1: They agreed with us that there were too many unsolved murders. 440 00:30:05,080 --> 00:30:08,240 Speaker 1: She added three more cases that she thought belonged on 441 00:30:08,320 --> 00:30:12,240 Speaker 1: that pile. We had identified fifteen.
She added three, making 442 00:30:12,280 --> 00:30:15,960 Speaker 1: eighteen that she thought were connected, and was trying to 443 00:30:16,000 --> 00:30:19,480 Speaker 1: have a conversation with the Gary Police Department. She's never 444 00:30:19,560 --> 00:30:23,400 Speaker 1: gone on mic to talk about this case. For the 445 00:30:23,440 --> 00:30:27,440 Speaker 1: Coroner's office, it is very, very difficult to speak ill 446 00:30:27,600 --> 00:30:31,080 Speaker 1: of a police department. It's considered bad form, and so 447 00:30:31,280 --> 00:30:34,760 Speaker 1: she probably still feels a reluctance to do that, although 448 00:30:34,800 --> 00:30:38,680 Speaker 1: she has passion about this case. You shouldn't use this 449 00:30:38,800 --> 00:30:41,800 Speaker 1: recording where I named her unless she agrees. I was 450 00:30:41,840 --> 00:30:44,360 Speaker 1: looking up other names of people in that department, you know. 451 00:30:44,400 --> 00:30:47,960 Speaker 1: I think one went down for some kind of corruption charge, 452 00:30:48,160 --> 00:30:50,680 Speaker 1: and then it's kind of hard to find someone who 453 00:30:50,720 --> 00:30:55,080 Speaker 1: is there that can talk about this stuff. Yeah, now 454 00:30:55,160 --> 00:30:57,760 Speaker 1: your only hope is to get Jackie to talk. 455 00:30:58,360 --> 00:31:01,560 Speaker 1: This is really one of the most frustrating experiences of 456 00:31:01,600 --> 00:31:04,560 Speaker 1: my life. I think the Gary Police Department should be 457 00:31:04,560 --> 00:31:08,480 Speaker 1: looking at some of those old cases. They still may 458 00:31:08,480 --> 00:31:11,760 Speaker 1: have a killer out there. When I finished speaking with 459 00:31:11,800 --> 00:31:15,280 Speaker 1: Hargrove, I tried calling Jackie, but I couldn't get through.
460 00:31:16,880 --> 00:31:19,360 Speaker 1: After a couple of failed attempts, I left her a 461 00:31:19,400 --> 00:31:22,920 Speaker 1: voicemail explaining the podcast and how I was trying to look 462 00:31:22,960 --> 00:31:26,400 Speaker 1: into the murders Hargrove had identified, to see if they 463 00:31:26,400 --> 00:31:30,920 Speaker 1: were indeed connected to Africa's death. Weeks passed, and I 464 00:31:30,960 --> 00:31:34,720 Speaker 1: worked on other stories and chased down other leads, and 465 00:31:34,800 --> 00:31:37,680 Speaker 1: after what Hargrove had told me about her reluctance to 466 00:31:37,720 --> 00:31:40,280 Speaker 1: talk about the case, I didn't think she would want 467 00:31:40,280 --> 00:31:44,240 Speaker 1: to speak to me. Then one morning I woke up 468 00:31:44,240 --> 00:31:48,640 Speaker 1: to a missed call from a number I didn't recognize. Hey, Don, 469 00:31:48,720 --> 00:31:52,239 Speaker 1: this is Jackie. You had tried to contact me a 470 00:31:52,240 --> 00:31:57,400 Speaker 1: while back in regard to the Hargrove story you're doing. Yes, 471 00:31:57,960 --> 00:32:01,360 Speaker 1: you want to give me a call later next week? Um, 472 00:32:01,480 --> 00:32:04,440 Speaker 1: that would be fine. Sorry I hadn't gotten back to you sooner. 473 00:32:04,560 --> 00:32:09,200 Speaker 1: It's just everything is fine, a little different. All right, guys, 474 00:32:09,200 --> 00:32:14,560 Speaker 1: I'll talk to you soon. Maybe that's coming next episode. 475 00:32:15,560 --> 00:32:18,240 Speaker 1: They don't want to talk about it either, I assume, 476 00:32:18,360 --> 00:32:22,360 Speaker 1: because they don't want to embarrass their neighboring police agency. 477 00:32:23,120 --> 00:32:25,840 Speaker 1: All right, they sat here and they had nothing to do. 478 00:32:26,280 --> 00:32:29,240 Speaker 1: They led to the death of your friend. You should try 479 00:32:29,280 --> 00:32:31,280 Speaker 1: to find out. You'd be the first to do that.
480 00:32:33,480 --> 00:32:37,320 Speaker 1: Algorithm is released weekly on Tuesdays. Subscribe now so you 481 00:32:37,360 --> 00:32:40,120 Speaker 1: don't miss the next episode on the iHeartRadio app, 482 00:32:40,280 --> 00:32:47,280 Speaker 1: Apple Podcasts, or wherever you get your favorite shows. This 483 00:32:47,360 --> 00:32:50,280 Speaker 1: episode was written and produced by me, Ben Keebrick. 484 00:32:50,760 --> 00:32:54,400 Speaker 1: Algorithm is executive produced by Alex Williams, Donald Albright, and 485 00:32:54,440 --> 00:32:59,320 Speaker 1: Matt Frederick. Production assistance and mixing by Eric Quintana. The 486 00:32:59,400 --> 00:33:02,640 Speaker 1: music is by Makeup and Vanity Set and Blue Dot Sessions. 487 00:33:03,200 --> 00:33:08,280 Speaker 1: Thanks to Christina Dana, Miranda Hawkins, Jamie Albright, Rema El Kaili, 488 00:33:08,600 --> 00:33:15,200 Speaker 1: Trevor Young, and Josh Thane for their help and notes. Again, 489 00:33:15,280 --> 00:33:18,320 Speaker 1: thanks for listening. As a heads up, I'm still working 490 00:33:18,320 --> 00:33:21,360 Speaker 1: on this podcast as we release it, so any feedback 491 00:33:21,480 --> 00:33:24,720 Speaker 1: is appreciated. I think Algorithm is going to address some 492 00:33:24,800 --> 00:33:28,800 Speaker 1: really important issues about policing and how crimes are investigated 493 00:33:29,000 --> 00:33:32,360 Speaker 1: that don't receive enough attention. So if you can, please 494 00:33:32,440 --> 00:33:35,120 Speaker 1: leave a review on Apple Podcasts or tell a friend 495 00:33:35,160 --> 00:33:38,320 Speaker 1: about Algorithm. We're a brand-new show and could really use 496 00:33:38,320 --> 00:33:38,680 Speaker 1: your help.