Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland, an executive producer with iHeartRadio, and how the tech are you? It's time for a TechStuff classic episode. This episode is called DNA Forensics. It originally published on October seventh, two thousand fifteen. In this episode, Ben Bowlin, host of such shows as Ridiculous History as well as Stuff They Don't Want You to Know, and a tireless producer, the kind of person you can ask to be on your show and he'll say yes, even if he's already committed to like a billion other things, and he'll somehow find a way to do all of them... I don't understand this man. Anyway, he joined the show to talk about DNA forensics. Hope you enjoy this classic episode.

Speaker 1: Ben has had people mark all over him in crayon today. I'm not gonna ask why; I don't get into personal lives on the show. Well, you know, it's a big deal whenever I get to be on the show, and I wanted to do something special. My suit's at the cleaners, so I got a bunch of Sharpies and asked people to go nuts. It's kind of like body paint, but really, super on the cheap, because we just can't... we don't have that in our budget, honestly. Right, not yet. But hope springs eternal. And what's weird about having all these colors and markers all over me is that anything I touch is literally leaving a trace. Yeah, and that, you know... we were going to have a really in-depth conversation on how catalytic converters work, but once I noticed you were doing that, I thought, why don't we talk about DNA forensics, like the traces people leave behind. So that's why I decided to switch at the last minute. I hope you can roll with it. Oh yeah, yeah, yeah.
Speaker 1: And just, you know, Ben and I have talked a little bit about our mutual interest in the true crime discipline, the whole true crime field, and it turns out we're not the only ones in the office. There are certain people in the office who have a really deep interest in this sort of stuff, and so we thought, you know, it'd be kind of fun to explore the concept in different shows. So if you listen to all of the HowStuffWorks shows, you may have noticed things popping up here and there. That's not entirely by accident, and so this really was one of those things where, as we all started talking, we were like, hey, you know, I would like to do something in that vein. It's almost like an easter egg for those of you who subscribe to lots of different shows, and you should let us know if you thought it was really cool.

Speaker 1: So from the technology standpoint, we thought DNA forensics would be really interesting to cover, and to talk about how it actually works: what are the processes, what are some of the challenges, what are some of the things that people are doing with DNA forensics now that might end up helping investigations in the future, and where could it actually end up giving us a false positive, because there is the possibility of that as well. But to start it all off, we really kind of have to lay the groundwork. Yeah, I was just gonna ask you. I hate to be the bad kid in class right now, but what's DNA? Deoxyribonucleic acid, man. Yeah, that's what it is. So obviously, you know, anyone who's had a science class, like a biology class, anytime recently, you know all about DNA. But we gotta build from the ground up. So DNA is a molecule that carries the genetic instructions that govern the development, function, and reproduction of organisms. DNA is found in all of your cells.
Speaker 1: Essentially, an entire blueprint of what makes you you is in every single one of your cells, in the form of DNA. And the molecule is in that double helix form. That is to say, if you were to make a ladder and then twist it into a twisty shape, that's the DNA double helix. The rungs on that ladder are made up of pairs of what we call nucleotides. So each rung on the ladder is two different nucleotides that bond together. There's adenine and thymine; those always pair up together. So that's one of your two base pairs of nucleotides that will always join. And then there's guanine and cytosine, and those always join. And it's the sequence of these pairs that ends up determining what makes you you. Right, yeah, and the order of these pairs can affect multiple characteristics. Sure, yeah, absolutely.

Speaker 1: And also, what's really interesting to me is that ninety-nine point nine percent of all the DNA that is in you is shared with every other human. Like, we have ninety-nine point nine percent of our DNA in common, which means the stuff that makes you who you are, as different from every other person, makes up just point one percent of your DNA. But that's all it takes, that point one percent. That's about three million base pairs that are unique to you, unless you have an identical sibling. Ah, the old... now this verges into some good detective fiction. Yes, the old evil twin. Yeah, exactly, it wasn't me, it was my evil twin, or evil triplet, or evil quadruplet, which we did an ill-fated BrainStuff about. Oh my gosh, we did. We did. Yeah. If you watch BrainStuff, the video series, and you look up how twins work, Ben and I did a funny... at the time I thought, and I still think, there were parts of it that were funny. Honestly, Ben, I still fully enjoy it, but my sense of humor is very corny.
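To make the pairing rule described a moment ago concrete, here is a minimal Python sketch (an editorial illustration, not something from the episode) that builds the complementary strand under the A-T and G-C rule and repeats the rough point-one-percent arithmetic; the three-billion-base-pair genome size is an assumed round figure, chosen only so the math lands near the three million unique base pairs mentioned above.

```python
# Minimal sketch of the base-pairing rule: adenine (A) pairs with thymine (T),
# guanine (G) pairs with cytosine (C). Illustrative only.

PAIRING = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complementary_strand(strand: str) -> str:
    """Return the strand that would sit opposite this one in the double helix."""
    return "".join(PAIRING[base] for base in strand.upper())

if __name__ == "__main__":
    print(complementary_strand("ATGCGTTA"))  # -> TACGCAAT

    # Rough arithmetic: assume a genome on the order of 3 billion base pairs,
    # of which about 0.1 percent varies between unrelated individuals.
    genome_size = 3_000_000_000
    unique_fraction = 0.001
    print(f"{genome_size * unique_fraction:,.0f} base pairs differ")  # ~3,000,000
```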
Speaker 1: So, but if you want to see me and Ben dressing up in two different types of outfits, like we're the good Ben and Jonathan, and then there's the evil Ben and Jonathan, and we each have an eyepatch... Jonathan and I were talking, and it was strange, because when we were talking about doing this episode we said, well, how could we represent evil twins? Like, an eyepatch, clearly, because the goatee is not gonna work, right? Because neither of us was going to end up shaving just so that I could be the good twin. But both of us are the kind of person who would have an eyepatch. And actually, I ended up taking a quick walk to a nearby toy store to pick some up.

Speaker 1: So, at any rate, if you do have an identical sibling, your identical sibling shares your DNA. They are identical: if you were to compare the two and look at those base pairs, they're going to be the same all the way down. Right, so that's one of the exceptions. Really, the exception. So, our DNA can be found in twenty-three pairs of chromosomes; that's what humans have. Not all animals have that many, some have fewer, et cetera, et cetera. Chromosomes are essentially ribbons of protein with a strand of DNA wrapped up in them, and within each pair, one chromosome comes from your mother and one chromosome comes from your father. Those are the ingredients that come together to create the unique individual that is you, and/or your identical siblings. So, if we look at each person's DNA and pay attention to the order of those base pairs, we get something like a DNA fingerprint. It is unique to that person. But we can't just look at one section. We have to look at several different sections, also known as loci in the parlance of forensics, to get a robust fingerprint profile.
Speaker 1: So, just as we would look at a fingerprint and look for points of comparison, say, between a fingerprint that we've gathered from a suspect and a fingerprint that is left at the scene of a crime, you would have to look at several different points to make sure that all those points correspond to one another before you say that there's a match. Same thing with DNA forensics: you would look at several different locations along a strand of DNA and see if the same sequence of nucleotides appears in both sets, because that would tell you the statistical probability that the person you suspect and the person who left the evidence behind are one and the same. Right, okay. So each time there's a new location, the more loci there are, the more certitude you have that you've got your catch.

Speaker 1: Yeah, if you were to look at just one location, then there's actually quite a good chance, depending upon the sequence, that coincidence could completely explain away any duplication there. Right, so it could just be coincidence. It could be that this person just coincidentally has that same sequence. As you add more loci, that becomes less and less likely. The FBI has thirteen that they suggest, thirteen specific loci; that's their standard, and that results in about a one-in-a-billion chance. Meaning, if you were to take all thirteen loci and compare the two strands, you know, the stuff that was left as evidence and the suspect, or whatever is in the database, and they were to come up equal at all thirteen, there's a one-in-a-billion chance that somebody else besides the person you're looking at possesses that. So that's maybe six or seven other people in the world. Yeah, it's like, flash forward to that day in court where someone's doing that horrible reference joke and going, "So you're saying that there's a chance."
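As a rough illustration of the statistics being described, here is a short Python sketch (an editorial addition, not from the episode): if the genotype seen at each locus occurs in some fraction of the population, and the loci are treated as independent, the chance of a coincidental match at every locus is the product of those fractions. The twenty-percent per-locus frequencies below are made-up illustrative numbers, not real CODIS statistics.

```python
# Hedged sketch: combined random-match probability across independent loci,
# plus the expected number of coincidental matches in a large population.

from math import prod

def random_match_probability(locus_frequencies):
    """Chance an unrelated person matches at every locus purely by coincidence."""
    return prod(locus_frequencies)

if __name__ == "__main__":
    # Pretend each of 13 loci has a genotype seen in about 20% of people.
    frequencies = [0.20] * 13
    p = random_match_probability(frequencies)
    print(f"random match probability: about 1 in {1 / p:,.0f}")

    # With roughly seven billion people, the expected number of coincidental
    # matches is population * probability -- the "six or seven other people" idea.
    world_population = 7_000_000_000
    print(f"expected matches worldwide: {world_population * p:.1f}")
```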
Speaker 1: Yeah. And honestly, the people who are analyzing this stuff speak in statistical probabilities, because you cannot say for certain that this person left behind that DNA. You can say, what is the statistical probability that they did? And then you look at other elements of the case, right? Like saying, all right, can we put the person in that area? Because let's say that it's in a small town. Well, if it's a one-in-a-billion chance and you know that the suspect was in that small town, that's pretty darn compelling. Yeah, because what are the chances that one of the other six people in the entire world was also in that small town? Not good.

Speaker 1: Ben and I will be back to talk more about DNA forensics after we take this quick commercial break.

Speaker 1: So where do we get the DNA evidence from? Well, stuff that people leave behind. Pretty much anything that has cells, like living tissue that was left behind, or stuff where living cells could have been before being deposited at the crime scene. So stuff like blood, or saliva, or semen, or skin cells, mucus, ear wax, sweat. Yeah, all of that can leave behind cells that we can pull DNA from. What about hair? Hair, not so much, not for traditional DNA analysis. Hair follicles, yes, but hair itself is dead; those are dead cells. So you can do some DNA analysis, but not the standard kind used in most DNA forensics. Fingernails, same thing, but fingernails often come with other tissue attached, and that's where you find the DNA.

Speaker 1: So if we want to look at the history of people actually saying, hey, why don't we use this DNA stuff to try and help with investigations, you've got to look back to the nineteen eighties, when a Brit named Alec Jeffreys, who you may now refer to as Professor Sir Alec John Jeffreys... FRS. Hang on... okay, good. Professor Sir Alec John Jeffreys, FRS. That would be his full title.
Speaker 1: Now, he hit upon the idea of using DNA as a means of genetic fingerprinting, and he realized that the unique sequences of DNA could serve as a means to link an individual to a scene where DNA samples were found. His process was first applied in the court system in nineteen eighty-five. In that case, it was an immigration case; it wasn't a murder or a rape or something like that. It was to ascertain whether a British boy was actually related to a family who had originally immigrated to the United Kingdom from elsewhere. And he did. The first time it was used in a criminal case would be nineteen eighty-seven. That was, yeah, not long after. In that case, the suspect was named Colin Pitchfork, which is a heck of a name. Talk about nominative determinism. Yeah, and he was arrested on suspicion of rape and murder, and he was the first criminal caught as a result of DNA screening. So this was DNA screening that led to his capture. He actually confessed to his crimes, so the DNA didn't lead to his conviction; he confessed, and he received life in prison as a result.

Speaker 1: So I wanted to talk a little bit, before we get into some of the pros and cons, about what actually happens with DNA, because you hear "DNA forensics" and you're like, well, what goes into that? Yeah, this is a great thing to contextualize right now, because there are a lot of fans of TechStuff who have probably seen and scoffed at the various entertaining but inaccurate crime shows, CSI being the big one. Like, there are forensic specialists who say that CSI is probably one of the most damaging things that has happened to their career path ever, because people have unrealistic expectations. Specifically, juries have unrealistic expectations, which can hurt a trial, because juries will often want DNA data when it's not even relevant to a case.
Speaker 1: Like, it's not necessary for them to make a determination in a case, but they want it, because it's one of those things that people associate with, oh, DNA gets you the locked-in answer: was that person there or were they not there? Just run the DNA, enhance the photograph... I see what the problem is, right? Exactly. Yeah, let's pull up one of those three-dimensional holographic images. We're just throwing every single science fiction CSI trope in there.

Speaker 1: So, early forensics actually used a process called restriction fragment length polymorphism, or RFLP, and that involves taking a sample of DNA that has repeating base pairs; they can repeat anywhere between one and thirty times. They're called variable number tandem repeats, or VNTRs. And what they would do is treat this DNA with an enzyme to break the strand at specific locations along the DNA. So, saying, when there are this many repetitions, this enzyme is going to break the strand at that point, so that way we can measure how long that stretch of DNA is between the cut points. Yeah. So imagine that you've got a ribbon, right, and let's say that little ribbon is maybe three feet long, and you're going to cut out a six-inch segment of that ribbon. You use this enzyme, and it cuts at the very specific locations along that strand that you want. You do the same thing with the material that was left behind at the scene. So let's say you've got your DNA sample from your suspect, and you've got the sample from the scene, and you compare the two, and you're essentially measuring them against each other, like literally measuring the length of them, because it's those repeating pairs that determine how long that segment is.
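Here is a very simplified Python sketch of the fragment-length idea just described (an editorial illustration, not real lab chemistry): the enzyme cuts wherever its recognition sequence appears, and the length of the fragment between cuts depends on how many times the repeated motif sits between them. The recognition site, motif, and sequences are all invented for the example.

```python
# Toy RFLP-style comparison: cut two samples at a recognition site and
# compare the resulting fragment lengths. Purely illustrative.

def fragment_lengths(sequence: str, site: str) -> list[int]:
    """Cut the sequence at every occurrence of the site; return fragment lengths."""
    return [len(fragment) for fragment in sequence.split(site)]

def vntr_region(repeats: int, motif: str = "ACGT") -> str:
    """Invented region: cut site, flanking DNA, a repeated motif, flanking DNA, cut site."""
    return "GAATTC" + "TTAGC" + motif * repeats + "CCATA" + "GAATTC"

if __name__ == "__main__":
    site = "GAATTC"  # EcoRI-style recognition sequence, used only as an example
    scene_sample = vntr_region(repeats=9)
    suspect_sample = vntr_region(repeats=9)
    unrelated_sample = vntr_region(repeats=5)

    print(fragment_lengths(scene_sample, site) == fragment_lengths(suspect_sample, site))    # True
    print(fragment_lengths(scene_sample, site) == fragment_lengths(unrelated_sample, site))  # False
```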
Speaker 1: So if the two fragments are about the same length, or actually, if they are the same length, then you know, or at least you have a good inclination to say, that this person was the one who left behind that DNA. That's not really used that frequently anymore. More frequently now we use a method called short tandem repeat analysis, which is more reliable and more popular. In this method, analysts take a sample of DNA and they count the repetition of those base pairs at certain locations, the loci, of that sample. So, four- or five-base-pair repeats: like where those nucleotide pairings I talked about sometimes repeat in a sequence, right? They look for, preferably, four- or five-base-pair repeat segments, because that's less likely than if you have two or three in a row. Yeah, the more you have in a row, the less likely you're going to find that exact same repetition in an unrelated person's DNA. And these are, by the way, called tetranucleotide or pentanucleotide repetitions because of the number, tetra being four, penta being five. Those are best for indicating an accurate match. So the FBI, like I said, specifies thirteen specific loci. To find this, you would do this in thirteen different locations along the strand of DNA, and if you were to find these base-pair repeats that are identical in both samples, that's a really good indication that they belong to the same person.

Speaker 1: And this investigation technique, while it's pretty solid and there's solid science behind it, doesn't work in every case. It's not a silver bullet, and this is kind of some dark territory. Yeah, yeah. In fact, there are a lot of reasons why this can be problematic.
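To show what counting short tandem repeats could look like in code, here is a small Python sketch (an editorial illustration; the motifs, locus names, and sequences are invented, not real CODIS loci): at each locus it counts the longest run of back-to-back copies of a four-base motif and compares the counts between an evidence sample and a suspect sample.

```python
# Toy STR comparison: count consecutive copies of a tetranucleotide motif
# at each locus and compare evidence vs. suspect. Illustrative only.

import re

def repeat_count(sequence: str, motif: str) -> int:
    """Longest run of consecutive copies of the motif within the sequence."""
    runs = re.findall(f"(?:{motif})+", sequence)
    return max((len(run) // len(motif) for run in runs), default=0)

if __name__ == "__main__":
    loci = {"locus_1": "GATA", "locus_2": "TCAT"}  # pretend STR motifs
    evidence = {"locus_1": "CC" + "GATA" * 11 + "AG", "locus_2": "TT" + "TCAT" * 8 + "GA"}
    suspect = {"locus_1": "AA" + "GATA" * 11 + "CT", "locus_2": "GG" + "TCAT" * 8 + "AC"}

    for name, motif in loci.items():
        e, s = repeat_count(evidence[name], motif), repeat_count(suspect[name], motif)
        print(name, e, s, "match" if e == s else "no match")
```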
Speaker 1: One other thing I want to talk about before we get into the challenges, specifically things like contamination and chain of possession... chain of custody, thank you... before we get into that, is this. I gave these overviews of how they're analyzing the DNA, but one of the big issues here is that often, when you're in the field and you're looking for anything that has remnants of DNA on it, you may not have a very large sample to work with, right? So you've got a tiny amount of DNA. How do you make sure you can do the tests you need with a tiny little amount? And the answer is: you duplicate the crap out of it. What? How? Yeah? Okay. So this is gonna get super weird, because I'm gonna get into molecular biology and chemistry.

Speaker 1: They use a process called polymerase chain reaction, or PCR, to duplicate a specific region of the DNA in a sample. This process was developed in nineteen eighty-three by Kary Mullis, who actually won a Nobel Prize in Chemistry for his work in this field. And what they'll do is they'll take a sample of DNA, a strand of DNA. So you've got your double helix, right? And then you heat it to between ninety-four and ninety-six degrees Celsius for a few minutes. Yeah, so it's almost boiling, for a few minutes. And this is to denature the sample, which means that the DNA straightens out, so it's no longer a twisted ladder, it's just a ladder, and the rungs split apart. So those base pairs split, and you get two half strands of DNA. All right. So then you end up changing the temperature: you lower it to between fifty and sixty-five degrees Celsius for a few minutes (that first step only takes a few minutes too), and that allows the left and right primers to attach.
Speaker 1: These are small sections of DNA that have nucleotides matching the two separated pieces that you've created. Think of them as almost like half zippers. So you've got the right and left halves of a zipper, and they're spread apart from one another, and you've got a small section that interlocks with each side, because you've got the complementary base pairs. Those will then connect to those sections. Now, that's only a tiny little part of the full DNA. They then raise the temperature to seventy-two degrees Celsius for a few minutes to allow the Taq polymerase to work. Now, this is the material that can then build and synthesize new DNA onto the two separate strands. So if you think about it like a video game: you get your little segment that's locked onto the half ladder of DNA, the stuff you started off with in the first place, and at one end of that, imagine that you get a little bitty blob, and that little blob just builds the corresponding rungs and goes down the line, rebuilding the DNA. And there's one on both sides; there's a primer on each half strand of DNA. So at the end of this process you end up with two strands of DNA.

Speaker 1: Okay, we've got more to say in this classic episode of TechStuff after these quick messages.

Speaker 1: Now, you started with one, but because you've used this molecular biology slash chemistry approach, you've been able to duplicate it. And then you repeat that process, so you do it again. Those two become four, the four become eight... you see how this expands very rapidly. You do it over and over, so that way, even if you started with a very small sample of DNA, by the end you've got plenty to work with. So you don't have to worry about, you know, "We had one little drop of sweat at the scene and we blew it on a test that didn't work out."
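As a back-of-the-envelope check on the doubling described above, here is a tiny Python sketch (an editorial addition): under the idealized assumption that every cycle of denaturing, primer annealing, and extension perfectly doubles the target region, the copy count grows as two to the power of the number of cycles, so around thirty cycles turns a single starting molecule into roughly a billion copies.

```python
# Idealized PCR amplification: each thermal cycle doubles the copies
# of the targeted DNA region (perfect efficiency assumed).

def copies_after(cycles: int, starting_copies: int = 1) -> int:
    """Copy count after a number of PCR cycles, assuming perfect doubling."""
    return starting_copies * 2 ** cycles

if __name__ == "__main__":
    for cycles in (1, 2, 10, 30):
        print(f"{cycles:>2} cycles -> {copies_after(cycles):,} copies")
    # 30 cycles: 1,073,741,824 copies from a single starting molecule.
```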
Speaker 1: You don't have to worry about that. I'm sorry to be, like, emotionally or mentally a nine-year-old here, Jonathan, but can we make it a booger? I just love picturing us as cops, like, "No one knows who stole the bus. We have only this single..." The mysterious picker has struck again. Yeah, okay, so, urine? Yeah, well, we weren't talking about urine. We were talking about the boogers. All right, you got me. But this is a great... that's a great explanation of how this occurs, because given that you're essentially destroying the evidence every time that you conduct this kind of investigation, being able to reproduce it is fundamental.

Speaker 1: Yeah, it's absolutely key, because again, if you do not have very much of that material, then you really have to be careful. And there are a lot of things that can complicate this, and that's kind of where we were leading a little bit earlier. There are a lot of reasons why you cannot just say that DNA forensics is going to solve, you know, the crimes out there as long as someone's left something behind. Because even though it's versatile, even though we have this amazing capability, life is weird, and things can go wrong, and they can go wrong either accidentally or on purpose. So one thing that can happen is multiple people could be involved in an incident, a crime of some sort, and the more people who are involved, the harder it is to be absolutely certain that the DNA samples you're working with all link to a specific individual.
Speaker 1: In fact, there are currently some changes in the way DNA can be handled in court cases, to the point where it's become a legal issue in Texas and in other places as well, and so forensics labs are having to put in greater restrictions. Forensic analysts would go in to testify in court cases and say there's a one-in-a-billion chance this belonged to someone else. But if you start to factor in that there is more than one person's DNA found at the scene, and the contamination issues that result from that, then people would say, all right, well, really it's more like one in a thousand, or one in a hundred. And at that point you might say, well, the DNA evidence is not strong enough for it to be a compelling argument for the guilt or innocence of a person. Because if you're in a really dense urban area and you say there's a one-in-a-hundred chance, it's hard to say that shouldn't introduce reasonable doubt, that it doesn't meet that burden. And then you have to try to chase down all the other possibilities.

Speaker 1: And that's if there are multiple people involved. But even if there are not multiple people involved, obviously you have to be very cognizant of the possibility of contamination. Yeah, okay, we can talk about this a little bit, because this is something that you might not see in Hollywood as often as you see it in real life. Yes, exactly, Jonathan. So let's say you're the detective, right, and Noel is the prosecutor, and I'm the jabroni at the scene who was supposed to pick up the stuff and bring it in, right? Yeah. So your job is to actually go in and collect the evidence before anyone else can go through that area, right? Yeah, because as soon as you introduce other people, then you've introduced other DNA that could be left at the scene.
446 00:26:26,920 --> 00:26:29,720 Speaker 1: But I've been having a I've been having a crazy 447 00:26:29,840 --> 00:26:33,520 Speaker 1: time work lately, and I've been cutting corners a little 448 00:26:33,560 --> 00:26:36,680 Speaker 1: and everybody knows. Nobody said anything yet because it's not 449 00:26:36,800 --> 00:26:41,000 Speaker 1: a big deal yet. But here's what happens. Uh, while 450 00:26:41,320 --> 00:26:44,159 Speaker 1: I'm on while I collect the evidence. Let's see, I 451 00:26:44,200 --> 00:26:47,119 Speaker 1: get blood samples, and I'm on the way back. I 452 00:26:47,920 --> 00:26:50,439 Speaker 1: stop it cook out because my diet is as much 453 00:26:50,480 --> 00:26:55,600 Speaker 1: of a train wreck as my life, and and because 454 00:26:55,640 --> 00:26:58,920 Speaker 1: I'm personable, I shake hands with six people as I'm 455 00:26:58,960 --> 00:27:02,720 Speaker 1: walking back into our holding, right, I don't wash my hands. 456 00:27:03,320 --> 00:27:07,000 Speaker 1: And I also kept the sample for some reason, in 457 00:27:07,040 --> 00:27:10,159 Speaker 1: the bag from cookout. Yeah, that would there there might 458 00:27:10,240 --> 00:27:13,520 Speaker 1: be a chance that that was encountered some form of 459 00:27:13,600 --> 00:27:16,000 Speaker 1: contamination from the scene to the point where you get 460 00:27:16,000 --> 00:27:20,160 Speaker 1: to the lab and then you you run the d NA. Yeah, right, Well, 461 00:27:20,240 --> 00:27:24,600 Speaker 1: clearly the suspect was a roast pig. Right, Yeah, clearly 462 00:27:24,680 --> 00:27:28,040 Speaker 1: suspect was a roast pig. Or even more dangerously, Uh, 463 00:27:28,280 --> 00:27:32,240 Speaker 1: clearly the suspect, Uh, the suspect maybe someone that already 464 00:27:32,320 --> 00:27:35,760 Speaker 1: pings in our database, who just got out of prison 465 00:27:35,920 --> 00:27:39,760 Speaker 1: for grand theft auto and now works out of cookout. Yeah. Yeah, 466 00:27:39,800 --> 00:27:42,639 Speaker 1: that's I mean that that's a you know, it's it 467 00:27:42,720 --> 00:27:45,040 Speaker 1: seems like it's a convoluted example, except for the fact 468 00:27:45,040 --> 00:27:47,240 Speaker 1: that this is the source of stuff that can happen r. Yeah, 469 00:27:47,280 --> 00:27:49,800 Speaker 1: it's not. I would say it's possible, but that one 470 00:27:49,960 --> 00:27:53,240 Speaker 1: is not plausible. No, No, But but the example you 471 00:27:53,320 --> 00:27:56,359 Speaker 1: give does show that there has to be great care 472 00:27:56,760 --> 00:28:00,480 Speaker 1: the the people who come in to collect the evidence 473 00:28:01,080 --> 00:28:03,040 Speaker 1: have to do so before there can be a lot 474 00:28:03,119 --> 00:28:05,360 Speaker 1: of disturbance of the crime scene. Because the more disturbance 475 00:28:05,400 --> 00:28:07,520 Speaker 1: there there is, like I said, the more chances other 476 00:28:07,600 --> 00:28:12,000 Speaker 1: people will leave behind DNA skin cells, or or um 477 00:28:12,400 --> 00:28:14,959 Speaker 1: sweat or blood or whatever it might be. 
Speaker 1: It might be that there were other people there who had nothing to do with it, or, you know, maybe the person who stumbled upon the scene left something behind without intending to, like cutting a hand on a piece of glass or something while letting themselves in to see what had happened. Yeah, even something as simple as that. So there's that; you have to be aware of contamination there. You also have to be aware of contamination from the moment you've collected it, all the way through the testing phase. So that's where the chain of custody comes in. By the way, if you ever see people putting stuff in plastic bags in order to preserve it, that's pretty much a fiction, because plastic will contain moisture; any moisture that's in the bag will stay there, and moisture can degrade DNA samples. So usually samples are actually put in paper, so it's usually a paper envelope or a paper bag that's quickly labeled, and then there's this chain of custody that must be documented through the entire process until it gets to the lab. And then even at the lab, they have to be very careful with the equipment they're using. They have to make certain that it's completely clean, so that way you don't end up cross-contaminating from a previous test into your current test. That's happened a couple of times. There have actually been a couple of cases. Yeah, there was a case where there was a victim of one crime, and there was another crime that was committed, and the initial DNA test results from the crime that was committed came back with the victim from the other crime as a positive.
Speaker 1: And they realized that the reason that was happening was that there were two different DNA tests that had been performed, and the DNA of the victim from the first one had not been completely cleaned out of the system before they started doing the next test, and so they were getting these false positives. And they knew it couldn't have been the victim, because the victim was, well, victimized; the victim was not capable of committing that crime. So it was already one of those things that proved there was an issue here. And in almost every case, in fact I'll go ahead and say the vast majority of cases, this has to do with a person, either mistakenly or purposefully, not following procedure or not making certain that everything is on the up and up. Rather than the process itself being a failure, it's a human error, either intentional or otherwise, that typically gets introduced.

Speaker 1: And so another thing that you have to worry about is whether or not someone has purposefully introduced DNA. There have been cases where, in order to either hide one's involvement in a crime or to implicate someone else specifically in a crime, people have left behind samples of DNA in order to throw investigators off the track. Whether it is to protect yourself, like, let's say that you committed the crime and you leave behind the DNA of, you know, your hated cousin, so that your cousin takes the rap and you don't; or you're an investigator and you're like, well, there's this really awful guy, and we want to get him for this crime. We really like him for this crime, but we don't have the direct evidence on him. However, I do have this DNA from a separate incident. I can leave this behind at the crime scene to be collected, and therefore we can finally get the guy; I'm pretty sure he did it anyway, you know. Solid, solid. Yeah. So again, this is not something that happens all the time.
Speaker 1: It's not something that's even prevalent, but it's one of those things that you have to be aware of. That's why things like the chain of custody are so important to maintain. I have a question. Sure. And I don't know if I'm jumping ahead here... no, please ask... but I was kind of foreshadowing this when I was asking about hair follicles. So it sounds like the home run for DNA testing would be something, as you said, containing living cells, so blood, bodily fluid, stuff like that. But if that's a home run, the kind of stuff that people are much more likely to leave behind would be things like hair follicles or flakes of skin, you know. So what's the deal with that? How does that work? It's still the same process, in the sense that these are things that can leave behind traces of DNA. For instance, let's say that you are at a murder scene and you are investigating. One of the things you're going to look for is traces of any skin under the victim's fingernails, because that's an indication that the victim fought back against his or her murderer and may in fact have samples of that skin underneath his or her fingernails. And so you can collect that and then do the same process I was talking about: you can extract the DNA from those cells, then do the same process to duplicate that DNA, and then run it either against a suspect's DNA or against a database.

Speaker 1: We mentioned the databases briefly; want to talk a little bit about those? Yeah, let's do that, because there are a couple of different ones. There are state databases, there's a national database, and then there's the FBI's database. So, these are all databases that contain the DNA information of various people who have been booked for specific types of crimes. It's not every crime.
577 00:34:04,240 --> 00:34:07,400 Speaker 1: Don't worry, the FBI does not have your genetic blueprint 578 00:34:07,480 --> 00:34:10,520 Speaker 1: because one time you purposely parked in a handicap spot. 579 00:34:11,000 --> 00:34:14,719 Speaker 1: Although Jonathan and Nolan and I do judge you for that. Yes, 580 00:34:14,840 --> 00:34:17,839 Speaker 1: we think you should definitely never do that. If you don't, 581 00:34:18,000 --> 00:34:22,120 Speaker 1: if you do not have the handicap label, uh, 582 00:34:22,320 --> 00:34:25,120 Speaker 1: then don't park in that spot. But no, these are 583 00:34:25,239 --> 00:34:29,520 Speaker 1: specifically pretty serious crimes where that's really the only way 584 00:34:30,000 --> 00:34:34,880 Speaker 1: that the state or federal government is 585 00:34:34,880 --> 00:34:40,360 Speaker 1: allowed to collect a DNA sample from you to 586 00:34:40,600 --> 00:34:42,879 Speaker 1: use in this database. And they've got a lot of people. 587 00:34:42,960 --> 00:34:45,560 Speaker 1: I've got some statistics here too. Yeah, please hit me. Okay, 588 00:34:45,680 --> 00:34:49,719 Speaker 1: so let's go with the big one, right, Okay, 589 00:34:50,000 --> 00:34:53,120 Speaker 1: The big one here in the States is the National 590 00:34:53,320 --> 00:34:57,200 Speaker 1: DNA Index System, or NDIS. That's 591 00:34:57,239 --> 00:35:00,520 Speaker 1: the Feds, that's the FBI. It contains a little under 592 00:35:00,640 --> 00:35:05,359 Speaker 1: twelve million offender profiles, specifically eleven million, eight hundred twenty-two thousand, 593 00:35:05,480 --> 00:35:08,759 Speaker 1: nine hundred and twenty-seven. It has a little 594 00:35:08,800 --> 00:35:12,560 Speaker 1: over two million arrestee profiles and a little 595 00:35:12,600 --> 00:35:16,240 Speaker 1: over six hundred thousand forensic profiles. That's as of June. 596 00:35:18,800 --> 00:35:21,759 Speaker 1: If you visit the FBI's website, you can learn a 597 00:35:21,840 --> 00:35:27,719 Speaker 1: lot about their biometric analysis, which does also contain print work. 598 00:35:27,800 --> 00:35:30,600 Speaker 1: It's sort of a mixtape of all the stuff 599 00:35:30,680 --> 00:35:33,960 Speaker 1: that they could use to investigate. And here's the thing 600 00:35:34,040 --> 00:35:36,160 Speaker 1: you can do if you live in the US and 601 00:35:36,280 --> 00:35:38,400 Speaker 1: you would like to feel a little bit less comfortable 602 00:35:38,480 --> 00:35:42,040 Speaker 1: each day. Okay, there's a breakdown by state, so 603 00:35:42,280 --> 00:35:47,040 Speaker 1: you can see how many offender profiles are located 604 00:35:47,080 --> 00:35:50,200 Speaker 1: in your state. Here in Georgia, in our case, it's 605 00:35:50,320 --> 00:35:53,680 Speaker 1: two hundred ninety-five thousand, nine hundred and thirty-eight. 606 00:35:54,080 --> 00:35:58,520 Speaker 1: Considering the population of Georgia, that is a significant number. Right. 607 00:35:59,160 --> 00:36:02,960 Speaker 1: You can see the forensics profiles, the arrestees. You can 608 00:36:03,000 --> 00:36:08,319 Speaker 1: also see the number of investigations aided and labs participating.
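To make the database-matching idea above concrete, here is a minimal Python sketch of the two steps involved: checking whether an evidence profile matches a stored profile locus by locus, and estimating how rare a matching profile is by multiplying per-locus genotype frequencies together. Every locus name, allele value, population frequency, and database entry below is invented for illustration; this is not how CODIS/NDIS software is actually implemented.

    # Hypothetical illustration of STR profile matching and the "product rule."
    # All loci, alleles, frequencies, and offender entries are made up.

    evidence_profile = {"LOCUS_A": (12, 15), "LOCUS_B": (9, 9), "LOCUS_C": (17, 21)}

    offender_database = {
        "offender_001": {"LOCUS_A": (12, 15), "LOCUS_B": (9, 9), "LOCUS_C": (17, 21)},
        "offender_002": {"LOCUS_A": (11, 14), "LOCUS_B": (9, 10), "LOCUS_C": (17, 20)},
    }

    # Made-up population frequencies for each allele at each locus.
    allele_freq = {
        "LOCUS_A": {12: 0.08, 15: 0.11},
        "LOCUS_B": {9: 0.07},
        "LOCUS_C": {17: 0.10, 21: 0.05},
    }

    def profiles_match(a, b):
        """Two profiles 'match' when every shared locus has the same allele pair."""
        shared = set(a) & set(b)
        return bool(shared) and all(sorted(a[locus]) == sorted(b[locus]) for locus in shared)

    def random_match_probability(profile):
        """Product rule: 2pq for a heterozygous locus, p*p for a homozygous locus."""
        prob = 1.0
        for locus, (a1, a2) in profile.items():
            p, q = allele_freq[locus][a1], allele_freq[locus][a2]
            prob *= (p * p) if a1 == a2 else (2 * p * q)
        return prob

    for name, profile in offender_database.items():
        if profiles_match(evidence_profile, profile):
            print(name, "matches; random match probability ~", random_match_probability(evidence_profile))

With only three invented loci the product works out to roughly one in a million; real profiles compare a dozen or more loci, which is how the one-in-a-billion figures mentioned later in this conversation come about.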
Now, 609 00:36:09,120 --> 00:36:10,719 Speaker 1: one of the things I want to point out is 610 00:36:10,800 --> 00:36:14,279 Speaker 1: that this also goes back into the drawbacks or the 611 00:36:14,400 --> 00:36:19,000 Speaker 1: challenges of forensics, is that forensic labs can get really 612 00:36:19,120 --> 00:36:21,920 Speaker 1: backed up with this stuff, Like the backlogs can be 613 00:36:22,520 --> 00:36:25,719 Speaker 1: can be crazy because while I I you know, I 614 00:36:25,840 --> 00:36:28,560 Speaker 1: mentioned that process just for duplicating the DNA. That can 615 00:36:28,600 --> 00:36:31,520 Speaker 1: take a couple of hours to do that process, and 616 00:36:31,600 --> 00:36:34,320 Speaker 1: then of course you've got all the cleaning of the 617 00:36:34,400 --> 00:36:36,200 Speaker 1: material that has to happen in order for you to 618 00:36:36,239 --> 00:36:39,200 Speaker 1: be able to use it again. That's not that doesn't 619 00:36:39,239 --> 00:36:44,160 Speaker 1: even involve the actual analysis of the DNA that tends 620 00:36:44,200 --> 00:36:48,840 Speaker 1: to require a forensic specialist to do this. It's not 621 00:36:49,000 --> 00:36:52,160 Speaker 1: like it's all automated, although there are more and more 622 00:36:52,200 --> 00:36:56,399 Speaker 1: automated systems that help, but generally speaking, it's it's sort 623 00:36:56,440 --> 00:37:01,160 Speaker 1: of an augmented approach where you still have forensic expert 624 00:37:02,000 --> 00:37:05,040 Speaker 1: do the the look you know they're doing. They're looking 625 00:37:05,200 --> 00:37:07,800 Speaker 1: at the DNA to look at those base pairs and 626 00:37:07,880 --> 00:37:12,040 Speaker 1: actually makes certain visually that they are in fact identical. 627 00:37:12,920 --> 00:37:17,279 Speaker 1: So it takes a lot of time. And meanwhile, while 628 00:37:17,320 --> 00:37:20,160 Speaker 1: you're doing all this, more samples are coming in, so 629 00:37:20,719 --> 00:37:23,280 Speaker 1: there's a backlog that starts to build up, and depending 630 00:37:23,360 --> 00:37:26,480 Speaker 1: upon the area and the number of labs that are available, 631 00:37:27,000 --> 00:37:29,960 Speaker 1: it might be a very serious backlog. And there was 632 00:37:30,000 --> 00:37:32,480 Speaker 1: also a pre existing backlog because if we look at 633 00:37:32,520 --> 00:37:36,439 Speaker 1: how recently this occurred and off air we talked about 634 00:37:36,480 --> 00:37:38,840 Speaker 1: this in the course of your research. You send in 635 00:37:38,960 --> 00:37:42,160 Speaker 1: some great stuff about cold cases. Yeah, so there was 636 00:37:42,239 --> 00:37:46,439 Speaker 1: already a built in backlog for this technology, and they're 637 00:37:46,719 --> 00:37:51,040 Speaker 1: already been cases of people who were in jail for years, decades, Yeah, 638 00:37:51,280 --> 00:37:55,839 Speaker 1: who were innocent. Yeah, yeah, where the DNA evidence ended 639 00:37:55,920 --> 00:37:58,400 Speaker 1: up clearing them like it could not possibly have been 640 00:37:58,480 --> 00:38:03,120 Speaker 1: that person. Um. And you know, actually, that's when I 641 00:38:03,200 --> 00:38:06,200 Speaker 1: say that there are some serious restrictions and tax us 642 00:38:06,280 --> 00:38:10,279 Speaker 1: about this multi person DNA approach. 
It's specifically so that 643 00:38:10,960 --> 00:38:15,879 Speaker 1: there is every attempt to make certain that innocent people 644 00:38:15,960 --> 00:38:19,879 Speaker 1: aren't incarcerated. There is, obviously, a huge 645 00:38:20,400 --> 00:38:25,040 Speaker 1: pressure on law enforcement to assign guilt and, uh, 646 00:38:25,120 --> 00:38:29,880 Speaker 1: bring somebody in for particularly awful crimes. And there's 647 00:38:30,160 --> 00:38:32,840 Speaker 1: an enormous pressure because of course, the community wants to 648 00:38:32,880 --> 00:38:35,080 Speaker 1: feel safe, they want to feel that something is being done. 649 00:38:36,000 --> 00:38:38,600 Speaker 1: That has to be balanced against making sure you get 650 00:38:38,640 --> 00:38:42,560 Speaker 1: the right person. Yeah. Yeah, because we know that we 651 00:38:42,680 --> 00:38:47,320 Speaker 1: live in an age of instant gratification. Yeah, things should 652 00:38:47,320 --> 00:38:52,680 Speaker 1: be right immediately and right the first time and right 653 00:38:52,760 --> 00:38:56,360 Speaker 1: now and right now, yes, the three rights. But unfortunately 654 00:38:56,440 --> 00:38:59,359 Speaker 1: the wheels of justice, how does the saying go? Man, they 655 00:38:59,480 --> 00:39:03,400 Speaker 1: grind slow but exceedingly fine. Yes, as opposed to going 656 00:39:03,560 --> 00:39:05,960 Speaker 1: round and round. Yes, the wheels on the bus. I 657 00:39:06,080 --> 00:39:10,000 Speaker 1: was thinking, you know, we would be terrible lawyers. Yeah, 658 00:39:10,040 --> 00:39:12,120 Speaker 1: we would be. Actually, I know I would be. I 659 00:39:12,200 --> 00:39:15,800 Speaker 1: remember participating in a mock trial in school and not 660 00:39:16,000 --> 00:39:18,319 Speaker 1: knowing what the heck I was doing. What was your, 661 00:39:18,360 --> 00:39:21,360 Speaker 1: what was your role? I was defense, and it was terrible. 662 00:39:21,719 --> 00:39:23,840 Speaker 1: It was terrible. I did not want to do it. 663 00:39:24,000 --> 00:39:26,960 Speaker 1: I was, I was bullied into it, and 664 00:39:27,080 --> 00:39:29,520 Speaker 1: it was awful. I could see you doing like 665 00:39:29,600 --> 00:39:33,560 Speaker 1: a judge or maybe a bailiff who's over it, like, 666 00:39:34,000 --> 00:39:39,120 Speaker 1: sit down. Yeah, no, I was... Uh, my client 667 00:39:39,160 --> 00:39:42,160 Speaker 1: would have gotten the chair. It was terrible for a 668 00:39:42,320 --> 00:39:45,680 Speaker 1: very minor offense too. That's how bad I was. We'll 669 00:39:45,800 --> 00:39:49,760 Speaker 1: wrap up this discussion about DNA forensics after this quick break. 670 00:39:59,719 --> 00:40:02,799 Speaker 1: So anyway, getting back into DNA forensics. Uh so, yeah, 671 00:40:02,920 --> 00:40:06,080 Speaker 1: you mentioned cold cases. I've got one specific one I'll mention, 672 00:40:06,280 --> 00:40:09,359 Speaker 1: and it's one that has not been, um, 673 00:40:10,080 --> 00:40:12,600 Speaker 1: seen to completion yet. In other words, there hasn't 674 00:40:12,640 --> 00:40:15,600 Speaker 1: been a conviction yet in this case, but it does 675 00:40:15,719 --> 00:40:21,240 Speaker 1: show how far reaching this can go. So in December, 676 00:40:23,520 --> 00:40:26,480 Speaker 1: the body of a young lady named Krystal Lynn 677 00:40:26,920 --> 00:40:31,280 Speaker 1: Beslanowitch was found along the Provo River in Utah.
678 00:40:32,280 --> 00:40:35,400 Speaker 1: She was seventeen years old when she was killed, and 679 00:40:35,600 --> 00:40:40,000 Speaker 1: she had been sexually assaulted and murdered, uh, perhaps bludgeoned 680 00:40:40,040 --> 00:40:43,160 Speaker 1: to death with rocks. That was what the police believed 681 00:40:43,200 --> 00:40:45,440 Speaker 1: at the time. Now, the original investigator of the crime, 682 00:40:46,160 --> 00:40:49,200 Speaker 1: it was a guy who became the deputy sheriff, 683 00:40:49,239 --> 00:40:53,919 Speaker 1: I believe, Todd Bonner, decided to continue the investigation even 684 00:40:54,080 --> 00:40:57,040 Speaker 1: long after all the leads were drying up, like they 685 00:40:57,120 --> 00:41:02,600 Speaker 1: just could not find any leads, and a lab 686 00:41:02,719 --> 00:41:06,600 Speaker 1: was able to extract what's called touch DNA. It was 687 00:41:06,719 --> 00:41:10,359 Speaker 1: left behind on a granite rock that the police had 688 00:41:10,440 --> 00:41:14,880 Speaker 1: believed was used in killing this young lady, and the 689 00:41:15,320 --> 00:41:18,960 Speaker 1: lab used a vacuum instrument to pull this touch DNA 690 00:41:19,160 --> 00:41:21,719 Speaker 1: off the granite rock and then put it through this 691 00:41:21,840 --> 00:41:27,920 Speaker 1: analysis process, and the results ended up matching DNA from 692 00:41:28,120 --> 00:41:31,680 Speaker 1: a suspect that people were interested in but had no 693 00:41:31,880 --> 00:41:35,400 Speaker 1: direct connection to the crime. The suspect's name was Joseph 694 00:41:35,520 --> 00:41:39,960 Speaker 1: Michael Simpson, and they got the sample DNA from a 695 00:41:40,080 --> 00:41:44,520 Speaker 1: discarded cigarette butt he had tossed. He tossed a cigarette butt, 696 00:41:44,680 --> 00:41:47,520 Speaker 1: the cops scooped it up, they tested the DNA, they 697 00:41:47,520 --> 00:41:52,719 Speaker 1: found a match. They arrested him back in, uh. He 698 00:41:53,200 --> 00:41:56,680 Speaker 1: has a previous conviction for murder. He had actually been 699 00:41:56,760 --> 00:42:01,759 Speaker 1: out on parole for eight months before Beslanowitch's 700 00:42:01,880 --> 00:42:04,239 Speaker 1: death. Yeah, so he had been in jail for 701 00:42:04,360 --> 00:42:07,640 Speaker 1: several years but got paroled, and then, uh, eight months 702 00:42:07,760 --> 00:42:11,640 Speaker 1: later Beslanowitch was dead, and he's been linked to 703 00:42:11,719 --> 00:42:15,040 Speaker 1: this and arrested for the crime. Now that being said, 704 00:42:15,760 --> 00:42:18,000 Speaker 1: the last I checked into this case, you know, that 705 00:42:18,080 --> 00:42:20,439 Speaker 1: was back in, uh... The last I checked into this case, 706 00:42:20,560 --> 00:42:24,399 Speaker 1: it still hasn't been tried. There's been a 707 00:42:25,719 --> 00:42:30,640 Speaker 1: request for more evidence on the prosecution side, including, uh, 708 00:42:30,800 --> 00:42:36,279 Speaker 1: an actual DNA sample from Simpson himself to confirm that 709 00:42:36,440 --> 00:42:39,839 Speaker 1: the findings are in fact accurate, so in other words, 710 00:42:39,920 --> 00:42:43,360 Speaker 1: not just from the cigarette, but from Simpson in custody. 711 00:42:43,880 --> 00:42:47,600 Speaker 1: And then there's also a request to get a print 712 00:42:47,680 --> 00:42:51,239 Speaker 1: sample because of a partial print that was left behind 713 00:42:51,320 --> 00:42:55,520 Speaker 1: on the victim herself.
So, uh, this case is not 714 00:42:55,680 --> 00:42:58,920 Speaker 1: one that's like cut and dry and it's definitive, but 715 00:42:59,080 --> 00:43:02,440 Speaker 1: it does indicate that this approach is able to 716 00:43:02,520 --> 00:43:07,040 Speaker 1: start pulling up connections that otherwise would have been unlikely 717 00:43:07,200 --> 00:43:10,040 Speaker 1: or even impossible to make. And this brings us to, 718 00:43:10,600 --> 00:43:13,320 Speaker 1: this is just a sidebar, okay, this brings us to 719 00:43:14,160 --> 00:43:16,960 Speaker 1: a dangerous thing. And you know, of course, that I, 720 00:43:17,160 --> 00:43:20,040 Speaker 1: who can sometimes be a cartoon of myself, am 721 00:43:20,120 --> 00:43:23,600 Speaker 1: required to mention this. What do you think about the 722 00:43:23,719 --> 00:43:28,200 Speaker 1: idea of blanket DNA sampling? That is, sampling every citizen. You know, 723 00:43:28,440 --> 00:43:32,759 Speaker 1: some prominent members of the UK legal system have advocated 724 00:43:32,840 --> 00:43:37,000 Speaker 1: this for all British citizens, and Kuwait is doing the 725 00:43:37,080 --> 00:43:40,120 Speaker 1: same thing. Well, let me put it to you this way. Okay. 726 00:43:42,920 --> 00:43:45,200 Speaker 1: There's always the argument that some people will make that 727 00:43:45,320 --> 00:43:47,239 Speaker 1: if you're not doing anything wrong, then what do you 728 00:43:47,320 --> 00:43:49,439 Speaker 1: have to fear? Right, Well, here's what you have to fear, 729 00:43:49,680 --> 00:43:53,480 Speaker 1: I'm gonna tell you. Okay. So there have been at 730 00:43:53,520 --> 00:43:57,839 Speaker 1: least a couple of companies that have shown that through 731 00:43:58,200 --> 00:44:01,279 Speaker 1: a little bit of your DNA, they can do a 732 00:44:01,440 --> 00:44:05,839 Speaker 1: very similar process to duplicating DNA, which means that they 733 00:44:05,880 --> 00:44:09,160 Speaker 1: can synthesize your DNA, which means then that if your 734 00:44:09,200 --> 00:44:12,720 Speaker 1: DNA can be synthesized, it could be created and dropped somewhere, 735 00:44:12,840 --> 00:44:15,640 Speaker 1: even if you had never been to that place. Oh wow, 736 00:44:15,880 --> 00:44:18,640 Speaker 1: So all of a sudden, you get a summons 737 00:44:18,800 --> 00:44:23,879 Speaker 1: for some horrendous crime in, uh, Iceland or something. You say, 738 00:44:24,000 --> 00:44:26,439 Speaker 1: I've never been... Yeah, this is the weirdest thing, because 739 00:44:26,440 --> 00:44:29,719 Speaker 1: I've never been there. Like, but this is matched to 740 00:44:29,800 --> 00:44:32,600 Speaker 1: your DNA. There's a one in a billion chance that 741 00:44:32,719 --> 00:44:35,799 Speaker 1: someone else did this, um, and you know that's 742 00:44:36,080 --> 00:44:38,520 Speaker 1: that's a thing. Like, we're in a world where 743 00:44:38,560 --> 00:44:42,080 Speaker 1: technologically it is possible to do this. Now, is that 744 00:44:42,239 --> 00:44:46,520 Speaker 1: likely to happen? It's definitely, like, it's in the 745 00:44:46,800 --> 00:44:50,440 Speaker 1: realm of possibility, but not plausibility.
However, as long as 746 00:44:50,480 --> 00:44:54,000 Speaker 1: it's possible, then I would argue that it is too 747 00:44:54,120 --> 00:44:58,920 Speaker 1: invasive to demand from your population that everyone submit to 748 00:44:59,280 --> 00:45:03,319 Speaker 1: d n A like submitting a DNA sample and yeah, yeah, 749 00:45:03,400 --> 00:45:05,920 Speaker 1: what about well, let's take a step further. What about 750 00:45:06,000 --> 00:45:10,200 Speaker 1: the idea that there would be what about the idea 751 00:45:10,360 --> 00:45:13,840 Speaker 1: that this stuff, which is you set a blueprint in 752 00:45:14,000 --> 00:45:18,319 Speaker 1: some ways. Also it's it's similar to metadata. Okay, let's 753 00:45:18,400 --> 00:45:21,680 Speaker 1: I can see where you're saying. So the ability then 754 00:45:22,000 --> 00:45:25,399 Speaker 1: to build this enormous sample size let's say the entire 755 00:45:25,520 --> 00:45:28,000 Speaker 1: population of the UK. I think right now they're only 756 00:45:28,080 --> 00:45:31,320 Speaker 1: at maybe five of the population because you have to 757 00:45:31,400 --> 00:45:35,040 Speaker 1: get you know, you have to get pinched. So if 758 00:45:35,120 --> 00:45:38,920 Speaker 1: they had this enormous sample size, then they could start 759 00:45:39,040 --> 00:45:44,760 Speaker 1: comparing and collating and analyzing this stuff on a larger 760 00:45:44,920 --> 00:45:50,000 Speaker 1: scale such that they would be able to possibly again 761 00:45:50,120 --> 00:45:55,440 Speaker 1: possibly not plausibly, uh predict um not epo genetic trends, 762 00:45:55,480 --> 00:45:59,320 Speaker 1: but but predict the likelihood of someone incurring a certain 763 00:45:59,400 --> 00:46:03,239 Speaker 1: disease or something. Well, we're getting into more of a 764 00:46:03,320 --> 00:46:08,160 Speaker 1: genomic sequencing at that point. Yeah, yeah, when you're when 765 00:46:08,160 --> 00:46:11,160 Speaker 1: you're getting into genomic sequencing, it's it's much further, it's 766 00:46:11,239 --> 00:46:14,920 Speaker 1: much it's a much longer process. Because again, this is 767 00:46:15,120 --> 00:46:18,200 Speaker 1: very close to when they call a genetic fingerprinting. It 768 00:46:18,760 --> 00:46:20,759 Speaker 1: makes sense to call it that because you're really just 769 00:46:20,880 --> 00:46:25,680 Speaker 1: looking at the physical resemblance of two strands, right, Like 770 00:46:25,880 --> 00:46:28,360 Speaker 1: like two drawings and there are two drawings of ladders, 771 00:46:28,440 --> 00:46:30,239 Speaker 1: and if the two drawings of ladders are the same, 772 00:46:30,320 --> 00:46:32,120 Speaker 1: then you know you've got a one and a billion 773 00:46:32,280 --> 00:46:36,480 Speaker 1: chance of it not being that person. So that's a 774 00:46:36,560 --> 00:46:39,640 Speaker 1: lot different than going through and identifying things like which 775 00:46:39,760 --> 00:46:42,520 Speaker 1: genes do want I mean, we still don't even know, right, 776 00:46:43,880 --> 00:46:46,600 Speaker 1: So in case you guys are not terribly familiar with 777 00:46:46,760 --> 00:46:50,799 Speaker 1: with genetics. The genes can be pretty complicated things. Think 778 00:46:50,800 --> 00:46:54,720 Speaker 1: about like a giant switchboard. 
Right, You've got an enormous switchboard, 779 00:46:54,760 --> 00:46:59,120 Speaker 1: and there's like a thousand switches on little metal toggle switches, 780 00:46:59,200 --> 00:47:03,480 Speaker 1: the classic up down toggle switches. Right, unlabeled. That you 781 00:47:03,600 --> 00:47:07,440 Speaker 1: have a bank of lightbulbs in front of you, also unlabeled. 782 00:47:07,840 --> 00:47:10,759 Speaker 1: You flip one switch and one lightbulb comes on. You 783 00:47:10,800 --> 00:47:13,920 Speaker 1: flip a second switch, that lightbulb stays on. Three other 784 00:47:14,000 --> 00:47:16,479 Speaker 1: lightbulbs come on. You turn off the first switch. Only 785 00:47:16,600 --> 00:47:18,839 Speaker 1: one lightbulb goes off, and you start thinking, Okay, wait, 786 00:47:18,960 --> 00:47:22,240 Speaker 1: what how is this? Well, that's the thing about genes 787 00:47:22,360 --> 00:47:24,960 Speaker 1: is that they it's not so simple as to say 788 00:47:25,000 --> 00:47:29,040 Speaker 1: that this one gene is in charge of this one trait. 789 00:47:29,800 --> 00:47:32,400 Speaker 1: It's it can be much more complicated, where it's a 790 00:47:32,640 --> 00:47:36,000 Speaker 1: a selection of genes that some are active, some are 791 00:47:36,040 --> 00:47:40,120 Speaker 1: not active. Um So, because of that, even if you've 792 00:47:40,120 --> 00:47:44,960 Speaker 1: got all the DNA from an entire population, you might 793 00:47:45,040 --> 00:47:48,279 Speaker 1: be able to say, well, this one person suffered from 794 00:47:48,600 --> 00:47:52,839 Speaker 1: a particular inherited disease. Let's examine the DNA and then 795 00:47:53,160 --> 00:47:55,080 Speaker 1: compare it to other people who have suffered from that 796 00:47:55,160 --> 00:47:57,160 Speaker 1: same disease and see where the points of comparison are. 797 00:47:57,239 --> 00:48:00,600 Speaker 1: But that is I mean, it's a monu mental task 798 00:48:00,800 --> 00:48:06,760 Speaker 1: because you just it's beyond taking points along a strand 799 00:48:07,040 --> 00:48:10,879 Speaker 1: and comparing the them against a second sample. Right, it's 800 00:48:10,960 --> 00:48:14,399 Speaker 1: it's a it's another it's almost like a an order 801 00:48:14,480 --> 00:48:17,160 Speaker 1: of magnitude greater in the amount of effort that you 802 00:48:17,239 --> 00:48:19,360 Speaker 1: have to take. Yeah, that's a really good way to 803 00:48:19,440 --> 00:48:22,080 Speaker 1: put it. And that that a squatches some of my 804 00:48:22,760 --> 00:48:25,920 Speaker 1: apien predictions. I do have one other question, all right, 805 00:48:26,239 --> 00:48:31,960 Speaker 1: So we talked about in identical twins, right there, there 806 00:48:32,160 --> 00:48:37,279 Speaker 1: is another, Um, there's another possibility where a person could 807 00:48:37,400 --> 00:48:40,600 Speaker 1: get pinched with the wrong DNA. Are you going with 808 00:48:40,680 --> 00:48:44,399 Speaker 1: a clone? What's your Well, there's a there's another possibility. Wait, 809 00:48:44,520 --> 00:48:46,840 Speaker 1: maybe not. It's not the same as identical twins, but 810 00:48:47,480 --> 00:48:54,200 Speaker 1: it throws another monkey wrench into this. Uh uh chimeras. 
811 00:48:54,920 --> 00:49:01,319 Speaker 1: Oh interesting. Yeah, okay, uh, all right. Well, I want 812 00:49:01,360 --> 00:49:03,279 Speaker 1: to hear your thought process on this, because this is 813 00:49:03,360 --> 00:49:06,719 Speaker 1: not something I specifically looked into. Because chimerism, it's not 814 00:49:06,960 --> 00:49:11,120 Speaker 1: that it's super rare. Yeah, it's not like it's just 815 00:49:11,280 --> 00:49:15,080 Speaker 1: an episode of SVU or something. Um, and so yeah, 816 00:49:16,000 --> 00:49:18,759 Speaker 1: it's true, though it sounds crazy. And you guys talked 817 00:49:18,760 --> 00:49:20,960 Speaker 1: about this on one of your other shows, right? Yeah, 818 00:49:20,960 --> 00:49:24,400 Speaker 1: on Forward Thinking, we talked about chimeras, and yeah. It 819 00:49:24,480 --> 00:49:26,040 Speaker 1: was one of those things where the more you 820 00:49:26,120 --> 00:49:30,480 Speaker 1: talked about it, the more unsure I 821 00:49:30,680 --> 00:49:34,440 Speaker 1: was that I was reflecting reality, because it seems so weird. 822 00:49:34,920 --> 00:49:38,200 Speaker 1: It seems very, very strange. So it's a person composed 823 00:49:38,280 --> 00:49:44,279 Speaker 1: of two genetically distinct types of cells. So you might have, um, 824 00:49:44,760 --> 00:49:46,920 Speaker 1: I think the first time it was discovered it was 825 00:49:47,000 --> 00:49:49,640 Speaker 1: related to blood type, right? Yeah, I believe so. I 826 00:49:49,680 --> 00:49:54,080 Speaker 1: believe you're correct. So somebody had more than one blood type, 827 00:49:54,120 --> 00:49:56,920 Speaker 1: which is already so trippy to me. I just felt 828 00:49:56,960 --> 00:50:00,680 Speaker 1: like, to do justice to this topic, 829 00:50:01,000 --> 00:50:04,279 Speaker 1: we would have to mention that that is one of 830 00:50:04,360 --> 00:50:10,880 Speaker 1: those very, very exceedingly rare cases where DNA testing is 831 00:50:10,920 --> 00:50:14,480 Speaker 1: not, again, a silver bullet. Yeah, so you could, in 832 00:50:14,600 --> 00:50:19,600 Speaker 1: a bizarre, like, this is almost like a science fiction novel. Yeah, 833 00:50:19,760 --> 00:50:21,879 Speaker 1: Like to the point where you're like, for this to work, 834 00:50:23,200 --> 00:50:25,440 Speaker 1: so many things would have to fall in line perfectly 835 00:50:25,520 --> 00:50:27,840 Speaker 1: that you might as well say it's impossible. But imagine 836 00:50:27,880 --> 00:50:30,360 Speaker 1: that you have a scenario in which you have a 837 00:50:30,440 --> 00:50:33,520 Speaker 1: chimera and DNA is left behind at the scene, but 838 00:50:33,600 --> 00:50:36,920 Speaker 1: it's only one type of DNA somehow, and then the 839 00:50:37,040 --> 00:50:40,600 Speaker 1: sample they get is somehow just the other type of DNA, 840 00:50:40,719 --> 00:50:47,440 Speaker 1: thus exonerating your perpetrator, um, practically. Yeah, there's no, 841 00:50:48,480 --> 00:50:52,080 Speaker 1: it's impossible. The only way that could really affect 842 00:50:52,160 --> 00:50:54,880 Speaker 1: it is if there were somehow a chimera, uh, 843 00:50:55,800 --> 00:50:59,240 Speaker 1: involved in this scene and it became a contaminating factor, 844 00:51:00,120 --> 00:51:03,360 Speaker 1: then they would say, well, aside from the victim, it 845 00:51:03,480 --> 00:51:05,960 Speaker 1: seems, it appears that there were three people here.
Yes, 846 00:51:06,239 --> 00:51:09,279 Speaker 1: that would certainly, that would certainly cause problems, right, 847 00:51:09,360 --> 00:51:12,680 Speaker 1: that would certainly cause confusion in the whole process. But 848 00:51:12,760 --> 00:51:14,760 Speaker 1: it's also so rare for someone to be both a chimera and a criminal. 849 00:51:14,960 --> 00:51:18,839 Speaker 1: I think it's already, yeah, beggaring belief. Right, right, 850 00:51:18,920 --> 00:51:21,360 Speaker 1: like, you already have... Like if you 851 00:51:21,520 --> 00:51:24,160 Speaker 1: think of the population of people who have some form 852 00:51:24,280 --> 00:51:29,000 Speaker 1: of that, you know, the chimera DNA thing going on, 853 00:51:29,600 --> 00:51:33,319 Speaker 1: and then within that population, what percentage of those people 854 00:51:33,400 --> 00:51:38,120 Speaker 1: are murderers, yeah, are committing these sorts of crimes. 855 00:51:38,600 --> 00:51:42,799 Speaker 1: It's got to be a pretty, pretty small number. Sorry, man, 856 00:51:42,920 --> 00:51:44,560 Speaker 1: I'm sorry, I just had to bring it up. No, no, 857 00:51:44,760 --> 00:51:46,759 Speaker 1: it's fine, like, you know, it's, you know... But 858 00:51:47,040 --> 00:51:51,200 Speaker 1: what if... So, one other thing I wanted to talk about. 859 00:51:51,320 --> 00:51:54,520 Speaker 1: I almost forgot about this. So did you read up 860 00:51:54,600 --> 00:52:00,319 Speaker 1: about the, um, the technology of reconstructing a person's face 861 00:52:00,680 --> 00:52:06,359 Speaker 1: using just DNA material? And you, like, you had 862 00:52:06,400 --> 00:52:09,200 Speaker 1: sent that to me off air. And I was initially 863 00:52:09,400 --> 00:52:13,239 Speaker 1: skeptical when I was looking at it, but I'm out. 864 00:52:13,400 --> 00:52:16,759 Speaker 1: It's called Snapshot, and it's from a company called 865 00:52:16,800 --> 00:52:20,520 Speaker 1: Parabon, and Snapshot, what it attempts to do 866 00:52:20,760 --> 00:52:24,680 Speaker 1: is take the information from a DNA sample and create 867 00:52:25,120 --> 00:52:28,359 Speaker 1: essentially a police sketch of a person, a three 868 00:52:28,520 --> 00:52:34,160 Speaker 1: dimensional, uh, representation of what a person might look like. Now, 869 00:52:34,239 --> 00:52:36,480 Speaker 1: when I say might look like, you've got to be 870 00:52:36,760 --> 00:52:39,560 Speaker 1: super generous with this, because if all you have is 871 00:52:39,600 --> 00:52:42,239 Speaker 1: the DNA, if that's all you have, like, you don't 872 00:52:42,360 --> 00:52:46,920 Speaker 1: have any knowledge of what the person's face looks like 873 00:52:47,000 --> 00:52:50,440 Speaker 1: otherwise, how old they are, their height, or anything. What 874 00:52:50,640 --> 00:52:54,080 Speaker 1: you'll be able to do is probably approximate their skin tone, 875 00:52:54,680 --> 00:52:59,759 Speaker 1: their ethnicity, their gender, at least their biological gender, 876 00:52:59,800 --> 00:53:02,960 Speaker 1: their hair color, their eye color, that kind of stuff, 877 00:53:03,040 --> 00:53:05,080 Speaker 1: whether or not they have freckles, that kind of thing. 878 00:53:06,040 --> 00:53:08,640 Speaker 1: But beyond that, you're not gonna be able to tell 879 00:53:08,640 --> 00:53:10,840 Speaker 1: their age.
You're not gonna tell their height or weight, so 880 00:53:11,040 --> 00:53:13,960 Speaker 1: you can't tell how heavy-set or thin 881 00:53:14,120 --> 00:53:16,120 Speaker 1: they may be. You don't know how much, how 882 00:53:16,239 --> 00:53:19,600 Speaker 1: much, like, what sort of wrinkles would you need to 883 00:53:19,600 --> 00:53:22,919 Speaker 1: add in if they're, you know, if they're older. Yeah, 884 00:53:22,920 --> 00:53:25,360 Speaker 1: you wouldn't know any of that. Uh, and you wouldn't 885 00:53:25,400 --> 00:53:28,359 Speaker 1: know their skull shape, like you wouldn't know their face shape, right, 886 00:53:28,520 --> 00:53:31,160 Speaker 1: like the DNA wouldn't tell you. So what you can do 887 00:53:31,320 --> 00:53:34,200 Speaker 1: is create, like, a very generic looking person but with 888 00:53:34,400 --> 00:53:38,000 Speaker 1: those traits. So it may not be so useful in 889 00:53:38,160 --> 00:53:41,919 Speaker 1: the sense of using this as a means of trying 890 00:53:41,960 --> 00:53:45,400 Speaker 1: to track down a suspect; there it may not come in handy. 891 00:53:45,480 --> 00:53:48,800 Speaker 1: Where it might help is if you have unidentified remains. 892 00:53:49,680 --> 00:53:52,760 Speaker 1: So let's say you've found the remains of a person 893 00:53:53,040 --> 00:53:56,080 Speaker 1: and you're able to extract some DNA information, but you're 894 00:53:56,120 --> 00:54:00,560 Speaker 1: not able to ascertain the identity of this person. Uh, 895 00:54:00,800 --> 00:54:03,880 Speaker 1: this would allow you, if you have the person's 896 00:54:04,040 --> 00:54:06,320 Speaker 1: skull, like if that's part of the remains 897 00:54:06,360 --> 00:54:08,800 Speaker 1: that are left behind, you then know at least the 898 00:54:08,880 --> 00:54:12,440 Speaker 1: dimensions of the skull. And there are also other technologies 899 00:54:12,719 --> 00:54:15,680 Speaker 1: that allow people to approximate what a person's face looks 900 00:54:15,719 --> 00:54:19,680 Speaker 1: like based upon their skull shape. So combining those two, 901 00:54:20,360 --> 00:54:22,759 Speaker 1: where you say, all right, this is what they 902 00:54:22,840 --> 00:54:25,800 Speaker 1: probably look like based upon the shape of 903 00:54:25,880 --> 00:54:30,120 Speaker 1: their skull, plus here are the characteristics that they had 904 00:54:30,320 --> 00:54:32,960 Speaker 1: according to their DNA, then you might be 905 00:54:33,120 --> 00:54:36,560 Speaker 1: able to create a few different looks for that particular 906 00:54:36,719 --> 00:54:41,399 Speaker 1: individual that might help in identifying who that person was. Yeah, 907 00:54:41,520 --> 00:54:45,480 Speaker 1: and that, I think, is the most tremendous possibility 908 00:54:45,640 --> 00:54:49,640 Speaker 1: of this technology, absolutely right, because we're seeing already that 909 00:54:50,600 --> 00:54:54,000 Speaker 1: the study of DNA and the application 910 00:54:54,239 --> 00:54:58,720 Speaker 1: of this sort of science has fundamentally changed the nature 911 00:54:58,880 --> 00:55:02,120 Speaker 1: of crime and investigation.
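As a rough illustration of the kind of trait prediction just described (and emphatically not Parabon's actual Snapshot model), a toy version might score a handful of SNP genotypes toward one eye-color category or another. Every SNP identifier, weight, and genotype in this sketch is made up; the point is only that the output is a set of likelihoods for visible traits rather than an identification.

    import math

    # Purely illustrative sketch of DNA phenotyping ("molecular police sketch").
    # The SNP identifiers, weights, and genotypes below are invented.

    sample_genotypes = {"snp_eye_1": 2, "snp_eye_2": 1, "snp_eye_3": 0}  # copies of the "effect" allele
    weights = {"snp_eye_1": 1.5, "snp_eye_2": 0.6, "snp_eye_3": -1.2}    # push toward blue (+) or brown (-)

    def predict_eye_color(genotypes):
        """Toy logistic score: weighted allele counts -> probability of 'blue' eyes."""
        score = sum(weights[snp] * count for snp, count in genotypes.items())
        p_blue = 1.0 / (1.0 + math.exp(-score))
        return {"blue": round(p_blue, 3), "brown": round(1.0 - p_blue, 3)}

    print(predict_eye_color(sample_genotypes))  # e.g. {'blue': 0.973, 'brown': 0.027}

That kind of likelihood output is why the technique is better suited to describing unidentified remains or narrowing leads than to positively naming a suspect.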
Yeah, to the point where 912 00:55:02,320 --> 00:55:07,160 Speaker 1: again it can affect juries, uh and there and their 913 00:55:07,680 --> 00:55:11,480 Speaker 1: perception of a case to the So so it can 914 00:55:11,520 --> 00:55:13,560 Speaker 1: be a frustration, right, like if you have if you 915 00:55:13,640 --> 00:55:17,040 Speaker 1: have other lines of evidence that clearly indicate that the 916 00:55:17,120 --> 00:55:20,440 Speaker 1: person accused of the crime has committed it, but because 917 00:55:20,800 --> 00:55:24,520 Speaker 1: there was there was no DNA evidence, or maybe there 918 00:55:24,600 --> 00:55:27,080 Speaker 1: was some problem with the chain of custody, that can 919 00:55:27,760 --> 00:55:31,439 Speaker 1: create enough doubt in a jury's mind, a jury that's 920 00:55:31,440 --> 00:55:33,640 Speaker 1: been conditioned to believe the DNA evidence is the end 921 00:55:33,640 --> 00:55:36,440 Speaker 1: all be all that it can it can cause problems 922 00:55:36,719 --> 00:55:38,520 Speaker 1: in that case. This is this is the thing, is 923 00:55:38,560 --> 00:55:42,200 Speaker 1: that human beings were messy, right, Like we're not just 924 00:55:42,320 --> 00:55:44,680 Speaker 1: messy and that we leave DNA behind. We're messy in 925 00:55:44,800 --> 00:55:49,080 Speaker 1: the way we try to process information. And so sometimes 926 00:55:49,520 --> 00:55:52,920 Speaker 1: you know, when you go through an entire process of 927 00:55:53,600 --> 00:55:58,080 Speaker 1: when a crime is committed, to figuring out who potentially 928 00:55:58,160 --> 00:56:01,759 Speaker 1: did it, to apprehend that person, to then trying that 929 00:56:01,880 --> 00:56:04,840 Speaker 1: person for the crime, to then deciding whether or not 930 00:56:04,920 --> 00:56:08,279 Speaker 1: they're guilty. I mean, there's so much stuff going on 931 00:56:08,480 --> 00:56:12,000 Speaker 1: through that whole process. I hope you enjoyed that classic 932 00:56:12,080 --> 00:56:16,160 Speaker 1: episode with Ben Bolan guest hosting. Was a lot of fun. 933 00:56:16,239 --> 00:56:19,480 Speaker 1: Having him on always is fun. Occasionally you might hear 934 00:56:19,600 --> 00:56:22,320 Speaker 1: me pop on his show Ridiculous History as a character 935 00:56:22,360 --> 00:56:26,600 Speaker 1: called the Quister, where I am even more obnoxious than 936 00:56:26,680 --> 00:56:29,520 Speaker 1: I am on this show. And I know that's hard 937 00:56:29,560 --> 00:56:32,880 Speaker 1: to believe, but it is true. If you would like 938 00:56:33,000 --> 00:56:35,200 Speaker 1: to reach out with suggestions for topics I should cover 939 00:56:35,280 --> 00:56:36,960 Speaker 1: on future episodes of Tech Stuff, there are a couple 940 00:56:36,960 --> 00:56:39,160 Speaker 1: of different ways of doing that one is to download 941 00:56:39,239 --> 00:56:42,319 Speaker 1: the I heart Radio app, navigate over to the text 942 00:56:42,360 --> 00:56:45,880 Speaker 1: Stuff page, use little microphone icon, record a message up 943 00:56:45,920 --> 00:56:47,680 Speaker 1: to thirty seconds in length letting me know what you 944 00:56:47,680 --> 00:56:50,560 Speaker 1: would like me to cover, or you can reach out 945 00:56:50,600 --> 00:56:52,960 Speaker 1: on Twitter. The handle for the show is text Stuff 946 00:56:53,280 --> 00:56:57,239 Speaker 1: H s W and I'll talk to you again really soon. 947 00:57:03,480 --> 00:57:06,480 Speaker 1: Text Stuff is an I heart Radio production. 
For more 948 00:57:06,560 --> 00:57:09,960 Speaker 1: podcasts from I heart Radio, visit the i heart Radio app, 949 00:57:10,080 --> 00:57:13,240 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.