Speaker 1: Welcome to Stuff You Should Know from HowStuffWorks.com.

Hey, and welcome to the podcast. I'm Josh Clark, and there's Chuck Bryant, and there's Jerry, and there's some papers in front of me, and then a lamp and a big old microphone, and it's all pressing in on me really hard. That means this is Stuff You Should Know. That means Josh has gone on vacation. When the microphone is the size of a watermelon, it's just hitting you in the face. Yeah, I'm like, it doesn't normally abrade my chin like it is today. So yeah, Chuck. Josh, how are you doing? I'm great. You got some fingerprints there? Let me see. Um, I have four on my fingers and one each on each thumb. Yeah. Are you one of those guys on, um... Oh, let me tell you a little story about where your little fingerprints came from, back when Chuck, little Chuck, was just a ten-week-old fetus. Okay, so 1970, a certain point in 1970. Well, I was born in March of '71, so just go back nine months from there. Okay, whatever that is. Uh, 1970.
So, your little basal cell layer of your skin. You've got three: you've got the epidermis, the outer layer; the basal layer, which is in between, and that's where all the new skin cells are produced; then there's the dermis below that. Those are your three layers. About that time, your little basal layer started going haywire, producing skin cells at a much faster rate than your epidermis and your dermis, which meant that your basal layer was growing up against your epidermis and your dermis, squishing together. Yeah. And so when this would happen, when it would grow up against, say, your epidermis, it would create a point of contact, and that point of contact would create enough pressure that your basal layer would buckle a little bit. And what's weird about that was that your little basal layer buckled in what appeared to be little patterns, little whorls, little swoops, little circles. But what's neat, Chuck, is at this point, within the next six weeks after it started, you had fingerprints that are going to stay the same for the rest of your life, just beneath your epidermis.
Yeah, the tiny little Chuck fingers are now sort of tiny little man fingers, but they are the same fingerprints throughout my entire life. That's right. And you can, it's true, damage your fingerprints. Some people have purposely, which we'll talk about. But for the most part, since it's your basal layer where the actual fingerprint is, even if you cut your epidermis, which happens, your skin will grow back and your basal layer will remain the same. Your fingerprints will remain the same. That's a great way to end this. Good night. Uh, that was a great little story. And I would grow up to be a sociopath podcaster with those very same fingerprints. I don't think you're sociopathic. I'm just kidding. Um, so we're talking fingerprints, and you mentioned the little ridges, little whorls, and we're not making these words up. W-H-O-R-L-S is actually what it's called, whorls. Valleys and, uh, loops. Loops, arches. Arches. Those are the three. I don't know where you got valleys. Um, well, it's in the pattern, but that's not a part of the official fingerprint classifications.
No, no, no, that's not a part of the classic... Um, but each are unique. And we all know this because, you know, that's why they use fingerprinting as one of the biometric, uh, sciences to classify people and identify people. Um, there's a one in sixty-four billion chance that your fingerprint will match exactly with someone's. So get this. Sir Francis Galton was the one who said that, and he was saying that, through his classification system, legally speaking, as far as what would be admissible in court, probably there would be a one in sixty-four billion chance of matching people's fingerprints up. He also thought that if you went down to a more granular level and looked at people's fingerprints, there was probably a better chance that people would have fingerprints that matched. And if you take Galton's kind of liberal view of matching fingerprints, you have a one in sixty-four billion chance of having matching fingerprints with somebody, just looking at the patterns. Um.
Since about a hundred billion people have lived in the history of humanity, that means that there's at least one pair of people who have ever lived who had the same fingerprints. And that's if you subscribe to his numbers. Yeah, from eighteen-whatever, which may be overestimating it. Our fingerprints may be slightly more similar than you think. Interesting. Yeah, well, they are more unique than DNA, because we all know, if you listened to the twins podcast, that twins can share a lot of DNA, but they can't share fingerprints. No, it's different, it's very different. Um, all right, so let's get into this. Fingerprints, um, are actually made of ridges called friction ridges, and they have little pores underneath them, and it's the pores where you leak, like, sweat and oils, and that's actually what the mark you're leaving as a fingerprint is, coming up through those pores at those friction ridges. You're not leaving skin behind. No, you're just leaving a little bit of steam, that's right. And they're a really popular way of, uh... probably the most popular biometric right now, because everyone's leaving fingerprints.
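The back-of-the-envelope math behind that "at least one pair" claim can be sketched out. This is purely illustrative, taking the two numbers from the discussion above at face value: Galton's 1-in-64-billion match figure and a rough 100 billion humans ever born; neither is a modern forensic estimate.

```python
# Expected number of matching fingerprint pairs among everyone who has
# ever lived, under Galton's 19th-century figure. Illustrative only.
from math import comb

P_MATCH = 1 / 64_000_000_000   # Galton's 1-in-64-billion estimate
PEOPLE = 100_000_000_000       # rough count of humans ever born

pairs = comb(PEOPLE, 2)             # every possible pair of people
expected_matches = pairs * P_MATCH  # expected number of matching pairs

print(f"possible pairs: {pairs:.3e}")            # ~5.000e+21
print(f"expected matching pairs: {expected_matches:.3e}")  # ~7.812e+10
```

With those assumptions, the expected count comes out to tens of billions of matching pairs, so "at least one pair" is a considerable understatement; the real caveat is how much the conclusion hinges on Galton's old estimate.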
Everyone's got fingers. Well, not everyone. They're easy to classify, they're easy to sort. Um, they do mention in this article that, like, you could probably do the same thing with, like, toe prints, but, um, no one wants to ask all these criminals to take off their shoes and socks and take toe prints, especially not if they just defecated in the back seat of the police cruiser. Plus, you're more likely, and they even mentioned this in the article, you're more likely to leave a fingerprint than a toe print. They don't even put that in here. A lot of people... that was the most obvious. Uh, yeah, you know, you're not gonna commit a crime barefoot. And some people do. But nowadays it's virtually too late, because we've amassed such a database of fingerprints that, like, a bare footprint would be almost useless unless you had the person, the suspect. It was about one in every six people have their fingerprints on record. Yeah, and that doesn't mean that one in every six people have been charged with a crime. But there's a lot of ways that fingerprints make it into the fingerprint database.
AFIS. Uh, yeah. People use them as signature verification these days, as, um, to identify victims, um, job applications sometimes. The first time mine made it in was because my dad took me to the public library to have me fingerprinted as a child. Really? Yeah. It's just, so now, like, we have your kid on file in case he ever goes missing. Yep. After that whole Adam Walsh thing, that was it. Like, anything anybody said, like, this could help if your kid is kidnapped or whatever, parents just did it in the early eighties. My parents did. No, no, I don't think I'm on file anywhere, really, with my fingerprints. That's good, is it? I guess so. Sure, man, your fingerprints are your own. Yeah, maybe so. Um, and that is one in six people. And apparently the iPhone 5S, the rumor is, is going to have fingerprint authentification instead of your passcode. Wow, it's pretty neat. We'll see. That's the rumor. We used to have laptops here that had that. Well, yeah, you did. I never used mine. Yeah, I did. I thought it was pretty cool. Yeah. Thumbprint, right? Uh, whatever you wanted.
I think it was really either your thumb, your forefinger, or your middle finger. I would have done my big toe, right? It would have been like, isn't it obvious? Um, there's, I guess, cars. Some cars now have biometric, um, I guess, ignitions. Yeah, and we might as well talk about why that stinks. Yeah, because in Malaysia... Yeah, in Malaysia, they cut off a guy's finger to get into his Mercedes. And that is a worry for police, as fingerprinting becomes more and more used as authentification, uh, that people are gonna start cutting fingers off to do so. Did you say authentification? Authentification. Authentication. Authentication. What, do you add in another syllable? Oh boy. So, hey, don't get on me about words. Yeah, I can't. Okay. Um, there's one that I won't bring up. What? The "deletrious." Deleterious. I just say it in my own way. No, "deletrious" is not a word. It is. You're deleting it. It deletes things. It's negative. All right. So, um, and if you... this case.
Uh, so the biometric companies, who said, well, we can't have our customers' fingers being cut off, said, well, now we'll just add a little something that detects blood flow. So now our customers are just kidnapped rather than have their fingers cut off. Or the kidnapper doesn't know this, and you still have your car, but you're out a finger, right. You should probably put, like, that kind of thing on ads on buses, yeah, or put it on your car, like, oh yeah, car will not start if finger is detached. Yeah, because think about it, it could be even worse. If the finger doesn't work, they're like, oh, maybe it's his other finger. Just keep cutting fingers off, and you're like, no, it needs a pulse. All right. So we talked about the friction ridges. Yeah, they're called that, just remember, because they buckle under the pressure, the friction up against the dermis, and what can't take the pressure buckles. Um. But it forms a specific pattern, and this is what, you know, the arrangement, shape, size, and number of lines is what they're looking for when they're identifying and comparing these things.
172 00:10:18,520 --> 00:10:21,680 Speaker 1: And there's three different patterns there can be. Uh, there's 173 00:10:21,720 --> 00:10:25,240 Speaker 1: loops begins on one side, curves up and around and 174 00:10:25,320 --> 00:10:28,120 Speaker 1: exits the other. Yeah, look at your fingers while we describe. 175 00:10:28,640 --> 00:10:31,600 Speaker 1: There are radio loops and owner loops, um radial slope 176 00:10:31,600 --> 00:10:35,719 Speaker 1: toward the thumb, owner towards the little finger, and intuitive 177 00:10:35,800 --> 00:10:39,160 Speaker 1: to tell which way that the slope is going. Oh yeah, 178 00:10:39,600 --> 00:10:41,560 Speaker 1: just yeah, because technically you could be like, well no, 179 00:10:41,600 --> 00:10:43,280 Speaker 1: I think it's so much way to turn your head. 180 00:10:44,559 --> 00:10:49,200 Speaker 1: The aforementioned whirls are circular or spiral in nature, and 181 00:10:49,400 --> 00:10:52,319 Speaker 1: arches slope up and then down like, and they're described 182 00:10:52,360 --> 00:10:58,080 Speaker 1: here as narrow mountains. Yeah, it just goes. I think 183 00:10:58,120 --> 00:11:01,920 Speaker 1: that summed it up. So those are the identifying marks 184 00:11:01,920 --> 00:11:04,600 Speaker 1: on your fingers, um that you can see the naked eye. 185 00:11:04,640 --> 00:11:07,640 Speaker 1: And then if you're in law enforcement, they're gonna be 186 00:11:07,679 --> 00:11:10,880 Speaker 1: also analyzing something called minutia, which you can't see with 187 00:11:10,920 --> 00:11:13,880 Speaker 1: the naked eye. No. And these are basically like further 188 00:11:14,040 --> 00:11:17,120 Speaker 1: characteristics of the loops and worlds and arches. So you 189 00:11:17,280 --> 00:11:20,840 Speaker 1: might you might have a spur, which is another um 190 00:11:20,880 --> 00:11:24,600 Speaker 1: like world that comes off of a larger whirl um. 
191 00:11:24,760 --> 00:11:27,160 Speaker 1: Or there's an abrupt end to a ridge, or there's 192 00:11:27,200 --> 00:11:31,520 Speaker 1: bifurcation or islands. It's like a world within a world. Um. 193 00:11:31,559 --> 00:11:35,280 Speaker 1: There's deltas, which are like ridges that form like y patterns, 194 00:11:35,360 --> 00:11:37,640 Speaker 1: just a little stuff like that, and they all form 195 00:11:38,040 --> 00:11:42,880 Speaker 1: this classification system that the cops rely on when they 196 00:11:42,920 --> 00:11:46,599 Speaker 1: fingerprint you. And the the science, the forensic science of 197 00:11:46,679 --> 00:11:53,480 Speaker 1: fingerprinting is called deactyloscopy. That's right, like pterodactyl um. And 198 00:11:54,080 --> 00:11:56,480 Speaker 1: I guess there are probably some places still that do 199 00:11:56,559 --> 00:11:58,840 Speaker 1: it the old fashioned way UM and don't have digital 200 00:11:58,840 --> 00:12:05,200 Speaker 1: scanning finger methods. Montana hello, Montana, Um. They would do 201 00:12:05,240 --> 00:12:08,040 Speaker 1: it like you've seen it on countless TV shows and movies. 202 00:12:08,440 --> 00:12:13,079 Speaker 1: They would clean your hand off, dry it off with alcohol. Yeah, 203 00:12:13,120 --> 00:12:14,959 Speaker 1: they want to get all the sea bulm off, get 204 00:12:14,960 --> 00:12:18,840 Speaker 1: all the sum off, roll the fingertip um, and then 205 00:12:19,120 --> 00:12:20,960 Speaker 1: I usually say left or right, but I imagine you 206 00:12:20,960 --> 00:12:24,120 Speaker 1: could do either way to get the ink on the finger. 207 00:12:25,360 --> 00:12:27,880 Speaker 1: Make sure it's fully covered. Then you roll onto the 208 00:12:27,920 --> 00:12:31,800 Speaker 1: card from fingernail to fingernail, from one side to the other. 209 00:12:32,120 --> 00:12:34,400 Speaker 1: That is called a rolled fingerprint. Yeah. 
You do this with all eight fingers and two thumbs, and, uh, then you've got your set of rolled prints. Then they take your hands and cover all your fingers with the ink and just have you press them down flat at the bottom of the fingerprint card. And, uh, those are... that's a set of flat prints, which are used, apparently, to, um, verify the, uh... yeah, exactly, so they have two sets, basically. Yeah. Um, if you live in the modern world and you live in a large city, you're probably gonna have digital scanners doing the same thing. Um, it's an optical scanner: you basically put your fingers on there and it, through magic, converts that into digital data patterns, and then they have programs that map those points, and basically it's sort of like you see in the movies. Yeah, and what's neat about the optical scanners, the picture that it makes is the inverse. Yeah, yeah, that makes sense. So it's, uh, a negative image, right. The whorls and everything have more light bounce off of them, so they tend to be lighter.
Um, whereas, like, the valleys and everything in between the friction ridges are darker. And I'm gonna go out on a limb and say it's more accurate than rolled prints. I might be wrong. You're probably right, from what I've read. Sure. Yeah, because, you know, when someone's like... you ever seen First Blood? No? Yeah, you have. Stallone. The vampire show? No, no, that's True Blood. First Blood, the first Rambo picture. Remember when he was being difficult when they were booking him and he wasn't doing the fingerprinting right? Right, they couldn't get good prints because Rambo wouldn't stand for it. No, he doesn't take any crap. All right. There's two types of prints: visible prints, that we talked of... um, actually, we didn't talk about them. They're there if you actually leave an indentation in something, a visible print, like dirt or clay or something, blood or something like that. Blood would be a good example. It's visible. You can look and say, there's a fingerprint, exactly. There's also latent prints, which we leave everywhere all the time. Um.
And those are the ones that are just made with the sebum coming out of our fingers, the pores on our friction ridges, and those are typically not necessarily visible to the human eye. Like, if you look at, like, yes, stainless steel, you'll probably see some prints. Clear glass, maybe, right. But those are technically latent prints. Um, they also can be, um, invisible to the naked eye, and so they have to be, um, dusted. Yeah, and they actually do dust with a little brush. Um, I looked up the dusting powder, fingerprinting powder, and apparently most of it these days is proprietary, so, like, you don't know exactly what's in it. Really? I just guess, like, carbon? Who knows. Maybe. I know, you should formulate your own: Josh's fingerprinting powder, you know, market it to some local law enforcement agencies. Yeah, I'll sell it like snake oil or something, like Uncle Peppi's patented A-1 blue ribbon fingerprinting powder.
Uh, so they lift the prints, um, these latent prints, and those are, obviously, the prints, too, that you see criminals in TV shows and movies always trying to wipe off with a hanky, right. Very wisely. Yeah, I wonder if... what the... if that really works? You want to get into that? Do you know? Yes, it should. Okay, it should work, from what I understand. I read this one paper that was basically, like, we should not be using fingerprints in court any longer. That, first of all, we have DNA now, and DNA is objective. It's like, this protein sequence is the same as this protein sequence, right? Whereas fingerprinting is subjective. Even though there's an extensive classification system, it's: I think that this print matches up to this print. It's not quantitative, or if it is, it's not quantitative enough.
Also, there's so much faith placed by the public in fingerprint analysis that people who do this work frequently are matching stuff like... they're taking a great rolled print and comparing it to, like, just the worst smudged print on the planet, where, you know, you can be like, I don't get that, because I'm not a dactyloscopist, but I'm sure that somebody who is could figure this out. And supposedly that's a lot of faith that we're placing in people. And now that DNA evidence is becoming more and more, um, available and prevalent and widespread, it's starting to show, like, um, fingerprinting has actually probably put a lot of innocent people behind bars, and we really shouldn't rely on it anymore. Because even if you have a great print that you took, a latent print, right, yeah... And there are plenty, like, they're not shysters or frauds or crooks, because even within their profession, a lot of them are like, there's a lot of recklessness going on here. Um, that you're never going to have a really great latent print.
It's never gonna be good, and so you're working from a deficit every time. And you're also comparing it to a rolled print, and a rolled print is also not the same as another rolled print made right after. Like, you can take somebody's finger and roll it from fingernail to fingernail, pick it up, put it on the next little box, roll the same finger from fingernail to fingernail, and you're going to have, basically, two different prints. So that's in the olden days, before they had the digital scanning, though, right? But those are still a lot of the ones on file, I imagine. Yes. And I feel like we're probably still dealing with the same deficits. The paper I read was from, like, 2005 or 2007, so it's not like it was old, and they're saying, like, this is still going on with the advent of digital scanning.
Plus, I think you know as well as I do, when you're in court and the attorney yells at the top of his lungs, "His fingerprints were found all over the murder scene," you're toast. Well, yeah, and I think that a lot of perpetrators think that, like, if they have your prints, that's it, like they have you dead to rights, so you might as well confess. And I'm sure it's a great tool for getting confessions. Yeah. I wonder if DNA thwarts... I wonder if anything like that actually thwarts people from committing crimes, you know? I wonder if anyone ever stops and goes, boy, now with DNA... I would think, you can't just wipe down a crime scene. Like, if I drop a hair on the carpet, then I could be, you know, nabbed. Sure, I would think that's a pretty good deterrent. Yeah, I just don't know, in the criminal mind, how that operates. I'm sure it makes them operate a lot less sloppy than they used to. You know, that's a good point. All right, let's talk a little bit about the history, because it's pretty interesting.
I think, um, that 331 00:19:25,000 --> 00:19:29,520 Speaker 1: in ancient Babylon they actually, uh, pressed fingertips in clay 332 00:19:29,560 --> 00:19:33,520 Speaker 1: for some business transactions. Pretty advanced. I mean, thousands 333 00:19:33,520 --> 00:19:36,760 Speaker 1: of years ago people already understood, like, that fingertips are unique, 334 00:19:36,840 --> 00:19:39,440 Speaker 1: and of course the Chinese were always ahead of the 335 00:19:39,480 --> 00:19:42,920 Speaker 1: game on everything, it seems like. They actually used ink 336 00:19:43,040 --> 00:19:46,920 Speaker 1: on paper for business transactions and to help identify their kids, 337 00:19:47,200 --> 00:19:49,480 Speaker 1: like my dad. Yeah, I thought it was kind of weird, though, 338 00:19:49,800 --> 00:19:52,960 Speaker 1: why would you need to identify your child? Oh well, if they 339 00:19:53,119 --> 00:19:56,040 Speaker 1: grow up, if they're kidnapped and taken to another village, 340 00:19:56,280 --> 00:19:58,960 Speaker 1: I guess, so they grow up and return to claim 341 00:19:58,960 --> 00:20:03,600 Speaker 1: their birthright. Yeah, that's a good point. Sure. Um, they 342 00:20:03,720 --> 00:20:07,639 Speaker 1: didn't use them for identifying criminals until 343 00:20:07,640 --> 00:20:10,960 Speaker 1: the nineteenth century, and there's a series of events, 344 00:20:10,960 --> 00:20:14,240 Speaker 1: sort of, just not necessarily connected, that happened one after the 345 00:20:14,240 --> 00:20:16,960 Speaker 1: other to sort of advance it at the same time. 346 00:20:17,720 --> 00:20:21,320 Speaker 1: The first was a guy named Sir William Herschel, 347 00:20:21,359 --> 00:20:26,560 Speaker 1: an Englishman who was chief magistrate of the Hooghly District 348 00:20:26,960 --> 00:20:32,399 Speaker 1: in India, and he started recording fingerprints when signing documents. 349 00:20:32,440 --> 00:20:34,960 Speaker 1: So that's kind of the first thing, right.
Then you 350 00:20:35,000 --> 00:20:38,720 Speaker 1: mentioned Henry Faulds. He came around next, Scottish doctor. Yeah, 351 00:20:38,800 --> 00:20:41,320 Speaker 1: he was. Um, I guess he got into Japanese pottery 352 00:20:41,320 --> 00:20:44,880 Speaker 1: and noticed like the fingerprints left behind by the artists, 353 00:20:45,600 --> 00:20:49,320 Speaker 1: and um, he started getting into fingerprinting. So he wanted 354 00:20:49,359 --> 00:20:52,160 Speaker 1: to create a classification system and said, I'm not gonna 355 00:20:52,200 --> 00:20:54,720 Speaker 1: do it myself. There's Charles Darwin, I'll 356 00:20:54,800 --> 00:20:56,359 Speaker 1: ask him to do it. He's pretty good at that, 357 00:20:56,680 --> 00:20:59,480 Speaker 1: and Darwin said, I'm kind of busy, but I have 358 00:20:59,520 --> 00:21:02,320 Speaker 1: another cousin. His name is Sir Francis Galton, and he's 359 00:21:02,359 --> 00:21:05,879 Speaker 1: going to be into this. Galton, he was a eugenicist. I 360 00:21:05,880 --> 00:21:10,600 Speaker 1: wonder if Darwin was like, oh god, it's, you know, 361 00:21:11,080 --> 00:21:15,960 Speaker 1: another classification thing again. I'll just pass him off to the cousin, 362 00:21:16,680 --> 00:21:20,200 Speaker 1: cousin Frank. Yeah, to cousin Frank. So, like you said, 363 00:21:20,240 --> 00:21:23,080 Speaker 1: he was a eugenicist, um, and I feel like 364 00:21:23,080 --> 00:21:26,480 Speaker 1: we've talked about him before. He pops up here and there, 365 00:21:26,760 --> 00:21:30,760 Speaker 1: just pretty big in, um, in this era. Yeah, he 366 00:21:30,840 --> 00:21:33,280 Speaker 1: was the first dude to really start kind of collecting 367 00:21:34,200 --> 00:21:37,920 Speaker 1: biometric information on people, um, not just fingerprints, all kinds 368 00:21:37,920 --> 00:21:41,000 Speaker 1: of stuff.
And being a eugenicist, he decided that there 369 00:21:41,080 --> 00:21:45,840 Speaker 1: was a perfect human and we should selectively breed ourselves. 370 00:21:45,920 --> 00:21:49,879 Speaker 1: That's right. Yeah, we're not promoting his work, by the way. Um, 371 00:21:49,920 --> 00:21:53,159 Speaker 1: although in 1892 he wrote a book called Fingerprints and 372 00:21:53,240 --> 00:21:55,840 Speaker 1: he outlined his system, first time it had ever been done, 373 00:21:56,600 --> 00:21:58,960 Speaker 1: and it was the basis of the system we know today, 374 00:21:59,119 --> 00:22:03,159 Speaker 1: arches, loops and whorls. Uh. And then in France, a 375 00:22:03,200 --> 00:22:07,720 Speaker 1: guy named Alphonse Bertillon. Now he's made maybe four appearances 376 00:22:07,760 --> 00:22:11,560 Speaker 1: in our podcast before. I knew I knew that name. Um, 377 00:22:11,640 --> 00:22:15,160 Speaker 1: he was at the same time using his own system 378 00:22:15,200 --> 00:22:20,639 Speaker 1: called, uh, Bertillonage, and uh, anthropometry is basically what 379 00:22:20,680 --> 00:22:22,959 Speaker 1: he was doing. Yeah, because remember he was like working 380 00:22:22,960 --> 00:22:26,000 Speaker 1: in the Paris police department and like he saw the 381 00:22:26,040 --> 00:22:28,439 Speaker 1: same criminals come and go, but they used aliases. So 382 00:22:28,520 --> 00:22:32,040 Speaker 1: he devised the system of like measuring their face and 383 00:22:32,119 --> 00:22:34,639 Speaker 1: head and their ears and all that. Definitely sketchy. He 384 00:22:34,680 --> 00:22:37,280 Speaker 1: was definitely that one. Yeah, definitely. Um. And one 385 00:22:37,320 --> 00:22:40,760 Speaker 1: of them also was fingerprinting too, but his system 386 00:22:40,880 --> 00:22:43,639 Speaker 1: was extremely exhaustive, even though it was adopted by the 387 00:22:43,760 --> 00:22:46,080 Speaker 1: London police.
I believe. It was just, it was 388 00:22:46,200 --> 00:22:51,600 Speaker 1: really time consuming, sure, but he was advancing the art, 389 00:22:51,960 --> 00:22:55,080 Speaker 1: like eight of them at once. Uh. And then about 390 00:22:55,080 --> 00:22:57,760 Speaker 1: the same time in Argentina, um, a police officer named 391 00:22:57,800 --> 00:23:01,760 Speaker 1: Juan, uh, Vucetich. You wanna try that? Ah, it sounds good, 392 00:23:01,840 --> 00:23:08,919 Speaker 1: doesn't sound Argentinian, but, um, so he actually used fingerprinting. This 393 00:23:08,960 --> 00:23:12,880 Speaker 1: is in 1892, in a case to convict a mother 394 00:23:12,960 --> 00:23:15,080 Speaker 1: who had killed her two kids, when in fact she 395 00:23:15,160 --> 00:23:18,840 Speaker 1: was saying it was her boyfriend. He actually matched 396 00:23:18,840 --> 00:23:21,760 Speaker 1: fingerprints and she confessed, and like he had her. The first 397 00:23:21,760 --> 00:23:24,680 Speaker 1: case, right, where they were actually being used to, uh, 398 00:23:24,800 --> 00:23:26,600 Speaker 1: convict a criminal. Yeah, that was the first time it 399 00:23:26,720 --> 00:23:32,080 Speaker 1: ever happened. In 1892, in Buenos Aires. Um, I think 400 00:23:32,240 --> 00:23:36,720 Speaker 1: the following year, a guy named Sir Edward Henry. 401 00:23:36,760 --> 00:23:39,399 Speaker 1: He was the commissioner of the police department. He became 402 00:23:39,440 --> 00:23:44,000 Speaker 1: interested in using fingerprinting to fight crimes. Um. And he 403 00:23:44,200 --> 00:23:49,920 Speaker 1: came up with a classification system that further extended Galton's. Um. 404 00:23:50,000 --> 00:23:53,719 Speaker 1: So he came up with, I believe, the minutiae, and um, 405 00:23:53,760 --> 00:23:57,520 Speaker 1: I guess the kind of the comparable points that we 406 00:23:57,720 --> 00:24:01,679 Speaker 1: rely on still today. It's called the Henry Classification System.
And 407 00:24:01,720 --> 00:24:06,040 Speaker 1: when you see on TV, um, you know, a fingerprint 408 00:24:06,040 --> 00:24:09,040 Speaker 1: fed into a computer and like it flashes through, for 409 00:24:09,080 --> 00:24:12,480 Speaker 1: some reason, all of the fingerprints that it's matching them 410 00:24:12,480 --> 00:24:15,800 Speaker 1: against, that is using the Henry classification system that this 411 00:24:15,840 --> 00:24:19,760 Speaker 1: guy created. And that's pretty awesome. Uh. In nineteen o one, 412 00:24:19,800 --> 00:24:23,719 Speaker 1: Scotland Yard established the Fingerprint Bureau, its first one, and 413 00:24:23,760 --> 00:24:25,960 Speaker 1: then they used them as evidence the following year for 414 00:24:26,000 --> 00:24:28,520 Speaker 1: the first time, and then the year after that in 415 00:24:28,560 --> 00:24:31,359 Speaker 1: New York they started using it in state prisons. And 416 00:24:31,400 --> 00:24:33,640 Speaker 1: then the FBI said it's not a bad idea, let's 417 00:24:33,640 --> 00:24:36,399 Speaker 1: get on board. Exactly. So everybody's getting on board. The 418 00:24:36,440 --> 00:24:40,879 Speaker 1: Henry system like really allowed a system of classification that 419 00:24:40,920 --> 00:24:44,959 Speaker 1: could be used anywhere, um, to be devised, and it 420 00:24:45,040 --> 00:24:50,600 Speaker 1: was adopted. The problem was it was extremely time consuming too. Yeah, 421 00:24:50,640 --> 00:24:53,920 Speaker 1: you're matching paper to paper basically, and you're doing it 422 00:24:54,119 --> 00:24:57,679 Speaker 1: with a magnifying glass. Yeah, the computer systems that 423 00:24:57,720 --> 00:25:04,720 Speaker 1: we see today on TV, those are going through, you know, millions of, say, um, 424 00:25:04,720 --> 00:25:08,600 Speaker 1: possible matches.
Even if you had like a thousand, how 425 00:25:08,600 --> 00:25:11,280 Speaker 1: many detectives would it take to just look through a 426 00:25:11,359 --> 00:25:14,520 Speaker 1: set of prints, your control prints, and then, you know, 427 00:25:14,720 --> 00:25:19,119 Speaker 1: another print, to find a killer or something like that 428 00:25:19,160 --> 00:25:21,439 Speaker 1: from a latent print. Yeah, they call it minutiae for 429 00:25:21,440 --> 00:25:26,080 Speaker 1: a reason. Exactly. I imagine those guys went kind of nuts, um, 430 00:25:26,160 --> 00:25:29,119 Speaker 1: and that's if they even had a fingerprint on file, 431 00:25:29,480 --> 00:25:32,480 Speaker 1: like they were counting on someone having one, because at the time, 432 00:25:32,520 --> 00:25:35,120 Speaker 1: you know, it was only criminals that had fingerprints on file. Well, 433 00:25:35,160 --> 00:25:38,119 Speaker 1: they probably were like, this looks like a man's thumb print. 434 00:25:38,640 --> 00:25:40,880 Speaker 1: Let's go through all the thumb prints of the men 435 00:25:40,960 --> 00:25:44,119 Speaker 1: we have on file and see if we can catch somebody. Thumby. 436 00:25:44,400 --> 00:25:47,399 Speaker 1: It was much better if like you could catch a suspect, 437 00:25:47,920 --> 00:25:50,359 Speaker 1: print them and then compare it. But that's, you know, 438 00:25:50,440 --> 00:25:54,600 Speaker 1: not necessarily what they were doing. Luckily, we created computers 439 00:25:54,600 --> 00:25:57,199 Speaker 1: to be our mindless slaves for this kind of stuff, 440 00:25:57,680 --> 00:26:01,200 Speaker 1: starting around the seventies. Yeah, and Japan, it was their 441 00:26:01,280 --> 00:26:04,320 Speaker 1: National Police Agency, was the first one to use this 442 00:26:04,400 --> 00:26:07,520 Speaker 1: kind of automation, in the nineteen eighties, and they created 443 00:26:07,560 --> 00:26:12,520 Speaker 1: the Automated Fingerprint Identification System, AFIS.
Its slogan was warm, 444 00:26:12,600 --> 00:26:16,600 Speaker 1: fuzzy happiness. That's what it says in quotes in their name, 445 00:26:17,000 --> 00:26:22,639 Speaker 1: the Automated Fingerprinting Concern. Uh, so they used it in 446 00:26:22,680 --> 00:26:26,840 Speaker 1: the US, um, to great effect. Although the problem 447 00:26:26,960 --> 00:26:30,440 Speaker 1: was it wasn't integrated. Like, you know, they didn't share 448 00:26:30,480 --> 00:26:34,200 Speaker 1: information between agencies or between districts of law enforcement. 449 00:26:34,240 --> 00:26:36,480 Speaker 1: So you're kind of just stuck with whoever you had 450 00:26:36,520 --> 00:26:40,200 Speaker 1: on file, right, like, even though it was computerized. Yeah, yeah, 451 00:26:40,320 --> 00:26:44,800 Speaker 1: a particular police department, maybe even a statewide police department 452 00:26:45,040 --> 00:26:47,800 Speaker 1: if you're lucky, could buy an AFIS system. Yeah, 453 00:26:47,920 --> 00:26:51,880 Speaker 1: but like that was your AFIS system. Fortunately, the FBI said, hey, 454 00:26:51,920 --> 00:26:54,720 Speaker 1: there's this awesome thing called the Internet. Yeah, should 455 00:26:54,760 --> 00:26:57,640 Speaker 1: I say fortunately? If you're a criminal, not necessarily fortunately, 456 00:26:57,720 --> 00:27:00,480 Speaker 1: or if you're into privacy and all that. Um. But they, 457 00:27:00,560 --> 00:27:06,159 Speaker 1: the FBI, I think, um, created the Integrated AFIS, um, 458 00:27:06,160 --> 00:27:10,560 Speaker 1: which basically plugged all of these databases together and created 459 00:27:10,560 --> 00:27:13,639 Speaker 1: one huge database that the FBI maintains. They screwed up their 460 00:27:13,640 --> 00:27:19,920 Speaker 1: acronym though, IAFIS. Yeah, it's terrible. Um.
So yeah, 461 00:27:19,960 --> 00:27:25,040 Speaker 1: now there's one in six Americans who has their fingerprints 462 00:27:25,040 --> 00:27:27,520 Speaker 1: on file in IAFIS. Yeah, and I think they 463 00:27:27,520 --> 00:27:30,239 Speaker 1: say it takes about thirty minutes, as little as 464 00:27:30,280 --> 00:27:34,359 Speaker 1: thirty minutes, to scan against everyone in the entire country 465 00:27:34,359 --> 00:27:38,520 Speaker 1: at this point, including mug shots, criminal histories, seventy million 466 00:27:38,520 --> 00:27:43,639 Speaker 1: people on record. That's not bad, including you, my friend. Yeah, 467 00:27:43,960 --> 00:27:46,199 Speaker 1: you better keep that hanky on you at all times. 468 00:27:46,800 --> 00:27:49,439 Speaker 1: Wipe down my prints. Your prints. I could also just 469 00:27:49,480 --> 00:27:52,000 Speaker 1: stay on the straight and narrow. Oh yeah, that's true. 470 00:27:52,160 --> 00:27:54,720 Speaker 1: I'd like you to wipe my prints down, just to be sure. 471 00:27:54,480 --> 00:27:57,600 Speaker 1: Should we talk a little bit about other biometrics, even 472 00:27:57,600 --> 00:27:59,720 Speaker 1: though we've covered some of this stuff? If you want. 473 00:28:00,240 --> 00:28:04,600 Speaker 1: Eye scans, they're really expensive. Uh, the retinas and 474 00:28:04,600 --> 00:28:08,640 Speaker 1: the iris are also unique, but um, they're just super expensive. 475 00:28:08,640 --> 00:28:10,720 Speaker 1: So the only place you're going to see those are 476 00:28:10,880 --> 00:28:14,159 Speaker 1: like high security, like expensive facilities. Well, it depends. The 477 00:28:14,200 --> 00:28:17,960 Speaker 1: retina scan is extremely detailed and tough. The iris scan 478 00:28:18,040 --> 00:28:22,119 Speaker 1: supposedly is much quicker, and you're more likely to find those.
479 00:28:22,320 --> 00:28:26,080 Speaker 1: Um, it's cheaper too, yeah, a little more prevalent, but 480 00:28:26,160 --> 00:28:32,720 Speaker 1: still you're not gonna find that at your average parties. Um, 481 00:28:32,840 --> 00:28:37,320 Speaker 1: what about Carl Jr. though? Okay, ear scans. Apparently ears 482 00:28:37,320 --> 00:28:39,760 Speaker 1: are unique in size and shape and structure as well. I 483 00:28:39,800 --> 00:28:42,400 Speaker 1: think they use this too for scanning crowds as well. 484 00:28:42,480 --> 00:28:52,280 Speaker 1: Like part of facial recognition is to scan ears? Interesting. Yeah, kind of 485 00:28:50,600 --> 00:28:55,479 Speaker 1: scary, boys. Voice prints. Um, there's an audio lab the FBI 486 00:28:55,520 --> 00:29:00,480 Speaker 1: operates in Quantico, Virginia. Um, and when there are messages 487 00:29:00,560 --> 00:29:03,920 Speaker 1: from supposed known terrorists, they run it through their program 488 00:29:03,960 --> 00:29:06,680 Speaker 1: and it does a pretty good job. It's not like a fingerprint, 489 00:29:06,760 --> 00:29:09,040 Speaker 1: but they can do a pretty good job with vocal analysis 490 00:29:09,080 --> 00:29:11,240 Speaker 1: at this point. You could also use this if you're, 491 00:29:11,320 --> 00:29:16,040 Speaker 1: like say, um, contacting your bank in Geneva by phone. 492 00:29:16,800 --> 00:29:20,120 Speaker 1: Your bank may use some sort of voice print analysis 493 00:29:20,880 --> 00:29:25,400 Speaker 1: to say, okay, it's you. Sing your favorite song. I wonder 494 00:29:25,440 --> 00:29:27,600 Speaker 1: what they do, have you say yeah 495 00:29:27,760 --> 00:29:29,880 Speaker 1: or no? I'm sure it's just your name, but it'd 496 00:29:29,880 --> 00:29:34,520 Speaker 1: be funny if like they had you sing You Give 497 00:29:34,560 --> 00:29:38,160 Speaker 1: Love a Bad Name. All right, Chuck? Uh, and 498 00:29:38,200 --> 00:29:41,040 Speaker 1: then DNA, of course. Um.
There was just 499 00:29:41,080 --> 00:29:43,720 Speaker 1: a ruling last week in the Supreme Court. They ruled 500 00:29:43,720 --> 00:29:46,640 Speaker 1: five to four that DNA swabs can now be taken 501 00:29:47,360 --> 00:29:51,440 Speaker 1: at the time of arrest for serious crimes. Um. Is 502 00:29:51,480 --> 00:29:55,480 Speaker 1: that right? Yeah? So for serious crimes. I thought the 503 00:29:55,520 --> 00:29:58,120 Speaker 1: whole row over this was that it 504 00:29:58,200 --> 00:30:02,160 Speaker 1: was like they can just take them from anybody. Well, 505 00:30:02,800 --> 00:30:05,600 Speaker 1: it's like, it's just for serious crimes. The whole row 506 00:30:05,680 --> 00:30:07,560 Speaker 1: is the fact that it's a police officer doing it 507 00:30:08,000 --> 00:30:11,360 Speaker 1: and not, like, you're not at the police station 508 00:30:11,440 --> 00:30:15,320 Speaker 1: with a trained, you know, DNA analysis... analyst. 509 00:30:15,920 --> 00:30:21,960 Speaker 1: Man alive. Nice. What is wrong with me today? Yeah. 510 00:30:24,160 --> 00:30:26,600 Speaker 1: In the ruling they said that they've essentially found that 511 00:30:26,640 --> 00:30:29,600 Speaker 1: it's the same thing as fingerprinting. So now cops are 512 00:30:29,600 --> 00:30:32,680 Speaker 1: gonna have little mouth swab kits in their cars. 513 00:30:32,880 --> 00:30:34,880 Speaker 1: And uh, it was a five to four ruling. It was close. 514 00:30:35,920 --> 00:30:38,360 Speaker 1: And some people, you know, are up in arms 515 00:30:38,360 --> 00:30:43,200 Speaker 1: about it, saying it's a civil rights, uh, infringement, and yeah, 516 00:30:43,720 --> 00:30:48,600 Speaker 1: the end of privacy is what we're witnessing. Yeah, yeah, 517 00:30:48,760 --> 00:30:53,800 Speaker 1: oh yeah.
It's sad to see nobody, nobody's doing anything 518 00:30:53,800 --> 00:30:57,920 Speaker 1: about it, that that NSA whistleblower gave up 519 00:30:58,120 --> 00:31:01,720 Speaker 1: his life basically, and everybody's like, wow, I guess 520 00:31:01,760 --> 00:31:04,880 Speaker 1: I always suspected that was going on anyway, right, you know. 521 00:31:06,680 --> 00:31:10,240 Speaker 1: So anyway, um, fingerprints. Yes, you got anything else? I 522 00:31:10,280 --> 00:31:12,680 Speaker 1: got nothing else. All right. Well, if you want to 523 00:31:12,720 --> 00:31:15,240 Speaker 1: learn more about fingerprints, you can type that word into 524 00:31:15,280 --> 00:31:18,160 Speaker 1: the search bar at HowStuffWorks dot com. And since I 525 00:31:18,200 --> 00:31:27,480 Speaker 1: said search bar, that means it's time for a message break. Yeah, 526 00:31:27,360 --> 00:31:31,640 Speaker 1: I know, it's time for listener mail. Josh, I'm gonna 527 00:31:31,680 --> 00:31:34,400 Speaker 1: call this, uh, let's help this, uh, young lady 528 00:31:34,480 --> 00:31:37,520 Speaker 1: raise some money for cancer. We don't do this a lot. 529 00:31:37,520 --> 00:31:38,840 Speaker 1: We get a lot of requests and we can't do 530 00:31:38,880 --> 00:31:41,480 Speaker 1: them all. We wish we could. But Whitney spoke to me. 531 00:31:42,280 --> 00:31:44,400 Speaker 1: She's been listening for years. She's a big fan. She 532 00:31:44,480 --> 00:31:48,000 Speaker 1: started as an undergrad and now she is wrapping 533 00:31:48,080 --> 00:31:50,880 Speaker 1: up her second year in law school at the Ohio 534 00:31:51,000 --> 00:31:54,920 Speaker 1: State University. Couple of guys. Yeah. Um, I know you 535 00:31:55,000 --> 00:31:57,080 Speaker 1: like to hear about awesome charities. I wanted to share 536 00:31:57,200 --> 00:32:00,280 Speaker 1: this unique one here in Columbus, Ohio.
Uh, Pelotonia 537 00:32:00,440 --> 00:32:02,720 Speaker 1: is a bike ride that raises money for life-saving 538 00:32:02,800 --> 00:32:07,000 Speaker 1: cancer research. Money raised by riders goes to cancer research 539 00:32:07,040 --> 00:32:10,760 Speaker 1: at the James Cancer Center at Ohio State. Um. It's 540 00:32:10,920 --> 00:32:13,680 Speaker 1: special to me because my grandmother got cancer treatment at 541 00:32:13,680 --> 00:32:16,120 Speaker 1: the James when she did not have health insurance, and 542 00:32:16,200 --> 00:32:18,840 Speaker 1: she was proud that they could use her rare case 543 00:32:18,880 --> 00:32:24,280 Speaker 1: to potentially help find a cure for others. Um, Nancy 544 00:32:24,280 --> 00:32:27,440 Speaker 1: passed away in two thousand eleven. While she was fighting her cancer, 545 00:32:27,840 --> 00:32:29,280 Speaker 1: in the course of a single day, at the age 546 00:32:29,320 --> 00:32:32,880 Speaker 1: of sixty five, she began having severe dementia-like symptoms. 547 00:32:33,520 --> 00:32:35,960 Speaker 1: We are not sure why. The symptoms left her unable to 548 00:32:36,000 --> 00:32:39,080 Speaker 1: mentally compete with the cancer, and without her willpower and 549 00:32:39,160 --> 00:32:42,600 Speaker 1: understanding, um, that she had an extremely rare terminal 550 00:32:42,920 --> 00:32:47,040 Speaker 1: cancer to battle, her health went downhill quickly. Still, her 551 00:32:47,080 --> 00:32:50,080 Speaker 1: inner dignity shined through, even with a drastic drop in 552 00:32:50,120 --> 00:32:52,720 Speaker 1: body weight and repeated trips to the ER. So 553 00:32:52,920 --> 00:32:56,960 Speaker 1: in memory of her grandmother, she is riding for charity. 554 00:32:57,040 --> 00:32:59,440 Speaker 1: So she's raised two hundred and fifteen dollars right now. 555 00:33:00,040 --> 00:33:01,960 Speaker 1: I think we should pump that up a little bit 556 00:33:02,120 --> 00:33:05,720 Speaker 1: with our listeners.
So I created a little TinyURL 557 00:33:05,760 --> 00:33:09,760 Speaker 1: for her page, um, h t t p colon slash 558 00:33:09,760 --> 00:33:14,920 Speaker 1: slash tinyurl dot com slash m r m k 559 00:33:15,440 --> 00:33:19,080 Speaker 1: x six v. So that's m r m k x 560 00:33:19,120 --> 00:33:22,200 Speaker 1: six v after tinyurl dot com, and that is 561 00:33:22,240 --> 00:33:26,000 Speaker 1: Whitney Bramlin's bike ride page. It goes down August tenth, 562 00:33:26,160 --> 00:33:29,000 Speaker 1: I believe, and that'd be cool if we raised 563 00:33:29,000 --> 00:33:34,680 Speaker 1: a little extra dough for her. Yeah, everybody, get in there. 564 00:33:34,800 --> 00:33:36,640 Speaker 1: That's pretty good. So, and you know, it's one of 565 00:33:36,680 --> 00:33:38,400 Speaker 1: those things where you can give like five dollars if 566 00:33:38,400 --> 00:33:41,280 Speaker 1: you want to. Dude, just skip that latte today, I say, 567 00:33:41,280 --> 00:33:44,520 Speaker 1: and donate to Whitney's cause. Yeah, way to go, Chuck. 568 00:33:44,840 --> 00:33:47,040 Speaker 1: Way to go, Whitney. Way to go, Whitney, and way 569 00:33:47,080 --> 00:33:49,520 Speaker 1: to go you for donating. We're proud of you already 570 00:33:49,520 --> 00:33:52,280 Speaker 1: in advance, agreed. Uh, if you want to let us 571 00:33:52,320 --> 00:33:55,240 Speaker 1: know about a charitable organization you care about, we're always 572 00:33:55,280 --> 00:33:57,520 Speaker 1: down with that. We'll try to let everybody know about it 573 00:33:57,560 --> 00:34:00,200 Speaker 1: in turn. Um, you can tweet to us at S 574 00:34:00,320 --> 00:34:02,960 Speaker 1: Y S K Podcast. You can join us on Facebook 575 00:34:03,000 --> 00:34:05,200 Speaker 1: dot com slash Stuff You Should Know. You can send us 576 00:34:05,200 --> 00:34:07,920 Speaker 1: an email to Stuff Podcast at Discovery dot com, and 577 00:34:07,960 --> 00:34:09,800 Speaker 1: you can join us at our home on the web.
578 00:34:09,960 --> 00:34:15,279 Speaker 1: That's Stuff you Should Know dot com. For more on 579 00:34:15,360 --> 00:34:17,840 Speaker 1: this and thousands of other topics, visit HowStuff 580 00:34:17,840 --> 00:34:28,960 Speaker 1: Works dot com. This episode of Stuff you Should Know 581 00:34:29,040 --> 00:34:31,400 Speaker 1: is brought to you by YouTube Geek Week. Tune in, 582 00:34:31,400 --> 00:34:35,000 Speaker 1: find new channels at YouTube dot com slash Geek Week