Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and a love of all things tech. It is time for a classic episode. This episode originally aired on April 2, 2014. It is called Biometrics: Digital Fingerprinting. Enjoy.

We thought, hey, we should talk about biometrics, and then we started looking into it and getting really excited, and then we realized, hey, there's a lot to talk about, right. You know, biometrics, of course, being the measurable biological or behavioral characteristics used for any given individual. Yes, this is what the FBI says biometrics are. So, uh, you know, it's one of those things where we knew it was a huge topic and we decided to narrow it down. So today we're specifically focusing on your fingerprints. Well, not your fingerprints, I mean everyone's fingerprints. No, see the person sitting next to you? No, not that one, the other one. That person's fingerprints. That's the one we're concentrating on. So yeah, because everyone has different fingerprints, I mean like everyone, right, and this is something that has been known for a while but then forgotten and then rediscovered. So we're going to talk about that, because it's kind of funny.

So these days we think of biometrics as sort of those automated ways to verify your identity based on some sort of biological characteristic, like, you know, the eye scan or the fingerprint scan or whatever. Vocal scans as well, you know, like the voice imprint. Lots of different Hollywood versions of this. But, you know, again, going into that fingerprint approach, we thought that, uh, we'd look at not just how we've defined it and how it works, but look at the whole history of fingerprinting. We're not the first podcast to do this, by the way. Certainly not.
Josh and Chuck of Stuff You Should Know did an episode back in April called How Fingerprinting Works, and they go pretty deeply into the history and say many pithy things. So if you would like to check that episode out, you can go to StuffYouShouldKnow.com. Right, and just stick around, because we're gonna say some pithy things too. We are going to cover some of the history, kind of a quick overview, but first I thought I would read a couple of little excerpts from our article on How Fingerprint Scanners Work, because there are a couple that I thought were really interesting, from HowStuffWorks.com. Neither of us actually wrote this. Nope, nope, I didn't write this one.

This one says: people have tiny ridges of skin on their fingers because this particular adaptation was extremely advantageous to the ancestors of the human species. The pattern of ridges and valleys on fingers makes it easier for the hands to grip things, in the same way a rubber tread pattern helps a tire grip the road. The other function of fingerprints is a total coincidence. Like everything in the human body, these ridges form through a combination of genetic and environmental factors. The genetic code in DNA gives general orders on the way skin should form in the developing fetus, but the specific way it forms is a result of random events. The exact position of the fetus in the womb at a particular moment, and the exact composition and density of surrounding amniotic fluid, decides how every individual ridge will form.

This, by the way, is how identical twins can have different fingerprints. Pretty cool, because, you know, genetically they're identical, but their fingerprints are different. Still different. Interesting.

So looking at this history, you might think, okay, I've heard about fingerprinting, particularly when it applies to law enforcement. That's probably where a lot of people are familiar with it, besides the verification. Sure, sure. And a couple popular media pieces have talked recently about, um.
I mean, you get things like Ripper Street or Sleepy Hollow having characters going, like, this new fingerprinting thing in the amazing Age of Victoria! Um, not so much, actually. So, fingerprints. Well, first of all, people have been aware of them for ages, because we're curious folks, you know, human beings. We were very curious and narcissistic. We like to learn stuff and we like to look at ourselves. And how do I know that this dates back ages and ages and ages ago? Because we have prehistoric depictions found in Nova Scotia, and it depicted a hand with ridge patterns on the skin. Now, that does not mean that the ancient Nova Scotians were familiar with the fact that all the fingerprints were unique, and therefore no two individuals had the same ones, but it at least showed that, you know, yeah, they noticed them.

Yeah, but the ancient Babylonians may have actually used fingerprints to differentiate people. Um, we've found fingerprints in clay to sign business transactions, and the ancient Chinese used inked fingerprints for both business purposes and child identification. And in the thirteen hundreds in Persia, official government documents often included fingerprints, probably to indicate they were authorized and official. Now, according to the U.S. Marshals Service, which has an entire web page devoted to fingerprinting and the history of it, one government official in Persia at that time made the observation that no two fingerprints were alike, which obviously would be very important if you're making a document official or authorized.

But it wasn't until the eighteen eighties, that amazing Age of Victoria previously mentioned, that we got some kind of official classification system, right. It wasn't until the modern era that we started seeing it, and it wasn't even in wide use at the beginning. It was just kind of exploratory. Dr. Henry Faulds back in eighteen eighty proposed using fingerprints for identification as well as a means of classifying them.
So he forwarded these ideas to a certain Charles Darwin, very important historical figure in his own right. But Darwin at the time was towards the end of his life and felt that he did not have the necessary, uh, time and energy to devote to this. He thought it was really interesting, however, so he forwarded the information along to a cousin of his named Francis Galton. So Faulds would write a scientific paper about his methods, and he actually identified a fingerprint, ascertained the identity of the person who left that fingerprint on a bottle of alcohol. Shouldn't come as any surprise. I guess this brings us all together. It does.

Then in eighteen eighty-three, Mark Twain would use this new, startling scientific information in a story. Yeah, it was in Life on the Mississippi, and in that, one of the elements of the story is a murderer is identified through the use of fingerprints. And he would revisit the idea in a book called Pudd'nhead Wilson. Pudd'nhead. Well, so it's interesting. Now, this was before anyone was actually using, uh, fingerprints in any kind of criminal investigations on an official basis. It was almost science fiction. It really was. It kind of was, so that was exciting.

But soon, in eighteen eighty-eight, Sir Francis Galton, remember, the cousin to Charles Darwin who had received information about this about ten years previous, um, began his own study of fingerprints. Yep. He wrote a book in eighteen ninety-two and put forth a formal classification system and identified the tiny characteristics used to differentiate fingerprints, which now we call Galton's details. He also observed that fingerprints don't change over the lifetime of a person, so the ones you have when you're a kid are the same ones you have when you're old. He originally wanted to kind of link fingerprints to certain types of traits like intelligence or heredity, which is so racist.
Yeah, kind of, essentially. I think this was almost in an attempt, and I don't know enough about Galton to say this for sure, but it seems like an attempt to justify certain societies' beliefs about certain people, that kind of thing. Yeah, it wasn't necessarily so racist, but, um, it kind of runs in that direction. Yeah. Because, I mean, the idea that you could identify a person's intelligence based on their fingerprints already seems a little sketchy, and in fact, he did determine that there was no connection. There were no identifier marks in a person's fingerprints that would give you any clue as to that person's intelligence or genetic background. However, he did figure out that there were a lot of differences and that people weren't likely to have the same fingerprints. Yeah, unlikely, on the order of one in sixty-four billion. That's pretty unlikely.

So in eighteen ninety-two, we get the first use of fingerprints in a criminal investigation on an official level. Uh, Juan Vucetich, and I'm sure I butchered his last name, who was a police officer in Argentina, used them in a murder investigation. It was actually a really tragic case, but the discovery meant that he was able to solve this mystery and prove that the person that was believed to have been the murderer was in fact not the killer. So, uh, not only was it the first use, but it was a pretty dramatic example of it.

Now, we're gonna skip to nineteen eighteen, but between the late nineteenth century and the early twentieth century we started to see fingerprints get adopted by various legal organizations all around the world. The United Kingdom and the United States were leading the way, but it was all over the place. So by nineteen eighteen you have Edmond Locard, who says that you only need twelve points of similarity between an individual's fingerprint and a target fingerprint to serve as a positive identification.
Now, you may have heard about those twelve points of similarity, that this is somehow like a legal thing, that if you were able to meet those twelve points of similarity, you have the legal basis to say this person did this. Uh, not necessarily a legal definition at all. In fact, uh, countries have very, very different ways of saying whether or not a fingerprint is a valid match for another one, and some, like the United States, do not have a minimum number of resemblances that need to be there in order for you to call it a match. Now, usually law enforcement agencies rely on experts who give their expert opinion, and therefore put their own reputation on the line, as to whether or not something matches. And of course, these days we rely very heavily on digital information, with very, very complex and intelligent algorithms that will do some really interesting work. Yeah, and so with that, you know, the level of confidence grows quite a bit. So, uh, just so you know, that twelve points of similarity is not necessarily a legal standard.

Nineteen twenty-four, that's when Congress passed an act that established a new division of the Federal Bureau of Investigation, also known as the FBI. Uh, and this division was called the Identification Division. I bet you can guess what it did. So it became kind of a centralized fingerprint file for the entire country. So it wasn't necessarily, uh, standard procedure for every law enforcement agency out there to send a copy to the FBI, but that's kind of what started to happen, so that, uh, there could be this sort of cooperation between different departments which often wouldn't have any communication with each other. Sure, sure. By nineteen forty-six, the FBI would have processed more than a hundred million fingerprint cards. Yeah, not just processed, but they did that by hand. Right, right. And it would be two hundred million by nineteen seventy-one. So two hundred million physical cards with fingerprints on them.
I mean, just imagine how much storage you need just for that. I mean, it's two hundred million. I can't even imagine it. But by nineteen ninety-nine, the FBI introduced the Integrated Automated Fingerprint Identification System, or IAFIS. So it's the largest fingerprint database in the world, and it's computer automated. It takes about twenty-seven minutes for the system to comb through every single file in its database to find out if there is a potential match during a criminal investigation. It's different if it's a civil case, it's actually like more than an hour. But for a criminal investigation, for all the criminal files that are stored in this database, twenty-seven minutes from the time you actually input the suspect's fingerprint. I imagine that's a lot faster than whatever intern they sent down to the basement. Well, yeah, because, I mean, you would have to narrow down the person quite a bit before you could ever start comparing. Right, this is a man, so cut out all the women, and then go, like, there are seventy million of them down there. Uh, yeah, that would be a daunting task.

So in the digital age, we can actually analyze this stuff way better than we ever could. Like, you don't have to use the naked eye anymore and try and find those little ridges and stuff. You can actually rely very heavily upon computer systems, and once we started getting those computer systems in place, they pretty soon thereafter became, um, commercially available. Yeah. Yeah, we've had some fingerprint verification systems that have been around for a few decades now. On the consumer level, they've only been available fairly recently. And you might be thinking, oh, yeah, the iPhone 5s, because it has that fingerprint scanner where you can use that to log into your phone. That's the first smartphone to ever use that, right? Nope, no, no. There's actually a mobile device, the first mobile device that I found was one that dated from two thousand three, from HP. And boy, the name of this is phenomenal.
235 00:14:03,920 --> 00:14:07,520 Speaker 1: HP names their products in such catchy ways. It's the 236 00:14:07,760 --> 00:14:15,120 Speaker 1: HP I pack, I p a Q P PC pocket PC. 237 00:14:15,800 --> 00:14:18,560 Speaker 1: Oh yeah, it just rolls off the tongue. Yeah, it 238 00:14:18,559 --> 00:14:20,480 Speaker 1: really is. I mean, with a name like that, how 239 00:14:20,480 --> 00:14:25,000 Speaker 1: could you resist I phone my phone? So at any rate, 240 00:14:25,040 --> 00:14:28,800 Speaker 1: this was the first mobile device to incorporate finger scanning technology, 241 00:14:28,960 --> 00:14:30,320 Speaker 1: but it was also sort of the edge of a 242 00:14:30,360 --> 00:14:34,080 Speaker 1: boom for the technology. UM it was being extended for 243 00:14:34,080 --> 00:14:36,000 Speaker 1: for wide consumer use. At the time. I mean, you know, 244 00:14:36,080 --> 00:14:38,200 Speaker 1: keyboards and mice had them, laptops had them. You could 245 00:14:38,200 --> 00:14:42,000 Speaker 1: buy a USB scanner for multipurpose use and encrypt everything 246 00:14:42,040 --> 00:14:46,200 Speaker 1: with your fingerprint, right, you could end up creating, for example, 247 00:14:46,240 --> 00:14:48,680 Speaker 1: with a copier. You could end up telling people, all right, 248 00:14:49,120 --> 00:14:52,400 Speaker 1: this group of folks are authorized to make copies. Anyone 249 00:14:52,440 --> 00:14:54,800 Speaker 1: who's not on this list cannot make copies. And you 250 00:14:54,800 --> 00:14:56,560 Speaker 1: would walk up to make copies. And if you weren't 251 00:14:56,600 --> 00:15:00,480 Speaker 1: on that list, then I guess no fun Christmas party 252 00:15:00,560 --> 00:15:03,520 Speaker 1: Shenanigans for you. But but but also opening this up 253 00:15:03,560 --> 00:15:06,360 Speaker 1: to the consumer market meant that a lot of people 254 00:15:06,480 --> 00:15:09,920 Speaker 1: started finding the flaws in the technology. Oh yes. In 255 00:15:10,040 --> 00:15:13,920 Speaker 1: December five, clarks And University researchers announced that they could 256 00:15:13,920 --> 00:15:20,240 Speaker 1: fool of the world's fingerprint scanners using an incredibly sophisticated, 257 00:15:20,800 --> 00:15:26,440 Speaker 1: expensive substance. Yeah, they could just go out to toy 258 00:15:26,480 --> 00:15:30,000 Speaker 1: store and buy some Plato and essentially make a copy 259 00:15:30,160 --> 00:15:33,080 Speaker 1: of a fingerprint and put it on any optical scanner. 260 00:15:33,080 --> 00:15:35,000 Speaker 1: And we'll talk about the optical scanners in a little bit. 261 00:15:35,480 --> 00:15:39,920 Speaker 1: And it worked really well. So that gave people some 262 00:15:39,920 --> 00:15:42,080 Speaker 1: pause and thought, maybe we should come up with something 263 00:15:42,120 --> 00:15:47,080 Speaker 1: besides optical scanners too, because fingerprint identification is a great idea, 264 00:15:47,520 --> 00:15:49,240 Speaker 1: but if it's if it's that easy to fool, we 265 00:15:49,240 --> 00:15:51,640 Speaker 1: have to find a different way of measuring it. So 266 00:15:51,760 --> 00:15:55,680 Speaker 1: then there was a totally different expose on September one, 267 00:15:55,840 --> 00:15:59,640 Speaker 1: two thousand and six, when our beloved MythBusters decided to 268 00:16:00,040 --> 00:16:02,760 Speaker 1: do a fingerprint scanner scam of their own. 
They decided to see if they could fool one, and this one was a little different. It wasn't just an optical scanner. It was supposed to also detect sweat from pores in the fingertips. Okay, so you can't just use a lump of Play-Doh or a good photograph. You have to have something that sweats, it has to, otherwise it's never gonna work. So what did they do? They made a latex copy of a fingerprint, and then they did a very sophisticated thing in order to simulate sweat. Yeah, and it worked. So if you can't Play-Doh it, lick it, I guess, is the moral of that story. I think that's really what we all should learn from this today. Probably. So anyway, we're still seeing this kind of technology being rolled out, but it's not just optical scanners, and even optical scanners have gotten a lot better than they were back in two thousand five.

We'll be right back with more digital fingerprinting after these short messages.

All right, I had just talked about optical scanners, and those are one of the ones that are the easiest to explain and, uh, fairly common even today. In fact, not my work computer, but my home computer, my home laptop, has a little optical scanner. So if I want to log in, I can just swipe my finger across it. So optical scanners use something that is found in digital cameras and camcorders called a charge-coupled device, or CCD. Now, essentially that's a light sensor. It's got an array of things called photosites. Those are light-sensitive diodes, and it works pretty much the way you would imagine. So photons, those little particles of light, when they make a collision with these photosites, it generates a little electrical signal, and those can then be compiled and converted into a digital image. It's essentially the same process that digital cameras use to take pictures. And here's the thing, though. So if I put my finger on an optical scanner, I'm actually blocking light.
And unless you happen to be E.T. the Extra-Terrestrial, your finger probably isn't emitting light. So how the heck is it getting this picture? Well, it uses a flash. Yeah, except in this case the flash is probably like a single LED, or for some scanners maybe an array of LEDs, light-emitting diodes in other words, and so that provides the light that is necessary to take this image. And, uh, then the CCD creates an image of your fingertip. However, it's a little funky. It's inverted, right. The dark areas are going to represent the ridges, that is, the raised portions of your fingertip, and the light areas are going to represent the valleys. Yeah, so it's kind of like looking at a negative, a photo negative. Uh, and it makes sense, because stuff that's reflecting more light is going to create a bigger electrical signal, thus, you know, the charge-coupled device is going to make that into a darker portion. The valleys, not as much of the light is going to reflect back, so you get a weaker electrical signal. That's why it gets lighter.

Uh, so if you were to scan your finger and, um, it tells you that it's a good scan, because most of these devices also have some sort of fail-safe in them that will alert you if there wasn't enough differentiation between the dark and light parts. Right, the same way that your camera, um, sometimes will take a picture that's overdeveloped or underdeveloped. The same thing can happen here. Yeah. So, like, most of the software has that error function built into it, so it can tell, and it will ask you to scan again, right. So then you would scan again. So if it's the first time you're using it, then you would also end up creating a profile in some way.
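To make that fail-safe idea concrete, here is a minimal sketch in Python of the kind of quality gate the hosts are describing: before any matching happens, the software can check that the captured grayscale image isn't too dark, too washed out, or too low-contrast to show ridges at all. The function name and thresholds here are hypothetical, not taken from any particular scanner.

import numpy as np

# Hypothetical sketch of the "fail safe" check described above: reject a capture
# that is too dark, too light, or too flat, the fingerprint equivalent of an
# over- or under-exposed photo. Thresholds are made up for illustration.
def scan_quality_ok(image: np.ndarray,
                    min_mean: float = 50.0,
                    max_mean: float = 200.0,
                    min_contrast: float = 30.0) -> bool:
    """Return True if an 8-bit grayscale capture looks usable for matching."""
    mean_level = image.mean()          # overall exposure
    contrast = image.std()             # rough dark/light (ridge/valley) separation
    if not (min_mean <= mean_level <= max_mean):
        return False                   # too dark or too washed out
    return contrast >= min_contrast    # not enough differentiation

# Example: an almost-uniform capture (finger barely touching) should be rejected.
bad = np.full((64, 64), 120, dtype=np.uint8)
good = np.tile(np.array([40, 220], dtype=np.uint8), (64, 32))  # alternating "ridges"
print(scan_quality_ok(bad), scan_quality_ok(good))  # False True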
It might not be you, it might be an administrator, but something that links the fingerprint to who you are and what access you are supposed to have. Then, on subsequent uses, what will happen is that when you scan your finger, it will go through its database of identities that have fingerprints attached to them and look for you. If you're there, then it will authorize you for whatever use you're allowed. So, for instance, on my home computer, I've given myself very strict restrictions, because I am not to be trusted, uh, and so all I'm able to do is play a pirated copy of Flappy Bird. That's not true. I don't have Flappy Bird. But at any rate, you know, that's the basic procedure, right. And so if your fingerprint is not found in that database, you get an error. So either you have to swipe it again, or scan it again if it's not a swipe, all depends on what kind of scanner you're at, or you end up saying, uh, the jig is up, you got me, I don't really belong here, and then you run away. Um.

So that's the basic thing. And what they're looking for, it's not the entire pattern of your fingertip. It's looking for specific minutiae about it, certain types of patterns, and it depends on the software that the scanner is using. Um, it might be the places where the ridges converge, um, or split apart at the end, or any other kind of detail that's going to be unique to you. Right. So, in other words, all it has to do is say, hey, there are these three points on this fingertip that are unique, that's all I'm concerned about. And if the fingerprint that it scans has those three points, I know it's this person and they can be let through. By concentrating on the minutiae, then, you really cut down on all the rest of the data that's necessary to have a verification. Right, you can kind of throw everything else out and concentrate on that. Right.
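Here is a hedged Python sketch of that minutiae-matching idea. It assumes the two scans have already been aligned (real matchers also have to handle rotation and translation, which is the hard part), and the data layout and tolerances are invented purely for illustration.

import math
from typing import List, NamedTuple

class Minutia(NamedTuple):
    x: float      # position on the scan, in pixels
    y: float
    angle: float  # local ridge direction, in degrees
    kind: str     # "ending" (a ridge stops) or "bifurcation" (a ridge splits)

def count_matching_minutiae(stored: List[Minutia], candidate: List[Minutia],
                            dist_tol: float = 10.0, angle_tol: float = 15.0) -> int:
    """Count stored points with a nearby, same-type, similarly-angled
    counterpart in the candidate scan (assumes the scans are pre-aligned)."""
    matched = 0
    for s in stored:
        for c in candidate:
            close_enough = math.hypot(s.x - c.x, s.y - c.y) <= dist_tol
            diff = abs(s.angle - c.angle) % 360
            similar_angle = min(diff, 360 - diff) <= angle_tol
            if close_enough and similar_angle and s.kind == c.kind:
                matched += 1
                break
    return matched

def is_same_finger(stored: List[Minutia], candidate: List[Minutia],
                   required: int = 3) -> bool:
    # The hosts' example: a handful of distinctive points is enough to decide.
    return count_matching_minutiae(stored, candidate) >= required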
Uh, and they're pretty good. Like we said, though, they can sometimes be fooled by a really high quality picture of a fingerprint. Right. So what if, instead of looking at a picture, you were to measure the fingerprint in some other way, such as capacitance? Right, because capacitance touch screens are totally a thing. Yeah. So it's very similar to what a regular capacitance touch screen is. So, you know, there are different ways of doing touch screens as well. There are some where you have to use pressure. The benefit of that is you could be wearing gloves and still work a pressure version. The downside is that when people are expected to apply pressure to delicate material, they sometimes destroy it completely, right, or it'll at least decrease the lifespan of that object by quite a bit.

Capacitance, however, uses weak electric fields. So when you make contact with a screen, a touch screen, that's using capacitance, see, you're a conductor. I don't mean that you conduct trains, nor do I mean that you... maybe you do, maybe you do. Maybe you conduct orchestras. Maybe you conduct orchestras on a train. I don't know. But if you do, call us, because that sounds fascinating. Kind of cool, actually. But no, I'm talking about electrical conductivity. So we can conduct electricity. It's not great for us to have a lot of it, but tiny amounts don't hurt us. Sure. And as it turns out, the ridges and valleys on your fingers conduct slightly different amounts of electricity. This blows my mind. I mean, to think that the raised parts of your fingerprint and the valleys of your fingerprint are distinct enough to create a measurable difference in capacitance. You know, it's something I never would have imagined. And it's really a sensor issue here. I mean, the fact that we can create these cells, these capacitor plate cells, that are sensitive enough to tell. Yeah.
Technology! So what's happening is, when you put your finger down on one of these capacitance scanners, you are actually acting as a capacitance plate, right. Your fingertip is acting as one, and you already have other ones inside the scanner itself, and it will end up creating voltages, and there will be differences in those voltages, differences between the ridges and the valleys, based on how far away they are from the cells. Yeah, so if it's a valley, it's going to be lower capacitance, because the distance is greater. Capacitance is very dependent upon distance. So if you move two capacitance plates far enough apart, they will not work together.

We've got a little bit more to say about biometrics, but first let's take a quick break.

It's amazing that the valleys, just by being that much further away, will create a different voltage than the ridges. And then having a whole set of these cells set up next to one another, um, it allows the scanner to sort of make a digital picture of your fingerprint, but just using electricity rather than light. The data from each one is again compiled and then converted into an image of sorts. Yeah. Yeah, you can think of it as like an image made with electricity. And this is the sort of scanner that the iPhone 5s uses, so it's not an optical scanner. One of the big benefits of this technology is that it's easier to make it really compact and miniaturized, which is why you could find it in things like handheld electronics. Right, sure. Um, however, this technology can also be fooled, um, sometimes by a mold of a finger, um, or, if someone has gone and calibrated the scanner to look for things like heat or a pulse, um, you can use one of those movie tricks, like a gelatin or silicone mold of a finger, um, pasted onto a different finger. Yeah.
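As a rough back-of-the-envelope illustration of why that distance matters, you can treat each sensor cell and the bit of skin above it as a tiny parallel-plate capacitor, where capacitance is permittivity times plate area divided by the gap. Every number below is made up purely for illustration (real sensors also have a protective coating and the skin itself acting as the dielectric), but it shows the direction of the effect: ridges sit closer, so they produce a larger capacitance than valleys.

# Toy parallel-plate model of one capacitive sensor cell: C = epsilon_0 * A / d.
# All dimensions are illustrative guesses, not real sensor specifications.
EPSILON_0 = 8.854e-12            # permittivity of free space, in farads per meter
CELL_AREA = (50e-6) ** 2         # a hypothetical 50-micrometer-square cell, in m^2

def cell_capacitance(gap_m: float) -> float:
    """Capacitance of one cell with the skin a distance gap_m above it."""
    return EPSILON_0 * CELL_AREA / gap_m

ridge_c = cell_capacitance(1e-6)    # ridge nearly touching the cell (~1 micrometer)
valley_c = cell_capacitance(50e-6)  # valley leaves an air gap (~50 micrometers)

print(f"ridge:  {ridge_c:.2e} F")   # larger capacitance, stronger signal
print(f"valley: {valley_c:.2e} F")  # smaller capacitance, weaker signal
print(f"ratio:  {ridge_c / valley_c:.0f}x")  # ridges stand out clearly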
My favorite version of getting past a capacitance or an optical scanner is to use the finger that's been removed from the person who had the authorization. Yeah, yeah, snipping. That's your favorite. That's my favorite out in the field. That's what you like. We had this discussion about The Bourne Identity before we came into the podcast. This is totally true. Uh, I don't like to discuss my actual field-operative kind of mentality and strategies, but yes, I do love doing that.

So, thermal scanners. Well, this one's a little different, because it's a heat scanner, and again, it's one of those ideas where it measures the differences in heat between ridges and valleys. Once again, you're going to get a slightly higher temperature from the ridges than you do with the valleys. The valleys are essentially pockets of air. There's some downside with this one. One of the problems with thermal scanners is that if it takes too long to do the scan, the temperature differences are going to equalize across the whole thing, and you wind up with, you just get a blank fingerprint, like a big blank fingerprint. Not useful.

This next one is super cool. It's ultrasonic, ultrasonic sensors. Right, yeah. Well, we did a whole episode about ultrasound called How Ultrasound Works, crazily enough, that published back in January, if you would like to hear all about how this technology works. But, um, it's essentially echolocation. Yeah, you're sending out sound signals and then you're waiting for them to bounce back, and by measuring the amount of time it took to go out and come back, you get an idea of how far away something is. Well, that works, you know, in lots of different ways, including being able to tell a fingerprint, being able to read a fingerprint. It can go even deeper than that, exactly. It can go into tissues.
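The arithmetic behind that echolocation idea is simply distance equals speed times time, halved because the ping travels out and back. Here is a quick sketch using the commonly cited figure of roughly 1,540 meters per second for the speed of sound in soft tissue; the echo times are invented for illustration.

# Pulse-echo ranging: time the echo, halve the round trip to get depth.
SPEED_IN_TISSUE_M_S = 1540.0   # approximate speed of sound in soft tissue

def echo_depth_mm(round_trip_s: float) -> float:
    """One-way depth, in millimeters, for a given round-trip echo time."""
    return SPEED_IN_TISSUE_M_S * round_trip_s / 2 * 1000

# Illustrative echoes: one off the skin surface, one off structure a bit deeper.
for label, t in [("surface echo", 0.65e-6), ("deeper echo", 2.6e-6)]:
    print(f"{label}: {echo_depth_mm(t):.2f} mm")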
So if you wanted, you could create an ultrasonic fingerprint scanner that scans not just the fingerprint itself, but the underlying veins that are in your finger, which also are going to be unique to you. So that's a lot harder to fake than a fingerprint. Like, you're not going to get a really high resolution image of veins and then create a fake finger easily. It would be really difficult. I mean, but you could use your favorite application, which is removing someone's finger. And say that, right, unless you had also built into the software a way to detect living tissue, to see if blood is moving through the veins, the vessels. If it's not detecting blood, it's going to say, y'all, this is messed up. You need to send someone down to the fingerprint scanner right now. Things that are bad are happening. That's exactly the voice that it uses, too, I assume. I run into it in the field all the time. So yeah, I mean, you know, we're making light of it, because to be serious about it is so squicky, terrifying. But at any rate, it does mean that you can build in those sorts of parameters, and so it's not just looking at the fingerprint and the veins underneath, but also making sure it is truly a valid, uh, entry, so that you don't end up compromising security. Um.

And there's also, I wanted to mention very briefly, the difference between those static, uh, fingerprint scanners, especially the optical ones where you just hold your finger down and you wait for it, it's kind of like a copier, right, right, and the swipe style that you were talking about having on your home laptop. Exactly. So if you've ever had a laptop or any other device that has like a narrow window and you're supposed to swipe your finger across that window, the reason for that is that it's actually taking a series of quote-unquote images of your finger, however the implementation is actually being used. It's doing then a quick series.
Machines are really fast, so they can do this without any real problem. They're looking for those minutiae that we talked about before, and they're able to, uh, use software to compile them. But, you know, it's nifty having the smaller form factor, because, like we said, uh, then you can miniaturize, you can put them in something like a cell phone, and also make them cheaper. Exactly. Yeah, you've got this little window and your finger moves past the window, instead of the window having to be big enough to capture your whole fingertip at once. So it's really, uh, an interesting development, and, uh, pretty cool.

Also, we can mention that biometric systems, many of them, not necessarily all of them, but many of them, end up translating your fingerprint, or an algorithm, rather, does the translating, that turns it into a bunch of ones and zeros. All right, right. A digitization, um, sometimes called a hash. It's like a personal code, like a really long PIN. Yeah. So in this case, what you would say is that it's not storing an image of your fingerprint. It's not like, if you were to somehow hack into the computer, you would suddenly see on your screen a representation of your fingerprint. It would just mean that it would take the pattern of ridges and valleys and all the minutiae, convert that into this hash, this long string of ones and zeros, and the next time you scan it, if the same hash comes up, then it is a match, and it says, all right, identification has been verified. But it's not actually like an actual, real image of your fingerprint.
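Here is a deliberately simplified Python sketch of that "store a code, not a picture" idea. One honest caveat: two reads of the same finger never come out bit-for-bit identical, so real systems compare stored templates with tolerances rather than exact digests; the exact-hash version below just illustrates why a stolen string of ones and zeros doesn't hand an attacker an image of your fingerprint. The function names and template format are hypothetical.

import hashlib
import json

def template_digest(minutiae) -> str:
    """Turn an extracted minutiae template, a list of (x, y, angle, kind) tuples,
    into a fixed-length code (here, a SHA-256 hex digest)."""
    canonical = json.dumps(sorted(minutiae), separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Enrollment: store only the digest, never the scan itself.
enrolled_digest = template_digest([(12, 40, 95, "ending"), (80, 33, 210, "bifurcation")])

def verify(scan_minutiae) -> bool:
    """A later scan matches if it produces the same digest as the enrolled one."""
    return template_digest(scan_minutiae) == enrolled_digest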
And the reason why a lot of these companies try to talk about, you know, this as a big selling point is that it doesn't allow you to recreate a person's fingerprint if you were to get hold of those hashes. So it's not like you would say, oh, if I just put this through an image program, I suddenly get a picture of that fingerprint. You would just, yeah. That's the way, I think, that the iPhone 5s and the Galaxy something-something, the latest Samsung, incorporate it. And, um, PayPal these days even has, um, fingerprint signatures on their app, and any device that allows you to scan in your fingerprint will, yeah, let you pay for stuff on PayPal with that signature. I'm sure we're going to see a lot of that kind of stuff incorporated with things like NFC technology, or even the low-energy Bluetooth, Bluetooth Low Energy rather, I should say, implementations where, instead of having to put in a PIN, you just swipe your finger.

Yeah, you know, I don't know. I kind of hope that it's an additional safety feature, not a standalone safety feature, because, you know, unlike a password or a PIN, you can't just change your fingerprint if it gets stolen. And this hashing does add security to the whole thing. It's harder to, but I'm sure that someone, if they really wanted to, could decode a hash and figure out what that scan looks like. Yeah, maybe. I mean, they would have to have a lot of information, but it is important to say that there is no security feature out there that's going to be a hundred percent foolproof. Right, right. There's nothing out there. So having it as an additional tool, I agree, Lauren, that's the best way of looking at it. I think anytime we decide that we're going to rely on a specific implementation, and that's it, and we're done, we're good, then we're pretty much dooming ourselves to getting hacked in some way down the line.

And that wraps up this classic episode about biometrics.
It's still a very, very big topic, arguably an even bigger topic today than it was back in 2014. Very complicated. There are so many things to take into consideration, from security and privacy issues to, you know, the problems with authentication in some cases, ways to work around biometrics. There's a lot going on there. I will likely do more episodes kind of diving into not just the tech but the ethics behind biometrics. If you have suggestions for topics I should cover in future episodes, let me know. The best way to do that is over on Twitter. The handle is TechStuff HSW, and I'll talk to you again really soon.

TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.