Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio. And how the tech are you? You know, I'm sure you've all seen movies in which someone has to get past a security system by having an eye scan, and maybe they have an unconscious or potentially unalive security guard lying nearby, and they hold the person's head up to the scanner to gain access to the security door. Or maybe the main character has undergone some sort of incredibly disturbing surgical procedure to have their own eyes altered to give them access. But you get the idea. The eye scan verifies that the person trying to open the door has the authorization to do that.

This is a form of biometrics, and I'm sure we're all familiar with biometrics. You probably use them regularly. If you have a phone that has a fingerprint scanner or a face scanner to unlock the phone, that's biometrics. I used to use a laptop that had a fingerprint scanner, which was kind of cool. It made it really easy for me to log in. Not necessarily the most secure technology, however, as I remember reading about how that particular fingerprint scanner had some issues, but it still sure made logging into my computer easy back in those days. My current one does not have that. And there are actually a couple of different definitions for biometrics, but the one we're concerned about involves body and behavioral measurements for the purposes of authentication and verification or identification, because these are all related but different things. So for this to work, the stuff we're measuring has to be unique to an individual. It needs to be something that cannot be easily replicated. That's the whole point, right? If it could be replicated, then it's no good for verifying someone or identifying someone, because it could be one of many people. So you need to have something that is unique to that person. Spoiler alert.
While biometrics can be used for the purposes of security, for verification or identification, it's also pretty darn handy if you just want to keep track of someone, right? If you get enough biometric data on a person, it becomes easier to track that person down and to keep track of what they're doing, assuming that they're not, like, holed up in some off-grid shack in the middle of nowhere and they never venture out of there. In which case, yeah, they can be under the radar, but otherwise it gets pretty tricky. But before we get into all that and the concerns around biometrics, let's talk a little bit about the history of biometrics. This history is varied. It goes all over the place. It has lots of starts and stops and stutters, and that's because biometrics focuses on lots of different things, right? Like, it's not all just one area of study. Typically we're talking about things that people were interested in studying or advancing that collectively we would categorize under biometrics, but they were looking at a very specific version of it. So you could argue that biometrics traces its history to more than twenty five hundred years ago, because Babylonians back in the day, which I'm told was a Tuesday, would sign off on business transactions by placing a thumbprint on a clay tablet. Right? The clay tablet would be pliable, you would plop your thumb down and make an impression, the tablet would dry, and your print would be there to authenticate that you had, in fact, completed whatever the transaction was. Boom, that's biometrics, and Tom Cruise wasn't even involved a little bit. I always think of Tom Cruise in movies like the Mission Impossible series, or even in Minority Report, when I think about biometrics. Those are the ones that pop up in my head. Of course, it's not like there was a direct line of evolution from the ancient Babylonians to Tom Cruise, at least not that I am aware of. Biometrics weren't always a common means of marking documents or anything like that.
But by the late nineteenth century, some folks started to figure out that the Babylonians could have really been onto something. Actually, I'm kidding. They probably didn't even know about the Babylonian stuff. They just thought they came up with it themselves, because typically that's how we see the smarty pants of the eighteenth and nineteenth centuries. Some of them were a little more humble than that, but not all of them. Anyway, folks in the late nineteenth century, so we're talking the eighteen hundreds here, observed that fingerprints appeared to be unique to the individual, and if you got two different people to make prints of their fingertips, you would see the differences in the patterns of those fingerprints, and you would be able to tell which person made which set of prints, assuming that you had a reference, right? As long as you're able to refer it back to something on record. This logically led to the concept of using fingerprints as a way to identify if a suspect had been present at the scene of a crime. Of course, folks had to figure out how to detect and lift prints at the scene. Typically, they would use a very fine powder or dust, which would adhere to the oils left behind from the fingertips on different surfaces, and then they would lift those prints and compare them to a record of fingerprints, so that they could reference the prints that were lifted at the scene against ones that were in their database, the reference system. Or, if they had the suspect in custody, they would have the suspect, you know, get printed, and they would compare the prints against whatever was found at the scene and then try to determine if the suspect was actually present at the scene of the crime. You know how this works. But the science of that and the systems based around it took a while to develop. Folks had to identify the elements in fingerprints that are definable and identifiable. They had to make classifications and taxonomies.
You know, you had to have a way to communicate the qualities of a print when talking with someone else, or to be able to look for a match or even just a partial match. And it took a couple of decades to really catch on. And while fingerprints are the most famous example of early biometrics in law enforcement, police were trying different stuff too. In France, around the same time as fingerprints were coming into vogue, there was a system called the Bertillon process. The police would measure suspects, and they would write down those measurements on a card and keep that in an index. And I'm talking all sorts of measurements. You know, it's kind of like if you were to go and get fitted for custom clothing, except you're being fitted for crimes, I guess. So the police would take down stuff like how tall you were, how wide your shoulders were, how long your arms were, all that kind of stuff. This approach had some big drawbacks, however, the biggest one being that there was a lack of standardization in technique and metrics, which meant it wasn't terribly useful. Because if one police officer measures arm length one way, let's say they measure the arm just hanging down from the shoulder, and someone else measures it in a different way, like with the arm held up at an angle, then you might end up getting very different measurements. And that meant you would look at the index card and say, oh, well, this person's not a match, that's not who this person is, or whatever. And so it was not terribly useful, and you could get false positives or false negatives. So this particular method wouldn't stand the test of time. Fingerprints definitely did. In the nineteen thirties, a guy named Frank Burch had another idea for using biometrics as a way to identify people. He was an ophthalmologist, which, by the way, is a great word that trips me up in spelling bees, as you might imagine.
His idea was that an individual's iris contains complex patterns, and those patterns are unique to that person. It's like a fingerprint, but it's the pattern that's in the iris of your eye. And so, like a fingerprint, a person's iris could serve as a way to identify that person and subsequently verify their identity in the future. You just needed a way to capture those patterns, and then a technology capable of scanning and matching patterns against those that are stored in a database. Now, in nineteen thirty-six, when Frank Burch came up with this idea, none of that was really practical or possible, but his basic concept, using the eye as a way to verify identity, was solid. It would take more than fifty years before someone would come up with a computer algorithm that could identify iris patterns and, moreover, match an iris pattern that is collected to one in a database. That came to us courtesy of Cambridge University professor John Daugman in nineteen eighty seven, and it took a few more years to actually capitalize on that development. It wasn't until the mid nineteen nineties that a company called Iridian Technologies became the first to commercialize iris scanning. Now there are lots of companies that do this, many with their own proprietary approach to doing it, but it's all rooted in Frank Burch's suggestions back in the nineteen thirties.
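To make that matching step concrete, here's a toy sketch in Python of the kind of comparison an iris system performs. Daugman's actual method encodes iris texture into a binary code using Gabor wavelet phase information; this sketch assumes that encoding has already happened and only shows the lookup, comparing a fresh code against an enrolled database by Hamming distance. The codes, the names, and the 0.32 threshold are all invented for illustration.

```python
# A toy sketch of iris matching, not Daugman's actual system: we assume
# each iris has already been encoded into a 2048-bit code (the real
# encoding uses Gabor wavelet phase information) and only show the
# lookup by Hamming distance. Codes, names, and threshold are invented.
import numpy as np

rng = np.random.default_rng(42)

def hamming_distance(code_a: np.ndarray, code_b: np.ndarray) -> float:
    """Fraction of bits that differ between two equal-length bit arrays."""
    return np.count_nonzero(code_a != code_b) / code_a.size

# Hypothetical enrolled database: person -> 2048-bit iris code.
database = {name: rng.integers(0, 2, 2048) for name in ("alice", "bob")}

# A fresh scan of alice: her enrolled code with ~10% of bits flipped,
# mimicking capture noise from lighting, eyelid occlusion, and so on.
probe = database["alice"].copy()
probe[rng.choice(2048, size=205, replace=False)] ^= 1

# Identification: closest enrolled code wins if it clears the threshold.
THRESHOLD = 0.32  # assumed decision boundary; real systems tune this
best = min(database, key=lambda name: hamming_distance(probe, database[name]))
dist = hamming_distance(probe, database[best])
print(best if dist < THRESHOLD else "no match", round(dist, 3))  # alice 0.1
```

The useful property of this scheme is that two scans of the same eye only need to be mostly similar, not identical; the threshold absorbs capture noise.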
Okay, we've got a lot more to talk about with biometrics. Before we get into that, let's take a quick break.

Speaker 1: You know, not long ago I talked a little bit about the early days of researchers trying to use computers to develop facial recognition technology, another example of biometrics. The pioneers in that space included Woody Bledsoe, a mathematician and computer scientist; Helen Chan Wolf, a scientist specializing in early artificial intelligence work; and Charles Bisson, another computer scientist. The three worked for Panoramic Research in California. And they set about trying to figure out how to break down facial features into data points that a computer could analyze and match. You can imagine that at first this involved looking at pictures of people and asking, what are some landmarks that a computer would be able to recognize and measure against, right? Like the corners of the eyes compared to the bridge of the nose, that kind of stuff. And there are all these different images of pictures of people with geometric lines drawn on the faces in an effort to kind of establish these standards.
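Here's a minimal sketch of that Bledsoe-era idea: reduce a face to distances between hand-marked landmarks, normalize them so camera distance doesn't matter, and call two faces a match when the measurements line up. The landmark coordinates and the tolerance are made-up illustrative values, not data from any real system.

```python
# A minimal sketch of landmark-based face comparison; all coordinates
# and the matching tolerance are invented for illustration.
import math

def face_signature(landmarks: dict) -> list:
    """Pairwise distances between landmarks, normalized by eye spacing
    so the signature is (roughly) invariant to camera distance."""
    names = sorted(landmarks)
    eye_span = math.dist(landmarks["left_eye"], landmarks["right_eye"])
    return [math.dist(landmarks[a], landmarks[b]) / eye_span
            for i, a in enumerate(names) for b in names[i + 1:]]

def same_face(sig_a, sig_b, tolerance=0.05) -> bool:
    return all(abs(x - y) <= tolerance for x, y in zip(sig_a, sig_b))

# Hypothetical hand-marked landmarks (pixel coordinates) from two photos;
# photo2 is the same face with all coordinates scaled 2x (camera closer).
photo1 = {"left_eye": (100, 120), "right_eye": (160, 120),
          "nose_bridge": (130, 140), "mouth": (130, 180)}
photo2 = {"left_eye": (200, 240), "right_eye": (320, 240),
          "nose_bridge": (260, 280), "mouth": (260, 360)}

print(same_face(face_signature(photo1), face_signature(photo2)))  # True
```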
And as you might imagine, in the early days this technology hit some pretty tough limitations. A computer couldn't necessarily match two different pictures of the same person's face. If the lighting and the shadows or the angle of the picture were different enough, the computer system would have trouble determining that both images were of the same face. Yes, if you had two pictures of the same person under identical lighting conditions, with the camera at the same distance and the same angle, then it was easier. But with any deviation from that, it became much more difficult. It would take many years for the algorithms, the computer technology, and camera technologies to improve to a point that made facial recognition a possibility. So we're going to skip way up to the present day. We've talked extensively on this show about how facial recognition technology often has major issues, particularly when it comes to false positives and false negatives with certain populations. Now, I'm not going to dive into that whole can of worms yet again, because I'm sure most of you have heard me talk about it a lot, but it is an inescapable fact that most facial recognition systems are prone to making errors due to biases that are in the system. And just to be clear, I don't mean to imply that there was ever any intentional bias, but intentional or not, that bias still has an effect.

Meanwhile, other computer scientists were working in the mid twentieth century on the challenge of developing speech recognition technologies. Now, like facial recognition, this tech had a steep barrier to entry. Voices come in all sorts of timbres, volumes, pitches, dialects, and accents. So two people might say the very same word in very different ways. You know, it may sound very different, and yet it's the same word. And that means the system needs to be capable of recognizing that word no matter who says it or how they say it; otherwise you're not going to have a satisfying experience. Well, folks at Bell Labs were working on speech recognition tech as early as the nineteen fifties, but that work focused on training a machine to recognize when someone was speaking out numbers. It took a decade of hard work to get to a point where the computer systems could recognize words of a certain complexity. The nineteen seventies would see the technology advance significantly, helped in large part by a DARPA initiative. Y'all remember DARPA, right? That's the division within the US Department of Defense that awards contracts to companies that engage in R and D, all for the purpose of funding projects that could potentially benefit US defense initiatives in the future. DARPA has funded research into everything from autonomous vehicles to robots capable of performing half a dozen different tasks, and in the seventies, the division funded work in speech recognition. But speech recognition is really about using voice technology for the purposes of a computer understanding what someone is saying. There's also voice recognition, or, as some people prefer, speaker recognition, because that's all about recognizing a specific person.
So speech recognition is really more about computer systems that can parse what you're saying, glean instructions from that, and respond properly, so that when you ask your smart speaker about the weather, it can give you the information you want instead of, I don't know, spontaneously turning off all the lights in your house or whatever. Not that I've had that experience. Voice recognition or speaker recognition involves breaking a specific voice into data points for the purposes of identification, just like with facial recognition, right? So the idea of developing a system to do serious analysis on voices dates back to the early twentieth century. However, a lot of that early work lacked scientific value or rigor. So while you had people saying, yes, I can get a printout of what someone's voice is, and thus be able to compare two printouts and tell you if it's the same person speaking or not, in truth it was a bit more complicated than that. I mean, you know, it required more than just having a long strip of paper tape pulled across a pencil that in turn was connected to a diaphragm that would vibrate when sound came through something like a horn or a microphone. Yeah, you could create a visual representation of someone's voice that way, but it wasn't something that was specific enough for you to be able to differentiate that voice from another voice, or even sometimes from other noises entirely. Like, you might get a record of a noise that is not a voice at all, but because it looks kind of like what someone else produced when they were speaking into it, you might think, oh, it was this person. It would take years to get to a more sophisticated approach to speech analysis and voice analysis, to approach the possibility of identifying someone based on their voice. In the nineteen forties, folks developed sound spectrograph technology. This is a device that creates a visual representation of a signal.
A sound spectrograph deals with sound, so we're talking about audio in this case, but you can have spectrographs that create visual representations of lots of different types of signals; this one is just specifically about audio. Some people would refer to these as voice prints. They would compare this to a fingerprint, saying, oh, this representation shows the quality of this person's voice; if we find a match, then that's the same person, just like a fingerprint. The people who were actually working in the field really didn't use the term voice print very much, if at all, but it was a term that was used a lot in the media, and I think a lot of folks who were working in the field were frustrated, because they felt it was a bit reductive and an oversimplification of what they were doing. So a slightly more acceptable term is the one I referenced earlier, speaker recognition. This implies the technology isn't necessarily trying to understand what someone is saying. Instead, it's trying to identify the person who is saying it, or verify that the person who is saying it is who they claim to be.
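As a rough illustration of what a "voice print" comparison amounts to, here's a toy sketch that summarizes each recording as its average magnitude spectrum and scores two recordings with cosine similarity. Real speaker recognition uses much richer features, things like cepstral coefficients or learned embeddings; the synthetic "voices" and the 0.9 threshold here are assumptions for illustration only.

```python
# A toy "voice print": summarize each recording as its average magnitude
# spectrum, then compare recordings by cosine similarity. The synthetic
# signals and the 0.9 threshold are illustrative assumptions only.
import numpy as np

SAMPLE_RATE = 8000

def voice_print(signal: np.ndarray, frame: int = 256) -> np.ndarray:
    """Average magnitude spectrum over fixed-size frames."""
    n_frames = len(signal) // frame
    frames = signal[: n_frames * frame].reshape(n_frames, frame)
    return np.abs(np.fft.rfft(frames, axis=1)).mean(axis=0)

def similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def fake_speaker(pitch_hz: float, seed: int) -> np.ndarray:
    """Stand-in for a one-second recording: a pitch, harmonics, and noise."""
    t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
    tone = sum(np.sin(2 * np.pi * pitch_hz * k * t) / k for k in (1, 2, 3))
    return tone + 0.1 * np.random.default_rng(seed).normal(size=t.size)

enrolled = voice_print(fake_speaker(120.0, seed=0))  # the "known" voice
same     = voice_print(fake_speaker(120.0, seed=1))  # same voice, new take
other    = voice_print(fake_speaker(210.0, seed=2))  # a different voice

print(similarity(enrolled, same) > 0.9)   # True: accept
print(similarity(enrolled, other) > 0.9)  # False: reject
```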
You might use this technology to have someone get access to something secret, right? Like the voice print analysis you've seen in movies, where someone walks up to a door and says a phrase, and then apparently the computer inside the door analyzes the voice and either allows entry or denies it. Or you might use it to try and identify somebody. Maybe you've got a recording of someone. Maybe someone calls into a television station and makes a threat. This has happened. And then what you're trying to do is take a suspect you have found and determine whether or not they were the person who made the phone call; you're trying to match the voices together. This is something that's been going on in law enforcement for decades, and for a very long time, courts would reject the evidence presented, because there was a lack of scientific studies showing the accuracy and reliability of this kind of approach. You needed to show that there was a scientific basis for this, and not just a claim that these two voices must be identical. The technology involves sophisticated pattern analysis, and it is really tricky. So for those early court cases, it was probably a good idea to throw the evidence out. Now, maybe some guilty people went free. It's hard to say. But the fact is, we just weren't at a level of sophistication in pattern analysis to be able to have really reliable identification. These days, there are examples of speaker recognition technology built into consumer products. My smart speaker at home supposedly recognizes my voice, for example, and this should make it possible for me to ask about my daily schedule. Because the speaker recognizes that the user is yours truly, it can then cross-reference my calendar and tell me what my schedule is, or at least it's supposed to be able to do that. I don't know. It's probably because I never put anything in my calendar; I'm really bad about doing that. So really, my speaker just gets fed up with me because I'm asking it to do something it really cannot do. All right, we're gonna take another quick break. When we come back, I'm going to talk a little bit more about some other biometric approaches, and also about a current story that really inspired this entire episode and the different sides to that story. But first, let's take another quick break.

Speaker 1: Before the break, I alluded to the fact that there are other approaches to biometric verification or identification. So another one is gait recognition; that's gait, G-A-I-T.
This is the way someone moves through a space: things like the length of their stride, the position of their body as they're walking, the location of various body parts in relation to one another, like how far your hips are from your knees, or your knees from your ankles, that sort of thing, and how much you bend when you're moving. So it is possible to analyze how a person naturally walks or moves through space, and then to use that information to identify someone. If you have reference data where you know how a person typically walks, then you might be able to use that when you're searching for somebody. A person could be trying to evade surveillance, for example, through the use of disguises, but gait analysis might reveal who they really are, assuming you've got reference data as well. I've seen movies that do this, where it's a person who's just watching a video feed, and they're saying, ah, I see they have a limp, and, you know, we know so-and-so favors their right leg, so I think we've got the person here. We've seen that sort of thing. It's the same basic idea, except you don't have to have a limp. It's really just all the basic movements you make that end up kind of being unique to you. You can try to disguise that. Obviously, you can purposefully give yourself, like, a limp or something in an effort to throw off any surveillance techniques that would be looking to match your gait with you, and that's a possibility. But this is an actual thing that's being used, this gait analysis and gait verification.
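To sketch what gait analysis can look like on pose-tracking output, here's a toy example that reduces a short walking clip to two scale-free numbers, stride relative to leg length and thigh-to-shin proportion, and compares them against an enrolled reference. The keypoints and the tolerance are invented; real systems estimate poses from video over many frames and use far more features.

```python
# A toy gait comparison on invented pose keypoints; real systems work
# from video pose estimation, and these tolerances are assumptions.
import math

def gait_features(frames: list) -> tuple:
    """Reduce pose frames to (stride / leg length, thigh / shin ratio),
    two measurements that don't depend on how far away the camera is."""
    thigh = sum(math.dist(f["hip"], f["knee"]) for f in frames) / len(frames)
    shin = sum(math.dist(f["knee"], f["ankle"]) for f in frames) / len(frames)
    stride = max(abs(f["ankle"][0] - f["other_ankle"][0]) for f in frames)
    return stride / (thigh + shin), thigh / shin

def matches(probe, reference, tolerance=0.1) -> bool:
    return all(abs(p - r) <= tolerance for p, r in zip(probe, reference))

# Invented keypoints for two clips of the same walker; clip_b has
# roughly halved coordinates (camera about twice as far away).
clip_a = [
    {"hip": (0, 100), "knee": (2, 60), "ankle": (10, 20), "other_ankle": (-30, 20)},
    {"hip": (5, 100), "knee": (7, 60), "ankle": (-20, 20), "other_ankle": (25, 20)},
]
clip_b = [
    {"hip": (0, 50), "knee": (1, 30), "ankle": (5, 10), "other_ankle": (-15, 10)},
    {"hip": (2, 50), "knee": (3, 30), "ankle": (-10, 10), "other_ankle": (12, 10)},
]
print(matches(gait_features(clip_a), gait_features(clip_b)))  # True
```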
Really, when it comes to biometrics, again, as I said at the beginning, the important quality is that whatever thing you are measuring needs to be unique to individuals. So it could be physical, it could be behavioral, it could be a combination of the two, but it has to be unique, or else it doesn't do you any good, whether you're using this technology to verify someone's identity before letting them through a secure checkpoint, or you're just trying to determine an unknown person's identity by comparing it against a database of known individuals.

Now, there are a lot of reasonable concerns about biometrics. It's totally understandable, right? Like, this is information that is unique to you; that is very private info. And we're seeing more and more implementations of biometric systems around us. Like I mentioned before, your very phone may rely on it. Mine has a fingerprint sensor, for example. A lot of iPhones use facial recognition to activate, to turn on without you having to put in some sort of code. So we're seeing biometrics rolled out in very wide deployments, in all sorts of different applications. We've also seen it for things that are very official, not just uses in consumer products, but things like, you know, maybe getting an iris scan as part of identification, like in a passport or something. Or you might have a biometric scan that allows you to bypass the normal process of getting on a plane. I've had that happen a couple of times, and it freaked me out the first time, because I don't remember ever submitting to the initial scan, but it knew who I was when I walked up. And I thought, well, that's weird, because I didn't sit down or knowingly agree to do a process. I probably did agree, I just didn't read the fine print, because nobody ever does, right? But I walked up to the gate and it recognized who I was, and I didn't even have to present a boarding pass or anything. I just got waved in. And that's a little creepy, and I can see why people would be hesitant about it.
And you think about the possibilities of this technology and the possibilities of abusing that technology, and it quickly does get into that dystopian kind of a vibe. But we're still seeing it rolled out every place. There are a lot of sports stadiums out there that now use biometric systems to scan someone, and that way you have a ticketless approach, right? You walk up to the stadium, you scan your palm, and it identifies who you are and the fact that you already have tickets to that event, and you're just waved in. This is being deployed more and more around the world. And as I record this, Amazon is rolling out its Amazon One technology. This is its palm scanning technology, which was already being used in things like Amazon Go storefronts. It's now rolling out to Whole Foods grocery stores here in the United States; it's already in several hundred of them, and it's going to be put into the rest of them before the end of the year. And the value proposition that Amazon is giving customers is mainly one of convenience: customers can opt to scan their palm, which will then create a unique identifier associated with that person. That identifier in turn is linked to the customer's Amazon account, and then to whatever payment options they have linked to that Amazon account. So you go to Whole Foods, you do your grocery shopping, you scan all your items at the self-scan cashier area, and then you scan your palm, and it automatically deducts however much the groceries were from your account, and you're done. You don't have to carry credit cards. You don't even need a smartphone with a digital card on it. You literally have your payment on hand. It is your hand. The scanner, meanwhile, is looking not just at the skin of the palm. It's actually using wavelengths of light that let it see below the skin level, to the pattern of veins in your palm.
That ends up being part of the information that ultimately generates this identifier that's associated with you. Amazon says they're not actually saving the palm scans themselves. According to Amazon, it's not like there's some sort of massive database that has all these different palm scans in it. Instead, what they're saying is that when you scan your palm, the scanner essentially reduces all the features of your palm into data points, which then end up generating this unique identifier. So really what you're doing is verifying every single time, right? You hold your palm up, it scans it, it generates this number, it compares that to the number on file, and if the two numbers match, yay, you're who you say you are, and then you can be charged for your very expensive groceries and go on your merry way.
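Amazon hasn't published how Amazon One works internally, so what follows is only a generic sketch of the enroll-then-verify pattern just described: reduce a scan to measurements, derive a one-way identifier from them, store only the identifier, and verify later scans by recomputing and comparing. The quantization step, the measurements, and the salt are all illustrative assumptions, not Amazon's design.

```python
# A generic sketch of enroll-then-verify with a one-way identifier.
# NOT Amazon One's actual (unpublished) design; the measurements,
# quantization, and salt are illustrative assumptions.
import hashlib

def scan_to_identifier(features: list, salt: bytes) -> str:
    """Quantize the measurements (so small capture noise doesn't change
    the result), then hash; the raw scan can't be recovered from this."""
    quantized = ",".join(str(round(f, 1)) for f in features)
    return hashlib.sha256(salt + quantized.encode()).hexdigest()

SALT = b"per-user-random-salt"  # would be generated at enrollment

# Enrollment: hypothetical vein-pattern measurements from a palm scan.
enrolled_id = scan_to_identifier([12.31, 4.58, 9.02, 7.74], SALT)
# Only enrolled_id is stored; the measurements are discarded.

# Checkout: a fresh scan with slight sensor noise quantizes the same way.
probe_id = scan_to_identifier([12.29, 4.61, 9.04, 7.69], SALT)
print(probe_id == enrolled_id)  # True: charge the linked account
```

One honest caveat about this toy: a measurement sitting right on a rounding boundary would flip the hash, which is why real template-protection schemes need fuzzier matching than simple quantization. But it does show the key property: the stored string alone doesn't let you work backwards to the palm.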
What they're saying is not happening is that it's scanning your hand and then comparing the physical scan against a previous physical scan that has to be saved somewhere. That's important, because all the information that Amazon is storing is in the cloud. They say that they've put a lot of security on this, because people are understandably concerned about having very personal and personally identifiable information stored on the company's systems. And if it's in the cloud, that means that potentially you could have hackers target it. Maybe you could have law enforcement agencies try to force Amazon to share information as they go through some form of investigation. So people are understandably concerned. If Amazon's telling the truth and the only information it's storing is a unique identifier, then if hackers were to get access to that, it arguably wouldn't be very useful. It'd be kind of like you've got the answer to a math question, but you don't know what the math question was. Or, in the case of The Hitchhiker's Guide to the Galaxy, part of the story is that these people build this massively powerful computer to give the answer to life, the universe, and everything, and the answer ends up being forty-two. And they say, how's that? What does forty-two mean? Well, you need to know the question. You need to know what the question is for forty-two to make sense; you just have the answer to life, the universe, and everything. And so you see that you need to know both the question and the answer. If you only have the answer, you don't know what the question was. So if Amazon's just storing these numbers and hackers got access to them, they wouldn't be able to backtrack and figure out your scan. What they could do, however, potentially, is find out every single time you used that technology, whether it was when you were grocery shopping, or going to a sporting event, or going to any other place that has opted to use Amazon One, Amazon's palm scanning technology, in their business. It would become a way of tracking your movements, and potentially also seeing what it was you were using the scanner for, you know, maybe getting access to stuff or whatever, like going to a sporting event or a concert. And that's where you start to see real security and privacy issues. Even if Amazon's super, super careful with this, Amazon itself still has access to all of that, right? So of course, if you go to Amazon dot com and buy something, Amazon knows what you bought, and it can use that information to try and target you for advertising and to give suggestions for products you might find useful, based upon your past purchases. If you're using Amazon's palm scanning system out in the wild, then Amazon also knows all that information out in the real world, not just on Amazon dot com. So you go to Whole Foods and you buy a whole bunch of groceries, you scan your palm. Now Amazon knows exactly what things you bought, and they know it's you.
They've got the identifier, it's connected to your Amazon account, and that could be used for the purposes of targeted advertising. Amazon has said they're not sharing the palm data with advertisers, which is fine, but in all the stuff I read, they very carefully did not say they weren't sharing purchase history or use history. And sure, they might keep your actual palm scan private, but if they're sharing with advertisers what it is you're buying or where you are going, the palm scan thing ends up being kind of moot. Like, that's not what's important. Your activities are telling the advertisers and Amazon a lot about you, beyond just what your palm is like. Yeah, knowing where the veins are in your palm is potentially disastrous if someone's trying to somehow replicate you and run up a bunch of charges on your account. But to me, the more disturbing thing is that every time you use a point of sale or point of access that relies on this technology, it's another data point that associates you with a specific action, and collectively that ends up really mattering a whole lot, both to Amazon and to potential advertisers. So I have real concerns about using this kind of technology. I mean, you could argue the same thing is true if you're using the same credit cards or the same payment systems. It's the same issue, right? Like, honestly, the big issue I have, and I'm not, like, a conspiracy-minded person, but the big issue I have is that we are past the days where you would do things like cash transactions for a lot of stuff. And cash transactions could be really useful if you needed to make a transaction that you didn't want to be associated with you for the rest of your life.
Right? If you need to buy something, maybe you need to buy some medication, and it's nobody's business but your own that you need this medication, you normally would use cash to do it. But now you're using some other method that is linked to you, which means other people know that you're having to get this medication, beyond you and the pharmacist and your doctor, and that becomes a potential privacy issue. And you could apply that to all sorts of different stuff. So the biometrics approach, especially in consumer uses like access to events or ways of paying, doesn't appeal to me. But you could argue that it's really just an extension of how we already interact in the world. I guess the more I talk about this, the more I can see why some cryptocurrency enthusiasts really like cryptocurrency. They like the idea of being able to use currency in a way that doesn't immediately associate them as a person with their purchases or the way they're spending their money. I kind of get that. I don't think cryptocurrency is the solution for that, personally. That's my own personal belief. But I can understand the tendency, because I feel the same way about biometrics, and other forms of credit and debit cards and stuff too. But really, biometrics just takes it to a degree that's so personal that I think it's impossible to deny. So anyway, that's just a quick overview of biometrics. There's clearly tons more we can talk about, and lots of different applications that have really valid uses, including ones where you could argue, yeah, I understand that I'm giving up my privacy, but in return I'm getting this additional convenience, and maybe even some other features that normally wouldn't be available to me if I were using other methods of payment, so I'm willing to make that trade. That's valid, if that's who you are. There is nothing invalid about that.
I think the important thing is just making the decision with as much information as possible, so that you feel good about it. And even though I don't feel good about that for myself, I wouldn't fault someone else for opting into it. If to them it's, no, this is what matters most to me, that to me is a perfectly cromulent thing to say. Okay, I've rambled enough. Like I said, we'll probably do episodes where we focus more on specific types of biometrics in the future. I have done ones on things like fingerprints, and I think I've even done some on voice recognition, although I think I was focusing more on speech recognition as opposed to speaker recognition. But maybe I'll do some more specifics and talk about the actual development of the technologies and the algorithms needed to make that possible. Because, you know, I summarized it in this episode, but you have to understand it took decades of work, tons of very smart people, and a lot of advancements in technology to make it a really feasible possibility. So, very interesting stuff. Lots of concerns around it, not necessarily about the technology itself, but about how we apply it. And yeah, there's plenty more to talk about, but for now, let's wrap up. I hope you are all well. This Tech Stuff Tidbits ended up being sub forty minutes, so I'm going to call that a win, and I'll talk to you again really soon.

Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.