Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio. And how the tech are you? Well, it's time for another Tech Stuff classic episode, and this episode is titled Authentication, Tech and You. It originally published on February twenty-second, twenty seventeen. Let's have a listen.

Speaker 1: I feel like security is becoming a bigger and bigger concern, as it should be, for a lot of people. People are more aware of it, I think, than they were perhaps five years ago. But not everyone is practicing good security measures. Not everyone's practicing two-factor authentication or multi-factor authentication. We'll talk about that in this episode. And if you weren't familiar with what that's all about, that's why I wanted to do the show: to kind of explain what that actually means and why it is important.

Speaker 1: Authentication is something that we should probably define first of all. It's the process or action of proving something to be true, genuine, or valid. So that covers a broad spectrum, right? You could be talking about authenticating a historical artifact. That's a great example: you bring a historical artifact to an expert, and they authenticate that it is in fact a historical artifact and not something that was whipped up in some sort of souvenir shop in some out-of-the-way place.

Speaker 1: But authentication has a very special role in the world of technology. In the world of computers and electronics, it gets a bit more specific. It's the process of verifying the identity of a user or a program or process. You want to make certain everything is authentic so that a program or person doesn't get unauthorized access to a system. So you're probably familiar with a lot of authentication processes, even if you didn't call them that, because you yourself have to employ them on a regular basis. Programs do too. But I'm not gonna really spend a lot of time talking about programs.
Speaker 1: In fact, I'm really not going to dive into that at all, because it gets super technical, and really I think it's more important to focus on the stuff that you have a direct involvement with, unless, of course, you're a programmer, in which case mea culpa. So I'm going to focus on authentication technology targeted at humans. One day maybe I'll do a software one if there's a lot of requests for it, but I feel like that might just get a little too deep in the weeds. So I'm going to talk about the stuff you and I encounter when we try to access or protect our technology and our data.

Speaker 1: Now, there are a ton of different ways to do this. Some of them are inherently stronger methods of authentication than others and are better as far as being more secure. And all of these authentication strategies can be divided into three broad categories. Those categories are inherence factors, knowledge factors, and ownership factors. So when you hear about two-factor authentication, we're talking about a specific strategy that employs different approaches belonging to different factors. Now, that doesn't really mean anything unless I expand on it.

Speaker 1: So an inherence factor relies upon the user him- or herself. In other words, it has something to do with you as a user. It has to do with either your physical traits or your behavioral traits. A very easy-to-understand example of this would be a fingerprint scanner, right? Your fingerprints are unique to you. It is something you were born with; it is inherent in who you are. So it's an inherence factor. But there are lots and lots of others, and I'll talk about some of those later on in this episode. Knowledge factors are pretty self-explanatory. Those are authentication strategies that rely on something that the user knows, like a password or a personal identification number, otherwise known as a PIN. Ownership factors are also pretty easy to understand. Those rely on something the user possesses, like a key card for a security door.
Speaker 1: That would be an ownership factor. Now, on top of those categories, you have the additional strategies to enable authentication, which includes that two-factor authentication that I talked about before. And maybe you don't know exactly what that means. Well, that's why I'm here, really. Single-factor authentication relies on just one component to access a system. So, for example, a lot of smartphones require users to unlock the device with a PIN or a swipe pattern or a fingerprint scan. But that's it, right? You just have to do one of those things, not multiple things, and once you do whichever method you've enabled on your device, you have access to it. There's no secondary requirement. Systems that use single-factor authentication are, in general, weaker than those that require more than one authentication strategy. There are some different definitions for strong authentication I'll get into, and you could argue that some inherence factors are so strong as to be fine on their own. But in general, going with a single factor is less secure than going with a two-factor authentication strategy, which is exactly what it sounds like: it requires two different authentication factors. That means the system will require users to provide authentication in two of those three categories. So an example of this is an ATM card. If you want to use an ATM card, you need to provide the card; that's an ownership factor, because you have to be in possession of the card. And you have to supply the PIN; that's the knowledge factor. So you have an ownership factor and a knowledge factor. Those are two factors. That's two-factor authentication. Possession of one factor should not be sufficient to access the respective system, nor should it lead to the discovery of the second factor.
Speaker 1: In other words, if you get hold of the card, like you get hold of someone else's card, ideally there should be no indication on the card of what the PIN is, because you need both of those things in order to access someone's account. And if you make sure that only one of the two things is in the possession of somebody else, they still can't get your stuff. So that's why you want the two-factor authentication: you have to possess or know both of the authentication requirements independently of each other. This also applies to other factors as well. It doesn't just have to be knowledge and ownership. It could be ownership and inherence. It could be knowledge and inherence. You get the idea.

Speaker 1: So, if you've enabled two-factor authentication on various online accounts, which I urge you to do for any accounts that actually offer it, you've likely had to supply a password as well as a code sent to you in some way. For example, you might have an email account that, when you try and access it using a brand new device, says, all right, what's your password? So you type the password in, and then it says, all right, now I'm going to send you a code via text message. You need to put that code into this little box here, and then I'll give you access to your email. So the password part taps into that knowledge factor, because you know the password, and the text message taps into the ownership factor, because there's a specific cell phone with a specific cell phone number associated with your email account. You have to be in possession of the cell phone in order to receive the text message and complete that authentication strategy. Many two-factor authentication systems will actually allow you to designate specific devices as being quote unquote safe, meaning that you don't have to do that every single time you log in from that specific device. That way, you don't end up waiting for a text message every time you try and check your email from your personal laptop computer or smartphone.
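To make that password-plus-texted-code flow concrete, here is a minimal Python sketch. It is an illustration under assumptions, not any provider's real implementation: the send_sms callback and the plain SHA-256 password hash are hypothetical stand-ins, and a production service would use a dedicated password-hashing library and a real delivery channel.

```python
import hashlib
import hmac
import secrets

def check_password(supplied: str, stored_hash: str) -> bool:
    """Knowledge factor: compare a hash of the supplied password against
    the stored hash, in constant time."""
    supplied_hash = hashlib.sha256(supplied.encode()).hexdigest()
    return hmac.compare_digest(supplied_hash, stored_hash)

def issue_code(send_sms) -> str:
    """Ownership factor: text a fresh six-digit code to the phone on file.
    send_sms is a hypothetical delivery callback."""
    code = f"{secrets.randbelow(1_000_000):06d}"
    send_sms(code)
    return code  # the server remembers this to check the user's entry

def check_code(entered: str, expected: str) -> bool:
    return hmac.compare_digest(entered, expected)

# Example flow: both factors must pass before access is granted.
stored = hashlib.sha256(b"hunter2").hexdigest()
if check_password("hunter2", stored):
    expected = issue_code(lambda c: print("texting code:", c))
    print("access granted:", check_code(expected, expected))
```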
Speaker 1: Now, there are systems that require even more forms of authentication, and we typically group these under the category multi-factor authentication, indicating you've got to supply at least two methods in order to access the respective system. So technically, two-factor authentication is a type of multi-factor authentication. Most of the time when I encounter it, though, multi-factor is being used to mean more than two. I haven't personally ever encountered a system where I've had to supply more than two factors. But then again, no one trusts me with anything that's that important, so no big surprise there.

Speaker 1: Now, confusing matters somewhat is this term called strong authentication, which is used in a lot of different places, including the European Union. In fact, it's very prominently used in the EU. At first glance, you might think strong authentication and two-factor or multi-factor authentication are synonymous, that in order for it to be strong, it must be at least two-factor authentication. But that's not actually the case. If a single authentication strategy is deemed secure enough, it can fall under the category of strong authentication. And so there's a lot of disagreement over what the actual definition is, which makes it pretty confusing. But let me give you an example. Let's say that there's a retinal scanner that scans the pattern of blood vessels in your eye. Now, that's really difficult to replicate compared to other biometric measures such as a fingerprint, which you could, in fact, if you're very clever, fake. So in the European Union, a system that looks at the blood vessels in your eye for authentication might be considered strong even though it's just a single factor.
Speaker 1: Let's say you don't have to provide any other information; it's just a quick scan of the eye and you're in. If the system is robust enough, and if it's looking at something that is difficult enough to replicate, it could still count as strong authentication. It could even refer to knowledge-based factors. So let's say a system requires you to answer a series of unrelated questions when you set up your account. Accessing the account at a later time requires that you replicate those answers; you've got to remember how you answered the questions when you first set it up. It's kind of like the security questions a lot of different systems use right now. Because these questions are unrelated, and knowledge of one answer doesn't provide any of the other answers, that could be considered strong authentication. Now, personally, I find that method to be a little on the flimsy side, but I'm not the one making definitions. I'm just reporting them to you guys.

Speaker 1: Now that we've got the basic definitions out of the way, let's dive into a bit of history, because you guys know I love to talk about the history of the various technologies and processes we've developed over the years. So the concept of authentication is ancient. It predates electronics by centuries. Throughout the years, people would have to provide some sort of proof of their identities. It might require someone else to vouch for a person, or it might require a special seal belonging to a particular office or noble house, placed upon an official document. You may have heard that a lot of those documents would be sealed with wax, and then someone would use a signet ring to put a specific stamp in that wax. That was considered a form of authentication. If you saw the proper symbol, then presumably the document came from the proper place. Not that you couldn't create a fake of that if you really wanted to, but you know, that was the idea.
Speaker 1: Or you might even just have a password shared between a small group of people. So as long as there have been secrets, there have been means to identify those who should and should not have access to those secrets. And secrets predate the written word. But let's talk about passwords and authentication in electronics, because honestly, if I did a full episode about the history of passwords, that would not really be Tech Stuff. That would be an awesome, awesome episode of Stuff They Don't Want You to Know. Hint, hint.

Speaker 1: So computer passwords actually predate personal computers. Back in nineteen sixty-one, MIT created a password system for authorized access to its Compatible Time-Sharing System, or CTSS. CTSS allowed multiple users to access the same computational core. So imagine that you are in a room filled with tables, and every table has a couple of different workstations. Every workstation has a screen and a keyboard, but not a computer. They just have the keyboard and the screen, which are connected via cables to a single computer. Everyone is sharing the exact same computer. Way back in the day, that's how a lot of computer systems were made. They didn't have personal devices at every station. The stations were just dummy terminals that connected to a core system. Also, in those days, time sharing meant that the computer would actually divvy up when it was specifically available to do your calculation. So let's say you're typing in something, you're programming some code, and you send it to the computer. It would be responding to each station in turn, and it's doing it so fast that it feels almost instantaneous, or close enough to it, but in fact it would be responding in sequence to the people logged into the various terminals. Now, obviously, using the same computer for all these dummy terminals creates some challenges. How can each individual user maintain control over his or her data? How do they maintain their own private files?
Speaker 1: Because every user had a set of private files that other users should not be able to access without authorization. I mean, one person might be working on a project while someone else is working on a totally different project. You don't want those files to intermingle; you had to partition that stuff, and without a password, you really couldn't do that. If everyone is using a core machine as the processor and storage unit, you have to create some means of differentiating one user from another. The solution was the password. Every user would get a unique password to enter into the system, which would then allow that user to create and access private files. And it also helped control the amount of time any individual user had with the machine, because these machines were rare. There were only a few of them in nineteen sixty-one, so the time on those machines was very valuable. People were hoarding time; you might only get a few hours a week, so they would end up parceling that out through passwords. It was kind of like a controlled ticket system, so that a ride doesn't get overwhelmed with a ton of people: you release a certain number of tickets per hour and you keep the traffic flowing steadily. Same sort of thing, except in this case it was with computer access, so it was a way to control the point of entry into the system. We're going to take a quick break from talking about authentication tech to thank our sponsors.

Speaker 1: Now, at that time, the passwords were pretty simple, and they were not really secure at all. It was more a matter of convenience than security, really. After all, this predated the Internet, so external access to the system wasn't really a factor. If you wanted to get your hands on those sweet, sweet private files, you actually needed to have physical access to the system itself. You couldn't just hack in from across the country.
Speaker 1: So in a way, that's one factor of authentication all by itself: ownership. In this case, the ownership doesn't really refer to something that you personally owned, but rather to your physical access to the system. But these passwords weren't encrypted or stored in a particularly safe way. They were in plain text. So just a year after MIT debuted this password strategy, a graduate student named Allan Scherr accessed the entire list of unencrypted passwords stored on the system and printed them out. Now, the reason Scherr did this was not to access private files created by other people. It was so that Scherr could get more time on the system, because every student was allotted just four hours of access per week, and he needed more. He figured, well, there's all these other hours of access that are going unused by other students. That's not fair. I'll just take their hours and use them myself. The way he did this was he actually created a punch card that contained the file name and location for the password list, along with a set of instructions that said take this file and send it to a printer. He didn't even have to physically look at the file at all. He just had to figure out what the file name was, where it was located on the system, and then include the instruction: send to printer. By the way, if you want to know more about how punch cards work and the way they were an integral part of early computing, you can actually listen to a classic two thousand and nine Tech Stuff episode titled Computers from the Past; Chris Pollette and I talked a lot about them in that episode. So it's easy in hindsight to criticize the MIT strategy, but keep in mind this was at a time when unauthorized access to computers was exceedingly rare, because, well, the computers were exceedingly rare. As computers began to proliferate throughout all areas of life, the need for more secure access strategies grew.
Speaker 1: According to Roger Needham, who was a professor of computing at Cambridge University, the Cambridge lab came up with a concept to make passwords more secure, and that's the concept of hashing. That's when you convert passwords of variable lengths into a fixed-length string of characters, using an algorithm for the transformation. It's a fancy way of saying that no matter how long or short a password is, you put it through a series of mathematical processes. You convert the password into numerals first, then you run that series of mathematical operations, and the result is a much longer string of characters that represents the password. And it doesn't matter how long or short the original password was; all of the hashed versions of the passwords are the same length. So let's say the hash is eighty characters long. That means whether your base password is "pass" or "antidisestablishmentarianism" or anything else, it will end up converted into a string of eighty characters. So if someone gets hold of the hashed passwords, which are the only ones being stored on the system, they would still have to figure out what mechanism was used to generate the hashes in order to guess what the root password was, because otherwise the hashes all look like they're eighty characters long. You won't know which ones were short passwords and which were long passwords. In order to do that, obviously, you have to decide upon what the specific sequence of mathematical operations is going to be, and what seed you're using for those operations. And once you do that, then you're able to make these kinds of transformations. Needham said that the system was created and implemented in the mid-to-late nineteen sixties, so it wasn't very long after the MIT rollout of passwords.
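As a concrete illustration of that fixed-length property, here is a minimal Python sketch using the modern SHA-256 hash. This is a stand-in for, not a reconstruction of, the 1960s Cambridge scheme, and its digest happens to be 64 hex characters rather than the arbitrary eighty in the example above.

```python
import hashlib

# Whatever the input length, the digest is always the same length.
for password in ("pass", "antidisestablishmentarianism"):
    digest = hashlib.sha256(password.encode("utf-8")).hexdigest()
    print(f"{password!r} -> {digest} ({len(digest)} characters)")
```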
Speaker 1: Later still, computer scientists began to develop more secure hashing strategies. This includes salting passwords, which means adding characters to a password before you hash it. A simple example of this is using a computer's clock to insert digits into the password and then hashing the new password, which makes it even harder for a hacker to figure out the root password from the hash, because they would need to know at what time that operation was performed on the original password; otherwise they wouldn't be able to replicate it. Now, this is easier to understand if I give you an example. So let's say your password has been set to, let's say, techstuff. You chose techstuff as your password. First of all, that was dumb. Don't do that. Don't pick a word that's easy to guess, even if it's a name like Tech Stuff, which is, granted, an awesome show. But you've chosen techstuff for this example. You access the system at two thirty-five in the afternoon. Let's say that the computer converts that into military time, so that gives you fourteen thirty-five, and then it salts your password with those numbers. So instead of just saying techstuff, it now says t-1-e-4-c-3-h-5-stuff. That password then gets hashed into that eighty-character-long version stored on the computer. By the way, that eighty characters is just an arbitrary example; it doesn't really mean anything. I just needed a number for the example. Now, let's say you access the same system the following day, but this time it's one twenty-three in the afternoon. Remember, it was two thirty-five the day before, but now it's one twenty-three the next day. The salted password is going to be different, because the computer is going to convert one twenty-three to military time and then salt the password that way, so it would be t-1-e-3-c-2-h-3-stuff. The hashed value will end up being different as well, because those new numbers have been inserted. So that means that if a hacker gets two versions of your hashed password, they're still going to be different from each other. It's all going to be dependent upon the time you tried to access the system. Now, the system itself knows when you were accessing it, so it's able to do all of this decoding easily. There's no problem for the system, but it makes it difficult for a hacker to figure out what your password was based upon the hashed value that appears inside the system.
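Here is a minimal Python sketch of that clock-based salting illustration, with SHA-256 again standing in for the hash. It mirrors the teaching example above only; real systems store a random salt alongside each hash rather than deriving it from the time of day.

```python
import hashlib

def salt_with_time(password: str, military_time: str) -> str:
    """Interleave the digits of a 24-hour clock reading into the start of
    the password: 'techstuff' at 14:35 becomes 't1e4c3h5stuff'."""
    digits = military_time.replace(":", "")
    interleaved = []
    for ch, d in zip(password, digits):
        interleaved.append(ch + d)
    # Keep whatever is left of the password once the digits run out.
    interleaved.append(password[len(digits):])
    return "".join(interleaved)

# The same password salted at two different times hashes to two
# different values, just as in the example above.
for clock in ("14:35", "13:23"):
    salted = salt_with_time("techstuff", clock)
    print(clock, salted, hashlib.sha256(salted.encode()).hexdigest()[:16])
```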
Speaker 1: Now, of course, hackers can bypass all that and try to hack a password using brute force. That's when someone, and usually it's a computer program rather than a person these days, submits endless guesses into a password-protected account in order to gain access. There's no need to work backward from hashed values using this approach; you're just guessing the root password from the get-go. But it takes a lot of time, particularly if the user has created a strong password. The longer and more complex a password is, the less likely a traditional computer can crack it in a reasonable amount of time. Given enough time and enough computing power, any password can ultimately be cracked by brute force. But the more complex and the longer it is, the more time it requires, to the point where it can approach times that last centuries, which means no one's going to bother to do it, because they're not going to be around to actually see it work. That's assuming you've picked a good, strong password, and it's why you should never use real words or even names as a password. They're too easy for a computer to guess using what's called a dictionary attack.
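A quick back-of-the-envelope sketch shows why length and complexity stretch brute-force times toward centuries and beyond. The guess rate below is an assumed figure for illustration, not a measured benchmark.

```python
GUESSES_PER_SECOND = 1e10  # assumed rate for a hypothetical cracking rig

def worst_case_years(alphabet_size: int, length: int) -> float:
    """Years to try every password of the given length over the alphabet."""
    seconds = alphabet_size ** length / GUESSES_PER_SECOND
    return seconds / (60 * 60 * 24 * 365)

# Nine lowercase letters fall in minutes; twelve characters drawn from all
# 95 printable ASCII symbols take on the order of a million years.
print(f"{worst_case_years(26, 9):.6f} years")    # roughly nine minutes
print(f"{worst_case_years(95, 12):,.0f} years")  # about 1.7 million years
```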
Speaker 1: So make sure you create those really strong passwords. And as always, I like to recommend using a password management program so that you don't have to remember those strong passwords, because obviously the downside to creating a strong password is that it's difficult to remember. It's really easy to remember a word like techstuff, but that's not very secure. Unfortunately, the more secure approach is also difficult to remember, and you don't want to just write stuff down someplace, because that kind of defeats the purpose of having a secret password. Having a really good password management system, and then just having to remember one good master password, simplifies things. So I recommend that. Well, I've got a lot more to say about authentication strategies, but before I get into it, let's take a quick break to thank our sponsor.

Speaker 1: Okay, so I think we've covered passwords pretty thoroughly. Let's talk about some other authentication strategies. One of the earliest authentication systems in electronics was the personal identification number, or PIN. And technically, yeah, if you say PIN number, you're repeating yourself, just as if you were to say ATM machine. And I still do it, just like a lot of people. If someone can realistically argue that irrespective is a word, I can argue PIN number is acceptable, dang it, so don't write me. The PIN debuted on the world scene in nineteen sixty-seven. That's when Barclays of London introduced the first ATM system, which a man named John Shepherd-Barron invented. Barclays needed a method that kept customers' finances safe; otherwise, anyone might be able to access anyone else's money, and that does not make for a very positive banking experience. I mean, it does for the person who makes off with all the cash, but for everybody else it's pretty negative. The solution was the PIN, which was a numeric code unique to the customer. The standard for PIN management is actually called ISO 9564-1. Technically, this standard allows for a spectrum of PIN lengths. We're mostly used to four digits, but it doesn't have to be just four. Four is the minimum number of digits you can use, but you can use up to twelve digits.
Speaker 1: But we humans tend to have trouble remembering lots of unrelated numbers, and if you're choosing lots of related numbers, then that makes it pretty easy for people to guess your PIN. So most ATMs, especially in the banking and finance industry, would require a PIN of four digits in length, which dates back to the first ATM system. So why was the number four picked in the very beginning? Why just four digits? Well, that's because John Shepherd-Barron, who originally was going to use a six-digit PIN system, found that his wife, Caroline, had trouble remembering anything more than four digits. He sensed that there could be a problem with longer PINs and decided to stick with four digits instead of six. That's why we have that now. Those early ATMs didn't accept plastic cards with a magnetic stripe on them the way modern ones do, and obviously the chip-and-PIN system was decades away. So instead, what you would use was a check. You would actually insert a check into the machine, and each check had information encoded upon it that allowed the ATM to read, for example, how much money it represented and who it was supposed to go to. You would couple this with the proper PIN, and then the ATM could dispense cash at all hours of the day, which eliminated the need for people to make time to visit the bank during bank hours, which we all know are the shortest hours in the world. If you'd like to learn more about ATMs and how they work, be sure to check out the classic episode of Tech Stuff called, appropriately enough, How ATMs Work. I republished it in February twenty fifteen, so you can listen to that, but it actually dates back much further than that. It's really a blast from the past with some of the stuff in this episode. Now, another strategy is to use tokens. That's very popular for authentication strategies.
Speaker 1: There are several versions of these, including tokens that have a static code that acts like a key to a system's lock. Now, those are not terribly secure, because if someone else gets hold of that token, they can pretty much get into the system. They represent kind of a single-factor method of authentication on their own. For example, if you work in a building that requires you to tap a security card to a panel in order to unlock the door, that's a single-factor approach, right? There's no need to submit any other proof that you should have access. As long as you possess the security card, you can enter the building. It's just like having a physical key to a physical lock. You could pair that with another factor and make the security stronger: there could be some additional information or element that you'd have to supply apart from just owning the card, and that would make it a two-factor authentication approach, and a stronger, more secure system.

Speaker 1: Now, there are a lot of tokens that are used in two-factor authentication, and one of the most common is a device with a small LED screen that displays a string of seemingly random numbers when you activate it, and those seemingly random numbers change over time. Let's say that you pull out this token in order to access a system that's asking for this code. You press a little button, the numbers light up, you type the numbers into the system, and it gives you access. Then the next day you want to access it again. You pull out the token, you press the button, a totally different set of numbers shows up, you type those into the system, and you get access to it. What the heck is going on? How does that work? How does the token magically know what numbers to create? It's actually a pretty elegant system, as it turns out. I'll give an example of one way this can happen. It's not the only way, but it's a pretty common one.
Speaker 1: In most of these devices, the token has a low-power clock, which is synchronized to the system it is related to, and it also has a serial number associated with that specific token. The token uses those two values to generate what is called a PRNG value, and PRNG stands for pseudo-random number generator, which means pretty much what it sounds like. It can create a string of numbers that appears to be random, though ultimately those numbers are in fact determined by an ordered series of calculations. But you have to know what those calculations are, and what the two starting numbers were, in order to get the pseudo-random result. So when you're typing the string of numerals into a system, the system runs the same PRNG operation using the same time stamp and the serial number for the token. Now, that obviously requires the system to quote unquote know what your token's serial number is, so you have to have an officially registered token. And if the system's result matches the one that you typed in, you're authenticated. Typically these codes that you generate have a shelf life of a certain amount of time. Let's say it's thirty minutes. So you use the token, and it takes the closest time at the thirty-minute mark from when you push the button. You push the button at two thirty-five; it says two thirty, runs the operation, and gives you some numbers. You type them into the system. The system looks at its clock and says, oh, it's two thirty-seven. Well, the closest half-hour mark was two thirty, so I'll use that to start off with. I happen to know that the serial number for this particular token is such and such. I'll use that to perform the same set of operations, and it should create the exact same result.
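Here is a minimal Python sketch of that clock-plus-serial idea, using an HMAC as a stand-in for a token vendor's proprietary pseudo-random calculation and the episode's thirty-minute window. Real deployed tokens, and the later TOTP standard (RFC 6238), use a secret key and much shorter windows, so treat this as an illustration only.

```python
import hashlib
import hmac
import time

WINDOW_SECONDS = 30 * 60  # the thirty-minute shelf life from the example

def token_code(serial: bytes, now: float) -> str:
    """Derive a six-digit code from the token's serial number plus the
    current time rounded down to the nearest window boundary, so 2:35
    and 2:37 both map back to the 2:30 mark."""
    bucket = int(now // WINDOW_SECONDS)
    digest = hmac.new(serial, str(bucket).encode(), hashlib.sha256).digest()
    return f"{int.from_bytes(digest[:4], 'big') % 1_000_000:06d}"

# Token and server agree as long as they share the registered serial
# number and their clocks fall within the same window.
serial = b"registered-token-12345"        # hypothetical serial number
shown = token_code(serial, time.time())    # what the token displays
checked = token_code(serial, time.time())  # what the server recomputes
print(shown, checked, shown == checked)
```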
Speaker 1: If it doesn't create the same result, it means that you've somehow gone past that time limit and you're going to have to generate a new code and enter it again, or something has gone wrong, or you're just trying to access a system that you don't actually have a token for, which would be kind of foolish, because you'd have to be incredibly lucky to just magically type in the right string of numbers to get access.

Speaker 1: Another great area to explore is biometrics. I love this field, because, when implemented properly, biometrics are pretty difficult to replicate. That all has to do with our physical attributes, right? It's tough for bad guys to get into a system that's based on our physical traits. We did an episode called Biometrics Digital Fingerprinting back in twenty fourteen, but let me give you a quick rundown of the history of biometrics. First of all, fingerprints have long been used as a means of identification, actually centuries before the practice was officially adopted by law enforcement. In ancient business transactions, merchants and customers would sometimes use fingerprint marks in clay tablets as a kind of signature. It would identify the person who had purchased a good from someone else. It wouldn't be until the late eighteen hundreds that law enforcement jumped on the fingerprint bandwagon, once the establishment accepted the fact that no two sets of fingerprints are alike, which is something ancient people had known forever, but which hadn't been accepted as a scientific fact for a very long time. A couple of people named Azizul Haque and Edward Henry created a system for indexing and classifying fingerprints for the purposes of criminal investigation. Now, they based that partly on a classification system that was developed by another man, Sir Francis Galton, but that system was more for academic purposes, right, to describe fingerprints, whereas Henry wanted a system that could be used in investigations: legal investigations, criminal investigations.
Mark 570 00:34:37,680 --> 00:34:40,000 Speaker 1: Twain actually wrote a story in the eighteen nineties in 571 00:34:40,040 --> 00:34:43,080 Speaker 1: which a character put on trial asks that his fingerprints 572 00:34:43,360 --> 00:34:45,319 Speaker 1: be compared to some left at the scene of a 573 00:34:45,360 --> 00:34:49,560 Speaker 1: crime in order to prove his innocence. In nineteen sixty three, 574 00:34:49,719 --> 00:34:54,640 Speaker 1: the Hughes Research Laboratory published a research paper about fingerprint automation. 575 00:34:55,600 --> 00:34:59,719 Speaker 1: The lab, which is today known as HRL Laboratories, which 576 00:34:59,760 --> 00:35:02,520 Speaker 1: I guess makes it another repetitive term, because I'm assuming 577 00:35:02,680 --> 00:35:05,960 Speaker 1: HRL already stands for Hughes Research Laboratory, so the new 578 00:35:06,040 --> 00:35:09,839 Speaker 1: name could be interpreted as Hughes Research Laboratory Laboratory. So 579 00:35:10,080 --> 00:35:14,280 Speaker 1: stop bugging me about PIN numbers, is what I'm saying. Anyway, 580 00:35:14,480 --> 00:35:16,799 Speaker 1: it used to be the research and development division of 581 00:35:16,920 --> 00:35:21,200 Speaker 1: Hughes Aircraft. Today it's owned by Boeing and General Motors. 582 00:35:21,200 --> 00:35:23,880 Speaker 1: But back in the nineteen sixties, the lab published a 583 00:35:23,960 --> 00:35:29,000 Speaker 1: paper about automated fingerprint identification. It kind of acts as 584 00:35:29,000 --> 00:35:33,560 Speaker 1: the foundation for fingerprint scanning today. It's basically automating a 585 00:35:33,600 --> 00:35:36,239 Speaker 1: process that had been performed manually, which is where you 586 00:35:36,280 --> 00:35:40,400 Speaker 1: take two sets of fingerprints. You have your reference set 587 00:35:40,719 --> 00:35:43,719 Speaker 1: and you have your submitted set, and you want to 588 00:35:43,760 --> 00:35:47,719 Speaker 1: compare the two and look for points of similarity. And 589 00:35:47,800 --> 00:35:50,200 Speaker 1: if you have enough points of similarity, the likelihood of 590 00:35:50,239 --> 00:35:53,719 Speaker 1: the fingerprints belonging to someone else drops to near zero. 591 00:35:54,080 --> 00:35:56,360 Speaker 1: The chance that someone else who happens to have very similar 592 00:35:56,400 --> 00:36:00,680 Speaker 1: fingerprints to the person in question, the reference, also happened to 593 00:36:00,719 --> 00:36:03,160 Speaker 1: be in the same geographic region around the same time 594 00:36:03,400 --> 00:36:06,759 Speaker 1: becomes increasingly unlikely as the points 595 00:36:06,840 --> 00:36:12,760 Speaker 1: of similarity add up. So while researchers worked on creating automated 596 00:36:12,760 --> 00:36:16,480 Speaker 1: systems for fingerprint identification, others were working on similar systems 597 00:36:16,520 --> 00:36:21,720 Speaker 1: for facial recognition and voice identification strategies. Essentially, any aspect 598 00:36:21,760 --> 00:36:24,640 Speaker 1: of a person that would be intrinsically unique to him 599 00:36:24,800 --> 00:36:28,400 Speaker 1: or her was considered an interesting value to quantify and 600 00:36:28,520 --> 00:36:33,839 Speaker 1: classify, for good or for ill. In nineteen seventy four, 601 00:36:33,960 --> 00:36:38,359 Speaker 1: the first commercial hand geometry systems launched.
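[Editor's note: a minimal Python sketch of the points-of-similarity comparison described a moment ago, reduced to counting minutiae coordinates that line up within a tolerance. The (x, y) point format, the tolerance, and the twelve-point threshold are illustrative assumptions, not the Haque-Henry classification or the Hughes method.]

    from math import dist

    def similarity_points(reference, submitted, tolerance=4.0):
        # Each argument is a list of (x, y) minutiae coordinates; count
        # reference points that have a submitted point close enough to match.
        matches = 0
        unused = list(submitted)
        for point in reference:
            for candidate in unused:
                if dist(point, candidate) <= tolerance:
                    matches += 1
                    unused.remove(candidate)  # each point can match only once
                    break
        return matches

    def is_match(reference, submitted, required_points=12):
        # The more points of similarity, the less likely a coincidence;
        # some agencies historically required on the order of twelve points.
        return similarity_points(reference, submitted) >= required_points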
Dylan, you ever 602 00:36:38,400 --> 00:36:40,920 Speaker 1: have to use a hand geometry system where it measures 603 00:36:40,920 --> 00:36:44,160 Speaker 1: your hand? Dylan's shaking his head no. I did. It 604 00:36:44,200 --> 00:36:46,759 Speaker 1: was a regular part of the University of Georgia when 605 00:36:46,800 --> 00:36:50,480 Speaker 1: I was there. So this is a scanner that looks 606 00:36:50,520 --> 00:36:53,680 Speaker 1: at the hand, the shape of a person's hand, and 607 00:36:53,760 --> 00:36:56,880 Speaker 1: compares it to a database, and it authenticates the person 608 00:36:56,920 --> 00:36:59,759 Speaker 1: based on hand geometry. So you have to set up 609 00:37:00,040 --> 00:37:03,080 Speaker 1: your profile, right, you scan your hand for the first time, 610 00:37:03,719 --> 00:37:06,960 Speaker 1: and it associates your hand geometry with you the person. 611 00:37:07,480 --> 00:37:10,719 Speaker 1: Every time you scan your hand later on, it goes 612 00:37:10,760 --> 00:37:13,319 Speaker 1: and references that database and says, hey, does this match 613 00:37:13,360 --> 00:37:15,840 Speaker 1: the hand that we measured that first time? And 614 00:37:15,840 --> 00:37:18,400 Speaker 1: if the answer is yes, it authenticates you. So my 615 00:37:18,520 --> 00:37:21,359 Speaker 1: university's food hall had one of these. If you wanted 616 00:37:21,360 --> 00:37:25,760 Speaker 1: to eat, you had to stick your hand in the machine. 617 00:37:26,040 --> 00:37:30,360 Speaker 1: It got a little bit sort of Flash Gordon-esque. 618 00:37:30,640 --> 00:37:32,320 Speaker 1: You know, you sit there wondering if you're going to 619 00:37:32,360 --> 00:37:34,760 Speaker 1: get your hand back after you put your hand in there. 620 00:37:34,880 --> 00:37:36,719 Speaker 1: But I mean, if you wanted tater tots, you just 621 00:37:36,840 --> 00:37:39,359 Speaker 1: had to do it, or in my case, chili cheese fries, 622 00:37:39,400 --> 00:37:43,520 Speaker 1: which I ate way too frequently. I digress. In nineteen 623 00:37:43,560 --> 00:37:47,560 Speaker 1: seventy five, partially funded by the FBI, researchers began to 624 00:37:47,600 --> 00:37:52,080 Speaker 1: develop fingerprint scanners. Now, the first of those used capacitive detection, 625 00:37:52,480 --> 00:37:56,200 Speaker 1: which wasn't terribly precise in the nineteen seventies. Most smartphones 626 00:37:56,239 --> 00:38:01,040 Speaker 1: these days actually use this approach in their capacitive touch screens. Essentially, 627 00:38:01,080 --> 00:38:03,960 Speaker 1: touching the screen alters an electric field on the phone, 628 00:38:04,400 --> 00:38:08,200 Speaker 1: because we conduct electricity. It's a very weak electric field, 629 00:38:08,440 --> 00:38:11,920 Speaker 1: but we conduct electricity. Touching a device that has an 630 00:38:11,920 --> 00:38:15,800 Speaker 1: electric field running across the surface disrupts that electric field, 631 00:38:16,280 --> 00:38:19,080 Speaker 1: and it actually allows a device to detect the presence 632 00:38:19,080 --> 00:38:22,120 Speaker 1: and orientation of a touch, so it knows the X 633 00:38:22,200 --> 00:38:24,840 Speaker 1: and Y axis of where you are touching on a screen. 634 00:38:24,920 --> 00:38:27,919 Speaker 1: That's why if you wear non-capacitive gloves while trying 635 00:38:27,920 --> 00:38:31,719 Speaker 1: to work an iPhone, nothing happens, because it cannot hold 636 00:38:31,719 --> 00:38:36,040 Speaker 1: that capacitance.
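[Editor's note: a toy Python sketch of how a capacitive grid could report the X and Y of a touch, per the description above. The grid-of-readings model and the threshold are illustrative assumptions; real touch controllers are considerably more sophisticated.]

    def locate_touch(readings, baseline, threshold=0.2):
        # readings and baseline are 2D lists of capacitance values; a finger
        # (or anything conductive) pulls a reading away from its baseline.
        best = None
        for y, row in enumerate(readings):
            for x, value in enumerate(row):
                drop = baseline[y][x] - value
                if drop > threshold and (best is None or drop > best[0]):
                    best = (drop, x, y)
        # A gloved, non-conductive touch never crosses the threshold,
        # so the screen reports no touch at all.
        return (best[1], best[2]) if best else None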
So the screen isn't a resistive touch screen. 637 00:38:36,120 --> 00:38:39,560 Speaker 1: It can't detect a touch unless that capacitance is there, 638 00:38:40,920 --> 00:38:45,759 Speaker 1: or capacitive aspect is there, rather, not capacitance. Sorry about 639 00:38:45,760 --> 00:38:50,320 Speaker 1: that, I misspoke. Well, speaking of the iPhone, the Touch ID 640 00:38:50,640 --> 00:38:53,400 Speaker 1: on the iPhone five S and later models actually uses 641 00:38:53,480 --> 00:38:57,840 Speaker 1: capacitive touch to authenticate a fingerprint, just like this system 642 00:38:57,880 --> 00:39:01,320 Speaker 1: did in nineteen seventy five, except these days it's way 643 00:39:01,360 --> 00:39:04,080 Speaker 1: more precise than the tech was capable of back in 644 00:39:04,120 --> 00:39:07,799 Speaker 1: the seventies, so it's much less likely to give 645 00:39:08,000 --> 00:39:11,160 Speaker 1: either a false positive or to deny someone access to 646 00:39:11,200 --> 00:39:14,640 Speaker 1: their phone. It may require you to scan a second 647 00:39:14,640 --> 00:39:17,080 Speaker 1: time if you didn't get a good representation of your 648 00:39:17,080 --> 00:39:19,000 Speaker 1: fingerprint when you were trying to unlock the phone, but 649 00:39:19,000 --> 00:39:21,960 Speaker 1: it's not likely to deny you because it cannot identify 650 00:39:22,000 --> 00:39:28,240 Speaker 1: your fingerprint. Now, in nineteen eighty five, two doctors, Aran 651 00:39:28,320 --> 00:39:33,160 Speaker 1: Safir and Leonard Flom, proposed that irides could be unique 652 00:39:33,160 --> 00:39:35,160 Speaker 1: to a person. And you might say, well, what are 653 00:39:35,280 --> 00:39:38,520 Speaker 1: irides? Well, irides is the plural of iris, so 654 00:39:38,760 --> 00:39:41,640 Speaker 1: we're talking about the pigmented membrane surrounding the pupil in 655 00:39:41,680 --> 00:39:45,800 Speaker 1: your eye. By nineteen eighty seven, these two ophthalmologists received 656 00:39:45,800 --> 00:39:49,160 Speaker 1: a patent for their approach to use irides for authentication 657 00:39:49,280 --> 00:39:53,480 Speaker 1: and identification purposes. By nineteen ninety five, the first iris 658 00:39:53,520 --> 00:39:57,960 Speaker 1: identification security systems became part of the Defense Nuclear Agency. 659 00:39:58,600 --> 00:40:01,640 Speaker 1: So all those spy movies where you see someone leaning 660 00:40:01,640 --> 00:40:04,040 Speaker 1: forward and getting their eye scanned, that's a real thing. 661 00:40:04,760 --> 00:40:08,040 Speaker 1: Our irises, or irides I should say, are unique 662 00:40:08,120 --> 00:40:11,480 Speaker 1: to us, and so that is a pretty tricky thing 663 00:40:11,680 --> 00:40:15,480 Speaker 1: to replicate. You probably have seen at least one or 664 00:40:15,480 --> 00:40:19,040 Speaker 1: two movies where someone got hold of somebody's eyeball and 665 00:40:19,080 --> 00:40:22,160 Speaker 1: got access that way, or knocked a person out then 666 00:40:22,360 --> 00:40:24,719 Speaker 1: forced their eye open and held their head up to 667 00:40:24,760 --> 00:40:28,719 Speaker 1: the scanner. But in general, it's not easy to replicate without 668 00:40:28,760 --> 00:40:33,720 Speaker 1: access to somebody who already is authorized to enter that area. 669 00:40:35,040 --> 00:40:38,040 Speaker 1: Over the next several years, advances in biometrics opened up 670 00:40:38,239 --> 00:40:42,200 Speaker 1: new opportunities, not just for authentication or security.
So facial 671 00:40:42,239 --> 00:40:45,360 Speaker 1: recognition is a great example. It's been incorporated into dozens 672 00:40:45,360 --> 00:40:49,040 Speaker 1: of technologies, probably most notably into our cameras, including the 673 00:40:49,040 --> 00:40:52,600 Speaker 1: cameras in our smartphones. And sometimes it's a simple implementation 674 00:40:52,920 --> 00:40:55,400 Speaker 1: which just detects a face in order to focus properly 675 00:40:55,440 --> 00:40:58,640 Speaker 1: on a subject. Sometimes it's more complicated, so it might 676 00:40:58,800 --> 00:41:03,440 Speaker 1: allow for automatic tagging of images because it can recognize 677 00:41:03,440 --> 00:41:06,080 Speaker 1: people based on their facial features. You've probably had some 678 00:41:06,160 --> 00:41:10,799 Speaker 1: experience with this in some capacity. Organizations also began to 679 00:41:10,880 --> 00:41:14,680 Speaker 1: form around this time to create standards for biometric implementations. 680 00:41:15,520 --> 00:41:18,280 Speaker 1: This would reduce the chance of competing technologies with varying 681 00:41:18,320 --> 00:41:21,680 Speaker 1: degrees of efficiency and accuracy from interfering with each other, 682 00:41:22,360 --> 00:41:24,919 Speaker 1: and by two thousand and three, the US government began 683 00:41:24,960 --> 00:41:30,719 Speaker 1: to formally coordinate biometric implementations. Meanwhile, the International Civil Aviation 684 00:41:30,960 --> 00:41:34,879 Speaker 1: Organization created a global standard to incorporate biometric data into 685 00:41:34,920 --> 00:41:39,160 Speaker 1: travel documentation like passports, and ten years later, in twenty thirteen, 686 00:41:39,280 --> 00:41:43,080 Speaker 1: you could find biometric solutions built directly into personal electronics 687 00:41:43,440 --> 00:41:46,560 Speaker 1: like laptops and smartphones. In fact, I had a fingerprint 688 00:41:46,600 --> 00:41:50,160 Speaker 1: scanner from before twenty thirteen where you would 689 00:41:50,360 --> 00:41:52,640 Speaker 1: actually have to slide your finger, kind of like on a 690 00:41:52,640 --> 00:41:56,480 Speaker 1: copier, against the little panel, and if your fingerprint matched, 691 00:41:56,520 --> 00:41:59,640 Speaker 1: it would unlock your computer for you. I actually had 692 00:41:59,640 --> 00:42:04,359 Speaker 1: that one here at HowStuffWorks. I miss it sometimes. Well, 693 00:42:04,360 --> 00:42:06,399 Speaker 1: I've got a lot more to say, but first let's 694 00:42:06,440 --> 00:42:19,439 Speaker 1: take another quick break to thank our sponsor. All right, 695 00:42:20,160 --> 00:42:25,280 Speaker 1: things like fingerprint scanners are not foolproof. It is possible, 696 00:42:25,360 --> 00:42:29,240 Speaker 1: although challenging, to lift a person's fingerprint from something they've handled, 697 00:42:29,360 --> 00:42:32,160 Speaker 1: scan it, and replicate it. There are a couple of different ways 698 00:42:32,200 --> 00:42:34,920 Speaker 1: to do this. Some of them require access to some 699 00:42:35,000 --> 00:42:37,760 Speaker 1: equipment and materials most of us don't have in our homes, 700 00:42:37,760 --> 00:42:40,080 Speaker 1: so it's not like it's practical for the average person.
701 00:42:40,360 --> 00:42:43,600 Speaker 1: But the point is, with the right determination and the 702 00:42:43,680 --> 00:42:47,719 Speaker 1: right know-how, and specifically the right materials, you can 703 00:42:47,760 --> 00:42:51,080 Speaker 1: create a fake fingerprint. And you might use something like 704 00:42:51,160 --> 00:42:55,200 Speaker 1: latex or even wood glue, and you could lift a 705 00:42:55,239 --> 00:42:59,600 Speaker 1: fingerprint and use it to fool certain authentication systems. If 706 00:42:59,600 --> 00:43:03,239 Speaker 1: the system is just looking for a particular pattern on 707 00:43:03,320 --> 00:43:06,120 Speaker 1: a fingerprint, the copy could be good enough to fool 708 00:43:06,160 --> 00:43:09,360 Speaker 1: the system, particularly if you can overlay the copy on 709 00:43:09,440 --> 00:43:14,400 Speaker 1: top of your own finger. This would provide the capacitive connection. So, 710 00:43:14,440 --> 00:43:16,920 Speaker 1: in other words, let's say I've got a latex fingerprint 711 00:43:17,200 --> 00:43:19,719 Speaker 1: and I need to access a phone. Well, if I 712 00:43:19,840 --> 00:43:24,120 Speaker 1: just lay the latex down against the capacitive screen, it's 713 00:43:24,160 --> 00:43:27,160 Speaker 1: not really gonna affect anything. If I put actual 714 00:43:27,880 --> 00:43:31,319 Speaker 1: living tissue behind it, that's a different story. So how 715 00:43:31,360 --> 00:43:34,640 Speaker 1: do you defeat that sort of security vulnerability? Well, I 716 00:43:34,680 --> 00:43:37,400 Speaker 1: had the opportunity to speak with Doctor Pi, who's the 717 00:43:37,480 --> 00:43:40,840 Speaker 1: chief technology officer of Goodix, to talk about a fingerprint 718 00:43:40,920 --> 00:43:45,680 Speaker 1: scanner with an additional measure of security to counteract those 719 00:43:45,680 --> 00:43:49,719 Speaker 1: sorts of spoofing attempts. Here's what we talked about. Doctor Pi, 720 00:43:49,920 --> 00:43:56,280 Speaker 1: let's start off by talking about how biometrics are transforming 721 00:43:56,920 --> 00:44:02,399 Speaker 1: security in the technology field, specifically for things like consumer tech, 722 00:44:02,480 --> 00:44:06,440 Speaker 1: because my listeners are very interested in that, the concept 723 00:44:06,440 --> 00:44:11,520 Speaker 1: of using biometrics to access various devices. I think probably 724 00:44:11,600 --> 00:44:14,880 Speaker 1: the example most of them would be familiar with 725 00:44:14,880 --> 00:44:18,279 Speaker 1: would be smartphones. Can you talk a little bit 726 00:44:18,280 --> 00:44:21,240 Speaker 1: about how that has developed over the last few years, 727 00:44:21,480 --> 00:44:26,600 Speaker 1: and why it is such a compelling component for security? 728 00:44:27,480 --> 00:44:30,960 Speaker 2: Well, I think one of the stories I actually heard, 729 00:44:31,360 --> 00:44:34,239 Speaker 2: which is a part of my experience too, is 730 00:44:34,480 --> 00:44:37,920 Speaker 2: that since more and 731 00:44:37,960 --> 00:44:42,919 Speaker 2: more phones have a fingerprint sensor, more and more 732 00:44:42,920 --> 00:44:47,600 Speaker 2: people are using it. One guy, an agent, said he totally 733 00:44:47,600 --> 00:44:51,160 Speaker 2: forgot the passcode. Now he's using fingerprints on the 734 00:44:51,200 --> 00:44:54,000 Speaker 2: phone all the time. And on my phone,
I 735 00:44:54,040 --> 00:44:57,400 Speaker 2: don't use it often either; I forgot the passcode as well. 736 00:44:57,920 --> 00:45:01,840 Speaker 2: So it is a kind of change. The consumer behavior 737 00:45:02,000 --> 00:45:07,360 Speaker 2: has changed so much. Yeah, obviously everyone used to 738 00:45:07,480 --> 00:45:12,759 Speaker 2: have a passcode, and nowadays they still do, but 739 00:45:12,880 --> 00:45:16,600 Speaker 2: they don't use it anymore. The fingerprint 740 00:45:17,400 --> 00:45:24,239 Speaker 2: has certainly taken over the majority of the authentication. And 741 00:45:24,280 --> 00:45:28,080 Speaker 2: then the other thing is, in the case of 742 00:45:28,239 --> 00:45:31,440 Speaker 2: the China market, there's a lot of mobile payment now. 743 00:45:32,640 --> 00:45:37,200 Speaker 2: If you were in China, you could literally live without 744 00:45:37,680 --> 00:45:40,760 Speaker 2: a credit card. You can live without cash. 745 00:45:41,480 --> 00:45:45,640 Speaker 2: In China you can live without the credit card and the 746 00:45:45,680 --> 00:45:50,600 Speaker 2: cash. You can use your phone and mobile payment to literally 747 00:45:50,680 --> 00:45:55,960 Speaker 2: do everything, from the convenience store to buying tickets to hotel 748 00:45:56,040 --> 00:46:01,480 Speaker 2: payments, everything. But all of that is obviously 749 00:46:02,320 --> 00:46:06,640 Speaker 2: going through authentication. 750 00:46:06,920 --> 00:46:11,560 Speaker 1: Right, and so the authentication part is obviously really important. 751 00:46:11,560 --> 00:46:14,799 Speaker 1: You want to make certain that the person who is 752 00:46:14,960 --> 00:46:18,640 Speaker 1: utilizing a device, particularly one that can be used as 753 00:46:19,120 --> 00:46:22,399 Speaker 1: a means of commerce, a means of purchase, you want 754 00:46:22,440 --> 00:46:26,480 Speaker 1: to make sure that the identity of the person holding 755 00:46:26,480 --> 00:46:29,279 Speaker 1: the phone is in fact the person authorized to use 756 00:46:29,320 --> 00:46:31,960 Speaker 1: that device for that purpose. And that kind of comes 757 00:46:32,000 --> 00:46:37,080 Speaker 1: in with the sensors that you've been working on in 758 00:46:37,120 --> 00:46:41,160 Speaker 1: the recent past, where it's not just looking for the 759 00:46:41,200 --> 00:46:44,719 Speaker 1: pattern of a fingerprint, which, as some people have pointed out, 760 00:46:45,000 --> 00:46:49,680 Speaker 1: is something that is possible to spoof. If you 761 00:46:49,920 --> 00:46:52,759 Speaker 1: have the right scanners and you have the 762 00:46:52,840 --> 00:46:55,680 Speaker 1: right, you know, even 3D printer technology, you could 763 00:46:55,760 --> 00:47:01,440 Speaker 1: potentially create a fake fingerprint and fool sensors that 764 00:47:01,600 --> 00:47:07,759 Speaker 1: are only capable of detecting the fingerprint layout. You are 765 00:47:07,840 --> 00:47:10,520 Speaker 1: working on technology that goes a step further than that. 766 00:47:10,600 --> 00:47:12,080 Speaker 1: Can you talk about that a little bit? 767 00:47:14,040 --> 00:47:18,920 Speaker 2: Yes, this is one new technology we recently released 768 00:47:18,920 --> 00:47:23,640 Speaker 2: to the market.
At the same time that 769 00:47:23,680 --> 00:47:28,400 Speaker 2: you scan, recording or reading the fingerprint pattern, 770 00:47:28,960 --> 00:47:35,279 Speaker 2: you're also detecting the dynamic blood flow in your fingertip. 771 00:47:35,960 --> 00:47:40,680 Speaker 2: So that enables the sensor to tell whether this fingerprint pattern is 772 00:47:40,800 --> 00:47:48,200 Speaker 2: from a live person versus a mock-up spoof. So that 773 00:47:48,760 --> 00:47:54,640 Speaker 2: further enhances the security level of the fingerprint authentication, 774 00:47:54,880 --> 00:47:58,200 Speaker 2: right, because most of the spoof methods we know 775 00:47:59,080 --> 00:48:04,359 Speaker 2: are obviously not from a live object. So this 776 00:48:04,800 --> 00:48:11,680 Speaker 2: basically takes the security level one level up. 777 00:48:12,000 --> 00:48:15,600 Speaker 2: So I think it will block most, if not all, 778 00:48:15,719 --> 00:48:17,479 Speaker 2: the potential spoof methods. 779 00:48:18,080 --> 00:48:20,719 Speaker 1: Right. So, people who 780 00:48:20,719 --> 00:48:24,600 Speaker 1: would normally rely on something like a fake fingerprint made 781 00:48:24,680 --> 00:48:28,359 Speaker 1: from, say, silicone or rubber, that wouldn't work on this 782 00:48:28,480 --> 00:48:31,680 Speaker 1: particular type of device, or this particular sensor, I should 783 00:48:31,719 --> 00:48:35,560 Speaker 1: say, that will be incorporated into other devices, whether it's 784 00:48:35,600 --> 00:48:39,120 Speaker 1: a phone or a secure entry point or whatever it 785 00:48:39,160 --> 00:48:42,680 Speaker 1: may be, because it will lack that blood flow. And 786 00:48:42,719 --> 00:48:46,440 Speaker 1: without the blood flow, the device quote unquote knows it 787 00:48:46,560 --> 00:48:49,880 Speaker 1: is not a valid authentication. Am I getting that correct? 788 00:48:50,560 --> 00:48:50,960 Speaker 2: Correct. 789 00:48:50,960 --> 00:48:54,759 Speaker 1: Correct. You're absolutely correct. Wonderful. So let's talk a little 790 00:48:54,760 --> 00:48:58,160 Speaker 1: bit about how this sensor actually does detect 791 00:48:58,239 --> 00:49:01,359 Speaker 1: that blood flow. What are you using in order 792 00:49:01,480 --> 00:49:05,600 Speaker 1: for the technology to quote unquote know that blood 793 00:49:05,640 --> 00:49:08,120 Speaker 1: is flowing behind that fingerprint? 794 00:49:10,000 --> 00:49:15,480 Speaker 2: Yeah, so with this technology, 795 00:49:15,680 --> 00:49:20,840 Speaker 2: we're integrating the optical sensor in the same area as the 796 00:49:21,440 --> 00:49:26,719 Speaker 2: fingerprint sensor, and we also put in a 797 00:49:26,760 --> 00:49:33,320 Speaker 2: small LED emitter, emitting an infrared light through the sensor's 798 00:49:33,480 --> 00:49:38,759 Speaker 2: glass cover, so that sends the light into your fingertip, 799 00:49:39,560 --> 00:49:43,279 Speaker 2: and then the optical sensor detects the scattered light from 800 00:49:43,280 --> 00:49:47,600 Speaker 2: your fingertip. The blood flow itself will change 801 00:49:47,719 --> 00:49:50,960 Speaker 2: the intensity of the scattered light. So this is a very 802 00:49:51,000 --> 00:49:55,640 Speaker 2: common technique, like the oximeter in the hospital 803 00:49:56,400 --> 00:49:58,520 Speaker 2: we use all the time.
You know, if you're in 804 00:49:58,560 --> 00:50:02,000 Speaker 2: a hospital bed, they just put it on your fingertip. It's the 805 00:50:02,040 --> 00:50:06,279 Speaker 2: same principle, except that in this case we just use 806 00:50:06,320 --> 00:50:10,200 Speaker 2: it to detect the blood flow instead of 807 00:50:10,719 --> 00:50:12,600 Speaker 2: detecting the oxygen level. 808 00:50:13,239 --> 00:50:15,200 Speaker 1: Right. So in some ways you could even argue this 809 00:50:15,239 --> 00:50:18,200 Speaker 1: is a simpler use of a 810 00:50:18,239 --> 00:50:22,080 Speaker 1: technology that's been put to use specifically for those monitoring 811 00:50:22,120 --> 00:50:25,839 Speaker 1: devices in hospitals, where you know you need to have 812 00:50:26,000 --> 00:50:29,520 Speaker 1: more specific information. It's not like your smartphone necessarily is 813 00:50:29,560 --> 00:50:31,600 Speaker 1: going to tell you what the oxygen levels are in 814 00:50:31,640 --> 00:50:35,160 Speaker 1: your blood, although I guess you could technically develop sensors 815 00:50:35,200 --> 00:50:35,840 Speaker 1: that could do that. 816 00:50:38,080 --> 00:50:42,680 Speaker 2: You're right about that. On the other hand, obviously, to quantify everything 817 00:50:42,840 --> 00:50:47,120 Speaker 2: and go one level up, right, you also need a 818 00:50:47,160 --> 00:50:51,759 Speaker 2: longer time, and that's not something the average user 819 00:50:51,800 --> 00:50:56,759 Speaker 2: is willing to wait for. By the way, we do provide 820 00:50:56,800 --> 00:51:02,160 Speaker 2: a simple way to also report the heartbeat, the heart rate, 821 00:51:03,200 --> 00:51:09,120 Speaker 2: with the sensor, so the user could put a fingertip on the sensor 822 00:51:10,520 --> 00:51:13,560 Speaker 2: and it will report a heart rate. This is a kind 823 00:51:13,560 --> 00:51:15,160 Speaker 2: of side benefit of the technology. 824 00:51:15,280 --> 00:51:19,560 Speaker 1: Right, and so one potential application for being able 825 00:51:19,600 --> 00:51:24,040 Speaker 1: to detect heart rate: obviously you have medical applications, 826 00:51:24,040 --> 00:51:28,000 Speaker 1: but you also have applications within the health and fitness sector, 827 00:51:28,160 --> 00:51:32,000 Speaker 1: where people might be using their smartphone while out on, 828 00:51:32,120 --> 00:51:33,840 Speaker 1: say, a jog, and they want to make sure that 829 00:51:33,880 --> 00:51:36,800 Speaker 1: they're keeping their heart rate within a specific target zone. 830 00:51:37,320 --> 00:51:39,359 Speaker 1: That could be something that you would use that sort 831 00:51:39,400 --> 00:51:44,799 Speaker 1: of sensor technology for beyond its authentication capabilities. So it's 832 00:51:44,920 --> 00:51:48,839 Speaker 1: really interesting to me that we're looking at a technology 833 00:51:48,920 --> 00:51:51,279 Speaker 1: that for a long time people thought of as sort 834 00:51:51,320 --> 00:51:53,919 Speaker 1: of science fiction. You know, you would see 835 00:51:53,920 --> 00:51:57,040 Speaker 1: in movies that someone would put their finger down and 836 00:51:57,080 --> 00:51:59,600 Speaker 1: get a scan and that would give them access to stuff. 837 00:51:59,640 --> 00:52:04,560 Speaker 1: And now we're realizing that's convenient because, unless something 838 00:52:04,640 --> 00:52:06,920 Speaker 1: terrible has happened, you always have your finger with you.
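[Editor's note: a minimal Python sketch of the liveness check and heart-rate side benefit Dr. Pi describes: a live fingertip modulates the scattered infrared light with each pulse, while a latex or silicone spoof reflects a nearly constant intensity. The sampling rate, swing threshold, and crossing-count method are illustrative assumptions, not Goodix's actual algorithm.]

    def looks_alive(samples, min_swing_ratio=0.01):
        # samples: scattered-light intensity readings taken during the scan.
        mean = sum(samples) / len(samples)
        swing = max(samples) - min(samples)
        # Require a pulsatile swing of at least ~1% of the mean signal;
        # a static spoof shows almost no variation at all.
        return swing > min_swing_ratio * mean

    def estimate_heart_rate(samples, sample_rate_hz=100):
        # Count upward mean-crossings to approximate beats, then scale
        # to beats per minute -- the "side benefit" from the interview.
        mean = sum(samples) / len(samples)
        beats = sum(1 for a, b in zip(samples, samples[1:]) if a < mean <= b)
        seconds = len(samples) / sample_rate_hz
        return 60.0 * beats / seconds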
839 00:52:07,360 --> 00:52:12,120 Speaker 1: But as we've discussed, it's not foolproof 840 00:52:12,160 --> 00:52:17,359 Speaker 1: unless you have this secondary layer of protection, 841 00:52:17,400 --> 00:52:21,719 Speaker 1: in this case, that detection of blood flow. So what 842 00:52:21,800 --> 00:52:25,120 Speaker 1: sort of devices might we see this incorporated into? I 843 00:52:25,120 --> 00:52:28,359 Speaker 1: mean, again, smartphones are an obvious example. Are there 844 00:52:28,400 --> 00:52:32,400 Speaker 1: others that you either have your eye on 845 00:52:32,680 --> 00:52:34,800 Speaker 1: or you could see as being a potential in the future? 846 00:52:35,200 --> 00:52:40,920 Speaker 2: Yeah, we're now looking beyond the mobile device, 847 00:52:41,000 --> 00:52:45,240 Speaker 2: and then you're looking at, maybe, say, 848 00:52:45,440 --> 00:52:52,359 Speaker 2: a door, for example: you use a code, but at the same time, 849 00:52:53,160 --> 00:53:01,000 Speaker 2: you could have even a fingerprint scanner in the door, right? 850 00:53:01,520 --> 00:53:05,160 Speaker 2: And so not only do you use the code, you also, 851 00:53:05,280 --> 00:53:08,440 Speaker 2: on top of that, can use the fingerprint of 852 00:53:08,520 --> 00:53:12,080 Speaker 2: the owner. So that will add, you know, an extra 853 00:53:12,200 --> 00:53:17,920 Speaker 2: layer of security to your doors. You know, nowadays 854 00:53:17,920 --> 00:53:22,040 Speaker 2: wireless, remote-controlled doors have become 855 00:53:22,360 --> 00:53:27,200 Speaker 2: more and more popular, and you may enable a scanner 856 00:53:27,400 --> 00:53:30,799 Speaker 2: for people to do that. There are a lot of uses, 857 00:53:31,080 --> 00:53:34,279 Speaker 2: like the car, right? It's mostly the 858 00:53:34,360 --> 00:53:37,920 Speaker 2: same way: people who steal your key today can just drive 859 00:53:37,960 --> 00:53:42,000 Speaker 2: away with your car. But if you have a fingerprint 860 00:53:42,000 --> 00:53:45,040 Speaker 2: scanner in the car or on the key, that will 861 00:53:45,600 --> 00:53:49,799 Speaker 2: obviously protect your car better. You can 862 00:53:49,840 --> 00:53:52,680 Speaker 2: lose your key, but people still can't drive 863 00:53:52,719 --> 00:53:56,560 Speaker 2: away with your car. Right. So that's a terrific way 864 00:53:56,600 --> 00:53:59,839 Speaker 2: of using it. The one benefit of the mobile business 865 00:54:01,840 --> 00:54:05,160 Speaker 2: is really driving the cost and the size and the 866 00:54:05,280 --> 00:54:11,120 Speaker 2: power way down. Imagine, it's billions of devices shipping every year, 867 00:54:11,280 --> 00:54:16,000 Speaker 2: so the scale of the economy makes the cost come down 868 00:54:16,080 --> 00:54:19,759 Speaker 2: so much, and you enable all those other applications. 869 00:54:20,440 --> 00:54:23,160 Speaker 1: Yeah, you hit upon something really interesting there, because we've 870 00:54:23,200 --> 00:54:27,840 Speaker 1: seen the smartphone and cell phone technologies 871 00:54:28,320 --> 00:54:31,839 Speaker 1: drive a lot of development in what you might think 872 00:54:31,960 --> 00:54:36,880 Speaker 1: initially are unrelated technologies, simply because, as you say, the 873 00:54:36,960 --> 00:54:41,920 Speaker 1: economies of scale provide this economic imperative. It's not 874 00:54:41,960 --> 00:54:47,560 Speaker 1: even an incentive.
It's an imperative to develop smaller, 875 00:54:47,880 --> 00:54:53,480 Speaker 1: more efficient, more economical sensors and other technologies. So, for example, 876 00:54:54,000 --> 00:54:57,560 Speaker 1: beyond this fingerprint sensing technology that could be used in 877 00:54:57,680 --> 00:55:01,560 Speaker 1: multiple applications, a lot of the development we've seen in 878 00:55:01,719 --> 00:55:07,200 Speaker 1: the virtual reality space, in just gaming in general, and in 879 00:55:07,280 --> 00:55:11,480 Speaker 1: a lot of technologies, the reason why it's possible is 880 00:55:11,520 --> 00:55:16,360 Speaker 1: because the smartphone has acted as a platform that people 881 00:55:16,400 --> 00:55:19,960 Speaker 1: have been developing for years to increase the number 882 00:55:19,960 --> 00:55:24,439 Speaker 1: of features, increase its security, increase its applicability for lots 883 00:55:24,480 --> 00:55:29,279 Speaker 1: of different possible uses, and we end up seeing that 884 00:55:29,520 --> 00:55:35,000 Speaker 1: spill over into seemingly unrelated uses. And I think that's 885 00:55:35,000 --> 00:55:40,520 Speaker 1: a great story in general, just that it illustrates that 886 00:55:40,960 --> 00:55:45,279 Speaker 1: work on one particular platform pays off in ways that you 887 00:55:45,320 --> 00:55:50,160 Speaker 1: can't necessarily anticipate from the beginning. And certainly when it 888 00:55:50,200 --> 00:55:53,840 Speaker 1: comes to things like authentication and security, you want to 889 00:55:53,840 --> 00:55:58,640 Speaker 1: see those benefits being applied to a broader spectrum of 890 00:55:58,840 --> 00:56:02,080 Speaker 1: uses, because we're getting to a world, in fact, 891 00:56:02,080 --> 00:56:04,759 Speaker 1: we're already there. We're in a world where more and 892 00:56:04,800 --> 00:56:09,640 Speaker 1: more of our devices are interconnected in ways where, if 893 00:56:09,719 --> 00:56:13,080 Speaker 1: you are able to get unauthorized access to them, you 894 00:56:13,120 --> 00:56:16,520 Speaker 1: could potentially cause a great deal of mischief and harm. 895 00:56:17,560 --> 00:56:20,279 Speaker 1: So where do you see the future going? If you 896 00:56:20,320 --> 00:56:23,759 Speaker 1: had to put on your prognosticator hat, what do you 897 00:56:23,760 --> 00:56:27,920 Speaker 1: think the next big step in authentication is going to be? 898 00:56:28,920 --> 00:56:32,480 Speaker 2: Well, it is already happening: the iris scan on the 899 00:56:32,480 --> 00:56:39,960 Speaker 2: phone, right, that is also implemented, and I think it 900 00:56:40,000 --> 00:56:46,120 Speaker 2: will become more popular. And the next level 901 00:56:46,120 --> 00:56:49,280 Speaker 2: people are already talking about is a fingerprint scanner that will get 902 00:56:49,280 --> 00:56:54,960 Speaker 2: integrated into the display area. I think the rumor is 903 00:56:55,000 --> 00:56:59,520 Speaker 2: the new iPhone may have this function. And then I think, 904 00:56:59,680 --> 00:57:03,200 Speaker 2: going beyond that, you're going to see more and more maybe 905 00:57:03,440 --> 00:57:08,200 Speaker 2: medical-related uses, because the mobile device is so powerful and 906 00:57:08,719 --> 00:57:11,920 Speaker 2: with us all the time. You can really use that 907 00:57:13,120 --> 00:57:19,400 Speaker 2: platform for monitoring your health, right, because it's with you 908 00:57:19,520 --> 00:57:21,960 Speaker 2: all the time.
So we see a lot of those 909 00:57:24,760 --> 00:57:30,760 Speaker 2: things happening, and so I think 910 00:57:32,240 --> 00:57:34,960 Speaker 2: that's the kind of thing, in the next few years, we're going to 911 00:57:35,040 --> 00:57:36,960 Speaker 2: see more and more of those things coming. 912 00:57:36,720 --> 00:57:41,640 Speaker 1: Very interesting. Well, sir, thank you so much for 913 00:57:41,840 --> 00:57:45,440 Speaker 1: joining our show and answering my questions. This has been 914 00:57:45,640 --> 00:57:49,560 Speaker 1: a fascinating conversation, and I know that my listeners are 915 00:57:49,600 --> 00:57:55,080 Speaker 1: always really interested to learn not just about how technology works, 916 00:57:55,120 --> 00:57:57,960 Speaker 1: but why those applications are so important. I think 917 00:57:58,000 --> 00:58:00,440 Speaker 1: you've done a great job of doing that. 918 00:58:00,480 --> 00:58:03,400 Speaker 1: So thank you very much for joining me today. 919 00:58:03,520 --> 00:58:03,840 Speaker 2: My pleasure. Thank you. 920 00:58:04,760 --> 00:58:07,240 Speaker 1: As for the future, what if you could authenticate your 921 00:58:07,240 --> 00:58:12,160 Speaker 1: identity just through thinking? Researchers over at Binghamton University developed 922 00:58:12,160 --> 00:58:14,680 Speaker 1: a process in which they could identify, or at least 923 00:58:14,680 --> 00:58:17,560 Speaker 1: they claim they can identify, a person based on their 924 00:58:17,560 --> 00:58:20,720 Speaker 1: brain wave activity alone. So here's what they did. They 925 00:58:20,760 --> 00:58:23,120 Speaker 1: took a sample of fifty people. It's not a big 926 00:58:23,160 --> 00:58:27,080 Speaker 1: sample size, but it's interesting. Fifty people, fitted each person 927 00:58:27,240 --> 00:58:31,760 Speaker 1: with an electroencephalogram, or EEG, headset. Then they showed 928 00:58:31,800 --> 00:58:35,280 Speaker 1: each person a series of five hundred images, and those 929 00:58:35,320 --> 00:58:40,640 Speaker 1: images prompted various emotional and cognitive responses. Now, those responses 930 00:58:41,120 --> 00:58:45,640 Speaker 1: are unique to each individual. So let's say that you 931 00:58:45,680 --> 00:58:47,880 Speaker 1: and I are looking at the same photo, and just 932 00:58:47,920 --> 00:58:51,440 Speaker 1: for argument's sake, it's a picture of my adorable dog, Tibalt, 933 00:58:51,920 --> 00:58:53,680 Speaker 1: and both of us just think he's the cutest little 934 00:58:53,680 --> 00:58:57,080 Speaker 1: dog in the world, because he is. I mean, come on. Well, 935 00:58:57,120 --> 00:59:01,400 Speaker 1: the way your brain manifests that and the way my 936 00:59:01,640 --> 00:59:05,280 Speaker 1: brain manifests that information, even if we both feel the 937 00:59:05,320 --> 00:59:10,520 Speaker 1: same way, is going to be different. So theoretically, once 938 00:59:10,560 --> 00:59:16,200 Speaker 1: you record responses from people, these brain responses to these images, 939 00:59:16,600 --> 00:59:20,080 Speaker 1: and assign each of those responses to the respective identity, 940 00:59:20,560 --> 00:59:23,240 Speaker 1: you can authenticate a person's identity just by showing him 941 00:59:23,320 --> 00:59:26,120 Speaker 1: or her the same series of images and looking for matches.
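[Editor's note: a minimal Python sketch of the matching step described above: compare a fresh EEG response against each enrolled person's recorded response to the same image sequence and accept the best score above a threshold. The Pearson-correlation matcher and the 0.9 threshold are illustrative assumptions; the Binghamton study's actual classifier differs.]

    def correlation(a, b):
        # Pearson correlation between two equal-length response vectors.
        n = len(a)
        mean_a, mean_b = sum(a) / n, sum(b) / n
        cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
        var_a = sum((x - mean_a) ** 2 for x in a) ** 0.5
        var_b = sum((y - mean_b) ** 2 for y in b) ** 0.5
        return cov / (var_a * var_b) if var_a and var_b else 0.0

    def identify(response, enrolled, threshold=0.9):
        # enrolled maps a person's name to their stored response vector,
        # recorded while viewing the same series of images.
        best_name, best_score = None, threshold
        for name, template in enrolled.items():
            score = correlation(response, template)
            if score > best_score:
                best_name, best_score = name, score
        return best_name  # None means no match in the database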
942 00:59:26,880 --> 00:59:29,560 Speaker 1: If there's no match, then the person you're looking at 943 00:59:29,880 --> 00:59:33,040 Speaker 1: isn't who you think they are, and they're likely a 944 00:59:33,080 --> 00:59:37,320 Speaker 1: pod person. Maybe I should add that no one I 945 00:59:37,360 --> 00:59:40,320 Speaker 1: know of is actually talking about using brain waves for 946 00:59:40,400 --> 00:59:45,240 Speaker 1: authentication just yet. The study said that the researchers had 947 00:59:45,240 --> 00:59:48,800 Speaker 1: a one hundred percent success rate identifying subjects based on 948 00:59:48,840 --> 00:59:51,440 Speaker 1: brain waves, and it came out in twenty sixteen. So, 949 00:59:51,480 --> 00:59:54,040 Speaker 1: in other words, they put these fifty people through the 950 00:59:54,080 --> 00:59:58,400 Speaker 1: test of recording all of these responses. Then I assume 951 00:59:58,640 --> 01:00:03,280 Speaker 1: they used a blind test where somebody would end up 952 01:00:03,360 --> 01:00:07,320 Speaker 1: looking at the responses that were coming in from an 953 01:00:07,400 --> 01:00:10,360 Speaker 1: unknown subject and they would be able to match that 954 01:00:10,720 --> 01:00:13,640 Speaker 1: person's responses to one that was already in the database, 955 01:00:13,680 --> 01:00:17,480 Speaker 1: thus saying, oh, that's Jill, because when Jill sees a 956 01:00:17,480 --> 01:00:22,120 Speaker 1: picture of Tibalt, her heart grows three sizes that day. 957 01:00:23,280 --> 01:00:26,760 Speaker 1: We've got to stop showing those pictures. She's having heart 958 01:00:26,800 --> 01:00:31,440 Speaker 1: trouble. It's terrible. Tibalt's just so cute. Anyway, I 959 01:00:31,440 --> 01:00:35,120 Speaker 1: should add that also, if you wanted to use this 960 01:00:35,160 --> 01:00:38,840 Speaker 1: as an authentication strategy, it would be pretty tricky, because 961 01:00:38,960 --> 01:00:41,720 Speaker 1: it requires an EEG headset. It's not exactly the most 962 01:00:41,760 --> 01:00:45,840 Speaker 1: convenient authentication technology around. Now, if we ever develop a 963 01:00:45,920 --> 01:00:50,320 Speaker 1: less cumbersome method for measuring brainwave activity with precision, 964 01:00:50,560 --> 01:00:54,760 Speaker 1: and that's important, that could become an authentication technology of the future. 965 01:00:55,080 --> 01:00:57,840 Speaker 1: It's literally the way you think, and that would be 966 01:00:58,040 --> 01:01:02,400 Speaker 1: much, much more difficult, if not impossible, to replicate, unless 967 01:01:02,440 --> 01:01:04,680 Speaker 1: you had some sort of recording of a person's brain 968 01:01:04,720 --> 01:01:08,040 Speaker 1: waves and you could somehow, you know, push those out 969 01:01:08,320 --> 01:01:11,720 Speaker 1: to cover up your own brainwave activity. I think I 970 01:01:11,760 --> 01:01:15,200 Speaker 1: might have just written a science fiction novel accidentally. I 971 01:01:15,240 --> 01:01:19,120 Speaker 1: hope you all enjoyed that classic episode, Authentication, Tech and 972 01:01:19,400 --> 01:01:23,360 Speaker 1: You, from February twenty second, twenty seventeen, and I hope 973 01:01:23,400 --> 01:01:26,160 Speaker 1: you are all well, and I'll talk to you again 974 01:01:26,720 --> 01:01:36,720 Speaker 1: really soon. Tech Stuff is an iHeartRadio production.
For more 975 01:01:36,800 --> 01:01:41,520 Speaker 1: podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or 976 01:01:41,560 --> 01:01:47,280 Speaker 1: wherever you listen to your favorite shows.