Speaker 1: Get in touch with technology with TechStuff from HowStuffWorks.com. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with How Stuff Works and iHeartRadio, and I love all things tech. And today we're gonna cover a topic that's a listener request from Nikhil Cardale, and I apologize, I am sure I have butchered the pronunciation of your name. That is totally on me. But he sent in a great request for a podcast topic. He linked me to an article on The Verge, which is one of my favorite tech news sites. By the way, The Verge is great if you want to catch up on tech news. But the article has the title "The Web just took a big step toward a password-free future," and it was written by Jon Porter. The article talks about an authentication standard called WebAuthn, that's W-E-B-A-U-T-H-N, which is short for Web Authentication. So in this episode, we're going to explore why we would want to move beyond passwords in the first place. I mean, what's wrong with passwords? We're going to cover that, and what WebAuthn is, and how we might interact with it in the future. So strap in securely, because security is what this is all about. So let's start with passwords. The idea of the password is truly ancient, and I don't just mean it's been around for decades in computer science. It's truly ancient, and it was a pretty common go-to for various societies to protect and authenticate information and messengers. Do you need access to a secure location? Better know the password. Chances are it's swordfish. You want to confirm that the information you are delivering is legit? Well, you've got to share the password. But we all know this stuff, right? It's right up there with the history of codes and cryptograms. I'm not telling you anything you don't know already. So let's flash forward, let's say a few dozen centuries, and then we'll arrive at the Massachusetts Institute of Technology, good old MIT.
Speaker 1: And the year is nineteen sixty. A mathematician and physicist named Dr. Fernando Corbató was working in the university's relatively young Computation Center. Now, back in those days, computers were monstrous things. The practice at the time was to use dumb terminals, meaning you had a keyboard and you had a display, but it didn't have any computation power itself. It was connected to a centralized computer along with several other dumb terminals. So you have multiple terminals all connecting back to one centralized computer system. The computer couldn't truly multitask. You couldn't have all of those dumb terminals communicating simultaneously with that centralized computer. Instead, it would dedicate a certain number of processing cycles to each terminal, and it would rotate through the terminals in turn in a process that was called time sharing. So let's say that you've got ten dumb terminals attached to this central computer. For a certain number of cycles, the computer would say, all right, let me handle the commands from terminal one. Then I'll go to terminal two, terminal three, etcetera, etcetera, until I get back around to terminal one. This would happen pretty fast, so the delay wasn't always necessarily noticeable. It all depended on how many terminals there were and how many processing cycles were dedicated to each terminal. But you get the idea. So this computation time was precious. In fact, MIT's standard practice in nineteen sixty was to limit each computer user to just four hours of computer time every week, so you can see that you needed to make really good use of that computer time. Now, people were literally using the same machine and reserving those blocks of time to do work on that computer, and that meant that you had various graduate students, professors, lab employees, and more, all relying on the same hardware and, more importantly for this discussion, the same disk file system, and they all had different files that they would work on.
Speaker 1: So there needed to be some way to protect one person's files so that the respective person could be reasonably sure those files would be there when they would need them next. So Fernando's solution was to create a password system. Each user would have his or her own password that would allow them to access their files while keeping all the other users' files off limits. It was sort of a kludge solution to the problem, and Fernando himself didn't necessarily think it was meant to be the go-to methodology for securing data from that point forward. It was really just kind of a stopgap. But sometimes when an idea takes hold, we just run with it, even if it turns out that that idea might not have been the best one to hitch our wagons to in the long run, and such is the case with passwords. Now, at first, passwords didn't need to be particularly complicated or disguised in any complex way, because the number of people who had access to the computer in the first place was pretty darned small, and so it was a manageable system. You could, in theory, even have your passwords completely unencrypted, and as long as no one is digging around for them, it's fine. But you know, you always find troublemakers. And Fernando's approach couldn't scale up to something as immense as the Internet without some pretty significant enhancements. So we're gonna flash forward again, and now we're in the nineteen seventies, a little more than a decade after the modest solution from Fernando had taken deep hold in the computer labs across various universities. That's when Robert Morris Sr. came up with another methodology to use in combination with passwords to make them more secure. This was called hashing. Now, in hashing, you start off with your password, and the password is made up of a string of characters. So maybe it's all letters, maybe it's a combination of different things. Maybe it's case sensitive, and that means you could have both upper and lower case letters in that particular password.
Speaker 1: Maybe it includes numerals and symbols in it. But whether it's a strong password or not, you have a series of characters that make up your password. Then you take that series of characters, or rather the system does, and it applies a mathematical operation to that series of characters to transform them into a numerical code that represents the original password. That numerical code is what would get stored in a database, and that creates that added level of security, because if someone else were to get access to this database, they wouldn't find a bunch of raw, unencrypted passwords that they could then use to get unauthorized access to a system. Instead, they would get the numerical codes that represent passwords but are not themselves passwords. So, in other words, if I got the hash of your password, let's say I get this incredibly long string of numbers that represents your relatively short password, and I were to try and log into an account that you owned, that long string of numbers wouldn't work for me, because that's what you get after the hashing process. It's not your actual password itself. So if the only thing stored in the database is this hash, it protects the original password. Without knowing what operation was used on that original string of characters, it would be very hard to reverse engineer the process to get those original passwords. So let's say that the mathematical process was that a very large prime number was used to multiply against the value of your password. If I don't know the identity of that very large prime number, then I cannot determine what the original string was. That's the idea behind this particular method of security.
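To make that hashing idea concrete, here's a minimal TypeScript sketch using Node's built-in crypto module. It's an illustration rather than production code: SHA-256 stands in for the episode's prime-multiplication example, the function name is made up, and real systems would layer a dedicated password-hashing scheme on top of this idea.

```typescript
import { createHash } from "node:crypto";

// One-way transform: the stored digest represents the password
// but is not itself the password.
function hashPassword(password: string): string {
  return createHash("sha256").update(password).digest("hex");
}

// The database stores only the digest. An attacker who steals it
// can't type the digest into a login form, and can't easily recover
// "swordfish" from it without guessing inputs.
console.log(hashPassword("swordfish"));
```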
Speaker 1: Over time, more enhancements were added to make passwords additionally secure. Remember that any security system tends to be involved in a high-stakes game against those who would penetrate that system. So you have the hackers on one side. They want to know how the security works, and in the process they can learn how to exploit the security systems. Their goal might not be to capitalize on that knowledge; they may just want to know how it works. But whether that's the case or not, if they do know how it works, they know how to exploit it, and that's a danger. I mean, obviously that makes the system insecure. It's like giving someone who probably isn't a thief, but who, you know, could maybe be persuaded to be a thief, a copy of the key to the bank vault. That's probably a bad idea. So on the other side of this game are the security system engineers who are trying to shore up defenses and make it harder for outsiders to get access to a system. So they're trying to identify and patch vulnerabilities before those vulnerabilities can be exploited, or, if a vulnerability has been exploited, to address that and fix it so that it's no longer an entry point into the system. So that's why we started seeing other methods being used with passwords, like salting, for example, that became part of the password systems. Salting is when a system adds a string of random data before the actual password characters and then puts the entire string through the hashing process, and that makes it even more complicated to unravel, because not only do you have the starting password, you have the random data of the salt further complicating the whole process.
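Building on the sketch above, salting might look like the following; again a hedged illustration with invented names, prepending the salt exactly as just described.

```typescript
import { createHash, randomBytes } from "node:crypto";

// Salting sketch: prepend random data to the password before hashing.
// The salt is stored alongside the digest so the login check can
// repeat the same transformation later.
function saltAndHash(password: string): { salt: string; digest: string } {
  const salt = randomBytes(16).toString("hex"); // random per-user data
  const digest = createHash("sha256")
    .update(salt + password) // the salt goes in front of the password
    .digest("hex");
  return { salt, digest };
}

// Two users who pick the same password now end up with different
// stored digests, which defeats precomputed lookup tables.
```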
Speaker 1: And a lot of security depends upon the people using the system being careful. And therein lies the rub: that is an assumption that is frequently unsafe to make. To have a truly secure system, you need the people who create accounts to make strong passwords that are difficult or impossible to guess. Because if I could guess that you are going to use, let's say, your cat's name as your password, it really doesn't matter how much salting or hashing or other fancy things are going on behind the scenes with the password. If I can guess what you used as your password, I can just type that in when logging in as you, and I have access. I don't need to do any fancy decoding or de-hashing or anything like that. I don't even have to get access to the hashed passwords in that database. If I can just guess that you used Mr. Kitties as your password, then I'm going to get in. That's why there's such an emphasis on creating strong passwords, because it makes it very hard for attackers to score a hit by just knowing a few details about you or making some educated guesses. The longer and stronger your password is, the less likely someone is going to get access to your account through a brute force attack. Now, when a hacker or a cracker or whatever you want to call them uses a password generator, for example, one that cycles through possible passwords, typically it will start with common passwords. Maybe default passwords for certain systems; routers, for instance, tend to have default passwords, and a lot of people don't change them, and that can become a problem. Or common passwords that people tend to rely upon, like password, or, if you're really clever, password one two three. And seriously, if you use password one two three as a password, please change it. Please. It would be a great idea. And then after that, the password generator might work its way through a list of common words in a big database that hackers tend to develop, where they'll store common passwords. It's called a dictionary attack, but it's not necessarily a true dictionary where you're just working through all the words of a specific language; rather, you're working through a database of words that have been flagged as being common passwords, because we humans tend to pick these sorts of things. We tend to rely upon them.
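Here's a toy sketch of the dictionary attack just described. The four-entry word list is obviously a stand-in for the much larger databases attackers actually compile, and the unsalted SHA-256 digests match the earlier hashing sketch.

```typescript
import { createHash } from "node:crypto";

// A stand-in for an attacker's database of known-common passwords.
const commonPasswords = ["password", "password123", "123456", "qwerty"];

// Dictionary attack sketch: hash each candidate and compare it to a
// stolen digest. The hash is never reversed; it's only guessed at.
function dictionaryAttack(stolenDigest: string): string | null {
  for (const guess of commonPasswords) {
    const candidate = createHash("sha256").update(guess).digest("hex");
    if (candidate === stolenDigest) {
      return guess; // hit: the original password is recovered
    }
  }
  return null; // no hit: the attacker falls back to brute force
}
```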
Speaker 1: Now, beyond that, let's say that you haven't used one of these common words. The generator might start going through new guesses in the hopes of landing on the right one. So if you make your password very strong, lots of upper and lower case letters and numbers and some symbols, and you make it long enough, it makes it very hard for a brute force attack to get through, because it's not likely that it's going to hit upon that specific combination of characters. Of course, the flip side of that is it also makes it very hard for you to remember that password. And that's the other big weakness of passwords. We have to use them to access our stuff, and convenience is a very important thing to us. Security is important, but so is convenience. If I make the most amazing password and it is next to impossible to crack with a conventional computer system, like it would take centuries for a computer to guess the password, and by the time it did, I'd be long dead and wouldn't care anyway, that's fine. But if I can't remember the password, what good does it do me? I can't access my stuff anyway. So if all of our passwords are unique, which they should be, if we make every password we use on every service a unique password across all of those services, and all of those passwords are really strong, we're probably gonna need some sort of password vault or manager to hold all of them, because we're not going to be able to remember all of these unique strings of characters with upper and lower case and numbers and symbols. Our brains just don't tend to work that way. So I use a password manager myself. I use Dashlane, but there are tons of services out there, so I'm not saying Dashlane is the one and only. It's the one I use, but there are lots of other really good ones out there. And we might enable other security measures as well.
Speaker 1: In fact, we should, if we have the option, enable stuff like two-factor authentication to improve security and reduce the chance that some unauthorized person would get access to our stuff. I've talked about this before, but just as a reminder, two-factor security is when you combine factors, or you can think of them as sort of like categories of stuff, to authenticate that you are who you say you are. So one factor could be a password. That falls into the category of something you know, something you, the user, knows. So that's one factor. Then you would want to use a different factor, something belonging to a different category. So that might be a token, or it might be a cell phone that you have, where you've registered your cell phone number in your profile, right? So when you log in, you do your password, that's something you know, and then the service sends you a message on your cell phone. Maybe there's a text message with a code, a one-time use code associated with it, that you are supposed to enter in order to log into the service. This factor relies on something that you have, the phone. So: something you know, the password; something you have, the phone. By combining those factors, you decrease the chance that someone other than yourself can access your stuff. It's still not foolproof. There are still ways to work around it if you're really determined, but it reduces the chances of somebody being able to get unauthorized access to your accounts. So if you use services that offer two-factor authentication, I highly recommend you actually activate that.
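The texted code is one flavor of "something you have." Authenticator apps use a close cousin, TOTP (RFC 6238), in which the phone and the service derive the same short-lived code from a shared secret. Here's a minimal sketch of that derivation; real deployments add clock-drift windows, rate limiting, and constant-time comparison.

```typescript
import { createHmac } from "node:crypto";

// Minimal TOTP sketch (RFC 6238): derive a six-digit code from a
// shared secret and the current 30-second time step.
function totp(secret: Buffer, timeStepSeconds = 30): string {
  const counter = Math.floor(Date.now() / 1000 / timeStepSeconds);
  const message = Buffer.alloc(8);
  message.writeBigUInt64BE(BigInt(counter));
  const hmac = createHmac("sha1", secret).update(message).digest();
  // Dynamic truncation, as specified in RFC 4226.
  const offset = hmac[hmac.length - 1] & 0x0f;
  const binary =
    ((hmac[offset] & 0x7f) << 24) |
    (hmac[offset + 1] << 16) |
    (hmac[offset + 2] << 8) |
    hmac[offset + 3];
  return (binary % 1_000_000).toString().padStart(6, "0");
}

// Both sides compute the code independently; the login succeeds only
// if the code the user types matches the one the server derives.
```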
Speaker 1: But it's pretty clear that passwords aren't the ideal solution for security. They rely too heavily on the people using the systems to take the steps needed to make those systems truly secure. And I'm not placing all the blame on users, because taking all those steps, if you're really trying to be super secure and you also want to rely on a lot of different services, that's a big pain in the butt. You know, it's not convenient. And on top of all of that, we're also at the dawn of the age of quantum computing, which could potentially render modern passwords obsolete. That's because of the peculiar way that quantum computers depend upon phenomena like superposition, in which a quantum particle can inhabit multiple states at once, sort of like a light switch being able to be off and on at the same time, or the quantum phenomenon of entanglement, in which two quantum particles have their states tied to one another no matter how far apart those particles might be in space. The whole quantum computer discussion gets really complicated, but what it boils down to for the purposes of cryptography is that a quantum computer with sufficient computing power will be very good at certain tasks, and among them could be breaking modern security systems, if you had a hacker who had access to a powerful quantum computer and the right algorithm. Because it's not just that a quantum computer can magically do this. You'd have to actually design the algorithm to do this. But people have designed such algorithms, and they could conceivably penetrate a secure system in a relatively short amount of time, and so even strong passwords are on borrowed time. That sets the stage for us to talk about WebAuthn, although that's not a cure-all for the quantum computer problem, but I'll explain more about that in just a second. First, let's take a quick break. All right, so what is WebAuthn? It's a specification that has recently been upgraded to full-on standard status by the World Wide Web Consortium, or W3C, and the FIDO Alliance, F-I-D-O. But what the heck does all of that mean?
Speaker 1: Well, the W3C is, quote, "an international community where member organizations, a full-time staff, and the public work together to develop Web standards. Led by Web inventor and Director Tim Berners-Lee and CEO Jeffrey Jaffe, W3C's mission is to lead the Web to its full potential," end quote. So the purpose of the W3C is to standardize stuff for the Web so that there's kind of a unified foundation upon which all web stuff can be built. Without some sort of organization to oversee this, you could end up with a very fractured experience that would be a nightmare to navigate, because as a user, you might discover that you can't visit some websites or use some web services if you were using one particular browser versus another, or one particular device versus another. It might be that, oh, when I'm on this smartphone, I can't access this thing, or if I'm using Chrome instead of Firefox, I can't go to this website. That would be awful. So we don't want that to happen. Standards allow us to create that common ground where we know, as long as everything is built on those standards, we should be able to access it through a standardized browser or device, or rather a browser or device that works on those same standards. Now, in turn, if we did not do that, then we would have a very fractured approach, right? You would have to have multiple browsers on your computer in order to access different things. You might say, oh well, I want to go to my bank, but that means I've got to quit out of Chrome and I need to open up, you know, Firefox or Safari or something like that. That would be a nightmare. On top of that, you might end up having certain types of services where you would have to install lots of different extensions onto an existing browser, which could introduce security vulnerabilities. So that could be bad.
Speaker 1: So the W3C seeks to make the web a virtual place where anyone with a browser or a web-enabled device can access the stuff on the web in a positive way, meaning a way that doesn't, like, invite malware and security breaches. Now, the FIDO Alliance is, quote, "an open industry association with a focused mission: authentication standards to help reduce the world's over-reliance on passwords," end quote. So this ties directly into the whole purpose of WebAuthn. The goal of FIDO is to use open standards to develop better ways to secure systems that not only protect data but are easy for end users to employ without negatively impacting access to the web. That's a tall order. You want something that's more secure than passwords, you don't want it to be more inconvenient than passwords are, and you want to make sure that the actual practice of using it is no more negative than a password would be. The organization creates these specifications and makes them free to use globally. FIDO is a relatively young organization that grew out of an alliance between PayPal, Lenovo, and several other companies back in two thousand twelve, and that developed out of earlier discussions between PayPal and a company called Validity Sensors that was really focusing on the possibility of using biometrics as a means of identifying a user rather than passwords. Okay, so those two organizations, FIDO and the W3C, have declared the WebAuthn specification a standard, so now it's adopted as the official standard. Now we can dive into what WebAuthn is all about. First, WebAuthn itself is one part of a larger group of specifications called FIDO2, and it's a standard that focuses on the platform or browser level of the Internet ecosystem, so we could call this sort of a client-side standard. Ultimately, what the standard allows for is for web-based sites and services to let users employ a FIDO security key or biometric data or a personal mobile device to access their various accounts.
Speaker 1: Biometrics could rely on stuff like a fingerprint scan or face scan or retina scan or voiceprint. The idea here is that the user would never have to remember a password again; they would instead depend upon this authentication strategy. Now, in addition, the login credentials would be unique to each service or site. So, in other words, you'd be using the same input on your level, like your experience would always be the same. It might be the fingerprint scan, let's say. So you want to log into your bank, you use your fingerprint scanner, but then you want to log into a social media site, you use the fingerprint scanner again. So to you, you're using the exact same thing each time. But on the back end, the actual credentials that are created are unique to the bank or to the social media site or whatever. So you don't have one universal set of credentials for everything, because that would be a very poor security method. So we're going to think about fingerprint scans for much of this episode, just for the purposes of simplicity. But obviously it could be lots of different stuff, and you might even use something like a security key on a special USB stick that you would plug into a computer when you want to log into services. So it doesn't have to be biometric data connected directly to you. It could be a specific key that you've used when you've registered on some service, and then you just have to keep track of that key for the rest of your life. So here's the thing. Your fingerprint obviously doesn't change from site to site. That's the point, right? Your fingerprint is unique to you, so it's what gives you the authority to access your profiles across these various services. At the same time, you don't want the digital representation of your fingerprint to be the same from service to service. That's what I was talking about a second ago. You don't want that set of credentials to be universal, because that would be a huge security vulnerability.
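One way to picture "same fingerprint everywhere, different credentials everywhere" is an authenticator that keeps the biometric entirely local and instead mints a separate key pair for each site. This is a simplified sketch of that idea with hypothetical names, not the literal WebAuthn data structures; the key-pair mechanics come up again in a moment.

```typescript
import { generateKeyPairSync, KeyObject } from "node:crypto";

// Hypothetical authenticator-side store: one key pair per relying party.
const credentialStore = new Map<
  string,
  { publicKeyPem: string; privateKey: KeyObject }
>();

// Minting credentials: the fingerprint only unlocks this step locally.
// Each site receives a different public key, so even if two sites'
// databases both leak, the profiles can't be linked to one person.
function register(relyingPartyId: string): string {
  const { publicKey, privateKey } = generateKeyPairSync("ed25519");
  const publicKeyPem = publicKey
    .export({ type: "spki", format: "pem" })
    .toString();
  credentialStore.set(relyingPartyId, { publicKeyPem, privateKey });
  return publicKeyPem; // only the public half ever leaves the device
}

register("bank.example");
register("socialmedia.example"); // a distinct, unlinkable key pair
```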
Speaker 1: If hackers were able to get access to the digital representation of your fingerprint, this set of credentials, in other words, then they could presumably use that as a means to access your stuff. They could clone your method of access and then access your things as if they were you, so that would be a bad thing. Processes such as hashing can take the data from whatever you've scanned in, whether it's that biometric approach with the fingerprint scanner or the key fob, and transform it in a way that's unique to the particular service you're accessing. I'll talk more about that sequence in a little bit. So let's say you're using fingerprint scans to access a social media site, and you're using them to access your bank. The biometric data itself stays on the device you're using, the scanner you're using. It's not sent up to a web server at your social media site or your bank. Instead, another process or series of processes transforms that biometric data into what amounts to the equivalent of a unique password for that particular service. It's just not a password you type in; it's one that's generated from your physical biometric information. The social media site uses one process to transform the biometric data and the bank uses a totally different one, and if someone were to somehow get access to the databases for both of those businesses, they would not be able to link the two profiles together to identify you, because the mathematical transformations done on your fingerprint data make it look like it's two different people. So what's actually going on is the pairing of public and private keys, which I've talked about in previous episodes of TechStuff. WebAuthn is an API, or application programming interface, and it creates login credentials through asymmetric encryption. Now, that's a lot of technical talk for what's going on, but let's break it down into a more understandable description.
Speaker 1: We can look at WebAuthn as an ecosystem with three major entities. You've got the user agent. That's the portal through which a user is accessing a service, so it could be a web browser. That's the primary one. So a web browser would be the user agent, and it's just however you're getting access to what you're trying to log into. Then you have the servers upon which the service you're logging into exists. That is also called the relying party. It's the part that needs assurance that you are who you say you are in order to give you the access to the stuff you want. Then you have the authenticator. This is the element that acknowledges you are who you say you are. It's the trusted third party that tells the service, this person's legit, I can vouch that they are who they claim to be. And this could take the form of biometric data like a fingerprint, or it could be a retina scan or a voiceprint, it could be a PIN, it could be a gesture, or it might require a USB security token that you plug into a laptop to authenticate that you are who you claim to be. There are two use cases in which these three parties interact. The first is registration and the second is authentication. Registration, as the name implies, refers to the process of initially establishing an identity associated with an account using an authenticator. Authentication refers to using the authenticator to prove your identity upon subsequent visits. So let's use an example. Let's say that you want to create an account with a fictional service we're gonna call Schmoogle, and you are on a desktop computer, and you're on the web browser of your choice, that's the user agent. And you use your browser to navigate to the Schmoogle website and you start filling out a profile. You've said, all right, I want a profile on Schmoogle. The Schmoogle server, also known as the relying party in this ecosystem, sends back to the user agent, your browser, what is called a challenge.
Speaker 1: The user agent sends that challenge, plus a command to create new credentials, to the authenticator, and the authenticator may send an authorization request to the user agent. In this example, let's say that after you've hit register, your smartphone buzzes and you see a notification asking you to complete registration on your phone by scanning your fingerprint or hitting a button or something like that. You authenticate, and the message goes back to the authenticator, which creates new credentials and signs off on the challenge. It creates a digital signature and sends that to the user agent. The user agent then takes the new credentials, in the form of a public key paired with the signed challenge, and sends that to the relying party, and the relying party registers the user. It's a long way of putting that process. Authentication is slightly different, but also involves the relying party requiring the authenticator to sign off on a challenge based on public key cryptography. Upon receiving the signed challenge, the relying party admits entry into the service.
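In browser code, the registration ceremony just described runs through the actual WebAuthn API, navigator.credentials.create. Here's a hedged TypeScript sketch using the episode's fictional Schmoogle; the challenge and the user details are placeholders that would really come from the relying party's server.

```typescript
// Browser-side sketch of WebAuthn registration. In a real flow the
// challenge arrives from the relying party's server and must be
// random and single-use.
async function registerWithSchmoogle(challengeFromServer: Uint8Array) {
  const credential = await navigator.credentials.create({
    publicKey: {
      challenge: challengeFromServer, // what the authenticator signs
      rp: { name: "Schmoogle" }, // the relying party
      user: {
        id: new TextEncoder().encode("user-1234"), // placeholder ID
        name: "pat@example.com",
        displayName: "Pat",
      },
      // Ask for an ECDSA P-256 key pair (COSE algorithm -7).
      pubKeyCredParams: [{ type: "public-key", alg: -7 }],
    },
  });
  // The returned credential carries the new public key plus the signed
  // challenge; it goes back to the relying party, which verifies the
  // signature and registers the user.
  return credential;
}
```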
Speaker 1: Now, much of the emphasis I've seen on WebAuthn has been its association with biometrics in particular. That would be the factor known as something a user is, because it's pretty darn hard to replicate without the user. Not impossible, but difficult. I'll explain more in just a second, but first, let's take another quick break to thank our sponsor. All right. Weak passwords are a huge threat to security; the W3C cites that stolen, weak, or default passwords are responsible for eighty-one percent of data breaches. The idea is that the biometric approach can eliminate an enormous and enormously expensive problem. Companies have to spend millions of dollars every year to address or prevent data breaches. By taking passwords out of the ecosystem, the W3C and FIDO hope to eliminate the most common tools in a hacker's arsenal. You can't guess someone's fingerprint or use a clever trick to fool someone into sharing their fingerprint password with you. Social engineering and phishing would get super complicated. You would need physical access to your targeted user to trick them into opening up the login access to you, unless you could figure out some other trick, a way around it. Presumably, getting physical access to a user would raise a few more questions than just your average phishing attempt. The WebAuthn specification already has wide adoption as well. It's supported in Windows ten, Android, Google Chrome, Mozilla Firefox, Microsoft Edge, and the Apple Safari preview web browser. So now it falls to service providers and site administrators to turn on support for WebAuthn, and it opens up a new opportunity for companies to produce the peripherals needed to take advantage of this. Most smartphones and laptops have things like built-in cameras. Webcams are fairly common. We're seeing more companies incorporate cameras directly into displays. For WebAuthn to become widespread, we'll likely see more devices that can interoperate with existing technology, like fingerprint scanners that can plug into existing computers via USB ports, or we'll see more of those security keys. Some services have been supporting WebAuthn for a while now. Back in May of two thousand eighteen, Dropbox announced via its official blog that the service had integrated WebAuthn support. The company did not toss passwords out the window or anything. Instead, Dropbox incorporated WebAuthn into two-factor authentication, and the company acknowledged that there are still a lot of questions we need to answer before we leave passwords behind entirely. So, for example, what if you were to depend upon a USB security key, such as the Yubico security key? That's like the industry standard. It's a little USB stick. You plug it into your laptop or your desktop computer when you try and log into a service that requires the security key.
Speaker 1: There's a little button on the actual USB stick that starts to light up. You press that button and it authenticates that you are who you say you are, and you're able to get access without having to put in a password. Or, more typically, you're using this as two-factor authentication, so you're using it in addition to a password. Now, these keys don't have any identifiable information stored on them. If someone else found your security key, they wouldn't be able to log into your account without knowing who you are and which services you use with that key. So if you happened to have the YubiKey security key USB thing on a key chain and it fell somewhere and some random person picked it up, it would be of no use to them. They wouldn't know it belonged to you, they wouldn't know how to use it. So you could feel fairly secure that your various accounts would be safe. But what about you? How are you to access your services if you lose that key? I mean, it's another physical thing you have to keep up with. It's a pain in the butt, right? Well, according to Yubico, the best practice is to have a backup security key registered as well. So you could presumably have a WebAuthn implementation where a user could link more than one security key to the account and have a backup security key, and you would keep that physical USB security key in a safe place, maybe in an actual fireproof safe, for example. And if you lost your primary key, you would still have a backup you could use, and you could maybe deactivate your primary key at that point. Yubico points out that the threats for which the company made the keys in the first place are all remote account takeover threats. They aren't physical access threats, right? It's not, this is someone who has gotten physical access to your computer and now they're gonna log on.
The threat, rather, 577 00:35:13,320 --> 00:35:17,160 Speaker 1: is someone who's trying to mount an attack over 578 00:35:17,200 --> 00:35:19,759 Speaker 1: the Internet and pose as you, and that's the most 579 00:35:19,800 --> 00:35:24,279 Speaker 1: common tactic attackers take. It's unlikely that you would 580 00:35:24,360 --> 00:35:26,799 Speaker 1: encounter someone who would aim to get physical access to 581 00:35:26,840 --> 00:35:29,799 Speaker 1: you first in order to get access to your online accounts. 582 00:35:30,360 --> 00:35:33,560 Speaker 1: They're more likely to use social engineering and phishing tactics, 583 00:35:33,640 --> 00:35:35,719 Speaker 1: or a man-in-the-middle attack, which is what 584 00:35:35,840 --> 00:35:38,759 Speaker 1: these security keys are designed to foil. The security key 585 00:35:38,800 --> 00:35:42,560 Speaker 1: represents an entity the relying party trusts, and without that 586 00:35:42,640 --> 00:35:46,120 Speaker 1: party in the login process, the relying party won't grant 587 00:35:46,200 --> 00:35:49,440 Speaker 1: access to the account. Now, apart from the concern that 588 00:35:49,560 --> 00:35:53,600 Speaker 1: you might misplace a physical security key, there are other 589 00:35:53,680 --> 00:35:57,120 Speaker 1: challenges associated with WebAuthn. So let's go to biometrics. 590 00:35:57,200 --> 00:35:59,880 Speaker 1: Let's say you're not using a physical security key but instead 591 00:36:00,040 --> 00:36:03,600 Speaker 1: a fingerprint scanner or retina scanner or something. Biometrics 592 00:36:03,640 --> 00:36:07,560 Speaker 1: relate back to who you are biologically. A biometric system 593 00:36:07,640 --> 00:36:12,279 Speaker 1: might authenticate your identity based off of some 594 00:36:12,560 --> 00:36:16,040 Speaker 1: aspect of you that should be unique to you. Now, 595 00:36:16,120 --> 00:36:19,759 Speaker 1: unlike a password, which is, at least in theory, 596 00:36:20,000 --> 00:36:23,760 Speaker 1: known only to the individual user, biometric data 597 00:36:23,880 --> 00:36:27,759 Speaker 1: is public, right? I mean, it's things about us 598 00:36:27,800 --> 00:36:31,399 Speaker 1: that other people can see. It's not hidden away. Most 599 00:36:31,440 --> 00:36:35,440 Speaker 1: of our modern culture these days celebrates sharing images and 600 00:36:35,600 --> 00:36:39,160 Speaker 1: videos of ourselves in public online forums, from social media 601 00:36:39,239 --> 00:36:42,440 Speaker 1: sites to video platforms like YouTube. That puts pressure on 602 00:36:42,480 --> 00:36:46,120 Speaker 1: companies creating biometric systems to make sure they're detecting the 603 00:36:46,160 --> 00:36:49,720 Speaker 1: real presence of a user. You might remember a story 604 00:36:49,760 --> 00:36:52,720 Speaker 1: from a few years ago about some cigarette vending machines 605 00:36:52,760 --> 00:36:56,759 Speaker 1: in Japan that used facial recognition software to determine if 606 00:36:56,800 --> 00:36:59,120 Speaker 1: someone who was trying to buy cigarettes was actually old 607 00:36:59,200 --> 00:37:01,759 Speaker 1: enough to do so.
The system looked for signs of 608 00:37:01,800 --> 00:37:04,920 Speaker 1: aging that would indicate the buyer was of the appropriate age. 609 00:37:05,120 --> 00:37:07,640 Speaker 1: But soon news broke that the system could be fooled 610 00:37:07,760 --> 00:37:09,960 Speaker 1: just by holding up a printed-out picture of an 611 00:37:10,000 --> 00:37:13,479 Speaker 1: older person's face. The camera couldn't tell the difference between 612 00:37:13,520 --> 00:37:16,439 Speaker 1: a two-dimensional image and a three-dimensional human face 613 00:37:16,520 --> 00:37:19,319 Speaker 1: in front of it. Now, that's a huge flaw. It's 614 00:37:19,320 --> 00:37:22,879 Speaker 1: a pretty extreme example of how a biometric system could 615 00:37:22,920 --> 00:37:26,240 Speaker 1: be fooled, but it reinforces the fact that these systems 616 00:37:26,320 --> 00:37:29,600 Speaker 1: must be proven to be extremely reliable, or else they 617 00:37:29,640 --> 00:37:34,000 Speaker 1: are still big security vulnerabilities. Similar concerns were raised when 618 00:37:34,000 --> 00:37:37,080 Speaker 1: Apple announced that new versions of its iPhone would allow 619 00:37:37,160 --> 00:37:41,040 Speaker 1: users to unlock their phones via facial recognition software. People 620 00:37:41,080 --> 00:37:44,440 Speaker 1: began to ask questions like, could the camera differentiate between 621 00:37:44,520 --> 00:37:48,720 Speaker 1: images and actual faces? Could it tell the difference between identical twins? 622 00:37:49,239 --> 00:37:51,880 Speaker 1: And then there's a related problem. Sometimes the system might 623 00:37:51,920 --> 00:37:55,799 Speaker 1: have trouble recognizing a person in different circumstances, like in 624 00:37:55,880 --> 00:37:58,839 Speaker 1: low lighting or at different parts of the day. There 625 00:37:58,920 --> 00:38:01,400 Speaker 1: was an article in Slate in July twenty eighteen 626 00:38:01,480 --> 00:38:05,040 Speaker 1: titled iPhone Face ID Struggles to Recognize People in 627 00:38:05,080 --> 00:38:09,160 Speaker 1: the Morning. So these systems can fail to authenticate a 628 00:38:09,200 --> 00:38:12,920 Speaker 1: person when it really is the right person, just because 629 00:38:12,960 --> 00:38:17,720 Speaker 1: the circumstances are different from when you registered your identity. 630 00:38:17,880 --> 00:38:20,200 Speaker 1: Then there's the concern that some of these systems might 631 00:38:20,320 --> 00:38:26,200 Speaker 1: exclude entire ethnicities. It's a case of technological bias, which 632 00:38:26,239 --> 00:38:28,680 Speaker 1: is something that can happen. The people who design these 633 00:38:28,719 --> 00:38:34,839 Speaker 1: systems might exclude entire ethnicities, not necessarily intentionally, but just 634 00:38:34,880 --> 00:38:38,960 Speaker 1: because of their own ethnic background: they're designing 635 00:38:38,960 --> 00:38:43,080 Speaker 1: a system meant to recognize and differentiate people of their 636 00:38:43,120 --> 00:38:46,520 Speaker 1: own ethnicity, because that's where they're coming from. They're not 637 00:38:47,040 --> 00:38:51,640 Speaker 1: thinking outside of that. And there are real examples 638 00:38:51,640 --> 00:38:54,520 Speaker 1: of this as well. A two thousand seventeen news story reported 639 00:38:54,560 --> 00:38:57,160 Speaker 1: that once again Apple was in the news for a 640 00:38:57,200 --> 00:39:00,080 Speaker 1: bad reason.
Its Face ID had trouble distinguishing 641 00:39:00,160 --> 00:39:05,080 Speaker 1: between different Chinese people. So that's a big problem. Fingerprints 642 00:39:05,080 --> 00:39:07,840 Speaker 1: have likewise been shown to be vulnerable to security attacks. 643 00:39:08,160 --> 00:39:12,920 Speaker 1: Jan Krissler took some high-resolution pictures of Ursula von 644 00:39:13,040 --> 00:39:17,720 Speaker 1: der Leyen, the German Minister of Defense, and was able 645 00:39:18,000 --> 00:39:22,560 Speaker 1: to make copies of her fingerprints and fool fingerprint authentication 646 00:39:22,880 --> 00:39:26,480 Speaker 1: just from those high-resolution photos. Krissler was also able to 647 00:39:26,560 --> 00:39:29,960 Speaker 1: lift a fingerprint off of an iPhone screen and then 648 00:39:30,080 --> 00:39:33,200 Speaker 1: use that lifted print to fool the iPhone's Touch 649 00:39:33,320 --> 00:39:37,680 Speaker 1: ID technology and authenticate himself as the proper owner of 650 00:39:37,719 --> 00:39:41,239 Speaker 1: the iPhone. Now, these are pretty specific incidents, and they're 651 00:39:41,280 --> 00:39:43,879 Speaker 1: not likely to become the types of situations that the 652 00:39:43,920 --> 00:39:47,440 Speaker 1: average Internet user will encounter. But I felt I needed 653 00:39:47,440 --> 00:39:50,640 Speaker 1: to include a bit in this discussion on the subject 654 00:39:50,640 --> 00:39:55,040 Speaker 1: to be thorough with it, to be fair and objective, 655 00:39:55,280 --> 00:39:58,520 Speaker 1: and to show that while biometrics have advantages over passwords 656 00:39:58,600 --> 00:40:01,279 Speaker 1: in some areas, there are still some things we'll need 657 00:40:01,320 --> 00:40:04,440 Speaker 1: to address if we want to use them regularly. 658 00:40:04,480 --> 00:40:08,160 Speaker 1: The responsibility will no longer be, let's make sure we 659 00:40:08,239 --> 00:40:11,600 Speaker 1: make really good passwords and that we remember them all, 660 00:40:11,960 --> 00:40:14,120 Speaker 1: but rather, let's make sure we keep track of our 661 00:40:14,160 --> 00:40:17,080 Speaker 1: stuff and we don't let it fall into the wrong hands, 662 00:40:17,600 --> 00:40:20,640 Speaker 1: particularly wrong hands as clever as Jan Krissler's. 663 00:40:21,360 --> 00:40:24,440 Speaker 1: That would be bad. Now, outside of the biometrics, some 664 00:40:24,600 --> 00:40:28,440 Speaker 1: security experts have cautioned against WebAuthn from a purely 665 00:40:28,600 --> 00:40:34,440 Speaker 1: algorithmic perspective. Security researchers with the Paragon Initiative raised 666 00:40:34,520 --> 00:40:38,120 Speaker 1: questions about two algorithms in particular, stating that it was 667 00:40:38,160 --> 00:40:41,799 Speaker 1: theoretically possible for someone to develop an exploit that would 668 00:40:41,840 --> 00:40:44,640 Speaker 1: let attackers steal a key from a user and then 669 00:40:44,680 --> 00:40:47,400 Speaker 1: clone it themselves, kind of like cloning a credit card. 670 00:40:47,840 --> 00:40:50,520 Speaker 1: But on the bright side, the FIDO Alliance got in 671 00:40:50,560 --> 00:40:53,880 Speaker 1: touch with those researchers from the Paragon Initiative to work on 672 00:40:53,960 --> 00:40:57,640 Speaker 1: best practices and documentation, as well as improving protocols for 673 00:40:57,640 --> 00:41:00,440 Speaker 1: WebAuthn to make it more secure. So that's a great step.
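Since we're on the subject of algorithms, here is roughly what a relying party checks when it decides whether to grant access. This is a hedged sketch, not a production implementation or any specific library's API: it assumes an ES256 (ECDSA P-256) credential whose public key was saved in PEM form at registration, a challenge stored base64url-encoded, and it skips steps a real implementation also needs, like checking the origin, the RP ID hash, and the signature counter.

// Rough server-side sketch of a relying party verifying a WebAuthn
// assertion, under the assumptions stated above. The authenticator signs
// authenticatorData plus the SHA-256 hash of clientDataJSON, and this
// signature scheme is exactly what the algorithm debate is about.
import { createHash, createVerify } from "node:crypto";

function verifyAssertion(
  storedPublicKeyPem: string, // saved when the security key was registered
  expectedChallenge: string,  // base64url challenge the server issued
  authenticatorData: Buffer,  // returned by the authenticator
  clientDataJSON: Buffer,     // returned by the browser
  signature: Buffer,          // DER-encoded ECDSA signature from the key
): boolean {
  // The browser echoes the challenge back; it must match what we issued,
  // and the type must mark this as a login rather than a registration.
  const clientData = JSON.parse(clientDataJSON.toString("utf8"));
  if (clientData.type !== "webauthn.get") return false;
  if (clientData.challenge !== expectedChallenge) return false;

  // Recreate the exact byte string the authenticator signed.
  const clientDataHash = createHash("sha256").update(clientDataJSON).digest();
  const signedData = Buffer.concat([authenticatorData, clientDataHash]);

  // Without a valid signature from the trusted key, no access is granted.
  return createVerify("sha256")
    .update(signedData)
    .verify(storedPublicKeyPem, signature);
}

If an attacker could extract and clone the private key behind that signature, the check above would pass for them too, which is the heart of the concern those researchers raised and why tightening the approved algorithms matters.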
674 00:41:00,840 --> 00:41:04,040 Speaker 1: So I'm not doom-and-glooming WebAuthn. I 675 00:41:04,040 --> 00:41:07,240 Speaker 1: actually think it's a really useful technology. But I also 676 00:41:07,760 --> 00:41:11,520 Speaker 1: will completely admit it's one that's going to require a 677 00:41:11,560 --> 00:41:15,279 Speaker 1: little care and shepherding along the way to make it 678 00:41:15,400 --> 00:41:22,640 Speaker 1: truly a useful, adoptable tech. And maybe within five years, 679 00:41:22,680 --> 00:41:26,239 Speaker 1: ten years, we won't be worried about passwords anymore. We'll 680 00:41:26,280 --> 00:41:30,040 Speaker 1: be using WebAuthn to log into everything, and that 681 00:41:30,080 --> 00:41:32,000 Speaker 1: will actually make my life a lot easier, as long 682 00:41:32,040 --> 00:41:35,440 Speaker 1: as I don't lose that security key, which I'm already 683 00:41:35,480 --> 00:41:37,560 Speaker 1: having anxiety about, and I don't even own one of 684 00:41:37,560 --> 00:41:39,520 Speaker 1: the darned things yet. They do exist. You can go 685 00:41:39,600 --> 00:41:42,640 Speaker 1: out there and get them right now. I just 686 00:41:42,719 --> 00:41:46,239 Speaker 1: have this terrible feeling that I'll be constantly trying to 687 00:41:46,280 --> 00:41:52,319 Speaker 1: retrieve my accounts through complicated customer service menus, because I 688 00:41:52,360 --> 00:41:55,480 Speaker 1: lose stuff. But that's on me. If you guys have 689 00:41:55,520 --> 00:41:58,440 Speaker 1: any suggestions for future episodes of tech Stuff, why not 690 00:41:58,480 --> 00:42:00,920 Speaker 1: send me a message. The email address for the show 691 00:42:01,560 --> 00:42:05,520 Speaker 1: is tech Stuff at how stuff works dot com, or 692 00:42:05,600 --> 00:42:08,560 Speaker 1: pop on over to our website, that's tech stuff podcast 693 00:42:08,640 --> 00:42:10,600 Speaker 1: dot com. That's where you're going to find an archive 694 00:42:10,640 --> 00:42:13,040 Speaker 1: of all of our older episodes, as well as links 695 00:42:13,080 --> 00:42:16,640 Speaker 1: to our social media presence, so you can present us 696 00:42:16,640 --> 00:42:19,000 Speaker 1: with something on social media. There's also a link to 697 00:42:19,120 --> 00:42:22,319 Speaker 1: our online store. Remember, every purchase you make there goes 698 00:42:22,360 --> 00:42:24,839 Speaker 1: to help the show. We greatly appreciate it, and I'll 699 00:42:24,880 --> 00:42:33,480 Speaker 1: talk to you again really soon. 700 00:42:33,640 --> 00:42:36,160 Speaker 1: For more on this and thousands of other topics, visit how stuff works 701 00:42:36,160 --> 00:42:46,360 Speaker 1: dot com.