Speaker 1: Welcome to TechStuff, a production of iHeartRadio's How Stuff Works. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with How Stuff Works and iHeartRadio, and I love all things tech. I decided to deliver it that way because Tari was going to do it with me otherwise. I am here to give you a classic episode of TechStuff, and I thought for a while that I should re-enact the entire thing and do all the voices for me and for Chris, but Tari reminded me that I also want to get home today, so instead we're just gonna play this classic episode for you. It is called "TechStuff Looks at Password Security," and it dates from September 19, 2012.
Speaker 1: Enjoy. Security has been in the news a lot lately as of the time we're recording this in late August 2012, and part of that is because, as we have touched on a handful of times, some of the big, more widely publicized cases have been making the news: hackers have been breaking into different accounts at major corporations online, stealing people's information. It's unclear whether people's credit card numbers were stolen, or if they have your home address, or know the name of your dog. There was a whole story of Matt Honan getting his entire digital life hacked because of a vulnerability between the systems of Amazon and Apple, which, taken alone, clearly were not obvious as problems, but when put together posed problems, because the people who were doing the hacking gamed the system and played them against one another to create a bigger picture that allowed them to get the information. Well, you know, people have been saying that you need secure passwords, please, and there are news reports about this too.
Speaker 1: People are still using "password" as their password, or obvious strings like "one, two, three, four." That's the kind of thing an idiot puts on his luggage. So, yeah, those kinds of things are still in practice, and of course you need to use more secure passwords, but it goes deeper than that. There's more information out there now about how even using stronger passwords alone isn't necessarily going to keep hackers from being able to get into your account. So think about what you're doing. There are several things that you have to consider. One of those is the idea of linking accounts together, because that means that should one account become vulnerable, then those other linked accounts could also be vulnerable. That was the case with Matt Honan, right. One of the many problems of his, yes, one of the more identifiable problems, because once they got access to his Google account, they were able to reset stuff all over the place.
Speaker 1: And then it turned out that all they really wanted was access to his Twitter account, which, I guess in a way, means he's fortunate. But it's still pretty crazy, everything that they managed to do in order to do that, and they caused quite a bit of damage along the way, to Matt Honan anyway, not to mention to the public perception of security on the back end. So that's one thing: linking lots of accounts together holds a very specific danger. I mean, take something like Facebook Connect or really any OpenID approach, right? If that system is not secure, you have a single point that you can target that will give you access to lots of stuff. Now, that's so sad, because for us, the consumer, that's so helpful. Yeah, having one account that you can log into, and from there you can authenticate with multiple other services. You don't have to fill out form after form after form. You know, it is a very valuable service. Now, I'm not saying that Facebook Connect or OpenID or any of that is not secure.
Speaker 1: They're putting lots of protections in place to try and keep user information as safe as possible. Yeah, it's not so much that it's inherently wrong as that if something does happen, it can cause serious problems. Right. So that's one issue. Another issue is the way that we create passwords as users, those of us who are using either very common words or even names. Even if we think we're being clever by adding a few numbers, that's not really that secure. And it becomes even more insecure if we're using those passwords across multiple accounts. So, we both read an article from Ars Technica by Dan Goodin called "Why passwords have never been weaker, and crackers have never been stronger." It's a fascinating read, and I do recommend you check it out if you find this episode interesting, or even if you don't; it's a good thing to know. Ars Technica typically gets into more technical detail than articles on HowStuffWorks.com.
Speaker 1: But if you're really serious about it, there's a lot of important information in there, and we can give you kind of the layman's approach to what is going on here. Part of that is that I remember reading, and it may not have been in this article, a statistic that the average user has something like six and a half passwords. That's in there. Okay, so they use six and a half passwords, and of course this is an average; we're not saying someone out there goes, "You know what, I was gonna type in my whole password, which is typically 'password,' but I'm just gonna type in 'pass' for this one." No, that's not what it means. It's the average. But that means that, you know, the average person has around twenty-five accounts across the web, but they're using on average six and a half passwords, so each password is being reused for close to four accounts on average. Again, that's an average. You might have one password that you use for twenty accounts and the others spread across the remaining five.
Speaker 1: Well, I don't want to use the same password on Google and, say, Yahoo, so I'll use one for one and the other one for the other, and then I'll use the Google one again for some other site, or whatever, one for Facebook, because those are disconnected enough that it's not gonna matter. That's still a problem. And unless you think that I am a super genius because I can say this: no, I reuse passwords from time to time too. I'm guilty of it just as much as the rest of the planet. I was awful for a long time. Yeah, that was pretty much me too. I had about three passwords that I used for almost everything. That is no longer the case, people. I don't do that anymore. Well, I told you, I didn't mean you had to erase all those accounts. Anyway. So that's another user behavior, and we'll get more into that in a minute. But then the third piece is: how safe are those passwords within the databases of the companies that hold those passwords?
Speaker 1: So if you are a cracker, you know, a hacker who is specifically trying to crack into security systems, and you have identified a potential target and try to get at their password database, then if it's one where the user base of that service or company also typically has accounts at other places, you've managed to not just get the passwords for that one account; knowing that people tend to reuse their passwords, you might actually have access to multiple services. Now, there are ways that companies can protect against this, not just by building a good security system that's hard to crack, but also by encrypting those passwords in the database, so that if you get that database, yes, you've got a whole bunch of data, but it does not translate directly to the passwords, because it's been put through a hashing algorithm. Yeah, and there are several sort of standard hashing algorithms. So basically it's a little like email encryption.
Speaker 1: So you have, let's just pick "pass," the four-letter word "pass." You put it through the hashing algorithm, and on the other side, the letters and numbers that make up the encrypted information look nothing like that. It might be that your four-letter password has just become a thirty-two-character encrypted string. Yeah, so somebody seeing that written down, say on a piece of paper, is not going to have any idea what that is, and they're not really going to have any way to decipher it. And theoretically it's pretty well protected, right? Theoretically. But here's the problem: first of all, not every company has historically encrypted all those passwords. There have been cases where crackers have gotten access to a password database that was stored in plain text. That means that the password that you type in appears in that database exactly as you typed it, so there's no hidden code or anything.
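A minimal sketch of the idea, in Python. MD5 is used here only because it produces the 32-character digest mentioned above; it was a common choice in this era, though a real system should use a slow, salted hash instead.

```python
import hashlib

def hash_password(password):
    """Return the hex digest that would sit in the database
    instead of the raw password."""
    return hashlib.md5(password.encode("utf-8")).hexdigest()

digest = hash_password("pass")
print(digest)       # 32 hex characters that look nothing like "pass"
print(len(digest))  # 32
```

The same input always produces the same digest, which is exactly why the site can check your login without storing the plaintext, and also, as discussed below, why precomputed hashes are so useful to crackers.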
Speaker 1: You've got those passwords. Well, that's very valuable to a cracker for more than just the fact that they now have access to your account. What's also valuable is that they now have a list of words that people use as passwords. So there's a type of attack we should talk about: the brute-force attack. A brute-force attack is when a cracker tries to get access to a system by essentially filling out the password field multiple times until they get a positive result. And one way of doing a brute-force attack, a very common way, is what's called a dictionary attack, where you create a virtual dictionary of words that you use as the basis for passwords, knowing that a lot of people will pick a common dictionary word as the basis of their password: aardvark, antelope, anteater, you know, and it just goes all the way through. They pick animals for some reason. But something else that they'll do as part of this dictionary attack is start adding or changing symbols.
Speaker 1: So let's say your password is "aardvark," but you're being clever and changing the a's to at symbols. Or, you know, let's say you pick a word with e's in it and you change them to threes. They try those too, yeah, because those are very common approaches. And keeping in mind that most of us are using passwords that are easy for us to remember, the more random-ish, or seemingly random, these passwords get, the harder it is for us to recall them. So, knowing that's a weakness, the cracker can say, all right, well, let's go with all these words, and let's go with the variations we would expect people to use with these words. And even if you've done stuff like just adding a couple of numbers at the end, that's not always a tough thing either. They can start going through all of these different variations, adding various numbers at the end. If they know how many characters your password is, that has already given them a huge advantage.
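The substitution trick described above can be sketched as code. This is a hypothetical, stripped-down version of what a dictionary-attack tool does: take each base word and mechanically generate the predictable "clever" variants.

```python
# Common character swaps people think make a password unguessable.
SUBS = {"a": "@", "e": "3", "o": "0", "s": "$"}

def variants(word):
    """Expand one dictionary word into the variants a cracker would try."""
    guesses = {word, word.capitalize()}
    leet = "".join(SUBS.get(c, c) for c in word)
    guesses.add(leet)
    # Tacking a couple of digits on the end is another common habit.
    for suffix in ("1", "12", "123"):
        guesses.add(word + suffix)
        guesses.add(leet + suffix)
    return guesses

print(sorted(variants("aardvark")))  # includes "@@rdv@rk" and "aardvark123"
```

A real cracking dictionary applies far more transformation rules than this, but the point stands: each "clever" tweak only multiplies the candidate list by a small constant, which a machine barely notices.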
Speaker 1: And the reason why this is possible is because we've got processors out there that can do these calculations in parallel. You know, if you were to do them all one after the other, it might take you centuries to get through all the possibilities of a particular password, depending on how many characters there are within that password. Hey guys, it's Jonathan from 2019. I just hacked into this classic episode, because the password protection was laughable. It was just "palette one two three." So I'm gonna mess around with some stuff, but let's take a quick break while I do that. In Hollywood, computers can do an effective brute-force attack in about twelve seconds. Yeah, well, sometimes that can happen here too, but that's generally not the way it works. Well, that's one of the interesting things about this article: you learn from reading it that an attack like this doesn't take very long at all, assuming that you're not following really, really strong password protocols.
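Here is a toy version of the "one after the other" serial brute force being described, searching every short lowercase string in order. Real crackers spread exactly this search across thousands of parallel GPU threads, which is what collapses centuries into hours.

```python
import itertools
import string

def brute_force(target, max_len=4):
    """Try every lowercase string up to max_len, shortest first,
    until one matches the target password."""
    for length in range(1, max_len + 1):
        for combo in itertools.product(string.ascii_lowercase, repeat=length):
            guess = "".join(combo)
            if guess == target:
                return guess
    return None  # not found within max_len characters

print(brute_force("cab"))  # finds "cab" after roughly two thousand guesses
```

Each extra character multiplies the search space by the alphabet size (26 here), so the loop's cost explodes with length; that asymmetry is the whole argument for longer passwords later in the episode.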
Speaker 1: Yeah, it turns out that, because of this parallel processing, you've got a processor that's working on multiple approaches to this login attempt. So it can go through all these different variations, even when there are billions and billions, as Carl Sagan would say, of variations of passwords; the processor can go through so many so quickly. Each thread in that parallel processing is moving at an incredible rate, and you've got multiple threads all going. There are crackers who use graphics processing units, GPUs, to do this, because GPUs are designed to be parallel processors. Yeah, even though they're designed primarily to handle graphics instructions and display them on your monitor, GPUs can be pressed into service, let's say, by software that can specifically send instructions to them. So what people do is, there are open-source programs that you can use to assign password cracking to your GPU.
Speaker 1: Sad to say. And one of the interesting bits that I read from this article too was that people have grown increasingly intelligent about the way they save cracked passwords. So they're saving up dictionary-attack-type information. And so if you use, you know, "password1" as your password on one site, and they want to hack into your account at the House of Online Grapefruit, since they've got your information, they could try it there too, to see if you've used your password on more than one site. So that makes it increasingly dangerous for you to use the same password in multiple locations, because there is a growing database of password information that people are saving and not just throwing away once an attack is completed. That database also means that they can look at things like frequencies: how frequently are people using this specific word, or variations of this word, as a password? And the more people who use it, the more you're like, all right, well, let's bump this up the list.
Speaker 1: It's more of a likely candidate for a password. So, you know, we like to think that the passwords we choose are unique, but if we're basing it off a name or a word, that's not the case. There are lots of people out there using lots of passwords, and there's a good chance that someone out there is using the same quote-unquote unique password you are. Just remember, you're unique, just like everybody else. You know, when everybody is special, no one is. It's incredible. So, yeah, the database can tell the cracker: all right, well, not only am I using a dictionary attack, but I'm using a curated dictionary attack, in a way, because these are the known passwords that are floating out there in the world, and these are the ones that are really popular, that lots of people use. So we'll go through all the variations of these first. And you just tweak your cracking program to do that, so that you can get the largest number of results in the least amount of time.
Speaker 1: And another thing you can do is, once you've figured out these passwords that are very popular, that helps you determine other things. There are only so many hashing algorithms that are really popular out there in the world of computer security, right? So say you know which hashing algorithm the particular company is using, and you are able to get access to their encrypted password database. Now you've got a list of passwords that are encrypted, so you cannot just look at them and know what the passwords are. But if you are able to determine which security protocol they're using, and you have this massive database of passwords that are really popular, you can run those passwords through the same hashing algorithm, look at the hashes that come out, and then start matching them up with the stuff that was in the database. So you're still cracking the passwords; you're just going about it in a different way. As far as this brute-force attack is concerned, it's still a brute-force attack.
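The hash-matching attack just described can be sketched in a few lines. The popular-password list and the "leaked" digests here are invented for illustration; MD5 stands in for whatever algorithm the breached site used.

```python
import hashlib

def md5_hex(s):
    """Hash a candidate password with the same algorithm the site used."""
    return hashlib.md5(s.encode("utf-8")).hexdigest()

# Precompute digest -> plaintext for a list of known-popular passwords.
popular = ["password", "123456", "letmein", "dragon"]
lookup = {md5_hex(p): p for p in popular}

# A stand-in for a leaked database of hashed passwords.
leaked = [md5_hex("letmein"), md5_hex("correct horse")]

for digest in leaked:
    print(digest, "->", lookup.get(digest, "<not cracked>"))
```

Any leaked digest that matches a precomputed one is cracked instantly, no guessing at login forms required, which is why unsalted hashes of popular passwords offer so little protection.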
Speaker 1: It's just doing it in kind of an odd, roundabout way, because you've got the hash of the password, you've got the security protocol that's being used, and now you're trying to guess the original word that created that hashed password. Once you're able to do that, that account is no longer secure. And again, if you're using that same password elsewhere, those accounts aren't secure either. So you might be asking yourself: hey, if there are crackers out there who have these really advanced tools that can either figure out a password or, you know, have worked down a list so that the passwords I use are vulnerable, how do I protect myself? And there are a few things you can do. One is use a unique password for every service that you log into, which is incredibly difficult if you're doing it on your own, which is why I would suggest getting a password manager program. And there are a lot of them out there. There are some that are free, some that you pay for, some that are in the cloud.
Speaker 1: There are some that are based on your system. Yeah. You use a password manager, right? I do as well. I'll go ahead and say which one I use: I use Dashlane, which I tried out for the first time this year, and I like it well enough. It saves passwords, and if you want, it will generate a password for you, so you don't have to come up with a string of things yourself. It'll do it for you and save it to your account. You create a master password that is a strong password, meaning that there are upper- and lower-case letters and there are also numbers in there, and all you have to do is remember that one. That sounds tricky, but I'll give you a hint on how to do something like that if you want to try it yourself. You create a master password. Then, when you log into your Dashlane account, in my case, you have access to all the other passwords that Dashlane generates.
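A generator like the one being described can be sketched with the standard library. To be clear, this is not Dashlane's actual implementation, just the general idea: draw each character uniformly at random from the full printable set, using a cryptographically secure source.

```python
import secrets
import string

# Letters, digits, and punctuation: roughly 94 printable characters.
ALPHABET = string.ascii_letters + string.digits + string.punctuation

def generate_password(length=10):
    """Pick each character independently and uniformly at random."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

print(generate_password())  # e.g. 'k8#Tq!vZ2m', different on every run
```

Because the characters are chosen by `secrets` rather than by a human, none of the dictionary or substitution tricks above apply; the only attack left is raw exhaustive search.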
So I actually went 326 00:18:57,280 --> 00:19:01,320 Speaker 1: in to all my accounts and used the Dashlane 327 00:19:01,560 --> 00:19:05,359 Speaker 1: password generator program, and it creates a ten character long 328 00:19:05,920 --> 00:19:10,800 Speaker 1: strong password that's unique. So none of my accounts use 329 00:19:10,840 --> 00:19:14,720 Speaker 1: the same ones anymore. They're all ten characters long. They 330 00:19:14,760 --> 00:19:19,879 Speaker 1: are a mix of various characters. And uh, when you 331 00:19:19,960 --> 00:19:24,119 Speaker 1: get to about nine characters, and if it's a truly, 332 00:19:24,520 --> 00:19:27,880 Speaker 1: you know, at least a seemingly random series of characters 333 00:19:27,880 --> 00:19:33,560 Speaker 1: and numbers, uh, the difficulty of cracking that password escalates dramatically. 334 00:19:34,359 --> 00:19:36,600 Speaker 1: So it might go from a matter of days to 335 00:19:36,600 --> 00:19:39,760 Speaker 1: weeks or months. And the harder you make it to crack, 336 00:19:40,560 --> 00:19:45,600 Speaker 1: the more likely your information will be safe, or 337 00:19:45,640 --> 00:19:48,560 Speaker 1: that it will just be difficult for anyone to guess. Um. 338 00:19:48,760 --> 00:19:51,959 Speaker 1: So that's the purpose of creating these strong passwords and 339 00:19:52,000 --> 00:19:55,520 Speaker 1: the purpose for the password managers, because strong passwords are 340 00:19:55,520 --> 00:19:58,959 Speaker 1: hard to remember. Um. So all I have to do 341 00:19:59,000 --> 00:20:01,879 Speaker 1: is remember my one master password. Here's the hint I 342 00:20:01,960 --> 00:20:04,080 Speaker 1: was gonna make.
So if you want to make a 343 00:20:04,280 --> 00:20:10,080 Speaker 1: strong password, like a master strong password, uh, it's best 344 00:20:10,119 --> 00:20:13,240 Speaker 1: that you come up with a phrase that you will 345 00:20:13,280 --> 00:20:17,320 Speaker 1: not forget, and it's great if the phrase also 346 00:20:17,440 --> 00:20:21,879 Speaker 1: has a proper noun somewhere after the first word, so 347 00:20:21,880 --> 00:20:24,360 Speaker 1: that you have some capitals in there as well. And 348 00:20:24,480 --> 00:20:27,160 Speaker 1: you need a number; like a four digit number is best. 349 00:20:28,000 --> 00:20:32,719 Speaker 1: So for example, you might say Dad's first car was 350 00:20:33,280 --> 00:20:38,480 Speaker 1: a nineteen fifty six Volkswagen Bug. Mm. All right. So 351 00:20:38,520 --> 00:20:41,119 Speaker 1: then your password. You take the first letter off of 352 00:20:41,200 --> 00:20:43,800 Speaker 1: each of those words and the number and you put 353 00:20:43,880 --> 00:20:47,600 Speaker 1: them together and that becomes your password. So the first 354 00:20:47,640 --> 00:20:50,960 Speaker 1: letter would be upper case D for Dad's, then first car, 355 00:20:51,040 --> 00:20:54,760 Speaker 1: so it's upper case D, lower case F, lower case C, 356 00:20:55,480 --> 00:20:59,040 Speaker 1: lower case W, lower case A. Then you have the 357 00:20:59,080 --> 00:21:04,159 Speaker 1: one nine five six, and then upper case V, upper case B for 358 00:21:04,320 --> 00:21:08,040 Speaker 1: Volkswagen Bug. That could be your master password. And when 359 00:21:08,040 --> 00:21:10,960 Speaker 1: you look at it as just a string of letters 360 00:21:11,000 --> 00:21:15,080 Speaker 1: and numbers, it looks meaningless. You know, there's no there's 361 00:21:15,119 --> 00:21:18,920 Speaker 1: no phrase that's evident right there immediately unless you happen 362 00:21:19,000 --> 00:21:22,160 Speaker 1: to have already known it.
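The first-letter trick described here can be written down mechanically. A minimal Python sketch (the function name is my own, not from the episode): it keeps the first character of each word, preserving its case, and passes a purely numeric word, like the year, through whole.

```python
def phrase_to_password(phrase: str) -> str:
    # Keep the first character of each word (preserving its case),
    # and pass purely numeric words, such as a year, through whole.
    pieces = []
    for word in phrase.split():
        pieces.append(word if word.isdigit() else word[0])
    return "".join(pieces)

# The episode's example phrase:
print(phrase_to_password("Dad's first car was a 1956 Volkswagen Bug"))
# Dfcwa1956VB
```

Run on the example phrase, it produces Dfcwa1956VB, the same string the hosts spell out on air.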
So don't tell people. You're, oh, 363 00:21:22,200 --> 00:21:26,359 Speaker 1: I gotta change my password. Yeah, but no, don't tell 364 00:21:26,400 --> 00:21:29,359 Speaker 1: people what your phrase is, but make it a phrase that 365 00:21:29,560 --> 00:21:34,159 Speaker 1: is easy to remember. And uh and that could be 366 00:21:34,200 --> 00:21:38,320 Speaker 1: your master password, and don't use it again. Just use 367 00:21:38,359 --> 00:21:41,480 Speaker 1: it for your master password and then use the password 368 00:21:41,520 --> 00:21:44,919 Speaker 1: generator, or a password generator if you don't want to 369 00:21:45,240 --> 00:21:48,159 Speaker 1: trust one thing with it. But it's it's easier to 370 00:21:48,280 --> 00:21:51,480 Speaker 1: use a password manager's on-board password generator because it 371 00:21:51,520 --> 00:21:54,800 Speaker 1: can save it directly to your account. Otherwise you're gonna 372 00:21:54,800 --> 00:21:58,959 Speaker 1: have to transfer that that password to whatever your manager 373 00:21:59,080 --> 00:22:04,480 Speaker 1: is, UM, and then that way you've got a vault 374 00:22:04,520 --> 00:22:10,240 Speaker 1: of passwords that are encrypted, that are ten characters, hopefully 375 00:22:10,280 --> 00:22:12,760 Speaker 1: at least ten characters, nine or ten characters at the 376 00:22:12,920 --> 00:22:18,960 Speaker 1: very least, and are strong. It's funny. It's rather 377 00:22:19,000 --> 00:22:22,080 Speaker 1: than coming up with a mnemonic device to remember your password, 378 00:22:22,119 --> 00:22:25,960 Speaker 1: you start with the mnemonic device and build the password from it. Yeah, 379 00:22:25,960 --> 00:22:28,440 Speaker 1: I think that that's way easier, because I've 380 00:22:28,560 --> 00:22:33,480 Speaker 1: used a password generator before that creates a random string 381 00:22:33,520 --> 00:22:36,840 Speaker 1: of characters and then tells you it's easy to remember this.
382 00:22:37,240 --> 00:22:42,080 Speaker 1: Just remember echo bravo seven delta delta, bro. You know, 383 00:22:42,119 --> 00:22:44,960 Speaker 1: I'm like, where are you from where 384 00:22:45,000 --> 00:22:48,960 Speaker 1: that is easy? How is how is remembering a random 385 00:22:49,000 --> 00:22:53,240 Speaker 1: selection of echoes and bravos and etcetera and numbers easier 386 00:22:53,280 --> 00:22:56,600 Speaker 1: than say, just remembering e e blah blah. You know, 387 00:22:56,640 --> 00:22:59,720 Speaker 1: like that's not easier to me. But this other method 388 00:22:59,720 --> 00:23:03,840 Speaker 1: where you create a mnemonic device first and then convert 389 00:23:03,880 --> 00:23:07,760 Speaker 1: that into a strong password makes way more sense to me. 390 00:23:09,000 --> 00:23:14,000 Speaker 1: And uh again, because you know the output of it 391 00:23:14,119 --> 00:23:18,480 Speaker 1: is a seemingly random string of letters and numbers, uh, 392 00:23:18,600 --> 00:23:22,600 Speaker 1: it's not something that's easy for a computer to guess. Hi, guys, 393 00:23:22,680 --> 00:23:25,560 Speaker 1: it's Jonathan in twenty nineteen. Chris called me up and he 394 00:23:25,640 --> 00:23:29,040 Speaker 1: yelled at me. So I've updated the password. And while 395 00:23:29,040 --> 00:23:31,160 Speaker 1: I'm doing that, we're just, uh, we're gonna take another 396 00:23:31,240 --> 00:23:41,760 Speaker 1: quick break. Well, um, I use 1Password by Agile 397 00:23:41,800 --> 00:23:45,000 Speaker 1: Bits, um, which is a you can get as a 398 00:23:45,040 --> 00:23:48,800 Speaker 1: desktop application for Windows or Mac. UM also works on 399 00:23:48,840 --> 00:23:52,400 Speaker 1: iOS and Android.
UM and uh, you know it has 400 00:23:52,400 --> 00:23:55,400 Speaker 1: a browser plug in too on the desktop, so that 401 00:23:56,280 --> 00:23:58,800 Speaker 1: you uh, say, you visit a site where you have 402 00:23:58,960 --> 00:24:02,879 Speaker 1: a um an account, maybe a shopping site, maybe a 403 00:24:02,880 --> 00:24:05,240 Speaker 1: banking site or something like that for example, so you 404 00:24:05,280 --> 00:24:07,560 Speaker 1: have your log in and password, you have to log 405 00:24:07,560 --> 00:24:09,000 Speaker 1: in. It has a little button, and you press the 406 00:24:09,040 --> 00:24:11,840 Speaker 1: button and it, you know, asks what your overall password is. 407 00:24:11,880 --> 00:24:14,080 Speaker 1: You key in your master password there, and then as 408 00:24:14,119 --> 00:24:17,560 Speaker 1: soon as you uh log in, you'll be given an 409 00:24:17,560 --> 00:24:20,280 Speaker 1: opportunity to log into the site and it submits the 410 00:24:20,320 --> 00:24:23,399 Speaker 1: information for you. Yeah, this is important if you're using 411 00:24:23,480 --> 00:24:27,280 Speaker 1: someone else's computer and you are using a browser 412 00:24:27,320 --> 00:24:30,160 Speaker 1: to navigate to something. And you know, again, if you've 413 00:24:30,160 --> 00:24:34,480 Speaker 1: created these these strong passwords, remembering each one is going 414 00:24:34,520 --> 00:24:36,399 Speaker 1: to be really hard. And it's not 415 00:24:36,480 --> 00:24:39,760 Speaker 1: like you're going to go and install, you know, 416 00:24:40,320 --> 00:24:42,560 Speaker 1: you don't want to install the desktop program on someone 417 00:24:42,600 --> 00:24:46,040 Speaker 1: else's computer. I mean, that's not your job, it's their computer.
418 00:24:46,760 --> 00:24:48,760 Speaker 1: Especially like let's say that you're at a library or 419 00:24:48,800 --> 00:24:50,880 Speaker 1: something and you want to log in and check email, 420 00:24:50,920 --> 00:24:54,760 Speaker 1: but you've used one of these strong password vaults using 421 00:24:55,000 --> 00:24:57,959 Speaker 1: something that has a web browser interface in it, so 422 00:24:58,000 --> 00:25:01,160 Speaker 1: that you can log into the service and access those 423 00:25:01,200 --> 00:25:04,159 Speaker 1: passwords and then log out and those passwords are no 424 00:25:04,240 --> 00:25:08,480 Speaker 1: longer there. That's important. Yeah, yeah, and uh, 1Password 425 00:25:08,520 --> 00:25:10,919 Speaker 1: also gives you the opportunity, 426 00:25:10,960 --> 00:25:15,119 Speaker 1: when you're creating a password, UM, to make it 427 00:25:15,160 --> 00:25:17,439 Speaker 1: as long or as short as you need, really, 428 00:25:17,680 --> 00:25:20,960 Speaker 1: or include symbols, or not include symbols. So one 429 00:25:20,960 --> 00:25:24,880 Speaker 1: of the important tips that this article that that Jonathan 430 00:25:24,880 --> 00:25:28,119 Speaker 1: and I read points out is that eight digit or 431 00:25:28,160 --> 00:25:34,320 Speaker 1: eight character uh passwords are easier to crack than longer ones. 432 00:25:34,400 --> 00:25:37,280 Speaker 1: So if you're you're presented with a a website, you're 433 00:25:37,480 --> 00:25:40,119 Speaker 1: you're filling out the information for the account, it says, oh, well, 434 00:25:40,119 --> 00:25:43,160 Speaker 1: your password needs to be six characters or longer. Don't 435 00:25:43,160 --> 00:25:46,200 Speaker 1: pick a six character password is the simple 436 00:25:46,320 --> 00:25:49,560 Speaker 1: tip for that, whether it's your own or one that uh, 437 00:25:49,800 --> 00:25:54,000 Speaker 1: one of many, many very capable password generators.
Um, yeah. 438 00:25:54,000 --> 00:25:55,919 Speaker 1: As Jonathan said, these are the two that 439 00:25:55,960 --> 00:25:57,960 Speaker 1: we picked, but there are lots of them out there 440 00:25:57,960 --> 00:25:59,760 Speaker 1: that are great. There are a lot of them, and you 441 00:25:59,800 --> 00:26:03,720 Speaker 1: can read reviews of them, and uh, 442 00:26:03,960 --> 00:26:07,760 Speaker 1: and you know, these are companies whose reputation is 443 00:26:07,840 --> 00:26:11,040 Speaker 1: completely built upon how reliable they are and 444 00:26:11,160 --> 00:26:14,240 Speaker 1: how upfront and transparent they are, in the sense that 445 00:26:14,600 --> 00:26:18,720 Speaker 1: they're not using your data themselves to get access to stuff. 446 00:26:18,720 --> 00:26:23,280 Speaker 1: In fact, most of these companies have the information encrypted 447 00:26:23,320 --> 00:26:26,720 Speaker 1: so that they don't have any idea what passwords you 448 00:26:26,960 --> 00:26:29,960 Speaker 1: are using. It's just like we were talking about 449 00:26:30,000 --> 00:26:34,439 Speaker 1: with the the password databases, where all they are is encrypted passwords; 450 00:26:34,760 --> 00:26:37,119 Speaker 1: same sort of thing. They have no way of knowing 451 00:26:37,240 --> 00:26:41,520 Speaker 1: what you chose as your various passwords. They just provide 452 00:26:41,560 --> 00:26:44,119 Speaker 1: the software that lets you 453 00:26:44,160 --> 00:26:46,399 Speaker 1: do it.
So yeah, if you can, if you can 454 00:26:46,520 --> 00:26:49,640 Speaker 1: choose a password manager that allows you to create longer 455 00:26:49,680 --> 00:26:54,280 Speaker 1: passwords and to save them automatically in your database, 456 00:26:54,440 --> 00:26:57,040 Speaker 1: that's a good thing, especially if your database is encrypted 457 00:26:57,040 --> 00:26:58,919 Speaker 1: wherever it is, whether it's in the cloud or on 458 00:26:58,960 --> 00:27:02,199 Speaker 1: your your hard drive or your phone. UM, you know, 459 00:27:02,240 --> 00:27:06,080 Speaker 1: that's important to know. UM. Also one of the 460 00:27:06,440 --> 00:27:08,439 Speaker 1: interesting things, and this is one of those things that 461 00:27:08,560 --> 00:27:15,359 Speaker 1: companies do that make your security more open, 462 00:27:15,440 --> 00:27:19,760 Speaker 1: let's say, to being cracked, is people who for 463 00:27:19,840 --> 00:27:25,000 Speaker 1: their accounts have their email address UM as their user name. 464 00:27:25,040 --> 00:27:28,320 Speaker 1: Because this is sort of the equivalent 465 00:27:28,320 --> 00:27:32,960 Speaker 1: of linking accounts. So you know, let's say somebody 466 00:27:33,040 --> 00:27:36,439 Speaker 1: hacks into UM an account like they did with that 467 00:27:36,960 --> 00:27:40,920 Speaker 1: large shopping provider, the one that had all the uh 468 00:27:41,200 --> 00:27:45,200 Speaker 1: loyalty programs or cards. Uh. If they say, well, 469 00:27:45,240 --> 00:27:49,520 Speaker 1: all they got was people's email addresses. Well, that's an 470 00:27:49,520 --> 00:27:52,440 Speaker 1: important part of the equation.
So maybe they'll start using 471 00:27:52,680 --> 00:27:55,760 Speaker 1: that email address that they got from those loyalty cards 472 00:27:55,800 --> 00:28:00,920 Speaker 1: in accounts with Amazon and Facebook, Google and all these 473 00:28:00,960 --> 00:28:04,320 Speaker 1: other places. They may start figuring out where your accounts are. 474 00:28:04,359 --> 00:28:07,200 Speaker 1: If they can figure out, you know, using that user 475 00:28:07,280 --> 00:28:10,159 Speaker 1: name, and they identify one of the passwords, then the 476 00:28:10,240 --> 00:28:16,000 Speaker 1: dominoes start to fall. So uh, using multiple user names, 477 00:28:16,280 --> 00:28:19,960 Speaker 1: and especially not your email address, if you can arrange that, 478 00:28:19,960 --> 00:28:23,520 Speaker 1: that's very helpful as well. Um, you wouldn't necessarily think 479 00:28:23,520 --> 00:28:25,840 Speaker 1: it right off the bat, but when you think that 480 00:28:26,040 --> 00:28:29,800 Speaker 1: these these people are putting together databases of this information, 481 00:28:30,400 --> 00:28:34,239 Speaker 1: it makes it clear that varying as much information as 482 00:28:34,280 --> 00:28:38,680 Speaker 1: possible is a good idea. Also, changing your passwords regularly. 483 00:28:38,840 --> 00:28:42,080 Speaker 1: Let's say you do have a banking site. Um, you 484 00:28:42,160 --> 00:28:46,840 Speaker 1: have a fifteen character password. It's got four different symbols 485 00:28:46,880 --> 00:28:49,800 Speaker 1: in it, upper and lower case letters and numbers. That's 486 00:28:49,800 --> 00:28:52,840 Speaker 1: pretty secure. You should probably change it every few months, 487 00:28:53,240 --> 00:28:55,320 Speaker 1: just to be on the safe side. This is your 488 00:28:55,320 --> 00:28:57,720 Speaker 1: financial information we're talking about. It's a good idea to 489 00:28:57,760 --> 00:29:00,360 Speaker 1: swap it out. And you know, another nice thing is.
A 490 00:29:00,400 --> 00:29:03,240 Speaker 1: lot of those password managers will even have a you know, 491 00:29:03,320 --> 00:29:07,000 Speaker 1: you can set a reminder on many of them, that 492 00:29:07,360 --> 00:29:09,600 Speaker 1: you know, they'll they'll keep track of when you 493 00:29:09,800 --> 00:29:12,760 Speaker 1: established a particular password and let you know when it's 494 00:29:12,800 --> 00:29:16,040 Speaker 1: time you should change it up. And again, if you're 495 00:29:16,080 --> 00:29:18,280 Speaker 1: using one of these that has a password generator as 496 00:29:18,360 --> 00:29:20,640 Speaker 1: part of it, then all it takes is logging in, 497 00:29:21,200 --> 00:29:23,840 Speaker 1: and uh, often it'll go ahead and fill out the 498 00:29:24,520 --> 00:29:26,560 Speaker 1: forms that you need already and then you just press 499 00:29:26,560 --> 00:29:29,040 Speaker 1: a little button to generate a new password. It will 500 00:29:29,080 --> 00:29:31,880 Speaker 1: save the new password to your account. So I mean 501 00:29:31,880 --> 00:29:34,080 Speaker 1: it's something that takes five seconds once you've set it up 502 00:29:34,120 --> 00:29:37,480 Speaker 1: the first time. And uh, you know, five seconds of 503 00:29:37,600 --> 00:29:42,760 Speaker 1: effort to keep crackers at bay is not a bad idea. Uh. 504 00:29:42,760 --> 00:29:46,480 Speaker 1: And keep in mind also that as GPUs become more sophisticated, 505 00:29:47,160 --> 00:29:51,720 Speaker 1: um, as software gets more sophisticated, as these algorithms 506 00:29:51,720 --> 00:29:55,240 Speaker 1: get more sophisticated, it's gonna get harder and harder to 507 00:29:55,640 --> 00:29:58,920 Speaker 1: protect the password. You know, you can play the game 508 00:29:58,960 --> 00:30:03,480 Speaker 1: of adding more characters, which does uh increase the 509 00:30:03,480 --> 00:30:09,800 Speaker 1: difficulty significantly to get a positive hit.
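The payoff from adding characters is just exponent arithmetic. A rough sketch, assuming an alphabet of the 94 printable ASCII characters and an illustrative (not measured) rate of one billion guesses per second:

```python
ALPHABET_SIZE = 94        # printable ASCII characters
GUESSES_PER_SECOND = 1e9  # illustrative cracking rig, not a measured figure

def brute_force_seconds(length: int) -> float:
    # Worst case: an attacker tries every possible string of this length.
    return ALPHABET_SIZE ** length / GUESSES_PER_SECOND

for n in (6, 8, 10):
    secs = brute_force_seconds(n)
    print(f"{n} characters: about {secs / 31_536_000:.2f} years")
```

Under these assumptions, six characters fall in minutes, eight characters in a couple of months, and ten characters take centuries; each extra character multiplies the attacker's work by 94.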
So uh, you know, 510 00:30:09,960 --> 00:30:14,080 Speaker 1: we we can stay ahead just by adding longer and 511 00:30:14,120 --> 00:30:17,920 Speaker 1: longer passwords as we go along. But you know, that's 512 00:30:17,920 --> 00:30:19,920 Speaker 1: a game where ultimately we're gonna have to sit there 513 00:30:19,960 --> 00:30:21,719 Speaker 1: and say we need to find a new way to 514 00:30:22,080 --> 00:30:26,880 Speaker 1: protect stuff, because the problem is that, you know, you're, you're, 515 00:30:27,000 --> 00:30:28,680 Speaker 1: you're just playing a game of cat and mouse at 516 00:30:28,680 --> 00:30:32,080 Speaker 1: that point. And you know, we've talked about quantum computers 517 00:30:32,120 --> 00:30:35,200 Speaker 1: a few times. One of the potential things a quantum 518 00:30:35,200 --> 00:30:39,880 Speaker 1: computer could be very good at is cracking codes, because 519 00:30:40,120 --> 00:30:46,280 Speaker 1: a quantum computer is also really well equipped for 520 00:30:46,440 --> 00:30:52,080 Speaker 1: parallel processing. Um. So that's something else to think about. 521 00:30:52,160 --> 00:30:54,600 Speaker 1: Now granted, right now, quantum computers are still 522 00:30:55,440 --> 00:30:59,440 Speaker 1: largely theoretical. There are a few working examples, but they're 523 00:31:00,040 --> 00:31:03,920 Speaker 1: notoriously difficult to design and even more difficult to maintain, 524 00:31:04,640 --> 00:31:09,440 Speaker 1: because, you know, the slightest alteration and the 525 00:31:09,480 --> 00:31:13,160 Speaker 1: whole decoherence problem becomes apparent. Yeah, either it is or 526 00:31:13,160 --> 00:31:18,200 Speaker 1: it isn't, or maybe somewhere in between. Um. Yeah, and uh.
527 00:31:18,240 --> 00:31:21,240 Speaker 1: I also read another article on Ars Technica by 528 00:31:21,240 --> 00:31:24,520 Speaker 1: the same author actually, where they had discovered that in 529 00:31:24,840 --> 00:31:29,240 Speaker 1: versions of Windows seven and eight, um, it's possible to 530 00:31:29,320 --> 00:31:35,560 Speaker 1: get hold of people's security questions. Uh. Now, uh, that 531 00:31:35,720 --> 00:31:38,760 Speaker 1: sounds, I think it's easy for that to come off as a 532 00:31:38,840 --> 00:31:41,479 Speaker 1: negative, like it's a negative against Microsoft, and 533 00:31:41,520 --> 00:31:43,720 Speaker 1: I guess in a way it is. But it assumes 534 00:31:43,800 --> 00:31:47,720 Speaker 1: first that the person has the other person's computer. You would 535 00:31:47,720 --> 00:31:50,080 Speaker 1: actually have to have their computer to get it, and 536 00:31:50,160 --> 00:31:53,719 Speaker 1: you'd also have to know how to retrieve that information. 537 00:31:53,960 --> 00:31:56,280 Speaker 1: But that goes back to our discussion of Matt Honan too, 538 00:31:56,280 --> 00:32:00,160 Speaker 1: because, you know, a lot of these security words 539 00:32:00,600 --> 00:32:02,720 Speaker 1: that you set up to talk to people on the 540 00:32:02,720 --> 00:32:06,160 Speaker 1: phone about your accounts, or you set them up online, 541 00:32:06,480 --> 00:32:08,959 Speaker 1: you know, what's the name of your first pet? You know, 542 00:32:09,080 --> 00:32:12,320 Speaker 1: and you put in your first dog's name, and then 543 00:32:12,400 --> 00:32:15,520 Speaker 1: you use that in multiple places. That was 544 00:32:15,560 --> 00:32:18,280 Speaker 1: what enabled them to get hold of that information.
If 545 00:32:18,320 --> 00:32:20,719 Speaker 1: this person got hold of your computer and was able to 546 00:32:20,760 --> 00:32:24,800 Speaker 1: pull that out from the log in help, they could 547 00:32:24,840 --> 00:32:27,120 Speaker 1: use that on your accounts too. So it might be 548 00:32:27,160 --> 00:32:29,800 Speaker 1: good to use some reverse social engineering. And 549 00:32:29,840 --> 00:32:32,720 Speaker 1: when someone asks you what, uh, the 550 00:32:32,800 --> 00:32:35,680 Speaker 1: name of your first dog was, or first pet was, 551 00:32:36,080 --> 00:32:41,120 Speaker 1: you put your favorite UH form of salad dressing in 552 00:32:41,160 --> 00:32:44,720 Speaker 1: there instead, something something unusual that they wouldn't be able 553 00:32:44,760 --> 00:32:48,400 Speaker 1: to pick. Which, by the way, obviously is 554 00:32:48,440 --> 00:32:52,200 Speaker 1: a blast when you have to call because you've forgotten 555 00:32:52,200 --> 00:32:55,880 Speaker 1: your password stuff, and you call in and then they're like, so, 556 00:32:56,840 --> 00:33:03,560 Speaker 1: what's your favorite pet's name? Paul Newman's Thousand Island dressing. Yeah, 557 00:33:03,640 --> 00:33:09,360 Speaker 1: that's right.
Well, I'll tell you, and 558 00:33:09,440 --> 00:33:12,200 Speaker 1: anybody who's frustrated by this conversation will tell you, 559 00:33:12,240 --> 00:33:18,120 Speaker 1: that using these super secure passwords and obviously obfuscatory 560 00:33:18,680 --> 00:33:22,520 Speaker 1: material here is a pain in the neck, because you know, 561 00:33:22,560 --> 00:33:24,240 Speaker 1: if you don't have your password manager with 562 00:33:24,240 --> 00:33:26,160 Speaker 1: you when you're on a friend's computer logging in to 563 00:33:26,280 --> 00:33:28,520 Speaker 1: check your mail and it's got some kind of thirty 564 00:33:28,560 --> 00:33:32,680 Speaker 1: two character weird password and you don't remember it, and 565 00:33:32,760 --> 00:33:35,160 Speaker 1: you're going, man, I know no one's ever going to 566 00:33:35,240 --> 00:33:38,720 Speaker 1: crack into this computer. It's a friend's computer, I'm fairly safe. Well, yeah, 567 00:33:38,760 --> 00:33:42,480 Speaker 1: you probably are fairly safe. But it's probably worth the 568 00:33:42,560 --> 00:33:45,640 Speaker 1: frustration, more so than it will be having to 569 00:33:45,640 --> 00:33:48,280 Speaker 1: put out all the fires of all the account information 570 00:33:48,280 --> 00:33:50,520 Speaker 1: that you could be giving up otherwise. And it's not 571 00:33:50,800 --> 00:33:53,480 Speaker 1: so much worrying about your friend's computer as it is 572 00:33:53,520 --> 00:33:56,600 Speaker 1: worrying about that database that's on the other end of 573 00:33:56,600 --> 00:34:02,200 Speaker 1: this password system, because uh, the more passwords a company 574 00:34:02,240 --> 00:34:04,640 Speaker 1: accumulates as more and more people use its service, the 575 00:34:04,680 --> 00:34:07,360 Speaker 1: more attractive it is as a target to crackers. And 576 00:34:08,200 --> 00:34:10,760 Speaker 1: they're doing, you know, that's what they do.
They 577 00:34:10,800 --> 00:34:13,759 Speaker 1: look at systems and try and find ways of penetrating them. 578 00:34:13,880 --> 00:34:17,839 Speaker 1: So you know, they're not they're not worried about 579 00:34:17,840 --> 00:34:21,239 Speaker 1: getting your your buddy Bill's computer. They're looking at, you know, 580 00:34:21,960 --> 00:34:25,080 Speaker 1: like MegaCorp that has all those passwords in it. 581 00:34:25,120 --> 00:34:29,000 Speaker 1: That's what they want. So you know, using that easy password, 582 00:34:29,440 --> 00:34:36,200 Speaker 1: while it's convenient, is also ultimately a dangerous thing. And 583 00:34:36,640 --> 00:34:39,279 Speaker 1: you know, I gotta I gotta admit, like, for a 584 00:34:39,480 --> 00:34:44,799 Speaker 1: very long time, I had pretty poor password protection. I 585 00:34:44,800 --> 00:34:46,839 Speaker 1: mean I just, I did not, I 586 00:34:46,880 --> 00:34:50,200 Speaker 1: was not very good about it at all. Even as 587 00:34:50,280 --> 00:34:54,759 Speaker 1: we were telling people to change your passwords, I still wasn't doing 588 00:34:54,800 --> 00:34:57,080 Speaker 1: as as good a job as I should have. Do you 589 00:34:57,080 --> 00:34:59,120 Speaker 1: back up your hard drive regularly? Oh yes, I do, 590 00:34:59,600 --> 00:35:02,359 Speaker 1: I do. Good. Well, the Mac hard drive; 591 00:35:03,239 --> 00:35:05,200 Speaker 1: my my PC hard drive I do not back up 592 00:35:05,239 --> 00:35:07,360 Speaker 1: as regularly as I should, which really, I need to 593 00:35:07,360 --> 00:35:11,040 Speaker 1: start doing that. But it's a pain in the neck. But 594 00:35:11,040 --> 00:35:14,640 Speaker 1: but cloud services have made that really a lot better now, 595 00:35:14,719 --> 00:35:17,040 Speaker 1: you know. The cloud, of course, has its own set 596 00:35:17,040 --> 00:35:19,319 Speaker 1: of problems, which we've talked about in previous podcasts.
But 597 00:35:19,400 --> 00:35:22,120 Speaker 1: everything technological has its own set of problems. You just 598 00:35:22,160 --> 00:35:24,720 Speaker 1: have to decide which ones are the most acceptable set of 599 00:35:24,719 --> 00:35:28,240 Speaker 1: problems for you. So, but I have I have switched. 600 00:35:28,280 --> 00:35:31,960 Speaker 1: I mean, I am now, I am wholeheartedly in this. 601 00:35:32,600 --> 00:35:35,280 Speaker 1: Let's protect our passwords, especially after seeing what happened to Honan. 602 00:35:36,120 --> 00:35:38,359 Speaker 1: I mean, you and I are in the public eye. 603 00:35:38,680 --> 00:35:42,239 Speaker 1: We're not celebrities by any stretch of the imagination. But 604 00:35:42,520 --> 00:35:46,120 Speaker 1: it's not, um, it's not outside 605 00:35:46,160 --> 00:35:48,879 Speaker 1: the realm of possibility that someone at some point could say, 606 00:35:48,880 --> 00:35:51,400 Speaker 1: you know what would be funny? Well, and and it 607 00:35:51,480 --> 00:35:53,319 Speaker 1: just really takes somebody getting a hold of your name. 608 00:35:54,040 --> 00:35:56,839 Speaker 1: That's why they tell people to shred when you have 609 00:35:56,920 --> 00:35:58,640 Speaker 1: junk mail or something with your name on it, 610 00:35:58,680 --> 00:36:01,200 Speaker 1: to shred that information. I've got one of those, too. 611 00:36:01,360 --> 00:36:04,480 Speaker 1: You never know when somebody's gonna go and, you know, 612 00:36:04,600 --> 00:36:06,600 Speaker 1: say Jonathan Strickland. I think there's a bunch of people 613 00:36:06,680 --> 00:36:10,040 Speaker 1: named that; actually there are. So one of them got 614 00:36:10,040 --> 00:36:13,200 Speaker 1: booked in North Atlanta for something a couple of weeks ago, 615 00:36:13,280 --> 00:36:15,920 Speaker 1: but it wasn't me.
You want to ask how I know 616 00:36:16,000 --> 00:36:19,040 Speaker 1: that? I know I'm not on the lam because I've got a Google 617 00:36:19,040 --> 00:36:22,840 Speaker 1: alert set to my name. All right, that wraps up 618 00:36:22,840 --> 00:36:25,600 Speaker 1: another classic episode. I think we've all learned a valuable lesson. 619 00:36:25,719 --> 00:36:28,040 Speaker 1: I know I have. I know I learned that Chris 620 00:36:28,040 --> 00:36:31,040 Speaker 1: remembers my phone number, for example. Well, I hope you 621 00:36:31,040 --> 00:36:33,160 Speaker 1: guys enjoyed it. If you have any suggestions for future 622 00:36:33,200 --> 00:36:35,760 Speaker 1: episodes of tech Stuff, reach out to me. The address 623 00:36:35,800 --> 00:36:38,840 Speaker 1: is tech stuff at how stuff works dot com, or 624 00:36:38,880 --> 00:36:41,719 Speaker 1: pop on over to our website that's tech stuff podcast 625 00:36:41,840 --> 00:36:45,360 Speaker 1: dot com. You'll find links to our presence on social media. 626 00:36:45,680 --> 00:36:48,000 Speaker 1: You'll find an archive of all of our past episodes. 627 00:36:48,040 --> 00:36:51,480 Speaker 1: You'll find a link to our online store where you 628 00:36:51,520 --> 00:36:55,080 Speaker 1: can buy tech Stuff merch. Get some tech stuff swag 629 00:36:55,800 --> 00:37:00,400 Speaker 1: handed out like it's Christmas. Please, because every purchase you 630 00:37:00,400 --> 00:37:02,440 Speaker 1: make goes to help the show. We greatly appreciate it, 631 00:37:02,600 --> 00:37:05,799 Speaker 1: and I will talk to you again really soon. 632 00:37:09,560 --> 00:37:11,759 Speaker 1: Tech Stuff is a production of I Heart Radio's How 633 00:37:11,800 --> 00:37:15,200 Speaker 1: Stuff Works. For more podcasts from iHeartRadio, visit 634 00:37:15,239 --> 00:37:18,319 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you 635 00:37:18,360 --> 00:37:19,720 Speaker 1: listen to your favorite shows.