Speaker 1: The genesis of this episode, basically, is that I've been having trouble convincing people to be better about their security. And I have a hypothetical person. We're just gonna call this person Jeremy. Jeremy told me about how they use the same password across multiple services, multiple logins, and what's the big deal? Who would hack little old me? And I've been trying to explain to Jeremy, yo, it's not that simple, and I'm struggling. So here we are.

Speaker 2: Yeah, security is a big problem, and it's pretty complex, and I think a lot of people are kind of frustrated with the state of affairs.

Speaker 1: Josh Blackwelder is a deputy Chief Information Security Officer at SentinelOne. He's colleagues with Alex Stamos, who you might remember from our xz Utils episode, the one about the biggest hack that never happened. And Josh says I'm not crazy, that it is getting harder and harder to protect ourselves from hackers, but it's also getting harder to get normal people to take this stuff seriously.

Speaker 2: If you have to create a password, make it strong, make it sixteen characters, use a passphrase.
Speaker 3: You know.

Speaker 2: "My favorite place to visit is" and blah blah, with a bunch of numbers. That's the legacy password sort of mindset. But depending on how they have their data, the cracking machines are so powerful now, with the advent of the latest Nvidia processors, that it doesn't really matter the length that you choose. The horsepower is such that eventually they'll get through there and they'll find your password.
Speaker 1: So yeah, this how-to episode is for all the Jeremies out there that know they should do something about their passwords, but it just kind of feels overwhelming. Or maybe somebody sent this to you, and this is how you're finding out that you are the Jeremy in somebody's life. In that case, welcome to the show. This is not meant to shame you or anything like that. We're gonna work through this together. But even if you think you have a good handle on your passwords and your security and all that, I think there's still something in here for you, because as hackers get more sophisticated, the way we communicate with each other gets more important. From Kaleidoscope and iHeart Podcasts, this is Kill Switch.

Speaker 3: I'm Dexter Thomas.

Speaker 1: To be online today means having a bunch of digital accounts, and each one has its own password to remember. It's just part of life now. But it wasn't always like this.

Speaker 2: You think about the early days of computing, you really had very few passwords. You didn't have social media accounts, you didn't have any of that stuff. It was just, like, the World Wide Web. It was just this free wonderland, right. And then these sort of paid services that had more personal data started to require more logins, and, you know, the advent of online banking. Now we use hundreds of different applications, and so what you're left with is this sort of massive sprawl of usernames and passwords.

Speaker 1: There was a time when you only had a handful of accounts: your email, maybe a Facebook, maybe an online banking account. But now there's Netflix, Amazon, Home Depot, Target. Yo, even the laundry room in my apartment building tried to make me set up a password so I could wash my clothes. It's kind of out of control.
Speaker 1: And you used to be able to just remember your passwords, but there's too many now, so you might be tempted to just use the same password for everything. It's more convenient. So in this episode, we're going to try to convince you to stop doing that and to change your habits, because the old way might feel convenient, but it is not safe.

Speaker 2: It's just, the landscape is too wide. You have too many passwords at this point, like, it's impossible. You can't make them unique enough and strong enough per site.

Speaker 1: What would be the worst case scenario, or a possible scenario, if somebody gets into one of your accounts? Because I'll try to tell somebody, hey, look, if you're using the same password on your email and your bank or something like that, this is a problem, and the response is, look, man, if somebody wants to read my email, whatever. Who am I? Nobody's looking for my email. What would be a nightmare scenario for somebody getting into even an old forum account or something they don't even use?
Speaker 2: So, depending on if that old forum allows movement into their Gmail account: their email account access is probably your worst case scenario as an individual, because that's sort of the reset point for all of the services that you utilize. Once an attacker has access to your Gmail, they can see your sign-ups for all these accounts in different places, right? So they can quickly search for, like, what bank you use, what insurance you use, what social media, and then they can see what those usernames are and start targeting those other services. And even if you use a different password, they can reset back to that email account and gain access there. So in terms of where you want to secure your life the most, first is really your email.

Speaker 1: So the point I want to make to my hypothetical friend Jeremy here is: you might think, hey, a hacker is not interested in me personally. Who would spend time hacking me? What could they possibly want from me? And you're kind of right. Unless you're a high-profile person, hackers aren't looking specifically for you. But let me just give you a scenario here.
Speaker 1: Let's say you have an old MySpace account that you have not logged into for years, and I'm a hacker and I'm on the dark web, and I see that someone's leaked a list of three hundred and sixty million people's account information.

Speaker 2: Cool.

Speaker 1: I can grab that and run a program that would automatically try to get into each of those people's email accounts. Let's say that only point zero zero one percent of those people use the same passwords for MySpace and their email account, and you are one of them. Now I have access to your email, and boom, now I've got access to your bank. And now, yes, Jeremy, I am looking for you, because you do have something I want: money. And this didn't take me much time at all. And that's just one scenario, and by the way, it is a real scenario. Back in twenty sixteen, a list of three hundred and sixty million MySpace username and password combos was leaked. This was back when a lot of people probably were using the same passwords for just about everything, and maybe they still do today.
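The arithmetic behind that scenario is worth making concrete. A quick sketch: the 360 million figure is the leaked list from the episode, and the 0.001 percent reuse rate is the host's hypothetical, not a measured number.

```python
# Back-of-the-envelope math for the credential-stuffing scenario above.
leaked_accounts = 360_000_000   # size of the leaked MySpace list
reuse_rate = 0.001 / 100        # "point zero zero one percent" reuse (hypothetical)

compromised = leaked_accounts * reuse_rate
print(f"~{compromised:,.0f} email accounts opened from a single leaked list")
```

Even a vanishingly small reuse rate against a big enough list yields thousands of takeovers, which is why attackers bother running these lists at all.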
Speaker 1: And this is why we use different passwords for different accounts: to limit the damage in case your data is leaked. And even if your data wasn't leaked in the MySpace hack, there's a good chance it's been leaked somewhere else in the past. And you can see this for yourself. Try this. Open up a browser and type in haveibeenpwned dot com. That's have-i-been-pwned dot com. And type in your email address to see if your account pops up in any known hacker lists. You'll probably see your email address pop up a lot. Mine popped up twenty seven times. So hopefully that is inspiration to take this stuff seriously, because even if you think you're not worth hacking, you might still be an access point for a larger target. Maybe you're not that interesting. Is the place you work interesting to somebody? For example, do you work at a small business? Do you work in a government office? Somebody would very much like to get into your personal computer or your phone, and then get into your workplace's computer, because your workplace probably has a whole lot more money than you do. Yeah.
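If you're wondering how a breach-checking site can test a password without ever seeing it, the Pwned Passwords side of haveibeenpwned.com uses a k-anonymity scheme: your machine hashes the password and sends only the first five hex characters of the hash. A minimal sketch; the network call itself is omitted here, and the real check would fetch the URL shown and look for the local suffix in the response.

```python
import hashlib

# k-anonymity sketch: only a 5-character hash prefix would ever leave your machine.
password = "password123"  # illustrative value only
digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
prefix, suffix = digest[:5], digest[5:]

print("sent to the server:", f"https://api.pwnedpasswords.com/range/{prefix}")
print("kept local        :", suffix)
```

Hundreds of stored hashes share any given prefix, so the server learns almost nothing about which password you checked.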
Speaker 2: I think people vastly underestimate the risk for small businesses that deal with high volumes of cash, and especially doctors' offices, dentists' offices, orthodontists. Like, these folks typically bring in a very high amount of revenue and have pretty low security controls, and the impact of disruption to their business is so great that they're more inclined to pay ransom fees. Right, right.

Speaker 1: If somebody says, yo, I got all your data, give me, yeah, give me one hundred K, give me three hundred K, they'll just do it.

Speaker 2: Yeah.

Speaker 1: Oh yeah. So what do you do? You don't want to get hacked, but you don't want to have to remember a bunch of difficult-to-remember passwords. This is where a password manager comes in. So I'm gonna do something real quick, and you can try this with me if you want. I'm going to open up a password evaluator. This is a tool that will tell you how long it would take for a hacker to guess your password using a brute force attack.
Speaker 1: So, for example, let me just type in the name of my first pet and my birthday. So it says it would take three seconds to crack that. Okay, that's bad. Next, I'm just gonna mash a bunch of random keys. Now it says it would take centuries to crack. That is way better, very secure. And that is the kind of password that a password manager can generate for you automatically. You might be thinking, okay, wait, how am I going to remember a bunch of random keypresses? Here's the thing. A password manager will remember it for you. It sits on your phone or your desktop, and when you go to any website, it inputs it for you. You don't have to remember anything. There's a lot of these apps. There's LastPass, there's 1Password. Those have annual fees, but hey, it's cheaper than getting hacked. There's also Bitwarden, which does the same thing. It's open source and it's totally free. Or, another way: most browsers, like Google Chrome, also have versions of this built in now. And if you use an Apple device, you've got iCloud Keychain. Now, this all sounds easy, and that's because it is.
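The evaluator the host describes boils down to simple math: count which character classes the password uses, raise that alphabet size to the password's length, and divide by a guess rate. A rough sketch, where the guess rate and the two sample passwords are purely illustrative assumptions; real cracking rigs vary enormously, and dictionary attacks crack predictable passwords far faster than this worst case suggests.

```python
import string

def crack_time_years(password: str, guesses_per_second: float = 1e10) -> float:
    """Worst-case brute-force time to exhaust the password's search space, in years."""
    alphabet = 0
    if any(c in string.ascii_lowercase for c in password):
        alphabet += 26
    if any(c in string.ascii_uppercase for c in password):
        alphabet += 26
    if any(c in string.digits for c in password):
        alphabet += 10
    if any(c in string.punctuation for c in password):
        alphabet += len(string.punctuation)
    combinations = alphabet ** len(password)
    return combinations / guesses_per_second / (3600 * 24 * 365)

# A "pet name + birthday" password versus a random 16-character mash:
print(f"rex1990            : {crack_time_years('rex1990'):.1e} years")
print(f"k7$Qz!p9Vu@2LmX4   : {crack_time_years('k7$Qz!p9Vu@2LmX4'):.1e} years")
```

The gap between the two is the whole argument for generated passwords: each extra character multiplies the search space by the alphabet size.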
Speaker 1: If you want to start now, seriously, you can just pick one of those services I mentioned. They're all really good at teaching you how to use them. So, for example, if your email showed up on that haveibeenpwned dot com site, start there. Change the password for that email account, and this app will help you do it. By the time we're back from the break, you'll be on your way. Again, this is easy stuff, but sometimes when I encourage my friends to do this, I get a little bit of resistance. So I've spent a lot of time telling people, hey, look, you should use a password manager, you should use a password manager. And one of the concerns that I've heard back is, they say, hold on, wait a second. So you're telling me I'm just gonna make one really great password, and inside of that vault is going to be stored everything that I use? Well, what if that one gets hacked? And I've always said, well, that won't happen. But it kind of did.

Speaker 2: Yeah.
Speaker 1: So this is where my pitch for using a password manager gets kind of complicated, because the one thing that people were most worried about actually happened. That's after the break. Okay, maybe I scared you back there, but seriously, the great thing about password managers is the vault. The vault is where those passwords, like your Netflix, your Instagram, your Amazon, your laundry room, all those passwords are kept, and they're encrypted. That means even if someone somehow steals that vault, they can't just read the passwords, not without decrypting them first. Only you have the key to that vault to decrypt it. And this key can be your fingerprint or your own single master password to get into it. And this password can be a really long one, like, say, a full sentence that nobody else would guess. So, for example, if your password was "Gwen Stefani was the number three most overhyped member of Wu-Tang Clan," that is factually incorrect, but it is memorable, and it is a fantastic length for a password.
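Under the hood, the "key to the vault" usually isn't stored anywhere: it's derived from your master password with a deliberately slow key-derivation function, so every guess against a stolen vault costs real compute. A stdlib sketch using PBKDF2; the iteration count here is an illustrative assumption, and real managers use their own tuned parameters (PBKDF2, scrypt, or Argon2).

```python
import hashlib
import os

salt = os.urandom(16)   # stored next to the vault; it is not a secret
iterations = 600_000    # illustrative; vendors tune this number upward over time

def vault_key(master_password: str) -> bytes:
    """Derive the 32-byte key that encrypts and decrypts the vault."""
    return hashlib.pbkdf2_hmac(
        "sha256", master_password.encode("utf-8"), salt, iterations
    )

real = vault_key("Gwen Stefani was the number three most overhyped member of Wu-Tang Clan")
guess = vault_key("Gwen Stefani was the number two most overhyped member of Wu-Tang Clan")
print("near-miss guess opens the vault:", real == guess)  # False
```

A one-word miss produces a completely unrelated key, and each attempt burns hundreds of thousands of hash rounds, which is exactly the property that makes offline cracking expensive.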
Speaker 1: You only need to remember that one, and even at the rate of a million guesses per second, that could take trillions of years to crack. If I'm a hacker, it is not worth it. I'm moving on and looking for somebody else. Of course, you still don't want anyone to get a hold of your password vault, even if it is encrypted. But that's exactly what happened. In twenty twenty two, a very popular service called LastPass had a major security breach. Hackers stole customer information and encrypted password vaults. This is not good.

Speaker 2: Unfortunately, they suffered a breach where one of their senior engineers' laptops was compromised, and they were able to extract data from the customer environment. This data wasn't fully encrypted. So the vaults were encrypted, but some of the metadata involved in some of the vaults was not encrypted, like the customer's email address, and some of the line items within the vault, like what they had entries for. There was some metadata associated with that, so they could see, like, this person's email address, oh, and by the way, they have a Coinbase entry.
Speaker 1: So Coinbase, if you're not familiar with this, this is a site where people can buy and also store cryptocurrency. Oversimplifying a lot here, just think of it as a bank where some people keep their bitcoin. And if you figure out that someone has an account in this bank, well, you can imagine where this is going.

Speaker 2: So now you have a lot more information to sort of triangulate your attack and say, like, I want to decrypt this vault, and I want to focus my cracking on this particular vault. And so that's basically what happened.

Speaker 3: I see.

Speaker 1: So basically, what hackers were able to obtain wasn't necessarily passwords themselves, but they were able to find, okay, I can see the email address, and I can tell that they have an account at such-and-such bank. And if they can figure out high-profile targets within that, they can tell, oh, this email address, I know who this person is. I think they're rich. Yeah, let me throw some computing power at figuring out what their password is.
Speaker 2: And they used multiple avenues of attack. So attacking the vault, for, you know, did they use a bad password that's easily cracked, so try to attack the vault, but also very targeted phishing emails. If you know that they're at a particular bank, and you also know that they have all these other accounts, it's very easy to craft an email saying we noticed a fraudulent attempt at this service that they had an entry for, from this bank. And now it looks like a very legitimate email from the bank, and they're more likely to click and get phished with that information. Like, they send a fake email from Bank of America saying we had a problem with your utility payment to this utility.

Speaker 3: Aha.

Speaker 2: Right. So now you're starting to, like, make it more likely that the user is possibly going to click.

Speaker 1: So here's what happened. In September of twenty twenty three, security researchers found that the previous year's LastPass breach led to thirty five million dollars in cryptocurrency being stolen from more than one hundred and fifty victims.
Speaker 1: Then in January of twenty twenty four, it was reported that one hundred and fifty million dollars in crypto was stolen from one person, the co-founder of a crypto platform, obviously a very high-profile individual. The access point there: hacked LastPass data. And I should say here, Josh was talking about a theoretical extra step that hackers could use to trick you into giving up your information, like sending you emails that look like they're from your bank. If I was a hacker, honestly, that's the route that I would go. But in these cases, the hackers didn't have to do that. Security researchers found that some of the people who'd lost the most money didn't get hit with social engineering attacks. They had used weak master passwords. See, now, when you sign up with LastPass or basically any other password manager, it makes you choose a really complicated master password. Now, for years, LastPass has recommended a twelve character minimum password as a default, but if you were lazy, you could override that and choose a shorter and simpler one.
Speaker 1: That made those people more realistic targets for a brute force attack. How would a hacker get that main password?

Speaker 2: So, because they had the vaults, now you're able to put that on your own hardware, either at home or in the cloud, and run millions of cycles against it. If you're just trying to cycle through twelve to sixteen characters and you've got enough horsepower, some of these new machines are so fast they can get through, especially if you haven't used a great password.

Speaker 1: This truly was a nightmare scenario. Now, LastPass says that they've updated their security measures and changed the data encryption process to protect users from a future breach. And there is one major change that we can all see. Last year, LastPass forced all of their users, even their day-one customers, to update their master password to something that is at least twelve characters. You cannot override this. You are being forced to be more secure. This is a good thing. But this incident, if you're hearing about it, I understand if it maybe turns you away from using a password manager.
Speaker 1: Maybe it makes you want to go back to just trying to remember your passwords, or using the same one on multiple sites, or writing it down in your notes app. Please do not do that. Security experts, including Josh, will all still recommend a password manager over any of those options I just mentioned. I mean, would you, personally, at this point, is LastPass something you suggest to people?

Speaker 2: Of course.

Speaker 1: Yeah? Really? Even despite that?

Speaker 2: Yeah. They're a business that's, you know, there to service their clients, and they're going to rotate very quickly to a more secure posture. They're going to fix the holes, they're going to scrub all the things that they're offering and make sure that they're secure. So yeah, it's probably a great time to use the service. At this point, some people just never feel comfortable with that, and there's other alternatives as well. You know, there's 1Password, and there's the free alternative on your phone, the password manager built into iOS.

Speaker 1: Yeah.
Well, for people who you speak to just in 333 00:18:42,080 --> 00:18:46,639 Speaker 1: personal life, are there certain ones that you recommend over others? 334 00:18:47,000 --> 00:18:50,080 Speaker 2: I personally prefer the built-in one on the iPhone 335 00:18:50,359 --> 00:18:53,840 Speaker 2: for myself. I know a lot of folks really like 336 00:18:53,920 --> 00:18:57,439 Speaker 2: 1Password, and that's a very popular one in the 337 00:18:57,440 --> 00:19:02,440 Speaker 2: commercial environments as well. But you know, largely this concept 338 00:19:02,480 --> 00:19:08,320 Speaker 2: of like creating a sixteen character password has been surpassed 339 00:19:08,359 --> 00:19:13,479 Speaker 2: by this technology called passkeys, which aims to remove the 340 00:19:13,560 --> 00:19:17,879 Speaker 2: need to keep track of these passwords, and they're also 341 00:19:18,080 --> 00:19:21,719 Speaker 2: stored in a very safe way within your digital devices. 342 00:19:23,960 --> 00:19:27,640 Speaker 1: So yes, password managers are still recommended. I use one, 343 00:19:28,080 --> 00:19:30,560 Speaker 1: but the security community has started to move on to 344 00:19:30,640 --> 00:19:33,359 Speaker 1: something that is more secure: passkeys. 345 00:19:34,320 --> 00:19:40,080 Speaker 2: Passkeys are a technology that was invented by a consortium 346 00:19:40,080 --> 00:19:45,080 Speaker 2: of folks: Apple, Google, Microsoft, Amazon. They all sort of 347 00:19:45,160 --> 00:19:49,840 Speaker 2: came together and said, you know, this password world is terrible. 348 00:19:50,280 --> 00:19:54,480 Speaker 2: How do we solve this properly for businesses and consumers? 349 00:19:56,680 --> 00:19:58,360 Speaker 1: You might have seen this option before when you were 350 00:19:58,359 --> 00:20:02,240 Speaker 1: logging into Amazon or Google.
It basically removes the need 351 00:20:02,280 --> 00:20:04,879 Speaker 1: for a password altogether, so you can sign in with 352 00:20:04,960 --> 00:20:08,000 Speaker 1: Face ID or a fingerprint or a PIN on a device. 353 00:20:08,359 --> 00:20:10,399 Speaker 1: It kind of takes a while to wrap your head around, 354 00:20:10,480 --> 00:20:15,000 Speaker 1: but it's pretty simple. If you were to explain this to 355 00:20:15,040 --> 00:20:17,480 Speaker 1: a five year old, how do passkeys work? 356 00:20:17,600 --> 00:20:20,680 Speaker 2: For a five year old, I would say I would 357 00:20:20,760 --> 00:20:23,639 Speaker 2: tear a piece of paper in half in like a 358 00:20:23,760 --> 00:20:27,280 Speaker 2: unique way, and I would say, I'm gonna put this 359 00:20:27,320 --> 00:20:31,000 Speaker 2: one somewhere in the house, and I'm not going to 360 00:20:31,040 --> 00:20:33,119 Speaker 2: show you this other one, but I'm gonna hide this 361 00:20:33,160 --> 00:20:36,240 Speaker 2: one in the vault. And if you find that public 362 00:20:36,280 --> 00:20:38,760 Speaker 2: one in the house and you bring it back to 363 00:20:38,800 --> 00:20:41,240 Speaker 2: me and it matches mine, I'll give you an ice cream. 364 00:20:41,520 --> 00:20:44,640 Speaker 2: And that's like kind of a simple way to describe 365 00:20:44,680 --> 00:20:47,159 Speaker 2: it in not-so-accurate terms. 366 00:20:47,200 --> 00:20:53,600 Speaker 1: You know, that's actually an interesting way to put it, 367 00:20:53,680 --> 00:20:58,400 Speaker 1: because I've seen the comparison of passwords to passkeys 368 00:20:58,480 --> 00:21:01,560 Speaker 1: as something you know versus something you have. 369 00:21:02,320 --> 00:21:04,480 Speaker 2: Yeah. And I think you're right there, because it does 370 00:21:04,560 --> 00:21:08,560 Speaker 2: involve those two components.
And so when, on your phone, 371 00:21:08,600 --> 00:21:11,360 Speaker 2: you want to enable a passkey for, say, Netflix, 372 00:21:11,400 --> 00:21:14,280 Speaker 2: it will give them that torn side of the piece 373 00:21:14,280 --> 00:21:18,760 Speaker 2: of paper, the public side, and then within your phone 374 00:21:18,840 --> 00:21:22,879 Speaker 2: it will put the private portion in the secure enclave. 375 00:21:23,800 --> 00:21:28,080 Speaker 2: And that secure enclave on your phone is a separate 376 00:21:28,240 --> 00:21:31,399 Speaker 2: chip from the main processor on your iPhone, and 377 00:21:31,480 --> 00:21:34,400 Speaker 2: it actually runs its own boot process and its own 378 00:21:34,640 --> 00:21:39,840 Speaker 2: operating system, and so the kernel of the iPhone cannot 379 00:21:39,960 --> 00:21:42,880 Speaker 2: access the secure enclave directly. 380 00:21:43,080 --> 00:21:46,160 Speaker 1: So it's kind of like having a separate place where 381 00:21:46,160 --> 00:21:49,480 Speaker 1: you keep all these torn pieces of paper exactly for 382 00:21:49,640 --> 00:21:51,800 Speaker 1: very specific things. Okay, you know what, I actually feel 383 00:21:51,840 --> 00:21:54,440 Speaker 1: like I'm getting what you're saying here. So a password: 384 00:21:54,520 --> 00:21:58,280 Speaker 1: somebody could guess my password conceivably, even if it's a 385 00:21:58,320 --> 00:22:00,879 Speaker 1: really long one, given enough time. With a powerful 386 00:22:00,960 --> 00:22:03,399 Speaker 1: enough computer, they can run a bunch of numbers and 387 00:22:03,440 --> 00:22:07,040 Speaker 1: eventually figure out that my password is Fido one two three. 388 00:22:07,200 --> 00:22:09,680 Speaker 1: For those listening, my password is not Fido one two three. 389 00:22:09,720 --> 00:22:12,919 Speaker 1: Please do not try that.
But nobody will ever be 390 00:22:13,000 --> 00:22:16,879 Speaker 1: able to rip a piece of paper in precisely the 391 00:22:16,960 --> 00:22:20,119 Speaker 1: correct way that, when I go up to the vault, 392 00:22:20,840 --> 00:22:25,320 Speaker 1: they'd put together and they'd match perfectly. Exactly. That can 393 00:22:25,359 --> 00:22:26,040 Speaker 1: never happen. 394 00:22:26,200 --> 00:22:30,760 Speaker 2: The other thing is that the operating system will not 395 00:22:31,040 --> 00:22:34,960 Speaker 2: allow the papers to be compared until it validates 396 00:22:35,000 --> 00:22:38,919 Speaker 2: my face. So a friend could come and bring you 397 00:22:38,920 --> 00:22:42,240 Speaker 2: the piece of paper. And in order to unlock the 398 00:22:42,320 --> 00:22:46,440 Speaker 2: vault to even do the comparison, the operating system prompts 399 00:22:46,480 --> 00:22:49,640 Speaker 2: for your bio, and the comparison is only made inside 400 00:22:49,680 --> 00:22:53,159 Speaker 2: the vault and then signed and sent out. So the 401 00:22:53,359 --> 00:22:55,679 Speaker 2: other side of the torn paper never comes out. 402 00:22:55,840 --> 00:22:57,800 Speaker 1: Okay, all right, so nobody else gets to see it. 403 00:22:57,840 --> 00:23:00,800 Speaker 1: So it's like, we're stretching the metaphor, but let's keep going. 404 00:23:00,840 --> 00:23:03,439 Speaker 1: It's like there's somebody inside the vault. Yeah, you know, 405 00:23:03,560 --> 00:23:05,800 Speaker 1: in order to even get to the vault, I got 406 00:23:05,800 --> 00:23:08,560 Speaker 1: to prove that it's me via fingerprint or, you know, 407 00:23:08,640 --> 00:23:11,520 Speaker 1: face scan or something like that. And then somebody who 408 00:23:11,640 --> 00:23:13,840 Speaker 1: lives inside the vault. For some reason, I don't know why, 409 00:23:13,960 --> 00:23:16,720 Speaker 1: but in this metaphor, somebody lives inside the vault.
I 410 00:23:16,920 --> 00:23:18,960 Speaker 1: put the piece of paper inside the vault. The little 411 00:23:18,960 --> 00:23:21,160 Speaker 1: guy inside the vault says, man, these two things match. 412 00:23:21,480 --> 00:23:23,840 Speaker 2: Yeah. He puts a stamp on the paper, 413 00:23:24,320 --> 00:23:27,600 Speaker 2: sends it back, and then somebody sees the stamp and says, oh yeah, 414 00:23:27,640 --> 00:23:29,359 Speaker 2: this really was legit. 415 00:23:31,640 --> 00:23:34,760 Speaker 1: Okay, but there is a downside to using passkeys. 416 00:23:35,400 --> 00:23:38,679 Speaker 1: What if you lose your device? That's after the break. 417 00:23:46,240 --> 00:23:49,160 Speaker 1: So okay, let's come back to reality a little bit. 418 00:23:49,560 --> 00:23:52,200 Speaker 1: What happens if I lose my piece of paper? See, 419 00:23:52,200 --> 00:23:55,159 Speaker 1: with a password, even if it's a bad password, if 420 00:23:55,200 --> 00:23:58,440 Speaker 1: I remember it, I remember it. But in this metaphor, 421 00:23:58,480 --> 00:24:03,080 Speaker 1: that torn piece of paper might be my actual phone, right? Yeah. 422 00:24:03,119 --> 00:24:06,520 Speaker 1: What happens if I lose my phone or even break 423 00:24:06,560 --> 00:24:08,960 Speaker 1: my phone? Oh, I dropped my phone into the toilet. 424 00:24:09,640 --> 00:24:13,240 Speaker 1: I can't read my email anymore, or I can't access 425 00:24:13,240 --> 00:24:16,720 Speaker 1: my bank. You might start wishing you just used passwords. 426 00:24:17,280 --> 00:24:21,119 Speaker 2: Yeah. So the smart folks involved in this consortium have 427 00:24:21,200 --> 00:24:25,480 Speaker 2: figured out ways to sort of operationalize this exact problem. 428 00:24:25,600 --> 00:24:27,560 Speaker 2: People lose their phones all the time. It has all 429 00:24:27,600 --> 00:24:31,400 Speaker 2: your passkeys on it. How do we restore this madness?
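[Editor's note: the torn-paper metaphor above describes public-key challenge-response, which is the mechanism behind passkeys (standardized as WebAuthn/FIDO2). A deliberately toy Python sketch using textbook RSA with tiny numbers; real passkeys use proper keys in secure hardware, and nothing here is safe for actual use.]

```python
# Toy challenge-response: the "vault" stamps a challenge with the private
# half; anyone holding the public half can check the stamp.
# Textbook RSA with tiny primes -- illustration only, never use for real.
p, q = 61, 53
n = p * q    # 3233, part of both halves of the "torn paper"
e = 17       # public exponent: the half the website keeps
d = 2753     # private exponent: the half that never leaves the enclave

def sign(challenge: int) -> int:
    """The 'little guy in the vault' stamps the challenge."""
    return pow(challenge, d, n)

def verify(challenge: int, stamp: int) -> bool:
    """The website checks the stamp without ever seeing the private half."""
    return pow(stamp, e, n) == challenge

challenge = 1234          # a fresh random number sent by the website
stamp = sign(challenge)
print(verify(challenge, stamp))        # True: stamp matches
print(verify(challenge + 1, stamp))    # False: stamp is for a different challenge
```

The key property the hosts circle around: the private half is used to produce the stamp but is never transmitted, so there is nothing for an attacker to guess or intercept the way they would a password.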
430 00:24:31,640 --> 00:24:34,000 Speaker 2: And in the last few years, people have moved to 431 00:24:34,040 --> 00:24:38,280 Speaker 2: a concept called zero knowledge encryption, which means they can 432 00:24:38,320 --> 00:24:41,639 Speaker 2: handle your super sensitive data without having to know what 433 00:24:41,680 --> 00:24:45,560 Speaker 2: the data is. So since the LastPass attack, now 434 00:24:45,760 --> 00:24:50,560 Speaker 2: LastPass can't even decrypt your data. It's entirely encrypted 435 00:24:50,600 --> 00:24:52,879 Speaker 2: in their system. They can't recover it, they can't see it, 436 00:24:52,920 --> 00:24:56,000 Speaker 2: they can't see any of the metadata, same with 1Password, 437 00:24:56,280 --> 00:24:59,360 Speaker 2: and this is much safer for them and us, because 438 00:25:00,200 --> 00:25:04,320 Speaker 2: if attackers know that there's nothing of value within 1Password 439 00:25:04,359 --> 00:25:07,680 Speaker 2: or LastPass, they're less likely to attack these 440 00:25:07,760 --> 00:25:11,879 Speaker 2: vendors as well. This same concept applies to the Apple 441 00:25:12,840 --> 00:25:17,200 Speaker 2: iCloud Keychain. So Apple has a way in which the 442 00:25:17,440 --> 00:25:22,280 Speaker 2: secure enclave can encrypt its data, send it into iCloud, 443 00:25:22,960 --> 00:25:27,199 Speaker 2: and then be decrypted by another secure enclave once it's authorized.
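[Editor's note: the zero-knowledge setup described above rests on deriving the vault encryption key from the master password on the client, so the vendor only ever stores ciphertext plus a salt. A minimal sketch of the key-derivation half using Python's standard library; the salt handling and iteration count are illustrative assumptions, not any vendor's real parameters.]

```python
import hashlib
import os

# Derive the vault key on the client. The server stores only the salt and
# the encrypted vault -- never the master password or the derived key.
def derive_vault_key(master_password: str, salt: bytes,
                     iterations: int = 600_000) -> bytes:
    """PBKDF2: many hash rounds make each offline brute-force guess costly."""
    return hashlib.pbkdf2_hmac(
        "sha256", master_password.encode("utf-8"), salt, iterations
    )

salt = os.urandom(16)  # random per-user salt, stored alongside the vault
key = derive_vault_key("correct horse battery staple", salt)
print(len(key))        # 32-byte key, derived entirely on the device
```

Because the derivation happens on the device, a breach of the server yields only ciphertext, and the iteration count multiplies the cost of every guess an attacker makes against a stolen vault.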
444 00:25:27,840 --> 00:25:32,560 Speaker 2: When you lose your Apple device, you have to prove 445 00:25:32,800 --> 00:25:36,760 Speaker 2: that this new device is indeed you through validating on 446 00:25:36,800 --> 00:25:40,200 Speaker 2: an old device or another trusted device, and then once 447 00:25:40,240 --> 00:25:44,040 Speaker 2: you have that authentication established with iCloud that this is 448 00:25:44,119 --> 00:25:48,760 Speaker 2: your new device, then that secure enclave can decrypt that 449 00:25:48,880 --> 00:25:51,639 Speaker 2: passkey and utilize it again. 450 00:25:52,600 --> 00:25:54,399 Speaker 1: All right. So let's be real here. This is a 451 00:25:54,440 --> 00:25:57,960 Speaker 1: downside with passkeys. It can get complicated if you lose 452 00:25:58,000 --> 00:26:00,360 Speaker 1: any of your devices, and if you happen to lose 453 00:26:00,359 --> 00:26:02,840 Speaker 1: your phone and you don't have access to another device, 454 00:26:03,040 --> 00:26:04,880 Speaker 1: like, say, a laptop, that you can use to sync 455 00:26:04,880 --> 00:26:07,600 Speaker 1: your passkeys again, then you might have to wait 456 00:26:07,640 --> 00:26:10,000 Speaker 1: to get a new phone before you can access your email. 457 00:26:10,640 --> 00:26:14,439 Speaker 1: Not exactly an ideal situation. But Josh, and pretty much 458 00:26:14,480 --> 00:26:16,880 Speaker 1: any security expert you talk to, is going to say 459 00:26:16,880 --> 00:26:20,560 Speaker 1: that the occasional inconvenience is still worth it given the 460 00:26:20,600 --> 00:26:24,239 Speaker 1: added security that passkeys can give you. So I 461 00:26:24,280 --> 00:26:28,280 Speaker 1: think somebody might be listening to this and thinking, this 462 00:26:28,400 --> 00:26:31,280 Speaker 1: sounds like a pain. All this stuff sounds like a pain.
463 00:26:31,400 --> 00:26:33,440 Speaker 1: You know, we're talking about Jeremy here, right? Jeremy is 464 00:26:33,520 --> 00:26:33,880 Speaker 1: just, yo, 465 00:26:34,040 --> 00:26:34,240 Speaker 3: man, 466 00:26:34,320 --> 00:26:39,200 Speaker 1: look, I do my little shopping, I do my little email. 467 00:26:39,440 --> 00:26:42,840 Speaker 1: You know, I pop on Instagram, maybe TikTok every once 468 00:26:42,880 --> 00:26:46,440 Speaker 1: in a while. That's it. Nobody's looking for me, and 469 00:26:47,119 --> 00:26:51,520 Speaker 1: this passkey thing sounds like a pain. Password managers, 470 00:26:51,520 --> 00:26:53,320 Speaker 1: maybe you could twist my arm to make me do this. 471 00:26:54,440 --> 00:26:57,760 Speaker 1: What would you say would be the one thing that 472 00:26:57,840 --> 00:26:59,800 Speaker 1: you're confident you could convince somebody to do? 473 00:27:00,200 --> 00:27:02,440 Speaker 2: Let me walk you through how to enable your passkey 474 00:27:02,520 --> 00:27:06,119 Speaker 2: on your Gmail account. It'll take about five minutes, 475 00:27:06,600 --> 00:27:09,000 Speaker 2: and it's going to make your life way more secure. 476 00:27:09,480 --> 00:27:14,160 Speaker 1: So your recommendation would be, first thing, a passkey on 477 00:27:14,840 --> 00:27:15,880 Speaker 1: your email account? 478 00:27:16,240 --> 00:27:16,680 Speaker 2: Correct. 479 00:27:17,080 --> 00:27:18,440 Speaker 1: Okay, that one simple thing. 480 00:27:18,920 --> 00:27:22,240 Speaker 2: Yeah. It's incredibly important to do, and it's a safety 481 00:27:22,280 --> 00:27:25,040 Speaker 2: net for all your other accounts that are going to 482 00:27:25,119 --> 00:27:27,080 Speaker 2: password reset to your email. 483 00:27:26,960 --> 00:27:28,639 Speaker 1: And you say that's five minutes? 484 00:27:28,960 --> 00:27:30,800 Speaker 2: Yeah. So you're just going to go into your Google account.
485 00:27:30,840 --> 00:27:34,520 Speaker 2: You're going to click on, I think it's your security settings. 486 00:27:34,720 --> 00:27:37,760 Speaker 2: It'll have your sort of authentication methods there, and you 487 00:27:37,800 --> 00:27:41,760 Speaker 2: can start to just set up a passkey, and it'll 488 00:27:42,040 --> 00:27:44,560 Speaker 2: prompt you on your device, ask you for your Face 489 00:27:44,600 --> 00:27:47,119 Speaker 2: ID or your Touch ID depending on what device you're using, 490 00:27:47,640 --> 00:27:51,280 Speaker 2: and then it's sort of seamless. 491 00:27:51,320 --> 00:27:53,960 Speaker 1: Being real here, before I did this interview, none of us 492 00:27:54,080 --> 00:27:56,920 Speaker 1: at kill switch were using passkeys. Neither me nor our 493 00:27:56,920 --> 00:28:00,240 Speaker 1: producers were using this. But we decided, all right, maybe 494 00:28:00,240 --> 00:28:03,679 Speaker 1: we better try this ourselves. And I'll be honest, it 495 00:28:03,720 --> 00:28:06,080 Speaker 1: took a little bit for me to understand it. But 496 00:28:06,160 --> 00:28:08,200 Speaker 1: once it's set up, you really don't have to think 497 00:28:08,200 --> 00:28:11,840 Speaker 1: about it again. And look, if you have a Gmail account, 498 00:28:11,880 --> 00:28:14,440 Speaker 1: I can walk you through the whole thing right now 499 00:28:14,480 --> 00:28:16,520 Speaker 1: while you're listening. So you can do this on your 500 00:28:16,520 --> 00:28:19,800 Speaker 1: computer or on your phone. So here we go. Take 501 00:28:19,800 --> 00:28:22,920 Speaker 1: out your phone, get out your computer. Let's go. First thing, 502 00:28:23,480 --> 00:28:26,479 Speaker 1: open up a browser and go to this address: g 503 00:28:27,040 --> 00:28:33,800 Speaker 1: dot co slash passkeys. It's g dot co slash passkeys. 504 00:28:34,280 --> 00:28:37,080 Speaker 1: So this is Google's official page.
You'll need to be 505 00:28:37,119 --> 00:28:40,040 Speaker 1: signed into your Google account already. If you're not, it'll 506 00:28:40,080 --> 00:28:42,760 Speaker 1: just tell you to sign in. Once you're in, you'll 507 00:28:42,760 --> 00:28:45,720 Speaker 1: see a screen that talks about passkeys, and there should 508 00:28:45,720 --> 00:28:48,600 Speaker 1: be a blue button that says use passkey or create passkey. 509 00:28:49,240 --> 00:28:51,920 Speaker 1: Tap or click that button, and your device will then 510 00:28:52,000 --> 00:28:54,560 Speaker 1: ask you how you want to create your passkey. You 511 00:28:54,560 --> 00:28:57,360 Speaker 1: could use a device PIN, your fingerprint, your Face ID. 512 00:28:58,080 --> 00:29:02,680 Speaker 1: Say yes, follow the instructions, and boom, you're done. That's it. 513 00:29:02,720 --> 00:29:05,520 Speaker 1: If you were following along, you're maybe already done now. 514 00:29:05,880 --> 00:29:09,880 Speaker 1: Also, extra credit: do this same process on another device, 515 00:29:10,400 --> 00:29:11,360 Speaker 1: just as backup. 516 00:29:12,640 --> 00:29:15,080 Speaker 2: It can be very problematic if you don't have any 517 00:29:15,120 --> 00:29:19,400 Speaker 2: backup devices anywhere, because with Apple, to get that device back 518 00:29:19,440 --> 00:29:23,600 Speaker 2: online, it needs a previously trusted device to sort of 519 00:29:23,800 --> 00:29:24,560 Speaker 2: help you through that. 520 00:29:25,000 --> 00:29:27,800 Speaker 1: So I have this old phone from, I think, twenty nineteen. 521 00:29:27,840 --> 00:29:30,640 Speaker 1: It's got a cracked screen, a run-down battery, the speaker doesn't 522 00:29:30,680 --> 00:29:33,200 Speaker 1: work right. I can't use it as a phone anymore, 523 00:29:33,720 --> 00:29:37,280 Speaker 1: but it's perfect as a backup. I created another passkey 524 00:29:37,360 --> 00:29:39,960 Speaker 1: on this device.
So if I'm out somewhere and 525 00:29:40,000 --> 00:29:42,640 Speaker 1: I lose my phone or it breaks or it gets stolen, 526 00:29:43,080 --> 00:29:45,400 Speaker 1: I know that I got a backup back at the house, 527 00:29:45,560 --> 00:29:47,000 Speaker 1: and it's not going to take me a whole bunch 528 00:29:47,000 --> 00:29:49,840 Speaker 1: of time to get back up and running. You know, 529 00:29:50,000 --> 00:29:54,280 Speaker 1: it occurs to me, and this 530 00:29:54,320 --> 00:29:57,200 Speaker 1: is honestly part of not only this episode, but part 531 00:29:57,240 --> 00:30:00,240 Speaker 1: of this podcast, part of the show in general, 532 00:30:00,280 --> 00:30:03,280 Speaker 1: that a lot of cybersecurity and a lot of technology 533 00:30:03,360 --> 00:30:06,280 Speaker 1: really isn't even about the computers anymore. I suppose it 534 00:30:06,320 --> 00:30:10,520 Speaker 1: never was. It's about people, you know. It's about relationships 535 00:30:10,520 --> 00:30:12,840 Speaker 1: to people, how we speak to each other. And I 536 00:30:12,880 --> 00:30:15,360 Speaker 1: think, you know, some of us who are 537 00:30:15,440 --> 00:30:18,240 Speaker 1: kind of good at technology, we can look at people 538 00:30:18,280 --> 00:30:20,200 Speaker 1: and say, hey, why aren't you doing this very simple 539 00:30:20,240 --> 00:30:22,920 Speaker 1: technological thing. Why aren't you doing it? But a lot 540 00:30:22,920 --> 00:30:27,720 Speaker 1: of the security stuff is actually having a personal conversation 541 00:30:27,800 --> 00:30:31,960 Speaker 1: with somebody and saying, hey, I care about you. Yeah, 542 00:30:32,080 --> 00:30:33,840 Speaker 1: I know you think you're not going to get hacked, 543 00:30:33,880 --> 00:30:36,440 Speaker 1: but let's get beyond that. 544 00:30:37,280 --> 00:30:39,640 Speaker 2: Yeah.
And I think one of the problems 545 00:30:39,680 --> 00:30:43,719 Speaker 2: with the passkey consortium is that the marketing 546 00:30:44,240 --> 00:30:47,680 Speaker 2: and education has been very poor. Right, there hasn't been 547 00:30:48,480 --> 00:30:52,760 Speaker 2: sort of this outreach to the public on, hey, folks, 548 00:30:52,920 --> 00:30:57,560 Speaker 2: this is the most revolutionary thing in passwords. You should 549 00:30:57,560 --> 00:31:01,600 Speaker 2: be adopting these things. Passkeys are game-changing, but 550 00:31:02,120 --> 00:31:04,680 Speaker 2: most people haven't even heard of them, and so I 551 00:31:04,680 --> 00:31:09,040 Speaker 2: think it's unfortunate that the educational campaign hasn't sort 552 00:31:09,040 --> 00:31:14,640 Speaker 2: of matched the technology capabilities, because, you know, people really 553 00:31:14,680 --> 00:31:18,480 Speaker 2: are getting like a several-level increase in security, and 554 00:31:18,520 --> 00:31:21,320 Speaker 2: it's available to them, but most folks don't understand that 555 00:31:21,520 --> 00:31:23,560 Speaker 2: it even exists, and so we 556 00:31:23,600 --> 00:31:25,800 Speaker 1: got to be out here like weird evangelists telling everybody 557 00:31:25,840 --> 00:31:30,040 Speaker 1: to use it. Yeah. So yes, thank you for letting 558 00:31:30,080 --> 00:31:32,360 Speaker 1: us evangelize to you. This is a cause that I 559 00:31:32,440 --> 00:31:34,440 Speaker 1: care a lot about. I really don't want to hear 560 00:31:34,440 --> 00:31:37,720 Speaker 1: about any kill switch listeners getting hacked. So if you 561 00:31:37,760 --> 00:31:41,000 Speaker 1: were a Jeremy, I hope you've decided to join us 562 00:31:41,000 --> 00:31:43,560 Speaker 1: here in the light. Or if you have a Jeremy 563 00:31:43,560 --> 00:31:46,160 Speaker 1: in your life, seriously, send this to them. Hopefully they 564 00:31:46,240 --> 00:31:48,880 Speaker 1: join us as well.
And also, if your literal name 565 00:31:48,920 --> 00:31:52,920 Speaker 1: is Jeremy, this is probably a very disorienting episode, and 566 00:31:52,960 --> 00:31:56,800 Speaker 1: for that I apologize. But yo, for real, I still 567 00:31:56,840 --> 00:32:01,000 Speaker 1: think password managers are great. Again, I use one. My 568 00:32:01,160 --> 00:32:03,960 Speaker 1: mother uses a password manager. By the way, Mom, thank 569 00:32:04,000 --> 00:32:05,800 Speaker 1: you for not getting mad at me for yelling at 570 00:32:05,840 --> 00:32:09,200 Speaker 1: you until you finally started using one. I appreciate it. 571 00:32:09,760 --> 00:32:12,240 Speaker 1: But for real, my mom can do this, so can you. 572 00:32:13,000 --> 00:32:16,040 Speaker 1: Now, this does not mean you will be bulletproof. Password 573 00:32:16,080 --> 00:32:18,800 Speaker 1: managers and passkeys do not protect you from scams or 574 00:32:18,800 --> 00:32:21,520 Speaker 1: phishing or anything like that. That's a whole other episode. 575 00:32:21,960 --> 00:32:24,360 Speaker 1: But this is a great start, because, look, one of 576 00:32:24,400 --> 00:32:26,600 Speaker 1: the points of this show, of kill switch, is that 577 00:32:26,640 --> 00:32:29,880 Speaker 1: the future does not have to be scary. Things are 578 00:32:29,920 --> 00:32:32,440 Speaker 1: scary when you are so overwhelmed that you don't know 579 00:32:32,480 --> 00:32:34,239 Speaker 1: what to do and you feel like there's nothing you 580 00:32:34,320 --> 00:32:37,360 Speaker 1: can do. But there is something you can do here, 581 00:32:38,200 --> 00:32:40,200 Speaker 1: and it really does feel kind of cool to take 582 00:32:40,240 --> 00:32:44,360 Speaker 1: back some control for yourself. It's cool to be vigilant, 583 00:32:45,080 --> 00:32:54,640 Speaker 1: but you don't have to be scared. And I will 584 00:32:54,680 --> 00:32:57,600 Speaker 1: cut out the diatribe there.
Thank you once 585 00:32:57,640 --> 00:33:00,320 Speaker 1: again so much for listening to kill switch. For real, 586 00:33:00,400 --> 00:33:02,360 Speaker 1: let us know what you think, and if there's something 587 00:33:02,360 --> 00:33:04,600 Speaker 1: you want us to cover or some questions you might have, 588 00:33:04,960 --> 00:33:07,720 Speaker 1: you can hit us up at kill switch at kaleidoscope 589 00:33:07,720 --> 00:33:11,240 Speaker 1: dot NYC. You can also follow us on Instagram at 590 00:33:11,520 --> 00:33:15,600 Speaker 1: kill switch pod, or my personal account, dexdigi, that's 591 00:33:15,680 --> 00:33:18,920 Speaker 1: d e x d i g i, on Instagram or 592 00:33:18,920 --> 00:33:22,400 Speaker 1: Bluesky. And for real, first priority, set up that 593 00:33:22,480 --> 00:33:26,040 Speaker 1: passkey, set up your password manager. But after you've 594 00:33:26,080 --> 00:33:27,920 Speaker 1: done that, while you got your phone out, you know, 595 00:33:28,040 --> 00:33:30,000 Speaker 1: make sure to leave us a review, because it helps 596 00:33:30,000 --> 00:33:33,120 Speaker 1: other people find the show, which in turn helps us 597 00:33:33,240 --> 00:33:36,440 Speaker 1: keep doing our thing. This thing is hosted by 598 00:33:36,520 --> 00:33:40,800 Speaker 1: me, Dexter Thomas. It's produced by Sena Ozaki, Darluck Potts, 599 00:33:41,080 --> 00:33:44,240 Speaker 1: and Kate Osbourne. Our theme song is by me and 600 00:33:44,360 --> 00:33:48,360 Speaker 1: Kyle Murdoch, and Kyle also mixes the show. From Kaleidoscope, 601 00:33:48,400 --> 00:33:53,480 Speaker 1: our executive producers are Ozro Lashin, Mangesh Hajigadur, and Kate Osbourne. 602 00:33:53,760 --> 00:33:58,080 Speaker 1: From iHeart, our executive producers are Katrina Norville and Nikki e. Tour. 603 00:33:58,680 --> 00:34:01,920 Speaker 1: Catch you on the next one. 604 00:34:10,320 --> 00:34:11,160 Speaker 3: Bye