Speaker 1: Welcome to TechStuff, a production of iHeartRadio's How Stuff Works. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with How Stuff Works and iHeartRadio, and I love all things tech. Today we're gonna talk about a really pretty serious issue about hacking and personal information. This was inspired by a real-world incident that happened several years ago. This particular episode originally published on September 3, 2012, and at the time it was extremely topical. But I would argue that the message behind the show is one we should still heed today, even though the actual incident is now more than seven years old. This episode was called More Data, More Problems, and I hope you enjoy. Now, when we're recording this, it's early August 2012.
Speaker 1: It's August 10, actually, and earlier this week a news story broke, first throughout the Twittersphere and then beyond, about a tech journalist named Mat Honan, who has written for various publications including Wired, and how he had essentially his entire digital life hacked over the course of about thirty minutes. To explain what happened, first we'll talk about the way he discovered this through his personal experience, then how the hackers did it, and then what needs to happen so that we protect ourselves against such things in the future. So to start, he was playing with his kid and he noticed that his iPhone had shut down. It crashed, essentially, and he thought, oh, well, that's annoying. I guess I'll have to connect it to my computer, restore from backup, and just get this thing going again. He didn't really think much of it, because technology occasionally fails. So then he goes over to his computer and tries to start that up, and that also isn't loading properly.
Speaker 1: It's asking him for information that he doesn't have and it won't accept his password, so he's thinking, well, that's weird, but he still doesn't panic. He then tries his iPad, which also isn't working, and he tries logging into his Google account using a different computer, and that also fails. It's at that point that he realizes something is seriously wrong. Eventually he notices that his own Twitter handle is posting stuff, and he's not the one doing it, and he can't access his Twitter account anymore either. There are these horrible tweets going out, inappropriate things that are racist or homophobic or full of foul language, and it's just beyond his control. He gets on the phone with Apple trying to find out what's going on, to explain that his account has been hacked, and it takes quite some time before they're able to sort it out. Part of the reason is that for a while they were looking at the wrong account.
Speaker 1: They had his name wrong, and so they were looking at an account that had none of the issues he was describing. When the Apple representative repeated his name back to him, that's when he said, wait a minute, that's not who I am. I'm Mat Honan. You've got the wrong name. Once they switched their focus, they started seeing that before he called in (and actually I think Honan had to ask about this; they didn't volunteer the information) someone else had called in to regain access. They said to regain access; really it was to gain access for the first time. It was the hackers who had called in, claiming that they no longer had the password or the answers to the security questions, so they could not recover the password normally. They were trying to get into his .me email. And the reason for all of this is probably the craziest part of the story, although so is the pathway the hackers took to get to the point where they were able to do all these things.
Speaker 1: You know, once they got access to his iCloud account, they were able to do things like wipe his devices, which is what happened. They wiped his iPhone, his Mac, and his iPad, in part to prevent him from being able to head them off while they were going down this trail of hacking his digital life. Because of the way he had interconnected various accounts, they were also able to do things like reset his Google password, which sent the reset message to the .me address they already had access to, because they had gained it from Apple. Once they got the password for the Google account, they were able to get the password for Twitter, because he had his Twitter account attached to his Google account. So it was kind of a leapfrog thing: they could do a password recovery on one system, it would send the message to an email address that was already compromised, and then they would get access to the next thing. Turns out what the hackers were interested in from the very beginning was getting hold of his Twitter account and posting these messages.
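That leapfrog pattern is essentially a graph-reachability problem: each account points at the inbox that receives its password resets, so compromising one inbox cascades. Here's a minimal sketch in Python; the account names and recovery links are hypothetical stand-ins for the chain described in the episode, not anyone's real configuration.

```python
from collections import deque

# Hypothetical recovery links: each account's password-reset email goes
# to another account's inbox, mirroring the chain from the episode.
RECOVERY_EMAIL_OF = {
    "google": "icloud",   # Google resets land in the .me/iCloud inbox
    "twitter": "google",  # Twitter resets land in the Gmail inbox
}

def reachable_accounts(compromised: str) -> set:
    """Return every account an attacker can take over once `compromised`
    is in hand, by repeatedly requesting password resets."""
    owned = {compromised}
    queue = deque([compromised])
    while queue:
        current = queue.popleft()
        # Any account whose reset email lands in an owned inbox falls next.
        for account, recovery in RECOVERY_EMAIL_OF.items():
            if recovery == current and account not in owned:
                owned.add(account)
                queue.append(account)
    return owned
```

Starting the traversal from the iCloud inbox reaches all three accounts, which is exactly the order the episode describes the attackers following.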
Speaker 1: That was really just for laughs. That's all they really wanted to do. They weren't out to make a big show that it should be Mat Honan specifically who suffered for this. They had nothing to do with Gizmodo, which Honan had written for. His account was still linked to Gizmodo's account; it had never been unlinked, even though he no longer wrote for Gizmodo, so they also had access to Gizmodo's Twitter account and hijacked that for a while. So it turned out the only reason they wanted his Twitter account was that he had one of the rarest things on Twitter: a three-letter Twitter handle. Most people have to go with a longer handle, because of course, once one's taken, it's gone. People who managed to land one of those three-letter accounts are rare, and that's why they targeted this particular Twitter account. It had nothing to do with him personally, nothing to do with who he worked for, and nothing to do with the fact that he was a tech journalist.
Speaker 1: It was just because his Twitter handle was three letters long. And that's crazy to me. First of all, that they were willing to go through the steps they had to go through in order to get this one Twitter account. Well, that's true, although it took them only a little less than an hour to accomplish. Once they had determined their route of attack, it was all over. So the way they did this was not through any kind of crazy sit-down-at-the-computer, type-in-the-password-three-times-and-you're-in thing. And it certainly wasn't a Hollywood-style brute-force attack where some group of hackers tries everything they can to force their way in. Yeah, it wasn't like a computer program that was just running password after password, where you see the little digits flip up each time you hit one. That's correct, that wasn't what happened. What happened was much simpler, really, in a way, because it had nothing to do with writing code.
Speaker 1: It had everything to do with manipulating systems, but from a person perspective, or a policy perspective, not from a technological one. Yeah. And it's also clear that although Apple's security procedures are in part at fault, Apple was not the only company the hackers targeted to get more information on Honan. It just so happened that the information they needed coincided across his accounts at multiple companies, and once they got some information from a couple of places, they were easily able to go in and fiddle with other stuff. There are really three parties that are, I don't want to say at fault, because you don't blame the victim. There are three parties that made it possible for the hackers to get access to the accounts. One of those is Honan himself. Yeah, and he freely admits that. He has written an incredible article that documents this entire process and what he went through.
Speaker 1: He blogged about it when it happened, but then he wrote up a much more comprehensive account of it for Wired, and it's a very interesting read. I highly recommend you read it, especially if you're concerned with your own computer security. So he was at fault and not at fault: some of his choices made this possible. Amazon.com's policies also made this possible, and Apple's policies made this possible. Those three parties together made it possible for the hackers to achieve this, and it's kind of interesting how they came about it. Yeah, and some of the irony, as we get into this, is that some of the very things that made this possible are in place specifically to make it more difficult for someone to steal identities. Some of these procedures actually worked in exactly the opposite way from what was intended when they were implemented. So the way this started off was fairly clever.
Speaker 1: First the hackers did a little recon work. They wanted to figure out how they would get access to the Twitter account, and they were able to find Honan's email address because he has a personal website; they figured out that the Twitter account was linked to that website. They did a WHOIS lookup on the domain, which gave them the two things they needed: his email address and his physical address. Yeah. Now, if you register a domain name, you are required to have contact information available, and that information is publicly available. Well, we could talk about that too, but anyway, the WHOIS record for the domain had his information in it. Yeah. The email address alone didn't give them access to any account yet, but that's where they found the Gmail address, and that's where they found the physical address.
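As a concrete picture of that recon step, here's a sketch in Python of pulling registrant fields out of a WHOIS record. The record text here is invented for illustration; real WHOIS output varies by registrar, and since this 2012 incident, privacy and redaction services hide most registrant fields by default.

```python
# Invented WHOIS record, for illustration only. In 2012 the registrant's
# email and mailing address were often sitting in plain text like this.
SAMPLE_WHOIS = """\
Domain Name: EXAMPLE.COM
Registrant Name: Jane Doe
Registrant Street: 123 Main St
Registrant City: San Francisco
Registrant Email: jane@example.com
"""

def whois_field(record, field):
    """Return the value of the first line starting with `field:`, or None."""
    prefix = field + ":"
    for line in record.splitlines():
        if line.startswith(prefix):
            return line[len(prefix):].strip()
    return None
```

Two lookups (`Registrant Email`, `Registrant Street`) are all the recon the episode describes needing from this step.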
Speaker 1: Then they started to look at the account recovery flow for Google, and without actually submitting a recovery request, they could see that the recovery address, which Google only partially obscures, was a me.com email address. Yeah, and me.com is an Apple thing, right. So that's where they said, ah, now we know how to get at him: if we do a password recovery on his Google account, the reset will go to an Apple address, and if we can manipulate the system to get access to his Apple account, it's all over. And the way they got access to the Apple account was kind of interesting. Now, they did not have the password, and they did not have the answers to the security questions, so calling up Apple and getting access to the account would require some other information. What Apple required was the billing address and the last four digits of the credit card used to establish the account.
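That "partially obscured" recovery address is worth dwelling on, because even a masked hint can leak recon. A common masking style keeps the first and last letters of the local part and the whole domain, which is exactly the part that told the attackers the recovery inbox was an Apple address. The sketch below is an illustrative approximation; the exact format Google used in 2012 may have differed.

```python
def mask_email(address):
    """Mask an email the way many recovery screens do: hide the middle of
    the local part, keep the domain readable."""
    local, domain = address.split("@", 1)
    if len(local) <= 2:
        hidden = "*" * len(local)
    else:
        hidden = local[0] + "*" * (len(local) - 2) + local[-1]
    # The domain stays visible, so a masked hint still reveals *which
    # provider* holds the recovery inbox (here, Apple's me.com).
    return hidden + "@" + domain
```

So even without submitting a recovery request, an attacker learns where the next link in the chain lives.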
Speaker 1: So what the hackers did was they said, well, there's a good chance that the same credit card this guy used to establish his iCloud account is the one he uses on Amazon. So instead of calling Apple first, they called Amazon first, and they said they wanted to add a credit card number to the existing Amazon account. That's right, so they weren't trying to get the credit card number; they wanted to add one. So they add a credit card number to the Amazon account. Then they hang up. Then they call Amazon back and say that they have lost access to the account, and they provide the name, the billing address, which they already have from the WHOIS lookup on the website, and the credit card number they gave on the earlier call. So there's now a credit card number on file that checks out, because they provided it; it's not the same one that was used to establish the account in the first place. So then Amazon says, oh, all right, we'll send you the password to the account.
Speaker 1: Which email address did you want it sent to? So the hackers give their email address, or rather an email address they created for the purposes of this hack. Amazon sends the login information for the Amazon.com account to that address, they log into the Amazon.com account, and then they look for the other credit card number, the one that was actually used to establish the account. This gives them Honan's actual final four digits, because those are unmasked in the Amazon.com system. Yes, they mask the rest of it. Yeah, the rest of the numbers are masked. So it's not that the hackers ever had access to the full credit card number. They could have bought a whole bunch of stuff on Amazon and had it sent somewhere, but that's all. Yeah, that's what they could have done if they had wanted to, but they could not actually pull the credit card number itself, other than the last four digits.
Speaker 1: But those last four digits are what Apple needed for account verification, right. So they take those four digits, they've got the billing address, they give Apple a call, they provide that information, and because Honan used the same billing address and the same credit card for both services, Apple said, oh, well then you're clearly this guy; we will send the account retrieval information to your email address. So now they have a way to log into Honan's iCloud account. They do that, and that's when they disable his devices. They wipe them to slow things down so they can continue doing this stuff. Now they have access to his Apple email, and they have access to his Amazon account. That's when they go to the Google password recovery and ask for the recovery information so they can access his Google account. Well, that goes to his Apple address, which they already control. The information comes to the Apple address, and they go into the Google account.
Speaker 1: They immediately delete the password recovery email out of his account, so that if he has any other devices that would alert him that his password had been changed, he would not be aware of it. So they hide that they changed the password. Now they've locked him out and they have access to his Google account, and they were then able to go and get access to the Twitter account. This is kind of scary, and again it has nothing to do with sitting down and coding. It is hacking; you're hacking a system, but you're doing it more through social engineering and manipulating policies and systems. So if you remember, we had that discussion, I think it was episode 399, where we interviewed Brian Brushwood and we talked about social engineering.
Speaker 1: Now with Brushwood, his approach to social engineering is more about having fun. You're in a social situation where you never have to buy a drink, because you're doing these cool things and convincing other people to buy drinks for you, or you're doing something so that you can get the phone number of someone you're interested in. You're still social engineering people, but it's not necessarily as nefarious as what these hackers were doing. Yeah, and it's not typically what one thinks of when one thinks of identity theft. I mean, a lot of us would look at maybe the Amazon portion of this, or an online retail portion of this, and say, oh, well, they got access to his credit card number, they can buy stuff. And in a lot of cases that may be what a hacker would try to do. After all, we have talked about online systems being hacked for financial information and financial gain, but that's not the point of this.
Speaker 1: The system I was speaking of a few minutes ago, when I was saying that ironically some of these things, tools that would be used to protect him, were turned against him: if you're not an Apple customer, you may not be aware that there's an iCloud feature called Find My, and there are a couple of them, like Find My iPhone. So let's say, and we're talking completely hypothetically here, you have an iPhone and your kids run off with it and stuff it somewhere in some piece of furniture, or drop it, or you left it in a cab. Well, if you're Natali Del Conte... well, yeah. I was going to start with the easy one. You can make your phone make a noise, so if you know it's in the house but you can't figure out where it went, you can find it. I'd like to have one of these for my keys and maybe the remote. But you can make it make a noise, or if you've left it in a cab, you can have it tell you roughly where it is.
Speaker 1: This is especially useful if you can't remember whether you left it in a cab or at a restaurant or wherever. Or, you know, you were at a bar and you had a prototype version of the newest iPhone, and it was sitting on the stool next to you while you were at the bar, but when you turned around it was gone, and then it ends up at some tech blog. Yeah, that could happen. Yeah, their Twitter feed could be hacked too. But yeah, so you can find out where it is. You can have it make a noise so that if it is in the same location as you are, you can track it down. If you don't know where it is, let's say you did leave it in a bar somewhere, you can see a location; it shows you on the map where it might be. Oh, it's no longer in my control. It's somewhere, and I don't know where. I have sensitive information on there.
My calendar's on there, my contacts are on there. As Honan himself said, you know, he had information from many other tech journalists. So, let's say he was still in control of his accounts but no longer in control of the device. He could say, wipe this device. I don't want anything on it anymore; you know, I want to wipe it clean so that nobody else gains access to my personal stuff. It's only a matter of time before they figure out my passcode, so wipe it clean. You know, you can tell it to do that and it will remotely do that. Apple has added that for the Mac too: Find My Mac. So in that case, let's say he had corporate information. Many companies have this policy in place: yes, you can check your corporate email on your personal device, but if you do that, we retain the right to wipe the information on the device if it should fall into somebody else's hands.
Or let's say that you were to either be fired, or you, you know, left, or whatever; they might retain that right so that they can protect themselves as a corporate entity. Yeah, so there are positive reasons to be able to do this. In this case, once the hackers gained information about his account and were able to get access to his account and lock him out, they also chose to completely wipe his phone, his iPad, and his Mac laptop. And in doing so, they not only wiped out any, you know, corporate information. He's a freelance writer, so any articles he might have been working on that were on his hard drive: gone. He also lost a year's worth or more, I guess, of personal photos, personal stuff that he had created. And yeah, this leads us to the thing that we have said a billion times on this podcast. That is an exaggeration, but: back up your data. Yeah, and he admits he was not regularly backing up his hard drive. This is not to pick on him or anything else.
It's something that he wishes, in retrospect, he had been doing on a regular basis. Because, oddly enough, this is where the story takes an unusual turn. He has been in contact with his hackers and has agreed not to press charges; in return, they were telling him how they did it. Yes, and I think the first thing we can agree on easily is that Amazon has to change its policy. Well, yeah, because that's the first step; that means that anyone could access anyone else's Amazon account using this. Well, I wasn't going to get there quite yet. I wanted to make the point that this is where it kind of gets a little weird, because they shared all this information with him. This is how he was able to write such a comprehensive post on Wired about it: they told him what they were doing, what the point of it was. They admitted, look, you know, we weren't trying to steal your stuff. We weren't really trying to wipe out your personal life. We have nothing against you personally. We wanted your Twitter account.
The guy that he talked to primarily was saying, essentially, hey, you know, my partner was the one who wiped out your computer. And now that you tell me all your personal files, the pictures of your kid, were on there, I'm really sorry. I'm actually really sorry. I didn't mean to cause you personal harm as a result of this. And they say... now, I don't know, you know, I don't know whether their motives are as pure as they say. You know, they say part of it was that they wanted to point out that it really is this easy to hack into your personal account. They wanted to draw attention to that. Now, I say that all the time. I suspect, based upon the messages that they posted on Twitter, that that's covering the tracks. I think they were doing it for the kicks. Yes, exactly. Well, again, if you're reading the Twitter posts that were posted under his name, there were a lot that he left there. He says, I wanted to keep a record of it.
He did delete some because they were overly hurtful or offensive. Yes, and he said, you know, these could actually cause people to feel badly about themselves, and I don't want that. I do want there to be a record of what had happened, but not at the expense of someone else's feelings, other than my own, obviously. So then he went out and deleted the ones he felt were particularly offensive, and the rest he left up. If you read those, I think it's pretty hard to defend yourself with "I'm just showing how the system can be hacked." It's more than that. It's also, hey, you know, ha ha, we did it. And so it goes beyond that. And I think it's very telling. The hacker he got in touch with, assuming that the information he gave about himself was accurate, is a young guy, nineteen years old, who might not quite be mature enough to realize, you know, what the consequences of those actions are.
And how they could affect the target beyond just... oh, you know, they're thinking, we have a goal, we want to get hold of this Twitter account. They're not thinking of what consequences are going to be felt by the target beyond just the fact that our Twitter handle has been taken over. And so some of it may just be that they were very narrowly focused on what they wanted to do, and they didn't really consider what could happen or how it would feel for that sort of stuff to happen to a person. So that's something there too, and we see that a lot. I mean, there are a lot of hackers out there who, because they can do something, they'll do it, and they don't realize, or they don't care, what the consequences of that action are going to be to the people who are also involved in that, whatever that situation is. Hey, guys, Jonathan from two thousand nineteen, just interrupting this episode to say we're going to take a quick break, but we'll be right back.
So maybe now... according to the article, it sounds like this guy is at least a little remorseful. Remorseful, yes; that he's feeling some remorse for this. And, you know, we don't know if, really, he was at all culpable in the actual deletion. He claims that it was the other guy who did it, but you know, you never know. So it's interesting to look at that. And, you know, if you kind of put yourself in the shoes of the hacker, especially if you're thinking of somebody who is doing it for fun, to mess with somebody, and the person says, hey, look, I'm not going to press charges against you, but I want to know how you did it. He starts thinking, hey, this guy is working with me. You know, the heat of the moment's off, the sense of accomplishment you get from hacking in and gaining access to all this information. You know, after the fact, you've had a chance to cool down, they've had a chance to cool down.
You start thinking about it like, well, you know what, this guy is not angry enough with me to press charges with the cops. You know, we kind of damaged this guy, and he's willing to talk to us about it and share the story online. You know, they kind of got something out of it too. They kind of got a little anonymous press, so they get to point to themselves and say, hey, look, he's talking about us. He doesn't seem like such a bad guy. I guess we kind of, you know, burned a lot of stuff of his online; that kind of stings. We were really kind of doing it for the fun of it, and now it's not so much fun; he's a decent guy. Now you know that there's a real person on the other end of that account. That's the other thing: there's a dehumanizing effect sometimes with the whole, you know... you don't really identify the fact that there's a person on the other end of these accounts. Sometimes you don't. The concept isn't fully formed.
For a lot of us, we would have gone out, and if we had found out who did it, we would have pressed charges. We would have wanted to take them... now, some of us would have re-enacted the film Taken. "I will find you." But yeah, that's what makes this story more interesting than other hacking stories, I think: it's got a humanizing character for both parties, the person or people who took advantage of Honan, and Honan himself. And it does point to security issues. Now, these are legitimate concerns. You think about your Amazon account, for example. Let's say you don't have anything else except an email account and an Amazon account. By and large, you probably wouldn't have a lot of these security issues. The security measures that Amazon would have in place would make it very difficult for someone else to get that information from them. But then you start sharing. You start using this email address with Amazon and every other company that you do business with online.
That makes your email address a key to getting information from other companies. And then you start doing business with other places. You've got the same credit card number across these different companies, and once you have the last four digits of your Social Security number or a credit card number, that makes it possible to use that information as a key across multiple entities. And all of a sudden, if you do business with a whole bunch of places, they get something like your physical address, your name, your email address, a credit card number, any of that stuff, and they've got the keys to open lots and lots of accounts, for them to get more information. And once they've hacked one, they can get information that will let them into lots and lots of other places. Oh, they have an Amazon account; I wonder if they have a Barnes and Noble account. We could find out in about ten minutes. Yeah. So Honan admits that his password was not the strongest. It was a seven-digit alphanumeric password, but it was one he had used for many years. But it didn't matter.
They didn't really use it, right. So that's the point of this thing: even if he had had the strongest password in the world, it would not have mattered, because they circumvented that. Right, they weren't attacking from that direction. And this demonstrates why security is so tough, because you think about the most obvious point of entry, which would be the login, right? Your username and your password. That's the most obvious point, because that's the way we access our information. Hackers are looking at a system and saying, what's the most vulnerable spot to go in at? And if the front door is heavily locked, you look for a window or a back door. You look for something else that's gonna let you get in there. You just bypass the place where you've got all the security, and you go in through a different entrance.
So when I said that Amazon really needs to work on its policy, mainly the reason for that is that the only thing you need in order to get that login recovery information was the credit card number that's associated with the account, which they did by adding one in, using the billing address and an email address, and that's it. And in order to add the credit card number, all you need is the billing address and the email address that is associated with the account. So, you know, using some guesswork, thinking, okay, well, he's probably got an Amazon account. He's probably using this address for that Amazon account. We know his address because we looked it up from his website. We can fabricate a credit card using a generator that creates a realistic but not actually activated credit card number, assign that to the Amazon account, and then use that to get the entry point.
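Why does a "generator" produce numbers that an account system accepts? Card numbers end in a Luhn check digit, so any digit string that satisfies the checksum looks plausible to software that only validates the format and never contacts the card network. A rough sketch of the idea, assuming nothing beyond the public Luhn algorithm (the function names here are made up for illustration, and a number built this way is not a working card):

```python
import random

def luhn_valid(number: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    digits = [int(d) for d in number]
    # Double every second digit from the right; subtract 9 if the result exceeds 9.
    for i in range(len(digits) - 2, -1, -2):
        digits[i] *= 2
        if digits[i] > 9:
            digits[i] -= 9
    return sum(digits) % 10 == 0

def make_plausible_number(prefix: str, length: int = 16) -> str:
    """Append random digits, then a Luhn check digit, to a card prefix."""
    body = prefix + "".join(
        random.choice("0123456789") for _ in range(length - len(prefix) - 1)
    )
    # Exactly one final digit makes the whole string satisfy the checksum.
    check = next(d for d in "0123456789" if luhn_valid(body + d))
    return body + check
```

The point is that format validation is not verification; only the card network can say whether a number belongs to a real, active account.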
So obviously Amazon needs to fix that, because if all you have is a person's address, and you have a good guess at what email address they use for that Amazon account, then you could do the same thing. And so that's number one. Number two would be the fact that Apple uses the last four digits of the credit card and the billing address as a security recovery method. Clearly that needs to change in some way. Yeah, I think there are a couple of things here. Now, there's an account on Honan's Tumblr, and if you want to read some truly hurtful comments, I would suggest reading that, because some people blame him for owning Apple devices, which is ridiculous. In fact, the one that bugged me probably the most was the one that said, serves him right for owning iCrap. And I'm going, you know, this really could have happened with pretty much any manufacturer. It's just, I mean, Apple had policies that they were able to leverage.
That's not to say that other companies don't have those same policies; it's just that Apple's were well known to them. So that's how they, once they saw the me dot com addresses, said, all right, we know how to do this. Yeah. And the thing is, I would say the vast majority of online retailers, or companies that offer services online... I mean, they knew how to get into a Google account too, and a lot of them have the same policies. So if you can get, as they did, one piece, then you can apply it to other pieces, get information from them, and put the whole puzzle together that way. So, while I've seen people singling out Apple and Amazon, and they should to some degree be considering new policies, it's not just their fault. The catch-22 here is, once you make an account so locked down that it's extremely hard to get into, it's also hard for you to get into when you do forget your password, when you do forget what credit card you used. Say you've got ten credit cards.
Let's say you shredded one of them because you don't use that card anymore. But that's the one that you set up the account with two years ago. Now you can't get back in. And so if they lock it down too hard, then you can't get back in either. So that's why they make those pieces available. Well, can you tell me the last four digits of your Social Security number? Oh yeah, I know those. Well, they got that from somebody else. So there's a catch-22 here: how secure is secure enough, without being so secure it locks you out forever? So there is that; it is a challenge. Part of it, too, when we're talking about the domain name... they were able to get information from his domain name, and there are things you can do there too.
A lot of the services, the places where you can register domain names, offer a secure service where you pay an additional fee per year, or per however often you renew your domain name, that will lock it down so that, basically, the registrar is responsible for it. So if you want to contact the owner of the domain name to, say, make them an offer: hey, we want so-and-so dot com, you've got it, can we offer you ten thousand dollars and buy the domain name from you? It would go through your registrar, and you would get contacted for it. But your information is not the information out there, so there's a proxy between you and them. That would have helped him too. If he had had something like that in place, it would have helped lock it down. Google... it's kind of interesting, because what Google showed them was "m******n" at, you know, the Gmail name. They were right in guessing that it was his first initial, last name. He had that address at several places.
He points that out, and that was easy. Could Google fix that and make it more obscure, so that it wouldn't be so easy to guess? Maybe. Could he have picked a more difficult name to use as his backup email address? Probably. But there are lots of little things that everyone involved could have done to make it more difficult. And Google also has a two-step verification process. That's exactly what I was going to mention next. Two-part authentication is a useful approach, and I've used it. Yeah, I've used it. So two-part authentication is kind of what it sounds like: you need to have two different things in order to be able to access the account. And a typical approach is that you register a phone number with whatever the service is, like a cell phone.
You register that cell phone with whatever the service is, and then when you try to access it, you have to be able to provide not only the password; an authentication code is also sent to the device that you have registered, and you have to enter whatever that number is, and then, and only then, can you actually access whatever the account is. And that helps a lot, because as long as that device remains in your possession and no one has been able to intercept it in any way, you should be fairly safe. So even if they try to reset the password, they can't get access to it, because they're trying through a different device that has not been registered, and then you get that message.
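The codes in schemes like this are typically derived from a shared secret plus a counter or the current time; the time-based variant was standardized as TOTP in RFC 6238, which is what authenticator apps implement. A minimal sketch of that derivation using only the Python standard library (this is the standard algorithm, not necessarily what any given SMS system used in 2012):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, when=None, digits=6, step=30):
    """Derive an RFC 6238 time-based one-time password (HMAC-SHA1)."""
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the number of `step`-second intervals since the epoch.
    counter = int((time.time() if when is None else when) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): take 4 bytes at an offset given by the last nibble.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Because the server and your device share the secret, both can compute the same short-lived code independently; an attacker with only your password cannot.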
And we've seen 663 00:38:39,640 --> 00:38:42,239 Speaker 1: variations of this as well, not just two-part authentication, 664 00:38:42,280 --> 00:38:48,040 Speaker 1: but also registering devices with services. Lots of 665 00:38:48,040 --> 00:38:49,520 Speaker 1: them do that so that you can look at the 666 00:38:49,560 --> 00:38:53,200 Speaker 1: different sessions that are logged in through a particular service, 667 00:38:53,239 --> 00:38:55,000 Speaker 1: and if you see that there's one 668 00:38:55,040 --> 00:38:58,600 Speaker 1: there that you don't recognize, someone might have access to 669 00:38:58,680 --> 00:39:02,640 Speaker 1: your account. So, for example, Facebook does this, where if 670 00:39:02,680 --> 00:39:07,160 Speaker 1: you try to access your Facebook account through different devices, 671 00:39:07,719 --> 00:39:10,240 Speaker 1: it may tell you, hey, I don't recognize this device. 672 00:39:10,320 --> 00:39:12,920 Speaker 1: This isn't something that you've used to access this account 673 00:39:13,000 --> 00:39:17,319 Speaker 1: before, and it'll send an email to you to 674 00:39:17,440 --> 00:39:21,680 Speaker 1: let you know: hey, someone's 675 00:39:21,680 --> 00:39:24,560 Speaker 1: accessing this. Is this you? Because if it's you, it's cool. 676 00:39:24,640 --> 00:39:27,399 Speaker 1: But if it's not you, then you need to look 677 00:39:27,440 --> 00:39:31,400 Speaker 1: into this. Jonathan from two thousand nineteen again. Well, 678 00:39:31,480 --> 00:39:34,239 Speaker 1: we still have some more information to give 679 00:39:34,320 --> 00:39:37,520 Speaker 1: you about this particular story, but before we can dive 680 00:39:37,520 --> 00:39:48,560 Speaker 1: into that, we need to take one more break. Now.
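The device-recognition idea described here boils down to remembering which devices an account has been seen on and flagging anything new. A toy sketch, with made-up account and device names; real services fingerprint devices far more carefully than a single ID string:

```python
# Devices this account has logged in from before, and alerts queued
# for the owner. In a real service both would live in a database.
known_devices: set[str] = set()
alerts: list[str] = []

def login(account: str, device_id: str) -> None:
    """Record a login; warn the owner if the device is unrecognized."""
    if device_id not in known_devices:
        # Unrecognized device: tell the account owner before trusting it.
        alerts.append(f"{account}: login from unrecognized device {device_id}")
        known_devices.add(device_id)

login("jonathan", "home-laptop")  # first ever login: flagged once
login("jonathan", "home-laptop")  # seen before: no alert
login("jonathan", "mystery-box")  # new device: flagged
```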
Again, 681 00:39:48,880 --> 00:39:53,439 Speaker 1: this is a good tool for people who 682 00:39:53,719 --> 00:39:58,320 Speaker 1: feel like they may have been hacked. However, let's say 683 00:39:58,360 --> 00:40:03,560 Speaker 1: that the person who is trying to access your Facebook account, 684 00:40:03,600 --> 00:40:05,320 Speaker 1: you know, where they're trying to hack into your Facebook 685 00:40:05,320 --> 00:40:09,839 Speaker 1: account, also has control of your email address. Then when 686 00:40:09,840 --> 00:40:11,959 Speaker 1: they say, hey, is this you, and they send 687 00:40:11,960 --> 00:40:15,279 Speaker 1: that to your email address, well, they've got that email address. Yes, 688 00:40:15,640 --> 00:40:18,960 Speaker 1: if it's gotten to that point, this particular approach 689 00:40:19,000 --> 00:40:22,880 Speaker 1: doesn't really help you. But there are other things that you 690 00:40:22,960 --> 00:40:25,440 Speaker 1: can do, because there are some things that you can't have 691 00:40:25,480 --> 00:40:28,160 Speaker 1: any control over. It's the companies 692 00:40:28,160 --> 00:40:30,480 Speaker 1: you work with. Well, one, you can choose which companies 693 00:40:30,520 --> 00:40:34,200 Speaker 1: you associate yourself with, but beyond that, you 694 00:40:34,239 --> 00:40:35,759 Speaker 1: have to hope that they put the right 695 00:40:35,920 --> 00:40:39,120 Speaker 1: stuff in place to protect you. What you can do: one, 696 00:40:39,480 --> 00:40:42,279 Speaker 1: continue to use strong passwords, and don't use the 697 00:40:42,320 --> 00:40:45,560 Speaker 1: same ones across multiple platforms, because 698 00:40:45,640 --> 00:40:49,239 Speaker 1: if one account does get compromised, 699 00:40:49,280 --> 00:40:50,800 Speaker 1: it makes it way easier for all the others to 700 00:40:50,840 --> 00:40:53,680 Speaker 1: get compromised.
It's the domino effect. Yeah, so you 701 00:40:53,760 --> 00:40:57,200 Speaker 1: want to start picking some pretty tough passwords, 702 00:40:57,560 --> 00:41:00,840 Speaker 1: and vary them across services, and change them 703 00:41:00,840 --> 00:41:05,439 Speaker 1: fairly regularly, because the longer they stay the same, the more 704 00:41:05,640 --> 00:41:10,520 Speaker 1: likely you're going to encounter a problem. Use some 705 00:41:10,560 --> 00:41:13,080 Speaker 1: sort of password manager so that you can keep track 706 00:41:13,120 --> 00:41:15,480 Speaker 1: of them all, because, you know, 707 00:41:16,239 --> 00:41:18,799 Speaker 1: the flip side of a strong password is that it's really 708 00:41:18,840 --> 00:41:22,360 Speaker 1: hard to remember. So if you've got lots 709 00:41:22,360 --> 00:41:25,920 Speaker 1: and lots of online accounts, then it's going to be 710 00:41:25,960 --> 00:41:28,279 Speaker 1: really challenging to keep all those straight. So some sort 711 00:41:28,280 --> 00:41:34,120 Speaker 1: of password manager is important. Also, think about what 712 00:41:34,200 --> 00:41:37,360 Speaker 1: you share before you share it online, because some of 713 00:41:37,400 --> 00:41:41,480 Speaker 1: the details you share may also serve as answers to 714 00:41:41,560 --> 00:41:46,360 Speaker 1: various security questions, or they may give off other information 715 00:41:46,400 --> 00:41:50,000 Speaker 1: that companies use to verify identity. So be careful about that. 716 00:41:50,520 --> 00:41:55,160 Speaker 1: You know, don't be too free with personal information 717 00:41:56,120 --> 00:41:59,480 Speaker 1: if that information could be used to circumvent 718 00:41:59,600 --> 00:42:03,719 Speaker 1: security systems.
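The advice here, a long random password per site kept in a manager, is easy to sketch with Python's standard library. A minimal illustration only, with made-up site names; it is not a replacement for a real password manager:

```python
import secrets
import string

# Pool of characters to draw from: letters, digits, and punctuation.
ALPHABET = string.ascii_letters + string.digits + string.punctuation

def generate_password(length: int = 24) -> str:
    """Build a random password using a cryptographically secure source."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

# One distinct password per account. A real manager would encrypt this
# vault behind a single master secret; a plain dict just shows the idea.
vault = {site: generate_password() for site in ("email", "bank", "social")}
```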
One suggestion I've always heard is that when 719 00:42:03,760 --> 00:42:07,960 Speaker 1: you create answers to security questions, you're essentially 720 00:42:07,960 --> 00:42:11,000 Speaker 1: creating another password. You don't answer the question. 721 00:42:11,680 --> 00:42:13,440 Speaker 1: You put something else in there, 722 00:42:13,520 --> 00:42:17,440 Speaker 1: something unrelated but something you will easily remember. 723 00:42:18,040 --> 00:42:20,320 Speaker 1: All right, so something that doesn't have to be a 724 00:42:20,360 --> 00:42:22,799 Speaker 1: strong password. In other words, it just needs to be 725 00:42:22,840 --> 00:42:25,720 Speaker 1: a keyword that doesn't have anything to do with the question, 726 00:42:25,760 --> 00:42:28,799 Speaker 1: but it's a keyword you are guaranteed to remember. So, 727 00:42:28,800 --> 00:42:31,759 Speaker 1: for example, I've seen questions 728 00:42:31,760 --> 00:42:34,000 Speaker 1: that ask for the make or model of 729 00:42:34,000 --> 00:42:38,080 Speaker 1: your first car. You could say something like grapefruit, which, well, 730 00:42:38,080 --> 00:42:39,799 Speaker 1: I know, if I'm asked about my car, I'm going 731 00:42:39,840 --> 00:42:43,480 Speaker 1: to say grapefruit. Right. Somebody might go, oh, it's a Chevy. 732 00:42:43,840 --> 00:42:45,719 Speaker 1: They might have looked on your Facebook page, and you 733 00:42:45,800 --> 00:42:48,239 Speaker 1: might have had a thing like, man, I 734 00:42:48,239 --> 00:42:51,040 Speaker 1: have such great memories of my first car, 735 00:42:51,239 --> 00:42:53,080 Speaker 1: and then you have a picture of it on there.
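The "grapefruit" trick, treating each security question as another password slot and filling it with a word that has nothing to do with the real answer, can be mechanized. A sketch under obvious assumptions: the word list is illustrative, and in practice you'd record the answers in your password manager rather than a plain dict:

```python
import secrets

# Unrelated words to use as fake security-question answers.
# A real list would be much longer (e.g. a diceware word list).
WORDS = ["grapefruit", "sousaphone", "glacier", "teakettle", "marmot"]

def fake_answer() -> str:
    """Pick a random word unrelated to the question being asked."""
    return secrets.choice(WORDS)

# Each question gets a decoy answer; store these alongside the
# account's password so you can repeat them when asked.
security_answers = {
    "model of your first car": fake_answer(),
    "mother's maiden name": fake_answer(),
}
```

Now someone who finds your actual first car on your Facebook page still can't answer the question, because the recorded answer has nothing to do with the car.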
736 00:42:53,440 --> 00:42:55,200 Speaker 1: But that's all they would need to be able to 737 00:42:55,200 --> 00:42:57,920 Speaker 1: answer that question, if you used the real answer, the 738 00:42:58,080 --> 00:43:01,319 Speaker 1: corresponding answer. So if you've done, say, 739 00:43:01,520 --> 00:43:05,600 Speaker 1: a thing on genealogy, and you've talked 740 00:43:05,600 --> 00:43:08,200 Speaker 1: about your parents, and said, well, you know, my mother 741 00:43:08,239 --> 00:43:10,000 Speaker 1: was so and so, and it's like, what's your 742 00:43:10,000 --> 00:43:12,440 Speaker 1: mother's maiden name? Oh, well, I know it was Stevens, 743 00:43:12,480 --> 00:43:14,920 Speaker 1: because I saw it on their Facebook account. 744 00:43:15,719 --> 00:43:19,440 Speaker 1: Well, that's pretty easy to track down. And 745 00:43:19,480 --> 00:43:23,319 Speaker 1: speaking of Facebook, it occurs to me that a 746 00:43:23,360 --> 00:43:27,600 Speaker 1: lot of sites these days are using Facebook Connect, or 747 00:43:27,760 --> 00:43:30,560 Speaker 1: Google, or Yahoo, and you can say, hey, would you 748 00:43:30,600 --> 00:43:33,759 Speaker 1: like to sign in with your blank account? Some of 749 00:43:33,800 --> 00:43:38,920 Speaker 1: them exclusively do that, where you cannot access it unless 750 00:43:38,960 --> 00:43:41,160 Speaker 1: you happen to have one of those accounts. Yes, 751 00:43:41,320 --> 00:43:44,200 Speaker 1: like I believe Pinterest, you had to log in through 752 00:43:44,239 --> 00:43:46,319 Speaker 1: Facebook when it first started. I don't 753 00:43:46,320 --> 00:43:49,320 Speaker 1: know if that's still the case. And Spotify, 754 00:43:50,000 --> 00:43:54,600 Speaker 1: Spotify had switched to requiring Facebook.
Okay, 755 00:43:54,600 --> 00:43:58,040 Speaker 1: so if they gain access to your Facebook account, all 756 00:43:58,080 --> 00:44:00,600 Speaker 1: of a sudden they've got access to every other account 757 00:44:00,600 --> 00:44:04,359 Speaker 1: that you've used that login with. When they offer 758 00:44:04,440 --> 00:44:08,200 Speaker 1: you an opportunity to create a separate login, maybe 759 00:44:08,239 --> 00:44:10,680 Speaker 1: you should take that opportunity. Yeah, it's a pain. It 760 00:44:10,800 --> 00:44:13,080 Speaker 1: is a pain. And the whole point of 761 00:44:13,080 --> 00:44:15,640 Speaker 1: Facebook Connect is that it makes things much more convenient. 762 00:44:15,760 --> 00:44:18,080 Speaker 1: You know, Facebook loves it because it 763 00:44:18,160 --> 00:44:21,600 Speaker 1: becomes the platform for the Internet, and people love it 764 00:44:21,600 --> 00:44:23,719 Speaker 1: because it means that it's one less thing they have 765 00:44:23,800 --> 00:44:25,680 Speaker 1: to worry about when they want to log in. But 766 00:44:25,800 --> 00:44:29,120 Speaker 1: it does mean that there is this point of vulnerability 767 00:44:29,160 --> 00:44:31,960 Speaker 1: that is incredibly attractive to someone who wants to get 768 00:44:32,000 --> 00:44:35,239 Speaker 1: access to your stuff, because if they get 769 00:44:35,239 --> 00:44:38,239 Speaker 1: access to one thing, they get access to a dozen more. 770 00:44:38,719 --> 00:44:41,959 Speaker 1: And I say Facebook, but like Chris was saying, 771 00:44:42,000 --> 00:44:44,359 Speaker 1: it's not just Facebook. Google is the same way. There 772 00:44:44,360 --> 00:44:47,480 Speaker 1: are lots of different services that you could potentially access if you have a 773 00:44:47,560 --> 00:44:53,880 Speaker 1: Google account.
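The single-point-of-failure argument here can be made concrete with a toy model: several services all trust one identity provider, so compromising that one login exposes everything linked to it. Service names below are made up for the illustration:

```python
# Accounts that delegate their login to one identity provider.
linked_via_facebook = {"photo-app", "music-app", "pin-board"}

def accounts_exposed(compromised: str) -> set[str]:
    """Everything an attacker reaches once the given account falls."""
    if compromised == "facebook":
        # The provider itself plus every service that trusts it.
        return {"facebook"} | linked_via_facebook
    # A standalone account exposes only itself.
    return {compromised}

# One stolen password, four accounts lost:
exposed = accounts_exposed("facebook")
```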
Another suggestion 774 00:44:53,920 --> 00:44:58,600 Speaker 1: I've seen is that there are a lot of services 775 00:44:58,640 --> 00:45:01,000 Speaker 1: out there that some of us will sign up for 776 00:45:01,760 --> 00:45:06,680 Speaker 1: and then stop using and then forget about. 777 00:45:06,760 --> 00:45:09,359 Speaker 1: If you never 778 00:45:09,480 --> 00:45:11,160 Speaker 1: use those services, it might not be a bad idea 779 00:45:11,200 --> 00:45:15,200 Speaker 1: to go back and delete those accounts, because 780 00:45:15,400 --> 00:45:18,239 Speaker 1: those are other points of vulnerability, especially 781 00:45:18,280 --> 00:45:20,319 Speaker 1: if you do tend to use the 782 00:45:20,320 --> 00:45:24,000 Speaker 1: same group of passwords over and over and hackers get 783 00:45:24,520 --> 00:45:27,640 Speaker 1: access to something, particularly if it's something that isn't terribly 784 00:45:27,680 --> 00:45:31,400 Speaker 1: popular anymore, and maybe as a result the security measures 785 00:45:31,400 --> 00:45:34,359 Speaker 1: aren't as up to date as they could be. It's 786 00:45:34,360 --> 00:45:37,840 Speaker 1: a possibility. You might want to get rid of that stuff. 787 00:45:38,000 --> 00:45:40,520 Speaker 1: So, you know, that MySpace account that you haven't 788 00:45:40,600 --> 00:45:43,480 Speaker 1: checked in four years, maybe it's time to just go 789 00:45:43,520 --> 00:45:49,080 Speaker 1: ahead and close that out, that kind of stuff. Yeah, 790 00:45:49,120 --> 00:45:52,240 Speaker 1: and we've already mentioned: back up your data. It's also 791 00:45:52,360 --> 00:45:56,239 Speaker 1: very important.
So yeah, basic tips that you 792 00:45:56,280 --> 00:45:58,759 Speaker 1: can follow to try and protect yourself, keeping in 793 00:45:58,800 --> 00:46:01,160 Speaker 1: mind that a lot of this also depends upon 794 00:46:01,239 --> 00:46:05,359 Speaker 1: the other parties involved. Yeah, and so looking back at 795 00:46:05,440 --> 00:46:10,320 Speaker 1: Matt Honan, did he do something wrong, 796 00:46:10,880 --> 00:46:14,160 Speaker 1: or deserving of being hacked? You know, 797 00:46:14,239 --> 00:46:16,520 Speaker 1: really, he could have been any of us. And even 798 00:46:16,520 --> 00:46:19,839 Speaker 1: though he's a known tech journalist, he sort 799 00:46:19,880 --> 00:46:22,560 Speaker 1: of succumbed to being human. You know, he had the 800 00:46:22,600 --> 00:46:24,520 Speaker 1: same password, he didn't change it for a long time. 801 00:46:24,560 --> 00:46:26,799 Speaker 1: He didn't back up, and I'm sure 802 00:46:26,840 --> 00:46:28,959 Speaker 1: he's probably told people to do that a thousand times, 803 00:46:29,000 --> 00:46:31,279 Speaker 1: just like we have. You know, we're all guilty of 804 00:46:31,320 --> 00:46:33,839 Speaker 1: skipping these little things because they're pains in the neck. 805 00:46:33,880 --> 00:46:35,120 Speaker 1: We don't want to do it, we don't have time 806 00:46:35,160 --> 00:46:37,680 Speaker 1: to do it. I mean, he's got kids; time is at a 807 00:46:37,760 --> 00:46:40,200 Speaker 1: premium for him, just like it is for so many 808 00:46:40,239 --> 00:46:43,720 Speaker 1: of us. You know, is it Apple's 809 00:46:43,760 --> 00:46:46,759 Speaker 1: fault in particular? Is it Amazon's fault in particular? The 810 00:46:46,760 --> 00:46:49,520 Speaker 1: only people who are really at fault are the hackers.
Yeah, 811 00:46:49,560 --> 00:46:52,120 Speaker 1: it's the combination of all of these things 812 00:46:52,160 --> 00:46:55,239 Speaker 1: together that made it possible. It's the hackers that are 813 00:46:55,280 --> 00:46:58,560 Speaker 1: really at fault. Yeah. And the thing is, we're 814 00:46:58,600 --> 00:47:01,719 Speaker 1: all busy, and none of us really wants to make 815 00:47:01,800 --> 00:47:06,399 Speaker 1: up a new twenty-four-digit password for 816 00:47:06,440 --> 00:47:09,680 Speaker 1: each thing and worry about them. No, none of us 817 00:47:09,719 --> 00:47:12,719 Speaker 1: really wants to mess with that. But the truth of 818 00:47:12,760 --> 00:47:15,799 Speaker 1: the matter is that all these systems worked together to 819 00:47:15,920 --> 00:47:19,399 Speaker 1: make this possible, and it's true for all of us. 820 00:47:19,440 --> 00:47:22,440 Speaker 1: I mean, these vulnerabilities are vulnerabilities for all of us. 821 00:47:22,680 --> 00:47:25,040 Speaker 1: I know that Amazon and Apple both have thought 822 00:47:25,080 --> 00:47:30,759 Speaker 1: about this. It's still kind of fresh. Yeah, 823 00:47:31,040 --> 00:47:33,920 Speaker 1: as we're recording this podcast. So, you know, neither of them, 824 00:47:34,000 --> 00:47:37,439 Speaker 1: I don't think, has made some public proclamation about how 825 00:47:37,960 --> 00:47:42,200 Speaker 1: they're going to fix this going forward, quote unquote fix 826 00:47:42,239 --> 00:47:46,440 Speaker 1: it. Again, what do you do? It's not obvious 827 00:47:46,480 --> 00:47:48,879 Speaker 1: how to do this, so I think two-part authentication 828 00:47:48,960 --> 00:47:53,359 Speaker 1: is probably one of the more obvious approaches, and 829 00:47:53,440 --> 00:47:57,800 Speaker 1: we might see some other elements thrown in there too.
830 00:47:58,080 --> 00:48:00,400 Speaker 1: However, I have seen people say, yeah, 831 00:48:00,440 --> 00:48:03,040 Speaker 1: I turned this on, and, this was the point I 832 00:48:03,080 --> 00:48:05,920 Speaker 1: was making earlier, it made it so difficult that it 833 00:48:05,920 --> 00:48:08,359 Speaker 1: took me two weeks to figure out how to get 834 00:48:08,360 --> 00:48:10,719 Speaker 1: back into my account, and it was a real pain 835 00:48:10,760 --> 00:48:13,000 Speaker 1: in the neck. I got in, but it took me 836 00:48:13,040 --> 00:48:16,320 Speaker 1: a while because I kind of laid myself a trap. 837 00:48:17,080 --> 00:48:19,759 Speaker 1: So it's one of those things where I think 838 00:48:19,800 --> 00:48:22,239 Speaker 1: you kind of have to work into it and think 839 00:48:22,280 --> 00:48:24,239 Speaker 1: about this stuff when you set it up, and go 840 00:48:24,320 --> 00:48:27,319 Speaker 1: back and look at your accounts and see how it's 841 00:48:27,360 --> 00:48:30,319 Speaker 1: laid out, to fix this for yourself. Yeah, this is 842 00:48:30,360 --> 00:48:32,920 Speaker 1: why it's really important for companies to 843 00:48:33,239 --> 00:48:37,680 Speaker 1: hire white hat hackers, who, I mean, all they 844 00:48:37,719 --> 00:48:40,120 Speaker 1: do is look at systems and try to find ways 845 00:48:40,239 --> 00:48:44,000 Speaker 1: to breach systems so that those systems can be 846 00:48:44,080 --> 00:48:47,279 Speaker 1: improved over time.
And it's important to get a third 847 00:48:47,280 --> 00:48:50,960 Speaker 1: party to do it, because when you design a system, again, 848 00:48:51,040 --> 00:48:53,279 Speaker 1: you may be thinking of the obvious points of entry, 849 00:48:53,440 --> 00:48:57,040 Speaker 1: which is where you've really, really put in great security, 850 00:48:57,440 --> 00:48:59,799 Speaker 1: right? Like, there's no way anyone's gonna 851 00:48:59,840 --> 00:49:01,800 Speaker 1: get through this, at least not in the next five years. 852 00:49:01,920 --> 00:49:05,239 Speaker 1: We require people to use non-alphanumeric characters. Well, 853 00:49:05,280 --> 00:49:07,440 Speaker 1: that's great if they're going to come in through the password. 854 00:49:07,719 --> 00:49:10,960 Speaker 1: What if they kick in the door? Yeah. So again, that's why you want 855 00:49:11,000 --> 00:49:12,960 Speaker 1: to have a third party, because they're not thinking the 856 00:49:13,000 --> 00:49:15,759 Speaker 1: way you think. They're thinking, how do I get into 857 00:49:15,800 --> 00:49:19,600 Speaker 1: this system? Not, how strong do I make this door? 858 00:49:19,920 --> 00:49:22,760 Speaker 1: And that wraps up another classic episode. Hope you guys 859 00:49:22,920 --> 00:49:28,600 Speaker 1: enjoyed this walk down memory lane, and the reminder that 860 00:49:28,760 --> 00:49:33,200 Speaker 1: things can get pretty dicey out there. Though sometimes 861 00:49:33,320 --> 00:49:35,319 Speaker 1: you can find out that the people who attacked you 862 00:49:35,719 --> 00:49:40,520 Speaker 1: aren't really terrible people, but sometimes do questionable things for 863 00:49:41,040 --> 00:49:44,120 Speaker 1: weird motivations. I don't know how much comfort we can 864 00:49:44,160 --> 00:49:46,759 Speaker 1: take in that, but I guess it's something. Anyway, if 865 00:49:46,800 --> 00:49:49,800 Speaker 1: you guys have any suggestions for future episodes of tech Stuff,
866 00:49:50,080 --> 00:49:52,160 Speaker 1: feel free to reach out and let me know. The 867 00:49:52,239 --> 00:49:55,399 Speaker 1: email address is tech Stuff at how stuff works dot com, 868 00:49:55,520 --> 00:49:58,000 Speaker 1: or pop on over to our website; that's tech stuff 869 00:49:58,000 --> 00:50:01,960 Speaker 1: podcast dot com. You will find links to our presence 870 00:50:02,000 --> 00:50:04,960 Speaker 1: on social media over there. You'll also find links to 871 00:50:05,160 --> 00:50:08,120 Speaker 1: all of the archived episodes of tech Stuff, all of 872 00:50:08,160 --> 00:50:13,520 Speaker 1: the episodes that have ever published, obviously not including the 873 00:50:13,640 --> 00:50:16,879 Speaker 1: legendary lost episodes of tech Stuff. And you'll also find 874 00:50:16,880 --> 00:50:19,000 Speaker 1: a link to our online store, where every purchase you 875 00:50:19,040 --> 00:50:21,359 Speaker 1: make goes to help the show. We greatly appreciate it, 876 00:50:21,440 --> 00:50:24,320 Speaker 1: and I will talk to you again really soon. 877 00:50:28,200 --> 00:50:30,400 Speaker 1: Tech Stuff is a production of I Heart Radio's How 878 00:50:30,440 --> 00:50:33,839 Speaker 1: Stuff Works. For more podcasts from I Heart Radio, visit 879 00:50:33,880 --> 00:50:36,960 Speaker 1: the I Heart Radio app, Apple Podcasts, or wherever you 880 00:50:37,000 --> 00:50:38,360 Speaker 1: listen to your favorite shows.