Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? So recently I talked about how the US Department of Justice has filed a civil antitrust lawsuit against the company Live Nation Entertainment, which, among many other things, operates the service Ticketmaster, a service that I would say has fostered a lot of very strong opinions among concertgoers, including yours truly. I have very strong feelings about Ticketmaster. But last Friday night, which was the night of May thirty-first, two thousand twenty-four, for those of y'all listening from the future, Ticketmaster was in the news for a different reason, because the company had been the target of hackers who allegedly stole data belonging to around five hundred sixty million Ticketmaster customers. Now, that data reportedly includes personal information like names, addresses, and phone numbers, as well as purchase history.
Speaker 1: So you know, that means the hackers can check and see if a, you know, very public punk rocker type has secretly been sneaking off to watch Taylor Swift concerts or something. The data also includes some partial credit card information, like the last four digits of customers' credit cards. Ticketmaster slash Live Nation initially kept quiet about this revelation, but then late on Friday confirmed that a data breach did in fact happen. This is a problem for lots of reasons. I mean, anytime there's a data breach, that's a problem. But when you're talking about a data breach affecting hundreds of millions of people, that just spells a massive headache moving forward. And we'll talk a lot about why that is in this episode. But really I thought I would chat about some of the largest data breaches in US history, which is a super happy topic, right? But I thought it was really important to consider how technology that's meant to make systems more efficient and effective can also sometimes provide an opportunity for malicious agents, for hackers, to make off with potentially huge amounts of information. And as we all know, information is valuable.
Speaker 1: I mean, it is the currency of the Internet in many ways, and data breaches are becoming more and more common. The Identity Theft Resource Center reported that in twenty twenty-one, there were one thousand, eight hundred sixty-two data breaches that it was able to identify. In twenty twenty-three, that number was up to three thousand, two hundred five, almost double. However, I feel I should clarify that twenty-five of those incidents were data exposures, two of them were data leaks, and fifty-six were incidents that weren't categorized at all. They're uncategorized; I don't know the nature of them. So that leaves us three thousand, one hundred twenty-two cases of actual data breaches, and the differences between these categories are sometimes subtle and a little gray. As for my source for what constitutes the largest data breaches in the United States, I decided to settle on one source just for the list, right? I went into lots of sources for the details of all these things, but for the list itself I used a blog post on upguard dot com written by Kyle Chin.
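To keep the Identity Theft Resource Center arithmetic straight, here's the subtraction using the figures as read (the variable names are just labels for this sketch, not ITRC terminology):

```python
# ITRC figures for 2023, as cited in the episode
total_2023 = 3205    # total incidents the ITRC identified
exposures = 25       # incidents classified as data exposures
leaks = 2            # incidents classified as data leaks
uncategorized = 56   # incidents with no category at all

# Everything left over counts as an actual data breach
breaches = total_2023 - exposures - leaks - uncategorized
print(breaches)  # 3122
```

Even after setting those other categories aside, the 2023 figure still lands well above the 1,862 incidents identified in 2021.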
Speaker 1: Now, Kyle Chin lists twenty-six cases of data breaches, and the Ticketmaster case isn't among them; the post hasn't been updated since the Ticketmaster issue. Arguably, Ticketmaster should be in any list about large data breaches in the United States, because this was a big one. I imagine when the dust settles, it could end up on that list. Now, in Chin's definition, the biggest isn't just about how many records were part of a data breach. Like, that's not the only factor that determines whether or not a breach merits consideration. The nature of the information and the impact the breach had also factor into how it falls on the list. And twenty-six cases is way too many for a podcast episode, or even, you know, two of them. So I'm just gonna go with the top ten, and even that's gonna require me to break this into two episodes, and I'm gonna work backward to add to the drama. By the way, Kyle Chin at no point says that this is a ranked list, so you could argue I'm just giving you ten random large data breach stories out of a list of twenty-six, and that's a legitimate criticism.
Speaker 1: But a guy's got to start somewhere, right? Anyway, I'm doing this as a list because I've watched a lot of Jenny Nicholson's older YouTube videos recently, and I absolutely love how she turns everything into, quote, an internet-friendly numbered list, end quote. I think that's very funny. I mean, Red Letter Media did the same thing with the Plinkett reviews, with all the different parts, although that was somewhat necessitated by the fact that in the early days, when they were posting those super long reviews, YouTube videos were limited to ten minutes each. So they would upload like a nine-part series to take down the Star Wars Episode One critique or whatever. Anyway, I've decided to go backward in order to increase the drama. So we're gonna start with number ten, which is FriendFinder Networks. And this one's a doozy. So FriendFinder Networks deals with products and services that include some that are not suitable for a family-friendly podcast. I will use some euphemisms, but they include stuff like adult entertainment, webcam sites, that kind of thing. That's part of what FriendFinder Networks operates.
Speaker 1: The adult magazine company Penthouse bought FriendFinder in early twenty sixteen. And interestingly, the company operates several dating services, including one intended to help people find someone with whom to have casual sexual encounters, that being AdultFriendFinder, on one end of the spectrum. And on the other end of the spectrum, they have a dating service for devout Christians. So I guess it's a company that really does believe in equal opportunity to make money off of various audiences. Anyway, as a company with businesses in the adult entertainment sphere, social networks, and also dating services, FriendFinder Networks has access to a lot of sensitive user information, info that customers absolutely would prefer remain private, or at least under their own control.
Speaker 1: So it was a bit of a shock in late twenty sixteen when news broke that hackers stole data from the company that stretched back two decades. Like, there was information in there that was twenty years old, and it even included information belonging to people who had long since deleted their accounts with FriendFinder Networks. Their information remained on company servers despite the fact that they had deleted their accounts. That seems like a very bad data retention policy, right? To retain information about people who had subsequently deleted their accounts with you, that's a real problem. So the method that these hackers used relied on LFI, which stands for local file inclusion, and the name sort of explains how this works. The hacker essentially injects malicious directions into a system, and they do this usually by incorporating those directions into a file, for example, a multimedia file.
Speaker 1: This multimedia file might contain basic directory commands within the file itself, so essentially it tells the system, hey, execute these commands in this order. And if the server isn't protected against such relatively simple attacks, if I'm being honest, then the code can prompt the web server to handle the file improperly and give backdoor access to a hacker, which is in fact what happened in this case. The hackers got access to information stored on the affected servers, and there were six databases in total that were affected, six massive databases, and the take was huge. The hackers made off with information relating to more than four hundred twelve million customer accounts. The information included email addresses, including some belonging to government and military users, transaction history, and account passwords. Some of these passwords, at least, were encrypted, but they used a really primitive hash to do it, an outdated method that was no longer considered secure. So that was a big problem.
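To make the class of flaw concrete, here's a minimal sketch (not FriendFinder's actual code, and the directory name is made up) of the path-handling mistake that local file inclusion attacks exploit: the server builds a filesystem path from untrusted input, so a crafted name can walk out of the intended directory.

```python
import os

BASE_DIR = "/srv/media"  # hypothetical directory of user-facing files

def resolve_unsafe(name: str) -> str:
    # Vulnerable: trusts user input, so a name like "../../etc/passwd"
    # walks right out of BASE_DIR and points the server at an
    # arbitrary local file.
    return os.path.join(BASE_DIR, name)

def resolve_safe(name: str) -> str:
    # Resolve symlinks and ".." segments first, then refuse any path
    # that lands outside BASE_DIR.
    base = os.path.realpath(BASE_DIR)
    path = os.path.realpath(os.path.join(base, name))
    if os.path.commonpath([base, path]) != base:
        raise ValueError("requested file escapes the allowed directory")
    return path
```

The unsafe version happily hands back a traversal path, while the safe version rejects it after normalization. Real LFI chains often pair this kind of flaw with an uploaded file, which matches the multimedia-file angle described above.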
Speaker 1: More than three hundred million of the accounts came from AdultFriendFinder, and more than sixty million came from a webcam site. And I'm sure a lot of customers got really nervous about this. I mean, the taboo nature of these sites and services meant a lot of people were probably sweating over their past activities and hoping they wouldn't be exposed. Now, keep in mind that one year earlier, in the summer of twenty fifteen, hackers compromised around thirty-two million accounts from the company Ashley Madison. Ashley Madison was built around the idea of a dating service that would let married people secretly find potential partners in order to have an affair.
Speaker 1: There was this sense that some sort of hacker anarchist was going to reveal salacious details about folks in the wake of these attacks, or that, at the very least, they would make these details available so that anyone who really wanted to sift through all the stolen information could dig up whether or not, you know, the neighbor down the street was secretly trying to sneak around behind their partner's back or whatever, or the sexual orientation of people you knew. You could find that kind of information out based upon the stuff that had been stolen in these sorts of attacks. And depending on where you are, that kind of thing can have deadly consequences. So the information involved with this data breach was extremely sensitive, particularly from a social perspective. I mean, you're not likely to come forward and say "I was the victim of identity theft" if it also means you have to cop to something that is socially taboo. Like, there's just a lot of pressure on you to not come forward, the idea being that coming forward is actually worse than someone taking advantage of the information they have on you.
Speaker 1: So, while this hack didn't include stuff like credit card information, just the fact that names were appearing on these customer lists was a huge problem. It could give other hackers the opportunity to engage in blackmail or spear phishing and target people based on what was revealed in their FriendFinder data. And that's a real issue that's going to come up again and again in these episodes, that idea of, yeah, the data might not include, say, your credit card number, but that's not really the concern here. The concern is, how can someone use your information to victimize you in various ways? And one of those ways is spear phishing. So what did FriendFinder Networks do in response to this? Sadly, the answer was: not much. While security researchers alerted the public that they had detected a vulnerability in the FriendFinder Networks system, the company did not acknowledge the data breach for a full week, and only then began to send out notifications to customers. And the company didn't have any really helpful advice for those customers either, beyond saying that people should change their passwords.
Speaker 1: Now, according to idstrong dot com, the company had lax password requirements in the first place. Passwords weren't even case sensitive, for example, and they didn't update this, so their password protocols were still not really at an industry standard. And here's the real kicker: the company had also been breached in twenty fifteen. Now, the twenty fifteen breach, because remember, the one we're talking about is really twenty sixteen, but the twenty fifteen breach was much smaller in scope. Only three and a half million users were affected. That's still a lot of people, but it's nowhere close to four hundred twelve million. But the types of information that were stolen included things like partial payment information. And at least in some of the research I was doing, some sources said that the types of info stolen in the twenty sixteen attack did not include things like sexual orientation or preferences or that kind of thing. Other sources said, no, that was part of the twenty sixteen hack as well.
Speaker 1: So I don't know what the full extent was, but a lot of the analysis I've looked at about this particular breach points out that the company failed to act properly in the wake of the twenty fifteen breach, which meant it was essentially set up for the much larger attack in twenty sixteen. So that's a pretty damning allegation there, right? That a company had already been the victim of a massive data breach and then failed to mount an adequate response in order to prevent an even larger data breach the following year. So again, just having the basics of your information leaked would be a huge problem given the nature of this company. And despite the company's arguably lackluster response to the breach, customers kept on being customers. I guess the company never had to learn a lesson, because there really weren't massive consequences. And again, maybe this is partly because of the nature of the services themselves, right? Like, for a customer to put up a big fuss, they would also have to reveal themselves to be a customer in the first place, and then the social taboo kicks in again.
Speaker 1: But unlike some other companies that we're going to talk about in this episode, FriendFinder Networks didn't see serious setbacks as a result of this attack. Okay, we just got through one, and we've got lots more to go, so let's take a quick break to thank our sponsors, and we'll be right back. Okay, we're moving on to number nine on our list, and this one is a real blast from the past: it's MySpace. This attack technically happened in twenty thirteen, but it wasn't discovered and reported until twenty sixteen, and even twenty thirteen was late in the game for MySpace. Now, MySpace was once the king of social networking platforms, but it had been losing ground to Facebook since two thousand nine. News Corp, which had purchased MySpace for a whopping five hundred eighty million dollars in two thousand five, ended up selling the company off to Justin Timberlake and a company called Specific Media in twenty eleven for thirty-five million dollars. So again, they purchased it for five hundred eighty million and then six years later sold it for thirty-five million.
Speaker 1: Not a good deal. By twenty sixteen, Time Incorporated purchased Specific Media, and then Meredith Corporation acquired Time Incorporated, because there's always a bigger fish. That story gets more and more complicated too, but we're going to leave that here. My point is that MySpace had already experienced a dramatic decline in relevance by twenty thirteen, when the attack actually happened, but still, the site had millions of user records, and a hacker was able to get access to them, like three hundred sixty million records. The data lifted during the breach included email addresses, user names, and passwords, which were encrypted using, again, an outdated method that security experts therefore considered insecure, and that was a real issue. Now, looking back on this hack today, there's a disturbing lack of information as to how it actually happened. It went undiscovered for nearly three years and only really came to light when folks realized that data from the breach was popping up for sale on black market sites on the dark web.
Speaker 1: As for who was responsible and the vulnerabilities they exploited, that remains something of a mystery. MySpace responded to this news by invalidating all the passwords of all the affected accounts, which would require users to set up new passwords, and it also encouraged people who weren't directly impacted to go ahead and update their passwords as well, in an overabundance of caution. Like the FriendFinder breach, there wasn't much a user could do to protect themselves from the hackers. In fact, I would argue there was nothing a user could do. It wouldn't matter if they had used a strong or a weak password, because the real issue was that MySpace was using a very weak hashing method to protect passwords in the first place. So even if you picked a very strong password, if it's being stored under a hash that can easily be broken, then hackers can just get to your password anyway. It doesn't matter how strong it was. You did your part; MySpace failed, is what I'm saying. Now, all that being said, I do still urge everyone to use unique, strong passwords for all their sites and services.
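The weak-hashing problem described above can be sketched in a few lines. This is an illustration of the general technique, not MySpace's actual code: a single fast, unsalted hash means identical passwords always produce identical digests, so precomputed lookup tables crack them almost instantly, whereas a per-user salt plus a deliberately slow key-derivation function (PBKDF2, here from Python's standard library) blunts both shortcuts.

```python
import hashlib
import hmac
import os
from typing import Optional, Tuple

def weak_hash(password: str) -> str:
    # The outdated approach: one fast, unsalted SHA-1 digest.
    # Same password in, same digest out, for every user.
    return hashlib.sha1(password.encode()).hexdigest()

def strong_hash(password: str, salt: Optional[bytes] = None,
                iterations: int = 600_000) -> Tuple[bytes, bytes]:
    # A unique random salt per user defeats precomputed tables;
    # the iteration count makes each guess expensive.
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify(password: str, salt: bytes, expected: bytes,
           iterations: int = 600_000) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(candidate, expected)
```

With the weak scheme, two users who both chose "password" share one digest, and cracking it once cracks every account. With the salted scheme, even identical passwords produce different stored values.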
Speaker 1: Unique is really important, because if you're using the same password everywhere, it just takes one data breach to compromise all of your stuff. If they have your email and whatever password you use for that one, like, obscure website, and it happens to be the same password you use for, say, your bank, that's bad news for you. Use unique passwords. Get a password vault of some sort, a good one; research this and find one that really works for you, and make unique, strong passwords for each of the sites you go to so that you can avoid this issue. Because data breaches, sadly, are not uncommon. They're getting more common every year, and this will help protect other elements of your online presence from hackers. Sadly, there's not very much you can do to protect the systems themselves. I mean, that's in the control of whatever platform you're using. And I'm not telling you not to use platforms, goodness, no, I use tons of them. Just be as careful as you can be to mitigate any issues that might pop up due to data breaches.
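A password vault does this for you, but as a minimal sketch of what "unique and strong" means in practice, here's how a random password can be generated with Python's standard library. The character set and the twenty-character length are arbitrary choices for illustration, not any particular standard.

```python
import secrets
import string

# Letters, digits, and punctuation give a large search space per character.
ALPHABET = string.ascii_letters + string.digits + string.punctuation

def generate_password(length: int = 20) -> str:
    # `secrets` draws from the OS's cryptographic random source,
    # unlike the predictable `random` module, so the result is
    # suitable for security use.
    return "".join(secrets.choice(ALPHABET) for _ in range(length))
```

Generating a fresh one of these per site is exactly the habit that keeps one breached service from compromising every other account you own.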
Speaker 1: Also, you know, enable multi-factor authentication if that's available. If it's on there, use it. Again, nothing is absolutely foolproof. I'm not here to tell you that if you have multi-factor authentication you'll never get hacked; that's not necessarily true. But the more precautions you take, the better. The harder you make yourself as a target, the more effort it takes to actually crack your security, and the less likely someone's going to actually pursue that. It's not impossible, but like, why struggle when you can go for all the low-hanging fruit? Don't be low-hanging fruit. Still, if hackers are breaching a company's systems, we're really left to the competence of that company when it comes to personal security. So our first two entries on this list are both web-based companies, right? We had MySpace and we had FriendFinder Networks. But up next is a company known for its brick-and-mortar operations, and I'm talking about Home Depot, which experienced a massive data breach in April twenty fourteen.
This was an attack that compromised more than 320 00:19:28,960 --> 00:19:35,040 Speaker 1: fifty million customers' data, including their credit or debit card information, 321 00:19:35,600 --> 00:19:39,959 Speaker 1: lifting that information right from inside the stores themselves. And 322 00:19:40,000 --> 00:19:42,639 Speaker 1: this attack went unnoticed until the hackers started putting the 323 00:19:42,640 --> 00:19:45,240 Speaker 1: credit card info up for sale on the dark web, 324 00:19:45,440 --> 00:19:48,680 Speaker 1: at which point Home Depot was made aware that they 325 00:19:48,760 --> 00:19:52,399 Speaker 1: had been breached. So let's walk through how this attack happened. So, 326 00:19:52,480 --> 00:19:56,240 Speaker 1: according to the US Office of the Director of National Intelligence, 327 00:19:56,480 --> 00:20:01,000 Speaker 1: the hackers first secured, quote, credentials, user names, and passwords 328 00:20:01,280 --> 00:20:04,720 Speaker 1: from a third-party vendor, end quote, and that gave 329 00:20:04,760 --> 00:20:08,840 Speaker 1: them the foothold into Home Depot's computer network. So first 330 00:20:09,040 --> 00:20:12,000 Speaker 1: they identified a company that worked with Home Depot. They 331 00:20:12,000 --> 00:20:15,640 Speaker 1: were able to secure a username and password from this company. 332 00:20:15,880 --> 00:20:19,719 Speaker 1: They used that to infiltrate Home Depot's computer network. On 333 00:20:19,800 --> 00:20:25,200 Speaker 1: top of that, they then were able to essentially take 334 00:20:25,200 --> 00:20:29,720 Speaker 1: advantage of a zero day vulnerability that was within Microsoft Windows.
335 00:20:29,840 --> 00:20:32,399 Speaker 1: So a zero day vulnerability is a fancy way of 336 00:20:32,440 --> 00:20:36,080 Speaker 1: saying that the entity responsible for making whatever the thing 337 00:20:36,240 --> 00:20:39,800 Speaker 1: is, so in this case Microsoft with Windows, is unaware that 338 00:20:39,800 --> 00:20:43,720 Speaker 1: the vulnerability even exists. And because they're unaware that there 339 00:20:43,800 --> 00:20:47,040 Speaker 1: is a vulnerability, there's no means to prevent or mitigate 340 00:20:47,160 --> 00:20:51,920 Speaker 1: attacks that leverage or exploit this vulnerability. Zero day vulnerabilities 341 00:20:51,960 --> 00:20:56,159 Speaker 1: are incredibly valuable in the hacker community because there's no 342 00:20:56,280 --> 00:20:59,879 Speaker 1: real defense against them, and if you're very careful, you 343 00:21:00,560 --> 00:21:04,000 Speaker 1: have the chance to continue to exploit these kinds of 344 00:21:04,040 --> 00:21:07,680 Speaker 1: vulnerabilities for a while before anyone notices. So it's called 345 00:21:07,800 --> 00:21:10,439 Speaker 1: zero day because that's how much time the, you know, 346 00:21:10,480 --> 00:21:14,320 Speaker 1: the entity, Microsoft in this case, has had to fix the flaw before malicious agents 347 00:21:14,359 --> 00:21:18,160 Speaker 1: are able to exploit that vulnerability. So the hackers exploited 348 00:21:18,320 --> 00:21:22,600 Speaker 1: Microsoft Windows, and exploring Home Depot's systems they were 349 00:21:22,600 --> 00:21:27,080 Speaker 1: able to identify thousands, like seven thousand five hundred, point of 350 00:21:27,200 --> 00:21:31,840 Speaker 1: sale systems in self checkout lanes at physical Home Depot stores. 351 00:21:31,880 --> 00:21:35,359 Speaker 1: So again, this was not targeting the online point of 352 00:21:35,400 --> 00:21:38,800 Speaker 1: sale operations for Home Depot.
You know, the website commerce 353 00:22:39,040 --> 00:22:41,720 Speaker 1: part of Home Depot was not part of this attack, 354 00:22:42,000 --> 00:22:43,960 Speaker 1: and I just think that's good to point out because 355 00:22:44,160 --> 00:22:46,480 Speaker 1: I don't think it's as common now. But I remember 356 00:22:46,560 --> 00:22:51,000 Speaker 1: when online commerce first became a thing, people were scared 357 00:22:51,520 --> 00:22:54,639 Speaker 1: to buy stuff off the internet. They were reluctant to 358 00:22:54,760 --> 00:22:57,439 Speaker 1: use their credit card to purchase something online because they 359 00:22:57,480 --> 00:23:01,040 Speaker 1: were worried about security, which is understandable, but it turns 360 00:23:01,080 --> 00:23:04,000 Speaker 1: out that going to a brick and mortar store is 361 00:23:04,000 --> 00:23:08,320 Speaker 1: not necessarily more secure, because those systems are also connected 362 00:23:08,359 --> 00:23:11,480 Speaker 1: to networks that ultimately get connected to the Internet, and 363 00:23:11,560 --> 00:23:15,159 Speaker 1: so if you're able to compromise those networks, then you 364 00:23:15,160 --> 00:23:18,320 Speaker 1: can still tap into that kind of system. So the 365 00:23:18,359 --> 00:23:22,400 Speaker 1: hackers deployed custom built malware for these point of sale systems, 366 00:23:22,480 --> 00:23:25,040 Speaker 1: and they used this malware to record the credit and 367 00:23:25,119 --> 00:23:28,360 Speaker 1: debit card information of Home Depot customers.
They even made 368 00:22:28,400 --> 00:22:32,360 Speaker 1: sure that they transmitted that data during Home Depot's business 369 00:22:32,400 --> 00:22:36,080 Speaker 1: hours so that the company's security team wouldn't notice, like, 370 00:22:36,760 --> 00:22:39,399 Speaker 1: a transmission at an odd hour. Like, if it was 371 00:22:39,440 --> 00:22:42,120 Speaker 1: two in the morning, then the security team would be saying, like, hey, 372 00:22:42,160 --> 00:22:45,240 Speaker 1: why is our system sending info out at this hour? 373 00:22:45,600 --> 00:22:47,400 Speaker 1: That could be a tip off. So they made sure 374 00:22:47,440 --> 00:22:51,479 Speaker 1: that all those transmissions happened during normal business operating hours, 375 00:22:51,760 --> 00:22:55,280 Speaker 1: and that would kind of mask these on top of 376 00:22:55,359 --> 00:23:00,080 Speaker 1: all the legitimate transmissions. Cybersecurity experts criticized Home Depot 377 00:23:00,320 --> 00:23:03,960 Speaker 1: for having insufficient security measures in place. The company estimated 378 00:23:03,960 --> 00:23:07,359 Speaker 1: that it spent nearly one hundred and eighty million dollars in 379 00:23:07,400 --> 00:23:09,560 Speaker 1: the wake of this attack to pay off all the 380 00:23:09,680 --> 00:23:12,280 Speaker 1: various costs. On top of that, there was a class 381 00:23:12,320 --> 00:23:15,840 Speaker 1: action lawsuit from across forty six states that ended with 382 00:23:15,920 --> 00:23:18,920 Speaker 1: Home Depot settling out of court for seventeen point five 383 00:23:19,040 --> 00:23:24,399 Speaker 1: million dollars. Now, Home Depot didn't admit, you know, responsibility 384 00:23:24,680 --> 00:23:27,720 Speaker 1: for this, but it did promise to invest in security measures, 385 00:23:27,720 --> 00:23:30,880 Speaker 1: including hiring a chief information security officer.
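That timing trick is worth dwelling on, because it defeats one of the simplest anomaly checks a security team can run. A toy version of that check might look like this; the operating hours and the log entries are invented for illustration.

```python
from datetime import datetime

OPEN_HOUR, CLOSE_HOUR = 6, 22  # assumed store operating hours

def is_suspicious(timestamp):
    """Flag outbound transfers that happen outside business hours."""
    return not (OPEN_HOUR <= timestamp.hour < CLOSE_HOUR)

transfers = [
    datetime(2014, 9, 2, 2, 15),   # 2:15 a.m. -- would stand out
    datetime(2014, 9, 2, 14, 30),  # mid-afternoon -- blends in
]
for t in transfers:
    print(t, "suspicious" if is_suspicious(t) else "looks normal")
```

The Home Depot attackers beat exactly this kind of rule by exfiltrating only when the check would read "looks normal," which is why defenders also watch volume, destinations, and payload, not just timing.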
Now, as 386 00:23:30,920 --> 00:23:33,439 Speaker 1: for that seventeen point five million dollar settlement, I just 387 00:23:33,480 --> 00:23:36,000 Speaker 1: want to put that into context so that we can 388 00:23:36,080 --> 00:23:39,800 Speaker 1: kind of appreciate what that means or doesn't mean. Keep 389 00:23:39,840 --> 00:23:43,520 Speaker 1: in mind, around fifty six million customers were affected by 390 00:23:43,520 --> 00:23:46,080 Speaker 1: this data breach. So if you were to include all 391 00:23:46,119 --> 00:23:48,840 Speaker 1: of them in the class action lawsuit, which is obviously not 392 00:23:48,880 --> 00:23:50,840 Speaker 1: realistic but, you know, we're just doing this as a 393 00:23:50,880 --> 00:23:54,520 Speaker 1: thought experiment, then that would mean each person would receive 394 00:23:54,600 --> 00:23:58,800 Speaker 1: the princely sum of thirty one cents. That's only if 395 00:23:58,840 --> 00:24:02,320 Speaker 1: the various lawyers of all the different states did this case 396 00:24:02,520 --> 00:24:05,480 Speaker 1: gratis, for free. So what I'm saying is that while 397 00:24:05,480 --> 00:24:07,200 Speaker 1: Home Depot may have had to spend a lot of 398 00:24:07,280 --> 00:24:09,960 Speaker 1: money to deal with the aftermath of this breach, the 399 00:24:10,040 --> 00:24:13,240 Speaker 1: settlement I think was a case of getting off lightly 400 00:24:13,440 --> 00:24:16,679 Speaker 1: considering the nature of that breach. But I also have 401 00:24:16,760 --> 00:24:19,879 Speaker 1: to remind myself that ultimately the real criminals here are 402 00:24:19,920 --> 00:24:22,600 Speaker 1: the hackers who pulled off the attack and the folks 403 00:24:22,640 --> 00:24:25,159 Speaker 1: on the dark web who purchased the credit and debit 404 00:24:25,200 --> 00:24:27,959 Speaker 1: card information. Those are the real criminals.
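That back-of-the-envelope math is easy to check: seventeen and a half million dollars spread over roughly fifty-six million people.

```python
settlement = 17_500_000  # Home Depot's multistate settlement, in dollars
affected = 56_000_000    # approximate number of affected customers

cents_each = settlement * 100 // affected  # whole cents per person
print(f"{cents_each} cents per affected customer")  # 31 cents
```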
While I can 405 00:24:28,000 --> 00:24:33,720 Speaker 1: be disappointed in Home Depot's lack of security, or lackluster security, 406 00:24:33,720 --> 00:24:36,400 Speaker 1: in this case, I don't want to blame the victim. 407 00:24:36,840 --> 00:24:39,240 Speaker 1: Like, I do think that there is a responsibility there, 408 00:24:39,280 --> 00:24:42,840 Speaker 1: but the real villains are the people who did the stealing. 409 00:24:43,119 --> 00:24:45,560 Speaker 1: It's just easy to blame big companies as well 410 00:24:45,560 --> 00:24:48,240 Speaker 1: when they fail to be good stewards of customer information. 411 00:24:48,840 --> 00:24:51,879 Speaker 1: So next up on Chen's list of massive data breaches 412 00:24:51,880 --> 00:24:53,720 Speaker 1: here in the United States is another one that happened 413 00:24:53,720 --> 00:24:58,080 Speaker 1: in twenty fourteen. This attack targeted the bank JP Morgan 414 00:24:58,240 --> 00:25:01,960 Speaker 1: Chase and it impacted around eighty three million bank customers. 415 00:25:02,160 --> 00:25:05,000 Speaker 1: Seventy six million of those were households and the other 416 00:25:05,040 --> 00:25:10,640 Speaker 1: seven million were small businesses. This attack also reportedly leveraged 417 00:25:10,680 --> 00:25:13,280 Speaker 1: a zero day vulnerability, but in this case, it was 418 00:25:13,280 --> 00:25:17,479 Speaker 1: a vulnerability in JP Morgan Chase's web applications, so this 419 00:25:17,640 --> 00:25:21,399 Speaker 1: gave the hackers the foothold to access kind of a 420 00:25:21,440 --> 00:25:25,800 Speaker 1: directory level of server information for JP Morgan Chase.
This 421 00:25:25,880 --> 00:25:31,040 Speaker 1: then let the hackers identify databases containing customer information. Now, 422 00:25:31,600 --> 00:25:34,960 Speaker 1: one source I looked at suggested the information included financial 423 00:25:35,040 --> 00:25:37,479 Speaker 1: data like credit card information, but that was just in 424 00:25:37,560 --> 00:25:41,120 Speaker 1: one source, and every other source, including The New York Times, 425 00:25:41,359 --> 00:25:43,919 Speaker 1: says that was not the case. So I feel pretty 426 00:25:43,960 --> 00:25:47,960 Speaker 1: confident that that one source was an outlier and had 427 00:25:48,000 --> 00:25:50,359 Speaker 1: some misinformation in it. I mean, that's a flag for 428 00:25:50,440 --> 00:25:52,359 Speaker 1: all of y'all out there. So it's always good to 429 00:25:52,560 --> 00:25:56,320 Speaker 1: double check things and check multiple sources. Sometimes it can 430 00:25:56,400 --> 00:26:00,800 Speaker 1: be really difficult to determine what reality is based on 431 00:26:01,440 --> 00:26:06,360 Speaker 1: the reporting of various sources. Sometimes even reputable sources get 432 00:26:06,400 --> 00:26:10,720 Speaker 1: things wrong. So you know, thinking critically involves a lot 433 00:26:10,760 --> 00:26:13,960 Speaker 1: of checking and double checking, and sometimes it involves making 434 00:26:14,240 --> 00:26:16,960 Speaker 1: an educated guess as to what is most likely to 435 00:26:16,960 --> 00:26:18,879 Speaker 1: be real. So in this case, I think it's most 436 00:26:18,920 --> 00:26:22,080 Speaker 1: likely that the information that was stolen was personal information 437 00:26:22,320 --> 00:26:25,600 Speaker 1: but not financial information.
So the attackers got access to 438 00:26:25,720 --> 00:26:28,640 Speaker 1: things like names, email addresses, that kind of thing, which 439 00:26:28,680 --> 00:26:31,640 Speaker 1: again doesn't sound like it's as critical as credit card information, 440 00:26:31,880 --> 00:26:34,840 Speaker 1: but it's still really useful data if, for example, you 441 00:26:34,880 --> 00:26:37,800 Speaker 1: want to create a spear phishing campaign and trick people 442 00:26:37,840 --> 00:26:41,000 Speaker 1: into making mistakes. Like, if you know they are customers 443 00:26:41,040 --> 00:26:44,320 Speaker 1: of this particular bank, and you know what their email 444 00:26:44,359 --> 00:26:47,240 Speaker 1: address is, and you know their actual name, you can 445 00:26:47,320 --> 00:26:50,919 Speaker 1: craft an attack targeting that person that appears to be 446 00:26:51,000 --> 00:26:54,760 Speaker 1: coming from the legitimate business and potentially take advantage of 447 00:26:54,800 --> 00:26:58,119 Speaker 1: them that way. So the hackers then developed attacks for 448 00:26:58,240 --> 00:27:02,080 Speaker 1: these servers they had identified, and they ultimately infiltrated around 449 00:27:02,240 --> 00:27:07,080 Speaker 1: ninety servers within the business. The attackers had started back 450 00:27:07,080 --> 00:27:10,959 Speaker 1: in June twenty fourteen. JP Morgan Chase would detect the 451 00:27:10,960 --> 00:27:14,480 Speaker 1: intrusion a month later in July. The public, however, would 452 00:27:14,480 --> 00:27:17,800 Speaker 1: not find out about it until September, when the company 453 00:27:17,800 --> 00:27:20,960 Speaker 1: disclosed the attack in a securities filing and various media 454 00:27:21,000 --> 00:27:24,520 Speaker 1: outlets reported on it.
Now, consider that other major breaches, 455 00:27:24,600 --> 00:27:27,520 Speaker 1: like the aforementioned Home Depot attack and another one 456 00:27:27,520 --> 00:27:30,760 Speaker 1: that hit Target, were fresh in the minds 457 00:27:30,760 --> 00:27:32,920 Speaker 1: of consumers because they were national news here in the 458 00:27:33,000 --> 00:27:36,359 Speaker 1: United States. The JP Morgan Chase attack was a huge 459 00:27:36,400 --> 00:27:41,480 Speaker 1: blow because it revealed that even massive financial institutions, which 460 00:27:41,560 --> 00:27:45,520 Speaker 1: had good reputations for being really secure, could also fall 461 00:27:45,640 --> 00:27:49,520 Speaker 1: victim to hacker intrusions, which became a brand-new source 462 00:27:49,560 --> 00:27:53,560 Speaker 1: of anxiety for American consumers. And as for the attackers 463 00:27:53,560 --> 00:27:57,280 Speaker 1: in this case, there were four identified, arguably five. The 464 00:27:57,320 --> 00:28:00,199 Speaker 1: fifth one, however, was kind of after the fact, but 465 00:28:00,240 --> 00:28:04,280 Speaker 1: the main four included a Russian citizen named Andrei Tyurin. 466 00:28:04,560 --> 00:28:10,240 Speaker 1: There was an American named Joshua Samuel Aaron, aka Mike Shields. 467 00:28:10,560 --> 00:28:13,080 Speaker 1: That's the alias he would use in some of his 468 00:28:13,200 --> 00:28:16,560 Speaker 1: nefarious activities, according to authorities. And then there were two 469 00:28:16,640 --> 00:28:23,840 Speaker 1: Israeli citizens. There was Gery Shalon, aka Garri Shalelashvili. 470 00:28:24,160 --> 00:28:29,200 Speaker 1: I know I mangled that name. Aka Gabriel, aka Gabi, 471 00:28:29,520 --> 00:28:36,199 Speaker 1: aka Phillipe Mousset, aka Christopher Engeham. Lots of aliases for 472 00:28:36,359 --> 00:28:40,800 Speaker 1: Gery Shalon.
And then finally there was Ziv Orenstein, aka 473 00:28:41,080 --> 00:28:46,280 Speaker 1: Aviv Stein, aka John Avery. So for four people, that's 474 00:28:46,320 --> 00:28:48,720 Speaker 1: a lot of different names, right? Well, these four hackers 475 00:28:48,720 --> 00:28:52,320 Speaker 1: were linked to numerous crimes, not just the JP Morgan 476 00:28:52,520 --> 00:28:55,360 Speaker 1: Chase instance. There were other ones as well, and they 477 00:28:55,360 --> 00:28:59,240 Speaker 1: were also operators, I believe, of an online casino or something 478 00:28:59,280 --> 00:29:01,400 Speaker 1: along those lines. Anyway, at least one of them, that 479 00:29:01,560 --> 00:29:05,680 Speaker 1: being Gery Shalon, was released early. He secured an early 480 00:29:05,720 --> 00:29:08,680 Speaker 1: release after agreeing to a plea deal that had him 481 00:29:08,720 --> 00:29:13,320 Speaker 1: pay a whopping four hundred three million dollar fine. Now, 482 00:29:13,320 --> 00:29:15,840 Speaker 1: if you can afford to pay a four hundred three 483 00:29:15,920 --> 00:29:18,400 Speaker 1: million dollar fine to get out of the pokey, I mean, 484 00:29:18,440 --> 00:29:21,280 Speaker 1: I guess crime really does pay. Other folks connected to 485 00:29:21,320 --> 00:29:23,920 Speaker 1: the scheme were not so fortunate. So for example, Andrei 486 00:29:24,000 --> 00:29:27,080 Speaker 1: Tyurin received a twelve year sentence at the end of 487 00:29:27,080 --> 00:29:30,400 Speaker 1: his trial.
So I guess it's you know who you know, 488 00:29:30,880 --> 00:29:33,080 Speaker 1: and who you know needs to be a whole lot 489 00:29:33,080 --> 00:29:36,680 Speaker 1: of Benjamin Franklins. JP Morgan Chase pledged to beef up 490 00:29:36,680 --> 00:29:40,000 Speaker 1: the company's security and would double the investment within five 491 00:29:40,080 --> 00:29:42,240 Speaker 1: years from two hundred and fifty million a year to 492 00:29:42,480 --> 00:29:46,080 Speaker 1: five hundred million a year. So that's good. Okay, got 493 00:29:46,080 --> 00:29:48,000 Speaker 1: a couple more I want to talk about before we 494 00:29:48,040 --> 00:29:51,280 Speaker 1: wrap up part one, I guess, of our top ten 495 00:29:51,960 --> 00:29:55,480 Speaker 1: largest data breaches in US history. But first let's take 496 00:29:55,480 --> 00:30:08,040 Speaker 1: another quick break to thank our sponsors. We are up 497 00:30:08,120 --> 00:30:11,640 Speaker 1: to number six on our list of biggest data breaches 498 00:30:11,680 --> 00:30:16,560 Speaker 1: in US history. And that would be LinkedIn. Uh, LinkedIn, 499 00:30:16,680 --> 00:30:20,200 Speaker 1: that social network site that I almost never log into. 500 00:30:20,760 --> 00:30:24,560 Speaker 1: If I were a savvy mover and shaker, I would 501 00:30:24,600 --> 00:30:28,479 Speaker 1: make way better use of LinkedIn, but I'm not, and 502 00:30:28,560 --> 00:30:31,280 Speaker 1: so I post to my account once every blue moon, 503 00:30:31,600 --> 00:30:33,640 Speaker 1: and I keep thinking, man, I need to make better 504 00:30:33,720 --> 00:30:36,080 Speaker 1: use of this resource and really network with people. That 505 00:30:36,120 --> 00:30:38,800 Speaker 1: could be so helpful. But I've got only so much 506 00:30:39,720 --> 00:30:43,120 Speaker 1: emotional energy for things like social networks. And I still 507 00:30:43,440 --> 00:30:46,360 Speaker 1: have a LinkedIn account. I just don't use it very much.
However, 508 00:30:46,480 --> 00:30:49,840 Speaker 1: because I have a LinkedIn account, this next story affects 509 00:30:49,840 --> 00:30:52,840 Speaker 1: me whether I pop on there regularly or not. This 510 00:30:52,960 --> 00:30:56,240 Speaker 1: data breach is quite a bit different from the ones 511 00:30:56,280 --> 00:30:58,920 Speaker 1: we've talked about so far because this one did not 512 00:30:59,120 --> 00:31:04,480 Speaker 1: involve a hacker gaining access to LinkedIn's internal systems. There 513 00:31:04,520 --> 00:31:08,600 Speaker 1: was no security intrusion in this case. Instead, the hacker, 514 00:31:09,200 --> 00:31:12,120 Speaker 1: or at least someone who's believed to be a hacker, was 515 00:31:12,200 --> 00:31:16,880 Speaker 1: using the handle TomLiner. But TomLiner could be a middleman; 516 00:31:17,080 --> 00:31:20,240 Speaker 1: like, he or she or they might 517 00:31:20,280 --> 00:31:22,960 Speaker 1: not have been the person responsible for the actual hack, 518 00:31:23,000 --> 00:31:25,280 Speaker 1: but they did get access to at least some of 519 00:31:25,320 --> 00:31:29,200 Speaker 1: the data. Anyway, the quote unquote hacker simply used tools 520 00:31:29,280 --> 00:31:33,360 Speaker 1: to scrape data off public profiles on LinkedIn. A ton 521 00:31:33,400 --> 00:31:36,960 Speaker 1: of public profiles, like more than ninety percent of the 522 00:31:37,040 --> 00:31:40,479 Speaker 1: public profiles on LinkedIn. That would be around seven hundred 523 00:31:40,720 --> 00:31:45,160 Speaker 1: million profiles. And here's the crazy thing.
Earlier that same year, 524 00:31:45,640 --> 00:31:50,280 Speaker 1: the same person claimed responsibility for leaking five hundred million 525 00:31:50,360 --> 00:31:53,560 Speaker 1: LinkedIn records, so this was like the second time in 526 00:31:53,600 --> 00:31:55,800 Speaker 1: the same year, and going from five hundred million to 527 00:31:55,880 --> 00:32:00,400 Speaker 1: seven hundred million? Yowza. Now, essentially this methodology is the 528 00:32:00,440 --> 00:32:03,800 Speaker 1: same as if you were to go manually from LinkedIn 529 00:32:03,920 --> 00:32:07,080 Speaker 1: profile to profile and you just jotted down all the 530 00:32:07,120 --> 00:32:09,800 Speaker 1: relevant information that you were looking for. You know, stuff 531 00:32:09,880 --> 00:32:13,480 Speaker 1: like what's a person's username, what's their full name, what's 532 00:32:13,520 --> 00:32:17,320 Speaker 1: their phone number, their email address, you know, what other 533 00:32:17,440 --> 00:32:21,479 Speaker 1: social networking sites do they use? Anything that would appear 534 00:32:21,600 --> 00:32:24,360 Speaker 1: on the person's profile, you would just jot it down. 535 00:32:24,680 --> 00:32:28,040 Speaker 1: That would take you an eternity to do seven hundred million times, 536 00:32:28,400 --> 00:32:32,360 Speaker 1: so you create a tool that will just do this automatically. 537 00:32:32,600 --> 00:32:35,760 Speaker 1: So the hacker had used LinkedIn's API, that stands for 538 00:32:35,880 --> 00:32:41,000 Speaker 1: Application Programming Interface, and they designed these data scraping tools 539 00:32:41,040 --> 00:32:45,960 Speaker 1: to harvest user data. This was against LinkedIn's policies, but 540 00:32:46,040 --> 00:32:49,120 Speaker 1: there really weren't any measures in place to actually prevent 541 00:32:49,160 --> 00:32:52,120 Speaker 1: it from happening.
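That "tool that will just do this automatically" is, at its core, a loop over a paginated API. Here's a toy sketch of the pattern: `fetch_page` stands in for calls to a real platform's endpoint, and all of the profile data below is invented.

```python
def fetch_page(cursor):
    """Pretend API endpoint: returns (profiles, next_cursor)."""
    fake_db = [{"name": f"user{i}", "email": f"user{i}@example.com"}
               for i in range(250)]
    page = fake_db[cursor:cursor + 100]  # 100 results per request
    next_cursor = cursor + 100 if cursor + 100 < len(fake_db) else None
    return page, next_cursor

def scrape_all():
    """Walk the paginated endpoint until the cursor runs out."""
    harvested, cursor = [], 0
    while cursor is not None:
        page, cursor = fetch_page(cursor)
        harvested.extend(page)
    return harvested

profiles = scrape_all()
print(len(profiles))  # 250: every record the endpoint exposes
```

Rate limits and per-request result caps are the main defenses platforms have against this, which is why large scrapes tend to be spread slowly across many accounts or IP addresses.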
So yeah, LinkedIn says, hey, don't do this, 542 00:32:52,320 --> 00:32:54,200 Speaker 1: but they didn't have a way to stop you from 543 00:32:54,240 --> 00:32:57,200 Speaker 1: doing it, twice in the same year as it turns out. 544 00:32:57,480 --> 00:33:01,720 Speaker 1: Now, this attack did not compromise stuff like passwords or 545 00:33:01,760 --> 00:33:05,600 Speaker 1: financial information, but it did include things like those connected 546 00:33:05,760 --> 00:33:09,360 Speaker 1: social applications. So if an affected user had linked their 547 00:33:09,400 --> 00:33:12,920 Speaker 1: Facebook account or whatever to their LinkedIn profile, that meant 548 00:33:12,960 --> 00:33:15,920 Speaker 1: the attackers would have that information. And again this can 549 00:33:15,960 --> 00:33:19,040 Speaker 1: be incredibly helpful if you want to design a phishing attack. 550 00:33:19,360 --> 00:33:22,160 Speaker 1: You know, your basic blunt phishing attack might start from 551 00:33:22,160 --> 00:33:25,000 Speaker 1: a place where little to nothing is known about your target. 552 00:33:25,280 --> 00:33:28,360 Speaker 1: But the more attackers learn about you, the better they 553 00:33:28,360 --> 00:33:31,640 Speaker 1: can craft an effective trap. And considering that there were 554 00:33:31,760 --> 00:33:35,000 Speaker 1: a lot of executives using LinkedIn to network with each other, 555 00:33:35,280 --> 00:33:38,920 Speaker 1: there are some really high-value targets mixed in with everybody else. 556 00:33:39,440 --> 00:33:41,880 Speaker 1: Like even if it's not an executive, it might be 557 00:33:41,960 --> 00:33:45,640 Speaker 1: someone who's an associate of an executive, like an assistant 558 00:33:45,840 --> 00:33:48,440 Speaker 1: or a coworker or something like that, a direct report.
559 00:33:48,680 --> 00:33:51,680 Speaker 1: And if you're able to know who that person's direct 560 00:33:51,720 --> 00:33:54,400 Speaker 1: report is, or who they're reporting to, I guess I 561 00:33:54,440 --> 00:33:57,480 Speaker 1: should say, then you can craft an attack that might 562 00:33:57,520 --> 00:34:01,080 Speaker 1: be very convincing. You know, a classic one is your 563 00:34:01,120 --> 00:34:04,880 Speaker 1: boss apparently texting you out of nowhere saying hey, I 564 00:34:05,000 --> 00:34:08,920 Speaker 1: need access to five thousand dollars in petty cash. Can 565 00:34:08,960 --> 00:34:10,759 Speaker 1: you wire it to me, and then they give you 566 00:34:10,800 --> 00:34:12,800 Speaker 1: a link, and it turns out it's just someone who's 567 00:34:13,680 --> 00:34:16,600 Speaker 1: made the connection. They know who your boss is, and 568 00:34:16,640 --> 00:34:19,480 Speaker 1: they're using that to pressure you into doing something you 569 00:34:19,560 --> 00:34:22,239 Speaker 1: really shouldn't do. That's a very simple example, but it 570 00:34:22,280 --> 00:34:25,040 Speaker 1: happens all the time. So this LinkedIn attack is a 571 00:34:25,040 --> 00:34:28,680 Speaker 1: pretty tricky one. And we've seen similar data scraping techniques 572 00:34:28,760 --> 00:34:31,920 Speaker 1: across the web, both for the purposes of harvesting user 573 00:34:31,960 --> 00:34:35,200 Speaker 1: information and, in recent years, also using it to train 574 00:34:35,320 --> 00:34:40,480 Speaker 1: up AI models. And typically platforms condemn these practices. They 575 00:34:40,520 --> 00:34:43,239 Speaker 1: say it violates their policies. They want to protect their 576 00:34:43,320 --> 00:34:47,320 Speaker 1: user information.
Now, I would argue it's largely really because 577 00:34:47,920 --> 00:34:51,279 Speaker 1: user data is valuable, and these platforms would very much 578 00:34:51,360 --> 00:34:54,160 Speaker 1: like to prevent other entities from taking advantage of the 579 00:34:54,200 --> 00:34:57,960 Speaker 1: same information that the platforms themselves are profiting off of. 580 00:34:58,400 --> 00:35:01,080 Speaker 1: It's not so much to protect our privacy as it 581 00:35:01,120 --> 00:35:04,880 Speaker 1: is to protect the platform's investment in gathering all the 582 00:35:04,920 --> 00:35:07,360 Speaker 1: information in the first place. Like no, this is ours. 583 00:35:07,719 --> 00:35:12,160 Speaker 1: This is ours to exploit and to profit from, not yours. Well, 584 00:35:12,239 --> 00:35:15,040 Speaker 1: number five on this list includes an old topic for 585 00:35:15,160 --> 00:35:20,239 Speaker 1: tech stuff, which is the infamous Cambridge Analytica case with Facebook. Now, 586 00:35:20,280 --> 00:35:22,759 Speaker 1: this one is a little bit complicated, but I'll see 587 00:35:22,760 --> 00:35:25,680 Speaker 1: if I can summarize at least the tech side of it, 588 00:35:25,680 --> 00:35:30,160 Speaker 1: although it does also include politics. Sorry, I wish it didn't, 589 00:35:30,480 --> 00:35:34,560 Speaker 1: but it's literally the very nature of this case. So 590 00:35:34,960 --> 00:35:37,080 Speaker 1: the LinkedIn attack we just talked about is kind of 591 00:35:37,120 --> 00:35:41,440 Speaker 1: similar to this, because this attack, the Cambridge Analytica scandal, 592 00:35:41,640 --> 00:35:47,160 Speaker 1: really centers on some loopholes in Facebook's API. So it 593 00:35:47,200 --> 00:35:51,640 Speaker 1: all starts with a researcher named Alexander Cogan.
And Cogan 594 00:35:51,719 --> 00:35:54,920 Speaker 1: used Facebook's API to create a survey app, and it 595 00:35:54,960 --> 00:35:57,960 Speaker 1: would pay Facebook users a small amount in return for 596 00:35:58,040 --> 00:36:01,279 Speaker 1: them taking the survey. What they did not know is that 597 00:36:01,360 --> 00:36:05,080 Speaker 1: anyone who opted to take this survey was unknowingly giving 598 00:36:05,160 --> 00:36:09,120 Speaker 1: Cogan the ability to view that person's friends' profiles as 599 00:36:09,160 --> 00:36:12,200 Speaker 1: if Cogan were in fact the person taking the survey. 600 00:36:12,440 --> 00:36:14,080 Speaker 1: So let me give an example to make this a 601 00:36:14,080 --> 00:36:17,640 Speaker 1: little more clear. Let's say I'm your friend on Facebook. 602 00:36:17,800 --> 00:36:21,600 Speaker 1: Hi friend, and as your friend, I can see more 603 00:36:21,600 --> 00:36:25,120 Speaker 1: of your profile than just some random schmo on the internet. Right, 604 00:36:25,200 --> 00:36:28,640 Speaker 1: maybe you've set certain things on your profile to friends only, 605 00:36:28,920 --> 00:36:32,360 Speaker 1: so as your friend I can see that. But some 606 00:36:32,560 --> 00:36:35,319 Speaker 1: random person wouldn't be able to see that, right? But 607 00:36:35,400 --> 00:36:37,360 Speaker 1: then I decide I'm going to go take the survey 608 00:36:37,400 --> 00:36:39,719 Speaker 1: so I can make twenty bucks or whatever. And now 609 00:36:39,800 --> 00:36:43,080 Speaker 1: Cogan can see your profile as if he were me, 610 00:36:43,760 --> 00:36:47,200 Speaker 1: because of this loophole in Facebook's API, and so now 611 00:36:47,280 --> 00:36:50,719 Speaker 1: Cogan can view all of your friends-only information as 612 00:36:50,760 --> 00:36:54,080 Speaker 1: if Cogan were your friend.
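The mechanics of that loophole are easier to see with a toy data model. Everything below is invented; the point is only that one consenting survey-taker exposes friends who never consented.

```python
# Toy social graph: each user has a friend list and some friends-only data.
profiles = {
    "alice": {"friends": ["bob", "carol"], "friends_only": {"phone": "555-0101"}},
    "bob":   {"friends": ["alice"],        "friends_only": {"phone": "555-0102"}},
    "carol": {"friends": ["alice"],        "friends_only": {"phone": "555-0103"}},
}

def survey_app_view(consenting_user):
    """What the app could see: the user's own friends-only data,
    plus the friends-only data of every one of their friends."""
    seen = {consenting_user: profiles[consenting_user]["friends_only"]}
    for friend in profiles[consenting_user]["friends"]:
        seen[friend] = profiles[friend]["friends_only"]
    return seen

# Only alice took the survey, yet bob's and carol's data leak too.
print(survey_app_view("alice"))
```

Scale that up across the hundreds of thousands of real survey takers and you get how tens of millions of profiles were exposed without their owners ever touching the app.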
So Facebook would actually close 613 00:36:54,120 --> 00:36:58,360 Speaker 1: off this loophole before the Cambridge Analytica scandal became a 614 00:36:58,480 --> 00:37:01,759 Speaker 1: thing. Like, Facebook made that change in the 615 00:37:01,840 --> 00:37:06,600 Speaker 1: years following twenty thirteen, when Cogan did this actual work. 616 00:37:07,480 --> 00:37:11,600 Speaker 1: But by then the data already existed with Cogan. Cogan 617 00:37:11,640 --> 00:37:13,880 Speaker 1: had access to all this information, and he worked with 618 00:37:13,920 --> 00:37:17,319 Speaker 1: Cambridge Analytica to share it. And so Cambridge Analytica had 619 00:37:17,360 --> 00:37:20,719 Speaker 1: access to all this data they shouldn't have. They did 620 00:37:20,719 --> 00:37:24,600 Speaker 1: not have the consent of the various people on Facebook 621 00:37:24,640 --> 00:37:27,239 Speaker 1: to share the information, and they began to use this 622 00:37:27,440 --> 00:37:31,960 Speaker 1: data in various ways during political campaigns, mostly conservative ones. 623 00:37:31,960 --> 00:37:35,560 Speaker 1: Cambridge Analytica was a British company. It was sort 624 00:37:35,560 --> 00:37:39,239 Speaker 1: of a campaign strategy company, and their pitch was that 625 00:37:39,280 --> 00:37:43,359 Speaker 1: they were using data driven techniques to make it far 626 00:37:43,400 --> 00:37:47,040 Speaker 1: more effective to get messaging out to potential voters, and 627 00:37:47,440 --> 00:37:53,360 Speaker 1: it was largely for conservative politicians. Facebook was reportedly aware 628 00:37:53,800 --> 00:37:57,440 Speaker 1: of these issues, but didn't take any action until a 629 00:37:57,560 --> 00:38:00,920 Speaker 1: former Cambridge Analytica employee essentially blew the whistle on the 630 00:38:00,920 --> 00:38:05,080 Speaker 1: whole operation and it became a big public scandal.
Now, 631 00:38:05,160 --> 00:38:08,880 Speaker 1: ultimately it's debatable whether any of Cambridge Analytica's efforts were 632 00:38:08,920 --> 00:38:12,200 Speaker 1: actually that effective, but the point is the company got 633 00:38:12,280 --> 00:38:16,719 Speaker 1: access to somewhere between fifty and ninety million Facebook profiles 634 00:38:16,719 --> 00:38:19,279 Speaker 1: that it should not have been able to access, and 635 00:38:19,320 --> 00:38:22,279 Speaker 1: that's a big no no. Now, both Cambridge Analytica and 636 00:38:22,400 --> 00:38:26,279 Speaker 1: Facebook would face serious repercussions for this scandal. Facebook would 637 00:38:26,280 --> 00:38:29,600 Speaker 1: face hundreds of millions of dollars in various costs, from 638 00:38:29,680 --> 00:38:33,840 Speaker 1: fines to a massive class action lawsuit settlement, and in 639 00:38:33,880 --> 00:38:38,359 Speaker 1: a separate but related matter, the Federal Trade Commission, or FTC, 640 00:38:38,760 --> 00:38:42,960 Speaker 1: would fine Facebook an astonishing five billion, with a B, 641 00:38:43,400 --> 00:38:47,880 Speaker 1: dollars for failing to practice secure and ethical data privacy policies. 642 00:38:48,080 --> 00:38:51,080 Speaker 1: Cambridge Analytica was just kind of related to this. It 643 00:38:51,200 --> 00:38:55,040 Speaker 1: was a specific instance of a larger problem. Now, 644 00:38:55,080 --> 00:38:58,719 Speaker 1: Cambridge Analytica would actually fold as a result of this scandal. 645 00:38:58,880 --> 00:39:02,359 Speaker 1: The company ended up essentially liquidating, but you could argue 646 00:39:02,440 --> 00:39:05,440 Speaker 1: Cambridge Analytica is not really gone, because some other companies 647 00:39:05,440 --> 00:39:08,480 Speaker 1: that were related to Cambridge Analytica would continue to exist, 648 00:39:08,520 --> 00:39:12,280 Speaker 1: and they bought up the assets of Cambridge Analytica.
So yeah, 649 00:39:12,680 --> 00:39:14,880 Speaker 1: you could argue it's still out there lurking, it's just 650 00:39:15,000 --> 00:39:18,480 Speaker 1: under different names. Now, the political nature of Cambridge Analytica 651 00:39:18,480 --> 00:39:22,120 Speaker 1: and the use of psychological profiling techniques really make this 652 00:39:22,160 --> 00:39:25,239 Speaker 1: particular data breach stand out. Now, you could argue there 653 00:39:25,280 --> 00:39:28,720 Speaker 1: are lots of other breaches, including ones we've already talked about, 654 00:39:28,760 --> 00:39:33,600 Speaker 1: that had a much broader scope and involved way more victims, right? 655 00:39:33,840 --> 00:39:38,080 Speaker 1: But the involvement of psychological profiling, specifically for the purposes 656 00:39:38,120 --> 00:39:42,800 Speaker 1: of affecting political campaigns, makes this one seem particularly sinister. 657 00:39:43,160 --> 00:39:48,840 Speaker 1: But as I said earlier, number five actually includes Cambridge Analytica. 658 00:39:49,040 --> 00:39:52,600 Speaker 1: It's not exclusively Cambridge Analytica. That was just part of it. 659 00:39:52,800 --> 00:39:57,600 Speaker 1: The whole of number five on Chen's list is Facebook itself, 660 00:39:58,040 --> 00:40:01,520 Speaker 1: specifically with regard to an April twenty twenty one 661 00:40:01,640 --> 00:40:05,759 Speaker 1: incident, and that is where we're going 662 00:40:05,800 --> 00:40:08,759 Speaker 1: to pick up in our next episode.
We'll pick up 663 00:40:08,800 --> 00:40:11,560 Speaker 1: with number five, Facebook, and talk about the twenty 664 00:40:11,600 --> 00:40:15,360 Speaker 1: twenty one incident that merited entry on this list of 665 00:40:15,400 --> 00:40:18,080 Speaker 1: the largest data breaches in US history, and then 666 00:40:18,280 --> 00:40:20,719 Speaker 1: we'll, you know, work our way through four, three, two 667 00:40:20,800 --> 00:40:23,279 Speaker 1: and one, and I'll probably have more to say about 668 00:40:23,320 --> 00:40:27,560 Speaker 1: Ticketmaster as well as we get to that. Anyway, 669 00:40:27,640 --> 00:40:31,400 Speaker 1: just as a reminder again, there's very little we as 670 00:40:31,640 --> 00:40:34,319 Speaker 1: individuals can do about these kinds of things. I mean, 671 00:40:34,360 --> 00:40:37,520 Speaker 1: if we work in the security department of these big corporations, 672 00:40:37,560 --> 00:40:40,439 Speaker 1: we can try and make sure that the practices we're 673 00:40:40,560 --> 00:40:45,160 Speaker 1: using are best practices and that we're not being lax 674 00:40:45,160 --> 00:40:47,480 Speaker 1: at all on computer security. But for the rest of us, 675 00:40:47,640 --> 00:40:49,200 Speaker 1: you know, we can just do what we can to 676 00:40:49,239 --> 00:40:52,880 Speaker 1: protect ourselves and hope that the companies we do business 677 00:40:52,880 --> 00:40:55,520 Speaker 1: with are doing the same. And if they're not, we 678 00:40:55,600 --> 00:40:59,719 Speaker 1: can take whatever little measures we might have to mitigate 679 00:40:59,800 --> 00:41:02,399 Speaker 1: the impact it's going to have on us. But really 680 00:41:02,440 --> 00:41:04,600 Speaker 1: a lot of this is out of our control. This 681 00:41:04,640 --> 00:41:09,120 Speaker 1: is why security is an everybody problem, not just on 682 00:41:09,160 --> 00:41:12,920 Speaker 1: the individual or on the company. It's everyone involved.
And 683 00:41:13,120 --> 00:41:15,960 Speaker 1: it only takes one weak link to make a real 684 00:41:16,280 --> 00:41:19,719 Speaker 1: entry point for malicious agents. So I know that's not 685 00:41:19,840 --> 00:41:22,360 Speaker 1: very comforting, but it's good to know the reality of 686 00:41:22,360 --> 00:41:25,200 Speaker 1: the situation: that we all need to do our part 687 00:41:25,280 --> 00:41:27,600 Speaker 1: as best we can. Even that's not going to protect 688 00:41:27,640 --> 00:41:30,840 Speaker 1: us from everything, but it will at least limit the 689 00:41:30,960 --> 00:41:35,759 Speaker 1: amount of effect these hackers can have, and hopefully we'll 690 00:41:35,800 --> 00:41:39,000 Speaker 1: be able to act in such a way as to minimize 691 00:41:39,040 --> 00:41:41,239 Speaker 1: the impact. If you can do that enough, then you 692 00:41:41,360 --> 00:41:44,319 Speaker 1: remove the incentive to attack in the first place. If 693 00:41:44,360 --> 00:41:47,760 Speaker 1: it's so hard to succeed in your attack, 694 00:41:48,320 --> 00:41:51,760 Speaker 1: you might figure there's a way to make money faster 695 00:41:52,120 --> 00:41:55,000 Speaker 1: and easier by some other method. So yeah, let's make it 696 00:41:55,080 --> 00:41:57,560 Speaker 1: real hard for the crooks to do crime. If we 697 00:41:57,640 --> 00:42:03,560 Speaker 1: do that, maybe they'll look somewhere else. So that's the hope. 698 00:42:04,080 --> 00:42:07,320 Speaker 1: I hope that all of you out there are doing well, 699 00:42:07,600 --> 00:42:16,919 Speaker 1: and I will talk to you again really soon. Tech 700 00:42:17,000 --> 00:42:21,400 Speaker 1: Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, 701 00:42:21,719 --> 00:42:25,440 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you listen 702 00:42:25,440 --> 00:42:26,520 Speaker 1: to your favorite shows.