Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? So, in our last episode, I started covering some of the largest data breaches in US history, and as a refresher, I'm using a list that was compiled by Kyle Chin of UpGuard.com. And you know, biggest data breaches, that's an interesting adjective, biggest, because the scope of the attack isn't always a case where the data breach affected more people than, say, the previous one on the list. There are cases in which fewer people were affected, but the types of information made the breach more severe. For example, in our last episode, I talked about how hackers stole information from millions of FriendFinder Networks customers. FriendFinder Networks operates lots of different businesses, including some that fall into the adult entertainment and adult lifestyle categories. So for some customers, the revelation that they were patronizing such a business could be embarrassing or damaging due to the taboo nature of the services, and they're less likely to put up a big fuss about a data breach, because if they did, it would be admitting that they were using the service. And if you're a hacker, that kind of target is really tempting, because your victims are incentivized to not come forward.

Now, we left off in the last episode talking about the Cambridge Analytica scandal with Facebook. That scandal was a doozy, and it hit right around the time when Facebook, now known as Meta, was in the US government's crosshairs for multiple reasons. In fact, I'd argue that those events are in part what led to Facebook changing its name to Meta in the first place. It was an attempt to kind of distance itself from a brand that was being associated with lax security and other shady, well, maybe shady is the wrong word, but ethically questionable actions.
But as I mentioned in the last episode, the Cambridge Analytica scandal was just a little side quest in the real number five on our top ten list of largest data breaches in US history. Chin's main target for number five was just Facebook in general, because the site has been a massive target for hackers looking to harvest a mountain of data, and that makes sense. Most estimates put the number of Facebook users in the neighborhood of three billion people, billion with a B. So if you're a hacker and your goal is to steal as much personal information as you can, you're likely balancing a few different things when you're figuring out who you're going to attack. How valuable is the information you're going to get hold of? How much of that information do you think you can potentially get away with? How secure is the system? You know, you might be able to infiltrate a relatively small target without much trouble; on the flip side, you're not likely going to end up with a ton of useful stuff in the process. But Facebook, well, Facebook has just about everyone's data on it. And heck, if you wanted to see if you could compromise specific accounts and use those to boost a misinformation campaign, that's a possibility too. It may be that you're not looking to sell information. You may want to sell a narrative, and in doing so, what you're doing is borrowing accounts that aren't yours to elevate a message. The main data breach that Chin mentions with Facebook actually happened in twenty twenty one.
You know, that's years after the Cambridge Analytica scandal, and that's when someone posted to a black market site on the dark web that they had data records, including personal information, of more than half a billion Facebook users, so more than five hundred million, like five hundred and thirty million user accounts, essentially. This information would include things like usernames, the actual names, dates of birth, location data, and more, all from people who were on the platform between twenty eighteen and twenty nineteen. How much data was available per person depended upon the individual and the settings that they used on Facebook, because this was another case of hackers using data scraping tools to pull publicly available information off a platform. So no different than if you were to visit each person's account and then just note down all the information that was visible to you. And you might remember from our last episode that a hacker with the handle Tom Liner was offering similar information that someone, presumably Tom Liner themselves, whoever they are, had scraped off of LinkedIn. So this Facebook instance is a similar case, except it appears that the hacker exploited a vulnerability in the Facebook platform to harvest information from profiles until Facebook fixed the vulnerability in late twenty nineteen, and this is based on Facebook's own statements about the data breach. The company claimed initially that the April twenty twenty one data was the same as a similar breach that happened in January, also of twenty twenty one. Now, Motherboard reported in January twenty twenty one that, quote, a user of a low level cybercriminal forum is selling access to a database of phone numbers belonging to Facebook users and conveniently letting customers look up those numbers by using an automated Telegram bot, end quote. So this person had created a database in which you could search for specific names and see if you got a hit with a relevant phone number.
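To make the scraping part concrete, here's a minimal sketch of what "visiting each profile and noting down what's visible" looks like once you automate it. Everything here is hypothetical: the site, the URL pattern, and the HTML class names are made-up stand-ins, not anything from Facebook's actual pages.

```python
# A minimal sketch of scraping publicly visible profile data.
# The domain, URL pattern, and CSS selectors below are hypothetical,
# purely for illustration.
import requests
from bs4 import BeautifulSoup

BASE_URL = "https://social-site.example/profile/{user_id}"  # hypothetical

def scrape_profile(user_id: int) -> dict:
    """Fetch one public profile page and record whatever fields are visible."""
    resp = requests.get(BASE_URL.format(user_id=user_id), timeout=10)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    # Only fields the user left public appear in the page at all, which is
    # why the amount of scraped data varied from person to person.
    record = {"user_id": user_id}
    for field, selector in [("name", ".profile-name"),
                            ("location", ".profile-location")]:
        tag = soup.select_one(selector)  # hypothetical class names
        record[field] = tag.get_text(strip=True) if tag else None
    return record

# "Scraping at scale" is just this in a loop over account IDs.
records = [scrape_profile(uid) for uid in range(1000, 1010)]
```

The point of the sketch is that once public pages exist at predictable addresses, mass scraping is just an ordinary HTTP loop, which is why platforms lean on rate limiting and bot detection rather than hoping nobody tries.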
But the April revelation showed more than just phone numbers, and you know, already just the phone numbers is bad enough. The hack that happened later in twenty twenty one included lots more stuff, like geolocation data, if that was in fact part of the account. Now, companies like Meta slash Facebook typically have policies prohibiting the mass scraping of data. As I said in the last episode, data is effectively the currency of the Internet, and when you really get down to it, user data is the commodity that Meta relies upon for its revenue. Meta's whole business model is primarily centered on advertising, and user data allows Meta to work with ad companies to target specific customer bases. So you'd better believe Meta would prefer to maintain dominion over that information rather than having someone else get hold of it. Plus, by not keeping user information under tight control, Meta opens itself up to lawsuits and regulatory fines, that sort of thing, and that certainly happened in twenty twenty one.

So, for example, the Data Protection Commission, or DPC, which is a regulatory agency in Ireland and thus part of the European Union, found Facebook at fault for failing to comply with the quote GDPR obligation for data protection by design and default end quote. As a quick reminder for those of y'all who live outside the EU like me, GDPR stands for General Data Protection Regulation, and it's part of a much larger piece of legislation regarding fundamental rights. Here in the United States, we have relatively little protection with regard to our private information. That may be changing, but it's long overdue. The same is not true in the EU, where citizens are supposed to have authority over their own personal information. In the wake of hackers releasing personal information about more than five hundred million Facebook users, the DPC conducted an investigation into Facebook.
The DPC determined that Facebook failed to put proper data privacy and protection measures in place, and that gave the hackers the opportunity to scrape all that data. So just having a policy saying, hey, don't do that, that wasn't enough. They had to have things that would actually prohibit the practice, and that was where the failure was. The DPC then ruled that Facebook would owe a fine of two hundred and sixty five million euros, which at the time equaled around two hundred and seventy five million dollars. And that was actually the third fine the DPC levied against Meta just that year. The first fine was for eighteen point six million dollars, and that related to Meta keeping really poor records that contributed to a data leak that exposed the information of around thirty million Facebook users, so a much smaller leak in that case. The second of the fines was actually the largest, because this two hundred and sixty five million euros, that wasn't the most. The most was four hundred and two million, and that had to do with how Facebook, well, really Meta, was handling data that belonged to teenage users of the app Instagram. So yeah, it was a very expensive year for Meta in terms of fines paid to the EU. I mean, keep in mind that Facebook had also had to face a five billion dollar fine in the United States earlier due to other issues. So yeah, Facebook was no stranger to having to pay out hundreds of millions and even billions of dollars in fines for how the company handles information.

In Chin's blog post, he also points out a few other incidents involving Meta and data security, or lack of data security. For example, in twenty nineteen, UpGuard researchers found a database of five hundred and forty million Facebook user data records on quote public Amazon S3 cloud servers end quote. Now, arguably you could say this really wasn't Facebook's fault; this was the fault of a third party company that was working with Facebook.
That third party company was Cultura Colectiva, which was a media and app developer company. So you could say that, you know, Cultura Colectiva was a customer of Facebook, and they had this massive database of Facebook user data, presumably so that they could develop the apps or whatever it was they were working on. And they were storing this database on an Amazon S3 cloud server, but they failed to actually secure that information on the cloud. They didn't put protections in place to prevent unauthorized people from being able to access it, which is a big old whoopsie. But I think it's fair to say that it was not Meta's direct fault for this one. It sounded to me like this was the third party that arranged to have the database on that server. But I don't know; if Facebook had done that, like if Meta had made it available on that server and then gave access to Cultura Colectiva but failed to secure that access appropriately, then yes, it would have been Meta's fault. I honestly don't know the details well enough to make a determination in that regard.

Anyway, that's really just the tip of the iceberg when it comes to data breaches with Meta slash Facebook. And given the reach of platforms like Facebook and WhatsApp and Instagram, you know, all these different Meta properties, it really should come as no surprise that any data breach that targets Meta is going to end up being one of the largest in US history, simply because they have such a huge user base. Right, like, just by definition, if it's an effective attack, it's going to be one of the largest ever. So again, it's one of those things that makes Meta such an attractive target for attackers, the fact that it's this huge.
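Going back to that exposed bucket for a second: the fix for this class of mistake is boring and well known. Here's a minimal sketch using boto3, Amazon's Python SDK for AWS; the bucket name is a made-up placeholder, but the two API calls shown are the standard way to shut off public access at the bucket level.

```python
# A minimal sketch of locking down an S3 bucket with boto3. The bucket name
# is hypothetical; this is the kind of baseline control whose absence can
# leave a database readable by anyone who finds the URL.
import boto3

s3 = boto3.client("s3")
BUCKET = "example-user-data-bucket"  # hypothetical

# Block every form of public access at the bucket level.
s3.put_public_access_block(
    Bucket=BUCKET,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)

# Verify the configuration actually took effect.
config = s3.get_public_access_block(Bucket=BUCKET)
print(config["PublicAccessBlockConfiguration"])
```

A control like this wouldn't have made the data any less sensitive, but it would have meant a URL alone wasn't enough to read it.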
Okay, we've got lots more to go through, and rather than start the next entry and then just have to take a break in the middle, let's take a break now. We'll be back with numbers four and three, I think.

Okay, we're back. So number four on our list of largest data breaches in US history is another financial institution. In the last episode, I mentioned how a twenty fourteen hack on JPMorgan Chase demonstrated that it wasn't just web based companies that were at risk. In May twenty nineteen, that lesson was reinforced when poor data security practices meant First American Financial Corporation inadvertently made it possible for anyone with a particular web link to access some really sensitive financial information. There was no hacking required. Literally all you needed was the URL, and boom, you could view a whole bunch of files. And by whole bunch, I mean almost nine hundred million files, so you know, more than eight hundred million. I think it was like eight hundred and eighty million, something along those lines.

So what the heck happened? Well, whoever First American Financial Corporation hired to do web design made a major boo boo, and I'd like to think it was an honest but dumb mistake and not, you know, some sort of malicious or premeditated attempt to gain access to sensitive information. Essentially, the web developer failed to include any sort of authorization or verification process to access this list of files. So maybe the thought was just that, you know, the URL for the web directory wouldn't be published anywhere, so the only people who would even have access to it were the ones who knew what the URL was. And if that was the thought, like if it was, oh, we don't need to institute anything else because this is a private web address, who's going to see it? That would be a practice called security through obscurity, which, by the way, is not very secure at all. As the name implies, the hope is that by flying below the radar of attackers, no one takes notice of you, so you remain safe, not because you're practicing really good security, but because no one has spotted you, and so no one has decided to target you.
Sometimes this isn't something that you're actively practicing; it's just kind of in effect. I have argued in the past that Apple enjoyed security through obscurity. For many years, you didn't see much malware targeting Apple systems. Apple fans would argue this was because Apple was just better at security than companies like Microsoft or IBM. In fact, some of Apple's own ads seemed, to me at least, to imply that one advantage Apple computers had over Windows based PCs is that they were more secure. Now, I would counter that I don't think Apple was necessarily so much better at security. I mean, granted, they were using proprietary systems, including proprietary software and hardware, which does make it harder to develop malware for those platforms. It doesn't make it impossible, but it makes it harder. But I don't think that's the whole story. The vast majority of people were not using Apple computers; they were on Windows based machines. So yeah, there were people using Apple computers, but if you looked at the percentages, I mean, Apple was just dwarfed by Windows machines. So if you are a hacker and you're going to dedicate the time and effort into developing malware, and you have a specific goal in mind, whatever that goal might be, you're more likely going to aim at the largest target in order to have the biggest impact. Like, that's going to make more of a difference than affecting a relatively small population of Apple users. Go for the bigger population of Windows users. That's security through obscurity, right? Like, yes, the security might have been better to some degree, but that's not why Apple was so safe for the longest time. Or, well, that was not the only reason,
I guess I should say. It was also safe because the company's market share was small enough that a lot of hackers just didn't see the return on investment to develop malware for Apple when most people were using Windows machines anyway.

Whether the thought was that no one who was unauthorized would even find their way to this URL, or it was just an oversight and the web developer failed to take steps that they should have taken, the vulnerability was in place. And this type of vulnerability is called an insecure direct object reference. This is just a category of vulnerabilities, and essentially it means that if a user has the ability to submit a specific input into a system and they can gain access to that system without having to pass through any control measures, that is an insecure direct object reference. In this case, the input was just a web address, so the only tool that you needed was a browser. You did not have to be a hacker. You needed a web browser, and you needed to put a URL into the address bar, and boom, you would be taken to one of these documents. Now, adding to this problem was the fact that First American Financial Corporation was using a sequential numbering system for their documents, which meant if you had discovered a URL that brought you to one of these documents, you could then adjust the numeral that was at the end of the file name, and if you went down one, you'd see the previous record, and if you went up one, you'd see the next record. So what kind of records are we talking about? Well, it included stuff like bank statements, receipts, actual bank account numbers, driver's licenses, and other stuff. Now, on the bright side, when security researcher Brian Krebs reported on the discovery of the issue, it appeared that no unauthorized third parties with malicious intent had actually accessed this information. You could argue that's really just a matter of luck.
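If you write web software, this pattern is worth seeing in code. Below is a minimal sketch of an insecure direct object reference and its fix, written in Python with Flask; the routes and the in-memory document store are hypothetical stand-ins, not First American's actual system.

```python
# A minimal sketch of an insecure direct object reference (IDOR) and its fix.
from flask import Flask, abort, session

app = Flask(__name__)
app.secret_key = "dev-only-secret"  # needed for session support; hypothetical

documents = {}  # doc_id -> {"owner": user_id, "body": ...}; hypothetical store

# Vulnerable: the document ID in the URL is the only "control measure".
# Sequential IDs make it worse: change 1042 to 1041 in the address bar and
# you read someone else's record.
@app.route("/docs/<int:doc_id>")
def get_document_insecure(doc_id):
    doc = documents.get(doc_id) or abort(404)
    return doc["body"]

# Fixed: authenticate the caller and confirm they own the requested record
# before returning anything. Opaque, random IDs (e.g. UUIDs) also help, but
# the authorization check is the real fix.
@app.route("/secure/docs/<int:doc_id>")
def get_document_secure(doc_id):
    user_id = session.get("user_id") or abort(401)
    doc = documents.get(doc_id) or abort(404)
    if doc["owner"] != user_id:
        abort(403)  # authenticated, but not authorized for this object
    return doc["body"]
```

The vulnerable handler treats knowing the ID as proof of authorization, which is exactly the sequential-URL situation described above; the fixed handler checks who is asking before it checks what they asked for.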
Also, there was no way to really know, but it just didn't seem like this stuff was showing up anywhere where you would say, oh, the bad guys got hold of this. It wasn't like it was popping up on the dark web. Perhaps because it seemed that no criminals had found out about this problem before First American could actually, you know, address it, the SEC went fairly light on the financial institution, at least in my opinion. First American was ordered to pay a five hundred thousand dollar fine. Now, five hundred thousand dollars is a lot of money. If you hit me with a five hundred thousand dollar fine, I would be up the creek. I would be so deep in trouble. But when you think of a financial institution, and the fact that this data leak's scope was nearly a billion records of incredibly sensitive financial information, I think it's still a fairly light fine to have to pay. The State of New York would actually secure a million dollar payout from First American in a lawsuit settlement, so that was twice as much. Now, it may be, and I haven't looked at this, it may be that the SEC just had limitations on how much it could fine an institution, in which case, you know, no fault on them. It's not like they could do more. If you throw the book at someone, you don't typically assume the person has a second book to throw.

More recently, First American was in the news again in late twenty twenty three, when hackers claiming to be part of the infamous ALPHV slash BlackCat group infiltrated company systems and encrypted several directories as a ransomware attack. Now, this disrupted First American's operations for a short while, but the company assured the public that the attackers had failed to lock off anything that was absolutely critical for operations, and at first they believed that customer records weren't accessed.
However, very recently, First American revealed that quote personal information pertaining to approximately forty four thousand individuals may have been accessed without authorization as a result of the incident end quote. This is according to TheTitleReport.com. So the company is going to be contacting those who are potentially affected and offering identity protection services and credit monitoring for free for, you know, some amount of time. I don't know for how long.

Number three on our list is a real estate education company called the Real Estate Wealth Network. Now, before I did any research on this episode, I had not heard about this particular data breach. This one I just missed. When it happened, it was big news, and somehow I didn't see it. And it really was a data leak, similar to what we just talked about with First American. As opposed to, you know, hackers compromising a system, this is more of a company failing to put in the proper measures to protect information. Before I read about this particular situation, my imagination went wild, right, because it's the Real Estate Wealth Network. In my imagination, I thought, aha, this is going to be a story about some sort of crypto-anarchist, right? Because here's a company that's dedicated to teaching folks how to build wealth by buying up real estate in many parts of the United States. Right now, we're in sort of a housing crisis. There are a lot of middle class Americans who cannot afford to buy a house, which is a change from previous generations. Like, you have people who are in the same relative place in the middle class who, unlike earlier generations, cannot buy an average home or a suitable home. Housing prices have skyrocketed in various regions throughout the United States in recent years, and a lot of properties seem to get scooped up by people or companies determined to turn the houses into lucrative vacation rental properties, a la Airbnb and such.
So I imagined this data breach was the work of some crypto-anarchist akin to the main character in V for Vendetta, you know, a person who was determined to take down the system by targeting a wealth generation platform that by its very existence seemed to put the common person at a disadvantage. But no, it's nothing quite so dramatic as that. This is another case of a company failing to put basic security measures in place in order to protect sensitive information.

So let's talk about that information first, and then I will talk about what actually happened. The leaked info included all the top hits, you know, your basic stuff like customer names, their phone numbers, addresses, that kind of thing. But it also included other stuff, like tax IDs, or whether or not the entity that owned the property had ever gone through bankruptcy, or if there were any court judgments against the property owner. Obviously, it also included property history records, like when a property was bought and sold and who bought it or sold it. And there were a lot of these data records, like one and a half billion data records in total. Some of them included information about very notable people, you know, like politicians and celebrities, that kind of thing. And you could imagine this sort of crime meant that the people who had access to this information could commit all sorts of other crimes, everything from spear phishing, which would be a breeze, because you would have so much information about your target that you could craft a very convincing attack that would potentially compromise them further. You would have all the information you needed to commit some pretty nasty fraud crimes in the name of some of these victims. Not to mention, now that you have these notable people like politicians and celebrities, you have information that, you know, stalkers would really like, or that people who have a particular desire to hurt somebody specific might really want.
So yeah, there's a whole host of crimes that can happen as a result of this. So what actually happened? How did this come about? Well, the simple explanation is that the Real Estate Wealth Network failed to put any security around accessing this database, which held one point one six terabytes of information. There was a security researcher named Jeremiah Fowler who discovered the problem. He found an unencrypted, massive database containing extremely sensitive information, and he said he found it after searching for his own details, and then he followed a trail that led to this database. So I don't know if he was just doing this search on his own, and it had nothing to do with, you know, any kind of investigation, and he just stumbled upon this, or if he had already heard something about it or suspected something and then started to do some searching and came across it. Either way, he reported the problem to the Real Estate Wealth Network, and to their credit, they promptly locked down access to the database. They closed off that avenue toward the data, and they also confirmed that they owned the database, which, again, all credit to the Real Estate Wealth Network for doing this. They acknowledged the data leak, and honestly, that had a real breath of fresh air to it. Because, yes, the leak never should have happened in the first place, but at least we're not talking about an incident in which the company delayed and stalled for days, weeks, months, or even years before finally owning up to it. Because, spoiler alert, number one on this list did that. It's a company that held back on disclosing information about a massive data breach for multiple years. But no, the Real Estate Wealth Network was pretty quick to disclose this issue, which is really the responsible thing to do, right? To warn those who have been affected so that they can be on the lookout for any signs that malicious actors have accessed that information.
However, despite this responsible approach, you know, you still have to admit there were some pretty massive problems here. For one thing, it was not clear how long that database had been unsecured; it could have been unsecured for ages. It was also unknown whether anyone besides, you know, the security researcher, Fowler, had noticed this issue. Because if criminals had found it, they would have had a treasure trove of data at their fingertips, and they wouldn't likely just blurt out that they had found it if they wanted to keep accessing it, although they could have made copies. And apparently, from what I understand, no one found anything about this on the dark web, so that's a fairly good indicator that it went unnoticed by the criminal element. But I mean, it would have only been a matter of time. So the fact that Fowler found it and that the Real Estate Wealth Network responded so quickly, that's really a good thing. But yeah, this data included, you know, information about actual celebrities, everyone from Britney Spears to Elon Musk to Nancy Pelosi.

So okay, we're coming down to our top two entries on this list. Although I should remind everybody that while I'm presenting this as a top ten countdown, episode one was like ten through five and a half and this is five through one, I should mention Kyle Chin did not claim that his list was ordered in any kind of ranked way. He listed twenty six different data breaches, and I don't wish to imply that this is Kyle Chin's ranking. I just chose to interpret it that way. So I'm just an Internet hack, really. Maybe at some point I'll do an episode covering the other sixteen data breaches. But we're gonna take a quick break right now for some more messages, and when we come back, we're gonna keep on hacking.

Okay, we're back, and as I mentioned before the break, we're gonna keep on hacking.
And hacking actually does play a part in numbers two and one, right? Because the last two, those weren't really data breaches; those were data leaks. A breach, I would argue, is when an outsider, a hacker, has managed to infiltrate a system in some way. They have breached the security, whatever there might be, of a system. Data leaks are when you don't have any security in place, and no one actively has to be, like, searching for it. It's just out there, and it's available if you happen to know where to look. Well, this next one is definitely a breach, not a leak, and it's one that puts Microsoft in the spotlight. The culprit was an allegedly China-backed hacker group called Hafnium. The targets were Microsoft Exchange servers. So to be clear, the target wasn't just Microsoft. It wasn't like they were going after the company Microsoft. They were going after companies that were using Microsoft Exchange servers, so customers of Microsoft. That's who these hackers were going after. And the entry point was allegedly four separate zero-day vulnerabilities.

Now, as I mentioned in the previous episode, a zero-day vulnerability is one that the person or entity responsible for the software or system doesn't know about. That's what makes it a zero-day vulnerability. They have zero days to address any sort of exploit that pops up in the wild. And zero-day vulnerabilities are like gold to hackers. They represent an opportunity to infiltrate systems without fear of being stopped or sometimes even detected, and that gives hackers way more time and opportunities to achieve whatever goals they have, whether that's stealing information, or encrypting entire directories as part of a ransomware attack, or infiltrating systems with malware, whatever it might be. One zero-day vulnerability is bad. Four zero-day vulnerabilities is really, really bad. And on top of that, some of these vulnerabilities had sort of been grandfathered in from software that had been out for almost a decade at that point.
One key element that made this particular attack so devastating was that many of the customers using Microsoft Exchange kept their own email servers on company premises, or on-prem, as those in the biz call it. In case you're not familiar with that term, it just means that a company hasn't offloaded everything to the cloud. At least some of the servers that they're using are on company property, whether that's in the company HQ or a data center or whatever. The point is, they own and operate the server. They're just running the Microsoft Exchange product on those servers. So these are computers that are not under the direct supervision of Microsoft, and that's the important part. The reason it's important is that once Microsoft became aware of these vulnerabilities, they could, and they did, release security patches to plug up those holes, which would stop attackers from being able to infiltrate those Microsoft Exchange servers. But while Microsoft could install these patches directly onto any systems that were under their direct control, they couldn't do that with Exchange servers that were on premises, on customer property, right? Because Microsoft doesn't have that access. You wouldn't want Microsoft to have that access. You know, the whole reason to have on-premises systems is that those are under your direction and control, so you don't want to give another party access to them; that's a security vulnerability. So that meant Microsoft had to send out messages to their customers that essentially said, hey, you need to install these security patches ASAP, like yesterday, and the customers were in charge of actually following through on that, because they were maintaining their own on-premises email servers. And you know, some companies have IT departments that will respond immediately to these kinds of messages and install those kinds of security patches, but there are cases where that doesn't happen.
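To give a feel for what "following through" looks like on the customer side, here's a minimal sketch of the kind of script an IT team might run after an advisory like this: compare each server's reported build number against the minimum build that contains the fix. The server names and build numbers below are made up for illustration; in practice, admins would pull the real builds from their management tooling rather than hard-coding them.

```python
# A minimal sketch of a patch-status check across a fleet of email servers.
# All names and version numbers here are hypothetical.
from dataclasses import dataclass

@dataclass
class ExchangeServer:
    name: str
    build: tuple  # (major, minor, build, revision)

# Hypothetical minimum build that contains the emergency security update.
MIN_PATCHED_BUILD = (15, 2, 792, 10)

servers = [
    ExchangeServer("mail-01", (15, 2, 792, 10)),
    ExchangeServer("mail-02", (15, 2, 721, 2)),  # behind: still exploitable
]

for srv in servers:
    # Python compares tuples element by element, so this ordering check
    # works the same way version numbers do.
    status = "patched" if srv.build >= MIN_PATCHED_BUILD else "VULNERABLE"
    print(f"{srv.name}: build {'.'.join(map(str, srv.build))} -> {status}")
```

A check like this doesn't install anything; it just turns "did we follow through?" from a guess into a list, which is the part that lapses when there's no IT staff watching for advisories.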
Now, sometimes the decision not to implement a security patch right away is actually made out of a sense of caution, because occasionally you'll get a patch that, once you install it, has unintended consequences on your operations. Right, it might break something else. And so sometimes companies won't install a security patch on the initial day of release, just in case that patch causes other issues. They take a wait and see approach to make sure that they're not going to interrupt their regular business operations by installing this patch. But that means that they remain vulnerable for as long as the patch remains uninstalled. Other times, you might have IT departments that just are not on the ball, and that means some companies would go unprotected longer than others. Sometimes you don't really have an IT department at all, right? You've outsourced a lot of stuff, and so you don't necessarily have the assets in place to be able to install security patches. You're depending too heavily upon third parties, and as a result, you may not have staff who have the knowledge of how to install those security patches. It's possible there are still some servers out there that aren't protected, though I hope that's not the case, because it has been like three years since this attack happened, and surely someone has installed security updates within those three years. But you never know.

Anyway, what did the attackers actually achieve? What was their goal, and what did they do? Well, by exploiting these vulnerabilities in Microsoft's software, the hackers gained access to these infected email servers, and that meant they could access all the email on a server as if they were the administrator of that server. They could read everything, so they could go into any account on that email server and read stuff. They could send email as someone else if they wanted to, and they could harvest data and contacts.
They could pull all the information off of 563 00:32:47,280 --> 00:32:52,080 Speaker 1: calendars. They could gather a lot of valuable information. Now, 564 00:32:52,400 --> 00:32:57,080 Speaker 1: in this case, the targets were companies, not necessarily individuals. However, 565 00:32:57,400 --> 00:33:00,920 Speaker 1: if they were able to compromise a company system that 566 00:33:01,160 --> 00:33:04,120 Speaker 1: had an individual of interest on it, then they could 567 00:33:04,200 --> 00:33:07,760 Speaker 1: pull tons of information about that person as part of 568 00:33:07,840 --> 00:33:10,680 Speaker 1: their attack. The attackers were able to compromise the email 569 00:33:10,720 --> 00:33:14,400 Speaker 1: servers of around thirty thousand companies in the United States. 570 00:33:14,520 --> 00:33:17,760 Speaker 1: Thirty thousand companies, not people. Keep in mind, 571 00:33:17,800 --> 00:33:21,000 Speaker 1: these businesses might employ hundreds or thousands of people. They 572 00:33:21,000 --> 00:33:24,840 Speaker 1: were also able to compromise another thirty thousand businesses around 573 00:33:24,880 --> 00:33:28,480 Speaker 1: the world, so sixty thousand total. Brian Krebs, once again, 574 00:33:28,640 --> 00:33:31,720 Speaker 1: was the researcher who wrote about these attacks and alerted 575 00:33:31,720 --> 00:33:33,680 Speaker 1: the world in general as to what was going on. 576 00:33:33,920 --> 00:33:36,680 Speaker 1: The revelation came just a few days after Microsoft had 577 00:33:36,680 --> 00:33:40,160 Speaker 1: released emergency security updates. By the way, there are actually 578 00:33:40,160 --> 00:33:43,560 Speaker 1: a lot of security researchers out there, the white hat 579 00:33:43,640 --> 00:33:46,760 Speaker 1: hacker types, who tend to have a general code of 580 00:33:46,840 --> 00:33:50,000 Speaker 1: honor, right? If they discover a vulnerability in a system, 581 00:33:50,160 --> 00:33:52,440 Speaker 1: typically what they do is they reach out to the 582 00:33:52,480 --> 00:33:56,520 Speaker 1: company or person responsible for that system and they alert them. 583 00:33:56,560 --> 00:33:59,880 Speaker 1: They say, hey, I found this, it's possible to exploit it. 584 00:34:00,280 --> 00:34:03,280 Speaker 1: You're gonna want to patch it. And typically they'll give 585 00:34:03,320 --> 00:34:05,560 Speaker 1: a little bit of time for the company to respond 586 00:34:05,720 --> 00:34:08,560 Speaker 1: to this, and then they'll wait for that response before 587 00:34:08,600 --> 00:34:11,520 Speaker 1: they reveal what they know to the public. That's the 588 00:34:11,560 --> 00:34:14,320 Speaker 1: responsible way to do it, so that the least amount 589 00:34:14,320 --> 00:34:17,840 Speaker 1: of damage is done. However, if a company fails to respond, 590 00:34:18,120 --> 00:34:20,920 Speaker 1: like if it doesn't acknowledge the fact that you have 591 00:34:20,960 --> 00:34:26,279 Speaker 1: alerted them to this vulnerability at all, well, sometimes researchers 592 00:34:26,320 --> 00:34:28,880 Speaker 1: will reveal what they know in an effort to pressure 593 00:34:28,960 --> 00:34:32,000 Speaker 1: the company into responding.
If they go public and say 594 00:34:32,400 --> 00:34:35,800 Speaker 1: this vulnerability exists, then there's suddenly this enormous amount of 595 00:34:35,800 --> 00:34:38,680 Speaker 1: pressure on the company to fix it, and it also 596 00:34:38,840 --> 00:34:42,799 Speaker 1: warns the customers of that company that the product they're 597 00:34:42,880 --> 00:34:46,399 Speaker 1: using isn't secure. However, in this case, Microsoft acted very 598 00:34:46,480 --> 00:34:49,520 Speaker 1: quickly and then Brian Krebs reported on it. So back 599 00:34:49,560 --> 00:34:53,200 Speaker 1: to Microsoft. US cybersecurity experts determined that the hacker group 600 00:34:53,280 --> 00:34:56,560 Speaker 1: Hafnium was responsible for these attacks, and that, in turn, 601 00:34:56,719 --> 00:35:00,880 Speaker 1: Hafnium enjoyed the backing and protection of the Chinese government. Further, 602 00:35:01,239 --> 00:35:03,360 Speaker 1: the attacks were assumed to be part of a large 603 00:35:03,520 --> 00:35:07,560 Speaker 1: espionage mission to spy upon and steal information from US companies 604 00:35:07,600 --> 00:35:11,360 Speaker 1: in particular. The Guardian reported that the US government, in 605 00:35:11,400 --> 00:35:14,879 Speaker 1: the form of the FBI, the Federal Bureau of Investigation, 606 00:35:15,040 --> 00:35:19,520 Speaker 1: took a rather extreme step to protect companies by hacking 607 00:35:19,719 --> 00:35:23,759 Speaker 1: into infected systems that belonged to various companies that had 608 00:35:23,840 --> 00:35:27,120 Speaker 1: failed to mitigate the malware themselves, in order to actually 609 00:35:27,160 --> 00:35:31,279 Speaker 1: neutralize the threat. So the FBI hacked into company systems 610 00:35:31,400 --> 00:35:34,799 Speaker 1: in order to shut down the malware. The FBI did 611 00:35:34,840 --> 00:35:38,120 Speaker 1: not go the extra step to install the security updates 612 00:35:38,200 --> 00:35:41,360 Speaker 1: on those systems. So apparently the FBI was fighting hacking 613 00:35:41,480 --> 00:35:44,440 Speaker 1: with hacking, I guess, and not directly against the hackers, 614 00:35:44,480 --> 00:35:47,080 Speaker 1: but against the same companies that had already been hacked. 615 00:35:47,320 --> 00:35:49,960 Speaker 1: When it comes to zero-day vulnerabilities, I don't think 616 00:35:50,000 --> 00:35:53,120 Speaker 1: there's really anything I can say that's terribly helpful. I mean, 617 00:35:53,160 --> 00:35:55,000 Speaker 1: how do you fix a problem if you don't know 618 00:35:55,080 --> 00:35:57,719 Speaker 1: there is a problem? So, you know, it might be 619 00:35:57,840 --> 00:36:02,080 Speaker 1: tempting to blame Microsoft for releasing software platforms that had 620 00:36:02,120 --> 00:36:04,839 Speaker 1: these vulnerabilities in them, but again, if the company has 621 00:36:04,880 --> 00:36:07,960 Speaker 1: no awareness that there's an issue, how are they supposed 622 00:36:07,960 --> 00:36:11,120 Speaker 1: to fix it? So I think just saying do better 623 00:36:11,360 --> 00:36:13,960 Speaker 1: isn't really useful. You see that a lot in the 624 00:36:14,000 --> 00:36:18,359 Speaker 1: gamer community. It's very dismissive and snobby, and I hate 625 00:36:18,360 --> 00:36:21,400 Speaker 1: it and it's not useful at all. It's not helpful. 626 00:36:21,600 --> 00:36:25,719 Speaker 1: So I don't really think that's a viable response.
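One way to picture the wait-and-see patching approach from earlier is as a simple scheduling rule: patch immediately when a flaw is being actively exploited, otherwise let the patch soak for a bit in case it breaks something. A rough Python sketch of that rule, where the seven-day soak window and the severity labels are assumptions for illustration, not any real company's policy:

    from datetime import date, timedelta

    SOAK_DAYS = 7  # assumed soak window; real policies vary widely

    def install_on(released: date, severity: str) -> date:
        # Actively exploited flaws can't wait; everything else soaks.
        if severity == "critical":
            return released
        return released + timedelta(days=SOAK_DAYS)

    print(install_on(date(2021, 3, 2), "critical"))  # 2021-03-02
    print(install_on(date(2021, 3, 2), "moderate"))  # 2021-03-09

The trade-off is right there in the return values: every day of soak time is a day of exposure if the flaw is already being exploited.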
As 627 00:36:25,760 --> 00:36:28,360 Speaker 1: for customers, well, there's even less that you can do 628 00:36:28,400 --> 00:36:30,800 Speaker 1: as a customer to prevent an attack that leverages a 629 00:36:30,880 --> 00:36:34,560 Speaker 1: zero-day vulnerability in some tool that your company is using. 630 00:36:34,719 --> 00:36:36,560 Speaker 1: The best you can do is be on the ball when 631 00:36:36,600 --> 00:36:39,160 Speaker 1: security patches come out to minimize the risk of becoming 632 00:36:39,160 --> 00:36:44,439 Speaker 1: another victim, or to at least temporarily migrate away from 633 00:36:44,520 --> 00:36:47,680 Speaker 1: a tool that's been known to have been compromised. Okay, 634 00:36:48,160 --> 00:36:50,760 Speaker 1: we're finally up to number one on the list, and maybe, 635 00:36:50,760 --> 00:36:52,680 Speaker 1: like I said, I'll do future episodes to cover 636 00:36:52,760 --> 00:36:56,000 Speaker 1: some of the other instances that Kyle Chen mentions. So again, 637 00:36:56,280 --> 00:36:59,719 Speaker 1: if you didn't hear me before, these are all coming 638 00:36:59,719 --> 00:37:02,400 Speaker 1: from a blog post that Kyle Chen wrote for upguard 639 00:37:02,480 --> 00:37:04,640 Speaker 1: dot com. So you can always look up the 640 00:37:04,719 --> 00:37:08,279 Speaker 1: largest data breaches in US history. That'll probably pull that 641 00:37:08,480 --> 00:37:11,000 Speaker 1: article up as a result. That's where I'm getting this 642 00:37:11,080 --> 00:37:16,480 Speaker 1: particular list. But our number one entry is Yahoo, and yeah, 643 00:37:16,880 --> 00:37:20,440 Speaker 1: this one was huge. Also, I can't really say this 644 00:37:20,600 --> 00:37:24,520 Speaker 1: 'one,' because the word 'one' is deceptive. Honestly, this was 645 00:37:24,560 --> 00:37:27,640 Speaker 1: a series of cyber attacks that ultimately resulted in more 646 00:37:27,680 --> 00:37:34,040 Speaker 1: than three billion records being exposed to cyber criminals. Three billion. 647 00:37:34,239 --> 00:37:37,920 Speaker 1: That's nearly forty percent of the world's population. Though honestly, 648 00:37:37,960 --> 00:37:41,879 Speaker 1: that's three billion accounts. I'm sure that some people had 649 00:37:42,000 --> 00:37:44,680 Speaker 1: multiple accounts, so it's not the same thing as three 650 00:37:44,800 --> 00:37:48,200 Speaker 1: billion people, I guess. But that seems like a fine 651 00:37:48,280 --> 00:37:50,600 Speaker 1: point that we don't really need to worry about. If 652 00:37:50,640 --> 00:37:53,200 Speaker 1: we do assume that's three billion people, that means if 653 00:37:53,200 --> 00:37:55,960 Speaker 1: you were to get ten people together, four of those folks, 654 00:37:56,080 --> 00:37:58,960 Speaker 1: or really three point seven eight of them, would be 655 00:37:58,960 --> 00:38:02,279 Speaker 1: affected by this. But those chances would actually depend upon 656 00:38:02,320 --> 00:38:04,200 Speaker 1: where in the world you are, right? Like, if you're 657 00:38:04,200 --> 00:38:06,359 Speaker 1: in a more developed part of the world, then the 658 00:38:06,360 --> 00:38:09,080 Speaker 1: odds are going to be higher that there'll be more 659 00:38:09,200 --> 00:38:12,160 Speaker 1: people there who were affected by this. My analogy only 660 00:38:12,200 --> 00:38:14,359 Speaker 1: works if the ten people are randomly selected from all 661 00:38:14,400 --> 00:38:16,200 Speaker 1: over the world. But anyway, I'm on a tangent.
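For what it's worth, the arithmetic above checks out. A one-line sanity check in Python, assuming a world population of roughly seven point nine billion around that time (the exact figure you assume shifts the result slightly):

    world_population = 7.94e9  # rough assumed figure
    accounts = 3e9
    print(round(accounts / world_population * 10, 2))  # -> 3.78 people out of 10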
So 662 00:38:16,440 --> 00:38:20,240 Speaker 1: what happened? Well, what happened was a combination of various 663 00:38:20,320 --> 00:38:25,120 Speaker 1: hacking techniques, some international intrigue, and a company staying quiet about 664 00:38:25,160 --> 00:38:28,320 Speaker 1: the whole thing for way too long after finding out 665 00:38:28,360 --> 00:38:32,320 Speaker 1: about this intrusion, possibly in an effort to avoid affecting 666 00:38:32,400 --> 00:38:35,760 Speaker 1: a massive acquisition deal that was going on with another 667 00:38:35,800 --> 00:38:40,719 Speaker 1: big company, because this story is juicy, y'all. But we 668 00:38:40,760 --> 00:38:44,680 Speaker 1: begin in twenty thirteen, and this is when the first 669 00:38:44,719 --> 00:38:48,239 Speaker 1: and the largest hack happened, though the public would not 670 00:38:48,400 --> 00:38:53,600 Speaker 1: find out about it for another three years. Now, arguably, 671 00:38:53,680 --> 00:38:57,680 Speaker 1: Yahoo itself remained ignorant of this attack for quite some time, 672 00:38:57,920 --> 00:39:00,600 Speaker 1: but it would take three years for the US public 673 00:39:00,719 --> 00:39:03,640 Speaker 1: to hear that this had happened. Even in that revelation, 674 00:39:04,000 --> 00:39:07,399 Speaker 1: Yahoo would downplay how many accounts were affected, to a 675 00:39:07,440 --> 00:39:09,960 Speaker 1: third in fact, because, as we would later hear in 676 00:39:10,000 --> 00:39:15,799 Speaker 1: twenty seventeen, the attack compromised some three billion accounts, but 677 00:39:15,840 --> 00:39:18,400 Speaker 1: when Yahoo disclosed this attack to the public back in 678 00:39:18,440 --> 00:39:21,759 Speaker 1: twenty sixteen, Yahoo said it was just one billion, which 679 00:39:21,800 --> 00:39:24,359 Speaker 1: is weird. Me saying 'just' is crazy. I mean, 680 00:39:24,400 --> 00:39:27,040 Speaker 1: one billion is a number so huge that I cannot 681 00:39:27,160 --> 00:39:29,560 Speaker 1: possibly imagine it. So you might as well cop to 682 00:39:29,560 --> 00:39:31,520 Speaker 1: all three billion at that point, right? But to be 683 00:39:31,600 --> 00:39:34,560 Speaker 1: fair to Yahoo, I do think it's very possible the 684 00:39:34,600 --> 00:39:37,760 Speaker 1: company honestly did not have a grasp on how large 685 00:39:37,800 --> 00:39:41,320 Speaker 1: this attack actually was at that time. And by large 686 00:39:41,360 --> 00:39:45,719 Speaker 1: I do mean every single Yahoo account, active or otherwise, 687 00:39:46,040 --> 00:39:49,920 Speaker 1: was affected by this, because the company had lousy security. 688 00:39:50,000 --> 00:39:52,800 Speaker 1: There's no getting around it. I don't think that's opinion, 689 00:39:52,840 --> 00:39:55,960 Speaker 1: by the way, because in twenty twelve, Yahoo had already 690 00:39:55,960 --> 00:39:58,760 Speaker 1: been the target of a different hacker attack that ultimately 691 00:39:58,800 --> 00:40:02,400 Speaker 1: compromised nearly half a million accounts. Now half a million 692 00:40:02,920 --> 00:40:05,520 Speaker 1: is a drop in the bucket compared to three billion. 693 00:40:05,680 --> 00:40:09,360 Speaker 1: It's nothing.
But you would think that half a million 694 00:40:09,440 --> 00:40:12,480 Speaker 1: compromised accounts would be a serious wake-up call to Yahoo, 695 00:40:12,560 --> 00:40:14,640 Speaker 1: and it would at least kick things into gear so 696 00:40:14,680 --> 00:40:18,080 Speaker 1: that the company would institute better security practices. But sadly, 697 00:40:18,480 --> 00:40:20,840 Speaker 1: that didn't really happen, or at least it didn't happen 698 00:40:20,880 --> 00:40:23,520 Speaker 1: on a scale and timeline that would prevent the massive 699 00:40:23,560 --> 00:40:27,680 Speaker 1: intrusion in twenty thirteen. Now, it is easy to conflate 700 00:40:28,000 --> 00:40:30,880 Speaker 1: the twenty thirteen attack with another one that happened in 701 00:40:30,920 --> 00:40:34,160 Speaker 1: twenty fourteen. So if you're keeping count, that's three different 702 00:40:34,200 --> 00:40:38,400 Speaker 1: hacker intrusions in three years. Twenty twelve was half a million, 703 00:40:38,520 --> 00:40:44,120 Speaker 1: twenty thirteen three billion, twenty fourteen was like five hundred million. 704 00:40:44,520 --> 00:40:47,040 Speaker 1: I think it is important to draw some distinctions between 705 00:40:47,080 --> 00:40:51,640 Speaker 1: these, because authorities would ultimately identify four individuals believed to 706 00:40:51,640 --> 00:40:55,120 Speaker 1: be responsible for the twenty fourteen attack. These folks would 707 00:40:55,160 --> 00:40:59,280 Speaker 1: also end up having alleged ties to the Russian government. Thus, 708 00:40:59,560 --> 00:41:02,640 Speaker 1: there are allegations that the Yahoo hack in twenty fourteen 709 00:41:02,880 --> 00:41:06,560 Speaker 1: was the result of a state-backed hacking operation, that 710 00:41:06,640 --> 00:41:10,840 Speaker 1: it was essentially funded by the Russians. Now, whether the 711 00:41:10,840 --> 00:41:14,960 Speaker 1: twenty thirteen attack was state-backed or not, we don't 712 00:41:15,160 --> 00:41:17,600 Speaker 1: really know to this day. As far as I can tell, 713 00:41:17,840 --> 00:41:20,640 Speaker 1: no one has definitive information on who was behind 714 00:41:20,640 --> 00:41:24,080 Speaker 1: the twenty thirteen attack. Marissa Mayer, who was CEO of 715 00:41:24,160 --> 00:41:27,719 Speaker 1: Yahoo during those data breaches, would testify to Congress in 716 00:41:27,760 --> 00:41:30,880 Speaker 1: twenty seventeen that the company had been unable to determine 717 00:41:30,880 --> 00:41:34,280 Speaker 1: who was responsible for that particular attack in twenty thirteen. 718 00:41:34,560 --> 00:41:37,120 Speaker 1: She also revealed that she did learn of the twenty 719 00:41:37,200 --> 00:41:40,759 Speaker 1: fourteen intrusion in twenty fourteen, this is the one that 720 00:41:40,800 --> 00:41:43,680 Speaker 1: would affect half a billion accounts, but she didn't learn 721 00:41:43,680 --> 00:41:47,080 Speaker 1: of the actual scale of the attacks until twenty sixteen. 722 00:41:47,360 --> 00:41:50,040 Speaker 1: So if that's true, it meant the attackers had a 723 00:41:50,800 --> 00:41:54,040 Speaker 1: really good ability to hide their tracks and had years 724 00:41:54,040 --> 00:41:57,359 Speaker 1: of opportunity there as well. So presumably no one knew 725 00:41:57,400 --> 00:42:00,359 Speaker 1: about the twenty thirteen attack until much later.
They knew 726 00:42:00,360 --> 00:42:02,680 Speaker 1: about the twenty fourteen attack, but they didn't know how 727 00:42:02,760 --> 00:42:05,439 Speaker 1: bad it was. This also means I can't really talk 728 00:42:05,440 --> 00:42:08,000 Speaker 1: about what happened in twenty thirteen, because the truth of 729 00:42:08,040 --> 00:42:11,120 Speaker 1: the matter is there were several potential exploits that the 730 00:42:11,160 --> 00:42:14,200 Speaker 1: hackers could have used, and the twenty thirteen attack may 731 00:42:14,200 --> 00:42:18,320 Speaker 1: have used multiple avenues to get access to all the information. 732 00:42:18,719 --> 00:42:21,720 Speaker 1: We can, however, talk about what happened in twenty fourteen. 733 00:42:21,760 --> 00:42:24,040 Speaker 1: We know more about that one. It's a tale that 734 00:42:24,120 --> 00:42:28,960 Speaker 1: involves spear phishing and cookies. Now I should also mention this 735 00:42:29,120 --> 00:42:32,080 Speaker 1: version of the story comes to us courtesy of the FBI. 736 00:42:32,560 --> 00:42:36,000 Speaker 1: The agency said that the hackers first sent spear-phishing emails 737 00:42:36,040 --> 00:42:39,239 Speaker 1: to Yahoo employees. So again, these are emails meant to 738 00:42:39,280 --> 00:42:42,440 Speaker 1: trick someone into taking some sort of action that benefits 739 00:42:42,480 --> 00:42:44,480 Speaker 1: the hacker, and in this case, it was clicking on 740 00:42:44,520 --> 00:42:48,040 Speaker 1: a link that ultimately would give the hackers administrator-level 741 00:42:48,080 --> 00:42:51,680 Speaker 1: access to some of Yahoo's systems, but not all of 742 00:42:51,719 --> 00:42:54,560 Speaker 1: their systems. But it did give them a chance to 743 00:42:54,600 --> 00:42:57,759 Speaker 1: look at some code that Yahoo used that would be 744 00:42:57,840 --> 00:43:01,880 Speaker 1: really valuable, and this included code on how Yahoo generated 745 00:43:02,080 --> 00:43:06,440 Speaker 1: Internet cookies. So, a quick refresher course on what cookies 746 00:43:06,480 --> 00:43:08,959 Speaker 1: are and what they do. With the web, you've got 747 00:43:08,960 --> 00:43:12,040 Speaker 1: clients and you've got servers. So if you're browsing the 748 00:43:12,080 --> 00:43:15,800 Speaker 1: web on your computer or your smartphone, you're using a client. 749 00:43:16,280 --> 00:43:20,480 Speaker 1: Servers are sending, or serving, you the information that you're requesting. 750 00:43:20,640 --> 00:43:23,120 Speaker 1: So if you type in an address in your web 751 00:43:23,120 --> 00:43:27,640 Speaker 1: browser like www dot Google dot com, your client sends 752 00:43:27,680 --> 00:43:30,520 Speaker 1: a request to go to that website. This request gets 753 00:43:30,640 --> 00:43:33,840 Speaker 1: routed to the appropriate web server, which then sends the 754 00:43:33,920 --> 00:43:36,760 Speaker 1: data to your client, so that you are visiting Google. 755 00:43:36,920 --> 00:43:39,520 Speaker 1: So each interaction is kind of like this. Now, to 756 00:43:39,600 --> 00:43:43,480 Speaker 1: save some time and effort, we have cookies. So cookies 757 00:43,520 --> 00:43:46,480 Speaker 1: are little bits of info that get saved to your 758 00:43:46,560 --> 00:43:49,880 Speaker 1: computer, the client, and cookies can do really handy stuff.
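Since the cookie mechanics matter for what comes next, here's a minimal Python sketch of a server handing out a session cookie and recognizing the client when it returns. It uses only the standard library; the port number and the placeholder login are arbitrary, and a real service would of course do actual authentication.

    import secrets
    from http.server import BaseHTTPRequestHandler, HTTPServer

    SESSIONS = {}  # token -> username; an in-memory session store

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Look for a session cookie the client sent back to us.
            token = None
            for part in self.headers.get("Cookie", "").split(";"):
                name, _, value = part.strip().partition("=")
                if name == "session":
                    token = value
            if token in SESSIONS:
                body = f"Welcome back, {SESSIONS[token]}!".encode()
                self.send_response(200)
            else:
                # First visit: mint a random token and hand it to the client.
                token = secrets.token_hex(16)
                SESSIONS[token] = "Billy Bob"  # placeholder "login"
                body = b"Hello, new visitor. Have a cookie."
                self.send_response(200)
                self.send_header("Set-Cookie", f"session={token}; HttpOnly")
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), Handler).serve_forever()

Hit it twice with the same browser and the second response greets you by name. That round trip is the whole trick, and it's what the shopping example that follows relies on.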
759 00:43:49,880 --> 00:43:52,680 Speaker 1: For example, let's say that you've logged into an account 760 00:43:52,680 --> 00:43:55,400 Speaker 1: on a service like Amazon and you're doing some shopping, 761 00:43:55,560 --> 00:43:57,480 Speaker 1: and then you navigate away because you have to 762 00:43:57,480 --> 00:44:00,200 Speaker 1: go do something else. But then, gosh darn it, you remember: oh well, 763 00:44:00,200 --> 00:44:02,120 Speaker 1: you know what, I should have ordered something else on 764 00:44:02,160 --> 00:44:04,839 Speaker 1: top of everything else. So you go back to Amazon. Well, 765 00:44:04,880 --> 00:44:06,719 Speaker 1: if you didn't have cookies, you would need to go 766 00:44:06,960 --> 00:44:10,000 Speaker 1: all the way back through the login process, because Amazon 767 00:44:10,040 --> 00:44:13,280 Speaker 1: wouldn't know it was you again. But with a cookie 768 00:44:13,320 --> 00:44:16,560 Speaker 1: stored on your computer, Amazon can say, oh, yeah, that's 769 00:44:16,560 --> 00:44:19,560 Speaker 1: Billy Bob, who logged in earlier and never logged out; they're 770 00:44:19,680 --> 00:44:22,680 Speaker 1: back. Let's just let them continue their session. So cookies 771 00:44:22,719 --> 00:44:26,080 Speaker 1: are how web pages quote unquote know where you were 772 00:44:26,200 --> 00:44:28,359 Speaker 1: the last time you visited, so that you can kind 773 00:44:28,360 --> 00:44:30,880 Speaker 1: of pick up where you left off. Of course, cookies 774 00:44:30,920 --> 00:44:33,240 Speaker 1: can do more than that, and there's a whole discussion 775 00:44:33,239 --> 00:44:36,080 Speaker 1: to be had about tracking and whatnot with cookies, but 776 00:44:36,080 --> 00:44:38,560 Speaker 1: we're going to leave that for now. So how did 777 00:44:38,560 --> 00:44:41,320 Speaker 1: the attack happen? Well, the hypothesis goes that the hackers 778 00:44:41,320 --> 00:44:44,800 Speaker 1: were able to figure out how to forge Yahoo cookies 779 00:44:45,080 --> 00:44:50,280 Speaker 1: because Yahoo was not good at practicing decent security, because 780 00:44:50,360 --> 00:44:53,600 Speaker 1: ideally each cookie would contain data that's unique to a 781 00:44:53,680 --> 00:44:56,360 Speaker 1: single individual, and that would be very hard to crack. 782 00:44:56,440 --> 00:44:59,560 Speaker 1: But Yahoo apparently wasn't really doing this. 783 00:44:59,600 --> 00:45:02,359 Speaker 1: You could say Yahoo was kind of behaving sort of 784 00:45:02,400 --> 00:45:05,200 Speaker 1: like how certain German officers in World War Two were 785 00:45:05,400 --> 00:45:10,160 Speaker 1: reusing code phrases while encrypting messages using an Enigma machine. Now, 786 00:45:10,200 --> 00:45:13,960 Speaker 1: the practice of reusing those code phrases, that's what gave 787 00:45:14,040 --> 00:45:17,640 Speaker 1: British code breakers the advantage that they needed to actually 788 00:45:17,680 --> 00:45:20,600 Speaker 1: break the code. If the Germans had used more disciplined 789 00:45:20,600 --> 00:45:23,200 Speaker 1: security measures, the code likely would have stood up to 790 00:45:23,239 --> 00:45:27,120 Speaker 1: the Brits' best efforts and it might not ever have been broken. Possibly. 791 00:45:27,200 --> 00:45:29,399 Speaker 1: I mean, because we only know what 792 00:45:29,480 --> 00:45:32,040 Speaker 1: did happen, it's impossible to say that what could have 793 00:45:32,120 --> 00:45:35,560 Speaker 1: happened was a definite possibility.
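Yahoo's actual cookie-minting code was never made public, so I can't show what the hackers saw. But the standard defense being gestured at here, cookie data that's unique per user and infeasible to forge, usually means signing the cookie with a secret key the server never reveals. A minimal Python sketch of that idea, with an invented user ID:

    import hashlib
    import hmac
    import secrets

    SERVER_SECRET = secrets.token_bytes(32)  # known only to the server

    def make_cookie(user_id: str) -> str:
        # Sign the user id; without SERVER_SECRET, nobody can mint this.
        sig = hmac.new(SERVER_SECRET, user_id.encode(), hashlib.sha256).hexdigest()
        return f"{user_id}.{sig}"

    def verify_cookie(cookie: str):
        user_id, _, sig = cookie.rpartition(".")
        expected = hmac.new(SERVER_SECRET, user_id.encode(), hashlib.sha256).hexdigest()
        # compare_digest avoids leaking information through timing.
        return user_id if hmac.compare_digest(sig, expected) else None

    cookie = make_cookie("billy_bob")
    assert verify_cookie(cookie) == "billy_bob"
    # Knowing the cookie format but not the secret gets an attacker nowhere:
    assert verify_cookie("admin." + "0" * 64) is None

If the signing step is missing, or the secret is recoverable from leaked source code, forging a valid cookie becomes exactly the shortcut described next.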
Anyway, all this meant that 794 00:45:35,600 --> 00:45:39,120 Speaker 1: the hackers could actually forge a cookie, and they could 795 00:45:39,160 --> 00:45:43,560 Speaker 1: fool Yahoo into thinking that the hackers were actually somebody else, 796 00:45:43,680 --> 00:45:46,279 Speaker 1: like a legitimate Yahoo user. So it meant that they 797 00:45:46,280 --> 00:45:51,120 Speaker 1: could access user accounts and then spy on that user's correspondence, 798 00:45:51,239 --> 00:45:55,359 Speaker 1: plus scrape any data relating to that user and their 799 00:45:55,400 --> 00:45:58,680 Speaker 1: other online accounts. So potentially they could compromise even 800 00:45:58,760 --> 00:46:00,960 Speaker 1: more data in the process, right? Like, if you 801 00:46:01,160 --> 00:46:04,479 Speaker 1: figure out a user's username and password, and you also 802 00:46:04,520 --> 00:46:06,840 Speaker 1: see that they use other services, you can try and 803 00:46:06,960 --> 00:46:08,759 Speaker 1: use that and see if it works, because so many 804 00:46:08,760 --> 00:46:11,480 Speaker 1: people reuse their passwords. I said it in the last episode, 805 00:46:11,520 --> 00:46:13,799 Speaker 1: I'll say it again: don't do that. And you start 806 00:46:13,840 --> 00:46:15,720 Speaker 1: to see how a data breach can be much bigger 807 00:46:15,760 --> 00:46:18,880 Speaker 1: than just this enormous scope of three billion accounts. So, 808 00:46:19,000 --> 00:46:22,880 Speaker 1: according to Marissa Mayer, Yahoo learned about the twenty fourteen attack, 809 00:46:23,440 --> 00:46:26,040 Speaker 1: even though they might not have known what the scope was, 810 00:46:26,239 --> 00:46:29,000 Speaker 1: but they kept things quiet, possibly in part because there 811 00:46:29,080 --> 00:46:33,920 Speaker 1: was this big acquisition deal: Verizon wanted to acquire Yahoo. 812 00:46:34,280 --> 00:46:36,799 Speaker 1: Some of this news, however, would break before that deal 813 00:46:36,840 --> 00:46:39,719 Speaker 1: would be finalized, and Verizon would drop the offer by 814 00:46:39,719 --> 00:46:42,719 Speaker 1: a whopping three hundred and fifty million dollars. So that 815 00:46:42,880 --> 00:46:47,640 Speaker 1: was a big deduction on the final cost for that acquisition, 816 00:46:47,880 --> 00:46:50,319 Speaker 1: and even that might have been too modest, because, of course, 817 00:46:50,400 --> 00:46:53,920 Speaker 1: later we would find out the real scope of those attacks. 818 00:46:54,160 --> 00:46:56,479 Speaker 1: Lawsuits would follow, and Yahoo would pay out a little 819 00:46:56,520 --> 00:46:58,960 Speaker 1: more than one hundred million dollars as a result of 820 00:46:58,960 --> 00:47:02,960 Speaker 1: the lawsuits: a few tens of millions here, tens of 821 00:47:02,960 --> 00:47:06,440 Speaker 1: millions there in different fines. But honestly, I think it 822 00:47:06,480 --> 00:47:09,360 Speaker 1: was a pretty small amount considering the nature and scope 823 00:47:09,400 --> 00:47:12,000 Speaker 1: of the attack. Then again, the whole thing did cost 824 00:47:12,080 --> 00:47:15,040 Speaker 1: Yahoo hundreds of millions of dollars, because the acquisition 825 00:47:15,440 --> 00:47:18,560 Speaker 1: was reduced in price too.
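On the password-reuse point a moment ago: one concrete thing you can do is check whether a password has already shown up in known breaches. The Pwned Passwords service supports this with a k-anonymity scheme, where only the first five characters of the password's SHA-1 hash ever leave your machine. A small Python sketch using just the standard library:

    import hashlib
    import urllib.request

    def times_breached(password: str) -> int:
        # Hash locally; only the 5-character prefix goes over the wire.
        digest = hashlib.sha1(password.encode()).hexdigest().upper()
        prefix, suffix = digest[:5], digest[5:]
        url = f"https://api.pwnedpasswords.com/range/{prefix}"
        with urllib.request.urlopen(url) as resp:
            # Each response line is "HASH_SUFFIX:COUNT".
            for line in resp.read().decode().splitlines():
                candidate, _, count = line.partition(":")
                if candidate.strip() == suffix:
                    return int(count)
        return 0

    print(times_breached("password123"))  # spoiler: a depressingly big number

A nonzero result means the password is already circulating in breach dumps and shouldn't be used, let alone reused, anywhere.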
Only one of the four 826 00:47:18,600 --> 00:47:22,200 Speaker 1: men identified as hackers in that twenty fourteen attack was 827 00:47:22,320 --> 00:47:25,399 Speaker 1: ever arrested and sentenced for this crime, and that was a 828 00:47:25,440 --> 00:47:29,440 Speaker 1: Canadian citizen named Karim Baratov. He was sentenced to five 829 00:47:29,520 --> 00:47:33,240 Speaker 1: years in prison. He even wrote a book about his experiences. 830 00:47:33,360 --> 00:47:36,640 Speaker 1: The other three, which included two members of the 831 00:47:36,719 --> 00:47:39,920 Speaker 1: Russian FSB, that's kind of like the successor to the 832 00:47:40,000 --> 00:47:42,840 Speaker 1: KGB from the old Soviet days, were accused, but 833 00:47:42,880 --> 00:47:45,640 Speaker 1: they were never arrested, and of course Russia denied having 834 00:47:45,719 --> 00:47:49,080 Speaker 1: any involvement in the attacks whatsoever. And that brings us 835 00:47:49,160 --> 00:47:52,319 Speaker 1: down to the last of my top ten. Again, I 836 00:47:52,320 --> 00:47:55,239 Speaker 1: don't wish to suggest that Kyle Chen had ranked these 837 00:47:55,280 --> 00:47:57,880 Speaker 1: in any specific way. This was kind of how I 838 00:47:57,960 --> 00:48:00,640 Speaker 1: ranked them, and there were sixteen other ones on that list. 839 00:48:00,680 --> 00:48:03,480 Speaker 1: So again, go to upguard dot com and search for 840 00:48:03,520 --> 00:48:06,080 Speaker 1: that article about the largest data breaches in US history 841 00:48:06,120 --> 00:48:07,879 Speaker 1: if you want to read more about them. Maybe I'll 842 00:48:07,880 --> 00:48:10,239 Speaker 1: do an episode about those later. I might do an 843 00:48:10,239 --> 00:48:14,040 Speaker 1: episode just about the Ticketmaster data breach, because that happened 844 00:48:14,040 --> 00:48:17,840 Speaker 1: at a particularly bad time for Ticketmaster, since it's already 845 00:48:18,000 --> 00:48:20,600 Speaker 1: in the crosshairs of the US government due to an 846 00:48:20,600 --> 00:48:24,240 Speaker 1: antitrust lawsuit. So maybe I'll do a full episode about 847 00:48:24,280 --> 00:48:26,319 Speaker 1: that in the future too. In the meantime, I hope 848 00:48:26,400 --> 00:48:29,680 Speaker 1: you're all safe. I hope all your personal information is safe, 849 00:48:29,719 --> 00:48:38,880 Speaker 1: and I'll talk to you again really soon. Tech Stuff 850 00:48:39,000 --> 00:48:43,520 Speaker 1: is an iHeartRadio production. For more podcasts from iHeartRadio, visit 851 00:48:43,560 --> 00:48:47,080 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you listen to 852 00:48:47,120 --> 00:48:48,080 Speaker 1: your favorite shows.