Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? So this morning I woke up to a text message from a radio station in Alabama. I have frequently been on that station talking about tech stuff in general, and they were asking if I would be willing to jump on the air this morning to talk about some Google Chrome news that was unfolding, specifically that Google discovered a zero day vulnerability within Chrome that could potentially compromise users, like upwards of three billion of them. So today I thought I would talk a little bit about what a zero day vulnerability is, touch on a little bit of terminology used in the information security sector to kind of demystify a bit of it and to help folks like me wrap our minds around the whole thing. Now, for those of y'all who are neck deep in computer security fields, you're going to know all of this already. This is sort of a one oh one of what zero day vulnerabilities are all about. So let's define that.
Speaker 1: Because zero day vulnerability, zero day exploit, zero day attack, these terms seem to mean something, but you don't necessarily understand it when you first see it. Essentially, it's describing a situation in which there is something exploitable within some sort of technical system. This system can be software, it can be hardware, and this vulnerability exists within the system, and the people who should be on the lookout for that kind of thing are unaware of it. So, in other words, the company behind the system doesn't know that the vulnerability exists. The security community at large doesn't know that this vulnerability exists, but it's there. So if you think about it, think of it in physical terms. Let's think about, like, a giant security wall that goes around your home, and it turns out there's a gap in the wall, but it's behind some shrubs and stuff, and you just haven't gone back there, and you haven't noticed it. You don't even know there's a gap there. Well, that gap obviously shows that there's a hole in your security and you just aren't aware of it.
Speaker 1: But if someone else is aware of it, they could exploit that without your awareness and potentially cause a lot of damage. And if they're very clever, they can cause a lot of damage over an extended amount of time, as long as they don't make you aware that that hole is there. That's kind of the idea here. It's partly called a zero day vulnerability because the entity behind the product has had zero days to fix it before someone figured out a way to exploit it. In other words, you can't really fix a problem if you don't know that there is a problem there, right? Again, if you look out and you don't see the hole in the wall because it's being obscured by something else, then you don't know that there's a problem to fix. You just assume everything's fine, and it's not until some other signs pop up that suggest it's not fine that you start looking, and then maybe you uncover it. At that point you could argue, all right, well, now it's no longer zero day, because we found it and we can do something about it.
Speaker 1: But the gap of time between when the problem first manifested and when you found out about it, that represents the target of opportunity for, in the case of computer security, hackers. So let's give an example with Google Chrome. Google makes the Chrome web browser. Engineers create an update for the browser, and in the process there is a gap in the browser's security. It's unintentional. It was not put there on purpose. It just sort of happens, because this is complicated stuff. So the engineers didn't detect the gap. They pushed it through, and the update goes through the entire development process, and then Google deploys the update. They roll it out, and people start to update their browsers, or the browser gets updated upon a reboot or something, and thus the update gets installed, and now the security gap is affecting those machines because they actually have that version of Chrome in place. Now, in the wild, hackers are also scouring Chrome, but they're looking for these vulnerabilities. They're probing the code, looking for weaknesses they can exploit.
Speaker 1: It might mean creating some specific types of malware or some sort of process that will initiate a sequence that allows them to exploit the vulnerability. So it takes more than just finding that there's a vulnerability there. You have to figure out a way to exploit it. But these are all things that hackers do on the regular. So they find this gap, then they scramble to develop the tools that can take advantage of this gap. Then they deploy those tools, perhaps doing so very quietly in an effort to extend the amount of time that they have to exploit this vulnerability before Google or some computer security expert from somewhere else finds this gap and then alerts everybody to it.
Speaker 1: The hackers do their best to use the gap to do whatever it is they want to do. Depending upon the nature of the gap, that could include anything from spying on computer activity on a compromised machine to forcing a computer to execute malicious code, remote code execution in other words. And really it just depends upon the nature of the vulnerability and the nature of the malware being developed, so there are lots of different possibilities there. So when we, the general public, hear about a zero day vulnerability, it's kind of like hearing about something that was a zero day vulnerability but now kind of isn't, because it's now not just the hackers who know about it. Now, there might not be much we can do about it at the time. But often when we hear about zero day vulnerabilities for the first time, it comes along with a solution that's being rolled out at the same time, because companies that are hit by this sort of stuff, if they can hold on to that information until they have a solution, they typically will, because otherwise you're panicking people and you have no way to fix things, right?
Speaker 1: So it may be that when you hear about this vulnerability, it's at the same time you hear about a patch, a security patch that's been rolled out that you can update your device with. Typically we see a very fast response to the discovery. Sometimes the response we see is in other areas of cybersecurity. For example, antivirus software creators might come up with tools to help detect applications that have been hit by zero day vulnerabilities and prevent them from doing malicious activity on your machine, and in the meantime you're still waiting for the creator of the root problem to roll out a patch. That often can be the case too: companies will patch their products. So Google, for example, pushed out a patch for this most recent vulnerability already, which, by the way, is where I need to point out to all of you that if you use Google Chrome, you should check to see if it's up to date, just to protect yourself. If you're wondering how to do that, in the upper right corner of Google Chrome, there are the little three dots.
128 00:07:49,240 --> 00:07:51,480 Speaker 1: If you click on that, it brings up settings, like 129 00:07:51,560 --> 00:07:55,160 Speaker 1: you can choose settings from a drop down menu, and 130 00:07:55,200 --> 00:07:57,720 Speaker 1: then from the next menu that pops up, you look 131 00:07:57,760 --> 00:08:01,600 Speaker 1: at about Chrome and it will have a little button 132 00:08:01,640 --> 00:08:05,080 Speaker 1: there that lets you update your browser at that point, 133 00:08:05,120 --> 00:08:07,320 Speaker 1: and you should click on that make sure your browser 134 00:08:07,400 --> 00:08:10,440 Speaker 1: is up to date and patched so that you are 135 00:08:10,480 --> 00:08:17,800 Speaker 1: not going to be victimized by this type of vulnerability. Now, 136 00:08:17,840 --> 00:08:21,920 Speaker 1: sometimes hackers will even know about vulnerabilities and code before 137 00:08:21,960 --> 00:08:25,720 Speaker 1: the code gets released. So imagine for a moment that 138 00:08:25,840 --> 00:08:30,880 Speaker 1: you are a black hat hacker. Your job is to 139 00:08:30,960 --> 00:08:35,199 Speaker 1: try and find vulnerabilities in various systems and platforms and 140 00:08:35,240 --> 00:08:38,040 Speaker 1: then figure out ways to exploit them. So you are 141 00:08:38,080 --> 00:08:43,040 Speaker 1: an expert in penetrating systems, and maybe you're on the 142 00:08:43,080 --> 00:08:46,120 Speaker 1: payroll of a nation state. Maybe you know Russia or 143 00:08:46,200 --> 00:08:49,480 Speaker 1: China is paying you to do this, or maybe you 144 00:08:49,559 --> 00:08:51,839 Speaker 1: work in the United States for one of those organizations 145 00:08:51,880 --> 00:08:54,679 Speaker 1: that goes by initials all the time, like the NSSA, 146 00:08:55,240 --> 00:08:58,200 Speaker 1: because they do this too. 
Speaker 1: So information is a precious resource, and access to information is your specialty as this kind of hacker. So you dedicate your time to gaining access to developer platforms on the QT so that you can look at code while it's still in development. Maybe you start probing this code for vulnerabilities well before it gets rolled out to the public. Now, if you're lucky and you're very good, you find something that can be used immediately as soon as this gets rolled out. And this is the best case scenario for you, because you already know the vulnerability, you can already work on an exploit, and the tool hasn't even been pushed out yet. So at the moment that it goes public, chances are you'll be able to exploit it right away. It really does become a zero day vulnerability in that case. This gives you the maximum amount of time to work with that vulnerability and to compromise target devices before someone gets wise. And again, there's no telling how long that might take. Sometimes it can be years before someone realizes, hey, something hinky's going on, let's look into this.
Speaker 1: Now, of course, chances are you're not necessarily the person who's making use of these vulnerabilities. You're finding them, but you aren't necessarily the same person who's exploiting them. Instead, you find these vulnerabilities and then you peddle your knowledge on the black market, where you're selling information to criminals and the like, and they in turn will actually use the exploits to leverage that vulnerability. You might instead go to the gray market. This is where you could be selling the information to, say, researchers in the cybersecurity field, or maybe to a defense contractor who in turn is working alongside military organizations. I mean, we've seen this too, right? We've seen companies that are essentially defense contractors develop malware that exploits zero day vulnerabilities. A great example of that is the NSO Group, the Israeli company that created the Pegasus malware that targeted iOS devices. So you might be working with intelligence agencies to do this sort of thing. Or maybe you are one of the good guys out there.
Speaker 1: You're a white hat, so what you're doing is not trying to sell off this information to criminal bidders on the black market. Instead, what you are doing is taking part in bug bounties, where a company will offer a reward if you find security vulnerabilities in their products. And so you discover the vulnerability, you send a bug report to the company, they verify that it is in fact a vulnerability, and then they pay you while they get to work patching that vulnerability out before any bad guy out there figures out that this thing's there, or, if they have figured it out, before they can do too much damage with it.
194 00:12:00,880 --> 00:12:05,040 Speaker 1: So hopefully the white hats get the information to the 195 00:12:05,120 --> 00:12:08,000 Speaker 1: right people in time, and the right people take the 196 00:12:08,080 --> 00:12:13,080 Speaker 1: right actions to prevent any exploitation of that vulnerability, though 197 00:12:13,240 --> 00:12:17,079 Speaker 1: that you know, is never a guarantee, so these zero 198 00:12:17,160 --> 00:12:23,360 Speaker 1: day attacks typically they are devastating because there's no defense 199 00:12:23,600 --> 00:12:28,280 Speaker 1: right like, there's no way as a user, there's nothing 200 00:12:28,320 --> 00:12:31,680 Speaker 1: you can do because you don't you're not doing anything 201 00:12:31,679 --> 00:12:34,160 Speaker 1: with the code on the back end of the various 202 00:12:34,640 --> 00:12:39,840 Speaker 1: devices and programs you're using. But beyond that, if no 203 00:12:39,880 --> 00:12:42,880 Speaker 1: one is aware of them, apart from the attackers, then 204 00:12:43,280 --> 00:12:45,800 Speaker 1: there's going to be a lot of damage done up 205 00:12:45,880 --> 00:12:48,920 Speaker 1: until that discovery has been made. They can also happen 206 00:12:48,960 --> 00:12:52,000 Speaker 1: at a very high level. These can be attacks that 207 00:12:52,040 --> 00:12:56,240 Speaker 1: are not going after the general citizen. They might cast 208 00:12:56,280 --> 00:12:58,960 Speaker 1: a very wide net in order to get as many 209 00:12:59,360 --> 00:13:02,800 Speaker 1: people lump as possible, but the hope typically is to 210 00:13:02,920 --> 00:13:06,000 Speaker 1: land a few important targets, like you're aiming to get 211 00:13:06,000 --> 00:13:11,760 Speaker 1: some high profile or important people, whether those are politicians 212 00:13:12,000 --> 00:13:16,640 Speaker 1: or activists or journalists. You know, it depends on case 213 00:13:16,640 --> 00:13:22,000 Speaker 1: to case, but that's often the goal. Now. 
Speaker 1: Granted, even if that's the goal, people might still make use of all that other information by selling it off on the black market or whatever, so, you know, maximize your gains. But typically these kinds of attacks are not necessarily meant to grab John Smith's information or whatever. They're going after bigger fish. So the vast majority of machines affected by a vulnerability exploit might not be of much interest to the hacker, at least not directly. That's very small comfort when you figure out that your machine has been hit. But at the very least you might luck out in that you're not the type of person the hackers were targeting, and they aren't exploiting the information right away. Unless, of course, you are an important person, in which case, hey, thanks for listening to Tech Stuff. Thanks, everybody, for listening to Tech Stuff. You're all important to me. And honestly, the list of important people can get pretty big depending upon the aim of the attack. Okay, we're going to take a quick break. When we come back, we'll talk more about zero day vulnerabilities and attacks.
Speaker 1: Okay, we're back. And up to this point, I've kind of been using Google Chrome as an example because, well, we just had that news break that up to around three billion users could have been affected by this particular exploit, or this vulnerability, and web browsers have frequently been the focus of zero day attacks. So even without this most recent example, we could still talk about Google Chrome. This is not the first time there's been a zero day attack or zero day vulnerability with Google Chrome. It's happened multiple times in the past. The evolving nature of the web means that companies that make web browsers are constantly updating their products, whether to make them run more efficiently, or add new features, or to accommodate new types of web technology. They need to update the browser to make it work. This, however, can introduce the chance for vulnerabilities to emerge. As software gets more complicated, changes to that software can have unexpected consequences.
249 00:15:32,880 --> 00:15:34,880 Speaker 1: I'm sure anyone out there who has worked on a 250 00:15:34,880 --> 00:15:38,440 Speaker 1: complicated system, whether it's software or otherwise, they know that 251 00:15:38,520 --> 00:15:41,800 Speaker 1: when you fix one problem, sometimes the fix can end 252 00:15:41,880 --> 00:15:46,680 Speaker 1: up causing three more problems somewhere else because the interconnections 253 00:15:46,680 --> 00:15:50,600 Speaker 1: between all these different components gets really really complex. By 254 00:15:50,640 --> 00:15:53,920 Speaker 1: the way, this is where all of us should take 255 00:15:53,960 --> 00:15:57,680 Speaker 1: time to thank people who work in QA, because it's 256 00:15:57,720 --> 00:16:01,440 Speaker 1: their job to test products and look for problems so 257 00:16:01,480 --> 00:16:05,880 Speaker 1: that hopefully issues can be fixed before the product has 258 00:16:05,960 --> 00:16:09,320 Speaker 1: headed out to the real world. And even when you 259 00:16:09,360 --> 00:16:11,240 Speaker 1: sit there and think, wow, I have this thing and 260 00:16:11,280 --> 00:16:13,160 Speaker 1: it doesn't work nearly as well as I hoped it, 261 00:16:13,200 --> 00:16:16,880 Speaker 1: would just know that there's a really good chance there 262 00:16:16,880 --> 00:16:19,840 Speaker 1: were QA people working on that where if they had 263 00:16:19,880 --> 00:16:22,400 Speaker 1: not been there, you wouldn't even get the level of 264 00:16:22,440 --> 00:16:25,840 Speaker 1: performance that you got out of that thing. So, yeah, 265 00:16:25,920 --> 00:16:28,520 Speaker 1: QA people are really super important. I don't just say 266 00:16:28,520 --> 00:16:31,120 Speaker 1: that because I happen to be married to one. Now, 267 00:16:31,560 --> 00:16:35,960 Speaker 1: beyond web browsers, there are other types of products that 268 00:16:36,640 --> 00:16:41,160 Speaker 1: are rich targets for zero day attacks. 
Speaker 1: Now, again, for a zero day attack to even work, there has to be a vulnerability, right? There's no guarantee a vulnerability will be there. But there are certain types of technologies that hackers focus on more, because the potential of finding a vulnerability and then exploiting it also means the potential of hitting a huge number of targets. So browsers are way up there, because that's how a lot of people access the Internet, right? They're using the web based Internet. Operating systems are also way up there, and so are email systems. The Internet of Things era introduced tons of new components that connect to information networks, and in the rush to build new and sometimes useful tools (not always; the Internet of Things has got a lot of stuff that you could argue has limited use, if any, and a ton of stuff that is really useful), well, whenever you're tapping into a networked communication infrastructure, you're potentially introducing a vulnerability to the overall system, especially if you have not taken the time to build real security into your product.
Speaker 1: And time and again there have been stories about Internet of Things devices that had limited or no security to them, which created a great intrusion point for hackers. So really, any networked component can potentially be ground zero for a zero day attack, whether it's hardware, firmware, or software. It's just that stuff like browsers and operating systems are so widely deployed, they're so prominent, that these targets are often the most desirable, because everybody's got an operating system, and just about everybody's got a browser, but not everyone has, I don't know, like a smart seismometer attached to their network. And so, as a hacker, you focus on these big, big targets hoping to find vulnerabilities, because of the potential of how many hits you're going to get on an attack. It may be that you find an incredible attack for some Internet of Things connected device, but if there aren't a ton of those out in the world, then it still limits the effectiveness of your attack.
Speaker 1: Right. So you're balancing this out: how bad is the vulnerability, how well can I exploit it, how widely is it distributed, and how long do I think I can get away with it? Well, that being said, if we look back on the history of zero day attacks, one of the standouts that comes up is an early example of zero day attacks, although the term zero day predates the discovery of this particular attack. I say that because a lot of the resources I looked at called this the first zero day attack, which is kind of funny, because we had the term before we even knew it existed. But it's Stuxnet. You may have heard of that. This was in the news more than a decade ago at this point, but it was intended to infect a specific kind of system. I've actually done an episode about Stuxnet, so I'm not going to go into a full history of it, but I will talk a bit about what it was and what was going on.
Okay, So around two thousand 322 00:20:05,840 --> 00:20:09,479 Speaker 1: and five or two thousand and six, some programmers, or 323 00:20:09,800 --> 00:20:14,360 Speaker 1: hackers if you prefer, were hard at work developing a 324 00:20:14,440 --> 00:20:19,159 Speaker 1: sneaky kind of malware. And based upon the scope of 325 00:20:19,200 --> 00:20:24,840 Speaker 1: this malware, the target of the malware, and the sophistication 326 00:20:25,040 --> 00:20:29,119 Speaker 1: of the attack, it's pretty clear that this had to 327 00:20:29,160 --> 00:20:33,960 Speaker 1: be a state sponsored effort, that this was a group 328 00:20:34,000 --> 00:20:37,680 Speaker 1: of hackers who had access to a lot of resources, 329 00:20:38,040 --> 00:20:41,640 Speaker 1: like in the form of money and stuff, and subsequently 330 00:20:41,720 --> 00:20:45,320 Speaker 1: people have sussed out that it was probably the United States 331 00:20:45,320 --> 00:20:49,640 Speaker 1: and Israel working together to do this. So the malware 332 00:20:50,240 --> 00:20:53,320 Speaker 1: had to do several things. First, it needed to 333 00:20:53,359 --> 00:20:59,600 Speaker 1: be able to infect a target machine and spread very easily. Second, 334 00:21:00,080 --> 00:21:03,439 Speaker 1: it needed to remain undetectable, so it needed to not 335 00:21:03,720 --> 00:21:07,480 Speaker 1: cause too much trouble, or else someone might catch on 336 00:21:07,600 --> 00:21:10,840 Speaker 1: that something hinky is happening. Third, it needed to be 337 00:21:10,840 --> 00:21:15,600 Speaker 1: able to transfer itself onto a device like a flash drive.
338 00:21:16,040 --> 00:21:18,120 Speaker 1: So if you were to plug a flash drive into 339 00:21:18,160 --> 00:21:20,959 Speaker 1: an infected computer, it needed to be able to copy 340 00:21:21,080 --> 00:21:24,479 Speaker 1: itself onto that flash drive, along with whatever else it 341 00:21:24,600 --> 00:21:28,560 Speaker 1: was you were planning to put on that flash drive. Fourth, 342 00:21:28,880 --> 00:21:31,040 Speaker 1: it had to carry programming that would allow it to 343 00:21:31,080 --> 00:21:37,560 Speaker 1: manipulate systems with programmable logic controllers. So these components, also 344 00:21:37,680 --> 00:21:44,919 Speaker 1: known as PLCs, connect to industrial machinery. So essentially, PLCs 345 00:21:45,000 --> 00:21:49,680 Speaker 1: let a computer system send commands to industrial equipment that 346 00:21:49,760 --> 00:21:54,199 Speaker 1: does something, whatever industrial process it needs to do, but 347 00:21:54,240 --> 00:21:56,840 Speaker 1: the computer can control it, and the PLC is kind 348 00:21:56,880 --> 00:22:01,359 Speaker 1: of the interface that allows it to communicate with this 349 00:22:01,480 --> 00:22:05,080 Speaker 1: industrial equipment. And in the case of Stuxnet, it 350 00:22:05,160 --> 00:22:09,280 Speaker 1: was a specific kind of industrial equipment. It was a 351 00:22:09,280 --> 00:22:15,960 Speaker 1: centrifuge that was used to process uranium, specifically to refine uranium, 352 00:22:16,280 --> 00:22:21,640 Speaker 1: because the target for Stuxnet was Iran's nuclear program. 353 00:22:22,240 --> 00:22:28,000 Speaker 1: So the computer systems responsible for controlling centrifuges were the 354 00:22:28,040 --> 00:22:32,840 Speaker 1: goal here.
And the centrifuges spin at very, very high speed, 355 00:22:33,600 --> 00:22:37,440 Speaker 1: and in the process, when they're spinning, they're spinning samples 356 00:22:37,520 --> 00:22:42,240 Speaker 1: of uranium, and this is what helps separate the uranium 357 00:22:42,320 --> 00:22:45,240 Speaker 1: so that you can refine it, and it's an important 358 00:22:45,280 --> 00:22:49,280 Speaker 1: step in that process. So the malware would interrupt this 359 00:22:49,440 --> 00:22:53,600 Speaker 1: chain of command between the computer system responsible for governing 360 00:22:53,640 --> 00:22:56,800 Speaker 1: the centrifuge and the centrifuge itself, and then the malware 361 00:22:56,880 --> 00:23:01,400 Speaker 1: could send instructions for the centrifuges to spin faster than 362 00:23:01,440 --> 00:23:04,119 Speaker 1: they were supposed to. This had a dual effect. For 363 00:23:04,200 --> 00:23:07,240 Speaker 1: one thing, it would cause the centrifuges to wear out 364 00:23:07,320 --> 00:23:10,560 Speaker 1: faster and to fail more frequently. Essentially, it could break 365 00:23:10,600 --> 00:23:14,879 Speaker 1: the centrifuges. For another, it could ruin uranium samples and 366 00:23:14,920 --> 00:23:18,920 Speaker 1: slow down Iran's nuclear program in the process. But there 367 00:23:19,040 --> 00:23:22,000 Speaker 1: was a major obstacle in the way of carrying through 368 00:23:22,080 --> 00:23:27,280 Speaker 1: with this attack, because the target systems, those computers that 369 00:23:27,359 --> 00:23:31,960 Speaker 1: actually sent the messages to centrifuges, they were an air 370 00:23:32,080 --> 00:23:35,680 Speaker 1: gapped system. So an air gapped system is one that 371 00:23:35,760 --> 00:23:40,160 Speaker 1: does not connect to an external network, so it doesn't 372 00:23:40,200 --> 00:23:43,000 Speaker 1: connect to the Internet. It's air gapped.
There is a 373 00:23:43,040 --> 00:23:47,000 Speaker 1: gap between the system and the outside world. This is 374 00:23:47,040 --> 00:23:50,920 Speaker 1: a strategy that a lot of companies and militaries use 375 00:23:51,040 --> 00:23:58,240 Speaker 1: for systems that hold critically important and sensitive information. You 376 00:23:58,320 --> 00:24:00,640 Speaker 1: cannot trust it to be connected to the Internet, 377 00:24:00,680 --> 00:24:03,440 Speaker 1: because then that information might leak out to the world. 378 00:24:03,440 --> 00:24:06,359 Speaker 1: We've seen it happen lots of times. So you create 379 00:24:06,400 --> 00:24:09,600 Speaker 1: an air gapped system, and ideally there's no way for 380 00:24:09,640 --> 00:24:12,879 Speaker 1: the outside world to get into the computer system. So 381 00:24:12,920 --> 00:24:17,800 Speaker 1: how do you compromise an air gapped computer system? You 382 00:24:17,840 --> 00:24:20,639 Speaker 1: couldn't just create a neatly wrapped package in code and 383 00:24:20,680 --> 00:24:24,640 Speaker 1: send it via email or something, because again, those targeted 384 00:24:24,640 --> 00:24:28,119 Speaker 1: computers didn't have that external connection. So what they did 385 00:24:29,240 --> 00:24:32,080 Speaker 1: was the hackers targeted companies that were known to be 386 00:24:32,160 --> 00:24:36,520 Speaker 1: working with Iran on its nuclear program.
So the goal 387 00:24:36,800 --> 00:24:41,720 Speaker 1: was to infect the machines of these collaborators, to target 388 00:24:41,760 --> 00:24:45,720 Speaker 1: these collaborators and try and get those machines infected, in 389 00:24:45,800 --> 00:24:48,480 Speaker 1: the hope that as part of their work with Iran, 390 00:24:49,280 --> 00:24:53,919 Speaker 1: they would unknowingly transfer malware from their own machines to 391 00:24:54,000 --> 00:24:56,959 Speaker 1: something like a flash drive, and then they would use 392 00:24:56,960 --> 00:25:02,159 Speaker 1: that flash drive to update Iran's computers that were in 393 00:25:02,200 --> 00:25:05,760 Speaker 1: control of the centrifuges, and thus the malware could be 394 00:25:05,840 --> 00:25:10,399 Speaker 1: transferred from the flash drive to the target machines. So 395 00:25:10,560 --> 00:25:13,439 Speaker 1: you had this extra step you had to take. But 396 00:25:13,480 --> 00:25:16,800 Speaker 1: here's the thing. It totally worked. For at least a year, 397 00:25:16,880 --> 00:25:20,719 Speaker 1: the attackers were able to disrupt operations in Iran's nuclear program, 398 00:25:20,760 --> 00:25:24,720 Speaker 1: even updating the malware so that subsequent visits from these 399 00:25:24,760 --> 00:25:30,040 Speaker 1: partner companies would help keep things going. Now, eventually, like 400 00:25:30,119 --> 00:25:32,679 Speaker 1: in twenty ten, which was at least two years after 401 00:25:33,520 --> 00:25:39,680 Speaker 1: the machines had been compromised, Iran uncovered the reason that 402 00:25:39,760 --> 00:25:42,919 Speaker 1: they were seeing centrifuges fail more frequently than they were 403 00:25:42,960 --> 00:25:45,240 Speaker 1: supposed to. Like you know, of course stuff wears out, 404 00:25:45,280 --> 00:25:48,840 Speaker 1: particularly stuff that moves a lot, but the centrifuges were 405 00:25:48,840 --> 00:25:51,840 Speaker 1: wearing out way too quickly.
They also noticed that their 406 00:25:51,880 --> 00:25:55,040 Speaker 1: computer systems were crashing a lot. They finally figured out 407 00:25:55,080 --> 00:25:59,679 Speaker 1: that there was this malware to blame. Stuxnet itself, 408 00:25:59,720 --> 00:26:02,639 Speaker 1: because it was designed to spread from system to system 409 00:26:02,680 --> 00:26:06,680 Speaker 1: really effectively, actually infected a ton of machines that had 410 00:26:06,760 --> 00:26:09,760 Speaker 1: nothing to do with Iran's nuclear program. That was kind 411 00:26:09,760 --> 00:26:13,600 Speaker 1: of collateral damage, because again, the goal was to try 412 00:26:13,640 --> 00:26:16,960 Speaker 1: and get these systems that otherwise were very well protected. 413 00:26:17,520 --> 00:26:20,960 Speaker 1: And if you just happened to infect millions of other 414 00:26:21,000 --> 00:26:23,919 Speaker 1: computers around the world, well, that's a price you have 415 00:26:23,960 --> 00:26:28,240 Speaker 1: to be willing to pay. Anyway, Stuxnet initially targeted five 416 00:26:28,720 --> 00:26:34,600 Speaker 1: zero day vulnerabilities as part of its strategy. Now, through 417 00:26:34,600 --> 00:26:38,160 Speaker 1: a security patch, one of those vulnerabilities was eliminated before 418 00:26:38,200 --> 00:26:42,639 Speaker 1: Stuxnet could be deployed, so when the malware was 419 00:26:42,760 --> 00:26:46,120 Speaker 1: ready to go, it was depending upon four zero day 420 00:26:46,200 --> 00:26:48,800 Speaker 1: vulnerabilities, because those four had not yet been uncovered, 421 00:26:49,280 --> 00:26:53,680 Speaker 1: so they still had different vectors to use in order 422 00:26:53,720 --> 00:26:57,800 Speaker 1: to try and inject malware into the targets.
The vulnerabilities 423 00:26:58,000 --> 00:27:02,480 Speaker 1: targeted stuff like the Microsoft Windows operating system and Microsoft networks, 424 00:27:03,320 --> 00:27:07,040 Speaker 1: and specifically it was designed to seek out computers that had 425 00:27:07,040 --> 00:27:12,800 Speaker 1: the Step seven software suite from the company Siemens. 426 00:27:12,840 --> 00:27:15,480 Speaker 1: That's essentially what Stuxnet would do. 427 00:27:15,560 --> 00:27:19,480 Speaker 1: It'd be like, all right, I've infected this machine. Does 428 00:27:19,520 --> 00:27:24,359 Speaker 1: this machine have Step seven installed on it? No? Cool, 429 00:27:24,520 --> 00:27:28,360 Speaker 1: I'm not doing anything else other than infecting other machines 430 00:27:28,359 --> 00:27:31,960 Speaker 1: if I have the chance. If it did detect Step seven, 431 00:27:32,480 --> 00:27:35,680 Speaker 1: which was software that was meant to interoperate with these 432 00:27:36,119 --> 00:27:39,240 Speaker 1: PLCs so that you could work with industrial equipment, it 433 00:27:39,280 --> 00:27:42,919 Speaker 1: would then continue on its mode of attack. Now, as 434 00:27:42,960 --> 00:27:46,359 Speaker 1: you might imagine, like I said, zero day attacks can 435 00:27:46,400 --> 00:27:49,240 Speaker 1: cause a huge amount of trouble. The vulnerability's there, the 436 00:27:49,320 --> 00:27:52,680 Speaker 1: exploit's been developed, and no one, not even cybersecurity companies, 437 00:27:53,200 --> 00:27:56,320 Speaker 1: is prepared to respond to it. If it's carried out well, 438 00:27:56,359 --> 00:27:59,080 Speaker 1: the attackers can achieve their goals, and, like Stuxnet, they 439 00:27:59,080 --> 00:28:03,280 Speaker 1: can continue to operate for years without being spotted, assuming 440 00:28:03,359 --> 00:28:07,680 Speaker 1: the attacks are not causing noticeable issues in the infected systems.
441 00:28:08,520 --> 00:28:11,600 Speaker 1: If it's causing stuff that most people would just chalk 442 00:28:11,720 --> 00:28:15,800 Speaker 1: up to regular technical errors or glitches or whatever, you 443 00:28:15,840 --> 00:28:18,199 Speaker 1: can get away with it for a while. But if 444 00:28:18,240 --> 00:28:21,080 Speaker 1: you're like causing lots of problems, then eventually someone's going 445 00:28:21,119 --> 00:28:24,160 Speaker 1: to say something's wrong with this machine, and that brings 446 00:28:24,200 --> 00:28:28,800 Speaker 1: up the possibility that someone figures out it's been exploited. Now, 447 00:28:28,840 --> 00:28:31,720 Speaker 1: the recent Google Chrome zero day attack potentially affected up 448 00:28:31,760 --> 00:28:34,840 Speaker 1: to three billion people, like I mentioned, according to initial estimates, 449 00:28:35,359 --> 00:28:37,119 Speaker 1: which puts it neck and neck with one of the 450 00:28:37,119 --> 00:28:40,800 Speaker 1: worst zero day attacks we know about. I was about 451 00:28:40,840 --> 00:28:43,080 Speaker 1: to say one of the worst zero day attacks in history. 452 00:28:43,120 --> 00:28:46,880 Speaker 1: But of course, the scary thing is there are probably 453 00:28:47,680 --> 00:28:51,080 Speaker 1: huge zero day attacks going on right now and no 454 00:28:51,120 --> 00:28:54,440 Speaker 1: one has detected them yet, and who knows the scope 455 00:28:54,560 --> 00:28:58,240 Speaker 1: or nature of those attacks. That's the scary thing about that, right? 456 00:28:58,440 --> 00:29:00,640 Speaker 1: Like, there's just no way to know, because no 457 00:29:00,640 --> 00:29:05,480 Speaker 1: one's discovered there was a vulnerability or noticed anything unusual 458 00:29:05,560 --> 00:29:09,680 Speaker 1: going on with their systems.
But anyway, the other really 459 00:29:09,720 --> 00:29:12,440 Speaker 1: really big one that happened at the same scale as 460 00:29:12,480 --> 00:29:16,960 Speaker 1: Google Chrome happened to a little company called Yahoo back 461 00:29:17,000 --> 00:29:20,320 Speaker 1: in twenty thirteen. Now we're going to take another quick break. 462 00:29:20,320 --> 00:29:24,200 Speaker 1: When we come back, I'll talk about this attack on 463 00:29:24,280 --> 00:29:38,920 Speaker 1: Yahoo, because it was another enormous deal. Okay, let's talk 464 00:29:38,960 --> 00:29:42,240 Speaker 1: about this data breach attack on Yahoo that happened in 465 00:29:42,280 --> 00:29:46,920 Speaker 1: twenty thirteen. We didn't even know about it until twenty sixteen. Again, 466 00:29:48,040 --> 00:29:52,680 Speaker 1: the sinister nature of these attacks is that they can 467 00:29:52,800 --> 00:29:56,960 Speaker 1: have happened and even continue to happen without us being 468 00:29:57,000 --> 00:30:00,840 Speaker 1: aware of it for ages, and only in retrospect are we 469 00:30:00,840 --> 00:30:03,320 Speaker 1: able to look back and say, wow, that was an 470 00:30:03,520 --> 00:30:08,040 Speaker 1: enormous attack. So, first off, Yahoo had already been the 471 00:30:08,040 --> 00:30:12,160 Speaker 1: target of zero day attacks before twenty thirteen. In fact, 472 00:30:12,200 --> 00:30:14,840 Speaker 1: back in two thousand and seven, which was before the 473 00:30:14,880 --> 00:30:17,240 Speaker 1: world knew that Stuxnet was a thing. I mean, 474 00:30:17,280 --> 00:30:20,160 Speaker 1: you know, hackers had developed it and everything, but the 475 00:30:20,240 --> 00:30:23,200 Speaker 1: world was not aware of Stuxnet. There were 476 00:30:23,320 --> 00:30:26,400 Speaker 1: zero day attacks aimed at Yahoo, 477 00:30:26,440 --> 00:30:30,680 Speaker 1: specifically Yahoo Messenger, which was the company's instant messaging service.
478 00:30:31,160 --> 00:30:35,640 Speaker 1: So reportedly, this malicious attack could initiate a remote code 479 00:30:35,680 --> 00:30:40,360 Speaker 1: execution on a target without them even doing anything, assuming 480 00:30:40,360 --> 00:30:44,760 Speaker 1: that they had their browser security settings set fairly low, right, 481 00:30:45,680 --> 00:30:50,640 Speaker 1: specifically in Internet Explorer. RIP. But yeah, Internet Explorer, you 482 00:30:50,720 --> 00:30:54,840 Speaker 1: might remember, had different kinds of levels of security you 483 00:30:54,840 --> 00:30:57,800 Speaker 1: could set. So at the highest, it would really limit 484 00:30:57,840 --> 00:31:00,480 Speaker 1: the types of websites you could go to. It really 485 00:31:00,600 --> 00:31:03,360 Speaker 1: restricted your freedom quite a bit, but it also protected 486 00:31:03,400 --> 00:31:07,320 Speaker 1: you against the vast majority of potential attacks, or at 487 00:31:07,400 --> 00:31:10,280 Speaker 1: least that was the intent. For people who felt like 488 00:31:10,320 --> 00:31:14,960 Speaker 1: they were more capable of determining their own safety, you 489 00:31:14,960 --> 00:31:17,360 Speaker 1: could set that much lower, and you would be able 490 00:31:17,400 --> 00:31:20,880 Speaker 1: to go to more websites and use more services, but 491 00:31:20,960 --> 00:31:24,200 Speaker 1: you also would incur greater risk. So depending upon what 492 00:31:24,360 --> 00:31:27,520 Speaker 1: level you had your Internet security set at for Internet Explorer, 493 00:31:28,200 --> 00:31:31,200 Speaker 1: you could potentially be a target of this zero day 494 00:31:31,280 --> 00:31:35,440 Speaker 1: vulnerability that was leveraging Yahoo Messenger. But that was just 495 00:31:35,680 --> 00:31:39,080 Speaker 1: one. The twenty thirteen one would be much, much worse.
496 00:31:39,480 --> 00:31:42,440 Speaker 1: So again, it wasn't until twenty sixteen that we really 497 00:31:43,040 --> 00:31:45,800 Speaker 1: heard about this. Yahoo revealed that hackers had managed to 498 00:31:45,840 --> 00:31:50,280 Speaker 1: access and steal Yahoo user information, lots of private information. 499 00:31:51,240 --> 00:31:54,520 Speaker 1: The initial guess was that it affected around a billion 500 00:31:54,800 --> 00:31:59,560 Speaker 1: Yahoo users, but subsequently Yahoo, now under the ownership of Verizon, 501 00:32:00,480 --> 00:32:06,160 Speaker 1: revealed that potentially all three billion users had been hit 502 00:32:06,360 --> 00:32:10,120 Speaker 1: by this attack. This was in addition to a separate 503 00:32:10,160 --> 00:32:14,920 Speaker 1: attack that had happened in twenty fourteen, and Yahoo had 504 00:32:15,760 --> 00:32:19,760 Speaker 1: detected that one and already talked about it. So there 505 00:32:19,800 --> 00:32:22,680 Speaker 1: was a big attack in twenty thirteen, then a second 506 00:32:22,720 --> 00:32:25,920 Speaker 1: attack in twenty fourteen, probably not connected to the first attack. 507 00:32:26,640 --> 00:32:31,680 Speaker 1: Yahoo saw evidence of the second attack, the twenty fourteen attack, 508 00:32:32,360 --> 00:32:34,640 Speaker 1: but still didn't know about the twenty thirteen one. The 509 00:32:34,680 --> 00:32:38,520 Speaker 1: twenty fourteen one, though, had already hit half a billion accounts, 510 00:32:38,840 --> 00:32:42,120 Speaker 1: right like five hundred million people hit by that one. 511 00:32:43,440 --> 00:32:46,080 Speaker 1: And it really just points out how vulnerable Yahoo was 512 00:32:46,160 --> 00:32:50,320 Speaker 1: to have these two massive attacks both succeed against it, 513 00:32:50,480 --> 00:32:54,480 Speaker 1: one of which remained undetected even after Yahoo had found 514 00:32:54,480 --> 00:32:58,520 Speaker 1: evidence of a second attack. 
Subsequent investigations pointed to a 515 00:32:58,600 --> 00:33:02,880 Speaker 1: possible connection to Russian hackers, so it was likely a 516 00:33:02,920 --> 00:33:06,280 Speaker 1: state sponsored attack, which could mean that the primary purpose 517 00:33:06,320 --> 00:33:09,240 Speaker 1: of the attack was to gather information about specific targets. 518 00:33:10,760 --> 00:33:12,480 Speaker 1: That being said, even if you're not a person of 519 00:33:12,520 --> 00:33:15,480 Speaker 1: note in the eyes of Russian intelligence, the hackers also 520 00:33:15,720 --> 00:33:18,920 Speaker 1: started to sell user data on the dark web, because, 521 00:33:18,920 --> 00:33:20,960 Speaker 1: I mean, why not? You've already got it, why not 522 00:33:21,040 --> 00:33:24,200 Speaker 1: make some money off of it? Sure, your main reason 523 00:33:24,320 --> 00:33:26,680 Speaker 1: for your attacks was to get information about, you know, 524 00:33:26,760 --> 00:33:29,240 Speaker 1: person A, person B, and person C. But you have, 525 00:33:30,160 --> 00:33:32,800 Speaker 1: you know, a billion other people, or in the case 526 00:33:32,800 --> 00:33:36,120 Speaker 1: of Yahoo, three billion other people. Why not sell their 527 00:33:36,120 --> 00:33:40,400 Speaker 1: information too and make some extra money? So, yeah, anyone 528 00:33:40,400 --> 00:33:43,600 Speaker 1: who had a Yahoo account by, say, mid twenty thirteen 529 00:33:44,320 --> 00:33:49,320 Speaker 1: was pretty much hit by this attack, because it got everything, 530 00:33:49,520 --> 00:33:52,840 Speaker 1: which is a big ol' yowza. Now, sometimes it can 531 00:33:52,880 --> 00:33:56,040 Speaker 1: actually be difficult to tell the difference between an attack 532 00:33:56,120 --> 00:34:00,440 Speaker 1: that uses a zero day vulnerability versus something that is 533 00:34:00,480 --> 00:34:04,520 Speaker 1: able to achieve really big results but through entirely different means.
So, 534 00:34:04,640 --> 00:34:09,240 Speaker 1: for example, in twenty twenty one, hackers began to offer 535 00:34:09,880 --> 00:34:14,680 Speaker 1: LinkedIn data on the black market, so data about LinkedIn users. 536 00:34:15,120 --> 00:34:18,600 Speaker 1: The word was that anywhere from five hundred million to 537 00:34:18,640 --> 00:34:21,760 Speaker 1: seven hundred million accounts had been part of this attack, 538 00:34:22,160 --> 00:34:26,719 Speaker 1: like anywhere between ninety to ninety five percent of LinkedIn's 539 00:34:26,840 --> 00:34:31,560 Speaker 1: user base, and there were differing explanations for how this 540 00:34:31,680 --> 00:34:37,200 Speaker 1: all went down. So one of the possible explanations was 541 00:34:37,280 --> 00:34:42,160 Speaker 1: that LinkedIn had an API, that's an application programming interface, 542 00:34:43,200 --> 00:34:48,120 Speaker 1: and that this API had a vulnerability in it, and 543 00:34:48,160 --> 00:34:52,200 Speaker 1: that this vulnerability would allow a hacker to create a 544 00:34:52,239 --> 00:34:56,160 Speaker 1: tool to access information on the back end of LinkedIn systems. 545 00:34:56,760 --> 00:35:00,279 Speaker 1: So there were at least some guesses that that was 546 00:35:00,440 --> 00:35:03,759 Speaker 1: to blame, but LinkedIn said no, no, no, there was 547 00:35:03,800 --> 00:35:08,400 Speaker 1: a vulnerability in our API, but we subsequently patched that out. 548 00:35:09,200 --> 00:35:13,000 Speaker 1: And while there had been an early attack using that vulnerability, 549 00:35:13,040 --> 00:35:16,920 Speaker 1: it was very small in nature. This larger one was 550 00:35:16,960 --> 00:35:20,600 Speaker 1: not an attack on LinkedIn's back end systems, according to LinkedIn, 551 00:35:21,000 --> 00:35:26,320 Speaker 1: but instead made use of data scrapers.
So a data 552 00:35:26,360 --> 00:35:29,560 Speaker 1: scraper is just what it sounds like: it's a program that 553 00:35:29,800 --> 00:35:34,480 Speaker 1: scrapes information off of a platform. So you could achieve 554 00:35:34,560 --> 00:35:39,640 Speaker 1: the same thing by having people go to LinkedIn and 555 00:35:39,680 --> 00:35:42,760 Speaker 1: write down the personal information they can find about each user, 556 00:35:43,560 --> 00:35:45,520 Speaker 1: and then go to the next user and then write 557 00:35:45,560 --> 00:35:47,920 Speaker 1: it all down. It would be the same thing. So 558 00:35:48,120 --> 00:35:50,720 Speaker 1: you're not getting anything secret, because you're literally just going 559 00:35:50,960 --> 00:35:53,840 Speaker 1: entry to entry and writing down all the information you have. 560 00:35:55,000 --> 00:36:00,240 Speaker 1: Maybe you corroborate this with data from other websites too, 561 00:36:00,400 --> 00:36:03,240 Speaker 1: in order to build out a bigger dossier on each person. 562 00:36:03,880 --> 00:36:07,439 Speaker 1: But it's not like you penetrated the back end system, right? 563 00:36:07,520 --> 00:36:11,080 Speaker 1: You didn't get to see the actual database that LinkedIn 564 00:36:11,239 --> 00:36:14,760 Speaker 1: has where it has all the information about each user. 565 00:36:15,120 --> 00:36:20,040 Speaker 1: You're just grabbing stuff that's already publicly viewable on the website. 566 00:36:20,480 --> 00:36:22,840 Speaker 1: That's what LinkedIn was saying was happening. Whether or not 567 00:36:22,920 --> 00:36:25,880 Speaker 1: that's exactly what happened, I don't know.
I don't have 568 00:36:25,880 --> 00:36:28,719 Speaker 1: any reason to doubt LinkedIn necessarily, because from what I 569 00:36:28,760 --> 00:36:32,120 Speaker 1: can understand, the information that was being sold didn't contain 570 00:36:32,320 --> 00:36:35,000 Speaker 1: a lot of stuff you would expect to find if, 571 00:36:35,000 --> 00:36:37,920 Speaker 1: in fact, it were all the back end stuff. It 572 00:36:37,960 --> 00:36:40,440 Speaker 1: was all things that you would expect to find if 573 00:36:40,480 --> 00:36:43,040 Speaker 1: you were to just visit someone's profile page. So it's 574 00:36:43,200 --> 00:36:47,520 Speaker 1: possible that that explanation is in fact the accurate one. Now, 575 00:36:47,560 --> 00:36:49,920 Speaker 1: if you do a search about the most recent Google 576 00:36:50,000 --> 00:36:54,120 Speaker 1: Chrome zero day vulnerability, you are likely going to see 577 00:36:54,120 --> 00:36:59,439 Speaker 1: that it's listed as vulnerability CVE-2023-2033. 578 00:36:59,600 --> 00:37:05,160 Speaker 1: All right, so Google Chrome has 579 00:37:05,239 --> 00:37:07,560 Speaker 1: had other zero day vulnerabilities. In fact, if you do 580 00:37:07,600 --> 00:37:10,960 Speaker 1: a search and you see a different CVE, you know, 581 00:37:11,000 --> 00:37:13,560 Speaker 1: one with different numbers following it, that's one of the 582 00:37:13,600 --> 00:37:16,120 Speaker 1: other zero day vulnerabilities Google Chrome has had to deal 583 00:37:16,160 --> 00:37:19,040 Speaker 1: with in the past, so this is not a new thing. 584 00:37:19,800 --> 00:37:27,319 Speaker 1: The letters CVE stand for Common Vulnerabilities and Exposures. This 585 00:37:27,360 --> 00:37:31,160 Speaker 1: is a standard maintained by the MITRE Corporation, so it's like a 586 00:37:31,200 --> 00:37:35,560 Speaker 1: common naming scheme used by the computer security community.
So 587 00:37:35,680 --> 00:37:39,799 Speaker 1: CVE has that designation, and the numbers give you more 588 00:37:39,800 --> 00:37:46,440 Speaker 1: information about the specific instance of this vulnerability. This particular 589 00:37:46,560 --> 00:37:51,160 Speaker 1: vulnerability is taking advantage of something called type confusion. Now, 590 00:37:51,200 --> 00:37:55,360 Speaker 1: to get into type confusion in detail would go beyond 591 00:37:55,440 --> 00:37:59,520 Speaker 1: my meager knowledge and understanding of coding. So I'm not 592 00:37:59,520 --> 00:38:02,080 Speaker 1: going to dive too deeply into this, because more likely 593 00:38:02,120 --> 00:38:03,920 Speaker 1: than not, I would just say something that was wrong, 594 00:38:04,560 --> 00:38:07,480 Speaker 1: and rather than try to get it right and get 595 00:38:07,480 --> 00:38:09,759 Speaker 1: it wrong, I'm going to give you a very high 596 00:38:09,880 --> 00:38:13,720 Speaker 1: level look at what type confusion is. So the MITRE 597 00:38:14,000 --> 00:38:19,120 Speaker 1: Corporation says that type confusion happens when, quote, the program 598 00:38:19,280 --> 00:38:24,000 Speaker 1: allocates or initializes a resource such as a pointer, object, 599 00:38:24,160 --> 00:38:28,640 Speaker 1: or variable using one type, but it later accesses that 600 00:38:28,800 --> 00:38:33,560 Speaker 1: resource using a type that is incompatible with the original type, 601 00:38:33,920 --> 00:38:37,239 Speaker 1: end quote. That clears it up, right? So type in 602 00:38:37,280 --> 00:38:39,920 Speaker 1: this case references a set of values as well as 603 00:38:39,920 --> 00:38:44,719 Speaker 1: a set of operations allowed to be performed on those values.
604 00:38:45,960 --> 00:38:47,960 Speaker 1: That's about as deep as I can get into that 605 00:38:48,160 --> 00:38:52,239 Speaker 1: without running into the danger of hopelessly confusing myself and 606 00:38:52,400 --> 00:38:57,360 Speaker 1: probably saying the wrong thing. But certain coding languages lack 607 00:38:57,719 --> 00:39:03,440 Speaker 1: memory protection capabilities. Like, the C programming language doesn't have that 608 00:39:03,600 --> 00:39:06,280 Speaker 1: memory protection capability built into it, and so a hacker 609 00:39:06,840 --> 00:39:11,120 Speaker 1: can try to purposefully kind of confuse a program and 610 00:39:11,160 --> 00:39:14,040 Speaker 1: gain out of bounds memory access, which can lead to 611 00:39:14,040 --> 00:39:17,080 Speaker 1: all sorts of bad outcomes. Now, to manage this with 612 00:39:17,160 --> 00:39:19,719 Speaker 1: Google Chrome, because that's the program we're looking at right here, 613 00:39:19,800 --> 00:39:23,480 Speaker 1: right, as a web browser, the way you would take 614 00:39:23,520 --> 00:39:26,680 Speaker 1: advantage of this vulnerability is a hacker would typically create 615 00:39:26,719 --> 00:39:31,880 Speaker 1: a website, an HTML document, and within the document, the 616 00:39:31,920 --> 00:39:35,960 Speaker 1: hacker would embed this attack so that when someone who 617 00:39:36,040 --> 00:39:40,280 Speaker 1: is using an unpatched version of Google Chrome visits that page, 618 00:39:41,040 --> 00:39:44,520 Speaker 1: the attack initiates.
Now, what the attack does is dependent 619 00:39:44,600 --> 00:39:47,000 Speaker 1: upon the nature of the malware itself, so it could 620 00:39:47,040 --> 00:39:48,760 Speaker 1: be used to do all sorts of things, like steal 621 00:39:48,800 --> 00:39:52,239 Speaker 1: information or inject a different kind of malware into a 622 00:39:52,280 --> 00:39:55,880 Speaker 1: target computer, all sorts of different stuff, so you can 623 00:39:55,920 --> 00:39:59,440 Speaker 1: see why experts recommend users update Google Chrome to patch 624 00:39:59,440 --> 00:40:04,120 Speaker 1: out that vulnerability. Apparently at least one such attack was 625 00:40:04,200 --> 00:40:06,520 Speaker 1: found out in the wild, so this isn't just a 626 00:40:06,640 --> 00:40:10,320 Speaker 1: zero day vulnerability. There was evidence found of zero day attacks, 627 00:40:10,880 --> 00:40:13,680 Speaker 1: so this is something that's happening right now. So again, 628 00:40:13,760 --> 00:40:16,839 Speaker 1: if you use Google Chrome, make sure you update it 629 00:40:16,880 --> 00:40:19,960 Speaker 1: to the most recent version. It is not difficult to do. 630 00:40:20,400 --> 00:40:22,680 Speaker 1: It might require you to reboot your computer, but that's 631 00:40:22,760 --> 00:40:27,239 Speaker 1: the biggest hassle involved with it, and it could potentially 632 00:40:27,280 --> 00:40:30,720 Speaker 1: prevent you from being part of a massive hacker attack. 633 00:40:31,360 --> 00:40:34,719 Speaker 1: So go ahead and do that, because the hackers have 634 00:40:34,760 --> 00:40:37,160 Speaker 1: been aware of this for a while now; we were 635 00:40:37,239 --> 00:40:40,480 Speaker 1: just made aware of it over this past weekend. All right, 636 00:40:40,520 --> 00:40:43,520 Speaker 1: that's it for this episode. I hope you are all well. 637 00:40:43,920 --> 00:40:45,319 Speaker 1: If you would like to reach out to me, you 638 00:40:45,360 --> 00:40:47,680 Speaker 1: can do so on Twitter.
The handle for the show 639 00:40:47,719 --> 00:40:51,799 Speaker 1: is tech Stuff HSW, or you could download the iHeartRadio app. 640 00:40:51,800 --> 00:40:54,880 Speaker 1: It's free to download, free to use. You can just 641 00:40:54,960 --> 00:40:57,240 Speaker 1: go into the little search field, type in tech stuff, 642 00:40:57,320 --> 00:41:00,520 Speaker 1: and it'll take you to the podcast page results. Go into 643 00:41:00,560 --> 00:41:02,840 Speaker 1: the podcast. You'll see a little microphone icon. If you 644 00:41:02,840 --> 00:41:04,680 Speaker 1: click on that, you can leave a voice message up 645 00:41:04,719 --> 00:41:07,000 Speaker 1: to thirty seconds in length. Let me know what you'd 646 00:41:07,040 --> 00:41:10,680 Speaker 1: like to hear, and I'll talk to you again really soon. 647 00:41:17,000 --> 00:41:21,640 Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, 648 00:41:21,960 --> 00:41:25,680 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you listen 649 00:41:25,719 --> 00:41:30,640 Speaker 1: to your favorite shows.