Jonathan: Welcome to TechStuff, a production from iHeartRadio.

Jonathan: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and a love of all things tech, and today I've got something special for you guys. I'm going to be talking with Shannon Morse, my good friend, hacker extraordinaire, incredible tech communicator, and she and I are going to break down the SolarWinds hack, a hack that was dominating the news from late December into January. It will likely be a part of the news cycle in the tech space for months and possibly years to come, as it was a particularly effective and potentially devastating attack, one that will take quite a long time to repair. And I wanted to bring Shannon onto the show because, while I can do a lot of research into this stuff, I come at this the same as anyone else would, really, anyone who's not in the infosec space. I would look at it as an outsider trying to learn as best I can.
Jonathan: But Shannon has been working in the hacker sphere for many years and has a particularly strong point of view when it comes to such things, and is able to see things that I just don't. So I was very glad that she took the time out of her schedule to jump on this episode. And so now here is my conversation with Shannon Morse about the SolarWinds hack. I hope you enjoy it.

Jonathan: Shannon, welcome back to TechStuff. It has been too long. Thank you for joining me.

Shannon: Thank you for having me. Jonathan, how are you?

Jonathan: I'm well. It's always a pleasure to have you on, even when we have to talk about terrifying existential threats. But this one is a fun one.

Shannon: This one is interesting, fun for us to talk about.

Jonathan: Yeah, well, because it is different from a lot of malware threats and hacker threats that we typically hear about. So, Shannon, you're the expert; you let me know if I'm way off base. I'm going to give kind of my take on what the typical hacker attack tends to be, the way we tend to see it, at least the ones that we hear about.
Jonathan: If it's not something like someone taking advantage of a security vulnerability in a system, or using social engineering to get access to someone's system, what we usually hear about are malware attacks, where there's an email attachment, or someone has uploaded an infected file through some sort of distribution point (it might be a peer-to-peer network, it might be a database), or it might be that you go to some website that you've been directed to and you click on something that then installs malware on your system. And in this sort of attack, you've got hackers that are kind of taking a shotgun approach, right? They don't know who's going to end up getting this malware. It's more like, let's try and spread it as far and wide as we possibly can, a pretty brute-force kind of tactic. Is that more or less accurate for the general types of stuff we hear about?

Shannon: Yeah, pretty much. I mean, usually you hear about the very consumer-oriented hacks.
Shannon: You know, an app gets installed from Google Play, and it turns out it has hundreds of thousands of downloads, and everybody all of a sudden has malware and they have to get rid of it, blah blah blah. So you see a lot of targeted assaults happening toward consumers. But in this case, with a supply chain attack, as it's called, you see an attack that's very targeted toward a specific brand or a vendor that happens to work with a whole bunch of people. The attackers don't necessarily know the whole bunch of business clients this vendor works with. They don't know who's actually going to install it in order for them to be able to attack all these different brands. They just know: this vendor works with thousands upon thousands of really important businesses, so let's just attack this one brand and then see what happens.

Jonathan: Yeah, and in this case, the SolarWinds hack, I'm sure the average person had never heard about SolarWinds before the news broke about the actual hack, because this is a business-to-business sort of enterprise.
Jonathan: They create software packages for businesses, typically really big businesses or really big organizations, to use to do things like monitor their network systems. So it's not the kind of thing that the average person would ever have to come in contact with, unless you happen to be, like, the IT person at a big company or a government agency.

Shannon: Exactly. So I'll give an example from when I used to work at a bank, forward-facing. When I was working at that bank, you know, I was talking to customers all the time, and I had my own little register where I had the money and everything, and I had my own computer. But that computer was running Windows, and it was running software on Windows. Behind the scenes, for that entire branch, and for all the different branches in all the different cities for this company that I worked at, they had servers that were connected to all the different physical locations for this bank, and on those servers is where you would see these kinds of platforms being used, these kinds of operating systems.
Shannon: So if you're just working at a very consumer-facing or office-oriented job, then you don't necessarily run into this, even if you're an employee. A lot of times it's just happening on the back end, for the network administrators, the IT security and information security people. Those are the kinds of people that would be using this kind of networking product.

Jonathan: Yeah. So if you're a company that does products like software as a service, where you need to keep a really close eye on things like network loads, because you might have to react nimbly and quickly to changing demands on your system, SolarWinds makes the kind of software that allows you to have that top-level look at what's going on with your networks. So again, not something that most of us would run into, but it is really important software.
Jonathan: And that's why nearly every company that's on the Fortune 500 list is a client of SolarWinds, and several high-level government agencies, particularly in the United States, like the Department of Justice, the Department of Homeland Security, the Department of the Treasury, the Department of Energy. Big, national-security-level organizations are all clients of SolarWinds. And in particular, they have a product called Orion, and this is specifically to monitor stuff like network traffic and network assets and where you might need to make adjustments on the fly. And that ends up being the bullseye of the target for the hackers who created the SolarWinds hack, which is also sometimes called Sunburst, after the particular malware that was used. And this is where we get into that supply chain attack. I think an easy way for people to understand it is that, unfortunately, it's an attack that takes advantage of something we typically tell people to do, which is: when a patch comes out for your software, you install it.
Jonathan: Because typically patches do things like address previous vulnerabilities in software, and they close down an avenue of attack for hackers. But if a hacker were able to target that actual software, whatever it might be, like, if they were able to target Windows and insert the malicious code into the Windows code, then when the patches go out, the malicious code hitchhikes along. And then when you install your patch, as you do as a good user, you have just installed the malware. That is the supply chain attack, and it's devastating.

Shannon: Yeah, it's very, very scary, because it kind of focuses on the inherent trust that a lot of clients have in the vendors they use for this distributed software, which they might use on their back end, for their network, or whatever it might be.
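The mechanism described here, where one compromised vendor build reaches every dutifully updating client, can be sketched in a few lines of toy Python. Everything below is invented for illustration (the class names, the version strings, the "implant" marker); it only models the shape of the attack, not any real SolarWinds code.

```python
# Toy model of a supply-chain attack: clients trust whatever the vendor
# publishes, so compromising the vendor's build output infects them all.

class Vendor:
    def __init__(self):
        self.latest_update = "monitoring agent v1.0"

    def publish(self, update):
        # Whatever the build pipeline emits becomes "the official release".
        self.latest_update = update


class Client:
    def __init__(self, name):
        self.name = name
        self.installed = None

    def auto_update(self, vendor):
        # Clients do the "right thing": always install the newest release.
        self.installed = vendor.latest_update


vendor = Vendor()
clients = [Client(f"org-{i}") for i in range(18_000)]

# The attacker compromises the vendor's build, not any individual client.
vendor.publish("monitoring agent v1.1 + implant")

for c in clients:
    c.auto_update(vendor)

infected = sum("implant" in c.installed for c in clients)
print(infected)  # 18000 -- every client that updated now carries the implant
```

The point of the sketch is the asymmetry: one intrusion at the vendor, thousands of downstream infections, and the clients who patched promptly are exactly the ones exposed.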
Shannon: So by having that inherent trust, you are trusting as a business that when you do these auto-updates, when you physically go in and, you know, update your firmware or whatever it might be, you are going to be protecting yourself, because you're on top of it; you're downloading that stuff every single time a new version comes out. But in this case, because the attackers were targeting the vendor itself and not the specific clients, they were distributing that malware to thousands upon thousands of potential customers, and it's the ones that were updating like they should be that ended up being kind of caught in the crosshairs.

Jonathan: Yeah, this is one of those cases where you say, "I did everything right and you still screwed me." So, Orion is a platform that's very popular. Around 33,000 of SolarWinds's clients have some version of Orion running on their systems.
Jonathan: Out of that 33,000, SolarWinds said approximately 18,000 had the versions that were specifically affected, where the malicious code had been inserted and those patches had been pushed out to the clients and they had actually installed them. Out of those 18,000, however, we later learned that a very, very small number were followed up on, because, as it turns out, that Sunburst attack was just stage one. It was not the end-all. It wasn't like, "Oh, we snuck some malicious code into a legitimate product; high fives all around." That was just the beginning.

Shannon: Yeah. So in this case, the attackers were like, let's just get it out there and see who gets caught in the crosshairs. And then they started following up, and they were like, okay, well, who matters the most to us? Which ones might be financially motivated for us to hack? Who might be the ones that have the biggest and best data sets that we could potentially pilfer and sell to a third party?
Shannon: We don't necessarily know what their end goal is, but a lot of times with hacks like this, especially if they are distributed toward Fortune 500s and government and sectors like that, they are state sponsored, or they are very, very financially motivated. So that would be my general hypothesis as far as what their motivations were behind it and why they specifically targeted, you know, the government sector, the very few that they actually did out of the 18,000.

Jonathan: Yeah, I think the last report I read said that it looked like it was around 40 systems out of 18,000. That's around 0.2 percent of all the different systems that they hit that they followed up on, and it does say that there was a very concentrated, focused effort to look at very specific systems. Most of the ones that they targeted were out of big tech, and then government agencies, and then some non-government offices outside of that, like think tanks and things like that.
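The back-of-the-envelope math in that figure is quick to check: 40 follow-ups out of roughly 18,000 tainted installs works out to about two tenths of one percent. A one-liner confirms it (the counts are the ones quoted above; the exact follow-up number was still an estimate at the time).

```python
# Roughly 18,000 installs received the tainted Orion update, but reports
# suggested only about 40 were followed up on with second-stage activity.
tainted_installs = 18_000
followed_up = 40

fraction = followed_up / tainted_installs
print(f"{fraction:.2%}")  # 0.22% -- i.e. around 0.2 percent, as stated
```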
Jonathan: I've seen speculation that, as you say, it was very likely a state-backed attack, and that the evidence seems to point, but does not necessarily indicate proof positive, that Russia was behind the attack. At least, that appears to be what all the signs point to, but then there's also always the possibility of what is called a false flag operation.

Shannon: Exactly. So it's very interesting when people start kind of laying blame on specific groups of attackers or groups of hackers and saying, hey, because the code looks this way, we think that it's, you know, backed by Russia, or whoever it might be, backed by China or North Korea. Those are usually the ones that we see in the news. In this case, they found samples of code that could be very, very closely linked to a previous attacker group from Russia. So they made that tie, and they were like, hey, we think that this is the same group. But there is always the potential that somebody could have copied previous malware and used samples of that for the new code for SolarWinds, for Sunburst.
Shannon: So it's entirely possible that it's not the same group, but it's plausible, right?

Jonathan: So again, you can't draw any firm conclusions. But when you start thinking about this as a potential state-backed attack that largely gives hackers high-level access to systems once they deliver that second payload of malware, which specifically allows them to move laterally across networks (not just hit a specific server, but then kind of infiltrate across an entire system), a lot of the reports we've seen have shown that the hackers were at least able to read material, to see what material was around. They could look at source code at Microsoft, for example, or they could look at emails that had been both sent and received through a particular system. A lot of this kind of leads you down the path to thinking one potential purpose for this attack could be espionage: that it literally is another part of cyber espionage, where you're spying on an enemy or adversary. And that fits the narrative really well.
Jonathan: Again, we can't draw that conclusion conclusively, to be redundant, but we can at least say that it is a potential answer to why this happened.

Shannon: Yeah. So I like to lay out a lot of caveats, because it's very dangerous to speak in absolutes when you come to something like this, because it may turn out... yes, this is ongoing, so we still have a lot of questions. But I am glad that companies like Microsoft, with Office 365 and the fact that the attackers were able to see source code, are coming forward. I'm glad these clients that were attacked and were targeted are coming forward, because it's giving us a clearer perspective of what was actually targeted in this assault. And in Microsoft's case it was, or at least they believe it was, the source code, because the attackers did get access to that information. Now, were they also collecting the source code? Were they taking it from Microsoft and collecting it into their own data set? Maybe. Probably. I mean, they did have access to it, so it's entirely plausible as well.
Shannon: But again, it's that plausibility, all these questions that we currently have with an active attack where there are still discoveries happening.

Jonathan: This is Jonathan, outside the interview here. I'm just interrupting so that we can take a quick break, but we'll be right back.

Jonathan: So we know that the nature of the attack allowed for a lot of access to things from a certain level, but in most cases that we've heard about, the companies are saying no one was able to actually make any changes to anything. They might have seen it, they might have copied it, but they could not modify anything. However, part of what I would think would be useful, if you're looking at source code for products like Office 365, which has incredible distribution to millions of systems around the world (consumer level, enterprise level, everything in between), is that now that you have that source code, you can start looking at ways to exploit it.
Jonathan: You essentially have a playground, a sandbox that you can work in with the actual source code of the product, at least from that particular era, until Microsoft makes changes to it. And then you have a way of practicing on that to try and develop malware that could potentially be used out in other distributions, using perhaps totally different attack vectors. Is that something that could actually be possible, or am I addled by Hollywood?

Shannon: That's entirely possible. And that's one of the reasons why we have seen supply chain attacks targeting very specific firmware versions, or the back ends of these really large clients like Microsoft, in order to be able to steal source code and stuff like that. Because oftentimes, even though new versions might come out of an operating system or of software or firmware, they will reuse previous generations of that firmware in order to maintain consistency across all of the different platforms that their product might be installed onto.
286 00:17:37,119 --> 00:17:40,560 Speaker 1: So there might be a few changes for future versions 287 00:17:40,640 --> 00:17:44,280 Speaker 1: or future releases, but the source code might remain pretty 288 00:17:44,320 --> 00:17:48,879 Speaker 1: similar to previous installations, and it's so much work to 289 00:17:49,080 --> 00:17:53,040 Speaker 1: change things on a fundamental level that it's impractical. Right, 290 00:17:53,160 --> 00:17:59,080 Speaker 1: There's there's almost no possibility, especially for programs that typically 291 00:18:00,040 --> 00:18:03,040 Speaker 1: they typically grow larger. I don't know if you've noticed this, Shannon, 292 00:18:03,119 --> 00:18:07,800 Speaker 1: but I have, even from like a programming perspective, which 293 00:18:07,840 --> 00:18:09,920 Speaker 1: I am not a programmer. But I have done some 294 00:18:10,000 --> 00:18:13,240 Speaker 1: coding in the past, and I know that there is 295 00:18:13,359 --> 00:18:17,880 Speaker 1: a lot of turnover at companies, and oftentimes they will 296 00:18:18,359 --> 00:18:21,600 Speaker 1: forcibly not change a lot of the code in order 297 00:18:21,920 --> 00:18:24,840 Speaker 1: to make sure that it still works with new employees 298 00:18:24,960 --> 00:18:27,199 Speaker 1: if there is like a new codeer that comes in 299 00:18:27,320 --> 00:18:30,480 Speaker 1: or a new programmer. Uh and sometimes you won't find 300 00:18:30,560 --> 00:18:34,840 Speaker 1: notes in the in the code for future programmers, so 301 00:18:35,000 --> 00:18:38,400 Speaker 1: they just choose not to break anything by not changing anything, 302 00:18:38,520 --> 00:18:42,560 Speaker 1: so code will remain the same for years and years 303 00:18:42,560 --> 00:18:47,800 Speaker 1: and years before somebody actually goes in and bravely changes anything. 
Jonathan: Right. So if you are someone who's creating some malware, and you want to target users of a specific type of software, whatever it may be, whether it's an operating system or something entirely different, then being able to make a change to a fundamental part of that code, one that is not likely to have been altered because it's sort of a pillar of the software, is a pretty decent bet that your malware, if you're able to inject it into the actual real software on the vendor side, will then get rolled out through various patches and updates, or even just new installations of that product as people come on board. And the longer you can keep that on the down-low, the more systems you can infect without anyone being the wiser. As it turns out, with the SolarWinds hack, we now know that the attacks started no later than October 2019. It may have been even earlier.

Shannon: Insane.

Jonathan: Yeah. So that was a full year plus a couple of months before we were made aware of it.
And it was another security 324 00:20:04,280 --> 00:20:08,760 Speaker 1: firm called FireEye that noticed something hinky was going on. 325 00:20:09,440 --> 00:20:13,399 Speaker 1: Something hinky. Yeah, but it's kind of, but it was hanky, 326 00:20:15,240 --> 00:20:17,320 Speaker 1: it's true. They were like, hey, what's this, 327 00:20:17,520 --> 00:20:20,600 Speaker 1: why's our network being weird? I call it janky. But 328 00:20:21,640 --> 00:20:24,600 Speaker 1: they were just like, something odd is going on, like we're 329 00:20:24,600 --> 00:20:27,159 Speaker 1: getting some red flags. And we didn't know at the 330 00:20:27,200 --> 00:20:29,600 Speaker 1: time that it was Sunburst. We didn't know that 331 00:20:29,640 --> 00:20:32,040 Speaker 1: it was a SolarWinds hack or where it was 332 00:20:32,080 --> 00:20:35,320 Speaker 1: being distributed from. So FireEye was just like, we 333 00:20:35,359 --> 00:20:37,400 Speaker 1: think we got hacked, and then a few days later 334 00:20:37,480 --> 00:20:40,480 Speaker 1: everybody was like, oh, actually this is connected to a 335 00:20:40,600 --> 00:20:45,000 Speaker 1: much bigger thing. It wasn't them, it was the vendor 336 00:20:45,119 --> 00:20:47,640 Speaker 1: that they were using. So all of a sudden, everybody 337 00:20:47,680 --> 00:20:50,840 Speaker 1: was just like, oh, we should probably check our systems too. 338 00:20:50,960 --> 00:20:54,600 Speaker 1: And then everybody started realizing, oh, this is actually a 339 00:20:54,640 --> 00:20:56,480 Speaker 1: really huge thing, because it wasn't just us, it was 340 00:20:56,520 --> 00:21:01,880 Speaker 1: a vendor. That's scary. Well, and when it's a cybersecurity 341 00:21:02,000 --> 00:21:05,359 Speaker 1: firm that first says, oh gosh, we were hacked, you 342 00:21:05,440 --> 00:21:08,360 Speaker 1: know it's bad, because these are the people who are 343 00:21:08,359 --> 00:21:14,520 Speaker 1: paid to stop that from happening to other people.
So 344 00:21:15,240 --> 00:21:17,399 Speaker 1: it's a great example when you look at it from 345 00:21:17,440 --> 00:21:21,200 Speaker 1: that perspective of FireEye as a cybersecurity company: 346 00:21:21,760 --> 00:21:26,160 Speaker 1: even they had inherent trust in SolarWinds to distribute 347 00:21:26,520 --> 00:21:30,720 Speaker 1: their firmware and their updates in a trusted way. And 348 00:21:30,800 --> 00:21:34,520 Speaker 1: even then they couldn't fully trust SolarWinds to do 349 00:21:34,600 --> 00:21:37,960 Speaker 1: that in a manner that would keep them protected. Right, right. 350 00:21:38,000 --> 00:21:41,520 Speaker 1: I mean, there's this whole certification process, this digital 351 00:21:41,560 --> 00:21:45,639 Speaker 1: certification that proves that a piece of code is really 352 00:21:45,720 --> 00:21:49,479 Speaker 1: coming from the source that you think you're receiving it 353 00:21:49,760 --> 00:21:53,480 Speaker 1: from, so there's this approach that's very 354 00:21:53,560 --> 00:21:58,920 Speaker 1: well tested, very well proven by history to be reliable. 355 00:21:59,520 --> 00:22:04,040 Speaker 1: And that's why this hack is so insidious, because it said, cool, 356 00:22:04,160 --> 00:22:06,359 Speaker 1: we're not going to try and get around that. 357 00:22:06,920 --> 00:22:12,159 Speaker 1: We're gonna rely on that trust, on that whole process, 358 00:22:12,840 --> 00:22:15,640 Speaker 1: because everyone knows it works. So if 359 00:22:15,640 --> 00:22:17,920 Speaker 1: you can get to the code before it goes through, 360 00:22:18,760 --> 00:22:22,359 Speaker 1: then you're golden. And that's exactly what happened. Uh.
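The trust chain discussed here, checking that an update really is the build the vendor published, bottoms out in a digest or signature comparison. A minimal sketch in Python (the update bytes and digest below are made up for illustration), which also shows the limitation the episode is about: if the build system itself is compromised, the vendor signs the malicious build and the check passes anyway.

```python
import hashlib

def verify_update(payload: bytes, vendor_digest: str) -> bool:
    """Accept an update only if its SHA-256 digest matches the one
    the vendor published. Note the limitation: when the build
    pipeline is compromised (as with SolarWinds), the vendor ends
    up signing the malicious build, and this check happily passes."""
    return hashlib.sha256(payload).hexdigest() == vendor_digest

# Hypothetical update blob and its vendor-published digest.
update = b"orion-update-contents"
published = hashlib.sha256(update).hexdigest()

print(verify_update(update, published))        # True: matches what was signed
print(verify_update(b"tampered!", published))  # False: altered after signing
```

This catches tampering after signing, which is exactly why the attackers went upstream of the signing step instead.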
An 361 00:22:22,359 --> 00:22:25,520 Speaker 1: analogy I use is that the way we typically think 362 00:22:25,520 --> 00:22:28,239 Speaker 1: of hackers, and you should appreciate this because I 363 00:22:28,280 --> 00:22:30,920 Speaker 1: know you've played with them: we can think of someone 364 00:22:30,960 --> 00:22:33,480 Speaker 1: who's got lock picks and they're going through an apartment 365 00:22:33,520 --> 00:22:36,199 Speaker 1: building and they're just opening up locks just 366 00:22:36,240 --> 00:22:38,880 Speaker 1: for fun. But the SolarWinds hack is as if 367 00:22:39,400 --> 00:22:43,080 Speaker 1: the supervisor for the entire building with the master key 368 00:22:43,280 --> 00:22:45,200 Speaker 1: is the one who has decided to do all the snooping, 369 00:22:45,200 --> 00:22:46,959 Speaker 1: and they can just walk in whenever, because they've been 370 00:22:46,960 --> 00:22:50,439 Speaker 1: trusted with that master key. So that's kind of the 371 00:22:50,480 --> 00:22:54,240 Speaker 1: analogy I give. It's totally different from the hacks 372 00:22:54,240 --> 00:22:56,680 Speaker 1: where you're like, that person looks sus, I'm not gonna 373 00:22:56,760 --> 00:22:59,040 Speaker 1: let them into the building. No, it's the supervisor. 374 00:22:59,080 --> 00:23:03,480 Speaker 1: Of course the supervisor comes in, he's totally fine. Yeah, 375 00:23:03,760 --> 00:23:06,240 Speaker 1: that's a great analogy, actually. I hope you 376 00:23:06,280 --> 00:23:08,520 Speaker 1: don't mind if I steal that? Please do. I get 377 00:23:08,600 --> 00:23:11,320 Speaker 1: like two a year, so I'm just glad that I 378 00:23:11,400 --> 00:23:14,360 Speaker 1: was able to. I mean, I peaked early. We're in January.
379 00:23:14,440 --> 00:23:20,640 Speaker 1: But yeah. So, the scope of this attack: even 380 00:23:20,640 --> 00:23:25,359 Speaker 1: though only, and I say only, but like forty different 381 00:23:25,359 --> 00:23:30,080 Speaker 1: systems have been compromised and then further infiltrated, you still 382 00:23:30,160 --> 00:23:34,480 Speaker 1: have around eighteen thousand that could potentially be infiltrated, because 383 00:23:34,520 --> 00:23:37,840 Speaker 1: they do have the malicious code installed within their systems 384 00:23:38,200 --> 00:23:40,399 Speaker 1: that allows for that backdoor access. So 385 00:23:40,840 --> 00:23:43,240 Speaker 1: it is now incumbent upon them to make sure 386 00:23:43,320 --> 00:23:49,000 Speaker 1: they isolate those servers, they remediate them, 387 00:23:49,160 --> 00:23:51,840 Speaker 1: and that they bring everything up to a new version 388 00:23:51,840 --> 00:23:56,600 Speaker 1: that no longer has that backdoor access. Meanwhile, for all 389 00:23:56,680 --> 00:24:00,240 Speaker 1: the systems that were compromised, for those forty, which again 390 00:24:00,280 --> 00:24:05,480 Speaker 1: includes like national security level government offices, they have the 391 00:24:05,560 --> 00:24:09,919 Speaker 1: unenviable task of figuring out how extensive the attack was 392 00:24:10,000 --> 00:24:14,119 Speaker 1: within their systems, what parts of their systems were specifically affected, 393 00:24:14,359 --> 00:24:18,240 Speaker 1: what level of access the hackers had, was 394 00:24:18,280 --> 00:24:20,480 Speaker 1: it like Microsoft, where they could just see it, or could 395 00:24:20,520 --> 00:24:23,439 Speaker 1: they do more? And how do they fix it? Um, 396 00:24:23,480 --> 00:24:26,560 Speaker 1: and this is, I think, what we 397 00:24:26,600 --> 00:24:31,120 Speaker 1: could call a ginormous challenge.
Oh yeah. 398 00:24:31,400 --> 00:24:34,240 Speaker 1: So I'll give you an example from a very much 399 00:24:34,440 --> 00:24:38,119 Speaker 1: smaller scale. When I was working at Hak5 in 400 00:24:38,160 --> 00:24:42,840 Speaker 1: an office, I learned how I could do network sniffing 401 00:24:43,320 --> 00:24:46,280 Speaker 1: on the entire office. So I was able to figure 402 00:24:46,320 --> 00:24:51,280 Speaker 1: out from my little Linux laptop what machines were connected 403 00:24:51,880 --> 00:24:54,800 Speaker 1: all to the same network, whether they were Ethernet 404 00:24:54,880 --> 00:24:56,560 Speaker 1: or WiFi. I was able to figure out how to, 405 00:24:56,840 --> 00:24:59,119 Speaker 1: you know, sniff WiFi as well, because we made a 406 00:24:59,160 --> 00:25:01,920 Speaker 1: product for that. And I was able to see 407 00:25:02,000 --> 00:25:04,960 Speaker 1: that we had, like, I think it was like twelve 408 00:25:05,080 --> 00:25:09,280 Speaker 1: different computers, we had two printers. So then from there 409 00:25:09,359 --> 00:25:12,359 Speaker 1: I was able to look up the versions of everybody's 410 00:25:12,440 --> 00:25:15,040 Speaker 1: operating systems and find out which ones were vulnerable. And 411 00:25:15,080 --> 00:25:18,200 Speaker 1: it turns out one of our printers was vulnerable. So 412 00:25:18,320 --> 00:25:22,000 Speaker 1: even though I was not necessarily connected to the printer, 413 00:25:22,200 --> 00:25:25,040 Speaker 1: like I didn't have it installed, the drivers installed, or 414 00:25:25,040 --> 00:25:28,399 Speaker 1: anything on my Linux computer, I was able to send 415 00:25:28,920 --> 00:25:32,880 Speaker 1: that printer a piece of paper that said I got hacks, 416 00:25:33,040 --> 00:25:34,760 Speaker 1: and I was able to print it out on the printer. 417 00:25:35,119 --> 00:25:37,520 Speaker 1: And it was the funniest thing because, like, nobody...
418 00:25:37,680 --> 00:25:39,600 Speaker 1: It was Darren's printer, so he was able to 419 00:25:39,640 --> 00:25:42,720 Speaker 1: look at it. My coworker, Darren Kitchen. And 420 00:25:43,160 --> 00:25:44,679 Speaker 1: he looks at the piece of paper and he 421 00:25:44,720 --> 00:25:46,280 Speaker 1: was like, Snubs, did you just figure out how to 422 00:25:46,280 --> 00:25:49,040 Speaker 1: hack the printer? And I said, yeah, it was super funny. 423 00:25:49,080 --> 00:25:52,399 Speaker 1: But even from a much more broad perspective, when 424 00:25:52,440 --> 00:25:56,600 Speaker 1: you're looking at SolarWinds, um, if somebody had access 425 00:25:56,640 --> 00:25:59,480 Speaker 1: to the network of one of their clients, 426 00:26:00,200 --> 00:26:04,480 Speaker 1: they could see the actual desktop computers that many of 427 00:26:04,480 --> 00:26:07,880 Speaker 1: their office employees might have access to. They could see printers. 428 00:26:07,960 --> 00:26:12,680 Speaker 1: They might be able to see network-connected security cameras. Uh, 429 00:26:12,720 --> 00:26:14,360 Speaker 1: if they work at a bank, they might be able 430 00:26:14,400 --> 00:26:17,440 Speaker 1: to see network-connected ATMs. Uh, they 431 00:26:17,480 --> 00:26:22,520 Speaker 1: have access to maybe like passwords or anything that's being 432 00:26:22,560 --> 00:26:26,720 Speaker 1: distributed across the network if it's not being protected correctly. 433 00:26:27,320 --> 00:26:31,720 Speaker 1: They could have access to network-attached storage in server racks, 434 00:26:31,760 --> 00:26:35,280 Speaker 1: all sorts of things. So if you have hundreds and 435 00:26:35,359 --> 00:26:38,359 Speaker 1: hundreds of different connected devices and any of those have 436 00:26:38,520 --> 00:26:41,080 Speaker 1: not been, like, auto-updated, and then again we're putting 437 00:26:41,080 --> 00:26:45,360 Speaker 1: trust in vendors to auto-update correctly.
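The first step of what Shannon describes, figuring out what's on the network, can be sketched without any sniffing hardware by parsing `arp -a`-style output. The sample text below is hardcoded and hypothetical; a real enumeration would read it from the OS or from a capture tool like the ones Hak5 builds.

```python
import re

# Sample `arp -a`-style output, hardcoded for illustration.
ARP_OUTPUT = """\
? (192.168.1.10) at aa:bb:cc:dd:ee:01 on en0
? (192.168.1.23) at aa:bb:cc:dd:ee:02 on en0
printer.local (192.168.1.50) at aa:bb:cc:dd:ee:03 on en0
"""

def enumerate_hosts(arp_text: str) -> list[tuple[str, str]]:
    """Pull (ip, mac) pairs out of arp -a style lines."""
    pat = re.compile(r"\((\d+\.\d+\.\d+\.\d+)\) at ([0-9a-f:]{17})")
    return pat.findall(arp_text)

hosts = enumerate_hosts(ARP_OUTPUT)
print(hosts)  # three (ip, mac) pairs, including the printer
```

From a list like this, the next step in the story is mapping each device to a software version, which is where the vulnerability lookup comes in.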
If these machines 438 00:26:45,400 --> 00:26:51,040 Speaker 1: have not been auto-updated or patched correctly, and a 439 00:26:51,080 --> 00:26:54,720 Speaker 1: hacker has access over that network to see what versions 440 00:26:54,760 --> 00:26:58,920 Speaker 1: these programs are running, there's plenty of information on Google 441 00:26:59,000 --> 00:27:02,119 Speaker 1: about what version of what software is still vulnerable to 442 00:27:02,240 --> 00:27:05,280 Speaker 1: what problems. There are these things called CVEs, 443 00:27:05,359 --> 00:27:07,920 Speaker 1: and you can look them up and see what kind 444 00:27:07,960 --> 00:27:10,720 Speaker 1: of vulnerabilities are currently out there and how they are 445 00:27:10,760 --> 00:27:14,640 Speaker 1: being fixed. And if a hacker looks 446 00:27:14,640 --> 00:27:17,080 Speaker 1: at this version and then they find out there's a vulnerability, 447 00:27:17,119 --> 00:27:19,280 Speaker 1: they could use that to their advantage to get another 448 00:27:19,359 --> 00:27:23,439 Speaker 1: foothold within that network. Even if the network 449 00:27:23,480 --> 00:27:27,240 Speaker 1: admin found out that there was a vulnerability on their 450 00:27:27,280 --> 00:27:29,199 Speaker 1: network and they were able to cut that off, the 451 00:27:29,200 --> 00:27:33,080 Speaker 1: hacker might have already gotten another foothold. So it's entirely 452 00:27:33,119 --> 00:27:36,320 Speaker 1: possible that there are, like, plenty of other places that these 453 00:27:36,359 --> 00:27:40,280 Speaker 1: attackers are snooping on networks through.
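The version-against-advisory lookup described here reduces to a comparison. A toy sketch follows; the advisory table is invented for illustration (real entries come from CVE databases such as the NVD), and real-world version matching is far messier than a tuple compare.

```python
# Toy advisory table (hypothetical products and versions):
# product -> highest version still vulnerable.
ADVISORIES = {
    "printserve": (2, 4, 1),   # pretend CVE: fixed in 2.4.2
    "routeros":   (6, 45, 9),  # pretend CVE: fixed in 6.46
}

def parse_version(v: str) -> tuple[int, ...]:
    """Turn '2.3.0' into (2, 3, 0) so versions compare numerically."""
    return tuple(int(p) for p in v.split("."))

def is_vulnerable(product: str, version: str) -> bool:
    """A version at or below the advisory ceiling is still exposed."""
    ceiling = ADVISORIES.get(product)
    if ceiling is None:
        return False  # no known advisory for this product
    return parse_version(version) <= ceiling

print(is_vulnerable("printserve", "2.3.0"))  # True: below the fix
print(is_vulnerable("printserve", "2.4.2"))  # False: patched
```

The defender's audit and the attacker's reconnaissance run this same comparison; the difference is only who acts on the result first.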
So yeah, it's a 454 00:27:40,400 --> 00:27:43,560 Speaker 1: huge issue, and it's no wonder, like, given that this 455 00:27:43,680 --> 00:27:46,600 Speaker 1: was just discovered a few weeks ago, maybe about a 456 00:27:46,680 --> 00:27:50,359 Speaker 1: month, a month and two weeks ago, or six weeks, uh, 457 00:27:50,480 --> 00:27:53,359 Speaker 1: it's no wonder that there are tons of network admins and 458 00:27:53,400 --> 00:27:57,040 Speaker 1: security professionals that are still having to work, like, over 459 00:27:57,119 --> 00:28:00,520 Speaker 1: time just to ensure that their networks are safe. 460 00:28:00,520 --> 00:28:03,560 Speaker 1: And you pointed out a problem that I hadn't even 461 00:28:03,560 --> 00:28:05,920 Speaker 1: thought about, which is just like, hey, you know how 462 00:28:05,960 --> 00:28:08,720 Speaker 1: bad you thought this was? Guess what, it's worse than that. 463 00:28:10,040 --> 00:28:12,919 Speaker 1: Because, like, to go back to my analogy, 464 00:28:13,320 --> 00:28:16,399 Speaker 1: it would be almost like if you have 465 00:28:16,480 --> 00:28:19,640 Speaker 1: infiltrated a building, you were able to sneak in, and 466 00:28:19,720 --> 00:28:22,160 Speaker 1: you're snooping around and you're looking at all this sort 467 00:28:22,160 --> 00:28:24,960 Speaker 1: of stuff, and meanwhile you're also unlocking every window you 468 00:28:25,040 --> 00:28:28,680 Speaker 1: go by, so that if your original entry point 469 00:28:28,680 --> 00:28:33,120 Speaker 1: has been shut off, you've got like fifty others.
For example, 470 00:28:33,200 --> 00:28:35,800 Speaker 1: if somebody was to change the lock on their door, 471 00:28:36,440 --> 00:28:39,680 Speaker 1: but you also had unlocked the windows so that you 472 00:28:39,680 --> 00:28:42,400 Speaker 1: could get access that way, they might not even think 473 00:28:42,400 --> 00:28:44,400 Speaker 1: about checking the window when they fixed the lock on 474 00:28:44,400 --> 00:28:47,760 Speaker 1: the door. Right, right. So, like you were saying, looking 475 00:28:47,760 --> 00:28:50,120 Speaker 1: at all the different versions of software that are running 476 00:28:50,120 --> 00:28:54,200 Speaker 1: on various computers and other systems, other devices running on 477 00:28:54,240 --> 00:28:58,120 Speaker 1: that network, if you identify all those potential vulnerabilities, really 478 00:28:58,200 --> 00:29:01,880 Speaker 1: you're just, like you're saying, going, we should use this 479 00:29:01,960 --> 00:29:05,480 Speaker 1: time to start developing tools to take advantage of all 480 00:29:05,520 --> 00:29:09,000 Speaker 1: these different potential weak points, because we can make the 481 00:29:09,000 --> 00:29:12,960 Speaker 1: problem so big that it is almost impossible to think 482 00:29:13,000 --> 00:29:15,479 Speaker 1: of what the solution would be, apart from nuke it 483 00:29:15,560 --> 00:29:18,520 Speaker 1: from orbit. It's the only way to be sure. A 484 00:29:18,560 --> 00:29:21,800 Speaker 1: lot of it is risk assessment, and that's something that 485 00:29:21,880 --> 00:29:24,280 Speaker 1: a lot of large businesses do, and 486 00:29:24,440 --> 00:29:28,480 Speaker 1: it's even something that customers can do. Consumers, like, I 487 00:29:28,520 --> 00:29:31,840 Speaker 1: could do this for my home network: risk assessment. What's 488 00:29:31,920 --> 00:29:35,760 Speaker 1: running on your network right now?
What devices are vulnerable 489 00:29:35,840 --> 00:29:38,920 Speaker 1: or potentially vulnerable? Have you done a yearly audit to 490 00:29:38,960 --> 00:29:42,040 Speaker 1: make sure that there's nobody getting access, there's no, like, 491 00:29:42,160 --> 00:29:46,000 Speaker 1: random email addresses tied or associated to your online accounts? 492 00:29:46,360 --> 00:29:48,600 Speaker 1: Have you changed your passwords in the past year to 493 00:29:49,440 --> 00:29:54,520 Speaker 1: comply with NIST's recommended framework for passwords? Like, there's a 494 00:29:54,560 --> 00:29:56,600 Speaker 1: bunch of different things that you can do to kind 495 00:29:56,600 --> 00:30:00,320 Speaker 1: of assess where your risks lie and then act on 496 00:30:00,360 --> 00:30:05,000 Speaker 1: those assessments before a hack actually happens. Right, yeah. And 497 00:30:05,040 --> 00:30:07,760 Speaker 1: as long as you don't have an issue like this 498 00:30:08,000 --> 00:30:12,960 Speaker 1: where a trusted vendor's... because, yeah, because that just 499 00:30:12,960 --> 00:30:14,920 Speaker 1: slips right in. Right, just like you were saying, like, 500 00:30:15,280 --> 00:30:17,600 Speaker 1: these companies could have been doing all the right 501 00:30:17,680 --> 00:30:20,680 Speaker 1: things. It's not like they did something wrong. They did 502 00:30:20,760 --> 00:30:23,080 Speaker 1: the right thing. And you might wonder, well, how did 503 00:30:23,080 --> 00:30:28,240 Speaker 1: the hackers get access to the Orion software to start with? 504 00:30:28,280 --> 00:30:31,520 Speaker 1: Like, how did that happen? And honestly, we don't fully know, 505 00:30:31,680 --> 00:30:35,360 Speaker 1: or at least the public doesn't fully know yet. Someone 506 00:30:35,440 --> 00:30:39,600 Speaker 1: might know, but I don't.
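The NIST guidance Shannon mentions (SP 800-63B) favors minimum length and screening against known-breached passwords over special-character composition rules. A toy sketch of that audit check, with a tiny stand-in breach list (a real audit would check something like the haveibeenpwned corpus):

```python
# Tiny stand-in for a breached-password corpus.
BREACHED = {"password", "123456", "letmein", "qwerty"}

def password_ok(candidate: str, min_len: int = 8) -> bool:
    """Loosely follows the SP 800-63B spirit: require a minimum
    length and reject known-breached values, rather than demanding
    special-character soup."""
    if len(candidate) < min_len:
        return False
    return candidate.lower() not in BREACHED

print(password_ok("correct horse battery staple"))  # True
print(password_ok("letmein"))                       # False: breached
print(password_ok("Tr0ub4!"))                       # False: too short
```

Running a check like this across stored credentials is one concrete piece of the yearly self-audit described above.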
But the working theory right 507 00:30:39,640 --> 00:30:43,840 Speaker 1: now is that another third party vendor called JetBrains 508 00:30:44,760 --> 00:30:48,200 Speaker 1: creates a tool called TeamCity. JetBrains, by the way, 509 00:30:48,400 --> 00:30:52,000 Speaker 1: I'm sure completely coincidentally, founded by a group of Russian 510 00:30:52,040 --> 00:30:56,120 Speaker 1: cybersecurity experts. But TeamCity, TeamCity is a software 511 00:30:56,160 --> 00:30:58,760 Speaker 1: testing environment. So it's the kind of thing where you've 512 00:30:58,760 --> 00:31:00,640 Speaker 1: got your little virtual sandbox, so that you 513 00:31:00,680 --> 00:31:03,600 Speaker 1: can build software and try and break it and see 514 00:31:03,600 --> 00:31:07,120 Speaker 1: if it works before you deploy it in the real world. Right, 515 00:31:07,360 --> 00:31:10,400 Speaker 1: that's kind of the thing they make. And SolarWinds 516 00:31:10,600 --> 00:31:13,719 Speaker 1: is one of the customers who uses TeamCity, and 517 00:31:13,800 --> 00:31:18,440 Speaker 1: so the current thinking is that the hackers targeted TeamCity. 518 00:31:18,840 --> 00:31:23,040 Speaker 1: They specifically targeted a server that SolarWinds uses that 519 00:31:23,200 --> 00:31:26,080 Speaker 1: has TeamCity on it. They targeted that, and then 520 00:31:26,120 --> 00:31:29,000 Speaker 1: they were able to get access to SolarWinds software 521 00:31:29,280 --> 00:31:32,480 Speaker 1: through that link, which just shows you, like, there could 522 00:31:32,480 --> 00:31:36,360 Speaker 1: be a lot of hops between the hacker and 523 00:31:36,400 --> 00:31:40,640 Speaker 1: their ultimate goal. So this TeamCity server was one hop. 524 00:31:41,280 --> 00:31:45,080 Speaker 1: The SolarWinds system where they were able to inject 525 00:31:45,120 --> 00:31:50,000 Speaker 1: malware into Orion was a second hop.
The customers were 526 00:31:50,080 --> 00:31:53,160 Speaker 1: the third hop, and then they could go in and 527 00:31:53,200 --> 00:31:56,280 Speaker 1: start adding a second payload. Because once they 528 00:31:56,280 --> 00:31:58,800 Speaker 1: were deployed to the customers, that was the in road, 529 00:31:58,840 --> 00:32:01,479 Speaker 1: that was the back door. There is no doubt in 530 00:32:01,480 --> 00:32:04,840 Speaker 1: my mind that their end goal was the clients 531 00:32:04,880 --> 00:32:10,560 Speaker 1: that use SolarWinds. And chances are that these attackers 532 00:32:10,640 --> 00:32:14,280 Speaker 1: are very, very advanced and that they probably are state 533 00:32:14,320 --> 00:32:18,920 Speaker 1: sponsored, because the time that they invested in order to 534 00:32:19,840 --> 00:32:23,160 Speaker 1: get these back doors within these 535 00:32:23,200 --> 00:32:26,240 Speaker 1: clients took them over a year. I mean, it took 536 00:32:26,280 --> 00:32:28,680 Speaker 1: them a very, very long time. And if they started 537 00:32:29,600 --> 00:32:33,560 Speaker 1: even behind SolarWinds, at JetBrains, that's insane. Like, 538 00:32:33,680 --> 00:32:37,000 Speaker 1: that is extremely advanced. And that's one of the reasons 539 00:32:37,040 --> 00:32:41,320 Speaker 1: why this is such a crucial attack and why 540 00:32:41,400 --> 00:32:43,720 Speaker 1: it's going to go in, like, history books. When 541 00:32:43,760 --> 00:32:47,520 Speaker 1: people talk about information security and learning about previous attacks, 542 00:32:47,960 --> 00:32:51,960 Speaker 1: this is going to be one of those historical examples 543 00:32:52,040 --> 00:32:55,080 Speaker 1: of a supply chain attack, because it's insane how 544 00:32:55,080 --> 00:32:58,880 Speaker 1: advanced it is. We'll be right back with more with 545 00:32:58,960 --> 00:33:09,080 Speaker 1: Shannon Morse about the SolarWinds hack.
After this quick message. 546 00:33:10,520 --> 00:33:14,800 Speaker 1: I've read some articles by cybersecurity experts who, you know, 547 00:33:14,880 --> 00:33:18,920 Speaker 1: with hindsight, now that it's happened, can see where 548 00:33:18,920 --> 00:33:23,320 Speaker 1: the opportunities were earlier on, in the sense that if 549 00:33:23,320 --> 00:33:29,000 Speaker 1: you're thinking about the cybersecurity environment of, say, twenty eighteen 550 00:33:29,240 --> 00:33:35,120 Speaker 1: to present day, a lot of that attention was rightfully 551 00:33:35,240 --> 00:33:39,040 Speaker 1: devoted to things like how do we maintain a secure 552 00:33:39,240 --> 00:33:42,480 Speaker 1: election cycle here in the United States. So a lot 553 00:33:42,520 --> 00:33:46,680 Speaker 1: of resources were looking in one direction, which meant that 554 00:33:46,760 --> 00:33:51,160 Speaker 1: not as many resources were looking for potential supply chain threats. 555 00:33:51,480 --> 00:33:55,600 Speaker 1: So there were a few analysts who had previously 556 00:33:55,640 --> 00:33:58,240 Speaker 1: said this is something we really need to be cognizant 557 00:33:58,360 --> 00:34:02,200 Speaker 1: of and develop best practices for, so that we can 558 00:34:02,880 --> 00:34:06,000 Speaker 1: hopefully prevent it, or if not prevent it, certainly detect it 559 00:34:06,040 --> 00:34:10,799 Speaker 1: and react to it. But because there were other pressing 560 00:34:10,840 --> 00:34:15,160 Speaker 1: matters that were very much tied to cybersecurity, that 561 00:34:15,239 --> 00:34:18,319 Speaker 1: just didn't get as much attention as it might have otherwise, 562 00:34:18,600 --> 00:34:21,760 Speaker 1: and it ended up being the perfect opportunity.
It actually 563 00:34:21,760 --> 00:34:28,399 Speaker 1: really does point to the incredible, um, inventiveness and 564 00:34:28,480 --> 00:34:32,640 Speaker 1: how nimble the hackers were, to be 565 00:34:32,680 --> 00:34:37,719 Speaker 1: able to recognize a time and an opportunity to really 566 00:34:38,440 --> 00:34:43,120 Speaker 1: develop and deploy that malware, because you couldn't have asked 567 00:34:43,160 --> 00:34:46,800 Speaker 1: for a better environment, right? It just was the perfect 568 00:34:46,800 --> 00:34:49,040 Speaker 1: time for the neighborhood watch to be looking the 569 00:34:49,080 --> 00:34:53,400 Speaker 1: other way. Oh yes. Um, and I feel like 570 00:34:53,440 --> 00:34:57,799 Speaker 1: the attackers got very lucky on their timing, even though, 571 00:34:58,440 --> 00:35:01,120 Speaker 1: and this is bringing up the pandemic in a sense, 572 00:35:01,160 --> 00:35:05,480 Speaker 1: even though it's probable that this started in October 573 00:35:05,960 --> 00:35:09,280 Speaker 1: two thousand nineteen, and that happened before the pandemic. What perfect timing 574 00:35:09,320 --> 00:35:12,000 Speaker 1: for these attackers, because the entire time that they have 575 00:35:12,080 --> 00:35:16,279 Speaker 1: been silently getting intrusions into all these different clients and 576 00:35:16,360 --> 00:35:19,680 Speaker 1: into SolarWinds as the vendor, there have been companies 577 00:35:19,719 --> 00:35:22,480 Speaker 1: out here that have been losing funds because of the pandemic.
578 00:35:22,840 --> 00:35:26,120 Speaker 1: They don't have as much manpower because everybody's working from home, 579 00:35:26,560 --> 00:35:28,879 Speaker 1: and they've had to lay off a lot of their 580 00:35:28,920 --> 00:35:32,520 Speaker 1: network administrators and their IT and consultants and everybody else, 581 00:35:32,520 --> 00:35:34,480 Speaker 1: and they don't have the money right now to fund 582 00:35:35,120 --> 00:35:38,120 Speaker 1: doing, like, third party audits of their systems and stuff 583 00:35:38,160 --> 00:35:41,759 Speaker 1: like that. So, perfect timing for attackers to just come 584 00:35:41,800 --> 00:35:46,600 Speaker 1: in and silently attack and intrude on all of these 585 00:35:46,600 --> 00:35:49,480 Speaker 1: different networks, because nobody has the manpower right now. 586 00:35:49,520 --> 00:35:52,000 Speaker 1: It's almost impossible for all these companies to be 587 00:35:52,040 --> 00:35:56,160 Speaker 1: able to fulfill all the projects they could have potentially 588 00:35:56,200 --> 00:36:01,400 Speaker 1: had for security and privacy of their networks. Yeah, 589 00:36:01,440 --> 00:36:05,320 Speaker 1: it is. It's a remarkable set of circumstances that 590 00:36:05,440 --> 00:36:09,399 Speaker 1: all helped create almost a perfect storm. The only way 591 00:36:09,440 --> 00:36:11,160 Speaker 1: you could argue that this 592 00:36:11,200 --> 00:36:16,400 Speaker 1: would be obviously worse is if that number of compromised 593 00:36:16,480 --> 00:36:20,239 Speaker 1: systems had an even larger number of ones that were 594 00:36:20,320 --> 00:36:23,080 Speaker 1: followed up upon. If that number were 595 00:36:23,080 --> 00:36:27,000 Speaker 1: even bigger, then we would be talking about...
I mean, 596 00:36:27,120 --> 00:36:29,359 Speaker 1: I don't even know what to call it, like 597 00:36:29,480 --> 00:36:32,319 Speaker 1: a catastrophe, because I think it's already a catastrophe. We're 598 00:36:32,320 --> 00:36:37,920 Speaker 1: already at a catastrophic level because of the potential espionage that 599 00:36:37,960 --> 00:36:41,200 Speaker 1: could have been done in critical systems. We don't know 600 00:36:41,520 --> 00:36:45,600 Speaker 1: if they were ever able to really access, like, highly 601 00:36:45,600 --> 00:36:49,000 Speaker 1: classified information. Clearly that's something that the government likes to 602 00:36:49,080 --> 00:36:51,760 Speaker 1: keep on the down low. They're not 603 00:36:51,840 --> 00:36:53,799 Speaker 1: too happy to say, like, oh, by the way, 604 00:36:53,880 --> 00:36:57,080 Speaker 1: Russian spies were able to look at our top secret 605 00:36:57,120 --> 00:37:00,319 Speaker 1: classified information that even most of our government officials never 606 00:37:00,360 --> 00:37:03,000 Speaker 1: get a chance to see. That would be bad. Uh, 607 00:37:03,080 --> 00:37:05,200 Speaker 1: we don't know if that's happened or not. Based 608 00:37:05,239 --> 00:37:08,040 Speaker 1: on what we've seen at other places, uh, it's hard 609 00:37:08,080 --> 00:37:11,560 Speaker 1: to say, because it all depends upon what other security 610 00:37:11,560 --> 00:37:14,319 Speaker 1: practices these different departments were doing, whether or not they 611 00:37:14,360 --> 00:37:17,920 Speaker 1: had sort of sequestered some of their most sensitive 612 00:37:17,920 --> 00:37:22,040 Speaker 1: information in systems that are not as easily accessible. There 613 00:37:22,080 --> 00:37:24,560 Speaker 1: are possible ways of doing that.
Microsoft, in fact, has 614 00:37:24,600 --> 00:37:28,840 Speaker 1: talked about how their own security system is 615 00:37:28,920 --> 00:37:32,240 Speaker 1: part of the reason why the attackers were limited in their access. 616 00:37:33,120 --> 00:37:35,239 Speaker 1: They still got to see a ton of stuff. It's 617 00:37:35,280 --> 00:37:39,280 Speaker 1: not good, but it was low privileged user access. 618 00:37:39,320 --> 00:37:42,560 Speaker 1: They weren't able to get, like, full access to everything 619 00:37:42,719 --> 00:37:46,600 Speaker 1: on Microsoft systems, because the attacker was only able to 620 00:37:46,640 --> 00:37:52,320 Speaker 1: get that lower end access. So here's hoping. And 621 00:37:53,080 --> 00:37:56,360 Speaker 1: the cynic in me feels like hope is a 622 00:37:56,440 --> 00:37:59,080 Speaker 1: strong word to use, because I'm also familiar with government 623 00:37:59,120 --> 00:38:03,120 Speaker 1: systems and they're not always laid out in the best way, 624 00:38:03,160 --> 00:38:07,080 Speaker 1: often through no fault of government officials. I 625 00:38:07,120 --> 00:38:08,960 Speaker 1: don't want to throw a lot of shade at them. 626 00:38:09,160 --> 00:38:10,799 Speaker 1: We also have to keep in mind that in some of 627 00:38:10,840 --> 00:38:14,760 Speaker 1: those positions there's a lot of turnover, just because government 628 00:38:14,840 --> 00:38:18,520 Speaker 1: changes a lot. So it's hard to keep a real 629 00:38:18,640 --> 00:38:22,160 Speaker 1: legacy of security in those systems, because you don't necessarily 630 00:38:22,200 --> 00:38:25,200 Speaker 1: have the same personnel from one administration to the next, 631 00:38:25,680 --> 00:38:28,919 Speaker 1: um, and there can even be turnover within administrations, as 632 00:38:28,960 --> 00:38:35,719 Speaker 1: our most recent administration taught us nearly daily.
So yeah, 633 00:38:35,719 --> 00:38:41,320 Speaker 1: this is a huge challenge. The process 634 00:38:41,400 --> 00:38:44,720 Speaker 1: of cleaning it up is going to take a really 635 00:38:44,760 --> 00:38:48,839 Speaker 1: long time. I tried to see if any analysts had 636 00:38:48,960 --> 00:38:52,719 Speaker 1: kind of an estimate, but the most specific answer I 637 00:38:52,719 --> 00:38:57,880 Speaker 1: could get was probably years to really assess the full extent. 638 00:38:58,840 --> 00:39:01,840 Speaker 1: That's the same thing that I saw, which was pretty 639 00:39:01,920 --> 00:39:05,239 Speaker 1: much the consensus even among, like, my hacker friends: 640 00:39:05,320 --> 00:39:07,920 Speaker 1: it's probably going to take several years in order for 641 00:39:07,960 --> 00:39:11,759 Speaker 1: them to really figure out how deep this honestly goes. 642 00:39:12,520 --> 00:39:16,759 Speaker 1: That is a sobering fact. It's also, you know, a 643 00:39:16,840 --> 00:39:21,120 Speaker 1: good reminder that this is 644 00:39:21,120 --> 00:39:23,800 Speaker 1: not necessarily going to be an isolated incident. The fact 645 00:39:23,800 --> 00:39:27,600 Speaker 1: that this was so successful sends out a message to 646 00:39:27,760 --> 00:39:32,520 Speaker 1: any state sponsored hacker group that if you can manage 647 00:39:32,560 --> 00:39:37,359 Speaker 1: something like this, then all the doors are 648 00:39:37,400 --> 00:39:42,439 Speaker 1: open to you. So it's now something that vendors are 649 00:39:42,560 --> 00:39:44,839 Speaker 1: really going to have to be cognizant of, to make 650 00:39:44,880 --> 00:39:48,319 Speaker 1: certain that the product they send out has not 651 00:39:48,480 --> 00:39:51,120 Speaker 1: been altered in any way. And this is made more 652 00:39:51,239 --> 00:39:54,959 Speaker 1: challenging because obviously hackers are clever.
They figure out ways 653 00:39:54,960 --> 00:39:59,799 Speaker 1: to cover their footsteps, thank you, I mean, a good 654 00:40:00,040 --> 00:40:03,240 Speaker 1: hacker is anyway, right, A good hacker doesn't just figure 655 00:40:03,239 --> 00:40:05,560 Speaker 1: out how to intrude on the system, they also figure 656 00:40:05,560 --> 00:40:09,600 Speaker 1: out how to cover up that intrusion so that it's 657 00:40:09,640 --> 00:40:14,520 Speaker 1: not immediately apparent. Yes, because a lot of companies have 658 00:40:14,680 --> 00:40:19,240 Speaker 1: really good intrusion detection software which will 659 00:40:19,600 --> 00:40:23,359 Speaker 1: send them a red flag or notify several of the 660 00:40:23,440 --> 00:40:27,160 Speaker 1: administrators that are working on that network immediately as soon 661 00:40:27,200 --> 00:40:30,880 Speaker 1: as something is noticed, so that they can assess 662 00:40:30,920 --> 00:40:35,600 Speaker 1: the situation and cut off the threat. Yeah. And just 663 00:40:35,680 --> 00:40:39,120 Speaker 1: to make this story even more scary, there have 664 00:40:39,160 --> 00:40:45,120 Speaker 1: been four major cybersecurity companies that have reported being compromised 665 00:40:45,120 --> 00:40:47,760 Speaker 1: in some way or at least attacked by these hackers. 666 00:40:48,000 --> 00:40:51,320 Speaker 1: One of the four says that no harm was done, 667 00:40:51,480 --> 00:40:53,839 Speaker 1: and those would be FireEye, which we mentioned before. 668 00:40:53,880 --> 00:40:55,960 Speaker 1: That was the first company that came forward that kind 669 00:40:56,000 --> 00:40:59,239 Speaker 1: of broke open the dam on the discovery of this. 670 00:40:59,719 --> 00:41:03,600 Speaker 1: Microsoft is another, Malwarebytes, which we learned 671 00:41:03,600 --> 00:41:06,719 Speaker 1: about not too long before the recording of this.
Yeah, 672 00:41:06,840 --> 00:41:10,680 Speaker 1: like really recently, and worse than that, not related directly 673 00:41:10,719 --> 00:41:14,040 Speaker 1: to SolarWinds because they don't use SolarWinds products. 674 00:41:14,040 --> 00:41:16,719 Speaker 1: We'll get back to that. And then CrowdStrike, which is 675 00:41:16,760 --> 00:41:19,759 Speaker 1: the company that says, yeah, they tried, but 676 00:41:19,880 --> 00:41:23,680 Speaker 1: they didn't get in and they didn't do anything. So good on them. 677 00:41:24,239 --> 00:41:27,600 Speaker 1: But as for Malwarebytes, they came forward and said, yes, 678 00:41:27,760 --> 00:41:31,799 Speaker 1: we also have detected the presence of these hackers in 679 00:41:31,840 --> 00:41:35,640 Speaker 1: our systems. But in our case it was because of 680 00:41:35,680 --> 00:41:40,719 Speaker 1: an Office 365 email protection app that was 681 00:41:40,960 --> 00:41:44,520 Speaker 1: dormant that they were able to target and get to 682 00:41:44,680 --> 00:41:46,640 Speaker 1: our systems through that. So they were able to read 683 00:41:46,680 --> 00:41:51,040 Speaker 1: some emails. So that tells us that potentially that could 684 00:41:51,040 --> 00:41:53,040 Speaker 1: have been something they learned by being able to look 685 00:41:53,040 --> 00:41:55,880 Speaker 1: at the source code over at Microsoft. We don't know 686 00:41:55,960 --> 00:41:59,200 Speaker 1: that, but that's possibly how that happened, was that they 687 00:41:59,320 --> 00:42:05,080 Speaker 1: learned of a particular attack vector by scouring the source code, 688 00:42:05,400 --> 00:42:08,759 Speaker 1: and thus were able to have a secondary attack through 689 00:42:08,880 --> 00:42:11,160 Speaker 1: a totally different approach and not have to depend upon 690 00:42:11,160 --> 00:42:14,239 Speaker 1: SolarWinds at all.
And if that's the case, if 691 00:42:14,280 --> 00:42:19,800 Speaker 1: Malwarebytes was targeted, then there's a really good chance 692 00:42:19,800 --> 00:42:22,040 Speaker 1: that others were too and we just don't know about 693 00:42:22,080 --> 00:42:25,640 Speaker 1: it yet. Yeah, that's an excellent example, and it kind 694 00:42:25,640 --> 00:42:28,600 Speaker 1: of takes us back, you know, back to the beginning 695 00:42:28,600 --> 00:42:33,480 Speaker 1: of the conversation, kind of explaining why the attackers were 696 00:42:33,480 --> 00:42:36,840 Speaker 1: targeting these companies in the first place, because they're getting 697 00:42:36,880 --> 00:42:40,879 Speaker 1: access to this crucial information that could potentially give them 698 00:42:40,960 --> 00:42:45,000 Speaker 1: access to other people or other brands and companies in 699 00:42:45,040 --> 00:42:48,319 Speaker 1: the future for completely different hacks that have nothing to 700 00:42:48,360 --> 00:42:52,279 Speaker 1: do with SolarWinds. So while we might 701 00:42:52,280 --> 00:42:54,920 Speaker 1: be on the lookout for one type of attack, just 702 00:42:55,040 --> 00:42:57,520 Speaker 1: like we did with what I was talking about, the, 703 00:42:57,560 --> 00:42:59,920 Speaker 1: you know, election cycle really taking a lot of cyber 704 00:43:00,000 --> 00:43:03,520 Speaker 1: security attention. If we're all looking for one specific type 705 00:43:03,560 --> 00:43:06,640 Speaker 1: of attack, that just means that there's opportunities for other attacks. 706 00:43:06,640 --> 00:43:09,920 Speaker 1: In fact, this is sort of just the cracker 707 00:43:10,080 --> 00:43:12,759 Speaker 1: style of hacker, you know, the ones that specifically 708 00:43:12,800 --> 00:43:16,719 Speaker 1: are looking at how to infiltrate systems.
It really just 709 00:43:16,760 --> 00:43:19,120 Speaker 1: goes into their mindset, which is that all they care 710 00:43:19,160 --> 00:43:22,319 Speaker 1: about is, at first anyway, figuring out how do I 711 00:43:22,360 --> 00:43:26,360 Speaker 1: infiltrate that system. That's their only focus. The problem 712 00:43:26,600 --> 00:43:29,880 Speaker 1: is, the people who build these systems are also burdened 713 00:43:30,320 --> 00:43:34,080 Speaker 1: with the weighty responsibility of making the system do whatever 714 00:43:34,120 --> 00:43:38,000 Speaker 1: it was supposed to do, plus make it invulnerable to intrusion. 715 00:43:39,239 --> 00:43:41,239 Speaker 1: But you have to make your system work first, right, 716 00:43:41,280 --> 00:43:44,080 Speaker 1: So you're like, hey, everything works, and then it's like, oh, you 717 00:43:44,160 --> 00:43:47,040 Speaker 1: forgot about this way that a person could intrude 718 00:43:47,239 --> 00:43:51,200 Speaker 1: and get access to your system without authorization. You think, well, shoot, 719 00:43:51,239 --> 00:43:54,080 Speaker 1: I was just trying to make the thing go. Oh yeah, 720 00:43:54,160 --> 00:43:57,240 Speaker 1: like straight up, even if you're like working in an office. 721 00:43:57,480 --> 00:43:59,440 Speaker 1: I love giving those kind of examples because a lot 722 00:43:59,440 --> 00:44:02,480 Speaker 1: of people work in offices. Let's say they have 723 00:44:02,520 --> 00:44:05,319 Speaker 1: to update the firmware on your printer and they have 724 00:44:05,400 --> 00:44:08,879 Speaker 1: to disconnect it to make it invulnerable to some kind 725 00:44:08,920 --> 00:44:12,120 Speaker 1: of attack. All of a sudden, they have to reauthorize 726 00:44:12,239 --> 00:44:15,160 Speaker 1: all of the PCs to connect to that one printer. 727 00:44:15,560 --> 00:44:18,160 Speaker 1: And that's a huge headache, and that creates even more work.
728 00:44:18,400 --> 00:44:21,359 Speaker 1: So you have like all these people that are just 729 00:44:21,400 --> 00:44:23,239 Speaker 1: trying to get their work done and you can't do 730 00:44:23,280 --> 00:44:27,080 Speaker 1: anything from the perspective of an employee. And yeah, 731 00:44:27,160 --> 00:44:30,040 Speaker 1: and I'm definitely that guy who gets a little pop 732 00:44:30,120 --> 00:44:33,920 Speaker 1: up in Windows that says, hey, we've got some updates. 733 00:44:33,960 --> 00:44:35,640 Speaker 1: Do you want to reboot your system or do you 734 00:44:35,680 --> 00:44:38,040 Speaker 1: wanna, you know, and you're like no, I'm like no, 735 00:44:38,360 --> 00:44:41,200 Speaker 1: twelve hours, tell me in twelve hours. And then after 736 00:44:41,280 --> 00:44:43,719 Speaker 1: like four days, it's like no, seriously, Microsoft is going 737 00:44:43,760 --> 00:44:46,080 Speaker 1: to come and take your computer if you don't update. Like, okay, 738 00:44:46,120 --> 00:44:48,640 Speaker 1: you know what, we've had some fun, I'll go ahead 739 00:44:48,640 --> 00:44:52,279 Speaker 1: and reboot. Yeah. So this is 740 00:44:52,320 --> 00:44:55,239 Speaker 1: fascinating to me, and I'm so thankful that you 741 00:44:55,320 --> 00:44:57,359 Speaker 1: joined the show to help me kind of suss all 742 00:44:57,400 --> 00:44:59,440 Speaker 1: this out, because I kind of had a handle on it, 743 00:44:59,480 --> 00:45:02,239 Speaker 1: but you really opened up my eyes to other 744 00:45:02,280 --> 00:45:05,000 Speaker 1: opportunities that, honestly, I mean, I just didn't think about. 745 00:45:05,440 --> 00:45:08,719 Speaker 1: So that's exactly why I wanted you to join 746 00:45:08,760 --> 00:45:10,759 Speaker 1: the show and why I'm so thankful that you 747 00:45:10,800 --> 00:45:13,880 Speaker 1: said yes. After, yeah, of course, after I bugged you 748 00:45:13,880 --> 00:45:16,719 Speaker 1: while you were on holiday.
Well, I'm glad I 749 00:45:16,760 --> 00:45:18,600 Speaker 1: was able to join you because there are so many 750 00:45:18,640 --> 00:45:21,839 Speaker 1: different ways that you can look at this attack. So 751 00:45:22,000 --> 00:45:25,120 Speaker 1: talking about all those different perspectives like I have been 752 00:45:25,600 --> 00:45:28,760 Speaker 1: is really important to really understand and get ingrained into, 753 00:45:28,800 --> 00:45:32,080 Speaker 1: like, the motivations behind the SolarWinds attack, but also 754 00:45:32,200 --> 00:45:36,440 Speaker 1: understand it from a client perspective, why this has been 755 00:45:36,520 --> 00:45:39,680 Speaker 1: so crucial and so important to so many people. And 756 00:45:39,760 --> 00:45:43,640 Speaker 1: it's great to be able to have that sort 757 00:45:43,680 --> 00:45:48,000 Speaker 1: of conversational approach. Where, as I get my understanding, I 758 00:45:48,040 --> 00:45:50,920 Speaker 1: hope that my listeners have gotten a deeper understanding of 759 00:45:50,960 --> 00:45:53,200 Speaker 1: what's going on and why this is such a big 760 00:45:53,200 --> 00:45:57,080 Speaker 1: deal and why it dominated tech news for a couple 761 00:45:57,160 --> 00:46:01,440 Speaker 1: of weeks, you know, before we hear about Apple 762 00:46:01,960 --> 00:46:06,840 Speaker 1: interfering with, you know, defibrillators and things like that. 763 00:46:06,880 --> 00:46:09,319 Speaker 1: So I'm sure we're going to hear a lot more 764 00:46:09,360 --> 00:46:14,200 Speaker 1: about this over the coming months and potentially years as 765 00:46:14,239 --> 00:46:17,719 Speaker 1: well. Inevitably we're going to hear about other 766 00:46:17,760 --> 00:46:20,600 Speaker 1: hacks that are going to be compared against this, because, 767 00:46:20,640 --> 00:46:23,279 Speaker 1: as you say, this is going to be a benchmark. 768 00:46:23,760 --> 00:46:28,319 Speaker 1: This is a historic hack event.
And it will 769 00:46:28,360 --> 00:46:30,400 Speaker 1: be one of those big ones we talk about for 770 00:46:30,480 --> 00:46:34,120 Speaker 1: years to come, you know. But Shannon, if people 771 00:46:34,280 --> 00:46:37,960 Speaker 1: want to find your work and follow all the incredible 772 00:46:37,960 --> 00:46:42,279 Speaker 1: things that you do, where would they go? Check out 773 00:46:42,360 --> 00:46:46,160 Speaker 1: YouTube dot com slash Shannon Morse, spelled just like my name. 774 00:46:46,840 --> 00:46:49,000 Speaker 1: That's where I've been doing a lot of security and 775 00:46:49,040 --> 00:46:51,960 Speaker 1: privacy as well as tech reviews too, and I do 776 00:46:52,040 --> 00:46:56,280 Speaker 1: answer a lot of questions about security and privacy for consumers. Yeah, 777 00:46:56,400 --> 00:46:59,919 Speaker 1: and if you hunt around, you can follow Shannon doing 778 00:47:00,000 --> 00:47:02,799 Speaker 1: all sorts of crazy things like traveling the world when 779 00:47:02,840 --> 00:47:06,160 Speaker 1: there's not a pandemic going on. And she takes really 780 00:47:06,160 --> 00:47:12,319 Speaker 1: good photos. Me too. Me too. And it doesn't help 781 00:47:12,400 --> 00:47:15,000 Speaker 1: that my wife will occasionally send me a picture of 782 00:47:15,040 --> 00:47:17,800 Speaker 1: a place I really want to be in but cannot 783 00:47:17,840 --> 00:47:24,320 Speaker 1: go to until... It's very relatable. Yes. Well, thank you again, 784 00:47:24,560 --> 00:47:27,520 Speaker 1: and I will certainly have you back on TechStuff 785 00:47:27,680 --> 00:47:31,000 Speaker 1: whenever you agree to do it. Well, thank you, Jonathan, 786 00:47:31,120 --> 00:47:33,000 Speaker 1: I appreciate it. Thank you so much for having me.
787 00:47:34,000 --> 00:47:37,320 Speaker 1: I hope you guys enjoyed the interview with Shannon Morse 788 00:47:37,719 --> 00:47:39,840 Speaker 1: and once again I have to thank her for coming 789 00:47:39,880 --> 00:47:43,920 Speaker 1: onto the show. She is very generous with her time, 790 00:47:44,040 --> 00:47:47,719 Speaker 1: so I greatly appreciate it, and I hope that 791 00:47:47,760 --> 00:47:52,279 Speaker 1: discussion gives you a deeper understanding and appreciation for the 792 00:47:53,120 --> 00:47:57,319 Speaker 1: large challenge ahead in dealing with this hack, as well 793 00:47:57,360 --> 00:48:00,520 Speaker 1: as just, you know, something to think about for all 794 00:48:00,560 --> 00:48:04,200 Speaker 1: of you folks managing stuff out there about things to 795 00:48:04,239 --> 00:48:06,520 Speaker 1: look out for in the future. I mean, as Shannon 796 00:48:06,560 --> 00:48:10,640 Speaker 1: points out, the real issue here is that the attack 797 00:48:11,040 --> 00:48:15,640 Speaker 1: targeted something from a trusted source. So when you get 798 00:48:15,640 --> 00:48:20,080 Speaker 1: a message that is from a trusted partner, you don't 799 00:48:20,120 --> 00:48:23,160 Speaker 1: expect there to be malware in it. So this really 800 00:48:23,239 --> 00:48:27,120 Speaker 1: is a major wake up call, and unfortunately it's a 801 00:48:27,160 --> 00:48:30,880 Speaker 1: wake up call that's doing active damage right now. But 802 00:48:31,200 --> 00:48:34,640 Speaker 1: hopefully we'll have better news to bring about the SolarWinds 803 00:48:34,640 --> 00:48:37,960 Speaker 1: hack as time goes on and as people learn 804 00:48:38,000 --> 00:48:42,040 Speaker 1: how to remediate those servers.
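[Editor's note: the "trusted source" problem described above can be sketched with a small, purely illustrative example. Clients routinely verify an update against the checksum or signature the vendor publishes, but in a supply-chain compromise the trojaned build is the very thing the vendor hashes and signs, so the client-side check still passes. The payloads, names, and hashes here are hypothetical, not SolarWinds' or any vendor's actual code.]

```python
# Sketch: why a checksum check does not catch a supply-chain compromise.
import hashlib

def sha256_of(data: bytes) -> str:
    """Hex digest of a payload, as an update client might compute it."""
    return hashlib.sha256(data).hexdigest()

def verify_update(payload: bytes, published_hash: str) -> bool:
    """Accept an update only if it matches the hash the vendor published."""
    return sha256_of(payload) == published_hash

# The vendor's compromised build pipeline produces a trojaned update,
# and the vendor itself hashes (and signs) that build before publishing.
trojaned_build = b"legitimate update code\x00plus injected backdoor"
vendor_published_hash = sha256_of(trojaned_build)

# The client's integrity check passes: the compromise is upstream of it.
print(verify_update(trojaned_build, vendor_published_hash))  # True
```

[The check still does its usual job against tampering in transit; it just authenticates the publisher, not the safety of what the publisher shipped, which is why behavioral intrusion detection on the network matters too.]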
In the meantime, if you 805 00:48:42,080 --> 00:48:45,680 Speaker 1: guys have any suggestions for future topics I should cover 806 00:48:45,800 --> 00:48:49,560 Speaker 1: on TechStuff, whether it's a company, a technology, a trend, 807 00:48:50,080 --> 00:48:53,160 Speaker 1: something like the SolarWinds hack, or maybe there's somebody 808 00:48:53,520 --> 00:48:55,520 Speaker 1: you would love for me to have on the show 809 00:48:55,600 --> 00:48:58,239 Speaker 1: as a guest, let me know. The best way to 810 00:48:58,280 --> 00:49:00,400 Speaker 1: get in touch with me is over on Twitter. The 811 00:49:00,440 --> 00:49:03,560 Speaker 1: handle for the show is TechStuff HSW, 812 00:49:04,239 --> 00:49:12,640 Speaker 1: and I'll talk to you again really soon. TechStuff 813 00:49:12,719 --> 00:49:15,839 Speaker 1: is an iHeartRadio production. For more podcasts from 814 00:49:15,880 --> 00:49:19,680 Speaker 1: iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 815 00:49:19,800 --> 00:49:21,760 Speaker 1: or wherever you listen to your favorite shows.