Speaker 1: Get in touch with technology with TechStuff, from howstuffworks.com. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer here at HowStuffWorks. And yeah, I still kind of have a cold. You can kind of hear it. It's not as bad as it was last week when I was recording those earlier episodes, though, so that's something. Since I have a cold, you know, my brain got onto this little topic. It's sort of a free-association thing: technology, colds, viruses. How about I talk about a famous virus? So we're going to really dive into the story behind Stuxnet, a famous piece of malware that made headlines in 2010. I've talked about it before on this show. In fact, Chris Pollette and I did an episode about Stuxnet several years ago, but at the time not as much information was available about what was going on. TechStuff technically launched two years before Stuxnet made headlines.
So this is actually an opportunity for me to look back at something that developed over the course of the history of this show and learn more about where it came from, what its purpose was, and how that whole story unfolded. Before I really dive into the story, I want to mention one of the sources I used when I was researching these two episodes. This would be a book titled Countdown to Zero Day: Stuxnet and the Launch of the World's First Digital Weapon. The book goes into great detail regarding the story of Stuxnet. It also gives wonderful background information on the key figures: cryptography researchers, cybersecurity researchers, all these people who were very much instrumental in discovering and uncovering Stuxnet, figuring out what it did, and who was probably behind it, since that was never something that was officially acknowledged. But come on, we know who it actually was. I'll talk about that in these episodes. That's a great book.
If you want more information about Stuxnet after this episode, go check that out: Countdown to Zero Day: Stuxnet and the Launch of the World's First Digital Weapon. It goes into way more detail than I'm going to cover in these episodes. Now, these episodes are also going to contain a lot of history and politics, because Stuxnet, unlike many other examples of malware, was not intended to be the type of computer virus that creates monetary gain for the people who designed it, or even just makes people irritated. It wasn't that kind of malware. You may have encountered malware that was meant to try and extort money from someone, where it locks down a computer and the only way to get access, at least according to the message you receive, is to pay a ransom to the hackers. We call that ransomware. Stuxnet was not that type of malware. Nor was it just some sort of capricious code that someone created in order to turn computer hard drives into giant concrete blocks. It was neither of those things.
It had a very specific intent, and all signs pointed to it being very much a state-sponsored piece of software, meaning that some government agency or agencies was behind its development. That sets it apart from a lot of other versions of malware. And in order to understand it, I think it's good to begin with a quick history lesson on Iran's nuclear program, because that was the ultimate target of Stuxnet. Back in the nineteen fifties, Iran, under the leadership of Shah Mohammad Reza Pahlavi, had received the nod from the world community to pursue a nuclear power program. At that same time, nuclear powers like the United States were trying to discourage other nations from developing nuclear weapons. So they were essentially saying: hey, nuclear power, totally cool; nuclear weapons, let's not make this worse. Because nuclear proliferation was becoming a big fear among various powers in the world, and of course populations in the world.
So it was sort of an Atoms for Peace kind of initiative, saying: let's go and develop nuclear power for countries, that's great. That way you can generate electricity, but let's stay away from building the bombs. Iran's program was launched with the understanding that the country was only going to build these power plants, not weaponry, although all indications showed that the long-term plan for Iran was in fact to develop nuclear weapons at some point. As part of this early agreement, the United States would sell to Iran the enriched uranium its power plants would need as fuel, so Iran wouldn't need to create its own uranium enrichment facilities. It would just purchase enriched uranium, ready to go, from the United States. And in fact, the US, Germany, and France were all totally supportive of Iran's efforts, perhaps because those countries also stood to benefit from it. They were all going to make a boatload of cash by selling equipment and fuel to Iran, so there was a financial incentive to support Iran's efforts to create nuclear power plants.
All of this was despite the fact that the tools used to create nuclear power plants could conceivably be put to use to build nuclear weapons. So you could have someone say, hey, I just need this technology because I want to make a power plant, but in reality they might be using that technology to make boom-booms. So the thing the United States said that kind of justified its choice to support this program was: hey, the Shah, he's awesome. We get along so well, we're like besties. And so there's no way that Iran, even if it did develop nuclear weapons, would be a threat to us, its ally. So let's go ahead and go all in. Let's go ahead and make some money. Come on, capitalism, woo. It's not like Iran will ever have a problem with the United States. And then nineteen seventy-nine happened. In nineteen seventy-nine, the Ayatollah Ruhollah Khomeini overthrew the Shah. Now, Khomeini did not share the Shah's opinion of the United States, and suddenly the US was tugging at its collar and saying yikes.
So Germany and the US withdrew their support for Iran's nuclear program, and Ayatollah Khomeini was not terribly interested in pursuing a nuclear power program either, so the power plants were pretty much abandoned for a few years. They were also the frequent target of bombing raids during various conflicts that Iran got into over the course of the eighties. Now, the Ayatollah would eventually renew the nuclear program in the nineteen eighties, after rumors spread that Iraq was developing nuclear weapons. And since Saddam Hussein, the leader of Iraq at the time, had already used chemical weapons against Iran during the Iran–Iraq War, the Ayatollah hired an engineer from Pakistan to help Iran, using plans for centrifuges that the engineer had stolen from European companies. This engineer had worked on behalf of those European companies and then essentially did a little industrial theft, stealing the plans for centrifuges so that a similar program could be created in nations like Pakistan.
This was all happening in secret, obviously, but Iran had gotten word of it, and so they contacted this Pakistani engineer, who agreed to help out Iran. In nineteen ninety-five, Iran entered into a contract with Russia to complete a nuclear power reactor at Bushehr. This site in Iran had been one of the originally planned power plants way back in nineteen fifty-seven, but the various conflicts between fifty-seven and ninety-five had delayed and even destroyed the work that had been done at the location. Iran and Russia were also going to build a uranium enrichment plant co-located with this nuclear power plant, but the United States stepped in and said to Russia, hey, we think that's, like, a bad idea, man. And Russia eventually said "da" and backed off. And that was supposedly that, except it totally wasn't just that. In two thousand, Iran started building a new facility at Natanz, another site in Iran. Iranian officials claimed that this facility was a desert eradication location, but satellite imagery eventually showed that something else was up at that site.
The design of the facility suggested it was going to house something super secret that was to be protected from missile strikes and air strikes. And the reason they were drawing this conclusion was that Iran was clearly excavating a lot of land, building a large underground facility, something that needed to be insulated from potential air attack. And the entrance hallway into the facility had a big U-turn in it. It wasn't a straight shot down into the heart of the facility. That U-turn was an indication that perhaps this was a way to avoid a smart missile flying down the entryway and hitting a target. If it had to turn ninety degrees or a hundred and eighty degrees, then chances are no missile would actually be able to do that, and it was thus a tactic to avoid damage in the case of an air strike. But why would you need that for some innocent desert eradication facility? Why would it need to be underground and have these kinds of measures in place? In two thousand two, some whistleblowers alerted the UN that this facility would actually be a uranium enrichment plant.
Now, Iranian officials eventually said: okay, yeah, but we were gonna tell you about it. We just hadn't done that yet because there wasn't really any need to. We're still months away from going online, so it's not like there's any chance that this thing is already producing enriched uranium. We just want to have a facility to create nuclear fuel that we're going to use for our power plants. We want to be self-sufficient, is all. We don't want to have to buy our fuel from other nations. The UN stepped up inspections of the facility, or at least attempted to, although it initially encountered a lot of resistance from Iran. The UN would say, all right, well, we're ready to come in and investigate this facility. And then the people in Iran would say, sorry, it's not ready yet, come back next month. And then they would come back next month and say, all right, we're ready to look at the facility, and Iran would say, you know what, we lost the keys, don't know where the keys are.
Could you come back maybe another day? And it became increasingly clear, at least to the investigators, that something was up, and that there was a lot of activity going on, perhaps to cover tracks, perhaps to get rid of evidence. Although it's impossible to say, because unless you actually are there to witness what is happening, you don't really know. But it seemed to imply that there was something hinky going on. Eventually they were able to set up a regular inspection schedule for this facility, and they were there to make sure that the uranium being produced was meant for nuclear power and not nuclear weapons. And meanwhile, countries like the United States were getting awfully antsy about Iran. On July sixth, two thousand nine, WikiLeaks posted a note written by founder Julian Assange that referenced some sort of serious nuclear accident that had happened at the uranium enrichment facility.
Now, this would have been shortly after the Stuxnet virus was initially released, but at this time no one outside of the people involved in Stuxnet could possibly have known about the virus; it had not become public knowledge yet. In January two thousand ten, a United Nations agency called the International Atomic Energy Agency, or IAEA, began to notice that something unusual was happening to the centrifuges at Iran's Natanz uranium enrichment plant. They saw that there was a failure rate that was unusually high. The agents would inspect the facilities at least once a month, and occasionally with some surprise inspections, and the whole point was just to make sure that nothing illegal was happening, that Iran was in fact not trying to stockpile enriched uranium in an effort to build bombs. This was an important thing that the UN was doing, but it also was not the most efficient way of doing it if you wanted to recognize trends, because they would swap out who went to investigate the facility each time. That kind of makes sense.
213 00:13:57,000 --> 00:14:01,080 Speaker 1: You don't want one group to get compromised in any 214 00:14:01,120 --> 00:14:05,920 Speaker 1: way or fooled in some way. Sending new people sends 215 00:14:06,000 --> 00:14:08,520 Speaker 1: new sets of eyes. But it also meant that until 216 00:14:08,559 --> 00:14:11,839 Speaker 1: you were looking at aggregated data, you could not necessarily 217 00:14:11,920 --> 00:14:16,400 Speaker 1: see that something unusual is happening, And something unusual was 218 00:14:16,480 --> 00:14:20,480 Speaker 1: happening specifically to the centrifuges. Now to understand that. It 219 00:14:20,720 --> 00:14:23,440 Speaker 1: also helps to understand what the heck the centrifugures were 220 00:14:23,440 --> 00:14:25,480 Speaker 1: being used for in the first place, Like what is 221 00:14:25,520 --> 00:14:30,240 Speaker 1: their purpose in the process of refining uranium. Well, first, 222 00:14:30,720 --> 00:14:33,680 Speaker 1: nuclear fuel needs to be made up of between three 223 00:14:33,760 --> 00:14:37,760 Speaker 1: and a half to five percent uranium two thirty five isotope. 224 00:14:38,200 --> 00:14:41,840 Speaker 1: So isotopes are two or more forms of the same element, 225 00:14:42,160 --> 00:14:45,080 Speaker 1: in which the atoms of the different isotopes have a 226 00:14:45,080 --> 00:14:49,520 Speaker 1: different number of neutrons. Chemically, the two atoms behave the 227 00:14:49,560 --> 00:14:53,080 Speaker 1: same way, but they'll have different atomic masses because of 228 00:14:53,080 --> 00:14:56,080 Speaker 1: the difference in neutrons uh, and they'll have different decay 229 00:14:56,160 --> 00:14:59,040 Speaker 1: rates and things of that nature as well. So there 230 00:14:59,040 --> 00:15:03,760 Speaker 1: are three may jr isotopes of uranium that occur naturally 231 00:15:04,320 --> 00:15:07,960 Speaker 1: uh within the Earth's crust. 
So if you mine uranium, you're gonna come up with a mixture of different isotopes at different concentrations. Most of it, in fact more than ninety-nine percent of the stuff that occurs naturally, is uranium-238; less than one percent of it is uranium-235; and then you get a teeny tiny bit that's uranium-234. If you want to make nuclear fuel, you need a much higher percentage of uranium-235 than what you find in nature. In nature it's less than a percent, and in fuel you need it to be at least three and a half to five percent. By the time you're getting into the enrichment process, you need the uranium to be in gas form. So you would get uranium ore and refine that down to uranium oxide, and then you would take that to a conversion plant that would turn the uranium oxide into a gas called uranium hexafluoride. This gas has various isotopes of uranium in it. You have both uranium-238 and uranium-235, and you have them in the concentrations you would expect, because it's from the stuff you mined from the ground.
You then feed that gas into tubes inside a centrifuge. Centrifuges spin, and they can spin at really high velocities. We're talking tens of thousands of revolutions per minute. Now, when they spin, they separate out the materials of different weight within those tubes. The heavy stuff moves toward the edges of the tubes, the sides of the tubes, and the lighter stuff will move toward the center. So if you spin the centrifuges at the right speed and you then effectively scoop out the middle of the tube, you can separate the uranium-235 from the uranium-238. Now, you actually have to do this in a lot of different stages. You put the gas through one centrifuge, you do the scooping process, you put it through another centrifuge. You have to do this multiple times in order to really get the right concentrations. Eventually you can do this enough to manufacture the uranium pellets that you would use for nuclear fuel. If you wanted to make a nuclear weapon, you'd follow the same process, but you'd need way more uranium-235.
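That stage-by-stage idea can be sketched as a toy calculation. This is a hypothetical illustration, not real plant engineering: the per-stage separation factor of 1.3 is an assumed number, and a real cascade pipes streams between many machines in parallel rather than re-running one batch.

```python
# Toy model of cascade enrichment: each pass through a centrifuge stage
# multiplies the U-235 : U-238 abundance ratio by a small "separation
# factor". The 1.3 per stage is an assumption for illustration only.
def stages_to_reach(target_fraction, feed_fraction=0.00711, separation_factor=1.3):
    """Count how many cascade stages lift the U-235 fraction from feed to target."""
    fraction = feed_fraction          # natural uranium is ~0.711% U-235
    stages = 0
    while fraction < target_fraction:
        ratio = fraction / (1.0 - fraction)   # current U-235 : U-238 ratio
        ratio *= separation_factor            # one pass through a stage
        fraction = ratio / (1.0 + ratio)      # back to a fraction
        stages += 1
    return stages

print(stages_to_reach(0.035))  # reactor fuel, ~3.5% U-235
print(stages_to_reach(0.90))   # weapons-grade, ~90% U-235
```

The point the numbers make is the one from the episode: fuel-grade and weapons-grade material come from exactly the same process, just repeated through more stages.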
Nuclear weapons typically have a proportion of ninety percent or more uranium-235 in them, sometimes even greater. So you need a lot more uranium, and then you have to refine a lot of it and enrich a lot of it in order to get to that level of uranium-235. But it is exactly the same process; it's just a matter of more stuff. Centrifuges, as it turns out, are delicate. The ones that Iran was using were supposed to have a ten-year lifespan, but these are moving pieces of machinery. They have mechanical parts, and they work at high speeds, so eventually they'll fail. They may fail because of mechanical error, human error; all sorts of different stuff could cause them to break down. And because of that, typically in a year you might have to replace about ten percent of the centrifuges you have, even if they're brand new. But the thing that the IAEA
discovered eventually, after they looked at aggregated data, was that the number of centrifuges being replaced at this uranium enrichment facility was much higher than that. They had about eight thousand seven hundred centrifuges at this point, so you would expect about eight hundred and seventy of them to need to be replaced every year. But apparently the number was actually much higher than that, perhaps as high as two thousand or more, although the actual figures were never published. But the IAEA was keeping track of this stuff. They just didn't notice the trend until they were looking at, again, a sequence of these visits, and then realized: hey, that seems like a pretty high number, to replace that many centrifuges. Wonder what's happening with this? Well, while this was going on, there were other things happening that were indicating that something unusual had been unleashed in the world of computers. The folks at the IAEA at this point did not suspect any kind of computer virus. They weren't sure what was causing the centrifuges to fail.
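The back-of-the-envelope arithmetic behind that suspicion is simple. A quick sketch, using the roughly 8,700 installed machines and the ten percent normal attrition discussed above; note the 2,000 figure is only the reported estimate, since the official counts were never published:

```python
# Rough arithmetic behind the IAEA's suspicion. Installed count and the
# observed-replacement figure are the estimates from the discussion above,
# not official published numbers.
installed = 8_700          # approximate centrifuges at Natanz
normal_rate = 0.10         # ~10% expected to fail per year even when new
expected = installed * normal_rate
observed = 2_000           # reported replacements, possibly more

print(f"expected replacements per year: {expected:.0f}")
print(f"observed: {observed} ({observed / installed:.0%} of the fleet)")
print(f"excess over normal attrition: {observed / expected:.1f}x")
```

Seen that way, the facility was burning through centrifuges at more than double the rate normal wear and tear would explain, which is exactly the kind of trend that only shows up in aggregated data.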
It could have just been that they were really bad centrifuges, that Iran had purchased them from a bad source, although Iran was stating that they had actually made the whole thing themselves, that the centrifuges were based on their own design. Although, again, the United Nations officials, the investigators, were not buying it. They said: wow, these things look an awful lot like the ones that were being used in Europe a few years ago. In fact, if I didn't know any better, I would say that these were direct copies, and that they were based on stolen information. But Iran's messaging was that, no, these were of our design and we built them. At any rate, the IAEA wasn't sure why these centrifuges were starting to fail. At that same time, or actually a little bit later, in June two thousand ten, there was a cybersecurity professional named Sergey Ulasen in Belarus who was investigating some really weird computer behavior that had been reported on an Iranian computer. The computer in question was caught in an endless crash-and-reboot cycle, and they weren't really sure what was causing it.
The culprit 324 00:20:33,240 --> 00:20:36,400 Speaker 1: looked like it might have been the antivirus software 325 00:20:36,640 --> 00:20:40,040 Speaker 1: that was on the computer, that something was not compatible, 326 00:20:40,680 --> 00:20:44,600 Speaker 1: and the antivirus software came from the company that, uh, 327 00:20:44,720 --> 00:20:47,840 Speaker 1: Sergey Ulasen was working for. He was working for this company 328 00:20:47,840 --> 00:20:52,520 Speaker 1: called VirusBlokAda. The Iranian computer had that antivirus 329 00:20:52,600 --> 00:20:55,760 Speaker 1: program on it. It was purchased originally from a reseller, 330 00:20:55,880 --> 00:20:58,840 Speaker 1: so it wasn't purchased directly from the Belarus company, but 331 00:20:59,000 --> 00:21:02,920 Speaker 1: rather an Iranian company that had the right 332 00:21:03,119 --> 00:21:07,840 Speaker 1: to resell this antivirus software, and originally the 333 00:21:07,840 --> 00:21:10,800 Speaker 1: person who owned the computer, or the agency that 334 00:21:10,920 --> 00:21:15,840 Speaker 1: owned the computer, contacted the reseller and said, I'm getting 335 00:21:15,880 --> 00:21:18,480 Speaker 1: this error. The computer just keeps crashing and trying 336 00:21:18,480 --> 00:21:22,640 Speaker 1: to reboot. What's going on? The reseller eventually fielded that 337 00:21:23,119 --> 00:21:28,199 Speaker 1: question up to Ulasen. Ulasen got permission to log into 338 00:21:28,240 --> 00:21:31,560 Speaker 1: this problematic computer using a remote login, and he 339 00:21:31,640 --> 00:21:33,520 Speaker 1: began to look around to see what the heck was 340 00:21:33,560 --> 00:21:37,840 Speaker 1: going on, and he eventually suspected the machine had been 341 00:21:37,880 --> 00:21:41,679 Speaker 1: infected by some malware and that this malware included a 342 00:21:41,800 --> 00:21:46,440 Speaker 1: rootkit. Quick refresher:
A rootkit is software that 343 00:21:46,520 --> 00:21:51,160 Speaker 1: gives an unauthorized party access to control of a computer system. 344 00:21:51,520 --> 00:21:55,040 Speaker 1: Hackers use this to get backdoor access and 345 00:21:55,080 --> 00:21:57,960 Speaker 1: get information on computers, or they do it to create 346 00:21:58,080 --> 00:22:02,000 Speaker 1: botnets. Moreover, a rootkit masks this activity. It 347 00:22:02,240 --> 00:22:05,639 Speaker 1: acts as like a shield to hide it from the 348 00:22:05,640 --> 00:22:08,800 Speaker 1: host computer in an effort to escape detection. So a 349 00:22:08,840 --> 00:22:12,000 Speaker 1: good rootkit is doing all this, allowing someone to 350 00:22:12,160 --> 00:22:16,920 Speaker 1: remotely access your computer, but you can't tell because it's 351 00:22:17,040 --> 00:22:20,640 Speaker 1: hiding all that activity from you. Well, like all malware, 352 00:22:20,720 --> 00:22:23,719 Speaker 1: rootkits are only useful if the targeted machine doesn't 353 00:22:23,720 --> 00:22:26,959 Speaker 1: have suitable antivirus protection on it. It could be 354 00:22:27,280 --> 00:22:30,040 Speaker 1: out of date, or it might not have antivirus software 355 00:22:30,040 --> 00:22:33,240 Speaker 1: on it at all, or it might be so new 356 00:22:33,960 --> 00:22:38,639 Speaker 1: that antivirus software doesn't yet have a profile on that 357 00:22:38,720 --> 00:22:41,240 Speaker 1: type of rootkit, which means that it will escape 358 00:22:41,280 --> 00:22:44,960 Speaker 1: the antivirus software's detection because it doesn't know to 359 00:22:45,040 --> 00:22:48,760 Speaker 1: look for it. Once antivirus software companies learn of 360 00:22:48,800 --> 00:22:51,520 Speaker 1: a piece of malware, they can then adjust their software 361 00:22:51,640 --> 00:22:54,480 Speaker 1: to identify and block those programs.
But if there is 362 00:22:54,520 --> 00:22:58,160 Speaker 1: a gap there, the malware can go for a while 363 00:22:58,240 --> 00:23:01,399 Speaker 1: without detection, and it means that all machines can be 364 00:23:01,480 --> 00:23:04,919 Speaker 1: vulnerable to those attacks until someone catches on. And that 365 00:23:05,040 --> 00:23:08,240 Speaker 1: seems to be what was going on in this case. 366 00:23:08,680 --> 00:23:11,080 Speaker 1: Now I have a lot more to say about the 367 00:23:11,160 --> 00:23:13,760 Speaker 1: early detection of Stuxnet, but before I get into that, 368 00:23:13,880 --> 00:23:23,959 Speaker 1: let's take a quick break to thank our sponsor. Alright, 369 00:23:24,000 --> 00:23:28,960 Speaker 1: so Ulasen realized that whoever was responsible for creating 370 00:23:29,000 --> 00:23:33,679 Speaker 1: this malware that was causing this computer to crash repeatedly 371 00:23:34,280 --> 00:23:37,600 Speaker 1: had done so by finding what is called a zero 372 00:23:37,920 --> 00:23:42,400 Speaker 1: day exploit. A zero day exploit is a vulnerability within 373 00:23:42,480 --> 00:23:46,040 Speaker 1: a piece of software code that has not yet been 374 00:23:46,080 --> 00:23:51,080 Speaker 1: identified by anybody else, including the people who made the 375 00:23:51,160 --> 00:23:54,400 Speaker 1: software code in the first place. The software coders are 376 00:23:54,440 --> 00:23:57,560 Speaker 1: likely completely unaware of it. In fact, that's really 377 00:23:57,600 --> 00:24:00,879 Speaker 1: what makes it a zero day: the fact that, you know, 378 00:24:00,920 --> 00:24:02,879 Speaker 1: you come out with like a new version of 379 00:24:02,880 --> 00:24:05,919 Speaker 1: an operating system, for example, and you are not aware 380 00:24:06,000 --> 00:24:09,680 Speaker 1: that that part of that operating system has this glaring 381 00:24:09,760 --> 00:24:14,280 Speaker 1: flaw in it that, uh, could be exploited.
That's a 382 00:24:14,400 --> 00:24:18,240 Speaker 1: zero day exploit, and that ignorance is an incredibly powerful 383 00:24:18,240 --> 00:24:21,879 Speaker 1: weapon for hackers. They will end up writing code that 384 00:24:22,040 --> 00:24:24,600 Speaker 1: can exploit this vulnerability, and they know that there's no 385 00:24:24,640 --> 00:24:28,679 Speaker 1: protection against it because the responsible parties for the software 386 00:24:28,960 --> 00:24:33,240 Speaker 1: have not even realized that there's a potential for exploitation. 387 00:24:33,600 --> 00:24:36,000 Speaker 1: Ulasen figured out that the malware had to have 388 00:24:36,080 --> 00:24:40,919 Speaker 1: been distributed by a USB thumb drive initially. Later on, 389 00:24:41,200 --> 00:24:43,480 Speaker 1: researchers would figure out that the code would allow up 390 00:24:43,480 --> 00:24:47,320 Speaker 1: to three machines to be infected by the same USB 391 00:24:47,560 --> 00:24:51,040 Speaker 1: flash drive before the malware would prompt a computer to 392 00:24:51,119 --> 00:24:54,800 Speaker 1: delete the contents of the flash drive, so it's kind 393 00:24:54,800 --> 00:24:58,040 Speaker 1: of like a self-destruct button. After three infections, the 394 00:24:58,160 --> 00:25:01,520 Speaker 1: drive would be wiped. Further propagation could happen across 395 00:25:01,560 --> 00:25:05,760 Speaker 1: a compromised network through computer-to-computer connections, and later on 396 00:25:06,119 --> 00:25:10,080 Speaker 1: they discovered even other different ways that the virus could 397 00:25:10,080 --> 00:25:12,920 Speaker 1: move from computer to computer. It did not, however, move 398 00:25:13,000 --> 00:25:17,560 Speaker 1: across the Internet. This was a piece of malware that 399 00:25:17,640 --> 00:25:21,920 Speaker 1: was designed to infect computers that were on local networks 400 00:25:21,920 --> 00:25:25,240 Speaker 1: but perhaps not connected to the Internet at large.
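The three-infections-then-wipe behavior researchers described can be sketched in miniature. This is a toy Python sketch for illustration only, not Stuxnet's actual code; the counter and field names are assumptions, and the real logic lived in Windows kernel components.

```python
# Toy model of the USB spreader's self-limiting counter: a drive
# infects at most three machines, then wipes its own payload.
INFECTION_LIMIT = 3

def infect_from_drive(drive, machine):
    """Try to infect `machine` from `drive`; return True on success."""
    if not drive["payload_present"] or drive["infections"] >= INFECTION_LIMIT:
        return False  # drive already wiped or exhausted
    machine.add("payload")
    drive["infections"] += 1
    if drive["infections"] >= INFECTION_LIMIT:
        drive["payload_present"] = False  # "self-destruct": wipe the stick
    return True
```

The point of the limit, as the episode suggests, was to keep the worm from spreading so widely that it drew attention.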
So 401 00:25:25,680 --> 00:25:28,560 Speaker 1: that was why they were using USB drives in the 402 00:25:28,600 --> 00:25:31,080 Speaker 1: first place. Now, that did come with a disadvantage. It 403 00:25:31,119 --> 00:25:34,280 Speaker 1: means that you have to get physical access to a 404 00:25:34,359 --> 00:25:39,000 Speaker 1: machine to get the malware from the USB drive onto 405 00:25:39,000 --> 00:25:42,240 Speaker 1: the computer, and that drastically reduces the number of computers 406 00:25:42,240 --> 00:25:45,680 Speaker 1: you could potentially infect. So why would you do this? Well, 407 00:25:45,720 --> 00:25:48,280 Speaker 1: one reason to go with the USB delivery mechanism is 408 00:25:48,320 --> 00:25:52,360 Speaker 1: to target computers that have an air gap. And that 409 00:25:52,440 --> 00:25:54,480 Speaker 1: air gap is what I was talking about a second ago. 410 00:25:54,600 --> 00:25:57,720 Speaker 1: That's a computer or even a network of computers that 411 00:25:57,760 --> 00:26:01,879 Speaker 1: has no direct connection to the wider Internet at large. 412 00:26:02,200 --> 00:26:05,320 Speaker 1: There's an air gap between the Internet and the computer 413 00:26:05,480 --> 00:26:08,560 Speaker 1: or system of computers. It's kind of like a self-contained 414 00:26:08,560 --> 00:26:11,840 Speaker 1: island. It's cut off from the rest of the world, 415 00:26:12,160 --> 00:26:15,199 Speaker 1: and it keeps the system safe from most forms of 416 00:26:15,240 --> 00:26:19,119 Speaker 1: hacker intrusions. If there are no pathways that lead to 417 00:26:19,160 --> 00:26:21,800 Speaker 1: the system, there's not a whole lot a hacker can do. 418 00:26:22,440 --> 00:26:25,600 Speaker 1: A true air gap system would have no connectivity to 419 00:26:25,640 --> 00:26:28,760 Speaker 1: the Internet at all.
Now, some systems have what we 420 00:26:28,880 --> 00:26:31,520 Speaker 1: call an air gap, but they really have limited and 421 00:26:31,600 --> 00:26:35,480 Speaker 1: controlled access to the Internet, typically through a computer or router 422 00:26:35,560 --> 00:26:38,360 Speaker 1: that acts as a gatekeeper or portal. But if you 423 00:26:38,400 --> 00:26:41,199 Speaker 1: put your malware on a USB stick and then you 424 00:26:41,240 --> 00:26:44,320 Speaker 1: convince someone with physical access to the machine to 425 00:26:44,440 --> 00:26:49,080 Speaker 1: insert the USB drive, an air gap isn't really a problem. 426 00:26:49,119 --> 00:26:52,160 Speaker 1: It might, however, mean that you, as a hacker, will 427 00:26:52,240 --> 00:26:55,800 Speaker 1: remain unaware of your success. If the target machine has 428 00:26:55,800 --> 00:26:58,200 Speaker 1: no way to phone home, if there's no way for 429 00:26:58,240 --> 00:27:02,520 Speaker 1: the target machine to indicate hey, success, then you may 430 00:27:02,560 --> 00:27:06,640 Speaker 1: just be hoping that whatever you planned on doing was working. So, 431 00:27:06,680 --> 00:27:09,639 Speaker 1: like I said, all the vectors of attack for Stuxnet 432 00:27:09,680 --> 00:27:14,200 Speaker 1: were based off of either USB or local network connections, 433 00:27:14,560 --> 00:27:19,000 Speaker 1: but not over the Internet. And also the USB stick 434 00:27:19,080 --> 00:27:22,600 Speaker 1: attack did not use AutoRun, at least not after 435 00:27:22,680 --> 00:27:26,000 Speaker 1: the initial wave of attacks. There were three separate 436 00:27:26,040 --> 00:27:29,000 Speaker 1: waves of attacks, and the second and third one did 437 00:27:29,000 --> 00:27:32,360 Speaker 1: not use the AutoRun feature.
A lot of malware 438 00:27:32,520 --> 00:27:35,240 Speaker 1: does depend on AutoRun, and that's a feature that 439 00:27:35,280 --> 00:27:38,720 Speaker 1: will automatically launch a program from something like a USB 440 00:27:38,920 --> 00:27:41,919 Speaker 1: drive or an optical drive once you insert the media. 441 00:27:41,960 --> 00:27:44,440 Speaker 1: You're probably familiar with this. Let's say that you've got 442 00:27:44,840 --> 00:27:47,879 Speaker 1: a DVD, an actual movie on a DVD, and you 443 00:27:47,920 --> 00:27:50,680 Speaker 1: put it into your computer, and the computer automatically launches 444 00:27:50,920 --> 00:27:54,320 Speaker 1: the DVD player software so that you can watch it 445 00:27:54,640 --> 00:27:58,240 Speaker 1: as soon as the DVD has gone into the optical drive. Well, 446 00:27:58,280 --> 00:28:01,120 Speaker 1: that speeds things up for the user, makes it more convenient, 447 00:28:01,160 --> 00:28:03,560 Speaker 1: you don't have to hunt for the right program. But 448 00:28:03,680 --> 00:28:07,800 Speaker 1: it does present a security risk, because if the software 449 00:28:07,840 --> 00:28:11,880 Speaker 1: on the media is malicious, the computer has just automatically 450 00:28:11,960 --> 00:28:16,320 Speaker 1: launched bad software. But here's the thing. You can turn 451 00:28:16,520 --> 00:28:20,120 Speaker 1: off the AutoRun feature, and a lot of systems 452 00:28:20,240 --> 00:28:23,479 Speaker 1: will do that because it is a way to limit 453 00:28:23,600 --> 00:28:27,479 Speaker 1: the risk and the vulnerability of those systems. You just 454 00:28:27,560 --> 00:28:30,879 Speaker 1: turn off AutoRun and then your planned form of 455 00:28:30,920 --> 00:28:34,000 Speaker 1: attack is not going to work.
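For background, the AutoRun mechanism described here is driven by a small plain-text autorun.inf file at the root of the inserted media. A minimal example looks like the following; the executable name is made up for illustration:

```ini
[autorun]
; program Windows would launch automatically when the media is inserted
open=setup.exe
; icon shown for the drive in Explorer
icon=setup.exe,0
```

Disabling AutoRun simply makes Windows ignore this file, which is why the later attack waves had to find another way to launch their code.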
Someone puts that media 456 00:28:34,080 --> 00:28:36,560 Speaker 1: into a computer where the AutoRun has been switched off, 457 00:28:36,800 --> 00:28:40,280 Speaker 1: they'll get prompted, or they'll have the chance to 458 00:28:40,480 --> 00:28:43,680 Speaker 1: run that stuff themselves. But chances are, if 459 00:28:43,720 --> 00:28:46,600 Speaker 1: you don't recognize a program, you're not just gonna launch it. 460 00:28:47,000 --> 00:28:49,360 Speaker 1: You might do some snooping first and find out if 461 00:28:49,400 --> 00:28:51,760 Speaker 1: this is in fact something you want to run. So 462 00:28:51,840 --> 00:28:55,480 Speaker 1: to remove that possibility, you might want to not use 463 00:28:55,520 --> 00:29:00,120 Speaker 1: the AutoRun feature to launch your malware. So that's what 464 00:29:00,360 --> 00:29:03,680 Speaker 1: the hackers responsible for Stuxnet did. They decided that 465 00:29:03,720 --> 00:29:07,120 Speaker 1: they would use a different approach. They targeted what are 466 00:29:07,200 --> 00:29:10,880 Speaker 1: called LNK files. So an LNK 467 00:29:11,120 --> 00:29:15,160 Speaker 1: file carries the information to display icons next to file 468 00:29:15,240 --> 00:29:19,000 Speaker 1: types in applications like Windows Explorer. So if you've opened 469 00:29:19,040 --> 00:29:21,480 Speaker 1: up a file directory type of program and you've seen 470 00:29:21,480 --> 00:29:24,160 Speaker 1: those little icons next to file names, that's due to 471 00:29:24,240 --> 00:29:27,360 Speaker 1: an LNK file. This was a pretty sophisticated 472 00:29:27,360 --> 00:29:30,240 Speaker 1: form of attack, and as far as Ulasen could figure out, 473 00:29:30,280 --> 00:29:32,520 Speaker 1: it was the first of its type.
Now, it turned out 474 00:29:33,120 --> 00:29:35,200 Speaker 1: that it was not the very first of its type, 475 00:29:35,480 --> 00:29:39,360 Speaker 1: but the previous implementations of this attack had not really 476 00:29:39,360 --> 00:29:44,800 Speaker 1: received widespread coverage, so it was still really new. Adding 477 00:29:44,840 --> 00:29:47,880 Speaker 1: to this sophistication was the fact that there were four 478 00:29:48,000 --> 00:29:51,040 Speaker 1: different versions of the LNK files on those 479 00:29:51,160 --> 00:29:54,640 Speaker 1: USB sticks, and that meant that they could affect up 480 00:29:54,680 --> 00:29:58,600 Speaker 1: to seven different versions of Windows. That increased the number 481 00:29:58,600 --> 00:30:00,640 Speaker 1: of potential targets for the malware. So if a 482 00:30:00,720 --> 00:30:03,160 Speaker 1: computer was running one version of Windows, or maybe the 483 00:30:03,160 --> 00:30:05,920 Speaker 1: next one or the next one, it still might be vulnerable. 484 00:30:06,440 --> 00:30:10,880 Speaker 1: The only real thing that limited it was it needed 485 00:30:10,920 --> 00:30:14,640 Speaker 1: to be a thirty two bit installation of Windows. If 486 00:30:14,680 --> 00:30:17,640 Speaker 1: it were a sixty four bit installation, the virus was 487 00:30:17,680 --> 00:30:20,719 Speaker 1: not going to work on it. Later on, researchers at 488 00:30:20,760 --> 00:30:25,680 Speaker 1: the security firm Kaspersky, uh, discovered other zero day exploits 489 00:30:25,680 --> 00:30:27,680 Speaker 1: that the virus took advantage of. So there wasn't just 490 00:30:27,920 --> 00:30:34,280 Speaker 1: one zero day vulnerability that Stuxnet could glom onto. 491 00:30:34,600 --> 00:30:37,200 Speaker 1: There were three more that Kaspersky found at that point.
492 00:30:37,840 --> 00:30:42,360 Speaker 1: One exploited a print spooler vulnerability, and it would propagate 493 00:30:42,480 --> 00:30:46,080 Speaker 1: the virus across networks that had a shared network printer, 494 00:30:46,160 --> 00:30:48,720 Speaker 1: and a lot of networks do. The 495 00:30:48,760 --> 00:30:53,760 Speaker 1: other two vectors used something called privilege escalation, which is 496 00:30:53,800 --> 00:30:56,200 Speaker 1: where a program is able to leverage exploits to gain, 497 00:30:56,600 --> 00:31:00,800 Speaker 1: eventually, system level control over computers, even if 498 00:31:00,840 --> 00:31:04,000 Speaker 1: those computers have been locked down. The combination of all 499 00:31:04,040 --> 00:31:07,000 Speaker 1: the exploits suggested that the people responsible for the virus 500 00:31:07,400 --> 00:31:12,040 Speaker 1: were serious heavy hitters who really desperately wanted to target 501 00:31:12,120 --> 00:31:15,880 Speaker 1: specific computers. And it raised some really big questions. Why 502 00:31:15,920 --> 00:31:20,520 Speaker 1: would you use four zero day exploits? Because common logic 503 00:31:20,600 --> 00:31:24,000 Speaker 1: said you should just stick to one at a time. 504 00:31:24,320 --> 00:31:28,000 Speaker 1: Once a zero day exploit is discovered, the clock is 505 00:31:28,040 --> 00:31:33,120 Speaker 1: ticking before someone patches that respective software to plug up 506 00:31:33,120 --> 00:31:36,680 Speaker 1: that vulnerability so that the exploit won't work anymore. So 507 00:31:36,720 --> 00:31:41,640 Speaker 1: the zero day exploit is only really valuable until people 508 00:31:41,680 --> 00:31:43,960 Speaker 1: discover it.
If you have more than one zero day 509 00:31:43,960 --> 00:31:47,840 Speaker 1: exploit involved in your malware, then you run the risk 510 00:31:48,120 --> 00:31:52,280 Speaker 1: of someone discovering all of those exploits if the malware 511 00:31:52,320 --> 00:31:56,440 Speaker 1: itself becomes evident, and if they find all of those exploits, 512 00:31:56,440 --> 00:31:59,880 Speaker 1: then all of those can be patched, which means you 513 00:32:00,080 --> 00:32:03,080 Speaker 1: lose all of those vectors of attack in a single 514 00:32:03,200 --> 00:32:08,520 Speaker 1: fell swoop. So this was kind of considered a big gamble. 515 00:32:08,680 --> 00:32:11,520 Speaker 1: Why would you throw all of your eggs into this 516 00:32:11,680 --> 00:32:15,200 Speaker 1: basket, having all of four zero day exploits? By the way, 517 00:32:15,200 --> 00:32:17,720 Speaker 1: there was a fifth one actually that they had not 518 00:32:17,840 --> 00:32:21,320 Speaker 1: yet discovered, though that one ended up getting patched after 519 00:32:21,360 --> 00:32:25,040 Speaker 1: the first wave of attacks, uh, not because of Stuxnet. 520 00:32:25,240 --> 00:32:30,480 Speaker 1: The fifth vulnerability had been independently discovered through other means 521 00:32:31,040 --> 00:32:34,880 Speaker 1: and had been patched. But ultimately that did mean five 522 00:32:34,920 --> 00:32:39,680 Speaker 1: different zero day vulnerabilities were used when designing Stuxnet 523 00:32:39,760 --> 00:32:42,640 Speaker 1: over the course of the life of Stuxnet. On 524 00:32:42,760 --> 00:32:45,960 Speaker 1: top of those zero day exploits, the virus used four 525 00:32:46,160 --> 00:32:49,880 Speaker 1: other means to copy and send itself along to other machines. 526 00:32:50,200 --> 00:32:53,320 Speaker 1: So in total it had nine different methods to spread 527 00:32:53,360 --> 00:32:56,600 Speaker 1: the virus.
One of them leveraged a vulnerability in special 528 00:32:56,840 --> 00:33:02,160 Speaker 1: Siemens software to gain system level privileges. Siemens is a 529 00:33:02,240 --> 00:33:05,440 Speaker 1: company in Germany that creates all sorts of different 530 00:33:05,520 --> 00:33:10,680 Speaker 1: kinds of software. The software in particular that Stuxnet 531 00:33:10,720 --> 00:33:14,240 Speaker 1: was concerned with was for something called PLCs, programmable 532 00:33:14,240 --> 00:33:18,760 Speaker 1: logic controllers. So these 533 00:33:18,760 --> 00:33:25,880 Speaker 1: are little implementations that allow computers to communicate with various devices, 534 00:33:25,920 --> 00:33:30,320 Speaker 1: typically ones that are used in industrial applications, so it might 535 00:33:30,360 --> 00:33:34,720 Speaker 1: be like a conveyor belt or valve system, that kind 536 00:33:34,720 --> 00:33:37,880 Speaker 1: of thing, which is a pretty odd thing for viruses 537 00:33:38,040 --> 00:33:42,440 Speaker 1: to target, typically. There was another clever way that the 538 00:33:42,480 --> 00:33:46,240 Speaker 1: malware could spread. It would create a file sharing server 539 00:33:46,400 --> 00:33:50,160 Speaker 1: folder on every computer it infected if that computer were 540 00:33:50,160 --> 00:33:54,080 Speaker 1: connected to other infected machines. So if it's a computer on 541 00:33:54,080 --> 00:33:56,560 Speaker 1: a network and other computers on the network also got 542 00:33:56,600 --> 00:33:59,120 Speaker 1: infected by Stuxnet, they would chat with each other 543 00:33:59,320 --> 00:34:02,240 Speaker 1: and they would compare notes. They would say, hey, 544 00:34:02,360 --> 00:34:04,680 Speaker 1: which version of Stuxnet are you running? I've got 545 00:34:04,720 --> 00:34:07,520 Speaker 1: one point two. And they might say, well, I've got 546 00:34:07,520 --> 00:34:11,080 Speaker 1: one point two one.
Hey, your version is more current 547 00:34:11,120 --> 00:34:13,240 Speaker 1: than my version is, give me some of that sweet 548 00:34:13,239 --> 00:34:17,600 Speaker 1: Stuxnet. And sure enough the system would propagate the 549 00:34:17,680 --> 00:34:21,440 Speaker 1: latest version of Stuxnet across its network. So it 550 00:34:21,520 --> 00:34:25,759 Speaker 1: was kind of a peer-to-peer approach to spreading the 551 00:34:25,840 --> 00:34:28,120 Speaker 1: latest and greatest version of Stuxnet. And if someone 552 00:34:28,200 --> 00:34:31,560 Speaker 1: came in and infected a new computer with an even 553 00:34:31,840 --> 00:34:35,760 Speaker 1: more recent version of Stuxnet, then shortly that version 554 00:34:35,760 --> 00:34:39,160 Speaker 1: of Stuxnet would propagate across the other infected computers 555 00:34:39,239 --> 00:34:41,200 Speaker 1: on the network. It was a way of making sure 556 00:34:41,480 --> 00:34:44,399 Speaker 1: everyone was on the same page, even without them being 557 00:34:44,600 --> 00:34:48,200 Speaker 1: aware of it. The malware would install two driver modules 558 00:34:48,280 --> 00:34:53,600 Speaker 1: on the infected computer, and, uh, these driver modules were 559 00:34:54,400 --> 00:34:58,919 Speaker 1: posing as software drivers. Software drivers are 560 00:34:59,760 --> 00:35:03,839 Speaker 1: liaisons between a computer and some other piece of hardware. So, 561 00:35:03,960 --> 00:35:07,359 Speaker 1: for example, if you have a separate computer mouse, or 562 00:35:07,400 --> 00:35:09,840 Speaker 1: a microphone that you plug into your computer, or a 563 00:35:09,880 --> 00:35:13,560 Speaker 1: webcam that you plug into your computer, the driver is 564 00:35:13,640 --> 00:35:18,040 Speaker 1: what allows for meaningful communication between that device and your computer.
565 00:35:18,400 --> 00:35:21,640 Speaker 1: You may have occasionally had an issue where one of 566 00:35:21,640 --> 00:35:24,520 Speaker 1: your peripherals no longer seems to work on your computer, 567 00:35:24,800 --> 00:35:26,920 Speaker 1: and it's because the driver is out of date. It 568 00:35:27,000 --> 00:35:29,200 Speaker 1: may be that there was an update to the operating 569 00:35:29,239 --> 00:35:33,600 Speaker 1: system and that update has broken that communication channel between 570 00:35:33,760 --> 00:35:36,680 Speaker 1: your peripheral and your computer, and it requires that you 571 00:35:36,800 --> 00:35:41,120 Speaker 1: update your software drivers so that now the two machines 572 00:35:41,200 --> 00:35:44,800 Speaker 1: can talk to each other again. That's what the malware 573 00:35:44,840 --> 00:35:49,840 Speaker 1: would install: these apparently innocent, at least on a casual glance, 574 00:35:50,680 --> 00:35:55,920 Speaker 1: driver modules onto the infected computer. Now normally, later versions 575 00:35:55,960 --> 00:35:59,480 Speaker 1: of Windows would send an alert to a user whenever 576 00:35:59,480 --> 00:36:03,319 Speaker 1: a piece of software was to be installed. If you've 577 00:36:03,400 --> 00:36:06,160 Speaker 1: used Windows seven or later, then you know about this. 578 00:36:06,200 --> 00:36:09,080 Speaker 1: You get that little window that pops up and says, hey, 579 00:36:09,120 --> 00:36:11,120 Speaker 1: I see that you're trying to install this thing. Is 580 00:36:11,160 --> 00:36:14,399 Speaker 1: that really your intention?
Because it gives you the chance 581 00:36:14,440 --> 00:36:17,320 Speaker 1: to say, heck no, I didn't know that was happening, 582 00:36:17,440 --> 00:36:20,040 Speaker 1: stop it, and then you could investigate, and if it 583 00:36:20,120 --> 00:36:22,360 Speaker 1: were malware, you would know something was up and you 584 00:36:22,360 --> 00:36:25,880 Speaker 1: can maybe do something about it. So the goal of 585 00:36:25,920 --> 00:36:28,880 Speaker 1: the hackers is not to have this window pop up. 586 00:36:29,040 --> 00:36:32,720 Speaker 1: So this malware, Stuxnet, was a lot more insidious 587 00:36:33,080 --> 00:36:38,200 Speaker 1: than just a fake driver. It contained a digital certificate 588 00:36:38,560 --> 00:36:43,760 Speaker 1: from a legitimate Taiwanese hardware company called RealTek Semiconductor. 589 00:36:44,480 --> 00:36:49,319 Speaker 1: Digital certificates are like authorized signatures. These are a way for 590 00:36:49,400 --> 00:36:53,280 Speaker 1: companies to authenticate that the software they distribute in fact 591 00:36:53,400 --> 00:36:57,160 Speaker 1: actually comes from them, and big players that are trusted 592 00:36:57,400 --> 00:37:00,880 Speaker 1: can use those certificates to authenticate drivers and other software on 593 00:37:01,000 --> 00:37:04,080 Speaker 1: machines without the need for that pop up notification. You're 594 00:37:04,080 --> 00:37:07,000 Speaker 1: not gonna get it every time, because essentially what's happening 595 00:37:07,040 --> 00:37:10,279 Speaker 1: is Microsoft says, hey, there's this software that wants to 596 00:37:10,360 --> 00:37:13,960 Speaker 1: execute on this machine.
Oh wait, this software is from 597 00:37:13,960 --> 00:37:17,160 Speaker 1: such and such company, and I know they're cool. And 598 00:37:17,280 --> 00:37:20,680 Speaker 1: it's a digital certificate that tells me that it's absolutely 599 00:37:20,760 --> 00:37:24,480 Speaker 1: from that company, because they protect their certification process, so 600 00:37:24,520 --> 00:37:27,560 Speaker 1: I know it's not from anyone else, so I don't 601 00:37:27,560 --> 00:37:30,080 Speaker 1: need to worry the user. I'm not gonna send that 602 00:37:30,120 --> 00:37:32,960 Speaker 1: pop up because everything's totally on the up and up. 603 00:37:33,080 --> 00:37:37,319 Speaker 1: So as long as the software is authenticated as being 604 00:37:37,400 --> 00:37:40,759 Speaker 1: from a trusted source, there's no extra step in there. 605 00:37:41,000 --> 00:37:44,400 Speaker 1: But that created a pathway for potential attacks, though at 606 00:37:44,400 --> 00:37:47,160 Speaker 1: the time not very many people were considering that. One 607 00:37:47,160 --> 00:37:50,760 Speaker 1: person who was, was a security expert with the Finnish 608 00:37:50,920 --> 00:37:55,080 Speaker 1: company F-Secure. That is a company from Finland, not 609 00:37:55,160 --> 00:38:00,320 Speaker 1: a company that finishes things. And in July he pointed 610 00:38:00,320 --> 00:38:02,239 Speaker 1: out that if a hacker were to get access to 611 00:38:02,320 --> 00:38:06,880 Speaker 1: digital certificates, they could potentially sneak in malware onto computers 612 00:38:07,040 --> 00:38:09,719 Speaker 1: using that, which was exactly what was going on with 613 00:38:09,760 --> 00:38:14,600 Speaker 1: Stuxnet. Now, this researcher wasn't aware of Stuxnet 614 00:38:14,640 --> 00:38:17,279 Speaker 1: at the time. He was just saying, hey, this is 615 00:38:17,320 --> 00:38:19,719 Speaker 1: a potential problem.
And as it turns out, it wasn't 616 00:38:19,760 --> 00:38:22,080 Speaker 1: just a potential problem, it was a real problem that 617 00:38:22,160 --> 00:38:25,839 Speaker 1: was going on at that very moment. Moreover, digital certificates 618 00:38:25,840 --> 00:38:28,520 Speaker 1: have an expiration date, and this is to help make 619 00:38:28,520 --> 00:38:32,200 Speaker 1: sure that they remain secure. You have to renew your 620 00:38:32,200 --> 00:38:35,920 Speaker 1: certificate so that it doesn't stick around long enough for 621 00:38:36,040 --> 00:38:38,280 Speaker 1: bad actors to get hold of it and then leverage 622 00:38:38,280 --> 00:38:41,000 Speaker 1: it the way the malware authors had done in the 623 00:38:41,000 --> 00:38:45,000 Speaker 1: case of Stuxnet. So you end up creating a 624 00:38:45,000 --> 00:38:47,920 Speaker 1: certificate that has an expiration date on it. After the 625 00:38:47,960 --> 00:38:53,759 Speaker 1: expiration date, you then administer a new certificate that 626 00:38:54,040 --> 00:38:57,680 Speaker 1: has new code on it, but it still has that authentication. 627 00:38:58,040 --> 00:38:59,640 Speaker 1: And that way, if anyone tries to use the old 628 00:38:59,640 --> 00:39:02,759 Speaker 1: certificate, then an operating system like Windows can 629 00:39:02,800 --> 00:39:05,960 Speaker 1: say, wait a minute, that certificate is out of date. Uh, 630 00:39:06,000 --> 00:39:08,480 Speaker 1: I'm going to alert the user, because that could be 631 00:39:08,520 --> 00:39:11,279 Speaker 1: an indication that someone had gotten a hold of an 632 00:39:11,280 --> 00:39:14,279 Speaker 1: old authentication certificate and they're trying to pass it off 633 00:39:14,320 --> 00:39:19,920 Speaker 1: as legitimate. So anytime a certificate expires, it's no longer 634 00:39:19,960 --> 00:39:24,719 Speaker 1: really useful for the case of malware distributors.
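The expiration rule described here boils down to a date-window check during signature verification. A minimal Python sketch for illustration; the field names are assumptions, and real verification (Windows Authenticode, for instance) also checks the signature itself, the certificate chain, and revocation:

```python
from datetime import date

def check_certificate(cert, today):
    """Accept a certificate only inside its validity window."""
    if today < cert["not_before"]:
        return "warn: certificate not yet valid"
    if today > cert["not_after"]:
        return "warn: certificate expired"
    return "ok"
```

Under this rule, a stolen certificate has a shelf life: once it lapses, any software signed with it starts triggering the warning the speaker describes.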
So some 635 00:39:24,760 --> 00:39:30,319 Speaker 1: companies will hire out their certificates, like they'll create certificates, 636 00:39:30,440 --> 00:39:32,879 Speaker 1: and then other parties will come to them and say, hey, 637 00:39:33,239 --> 00:39:36,080 Speaker 1: we have created this software. We would like to say 638 00:39:36,080 --> 00:39:39,120 Speaker 1: that we created it in partnership with you, and in return, 639 00:39:39,520 --> 00:39:42,960 Speaker 1: you can put your authentication certificate on this software, which 640 00:39:43,000 --> 00:39:45,520 Speaker 1: will help us out a lot. Uh, so some companies 641 00:39:45,520 --> 00:39:48,400 Speaker 1: will actually do that. Others are way more protective of 642 00:39:48,400 --> 00:39:51,600 Speaker 1: their digital certificates. No one was sure at this point 643 00:39:51,640 --> 00:39:56,120 Speaker 1: if RealTek had their certificate stolen somehow, if 644 00:39:56,120 --> 00:39:59,279 Speaker 1: the hackers had managed to, uh, illegally get hold of 645 00:39:59,320 --> 00:40:02,120 Speaker 1: this digital certificate, or if there had been 646 00:40:02,200 --> 00:40:05,480 Speaker 1: some other form of transaction involved, if RealTek perhaps 647 00:40:06,239 --> 00:40:10,600 Speaker 1: licensed out, essentially, their certificate. Circumstantial evidence suggested that it 648 00:40:10,760 --> 00:40:15,000 Speaker 1: was a stolen certificate. Looking at the malware code, it 649 00:40:15,040 --> 00:40:18,480 Speaker 1: appeared that one of the driver modules had its certificate 650 00:40:18,640 --> 00:40:22,920 Speaker 1: signed to it just six minutes after the original code 651 00:40:22,920 --> 00:40:25,880 Speaker 1: had been compiled. This was found out by converting the 652 00:40:25,920 --> 00:40:29,120 Speaker 1: code into binary and then being very meticulous about looking 653 00:40:29,200 --> 00:40:32,239 Speaker 1: through the data for any sort of time stamp information.
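The timestamp the analysts compared against the signing time is a real field: every Windows PE binary carries a 32-bit Unix time, TimeDateStamp, in its COFF file header. A minimal Python sketch of pulling it out of a binary blob (error handling kept deliberately simple):

```python
import struct

def pe_timestamp(data):
    """Return the COFF TimeDateStamp (a Unix time) from PE image bytes."""
    if data[:2] != b"MZ":
        raise ValueError("not a DOS/PE executable")
    # offset 0x3C of the DOS header holds e_lfanew, the PE header offset
    (e_lfanew,) = struct.unpack_from("<I", data, 0x3C)
    if data[e_lfanew:e_lfanew + 4] != b"PE\x00\x00":
        raise ValueError("missing PE signature")
    # COFF header follows the signature: Machine (2 bytes),
    # NumberOfSections (2 bytes), then TimeDateStamp (4 bytes)
    (stamp,) = struct.unpack_from("<I", data, e_lfanew + 8)
    return stamp
```

As the episode goes on to note, this field can be fudged by whoever builds the binary, so it's suggestive evidence rather than proof.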
Now, 654 00:40:32,280 --> 00:40:35,440 Speaker 1: it is possible to fudge things like the date and 655 00:40:35,560 --> 00:40:42,319 Speaker 1: time of compiling, but that's not necessarily easy to do, 656 00:40:42,480 --> 00:40:45,360 Speaker 1: so you could say that the compile date's not really 657 00:40:45,360 --> 00:40:48,480 Speaker 1: a smoking gun, but it does suggest that the certificate 658 00:40:48,520 --> 00:40:51,440 Speaker 1: had been sitting around in the pocket of whomever had 659 00:40:51,440 --> 00:40:55,800 Speaker 1: been designing Stuxnet and was then immediately slapped onto Stuxnet 660 00:40:55,800 --> 00:40:58,520 Speaker 1: once the code was compiled and ready to go. Now, 661 00:40:58,560 --> 00:41:00,279 Speaker 1: I've got a lot more to say in this first 662 00:41:00,320 --> 00:41:02,919 Speaker 1: episode about Stuxnet, but before I continue, let's 663 00:41:02,960 --> 00:41:13,520 Speaker 1: take another quick break to thank our sponsor. Ulasen would 664 00:41:13,520 --> 00:41:17,640 Speaker 1: reach out to both RealTek and Microsoft to alert 665 00:41:17,760 --> 00:41:21,800 Speaker 1: both companies to this vulnerability, because the malware had that digital 666 00:41:21,840 --> 00:41:26,680 Speaker 1: certificate from RealTek and it was affecting Microsoft-based machines. 667 00:41:27,239 --> 00:41:30,240 Speaker 1: He had not yet figured out what the malware was actually 668 00:41:30,320 --> 00:41:34,160 Speaker 1: for; that would be the payload. He was starting to understand 669 00:41:34,160 --> 00:41:37,200 Speaker 1: a little bit of how the malware would infect machines, 670 00:41:38,000 --> 00:41:40,000 Speaker 1: but he didn't know what it was supposed to do.
671 00:41:41,000 --> 00:41:45,040 Speaker 1: He did know it could potentially infect millions of computers 672 00:41:45,040 --> 00:41:48,359 Speaker 1: around the world, because that digital certificate gave it kind 673 00:41:48,400 --> 00:41:52,839 Speaker 1: of a VIP pass onto machines, and if 674 00:41:52,840 --> 00:41:55,560 Speaker 1: it was meant to steal information or cause mischief, he 675 00:41:55,600 --> 00:41:58,600 Speaker 1: wanted to nip that in the bud. One interesting tidbit 676 00:41:58,719 --> 00:42:01,160 Speaker 1: is that whoever designed the malware had been 677 00:42:01,200 --> 00:42:02,960 Speaker 1: really careful to do it in such a way that 678 00:42:03,000 --> 00:42:07,239 Speaker 1: the major antivirus packages out there wouldn't suspect a thing. 679 00:42:07,680 --> 00:42:10,560 Speaker 1: It was compatible with all the major antivirus packages, 680 00:42:11,000 --> 00:42:14,799 Speaker 1: so most people wouldn't have any way of telling that 681 00:42:14,920 --> 00:42:19,040 Speaker 1: something hinky was going on. Clearly, the hackers who designed 682 00:42:19,080 --> 00:42:22,280 Speaker 1: this had worked with computers that had these antivirus 683 00:42:22,280 --> 00:42:25,360 Speaker 1: software packages installed on them to make sure that it 684 00:42:25,400 --> 00:42:29,759 Speaker 1: would slip under the radar.
But VirusBlokAda was 685 00:42:30,200 --> 00:42:34,000 Speaker 1: a small operation, and it may have had 686 00:42:34,840 --> 00:42:37,880 Speaker 1: this incompatibility problem, where it was causing the 687 00:42:37,920 --> 00:42:41,400 Speaker 1: computer to crash and reboot over and over again, simply 688 00:42:41,440 --> 00:42:45,160 Speaker 1: because the people who were designing the Stuxnet virus 689 00:42:45,360 --> 00:42:49,360 Speaker 1: had never really encountered this particular antivirus platform before, 690 00:42:49,400 --> 00:42:52,480 Speaker 1: and they weren't able to make sure that Stuxnet 691 00:42:52,640 --> 00:42:56,200 Speaker 1: would not be picked up by it, and so it 692 00:42:56,280 --> 00:43:00,760 Speaker 1: was a real enigma. Ulasen couldn't even get the virus 693 00:43:00,840 --> 00:43:05,200 Speaker 1: to regularly replicate the problems he was seeing, so he 694 00:43:05,239 --> 00:43:08,880 Speaker 1: wasn't really certain what was happening. It was largely 695 00:43:08,920 --> 00:43:11,920 Speaker 1: a matter of luck that this happened at all and 696 00:43:12,000 --> 00:43:14,919 Speaker 1: brought people's attention to it. After two weeks without hearing 697 00:43:14,960 --> 00:43:18,840 Speaker 1: anything back from Microsoft or RealTek, Ulasen posted information 698 00:43:18,880 --> 00:43:22,280 Speaker 1: about what he had found both to his company's website 699 00:43:22,560 --> 00:43:27,080 Speaker 1: and on an English-language cybersecurity forum. He did 700 00:43:27,120 --> 00:43:29,719 Speaker 1: that on July twelve, two thousand and ten. That was 701 00:43:29,760 --> 00:43:33,320 Speaker 1: the same day that the Finnish security firm was talking 702 00:43:33,320 --> 00:43:36,960 Speaker 1: about how digital certificates from trusted sources could become a 703 00:43:37,040 --> 00:43:40,839 Speaker 1: vector for malware.
Just a few days later, 704 00:43:41,160 --> 00:43:45,320 Speaker 1: security researcher and tech journalist Brian Krebs posted about the malware, 705 00:43:45,680 --> 00:43:48,280 Speaker 1: and it quickly became the talk of the cybersecurity 706 00:43:48,360 --> 00:43:52,160 Speaker 1: sphere at that point. Microsoft is the company that actually 707 00:43:52,200 --> 00:43:55,520 Speaker 1: gave the malware its name, and the company named it 708 00:43:55,600 --> 00:43:59,000 Speaker 1: that by combining some elements of code that were found 709 00:43:59,320 --> 00:44:02,440 Speaker 1: in the virus itself, including the file name for one 710 00:44:02,480 --> 00:44:05,920 Speaker 1: of the driver modules, which was mrxnet 711 00:44:06,320 --> 00:44:10,520 Speaker 1: dot sys. By this time, VirusBlokAda had updated 712 00:44:10,520 --> 00:44:14,560 Speaker 1: its antivirus software to sniff out Stuxnet. It 713 00:44:14,600 --> 00:44:18,120 Speaker 1: was looking for any sort of markers that would identify 714 00:44:18,239 --> 00:44:21,120 Speaker 1: Stuxnet, and the company discovered that the malicious code 715 00:44:21,120 --> 00:44:25,200 Speaker 1: had infected many computers, across the Middle East in particular. 716 00:44:25,400 --> 00:44:30,480 Speaker 1: A Slovakian security firm called ESET 717 00:44:30,800 --> 00:44:34,600 Speaker 1: discovered a new driver module that seemed to 718 00:44:34,600 --> 00:44:36,719 Speaker 1: be very similar to the Stuxnet one that was 719 00:44:36,760 --> 00:44:40,279 Speaker 1: previously identified. This one had a digital certificate from a 720 00:44:40,320 --> 00:44:44,879 Speaker 1: different company called JMicron Technology, which was also from 721 00:44:44,920 --> 00:44:47,520 Speaker 1: Taiwan and in fact was located just a couple of 722 00:44:47,520 --> 00:44:51,799 Speaker 1: blocks away from RealTek.
The malware appeared otherwise to 723 00:44:51,840 --> 00:44:55,120 Speaker 1: be pretty much the same as its predecessor. So why 724 00:44:55,120 --> 00:44:58,239 Speaker 1: did it have a different digital certificate? Well, part of 725 00:44:58,280 --> 00:45:00,399 Speaker 1: the reason was that the RealTek certificate had 726 00:45:00,480 --> 00:45:03,520 Speaker 1: expired in June two thousand and ten, so you couldn't 727 00:45:03,520 --> 00:45:06,200 Speaker 1: infect new computers using it. Windows would not allow a 728 00:45:06,320 --> 00:45:09,160 Speaker 1: driver with an expired digital certificate to install itself on 729 00:45:09,160 --> 00:45:13,440 Speaker 1: a computer without notifying the user. The new legitimate digital 730 00:45:13,480 --> 00:45:17,239 Speaker 1: certificate from JMicron Technology could sidestep that problem. The 731 00:45:17,280 --> 00:45:21,400 Speaker 1: new attack may have launched on July fourteen, just two 732 00:45:21,520 --> 00:45:26,279 Speaker 1: days after Ulasen had made his findings public, and it's 733 00:45:26,320 --> 00:45:30,640 Speaker 1: possible that the malware was released hurriedly in reaction to 734 00:45:30,880 --> 00:45:33,680 Speaker 1: the announcement, in what might have been an attempt 735 00:45:33,800 --> 00:45:37,279 Speaker 1: to infect as many computers as possible before Microsoft could 736 00:45:37,280 --> 00:45:41,319 Speaker 1: patch the vulnerability. There's some evidence to support this hypothesis, 737 00:45:41,400 --> 00:45:43,799 Speaker 1: as the code in this release was a little less 738 00:45:43,840 --> 00:45:47,200 Speaker 1: buttoned down than the original attack had been back in 739 00:45:47,440 --> 00:45:49,880 Speaker 1: two thousand nine. And by some evidence, I mean there 740 00:45:49,880 --> 00:45:53,200 Speaker 1: were some sloppy mistakes.
The digital certificate contained a block 741 00:45:53,239 --> 00:45:57,040 Speaker 1: of information about the company that issued the certificate, kind 742 00:45:57,040 --> 00:45:59,600 Speaker 1: of like a little bit of background information 743 00:45:59,680 --> 00:46:03,920 Speaker 1: about JMicron, and that bit of information included a 744 00:46:04,080 --> 00:46:06,799 Speaker 1: URL to a JMicron website, except there 745 00:46:06,840 --> 00:46:09,759 Speaker 1: was a typo in the URL, and so 746 00:46:09,840 --> 00:46:13,399 Speaker 1: any attempt to visit that particular address would return a 747 00:46:13,480 --> 00:46:17,200 Speaker 1: Server Not Found error. If anyone had tried it, 748 00:46:17,239 --> 00:46:19,280 Speaker 1: they might have said, well, it's kind of strange 749 00:46:19,360 --> 00:46:22,960 Speaker 1: that a company would issue a digital certificate and yet 750 00:46:23,000 --> 00:46:25,080 Speaker 1: have the wrong URL in there. You would 751 00:46:25,080 --> 00:46:27,520 Speaker 1: think that for something that important they would make absolutely 752 00:46:27,560 --> 00:46:30,960 Speaker 1: certain they had the correct information included, so that was one 753 00:46:31,000 --> 00:46:34,360 Speaker 1: red flag. There were also fields within the certificate that 754 00:46:34,440 --> 00:46:38,560 Speaker 1: had the value "change me" written in them instead of 755 00:46:38,560 --> 00:46:41,320 Speaker 1: whatever information should have been there. Now clearly that was 756 00:46:41,360 --> 00:46:43,799 Speaker 1: a note written by a hacker to his or her 757 00:46:43,880 --> 00:46:47,440 Speaker 1: team as a placeholder, you know, don't let this go 758 00:46:47,520 --> 00:46:51,360 Speaker 1: out before you change it, but it was never actually 759 00:46:51,360 --> 00:46:54,720 Speaker 1: replaced or changed. Those elements suggest the malware was rushed 760 00:46:54,719 --> 00:46:57,719 Speaker 1: out the door ahead of plan.
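Spotting a leftover placeholder like that is easy to automate. Here's a toy scan over certificate metadata fields; the field names and values are made up for illustration, loosely resembling what researchers described:

```python
# Strings that scream "someone forgot to finish this".
PLACEHOLDER_VALUES = {"change me", "todo", "fixme", "placeholder"}

def metadata_red_flags(fields):
    """Return the names of fields whose values look like
    unfinished placeholders -- the kind of sloppy mistake
    found in the rushed JMicron-signed variant."""
    return sorted(name for name, value in fields.items()
                  if value.strip().lower() in PLACEHOLDER_VALUES)

# Hypothetical certificate metadata for illustration only.
flags = metadata_red_flags({
    "organization": "JMicron Technology Corp.",
    "contact_email": "CHANGE ME",
    "department": "change me",
})
```

A scan like this flags `contact_email` and `department` while leaving the legitimate-looking organization field alone, which is exactly the kind of anomaly that tips off an analyst.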
Researchers later determined that 761 00:46:57,760 --> 00:47:01,799 Speaker 1: the original attacks happened in three waves. June of two 762 00:47:01,840 --> 00:47:04,400 Speaker 1: thousand nine was the first one and used an autorun 763 00:47:04,520 --> 00:47:09,600 Speaker 1: attack. March and April of two thousand ten were the second and third attacks, 764 00:47:10,080 --> 00:47:13,680 Speaker 1: and then after that you end up with these approaches 765 00:47:13,760 --> 00:47:17,400 Speaker 1: that were using a different digital certificate. It didn't appear 766 00:47:17,440 --> 00:47:20,600 Speaker 1: to have anything to do with identity theft, didn't have 767 00:47:20,640 --> 00:47:23,680 Speaker 1: anything to do with creating a botnet. So why 768 00:47:23,680 --> 00:47:27,400 Speaker 1: would you design code that can infect millions of machines 769 00:47:28,120 --> 00:47:31,480 Speaker 1: but doesn't actually cause harm to the host computers 770 00:47:31,840 --> 00:47:36,360 Speaker 1: or do anything else of any real consequence? Frank Boldewin, 771 00:47:36,560 --> 00:47:40,680 Speaker 1: a cybersecurity expert in Germany, discovered the first clues as 772 00:47:40,719 --> 00:47:45,040 Speaker 1: to Stuxnet's purpose. Boldewin had analyzed the code and 773 00:47:45,080 --> 00:47:48,160 Speaker 1: noticed that it appeared to have been designed to target computers 774 00:47:48,360 --> 00:47:51,080 Speaker 1: that had a particular type of software on them. That 775 00:47:51,200 --> 00:47:55,279 Speaker 1: software came from the German company Siemens that I mentioned earlier. Now, 776 00:47:55,280 --> 00:47:58,759 Speaker 1: they make lots of different stuff, including software for other 777 00:47:58,800 --> 00:48:02,520 Speaker 1: businesses, and it was one particular kind of software, or to be more specific, 778 00:48:02,680 --> 00:48:07,040 Speaker 1: two programs, that this virus was searching for.
Whenever it 779 00:48:07,040 --> 00:48:09,080 Speaker 1: would infect a computer, it would look to see if 780 00:48:09,520 --> 00:48:13,040 Speaker 1: one or both of these programs was installed also on 781 00:48:13,040 --> 00:48:17,040 Speaker 1: that computer. They were for industrial control systems. It's the 782 00:48:17,080 --> 00:48:19,880 Speaker 1: sort of thing you would find in a manufacturing facility, 783 00:48:20,120 --> 00:48:23,520 Speaker 1: so again, like the controllers for things like valves or 784 00:48:23,560 --> 00:48:28,600 Speaker 1: conveyor belts or other simple interconnected systems. Now, Boldewin's hypothesis 785 00:48:29,280 --> 00:48:32,400 Speaker 1: was that the malware was a type of industrial espionage. 786 00:48:32,640 --> 00:48:35,319 Speaker 1: He thought perhaps a company had created this malware in 787 00:48:35,360 --> 00:48:38,719 Speaker 1: an attempt to spy on competitors and learn how they 788 00:48:38,760 --> 00:48:42,880 Speaker 1: operate in an effort to gain a market advantage over them. 789 00:48:42,960 --> 00:48:47,120 Speaker 1: That wasn't exactly the right track, but it at least showed 790 00:48:47,160 --> 00:48:51,000 Speaker 1: that this malware was meant for a very specific purpose. 791 00:48:51,640 --> 00:48:54,320 Speaker 1: What that purpose was I've kind of alluded to already, 792 00:48:54,360 --> 00:48:56,480 Speaker 1: but we're going to dive into more of that in 793 00:48:56,520 --> 00:49:00,560 Speaker 1: our next episode to really look at how the Stuxnet story 794 00:49:00,920 --> 00:49:06,680 Speaker 1: unraveled, what the motivations behind it were, who was responsible, 795 00:49:07,080 --> 00:49:10,359 Speaker 1: and what the fallout was from all this. There's 796 00:49:10,400 --> 00:49:13,319 Speaker 1: no pun in that; there was no nuclear fallout.
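That gating behavior, staying quiet unless the target software is present, can be sketched like this. The paths below are hypothetical stand-ins: the real malware probed for Siemens Step7 and WinCC through registry keys and project files, not simple directory checks:

```python
import tempfile
from pathlib import Path

def host_is_interesting(target_paths):
    """Activate only when at least one target product appears to be
    installed -- the gating logic that kept the payload dormant
    on ordinary infected PCs."""
    return any(Path(p).exists() for p in target_paths)

# Hypothetical install locations standing in for the two Siemens
# programs; on a machine without them, the malware does nothing visible.
boring = host_is_interesting([r"C:\Program Files\Siemens\Step7",
                              r"C:\Program Files\Siemens\WinCC"])

# With a (simulated) install directory actually present, the check flips.
with tempfile.TemporaryDirectory() as simulated_install:
    interesting = host_is_interesting([simulated_install])
```

This is part of why Stuxnet puzzled analysts at first: on the overwhelming majority of infected machines, the interesting branch never fired.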
I 797 00:49:13,320 --> 00:49:15,399 Speaker 1: want to be clear about that, because otherwise this would 798 00:49:15,440 --> 00:49:19,800 Speaker 1: be a very dark series of episodes. As it stands, 799 00:49:19,840 --> 00:49:23,320 Speaker 1: it's still pretty scary, because we're talking about cyber warfare 800 00:49:23,360 --> 00:49:30,000 Speaker 1: at this point, using computers to create real-world physical effects, 801 00:49:30,880 --> 00:49:33,520 Speaker 1: which is pretty phenomenal. Up to this point, most people 802 00:49:33,560 --> 00:49:36,520 Speaker 1: thought of that as being just theoretical, that computers could 803 00:49:36,520 --> 00:49:39,720 Speaker 1: do a lot of damage to data and could create 804 00:49:39,760 --> 00:49:44,360 Speaker 1: a nuisance, but couldn't necessarily cause physical damage to the 805 00:49:44,400 --> 00:49:48,480 Speaker 1: real world around us. Stuxnet proved we shouldn't be 806 00:49:48,520 --> 00:49:50,920 Speaker 1: so sure about that. Again, I'll talk about that more 807 00:49:50,920 --> 00:49:53,600 Speaker 1: in our next episode. If you guys have suggestions for 808 00:49:53,840 --> 00:49:56,879 Speaker 1: tech topics I should cover in the future, maybe it's 809 00:49:56,880 --> 00:49:59,680 Speaker 1: a company, maybe it's a specific technology, maybe it's a 810 00:49:59,760 --> 00:50:02,480 Speaker 1: person in tech who you think I should profile, 811 00:50:03,040 --> 00:50:05,279 Speaker 1: let me know. Or if there's someone you think I 812 00:50:05,280 --> 00:50:08,160 Speaker 1: should interview or have on as a guest co-host, 813 00:50:08,440 --> 00:50:10,279 Speaker 1: let me know that as well. You can get in 814 00:50:10,320 --> 00:50:12,959 Speaker 1: touch with me through email. The address for the show 815 00:50:13,120 --> 00:50:16,040 Speaker 1: is tech stuff at how stuff works dot com, or 816 00:50:16,120 --> 00:50:18,240 Speaker 1: drop me a line on Facebook or Twitter.
The handle 817 00:50:18,280 --> 00:50:21,680 Speaker 1: for both of those is tech stuff h s w. Follow 818 00:50:21,800 --> 00:50:24,920 Speaker 1: us on Instagram, and of course you can watch me 819 00:50:25,000 --> 00:50:29,520 Speaker 1: record this show live at twitch dot tv slash tech stuff. 820 00:50:29,640 --> 00:50:32,960 Speaker 1: I typically record on Wednesdays and Fridays. There's a chat 821 00:50:33,040 --> 00:50:35,240 Speaker 1: room there. You can join in with the merry band 822 00:50:35,400 --> 00:50:40,960 Speaker 1: and have fun, high-spirited conversation about that weird thing 823 00:50:41,040 --> 00:50:43,040 Speaker 1: I just said and had to go back and fix 824 00:50:43,160 --> 00:50:46,800 Speaker 1: so that the podcast listeners will never know, but you'll 825 00:50:46,840 --> 00:50:50,759 Speaker 1: know, because you're pretty darn cool. Well, that's it for 826 00:50:50,840 --> 00:50:54,400 Speaker 1: me for now. I'll talk to you again really soon. 827 00:51:00,120 --> 00:51:02,520 Speaker 1: For more on this and thousands of other topics, visit 828 00:51:02,520 --> 00:51:13,160 Speaker 1: how stuff works dot com.