Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. Now, let me paint you a scenario, or, as my quiz show alter ego over on Ridiculous History would say, a sonario. You sit down to your computer. Maybe you're about to do some work, or maybe you're planning on, you know, being totally sus while claiming to work on wires in Among Us. Maybe you just want to watch an episode of The Haunting of Bly Manor. But whatever the reason, you encounter an unexpected problem. Your computer won't come out of lock mode. Instead, you get an ominous message. Someone has locked down your computer, creating a new password to keep you out. If you don't cough up a certain amount of money by a certain amount of time, something bad will happen. Maybe that something bad is that the intruder will lock down your computer forever. Maybe they will convert all the data on your computer to gibberish. Maybe they'll fill up your computer with garbage data to turn it into a useless device. Or maybe they'll use the information stored on your computer against you in some way, a la blackmail. You have been hit by ransomware. Ransomware is a subset of malware, or what we goofuses in the old days lumped under the general term computer virus, though as it turns out, that's not accurate. Really, ransomware is ugly stuff, and it can cause enormous problems for people and organizations. There are dozens of stories of computer systems and critical infrastructure being hit by ransomware. One target that seems to get hit fairly frequently would be hospitals. That is particularly ugly, and even more so during a pandemic. So in this episode, we're going to explore the history of ransomware, how it works in general, and some stories on how it's been used and what people did in response.
Speaker 1: Now, often with these topics, I have to give pretty vague estimations of when something got started, sometimes even having to resort to using a decade rather than a specific year. Thankfully, I guess, for ransomware the origin story is fairly well established. Now, I cannot for certain state that this is categorically the very first case of ransomware, but generally speaking people accept it as such, and so we come to the odd, sinister and absurd story of the AIDS Trojan. Now, in this case, a trojan is also a subset of malware. It's really more of a delivery system for malware, and one that a lot of folks are probably familiar with by concept, if not by name. The name Trojan references the story of the Trojan Horse, in which the Greek forces that had been laying siege to the free city of Troy got a great idea. Hey, said the Greek forces, what if we pretend to give them a really big present, only instead of, you know, I don't know, chocolate or whatever, they find out that it's actually some secret Greek task force. That task force will be inside Troy without having to break through the walls, and then they could actually open up the gates and let the rest of us in. And so, the story goes, the Greeks constructed a massive wooden horse, hiding the warriors inside of the horse and then leaving it out for the Trojan forces to bring into the city while the rest of the Greek forces pretended to sail away. Well, according to the story, the Trojans celebrated and they pulled the massive wooden horse into the city as a kind of war trophy, and the Greek warriors inside the horse snuck out at night, opened up the city gates, and then the returning Greek forces just sauntered their way in, conquering the city. In the world of malware, a trojan is some sort of program or application that appears to be legitimate but is secretly carrying along some malware.
Speaker 1: A lot of malware falls into this category or uses similar tactics to get people to, you know, actually install the malware on their computers. When file sharing became a huge trend as peer to peer networks took off, one of the biggest dangers wasn't that a movie studio or a record studio was going to come after you with some absurdly overblown lawsuit, although those did happen. The bigger threat was that one of those files you downloaded was actually malware in disguise, or it had malware embedded within the program just waiting to be unleashed. The AIDS Trojan was called that because it had to do with events surrounding the AIDS crisis, and it was actually distributed on diskette. This was pre Internet days. Well, the Internet was around, but not many people were using it. This was nineteen eighty nine. So the person who made this malware saved the malware onto disks disguised as a legitimate program. Only he did it to twenty thousand disks, and I imagine that must have taken a very long time. So who received those disks? Who was on the target side? Well, the disks went to people who were part of a nineteen eighty nine World Health Organization conference on the AIDS crisis. And who was the mastermind behind this plot? Well, that would be Dr. Joseph L. Popp, an evolutionary biologist and AIDS researcher. Okay, so let's get some more context. The medical community first began identifying medical cases that, in retrospect, we would understand to be related to acquired immunodeficiency syndrome, or AIDS, back in nineteen eighty one. The CDC would first use the term AIDS in September of nineteen eighty two. The World Health Organization would hold its first meeting to discuss the global situation in nineteen eighty three.
Speaker 1: Now, despite the clear danger, many countries, including the United States, failed to take this crisis very seriously at first, in large part because it was seemingly affecting only gay men, and the general social attitude toward homosexuality in these countries, including the United States, was at the very least uninviting, which is putting it lightly. By nineteen eighty nine, the US government couldn't sit by idly, and Congress created the National Commission on AIDS. Dr. Anthony Fauci, and yes, I'm talking about that Dr. Fauci, the same one today who's advocating social distancing and using masks during our COVID pandemic, well, back then he was endorsing giving HIV positive people access to experimental treatments, even if those people did not technically qualify for clinical trials, because that man is a freaking legend. And just to be clear, HIV is human immunodeficiency virus. That's the virus that can lead to AIDS. The number of AIDS cases in the United States at that point reached one hundred thousand, and the WHO, the World Health Organization, estimated that there were up to four hundred thousand cases around the world, and in some parts of the world, like Africa, the medical establishment was woefully underprepared to treat infected people and outbreaks were rampant. And all of this is important when we get to motives behind this trojan malware a little bit later on. That's why I needed to set the stage. So Dr. Popp had been a collaborating member of a group called the Flying Doctors, and that itself was a branch of the African Medical Research Foundation. Popp had served as a consultant for the World Health Organization in Kenya and had even organized AIDS related conferences himself. So Dr. Popp takes a look at the list of attendees to this WHO conference and mails twenty thousand people and various organizations a diskette.
Speaker 1: The disk's label stated that it was, quote, an AIDS Information Introductory Diskette, end quote, from a company called PC Cyborg Corporation, a fictional company of Dr. Popp's invention. Now, if you were to insert the diskette into a PC and then run the program, you would encounter a seemingly straightforward survey, you know, a questionnaire about AIDS. But in the background, the malware was infecting the autoexec dot bat file in the root directory for the PC. Now, this file is the startup file for a computer of that era. It's in charge of booting up a computer and activating all the components to work with the operating system. It sets up everything to move forward when you are actually using your computer, so when you power it on, this is the program that sets everything up. And then, like the Greek soldiers in our legend about the city of Troy, the malware would lie in wait for the right moment to strike. But instead of waiting for night to fall, the malware would keep count of how many times the program had been activated, although actually some sources say it tracked how frequently the computer system was rebooted or turned on, so different reports have conflicting information about this. But generally speaking, it was keeping track of how frequently the system was being used, and eventually, typically around either program activation or reboot number ninety, the malware would then initiate the actual attack. So it would wait until this counter had hit ninety instances of this thing happening. The program would then encrypt all the files on the C drive of the host machine, which is where most files lived. The C drive is kind of the default drive on PC machines. That rendered those files inaccessible to the user, and it would also launch a ransom message.
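To make the delayed-trigger idea above concrete, here is a minimal, harmless sketch in Python. It only counts how many times it has been run and prints a notice once a threshold is reached; the state file name, the threshold, and the behavior are illustrative assumptions on my part, not a reconstruction of Popp's actual code.

```python
# A harmless sketch of a run counter with a delayed trigger, using a simple
# state file. The AIDS Trojan reportedly waited roughly ninety activations or
# reboots before acting; this sketch just prints a message at that point.
from pathlib import Path

COUNTER_FILE = Path("run_count.txt")  # hypothetical state file
THRESHOLD = 90                        # roughly the count reported for the AIDS Trojan

def record_run() -> int:
    """Increment and persist the number of times this program has run."""
    count = int(COUNTER_FILE.read_text()) if COUNTER_FILE.exists() else 0
    count += 1
    COUNTER_FILE.write_text(str(count))
    return count

if __name__ == "__main__":
    runs = record_run()
    if runs >= THRESHOLD:
        print("Trigger condition reached after", runs, "runs.")
    else:
        print("Run number", runs, "- nothing happens yet.")
```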
Speaker 1: Now, if you search for this stuff online, you're likely to come across a picture of a screen that has a message that starts with, quote, attention, I have been elected to inform you that throughout your process of collecting and executing files, and so on and so on. It then goes on to drop some expletives, so I'm not going to quote the whole thing on here. This show is family friendly, for the most part. And the message also gloats over the fact that the computer has been infected by a virus. But that is not the actual message that popped up with the AIDS Trojan. It's frequently used in articles that are about the AIDS Trojan, but that's not what users saw. For one thing, that message has no information regarding the actual ransom, which for ransomware is kind of an important step. The message that gets shown all over the place concludes by saying, quote, rememb member, and that's actually the way it's written, now, that's misspelled, there is no cure for AIDS, end quote. So I just want to clear up the confusion. That was not the message that Dr. Popp's ransomware used. Instead, the actual message came across as more official and less of a Nelson from The Simpsons pointing and saying ha ha. The real message starts off as, quote, Dear customer, it is time to pay for your software lease from PC Cyborg Corporation. Complete the invoice and attach payment for the lease option of your choice, end quote. And the two choices offered included a yearly lease for one hundred eighty nine dollars or a lifetime lease for three hundred seventy eight dollars. In either case, the user was instructed to pay in the form of a banker's draft, a cashier's check, or international money order payable to PC Cyborg Corporation to a post office box located in Panama. Now, presumably, after paying this so called lease, you would receive a way to decrypt the files on your computer from the PC Cyborg Corporation.
Speaker 1: But the timeline for how this all shook out was a bit too short to know for sure whether or not Dr. Popp would have sent anything out, because I'm not even sure that Dr. Popp had a chance to retrieve anything from that post office box before it all went down. Now, the fact that the malware didn't go into action immediately, that it would wait a bit before striking, meant that the outbreak of infected computers was a little staggered. The first reports of the computer virus came out of England, and that's also when it first became clear that this was a new type of crime, one that wasn't covered on the lawbooks explicitly, which would force prosecutors to rely on older laws and hope that those laws could maybe bend enough to apply to this brand new crime. As news of the virus spread through the medical research community, some organizations took extreme steps. The newspaper The Independent reported that an AIDS organization in Italy chose to delete data off of infected machines, and they lost ten years of work in the process. Now, as it turned out, that was an extreme overreaction, but this was a new type of crisis, so I can't really put too much blame here. While computer scientists started to tackle the technological problem of how to decrypt infected computers, Scotland Yard's Computer Unit launched its largest investigation at that point to try and track down the perpetrator, and Dr. Popp might have even slipped away without anyone ever knowing about his involvement, except for his odd behaviors, which gave him away. Researchers figured out how to decrypt the machines. They created a decryption tool called AIDSOUT, for example, that would reverse the process on an infected computer, so you could decrypt your own files. If Dr. Popp had just kept a low profile, the AIDS Trojan might have entered computer lore as a great unsolved mystery. But Dr. Popp was behaving in odd ways.
Speaker 1: Shortly after mailing out the disks, he attended an AIDS seminar in Nairobi, and while this was a short time after the infected disks had hit the targets, it was already becoming a big point of conversation. It was like less than two weeks after he had mailed out the disks, and maybe that unnerved Dr. Popp. As he was traveling back to the United States, he had a layover at an airport in Amsterdam, and he wrote the phrase Dr. Popp has been poisoned on a fellow traveler's suitcase. That did not go unnoticed, and authorities decided they wanted to have a little word with Dr. Popp. Upon searching his luggage, the authorities found material with the label PC Cyborg Corporation, the fictional company at the heart of the ransomware. Popp was allowed to return to the United States, but not long afterward received a visit from the FBI, who arrested him, and then the US extradited Dr. Popp to Britain on charges of blackmail and criminal damage. Popp's lawyers would claim that Popp was planning on using the money from his scheme to fund alternative AIDS research, kind of framing his defense in such a way as to make it seem as though Popp was responding to what he saw as a flawed approach to tackling this global crisis. Now, as I mentioned earlier, it took far too long for most of the world to take the AIDS crisis seriously, so this narrative had a bit of an appeal to it. There was evidence that the world was not responding quickly or appropriately to this problem, so he was kind of like a Robin Hood figure, at least by this story, you know, stealing from bloated bureaucracies that were spending more money on infrastructure than on actual research. Or at least that was the narrative that his lawyers wanted people to accept. But there were some potential alternative motivations.
Speaker 1: The Guardian newspaper published an article that revealed that Dr. Popp had recently applied for a job with the WHO but had been rejected, so it's possible that the malware was an act of vindictive revenge for being snubbed. The actual trial was a huge mess for many reasons. One big one, as I mentioned, is that the legal system didn't yet have a framework for cyber crimes like this, so they had to apply older laws in the charges. But another was Popp's own behavior. Reportedly, he would show up to court while wearing a cardboard box on his head, or he would put curlers in his beard or a condom on his nose, and he claimed that it was to ward off radiation. The judge in the case ultimately ruled that Dr. Popp was unfit to stand trial. Prosecutors were frustrated, pointing out that a digital diary in Popp's possession revealed that this AIDS Trojan plan had been in development for well over a year, indicating that this was not some sort of spontaneous manic manifestation. Nevertheless, Popp was off the hook and he returned back home to the United States. He continued his work researching evolutionary biology. According to some sources that I came across, he also spent a lot of time pushing some rather unorthodox ideas about human reproduction that I find at best misogynistic. Anyway, he passed away in two thousand seven. His legacy includes not just the first case of ransomware, but also a lovely butterfly conservatory in Cooperstown, New York. For real. So, I mean, the Joseph L. Popp Junior Butterfly Conservatory is a place you can visit, you know, when things are more safe. When we come back, I'll talk a bit more about the encryption method Popp used and why the ransomware of today is far more dangerous. But first let's take a quick break.

Speaker 1: When Popp designed his malware, he had a limited tool set. When it came to encrypting the files on target computers, he used a process called symmetric key encryption.
Speaker 1: Now, the name gives us a hint about how this works. You've got a key to encode and decode text, and that key is the same for both parties that are trying to communicate secretly. You each have an exact copy of this key. This is easier to understand with an analogy. So let's say that you and I both have a Captain Crusader decoder ring, and I can write a message in plain old English, and then I use this decoder ring to encode that message so that it looks like a meaningless jumble of letters, numbers, and symbols. I send you the encoded message. You have your own ring, which in every way is a duplicate of the one I have, and using your ring, you reverse the process. You turn each coded letter back into the original uncoded text, and after some work, voila, you have the original message. You probably see the limitations of this approach right away. It depends upon the various parties having access to a private encryption key. If anyone else should get hold of that key, they could conceivably reverse the encryption process on any intercepted message. So the encryption method only works if the keys remain secret, and that's tricky, because you first have to make sure that both parties have the secret key, and getting a secret key to somebody securely is its own problem. Moreover, this type of encryption can be vulnerable to cryptanalysis, that is, efforts of others to reverse the process without a key in an effort to determine what the key is. This is something that happened a lot during World War Two, where both the Axis and the Allied powers worked hard to crack the codes of the opposition and then try to keep the fact that they had cracked the codes secret long enough to capitalize on the discovered information. The limitations of symmetric key cryptography made ransomware largely an impractical method to make some cash.
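To ground the decoder-ring analogy, here is a minimal sketch of symmetric key encryption in Python using the third-party cryptography package. The package choice and the sample message are my own assumptions for illustration; the AIDS Trojan's actual scheme was far simpler, but the core idea is the same: one shared key does both jobs.

```python
# A minimal sketch of symmetric-key encryption using the "cryptography"
# package (pip install cryptography).
from cryptography.fernet import Fernet

# One shared secret key: both parties must hold an exact copy of it.
shared_key = Fernet.generate_key()
ring = Fernet(shared_key)

message = b"Meet me at the usual place at noon."
ciphertext = ring.encrypt(message)   # the sender encodes with the shared key
plaintext = ring.decrypt(ciphertext) # the receiver decodes with the same key

assert plaintext == message
# The weakness: anyone who obtains shared_key can decrypt every message,
# and the key itself has to be delivered to the other party securely somehow.
```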
Speaker 1: With a wide enough spread, you might get some hits from people who lack the information, or access to information, to make an informed decision. So a few targets might panic and capitulate, but it's not the most reliable means of pulling off a big heist. A few years after Dr. Popp's attempt, a pair of researchers, or as some have written, a cryptographer and a hacker, laid out the strategies that they expected to see used in future ransomware attacks, and they were right. The two were Moti Yung and Adam L. Young. Yung and Young. They were working together to anticipate future problems, and just to be clear here, they were looking at the challenges from the perspective of a potential attacker, which is important because that's what the attackers are doing all the time, right? I mean, this is similar to white hat hackers, who operate the same way as a malicious attacker, but for the purposes of figuring out where vulnerabilities are within a system and in an effort to design a more effective digital security measure. Yung and Young started to ask some pretty simple questions and come up with answers. So let's say you want to design some malware, you know, for some reason. You'd probably have a checklist of things you would want for that malware. Now, depending upon your motivations, you might want the malware to remain hidden from view as well. If you're making a statement with malware, that's probably not the case. Maybe you want to hijack computer systems and display some sort of anarchistic message on monitors. But if your goal is to do something else, like secretly monitoring communications or stealing information, or spreading an infection to other computers on a connected system, chances are you don't want people detecting the malware right away.
Speaker 1: But let's say you actually want people to know that their machines are infected, because the whole point of your malware is to extort money from the owners of the target systems, and you can only do that if they realize that they've been targeted. You'll have a few other considerations at play. For example, you probably don't want people to be able to remove the malware easily before it can actually do its work. Yung and Young compared this to the facehugger in the Alien franchise of films. Once the facehugger latches onto a person, any attempt to remove it from the victim causes injury to the victim. So a malware designer will likely want to make it difficult or impossible to remove the malware without causing harm to the target machine. It reduces the incentive to just rip out the malware. So you want your malware to be like a barbed arrow. Removing the arrow has the potential to cause even more damage, and that creates an opportunity to convince people to pay up rather than risk their data being out of reach forever. The question then arises, how do you make sure the attack is one that isn't easily reversible? How do you avoid the weakness of Popp's approach? And their answer was another approach to cryptography, one that had its history dating back several decades. This approach, called asymmetric key or public private key cryptography, sidesteps the major vulnerabilities of the symmetric key approach. Now, I'll describe what's going on from a very high level, and just to let you guys know, I'm not going to go into a deep dive into detail, because it's a very complicated concept to unwrap and it merits its own episode. In fact, I've actually done episodes about this. But with a symmetric key, the two parties in communication are using exact copies of the same encoding and decoding component. With an asymmetric key, you've got one key that encodes and a different key that decodes, and that's it.
Speaker 1: Communication goes one way for each set of keys. This allows for a public key for encoding and a private key for decoding. So again, let's talk about examples. Let's say I want to send you an encrypted message, and that way, anyone who intercepts my communication to you would just see a garbled mess of nonsense. Now, you have your own private key and there is a corresponding public key, and you have made the public key truly public. Anyone in the world can use it to send you encrypted messages. So I use your public key to encode my message to you. Now we've got an encrypted message, one that can only be decoded by the private key. There's only one of those, and it's in your possession, and that one you are not sharing with nobody. Gosh darn it. So with a public private key, everyone can send you encrypted messages. Only you can decrypt them to see the original text. The public key cannot be used for decoding. It can only be used for encoding. But then, what if you wanted to send a reply message to me, and you wanted to encrypt it? Well, in that case, you would use my public key, the one I have for anyone to send me an encrypted message. You send your encoded message to me, and then I use my private key to decipher that message and read the contents. So we're using two different sets of keys here, two public and two private keys. Now, why is this important for ransomware? Well, asymmetric keys are harder to crack through cryptanalysis. You cannot reverse engineer them nearly as effectively. They typically rely on factoring really big numbers. So, for example, take two enormous prime numbers, and a prime number is a number that can only be divided by itself and one. You multiply these two huge prime numbers together, and thus you get an even bigger number that is the product of these two being multiplied. This bigger number, you can think of that as the public key.
Speaker 1: If you happen to know the two factors that were used to create that bigger number, then you can decode messages that use that public key. That would technically be the private key. But by choosing really, really big prime numbers, you've created a difficult computational challenge. A computer system would have to go through all the factors of that big number and then dismiss any of the factors that are not themselves prime numbers. So if one of the factors were something like a four, you would toss that one. That could not possibly be one of the components you need, because four can be divided by two, so four is not a prime number. You get rid of it. You get rid of all the non prime factors. Then you would have to find the specific pair of really big prime numbers that were used to create this private key. Now, this isn't impossible, but as you use larger prime numbers, it gets more computationally complex, and it requires more processing power and thus more time to crack it. Time is a precious resource. You know, you can think of time as money, so you don't even need to make the encryption foolproof. You just need to make it good enough so that it is too expensive for anyone to bother trying to crack it. Anyway, this type of cryptography is fascinating, and like all cryptography, it becomes a sort of seesaw approach as people find new ways to decrypt things more effectively and efficiently. Yung and Young projected that future ransomware designers would make use of asymmetric cryptography approaches to make it more difficult to reverse the attack, and it would just be easier for people to pay the hackers the ransoms. So, in other words, you might say, well, they're asking for ten thousand dollars, but the value of my data is incredibly high, and the price of reversing this attack could end up being much more expensive than ten grand, so we'll just cough up the money.
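Here is a minimal sketch of that public and private key split in Python, again using the cryptography package as an assumed tool; the RSA key size and the sample message are placeholders of my choosing. The public key encrypts, only the matching private key decrypts, and recovering the private key from the public one comes down to factoring a very large number into the two big primes that built the key pair.

```python
# A minimal sketch of asymmetric (public/private key) encryption with RSA,
# using the "cryptography" package (pip install cryptography).
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# You generate a key pair. The private key never leaves your possession;
# the public key can be handed out to anyone, even published openly.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# I (or anyone else) encrypt a message to you with your PUBLIC key...
message = b"Only the private-key holder can read this."
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
ciphertext = public_key.encrypt(message, oaep)

# ...and only your PRIVATE key can turn it back into the original text.
assert private_key.decrypt(ciphertext, oaep) == message

# The security rests on how hard it is to derive the private key from the
# public one, which for RSA means factoring a very large number into the
# two big primes used to build the key pair.
```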
Speaker 1: Yung and Young called this cryptoviral extortion, defining it as, quote, an active attack in which the hybrid encrypts the victim's files. Works if there are no backups. Attacker demands ransom in return for the randomly generated symmetric key. Cannot determine decryption key even when code is scrutinized, end quote. Young also described some related scenarios that are equally troubling, such as one in which malware infects a computer and then uses cryptography to encode specific information on that computer before broadcasting the information to the attacker, essentially sending secret messages from the target machine. The attacker has created the keys for encoding and decoding, and thus only the attacker knows what information was even stolen. So even if someone detects that a security breach has happened, they couldn't be certain what information had been accessed. That's not great if you're handling supersensitive information like financial information or medical records or military communications, etcetera. But I digress. When we come back, I'll dive into a little more detail on cryptoviral extortion methods and we'll talk about a few cases where we've seen it play out. But first let's take another quick break.

Speaker 1: Young pointed out that a cryptoviral extortion methodology really only works if a computer system lacks backups, and that's because you could potentially wipe an infected machine clean, right? You get hit by ransomware, you could just completely reformat that machine. You can even uninstall and reinstall the operating system and all the necessary applications, and then restore the data from your backup. It doesn't matter if the attacker encrypted all the content on the computer, because you've got an unaffected copy of that material. I think what I'm trying to say here is that it's a good idea to make regular backups of your data, preferably onto a secondary storage device that you can keep in a safe location. There are a lot of external hard drive solutions out there, and many of them are not very expensive.
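Since the practical advice here is simply to keep backups, here is a minimal sketch of what a basic backup step might look like in Python. The folder paths and the idea of copying to an external drive are assumptions for illustration, not a recommendation of any particular tool; a real setup would also want verification, scheduling, and an offline or off-site copy.

```python
# A minimal backup sketch: copy a working folder to a dated folder on a
# second drive. Paths here are placeholders.
import shutil
from datetime import datetime
from pathlib import Path

SOURCE = Path.home() / "Documents"         # data you care about (assumed path)
BACKUP_ROOT = Path("/mnt/external_drive")  # external or secondary drive (assumed path)

def make_backup() -> Path:
    """Copy the whole source tree into a timestamped folder on the backup drive."""
    stamp = datetime.now().strftime("%Y-%m-%d_%H%M%S")
    destination = BACKUP_ROOT / f"backup_{stamp}"
    # Ransomware that later encrypts the originals can't touch a copy kept
    # on a drive that is disconnected after the backup finishes.
    shutil.copytree(SOURCE, destination)
    return destination

if __name__ == "__main__":
    print(f"Backup written to {make_backup()}")
```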
Speaker 1: So if you work on sensitive stuff, or let's say you've just got a lot of data that you're attached to, like maybe a lot of sentimental photos and videos and stuff, I recommend investing in a backup. It's an extra step, and I get it, that can be a hassle, but it's better to have it and not need it than need it and not have it. Uh, cloud storage can also be a solution, so that's also a potential, but you should definitely have a backup. Now, Young also said that an attacker could take a couple of different tactics to make their approach more robust. And this gets pretty technical too, so I'm not going to go into any detail, because to explain all of it would require another episode. But the point is that Yung and Young were anticipating the attacks that would happen in the future, and they published the paper that they wrote in nineteen ninety six. It would take about a decade before we started seeing ransomware attacks that kind of aligned with the predictions, but they were on the way. And by the way, I think this is a good point for me to reaffirm my stance on this kind of work. So from one perspective, you could argue that this research accelerated the development of new approaches to ransomware. In other words, that by publishing these findings, Yung and Young were enabling the next wave of attackers. But on the other hand, figuring out potential vulnerabilities is important if you want to prevent people from exploiting those vulnerabilities. So the good guys have to look at how to crack systems, because that's what the bad guys are always doing. If good guys were not doing it too, then only the bad guys would be figuring out how to exploit systems, and we would be caught unawares far more often, with much more dire consequences as a result.
Speaker 1: Now, all that being said, there is a tendency, even within the white hat community, to communicate these discoveries in a way that comes across as, you know, smug or snarky, or sometimes even cruel. That's more of a commentary about communication style and the tendency to detach the significance of the consequences of an action from the problems of just solving tough computational challenges. But that's a soapbox for another episode. In the mid two thousands, we started to see slightly more sophisticated attempts at ransomware emerge. One ransomware was Cryzip, and the name suggests that it used ZIP file compression technology as part of the attack. In fact, that's exactly what it did. Essentially, if you were unlucky enough to have fallen for the tactic and installed the program, Cryzip would crawl through your C drive and select files to put into a password protected ZIP folder, and then it would delete all the original files. So rather than a hard drive full of files, you would see a folder with password protection on it, and the ransomware would place a TXT, or text, file on the drive that, if you opened it, would give you instructions on where you were to deposit money. In return, you would get the password to access your files. The hacker who designed this actually put the password within the DLL file for the ransomware itself, unencrypted too. I guess they just figured no one was going to go looking for it. But it turns out, if you include your password with the protected stuff, it's not that protected. It's kind of like writing the password for your computer on a post it note and then putting the post it note next to your computer. What's even the point? The Archiveus ransomware, which was based off of Cryzip, was another one that caused some mischief in two thousand six. Like Cryzip, analysts found the password for the ransomware embedded within the code of the malware itself, and it was a thirty digit passcode.
Speaker 1: I'm not gonna read it, because who wants to hear a string of seemingly random letters and numbers? The point is that if the code had not contained the password, it would have been a lot trickier to get around. But while some folks were tricked into installing the malware, the solution ended up being fairly straightforward in the end, so it didn't have as massive an impact as it could have. By twenty thirteen, ransomware was on the rise. In September of that year, hackers released a particularly effective weapon called CryptoLocker. Like the other examples that I've mentioned, this malware, once it was installed, would encrypt files on Windows machines and then show a message demanding payment in exchange for the decryption key. One new twist is that the hackers were demanding the ransom be paid in bitcoins, the cryptocurrency that makes transactions difficult to trace. If the hackers remained careful about how they accessed their ill gotten gains, they could profit off their crime without much fear of being tracked by the authorities. The trojan horse attack carrying CryptoLocker spread primarily through email attachments. The code could not replicate itself. It couldn't spread to other machines all on its own, so it wasn't a virus or a worm like other malware. Instead, the hackers created a botnet to spam out millions of computers with emails carrying infected files. A botnet, by the way, is a network of computers, a.k.a. bots, that ultimately are under the control of a hacker or a group of hackers. There are other types of malware that can give a hacker remote access to your computer. Sometimes this is done just to snoop on communications, other times to turn your computer into a resource for the hackers. So in this case it was a resource. It was meant to help distribute emails, and these emails had the infected attachments. Uh, this also comes with the bonus of creating some separation between the hacker and the emails.
So, 591 00:37:56,640 --> 00:37:59,200 Speaker 1: in other words, if authorities were to trace back where 592 00:37:59,200 --> 00:38:01,640 Speaker 1: the email came from and they found out it came 593 00:38:01,680 --> 00:38:04,880 Speaker 1: from your computer, and your computer had been compromised by 594 00:38:04,880 --> 00:38:07,719 Speaker 1: this botnet, you could be on the hook at 595 00:38:07,760 --> 00:38:13,840 Speaker 1: least temporarily, while the hacker remains undetected. The cryptographic method 596 00:38:14,040 --> 00:38:17,799 Speaker 1: used by CryptoLocker was pretty sophisticated, and unlike the 597 00:38:17,840 --> 00:38:21,480 Speaker 1: earlier examples, the decryption key was not evident within the 598 00:38:21,520 --> 00:38:25,880 Speaker 1: code of the malware itself. CryptoLocker used asymmetric keys, 599 00:38:26,120 --> 00:38:29,080 Speaker 1: a public and a private one, and the hackers held 600 00:38:29,120 --> 00:38:32,400 Speaker 1: on to both of them. A task force called Operation 601 00:38:32,520 --> 00:38:36,440 Speaker 1: Tovar was able to discover the decryption keys another way, 602 00:38:36,600 --> 00:38:40,600 Speaker 1: because the task force targeted the botnet, not the 603 00:38:40,640 --> 00:38:44,799 Speaker 1: actual ransomware directly. Doing this gave the team access to 604 00:38:44,920 --> 00:38:47,719 Speaker 1: the decryption keys, but it took time, and in the 605 00:38:47,760 --> 00:38:50,520 Speaker 1: gap between when the malware first hit the Internet and 606 00:38:50,560 --> 00:38:53,759 Speaker 1: started to infect machines and when the task force had 607 00:38:53,760 --> 00:38:56,400 Speaker 1: found the decryption keys, a lot of people and companies 608 00:38:56,440 --> 00:38:58,920 Speaker 1: had given up and they had ponied up the cash 609 00:38:58,960 --> 00:39:03,480 Speaker 1: to get their data back. Since CryptoLocker, numerous ransomware variants, 610 00:39:03,520 --> 00:39:07,680 Speaker 1: many of them descended from CryptoLocker itself, have appeared 611 00:39:07,719 --> 00:39:10,600 Speaker 1: on the scene. The medical sector continues to be one 612 00:39:10,640 --> 00:39:13,520 Speaker 1: that gets hit hard by this type of malware, and 613 00:39:13,600 --> 00:39:17,520 Speaker 1: from a criminal's perspective, you can understand why. Computers 614 00:39:17,560 --> 00:39:20,400 Speaker 1: that are tied to the medical industry contain 615 00:39:20,560 --> 00:39:24,440 Speaker 1: critical information. A lot of that information is private. It 616 00:39:24,600 --> 00:39:28,280 Speaker 1: is protected by law, so for it to get revealed 617 00:39:28,280 --> 00:39:32,080 Speaker 1: would be a big legal problem as well. It's tied 618 00:39:32,080 --> 00:39:35,239 Speaker 1: to patients frequently, and there is an enormous sense of 619 00:39:35,400 --> 00:39:39,160 Speaker 1: urgency to regain access to that kind of information, and 620 00:39:39,239 --> 00:39:44,760 Speaker 1: many medical establishments, including hospitals, lack the robust backup infrastructure 621 00:39:44,800 --> 00:39:47,960 Speaker 1: to recover in the event of a ransomware attack. And 622 00:39:48,000 --> 00:39:50,839 Speaker 1: that's not just me throwing shade at hospitals for not 623 00:39:51,000 --> 00:39:54,840 Speaker 1: having appropriate backups.
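To illustrate why the asymmetric-key design described above was so much harder to beat than Cryzip: the malware only needs the public key, and the private key never touches the victim's machine. The sketch below is a conceptual illustration of that encrypt-then-wrap pattern using Python's third-party cryptography package; it is not CryptoLocker's actual code, and the sample data is made up.

```python
# hybrid_encryption_demo.py -- a conceptual sketch of asymmetric (hybrid)
# ransomware-style encryption, NOT CryptoLocker's real implementation.
# Requires the third-party "cryptography" package.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.fernet import Fernet

# The attacker generates the key pair on their own infrastructure; only the
# public half would ever ship with the malware.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# A per-victim symmetric key encrypts the files quickly...
file_key = Fernet.generate_key()
ciphertext = Fernet(file_key).encrypt(b"spreadsheets, photos, records...")

# ...and that symmetric key is then wrapped with the RSA public key.
oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)
wrapped_key = public_key.encrypt(file_key, oaep)

# On the victim's machine only `ciphertext` and `wrapped_key` remain.
# Unwrapping requires the private key, which the attacker keeps elsewhere.
file_key_again = private_key.decrypt(wrapped_key, oaep)
assert Fernet(file_key_again).decrypt(ciphertext) == b"spreadsheets, photos, records..."
```

The last two lines only succeed because this demo holds both halves of the key pair in one script; on a real victim's machine the private key simply isn't present, which is why Operation Tovar had to go after the infrastructure holding the keys rather than the malware itself.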
This is actually a really tricky area 624 00:39:54,920 --> 00:39:58,520 Speaker 1: because you want that data to remain secure and private, 625 00:39:58,880 --> 00:40:03,240 Speaker 1: and making copies of data creates an opportunity for data breaches. 626 00:40:03,560 --> 00:40:08,439 Speaker 1: So it's a pretty delicate balance. Since CryptoLocker, we've 627 00:40:08,480 --> 00:40:11,960 Speaker 1: seen a lot of other ransomware variants out in the wild. Locky, 628 00:40:12,200 --> 00:40:16,160 Speaker 1: similar to CryptoLocker, targeted more than one hundred sixty different 629 00:40:16,200 --> 00:40:20,000 Speaker 1: file types when infecting machines, particularly those file types that 630 00:40:20,040 --> 00:40:24,719 Speaker 1: are prevalent in areas like design and engineering. WannaCry, 631 00:40:24,800 --> 00:40:28,640 Speaker 1: which made headlines in two thousand seventeen, took advantage of an exploit 632 00:40:28,719 --> 00:40:32,240 Speaker 1: in older Windows systems. Now, one juicy bit of information 633 00:40:32,239 --> 00:40:36,279 Speaker 1: about that mess was that the United States National Security 634 00:40:36,320 --> 00:40:41,000 Speaker 1: Agency, a.k.a. the NSA, had discovered this exploit, 635 00:40:41,480 --> 00:40:44,719 Speaker 1: but then they sat on it. They were quiet about it, 636 00:40:44,800 --> 00:40:48,319 Speaker 1: presumably so that the agency itself would be able to 637 00:40:48,360 --> 00:40:52,040 Speaker 1: take advantage of that exploit. Now, reporting the exploit would 638 00:40:52,040 --> 00:40:55,759 Speaker 1: have given Microsoft the chance to patch the problem, so 639 00:40:56,040 --> 00:40:58,040 Speaker 1: the NSA said nothing at all so that they could 640 00:40:58,040 --> 00:41:01,920 Speaker 1: take advantage of it. And then, predictably, hackers got hold 641 00:41:01,960 --> 00:41:05,279 Speaker 1: of that information, and so they used the exploit themselves 642 00:41:05,480 --> 00:41:09,480 Speaker 1: to craft the WannaCry ransomware, which, like CryptoLocker, 643 00:41:09,760 --> 00:41:13,760 Speaker 1: demanded payment in the form of bitcoin to return data. 644 00:41:13,840 --> 00:41:16,800 Speaker 1: And that gives me a chance not just to wag 645 00:41:16,920 --> 00:41:19,400 Speaker 1: my finger at the NSA, an organization that 646 00:41:19,480 --> 00:41:24,400 Speaker 1: has had an incredibly shady reputation, but also to explain 647 00:41:24,440 --> 00:41:28,080 Speaker 1: how this points out that a government-mandated back door 648 00:41:28,280 --> 00:41:33,319 Speaker 1: into any system is always a bad idea. Governments love 649 00:41:33,360 --> 00:41:37,920 Speaker 1: the idea because monitoring digital communication is really hard, and 650 00:41:37,960 --> 00:41:41,640 Speaker 1: sometimes parties that are in opposition to that government, whether 651 00:41:41,960 --> 00:41:46,760 Speaker 1: foreign or domestic, will use digital communications to, you know, plan stuff, 652 00:41:47,080 --> 00:41:49,480 Speaker 1: and so it would be useful to have a window 653 00:41:49,600 --> 00:41:52,120 Speaker 1: to peek through and see what's going on and prepare.
654 00:41:52,719 --> 00:41:56,640 Speaker 1: But even if you trust your government, a backdoor is 655 00:41:56,719 --> 00:42:00,680 Speaker 1: something that can potentially be exploited by anyone if 656 00:42:00,680 --> 00:42:03,000 Speaker 1: they find out about it, which was kind of the 657 00:42:03,040 --> 00:42:06,759 Speaker 1: case with WannaCry, although that was an exploit, not 658 00:42:06,960 --> 00:42:12,240 Speaker 1: an intentional back door. You do not improve national security 659 00:42:12,280 --> 00:42:17,680 Speaker 1: by making systems less secure. Anyway, WannaCry could 660 00:42:17,719 --> 00:42:22,000 Speaker 1: have been an enormous problem. But fortunately Microsoft was able 661 00:42:22,040 --> 00:42:26,239 Speaker 1: to patch the exploit quickly and data security specialists discovered 662 00:42:26,280 --> 00:42:28,880 Speaker 1: a kill switch for the ransomware and were able to 663 00:42:28,920 --> 00:42:31,520 Speaker 1: shut it down before it could really go into overdrive. 664 00:42:31,920 --> 00:42:35,040 Speaker 1: So we were lucky on that one. Other examples of 665 00:42:35,120 --> 00:42:41,400 Speaker 1: ransomware include malware names like Bad Rabbit, Ryuk, Troldesh, 666 00:42:41,560 --> 00:42:46,120 Speaker 1: GoldenEye, and GandCrab. The details vary from case 667 00:42:46,160 --> 00:42:49,960 Speaker 1: to case, but the general approach is very similar. To 668 00:42:50,040 --> 00:42:52,919 Speaker 1: close this out, I want to stress a few good 669 00:42:52,920 --> 00:42:57,480 Speaker 1: ways to prevent yourself from becoming a victim of ransomware. First, 670 00:42:57,680 --> 00:43:01,520 Speaker 1: of course, is to be on alert for suspicious messages 671 00:43:01,600 --> 00:43:05,600 Speaker 1: and files and attachments. Don't open emails from sources you 672 00:43:05,680 --> 00:43:11,080 Speaker 1: don't know. Definitely don't open unfamiliar email attachments, and back 673 00:43:11,160 --> 00:43:14,480 Speaker 1: up your data. The best defense against ransomware is just 674 00:43:14,640 --> 00:43:17,200 Speaker 1: not to install the dang stuff in the first place. 675 00:43:17,400 --> 00:43:20,000 Speaker 1: But if you do get tricked, and we all get 676 00:43:20,040 --> 00:43:24,480 Speaker 1: tricked on occasion, having that backup is key. What you 677 00:43:24,840 --> 00:43:29,319 Speaker 1: absolutely do not want to do is pay the ransom. 678 00:43:29,360 --> 00:43:32,600 Speaker 1: Every time a ransom is paid, the message is sent 679 00:43:32,640 --> 00:43:36,960 Speaker 1: out that this tactic works. This is a way to 680 00:43:37,040 --> 00:43:40,440 Speaker 1: make money. So if we send that message, we shouldn't 681 00:43:40,480 --> 00:43:44,480 Speaker 1: be surprised when we see it happen again and again, 682 00:43:44,680 --> 00:43:47,520 Speaker 1: because other people will follow suit in an effort to 683 00:43:47,520 --> 00:43:51,840 Speaker 1: make some cash. And also keep in mind, there's no 684 00:43:51,960 --> 00:43:55,160 Speaker 1: guarantee that paying the ransom will actually get you the 685 00:43:55,200 --> 00:43:58,600 Speaker 1: decryption key. There might be cases where there is no 686 00:43:58,680 --> 00:44:02,080 Speaker 1: way to recover the data, but you don't necessarily know that, 687 00:44:02,239 --> 00:44:05,040 Speaker 1: and then you pay the ransom and you never get 688 00:44:05,080 --> 00:44:10,160 Speaker 1: a cure for your problem. So paying is a terrible idea.
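Since the backup advice above is the single most practical takeaway, here is a minimal sketch of the idea: keep dated copies of your important folders somewhere the infected machine can't quietly overwrite them. The paths are hypothetical, and a real setup would add offline or offsite copies; this illustrates the versioning principle, not a full backup strategy.

```python
# simple_backup.py -- a minimal sketch of the "keep versioned backups" advice.
# Paths are hypothetical; ideally the destination is a drive or share that
# ransomware on the main machine can't reach (and that is kept offline).
import shutil
from datetime import datetime
from pathlib import Path

SOURCE = Path.home() / "Documents"   # what to protect
BACKUP_ROOT = Path("E:/backups")     # separate drive or network share

def run_backup() -> Path:
    stamp = datetime.now().strftime("%Y-%m-%d_%H%M%S")
    destination = BACKUP_ROOT / f"documents_{stamp}"
    # Each run creates a fresh, timestamped copy, so an infection today
    # doesn't silently replace yesterday's good data.
    shutil.copytree(SOURCE, destination)
    return destination

if __name__ == "__main__":
    print(f"Backed up to {run_backup()}")
```

The timestamped folder names are the point: if ransomware scrambles today's files, yesterday's copy is still intact, and you never have to weigh paying the ransom against losing everything.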
689 00:44:10,400 --> 00:44:13,960 Speaker 1: Let's all just remember the lesson we learned from that 690 00:44:14,080 --> 00:44:18,640 Speaker 1: classic film War Games. The only winning move is not 691 00:44:18,840 --> 00:44:22,640 Speaker 1: to play. And that wraps up this look at what 692 00:44:22,840 --> 00:44:26,840 Speaker 1: ransomware is and its history. It is fascinating. It is 693 00:44:26,880 --> 00:44:30,160 Speaker 1: important to be very much on alert about it, especially 694 00:44:30,760 --> 00:44:34,520 Speaker 1: right now. Again, in our pandemic, we've seen an 695 00:44:34,640 --> 00:44:39,680 Speaker 1: uptick in malware attacks and spamming attacks because we have 696 00:44:39,800 --> 00:44:44,120 Speaker 1: people who are not in centralized locations anymore. They're working 697 00:44:44,160 --> 00:44:47,480 Speaker 1: from home. Their security at home may be lower than it 698 00:44:47,560 --> 00:44:51,319 Speaker 1: is in, say, an office environment. So it's even more 699 00:44:51,360 --> 00:44:54,120 Speaker 1: important that we each do our part and we pay 700 00:44:54,239 --> 00:44:57,839 Speaker 1: very close attention, and if we do get attacked, we 701 00:44:57,880 --> 00:45:00,839 Speaker 1: should not panic. We should really carefully consider all 702 00:45:00,840 --> 00:45:04,400 Speaker 1: of our options. Sometimes just the option of waiting 703 00:45:04,480 --> 00:45:09,439 Speaker 1: works, because there are people in data security constantly trying 704 00:45:09,440 --> 00:45:13,680 Speaker 1: to build decryptor tools to reverse these kinds of attacks. 705 00:45:14,160 --> 00:45:20,719 Speaker 1: So hold on and just be alert, keep calm, and 706 00:45:20,800 --> 00:45:23,920 Speaker 1: don't install ransomware, I guess is what I'm saying. If 707 00:45:23,960 --> 00:45:26,600 Speaker 1: you guys have suggestions for topics for future episodes of 708 00:45:26,640 --> 00:45:29,400 Speaker 1: TechStuff, reach out to me on Twitter. The handle 709 00:45:29,560 --> 00:45:32,799 Speaker 1: is TechStuff HSW and I'll talk to 710 00:45:32,800 --> 00:45:41,560 Speaker 1: you again really soon. TechStuff is an I Heart 711 00:45:41,640 --> 00:45:45,399 Speaker 1: Radio production. For more podcasts from I Heart Radio, visit 712 00:45:45,440 --> 00:45:48,480 Speaker 1: the I Heart Radio app, Apple Podcasts, or wherever you 713 00:45:48,560 --> 00:45:49,920 Speaker 1: listen to your favorite shows.