Speaker 1: Get in touch with technology with TechStuff from HowStuffWorks.com. Hey there, and welcome to TechStuff. I'm your host, senior writer Jonathan Strickland for HowStuffWorks.com. It's a groovy website. You've never been there? You should go check it out. You've been listening to TechStuff all this time and didn't know there was a website? Work on your listening skills. I love you. HowStuffWorks.com. Check it out. So today I thought I'd take a look at a tech story that happened not too long ago as of the recording of this podcast. I'm recording it in May; it's publishing much later than that, but not too long ago from today. A virus emerged that really caused a lot of headaches, particularly in the UK and a lot of other countries. Not so much in the United States, but a lot of other ones. And it's called WannaCry. It's the WannaCry ransomware virus. It really became big news starting on May twelfth, 2017. That's when it went viral for the first time and spread to thousands of machines.
Speaker 1: The count ranges anywhere between two hundred thousand and four hundred thousand computers, depending upon what authority you're looking at. WannaCry was exploiting a vulnerability in a protocol used by the Windows operating system, but I'll explain all of that a little bit later. First, let's talk about what ransomware is and where it came from. So, to put it simply, ransomware is a subset of malware, and malware is short for malicious software. You might also hear it described as a computer virus. That's largely because in the early days of personal computers there were really only two major types of malware, and those were viruses and worms, and so we've often used computer virus as shorthand for malware. But there are lots and lots of different kinds of malware out there, and so using a term like virus is not as specific as most people would prefer. But what the heck is a virus and what the heck is a worm? Well, a virus is some malicious code that a programmer designs that inserts itself into another program.
Speaker 1: They're typically part of some sort of executable file, so .exe in the Windows operating system world, or DOS even. The virus does not activate until the computer runs the respective file. So you can have a computer that has a virus on it, but the virus is inactive. It is dormant because you have not yet run that file, and as long as you don't run that file, the virus will remain dormant. It will be inert. But once you run the file, it activates the virus, and the virus ends up replicating itself. Sometimes it will use other programs to spread itself to other machines. In the old days, before you had networked computers, it would essentially replicate itself over and over again in order to overwrite everything on the computer and essentially jam everything up. You couldn't save anything to the computer. Everything would be overwritten by this virus, essentially rendering your computer useless. It's pretty nasty stuff.
Speaker 1: The worm, on the other hand, is a self-propagating piece of code that does not rely on another file, and typically the programmer depends upon some sort of trick, like social engineering, to get people to execute the worm and start that self-propagation process. Now, both viruses and worms are part of a larger classification of malware, and ransomware is a specific type of malware that, as the name suggests, holds a victim's computer for ransom. It doesn't break into their house and steal it and then put a gun to the monitor and say pay up or it gets it. Otherwise you would just need a particular set of skills to go after those folks, as we learned in the documentary Taken. Typically, malware that acts as ransomware will do one of two things. The most common version on desktop machines and laptops is that it will encrypt the victim's computer. That means it will encode your computer so that none of your files will be readable, or, you know, you won't even be able to locate them, because they're all renamed.
Speaker 1: This encryption approach can end up rendering your computer useless, or at least make your information inaccessible. The goal is to get the victim to fork over some cash, and in return the hackers will decrypt the computer. They'll give whatever the password is, or the methodology to decrypt all the information and turn it back to the way it was before it was attacked. Now, there's a second variant of ransomware that doesn't encrypt a computer. Instead, what it does is lock people out of a device. This is the locker version of ransomware. It's most frequently seen on Android-based devices, so mostly mobile devices like handsets, tablets, that kind of thing. Essentially, hackers fool a victim into downloading and installing a malicious app, and the app will then activate this software that locks the victim out of their device. They won't be able to use it. It essentially bricks it until you pay up a ransom. You might get a little screen that demands payment, telling you that until you pay X amount, you won't have access to this device.
Speaker 1: So you are told that you have to pay the hackers in order to regain access to your device. In either case, ransomware is not pretty. Now, this is similar to, but distinct from, another scheme that some hackers have employed over the last few years, which is blackmail. Hacker groups like Rex Mundi have targeted large corporations with a goal of infiltrating their systems and stealing as much data as possible, including customer data. That's one of the big targets. Having that customer data is a very powerful tool. Companies do not want their customers to lose confidence in them. So if a hacker group is able to get hold of a huge amount of customer information from a company and then say, hey, if you don't pay up, we're going to release this information or we're going to sell it off, that's bad news, and it's very hard to recover as a company if you've suffered that kind of data breach. So it's similar to blackmail, but not exactly the same, because with ransomware, the hackers might not even be interested at all in what's on the computer systems they target.
Speaker 1: They don't care if it's customer information or internal systems; that doesn't matter. What they want to do is infect as many critical computers as they possibly can with ransomware, because if it's a critical device, if it's something that's very important for the operations of a larger organization or company, then that puts a huge amount of pressure on the company to pay up the ransom so they can get access to that critical hardware again. That's the whole point of ransomware. They don't care what the nature of the stuff is, as long as it's important, because they're not after the data itself; they're after money. They just want to lock down those computers as much as they can and then convince people to pay them so that they can unlock them. Now, the first recorded instance of ransomware was called the AIDS trojan, and it was designed by Joseph L. Popp, P-O-P-P.
Speaker 1: That particular attack falls under the category of the trojan horse, which is of course named after the legendary gift to the city of Troy that secretly housed invading soldiers from Greece. A trojan horse is malware that looks like a regular program. It fools someone into thinking they're using some benign piece of software, but in reality they're essentially handing over some critical part of their computer system to the whims of a hacker. So a lot of trojan horse programs these days are programs that look innocent. You run them, and then they allow a hacker to get a back door entry into your computer, usually with administrative-level control, and from there they can do lots of different things. They can lock you out of a system. They can allow you to continue using a system so that you don't know that they're even there. They can spy on what you're doing. They can even redirect your computer to send traffic to a target machine as part of a distributed denial of service attack.
Speaker 1: So this is a very common ploy that hackers will use in order to build botnets, or computer armies. Now, the AIDS trojan predates the World Wide Web, so this was not a virus that was spread over email. It wasn't spread over a compromised website. It was actually distributed on floppy disks, good old floppy disks, and they were sent over the postal service. Most of the recipients were in other parts of the world, not the United States. Here in the US, we really didn't have an issue with the AIDS trojan directly. The targeted systems were mostly in other places in the world, like Europe, Africa, Asia, that kind of thing. The targets for this attack happened to be companies and agencies that were either in education or healthcare and were concerned with educating people about the AIDS virus. The disc was posing as educational software meant to teach you about the AIDS virus, so it's pretty insidious that it took on this form.
Speaker 1: The software on the disc included an actual survey that would tell the taker what their odds were of contracting the AIDS virus based off their responses. So, for example, it might ask if you take intravenous drugs, and if so, do you share needles, that sort of thing, and as you answered, it would give you the odds of contracting the AIDS virus. So on the surface, it seemed like actual educational software. What you didn't realize as you ran this software on your computer is that in the background, code was running that would infect the computer, and after a predetermined number of reboots of the system, the software would encrypt all of your files. In other words, it set up a kind of doomsday clock, except instead of time, it counted reboots. So every time you shut down your system and turned it on, you were one step closer to activating this malware, and eventually you would hit that threshold, and the next time you turned on your computer, all of your files would get encrypted by this malware.
Speaker 1: The only thing you would see when you rebooted the system that last time would be a message that says turn on a printer. So essentially you'd have to have a printer connected to the affected computer, and when you turned it on, it would send a print command to the printer and print a sheet of paper with the instructions to pay the ransom, which is kind of interesting, a little primitive, but obviously you didn't have bitcoin or anything like that back in those days. So the ransom note would print out once the affected computer was connected to a powered-on printer. The note directed victims to send one hundred and eighty-nine dollars to a post office box located in Panama. After doing so, Popp, who of course was not identifying himself as the perpetrator, promised that he would send the decryption program to unlock the contents of the victim's computer. In the UK, where the virus was first reported, some medical institutions began to delete data rather than pay the ransom.
Speaker 1: They were worried that their systems had been totally compromised and that a hacker had access to all of that data, so as a result, they started deleting stuff, and in fact other parts of the world were following a similar strategy. The Independent reported that one organization in Italy lost a decade's worth of AIDS research as a result of this, because there was a panic that the compromised data could be otherwise changed or altered, which I guess is repetitive or redundant, but at any rate, they were worried that this vulnerability was worse than what they were already seeing. So there were people who lost years and years of work as a result of this ransomware attack. Now, I mentioned earlier that we know who made this virus. So knowing who made it, what exactly happened? How did this story unfold? It's a bit strange, to be honest. So let's give you some background on the man who programmed the virus in the first place.

Speaker 1: Joseph L. Popp had graduated with a PhD from Harvard University, and he was in the field of evolutionary biology, not a field that you would immediately associate with someone programming the world's first ransomware virus. He was actually not an enemy of AIDS research. That was his field. He was consulting with the World Health Organization in the area of AIDS research over in Kenya. So why would he design a computer program that locked away computers used by people who were trying to research AIDS and provide education for at-risk populations? Well, that depends upon whose story you believe. Story number one came from Popp's lawyer, who said that Popp's plan was to shake things up. He wanted to change the whole model of how AIDS research was conducted. He thought it was too regimented; he thought it was off base, according to the lawyer, and Popp's plan was to use the ransom money that he would get from people paying this one hundred and eighty-nine dollars a pop to fund alternative AIDS education programs.
Speaker 1: So you could argue that if this is actually the case, this was a protest against the establishment and their approach to AIDS research, and you would think of Popp as some sort of crypto activist or crypto anarchist. But the judge in the case actually disagreed with this and said that Popp wasn't even fit to stand trial, and this was because his behavior had become pretty strange and erratic. His behavior was the reason he was caught in the first place. I mean, he could have just gotten away from Europe and no one would have ever known it was him. The reason he was caught was that he was in an airport in Amsterdam and he wrote the sentence "Dr. Popp has been poisoned," which I think would make a great title for an album, but he wrote it on another passenger's suitcase. That's pretty strange already.
Speaker 1: Apparently he had been acting somewhat unusually as the stress was getting to him about trying to get out of Europe while this story about the AIDS trojan was making headlines over there. So he was feeling a lot of pressure, and according to some stories at least, he cracked. Well, the authorities saw that he was writing stuff on other people's suitcases and took him aside for questioning, and they searched his baggage, and when they did, they found evidence that he was the one behind manufacturing and distributing all those discs that had the malware on them. So, while he was awaiting trial in the UK, his behavior grew increasingly strange, and eventually Judge Geoffrey Rivlin dismissed the case because he said that Popp was unfit to stand trial. Popp was released and essentially got off scot-free. He eventually would open up a butterfly conservatory in upstate New York. So you can go see Joseph L. Popp's Butterfly Conservatory, built by the guy who created the first ransomware in the world, which is a little unusual.
Speaker 1: There is another theory about what Popp's motivations were that has nothing to do with crypto anarchist tendencies or erratic behavior. It's not nearly as grand or strange an act as all that. The theory states that Popp was actually just seeking revenge. He had been passed over for a position with the World Health Organization, so some theories say he got very upset that he wasn't picked for this job, and as a result, he designed and then unleashed the software targeting organizations that he felt he should have been taking a larger role in, but because he got passed over, he didn't have that opportunity. And he even had a digital diary that contained evidence that he had been planning this attack for more than a year and a half, so it was a premeditated act, not something that was done spontaneously, at least according to that digital diary. So there are some people who say that he was just bitter about not getting that job, and that was the motivation he had for building the first ransomware. But whatever the reason, he didn't serve any time for his crime.
Speaker 1: And his encryption scheme was relatively simple to reverse. It was symmetric encryption, and it wasn't particularly robust, so after some time, experts were able to figure out how to reverse engineer it, essentially using brute force to decrypt the affected computers. So it really wasn't as bad as it could have been, or as ransomware later would turn out to be, as future ransomware hackers would create more robust means of putting your data off limits. One other thing that Popp set into motion was this tendency for hackers who have developed ransomware to target healthcare organizations, whether it's hospitals or organizations that are related to healthcare. That's a prime target for ransomware, and the reason is that the information inside those computers is critical, literally critical, to the lives of human beings. So by targeting these very critical systems that have a high sense of urgency about the data they contain, the hackers are maximizing the chance that people will give in and pay their demands. So, two different trends that he started.
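The episode doesn't detail Popp's actual cipher, only that it was a weak symmetric scheme that fell to brute force. As a purely hypothetical illustration of why that works, here is a toy single-byte XOR cipher in Python, which an attacker can defeat by simply trying every possible key:

```python
# Toy illustration of brute-forcing a weak symmetric cipher.
# This is NOT Popp's actual scheme -- just a hypothetical single-byte
# XOR cipher, which is similarly trivial to reverse by exhaustive search.

def xor_cipher(data: bytes, key: int) -> bytes:
    """XOR every byte with one secret byte; running it twice decrypts."""
    return bytes(b ^ key for b in data)

def brute_force(ciphertext: bytes, crib: bytes):
    """Try all 256 possible keys and return the first one whose output
    contains a fragment of expected plaintext (a 'crib')."""
    for key in range(256):
        candidate = xor_cipher(ciphertext, key)
        if crib in candidate:
            return key, candidate
    return None

secret_key = 0x5A
ciphertext = xor_cipher(b"PATIENT RECORDS: do not delete", secret_key)
key, plaintext = brute_force(ciphertext, crib=b"RECORDS")
# With only 256 keys to check, recovery is effectively instant.
```

The key space here is tiny, which is the whole point: a symmetric scheme is only as strong as the number of keys an attacker must search, so the experts who reversed Popp's cipher faced a similarly tractable search.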
Speaker 1: He started the ransomware trend and he started the targeting-healthcare trend, both of which are pretty odious, I would say. But yeah, the more valuable and urgent the information is, the more likely you are to pay up when something gets locked away. Now, we'll talk more about early ransomware in just a minute, but before we jump into that, let's take a quick break to thank our sponsor. So, early ransomware attackers originally built their own encryption schemes to convert files into seemingly meaningless gibberish. So what's going on with encryption in the first place? What does that actually mean? I use the term a lot. You've probably heard it a lot, and some of you are probably very familiar with the whole concept of encryption. But in case you are not, and you're wondering, what does that even mean? I mean, I get that it's turning my files into stuff that I can't read, but what is actually happening? I thought I would give a very, very basic explanation of what encryption is.
Now, keep in mind, this is 326 00:20:12,160 --> 00:20:16,720 Speaker 1: at its most basic level. Encryption involves using a key 327 00:20:16,840 --> 00:20:19,760 Speaker 1: to encode data in a way that makes it meaningless 328 00:20:19,800 --> 00:20:24,480 Speaker 1: to an outside observer who does not also possess that key. 329 00:20:24,480 --> 00:20:27,679 Speaker 1: So this is just making codes, essentially. That's what it 330 00:20:27,680 --> 00:20:30,280 Speaker 1: boils down to. It's just using a very advanced algorithm 331 00:20:30,280 --> 00:20:33,280 Speaker 1: in order to do it, and using a huge number 332 00:20:33,400 --> 00:20:37,280 Speaker 1: of potential variations on that, so that you make 333 00:20:37,280 --> 00:20:41,640 Speaker 1: it very very difficult for people to reverse engineer the 334 00:20:41,640 --> 00:20:46,560 Speaker 1: strategy you use to encrypt the information, thus making it safe. Uh, 335 00:20:46,600 --> 00:20:49,360 Speaker 1: if you use a very simple set of rules, then 336 00:20:49,400 --> 00:20:52,280 Speaker 1: obviously your data isn't that safe. All it takes is 337 00:20:52,400 --> 00:20:55,320 Speaker 1: someone to notice what the rules are and then they 338 00:20:55,320 --> 00:20:59,400 Speaker 1: can reverse it that way. So if you've ever used 339 00:20:59,400 --> 00:21:03,720 Speaker 1: a substitution cipher, you've experimented with an extremely 340 00:21:03,760 --> 00:21:06,600 Speaker 1: simple version of encryption. So you might decide with a 341 00:21:06,600 --> 00:21:09,159 Speaker 1: buddy that you're going to shift the meaning of all the 342 00:21:09,280 --> 00:21:12,880 Speaker 1: letters one over from their actual place in the alphabet, 343 00:21:13,280 --> 00:21:15,800 Speaker 1: so that when you write a message in code 344 00:21:15,960 --> 00:21:19,080 Speaker 1: to your buddy, a B is an A, and a 345 00:21:19,200 --> 00:21:21,400 Speaker 1: C is a B, and so on and so forth.
346 00:21:21,400 --> 00:21:25,919 Speaker 1: That's a very simple one-shift substitution cipher. When you 347 00:21:26,000 --> 00:21:29,159 Speaker 1: receive a message, you use that key, which in this 348 00:21:29,200 --> 00:21:32,680 Speaker 1: case is just that very simple rule, to decode the message, 349 00:21:32,720 --> 00:21:34,479 Speaker 1: and then you read it, and then later that night 350 00:21:34,520 --> 00:21:37,439 Speaker 1: you'll probably TP someone's home, because that's the kind of 351 00:21:37,440 --> 00:21:39,200 Speaker 1: thing we kids used to do before there 352 00:21:39,240 --> 00:21:44,119 Speaker 1: was an Internet and Nintendo Switches and whatnot. Obviously, computers 353 00:21:44,119 --> 00:21:47,520 Speaker 1: are using much more robust encryption techniques than a simple 354 00:21:47,560 --> 00:21:50,399 Speaker 1: substitution cipher. The goal is to create a method of 355 00:21:50,480 --> 00:21:53,080 Speaker 1: encryption that is so sophisticated that it would take someone 356 00:21:53,600 --> 00:21:57,480 Speaker 1: years or even decades before they could decrypt the information 357 00:21:57,600 --> 00:22:00,800 Speaker 1: without the use of a key, in other words, to use 358 00:22:00,880 --> 00:22:05,800 Speaker 1: brute force. Brute force is essentially when you just tell a computer, 359 00:22:06,400 --> 00:22:09,480 Speaker 1: I want you to work through every variation of this 360 00:22:10,000 --> 00:22:14,640 Speaker 1: particular approach until you find the one that works. And 361 00:22:14,920 --> 00:22:19,000 Speaker 1: the more approaches there are, the longer that will take 362 00:22:19,040 --> 00:22:23,600 Speaker 1: a computer to accomplish.
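That one-shift rule can be sketched in a few lines of Python. This is a toy illustration of the substitution cipher described above, not real encryption; the message text is made up for the example.

```python
# A toy one-shift substitution cipher: each letter is replaced by the
# letter one position later in the alphabet, wrapping Z back around to A.

def shift_encode(message: str, shift: int = 1) -> str:
    """Encode by shifting each letter `shift` places forward."""
    result = []
    for ch in message:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            result.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            result.append(ch)  # leave spaces and punctuation alone
    return ''.join(result)

def shift_decode(message: str, shift: int = 1) -> str:
    """The 'key' is just the rule itself: shift back the same amount."""
    return shift_encode(message, -shift)

secret = shift_encode("MEET AT THE TREEHOUSE")
print(secret)                # NFFU BU UIF USFFIPVTF
print(shift_decode(secret))  # MEET AT THE TREEHOUSE
```

As the episode notes, all it takes is someone noticing the rule, so this scheme offers no real protection.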
So your goal is to make 363 00:22:23,760 --> 00:22:28,119 Speaker 1: the encryption process difficult enough so that a computer doesn't 364 00:22:28,160 --> 00:22:31,080 Speaker 1: have any hope of solving it by brute force in 365 00:22:31,160 --> 00:22:35,320 Speaker 1: any reasonable amount of time. The earliest forms of computer 366 00:22:35,400 --> 00:22:38,199 Speaker 1: encryption used a fifty six bit key. Now remember, a 367 00:22:38,240 --> 00:22:41,280 Speaker 1: bit is a single unit of information. It is either 368 00:22:41,400 --> 00:22:44,760 Speaker 1: a zero or a one. So if you have fifty 369 00:22:44,800 --> 00:22:49,600 Speaker 1: six bits, how many different combinations will that get you? 370 00:22:50,160 --> 00:22:55,240 Speaker 1: The answer is, it's around seventy quadrillion possible combinations. That 371 00:22:55,280 --> 00:22:58,200 Speaker 1: sounds like a lot, seventy quadrillion, but as it turns out, 372 00:22:58,359 --> 00:23:03,720 Speaker 1: modern computers can brute force that fairly quickly, quickly 373 00:23:03,760 --> 00:23:07,040 Speaker 1: being a relative term. But it's not impossible to use 374 00:23:07,080 --> 00:23:09,720 Speaker 1: brute force and break that kind of encryption, so it's 375 00:23:09,800 --> 00:23:13,680 Speaker 1: not safe. So today you would use a much higher 376 00:23:14,320 --> 00:23:17,960 Speaker 1: uh, bit count for your encryption, like two fifty six bit encryption, 377 00:23:18,359 --> 00:23:26,560 Speaker 1: which gives you way more potential combinations, exponentially more combinations. 378 00:23:26,960 --> 00:23:30,600 Speaker 1: So to decrypt without brute force, if you're not going 379 00:23:30,640 --> 00:23:34,000 Speaker 1: to try and just force all those different variations through, 380 00:23:34,520 --> 00:23:37,560 Speaker 1: you need that key. The key is like a secret 381 00:23:37,640 --> 00:23:40,919 Speaker 1: decoder ring.
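The gap between a fifty six bit and a two fifty six bit keyspace is easy to get a feel for with a quick back-of-the-envelope calculation. The one-billion-guesses-per-second rate below is an assumed figure purely for illustration:

```python
# Every added key bit doubles the number of possible keys an attacker
# would have to try in a brute-force search.

keys_56 = 2 ** 56    # the old 56-bit keyspace mentioned above
keys_256 = 2 ** 256  # a modern 256-bit keyspace

print(f"56-bit keys:  {keys_56:,}")     # 72,057,594,037,927,936 (~72 quadrillion)
print(f"256-bit keys: {keys_256:.3e}")  # on the order of 10**77

# Assumed attacker speed: one billion key guesses per second.
guesses_per_second = 1_000_000_000
seconds_per_year = 60 * 60 * 24 * 365

print(f"56-bit worst case:  {keys_56 / guesses_per_second / seconds_per_year:.1f} years")
print(f"256-bit worst case: {keys_256 / guesses_per_second / seconds_per_year:.2e} years")
```

At that assumed rate, a 56-bit keyspace falls in a couple of years on a single machine, while a 256-bit keyspace would take on the order of 10 to the 60th years, which is why longer keys put brute force out of reach.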
So if you get hit with ransomware, what 382 00:23:40,960 --> 00:23:44,480 Speaker 1: the hackers are actually offering you is the decryption key. 383 00:23:44,560 --> 00:23:47,679 Speaker 1: In exchange for money you pay them, they give you 384 00:23:47,720 --> 00:23:50,760 Speaker 1: the super secret decoder ring, so you can decode 385 00:23:50,760 --> 00:23:53,040 Speaker 1: all that stuff that's on your computer and you can 386 00:23:53,119 --> 00:23:56,720 Speaker 1: use it again. These days, the money is typically demanded 387 00:23:56,720 --> 00:24:00,080 Speaker 1: in the form of digital currency like bitcoin, or in 388 00:24:00,240 --> 00:24:05,119 Speaker 1: prepaid cards like MoneyPak, which, by the way, in 389 00:24:05,200 --> 00:24:08,639 Speaker 1: one of the stories I was reading was misspelled with 390 00:24:08,680 --> 00:24:12,480 Speaker 1: a typo calling it monkey pack, and I wish it 391 00:24:12,600 --> 00:24:16,960 Speaker 1: was monkey pack, but Monkey Pack is a brand of backpacks. 392 00:24:17,000 --> 00:24:21,040 Speaker 1: It is not a method of cash transfer. Unless you 393 00:24:21,080 --> 00:24:23,399 Speaker 1: were to stuff a monkey pack full of money and 394 00:24:23,400 --> 00:24:26,120 Speaker 1: then hand it to someone, then technically it is cash transfer. 395 00:24:26,480 --> 00:24:29,280 Speaker 1: But I'm pretty sure that the author of the 396 00:24:29,359 --> 00:24:37,200 Speaker 1: article meant MoneyPak. More's the pity.
So using Bitcoin 397 00:24:37,359 --> 00:24:42,320 Speaker 1: or these prepaid options allows hackers to maintain their anonymity, 398 00:24:42,520 --> 00:24:45,200 Speaker 1: as opposed to giving you an address, like a physical 399 00:24:45,240 --> 00:24:48,160 Speaker 1: address to send money to, which you know you could 400 00:24:48,160 --> 00:24:50,639 Speaker 1: just hand over to authorities who would then stake it 401 00:24:50,680 --> 00:24:53,320 Speaker 1: out and try and catch the people who are responsible. 402 00:24:53,840 --> 00:24:56,160 Speaker 1: Using the digital approach, it's a lot harder to do that. 403 00:24:57,000 --> 00:25:01,200 Speaker 1: Ransomware has become a more popular method to attack computers, 404 00:25:01,200 --> 00:25:03,600 Speaker 1: and it really took off once the World Wide Web 405 00:25:03,680 --> 00:25:06,960 Speaker 1: matured and upon the launch of the smartphone industry as well. 406 00:25:07,560 --> 00:25:12,320 Speaker 1: The Internet Crime Complaint Center, or IC3, says 407 00:25:12,400 --> 00:25:15,680 Speaker 1: that between two thousand five and two thousand sixteen they 408 00:25:15,680 --> 00:25:20,200 Speaker 1: received reports of more than seven thousand, six hundred ransomware attacks, 409 00:25:20,320 --> 00:25:23,640 Speaker 1: and by comparison, the IC3 says it received 410 00:25:23,680 --> 00:25:27,800 Speaker 1: more than six thousand reports of data breaches, so ransomware 411 00:25:27,840 --> 00:25:32,040 Speaker 1: attacks actually outnumber data breaches. The information you tend to see 412 00:25:32,080 --> 00:25:35,000 Speaker 1: in the US, at least, you see these big stories 413 00:25:35,040 --> 00:25:38,760 Speaker 1: about companies that had their systems compromised and people stole 414 00:25:38,800 --> 00:25:41,320 Speaker 1: a lot of information. That's a data breach.
The big 415 00:25:41,400 --> 00:25:43,679 Speaker 1: Sony data breach from a few years ago is a 416 00:25:43,720 --> 00:25:47,040 Speaker 1: great example. Um, not that it's great, but it serves 417 00:25:47,080 --> 00:25:51,760 Speaker 1: as a great example. Ransomware actually happens way more frequently 418 00:25:51,880 --> 00:25:54,879 Speaker 1: than those big data breaches because again, you don't have 419 00:25:54,960 --> 00:25:58,600 Speaker 1: to care about what information is in the system. You 420 00:25:58,720 --> 00:26:01,679 Speaker 1: just want to make it unreachable. So all you have 421 00:26:01,760 --> 00:26:06,840 Speaker 1: to do is fool someone into executing some malicious code, 422 00:26:07,560 --> 00:26:10,640 Speaker 1: and depending upon the nature of the malware, you might 423 00:26:10,640 --> 00:26:13,320 Speaker 1: be able to infect an entire system just through one 424 00:26:13,359 --> 00:26:16,520 Speaker 1: point of entry. You don't have to try and navigate 425 00:26:16,920 --> 00:26:21,040 Speaker 1: a complex and potentially very secure system of computers in 426 00:26:21,119 --> 00:26:24,960 Speaker 1: order to look for specific information, because again, you don't 427 00:26:25,000 --> 00:26:27,320 Speaker 1: care what the information is. You just want them to 428 00:26:27,359 --> 00:26:32,960 Speaker 1: have no access to it. Now, in the mid two thousands, 429 00:26:33,160 --> 00:26:35,919 Speaker 1: there were a lot of different types of malware in 430 00:26:35,960 --> 00:26:42,680 Speaker 1: the ransomware category that debuted. That included stuff like GPcode, Archiveus, Krotten, 431 00:26:42,960 --> 00:26:47,679 Speaker 1: Cryzip, MayArchive, and Troj.Ransom.A, 432 00:26:48,480 --> 00:26:51,320 Speaker 1: and these were using tougher algorithms that were harder to crack.
433 00:26:51,720 --> 00:26:54,480 Speaker 1: Archiveus was one of the first, and it used RSA 434 00:26:54,640 --> 00:26:58,440 Speaker 1: encryption and demanded that users visit specific websites to 435 00:26:58,520 --> 00:27:01,840 Speaker 1: make purchases in order to buy a password to remove 436 00:27:01,840 --> 00:27:04,720 Speaker 1: the lock on their files. So you would get a 437 00:27:04,760 --> 00:27:07,720 Speaker 1: message saying you need to go to this pharmacy's website 438 00:27:07,960 --> 00:27:11,040 Speaker 1: and you need to buy x amount of drugs from 439 00:27:11,040 --> 00:27:14,280 Speaker 1: this pharmacy, and after you do, we'll give you the password. 440 00:27:15,400 --> 00:27:19,680 Speaker 1: Pretty aggressive marketing scheme for that pharmacy, if you were 441 00:27:19,680 --> 00:27:23,160 Speaker 1: to ask me. Obviously it was a front for these hackers, 442 00:27:24,359 --> 00:27:28,560 Speaker 1: but pretty nasty stuff. And more and more frequently, hackers 443 00:27:28,600 --> 00:27:31,320 Speaker 1: began to use off the shelf solutions as time went on. 444 00:27:31,520 --> 00:27:34,359 Speaker 1: Rather than build their own encryption codes, they began to 445 00:27:34,480 --> 00:27:37,399 Speaker 1: use stuff that a couple of people had developed and 446 00:27:37,440 --> 00:27:40,199 Speaker 1: then had released out into the wild for others to 447 00:27:40,320 --> 00:27:45,840 Speaker 1: use at their own discretion. So this did two things. 448 00:27:45,960 --> 00:27:50,119 Speaker 1: It increased the sophistication of the encryption algorithms that the 449 00:27:50,200 --> 00:27:54,480 Speaker 1: hackers were using, and it lowered the barrier to entry 450 00:27:54,720 --> 00:27:57,800 Speaker 1: for hackers to the point where, if you are willing 451 00:27:57,840 --> 00:28:01,800 Speaker 1: to pay the money, you can get very simple hacker 452 00:28:02,000 --> 00:28:06,879 Speaker 1: toolkits that are easy to run.
Like, they are 453 00:28:06,920 --> 00:28:09,320 Speaker 1: made to be user friendly for the hacker, 454 00:28:10,200 --> 00:28:12,680 Speaker 1: um, and you don't have to know how they work. 455 00:28:13,080 --> 00:28:15,560 Speaker 1: You just have to use them. It's like using any 456 00:28:15,600 --> 00:28:18,040 Speaker 1: other program on a computer. You don't have to know 457 00:28:18,080 --> 00:28:20,800 Speaker 1: how it works in order for it to work. And 458 00:28:20,880 --> 00:28:23,640 Speaker 1: that makes it much more dangerous, because it suddenly makes 459 00:28:23,760 --> 00:28:27,760 Speaker 1: ransomware a more viable option for a larger group of 460 00:28:27,760 --> 00:28:31,760 Speaker 1: people and thus puts more computers at risk. It's a 461 00:28:31,840 --> 00:28:40,680 Speaker 1: pretty ugly cycle. So you also saw websites begin to 462 00:28:41,600 --> 00:28:47,719 Speaker 1: get compromised, and that became an issue too. Um, and 463 00:28:47,760 --> 00:28:51,880 Speaker 1: you also started to see malware that would copy notifications 464 00:28:51,920 --> 00:28:55,880 Speaker 1: from trusted sources to fool people into installing malicious software. 465 00:28:56,240 --> 00:28:59,200 Speaker 1: So you've probably encountered something like this in the past. 466 00:28:59,400 --> 00:29:03,120 Speaker 1: You may have gone to a website that was not secure, 467 00:29:03,200 --> 00:29:06,360 Speaker 1: it was maybe a compromised website, and you might get 468 00:29:06,360 --> 00:29:08,640 Speaker 1: a pop up window that says, hey, you need to 469 00:29:08,760 --> 00:29:12,600 Speaker 1: update your Flash so that you can watch this content, 470 00:29:12,800 --> 00:29:16,600 Speaker 1: or you might get a notification saying, hey, the FBI 471 00:29:16,720 --> 00:29:18,920 Speaker 1: is looking at you right now, so you need to 472 00:29:18,960 --> 00:29:22,600 Speaker 1: follow this link.
But in general, these are 473 00:29:22,640 --> 00:29:26,520 Speaker 1: not legitimate things. These are actually phishing attempts to try 474 00:29:26,600 --> 00:29:28,880 Speaker 1: and get you to click on stuff to download and 475 00:29:28,880 --> 00:29:32,400 Speaker 1: install the malware so that you compromise your own computer. 476 00:29:33,400 --> 00:29:36,640 Speaker 1: So don't do that, and don't go to that website anymore. 477 00:29:36,760 --> 00:29:39,920 Speaker 1: It's been compromised. It is not a nice place for 478 00:29:39,920 --> 00:29:43,640 Speaker 1: you to go visit. Go outside, get some fresh air, 479 00:29:44,720 --> 00:29:47,400 Speaker 1: or if it's on your phone, turn your phone off. 480 00:29:48,320 --> 00:29:52,840 Speaker 1: You know, just be careful. Over time, the demands from 481 00:29:52,840 --> 00:29:55,840 Speaker 1: hackers have increased, as well as the sophistication of the 482 00:29:55,880 --> 00:29:59,120 Speaker 1: hacking programs. In the mid two thousands, the typical demand 483 00:29:59,240 --> 00:30:03,560 Speaker 1: for payment was hovering somewhere around three hundred dollars, typically 484 00:30:03,600 --> 00:30:06,400 Speaker 1: between two hundred and four hundred bucks. And this is 485 00:30:06,400 --> 00:30:08,600 Speaker 1: where the economies of scale come into play. Three 486 00:30:08,640 --> 00:30:12,560 Speaker 1: hundred dollars in the grand scheme of things is not 487 00:30:12,760 --> 00:30:17,800 Speaker 1: that much money. Now, it's not cheap. Three hundred dollars is significant. 488 00:30:17,840 --> 00:30:20,360 Speaker 1: I mean, I'm not gonna just drop three hundred bucks 489 00:30:20,400 --> 00:30:22,520 Speaker 1: and walk away without a care in the world. 490 00:30:22,920 --> 00:30:25,880 Speaker 1: It's a significant amount of money, but it's not an 491 00:30:26,080 --> 00:30:28,560 Speaker 1: enormous ransom.
It's not like the sort of stuff you 492 00:30:28,600 --> 00:30:31,880 Speaker 1: see in movies where a character gets kidnapped and then 493 00:30:32,280 --> 00:30:36,640 Speaker 1: the kidnappers demand a million dollars in ransom money. 494 00:30:36,800 --> 00:30:39,960 Speaker 1: It's three hundred bucks. However, you also have to remember 495 00:30:40,160 --> 00:30:44,400 Speaker 1: that ransomware, typically, if it's being really successful, is infecting 496 00:30:44,960 --> 00:30:49,400 Speaker 1: hundreds or thousands of computers at three hundred bucks a pop. 497 00:30:49,480 --> 00:30:53,520 Speaker 1: Assuming that people are playing ball, that ends up adding 498 00:30:53,600 --> 00:30:58,640 Speaker 1: up pretty quickly, so it ends up being uh an 499 00:30:58,680 --> 00:31:01,719 Speaker 1: effective way to extort people out of money. Today, the 500 00:31:01,760 --> 00:31:05,400 Speaker 1: price is closer to five hundred dollars on average, so it's 501 00:31:05,400 --> 00:31:08,120 Speaker 1: gone up. It's no longer around three hundred, it's around five hundred. 502 00:31:09,040 --> 00:31:12,840 Speaker 1: And again, just through sheer numbers alone, you can see 503 00:31:12,880 --> 00:31:15,520 Speaker 1: the potential for hackers to make lots of money using 504 00:31:15,560 --> 00:31:20,680 Speaker 1: this methodology. And also, a lot of the software today 505 00:31:21,080 --> 00:31:23,959 Speaker 1: comes along with a deadline, so it's not just that 506 00:31:24,000 --> 00:31:27,160 Speaker 1: your information is locked away, but that you have a 507 00:31:27,200 --> 00:31:31,800 Speaker 1: limited amount of time before uh something worse happens to you. 508 00:31:31,840 --> 00:31:33,960 Speaker 1: So you've gotta like pay up before the end of 509 00:31:33,960 --> 00:31:38,640 Speaker 1: the month, or we'll start deleting your information.
We'll start 510 00:31:38,640 --> 00:31:40,760 Speaker 1: deleting your files, so that not only are they not 511 00:31:40,840 --> 00:31:44,240 Speaker 1: accessible to you now, you'll never be able to access 512 00:31:44,280 --> 00:31:49,080 Speaker 1: them again, because we're gonna completely delete and overwrite them. 513 00:31:49,160 --> 00:31:52,200 Speaker 1: So it becomes that kind of level of extortion. You know, 514 00:31:53,000 --> 00:31:56,120 Speaker 1: you've got a nice, uh, database there. It sure would 515 00:31:56,160 --> 00:31:58,280 Speaker 1: be a shame if someone encrypted it and 516 00:31:58,280 --> 00:32:03,000 Speaker 1: then started deleting it piece by piece. That's the sort 517 00:32:03,000 --> 00:32:06,760 Speaker 1: of message that the hackers are sending. So it's definitely 518 00:32:06,800 --> 00:32:16,800 Speaker 1: gotten more sophisticated, more expensive, and more um malicious over time. However, 519 00:32:16,880 --> 00:32:20,240 Speaker 1: ransomware does tend to change very quickly. You don't tend 520 00:32:20,280 --> 00:32:24,600 Speaker 1: to see one type of ransomware dominate for longer than, 521 00:32:24,680 --> 00:32:28,280 Speaker 1: say, a year or so. Kaspersky Labs, which is a 522 00:32:29,040 --> 00:32:33,520 Speaker 1: computer security company, reported that the most prominent ransomware between 523 00:32:33,560 --> 00:32:36,160 Speaker 1: two thousand fourteen and two thousand fifteen was a 524 00:32:36,200 --> 00:32:39,760 Speaker 1: program called CryptoWall, which accounted for more than half 525 00:32:39,800 --> 00:32:42,840 Speaker 1: of all the ransomware examples found in the wild.
Something 526 00:32:42,880 --> 00:32:45,600 Speaker 1: like fifty eight percent of all ransomware was CryptoWall 527 00:32:45,800 --> 00:32:49,080 Speaker 1: or some variation of CryptoWall, and according to the FBI, 528 00:32:49,200 --> 00:32:52,720 Speaker 1: the hackers behind CryptoWall made eighteen million dollars from 529 00:32:52,720 --> 00:32:56,160 Speaker 1: their victims. And CryptoWall was one of the earliest 530 00:32:56,160 --> 00:32:59,320 Speaker 1: types of ransomware to spread over compromised websites. Earlier 531 00:32:59,400 --> 00:33:05,200 Speaker 1: ransomware relied on other methodologies for distribution, but CryptoWall 532 00:33:05,240 --> 00:33:10,960 Speaker 1: went through compromised websites and email attachments and affected 533 00:33:10,960 --> 00:33:14,080 Speaker 1: a lot of targeted computers. It used a two hundred 534 00:33:14,120 --> 00:33:17,760 Speaker 1: fifty six bit key to encrypt specific types of files, 535 00:33:17,760 --> 00:33:20,680 Speaker 1: so it would look for files that had uh specific 536 00:33:20,680 --> 00:33:24,800 Speaker 1: extensions, like a dot doc file, a 537 00:33:24,880 --> 00:33:27,600 Speaker 1: document file. It would look for those sorts of files 538 00:33:27,680 --> 00:33:30,800 Speaker 1: and encrypt them using this two hundred fifty six bit key. 539 00:33:30,920 --> 00:33:33,760 Speaker 1: Then it would use a two thousand forty eight bit 540 00:33:34,000 --> 00:33:36,800 Speaker 1: RSA key to encrypt the two fifty six 541 00:33:36,880 --> 00:33:40,960 Speaker 1: bit key. This double encryption made it much more difficult 542 00:33:41,000 --> 00:33:44,520 Speaker 1: for you to figure out how to reverse the process.
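That double-layer idea, a symmetric key that encrypts the files and is itself wrapped by an RSA public key, can be sketched with deliberately toy stand-ins. A real scheme of this kind would use something like AES-256 and 2048-bit RSA; the one-byte XOR "cipher" and tiny textbook RSA primes below are illustrative only, and the sample data is made up.

```python
import secrets

# Toy symmetric layer: XOR with a random key (stands in for a 256-bit cipher).
def xor_crypt(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Toy RSA layer with tiny primes (stands in for 2048-bit RSA).
p, q = 61, 53
n = p * q                # public modulus
e = 17                   # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)      # private exponent via modular inverse (Python 3.8+)

file_data = b"important files"
sym_key = secrets.token_bytes(1)  # one byte, so it fits under the tiny modulus

# Encrypt the data with the symmetric key, then wrap that key with RSA.
ciphertext = xor_crypt(file_data, sym_key)
wrapped_key = pow(sym_key[0], e, n)

# Only the holder of the private exponent d can unwrap the key.
recovered_key = bytes([pow(wrapped_key, d, n)])
print(xor_crypt(ciphertext, recovered_key))  # b'important files'
```

The point of the structure is the same as in the episode: even if you find the symmetric ciphertext, the key you need is itself locked behind the RSA private key that only the attacker holds.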
543 00:33:44,560 --> 00:33:47,120 Speaker 1: But the following year saw CryptoWall reduced to just 544 00:33:47,320 --> 00:33:50,760 Speaker 1: five point two one percent of all ransomware, so it went from 545 00:33:50,760 --> 00:33:53,480 Speaker 1: fifty eight percent to five point two one percent in 546 00:33:53,520 --> 00:33:56,800 Speaker 1: the span of one year. The new heavy hitter was 547 00:33:56,840 --> 00:34:00,000 Speaker 1: a piece of software called TeslaCrypt, and the hackers 548 00:34:00,040 --> 00:34:03,600 Speaker 1: behind that malware frequently demanded their ransoms in Bitcoin and 549 00:34:03,680 --> 00:34:09,000 Speaker 1: other forms of digital payment. Ransomware attackers continued to aim 550 00:34:09,000 --> 00:34:11,799 Speaker 1: at the healthcare industry for the reasons I mentioned earlier. 551 00:34:12,080 --> 00:34:15,359 Speaker 1: Hospitals have been affected by various types of ransomware. 552 00:34:15,480 --> 00:34:19,960 Speaker 1: Some of them include Los Angeles' Hollywood Presbyterian Medical Center, 553 00:34:20,640 --> 00:34:24,880 Speaker 1: the Los Angeles County Department of Health Services, Ottawa Hospital, 554 00:34:25,600 --> 00:34:28,719 Speaker 1: Kentucky's Methodist Hospital, and lots and lots of others. A 555 00:34:28,760 --> 00:34:32,000 Speaker 1: ton of them are in California. In fact, in some cases, 556 00:34:32,440 --> 00:34:35,200 Speaker 1: hospitals paid the ransom in order to regain control of and 557 00:34:35,320 --> 00:34:38,439 Speaker 1: access to their systems, but in other cases, savvy tech 558 00:34:38,480 --> 00:34:43,000 Speaker 1: professionals were helping to quarantine affected computers, to disconnect them 559 00:34:43,000 --> 00:34:45,920 Speaker 1: from the network so that they wouldn't spread the malware 560 00:34:46,080 --> 00:34:50,120 Speaker 1: further into the system.
And then they worked, uh, 561 00:34:50,160 --> 00:34:55,720 Speaker 1: to reboot the systems using old backups. So, essentially, going 562 00:34:55,760 --> 00:34:59,799 Speaker 1: to the backup files, and you know, you lose some 563 00:35:00,000 --> 00:35:03,080 Speaker 1: stuff, because chances are you generated some data since the 564 00:35:03,200 --> 00:35:06,560 Speaker 1: last backup, but it meant that they got back these 565 00:35:06,640 --> 00:35:09,440 Speaker 1: systems, uh, and didn't have to pay the ransom in 566 00:35:09,480 --> 00:35:13,640 Speaker 1: several cases. Now, sometimes hackers have a real flair for 567 00:35:13,680 --> 00:35:18,640 Speaker 1: the dramatic. There's the team that's behind the Jigsaw ransomware, 568 00:35:19,880 --> 00:35:23,560 Speaker 1: Jigsaw taking its name from the villain in the Saw 569 00:35:23,880 --> 00:35:27,600 Speaker 1: series of films. The malware not only locked the victim's computer, 570 00:35:28,080 --> 00:35:31,120 Speaker 1: but displayed an image of the puppet that was used 571 00:35:31,160 --> 00:35:35,560 Speaker 1: by Jigsaw, Billy the puppet from the Saw series. And 572 00:35:35,600 --> 00:35:38,839 Speaker 1: there was a message there that would state that, rather 573 00:35:38,880 --> 00:35:42,759 Speaker 1: than just a regular deadline, Jigsaw would delete files as 574 00:35:42,800 --> 00:35:45,520 Speaker 1: time passed, like every hour that passed would mean more 575 00:35:45,560 --> 00:35:49,640 Speaker 1: files deleted. So the longer you waited, the more information 576 00:35:49,640 --> 00:35:52,440 Speaker 1: you would lose. That gave that sense of urgency to 577 00:35:52,560 --> 00:35:57,280 Speaker 1: pay off the hackers.
And also, if you turned 578 00:35:57,360 --> 00:36:01,320 Speaker 1: off your computer, it was even worse, really, because the 579 00:36:01,360 --> 00:36:04,479 Speaker 1: next time you booted your computer, one thousand files would 580 00:36:04,520 --> 00:36:07,360 Speaker 1: be deleted from your computer. It was an incentive to 581 00:36:07,560 --> 00:36:11,560 Speaker 1: not turn your system off, um, because once you turned 582 00:36:11,600 --> 00:36:14,040 Speaker 1: it on again, you would lose a thousand times what 583 00:36:14,160 --> 00:36:18,800 Speaker 1: you would lose every hour. It's pretty evil. By twenty fourteen, 584 00:36:19,280 --> 00:36:23,000 Speaker 1: hackers were designing locker based ransomware for Android systems, and 585 00:36:23,040 --> 00:36:26,840 Speaker 1: one of those was Svpeng, which used fake Adobe Flash 586 00:36:26,960 --> 00:36:30,160 Speaker 1: update messages to convince users to install the malware that 587 00:36:30,200 --> 00:36:33,160 Speaker 1: would lock you out of your Android device until you 588 00:36:33,239 --> 00:36:37,000 Speaker 1: paid a two hundred dollar ransom using MoneyPak. Those 589 00:36:37,000 --> 00:36:40,359 Speaker 1: are those prepaid charge cards. So what happened is, when 590 00:36:40,360 --> 00:36:43,160 Speaker 1: you tried to activate your phone, instead of getting the 591 00:36:43,320 --> 00:36:46,239 Speaker 1: screen to unlock your phone, you got a message saying 592 00:36:46,280 --> 00:36:48,680 Speaker 1: you had to pay this amount of money uh in 593 00:36:48,800 --> 00:36:54,480 Speaker 1: MoneyPak to this particular account, or you would not 594 00:36:54,640 --> 00:36:58,400 Speaker 1: get access to your phone again.
A similar piece of 595 00:36:58,520 --> 00:37:03,239 Speaker 1: ransomware was called Koler, K O L E R, or 596 00:37:03,320 --> 00:37:06,279 Speaker 1: Color if you prefer, which claimed that the holder of 597 00:37:06,320 --> 00:37:09,600 Speaker 1: the phone was being investigated by law enforcement and then 598 00:37:09,640 --> 00:37:12,440 Speaker 1: they were being fined as a result. So this is 599 00:37:12,440 --> 00:37:15,440 Speaker 1: playing on people's fear, right? Like, if you send them 600 00:37:15,480 --> 00:37:18,600 Speaker 1: a message saying, hey, you're in trouble and unless you 601 00:37:19,000 --> 00:37:22,360 Speaker 1: follow this link, you're gonna go to jail, that gives 602 00:37:22,360 --> 00:37:25,279 Speaker 1: people a big incentive to try and figure out what's 603 00:37:25,320 --> 00:37:26,879 Speaker 1: going on. A lot of people are going to click 604 00:37:26,920 --> 00:37:30,960 Speaker 1: that link, not thinking that, hey, the FBI probably doesn't 605 00:37:30,960 --> 00:37:33,680 Speaker 1: reach out through websites to let you know that you're 606 00:37:33,719 --> 00:37:36,840 Speaker 1: in trouble. They probably come door to door for that 607 00:37:36,920 --> 00:37:40,040 Speaker 1: kind of thing. But uh, it's the sort of thing 608 00:37:40,040 --> 00:37:42,759 Speaker 1: that's meant to instill panic. And when we panic, we 609 00:37:42,840 --> 00:37:45,839 Speaker 1: make bad decisions. We make very quick decisions. We don't 610 00:37:46,200 --> 00:37:49,320 Speaker 1: think. We don't use critical thinking. So that's the whole 611 00:37:50,360 --> 00:37:56,279 Speaker 1: method of attack in this type of ransomware. So this 612 00:37:56,320 --> 00:37:58,839 Speaker 1: one also added a nasty additional kick. It was a 613 00:37:58,880 --> 00:38:01,120 Speaker 1: locker worm type of malware that would then send 614 00:38:01,160 --> 00:38:04,360 Speaker 1: messages to anyone in the contact list of a compromised device.
615 00:38:04,880 --> 00:38:07,279 Speaker 1: So if you got me with that, if you send 616 00:38:07,320 --> 00:38:10,120 Speaker 1: me a message saying, hey, we're the FBI and your 617 00:38:10,160 --> 00:38:12,960 Speaker 1: totes in trouble brow and I fell for it and 618 00:38:13,000 --> 00:38:16,080 Speaker 1: I clicked on it, then it would the malware would 619 00:38:16,080 --> 00:38:17,880 Speaker 1: not only lock me on my phone, it would go 620 00:38:18,000 --> 00:38:20,400 Speaker 1: through my contact list and send a message out to 621 00:38:20,560 --> 00:38:24,680 Speaker 1: everyone in my contact list with a similar message in 622 00:38:24,719 --> 00:38:28,000 Speaker 1: the hopes of catching even more people. So this way 623 00:38:28,040 --> 00:38:31,040 Speaker 1: you allow the virus to propagate across the network. All 624 00:38:31,040 --> 00:38:32,680 Speaker 1: you have to do is in fact a couple of 625 00:38:33,160 --> 00:38:35,960 Speaker 1: well connected people, and chances are you're going to see 626 00:38:36,000 --> 00:38:39,320 Speaker 1: a lot more infected devices as a resultant that becomes 627 00:38:39,320 --> 00:38:42,719 Speaker 1: like a ripple effect that keeps moving out from the source. Uh, 628 00:38:43,440 --> 00:38:46,480 Speaker 1: people who are savvy to it will ignore it, but 629 00:38:47,320 --> 00:38:51,240 Speaker 1: that doesn't help all the people who don't ignore it. 630 00:38:51,239 --> 00:38:55,000 Speaker 1: It's pretty nasty stuff though. By two thousand fifteen, enterprising 631 00:38:55,000 --> 00:38:58,799 Speaker 1: programmers began to create ransomware as a service or are 632 00:38:58,960 --> 00:39:01,480 Speaker 1: a a s now. These were the people who had 633 00:39:01,480 --> 00:39:04,879 Speaker 1: designed the tools that other folks would actually use. 
So 634 00:39:05,120 --> 00:39:09,160 Speaker 1: you might have programmers who have no desire to actually 635 00:39:09,280 --> 00:39:12,680 Speaker 1: use ransomware themselves. They're not directly going to put it 636 00:39:12,719 --> 00:39:16,200 Speaker 1: to use. Instead, they'll sell it to hackers who do 637 00:39:16,320 --> 00:39:18,799 Speaker 1: want to use it, but who don't have the ability 638 00:39:18,880 --> 00:39:23,880 Speaker 1: to program or design these algorithms or these types of malware, 639 00:39:24,880 --> 00:39:26,920 Speaker 1: and so you'd sell it for like a thousand to 640 00:39:26,960 --> 00:39:29,239 Speaker 1: three thousand dollars. That's a lot of money, but when 641 00:39:29,280 --> 00:39:32,239 Speaker 1: you factor in the fact that you can 642 00:39:32,280 --> 00:39:35,480 Speaker 1: demand five hundred bucks per locked computer, and if you're 643 00:39:35,520 --> 00:39:39,040 Speaker 1: hitting thousands of them, three thousand dollars is nothing. A 644 00:39:39,040 --> 00:39:44,040 Speaker 1: lot of these ransomware as a service providers also demand 645 00:39:44,120 --> 00:39:49,320 Speaker 1: a certain percentage of the profits, like ten percent, but still, 646 00:39:49,400 --> 00:39:54,600 Speaker 1: you're still talking huge amounts of money. So it doesn't 647 00:39:54,600 --> 00:39:57,200 Speaker 1: take very many victims to play ball before you recapture 648 00:39:57,239 --> 00:40:01,279 Speaker 1: your costs, and it makes ransomware even more prevalent. One 649 00:40:01,400 --> 00:40:03,960 Speaker 1: ransomware attack that made headlines in the United States happened 650 00:40:04,000 --> 00:40:08,920 Speaker 1: in November, on the Friday following the US holiday 651 00:40:08,960 --> 00:40:12,360 Speaker 1: of Thanksgiving, which is also known as Black Friday. For 652 00:40:12,440 --> 00:40:15,319 Speaker 1: those who don't know what Black Friday is, that's a day.
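That break-even math is easy to make concrete. Using the figures mentioned here, a toolkit at the top of the one-to-three-thousand-dollar range, a five hundred dollar ransom per machine, and a ten percent cut to the kit's author, only a handful of paying victims covers the cost:

```python
import math

# Figures from the discussion above; the exact split is an assumption.
toolkit_cost = 3_000   # one-time price for the ransomware kit
ransom = 500           # demanded per locked computer
author_cut = 0.10      # share of each ransom owed to the kit's author

net_per_victim = ransom * (1 - author_cut)  # $450 kept per paying victim
victims_to_break_even = math.ceil(toolkit_cost / net_per_victim)
print(victims_to_break_even)  # 7 paying victims recoup the toolkit cost
```

Seven paying victims against a campaign that infects hundreds or thousands of machines is why the economics work so well for the attackers.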
Speaker 1: It's called that because a lot of stores will open up with special sales, and it's all in an effort to sell enough stuff to make an overall profit for the end of the year, to go into the black, as they say. If you're in the red, that means you're operating at a loss; if you're in the black, you're operating at a profit. That's why it's called Black Friday. Well, that's a very popular day for people to go out shopping, and it's also a popular day to just get outside and travel. So the hackers had targeted San Francisco's municipal transportation system, also known as Muni, M-U-N-I, and on that day they were able to infect the ticketing and bus management system for Muni with a ransomware attack. They demanded one hundred bitcoin for the antidote, for the key to decode everything, and at that time a hundred bitcoin was worth about seventy three thousand dollars. But instead of paying the ransom, Muni decided to offer free rides to passengers while they worked on a solution. So for two days you could ride Muni absolutely free.
Speaker 1: You didn't have to have a ticket or anything; you could just get on. But then once they were able to reboot the system and restore from backup, it was back to normal operations. So it was only a temporary downtime for Muni. It was still damaging, because that's two days without any revenue, but it showed that the city of San Francisco, and Muni in particular, was not willing to play ball by the hackers' standards. Now, there are dozens of other variations that have appeared over the years, but I think it's a good time now to look over at the WannaCry virus, because that is the most recent version of ransomware as of the recording of this episode, and I'm gonna jump right into that topic right after we take another break to thank our sponsors. WannaCry is an aggressive, coordinated ransomware attack, one of the biggest ransomware attacks in history, and it's affected hundreds of thousands of computers, many of which are part of the health care industry.
Speaker 1: Its main method of compromising a machine is to exploit vulnerabilities in an old build of the Server Message Block protocol, which is part of a larger block of protocols that Windows machines use for file sharing. Specifically, the virus could attack computers that had inbound SMB communications on ports 139 or 445, and then there were some later variants that aimed at different ports, but the initial one was looking at those two. All you have to do to protect yourself against this, by the way, is update your computer with the latest Microsoft security patch. It removes the vulnerability. Now, once a computer is infected, the malware could sort of put out feelers across the local network. So if this infected machine is on a local network with other machines, it could then use that to send the malware to the other devices on that local network, so it could spread really fast.
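As an illustration of that "putting out feelers" step, here is a minimal, hypothetical sketch of probing a host for the SMB ports mentioned above. The function name and the example address are made up for illustration, and you should only ever probe machines you own.

```python
# Sketch: check whether a host accepts TCP connections on the SMB ports
# (139 and 445) that the initial WannaCry variant targeted.
# Illustrative only; probe only machines you own.
import socket

SMB_PORTS = (139, 445)

def open_ports(host, ports=SMB_PORTS, timeout=1.0):
    """Return the subset of ports on host that accept a TCP connection."""
    found = []
    for port in ports:
        try:
            # A completed TCP handshake means something is listening there.
            with socket.create_connection((host, port), timeout=timeout):
                found.append(port)
        except OSError:
            pass  # closed, filtered, or unreachable
    return found

# Example with a placeholder (documentation-range) address:
# open_ports("192.0.2.10")
```

A worm automates exactly this kind of sweep across every address on the local network, which is why one infected machine behind the firewall is enough.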
Speaker 1: All it takes is that one compromised device on a system to have it spread throughout the entire system, and that made it particularly dangerous for these interconnected devices that weren't up to date on security patches. Now, before it made its debut, WannaCry was published as part of a large group of documents stolen from the NSA by a group of hackers. Among those documents was a list of twenty three hacking tools that targeted Windows vulnerabilities. One of those hacking tools was codenamed EternalBlue, and that is what would become WannaCry. So WannaCry started off as an NSA-identified and targeted vulnerability in Windows operating systems. This raises some tricky questions about intelligence agencies and how they intersect with computer vulnerabilities that I will get to in just a moment. But nearly a month went by without WannaCry becoming a public menace. So it was released by this group of hackers into the wild. Anyone who went to Tor and went to this particular site, or really file-sharing area, could get hold of these documents.
Speaker 1: But for about a month nothing really happened. Then on May twelve, two thousand seventeen, at eight forty two a.m. London time, and I love how precise we can be with this, the virus was unleashed, and the first attack lasted for most of the day and compromised hundreds of thousands of machines. But it wasn't as bad as it could have been, because it got sidelined when a British cybersecurity analyst found a URL embedded in the WannaCry virus attack. That led them to a kill switch for the virus. So this was something that the hackers had built into the system, or really, you could argue, the NSA built into the system, so that you could shut it off remotely. So they did. They flipped the kill switch, and it stopped the spread of the virus right there. So it could have been much worse than it was if it had been left unchecked. The hacking group that was responsible was called the Shadow Brokers. They sent out a message on May sixteenth claiming to have many more exploits for sale if hackers wanted to subscribe to their services.
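At its core, the kill switch described a moment ago was a domain-existence check: before spreading, the malware tried to look up a hard-coded web address, and once that address was live, the malware stood down. A hedged sketch of that logic, using a made-up placeholder domain rather than the real WannaCry URL:

```python
# Sketch of a kill-switch check: if the hard-coded domain resolves,
# the malware stops. The domain below is a placeholder, not the real one.
import socket

KILL_SWITCH_DOMAIN = "wannacry-kill-switch-example.invalid"  # placeholder

def kill_switch_active(domain):
    """Return True if the domain resolves, i.e. someone has registered it."""
    try:
        socket.gethostbyname(domain)
        return True   # domain is live: stand down
    except OSError:
        return False  # lookup fails: the real malware would keep spreading
```

That is why a single analyst could halt the spread: activating the address the malware was checking for flipped this test from False to True on every newly infected machine.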
Speaker 1: So they were saying, hey, you see how much mess we made with WannaCry? We have a whole lot more. Just become a subscriber and then we'll share our tools with you. Meanwhile, affected computers were causing huge headaches for thousands of people in the UK. Several hospitals sent out messages that some appointments and operations would be postponed while they worked on fixing these compromised systems. They said it just wasn't safe; it was putting people's health at risk to try and maintain appointments and operations without having those computer systems in place. Experts were working really hard to restore systems from backups, but that's a pretty slow process, and just the sheer number of affected computers across multiple companies and multiple countries meant that there was no coordinated effort. Right? Like, you had all these individual little islands that were affected by this virus, and each one had to respond to it in its own way, in its own time. So there was no coordinated, major effort to overturn the virus; it was just pockets of that throughout the world.
Speaker 1: The same was true for other systems all over the world. In all, fifty countries were affected by the WannaCry virus. That being said, according to ZDNet, despite the fact that the virus was pretty widespread, only zero point one percent of the victims had opted to pay the ransom. As of the ZDNet report, the hackers had raised about a hundred eight thousand dollars total, which, considering the size of the attack and the number of systems that were compromised, is actually a pretty small amount of money. A hundred eight thousand dollars is a lot of money to me, but if you're talking about the payoff for a massive attack on that scale, it's a fraction of what those hackers were hoping for, I'm sure of that. Now, here are some takeaways from the WannaCry experience that I think are really important. First, let's talk about the NSA, and I'm gonna try and maintain my composure, because I have very strong feelings about this particular issue. So this is my own personal opinion; this is not the opinion of HowStuffWorks. It's just Jonathan Strickland's opinion.
Speaker 1: I find it unconscionable that an intelligence agency would identify and design an exploit for a vulnerability in software rather than informing the respective parties about the vulnerability. In other words, instead of going to Microsoft and saying, hey, we discovered this vulnerability in your software, you should patch it or else someone else might create an exploit for it, they said, hey, there's a vulnerability in Windows, let's create our own exploit for it that we might end up using for intelligence purposes in the future. Never mind the fact that this puts everyone at risk, as is evidenced by the fact that the WannaCry virus is an actual thing. So the company, Microsoft, had no knowledge of this vulnerability. They weren't aware that it existed. It wasn't until the Shadow Brokers published those NSA hacking tools that Microsoft found out about it, and then they got to work creating a security patch to cover and change that exploit so that it wouldn't work anymore. And then they made the security patch available, so if you installed it, you were fine.
Speaker 1: If your security patch was up to date, then at least the initial attack of WannaCry wouldn't affect you, because the vulnerability had been patched up. So I say shame on the NSA for identifying and then building a tool to exploit such a vulnerability for their own purposes. As we've seen in this particular case, it can result in someone else getting those same tools and using them to cause a great deal of trouble. But it was also possible that, just by sitting on this information and not sharing it with Microsoft, the NSA could have given other parties the chance to discover that same weakness and develop their own exploits for it, which would have been even worse, because Microsoft wouldn't have known about it until after people had been actively affected by that exploit. So, in other words, even if the NSA had never had their hacking tools stolen, let's say the hackers were never able to get hold of EternalBlue and turn it into WannaCry, even if that had never happened, someone still might have discovered that Microsoft vulnerability and exploited it.
Speaker 1: Meanwhile, the NSA had known about it the whole time. I really maintain that it was their responsibility to share that information with Microsoft, considering the potential for destruction. And I find it really troubling that an intelligence agency can act in a way that puts hundreds of thousands of computers, and people, because we're talking about the health care industry, at risk. I don't know that any intelligence is worth that. Again, that's my own personal opinion. So that's the Jonathan bias, to be perfectly blunt. But another takeaway is that in order to practice good security, you need to make sure your operating system is patched and current. Now, I'm just as guilty as other people of putting off installing updates. If you ever get that message, like, you need to install some updates, chances are you've gotten on the computer to do something specific and you don't really want to put that off by installing updates. You want to get to whatever it is you need to do, and so you might just put it off, and you might keep putting it off until your computer forces you to do it.
Speaker 1: But really, the better plan is to go ahead and install those security patches when you get them, so that you can make sure your computer is not vulnerable to these sorts of attacks. Plus, you know, it often means that your system is just running more effectively if it's patched properly. Just be sure you're installing legitimate updates to your system and not falling for some phishing scam. Typically you can tell, because if it's the system itself that's prompting you to update, and you're not in any browser or anything, you're probably pretty safe. You're either pretty safe or your computer is already compromised, in which case, you know, it's too late anyway. And finally, back up your data. Use some sort of system to back everything up, whether it's an external drive or a cloud-based system; back up your information. That way, if worst comes to worst, if you cannot retrieve your information because of a ransomware attack, you can bite the bullet, wipe your system, install the operating system again, go to your backups, and restore from your backups.
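As a concrete illustration of that wipe-and-restore plan, here is a minimal Python sketch of taking a timestamped snapshot of a folder and restoring from it later. The function names and paths are hypothetical, and a real backup strategy would use dedicated, versioned, off-machine tooling rather than a local copy like this.

```python
# Sketch: timestamped folder snapshots, so a ransomware infection can be
# answered by wiping and restoring instead of paying. Illustrative only.
import shutil
from datetime import datetime
from pathlib import Path

def snapshot(src, backup_root):
    """Copy the src directory into backup_root under a timestamped name."""
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = Path(backup_root) / f"{Path(src).name}-{stamp}"
    shutil.copytree(src, dest)   # creates backup_root if it doesn't exist
    return dest

def restore(snapshot_dir, target):
    """Wipe target and recreate it from the given snapshot directory."""
    shutil.rmtree(target, ignore_errors=True)  # the "bite the bullet" step
    shutil.copytree(snapshot_dir, target)
```

Note that a backup stored on a drive that stays mounted can itself be encrypted by ransomware, which is one reason to keep copies offline or in the cloud.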
Speaker 1: Now, that probably means you're gonna lose some stuff, because chances are you've generated some data since the last time you did a backup. Unless you're doing backups very frequently, that's always going to be the case. But it's better to lose some data than to lose everything or be forced to pay into a ransomware attack, because every time someone pays the hackers, you are sending the message: this is a way you can make money. And you're inspiring other people to take the same pathway as the hackers did, whether they're designing their own malware or using an off-the-shelf ransomware-as-a-service approach. So don't negotiate with the hackers. Instead, use backups, patch your security, have up-to-date antivirus software running, and practice good web browsing and email hygiene, so that you're not inviting these sorts of attacks into your life. And if you do that, you really minimize the chance that you will fall victim to this kind of attack.
Speaker 1: No system is ever going to be perfect, no system is ever going to be foolproof, but you reduce those odds drastically. And if you are backing up your information, then you can at least, you know, wipe your machine and start over again without worrying about enabling some hackers and inspiring future generations of hackers to do the same thing further down the line. And that's it. That's all I have to say about ransomware and WannaCry for this episode. I might end up having to do another one in the future; the story is still playing out as I record this episode, so who knows. But if you guys have any suggestions for future episodes of TechStuff, whether it's a topic you want me to cover, or someone you would like me to interview, or perhaps a guest host you would love to see on the show, send me a message. The email address for the show is techstuff at howstuffworks dot com, or you can drop me a line on Twitter or Facebook. The handle for the show at both of those is TechStuffHSW.
Speaker 1: And finally, you can watch this show stream live on Twitch. I live stream all my recordings. You get to see me make mistakes and chat with folks in between segments. So if you want to be part of that, want to be part of the community, go to twitch dot tv slash tech stuff. You'll be able to see the show page and the schedule, and I would love for you to join me someday in one of these podcast streams. I have a lot of fun chatting with everyone there and just kind of geeking out over technology. So join me, won't you? And I'll talk to you guys again really soon. For more on this and thousands of other topics, visit howstuffworks dot com.