Speaker 1: Get in touch with technology with TechStuff from howstuffworks.com. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer at HowStuffWorks and I love all things tech. And recently, Randall Charles Tucker, who once proclaimed himself to be the Bitcoin Baron, was sentenced to a twenty-month prison term and fined more than sixty-nine thousand dollars for launching distributed denial of service, or DDoS, attacks against municipal websites, which not only affected normal city operations but also emergency response systems. So today we're going to take a look at DDoS attacks and their history, and in our next episode I will go into more detail about the different kinds of DDoS attacks out there and the security measures administrators deploy to mitigate their impact, because this is an ongoing, important story. We've heard a lot about DDoS attacks in recent years. There was one that affected some apartment buildings over in northern Europe and shut down the HVAC systems during the coldest days of the year, so people no longer had heat.
Speaker 1: This is a serious thing. So what the heck is a DDoS attack? Well, it helps to break this down by looking at what denial of service means. Generally speaking, denial of service refers to using tactics that prevent or discourage people from using something online they otherwise would use if there were no outside interference, which is a pretty broad definition. It can cover lots of stuff, and not just stuff that involves hacking or inserting some malicious code or sending commands over the internet. A denial of service attack by itself also does not necessarily aim to steal information or spy on anyone or anything like that, although it can certainly accompany those types of attacks as well. So there are a lot of instances where the denial of service attack is just part of an overall attacker strategy, or an attacker might use the threat of a denial of service attack to extort money from a potential target, essentially saying pay up or we're going to shut you down. Often, attackers will demonstrate their capabilities with a small-scale attack to accompany their demands, to show they mean business.
Speaker 1: So in other words, they might actually launch a small attack, bring down a service temporarily, and say that was just a taste of what could happen if you don't cough up the dough. But as I said, denying service all by itself can be the full motive, and it doesn't have to require code or scripts or overwhelming internet infrastructure. So for example, let's say I'm looking online for a forum to talk about one of my interests. For this example, we'll just say it's musical theater, because I love musicals and I would love to go online and chat with other fans of musicals. I find a forum. It's great, there are tons of other enthusiastic fans. Maybe there are some performers in there as well. We have threads discussing shows and writers and inspiring performances, maybe some embarrassing missteps, personal stories from our own performances or the times we've attended plays, all the stuff you would typically find on a forum about any given sort of interest. But then something frustrating starts to happen. The forum gets invaded by one or more troublemakers.
Speaker 1: These people disrupt conversations just for fun. They might hurl insults at people, which isn't exactly subtle or clever, but it can be an effective tactic. Or they might be more insidious and post inflammatory messages that are couched in seemingly reasonable language, which gives the troublemaker kind of an out, right? Like, oh, I'm so sorry you're offended. All I was trying to do is say such and such. You know, they never said anything blatantly awful. They just implied it, or they danced around it quite a bit. But ultimately they get what they want, which is to disrupt the conversation and turn the attention toward themselves. We tend to call these folks trolls, and the original reason for that is back in the old newsgroup days, they were said to be fishing for hits, or trolling as it were. Trolling in the sense of drawing a baited line through the water to lure fish. These trolls were trying to get a rise out of people and derail conversations, mostly just for laughs. I've done episodes about trolls before, so I'm going to leave it at that. But trolling is a type of denial of service.
Speaker 1: It disrupts the activity that was supposed to happen on that site. It discourages people from participating; it denies them that opportunity. And there was no code needed to do it. But in the case I mentioned just now, trolls were mostly looking to get a rise out of people. They found humor in upsetting the apple cart. They might not have any goals beyond just being a nuisance and exerting some small amount of power over people. Maybe they belong to a different forum and there's a rivalry between the two. But there are some people who just, as you might hear in Batman, want to watch the world burn. But denial of service can have far more serious effects than just inconveniencing users. For a business, a denial of service attack can prevent them from conducting their business, which results in lost revenue. So if you run an online store and someone brings down your site or prevents people from getting to your site, you're not going to make any sales during that time. That's lost money. Denial of service attacks can also hurt a company or service's reputation.
Speaker 1: So for example, there was a massive denial of service attack that affected Sony's PlayStation Network and Microsoft's Xbox Live service during the holiday season, and it made a lot of gamers really angry. They were accusing both companies of not doing enough to secure their services, to make sure they were robust against such attacks. This is sort of like pouring lemon juice in the wound, in some ways. You know, they're already hurting because they've been knocked down, and now the users are yelling at them too. But there is a valid argument to be made that services, particularly really big, heavily trafficked services, need to invest in good security measures. I talked about a non-technical approach to denial of service attacks with that forum example, but most of the time when we talk about a denial of service attack, we tend to mean one that involves bringing down a system using some sort of technology-based attack vector. So you can think of denial of service attacks as belonging to three large categories in general. The first category is volumetric.
Speaker 1: That means the goal is to overwhelm the target by sending a huge number of requests or messages to that target device, more messages than the target can actually handle. And I always think of this in a rather old-fashioned way. When I was growing up, cell phones weren't really a thing. Everyone had landlines. You know, you'd be at home and you'd use your phone, which was plugged into the wall. In fact, most of the time it was a wired handset. We didn't have a whole lot of wireless ones when I was growing up; they existed, I just didn't have them. And call waiting was not a common feature in those early days, which meant if you called someone and they were already on the phone, you would get a busy signal. Well, this volumetric category of denial of service attacks is kind of like having a jerk calling you over and over again. They call you, you pick up, you hear it's that same jerk, you hang up, they immediately hit redial and they call you right back again, and the phone starts to ring, and that means no one else can get through to you.
Speaker 1: Anyone who tries is just going to get a busy signal, so they're getting a denial of service. And because you can't receive any other calls due to this person calling you up repeatedly, you also get a denial of service. Now, that analogy doesn't work quite as well today, because we can do stuff like block incoming calls pretty much routinely, and call waiting is a standard feature on almost every phone service. But you get the idea. Next, we have the application DDoS flood attack. This concentrates not on individual applications, even though that's what the phrase makes you think, like, oh, this is like a Spotify DDoS attack or something. No, rather, it refers to the application layer of a communications network. I talked about the application layer back in the Dip into the Seven Layers of the OSI Model episode that published back in November. But this would be a flood attack similar to the volumetric one I just mentioned, except it aims to overwhelm the system with a large number of requests at the application layer rather than the network layer.
Speaker 1: I'll explain more about what that means in the next episode. And the third category is a low-rate denial of service attack, also known as a vulnerability attack. Those attacks take advantage of vulnerabilities or limitations in application implementations, and so are kind of related to application DDoS flood attacks, but they're slightly different. I'll explain more about that in the next episode, too. Then you have a distributed denial of service attack, which ups the ante. In a DDoS attack, hundreds or thousands or even hundreds of thousands of machines combine their efforts to bring down a target. To go back to my phone analogy for a second, let's just say that that jerk who was calling me really wants to irritate me by making my phone line absolutely useless, so he actually recruits all of his jerk friends and gives them my phone number. Then he and all his jerk friends just keep dialing me up over and over, which makes it even harder to handle than the one jerk doing it all by himself.
Speaker 1: So let's say I managed to finally get an open line, so I make a call to the phone company and ask them to block the number that just called me, and they agree, for whatever reason. Well, that just reduces the jerkface's attack vectors by one, right? It just removes one of the callers. But the group of jerk friends, with the exception of the one I managed to catch when I asked for the number to be blocked, can keep on calling me. They're calling from different phone numbers, so their calls keep coming through, and I can keep trying to block the numbers one by one. But this is laborious and time consuming, and in the meantime I'm not able to use my phone for anything else. That's what a DDoS attack does, but instead of over the phone lines, it does it over the Internet. In general, it uses an enormous number of machines to carry out an attack, and individually those machines might not be able to generate the sheer volume of data that could overwhelm a target, but collectively they can do it, and they can be difficult to stop.
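That block-one-caller-at-a-time problem maps directly onto blocking source IP addresses. Here's a minimal sketch of the idea in Python; the addresses come from the reserved documentation ranges and the whole "blocklist" model is deliberately simplified, just to show why per-source blocking stops a single-source attack but barely dents a distributed one.

```python
def filter_requests(sources, blocked):
    """Return only the requests whose source address is not on the blocklist."""
    return [src for src in sources if src not in blocked]

# One jerk with one number: block that single source and the flood stops.
single_source = ["203.0.113.7"] * 1000  # one made-up address, repeated
print(len(filter_requests(single_source, {"203.0.113.7"})))  # 0 get through

# A distributed flood: every request comes from a different source, so
# blocking them one by one barely makes a dent.
botnet = [f"198.51.100.{i}" for i in range(250)]  # 250 made-up addresses
print(len(filter_requests(botnet, {botnet[0]})))  # 249 still get through
```

Blocking the lone attacker drops the flood to zero, while blocking one machine out of a 250-node botnet still lets 249 requests through, which is exactly the phone-number arms race described above.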
Speaker 1: In a moment, I'll talk about a real example of how an attacker might overwhelm a target machine over the Internet using a simple denial of service tactic. But first, let's take a quick break to thank our sponsor. One real-world denial of service attack, falling into the category of the volumetric attack, involves flooding a web server with requests called pings. A ping is a very simple message that computers use to test connections between them on a network. It measures the reachability of another computer. So consider that the Internet is a network of networks, and between your computer and some other computer on the Internet there may be hundreds of machines. Some of them are routers, some of them are switches, some of them are computers. For your computer to communicate with this target computer, traffic has to go through the network from your computer to the distant one, and then traffic needs to be able to come back from the target machine to your machine. A ping is a test to see if such a thing is really possible.
Speaker 1: It measures the round-trip time for a message to be sent out from computer A, go to computer B, and then return back to computer A. The name comes from an older technology: sonar. Sonar is where we use sounds underwater to detect objects by listening for echoes. We would send out a sound, a ping, from a speaker essentially underwater, and then we would listen on a microphone for a returning echo. So you send out a ping, and if you get an echo of that ping, you know there is something out there under the water that is reflecting that sound back at you. In fact, you may remember in movies like The Hunt for Red October they talk about this a lot. They use pings in order to send secret messages to each other. But on the Internet, we send out a small amount of data and then we essentially listen for its return, and use the travel time to judge the connection strength between the two computers, or really just how much time it takes for a message to go across the Internet and back again.
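The send-and-time-the-echo idea is easy to sketch in code. Real ping uses ICMP echo request and reply packets, which generally need raw sockets and elevated privileges, so this sketch times an ordinary TCP handshake instead as an unprivileged stand-in for the same round-trip measurement; the local listener exists only so the demo runs without a network.

```python
import socket
import time

def tcp_rtt_ms(host: str, port: int, timeout: float = 2.0) -> float:
    """Time a TCP handshake to host:port, in milliseconds.

    This is not a true ICMP ping, just the same idea: send something
    out, and time how long it takes for the far end to answer.
    """
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # the completed handshake is our "echo"
    return (time.perf_counter() - start) * 1000.0

if __name__ == "__main__":
    # Self-contained demo: listen on a local port so this runs offline.
    # Against a real host you'd call something like tcp_rtt_ms("example.com", 443).
    server = socket.socket()
    server.bind(("127.0.0.1", 0))  # port 0 lets the OS pick a free port
    server.listen(1)               # the listen backlog completes the handshake
    host, port = server.getsockname()
    print(f"round trip: {tcp_rtt_ms(host, port):.3f} ms")
    server.close()
```

A loopback round trip comes back in a fraction of a millisecond; across the open Internet the same measurement typically lands in the tens to hundreds of milliseconds, which is the gap the episode is describing.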
Speaker 1: Mike Muuss created the ping utility to help test IP network connections. A quick ping could indicate if there was a connectivity problem. If you send out a ping and nothing comes back, you know there's a problem with that connection. If you send out a ping and it comes back, but there's a pretty long gap, and we're talking on the order of less than a second typically, but it can still be a long gap if you're talking about actually sending real data across the network, it can tell you, oh, you need to really take a look at your network and see where the problem is. There might be a broken element that you need to replace. It's also a great tool if you want to use bandwidth-heavy applications, because it can indicate whether such a connection is even possible. So let's say that you want to play an online computer game, maybe a competitive multiplayer computer game. You want to make sure you can find a server that doesn't have a long latency issue between you and the server. You want to ping it and get a good time.
Speaker 1: And it may be that that's a game that has multiple servers, so you want to find the server that has the best connection between your computer and that server, so that you can have the best experience when you're playing. Well, if one were to send an enormous number of ping requests to the same target computer, such as a web server, that target could become overwhelmed by all those requests. It would attempt to respond to each request, which takes up resources it would otherwise use for normal operations. So let's say a hacker has targeted the website hosting that musicals forum I wanted to pop into, and instead of going in there and starting a flame war in the forums, they just start sending ping requests, an uncountable number of ping requests, to that forum's host computer, which is trying to respond to each ping request dutifully. I mean, that's what it does. And as a result, the system becomes unstable and crashes, and I get an error message when I try to go to that forum site. This tactic is called a ping flood. It's just one denial of service tactic.
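The arithmetic behind why a flood overwhelms a machine is simple: if requests arrive faster than the server can answer them, the pile of unanswered requests grows every second until something gives out. A toy simulation makes that visible; all the rates here are invented for illustration, not measurements of any real server.

```python
def backlog_over_time(arrival_rate, service_rate, seconds):
    """Track how many unanswered requests pile up, second by second.

    arrival_rate: requests coming in per second
    service_rate: requests the server can answer per second
    """
    backlog = 0
    history = []
    for _ in range(seconds):
        # Each second, new requests arrive and some get answered;
        # the backlog can never drop below zero.
        backlog = max(0, backlog + arrival_rate - service_rate)
        history.append(backlog)
    return history

# Normal traffic: the server keeps up and the queue stays empty.
print(backlog_over_time(arrival_rate=100, service_rate=500, seconds=5))
# [0, 0, 0, 0, 0]

# A flood: requests outpace responses, so the queue grows every second
# until the machine runs out of memory, sockets, or both.
print(backlog_over_time(arrival_rate=10_000, service_rate=500, seconds=5))
# [9500, 19000, 28500, 38000, 47500]
```

The crash described above is what happens when that second curve meets a finite machine: the backlog can't grow forever, so responses stall and the system falls over.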
Speaker 1: I'll go into a lot of other ones later on. Now, I mentioned earlier how a DDoS attack can be effective by leveraging thousands or hundreds of thousands of machines in a coordinated attack. But how does that happen? How do you get to a point where hundreds of thousands of machines can work together? How does an attacker get control of that many devices? Well, sometimes it happens by people volunteering to be part of this group. There are activist groups that will send out a message and say, hey, if you want to be part of this movement, you can download this software and then we can use your computer to be part of this attack on whatever the target is. But in other cases it happens through trickery. It ends up being a compromised device, right? So for target computers, a hacker either writes some malware or, more likely, makes use of existing malware. There's tons of malware that's already been written out there. A lot of the people who use these tactics aren't necessarily coders or programmers. They are what some folks dismissively refer to as script kiddies.
Speaker 1: They go and they find code that will do what they want it to do, that someone else has already written, and then they'll essentially download that and use it as just an attack package. So they're not having to make it themselves. It's kind of off-the-shelf hacker software. So they then use this malware to create a way to infect numerous machines, typically by fooling people into executing a file on their computers or their computing devices. The malware contains a way for the hacker to direct those computers to send messages to a specific target. It may be completely automated: you just hit a little button and then everything does it. You know, the hacker might put in the IP address for the target machine, but otherwise everything else gets taken care of automatically, and the hacker uses those devices to turn all their focus onto the target machine, and then they bombard it with countless messages. Or the hacker might exploit a known vulnerability in various Internet-connected devices such as routers, or even stuff like smart TVs or Internet-connected thermostats.
Speaker 1: Essentially, the Internet of Things and the smart home movement have created the potential for truly enormous coordinated attacks, because again, they don't have to send really sophisticated information across the Internet. It could be as simple as pings. Pings are one of the most basic messages you can send, so if you just get devices that are capable of sending a ping, then you're all set to go. And part of this is because the Internet of Things developed faster than companies could create good security measures to protect those devices from people who would compromise them. And part of it falls on the consumer's shoulders, because a lot of people don't bother to ever update their security settings. Right? They'll get a new thing out of the box, they'll plug it into their network, and they never bother to update the login and password on their devices. So they're using the default settings for their login and password, and that can create the opportunity for a hacker to access those devices.
If a company is using essentially the 325 00:18:41,400 --> 00:18:45,440 Speaker 1: same sort of login and password for all of its 326 00:18:45,520 --> 00:18:48,280 Speaker 1: products along a certain line, then all you have to 327 00:18:48,280 --> 00:18:50,240 Speaker 1: do is know what that is, and then you have 328 00:18:50,320 --> 00:18:56,840 Speaker 1: access to countless instances of those unprotected devices, because so 329 00:18:56,880 --> 00:18:59,879 Speaker 1: many people do not bother to update it. A lot of 330 00:19:00,080 --> 00:19:02,760 Speaker 1: the routers I've seen have had a login that's 331 00:19:02,800 --> 00:19:05,760 Speaker 1: kind of like admin1 and a password that might 332 00:19:05,800 --> 00:19:09,879 Speaker 1: literally be the word password. So if you just plug 333 00:19:09,960 --> 00:19:13,800 Speaker 1: that in, if you're a hacker trying to compromise 334 00:19:14,000 --> 00:19:18,080 Speaker 1: someone's home systems, chances are it's gonna work on a 335 00:19:18,119 --> 00:19:20,399 Speaker 1: lot of people because they never bothered to change it. 336 00:19:21,080 --> 00:19:25,400 Speaker 1: So, lesson there: change your passwords on your devices 337 00:19:25,440 --> 00:19:28,600 Speaker 1: from the default to something else. Now, some companies they 338 00:19:28,640 --> 00:19:32,439 Speaker 1: go a little bit further. They'll create a password 339 00:19:32,480 --> 00:19:35,359 Speaker 1: for each device that is unique to that device, right. 340 00:19:35,400 --> 00:19:38,040 Speaker 1: They don't use the exact same password for all of 341 00:19:38,080 --> 00:19:41,240 Speaker 1: their routers, for example. And that's a good step that 342 00:19:41,280 --> 00:19:43,280 Speaker 1: makes it much harder to do. You can't just 343 00:19:43,359 --> 00:19:47,800 Speaker 1: use a blanket attack the way a hacker normally would.
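The "blanket attack" idea can be shown as code. This sketch is not from the episode; the credential list is entirely hypothetical, and real attack tools work against live devices, not a lookup table.

```python
# Sketch of how a "blanket" default-credential check works, and why
# per-device unique passwords defeat it. The defaults list is hypothetical.

COMMON_DEFAULTS = [
    ("admin", "password"),
    ("admin", "admin"),
    ("root", "1234"),
]

def vulnerable(credentials):
    """A device still on factory defaults falls to the shared list."""
    return credentials in COMMON_DEFAULTS

print(vulnerable(("admin", "password")))        # True: never changed
print(vulnerable(("admin", "x7!kQ92-unique")))  # False: unique per device
```

A vendor that ships a unique password per unit effectively removes its products from `COMMON_DEFAULTS`, which is exactly why that practice blunts mass scanning.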
Anyway, 344 00:19:48,080 --> 00:19:51,639 Speaker 1: I don't put the full blame on the consumer, and 345 00:19:51,680 --> 00:19:54,560 Speaker 1: I don't put the full blame on the manufacturer. It's 346 00:19:54,560 --> 00:19:56,840 Speaker 1: a problem that both parties have to pay attention to. 347 00:19:57,320 --> 00:19:59,600 Speaker 1: But there are some manufacturers out there who have made 348 00:19:59,640 --> 00:20:03,439 Speaker 1: products with very poor or completely absent security measures. And 349 00:20:03,440 --> 00:20:06,720 Speaker 1: in those cases, I pretty much blame the manufacturer, 350 00:20:06,800 --> 00:20:10,280 Speaker 1: the company, not the customers, because if you didn't even 351 00:20:09,920 --> 00:20:13,600 Speaker 1: include any kind of security measures in your device, then 352 00:20:13,640 --> 00:20:15,760 Speaker 1: there was nothing really the customer could do on their 353 00:20:15,800 --> 00:20:18,800 Speaker 1: side to protect themselves. And in any case, the collection 354 00:20:18,880 --> 00:20:22,639 Speaker 1: of infected computers and devices would be called a botnet. 355 00:20:23,080 --> 00:20:26,240 Speaker 1: Sometimes people call it a zombie computer army, although you 356 00:20:26,280 --> 00:20:28,760 Speaker 1: hardly ever hear that phrase these days. It's almost always 357 00:20:28,800 --> 00:20:31,840 Speaker 1: just botnet, and it's because the compromised computers are 358 00:20:31,880 --> 00:20:34,960 Speaker 1: being controlled by some sort of remote entity, either a 359 00:20:35,040 --> 00:20:38,360 Speaker 1: human hacker or an automated script, or bot. This can 360 00:20:38,440 --> 00:20:40,760 Speaker 1: happen even without you being aware of it.
By the way, 361 00:20:40,920 --> 00:20:43,600 Speaker 1: you may only notice that your device is operating more 362 00:20:43,640 --> 00:20:46,040 Speaker 1: slowly than normal, and you wonder, well, why is my 363 00:20:46,080 --> 00:20:48,600 Speaker 1: computer no longer as fast as it used to be. 364 00:20:48,960 --> 00:20:52,120 Speaker 1: One possible explanation is that some of your computer systems 365 00:20:52,119 --> 00:20:54,480 Speaker 1: are being dedicated to sending out attacks over the 366 00:20:54,520 --> 00:20:56,880 Speaker 1: Internet and you never know it. Or you might get 367 00:20:56,880 --> 00:20:58,960 Speaker 1: a message about how much data you're using over a 368 00:20:58,960 --> 00:21:01,240 Speaker 1: given length of time and you're thinking, that's weird, I'm 369 00:21:01,280 --> 00:21:04,040 Speaker 1: not even home when all this stuff is happening. Well, 370 00:21:04,040 --> 00:21:07,320 Speaker 1: that's an indicator that something has gone wrong. So to 371 00:21:07,440 --> 00:21:11,320 Speaker 1: understand how most distributed denial of service attacks work, it's 372 00:21:11,359 --> 00:21:14,439 Speaker 1: good to remind ourselves of how information tends to travel 373 00:21:14,520 --> 00:21:18,760 Speaker 1: across the Internet. There are protocols like TCP/IP, 374 00:21:19,040 --> 00:21:21,760 Speaker 1: which is actually two different sets of protocols. Those are 375 00:21:21,760 --> 00:21:25,200 Speaker 1: really rules that information has to follow to travel across 376 00:21:25,240 --> 00:21:28,359 Speaker 1: the Internet. The architects of the Internet who worked on 377 00:21:28,440 --> 00:21:32,120 Speaker 1: Arpanet first wanted the actual methodology of allowing 378 00:21:32,160 --> 00:21:35,440 Speaker 1: information to go from point A to point B to 379 00:21:35,560 --> 00:21:38,520 Speaker 1: be very light with the data.
In other words, the 380 00:21:38,520 --> 00:21:42,879 Speaker 1: process itself shouldn't be data specific. It should be 381 00:21:42,960 --> 00:21:46,440 Speaker 1: data agnostic. It doesn't matter what the information is. It's 382 00:21:46,520 --> 00:21:49,280 Speaker 1: just concerned with making sure that information can get from 383 00:21:49,440 --> 00:21:52,960 Speaker 1: the source to its destination. That's the only thing that's important. 384 00:21:53,520 --> 00:21:58,000 Speaker 1: The end points, the edge machines where a message originates 385 00:21:58,040 --> 00:22:00,320 Speaker 1: and where it terminates, would do all the heavy lifting, 386 00:22:00,680 --> 00:22:02,600 Speaker 1: but the middle bits would be much less hands on 387 00:22:02,640 --> 00:22:05,000 Speaker 1: with the data, with a deeper concern with just making 388 00:22:05,000 --> 00:22:07,960 Speaker 1: sure it gets to the right destination. And the endpoints verify 389 00:22:08,119 --> 00:22:10,600 Speaker 1: that everything got to where it needed to go. So 390 00:22:10,640 --> 00:22:14,199 Speaker 1: the Internet sends data in bundles called packets. This is 391 00:22:14,200 --> 00:22:17,840 Speaker 1: really where TCP comes in. A single file might consist 392 00:22:17,920 --> 00:22:21,760 Speaker 1: of hundreds or thousands or millions of packets, and the 393 00:22:21,760 --> 00:22:25,760 Speaker 1: packets are just bundles of data, and your computer sends 394 00:22:25,800 --> 00:22:29,160 Speaker 1: this information over the Internet. So let's say you want 395 00:22:29,160 --> 00:22:32,240 Speaker 1: to send a big file. Let's say it's a film. 396 00:22:32,280 --> 00:22:34,359 Speaker 1: You've got a film and it's an enormous file and 397 00:22:34,400 --> 00:22:35,960 Speaker 1: you want to send it across the Internet to a 398 00:22:35,960 --> 00:22:39,000 Speaker 1: friend of yours.
Well, the data gets chopped up into 399 00:22:39,040 --> 00:22:42,920 Speaker 1: these packets, and the packets include a header that has 400 00:22:43,000 --> 00:22:47,600 Speaker 1: important meta information about the data the packet carries. Namely, 401 00:22:47,640 --> 00:22:51,000 Speaker 1: it has the identity of the sender's computer, and it 402 00:22:51,000 --> 00:22:54,960 Speaker 1: has the identity of the destination computer. And also it 403 00:22:55,040 --> 00:22:58,520 Speaker 1: has information about how the data inside the packets fits 404 00:22:58,560 --> 00:23:00,760 Speaker 1: in with all the other packets of data that 405 00:23:00,800 --> 00:23:04,600 Speaker 1: are being sent. So one way to imagine this is 406 00:23:04,640 --> 00:23:07,320 Speaker 1: to think about having like a giant poster for an 407 00:23:07,320 --> 00:23:10,440 Speaker 1: awesome movie. Let's say it's Big Trouble in Little China. Now, 408 00:23:10,480 --> 00:23:12,200 Speaker 1: on the back of the poster, you've got a grid, 409 00:23:12,720 --> 00:23:16,119 Speaker 1: and inside each cell of this grid is a number, 410 00:23:16,160 --> 00:23:19,320 Speaker 1: and they're in sequential order. So the top left corner has 411 00:23:19,359 --> 00:23:21,560 Speaker 1: the number one, and then when you move to the right, 412 00:23:21,880 --> 00:23:24,680 Speaker 1: they increase sequentially till you get to the number twenty. 413 00:23:24,760 --> 00:23:27,080 Speaker 1: And then you drop down a row so that the 414 00:23:27,119 --> 00:23:29,359 Speaker 1: first number on the far right side, on the second 415 00:23:29,440 --> 00:23:32,960 Speaker 1: row, is twenty one. You go sequentially to the left, 416 00:23:33,000 --> 00:23:34,840 Speaker 1: and so on. You zig zag all the way down, 417 00:23:34,880 --> 00:23:38,280 Speaker 1: so you've got the whole poster numbered.
And let's say 418 00:23:38,280 --> 00:23:41,159 Speaker 1: it's got a hundred cells total, so it's one to 419 00:23:41,280 --> 00:23:45,160 Speaker 1: one hundred. You cut up the poster into 420 00:23:45,200 --> 00:23:47,080 Speaker 1: these cells, so you cut up all the little 421 00:23:47,119 --> 00:23:48,880 Speaker 1: blocks, because that's the only way you're gonna be able 422 00:23:48,880 --> 00:23:50,800 Speaker 1: to send it to your friend. And you send it 423 00:23:50,840 --> 00:23:54,080 Speaker 1: to your friend in one hundred different envelopes, and your 424 00:23:54,119 --> 00:23:56,920 Speaker 1: friend opens up the one hundred different envelopes and then 425 00:23:57,000 --> 00:23:58,800 Speaker 1: they see the numbers on the back and they're able 426 00:23:58,840 --> 00:24:01,760 Speaker 1: to put the poster back together based on those numbers. Now, 427 00:24:01,760 --> 00:24:03,280 Speaker 1: it doesn't make a whole lot of sense in this 428 00:24:03,359 --> 00:24:06,080 Speaker 1: real world example, but over the Internet it makes perfect sense. 429 00:24:06,240 --> 00:24:11,000 Speaker 1: And that's because the Internet depends upon relatively cheap, unreliable connections, 430 00:24:11,640 --> 00:24:14,280 Speaker 1: which is actually a good thing. See, in the old days, 431 00:24:14,320 --> 00:24:18,240 Speaker 1: before the Internet, before Arpanet, connecting computers together would require 432 00:24:18,320 --> 00:24:22,160 Speaker 1: a dedicated connection linking computer A with computer B. We're 433 00:24:22,200 --> 00:24:27,320 Speaker 1: talking direct connection between the two, which ends up being limiting. 434 00:24:27,400 --> 00:24:30,320 Speaker 1: It's also expensive, and if the connection were to fail, 435 00:24:30,359 --> 00:24:33,240 Speaker 1: you would have to repair it before any communication could continue.
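The numbered-poster analogy maps directly onto code. This sketch is not from the episode; it's a minimal illustration of chopping data into sequence-numbered packets and reassembling them on arrival, regardless of the order they show up in.

```python
import random

# The numbered-poster analogy: chop a message into packets, tag each
# with a sequence number, deliver them out of order, and let the
# receiver reassemble by sequence number.

def to_packets(data, size):
    """Split data into (sequence_number, chunk) packets of `size` bytes."""
    count = (len(data) + size - 1) // size
    return [(i, data[i * size:(i + 1) * size]) for i in range(count)]

def reassemble(packets):
    """Order arrivals by sequence number and glue the chunks back."""
    return b"".join(chunk for _, chunk in sorted(packets))

message = b"Big Trouble in Little China"
packets = to_packets(message, 4)
random.shuffle(packets)                # packets arrive in any order
print(reassemble(packets) == message)  # True
```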
436 00:24:33,280 --> 00:24:37,480 Speaker 1: Because it was just this direct communication channel, the architects 437 00:24:37,480 --> 00:24:40,600 Speaker 1: of the Arpanet wanted to make certain that communication could 438 00:24:40,600 --> 00:24:45,199 Speaker 1: continue even if individual pathways were to shut down. If 439 00:24:45,240 --> 00:24:48,159 Speaker 1: you think about it like a town, it's saying, well, the 440 00:24:48,200 --> 00:24:51,800 Speaker 1: main road has been shut down because a tree fell 441 00:24:51,840 --> 00:24:54,520 Speaker 1: across it. But luckily there are all these side roads you 442 00:24:54,520 --> 00:24:56,600 Speaker 1: can take to still get to the same destination. Might 443 00:24:56,600 --> 00:24:58,840 Speaker 1: take you a little longer and you go a little 444 00:24:58,880 --> 00:25:00,360 Speaker 1: further out of the way, but you will still get 445 00:25:00,400 --> 00:25:03,320 Speaker 1: there. Well, to that end, the architects of the Arpanet 446 00:25:03,440 --> 00:25:07,359 Speaker 1: built their infrastructure on cheap hardware. Individually, those pieces of 447 00:25:07,400 --> 00:25:11,920 Speaker 1: hardware aren't as reliable as the more expensive, more sophisticated 448 00:25:11,960 --> 00:25:15,880 Speaker 1: types of hardware out there, but collectively, this is an 449 00:25:15,560 --> 00:25:18,000 Speaker 1: approach that makes a lot of sense, because it made 450 00:25:18,040 --> 00:25:21,760 Speaker 1: scaling the Internet easier. It didn't require a whole huge 451 00:25:21,800 --> 00:25:25,439 Speaker 1: investment to add more infrastructure to the Internet. It scaled 452 00:25:25,520 --> 00:25:28,160 Speaker 1: up very very quickly.
But if you build your network 453 00:25:28,160 --> 00:25:31,199 Speaker 1: on top of hardware that sometimes goes offline, you have 454 00:25:31,240 --> 00:25:35,240 Speaker 1: to make sure that the rules the data follows are flexible, 455 00:25:35,320 --> 00:25:38,680 Speaker 1: that they're able to handle that situation and route around 456 00:25:38,920 --> 00:25:43,160 Speaker 1: those problems. And that's where packet switching comes in. Packets 457 00:25:43,160 --> 00:25:47,119 Speaker 1: of data follow whatever path is best at that given time, 458 00:25:47,359 --> 00:25:51,280 Speaker 1: as in whatever connection is the most reliable, fastest connection 459 00:25:51,320 --> 00:25:55,240 Speaker 1: between the originating computer and the destination computer. Now that 460 00:25:55,280 --> 00:25:59,040 Speaker 1: can change over time, not just from physical things 461 00:25:59,080 --> 00:26:01,440 Speaker 1: that are going on on the network, but also from traffic 462 00:26:01,480 --> 00:26:03,640 Speaker 1: that's passing across the network at the same time from 463 00:26:03,640 --> 00:26:07,840 Speaker 1: other computers. So one hundred digital packets representing the same 464 00:26:07,880 --> 00:26:11,960 Speaker 1: file could potentially take one hundred different pathways to get 465 00:26:12,000 --> 00:26:15,280 Speaker 1: to their destination. It's kind of like a 466 00:26:15,320 --> 00:26:18,400 Speaker 1: caravan all splitting up and taking different routes in order 467 00:26:18,400 --> 00:26:22,520 Speaker 1: to get to the final destination. Now, there's probably never 468 00:26:22,560 --> 00:26:24,720 Speaker 1: going to be a case where every single packet is 469 00:26:24,720 --> 00:26:27,280 Speaker 1: going to take its own individual pathway. Some of them 470 00:26:27,320 --> 00:26:29,720 Speaker 1: may end up taking at least part of the same 471 00:26:30,119 --> 00:26:33,920 Speaker 1: journey to get to their destination.
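Routing around failed links can be sketched with a tiny graph search. This is not from the episode and real routers use far more sophisticated protocols; the four-node topology here is invented purely for illustration.

```python
from collections import deque

# The "side roads" idea in miniature: find a route between two machines,
# then knock out the main link and watch traffic route around it.

def find_path(links, src, dst):
    """Breadth-first search for any path over the working links."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == dst:
            return path
        for a, b in links:
            nxt = b if a == node else a if b == node else None
            if nxt is not None and nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no surviving route at all

links = {("A", "B"), ("B", "D"), ("A", "C"), ("C", "D")}
print(find_path(links, "A", "D"))  # some 3-hop path, e.g. ['A', 'B', 'D']
links.discard(("B", "D"))          # the "main road" goes down
print(find_path(links, "A", "D"))  # ['A', 'C', 'D'], traffic routes around
```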
But you get the idea. 472 00:26:33,960 --> 00:26:36,760 Speaker 1: It makes the Internet much more robust, because one pathway 473 00:26:36,800 --> 00:26:38,960 Speaker 1: could fail and data can still find a way to 474 00:26:39,240 --> 00:26:43,520 Speaker 1: the intended destination. In addition, computers will send more packets 475 00:26:43,560 --> 00:26:46,320 Speaker 1: than what are needed as a redundancy measure. This is 476 00:26:46,320 --> 00:26:50,240 Speaker 1: part of that TCP protocol, and yes, saying TCP protocol is redundant. It's like 477 00:26:50,320 --> 00:26:53,840 Speaker 1: saying ATM machine. But TCP does make certain that all 478 00:26:53,920 --> 00:26:55,919 Speaker 1: the different packets get to where they need to go, 479 00:26:56,000 --> 00:26:58,640 Speaker 1: and if anything didn't show up, then it can make 480 00:26:58,680 --> 00:27:02,280 Speaker 1: certain that essentially a replacement packet gets sent, so that 481 00:27:02,520 --> 00:27:05,639 Speaker 1: it can verify that all the packets that are necessary, 482 00:27:05,680 --> 00:27:08,399 Speaker 1: all one hundred of them, for example, have made it 483 00:27:08,440 --> 00:27:13,600 Speaker 1: to their destination, and that the communication, or that 484 00:27:13,720 --> 00:27:16,960 Speaker 1: part of the communication at any rate, is complete. This 485 00:27:17,040 --> 00:27:19,439 Speaker 1: approach makes the Internet easy to build out, but it 486 00:27:19,480 --> 00:27:22,359 Speaker 1: also makes it more challenging to do anything across the 487 00:27:22,440 --> 00:27:26,480 Speaker 1: infrastructure layer in response to people who exploit the system, 488 00:27:26,520 --> 00:27:29,639 Speaker 1: because the underlying connections are really only concerned with moving 489 00:27:29,720 --> 00:27:33,000 Speaker 1: data from origin to destination. They're not concerned with what 490 00:27:33,040 --> 00:27:37,120 Speaker 1: that data is or what purpose it serves.
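TCP's bookkeeping role in the story above (noticing gaps and getting replacements sent) can be sketched in a few lines. This is not from the episode and is a drastic simplification of real TCP; it only illustrates the detect-and-retransmit idea.

```python
# Toy version of TCP-style loss detection: the receiver knows which
# sequence numbers to expect, notices gaps, and asks for replacements.

def missing_packets(expected, received):
    """Sequence numbers (0..expected-1) that never arrived."""
    return set(range(expected)) - received

arrived = {0, 1, 3, 4}              # packet 2 got lost along the way
lost = missing_packets(5, arrived)
print(lost)                          # {2}
arrived |= lost                      # sender retransmits the replacements
print(missing_packets(5, arrived))   # set(), transfer complete
```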
Now, I've 491 00:27:37,119 --> 00:27:39,040 Speaker 1: got a little more to say about the basics of 492 00:27:39,080 --> 00:27:41,960 Speaker 1: distributed denial of service attacks, but first let's take another 493 00:27:42,040 --> 00:27:52,560 Speaker 1: quick break to thank our sponsor. One other element of 494 00:27:52,600 --> 00:27:55,080 Speaker 1: the Internet I feel I should mention before I talk 495 00:27:55,160 --> 00:27:57,880 Speaker 1: about the history of denial of service attacks is the 496 00:27:57,960 --> 00:28:01,080 Speaker 1: domain name system. And you guys have likely at least 497 00:28:01,119 --> 00:28:04,240 Speaker 1: heard of an IP address. I mentioned it earlier 498 00:28:04,280 --> 00:28:07,720 Speaker 1: in this episode. Those are the addresses that identify a 499 00:28:07,720 --> 00:28:11,040 Speaker 1: device that's connected to the Internet. It can be 500 00:28:11,280 --> 00:28:14,600 Speaker 1: a device like a router that then sends out temporary 501 00:28:14,920 --> 00:28:17,320 Speaker 1: addresses to anything that's connected to the router, but you 502 00:28:17,359 --> 00:28:20,919 Speaker 1: get it. This is the way that a computer system 503 00:28:20,960 --> 00:28:25,280 Speaker 1: knows where to send information. They're necessary for communication. It's 504 00:28:25,320 --> 00:28:27,440 Speaker 1: like if you were to send a letter, you would 505 00:28:27,480 --> 00:28:30,479 Speaker 1: have to include an address on the letter's envelope, so 506 00:28:30,640 --> 00:28:33,600 Speaker 1: the postal service knows where to deliver that letter. And 507 00:28:33,640 --> 00:28:35,439 Speaker 1: if you wanted to get a letter back in return, 508 00:28:35,480 --> 00:28:37,920 Speaker 1: you would want to have a return address on there 509 00:28:38,160 --> 00:28:40,080 Speaker 1: if you want to get a response. And 510 00:28:40,120 --> 00:28:43,440 Speaker 1: the Internet is similar.
All devices have an IP address 511 00:28:43,520 --> 00:28:46,840 Speaker 1: to facilitate communication, at least through a router if 512 00:28:46,840 --> 00:28:50,360 Speaker 1: nothing else. But the device's address might change over time, 513 00:28:50,400 --> 00:28:52,760 Speaker 1: so that's a little different. It's not like the device 514 00:28:52,840 --> 00:28:55,360 Speaker 1: is always going to have the exact same IP address. 515 00:28:55,640 --> 00:28:58,240 Speaker 1: It may change depending upon what network it gets connected to. 516 00:28:58,400 --> 00:29:00,680 Speaker 1: In fact, it will change depending upon what network it gets 517 00:29:00,680 --> 00:29:05,400 Speaker 1: connected to. So it's not exactly analogous to a physical address, 518 00:29:05,520 --> 00:29:07,320 Speaker 1: but it's similar enough for us to kind of think 519 00:29:07,320 --> 00:29:11,240 Speaker 1: about it that way. Now here's a problem, however: these addresses are 520 00:29:11,240 --> 00:29:14,200 Speaker 1: not easy for us to remember. You know, IPv4 521 00:29:14,240 --> 00:29:18,160 Speaker 1: addresses and IPv6 addresses, these are series of numbers 522 00:29:18,200 --> 00:29:21,400 Speaker 1: and sometimes letters, in the case of IPv6, where 523 00:29:22,440 --> 00:29:24,760 Speaker 1: there doesn't seem to be any rhyme or reason to us. 524 00:29:24,800 --> 00:29:27,640 Speaker 1: They're hard for us to recall. So we had to 525 00:29:27,640 --> 00:29:30,080 Speaker 1: come up with a way to map addresses based on 526 00:29:30,240 --> 00:29:34,440 Speaker 1: language to the IP addresses that machines can deal with.
So, 527 00:29:34,560 --> 00:29:38,959 Speaker 1: for example, www dot how stuff works dot com is 528 00:29:39,160 --> 00:29:41,640 Speaker 1: a URL, an address that we humans can 529 00:29:41,760 --> 00:29:46,640 Speaker 1: easily remember, and there are special computers called DNS servers 530 00:29:46,680 --> 00:29:50,440 Speaker 1: that resolve these URLs into IP addresses 531 00:29:50,800 --> 00:29:53,760 Speaker 1: so that traffic can go to the right locations. So 532 00:29:53,800 --> 00:29:58,080 Speaker 1: an attack on DNS servers, which has happened, can slow 533 00:29:58,120 --> 00:30:01,560 Speaker 1: down traffic to numerous websites, because the servers will be 534 00:30:01,560 --> 00:30:04,160 Speaker 1: so busy dealing with the attack they have trouble resolving 535 00:30:04,280 --> 00:30:07,280 Speaker 1: URLs into IP addresses, even though the 536 00:30:07,320 --> 00:30:11,440 Speaker 1: actual websites themselves are perfectly fine. So if there's an 537 00:30:11,480 --> 00:30:16,240 Speaker 1: attack on a DNS server that would typically resolve www 538 00:30:16,280 --> 00:30:19,760 Speaker 1: dot how stuff works dot com to its respective IP address, 539 00:30:20,040 --> 00:30:22,320 Speaker 1: how stuff works dot com is fine. We haven't been 540 00:30:22,360 --> 00:30:26,760 Speaker 1: attacked by anybody, but the name server that would 541 00:30:26,760 --> 00:30:29,240 Speaker 1: actually do the job of resolving that URL 542 00:30:29,360 --> 00:30:33,720 Speaker 1: into an IP address is busy handling this attack, so 543 00:30:33,880 --> 00:30:36,400 Speaker 1: it would look like our site is loading super slowly, 544 00:30:36,440 --> 00:30:39,120 Speaker 1: or that you just can't even pull anything up. But it's 545 00:30:39,160 --> 00:30:40,760 Speaker 1: not a problem on our end, it would be a 546 00:30:40,800 --> 00:30:43,240 Speaker 1: problem in the middle.
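That "problem in the middle" can be shown with a toy resolver. This is not from the episode; the record table and the IP address are invented (the address comes from a range reserved for documentation), and a real lookup goes through the DNS, not a dictionary.

```python
# Toy version of DNS resolution: the site itself can be perfectly
# healthy while the lookup step fails. The records table is invented,
# and 203.0.113.10 is a documentation-reserved example address.

RECORDS = {"www.howstuffworks.com": "203.0.113.10"}

def resolve(hostname, server_busy=False):
    """Map a human-readable name to an IP address, unless the DNS
    server is swamped (say, by a DDoS) and can't answer at all."""
    if server_busy:
        return None  # lookup fails; the target site itself is untouched
    return RECORDS.get(hostname)

print(resolve("www.howstuffworks.com"))                    # '203.0.113.10'
print(resolve("www.howstuffworks.com", server_busy=True))  # None
```

Note that in the second call the site's record still exists; only the middleman answering lookups has gone quiet, which is exactly why a DNS attack can make healthy sites unreachable.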
So there are a lot of 547 00:30:43,240 --> 00:30:49,120 Speaker 1: different ways that attackers can potentially affect the traffic and 548 00:30:49,240 --> 00:30:53,160 Speaker 1: the speed of internet connections. Now, to end this episode, 549 00:30:53,160 --> 00:30:55,480 Speaker 1: I'm going to talk about some early denial of service 550 00:30:55,520 --> 00:30:57,960 Speaker 1: attacks and some of the more notable examples, and in 551 00:30:58,000 --> 00:30:59,720 Speaker 1: our next episode, I'm going to focus more on the 552 00:30:59,720 --> 00:31:02,760 Speaker 1: specifics for types of DDoS attacks and how 553 00:31:02,800 --> 00:31:05,600 Speaker 1: companies try to handle them. So, first of all, it's 554 00:31:05,640 --> 00:31:08,320 Speaker 1: hard to get a definitive history of denial of service attacks, 555 00:31:08,360 --> 00:31:11,920 Speaker 1: because oddly enough, hackers were not too concerned about documenting 556 00:31:11,960 --> 00:31:15,920 Speaker 1: their actions as they unfolded. But before there was DDoS, 557 00:31:16,000 --> 00:31:19,520 Speaker 1: there were plenty of denial of service examples. One of 558 00:31:19,520 --> 00:31:22,480 Speaker 1: them happened in nineteen seventy four with David Dennis, who 559 00:31:22,520 --> 00:31:25,360 Speaker 1: was thirteen years old at the time. He wondered if 560 00:31:25,400 --> 00:31:27,880 Speaker 1: he might be able to affect all the terminals connected 561 00:31:27,880 --> 00:31:32,240 Speaker 1: to a computer at the Computer Based Education Research Laboratory 562 00:31:32,360 --> 00:31:36,480 Speaker 1: at the University of Illinois Urbana-Champaign campus.
Dennis knew 563 00:31:36,520 --> 00:31:39,520 Speaker 1: a few things about terminals. Think of a 564 00:31:39,640 --> 00:31:42,760 Speaker 1: terminal as kind of a keyboard and a monitor; 565 00:31:42,960 --> 00:31:45,200 Speaker 1: in itself it's not a computer, but it's connected to 566 00:31:45,240 --> 00:31:47,920 Speaker 1: a computer. You have multiple terminals all hooked up to 567 00:31:47,960 --> 00:31:51,520 Speaker 1: this central computer and they're all sharing those resources. Well, 568 00:31:51,520 --> 00:31:54,760 Speaker 1: he knew that if he was using a terminal connected 569 00:31:54,800 --> 00:31:58,920 Speaker 1: to this computer, he could execute a command called external, 570 00:31:59,240 --> 00:32:01,920 Speaker 1: or ext, which was a command that would 571 00:32:01,960 --> 00:32:04,200 Speaker 1: tell the terminal that it was supposed to communicate with 572 00:32:04,280 --> 00:32:08,960 Speaker 1: a connected external device. But if you didn't have an 573 00:32:08,960 --> 00:32:11,680 Speaker 1: external device connected to the terminal and you 574 00:32:11,720 --> 00:32:14,800 Speaker 1: sent this command anyway, it would make the terminal lock up. 575 00:32:15,280 --> 00:32:17,880 Speaker 1: The terminal would be searching for this external device, it 576 00:32:17,920 --> 00:32:20,680 Speaker 1: would not find it, and that would send the terminal 577 00:32:21,200 --> 00:32:24,960 Speaker 1: into the terminal equivalent of a tizzy. And the only 578 00:32:24,960 --> 00:32:26,880 Speaker 1: way to fix it would be to shut everything down 579 00:32:26,960 --> 00:32:30,480 Speaker 1: and reboot.
So he thought, what if I did this, 580 00:32:30,600 --> 00:32:33,200 Speaker 1: but created a way to do it across 581 00:32:33,280 --> 00:32:35,920 Speaker 1: all the terminals connected to that computer at the same time, 582 00:32:36,120 --> 00:32:38,760 Speaker 1: not just one? Because otherwise I'm 583 00:32:38,800 --> 00:32:41,400 Speaker 1: just sitting there doing it one at a time. So he wrote 584 00:32:41,440 --> 00:32:43,920 Speaker 1: some code and figured out a way to send that 585 00:32:44,040 --> 00:32:46,840 Speaker 1: command to all the terminals connected to a computer at 586 00:32:46,840 --> 00:32:49,000 Speaker 1: the same time, making them execute that ext 587 00:32:49,120 --> 00:32:54,280 Speaker 1: command without the individual users' knowledge or permission, and this 588 00:32:54,440 --> 00:32:57,120 Speaker 1: forced a shutdown of nearly all the terminals connected 589 00:32:57,160 --> 00:33:00,560 Speaker 1: to that computer. The university ended up disabling the 590 00:33:00,640 --> 00:33:02,880 Speaker 1: feature that would allow people to send such a command 591 00:33:03,080 --> 00:33:06,360 Speaker 1: to all the terminals from one single spot. They said, 592 00:33:06,400 --> 00:33:09,040 Speaker 1: you know, we gotta turn this default setting off. They 593 00:33:09,040 --> 00:33:11,480 Speaker 1: didn't think about it until after it had happened. In 594 00:33:12,520 --> 00:33:16,200 Speaker 1: 1988, Robert Morris unleashed a denial of service attack by accident. 595 00:33:16,840 --> 00:33:18,920 Speaker 1: He had developed a bit of code that would make 596 00:33:18,960 --> 00:33:21,800 Speaker 1: its way through the machines connected through the Arpanet, and 597 00:33:21,840 --> 00:33:24,560 Speaker 1: the purpose was to find out how big the network was. 598 00:33:24,720 --> 00:33:26,480 Speaker 1: He just wanted to know how big the network was.
599 00:33:27,000 --> 00:33:29,440 Speaker 1: No one was really sure, since this was something that 600 00:33:29,480 --> 00:33:33,680 Speaker 1: was growing kind of organically and rapidly. So Morris 601 00:33:33,720 --> 00:33:36,400 Speaker 1: thought he had the perfect solution. He had this code 602 00:33:36,520 --> 00:33:40,520 Speaker 1: that would go out and essentially infect every single node 603 00:33:40,720 --> 00:33:44,320 Speaker 1: on the system that it encountered. But it was meant 604 00:33:44,320 --> 00:33:47,520 Speaker 1: to infect just as a way of making a count of 605 00:33:47,600 --> 00:33:50,400 Speaker 1: each of the nodes. Really, he just wanted to find 606 00:33:50,400 --> 00:33:53,240 Speaker 1: out what the head count was. However, he made a 607 00:33:53,280 --> 00:33:56,480 Speaker 1: mistake when he was creating this code, and it ended 608 00:33:56,560 --> 00:33:59,440 Speaker 1: up being the equivalent of a worm. It went through 609 00:33:59,480 --> 00:34:02,960 Speaker 1: the system and it would replicate itself. It would infect 610 00:34:02,960 --> 00:34:06,520 Speaker 1: the same machines multiple times. It failed to detect that 611 00:34:06,760 --> 00:34:10,160 Speaker 1: it had already infected a machine, so it just kept 612 00:34:10,160 --> 00:34:14,800 Speaker 1: passing through this Arpanet system, infecting node after node after node, 613 00:34:14,840 --> 00:34:17,840 Speaker 1: again and again and again, gumming up the network and 614 00:34:17,920 --> 00:34:21,960 Speaker 1: essentially causing a shutdown of sixty thousand nodes. And he 615 00:34:21,960 --> 00:34:25,040 Speaker 1: would end up being fined ten thousand dollars and sentenced 616 00:34:25,080 --> 00:34:29,720 Speaker 1: to four hundred hours of community service for that mistake. The earliest example 617 00:34:29,800 --> 00:34:32,520 Speaker 1: of a distributed denial of service attack that I could 618 00:34:32,560 --> 00:34:37,320 Speaker 1: find happened in 1995.
An Italian activist group called the 619 00:34:37,360 --> 00:34:41,960 Speaker 1: Strano Network, or Strange Network, launched a denial of service 620 00:34:42,000 --> 00:34:45,239 Speaker 1: attack against the French government in a protest against 621 00:34:45,360 --> 00:34:48,880 Speaker 1: that nation's policies relating to nuclear power. But this was 622 00:34:48,920 --> 00:34:52,440 Speaker 1: done with actual human operators who were working voluntarily. They 623 00:34:52,440 --> 00:34:54,480 Speaker 1: had agreed to be part of this sort 624 00:34:54,480 --> 00:34:57,919 Speaker 1: of virtual sit in, and they were working on their 625 00:34:57,960 --> 00:35:01,320 Speaker 1: computers in an attempt to overwhelm the target servers. 626 00:35:01,600 --> 00:35:04,680 Speaker 1: So this attack was limited both in scope and duration. Also, 627 00:35:04,800 --> 00:35:06,959 Speaker 1: back in those days, you were paying by the hour 628 00:35:07,120 --> 00:35:11,879 Speaker 1: for Internet access, so the actual protest lasted about an 629 00:35:11,880 --> 00:35:14,000 Speaker 1: hour, because no one was willing to pour in a 630 00:35:14,000 --> 00:35:18,239 Speaker 1: whole lot of money to sit at their computer and 631 00:35:18,360 --> 00:35:22,160 Speaker 1: actively carry out this attack. The denial of service attack 632 00:35:22,200 --> 00:35:25,320 Speaker 1: became a go to strategy for activist groups in general. 633 00:35:25,760 --> 00:35:29,279 Speaker 1: One such group, called the Electronic Disturbance Theater, or 634 00:35:29,400 --> 00:35:32,799 Speaker 1: EDT, developed a tool called FloodNet, which would 635 00:35:32,800 --> 00:35:36,160 Speaker 1: send a large volume of messages towards a targeted computer 636 00:35:36,239 --> 00:35:39,480 Speaker 1: across the Internet. A predetermined target is the important part 637 00:35:39,520 --> 00:35:42,200 Speaker 1: to remember here.
Anyone who wanted to make use of 638 00:35:42,200 --> 00:35:44,920 Speaker 1: FloodNet could download it, and the tool even had 639 00:35:44,960 --> 00:35:48,200 Speaker 1: a drop-down menu that would let users select the 640 00:35:48,239 --> 00:35:52,400 Speaker 1: predetermined targeted computers, like the White House computer system. 641 00:35:52,520 --> 00:35:54,839 Speaker 1: EDT would arrange for virtual sit ins in which 642 00:35:54,840 --> 00:35:58,000 Speaker 1: they would schedule a coordinated effort to attack a specific 643 00:35:58,000 --> 00:36:00,879 Speaker 1: target like the White House servers, and then users would 644 00:36:00,920 --> 00:36:03,480 Speaker 1: all use that drop-down menu to launch their individual 645 00:36:03,520 --> 00:36:07,239 Speaker 1: attacks as a big collective, so a collective of 646 00:36:07,320 --> 00:36:09,959 Speaker 1: individual attacks in that sense. And again, in this case, 647 00:36:10,000 --> 00:36:12,919 Speaker 1: it was a voluntary action. It wasn't like they were 648 00:36:13,040 --> 00:36:17,360 Speaker 1: infecting computers and trying to take them over without 649 00:36:17,440 --> 00:36:21,880 Speaker 1: the user's consent. In two thousand, Michael Calce, a teenage 650 00:36:21,880 --> 00:36:26,280 Speaker 1: hacker who used the handle Mafiaboy, launched a series 651 00:36:26,320 --> 00:36:29,760 Speaker 1: of distributed denial of service attacks against high profile targets 652 00:36:29,760 --> 00:36:33,680 Speaker 1: like Yahoo, Amazon, Dell, and others. He also attempted to 653 00:36:33,680 --> 00:36:36,040 Speaker 1: attack the DNS system by targeting several of 654 00:36:36,080 --> 00:36:40,040 Speaker 1: the root name servers.
He had compromised computers at university 655 00:36:40,080 --> 00:36:42,800 Speaker 1: networks and used them to send traffic 656 00:36:42,960 --> 00:36:46,080 Speaker 1: that would overwhelm his targets, and years later he would 657 00:36:46,120 --> 00:36:48,279 Speaker 1: say the whole purpose behind it was so that he 658 00:36:48,320 --> 00:36:51,759 Speaker 1: could impress and intimidate other hackers. So he was doing 659 00:36:51,840 --> 00:36:54,880 Speaker 1: it for the online street cred, in other words. He 660 00:36:54,920 --> 00:36:58,279 Speaker 1: was eventually tracked down by agencies like the FBI and 661 00:36:58,400 --> 00:37:01,040 Speaker 1: got a pretty light punishment, all things considered. He was 662 00:37:01,080 --> 00:37:04,080 Speaker 1: sentenced to eight months in a youth group home. And 663 00:37:04,120 --> 00:37:06,719 Speaker 1: part of the reason for the relatively light sentence is 664 00:37:06,719 --> 00:37:10,080 Speaker 1: that the law was dragging behind technology, because it's hard 665 00:37:10,080 --> 00:37:12,440 Speaker 1: to charge someone with a crime when you don't have 666 00:37:12,480 --> 00:37:15,799 Speaker 1: a law defining that crime yet. And this is something 667 00:37:15,840 --> 00:37:18,799 Speaker 1: we've seen in technology over and over, where the developments 668 00:37:18,920 --> 00:37:23,200 Speaker 1: of tech have outstripped social constructs like law. In 669 00:37:23,280 --> 00:37:25,960 Speaker 1: two thousand seven, in Russia, a massive DDoS attack 670 00:37:26,040 --> 00:37:29,720 Speaker 1: didn't just shut down a site or make a service slow, 671 00:37:29,920 --> 00:37:33,920 Speaker 1: it actually shut down internet coverage for entire cities.
The 672 00:37:33,920 --> 00:37:36,600 Speaker 1: attack was aimed at an Internet service provider, and it 673 00:37:36,640 --> 00:37:39,800 Speaker 1: was so effective that the provider went offline multiple times 674 00:37:39,880 --> 00:37:42,400 Speaker 1: in waves of attack that hit over the period of 675 00:37:42,440 --> 00:37:45,080 Speaker 1: a month. So they would get back up and then 676 00:37:45,080 --> 00:37:46,879 Speaker 1: they would be hit by another attack and it would 677 00:37:46,920 --> 00:37:49,800 Speaker 1: go down again. At the peak of an attack, traffic 678 00:37:49,840 --> 00:37:53,120 Speaker 1: being sent to the provider reached ten gigabits per second, 679 00:37:53,520 --> 00:37:59,080 Speaker 1: which was pretty darn staggering back in two thousand seven. Later, Anonymous, 680 00:37:59,120 --> 00:38:03,719 Speaker 1: the most famous secret society of activists and techno-anarchists, 681 00:38:04,040 --> 00:38:07,480 Speaker 1: began to make use of voluntary botnets to attack targets. 682 00:38:07,800 --> 00:38:10,720 Speaker 1: They urged people who wanted to lend their computers' power 683 00:38:10,760 --> 00:38:13,960 Speaker 1: to an attack to download software called the Low Orbit 684 00:38:14,040 --> 00:38:17,200 Speaker 1: Ion Cannon. This would make the user's computer join a 685 00:38:17,280 --> 00:38:19,640 Speaker 1: large botnet, which then could be directed to attack 686 00:38:19,719 --> 00:38:23,600 Speaker 1: specific targets. Essentially, this is what hackers often try to 687 00:38:23,680 --> 00:38:27,799 Speaker 1: do by tricking others into installing malware, only in this case 688 00:38:27,840 --> 00:38:30,160 Speaker 1: Anonymous was outright saying, hey, your computer is going to 689 00:38:30,239 --> 00:38:32,600 Speaker 1: be part of this if you download the software.
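The arithmetic behind a voluntary botnet like that is simple: no single participant sends much traffic on their own, but the combined rate can blow past what a server is built to handle. Here's a minimal toy model of that idea in Python. The capacity and per-client numbers are made-up illustrations, not measurements of any real tool or server:

```python
# Toy capacity model: a hypothetical server can absorb CAPACITY requests
# per second. Each volunteer contributes a modest request rate; any one
# client is harmless, but enough of them together tip the server over.
# (All figures are illustrative assumptions.)

CAPACITY = 500          # requests/sec the hypothetical server can handle
RATE_PER_CLIENT = 5     # requests/sec each volunteer sends

def is_overwhelmed(num_clients: int) -> bool:
    """True when the combined traffic exceeds the server's capacity."""
    return num_clients * RATE_PER_CLIENT > CAPACITY

# One volunteer is trivially absorbed; 101 volunteers exceed capacity.
print(is_overwhelmed(1))    # False
print(is_overwhelmed(101))  # True
```

The point of the sketch is that the attack's power comes purely from headcount, which is why coordinating a scheduled, simultaneous "sit-in" mattered so much to these groups.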
So 690 00:38:32,640 --> 00:38:35,120 Speaker 1: if you want to help bring down the man, download 691 00:38:35,160 --> 00:38:38,480 Speaker 1: and install it now. That wraps up this episode. In 692 00:38:38,480 --> 00:38:40,880 Speaker 1: our next one, we're gonna talk more about how DDoS 693 00:38:41,000 --> 00:38:44,960 Speaker 1: works and also the various strategies that people and 694 00:38:45,000 --> 00:38:48,680 Speaker 1: companies use in order to try and mitigate the effects 695 00:38:48,760 --> 00:38:51,560 Speaker 1: of DDoS. As it turns out, it's pretty tricky. 696 00:38:52,160 --> 00:38:54,800 Speaker 1: If you guys enjoyed this episode, let me know. Also, 697 00:38:55,120 --> 00:38:57,200 Speaker 1: give me a shout out if you have any suggestions 698 00:38:57,200 --> 00:39:00,920 Speaker 1: for future episode topics. Whether it's a technology, a company, 699 00:39:00,920 --> 00:39:02,719 Speaker 1: a person in tech, maybe there's someone you want me 700 00:39:02,760 --> 00:39:05,480 Speaker 1: to interview, let me know by sending me an email. 701 00:39:05,600 --> 00:39:09,480 Speaker 1: The address is tech stuff at how stuff works dot com, 702 00:39:09,600 --> 00:39:11,640 Speaker 1: or drop me a line on Facebook or Twitter. The 703 00:39:11,640 --> 00:39:14,440 Speaker 1: handle for both of those is tech stuff H S W. 704 00:39:15,040 --> 00:39:17,680 Speaker 1: Don't forget to follow us on Instagram, and I'll talk 705 00:39:17,719 --> 00:39:26,720 Speaker 1: to you again really soon. For more on this and thousands 706 00:39:26,719 --> 00:39:38,960 Speaker 1: of other topics, visit how stuff works dot com.