Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Heathen, and welcome to TechStuff. I am your host, Jonathan Strickland. I am an executive producer and apparently vampire at iHeartRadio. How the tech are you? Okay, I'm dropping in now that you all got the cringe out of the way first thing. We're done with that. But we are in spooky season as I publish this, October of 2022. And you know, I don't typically do episodes that relate to spooky stuff, but I thought it would be fun if I did a few this month that are tangentially, maybe questionably, themed to be Halloween-ish. This gets a little tricky in tech. I mean, I have done episodes about stuff like the tech in professional haunted house attractions and stuff like that, which is great, but you know, I can't really revisit that. There's not a whole lot more to say. I did do a classic episode with my co-host Chris Pollette about ghost hunting technology, or so-called ghost hunting technology.
Speaker 1: Maybe I'll do an update to that, because it's been so long since I recorded that episode, and I always like getting my dander up about stuff like that. But you know, pickings are slim when you look at stuff that you can theme toward Halloween in the tech space. But yesterday, in the news episode, I talked about how some activists, who at the very least are sympathetic to the Kremlin, used distributed denial of service attacks, or DDoS attacks, against a dozen or so US airport websites. Not air carriers, not the airlines, mind you, but the airport websites. And the attacks brought down some sites for a few hours, but otherwise had very little impact on travel. Now you might say, okay, but how are you connecting DDoS attacks to Halloween? Well, the tenuous connective tissue is that to pull off the DDoS attack, hackers first have to assemble a botnet, which is a collection of compromised computer systems. And another phrase that sometimes describes a bot is zombie, and a botnet would be a zombie army. So you have these zombie computers, and so, zombies. It's totally thematic, right?
Speaker 1: Okay, let's start with a baseline before we get to distributed denial of service attacks. Let's just start with denial of service. What the heck is that? How does it work? Well, let's think of the Internet as a giant, interconnected mess of clients and servers. There are other components there too. I am oversimplifying it down to clients and servers. So servers are the machines that hold the stuff that we want to access online. Maybe we are logging on to play an online game, and the game exists on a server somewhere out on the Internet. It probably exists on several servers, and we just connect to a specific one. Or maybe we want to order food online using an app. Well, that service is hosted on another server somewhere online. Or maybe we just want to pop onto the web and visit a news site and read up on the headlines. Well, that news site is on another server somewhere out there on the web. So the basic way the Internet works is that you access a client. One example of a client would be a web browser on your computer. That's the client.
Speaker 1: So that's your point of access to the Internet, and you want to see something specific, like that news site, let's say. So you type the URL for the news site into your browser, and the browser sends out a message that goes across the Internet and gets directed to the specific server that houses that URL. The server receives this request from your client, and then it replies to that request. It sends the files that represent the front page of that news site to your client, your web browser. Your web browser then displays those files as a web page to the user. In a way, this is the most simplified method to describe what's going on with the Internet in general and the web in particular. The specifics get a little more sophisticated than that, but from a very high level, that is what's happening with web traffic, without getting into things like packets and routing and all that kind of stuff. Now, sometimes stuff goes wrong in this process.
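That request-and-reply exchange can be sketched in a few lines of Python. This is a minimal illustration, not how any real news site is built: a tiny local HTTP server stands in for the "news site" server, and `http.client` plays the browser. The page content and handler name are made up for the demo.

```python
# Sketch of the client/server exchange described above: a tiny HTTP
# server plays the "news site," and an HTTP client plays the browser.
# NEWS_PAGE and NewsHandler are illustrative names for this demo only.
import threading
from http.server import HTTPServer, BaseHTTPRequestHandler
from http.client import HTTPConnection

NEWS_PAGE = b"<html><body><h1>Headlines</h1></body></html>"

class NewsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The server answers the request by sending the front-page files.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(NEWS_PAGE)

    def log_message(self, *args):  # keep the demo output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), NewsHandler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "browser" sends its request to the server's address...
conn = HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/")
response = conn.getresponse()
body = response.read()
print(response.status)   # 200: the server answered the request
print(body.decode())     # the files the browser would render as a page

server.shutdown()
```

The key point the episode makes is visible here: the server's whole job is to answer whatever request arrives, which is exactly the obligation a denial of service attack abuses.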
Speaker 1: You know, maybe the server that's holding the files you want has gone offline for some reason, so you get an error back because your request could not be answered. The server that would normally be there, for some reason, isn't there. Maybe there are issues between the client and the server. So it's not that the server has an issue or that your client has an issue, but something in the middle is causing some problems. Or maybe the client connection goes down, like maybe your home Internet has gone down and that's the problem. Or maybe the server is online, there are no other issues between your client and the server, but the server itself is currently overwhelmed. Now, that can happen naturally, without malice involved. So let's say, for example, that word gets out about a new video game console, and everyone knows when that console is officially going to go up for pre-order, and you can go straight to this company's website and sign up for a pre-order the moment it becomes available, and you will be first in line to get this brand new video game console.
Speaker 1: And lots of people know about this, so tons of people are interested and invested in this. So the appointed hour arrives, and now millions of people around the world are all frantically attempting to connect to the same server to put in their pre-order, and the server is just overwhelmed by the mass of incoming traffic, and it slows down the server's ability to respond to the requests. So everyone starts to experience these long delays as they try to connect, and you get increasingly frustrated because you're waiting and waiting and waiting for a web page to load in your browser, and the server's doing its best to respond to demands. Sometimes this kind of situation can be enough to actually cause the server to crash entirely, which is even more frustrating, because then it has to go through the whole reboot process before you can connect to it again, and that obviously makes matters more frustrating. And like I said, this all can happen naturally, just due to demand.
Speaker 1: We've seen it happen multiple times, even in the modern day. Like, we saw it happen a lot early in the days of the web because of unexpected demand, but it still happens today too. However, this thing that can happen naturally can also be caused to happen artificially. A nefarious person can try to create that sort of situation on purpose. Now, this brings us to a denial of service attack, or DoS attack: big D, little o, big S. All right, so there's an analogy that I love to use when talking about denial of service. I'm going to use it again. So imagine for a moment that any time anyone rang your doorbell or knocked on your door, you absolutely had to go answer the door. You couldn't pretend not to be home or ignore it. You are compelled, you have no other option. You have to answer the door. Now, let's say you're at home and you've decided that you want to make yourself a snack.
Speaker 1: You're feeling peckish, so you start to head to your kitchen, but then someone rings the doorbell, so you turn around and you walk to the front door and you open it, but there's no one there. So you close the door and you head back inside. You start heading back to the kitchen, but you get two steps toward the kitchen and the doorbell rings again, so you do a one-eighty. You walk back to the door and you open it. There's no one there. Those darn kids. And now you're getting irritated, possibly because you've got some low blood sugar going on because you haven't had your snack yet. So you close the door. You turn back to head toward the kitchen. Bing bong goes the doorbell, and you turn around again to answer the door, and once again, no one is there. And this happens over and over, and because you are compelled to answer the door, you can't ever make any progress doing anything else, and eventually you just collapse in frustration, starvation, and confusion. And a denial of service attack is really similar to this.
Speaker 1: A basic one might be that someone is sending a message to a server, but the return address for this message goes nowhere. So it's a message that's going to a server, but the server has the wrong information about where that message came from and where it has to send its reply. So the hacker is just sending message after message to the server with this false return address, and the server has to answer each one. That's the server's job. So the hacker is just flooding the server as much as possible to bring it down, because the server can't just ignore incoming messages. If a server ignored incoming messages, the basic operations of the Internet would break down. Now, that kind of attack is actually not that hard to defend against, because if you do detect it, if you detect that there's an unusual amount of traffic coming from a single source, even if that source is a fake IP address, you can just block anything coming from there, and then you can keep accepting other traffic. And it's pretty... it's, in the grand scheme of things, relatively easy to deal with.
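The single-source defense just described, detect one address sending far too much and cut it off while everyone else stays connected, can be sketched as a simple rate limiter. This is a toy model, not production mitigation code; the window size and request limit are made-up numbers for illustration.

```python
# Sketch of the single-source DoS defense described above: count
# requests per source address in a sliding time window, and once one
# source exceeds a threshold, drop everything it sends while other
# traffic keeps flowing. WINDOW_SECONDS and MAX_REQUESTS are
# illustrative values, not recommendations.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 10
MAX_REQUESTS = 100           # allowed per source, per window

recent = defaultdict(deque)  # source IP -> timestamps of recent requests
blocked = set()              # sources we have decided to ignore

def allow_request(source_ip, now=None):
    """Return True if the request should be answered, False if dropped."""
    now = time.monotonic() if now is None else now
    if source_ip in blocked:
        return False
    q = recent[source_ip]
    q.append(now)
    while q and now - q[0] > WINDOW_SECONDS:  # forget requests outside the window
        q.popleft()
    if len(q) > MAX_REQUESTS:                 # one source is flooding us
        blocked.add(source_ip)
        return False
    return True

# A flood from one (possibly spoofed) address gets cut off...
for _ in range(150):
    allow_request("203.0.113.9", now=0.0)
print(allow_request("203.0.113.9", now=1.0))   # False: that source is now blocked
# ...while a normal visitor is still served.
print(allow_request("198.51.100.7", now=1.0))  # True
```

Real servers do this kind of filtering in firewalls or upstream network gear rather than in application code, but the logic is the same: one loud source is easy to spot and easy to silence.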
Speaker 1: A denial of service attack, a basic denial of service attack. But the denial of service attack is small change compared to a distributed denial of service attack. This is big D, big D, little o, big S: DDoS. So to do a DDoS attack, a hacker needs access to a bunch of computers. This is the distributed part. And working together, these computers, which could number in the millions for a particularly huge zombie army or botnet, can all send messages to a targeted server, which then gets bogged down trying to answer all these messages. Now, if this podcast were instead a book, I would have put a footnote up there when I mentioned large botnets. It is hard to get a real figure for how big a botnet is. You can make some estimates, but it's hard to get a firm grasp, largely because computers are not necessarily always on, right? They're not always connected. You might turn your computer off, or your Internet connection might go down or whatever, and so it's not easy to actually quantify how big these botnets can be.
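The distributed part is exactly what breaks the block-one-source defense: when thousands of machines each send only a little traffic, no single address looks suspicious, yet the total still drowns the server. A toy sketch of that contrast, with made-up thresholds and made-up addresses:

```python
# Why a distributed flood is harder to stop: compare per-source request
# counts (which catch a lone flooder) against the aggregate request rate
# (which is what actually reveals a botnet). All numbers here are
# illustrative, not real capacity figures.
from collections import Counter

PER_SOURCE_LIMIT = 100    # the single-source rule from before
AGGREGATE_LIMIT = 5000    # total requests per window the server can handle

def analyze(window_of_requests):
    """window_of_requests: list of source IPs seen in one time window."""
    per_source = Counter(window_of_requests)
    loud_sources = [ip for ip, n in per_source.items() if n > PER_SOURCE_LIMIT]
    overloaded = len(window_of_requests) > AGGREGATE_LIMIT
    return loud_sources, overloaded

# 10,000 bots each send just 2 requests: no single source trips the
# per-source rule, but the server is still buried in traffic.
botnet_traffic = [f"10.0.{i // 256}.{i % 256}" for i in range(10_000)] * 2
loud, overloaded = analyze(botnet_traffic)
print(loud)        # []: nobody looks individually suspicious
print(overloaded)  # True: the aggregate volume gives the attack away
```

This is why real DDoS mitigation leans on things like traffic scrubbing and distributing the load across many servers rather than on simple per-address blocklists.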
Speaker 1: However, we have some general idea of some of the largest ones. So there was a botnet, still is a botnet, associated with a trojan called Zeus that involved more than thirteen million computers, so they can be quite large. I should also add, botnets can be used for lots of other stuff besides DDoS attacks. That's like one of the easily identifiable reasons for a botnet, but there are other ones as well. Also, just adding that you should always be careful to make sure your machines don't become part of one, which means practicing good etiquette online. You know, making sure you're not downloading files that are coming from questionable sources, not clicking on links that are coming from questionable sources, all the basic stuff you know about. These are reasons why that's important to follow. All right, we've got some more to say about zombies, but first let's take a break. We're back. All right, so for a DDoS attack to work, first you have to actually gather your zombie army, and that alone requires a few steps of its own.
Speaker 1: So step one is you design, or you make use of, existing malware that compromises targeted computer systems. So if someone installs the malware, it creates a compromised computer system. So the goal is to create a means for a hacker to be able to send a command to the compromised machines that will then prompt the machine to follow orders. The malware could be relatively simple, where it just allows for this DDoS attack approach, or it could be more extensive, and frequently is more extensive, allowing a wider spectrum of back door access for hackers that could ultimately give a hacker administrator-level access to a machine, which is obviously a really bad thing for that target machine. And that's really why you need to be super careful and practice good etiquette when you're online, because if you download certain types of malware, it essentially means you've just handed your computer over to a hacker. They're able to get back door access to your system, they can look at all your files, they can lock it down.
Speaker 1: That's how ransomware works, where they lock down your computer system, or lock down certain directories on your computer, and then they demand a ransom, and in return they will unlock those for you. So this is again a reminder to be very careful when you're online. You don't want to hand over the keys to your system to some stranger, right? You just don't want to do that. Anyway, lots of hackers make use of already existing tools. There's a much smaller group of them who are actually designing the tools. Those are the ones you really have to worry about. I mean, you have to worry about all of them, including the ones who just make use of pre-existing stuff in order to advance their own agendas. Often they are dismissively referred to as script kiddies. They're taking existing script or programming and making use of it, but they're not writing it themselves. That's a term that's often used for them. I find that term to be problematic simply because it doesn't reduce how potentially dangerous they can be.
Speaker 1: If you dismiss them and you think that they're not an issue, then you might be setting yourself up for being victimized. So I don't really like using the script kiddies designation. Anyway, a lot of the time, hackers hide malware packages inside a larger, seemingly legitimate file, and this is called the trojan method. It's named after the Trojan horse of ancient legend. So instead of packing a bunch of soldiers inside a big wooden horse, these digital trojan horses have a malware package hiding within them. So you design the trojan to look like something else, maybe something that folks would really like to get hold of. This is one reason why you hear people caution others about downloading pirated files, going to sources where you've got, you know, stuff like games and movie files and all this kind of stuff supposedly ready for you to download. It's not just that the matter of piracy itself is illegal, that you're essentially stealing; you know, the idea of downloading a product without paying for that product is stealing.
Speaker 1: But it's also that hackers will sometimes insert malware into files, and they will hide those in pirate communities. Like, they'll name the files something that people really want. You know, maybe it's an upcoming film that hasn't hit theaters yet, but it's supposedly a leaked copy of it, and you know, there are a lot of people who are curious about that, and they'll go through the trouble of downloading it. Well, you hide some malware in there, and whether it's the real film or not, you've delivered malware to someone and potentially convinced them to install it, because, you know, maybe you've compressed the file in some way and you've disguised it, and people are clicking on it. They're just eager to get a look at this movie, and in the process they install malware on their machine. Also, if someone is illegally downloading files, that person is likely to resist speaking up about being victimized, simply because they were already engaged in something that was questionable, right? They were pirating files.
Speaker 1: It's that thought that if someone's being dishonest, they're not going to come forward when you have targeted them, because they're worried about being found out. So it's identifying your target audience, the ones that are less likely to actually take steps to fix a problem if it pops up, and that can give hackers more time with these compromised machines, these zombie computers. So hackers build up their zombie armies, their botnets, by distributing the malware in various ways. The trojan method is just one of many. There are lots of others. And they monitor the botnet as it grows. You know, they're essentially administering the botnet on the back end. They have the ability to send out commands. This is why you get concepts like a zombie army, because the individual compromised devices are the soldiers of that army. The hacker ends up being the commander of that army and can send out commands to the entire army. And maybe the hacker doesn't take action right away. Maybe they just sit and wait.
Speaker 1: They have this growing number of devices that are part of their army, and they just wait until the time's right. In fact, it's even possible that they don't even have a target in mind yet. They just compromise the machines because they plan on doing an attack, but they haven't even decided who they're going to attack. That can sometimes happen too. But when the time comes, they send out the command to all these infected devices, at least the ones that are currently online, and they direct these devices to all start flinging messages at the target server, and boom, you've got your distributed denial of service attack, carried out by zombie computers. Spooky. Now, DDoS attacks can get more complicated than how I've described. For example, it's also possible to make use of compromised Internet of Things devices. They don't have to just be computers, and you may have heard me speak in past episodes about issues with IoT security. Some companies are not very good at securing their devices properly, so you'll get a manufactured product with poor security on that product.
Speaker 1: The assumption is just that it's not going to get targeted. There's a great example of various manufacturers that have used a common login and password for devices, including routers, where there's a generic login and password. And if you know the generic login and password for those routers, it means that you can access any router where the user has not made the effort to change those. And as you might guess, most people don't go to that effort. Most people fail to go in and change the default settings on their various devices, which means if you know the default login, you can access those devices, right, even if you don't have access to other stuff on the network. So then hackers can get access to a very large installed base of Internet of Things devices in this way. Forescout Research Labs looked at more than eight million devices in the IoT field, and they found that there are some particularly weak examples, and they happen to be in very important places.
They found that one 334 00:20:57,760 --> 00:21:00,960 Speaker 1: category of IoT device that tends to have 335 00:21:01,040 --> 00:21:05,880 Speaker 1: pretty weak security is medical devices. That is terrifying. They 336 00:21:05,920 --> 00:21:11,280 Speaker 1: also found that networking equipment was particularly weak on security. Again, 337 00:21:11,760 --> 00:21:16,080 Speaker 1: this is like the infrastructure, the bones upon which everything 338 00:21:16,200 --> 00:21:20,720 Speaker 1: is built, and those are weak points. In a way, 339 00:21:20,800 --> 00:21:23,040 Speaker 1: it almost doesn't matter how much security you've built on 340 00:21:23,080 --> 00:21:25,119 Speaker 1: top of everything else. If you can get at the 341 00:21:25,160 --> 00:21:30,120 Speaker 1: underlying networking equipment, you can cause some real havoc. So 342 00:21:30,240 --> 00:21:33,240 Speaker 1: it's possible to direct these kinds of devices to also 343 00:21:33,320 --> 00:21:36,720 Speaker 1: send Internet traffic to a targeted server. So a zombie 344 00:21:36,800 --> 00:21:40,840 Speaker 1: army may not be composed only of computers. It could include 345 00:21:40,840 --> 00:21:44,960 Speaker 1: stuff that's well outside your typical computer. And as more 346 00:21:45,000 --> 00:21:48,800 Speaker 1: devices join the Internet of Things, this problem continues to grow. 347 00:21:49,520 --> 00:21:52,480 Speaker 1: And while companies like Cloudflare, which we'll talk about 348 00:21:52,920 --> 00:21:55,320 Speaker 1: in a couple of minutes, have really come up with 349 00:21:55,359 --> 00:21:58,960 Speaker 1: some mitigation strategies to deal with DDoS attacks, the 350 00:21:59,000 --> 00:22:02,720 Speaker 1: attackers are always looking for other ways to be effective. 351 00:22:02,840 --> 00:22:05,879 Speaker 1: DDoS attacks can also be sophisticated in other ways. So 352 00:22:05,960 --> 00:22:09,439 Speaker 1: I gave a big overview of how DDoS works.
353 00:22:09,480 --> 00:22:12,080 Speaker 1: But while that is an overview, you need to know 354 00:22:12,119 --> 00:22:15,520 Speaker 1: that there are different types of DDoS attacks that 355 00:22:15,600 --> 00:22:20,360 Speaker 1: target different elements or layers of a network. 356 00:22:20,359 --> 00:22:24,720 Speaker 1: You can think of networking as different layers, 357 00:22:24,720 --> 00:22:31,280 Speaker 1: with each layer corresponding to a specific subset of tasks. 358 00:22:31,880 --> 00:22:34,560 Speaker 1: And I'm not going to go into the full layer description. 359 00:22:34,600 --> 00:22:36,520 Speaker 1: I've done episodes about that in the past, but my 360 00:22:36,600 --> 00:22:40,520 Speaker 1: point is that a DDoS attack can target a specific layer, 361 00:22:41,160 --> 00:22:44,280 Speaker 1: and if you use a DDoS attack that targets multiple 362 00:22:44,359 --> 00:22:48,400 Speaker 1: layers using lots of different computers, that becomes a very 363 00:22:48,400 --> 00:22:51,800 Speaker 1: sophisticated DDoS attack, one that is much harder to 364 00:22:51,840 --> 00:22:55,800 Speaker 1: defend against than a DDoS attack that targets just 365 00:22:55,840 --> 00:23:00,080 Speaker 1: a single layer, like the web server layer that I 366 00:23:00,160 --> 00:23:03,159 Speaker 1: kind of described earlier. The one that took down the 367 00:23:03,200 --> 00:23:06,000 Speaker 1: airport websites that I mentioned at the beginning of this episode, 368 00:23:06,520 --> 00:23:10,199 Speaker 1: that was a very simple DDoS attack. It was 369 00:23:10,240 --> 00:23:13,720 Speaker 1: attacking a specific layer, just one, so it wasn't a 370 00:23:13,760 --> 00:23:17,480 Speaker 1: multi-layer attack and so was therefore easier to 371 00:23:17,600 --> 00:23:20,520 Speaker 1: defend against. But they don't all have to be like that.
372 00:23:20,560 --> 00:23:25,120 Speaker 1: They can be multi-layer attacks from multiple vectors, 373 00:23:25,160 --> 00:23:29,720 Speaker 1: and that becomes a much more challenging issue to defend against. 374 00:23:30,400 --> 00:23:33,879 Speaker 1: And you know, the goal is almost always to gum 375 00:23:34,000 --> 00:23:37,200 Speaker 1: up the network so that traffic slows to a crawl 376 00:23:37,320 --> 00:23:41,080 Speaker 1: or it crashes entirely. So the goal is usually the 377 00:23:41,160 --> 00:23:44,680 Speaker 1: same, right, You're just trying to disrupt connectivity to 378 00:23:44,800 --> 00:23:48,600 Speaker 1: a specific target, But there are different ways of doing that, 379 00:23:48,640 --> 00:23:53,720 Speaker 1: whether you're attacking the server itself or you're attacking elements 380 00:23:53,720 --> 00:23:58,760 Speaker 1: within the Internet that direct traffic to that server. And 381 00:23:58,800 --> 00:24:01,000 Speaker 1: maybe in a future episode I'll go into more detail 382 00:24:01,080 --> 00:24:03,640 Speaker 1: about that, but that's going to require like a full 383 00:24:03,720 --> 00:24:07,160 Speaker 1: length episode, so we're gonna leave that for now. We're 384 00:24:07,160 --> 00:24:09,440 Speaker 1: also going to take another quick break. When we come back, 385 00:24:09,480 --> 00:24:12,840 Speaker 1: I'm gonna talk more about Cloudflare and how Cloudflare 386 00:24:12,880 --> 00:24:16,399 Speaker 1: helps protect against things like DDoS attacks and 387 00:24:16,480 --> 00:24:20,920 Speaker 1: keep us safe from the zombies. But first this break. 388 00:24:30,200 --> 00:24:32,960 Speaker 1: All right, before the break, I mentioned Cloudflare, which 389 00:24:33,040 --> 00:24:37,000 Speaker 1: provides several services, not just protection against DDoS. But 390 00:24:37,040 --> 00:24:40,920 Speaker 1: that's one that a lot of people associate Cloudflare with.
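As general background (not something spelled out in the episode), here's a rough sketch of how a few well-known DDoS attack types map to network layers, using the standard OSI numbering; the specific examples are my own illustration of the single-layer versus multi-layer distinction:

```python
# Rough background mapping of some well-known DDoS attack types to the OSI
# layer they aim at. A multi-layer campaign combines entries from different
# layers at once, which is what makes it much harder to defend against.
ATTACK_LAYERS = {
    "UDP amplification": 3,   # network layer: raw packet volume
    "SYN flood":         4,   # transport layer: exhaust connection state
    "HTTP flood":        7,   # application layer: looks like real web requests
}

def layers_hit(attacks):
    """Which distinct layers does a campaign of these attacks touch?"""
    return sorted({ATTACK_LAYERS[a] for a in attacks})

# A simple attack, like the one against the airport sites, hits one layer...
print(layers_hit(["HTTP flood"]))  # [7]
# ...while a sophisticated campaign forces defense at several layers at once.
print(layers_hit(["UDP amplification", "SYN flood", "HTTP flood"]))  # [3, 4, 7]
```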
391 00:24:41,240 --> 00:24:43,800 Speaker 1: They think of it as like a company that protects 392 00:24:44,480 --> 00:24:50,200 Speaker 1: other companies from unwanted massive amounts of traffic, in other words, 393 00:24:50,240 --> 00:24:55,360 Speaker 1: DDoS attacks. This is tricky because, at least initially, 394 00:24:55,359 --> 00:24:58,800 Speaker 1: a DDoS attack can look like legitimate traffic to 395 00:24:59,160 --> 00:25:02,760 Speaker 1: a server, and you know you don't want to block 396 00:25:02,920 --> 00:25:08,119 Speaker 1: legitimate traffic, right, You don't want to proactively cut people 397 00:25:08,200 --> 00:25:10,600 Speaker 1: off from connecting to a server. The whole point 398 00:25:10,760 --> 00:25:14,200 Speaker 1: of services is to allow clients to connect to them, and 399 00:25:14,280 --> 00:25:16,919 Speaker 1: so if you are blocking off all traffic, then there 400 00:25:17,000 --> 00:25:18,640 Speaker 1: might as well not be a server there at all. 401 00:25:19,840 --> 00:25:23,800 Speaker 1: So you want to make sure you're able to differentiate 402 00:25:23,880 --> 00:25:27,679 Speaker 1: between what an attack is and what legitimate traffic is. 403 00:25:28,160 --> 00:25:31,280 Speaker 1: The server starts getting requests that are piling up, but 404 00:25:31,600 --> 00:25:34,840 Speaker 1: these requests are coming from different machines with a DDoS attack, right, 405 00:25:34,840 --> 00:25:38,480 Speaker 1: They're not coming from a single source. So at first 406 00:25:38,480 --> 00:25:41,600 Speaker 1: blush it looks like it's just a massive uptick in 407 00:25:41,760 --> 00:25:44,840 Speaker 1: legitimate traffic.
And as I said, there could be times 408 00:25:44,840 --> 00:25:48,360 Speaker 1: when this actually happens, like there are situations where this 409 00:25:48,720 --> 00:25:51,359 Speaker 1: occurs naturally, so you have to be able to sort 410 00:25:51,480 --> 00:25:55,960 Speaker 1: those moments out from malicious DDoS attacks. Now, the 411 00:25:55,960 --> 00:25:59,600 Speaker 1: way Cloudflare does this involves a few different approaches. 412 00:26:00,080 --> 00:26:02,359 Speaker 1: One is to look at the IP addresses of the 413 00:26:02,440 --> 00:26:06,600 Speaker 1: incoming messages. If they originate from the same address 414 00:26:06,760 --> 00:26:12,000 Speaker 1: or from a relatively narrow range of IP addresses, that's suspicious, right. 415 00:26:12,040 --> 00:26:14,040 Speaker 1: If you're looking at them and you're thinking, these 416 00:26:14,080 --> 00:26:17,040 Speaker 1: are all really similar, so it looks like they're all 417 00:26:17,080 --> 00:26:20,600 Speaker 1: coming from the same group, that could indicate a 418 00:26:20,680 --> 00:26:24,400 Speaker 1: DDoS attack. Similarly, if the traffic is coming in from 419 00:26:24,400 --> 00:26:28,680 Speaker 1: a narrow range of behavioral profiles, that's a red flag. 420 00:26:29,000 --> 00:26:33,959 Speaker 1: So a behavioral profile in this context is really about 421 00:26:34,080 --> 00:26:37,320 Speaker 1: the type of device that sent the traffic in the 422 00:26:37,400 --> 00:26:40,600 Speaker 1: first place. Right, Was it a laptop? Was it a 423 00:26:40,640 --> 00:26:44,760 Speaker 1: mobile device? Was it an Internet of Things device?
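The narrow-IP-range heuristic just described might look something like this sketch; the /24 grouping, the threshold, and the addresses are all invented for illustration, not Cloudflare's actual algorithm:

```python
from collections import Counter
import ipaddress

# A minimal sketch of the heuristic described above: if incoming requests
# cluster in a narrow slice of IP space, that's suspicious. The grouping
# size, threshold, and addresses are invented for illustration.

def prefix_concentration(ips, prefixlen=24):
    """Fraction of requests coming from the single busiest /prefixlen block."""
    blocks = Counter(
        ipaddress.ip_network(f"{ip}/{prefixlen}", strict=False) for ip in ips
    )
    return blocks.most_common(1)[0][1] / len(ips)

def looks_suspicious(ips, threshold=0.8):
    return prefix_concentration(ips) >= threshold

# 9 of 10 requests from the same /24 block -> flagged.
narrow = ["198.51.100.%d" % i for i in range(1, 10)] + ["203.0.113.5"]
# Requests spread across unrelated ranges -> not flagged.
spread = ["198.51.100.1", "203.0.113.5", "192.0.2.9", "198.18.0.4"]

print(looks_suspicious(narrow))  # True
print(looks_suspicious(spread))  # False
```

A real system would weigh many more signals at once, including the behavioral profiles mentioned above, but the shape of the check is the same: measure how concentrated the traffic's origin is and flag anything past a threshold.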
So if 424 00:26:44,840 --> 00:26:47,720 Speaker 1: you are running a news site and you start to 425 00:26:47,760 --> 00:26:50,720 Speaker 1: detect a, you know, ton of traffic that's coming in 426 00:26:51,280 --> 00:26:55,400 Speaker 1: from smart thermostats, that's a big old red flag, because 427 00:26:55,800 --> 00:26:58,440 Speaker 1: you can't really think of a reason why smart thermostats 428 00:26:58,440 --> 00:27:02,959 Speaker 1: would be pinging a web server for a news site. 429 00:27:03,280 --> 00:27:05,040 Speaker 1: So if you're Cloudflare, you might look at the 430 00:27:05,119 --> 00:27:11,600 Speaker 1: incoming traffic and say, that's hinky, this is probably a DDoS attack. Also, 431 00:27:11,680 --> 00:27:15,439 Speaker 1: if the surge in traffic starts arriving in patterns, like 432 00:27:15,480 --> 00:27:18,680 Speaker 1: if you notice that every thirty minutes you get another surge, 433 00:27:19,440 --> 00:27:21,959 Speaker 1: that's a red flag. If it's happening at a regular, 434 00:27:22,440 --> 00:27:26,720 Speaker 1: you know, kind of timed interval, that looks artificial, that 435 00:27:26,760 --> 00:27:30,760 Speaker 1: looks like a system that's directing waves of messages 436 00:27:31,240 --> 00:27:34,280 Speaker 1: at predetermined intervals. If they're not coming 437 00:27:34,320 --> 00:27:36,720 Speaker 1: in haphazardly, if they're coming in in these waves, then 438 00:27:36,800 --> 00:27:41,080 Speaker 1: that suggests it's an artificial attack. Or if you detect 439 00:27:41,240 --> 00:27:44,680 Speaker 1: a huge traffic spike but it's at an unusual time of day, 440 00:27:44,720 --> 00:27:48,000 Speaker 1: that's another indicator. For example, you wouldn't expect a ton 441 00:27:48,000 --> 00:27:52,640 Speaker 1: of traffic to hit, say, a website for a group 442 00:27:52,640 --> 00:27:54,480 Speaker 1: of credit unions that are on the East Coast of 443 00:27:54,520 --> 00:27:58,800 Speaker 1: the United States at two a.m.
Eastern time, for example, right, 444 00:27:58,960 --> 00:28:03,480 Speaker 1: because typically those sites should only really be getting huge 445 00:28:03,480 --> 00:28:06,080 Speaker 1: amounts of traffic, or even just regular amounts of traffic, 446 00:28:06,760 --> 00:28:10,680 Speaker 1: during the daytime for Eastern time. So, yes, the Internet 447 00:28:10,720 --> 00:28:14,000 Speaker 1: is global, so it's not like you would expect traffic 448 00:28:14,000 --> 00:28:18,119 Speaker 1: to drop to zero necessarily. But we tend to see 449 00:28:18,240 --> 00:28:23,880 Speaker 1: traffic behave in similar amounts and at a similar 450 00:28:23,880 --> 00:28:27,240 Speaker 1: scale wherever the site happens to be based. Right, So 451 00:28:27,280 --> 00:28:29,879 Speaker 1: if a site is based on the East Coast of 452 00:28:29,920 --> 00:28:31,960 Speaker 1: the United States, in the middle of the night in 453 00:28:32,000 --> 00:28:35,680 Speaker 1: the US you'll probably see a drop in traffic there. 454 00:28:36,520 --> 00:28:38,720 Speaker 1: If you see a spike in the middle of the night, 455 00:28:39,160 --> 00:28:42,760 Speaker 1: that's a potential indication of an attack. Now, obviously that's 456 00:28:42,760 --> 00:28:45,520 Speaker 1: not always true, but you know, for certain types of sites, 457 00:28:45,600 --> 00:28:50,040 Speaker 1: it's a good rule of thumb. So first, Cloudflare 458 00:28:50,080 --> 00:28:53,920 Speaker 1: actually has to differentiate an attack from legitimate traffic, and 459 00:28:53,960 --> 00:28:57,440 Speaker 1: then it essentially has to block incoming traffic from suspected 460 00:28:57,440 --> 00:29:01,600 Speaker 1: attack sources, thus shielding the client from all of those 461 00:29:01,840 --> 00:29:06,120 Speaker 1: unwanted messages. It may also use something called rate limiting.
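The timing heuristic described above, flagging surges that arrive at suspiciously regular intervals, can be sketched like this; the jitter threshold and the surge timestamps are invented for illustration:

```python
import statistics

# A sketch of the timing heuristic: surges that arrive at suspiciously
# regular intervals look machine-driven rather than organic. Timestamps
# (in minutes) and the jitter threshold are invented for illustration.

def intervals(surge_times):
    """Gaps between consecutive surges."""
    return [b - a for a, b in zip(surge_times, surge_times[1:])]

def looks_machine_timed(surge_times, max_jitter=1.0):
    """Flag if successive surges are spaced almost identically apart."""
    gaps = intervals(surge_times)
    return len(gaps) >= 2 and statistics.stdev(gaps) <= max_jitter

# Surges every ~30 minutes, like the example above -> flagged.
regular = [0, 30, 60, 90, 120]
# Surges that arrive haphazardly -> not flagged.
haphazard = [0, 7, 55, 61, 140]

print(looks_machine_timed(regular))    # True
print(looks_machine_timed(haphazard))  # False
```

The time-of-day check works the same way in spirit: compare the observed traffic against a baseline of what's normal for that site at that hour, and flag large deviations.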
462 00:29:06,480 --> 00:29:10,480 Speaker 1: This is essentially all about setting boundaries, which is important 463 00:29:10,480 --> 00:29:13,800 Speaker 1: in any relationship. You've got to set your boundaries. Now, 464 00:29:13,840 --> 00:29:17,719 Speaker 1: in this case, setting boundaries means setting how many requests 465 00:29:17,800 --> 00:29:20,400 Speaker 1: a server will accept in a given amount of time, 466 00:29:20,960 --> 00:29:24,080 Speaker 1: and once you hit that limit, no more requests can 467 00:29:24,160 --> 00:29:26,920 Speaker 1: come through until the next available time slot opens up. 468 00:29:27,440 --> 00:29:31,560 Speaker 1: That limits both the attacks and legitimate traffic, however, so 469 00:29:32,080 --> 00:29:36,880 Speaker 1: it can definitely reduce the probability of a DDoS attack taking 470 00:29:36,960 --> 00:29:40,920 Speaker 1: down a target, but it also means that legitimate users 471 00:29:40,920 --> 00:29:43,040 Speaker 1: aren't going to really be able to get access either, 472 00:29:43,120 --> 00:29:47,360 Speaker 1: so everyone kind of gets affected. Another strategy is making 473 00:29:47,480 --> 00:29:51,120 Speaker 1: use of a reverse proxy. All right, so proxies are 474 00:29:51,160 --> 00:29:54,440 Speaker 1: really useful things on the Internet, and it's very possible 475 00:29:54,440 --> 00:29:57,440 Speaker 1: that you've used one before. If you use a VPN, 476 00:29:57,960 --> 00:30:02,440 Speaker 1: you have relied on a proxy. A proxy stands 477 00:30:02,480 --> 00:30:06,800 Speaker 1: in place for some other entity. With VPNs, the proxy 478 00:30:06,880 --> 00:30:10,520 Speaker 1: stands in place for clients. So when you connect 479 00:30:10,520 --> 00:30:14,160 Speaker 1: your computer to a VPN, you're connecting to a proxy 480 00:30:14,280 --> 00:30:18,240 Speaker 1: server and all the web traffic you engage with has 481 00:30:18,280 --> 00:30:20,760 Speaker 1: to go through that VPN.
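The rate limiting described a moment ago can be sketched as a fixed-window counter: accept at most so many requests per time slot, then refuse everything, good and bad alike, until the next slot opens. The limit and window values are invented, and real services use more refined variants (sliding windows, token buckets, per-client limits):

```python
import time

# A sketch of rate limiting as a fixed-window counter: accept at most
# `limit` requests per `window` seconds, then refuse everything until the
# next time slot opens up. Values are invented for illustration.

class FixedWindowLimiter:
    def __init__(self, limit, window_seconds, clock=time.monotonic):
        self.limit = limit
        self.window = window_seconds
        self.clock = clock
        self.window_start = clock()
        self.count = 0

    def allow(self):
        now = self.clock()
        if now - self.window_start >= self.window:
            # A new time slot has opened up; reset the counter.
            self.window_start = now
            self.count = 0
        if self.count < self.limit:
            self.count += 1
            return True
        return False

# Deterministic demo with a fake clock instead of real time.
fake_now = [0.0]
limiter = FixedWindowLimiter(limit=3, window_seconds=60, clock=lambda: fake_now[0])

print([limiter.allow() for _ in range(5)])  # [True, True, True, False, False]
fake_now[0] = 61.0                          # the next window opens
print(limiter.allow())                      # True
```

Note how the fourth and fifth requests are refused regardless of whether they were legitimate, which is exactly the trade-off described above: everyone kind of gets affected.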
So to the outside world, 482 00:30:20,800 --> 00:30:24,360 Speaker 1: if someone were snooping on you, they would see that 483 00:30:24,440 --> 00:30:26,680 Speaker 1: you are connected to a VPN, but that's as far 484 00:30:26,720 --> 00:30:29,000 Speaker 1: as they can tell. They know that you connected to 485 00:30:29,040 --> 00:30:31,440 Speaker 1: this VPN, but that's as much as they know. They 486 00:30:31,440 --> 00:30:34,000 Speaker 1: can also tell that the VPN is connecting to all 487 00:30:34,040 --> 00:30:37,760 Speaker 1: these other different sites and services, but they wouldn't be 488 00:30:37,800 --> 00:30:41,160 Speaker 1: able to say for sure that that was you directing 489 00:30:41,200 --> 00:30:43,120 Speaker 1: the VPN to do that, because the VPN has also 490 00:30:43,440 --> 00:30:46,400 Speaker 1: got lots of other computers connected to it, so you 491 00:30:46,480 --> 00:30:49,320 Speaker 1: don't know who is connecting to what. You can see 492 00:30:49,360 --> 00:30:52,040 Speaker 1: that everything is going into the VPN, and then you 493 00:30:52,040 --> 00:30:54,640 Speaker 1: can see that the VPN is then sending that information 494 00:30:54,640 --> 00:30:58,120 Speaker 1: along to the various clients connected to the VPN, But 495 00:30:58,440 --> 00:31:02,480 Speaker 1: you have obfuscated what folks are doing. VPNs are used 496 00:31:02,480 --> 00:31:05,920 Speaker 1: for all sorts of legitimate reasons. There are companies that 497 00:31:06,000 --> 00:31:10,560 Speaker 1: use VPNs so that way outsiders can't snoop on traffic 498 00:31:10,760 --> 00:31:16,000 Speaker 1: between the company and the employees. For example, VPNs can 499 00:31:16,000 --> 00:31:21,280 Speaker 1: be used to get around regional restrictions.
So an example 500 00:31:21,320 --> 00:31:25,240 Speaker 1: of that could be in a foreign country where the 501 00:31:25,440 --> 00:31:30,080 Speaker 1: government is really cracking down on Internet access, a VPN 502 00:31:30,160 --> 00:31:33,320 Speaker 1: or proxy server might allow you 503 00:31:33,400 --> 00:31:37,640 Speaker 1: to sidestep those restrictions and access the Internet in an 504 00:31:37,680 --> 00:31:41,920 Speaker 1: otherwise unfettered way. So there are legitimate reasons for this. So 505 00:31:42,000 --> 00:31:44,560 Speaker 1: the VPN sends all traffic along to you, and due 506 00:31:44,560 --> 00:31:47,040 Speaker 1: to encryption and the fact that there are multiple clients 507 00:31:47,040 --> 00:31:51,560 Speaker 1: connected to the VPN, it hides what's going on. Now, a 508 00:31:52,240 --> 00:31:56,320 Speaker 1: reverse proxy is similar, but it's different in an important way. 509 00:31:56,400 --> 00:31:59,200 Speaker 1: So a reverse proxy is a server that sits in 510 00:31:59,240 --> 00:32:03,760 Speaker 1: front of a server or servers. So with a VPN, no server 511 00:32:03,920 --> 00:32:06,720 Speaker 1: will ever connect directly to a client. It can only 512 00:32:06,720 --> 00:32:10,520 Speaker 1: connect to a client via the VPN. With a reverse proxy, 513 00:32:10,720 --> 00:32:16,000 Speaker 1: no client can connect directly to a specific server; instead it connects 514 00:32:16,000 --> 00:32:18,760 Speaker 1: to the reverse proxy server, which acts as kind of 515 00:32:18,760 --> 00:32:23,240 Speaker 1: a middleman and then sends traffic along to the ultimate server. 516 00:32:23,800 --> 00:32:28,120 Speaker 1: So it's a gatekeeper really, and attackers would not have 517 00:32:28,200 --> 00:32:31,840 Speaker 1: the IP address of the target server; they would instead 518 00:32:32,200 --> 00:32:36,040 Speaker 1: only be able to direct traffic to the reverse proxy server.
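The reverse-proxy gatekeeper idea can be sketched like this; the addresses and blocklist are invented, and a real reverse proxy would of course speak HTTP over the network rather than pass dictionaries between functions:

```python
# A toy sketch of the reverse-proxy idea: clients only ever see the proxy's
# published address; the origin server stays hidden behind it, and the proxy
# can drop suspect traffic before it ever reaches the origin. All addresses
# and the blocklist entry are invented for illustration.

ORIGIN_ADDR = "10.0.0.7"    # private; never exposed to the outside world
PROXY_ADDR = "203.0.113.1"  # the only address published in DNS

blocklist = {"198.51.100.66"}  # sources the proxy has decided to bounce

def origin_server(request):
    # The real server; it only ever hears from the proxy.
    return f"served: {request['path']}"

def reverse_proxy(request):
    """Gatekeeper in front of the origin: filter first, then forward."""
    if request["source_ip"] in blocklist:
        return "403 blocked at the proxy"
    # Forward to the origin; the client never learns ORIGIN_ADDR exists.
    return origin_server(request)

print(reverse_proxy({"source_ip": "192.0.2.10", "path": "/news"}))     # served: /news
print(reverse_proxy({"source_ip": "198.51.100.66", "path": "/news"}))  # 403 blocked at the proxy
```

This is the bouncer-outside-the-club arrangement described next: the right folks get through to the origin, and the undesirables never find out where the club's back door is.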
519 00:32:36,440 --> 00:32:39,200 Speaker 1: So if a company like Cloudflare is in charge 520 00:32:39,240 --> 00:32:43,920 Speaker 1: of those reverse proxy servers, Cloudflare can institute tougher 521 00:32:43,960 --> 00:32:48,000 Speaker 1: security measures to prevent an onslaught of illegitimate traffic hitting 522 00:32:48,560 --> 00:32:51,800 Speaker 1: the target. So the reverse proxy kind of acts like 523 00:32:51,840 --> 00:32:54,840 Speaker 1: a really tough bouncer outside of a club. The bouncer 524 00:32:54,880 --> 00:32:57,200 Speaker 1: will let the right folks into the club and make 525 00:32:57,240 --> 00:32:59,960 Speaker 1: sure that the undesirables hit the curb. Now, protecting 526 00:33:00,040 --> 00:33:03,840 Speaker 1: against DDoS attacks can get really sophisticated, largely because 527 00:33:04,280 --> 00:33:07,280 Speaker 1: a well-designed DDoS attack will aim at hitting 528 00:33:07,280 --> 00:33:10,480 Speaker 1: a target through several layers. Right, it won't just be 529 00:33:10,640 --> 00:33:15,720 Speaker 1: a simple overwhelming attack if it's planned out properly. 530 00:33:16,080 --> 00:33:18,840 Speaker 1: So defense has to be able to work for all 531 00:33:18,840 --> 00:33:22,480 Speaker 1: these different layers of attack. Otherwise, you can protect one 532 00:33:22,560 --> 00:33:25,680 Speaker 1: part of your target but leave another part unshielded, and 533 00:33:25,720 --> 00:33:28,400 Speaker 1: boom, the DDoS attack still ends up being effective. 534 00:33:28,880 --> 00:33:31,840 Speaker 1: This is why companies like Cloudflare exist: while 535 00:33:31,840 --> 00:33:36,400 Speaker 1: protection isn't impossible, it is time consuming and it's easy to 536 00:33:36,440 --> 00:33:40,480 Speaker 1: get wrong. It's also why it's a really big deal.
537 00:33:40,520 --> 00:33:44,960 Speaker 1: Whenever Cloudflare dumps a client, which doesn't happen often, 538 00:33:45,280 --> 00:33:50,160 Speaker 1: but it can in extreme circumstances. For example, there's the 539 00:33:50,240 --> 00:33:53,640 Speaker 1: Kiwi Farms case. Now, in case you are unaware of 540 00:33:53,720 --> 00:33:57,160 Speaker 1: Kiwi Farms, which I would say you are lucky if 541 00:33:57,200 --> 00:34:00,600 Speaker 1: you don't know what Kiwi Farms is. Kiwi Farms is 542 00:34:00,640 --> 00:34:05,040 Speaker 1: a site that houses forums largely dedicated to doxing, that 543 00:34:05,240 --> 00:34:10,680 Speaker 1: is, the release of private information about a person, harassing, abusing, 544 00:34:10,760 --> 00:34:17,040 Speaker 1: and threatening certain folks, for example the trans community. And 545 00:34:17,400 --> 00:34:21,960 Speaker 1: it's beyond horrifying the lengths that folks will go to 546 00:34:22,120 --> 00:34:26,880 Speaker 1: in order to torture targets. And the Kiwi Farms groups 547 00:34:27,040 --> 00:34:30,720 Speaker 1: have been known to heap so much abuse on people, 548 00:34:31,239 --> 00:34:36,160 Speaker 1: including revealing details of their personal lives online, or 549 00:34:36,200 --> 00:34:39,760 Speaker 1: inventing stories and spreading them as if they were true 550 00:34:39,800 --> 00:34:44,239 Speaker 1: online, or swatting victims. That means they make 551 00:34:44,280 --> 00:34:47,879 Speaker 1: a fake emergency call to law enforcement that prompts an 552 00:34:48,040 --> 00:34:52,560 Speaker 1: armed response team to arrive at the target's home. These 553 00:34:52,640 --> 00:34:54,960 Speaker 1: levels of abuse have gone so far that some folks 554 00:34:55,080 --> 00:34:58,680 Speaker 1: were driven to suicide as a result. It is 555 00:34:59,360 --> 00:35:03,600 Speaker 1: truly horrifying stuff.
Now, Kiwi Farms depended on Cloudflare 556 00:35:03,920 --> 00:35:08,520 Speaker 1: to shield the site from attacks, because obviously a hate group 557 00:35:08,680 --> 00:35:11,920 Speaker 1: is also going to become a target itself from people 558 00:35:11,920 --> 00:35:15,520 Speaker 1: who want to take that hate group down. In September 559 00:35:15,600 --> 00:35:18,920 Speaker 1: of this year, Cloudflare announced it was dropping Kiwi 560 00:35:18,920 --> 00:35:22,400 Speaker 1: Farms as a client due to quote immediate threat to 561 00:35:22,520 --> 00:35:26,480 Speaker 1: human life end quote, and so Kiwi Farms has had 562 00:35:26,560 --> 00:35:31,440 Speaker 1: trouble staying online ever since and has been the 563 00:35:31,480 --> 00:35:35,600 Speaker 1: site of data breaches since then. People have gotten access 564 00:35:35,600 --> 00:35:39,520 Speaker 1: to accounts and things like that, and there are related 565 00:35:39,520 --> 00:35:43,000 Speaker 1: issues that the site has encountered that involve hosting, so 566 00:35:43,120 --> 00:35:46,200 Speaker 1: not just protection but hosting.
But that's just a 567 00:35:46,239 --> 00:35:48,239 Speaker 1: related but different matter, so we're not going to go 568 00:35:48,280 --> 00:35:51,760 Speaker 1: into all that. But it really does illustrate that Cloudflare's 569 00:35:51,800 --> 00:35:56,880 Speaker 1: services are really important, particularly for high-profile sites, 570 00:35:56,920 --> 00:36:00,160 Speaker 1: whether that site is high profile because it's seen as 571 00:36:00,200 --> 00:36:03,680 Speaker 1: being a really important part of the infrastructure as a 572 00:36:03,680 --> 00:36:07,080 Speaker 1: whole, or it's just high profile because of 573 00:36:07,120 --> 00:36:09,120 Speaker 1: the nature of the site itself, as in the case of 574 00:36:09,200 --> 00:36:12,760 Speaker 1: Kiwi Farms. And once that protection goes away, those sites 575 00:36:12,800 --> 00:36:16,080 Speaker 1: have a real hard time staying up because they are 576 00:36:16,080 --> 00:36:19,400 Speaker 1: such a tempting target. At any rate, the DDoS attack 577 00:36:19,480 --> 00:36:21,680 Speaker 1: that brought down the airport websites that I talked about 578 00:36:21,680 --> 00:36:24,280 Speaker 1: at the beginning appears to have been a relatively 579 00:36:24,320 --> 00:36:26,840 Speaker 1: simple one. It was effective in that it did clog 580 00:36:26,920 --> 00:36:30,000 Speaker 1: up web traffic to the airport websites, but it didn't 581 00:36:30,040 --> 00:36:32,200 Speaker 1: take very long for folks to resolve the problem, and 582 00:36:32,280 --> 00:36:35,040 Speaker 1: as I mentioned, it failed to disrupt travel at all. 583 00:36:35,520 --> 00:36:38,439 Speaker 1: But we still see the occasional DDoS attack take 584 00:36:38,480 --> 00:36:41,680 Speaker 1: down sites and services that have a wider impact on society, 585 00:36:42,040 --> 00:36:45,080 Speaker 1: so it's not like these things are going away.
And again, 586 00:36:45,920 --> 00:36:50,640 Speaker 1: part of the responsibility falls to us as denizens of 587 00:36:50,680 --> 00:36:53,879 Speaker 1: the online world to make sure that we are being 588 00:36:53,880 --> 00:36:57,200 Speaker 1: as careful as we can so that we don't compromise 589 00:36:57,239 --> 00:37:00,719 Speaker 1: our devices and have them join a zombie army. Some of 590 00:37:00,760 --> 00:37:03,239 Speaker 1: that is beyond our control. Some of it falls to 591 00:37:03,400 --> 00:37:07,440 Speaker 1: companies to make sure that they institute better security measures 592 00:37:07,440 --> 00:37:11,719 Speaker 1: when they create Internet-connected devices so that hackers don't 593 00:37:11,840 --> 00:37:15,200 Speaker 1: easily have a skeleton key that gives them access to 594 00:37:15,360 --> 00:37:21,160 Speaker 1: an enormous number of those devices. And obviously, ultimately at 595 00:37:21,200 --> 00:37:23,839 Speaker 1: fault are the people who are directing these attacks. Right, 596 00:37:23,960 --> 00:37:26,040 Speaker 1: if they weren't doing it, then it wouldn't be a concern. 597 00:37:26,600 --> 00:37:29,680 Speaker 1: But we have to do our best to make sure 598 00:37:29,719 --> 00:37:33,560 Speaker 1: we don't become part of the problem. Anyway, that was 599 00:37:33,640 --> 00:37:39,839 Speaker 1: today's spooky topic of zombie computers and zombie armies. I'll 600 00:37:39,880 --> 00:37:43,800 Speaker 1: be talking about lots of other types of spooky related stuff, 601 00:37:43,880 --> 00:37:48,000 Speaker 1: questionably spooky related stuff, this month. I'm still trying to 602 00:37:48,000 --> 00:37:49,680 Speaker 1: figure out how I could do a Ghost in the 603 00:37:49,760 --> 00:37:53,320 Speaker 1: Machine episode.
I'll try and figure out if I can 604 00:37:53,320 --> 00:37:56,000 Speaker 1: make that happen. And there are some other concepts that are 605 00:37:56,040 --> 00:37:58,360 Speaker 1: floating around that I would like to tackle. If you 606 00:37:58,440 --> 00:38:01,160 Speaker 1: have suggestions for spooky topics that are tech related, 607 00:38:01,280 --> 00:38:03,160 Speaker 1: let me know. One way to do that is to 608 00:38:03,200 --> 00:38:05,720 Speaker 1: download the iHeartRadio app. It's free to download 609 00:38:05,719 --> 00:38:08,880 Speaker 1: and use. Just navigate on over to the TechStuff page. 610 00:38:08,880 --> 00:38:11,120 Speaker 1: You can do that in the search engine and use 611 00:38:11,160 --> 00:38:13,560 Speaker 1: the little microphone icon to leave me a voice message 612 00:38:13,640 --> 00:38:15,279 Speaker 1: up to thirty seconds in length. Let me know if 613 00:38:15,320 --> 00:38:17,440 Speaker 1: you would like me to use the message in an 614 00:38:17,520 --> 00:38:20,680 Speaker 1: upcoming episode. I'm all about opt-in, so I will 615 00:38:20,719 --> 00:38:23,040 Speaker 1: only do it if you tell me expressly that it's 616 00:38:23,040 --> 00:38:25,399 Speaker 1: okay to do it. And the other way to reach 617 00:38:25,480 --> 00:38:27,279 Speaker 1: out to me is on Twitter. The handle for the 618 00:38:27,320 --> 00:38:30,800 Speaker 1: show is TechStuffHSW, and I'll talk to 619 00:38:30,840 --> 00:38:40,319 Speaker 1: you again really soon. TechStuff is an iHeartRadio 620 00:38:40,400 --> 00:38:44,160 Speaker 1: production. For more podcasts from iHeartRadio, visit 621 00:38:44,200 --> 00:38:47,279 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you 622 00:38:47,360 --> 00:38:48,680 Speaker 1: listen to your favorite shows.