1 00:00:04,360 --> 00:00:12,320 Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, 2 00:00:12,360 --> 00:00:15,760 Speaker 1: and welcome to TechStuff. I'm your host, Jonathan Strickland. 3 00:00:15,840 --> 00:00:20,959 Speaker 1: I'm an executive producer with iHeartRadio, and how the tech are ya? Y'all, 4 00:00:21,040 --> 00:00:25,680 Speaker 1: it is time for a classic TechStuff episode. This 5 00:00:25,760 --> 00:00:28,920 Speaker 1: is one where my friend Shannon Morse joined the show. 6 00:00:29,000 --> 00:00:36,800 Speaker 1: Shannon is a really talented communicator of technology. She's a hacker, 7 00:00:37,360 --> 00:00:42,640 Speaker 1: she's fantastic. She's got a ton of content out there online. 8 00:00:43,080 --> 00:00:45,720 Speaker 1: So I highly recommend you check out Shannon if you are 9 00:00:45,800 --> 00:00:49,560 Speaker 1: unfamiliar with her work. She's great. And on this episode 10 00:00:49,720 --> 00:00:53,960 Speaker 1: we talked about hacking for dollars, the various ways that 11 00:00:54,080 --> 00:00:58,600 Speaker 1: hackers make money by being, you know, hackers, and this 12 00:00:58,680 --> 00:01:02,880 Speaker 1: episode published on Star Wars Day back in twenty sixteen. 13 00:01:02,960 --> 00:01:06,600 Speaker 1: That of course is May the fourth of twenty sixteen. 14 00:01:07,000 --> 00:01:11,640 Speaker 1: Sit back and enjoy. The term hacker is actually a 15 00:01:11,760 --> 00:01:14,640 Speaker 1: very broad term that can apply to a lot of 16 00:01:14,640 --> 00:01:18,480 Speaker 1: different things, and not all of them are that nefarious, 17 00:01:18,640 --> 00:01:22,959 Speaker 1: evil, infiltrate-a-system-and-steal-all-the-corporate-secrets 18 00:01:23,040 --> 00:01:27,120 Speaker 1: kind of approach to hacking that Hollywood often presents. Right, 19 00:01:27,319 --> 00:01:29,720 Speaker 1: right, exactly. I actually ask this question to a lot 20 00:01:29,720 --> 00:01:32,440 Speaker 1: of people, especially when I first meet them. Since I'm 21 00:01:32,600 --> 00:01:35,600 Speaker 1: so closely affiliated with a lot of the infosec community, 22 00:01:36,160 --> 00:01:39,520 Speaker 1: I want to surround myself with positive people. So you'll 23 00:01:39,560 --> 00:01:42,720 Speaker 1: notice with the hacker definition, you can either get a 24 00:01:42,800 --> 00:01:46,479 Speaker 1: very negative vibe from somebody or a very positive vibe. Oftentimes, 25 00:01:46,480 --> 00:01:48,680 Speaker 1: with the negative vibe, you'll get somebody who says, oh, 26 00:01:48,760 --> 00:01:51,480 Speaker 1: that's the person who stole my credit card data when 27 00:01:51,520 --> 00:01:53,800 Speaker 1: I went to a restaurant the other day. But on 28 00:01:53,840 --> 00:01:56,320 Speaker 1: the positive side, you'll get somebody that says, oh, they're 29 00:01:56,360 --> 00:01:59,400 Speaker 1: the kind of people that will break something apart and 30 00:01:59,400 --> 00:02:01,120 Speaker 1: then put it back together in a way that it 31 00:02:01,160 --> 00:02:03,440 Speaker 1: wasn't supposed to be put back together, to make it 32 00:02:03,440 --> 00:02:08,560 Speaker 1: do something cool. And that's a hack in the mainstream. So 33 00:02:08,639 --> 00:02:10,639 Speaker 1: that's the way I see it. I see hackers as 34 00:02:10,639 --> 00:02:15,680 Speaker 1: being people who reverse engineer different software, different hardware.
It 35 00:02:15,720 --> 00:02:18,480 Speaker 1: could just be a bicycle, for example, and put it 36 00:02:18,560 --> 00:02:21,320 Speaker 1: back together in a way to make it harder, better, faster, 37 00:02:21,400 --> 00:02:25,960 Speaker 1: and stronger. Nice, the old Daft Punk approach. Of course. Yeah, 38 00:02:26,080 --> 00:02:30,240 Speaker 1: I agree entirely. The original term hacker was really all 39 00:02:30,280 --> 00:02:35,240 Speaker 1: about people who have almost an insatiable curiosity to learn 40 00:02:35,280 --> 00:02:39,840 Speaker 1: how stuff works. Oddly enough, I share that quality, having 41 00:02:39,880 --> 00:02:43,480 Speaker 1: worked at HowStuffWorks for a decade. But yeah, 42 00:02:43,480 --> 00:02:46,160 Speaker 1: to understand how it works and then to make stuff 43 00:02:46,200 --> 00:02:49,480 Speaker 1: do things it wasn't necessarily intended to do. Not for 44 00:02:49,560 --> 00:02:54,080 Speaker 1: nefarious purposes necessarily, although that could clearly be an application, 45 00:02:54,480 --> 00:02:59,000 Speaker 1: but just for curiosity's sake. Can I take these elements 46 00:02:59,360 --> 00:03:01,040 Speaker 1: that are meant to do this one thing and do 47 00:03:01,200 --> 00:03:05,640 Speaker 1: something completely transformative with it, whether it is hardware or software? 48 00:03:05,760 --> 00:03:08,440 Speaker 1: And we've seen some really cool stuff come out of that. 49 00:03:08,480 --> 00:03:10,640 Speaker 1: I mean, I would argue that a lot of the 50 00:03:10,680 --> 00:03:13,760 Speaker 1: things you see in the cosplay world and the steampunk world, 51 00:03:14,120 --> 00:03:18,600 Speaker 1: those are all taking elements of hacking. Maker Faire is 52 00:03:18,760 --> 00:03:21,600 Speaker 1: really just a hacker's paradise when you get down to it, 53 00:03:21,720 --> 00:03:24,520 Speaker 1: especially for hardware hacks. Absolutely, I'm kind of sad I'm 54 00:03:24,520 --> 00:03:27,240 Speaker 1: going to miss Maker Faire this year. I haven't been 55 00:03:27,280 --> 00:03:29,200 Speaker 1: to one yet. I've been to a small one here 56 00:03:29,200 --> 00:03:33,240 Speaker 1: in Atlanta, a very modest Maker Faire. Everyone there was 57 00:03:33,440 --> 00:03:37,040 Speaker 1: great and passionate and intelligent, but it was, you know, 58 00:03:37,120 --> 00:03:39,200 Speaker 1: a much smaller scale than something you would see in 59 00:03:39,240 --> 00:03:42,280 Speaker 1: the Bay Area. Yeah, but that's the kind of thing 60 00:03:42,400 --> 00:03:45,680 Speaker 1: that hacker means to me. Now that being said, in 61 00:03:45,720 --> 00:03:49,119 Speaker 1: this episode, we're really going to be focusing on sort 62 00:03:49,120 --> 00:03:52,440 Speaker 1: of the computer-oriented, really the software side of hacking, 63 00:03:53,360 --> 00:03:54,960 Speaker 1: and a large part of it's going to be on 64 00:03:55,120 --> 00:03:58,480 Speaker 1: the bad guys, the naughty bits, as I call it 65 00:03:58,480 --> 00:04:01,960 Speaker 1: in our notes, simply to talk about what are 66 00:04:02,000 --> 00:04:06,960 Speaker 1: the ways that hackers, or the malicious hackers, cause problems, 67 00:04:07,880 --> 00:04:11,760 Speaker 1: and how do they expect to profit from that?
And also 68 00:04:12,160 --> 00:04:15,640 Speaker 1: we'll look at ways that hackers who don't follow that path, 69 00:04:15,680 --> 00:04:19,839 Speaker 1: who are looking to help people, not hurt people, how 70 00:04:19,880 --> 00:04:23,279 Speaker 1: do they make a living? Because it's one of those 71 00:04:23,279 --> 00:04:25,520 Speaker 1: things where you kind of take it for granted when 72 00:04:25,520 --> 00:04:28,240 Speaker 1: you see the Hollywood depiction of a hacker, the person 73 00:04:28,320 --> 00:04:31,200 Speaker 1: sitting down. Usually they're sitting at a keyboard and for 74 00:04:31,279 --> 00:04:35,640 Speaker 1: some reason, their monitor is only monochromatic green. Yes, that's 75 00:04:35,640 --> 00:04:37,840 Speaker 1: so true. Well, yeah, they're using the old Apple IIs. 76 00:04:38,000 --> 00:04:42,240 Speaker 1: Um, terminals are actually written in green oftentimes, but you 77 00:04:42,279 --> 00:04:44,919 Speaker 1: can change the colors to rainbow colors if you choose. 78 00:04:44,920 --> 00:04:48,119 Speaker 1: That is a hack, it's a real-life hack. Yeah, yeah. 79 00:04:48,120 --> 00:04:50,360 Speaker 1: And usually you see them sitting down and then they 80 00:04:50,400 --> 00:04:56,160 Speaker 1: cause some sort of mischief, sometimes bordering on sabotage. But 81 00:04:56,200 --> 00:04:58,359 Speaker 1: then, when you think about it outside of the 82 00:04:58,400 --> 00:05:02,240 Speaker 1: context of that scene, you think, oh, how did they 83 00:05:02,279 --> 00:05:05,159 Speaker 1: expect to profit from this? Yeah, so that's kind of 84 00:05:05,160 --> 00:05:08,560 Speaker 1: what we're looking at. Yeah, because it's always important to 85 00:05:08,560 --> 00:05:11,839 Speaker 1: me to reiterate too that there are always going to 86 00:05:11,880 --> 00:05:13,920 Speaker 1: be two sides of a coin to everything in life. 87 00:05:14,040 --> 00:05:17,360 Speaker 1: Of course, there are going to be bad guys in 88 00:05:16,480 --> 00:05:20,120 Speaker 1: the world who do nefarious hacks, but there's 89 00:05:20,160 --> 00:05:23,560 Speaker 1: also a lot of good guys too. And personally, for me, 90 00:05:23,600 --> 00:05:26,000 Speaker 1: the reason why I'm so interested in researching this is 91 00:05:26,040 --> 00:05:29,039 Speaker 1: because it has made me a much more privacy- and 92 00:05:29,080 --> 00:05:32,000 Speaker 1: security-guarded person. I've gotten a lot better at my 93 00:05:32,040 --> 00:05:34,440 Speaker 1: own protections online, and I feel like if somebody else 94 00:05:34,480 --> 00:05:37,400 Speaker 1: can understand what a hacker does on the bad side 95 00:05:37,440 --> 00:05:39,279 Speaker 1: as well as the good side, they can better protect 96 00:05:39,360 --> 00:05:42,559 Speaker 1: themselves too. And that's what I've always tried to teach people. Yeah, 97 00:05:42,600 --> 00:05:44,680 Speaker 1: I think all you have to really do is attend 98 00:05:44,760 --> 00:05:48,480 Speaker 1: one DEF CON and really have that driven home. I 99 00:05:48,520 --> 00:05:51,679 Speaker 1: have not yet gone to a DEF CON, mostly because 100 00:05:51,720 --> 00:05:53,840 Speaker 1: I don't know that I could part with my smartphone 101 00:05:53,880 --> 00:05:56,479 Speaker 1: for that long, and I certainly wouldn't take it with me. 102 00:05:57,400 --> 00:06:02,040 Speaker 1: Bring a burner phone, you'll be fine. Yeah, that's me, Jonathan, 103 00:06:02,080 --> 00:06:05,279 Speaker 1: the guy who carries the burner.
It makes sense, I 104 00:06:05,320 --> 00:06:07,240 Speaker 1: mean, when you're doing something like that. So for those 105 00:06:07,279 --> 00:06:10,719 Speaker 1: who don't know, DEF CON is a large hacker-based conference 106 00:06:11,160 --> 00:06:16,320 Speaker 1: largely looking at the realm of information security, and often 107 00:06:16,360 --> 00:06:20,560 Speaker 1: you'll have entire presentations dedicated to showing off 108 00:06:20,640 --> 00:06:24,200 Speaker 1: vulnerabilities in security, again not necessarily so that people can 109 00:06:24,240 --> 00:06:27,280 Speaker 1: take advantage of them, but rather to raise awareness and 110 00:06:27,360 --> 00:06:30,400 Speaker 1: to kind of force the hands of the parties that 111 00:06:30,440 --> 00:06:34,599 Speaker 1: are responsible for that software to take action and fix 112 00:06:34,640 --> 00:06:37,360 Speaker 1: a problem. Right, like that was what we saw with 113 00:06:38,400 --> 00:06:42,599 Speaker 1: the hack about remotely taking control of a person's vehicle, 114 00:06:43,200 --> 00:06:46,400 Speaker 1: specifically Jeep was really having that issue. That was one of 115 00:06:46,400 --> 00:06:49,440 Speaker 1: those things where the researchers were saying, look, we're bringing 116 00:06:49,480 --> 00:06:52,320 Speaker 1: this to light, not so that we can create an 117 00:06:52,360 --> 00:06:55,360 Speaker 1: era where people are terrified of their vehicles, that someone's 118 00:06:55,400 --> 00:06:57,760 Speaker 1: going to take remote control of their car, but rather 119 00:06:57,839 --> 00:07:01,920 Speaker 1: to really drive home the fact that information security 120 00:07:02,040 --> 00:07:05,640 Speaker 1: is now important everywhere. It's not just your phone, 121 00:07:05,680 --> 00:07:08,880 Speaker 1: it's not just your computer. As the Internet of Things 122 00:07:08,960 --> 00:07:12,760 Speaker 1: continues to blossom, it's everything. Yes, I agree. And in 123 00:07:12,760 --> 00:07:16,440 Speaker 1: that sense, those researchers were trying to use something the 124 00:07:16,480 --> 00:07:20,800 Speaker 1: old-school term for is responsible disclosure, where they explain 125 00:07:21,400 --> 00:07:23,960 Speaker 1: some kind of vulnerability that they found to the company 126 00:07:24,240 --> 00:07:27,440 Speaker 1: in hopes that the company will fix this problem before 127 00:07:27,440 --> 00:07:30,320 Speaker 1: it becomes mainstream and before it gets out into the wild. 128 00:07:30,760 --> 00:07:34,160 Speaker 1: In the case of Jeep, I believe, if my memory 129 00:07:34,200 --> 00:07:37,880 Speaker 1: serves me right, that Jeep did not necessarily release a 130 00:07:37,880 --> 00:07:41,200 Speaker 1: patch for this vulnerability. So then the researchers decided to 131 00:07:41,240 --> 00:07:44,360 Speaker 1: go public with the information that they found, and 132 00:07:44,400 --> 00:07:46,920 Speaker 1: then Jeep decided to fix it once everybody else knew 133 00:07:46,960 --> 00:07:50,080 Speaker 1: about it. Right. And sometimes that's what it takes. And 134 00:07:51,320 --> 00:07:54,360 Speaker 1: I've had the same discussion offline with a mutual friend 135 00:07:54,360 --> 00:07:57,960 Speaker 1: of ours, Brian Brushwood. Brian is a stage magician. He 136 00:07:58,000 --> 00:08:01,000 Speaker 1: has a show called Scam School. It's all about social engineering.
137 00:08:01,400 --> 00:08:03,600 Speaker 1: One of the things I have talked about with Brian 138 00:08:03,720 --> 00:08:06,560 Speaker 1: is that on his show, he often shows how to do 139 00:08:06,680 --> 00:08:10,120 Speaker 1: certain types of scams or tricks, but they're mostly in 140 00:08:10,160 --> 00:08:13,360 Speaker 1: the bar bet world, right? Like, not stuff that you would 141 00:08:13,400 --> 00:08:16,200 Speaker 1: do to ruin someone's life, but something that, you know, 142 00:08:16,240 --> 00:08:18,040 Speaker 1: you might win a free beer 143 00:08:18,600 --> 00:08:23,280 Speaker 1: that way. Yeah. And he had an 144 00:08:23,280 --> 00:08:27,160 Speaker 1: episode where he showed off this guy who was 145 00:08:27,320 --> 00:08:32,080 Speaker 1: demonstrating a well-known vulnerability of a popular bike lock 146 00:08:32,280 --> 00:08:34,079 Speaker 1: that has been off the market for a couple of 147 00:08:34,200 --> 00:08:37,840 Speaker 1: years because of this vulnerability. But that particular vulnerability meant 148 00:08:37,840 --> 00:08:41,040 Speaker 1: that you could use a regular plastic pen, remove the 149 00:08:41,080 --> 00:08:44,240 Speaker 1: pen part of the pen, use the casing, and jam 150 00:08:44,320 --> 00:08:46,840 Speaker 1: that into the lock and pop the lock open. Oh, 151 00:08:46,880 --> 00:08:50,720 Speaker 1: that's horrible, right? And so people were complaining in the comments, 152 00:08:50,720 --> 00:08:54,800 Speaker 1: they were saying, you're publicizing this vulnerability. And 153 00:08:54,840 --> 00:08:57,959 Speaker 1: I said, guess what, the bad guys already know about 154 00:08:58,000 --> 00:09:01,160 Speaker 1: this vulnerability. What they're doing is publicizing it to a 155 00:09:01,200 --> 00:09:03,840 Speaker 1: public that might be still vulnerable to it so that 156 00:09:03,920 --> 00:09:07,520 Speaker 1: they don't fall victim. And that, to me, is a 157 00:09:07,679 --> 00:09:11,280 Speaker 1: very important part of hackers across the board. They serve 158 00:09:11,520 --> 00:09:16,840 Speaker 1: a very important purpose, to alert folks to potential dangers before 159 00:09:17,120 --> 00:09:21,520 Speaker 1: it's too late. Yeah, absolutely. And those hackers 160 00:09:21,559 --> 00:09:25,200 Speaker 1: are the people that are generally working to make a 161 00:09:25,240 --> 00:09:29,200 Speaker 1: better world for consumers, a more private and secure 162 00:09:29,280 --> 00:09:32,440 Speaker 1: world for consumers. But then, of course, on the other hand, 163 00:09:32,520 --> 00:09:35,360 Speaker 1: are the baddies. Yeah, let's talk about some of them. 164 00:09:35,440 --> 00:09:38,840 Speaker 1: So I kind of gave some weird little titles for 165 00:09:38,880 --> 00:09:41,400 Speaker 1: this when I was typing it up, because in the 166 00:09:41,400 --> 00:09:43,640 Speaker 1: middle of the week I get bored, Shannon, let's 167 00:09:43,679 --> 00:09:45,960 Speaker 1: be honest, and so when I was making an outline 168 00:09:46,040 --> 00:09:47,960 Speaker 1: kind of for us to work from, I started coming 169 00:09:48,000 --> 00:09:51,600 Speaker 1: up with goofy subtitles. So this whole section is titled 170 00:09:51,800 --> 00:09:55,040 Speaker 1: the Naughty Bits in our notes, and the first one 171 00:09:55,120 --> 00:09:59,600 Speaker 1: is malware moolah, as in people who make money through 172 00:10:00,080 --> 00:10:03,959 Speaker 1: the development or distribution of malware.
And malware, as I've 173 00:10:04,000 --> 00:10:06,760 Speaker 1: said on this show many times in order to define it, 174 00:10:06,760 --> 00:10:09,760 Speaker 1: it's really software that is intended to do something that 175 00:10:09,880 --> 00:10:12,760 Speaker 1: is ultimately harmful to the person who runs that software 176 00:10:12,800 --> 00:10:19,679 Speaker 1: on their machine. It covers a wide array of different subcategories. Like, uh, 177 00:10:19,920 --> 00:10:21,840 Speaker 1: you know, this is the sort of term that we 178 00:10:21,960 --> 00:10:23,679 Speaker 1: normally would have in the old days just called a 179 00:10:23,720 --> 00:10:27,240 Speaker 1: computer virus. But a computer virus is a very specific thing, 180 00:10:27,320 --> 00:10:31,760 Speaker 1: and malware covers more stuff than just viruses, also worms 181 00:10:31,800 --> 00:10:34,760 Speaker 1: and all sorts of stuff. Yeah, there's malware for 182 00:10:34,880 --> 00:10:37,560 Speaker 1: Java and Flash. If you still have Flash installed, 183 00:10:37,600 --> 00:10:39,760 Speaker 1: I highly recommend that you uninstall it if you don't 184 00:10:39,800 --> 00:10:43,679 Speaker 1: need it. There's malware for browsers. There's malware for advertisements 185 00:10:43,720 --> 00:10:47,320 Speaker 1: online, for sponsors that you'll see, like, on different websites. 186 00:10:47,720 --> 00:10:49,720 Speaker 1: That was a very recent problem that a lot of 187 00:10:49,760 --> 00:10:53,800 Speaker 1: news publications had. Yeah, big-name news publications. But yeah, 188 00:10:53,840 --> 00:10:56,160 Speaker 1: so that was a big one. But you'll see malware 189 00:10:56,360 --> 00:10:59,640 Speaker 1: all over the place, and luckily we do have 190 00:10:59,720 --> 00:11:02,000 Speaker 1: anti-malware software that we can use to protect our 191 00:11:02,040 --> 00:11:04,800 Speaker 1: computers from it, and we can also block certain ports 192 00:11:04,800 --> 00:11:07,240 Speaker 1: on our routers, which can hopefully protect you from malware. 193 00:11:07,400 --> 00:11:10,560 Speaker 1: But there's also a lot of cases where malware is 194 00:11:11,200 --> 00:11:15,560 Speaker 1: distributed and built so quickly that a lot of that 195 00:11:15,600 --> 00:11:19,360 Speaker 1: anti-malware software is not updated quickly enough. So in 196 00:11:19,360 --> 00:11:20,760 Speaker 1: that case, we need to do the best that we 197 00:11:20,800 --> 00:11:24,160 Speaker 1: can to protect ourselves and keep malware from getting out 198 00:11:24,440 --> 00:11:26,960 Speaker 1: of the deep web.
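To make the anti-malware point above concrete: at their simplest, signature-based scanners hash files and compare the result against a database of known-bad hashes, which is exactly why a scanner whose signature database has gone stale misses brand-new malware. A minimal sketch in Python; the hash below is a made-up placeholder, not a real malware signature, and the function name is just for illustration.

```python
import hashlib

# Toy "signature database" of known-bad file hashes. In a real product this
# set is enormous and constantly updated; a stale database misses new malware.
KNOWN_BAD_SHA256 = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",  # placeholder
}

def looks_malicious(path: str) -> bool:
    """Hash the file in chunks and check it against the known-bad set."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest() in KNOWN_BAD_SHA256

# Scan this script itself, just so the example runs without any extra files.
print(looks_malicious(__file__))
```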
Yeah, you know, it used to 199 00:11:26,960 --> 00:11:31,200 Speaker 1: be that all you really needed to worry about 200 00:11:31,320 --> 00:11:34,120 Speaker 1: was just, don't go to the seedier elements of 201 00:11:34,160 --> 00:11:36,920 Speaker 1: the web, and you were generally all right. Right? Yeah, 202 00:11:36,920 --> 00:11:40,640 Speaker 1: it's kind of like avoiding a bad neighborhood. Like, yeah, obviously, 203 00:11:40,800 --> 00:11:43,040 Speaker 1: if you don't want to get robbed, there are certain 204 00:11:43,040 --> 00:11:45,679 Speaker 1: neighborhoods that you probably shouldn't walk around in by 205 00:11:45,679 --> 00:11:48,880 Speaker 1: yourself at night. Right. And this is kind of similar 206 00:11:48,920 --> 00:11:51,680 Speaker 1: in that case, where you avoid the deep web unless 207 00:11:51,679 --> 00:11:54,040 Speaker 1: you really want to be on somebody's, like, hit list 208 00:11:54,120 --> 00:11:57,160 Speaker 1: or something like that. Yeah. Yeah, if you suddenly think 209 00:11:57,160 --> 00:11:58,880 Speaker 1: that you want to come across as a big shot, 210 00:11:59,000 --> 00:12:01,360 Speaker 1: look, if you're not a big shot, don't do that. 211 00:12:02,080 --> 00:12:03,800 Speaker 1: It's kind of like walking up to 212 00:12:03,840 --> 00:12:06,400 Speaker 1: someone who works in a carnival and claiming that you're 213 00:12:06,440 --> 00:12:07,840 Speaker 1: with it and for it. If you don't know what 214 00:12:07,920 --> 00:12:10,959 Speaker 1: that means, you do not say that. Okay, I think 215 00:12:11,000 --> 00:12:16,319 Speaker 1: I just gave terrible advice to an entire population of listeners. Yeah, don't. 216 00:12:16,400 --> 00:12:19,360 Speaker 1: Don't talk to carnies unless you are one. All right, so, 217 00:12:19,600 --> 00:12:22,880 Speaker 1: uh, and I love you, carnies, I love you all. So, 218 00:12:23,120 --> 00:12:26,280 Speaker 1: the thing that we're getting across, though, is that 219 00:12:26,440 --> 00:12:29,559 Speaker 1: today that's not as big a guarantee as it used 220 00:12:29,559 --> 00:12:32,520 Speaker 1: to be. Right? Like ten years ago, you'd say, look, 221 00:12:32,600 --> 00:12:36,640 Speaker 1: just be careful. Don't download unusual files. Don't run 222 00:12:36,800 --> 00:12:41,160 Speaker 1: a file that's linked in your email without checking it 223 00:12:41,200 --> 00:12:44,199 Speaker 1: out first. You know, be careful opening up 224 00:12:44,240 --> 00:12:47,360 Speaker 1: emails from senders that you don't recognize. Be careful with 225 00:12:47,400 --> 00:12:52,439 Speaker 1: PDF files. Be careful with stuff, especially unsolicited stuff that 226 00:12:52,480 --> 00:12:56,280 Speaker 1: has come to you, because that raises the chances that 227 00:12:56,400 --> 00:12:59,280 Speaker 1: something hinky is going on. It doesn't necessarily mean it's 228 00:12:59,320 --> 00:13:03,840 Speaker 1: definitely a problem, but it's potentially a problem, and it's 229 00:13:03,880 --> 00:13:06,520 Speaker 1: better to be safe than sorry. Make sure you have 230 00:13:06,640 --> 00:13:10,280 Speaker 1: good antivirus software on your computer, make sure you have 231 00:13:10,280 --> 00:13:12,560 Speaker 1: a nice strong firewall. All of these kinds of things.
232 00:13:12,559 --> 00:13:17,320 Speaker 1: Those used to be pretty good at keeping most of 233 00:13:17,360 --> 00:13:20,200 Speaker 1: the malware away from you if you were being a 234 00:13:20,240 --> 00:13:25,440 Speaker 1: fairly responsible netizen. These days, they definitely help. These days, 235 00:13:25,600 --> 00:13:28,400 Speaker 1: the attacks are sometimes getting in, like in the 236 00:13:28,400 --> 00:13:32,600 Speaker 1: case of the advertisements on news sites. These are attacks 237 00:13:32,640 --> 00:13:36,079 Speaker 1: that are going through avenues that you at one 238 00:13:36,120 --> 00:13:39,240 Speaker 1: point would have considered perfectly safe. Right. Not that it's 239 00:13:39,280 --> 00:13:41,640 Speaker 1: happening all the time, but the fact that it can 240 00:13:41,760 --> 00:13:46,280 Speaker 1: happen tells you that it requires an extra level of 241 00:13:46,360 --> 00:13:51,160 Speaker 1: vigilance beyond what we used to say was sufficient. Yeah, absolutely. 242 00:13:51,320 --> 00:13:53,960 Speaker 1: The data collection for a lot of this malware is 243 00:13:54,040 --> 00:13:59,720 Speaker 1: extremely, it's highly sensitive, in the fact that a user's 244 00:14:00,120 --> 00:14:03,480 Speaker 1: data can get so much money on the 245 00:14:03,520 --> 00:14:08,120 Speaker 1: deep web. So much money, really, particularly a collection of 246 00:14:08,240 --> 00:14:10,959 Speaker 1: user data. That's where the big money is, right? I 247 00:14:11,000 --> 00:14:14,080 Speaker 1: did an episode once where we tried to break down 248 00:14:14,120 --> 00:14:18,280 Speaker 1: how much is your personal information worth. A few bucks? Yeah, 249 00:14:18,320 --> 00:14:21,520 Speaker 1: it really depends. It depends upon what information you're talking about, 250 00:14:21,520 --> 00:14:25,240 Speaker 1: like how extensive is that profile on a person? But yeah, 251 00:14:25,280 --> 00:14:27,680 Speaker 1: it's not much in the grand scheme of things. Like, 252 00:14:27,720 --> 00:14:30,880 Speaker 1: to you, it's worth a lot. Right, you as a person, Shannon. Right, 253 00:14:30,960 --> 00:14:33,320 Speaker 1: you as a person, that information is worth a lot 254 00:14:33,400 --> 00:14:36,160 Speaker 1: of money to you. Yeah, because it's who you are. 255 00:14:36,880 --> 00:14:39,920 Speaker 1: To someone else, it's worth pennies on the dollar, really 256 00:14:39,960 --> 00:14:43,640 Speaker 1: depending upon the amount of information. But the 257 00:14:43,840 --> 00:14:49,040 Speaker 1: malware often is giving hackers access to massive amounts of 258 00:14:49,080 --> 00:14:53,040 Speaker 1: info about a huge number of people, and in numbers 259 00:14:53,400 --> 00:14:56,280 Speaker 1: there is more value, and that's when they will sell that. 260 00:14:56,720 --> 00:15:00,000 Speaker 1: Sometimes they sell it to companies that are just interested 261 00:15:00,160 --> 00:15:03,440 Speaker 1: in getting information so that they can do targeted advertising. 262 00:15:03,880 --> 00:15:07,880 Speaker 1: So it might be that the ultimate use of your 263 00:15:07,920 --> 00:15:11,240 Speaker 1: information isn't as bad as it could be.
It just 264 00:15:11,280 --> 00:15:14,320 Speaker 1: means you're going to get some ads, but still not 265 00:15:14,600 --> 00:15:17,000 Speaker 1: fun to think about, and to think that, you know, 266 00:15:17,080 --> 00:15:19,760 Speaker 1: now these companies have access to information about you that 267 00:15:19,840 --> 00:15:24,560 Speaker 1: you probably would rather they not have, particularly in targeted advertising. 268 00:15:24,840 --> 00:15:28,760 Speaker 1: The famous story about Target, when they started sending ads 269 00:15:28,800 --> 00:15:31,760 Speaker 1: to a young lady that were related to pregnancy. Yep, 270 00:15:32,000 --> 00:15:34,800 Speaker 1: and then her dad got really, really ticked off about it. 271 00:15:34,800 --> 00:15:37,720 Speaker 1: But it turned out that the girl was pregnant, yeah, 272 00:15:37,760 --> 00:15:40,920 Speaker 1: and that it was because the algorithms had picked up 273 00:15:41,120 --> 00:15:44,200 Speaker 1: through her search habits that she was pregnant, based upon 274 00:15:44,320 --> 00:15:46,800 Speaker 1: the search terms she was putting in, and so they 275 00:15:46,840 --> 00:15:51,080 Speaker 1: proactively sent her some coupons for pregnancy-related items. The 276 00:15:51,160 --> 00:15:53,600 Speaker 1: dad got very upset. Then the dad ended up apologizing 277 00:15:53,600 --> 00:15:56,040 Speaker 1: to Target, saying that he was unaware at the time 278 00:15:56,120 --> 00:15:58,800 Speaker 1: of the full situation. Well, in that case, it was 279 00:15:59,240 --> 00:16:03,040 Speaker 1: a search algorithm, not a hacker who had gained access to 280 00:16:03,040 --> 00:16:05,600 Speaker 1: stuff and then sold it. But there are other cases 281 00:16:05,640 --> 00:16:08,600 Speaker 1: where that does happen. Yeah, where, you know, it's just a 282 00:16:08,680 --> 00:16:11,560 Speaker 1: database of info, and a lot of times they will 283 00:16:11,600 --> 00:16:14,920 Speaker 1: release this malware in something that's called an exploit kit. 284 00:16:15,360 --> 00:16:18,480 Speaker 1: So generally, these exploit kits are like a batch of 285 00:16:18,840 --> 00:16:22,440 Speaker 1: similar malware that will work across several different platforms, 286 00:16:22,480 --> 00:16:25,440 Speaker 1: whether that's several different types of software, like Java 287 00:16:25,480 --> 00:16:28,480 Speaker 1: and Flash, or several different browsers. It could be several 288 00:16:28,480 --> 00:16:31,800 Speaker 1: different operating systems too. So you might see an exploit 289 00:16:31,840 --> 00:16:36,200 Speaker 1: kit that works on Linux fourteen oh four but also 290 00:16:36,240 --> 00:16:40,960 Speaker 1: works on Windows XP up through eight, or something like that. Right. 291 00:16:41,400 --> 00:16:43,560 Speaker 1: And what's crazy is that when you start looking at, 292 00:16:44,280 --> 00:16:46,040 Speaker 1: I mean, this is one of the things that hackers do, right? 293 00:16:46,040 --> 00:16:48,520 Speaker 1: They'll look at operating systems and what the market penetration 294 00:16:48,600 --> 00:16:51,200 Speaker 1: is for those systems, because that shows you where 295 00:16:51,200 --> 00:16:53,640 Speaker 1: your target-rich environment is. Right.
So if you have 296 00:16:53,680 --> 00:16:59,720 Speaker 1: Windows seven, guess what, you are a prime target for malware, 297 00:16:59,760 --> 00:17:04,760 Speaker 1: because that is by far the largest, it has the 298 00:17:04,800 --> 00:17:07,600 Speaker 1: greatest market share of any operating system right now. Yeah. 299 00:17:07,840 --> 00:17:13,919 Speaker 1: Windows XP still, it's number three, number three, and it 300 00:17:14,000 --> 00:17:16,600 Speaker 1: has not been supported by Microsoft for 301 00:17:16,680 --> 00:17:21,000 Speaker 1: two years. This, by the way, is a bad thing. If you 302 00:17:21,040 --> 00:17:23,960 Speaker 1: want to be really secure with your computer information, 303 00:17:24,000 --> 00:17:25,760 Speaker 1: you don't want to be using an operating system that 304 00:17:25,800 --> 00:17:28,199 Speaker 1: no longer gets support from the company that made it, 305 00:17:29,560 --> 00:17:32,440 Speaker 1: because that means no vulnerabilities will be patched from 306 00:17:32,440 --> 00:17:34,439 Speaker 1: that moment forward. You're pretty much on your own. You 307 00:17:34,480 --> 00:17:37,240 Speaker 1: have gone into the dark forest, and you forgot to 308 00:17:37,240 --> 00:17:41,800 Speaker 1: bring your flashlight. It's pretty dangerous. One of the things 309 00:17:41,800 --> 00:17:45,439 Speaker 1: that I think kind of leads in from what 310 00:17:45,480 --> 00:17:48,280 Speaker 1: you were saying before with these exploit kits, one of 311 00:17:48,320 --> 00:17:52,480 Speaker 1: the most terrifying aspects of this type of malware, 312 00:17:52,480 --> 00:17:54,920 Speaker 1: and the fact that people can use it for 313 00:17:55,400 --> 00:17:59,760 Speaker 1: nefarious purposes and monetary gain, is that you also have 314 00:17:59,800 --> 00:18:02,399 Speaker 1: a population of people who don't even understand how the 315 00:18:02,400 --> 00:18:05,880 Speaker 1: malware works. They don't even... Script kiddies is what I'm 316 00:18:05,880 --> 00:18:09,000 Speaker 1: getting at. Script kiddies, that's the term we use for 317 00:18:09,040 --> 00:18:16,000 Speaker 1: people who are benefiting from the work that hackers 318 00:18:16,040 --> 00:18:18,399 Speaker 1: have done. Hackers are the ones who are actually putting 319 00:18:18,440 --> 00:18:21,280 Speaker 1: together the software. They're the ones who have identified the 320 00:18:21,320 --> 00:18:25,119 Speaker 1: vulnerability and then exploited it in some way. Script kiddies 321 00:18:25,119 --> 00:18:27,600 Speaker 1: are the ones who are essentially given a set of 322 00:18:27,600 --> 00:18:31,000 Speaker 1: skeleton keys, and they didn't make the skeleton keys, they're 323 00:18:31,000 --> 00:18:35,560 Speaker 1: just using them. And it's scary because you don't need 324 00:18:35,600 --> 00:18:38,480 Speaker 1: a level of expertise. You might think, oh, well, I'm 325 00:18:38,520 --> 00:18:40,760 Speaker 1: kind of safe from hackers, because how many people are 326 00:18:40,800 --> 00:18:45,000 Speaker 1: actually hackers? How many people really know how this system works? Well, 327 00:18:45,560 --> 00:18:47,600 Speaker 1: you don't have to really know how the system works 328 00:18:47,640 --> 00:18:50,719 Speaker 1: if you have a tool that exploits a vulnerability. Oh, absolutely.
329 00:18:50,760 --> 00:18:53,720 Speaker 1: Although I really hate the word script kiddie, I will 330 00:18:53,720 --> 00:18:56,240 Speaker 1: put it out there, because I feel like if you're 331 00:18:56,320 --> 00:19:00,560 Speaker 1: interested in information security, and if you're interested in becoming 332 00:19:00,720 --> 00:19:04,680 Speaker 1: a good hacker, then you do start somewhere, and everybody 333 00:19:04,760 --> 00:19:06,760 Speaker 1: is going to start with the easy tools that are 334 00:19:06,760 --> 00:19:09,560 Speaker 1: out there and that are available for free. For example, 335 00:19:09,680 --> 00:19:11,960 Speaker 1: one thing that I learned how to use a couple 336 00:19:12,080 --> 00:19:15,120 Speaker 1: years back was this tool called Wireshark. It easily 337 00:19:15,200 --> 00:19:18,320 Speaker 1: lets you see everything that's happening on your wireless network, 338 00:19:18,400 --> 00:19:22,560 Speaker 1: or you can use it for any computers that are 339 00:19:21,960 --> 00:19:24,840 Speaker 1: on your network, like behind your router, so you can 340 00:19:24,840 --> 00:19:27,520 Speaker 1: see everything that's going on, and you don't necessarily have 341 00:19:27,600 --> 00:19:32,000 Speaker 1: to learn or understand what's going on behind it to 342 00:19:32,080 --> 00:19:35,479 Speaker 1: be able to read what's on your screen happening right 343 00:19:35,520 --> 00:19:38,639 Speaker 1: in front of you. I think it's really important, though, 344 00:19:38,720 --> 00:19:43,240 Speaker 1: for people who might be called script kiddies to look 345 00:19:43,280 --> 00:19:46,719 Speaker 1: at it as being beneficial, in that they can grow from 346 00:19:46,800 --> 00:19:50,680 Speaker 1: that process. They can start from being a beginner and say, okay, 347 00:19:50,760 --> 00:19:52,840 Speaker 1: well, I need to understand the theory. Now I can 348 00:19:52,880 --> 00:19:55,879 Speaker 1: move on from being a script kiddie, quote unquote, to 349 00:19:56,000 --> 00:19:59,200 Speaker 1: becoming somebody who is an expert in some kind of 350 00:19:59,280 --> 00:20:03,119 Speaker 1: information security out there. Yeah. When I think of the 351 00:20:03,240 --> 00:20:05,879 Speaker 1: term script kiddie, in my mind, it's a 352 00:20:05,920 --> 00:20:09,520 Speaker 1: subset of the people that typically get labeled as such. Yeah, 353 00:20:09,640 --> 00:20:12,720 Speaker 1: that subset being people who have little to no interest 354 00:20:12,760 --> 00:20:17,680 Speaker 1: in actually learning how to hack or program, people who 355 00:20:17,760 --> 00:20:22,000 Speaker 1: want a very, very fast-track way to gain either 356 00:20:22,480 --> 00:20:26,080 Speaker 1: a reputation, by being the person who took down a 357 00:20:26,160 --> 00:20:30,320 Speaker 1: system by whatever means, or by making a whole lot 358 00:20:30,320 --> 00:20:34,199 Speaker 1: of money really fast for relatively little effort. Those are 359 00:20:34,200 --> 00:20:36,040 Speaker 1: the ones I specifically think of when I think of 360 00:20:36,040 --> 00:20:38,679 Speaker 1: script kiddie. But you are absolutely right, you have to 361 00:20:38,720 --> 00:20:41,439 Speaker 1: start somewhere if you're interested in it. I'm kind of 362 00:20:41,480 --> 00:20:44,240 Speaker 1: defensive about that, because I was called a script kiddie 363 00:20:44,280 --> 00:20:47,760 Speaker 1: when I first started off learning about hacking 364 00:20:47,760 --> 00:20:51,320 Speaker 1: and information security.
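For the curious, the kind of visibility Shannon describes getting from Wireshark can also be had programmatically. Here is a minimal sketch using Python's Scapy library; the interface name is an assumption for illustration, and, as she stresses, you should only ever sniff a network you own or are explicitly authorized to test.

```python
# Requires: pip install scapy (capturing usually needs root/admin privileges).
from scapy.all import sniff

def show(packet):
    # Print a one-line summary per packet: protocol, source, destination.
    print(packet.summary())

# Capture ten packets from the local interface and summarize each one.
# "eth0" is an assumed interface name; yours may be "wlan0", "en0", etc.
sniff(iface="eth0", prn=show, count=10)
```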
People would be like, oh, she's just 365 00:20:51,359 --> 00:20:53,439 Speaker 1: a script kiddie, and I'd be like, well, I actually 366 00:20:53,440 --> 00:20:55,159 Speaker 1: want to understand the theory. I want to learn how 367 00:20:55,200 --> 00:20:57,160 Speaker 1: to program. I want to learn how to write code. I'm 368 00:20:57,200 --> 00:20:59,879 Speaker 1: no longer called that, because I have learned how to 369 00:21:00,280 --> 00:21:02,760 Speaker 1: write certain kinds of code. I have learned how to program. 370 00:21:02,800 --> 00:21:05,919 Speaker 1: I can make my Arduino do whatever I want. So 371 00:21:05,960 --> 00:21:09,280 Speaker 1: at this point in my stage, I've surpassed that moment 372 00:21:09,320 --> 00:21:13,000 Speaker 1: of being a noob, and I've gone on to learning 373 00:21:13,040 --> 00:21:17,560 Speaker 1: things and being able to understand specific tasks and get 374 00:21:17,600 --> 00:21:19,159 Speaker 1: them to do what I want them to do without 375 00:21:19,200 --> 00:21:22,760 Speaker 1: finding tutorials online. Yeah, so now I make my own tutorials. 376 00:21:23,000 --> 00:21:26,440 Speaker 1: See, that's nice, because when I started at HowStuffWorks, 377 00:21:26,440 --> 00:21:29,240 Speaker 1: they called me that weird bald guy, and today they 378 00:21:29,320 --> 00:21:32,440 Speaker 1: still do. Shannon and I will be back to talk 379 00:21:32,520 --> 00:21:34,800 Speaker 1: more about hacking for dollars in just a moment, but 380 00:21:34,880 --> 00:21:46,680 Speaker 1: first let's take this quick break. So that kind of 381 00:21:46,720 --> 00:21:50,800 Speaker 1: covers the malware approach. People can make money through malware, 382 00:21:50,840 --> 00:21:55,120 Speaker 1: either by selling your information, or they might do so by 383 00:21:55,800 --> 00:22:00,320 Speaker 1: another method, which kind of leads into this idea of ransomware. 384 00:22:00,440 --> 00:22:03,199 Speaker 1: So this would be a specific type of malware that 385 00:22:04,359 --> 00:22:07,280 Speaker 1: locks down your machine in some way so that you 386 00:22:07,320 --> 00:22:09,679 Speaker 1: can no longer access it, and then you essentially get 387 00:22:09,680 --> 00:22:11,520 Speaker 1: a message saying, hey, if you want 388 00:22:11,520 --> 00:22:13,840 Speaker 1: your data back, if you want access to your data, 389 00:22:13,920 --> 00:22:16,359 Speaker 1: if you want to be able to do all this stuff, 390 00:22:16,359 --> 00:22:18,879 Speaker 1: and you want our hands out of your business, then 391 00:22:18,880 --> 00:22:22,600 Speaker 1: you've got to pay us some moolah. Yeah. So basically 392 00:22:22,680 --> 00:22:26,520 Speaker 1: what happens with ransomware is, it is, just like you said, 393 00:22:26,520 --> 00:22:29,920 Speaker 1: a type of malware that gets distributed in one way, 394 00:22:29,920 --> 00:22:32,919 Speaker 1: shape, or form onto somebody's computer, and it ends up 395 00:22:33,040 --> 00:22:35,800 Speaker 1: encrypting their data. It could be a whole hard drive, 396 00:22:35,840 --> 00:22:38,120 Speaker 1: it could be a folder of data. It's some kind 397 00:22:38,160 --> 00:22:40,840 Speaker 1: of important data that they have sitting on their computer.
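To illustrate the mechanism being described, not to endorse it: ransomware's leverage is ordinary symmetric encryption where only the attacker holds the key. A benign sketch with Python's cryptography library shows why the data is effectively unrecoverable without that key.

```python
# Requires: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in a real attack, only the attacker holds this
cipher = Fernet(key)

plaintext = b"years of family photos and tax documents"
ciphertext = cipher.encrypt(plaintext)

# Without the key, reversing the ciphertext is computationally infeasible.
# With it, decryption is trivial, which is the whole ransom proposition.
assert cipher.decrypt(ciphertext) == plaintext
print(ciphertext[:40])
```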
398 00:22:41,640 --> 00:22:45,760 Speaker 1: And in many cases, the thief, the hacker, will ask 399 00:22:45,800 --> 00:22:49,320 Speaker 1: them in an email, or maybe an unencrypted text document 400 00:22:49,359 --> 00:22:53,119 Speaker 1: that's now surreptitiously on their computer out of nowhere, to 401 00:22:54,000 --> 00:22:56,400 Speaker 1: send them a certain amount of bitcoins, and they tell 402 00:22:56,440 --> 00:22:58,320 Speaker 1: them how to set up a bitcoin wallet so that 403 00:22:58,359 --> 00:23:00,960 Speaker 1: they can send the bitcoins to them for them to 404 00:23:00,960 --> 00:23:04,520 Speaker 1: get a passcode to unlock their encrypted data. Now, 405 00:23:04,520 --> 00:23:07,360 Speaker 1: the weird part is, they already own this data. It's 406 00:23:07,359 --> 00:23:09,440 Speaker 1: on their own hard drive. It could be anything from, 407 00:23:09,440 --> 00:23:12,280 Speaker 1: like, kids' photos, it could be tax documents. But in 408 00:23:12,320 --> 00:23:14,280 Speaker 1: any case, it's going to be some kind of important 409 00:23:14,320 --> 00:23:17,360 Speaker 1: information that people don't want to lose, because it might 410 00:23:17,400 --> 00:23:20,600 Speaker 1: be years and years of information that's just on that computer. 411 00:23:21,200 --> 00:23:23,440 Speaker 1: So of course people are going to send them bitcoins. 412 00:23:23,840 --> 00:23:26,240 Speaker 1: And I think last I checked, a bitcoin was a 413 00:23:26,280 --> 00:23:28,320 Speaker 1: few hundred bucks, so it ends up being quite a 414 00:23:28,320 --> 00:23:30,000 Speaker 1: bit of money that they have to send to get 415 00:23:30,000 --> 00:23:33,320 Speaker 1: their information unlocked. Yeah, and this is the 416 00:23:33,359 --> 00:23:37,399 Speaker 1: type of malware, when we were talking about the advertising 417 00:23:37,440 --> 00:23:41,040 Speaker 1: that was targeting people through massive news sites, if I'm 418 00:23:41,040 --> 00:23:44,520 Speaker 1: not mistaken, it was specifically ransomware. It was the kind 419 00:23:44,520 --> 00:23:48,679 Speaker 1: of stuff that was encrypting users' data. Yeah. Yeah, so it 420 00:23:48,720 --> 00:23:52,000 Speaker 1: wasn't just malware. It was ransomware that was infecting computers. 421 00:23:52,040 --> 00:23:53,919 Speaker 1: Because malware can do other stuff too, right? It can 422 00:23:54,119 --> 00:23:59,480 Speaker 1: create something like backdoor access. Oh yeah, yeah, it 423 00:24:00,119 --> 00:24:02,359 Speaker 1: can take control of your machine or just monitor what 424 00:24:02,400 --> 00:24:04,199 Speaker 1: you're doing. Even if they don't want to take control, 425 00:24:04,560 --> 00:24:06,640 Speaker 1: they can put in keyloggers so they can see 426 00:24:06,720 --> 00:24:09,600 Speaker 1: what all your passwords are. So you might want to 427 00:24:09,640 --> 00:24:13,280 Speaker 1: think about using things like a really good password manager. Yeah, 428 00:24:13,840 --> 00:24:18,560 Speaker 1: that's what I use, and I love mine. Yeah, so, 429 00:24:18,800 --> 00:24:21,640 Speaker 1: the things where you don't have to type the password 430 00:24:21,680 --> 00:24:23,560 Speaker 1: in, so you don't have to worry about keyloggers 431 00:24:23,600 --> 00:24:26,840 Speaker 1: picking up on that kind of stuff. But we'll talk 432 00:24:26,880 --> 00:24:29,560 Speaker 1: more about that in just a second. So, one of 433 00:24:29,560 --> 00:24:31,080 Speaker 1: the other ones I wanted to talk about.
This one's 434 00:24:31,119 --> 00:24:34,520 Speaker 1: kind of a gray area, because, this is... I 435 00:24:34,560 --> 00:24:38,359 Speaker 1: titled this section Spies Like Us, and by this I 436 00:24:38,480 --> 00:24:43,240 Speaker 1: meant state-sponsored hackers. People who are hacking on behalf 437 00:24:43,400 --> 00:24:49,920 Speaker 1: of a specific state or nation or government. Sometimes they 438 00:24:49,960 --> 00:24:54,800 Speaker 1: may be doing so not with the, what should I say, like, 439 00:24:54,880 --> 00:24:58,640 Speaker 1: not with the express permission of the nation. It may 440 00:24:58,720 --> 00:25:01,080 Speaker 1: turn out that the state says, hey, we didn't tell 441 00:25:01,119 --> 00:25:03,040 Speaker 1: them to do this. They're just doing it because they 442 00:25:03,160 --> 00:25:06,359 Speaker 1: love us so much and they hate 443 00:25:06,400 --> 00:25:09,720 Speaker 1: you guys. Yeah, and that's why they're doing it. Whether 444 00:25:09,760 --> 00:25:13,000 Speaker 1: that's true or not depends upon the situation. I would 445 00:25:13,080 --> 00:25:16,360 Speaker 1: think that if I were running a government 446 00:25:16,400 --> 00:25:19,600 Speaker 1: and I had employed a bunch of hackers to infiltrate 447 00:25:19,880 --> 00:25:23,840 Speaker 1: or sabotage another nation's systems, I also would like some 448 00:25:23,880 --> 00:25:28,159 Speaker 1: plausible deniability in there. Hey, I didn't tell them to 449 00:25:28,160 --> 00:25:30,439 Speaker 1: do it. I just said... Man, it's kind of like, 450 00:25:30,720 --> 00:25:34,560 Speaker 1: there's a story that a king of England once 451 00:25:34,680 --> 00:25:37,280 Speaker 1: yelled out, who will rid me of this meddlesome priest? 452 00:25:37,680 --> 00:25:39,919 Speaker 1: And then a couple of knights went off and ridded 453 00:25:40,000 --> 00:25:43,560 Speaker 1: him of that meddlesome priest, and it turned out that 454 00:25:43,600 --> 00:25:46,840 Speaker 1: he was just mad and just talking out loud. 455 00:25:47,480 --> 00:25:50,240 Speaker 1: And then one of his dearest friends ended up being 456 00:25:50,280 --> 00:25:52,760 Speaker 1: murdered by a couple of knights, because they heard the 457 00:25:52,800 --> 00:25:54,679 Speaker 1: guy talking and said, hey, we should get rid of him. 458 00:25:54,720 --> 00:25:57,840 Speaker 1: We'll get rewarded. That's what the states argue. I don't 459 00:25:57,840 --> 00:25:59,760 Speaker 1: know that that's always the case. Also, by the way, 460 00:26:00,000 --> 00:26:02,120 Speaker 1: any of you listeners out there who recognize who I'm talking 461 00:26:02,160 --> 00:26:04,680 Speaker 1: about, send me an email and prove it, because I'm 462 00:26:04,720 --> 00:26:08,560 Speaker 1: a medievalist and I love that stuff. But yeah, this 463 00:26:08,600 --> 00:26:10,760 Speaker 1: is something that we see. You know, you often will 464 00:26:10,800 --> 00:26:13,080 Speaker 1: hear stories about Chinese hackers or Russian hackers.
There was 465 00:26:13,119 --> 00:26:18,960 Speaker 1: a story several years ago about how information security experts 466 00:26:18,960 --> 00:26:23,400 Speaker 1: were noticing some artifacts in our power grid system that 467 00:26:23,440 --> 00:26:28,040 Speaker 1: were indicative of people who had infiltrated that system and 468 00:26:28,160 --> 00:26:30,680 Speaker 1: planted some stuff in there so that they could monitor 469 00:26:30,720 --> 00:26:33,280 Speaker 1: things, or perhaps even jump back into the power grid 470 00:26:33,359 --> 00:26:36,520 Speaker 1: system should push come to shove in some sort of 471 00:26:36,560 --> 00:26:39,840 Speaker 1: political situation. They had traced it back to either China 472 00:26:39,960 --> 00:26:43,040 Speaker 1: or Russia. It's pretty tricky to actually figure out where 473 00:26:43,040 --> 00:26:46,600 Speaker 1: attacks ultimately originate from, because if you're really good, you 474 00:26:46,600 --> 00:26:50,400 Speaker 1: can cover your tracks pretty well. But the United States 475 00:26:50,440 --> 00:26:52,880 Speaker 1: has done it too. You might have heard about Stuxnet. 476 00:26:53,160 --> 00:26:55,879 Speaker 1: That was the computer virus that was 477 00:26:55,920 --> 00:27:01,800 Speaker 1: designed to spin a centrifuge in a nuclear facility at 478 00:27:01,800 --> 00:27:04,640 Speaker 1: a speed greater than what it was supposed to spin at. 479 00:27:05,320 --> 00:27:07,800 Speaker 1: And originally I think the hope was that it would 480 00:27:07,800 --> 00:27:12,800 Speaker 1: cause a catastrophic failure and perhaps even destroy the facility. 481 00:27:13,119 --> 00:27:15,679 Speaker 1: As it turned out, it caused a failure, but not 482 00:27:15,760 --> 00:27:19,760 Speaker 1: at that level. But those are examples of something 483 00:27:19,760 --> 00:27:24,840 Speaker 1: that's technically legal within the country, because it's endorsed, 484 00:27:24,960 --> 00:27:30,120 Speaker 1: or at least permitted, by a government, but you don't 485 00:27:30,160 --> 00:27:33,520 Speaker 1: want it out there, because it seems pretty darn shady 486 00:27:33,560 --> 00:27:38,639 Speaker 1: to anybody else. Yeah. Yeah. So state-sponsored hacks are 487 00:27:38,920 --> 00:27:43,359 Speaker 1: more worrisome to me, because they oftentimes have much larger targets. 488 00:27:43,960 --> 00:27:48,040 Speaker 1: For example, they might target a large government facility, like, 489 00:27:48,400 --> 00:27:51,640 Speaker 1: I don't know, the Pentagon. So I worry about those, 490 00:27:51,680 --> 00:27:55,560 Speaker 1: because those kinds of servers have a lot of information 491 00:27:55,600 --> 00:27:58,680 Speaker 1: on the citizens of any sort of country. So many 492 00:27:58,720 --> 00:28:01,320 Speaker 1: times you see these in the news, it's always like, oh, well, 493 00:28:01,320 --> 00:28:05,040 Speaker 1: this hack was done by Chinese state-sponsored hackers, 494 00:28:05,160 --> 00:28:09,520 Speaker 1: or Russian state-sponsored hackers, or American state-sponsored hackers, 495 00:28:09,520 --> 00:28:12,879 Speaker 1: and these are... North Korea would be another big one. Yeah.
Yeah, 496 00:28:12,880 --> 00:28:16,160 Speaker 1: so they are... it might be a team of 497 00:28:16,240 --> 00:28:20,680 Speaker 1: hackers that have kind of grouped together in an illegitimate 498 00:28:20,720 --> 00:28:24,480 Speaker 1: company who are hired by a government, or, like you say, 499 00:28:24,520 --> 00:28:29,399 Speaker 1: where they may not necessarily have any affiliation, quote unquote, 500 00:28:29,400 --> 00:28:33,120 Speaker 1: with the government, but the government ends up paying them 501 00:28:33,160 --> 00:28:37,280 Speaker 1: in some way, shape, or form for their infiltration, because 502 00:28:37,320 --> 00:28:39,719 Speaker 1: it ends up helping the government in some way or another. 503 00:28:40,400 --> 00:28:43,840 Speaker 1: So it's a very sticky scenario when you start 504 00:28:43,880 --> 00:28:47,080 Speaker 1: dealing with these state-sponsored hackers, because it's hard 505 00:28:47,080 --> 00:28:51,520 Speaker 1: to understand how we are going to, you know, penalize them. 506 00:28:51,640 --> 00:28:54,200 Speaker 1: Who do we penalize? Do we penalize the government or the 507 00:28:54,240 --> 00:28:57,560 Speaker 1: hackers themselves, or both? Like, who was actually involved? It 508 00:28:57,720 --> 00:28:59,880 Speaker 1: might end up being, how do we address the 509 00:29:00,040 --> 00:29:03,680 Speaker 1: underlying situation that led to the employment of hackers 510 00:29:03,680 --> 00:29:08,600 Speaker 1: in the first place, which can get pretty delicate. 511 00:29:08,920 --> 00:29:12,400 Speaker 1: Another great example not too long ago, or at least 512 00:29:12,440 --> 00:29:15,720 Speaker 1: one that may or may not have 513 00:29:16,120 --> 00:29:18,040 Speaker 1: involved a state-sponsored hacker, 514 00:29:18,080 --> 00:29:22,200 Speaker 1: and I'm still somewhat skeptical of that, would be the Sony hack. 515 00:29:22,560 --> 00:29:26,360 Speaker 1: Oh yeah. Because with the Sony hack, the US government essentially 516 00:29:26,440 --> 00:29:29,360 Speaker 1: was pointing fingers at North Korea, saying the hackers must 517 00:29:29,400 --> 00:29:31,760 Speaker 1: have come from North Korea, look at this IP address, 518 00:29:31,960 --> 00:29:34,480 Speaker 1: which we don't even need to go into detail on right now, 519 00:29:34,800 --> 00:29:37,520 Speaker 1: except to say that an IP address does not proof 520 00:29:37,640 --> 00:29:41,520 Speaker 1: make. But at any rate, they're pointing over at North 521 00:29:41,600 --> 00:29:44,080 Speaker 1: Korea saying, we think the attacks came from there, the 522 00:29:44,320 --> 00:29:47,400 Speaker 1: attack appears to be politically motivated. North Korea, for its part, 523 00:29:47,440 --> 00:29:50,080 Speaker 1: the government, which, by the way, North Korea is not shy 524 00:29:50,160 --> 00:29:53,280 Speaker 1: about taking credit for stuff, but they said, no, no, 525 00:29:53,400 --> 00:29:56,000 Speaker 1: we didn't ask for this, but we're totally 526 00:29:56,000 --> 00:30:00,520 Speaker 1: cool with it happening. So, you know, it's one of those... 527 00:30:00,560 --> 00:30:02,920 Speaker 1: It's also very muddy, because obviously when you're talking about 528 00:30:02,920 --> 00:30:05,720 Speaker 1: things like espionage or sabotage or any of those things, 529 00:30:06,880 --> 00:30:09,760 Speaker 1: you don't come out and talk more about it. 530 00:30:09,800 --> 00:30:13,000 Speaker 1: That ends up being closed away.
Yeah, in fact, 531 00:30:13,000 --> 00:30:14,800 Speaker 1: I should really throw that over to the 532 00:30:14,800 --> 00:30:16,680 Speaker 1: Stuff They Don't Want You to Know guys and have 533 00:30:16,760 --> 00:30:18,240 Speaker 1: them do an episode on it, because that would be 534 00:30:18,240 --> 00:30:22,200 Speaker 1: a lot of fun. And then we've got the, 535 00:30:22,200 --> 00:30:26,640 Speaker 1: at least I would argue, the traditional concept 536 00:30:26,680 --> 00:30:29,400 Speaker 1: of a hacker from the Hollywood perspective. The black hats, 537 00:30:30,000 --> 00:30:32,640 Speaker 1: the ones that are wearing the hoodies, and they're 538 00:30:32,640 --> 00:30:34,920 Speaker 1: sitting at a keyboard and they're typing really fast on 539 00:30:34,960 --> 00:30:38,960 Speaker 1: a green and black screen. Oh yes, they've got, like, 540 00:30:39,040 --> 00:30:43,760 Speaker 1: some junk food around them. Yeah, male, and 541 00:30:43,800 --> 00:30:46,040 Speaker 1: they have a ton of different windows popping up on 542 00:30:46,080 --> 00:30:48,280 Speaker 1: their computer, really fast, so you can't make 543 00:30:48,320 --> 00:30:51,600 Speaker 1: out anything that's happening. It's entirely not true. That's not 544 00:30:51,640 --> 00:30:55,040 Speaker 1: how it works. It's actually a somewhat slow process to, 545 00:30:55,120 --> 00:30:59,160 Speaker 1: um, basically, do reconnaissance and to get into 546 00:30:59,320 --> 00:31:01,960 Speaker 1: any kind of network. The only things I've done, 547 00:31:01,960 --> 00:31:05,600 Speaker 1: of course, are completely legal. I've had authorization from 548 00:31:05,600 --> 00:31:11,240 Speaker 1: everybody who I have tested my abilities on. Right. Yeah, 549 00:31:11,280 --> 00:31:16,040 Speaker 1: so, black hats. That's another awkward definition, because it's 550 00:31:16,080 --> 00:31:18,240 Speaker 1: not one that I like to use all the time, 551 00:31:18,280 --> 00:31:22,440 Speaker 1: because black hat hacker... it makes hackers 552 00:31:22,840 --> 00:31:26,040 Speaker 1: have more of a negative appeal to a lot of people. 553 00:31:26,160 --> 00:31:29,160 Speaker 1: So I always just call them black hat thieves. Yeah. Now, 554 00:31:29,240 --> 00:31:32,200 Speaker 1: that's a great way of putting it, because typically you'll 555 00:31:32,240 --> 00:31:35,480 Speaker 1: see things like, um, uh, the idea of infiltrating a 556 00:31:35,520 --> 00:31:38,120 Speaker 1: system in order to steal information, perhaps to sell it 557 00:31:38,160 --> 00:31:41,320 Speaker 1: to someone else, or to hold it against the party 558 00:31:41,320 --> 00:31:44,400 Speaker 1: that you've stolen it from. Um, you know, so it 559 00:31:44,480 --> 00:31:50,080 Speaker 1: might be extortion as opposed to stealing and selling. Also, 560 00:31:50,120 --> 00:31:52,400 Speaker 1: we should go ahead and point out something else that 561 00:31:52,440 --> 00:31:55,280 Speaker 1: I'll talk about in a future episode, but I've 562 00:31:55,320 --> 00:31:58,880 Speaker 1: mentioned it in previous ones too. Hackers don't necessarily just 563 00:31:59,040 --> 00:32:02,120 Speaker 1: sit at a keyboard and type in strings of letters 564 00:32:02,120 --> 00:32:04,520 Speaker 1: and numbers.
They also do, 565 00:32:04,560 --> 00:32:07,280 Speaker 1: or they can do, a lot of social engineering, 566 00:32:07,280 --> 00:32:10,360 Speaker 1: where they attempt to gain access to systems, either by 567 00:32:10,440 --> 00:32:14,520 Speaker 1: physically gaining access to a system, which makes it way 568 00:32:14,560 --> 00:32:18,840 Speaker 1: easier than remotely doing it, or, even easier than that, 569 00:32:18,920 --> 00:32:21,400 Speaker 1: manipulating someone who does have access to a system, and 570 00:32:21,400 --> 00:32:24,760 Speaker 1: then you get it that way. And it's surprisingly easy 571 00:32:24,840 --> 00:32:29,600 Speaker 1: to do if employees have not been educated on how 572 00:32:29,680 --> 00:32:32,440 Speaker 1: to spot that and avoid it. Yeah, properly training your 573 00:32:33,320 --> 00:32:35,959 Speaker 1: employees at your place of work is really important when 574 00:32:36,000 --> 00:32:39,560 Speaker 1: it comes to social engineering. And it is incredibly easy 575 00:32:39,800 --> 00:32:43,800 Speaker 1: to do social engineering, especially when you're a female, I 576 00:32:43,880 --> 00:32:47,040 Speaker 1: would imagine. So it turns out, also, if you are 577 00:32:47,160 --> 00:32:51,520 Speaker 1: dressed as the stereotypical IT guy and you are there to, yeah, 578 00:32:51,640 --> 00:32:55,800 Speaker 1: quote unquote, upgrade someone's machine. Really easy to get access 579 00:32:55,840 --> 00:33:01,520 Speaker 1: to that machine. Yeah, people are so eager. Yeah. And obviously, 580 00:33:01,600 --> 00:33:05,600 Speaker 1: like, social engineering completely depends upon identifying and then exploiting 581 00:33:05,680 --> 00:33:11,760 Speaker 1: a person's vulnerability, and typically speaking, like, greed, lust, those 582 00:33:11,800 --> 00:33:16,080 Speaker 1: are two big ones that are exploitable, and the 583 00:33:16,120 --> 00:33:18,560 Speaker 1: people who are really good at social engineering know that, 584 00:33:18,720 --> 00:33:21,520 Speaker 1: and they're very good at leveraging that. Just as 585 00:33:21,560 --> 00:33:25,719 Speaker 1: knowing what sort of vulnerabilities typically show up within code, 586 00:33:25,760 --> 00:33:29,320 Speaker 1: within programs, you need to know what vulnerabilities show up 587 00:33:29,360 --> 00:33:33,239 Speaker 1: in people. And I also had a little thing 588 00:33:33,240 --> 00:33:36,120 Speaker 1: on here about botnet masters. Really, what I 589 00:33:36,160 --> 00:33:38,160 Speaker 1: was thinking about in this was the people who are using malware to 590 00:33:38,200 --> 00:33:41,160 Speaker 1: get that backdoor access to machines, to get 591 00:33:41,160 --> 00:33:45,959 Speaker 1: that administrative control over a wide array. Sometimes we call 592 00:33:46,000 --> 00:33:48,200 Speaker 1: it a botnet, sometimes we call it a zombie army 593 00:33:48,640 --> 00:33:52,720 Speaker 1: of user computers, and then utilizing that to do stuff 594 00:33:52,760 --> 00:33:57,880 Speaker 1: like distributed denial of service attacks, so, our DDoS 595 00:33:57,920 --> 00:34:03,600 Speaker 1: attacks, where you are directing an army essentially to coordinate 596 00:34:03,640 --> 00:34:08,200 Speaker 1: an attack against an identified target. Sometimes this is done 597 00:34:08,320 --> 00:34:11,480 Speaker 1: just to cause problems. I mean, obviously, if you've ever 598 00:34:12,080 --> 00:34:16,680 Speaker 1: had issues logging into, like, a gaming network.
Xbox Live 599 00:34:16,719 --> 00:34:19,880 Speaker 1: has had this happen, PlayStation has had this happen, where 600 00:34:20,280 --> 00:34:23,560 Speaker 1: people who are disenchanted with the service for one reason 601 00:34:23,680 --> 00:34:26,000 Speaker 1: or another, or they just want to do it for 602 00:34:26,040 --> 00:34:31,000 Speaker 1: the lulz, specifically around holiday times. That's 603 00:34:31,000 --> 00:34:34,040 Speaker 1: a big target time to attack something like Xbox Live. 604 00:34:34,600 --> 00:34:37,520 Speaker 1: They'll direct a ton of traffic to break down servers, 605 00:34:37,560 --> 00:34:40,560 Speaker 1: so servers can't respond to legitimate traffic because they're too 606 00:34:40,560 --> 00:34:45,480 Speaker 1: busy responding to a bunch of fake traffic. Essentially, I'm oversimplifying, 607 00:34:45,480 --> 00:34:47,919 Speaker 1: but this is a basic DDoS attack. It is. It's 608 00:34:47,920 --> 00:34:49,919 Speaker 1: such a mean thing to do to those little kids 609 00:34:49,960 --> 00:34:52,680 Speaker 1: during Christmas time, just turn off their Xboxes so that 610 00:34:52,719 --> 00:34:54,760 Speaker 1: they can't log in and they can't play their games, 611 00:34:54,760 --> 00:34:58,160 Speaker 1: so they just go on... Yeah. Yeah, I think it'd break 612 00:34:58,239 --> 00:35:01,359 Speaker 1: my heart. Gosh, it's a mean move. It's a jerk move. 613 00:35:01,480 --> 00:35:04,680 Speaker 1: Don't do it. Yeah. I love the definition, or I 614 00:35:04,760 --> 00:35:09,399 Speaker 1: love the term zombie for botnets, because that's exactly what 615 00:35:09,440 --> 00:35:11,920 Speaker 1: it is. Where you have a zero, 616 00:35:12,200 --> 00:35:14,680 Speaker 1: a patient zero, and that would be the first computer. 617 00:35:15,440 --> 00:35:18,319 Speaker 1: They end up biting a few more computers, and those 618 00:35:18,360 --> 00:35:21,320 Speaker 1: ones end up getting infected with the same exact infection 619 00:35:21,360 --> 00:35:24,600 Speaker 1: that patient zero had, and then those ones end up 620 00:35:24,800 --> 00:35:27,640 Speaker 1: biting ten each. So you end up with thousands upon 621 00:35:27,719 --> 00:35:31,160 Speaker 1: thousands of these computers that each have the same exact infection, 622 00:35:31,600 --> 00:35:35,480 Speaker 1: and they all end up perpetrating the same exact attack 623 00:35:35,640 --> 00:35:39,480 Speaker 1: on whatever their target might be. Yeah, and then ultimately 624 00:35:39,960 --> 00:35:42,960 Speaker 1: you end up with a situation where Negan is standing 625 00:35:43,000 --> 00:35:45,080 Speaker 1: there with a baseball bat and you don't know whose 626 00:35:45,080 --> 00:35:49,399 Speaker 1: head he's gonna cave in. I might have taken that 627 00:35:49,600 --> 00:35:52,719 Speaker 1: metaphor a little too far. But one of the things 628 00:35:52,719 --> 00:35:56,080 Speaker 1: that botnet controllers might do, and in fact, this has 629 00:35:56,120 --> 00:35:59,759 Speaker 1: happened on multiple occasions. It's similar to ransomware: they'll 630 00:35:59,760 --> 00:36:03,440 Speaker 1: send a message to an identified target and say, hey, 631 00:36:04,200 --> 00:36:06,799 Speaker 1: we got your number. We're going to come after you.
632 00:36:07,040 --> 00:36:10,200 Speaker 1: Unless you pay us a certain amount of money, we 633 00:36:10,239 --> 00:36:15,040 Speaker 1: will unleash the dogs of war on your servers and 634 00:36:15,160 --> 00:36:17,880 Speaker 1: you will be unable to do business. And there have 635 00:36:17,880 --> 00:36:21,200 Speaker 1: been cases where businesses have folded to this kind of pressure, 636 00:36:21,239 --> 00:36:24,240 Speaker 1: where they have in fact paid up to stop this. Like 637 00:36:24,760 --> 00:36:30,000 Speaker 1: the hospital? Yes, yes, it was. Yeah, I've seen a 638 00:36:30,000 --> 00:36:38,000 Speaker 1: few cases of particularly malicious and odious acts against things 639 00:36:38,000 --> 00:36:40,960 Speaker 1: like hospitals. There was one year when I was participating 640 00:36:41,760 --> 00:36:45,719 Speaker 1: in a charity for children's hospitals and the charity was 641 00:36:46,000 --> 00:36:50,800 Speaker 1: targeted in the middle of the event, and for about 642 00:36:50,920 --> 00:36:55,440 Speaker 1: three hours they were offline trying to deal with that. Yeah, 643 00:36:55,560 --> 00:36:57,920 Speaker 1: and in that case it wasn't an 644 00:36:57,960 --> 00:37:00,799 Speaker 1: attack in an effort to get money, I don't think. 645 00:37:00,840 --> 00:37:03,280 Speaker 1: I think it was just someone being truly an awful 646 00:37:03,480 --> 00:37:06,879 Speaker 1: human being. But we have seen cases of people trying 647 00:37:06,880 --> 00:37:09,560 Speaker 1: to do this in order to extort money. So you're 648 00:37:09,600 --> 00:37:14,720 Speaker 1: probably noticing some trends here: extortion, stealing, you know, holding 649 00:37:14,719 --> 00:37:18,040 Speaker 1: things for ransom, this idea of making sure that 650 00:37:18,880 --> 00:37:23,040 Speaker 1: people are spending money out of fear or out of 651 00:37:23,080 --> 00:37:26,480 Speaker 1: a need to get back and have access to something 652 00:37:26,520 --> 00:37:31,480 Speaker 1: that belongs to them. These are all terrible, terrible motivations 653 00:37:31,480 --> 00:37:35,840 Speaker 1: to make money, and with such terrible motivations, 654 00:37:35,880 --> 00:37:37,759 Speaker 1: you might think, well, wait a minute, how are they 655 00:37:37,800 --> 00:37:40,000 Speaker 1: actually, like, how are they getting paid? How is this 656 00:37:40,040 --> 00:37:45,000 Speaker 1: money transfer happening? Because you would think anything that would 657 00:37:45,040 --> 00:37:49,600 Speaker 1: be traceable would end up being somewhat problematic. If you've got 658 00:37:49,600 --> 00:37:51,800 Speaker 1: a trail that leads back to you as a person, 659 00:37:51,960 --> 00:37:55,480 Speaker 1: then pretty soon law enforcement's going to get involved, or 660 00:37:55,520 --> 00:38:01,960 Speaker 1: at least the IRS. So how, Shannon, do hackers, how 661 00:38:01,960 --> 00:38:04,799 Speaker 1: do they get the money? So there's probably some ways 662 00:38:04,800 --> 00:38:07,720 Speaker 1: that I don't even know about yet, but the ones 663 00:38:07,760 --> 00:38:10,560 Speaker 1: that I can think of would be trading of high 664 00:38:10,640 --> 00:38:14,240 Speaker 1: value data.
So that's a pretty big one, where, say, 665 00:38:14,280 --> 00:38:16,759 Speaker 1: a hacker collects a whole bunch of really, really high 666 00:38:16,840 --> 00:38:20,719 Speaker 1: value data like your Social Security number, your credit card accounts, 667 00:38:20,800 --> 00:38:24,440 Speaker 1: your banking account, tons of information, and they decide to 668 00:38:24,480 --> 00:38:26,600 Speaker 1: go onto a deep web forum and sell it, 669 00:38:26,640 --> 00:38:29,880 Speaker 1: or trade it for something else of high value, 670 00:38:29,920 --> 00:38:33,680 Speaker 1: for example, a gift card. They could ask for people 671 00:38:33,719 --> 00:38:35,759 Speaker 1: to give them a ton of gift cards that are, 672 00:38:35,800 --> 00:38:38,759 Speaker 1: like, you know, twenty five or fifty dollars each, and then 673 00:38:39,600 --> 00:38:42,920 Speaker 1: use those gift cards at a retailer who is easily 674 00:38:43,320 --> 00:38:46,319 Speaker 1: vulnerable to some kind of gift card scam, and in 675 00:38:46,320 --> 00:38:48,439 Speaker 1: that sense they would be able to make some kind 676 00:38:48,440 --> 00:38:50,720 Speaker 1: of money back through those gift cards and that trade 677 00:38:50,719 --> 00:38:55,160 Speaker 1: of that high value data that they stole from whoever 678 00:38:55,200 --> 00:38:58,799 Speaker 1: it might be, whatever company. Another way would be bitcoins. 679 00:38:59,320 --> 00:39:01,560 Speaker 1: Now that's probably the most obvious one, of course, because 680 00:39:01,600 --> 00:39:04,600 Speaker 1: bitcoins are very, very hard to track. Yes, they are 681 00:39:04,719 --> 00:39:07,800 Speaker 1: traceable in some circumstances, depending on what kind of wallet 682 00:39:07,840 --> 00:39:11,520 Speaker 1: you use, but in a lot of circumstances, the bitcoins 683 00:39:11,560 --> 00:39:15,080 Speaker 1: will trade wallets so many times that it'll be nearly 684 00:39:15,080 --> 00:39:17,560 Speaker 1: impossible to find out where it actually came from, where 685 00:39:17,560 --> 00:39:20,680 Speaker 1: it actually started. Yeah, it's kind of interesting because every 686 00:39:20,719 --> 00:39:24,239 Speaker 1: single bitcoin contains within it a record of every transaction, 687 00:39:24,280 --> 00:39:26,600 Speaker 1: but that does not mean that the parties involved are 688 00:39:26,600 --> 00:39:31,520 Speaker 1: actually identifiable. Yeah, exactly. It really is. It's actually data 689 00:39:31,600 --> 00:39:34,440 Speaker 1: that's used in order to allow for the mining of 690 00:39:34,480 --> 00:39:38,440 Speaker 1: further bitcoins. It's a really fascinating process. But one of 691 00:39:38,480 --> 00:39:41,280 Speaker 1: the things that attracts people to bitcoins is this idea 692 00:39:41,360 --> 00:39:45,360 Speaker 1: of being able to spend them anonymously and be able 693 00:39:45,360 --> 00:39:50,600 Speaker 1: to purchase things, whether legal or illegal, without it being 694 00:39:50,719 --> 00:39:53,640 Speaker 1: traced back to that person.
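A toy sketch of the point just made about the transaction record: in a Bitcoin-style ledger, every transfer between addresses is publicly recorded, but an address is only a pseudonymous identifier. This is not real Bitcoin code; the key strings are hypothetical and the address derivation is simplified to a bare hash.

```python
# Toy ledger sketch: the full history of transfers is public, yet
# no entry names a person, only hash-derived addresses.
import hashlib

def address(pubkey: str) -> str:
    # Real Bitcoin derives addresses through a longer pipeline; a
    # truncated SHA-256 stands in here just to show that an address
    # is an opaque identifier, not an identity.
    return hashlib.sha256(pubkey.encode()).hexdigest()[:16]

ledger: list[dict] = []  # append-only, publicly readable

def transfer(sender_pub: str, receiver_pub: str, amount: float) -> None:
    ledger.append({"from": address(sender_pub),
                   "to": address(receiver_pub),
                   "amount": amount})

transfer("alice-key", "bob-key", 1.5)
transfer("bob-key", "carol-key", 1.5)    # the coins hop wallets...
transfer("carol-key", "dave-key", 1.5)   # ...and every hop is on record

for tx in ledger:
    print(tx)  # visible history, but nothing here says "Alice"
```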
You often will hear about 695 00:39:53,760 --> 00:39:56,480 Speaker 1: things like, you know, the old Silk Road, where you 696 00:39:56,520 --> 00:40:00,440 Speaker 1: could purchase all sorts of stuff, including illegal drugs or 697 00:40:00,520 --> 00:40:04,719 Speaker 1: other materials, sometimes weapons, that kind of stuff, and you 698 00:40:04,719 --> 00:40:07,680 Speaker 1: could do it through bitcoins, and people felt a high 699 00:40:07,760 --> 00:40:11,680 Speaker 1: level of confidence because it was not a state-backed currency. 700 00:40:12,360 --> 00:40:17,640 Speaker 1: It was this independent cryptocurrency that allowed them that freedom 701 00:40:17,680 --> 00:40:22,000 Speaker 1: and had real value because people wanted the bitcoins. If 702 00:40:22,040 --> 00:40:25,920 Speaker 1: no one wanted the bitcoins, they wouldn't be worth anything. Right, 703 00:40:26,000 --> 00:40:29,440 Speaker 1: and bitcoins have actually been pretty steady last time I checked, 704 00:40:29,520 --> 00:40:34,400 Speaker 1: so their value has been pretty decent of late, 705 00:40:34,520 --> 00:40:38,399 Speaker 1: in recent days. So I completely understand why a hacker would 706 00:40:38,440 --> 00:40:42,120 Speaker 1: want to be paid in bitcoins. It makes sense. Yeah. Yeah, 707 00:40:42,160 --> 00:40:45,160 Speaker 1: there's also the old deal of putting the 708 00:40:45,200 --> 00:40:49,279 Speaker 1: money into the washing machine. Right, that's how money laundering works, right? Yes, 709 00:40:49,760 --> 00:40:52,359 Speaker 1: money laundering. So that was something that I learned about 710 00:40:52,400 --> 00:40:53,880 Speaker 1: way back in the day when I worked at a 711 00:40:53,920 --> 00:40:56,920 Speaker 1: bank, of all places, which also got me really interested 712 00:40:56,960 --> 00:41:00,680 Speaker 1: in security before I started podcasting. But money laundering, it's 713 00:41:00,760 --> 00:41:03,120 Speaker 1: very easy for somebody to go online, be able to 714 00:41:03,160 --> 00:41:06,960 Speaker 1: sell this high value data, get some bitcoins, or it 715 00:41:07,040 --> 00:41:10,880 Speaker 1: might be some other form of currency, and then be 716 00:41:10,920 --> 00:41:15,200 Speaker 1: able to resell that money or be able to trade 717 00:41:15,200 --> 00:41:17,759 Speaker 1: a product to get real money, real cash, at one 718 00:41:17,760 --> 00:41:22,680 Speaker 1: point or another. But basically it's, um, exchanging the 719 00:41:22,800 --> 00:41:26,160 Speaker 1: hands that hold that money so many times that again 720 00:41:26,200 --> 00:41:29,279 Speaker 1: it's very hard to trace. Yeah, and it's hard 721 00:41:29,320 --> 00:41:32,600 Speaker 1: to determine that the original source of that money was 722 00:41:32,680 --> 00:41:37,239 Speaker 1: anything remotely illegal. And then again, depending, if you're 723 00:41:37,320 --> 00:41:39,960 Speaker 1: a state-sponsored hacker, you're probably just drawing 724 00:41:40,000 --> 00:41:44,080 Speaker 1: a salary or doing contract work, so you're actually getting paid. 725 00:41:44,239 --> 00:41:50,160 Speaker 1: You get a paycheck. Yeah, yeah, so you've got 726 00:41:50,160 --> 00:41:53,440 Speaker 1: money withdrawn from your paycheck to support the 727 00:41:53,480 --> 00:41:57,120 Speaker 1: government while you are subverting other governments, and then it 728 00:41:57,160 --> 00:41:59,719 Speaker 1: looks completely legitimate.
So that's a really easy way for 729 00:42:00,000 --> 00:42:06,120 Speaker 1: somebody to do something that might be very, very bad. Yeah, 730 00:42:06,160 --> 00:42:09,040 Speaker 1: because they do have to pay the IRS. 731 00:42:09,120 --> 00:42:11,239 Speaker 1: They do get a tax refund every year, they do 732 00:42:11,360 --> 00:42:14,680 Speaker 1: have an employer, so it looks completely normal for them 733 00:42:14,680 --> 00:42:19,520 Speaker 1: to be receiving a paycheck for whatever work this might be. Yeah. 734 00:42:19,640 --> 00:42:24,880 Speaker 1: So the nice thing is there aren't just quote unquote 735 00:42:24,920 --> 00:42:28,200 Speaker 1: bad guys out there doing all this kind of work 736 00:42:28,200 --> 00:42:32,879 Speaker 1: with computers, with hacking, with discovering vulnerabilities. There are 737 00:42:32,880 --> 00:42:35,640 Speaker 1: plenty of people, as you mentioned earlier, Shannon, who 738 00:42:35,680 --> 00:42:39,480 Speaker 1: are doing this in order to help others, either to 739 00:42:39,600 --> 00:42:43,760 Speaker 1: make systems more secure or to inform people of how 740 00:42:43,800 --> 00:42:46,479 Speaker 1: these kinds of attacks happen so that they can be 741 00:42:46,480 --> 00:42:49,640 Speaker 1: better prepared to defend themselves. So let's talk about some 742 00:42:49,719 --> 00:42:54,160 Speaker 1: of them. Of course, if you have black hat hackers, right, 743 00:42:54,200 --> 00:42:56,560 Speaker 1: you've got the bad guys, you've gotta 744 00:42:56,600 --> 00:43:01,000 Speaker 1: have the white hat hackers. These 745 00:43:01,040 --> 00:43:07,200 Speaker 1: are the noble bounty hunter characters of those westerns, the 746 00:43:07,200 --> 00:43:11,200 Speaker 1: ones who, you know, they've seen things, but deep down 747 00:43:11,280 --> 00:43:14,960 Speaker 1: they have a heart of gold. Well, not all of them, 748 00:43:15,040 --> 00:43:20,640 Speaker 1: but a lot of my friends are considered white hat hackers. 749 00:43:20,719 --> 00:43:24,280 Speaker 1: They're the people who either work for a company 750 00:43:24,320 --> 00:43:28,200 Speaker 1: that specializes in security. So a lot of my friends work 751 00:43:28,239 --> 00:43:31,760 Speaker 1: for these companies who will be contracted with big brands, 752 00:43:32,360 --> 00:43:34,719 Speaker 1: go into their networks and then find out what the 753 00:43:34,800 --> 00:43:37,839 Speaker 1: vulnerabilities are and fix them, or they will give them 754 00:43:37,840 --> 00:43:39,680 Speaker 1: a report and tell them how to fix 755 00:43:39,719 --> 00:43:42,240 Speaker 1: it in the future. They make a lot of money. 756 00:43:42,520 --> 00:43:44,520 Speaker 1: A lot of them don't like it because they have 757 00:43:45,080 --> 00:43:49,799 Speaker 1: specific amounts of vulnerabilities or specific timeframes set in which they 758 00:43:49,840 --> 00:43:52,040 Speaker 1: have to get this work done, and a lot of 759 00:43:52,040 --> 00:43:54,359 Speaker 1: times hacking takes a lot of time. It takes a 760 00:43:54,360 --> 00:43:58,760 Speaker 1: lot of information reconnaissance. So a lot of my friends 761 00:43:58,760 --> 00:44:02,280 Speaker 1: don't necessarily appreciate having to be under these time constraints 762 00:44:02,360 --> 00:44:05,560 Speaker 1: with these big brands. Well, particularly since you figure the 763 00:44:05,560 --> 00:44:09,200 Speaker 1: bad guys aren't under any particular time constraints. Exactly.
So 764 00:44:09,440 --> 00:44:12,920 Speaker 1: the bad guys have tons of time to find these vulnerabilities, 765 00:44:12,920 --> 00:44:14,839 Speaker 1: while the white hats are under the stress of these 766 00:44:14,840 --> 00:44:17,000 Speaker 1: time constraints to get the work done so that they 767 00:44:17,000 --> 00:44:20,040 Speaker 1: make their bosses happy. In this sense, a lot of 768 00:44:20,840 --> 00:44:22,600 Speaker 1: people that I know have created their 769 00:44:22,640 --> 00:44:26,480 Speaker 1: own security companies because of this fault in the generic 770 00:44:26,560 --> 00:44:30,360 Speaker 1: nature of these security companies. So they said, you know, 771 00:44:30,400 --> 00:44:33,480 Speaker 1: I'm tired of having to deal with these constraints that 772 00:44:33,520 --> 00:44:35,520 Speaker 1: my boss has given me. I'm just going to open my 773 00:44:35,640 --> 00:44:37,799 Speaker 1: own security company, and we're going to do it even 774 00:44:37,840 --> 00:44:40,440 Speaker 1: better because we won't give ourselves those time constraints. We'll 775 00:44:40,480 --> 00:44:44,040 Speaker 1: give ourselves several months to find all the vulnerabilities 776 00:44:44,080 --> 00:44:46,720 Speaker 1: that we absolutely can, and then we'll write a report 777 00:44:46,760 --> 00:44:49,279 Speaker 1: and we'll fix it. And those are the ones that 778 00:44:49,400 --> 00:44:51,960 Speaker 1: I would definitely work with if I had to hire 779 00:44:52,000 --> 00:44:55,200 Speaker 1: a security company. Yeah, because they're the ones who are 780 00:44:55,239 --> 00:44:58,799 Speaker 1: going to use the exact same kind of methodologies, right, 781 00:44:59,040 --> 00:45:01,480 Speaker 1: that the bad guys are going to use. And if 782 00:45:01,520 --> 00:45:05,080 Speaker 1: you want to really be secure, you want the people 783 00:45:05,120 --> 00:45:08,239 Speaker 1: to throw everything they can at your system so that 784 00:45:08,400 --> 00:45:11,239 Speaker 1: you can find out, are you actually secure? If you're not, 785 00:45:11,360 --> 00:45:14,120 Speaker 1: what do you need to do to address it? If 786 00:45:14,120 --> 00:45:17,000 Speaker 1: you want to see a movie that does a very 787 00:45:17,520 --> 00:45:21,600 Speaker 1: fantasy version of this very idea, there's a nineteen ninety 788 00:45:21,600 --> 00:45:26,080 Speaker 1: two film that I always think back to: Sneakers. It had 789 00:45:26,160 --> 00:45:29,200 Speaker 1: Robert Redford and Dan Aykroyd, who plays a character named Mother. 790 00:45:30,520 --> 00:45:34,080 Speaker 1: Ben Kingsley is in it, a ton of folks, River 791 00:45:34,160 --> 00:45:37,960 Speaker 1: Phoenix was in it, and it's a movie 792 00:45:38,000 --> 00:45:42,799 Speaker 1: about a group of kind of almost like outcasts who 793 00:45:42,960 --> 00:45:46,080 Speaker 1: have grouped together to form a company where they specifically 794 00:45:46,120 --> 00:45:49,439 Speaker 1: do this. They try to infiltrate a company in order 795 00:45:49,480 --> 00:45:54,200 Speaker 1: to test its security, not to exploit it, but rather 796 00:45:54,320 --> 00:45:57,560 Speaker 1: to tell the company, hey, here's how we got in, 797 00:45:57,640 --> 00:45:59,759 Speaker 1: here's how someone else could get in, so you need 798 00:45:59,800 --> 00:46:02,879 Speaker 1: to plug this vulnerability, that kind of thing. And then 799 00:46:02,880 --> 00:46:05,359 Speaker 1: of course they get involved in all sorts of shenanigans.
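As one concrete example of throwing the same techniques at a system that an attacker would, here is a minimal sketch of a recon step a contracted tester might script: checking which common TCP ports answer on a host there is written authorization to probe. The host and port list are placeholder assumptions, not anything from the episode.

```python
# Minimal authorized-recon sketch: try a TCP connection to a handful
# of well-known ports and report which ones answer. Only ever point
# this at machines you have explicit permission to test.
import socket

COMMON_PORTS = [22, 80, 443, 3389, 8080]  # SSH, HTTP, HTTPS, RDP, alt-HTTP

def scan(host: str, ports: list[int] = COMMON_PORTS, timeout: float = 0.5) -> list[int]:
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 when the connection is accepted
            if s.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

if __name__ == "__main__":
    print(scan("127.0.0.1"))  # placeholder target: your own machine
```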
800 00:46:05,719 --> 00:46:08,520 Speaker 1: And in case you are interested in the methodology, I 801 00:46:08,560 --> 00:46:11,920 Speaker 1: actually find it very, very interesting how they get their 802 00:46:11,960 --> 00:46:15,120 Speaker 1: work done, because of course they have to go through 803 00:46:15,239 --> 00:46:18,120 Speaker 1: the tennis match of back and forth with a brand 804 00:46:18,160 --> 00:46:21,680 Speaker 1: name company, whatever it might be. So they'll have to 805 00:46:21,719 --> 00:46:24,239 Speaker 1: get a purchase order, they'll do a little bit of 806 00:46:24,280 --> 00:46:26,640 Speaker 1: negotiation for an amount that they'll do the work for, 807 00:46:27,120 --> 00:46:29,480 Speaker 1: and then they'll go in and they'll gather information on 808 00:46:29,520 --> 00:46:33,040 Speaker 1: the network, and they'll capture traffic, and they'll try to 809 00:46:33,080 --> 00:46:35,760 Speaker 1: find any kind of vulnerabilities that are on that network, 810 00:46:36,400 --> 00:46:39,319 Speaker 1: even with the people too. For example, they could use 811 00:46:39,360 --> 00:46:44,120 Speaker 1: social engineering to get into the server rack physically, or 812 00:46:44,160 --> 00:46:47,520 Speaker 1: they could get into a network that doesn't necessarily have 813 00:46:47,600 --> 00:46:51,440 Speaker 1: a very good password on it. They could email clients 814 00:46:51,480 --> 00:46:54,000 Speaker 1: that work there, that are employed at the brand name 815 00:46:54,040 --> 00:46:58,200 Speaker 1: company, with, I don't know, malware-ridden PDFs for example, 816 00:46:59,000 --> 00:47:01,040 Speaker 1: and they can use wireless attacks. They could do 817 00:47:01,120 --> 00:47:03,480 Speaker 1: war driving from the parking lot if they wanted to. 818 00:47:04,000 --> 00:47:06,879 Speaker 1: And then what they'll do is write a very, very 819 00:47:06,880 --> 00:47:10,040 Speaker 1: long report so that the brand name company can see 820 00:47:10,120 --> 00:47:12,759 Speaker 1: exactly what happens on their network and exactly what they 821 00:47:12,760 --> 00:47:16,560 Speaker 1: were able to do through whatever back door they were 822 00:47:16,560 --> 00:47:20,520 Speaker 1: able to get in. It's really interesting how well they're 823 00:47:20,520 --> 00:47:24,960 Speaker 1: able to put everything together and in turn hopefully save this 824 00:47:25,000 --> 00:47:27,720 Speaker 1: company, in the long run, thousands and thousands of dollars. 825 00:47:28,920 --> 00:47:30,759 Speaker 1: We will be back to talk a little bit more 826 00:47:30,800 --> 00:47:34,719 Speaker 1: about hacking for that cold hard cash after we take 827 00:47:34,880 --> 00:47:48,920 Speaker 1: another quick break. Security has always been a tick-tock approach, right? 828 00:47:48,960 --> 00:47:51,360 Speaker 1: You've got the tick, which is where someone has identified 829 00:47:51,440 --> 00:47:54,400 Speaker 1: a way of exploiting a system, and then the tock 830 00:47:54,600 --> 00:47:58,359 Speaker 1: is where you find a way to correct that vulnerability. 831 00:47:58,760 --> 00:48:01,319 Speaker 1: The tick is the next time someone's found a vulnerability. 832 00:48:02,000 --> 00:48:05,200 Speaker 1: You're always going to have that, right? Unless someone somehow 833 00:48:05,680 --> 00:48:09,359 Speaker 1: designs the absolute perfect system, which, as far as we know, 834 00:48:09,640 --> 00:48:13,759 Speaker 1: is an impossibility. Yeah.
Yeah, because for one thing, if 835 00:48:13,800 --> 00:48:17,080 Speaker 1: people are involved, there's no such thing as a perfect system. Yeah, 836 00:48:17,320 --> 00:48:19,879 Speaker 1: it's always a battle, and I love my video games, 837 00:48:19,920 --> 00:48:24,240 Speaker 1: so I love a battle. But yeah, it also drives 838 00:48:24,280 --> 00:48:27,200 Speaker 1: other industries though, because we'll see things like the 839 00:48:27,400 --> 00:48:31,960 Speaker 1: artificial intelligence industry improve as a result of this security 840 00:48:32,000 --> 00:48:36,160 Speaker 1: battle between hackers and the infosec experts who are trying 841 00:48:36,200 --> 00:48:39,960 Speaker 1: to make sure they're protecting systems. And as a result, 842 00:48:40,040 --> 00:48:42,480 Speaker 1: we're getting information that can be used in other areas, 843 00:48:43,120 --> 00:48:46,719 Speaker 1: which is phenomenal. Like, I remember, here's a simple one. 844 00:48:46,760 --> 00:48:49,680 Speaker 1: As far as security goes, this is as low 845 00:48:49,800 --> 00:48:53,080 Speaker 1: level as it gets: the CAPTCHA system. So when 846 00:48:53,200 --> 00:48:57,400 Speaker 1: CAPTCHA was implemented, even the people who were writing CAPTCHAs 847 00:48:57,480 --> 00:48:59,799 Speaker 1: at the time weren't really thinking of it as 848 00:49:00,000 --> 00:49:03,279 Speaker 1: being some sort of foolproof security system to make 849 00:49:03,320 --> 00:49:06,400 Speaker 1: sure that bots don't get into a system, right. They 850 00:49:06,440 --> 00:49:10,400 Speaker 1: weren't thinking, oh, now only human beings can get access. 851 00:49:10,520 --> 00:49:12,280 Speaker 1: And if you don't know what a CAPTCHA is: anytime 852 00:49:12,360 --> 00:49:14,160 Speaker 1: you're filling out a thing and you get 853 00:49:14,200 --> 00:49:18,120 Speaker 1: a little picture of something, and it tells you to 854 00:49:18,280 --> 00:49:20,600 Speaker 1: write down the word or numbers that are in this picture, 855 00:49:20,719 --> 00:49:23,719 Speaker 1: or even to the point of identifying the pictures in 856 00:49:23,800 --> 00:49:27,279 Speaker 1: this sequence that have this particular feature, like identify all 857 00:49:27,320 --> 00:49:29,080 Speaker 1: the pictures that have a lake in them or something 858 00:49:29,160 --> 00:49:32,320 Speaker 1: like that. That's simply a version of CAPTCHA. 859 00:49:33,600 --> 00:49:35,560 Speaker 1: The people who made it, they actually said, our goal 860 00:49:35,800 --> 00:49:38,640 Speaker 1: was really to help push artificial intelligence, because we created 861 00:49:38,680 --> 00:49:42,839 Speaker 1: a system where programmers or hackers had to start coming 862 00:49:42,960 --> 00:49:47,240 Speaker 1: up with computer programs that could identify the same things 863 00:49:47,320 --> 00:49:50,759 Speaker 1: that we humans can identify. And in turn, that means 864 00:49:50,880 --> 00:49:55,200 Speaker 1: now we've got software that pushes forward artificial intelligence. Now, granted, 865 00:49:55,239 --> 00:49:58,080 Speaker 1: that also means you have to improve the system you 866 00:49:58,160 --> 00:50:00,399 Speaker 1: had designed to keep bots out in the first place.
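A minimal sketch of the transcription-style CAPTCHA just described, showing only the challenge-and-answer bookkeeping; rendering the text as a distorted image, the part that actually stumps naive bots, is left out, and all names here are illustrative.

```python
# Toy CAPTCHA sketch: the server remembers the expected string and
# the user must type back what the (distorted) image showed.
import random
import string

def make_challenge(length: int = 6) -> str:
    # In a real system this text would be drawn into a noisy image.
    return "".join(random.choices(string.ascii_uppercase + string.digits, k=length))

def check(expected: str, user_input: str) -> bool:
    # Case-insensitive comparison, since humans shouldn't be failed
    # over capitalization.
    return user_input.strip().upper() == expected

challenge = make_challenge()
print(f"(imagine this rendered as a distorted image: {challenge})")
print(check(challenge, challenge.lower()))  # a human who read it passes
print(check(challenge, "GUESS1"))           # a blind guess almost surely fails
```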
867 00:50:00,680 --> 00:50:03,040 Speaker 1: So again it goes to that tick-tock. But there's an 868 00:50:03,080 --> 00:50:07,680 Speaker 1: added benefit beyond someone being able to automatically access 869 00:50:07,760 --> 00:50:10,919 Speaker 1: systems and build, you know, dozens and dozens of fake 870 00:50:11,040 --> 00:50:14,120 Speaker 1: profiles on Facebook, or whatever that 871 00:50:14,239 --> 00:50:17,279 Speaker 1: might be. Yeah, yeah. And keep in mind, like 872 00:50:17,480 --> 00:50:20,280 Speaker 1: we've been saying here, I mean, any system's 873 00:50:20,400 --> 00:50:23,360 Speaker 1: security is only as strong as its weakest link. That 874 00:50:23,480 --> 00:50:27,960 Speaker 1: weakest link is pretty much always people. That's the big one, right. 875 00:50:28,120 --> 00:50:30,720 Speaker 1: But I mean, I've read stories about a hacker 876 00:50:31,120 --> 00:50:34,760 Speaker 1: gaining access to a system because there was an overall 877 00:50:34,840 --> 00:50:38,280 Speaker 1: security system that was really robust for the main company, 878 00:50:38,840 --> 00:50:41,360 Speaker 1: but then they had a little branch office, and the 879 00:50:41,440 --> 00:50:44,200 Speaker 1: branch office didn't have that crazy amount of security but 880 00:50:44,719 --> 00:50:47,719 Speaker 1: was still on the same network. I think I read 881 00:50:47,760 --> 00:50:50,480 Speaker 1: about that story too. So I mean, these 882 00:50:50,520 --> 00:50:53,120 Speaker 1: are things like, if you identify a potential point of 883 00:50:53,200 --> 00:50:57,520 Speaker 1: weakness, that's now suddenly the target. You know, it's like 884 00:50:57,680 --> 00:51:00,759 Speaker 1: a bank vault. If the bank vault has an enormous door 885 00:51:00,840 --> 00:51:03,719 Speaker 1: with huge locks on it that you have to get through, oh, 886 00:51:03,800 --> 00:51:06,279 Speaker 1: but it also has a back door just for convenience's sake, 887 00:51:06,520 --> 00:51:10,440 Speaker 1: you're going to aim for the back door. But there are 888 00:51:10,480 --> 00:51:14,719 Speaker 1: other ways that hackers can make a legitimate living that 889 00:51:14,880 --> 00:51:19,399 Speaker 1: don't even involve testing security systems. It might involve education. Yeah, 890 00:51:19,560 --> 00:51:23,080 Speaker 1: absolutely. So education is, I guess you would say, 891 00:51:23,160 --> 00:51:25,920 Speaker 1: the category I fall into. And while I 892 00:51:26,239 --> 00:51:28,920 Speaker 1: don't necessarily like to call myself a hacker, because I 893 00:51:29,160 --> 00:51:31,399 Speaker 1: know so many experts in the field who are much 894 00:51:31,440 --> 00:51:35,400 Speaker 1: more knowledgeable than I am, I'm quite intermediate, I 895 00:51:35,480 --> 00:51:39,319 Speaker 1: would say. But I love to teach, and I love 896 00:51:39,360 --> 00:51:42,800 Speaker 1: to give tutorials online, so I give tutorials on YouTube.
897 00:51:43,360 --> 00:51:45,239 Speaker 1: But I also know a lot of people who have 898 00:51:45,520 --> 00:51:49,479 Speaker 1: either written books about hacking, and they could do either 899 00:51:49,640 --> 00:51:53,000 Speaker 1: specifics about penetration testing, or they get to make it 900 00:51:53,080 --> 00:51:57,000 Speaker 1: a very very wide based book where they explain everything 901 00:51:57,120 --> 00:51:59,080 Speaker 1: that you would have to do as a penetration tester, 902 00:51:59,239 --> 00:52:01,560 Speaker 1: and a penetration tester is basically one of those guys 903 00:52:01,640 --> 00:52:04,360 Speaker 1: that would go into a company and find all the 904 00:52:04,400 --> 00:52:08,800 Speaker 1: vulnerabilities and report on it. You would also have companies 905 00:52:08,880 --> 00:52:12,800 Speaker 1: that administer certifications, so a lot of I'm sure a 906 00:52:12,880 --> 00:52:16,800 Speaker 1: lot of your listeners probably know that you have to 907 00:52:16,880 --> 00:52:21,000 Speaker 1: get certifications to get a lot to get into a 908 00:52:21,080 --> 00:52:24,239 Speaker 1: lot of the fields with computer security and even just 909 00:52:24,560 --> 00:52:27,880 Speaker 1: you know, computer networking too. There's a lot of searts 910 00:52:27,920 --> 00:52:30,120 Speaker 1: for those and they're very, very expensive. So a lot 911 00:52:30,160 --> 00:52:34,200 Speaker 1: of companies just administer their certifications or they'll will have 912 00:52:34,360 --> 00:52:36,680 Speaker 1: you take classes for a period of time until you 913 00:52:36,719 --> 00:52:39,839 Speaker 1: actually take the test and get certified. But that ends 914 00:52:39,880 --> 00:52:41,359 Speaker 1: up being a really good thing to put on your 915 00:52:41,400 --> 00:52:44,640 Speaker 1: resume for a lot of companies whenever you do intend 916 00:52:44,719 --> 00:52:49,320 Speaker 1: to get a job in network security. And then lastly, 917 00:52:49,400 --> 00:52:53,000 Speaker 1: we have the publishers. So that's the YouTubers, that's the 918 00:52:53,080 --> 00:52:57,160 Speaker 1: people that made podcasts, That's the people that might be 919 00:52:58,040 --> 00:53:01,840 Speaker 1: creating other forms of entertaining mint that not only educate 920 00:53:01,920 --> 00:53:04,960 Speaker 1: but also entertain their users and their listeners so that 921 00:53:05,120 --> 00:53:09,239 Speaker 1: they get excited about being a part of information security. 922 00:53:09,920 --> 00:53:11,759 Speaker 1: And that's what I like to do. I like to 923 00:53:12,160 --> 00:53:14,480 Speaker 1: teach people in a way that makes it exciting. So 924 00:53:14,600 --> 00:53:16,640 Speaker 1: I do a lot of hands on stuff. I make, 925 00:53:16,800 --> 00:53:19,080 Speaker 1: I make jokes, and I explain things in a very 926 00:53:19,200 --> 00:53:24,640 Speaker 1: natural light and it helps. It helps, again foster that 927 00:53:24,800 --> 00:53:28,120 Speaker 1: desire to learn how things work. Yeah, right, that does 928 00:53:28,880 --> 00:53:31,359 Speaker 1: so again that that same fascination, Like if you were 929 00:53:31,400 --> 00:53:33,880 Speaker 1: ever a kid that took apart a watch or a 930 00:53:34,040 --> 00:53:37,120 Speaker 1: radio or some other piece of equipment, because you really 931 00:53:37,160 --> 00:53:40,279 Speaker 1: want to know what's the magic that makes this thing 932 00:53:40,480 --> 00:53:44,680 Speaker 1: do what it does. 
Hackers have that. I mean, that's 933 00:53:44,760 --> 00:53:47,960 Speaker 1: the defining quality in my mind of 934 00:53:48,000 --> 00:53:51,440 Speaker 1: a hacker: ultimately it's someone who is fascinated with 935 00:53:51,560 --> 00:53:55,759 Speaker 1: the way something works. We've largely been focusing on software, 936 00:53:56,080 --> 00:53:59,799 Speaker 1: but that is just as legitimate as any hardware hack. 937 00:54:00,280 --> 00:54:02,680 Speaker 1: It's the idea of, how does this work? It might not 938 00:54:02,800 --> 00:54:04,800 Speaker 1: even just be the software, it might be a full system. 939 00:54:04,920 --> 00:54:07,480 Speaker 1: Like, how does this system work? What are all the 940 00:54:07,560 --> 00:54:11,680 Speaker 1: interlocking parts? How do they communicate with each other? I 941 00:54:11,840 --> 00:54:14,680 Speaker 1: just had a random memory from when I was younger 942 00:54:14,719 --> 00:54:17,720 Speaker 1: and in school. I took apart my first iPod because 943 00:54:17,760 --> 00:54:19,239 Speaker 1: I had no clue how it worked, and I was 944 00:54:19,360 --> 00:54:22,560 Speaker 1: very curious about what the interior of it was. Yeah, 945 00:54:22,640 --> 00:54:24,960 Speaker 1: so I just took it apart. I couldn't have 946 00:54:25,040 --> 00:54:27,960 Speaker 1: put it back together, so I was not a hacker in 947 00:54:28,040 --> 00:54:33,600 Speaker 1: any sense. For an article I was writing, we got 948 00:54:33,760 --> 00:54:40,920 Speaker 1: a first-edition, launch-day Nintendo 3DS, and it 949 00:54:41,040 --> 00:54:43,680 Speaker 1: was my job to disassemble it and take photos of 950 00:54:43,840 --> 00:54:47,080 Speaker 1: all the pieces. So first I took a picture of 951 00:54:47,200 --> 00:54:52,279 Speaker 1: it whole and shared it online on Twitter and said, 952 00:54:52,320 --> 00:54:54,879 Speaker 1: look what I have, and everyone got excited. And then 953 00:54:55,280 --> 00:54:57,080 Speaker 1: by the end of it, I had 954 00:54:57,080 --> 00:55:00,600 Speaker 1: a little black cauldron at my desk that was left over 955 00:55:00,719 --> 00:55:03,160 Speaker 1: from a Halloween thing. And then I put all the 956 00:55:03,239 --> 00:55:05,600 Speaker 1: different pieces in it, because there was no way this thing was 957 00:55:05,640 --> 00:55:08,000 Speaker 1: going back together after I took it apart. For one thing, 958 00:55:08,120 --> 00:55:11,360 Speaker 1: Nintendo is pretty careful about sealing stuff in such a 959 00:55:11,400 --> 00:55:14,480 Speaker 1: way that it's not meant to come apart, so 960 00:55:14,600 --> 00:55:16,040 Speaker 1: you have to use a little force in 961 00:55:16,160 --> 00:55:18,279 Speaker 1: some cases in order to get to stuff. And then 962 00:55:18,320 --> 00:55:19,960 Speaker 1: I showed a picture. I'm like, look what 963 00:55:20,120 --> 00:55:25,439 Speaker 1: I did to the thing. The entire internet cried. Yeah, 964 00:55:25,719 --> 00:55:28,920 Speaker 1: although ultimately, I think with the 3DS most people were like, 965 00:55:29,000 --> 00:55:32,719 Speaker 1: oh, whatever. But at the time when it was brand new, 966 00:55:32,800 --> 00:55:35,600 Speaker 1: people were freaking out. And of course there's also 967 00:55:35,719 --> 00:55:38,480 Speaker 1: another role for hackers out there.
It may not be 968 00:55:38,600 --> 00:55:42,160 Speaker 1: a steady gig, but we are seeing more and more 969 00:55:42,680 --> 00:55:46,840 Speaker 1: of the Hollywood productions out there actually talk with people 970 00:55:46,960 --> 00:55:50,040 Speaker 1: in the industry so that the depictions that we're getting 971 00:55:50,239 --> 00:55:54,160 Speaker 1: are more accurately reflecting what really happens. Mister Robot is 972 00:55:54,239 --> 00:55:58,600 Speaker 1: probably the example that immediately leaps to my mind, in 973 00:55:58,760 --> 00:56:01,600 Speaker 1: that it's a show that tries very hard to 974 00:56:01,760 --> 00:56:04,960 Speaker 1: take a more realistic approach to the world of hacking, 975 00:56:05,080 --> 00:56:08,759 Speaker 1: as opposed to you type in three passwords, the third 976 00:56:08,800 --> 00:56:11,640 Speaker 1: one gets you in, and then you're navigating through a 977 00:56:11,880 --> 00:56:15,080 Speaker 1: vector-graphics 3D dungeon and you encounter a skull 978 00:56:15,160 --> 00:56:18,920 Speaker 1: and crossbones. That's not how hacking works. Sounds like 979 00:56:19,000 --> 00:56:23,160 Speaker 1: you were talking about Hackers. Hack the planet! I might 980 00:56:23,280 --> 00:56:26,600 Speaker 1: have been. I mentioned, too, with education, just to bring 981 00:56:26,680 --> 00:56:29,040 Speaker 1: it back a bit: professors, I didn't leave you guys out. 982 00:56:29,080 --> 00:56:32,080 Speaker 1: I'm sorry. I love you guys. You are the reason 983 00:56:32,120 --> 00:56:33,920 Speaker 1: why I'm here now. If I didn't take my computer 984 00:56:34,040 --> 00:56:36,680 Speaker 1: courses in college with my professors, I would not be 985 00:56:36,800 --> 00:56:39,480 Speaker 1: doing what I'm doing now. So professors are, like, at 986 00:56:39,520 --> 00:56:42,040 Speaker 1: the top of that educational list. Oh sure, and 987 00:56:42,120 --> 00:56:44,120 Speaker 1: you can take a lot of computer security courses in 988 00:56:44,239 --> 00:56:47,480 Speaker 1: college, and sometimes in high schools if you're lucky. But yeah, 989 00:56:47,560 --> 00:56:52,319 Speaker 1: technical assistants. So technical assistants are people that will come 990 00:56:52,360 --> 00:56:55,360 Speaker 1: on board with a Hollywood movie or a TV show 991 00:56:55,520 --> 00:56:58,320 Speaker 1: or what have you, and they will explain to the 992 00:56:58,400 --> 00:57:03,400 Speaker 1: network how the hacking actually happens. So I know a 993 00:57:03,520 --> 00:57:07,880 Speaker 1: few. They'll come to some of their hacker 994 00:57:07,960 --> 00:57:10,000 Speaker 1: friends, or they will be a hacker themselves, and they 995 00:57:10,040 --> 00:57:13,560 Speaker 1: will say, okay, in this season, I know that they 996 00:57:13,680 --> 00:57:16,200 Speaker 1: want to do X, Y and Z on camera, and 997 00:57:16,360 --> 00:57:18,720 Speaker 1: I need to make it look legitimate, so they will 998 00:57:18,760 --> 00:57:20,720 Speaker 1: come up with the script. They will come up with 999 00:57:20,880 --> 00:57:24,560 Speaker 1: the hack and the actual keyboard commands that the actor 1000 00:57:24,720 --> 00:57:28,040 Speaker 1: has to type in on camera so that they are 1001 00:57:28,200 --> 00:57:32,080 Speaker 1: actually doing legitimate hacks.
So that way, they're not only 1002 00:57:32,760 --> 00:57:35,680 Speaker 1: making it look cool for a wider audience, because an 1003 00:57:35,680 --> 00:57:38,520 Speaker 1: audience is actually going to see how a hack works, 1004 00:57:38,600 --> 00:57:42,880 Speaker 1: but they're also getting that credibility with the infosec community too. 1005 00:57:43,360 --> 00:57:46,400 Speaker 1: So Mister Robot is huge with the infosec community because 1006 00:57:46,600 --> 00:57:49,800 Speaker 1: it is legitimate. Like, I've watched several of those episodes, 1007 00:57:49,800 --> 00:57:51,440 Speaker 1: and I've seen a lot of the hacks that they do. 1008 00:57:51,880 --> 00:57:54,000 Speaker 1: They've even used some of our Hak5 products on 1009 00:57:54,080 --> 00:57:58,080 Speaker 1: the show, and they're actually using legit hacks. And it 1010 00:57:58,280 --> 00:58:00,280 Speaker 1: is so much fun to see it on TV and 1011 00:58:00,360 --> 00:58:02,520 Speaker 1: see them get so many good reviews from a wider 1012 00:58:02,560 --> 00:58:06,120 Speaker 1: consumer audience, because it makes me feel like many more 1013 00:58:06,160 --> 00:58:08,800 Speaker 1: people are getting interested in infosec because they see 1014 00:58:08,800 --> 00:58:10,880 Speaker 1: what's happening on camera and they see that this is 1015 00:58:10,920 --> 00:58:13,560 Speaker 1: actually how you do it. Yeah, it's nice to see 1016 00:58:13,600 --> 00:58:20,320 Speaker 1: it go beyond the niche that I would argue infosec 1017 00:58:20,400 --> 00:58:24,480 Speaker 1: and hacking have largely inhabited for the past three decades. 1018 00:58:24,600 --> 00:58:27,520 Speaker 1: Right? The people who had been interested when it first started, 1019 00:58:27,560 --> 00:58:31,440 Speaker 1: it was essentially your hobbyists, and often those hobbyists were 1020 00:58:31,840 --> 00:58:36,160 Speaker 1: isolated individuals. You got to the phone phreaking days, where 1021 00:58:36,200 --> 00:58:39,720 Speaker 1: there was a little bit of a small subculture of 1022 00:58:39,800 --> 00:58:43,080 Speaker 1: people who were interested in hacking the telephone system using 1023 00:58:43,160 --> 00:58:46,320 Speaker 1: all sorts of stuff, including a whistle from Cap'n Crunch. 1024 00:58:47,600 --> 00:58:50,880 Speaker 1: You had the early hack days where people were just 1025 00:58:51,000 --> 00:58:53,680 Speaker 1: trying to create interesting programs for their computers or to 1026 00:58:53,760 --> 00:58:56,000 Speaker 1: see how some of the programs that were coming out, 1027 00:58:56,080 --> 00:59:00,840 Speaker 1: how did those work? But it was largely a tiny 1028 00:59:00,960 --> 00:59:04,600 Speaker 1: slice of the folks who were even aware of personal computers, 1029 00:59:04,720 --> 00:59:07,800 Speaker 1: and even that group was still a tiny slice of 1030 00:59:07,880 --> 00:59:12,520 Speaker 1: the overall population.
We're seeing that tiny slice grow over time, 1031 00:59:13,000 --> 00:59:16,200 Speaker 1: and largely because so many of us are so dependent 1032 00:59:16,320 --> 00:59:19,680 Speaker 1: upon computers these days that it benefits us to have 1033 00:59:19,720 --> 00:59:22,960 Speaker 1: an awareness to make sure that we remain safe, but 1034 00:59:23,160 --> 00:59:26,600 Speaker 1: also because of things like Mister Robot showing how this 1035 00:59:26,800 --> 00:59:31,560 Speaker 1: works and sparking the imagination of people who perhaps, before 1036 00:59:31,640 --> 00:59:34,400 Speaker 1: they saw that, never thought, yeah, it's kind of cool. 1037 00:59:34,520 --> 00:59:37,120 Speaker 1: I would love to be able to manipulate code in 1038 00:59:37,240 --> 00:59:39,600 Speaker 1: such a way that I could do something new or 1039 00:59:39,800 --> 00:59:44,439 Speaker 1: unexpected or help people. And it's really encouraging to see 1040 00:59:44,520 --> 00:59:47,840 Speaker 1: that kind of thing happen right now. I kind of 1041 00:59:47,880 --> 00:59:49,720 Speaker 1: wish it had happened ten years ago, but I love 1042 00:59:49,760 --> 00:59:52,600 Speaker 1: seeing it happen now. Same. I actually feel like there 1043 00:59:52,720 --> 00:59:55,760 Speaker 1: was a little bit of negativity in the sense that 1044 00:59:56,560 --> 01:00:00,320 Speaker 1: we used to have all these really fancy graphics happening 1045 01:00:00,640 --> 01:00:03,240 Speaker 1: in these Hollywood movies and these TV shows, and now 1046 01:00:03,360 --> 01:00:06,640 Speaker 1: they're actually seeing the reality that is hacking, and it 1047 01:00:06,880 --> 01:00:10,800 Speaker 1: is not super colorful. It's not super quick, fast-paced 1048 01:00:10,840 --> 01:00:13,440 Speaker 1: and exciting, like it looks like it is on those 1049 01:00:13,520 --> 01:00:17,160 Speaker 1: old-school shows. So I'm hoping that now that they're 1050 01:00:17,200 --> 01:00:20,800 Speaker 1: actually seeing it, people will try it too. Like if 1051 01:00:20,840 --> 01:00:24,400 Speaker 1: they see the main actor on Mister Robot use a 1052 01:00:24,480 --> 01:00:28,360 Speaker 1: specific command line option, they'll go to their computer and 1053 01:00:28,440 --> 01:00:30,720 Speaker 1: try it themselves and see that it actually does work, 1054 01:00:30,800 --> 01:00:32,360 Speaker 1: and then they'll be like, oh, I really want to 1055 01:00:32,400 --> 01:00:34,480 Speaker 1: try some new stuff too, so they'll start googling it 1056 01:00:34,520 --> 01:00:36,600 Speaker 1: and see what else they can find out. That's the 1057 01:00:36,720 --> 01:00:39,640 Speaker 1: kind of inspiration that I wish happened thirty years ago, 1058 01:00:39,920 --> 01:00:43,640 Speaker 1: and it didn't. So I want to see more of 1059 01:00:43,720 --> 01:00:46,800 Speaker 1: that now, and I'm really happy that, for example, Mister 1060 01:00:46,920 --> 01:00:49,640 Speaker 1: Robot has done a great job with it. Yeah, it's, 1061 01:00:49,680 --> 01:00:55,280 Speaker 1: and, you know, not to poop all over Hollywood, because 1062 01:00:55,480 --> 01:01:01,000 Speaker 1: I do love Hollywood, but it is to understand 1063 01:01:01,360 --> 01:01:03,080 Speaker 1: where they were coming from.
They were trying to find 1064 01:01:03,120 --> 01:01:09,120 Speaker 1: a way to create an exciting visual depiction of something 1065 01:01:09,320 --> 01:01:14,040 Speaker 1: that doesn't necessarily lend itself to that, in order 1066 01:01:14,760 --> 01:01:17,480 Speaker 1: to create a dramatic effect. So I get it. It's 1067 01:01:17,760 --> 01:01:21,800 Speaker 1: very similar to the way Hollywood portrayed virtual reality back 1068 01:01:21,800 --> 01:01:25,800 Speaker 1: in the nineties, way before virtual reality was ready for 1069 01:01:25,960 --> 01:01:30,480 Speaker 1: public consumption, and it's what largely killed VR for a 1070 01:01:30,680 --> 01:01:34,800 Speaker 1: decade before the various video game systems started to make 1071 01:01:34,840 --> 01:01:37,720 Speaker 1: the components cheap enough for people to play in that 1072 01:01:37,840 --> 01:01:40,320 Speaker 1: space again, and now we're on the verge of another 1073 01:01:40,440 --> 01:01:43,960 Speaker 1: VR revolution. The same sort of thing is true of hacking. Like, 1074 01:01:44,040 --> 01:01:47,880 Speaker 1: how do you show hacking in a way that gets 1075 01:01:47,920 --> 01:01:50,680 Speaker 1: across what is happening to an audience and makes it interesting? 1076 01:01:50,880 --> 01:01:53,400 Speaker 1: I think largely you have to do that through really 1077 01:01:53,440 --> 01:01:55,680 Speaker 1: good writing of your characters. And once you do that, 1078 01:01:56,440 --> 01:01:59,680 Speaker 1: then everything else follows. I think if you can show 1079 01:02:00,080 --> 01:02:02,880 Speaker 1: that the characters in a movie or in a TV 1080 01:02:03,040 --> 01:02:06,360 Speaker 1: show are actually real people that have real relationships, they 1081 01:02:06,440 --> 01:02:09,000 Speaker 1: have real jobs and real lives, and they have hobbies 1082 01:02:09,480 --> 01:02:13,360 Speaker 1: outside of just hacking, you can really start 1083 01:02:13,400 --> 01:02:16,960 Speaker 1: to relate to that character in a very real sense, 1084 01:02:17,080 --> 01:02:19,440 Speaker 1: in the fact that, hey, they are humans too. Because, 1085 01:02:19,560 --> 01:02:23,760 Speaker 1: Hackers Are People Too, that was actually a documentary. Nice. Yeah, 1086 01:02:23,840 --> 01:02:27,000 Speaker 1: because again, when you're thinking about it 1087 01:02:27,040 --> 01:02:29,959 Speaker 1: in the abstract, it really becomes that us versus 1088 01:02:30,080 --> 01:02:34,520 Speaker 1: them mentality, which by its very nature is dehumanizing. But 1089 01:02:34,680 --> 01:02:38,360 Speaker 1: that's probably a topic for a show that's not about technology. 1090 01:02:38,480 --> 01:02:42,040 Speaker 1: So I will just leave it be. Shannon Morse, thank 1091 01:02:42,120 --> 01:02:45,960 Speaker 1: you so much for joining me today. Please let everyone 1092 01:02:46,040 --> 01:02:50,080 Speaker 1: know where they can find all of your stuff. Jonathan Strickland, 1093 01:02:50,240 --> 01:02:54,360 Speaker 1: thank you. So yeah, that was 1094 01:02:54,400 --> 01:02:57,640 Speaker 1: a little curt, it was a little leaden. Yeah. Yeah, 1095 01:02:57,680 --> 01:03:01,120 Speaker 1: I've been watching Star Trek lately. Way, way too much Star Trek. 1096 01:03:01,200 --> 01:03:04,800 Speaker 1: So you can find me. The most direct path is 1097 01:03:04,960 --> 01:03:09,120 Speaker 1: on Twitter.
I'm at Snubs, and that's s n u b s, and then 1098 01:03:09,240 --> 01:03:12,760 Speaker 1: my shows, specifically our TekThing over at tekthing 1099 01:03:12,840 --> 01:03:17,040 Speaker 1: dot com and Hak5 over at hak five dot org. 1100 01:03:18,000 --> 01:03:20,720 Speaker 1: Well, that wraps up this classic episode of tech Stuff. 1101 01:03:21,040 --> 01:03:23,960 Speaker 1: Hope you enjoyed it. It was great having Shannon on 1102 01:03:24,000 --> 01:03:26,280 Speaker 1: the show. I've had her on a few times over 1103 01:03:26,360 --> 01:03:28,360 Speaker 1: the years, and I would actually love to 1104 01:03:28,400 --> 01:03:31,440 Speaker 1: have her on again. There's always an open invitation to Shannon. 1105 01:03:31,800 --> 01:03:34,080 Speaker 1: She is a very busy woman. She's got a lot 1106 01:03:34,200 --> 01:03:38,040 Speaker 1: going on. So if I can find a time to 1107 01:03:38,240 --> 01:03:40,320 Speaker 1: schedule her so that we can have her on and 1108 01:03:40,400 --> 01:03:44,200 Speaker 1: share her expertise, that would be wonderful, because she's got 1109 01:03:44,760 --> 01:03:48,360 Speaker 1: deep and practical knowledge in fields where I just 1110 01:03:48,560 --> 01:03:51,720 Speaker 1: have kind of a general awareness. So she's always great 1111 01:03:51,800 --> 01:03:54,040 Speaker 1: to have on the show. If you have suggestions for 1112 01:03:54,120 --> 01:03:56,440 Speaker 1: topics I should cover in future episodes of tech Stuff, 1113 01:03:56,520 --> 01:03:58,400 Speaker 1: please reach out to me and let me know what 1114 01:03:58,520 --> 01:04:00,560 Speaker 1: those are. One way to do that is to 1115 01:04:00,720 --> 01:04:03,760 Speaker 1: download the iHeartRadio app. It's free to download, it's free 1116 01:04:03,800 --> 01:04:06,280 Speaker 1: to use. You can navigate over to the tech Stuff 1117 01:04:06,320 --> 01:04:08,960 Speaker 1: part by just putting tech Stuff into that little search bar. 1118 01:04:09,440 --> 01:04:11,240 Speaker 1: It'll take you to the podcast page and you'll see 1119 01:04:11,240 --> 01:04:14,760 Speaker 1: a little microphone icon. If you click on that microphone icon, 1120 01:04:14,840 --> 01:04:16,760 Speaker 1: you can leave me a voice message up to thirty 1121 01:04:16,800 --> 01:04:18,520 Speaker 1: seconds in length and let me know what you would 1122 01:04:18,560 --> 01:04:21,160 Speaker 1: like to hear in the future, or if you prefer, 1123 01:04:21,240 --> 01:04:23,200 Speaker 1: you can go on over to Twitter and send me 1124 01:04:23,400 --> 01:04:27,120 Speaker 1: a tweet. The handle for the show is tech Stuff 1125 01:04:27,560 --> 01:04:38,000 Speaker 1: HSW, and I'll talk to you again really soon. Tech 1126 01:04:38,040 --> 01:04:42,439 Speaker 1: Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, 1127 01:04:42,800 --> 01:04:46,479 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you listen 1128 01:04:46,520 --> 01:04:47,600 Speaker 1: to your favorite shows.