Jonathan: Get in touch with technology with TechStuff from howstuffworks.com. Hey there, everybody, welcome to TechStuff. I'm Jonathan Strickland. I'm your host and the executive producer of this show here at HowStuffWorks and iHeartRadio, and a lover of all things tech. We are doing another classic episode of TechStuff, "Hacking for Dollars." I got to talk with Shannon Morse of Hak5, and we talked all about how hackers make money, both in legal and nefarious ways. This is going to be the last of our classic episodes covering my vacation time; I will be back in the studio recording brand new episodes, so I thank you very much for your patience. I hope you guys have enjoyed this look back on some past episodes of TechStuff. Let's join past Jonathan and Shannon right now.

Shannon: Hi, how are you, Jonathan?

Jonathan: I'm doing great. How are you?

Shannon: I'm awesome. Yeah, thank you so much for having me on. This is a great show and I love listening to it, so I'm super happy to be on.

Jonathan: Yeah, excellent.
Jonathan: Now, this is of course the second time we've had you on, so I will work very hard to increase that number. I want to at least get us up to double digits. But I got Shannon on the show specifically to talk about an area she talks about a lot, the realm of hacking, and specifically I wanted to do an episode about how hackers make money. How do you make a career out of hacking? And really, to frame this conversation, I think one of the most important things to do is to define your terms. As it turns out, "hacker" is actually a very broad term that can apply to a lot of different things, and not all of them are that nefarious, evil, infiltrate-a-system-and-steal-all-the-corporate-secrets kind of approach to hacking that Hollywood often presents.

Shannon: Right, right, exactly. I actually ask this question of a lot of people, especially when I first meet them. Since I'm so closely affiliated with a lot of the infosec community, I want to surround myself with positive people.
Shannon: So you'll notice with the hacker definition, you can either get a very negative vibe from somebody or a very positive vibe. Oftentimes, with the negative vibe, you'll get somebody who says, oh, that's the person who stole my credit card data when I went to a restaurant the other day. But on the positive side, you'll get somebody who says, oh, they're the kind of people who will break something apart and then put it back together in a way it wasn't supposed to be put back together, to make it do something cool. And that's what a hack is in the mainstream sense. So that's the way I see it. I see hackers as people who reverse engineer different software and hardware (it could just be a bicycle, for example) and put it back together in a way that makes it harder, better, faster, and stronger.

Jonathan: Nice, the old Daft Punk approach, of course. Yeah, I agree entirely. The original term "hacker" was really all about people who have an almost insatiable curiosity to learn how stuff works. Oddly enough, I share that quality, having worked at HowStuffWorks for a decade.
Jonathan: But yeah, to understand how something works and then to make it do things it wasn't necessarily intended to do. Not for nefarious purposes necessarily, although that could clearly be an application, but just for curiosity's sake. Can I take these elements that are meant to do this one thing and do something completely transformative with them, whether it's hardware or software? And we've seen some really cool stuff come out of that. I mean, I would argue that a lot of the things you see in the cosplay world and the steampunk world are all taking elements of hacking. Maker Faire is really just a hacker's paradise when you get down to it, especially for hardware hacks.

Shannon: Absolutely. I'm kind of sad I'm going to miss Maker Faire this year.

Jonathan: I haven't been to a big one yet. I've been to a small one here in Atlanta, a very, very modest Maker Faire. Everyone there was great and passionate and intelligent, but it was, you know, a much smaller scale than something you would see in the Bay Area. But that's the kind of thing that "hacker" means to me.
Jonathan: Now, that being said, in this episode we're really going to be focusing on the computer-oriented, really the software, side of hacking. And a large part of it is going to be on the bad guys, the "naughty bits," as I call it in our notes: what are the ways that malicious hackers cause problems, and how do they expect to profit from that? And we'll also look at hackers who don't follow that path, who are looking to help people, not hurt people. How do they make a living? Because it's one of those things you kind of take for granted when you see the Hollywood depiction of a hacker: the person sitting at a keyboard, and for some reason their monitor is only monochromatic green.

Shannon: Yes, that's so true. Well, they're using the old Apple IIe. Terminals actually are rendered in green oftentimes, but you can change the colors to rainbow colors if you choose. That is a hack, a real-life hack.
Jonathan: Yeah, yeah. And usually you see them sitting down, and then they cause some sort of mischief, sometimes bordering on sabotage. But when you think about it outside the context of that scene, you think, how did they expect to profit from this? So that's kind of what we're looking at.

Shannon: Yeah, because it's always important to me to reiterate that there are always going to be two sides of a coin to everything in life. Of course, there are going to be bad guys in the world who do nefarious hacks, but there are also a lot of good guys too. And personally, the reason I'm so interested in researching this is that it has made me a much more privacy- and security-conscious person. I've gotten a lot better at my own protections online, and I feel like if somebody else can understand what a hacker does on the bad side as well as the good side, they can better protect themselves too. That's what I've always tried to teach people.

Jonathan: Yeah, I think all you really have to do is attend one DEF CON and have that driven home.
Jonathan: I have not yet gone to a DEF CON, mostly because I don't know that I could part with my smartphone for that long, and I certainly wouldn't take it with me.

Shannon: Bring a burner phone, you'll be fine.

Jonathan: Yeah, that's me, Jonathan, the guy who carries the burner. It makes sense when you're doing something like that. So, for those who don't know, DEF CON is a large hacker conference largely focused on the realm of information security, and often you'll have entire presentations dedicated to showing off security vulnerabilities. Again, not necessarily so that people can take advantage of them, but rather to raise awareness and to kind of force the hands of the parties responsible for that software to take action and fix a problem. Right, like what we saw with the hack about remotely taking control of a person's vehicle; Jeep specifically was really having that issue.
134 00:07:16,160 --> 00:07:18,720 Speaker 1: Those one of those things where the researchers were saying, look, 135 00:07:19,080 --> 00:07:21,960 Speaker 1: we're bringing this to light, not so that we can 136 00:07:22,000 --> 00:07:25,120 Speaker 1: create an era where people are terrified of their vehicles 137 00:07:25,120 --> 00:07:27,120 Speaker 1: that someone's going to take remote control of their car, 138 00:07:27,480 --> 00:07:31,040 Speaker 1: but rather to really drive home the fact that the 139 00:07:31,080 --> 00:07:35,040 Speaker 1: information security is now it's important everywhere. It's not just 140 00:07:35,280 --> 00:07:38,640 Speaker 1: your phone, it's not just your computer. As the Internet 141 00:07:38,640 --> 00:07:42,600 Speaker 1: of Things continues to blossom, it's everything. Yes, I agree, 142 00:07:42,640 --> 00:07:46,080 Speaker 1: And in that sense, those researchers were trying to use 143 00:07:46,160 --> 00:07:49,920 Speaker 1: something the old school term is called responsible disclosure, where 144 00:07:49,960 --> 00:07:53,520 Speaker 1: they explain some kind of vulnerability that they found to 145 00:07:53,600 --> 00:07:56,600 Speaker 1: the company in hopes that the company will fix this 146 00:07:56,680 --> 00:07:59,680 Speaker 1: problem before it becomes mainstream and before it gets out 147 00:07:59,760 --> 00:08:03,520 Speaker 1: into the wild. In the case of Jeep. I believe, 148 00:08:03,600 --> 00:08:05,920 Speaker 1: if my memory serves me right, that Jeep did not 149 00:08:06,120 --> 00:08:10,280 Speaker 1: necessarily release a patch for this vulnerability. 
Shannon: So then the researchers decided to go public with the information they'd found, and then Jeep decided to fix it once everybody else knew about it.

Jonathan: Right, and sometimes that's what it takes. I've had the same discussion offline with a mutual friend of ours, Brian Brushwood. Brian is a stage magician, and he has a show called Scam School that's all about social engineering. One of the things I've talked about with Brian is that on his show he often demonstrates certain types of scams or tricks, but they're mostly in the bar-bet world. Not stuff you would do to ruin someone's life, but something that, you know, might win you a free beer.

Shannon: Totally. I've used some of those myself.

Jonathan: Yeah. And he had an episode where he showed off this guy who was demonstrating a well-known vulnerability of a popular bike lock, one that has been off the market for a couple of years because of this vulnerability.
Jonathan: That particular vulnerability meant that you could take a regular plastic pen, remove the ink cartridge, use the casing, jam it into the lock, and pop the lock open.

Shannon: Whoa. Right.

Jonathan: And so people were complaining in the comments, saying, you're publicizing this vulnerability. And I said, guess what, the bad guys already know about this vulnerability. What they're doing is publicizing it to a public that might still be vulnerable to it, so that they don't fall victim. And that, to me, is a very important role of hackers across the board. They serve a very important purpose: to alert folks to potential dangers before it's too late.

Shannon: Yeah, absolutely. And those hackers are the people who are generally working to make a better world for consumers, a more private and secure world for consumers. But then, of course, on the other hand, there are the baddies.

Jonathan: Yeah, let's talk about some of them. So I kind of gave some weird little titles for this when I was typing it up, because in the middle of a week, I get bored.
Jonathan: Shannon, to be honest, when I was making an outline for us to work from, I started coming up with goofy subtitles. So this whole section is titled "The Naughty Bits" in our notes, and the first one is "Malware Moolah," as in people who make money through the development or distribution of malware. And malware, as I've defined it on this show many times, is really software that is intended to do something ultimately harmful to the person who runs it on their machine. It covers a wide array of subcategories. This is the sort of thing that in the old days we would have just called a computer virus, but a computer virus is a very specific thing, and malware covers more than just viruses: worms and all sorts of other stuff too.

Shannon: Yeah, there's malware for Java and Flash. If you still have Flash installed, I highly recommend you uninstall it if you don't need it. There's malware for browsers.
Shannon: There's malware in online advertisements, in the sponsor spots you'll see on different websites. That was a very recent problem that a lot of news publications had.

Jonathan: Yeah, big-name news publications.

Shannon: Yeah, so that was a big one. But you'll see malware all over the place, and luckily we do have anti-malware software we can use to protect our computers, and we can also block certain ports on our routers, which can hopefully protect us. But there are also a lot of cases where malware is distributed and built so quickly that the anti-malware software isn't updated fast enough. So in that case, we need to do the best we can to protect ourselves and keep the malware coming out of the deep web at bay.

Jonathan: Yeah, you know, it used to be that all you really needed to worry about was just not going to the seedier elements of the web, and you were generally all right.

Shannon: Right, yeah. It's kind of like avoiding a bad neighborhood.
Shannon: Obviously, if you don't want to get robbed, there are certain neighborhoods you probably shouldn't walk around in by yourself at night. And this is kind of similar: you avoid the deep web unless you really want to be on somebody's hit list or something like that.

Jonathan: Yeah, yeah. If you suddenly think you want to come across as a big shot, look, if you're not a big shot, don't do that. It's kind of like walking up to someone who works in a carnival and claiming that you're "with it and for it." If you don't know what that means, you do not say that. Okay, I think I just gave terrible advice to an entire population of listeners. Don't talk to carnies unless you are one. All right, and I love you, carnies, I love you all. So the thing that we're getting across, though, is that today that's not as big a guarantee as it used to be. Like, ten years ago you'd say, look, just be careful. Don't download unusual files.
Jonathan: Don't run a file that's linked in your email without checking it out first. Be careful opening emails from senders you don't recognize. Be careful with PDF files. Be careful with anything, especially unsolicited stuff, that comes to you, because that raises the chances that something hinky is going on. It doesn't necessarily mean it's definitely a problem, but it's potentially a problem, and it's better to be safe than sorry. Make sure you have good antivirus software on your computer, make sure you have a nice strong firewall, all of those kinds of things. Those used to be pretty good at keeping malware away from you if you were being a fairly responsible netizen.

Shannon: They definitely help these days.

Jonathan: These days, though, the attacks are sometimes, as in the case of the advertisements on news sites, going through avenues that at one point you would have considered perfectly safe.
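One concrete piece of that "check it out first" advice is verifying a downloaded file's checksum against the one the publisher lists before you ever run it. A minimal sketch of how you might do that in Python (the file name and the idea of a vendor-published digest are illustrative; the hosts don't prescribe a specific tool):

```python
import hashlib

def sha256_of(path, chunk_size=65536):
    """Return the SHA-256 hex digest of a file, read in chunks
    so a large download never has to fit in memory at once."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare the result to the checksum the vendor publishes alongside
# the download (hypothetical example):
#   sha256_of("installer.exe") == published_digest
```

If the digests don't match, the file was corrupted or tampered with in transit, which is exactly the kind of "something hinky" the advice above is warning about.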
259 00:14:08,920 --> 00:14:11,200 Speaker 1: Not that it's happening all the time, but the fact 260 00:14:11,200 --> 00:14:15,559 Speaker 1: that it can happen tells you that it requires an 261 00:14:15,640 --> 00:14:18,960 Speaker 1: extra level of vigilance beyond what we used to say 262 00:14:19,080 --> 00:14:23,080 Speaker 1: was was sufficient. Yeah. Absolutely, A data collection for a 263 00:14:23,080 --> 00:14:27,720 Speaker 1: lot of this matare is extremely Uh. It's high sensitive 264 00:14:27,840 --> 00:14:32,440 Speaker 1: in the fact that a user's data can get so 265 00:14:32,520 --> 00:14:35,160 Speaker 1: much money on the on the deep web, so much 266 00:14:35,200 --> 00:14:39,560 Speaker 1: money really, particularly a collection of user data. That's where 267 00:14:39,560 --> 00:14:42,440 Speaker 1: the big money is, right. I did an episode once 268 00:14:42,520 --> 00:14:45,200 Speaker 1: where we tried to break down how much is your 269 00:14:45,240 --> 00:14:50,160 Speaker 1: personal information worth? Yeah, it really depends. It depends upon 270 00:14:50,240 --> 00:14:53,000 Speaker 1: what information you're talking about, Like how extensive is that 271 00:14:53,040 --> 00:14:56,640 Speaker 1: profile on a person? But yeah, it's not much in 272 00:14:56,680 --> 00:14:58,720 Speaker 1: the grand scheme of things like, to you, it's worth 273 00:14:58,800 --> 00:15:01,760 Speaker 1: a lot right you a person, Shannon, you as a person, 274 00:15:01,840 --> 00:15:04,240 Speaker 1: that information is worth a lot of money to you 275 00:15:04,960 --> 00:15:08,240 Speaker 1: because it's who you are. To someone else, it's worth 276 00:15:08,640 --> 00:15:11,560 Speaker 1: pennies on the dollar, really, depending upon depending upon the 277 00:15:11,600 --> 00:15:16,600 Speaker 1: amount of information. 
Jonathan: But this malware is often giving hackers access to massive amounts of info about a huge number of people, and in numbers there is more value, and that's where they will sell it. Sometimes they sell it to companies that are just interested in getting information so they can do targeted advertising. So it might be that the ultimate use of your information isn't as bad as it could be; it just means you're going to get some ads. But it's still not fun to think about, knowing that these companies now have access to information about you that you would probably rather they not have. Particularly with targeted advertising. There's the famous story about Target, when they started sending a young lady ads related to pregnancy, and her dad got really, really ticked off about it. But it turned out she was pregnant, and it was because the algorithms had picked up on that through her search habits, based on the search terms she was putting in, so they proactively sent her some coupons for pregnancy-related items.
Jonathan: The dad got very upset, then ended up apologizing to Target, saying he was unaware of the full situation at the time. Well, in that case it was search algorithms; it wasn't a hacker who had gained access to data and then sold it. But there are other cases where that does happen, where it's just a database of info.

Shannon: And a lot of times they will release this malware in something called an exploit kit. Generally, these exploit kits are like a batch of similar malware that will work across several different platforms, whether that's several different types of software like Java and Flash, or several different browsers. It could be several different operating systems, too. So you might see an exploit kit that works on Linux but also works on Windows XP up through 8, or something like that.
Jonathan: Right. And what's crazy is that this is one of the things hackers do: they'll look at operating systems and what the market penetration is for those systems, because that shows you where your target-rich environment is. So if you have Windows 7, guess what, you are a prime target for malware, because that has by far the greatest market share of any operating system right now. And Windows XP is still number three.

Shannon: Number three, and it has not been supported by Microsoft for two years.

Jonathan: Which is, by the way, a bad thing. If you want to be really secure with your computing, you don't want to be using an operating system that no longer gets support from the company that made it, because that means no vulnerabilities will be patched from that moment forward. You're pretty much on your own. You have gone into the dark forest, and you forgot to bring your flashlight.

Shannon: It's pretty dangerous.
One of the 330 00:18:11,760 --> 00:18:14,040 Speaker 1: things that I think 331 00:18:14,600 --> 00:18:16,480 Speaker 1: leads in from what you were saying before with 332 00:18:16,560 --> 00:18:21,000 Speaker 1: these exploit kits. One of the most terrifying aspects of 333 00:18:21,040 --> 00:18:23,719 Speaker 1: this type of malware, and the fact that 334 00:18:23,760 --> 00:18:27,600 Speaker 1: people can use it for nefarious purposes and monetary gain, 335 00:18:28,480 --> 00:18:31,200 Speaker 1: is that you also have a population of people who 336 00:18:31,240 --> 00:18:34,160 Speaker 1: don't even understand how the malware works. They don't even... 337 00:18:34,920 --> 00:18:38,240 Speaker 1: Script kiddies is what I'm getting at. Script kiddies, that's 338 00:18:38,280 --> 00:18:42,720 Speaker 1: the term we use for people who are, uh, 339 00:18:42,760 --> 00:18:47,040 Speaker 1: benefiting from the work that hackers have done. Hackers 340 00:18:47,040 --> 00:18:49,720 Speaker 1: are the ones who are actually putting together the software. 341 00:18:49,760 --> 00:18:52,760 Speaker 1: They're the ones who have identified the vulnerability and then 342 00:18:52,800 --> 00:18:55,640 Speaker 1: exploited it in some way. Script kiddies are the ones 343 00:18:55,680 --> 00:18:59,320 Speaker 1: who, essentially, they're given a set of skeleton keys, and 344 00:18:59,400 --> 00:19:03,560 Speaker 1: they didn't make the skeleton keys, they're just using them. 345 00:19:03,600 --> 00:19:06,960 Speaker 1: And it's scary because you don't need a level of expertise. 346 00:19:07,119 --> 00:19:09,280 Speaker 1: You might think, oh, well, I'm kind of safe from 347 00:19:09,320 --> 00:19:12,160 Speaker 1: hackers, because how many people are actually hackers? How many 348 00:19:12,160 --> 00:19:16,120 Speaker 1: people really know how this system works?
Well, you don't 349 00:19:16,200 --> 00:19:18,040 Speaker 1: have to really know how the system works if you 350 00:19:18,080 --> 00:19:21,240 Speaker 1: have a tool that exploits a vulnerability. Oh absolutely. Although 351 00:19:21,280 --> 00:19:24,080 Speaker 1: I really hate the word script kiddie, I will put 352 00:19:24,119 --> 00:19:27,080 Speaker 1: it out there, because I feel like if you're interested 353 00:19:27,280 --> 00:19:31,240 Speaker 1: in information security, and if you're interested in becoming a 354 00:19:31,280 --> 00:19:35,000 Speaker 1: good hacker, then you do start somewhere. And everybody is 355 00:19:35,000 --> 00:19:37,120 Speaker 1: going to start with the easy tools that are out 356 00:19:37,160 --> 00:19:40,080 Speaker 1: there and that are available for free. For example, one 357 00:19:40,119 --> 00:19:42,240 Speaker 1: thing that I learned how to use a couple of 358 00:19:42,280 --> 00:19:45,280 Speaker 1: years back was this tool called Wireshark. It easily 359 00:19:45,359 --> 00:19:48,479 Speaker 1: lets you see everything that's happening on your wireless network, 360 00:19:48,600 --> 00:19:51,879 Speaker 1: or you can use it for, um, any computers that 361 00:19:51,920 --> 00:19:54,280 Speaker 1: are on your network, like behind your router, 362 00:19:54,640 --> 00:19:56,679 Speaker 1: so you can see everything that's going on, and you 363 00:19:56,720 --> 00:20:01,439 Speaker 1: don't necessarily have to learn or understand what's going on 364 00:20:01,520 --> 00:20:04,600 Speaker 1: behind it to be able to read what's on your 365 00:20:04,600 --> 00:20:07,560 Speaker 1: screen happening right in front of you. I think it's 366 00:20:07,640 --> 00:20:11,840 Speaker 1: really important, though, for people who might be called script 367 00:20:11,920 --> 00:20:16,120 Speaker 1: kiddies to look at it as being beneficial, in that they 368 00:20:16,200 --> 00:20:19,280 Speaker 1: can grow from that process.
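What a tool like Wireshark shows you is, at bottom, parsed packet headers. As a minimal sketch of that parsing, assuming nothing beyond the public IPv4 header layout, here is how the fixed 20-byte header of a captured packet decomposes into the fields a packet sniffer displays (this is an illustration, not Wireshark's actual code):

```python
import struct

def parse_ipv4_header(packet: bytes) -> dict:
    """Unpack the fixed 20-byte IPv4 header from raw captured bytes."""
    (version_ihl, _tos, _total_len, _ident, _flags_frag,
     ttl, proto, _checksum, src, dst) = struct.unpack("!BBHHHBBH4s4s", packet[:20])
    return {
        "version": version_ihl >> 4,
        "header_len": (version_ihl & 0x0F) * 4,   # IHL is in 32-bit words
        "ttl": ttl,
        "protocol": proto,                        # 6 = TCP, 17 = UDP
        "src": ".".join(str(b) for b in src),
        "dst": ".".join(str(b) for b in dst),
    }

# A hand-built example header: 192.168.1.10 -> 192.168.1.1, TCP, TTL 64
sample = struct.pack("!BBHHHBBH4s4s", 0x45, 0, 40, 1, 0, 64, 6, 0,
                     bytes([192, 168, 1, 10]), bytes([192, 168, 1, 1]))
info = parse_ipv4_header(sample)
```

The point Shannon makes holds here too: you can read the decoded `src`/`dst` fields off the screen without ever understanding the bit-shifting behind them.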
They can start from being 369 00:20:19,320 --> 00:20:22,000 Speaker 1: a beginner and say, okay, well, I need to understand 370 00:20:22,040 --> 00:20:24,280 Speaker 1: the theory. Now I can move on from being a 371 00:20:24,320 --> 00:20:27,800 Speaker 1: script kiddie, quote unquote, to becoming somebody who is an 372 00:20:27,840 --> 00:20:32,240 Speaker 1: expert in some kind of information security out there. Yeah. 373 00:20:32,280 --> 00:20:34,640 Speaker 1: When I think of the term script kiddie, in 374 00:20:34,680 --> 00:20:36,760 Speaker 1: my mind, it's a subset of the 375 00:20:36,760 --> 00:20:40,840 Speaker 1: people that typically get labeled as such. That subset being 376 00:20:41,320 --> 00:20:43,840 Speaker 1: people who have little to no interest in actually learning 377 00:20:43,960 --> 00:20:48,320 Speaker 1: how to hack or program. Uh, people who want a 378 00:20:48,440 --> 00:20:53,560 Speaker 1: very, very fast track way to gain either a reputation, 379 00:20:54,280 --> 00:20:57,280 Speaker 1: by being the person who took down a system by 380 00:20:57,400 --> 00:21:00,879 Speaker 1: whatever means, or by making a whole lot of money 381 00:21:00,920 --> 00:21:04,680 Speaker 1: really fast for relatively little effort. Those are the ones 382 00:21:04,720 --> 00:21:07,040 Speaker 1: I specifically think of when I think of script kiddie. 383 00:21:07,080 --> 00:21:09,600 Speaker 1: But you are absolutely right, you have to start somewhere 384 00:21:09,760 --> 00:21:12,160 Speaker 1: if you're interested in that. I'm kind of defensive 385 00:21:12,200 --> 00:21:14,480 Speaker 1: about that, because I was called a script kiddie 386 00:21:14,480 --> 00:21:17,960 Speaker 1: when I first started off learning about hacking 387 00:21:17,960 --> 00:21:21,520 Speaker 1: and information security.
People would be like, oh, she's just 388 00:21:21,560 --> 00:21:23,560 Speaker 1: a script kiddie, and I'd be like, well, I actually 389 00:21:23,600 --> 00:21:25,359 Speaker 1: want to understand the theory. I want to learn how 390 00:21:25,359 --> 00:21:27,360 Speaker 1: to program. I want to learn how to code. I'm 391 00:21:27,359 --> 00:21:30,119 Speaker 1: no longer called that, because I have learned how to 392 00:21:30,119 --> 00:21:32,960 Speaker 1: write certain kinds of code. I have learned how to program. 393 00:21:33,000 --> 00:21:35,240 Speaker 1: I can make my Arduino do whatever 394 00:21:35,280 --> 00:21:37,960 Speaker 1: I want. So at this point in my stage, I've 395 00:21:38,000 --> 00:21:41,760 Speaker 1: surpassed that moment of being a noob, and I've gone 396 00:21:41,760 --> 00:21:46,760 Speaker 1: on to learning things and being able to understand specific 397 00:21:46,840 --> 00:21:48,719 Speaker 1: tasks and get them to do what I want them 398 00:21:48,760 --> 00:21:51,840 Speaker 1: to do without finding tutorials online. Yeah, so now I 399 00:21:51,880 --> 00:21:55,000 Speaker 1: make my own tutorials. See, now that's nice, because when 400 00:21:55,000 --> 00:21:57,119 Speaker 1: I started at How Stuff Works, they called me that 401 00:21:57,160 --> 00:22:01,160 Speaker 1: weird bald guy, and today they still do. So some 402 00:22:01,280 --> 00:22:04,720 Speaker 1: labels just stick, is what I'm saying. So yeah, 403 00:22:04,720 --> 00:22:08,040 Speaker 1: so that kind of covers the malware approach. People can 404 00:22:08,080 --> 00:22:12,199 Speaker 1: make money through malware, either by selling your information, 405 00:22:12,240 --> 00:22:16,800 Speaker 1: or they might do so by another method, which kind of 406 00:22:16,880 --> 00:22:19,600 Speaker 1: leads into this idea of ransomware.
So this would be 407 00:22:19,640 --> 00:22:23,760 Speaker 1: a specific type of malware that, um, locks down your 408 00:22:23,800 --> 00:22:26,560 Speaker 1: machine in some way so that you can no longer 409 00:22:26,640 --> 00:22:29,119 Speaker 1: access it, and then you essentially get a message saying, hey, 410 00:22:29,200 --> 00:22:31,199 Speaker 1: if you want your data back, if 411 00:22:31,240 --> 00:22:33,000 Speaker 1: you want access to your data, if you want to 412 00:22:33,000 --> 00:22:35,360 Speaker 1: be able to do all this stuff, and you want 413 00:22:35,359 --> 00:22:37,920 Speaker 1: our hands out of your business, then you've got to 414 00:22:37,920 --> 00:22:41,320 Speaker 1: pay us some moolah, some money. Shannon and I will 415 00:22:41,320 --> 00:22:44,199 Speaker 1: have more to say about how hackers make all the 416 00:22:44,320 --> 00:22:47,280 Speaker 1: dollar dollar bills, y'all, in just a moment, but first 417 00:22:47,359 --> 00:22:57,880 Speaker 1: let's take a quick break. Yeah. So, basically what happens 418 00:22:57,880 --> 00:23:01,160 Speaker 1: with ransomware is, uh, it is just like you said, 419 00:23:01,200 --> 00:23:04,560 Speaker 1: a type of malware that gets distributed in one way, 420 00:23:04,600 --> 00:23:07,600 Speaker 1: shape or form onto somebody's computer, and it ends up 421 00:23:07,680 --> 00:23:10,480 Speaker 1: encrypting their data. It could be a whole hard drive. 422 00:23:10,520 --> 00:23:12,800 Speaker 1: It could be a folder of data. It's some kind 423 00:23:12,800 --> 00:23:16,280 Speaker 1: of important data that they have sitting on their computer.
424 00:23:16,320 --> 00:23:20,439 Speaker 1: And in many cases, the thief, the hacker, will ask 425 00:23:20,520 --> 00:23:23,959 Speaker 1: them, in an email or maybe an encrypted text document 426 00:23:24,040 --> 00:23:27,760 Speaker 1: that's now surreptitiously on their computer out of nowhere, to 427 00:23:28,680 --> 00:23:31,080 Speaker 1: send them a certain amount of bitcoins, and they tell 428 00:23:31,119 --> 00:23:33,000 Speaker 1: them how to set up a bitcoin wallet so that 429 00:23:33,040 --> 00:23:35,600 Speaker 1: they can send the bitcoins to them for them to 430 00:23:35,640 --> 00:23:39,160 Speaker 1: get a pass code to unlock their encrypted data. Now, 431 00:23:39,200 --> 00:23:42,040 Speaker 1: the weird part is they already own this data. It is 432 00:23:42,040 --> 00:23:44,120 Speaker 1: on their own hard drive. It could be anything from, 433 00:23:44,119 --> 00:23:46,959 Speaker 1: like, kids' photos, it could be tax documents. But in 434 00:23:47,000 --> 00:23:48,960 Speaker 1: any case, it's going to be some kind of important 435 00:23:49,000 --> 00:23:52,040 Speaker 1: information that people don't want to lose, because it might 436 00:23:52,080 --> 00:23:55,280 Speaker 1: be years and years of information that's just on that computer. 437 00:23:55,880 --> 00:23:58,159 Speaker 1: So of course people are going to send them bitcoins. 438 00:23:58,560 --> 00:24:00,919 Speaker 1: And I think last I checked, a Bitcoin was a 439 00:24:00,960 --> 00:24:03,000 Speaker 1: few hundred bucks, so it ends up being quite a 440 00:24:03,000 --> 00:24:04,680 Speaker 1: bit of money that they have to send to get 441 00:24:04,680 --> 00:24:08,000 Speaker 1: their information unlocked. Yeah, and this is the 442 00:24:08,000 --> 00:24:12,040 Speaker 1: type of malware we were talking about with the advertising 443 00:24:12,119 --> 00:24:15,679 Speaker 1: that was targeting people through massive news sites.
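The mechanics Shannon describes, your own files scrambled in place, reversible only with a key the attacker holds, can be shown with a deliberately tiny toy cipher. This is an illustration only: real ransomware uses strong ciphers such as AES, and the key name here is hypothetical.

```python
import hashlib
from itertools import cycle

def toy_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR the data against a key-derived stream.
    The only point being demonstrated is that the exact same secret key
    is needed to reverse the scrambling. Do NOT use XOR like this for
    real cryptography."""
    stream = hashlib.sha256(key).digest()
    return bytes(b ^ k for b, k in zip(data, cycle(stream)))

attacker_key = b"key-only-the-attacker-holds"   # hypothetical key material
document = b"years of family photos and tax documents"

ciphertext = toy_cipher(document, attacker_key)    # what the victim is left with
recovered = toy_cipher(ciphertext, attacker_key)   # only works with the key
```

Because XOR with the same stream is its own inverse, one function both locks and unlocks; the victim has the ciphertext on their own disk but nothing recoverable without the attacker's key, which is exactly the leverage the ransom note exploits.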
If I'm 444 00:24:15,720 --> 00:24:19,159 Speaker 1: not mistaken, it was specifically ransomware. It was the kind 445 00:24:19,160 --> 00:24:23,240 Speaker 1: of stuff that was encrypting users' data. Uh yeah, yeah, so 446 00:24:23,280 --> 00:24:26,679 Speaker 1: it wasn't just malware. It was ransomware that was infecting computers. 447 00:24:26,720 --> 00:24:28,640 Speaker 1: Because malware can do other stuff too, right? It can 448 00:24:28,800 --> 00:24:34,160 Speaker 1: create something like backdoor access. So yeah, 449 00:24:34,359 --> 00:24:36,880 Speaker 1: hackers can take control of your machine or just monitor 450 00:24:36,920 --> 00:24:38,879 Speaker 1: what you're doing, even if they don't want to take control. 451 00:24:39,240 --> 00:24:41,360 Speaker 1: They can put in key loggers so they can see 452 00:24:41,359 --> 00:24:44,200 Speaker 1: what all your passwords are. Um, so you might want 453 00:24:44,200 --> 00:24:47,480 Speaker 1: to think about using things like a really good password manager. 454 00:24:48,040 --> 00:24:51,399 Speaker 1: That's what I use, and I love mine. 455 00:24:52,000 --> 00:24:55,600 Speaker 1: Yeah, so the things where you don't have to 456 00:24:55,600 --> 00:24:57,360 Speaker 1: type the password, and so you don't have to worry 457 00:24:57,359 --> 00:24:59,399 Speaker 1: about key loggers picking up on that kind of stuff. 458 00:25:01,000 --> 00:25:03,240 Speaker 1: But we'll talk more about that in just a second. 459 00:25:03,359 --> 00:25:05,440 Speaker 1: So one of the other ones I wanted to talk about, 460 00:25:05,480 --> 00:25:07,959 Speaker 1: this one is kind of a gray area, because, uh, 461 00:25:08,119 --> 00:25:12,399 Speaker 1: I titled this section spies like us, 462 00:25:12,400 --> 00:25:16,480 Speaker 1: and by this I meant state sponsored hackers.
People who 463 00:25:16,560 --> 00:25:21,960 Speaker 1: are hacking on behalf of a specific state or nation 464 00:25:22,080 --> 00:25:25,840 Speaker 1: or government. Sometimes they may be doing so not 465 00:25:26,040 --> 00:25:29,760 Speaker 1: with the, uh, how should I say, not with 466 00:25:29,840 --> 00:25:33,760 Speaker 1: the express permission of the nation. It may turn out 467 00:25:33,800 --> 00:25:36,040 Speaker 1: that the state says, hey, we didn't tell them to 468 00:25:36,080 --> 00:25:38,280 Speaker 1: do this. They're just doing it because they love us 469 00:25:38,359 --> 00:25:41,560 Speaker 1: so much and they hate you guys, 470 00:25:42,119 --> 00:25:44,960 Speaker 1: and that's why they're doing it. Whether that's true 471 00:25:45,040 --> 00:25:48,119 Speaker 1: or not depends upon the situation. I would 472 00:25:48,280 --> 00:25:51,399 Speaker 1: think that if I were running a government and I 473 00:25:51,440 --> 00:25:55,480 Speaker 1: had employed a bunch of hackers to infiltrate or sabotage 474 00:25:55,960 --> 00:25:59,680 Speaker 1: another nation's systems, I also would like some plausible deniability 475 00:25:59,680 --> 00:26:03,040 Speaker 1: in there. Hey, I didn't tell them to do it. 476 00:26:03,119 --> 00:26:05,640 Speaker 1: I just said... Man, it's kind of like 477 00:26:05,720 --> 00:26:09,840 Speaker 1: there's a story that a king of England once yelled out, 478 00:26:09,880 --> 00:26:12,640 Speaker 1: who will rid me of this meddlesome priest? And then 479 00:26:12,640 --> 00:26:15,119 Speaker 1: a couple of knights went off and rid him of 480 00:26:15,480 --> 00:26:18,520 Speaker 1: that meddlesome priest, and it turned out that he was 481 00:26:18,760 --> 00:26:22,840 Speaker 1: just mad and just talking out loud.
And 482 00:26:22,880 --> 00:26:25,360 Speaker 1: then one of his dearest friends ended up being murdered 483 00:26:25,400 --> 00:26:27,640 Speaker 1: by a couple of knights, because they heard the guy 484 00:26:27,720 --> 00:26:29,359 Speaker 1: talking and said, hey, we should get rid of him. 485 00:26:29,359 --> 00:26:32,280 Speaker 1: We'll get rewarded. That's what the states argue. I 486 00:26:32,280 --> 00:26:34,440 Speaker 1: don't know that that's always the case. Also, by the way, 487 00:26:34,560 --> 00:26:37,119 Speaker 1: for you listeners out there who recognize who I'm talking about, 488 00:26:37,160 --> 00:26:39,439 Speaker 1: send me an email and prove it, because I'm a 489 00:26:39,480 --> 00:26:43,199 Speaker 1: medievalist and I love that stuff. But yeah, this 490 00:26:43,280 --> 00:26:45,439 Speaker 1: is something that we see. You know, you often will 491 00:26:45,440 --> 00:26:47,760 Speaker 1: hear stories about Chinese hackers or Russian hackers. There was 492 00:26:47,800 --> 00:26:52,679 Speaker 1: a story several years ago about how information 493 00:26:52,720 --> 00:26:57,200 Speaker 1: security experts were noticing some artifacts in our power grid 494 00:26:57,280 --> 00:27:02,080 Speaker 1: system that were indicative of people who had infiltrated 495 00:27:02,119 --> 00:27:04,359 Speaker 1: that system and planted some stuff in there so that 496 00:27:04,400 --> 00:27:07,440 Speaker 1: they could monitor things, or perhaps even jump back into 497 00:27:07,440 --> 00:27:10,560 Speaker 1: the power grid system should push come to shove 498 00:27:10,600 --> 00:27:13,439 Speaker 1: in some sort of political situation. They had traced it 499 00:27:13,480 --> 00:27:16,439 Speaker 1: back to either China or Russia.
It's pretty tricky to 500 00:27:16,600 --> 00:27:20,399 Speaker 1: actually figure out where attacks ultimately originate from, because if 501 00:27:20,400 --> 00:27:23,680 Speaker 1: you're really good, you can cover your tracks pretty well. 502 00:27:23,920 --> 00:27:26,359 Speaker 1: But the United States has done it too. You might 503 00:27:26,400 --> 00:27:29,080 Speaker 1: have heard about Stuxnet. That was 504 00:27:29,119 --> 00:27:32,879 Speaker 1: the computer virus that was designed to 505 00:27:33,119 --> 00:27:37,000 Speaker 1: spin a centrifuge in a nuclear facility at a speed 506 00:27:37,160 --> 00:27:40,280 Speaker 1: greater than what it was supposed to spin at, and 507 00:27:40,440 --> 00:27:42,720 Speaker 1: originally I think the hope was that it would cause 508 00:27:42,760 --> 00:27:47,480 Speaker 1: a catastrophic failure and perhaps even destroy the facility. 509 00:27:47,800 --> 00:27:50,320 Speaker 1: As it turned out, it caused a failure, but not 510 00:27:50,440 --> 00:27:54,440 Speaker 1: at that level. But those are examples of something 511 00:27:54,440 --> 00:27:59,480 Speaker 1: that's technically legal within the country, because it's endorsed 512 00:27:59,600 --> 00:28:04,800 Speaker 1: or at least permitted by a government, but you don't 513 00:28:04,840 --> 00:28:08,160 Speaker 1: want it out there, because it seems pretty darn shady 514 00:28:08,240 --> 00:28:13,200 Speaker 1: to anybody else. Yeah. Yeah. So state sponsored hacks are 515 00:28:13,560 --> 00:28:18,040 Speaker 1: more worrisome to me, because they oftentimes have much larger targets.
516 00:28:18,640 --> 00:28:22,800 Speaker 1: For example, they might target a large government facility, like, 517 00:28:23,080 --> 00:28:26,280 Speaker 1: I don't know, the Pentagon. So I worry about those, 518 00:28:26,359 --> 00:28:30,240 Speaker 1: because those kinds of servers have a lot of information 519 00:28:30,280 --> 00:28:33,119 Speaker 1: on the citizens of any sort of country. So 520 00:28:33,160 --> 00:28:35,920 Speaker 1: anytime you see these in the news, it's always like, oh, well, 521 00:28:36,000 --> 00:28:39,720 Speaker 1: this hack was done by Chinese state sponsored hackers, 522 00:28:39,840 --> 00:28:44,200 Speaker 1: or Russian state sponsored hackers, or American state sponsored hackers, 523 00:28:44,200 --> 00:28:47,520 Speaker 1: and these are... Korea. North Korea would be another big one. Yeah. Yeah. 524 00:28:47,560 --> 00:28:50,800 Speaker 1: So it might be a team 525 00:28:50,800 --> 00:28:54,400 Speaker 1: of hackers that are kind of grouped together in an 526 00:28:54,640 --> 00:28:58,560 Speaker 1: illegitimate company who are hired by a government, or, like 527 00:28:58,640 --> 00:29:02,920 Speaker 1: you say, where they may not necessarily have any affiliation, 528 00:29:03,120 --> 00:29:07,160 Speaker 1: quote unquote, with the government, but the government ends up 529 00:29:07,160 --> 00:29:10,320 Speaker 1: paying them in some way, shape or form for their 530 00:29:10,400 --> 00:29:13,640 Speaker 1: infiltration, because it ends up helping the government in some 531 00:29:13,680 --> 00:29:17,320 Speaker 1: way or another. And so it's a very sticky 532 00:29:17,360 --> 00:29:20,240 Speaker 1: scenario when you start dealing with these state sponsored hackers, 533 00:29:20,280 --> 00:29:24,160 Speaker 1: because it's hard to understand, um, how are we 534 00:29:24,200 --> 00:29:27,360 Speaker 1: going to, you know, penalize them?
Who do we penalize? 535 00:29:27,360 --> 00:29:30,080 Speaker 1: Do we penalize the government or the hackers themselves? Or both? 536 00:29:30,320 --> 00:29:33,200 Speaker 1: Like, who was actually involved? It might end up being, 537 00:29:33,320 --> 00:29:36,320 Speaker 1: how do we address the underlying situation that led to 538 00:29:36,400 --> 00:29:40,680 Speaker 1: the employment of hackers in the first place? Which 539 00:29:40,800 --> 00:29:45,720 Speaker 1: can get pretty delicate. Another great example from 540 00:29:45,760 --> 00:29:48,600 Speaker 1: not too long ago, or at least one that may 541 00:29:48,720 --> 00:29:51,160 Speaker 1: or may not have 542 00:29:51,200 --> 00:29:53,600 Speaker 1: involved a state sponsored hacker, and I'm still somewhat 543 00:29:54,600 --> 00:29:58,120 Speaker 1: skeptical of that, would be the Sony hack. Oh yeah, 544 00:29:58,320 --> 00:30:01,520 Speaker 1: because with the Sony hack, the US government essentially was pointing 545 00:30:01,520 --> 00:30:04,240 Speaker 1: fingers at North Korea, saying the hackers must have come 546 00:30:04,280 --> 00:30:07,200 Speaker 1: from North Korea. Look at this IP address, which we 547 00:30:07,240 --> 00:30:09,840 Speaker 1: don't even need to go into detail on right now, except 548 00:30:09,840 --> 00:30:12,520 Speaker 1: to say that an IP address does not proof make. 549 00:30:12,800 --> 00:30:16,200 Speaker 1: But at any rate, they're pointing over at North 550 00:30:16,280 --> 00:30:18,760 Speaker 1: Korea saying, we think the attacks came from there. The 551 00:30:19,000 --> 00:30:22,080 Speaker 1: attack appears to be politically motivated. North Korea, for its part, 552 00:30:22,120 --> 00:30:24,720 Speaker 1: the government, which, by the way, North Korea is not shy 553 00:30:24,840 --> 00:30:27,920 Speaker 1: about taking credit for stuff, they said, no, no, 554 00:30:28,040 --> 00:30:30,240 Speaker 1: we didn't.
We didn't ask for this, but we're 555 00:30:30,280 --> 00:30:34,800 Speaker 1: totally cool with it happening. So, you know, it's 556 00:30:34,800 --> 00:30:37,000 Speaker 1: one of those... It's also very muddy, because obviously when 557 00:30:37,040 --> 00:30:39,600 Speaker 1: you're talking about things like espionage or sabotage or any 558 00:30:39,640 --> 00:30:43,400 Speaker 1: of those things, uh, you don't come out 559 00:30:43,440 --> 00:30:45,600 Speaker 1: and talk more about it. That ends up 560 00:30:45,640 --> 00:30:48,360 Speaker 1: being closed away. In fact, I should 561 00:30:48,360 --> 00:30:50,240 Speaker 1: really throw that over to the Stuff They Don't Want 562 00:30:50,280 --> 00:30:52,240 Speaker 1: You To Know guys and have them do an episode 563 00:30:52,280 --> 00:30:54,960 Speaker 1: on it, because it would be a lot of fun. 564 00:30:55,040 --> 00:30:59,680 Speaker 1: And then we've got the traditional, at least 565 00:30:59,680 --> 00:31:02,200 Speaker 1: I would argue the traditional, concept of a hacker from 566 00:31:02,200 --> 00:31:06,120 Speaker 1: the Hollywood perspective. The black hats, the ones that are 567 00:31:06,160 --> 00:31:08,280 Speaker 1: wearing the hoodies, and they're sitting at a keyboard, and 568 00:31:08,280 --> 00:31:13,000 Speaker 1: they're typing really fast on a green and black screen. Yes, 569 00:31:13,120 --> 00:31:18,720 Speaker 1: they've got some junk food snacks, and they have 570 00:31:18,760 --> 00:31:21,320 Speaker 1: a ton of different windows popping up on their computer 571 00:31:21,400 --> 00:31:24,240 Speaker 1: really, really fast. You can't make out anything that's happening. 572 00:31:24,440 --> 00:31:27,800 Speaker 1: It's entirely not true. That's not how it works.
It's 573 00:31:27,840 --> 00:31:32,200 Speaker 1: actually a somewhat slow process to, um, basically 574 00:31:32,280 --> 00:31:35,600 Speaker 1: get reconnaissance and to get into any kind of network. 575 00:31:35,640 --> 00:31:38,400 Speaker 1: The only things I've done, of course, are completely legal. 576 00:31:38,880 --> 00:31:42,160 Speaker 1: I've had authorization from everybody who I have tested 577 00:31:42,240 --> 00:31:47,560 Speaker 1: my abilities on. Right. Yeah, so black hats, that's 578 00:31:47,560 --> 00:31:51,760 Speaker 1: another awkward definition, because it's not one that I 579 00:31:51,880 --> 00:31:54,240 Speaker 1: like to use all the time, because black hat hacker... 580 00:31:54,360 --> 00:31:58,200 Speaker 1: it makes hackers have more of a 581 00:31:58,280 --> 00:32:01,320 Speaker 1: negative appeal to a lot of people. So I always 582 00:32:01,360 --> 00:32:04,160 Speaker 1: just call them black hat thieves. Yeah. No, that's a 583 00:32:04,160 --> 00:32:07,120 Speaker 1: great way of putting it, because, uh, typically you'll see 584 00:32:07,160 --> 00:32:10,440 Speaker 1: things like, um, the idea of infiltrating a system 585 00:32:10,480 --> 00:32:12,880 Speaker 1: in order to steal information, perhaps to sell it to 586 00:32:12,920 --> 00:32:16,080 Speaker 1: someone else, or to hold it against the party that 587 00:32:16,120 --> 00:32:19,280 Speaker 1: you've stolen it from. You know, so it might 588 00:32:19,280 --> 00:32:24,720 Speaker 1: be extortion as opposed to stealing and selling. Also, 589 00:32:24,800 --> 00:32:27,080 Speaker 1: we should go ahead and point out something else that 590 00:32:27,120 --> 00:32:30,280 Speaker 1: I'll talk about in a future episode, but I've mentioned 591 00:32:30,280 --> 00:32:33,640 Speaker 1: it in previous ones too.
Hackers don't necessarily just 592 00:32:33,680 --> 00:32:36,800 Speaker 1: sit at a keyboard and type in strings of letters 593 00:32:36,800 --> 00:32:39,160 Speaker 1: and numbers. They also do a lot of social engineering, 594 00:32:39,240 --> 00:32:41,920 Speaker 1: or they can do a lot of social engineering, 595 00:32:41,960 --> 00:32:45,040 Speaker 1: where they attempt to gain access to systems, either by 596 00:32:45,120 --> 00:32:49,200 Speaker 1: physically gaining access to a system, which makes it way 597 00:32:49,200 --> 00:32:53,400 Speaker 1: easier than remotely doing it, or, even easier than 598 00:32:53,400 --> 00:32:55,960 Speaker 1: that, manipulating someone who does have access to a system, 599 00:32:56,000 --> 00:32:58,200 Speaker 1: and then you get it that way. And it's 600 00:32:58,240 --> 00:33:03,320 Speaker 1: surprisingly easy to do if employees have not been educated 601 00:33:03,520 --> 00:33:06,600 Speaker 1: on how to spot that and avoid it. Yeah, properly 602 00:33:06,640 --> 00:33:09,600 Speaker 1: training your employees at your place of work 603 00:33:09,720 --> 00:33:12,440 Speaker 1: is really important when it comes to social engineering. And 604 00:33:12,920 --> 00:33:16,720 Speaker 1: it is incredibly easy to do social engineering, especially when 605 00:33:16,760 --> 00:33:20,200 Speaker 1: you're a female. I would imagine. So it turns out 606 00:33:20,240 --> 00:33:23,960 Speaker 1: also, if you are dressed as the stereotypical IT 607 00:33:24,120 --> 00:33:28,480 Speaker 1: guy and you are there to quote unquote upgrade someone's machine, 608 00:33:29,240 --> 00:33:32,040 Speaker 1: it's really easy to get access to that machine. People are 609 00:33:32,440 --> 00:33:38,040 Speaker 1: so eager.
Yeah, and obviously social engineering completely depends 610 00:33:38,120 --> 00:33:43,640 Speaker 1: upon identifying and then exploiting a person's vulnerability. And typically 611 00:33:43,680 --> 00:33:47,840 Speaker 1: speaking, like greed, lust, those are two big ones that 612 00:33:48,400 --> 00:33:51,680 Speaker 1: are exploitable, and the people who are really good 613 00:33:51,680 --> 00:33:54,320 Speaker 1: at social engineering know that, and they're very good at 614 00:33:54,320 --> 00:33:58,720 Speaker 1: leveraging that. Just as knowing what sort of vulnerabilities 615 00:33:58,760 --> 00:34:02,600 Speaker 1: typically show up within code, within programs, you need 616 00:34:02,640 --> 00:34:06,520 Speaker 1: to know what vulnerabilities show up in people. And 617 00:34:06,760 --> 00:34:08,399 Speaker 1: I also had a little thing on here about 618 00:34:08,480 --> 00:34:11,120 Speaker 1: botnet masters. Really, with this I was thinking 619 00:34:11,120 --> 00:34:13,120 Speaker 1: about the people who are using malware to get that 620 00:34:13,160 --> 00:34:15,839 Speaker 1: back door access to machines, to get 621 00:34:15,840 --> 00:34:20,640 Speaker 1: that administrative control over a wide array. Sometimes we call 622 00:34:20,680 --> 00:34:22,560 Speaker 1: it a botnet. Sometimes we call it a zombie 623 00:34:22,640 --> 00:34:26,799 Speaker 1: army of user computers, and then utilizing that to 624 00:34:26,880 --> 00:34:32,120 Speaker 1: do stuff like distributed denial of service attacks, or 625 00:34:32,200 --> 00:34:36,600 Speaker 1: DDoS attacks, where you are directing an army, 626 00:34:36,880 --> 00:34:42,279 Speaker 1: essentially, to coordinate an attack against an identified target. Sometimes 627 00:34:42,320 --> 00:34:45,480 Speaker 1: this is done just to cause problems.
I mean, obviously, 628 00:34:45,520 --> 00:34:50,320 Speaker 1: if you've ever had issues logging into, like, a gaming network, 629 00:34:50,760 --> 00:34:53,319 Speaker 1: Xbox Live has had this happen, PlayStation has had this 630 00:34:53,400 --> 00:34:57,719 Speaker 1: happen, where people who are disenchanted with the service for 631 00:34:57,760 --> 00:35:00,440 Speaker 1: one reason or another, or they just want to do 632 00:35:00,480 --> 00:35:04,719 Speaker 1: it for the lulz. Uh, specifically around holiday times. That's 633 00:35:04,760 --> 00:35:07,879 Speaker 1: a big target time to attack something 634 00:35:07,920 --> 00:35:10,880 Speaker 1: like Xbox Live. They'll direct a ton of traffic to 635 00:35:11,120 --> 00:35:14,720 Speaker 1: break down servers, so servers can't respond to legitimate traffic, 636 00:35:14,719 --> 00:35:18,200 Speaker 1: because they're too busy responding to a bunch of fake traffic. Essentially, 637 00:35:18,920 --> 00:35:22,360 Speaker 1: I'm oversimplifying, but this is a basic DDoS attack. It is. 638 00:35:22,400 --> 00:35:24,359 Speaker 1: It's such a mean thing to do to those little 639 00:35:24,400 --> 00:35:27,360 Speaker 1: kids during Christmas time, to turn off their Xboxes so that 640 00:35:27,400 --> 00:35:29,440 Speaker 1: they can't log in and they can't play their games, 641 00:35:29,480 --> 00:35:32,799 Speaker 1: so they just... Yeah, yeah, it breaks 642 00:35:32,920 --> 00:35:35,440 Speaker 1: my heart. Gosh, it's a jerk move. It's a 643 00:35:35,520 --> 00:35:39,239 Speaker 1: jerk move. Don't do it. I love the term, or 644 00:35:39,280 --> 00:35:43,439 Speaker 1: I love the term zombie for botnets, because that's 645 00:35:43,440 --> 00:35:45,799 Speaker 1: exactly what it is, where you have 646 00:35:45,880 --> 00:35:48,640 Speaker 1: a patient zero, and that would be the 647 00:35:48,680 --> 00:35:52,600 Speaker 1: first computer.
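Jonathan's "too busy responding to fake traffic" oversimplification can be made concrete with a toy queue model, my own sketch, not anything from the episode: a server can answer only so many requests per tick, so once bot traffic crowds the queue, legitimate requests get dropped.

```python
import random

def fraction_served(capacity: int, legit: int, bots: int, seed: int = 1) -> float:
    """Toy model of a flood attack: the server answers the first `capacity`
    requests in the queue each tick and drops the rest. Returns the fraction
    of legitimate requests that actually get served."""
    rng = random.Random(seed)                      # fixed seed for repeatability
    queue = ["legit"] * legit + ["bot"] * bots
    rng.shuffle(queue)                             # requests arrive interleaved
    served = sum(1 for req in queue[:capacity] if req == "legit")
    return served / legit

quiet_day = fraction_served(capacity=1000, legit=100, bots=0)
under_attack = fraction_served(capacity=1000, legit=100, bots=99900)
```

With no attack every legitimate request fits within capacity; under the flood, legitimate requests make up 0.1% of the queue, so almost all of them are crowded out even though the server is working flat out.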
They end up biting a few more computers, 648 00:35:52,680 --> 00:35:55,000 Speaker 1: and those ones end up getting infected with the same 649 00:35:55,080 --> 00:35:58,640 Speaker 1: exact infection that patient zero had, and then those ones 650 00:35:58,840 --> 00:36:01,520 Speaker 1: end up biting ten each, so you end up with 651 00:36:01,560 --> 00:36:04,520 Speaker 1: thousands upon thousands of these computers that each have the 652 00:36:04,600 --> 00:36:08,360 Speaker 1: same exact infection, and they all end up exploiting the 653 00:36:08,480 --> 00:36:13,000 Speaker 1: same exact vulnerability on whatever their target might be. Yeah, 654 00:36:13,120 --> 00:36:16,719 Speaker 1: and then ultimately you end up with a situation where 655 00:36:16,800 --> 00:36:19,000 Speaker 1: Negan is standing there with a baseball bat and you 656 00:36:19,000 --> 00:36:23,279 Speaker 1: don't know whose head he's gonna cave in. I might 657 00:36:23,280 --> 00:36:26,960 Speaker 1: have taken that metaphor a little too far. But one 658 00:36:26,960 --> 00:36:30,120 Speaker 1: of the things that botnet controllers might do, and 659 00:36:30,120 --> 00:36:32,560 Speaker 1: in fact, this has happened on multiple occasions, it's similar 660 00:36:32,600 --> 00:36:36,160 Speaker 1: to ransomware, is they'll send a message to an identified 661 00:36:36,160 --> 00:36:40,560 Speaker 1: target and say, hey, we got your number. We're 662 00:36:40,600 --> 00:36:42,920 Speaker 1: gonna come after you unless you pay us a certain 663 00:36:42,960 --> 00:36:46,799 Speaker 1: amount of money. We will unleash the dogs of 664 00:36:46,880 --> 00:36:51,000 Speaker 1: war on your servers and you will be unable to 665 00:36:51,080 --> 00:36:54,120 Speaker 1: do business.
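Shannon's "each one bites ten more" propagation is just compounding growth, and a few lines make the scale concrete. This is a minimal sketch of the arithmetic she describes, not a model of any real botnet.

```python
def zombie_spread(spread_factor: int, rounds: int) -> list[int]:
    """Model the 'patient zero' propagation: each newly infected machine
    goes on to 'bite' spread_factor more machines in the next round.
    Returns the running total of infected machines after each round."""
    total = 1          # patient zero
    newly = 1
    history = [total]
    for _ in range(rounds):
        newly *= spread_factor       # this round's fresh infections
        total += newly
        history.append(total)
    return history

growth = zombie_spread(spread_factor=10, rounds=4)
# four rounds of "biting ten each" already passes ten thousand machines
```

That exponential curve is why a single exploit kit payload can assemble the "thousands upon thousands" of identical zombies needed for a coordinated flood.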
And there have been cases where businesses have 666 00:36:54,360 --> 00:36:56,880 Speaker 1: folded under this kind of pressure, where they have in 667 00:36:56,920 --> 00:37:00,799 Speaker 1: fact paid up, like the hospital that ended up 668 00:37:00,800 --> 00:37:04,640 Speaker 1: doing that. Yes, yes it did. Yeah, I've seen a 669 00:37:04,680 --> 00:37:12,680 Speaker 1: few cases of particularly malicious and odious acts against things 670 00:37:12,719 --> 00:37:15,640 Speaker 1: like hospitals. There was one year when I was participating 671 00:37:16,480 --> 00:37:20,360 Speaker 1: in a charity for children's hospitals, and the charity was 672 00:37:20,640 --> 00:37:25,520 Speaker 1: targeted in the middle of the event, and for about 673 00:37:25,600 --> 00:37:30,040 Speaker 1: three hours they were offline trying to deal with that. Yeah, 674 00:37:30,239 --> 00:37:32,520 Speaker 1: and in that case, it wasn't 675 00:37:32,520 --> 00:37:35,160 Speaker 1: an attack in an effort to get money. I don't 676 00:37:35,200 --> 00:37:37,560 Speaker 1: think. I think it was just someone being truly an 677 00:37:37,600 --> 00:37:41,280 Speaker 1: awful human being. But we have seen cases of people 678 00:37:41,280 --> 00:37:43,879 Speaker 1: trying to do this in order to extort money. So 679 00:37:44,080 --> 00:37:49,000 Speaker 1: you're probably noticing some trends here: extortion, stealing, you know, 680 00:37:49,040 --> 00:37:53,319 Speaker 1: holding things for ransom, this idea of making sure that 681 00:37:53,320 --> 00:37:57,400 Speaker 1: people are spending money out of fear or 682 00:37:57,440 --> 00:38:00,120 Speaker 1: out of a need to get back and have 683 00:38:00,320 --> 00:38:05,120 Speaker 1: access to something that belongs to them.
These are all terrible, 684 00:38:05,200 --> 00:38:09,520 Speaker 1: terrible motivations to make money, and given such 685 00:38:09,600 --> 00:38:12,080 Speaker 1: terrible motivations, you might think, well, wait a minute, how 686 00:38:12,120 --> 00:38:14,319 Speaker 1: are they actually, like, how are they getting paid? How 687 00:38:14,400 --> 00:38:19,120 Speaker 1: is this money transfer happening? Because you would think anything 688 00:38:19,239 --> 00:38:23,879 Speaker 1: that would be traceable would end up being somewhat problematic. 689 00:38:23,920 --> 00:38:25,960 Speaker 1: You've got a trail that leads back to you as 690 00:38:26,000 --> 00:38:29,880 Speaker 1: a person, then pretty soon law enforcement's going to get involved, 691 00:38:30,120 --> 00:38:33,640 Speaker 1: or at least the IRS. So, 692 00:38:33,880 --> 00:38:38,400 Speaker 1: Shannon, how do hackers get the money? So 693 00:38:38,480 --> 00:38:41,120 Speaker 1: there's probably some ways that I don't even know about yet, 694 00:38:41,480 --> 00:38:44,000 Speaker 1: but the ones that I can think of would be 695 00:38:44,520 --> 00:38:47,439 Speaker 1: trading of high-value data. So that's a pretty big 696 00:38:47,440 --> 00:38:50,400 Speaker 1: one, where say a hacker collects a whole bunch of 697 00:38:50,520 --> 00:38:53,720 Speaker 1: really, really high-value data, like your Social Security number, 698 00:38:53,880 --> 00:38:57,960 Speaker 1: your credit card accounts, your banking account, tons of information, 699 00:38:58,400 --> 00:39:00,520 Speaker 1: and they decide to go onto a deep web forum, 700 00:39:00,640 --> 00:39:03,839 Speaker 1: sell it, or trade it for something else 701 00:39:03,840 --> 00:39:07,319 Speaker 1: of high value, for example, a gift card.
They could 702 00:39:07,400 --> 00:39:09,760 Speaker 1: ask for people to give them a ton of gift 703 00:39:09,760 --> 00:39:12,000 Speaker 1: cards that are, like, you know, twenty five or fifty 704 00:39:12,040 --> 00:39:15,799 Speaker 1: dollars each, and then use those gift cards at a 705 00:39:15,880 --> 00:39:19,600 Speaker 1: retailer who is easily vulnerable to some kind of gift 706 00:39:19,640 --> 00:39:22,200 Speaker 1: card scam, and in that sense they would be able 707 00:39:22,239 --> 00:39:24,439 Speaker 1: to make some kind of money back through those gift 708 00:39:24,480 --> 00:39:27,600 Speaker 1: cards and that trade of that high-value data 709 00:39:27,680 --> 00:39:31,800 Speaker 1: that they stole from whoever it might be, whatever company. 710 00:39:31,840 --> 00:39:34,960 Speaker 1: Another way would be bitcoins. Now that's probably the most 711 00:39:34,960 --> 00:39:38,000 Speaker 1: obvious one, of course, because bitcoins are very, very hard 712 00:39:38,040 --> 00:39:41,600 Speaker 1: to track. Yes, they are traceable in some circumstances, depending 713 00:39:41,600 --> 00:39:44,160 Speaker 1: on what kind of wallet you use, but in a 714 00:39:44,200 --> 00:39:48,200 Speaker 1: lot of circumstances, the bitcoins will trade wallets so many 715 00:39:48,239 --> 00:39:50,919 Speaker 1: times that it will be nearly impossible to find out 716 00:39:50,920 --> 00:39:53,799 Speaker 1: where it actually came from, where it actually started. Yeah, 717 00:39:53,800 --> 00:39:57,080 Speaker 1: it's kind of interesting, because every single bitcoin carries with 718 00:39:57,160 --> 00:39:59,600 Speaker 1: it a record of every transaction. But that does not 719 00:39:59,760 --> 00:40:04,120 Speaker 1: mean that the parties involved are actually identifiable. It really 720 00:40:04,320 --> 00:40:07,320 Speaker 1: is.
It's actually data that's used in order 721 00:40:07,360 --> 00:40:10,319 Speaker 1: to allow for the mining of further bitcoins. It's a 722 00:40:10,360 --> 00:40:13,680 Speaker 1: really fascinating process. But one of the things that 723 00:40:13,719 --> 00:40:16,720 Speaker 1: attracts people to bitcoins is this idea of being able 724 00:40:16,760 --> 00:40:22,280 Speaker 1: to spend them anonymously and be able to purchase things, 725 00:40:22,560 --> 00:40:26,160 Speaker 1: whether legal or illegal, without it being traced back to 726 00:40:26,200 --> 00:40:29,000 Speaker 1: that person. You often will hear about things like, you know, 727 00:40:29,080 --> 00:40:32,480 Speaker 1: the old Silk Road, where you could purchase all sorts 728 00:40:32,520 --> 00:40:37,200 Speaker 1: of stuff, including illegal drugs or other materials, sometimes weapons, 729 00:40:37,680 --> 00:40:39,759 Speaker 1: that kind of stuff, and you could do it 730 00:40:39,800 --> 00:40:43,160 Speaker 1: through bitcoins, and people felt a high level of confidence 731 00:40:43,200 --> 00:40:47,440 Speaker 1: because it was not a state-backed currency. It was 732 00:40:47,560 --> 00:40:52,480 Speaker 1: this independent cryptocurrency that allowed them that freedom and 733 00:40:52,600 --> 00:40:56,799 Speaker 1: had real value because people want the bitcoins. If no 734 00:40:56,840 --> 00:41:00,560 Speaker 1: one wanted the bitcoins, they wouldn't be worth anything. Right, 735 00:41:00,680 --> 00:41:04,120 Speaker 1: and bitcoins have actually been pretty steady, last time I checked, 736 00:41:04,200 --> 00:41:09,080 Speaker 1: so their value has been pretty decent 737 00:41:09,200 --> 00:41:12,840 Speaker 1: in recent days. So I completely understand why a hacker 738 00:41:12,880 --> 00:41:16,760 Speaker 1: would want to be paid in bitcoins. It makes sense. Yeah.
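The public-but-pseudonymous point about the bitcoin ledger can be sketched in a few lines. The keys and addresses below are invented for illustration and don't reflect Bitcoin's actual data structures; the shape of the idea is what matters:

```python
import hashlib

# Every payment is recorded where anyone can see it, but the record
# names hashed addresses rather than people.
def address(pubkey):
    """Derive a pseudonymous address from a (made-up) public key string."""
    return hashlib.sha256(pubkey.encode()).hexdigest()[:16]

ledger = []  # the public transaction record

def pay(sender_key, receiver_key, amount):
    ledger.append((address(sender_key), address(receiver_key), amount))

pay("alice-key", "bob-key", 0.5)
pay("bob-key", "carol-key", 0.5)  # coins hop wallets, obscuring the origin

for tx in ledger:
    print(tx)  # the flow of funds is visible; the owners are not
```

Tying an address back to a person takes outside information, which is exactly the gap wallet-hopping is meant to widen.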
Yeah, 739 00:41:16,880 --> 00:41:19,840 Speaker 1: there's also the old deal of putting the 740 00:41:19,880 --> 00:41:22,640 Speaker 1: money into the washing machine, right? That's how money 741 00:41:22,680 --> 00:41:26,280 Speaker 1: laundering works, right? Yes, money laundering. So that was something 742 00:41:26,280 --> 00:41:28,040 Speaker 1: that I learned about way back in the day when 743 00:41:28,080 --> 00:41:30,399 Speaker 1: I worked at a bank, of all places, which also 744 00:41:30,440 --> 00:41:33,600 Speaker 1: got me really interested in security before I started podcasting. 745 00:41:34,040 --> 00:41:37,160 Speaker 1: But money laundering, it's very easy for somebody to go online, 746 00:41:37,360 --> 00:41:40,439 Speaker 1: be able to sell this high-value data, get some 747 00:41:40,680 --> 00:41:44,759 Speaker 1: bitcoins, or it might be some other form of currency, 748 00:41:45,160 --> 00:41:48,920 Speaker 1: and then be able to resell that money or be 749 00:41:48,960 --> 00:41:51,759 Speaker 1: able to trade a product to get real money, real 750 00:41:51,800 --> 00:41:54,759 Speaker 1: cash, at one point or another. But basically it's 751 00:41:54,880 --> 00:41:59,879 Speaker 1: exchanging the hands that hold that money so many 752 00:42:00,000 --> 00:42:03,239 Speaker 1: times that, again, it's very hard to trace. Yeah, and 753 00:42:03,280 --> 00:42:06,560 Speaker 1: it's hard to determine that the original source of 754 00:42:06,600 --> 00:42:11,560 Speaker 1: that money was anything remotely illegal. And then again, 755 00:42:11,600 --> 00:42:14,120 Speaker 1: if you're a state-sponsored hacker, you're probably 756 00:42:14,160 --> 00:42:17,880 Speaker 1: just drawing a salary or doing contract work, so you're 757 00:42:17,920 --> 00:42:23,000 Speaker 1: actually getting paid. You get a paycheck. Yeah.
Yeah, 758 00:42:24,160 --> 00:42:27,360 Speaker 1: so you've got money withdrawn from your paycheck 759 00:42:27,480 --> 00:42:31,080 Speaker 1: to support the government while you are subverting other governments, 760 00:42:31,360 --> 00:42:33,839 Speaker 1: and then it looks completely legitimate. So that's a really 761 00:42:33,840 --> 00:42:38,400 Speaker 1: easy way for somebody to do something that might be very, 762 00:42:38,520 --> 00:42:42,680 Speaker 1: very bad. Yeah, because they do have to 763 00:42:42,760 --> 00:42:44,720 Speaker 1: pay the IRS, they do get a tax 764 00:42:44,760 --> 00:42:47,200 Speaker 1: refund every year, they do have an employer, so it 765 00:42:47,239 --> 00:42:51,080 Speaker 1: looks completely normal for them to be receiving a paycheck 766 00:42:51,160 --> 00:42:54,880 Speaker 1: for whatever work this might be. Yeah, we've got a 767 00:42:54,920 --> 00:42:57,880 Speaker 1: little bit more to say about how hackers make money, 768 00:42:57,920 --> 00:43:00,840 Speaker 1: but first let's take another quick break to thank our sponsor. 769 00:43:08,480 --> 00:43:13,680 Speaker 1: So the nice thing is there aren't just quote unquote 770 00:43:13,719 --> 00:43:16,680 Speaker 1: bad guys out there doing all this kind of 771 00:43:16,719 --> 00:43:21,520 Speaker 1: work with computers, with hacking, with discovering vulnerabilities. There 772 00:43:21,520 --> 00:43:24,240 Speaker 1: are plenty of people, as you mentioned earlier, Shannon, 773 00:43:24,280 --> 00:43:27,680 Speaker 1: who are doing this in order to help others, either 774 00:43:27,840 --> 00:43:32,160 Speaker 1: to make systems more secure or to inform people of 775 00:43:32,239 --> 00:43:35,000 Speaker 1: how these kinds of attacks happen so that they can 776 00:43:35,040 --> 00:43:38,280 Speaker 1: be better prepared to defend themselves. So let's talk about 777 00:43:38,320 --> 00:43:41,120 Speaker 1: some of them.
Of course, if you have black 778 00:43:41,120 --> 00:43:44,600 Speaker 1: hat hackers, right, you've got the bad guys, you've gotta have 779 00:43:44,960 --> 00:43:49,799 Speaker 1: the white hat hackers. These 780 00:43:49,800 --> 00:43:55,840 Speaker 1: are the noble bounty hunter characters of those westerns, 781 00:43:55,880 --> 00:43:59,640 Speaker 1: the ones who, you know, they've seen things, but deep 782 00:43:59,680 --> 00:44:03,160 Speaker 1: down they have a heart of gold. Well, not 783 00:44:03,200 --> 00:44:06,000 Speaker 1: all of them, but a lot of, a lot of 784 00:44:06,000 --> 00:44:10,280 Speaker 1: my friends are considered white hat hackers. They're the people 785 00:44:10,320 --> 00:44:15,040 Speaker 1: who either work for a company that specializes in security. 786 00:44:15,200 --> 00:44:17,880 Speaker 1: So a lot of my friends work for these companies 787 00:44:17,880 --> 00:44:21,879 Speaker 1: who will be contracted with big brands, go into their 788 00:44:21,920 --> 00:44:24,880 Speaker 1: networks, and then find out what the vulnerabilities are and 789 00:44:24,960 --> 00:44:27,160 Speaker 1: fix them, or they will give them a report and 790 00:44:27,160 --> 00:44:29,200 Speaker 1: tell them how to fix it in the future. 791 00:44:29,760 --> 00:44:31,879 Speaker 1: They make a lot of money. A lot of them 792 00:44:31,920 --> 00:44:36,600 Speaker 1: don't like it because they have specific numbers of vulnerabilities 793 00:44:36,719 --> 00:44:39,160 Speaker 1: or specific time frames set that they have to get 794 00:44:39,200 --> 00:44:41,839 Speaker 1: this work done in, and a lot of times hacking takes 795 00:44:41,880 --> 00:44:45,960 Speaker 1: a lot of time. It takes a lot of information reconnaissance.
796 00:44:46,440 --> 00:44:49,640 Speaker 1: So a lot of my friends don't necessarily appreciate having 797 00:44:49,640 --> 00:44:52,680 Speaker 1: to be under these time constraints with these big brands. Well, 798 00:44:52,719 --> 00:44:55,759 Speaker 1: particularly since you figure the bad guys aren't under any 799 00:44:55,800 --> 00:44:59,560 Speaker 1: particular time constraints. Exactly. So the bad guys have tons 800 00:44:59,560 --> 00:45:02,479 Speaker 1: of time to find these vulnerabilities, while the white hats 801 00:45:02,480 --> 00:45:04,879 Speaker 1: are under the stress of these time constraints to get 802 00:45:04,880 --> 00:45:06,800 Speaker 1: the work done so that they make their bosses happy. 803 00:45:07,480 --> 00:45:10,000 Speaker 1: In this sense, a lot of 804 00:45:10,000 --> 00:45:12,640 Speaker 1: people that I know have created their own security companies 805 00:45:12,680 --> 00:45:16,640 Speaker 1: because of this flaw in the generic nature of 806 00:45:16,920 --> 00:45:19,680 Speaker 1: these security companies. So they said, you know, I'm tired 807 00:45:19,719 --> 00:45:22,680 Speaker 1: of having to deal with these constraints that my boss 808 00:45:22,719 --> 00:45:25,640 Speaker 1: has given me. I'm just gonna open my own security company, 809 00:45:25,640 --> 00:45:27,360 Speaker 1: and we're going to do it even better, because we 810 00:45:27,360 --> 00:45:30,200 Speaker 1: won't give ourselves those time constraints. We'll give ourselves 811 00:45:30,200 --> 00:45:33,719 Speaker 1: several months to find all the vulnerabilities that we absolutely 812 00:45:33,760 --> 00:45:36,279 Speaker 1: can, and then we'll write a report and we'll fix it. 813 00:45:36,640 --> 00:45:39,400 Speaker 1: And those are the ones that I would definitely 814 00:45:39,440 --> 00:45:42,800 Speaker 1: work with if I had to hire a security company.
Yeah, 815 00:45:42,840 --> 00:45:45,000 Speaker 1: because they're the ones who are going to use the 816 00:45:45,040 --> 00:45:48,799 Speaker 1: exact same kind of methodologies that bad guys are going 817 00:45:48,880 --> 00:45:52,280 Speaker 1: to use. And if you want to really be secure, 818 00:45:52,719 --> 00:45:55,960 Speaker 1: you want those people to throw everything they can at 819 00:45:56,000 --> 00:45:58,560 Speaker 1: your system, so that you can find out: are you 820 00:45:58,600 --> 00:46:00,960 Speaker 1: actually secure? If you're not, what do you need to do 821 00:46:01,000 --> 00:46:03,560 Speaker 1: to address it? If you want to see a 822 00:46:03,600 --> 00:46:08,000 Speaker 1: movie that does a very fantastical version of this 823 00:46:08,160 --> 00:46:12,200 Speaker 1: very idea, there's a film that I always think back to: 824 00:46:12,440 --> 00:46:17,080 Speaker 1: Sneakers. It had Robert Redford and Dan Aykroyd, who plays a 825 00:46:17,160 --> 00:46:21,800 Speaker 1: character named Mother. Ben Kingsley is in it, a 826 00:46:21,920 --> 00:46:25,520 Speaker 1: ton of folks. River Phoenix was in it. And 827 00:46:25,600 --> 00:46:29,160 Speaker 1: it's a movie about a group of kind 828 00:46:29,200 --> 00:46:33,200 Speaker 1: of almost like outcasts who have grouped together to form 829 00:46:33,280 --> 00:46:35,880 Speaker 1: a company that does specifically this: they try to 830 00:46:35,960 --> 00:46:40,480 Speaker 1: infiltrate a company in order to test its security, not 831 00:46:40,880 --> 00:46:44,960 Speaker 1: to exploit it, but rather to tell the company, hey, 832 00:46:45,440 --> 00:46:47,600 Speaker 1: here's how we got in, here's how someone else could 833 00:46:47,640 --> 00:46:50,120 Speaker 1: get in, so you need to plug this vulnerability, that 834 00:46:50,200 --> 00:46:52,479 Speaker 1: kind of thing.
And then of course they get 835 00:46:52,560 --> 00:46:55,360 Speaker 1: involved in all sorts of shenanigans. And in case you 836 00:46:55,400 --> 00:46:58,399 Speaker 1: are interested in the methodology, I actually find it very, 837 00:46:58,520 --> 00:47:02,279 Speaker 1: very interesting how they get their work done, because of 838 00:47:02,320 --> 00:47:05,680 Speaker 1: course they have to go through the tennis match of 839 00:47:05,719 --> 00:47:08,800 Speaker 1: back and forth with a brand-name company, whatever it 840 00:47:08,880 --> 00:47:11,439 Speaker 1: might be. So they'll have to get a purchase order, 841 00:47:11,560 --> 00:47:14,440 Speaker 1: they'll do a little bit of negotiation for an amount 842 00:47:14,480 --> 00:47:16,600 Speaker 1: that they'll do the work for, and then they'll go 843 00:47:16,680 --> 00:47:20,080 Speaker 1: in and they'll gather information on the network, and they'll 844 00:47:20,160 --> 00:47:22,560 Speaker 1: capture traffic, and they'll try to find any kind of 845 00:47:22,600 --> 00:47:26,359 Speaker 1: vulnerabilities that are on that network, even with the people, too. 846 00:47:26,920 --> 00:47:29,720 Speaker 1: For example, they could use social engineering to get into 847 00:47:30,040 --> 00:47:34,120 Speaker 1: the server rack physically, or they could get into 848 00:47:34,320 --> 00:47:37,440 Speaker 1: a network that doesn't necessarily have a very good password 849 00:47:37,480 --> 00:47:41,279 Speaker 1: on it. They could email people who work there, 850 00:47:41,320 --> 00:47:44,640 Speaker 1: who are employed at the brand-name company, with, I 851 00:47:44,680 --> 00:47:48,279 Speaker 1: don't know, malware-ridden PDFs, for example, and they could 852 00:47:48,360 --> 00:47:50,920 Speaker 1: use wireless attacks. They could do war driving from the 853 00:47:50,960 --> 00:47:53,560 Speaker 1: parking lot if they wanted to.
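One of the weaknesses mentioned here, a network protected by a poor password, is cheap to demonstrate. A minimal sketch of the kind of dictionary attack a tester might run; the stored hash and the wordlist are invented for the example:

```python
import hashlib

def sha256(text):
    """Hex digest of a password string (toy scheme, no salting)."""
    return hashlib.sha256(text.encode()).hexdigest()

# Suppose the audit turns up a stored password hash on the network.
stored_hash = sha256("letmein")  # a weak password, for illustration

# The tester tries a short list of common passwords against it.
wordlist = ["password", "123456", "qwerty", "letmein", "admin"]

def dictionary_attack(target_hash, candidates):
    for word in candidates:
        if sha256(word) == target_hash:
            return word
    return None

print(dictionary_attack(stored_hash, wordlist))  # letmein
```

A finding like "this password fell to a five-word list" is exactly the kind of thing that ends up in the report the company gets back.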
And then what they'll 854 00:47:53,600 --> 00:47:56,719 Speaker 1: do is write a very, very long report so that 855 00:47:56,800 --> 00:48:00,120 Speaker 1: the brand-name company can see exactly what happened 856 00:48:00,160 --> 00:48:02,040 Speaker 1: on their network and exactly what they were able to 857 00:48:02,080 --> 00:48:05,640 Speaker 1: do from whatever back door they were able to 858 00:48:05,640 --> 00:48:09,520 Speaker 1: get into. It's really interesting how well they're able 859 00:48:09,600 --> 00:48:13,719 Speaker 1: to put everything together and in turn hopefully save this 860 00:48:13,800 --> 00:48:17,600 Speaker 1: company, in the long run, thousands and thousands of dollars. Yeah, yeah, 861 00:48:17,640 --> 00:48:20,279 Speaker 1: I mean, this is the whole... Security has always been 862 00:48:20,320 --> 00:48:22,959 Speaker 1: a tick-tock approach, right? You've got the tick, which 863 00:48:23,000 --> 00:48:26,600 Speaker 1: is where someone has identified a way of exploiting a system, 864 00:48:26,640 --> 00:48:28,839 Speaker 1: and then the tock is where you find a way 865 00:48:28,880 --> 00:48:32,719 Speaker 1: to correct that vulnerability. The tick is the next 866 00:48:32,719 --> 00:48:35,800 Speaker 1: time someone's found a vulnerability. You're always going to 867 00:48:35,880 --> 00:48:41,000 Speaker 1: have that, right, unless someone somehow designs the absolute perfect system, 868 00:48:41,280 --> 00:48:44,000 Speaker 1: which, as far as we know, is an impossibility. Yeah, 869 00:48:44,040 --> 00:48:47,800 Speaker 1: that's impossible. Yeah, because for one thing, if people are involved, 870 00:48:47,880 --> 00:48:50,840 Speaker 1: there's no such thing as a perfect system. It's always 871 00:48:50,840 --> 00:48:53,160 Speaker 1: a battle. And I love my video games, so I 872 00:48:53,200 --> 00:48:57,839 Speaker 1: love a battle.
But it also drives other 873 00:48:57,920 --> 00:49:01,560 Speaker 1: industries, though, because we'll see things like the artificial intelligence 874 00:49:01,600 --> 00:49:05,760 Speaker 1: industry improve as a result of this security battle between 875 00:49:05,800 --> 00:49:09,319 Speaker 1: hackers and the infosec experts who are trying to 876 00:49:09,360 --> 00:49:13,000 Speaker 1: make sure that they're protecting systems. And as a result, 877 00:49:13,080 --> 00:49:15,560 Speaker 1: we're getting information that can be used in other areas, 878 00:49:16,160 --> 00:49:19,799 Speaker 1: which is phenomenal. Like, I remember, here's a simple one. 879 00:49:19,800 --> 00:49:22,520 Speaker 1: As far as security goes, this is as 880 00:49:22,560 --> 00:49:25,319 Speaker 1: low level as it gets. But the CAPTCHA system. So 881 00:49:25,360 --> 00:49:29,359 Speaker 1: when CAPTCHA was implemented, even the people who 882 00:49:29,400 --> 00:49:32,240 Speaker 1: were writing CAPTCHA at the time were not really thinking 883 00:49:32,280 --> 00:49:35,640 Speaker 1: of it as being some sort of foolproof security 884 00:49:35,640 --> 00:49:38,960 Speaker 1: system to make sure that bots don't get into a system. Right, 885 00:49:39,320 --> 00:49:43,520 Speaker 1: they weren't thinking, oh, now only human beings can get access.
886 00:49:43,520 --> 00:49:45,360 Speaker 1: And if you don't know what a capture is, anytime 887 00:49:45,400 --> 00:49:47,200 Speaker 1: you get your filling out a thing and you get 888 00:49:47,239 --> 00:49:50,959 Speaker 1: a little picture of something and it says, uh, tell 889 00:49:51,120 --> 00:49:53,040 Speaker 1: you know, write down the word or numbers that are 890 00:49:53,040 --> 00:49:56,080 Speaker 1: in this picture, or even to a point of identify 891 00:49:56,160 --> 00:49:59,480 Speaker 1: the pictures in this sequence that have this particular feature, 892 00:49:59,520 --> 00:50:01,400 Speaker 1: like deify all the pictures that have a lake in 893 00:50:01,440 --> 00:50:04,120 Speaker 1: it or something like that. That's a simply that's simply 894 00:50:04,120 --> 00:50:07,440 Speaker 1: a version of capture. Um. The people who made it, 895 00:50:07,440 --> 00:50:09,799 Speaker 1: they actually said, our goal was really to help push 896 00:50:09,880 --> 00:50:14,040 Speaker 1: artificial intelligence because we created a system where programmers or 897 00:50:14,120 --> 00:50:18,400 Speaker 1: hackers had to start coming up with uh, computer programs 898 00:50:18,400 --> 00:50:22,720 Speaker 1: that could identify the same things that we humans can identify. 899 00:50:22,880 --> 00:50:25,920 Speaker 1: And in turn, that means now we've got software that 900 00:50:25,960 --> 00:50:28,960 Speaker 1: pushes forward artificial intelligence. Now, granted, that also means you 901 00:50:29,000 --> 00:50:32,160 Speaker 1: have to improve the system you had designed to keep 902 00:50:32,200 --> 00:50:34,399 Speaker 1: bots out in the first place. So again it goes 903 00:50:34,400 --> 00:50:37,960 Speaker 1: to that TikTok. 
But there's an added benefit beyond someone 904 00:50:38,200 --> 00:50:42,560 Speaker 1: being able to automatically access systems and build, you know, 905 00:50:42,719 --> 00:50:45,600 Speaker 1: dozens and dozens of fake profiles on Facebook or whatever 906 00:50:45,640 --> 00:50:49,239 Speaker 1: it might be. Yeah, yeah, and 907 00:50:49,280 --> 00:50:51,520 Speaker 1: then keep in mind, like we've been saying here, 908 00:50:51,560 --> 00:50:54,760 Speaker 1: I mean, any system's security is only as strong 909 00:50:54,800 --> 00:50:58,799 Speaker 1: as its weakest link. And that weakest link is pretty much 910 00:50:58,840 --> 00:51:01,560 Speaker 1: always people. That's the big one, right. But I mean, 911 00:51:01,600 --> 00:51:05,000 Speaker 1: I've read stories about a hacker gaining access to a 912 00:51:05,040 --> 00:51:08,880 Speaker 1: system because there was an overall security system that was 913 00:51:08,960 --> 00:51:12,400 Speaker 1: really robust for the main company, but then they had 914 00:51:12,440 --> 00:51:15,719 Speaker 1: a little branch office, and the branch office didn't have 915 00:51:15,800 --> 00:51:18,799 Speaker 1: that crazy amount of security but was still on the 916 00:51:18,840 --> 00:51:21,960 Speaker 1: same network. I think I read about that story too. 917 00:51:22,360 --> 00:51:24,239 Speaker 1: So I mean, these are things like, if 918 00:51:24,280 --> 00:51:29,240 Speaker 1: you identify a potential point of weakness, that's now suddenly 919 00:51:29,280 --> 00:51:32,080 Speaker 1: the, you know, it's like a bank vault. If 920 00:51:32,120 --> 00:51:34,680 Speaker 1: the bank vault has an enormous door with huge locks 921 00:51:34,760 --> 00:51:36,920 Speaker 1: on it that you have to get through, oh, but 922 00:51:36,960 --> 00:51:39,719 Speaker 1: it also has a back door just for convenience's sake, you're 923 00:51:39,719 --> 00:51:43,520 Speaker 1: gonna aim for the back door.
But there are 924 00:51:43,520 --> 00:51:47,320 Speaker 1: other ways that hackers can make a legitimate 925 00:51:47,360 --> 00:51:50,839 Speaker 1: living that don't even involve testing security systems. It might 926 00:51:50,880 --> 00:51:55,560 Speaker 1: involve education. Yeah, absolutely. So education is, I guess you 927 00:51:55,600 --> 00:51:58,120 Speaker 1: would say, what I fall into, that kind of category. 928 00:51:58,480 --> 00:52:01,040 Speaker 1: And while I don't necessarily like to call myself 929 00:52:01,080 --> 00:52:03,520 Speaker 1: a hacker, because I know so many experts in the 930 00:52:03,560 --> 00:52:06,239 Speaker 1: field who are much more knowledgeable than I am. I'm 931 00:52:06,680 --> 00:52:11,520 Speaker 1: quite intermediate, I would say, but I love to teach, 932 00:52:11,719 --> 00:52:14,560 Speaker 1: and I love to give tutorials online, so I give 933 00:52:14,600 --> 00:52:17,560 Speaker 1: tutorials on YouTube. But I also know a lot of 934 00:52:17,560 --> 00:52:21,600 Speaker 1: people who have written books about hacking, and 935 00:52:21,680 --> 00:52:25,359 Speaker 1: they could either do specifics about penetration testing, or they 936 00:52:25,360 --> 00:52:28,640 Speaker 1: could make it a very, very broad-based book 937 00:52:28,680 --> 00:52:30,960 Speaker 1: where they explain everything that you would have to do 938 00:52:31,000 --> 00:52:33,920 Speaker 1: as a penetration tester. And a penetration tester is basically 939 00:52:33,920 --> 00:52:36,239 Speaker 1: one of those guys that would go into a company 940 00:52:36,280 --> 00:52:39,000 Speaker 1: and find all the vulnerabilities and report on it.
941 00:52:39,800 --> 00:52:44,719 Speaker 1: You would also have companies that administer certifications, so a 942 00:52:44,840 --> 00:52:47,760 Speaker 1: lot of I'm sure a lot of your your UM 943 00:52:47,800 --> 00:52:51,000 Speaker 1: listeners probably know that you have to get certifications to 944 00:52:51,040 --> 00:52:54,239 Speaker 1: get a lot of UH to get into a lot 945 00:52:54,280 --> 00:52:57,799 Speaker 1: of the fields with computer security and even just you know, 946 00:52:57,880 --> 00:53:01,279 Speaker 1: computer networking too. Sure a lot of searts for those 947 00:53:01,320 --> 00:53:03,800 Speaker 1: and they're very, very expensive. So a lot of companies 948 00:53:03,880 --> 00:53:07,799 Speaker 1: just administer their certifications or they will have you take 949 00:53:07,880 --> 00:53:10,359 Speaker 1: classes for a period of time until you actually take 950 00:53:10,360 --> 00:53:13,200 Speaker 1: the test and get certified. But that ends up being 951 00:53:13,200 --> 00:53:14,960 Speaker 1: a really good thing to put on your resume for 952 00:53:15,000 --> 00:53:18,040 Speaker 1: a lot of companies whenever you do intend to get 953 00:53:18,040 --> 00:53:22,680 Speaker 1: a job in network security. And then lastly, we have 954 00:53:22,800 --> 00:53:26,400 Speaker 1: the publishers. So that's the YouTubers, the that's the people 955 00:53:26,440 --> 00:53:30,560 Speaker 1: that make podcast That's the people that um might be 956 00:53:31,080 --> 00:53:35,040 Speaker 1: creating other forms of entertainment that not only educate but 957 00:53:35,239 --> 00:53:38,319 Speaker 1: also entertain their users and their listeners so that they 958 00:53:38,440 --> 00:53:42,960 Speaker 1: get excited about being a part of information security. Uh. 959 00:53:42,960 --> 00:53:44,840 Speaker 1: And that's what I like to do. I like to 960 00:53:45,200 --> 00:53:47,600 Speaker 1: teach people in a way that makes it exciting. 
So 961 00:53:47,640 --> 00:53:49,719 Speaker 1: I do a lot of hands-on stuff. I make, 962 00:53:49,880 --> 00:53:52,120 Speaker 1: I make jokes, and I explain things in a very 963 00:53:52,200 --> 00:53:57,719 Speaker 1: natural light, and it helps, it helps, again, foster that 964 00:53:57,800 --> 00:54:02,080 Speaker 1: desire to learn how things work. Right, it does. So 965 00:54:02,160 --> 00:54:04,600 Speaker 1: again, that same fascination, like if you were ever 966 00:54:04,680 --> 00:54:07,480 Speaker 1: a kid that took apart a watch or a radio 967 00:54:07,719 --> 00:54:10,359 Speaker 1: or some other piece of equipment, because you really wanted 968 00:54:10,400 --> 00:54:13,560 Speaker 1: to know, what's the magic that makes this thing do 969 00:54:13,760 --> 00:54:17,759 Speaker 1: what it does? Hackers have that. I mean, that's 970 00:54:17,840 --> 00:54:21,000 Speaker 1: the defining quality, in my mind, of 971 00:54:21,040 --> 00:54:24,520 Speaker 1: a hacker: ultimately it's someone who is fascinated with 972 00:54:24,600 --> 00:54:28,840 Speaker 1: the way something works. We've largely been focusing on software, 973 00:54:29,120 --> 00:54:32,879 Speaker 1: but that is just as legitimate as any hardware hack. 974 00:54:33,320 --> 00:54:35,759 Speaker 1: It's the idea of, how does this work? It might not 975 00:54:35,840 --> 00:54:37,880 Speaker 1: even just be the software; it might be a full system. 976 00:54:37,960 --> 00:54:40,560 Speaker 1: Like, how does the system work? What are all the 977 00:54:40,600 --> 00:54:44,799 Speaker 1: interlocking parts? How do they communicate with each other? I 978 00:54:44,880 --> 00:54:47,760 Speaker 1: just had a random memory from when I was younger 979 00:54:47,760 --> 00:54:50,760 Speaker 1: and in school.
I took apart my first iPod because 980 00:54:50,800 --> 00:54:52,279 Speaker 1: I had no clue how it worked, and I was 981 00:54:52,400 --> 00:54:55,040 Speaker 1: very curious about what the interior of it was. 982 00:54:55,640 --> 00:54:58,239 Speaker 1: So I just took it apart. I couldn't put 983 00:54:58,280 --> 00:55:01,600 Speaker 1: it back together, so I was not a hacker in any sense. 984 00:55:01,840 --> 00:55:06,080 Speaker 1: We, for an article I was writing, 985 00:55:06,120 --> 00:55:13,000 Speaker 1: got a first edition Launch Day Nintendo 3D 986 00:55:13,280 --> 00:55:16,000 Speaker 1: S, and it was my job to disassemble it and 987 00:55:16,040 --> 00:55:19,480 Speaker 1: take photos of all the pieces. So first I took 988 00:55:19,480 --> 00:55:23,680 Speaker 1: a picture of it whole and shared it online on 989 00:55:23,719 --> 00:55:26,680 Speaker 1: Twitter and said, look what I have, and everyone got 990 00:55:26,719 --> 00:55:29,319 Speaker 1: excited. And then by the end of it, I 991 00:55:29,360 --> 00:55:32,600 Speaker 1: had a little, had a little black cauldron at my desk 992 00:55:32,640 --> 00:55:35,400 Speaker 1: that was left over from a Halloween thing, and then 993 00:55:35,440 --> 00:55:37,960 Speaker 1: I put all the different pieces in it, because there was no 994 00:55:38,000 --> 00:55:39,879 Speaker 1: way this thing was going back together after I took 995 00:55:39,880 --> 00:55:43,040 Speaker 1: it apart. For one thing, Nintendo is pretty careful about 996 00:55:43,280 --> 00:55:45,520 Speaker 1: sealing stuff in such a way that it's not meant to 997 00:55:45,520 --> 00:55:48,120 Speaker 1: come apart, so you have to use 998 00:55:48,200 --> 00:55:50,120 Speaker 1: a little force in some cases in order to 999 00:55:50,120 --> 00:55:52,319 Speaker 1: get to stuff. And then I showed a picture. I'm like, 1000 00:55:52,360 --> 00:55:55,800 Speaker 1: look what I did to the thing.
I 1001 00:55:55,880 --> 00:56:00,600 Speaker 1: made the entire Internet cry. Yeah, although ultimately I think 1002 00:56:00,640 --> 00:56:04,160 Speaker 1: with the 3DS most people are like, oh, whatever. But 1003 00:56:04,239 --> 00:56:06,200 Speaker 1: at the time, when it was brand new, people were 1004 00:56:06,239 --> 00:56:09,439 Speaker 1: freaking out. And of course there's also another role 1005 00:56:09,560 --> 00:56:11,719 Speaker 1: for hackers out there. It may not be a 1006 00:56:11,719 --> 00:56:15,960 Speaker 1: steady gig, but we are seeing more and more of 1007 00:56:16,000 --> 00:56:20,080 Speaker 1: the Hollywood productions out there actually talk with people in 1008 00:56:20,120 --> 00:56:23,440 Speaker 1: the industry so that the depictions that we're getting are 1009 00:56:23,600 --> 00:56:27,760 Speaker 1: more accurately reflecting what really happens. Mr Robot is probably 1010 00:56:28,320 --> 00:56:32,000 Speaker 1: the example that immediately leaps to my mind, in that 1011 00:56:32,200 --> 00:56:34,960 Speaker 1: it's a show that tries very hard to take 1012 00:56:35,000 --> 00:56:38,279 Speaker 1: a more realistic approach to the world of hacking, as 1013 00:56:38,280 --> 00:56:41,839 Speaker 1: opposed to, um, you type in three passwords, the third 1014 00:56:41,840 --> 00:56:44,799 Speaker 1: one gets you in, and then you're navigating through a 1015 00:56:44,920 --> 00:56:48,160 Speaker 1: vector graphics 3D dungeon and you encounter a skull 1016 00:56:48,200 --> 00:56:51,920 Speaker 1: and crossbones. That's not how hacking works. It sounds 1017 00:56:51,920 --> 00:56:55,920 Speaker 1: like you were talking about Hackers. Hack the planet! I 1018 00:56:56,040 --> 00:57:01,040 Speaker 1: might have been. Education, just to bring it back: professors, 1019 00:57:01,080 --> 00:57:03,560 Speaker 1: I didn't leave you guys out. I'm sorry. I love 1020 00:57:03,560 --> 00:57:05,839 Speaker 1: you guys.
You are the reason why I'm here now. 1021 00:57:05,920 --> 00:57:08,239 Speaker 1: If I didn't take my computer courses in college with 1022 00:57:08,280 --> 00:57:10,759 Speaker 1: my professors, I would not be doing what I'm doing now. 1023 00:57:10,840 --> 00:57:13,560 Speaker 1: So professors are, like, at the top of that education 1024 00:57:13,640 --> 00:57:16,240 Speaker 1: list, and you can take a lot of computer 1025 00:57:16,280 --> 00:57:19,120 Speaker 1: security courses in college, and sometimes in high schools if 1026 00:57:19,160 --> 00:57:23,880 Speaker 1: you're lucky. But yeah, technical assistants. Technical assistants are 1027 00:57:23,960 --> 00:57:27,360 Speaker 1: people that will come on board with a Hollywood movie 1028 00:57:27,520 --> 00:57:29,960 Speaker 1: or a TV show or what have you, and they 1029 00:57:29,960 --> 00:57:34,400 Speaker 1: will explain to the network how the hacking actually happens. 1030 00:57:34,880 --> 00:57:39,640 Speaker 1: So I know a few. Uh, they'll come 1031 00:57:39,680 --> 00:57:41,800 Speaker 1: to some of their hacker friends, or they will be 1032 00:57:41,840 --> 00:57:45,040 Speaker 1: a hacker themselves, and they will say, okay, in 1033 00:57:45,080 --> 00:57:47,600 Speaker 1: this season, I know that they want to do X, 1034 00:57:47,720 --> 00:57:50,120 Speaker 1: Y and Z on camera, and I need to make 1035 00:57:50,120 --> 00:57:52,920 Speaker 1: it look legitimate. So they will come up with the script, 1036 00:57:53,040 --> 00:57:55,080 Speaker 1: they will come up with the hack and the actual 1037 00:57:55,240 --> 00:57:59,280 Speaker 1: keyboard commands that the actor has to type in on 1038 00:57:59,400 --> 00:58:03,560 Speaker 1: camera, so that they are actually doing legitimate hacks.
So 1039 00:58:03,600 --> 00:58:07,240 Speaker 1: that way, they're not only making it look cool for 1040 00:58:07,280 --> 00:58:09,960 Speaker 1: a wider audience, because an audience is actually going to 1041 00:58:10,000 --> 00:58:12,560 Speaker 1: see how a hack works, but they're also getting that 1042 00:58:12,600 --> 00:58:17,320 Speaker 1: credibility with the infosec community too. So Mr Robot 1043 00:58:17,400 --> 00:58:20,760 Speaker 1: is huge with the infosec community because it is legitimate. 1044 00:58:20,840 --> 00:58:23,360 Speaker 1: Like, I've watched several of those episodes and I've seen 1045 00:58:23,360 --> 00:58:25,400 Speaker 1: a lot of the hacks that they do. They've even 1046 00:58:25,480 --> 00:58:27,520 Speaker 1: used some of our Hak5 products on the show, 1047 00:58:28,000 --> 00:58:31,520 Speaker 1: and they're actually using legit hacks. And it is so 1048 00:58:31,640 --> 00:58:33,720 Speaker 1: much fun to see it on TV and see them 1049 00:58:33,760 --> 00:58:36,440 Speaker 1: get so many good reviews from a wider consumer audience, 1050 00:58:36,480 --> 00:58:39,600 Speaker 1: because it makes me feel like many more people are 1051 00:58:39,600 --> 00:58:42,400 Speaker 1: getting interested in infosec, because they see what's happening 1052 00:58:42,440 --> 00:58:44,520 Speaker 1: on camera and they see that this is actually how 1053 00:58:44,560 --> 00:58:46,920 Speaker 1: you do it. Yeah, it's nice to see it go 1054 00:58:47,160 --> 00:58:53,400 Speaker 1: beyond the niche that, I would argue, infosec 1055 00:58:53,440 --> 00:58:57,520 Speaker 1: and hacking have largely inhabited for the past three decades, 1056 00:58:57,640 --> 00:59:00,600 Speaker 1: right? The people who were interested, when it first started, 1057 00:59:00,600 --> 00:59:04,840 Speaker 1: were essentially your hobbyists, and often those hobbyists were 1058 00:59:04,880 --> 00:59:08,960 Speaker 1: isolated individuals.
You got to the phone phreaking days 1059 00:59:09,000 --> 00:59:12,680 Speaker 1: where there was a little bit of a small subculture 1060 00:59:12,720 --> 00:59:15,800 Speaker 1: of people who were interested in hacking the telephone system 1061 00:59:15,920 --> 00:59:20,400 Speaker 1: using all sorts of stuff, including a whistle from Cap'n Crunch. 1062 00:59:20,640 --> 00:59:23,439 Speaker 1: You had the early hack days where 1063 00:59:23,440 --> 00:59:26,000 Speaker 1: people were just trying to create interesting programs for their 1064 00:59:26,000 --> 00:59:28,520 Speaker 1: computers, or to see how some of the programs that 1065 00:59:28,560 --> 00:59:31,320 Speaker 1: were coming out, how did those work? Um, but it 1066 00:59:31,440 --> 00:59:36,040 Speaker 1: was largely a tiny slice of the folks who were 1067 00:59:36,080 --> 00:59:39,400 Speaker 1: even aware of personal computers, and even that group 1068 00:59:39,480 --> 00:59:42,919 Speaker 1: was still a tiny slice of the overall population. We're 1069 00:59:42,920 --> 00:59:47,280 Speaker 1: seeing that tiny slice grow over time, largely because 1070 00:59:47,760 --> 00:59:50,280 Speaker 1: so many of us are so dependent upon computers these 1071 00:59:50,360 --> 00:59:53,520 Speaker 1: days that it benefits us to have an awareness, to 1072 00:59:53,560 --> 00:59:57,040 Speaker 1: make sure that we remain safe. But also because of 1073 00:59:57,080 --> 01:00:01,200 Speaker 1: things like Mr Robot showing how this works and 1074 01:00:01,240 --> 01:00:05,080 Speaker 1: sparking the imagination of people who perhaps, before they saw 1075 01:00:05,120 --> 01:00:07,920 Speaker 1: that, never thought, yeah, it's kind of cool, I would 1076 01:00:07,920 --> 01:00:10,560 Speaker 1: love to be able to manipulate code in such a 1077 01:00:10,560 --> 01:00:14,000 Speaker 1: way that I could do something new or unexpected, or 1078 01:00:14,400 --> 01:00:17,640 Speaker 1: help people.
And it's really encouraging to see that 1079 01:00:17,720 --> 01:00:21,120 Speaker 1: kind of thing happen right now. I kind of wish 1080 01:00:21,120 --> 01:00:23,040 Speaker 1: it had happened ten years ago, but I love seeing 1081 01:00:23,040 --> 01:00:25,800 Speaker 1: it happen now. Same. I actually feel like there was 1082 01:00:25,840 --> 01:00:28,960 Speaker 1: a little bit of negativity in the aspect that 1083 01:00:29,600 --> 01:00:32,920 Speaker 1: we used to have all these really fancy graphics 1084 01:00:32,960 --> 01:00:35,720 Speaker 1: happening in these Hollywood movies and these TV shows, 1085 01:00:35,880 --> 01:00:39,240 Speaker 1: and now they're actually seeing the reality that is hacking, 1086 01:00:39,320 --> 01:00:43,240 Speaker 1: and it is not super colorful. It's not super quick, 1087 01:00:43,280 --> 01:00:46,040 Speaker 1: fast paced and exciting like it looks like it is 1088 01:00:46,120 --> 01:00:49,800 Speaker 1: on those old school shows. So I'm hoping that now 1089 01:00:49,840 --> 01:00:52,840 Speaker 1: that they're actually seeing it, people will try it too. 1090 01:00:53,480 --> 01:00:56,840 Speaker 1: Like, if they see the main actor on Mr 1091 01:00:56,920 --> 01:01:00,720 Speaker 1: Robot do a specific command line option, they'll go to 1092 01:01:00,760 --> 01:01:02,800 Speaker 1: their computer and try it themselves and see that it 1093 01:01:02,880 --> 01:01:04,920 Speaker 1: actually does work, and then they'll be like, oh, I 1094 01:01:04,960 --> 01:01:06,840 Speaker 1: really want to try some new stuff too. So they'll 1095 01:01:06,840 --> 01:01:08,959 Speaker 1: start googling and see what else they can find out.
1096 01:01:09,360 --> 01:01:12,200 Speaker 1: That's the kind of inspiration that I wish happened thirty 1097 01:01:12,280 --> 01:01:16,360 Speaker 1: years ago, and it didn't, so I want to see 1098 01:01:16,400 --> 01:01:19,560 Speaker 1: more of that now, and I'm really happy that, for example, 1099 01:01:19,680 --> 01:01:22,520 Speaker 1: Mr Robot has done a great job with it. Yeah, 1100 01:01:22,520 --> 01:01:26,360 Speaker 1: and not to poop all 1101 01:01:26,400 --> 01:01:31,760 Speaker 1: over Hollywood, because I do love me some Hollywood, but 1102 01:01:31,800 --> 01:01:35,320 Speaker 1: to understand where they were coming from, 1103 01:01:35,360 --> 01:01:38,280 Speaker 1: they were trying to find a way to create an 1104 01:01:38,280 --> 01:01:45,240 Speaker 1: exciting visual depiction of something that doesn't necessarily lend 1105 01:01:45,320 --> 01:01:49,040 Speaker 1: itself to that, in order to create a dramatic effect. 1106 01:01:49,400 --> 01:01:51,760 Speaker 1: So I get it. It's very similar to the way 1107 01:01:51,800 --> 01:01:56,840 Speaker 1: Hollywood portrayed virtual reality back in the nineties, way before 1108 01:01:56,920 --> 01:02:01,080 Speaker 1: virtual reality was ready for public consumption, and it's what 1109 01:02:01,360 --> 01:02:06,720 Speaker 1: largely killed VR for a decade, before the various video 1110 01:02:06,720 --> 01:02:09,280 Speaker 1: game systems started to make the components cheap 1111 01:02:09,400 --> 01:02:11,680 Speaker 1: enough for people to play in that space again, and 1112 01:02:11,680 --> 01:02:15,560 Speaker 1: now we're on the verge of another VR revolution.
The 1113 01:02:15,600 --> 01:02:17,240 Speaker 1: same sort of thing is true of hacking. Like, how 1114 01:02:17,240 --> 01:02:21,280 Speaker 1: do you show hacking in a way that gets across 1115 01:02:21,320 --> 01:02:23,760 Speaker 1: what is happening to an audience and makes it interesting? 1116 01:02:23,920 --> 01:02:26,400 Speaker 1: I think largely you have to do that through really 1117 01:02:26,480 --> 01:02:28,840 Speaker 1: good writing of your characters, and once you do that, 1118 01:02:29,480 --> 01:02:32,439 Speaker 1: then everything else follows. I think if you can 1119 01:02:32,520 --> 01:02:35,680 Speaker 1: show that the characters in a movie or in a 1120 01:02:35,720 --> 01:02:39,280 Speaker 1: TV show are actually real people that have real relationships, 1121 01:02:39,320 --> 01:02:41,520 Speaker 1: they have real jobs and real lives, and they have 1122 01:02:41,600 --> 01:02:46,000 Speaker 1: hobbies outside of just hacking, you can 1123 01:02:46,160 --> 01:02:49,600 Speaker 1: start to relate to that character in a very real 1124 01:02:49,720 --> 01:02:52,120 Speaker 1: sense, in the fact that, hey, they are humans too, 1125 01:02:52,200 --> 01:02:56,840 Speaker 1: because, hey, Hackers Are People Too. That was actually a documentary. Nice. Yeah, 1126 01:02:56,880 --> 01:03:00,160 Speaker 1: because again, when you're thinking about it in 1127 01:03:00,200 --> 01:03:03,840 Speaker 1: the abstract, it becomes that us versus them mentality, 1128 01:03:03,880 --> 01:03:07,959 Speaker 1: which, by its very nature, is dehumanizing. But that's 1129 01:03:08,000 --> 01:03:11,400 Speaker 1: probably a topic for a show that's not about technology, 1130 01:03:11,520 --> 01:03:15,080 Speaker 1: so I will just leave it be. Shannon Morse, thank 1131 01:03:15,160 --> 01:03:19,000 Speaker 1: you so much for joining me today.
Please let everyone 1132 01:03:19,040 --> 01:03:23,160 Speaker 1: know where they can find all of your stuff. Jonathan Strickland, 1133 01:03:23,280 --> 01:03:27,800 Speaker 1: thank you. So it was a little, 1134 01:03:28,160 --> 01:03:31,280 Speaker 1: it was a little Data. Yeah. Yeah, I've been watching 1135 01:03:31,280 --> 01:03:34,240 Speaker 1: Star Trek lately, way way too much Star Trek. So 1136 01:03:34,280 --> 01:03:37,880 Speaker 1: you can find me. Um, the most direct path is 1137 01:03:37,960 --> 01:03:40,600 Speaker 1: on Twitter. I'm at snubs, and that's S N U 1138 01:03:40,800 --> 01:03:44,520 Speaker 1: B S. And then my shows specifically are TekThing, 1139 01:03:44,640 --> 01:03:47,040 Speaker 1: over at tekthing dot com, and Hak 1140 01:03:47,160 --> 01:03:51,760 Speaker 1: 5, over at hak5 dot org. Yeah, so 1141 01:03:51,880 --> 01:03:55,200 Speaker 1: go check those shows out. They are awesome. Shannon and 1142 01:03:55,280 --> 01:03:58,520 Speaker 1: her co hosts are all awesome. I gotta get Corn. 1143 01:03:58,880 --> 01:04:01,160 Speaker 1: I gotta get Darren on this. Yeah, no, you are cooler, 1144 01:04:01,200 --> 01:04:03,560 Speaker 1: but someday I gotta get Darren on the show. Um, 1145 01:04:03,840 --> 01:04:05,760 Speaker 1: I don't think Darren and I have ever... I think 1146 01:04:05,760 --> 01:04:09,880 Speaker 1: we may have been on one of Tom Merritt's shows 1147 01:04:10,000 --> 01:04:12,439 Speaker 1: at the same time, but otherwise I don't think we've 1148 01:04:12,440 --> 01:04:15,280 Speaker 1: ever done a show together, at any rate. Yeah, I know, 1149 01:04:15,320 --> 01:04:19,280 Speaker 1: it's crazy, right? We gotta make that happen. Let's 1150 01:04:19,320 --> 01:04:22,400 Speaker 1: do that. And that wraps up this classic episode of 1151 01:04:22,400 --> 01:04:24,959 Speaker 1: tech Stuff.
I hope you guys enjoyed this look back 1152 01:04:25,280 --> 01:04:27,320 Speaker 1: on some of the episodes that I've done over the 1153 01:04:27,320 --> 01:04:29,480 Speaker 1: past couple of years. These are more recent than some 1154 01:04:29,560 --> 01:04:32,880 Speaker 1: of the other classic episodes we've been running on Fridays, 1155 01:04:32,920 --> 01:04:35,360 Speaker 1: and like I said before, I should be back in 1156 01:04:35,400 --> 01:04:37,840 Speaker 1: the office recording brand new stuff. You're gonna hear a 1157 01:04:37,960 --> 01:04:42,360 Speaker 1: whole arc of episodes about our relationship with media and 1158 01:04:42,400 --> 01:04:45,960 Speaker 1: how media has changed over time, how the business of 1159 01:04:46,040 --> 01:04:49,920 Speaker 1: media has changed, how our consumption of media has changed. 1160 01:04:50,160 --> 01:04:53,960 Speaker 1: It's a huge, huge topic and it spans multiple episodes, 1161 01:04:54,240 --> 01:04:56,640 Speaker 1: so I hope you enjoy that. It's been fascinating for 1162 01:04:56,680 --> 01:04:58,960 Speaker 1: me to jump into that research and kind of break 1163 01:04:59,040 --> 01:05:02,280 Speaker 1: this out. And, um, actually, I didn't know 1164 01:05:02,520 --> 01:05:05,080 Speaker 1: how big a bite I was taking at the time 1165 01:05:05,120 --> 01:05:08,280 Speaker 1: when I started. And I hope you enjoy 1166 01:05:08,320 --> 01:05:10,640 Speaker 1: it when you start hearing those episodes. If you have 1167 01:05:10,680 --> 01:05:13,000 Speaker 1: any suggestions for future episodes of tech Stuff, send me 1168 01:05:13,040 --> 01:05:16,240 Speaker 1: a message. The email address is tech stuff at how 1169 01:05:16,280 --> 01:05:19,439 Speaker 1: stuff works dot com. Drop on by our website, that's 1170 01:05:19,520 --> 01:05:22,480 Speaker 1: tech stuff podcast dot com. There you'll find an archive 1171 01:05:22,560 --> 01:05:25,200 Speaker 1: of all of our episodes.
You'll find links to our 1172 01:05:25,240 --> 01:05:27,800 Speaker 1: social media presence, and you'll find a link to our 1173 01:05:27,840 --> 01:05:31,000 Speaker 1: online store, where every purchase you make goes to help the show, 1174 01:05:31,120 --> 01:05:33,640 Speaker 1: and we greatly appreciate it. And I'll talk to you 1175 01:05:33,680 --> 01:05:42,160 Speaker 1: again really soon. For more on this and thousands of 1176 01:05:42,200 --> 01:05:54,200 Speaker 1: other topics, visit how stuff works dot com.