1 00:00:04,440 --> 00:00:10,080 Speaker 1: Welcome to TechStuff, a production from iHeartRadio. 2 00:00:11,840 --> 00:00:14,280 Speaker 2: Hey there, and welcome to TechStuff. I'm your host, 3 00:00:14,360 --> 00:00:15,240 Speaker 2: Jonathan Strickland. 4 00:00:15,240 --> 00:00:18,720 Speaker 1: I'm an executive producer with iHeartRadio and how the tech 5 00:00:18,760 --> 00:00:21,560 Speaker 1: are you? It is time for another TechStuff Classic, 6 00:00:21,720 --> 00:00:23,439 Speaker 1: and we actually have a part two. 7 00:00:23,640 --> 00:00:26,240 Speaker 2: So last week, last Friday, 8 00:00:25,840 --> 00:00:28,920 Speaker 1: we had The Worst Hacking Scenes from Hollywood Part One 9 00:00:29,480 --> 00:00:32,360 Speaker 1: with Shannon Morse, and Shannon is back for this other 10 00:00:32,360 --> 00:00:36,320 Speaker 1: classic episode, The Worst Hacking Scenes from Hollywood Part Two. 11 00:00:36,400 --> 00:00:42,839 Speaker 1: It was originally published on September fourteenth, twenty sixteen. Enjoy. Today 12 00:00:42,880 --> 00:00:45,560 Speaker 1: we're going to talk more about some of the embarrassing 13 00:00:45,600 --> 00:00:48,280 Speaker 1: examples out there in entertainment as well as some of 14 00:00:48,320 --> 00:00:49,440 Speaker 1: the really good ones. 15 00:00:49,920 --> 00:00:50,680 Speaker 2: And we're going to 16 00:00:50,680 --> 00:00:53,400 Speaker 1: start off with a conversation about a pair of movies 17 00:00:53,640 --> 00:00:57,560 Speaker 1: that not only did a disservice to hackers, but also to 18 00:00:57,760 --> 00:00:58,720 Speaker 1: virtual reality. 19 00:00:59,240 --> 00:01:02,640 Speaker 2: We join the conversation already in progress. 20 00:01:08,800 --> 00:01:13,520 Speaker 1: Lawnmower Man and Lawnmower Man 2, inspired by the work 21 00:01:13,520 --> 00:01:16,640 Speaker 1: of Stephen King.
So these movies are terrible, but they could 22 00:01:16,680 --> 00:01:18,039 Speaker 1: be a lot of fun to watch with a group 23 00:01:18,080 --> 00:01:21,039 Speaker 1: of people who just want to see, like, a really 24 00:01:21,040 --> 00:01:22,760 Speaker 1: bad movie and make fun of it, you know, MST 25 00:01:22,880 --> 00:01:25,560 Speaker 1: 3K style. I included a couple of clips and 26 00:01:25,600 --> 00:01:29,319 Speaker 1: I showed them to Shannon. One of them was a 27 00:01:29,360 --> 00:01:35,800 Speaker 1: demonstration that hacking a system in this world isn't just 28 00:01:36,000 --> 00:01:42,600 Speaker 1: graphical but immersive. So you're wearing like a head-mounted display, 29 00:01:42,680 --> 00:01:47,240 Speaker 1: and you're wearing gloves that allow you to interact with 30 00:01:47,280 --> 00:01:48,160 Speaker 1: the virtual world. 31 00:01:48,720 --> 00:01:49,880 Speaker 2: And so they're 32 00:01:49,720 --> 00:01:55,480 Speaker 1: showing a character getting backdoor access to a system by 33 00:01:56,760 --> 00:02:01,960 Speaker 1: slapping at hexagons inside the virtual space. So these hexagons 34 00:02:01,960 --> 00:02:04,720 Speaker 1: are popping up in front of them, and he's slapping 35 00:02:04,720 --> 00:02:08,320 Speaker 1: at them and eventually gets, you know, access granted. Mostly 36 00:02:08,320 --> 00:02:11,120 Speaker 1: it's getting access denied, but eventually access granted pops up, 37 00:02:11,280 --> 00:02:13,000 Speaker 1: and it looks like the only thing that he did 38 00:02:13,000 --> 00:02:14,880 Speaker 1: differently was just that he hit it a little faster 39 00:02:15,000 --> 00:02:17,880 Speaker 1: that time. So, in other words, it'd be equivalent to 40 00:02:18,040 --> 00:02:21,120 Speaker 1: that NCIS scene we talked about earlier, where if you 41 00:02:21,480 --> 00:02:25,280 Speaker 1: just typed fast enough, you either hack in or you 42 00:02:25,360 --> 00:02:29,160 Speaker 1: prevent a hack from happening.
Obviously, this is not at 43 00:02:29,160 --> 00:02:34,120 Speaker 1: all remotely realistic, but it is one of those where 44 00:02:34,440 --> 00:02:36,880 Speaker 1: you look at it and you could tell the screenwriter 45 00:02:37,000 --> 00:02:39,800 Speaker 1: was saying, well, I want to show that this character, 46 00:02:39,880 --> 00:02:43,600 Speaker 1: who has found that he's very powerful in the 47 00:02:43,639 --> 00:02:46,840 Speaker 1: virtual space, can access a system. 48 00:02:47,080 --> 00:02:49,000 Speaker 2: How can I do that in a way that's not just, 49 00:02:50,480 --> 00:02:52,800 Speaker 1: you know, over in a blink of an eye, 50 00:02:52,840 --> 00:02:55,880 Speaker 1: with no way to show it visually? So they 51 00:02:55,919 --> 00:02:58,880 Speaker 1: created this sort of 3D display. This is one 52 00:02:58,880 --> 00:03:00,600 Speaker 1: of those things we see in a lot of Hollywood 53 00:03:00,639 --> 00:03:05,160 Speaker 1: movies where they try to visualize navigating a secure 54 00:03:05,240 --> 00:03:07,680 Speaker 1: system as like going through a maze. 55 00:03:08,000 --> 00:03:09,480 Speaker 2: It happens a ton. 56 00:03:10,400 --> 00:03:15,360 Speaker 3: And again, actually going through that maze of information that's 57 00:03:15,400 --> 00:03:19,480 Speaker 3: located on a network is actually quite boring looking on camera. 58 00:03:20,000 --> 00:03:24,400 Speaker 3: So they created this outright terrible virtual reality scene 59 00:03:25,440 --> 00:03:30,560 Speaker 3: to actually give some kind of implementation, some kind of 60 00:03:30,600 --> 00:03:33,359 Speaker 3: visual implementation to the people that are watching this movie. 61 00:03:33,360 --> 00:03:38,680 Speaker 3: And it's pretty terrible. It's definitely not how gaining access 62 00:03:38,720 --> 00:03:42,880 Speaker 3: to a network would work, but I thought it was hilarious. Yeah.
63 00:03:42,960 --> 00:03:46,120 Speaker 1: Lawnmower Man 2 also has a scene that's similar to that. 64 00:03:46,600 --> 00:03:50,320 Speaker 1: The characters are looking at what is just an equation 65 00:03:50,800 --> 00:03:54,400 Speaker 1: on a computer screen. So they're looking at this equation 66 00:03:54,560 --> 00:03:59,200 Speaker 1: and trying to figure out how to get access, and 67 00:03:59,640 --> 00:04:02,320 Speaker 1: one of the characters says that she can't get past 68 00:04:02,360 --> 00:04:06,880 Speaker 1: the memory lock to access the chain, so the smarmy 69 00:04:07,000 --> 00:04:10,440 Speaker 1: computer expert who's next to her says, well, all you 70 00:04:10,520 --> 00:04:14,920 Speaker 1: have to do is just enhance the memory index, which, 71 00:04:15,000 --> 00:04:18,120 Speaker 1: already, none of that makes any real sense. It's kind 72 00:04:18,160 --> 00:04:20,719 Speaker 1: of like Star Trek technobabble where they talk about reversing 73 00:04:20,760 --> 00:04:25,440 Speaker 1: the polarity. Really, it's just there to say something 74 00:04:25,520 --> 00:04:27,080 Speaker 1: technical is happening, 75 00:04:26,800 --> 00:04:29,840 Speaker 2: and you don't need to understand what it is. It's 76 00:04:29,920 --> 00:04:30,680 Speaker 2: kind of shorthand. 77 00:04:31,360 --> 00:04:31,560 Speaker 4: Yeah. 78 00:04:32,040 --> 00:04:35,520 Speaker 1: So then a little yellow ball appears on the screen, 79 00:04:35,560 --> 00:04:41,560 Speaker 1: and the equation they're looking at starts to rotate 80 00:04:42,120 --> 00:04:44,640 Speaker 1: and starts to turn on its side and become a 81 00:04:44,680 --> 00:04:49,320 Speaker 1: three-dimensional maze again, so we get another example where 82 00:04:50,480 --> 00:04:55,200 Speaker 1: accessing a system involves navigating through an actual three-dimensional maze.
83 00:04:56,000 --> 00:04:58,880 Speaker 1: He uses gesture controls to move this little yellow ball 84 00:04:58,920 --> 00:05:05,520 Speaker 1: around until finally getting to where the data is. The 85 00:05:05,560 --> 00:05:11,200 Speaker 1: whole time he's just using meaningless computer jargon incorrectly. And 86 00:05:11,200 --> 00:05:14,919 Speaker 1: then the cherry on top of the sundae 87 00:05:15,000 --> 00:05:20,680 Speaker 1: is that he says ta-da. So, uh, if you 88 00:05:20,760 --> 00:05:23,480 Speaker 1: really want a good time, watch the hacking scene from 89 00:05:23,560 --> 00:05:28,279 Speaker 1: Lawnmower Man 2. Both of those, obviously, are in 90 00:05:28,360 --> 00:05:33,760 Speaker 1: that same category of trying to visualize what this process 91 00:05:33,839 --> 00:05:36,760 Speaker 1: would be like in a way that is interesting. I 92 00:05:36,760 --> 00:05:41,080 Speaker 1: said that the worst a movie can do is not 93 00:05:41,400 --> 00:05:44,920 Speaker 1: only be incorrect, but also fail to be entertaining, and 94 00:05:44,920 --> 00:05:45,640 Speaker 1: I would 95 00:05:45,480 --> 00:05:47,440 Speaker 2: argue both of these fall into that category. 96 00:05:48,160 --> 00:05:50,960 Speaker 3: I feel like if hacking was as entertaining as it 97 00:05:51,000 --> 00:05:53,920 Speaker 3: is in Lawnmower Man, I would have gotten into it 98 00:05:53,960 --> 00:05:56,360 Speaker 3: at a much, much younger age. 99 00:05:56,560 --> 00:06:00,279 Speaker 2: I remember Lawnmower Man is one of those movies 100 00:06:00,279 --> 00:06:03,279 Speaker 1: I would often cite as being a real reason why 101 00:06:03,360 --> 00:06:06,000 Speaker 1: virtual reality died in the mid nineties. 102 00:06:07,120 --> 00:06:08,479 Speaker 4: So sad, yeah. 103 00:06:08,560 --> 00:06:11,160 Speaker 1: Because the stuff that was around in the mid nineties 104 00:06:11,279 --> 00:06:14,320 Speaker 1: was very, very primitive, right.
You had like Dactyl Nightmare, 105 00:06:14,400 --> 00:06:19,400 Speaker 1: these games that were just polygons, and they were very primitive. 106 00:06:20,120 --> 00:06:22,839 Speaker 1: The gameplay was limited, the headsets you wore had to 107 00:06:22,839 --> 00:06:25,320 Speaker 1: be supported on cables because they were so heavy they 108 00:06:25,320 --> 00:06:29,479 Speaker 1: would really hurt your neck otherwise. And so when people 109 00:06:29,520 --> 00:06:32,760 Speaker 1: saw what the actual state of the art of virtual 110 00:06:32,760 --> 00:06:35,520 Speaker 1: reality was at the time compared to what they were 111 00:06:35,520 --> 00:06:40,240 Speaker 1: seeing in films, that gap ended up making people say, well, 112 00:06:40,279 --> 00:06:44,239 Speaker 1: this isn't good at all. I don't see any reason 113 00:06:44,839 --> 00:06:48,760 Speaker 1: in putting any money toward this, and virtual reality died 114 00:06:49,200 --> 00:06:53,080 Speaker 1: for about ten years, and now we're starting to see 115 00:06:53,120 --> 00:06:57,120 Speaker 1: it get back into the consumer space with the success 116 00:06:57,240 --> 00:07:00,840 Speaker 1: of Oculus Rift and HTC Vive. I mean, you can get your 117 00:07:00,839 --> 00:07:04,599 Speaker 1: hands on one. We're starting to see it come back, 118 00:07:04,640 --> 00:07:08,120 Speaker 1: but it took almost twenty years for it to recover. 119 00:07:08,760 --> 00:07:14,000 Speaker 1: So thanks, Lawnmower Man. Not only were you a bad movie, 120 00:07:14,040 --> 00:07:15,040 Speaker 1: but you killed VR. 121 00:07:15,640 --> 00:07:16,840 Speaker 4: It's all your fault. 122 00:07:17,360 --> 00:07:19,679 Speaker 2: Yeah, I'm not bitter or anything.
123 00:07:20,800 --> 00:07:23,320 Speaker 1: There's a movie that we have to mention, everyone. 124 00:07:23,400 --> 00:07:25,800 Speaker 1: When I posted on Twitter that I was going 125 00:07:25,840 --> 00:07:27,800 Speaker 1: to do this, I had a lot of people say 126 00:07:27,920 --> 00:07:31,920 Speaker 1: the entire movie of Hackers should be considered as part 127 00:07:31,920 --> 00:07:33,560 Speaker 1: of this as a bad hacking scene. 128 00:07:34,120 --> 00:07:35,760 Speaker 2: And Hackers is a great example. 129 00:07:36,680 --> 00:07:39,280 Speaker 1: It's another one of those where it's a movie that 130 00:07:39,360 --> 00:07:43,080 Speaker 1: was posing as an anti-authoritarian kind of film, but 131 00:07:43,480 --> 00:07:46,560 Speaker 1: in the safest way possible, like there was nothing 132 00:07:46,320 --> 00:07:47,400 Speaker 2: really daring about it. 133 00:07:48,520 --> 00:07:52,600 Speaker 1: And the opening sequence has a couple of characters trying 134 00:07:52,680 --> 00:07:56,000 Speaker 1: to battle it out over one another, and one of 135 00:07:56,040 --> 00:07:59,000 Speaker 1: them is trying to create an intrusion into a system. 136 00:07:59,280 --> 00:08:03,480 Speaker 1: The other one boots them out and then shuts everything down. So Shannon, 137 00:08:04,040 --> 00:08:08,320 Speaker 1: typically when you're talking about things like a hacker intruding 138 00:08:08,360 --> 00:08:12,240 Speaker 1: into a system, detecting that there's been an intrusion, we're 139 00:08:12,240 --> 00:08:16,960 Speaker 1: not normally seeing this happen on a one-versus-one 140 00:08:17,080 --> 00:08:19,960 Speaker 1: kind of basis in real time, typically, are we?
141 00:08:21,000 --> 00:08:23,680 Speaker 3: Not necessarily. A lot of times, what you'll see with 142 00:08:23,800 --> 00:08:27,840 Speaker 3: a hack intrusion into a system, like you said, is 143 00:08:27,920 --> 00:08:30,040 Speaker 3: that they will sit and watch and they will do 144 00:08:30,080 --> 00:08:32,280 Speaker 3: a lot of recon, so they'll try to figure out 145 00:08:32,760 --> 00:08:36,040 Speaker 3: what's available for them, what can they see, and they'll 146 00:08:36,040 --> 00:08:39,600 Speaker 3: wait for something to happen that'll give them even more access. 147 00:08:40,280 --> 00:08:42,440 Speaker 3: And then if you have somebody that's working inside of 148 00:08:42,440 --> 00:08:46,959 Speaker 3: that company that's being intruded upon, usually the first action 149 00:08:47,160 --> 00:08:49,680 Speaker 3: that you will see is them trying to also collect 150 00:08:49,720 --> 00:08:54,680 Speaker 3: information so they can find where this opening is, where 151 00:08:54,679 --> 00:08:57,200 Speaker 3: this vulnerability is, and then they'll shut it down. 152 00:08:58,000 --> 00:08:59,240 Speaker 4: And then after that they'll do 153 00:08:59,240 --> 00:09:03,640 Speaker 3: a whole research report on it and try to 154 00:09:03,679 --> 00:09:06,079 Speaker 3: figure out like who actually did it, and probably take 155 00:09:06,559 --> 00:09:10,040 Speaker 3: it to law enforcement agencies so that they can capture 156 00:09:10,040 --> 00:09:13,240 Speaker 3: the person and hopefully get them charged, because that's 157 00:09:13,280 --> 00:09:15,320 Speaker 3: not a very good thing to do. Yeah, I mean, 158 00:09:15,320 --> 00:09:18,000 Speaker 3: you don't necessarily see people like duking it out like 159 00:09:18,080 --> 00:09:19,000 Speaker 3: they do in Hackers.
160 00:09:19,360 --> 00:09:23,160 Speaker 1: Yeah, it's usually, you know, like, your first indication that 161 00:09:23,240 --> 00:09:25,800 Speaker 1: something is wrong can often just be that you notice 162 00:09:25,840 --> 00:09:29,480 Speaker 1: there's unusual amounts of data traffic across your network, and 163 00:09:29,840 --> 00:09:33,079 Speaker 1: you might say, well, why are we getting these spikes? 164 00:09:33,360 --> 00:09:38,599 Speaker 1: If you're a savvy hacker, you're trying to mask your activity. 165 00:09:38,920 --> 00:09:40,960 Speaker 1: You're kind of like trying to sneak in when a 166 00:09:40,960 --> 00:09:43,200 Speaker 1: big crowd is going into a room. It's kind of 167 00:09:43,200 --> 00:09:46,959 Speaker 1: the same thing, right? Like you're waiting for large transfers 168 00:09:46,960 --> 00:09:49,680 Speaker 1: of data so that it masks what you are doing. 169 00:09:50,040 --> 00:09:52,520 Speaker 1: In a lot of cases, if you're really really good 170 00:09:52,520 --> 00:09:54,440 Speaker 1: at what you're doing, you're trying to kind of go 171 00:09:54,480 --> 00:09:57,120 Speaker 1: in that route because it means you can stay hidden 172 00:09:57,200 --> 00:09:59,319 Speaker 1: for longer and get more information. 173 00:09:59,559 --> 00:10:00,880 Speaker 2: Information is valuable. 174 00:10:01,080 --> 00:10:04,439 Speaker 1: Therefore, you don't 175 00:10:04,480 --> 00:10:06,840 Speaker 1: want to just break into a system and then immediately 176 00:10:06,840 --> 00:10:07,480 Speaker 1: get booted out. 177 00:10:07,520 --> 00:10:09,960 Speaker 2: Then what was the point of that, other than to say, 178 00:10:09,880 --> 00:10:12,079 Speaker 1: hey, I figured out you have a vulnerability? And if 179 00:10:12,080 --> 00:10:14,839 Speaker 1: that's your job, that's awesome.
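[The "unusual traffic spikes" tell described above can be sketched as a toy script. This is purely illustrative and not a real intrusion-detection tool; the byte counts and the threshold factor are made-up assumptions, and real monitoring would look at far more than per-interval volume.]

```python
from statistics import mean, stdev

def flag_spikes(byte_counts, factor=2.0):
    """Return indices of samples far above the baseline (mean + factor * stdev).

    A crude anomaly check: anything more than `factor` standard deviations
    above the average volume gets flagged for a closer look.
    """
    baseline = mean(byte_counts)
    spread = stdev(byte_counts)
    return [i for i, count in enumerate(byte_counts)
            if count > baseline + factor * spread]

# Hypothetical per-minute byte counts on a network link; minute 5 is a spike.
traffic = [1200, 1100, 1300, 1250, 1150, 90000, 1220, 1180]
print(flag_spikes(traffic))  # → [5]
```

In practice a defender would use a rolling baseline and per-host breakdowns rather than one global average, which is exactly why a patient attacker hides transfers inside already-busy periods.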
If you're a white hat 180 00:10:14,880 --> 00:10:17,880 Speaker 1: hacker and your job is, hey, we've got this system, 181 00:10:18,240 --> 00:10:20,240 Speaker 1: we think it's pretty secure, but we would like you 182 00:10:20,280 --> 00:10:21,760 Speaker 1: to really put it to the test, and if you 183 00:10:21,800 --> 00:10:24,520 Speaker 1: find any vulnerabilities, let us know, and then we can 184 00:10:24,559 --> 00:10:26,840 Speaker 1: go and address and fix them so 185 00:10:26,880 --> 00:10:28,880 Speaker 1: that the bad guys don't have a chance to do 186 00:10:28,920 --> 00:10:29,480 Speaker 1: it later. 187 00:10:29,800 --> 00:10:33,440 Speaker 2: That's a legitimate job. But in the movies, that's not 188 00:10:33,520 --> 00:10:34,080 Speaker 2: what you see. 189 00:10:34,120 --> 00:10:36,640 Speaker 1: You see a person going, like, I've got access, and 190 00:10:36,679 --> 00:10:41,760 Speaker 1: then almost immediately everything has gone pear-shaped, right? Like 191 00:10:41,800 --> 00:10:45,240 Speaker 1: there's alarms going off. And that's just not how it works, 192 00:10:45,360 --> 00:10:48,960 Speaker 1: not if you're doing it correctly anyway, exactly. Yeah. So 193 00:10:49,480 --> 00:10:52,800 Speaker 1: these movies, I mean, I appreciate it. Again, they try to 194 00:10:52,800 --> 00:10:55,720 Speaker 1: create dramatic tension, but they're not terribly accurate. 195 00:10:56,920 --> 00:10:59,440 Speaker 3: One thing that I did like about Hackers is the 196 00:10:59,440 --> 00:11:03,120 Speaker 3: fact that he uses social engineering to first gain access 197 00:11:03,160 --> 00:11:06,520 Speaker 3: to this network, which is something that's very, very common 198 00:11:06,600 --> 00:11:12,920 Speaker 3: with hackers in general, because humans are the first, the first 199 00:11:12,640 --> 00:11:14,199 Speaker 4: failure in any network.
200 00:11:14,240 --> 00:11:17,000 Speaker 3: They're the people, they're the ones that you can generally 201 00:11:17,040 --> 00:11:20,560 Speaker 3: go to and find some kind of vulnerability, because people 202 00:11:20,800 --> 00:11:24,560 Speaker 3: inherently trust each other. And that's just a thing that 203 00:11:24,640 --> 00:11:27,840 Speaker 3: humans need to understand: if somebody is asking you 204 00:11:27,920 --> 00:11:30,920 Speaker 3: questions whose answers you shouldn't necessarily give out, don't give them 205 00:11:30,920 --> 00:11:33,680 Speaker 3: out, because they might be social engineering you into giving 206 00:11:33,720 --> 00:11:37,839 Speaker 3: out those answers. Hackers also does a good job of 207 00:11:38,240 --> 00:11:42,959 Speaker 3: pointing people towards real-life hacker culture. For example, there 208 00:11:43,000 --> 00:11:45,360 Speaker 3: is one scene in Hackers where they read off the 209 00:11:45,400 --> 00:11:50,840 Speaker 3: Hacker Manifesto, which was originally found in a hacker magazine, 210 00:11:51,160 --> 00:11:53,280 Speaker 3: which you could only find if you were a part 211 00:11:53,320 --> 00:11:55,760 Speaker 3: of that culture. And then later on in the movie 212 00:11:55,760 --> 00:12:00,000 Speaker 3: they also show a real-life booklet, a documentation book 213 00:12:00,559 --> 00:12:06,200 Speaker 3: that was actually used by phone operators to, 214 00:12:06,800 --> 00:12:08,880 Speaker 3: you know, put up new towers and things like that 215 00:12:08,960 --> 00:12:11,840 Speaker 3: back in the nineteen eighties and nineties. So it has 216 00:12:12,000 --> 00:12:14,560 Speaker 3: these real-life parts of culture, but in reality the 217 00:12:14,640 --> 00:12:19,160 Speaker 3: movie still has those terrible Hollywood hacking parts 218 00:12:19,200 --> 00:12:19,680 Speaker 3: as well. 219 00:12:19,880 --> 00:12:22,720 Speaker 1: Yeah, I'm glad you brought up the social engineering.
I'm 220 00:12:22,760 --> 00:12:26,000 Speaker 1: actually going to do an episode specifically about social 221 00:12:26,040 --> 00:12:32,480 Speaker 1: engineering live at Dragon Con with our mutual friend Brian Brushwood. 222 00:12:33,640 --> 00:12:36,160 Speaker 1: So Brushwood's going to come on the show. There's no 223 00:12:36,200 --> 00:12:39,120 Speaker 1: one like a magician to tell you how to fool 224 00:12:39,240 --> 00:12:41,960 Speaker 1: people, right? How to lie to 225 00:12:42,000 --> 00:12:43,840 Speaker 2: people and get them to do the things you want 226 00:12:43,920 --> 00:12:44,319 Speaker 2: them to do. 227 00:12:44,920 --> 00:12:47,480 Speaker 1: So we're going to go into social engineering 228 00:12:47,480 --> 00:12:50,000 Speaker 1: big time on that episode. And I'm also glad you 229 00:12:50,040 --> 00:12:53,400 Speaker 1: mentioned things like the manuals; that definitely is a 230 00:12:53,400 --> 00:12:55,520 Speaker 1: big part of hacker culture, going all the way back 231 00:12:55,520 --> 00:12:59,120 Speaker 1: to those phone phreak days, where again the hackers were 232 00:12:59,200 --> 00:13:01,920 Speaker 1: not necessarily trying to take advantage of a system. They 233 00:13:01,960 --> 00:13:04,240 Speaker 1: just wanted to know how it worked. They were fascinated 234 00:13:04,640 --> 00:13:08,400 Speaker 1: by the way it actually performed, and so when you 235 00:13:08,480 --> 00:13:13,280 Speaker 1: got a manual, suddenly you had an insight into it.
236 00:13:13,320 --> 00:13:17,040 Speaker 1: Even if the manual didn't go into deep detail about 237 00:13:17,040 --> 00:13:20,120 Speaker 1: how the system as a whole operates on the back end, 238 00:13:20,320 --> 00:13:22,720 Speaker 1: it could give you enough insight, and then they start 239 00:13:22,840 --> 00:13:26,160 Speaker 1: understanding, like, oh wow, that's really cool that they have 240 00:13:26,360 --> 00:13:28,120 Speaker 1: this system set up in such a way where they 241 00:13:28,120 --> 00:13:30,920 Speaker 1: can route calls like this and they can dynamically change 242 00:13:30,960 --> 00:13:34,080 Speaker 1: things over. And I'm glad you brought that up, because, 243 00:13:34,760 --> 00:13:37,959 Speaker 1: once again, being a hacker doesn't necessarily mean you're a 244 00:13:38,040 --> 00:13:40,920 Speaker 1: terrible person at all. It may mean that you just 245 00:13:41,080 --> 00:13:44,520 Speaker 1: have a deep curiosity that is only satisfied by learning 246 00:13:44,520 --> 00:13:46,960 Speaker 1: how this stuff works. And it's not always easy to 247 00:13:47,000 --> 00:13:52,600 Speaker 1: do that. Information is not always publicly available at all times. 248 00:13:52,640 --> 00:13:55,199 Speaker 1: There have been times where it accidentally got publicly available. 249 00:13:55,480 --> 00:13:56,760 Speaker 2: That's again going 250 00:13:56,600 --> 00:13:58,800 Speaker 1: back to the phone phreaks; there were some phone manuals 251 00:13:59,240 --> 00:14:03,400 Speaker 1: that kind of were released to the public, not intentionally. 252 00:14:03,520 --> 00:14:07,160 Speaker 1: It also wasn't considered to be a huge problem until 253 00:14:07,200 --> 00:14:09,520 Speaker 1: the phone phreakers got hold of it, and suddenly you 254 00:14:09,559 --> 00:14:12,959 Speaker 1: had all these people making weird calls long distance all 255 00:14:12,960 --> 00:14:13,559 Speaker 1: over the place. 256 00:14:14,840 --> 00:14:15,400 Speaker 4: So funny.
257 00:14:15,559 --> 00:14:19,040 Speaker 1: Yeah, our next one is another case of people just 258 00:14:19,120 --> 00:14:22,000 Speaker 1: throwing out terms without really having any meaning to them. 259 00:14:22,000 --> 00:14:24,080 Speaker 1: I mean, the terms themselves have meaning, but not in 260 00:14:24,120 --> 00:14:26,520 Speaker 1: the context of the lines. And it was from a 261 00:14:26,520 --> 00:14:30,240 Speaker 1: CSI: New York episode in which a character says, and 262 00:14:30,280 --> 00:14:33,520 Speaker 1: this is a quote, I'll create a GUI interface using 263 00:14:33,600 --> 00:14:40,000 Speaker 1: Visual Basic, see if I can track an IP address. So, Shannon, 264 00:14:40,240 --> 00:14:42,080 Speaker 1: you want to track an IP address? Do you go 265 00:14:42,240 --> 00:14:45,160 Speaker 1: to the trouble of developing a graphical user interface? 266 00:14:46,560 --> 00:14:50,080 Speaker 3: No, what I would do is open up probably netcat 267 00:14:50,160 --> 00:14:52,760 Speaker 3: in my terminal so that I can gain access and 268 00:14:52,800 --> 00:14:55,720 Speaker 3: see what the heck's going on on that IP address. Yeah. 269 00:14:55,800 --> 00:14:59,160 Speaker 3: Or if it was wireless and I was on a 270 00:14:59,200 --> 00:15:01,760 Speaker 3: nearby network, I would probably use Wireshark and 271 00:15:01,760 --> 00:15:04,880 Speaker 3: a WiFi Pineapple. Like, you don't have to create 272 00:15:04,920 --> 00:15:09,120 Speaker 3: an entirely new graphical user interface to be able 273 00:15:09,160 --> 00:15:11,800 Speaker 3: to access an IP address and see what kind of 274 00:15:11,800 --> 00:15:14,600 Speaker 3: traffic is happening to and from it. 275 00:15:15,160 --> 00:15:16,080 Speaker 2: Yeah, it doesn't. 276 00:15:16,720 --> 00:15:20,560 Speaker 1: Creating a graphical user interface literally makes no sense.
I mean, 277 00:15:20,920 --> 00:15:24,400 Speaker 1: it has nothing to do with your ability to track 278 00:15:24,440 --> 00:15:28,280 Speaker 1: an IP address, to identify an IP address. A 279 00:15:28,360 --> 00:15:30,800 Speaker 1: graphical user interface is an 280 00:15:30,880 --> 00:15:35,920 Speaker 1: interface; it isn't in itself a thing that performs these functions. 281 00:15:36,000 --> 00:15:38,840 Speaker 1: It's just a way to visualize data and allow you 282 00:15:39,320 --> 00:15:43,160 Speaker 1: to interact with it in some way. So Windows is 283 00:15:43,200 --> 00:15:47,880 Speaker 1: a graphical user interface. It's a GUI. Any program 284 00:15:47,960 --> 00:15:52,080 Speaker 1: that has a graphical representation of information that allows you 285 00:15:52,120 --> 00:15:55,280 Speaker 1: to move things around, that's a GUI. It has nothing to 286 00:15:55,280 --> 00:15:59,080 Speaker 1: do with the actual function. It is separate from the function. 287 00:15:59,320 --> 00:16:03,600 Speaker 1: It is just a way of manifesting what that data 288 00:16:03,680 --> 00:16:06,640 Speaker 1: actually means. Now, you might go through the trouble of 289 00:16:06,720 --> 00:16:10,920 Speaker 1: creating some sort of visual data if you wanted to 290 00:16:11,280 --> 00:16:13,600 Speaker 1: explain it to someone else who wouldn't understand if you 291 00:16:13,680 --> 00:16:17,480 Speaker 1: just handed them lines of code or whatever. But that 292 00:16:17,520 --> 00:16:20,040 Speaker 1: doesn't even work in this case, because 293 00:16:20,040 --> 00:16:23,160 Speaker 1: we're talking about an IP address. With an IP address, 294 00:16:23,160 --> 00:16:25,760 Speaker 1: you don't need, like, a pie chart or, you know, 295 00:16:26,080 --> 00:16:28,520 Speaker 1: a graphical representation of what an IP 296 00:16:28,760 --> 00:16:29,360 Speaker 1: address is.
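[To underline the point made above, here's a minimal sketch of how little machinery looking at an IP address actually takes: a few lines against Python's standard ipaddress module, no GUI anywhere. The specific addresses are just illustrative examples.]

```python
import ipaddress

def describe_ip(addr: str) -> dict:
    """Classify an IP address string: version, private vs globally routable."""
    ip = ipaddress.ip_address(addr)
    return {
        "version": ip.version,     # 4 or 6
        "private": ip.is_private,  # RFC 1918 / other reserved space
        "global": ip.is_global,    # publicly routable
    }

print(describe_ip("192.168.1.10"))  # a private LAN address
print(describe_ip("8.8.8.8"))       # a public address (Google's DNS)
```

The point stands: an IP address is just data, and any code that inspects or tracks it is entirely independent of whatever interface, graphical or otherwise, you bolt on top.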
297 00:16:29,400 --> 00:16:31,960 Speaker 2: It's an IP address, exactly. 298 00:16:32,280 --> 00:16:34,280 Speaker 1: Yeah, this one kind of broke my brain 299 00:16:34,360 --> 00:16:36,560 Speaker 1: for about twenty minutes when I watched it. 300 00:16:37,680 --> 00:16:41,080 Speaker 1: We'll be back with more terrible hacking scenes from Hollywood 301 00:16:41,320 --> 00:16:52,760 Speaker 1: after this quick break. The next one I had on 302 00:16:52,800 --> 00:16:56,440 Speaker 1: the list was another example of hackers battling it out 303 00:16:56,480 --> 00:16:59,520 Speaker 1: in real time and seeing who could type the fastest. 304 00:16:59,760 --> 00:17:00,760 Speaker 2: This happens all the time. 305 00:17:00,800 --> 00:17:02,640 Speaker 1: There was also an episode of Chuck that did this, 306 00:17:02,720 --> 00:17:05,520 Speaker 1: where the best part of that episode of Chuck was 307 00:17:05,560 --> 00:17:08,240 Speaker 1: that the guy that Chuck was facing off against was played 308 00:17:08,240 --> 00:17:10,840 Speaker 1: by Freddie Wong and his character was named Freddie. So 309 00:17:11,320 --> 00:17:13,720 Speaker 1: I was like, oh my gosh, it's Freddie Wong, a 310 00:17:13,800 --> 00:17:14,520 Speaker 1: YouTube star 311 00:17:14,720 --> 00:17:15,320 Speaker 2: done good. 312 00:17:17,040 --> 00:17:20,440 Speaker 1: Yeah, it's fantastic. I can't remember, I think it's 313 00:17:20,520 --> 00:17:23,000 Speaker 1: like Chuck Versus the Hackathon or something like that. 314 00:17:23,080 --> 00:17:26,000 Speaker 1: It's along those lines, and Freddie Wong plays the elite 315 00:17:26,000 --> 00:17:28,600 Speaker 1: hacker that he goes up against.
But in this case, 316 00:17:28,640 --> 00:17:32,040 Speaker 1: I'm talking about Criminal Minds, which has Nicholas Brendon in it, 317 00:17:32,119 --> 00:17:34,840 Speaker 1: who I will always and forever think of as Xander 318 00:17:34,880 --> 00:17:36,800 Speaker 1: from Buffy the Vampire Slayer. 319 00:17:38,600 --> 00:17:38,800 Speaker 2: Yep. 320 00:17:38,880 --> 00:17:44,639 Speaker 1: And he's brought in to try and access another, I 321 00:17:44,640 --> 00:17:49,160 Speaker 1: think another employee's system, but the employee is kind of on 322 00:17:49,200 --> 00:17:52,159 Speaker 1: the lam, and she's working with other folks who are 323 00:17:52,160 --> 00:17:54,879 Speaker 1: trying to do this quietly without the rest of the 324 00:17:54,960 --> 00:17:59,760 Speaker 1: organization taking any notice of them. And so he ends 325 00:17:59,800 --> 00:18:04,520 Speaker 1: up getting into her system and trying to snoop around 326 00:18:04,560 --> 00:18:07,399 Speaker 1: a bit. She notices, and then they have this duel. 327 00:18:07,760 --> 00:18:09,760 Speaker 1: And as Shannon has already said, this is not really 328 00:18:09,800 --> 00:18:12,159 Speaker 1: how we see things play out, but it does have 329 00:18:12,240 --> 00:18:14,880 Speaker 1: the best line, I think, out of all the examples 330 00:18:14,880 --> 00:18:18,120 Speaker 1: we have here, which is "her GUI is mind-blowing." 331 00:18:21,280 --> 00:18:24,600 Speaker 3: And what you would actually see a hacker say is, well, 332 00:18:24,920 --> 00:18:25,959 Speaker 3: check out this code. 333 00:18:26,240 --> 00:18:29,359 Speaker 1: Yeah, yeah, instead all these little 334 00:18:29,400 --> 00:18:31,920 Speaker 1: windows pop up.
Also, it's funny, because in the sequence, 335 00:18:32,200 --> 00:18:34,920 Speaker 1: as they're typing, there's like one point where 336 00:18:35,560 --> 00:18:39,040 Speaker 1: Xander, I'll always call him Xander, is saying, well, here, 337 00:18:39,240 --> 00:18:41,959 Speaker 1: see what you think about this. And all that you 338 00:18:42,000 --> 00:18:44,760 Speaker 1: see on the screen is that one of the windows 339 00:18:44,960 --> 00:18:48,080 Speaker 1: on the screen is dragged down to like the lower 340 00:18:48,160 --> 00:18:48,800 Speaker 1: right corner. 341 00:18:49,400 --> 00:18:52,360 Speaker 2: Yes, and you're like, wow, you have a 342 00:18:52,400 --> 00:18:57,560 Speaker 1: mouse. So bad. But they're actually not even using computer mice, 343 00:18:57,600 --> 00:19:00,800 Speaker 1: they're just typing on the keyboard, and so really 344 00:19:00,840 --> 00:19:03,159 Speaker 1: all you're seeing are all these little windows being dragged 345 00:19:03,160 --> 00:19:06,560 Speaker 1: around the screen with no real way of doing that easily 346 00:19:06,800 --> 00:19:09,840 Speaker 1: if you're just using a keyboard. You're making 347 00:19:09,880 --> 00:19:11,720 Speaker 1: it too hard on yourself, honestly. 348 00:19:13,440 --> 00:19:16,280 Speaker 3: One thing about hackers that I have noticed, I 349 00:19:16,680 --> 00:19:19,080 Speaker 3: am not as good as a lot of my friends are, 350 00:19:19,119 --> 00:19:22,760 Speaker 3: but keyboard shortcuts are huge in the hacker culture. 351 00:19:23,080 --> 00:19:23,560 Speaker 4: Huge. 352 00:19:23,960 --> 00:19:27,440 Speaker 1: Well yeah, I mean, especially if you're doing the same 353 00:19:27,480 --> 00:19:29,720 Speaker 1: sort of code over and over, if you're going to 354 00:19:29,760 --> 00:19:33,199 Speaker 1: be using the same sort of strings repeatedly, having, like, 355 00:19:33,320 --> 00:19:37,399 Speaker 1: various shortcuts and macros set up saves you so much time.
356 00:19:38,640 --> 00:19:44,239 Speaker 1: It's way more efficient. And you know, I totally get that. 357 00:19:44,280 --> 00:19:47,080 Speaker 1: I've worked on documents where that sort of thing made 358 00:19:47,080 --> 00:19:49,240 Speaker 1: a lot of sense. And I don't even do coding, 359 00:19:49,680 --> 00:19:52,440 Speaker 1: right? I'm talking about working mostly in the 360 00:19:52,520 --> 00:19:54,760 Speaker 1: legal world. I don't like to talk about it. It 361 00:19:54,880 --> 00:19:56,639 Speaker 1: was a long time ago, and I really like to 362 00:19:56,720 --> 00:20:01,800 Speaker 1: leave that chapter behind. I was working for lawyers. I 363 00:20:01,920 --> 00:20:06,960 Speaker 1: wasn't being pursued by them. Another great example: Weird Science. 364 00:20:07,160 --> 00:20:09,520 Speaker 1: This one sort of gets a pass from me too, 365 00:20:09,560 --> 00:20:13,320 Speaker 1: because it's such a goofy movie. It's one of those 366 00:20:13,320 --> 00:20:16,760 Speaker 1: films from the eighties where the basic 367 00:20:16,800 --> 00:20:19,719 Speaker 1: premise is kind of squicky, well, more than kind of squicky, 368 00:20:19,720 --> 00:20:23,560 Speaker 1: because you've got two high school males who decide to 369 00:20:23,640 --> 00:20:27,720 Speaker 1: try and create a girl so that they can do 370 00:20:27,840 --> 00:20:32,320 Speaker 1: various sexual things with her, which is pretty awful. But 371 00:20:32,359 --> 00:20:34,000 Speaker 1: it turns out that she ends up having way 372 00:20:34,000 --> 00:20:36,440 Speaker 1: more intelligence than either of them and a lot more 373 00:20:36,480 --> 00:20:39,960 Speaker 1: agency than either of them. So it's okay, because it 374 00:20:39,960 --> 00:20:42,200 Speaker 1: could have gone way worse than that.
I know, because 375 00:20:42,200 --> 00:20:45,120 Speaker 1: I've watched a movie recently that took the other path 376 00:20:45,160 --> 00:20:48,280 Speaker 1: and it was terrible. It was for a different podcast. 377 00:20:49,440 --> 00:20:54,120 Speaker 1: So this one also has another depiction of characters trying 378 00:20:54,160 --> 00:20:58,720 Speaker 1: to navigate through a three-dimensional virtual environment in order 379 00:20:58,760 --> 00:21:03,000 Speaker 1: to access a secure system. In this case, they're trying 380 00:21:03,040 --> 00:21:08,399 Speaker 1: to create a simulated brain for their created female, but 381 00:21:08,960 --> 00:21:12,720 Speaker 1: the computer that Wyatt has is only capable of getting 382 00:21:12,720 --> 00:21:14,960 Speaker 1: her intelligence up to fifth grade level, and they need 383 00:21:15,000 --> 00:21:18,760 Speaker 1: more than that, so they tap into some, as far 384 00:21:18,760 --> 00:21:22,840 Speaker 1: as I can tell, unidentified government agency to get more 385 00:21:22,880 --> 00:21:27,880 Speaker 1: computing power, which in this film is equated to electricity, 386 00:21:28,080 --> 00:21:32,239 Speaker 1: I guess, because if you watch the sequence when they 387 00:21:32,320 --> 00:21:36,520 Speaker 1: get access to the computers, everything at the government center 388 00:21:36,600 --> 00:21:38,840 Speaker 1: is going crazy. You're seeing these reel-to-reel tapes 389 00:21:38,880 --> 00:21:43,199 Speaker 1: that are spinning super fast. You see lights flashing, and 390 00:21:43,240 --> 00:21:46,520 Speaker 1: then a power surge ends up blasting all of the 391 00:21:46,560 --> 00:21:50,359 Speaker 1: electronics at Wyatt's home, like the microwave, and there's like 392 00:21:51,400 --> 00:21:55,720 Speaker 1: a smoke detector.
I think, that explodes, and so that's 393 00:21:55,760 --> 00:21:59,159 Speaker 1: what zooms in enough power to not only boost the 394 00:21:59,200 --> 00:22:04,360 Speaker 1: intelligence of the character up to superhuman levels, but also 395 00:22:04,680 --> 00:22:10,760 Speaker 1: somehow magically brings to life this woman. I'd say ninety 396 00:22:10,800 --> 00:22:17,160 Speaker 1: seven percent accurate. Let's take a quick break to thank 397 00:22:17,200 --> 00:22:17,920 Speaker 1: our sponsor. 398 00:22:18,560 --> 00:22:19,320 Speaker 2: Are you hiring? 399 00:22:19,560 --> 00:22:22,240 Speaker 1: Do you know where to post your job to find 400 00:22:22,280 --> 00:22:25,560 Speaker 1: the best candidates? Posting your job in one place isn't 401 00:22:25,720 --> 00:22:28,680 Speaker 1: enough to find quality candidates. If you want to find 402 00:22:28,720 --> 00:22:31,280 Speaker 1: the perfect hire, you need to post your job on 403 00:22:31,480 --> 00:22:34,520 Speaker 1: all the top job sites, and now you can, with 404 00:22:34,680 --> 00:22:38,120 Speaker 1: ZipRecruiter dot com. You can post your job to one 405 00:22:38,240 --> 00:22:42,359 Speaker 1: hundred plus job sites, including social media networks like Facebook 406 00:22:42,400 --> 00:22:46,439 Speaker 1: and Twitter, all with a single click. Find candidates in 407 00:22:46,600 --> 00:22:50,960 Speaker 1: any city or industry nationwide. Just post once and watch 408 00:22:50,960 --> 00:22:54,480 Speaker 1: your qualified candidates roll in to ZipRecruiter's easy to 409 00:22:54,600 --> 00:22:58,920 Speaker 1: use interface. There's no juggling emails or calls to your office. 410 00:22:59,280 --> 00:23:02,919 Speaker 1: You can quickly screen candidates, rate them, and hire the 411 00:23:03,000 --> 00:23:06,560 Speaker 1: right person fast.
Find out today why ZipRecruiter has 412 00:23:06,600 --> 00:23:10,399 Speaker 1: been used by over one million businesses, and right now 413 00:23:10,680 --> 00:23:13,520 Speaker 1: listeners of tech Stuff can post jobs on ZipRecruiter 414 00:23:13,600 --> 00:23:18,520 Speaker 1: for free by going to ZipRecruiter dot com slash first. 415 00:23:19,200 --> 00:23:23,760 Speaker 1: That's ZipRecruiter dot com slash first. F I R S 416 00:23:23,840 --> 00:23:26,560 Speaker 1: T. One more time, try it for free. Go to 417 00:23:26,720 --> 00:23:29,280 Speaker 1: ZipRecruiter dot com slash first. 418 00:23:29,640 --> 00:23:35,640 Speaker 3: Oh, so this is a great example, the Weird Science one. 419 00:23:35,880 --> 00:23:39,320 Speaker 3: This segment is a great example of what you can 420 00:23:39,400 --> 00:23:42,800 Speaker 3: actually do with more computing power. So in real life, 421 00:23:43,080 --> 00:23:46,239 Speaker 3: we have quantum computing, which people are currently working on 422 00:23:46,320 --> 00:23:51,520 Speaker 3: to be able to break encryption that is not currently vulnerable, 423 00:23:52,080 --> 00:23:55,880 Speaker 3: but that requires more computing power. That's what quantum computing 424 00:23:55,920 --> 00:23:59,639 Speaker 3: is all about, not necessarily more electricity. 425 00:23:59,080 --> 00:24:03,000 Speaker 1: Right. Yeah, quantum computing, you know, using qubits, where you 426 00:24:03,040 --> 00:24:06,480 Speaker 1: have quantum bits that represent not just a zero, not 427 00:24:06,520 --> 00:24:09,960 Speaker 1: just a one, but both, and technically all values in 428 00:24:10,000 --> 00:24:15,000 Speaker 1: between, means that you can complete calculations in parallel. And 429 00:24:15,119 --> 00:24:18,080 Speaker 1: this is great for certain types of computing problems.
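[Editor's note: to put rough numbers on the quantum advantage being discussed, a classical brute-force key search expects to try about half the keyspace, while Grover's search algorithm needs only on the order of (π/4)·√N quantum queries. The back-of-the-envelope sketch below uses illustrative key sizes; it is a sketch of the scaling argument, not of any real attack.]

```python
import math

def classical_tries(bits: int) -> float:
    """Expected classical brute-force attempts: half of a 2**bits keyspace."""
    return 2 ** bits / 2

def grover_queries(bits: int) -> float:
    """Approximate Grover iterations: (pi/4) * sqrt(keyspace size)."""
    return (math.pi / 4) * math.sqrt(2 ** bits)

# Compare the two for a couple of illustrative symmetric key sizes.
for bits in (56, 128):
    print(f"{bits}-bit key: classical ~{classical_tries(bits):.2e}, "
          f"Grover ~{grover_queries(bits):.2e}")
```

Note the hedge: Grover's quadratic speedup applies to brute-force search of symmetric keys; the threat to RSA-style public-key encryption comes from Shor's factoring algorithm, which is a much more dramatic speedup, and both require far more stable qubits than any machine had at the time of this episode.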
While 430 00:24:18,119 --> 00:24:21,520 Speaker 1: you can have those parallel computing problems that are easily 431 00:24:21,560 --> 00:24:25,359 Speaker 1: solved this way, other computing problems are not parallel, and 432 00:24:25,400 --> 00:24:28,440 Speaker 1: in those cases, a quantum computer is not necessarily going 433 00:24:28,480 --> 00:24:30,560 Speaker 1: to be any better than a classical computer. In fact, 434 00:24:30,560 --> 00:24:33,760 Speaker 1: it may be worse, depending upon how many qubits the 435 00:24:33,840 --> 00:24:34,800 Speaker 1: quantum computer has. 436 00:24:34,840 --> 00:24:37,040 Speaker 2: If it has enough, then it'll chug along. 437 00:24:37,720 --> 00:24:42,119 Speaker 1: But for those parallel problems, including encryption, quantum computers will 438 00:24:42,280 --> 00:24:45,639 Speaker 1: transform our world. The encryption that we rely upon today 439 00:24:46,040 --> 00:24:50,360 Speaker 1: will be trivial if you have a significantly powerful quantum computer, 440 00:24:50,520 --> 00:24:52,800 Speaker 1: because it will be able to go through all the 441 00:24:52,880 --> 00:24:57,639 Speaker 1: possible answers that are the basis for your encryption, typically 442 00:24:57,720 --> 00:25:04,080 Speaker 1: very large hash numbers, and assign probabilities 443 00:25:04,080 --> 00:25:06,600 Speaker 1: to them and figure out things like your encryption keys 444 00:25:07,200 --> 00:25:10,640 Speaker 1: very, very quickly. So that is interesting. But as you say, 445 00:25:11,800 --> 00:25:15,119 Speaker 1: it's not something that's done by pumping 446 00:25:15,119 --> 00:25:19,480 Speaker 1: more electricity through a power line, exactly. That would be 447 00:25:19,520 --> 00:25:22,760 Speaker 1: a little simplistic. Our next example is of course 448 00:25:22,800 --> 00:25:25,200 Speaker 1: the most famous one in my mind.
It's the one 449 00:25:25,240 --> 00:25:28,840 Speaker 1: that I spent an episode ranting about previously on tech Stuff, 450 00:25:28,840 --> 00:25:30,800 Speaker 1: so I won't go into too much detail. It is 451 00:25:30,920 --> 00:25:37,800 Speaker 1: Independence Day. It's like Kickbook movie. Yeah, I gotta tell you, 452 00:25:38,320 --> 00:25:41,520 Speaker 1: if an alien race ever attacks us, I want Jeff 453 00:25:41,520 --> 00:25:44,600 Speaker 1: Goldblum to develop the computer virus that we're going to 454 00:25:44,600 --> 00:25:47,320 Speaker 1: put on their ships so that they can't 455 00:25:47,320 --> 00:25:49,720 Speaker 1: attack us. 456 00:25:50,119 --> 00:25:52,439 Speaker 3: So this is a great example because he was, 457 00:25:52,520 --> 00:25:55,840 Speaker 3: without any prior knowledge, able to get 458 00:25:55,880 --> 00:25:59,040 Speaker 3: a computer virus from his Mac laptop onto an alien 459 00:25:59,240 --> 00:26:01,760 Speaker 3: spacecraft and be able to basically take them over and 460 00:26:01,760 --> 00:26:02,439 Speaker 4: shut them down. 461 00:26:02,640 --> 00:26:07,960 Speaker 1: Yeah, so there's so many things wrong with this one. 462 00:26:08,119 --> 00:26:12,919 Speaker 1: We cannot be certain that any intelligent alien civilization 463 00:26:13,000 --> 00:26:17,520 Speaker 1: out there uses computers that remotely resemble the way our 464 00:26:17,560 --> 00:26:22,199 Speaker 1: computers work. That's problem one. Problem two, we can't be 465 00:26:22,280 --> 00:26:25,680 Speaker 1: certain that a virus that we would create for Earth 466 00:26:25,720 --> 00:26:29,240 Speaker 1: based computers would ever be transferable onto an alien craft. 467 00:26:29,440 --> 00:26:31,320 Speaker 2: Problem three, we don't even know how 468 00:26:31,160 --> 00:26:34,040 Speaker 1: we would do that, right? Like, do they have 469 00:26:34,280 --> 00:26:38,960 Speaker 1: universal serial ports or something?
How do you get 470 00:26:39,840 --> 00:26:45,280 Speaker 1: your stuff, your program, into alien technology? Even if 471 00:26:45,320 --> 00:26:50,680 Speaker 1: it's wireless, there are wireless protocols. We don't 472 00:26:50,760 --> 00:26:56,240 Speaker 1: just have computers shooting off random radio waves. They have 473 00:26:56,280 --> 00:26:59,639 Speaker 1: to follow very specific rules for computers to communicate with 474 00:26:59,680 --> 00:27:00,000 Speaker 1: each other. 475 00:27:00,680 --> 00:27:02,320 Speaker 2: Everything about this is wrong. 476 00:27:03,080 --> 00:27:06,320 Speaker 3: And given the premise of this movie, I highly doubt 477 00:27:06,359 --> 00:27:10,719 Speaker 3: that their alien technology would even remotely compare to ours. 478 00:27:11,480 --> 00:27:12,560 Speaker 4: I'm sure that they would be 479 00:27:12,560 --> 00:27:16,959 Speaker 3: much, much stronger and much more advanced than 480 00:27:17,000 --> 00:27:17,440 Speaker 3: our own. 481 00:27:17,840 --> 00:27:20,680 Speaker 1: Yeah, and some of the arguments are that, oh, but see, 482 00:27:20,720 --> 00:27:23,560 Speaker 1: the whole point of the movie is that the United 483 00:27:23,600 --> 00:27:27,119 Speaker 1: States took technology from the aliens and used that to 484 00:27:27,240 --> 00:27:30,320 Speaker 1: boost our own technology so that we would advance faster. 485 00:27:30,480 --> 00:27:36,119 Speaker 1: And my thought to that is: how? Considering that 486 00:27:36,200 --> 00:27:39,080 Speaker 1: all the technology that we rely upon today is based 487 00:27:39,119 --> 00:27:44,480 Speaker 1: off of very, very well documented advances in engineering, 488 00:27:44,920 --> 00:27:48,360 Speaker 1: none of which are so dramatic as to say, oh, 489 00:27:48,440 --> 00:27:50,360 Speaker 1: that's when the aliens arrived. 490 00:27:50,720 --> 00:27:52,920 Speaker 4: This is so great. I love that one.
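[Editor's note: the protocol point is easy to demonstrate. Both ends of any link, wired or wireless, have to agree byte for byte on how messages are framed before a single bit of payload can be understood. A minimal sketch, using an invented 4-byte length-prefix format chosen only for illustration:]

```python
import struct

def frame(payload: bytes) -> bytes:
    """Prefix the payload with its length as a big-endian 32-bit integer."""
    return struct.pack(">I", len(payload)) + payload

def unframe(data: bytes) -> bytes:
    """Recover the payload; fails if the sender used a different convention."""
    (length,) = struct.unpack(">I", data[:4])
    if length != len(data) - 4:
        raise ValueError("malformed frame: length header does not match body")
    return data[4 : 4 + length]

msg = frame(b"hello")
assert unframe(msg) == b"hello"

# A receiver that assumes little-endian lengths misreads the very same bytes:
(wrong_length,) = struct.unpack("<I", msg[:4])
print(wrong_length)  # -> 83886080, not 5
```

Even a disagreement as small as byte order turns the message into garbage, which is the mundane version of why a laptop could not plausibly talk to a mothership that shares no conventions with it at all.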
491 00:27:53,160 --> 00:27:53,360 Speaker 2: Now. 492 00:27:53,560 --> 00:27:56,359 Speaker 1: I included this next one just for you, Shannon, because 493 00:27:56,600 --> 00:27:59,760 Speaker 1: I know that you are a lover of Japanese culture, 494 00:28:00,640 --> 00:28:06,560 Speaker 1: and so I found a great anime scene from Tengen 495 00:28:06,680 --> 00:28:12,680 Speaker 1: Toppa Gurren Lagann: Gurren-hen, and it involves a character 496 00:28:13,320 --> 00:28:17,439 Speaker 1: going into a virtual world as a virtual avatar. 497 00:28:17,560 --> 00:28:19,360 Speaker 2: Runs through a maze, as we've 498 00:28:19,119 --> 00:28:23,720 Speaker 1: seen before, finds a lock box that apparently has the 499 00:28:23,800 --> 00:28:27,159 Speaker 1: data he wants to get at, takes out a green 500 00:28:27,200 --> 00:28:30,399 Speaker 1: glowing key, puts it in the lock box, tries to unlock it. 501 00:28:30,480 --> 00:28:34,639 Speaker 1: It doesn't work, so, as you do, his virtual avatar 502 00:28:35,160 --> 00:28:38,080 Speaker 1: headbutts the box repeatedly till it breaks open. 503 00:28:38,520 --> 00:28:40,880 Speaker 2: He grabs hold of a red sphere, which 504 00:28:40,680 --> 00:28:44,040 Speaker 1: represents the data he wants, and then his character eats 505 00:28:44,120 --> 00:28:46,400 Speaker 1: the data, and that's how they analyze it. 506 00:28:46,720 --> 00:28:48,959 Speaker 2: And boy, do I wish that's the way it worked. 507 00:28:50,280 --> 00:28:53,440 Speaker 4: Oh, that's hilarious. I just, I hope that it tasted good. 508 00:28:53,720 --> 00:28:56,240 Speaker 1: Yeah, I mean, it is a virtual character, 509 00:28:56,320 --> 00:28:59,600 Speaker 1: so I guess you could technically assume it tastes any 510 00:28:59,640 --> 00:29:03,080 Speaker 1: way you like it to taste.
I watched this and 511 00:29:03,120 --> 00:29:07,360 Speaker 1: I thought, well, this is delightful, but it's also, 512 00:29:07,400 --> 00:29:09,440 Speaker 1: again, not trying to play this as this is how 513 00:29:09,520 --> 00:29:12,800 Speaker 1: real hackers are. This is obviously a fantasy sci-fi 514 00:29:12,960 --> 00:29:16,880 Speaker 1: world that we're looking at, but one of the most 515 00:29:17,000 --> 00:29:20,560 Speaker 1: entertaining versions of that meme that I've ever seen. 516 00:29:21,520 --> 00:29:24,160 Speaker 3: I just love it, and, like, just watching that scene 517 00:29:24,160 --> 00:29:26,360 Speaker 3: makes me want to watch the entire 518 00:29:26,880 --> 00:29:29,560 Speaker 4: anime, because it's just so ridiculous. 519 00:29:29,840 --> 00:29:30,560 Speaker 2: Yeah. 520 00:29:30,800 --> 00:29:33,080 Speaker 4: Yeah, it's like somebody trying 521 00:29:32,840 --> 00:29:37,720 Speaker 3: to brute force, but their implementation of brute forcing is headbutts. 522 00:29:37,840 --> 00:29:42,680 Speaker 1: Yeah, literal brute force, like not putting password after 523 00:29:42,720 --> 00:29:47,920 Speaker 1: password attempt through a password cracker, nothing like that. It's 524 00:29:48,080 --> 00:29:54,080 Speaker 1: literally attacking a virtual box with a virtual headbutt. It's wonderful. 525 00:29:54,560 --> 00:29:56,560 Speaker 1: The last little bit I have on here, this one, 526 00:29:56,640 --> 00:29:58,880 Speaker 1: this one doesn't really count, because it was done on purpose.
527 00:29:59,360 --> 00:30:01,680 Speaker 1: But there was a segment on the Late Show with 528 00:30:01,680 --> 00:30:04,960 Speaker 1: Stephen Colbert, who had Michael Ian Black on, Michael Ian 529 00:30:04,960 --> 00:30:09,480 Speaker 1: Black who was in The State and tons of other stuff, 530 00:30:09,680 --> 00:30:15,240 Speaker 1: and Michael Ian Black explained that he had dramatic acting 531 00:30:15,400 --> 00:30:17,920 Speaker 1: chops that he felt he never got a chance to 532 00:30:18,000 --> 00:30:21,200 Speaker 1: really exercise. He's known as a comedic actor, and 533 00:30:21,280 --> 00:30:24,800 Speaker 1: his dream role, the one he always wanted, is that of a 534 00:30:24,880 --> 00:30:29,480 Speaker 1: hacker on a police procedural show, because it meant that 535 00:30:29,520 --> 00:30:33,160 Speaker 1: he could dress in nice clothes, sit down behind a desk, 536 00:30:33,760 --> 00:30:36,600 Speaker 1: and everything he would do would seem really important. He says 537 00:30:36,600 --> 00:30:39,880 Speaker 1: it's a dream job. And so then they do a 538 00:30:39,920 --> 00:30:43,560 Speaker 1: segment where Stephen Colbert's show gets hacked, and Michael Ian 539 00:30:43,600 --> 00:30:46,960 Speaker 1: Black comes back out to fight off against the hack. 540 00:30:47,440 --> 00:30:50,480 Speaker 1: It is a send-up of all the tropes we've 541 00:30:50,520 --> 00:30:53,200 Speaker 1: been mentioning so far in this episode, and if you 542 00:30:53,280 --> 00:30:55,920 Speaker 1: have not seen it, I recommend going out and watching 543 00:30:55,920 --> 00:30:57,200 Speaker 1: it, because it's hilarious. 544 00:30:58,360 --> 00:31:01,000 Speaker 3: My favorite part is the fact that he comes out 545 00:31:01,000 --> 00:31:04,880 Speaker 3: with black lipstick on and these, like, biker gloves, which, 546 00:31:05,360 --> 00:31:08,360 Speaker 3: first off, would be incredibly hard to type in, yep, 547 00:31:08,800 --> 00:31:12,000 Speaker 3: wearing those gloves.
And second off, like, the hackers I know 548 00:31:12,160 --> 00:31:16,640 Speaker 3: do not wear black lipstick, especially if they're male. Yeah. 549 00:31:16,880 --> 00:31:19,160 Speaker 1: One of the first things he says is 550 00:31:19,160 --> 00:31:21,280 Speaker 1: talking about how he's gonna be typing, and then his 551 00:31:21,320 --> 00:31:24,960 Speaker 1: typing will in no way be hampered by the fact 552 00:31:24,960 --> 00:31:28,240 Speaker 1: he's wearing fingerless biker gloves, so he actually 553 00:31:28,280 --> 00:31:31,240 Speaker 1: brings attention to that. And there's 554 00:31:31,240 --> 00:31:33,800 Speaker 1: a funny moment: he's bent over Stephen Colbert's desk and 555 00:31:33,800 --> 00:31:37,120 Speaker 1: he's typing furiously, not really typing, he's just slapping at 556 00:31:37,160 --> 00:31:40,160 Speaker 1: the keyboard furiously. And then it becomes clear that his 557 00:31:40,280 --> 00:31:45,320 Speaker 1: lines are on the screen and he scrolled past them, 558 00:31:45,600 --> 00:31:48,120 Speaker 1: and so he's like, I'm trying to find my line. 559 00:31:50,080 --> 00:31:52,400 Speaker 1: But it's great because they just roll 560 00:31:52,440 --> 00:31:56,160 Speaker 1: with it, they just keep on going. So it's fantastic 561 00:31:56,240 --> 00:32:01,280 Speaker 1: to see that kind of self-aware take on the 562 00:32:01,280 --> 00:32:05,040 Speaker 1: way hackers and hacking culture have been portrayed in the 563 00:32:05,160 --> 00:32:08,800 Speaker 1: popular media. We're going to take another quick break and 564 00:32:08,840 --> 00:32:10,880 Speaker 1: we'll be right back with more of the worst hacking 565 00:32:10,920 --> 00:32:22,840 Speaker 1: scenes from Hollywood.
We've harped on a lot of different 566 00:32:22,840 --> 00:32:27,400 Speaker 1: shows and movies about being terrible or being inaccurate, but 567 00:32:27,440 --> 00:32:30,560 Speaker 1: there are some examples out there that are trying really 568 00:32:30,680 --> 00:32:35,280 Speaker 1: hard to be respectful and realistic when it comes to 569 00:32:35,360 --> 00:32:38,720 Speaker 1: hacker culture. One of the ones that leaps to mind 570 00:32:39,720 --> 00:32:44,760 Speaker 1: is Mister Robot, a series where hacking plays a very 571 00:32:44,800 --> 00:32:48,080 Speaker 1: important part. It's not like it's all about hacking, but 572 00:32:48,160 --> 00:32:50,760 Speaker 1: that's an important plot device in several episodes. 573 00:32:51,480 --> 00:32:52,960 Speaker 2: And they name 574 00:32:53,320 --> 00:32:59,040 Speaker 1: real security firms, they name real penetration teams that actually 575 00:32:59,080 --> 00:33:01,640 Speaker 1: go and do that sort of testing. They name real 576 00:33:01,720 --> 00:33:07,960 Speaker 1: software packages that are meant to either help someone commit 577 00:33:08,080 --> 00:33:10,120 Speaker 1: one of these attacks or defend against them. 578 00:33:10,520 --> 00:33:11,880 Speaker 2: They use real products. 579 00:33:11,920 --> 00:33:14,400 Speaker 1: So it's clear that the people who are working on 580 00:33:14,440 --> 00:33:16,560 Speaker 1: the show do their research. 581 00:33:18,200 --> 00:33:20,600 Speaker 3: Yes, it's very clear, and in fact, I have a 582 00:33:20,720 --> 00:33:24,320 Speaker 3: very close working relationship with the team over at Mister 583 00:33:24,400 --> 00:33:27,640 Speaker 3: Robot to make sure that they get our products right, 584 00:33:27,680 --> 00:33:31,560 Speaker 3: because they have shown very recently one of Hak5's 585 00:33:31,600 --> 00:33:33,320 Speaker 3: products, called the USB Rubber Ducky. 586 00:33:33,720 --> 00:33:34,360 Speaker 2: They have a
587 00:33:34,320 --> 00:33:38,560 Speaker 3: team over there that includes somebody who used to work 588 00:33:38,640 --> 00:33:41,840 Speaker 3: for the FBI, so they have actual people who have 589 00:33:41,920 --> 00:33:45,640 Speaker 3: worked with security and with privacy and they have a 590 00:33:45,840 --> 00:33:48,560 Speaker 3: general understanding, but they still like to talk to real 591 00:33:48,600 --> 00:33:50,800 Speaker 3: life hackers to make sure that everything that they show 592 00:33:50,840 --> 00:33:54,120 Speaker 3: on the show is legit and is an actual hack. 593 00:33:55,360 --> 00:33:57,640 Speaker 3: Mister Robot is such a great example because it gives 594 00:33:57,640 --> 00:34:02,480 Speaker 3: a perfect real-life scenario where you have people that 595 00:34:02,520 --> 00:34:03,360 Speaker 3: are fighting back. 596 00:34:03,440 --> 00:34:04,320 Speaker 4: You have people that are 597 00:34:04,240 --> 00:34:06,800 Speaker 3: fighting back against, you know, the big corps and trying 598 00:34:06,840 --> 00:34:10,120 Speaker 3: to do something that's right for the little guy, which 599 00:34:10,200 --> 00:34:14,400 Speaker 3: you do see a lot in real-life hacking cases. 600 00:34:15,160 --> 00:34:17,719 Speaker 3: A lot of times people will go after large corporations 601 00:34:17,760 --> 00:34:20,239 Speaker 3: because they don't necessarily agree with their tactics, or they 602 00:34:20,280 --> 00:34:22,759 Speaker 3: don't agree with how much they're paying their employees or 603 00:34:22,760 --> 00:34:26,160 Speaker 3: something like that. So I just, I love watching Mister 604 00:34:26,239 --> 00:34:28,279 Speaker 3: Robot because it gives me the chance to look at 605 00:34:28,280 --> 00:34:30,839 Speaker 3: it and be like, oh, what they're showing right 606 00:34:30,880 --> 00:34:32,800 Speaker 3: now is a real hack. 607 00:34:32,920 --> 00:34:36,240 Speaker 4: Like, they're actually showing this to a huge fan
608 00:34:36,040 --> 00:34:40,840 Speaker 3: base of people who wouldn't necessarily be interested in this 609 00:34:40,960 --> 00:34:44,600 Speaker 3: kind of stuff unless they were watching a show. So 610 00:34:45,560 --> 00:34:48,919 Speaker 3: I just love that they're bringing such a huge fan 611 00:34:49,000 --> 00:34:53,480 Speaker 3: base into what we do on a day-to-day basis. 612 00:34:53,800 --> 00:34:55,640 Speaker 1: And I like that not only are they taking these 613 00:34:55,640 --> 00:34:59,560 Speaker 1: steps to make sure that the stuff they portray is accurate, 614 00:35:00,400 --> 00:35:04,040 Speaker 1: but they are even going so far as to include certain aspects 615 00:35:04,040 --> 00:35:06,520 Speaker 1: that we do see reflected in the news. You were 616 00:35:06,600 --> 00:35:09,359 Speaker 1: mentioning this, Shannon, things like someone being on the 617 00:35:09,400 --> 00:35:12,880 Speaker 1: inside of a company and, for one reason or another, 618 00:35:13,400 --> 00:35:18,400 Speaker 1: they decide to aid someone who wants to access that 619 00:35:18,560 --> 00:35:21,239 Speaker 1: company's information, and they're doing so knowingly. It's not that 620 00:35:21,280 --> 00:35:25,080 Speaker 1: they've been tricked into it. We see this where we 621 00:35:25,160 --> 00:35:28,839 Speaker 1: have real hackers, and one 622 00:35:28,840 --> 00:35:30,880 Speaker 1: of the best tools you can have is making friends 623 00:35:30,880 --> 00:35:34,000 Speaker 1: with someone who works for the company that you want 624 00:35:34,040 --> 00:35:36,799 Speaker 1: to try and access, like you want to get into 625 00:35:36,840 --> 00:35:40,680 Speaker 1: their system.
And if it's someone who's disgruntled for whatever reason, 626 00:35:41,120 --> 00:35:44,480 Speaker 1: maybe they were looked over for promotion, maybe they don't 627 00:35:44,560 --> 00:35:47,600 Speaker 1: like what the company is doing, maybe the company's 628 00:35:47,600 --> 00:35:51,200 Speaker 1: purpose conflicts with their own ethical code, they might 629 00:35:51,320 --> 00:35:55,680 Speaker 1: feel like helping someone out is the right choice. And 630 00:35:55,760 --> 00:35:59,000 Speaker 1: even if they themselves are not the ones doing the 631 00:35:59,040 --> 00:36:02,319 Speaker 1: programming or whatever, when it comes to it, they 632 00:36:02,360 --> 00:36:04,880 Speaker 1: might be the ones who allow the access in the 633 00:36:04,880 --> 00:36:08,040 Speaker 1: first place. We do see that in real life, and 634 00:36:08,080 --> 00:36:10,480 Speaker 1: there are times where we have to ask the question, 635 00:36:10,600 --> 00:36:13,279 Speaker 1: like, was this someone from the inside, or was this 636 00:36:13,400 --> 00:36:16,680 Speaker 1: someone who attacked from the outside? 637 00:36:17,400 --> 00:36:19,160 Speaker 2: Again, going back to Sony. 638 00:36:19,640 --> 00:36:24,600 Speaker 1: There's still a lot of argument about who ultimately was 639 00:36:24,719 --> 00:36:28,160 Speaker 1: responsible for that.
At the time, the two big arguments 640 00:36:28,160 --> 00:36:30,800 Speaker 1: that were coming out were that it was either someone 641 00:36:30,800 --> 00:36:34,640 Speaker 1: in North Korea who had done this, or someone who 642 00:36:34,719 --> 00:36:38,400 Speaker 1: was in the pay of North Korea, a state 643 00:36:38,520 --> 00:36:41,600 Speaker 1: agent in other words, or it was someone who used 644 00:36:41,600 --> 00:36:44,000 Speaker 1: to work for Sony or was currently working for Sony, 645 00:36:44,520 --> 00:36:48,000 Speaker 1: and they did not like something that happened, and so 646 00:36:48,120 --> 00:36:51,200 Speaker 1: as a result ended up stealing a ton of information 647 00:36:51,320 --> 00:36:53,840 Speaker 1: and dumping it online for public review. 648 00:36:54,920 --> 00:36:56,319 Speaker 2: And then the 649 00:36:56,760 --> 00:36:59,560 Speaker 1: North Korea part just became a great smoke screen for 650 00:36:59,600 --> 00:37:03,840 Speaker 1: that. I saw that a lot of security experts, 651 00:37:03,880 --> 00:37:07,320 Speaker 1: at least at the time, felt more inclined to believe 652 00:37:07,360 --> 00:37:10,120 Speaker 1: it was the second possibility, that it was someone from 653 00:37:10,120 --> 00:37:15,520 Speaker 1: the inside and not necessarily a state-sponsored attack. I 654 00:37:15,560 --> 00:37:19,280 Speaker 1: don't know if that opinion ultimately changed. I've honestly lost 655 00:37:19,360 --> 00:37:21,840 Speaker 1: track of the Sony story at this point because it 656 00:37:21,920 --> 00:37:27,239 Speaker 1: kind of died down after that initial flare of controversy 657 00:37:27,320 --> 00:37:29,080 Speaker 1: around all the different elements. Yeah.
658 00:37:29,120 --> 00:37:32,239 Speaker 3: The interesting thing that I found about that is a 659 00:37:32,280 --> 00:37:35,800 Speaker 3: lot of the friends that I have in hacker culture 660 00:37:37,040 --> 00:37:39,759 Speaker 3: just move on, because there are so many different hacks 661 00:37:39,800 --> 00:37:42,480 Speaker 3: that you see in our day and age that they 662 00:37:42,480 --> 00:37:44,080 Speaker 3: don't have time to go back and 663 00:37:44,400 --> 00:37:45,960 Speaker 4: discuss the Sony hack. 664 00:37:46,680 --> 00:37:48,840 Speaker 3: And disclaimer, I did do a bunch of videos for 665 00:37:48,920 --> 00:37:54,719 Speaker 3: Signal by Sony, which was a different core company 666 00:37:54,760 --> 00:37:57,480 Speaker 3: inside of Sony that was doing these shows that I 667 00:37:57,560 --> 00:37:58,080 Speaker 3: was working with. 668 00:37:58,120 --> 00:38:00,000 Speaker 4: It wasn't the entertainment one that got hacked. 669 00:38:00,320 --> 00:38:02,839 Speaker 3: Yeah, but I definitely went to them when I got 670 00:38:02,840 --> 00:38:04,879 Speaker 3: that job, and I was like, you sure you want 671 00:38:04,880 --> 00:38:07,880 Speaker 3: to hire me? Because that thing happened to you guys, 672 00:38:07,920 --> 00:38:09,759 Speaker 3: and I'm kind of a hacker, so. 673 00:38:10,600 --> 00:38:11,960 Speaker 2: Yeah, you heard. 674 00:38:13,040 --> 00:38:17,520 Speaker 1: Yeah. So I'm pleased that Mister Robot is out there. 675 00:38:17,920 --> 00:38:21,440 Speaker 1: You were mentioning, before we went on, 676 00:38:21,480 --> 00:38:25,200 Speaker 1: before we started recording, that Silicon Valley also typically 677 00:38:25,480 --> 00:38:28,680 Speaker 1: does a pretty good job of portraying hacking in a 678 00:38:28,800 --> 00:38:30,120 Speaker 1: realistic way. 679 00:38:30,760 --> 00:38:33,960 Speaker 3: They do. So, Silicon Valley,
they actually hired one of 680 00:38:34,000 --> 00:38:36,120 Speaker 3: my friends, Rob Fuller, who you can see in the 681 00:38:36,160 --> 00:38:39,640 Speaker 3: credits of several of their episodes, to give them a 682 00:38:39,680 --> 00:38:43,520 Speaker 3: good overview, a good synopsis of what they should be 683 00:38:43,600 --> 00:38:46,960 Speaker 3: showing on their show and what would be fake, what 684 00:38:46,960 --> 00:38:49,960 Speaker 3: would be called out. So he helped, and he's actually 685 00:38:49,960 --> 00:38:52,600 Speaker 3: a penetration tester. He does this as his day job, 686 00:38:52,640 --> 00:38:55,080 Speaker 3: and he's been doing it for, you know, a couple 687 00:38:55,080 --> 00:38:58,400 Speaker 3: of decades, I believe, so he was the perfect person 688 00:38:58,480 --> 00:39:01,400 Speaker 3: for them to go to to say, you know, is 689 00:39:01,760 --> 00:39:02,200 Speaker 3: this right? 690 00:39:02,360 --> 00:39:02,960 Speaker 4: Is this correct? 691 00:39:03,560 --> 00:39:05,680 Speaker 3: Can you actually build a script for us 692 00:39:05,719 --> 00:39:09,359 Speaker 3: to show on camera so that we don't get 693 00:39:09,360 --> 00:39:11,399 Speaker 3: called out by our fan base, a lot of which 694 00:39:11,480 --> 00:39:14,360 Speaker 3: is going to be the Silicon Valley, you know, gurus 695 00:39:14,400 --> 00:39:17,120 Speaker 3: that actually do coding on a day-in and day- 696 00:39:17,120 --> 00:39:21,719 Speaker 3: out basis. So Mister Robot and Silicon Valley both have 697 00:39:21,920 --> 00:39:26,040 Speaker 3: hired on several people who have worked in security and 698 00:39:26,080 --> 00:39:29,880 Speaker 3: penetration testing, information technology, things of that nature, so that 699 00:39:29,920 --> 00:39:32,239 Speaker 3: they can get it right on camera.
And I think 700 00:39:32,239 --> 00:39:36,560 Speaker 3: that's very important, because what we're finding in 701 00:39:36,600 --> 00:39:39,760 Speaker 3: this genre, in this career path, is that. 702 00:39:39,680 --> 00:39:42,520 Speaker 4: There are a lot of people who lose. 703 00:39:42,280 --> 00:39:45,040 Speaker 3: Interest very quickly in it, because it is hard, 704 00:39:45,080 --> 00:39:48,839 Speaker 3: it is ever changing, and it's complicated, and you have 705 00:39:48,880 --> 00:39:50,279 Speaker 3: to go to school for it, and you have to 706 00:39:50,280 --> 00:39:54,600 Speaker 3: get certificates that increase your knowledge, and you have to 707 00:39:54,760 --> 00:39:57,479 Speaker 3: renew those certificates every year. 708 00:39:57,960 --> 00:40:00,719 Speaker 4: So it's expensive to stay in this career too. 709 00:40:00,800 --> 00:40:02,799 Speaker 3: Hopefully you get a job with 710 00:40:02,840 --> 00:40:05,239 Speaker 3: a company that, you know, will pay for those certificates 711 00:40:05,239 --> 00:40:09,680 Speaker 3: for you. Yeah. So the fact that they're showing these 712 00:40:09,760 --> 00:40:14,279 Speaker 3: real life scenarios on camera, I'm hoping that it will 713 00:40:14,320 --> 00:40:18,240 Speaker 3: increase interest in this genre of work, 714 00:40:18,320 --> 00:40:21,280 Speaker 3: because we really need more people to be interested in 715 00:40:21,440 --> 00:40:23,879 Speaker 3: it in the long term, because it is so, so 716 00:40:23,960 --> 00:40:27,600 Speaker 3: important for companies as a whole, especially if they're holding 717 00:40:27,680 --> 00:40:30,359 Speaker 3: user data, to be secure and to be private and 718 00:40:30,400 --> 00:40:33,319 Speaker 3: to be very conscious of what they're doing behind the 719 00:40:33,360 --> 00:40:36,000 Speaker 3: scenes, as opposed to just making a website pretty.
720 00:40:36,800 --> 00:40:38,400 Speaker 4: The things that are most important 721 00:40:38,000 --> 00:40:41,359 Speaker 3: to me are security and privacy, and it's very 722 00:40:41,400 --> 00:40:44,440 Speaker 3: important to me that more and more companies get involved 723 00:40:44,440 --> 00:40:48,640 Speaker 3: with this information and get a much broader understanding of 724 00:40:48,680 --> 00:40:51,680 Speaker 3: how important it is to actually pay money to make 725 00:40:51,719 --> 00:40:53,200 Speaker 3: sure that you have good security. 726 00:40:54,000 --> 00:40:56,600 Speaker 1: And to that end, I mean, there's a 727 00:40:56,640 --> 00:40:59,440 Speaker 1: movie that's coming out later on this year, I believe, 728 00:40:59,520 --> 00:41:03,120 Speaker 1: called I.T. It's got Pierce Brosnan in it. 729 00:41:03,160 --> 00:41:07,880 Speaker 1: And in that film, privacy and 730 00:41:07,880 --> 00:41:10,319 Speaker 1: security play a huge part, as well as 731 00:41:10,320 --> 00:41:13,320 Speaker 1: the Internet of Things, which again is a great illustration 732 00:41:13,440 --> 00:41:16,279 Speaker 1: of why security is so important. We have more and 733 00:41:16,320 --> 00:41:21,160 Speaker 1: more devices that are opening up opportunities to be 734 00:41:21,320 --> 00:41:24,160 Speaker 1: a point of entry for a hacker. Right? Like, if 735 00:41:24,160 --> 00:41:28,120 Speaker 1: you haven't designed your IoT device 736 00:41:28,320 --> 00:41:32,040 Speaker 1: to also be secure and encrypted, that's a potential way in, 737 00:41:32,560 --> 00:41:37,200 Speaker 1: depending upon how it communicates with the rest of your devices.
738 00:41:37,960 --> 00:41:41,600 Speaker 1: So in this movie I.T. that's coming out, Pierce 739 00:41:41,640 --> 00:41:45,839 Speaker 1: Brosnan's playing like a Tony Stark-like business guy 740 00:41:46,400 --> 00:41:49,839 Speaker 1: who ends up having an IT guy come in 741 00:41:49,880 --> 00:41:51,759 Speaker 1: and help him out when he's giving a presentation and 742 00:41:51,880 --> 00:41:56,120 Speaker 1: the technology is failing. The IT guy gets it turned around, 743 00:41:56,360 --> 00:41:59,280 Speaker 1: and so Pierce Brosnan says, hey, come back to my house, 744 00:41:59,480 --> 00:42:01,240 Speaker 1: I'd like you to help me out with some stuff. 745 00:42:01,239 --> 00:42:03,480 Speaker 1: And the guy's like, okay, sure, and he comes by, 746 00:42:03,600 --> 00:42:06,360 Speaker 1: and the guy's house has got all this high tech equipment, 747 00:42:06,440 --> 00:42:10,000 Speaker 1: most of which is not really working up to the. 748 00:42:09,840 --> 00:42:10,920 Speaker 2: Possibility that it could. 749 00:42:11,280 --> 00:42:14,840 Speaker 1: But he also has a young, like a teenage daughter 750 00:42:15,640 --> 00:42:18,279 Speaker 1: that the IT guy ends up becoming fixated on. So 751 00:42:18,320 --> 00:42:21,680 Speaker 1: then it becomes a psychological thriller where the IT guy 752 00:42:21,880 --> 00:42:24,359 Speaker 1: who was upgrading all the systems is really using them 753 00:42:24,360 --> 00:42:28,640 Speaker 1: to spy on people, and when he's rebuffed, he uses 754 00:42:28,719 --> 00:42:31,759 Speaker 1: them to terrorize people. So it becomes kind of a 755 00:42:31,760 --> 00:42:36,200 Speaker 1: psychological thriller slash horror movie that's IT based and Internet 756 00:42:36,200 --> 00:42:38,560 Speaker 1: of Things based. And while that is going to be 757 00:42:39,480 --> 00:42:44,479 Speaker 1: pushed to the limit for dramatic purposes, there's 758 00:42:44,520 --> 00:42:45,800 Speaker 1: a lot there.
759 00:42:45,880 --> 00:42:48,840 Speaker 2: That you could say, like, you know, yeah. 760 00:42:48,719 --> 00:42:51,000 Speaker 1: It went to extremes for the purposes of this movie, 761 00:42:51,040 --> 00:42:54,400 Speaker 1: but it does drive home certain things you should be 762 00:42:54,400 --> 00:42:57,160 Speaker 1: aware of, like how many devices do you own that 763 00:42:57,239 --> 00:42:59,520 Speaker 1: have microphones in them? How many do you own that 764 00:42:59,600 --> 00:43:03,680 Speaker 1: have cameras in them? What are they connected to? And 765 00:43:03,920 --> 00:43:07,640 Speaker 1: is it secure? Is your router secure? Did you by 766 00:43:07,640 --> 00:43:13,640 Speaker 1: any chance not change the default identifier and password on 767 00:43:13,680 --> 00:43:16,880 Speaker 1: your router? Because you might want to do that. That 768 00:43:16,960 --> 00:43:17,560 Speaker 1: kind of stuff. 769 00:43:18,000 --> 00:43:22,439 Speaker 4: So I'm so grateful that I got into this early, so. 770 00:43:22,760 --> 00:43:26,080 Speaker 3: Like, before Internet of Things became a thing, because now 771 00:43:26,120 --> 00:43:28,879 Speaker 3: I can go home, like right at this minute, pull 772 00:43:28,960 --> 00:43:31,239 Speaker 3: up a program on my computer, sit down on the 773 00:43:31,239 --> 00:43:34,439 Speaker 3: same network as my camera, and make sure that it's 774 00:43:34,560 --> 00:43:37,719 Speaker 3: not open to the World Wide Web, because it's 775 00:43:37,880 --> 00:43:40,200 Speaker 3: entirely possible that those things can happen. 776 00:43:40,320 --> 00:43:42,440 Speaker 4: They have happened, and. 777 00:43:42,560 --> 00:43:46,839 Speaker 3: Oh, man, I just wish more consumers were 778 00:43:46,840 --> 00:43:48,799 Speaker 3: interested in this kind of thing, because it would make 779 00:43:48,840 --> 00:43:49,880 Speaker 3: them so much safer. 780 00:43:50,320 --> 00:43:50,600 Speaker 2: Yeah.
781 00:43:50,800 --> 00:43:52,719 Speaker 1: Yeah, it's the sort of stuff no one wants to 782 00:43:52,719 --> 00:43:56,319 Speaker 1: really think about. The convenience of the technology is so 783 00:43:56,480 --> 00:44:02,520 Speaker 1: great that people don't feel comfortable thinking about 784 00:44:02,560 --> 00:44:05,279 Speaker 1: the other side of it. Because the technology they rely 785 00:44:05,440 --> 00:44:08,040 Speaker 1: on so heavily does so many useful things 786 00:44:08,040 --> 00:44:11,200 Speaker 1: for them, I think that ends up making them 787 00:44:11,680 --> 00:44:15,399 Speaker 1: kind of ignore the possible security problems, because if they 788 00:44:15,560 --> 00:44:18,080 Speaker 1: paid attention to it, they would feel that they would 789 00:44:18,080 --> 00:44:20,240 Speaker 1: either need to take a lot of effort to fix 790 00:44:20,280 --> 00:44:23,520 Speaker 1: those security issues, at least the ones they can fix 791 00:44:23,560 --> 00:44:26,080 Speaker 1: from the consumer side of things, or they would have 792 00:44:26,080 --> 00:44:29,759 Speaker 1: to abandon the technology, which is so incredibly useful and convenient, 793 00:44:30,040 --> 00:44:31,160 Speaker 1: and neither of those. 794 00:44:30,960 --> 00:44:35,160 Speaker 2: Seem particularly appealing. It's way better to just, you know, 795 00:44:35,360 --> 00:44:36,279 Speaker 2: just pretend like. 796 00:44:36,239 --> 00:44:40,080 Speaker 1: It doesn't happen and keep using your unsecured Wi-Fi. 797 00:44:40,360 --> 00:44:42,960 Speaker 1: And yeah, absolutely. 798 00:44:42,520 --> 00:44:45,400 Speaker 3: I mean, I've made some sacrifices in my life 799 00:44:45,400 --> 00:44:48,000 Speaker 3: to be more secure. For example, I don't use Facebook 800 00:44:48,040 --> 00:44:50,560 Speaker 3: Messenger on my phone. Yeah, because I don't 801 00:44:50,600 --> 00:44:53,120 Speaker 3: trust that feature.
I don't even use the Facebook app 802 00:44:53,200 --> 00:44:55,200 Speaker 3: on my phone, because I don't trust the app. 803 00:44:55,520 --> 00:44:55,719 Speaker 4: Yeah. 804 00:44:56,280 --> 00:44:58,920 Speaker 1: I noticed that when I abandoned the Facebook app. You 805 00:44:58,960 --> 00:45:00,719 Speaker 1: gave me a big thumbs up on that day. 806 00:45:01,360 --> 00:45:02,160 Speaker 4: Oh, yes, I did. 807 00:45:03,000 --> 00:45:06,960 Speaker 3: And there's other things where I've chosen convenience over security, 808 00:45:07,239 --> 00:45:11,799 Speaker 3: even though I understand what might 809 00:45:11,840 --> 00:45:14,640 Speaker 3: happen to me because I chose that convenience. 810 00:45:14,960 --> 00:45:17,480 Speaker 3: For example, I use a thumbprint to unlock my phone, 811 00:45:17,600 --> 00:45:20,840 Speaker 3: even though you are basically forced to give away a 812 00:45:20,880 --> 00:45:23,520 Speaker 3: thumbprint if there is ever a 813 00:45:23,560 --> 00:45:26,040 Speaker 3: warrant on your phone, as opposed to if you have 814 00:45:26,080 --> 00:45:27,880 Speaker 3: a PIN code or a password, you don't have to 815 00:45:27,920 --> 00:45:31,160 Speaker 3: give those away. So those are the kinds of things 816 00:45:31,200 --> 00:45:34,759 Speaker 3: you really have to think about, especially if you know 817 00:45:34,800 --> 00:45:38,360 Speaker 3: you're going into this with the thought, you know, 818 00:45:38,440 --> 00:45:40,560 Speaker 3: I have to better my security and privacy. 819 00:45:40,840 --> 00:45:41,160 Speaker 2: Yeah.
820 00:45:41,239 --> 00:45:44,839 Speaker 1: And see, it's different for you too, Shannon, because 821 00:45:44,880 --> 00:45:48,120 Speaker 1: you are aware of all the potential, or at least 822 00:45:48,280 --> 00:45:52,080 Speaker 1: a large majority of the potential, bad things that can happen, 823 00:45:52,120 --> 00:45:54,880 Speaker 1: so you can make an informed decision, and you 824 00:45:55,040 --> 00:45:58,960 Speaker 1: can measure that risk versus the reward you get 825 00:45:59,440 --> 00:46:01,399 Speaker 1: based upon whatever choice you make. 826 00:46:01,800 --> 00:46:04,280 Speaker 2: A lot of people out there aren't. 827 00:46:05,440 --> 00:46:07,759 Speaker 1: I think the problem is so scary they don't even 828 00:46:07,760 --> 00:46:10,799 Speaker 1: want to look at it, and that means that they're 829 00:46:10,840 --> 00:46:16,840 Speaker 1: making uninformed, uneducated decisions. And I know the problem is scary, guys, 830 00:46:16,880 --> 00:46:19,759 Speaker 1: but that's why you gotta look at it. You can't. 831 00:46:19,800 --> 00:46:22,520 Speaker 3: That's why we're here, so that we can inform everybody 832 00:46:22,560 --> 00:46:25,480 Speaker 3: and we can educate everyone, right, on the proper uses 833 00:46:25,520 --> 00:46:26,640 Speaker 3: of security and privacy. 834 00:46:26,800 --> 00:46:30,160 Speaker 1: So if you guys out there take anything at all 835 00:46:30,200 --> 00:46:33,440 Speaker 1: away from this episode, I want you to take... well, really, there's. 836 00:46:33,320 --> 00:46:34,640 Speaker 2: Two things I want you to take away.
837 00:46:34,960 --> 00:46:38,239 Speaker 1: One is, really give it some serious thought if you 838 00:46:38,320 --> 00:46:41,640 Speaker 1: haven't before, because it's something that could save you 839 00:46:41,960 --> 00:46:45,440 Speaker 1: tons of heartache down the line, and it can protect 840 00:46:45,480 --> 00:46:49,080 Speaker 1: you and those who are close to you from attacks. 841 00:46:49,080 --> 00:46:50,840 Speaker 1: And the second thing I want you to remember is, 842 00:46:51,360 --> 00:46:54,719 Speaker 1: if aliens attack, get Jeff Goldblum a Mac, because that guy 843 00:46:54,760 --> 00:46:59,200 Speaker 1: can do anything. And that's it for that classic episode 844 00:46:59,200 --> 00:47:02,600 Speaker 1: that originally published September fourteenth, twenty sixteen, The Worst Hacking 845 00:47:02,640 --> 00:47:06,440 Speaker 1: Scenes from Hollywood, Part two. Big thanks to Shannon Morse. 846 00:47:06,719 --> 00:47:11,960 Speaker 1: Even seven years later, she has always been a great 847 00:47:12,000 --> 00:47:15,760 Speaker 1: friend and a phenomenal guest, and is also a great 848 00:47:15,760 --> 00:47:19,000 Speaker 1: host all on her own. You should definitely go seek 849 00:47:19,040 --> 00:47:22,880 Speaker 1: out her work. She has done tons of cool stuff, 850 00:47:23,320 --> 00:47:29,480 Speaker 1: and is wicked smart and experienced with things like hacking, 851 00:47:30,320 --> 00:47:32,920 Speaker 1: and understands it on a much deeper level than I do. 852 00:47:33,120 --> 00:47:36,920 Speaker 1: So check out her stuff. It's great. And I hope 853 00:47:37,080 --> 00:47:39,160 Speaker 1: all of you are well, and I'll talk to you 854 00:47:39,200 --> 00:47:49,759 Speaker 1: again really soon. Tech Stuff is an iHeartRadio production.
For 855 00:47:49,880 --> 00:47:54,719 Speaker 1: more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 856 00:47:54,840 --> 00:47:58,520 Speaker 1: or wherever you listen to your favorite shows.