Speaker 1: Get in touch with technology with TechStuff from HowStuffWorks.com.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with HowStuffWorks and I love all things tech, and today we're going to kind of do a continuation of our discussions about critical thinking and skepticism. I know I've been talking about that a lot, but this is another area where it's important, and recently in the news there's been a lot of talk about spear phishing. Now, much of that discussion centers on a report that detailed how Russian intelligence agents targeted the United States Democratic National Committee, or DNC, with a spear phishing campaign that ultimately allowed the malicious actors to make off with a lot of sensitive and confidential information and infiltrate sensitive systems. Now, I'm not going to get political on this episode, so that's not what I want to focus on. Rather, I want to focus on the strategies that malicious actors use to either steal information or convince people to hand it over willingly through deception. So I'm going to talk about concepts like social engineering, phishing, spear phishing, and whaling, and I'm listing them in that order because it involves moving from a more general concept to a more specific application of those concepts.

So let's get started with social engineering. Now, generally speaking, social engineering, at least in this context, refers to using deception and manipulation in order to get hold of information. It's really just about tricking people into giving up stuff that normally they wouldn't part with; that's all it boils down to. Typically, this means that someone is pretending to be a trusted entity, and they work to convince a target, or mark in the old carnival speak, the mark being the person you have marked as vulnerable. You are trying to get them to hand over information or even hand over control of their devices.
So you might convince someone to install some malware that's disguised as something else, like an innocent file, but your job is to get them to do something that compromises information or information systems. And this shows us yet again that critical thinking is a really important skill. It's good to apply critical thinking when someone is asking you for information or telling you that you should install a program. Sometimes those are bad people, and sometimes they're up to no good.

Now, you can think of a social engineer as a con artist, someone who uses psychological manipulation to get a specific reaction from a mark. Stage magicians and mentalists use those sorts of strategies too, but to entertain. They're not doing it to do something underhanded; they're doing it in order to make people amazed, or laugh, or applaud. A stage magician, for example, has to learn the art of misdirection. This is the technique of getting an audience to pay attention to something that is ultimately unimportant, at least as far as the mechanics of a trick, so that they don't notice the actual important stuff that's going on. It gives the magician the time and opportunity to pull off a trick. A good magician can hold an audience's attention with some mesmerizing stage work. They might incorporate clever patter, but the whole point is to keep focus off of something that might otherwise spoil the illusion.

Dr. Robert Cialdini, a professor emeritus of psychology and marketing at Arizona State University, outlined six basic principles of influence. These relate to strategies, drivers that someone can employ to convince another person to agree with what they are saying. Essentially, these six broad principles can get someone to say yes; it's the idea of influencing someone to agree to do something. Understanding these principles can help you recognize when someone is trying to use those strategies on you. But these are strategies that rely on very human traits.
They work because of the way we're wired, more or less. Even being aware of them doesn't mean that you will be immune to them, because they're very much dependent upon what it means to be human. So Cialdini cites other psychologists who identified what they called judgmental heuristics. You can think of these as sort of cognitive shortcuts. They are basic rules that we accept as being generally true, so it's sort of like a foundational statement that we would not question. But if someone knows how those work, they can take advantage of it. So, in other words, if someone is really good at using these principles, you might not be aware of it right away. And I certainly consider myself to be vulnerable to them. I know I'm vulnerable to them. I can think of examples where people used these techniques on me and it worked. Salespeople in particular tend to really focus on these; there are entire books out there that are all about how to leverage these principles in order to make sales, make business deals, et cetera.

So what are the six big categories? Well, the first is the rule of reciprocation. That's our tendency to want to repay someone who has done something on our behalf. Essentially, this comes down to the concept of favors. So an example you might have encountered could be free samples. Merchants know that a free sample can create the urge within a potential customer to buy a product, because that person feels obligated after having accepted a sample. So if I'm sitting at a table and I have samples of, let's say it's olive oil, I've got different little bowls of olive oil, and you can dip a little bit of bread in there and taste them, and I'm there saying, I'm here to answer any questions. Do you like it? What do you think? We have the social pressure within us to say, oh, I like it a lot, even if we don't.
We have that social pressure. Very few of us would say, oh, I don't like this at all, to someone who seems like they are invested in our answer. So we feel obligated to go along with it. And we may feel that because we have accepted something, in return we should buy something, even if we didn't like the thing, because we feel this social pressure. And this goes along with what it means to be human; I'll touch back on that when we get towards the end of these principles. Most people don't like to feel that they are under some sense of obligation to someone else. They don't like to feel they owe somebody something, and so they will very quickly try to act to even the scales, so that they are no longer obligated. So the rule of reciprocation is the first principle.

Next is scarcity. People tend to want more of the stuff they can have less of. That's the basic idea. And you can look at this with just the prices for precious metals. Precious metals are precious largely because of their scarcity, because they're so hard to get and there's not a whole lot of them. That's what drives up their value. People want them more because they're harder to get. The entire diamond industry is based off of this concept. There are plenty of diamonds, there are tons of diamonds; they are not even rare. But because the world's supply of natural diamonds is controlled essentially by a single company, that company can control the scarcity of diamonds and thus inflate their value, because people want what they can't have. So offering someone a chance to experience something or possess something that has extremely limited availability is a really useful tactic. If you tell someone, hey, this service that we have, we're gonna get rid of it in a week, typically you see a lot more people try to take advantage of that service before it goes away forever. It's an incredibly useful way of getting people to do something you want them to do.
Next is the principle of authority, which is the notion that people will tend to go along with someone they view as being an expert or being credible. So have you ever encountered a problem and thought, someone smarter than I am is going to handle this, or just assumed that someone more capable than you has the situation covered? Maybe you were present when an emergency happened and you looked around to see if someone else would jump in to help, because you thought, well, clearly there's got to be someone here who's better at handling this kind of situation than I am, so I'm gonna step back and allow that to happen. Well, that behavior shows that we frequently will defer to, and even seek out, people who appear to be more knowledgeable or capable than ourselves in certain situations. So if someone poses as a person of authority convincingly, they can more easily persuade people to do stuff.

The fourth principle of influence, according to Cialdini, is consistency. This is related to the concept of commitment, committing to do something. If you get someone to agree to a small initial commitment, that can set the ground for a larger commitment further down the road. So a classic example in a scam could be the Nigerian scam. A lot of Nigerian scams will start off where the scam artist first asks for a relatively small amount of money, because they say, hey, there's a huge amount of money in this country that has your name on it. It's clearly not meant for you, but because it has your name on it, we can get it to you. It's an enormous sum. However, in order for me to get this money to you, first I'm gonna need a small amount of money from you to help secure it, and then we can get you rich.
And people will say okay. And then the scam artist might come back and say, oh, as it turns out, I'm gonna need a little bit more money, because I'm gonna have to bribe some officials here in order to get your wealth to you, and so on and so on. The idea being that because you've already made an initial commitment, you would be willing to make a larger subsequent commitment. And this can happen not just in scams; it can happen in completely legitimate, although sometimes questionably ethical, applications.

The fifth of the principles is the principle of liking, which means we're more easily influenced by people we like. This is not brain surgery, right? If you like someone, you're more willing to agree with them and to do things that they need you to do. Now, Cialdini states there are three factors that go into whether or not we actually like someone. First, we tend to like people who are similar to ourselves. The more alike they are to ourselves, at least to a certain extent, the more we tend to like them. Second, we like people who compliment us, because we're vain. We're egotistical creatures. If you come up to me and say, hey, I really like your work, I am far more likely to like you. And third, we like people who will work with us towards a mutual goal. If we both need to get a specific task done, and you help me do that task, I am more likely to like you. So a scam artist might try to dig up some information about you in order to make it seem as if the artist also shares the interests and values that you have, and try to create this pathway to get you to like the scam artist, so that in return you will actually do whatever the scam artist wants you to do.

The sixth principle is consensus, which essentially says that when we're in doubt, we tend to look at what other people are doing so that we can get some direction on what we should do. Now, I certainly identify with this.
I think about just about any situation I've been in in which I was joining a group of people working together on an activity. Maybe we're all doing the same thing over and over, like folding T-shirts. That's an easy one. We're all folding T-shirts, and I keep looking over to the left and right to see how other people are doing it, so that I feel like I'm doing it the right way, that I'm not messing up. I have this incredible social pressure on me to do it correctly, and rather than try to learn this in a more formal way, you know, a more process-oriented way, I'm doing it by looking at how everyone else is doing it. Now, if everyone else is doing it, quote unquote, the correct way, that's fine. But if people are doing things incorrectly, then all I'm doing is adding to the number of people who are doing something incorrectly. There, again, is a lot of social pressure at play for you to do as the group does.

And this is not just psychobabble. These principles put into words behaviors that humans have developed as social creatures. Human survival has largely depended upon our working together, and so we've developed these behaviors that promote cooperation and discourage whatever inhibits cooperation. Understanding those behaviors and then subtly leveraging them can have a really big impact on interactions, and that plays right into the hands of social engineers. So if you happen to know that people tend to follow these principles, because humans as a species have sort of evolved to be these social creatures, then you can take advantage of that. Now, granted, there are going to be outliers. There will be people who do not conform, who do not adhere to these principles, who don't have any connection to them. But more frequently than not, you're going to find that they work, and that they work on you.
Next, I'm going to define terms like phishing, spear phishing, and whaling, and we'll explore some specific tactics and stories related to those practices. But first, let's take a quick break to thank our sponsor.

All right. So social engineering is all about manipulating people to get them to do what you want, typically in a way that involves deception and handing over confidential information. But what is phishing all about? And it's spelled p-h-i-s-h-i-n-g, by the way, phishing with a "ph." Phishing describes the practice of tricking people into handing over sensitive information through some digital means. Most frequently we talk about using email as a way of phishing for data, but you could use a spoofed website, you could use instant messaging; it doesn't really matter. The basic concept is the same: you're fishing for data.

I find it interesting, by the way, that we have a lot of nautically themed terms to describe different behaviors on the Internet, because there's phishing, there's spear phishing, there's whaling, but there's also trolling. Trolling refers not to mythical monsters from fairy tales, but to the practice of trolling, dragging baited lines behind a boat in an effort to hook a catch. Trolling on the Internet originally referred to people who would go into message forums and lay down a trap, which would be posting something intended to get a rise out of the legitimate members of the forum. So they weren't necessarily aiming at anyone in particular; they were just kind of laying a trap for anybody who was gullible enough to fall into it. The goal was just to rile people up. And why would you do that? Well, some people just want to watch the world burn. I guess they like to think that they were able to have influence on another person, and that brings them some mean sense of joy. It's not great, but it does happen far too frequently.
Or sometimes the troll actually has ulterior motives beyond just upsetting people. Maybe you have someone who's running a different message board, and so what they're doing is essentially sabotaging what they see as competition. But whatever the motivation, the practice involved being obnoxious in one way or another in an attempt to get at least a hit or two from well-meaning but misguided forum members. And I say misguided because the person's anger or frustration was the troll's goal all along. They're not necessarily arguing something they actually believe in; all they wanted to do was get someone upset. So if you get upset, the troll wins. So the only real way to win is not to play, and that's why you often hear people say, don't feed the trolls. Problem is, if you don't feed the trolls, sometimes trolls will just keep on trying to get people to agree to awful, awful things. So that philosophy only goes so far, and it's only applicable in certain situations. But that's another podcast.

Phishing is similar in a lot of ways to trolling, except instead of trying to irritate people, you're trying to trick people into handing over information they should definitely not hand over. Phishing is a general approach. The attacker does not necessarily have a particular target in mind; it's kind of a blanket attack. They don't really care who ends up being a victim. All they want to do is gather as many hits as possible; it just doesn't matter from where. As such, phishing messages tend to be pretty generic. Phishing emails may appear to come from a legitimate source. On a casual glance, they might look like they were sent from an actual company, like a bank, or a government organization. There are countless variations, but they all try to fool the recipient into taking an action that will ultimately result in a bad outcome for the target.
So, for example, you might get an email from a bank stating that you have an old account with money in it and you should transfer that money out of the account. It is essentially saying, hey, you've got free money, and this bank asks that you fill out some information so that you can retrieve your cash. But in reality, the email's coming from a phishing scheme, and it's looking to harvest as much confidential information from users as possible. Maybe they're saying, give us your current bank account information so that we can transfer the money, we can wire it from this old account into your current account, but in reality they just want access to your current account. That's a pretty clumsy example, but it is an example of something that could actually happen.

Often, the email address in a message can be a dead giveaway as to whether the message is legitimate or not. You can look at the originating email address and say, well, it says it's from Gmail, but the domain on the email is not related at all to Google. Well, that's a dead giveaway. Chances are you're looking at a scam if the email address does not match whatever entity it claims to be from. And I doubt any legitimate organization would ever ask for sensitive details to be sent over email. Almost every single one, when they respond to these kinds of attacks, says, we will never ask for your personal information over email, particularly since there's no guarantee that the recipient is using encryption on their emails. So if you're not using an encrypted email service, there's no guarantee that a third party listening in wouldn't get access to that information anyway, even if everything else was legitimate. That being said, it is possible to forge an email address so that a message appears to be from an official, legitimate source. There's nothing in the email protocol on its own, just the regular email protocol, that verifies an address in the from field is actually a legitimate address.
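To picture what that looks like in practice, here's a rough, entirely hypothetical example of the headers on a forged message: the From line claims a bank's domain, while the Received line and the Reply-To point somewhere else entirely. Every name and address here is made up for illustration.

```
From: "Example Bank Support" <support@examplebank.com>
Reply-To: claims-desk@mailbox-hypothetical.net
Received: from mail.attacker-hypothetical.net ([203.0.113.45])
        by mx.your-provider.example; Mon, 09 Jan 2017 10:12:03 -0500
```

A sender can put almost anything they like in that From line; it's the Received trail, stamped on by the relaying mail servers, that's much harder to fake.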
Now, you can do some snooping yourself. You can examine email headers, which might require activating a specific feature in your email interface. Like in Gmail, I think it's "reveal source" or something along those lines... oh, Show Original is what it is. That's what it is. You choose Show Original, and that's gonna give you the raw source version of the email, and that will include a listing of all the servers that the email has passed through, in lines that begin "Received: from." At the bottom of that section, the last "Received: from" would represent the original computer that sent the email. And if you look at that and the original computer's domain is different from the official domain, again, that could be a red flag. It's not necessarily proof positive, but it does indicate that it could be a malicious email and you should be careful about it.

On the admin side of this, if you were a system administrator, organizations can implement what is called a Sender Policy Framework to prevent forged email addresses. This is an email validation protocol, and essentially it allows an organization to map the specific IP addresses that are authorized to be associated with email addresses containing the organization's web domain. So, in other words, it says, if someone tries to send an email and they claim that email is from us, check to make sure their IP address is on this list of authorized IP addresses, and if it isn't, don't allow that from field to display our domain. It's essentially saying, check here to make sure that it's legit. But again, you have to implement this; it's not something that is natively part of the email protocol.
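To make that header check a little more concrete, here's a minimal sketch in Python using only the standard library's email module. The file name and the expected bank domain are hypothetical, and real Received chains can be messier to read than this.

```python
from email import policy
from email.parser import BytesParser

# Parse a raw message saved from your mail client
# (in Gmail, via Show Original). The file name is hypothetical.
with open("suspicious.eml", "rb") as f:
    msg = BytesParser(policy=policy.default).parse(f)

claimed_from = str(msg["From"] or "")
print("Claimed From:", claimed_from)

# Each relay prepends its own Received header, so the last one
# in this list is the hop closest to the original sender.
for hop in msg.get_all("Received") or []:
    print("Hop:", hop.split(";")[0].strip())

# Hypothetical expected domain for this sender.
if "examplebank.com" not in claimed_from:
    print("From address doesn't mention examplebank.com -- red flag.")
```

And on the SPF side, the policy an administrator publishes is just a DNS TXT record on the organization's domain, along the lines of "v=spf1 ip4:203.0.113.0/24 -all" (a hypothetical address range): mail claiming that domain but arriving from any other IP fails the check. You can look at a real domain's published record yourself with a command like dig TXT example.com.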
Now, I've received countless phishing emails over the years, and my favorites are the ones from companies or banks that I have never done any business with. But even those types of obvious scams can fool people. They might think, hey, I might be able to get some free money, because these folks think I have cash in their coffers, so I'll just play along, and I can essentially get some poor sap's dough just because they happen to have the same name as I do. Stinks to be them. Only it turns out that the sap all along was the person who received the email, because they will end up handing over sensitive information to an attacker rather than getting hold of someone else's money.

A phishing attack doesn't need to get that many hits to be a success. It's a pretty cheap way of attacking people. It's not expensive to mock up an email that looks like it came from an official source, and it's also not that expensive to email hundreds of thousands of people. It's pretty easy to do. So if you cast a wide enough net, you're sure to get some hits. There might not be a lot, but you don't really need a lot to actually make it profitable. So it's not an efficient way to trick people, but it can still make you money, though not at a super high profitability. But also, you would be a total scumbag for doing it, if you chose to do this kind of stuff.

So that's phishing. What is spear phishing? Because that's been in the news quite a bit with this DNC stuff. Well, as I mentioned earlier, we're moving from the general to the more specific. So spear phishing is phishing. It uses the same sort of general approach as phishing, except this time the target is a defined one. Instead of it just being a widespread blanket attack against anyone and everyone, this is a targeted attack. It's targeting a specific company or organization. The emails or other messages take aim at the people who work in those organizations or for those companies. The attacker might tailor their approach specifically for the target. They might be able to use the target's actual name in the email. That's a pretty smart move.
Actually, it's likely to increase the number of successful hits. So rather than sending out a generic "hey, you've got some cash in X bank that you need to retrieve," or "dear Amazon customer, we see you overpaid on your last order, so fill out this information for a refund," a spear phisher might take aim using details that apply to a specific targeted organization. The actual emails might not look that different from general phishing ones, though they may contain more specific information. They use similar language, because again, they're trying to leverage those principles of influence that all humans are vulnerable to. A spear phishing attack could take the form of an email claiming to contain a security patch or a system update that is supposedly company-wide policy. The email would claim to guide employees through bringing their computers up to date, but actually it would install a malicious piece of software, our good old friend malware.

And malware can take lots of different forms, but typically with phishing attacks, the malware's purpose is to facilitate the transfer of stolen information. It might allow a back channel of access to a system's computers, you know, that backdoor access that allows a hacker to take administrative control of a computer. Or it might contain a key logger, which will record every keystroke made on that computer, which is an effective way to steal someone's username or password. Or it could be lots of other stuff too. Worse yet, it could contain a file or a link to a page that would take advantage of a zero day exploit. This is a vulnerability that exists in some form of software or operating system that has yet to be disclosed. It's very possible that the entity that made the software has no way of knowing that it even exists. Zero day exploits are incredibly valuable to hackers.
If you find one and you know that it hasn't been published, then you know that you have a really good chance of that zero day exploit having a huge impact if you wrap it in some other attack. So you could compromise a target computer or an entire system, and give an attacker access to that machine, or to the entire network that machine is connected to, at least as far as the target would be able to access it. And then, if you had escalation software in there, you could even escalate privileges, elevating your status to admin level, and that gives you unfettered access to the system. The spear phishing attack might look like it came from another employee requesting access to certain types of information. It could look like it came from a finance department saying, we need you to pay this invoice, so just give me your information here. The larger the organization, the easier it is to slip something like this through, because it's not likely that everyone is going to know everyone else. The smaller the organization, the more you might say, I'm just gonna walk over to Sue's desk and find out if Sue actually sent me this email, and that would obviously work against the intent of the hacker.

Next, I'm going to share a whale of a tale with ye, me lads. But first, let's take another quick break to thank our sponsor.

Okay, so we've defined phishing, and we've defined spear phishing, but what is whaling? Whaling refers to phishing attacks that are even more specific than spear phishing. These attacks target high-ranking executives, the whales, or they might be high-ranking members of an organization. These targets have the most potential access to an organization and its confidential information, so they have a very high value attached to them. The attacker stands to gain the most from compromising that kind of target. Also, some of them are really, really not computer savvy, and as such they can fall for tricks way too easily.
I say this having met a lot of executives who seem to have only a cursory understanding of how technology works. As they age out of leadership positions, we see a little bit less of that. But we always have to remember that typically leadership tends to be older, you know, than your rank-and-file employee, and because technology changes so quickly, it's very possible for the older members of any group to have less knowledge about the most current, and potentially most dangerous, uses of tech.

So whaling again follows similar tactics as phishing and spear phishing. It could be a much more specific message designed to give the best possible chance for a successful hit, but otherwise it's pretty much the same as what we've already covered, so I'm not gonna waste time rehashing what I've already said. Just remember that the specificity I talked about earlier will be even more so for whaling. You will have messages that call out the executive by name. It might use names of people the executive actually knows. It might even use a spoofed email address, so it seems like it's coming from someone the person knows. Of course, the closer the friendship, the more you run the risk of not sounding like that person, which could set off red flags in someone's mind. They might say, huh, Sue never talks to me like this in an email, and that would end up being an issue. But otherwise, you have a much greater chance of getting a success.

As always, you should use critical thinking whenever you get a message. And while I've focused on email, again, those phishing attacks could come through other channels, such as instant messaging or similar communication channels, or even through spoofed websites. So using critical thinking and just paying attention can really save you a lot of heartache in the long run. So here are some basic rules you should follow.
First, try to make sure an incoming message is in fact from a legitimate source, particularly if there's an attachment to the email or if it's asking you to hand over information. That might involve digging a little deeper if the attacker bothered to forge an email field. So if they forged that from field, then you might need to check the source code just to make sure that it is in fact legitimate. And then you might even ask, well, why are they asking for this information, and why should it be done over email? That's not necessarily the most secure way.

In fact, and this is the second rule, you should really never send confidential information, like usernames, passwords, credit card information, wire transfer information, anything like that, over email or an instant messaging system, because most of the time those are not secure; a lot of them are not encrypted. So even if the person contacting you is completely legitimate, if they are on the up and up and you totally trust them, they're not gonna do anything to mess with you, the channel itself is the problem. If you're using a channel that's unencrypted, someone else could potentially get hold of that information. It may be that a third party gets hold of it, so it's just not a smart move to send that kind of stuff over email or instant messenger. And most legitimate communication will never require you to send in sensitive info over email. Rather, you would have to visit a legit website that's verified and secure. You look for that little lock in the URL bar, and you make sure that the website you go to actually belongs to the company or entity that it's supposed to belong to before you fill anything out; there's a small sketch of that kind of check after these rules.

Third, if you're in a company or organization and you receive a message that seems hinky and it's catered to you or to your company, you should probably let someone in IT know about it, so that they can be on the lookout and perhaps issue a company-wide alert.
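Here's that sketch I mentioned for the second rule: a minimal example in Python of checking whether a link's hostname really belongs to the domain you expect. Every domain in it is hypothetical, and a serious implementation would also have to worry about lookalike characters and other tricks.

```python
from urllib.parse import urlparse

def matches_expected_domain(url: str, expected: str) -> bool:
    """True only if the link's host is the expected domain or a
    subdomain of it. All example values here are hypothetical."""
    host = (urlparse(url).hostname or "").lower()
    return host == expected or host.endswith("." + expected)

# The real domain passes.
print(matches_expected_domain(
    "https://examplebank.com/reset", "examplebank.com"))        # True
# A lookalike fails: the registered domain is attacker-hypothetical.net.
print(matches_expected_domain(
    "https://examplebank.com.attacker-hypothetical.net/reset",
    "examplebank.com"))                                         # False
```

Note that the lock icon only tells you the connection is encrypted; it says nothing about whether the site on the other end is honest, which is why checking the actual domain matters.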
And that third rule matters because even if you don't fall for it, if it's a spear phishing attack that's targeting people who belong to the same organization you do, or the same company you work for, all it takes is a couple, maybe even as few as one, positive hit to do a lot of damage. So if you notice something, say something. It gives you the best chance of not having your company or organization fall victim to these kinds of attacks. This is really serious business.

So let's take another look at the US DNC phishing attacks in 2016 that I mentioned earlier. What actually happened in those attacks? Well, according to a joint report from the Department of Homeland Security and the FBI, two groups, which the report labels APT28 and APT29, breached the DNC servers through phishing attacks. APT, by the way, stands for Advanced Persistent Threat. The first attack started in the summer of 2015. The group behind the attack was APT29. The attackers sent emails to more than one thousand different addresses attached to the DNC, and the messages contained malware attachments posing as normal, innocent-looking files. Opening the file would install the malware on the target machine. The malware established an encrypted communication channel between the compromised systems and APT29. The malware also could escalate privileges, meaning it was able to trick computers into giving the attackers admin-level access to the system, so they could steal information and send it back to their own computers over this encrypted channel of communication, and because of this they went largely undetected for a while. The second attack happened in the spring of 2016; that was APT28. They sent out an email to a large number of people in the DNC. The email posed as an organization-wide policy that would require users to reset their passwords, saying your password has expired,
You need to go and follow 574 00:32:18,520 --> 00:32:22,320 Speaker 1: this link and reset your password. So they would go 575 00:32:22,400 --> 00:32:24,280 Speaker 1: to a link, and that led to a website that 576 00:32:24,360 --> 00:32:26,240 Speaker 1: was made to look like it was an official 577 00:32:26,360 --> 00:32:28,920 Speaker 1: DNC site, but in fact it was a phishing site, 578 00:32:29,200 --> 00:32:31,480 Speaker 1: so it would guide users to enter their user names 579 00:32:31,480 --> 00:32:34,960 Speaker 1: and their passwords and a new password, ostensibly so that 580 00:32:35,040 --> 00:32:38,560 Speaker 1: they could reset their passwords. But in reality, the hackers 581 00:32:38,600 --> 00:32:42,000 Speaker 1: were gathering login credentials and they were just 582 00:32:42,040 --> 00:32:47,040 Speaker 1: shipping them straight to APT twenty eight. The investigation into those attacks 583 00:32:47,080 --> 00:32:49,600 Speaker 1: is still ongoing, but from what we know, it seems 584 00:32:49,640 --> 00:32:54,200 Speaker 1: pretty certain the attackers were Russian civilian and military intelligence services. 585 00:32:54,800 --> 00:32:58,960 Speaker 1: So you have a group of Russian hackers targeting a 586 00:32:59,120 --> 00:33:04,080 Speaker 1: United States political party's systems, and that's a big 587 00:33:04,240 --> 00:33:07,440 Speaker 1: scary thing. No matter what country you live in, 588 00:33:07,480 --> 00:33:12,080 Speaker 1: you know, no matter which countries are involved, 589 00:33:12,160 --> 00:33:14,240 Speaker 1: this kind of attack is a serious thing, where one 590 00:33:14,280 --> 00:33:19,800 Speaker 1: country is trying to influence the political outcome of another country. 591 00:33:19,920 --> 00:33:23,120 Speaker 1: I don't care who you are, that's terrifying. The United States, 592 00:33:23,640 --> 00:33:28,320 Speaker 1: by the way, hasn't necessarily done cyber crime kind 593 00:33:28,360 --> 00:33:31,960 Speaker 1: of stuff like this, but the US certainly has a 594 00:33:32,040 --> 00:33:35,560 Speaker 1: long history of trying to influence other nations and their 595 00:33:35,560 --> 00:33:39,920 Speaker 1: political proceedings. So I'm not saying the US is innocent 596 00:33:39,920 --> 00:33:42,800 Speaker 1: of those kinds of things either. But it is terrifying 597 00:33:42,840 --> 00:33:46,200 Speaker 1: to look at the way that you can actually leverage 598 00:33:46,400 --> 00:33:50,560 Speaker 1: human psychology and have a big impact like this. And honestly, 599 00:33:50,600 --> 00:33:52,720 Speaker 1: I think that's the most troubling thing about phishing attacks, 600 00:33:52,720 --> 00:33:55,920 Speaker 1: because they don't have to be sophisticated. There's no need 601 00:33:55,960 --> 00:33:58,960 Speaker 1: to craft super tricky code. You don't have to 602 00:33:59,000 --> 00:34:03,720 Speaker 1: sit there and secretly infiltrate a computer system. Phishing works 603 00:34:04,120 --> 00:34:07,959 Speaker 1: not because of the technology. I mean, technology facilitates it. 604 00:34:07,960 --> 00:34:11,760 Speaker 1: It makes it easier to steal information. But phishing works 605 00:34:11,840 --> 00:34:14,680 Speaker 1: because of the way we humans work. It works because 606 00:34:14,680 --> 00:34:17,959 Speaker 1: we're fallible. We can be tricked, we can be influenced.
607 00:34:18,280 --> 00:34:21,000 Speaker 1: It shows yet again that no matter how secure a 608 00:34:21,080 --> 00:34:24,440 Speaker 1: system is, no matter what technology you have put in place, 609 00:34:25,040 --> 00:34:27,440 Speaker 1: the system is really only as good as the people 610 00:34:27,560 --> 00:34:30,560 Speaker 1: who have access to it. If you design the world's 611 00:34:30,680 --> 00:34:35,000 Speaker 1: most secure bank vault door, and it has biometrics and 612 00:34:35,440 --> 00:34:37,959 Speaker 1: voice activation and all this kind of stuff, where only 613 00:34:38,239 --> 00:34:41,520 Speaker 1: a select few are able to ever access it, and 614 00:34:41,560 --> 00:34:44,279 Speaker 1: then I leave that door open while I dash off 615 00:34:44,320 --> 00:34:46,360 Speaker 1: to go grab lunch, it doesn't do you any good. 616 00:34:46,719 --> 00:34:51,800 Speaker 1: I have practiced very, very bad security protocol. Phishing 617 00:34:51,840 --> 00:34:55,440 Speaker 1: works because we humans will often take these mental shortcuts 618 00:34:55,640 --> 00:34:58,279 Speaker 1: and we won't use critical thinking. And it doesn't need 619 00:34:58,320 --> 00:35:00,880 Speaker 1: to work every single time. In fact, depending upon the attack, 620 00:35:00,920 --> 00:35:03,440 Speaker 1: it may only need to work once. So if there 621 00:35:03,440 --> 00:35:06,440 Speaker 1: are ten thousand employees at a company, and you 622 00:35:06,520 --> 00:35:09,680 Speaker 1: need just one of them to click on a malicious attachment, 623 00:35:09,960 --> 00:35:13,000 Speaker 1: that means all you need is a point zero one 624 00:35:13,080 --> 00:35:17,320 Speaker 1: percent success rate. It does not have to be great, 625 00:35:18,080 --> 00:35:21,520 Speaker 1: it just has to work once. So it's no surprise 626 00:35:21,560 --> 00:35:23,640 Speaker 1: that there are so many phishing attacks out in the wild. 627 00:35:23,960 --> 00:35:25,960 Speaker 1: They get results, and they don't need a lot of 628 00:35:25,960 --> 00:35:27,799 Speaker 1: people to fall for them to be effective. That's 629 00:35:28,040 --> 00:35:30,040 Speaker 1: also why there are a lot of crappy ones out there, 630 00:35:30,080 --> 00:35:34,520 Speaker 1: because why put in the effort to design something really, 631 00:35:34,560 --> 00:35:38,000 Speaker 1: really good if you can still get hits with something 632 00:35:38,040 --> 00:35:40,440 Speaker 1: that's crappy? That will save you time and effort. So 633 00:35:40,560 --> 00:35:42,399 Speaker 1: just go ahead and send the crappy thing out. You won't 634 00:35:42,440 --> 00:35:46,560 Speaker 1: get quality hits, but you'll still get hits. To defeat 635 00:35:46,600 --> 00:35:50,120 Speaker 1: phishing scams, we can institute different protocols and protections to 636 00:35:50,120 --> 00:35:52,480 Speaker 1: help weed out spam emails and obvious scams before they ever get to 637 00:35:52,520 --> 00:35:54,840 Speaker 1: a user, that kind of thing. We 638 00:35:54,880 --> 00:36:00,360 Speaker 1: can blacklist certain IP addresses, but ultimately we 639 00:36:00,440 --> 00:36:03,560 Speaker 1: have to rely on people resisting those attempts to influence 640 00:36:03,600 --> 00:36:06,080 Speaker 1: them and using a bit of critical thinking. Now, I 641 00:36:06,080 --> 00:36:07,880 Speaker 1: know I've talked a lot about critical thinking over the 642 00:36:07,960 --> 00:36:10,160 Speaker 1: last two weeks. I'm gonna give it a rest.
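Before we switch gears entirely, here's the back-of-the-envelope math behind that point zero one percent figure, as a tiny Python sketch. The per-recipient click probability is a made-up assumption, just to show how quickly the odds tilt in the attacker's favor.

```python
# One success out of 10,000 recipients is a 0.01% "success rate."
employees = 10_000
needed_rate = 1 / employees
print(f"Success rate the attacker needs: {needed_rate:.2%}")  # 0.01%

# Assume (hypothetically) that each recipient clicks with probability
# 0.1%. The chance that at least one of 10,000 clicks is then huge.
p_click = 0.001
p_at_least_one = 1 - (1 - p_click) ** employees
print(f"P(at least one click): {p_at_least_one:.6f}")  # about 0.99995
```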
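And since I mentioned blacklisting IP addresses, here is a sketch of one common flavor of that: DNS-based blocklists, or DNSBLs, which a mail server queries by reversing the IP address and appending the blocklist's zone name. The Spamhaus zone below is a real, widely used list, but whether your resolver is permitted to query it is an assumption to verify for your own setup.

```python
# A minimal DNSBL lookup: reverse the IPv4 octets, append the zone,
# and resolve. Any answer means "listed"; NXDOMAIN means "not listed."
import socket

def is_listed(ip: str, zone: str = "zen.spamhaus.org") -> bool:
    query = ".".join(reversed(ip.split("."))) + "." + zone
    try:
        socket.gethostbyname(query)
        return True
    except socket.gaierror:
        return False

# 127.0.0.2 is the standard DNSBL test entry and should come back
# listed (note: some lists refuse queries from public DNS resolvers).
print(is_listed("127.0.0.2"))
```

Mail servers run this check, or something like it, on every connecting sender before a message ever reaches a person. That's part of the first line of defense, but it can't replace the human at the keyboard thinking critically.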
I'm 643 00:36:10,160 --> 00:36:12,360 Speaker 1: gonna switch gears for the rest of this week, sort of. 644 00:36:12,880 --> 00:36:15,360 Speaker 1: I always try to use critical thinking whenever I'm putting 645 00:36:15,360 --> 00:36:18,080 Speaker 1: a show together, so there is some going on in 646 00:36:18,080 --> 00:36:20,320 Speaker 1: the back end, but I'm not gonna harp on about 647 00:36:20,360 --> 00:36:23,680 Speaker 1: it anymore for the rest of this week. Instead, we're 648 00:36:23,719 --> 00:36:28,919 Speaker 1: going to talk about speech recognition, natural language processing, and 649 00:36:29,520 --> 00:36:34,160 Speaker 1: voice assistants slash virtual assistants slash, well, it's hard to come 650 00:36:34,239 --> 00:36:35,560 Speaker 1: up with a name for them, but you know, I'm 651 00:36:35,560 --> 00:36:39,480 Speaker 1: talking about your Siris, your Alexas, your Google Assistants, that 652 00:36:39,560 --> 00:36:41,320 Speaker 1: kind of thing. So that's what we're going to concentrate 653 00:36:41,480 --> 00:36:43,320 Speaker 1: on for the rest of this week. If you guys 654 00:36:43,360 --> 00:36:46,320 Speaker 1: have any suggestions for future episodes of Tech Stuff, whether 655 00:36:46,360 --> 00:36:50,160 Speaker 1: it's a specific technology, maybe it's a policy relating to tech, 656 00:36:50,600 --> 00:36:53,719 Speaker 1: maybe it's a company or a person in technology, maybe 657 00:36:53,760 --> 00:36:55,880 Speaker 1: there's someone you want me to interview or have on 658 00:36:55,920 --> 00:36:59,000 Speaker 1: as a special guest, please let me know. Send me 659 00:36:59,040 --> 00:37:02,000 Speaker 1: a message. My email address for this show is tech 660 00:37:02,080 --> 00:37:05,680 Speaker 1: Stuff at how stuff works dot com, or you can 661 00:37:05,760 --> 00:37:08,600 Speaker 1: drop me a line on Facebook or Twitter. The handle 662 00:37:08,680 --> 00:37:12,200 Speaker 1: for both of those is tech Stuff H S W. Don't 663 00:37:12,239 --> 00:37:15,040 Speaker 1: forget to follow us on Instagram, and I'll talk to 664 00:37:15,040 --> 00:37:24,280 Speaker 1: you again really soon. For more on this and thousands 665 00:37:24,320 --> 00:37:36,560 Speaker 1: of other topics, visit how stuff works dot com.