Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio.

Speaker 1: This season of Smart Talks with IBM is all about new creators: the developers, data scientists, CTOs and other visionaries creatively applying technology in business to drive change. They use their knowledge and creativity to develop better ways of working, no matter the industry. Join hosts from your favorite Pushkin Industries podcasts as they use their expertise to deepen these conversations, and of course Malcolm Gladwell will guide you through the season as your host and provide his thoughts and analysis along the way. Look out for new episodes of Smart Talks with IBM on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts, and learn more at ibm.com/smarttalks.

Malcolm Gladwell: Hello, hello. Welcome to Smart Talks with IBM, a podcast from Pushkin Industries, iHeartRadio and IBM. I'm Malcolm Gladwell. This season, we're talking to new creators: the developers, data scientists, CTOs and other visionaries who are creatively applying technology in business to drive change.
Channeling their knowledge and expertise, they're developing more creative and effective solutions, no matter the industry. Our guest today is Stephanie "Snow" Carruthers. Snow is a hacker alias, and it's how we'll refer to Stephanie for the rest of this episode. Snow is the Chief People Hacker for X-Force at IBM. She gets paid to hack into her clients' businesses before criminal hackers do, in order to test her clients' information security. In today's show, you'll hear some of the more creative ways Snow has persuaded people into sharing confidential information. She also talks about the state of cybersecurity and what businesses need to do to keep their data protected. Snow spoke with economics journalist Tim Harford, host of the Pushkin podcast Cautionary Tales and a longtime columnist at the Financial Times, where he writes The Undercover Economist. In addition to publishing several books on the topic, Tim is also a BBC broadcaster with his show More or Less. Okay, let's now get to the interview with Tim and Chief People Hacker Snow.

Tim Harford: Before you tell me what a chief people hacker is, what is hacking to you?
Snow: I think if you ask the average person to close their eyes and envision a hacker, they are going to think of someone in a dark room with a black hoodie on and all the screen text behind them, right? But to me, a hacker doesn't even have to be technical. It's someone who finds creative solutions, or just different ways to break apart something to make it work in a unique way that maybe it wasn't intended to, whether that's computers, people, devices. It could be a number of things, right? We see food hackers, we see life hackers. That's absolutely a type of hacker.

Tim Harford: Yeah. And my mother, I think, would have described herself as a hacker before she died. She loved to take apart computers. She loved to take apart software. She just wanted to know how everything worked, and when she put it back together again, it sometimes worked how she wanted it to work, rather than how it was originally designed. But how was it that you originally became interested in this strange craft of hacking?

Snow: I actually got involved, and figured out I wanted to do this, a little bit late in life.
I was in my mid-twenties and I went to the world's largest hacking conference, which takes place every year in Las Vegas. I went with a group of friends and my husband, and I had honestly no interest at all. I wanted to go to Vegas and sip drinks by the pool. But they got me a pass to attend this really cool conference, and we sat in on the first talk, and it was extremely technical. They were going through, step by step, how to reverse malware, and I fell asleep. I completely just zoned out. It didn't make sense to me. So I got up and I started wandering around this huge conference, and I found what was called the Lock Picking Village. I was very confused by that. Like, why do people want to pick locks?

Tim Harford: I mean, there was an obvious answer to that question, but okay.

Snow: That's very true. So at that point in my life, it did not, like, click at all. And so I'm walking, and someone's like, hey, do you want to learn how to pick a lock? I said sure, and so they sat me down and taught me everything.
And there's something magical that happens when someone picks a lock for the first time. Like, you can see it in their face, where it's like, wow, that was really cool and easy, and then the, oh shit, I just picked a lock, and they're envisioning everything in their life that's protected by locks, right? File cabinets, their door, things that protect their children, all these things that you have locks to protect, and you just picked it in seconds. So that was the most eye-opening moment for me that really launched me into this career and thinking that I could do it for a living.

Tim Harford: Well, it feels like a long gap between that, or a big gap at least, maybe not a long one, between that initial spark of, wow, I can pick a lock, this matters, to realizing there's a career in this and I might actually be good at this career. So how did you figure out there's a job being a hacker, and how did you figure out that you actually might be good at doing that job?
Snow: So once I was at that conference, I had met so many different people who explained what they do for a living, and again, at that point in my life, it felt like that shouldn't be possible, right? People are getting paid money to break into clients' networks, into their computers and all these things, and it still didn't add up. But what really stood out for me was another village at the same conference, DEF CON, called the Social Engineering Village. And when I walked in, they were actually placing live phone calls to people to try to elicit information. And so I'm sitting there in the audience listening to how these people were doing it. I'm like, wow. Like, I'm a people person, I've done sales. I could absolutely do this.
So from there, I talked to a bunch of people that I'd just met. Like, my goal was just to meet people and ask questions at that point. And I found every book I could on the subject matter, went home and practiced and taught myself, and actually went back and competed in that same competition three years in a row. And I won my third year, which was huge. But that really was able to propel me into this career, where a company actually saw me placing these calls and asked me, like, hey, do you want a job? And that was my first job. It was super exciting.

Malcolm Gladwell: In three years, Snow went from amateur hacking enthusiast to hacking professional. Companies started to pay her real money to test their information security. But remember, Snow's line of work isn't just limited to email servers and data networks. She's a people hacker. Instead of trying to bypass a firewall or cracking a password, she uses what's called social engineering to trick users into letting her into systems where she doesn't belong.
In her work on what's called a red team, Snow explains how the technical and the human sides of hacking come together.

Snow: So a red team is a group of offensive security professionals, or hackers. At IBM, on our X-Force team, we have a whole team dedicated to what we call adversary simulation. But our red team, and how it works, is a client comes in and says, these are our crown jewels, we want to make sure you cannot access them. We spend our time trying to access them, and along the way we have tons of meetings with our clients, giving them status updates on where we are. But it's a very long engagement, to try to get access to the most sensitive things that our clients have.

Tim Harford: So how do they brief you? I mean, how do they brief you in such a way as to not give away the stuff that they're trying not to give away, if that makes any sense?

Snow: Yeah. So they stay as high level as possible. They might say, let's use IP, for example, right? They have this secret sauce that, if their competitors or anyone else get it, they can pretty much copy their business.
And so that information probably lives on something that's very secure, in a couple of documents that, hopefully, limited people have access to.

Tim Harford: Yeah, so a certain soft drink's secret recipe, for example, mentioning no particular brand names.

Snow: Yes, exactly. So they might say, okay, we have this secret recipe, and we want to see if you can get it. They won't give us any details as to where it's stored, or any other information; they'll just say, go. They might have a couple of things that are off limits, but in general it's, can we get this by any means possible? So a lot of social engineering is used, whether it's phone calls or emails, sometimes on site, and a good amount of technical hacking, right? If we get into one person's computer, can we move into another's? And then can we move into a server? It's a lot of moving around and digging, but at the end of the day, we're pretty successful with these types of engagements.

Tim Harford: And you mentioned certain things being off limits, because really the bad hackers don't care what's off limits
and what is not. So what are the kinds of things that the clients are saying you're not allowed to do, that that's cheating?

Snow: Yeah. So what we will see a good handful of times is, do not mess with our executives. Like, don't send our CEO an email. Which, again, bad guys do not have limits, and they will absolutely continue to do that, so unfortunately we have to expect those. But we will every once in a while run into a good handful of things. Or maybe they have another system that, I don't know, runs something sensitive, right? Maybe it's a medical device company. They're like, okay, do not access this system, because, you know, people's lives could be on the line. So we won't even touch those types of systems. At the end of the day, it really depends on what they don't want us to have access to.

Tim Harford: Well, you're people hackers, so you're doing it with people. So, I mean, what does that look like? Is it literally phoning people up and persuading them to give you passwords, or is it a bit more complicated than that these days?
Snow: So I break down social engineering in two ways: you either have remote or on site. When you look at the remote side, you're looking at a couple of different things. The first one is what we call OSINT, which stands for open source intelligence, and that's actually not actively hacking a person, but looking at their online accounts: are they revealing information that they shouldn't be, that an attacker could leverage? So that's one type of assessment. Then we have vishing, or voice phishing; that's placing those phone calls to get information, or maybe get them to do a task over the phone. And then phishing, and that's by far the most common social engineering type of assessment. That's the malicious email with a link or an attachment, or even a conversation. And then we move into the on-site stuff, and this is my favorite. It's the most tangible, but it's actually breaking and entering. So it's trying to get access to clients' sensitive locations and sensitive data. So those are the two types of social engineering.
Tim Harford: Give me a little bit of advice, then. If you're trying to find a weakness, if you're trying to persuade somebody to do something they shouldn't be doing, what are the kinds of things that you're doing?

Snow: So let's just take the physical part for an example. It's tailgating, right? That sounds so easy and so obvious, but it's the number one way that we break into buildings. It's just following someone who badges in, who unlocks the door, who has that access. We just follow them. And people are trained all the time: don't let anyone follow you, check the badge behind you, make sure people badge in, all of these policies. But when it comes down to it, people are a little bit scared to ask to see the badge, to question them.

Tim Harford: It feels rude, somehow.

Snow: Yes. It's human nature to want to help, so that goes against everything that people are used to doing. So that's by far the number one way that we get into buildings.

Tim Harford: Now, I understand that before you got into this game, you were a makeup artist for independent films.
Is there a connection, and it seems like a stretch, between being a makeup artist and being a people hacker?

Snow: Yeah, you would think those things absolutely don't go together at all. However, I've been pretty lucky, or I've been able, to leverage a little bit of the makeup art and special effects when we do the physical security assessments. So maybe we get caught on the first day, or maybe someone's suspicious, so we don't want to go back and blow our cover, so we'll change our appearance as much as possible when we go back the next day. So it's absolutely something that I leverage all the time. And it's a lot of fun, too. It just adds a little bit more to the job.

Tim Harford: It sounds like it's more creative than I would have expected a cybersecurity job to be.

Snow: Oh, absolutely. When you think of cybersecurity, you just think of someone sitting at a computer, typing all day. That is not my job at all. It's pretty amazing how much I can leverage creativity in what I do day to day.
Tim Harford: Can you give me an example?

Snow: So I actually have a story, if you're ready for a break-in story. It's one of the ones that absolutely went wrong. Our client was based out of the US, and they had just opened their European branch of their headquarters in Amsterdam. And so they wanted us to test the building's physical security, to see if it's protecting their people and their data. And so some of the goals were to see if we can get inside, past all the badged areas where we shouldn't have access, and see if we notice anything that's out of place, or maybe red flags, or something that they should fix. So we always start with our OSINT, or open source intelligence, where we're going online, investigating the location. We're working Google Maps as much as we can. However, this building was so new that they weren't even on Google Maps yet, so we had a really hard time finding all of this information. We decided we just had to show up on site to see what we could do. So I walk into the building and walk into the lobby.
The second I walk in, the lady pretty much kicked me out. I didn't even get to open my mouth or explain why I was there. Right out of the gate, just: get out. And for doing this type of an assessment, that was horrible. This client paid all this money to get me out there to test their physical security, and here I am, getting kicked out within the first five minutes. So that was awful.

Tim Harford: Security is pretty good, then.

Snow: Yeah, yeah, no, their receptionist was on her game. So I went back to my hotel room and, like, was banging my head against the wall. Like, how do I get in? I can't find information online. They're kicking me out before I'm even trying. Like, I just wanted to go in and see what it looked like, because I had no idea what I was walking into. So I went back online, like, okay, I have to figure this out. And finally, out of nowhere, it popped into my head.
Okay, it has to be someone that's not local, because I'm not from Amsterdam, and I have to leverage some type of position of authority, some reason why I'm supposed to be there. And so I thought: investor relations. I am going to pretend to be an investor relations manager from the US, and I'm going to the new site, meeting with some potential investors. And so I called the receptionist. I spoofed my number, so I made it look like I was calling from the US location, and changed my voice a little bit, and said that we have someone who's going to be coming on site tomorrow; please give them whatever they need. They're going to be meeting with all these high-end clients, potentially, so just make sure they're comfortable. The next day, I walk in, and again I had to change my appearance a bit, because she had seen me and I didn't want her to recognize me. And she welcomed me, she got me coffee, she set me up in the office where they had my name on the front door, and it was all, how can we help? So from there I was able to go through and complete my objectives.
But it's kind of amazing how much you have to leverage creativity, and even kind of on-the-spot improv sometimes, to actually complete these objectives.

Tim Harford: Yeah, improv was the word that sprang to mind hearing that story. I would imagine that there must be some playbook, that there's a bunch of things you try, and then you have to improvise if the playbook isn't working. Is that playbook always changing? Is it this constant arms race?

Snow: Constantly. It also depends on who my target is, right? I will change the way I ask questions, the way I set things up, just completely everything, depending on if I'm talking to someone younger or older, or male or female. Like, there's a lot of things that I absolutely adapt to whoever I'm speaking to, at the end of the day, because people are different, and I want to try to make sure whoever I'm talking to is comfortable, and I can get them to trust me.

Tim Harford: And is there a collaborative process to this kind of ethical hacking, or is it very much a lone wolf thing?

Snow: It's really both.
It just depends on what the type of assessment is, and there's a lot of variables. I prefer a team, right, working with as many people as possible, because I might be looking at a problem from, you know, my perspective, but if I have two or three other people with completely different backgrounds and sets of experience, they're thinking about it from another perspective. So the more we collaborate and work together, typically the more successful we can be as well.

Tim Harford: I'm curious about a day in the life of Snow. I mean, on a completely typical day, what is it that you're doing?

Snow: So that's what I love about my job: I don't have a typical day. One day I could be waking up in Manhattan, breaking into a building, and the next day I could be in my home office, writing a report. Like, it's all over the place, and that's what makes it super exciting. It's not mundane. It's constantly changing, and I love that.

Tim Harford: It's like, yeah, one day I'm writing a report, the other day I'm breaking into a building in Manhattan. It's perfect.

Snow: Absolutely.
One description 340 00:18:59,320 --> 00:19:02,040 Speaker 1: I've seen is that you're like a secret shopper, except 341 00:19:02,240 --> 00:19:04,800 Speaker 1: instead of being a secret shopper for a restaurateur or 342 00:19:04,840 --> 00:19:07,720 Speaker 1: a chain store, you're a secret shopper for breaking in 343 00:19:07,840 --> 00:19:11,960 Speaker 1: and stealing passwords. Is that accurate? I 344 00:19:12,000 --> 00:19:14,639 Speaker 1: would say that's accurate. And if people are hiring 345 00:19:14,680 --> 00:19:18,040 Speaker 1: you to probe their security and to find the weaknesses, 346 00:19:19,400 --> 00:19:21,840 Speaker 1: have you ever come back and said, no, it's perfect, 347 00:19:21,880 --> 00:19:25,280 Speaker 1: I got nothing, I couldn't get in? So I have 348 00:19:25,440 --> 00:19:29,440 Speaker 1: broken into over a hundred and thirty unique buildings. I've 349 00:19:29,480 --> 00:19:31,800 Speaker 1: only had one of those buildings I was not able 350 00:19:31,840 --> 00:19:35,520 Speaker 1: to break into, and that is because it was a 351 00:19:35,560 --> 00:19:38,760 Speaker 1: small company in the middle of nowhere where everyone knew 352 00:19:38,800 --> 00:19:42,760 Speaker 1: each other. It's not necessarily because they had all 353 00:19:42,800 --> 00:19:45,920 Speaker 1: these, you know, expensive security controls in place. 354 00:19:46,000 --> 00:19:48,280 Speaker 1: It was just I stuck out like a sore thumb, 355 00:19:48,359 --> 00:19:51,160 Speaker 1: and no matter what I said, they knew I wasn't 356 00:19:51,200 --> 00:19:53,840 Speaker 1: supposed to be there.
But it's kind of scary some 357 00:19:53,920 --> 00:19:58,520 Speaker 1: of the very large organizations in these famous skyscrapers that 358 00:19:58,560 --> 00:20:01,960 Speaker 1: I've broken into, where they've invested hundreds of thousands, if 359 00:20:02,000 --> 00:20:05,760 Speaker 1: not millions of dollars into their physical security, but I'm 360 00:20:05,800 --> 00:20:09,280 Speaker 1: able to get in, right? That's kind of terrifying if 361 00:20:09,280 --> 00:20:12,439 Speaker 1: you think about it. Whether it's brick and mortar hacking 362 00:20:12,880 --> 00:20:15,800 Speaker 1: or using something much more high tech, it's all founded 363 00:20:15,840 --> 00:20:19,920 Speaker 1: on the same principle, using deception to get what you want. 364 00:20:20,680 --> 00:20:23,800 Speaker 1: To round out their conversation, Tim and Snow talk about 365 00:20:23,800 --> 00:20:27,600 Speaker 1: the state of the global cybersecurity industry, where the art 366 00:20:27,600 --> 00:20:30,600 Speaker 1: of the con is headed, and how prepared companies are 367 00:20:30,720 --> 00:20:33,560 Speaker 1: for any of it. Let's zoom back a bit now 368 00:20:33,760 --> 00:20:36,000 Speaker 1: and take in, you know, the state of 369 00:20:36,080 --> 00:20:40,159 Speaker 1: the global hacking industry, if that's a phrase, or the 370 00:20:40,160 --> 00:20:45,879 Speaker 1: global security industry, and what has changed in security and 371 00:20:45,920 --> 00:20:49,040 Speaker 1: cybersecurity over the last few years. What are the new trends? 372 00:20:50,200 --> 00:20:54,040 Speaker 1: So what's changed? I would say more of our lives 373 00:20:54,040 --> 00:20:57,720 Speaker 1: are online, and that's kind of scary.
Everything from 374 00:20:57,840 --> 00:21:02,440 Speaker 1: your IoT lightbulb to your oven, IoT being 375 00:21:02,440 --> 00:21:05,080 Speaker 1: the Internet of Things. So basically everything 376 00:21:05,119 --> 00:21:08,760 Speaker 1: has a web address now? Exactly, and so there's so 377 00:21:08,840 --> 00:21:12,000 Speaker 1: much more of that now. It just surrounds us. 378 00:21:12,080 --> 00:21:15,960 Speaker 1: Our lives are online, and with that 379 00:21:16,080 --> 00:21:18,760 Speaker 1: much being online, that's just more that we have to 380 00:21:18,800 --> 00:21:22,320 Speaker 1: protect, or more that we have to worry about, unfortunately. 381 00:21:22,960 --> 00:21:27,400 Speaker 1: That clearly raises the stakes. I would have hoped there's 382 00:21:27,440 --> 00:21:31,440 Speaker 1: also more awareness. People don't fall for the most obvious 383 00:21:32,160 --> 00:21:36,920 Speaker 1: scams and tricks anymore. And do you think companies put 384 00:21:37,040 --> 00:21:40,480 Speaker 1: enough emphasis on security? Is it a high enough priority 385 00:21:40,720 --> 00:21:44,880 Speaker 1: at the C-suite level? I wish I could say yes. However, 386 00:21:45,000 --> 00:21:47,520 Speaker 1: it's all over the board. I've worked with clients 387 00:21:47,520 --> 00:21:51,520 Speaker 1: who put everything they have into stopping attackers, into 388 00:21:51,560 --> 00:21:54,359 Speaker 1: securing their environment. I've seen some clients in the past 389 00:21:54,400 --> 00:21:56,560 Speaker 1: who just want to get the check in the box 390 00:21:56,560 --> 00:21:58,760 Speaker 1: that they did their assessments and they want to move 391 00:21:58,800 --> 00:22:01,640 Speaker 1: on to something else. So unfortunately, it's a pretty big 392 00:22:01,760 --> 00:22:06,280 Speaker 1: range of types of people who really have that security mindset.
393 00:22:06,720 --> 00:22:12,320 Speaker 1: And I'm always reading stories in the news about breaches, 394 00:22:12,520 --> 00:22:17,000 Speaker 1: these security breaches, and sometimes they sound 395 00:22:17,240 --> 00:22:22,480 Speaker 1: very sensational. Sometimes they sound incredibly banal, like, oh yeah, 396 00:22:22,520 --> 00:22:27,640 Speaker 1: somebody just stuck all the passwords online in plain text. Whoops. 397 00:22:28,359 --> 00:22:32,760 Speaker 1: I mean, is there a standard procedure for the bad actors? 398 00:22:32,960 --> 00:22:38,640 Speaker 1: Is there a way that breaches happen like this? Not 399 00:22:38,760 --> 00:22:41,119 Speaker 1: these days, just because there's so many different ways they 400 00:22:41,160 --> 00:22:44,399 Speaker 1: get in. I mean, most of them are financially motivated. 401 00:22:44,440 --> 00:22:45,880 Speaker 1: So at the end of the day, once they get 402 00:22:45,880 --> 00:22:48,600 Speaker 1: in, they're going to see if they 403 00:22:48,600 --> 00:22:52,320 Speaker 1: can get money somehow, whether it's ransomware or they're looking 404 00:22:52,359 --> 00:22:56,919 Speaker 1: for credentials of high-end executives. Right, it kind of 405 00:22:56,920 --> 00:23:00,560 Speaker 1: depends on their angle. But really, how they're 406 00:23:00,560 --> 00:23:04,280 Speaker 1: getting in is pretty tricky again.
Social engineering is 407 00:23:04,400 --> 00:23:07,200 Speaker 1: one of the number one ways to get in, typically 408 00:23:07,200 --> 00:23:10,639 Speaker 1: through phishing, um, sending some type of malicious payload, and 409 00:23:10,680 --> 00:23:13,760 Speaker 1: if their target does open it, that gets them into 410 00:23:13,800 --> 00:23:15,800 Speaker 1: their environment, and then they kind of pivot from there 411 00:23:15,840 --> 00:23:18,679 Speaker 1: and see what they could get access to. And how 412 00:23:18,760 --> 00:23:23,920 Speaker 1: much does it cost when security is breached? So IBM did 413 00:23:23,920 --> 00:23:27,280 Speaker 1: a report where the cost 414 00:23:27,359 --> 00:23:31,320 Speaker 1: of an average data breach was over four million dollars, 415 00:23:32,080 --> 00:23:35,040 Speaker 1: which is insane to think about. It kind of makes 416 00:23:35,040 --> 00:23:38,360 Speaker 1: you wonder why they don't put more emphasis on their 417 00:23:38,400 --> 00:23:42,439 Speaker 1: security and security awareness training and updating their machines and 418 00:23:42,480 --> 00:23:45,000 Speaker 1: things like that. When you think about how big 419 00:23:45,040 --> 00:23:48,560 Speaker 1: that number is, why? There's tons of reasons. They could 420 00:23:48,600 --> 00:23:51,000 Speaker 1: have fines that they have to pay out, depending on 421 00:23:51,080 --> 00:23:54,439 Speaker 1: what industry they're in. They have to pay out for 422 00:23:54,520 --> 00:23:59,040 Speaker 1: things like credit monitoring for whoever is affected, um, legal 423 00:23:59,119 --> 00:24:01,639 Speaker 1: fees. Like, there's tons and tons of things that 424 00:24:01,640 --> 00:24:05,400 Speaker 1: are involved.
When the company actually gets breached, there's 425 00:24:05,400 --> 00:24:07,159 Speaker 1: a couple of things they could do to try to 426 00:24:07,200 --> 00:24:09,680 Speaker 1: prevent that, um, and the first one is to hire 427 00:24:09,720 --> 00:24:14,160 Speaker 1: folks like myself to come in and test their environments 428 00:24:14,200 --> 00:24:18,440 Speaker 1: to see where those vulnerabilities are so they can patch them. Um, 429 00:24:18,480 --> 00:24:21,960 Speaker 1: to do ongoing training for their internal team to make 430 00:24:22,000 --> 00:24:24,440 Speaker 1: sure they're up to date, they know how to stop 431 00:24:24,480 --> 00:24:28,400 Speaker 1: these types of attacks. And really, just caring about security 432 00:24:28,400 --> 00:24:32,119 Speaker 1: in general goes a long way. Now, I mean, in 433 00:24:32,160 --> 00:24:35,160 Speaker 1: some ways, what you're describing is tremendously varied, lots 434 00:24:35,200 --> 00:24:38,960 Speaker 1: of creativity, lots of improvisation, lots of variety. In other ways, 435 00:24:38,960 --> 00:24:41,360 Speaker 1: it seems kind of simple. You're trying to break 436 00:24:41,400 --> 00:24:44,800 Speaker 1: into places. So what's the state of the art, and 437 00:24:44,840 --> 00:24:47,119 Speaker 1: how do you advance the state of the art in 438 00:24:47,400 --> 00:24:51,719 Speaker 1: people hacking? Unfortunately, social engineering is kind of stagnant. 439 00:24:52,119 --> 00:24:54,520 Speaker 1: I mean, if you say that's unfortunate, it 440 00:24:54,560 --> 00:24:57,440 Speaker 1: feels kind of like it might be good 441 00:24:57,440 --> 00:25:01,200 Speaker 1: news. For me, it's unfortunate. Okay, I'm looking from the 442 00:25:01,240 --> 00:25:04,040 Speaker 1: attacker point of view, so that's very correct.
Um, 443 00:25:04,040 --> 00:25:06,240 Speaker 1: but if you go back to the Middle Ages, there 444 00:25:06,280 --> 00:25:09,160 Speaker 1: were cons that people were doing back then. Um, there's 445 00:25:09,240 --> 00:25:13,200 Speaker 1: tons of cons from the early nineteen hundreds, and still 446 00:25:13,280 --> 00:25:15,320 Speaker 1: we're taking some of those kinds of cons and just 447 00:25:15,400 --> 00:25:19,560 Speaker 1: adapting them to today's digital world, and there's improvements there, 448 00:25:20,119 --> 00:25:23,720 Speaker 1: but in general social engineering, there's not much that's 449 00:25:24,680 --> 00:25:27,120 Speaker 1: changing. So that's actually one of the things that 450 00:25:27,400 --> 00:25:30,280 Speaker 1: I have put a lot of emphasis on the last year, 451 00:25:30,359 --> 00:25:32,840 Speaker 1: especially with my team, is once we go in and 452 00:25:32,840 --> 00:25:38,640 Speaker 1: we complete an assessment, we spend the last part trying something new, 453 00:25:38,720 --> 00:25:43,320 Speaker 1: trying something novel. Can this technique work? Maybe it's walking 454 00:25:43,320 --> 00:25:45,240 Speaker 1: into a building saying, hey, I shouldn't be here, will 455 00:25:45,280 --> 00:25:48,199 Speaker 1: someone stop us? Right? Any little thing like that. What 456 00:25:48,320 --> 00:25:51,440 Speaker 1: can we actually get away with? And that's something 457 00:25:51,480 --> 00:25:54,440 Speaker 1: that I've enjoyed doing and pushing my team to see 458 00:25:54,560 --> 00:25:57,840 Speaker 1: what we can learn and where those boundaries are. Can 459 00:25:57,880 --> 00:26:02,080 Speaker 1: you give me an example of a medieval con? I'm very curious. Yes. Okay, 460 00:26:02,160 --> 00:26:06,280 Speaker 1: so in the Middle Ages — have you ever 461 00:26:06,320 --> 00:26:09,159 Speaker 1: heard the term pig in a poke? Uh, yeah, I've 462 00:26:09,200 --> 00:26:12,080 Speaker 1: heard the term.
I always wondered where it came from. Yeah, 463 00:26:12,160 --> 00:26:16,440 Speaker 1: so pig in a poke came from vendors at the time, 464 00:26:16,280 --> 00:26:19,240 Speaker 1: or people who worked on the street and sold 465 00:26:19,520 --> 00:26:22,720 Speaker 1: various goods and foods. They would put a suckling pig 466 00:26:22,800 --> 00:26:24,399 Speaker 1: inside of what they called a poke, which is a 467 00:26:24,440 --> 00:26:26,919 Speaker 1: burlap sack, and sew it shut, and that's what 468 00:26:26,920 --> 00:26:30,200 Speaker 1: they would sell, and people would buy it and eat that for dinner. However, 469 00:26:30,359 --> 00:26:34,879 Speaker 1: at the time, there was no shortage of small dogs 470 00:26:34,880 --> 00:26:38,359 Speaker 1: and cats. So what some creative folks would do is 471 00:26:38,600 --> 00:26:41,199 Speaker 1: put those types of animals inside of the sack and 472 00:26:41,240 --> 00:26:44,199 Speaker 1: sew it shut, and make a lot of money, and 473 00:26:44,200 --> 00:26:47,280 Speaker 1: then move on to the next city and continue that 474 00:26:47,359 --> 00:26:51,960 Speaker 1: con. So again, cons have been around for the longest time. 475 00:26:54,800 --> 00:26:58,760 Speaker 1: I suppose the fact that cons themselves haven't changed that much, 476 00:26:59,400 --> 00:27:01,720 Speaker 1: I mean, you know, it seems to make life easy, 477 00:27:01,800 --> 00:27:04,199 Speaker 1: right, when nothing changes. But in another way, that 478 00:27:04,240 --> 00:27:07,040 Speaker 1: just goes to show that we all have the 479 00:27:07,080 --> 00:27:09,520 Speaker 1: same vulnerabilities over and over again, and people have been 480 00:27:09,560 --> 00:27:13,440 Speaker 1: exploiting them for centuries. Exactly. If it's not broke, why 481 00:27:13,600 --> 00:27:16,960 Speaker 1: fix it? Yes. Or if it's broken in a way that will 482 00:27:17,080 --> 00:27:22,920 Speaker 1: enable you to exploit it. I really enjoyed this conversation.
Thank 483 00:27:22,920 --> 00:27:25,480 Speaker 1: you so much, and goodbye. Absolutely, thank you so much 484 00:27:25,480 --> 00:27:30,240 Speaker 1: for having me. Snow mentioned something that's really hard to forget. 485 00:27:30,680 --> 00:27:33,240 Speaker 1: She's tried to break into over a hundred and thirty 486 00:27:33,320 --> 00:27:36,880 Speaker 1: unique buildings, and out of those, she's had only one, 487 00:27:37,200 --> 00:27:40,800 Speaker 1: one that she wasn't able to break into. That's bananas. 488 00:27:41,720 --> 00:27:43,720 Speaker 1: What Snow taught us is that we have to think of 489 00:27:43,760 --> 00:27:47,960 Speaker 1: information security in a much more holistic way. It has 490 00:27:48,000 --> 00:27:52,720 Speaker 1: to involve networks and computers, but also employees and office buildings. 491 00:27:53,400 --> 00:27:57,320 Speaker 1: Of course, no defense is ever perfect, and that's why 492 00:27:57,320 --> 00:28:00,320 Speaker 1: it's important for companies to have people like Snow on 493 00:28:00,359 --> 00:28:03,439 Speaker 1: their side, because in a world where business is bound 494 00:28:03,480 --> 00:28:06,320 Speaker 1: to be hacked, the real question is, is there a 495 00:28:06,359 --> 00:28:11,919 Speaker 1: good hacker hacking for you? On the next episode of 496 00:28:12,000 --> 00:28:16,800 Speaker 1: Smart Talks with IBM, the Mayflower Autonomous Ship, how IBM's 497 00:28:16,920 --> 00:28:21,840 Speaker 1: artificial intelligence is powering the world's very first autonomous vessel. 498 00:28:22,600 --> 00:28:25,479 Speaker 1: We talked with Brett Phaneuf and Don Scott about how 499 00:28:25,480 --> 00:28:30,840 Speaker 1: they're using IBM tech to revolutionize oceanography. Smart Talks with 500 00:28:30,880 --> 00:28:34,760 Speaker 1: IBM is produced by Molly Sosha, David Jha, Royston Reserve, 501 00:28:35,359 --> 00:28:40,040 Speaker 1: and Edith Rousselo, with Jacob Goldstein. We're edited by Jen Guerra.
502 00:28:40,360 --> 00:28:44,680 Speaker 1: Our engineers are Jason Gambrell, Sarah Bruguiere and Ben Tolliday. 503 00:28:44,920 --> 00:28:50,240 Speaker 1: Theme song by Gramoscope. Special thanks to Carly Migliori, Andy Kelly, 504 00:28:50,360 --> 00:28:54,320 Speaker 1: Kathy Callaghan and the Eight Bar and IBM teams, as 505 00:28:54,360 --> 00:28:58,360 Speaker 1: well as the Pushkin marketing team. Smart Talks with IBM 506 00:28:58,400 --> 00:29:01,240 Speaker 1: is a production of Pushkin Industries and I Heart Media. 507 00:29:01,640 --> 00:29:04,600 Speaker 1: To find more Pushkin podcasts, listen on the I Heart 508 00:29:04,680 --> 00:29:09,080 Speaker 1: Radio app, Apple Podcasts, or wherever you listen to podcasts. 509 00:29:09,480 --> 00:29:25,760 Speaker 1: I'm Malcolm Gladwell. This is a paid advertisement from IBM.