1 00:00:02,800 --> 00:00:04,840 Speaker 1: The bad guys like to attack over holidays, so it's 2 00:00:04,840 --> 00:00:05,720 Speaker 1: really not fun for me. 3 00:00:06,040 --> 00:00:10,200 Speaker 2: That's Curtis Minder, a renowned ransomware negotiator, telling me about 4 00:00:10,200 --> 00:00:12,320 Speaker 2: a time when he picked up an emergency call on 5 00:00:12,400 --> 00:00:13,280 Speaker 2: a major holiday. 6 00:00:13,640 --> 00:00:16,600 Speaker 1: The initial call is always very emotional, as you can imagine, 7 00:00:17,000 --> 00:00:19,000 Speaker 1: even in the large companies, you know, you may have 8 00:00:19,040 --> 00:00:22,400 Speaker 1: a boardroom of people, and it's very emotional. 9 00:00:22,760 --> 00:00:24,840 Speaker 2: On the other end of the call was a chemical 10 00:00:24,880 --> 00:00:28,800 Speaker 2: manufacturing company who'd been locked out of their own assembly line. 11 00:00:29,000 --> 00:00:32,400 Speaker 1: They had a complete operational interruption, so they couldn't manufacture 12 00:00:32,400 --> 00:00:32,960 Speaker 1: their product. 13 00:00:33,440 --> 00:00:35,839 Speaker 2: Costs can add up quickly when a cyber attack delays 14 00:00:35,840 --> 00:00:38,120 Speaker 2: a game studio's next release or leads to a data 15 00:00:38,159 --> 00:00:40,640 Speaker 2: breach at a bank. But when attackers shut down a 16 00:00:40,640 --> 00:00:43,960 Speaker 2: manufacturing line that's part of a global supply chain, you 17 00:00:44,000 --> 00:00:45,960 Speaker 2: can almost see the money circling the drain. 18 00:00:46,200 --> 00:00:49,559 Speaker 1: They were losing millions of dollars a day in revenue. 19 00:00:49,040 --> 00:00:51,840 Speaker 2: And for this chemical manufacturer, like with any business shut 20 00:00:51,840 --> 00:00:55,120 Speaker 2: down by ransomware, the losses went way beyond a few 21 00:00:55,200 --> 00:00:56,320 Speaker 2: days of missing shipments.
22 00:00:56,640 --> 00:00:59,240 Speaker 1: I call it the ransomware blast radius. It's like we 23 00:00:59,320 --> 00:01:02,200 Speaker 1: know the base impact. It's operational interruption. But what about 24 00:01:02,240 --> 00:01:04,920 Speaker 1: these other things? And so that's cost of goods going bad, 25 00:01:04,959 --> 00:01:08,200 Speaker 1: supplier confidence, that's hey, wait, you didn't make payroll for 26 00:01:08,200 --> 00:01:11,160 Speaker 1: two weeks, the attrition that just occurred, wouldn't that cost you? 27 00:01:11,560 --> 00:01:12,840 Speaker 3: Those are all part 28 00:01:12,600 --> 00:01:16,360 Speaker 1: of the fairly complex equation on total cost of impact. 29 00:01:17,120 --> 00:01:19,840 Speaker 1: That formula, if you will, kind of helps us decide 30 00:01:19,840 --> 00:01:22,920 Speaker 1: on whether to pay a bad guy or not, or 31 00:01:22,959 --> 00:01:24,360 Speaker 1: to engage a bad guy or not. 32 00:01:24,840 --> 00:01:28,360 Speaker 2: In this case, after finishing this exhausting analysis with Curtis, 33 00:01:28,400 --> 00:01:30,880 Speaker 2: the company decided to pay the ransom. 34 00:01:30,360 --> 00:01:33,720 Speaker 1: And my job as a negotiator is to make sure we 35 00:01:33,760 --> 00:01:36,200 Speaker 1: don't pay the price on the window. 36 00:01:35,760 --> 00:01:36,400 Speaker 3: On the sticker. 37 00:01:37,040 --> 00:01:40,200 Speaker 2: Before long, the systems were back online, products were going 38 00:01:40,240 --> 00:01:42,920 Speaker 2: out the door again, and Curtis was helping the company recover. 39 00:01:43,480 --> 00:01:45,280 Speaker 2: But when he sat down with the company's CISO, 40 00:01:45,760 --> 00:01:48,600 Speaker 2: he heard something that changed how he thought about the industry. 41 00:01:49,080 --> 00:01:52,120 Speaker 1: He said, Curtis, here's my biggest concern.
We have been 42 00:01:52,160 --> 00:01:55,320 Speaker 1: the manufacturer of this particular product for almost one hundred years, 43 00:01:55,960 --> 00:01:58,320 Speaker 1: and the way that we manufacture this product and the 44 00:01:58,360 --> 00:02:01,920 Speaker 1: materials we use to manufacture this product are our trade secret. 45 00:02:03,240 --> 00:02:06,360 Speaker 1: I am concerned that that information has left the building, 46 00:02:07,120 --> 00:02:10,000 Speaker 1: and I won't know about that risk for some time 47 00:02:10,160 --> 00:02:13,160 Speaker 1: until a competitor of mine makes the exact same product 48 00:02:13,200 --> 00:02:19,400 Speaker 1: five years from now and puts me out of business. 49 00:02:20,400 --> 00:02:25,600 Speaker 2: From Bloomberg Media Studios and Chrome Enterprise, this is Security Bookmarked. 50 00:02:28,240 --> 00:02:31,760 Speaker 2: I'm your host, Kate Fazzini. I've been a cybersecurity professional 51 00:02:31,840 --> 00:02:35,040 Speaker 2: and journalist for over twenty years, and on this podcast, 52 00:02:35,080 --> 00:02:38,560 Speaker 2: I'm talking with leaders in gaming, finance, and manufacturing about 53 00:02:38,560 --> 00:02:41,360 Speaker 2: what security looks like in a workplace that's moved to 54 00:02:41,440 --> 00:02:46,600 Speaker 2: the cloud. In twenty twenty three, ransomware attacks against manufacturers 55 00:02:46,600 --> 00:02:50,160 Speaker 2: and other industrial companies increased by fifty percent, and since 56 00:02:50,200 --> 00:02:55,760 Speaker 2: twenty nineteen, cybersecurity incidents targeting operational technology have risen exponentially. 57 00:02:57,320 --> 00:03:00,520 Speaker 2: So today I'm speaking with Curtis about why manufacturers are 58 00:03:00,520 --> 00:03:04,440 Speaker 2: facing more ransomware attacks than ever and how AI is 59 00:03:04,480 --> 00:03:08,280 Speaker 2: amplifying threats and offering new defenses for cybersecurity leaders.
60 00:03:08,560 --> 00:03:10,680 Speaker 1: I'm the founder of GroupSense, which is a digital 61 00:03:10,760 --> 00:03:14,560 Speaker 1: risk protection company. I'm also the lead ransomware negotiator at 62 00:03:14,560 --> 00:03:17,280 Speaker 1: GroupSense, and I have about thirty years in what's 63 00:03:17,320 --> 00:03:18,760 Speaker 1: now just called cyber. 64 00:03:18,760 --> 00:03:21,600 Speaker 2: Then I'll chat with David Adrian, security product manager for 65 00:03:21,720 --> 00:03:24,920 Speaker 2: Chrome, about how a web focused strategy can help manufacturers 66 00:03:24,960 --> 00:03:33,720 Speaker 2: secure the connection between their IT and their OT. The 67 00:03:33,840 --> 00:03:37,320 Speaker 2: job title of ransomware negotiator is still fairly new, but 68 00:03:37,400 --> 00:03:40,080 Speaker 2: Curtis has been dealing with cyber attackers since the early 69 00:03:40,160 --> 00:03:43,360 Speaker 2: nineties, when he worked on systems for an Internet service provider. 70 00:03:43,920 --> 00:03:46,720 Speaker 2: He's seen pretty much every kind of ransomware scenario you 71 00:03:46,720 --> 00:03:48,280 Speaker 2: could imagine. 72 00:03:47,760 --> 00:03:51,760 Speaker 1: Incidents where the victim has started the negotiation before we 73 00:03:51,840 --> 00:03:56,200 Speaker 1: showed up and has made some very, very novice mistakes. 74 00:03:56,640 --> 00:03:58,120 Speaker 3: We've also had incidents 75 00:03:57,680 --> 00:04:00,960 Speaker 1: where we're in the middle of the negotiation and the 76 00:04:01,040 --> 00:04:03,440 Speaker 1: threat actors get back in and do more damage, where 77 00:04:03,520 --> 00:04:06,600 Speaker 1: there's some confidence from the victim that hey, we've got 78 00:04:06,600 --> 00:04:08,720 Speaker 1: the doors locked, they can't get back in, and they 79 00:04:08,720 --> 00:04:11,520 Speaker 1: were wrong about that, and that causes issues.
80 00:04:12,280 --> 00:04:15,520 Speaker 2: Going back to his ransomware story, Curtis couldn't reveal exactly 81 00:04:15,560 --> 00:04:17,600 Speaker 2: how the attacker got in, but he told me they 82 00:04:17,600 --> 00:04:19,080 Speaker 2: didn't have to be very creative. 83 00:04:19,400 --> 00:04:22,440 Speaker 1: One of the things that is frustrating for us is 84 00:04:22,480 --> 00:04:24,800 Speaker 1: that at the end of this we're taking stock on 85 00:04:24,839 --> 00:04:28,159 Speaker 1: how the threat actors gained access, and it can be 86 00:04:28,200 --> 00:04:33,440 Speaker 1: distilled down into like seven to eight sort of preventable things. 87 00:04:33,600 --> 00:04:37,400 Speaker 2: Strong passwords, multi factor authentication, staying on top of your 88 00:04:37,480 --> 00:04:41,200 Speaker 2: updates and patches, securing remote access. These are just a 89 00:04:41,200 --> 00:04:44,039 Speaker 2: few of the things Curtis considers low hanging fruit for 90 00:04:44,160 --> 00:04:45,440 Speaker 2: any company. 91 00:04:45,480 --> 00:04:47,880 Speaker 1: They're trying to gain access to your systems as cheaply 92 00:04:47,960 --> 00:04:51,160 Speaker 1: and as efficiently as possible, and so they're not buying 93 00:04:51,680 --> 00:04:53,680 Speaker 1: zero days on the dark web 94 00:04:54,279 --> 00:04:56,159 Speaker 1: to break into your network because they don't have to. 95 00:04:56,880 --> 00:05:00,279 Speaker 1: They can use some very simple mistakes 96 00:05:00,360 --> 00:05:03,960 Speaker 1: in cyber hygiene or processes to gain access, and often that is 97 00:05:04,000 --> 00:05:05,839 Speaker 1: the case. It is something fairly simple to gain the 98 00:05:05,839 --> 00:05:08,679 Speaker 1: initial access, and then once they're in, they're very good 99 00:05:08,800 --> 00:05:10,600 Speaker 1: at expanding their access and pivoting.
100 00:05:11,320 --> 00:05:13,479 Speaker 2: Later in the episode, I'll chat with David Adrian at 101 00:05:13,520 --> 00:05:16,280 Speaker 2: Chrome about how a web focused strategy can secure that 102 00:05:16,320 --> 00:05:17,200 Speaker 2: point of access. 103 00:05:17,400 --> 00:05:18,640 Speaker 3: But first I'll hear 104 00:05:18,600 --> 00:05:22,360 Speaker 2: more from Curtis about his experiences helping manufacturers recover from 105 00:05:22,440 --> 00:05:25,200 Speaker 2: ransomware attacks and what he sees in the near future 106 00:05:25,240 --> 00:05:26,679 Speaker 2: for enterprise cybersecurity. 107 00:05:29,440 --> 00:05:32,400 Speaker 1: You know, when you talk about partners or constituents who 108 00:05:33,000 --> 00:05:37,000 Speaker 1: lose confidence in the manufacturing and supply chain space, a 109 00:05:37,000 --> 00:05:41,520 Speaker 1: lot of these companies have a fairly robust supply chain 110 00:05:41,720 --> 00:05:46,000 Speaker 1: resiliency strategy, right, and if one of your manufacturers in 111 00:05:46,040 --> 00:05:50,880 Speaker 1: your supply chain stops producing, you've got a backup or 112 00:05:50,920 --> 00:05:54,000 Speaker 1: two or three, and you might not ever go 113 00:05:54,120 --> 00:05:57,400 Speaker 1: back to that manufacturer. When I'm talking to companies about 114 00:05:57,400 --> 00:06:00,280 Speaker 1: how to prepare and respond to this in advance of 115 00:06:00,279 --> 00:06:03,880 Speaker 1: an attack, I tell them that when the dust settles 116 00:06:04,440 --> 00:06:08,000 Speaker 1: on an attack, you're going to need a tremendous amount 117 00:06:08,040 --> 00:06:11,599 Speaker 1: of goodwill from your community, and the quickest way to 118 00:06:11,680 --> 00:06:14,919 Speaker 1: make that go away is to lie to them or 119 00:06:15,000 --> 00:06:17,560 Speaker 1: make them think you're lying to them or withholding information.
120 00:06:17,880 --> 00:06:21,599 Speaker 1: And so their ability to address this quickly and also 121 00:06:22,000 --> 00:06:25,160 Speaker 1: communicate transparently is so important. 122 00:06:25,400 --> 00:06:27,880 Speaker 2: Yes, I'm so glad you're saying that. I've seen 123 00:06:27,920 --> 00:06:31,480 Speaker 2: the communication piece go so wrong, both as a practitioner 124 00:06:31,600 --> 00:06:34,240 Speaker 2: and then as a reporter, even though that doesn't have 125 00:06:34,320 --> 00:06:36,880 Speaker 2: to be the case. So thank you for emphasizing that. Now, 126 00:06:37,040 --> 00:06:39,400 Speaker 2: going back to the start of your ransomware story, I 127 00:06:39,440 --> 00:06:43,680 Speaker 2: want to ask something more simple. Why are manufacturers, and 128 00:06:43,720 --> 00:06:47,880 Speaker 2: in particular operational technology itself, a target to begin with? 129 00:06:48,200 --> 00:06:50,640 Speaker 1: Yeah, I think increasingly, like everywhere else in the world, 130 00:06:50,680 --> 00:06:54,680 Speaker 1: the devices in manufacturing are connected, and the reason why 131 00:06:54,720 --> 00:06:57,080 Speaker 1: we're connecting them is data. We want to manage them, 132 00:06:57,080 --> 00:06:59,200 Speaker 1: we want to optimize them, we want to look for 133 00:06:59,320 --> 00:07:01,960 Speaker 1: errors and mistakes and things like that. And so as 134 00:07:02,000 --> 00:07:08,360 Speaker 1: we've implemented technology to manage those manufacturing devices and connected 135 00:07:08,400 --> 00:07:12,640 Speaker 1: those systems to the network, we've introduced a new attack 136 00:07:13,000 --> 00:07:14,040 Speaker 1: vector for the bad guys. 137 00:07:14,240 --> 00:07:17,160 Speaker 2: And it's not just one attack vector, right, there's this 138 00:07:17,240 --> 00:07:20,320 Speaker 2: whole Internet of things now, lots of new devices attached 139 00:07:20,360 --> 00:07:20,920 Speaker 2: to the network.
140 00:07:20,960 --> 00:07:22,200 Speaker 3: They're all targets. Yeah. 141 00:07:22,240 --> 00:07:24,760 Speaker 1: So in a manufacturing environment that is dealing with something 142 00:07:24,760 --> 00:07:27,080 Speaker 1: that is sensitive to temperature control, the HVAC system is 143 00:07:27,160 --> 00:07:31,440 Speaker 1: very important. So the threat actors obviously have gotten better 144 00:07:31,480 --> 00:07:34,520 Speaker 1: at this. They know that impacting those devices and those 145 00:07:34,560 --> 00:07:39,920 Speaker 1: systems makes a bigger impact operationally. And so HVAC systems 146 00:07:40,000 --> 00:07:42,920 Speaker 1: and IP phone systems and product life cycle devices. 147 00:07:43,440 --> 00:07:44,760 Speaker 3: You lock one of those up 148 00:07:44,720 --> 00:07:48,440 Speaker 1: and manufacturing stops, things stop getting built. 149 00:07:49,120 --> 00:07:52,000 Speaker 2: It's just devastating. And when you think about the kind 150 00:07:52,040 --> 00:07:55,480 Speaker 2: of leverage that an attacker can get when they deploy 151 00:07:55,560 --> 00:08:00,000 Speaker 2: ransomware on these operational devices, it's astonishing. 152 00:08:00,560 --> 00:08:04,560 Speaker 1: Yeah, I mean, the threat actors have gotten better at 153 00:08:05,280 --> 00:08:11,360 Speaker 1: learning how to disrupt our businesses. And OT or ICS 154 00:08:11,360 --> 00:08:15,200 Speaker 1: devices, industrial control devices, they are computers, they are running 155 00:08:15,200 --> 00:08:19,200 Speaker 1: an operating system. It is typically not a normal operating system, 156 00:08:19,360 --> 00:08:22,800 Speaker 1: and so one of the challenges for organizations is how 157 00:08:22,840 --> 00:08:26,600 Speaker 1: do you secure those. And on top of that, those 158 00:08:26,720 --> 00:08:31,400 Speaker 1: devices are often not managed by the IT staff or 159 00:08:31,440 --> 00:08:35,240 Speaker 1: even the organization itself.
Sometimes whoever's making these devices has 160 00:08:35,559 --> 00:08:39,040 Speaker 1: a maintenance contract to manage those devices inside the network. 161 00:08:39,240 --> 00:08:42,080 Speaker 1: So you've got a third party who's responsible for keeping 162 00:08:42,080 --> 00:08:44,640 Speaker 1: that device up to date and secure, et cetera. And 163 00:08:44,640 --> 00:08:46,520 Speaker 1: then you've got an IT staff who's responsible for the 164 00:08:46,520 --> 00:08:51,280 Speaker 1: overall organization. And it makes for an interesting dynamic that 165 00:08:51,320 --> 00:08:54,400 Speaker 1: creates a sort of a paradox for the IT security 166 00:08:54,440 --> 00:08:57,400 Speaker 1: folks in those organizations as far as protecting those devices. 167 00:08:57,440 --> 00:09:00,640 Speaker 1: And they are connected, so that connectivity needs to 168 00:09:00,640 --> 00:09:04,880 Speaker 1: be closely monitored and managed and also be minimalistic, so 169 00:09:04,920 --> 00:09:07,480 Speaker 1: that only the things that need to talk can talk, 170 00:09:07,520 --> 00:09:10,080 Speaker 1: and that is it, right, and keep it very, very tight. 171 00:09:10,360 --> 00:09:13,000 Speaker 2: That's great advice. Thank you so much, Curtis. Now, you're 172 00:09:13,040 --> 00:09:16,200 Speaker 2: constantly reminding business leaders that they don't want to have 173 00:09:16,280 --> 00:09:18,800 Speaker 2: low hanging fruit, that attackers have plenty of old tricks 174 00:09:18,840 --> 00:09:21,120 Speaker 2: that still work. And I know you also do 175 00:09:21,160 --> 00:09:23,960 Speaker 2: reconnaissance on threat actors. So looking to the future, do 176 00:09:24,040 --> 00:09:27,120 Speaker 2: you see a change happening in the way cyber attackers 177 00:09:27,160 --> 00:09:28,760 Speaker 2: are approaching their attacks?
178 00:09:29,000 --> 00:09:31,440 Speaker 1: You know, I think, having done quite a bit of analysis 179 00:09:31,480 --> 00:09:33,640 Speaker 1: on this, and my core company does a lot of 180 00:09:33,679 --> 00:09:37,880 Speaker 1: work around intelligence, I think right now our biggest concern 181 00:09:38,320 --> 00:09:43,360 Speaker 1: is synthetic content. So the phishing campaigns are more effective, 182 00:09:43,480 --> 00:09:46,079 Speaker 1: the landing pages that they send you to to harvest your 183 00:09:46,120 --> 00:09:48,640 Speaker 1: credentials are more real. I'll just give you a quick 184 00:09:48,679 --> 00:09:51,520 Speaker 1: example of one of those. The threat actors will go 185 00:09:51,640 --> 00:09:55,000 Speaker 1: to the management page of your company and they'll pick 186 00:09:55,040 --> 00:09:57,280 Speaker 1: out all the names of your board members, and then 187 00:09:57,320 --> 00:10:00,880 Speaker 1: they will have AI generate a fake email thread between 188 00:10:00,920 --> 00:10:04,439 Speaker 1: those people on a particular topic, and it looks very, 189 00:10:04,559 --> 00:10:05,000 Speaker 1: very real. 190 00:10:05,160 --> 00:10:07,040 Speaker 2: Okay, that's a new one. That's new. I haven't heard 191 00:10:07,080 --> 00:10:07,480 Speaker 2: that before. 192 00:10:07,600 --> 00:10:10,360 Speaker 1: Yeah, you're a mid level finance person and then suddenly 193 00:10:10,400 --> 00:10:13,720 Speaker 1: you're looped in on this email thread by a board 194 00:10:13,760 --> 00:10:16,840 Speaker 1: member and they say, hey, we need you to do this, 195 00:10:17,080 --> 00:10:20,240 Speaker 1: and you scroll back and you look and, oh my gosh, 196 00:10:20,280 --> 00:10:23,720 Speaker 1: it's the board, they need me, you know, I feel important, 197 00:10:23,760 --> 00:10:25,120 Speaker 1: I'm going to do this thing right away, I'm not 198 00:10:25,120 --> 00:10:27,800 Speaker 1: going to ask any questions.
We've seen evidence of that, 199 00:10:27,880 --> 00:10:29,959 Speaker 1: and the AI makes that very easy for the bad 200 00:10:30,000 --> 00:10:31,960 Speaker 1: guys to do, to create the sort of synthetic content 201 00:10:32,000 --> 00:10:34,320 Speaker 1: that looks very, very real to the average person and 202 00:10:34,400 --> 00:10:36,760 Speaker 1: create sort of a social pressure in the email chains 203 00:10:36,800 --> 00:10:38,679 Speaker 1: and things like that. And I say that in lieu 204 00:10:38,760 --> 00:10:42,040 Speaker 1: of, are the bad guys using AI to write custom malware? 205 00:10:42,080 --> 00:10:44,840 Speaker 1: Not yet, we haven't seen any in the wild yet, 206 00:10:44,880 --> 00:10:48,160 Speaker 1: but it is plausible that AI can write, you know, 207 00:10:48,400 --> 00:10:51,800 Speaker 1: polymorphic malware for bad guys. But primarily they're not doing 208 00:10:51,880 --> 00:10:52,800 Speaker 1: that because they don't have to. 209 00:10:52,760 --> 00:10:55,120 Speaker 2: Exactly, it's just totally unnecessary. 210 00:10:55,240 --> 00:10:57,840 Speaker 1: Yeah, they're running a business, and it's just 211 00:10:57,880 --> 00:11:00,280 Speaker 1: easier to trick you into giving up your credentials or wiring money. 212 00:11:00,280 --> 00:11:02,040 Speaker 3: That's easier and cheaper for them. 213 00:11:02,160 --> 00:11:04,600 Speaker 1: Where I do think AI will pose a risk, if 214 00:11:04,679 --> 00:11:08,720 Speaker 1: it hasn't already, is the volumes and volumes and volumes 215 00:11:08,720 --> 00:11:11,440 Speaker 1: of data that have been collected. You know, prior to 216 00:11:11,760 --> 00:11:15,480 Speaker 1: generative AI, finding the needle in the proverbial haystack in 217 00:11:15,520 --> 00:11:20,120 Speaker 1: that data was difficult and time consuming.
So in some 218 00:11:20,160 --> 00:11:22,319 Speaker 1: ways we were sort of protected by the fact that 219 00:11:22,400 --> 00:11:26,680 Speaker 1: they have too much data, right. But now, with AI, they 220 00:11:26,720 --> 00:11:28,319 Speaker 1: can train a model and say, this is 221 00:11:28,360 --> 00:11:30,760 Speaker 1: the kind of information that I'm looking for in this haystack, 222 00:11:31,040 --> 00:11:32,960 Speaker 1: and it will go find it for them in seconds. 223 00:11:33,400 --> 00:11:36,160 Speaker 3: And that is dangerous. Now on the 224 00:11:36,160 --> 00:11:39,240 Speaker 1: flip side, you could say the same on the defense. 225 00:11:39,559 --> 00:11:42,520 Speaker 1: One of the biggest challenges the security teams have is 226 00:11:42,600 --> 00:11:43,360 Speaker 1: log data. 227 00:11:43,480 --> 00:11:46,400 Speaker 3: It's just huge. They're finding a needle 228 00:11:46,400 --> 00:11:47,200 Speaker 3: in a haystack too. 229 00:11:47,840 --> 00:11:50,160 Speaker 1: AI can also help with that, right? AI can help 230 00:11:50,200 --> 00:11:51,920 Speaker 1: them find the bad guys quicker. 231 00:11:52,280 --> 00:11:56,000 Speaker 2: So, thinking about what we know about technology 232 00:11:56,120 --> 00:11:59,000 Speaker 2: and how it's always part of this race between attackers 233 00:11:59,000 --> 00:12:01,760 Speaker 2: and their targets, what do you say to CISOs who 234 00:12:01,760 --> 00:12:04,960 Speaker 2: maybe feel like they're losing this race, especially when it 235 00:12:05,000 --> 00:12:07,960 Speaker 2: comes to AI? Or maybe to put this another way, 236 00:12:08,840 --> 00:12:11,600 Speaker 2: we often know the first steps an attacker will take 237 00:12:11,840 --> 00:12:14,920 Speaker 2: to compromise your business.
What's the first step a cybersecurity 238 00:12:14,960 --> 00:12:17,760 Speaker 2: leader needs to take so their operation can stand up 239 00:12:17,760 --> 00:12:18,280 Speaker 2: to that risk? 240 00:12:18,520 --> 00:12:23,160 Speaker 1: Yeah. So cyber risk, mitigating cyber risk, is a 241 00:12:23,200 --> 00:12:27,000 Speaker 1: top down thing for organizations. I think that it does 242 00:12:27,040 --> 00:12:32,040 Speaker 1: start with culture and education for the greater staff. 243 00:12:32,280 --> 00:12:36,120 Speaker 1: Step one is understanding that, you know, cybersecurity is 244 00:12:36,160 --> 00:12:39,800 Speaker 1: not overhead. It is a fundamental operational part of 245 00:12:39,840 --> 00:12:43,680 Speaker 1: the business. When we start talking about how to mitigate 246 00:12:43,720 --> 00:12:46,520 Speaker 1: these risks, there's this very well known set of cyber 247 00:12:46,600 --> 00:12:49,360 Speaker 1: risk practices that all companies should use. That said, you 248 00:12:49,440 --> 00:12:52,280 Speaker 1: should also assume that that's not always going to work. 249 00:12:52,520 --> 00:12:56,520 Speaker 1: What organizations can do, and manufacturers specifically can do, is 250 00:12:56,679 --> 00:13:01,720 Speaker 1: put in place a response and mitigation strategy that contains 251 00:13:01,800 --> 00:13:03,199 Speaker 1: these things quickly. 252 00:13:07,440 --> 00:13:10,480 Speaker 2: The AI assisted phishing emails that Curtis told me about, 253 00:13:10,720 --> 00:13:14,559 Speaker 2: the warning that attackers will eventually breach your perimeter, these 254 00:13:14,640 --> 00:13:17,080 Speaker 2: reminded me that the first step of so many cyber 255 00:13:17,120 --> 00:13:19,640 Speaker 2: attacks is using your own accounts against you.
256 00:13:20,240 --> 00:13:22,840 Speaker 4: Step one is like, if an employee doesn't have access 257 00:13:22,880 --> 00:13:26,679 Speaker 4: to something, they can't leak it, right, whether intentionally or 258 00:13:26,800 --> 00:13:29,920 Speaker 4: because their account was taken over by an attacker or otherwise, 259 00:13:30,000 --> 00:13:32,800 Speaker 4: so strong access control sort of limits the problem down. 260 00:13:33,080 --> 00:13:35,640 Speaker 2: That's David Adrian, a security product manager for Chrome. 261 00:13:36,480 --> 00:13:39,120 Speaker 2: When I brought up the equipment that attackers can target 262 00:13:39,240 --> 00:13:42,280 Speaker 2: after they gain account access, David took a step back 263 00:13:42,320 --> 00:13:45,200 Speaker 2: and looked at the overall posture. He explained how the 264 00:13:45,240 --> 00:13:48,680 Speaker 2: network connections that make them vulnerable could be transformed into 265 00:13:48,720 --> 00:13:49,640 Speaker 2: points of defense. 266 00:13:50,520 --> 00:13:53,920 Speaker 4: I saw some research recently about, we'll call it industrial 267 00:13:53,920 --> 00:13:58,319 Speaker 4: control systems or ICS systems, these sort of factory floor 268 00:13:58,520 --> 00:14:02,440 Speaker 4: management systems, and it was saying that the core sort 269 00:14:02,480 --> 00:14:05,880 Speaker 4: of ICS protocols, you weren't really seeing them online as 270 00:14:05,920 --> 00:14:08,880 Speaker 4: much anymore, which is good because these protocols don't really 271 00:14:08,880 --> 00:14:12,040 Speaker 4: have any security in them. But they do expose a 272 00:14:12,040 --> 00:14:18,680 Speaker 4: web interface, HTTP configuration pages for this equipment, for managing 273 00:14:18,720 --> 00:14:23,240 Speaker 4: factories or other industrial control systems or other manufacturing processes.
274 00:14:23,520 --> 00:14:28,040 Speaker 4: It's bad if these administration pages are accessible, but it's 275 00:14:28,080 --> 00:14:30,560 Speaker 4: good because it kind of shapes the problem from how 276 00:14:30,600 --> 00:14:34,680 Speaker 4: do I secure this old protocol that wasn't built for security, 277 00:14:34,800 --> 00:14:38,680 Speaker 4: that's confusing, that's used for somewhat niche applications, for like 278 00:14:39,280 --> 00:14:42,600 Speaker 4: managing centrifuges or whatever it is that you're using in 279 00:14:42,600 --> 00:14:45,720 Speaker 4: your manufacturing process. And instead it just boils down to 280 00:14:46,320 --> 00:14:50,080 Speaker 4: limiting access to websites on the front end and then 281 00:14:50,240 --> 00:14:53,400 Speaker 4: sort of strong network segmentation on the back end. And then 282 00:14:53,400 --> 00:14:56,320 Speaker 4: you can build access controls on top of a system 283 00:14:56,320 --> 00:14:58,200 Speaker 4: that was never built for this in the first place, 284 00:14:58,560 --> 00:15:00,680 Speaker 4: right, by just routing all of the traffic and all 285 00:15:00,720 --> 00:15:03,160 Speaker 4: of that access through an enterprise browser. 286 00:15:03,600 --> 00:15:06,600 Speaker 2: I think if we were talking ten years ago, you 287 00:15:06,680 --> 00:15:09,120 Speaker 2: might say you wanted the OT and IT systems to 288 00:15:09,200 --> 00:15:12,520 Speaker 2: be not connected at all, or that you would want 289 00:15:12,560 --> 00:15:15,280 Speaker 2: an OT system never to connect to the Internet. Talk 290 00:15:15,320 --> 00:15:17,760 Speaker 2: to me a little bit about why, with the way 291 00:15:17,760 --> 00:15:20,000 Speaker 2: that we work today, that's not as realistic.
292 00:15:20,520 --> 00:15:24,200 Speaker 4: Yeah, air gapping sounds nice in theory, but in reality, 293 00:15:24,480 --> 00:15:27,600 Speaker 4: systems end up needing to be connected directly to the 294 00:15:27,640 --> 00:15:29,760 Speaker 4: Internet or to some other network that is then connected 295 00:15:29,800 --> 00:15:32,000 Speaker 4: to the Internet, and so it makes way more sense 296 00:15:32,040 --> 00:15:35,120 Speaker 4: to adopt these sort of zero trust approaches where each 297 00:15:35,200 --> 00:15:39,000 Speaker 4: device is behind its own sort of authentication proxy, and 298 00:15:39,000 --> 00:15:42,000 Speaker 4: then you access the configuration pages through the web browser, 299 00:15:42,000 --> 00:15:44,640 Speaker 4: through the enterprise browser, and you leverage everything that's built 300 00:15:44,680 --> 00:15:47,320 Speaker 4: into the enterprise browser. And then you can do that 301 00:15:47,840 --> 00:15:51,120 Speaker 4: without any of these devices actually needing to be updated 302 00:15:51,160 --> 00:15:54,760 Speaker 4: to understand all of these sort of modern authentication and 303 00:15:54,840 --> 00:15:56,400 Speaker 4: device authentication protocols. 304 00:15:56,720 --> 00:15:59,200 Speaker 2: That's the point that I think is really important, because 305 00:15:59,720 --> 00:16:03,200 Speaker 2: in many conversations about OT, the refrain is that you can't keep 306 00:16:03,320 --> 00:16:06,240 Speaker 2: updating all of these different operating systems all of the time, 307 00:16:06,280 --> 00:16:07,920 Speaker 2: and you know it's just never going to get better. 308 00:16:07,960 --> 00:16:11,280 Speaker 2: But then another layer of security on top is what's helpful.
309 00:16:11,480 --> 00:16:15,240 Speaker 4: Absolutely. Or alternatively, if you somehow made a mistake and 310 00:16:15,320 --> 00:16:17,960 Speaker 4: there is a way to access sort of the configuration 311 00:16:18,040 --> 00:16:20,280 Speaker 4: or the management of some OT device that doesn't go 312 00:16:20,320 --> 00:16:22,440 Speaker 4: through the browser, then hopefully that's a lot more obvious 313 00:16:22,480 --> 00:16:25,720 Speaker 4: and a sign of immediate concern, because commands are 314 00:16:25,760 --> 00:16:29,640 Speaker 4: getting sent or configuration is being pushed to some device 315 00:16:29,680 --> 00:16:33,440 Speaker 4: on the manufacturing floor and it isn't corresponding with some sort 316 00:16:33,480 --> 00:16:36,440 Speaker 4: of known employee login. Like, this is a red flag. 317 00:16:36,400 --> 00:16:38,080 Speaker 2: It's an instantaneous red flag too. 318 00:16:38,280 --> 00:16:41,720 Speaker 4: Absolutely. So one thing you get from Chrome Enterprise is 319 00:16:41,920 --> 00:16:44,920 Speaker 4: sort of real time reporting and analytics of what all 320 00:16:44,960 --> 00:16:47,520 Speaker 4: of your users are doing. And if you have strong 321 00:16:47,560 --> 00:16:51,240 Speaker 4: authentication of all of your users, you know they're your employees. 322 00:16:51,360 --> 00:16:54,400 Speaker 4: Then if you have, you know, corresponding visibility on the, 323 00:16:54,440 --> 00:16:57,880 Speaker 4: say, manufacturing floor that isn't aligned with what 324 00:16:57,920 --> 00:17:00,840 Speaker 4: you're seeing out of the Chrome browser, then you know, well, 325 00:17:00,880 --> 00:17:04,320 Speaker 4: something is wrong. Something is accessing something on the manufacturing 326 00:17:04,320 --> 00:17:06,960 Speaker 4: floor and is not going through one of my managed browsers, 327 00:17:07,000 --> 00:17:08,439 Speaker 4: and that's an immediate red flag.
328 00:17:08,960 --> 00:17:13,080 Speaker 2: So David, just looking forward as technology improves, we've seen 329 00:17:13,119 --> 00:17:16,600 Speaker 2: a lot of new approaches by attackers using that technology 330 00:17:16,600 --> 00:17:20,600 Speaker 2: and making it more sophisticated, so particularly attackers using AI 331 00:17:20,720 --> 00:17:23,800 Speaker 2: to their advantage. One example, which I had never heard 332 00:17:23,800 --> 00:17:27,600 Speaker 2: of before, was an attacker using generative AI to create a 333 00:17:27,680 --> 00:17:31,439 Speaker 2: very realistic email chain that included basically spoofs of the 334 00:17:31,480 --> 00:17:34,760 Speaker 2: target's bosses and even board members, and then after that 335 00:17:34,760 --> 00:17:37,280 Speaker 2: they looped the target into the email. 336 00:17:37,960 --> 00:17:40,120 Speaker 4: In this type of situation, with this sort of AI 337 00:17:40,200 --> 00:17:42,960 Speaker 4: phishing email, it sounds more like they're trying to trick 338 00:17:43,000 --> 00:17:45,199 Speaker 4: the user into going to a legitimate site and doing 339 00:17:45,280 --> 00:17:47,399 Speaker 4: the wrong thing. And I think the best way to 340 00:17:47,440 --> 00:17:50,080 Speaker 4: defend against that is to make sure that your organization 341 00:17:50,640 --> 00:17:55,200 Speaker 4: has processes in place for doing things that are sensitive. 342 00:17:55,400 --> 00:17:57,639 Speaker 4: And then once you have those sort of processes in place, 343 00:17:57,720 --> 00:18:00,600 Speaker 4: each of these sort of steps in your workflow that gets pushed 344 00:18:01,000 --> 00:18:03,119 Speaker 4: to some sort of application in the browser is then 345 00:18:03,160 --> 00:18:06,919 Speaker 4: another opportunity to have someone else verify that yes, this 346 00:18:07,000 --> 00:18:10,800 Speaker 4: is actually the business process we expected. 
And so as 347 00:18:10,840 --> 00:18:14,320 Speaker 4: you start to route these business processes through web apps 348 00:18:14,440 --> 00:18:17,800 Speaker 4: through the browser, then every single step in the process 349 00:18:17,800 --> 00:18:19,840 Speaker 4: where you do that is a step where you can 350 00:18:20,160 --> 00:18:22,360 Speaker 4: secure it, in the sense that you can make sure 351 00:18:22,400 --> 00:18:25,440 Speaker 4: that the people participating in it are actually your employees 352 00:18:25,480 --> 00:18:29,000 Speaker 4: and give more people an opportunity to identify when something 353 00:18:29,080 --> 00:18:29,720 Speaker 4: is going wrong. 354 00:18:30,119 --> 00:18:32,040 Speaker 2: This is a really cool way of looking at it too, 355 00:18:32,080 --> 00:18:34,880 Speaker 2: I think, from a security person's point of view, where 356 00:18:35,320 --> 00:18:38,640 Speaker 2: you have this visibility now that we didn't have before. 357 00:18:39,119 --> 00:18:42,040 Speaker 2: You can see each step of a compromise or each 358 00:18:42,040 --> 00:18:45,800 Speaker 2: step of an attempted breach. Now you can also see 359 00:18:45,880 --> 00:18:49,600 Speaker 2: each step of the pre breach, the pre boom scenario, 360 00:18:50,080 --> 00:18:52,760 Speaker 2: in a way that's really systematic. That's actually really exciting. 361 00:18:52,880 --> 00:18:55,880 Speaker 4: Yeah, in the modern web based workplace that we've all 362 00:18:55,920 --> 00:18:59,280 Speaker 4: become accustomed to, there's a ton of opportunities to solve 363 00:18:59,400 --> 00:19:03,560 Speaker 4: enterprise security problems that have plagued companies for years. Using 364 00:19:03,560 --> 00:19:06,200 Speaker 4: a managed browser like Chrome Enterprise can be a critical 365 00:19:06,200 --> 00:19:09,600 Speaker 4: component of these solutions. 
But I think we're really understanding 366 00:19:09,600 --> 00:19:12,520 Speaker 4: that there's a leadership aspect to cybersecurity that's absolutely critical 367 00:19:12,560 --> 00:19:15,399 Speaker 4: as well. So I hope that we've been able to 368 00:19:15,400 --> 00:19:18,320 Speaker 4: help leaders understand the direction that cybersecurity is headed in 369 00:19:18,840 --> 00:19:21,760 Speaker 4: and demonstrate how much companies can benefit from setting up 370 00:19:21,760 --> 00:19:24,040 Speaker 4: their teams with protections that take into account the way 371 00:19:24,040 --> 00:19:25,400 Speaker 4: that we all work on the web. 372 00:19:28,119 --> 00:19:30,920 Speaker 2: To learn more about how the most trusted enterprise browser 373 00:19:31,000 --> 00:19:35,400 Speaker 2: can help protect your organization, visit Chrome Enterprise dot Google. 374 00:19:36,680 --> 00:19:37,160 Speaker 3: Security. 375 00:19:37,160 --> 00:19:40,920 Speaker 2: Bookmark is a podcast from Bloomberg Media Studios and Chrome Enterprise. 376 00:19:41,280 --> 00:19:44,280 Speaker 2: Check out our other episodes about cybersecurity in finance and 377 00:19:44,320 --> 00:19:49,160 Speaker 2: gaming in your podcast app. I'm Kate Fazzini. Thanks for listening.