1 00:00:02,240 --> 00:00:04,680 Speaker 1: We can go back to a quote from the Depression 2 00:00:04,680 --> 00:00:08,319 Speaker 1: era bank robber Willie Sutton. He had this infamous quote 3 00:00:08,360 --> 00:00:10,600 Speaker 1: that said, like, I rob banks because that's where the 4 00:00:10,600 --> 00:00:11,080 Speaker 1: money is. 5 00:00:11,920 --> 00:00:15,080 Speaker 2: Old fashioned bank heists aren't so common today, but modern 6 00:00:15,080 --> 00:00:18,919 Speaker 2: financial institutions protect more than just money, and finance is 7 00:00:19,000 --> 00:00:22,040 Speaker 2: consistently in the top three most targeted industries when it 8 00:00:22,040 --> 00:00:23,120 Speaker 2: comes to cyber attacks. 9 00:00:23,520 --> 00:00:27,560 Speaker 1: There's accounts, but there's also a lot of strategic information 10 00:00:27,760 --> 00:00:32,640 Speaker 1: with regards to transactions and the like, and that's what 11 00:00:32,800 --> 00:00:36,560 Speaker 1: continues to make financial institutions a target for this. 12 00:00:37,120 --> 00:00:38,320 Speaker 2: That's JF Legault. 13 00:00:38,720 --> 00:00:42,320 Speaker 1: I'm the Chief Information Security Officer at JP Morgan Chase. 14 00:00:42,920 --> 00:00:45,959 Speaker 2: As a leader of cybersecurity operations for the bank and 15 00:00:45,960 --> 00:00:49,680 Speaker 2: its clients, JF thinks constantly about every opportunity that an 16 00:00:49,720 --> 00:00:54,320 Speaker 2: attacker could exploit, from software bugs to natural disasters. 17 00:00:53,960 --> 00:00:57,520 Speaker 1: Whether the scenario will be a technology outage, or whether 18 00:00:57,600 --> 00:01:00,520 Speaker 1: a threat actor could use that as a lure. 19 00:01:00,960 --> 00:01:05,440 Speaker 1: We've actually seen, you know, like fake donation sites.
When 20 00:01:05,480 --> 00:01:08,640 Speaker 1: there's a natural disaster right where people are looking to 21 00:01:08,800 --> 00:01:13,280 Speaker 1: donate to earthquake relief or hurricane relief, the 22 00:01:13,200 --> 00:01:16,000 Speaker 2: bad guys are there, and by setting up fake disaster 23 00:01:16,080 --> 00:01:19,399 Speaker 2: relief websites, the bad guys can harvest any credentials that 24 00:01:19,440 --> 00:01:22,800 Speaker 2: come with those well meaning donations. This is just one 25 00:01:22,800 --> 00:01:25,960 Speaker 2: scenario in a bigger trend that JF is seeing where cyber 26 00:01:26,000 --> 00:01:28,479 Speaker 2: attackers set traps to compromise team accounts. 27 00:01:28,880 --> 00:01:32,800 Speaker 1: We're seeing more and more threat actors using, you know, 28 00:01:32,880 --> 00:01:38,480 Speaker 1: search engine optimization to present fake websites. When somebody's doing 29 00:01:38,760 --> 00:01:42,080 Speaker 1: an online search, the website will come up at the 30 00:01:42,080 --> 00:01:44,640 Speaker 1: top versus the legitimate one they're looking for, and then 31 00:01:44,680 --> 00:01:48,559 Speaker 1: they get the ability to deliver malicious software. So that's 32 00:01:48,640 --> 00:01:52,440 Speaker 1: like a really interesting trend that people should think about. 33 00:01:52,880 --> 00:01:55,680 Speaker 1: You know, we used to train people to look for 34 00:01:55,720 --> 00:02:00,720 Speaker 1: phishing based on like grammar and urgency and things like that. 35 00:02:00,080 --> 00:02:01,560 Speaker 1: That's changing. 36 00:02:02,480 --> 00:02:05,640 Speaker 2: Phishing and browser based attacks are evolving to catch us 37 00:02:05,640 --> 00:02:09,360 Speaker 2: where we spend our money, our attention, and our working hours, 38 00:02:09,600 --> 00:02:12,920 Speaker 2: and as work itself happens more consistently in web browsers.
39 00:02:13,160 --> 00:02:16,320 Speaker 2: JF sees the role of a cybersecurity leader evolving too. 40 00:02:16,760 --> 00:02:18,760 Speaker 1: I've been doing this for like twenty five years now. 41 00:02:19,200 --> 00:02:22,480 Speaker 1: That overall evolution, we used to call it computer security. 42 00:02:22,560 --> 00:02:26,079 Speaker 1: Network security was very infrastructure focused, and then there was 43 00:02:26,120 --> 00:02:31,400 Speaker 1: an evolution to information security. You know, when I look 44 00:02:31,400 --> 00:02:34,120 Speaker 1: at the role today, a lot of it and most 45 00:02:34,120 --> 00:02:36,200 Speaker 1: of it is really how do you secure a business? 46 00:02:36,560 --> 00:02:41,720 Speaker 1: And I think that's where strong cybersecurity leaders are evolving towards, 47 00:02:41,840 --> 00:02:43,800 Speaker 1: is like how do you interface with your business? How 48 00:02:43,840 --> 00:02:48,280 Speaker 1: do you understand the practices? There's an evolution in a 49 00:02:48,400 --> 00:02:53,000 Speaker 1: variety of technologies that help bad guys as well. You also 50 00:02:53,160 --> 00:02:57,880 Speaker 1: need to adapt based on the evolution of just the world. 51 00:03:02,680 --> 00:03:07,560 Speaker 2: From Bloomberg Media Studios and Chrome Enterprise, this is Security Bookmarked. 52 00:03:11,400 --> 00:03:14,840 Speaker 2: I'm your host, Kate Fazzini. I've been a cybersecurity professional 53 00:03:14,880 --> 00:03:17,720 Speaker 2: and journalist for over twenty years, and on this podcast, 54 00:03:17,800 --> 00:03:22,040 Speaker 2: I'm talking with leaders in gaming, finance, and manufacturing about 55 00:03:22,080 --> 00:03:24,960 Speaker 2: what security looks like in a workplace that's moved to 56 00:03:25,000 --> 00:03:28,640 Speaker 2: the cloud. Much of what we think of as cybersecurity 57 00:03:28,760 --> 00:03:32,680 Speaker 2: was pioneered in financial services.
In fact, a bank created 58 00:03:32,720 --> 00:03:35,520 Speaker 2: the first CISO role, and banks invented many of the 59 00:03:35,560 --> 00:03:38,320 Speaker 2: guidelines that are now standard across a range of industries. 60 00:03:39,440 --> 00:03:42,680 Speaker 2: According to the IMF, around twenty percent of all reported 61 00:03:42,720 --> 00:03:45,720 Speaker 2: cyber incidents in the past twenty years have affected the 62 00:03:45,760 --> 00:03:49,640 Speaker 2: global financial sector. So today I'm speaking with JF about 63 00:03:49,640 --> 00:03:52,640 Speaker 2: what he's learned as a leader of cybersecurity in finance. 64 00:03:53,080 --> 00:03:58,000 Speaker 1: Really, my role is twofold. One is to represent cybersecurity 65 00:03:58,200 --> 00:04:02,120 Speaker 1: in the lines of business, but it's also to hear 66 00:04:02,600 --> 00:04:07,000 Speaker 1: where they're heading from a business strategy standpoint. 67 00:04:06,520 --> 00:04:08,600 Speaker 2: And I'll find out why he's flipping the script on 68 00:04:08,720 --> 00:04:13,120 Speaker 2: enterprise security from simply defending the perimeter to transforming whole 69 00:04:13,160 --> 00:04:17,040 Speaker 2: teams into early detection networks. Then I'll chat with David Adrian, 70 00:04:17,240 --> 00:04:20,799 Speaker 2: security product manager for Chrome, about how businesses can implement 71 00:04:20,839 --> 00:04:23,440 Speaker 2: this kind of strategy and set up a strong monitoring 72 00:04:23,480 --> 00:04:30,880 Speaker 2: system to protect their teams. Going back to the trend 73 00:04:30,880 --> 00:04:34,880 Speaker 2: of cyber attackers using fake websites as phishing lures, JF 74 00:04:34,880 --> 00:04:37,000 Speaker 2: talked me through each step of their attack path. 75 00:04:37,480 --> 00:04:39,440 Speaker 1: A lot of it starts with the endpoint. It starts 76 00:04:39,560 --> 00:04:45,680 Speaker 1: via email or web browsing.
Credential theft continues to be 77 00:04:46,240 --> 00:04:50,040 Speaker 1: a driver of this, and phishing. Phishing from two standpoints: 78 00:04:50,080 --> 00:04:54,240 Speaker 1: either the credential theft that I mentioned, or the delivery 79 00:04:54,279 --> 00:04:58,440 Speaker 1: of malware via those channels, is normally step one. What 80 00:04:58,520 --> 00:05:02,000 Speaker 1: we continue to see in terms of exploitation is things 81 00:05:02,080 --> 00:05:07,560 Speaker 1: like, you know, not having multi factor authentication on remote access, 82 00:05:07,560 --> 00:05:11,200 Speaker 1: on remote log in, or an element of, like, push fatigue. 83 00:05:11,240 --> 00:05:15,240 Speaker 1: There are multi factor authentication solutions that send a pop 84 00:05:15,360 --> 00:05:18,120 Speaker 1: up and then people just end up hitting the yes 85 00:05:18,320 --> 00:05:21,240 Speaker 1: button somehow because they're just tired of seeing it. 86 00:05:21,480 --> 00:05:24,440 Speaker 2: But tricking someone into signing into a website is just 87 00:05:24,520 --> 00:05:25,680 Speaker 2: the first step, and 88 00:05:25,680 --> 00:05:29,400 Speaker 1: I think what's important for organizations is that there's multiple 89 00:05:29,480 --> 00:05:31,960 Speaker 1: steps that are carried out by an actor. I think 90 00:05:32,120 --> 00:05:37,599 Speaker 1: understanding these attack paths of how actors operate and carry 91 00:05:37,640 --> 00:05:44,440 Speaker 1: out their activity is hugely important because the more you understand, 92 00:05:44,480 --> 00:05:47,840 Speaker 1: the more you can design layered controls. So what if 93 00:05:47,880 --> 00:05:52,160 Speaker 1: an actor is able to obtain credentials? Well, those credentials, 94 00:05:52,240 --> 00:05:55,960 Speaker 1: if you've got multi factor, they won't work, right.
They 95 00:05:56,000 --> 00:05:57,640 Speaker 1: might get them part of the way, but they won't 96 00:05:57,680 --> 00:06:01,800 Speaker 1: get them logged in. Let's say they're able to get 97 00:06:01,839 --> 00:06:04,599 Speaker 1: logged in. Well, actors are going to start carrying out 98 00:06:04,640 --> 00:06:07,760 Speaker 1: some element of reconnaissance on the network. So how would 99 00:06:07,800 --> 00:06:10,800 Speaker 1: you detect that reconnaissance, or how would you detect them 100 00:06:11,200 --> 00:06:14,600 Speaker 1: setting up a foothold on the network? So it's really 101 00:06:14,640 --> 00:06:20,200 Speaker 1: about as early detection as possible and understanding those early 102 00:06:20,360 --> 00:06:23,719 Speaker 1: indicators of an adversary being present on the network. 103 00:06:24,720 --> 00:06:26,560 Speaker 2: One of the biggest threats that JF and I talked 104 00:06:26,560 --> 00:06:30,479 Speaker 2: about was an ongoing rise in ransomware attacks, where attackers 105 00:06:30,480 --> 00:06:34,880 Speaker 2: don't go directly after a bank's money or even its data. Instead, 106 00:06:35,040 --> 00:06:38,320 Speaker 2: they try to paralyze the bank itself, which can have 107 00:06:38,440 --> 00:06:41,039 Speaker 2: serious consequences for the greater business world. 108 00:06:41,520 --> 00:06:47,520 Speaker 1: The financial services ecosystem interfaces with utilities, infrastructure, all of 109 00:06:47,560 --> 00:06:51,480 Speaker 1: the clearing and settlement, payment providers, the third parties that 110 00:06:51,560 --> 00:06:53,200 Speaker 1: we rely on day to day. 111 00:06:53,480 --> 00:06:58,200 Speaker 2: And protecting that entire ecosystem at a global scale, that's daunting. 112 00:06:58,880 --> 00:07:01,120 Speaker 2: So I asked JF how to secure a high 113 00:07:01,120 --> 00:07:03,520 Speaker 2: stakes perimeter that goes way beyond the bank 114 00:07:03,600 --> 00:07:03,840 Speaker 2: vault.
115 00:07:04,360 --> 00:07:07,560 Speaker 1: What's made this so interesting for bad guys is when 116 00:07:07,600 --> 00:07:11,840 Speaker 1: you look at organizations that have historically stored sensitive information 117 00:07:11,920 --> 00:07:16,320 Speaker 1: or processed sensitive information, they have been highly regulated, they've 118 00:07:16,320 --> 00:07:19,000 Speaker 1: had a lot of focus in terms of building up 119 00:07:19,040 --> 00:07:25,440 Speaker 1: security controls. But by focusing on the disruption, the availability 120 00:07:25,480 --> 00:07:30,280 Speaker 1: aspect, right, like, ransomware operators are now able to target 121 00:07:30,440 --> 00:07:35,240 Speaker 1: a variety of organizations that don't store transactional information, that 122 00:07:35,360 --> 00:07:40,560 Speaker 1: don't store personally identifiable information, and that causes broader disruption, 123 00:07:40,720 --> 00:07:43,680 Speaker 1: and I think that's why we take our role incredibly 124 00:07:43,720 --> 00:07:47,280 Speaker 1: seriously in securing the broader financial ecosystem. 125 00:07:47,440 --> 00:07:49,640 Speaker 2: That is a great answer because I think to the 126 00:07:49,680 --> 00:07:53,360 Speaker 2: consumer or the banker who needs availability, it kind of 127 00:07:53,360 --> 00:07:57,200 Speaker 2: doesn't matter if it's down because of ransomware or a hurricane. 128 00:07:57,400 --> 00:07:59,000 Speaker 2: It's just, wait, is it coming back up? And what 129 00:07:59,480 --> 00:07:59,960 Speaker 2: is the alternative? 130 00:08:00,480 --> 00:08:03,400 Speaker 1: Yeah, and I still remember back in my early days, 131 00:08:04,000 --> 00:08:06,560 Speaker 1: we had one vendor that had a data center in 132 00:08:06,720 --> 00:08:09,720 Speaker 1: Florida and another one in California.
So you basically have 133 00:08:09,800 --> 00:08:13,080 Speaker 1: a data center in hurricane territory and you have another 134 00:08:13,120 --> 00:08:17,360 Speaker 1: one in earthquake territory. And you might go, like, why 135 00:08:17,440 --> 00:08:19,720 Speaker 1: is this part of your role, to think, like, site 136 00:08:19,760 --> 00:08:23,960 Speaker 1: resiliency strategy with clients? Well, our clients operate in a 137 00:08:24,000 --> 00:08:26,440 Speaker 1: bunch of different industries, and if they can't move money 138 00:08:27,040 --> 00:08:29,560 Speaker 1: because people can't go into the office and they can't 139 00:08:29,640 --> 00:08:33,360 Speaker 1: work from home, that has a direct impact on their 140 00:08:33,440 --> 00:08:36,319 Speaker 1: day to day operations. And 141 00:08:36,360 --> 00:08:39,360 Speaker 1: I think that's why ransomware has had such an impact, 142 00:08:39,440 --> 00:08:45,520 Speaker 1: because it attacks confidentiality, integrity, and availability, so actually 143 00:08:45,640 --> 00:08:50,160 Speaker 1: all three elements of the CIA triad, and that causes broader 144 00:08:50,200 --> 00:08:54,880 Speaker 1: disruption, and I think that also gains more focus because 145 00:08:55,040 --> 00:08:58,320 Speaker 1: organizations are actually stricken as a result of these attacks. 146 00:08:58,520 --> 00:09:01,920 Speaker 2: You know, businesses are always online now, especially after COVID, 147 00:09:02,040 --> 00:09:04,720 Speaker 2: lots of people working remotely having to be on at 148 00:09:04,720 --> 00:09:07,600 Speaker 2: all times. Customers expect you to be available at all times. 149 00:09:08,040 --> 00:09:12,520 Speaker 2: Another source of constant surprises, I imagine, is the third parties 150 00:09:12,600 --> 00:09:15,760 Speaker 2: that you have to work with, in the hundreds and thousands, 151 00:09:15,840 --> 00:09:20,000 Speaker 2: maybe hundreds of thousands.
So how do you manage resilience 152 00:09:20,080 --> 00:09:23,600 Speaker 2: when there are all of these other factors in the 153 00:09:23,640 --> 00:09:26,520 Speaker 2: form of vendors and other companies that you're hinging your 154 00:09:26,520 --> 00:09:30,040 Speaker 2: operations on? How do you deal with that in terms 155 00:09:30,040 --> 00:09:30,679 Speaker 2: of resilience? 156 00:09:31,640 --> 00:09:35,120 Speaker 1: You know, you mentioned the pandemic. The pandemic was a 157 00:09:35,200 --> 00:09:39,800 Speaker 1: vector for adversaries. Everybody was after information for the pandemic, right, 158 00:09:39,840 --> 00:09:43,800 Speaker 1: so it became a very interesting lure for bad guys 159 00:09:43,800 --> 00:09:46,800 Speaker 1: to send, like, phishing emails, set up fake websites. So 160 00:09:46,840 --> 00:09:49,760 Speaker 1: it became like a lure for social engineering. And then 161 00:09:49,920 --> 00:09:53,560 Speaker 1: companies shifted very very quickly to work from home, and 162 00:09:53,640 --> 00:09:58,400 Speaker 1: by doing so, they may have exposed infrastructure that may 163 00:09:58,440 --> 00:10:02,040 Speaker 1: not have been as secure as it should be to be 164 00:10:02,160 --> 00:10:06,360 Speaker 1: exposed to the Internet, and that gave threat actors a 165 00:10:06,400 --> 00:10:11,880 Speaker 1: path into some organizations, but it also affected business practices. 166 00:10:12,640 --> 00:10:15,760 Speaker 1: There were organizations that were ready for it, that had 167 00:10:15,760 --> 00:10:19,960 Speaker 1: been working their resiliency plans for years for pandemics. The 168 00:10:20,040 --> 00:10:24,160 Speaker 1: financial services sector is one of those areas where it's 169 00:10:24,200 --> 00:10:28,320 Speaker 1: basically part of our DNA to build out strong resiliency 170 00:10:28,320 --> 00:10:32,720 Speaker 1: and recovery mechanisms.
And our role is to work with 171 00:10:32,800 --> 00:10:35,560 Speaker 1: our business to rethink some of the controls and get 172 00:10:35,760 --> 00:10:39,439 Speaker 1: the message out, the awareness message out to our clients. 173 00:10:40,080 --> 00:10:43,520 Speaker 1: And it gets really interesting when you start to break 174 00:10:43,600 --> 00:10:49,200 Speaker 1: down resiliency and recovery for organizations as a result of 175 00:10:49,960 --> 00:10:52,320 Speaker 1: things like a ransomware event. 176 00:10:52,920 --> 00:10:56,080 Speaker 2: Then I am also thinking of vulnerability management, which we 177 00:10:56,200 --> 00:10:59,400 Speaker 2: kind of never talk about; it's not very fun to talk about. 178 00:11:00,080 --> 00:11:04,120 Speaker 1: I think vulnerability management is foundational to everything, right. 179 00:11:04,280 --> 00:11:06,960 Speaker 2: The patching, the kind of day to day. You know, 180 00:11:07,000 --> 00:11:09,720 Speaker 2: there's a lot of talk about alert fatigue, but you 181 00:11:09,800 --> 00:11:11,840 Speaker 2: have people who need to access the web, who are 182 00:11:11,880 --> 00:11:14,440 Speaker 2: on their browsers from wherever they are all the time. 183 00:11:14,520 --> 00:11:18,080 Speaker 2: How do you deal with web browser security? What are 184 00:11:18,120 --> 00:11:21,160 Speaker 2: sort of the best practices today versus what they were 185 00:11:21,200 --> 00:11:22,000 Speaker 2: when you first started? 186 00:11:22,760 --> 00:11:25,240 Speaker 1: That's a great question. I get the point around alert 187 00:11:25,280 --> 00:11:29,640 Speaker 1: fatigue and volumes. But it's really about thinking through the 188 00:11:29,800 --> 00:11:33,120 Speaker 1: entire life cycle of that attack. So going back to, like, 189 00:11:33,160 --> 00:11:37,240 Speaker 1: how do you drive awareness for employees not to click 190 00:11:37,280 --> 00:11:41,559 Speaker 1: on links.
If they do click, how are you filtering 191 00:11:42,400 --> 00:11:45,920 Speaker 1: the sites that they're going to that could be malicious? 192 00:11:46,320 --> 00:11:53,840 Speaker 1: Interestingly enough, most systems that assist in, like, categorization of 193 00:11:53,920 --> 00:12:01,400 Speaker 1: websites have a functionality that blocks uncategorized websites, meaning websites 194 00:12:01,440 --> 00:12:05,760 Speaker 1: that are too new to have a category associated with them, 195 00:12:06,080 --> 00:12:09,600 Speaker 1: and oftentimes these are the ones that the threat actors 196 00:12:09,600 --> 00:12:13,200 Speaker 1: have just recently set up to look like a legitimate 197 00:12:13,240 --> 00:12:17,680 Speaker 1: website that, you know, somebody will click on, and you 198 00:12:17,760 --> 00:12:23,440 Speaker 1: can actually see a significant reduction of that browsing risk 199 00:12:23,520 --> 00:12:27,320 Speaker 1: if you're eliminating websites that are too new, that have 200 00:12:27,520 --> 00:12:31,079 Speaker 1: just been stood up, that have, like, a certificate mismatch 201 00:12:31,200 --> 00:12:32,000 Speaker 1: and things like that. 202 00:12:32,440 --> 00:12:35,640 Speaker 2: When you think about enterprise security in finance, and especially 203 00:12:35,679 --> 00:12:38,520 Speaker 2: about protecting teams, what are the most critical threats that 204 00:12:38,520 --> 00:12:39,520 Speaker 2: you're watching out for? 205 00:12:40,080 --> 00:12:43,320 Speaker 1: I think there's two aspects to this. We often talk 206 00:12:43,400 --> 00:12:47,079 Speaker 1: about how do we protect the workforce, but it's also 207 00:12:47,200 --> 00:12:50,439 Speaker 1: like how do we use our workforce as the first 208 00:12:50,640 --> 00:12:55,520 Speaker 1: indicator of an attack or of targeting?
So you know, 209 00:12:55,720 --> 00:12:58,640 Speaker 1: one of the things that's like hugely important is how 210 00:12:58,679 --> 00:13:03,439 Speaker 1: do you mine the reports that you're getting from end 211 00:13:03,559 --> 00:13:09,520 Speaker 1: users around cyber issues or targeting. We test our employees 212 00:13:09,600 --> 00:13:13,679 Speaker 1: for phishing on a quarterly basis. The first thing we 213 00:13:13,679 --> 00:13:17,319 Speaker 1: were doing was we were measuring click rates, and then 214 00:13:17,520 --> 00:13:20,920 Speaker 1: we thought to ourselves, well, let's start measuring the reporting 215 00:13:21,040 --> 00:13:25,120 Speaker 1: rate, because what we want to know is, if somebody 216 00:13:25,240 --> 00:13:27,400 Speaker 1: is going to get this, are they going to forward 217 00:13:27,440 --> 00:13:30,600 Speaker 1: it to us? But then it was also measuring the 218 00:13:30,640 --> 00:13:37,520 Speaker 1: forward rate, meaning people's reaction often with a phishing email 219 00:13:37,600 --> 00:13:39,600 Speaker 1: is they send it to their colleagues and they go, 220 00:13:39,720 --> 00:13:44,760 Speaker 1: is this legit? So they're actually amplifying the adversary's reach 221 00:13:44,960 --> 00:13:47,520 Speaker 1: by forwarding it to a bunch of people who may 222 00:13:47,600 --> 00:13:51,080 Speaker 1: click on it who would have never gotten it. So 223 00:13:51,280 --> 00:13:56,040 Speaker 1: it's really how do you think through the awareness for 224 00:13:56,160 --> 00:14:00,000 Speaker 1: people with the most common types of attacks. But also, 225 00:14:00,000 --> 00:14:04,480 Speaker 1: how do you turn your entire workforce into early 226 00:14:04,600 --> 00:14:08,960 Speaker 1: detection sensors, where they're reporting what they're seeing to the 227 00:14:09,000 --> 00:14:14,280 Speaker 1: cybersecurity organization so they can promptly take action on it?
228 00:14:14,800 --> 00:14:18,360 Speaker 1: And that is a game changer in the early stages 229 00:14:18,400 --> 00:14:21,800 Speaker 1: of an attack, because people will notice, hey, there's something 230 00:14:21,880 --> 00:14:26,800 Speaker 1: wrong here. I have never seen this happen before. It 231 00:14:26,880 --> 00:14:30,880 Speaker 1: might be a glitch, but it also might be a 232 00:14:30,920 --> 00:14:35,520 Speaker 1: bad guy, a threat actor that's doing something that's absolutely 233 00:14:36,040 --> 00:14:41,480 Speaker 1: unexpected that just revealed their presence on the network. Organizations 234 00:14:41,520 --> 00:14:46,920 Speaker 1: need to be ready and continuously adapting to the threat landscape. 235 00:14:49,080 --> 00:14:52,400 Speaker 2: JF's strategy called out the importance of monitoring for potential 236 00:14:52,440 --> 00:14:55,840 Speaker 2: threats and risky activities, but when monitoring means catching a 237 00:14:55,840 --> 00:14:59,560 Speaker 2: fake disaster relief website, leaders need to recognize how opening 238 00:14:59,560 --> 00:15:02,160 Speaker 2: a browser for work shapes people's behavior. 239 00:15:02,600 --> 00:15:05,840 Speaker 3: Security certainly isn't top of mind for most users. Most 240 00:15:05,840 --> 00:15:08,080 Speaker 3: of the time, they're trying to get their work done, 241 00:15:08,160 --> 00:15:10,680 Speaker 3: and they're probably also trying to get their life done. 242 00:15:10,880 --> 00:15:13,840 Speaker 2: That's David Adrian, security product manager for Chrome.
243 00:15:14,400 --> 00:15:17,120 Speaker 3: For most people, browsing the Internet may not seem like 244 00:15:17,160 --> 00:15:19,840 Speaker 3: a big deal. But if you're an administrator for a 245 00:15:19,880 --> 00:15:22,960 Speaker 3: bank or other organizations that have a lot of customer data, 246 00:15:23,080 --> 00:15:25,400 Speaker 3: then keeping your employees safe on the web should be 247 00:15:25,440 --> 00:15:26,480 Speaker 3: even more top of mind. 248 00:15:26,880 --> 00:15:29,360 Speaker 2: He told me how he would approach protecting teams from 249 00:15:29,440 --> 00:15:32,000 Speaker 2: cyber attacks that take advantage of search. 250 00:15:32,280 --> 00:15:36,840 Speaker 3: Chrome runs a feature called Safe Browsing, which attempts to 251 00:15:36,880 --> 00:15:39,840 Speaker 3: warn on sites that are known to be phishing, sites 252 00:15:39,880 --> 00:15:42,720 Speaker 3: known to be malware, and it doesn't reveal what sites 253 00:15:42,720 --> 00:15:45,560 Speaker 3: you're visiting. You can opt into a version of 254 00:15:45,600 --> 00:15:47,680 Speaker 3: it called Enhanced Safe Browsing, which is able to do 255 00:15:47,720 --> 00:15:50,400 Speaker 3: the checks in real time by sending them back to 256 00:15:50,440 --> 00:15:53,200 Speaker 3: the Safe Browsing server. That could be a good sort 257 00:15:53,240 --> 00:15:56,760 Speaker 3: of trade off to make if you want additional protection 258 00:15:56,960 --> 00:16:01,040 Speaker 3: against malware and against phishing, regardless of whether they're being phished 259 00:16:01,040 --> 00:16:03,480 Speaker 3: at work or phished at home on their work device. 260 00:16:03,600 --> 00:16:06,480 Speaker 3: And in fact, Safe Browsing is like such a popular 261 00:16:06,520 --> 00:16:09,960 Speaker 3: feature that it's also an open API leveraged by some 262 00:16:10,000 --> 00:16:10,680 Speaker 3: other browsers.
263 00:16:11,000 --> 00:16:15,240 Speaker 2: So of course you're dealing with data on these vulnerabilities 264 00:16:15,280 --> 00:16:18,280 Speaker 2: that is at the scale of Google, so you have 265 00:16:18,440 --> 00:16:22,760 Speaker 2: access to a great deal of very relevant data about vulnerabilities, 266 00:16:23,160 --> 00:16:26,160 Speaker 2: and not only that, but which of those vulnerabilities can 267 00:16:26,520 --> 00:16:28,240 Speaker 2: actually lead to a problem. 268 00:16:28,560 --> 00:16:32,880 Speaker 3: Absolutely. Yeah. Google is crawling the web every day for 269 00:16:32,960 --> 00:16:35,280 Speaker 3: its search engine, and as part of that, it's also 270 00:16:35,320 --> 00:16:38,000 Speaker 3: seeing malware, and that same sort of crawling is powering 271 00:16:38,040 --> 00:16:41,640 Speaker 3: Safe Browsing. And Safe Browsing is something that you just 272 00:16:41,680 --> 00:16:43,760 Speaker 3: get out of the box with Chrome, among other end 273 00:16:43,840 --> 00:16:46,840 Speaker 3: user features like site isolation. Then we have other features 274 00:16:46,880 --> 00:16:50,240 Speaker 3: that are built with enterprises and businesses in mind. For example, 275 00:16:50,520 --> 00:16:53,240 Speaker 3: with Chrome Enterprise Premium, you can implement filters based 276 00:16:53,280 --> 00:16:55,800 Speaker 3: on website categories that you've defined, and you can get 277 00:16:55,800 --> 00:16:58,560 Speaker 3: reporting that shows how your teams are handling those filters. So, 278 00:16:58,600 --> 00:17:01,320 Speaker 3: for example, are people getting fatigued by their alerts and 279 00:17:01,320 --> 00:17:05,240 Speaker 3: clicking through regardless? Having that kind of information means teams 280 00:17:05,240 --> 00:17:08,199 Speaker 3: can get visibility into what's happening in their fleet and 281 00:17:08,240 --> 00:17:10,120 Speaker 3: they can take action based on their findings.
282 00:17:10,480 --> 00:17:13,520 Speaker 2: This is great because one of the big intractable long 283 00:17:13,560 --> 00:17:17,600 Speaker 2: time problems in cybersecurity is just a lack of visibility 284 00:17:17,720 --> 00:17:21,800 Speaker 2: into process and how things are working in the web 285 00:17:21,800 --> 00:17:25,120 Speaker 2: apps and web browsers, which is realistically how people are 286 00:17:25,119 --> 00:17:28,440 Speaker 2: actually working today in the modern office workspace. 287 00:17:28,720 --> 00:17:31,080 Speaker 3: Absolutely. And like the old way of looking at this 288 00:17:31,119 --> 00:17:33,120 Speaker 3: would just be, what programs did you launch? And it'd 289 00:17:33,119 --> 00:17:35,000 Speaker 3: be like, oh, well, you launched a web browser, and 290 00:17:35,040 --> 00:17:37,399 Speaker 3: it's like, okay, well, what does that mean? Right? You 291 00:17:37,400 --> 00:17:39,240 Speaker 3: could have done anything inside of that now. So you 292 00:17:39,240 --> 00:17:41,119 Speaker 3: need to know what's happening inside. 293 00:17:40,760 --> 00:17:43,440 Speaker 2: Yeah, and thinking about where the work is actually happening, right, 294 00:17:43,520 --> 00:17:46,760 Speaker 2: because too often, I think in security we've gotten used 295 00:17:46,800 --> 00:17:49,200 Speaker 2: to looking at the people in a certain way. They're 296 00:17:49,240 --> 00:17:52,480 Speaker 2: just people making mistakes, people forwarding emails, people clicking on 297 00:17:52,560 --> 00:17:55,280 Speaker 2: dangerous links. We look at people and see them as 298 00:17:55,320 --> 00:17:57,639 Speaker 2: weak points. But we could be treating every one 299 00:17:57,720 --> 00:18:00,560 Speaker 2: of those people as a point of defense.
What do 300 00:18:00,680 --> 00:18:03,600 Speaker 2: you think about this growing emphasis on resiliency and managing 301 00:18:03,640 --> 00:18:06,640 Speaker 2: threats, and what is the role of teams in creating 302 00:18:06,640 --> 00:18:07,400 Speaker 2: that resiliency? 303 00:18:07,760 --> 00:18:11,040 Speaker 3: Yeah, I think this idea of cybersecurity resilience is becoming 304 00:18:11,280 --> 00:18:14,600 Speaker 3: more and more popular, especially in the financial services sector, 305 00:18:14,600 --> 00:18:17,359 Speaker 3: where the stakes are really high. Breaches are going to happen, 306 00:18:17,440 --> 00:18:19,960 Speaker 3: and mitigating and responding to them should be something that 307 00:18:20,000 --> 00:18:24,040 Speaker 3: takes five minutes, not five days or five years. I 308 00:18:24,080 --> 00:18:27,800 Speaker 3: talked last time about how strong identity is really important. 309 00:18:27,840 --> 00:18:30,880 Speaker 3: Once you have strong identity, you can start doing access 310 00:18:30,920 --> 00:18:33,919 Speaker 3: controls and authorization and limiting who has access to what, 311 00:18:34,080 --> 00:18:37,000 Speaker 3: instead of everyone having access to everything. The more that 312 00:18:37,040 --> 00:18:39,119 Speaker 3: you can do that. And then you can pair that with 313 00:18:39,240 --> 00:18:44,200 Speaker 3: audit logs. Audit logs are the key to any security monitoring. 314 00:18:44,560 --> 00:18:47,919 Speaker 2: Yes, and whenever you compare different pieces of information that 315 00:18:47,960 --> 00:18:51,200 Speaker 2: you have, vulnerabilities with audit logs, for instance, you start 316 00:18:51,240 --> 00:18:55,120 Speaker 2: to get that matrixed view which allows you to take 317 00:18:55,160 --> 00:18:56,840 Speaker 2: action in a much more meaningful way.
318 00:18:57,800 --> 00:19:01,879 Speaker 3: What you want is that people's regular day to day 319 00:19:02,160 --> 00:19:05,640 Speaker 3: web browsing is instrumented and understood as a baseline, so 320 00:19:05,680 --> 00:19:09,719 Speaker 3: that when something anomalous happens, it's detected as being anomalous. 321 00:19:09,760 --> 00:19:12,719 Speaker 3: You can't have an anomaly without a baseline. Ideally, you 322 00:19:12,760 --> 00:19:17,000 Speaker 3: want that detection to happen automatically, whether that's 323 00:19:17,280 --> 00:19:21,040 Speaker 3: something very simple, like blocking a copy paste 324 00:19:21,080 --> 00:19:23,680 Speaker 3: from your CRM into some sort of public document, 325 00:19:24,359 --> 00:19:27,840 Speaker 3: or it's something more complicated about detecting a download from 326 00:19:27,880 --> 00:19:30,440 Speaker 3: a site that normally doesn't have a download. And then 327 00:19:30,560 --> 00:19:33,920 Speaker 3: where Chrome Enterprise Premium can really help is identifying the 328 00:19:34,560 --> 00:19:38,719 Speaker 3: non standard usage, the anomalies, and remediating those.
You 329 00:19:38,760 --> 00:19:41,399 Speaker 3: can get an audit log of all of the events 330 00:19:41,440 --> 00:19:44,200 Speaker 3: that are happening in Chrome, all of the user interactions 331 00:19:44,560 --> 00:19:48,560 Speaker 3: and so on, and that is exposed through the cloud, 332 00:19:48,880 --> 00:19:52,840 Speaker 3: either directly to you via APIs, or it can integrate 333 00:19:53,119 --> 00:19:56,480 Speaker 3: with a third-party SIEM provider and hook 334 00:19:56,520 --> 00:19:59,360 Speaker 3: into your security team's workflow to look for anything out 335 00:19:59,359 --> 00:20:02,960 Speaker 3: of the ordinary, whether that's through integrating with data loss 336 00:20:02,960 --> 00:20:07,400 Speaker 3: prevention or just more specific rule sets on, hey, this 337 00:20:07,440 --> 00:20:10,159 Speaker 3: thing looks different than normal. And then in that world, 338 00:20:10,440 --> 00:20:12,919 Speaker 3: you're not relying on the users to always make the 339 00:20:12,960 --> 00:20:16,280 Speaker 3: right decision, but you're trying to detect when the users 340 00:20:16,480 --> 00:20:18,840 Speaker 3: haven't made the right decision or are doing something weird. 341 00:20:18,960 --> 00:20:20,600 Speaker 3: And then if you've paired that with all of the 342 00:20:20,640 --> 00:20:25,159 Speaker 3: other best practices, then hopefully your time to mitigation is 343 00:20:25,280 --> 00:20:27,440 Speaker 3: very fast and it's actually a very low-impact event 344 00:20:27,480 --> 00:20:28,639 Speaker 3: if something bad did happen. 345 00:20:29,200 --> 00:20:32,320 Speaker 2: Yeah, I know we have so many amazing technology solutions now, 346 00:20:32,359 --> 00:20:35,679 Speaker 2: but it also reminds me of how difficult it can 347 00:20:35,720 --> 00:20:38,920 Speaker 2: be for a security team to implement the new technologies 348 00:20:38,960 --> 00:20:41,520 Speaker 2: that they want to have.
And that's where again we 349 00:20:41,600 --> 00:20:44,080 Speaker 2: go back to the people involved. You really have to 350 00:20:44,119 --> 00:20:47,639 Speaker 2: have strong leadership who are listening to their security 351 00:20:47,640 --> 00:20:50,200 Speaker 2: teams and their experts and able to make the right 352 00:20:50,240 --> 00:20:54,000 Speaker 2: decisions for the company in terms of what kind of 353 00:20:54,200 --> 00:20:56,680 Speaker 2: security measures are going to work the best for them 354 00:20:56,800 --> 00:20:58,680 Speaker 2: and the level of visibility that they want. 355 00:20:59,000 --> 00:21:04,800 Speaker 3: Absolutely. I think that's this move to management becoming something 356 00:21:04,840 --> 00:21:09,560 Speaker 3: that the security team, or whoever is responsible for security, owns: 357 00:21:09,720 --> 00:21:12,119 Speaker 3: the management of the web browser or of a 358 00:21:12,160 --> 00:21:15,399 Speaker 3: phone or of the device is actually a security product 359 00:21:15,560 --> 00:21:18,199 Speaker 3: rather than just an IT product, because all of 360 00:21:18,680 --> 00:21:22,960 Speaker 3: modern security operations is about identifying who's logging 361 00:21:23,040 --> 00:21:25,880 Speaker 3: in on a web browser and securing that web browser, whether 362 00:21:26,000 --> 00:21:28,440 Speaker 3: that browser is on a laptop, that browser is on 363 00:21:28,480 --> 00:21:31,120 Speaker 3: a phone, it's on a company-owned phone, or it's 364 00:21:31,119 --> 00:21:34,919 Speaker 3: on a personal phone. It's ensuring that whatever device the 365 00:21:35,000 --> 00:21:37,240 Speaker 3: user is going to, whatever browser they're signing in on, has 366 00:21:37,280 --> 00:21:40,440 Speaker 3: some minimum security posture. You've strongly authenticated them, you can 367 00:21:40,480 --> 00:21:42,920 Speaker 3: wipe data if you need to.
And all of these 368 00:21:42,920 --> 00:21:46,400 Speaker 3: things might have previously been something that you'd just been like, oh, 369 00:21:46,400 --> 00:21:48,639 Speaker 3: that's something that IT just has to deal with for 370 00:21:48,680 --> 00:21:51,440 Speaker 3: IT-related reasons, and it's like, you know, actually, these 371 00:21:51,440 --> 00:21:55,879 Speaker 3: problems are really deeply central to the security story of 372 00:21:55,920 --> 00:21:57,120 Speaker 3: a modern workplace as well. 373 00:21:59,280 --> 00:22:02,400 Speaker 2: To learn more about how the most trusted enterprise browser 374 00:22:02,440 --> 00:22:06,480 Speaker 2: can help protect your organization, visit Chrome Enterprise dot Google. 375 00:22:09,040 --> 00:22:12,000 Speaker 2: Next time, on Security Bookmarks, I'll talk to Curtis Minder, 376 00:22:12,160 --> 00:22:16,120 Speaker 2: a renowned ransomware negotiator, about the security challenges he's tackled 377 00:22:16,320 --> 00:22:17,800 Speaker 2: in the manufacturing industry. 378 00:22:18,200 --> 00:22:20,520 Speaker 4: We have been the manufacturer of this particular product for 379 00:22:20,520 --> 00:22:23,600 Speaker 4: almost one hundred years, and the way that we manufacture 380 00:22:23,600 --> 00:22:25,800 Speaker 4: this product and the materials we use to manufacture this 381 00:22:25,880 --> 00:22:29,639 Speaker 4: product are our trade secret. I am concerned that that 382 00:22:29,800 --> 00:22:32,520 Speaker 4: information has left the building, and I won't know about 383 00:22:32,520 --> 00:22:36,000 Speaker 4: that risk for some time until a competitor of mine 384 00:22:36,040 --> 00:22:38,400 Speaker 4: makes the exact same product five years from now 385 00:22:38,760 --> 00:22:39,960 Speaker 4: and puts me out of business.
386 00:22:40,480 --> 00:22:43,919 Speaker 2: Security Bookmarks is a podcast from Bloomberg Media Studios and 387 00:22:44,000 --> 00:22:47,560 Speaker 2: Chrome Enterprise. Subscribe in your podcast app so you don't miss 388 00:22:47,600 --> 00:22:51,119 Speaker 2: our newest episode. I'm Kate Fazzini. Thanks for listening.