Host: In this episode, we're going to cover a topic that all too often only gets attention when something goes horribly wrong: cybersecurity. And before we get into the interview, let me hit you with some facts. In the United States, the cost a company incurs in the wake of a data breach has been on the rise. According to the Ponemon Institute, costs have increased by 130 percent since 2006. So a data breach that would set a company back $3.5 to $4 million in 2006 would cost $8.19 million in 2019. A single compromised record costs $150 on average, and that's just the average. For some companies, like those in healthcare, the cost can be much higher due to the nature of the data. The report found that more than half of data breaches come as the result of a malicious attack. And one important thing to keep in mind about this report is that it was for 2019. We are in a different environment today with more potential attack vectors, and while the tricks of the trade haven't changed much over the years, the number of opportunities for attack is on the rise.
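[Editor's note: a quick sanity check on those figures. This is back-of-the-envelope arithmetic only; the implied 2006 baseline of roughly $3.56 million is our assumption, chosen to sit inside the "$3.5 to $4 million" range quoted above.]

```python
# Back-of-the-envelope check of the breach-cost figures quoted above.
baseline_2006 = 3.56e6             # assumed 2006 average breach cost (within the quoted $3.5-4M range)
increase_since_2006 = 1.30         # the quoted 130% increase
cost_2019 = baseline_2006 * (1 + increase_since_2006)
print(f"${cost_2019 / 1e6:.2f}M")  # -> $8.19M, matching the 2019 figure cited
```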
Host: With all that in mind, I sat down, virtually speaking (we were all working remotely), with Wendy Whitmore, VP of IBM Security X-Force Threat Intelligence, and Allison Ritter, program leader at the IBM Security Command Center. We talked about the challenges that companies face today as our concept of the workplace changes, and how companies can best prepare themselves for their worst day ever. Here's our conversation.

I think we can get the obvious out of the way: we can state that a priority for any business in the twenty-first century needs to be cybersecurity. That is pretty obvious. But what has become more complicated, I would think, is this shift we're seeing now that we're in an era of, let's say, momentous events, where we've seen a real move to decentralization. A lot of people are working from home, and that's changed the nature of business. Has that impacted the focus of cyber threats as well? Are we seeing changes in that realm, Wendy?

Wendy: Yes, I would say we absolutely are. The good news, though, is that it hasn't shifted in terms of new and novel types of attack techniques. What's really shifted is the volume of attacks, as well as the frequency, and then the attack surface. When I say attack surface, what I mean is that there are millions more computers now connecting from remote locations into devices and applications and systems that previously were within the same network. That gives attackers a bigger attack surface on the external side: all these new systems that are online and no longer behind firewalls. But it also exposes corporate entities to more of an attack surface, because they might have enabled remote work overnight, and perhaps they don't have the right types of authentication, or enough authentication, on the very systems these attackers can take advantage of, but where their users need to connect to do business and do their work on a daily basis.

Host: And we've definitely seen issues in the past where even within these well-protected systems we can see failures. I often argue on TechStuff that the weakest link in a company's cybersecurity process isn't necessarily the technology. It can be the implementation of that technology, but it often falls to a weak link in the use of that technology. So user error, you could argue, leads to a lot of those failures. I imagine that this decentralized approach has created enormous opportunities to exploit that, because people are having to navigate a new workspace; they're having to access systems in ways that they haven't before.
That, as you say, is a step outside of the immediate control of a lot of these businesses. So can you talk a bit about the nature of that attack surface? What are the sorts of attacks that you typically see? What's the nature of it?

Wendy: Well, okay, let's take that question from a couple of different angles. One is just the ability to exploit human error, which we are all going to be prone to. If you're looking at it from the attacker's perspective, they're kind of saying, hey, great, look at this: there's all this chaos going on in the world right now, there are new ways of working. People who are now working remotely, who maybe didn't before, are also doing things like checking their email more regularly, and they're checking social media sources and news sources because they want to know: what are the local regulations where I live, what are the current counts in terms of how many people are being infected? And so since March we have seen a 6,000 percent increase in spam related to COVID-19. That's coming to users directly, so maybe your personal email accounts, but it's also coming into work email accounts, unfortunately. And each one of those systems I mentioned, which previously used to be inside of a network and now is not, may or may not have the same types of protections on it, in the form of firewalls and antivirus and endpoint detection, that it had when it was on the interior of the network. So from that perspective, human error and humans are always going to be a huge part of the attack surface. We've all got that, you know; as if we don't have enough things to be concerned about right now, we're going to add that to the list.
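[Editor's note: to make the attack-surface point concrete, here is a minimal, illustrative Python sketch of the kind of check a defender might run against their own newly remote machines. The port list is an assumption for illustration, and you should only probe hosts you are authorized to test.]

```python
import socket

# Remote-access services that attackers commonly probe for (illustrative list).
COMMON_REMOTE_PORTS = {22: "SSH", 3389: "RDP", 5900: "VNC"}

def exposed_services(host: str, timeout: float = 2.0) -> list[tuple[int, str]]:
    """Return the commonly targeted remote-access ports that answer on a host."""
    open_ports = []
    for port, name in COMMON_REMOTE_PORTS.items():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            if sock.connect_ex((host, port)) == 0:  # 0 means the TCP connect succeeded
                open_ports.append((port, name))
    return open_ports

# Example, against a host you own: exposed_services("203.0.113.10")
```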
Wendy: And then as the pandemic has shifted, and as it continues to go on, there is obviously a huge emphasis on testing, on vaccine research, on the development of a vaccine, and on processes and procedures that are all going to make all of us more secure. And so now it's not only those spam campaigns. When we talk about spam, we're often talking about cyber criminals, who are financially motivated, looking to steal information, maybe looking to conduct ransomware attacks. But when we shift over to the vaccine research and the testing, we're really looking at nation-state actors who are looking to capitalize on the theft of intellectual property, make sure that they can protect their citizens, and potentially turn that research into financial gain as well. So it's a pretty tumultuous environment right now. I mentioned that the attacks themselves are not necessarily all that new and novel or exciting, but the volume of them, combined with the increased attack surface as well as just the general day-to-day chaos, has made it a pretty interesting environment, to say the least.

Host: Yes, I think interesting is a great word for it. It's one of those nice catch-alls. In your work, Wendy, have you noticed any particular sectors or industries that are particularly being targeted by cyber attacks in this era?

Wendy: Right now, we're absolutely concerned about critical infrastructure, and when I say that, that's kind of a big list of organizations. It's everything from the obvious, like hospitals that are providing healthcare to people who are sick, to the medical insurers; it's the whole infrastructure and ecosystem on the medical side. It's also the financial services industry and their supporting infrastructure and supply chain, as well as energy and oil and gas. It could also be the food supply chain. All of those things that we need now, more than ever, to work perfectly are also potentially at risk.
And so what we look at, and what we're concerned about most, are ransomware attacks against those types of organizations. We know the continual targeting and theft of intellectual property will go on as it always has. But if we can stop some of these major ransomware breaches from being effective and from stopping business for our clients, that's really what we're concerned about helping with.

Host: I think one of the things we've learned, we being the laymen, and I include myself in this, in the wake of this pandemic is how incredibly interconnected all these different pieces are. And if you do put yourself in the mind of someone who is attempting to exploit the chaos, then you can think: well, you want to target whatever links appear to be the most vulnerable at any given time. And this brings us over to something I wanted to speak with Allison about. Allison, you have a pretty cool job in that you help architect scenarios for companies so that they can have a simulated cyber attack, sort of a worst-day-ever scenario. Can you talk a little bit about what that's all about and what goes into planning this sort of thing?

Allison: Yeah. Having a well-tested and really thought-out plan is key to any incident response piece that you'd be working on within a company. Where I work, we're really focused on creating custom scenarios for organizations to go through and handle: really, a day in the life of a cybersecurity attack. It is, like you said, the worst day that could possibly happen within an organization. A plan is really only part of the solution, so you also need to find out if your company is ready and able to execute and work through that plan. And that's where my team comes in, helping to test out that plan within your organization. We run a fully immersive and gamified cyber range as part of the IBM Security Command Centers.
Within the command centers, we test and train companies to practice their response to a cybersecurity attack. Now, when I say test, it's not just reading through your plan and answering questions. We put your plan into action by throwing your entire response into a full-on simulation of a cyber attack. The most effective response plans we've found are tested and rehearsed multiple times through different types of attack scenarios. So, for example, you could be testing a ransomware response, a DDoS attack, insider threats. All of these areas are important to test and train on when dealing with and handling a cyber attack. And you know, a lot of people think that these are technical responses, that this is really something for your security operations centers, your IT areas. But actually, cyber response plans are best executed by a whole-of-business response, so individuals from human resources, communications, finance, and legal all come into play when handling a cyber attack. We work with all of those within the cyber range.

Host: That's absolutely fascinating. And as you point out, I think a lot of us think of cyber attacks and the response to them in very Hollywood terms, just because of the way the media tends to portray this sort of stuff, where you have people furiously typing, maybe two people typing on the same keyboard, which we all know works incredibly well. That clearly is not an accurate representation of what actually happens. And I'm sure there are a lot of people out there listening who are working in their IT departments, perhaps they are leaders in their IT departments, and maybe they're thinking about this for the first time. So do you have any thoughts about even just the process of getting started and building a response plan? How does someone go about doing that?

Wendy: Yeah, that's a great question. I think a lot of organizations can feel overwhelmed at times about where to start.
You know, "I don't know how to get started; it's this huge thing." And to be honest, what we see is that three-quarters of organizations still don't actually have a plan in place: no incident response plan, no playbooks on specifically how to respond to a certain kind of attack. So first and foremost, put it on paper. Start somewhere. Start with the names of your personnel, their contact information, their email addresses, and their roles, and literally start there. Then, from there, start building out the different components of the organization: the cross-functional departments, who those leaders are, what applications they're responsible for, and really get an understanding of what roles and responsibilities different team members are going to play. Then, as we look at organizations that are more advanced, what we would encourage them to do is certainly to have specific playbooks for certain activities: a ransomware playbook, a theft-of-intellectual-property playbook, anything along those lines. And once you have those in place, then we look at testing them, and at increasing the frequency of testing. If you can test at least one of those scenarios quarterly, your organization is going to identify where the gaps are. And if you can do that in advance of an attacker doing it for you, you're going to be much better prepared to respond effectively to an attack.
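[Editor's note: the "put it on paper" starting point Wendy describes, personnel, contact details, roles, and scenario playbooks, is easy to capture in a machine-readable form. The sketch below is a hypothetical illustration, not an IBM tool; every name and field is invented.]

```python
from dataclasses import dataclass, field

@dataclass
class Responder:
    """One entry in an incident response contact roster."""
    name: str
    role: str   # e.g., incident commander, communications lead, legal
    email: str
    phone: str

@dataclass
class Playbook:
    """A scenario-specific response playbook, as Wendy suggests."""
    scenario: str                                    # e.g., "ransomware"
    first_steps: list[str] = field(default_factory=list)

# Hypothetical roster: start with names, contact info, and roles, then grow it.
roster = [
    Responder("A. Example", "incident commander", "a.example@corp.example", "+1-555-0100"),
    Responder("B. Example", "communications lead", "b.example@corp.example", "+1-555-0101"),
]
playbooks = [
    Playbook("ransomware", ["isolate affected hosts", "preserve evidence", "notify incident commander"]),
    Playbook("theft of intellectual property", ["identify exposed data", "engage legal", "preserve logs"]),
]
```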
Host: I imagine part of that also comes down to how you communicate this, both internally and then externally. I'm sure we could all list off examples of companies that have had a data breach and kept it quiet for maybe up to a year before the news breaks. And honestly, I feel that the longer that goes on, the deeper the loss of trust that tends to follow. There's almost a sense of betrayal among the various stakeholders, whether it's a customer or a client or whoever. So is communication a part of that playbook? Is that something that you help develop as well?

Wendy: Communication is absolutely, I would argue, the most important part of the whole thing today, and I'll let Allison talk more about how we train for that in the range. But what we talk to our clients about in these situations is that there are components you can prepare in advance. Things like having what we call a holding statement: some sort of statement so that if press breaks and you're not yet ready to share information, you've got a canned statement prepared and ready to go. That is going to put you in a position where it appears that the organization is on top of things, that it's communicating with its clients, and that it is investigating the situation. In so many of these cases today, it's not just about what the response to the event was; it's the communication of it, and the public's perception of that communication, as well as your customers' and clients' perception of it, that can cause reputational damage. Or, on the plus side, even in the wake of some of the worst breaches we've seen in history, we've seen leaders who have come out and done a fantastic job of communicating about it, and they've actually built even more goodwill and trust in their client base as a result of one of these breaches. That's something that Allison and her team share on a daily basis within the range. So Allison, I'd love to hear your perspective on it.

Allison: A great deal of my area is working on how we get attendees to engage within the scenarios: breaking you away from your everyday life and simulating something, a cyber attack, that could possibly be your worst day in that organization.
Something we think about when creating these is that we're really testing you and training you by emulating the business and security issues that would be taking place. All of the stories we work on, and these experiences, are based upon real-life incidents and stories from the field, kind of like the top headlines that we're seeing. To create these simulations, we use a method called experience design, which creates real-life situations that pull not only from real-life stories but also from feelings such as panic and uncertainty. These areas are really experiential learning: in order to fully learn what you need to do, you have to experience it firsthand. So we want to drop you into a scenario and have you go through it. For example, something you might be dealing with, like Wendy said, is going through a holding statement and having to actually put that out. We test you and put you firsthand into what we call the hot seat. It's a live broadcast studio where we drop you in, full green screen and lights, and we turn that camera on and ask you questions from a real reporter. It's up to you to answer, and how do you deal with that? Many people find out once they go through it: I need to go back and take some time to learn how to answer some of these questions, and what the ways are that I would work through that. Because again, the brand and reputation of your company is a big piece of this, so keeping that up is something we work on. And all of this comes through emulating and simulating these scenario pieces.

Host: Allison, one of the things that you and I share is a background in theater, and as someone who is in theater and who has participated in various theatrical events where you are simulating something: to me, one of the magic parts of theater is that people actually will experience those reactions, even in a simulation.
You know, you have removed yourself from any real danger, you are not in a legitimately dangerous situation, but your body and your mind still go through those reactions. Do you witness that in these simulations? Do you actually see people having those kinds of emotional responses, and is that a big part of learning how to respond appropriately when this happens in real life?

Allison: Yes, exactly, you're spot on. The whole piece is really creating that adrenaline rush, seeing your heart rate go up as soon as you see your headline splashed across the front page and in the news. That creates something for you internally, and what we're doing is creating it in a safe space. This is a space where we want you to fail, in here rather than out in the real world. We want you to understand what you would need to do if something did take place and you now had to respond to it. To do that, we use lighting, sound design, and interactive apps to create and evoke this emotion. We had an individual come through who said it almost created a level of PTSD from a previous cyber attack. They came through and said: wow, I know that I'm in a simulation, but my heart and mind sort of take me to this other place where now I'm really feeling what it's like. And that's the whole thing of practicing and having this muscle memory of going through it. You're rehearsing and rehearsing and understanding, and like Wendy said, doing these every quarter can really help you understand what you would need to do to deal with that. And that pressure might then go down, because now you know how you work with the attack and the next steps of what you need to do to process it.
Host: Yes, I think it's much better to have that visceral reaction when you're in a practice stage than to have it when you're dealing with a real-world intrusion or a data breach or something along those lines. You definitely want to be able to look back on that training and rely upon that muscle memory, as you say, rather than have to soldier on and put that response plan to the test without ever having actually done it. I would love to be a fly on the wall at one of these. It sounds truly amazing to me, and like the sort of stuff I've seen in hacker movies but never thought anyone actually did, so that's phenomenal. Wendy, can you talk a little bit about whether there are any common traits that you see among companies that are really good at recovering from these sorts of threats, these sorts of attacks? Are there certain things you can identify and say, these are our markers, sort of best practices that are common across different industries?

Wendy: Well, I think first and foremost it's because they have access to an incident response team, whether that's an internal team or whether they choose to leverage an external team. The reason you want those people there is right along the lines of what you two were talking about: you want people who have had a lot of practice in this, who have responded to events. I will say, I've been doing this almost twenty years, and when I get the first phone call from a client, a new client where we know there's a situation going on, I still get the adrenaline rush, because I want to know: okay, what are the details they're going to share with me? Who's the potential attacker? What do we need to do? My mind is racing with all of these different things and actions we need to take.
But because I've been through it so many times, I'm able to harness that and channel it into a productive, credible discussion: here's what we need to do, here are the actions we need to take, here are the things not to do right now, here's the evidence to preserve. The more that organizations have access to personnel like that, and those skills, the more successful they're going to be, because they're going to reduce the time it takes to get answers. And when you talk about the age-old adage that time is money, that is extremely true in attacks, because the more time you can save, or the less time you take to get answers, the more money you're ultimately going to save, because you're exposing your organization to less risk throughout that entire time. So first and foremost, if we want to look at who's successful: they have a team of people who can respond to the incident. That said, that team of people also has things like technology in place that gives them the visibility to answer questions. Because if you can answer questions really quickly, then we can make decisions for the business, whether that's taking a system offline or taking an entire part of the network offline because of the risk that's exposed. Those are all decisions we can make. So the quicker we can do that, based on visibility, the better.

Host: I like that. That answer goes back to what you were saying earlier, Wendy, about how the first step of building a response at all involves getting that list of names and their contact information and the roles they play. It drives home that when you have something like this happen, obviously your first response is "oh no," and your second response is "what do I do?" And having that list of people who have very specific job roles and ways of reacting to this is absolutely of critical importance.
You reduce the amount of time it takes to even know who you're going to turn to. One of the worst feelings in the world is receiving information and literally not knowing where you need to go in order to resolve it. So having that in place, as you point out, is absolutely critical. Allison, are there any specific lessons that companies tend to learn in these simulations, apart from the fact that a simulation can be almost as terrifying as the real thing?

Allison: Yeah. One of them, I'd say, is simply that a lot of organizations realize they need to test their plan and go through it; that's the first piece. But on what you were just mentioning about the types of people that go through, and the response: one thing we've found is that those with military or first-responder training have responded very well within these types of response challenges. I think what we take from that is that those are people who are really trained for incidents. They're not shying away or pushing it off as someone else's issue; they're taking it on, leaning into the situation, and really moving forward quickly with it. We tend to see that those are the ones that get up, answer the phone, and handle the situation. So there's a lesson to take from that. Another piece is that many learn to understand that cybersecurity is a whole-of-business response. It's not just IT. We need to see everyone within your organization taking part, and understanding that there's now a cybersecurity culture that needs to take hold.
Another thing is looking at it from a top-down approach: cybersecurity awareness, this idea of a good cybersecurity culture that comes from the top of your organization and can trickle down through the rest, and making sure that your teams are empowered to take steps to react immediately, without hesitation, giving them that power: you know what you need to do, you've practiced and rehearsed, and these are the steps you would need to take.

Host: Out of curiosity, Allison, is there a particular type of threat where you've seen the response frequently lacking? Is there a place people really need to focus on, I guess, is what I'm trying to get at?

Allison: Yeah, I would say a big piece where people lack is the response to media and communications. That side of it isn't always thought about. You're dealing with the technical side, and you have teams that are trained in that, but then it comes to putting out that holding statement, and even communicating internally to your teams so that they're not sending out messages or putting things out while wondering what's going on; you can put these sorts of holding statements out internally within your organization. And something that we also practice is called a leader's intent, where we have the team write out a leader's intent for your entire organization. This gives you a purpose and an end state of what you would need to do if something took place, so it gives everyone in your company that direction and that goal of what they would need to do.

Host: As a member of the media, I can certainly understand how we can be intimidating. I mean, our job is to spread information, and sometimes you really need to contain it for the moment so that you can do the right thing. So I certainly can appreciate that from my perspective.

Allison: Oh, yes, we use you as the bad guys all the time.
Host: I mean, it's fine, it's fine. Wendy, your team recently released a threat landscape report on cloud environments. Now, obviously, over the last two decades we have seen an incredible migration to cloud services. There are so many companies out there that are dependent on either a hybrid cloud strategy, or a lot have even moved almost all of their processes to the cloud. What were some of the things you learned and released in that threat landscape report?

Wendy: You know, I think they're pretty consistent with the things we've seen in the field in our investigations. I mentioned earlier that time is money, and that's never truer than in the case of a data breach, and data breaches in the cloud are not any different. They are primarily motivated by financial gain; that's really the most common motivation for the threat actors we see targeting those environments. And it relates primarily to data theft: data that's hosted in the cloud. One of the things we consistently see is that organizations that move data to the cloud will kind of have this false idea that, okay, well, now it's someone else's responsibility, and so I'm absolved from the responsibility of protecting that data. Unfortunately, that's not the case. And so we see a huge number of misconfigurations. About 43 percent of the attacks that we see in the cloud are the result of misconfigurations. And oftentimes, again, it's unclear whether it's the hosting provider or the actual data owner who was responsible; maybe they were pointing fingers about who was actually responsible for those attacks. But I think the reality is that we are going to continue to see more and more of those types of attacks as more organizations move to hosting data in the cloud.
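[Editor's note: cloud misconfiguration, as Wendy describes it, is often as simple as a storage bucket left publicly readable. As an illustrative sketch only (it uses the real AWS boto3 library, but this is not from the IBM report), here is one way to audit S3 buckets for a missing public-access block:]

```python
import boto3
from botocore.exceptions import ClientError

# Illustrative audit: flag S3 buckets with no public-access block configured.
s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        config = s3.get_public_access_block(Bucket=name)["PublicAccessBlockConfiguration"]
        if not all(config.values()):
            print(f"{name}: public access only partially blocked: {config}")
    except ClientError as err:
        if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
            print(f"{name}: NO public access block configured; review this bucket")
        else:
            raise
```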
Host: That to me is incredibly interesting, Wendy, because the first thing I think of when I consider moving things to the cloud is a reticence to let go of something; I think about the old days when everything was self-contained. But it is interesting to think of it from the other perspective: the idea that you're absolving yourself of responsibility by putting it onto a cloud provider. In either case, it's a destructive way of thinking, and I think it points back to your earlier point. This is another example of how a response plan is absolutely critical to any business: whether you are overseeing your systems internally or they're in the cloud, you have to have that plan in place. It isn't enough to just say, oh, well, it's in safe hands, I can brush off my hands and walk the other way and never have to worry. So, Allison, does your team work on creating scenarios that involve things like cloud environments?

Allison: Yes, very much so. That's a big area we're seeing companies move toward, and it's something we're highlighting and working on within the range. We have organizations that test and train within that space. Something we do is put you in a fictitious company, set up as a cloud-first environment, and we give participants best practices on managing those cloud attacks and the response to them.
So we look at migrating to the cloud, which introduces new security risks and different challenges, and we take participants through a fictitious multi-cloud organization that is about to experience a cyber attack: what you would need to do to support that, what you would need to do to stop it, and what the responses are to dealing with it now that it's in the cloud. This still gives you a chance to protect your customers, your employees, and your brand, but all of that within the cloud, and to see how your organization would handle it in these cloud environments.

Host: Out of curiosity, do you have members of your team who are essentially filling the role of people who work for this fictional cloud service provider, and do they have to interact with the people going through the simulation?

Allison: Yeah. They're kind of like our actors, in a way, but they're trained experts in cloud, cloud resources, OpenShift, all of that sort of area. So we definitely have these experts there, interacting and putting in those pieces. When a client or an attendee is asking questions and going through it, there are these real-life, in a way, actors that come in, ask questions, and play through these real-life scenarios.

Host: That's fascinating. And it's incredibly valuable, right? Because otherwise you would just have people talking through their response plan, and if there's no one they can bounce off of, and the control is outside of the company, it really would be a frustrating experience. So having that extra piece in there, where you can figure out what the resolution is, to, one, stop the attack, and then move on to your next phase, is absolutely important and critical.
Obviously, 570 00:31:58,560 --> 00:32:01,640 Speaker 1: I have another question, really for both of you, 571 00:32:01,720 --> 00:32:05,240 Speaker 1: but Wendy, maybe you can take first crack at this. 572 00:32:05,240 --> 00:32:08,320 Speaker 1: This is where we put on prognosticator hats. It's where 573 00:32:08,320 --> 00:32:12,240 Speaker 1: we look into the future, which we all know is dangerous. 574 00:32:12,320 --> 00:32:15,880 Speaker 1: And yes, often we have to 575 00:32:15,920 --> 00:32:18,880 Speaker 1: couch things, which is perfectly fine. But how do you 576 00:32:19,280 --> 00:32:26,360 Speaker 1: see the cyber landscape evolving now, especially given this decentralized approach, 577 00:32:26,400 --> 00:32:29,240 Speaker 1: which I imagine for a lot of companies is going 578 00:32:29,320 --> 00:32:33,920 Speaker 1: to become the normal mode of operations, even once we 579 00:32:34,000 --> 00:32:37,600 Speaker 1: emerge from the pandemic? Right, you know, I think we're 580 00:32:37,640 --> 00:32:40,520 Speaker 1: going to see, right, a prolonged period of a little 581 00:32:40,520 --> 00:32:45,080 Speaker 1: bit of instability. Right? How do people work from home? 582 00:32:45,280 --> 00:32:47,920 Speaker 1: Does part of the workforce work from home? Part go 583 00:32:48,000 --> 00:32:50,200 Speaker 1: back to the office? There's going to be just a 584 00:32:50,240 --> 00:32:52,560 Speaker 1: continued kind of dynamic shift, and I think that's going 585 00:32:52,600 --> 00:32:55,280 Speaker 1: to make a lot of people uneasy. Right? So from 586 00:32:55,320 --> 00:32:57,560 Speaker 1: that perspective, I think we're going to continue to see 587 00:32:57,680 --> 00:33:00,720 Speaker 1: attackers take advantage of that. I think there are some things 588 00:33:00,720 --> 00:33:03,680 Speaker 1: that organizations can do to be much more successful at that, 589 00:33:04,600 --> 00:33:11,600 Speaker 1: things like implementing multi factor authentication for remotely accessible devices 590 00:33:11,600 --> 00:33:15,040 Speaker 1: and systems and applications. That's going to be critical. Right, 591 00:33:15,080 --> 00:33:17,880 Speaker 1: regardless of whom you have working in an office or not, 592 00:33:19,040 --> 00:33:21,160 Speaker 1: you'll be able to then secure that data a little 593 00:33:21,200 --> 00:33:24,000 Speaker 1: bit more, because attackers will continue to take advantage of that. 594 00:33:24,400 --> 00:33:27,720 Speaker 1: I think we'll continue to see more online scams. As 595 00:33:27,800 --> 00:33:30,320 Speaker 1: the election season within the US is coming up, you're 596 00:33:30,360 --> 00:33:32,680 Speaker 1: going to continue to see more related to that. And 597 00:33:32,720 --> 00:33:36,560 Speaker 1: then once vaccines are available and once more testing is 598 00:33:36,600 --> 00:33:39,160 Speaker 1: readily available, we're going to continue to see a lot 599 00:33:39,200 --> 00:33:43,440 Speaker 1: more scams related to that. So individual users will need 600 00:33:43,480 --> 00:33:46,440 Speaker 1: to really, I think, learn to protect themselves a little 601 00:33:46,480 --> 00:33:50,280 Speaker 1: bit more effectively.
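Since multi factor authentication comes up repeatedly in this conversation, a small illustration may help: below is a minimal sketch of the time-based one-time password (TOTP) scheme that most authenticator apps implement (RFC 6238), using only Python's standard library. The base32 secret in the example is an arbitrary placeholder, not anything from the episode, and a real deployment would rely on a vetted library rather than hand-rolled code.

    import base64
    import hmac
    import struct
    import time

    def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
        # The server and the user's device share the same secret key.
        key = base64.b32decode(secret_b32, casefold=True)
        counter = int(time.time()) // interval        # 30-second time step
        msg = struct.pack(">Q", counter)              # 8-byte big-endian counter
        digest = hmac.new(key, msg, "sha1").digest()
        offset = digest[-1] & 0x0F                    # dynamic truncation (RFC 4226)
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    # Placeholder secret for demonstration; the code changes every 30 seconds.
    print(totp("JBSWY3DPEHPK3PXP"))

Because each code is derived from the current time and a secret held only by the legitimate device, a phished or reused password alone is no longer enough to log in, which is exactly the property being recommended here for remotely accessible systems.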
And that multi factor authentication I mentioned, 602 00:33:50,320 --> 00:33:53,640 Speaker 1: for example, is also great for you to implement personally. 603 00:33:54,000 --> 00:33:57,320 Speaker 1: So things like on your online banking accounts, on your 604 00:33:57,320 --> 00:34:01,640 Speaker 1: personal email accounts, your social media accounts, having multi factor 605 00:34:01,680 --> 00:34:06,040 Speaker 1: authentication on most of those. Now, most applications have that built 606 00:34:06,040 --> 00:34:09,280 Speaker 1: in, and people can take advantage of it. And then also 607 00:34:09,360 --> 00:34:12,440 Speaker 1: doing things like having a password manager. There are 608 00:34:12,480 --> 00:34:14,400 Speaker 1: lots of free ones you can use, so that, 609 00:34:14,520 --> 00:34:16,600 Speaker 1: one, you don't have to memorize your passwords, and 610 00:34:16,680 --> 00:34:19,360 Speaker 1: you're not using the same ones over and over again. 611 00:34:19,760 --> 00:34:22,319 Speaker 1: We know that the number of breaches is going to 612 00:34:22,320 --> 00:34:26,080 Speaker 1: continue to increase, the number of compromised networks and systems 613 00:34:26,120 --> 00:34:28,960 Speaker 1: and accounts will continue to increase, and at this point, 614 00:34:29,440 --> 00:34:32,279 Speaker 1: over sixty percent of the breaches that we see are 615 00:34:32,360 --> 00:34:35,400 Speaker 1: leveraging data that's already been stolen somewhere else or a 616 00:34:35,480 --> 00:34:38,319 Speaker 1: vulnerability that's already been exploited and is out there and 617 00:34:38,360 --> 00:34:40,799 Speaker 1: known to the public. So if we can all do 618 00:34:40,960 --> 00:34:43,080 Speaker 1: our best to kind of do our part and take the 619 00:34:43,160 --> 00:34:46,279 Speaker 1: actions that are going to help secure our own environments, 620 00:34:46,680 --> 00:34:49,400 Speaker 1: the better that's going to translate to 621 00:34:49,560 --> 00:34:53,239 Speaker 1: our corporate environments and just to overall security. Yeah, I 622 00:34:54,760 --> 00:34:57,720 Speaker 1: can't tell you how many times I've rolled my eyes 623 00:34:57,760 --> 00:35:02,480 Speaker 1: at reports of a data breach where passwords were shared, 624 00:35:02,520 --> 00:35:05,040 Speaker 1: and you see that the most common passwords are things 625 00:35:05,080 --> 00:35:08,320 Speaker 1: like password or one, two, three, four, five, six or whatever, 626 00:35:08,920 --> 00:35:11,760 Speaker 1: or password one, so that you have the one numeral 627 00:35:11,840 --> 00:35:15,480 Speaker 1: in there. And I think a big part of cybersecurity 628 00:35:15,480 --> 00:35:18,600 Speaker 1: from an individual standpoint, and please correct me if I 629 00:35:18,640 --> 00:35:21,399 Speaker 1: am off base, because you're the experts, but I think 630 00:35:21,440 --> 00:35:23,120 Speaker 1: a large part of it is the idea that you're 631 00:35:23,120 --> 00:35:26,320 Speaker 1: trying to just reduce the number of opportunities an attacker 632 00:35:26,440 --> 00:35:31,000 Speaker 1: has to take advantage of you, and the more opportunities 633 00:35:31,040 --> 00:35:35,960 Speaker 1: you eliminate, the less valuable you are to the typical attacker, because, 634 00:35:36,000 --> 00:35:38,319 Speaker 1: as you had mentioned earlier,
time is money, even on 635 00:35:38,360 --> 00:35:41,439 Speaker 1: the attack side, and an attacker is far more likely 636 00:35:41,480 --> 00:35:44,520 Speaker 1: to go after a target that they view as being 637 00:35:44,640 --> 00:35:49,000 Speaker 1: vulnerable than to waste time on targets that appear 638 00:35:49,080 --> 00:35:53,239 Speaker 1: to be more savvy from a security perspective. Am I 639 00:35:53,360 --> 00:35:56,359 Speaker 1: more or less on track there? I think you're ready 640 00:35:56,400 --> 00:35:59,120 Speaker 1: to be an incident response consultant, because that's one of 641 00:35:59,200 --> 00:36:02,040 Speaker 1: the things that we say, basically taking the language you 642 00:36:02,120 --> 00:36:05,719 Speaker 1: just used and shifting that to a corporate environment. The fundamentals are, 643 00:36:05,760 --> 00:36:08,000 Speaker 1: we want to increase the amount of time it takes 644 00:36:08,040 --> 00:36:11,440 Speaker 1: for the attacker to meet their objective, right, to accomplish 645 00:36:11,480 --> 00:36:13,960 Speaker 1: their goal, whatever that may be, to steal information, to 646 00:36:14,040 --> 00:36:16,959 Speaker 1: break in, etc. So we increase the time it takes 647 00:36:17,000 --> 00:36:18,920 Speaker 1: them to do it, and we decrease the time it 648 00:36:18,960 --> 00:36:22,239 Speaker 1: takes your organization, or the good guys, right, to be 649 00:36:22,280 --> 00:36:24,719 Speaker 1: able to identify it. So if we can marry those 650 00:36:24,719 --> 00:36:28,520 Speaker 1: two together, then we tend to make your organization less 651 00:36:28,520 --> 00:36:31,359 Speaker 1: of a target than other locations, because the attackers are 652 00:36:31,360 --> 00:36:32,839 Speaker 1: going to have to work harder, they're going to use 653 00:36:32,840 --> 00:36:34,920 Speaker 1: more resources, they're going to have to spend more money 654 00:36:35,160 --> 00:36:37,440 Speaker 1: to get the job done, and more than likely they're 655 00:36:37,480 --> 00:36:39,759 Speaker 1: going to move to somewhere else where they can accomplish 656 00:36:39,760 --> 00:36:43,879 Speaker 1: that much faster. I'm glad I got something right. Well, 657 00:36:44,280 --> 00:36:46,880 Speaker 1: let me ask this also: are there tips 658 00:36:47,160 --> 00:36:52,160 Speaker 1: or strategies that you think companies and individuals should be 659 00:36:52,880 --> 00:36:57,040 Speaker 1: following, beyond making a response plan? I think one of 660 00:36:57,080 --> 00:37:01,719 Speaker 1: the big ones is finding a way to communicate policies 661 00:37:01,800 --> 00:37:06,040 Speaker 1: and processes and good security behaviors to people in a 662 00:37:06,080 --> 00:37:10,719 Speaker 1: way that is really instructive. I know that almost every 663 00:37:10,760 --> 00:37:16,239 Speaker 1: company out there now has the mandatory video or presentation 664 00:37:16,760 --> 00:37:21,399 Speaker 1: on security. What do you think are the things that people 665 00:37:21,440 --> 00:37:23,560 Speaker 1: really need to focus on, or companies need to focus 666 00:37:23,600 --> 00:37:29,640 Speaker 1: on in general, to help improve security overall? Well, something 667 00:37:29,719 --> 00:37:32,239 Speaker 1: Allison, I'm sure, is going to talk further about is 668 00:37:32,280 --> 00:37:35,879 Speaker 1: building a security culture, right, and building that into 669 00:37:35,920 --> 00:37:38,640 Speaker 1: really the fabric of your operations.
I think people at all 670 00:37:38,960 --> 00:37:42,760 Speaker 1: levels of an organization need to feel like security 671 00:37:42,800 --> 00:37:46,200 Speaker 1: is their responsibility and that they're empowered to make decisions on it. 672 00:37:46,600 --> 00:37:49,239 Speaker 1: Until they do that, an organization is always going to 673 00:37:49,280 --> 00:37:52,600 Speaker 1: struggle, right, to make decisions effectively. So that's a huge 674 00:37:52,600 --> 00:37:56,280 Speaker 1: part of it. The communications, having them planned and prepared 675 00:37:56,360 --> 00:37:58,839 Speaker 1: in advance so that you're ready to go once an 676 00:37:58,840 --> 00:38:02,279 Speaker 1: attack actually occurs, is also critical. And then shifting to 677 00:38:02,320 --> 00:38:06,280 Speaker 1: some of the more technical components, things like the 678 00:38:06,640 --> 00:38:10,400 Speaker 1: multi factor authentication I mentioned on remote devices. That's absolutely critical, but 679 00:38:10,520 --> 00:38:13,759 Speaker 1: also making sure that you have backups of your most 680 00:38:13,800 --> 00:38:17,200 Speaker 1: sensitive data and that you've tested those backups. We have 681 00:38:17,280 --> 00:38:20,759 Speaker 1: an organization we're working with right now, major ransomware outbreak. 682 00:38:20,960 --> 00:38:24,279 Speaker 1: They had all the best technology in place and all 683 00:38:24,280 --> 00:38:27,080 Speaker 1: of the best procedures for having backups, making sure they 684 00:38:27,120 --> 00:38:29,760 Speaker 1: were offline and not connected to the network at all times, 685 00:38:30,120 --> 00:38:32,799 Speaker 1: but they had never tested them. And once they did, 686 00:38:32,840 --> 00:38:35,479 Speaker 1: they realized they couldn't actually restore them because the data 687 00:38:35,560 --> 00:38:39,080 Speaker 1: wasn't replicating correctly. So, you know, we talk about testing 688 00:38:39,080 --> 00:38:42,520 Speaker 1: our incident response plan; also test your most sensitive data 689 00:38:42,520 --> 00:38:44,880 Speaker 1: in those backups, because if you have access to that 690 00:38:45,080 --> 00:38:47,440 Speaker 1: and you are attacked and you are the victim of 691 00:38:47,440 --> 00:38:49,799 Speaker 1: a ransomware attack, you don't even have to engage in 692 00:38:49,800 --> 00:38:52,200 Speaker 1: any of those discussions. You can say, okay, it's going 693 00:38:52,239 --> 00:38:55,960 Speaker 1: to take us six hours, twelve hours, twenty four, whatever 694 00:38:56,000 --> 00:38:58,080 Speaker 1: the case may be, to get access to that data. 695 00:38:58,120 --> 00:38:59,920 Speaker 1: But we have it, and it's just a matter of 696 00:39:00,200 --> 00:39:03,760 Speaker 1: getting access to it and restoring it, and then certainly 697 00:39:03,760 --> 00:39:08,320 Speaker 1: securing against the ability for the attackers to successfully do that again. 698 00:39:08,520 --> 00:39:10,560 Speaker 1: We want to prevent that as well. I feel like 699 00:39:10,640 --> 00:39:13,719 Speaker 1: a lot of those lessons can be applied not just 700 00:39:13,880 --> 00:39:18,120 Speaker 1: in the corporate culture, but in our personal day to 701 00:39:18,200 --> 00:39:23,360 Speaker 1: day operations as well.
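The backup story above suggests a concrete drill worth sketching. The snippet below shows one simple way to check that a backup actually restores, by hashing every file under the live data tree and comparing it against a test restore; the directory paths are hypothetical placeholders, and the restore step itself would be performed by whatever backup tooling an organization already uses.

    import hashlib
    from pathlib import Path

    def tree_digests(root: Path) -> dict:
        # Map each file's path (relative to root) to its SHA-256 digest.
        return {
            str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
            for p in sorted(root.rglob("*"))
            if p.is_file()
        }

    def restore_matches_source(source: Path, restored: Path) -> bool:
        # True only if the restored tree matches the source byte for byte.
        return tree_digests(source) == tree_digests(restored)

    # Hypothetical paths: restore last night's backup into a scratch
    # directory, then let the comparison fail loudly during a scheduled
    # drill rather than during a real ransomware incident.
    if not restore_matches_source(Path("/data/critical"), Path("/restore-test")):
        raise SystemExit("Backup drill FAILED: restored data does not match source")

Run on a schedule, a check along these lines would have surfaced the replication problem described above long before the ransomware did.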
This thought of taking security seriously, 702 00:39:24,160 --> 00:39:27,440 Speaker 1: it's interesting to me, because I'm old enough to remember 703 00:39:27,560 --> 00:39:30,080 Speaker 1: when no one wanted to use the internet to buy 704 00:39:30,120 --> 00:39:33,960 Speaker 1: anything, because everyone was worried about security. They're thinking, I 705 00:39:33,960 --> 00:39:35,480 Speaker 1: don't want to put the numbers that are on my 706 00:39:35,600 --> 00:39:38,879 Speaker 1: card onto this computer thing and have it sent out 707 00:39:38,880 --> 00:39:42,560 Speaker 1: to everybody. And oddly enough, now we're in a world 708 00:39:42,600 --> 00:39:47,160 Speaker 1: where a lot of things that would drastically improve security 709 00:39:47,360 --> 00:39:51,319 Speaker 1: are either an afterthought for some people, they never consider it, 710 00:39:51,480 --> 00:39:54,719 Speaker 1: or they think of it as an annoyance. I know 711 00:39:54,800 --> 00:39:58,680 Speaker 1: people who find multi factor authentication to be irritating: oh, 712 00:39:58,719 --> 00:40:01,160 Speaker 1: I have to type in that six digit code 713 00:40:01,200 --> 00:40:04,040 Speaker 1: that just got sent to my smartphone. And explaining to 714 00:40:04,120 --> 00:40:07,239 Speaker 1: them that this is a way to make 715 00:40:07,280 --> 00:40:11,440 Speaker 1: it harder for an attacker to find that exploit, whether 716 00:40:11,560 --> 00:40:15,680 Speaker 1: it's in a company or it's in your personal information, 717 00:40:16,000 --> 00:40:19,840 Speaker 1: I think that is incredibly valuable, and I want to 718 00:40:19,880 --> 00:40:23,120 Speaker 1: see that culture adopted at large, not just in companies 719 00:40:23,160 --> 00:40:28,279 Speaker 1: but beyond as well. Allison, any other little tips or 720 00:40:28,320 --> 00:40:32,160 Speaker 1: tricks, or any fun ways to terrify people, that 721 00:40:32,200 --> 00:40:35,640 Speaker 1: you would like to share before we wrap up? Yeah, 722 00:40:35,719 --> 00:40:38,479 Speaker 1: I mean, just, you know, for my area, it's where 723 00:40:38,480 --> 00:40:40,480 Speaker 1: can we get you? What are those things? We like 724 00:40:40,560 --> 00:40:42,720 Speaker 1: to almost think really like a hacker, in a way, 725 00:40:42,800 --> 00:40:45,120 Speaker 1: and what are those areas that we can take advantage 726 00:40:45,160 --> 00:40:47,919 Speaker 1: of, and then show you what those are. And that's 727 00:40:47,960 --> 00:40:50,640 Speaker 1: really what we're, you know, working on within that. But, 728 00:40:50,680 --> 00:40:52,600 Speaker 1: you know, like when you said all of these areas 729 00:40:52,680 --> 00:40:55,520 Speaker 1: to, you know, stay cyber safe, working on that 730 00:40:55,600 --> 00:40:58,480 Speaker 1: as, you know, a security culture, even having those security 731 00:40:58,480 --> 00:41:01,480 Speaker 1: culture pieces at home, staying cyber safe at home with 732 00:41:01,560 --> 00:41:03,800 Speaker 1: your family and kids, that can kind of just carry 733 00:41:03,840 --> 00:41:07,040 Speaker 1: through your, you know, entire self, and you bring that 734 00:41:07,120 --> 00:41:09,920 Speaker 1: into your organization.
I'd say, you know, those are the areas, 735 00:41:09,920 --> 00:41:13,879 Speaker 1: and just practice, practice, practice, keep those plans going, keep 736 00:41:13,920 --> 00:41:18,279 Speaker 1: going with those tests, you know, emulating those experiences and 737 00:41:18,320 --> 00:41:21,560 Speaker 1: making sure that you're really taking those plans into action. 738 00:41:22,239 --> 00:41:25,200 Speaker 1: Out of curiosity, Allison, does your team look at a 739 00:41:25,320 --> 00:41:27,719 Speaker 1: response plan in advance and then look to see if 740 00:41:27,760 --> 00:41:30,279 Speaker 1: there are any potential holes in that response plan, so 741 00:41:30,320 --> 00:41:34,319 Speaker 1: that you can demonstrate that this is something that 742 00:41:34,719 --> 00:41:38,560 Speaker 1: the client really needs to focus on in order to improve? Definitely, 743 00:41:39,080 --> 00:41:42,000 Speaker 1: we'll take the response plans, study them, and then create 744 00:41:42,040 --> 00:41:45,279 Speaker 1: scenarios that are specifically designed to possibly, you know, go 745 00:41:45,400 --> 00:41:49,200 Speaker 1: around or, you know, penetrate certain areas that they might 746 00:41:49,200 --> 00:41:51,080 Speaker 1: be missing. We also take it where we might not 747 00:41:51,160 --> 00:41:53,520 Speaker 1: have any insight and show that there are, you know, 748 00:41:54,680 --> 00:41:58,120 Speaker 1: openings and holes that might, you know, appear. A lot 749 00:41:58,160 --> 00:42:02,160 Speaker 1: of it has to do with human interaction, things 750 00:42:02,200 --> 00:42:04,960 Speaker 1: that we might miss, things that are happening. So it's 751 00:42:05,040 --> 00:42:07,160 Speaker 1: kind of taking all those in and then showing where 752 00:42:07,200 --> 00:42:09,759 Speaker 1: you need to add those within your plan. So definitely 753 00:42:09,800 --> 00:42:12,560 Speaker 1: that's an area. Yeah. I think of that a lot 754 00:42:13,080 --> 00:42:16,400 Speaker 1: in terms of things like learning a martial art, where 755 00:42:16,920 --> 00:42:20,719 Speaker 1: you practice, practice, practice, and then you're ready to 756 00:42:20,760 --> 00:42:22,839 Speaker 1: show off to someone and say, all right, I'll show 757 00:42:22,880 --> 00:42:24,160 Speaker 1: you how you get out of it. Here, grab me 758 00:42:24,200 --> 00:42:26,200 Speaker 1: from behind. Someone grabs you from behind. Oh no, no, 759 00:42:26,239 --> 00:42:28,120 Speaker 1: not like that. You need to grab me from behind 760 00:42:28,120 --> 00:42:30,319 Speaker 1: this way so I can get out of it. And 761 00:42:30,360 --> 00:42:32,040 Speaker 1: you think, well, that's not how the bad guys are 762 00:42:32,080 --> 00:42:35,640 Speaker 1: going to do it. They're not going to attack you 763 00:42:35,680 --> 00:42:38,440 Speaker 1: at your strongest point just because you really practiced that. 764 00:42:38,640 --> 00:42:42,359 Speaker 1: So I think that, again, the service you're providing is 765 00:42:42,520 --> 00:42:48,799 Speaker 1: incredibly valuable. And as we're seeing the landscape change, I 766 00:42:48,840 --> 00:42:51,680 Speaker 1: think it's going to be important for more and more 767 00:42:51,719 --> 00:42:55,240 Speaker 1: companies to really focus on this, to continue to focus 768 00:42:55,280 --> 00:42:58,520 Speaker 1: on it. You don't want your story to become the 769 00:42:58,560 --> 00:43:01,960 Speaker 1: next big scandal.
You want your story to be a 770 00:43:02,000 --> 00:43:04,960 Speaker 1: success story of how you were able to respond in 771 00:43:05,000 --> 00:43:08,600 Speaker 1: an agile way, an effective way, and a 772 00:43:08,640 --> 00:43:12,040 Speaker 1: way that was responsible to your company, to your customers, 773 00:43:12,040 --> 00:43:15,520 Speaker 1: and to your clients. Those are the stories we want 774 00:43:15,560 --> 00:43:17,279 Speaker 1: to see, because we know the 775 00:43:17,280 --> 00:43:19,960 Speaker 1: bad guys aren't going away. We know that they're not 776 00:43:20,080 --> 00:43:22,799 Speaker 1: going to just stop, but we do know that we 777 00:43:22,880 --> 00:43:26,480 Speaker 1: can work better at responding to it and make sure 778 00:43:26,560 --> 00:43:31,480 Speaker 1: that the actions we take are more effective and that 779 00:43:31,560 --> 00:43:35,360 Speaker 1: people don't feel like they are left in the 780 00:43:35,480 --> 00:43:39,400 Speaker 1: lurch, and there's nowhere to turn to, and you're 781 00:43:39,520 --> 00:43:43,080 Speaker 1: just going through the absolute worst feeling of your life. 782 00:43:44,040 --> 00:43:46,279 Speaker 1: We want to prevent that as much as possible. 783 00:43:46,360 --> 00:43:48,880 Speaker 1: You can save that for the stage, and then in 784 00:43:48,920 --> 00:43:52,520 Speaker 1: real life you can have the actionable plan. Do you 785 00:43:52,560 --> 00:43:55,640 Speaker 1: have any other last thoughts you would like to share 786 00:43:55,760 --> 00:43:59,640 Speaker 1: before we conclude? I think I've learned a lot in 787 00:43:59,680 --> 00:44:02,520 Speaker 1: this conversation. First of all, I mean, I've learned I 788 00:44:02,520 --> 00:44:06,520 Speaker 1: definitely want to see one of these simulations, because I 789 00:44:06,560 --> 00:44:13,480 Speaker 1: think it would be incredibly informative. And also I've learned 790 00:44:13,520 --> 00:44:17,919 Speaker 1: that I probably need to update my password manager. Yeah. 791 00:44:18,000 --> 00:44:21,319 Speaker 1: My last thoughts would be, do not be part of 792 00:44:22,239 --> 00:44:25,239 Speaker 1: the people that believe they don't need to change their passwords. Right? 793 00:44:25,560 --> 00:44:27,880 Speaker 1: We mentioned that, you know, we hear about breaches happening 794 00:44:27,960 --> 00:44:30,799 Speaker 1: on a daily basis, and then so many people just 795 00:44:30,880 --> 00:44:33,399 Speaker 1: kind of think, oh, well, now they happen all the time, 796 00:44:33,440 --> 00:44:35,560 Speaker 1: so it's no big deal, I'll just keep my passwords the same. 797 00:44:35,960 --> 00:44:38,840 Speaker 1: Don't do that. Please change your passwords, please use a 798 00:44:38,920 --> 00:44:42,839 Speaker 1: password manager. And if you've got questions on other things 799 00:44:42,920 --> 00:44:45,239 Speaker 1: related to things we talked about, you can also visit 800 00:44:45,280 --> 00:44:48,600 Speaker 1: ibmsecurity dot com and read more about all of the 801 00:44:48,640 --> 00:44:50,759 Speaker 1: services that we have to offer. We'd love to chat. 802 00:44:51,880 --> 00:44:54,799 Speaker 1: I again want to thank Wendy and Allison for their 803 00:44:54,840 --> 00:44:59,120 Speaker 1: time and their expertise.
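To make the closing advice about password managers concrete, here is a minimal sketch of the kind of unique, random credential such a tool generates and stores for you, using only Python's standard library; the length and character set are illustrative choices, not a recommendation from the interview.

    import secrets
    import string

    def random_password(length: int = 20) -> str:
        # Cryptographically secure randomness; generate one unique password
        # per site so a breach at one service never unlocks another.
        alphabet = string.ascii_letters + string.digits + string.punctuation
        return "".join(secrets.choice(alphabet) for _ in range(length))

    print(random_password())  # a different value on every call

The habit matters more than the snippet: if over sixty percent of breaches leverage data already stolen somewhere else, as noted above, a reused password is the cheapest door for an attacker to walk through.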
I am convinced that companies absolutely 804 00:44:59,160 --> 00:45:02,200 Speaker 1: need to have an incident response team and a response 805 00:45:02,320 --> 00:45:05,880 Speaker 1: plan in place to deal with cyber threats. Reducing the 806 00:45:05,920 --> 00:45:09,080 Speaker 1: attack surface is important, but making sure you've got the 807 00:45:09,160 --> 00:45:12,000 Speaker 1: right plan and people ready to go should the worst 808 00:45:12,000 --> 00:45:15,759 Speaker 1: happen is absolutely critical. It reduces the cost of an 809 00:45:15,760 --> 00:45:19,080 Speaker 1: attack dramatically, and when you consider that the cost we're talking 810 00:45:19,080 --> 00:45:24,239 Speaker 1: about isn't just the significant financial cost, it's also how 811 00:45:24,320 --> 00:45:29,080 Speaker 1: others perceive your company, it's an imperative. We've seen companies 812 00:45:29,160 --> 00:45:32,719 Speaker 1: large and small take massive hits to their credibility as 813 00:45:32,760 --> 00:45:36,040 Speaker 1: a result of attacks. I hope one day I get 814 00:45:36,080 --> 00:45:38,600 Speaker 1: to see Allison and her team at work, and her 815 00:45:38,640 --> 00:45:41,920 Speaker 1: description of people going through real world emotions even in 816 00:45:42,000 --> 00:45:45,640 Speaker 1: a simulated event reminded me of how we can experience 817 00:45:45,640 --> 00:45:48,920 Speaker 1: stuff like fear and trepidation even when we're in a 818 00:45:49,000 --> 00:45:52,279 Speaker 1: virtual environment. But it's better to have that experience in 819 00:45:52,320 --> 00:45:56,000 Speaker 1: a test run than the real thing. That's all from 820 00:45:56,040 --> 00:45:59,480 Speaker 1: this episode of Smart Talks. To learn more about IBM's 821 00:45:59,480 --> 00:46:07,440 Speaker 1: cybersecurity services, visit IBM dot com slash security slash solutions. 822 00:46:11,400 --> 00:46:16,040 Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, 823 00:46:16,360 --> 00:46:20,080 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you listen 824 00:46:20,120 --> 00:46:21,160 Speaker 1: to your favorite shows.